Sample records for risk-based computer models

  1. Physics-based Entry, Descent and Landing Risk Model

    NASA Technical Reports Server (NTRS)

    Gee, Ken; Huynh, Loc C.; Manning, Ted

    2014-01-01

    A physics-based risk model was developed to assess the risk associated with thermal protection system failures during the entry, descent and landing phase of a manned spacecraft mission. In the model, entry trajectories were computed using a three-degree-of-freedom trajectory tool, the aerothermodynamic heating environment was computed using an engineering-level computational tool and the thermal response of the TPS material was modeled using a one-dimensional thermal response tool. The model was capable of modeling the effect of micrometeoroid and orbital debris impact damage on the TPS thermal response. A Monte Carlo analysis was used to determine the effects of uncertainties in the vehicle state at Entry Interface, aerothermodynamic heating and material properties on the performance of the TPS design. The failure criterion was set as a temperature limit at the bondline between the TPS and the underlying structure. Both direct computation and response surface approaches were used to compute the risk. The model was applied to a generic manned space capsule design. The effect of material property uncertainty and MMOD damage on risk of failure were analyzed. A comparison of the direct computation and response surface approach was undertaken.
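
    As an illustration of the Monte Carlo step described above, the sketch below estimates a bondline over-temperature probability with a toy one-dimensional surrogate. The function bondline_temp, the input distributions, the MMOD damage fraction and the temperature limit are hypothetical stand-ins, not the NASA tools or values.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100_000                 # Monte Carlo samples
T_LIMIT = 400.0             # hypothetical bondline temperature limit, K

def bondline_temp(heat_load, conductivity, thickness):
    """Toy surrogate for the 1-D thermal response tool: peak bondline
    temperature as a simple function of the sampled inputs."""
    return 300.0 + 0.9 * heat_load / (conductivity * thickness)

# Sample uncertain inputs (heating proxy and TPS material properties).
heat_load    = rng.normal(150.0, 20.0, N)
conductivity = rng.normal(1.0, 0.08, N)
thickness    = rng.normal(2.0, 0.05, N)

# A small fraction of cases carries MMOD impact damage that thins the TPS.
mmod_hit = rng.random(N) < 0.02
thickness = np.where(mmod_hit, 0.7 * thickness, thickness)

T = bondline_temp(heat_load, conductivity, thickness)
p_fail = np.mean(T > T_LIMIT)
print(f"Estimated probability of bondline over-temperature: {p_fail:.2e}")
```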

  2. A Model-based Framework for Risk Assessment in Human-Computer Controlled Systems

    NASA Technical Reports Server (NTRS)

    Hatanaka, Iwao

    2000-01-01

    The rapid growth of computer technology and innovation has played a significant role in the rise of computer automation of human tasks in modern production systems across all industries. Although the rationale for automation has been to eliminate "human error" or to relieve humans from manual repetitive tasks, various computer-related hazards and accidents have emerged as a direct result of increased system complexity attributed to computer automation. The risk assessment techniques utilized for electromechanical systems are not suitable for today's software-intensive systems or complex human-computer controlled systems. This thesis will propose a new systemic model-based framework for analyzing risk in safety-critical systems where both computers and humans are controlling safety-critical functions. A new systems accident model will be developed based upon modern systems theory and human cognitive processes to better characterize system accidents, the role of human operators, and the influence of software in its direct control of significant system functions. Better risk assessments will then be achievable through the application of this new framework to complex human-computer controlled systems.

  3. A Research Roadmap for Computation-Based Human Reliability Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boring, Ronald; Mandelli, Diego; Joe, Jeffrey

    2015-08-01

    The United States (U.S.) Department of Energy (DOE) is sponsoring research through the Light Water Reactor Sustainability (LWRS) program to extend the life of the currently operating fleet of commercial nuclear power plants. The Risk Informed Safety Margin Characterization (RISMC) research pathway within LWRS looks at ways to maintain and improve the safety margins of these plants. The RISMC pathway includes significant developments in the area of thermal-hydraulics code modeling and the development of tools to facilitate dynamic probabilistic risk assessment (PRA). PRA is primarily concerned with the risk of hardware systems at the plant; yet, hardware reliability is often secondary in overall risk significance to human errors that can trigger or compound undesirable events at the plant. This report highlights ongoing efforts to develop a computation-based approach to human reliability analysis (HRA). This computation-based approach differs from existing static and dynamic HRA approaches in that it: (i) interfaces with a dynamic computation engine that includes a full scope plant model, and (ii) interfaces with a PRA software toolset. The computation-based HRA approach presented in this report is called the Human Unimodels for Nuclear Technology to Enhance Reliability (HUNTER) and incorporates in a hybrid fashion elements of existing HRA methods to interface with new computational tools developed under the RISMC pathway. The goal of this research effort is to model human performance more accurately than existing approaches, thereby minimizing modeling uncertainty found in current plant risk models.

  4. FRAMEWORK FOR EVALUATION OF PHYSIOLOGICALLY-BASED PHARMACOKINETIC MODELS FOR USE IN SAFETY OR RISK ASSESSMENT

    EPA Science Inventory

    ABSTRACT

    Proposed applications of increasingly sophisticated biologically-based computational models, such as physiologically-based pharmacokinetic (PBPK) models, raise the issue of how to evaluate whether the models are adequate for proposed uses including safety or risk ...

  5. Tutorial: Parallel Computing of Simulation Models for Risk Analysis.

    PubMed

    Reilly, Allison C; Staid, Andrea; Gao, Michael; Guikema, Seth D

    2016-10-01

    Simulation models are widely used in risk analysis to study the effects of uncertainties on outcomes of interest in complex problems. Often, these models are computationally complex and time consuming to run. This latter point may be at odds with time-sensitive evaluations or may limit the number of parameters that are considered. In this article, we give an introductory tutorial focused on parallelizing simulation code to better leverage modern computing hardware, enabling risk analysts to better utilize simulation-based methods for quantifying uncertainty in practice. This article is aimed primarily at risk analysts who use simulation methods but do not yet utilize parallelization to decrease the computational burden of these models. The discussion is focused on conceptual aspects of embarrassingly parallel computer code and software considerations. Two complementary examples are shown using the languages MATLAB and R. A brief discussion of hardware considerations is located in the Appendix. © 2016 Society for Risk Analysis.
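
    The tutorial's examples are in MATLAB and R; the snippet below is an analogous Python sketch of the same embarrassingly parallel pattern, farming independent replications of a toy simulation out to all available cores. The function one_replication is a hypothetical placeholder for a risk analyst's own simulation model.

```python
from multiprocessing import Pool
import random

def one_replication(seed):
    """One independent simulation replication (hypothetical toy model):
    average of a noisy outcome over a random scenario."""
    rng = random.Random(seed)
    return sum(rng.gauss(100.0, 15.0) for _ in range(10_000)) / 10_000

if __name__ == "__main__":
    seeds = range(1_000)                 # independent replications
    with Pool() as pool:                 # one worker per available core
        results = pool.map(one_replication, seeds)
    mean = sum(results) / len(results)
    print(f"Mean simulated outcome over {len(results)} replications: {mean:.2f}")
```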

  6. Subject-enabled analytics model on measurement statistics in health risk expert system for public health informatics.

    PubMed

    Chung, Chi-Jung; Kuo, Yu-Chen; Hsieh, Yun-Yu; Li, Tsai-Chung; Lin, Cheng-Chieh; Liang, Wen-Miin; Liao, Li-Na; Li, Chia-Ing; Lin, Hsueh-Chun

    2017-11-01

    This study applied open source technology to establish a subject-enabled analytics model that can enhance measurement statistics of case studies with the public health data in cloud computing. The infrastructure of the proposed model comprises three domains: 1) the health measurement data warehouse (HMDW) for the case study repository, 2) the self-developed modules of online health risk information statistics (HRIStat) for cloud computing, and 3) the prototype of a Web-based process automation system in statistics (PASIS) for the health risk assessment of case studies with subject-enabled evaluation. The system design employed freeware including Java applications, MySQL, and R packages to drive a health risk expert system (HRES). In the design, the HRIStat modules implement the typical analytics methods for biomedical statistics, and the PASIS interfaces enable process automation of the HRES for cloud computing. The Web-based model supports two modes, a step-by-step analysis and an automated computing process, for preliminary evaluation and real-time computation, respectively. The proposed model was evaluated by recomputing prior studies related to the epidemiological measurement of diseases that were caused by either heavy metal exposures in the environment or clinical complications in hospital. The simulation validity was confirmed against commercial statistics software. The model was installed in a stand-alone computer and in a cloud-server workstation to verify computing performance for a data amount of more than 230K sets. Both setups reached an efficiency of about 10^5 sets per second. The Web-based PASIS interface can be used for cloud computing, and the HRIStat module can be flexibly expanded with advanced subjects for measurement statistics. The analytics procedure of the HRES prototype is capable of providing assessment criteria prior to estimating the potential risk to public health. Copyright © 2017 Elsevier B.V. All rights reserved.

  7. Variance computations for functionals of absolute risk estimates.

    PubMed

    Pfeiffer, R M; Petracci, E

    2011-07-01

    We present a simple influence function based approach to compute the variances of estimates of absolute risk and functions of absolute risk. We apply this approach to criteria that assess the impact of changes in the risk factor distribution on absolute risk for an individual and at the population level. As an illustration we use an absolute risk prediction model for breast cancer that includes modifiable risk factors in addition to standard breast cancer risk factors. Influence function based variance estimates for absolute risk and the criteria are compared to bootstrap variance estimates.
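
    The influence-function formulas are given in the paper itself; the sketch below only illustrates the bootstrap side of the comparison on a simulated cohort, with a deliberately simple plug-in absolute-risk estimator. The sample size, covariate model and risk function are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical cohort: binary outcome driven by a single risk factor.
n = 2_000
x = rng.normal(size=n)
p_true = 1 / (1 + np.exp(-(-2.5 + 0.8 * x)))
y = rng.random(n) < p_true

def absolute_risk(y_sample):
    """Plug-in estimate of absolute risk (here simply the event proportion)."""
    return y_sample.mean()

est = absolute_risk(y)

# Nonparametric bootstrap variance of the absolute-risk estimate.
B = 2_000
boot = np.empty(B)
for b in range(B):
    idx = rng.integers(0, n, n)
    boot[b] = absolute_risk(y[idx])

print(f"absolute risk = {est:.4f}, bootstrap variance = {boot.var(ddof=1):.2e}")
```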

  8. Variance computations for functionals of absolute risk estimates

    PubMed Central

    Pfeiffer, R.M.; Petracci, E.

    2011-01-01

    We present a simple influence function based approach to compute the variances of estimates of absolute risk and functions of absolute risk. We apply this approach to criteria that assess the impact of changes in the risk factor distribution on absolute risk for an individual and at the population level. As an illustration we use an absolute risk prediction model for breast cancer that includes modifiable risk factors in addition to standard breast cancer risk factors. Influence function based variance estimates for absolute risk and the criteria are compared to bootstrap variance estimates. PMID:21643476

  9. Source-to-Outcome Microbial Exposure and Risk Modeling Framework

    EPA Science Inventory

    A Quantitative Microbial Risk Assessment (QMRA) is a computer-based data-delivery and modeling approach that integrates interdisciplinary fate/transport, exposure, and impact models and databases to characterize potential health impacts/risks due to pathogens. As such, a QMRA ex...

  10. A Risk-Analysis Approach to Implementing Web-Based Assessment

    ERIC Educational Resources Information Center

    Ricketts, Chris; Zakrzewski, Stan

    2005-01-01

    Computer-Based Assessment is a risky business. This paper proposes the use of a model for web-based assessment systems that identifies pedagogic, operational, technical (non web-based), web-based and financial risks. The strategies and procedures for risk elimination or reduction arise from risk analysis and management and are the means by which…

  11. Prototype Biology-Based Radiation Risk Module Project

    NASA Technical Reports Server (NTRS)

    Terrier, Douglas; Clayton, Ronald G.; Patel, Zarana; Hu, Shaowen; Huff, Janice

    2015-01-01

    Biological effects of space radiation and risk mitigation are strategic knowledge gaps for the Evolvable Mars Campaign. The current epidemiology-based NASA Space Cancer Risk (NSCR) model contains large uncertainties (HAT #6.5a) due to lack of information on the radiobiology of galactic cosmic rays (GCR) and lack of human data. The use of experimental models that most accurately replicate the response of human tissues is critical for precision in risk projections. Our proposed study will compare DNA damage, histological, and cell kinetic parameters after irradiation in normal 2D human cells versus 3D tissue models, and it will use a multi-scale computational model (CHASTE) to investigate various biological processes that may contribute to carcinogenesis, including radiation-induced cellular signaling pathways. This cross-disciplinary work, with biological validation of an evolvable mathematical computational model, will help reduce uncertainties within NSCR and aid risk mitigation for radiation-induced carcinogenesis.

  12. An evaluation of Computational Fluid dynamics model for flood risk analysis

    NASA Astrophysics Data System (ADS)

    Di Francesco, Silvia; Biscarini, Chiara; Montesarchio, Valeria

    2014-05-01

    This work presents an analysis of the hydrological-hydraulic engineering requisites for risk evaluation and efficient flood damage reduction plans. Most research efforts have been dedicated to the scientific and technical aspects of risk assessment, providing estimates of possible alternatives and of the associated risk. In the decision-making process for a mitigation plan, the contribution of scientists is crucial, because risk-damage analysis is based on the evaluation of the flow field and of hydraulic risk as well as on economic and societal considerations. The present paper focuses on the first part of the process, the mathematical modelling of flood events, which is the basis for all further considerations. The evaluation of the potential catastrophic damage consequent to a flood event, and in particular to a dam failure, requires modelling of the flood with sufficient detail to capture the spatial and temporal evolution of the event as well as the velocity field. Thus, the selection of an appropriate mathematical model to correctly simulate flood routing is an essential step. In this work we present the application of two 3D computational fluid dynamics models to a synthetic and a real case study in order to evaluate the evolution of the flow field and the associated flood risk. The first model is based on the open-source CFD platform OpenFOAM. Water flow is schematized with a classical continuum approach based on the Navier-Stokes equations coupled with the Volume of Fluid (VOF) method to take into account the multiphase character of the river bottom-water-air system. The second model is based on the Lattice Boltzmann method, an innovative numerical fluid dynamics scheme based on Boltzmann's kinetic equation that represents the flow dynamics at the macroscopic level by incorporating a microscopic kinetic approach; the fluid is seen as composed of particles that can move and collide with each other. Simulation results from both models are promising and consistent with experimental results available in the literature, though the LBM model requires less computational effort than the NS one.

  13. Impact of model-based risk analysis for liver surgery planning.

    PubMed

    Hansen, C; Zidowitz, S; Preim, B; Stavrou, G; Oldhafer, K J; Hahn, H K

    2014-05-01

    A model-based risk analysis for oncologic liver surgery was described in previous work (Preim et al. in Proceedings of international symposium on computer assisted radiology and surgery (CARS), Elsevier, Amsterdam, pp. 353–358, 2002; Hansen et al. Int J Comput Assist Radiol Surg 4(5):469–474, 2009). In this paper, we present an evaluation of this method. To determine whether and how the risk analysis facilitates the process of liver surgery planning, an explorative user study with 10 liver experts was conducted. The purpose was to compare and analyze their decision-making. The results of the study show that model-based risk analysis enhances the awareness of surgical risk in the planning stage. Participants preferred smaller resection volumes and agreed more on the safety margins' width when the risk analysis was available. In addition, time to complete the planning task and confidence of participants were not increased when using the risk analysis. This work shows that the applied model-based risk analysis may influence important planning decisions in liver surgery. It lays a basis for further clinical evaluations and points out important fields for future research.

  14. Overview of Risk Mitigation for Safety-Critical Computer-Based Systems

    NASA Technical Reports Server (NTRS)

    Torres-Pomales, Wilfredo

    2015-01-01

    This report presents a high-level overview of a general strategy to mitigate the risks from threats to safety-critical computer-based systems. In this context, a safety threat is a process or phenomenon that can cause operational safety hazards in the form of computational system failures. This report is intended to provide insight into the safety-risk mitigation problem and the characteristics of potential solutions. The limitations of the general risk mitigation strategy are discussed and some options to overcome these limitations are provided. This work is part of an ongoing effort to enable well-founded assurance of safety-related properties of complex safety-critical computer-based aircraft systems by developing an effective capability to model and reason about the safety implications of system requirements and design.

  15. Safety Metrics for Human-Computer Controlled Systems

    NASA Technical Reports Server (NTRS)

    Leveson, Nancy G; Hatanaka, Iwao

    2000-01-01

    The rapid growth of computer technology and innovation has played a significant role in the rise of computer automation of human tasks in modern production systems across all industries. Although the rationale for automation has been to eliminate "human error" or to relieve humans from manual repetitive tasks, various computer-related hazards and accidents have emerged as a direct result of increased system complexity attributed to computer automation. The risk assessment techniques utilized for electromechanical systems are not suitable for today's software-intensive systems or complex human-computer controlled systems. This thesis will propose a new systemic model-based framework for analyzing risk in safety-critical systems where both computers and humans are controlling safety-critical functions. A new systems accident model will be developed based upon modern systems theory and human cognitive processes to better characterize system accidents, the role of human operators, and the influence of software in its direct control of significant system functions. Better risk assessments will then be achievable through the application of this new framework to complex human-computer controlled systems.

  16. Estimating Skin Cancer Risk: Evaluating Mobile Computer-Adaptive Testing.

    PubMed

    Djaja, Ngadiman; Janda, Monika; Olsen, Catherine M; Whiteman, David C; Chien, Tsair-Wei

    2016-01-22

    Response burden is a major detriment to questionnaire completion rates. Computer adaptive testing may offer advantages over non-adaptive testing, including reduction of numbers of items required for precise measurement. Our aim was to compare the efficiency of non-adaptive (NAT) and computer adaptive testing (CAT) facilitated by Partial Credit Model (PCM)-derived calibration to estimate skin cancer risk. We used a random sample from a population-based Australian cohort study of skin cancer risk (N=43,794). All 30 items of the skin cancer risk scale were calibrated with the Rasch PCM. A total of 1000 cases generated following a normal distribution (mean [SD] 0 [1]) were simulated using three Rasch models with three fixed-item (dichotomous, rating scale, and partial credit) scenarios, respectively. We calculated the comparative efficiency and precision of CAT and NAT (shortening of questionnaire length and the count difference number ratio less than 5% using independent t tests). We found that use of CAT led to smaller person standard error of the estimated measure than NAT, with substantially higher efficiency but no loss of precision, reducing response burden by 48%, 66%, and 66% for dichotomous, Rating Scale Model, and PCM models, respectively. CAT-based administrations of the skin cancer risk scale could substantially reduce participant burden without compromising measurement precision. A mobile computer adaptive test was developed to help people efficiently assess their skin cancer risk.
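
    A minimal sketch of the adaptive-testing loop (maximum-information item selection plus re-estimation of the latent trait), assuming a dichotomous Rasch model rather than the Partial Credit Model used in the study; the item difficulties, test length and respondent are simulated, not the skin cancer risk scale items.

```python
import numpy as np

rng = np.random.default_rng(2)
difficulties = rng.normal(0.0, 1.0, 30)   # 30 calibrated items (simulated)
true_theta = 0.8                          # respondent's latent risk score

def prob(theta, b):
    """Rasch probability of endorsing an item of difficulty b."""
    return 1.0 / (1.0 + np.exp(-(theta - b)))

theta, administered, responses = 0.0, [], []
for _ in range(10):                       # fixed-length stopping rule
    p = prob(theta, difficulties)
    info = p * (1.0 - p)                  # Fisher information of each item
    info[administered] = -np.inf          # never re-administer an item
    item = int(np.argmax(info))
    administered.append(item)
    responses.append(rng.random() < prob(true_theta, difficulties[item]))
    # Re-estimate theta with a few Newton-Raphson steps (clipped for stability).
    for _ in range(5):
        ps = prob(theta, difficulties[administered])
        grad = np.sum(np.array(responses, dtype=float) - ps)
        hess = -np.sum(ps * (1.0 - ps))
        theta = float(np.clip(theta - grad / hess, -4.0, 4.0))

print(f"estimated theta after 10 adaptive items: {theta:.2f} (true value {true_theta})")
```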

  17. Probabilistic Structural Analysis Methods (PSAM) for select space propulsion system components, part 2

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The technical effort and computer code enhancements performed during the sixth year of the Probabilistic Structural Analysis Methods program are summarized. Various capabilities are described to probabilistically combine structural response and structural resistance to compute component reliability. A library of structural resistance models is implemented in the Numerical Evaluations of Stochastic Structures Under Stress (NESSUS) code that included fatigue, fracture, creep, multi-factor interaction, and other important effects. In addition, a user interface was developed for user-defined resistance models. An accurate and efficient reliability method was developed and was successfully implemented in the NESSUS code to compute component reliability based on user-selected response and resistance models. A risk module was developed to compute component risk with respect to cost, performance, or user-defined criteria. The new component risk assessment capabilities were validated and demonstrated using several examples. Various supporting methodologies were also developed in support of component risk assessment.
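
    A minimal sketch of the response/resistance reliability computation that such a risk module performs, using simple lognormal stand-ins rather than NESSUS models; the distributions, parameters and cost figure are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(42)
N = 1_000_000

# Hypothetical structural response (stress) and resistance (strength) models.
stress   = rng.lognormal(mean=np.log(300.0), sigma=0.10, size=N)   # MPa
strength = rng.lognormal(mean=np.log(420.0), sigma=0.08, size=N)   # MPa

p_fail = np.mean(strength < stress)        # failure when resistance < response
reliability = 1.0 - p_fail

# A toy risk figure: expected consequence cost given an assumed per-failure cost.
COST_PER_FAILURE = 5.0e6                   # dollars, hypothetical
print(f"reliability = {reliability:.5f}, expected cost = ${p_fail * COST_PER_FAILURE:,.0f}")
```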

  18. Which risk models perform best in selecting ever-smokers for lung cancer screening?

    Cancer.gov

    A new analysis by scientists at NCI evaluates nine different individualized lung cancer risk prediction models based on their selections of ever-smokers for computed tomography (CT) lung cancer screening.

  19. Evidence Theory Based Uncertainty Quantification in Radiological Risk due to Accidental Release of Radioactivity from a Nuclear Power Plant

    NASA Astrophysics Data System (ADS)

    Ingale, S. V.; Datta, D.

    2010-10-01

    The consequence of an accidental release of radioactivity from a nuclear power plant is assessed in terms of exposure or dose to members of the public. Assessment of risk proceeds through this dose computation, which basically depends on the underlying dose assessment model and the exposure pathways. One of the exposure pathways is the ingestion of contaminated food. The aim of the present paper is to compute the uncertainty associated with the risk to members of the public due to the ingestion of contaminated food. Because the governing parameters of the ingestion dose assessment model are imprecise, we apply evidence theory to compute bounds on the risk. The uncertainty is expressed through the belief and plausibility fuzzy measures.
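
    A minimal sketch of the belief/plausibility bound described above, assuming a single imprecise transfer-factor parameter described by interval focal elements with basic probability assignments; the intake rate, dose and risk coefficients and the acceptance threshold are illustrative placeholders, not the paper's values.

```python
# Focal elements: intervals for an imprecise ingestion-model parameter
# (e.g. a transfer factor), each with a basic probability assignment (BPA).
focal = [((0.5, 1.0), 0.4), ((0.8, 1.6), 0.4), ((1.2, 2.5), 0.2)]  # hypothetical

INTAKE = 200.0        # Bq/y, hypothetical ingestion rate of contaminated food
DOSE_COEF = 1.3e-8    # Sv/Bq, hypothetical dose coefficient
RISK_COEF = 5.5e-2    # risk per Sv (nominal ICRP-style coefficient)
RISK_LIMIT = 1.5e-7   # hypothetical acceptance threshold

def risk(tf):
    """Ingestion risk for a given transfer factor (monotone in tf)."""
    return INTAKE * tf * DOSE_COEF * RISK_COEF

belief = plausibility = 0.0
for (lo, hi), m in focal:
    r_lo, r_hi = risk(lo), risk(hi)
    if r_lo > RISK_LIMIT:                 # the whole interval exceeds the limit
        belief += m
    if r_hi > RISK_LIMIT:                 # the interval can exceed the limit
        plausibility += m

print(f"Bel(risk > limit) = {belief:.2f},  Pl(risk > limit) = {plausibility:.2f}")
```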

  20. Computer modelling as a tool for the exposure assessment of operators using faulty agricultural pesticide spraying equipment.

    PubMed

    Bańkowski, Robert; Wiadrowska, Bozena; Beresińska, Martyna; Ludwicki, Jan K; Noworyta-Głowacka, Justyna; Godyń, Artur; Doruchowski, Grzegorz; Hołownicki, Ryszard

    2013-01-01

    Faulty but still operating agricultural pesticide sprayers may pose an unacceptable health risk for operators. The computerized models designed to calculate exposure and risk for pesticide sprayers, used as an aid in the evaluation and further authorisation of plant protection products, may also be applied to assess the health risk for operators when faulty sprayers are used. The objective was to evaluate, by means of computer modelling, the impact of different exposure scenarios on the health risk for operators using faulty agricultural spraying equipment. The exposure modelling was performed for 15 pesticides (5 insecticides, 7 fungicides and 3 herbicides). The critical parameter, i.e. the toxicological end-point on which the risk assessment was based, was the no observable adverse effect level (NOAEL). This enabled risk to be estimated under various exposure conditions such as pesticide concentration in the plant protection product and type of the sprayed crop, as well as the number of treatments. Computer modelling was based on the UK POEM model, including determination of the acceptable operator exposure level (AOEL). Thus the degree of operator exposure could be defined during pesticide treatment whether or not personal protection equipment had been employed by individuals. Data used for computer modelling were obtained from simulated treatments with pesticide substitutes using variously damaged knapsack sprayers. These substitute preparations contained markers that allowed computer simulations to be made, analogous to real-life exposure situations, in a dose-dependent fashion. Exposures were estimated according to operator dosimetry exposure under 'field' conditions for low-level, medium and high target field crops. The exposure modelling for high target field crops demonstrated exceedance of the AOEL in all simulated treatment cases (100%) using damaged sprayers, irrespective of the type of damage or whether individual protective measures had been adopted. For low-level and medium field crops, exceedances occurred in 40-80% of cases. Computer modelling may be considered a practical tool for hazard assessment when faulty agricultural sprayers are used. It may also be applied to planning the quality checks and maintenance of this equipment.
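
    The study used the UK POEM model itself; the sketch below only illustrates the kind of exposure-versus-AOEL screen involved, with made-up dermal and inhalation contamination values for intact and damaged sprayers. None of the numbers are POEM defaults.

```python
# Minimal operator-exposure screen in the spirit of UK POEM;
# all figures are illustrative placeholders, not the actual POEM parameters.
def operator_exposure(dermal_mg, inhal_mg, absorption_dermal, absorption_inhal,
                      ppe_factor, body_weight_kg):
    """Systemic exposure in mg/kg bw/day for one treatment day."""
    systemic = dermal_mg * absorption_dermal + inhal_mg * absorption_inhal
    return systemic * ppe_factor / body_weight_kg

AOEL = 0.02  # mg/kg bw/day, hypothetical acceptable operator exposure level

scenarios = {
    "intact sprayer, gloves":   dict(dermal_mg=20.0, inhal_mg=0.1, ppe_factor=0.1),
    "leaking hose, gloves":     dict(dermal_mg=80.0, inhal_mg=0.1, ppe_factor=0.1),
    "leaking hose, no gloves":  dict(dermal_mg=80.0, inhal_mg=0.1, ppe_factor=1.0),
}

for name, s in scenarios.items():
    e = operator_exposure(absorption_dermal=0.05, absorption_inhal=1.0,
                          body_weight_kg=60.0, **s)
    print(f"{name:<26s} exposure = {e:.4f} mg/kg bw/day "
          f"({'exceeds' if e > AOEL else 'below'} AOEL)")
```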

  1. Method and system for dynamic probabilistic risk assessment

    NASA Technical Reports Server (NTRS)

    Dugan, Joanne Bechta (Inventor); Xu, Hong (Inventor)

    2013-01-01

    The DEFT methodology, system and computer readable medium extends the applicability of the PRA (Probabilistic Risk Assessment) methodology to computer-based systems, by allowing DFT (Dynamic Fault Tree) nodes as pivot nodes in the Event Tree (ET) model. DEFT includes a mathematical model and solution algorithm, supports all common PRA analysis functions and cutsets. Additional capabilities enabled by the DFT include modularization, phased mission analysis, sequence dependencies, and imperfect coverage.

  2. The Comparison of Inductive Reasoning under Risk Conditions between Chinese and Japanese Based on Computational Models: Toward the Application to CAE for Foreign Language

    ERIC Educational Resources Information Center

    Zhang, Yujie; Terai, Asuka; Nakagawa, Masanori

    2013-01-01

    Inductive reasoning under risk conditions is an important thinking process not only for sciences but also in our daily life. From this viewpoint, it is very useful for language learning to construct computational models of inductive reasoning which realize the CAE for foreign languages. This study proposes the comparison of inductive reasoning…

  3. Parameters for Pesticide QSAR and PBPK/PD Models to inform Human Risk Assessments

    EPA Science Inventory

    Physiologically-based pharmacokinetic and pharmacodynamic (PBPK/PD) modeling has emerged as an important computational approach supporting quantitative risk assessment of agrochemicals. However, before complete regulatory acceptance of this tool, an assessment of assets and liabi...

  4. Evaluating Computer-Based Assessment in a Risk-Based Model

    ERIC Educational Resources Information Center

    Zakrzewski, Stan; Steven, Christine; Ricketts, Chris

    2009-01-01

    There are three purposes for evaluation: evaluation for action to aid the decision making process, evaluation for understanding to further enhance enlightenment and evaluation for control to ensure compliance to standards. This article argues that the primary function of evaluation in the "Catherine Wheel" computer-based assessment (CBA)…

  5. Efficient computation of the joint probability of multiple inherited risk alleles from pedigree data.

    PubMed

    Madsen, Thomas; Braun, Danielle; Peng, Gang; Parmigiani, Giovanni; Trippa, Lorenzo

    2018-06-25

    The Elston-Stewart peeling algorithm enables estimation of an individual's probability of harboring germline risk alleles based on pedigree data, and serves as the computational backbone of important genetic counseling tools. However, it remains limited to the analysis of risk alleles at a small number of genetic loci because its computing time grows exponentially with the number of loci considered. We propose a novel, approximate version of this algorithm, dubbed the peeling and paring algorithm, which scales polynomially in the number of loci. This allows extending peeling-based models to include many genetic loci. The algorithm creates a trade-off between accuracy and speed, and allows the user to control this trade-off. We provide exact bounds on the approximation error and evaluate it in realistic simulations. Results show that the loss of accuracy due to the approximation is negligible in important applications. This algorithm will improve genetic counseling tools by increasing the number of pathogenic risk alleles that can be addressed. To illustrate, we create an extended five-gene version of BRCAPRO, a widely used model for estimating the carrier probabilities of BRCA1 and BRCA2 risk alleles, and assess its computational properties. © 2018 WILEY PERIODICALS, INC.

  6. Direct biomechanical modeling of trabecular bone using a nonlinear manifold-based volumetric representation

    NASA Astrophysics Data System (ADS)

    Jin, Dakai; Lu, Jia; Zhang, Xiaoliu; Chen, Cheng; Bai, ErWei; Saha, Punam K.

    2017-03-01

    Osteoporosis is associated with increased fracture risk. Recent advancement in the area of in vivo imaging allows segmentation of trabecular bone (TB) microstructures, which is a known key determinant of bone strength and fracture risk. An accurate biomechanical modelling of TB micro-architecture provides a comprehensive summary measure of bone strength and fracture risk. In this paper, a new direct TB biomechanical modelling method using nonlinear manifold-based volumetric reconstruction of trabecular network is presented. It is accomplished in two sequential modules. The first module reconstructs a nonlinear manifold-based volumetric representation of TB networks from three-dimensional digital images. Specifically, it starts with the fuzzy digital segmentation of a TB network, and computes its surface and curve skeletons. An individual trabecula is identified as a topological segment in the curve skeleton. Using geometric analysis, smoothing and optimization techniques, the algorithm generates smooth, curved, and continuous representations of individual trabeculae glued at their junctions. Also, the method generates a geometrically consistent TB volume at junctions. In the second module, a direct computational biomechanical stress-strain analysis is applied on the reconstructed TB volume to predict mechanical measures. The accuracy of the method was examined using micro-CT imaging of cadaveric distal tibia specimens (N = 12). A high linear correlation (r = 0.95) between TB volume computed using the new manifold-modelling algorithm and that directly derived from the voxel-based micro-CT images was observed. Young's modulus (YM) was computed using direct mechanical analysis on the TB manifold-model over a cubical volume of interest (VOI), and its correlation with the YM, computed using micro-CT based conventional finite-element analysis over the same VOI, was examined. A moderate linear correlation (r = 0.77) was observed between the two YM measures. These preliminary results show the accuracy of the new nonlinear manifold modelling algorithm for TB and demonstrate the feasibility of a new direct mechanical stress-strain analysis on a nonlinear manifold model of a highly complex biological structure.

  7. Unclassified Computing Capability: User Responses to a Multiprogrammatic and Institutional Computing Questionnaire

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McCoy, M; Kissel, L

    2002-01-29

    We are experimenting with a new computing model to be applied to a new computer dedicated to that model. Several LLNL science teams now have computational requirements, evidenced by the mature scientific applications that have been developed over the past five plus years, that far exceed the capability of the institution's computing resources. Thus, there is increased demand for dedicated, powerful parallel computational systems. Computation can, in the coming year, potentially field a capability system that is low cost because it will be based on a model that employs open source software and because it will use PC (IA32-P4) hardware. This incurs significant computer science risk regarding stability and system features but also presents great opportunity. We believe the risks can be managed, but the existence of risk cannot be ignored. In order to justify the budget for this system, we need to make the case that it serves science and, through serving science, serves the institution. That is the point of the meeting and the White Paper that we are proposing to prepare. The questions are listed and the responses received are in this report.

  8. USE OF BIOLOGICALLY BASED COMPUTATIONAL MODELING IN MODE OF ACTION-BASED RISK ASSESSMENT – AN EXAMPLE OF CHLOROFORM

    EPA Science Inventory

    The objective of current work is to develop a new cancer dose-response assessment for chloroform using a physiologically based pharmacokinetic/pharmacodynamic (PBPK/PD) model. The PBPK/PD model is based on a mode of action in which the cytolethality of chloroform occurs when the ...

  9. Multi-objective reverse logistics model for integrated computer waste management.

    PubMed

    Ahluwalia, Poonam Khanijo; Nema, Arvind K

    2006-12-01

    This study aimed to address the issues involved in the planning and design of a computer waste management system in an integrated manner. A decision-support tool is presented for selecting an optimum configuration of computer waste management facilities (segregation, storage, treatment/processing, reuse/recycle and disposal) and allocation of waste to these facilities. The model is based on an integer linear programming method with the objectives of minimizing environmental risk as well as cost. The issue of uncertainty in the estimated waste quantities from multiple sources is addressed using the Monte Carlo simulation technique. An illustrative example of computer waste management in Delhi, India, is presented to demonstrate the usefulness of the proposed model and to study tradeoffs between cost and risk. The results of the example problem show that it is possible to reduce the environmental risk significantly by a marginal increase in the available cost. The proposed model can serve as a powerful tool to address the environmental problems associated with exponentially growing quantities of computer waste which are presently being managed using rudimentary methods of reuse, recovery and disposal by various small-scale vendors.

  10. Web-Based Real Time Earthquake Forecasting and Personal Risk Management

    NASA Astrophysics Data System (ADS)

    Rundle, J. B.; Holliday, J. R.; Graves, W. R.; Turcotte, D. L.; Donnellan, A.

    2012-12-01

    Earthquake forecasts have been computed by a variety of countries and economies world-wide for over two decades. For the most part, forecasts have been computed for insurance, reinsurance and underwriters of catastrophe bonds. One example is the Working Group on California Earthquake Probabilities that has been responsible for the official California earthquake forecast since 1988. However, in a time of increasingly severe global financial constraints, we are now moving inexorably towards personal risk management, wherein mitigating risk is becoming the responsibility of individual members of the public. Under these circumstances, open access to a variety of web-based tools, utilities and information is a necessity. Here we describe a web-based system that has been operational since 2009 at www.openhazards.com and www.quakesim.org. Models for earthquake physics and forecasting require input data, along with model parameters. The models we consider are the Natural Time Weibull (NTW) model for regional earthquake forecasting, together with models for activation and quiescence. These models use small earthquakes ("seismicity-based models") to forecast the occurrence of large earthquakes, either through varying rates of small earthquake activity, or via an accumulation of this activity over time. These approaches use data-mining algorithms combined with the ANSS earthquake catalog. The basic idea is to compute large earthquake probabilities using the number of small earthquakes that have occurred in a region since the last large earthquake. Each of these approaches has computational challenges associated with computing forecast information in real time. Using 25 years of data from the ANSS California-Nevada catalog of earthquakes, we show that real-time forecasting is possible at a grid scale of 0.1°. We have analyzed the performance of these models using Reliability/Attributes and standard Receiver Operating Characteristic (ROC) tests. We show how the Reliability and ROC tests allow us to judge data completeness and estimate error. It is clear from much of the analysis that data quality is a major limitation on the accurate computation of earthquake probabilities. We discuss the challenges and pitfalls in serving up these datasets over the web.
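
    A minimal sketch of the natural-time idea behind such seismicity-based forecasts: the probability of a large event conditioned on the count of small events since the last one, here with an assumed Weibull hazard in natural time. The shape parameter and mean inter-event count are illustrative, not the calibrated values served at openhazards.com.

```python
import math

BETA = 1.3        # Weibull shape in natural time (assumed)
N_MEAN = 1200.0   # mean number of small events between large events (assumed)

def conditional_prob(n_since, n_ahead):
    """P(large event within the next n_ahead small events, given that n_since
    small events have occurred since the last large event)."""
    surv = lambda n: math.exp(-((n / N_MEAN) ** BETA))
    return 1.0 - surv(n_since + n_ahead) / surv(n_since)

# e.g. 900 small events observed so far, ~150 more expected over the next year.
print(f"conditional probability: {conditional_prob(900, 150):.3f}")
```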

  11. Functional correlation approach to operational risk in banking organizations

    NASA Astrophysics Data System (ADS)

    Kühn, Reimer; Neu, Peter

    2003-05-01

    A Value-at-Risk-based model is proposed to compute the adequate equity capital necessary to cover potential losses due to operational risks, such as human and system process failures, in banking organizations. Exploring the analogy to a lattice gas model from physics, correlations between sequential failures are modeled as functionally defined, heterogeneous couplings between mutually supportive processes. In contrast to traditional risk models for market and credit risk, where correlations are described as equal-time correlations by a covariance matrix, the dynamics of the model shows collective phenomena such as bursts and avalanches of process failures.
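
    A toy rendition of the functional-dependency idea: process failures raise the next-step failure probability of the processes they support, losses are accumulated over a year of time steps, and a Value-at-Risk quantile is read off the simulated loss distribution. The network, probabilities and loss sizes are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(13)
n_proc, T, n_scen = 20, 250, 5_000
p0, coupling = 0.01, 0.25                       # baseline and coupling strengths (assumed)

support = rng.random((n_proc, n_proc)) < 0.1    # support[i, j]: process j supports process i
np.fill_diagonal(support, False)
n_sup = np.maximum(support.sum(axis=1), 1)
loss_per_failure = rng.uniform(0.1, 1.0, n_proc)

annual_loss = np.empty(n_scen)
for s in range(n_scen):
    failed, total = np.zeros(n_proc), 0.0
    for _ in range(T):
        frac_failed_sup = (support.astype(float) @ failed) / n_sup
        p = np.clip(p0 + coupling * frac_failed_sup, 0.0, 1.0)
        failed = (rng.random(n_proc) < p).astype(float)
        total += loss_per_failure @ failed
    annual_loss[s] = total

print(f"99% operational Value-at-Risk (toy loss units): {np.quantile(annual_loss, 0.99):.1f}")
```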

  12. A stochastic multicriteria model for evidence-based decision making in drug benefit-risk analysis.

    PubMed

    Tervonen, Tommi; van Valkenhoef, Gert; Buskens, Erik; Hillege, Hans L; Postmus, Douwe

    2011-05-30

    Drug benefit-risk (BR) analysis is based on firm clinical evidence regarding various safety and efficacy outcomes. In this paper, we propose a new and more formal approach for constructing a supporting multi-criteria model that fully takes into account the evidence on efficacy and adverse drug reactions. Our approach is based on the stochastic multi-criteria acceptability analysis methodology, which allows us to compute the typical value judgments that support a decision, to quantify decision uncertainty, and to compute a comprehensive BR profile. We construct a multi-criteria model for the therapeutic group of second-generation antidepressants. We assess fluoxetine and venlafaxine together with placebo according to incidence of treatment response and three common adverse drug reactions by using data from a published study. Our model shows that there are clear trade-offs among the treatment alternatives. Copyright © 2011 John Wiley & Sons, Ltd.
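
    A minimal SMAA-style sketch: with no preference information, criterion weights are sampled uniformly from the simplex and the first-rank acceptability of each alternative is estimated by Monte Carlo. The benefit-risk table below is invented for illustration and does not reproduce the paper's antidepressant data.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical benefit-risk table: rows = alternatives, columns = criteria
# (treatment response plus three adverse-reaction incidences, as proportions).
alternatives = ["placebo", "fluoxetine", "venlafaxine"]
values = np.array([[0.30, 0.02, 0.03, 0.01],
                   [0.50, 0.10, 0.08, 0.05],
                   [0.55, 0.15, 0.12, 0.07]])
maximize = np.array([True, False, False, False])

# Normalise each criterion to a 0-1 partial value (1 = best observed level).
lo, hi = values.min(axis=0), values.max(axis=0)
partial = (values - lo) / (hi - lo)
partial[:, ~maximize] = 1.0 - partial[:, ~maximize]

N = 100_000
rank1 = np.zeros(len(alternatives))
for _ in range(N):
    w = rng.dirichlet(np.ones(values.shape[1]))   # uniform random weights
    rank1[(partial @ w).argmax()] += 1

for name, acc in zip(alternatives, rank1 / N):
    print(f"rank-1 acceptability of {name}: {acc:.2f}")
```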

  13. A Comprehensive Review of Existing Risk Assessment Models in Cloud Computing

    NASA Astrophysics Data System (ADS)

    Amini, Ahmad; Jamil, Norziana

    2018-05-01

    Cloud computing is a popular paradigm in information technology and computing as it offers numerous advantages in terms of economic savings and minimal management effort. Although elasticity and flexibility bring tremendous benefits, they still raise many information security issues due to the unique characteristics that allow ubiquitous computing. Therefore, the vulnerabilities and threats in cloud computing have to be identified, and a proper risk assessment mechanism has to be in place for better cloud computing management. Various quantitative and qualitative risk assessment models have been proposed, but to our knowledge none of them is suitable for the cloud computing environment. In this paper, we compare and analyse the strengths and weaknesses of existing risk assessment models. We then propose a new risk assessment model that sufficiently addresses all the characteristics of cloud computing, something the existing models do not do.

  14. From QSAR to QSIIR: Searching for Enhanced Computational Toxicology Models

    PubMed Central

    Zhu, Hao

    2017-01-01

    Quantitative Structure Activity Relationship (QSAR) is the most frequently used modeling approach to explore the dependency of biological, toxicological, or other types of activities/properties of chemicals on their molecular features. In the past two decades, QSAR modeling has been used extensively in the drug discovery process. However, the predictive models resulting from QSAR studies have limited use for chemical risk assessment, especially for animal and human toxicity evaluations, due to low predictivity for new compounds. To develop enhanced toxicity models with independently validated external prediction power, novel modeling protocols were pursued by computational toxicologists based on rapidly increasing toxicity testing data in recent years. This chapter reviews the recent effort in our laboratory to incorporate biological testing results as descriptors in the toxicity modeling process. This effort extended the concept of QSAR to Quantitative Structure In vitro-In vivo Relationship (QSIIR). The QSIIR study examples provided in this chapter indicate that QSIIR models based on hybrid (biological and chemical) descriptors are indeed superior to conventional QSAR models based only on chemical descriptors for several animal toxicity endpoints. We believe that the applications introduced in this review will be of interest and value to researchers working in the field of computational drug discovery and environmental chemical risk assessment. PMID:23086837
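
    A minimal sketch of the hybrid-descriptor idea on synthetic data: cross-validated accuracy of a model trained on chemical descriptors alone versus one trained on chemical plus in vitro descriptors. The data generation, descriptor counts and random-forest learner are all assumptions, not the chapter's datasets or protocol.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(4)
n = 600
chem = rng.normal(size=(n, 20))      # chemical descriptors (simulated)
bio = rng.normal(size=(n, 5))        # in vitro assay results (simulated)

# Simulated in vivo toxicity driven partly by the biological responses.
toxic = (0.8 * bio[:, 0] + 0.4 * chem[:, 0] + rng.normal(0, 0.7, n)) > 0

qsar = cross_val_score(RandomForestClassifier(random_state=0), chem, toxic, cv=5).mean()
qsiir = cross_val_score(RandomForestClassifier(random_state=0),
                        np.hstack([chem, bio]), toxic, cv=5).mean()
print(f"chemical-only CV accuracy: {qsar:.2f}   hybrid CV accuracy: {qsiir:.2f}")
```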

  15. Home-Based Risk of Falling Assessment Test Using a Closed-Loop Balance Model.

    PubMed

    Ayena, Johannes C; Zaibi, Helmi; Otis, Martin J-D; Menelas, Bob-Antoine J

    2016-12-01

    The aim of this study is to improve and facilitate the methods used to assess the risk of falling at home among older people through real-time computation of a risk of falling during daily activities. To enable this real-time computation, a closed-loop balance model is proposed and compared with the One-Leg Standing Test (OLST). This balance model allows studying the postural response of a person subjected to an unpredictable perturbation. Twenty-nine volunteers participated in this study to evaluate the effectiveness of the proposed system, including seventeen older participants, namely ten healthy elderly (68.4 ± 5.5 years) and seven Parkinson's disease (PD) subjects (66.28 ± 8.9 years), and twelve healthy young adults (28.27 ± 3.74 years). Our work suggests that there is a relationship between the OLST score and the risk of falling based on center-of-pressure measurement with four low-cost force sensors located inside an instrumented insole, which could be predicted using our suggested closed-loop balance model. For long-term monitoring at home, this system could be included in a medical electronic record and could be useful as a diagnostic aid tool.

  16. Rollover risk prediction of heavy vehicles by reliability index and empirical modelling

    NASA Astrophysics Data System (ADS)

    Sellami, Yamine; Imine, Hocine; Boubezoul, Abderrahmane; Cadiou, Jean-Charles

    2018-03-01

    This paper focuses on a combination of a reliability-based approach and an empirical modelling approach for rollover risk assessment of heavy vehicles. A reliability-based warning system is developed to alert the driver to a potential rollover before entering a bend. The idea behind the proposed methodology is to estimate the rollover risk by the probability that the vehicle load transfer ratio (LTR) exceeds a critical threshold. Accordingly, a so-called reliability index may be used as a measure to assess the vehicle's safe functioning. In the reliability method, computing the maximum of the LTR requires predicting the vehicle dynamics over the bend, which can in some cases be intractable or time-consuming. With the aim of improving the reliability computation time, an empirical model is developed to substitute for the vehicle dynamics and rollover models. This is done using the SVM (Support Vector Machines) algorithm. The preliminary results obtained demonstrate the effectiveness of the proposed approach.
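
    A minimal sketch of the surrogate idea: an SVM regressor is trained on a cheap stand-in for the vehicle-dynamics model (a static load-transfer-ratio formula), then used to estimate a rollover probability under speed uncertainty before a bend. The LTR formula, parameter ranges and thresholds are assumptions, not the paper's vehicle model.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

rng = np.random.default_rng(5)

def max_ltr(speed_ms, curvature_inv_m):
    """Toy stand-in for the expensive dynamics run: a static LTR estimate."""
    lateral_acc = speed_ms**2 * curvature_inv_m
    return np.clip(0.153 * lateral_acc, 0.0, 1.5)

# "Expensive" simulations form the training set for the empirical surrogate.
X = np.column_stack([rng.uniform(10, 30, 300), rng.uniform(0.002, 0.02, 300)])
y = max_ltr(X[:, 0], X[:, 1])
surrogate = make_pipeline(StandardScaler(), SVR(C=10.0, epsilon=0.01)).fit(X, y)

# Reliability-style check before a bend: P(max LTR > 1) under speed uncertainty.
speeds = rng.normal(22.0, 1.5, 5_000)
curvature = np.full_like(speeds, 0.012)
p_rollover = np.mean(surrogate.predict(np.column_stack([speeds, curvature])) > 1.0)
print(f"estimated rollover probability for this bend: {p_rollover:.3f}")
```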

  17. A Novel Biobjective Risk-Based Model for Stochastic Air Traffic Network Flow Optimization Problem.

    PubMed

    Cai, Kaiquan; Jia, Yaoguang; Zhu, Yanbo; Xiao, Mingming

    2015-01-01

    Network-wide air traffic flow management (ATFM) is an effective way to alleviate demand-capacity imbalances globally and thereby reduce airspace congestion and flight delays. The conventional ATFM models assume the capacities of airports or airspace sectors are all predetermined. However, the capacity uncertainties due to the dynamics of convective weather may make the deterministic ATFM measures impractical. This paper investigates the stochastic air traffic network flow optimization (SATNFO) problem, which is formulated as a weighted biobjective 0-1 integer programming model. In order to evaluate the effect of capacity uncertainties on ATFM, the operational risk is modeled via probabilistic risk assessment and introduced as an extra objective in the SATNFO problem. Computational experiments using real-world air traffic network data associated with simulated weather data show that the presented model has far fewer constraints than a stochastic model with nonanticipative constraints, which means our proposed model reduces the computational complexity.

  18. Launch Vehicle Debris Models and Crew Vehicle Ascent Abort Risk

    NASA Technical Reports Server (NTRS)

    Gee, Ken; Lawrence, Scott

    2013-01-01

    For manned space launch systems, a reliable abort system is required to reduce the risks associated with a launch vehicle failure during ascent. Understanding the risks associated with failure environments can be achieved through the use of physics-based models of these environments. Debris fields due to destruction of the launch vehicle are one such environment. To better analyze the risk posed by debris, a physics-based model for generating launch vehicle debris catalogs has been developed. The model predicts the mass distribution of the debris field based on formulae developed from analysis of explosions. Imparted velocity distributions are computed using a shock-physics code to model the explosions within the launch vehicle. A comparison of the debris catalog with an existing catalog for the Shuttle external tank shows good agreement in the debris characteristics and the predicted debris strike probability. The model is used to analyze the effects of the number of debris pieces and the velocity distributions on the strike probability and risk.
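
    A toy Monte Carlo in the spirit of the strike-probability analysis: for each simulated breakup, debris pieces are drawn with assumed count, mass and imparted-speed distributions and isotropic directions, and a piece counts as threatening if it is heavy and fast enough and heads into a forward cone. Every distribution and threshold here is invented; the actual model uses explosion-derived formulae and a shock-physics code.

```python
import numpy as np

rng = np.random.default_rng(11)
N_TRIALS = 50_000

CONE_HALF_ANGLE = np.deg2rad(15.0)   # assumed geometry of the abort flyout
MIN_CLOSING_SPEED = 30.0             # m/s, assumed
MIN_MASS = 0.5                       # kg, assumed damage-relevant mass

hits = 0
for _ in range(N_TRIALS):
    n_pieces = rng.poisson(250)                        # debris count per breakup
    masses = 0.1 * rng.pareto(1.8, n_pieces)           # heavy-tailed mass distribution
    speeds = 60.0 * rng.weibull(2.0, n_pieces)         # imparted speeds, m/s
    cos_theta = rng.uniform(-1.0, 1.0, n_pieces)       # isotropic directions
    threatening = ((cos_theta > np.cos(CONE_HALF_ANGLE))
                   & (speeds > MIN_CLOSING_SPEED)
                   & (masses > MIN_MASS))
    hits += bool(threatening.any())

print(f"probability of at least one threatening debris piece: {hits / N_TRIALS:.3f}")
```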

  19. Development of a General Form CO2 and Brine Flux Input Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mansoor, K.; Sun, Y.; Carroll, S.

    2014-08-01

    The National Risk Assessment Partnership (NRAP) project is developing a science-based toolset for the quantitative analysis of the potential risks associated with changes in groundwater chemistry from CO2 injection. In order to address uncertainty probabilistically, NRAP is developing efficient, reduced-order models (ROMs) as part of its approach. These ROMs are built from detailed, physics-based process models to provide confidence in the predictions over a range of conditions. The ROMs are designed to reproduce accurately the predictions from the computationally intensive process models at a fraction of the computational time, thereby allowing the utilization of Monte Carlo methods to probe variability in key parameters. This report presents the procedures used to develop a generalized model for CO2 and brine leakage fluxes based on the output of a numerical wellbore simulation. The resulting generalized parameters and ranges reported here will be used for the development of third-generation groundwater ROMs.

  20. A high-resolution physically-based global flood hazard map

    NASA Astrophysics Data System (ADS)

    Kaheil, Y.; Begnudelli, L.; McCollum, J.

    2016-12-01

    We present the results from a physically-based global flood hazard model. The model uses a physically-based hydrologic model to simulate river discharges, and a 2D hydrodynamic model to simulate inundation. The model is set up such that it allows large-scale flood hazard applications through efficient use of parallel computing. For hydrology, we use the Hillslope River Routing (HRR) model. HRR accounts for surface hydrology using Green-Ampt parameterization. The model is calibrated against observed discharge data from the Global Runoff Data Centre (GRDC) network, among other publicly-available datasets. The parallel-computing framework takes advantage of the river network structure to minimize cross-processor messages, and thus significantly increases computational efficiency. For inundation, we implemented a computationally-efficient 2D finite-volume model with wetting/drying. The approach consists of simulating floods along the river network by forcing the hydraulic model with the streamflow hydrographs simulated by HRR and scaled up to certain return levels, e.g. 100 years. The model is distributed such that each available processor takes the next simulation. Given an approximate criterion, the simulations are ordered from most-demanding to least-demanding to ensure that all processors finalize almost simultaneously. Upon completing all simulations, the maximum envelope of flood depth is taken to generate the final map. The model is applied globally, with selected results shown from different continents and regions. The maps shown depict flood depth and extent at different return periods. These maps, which are currently available at 3 arc-sec resolution (about 90 m), can be made available at higher resolutions where high-resolution DEMs are available. The maps can be utilized by flood risk managers at the national, regional, and even local levels to further understand their flood risk exposure, exercise certain measures of mitigation, and/or transfer the residual risk financially through flood insurance programs.
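
    A small sketch of the load-balancing rule mentioned above (order simulations from most to least demanding, then always hand the next one to the least-loaded worker), which is the classic longest-processing-time heuristic. The runtime estimates and worker count are made up.

```python
import heapq

def schedule_longest_first(costs, n_workers):
    """Greedy longest-first assignment: each simulation goes to the worker with
    the least accumulated work, after sorting jobs from most to least demanding."""
    loads = [(0.0, w) for w in range(n_workers)]     # (accumulated cost, worker id)
    heapq.heapify(loads)
    assignment = {w: [] for w in range(n_workers)}
    for job, cost in sorted(enumerate(costs), key=lambda jc: jc[1], reverse=True):
        load, w = heapq.heappop(loads)
        assignment[w].append(job)
        heapq.heappush(loads, (load + cost, w))
    return assignment, max(load for load, _ in loads)

# Hypothetical per-reach runtime estimates (e.g. proportional to river length).
est_costs = [9.5, 1.2, 7.8, 3.3, 6.1, 0.9, 4.4, 2.7]
plan, makespan = schedule_longest_first(est_costs, n_workers=3)
print(plan, "approximate makespan:", makespan)
```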

  1. Intelligent judgements over health risks in a spatial agent-based model.

    PubMed

    Abdulkareem, Shaheen A; Augustijn, Ellen-Wien; Mustafa, Yaseen T; Filatova, Tatiana

    2018-03-20

    Millions of people worldwide are exposed to deadly infectious diseases on a regular basis. Breaking news of the Zika outbreak, for instance, made the main media headlines internationally. Perceiving disease risks motivates people to adapt their behavior toward a safer and more protective lifestyle. Computational science is instrumental in exploring patterns of disease spread emerging from many individual decisions and interactions among agents and their environment by means of agent-based models. Yet, current disease models rarely consider simulating dynamics in risk perception and its impact on the adaptive protective behavior. Social sciences offer insights into individual risk perception and corresponding protective actions, while machine learning provides algorithms and methods to capture these learning processes. This article presents an innovative approach to extend agent-based disease models by capturing behavioral aspects of decision-making in a risky context using machine learning techniques. We illustrate it with a case of cholera in Kumasi, Ghana, accounting for spatial and social risk factors that affect intelligent behavior and corresponding disease incidents. The results of computational experiments comparing intelligent with zero-intelligent representations of agents in a spatial disease agent-based model are discussed. We present a spatial disease agent-based model (ABM) with agents' behavior grounded in Protection Motivation Theory. Spatial and temporal patterns of disease diffusion among zero-intelligent agents are compared to those produced by a population of intelligent agents. Two Bayesian Networks (BNs) are designed and coded using R and are further integrated with the NetLogo-based Cholera ABM. The first is a one-tier BN1 (only risk perception), the second is a two-tier BN2 (risk and coping behavior). We run three experiments (zero-intelligent agents, BN1 intelligence and BN2 intelligence) and report the results per experiment in terms of several macro metrics of interest: an epidemic curve, a risk perception curve, and a distribution of different types of coping strategies over time. Our results emphasize the importance of integrating behavioral aspects of decision making under risk into spatial disease ABMs using machine learning algorithms. This is especially relevant when studying cumulative impacts of behavioral changes and possible intervention strategies.

  2. Image analysis and modeling in medical image computing. Recent developments and advances.

    PubMed

    Handels, H; Deserno, T M; Meinzer, H-P; Tolxdorff, T

    2012-01-01

    Medical image computing is of growing importance in medical diagnostics and image-guided therapy. Nowadays, image analysis systems integrating advanced image computing methods are used in practice e.g. to extract quantitative image parameters or to support the surgeon during a navigated intervention. However, the grade of automation, accuracy, reproducibility and robustness of medical image computing methods has to be increased to meet the requirements in clinical routine. In the focus theme, recent developments and advances in the field of modeling and model-based image analysis are described. The introduction of models in the image analysis process enables improvements of image analysis algorithms in terms of automation, accuracy, reproducibility and robustness. Furthermore, model-based image computing techniques open up new perspectives for prediction of organ changes and risk analysis of patients. Selected contributions are assembled to present latest advances in the field. The authors were invited to present their recent work and results based on their outstanding contributions to the Conference on Medical Image Computing BVM 2011 held at the University of Lübeck, Germany. All manuscripts had to pass a comprehensive peer review. Modeling approaches and model-based image analysis methods showing new trends and perspectives in model-based medical image computing are described. Complex models are used in different medical applications and medical images like radiographic images, dual-energy CT images, MR images, diffusion tensor images as well as microscopic images are analyzed. The applications emphasize the high potential and the wide application range of these methods. The use of model-based image analysis methods can improve segmentation quality as well as the accuracy and reproducibility of quantitative image analysis. Furthermore, image-based models enable new insights and can lead to a deeper understanding of complex dynamic mechanisms in the human body. Hence, model-based image computing methods are important tools to improve medical diagnostics and patient treatment in future.

  3. Risk in the Clouds?: Security Issues Facing Government Use of Cloud Computing

    NASA Astrophysics Data System (ADS)

    Wyld, David C.

    Cloud computing is poised to become one of the most important and fundamental shifts in how computing is consumed and used. Forecasts show that government will play a lead role in adopting cloud computing - for data storage, applications, and processing power, as IT executives seek to maximize their returns on limited procurement budgets in these challenging economic times. After an overview of the cloud computing concept, this article explores the security issues facing public sector use of cloud computing and looks at the risks and benefits of shifting to cloud-based models. It concludes with an analysis of the challenges that lie ahead for government use of cloud resources.

  4. EC FP6 Enviro-RISKS project outcomes in area of Earth and Space Science Informatics applications

    NASA Astrophysics Data System (ADS)

    Gordov, E. P.; Zakarin, E. A.

    2009-04-01

    Nowadays the community acknowledges that, to properly understand the dynamics of a regional environment and to assess it on the basis of monitoring and modeling, stronger involvement of information-computational technologies (ICT) is required, which should lead to the development of an information-computational infrastructure as an inherent part of such investigations. This paper is based on the Report & Recommendations (www.dmi.dk/dmi/sr08-05-4.pdf) of the Enviro-RISKS (Man-induced Environmental Risks: Monitoring, Management and Remediation of Man-made Changes in Siberia) Project thematic expert group for the Information Systems, Integration and Synthesis focus, and presents the results of Project Partners' activities in the development and use of information technologies for the environmental sciences. Approaches using web-based information technologies and GIS-based information technologies are described, and a way to integrate them is outlined. In particular, the Enviro-RISKS web portal developed in the course of the Project and its Climate site (http://climate.risks.scert.ru/), which provides access to an interactive web system for regional climate assessment based on standard meteorological data archives and is a key element of the information-computational infrastructure of the Siberia Integrated Regional Study (SIRS), is described in detail, as is a GIS-based system for monitoring and modeling the transport and transformation of air and water pollution. The latter is quite useful for practical applications of geoinformation modeling, in which the relevant mathematical models are embedded into a GIS and all modeling and analysis phases are accomplished in the information sphere, based on real data including satellite observations. Major efforts are currently undertaken to integrate GIS-based environmental applications with web accessibility, computing power and data interoperability, and thus fully exploit the huge potential of web-based technologies. In particular, development of a region-devoted web portal using approaches suggested by the Open Geospatial Consortium has recently been started. The current state of the information-computational infrastructure in the targeted region is a significant step toward a distributed collaborative information-computational environment to support multidisciplinary investigations of the regional environment, especially those requiring meteorology, atmospheric pollution transport and climate modeling. The cooperative links established in the course of the Project, new Partner initiatives, and the expertise gained allow us to hope that this infrastructure will soon make a significant contribution to understanding regional environmental processes in their relationship with Global Change. In particular, this infrastructure will play the role of the 'underlying mechanics' of the research work, leaving the earth scientists to concentrate on their investigations while also providing the environment to make research results available and understandable to everyone. In addition to the support of the core FP6 Enviro-RISKS project (INCO-CT-2004-013427), this activity was partially supported by SB RAS Integration Project 34, SB RAS Basic Program Project 4.5.2.2 and APN Project CBA2007-08NSY. Valuable input into the expert group work and its outcomes from Profs. V. Lykosov and A. Starchenko and Drs. D. Belikov, M. Korets, S. Kostrykin, B. Mirkarimova, I. Okladnikov, A. Titov and A. Tridvornov is acknowledged.

  5. Associations between Screen-Based Sedentary Behaviour and Anxiety Symptoms in Mothers with Young Children

    PubMed Central

    Teychenne, Megan; Hinkley, Trina

    2016-01-01

    Objectives Anxiety is a serious illness and women (including mothers with young children) are at particular risk. Although physical activity (PA) may reduce anxiety risk, little research has investigated the link between sedentary behaviour and anxiety risk. The aim of this study was to examine the association between screen-based sedentary behaviour and anxiety symptoms, independent of PA, amongst mothers with young children. Methods During 2013–2014, 528 mothers with children aged 2–5 years completed self-report measures of recreational screen-based sedentary behaviour (TV/DVD/video viewing, computer/e-games/hand held device use) and anxiety symptoms (using the Hospital Anxiety and Depression Scale, HADS-A). Linear regression analyses examined the cross-sectional association between screen-based sedentary behaviour and anxiety symptoms. Results In models that adjusted for key demographic and behavioural covariates (including moderate- to vigorous-intensity PA, MVPA), computer/device use (B = 0.212; 95% CI = 0.048, 0.377) and total screen time (B = 0.109; 95% CI = 0.014, 0.205) were positively associated with heightened anxiety symptoms. TV viewing was not associated with anxiety symptoms in either model. Conclusions Higher levels of recreational computer or handheld device use and overall screen time may be linked to higher risk of anxiety symptoms in mothers with young children, independent of MVPA. Further longitudinal and intervention research is required to determine temporal associations. PMID:27191953

  6. Probabilistic risk assessment for CO2 storage in geological formations: robust design and support for decision making under uncertainty

    NASA Astrophysics Data System (ADS)

    Oladyshkin, Sergey; Class, Holger; Helmig, Rainer; Nowak, Wolfgang

    2010-05-01

    CO2 storage in geological formations is currently being discussed intensively as a technology for mitigating CO2 emissions. However, any large-scale application requires a thorough analysis of the potential risks. Current numerical simulation models are too expensive for probabilistic risk analysis and for stochastic approaches based on brute-force repeated simulation. Even single deterministic simulations may require parallel high-performance computing. The multiphase flow processes involved are too non-linear for quasi-linear error propagation and other simplified stochastic tools. As an alternative approach, we propose a massive stochastic model reduction based on the probabilistic collocation method. The model response is projected onto an orthogonal basis of higher-order polynomials to approximate its dependence on uncertain parameters (porosity, permeability etc.) and design parameters (injection rate, depth etc.). This allows for a non-linear propagation of model uncertainty affecting the predicted risk, ensures fast computation and provides a powerful tool for combining design variables and uncertain variables into one approach based on an integrative response surface. Thus, the design task of finding optimal injection regimes explicitly includes uncertainty, which leads to robust designs of the non-linear system that minimize failure probability and provide valuable support for risk-informed management decisions. We validate our proposed stochastic approach by Monte Carlo simulation using a common 3D benchmark problem (Class et al., Computational Geosciences 13, 2009). A reasonable compromise between computational effort and precision was already reached with second-order polynomials. In our case study, the proposed approach yields a significant computational speedup by a factor of 100 compared to Monte Carlo simulation. We demonstrate that, due to the non-linearity of the flow and transport processes during CO2 injection, including uncertainty in the analysis leads to a systematic and significant shift of predicted leakage rates towards higher values compared with deterministic simulations, affecting both risk estimates and the design of injection scenarios. This implies that neglecting uncertainty can be a strong simplification for modeling CO2 injection, and the consequences can be stronger than when neglecting several physical phenomena (e.g. phase transition, convective mixing, capillary forces etc.). The authors would like to thank the German Research Foundation (DFG) for financial support of the project within the Cluster of Excellence in Simulation Technology (EXC 310/1) at the University of Stuttgart. Keywords: polynomial chaos; CO2 storage; multiphase flow; porous media; risk assessment; uncertainty; integrative response surfaces
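
    As an editorial illustration of the response-surface idea described above (not the authors' implementation), the following minimal Python sketch replaces an expensive simulator with a hypothetical toy leakage function, fits a second-order polynomial surrogate in two uncertain parameters by least squares, and then estimates a failure probability by cheap Monte Carlo on the surrogate. All function names, parameter ranges and the threshold are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def toy_leakage_model(porosity, log_perm, inj_rate):
    """Hypothetical stand-in for an expensive CO2 reservoir simulator:
    returns a (nonlinear) leakage rate."""
    return 0.05 * inj_rate * np.exp(0.8 * log_perm) / (porosity + 0.05)

# 1. Sample a small number of "collocation" points in the uncertain parameters.
n_train = 60
porosity = rng.uniform(0.1, 0.3, n_train)
log_perm = rng.normal(0.0, 0.5, n_train)
inj_rate = np.full(n_train, 1.0)          # design variable held fixed here
y = toy_leakage_model(porosity, log_perm, inj_rate)

# 2. Fit a second-order polynomial response surface by least squares.
def design_matrix(p, k):
    return np.column_stack([np.ones_like(p), p, k, p * k, p**2, k**2])

coeff, *_ = np.linalg.lstsq(design_matrix(porosity, log_perm), y, rcond=None)

# 3. Cheap Monte Carlo on the surrogate to estimate the failure probability
#    (probability that the leakage rate exceeds a threshold).
n_mc = 200_000
p_mc = rng.uniform(0.1, 0.3, n_mc)
k_mc = rng.normal(0.0, 0.5, n_mc)
leak_surrogate = design_matrix(p_mc, k_mc) @ coeff
threshold = 0.6
print("P(leakage > threshold) ~", np.mean(leak_surrogate > threshold))
```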

  7. Physics-Based Fragment Acceleration Modeling for Pressurized Tank Burst Risk Assessments

    NASA Technical Reports Server (NTRS)

    Manning, Ted A.; Lawrence, Scott L.

    2014-01-01

    As part of comprehensive efforts to develop physics-based risk assessment techniques for space systems at NASA, coupled computational fluid and rigid body dynamic simulations were carried out to investigate the flow mechanisms that accelerate tank fragments in bursting pressurized vessels. Simulations of several configurations were compared to analyses based on the industry-standard Baker explosion model, and were used to formulate an improved version of the model. The standard model, which neglects an external fluid, was found to agree best with simulation results only in configurations where the internal-to-external pressure ratio is very high and fragment curvature is small. The improved model introduces terms that accommodate an external fluid and better account for variations based on circumferential fragment count. Physics-based analysis was critical in increasing the model's range of applicability. The improved tank burst model can be used to produce more accurate risk assessments of space vehicle failure modes that involve high-speed debris, such as exploding propellant tanks and bursting rocket engines.

  8. Recent advances in mathematical modeling of developmental abnormalities using mechanistic information.

    PubMed

    Kavlock, R J

    1997-01-01

    During the last several years, significant changes in the risk assessment process for developmental toxicity of environmental contaminants have begun to emerge. The first of these changes is the development and beginning use of statistically based dose-response models [the benchmark dose (BMD) approach] that better utilize data derived from existing testing approaches. Accompanying this change is the greater emphasis placed on understanding and using mechanistic information to yield more accurate, reliable, and less uncertain risk assessments. The next stage in the evolution of risk assessment will be the use of biologically based dose-response (BBDR) models that begin to build into the statistically based models factors related to the underlying kinetic, biochemical, and/or physiologic processes perturbed by a toxicant. Such models are now emerging from several research laboratories. The introduction of quantitative models and the incorporation of biologic information into them has pointed to the need for even more sophisticated modifications for which we offer the term embryologically based dose-response (EBDR) models. Because these models would be based upon the understanding of normal morphogenesis, they represent a quantum leap in our thinking, but their complexity presents daunting challenges both to the developmental biologist and the developmental toxicologist. Implementation of these models will require extensive communication between developmental toxicologists, molecular embryologists, and biomathematicians. The remarkable progress in the understanding of mammalian embryonic development at the molecular level that has occurred over the last decade combined with advances in computing power and computational models should eventually enable these as yet hypothetical models to be brought into use.
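
    A minimal sketch of the benchmark dose (BMD) idea mentioned above, assuming hypothetical quantal developmental-toxicity data: a log-logistic dose-response curve is fitted with scipy, and the dose producing 10% extra risk over background (BMD10) is solved for numerically. The data, starting values and bounds are illustrative, not taken from any cited study.

```python
import numpy as np
from scipy.optimize import curve_fit, brentq

# Hypothetical quantal data: dose, number of affected fetuses / number examined
dose     = np.array([0.0, 10.0, 30.0, 100.0, 300.0])
affected = np.array([2,   3,    7,    15,    28])
total    = np.array([50,  50,   50,   50,    50])
p_obs = affected / total

def log_logistic(d, background, slope, ed50):
    """P(response) = background + (1 - background) / (1 + (ed50 / d)**slope)."""
    d = np.maximum(d, 1e-9)           # avoid division by zero at dose 0
    return background + (1.0 - background) / (1.0 + (ed50 / d) ** slope)

params, _ = curve_fit(log_logistic, dose, p_obs,
                      p0=[0.04, 1.0, 100.0],
                      bounds=([0.0, 0.1, 1.0], [0.5, 10.0, 1000.0]))
bg, slope, ed50 = params

# Benchmark dose for 10% extra risk over background:
# (P(BMD) - bg) / (1 - bg) = 0.10
bmr = 0.10
f = lambda d: (log_logistic(d, *params) - bg) / (1.0 - bg) - bmr
bmd = brentq(f, 1e-3, 1e4)
print(f"fitted background = {bg:.3f}, BMD10 ~ {bmd:.1f}")
```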

  9. Combining operational models and data into a dynamic vessel risk assessment tool for coastal regions

    NASA Astrophysics Data System (ADS)

    Fernandes, R.; Braunschweig, F.; Lourenço, F.; Neves, R.

    2016-02-01

    The technological evolution in terms of computational capacity, data acquisition systems, numerical modelling and operational oceanography is supplying opportunities for designing and building holistic approaches and complex tools for newer and more efficient management (planning, prevention and response) of coastal water pollution risk events. A combined methodology to dynamically estimate time- and space-variable individual vessel accident risk levels and shoreline contamination risk from ships has been developed, integrating numerical metocean forecasts and oil spill simulations with vessel tracking automatic identification systems (AIS). The risk rating combines the likelihood of an oil spill occurring from a vessel navigating in a study area - the Portuguese continental shelf - with the assessed consequences to the shoreline. The spill likelihood is based on dynamic marine weather conditions and statistical information from previous accidents. The shoreline consequences reflect the virtual spilled oil amount reaching the shoreline and its environmental and socio-economic vulnerabilities. The oil reaching the shoreline is quantified with an oil spill fate and behaviour model running multiple virtual spills from vessels along time or, as an alternative, with a correction factor based on vessel distance from the coast. Shoreline risks can be computed in real time or from previously obtained data. Results show the ability of the proposed methodology to estimate risk with proper sensitivity to dynamic metocean conditions and oil transport behaviour. The integration of meteo-oceanic and oil spill models with coastal vulnerability and AIS data in the quantification of risk enhances maritime situational awareness and the decision support model, providing a more realistic approach in the assessment of shoreline impacts. Risk assessment from historical data can help identify typical risk patterns ("hot spots") or support sensitivity analyses for specific conditions, whereas real-time risk levels can be used in the prioritization of individual ships, geographical areas, strategic tug positioning and the implementation of dynamic risk-based vessel traffic monitoring.
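
    The following is a highly simplified, hypothetical sketch of the "likelihood x consequence" risk rating described above. The weather weighting, the distance-decay stand-in for the oil-spill fate model, and all coefficients are illustrative assumptions, not the authors' formulation.

```python
from dataclasses import dataclass

@dataclass
class Vessel:
    name: str
    base_accident_rate: float   # historical accidents per vessel-hour (illustrative)
    cargo_tonnes: float         # potential spill volume proxy
    dist_to_coast_km: float

def spill_likelihood(v: Vessel, wave_height_m: float, wind_speed_ms: float) -> float:
    """Accident likelihood scaled by metocean severity (illustrative weighting)."""
    weather_factor = 1.0 + 0.3 * wave_height_m + 0.05 * wind_speed_ms
    return v.base_accident_rate * weather_factor

def shoreline_consequence(v: Vessel, shoreline_vulnerability: float) -> float:
    """Oil reaching shore approximated by a distance-decay factor (in place of
    running an oil-spill fate model), weighted by shoreline vulnerability."""
    reaching_fraction = max(0.0, 1.0 - v.dist_to_coast_km / 50.0)
    return v.cargo_tonnes * reaching_fraction * shoreline_vulnerability

def risk_rating(v: Vessel, wave_height_m, wind_speed_ms, vulnerability) -> float:
    return spill_likelihood(v, wave_height_m, wind_speed_ms) * \
           shoreline_consequence(v, vulnerability)

tanker = Vessel("tanker-01", base_accident_rate=1e-6,
                cargo_tonnes=80_000, dist_to_coast_km=12.0)
print(risk_rating(tanker, wave_height_m=3.5, wind_speed_ms=18.0, vulnerability=0.8))
```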

  10. The NASA Space Radiobiology Risk Assessment Project

    NASA Astrophysics Data System (ADS)

    Cucinotta, Francis A.; Huff, Janice; Ponomarev, Artem; Patel, Zarana; Kim, Myung-Hee

    The current first phase (2006-2011) has three major goals: 1) optimizing the conventional cancer risk models currently used, based on the double-detriment life table and radiation quality functions; 2) the integration of biophysical models of acute radiation syndromes; and 3) the development of new systems radiation biology models of cancer processes. The first phase also includes continued uncertainty assessment of space radiation environmental models and transport codes, and of relative biological effectiveness (RBE) factors, based on flight data and NSRL results, respectively. The second phase (2012-2016) will: 1) develop biophysical models of central nervous system (CNS) risks; 2) achieve comprehensive systems biology models of cancer processes using data from proton and heavy ion studies performed at NSRL; and 3) begin to identify computational models of biological countermeasures. Goals for the third phase (2017-2021) include: 1) the development of a systems biology model of cancer risks for operational use at NASA; 2) development of models of degenerative risks; 3) quantitative models of countermeasure impacts on cancer risks; and 4) individual-based risk assessments. Finally, we will support a decision point on continuing NSRL research in support of NASA's exploration goals beyond 2021, and create an archive of NSRL research results for continued analysis. Details on near-term goals, plans for a web-based data resource of NSRL results, and a space radiation Wikipedia are described.

  11. A risk management model for familial breast cancer: A new application using Fuzzy Cognitive Map method.

    PubMed

    Papageorgiou, Elpiniki I; Jayashree Subramanian; Karmegam, Akila; Papandrianos, Nikolaos

    2015-11-01

    Breast cancer is one of the deadliest diseases affecting women, and thus it is natural for women aged 40-49 years (who have a family history of breast cancer or other related cancers) to assess their personal risk for developing familial breast cancer (FBC). Moreover, as each woman possesses a different level of risk of developing breast cancer depending on her family history, genetic predisposition and personal medical history, an individualized care-setting mechanism needs to be identified so that appropriate risk assessment, counseling, screening, and prevention options can be determined by health care professionals. The presented work aims at developing a soft-computing-based medical decision support system using a Fuzzy Cognitive Map (FCM) that assists health care professionals in deciding the individualized care-setting mechanism based on the FBC risk level of a given woman. The FCM-based FBC risk management system uses nonlinear Hebbian learning (NHL) to learn causal weights from 40 patient records and achieves 95% diagnostic accuracy. The results obtained from the proposed model are in concurrence with the comprehensive risk evaluation tool based on the Tyrer-Cuzick model for 38 of the 40 patient cases (95%). In addition, the proposed model identifies high-risk women with higher prediction accuracy than the standard Gail and NSABP models. The testing accuracy of the proposed model using a 10-fold cross-validation technique outperforms other standard machine-learning-based inference engines as well as previous FCM-based risk prediction methods for breast cancer. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
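
    As an illustration of how a fuzzy cognitive map propagates activations (not the trained model or learned weights of this study), the sketch below iterates the usual FCM update rule with a sigmoid squashing function over a tiny hypothetical concept set until the activations settle.

```python
import numpy as np

def sigmoid(x, lam=1.0):
    return 1.0 / (1.0 + np.exp(-lam * x))

def fcm_infer(weights, initial_state, n_iter=50, tol=1e-5):
    """Iterate A(t+1) = f(A(t) + A(t) @ W) until the concept activations settle."""
    a = initial_state.copy()
    for _ in range(n_iter):
        a_new = sigmoid(a + a @ weights)
        if np.max(np.abs(a_new - a)) < tol:
            return a_new
        a = a_new
    return a

# Hypothetical concepts: [family history, age group, BRCA test, FBC risk level]
# Only the causal weights into the last (output) concept are non-zero here.
W = np.zeros((4, 4))
W[0, 3] = 0.7    # family history -> risk
W[1, 3] = 0.3    # age group      -> risk
W[2, 3] = 0.8    # positive BRCA  -> risk

patient = np.array([1.0, 0.6, 0.0, 0.5])   # input activations for one woman
state = fcm_infer(W, patient)
print("estimated FBC risk concept activation:", round(float(state[3]), 3))
```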

  12. [Physically-based model of pesticide application for risk assessment of agricultural workers].

    PubMed

    Rubino, F M; Mandic-Rajcevic, S; Vianello, G; Brambilla, G; Colosio, C

    2012-01-01

    Due to their unavoidable toxicity to non-target organisms, including man, the use of Plant Protection Products requires a thorough risk assessment to rationally advise farmers on safe-use procedures and protective equipment. Most information on active substances and formulations, such as dermal absorption rates and exposure limits, is available in the large body of regulatory data. Physically-based computational models can be used to forecast risk under real-life conditions (preventive assessment through 'exposure profiles'), to drive the cost-effective use of products and equipment, and to understand the sources of unexpected exposure.
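
    A minimal sketch of the kind of first-tier operator exposure calculation such physically-based models build on: a dermal deposit reduced by protective-equipment penetration and by the dermal absorption rate, normalized by body weight, and compared with an acceptable operator exposure level. All numerical values are illustrative assumptions, not regulatory figures.

```python
def systemic_dose_mg_per_kg(applied_dose_mg, ppe_penetration, dermal_absorption,
                            body_weight_kg):
    """First-tier operator exposure estimate: the external deposit is reduced by
    protective equipment, then by dermal absorption, per kg body weight."""
    return applied_dose_mg * ppe_penetration * dermal_absorption / body_weight_kg

# Illustrative values only
dose = systemic_dose_mg_per_kg(applied_dose_mg=120.0,   # deposit on skin/clothing
                               ppe_penetration=0.10,    # 90% retained by coveralls
                               dermal_absorption=0.05,  # 5% dermal absorption rate
                               body_weight_kg=70.0)
aoel_mg_per_kg = 0.02   # acceptable operator exposure level (illustrative)
print(f"systemic dose = {dose:.4f} mg/kg bw/day, "
      f"risk quotient = {dose / aoel_mg_per_kg:.2f}")
```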

  13. Combining operational models and data into a dynamic vessel risk assessment tool for coastal regions

    NASA Astrophysics Data System (ADS)

    Fernandes, R.; Braunschweig, F.; Lourenço, F.; Neves, R.

    2015-07-01

    The technological evolution in terms of computational capacity, data acquisition systems, numerical modelling and operational oceanography is supplying opportunities for designing and building holistic approaches and complex tools for newer and more efficient management (planning, prevention and response) of coastal water pollution risk events. A combined methodology to dynamically estimate time- and space-variable shoreline risk levels from ships has been developed, integrating numerical metocean forecasts and oil spill simulations with vessel tracking automatic identification systems (AIS). The risk rating combines the likelihood of an oil spill occurring from a vessel navigating in a study area - the Portuguese continental shelf - with the assessed consequences to the shoreline. The spill likelihood is based on dynamic marine weather conditions and statistical information from previous accidents. The shoreline consequences reflect the virtual spilled oil amount reaching the shoreline and its environmental and socio-economic vulnerabilities. The oil reaching the shoreline is quantified with an oil spill fate and behaviour model running multiple virtual spills from vessels along time. Shoreline risks can be computed in real time or from previously obtained data. Results show the ability of the proposed methodology to estimate risk with proper sensitivity to dynamic metocean conditions and oil transport behaviour. The integration of meteo-oceanic and oil spill models with coastal vulnerability and AIS data in the quantification of risk enhances maritime situational awareness and the decision support model, providing a more realistic approach in the assessment of shoreline impacts. Risk assessment from historical data can help identify typical risk patterns ("hot spots") or support sensitivity analyses for specific conditions, whereas real-time risk levels can be used in the prioritization of individual ships, geographical areas, strategic tug positioning and the implementation of dynamic risk-based vessel traffic monitoring.

  14. Relationship Between Vehicle Size and Fatality Risk in Model Year 1985-93 Passenger Cars and Light Trucks

    DOT National Transportation Integrated Search

    1997-01-01

    Fatality rates per million exposure years are computed by make, model and model year, based on the crash experience of model year 1985-93 passenger cars and light trucks (pickups, vans and sport utility vehicles) in the United States during calen...

  15. Overview of Threats and Failure Models for Safety-Relevant Computer-Based Systems

    NASA Technical Reports Server (NTRS)

    Torres-Pomales, Wilfredo

    2015-01-01

    This document presents a high-level overview of the threats to safety-relevant computer-based systems, including (1) a description of the introduction and activation of physical and logical faults; (2) the propagation of their effects; and (3) function-level and component-level error and failure mode models. These models can be used in the definition of fault hypotheses (i.e., assumptions) for threat-risk mitigation strategies. This document is a contribution to a guide currently under development that is intended to provide a general technical foundation for designers and evaluators of safety-relevant systems.

  16. A Review on Automatic Mammographic Density and Parenchymal Segmentation

    PubMed Central

    He, Wenda; Juette, Arne; Denton, Erika R. E.; Oliver, Arnau

    2015-01-01

    Breast cancer is the most frequently diagnosed cancer in women. However, the exact cause(s) of breast cancer still remains unknown. Early detection, precise identification of women at risk, and application of appropriate disease prevention measures are by far the most effective way to tackle breast cancer. There are more than 70 common genetic susceptibility factors included in the current non-image-based risk prediction models (e.g., the Gail and the Tyrer-Cuzick models). Image-based risk factors, such as mammographic densities and parenchymal patterns, have been established as biomarkers but have not been fully incorporated in the risk prediction models used for risk stratification in screening and/or measuring responsiveness to preventive approaches. Within computer aided mammography, automatic mammographic tissue segmentation methods have been developed for estimation of breast tissue composition to facilitate mammographic risk assessment. This paper presents a comprehensive review of automatic mammographic tissue segmentation methodologies developed over the past two decades and the evidence for risk assessment/density classification using segmentation. The aim of this review is to analyse how engineering advances have progressed and the impact automatic mammographic tissue segmentation has in a clinical environment, as well as to understand the current research gaps with respect to the incorporation of image-based risk factors in non-image-based risk prediction models. PMID:26171249

  17. The comparison of various approach to evaluation erosion risks and design control erosion measures

    NASA Astrophysics Data System (ADS)

    Kapicka, Jiri

    2015-04-01

    At present, a single methodology is used in the Czech Republic to compute and compare erosion risks; it also includes a method for designing erosion control measures. The methodology is based on the Universal Soil Loss Equation (USLE) and its result, the long-term average annual soil loss (G), and it is used by landscape planners. Data and statistics from the database of erosion events in the Czech Republic show, however, that many problems and damages arise from local episodic erosion events. The extent of these events and their impact depend on local precipitation, the current crop growth stage and soil conditions. Such erosion events can cause damage to agricultural land, municipal property and hydraulic structures even at locations that are in good condition in terms of long-term average annual soil loss. An alternative way to compute and compare erosion risks is an event-based approach. This paper presents a comparison of various approaches to computing erosion risk. The comparison was carried out for a site from the database of erosion events on agricultural land in the Czech Republic where two erosion events have been recorded. The study area is a simple agricultural parcel without barriers that could strongly influence water flow and sediment transport. The erosion risk computations (for all methodologies) were based on laboratory analyses of soil samples collected in the study area. Results of the USLE and MUSLE methodologies and of the mathematical model Erosion 3D were compared. Differences in the spatial distribution of the locations with the highest soil erosion were compared and discussed. A further part presents differences in the designed erosion control measures when the design is based on the different methodologies. The results show the variance in computed erosion risk obtained with the different methodologies. These variances can open a discussion about different approaches to computing and evaluating erosion risk in areas of different importance.
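
    For readers unfamiliar with the two equation families compared above, the sketch below contrasts the long-term USLE estimate with an event-based MUSLE estimate (Williams' runoff-driven formulation). The factor values and event runoff figures are illustrative assumptions.

```python
def usle_annual_soil_loss(R, K, LS, C, P):
    """Universal Soil Loss Equation: long-term average annual soil loss
    A = R * K * LS * C * P (units depend on the factor system used)."""
    return R * K * LS * C * P

def musle_event_soil_loss(runoff_volume_m3, peak_flow_m3s, K, LS, C, P):
    """MUSLE replaces rainfall erosivity with an event runoff term
    (Williams' formulation: 11.8 * (Q * q_p)**0.56 * K * LS * C * P)."""
    return 11.8 * (runoff_volume_m3 * peak_flow_m3s) ** 0.56 * K * LS * C * P

# Illustrative factor values for one field
K, LS, C, P = 0.35, 1.8, 0.20, 1.0
print("USLE long-term loss :", usle_annual_soil_loss(R=45.0, K=K, LS=LS, C=C, P=P))
print("MUSLE single event  :", musle_event_soil_loss(runoff_volume_m3=850.0,
                                                     peak_flow_m3s=0.9,
                                                     K=K, LS=LS, C=C, P=P))
```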

  18. Correlations and risk contagion between mixed assets and mixed-asset portfolio VaR measurements in a dynamic view: An application based on time varying copula models

    NASA Astrophysics Data System (ADS)

    Han, Yingying; Gong, Pu; Zhou, Xiang

    2016-02-01

    In this paper, we first apply time-varying Gaussian and SJC copula models to study the correlations and risk contagion between mixed assets in China: financial (stock), real estate and commodity (gold) assets. We then study dynamic mixed-asset portfolio risk through VaR measurement based on the correlations computed by the time-varying copulas. This dynamic VaR-copula measurement analysis has not previously been applied to mixed-asset portfolios. The results show that the time-varying estimations fit much better than the static models, not only for the correlations and risk contagion based on time-varying copulas, but also for the VaR-copula measurement. The time-varying VaR-SJC copula models are more accurate than the VaR-Gaussian copula models when measuring riskier portfolios at higher confidence levels. The major findings suggest that real estate and gold play a role in portfolio risk diversification and that risk contagion and flight to quality occur between mixed assets in extreme cases; however, if different mixed-asset portfolio strategies are adopted as time and the environment vary, portfolio risk can be reduced.
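
    As a simplified illustration of the copula-based VaR computation (a static Gaussian copula only, not the paper's time-varying Gaussian/SJC specifications), the sketch below estimates a copula correlation from pseudo-observations, simulates joint returns through empirical marginals, and reads off a 99% portfolio VaR. The return series are synthetic.

```python
import numpy as np
from scipy.stats import norm, rankdata

rng = np.random.default_rng(1)

# Hypothetical daily returns for stock, real estate and gold indices
returns = rng.normal(0.0003, [0.015, 0.010, 0.008], size=(1500, 3))

# 1. Gaussian copula parameter: correlation of the normal scores of the data
u_obs = rankdata(returns, axis=0) / (len(returns) + 1)     # pseudo-observations
corr = np.corrcoef(norm.ppf(u_obs), rowvar=False)

# 2. Simulate from the Gaussian copula
n_sim = 100_000
z = rng.multivariate_normal(np.zeros(3), corr, size=n_sim)
u_sim = norm.cdf(z)

# 3. Map uniforms back through the empirical marginal quantiles
sim_returns = np.column_stack([
    np.quantile(returns[:, j], u_sim[:, j]) for j in range(3)
])

# 4. Portfolio VaR at 99% for an equally weighted mixed-asset portfolio
weights = np.array([1 / 3, 1 / 3, 1 / 3])
port = sim_returns @ weights
var_99 = -np.quantile(port, 0.01)
print(f"1-day 99% VaR (fraction of portfolio value): {var_99:.4f}")
```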

  19. Modeling intelligent adversaries for terrorism risk assessment: some necessary conditions for adversary models.

    PubMed

    Guikema, Seth

    2012-07-01

    Intelligent adversary modeling has become increasingly important for risk analysis, and a number of different approaches have been proposed for incorporating intelligent adversaries in risk analysis models. However, these approaches are based on a range of often-implicit assumptions about the desirable properties of intelligent adversary models. This "Perspective" paper aims to further risk analysis for situations involving intelligent adversaries by fostering a discussion of the desirable properties for these models. A set of four basic necessary conditions for intelligent adversary models is proposed and discussed. These are: (1) behavioral accuracy to the degree possible, (2) computational tractability to support decision making, (3) explicit consideration of uncertainty, and (4) ability to gain confidence in the model. It is hoped that these suggested necessary conditions foster discussion about the goals and assumptions underlying intelligent adversary modeling in risk analysis. © 2011 Society for Risk Analysis.

  20. Risk, individual differences, and environment: an Agent-Based Modeling approach to sexual risk-taking.

    PubMed

    Nagoski, Emily; Janssen, Erick; Lohrmann, David; Nichols, Eric

    2012-08-01

    Risky sexual behaviors, including the decision to have unprotected sex, result from interactions between individuals and their environment. The current study explored the use of Agent-Based Modeling (ABM)-a methodological approach in which computer-generated artificial societies simulate human sexual networks-to assess the influence of heterogeneity of sexual motivation on the risk of contracting HIV. The models successfully simulated some characteristics of human sexual systems, such as the relationship between individual differences in sexual motivation (sexual excitation and inhibition) and sexual risk, but failed to reproduce the scale-free distribution of number of partners observed in the real world. ABM has the potential to inform intervention strategies that target the interaction between an individual and his or her social environment.

  1. Aquatic models, genomics and chemical risk management.

    PubMed

    Cheng, Keith C; Hinton, David E; Mattingly, Carolyn J; Planchart, Antonio

    2012-01-01

    The 5th Aquatic Animal Models for Human Disease meeting follows four previous meetings (Nairn et al., 2001; Schmale, 2004; Schmale et al., 2007; Hinton et al., 2009) in which advances in aquatic animal models for human disease research were reported, and community discussion of future directions was pursued. At this meeting, discussion at a workshop entitled Bioinformatics and Computational Biology with Web-based Resources (20 September 2010) led to an important conclusion: aquatic model research using feral and experimental fish, in combination with web-based access to annotated anatomical atlases and toxicological databases, yields data that advance our understanding of human gene function and can be used to facilitate environmental management and drug development. We propose here that the effects of genes and environment are best appreciated within an anatomical context - the specifically affected cells and organs in the whole animal. We envision the use of automated, whole-animal imaging at cellular resolution and computational morphometry, facilitated by high-performance computing and automated entry into toxicological databases, as anchors for genetic and toxicological data, and as connectors between human and model system data. These principles should be applied to both laboratory and feral fish populations, which have been virtually irreplaceable sentinels for environmental contamination that results in human morbidity and mortality. We conclude that automation, database generation, and web-based accessibility, facilitated by genomic/transcriptomic data and high-performance and cloud computing, will potentiate the unique and potentially key roles that aquatic models play in advancing systems biology, drug development, and environmental risk management. Copyright © 2011 Elsevier Inc. All rights reserved.

  2. Algorithms for the Computation of Debris Risk

    NASA Technical Reports Server (NTRS)

    Matney, Mark J.

    2017-01-01

    Determining the risks from space debris involves a number of statistical calculations. These calculations inevitably involve assumptions about geometry - including the physical geometry of orbits and the geometry of satellites. A number of tools have been developed in NASA's Orbital Debris Program Office to handle these calculations, many of which have never been published before. These include algorithms that are used in NASA's Orbital Debris Engineering Model ORDEM 3.0, as well as other tools useful for computing orbital collision rates and ground casualty risks. This paper presents an introduction to these algorithms and the assumptions upon which they are based.
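
    One standard assumption behind such debris-risk algorithms is that impacts follow a Poisson process, so the collision probability over a mission follows from the debris flux, the exposed cross-sectional area and the exposure time. The sketch below illustrates only this generic relationship; the numbers are illustrative and are not ORDEM output.

```python
import math

def collision_probability(flux_per_m2_yr: float, area_m2: float, years: float) -> float:
    """Poisson impact model: P(at least one impact) = 1 - exp(-flux * area * time)."""
    expected_impacts = flux_per_m2_yr * area_m2 * years
    return 1.0 - math.exp(-expected_impacts)

# Illustrative values only: small-debris flux, 20 m^2 average cross-section,
# 5-year mission in low Earth orbit
p = collision_probability(flux_per_m2_yr=1e-5, area_m2=20.0, years=5.0)
print(f"probability of at least one impact over the mission: {p:.4%}")
```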

  3. Algorithms for the Computation of Debris Risks

    NASA Technical Reports Server (NTRS)

    Matney, Mark

    2017-01-01

    Determining the risks from space debris involves a number of statistical calculations. These calculations inevitably involve assumptions about geometry - including the physical geometry of orbits and the geometry of non-spherical satellites. A number of tools have been developed in NASA's Orbital Debris Program Office to handle these calculations, many of which have never been published before. These include algorithms that are used in NASA's Orbital Debris Engineering Model ORDEM 3.0, as well as other tools useful for computing orbital collision rates and ground casualty risks. This paper will present an introduction to these algorithms and the assumptions upon which they are based.

  4. Radiation exposure and risk assessment for critical female body organs

    NASA Technical Reports Server (NTRS)

    Atwell, William; Weyland, Mark D.; Hardy, Alva C.

    1991-01-01

    Space radiation exposure limits for astronauts are based on recommendations of the National Council on Radiation Protection and Measurements. These limits now include the age at exposure and sex of the astronaut. A recently-developed computerized anatomical female (CAF) model is discussed in detail. Computer-generated, cross-sectional data are presented to illustrate the completeness of the CAF model. By applying ray-tracing techniques, shield distribution functions have been computed to calculate absorbed dose and dose equivalent values for a variety of critical body organs (e.g., breasts, lungs, thyroid gland, etc.) and mission scenarios. Specific risk assessments, i.e., cancer induction and mortality, are reviewed.

  5. Children, computer exposure and musculoskeletal outcomes: the development of pathway models for school and home computer-related musculoskeletal outcomes.

    PubMed

    Harris, Courtenay; Straker, Leon; Pollock, Clare; Smith, Anne

    2015-01-01

    Children's computer use is rapidly growing, together with reports of related musculoskeletal outcomes. Models and theories of adult-related risk factors demonstrate multivariate risk factors associated with computer use. Children's use of computers is different from adults' computer use at work. This study developed and tested a child-specific model demonstrating multivariate relationships between musculoskeletal outcomes, computer exposure and child factors. Using pathway modelling, factors such as gender, age, television exposure, computer anxiety, sustained attention (flow), socio-economic status and somatic complaints (headache and stomach pain) were found to have effects on children's reports of musculoskeletal symptoms. The potential for children's computer exposure to follow a dose-response relationship was also evident. Developing a child-related model can assist in understanding risk factors for children's computer use and support the development of recommendations to encourage children to use this valuable resource in educational, recreational and communication environments in a safe and productive manner. Computer use is an important part of children's school and home life. Application of this developed model, that encapsulates related risk factors, enables practitioners, researchers, teachers and parents to develop strategies that assist young people to use information technology for school, home and leisure in a safe and productive manner.

  6. Computational methods using genome-wide association studies to predict radiotherapy complications and to identify correlative molecular processes

    NASA Astrophysics Data System (ADS)

    Oh, Jung Hun; Kerns, Sarah; Ostrer, Harry; Powell, Simon N.; Rosenstein, Barry; Deasy, Joseph O.

    2017-02-01

    The biological cause of clinically observed variability of normal tissue damage following radiotherapy is poorly understood. We hypothesized that machine/statistical learning methods using single nucleotide polymorphism (SNP)-based genome-wide association studies (GWAS) would identify groups of patients of differing complication risk, and furthermore could be used to identify key biological sources of variability. We developed a novel learning algorithm, called pre-conditioned random forest regression (PRFR), to construct polygenic risk models using hundreds of SNPs, thereby capturing genomic features that confer small differential risk. Predictive models were trained and validated on a cohort of 368 prostate cancer patients for two post-radiotherapy clinical endpoints: late rectal bleeding and erectile dysfunction. The proposed method results in better predictive performance compared with existing computational methods. Gene ontology enrichment analysis and protein-protein interaction network analysis are used to identify key biological processes and proteins that were plausible based on other published studies. In conclusion, we confirm that novel machine learning methods can produce large predictive models (hundreds of SNPs), yielding clinically useful risk stratification models, as well as identifying important underlying biological processes in the radiation damage and tissue repair process. The methods are generally applicable to GWAS data and are not specific to radiotherapy endpoints.
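
    The pre-conditioned random forest regression (PRFR) of the study is specific; as a simplified, hypothetical stand-in, the sketch below shows the general pattern of pre-selecting SNPs by univariate association and fitting a random forest on the retained markers, evaluated by cross-validation on synthetic genotype data.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.feature_selection import SelectKBest, f_regression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)

# Synthetic data: 368 patients, 5000 SNPs coded 0/1/2, continuous toxicity score
n_patients, n_snps = 368, 5000
X = rng.integers(0, 3, size=(n_patients, n_snps)).astype(float)
causal = rng.choice(n_snps, size=30, replace=False)
y = X[:, causal] @ rng.normal(0.1, 0.05, size=30) + rng.normal(0, 1, n_patients)

# Pre-select a few hundred SNPs by univariate association, then random forest
model = make_pipeline(
    SelectKBest(score_func=f_regression, k=300),
    RandomForestRegressor(n_estimators=500, min_samples_leaf=5, random_state=0),
)
scores = cross_val_score(model, X, y, cv=5, scoring="neg_mean_squared_error")
print("cross-validated mean squared error:", -scores.mean())
```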

  7. A comprehensive risk assessment framework for offsite transportation of inflammable hazardous waste.

    PubMed

    Das, Arup; Gupta, A K; Mazumder, T N

    2012-08-15

    A framework for risk assessment due to offsite transportation of hazardous wastes is designed based on the type of event that can be triggered by an accident of a hazardous waste carrier. The objective of this study is to design a framework for computing the risk to population associated with offsite transportation of inflammable and volatile wastes. The framework is based on the traditional definition of risk and is designed for conditions where accident databases are not available. The probability-based variable in the risk assessment framework is substituted by a composite accident index proposed in this study. The framework computes the impacts of a volatile cloud explosion based on the TNO Multi-Energy model. The methodology also estimates the vulnerable population in terms of disability-adjusted life years (DALY), which takes into consideration the demographic profile of the population and the degree of injury sustained in terms of mortality and morbidity. The methodology is illustrated using a case study of a pharmaceutical industry in the Kolkata metropolitan area. Copyright © 2012 Elsevier B.V. All rights reserved.
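
    A minimal sketch of the DALY portion of the consequence term described above: years of life lost plus years lived with disability, summed over affected population segments, and combined multiplicatively with an accident-likelihood index. The disability weights, durations and the composite accident index value are illustrative assumptions.

```python
def daly(deaths, life_expectancy_remaining, injured, disability_weight,
         duration_years):
    """DALY = years of life lost (YLL) + years lived with disability (YLD)."""
    yll = deaths * life_expectancy_remaining
    yld = injured * disability_weight * duration_years
    return yll + yld

# Illustrative consequence assessment for one explosion scenario
population_segments = [
    # (deaths, remaining life expectancy, injured, disability weight, duration)
    (2, 45.0, 15, 0.35, 5.0),    # adults in the inner blast zone
    (0, 60.0, 40, 0.10, 1.0),    # children in the outer zone, minor injuries
]
total_daly = sum(daly(*seg) for seg in population_segments)

composite_accident_index = 0.012     # stands in for the accident probability term
print("route segment risk (index * DALY):", composite_accident_index * total_daly)
```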

  8. Object-oriented regression for building predictive models with high dimensional omics data from translational studies.

    PubMed

    Zhao, Lue Ping; Bolouri, Hamid

    2016-04-01

    Maturing omics technologies enable researchers to generate high dimension omics data (HDOD) routinely in translational clinical studies. In the field of oncology, The Cancer Genome Atlas (TCGA) provided funding support to researchers to generate different types of omics data on a common set of biospecimens with accompanying clinical data and has made the data available for the research community to mine. One important application, and the focus of this manuscript, is to build predictive models for prognostic outcomes based on HDOD. To complement prevailing regression-based approaches, we propose to use an object-oriented regression (OOR) methodology to identify exemplars specified by HDOD patterns and to assess their associations with prognostic outcome. Through computing patient's similarities to these exemplars, the OOR-based predictive model produces a risk estimate using a patient's HDOD. The primary advantages of OOR are twofold: reducing the penalty of high dimensionality and retaining the interpretability to clinical practitioners. To illustrate its utility, we apply OOR to gene expression data from non-small cell lung cancer patients in TCGA and build a predictive model for prognostic survivorship among stage I patients, i.e., we stratify these patients by their prognostic survival risks beyond histological classifications. Identification of these high-risk patients helps oncologists to develop effective treatment protocols and post-treatment disease management plans. Using the TCGA data, the total sample is divided into training and validation data sets. After building up a predictive model in the training set, we compute risk scores from the predictive model, and validate associations of risk scores with prognostic outcome in the validation data (P-value=0.015). Copyright © 2016 Elsevier Inc. All rights reserved.
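
    The sketch below is a simplified, hypothetical rendering of the exemplar-similarity idea (it is not the authors' OOR procedure): exemplars are taken as cluster centroids of training profiles, each patient is represented by correlation-like similarities to those exemplars, and a low-dimensional logistic model turns the similarities into a risk score. The omics matrix and outcome are synthetic.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)

# Synthetic "omics" matrix: 300 patients x 2000 genes, binary 2-year survival
X = rng.normal(size=(300, 2000))
y = (X[:, :20].mean(axis=1) + 0.5 * rng.normal(size=300) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# 1. Choose exemplars as cluster centroids of the training profiles
exemplars = KMeans(n_clusters=8, n_init=10, random_state=0).fit(X_tr).cluster_centers_

# 2. Represent each patient by correlation-like similarity to every exemplar
def similarity_features(X, exemplars):
    Xc = (X - X.mean(axis=1, keepdims=True)) / X.std(axis=1, keepdims=True)
    Ec = (exemplars - exemplars.mean(axis=1, keepdims=True)) / exemplars.std(axis=1, keepdims=True)
    return Xc @ Ec.T / X.shape[1]          # Pearson-like similarity matrix

# 3. Low-dimensional risk model on the similarities
clf = LogisticRegression(max_iter=1000).fit(similarity_features(X_tr, exemplars), y_tr)
risk_scores = clf.predict_proba(similarity_features(X_te, exemplars))[:, 1]
print("mean predicted risk in held-out patients:", risk_scores.mean().round(3))
```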

  9. Object-Oriented Regression for Building Predictive Models with High Dimensional Omics Data from Translational Studies

    PubMed Central

    Zhao, Lue Ping; Bolouri, Hamid

    2016-01-01

    Maturing omics technologies enable researchers to generate high dimension omics data (HDOD) routinely in translational clinical studies. In the field of oncology, The Cancer Genome Atlas (TCGA) provided funding support to researchers to generate different types of omics data on a common set of biospecimens with accompanying clinical data and to make the data available for the research community to mine. One important application, and the focus of this manuscript, is to build predictive models for prognostic outcomes based on HDOD. To complement prevailing regression-based approaches, we propose to use an object-oriented regression (OOR) methodology to identify exemplars specified by HDOD patterns and to assess their associations with prognostic outcome. Through computing patient’s similarities to these exemplars, the OOR-based predictive model produces a risk estimate using a patient’s HDOD. The primary advantages of OOR are twofold: reducing the penalty of high dimensionality and retaining the interpretability to clinical practitioners. To illustrate its utility, we apply OOR to gene expression data from non-small cell lung cancer patients in TCGA and build a predictive model for prognostic survivorship among stage I patients, i.e., we stratify these patients by their prognostic survival risks beyond histological classifications. Identification of these high-risk patients helps oncologists to develop effective treatment protocols and post-treatment disease management plans. Using the TCGA data, the total sample is divided into training and validation data sets. After building up a predictive model in the training set, we compute risk scores from the predictive model, and validate associations of risk scores with prognostic outcome in the validation data (p=0.015). PMID:26972839

  10. Pulse!!: a model for research and development of virtual-reality learning in military medical education and training.

    PubMed

    Dunne, James R; McDonald, Claudia L

    2010-07-01

    Pulse!! The Virtual Clinical Learning Lab at Texas A&M University-Corpus Christi, in collaboration with the United States Navy, has developed a model for research and technological development that they believe is an essential element in the future of military and civilian medical education. The Pulse!! project models a strategy for providing cross-disciplinary expertise and resources to educational, governmental, and business entities challenged with meeting looming health care crises. It includes a three-dimensional virtual learning platform that provides unlimited, repeatable, immersive clinical experiences without risk to patients, and is available anywhere there is a computer. Pulse!! utilizes expertise in the fields of medicine, medical education, computer science, software engineering, physics, computer animation, art, and architecture. Lab scientists collaborate with the commercial virtual-reality simulation industry to produce research-based learning platforms based on cutting-edge computer technology.

  11. MODEL DEVELOPMENT AND APPLICATION FOR ASSESSING HUMAN EXPOSURE AND DOSE TO TOXIC CHEMICALS AND POLLUTANTS

    EPA Science Inventory

    This project aims to strengthen the general scientific foundation of EPA's exposure and risk assessment processes by developing state-of-the-art exposure to dose computational models. This research will produce physiologically-based pharmacokinetic (PBPK) and pharmacodynamic (PD)...

  12. Building a computer program to support children, parents, and distraction during healthcare procedures.

    PubMed

    Hanrahan, Kirsten; McCarthy, Ann Marie; Kleiber, Charmaine; Ataman, Kaan; Street, W Nick; Zimmerman, M Bridget; Ersig, Anne L

    2012-10-01

    This secondary data analysis used data mining methods to develop predictive models of child risk for distress during a healthcare procedure. Data used came from a study that predicted factors associated with children's responses to an intravenous catheter insertion while parents provided distraction coaching. From the 255 items used in the primary study, 44 predictive items were identified through automatic feature selection and used to build support vector machine regression models. Models were validated using multiple cross-validation tests and by comparing variables identified as explanatory in the traditional versus support vector machine regression. Rule-based approaches were applied to the model outputs to identify overall risk for distress. A decision tree was then applied to evidence-based instructions for tailoring distraction to characteristics and preferences of the parent and child. The resulting decision support computer application, titled Children, Parents and Distraction, is being used in research. Future use will support practitioners in deciding the level and type of distraction intervention needed by a child undergoing a healthcare procedure.
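
    As a schematic of the modeling pattern described (automatic feature selection followed by support vector machine regression, assessed by cross-validation), the sketch below uses synthetic data in place of the study's 255-item instrument; the retained-feature count of 44 simply mirrors the number reported above.

```python
import numpy as np
from sklearn.feature_selection import SelectKBest, f_regression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

rng = np.random.default_rng(7)

# Synthetic stand-in: 200 children, 255 questionnaire/observation items,
# continuous distress score during an IV insertion
X = rng.normal(size=(200, 255))
y = X[:, :10].sum(axis=1) + rng.normal(scale=2.0, size=200)

model = make_pipeline(
    StandardScaler(),
    SelectKBest(score_func=f_regression, k=44),   # keep 44 items, as in the study
    SVR(kernel="rbf", C=1.0, epsilon=0.1),
)
scores = cross_val_score(model, X, y, cv=10, scoring="r2")
print("10-fold cross-validated R^2: %.2f +/- %.2f" % (scores.mean(), scores.std()))
```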

  13. Automation of a DXA-based finite element tool for clinical assessment of hip fracture risk.

    PubMed

    Luo, Yunhua; Ahmed, Sharif; Leslie, William D

    2018-03-01

    Finite element analysis of medical images is a promising tool for assessing hip fracture risk. Although a number of finite element models have been developed for this purpose, none of them have been routinely used in clinic. The main reason is that the computer programs that implement the finite element models have not been completely automated, and heavy training is required before clinicians can effectively use them. By using information embedded in clinical dual energy X-ray absorptiometry (DXA), we completely automated a DXA-based finite element (FE) model that we previously developed for predicting hip fracture risk. The automated FE tool can be run as a standalone computer program with the subject's raw hip DXA image as input. The automated FE tool had greatly improved short-term precision compared with the semi-automated version. To validate the automated FE tool, a clinical cohort consisting of 100 prior hip fracture cases and 300 matched controls was obtained from a local community clinical center. Both the automated FE tool and femoral bone mineral density (BMD) were applied to discriminate the fracture cases from the controls. Femoral BMD is the gold standard reference recommended by the World Health Organization for screening osteoporosis and for assessing hip fracture risk. The accuracy was measured by the area under ROC curve (AUC) and odds ratio (OR). Compared with femoral BMD (AUC = 0.71, OR = 2.07), the automated FE tool had a considerably improved accuracy (AUC = 0.78, OR = 2.61 at the trochanter). This work made a large step toward applying our DXA-based FE model as a routine clinical tool for the assessment of hip fracture risk. Furthermore, the automated computer program can be embedded into a web-site as an internet application. Copyright © 2017 Elsevier B.V. All rights reserved.
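
    The two accuracy measures reported above can be computed from case-control data as in the minimal sketch below: AUC from the risk index directly, and an odds ratio per standard deviation from a univariate logistic regression coefficient. The cohort here is synthetic and the effect size is arbitrary.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(3)

# Synthetic cohort: 100 hip-fracture cases, 300 controls, one risk index each
# (e.g., an FE-estimated fracture risk index or femoral BMD), standardized
y = np.r_[np.ones(100), np.zeros(300)]
x = np.r_[rng.normal(0.8, 1.0, 100), rng.normal(0.0, 1.0, 300)]
x = (x - x.mean()) / x.std()                      # per-SD scaling

auc = roc_auc_score(y, x)

# Odds ratio per 1 SD increase of the index, from a univariate logistic model
lr = LogisticRegression().fit(x.reshape(-1, 1), y)
odds_ratio_per_sd = float(np.exp(lr.coef_[0, 0]))

print(f"AUC = {auc:.2f}, OR per SD = {odds_ratio_per_sd:.2f}")
```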

  14. Evaluation of Enhanced Risk Monitors for Use on Advanced Reactors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ramuhalli, Pradeep; Veeramany, Arun; Bonebrake, Christopher A.

    This study provides an overview of the methodology for integrating time-dependent failure probabilities into nuclear power reactor risk monitors. This prototypic enhanced risk monitor (ERM) methodology was evaluated using a hypothetical probabilistic risk assessment (PRA) model, generated using a simplified design of a liquid-metal-cooled advanced reactor (AR). Component failure data from an industry compilation of failures of components similar to those in the simplified AR model were used to initialize the PRA model. Core damage frequency (CDF) over time was computed and analyzed. In addition, a study on alternative risk metrics for ARs was conducted. Risk metrics that quantify the normalized cost of repairs, replacements, or other operations and management (O&M) actions were defined and used, along with an economic model, to compute the likely economic risk of future actions such as deferred maintenance, based on the anticipated change in CDF due to current component condition and future anticipated degradation. Such integration of conventional risk metrics with alternate risk metrics provides a convenient mechanism for assessing the impact of O&M decisions on the safety and economics of the plant. It is expected that, when integrated with supervisory control algorithms, such integrated risk monitors will provide a mechanism for real-time control decision-making that ensures safety margins are maintained while operating the plant in an economically viable manner.

  15. Modeling the Dynamics of Disease States in Depression

    PubMed Central

    Demic, Selver; Cheng, Sen

    2014-01-01

    Major depressive disorder (MDD) is a common and costly disorder associated with considerable morbidity, disability, and risk for suicide. The disorder is clinically and etiologically heterogeneous. Despite intense research efforts, the response rates of antidepressant treatments are relatively low and the etiology and progression of MDD remain poorly understood. Here we use computational modeling to advance our understanding of MDD. First, we propose a systematic and comprehensive definition of disease states, which is based on a type of mathematical model called a finite-state machine. Second, we propose a dynamical systems model for the progression, or dynamics, of MDD. The model is abstract and combines several major factors (mechanisms) that influence the dynamics of MDD. We study under what conditions the model can account for the occurrence and recurrence of depressive episodes and how we can model the effects of antidepressant treatments and cognitive behavioral therapy within the same dynamical systems model through changing a small subset of parameters. Our computational modeling suggests several predictions about MDD. Patients who suffer from depression can be divided into two sub-populations: a high-risk sub-population that has a high risk of developing chronic depression and a low-risk sub-population, in which patients develop depression stochastically with low probability. The success of antidepressant treatment is stochastic, leading to widely different times-to-remission in otherwise identical patients. While the specific details of our model might be subject to criticism and revision, our approach shows the potential power of computationally modeling depression and the need for different types of quantitative data for understanding depression. PMID:25330102
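
    As an illustration of the finite-state-machine view of disease states (not the authors' model or parameter values), the sketch below simulates a small stochastic machine with well/episode/remission states, where a treatment parameter raises the weekly probability of leaving a depressive episode.

```python
import random

# Disease states and (illustrative) weekly transition probabilities
STATES = ("well", "depressive_episode", "remission")

def transition_probs(state, antidepressant_effect=0.0):
    """Return {next_state: probability}; treatment raises the chance of
    leaving a depressive episode (illustrative parameterization)."""
    if state == "well":
        return {"well": 0.98, "depressive_episode": 0.02}
    if state == "depressive_episode":
        p_remit = 0.05 + antidepressant_effect
        return {"depressive_episode": 1.0 - p_remit, "remission": p_remit}
    # remission: may recover fully or relapse
    return {"remission": 0.90, "well": 0.08, "depressive_episode": 0.02}

def simulate(weeks=520, treatment=0.10, seed=1):
    rng = random.Random(seed)
    state, episodes = "well", 0
    for _ in range(weeks):
        probs = transition_probs(state, treatment if state != "well" else 0.0)
        nxt = rng.choices(list(probs), weights=list(probs.values()))[0]
        if state != "depressive_episode" and nxt == "depressive_episode":
            episodes += 1
        state = nxt
    return episodes

print("episodes in 10 years (untreated):", simulate(treatment=0.0))
print("episodes in 10 years (treated)  :", simulate(treatment=0.10))
```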

  16. Computer-Based Model Calibration and Uncertainty Analysis: Terms and Concepts

    DTIC Science & Technology

    2015-07-01

    ...uncertainty analyses throughout the lifecycle of planning, designing, and operating of Civil Works flood risk management projects as described in... value 95% of the time. In the frequentist approach to PE, model parameters are regarded as having true values, and their estimate is based on the... in catchment models. 1. Evaluating parameter uncertainty. Water Resources Research 19(5):1151–1172. Lee, P. M. 2012. Bayesian statistics: An...

  17. A hybrid CNN feature model for pulmonary nodule malignancy risk differentiation.

    PubMed

    Wang, Huafeng; Zhao, Tingting; Li, Lihong Connie; Pan, Haixia; Liu, Wanquan; Gao, Haoqi; Han, Fangfang; Wang, Yuehai; Qi, Yifan; Liang, Zhengrong

    2018-01-01

    The malignancy risk differentiation of pulmonary nodules is one of the most challenging tasks in computer-aided diagnosis (CADx). Most recently reported CADx methods or schemes based on texture and shape estimation have shown relatively satisfactory performance in differentiating the risk level of malignancy among nodules detected in lung cancer screening. However, the existing CADx schemes tend to detect and analyze characteristics of pulmonary nodules from a statistical perspective according to local features only. Motivated by the currently prevailing learning ability of convolutional neural networks (CNN), which simulate the human neural network for target recognition, and by our previous research on texture features, we present a hybrid model that takes into consideration both global and local features for pulmonary nodule differentiation, using the largest public database, founded by the Lung Image Database Consortium and Image Database Resource Initiative (LIDC-IDRI). By comparing three types of CNN models, two of which were newly proposed by us, we observed that the multi-channel CNN model yielded the best capacity for differentiating the malignancy risk of the nodules based on the projection of distributions of extracted features. Moreover, the CADx scheme using the new multi-channel CNN model outperformed our previously developed CADx scheme based on 3D texture feature analysis, increasing the computed area under the receiver operating characteristic curve (AUC) from 0.9441 to 0.9702.
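
    A toy multi-channel 2D CNN of the general kind described, written in PyTorch: several input channels feed a shared convolutional trunk and a small classifier head that outputs low- versus high-malignancy-risk logits. The architecture, channel count and patch size are illustrative assumptions, not the paper's network.

```python
import torch
import torch.nn as nn

class MultiChannelNoduleCNN(nn.Module):
    """Toy multi-channel CNN: each input channel could carry a different
    view/feature map of the same nodule patch (illustrative architecture)."""
    def __init__(self, in_channels=3, n_classes=2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(in_channels, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                       # 64x64 -> 32x32
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                       # 32x32 -> 16x16
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 16 * 16, 64), nn.ReLU(),
            nn.Linear(64, n_classes),              # low- vs high-malignancy risk
        )

    def forward(self, x):
        return self.classifier(self.features(x))

# One batch of 8 nodule patches, 3 channels, 64x64 pixels
model = MultiChannelNoduleCNN()
logits = model(torch.randn(8, 3, 64, 64))
print(logits.shape)        # torch.Size([8, 2])
```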

  18. Calibrating a Rainfall-Runoff and Routing Model for the Continental United States

    NASA Astrophysics Data System (ADS)

    Jankowfsky, S.; Li, S.; Assteerawatt, A.; Tillmanns, S.; Hilberts, A.

    2014-12-01

    Catastrophe risk models are widely used in the insurance industry to estimate the cost of risk. The models consist of hazard models linked to vulnerability and financial loss models. In flood risk models, the hazard model generates inundation maps. In order to develop country wide inundation maps for different return periods a rainfall-runoff and routing model is run using stochastic rainfall data. The simulated discharge and runoff is then input to a two dimensional inundation model, which produces the flood maps. In order to get realistic flood maps, the rainfall-runoff and routing models have to be calibrated with observed discharge data. The rainfall-runoff model applied here is a semi-distributed model based on the Topmodel (Beven and Kirkby, 1979) approach which includes additional snowmelt and evapotranspiration models. The routing model is based on the Muskingum-Cunge (Cunge, 1969) approach and includes the simulation of lakes and reservoirs using the linear reservoir approach. Both models were calibrated using the multiobjective NSGA-II (Deb et al., 2002) genetic algorithm with NLDAS forcing data and around 4500 USGS discharge gauges for the period from 1979-2013. Additional gauges having no data after 1979 were calibrated using CPC rainfall data. The model performed well in wetter regions and shows the difficulty of simulating areas with sinks such as karstic areas or dry areas. Beven, K., Kirkby, M., 1979. A physically based, variable contributing area model of basin hydrology. Hydrol. Sci. Bull. 24 (1), 43-69. Cunge, J.A., 1969. On the subject of a flood propagation computation method (Muskingum method), J. Hydr. Research, 7(2), 205-230. Deb, K., Pratap, A., Agarwal, S., Meyarivan, T., 2002. A fast and elitist multiobjective genetic algorithm: NSGA-II, IEEE Transactions on evolutionary computation, 6(2), 182-197.
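
    For reference, the routing step used by Muskingum-type schemes is compact enough to sketch directly; in the Muskingum-Cunge variant the K and X parameters would be derived from channel geometry and hydraulics, whereas here they are simply assumed.

```python
def muskingum_route(inflow, K_hr=6.0, X=0.2, dt_hr=1.0, initial_outflow=None):
    """Route an inflow hydrograph through one reach:
    O2 = C1*I2 + C2*I1 + C3*O1, with coefficients from K, X and dt."""
    denom = 2.0 * K_hr * (1.0 - X) + dt_hr
    c1 = (dt_hr - 2.0 * K_hr * X) / denom
    c2 = (dt_hr + 2.0 * K_hr * X) / denom
    c3 = (2.0 * K_hr * (1.0 - X) - dt_hr) / denom

    outflow = [inflow[0] if initial_outflow is None else initial_outflow]
    for i1, i2 in zip(inflow[:-1], inflow[1:]):
        outflow.append(c1 * i2 + c2 * i1 + c3 * outflow[-1])
    return outflow

# Simple triangular storm hydrograph (m^3/s), hourly time step
inflow = [10, 20, 50, 90, 70, 50, 35, 25, 18, 13, 11, 10]
print([round(q, 1) for q in muskingum_route(inflow)])
```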

  19. Risk assessment of tropical cyclone rainfall flooding in the Delaware River Basin

    NASA Astrophysics Data System (ADS)

    Lu, P.; Lin, N.; Smith, J. A.; Emanuel, K.

    2016-12-01

    Rainfall-induced inland flooding is a leading cause of death, injury, and property damage from tropical cyclones (TCs). In the context of climate change, it has been shown that extreme precipitation from TCs is likely to increase during the 21st century. Assessing the long-term risk of inland flooding associated with landfalling TCs is therefore an important task. Standard risk assessment techniques, which are based on observations from rain gauges and stream gauges, are not broadly applicable to TC induced flooding, since TCs are rare, extreme events with very limited historical observations at any specific location. Also, rain gauges and stream gauges can hardly capture the complex spatial variation of TC rainfall and flooding. Furthermore, the utility of historically based assessments is compromised by climate change. Regional dynamical downscaling models can resolve many features of TC precipitation. In terms of risk assessment, however, it is computationally demanding to run such models to obtain long-term climatology of TC induced flooding. Here we apply a computationally efficient climatological-hydrological method to assess the risk of inland flooding associated with landfalling TCs. It includes: 1) a deterministic TC climatology modeling method to generate large numbers of synthetic TCs with physically correlated characteristics (i.e., track, intensity, size) under observed and projected climates; 2) a simple physics-based tropical cyclone rainfall model which is able to simulate rainfall fields associated with each synthetic storm; 3) a hydrologic modeling system that takes in rainfall fields to simulate flood peaks over an entire drainage basin. We will present results of this method applied to the Delaware River Basin in the mid-Atlantic US.

  20. How Adverse Outcome Pathways Can Aid the Development and Use of Computational Prediction Models for Regulatory Toxicology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wittwehr, Clemens; Aladjov, Hristo; Ankley, Gerald

    Efforts are underway to transform regulatory toxicology and chemical safety assessment from a largely empirical science, based on direct observation of apical toxicity outcomes in whole-organism toxicity tests, to a predictive one in which outcomes and risk are inferred from accumulated mechanistic understanding. The adverse outcome pathway (AOP) framework has emerged as a systematic approach for organizing knowledge that supports such inference. We argue that this systematic organization of knowledge can inform and help direct the design and development of computational prediction models that can further enhance the utility of mechanistic and in silico data for chemical safety assessment. Examples of AOP-informed model development and its application to the assessment of chemicals for skin sensitization and multiple modes of endocrine disruption are provided. The role of problem formulation, not only as a critical phase of risk assessment but also as a guide for both AOP and complementary model development, is described. Finally, a proposal for actively engaging the modeling community in AOP-informed computational model development is made. The contents serve as a vision for how AOPs can be leveraged to facilitate development of the computational prediction models needed to support the next generation of chemical safety assessment.

  1. Valence-Dependent Belief Updating: Computational Validation

    PubMed Central

    Kuzmanovic, Bojana; Rigoux, Lionel

    2017-01-01

    People tend to update beliefs about their future outcomes in a valence-dependent way: they are likely to incorporate good news and to neglect bad news. However, belief formation is a complex process which depends not only on motivational factors such as the desire for favorable conclusions, but also on multiple cognitive variables such as prior beliefs, knowledge about personal vulnerabilities and resources, and the size of the probabilities and estimation errors. Thus, we applied computational modeling in order to test for valence-induced biases in updating while formally controlling for relevant cognitive factors. We compared biased and unbiased Bayesian models of belief updating, and specified alternative models based on reinforcement learning. The experiment consisted of 80 trials with 80 different adverse future life events. In each trial, participants estimated the base rate of one of these events and estimated their own risk of experiencing the event before and after being confronted with the actual base rate. Belief updates corresponded to the difference between the two self-risk estimates. Valence-dependent updating was assessed by comparing trials with good news (better-than-expected base rates) with trials with bad news (worse-than-expected base rates). After receiving bad relative to good news, participants' updates were smaller and deviated more strongly from rational Bayesian predictions, indicating a valence-induced bias. Model comparison revealed that the biased (i.e., optimistic) Bayesian model of belief updating better accounted for data than the unbiased (i.e., rational) Bayesian model, confirming that the valence of the new information influenced the amount of updating. Moreover, alternative computational modeling based on reinforcement learning demonstrated higher learning rates for good than for bad news, as well as a moderating role of personal knowledge. Finally, in this specific experimental context, the approach based on reinforcement learning was superior to the Bayesian approach. The computational validation of valence-dependent belief updating represents a novel support for a genuine optimism bias in human belief formation. Moreover, the precise control of relevant cognitive variables justifies the conclusion that the motivation to adopt the most favorable self-referential conclusions biases human judgments. PMID:28706499
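
    A minimal sketch of the asymmetric-learning-rate idea behind the reinforcement-learning account discussed above: the self-risk estimate moves by a fraction of the estimation error, with a larger learning rate when the news is better than expected. The learning rates and trial values are illustrative, not fitted parameters from the study.

```python
def update_self_risk(prior_self_risk, base_rate, prior_base_rate_estimate,
                     lr_good=0.6, lr_bad=0.3):
    """Update the self-risk estimate by a fraction of the estimation error,
    with a larger learning rate when the news is better than expected."""
    estimation_error = base_rate - prior_base_rate_estimate
    good_news = estimation_error < 0          # actual base rate lower than expected
    lr = lr_good if good_news else lr_bad
    return prior_self_risk + lr * estimation_error

# One "good news" and one "bad news" trial (probabilities in percent)
print(update_self_risk(prior_self_risk=30.0, base_rate=20.0,
                       prior_base_rate_estimate=35.0))   # larger downward update
print(update_self_risk(prior_self_risk=10.0, base_rate=25.0,
                       prior_base_rate_estimate=15.0))   # smaller upward update
```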

  2. Valence-Dependent Belief Updating: Computational Validation.

    PubMed

    Kuzmanovic, Bojana; Rigoux, Lionel

    2017-01-01

    People tend to update beliefs about their future outcomes in a valence-dependent way: they are likely to incorporate good news and to neglect bad news. However, belief formation is a complex process which depends not only on motivational factors such as the desire for favorable conclusions, but also on multiple cognitive variables such as prior beliefs, knowledge about personal vulnerabilities and resources, and the size of the probabilities and estimation errors. Thus, we applied computational modeling in order to test for valence-induced biases in updating while formally controlling for relevant cognitive factors. We compared biased and unbiased Bayesian models of belief updating, and specified alternative models based on reinforcement learning. The experiment consisted of 80 trials with 80 different adverse future life events. In each trial, participants estimated the base rate of one of these events and estimated their own risk of experiencing the event before and after being confronted with the actual base rate. Belief updates corresponded to the difference between the two self-risk estimates. Valence-dependent updating was assessed by comparing trials with good news (better-than-expected base rates) with trials with bad news (worse-than-expected base rates). After receiving bad relative to good news, participants' updates were smaller and deviated more strongly from rational Bayesian predictions, indicating a valence-induced bias. Model comparison revealed that the biased (i.e., optimistic) Bayesian model of belief updating better accounted for data than the unbiased (i.e., rational) Bayesian model, confirming that the valence of the new information influenced the amount of updating. Moreover, alternative computational modeling based on reinforcement learning demonstrated higher learning rates for good than for bad news, as well as a moderating role of personal knowledge. Finally, in this specific experimental context, the approach based on reinforcement learning was superior to the Bayesian approach. The computational validation of valence-dependent belief updating represents a novel support for a genuine optimism bias in human belief formation. Moreover, the precise control of relevant cognitive variables justifies the conclusion that the motivation to adopt the most favorable self-referential conclusions biases human judgments.

  3. DETERMINATION OF TRANSFORMATION RATES OF CHIRAL PESTICIDES AND PCBS IN SOIL AND SEDIMENT MICROCOSMS

    EPA Science Inventory

    Risk Based Corrective Action (RBCA) has gained widespread acceptance as a favorable approach to remediating contaminated sites. The use of RBCA methods often requires computer-based modeling to assess the fate and transport of hazardous contaminants in subsurface environments, a...

  4. High resolution global flood hazard map from physically-based hydrologic and hydraulic models.

    NASA Astrophysics Data System (ADS)

    Begnudelli, L.; Kaheil, Y.; McCollum, J.

    2017-12-01

    The global flood map published online at http://www.fmglobal.com/research-and-resources/global-flood-map at 90m resolution is being used worldwide to understand flood risk exposure, exercise certain measures of mitigation, and/or transfer the residual risk financially through flood insurance programs. The modeling system is based on a physically-based hydrologic model to simulate river discharges, and 2D shallow-water hydrodynamic model to simulate inundation. The model can be applied to large-scale flood hazard mapping thanks to several solutions that maximize its efficiency and the use of parallel computing. The hydrologic component of the modeling system is the Hillslope River Routing (HRR) hydrologic model. HRR simulates hydrological processes using a Green-Ampt parameterization, and is calibrated against observed discharge data from several publicly-available datasets. For inundation mapping, we use a 2D Finite-Volume Shallow-Water model with wetting/drying. We introduce here a grid Up-Scaling Technique (UST) for hydraulic modeling to perform simulations at higher resolution at global scale with relatively short computational times. A 30m SRTM is now available worldwide along with higher accuracy and/or resolution local Digital Elevation Models (DEMs) in many countries and regions. UST consists of aggregating computational cells, thus forming a coarser grid, while retaining the topographic information from the original full-resolution mesh. The full-resolution topography is used for building relationships between volume and free surface elevation inside cells and computing inter-cell fluxes. This approach almost achieves computational speed typical of the coarse grids while preserving, to a significant extent, the accuracy offered by the much higher resolution available DEM. The simulations are carried out along each river of the network by forcing the hydraulic model with the streamflow hydrographs generated by HRR. Hydrographs are scaled so that the peak corresponds to the return period corresponding to the hazard map being produced (e.g. 100 years, 500 years). Each numerical simulation models one river reach, except for the longest reaches which are split in smaller parts. Here we show results for selected river basins worldwide.

  5. Efficient and Flexible Climate Analysis with Python in a Cloud-Based Distributed Computing Framework

    NASA Astrophysics Data System (ADS)

    Gannon, C.

    2017-12-01

    As climate models become progressively more advanced, and spatial resolution further improved through various downscaling projects, climate projections at a local level are increasingly insightful and valuable. However, the raw size of climate datasets presents numerous hurdles for analysts wishing to develop customized climate risk metrics or perform site-specific statistical analysis. Four Twenty Seven, a climate risk consultancy, has implemented a Python-based distributed framework to analyze large climate datasets in the cloud. With the freedom afforded by efficiently processing these datasets, we are able to customize and continually develop new climate risk metrics using the most up-to-date data. Here we outline our process for using Python packages such as XArray and Dask to evaluate netCDF files in a distributed framework, StarCluster to operate in a cluster-computing environment, cloud computing services to access publicly hosted datasets, and how this setup is particularly valuable for generating climate change indicators and performing localized statistical analysis.

  6. Paradigm of pretest risk stratification before coronary computed tomography.

    PubMed

    Jensen, Jesper Møller; Ovrehus, Kristian A; Nielsen, Lene H; Jensen, Jesper K; Larsen, Henrik M; Nørgaard, Bjarne L

    2009-01-01

    The optimal method of determining the pretest risk of coronary artery disease as a patient selection tool before coronary multidetector computed tomography (MDCT) is unknown. We investigated the ability of 3 different clinical risk scores to predict the outcome of coronary MDCT. This was a retrospective study of 551 patients consecutively referred for coronary MDCT on a suspicion of coronary artery disease. Diamond-Forrester, Duke, and Morise risk models were used to predict coronary artery stenosis (>50%) as assessed by coronary MDCT. The models were compared by receiver operating characteristic analysis. The distribution of low-, intermediate-, and high-risk persons, respectively, was established and compared for each of the 3 risk models. Overall, all risk prediction models performed equally well. However, the Duke risk model classified the low-risk patients more correctly than did the other models (P < 0.01). In patients without coronary artery calcification (CAC), the predictive value of the Duke risk model was superior to the other risk models (P < 0.05). Currently available risk prediction models seem to perform better in patients without CAC. Between the risk prediction models, there was a significant discrepancy in the distribution of patients at low, intermediate, or high risk (P < 0.01). The 3 risk prediction models perform equally well, although the Duke risk score may have advantages in subsets of patients. The choice of risk prediction model affects the referral pattern to MDCT. Copyright (c) 2009 Society of Cardiovascular Computed Tomography. Published by Elsevier Inc. All rights reserved.

  7. The Virtual Liver Project: Modeling Tissue Response To Chemicals Through Multiscale Simulation

    EPA Science Inventory

    The US EPA Virtual Liver Project is aimed at simulating the risk of toxic effects from environmental chemicals in silico. The computational systems model of organ injury due to chronic chemical exposure is based on: (i) the dynamics of perturbed molecular pathways, (ii) their lin...

  8. Analyzing musculoskeletal neck pain, measured as present pain and periods of pain, with three different regression models: a cohort study.

    PubMed

    Grimby-Ekman, Anna; Andersson, Eva M; Hagberg, Mats

    2009-06-19

    In the literature there are discussions on the choice of outcome and the need for more longitudinal studies of musculoskeletal disorders. The general aim of this longitudinal study was to analyze musculoskeletal neck pain, in a group of young adults. Specific aims were to determine whether psychosocial factors, computer use, high work/study demands, and lifestyle are long-term or short-term factors for musculoskeletal neck pain, and whether these factors are important for developing or ongoing musculoskeletal neck pain. Three regression models were used to analyze the different outcomes. Pain at present was analyzed with a marginal logistic model, for number of years with pain a Poisson regression model was used and for developing and ongoing pain a logistic model was used. Presented results are odds ratios and proportion ratios (logistic models) and rate ratios (Poisson model). The material consisted of web-based questionnaires answered by 1204 Swedish university students from a prospective cohort recruited in 2002. Perceived stress was a risk factor for pain at present (PR = 1.6), for developing pain (PR = 1.7) and for number of years with pain (RR = 1.3). High work/study demands was associated with pain at present (PR = 1.6); and with number of years with pain when the demands negatively affect home life (RR = 1.3). Computer use pattern (number of times/week with a computer session > or = 4 h, without break) was a risk factor for developing pain (PR = 1.7), but also associated with pain at present (PR = 1.4) and number of years with pain (RR = 1.2). Among life style factors smoking (PR = 1.8) was found to be associated to pain at present. The difference between men and women in prevalence of musculoskeletal pain was confirmed in this study. It was smallest for the outcome ongoing pain (PR = 1.4) compared to pain at present (PR = 2.4) and developing pain (PR = 2.5). By using different regression models different aspects of neck pain pattern could be addressed and the risk factors impact on pain pattern was identified. Short-term risk factors were perceived stress, high work/study demands and computer use pattern (break pattern). Those were also long-term risk factors. For developing pain perceived stress and computer use pattern were risk factors.

  9. Analyzing musculoskeletal neck pain, measured as present pain and periods of pain, with three different regression models: a cohort study

    PubMed Central

    Grimby-Ekman, Anna; Andersson, Eva M; Hagberg, Mats

    2009-01-01

    Background In the literature there are discussions on the choice of outcome and the need for more longitudinal studies of musculoskeletal disorders. The general aim of this longitudinal study was to analyze musculoskeletal neck pain, in a group of young adults. Specific aims were to determine whether psychosocial factors, computer use, high work/study demands, and lifestyle are long-term or short-term factors for musculoskeletal neck pain, and whether these factors are important for developing or ongoing musculoskeletal neck pain. Methods Three regression models were used to analyze the different outcomes. Pain at present was analyzed with a marginal logistic model, for number of years with pain a Poisson regression model was used and for developing and ongoing pain a logistic model was used. Presented results are odds ratios and proportion ratios (logistic models) and rate ratios (Poisson model). The material consisted of web-based questionnaires answered by 1204 Swedish university students from a prospective cohort recruited in 2002. Results Perceived stress was a risk factor for pain at present (PR = 1.6), for developing pain (PR = 1.7) and for number of years with pain (RR = 1.3). High work/study demands was associated with pain at present (PR = 1.6); and with number of years with pain when the demands negatively affect home life (RR = 1.3). Computer use pattern (number of times/week with a computer session ≥ 4 h, without break) was a risk factor for developing pain (PR = 1.7), but also associated with pain at present (PR = 1.4) and number of years with pain (RR = 1.2). Among life style factors smoking (PR = 1.8) was found to be associated to pain at present. The difference between men and women in prevalence of musculoskeletal pain was confirmed in this study. It was smallest for the outcome ongoing pain (PR = 1.4) compared to pain at present (PR = 2.4) and developing pain (PR = 2.5). Conclusion By using different regression models different aspects of neck pain pattern could be addressed and the risk factors impact on pain pattern was identified. Short-term risk factors were perceived stress, high work/study demands and computer use pattern (break pattern). Those were also long-term risk factors. For developing pain perceived stress and computer use pattern were risk factors. PMID:19545386

  10. Toxcast and the Use of Human Relevant In Vitro Exposures ...

    EPA Pesticide Factsheets

    The path for incorporating new approach methods and technologies into quantitative chemical risk assessment poses a diverse set of scientific challenges. These challenges include sufficient coverage of toxicological mechanisms to meaningfully interpret negative test results, development of increasingly relevant test systems, computational modeling to integrate experimental data, putting results in a dose and exposure context, characterizing uncertainty, and efficient validation of the test systems and computational models. The presentation will cover progress at the U.S. EPA in systematically addressing each of these challenges and delivering more human-relevant risk-based assessments. This abstract does not necessarily reflect U.S. EPA policy. Presentation at the British Toxicological Society Annual Congress on ToxCast and the Use of Human Relevant In Vitro Exposures: Incorporating high-throughput exposure and toxicity testing data for 21st century risk assessments .

  11. Computer-Based Delivery Systems in College Developmental Writing: Increasing Access for At-Risk Students

    ERIC Educational Resources Information Center

    Olivier, Denise H.

    2016-01-01

    Purpose, Scope, and Method of Study: The purpose of this study was to compare student success rates in a college developmental writing course delivered in a conventional classroom to the same course using a computer-delivered model. The sample was drawn from a small, Midwestern community college. Students were enrolled in one of three sections…

  12. Poster — Thur Eve — 69: Computational Study of DVH-guided Cancer Treatment Planning Optimization Methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ghomi, Pooyan Shirvani; Zinchenko, Yuriy

    2014-08-15

    Purpose: To compare methods to incorporate the Dose Volume Histogram (DVH) curves into the treatment planning optimization. Method: The performance of three methods, namely, the conventional Mixed Integer Programming (MIP) model, a convex moment-based constrained optimization approach, and an unconstrained convex moment-based penalty approach, is compared using anonymized data of a prostate cancer patient. Three plans we generated using the corresponding optimization models. Four Organs at Risk (OARs) and one Tumor were involved in the treatment planning. The OARs and Tumor were discretized into total of 50,221 voxels. The number of beamlets was 943. We used commercially available optimization softwaremore » Gurobi and Matlab to solve the models. Plan comparison was done by recording the model runtime followed by visual inspection of the resulting dose volume histograms. Conclusion: We demonstrate the effectiveness of the moment-based approaches to replicate the set of prescribed DVH curves. The unconstrained convex moment-based penalty approach is concluded to have the greatest potential to reduce the computational effort and holds a promise of substantial computational speed up.« less

  13. An application of a hydraulic model simulator in flood risk assessment under changing climatic conditions

    NASA Astrophysics Data System (ADS)

    Doroszkiewicz, J. M.; Romanowicz, R. J.

    2016-12-01

    The standard procedure of climate change impact assessment on future hydrological extremes consists of a chain of consecutive actions, starting from the choice of GCM driven by an assumed CO2 scenario, through downscaling of climatic forcing to a catchment scale, estimation of hydrological extreme indices using hydrological modelling tools and subsequent derivation of flood risk maps with the help of a hydraulic model. Among many possible sources of uncertainty, the main are the uncertainties related to future climate scenarios, climate models, downscaling techniques and hydrological and hydraulic models. Unfortunately, we cannot directly assess the impact of these different sources of uncertainties on flood risk in future due to lack of observations of future climate realizations. The aim of this study is an assessment of a relative impact of different sources of uncertainty on the uncertainty of flood risk maps. Due to the complexity of the processes involved, an assessment of total uncertainty of maps of inundation probability might be very computer time consuming. As a way forward we present an application of a hydraulic model simulator based on a nonlinear transfer function model for the chosen locations along the river reach. The transfer function model parameters are estimated based on the simulations of the hydraulic model at each of the model cross-sections. The study shows that the application of a simulator substantially reduces the computer requirements related to the derivation of flood risk maps under future climatic conditions. Biala Tarnowska catchment, situated in southern Poland is used as a case study. Future discharges at the input to a hydraulic model are obtained using the HBV model and climate projections obtained from the EUROCORDEX project. The study describes a cascade of uncertainty related to different stages of the process of derivation of flood risk maps under changing climate conditions. In this context it takes into account the uncertainty of future climate projections, an uncertainty of flow routing model, the propagation of that uncertainty through the hydraulic model, and finally, the uncertainty related to the derivation of flood risk maps.

  14. Risk factors for keratinocyte skin cancer in patients diagnosed with melanoma, a large retrospective study.

    PubMed

    Espinosa, Pablo; Pfeiffer, Ruth M; García-Casado, Zaida; Requena, Celia; Landi, Maria Teresa; Kumar, Rajiv; Nagore, Eduardo

    2016-01-01

    Melanoma survivors are at an increased risk of developing other malignancies, including keratinocyte skin cancer (KSC). While it is known that many risk factors for melanoma also impact risk of KSC in the general population, no previous study has investigated risk factors for KSC development in melanoma patients. We assessed associations of personal and clinical characteristics, including skin phenotype and variations in the melanocortin 1 receptor (MC1R) gene, with KSC risk in melanoma patients. We used prospective follow-up information on 1200 patients treated for melanoma at the Instituto Valenciano de Oncología, Spain, between 2000 and 2011. We computed hazard ratios and 95% confidence intervals (CIs) for the association of clinical, personal and genetic characteristics with risk of KSC, squamous cell carcinoma (SCC), or basal cell carcinoma (BCC) from Cox proportional hazard models. Five-year cumulative incidence based on competing risk models of SCC, BCC or KSC overall was computed using multivariate subdistribution hazard models. To assess predictive performance of the models, we computed areas under the receiver-operating characteristic curves (AUCs, discriminatory power) using cross-validation. Median follow-up was 57.2 months; a KSC was detected in 163 patients (13.6%). In multivariable Cox models, age, sex, sunburns, chronic sun exposure, past personal history of non-melanoma skin cancer or other non-cutaneous neoplasia, and the MC1R variants p.D294H and p.R163Q were significantly associated with KSC risk. A cumulative incidence model including age, sex, personal history of KSC, and of other non-cutaneous neoplasia had an AUC of 0.76 (95% CI: 0.71-0.80). When p.D294H and p.R163Q variants were added to the model, the AUC increased to 0.81 (95% CI: 0.77-0.84) (p-value for difference <0.0001). In addition to age, sex, skin characteristics, and sun exposure, p.R163Q and p.D294H MC1R variants significantly increased KSC risk among melanoma patients. Our findings may help identify patients who could benefit most from preventive measures. Copyright © 2015 Elsevier Ltd. All rights reserved.

  15. Model-free and model-based reward prediction errors in EEG.

    PubMed

    Sambrook, Thomas D; Hardwick, Ben; Wills, Andy J; Goslin, Jeremy

    2018-05-24

    Learning theorists posit two reinforcement learning systems: model-free and model-based. Model-based learning incorporates knowledge about structure and contingencies in the world to assign candidate actions with an expected value. Model-free learning is ignorant of the world's structure; instead, actions hold a value based on prior reinforcement, with this value updated by expectancy violation in the form of a reward prediction error. Because they use such different learning mechanisms, it has been previously assumed that model-based and model-free learning are computationally dissociated in the brain. However, recent fMRI evidence suggests that the brain may compute reward prediction errors to both model-free and model-based estimates of value, signalling the possibility that these systems interact. Because of its poor temporal resolution, fMRI risks confounding reward prediction errors with other feedback-related neural activity. In the present study, EEG was used to show the presence of both model-based and model-free reward prediction errors and their place in a temporal sequence of events including state prediction errors and action value updates. This demonstration of model-based prediction errors questions a long-held assumption that model-free and model-based learning are dissociated in the brain. Copyright © 2018 Elsevier Inc. All rights reserved.

  16. The Use of Major Risk Factors for Computer-Based Distinction of Diabetic Patients with Ischemic Stroke and Without Stroke

    DTIC Science & Technology

    2001-10-25

    THE USE of MAJOR RISK FACTORS for COMPUTER-BASED DISTINCTION of DIABETIC PATIENTS with ISCHEMIC STROKE and WITHOUT STROKE Sibel Oge Merey1...highlighting the major risk factors of diabetic patients with non-embolic stroke and without stroke by performing dependency analysis and decision making...of Major Risk Factors for Computer-Based Distinction of Diabetic Patients with Ischemic Stroke and Without Stroke Contract Number Grant Number

  17. Development of a Risk-Based Comparison Methodology of Carbon Capture Technologies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Engel, David W.; Dalton, Angela C.; Dale, Crystal

    2014-06-01

    Given the varying degrees of maturity among existing carbon capture (CC) technology alternatives, an understanding of the inherent technical and financial risk and uncertainty associated with these competing technologies is requisite to the success of carbon capture as a viable solution to the greenhouse gas emission challenge. The availability of tools and capabilities to conduct rigorous, risk–based technology comparisons is thus highly desirable for directing valuable resources toward the technology option(s) with a high return on investment, superior carbon capture performance, and minimum risk. To address this research need, we introduce a novel risk-based technology comparison method supported by anmore » integrated multi-domain risk model set to estimate risks related to technological maturity, technical performance, and profitability. Through a comparison between solid sorbent and liquid solvent systems, we illustrate the feasibility of estimating risk and quantifying uncertainty in a single domain (modular analytical capability) as well as across multiple risk dimensions (coupled analytical capability) for comparison. This method brings technological maturity and performance to bear on profitability projections, and carries risk and uncertainty modeling across domains via inter-model sharing of parameters, distributions, and input/output. The integration of the models facilitates multidimensional technology comparisons within a common probabilistic risk analysis framework. This approach and model set can equip potential technology adopters with the necessary computational capabilities to make risk-informed decisions about CC technology investment. The method and modeling effort can also be extended to other industries where robust tools and analytical capabilities are currently lacking for evaluating nascent technologies.« less

  18. The Role of Inertia in Modeling Decisions from Experience with Instance-Based Learning

    PubMed Central

    Dutt, Varun; Gonzalez, Cleotilde

    2012-01-01

    One form of inertia is the tendency to repeat the last decision irrespective of the obtained outcomes while making decisions from experience (DFE). A number of computational models based upon the Instance-Based Learning Theory, a theory of DFE, have included different inertia implementations and have shown to simultaneously account for both risk-taking and alternations between alternatives. The role that inertia plays in these models, however, is unclear as the same model without inertia is also able to account for observed risk-taking quite well. This paper demonstrates the predictive benefits of incorporating one particular implementation of inertia in an existing IBL model. We use two large datasets, estimation and competition, from the Technion Prediction Tournament involving a repeated binary-choice task to show that incorporating an inertia mechanism in an IBL model enables it to account for the observed average risk-taking and alternations. Including inertia, however, does not help the model to account for the trends in risk-taking and alternations over trials compared to the IBL model without the inertia mechanism. We generalize the two IBL models, with and without inertia, to the competition set by using the parameters determined in the estimation set. The generalization process demonstrates both the advantages and disadvantages of including inertia in an IBL model. PMID:22685443

  19. The role of inertia in modeling decisions from experience with instance-based learning.

    PubMed

    Dutt, Varun; Gonzalez, Cleotilde

    2012-01-01

    One form of inertia is the tendency to repeat the last decision irrespective of the obtained outcomes while making decisions from experience (DFE). A number of computational models based upon the Instance-Based Learning Theory, a theory of DFE, have included different inertia implementations and have shown to simultaneously account for both risk-taking and alternations between alternatives. The role that inertia plays in these models, however, is unclear as the same model without inertia is also able to account for observed risk-taking quite well. This paper demonstrates the predictive benefits of incorporating one particular implementation of inertia in an existing IBL model. We use two large datasets, estimation and competition, from the Technion Prediction Tournament involving a repeated binary-choice task to show that incorporating an inertia mechanism in an IBL model enables it to account for the observed average risk-taking and alternations. Including inertia, however, does not help the model to account for the trends in risk-taking and alternations over trials compared to the IBL model without the inertia mechanism. We generalize the two IBL models, with and without inertia, to the competition set by using the parameters determined in the estimation set. The generalization process demonstrates both the advantages and disadvantages of including inertia in an IBL model.

  20. A method for mapping fire hazard and risk across multiple scales and its application in fire management

    Treesearch

    Robert E. Keane; Stacy A. Drury; Eva C. Karau; Paul F. Hessburg; Keith M. Reynolds

    2010-01-01

    This paper presents modeling methods for mapping fire hazard and fire risk using a research model called FIREHARM (FIRE Hazard and Risk Model) that computes common measures of fire behavior, fire danger, and fire effects to spatially portray fire hazard over space. FIREHARM can compute a measure of risk associated with the distribution of these measures over time using...

  1. Biological and statistical approaches to predicting human lung cancer risk from silica.

    PubMed

    Kuempel, E D; Tran, C L; Bailer, A J; Porter, D W; Hubbs, A F; Castranova, V

    2001-01-01

    Chronic inflammation is a key step in the pathogenesis of particle-elicited fibrosis and lung cancer in rats, and possibly in humans. In this study, we compute the excess risk estimates for lung cancer in humans with occupational exposure to crystalline silica, using both rat and human data, and using both a threshold approach and linear models. From a toxicokinetic/dynamic model fit to lung burden and pulmonary response data from a subchronic inhalation study in rats, we estimated the minimum critical quartz lung burden (Mcrit) associated with reduced pulmonary clearance and increased neutrophilic inflammation. A chronic study in rats was also used to predict the human excess risk of lung cancer at various quartz burdens, including mean Mcrit (0.39 mg/g lung). We used a human kinetic lung model to link the equivalent lung burdens to external exposures in humans. We then computed the excess risk of lung cancer at these external exposures, using data of workers exposed to respirable crystalline silica and using Poisson regression and lifetable analyses. Finally, we compared the lung cancer excess risks estimated from male rat and human data. We found that the rat-based linear model estimates were approximately three times higher than those based on human data (e.g., 2.8% in rats vs. 0.9-1% in humans, at mean Mcrit lung burden or associated mean working lifetime exposure of 0.036 mg/m3). Accounting for variability and uncertainty resulted in 100-1000 times lower estimates of human critical lung burden and airborne exposure. This study illustrates that assumptions about the relevant biological mechanism, animal model, and statistical approach can all influence the magnitude of lung cancer risk estimates in humans exposed to crystalline silica.

  2. Model based verification of the Secure Socket Layer (SSL) Protocol for NASA systems

    NASA Technical Reports Server (NTRS)

    Powell, John D.; Gilliam, David

    2004-01-01

    The National Aeronautics and Space Administration (NASA) has tens of thousands of networked computer systems and applications. Software Security vulnerabilities present risks such as lost or corrupted data, information theft, and unavailability of critical systems. These risks represent potentially enormous costs to NASA. The NASA Code Q research initiative 'Reducing Software Security Risk (RSSR) Trough an Integrated Approach' offers formal verification of information technology (IT), through the creation of a Software Security Assessment Instrument (SSAI), to address software security risks.

  3. Evaluating Emulation-based Models of Distributed Computing Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jones, Stephen T.; Gabert, Kasimir G.; Tarman, Thomas D.

    Emulation-based models of distributed computing systems are collections of virtual ma- chines, virtual networks, and other emulation components configured to stand in for oper- ational systems when performing experimental science, training, analysis of design alterna- tives, test and evaluation, or idea generation. As with any tool, we should carefully evaluate whether our uses of emulation-based models are appropriate and justified. Otherwise, we run the risk of using a model incorrectly and creating meaningless results. The variety of uses of emulation-based models each have their own goals and deserve thoughtful evaluation. In this paper, we enumerate some of these uses andmore » describe approaches that one can take to build an evidence-based case that a use of an emulation-based model is credible. Predictive uses of emulation-based models, where we expect a model to tell us something true about the real world, set the bar especially high and the principal evaluation method, called validation , is comensurately rigorous. We spend the majority of our time describing and demonstrating the validation of a simple predictive model using a well-established methodology inherited from decades of development in the compuational science and engineering community.« less

  4. Building a Computer Program to Support Children, Parents, and Distraction during Healthcare Procedures

    PubMed Central

    McCarthy, Ann Marie; Kleiber, Charmaine; Ataman, Kaan; Street, W. Nick; Zimmerman, M. Bridget; Ersig, Anne L.

    2012-01-01

    This secondary data analysis used data mining methods to develop predictive models of child risk for distress during a healthcare procedure. Data used came from a study that predicted factors associated with children’s responses to an intravenous catheter insertion while parents provided distraction coaching. From the 255 items used in the primary study, 44 predictive items were identified through automatic feature selection and used to build support vector machine regression models. Models were validated using multiple cross-validation tests and by comparing variables identified as explanatory in the traditional versus support vector machine regression. Rule-based approaches were applied to the model outputs to identify overall risk for distress. A decision tree was then applied to evidence-based instructions for tailoring distraction to characteristics and preferences of the parent and child. The resulting decision support computer application, the Children, Parents and Distraction (CPaD), is being used in research. Future use will support practitioners in deciding the level and type of distraction intervention needed by a child undergoing a healthcare procedure. PMID:22805121

  5. Space Shuttle critical function audit

    NASA Technical Reports Server (NTRS)

    Sacks, Ivan J.; Dipol, John; Su, Paul

    1990-01-01

    A large fault-tolerance model of the main propulsion system of the US space shuttle has been developed. This model is being used to identify single components and pairs of components that will cause loss of shuttle critical functions. In addition, this model is the basis for risk quantification of the shuttle. The process used to develop and analyze the model is digraph matrix analysis (DMA). The DMA modeling and analysis process is accessed via a graphics-based computer user interface. This interface provides coupled display of the integrated system schematics, the digraph models, the component database, and the results of the fault tolerance and risk analyses.

  6. Real time forest fire warning and forest fire risk zoning: a Vietnamese case study

    NASA Astrophysics Data System (ADS)

    Chu, T.; Pham, D.; Phung, T.; Ha, A.; Paschke, M.

    2016-12-01

    Forest fire occurs seriously in Vietnam and has been considered as one of the major causes of forest lost and degradation. Several studies of forest fire risk warning were conducted using Modified Nesterov Index (MNI) but remaining shortcomings and inaccurate predictions that needs to be urgently improved. In our study, several important topographic and social factors such as aspect, slope, elevation, distance to residential areas and road system were considered as "permanent" factors while meteorological data were updated hourly using near-real-time (NRT) remotely sensed data (i.e. MODIS Terra/Aqua and TRMM) for the prediction and warning of fire. Due to the limited number of weather stations in Vietnam, data from all active stations (i.e. 178) were used with the satellite data to calibrate and upscale meteorological variables. These data with finer resolution were then used to generate MNI. The only significant "permanent" factors were selected as input variables based on the correlation coefficients that computed from multi-variable regression among true fire-burning (collected from 1/2007) and its spatial characteristics. These coefficients also used to suggest appropriate weight for computing forest fire risk (FR) model. Forest fire risk model was calculated from the MNI and the selected factors using fuzzy regression models (FRMs) and GIS based multi-criteria analysis. By this approach, the FR was slightly modified from MNI by the integrated use of various factors in our fire warning and prediction model. Multifactor-based maps of forest fire risk zone were generated from classifying FR into three potential danger levels. Fire risk maps were displayed using webgis technology that is easy for managing data and extracting reports. Reported fire-burnings thereafter have been used as true values for validating the forest fire risk. Fire probability has strong relationship with potential danger levels (varied from 5.3% to 53.8%) indicating that the higher potential risk, the more chance of fire happen. By adding spatial factors to continuous daily updated remote sensing based meteo-data, results are valuable for both mapping forest fire risk zones in short and long-term and real time fire warning in Vietnam. Key words: Near-real-time, forest fire warning, fuzzy regression model, remote sensing.

  7. Quantile uncertainty and value-at-risk model risk.

    PubMed

    Alexander, Carol; Sarabia, José María

    2012-08-01

    This article develops a methodology for quantifying model risk in quantile risk estimates. The application of quantile estimates to risk assessment has become common practice in many disciplines, including hydrology, climate change, statistical process control, insurance and actuarial science, and the uncertainty surrounding these estimates has long been recognized. Our work is particularly important in finance, where quantile estimates (called Value-at-Risk) have been the cornerstone of banking risk management since the mid 1980s. A recent amendment to the Basel II Accord recommends additional market risk capital to cover all sources of "model risk" in the estimation of these quantiles. We provide a novel and elegant framework whereby quantile estimates are adjusted for model risk, relative to a benchmark which represents the state of knowledge of the authority that is responsible for model risk. A simulation experiment in which the degree of model risk is controlled illustrates how to quantify Value-at-Risk model risk and compute the required regulatory capital add-on for banks. An empirical example based on real data shows how the methodology can be put into practice, using only two time series (daily Value-at-Risk and daily profit and loss) from a large bank. We conclude with a discussion of potential applications to nonfinancial risks. © 2012 Society for Risk Analysis.

  8. Globally-Applicable Predictive Wildfire Model   a Temporal-Spatial GIS Based Risk Analysis Using Data Driven Fuzzy Logic Functions

    NASA Astrophysics Data System (ADS)

    van den Dool, G.

    2017-11-01

    This study (van den Dool, 2017) is a proof of concept for a global predictive wildfire model, in which the temporal-spatial characteristics of wildfires are placed in a Geographical Information System (GIS), and the risk analysis is based on data-driven fuzzy logic functions. The data sources used in this model are available as global datasets, but subdivided into three pilot areas: North America (California/Nevada), Europe (Spain), and Asia (Mongolia), and are downscaled to the highest resolution (3-arc second). The GIS is constructed around three themes: topography, fuel availability and climate. From the topographical data, six derived sub-themes are created and converted to a fuzzy membership based on the catchment area statistics. The fuel availability score is a composite of four data layers: land cover, wood loads, biomass, biovolumes. As input for the climatological sub-model reanalysed daily averaged, weather-related data is used, which is accumulated to a global weekly time-window (to account for the uncertainty within the climatological model) and forms the temporal component of the model. The final product is a wildfire risk score (from 0 to 1) by week, representing the average wildfire risk in an area. To compute the potential wildfire risk the sub-models are combined usinga Multi-Criteria Approach, and the model results are validated against the area under the Receiver Operating Characteristic curve.

  9. Engineering models for catastrophe risk and their application to insurance

    NASA Astrophysics Data System (ADS)

    Dong, Weimin

    2002-06-01

    Internationally earthquake insurance, like all other insurance (fire, auto), adopted actuarial approach in the past, which is, based on historical loss experience to determine insurance rate. Due to the fact that earthquake is a rare event with severe consequence, irrational determination of premium rate and lack of understanding scale of potential loss led to many insurance companies insolvent after Northridge earthquake in 1994. Along with recent advances in earth science, computer science and engineering, computerized loss estimation methodologies based on first principles have been developed to the point that losses from destructive earthquakes can be quantified with reasonable accuracy using scientific modeling techniques. This paper intends to introduce how engineering models can assist to quantify earthquake risk and how insurance industry can use this information to manage their risk in the United States and abroad.

  10. How Adverse Outcome Pathways Can Aid the Development and Use of Computational Prediction Models for Regulatory Toxicology

    PubMed Central

    Aladjov, Hristo; Ankley, Gerald; Byrne, Hugh J.; de Knecht, Joop; Heinzle, Elmar; Klambauer, Günter; Landesmann, Brigitte; Luijten, Mirjam; MacKay, Cameron; Maxwell, Gavin; Meek, M. E. (Bette); Paini, Alicia; Perkins, Edward; Sobanski, Tomasz; Villeneuve, Dan; Waters, Katrina M.; Whelan, Maurice

    2017-01-01

    Efforts are underway to transform regulatory toxicology and chemical safety assessment from a largely empirical science based on direct observation of apical toxicity outcomes in whole organism toxicity tests to a predictive one in which outcomes and risk are inferred from accumulated mechanistic understanding. The adverse outcome pathway (AOP) framework provides a systematic approach for organizing knowledge that may support such inference. Likewise, computational models of biological systems at various scales provide another means and platform to integrate current biological understanding to facilitate inference and extrapolation. We argue that the systematic organization of knowledge into AOP frameworks can inform and help direct the design and development of computational prediction models that can further enhance the utility of mechanistic and in silico data for chemical safety assessment. This concept was explored as part of a workshop on AOP-Informed Predictive Modeling Approaches for Regulatory Toxicology held September 24–25, 2015. Examples of AOP-informed model development and its application to the assessment of chemicals for skin sensitization and multiple modes of endocrine disruption are provided. The role of problem formulation, not only as a critical phase of risk assessment, but also as guide for both AOP and complementary model development is described. Finally, a proposal for actively engaging the modeling community in AOP-informed computational model development is made. The contents serve as a vision for how AOPs can be leveraged to facilitate development of computational prediction models needed to support the next generation of chemical safety assessment. PMID:27994170

  11. Analyzing systemic risk using non-linear marginal expected shortfall and its minimum spanning tree

    NASA Astrophysics Data System (ADS)

    Song, Jae Wook; Ko, Bonggyun; Chang, Woojin

    2018-02-01

    The aim of this paper is to propose a new theoretical framework for analyzing the systemic risk using the marginal expected shortfall (MES) and its correlation-based minimum spanning tree (MST). At first, we develop two parametric models of MES with their closed-form solutions based on the Capital Asset Pricing Model. Our models are derived from the non-symmetric quadratic form, which allows them to consolidate the non-linear relationship between the stock and market returns. Secondly, we discover the evidences related to the utility of our models and the possible association in between the non-linear relationship and the emergence of severe systemic risk by considering the US financial system as a benchmark. In this context, the evolution of MES also can be regarded as a reasonable proxy of systemic risk. Lastly, we analyze the structural properties of the systemic risk using the MST based on the computed series of MES. The topology of MST conveys the presence of sectoral clustering and strong co-movements of systemic risk leaded by few hubs during the crisis. Specifically, we discover that the Depositories are the majority sector leading the connections during the Non-Crisis period, whereas the Broker-Dealers are majority during the Crisis period.

  12. A probabilistic method for computing quantitative risk indexes from medical injuries compensation claims.

    PubMed

    Dalle Carbonare, S; Folli, F; Patrini, E; Giudici, P; Bellazzi, R

    2013-01-01

    The increasing demand of health care services and the complexity of health care delivery require Health Care Organizations (HCOs) to approach clinical risk management through proper methods and tools. An important aspect of risk management is to exploit the analysis of medical injuries compensation claims in order to reduce adverse events and, at the same time, to optimize the costs of health insurance policies. This work provides a probabilistic method to estimate the risk level of a HCO by computing quantitative risk indexes from medical injury compensation claims. Our method is based on the estimate of a loss probability distribution from compensation claims data through parametric and non-parametric modeling and Monte Carlo simulations. The loss distribution can be estimated both on the whole dataset and, thanks to the application of a Bayesian hierarchical model, on stratified data. The approach allows to quantitatively assessing the risk structure of the HCO by analyzing the loss distribution and deriving its expected value and percentiles. We applied the proposed method to 206 cases of injuries with compensation requests collected from 1999 to the first semester of 2007 by the HCO of Lodi, in the Northern part of Italy. We computed the risk indexes taking into account the different clinical departments and the different hospitals involved. The approach proved to be useful to understand the HCO risk structure in terms of frequency, severity, expected and unexpected loss related to adverse events.

  13. Computational fluid dynamics modelling in cardiovascular medicine

    PubMed Central

    Morris, Paul D; Narracott, Andrew; von Tengg-Kobligk, Hendrik; Silva Soto, Daniel Alejandro; Hsiao, Sarah; Lungu, Angela; Evans, Paul; Bressloff, Neil W; Lawford, Patricia V; Hose, D Rodney; Gunn, Julian P

    2016-01-01

    This paper reviews the methods, benefits and challenges associated with the adoption and translation of computational fluid dynamics (CFD) modelling within cardiovascular medicine. CFD, a specialist area of mathematics and a branch of fluid mechanics, is used routinely in a diverse range of safety-critical engineering systems, which increasingly is being applied to the cardiovascular system. By facilitating rapid, economical, low-risk prototyping, CFD modelling has already revolutionised research and development of devices such as stents, valve prostheses, and ventricular assist devices. Combined with cardiovascular imaging, CFD simulation enables detailed characterisation of complex physiological pressure and flow fields and the computation of metrics which cannot be directly measured, for example, wall shear stress. CFD models are now being translated into clinical tools for physicians to use across the spectrum of coronary, valvular, congenital, myocardial and peripheral vascular diseases. CFD modelling is apposite for minimally-invasive patient assessment. Patient-specific (incorporating data unique to the individual) and multi-scale (combining models of different length- and time-scales) modelling enables individualised risk prediction and virtual treatment planning. This represents a significant departure from traditional dependence upon registry-based, population-averaged data. Model integration is progressively moving towards ‘digital patient’ or ‘virtual physiological human’ representations. When combined with population-scale numerical models, these models have the potential to reduce the cost, time and risk associated with clinical trials. The adoption of CFD modelling signals a new era in cardiovascular medicine. While potentially highly beneficial, a number of academic and commercial groups are addressing the associated methodological, regulatory, education- and service-related challenges. PMID:26512019

  14. Computer simulation models of pre-diabetes populations: a systematic review protocol

    PubMed Central

    Khurshid, Waqar; Pagano, Eva; Feenstra, Talitha

    2017-01-01

    Introduction Diabetes is a major public health problem and prediabetes (intermediate hyperglycaemia) is associated with a high risk of developing diabetes. With evidence supporting the use of preventive interventions for prediabetes populations and the discovery of novel biomarkers stratifying the risk of progression, there is a need to evaluate their cost-effectiveness across jurisdictions. In diabetes and prediabetes, it is relevant to inform cost-effectiveness analysis using decision models due to their ability to forecast long-term health outcomes and costs beyond the time frame of clinical trials. To support good implementation and reimbursement decisions of interventions in these populations, models should be clinically credible, based on best available evidence, reproducible and validated against clinical data. Our aim is to identify recent studies on computer simulation models and model-based economic evaluations of populations of individuals with prediabetes, qualify them and discuss the knowledge gaps, challenges and opportunities that need to be addressed for future evaluations. Methods and analysis A systematic review will be conducted in MEDLINE, Embase, EconLit and National Health Service Economic Evaluation Database. We will extract peer-reviewed studies published between 2000 and 2016 that describe computer simulation models of the natural history of individuals with prediabetes and/or decision models to evaluate the impact of interventions, risk stratification and/or screening on these populations. Two reviewers will independently assess each study for inclusion. Data will be extracted using a predefined pro forma developed using best practice. Study quality will be assessed using a modelling checklist. A narrative synthesis of all studies will be presented, focussing on model structure, quality of models and input data, and validation status. Ethics and dissemination This systematic review is exempt from ethics approval because the work is carried out on published documents. The findings of the review will be disseminated in a related peer-reviewed journal and presented at conferences. Reviewregistration number CRD42016047228. PMID:28982807

  15. A stochastic whole-body physiologically based pharmacokinetic model to assess the impact of inter-individual variability on tissue dosimetry over the human lifespan.

    PubMed

    Beaudouin, Rémy; Micallef, Sandrine; Brochot, Céline

    2010-06-01

    Physiologically based pharmacokinetic (PBPK) models have proven to be successful in integrating and evaluating the influence of age- or gender-dependent changes with respect to the pharmacokinetics of xenobiotics throughout entire lifetimes. Nevertheless, for an effective application of toxicokinetic modelling to chemical risk assessment, a PBPK model has to be detailed enough to include all the multiple tissues that could be targeted by the various xenobiotics present in the environment. For this reason, we developed a PBPK model based on a detailed compartmentalization of the human body and parameterized with new relationships describing the time evolution of physiological and anatomical parameters. To take into account the impact of human variability on the predicted toxicokinetics, we defined probability distributions for key parameters related to the xenobiotics absorption, distribution, metabolism and excretion. The model predictability was evaluated by a direct comparison between computational predictions and experimental data for the internal concentrations of two chemicals (1,3-butadiene and 2,3,7,8-tetrachlorodibenzo-p-dioxin). A good agreement between predictions and observed data was achieved for different scenarios of exposure (e.g., acute or chronic exposure and different populations). Our results support that the general stochastic PBPK model can be a valuable computational support in the area of chemical risk analysis. (c)2010 Elsevier Inc. All rights reserved.

  16. Teaching Subtraction and Multiplication with Regrouping Using the Concrete-Representational-Abstract Sequence and Strategic Instruction Model

    ERIC Educational Resources Information Center

    Flores, Margaret M.; Hinton, Vanessa; Strozier, Shaunita D.

    2014-01-01

    Based on Common Core Standards (2010), mathematics interventions should emphasize conceptual understanding of numbers and operations as well as fluency. For students at risk for failure, the concrete-representational-abstract (CRA) sequence and the Strategic Instruction Model (SIM) have been shown effective in teaching computation with an emphasis…

  17. Predictive Accuracy of the Liverpool Lung Project Risk Model for Stratifying Patients for Computed Tomography Screening for Lung Cancer

    PubMed Central

    Raji, Olaide Y.; Duffy, Stephen W.; Agbaje, Olorunshola F.; Baker, Stuart G.; Christiani, David C.; Cassidy, Adrian; Field, John K.

    2013-01-01

    Background External validation of existing lung cancer risk prediction models is limited. Using such models in clinical practice to guide the referral of patients for computed tomography (CT) screening for lung cancer depends on external validation and evidence of predicted clinical benefit. Objective To evaluate the discrimination of the Liverpool Lung Project (LLP) risk model and demonstrate its predicted benefit for stratifying patients for CT screening by using data from 3 independent studies from Europe and North America. Design Case–control and prospective cohort study. Setting Europe and North America. Patients Participants in the European Early Lung Cancer (EUELC) and Harvard case–control studies and the LLP population-based prospective cohort (LLPC) study. Measurements 5-year absolute risks for lung cancer predicted by the LLP model. Results The LLP risk model had good discrimination in both the Harvard (area under the receiver-operating characteristic curve [AUC], 0.76 [95% CI, 0.75 to 0.78]) and the LLPC (AUC, 0.82 [CI, 0.80 to 0.85]) studies and modest discrimination in the EUELC (AUC, 0.67 [CI, 0.64 to 0.69]) study. The decision utility analysis, which incorporates the harms and benefit of using a risk model to make clinical decisions, indicates that the LLP risk model performed better than smoking duration or family history alone in stratifying high-risk patients for lung cancer CT screening. Limitations The model cannot assess whether including other risk factors, such as lung function or genetic markers, would improve accuracy. Lack of information on asbestos exposure in the LLPC limited the ability to validate the complete LLP risk model. Conclusion Validation of the LLP risk model in 3 independent external data sets demonstrated good discrimination and evidence of predicted benefits for stratifying patients for lung cancer CT screening. Further studies are needed to prospectively evaluate model performance and evaluate the optimal population risk thresholds for initiating lung cancer screening. Primary Funding Source Roy Castle Lung Cancer Foundation. PMID:22910935

  18. Quantifying aggregated uncertainty in Plasmodium falciparum malaria prevalence and populations at risk via efficient space-time geostatistical joint simulation.

    PubMed

    Gething, Peter W; Patil, Anand P; Hay, Simon I

    2010-04-01

    Risk maps estimating the spatial distribution of infectious diseases are required to guide public health policy from local to global scales. The advent of model-based geostatistics (MBG) has allowed these maps to be generated in a formal statistical framework, providing robust metrics of map uncertainty that enhance their utility for decision-makers. In many settings, decision-makers require spatially aggregated measures over large regions, such as the mean prevalence within a country or administrative region, or national populations living under different levels of risk. Existing MBG mapping approaches provide suitable metrics of local uncertainty (the fidelity of predictions at each mapped pixel) but have not been adapted for measuring uncertainty over large areas, due largely to a series of fundamental computational constraints. Here the authors present a new, efficient approximating algorithm that can generate, for the first time, the necessary joint simulation of prevalence values across the very large prediction spaces needed for global-scale mapping. This new approach is implemented in conjunction with an established model for P. falciparum, allowing robust estimates of mean prevalence at any specified level of spatial aggregation. The model is used to provide estimates of national populations at risk under three policy-relevant prevalence thresholds, along with accompanying model-based measures of uncertainty. By overcoming previously unchallenged computational barriers, this study illustrates how MBG approaches, already at the forefront of infectious disease mapping, can be extended to provide large-scale aggregate measures appropriate for decision-makers.
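
    The need for joint (rather than pixel-by-pixel) simulation can be illustrated with the Python sketch below: correlated prevalence realizations are drawn over a toy set of pixels and each realization is aggregated into a regional mean and a population-at-risk total, so the spread of those aggregates reflects the spatial correlation. The covariance, populations and threshold are hypothetical, and the paper's efficient approximating algorithm is not reproduced; this is a brute-force Cholesky draw for a small example.

        import numpy as np

        rng = np.random.default_rng(1)

        # Toy 1-D "region" of pixels with an exponential spatial covariance
        n_pix = 200
        dist = np.abs(np.subtract.outer(np.arange(n_pix), np.arange(n_pix)))
        cov = 0.02 * np.exp(-dist / 25.0)           # hypothetical prevalence covariance
        mean_prev = np.full(n_pix, 0.15)            # hypothetical per-pixel posterior mean
        pop = rng.integers(500, 5000, size=n_pix)   # hypothetical pixel populations

        # Joint simulation: each realization preserves the correlation between pixels
        n_real = 1000
        L = np.linalg.cholesky(cov + 1e-10 * np.eye(n_pix))
        prev = np.clip(mean_prev + rng.standard_normal((n_real, n_pix)) @ L.T, 0.0, 1.0)

        # Aggregate per realization -> uncertainty of the regional summaries
        regional_mean = (prev * pop).sum(axis=1) / pop.sum()
        pop_at_risk = (pop * (prev > 0.4)).sum(axis=1)   # people in pixels above a 40% threshold
        print(f"regional mean prevalence: {regional_mean.mean():.3f} "
              f"(95% interval {np.percentile(regional_mean, 2.5):.3f}-{np.percentile(regional_mean, 97.5):.3f})")
        print(f"median population above threshold: {np.median(pop_at_risk):,.0f}")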

  19. CFD: computational fluid dynamics or confounding factor dissemination? The role of hemodynamics in intracranial aneurysm rupture risk assessment.

    PubMed

    Xiang, J; Tutino, V M; Snyder, K V; Meng, H

    2014-10-01

    Image-based computational fluid dynamics holds a prominent position in the evaluation of intracranial aneurysms, especially as a promising tool to stratify rupture risk. Current computational fluid dynamics findings correlating both high and low wall shear stress with intracranial aneurysm growth and rupture puzzle researchers and clinicians alike. These conflicting findings may stem from inconsistent parameter definitions, small datasets, and intrinsic complexities in intracranial aneurysm growth and rupture. In Part 1 of this 2-part review, we proposed a unifying hypothesis: both high and low wall shear stress drive intracranial aneurysm growth and rupture through mural cell-mediated and inflammatory cell-mediated destructive remodeling pathways, respectively. In the present report, Part 2, we delineate different wall shear stress parameter definitions and survey recent computational fluid dynamics studies, in light of this mechanistic heterogeneity. In the future, we expect that larger datasets, better analyses, and increased understanding of hemodynamic-biologic mechanisms will lead to more accurate predictive models for intracranial aneurysm risk assessment from computational fluid dynamics. © 2014 by American Journal of Neuroradiology.

  20. Risk Assessment of Alzheimer's Disease using the Information Diffusion Model from Structural Magnetic Resonance Imaging.

    PubMed

    Beheshti, Iman; Olya, Hossain G T; Demirel, Hasan

    2016-04-05

    Recently, automatic risk assessment methods have become a target for the detection of Alzheimer's disease (AD) risk. This study aims to develop an automatic computer-aided AD diagnosis technique for risk assessment of AD using information diffusion theory. Information diffusion is a fuzzy, set-valued mathematical method used for risk assessment of natural phenomena, which addresses fuzziness (uncertainty) and incompleteness in the data. Data were obtained from voxel-based morphometry analysis of structural magnetic resonance imaging. The information diffusion model results revealed that the risk of AD increases with a reduction of the normalized gray matter ratio (p > 0.5, normalized gray matter ratio <40%). The information diffusion model results were evaluated by calculating their correlation with two traditional risk assessments of AD, the Mini-Mental State Examination and the Clinical Dementia Rating. The correlation results revealed that the information diffusion model findings were in line with Mini-Mental State Examination and Clinical Dementia Rating results. Application of the information diffusion model contributes to the computerization of risk assessment of AD, which has practical implications for the early detection of AD.
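
    The core estimator behind information diffusion can be sketched compactly: each observation spreads its "information" over a grid of monitoring points with a Gaussian kernel, which yields a smooth probability estimate even from a small, incomplete sample. The Python sketch below uses the common normal-diffusion form with hypothetical grey-matter ratios and an illustrative bandwidth; it is not the paper's MRI pipeline or its fitted model.

        import numpy as np

        def information_diffusion(samples, u, h):
            """Diffuse each observation over monitoring points u with a Gaussian kernel,
            normalise so each observation contributes unit information, and return the
            estimated probability mass at each monitoring point."""
            kernel = np.exp(-((samples[:, None] - u[None, :]) ** 2) / (2.0 * h ** 2))
            kernel /= kernel.sum(axis=1, keepdims=True)   # each sample spreads total mass 1
            q = kernel.sum(axis=0)                        # accumulated information per point
            return q / q.sum()

        # Hypothetical small sample of normalized gray matter ratios (%)
        gm_ratio = np.array([34.0, 36.5, 38.0, 41.0, 43.5, 45.0, 47.5])
        u = np.linspace(30.0, 50.0, 41)                   # monitoring points
        p = information_diffusion(gm_ratio, u, h=1.5)     # h: diffusion coefficient (bandwidth)

        # Estimated probability of falling below a 40% gray matter ratio
        print(f"P(GM ratio < 40%) ~= {p[u < 40.0].sum():.2f}")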

  1. Investigating Uncertainty and Sensitivity in Integrated, Multimedia Environmental Models: Tools for FRAMES-3MRA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Babendreier, Justin E.; Castleton, Karl J.

    2005-08-01

    Elucidating uncertainty and sensitivity structures in environmental models can be a difficult task, even for low-order, single-medium constructs driven by a unique set of site-specific data. Quantitative assessment of integrated, multimedia models that simulate hundreds of sites, spanning multiple geographical and ecological regions, will ultimately require a comparative approach using several techniques, coupled with sufficient computational power. The Framework for Risk Analysis in Multimedia Environmental Systems - Multimedia, Multipathway, and Multireceptor Risk Assessment (FRAMES-3MRA) is an important software model being developed by the United States Environmental Protection Agency for use in risk assessment of hazardous waste management facilities. The 3MRA modeling system includes a set of 17 science modules that collectively simulate release, fate and transport, exposure, and risk associated with hazardous contaminants disposed of in land-based waste management units (WMU).

  2. Conservation Risks: When Will Rhinos be Extinct?

    PubMed

    Haas, Timothy C; Ferreira, Sam M

    2016-08-01

    We develop a risk intelligence system for biodiversity enterprises. Such enterprises depend on a supply of endangered species for their revenue. Many of these enterprises, however, cannot purchase a supply of this resource and are largely unable to secure the resource against theft in the form of poaching. Because replacements are not available once a species becomes extinct, insurance products are not available to reduce the risk exposure of these enterprises to an extinction event. For many species, the dynamics of anthropogenic impacts driven by economic as well as noneconomic values of associated wildlife products, along with their ecological stressors, can help meaningfully predict extinction risks. We develop an agent/individual-based economic-ecological model that captures these effects and apply it to the case of South African rhinos. Our model uses observed rhino dynamics and poaching statistics. It seeks to predict rhino extinction under the present scenario, which has no legal horn trade but allows live African rhino trade and legal hunting. Present rhino populations are small and threatened by a rising onslaught of poaching. This scenario and its associated dynamics predict continued decline in rhino population size, with accelerated extinction risks for rhinos by 2036. Our model supports the computation of extinction risks at any future time point. This capability can be used to evaluate the effectiveness of proposed conservation strategies at reducing a species' extinction risk. Models used to compute risk predictions, however, need to be statistically estimated. We point out that statistically fitting such models to observations will involve massive numbers of observations on consumer behavior and time-stamped location observations on thousands of animals. Finally, we propose Big Data algorithms to perform such estimates and to interpret the fitted model's output.

  3. Review on pen-and-paper-based observational methods for assessing ergonomic risk factors of computer work.

    PubMed

    Rahman, Mohd Nasrull Abdol; Mohamad, Siti Shafika

    2017-01-01

    Computer work is associated with musculoskeletal disorders (MSDs). Several methods have been developed to assess computer work risk factors related to MSDs. This review aims to give an overview of the pen-and-paper-based observational methods currently available for assessing ergonomic risk factors of computer work. We searched an electronic database for materials from 1992 until 2015. The selected methods were focused on computer work, pen-and-paper observational methods, office risk factors and musculoskeletal disorders. This review was developed to assess the risk factors, reliability and validity of pen-and-paper observational methods associated with computer work. Two evaluators independently carried out this review. Seven observational methods used to assess exposure to office risk factors for work-related musculoskeletal disorders were identified. The risk factors covered by current pen-and-paper-based observational tools were postures, office components, force and repetition. Of the seven methods, only five had been tested for reliability; these were shown to be reliable and were rated as moderate to good. For validity, only four of the seven methods had been tested, and the results were moderate. Many observational tools already exist, but no single tool appears to cover all of the risk factors, including working posture, office components, force, repetition and office environment, at office workstations and computer work. Although the most important factor in developing a tool is proper validation of the exposure assessment technique, several of the existing observational methods have not been tested for reliability and validity. Furthermore, this review could provide researchers with ways to improve pen-and-paper-based observational methods for assessing ergonomic risk factors of computer work.

  4. Dopamine Receptor-Specific Contributions to the Computation of Value.

    PubMed

    Burke, Christopher J; Soutschek, Alexander; Weber, Susanna; Raja Beharelle, Anjali; Fehr, Ernst; Haker, Helene; Tobler, Philippe N

    2018-05-01

    Dopamine is thought to play a crucial role in value-based decision making. However, the specific contributions of different dopamine receptor subtypes to the computation of subjective value remain unknown. Here we demonstrate how the balance between D1 and D2 dopamine receptor subtypes shapes subjective value computation during risky decision making. We administered the D2 receptor antagonist amisulpride or placebo before participants made choices between risky options. Compared with placebo, D2 receptor blockade resulted in more frequent choice of higher risk and higher expected value options. Using a novel model fitting procedure, we concurrently estimated the three parameters that define individual risk attitude according to an influential theoretical account of risky decision making (prospect theory). This analysis revealed that the observed reduction in risk aversion under amisulpride was driven by increased sensitivity to reward magnitude and decreased distortion of outcome probability, resulting in more linear value coding. Our data suggest that different components that govern individual risk attitude are under dopaminergic control, such that D2 receptor blockade facilitates risk taking and expected value processing.
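
    The kind of three-parameter valuation the analysis refers to can be sketched as follows: a power utility over reward magnitude, a nonlinear weighting of outcome probability, and a softmax temperature for choice noise. The Python sketch below uses a Prelec weighting function and hypothetical parameter values purely to show how lower probability distortion and higher magnitude sensitivity shift choices toward the risky option; it is not the authors' exact model or fitting procedure.

        import numpy as np

        def weight(prob, gamma):
            """One-parameter Prelec probability-weighting function."""
            return np.exp(-(-np.log(prob)) ** gamma)

        def subjective_value(magnitude, prob, rho, gamma):
            """Prospect-theory style value: weighted probability times power utility."""
            return weight(prob, gamma) * magnitude ** rho

        def p_choose_risky(risky, safe_magnitude, rho, gamma, beta):
            """Softmax (logit) choice rule; beta is the inverse temperature."""
            v_risky = subjective_value(*risky, rho, gamma)
            v_safe = safe_magnitude ** rho          # a sure outcome needs no weighting
            return 1.0 / (1.0 + np.exp(-beta * (v_risky - v_safe)))

        # Hypothetical choice: 50% chance of 40 points vs a sure 15 points
        risky, safe = (40.0, 0.5), 15.0
        for label, gamma in [("strong probability distortion  ", 0.50),
                             ("near-linear probability weights", 0.95)]:
            print(label, round(p_choose_risky(risky, safe, rho=0.8, gamma=gamma, beta=0.5), 3))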

  5. A novel one-class SVM based negative data sampling method for reconstructing proteome-wide HTLV-human protein interaction networks.

    PubMed

    Mei, Suyu; Zhu, Hao

    2015-01-26

    Protein-protein interaction (PPI) prediction is generally treated as a problem of binary classification wherein negative data sampling is still an open problem to be addressed. The commonly used random sampling is prone to yield less representative negative data with considerable false negatives. Meanwhile rational constraints are seldom exerted on model selection to reduce the risk of false positive predictions for most of the existing computational methods. In this work, we propose a novel negative data sampling method based on one-class SVM (support vector machine, SVM) to predict proteome-wide protein interactions between HTLV retrovirus and Homo sapiens, wherein one-class SVM is used to choose reliable and representative negative data, and two-class SVM is used to yield proteome-wide outcomes as predictive feedback for rational model selection. Computational results suggest that one-class SVM is more suited to be used as negative data sampling method than two-class PPI predictor, and the predictive feedback constrained model selection helps to yield a rational predictive model that reduces the risk of false positive predictions. Some predictions have been validated by the recent literature. Lastly, gene ontology based clustering of the predicted PPI networks is conducted to provide valuable cues for the pathogenesis of HTLV retrovirus.
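
    The two-step idea can be sketched with scikit-learn, under the assumption that protein pairs have already been encoded as fixed-length feature vectors (the feature construction, thresholds and data below are placeholders, not the paper's pipeline): a one-class SVM is fitted on the known positive pairs, unlabeled pairs that score as clearly foreign to that positive region are kept as reliable negatives, and a conventional two-class SVM is then trained on the positives plus the sampled negatives.

        import numpy as np
        from sklearn.svm import OneClassSVM, SVC

        rng = np.random.default_rng(2)

        # Hypothetical feature vectors for protein pairs
        X_pos = rng.normal(loc=1.0, scale=1.0, size=(300, 20))        # known interacting pairs
        X_unlabeled = rng.normal(loc=0.0, scale=1.5, size=(3000, 20))

        # Step 1: a one-class SVM learns the region occupied by positives...
        occ = OneClassSVM(kernel="rbf", gamma="scale", nu=0.1).fit(X_pos)

        # ...and unlabeled pairs far from that region are taken as reliable negatives
        scores = occ.decision_function(X_unlabeled)
        X_neg = X_unlabeled[scores < np.quantile(scores, 0.25)]       # most "non-positive" quartile

        # Step 2: train the usual two-class predictor on positives + sampled negatives
        X = np.vstack([X_pos, X_neg])
        y = np.hstack([np.ones(len(X_pos)), np.zeros(len(X_neg))])
        clf = SVC(kernel="rbf", probability=True).fit(X, y)
        print("training accuracy:", round(clf.score(X, y), 3))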

  6. Measures of Residual Risk with Connections to Regression, Risk Tracking, Surrogate Models, and Ambiguity

    DTIC Science & Technology

    2015-05-20

    original variable. Residual risk can be exemplified as a quantification of the improved situation faced by a hedging investor compared to that of a...distributional information about Yx for every x as well as the computational cost of evaluating R(Yx) for numerous x, for example within an optimization...Still, when g is costly to evaluate, it might be desirable to develop an approximation of R(Yx), x ∈ IRn, through regression based on observations {xj

  7. An extended reinforcement learning model of basal ganglia to understand the contributions of serotonin and dopamine in risk-based decision making, reward prediction, and punishment learning

    PubMed Central

    Balasubramani, Pragathi P.; Chakravarthy, V. Srinivasa; Ravindran, Balaraman; Moustafa, Ahmed A.

    2014-01-01

    Although empirical and neural studies show that serotonin (5HT) plays many functional roles in the brain, prior computational models mostly focus on its role in behavioral inhibition. In this study, we present a model of risk-based decision making in a modified Reinforcement Learning (RL) framework. The model depicts the roles of dopamine (DA) and serotonin (5HT) in the Basal Ganglia (BG). In this model, the DA signal is represented by the temporal difference error (δ), while the 5HT signal is represented by a parameter (α) that controls risk prediction error. This formulation, which accommodates both 5HT and DA, reconciles some of the diverse roles of 5HT, particularly in connection with the BG system. We apply the model to different experimental paradigms used to study the role of 5HT: (1) risk-sensitive decision making, where 5HT controls risk assessment; (2) temporal reward prediction, where 5HT controls the time-scale of reward prediction; and (3) reward/punishment sensitivity, in which the punishment prediction error depends on 5HT levels. Thus the proposed integrated RL model reconciles several existing theories of 5HT and DA in the BG. PMID:24795614
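
    The division of labour between the two signals can be sketched in a few lines of Python: a reward prediction error (the DA-like δ) updates expected value, a squared-error term updates a variance-like risk estimate, and a parameter standing in for 5HT (α) scales how strongly that risk estimate discounts the utility used for action selection. The bandit task, update rules and parameter values below are a generic risk-sensitive RL construction chosen for illustration, not the authors' full basal ganglia network model.

        import numpy as np

        rng = np.random.default_rng(3)

        def risk_sensitive_bandit(p_reward, magnitude, alpha, episodes=2000, lr=0.1, beta=2.0):
            """Two-armed bandit where action utility = value - alpha * sqrt(risk)."""
            q = np.zeros(2)      # expected reward per arm (updated by delta, the DA-like signal)
            h = np.zeros(2)      # risk (variance) estimate per arm
            for _ in range(episodes):
                util = q - alpha * np.sqrt(h)                 # alpha plays the 5HT-like role
                probs = np.exp(beta * util) / np.exp(beta * util).sum()
                a = rng.choice(2, p=probs)
                r = magnitude[a] * (rng.random() < p_reward[a])
                delta = r - q[a]                              # reward prediction error
                q[a] += lr * delta
                h[a] += lr * (delta ** 2 - h[a])              # risk prediction error
            return q, h

        # Arm 0: safe small reward; arm 1: risky large reward with equal expected value
        q, h = risk_sensitive_bandit(p_reward=[1.0, 0.25], magnitude=[1.0, 4.0], alpha=0.5)
        print("value estimates:", np.round(q, 2), " risk estimates:", np.round(h, 2))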

  8. Assessment of two mammographic density related features in predicting near-term breast cancer risk

    NASA Astrophysics Data System (ADS)

    Zheng, Bin; Sumkin, Jules H.; Zuley, Margarita L.; Wang, Xingwei; Klym, Amy H.; Gur, David

    2012-02-01

    In order to establish a personalized breast cancer screening program, it is important to develop risk models that have high discriminatory power in predicting the likelihood of a woman developing an imaging detectable breast cancer in near-term (e.g., <3 years after a negative examination in question). In epidemiology-based breast cancer risk models, mammographic density is considered the second highest breast cancer risk factor (second to woman's age). In this study we explored a new feature, namely bilateral mammographic density asymmetry, and investigated the feasibility of predicting near-term screening outcome. The database consisted of 343 negative examinations, of which 187 depicted cancers that were detected during the subsequent screening examination and 155 that remained negative. We computed the average pixel value of the segmented breast areas depicted on each cranio-caudal view of the initial negative examinations. We then computed the mean and difference mammographic density for paired bilateral images. Using woman's age, subjectively rated density (BIRADS), and computed mammographic density related features we compared classification performance in estimating the likelihood of detecting cancer during the subsequent examination using areas under the ROC curves (AUC). The AUCs were 0.63+/-0.03, 0.54+/-0.04, 0.57+/-0.03, 0.68+/-0.03 when using woman's age, BIRADS rating, computed mean density and difference in computed bilateral mammographic density, respectively. Performance increased to 0.62+/-0.03 and 0.72+/-0.03 when we fused mean and difference in density with woman's age. The results suggest that, in this study, bilateral mammographic tissue density is a significantly stronger (p<0.01) risk indicator than both woman's age and mean breast density.
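
    The two computed features reduce to a few lines once the breast area has been segmented on each cranio-caudal view: the average pixel value inside each mask, their bilateral mean, and their absolute difference (the asymmetry feature). The Python sketch below uses random arrays and thresholds as stand-ins for real mammograms and segmentation masks.

        import numpy as np

        def density_features(left_cc, right_cc, left_mask, right_mask):
            """Average pixel value inside each segmented breast area, plus the
            bilateral mean and absolute difference used as risk features."""
            left_density = left_cc[left_mask].mean()
            right_density = right_cc[right_mask].mean()
            return {"mean_density": 0.5 * (left_density + right_density),
                    "density_asymmetry": abs(left_density - right_density)}

        # Hypothetical images and masks (real input would be segmented mammogram arrays)
        rng = np.random.default_rng(4)
        left, right = rng.integers(0, 4096, (2, 256, 256))
        print(density_features(left, right, left > 500, right > 700))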

  9. How Adverse Outcome Pathways Can Aid the Development and Use of Computational Prediction Models for Regulatory Toxicology.

    PubMed

    Wittwehr, Clemens; Aladjov, Hristo; Ankley, Gerald; Byrne, Hugh J; de Knecht, Joop; Heinzle, Elmar; Klambauer, Günter; Landesmann, Brigitte; Luijten, Mirjam; MacKay, Cameron; Maxwell, Gavin; Meek, M E Bette; Paini, Alicia; Perkins, Edward; Sobanski, Tomasz; Villeneuve, Dan; Waters, Katrina M; Whelan, Maurice

    2017-02-01

    Efforts are underway to transform regulatory toxicology and chemical safety assessment from a largely empirical science based on direct observation of apical toxicity outcomes in whole organism toxicity tests to a predictive one in which outcomes and risk are inferred from accumulated mechanistic understanding. The adverse outcome pathway (AOP) framework provides a systematic approach for organizing knowledge that may support such inference. Likewise, computational models of biological systems at various scales provide another means and platform to integrate current biological understanding to facilitate inference and extrapolation. We argue that the systematic organization of knowledge into AOP frameworks can inform and help direct the design and development of computational prediction models that can further enhance the utility of mechanistic and in silico data for chemical safety assessment. This concept was explored as part of a workshop on AOP-Informed Predictive Modeling Approaches for Regulatory Toxicology held September 24-25, 2015. Examples of AOP-informed model development and its application to the assessment of chemicals for skin sensitization and multiple modes of endocrine disruption are provided. The role of problem formulation, not only as a critical phase of risk assessment, but also as guide for both AOP and complementary model development is described. Finally, a proposal for actively engaging the modeling community in AOP-informed computational model development is made. The contents serve as a vision for how AOPs can be leveraged to facilitate development of computational prediction models needed to support the next generation of chemical safety assessment. © The Author 2016. Published by Oxford University Press on behalf of the Society of Toxicology.

  10. Delivering The Benefits of Chemical-Biological Integration in ...

    EPA Pesticide Factsheets

    Abstract: Researchers at the EPA’s National Center for Computational Toxicology integrate advances in biology, chemistry, and computer science to examine the toxicity of chemicals and help prioritize chemicals for further research based on potential human health risks. The intention of this research program is to quickly evaluate thousands of chemicals for potential risk, but at much reduced cost relative to historical approaches. This work involves computational and data-driven approaches including high-throughput screening, modeling, text-mining and the integration of chemistry, exposure and biological data. We have developed a number of databases and applications that are delivering on the vision of developing a deeper understanding of chemicals and their effects on exposure and biological processes, supporting a large community of scientists in their research efforts. This presentation will provide an overview of our work to bring together diverse large-scale data from the chemical and biological domains, our approaches to integrating and disseminating these data, and the delivery of models supporting computational toxicology. This abstract does not reflect U.S. EPA policy. Presentation at ACS TOXI session on Computational Chemistry and Toxicology in Chemical Discovery and Assessment (QSARs).

  11. The uncertainty cascade in flood risk assessment under changing climatic conditions - the Biala Tarnowska case study

    NASA Astrophysics Data System (ADS)

    Doroszkiewicz, Joanna; Romanowicz, Renata

    2016-04-01

    Uncertainty in the results of a hydraulic model is not associated solely with the limitations of that model and the shortcomings of the data. An important factor with a major impact on the uncertainty of flood risk assessment under changing climate conditions is the uncertainty of future climate scenarios (IPCC WG I, 2013). Future climate projections provided by global climate models are used to generate the future runoff required as input to the hydraulic models applied in the derivation of flood risk maps. The Biala Tarnowska catchment, situated in southern Poland, is used as a case study. Future discharges at the input to the hydraulic model are obtained using the HBV model and climate projections obtained from the EUROCORDEX project. The study describes a cascade of uncertainty related to the different stages of the process of deriving flood risk maps under changing climate conditions. In this context it takes into account the uncertainty of future climate projections, the uncertainty of the flow routing model, the propagation of that uncertainty through the hydraulic model and, finally, the uncertainty related to the derivation of flood risk maps. One of the aims of this study is an assessment of the relative impact of different sources of uncertainty on the uncertainty of flood risk maps. Due to the complexity of the process, an assessment of the total uncertainty of maps of inundation probability may be computationally very time-consuming. As a way forward, we present an application of a hydraulic model simulator based on a nonlinear transfer function model for chosen locations along the river reach. The transfer function model parameters are estimated based on simulations of the hydraulic model at each model cross-section. The study shows that the application of the simulator substantially reduces the computational requirements related to the derivation of flood risk maps under future climatic conditions. Acknowledgements: This work was supported by the project CHIHE (Climate Change Impact on Hydrological Extremes), carried out in the Institute of Geophysics, Polish Academy of Sciences, funded by Norway Grants (contract No. Pol-Nor/196243/80/2013). The hydro-meteorological observations were provided by the Institute of Meteorology and Water Management (IMGW), Poland.

  12. Numerical simulation of gender differences in a long-term microgravity exposure

    NASA Astrophysics Data System (ADS)

    Perez-Poch, Antoni

    The objective of this work is to analyse and simulate gender differences when individuals are exposed to long-term microgravity. The risk probability of a health impairment which may put a long-term mission in jeopardy is also evaluated. Computer simulations are becoming a promising line of research, as physiological models become more sophisticated and reliable. Technological advances in state-of-the-art hardware and software nowadays allow better and more accurate simulations of complex phenomena, such as the response of the human cardiovascular system to long-term exposure to microgravity. Experimental data for long-term missions are difficult to obtain and reproduce; therefore, the predictions of computer simulations are of major importance in this field. Our approach is based on a previous model developed and implemented in our laboratory (NELME: Numerical Evaluation of Long-term Microgravity Effects). The software simulates the behaviour of the cardiovascular system and different human organs, has a modular architecture, and allows the introduction of perturbations such as physical exercise or countermeasures. The implementation is based on a complex electrical-like model of this control system, uses inexpensive software development frameworks, and has been tested and validated with the available experimental data. Gender differences have been implemented for this specific work as an adjustment of a number of parameters included in the model. Physiological differences between women and men have therefore been taken into account, based upon estimates from the physiology literature. A number of simulations have been carried out for long-term exposure to microgravity, with gravity varying from Earth-based to zero and exposure time as the two main variables involved in the construction of results, including responses to patterns of aerobic physical exercise and to thermal stress simulating an extra-vehicular activity. Results show that significant differences appear between men's and women's physiological responses after long-term exposure (more than three months) to microgravity. Risk evaluations for each gender, and specific risk thresholds, are provided. Initial results are compatible with the existing data and provide unique information regarding different patterns of microgravity exposure. We conclude that computer-based models such as NELME are a promising line of work to predict health risks in long-term missions. More experimental work is needed to adjust some parameters of the model. This work may be seen as another contribution to a better understanding of the underlying processes involved in the adaptation of both women and men to long-term microgravity.

  13. Computational fluid dynamics modelling in cardiovascular medicine.

    PubMed

    Morris, Paul D; Narracott, Andrew; von Tengg-Kobligk, Hendrik; Silva Soto, Daniel Alejandro; Hsiao, Sarah; Lungu, Angela; Evans, Paul; Bressloff, Neil W; Lawford, Patricia V; Hose, D Rodney; Gunn, Julian P

    2016-01-01

    This paper reviews the methods, benefits and challenges associated with the adoption and translation of computational fluid dynamics (CFD) modelling within cardiovascular medicine. CFD, a specialist area of mathematics and a branch of fluid mechanics, is used routinely in a diverse range of safety-critical engineering systems, which increasingly is being applied to the cardiovascular system. By facilitating rapid, economical, low-risk prototyping, CFD modelling has already revolutionised research and development of devices such as stents, valve prostheses, and ventricular assist devices. Combined with cardiovascular imaging, CFD simulation enables detailed characterisation of complex physiological pressure and flow fields and the computation of metrics which cannot be directly measured, for example, wall shear stress. CFD models are now being translated into clinical tools for physicians to use across the spectrum of coronary, valvular, congenital, myocardial and peripheral vascular diseases. CFD modelling is apposite for minimally-invasive patient assessment. Patient-specific (incorporating data unique to the individual) and multi-scale (combining models of different length- and time-scales) modelling enables individualised risk prediction and virtual treatment planning. This represents a significant departure from traditional dependence upon registry-based, population-averaged data. Model integration is progressively moving towards 'digital patient' or 'virtual physiological human' representations. When combined with population-scale numerical models, these models have the potential to reduce the cost, time and risk associated with clinical trials. The adoption of CFD modelling signals a new era in cardiovascular medicine. While potentially highly beneficial, a number of academic and commercial groups are addressing the associated methodological, regulatory, education- and service-related challenges. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/

  14. How adverse outcome pathways can aid the development and ...

    EPA Pesticide Factsheets

    Efforts are underway to transform regulatory toxicology and chemical safety assessment from a largely empirical science based on direct observation of apical toxicity outcomes in whole organism toxicity tests to a predictive one in which outcomes and risk are inferred from accumulated mechanistic understanding. The adverse outcome pathway (AOP) framework has emerged as a systematic approach for organizing knowledge that supports such inference. We argue that this systematic organization of knowledge can inform and help direct the design and development of computational prediction models that can further enhance the utility of mechanistic and in silico data for chemical safety assessment. Examples of AOP-informed model development and its application to the assessment of chemicals for skin sensitization and multiple modes of endocrine disruption are provided. The role of problem formulation, not only as a critical phase of risk assessment, but also as a guide for both AOP and complementary model development, is described. Finally, a proposal for actively engaging the modeling community in AOP-informed computational model development is made. The contents serve as a vision for how AOPs can be leveraged to facilitate development of computational prediction models needed to support the next generation of chemical safety assessment. The present manuscript reports on expert opinion and case studies that came out of a European Commission, Joint Research Centre-sponsored work

  15. Two-stage stochastic unit commitment model including non-generation resources with conditional value-at-risk constraints

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huang, Yuping; Zheng, Qipeng P.; Wang, Jianhui

    2014-11-01

    This paper presents a two-stage stochastic unit commitment (UC) model, which integrates non-generation resources such as demand response (DR) and energy storage (ES) while including risk constraints to balance between cost and system reliability due to the fluctuation of variable generation such as wind and solar power. This paper uses conditional value-at-risk (CVaR) measures to model risks associated with the decisions in a stochastic environment. In contrast to chance-constrained models requiring extra binary variables, risk constraints based on CVaR only involve linear constraints and continuous variables, making it more computationally attractive. The proposed models with risk constraints are able to avoid over-conservative solutions but still ensure system reliability represented by loss of loads. Then numerical experiments are conducted to study the effects of non-generation resources on generator schedules and the difference of total expected generation costs with risk consideration. Sensitivity analysis based on reliability parameters is also performed to test the decision preferences of confidence levels and load-shedding loss allowances on generation cost reduction.
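
    The linearity claim rests on the standard scenario-based (Rockafellar-Uryasev) construction of CVaR, sketched below in generic LaTeX notation since the abstract does not spell out the exact formulation used; L_s is the loss in scenario s, α the confidence level, and η and z_s are continuous auxiliary variables:

        \mathrm{CVaR}_{\alpha}(L)
          = \min_{\eta}\; \eta + \tfrac{1}{1-\alpha}\,\mathbb{E}\!\left[(L-\eta)^{+}\right]
          \;\approx\; \min_{\eta,\,z_s}\; \eta + \tfrac{1}{(1-\alpha)S}\sum_{s=1}^{S} z_s
          \quad \text{s.t.}\quad z_s \ge L_s - \eta,\; z_s \ge 0.

    A risk constraint of the form CVaR_α(L) ≤ ε therefore adds only continuous variables and linear inequalities to the two-stage model, with no extra binaries.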

  16. Parallelization of the Coupled Earthquake Model

    NASA Technical Reports Server (NTRS)

    Block, Gary; Li, P. Peggy; Song, Yuhe T.

    2007-01-01

    This Web-based tsunami simulation system allows users to remotely run a model on JPL's supercomputers for a given undersea earthquake. At the time of this reporting, predicting tsunamis over the Internet had never been done before. This new code directly couples the earthquake model and the ocean model on parallel computers and improves simulation speed. Seismometers can only detect information from earthquakes; they cannot detect whether or not a tsunami may occur as a result of the earthquake. When earthquake-tsunami models are coupled with the improved computational speed of modern, high-performance computers and constrained by remotely sensed data, they are able to provide early warnings for those coastal regions at risk. The software is capable of testing NASA's satellite observations of tsunamis. It has been successfully tested for several historical tsunamis, has passed all alpha and beta testing, and is well documented for users.

  17. Using Computational Toxicology to Enable Risk-Based ...

    EPA Pesticide Factsheets

    Slide presentation at Drug Safety Gordon Research Conference 2016 on research efforts in NCCT to enable Computational Toxicology to support risk assessment.

  18. Physics-Based Hazard Assessment for Critical Structures Near Large Earthquake Sources

    NASA Astrophysics Data System (ADS)

    Hutchings, L.; Mert, A.; Fahjan, Y.; Novikova, T.; Golara, A.; Miah, M.; Fergany, E.; Foxall, W.

    2017-09-01

    We argue that for critical structures near large earthquake sources: (1) the ergodic assumption, recent history, and simplified descriptions of the hazard are not appropriate to rely on for earthquake ground motion prediction and can lead to a mis-estimation of the hazard and risk to structures; (2) a physics-based approach can address these issues; (3) a physics-based source model must be provided to generate realistic phasing effects from finite rupture and model near-source ground motion correctly; (4) wave propagation and site response should be site specific; (5) a much wider search of possible sources of ground motion can be achieved computationally with a physics-based approach; (6) unless one utilizes a physics-based approach, the hazard and risk to structures has unknown uncertainties; (7) uncertainties can be reduced with a physics-based approach, but not with an ergodic approach; (8) computational power and computer codes have advanced to the point that risk to structures can be calculated directly from source and site-specific ground motions. Spanning the variability of potential ground motion in a predictive situation is especially difficult for near-source areas, but that is the distance at which the hazard is the greatest. The basis of a physics-based approach is ground-motion synthesis derived from physics and an understanding of the earthquake process. This is an overview paper and results from previous studies are used to make the case for these conclusions. Our premise is that 50 years of strong motion records is insufficient to capture all possible ranges of site and propagation path conditions, rupture processes, and spatial geometric relationships between source and site. Predicting future earthquake scenarios is necessary; models that have little or no physical basis but have been tested and adjusted to fit available observations can only "predict" what happened in the past, which should be considered description as opposed to prediction. We have developed a methodology for synthesizing physics-based broadband ground motion that incorporates the effects of realistic earthquake rupture along specific faults and the actual geology between the source and site.

  19. Maximization of the usage of coronary CTA derived plaque information using a machine learning based algorithm to improve risk stratification; insights from the CONFIRM registry.

    PubMed

    van Rosendael, Alexander R; Maliakal, Gabriel; Kolli, Kranthi K; Beecy, Ashley; Al'Aref, Subhi J; Dwivedi, Aeshita; Singh, Gurpreet; Panday, Mohit; Kumar, Amit; Ma, Xiaoyue; Achenbach, Stephan; Al-Mallah, Mouaz H; Andreini, Daniele; Bax, Jeroen J; Berman, Daniel S; Budoff, Matthew J; Cademartiri, Filippo; Callister, Tracy Q; Chang, Hyuk-Jae; Chinnaiyan, Kavitha; Chow, Benjamin J W; Cury, Ricardo C; DeLago, Augustin; Feuchtner, Gudrun; Hadamitzky, Martin; Hausleiter, Joerg; Kaufmann, Philipp A; Kim, Yong-Jin; Leipsic, Jonathon A; Maffei, Erica; Marques, Hugo; Pontone, Gianluca; Raff, Gilbert L; Rubinshtein, Ronen; Shaw, Leslee J; Villines, Todd C; Gransar, Heidi; Lu, Yao; Jones, Erica C; Peña, Jessica M; Lin, Fay Y; Min, James K

    Machine learning (ML) is a field of computer science that has been demonstrated to effectively integrate clinical and imaging data for the creation of prognostic scores. The current study investigated whether a ML score, incorporating only the 16-segment coronary tree information derived from coronary computed tomography angiography (CCTA), provides enhanced risk stratification compared with current CCTA-based risk scores. From the multi-center CONFIRM registry, patients were included with complete CCTA risk score information and ≥3 year follow-up for myocardial infarction and death (primary endpoint). Patients with prior coronary artery disease were excluded. Conventional CCTA risk scores (conventional CCTA approach, segment involvement score, Duke prognostic index, segment stenosis score, and the Leaman risk score) and a score created using ML were compared for the area under the receiver operating characteristic curve (AUC). Only 16-segment-based coronary stenosis (0%, 1-24%, 25-49%, 50-69%, 70-99% and 100%) and composition (calcified, mixed and non-calcified plaque) were provided to the ML model. A boosted ensemble algorithm (extreme gradient boosting; XGBoost) was used and the entire dataset was randomly split into a training set (80%) and a testing set (20%). First, tuned hyperparameters were used to generate a trained model from the training data set (80% of data). Second, the performance of this trained model was independently tested on the unseen test set (20% of data). In total, 8844 patients (mean age 58.0 ± 11.5 years, 57.7% male) were included. During a mean follow-up time of 4.6 ± 1.5 years, 609 events occurred (6.9%). No CAD was observed in 48.7% (3.5% event rate), non-obstructive CAD in 31.8% (6.8% event rate), and obstructive CAD in 19.5% (15.6% event rate). Discrimination of events as expressed by AUC was significantly better for the ML-based approach (0.771) vs the other scores (ranging from 0.685 to 0.701), P < 0.001. Net reclassification improvement analysis showed that the improved risk stratification was the result of down-classification of risk among patients that did not experience events (non-events). A risk score created by a ML-based algorithm, which utilizes standard 16-segment stenosis and composition information derived from detailed CCTA reading, has greater prognostic accuracy than current CCTA-integrated risk scores. These findings indicate that a ML-based algorithm can improve the integration of CCTA-derived plaque information to improve risk stratification. Published by Elsevier Inc.
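
    The modelling workflow described (per-segment stenosis and composition codes as features, an 80/20 split, boosted trees, held-out AUC) can be sketched with scikit-learn's gradient boosting standing in for XGBoost; the synthetic patients, outcome generator and hyperparameters below are placeholders, not the CONFIRM data or the tuned model.

        import numpy as np
        from sklearn.ensemble import GradientBoostingClassifier
        from sklearn.metrics import roc_auc_score
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(5)

        # Synthetic stand-in: per patient, 16 segments x (stenosis grade 0-5, composition 0-3)
        n = 5000
        stenosis = rng.integers(0, 6, (n, 16))
        composition = rng.integers(0, 4, (n, 16))
        X = np.hstack([stenosis, composition])

        # Synthetic outcome loosely tied to total stenosis burden (placeholder for death/MI)
        y = (rng.random(n) < 0.03 + 0.15 * stenosis.sum(axis=1) / 80.0).astype(int)

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
        model = GradientBoostingClassifier(n_estimators=300, max_depth=3, learning_rate=0.05)
        model.fit(X_tr, y_tr)
        print("held-out AUC:", round(roc_auc_score(y_te, model.predict_proba(X_te)[:, 1]), 3))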

  20. Statistical surrogate models for prediction of high-consequence climate change.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Constantine, Paul; Field, Richard V., Jr.; Boslough, Mark Bruce Elrick

    2011-09-01

    In safety engineering, performance metrics are defined using probabilistic risk assessments focused on the low-probability, high-consequence tail of the distribution of possible events, as opposed to best estimates based on central tendencies. We frame the climate change problem and its associated risks in a similar manner. To properly explore the tails of the distribution requires extensive sampling, which is not possible with existing coupled atmospheric models due to the high computational cost of each simulation. We therefore propose the use of specialized statistical surrogate models (SSMs) for the purpose of exploring the probability law of various climate variables of interest. A SSM is different than a deterministic surrogate model in that it represents each climate variable of interest as a space/time random field. The SSM can be calibrated to available spatial and temporal data from existing climate databases, e.g., the Program for Climate Model Diagnosis and Intercomparison (PCMDI), or to a collection of outputs from a General Circulation Model (GCM), e.g., the Community Earth System Model (CESM) and its predecessors. Because of its reduced size and complexity, the realization of a large number of independent model outputs from a SSM becomes computationally straightforward, so that quantifying the risk associated with low-probability, high-consequence climate events becomes feasible. A Bayesian framework is developed to provide quantitative measures of confidence, via Bayesian credible intervals, in the use of the proposed approach to assess these risks.

  1. Drug development costs when financial risk is measured using the Fama-French three-factor model.

    PubMed

    Vernon, John A; Golec, Joseph H; Dimasi, Joseph A

    2010-08-01

    In a widely cited article, DiMasi, Hansen, and Grabowski (2003) estimate the average pre-tax cost of bringing a new molecular entity to market. Their base case estimate, excluding post-marketing studies, was $802 million (in $US 2000). Strikingly, almost half of this cost (or $399 million) is the cost of capital (COC) used to fund clinical development expenses to the point of FDA marketing approval. The authors used an 11% real COC computed using the capital asset pricing model (CAPM). But the CAPM is a single-factor risk model, and multi-factor risk models are the current state of the art in finance. Using the Fama-French three-factor model, we find the cost of drug development to be higher than the earlier estimate. Copyright (c) 2009 John Wiley & Sons, Ltd.
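
    For reference, the two cost-of-capital models being contrasted take their standard textbook forms (generic notation; the article's estimated loadings and factor premia are not reproduced here):

        \text{CAPM:}\qquad E[R_i] - R_f = \beta_i\,\big(E[R_m] - R_f\big)

        \text{Fama-French:}\qquad E[R_i] - R_f = \beta_i\,\big(E[R_m] - R_f\big) + s_i\,E[\mathrm{SMB}] + h_i\,E[\mathrm{HML}]

    where SMB and HML are the size and value factor premia. Positive loadings on these additional factors raise the estimated cost of capital, which is how capitalizing the same out-of-pocket expenses can yield a higher total development cost.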

  2. Use of High-Throughput Testing and Approaches for Evaluating Chemical Risk-Relevance to Humans

    EPA Science Inventory

    ToxCast is profiling the bioactivity of thousands of chemicals based on high-throughput screening (HTS) and computational models that integrate knowledge of biological systems and in vivo toxicities. Many of these assays probe signaling pathways and cellular processes critical to...

  3. THE USE OF COMPUTER MODELING PACKAGES TO ILLUSTRATE UNCERTAINTY IN RISK ASSESSMENTS: AN EASE OF USE AND INTERPRETATION COMPARISON

    EPA Science Inventory

    Consistent improvements in processor speed and computer access have substantially increased the use of computer modeling by experts and non-experts alike. Several new computer modeling packages operating under graphical operating systems (i.e. Microsoft Windows or Macintosh) m...

  4. An integrated simulation and optimization approach for managing human health risks of atmospheric pollutants by coal-fired power plants.

    PubMed

    Dai, C; Cai, X H; Cai, Y P; Guo, H C; Sun, W; Tan, Q; Huang, G H

    2014-06-01

    This research developed a simulation-aided nonlinear programming model (SNPM). This model incorporated pollutant dispersion modeling, the management of coal blending, and the related human health risks within a general modeling framework. In SNPM, the simulation effort (i.e., California puff [CALPUFF]) was used to forecast the fate of air pollutants for quantifying the health risk under various conditions, while the optimization studies were used to identify the optimal coal blending strategies from a number of alternatives. To solve the model, a surrogate-based indirect search approach was proposed, where support vector regression (SVR) was used to create a set of easy-to-use and rapid-response surrogates for identifying the functional relationships between coal-blending operating conditions and health risks. By replacing CALPUFF and the corresponding hazard quotient equation with the surrogates, computational efficiency could be improved. The developed SNPM was applied to minimize the human health risk associated with air pollutants discharged from the Gaojing and Shijingshan power plants in the west of Beijing. Solution results indicated that it could be used for reducing the health risk to the public in the vicinity of the two power plants, identifying desired coal blending strategies for decision makers, and striking a proper balance between coal purchase cost and human health risk. In summary, a simulation-aided nonlinear programming model (SNPM) is developed that integrates the advantages of CALPUFF and a nonlinear programming model. To solve the model, a surrogate-based indirect search approach based on the combination of support vector regression and a genetic algorithm is proposed. SNPM is applied to reduce the health risk caused by air pollutants discharged from the Gaojing and Shijingshan power plants in the west of Beijing. Solution results indicate that it is useful for generating coal blending schemes, reducing the health risk to the public, and reflecting the trade-off between coal purchase cost and health risk.
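
    The surrogate step can be sketched as follows: run the expensive dispersion-plus-health-risk simulation at a limited design of coal-blending settings, fit an SVR to the resulting (settings, risk) pairs, and let the cheap surrogate answer the many evaluations the optimization search needs. In the Python sketch below the "simulator" is a toy analytic placeholder rather than CALPUFF, the search is a crude random search standing in for the genetic algorithm, and all numbers are hypothetical.

        import numpy as np
        from sklearn.svm import SVR

        rng = np.random.default_rng(6)

        def expensive_risk_simulator(blend):
            """Placeholder for CALPUFF + hazard-quotient post-processing:
            maps a coal-blend setting vector to a scalar health-risk index."""
            return 1.0 + 2.0 * blend[0] ** 2 + 0.5 * np.sin(4.0 * blend[1]) + 0.3 * blend[2]

        # A limited design of expensive runs
        X_design = rng.random((60, 3))
        y_design = np.array([expensive_risk_simulator(x) for x in X_design])

        # Cheap surrogate trained on the design runs
        surrogate = SVR(kernel="rbf", C=10.0, epsilon=0.01).fit(X_design, y_design)

        # Surrogate-based search over many candidate blends
        candidates = rng.random((20000, 3))
        best = candidates[np.argmin(surrogate.predict(candidates))]
        print("best blend found:", np.round(best, 3),
              "| simulator check:", round(expensive_risk_simulator(best), 3))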

  5. ESR/ERS white paper on lung cancer screening

    PubMed Central

    Bonomo, Lorenzo; Gaga, Mina; Nackaerts, Kristiaan; Peled, Nir; Prokop, Mathias; Remy-Jardin, Martine; von Stackelberg, Oyunbileg; Sculier, Jean-Paul

    2015-01-01

    Lung cancer is the most frequently fatal cancer, with poor survival once the disease is advanced. Annual low dose computed tomography has shown a survival benefit in screening individuals at high risk for lung cancer. Based on the available evidence, the European Society of Radiology and the European Respiratory Society recommend lung cancer screening in comprehensive, quality-assured, longitudinal programmes within a clinical trial or in routine clinical practice at certified multidisciplinary medical centres. Minimum requirements include: standardised operating procedures for low dose image acquisition, computer-assisted nodule evaluation, and positive screening results and their management; inclusion/exclusion criteria; expectation management; and smoking cessation programmes. Further refinements are recommended to increase quality, outcome and cost-effectiveness of lung cancer screening: inclusion of risk models, reduction of effective radiation dose, computer-assisted volumetric measurements and assessment of comorbidities (chronic obstructive pulmonary disease and vascular calcification). All these requirements should be adjusted to the regional infrastructure and healthcare system, in order to exactly define eligibility using a risk model, nodule management and quality assurance plan. The establishment of a central registry, including biobank and image bank, and preferably on a European level, is strongly encouraged. PMID:25929956

  6. Development of a GCR Event-based Risk Model

    NASA Technical Reports Server (NTRS)

    Cucinotta, Francis A.; Ponomarev, Artem L.; Plante, Ianik; Carra, Claudio; Kim, Myung-Hee

    2009-01-01

    A goal at NASA is to develop event-based systems biology models of space radiation risks that will replace the current dose-based empirical models. Complex and varied biochemical signaling processes transmit the initial DNA and oxidative damage from space radiation into cellular and tissue responses. Mis-repaired damage or aberrant signals can lead to genomic instability, persistent oxidative stress or inflammation, which are causative of cancer and CNS risks. Protective signaling through adaptive responses or cell repopulation is also possible. We are developing a computational simulation approach to galactic cosmic ray (GCR) effects that is based on biological events rather than average quantities such as dose, fluence, or dose equivalent. The goal of the GCR Event-based Risk Model (GERMcode) is to provide a simulation tool to describe and integrate physical and biological events into stochastic models of space radiation risks. We used the quantum multiple scattering model of heavy ion fragmentation (QMSFRG) and well-known energy loss processes to develop a stochastic Monte-Carlo-based model of GCR transport in spacecraft shielding and tissue. We validated the accuracy of the model by comparing it to physical data from the NASA Space Radiation Laboratory (NSRL). Our simulation approach allows us to time-tag each GCR proton or heavy ion interaction in tissue, including correlated secondary ions, often of high multiplicity. Conventional space radiation risk assessment employs average quantities, and assumes linearity and additivity of responses over the complete range of GCR charge and energies. To investigate possible deviations from these assumptions, we studied several biological response pathway models of varying induction and relaxation times, including the ATM, TGF-Smad, and WNT signaling pathways. We then considered small volumes of interacting cells and the time-dependent biophysical events that the GCR would produce within these tissue volumes to estimate how GCR event rates mapped to biological signaling induction and relaxation times. We considered several hypotheses related to signaling and cancer risk, and then performed simulations for conditions where aberrant or adaptive signaling would occur on long-duration space missions. Our results do not support the conventional assumptions of dose, linearity and additivity. A discussion is given of how event-based systems biology models, which focus on biological signaling as the mechanism that propagates damage or adaptation, can be further developed for cancer and CNS space radiation risk projections.

  7. A simple methodology to produce flood risk maps consistent with FEMA's base flood elevation maps: Implementation and validation over the entire contiguous United States

    NASA Astrophysics Data System (ADS)

    Goteti, G.; Kaheil, Y. H.; Katz, B. G.; Li, S.; Lohmann, D.

    2011-12-01

    In the United States, government agencies as well as the National Flood Insurance Program (NFIP) use flood inundation maps associated with the 100-year return period (base flood elevation, BFE), produced by the Federal Emergency Management Agency (FEMA), as the basis for flood insurance. A credibility check of the flood risk hydraulic models, often employed by insurance companies, is their ability to reasonably reproduce FEMA's BFE maps. We present results from the implementation of a flood modeling methodology aimed towards reproducing FEMA's BFE maps at a very fine spatial resolution using a computationally parsimonious, yet robust, hydraulic model. The hydraulic model used in this study has two components: one for simulating flooding of the river channel and adjacent floodplain, and the other for simulating flooding in the remainder of the catchment. The first component is based on a 1-D wave propagation model, while the second component is based on a 2-D diffusive wave model. The 1-D component captures the flooding from large-scale river transport (including upstream effects), while the 2-D component captures the flooding from local rainfall. The study domain consists of the contiguous United States, hydrologically subdivided into catchments averaging about 500 km2 in area, at a spatial resolution of 30 meters. Using historical daily precipitation data from the Climate Prediction Center (CPC), the precipitation associated with the 100-year return period event was computed for each catchment and was input to the hydraulic model. Flood extent from the FEMA BFE maps is reasonably replicated by the 1-D component of the model (riverine flooding). FEMA's BFE maps only represent the riverine flooding component and are unavailable for many regions of the USA. However, this modeling methodology (1-D and 2-D components together) covers the entire contiguous USA. This study is part of a larger modeling effort from Risk Management Solutions (RMS) to estimate flood risk associated with extreme precipitation events in the USA. Towards this greater objective, state-of-the-art models of flood hazard and stochastic precipitation are being implemented over the contiguous United States. Results from the successful implementation of the modeling methodology will be presented.

  8. Hobbies with solvent exposure and risk of non-Hodgkin lymphoma.

    PubMed

    Colt, Joanne S; Hartge, Patricia; Davis, Scott; Cerhan, James R; Cozen, Wendy; Severson, Richard K

    2007-05-01

    Occupational exposure to solvents has been reported to increase non-Hodgkin lymphoma (NHL) risk in some, but not all, studies. In a population-based case-control study, we examined whether participation in selected hobbies involving solvent exposure increases NHL risk. We identified NHL cases diagnosed at ages 20-74 years between 1998 and 2000 in Iowa or metropolitan Los Angeles, Detroit, and Seattle. Controls were selected using random digit dialing or Medicare files. Computer-assisted personal interviews (551 cases, 462 controls) elicited data on model building, painting/silkscreening/artwork, furniture refinishing, and woodworking/home carpentry. Hobby participation (68% of cases, 69% of controls) was not associated with NHL risk (OR = 0.9, 95% CI = 0.7-1.2). Compared to people with none of the hobbies evaluated, those who built models had significantly lower risk (OR = 0.7, CI = 0.5-1.0), but risk did not vary with the number of years or lifetime hours. Risk estimates for the other hobbies were generally less than one, but the associations were not significant and there were no notable patterns with duration of exposure. Use of oil-based, acrylic, or water-based paints; paint strippers; polyurethane; or varnishes was not associated with NHL risk. We conclude that participation in hobbies involving exposure to organic solvents is unlikely to increase NHL risk.

  9. Road Risk Modeling and Cloud-Aided Safety-Based Route Planning.

    PubMed

    Li, Zhaojian; Kolmanovsky, Ilya; Atkins, Ella; Lu, Jianbo; Filev, Dimitar P; Michelini, John

    2016-11-01

    This paper presents a safety-based route planner that exploits vehicle-to-cloud-to-vehicle (V2C2V) connectivity. Time and road risk index (RRI) are considered as metrics to be balanced based on user preference. To evaluate road segment risk, a road and accident database from the highway safety information system is mined with a hybrid neural network model to predict RRI. Real-time factors such as time of day, day of the week, and weather are included as correction factors to the static RRI prediction. With real-time RRI and expected travel time, route planning is formulated as a multiobjective network flow problem and further reduced to a mixed-integer programming problem. A V2C2V implementation of our safety-based route planning approach is proposed to facilitate access to real-time information and computing resources. A real-world case study, route planning through the city of Columbus, Ohio, is presented. Several scenarios illustrate how the "best" route can be adjusted to favor time versus safety metrics.
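
    The time-versus-risk trade-off can be illustrated with a scalarized shortest-path search: each edge carries a travel time and a road risk index, and the cost is a user-weighted combination of the two. The toy graph, weights and RRI values in the Python sketch below are hypothetical, and the paper's actual formulation is a multiobjective network-flow/mixed-integer program rather than this single-objective Dijkstra.

        import heapq

        def safest_fastest_route(graph, start, goal, w_time=0.5, w_risk=0.5):
            """Dijkstra on a scalarized cost: w_time * travel_time + w_risk * RRI."""
            best, prev, heap = {start: 0.0}, {}, [(0.0, start)]
            while heap:
                cost, node = heapq.heappop(heap)
                if node == goal:
                    break
                if cost > best.get(node, float("inf")):
                    continue
                for nxt, (travel_time, rri) in graph.get(node, {}).items():
                    new_cost = cost + w_time * travel_time + w_risk * rri
                    if new_cost < best.get(nxt, float("inf")):
                        best[nxt], prev[nxt] = new_cost, node
                        heapq.heappush(heap, (new_cost, nxt))
            path, node = [goal], goal
            while node != start:
                node = prev[node]
                path.append(node)
            return list(reversed(path)), best[goal]

        # Hypothetical road graph: edge -> (minutes, road risk index)
        graph = {"A": {"B": (10, 2.0), "C": (15, 0.5)},
                 "B": {"D": (12, 3.0)},
                 "C": {"D": (14, 0.6)}}
        print(safest_fastest_route(graph, "A", "D", w_time=0.3, w_risk=0.7))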

  10. Functional Risk Modeling for Lunar Surface Systems

    NASA Technical Reports Server (NTRS)

    Thomson, Fraser; Mathias, Donovan; Go, Susie; Nejad, Hamed

    2010-01-01

    We introduce an approach to risk modeling that we call functional modeling, which we have developed to estimate the capabilities of a lunar base. The functional model tracks the availability of functions provided by systems, in addition to the operational state of those systems' constituent strings. By tracking functions, we are able to identify cases where identical functions are provided by elements (rovers, habitats, etc.) that are connected together on the lunar surface. We credit functional diversity in those cases, and in doing so compute more realistic estimates of operational mode availabilities. The functional modeling approach yields more realistic estimates of the availability of the various operational modes provided to astronauts by the ensemble of surface elements included in a lunar base architecture. By tracking functional availability, the effects of diverse backup, which often exists when two or more independent elements are connected together, are properly accounted for.
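
    The credit for functional diversity amounts to treating redundant providers of the same function as a parallel system: the function is lost only if every connected element that provides it is down, so its availability exceeds that of any single element. The Python sketch below shows the arithmetic with hypothetical element availabilities; it is not the project's actual model or data.

        from math import prod

        def functional_availability(provider_availabilities):
            """A function is up if at least one connected element providing it is up."""
            return 1.0 - prod(1.0 - a for a in provider_availabilities)

        # Hypothetical lunar-surface elements providing the same "habitable volume" function
        providers = {"habitat": 0.95, "pressurized rover": 0.90}
        print("best single element :", max(providers.values()))
        print("with diversity credit:", round(functional_availability(providers.values()), 4))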

  11. The physical vulnerability of elements at risk: a methodology based on fluid and classical mechanics

    NASA Astrophysics Data System (ADS)

    Mazzorana, B.; Fuchs, S.; Levaggi, L.

    2012-04-01

    The impacts of the flood events that occurred in autumn 2011 in the Italian regions of Liguria and Tuscany revived the engagement of public decision makers to enhance, in synergy, flood control and land use planning. In this context, the design of efficient flood risk mitigation strategies and their subsequent implementation critically relies on a careful vulnerability analysis of both the immobile and mobile elements at risk potentially exposed to flood hazards. Based on notions from fluid and classical mechanics, we developed computation schemes enabling a dynamic vulnerability and risk analysis for a broad typological variety of elements at risk. The methodological skeleton consists of (1) hydrodynamic computation of the time-varying flood intensities resulting for each element at risk in a succession of loading configurations; (2) modelling the mechanical response of the impacted elements through static, elasto-static and dynamic analyses; (3) characterising the mechanical response through suitable structural damage variables; and (4) economic valuation of the expected losses as a function of the quantified damage variables. From a computational perspective we coupled the description of the hydrodynamic flow behaviour and the induced structural modifications of the exposed elements at risk. Valuation methods suitable for supporting a correct mapping from the value domains of the physical damage variables to economic loss values are discussed. In this way we aim to complement, from a methodological perspective, the existing, mainly empirical, vulnerability and risk assessment approaches and to refine the conceptual framework of cost-benefit analysis. Moreover, we aim to support the design of effective flood risk mitigation strategies by diminishing the main criticalities within the systems prone to flood risk.

  12. Modeling and minimizing interference from corneal birefringence in retinal birefringence scanning for foveal fixation detection

    PubMed Central

    Irsch, Kristina; Gramatikov, Boris; Wu, Yi-Kai; Guyton, David

    2011-01-01

    Utilizing the measured corneal birefringence from a data set of 150 eyes of 75 human subjects, an algorithm and related computer program, based on Müller-Stokes matrix calculus, were developed in MATLAB for assessing the influence of corneal birefringence on retinal birefringence scanning (RBS) and for converging upon an optical/mechanical design using wave plates (“wave-plate-enhanced RBS”) that allows foveal fixation detection essentially independently of corneal birefringence. The RBS computer model, and in particular the optimization algorithm, were verified with experimental human data using an available monocular RBS-based eye fixation monitor. Fixation detection using wave-plate-enhanced RBS is adaptable to less cooperative subjects, including young children at risk for developing amblyopia. PMID:21750772
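
    As a hedged illustration of the kind of Müller-Stokes building block such a model composes, the sketch below constructs the standard Mueller matrix of a linear retarder (wave plate) and applies it to a Stokes vector, written in Python/NumPy rather than the authors' MATLAB implementation; the sign convention and example values are assumptions made for the demonstration.

```python
import numpy as np

def rotator(theta):
    """Mueller rotation matrix for rotating an element by theta (radians)."""
    c, s = np.cos(2 * theta), np.sin(2 * theta)
    return np.array([[1, 0, 0, 0],
                     [0, c, s, 0],
                     [0, -s, c, 0],
                     [0, 0, 0, 1]], dtype=float)

def retarder(delta, theta=0.0):
    """Mueller matrix of a linear retarder (wave plate).

    delta : retardance in radians (pi/2 for a quarter-wave plate)
    theta : fast-axis orientation in radians
    """
    cd, sd = np.cos(delta), np.sin(delta)
    m0 = np.array([[1, 0, 0, 0],
                   [0, 1, 0, 0],
                   [0, 0, cd, sd],
                   [0, 0, -sd, cd]], dtype=float)
    return rotator(-theta) @ m0 @ rotator(theta)

# Horizontally polarized light through a quarter-wave plate at 45 degrees
# emerges circularly polarized (|S3| = 1).
s_in = np.array([1.0, 1.0, 0.0, 0.0])
print(retarder(np.pi / 2, np.pi / 4) @ s_in)
```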

  13. Development of optimization-based probabilistic earthquake scenarios for the city of Tehran

    NASA Astrophysics Data System (ADS)

    Zolfaghari, M. R.; Peyghaleh, E.

    2016-01-01

    This paper presents the methodology and a practical example for the application of an optimization process to select earthquake scenarios which best represent probabilistic earthquake hazard in a given region. The method is based on simulation of a large dataset of potential earthquakes, representing the long-term seismotectonic characteristics of a given region. The simulation process uses Monte-Carlo simulation and regional seismogenic source parameters to generate a synthetic earthquake catalogue consisting of a large number of earthquakes, each characterized by magnitude, location, focal depth and fault characteristics. Such a catalogue provides full distributions of events in time, space and size; however, it demands large computation power when used for risk assessment, particularly when other sources of uncertainty are involved in the process. To reduce the number of selected earthquake scenarios, a mixed-integer linear program formulation is developed in this study. This approach results in a reduced set of optimization-based probabilistic earthquake scenarios, while maintaining the shape of the hazard curves and the full probabilistic picture by minimizing the error between hazard curves driven by the full and reduced sets of synthetic earthquake scenarios. To test the model, the regional seismotectonic and seismogenic characteristics of northern Iran are used to simulate a set of 10,000 years' worth of events consisting of some 84,000 earthquakes. The optimization model is then performed multiple times with various input data, taking into account probabilistic seismic hazard for the city of Tehran as the main constraint. The sensitivity of the selected scenarios to the user-specified site/return-period error weights is also assessed. The methodology substantially reduces the run time of full probabilistic earthquake studies such as seismic hazard and risk assessment. The reduced set is representative of the contributions of all possible earthquakes, yet requires far less computation power. The authors have used this approach for risk assessment towards identification of the effectiveness and profitability of risk mitigation measures, using an optimization model for resource allocation. Based on the error-computation trade-off, 62 earthquake scenarios were chosen for this purpose.
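
    The scenario-reduction step can be stated as a small mixed-integer linear program: pick at most K scenarios and weights so that the weighted hazard contributions reproduce the full-catalogue hazard curve with minimal total absolute error. The sketch below is a simplified, hypothetical restatement of that idea on toy data, not the authors' formulation; the contribution matrix `A`, target curve `h`, budget `K`, and big-M constant are all assumptions.

```python
import numpy as np
from scipy.optimize import milp, LinearConstraint, Bounds

def reduce_scenarios(A, h, K, big_m=10.0):
    """Select <= K scenarios (binary z) with weights w so that A @ w
    matches the target hazard curve h with minimal total absolute error.

    Variables are ordered [w (n), z (n), e (m)]; minimize sum(e) with
    -e <= A @ w - h <= e,  w <= big_m * z,  sum(z) <= K.
    """
    m, n = A.shape
    c = np.concatenate([np.zeros(n), np.zeros(n), np.ones(m)])

    upper = np.hstack([A, np.zeros((m, n)), -np.eye(m)])    # A w - e <= h
    lower = np.hstack([-A, np.zeros((m, n)), -np.eye(m)])   # -A w - e <= -h
    link = np.hstack([np.eye(n), -big_m * np.eye(n), np.zeros((n, m))])
    card = np.hstack([np.zeros(n), np.ones(n), np.zeros(m)]).reshape(1, -1)

    constraints = [
        LinearConstraint(upper, -np.inf, h),
        LinearConstraint(lower, -np.inf, -h),
        LinearConstraint(link, -np.inf, np.zeros(n)),
        LinearConstraint(card, -np.inf, K),
    ]
    integrality = np.concatenate([np.zeros(n), np.ones(n), np.zeros(m)])
    bounds = Bounds(np.zeros(2 * n + m),
                    np.concatenate([np.full(n, big_m), np.ones(n),
                                    np.full(m, np.inf)]))
    res = milp(c, constraints=constraints, integrality=integrality,
               bounds=bounds)
    w, z = res.x[:n], res.x[n:2 * n]
    return w, z.round().astype(int), res.fun

# Toy problem: 3 hazard levels, 6 candidate scenarios, keep at most 2.
rng = np.random.default_rng(1)
A = rng.uniform(0, 1, size=(3, 6))
h = A @ rng.dirichlet(np.ones(6))          # target curve from the full set
w, z, err = reduce_scenarios(A, h, K=2)
print(z, err)
```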

  14. Improving performance of breast cancer risk prediction using a new CAD-based region segmentation scheme

    NASA Astrophysics Data System (ADS)

    Heidari, Morteza; Zargari Khuzani, Abolfazl; Danala, Gopichandh; Qiu, Yuchen; Zheng, Bin

    2018-02-01

    The objective of this study is to develop and test a new computer-aided detection (CAD) scheme with improved region of interest (ROI) segmentation combined with an image feature extraction framework to improve performance in predicting short-term breast cancer risk. A dataset involving 570 sets of "prior" negative mammography screening cases was retrospectively assembled. In the next sequential "current" screening, 285 cases were positive and 285 cases remained negative. A CAD scheme was applied to all 570 "prior" negative images to stratify cases into high and low risk groups of having cancer detected in the "current" screening. First, a new ROI segmentation algorithm was used to automatically remove useless areas of the mammograms. Second, from the matched bilateral craniocaudal view images, a set of 43 image features related to frequency characteristics of the ROIs was initially computed from the discrete cosine transform and spatial domain of the images. Third, a support vector machine based machine learning classifier was used to optimally classify the selected optimal image features and build a CAD-based risk prediction model. The classifier was trained using a leave-one-case-out cross-validation method. Applying this improved CAD scheme to the testing dataset yielded an area under the ROC curve of AUC = 0.70+/-0.04, which was significantly higher than extracting features directly from the dataset without the improved ROI segmentation step (AUC = 0.63+/-0.04). This study demonstrated that the proposed approach could improve accuracy in predicting short-term breast cancer risk, which may play an important role in helping eventually establish an optimal personalized breast cancer paradigm.
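
    The pipeline described above (DCT-based frequency features, an SVM classifier, leave-one-case-out cross-validation) can be sketched compactly with scikit-learn. The example below uses synthetic ROIs and a toy 16-coefficient DCT descriptor in place of the authors' 43-feature scheme, so it illustrates only the training/validation pattern, not the reported results.

```python
import numpy as np
from scipy.fftpack import dct
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import LeaveOneOut, cross_val_predict
from sklearn.metrics import roc_auc_score

def dct_features(roi, n_coeffs=16):
    """Low-frequency 2-D DCT coefficients as a simple frequency descriptor."""
    coeffs = dct(dct(roi, axis=0, norm="ortho"), axis=1, norm="ortho")
    return coeffs[:4, :4].ravel()[:n_coeffs]

# Synthetic stand-in for segmented ROIs: 60 "cases", 32x32 pixels each,
# with a weak texture difference between the two groups.
rng = np.random.default_rng(0)
y = np.repeat([0, 1], 30)
rois = [rng.normal(scale=1.0 + 0.3 * label, size=(32, 32)) for label in y]
X = np.array([dct_features(r) for r in rois])

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", probability=True))
scores = cross_val_predict(clf, X, y, cv=LeaveOneOut(),
                           method="predict_proba")[:, 1]
print("Leave-one-out AUC:", roc_auc_score(y, scores))
```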

  15. Pilot of a computer-based brief multiple-health behavior intervention for college students.

    PubMed

    Moore, Michele J; Werch, Chudley E; Bian, Hui

    2012-01-01

    Given the multiple documented health-risk behaviors college students engage in, and the dearth of effective programs addressing them, the authors developed a computer-based brief multiple-health behavior intervention. This study reports the immediate outcomes and feasibility of a pilot of this program. Two hundred students attending a midsized university participated. Participants were randomly assigned to the intervention or control program, both delivered via computer. Immediate feedback was collected with the computer program. Results indicate that the intervention had an early positive impact on alcohol and cigarette use intentions, as well as on related constructs underlying the Behavior-Image Model specific to each of the 3 substances measured. Based on the implementation process, the program proved to be feasible to use and acceptable to the population. Results support the potential efficacy of the intervention to positively impact behavioral intentions and linkages between health-promoting and health-damaging behaviors among college students.

  16. Reduced Order Model Implementation in the Risk-Informed Safety Margin Characterization Toolkit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mandelli, Diego; Smith, Curtis L.; Alfonsi, Andrea

    2015-09-01

    The RISMC project aims to develop new advanced simulation-based tools to perform Probabilistic Risk Analysis (PRA) for the existing fleet of U.S. nuclear power plants (NPPs). These tools numerically model not only the thermo-hydraulic behavior of the reactor primary and secondary systems but also the temporal evolution of external events and component/system ageing. Thus, this is not only a multi-physics problem but also a multi-scale problem (both spatial, µm-mm-m, and temporal, ms-s-minutes-years). As part of the RISMC PRA approach, a large number of computationally expensive simulation runs are required. An important aspect is that even though computational power is regularly growing, the overall computational cost of a RISMC analysis may not be viable for certain cases. A solution being evaluated is the use of reduced order modeling techniques. During FY2015, we investigated and applied reduced order modeling techniques to decrease the RISMC analysis computational cost by decreasing the number of simulation runs to perform and by employing surrogate models instead of the actual simulation codes. This report focuses on reduced order modeling techniques that can be applied to any RISMC analysis to generate, analyze and visualize data. In particular, we focus on surrogate models that approximate the simulation results but in a much shorter time (µs instead of hours/days). We apply reduced order and surrogate modeling techniques to several RISMC types of analyses using RAVEN and RELAP-7 and show the advantages that can be gained.
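
    A generic illustration of the surrogate idea, independent of the RAVEN/RELAP-7 toolchain, is to fit a regression model to a small design of expensive simulation runs and then query it at negligible cost. The sketch below uses a Gaussian-process surrogate and a stand-in "simulation" function; both are assumptions made for the example, not the report's models.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

def expensive_simulation(x):
    """Stand-in for a long-running physics code (e.g., a peak temperature
    as a function of one uncertain input); purely illustrative."""
    return 800.0 + 150.0 * np.sin(3.0 * x) + 40.0 * x ** 2

# A small design of "expensive" runs...
x_train = np.linspace(0.0, 2.0, 12).reshape(-1, 1)
y_train = expensive_simulation(x_train).ravel()

# ...is used to train a cheap surrogate.
kernel = ConstantKernel(1.0) * RBF(length_scale=0.5)
surrogate = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
surrogate.fit(x_train, y_train)

# The surrogate is then evaluated at many points almost instantly, with an
# uncertainty estimate that can flag where more simulation runs are needed.
x_query = np.linspace(0.0, 2.0, 200).reshape(-1, 1)
y_pred, y_std = surrogate.predict(x_query, return_std=True)
print(float(y_pred.max()), float(y_std.max()))
```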

  17. Computational estimation of errors generated by lumping of physiologically-based pharmacokinetic (PBPK) interaction models of inhaled complex chemical mixtures

    EPA Science Inventory

    Many cases of environmental contamination result in concurrent or sequential exposure to more than one chemical. However, limitations of available resources make it unlikely that experimental toxicology will provide health risk information about all the possible mixtures to which...

  18. How adverse outcome pathways can aid the development and use of computational prediction models for regulatory toxicology

    EPA Science Inventory

    Efforts are underway to transform regulatory toxicology and chemical safety assessment from a largely empirical science based on direct observation of apical toxicity outcomes in whole organism toxicity tests to a predictive one in which outcomes and risk are inferred from accumu...

  19. A demonstrative model of a lunar base simulation on a personal computer

    NASA Technical Reports Server (NTRS)

    1985-01-01

    The initial demonstration model of a lunar base simulation is described. This initial model was developed at the personal computer level to demonstrate feasibility and technique before proceeding to a larger computer-based model. Lotus Symphony Version 1.1 software was used to implement the demonstration model on a personal computer with an MS-DOS operating system. The personal computer-based model determined the applicability of lunar base modeling techniques developed at an LSPI/NASA workshop. In addition, the personal computer-based demonstration model defined a modeling structure that could be employed on a larger, more comprehensive VAX-based lunar base simulation. Refinement of this personal computer model and the development of a VAX-based model are planned in the near future.

  20. On the role of numerical simulations in studies of reduced gravity-induced physiological effects in humans. Results from NELME.

    NASA Astrophysics Data System (ADS)

    Perez-Poch, Antoni

    Computer simulations are becoming a promising line of research as physiological models become more sophisticated and reliable. Technological advances in state-of-the-art hardware and software nowadays allow better and more accurate simulations of complex phenomena, such as the response of the human cardiovascular system to long-term exposure to microgravity. Experimental data for long-term missions are difficult to obtain and reproduce; therefore the predictions of computer simulations are of major importance in this field. Our approach is based on a model previously developed and implemented in our laboratory (NELME: Numerical Evaluation of Long-term Microgravity Effects). The software simulates the behaviour of the cardiovascular system and different human organs, has a modular architecture, and allows perturbations such as physical exercise or countermeasures to be introduced. The implementation is based on a complex electrical-like model of this control system, built using inexpensive development frameworks, and has been tested and validated against the available experimental data. The objective of this work is to analyse and simulate long-term effects and gender differences when individuals are exposed to long-term microgravity. The probability of a health impairment that could jeopardize a long-term mission is also evaluated. Gender differences have been implemented for this specific work as adjustments of a number of parameters included in the model. Physiological differences between women and men have therefore been taken into account, based upon estimates from the physiology literature. A number of simulations have been carried out for long-term exposure to microgravity. Gravity, varying continuously from Earth level to zero, and exposure time are the two main variables involved in the construction of results, including responses to patterns of physical aerobic exercise and thermal stress simulating an extra-vehicular activity. Results show that significant differences appear between men's and women's physiological responses after long-term exposure (more than three months) to microgravity. Risk evaluations for each gender, and specific risk thresholds, are provided. Different scenarios such as a long-term mission to the Moon or Mars are evaluated, including countermeasures such as aerobic exercise. Initial results are compatible with the existing data and provide useful insights regarding different patterns of microgravity exposure. We conclude that computer-based models such as NELME are a promising line of work to predict health risks in long-term missions.
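
    As a much-simplified illustration of the electrical-analogue modelling style mentioned above (not NELME itself), the classic two-element Windkessel treats arterial compliance as a capacitor and peripheral resistance as a resistor, giving C dP/dt = Q(t) - P/R. The parameter values and inflow waveform in the sketch are illustrative assumptions.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative parameter values, not NELME's: arterial compliance C
# (mL/mmHg) and peripheral resistance R (mmHg*s/mL).
C, R = 1.2, 1.0

def inflow(t, heart_rate=75.0, stroke_volume=70.0):
    """Pulsatile aortic inflow Q(t) in mL/s: a half-sine ejection phase."""
    period = 60.0 / heart_rate
    t_sys = 0.3 * period                      # systolic ejection time
    phase = t % period
    if phase < t_sys:
        return stroke_volume * np.pi / (2 * t_sys) * np.sin(np.pi * phase / t_sys)
    return 0.0

def windkessel(t, p):
    # Two-element Windkessel: C dP/dt = Q(t) - P/R
    return [(inflow(t) - p[0] / R) / C]

sol = solve_ivp(windkessel, (0.0, 10.0), [80.0], max_step=0.005)
tail = sol.y[0][-1000:]
print("pressure range (mmHg):", tail.min(), tail.max())
```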

  1. Primary care physicians' perspectives on computer-based health risk assessment tools for chronic diseases: a mixed methods study.

    PubMed

    Voruganti, Teja R; O'Brien, Mary Ann; Straus, Sharon E; McLaughlin, John R; Grunfeld, Eva

    2015-09-24

    Health risk assessment tools compute an individual's risk of developing a disease. Routine use of such tools by primary care physicians (PCPs) is potentially useful in chronic disease prevention. We sought physicians' awareness and perceptions of the usefulness, usability and feasibility of performing assessments with computer-based risk assessment tools in primary care settings. Focus groups and usability testing with a computer-based risk assessment tool were conducted with PCPs from both university-affiliated and community-based practices. Analysis was derived from grounded theory methodology. PCPs (n = 30) were aware of several risk assessment tools although only select tools were used routinely. The decision to use a tool depended on how use impacted practice workflow and whether the tool had credibility. Participants felt that embedding tools in the electronic medical records (EMRs) system might allow for health information from the medical record to auto-populate into the tool. User comprehension of risk could also be improved with computer-based interfaces that present risk in different formats. In this study, PCPs chose to use certain tools more regularly because of usability and credibility. Despite there being differences in the particular tools a clinical practice used, there was general appreciation for the usefulness of tools for different clinical situations. Participants characterised particular features of an ideal tool, feeling strongly that embedding risk assessment tools in the EMR would maximise accessibility and use of the tool for chronic disease management. However, appropriate practice workflow integration and features that facilitate patient understanding at point-of-care are also essential.

  2. Comparison of different models for non-invasive FFR estimation

    NASA Astrophysics Data System (ADS)

    Mirramezani, Mehran; Shadden, Shawn

    2017-11-01

    Coronary artery disease is a leading cause of death worldwide. Fractional flow reserve (FFR), derived from invasively measuring the pressure drop across a stenosis, is considered the gold standard to diagnose disease severity and need for treatment. Non-invasive estimation of FFR has gained recent attention for its potential to reduce patient risk and procedural cost versus invasive FFR measurement. Non-invasive FFR can be obtained by using image-based computational fluid dynamics to simulate blood flow and pressure in a patient-specific coronary model. However, 3D simulations require extensive effort for model construction and numerical computation, which limits their routine use. In this study we compare (ordered by increasing computational cost/complexity): reduced-order algebraic models of pressure drop across a stenosis; 1D, 2D (multiring) and 3D CFD models; as well as 3D FSI for the computation of FFR in idealized and patient-specific stenosis geometries. We demonstrate the ability of an appropriate reduced order algebraic model to closely predict FFR when compared to FFR from a full 3D simulation. This work was supported by the NIH, Grant No. R01-HL103419.
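
    The cheapest model class compared above is an algebraic pressure-drop relation across the stenosis, typically a viscous term linear in flow plus an expansion-loss term quadratic in flow, from which FFR follows as distal over aortic pressure. The sketch below is a generic example of that class; the loss coefficients, dimensions, and flow values are illustrative assumptions, not the authors' calibrated models.

```python
def stenosis_pressure_drop(q, a_normal, a_stenosis, mu=0.0035, rho=1060.0,
                           kv=32.0, kt=1.5, length=0.02, diameter=0.003):
    """Algebraic pressure-drop estimate (Pa): viscous + expansion losses.

    q          : flow rate through the lesion [m^3/s]
    a_normal   : reference lumen area [m^2]
    a_stenosis : minimal lumen area  [m^2]
    kv, kt     : empirical viscous / turbulent loss coefficients (illustrative)
    """
    v = q / a_normal                                   # reference velocity
    viscous = kv * mu * length / diameter**2 * v
    expansion = kt * 0.5 * rho * (a_normal / a_stenosis - 1.0)**2 * v**2
    return viscous + expansion

def ffr(p_aortic, q, a_normal, a_stenosis):
    """FFR approximated as distal pressure over aortic pressure at hyperemia."""
    dp = stenosis_pressure_drop(q, a_normal, a_stenosis)
    return (p_aortic - dp) / p_aortic

# Example: a ~3 mm vessel with a 70% area stenosis at hyperemic flow.
p_a = 100 * 133.322                     # 100 mmHg in Pa
a0 = 3.1416 * 0.0015**2                 # lumen area for 3 mm diameter
print(round(ffr(p_a, q=3e-6, a_normal=a0, a_stenosis=0.3 * a0), 2))
```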

  3. Nine steps to risk-informed wellhead protection and management: Methods and application to the Burgberg Catchment

    NASA Astrophysics Data System (ADS)

    Nowak, W.; Enzenhoefer, R.; Bunk, T.

    2013-12-01

    Wellhead protection zones are commonly delineated via advective travel time analysis without considering any aspects of model uncertainty. In the past decade, research efforts produced quantifiable risk-based safety margins for protection zones. They are based on well vulnerability criteria (e.g., travel times, exposure times, peak concentrations) cast into a probabilistic setting, i.e., they consider model and parameter uncertainty. Practitioners still refrain from applying these new techniques for mainly three reasons. (1) They fear the possibly cost-intensive additional areal demand of probabilistic safety margins, (2) probabilistic approaches are allegedly complex, not readily available, and consume huge computing resources, and (3) uncertainty bounds are fuzzy, whereas final decisions are binary. The primary goal of this study is to show that these reservations are unjustified. We present a straightforward and computationally affordable framework based on a novel combination of well-known tools (e.g., MODFLOW, PEST, Monte Carlo). This framework provides risk-informed decision support for robust and transparent wellhead delineation under uncertainty. Thus, probabilistic risk-informed wellhead protection is possible with methods readily available for practitioners. As vivid proof of concept, we illustrate our key points on a pumped karstic well catchment, located in Germany. In the case study, we show that reliability levels can be increased by re-allocating the existing delineated area at no increase in delineated area. This is achieved by simply swapping delineated low-risk areas against previously non-delineated high-risk areas. Also, we show that further improvements may often be available at only low additional delineation area. Depending on the context, increases or reductions of delineated area directly translate to costs and benefits, if the land is priced, or if land owners need to be compensated for land use restrictions.
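
    The probabilistic step at the core of such frameworks can be illustrated independently of MODFLOW/PEST: for each candidate cell, estimate the probability across parameter realizations that advective travel time to the well falls below the protection threshold, and delineate cells exceeding a chosen risk tolerance. The sketch below does this for a one-dimensional toy aquifer; the lognormal conductivity distribution, threshold, and tolerance are assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)

def travel_time(distance_m, k_m_per_day, gradient=0.005, porosity=0.25):
    """Advective travel time (days) for uniform flow: t = d * n / (K * i)."""
    return distance_m * porosity / (k_m_per_day * gradient)

def delineation_probability(distances, n_real=5000, t_protect=50.0):
    """P[travel time < t_protect] per cell, over uncertain conductivity.

    Hydraulic conductivity is sampled from a lognormal distribution as a
    stand-in for calibrated-model parameter uncertainty.
    """
    k_samples = rng.lognormal(mean=np.log(30.0), sigma=0.6, size=n_real)
    t = travel_time(distances[:, None], k_samples[None, :])
    return (t < t_protect).mean(axis=1)

cell_distances = np.array([5.0, 20.0, 50.0, 100.0, 200.0])   # m from the well
p_capture = delineation_probability(cell_distances)
protect = p_capture > 0.05          # delineate cells above a 5% risk tolerance
for d, p, flag in zip(cell_distances, p_capture, protect):
    print(f"{d:6.1f} m  P(t < 50 d) = {p:.2f}  delineate: {flag}")
```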

  4. Numerical and experimental analysis of factors leading to suture dehiscence after Billroth II gastric resection.

    PubMed

    Cvetkovic, Aleksandar M; Milasinovic, Danko Z; Peulic, Aleksandar S; Mijailovic, Nikola V; Filipovic, Nenad D; Zdravkovic, Nebojsa D

    2014-11-01

    The main goal of this study was to numerically quantify the risk of duodenal stump blowout after Billroth II (BII) gastric resection. Our hypothesis was that the geometry of the reconstructed tract after BII resection is one of the key factors that can lead to duodenal dehiscence. We used computational fluid dynamics (CFD) with finite element (FE) simulations of various models of the BII-reconstructed gastrointestinal (GI) tract, as well as non-perfused, ex vivo, porcine experimental models. As the main geometrical parameters for the FE postoperative models we used duodenal stump length and the inclination between the gastric remnant and the duodenal stump. Virtual gastric resection was performed on each of the 3D FE models based on multislice computed tomography (CT) DICOM data. According to our computer simulations, the difference between the maximal duodenal stump pressures for the models with the most and least preferable geometry of the reconstructed GI tract is about 30%. We compared the resulting postoperative duodenal pressure from the computer simulations with the duodenal stump dehiscence pressure from the experiment. The pressure at the duodenal stump after BII resection obtained by computer simulation is 4-5 times lower than the dehiscence pressure found in our experiment on isolated bowel segments. Our conclusion is that if the surgery is performed technically correctly, geometry variations of the reconstructed GI tract by themselves are not sufficient to cause duodenal stump blowout. The pressure that develops in the duodenal stump after BII resection using an omega loop can cause duodenal dehiscence only in conjunction with other risk factors; increased duodenal pressure after BII resection is one such risk factor. Hence we recommend the routine use of Roux-en-Y anastomosis as a safer solution in terms of the resulting intraluminal pressure. However, if the surgeon decides to perform a BII reconstruction, the results obtained with this methodology can be valuable. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  5. An individual risk prediction model for lung cancer based on a study in a Chinese population.

    PubMed

    Wang, Xu; Ma, Kewei; Cui, Jiuwei; Chen, Xiao; Jin, Lina; Li, Wei

    2015-01-01

    Early detection and diagnosis remains an effective yet challenging approach to improve the clinical outcome of patients with cancer. Low-dose computed tomography screening has been suggested to improve the diagnosis of lung cancer in high-risk individuals. To make screening more efficient, it is necessary to identify individuals who are at high risk. We conducted a case-control study to develop a predictive model for identification of such high-risk individuals. Clinical data from 705 lung cancer patients and 988 population-based controls were used for the development and evaluation of the model. Associations between environmental variants and lung cancer risk were analyzed with a logistic regression model. The predictive accuracy of the model was determined by calculating the area under the receiver operating characteristic curve and the optimal operating point. Our results indicate that lung cancer risk factors included older age, male gender, lower education level, family history of cancer, history of chronic obstructive pulmonary disease, lower body mass index, smoking cigarettes, a diet with less seafood, vegetables, fruits, dairy products, soybean products and nuts, a diet rich in meat, and exposure to pesticides and cooking emissions. The area under the curve was 0.8851 and the optimal operating point was obtained. With a cutoff of 0.35, the false positive rate, true positive rate, and Youden index were 0.21, 0.87, and 0.66, respectively. The risk prediction model for lung cancer developed in this study could discriminate high-risk from low-risk individuals.

  6. Accuracy and Calibration of Computational Approaches for Inpatient Mortality Predictive Modeling.

    PubMed

    Nakas, Christos T; Schütz, Narayan; Werners, Marcus; Leichtle, Alexander B

    2016-01-01

    Electronic Health Record (EHR) data can be a key resource for decision-making support in clinical practice in the "big data" era. The complete database from early 2012 to late 2015 involving hospital admissions to Inselspital Bern, the largest Swiss University Hospital, was used in this study, involving over 100,000 admissions. Age, sex, and initial laboratory test results were the features/variables of interest for each admission, the outcome being inpatient mortality. Computational decision support systems were utilized for the calculation of the risk of inpatient mortality. We assessed the recently proposed Acute Laboratory Risk of Mortality Score (ALaRMS) model, and further built generalized linear models, generalized estimating equations, artificial neural networks, and decision tree systems for the predictive modeling of the risk of inpatient mortality. The Area Under the ROC Curve (AUC) for ALaRMS marginally corresponded to the anticipated accuracy (AUC = 0.858). Penalized logistic regression methodology provided a better result (AUC = 0.872). Decision tree and neural network-based methodology provided even higher predictive performance (up to AUC = 0.912 and 0.906, respectively). Additionally, decision tree-based methods can efficiently handle Electronic Health Record (EHR) data that have a significant amount of missing records (in up to >50% of the studied features) eliminating the need for imputation in order to have complete data. In conclusion, we show that statistical learning methodology can provide superior predictive performance in comparison to existing methods and can also be production ready. Statistical modeling procedures provided unbiased, well-calibrated models that can be efficient decision support tools for predicting inpatient mortality and assigning preventive measures.

  7. The Use of Object-Oriented Analysis Methods in Surety Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Craft, Richard L.; Funkhouser, Donald R.; Wyss, Gregory D.

    1999-05-01

    Object-oriented analysis methods have been used in the computer science arena for a number of years to model the behavior of computer-based systems. This report documents how such methods can be applied to surety analysis. By embodying the causality and behavior of a system in a common object-oriented analysis model, surety analysts can make the assumptions that underlie their models explicit and thus better communicate with system designers. Furthermore, given minor extensions to traditional object-oriented analysis methods, it is possible to automatically derive a wide variety of traditional risk and reliability analysis methods from a single common object model. Automatic model extraction helps ensure consistency among analyses and enables the surety analyst to examine a system from a wider variety of viewpoints in a shorter period of time. Thus it provides a deeper understanding of a system's behaviors and surety requirements. This report documents the underlying philosophy behind the common object model representation, the methods by which such common object models can be constructed, and the rules required to interrogate the common object model for derivation of traditional risk and reliability analysis models. The methodology is demonstrated in an extensive example problem.

  8. Reduced-order modeling with sparse polynomial chaos expansion and dimension reduction for evaluating the impact of CO2 and brine leakage on groundwater

    NASA Astrophysics Data System (ADS)

    Liu, Y.; Zheng, L.; Pau, G. S. H.

    2016-12-01

    A careful assessment of the risk associated with geologic CO2 storage is critical to the deployment of large-scale storage projects. While numerical modeling is an indispensable tool for risk assessment, there has been increasing need in considering and addressing uncertainties in the numerical models. However, uncertainty analyses have been significantly hindered by the computational complexity of the model. As a remedy, reduced-order models (ROM), which serve as computationally efficient surrogates for high-fidelity models (HFM), have been employed. The ROM is constructed at the expense of an initial set of HFM simulations, and afterwards can be relied upon to predict the model output values at minimal cost. The ROM presented here is part of National Risk Assessment Program (NRAP) and intends to predict the water quality change in groundwater in response to hypothetical CO2 and brine leakage. The HFM based on which the ROM is derived is a multiphase flow and reactive transport model, with 3-D heterogeneous flow field and complex chemical reactions including aqueous complexation, mineral dissolution/precipitation, adsorption/desorption via surface complexation and cation exchange. Reduced-order modeling techniques based on polynomial basis expansion, such as polynomial chaos expansion (PCE), are widely used in the literature. However, the accuracy of such ROMs can be affected by the sparse structure of the coefficients of the expansion. Failing to identify vanishing polynomial coefficients introduces unnecessary sampling errors, the accumulation of which deteriorates the accuracy of the ROMs. To address this issue, we treat the PCE as a sparse Bayesian learning (SBL) problem, and the sparsity is obtained by detecting and including only the non-zero PCE coefficients one at a time by iteratively selecting the most contributing coefficients. The computational complexity due to predicting the entire 3-D concentration fields is further mitigated by a dimension reduction procedure-proper orthogonal decomposition (POD). Our numerical results show that utilizing the sparse structure and POD significantly enhances the accuracy and efficiency of the ROMs, laying the basis for further analyses that necessitate a large number of model simulations.
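
    The dimension-reduction step mentioned above, proper orthogonal decomposition, can be sketched in a few lines of NumPy independently of the NRAP models: extract the dominant spatial modes from a snapshot matrix by SVD and represent any new field by its coordinates in that basis. The synthetic snapshot data and energy threshold below are assumptions made for the example.

```python
import numpy as np

def pod_basis(snapshots, energy=0.99):
    """POD via SVD of mean-centered snapshots (columns = simulation runs).

    Returns the spatial modes retaining the requested fraction of variance
    ("energy") and the mean field used for centering.
    """
    mean_field = snapshots.mean(axis=1, keepdims=True)
    u, s, _ = np.linalg.svd(snapshots - mean_field, full_matrices=False)
    cumulative = np.cumsum(s**2) / np.sum(s**2)
    r = int(np.searchsorted(cumulative, energy)) + 1
    return u[:, :r], mean_field

# Synthetic stand-in for 3-D concentration fields flattened to vectors:
# 2000 grid cells, 40 high-fidelity runs driven by two latent patterns.
rng = np.random.default_rng(0)
modes_true = rng.normal(size=(2000, 2))
coeffs = rng.normal(size=(2, 40))
snapshots = modes_true @ coeffs + 0.01 * rng.normal(size=(2000, 40))

basis, mean_field = pod_basis(snapshots)
print("retained POD modes:", basis.shape[1])

# A new field is represented by a handful of coefficients instead of
# thousands of grid values -- the quantity a surrogate model predicts.
new_field = modes_true @ rng.normal(size=(2, 1))
reduced = basis.T @ (new_field - mean_field)
print("reduced coordinates:", reduced.ravel())
```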

  9. Tsunami Forecasting in the Atlantic Basin

    NASA Astrophysics Data System (ADS)

    Knight, W. R.; Whitmore, P.; Sterling, K.; Hale, D. A.; Bahng, B.

    2012-12-01

    The mission of the West Coast and Alaska Tsunami Warning Center (WCATWC) is to provide advance tsunami warning and guidance to coastal communities within its Area-of-Responsibility (AOR). Predictive tsunami models, based on the shallow water wave equations, are an important part of the Center's guidance support. An Atlantic-based counterpart to the long-standing forecasting ability in the Pacific known as the Alaska Tsunami Forecast Model (ATFM) is now developed. The Atlantic forecasting method is based on ATFM version 2 which contains advanced capabilities over the original model; including better handling of the dynamic interactions between grids, inundation over dry land, new forecast model products, an optional non-hydrostatic approach, and the ability to pre-compute larger and more finely gridded regions using parallel computational techniques. The wide and nearly continuous Atlantic shelf region presents a challenge for forecast models. Our solution to this problem has been to develop a single unbroken high resolution sub-mesh (currently 30 arc-seconds), trimmed to the shelf break. This allows for edge wave propagation and for kilometer scale bathymetric feature resolution. Terminating the fine mesh at the 2000m isobath keeps the number of grid points manageable while allowing for a coarse (4 minute) mesh to adequately resolve deep water tsunami dynamics. Higher resolution sub-meshes are then included around coastal forecast points of interest. The WCATWC Atlantic AOR includes eastern U.S. and Canada, the U.S. Gulf of Mexico, Puerto Rico, and the Virgin Islands. Puerto Rico and the Virgin Islands are in very close proximity to well-known tsunami sources. Because travel times are under an hour and response must be immediate, our focus is on pre-computing many tsunami source "scenarios" and compiling those results into a database accessible and calibrated with observations during an event. Seismic source evaluation determines the order of model pre-computation - starting with those sources that carry the highest risk. Model computation zones are confined to regions at risk to save computation time. For example, Atlantic sources have been shown to not propagate into the Gulf of Mexico. Therefore, fine grid computations are not performed in the Gulf for Atlantic sources. Outputs from the Atlantic model include forecast marigrams at selected sites, maximum amplitudes, drawdowns, and currents for all coastal points. The maximum amplitude maps will be supplemented with contoured energy flux maps which show more clearly the effects of bathymetric features on tsunami wave propagation. During an event, forecast marigrams will be compared to observations to adjust the model results. The modified forecasts will then be used to set alert levels between coastal breakpoints, and provided to emergency management.

  10. Hybrid-Aware Model for Senior Wellness Service in Smart Home.

    PubMed

    Jung, Yuchae

    2017-05-22

    Smart home technology with situation-awareness is important for seniors to improve safety and security. With the development of context-aware computing, wearable sensor technology, and ubiquitous computing, it is easier for seniors to manage their health problems in a smart home environment. For monitoring senior activity in the smart home, wearable and motion sensors, such as respiration rate (RR), electrocardiography (ECG), body temperature, and blood pressure (BP) sensors, were used for monitoring the movements of seniors. For context-awareness, environmental sensors, such as gas, fire, smoke, dust, temperature, and light sensors, were used for senior location data collection. Based on senior activity, senior health status can be classified as positive or negative. Based on senior location and time, senior safety is classified as safe or emergency. In this paper, we propose a hybrid inspection service middleware for monitoring elderly health risk based on senior activity and location. This hybrid-aware model for the detection of abnormal senior status has four steps: (1) data collection from biosensors and environmental sensors; (2) monitoring senior location and time of stay in each location using environmental sensors; (3) monitoring senior activity using biometric data; and finally, (4) an expectation-maximization based decision-making step recommending proper treatment based on a senior health risk ratio.

  11. Hybrid-Aware Model for Senior Wellness Service in Smart Home

    PubMed Central

    Jung, Yuchae

    2017-01-01

    Smart home technology with situation-awareness is important for seniors to improve safety and security. With the development of context-aware computing, wearable sensor technology, and ubiquitous computing, it is easier for seniors to manage their health problem in smart home environment. For monitoring senior activity in smart home, wearable, and motion sensors—such as respiration rate (RR), electrocardiography (ECG), body temperature, and blood pressure (BP)—were used for monitoring movements of seniors. For context-awareness, environmental sensors—such as gas, fire, smoke, dust, temperature, and light sensors—were used for senior location data collection. Based on senior activity, senior health status can be classified into positive and negative. Based on senior location and time, senior safety is classified into safe and emergency. In this paper, we propose a hybrid inspection service middleware for monitoring elderly health risk based on senior activity and location. This hybrid-aware model for the detection of abnormal status of seniors has four steps as follows: (1) data collection from biosensors and environmental sensors; (2) monitoring senior location and time of stay in each location using environmental sensors; (3) monitoring senior activity using biometric data; finally, (4) expectation-maximization based decision-making step recommending proper treatment based on a senior health risk ratio. PMID:28531157

  12. Clearing margin system in the futures markets—Applying the value-at-risk model to Taiwanese data

    NASA Astrophysics Data System (ADS)

    Chiu, Chien-Liang; Chiang, Shu-Mei; Hung, Jui-Cheng; Chen, Yu-Lung

    2006-07-01

    This article sets out to investigate whether the TAIFEX has an adequate clearing margin adjustment system, using the unconditional coverage test, the conditional coverage test and the mean relative scaled bias to assess the performance of three value-at-risk (VaR) models (i.e., the TAIFEX, RiskMetrics and GARCH-t). For the same model, original and absolute returns are compared to explore which can accurately capture the true risk. For the same return, daily and tiered adjustment methods are examined to evaluate which corresponds to risk best. The results indicate that the clearing margin adjustment of the TAIFEX cannot reflect true risks. The adjustment rules, including the use of absolute returns and tiered adjustment of the clearing margin, have distorted the VaR-based margin requirements. In addition, the results suggest that the TAIFEX should use original returns to compute VaR and a daily adjustment system to set the clearing margin. This approach would improve the efficiency of funds operation and the liquidity of the futures markets.
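
    One of the three VaR models compared, RiskMetrics, derives the margin from an exponentially weighted moving-average (EWMA) volatility. The sketch below illustrates that calculation and a simple unconditional-coverage check on synthetic heavy-tailed returns; the decay factor, confidence level, and simulated data are assumptions standing in for TAIFEX futures returns.

```python
import numpy as np
from scipy.stats import norm

def ewma_variance(returns, lam=0.94):
    """RiskMetrics-style EWMA variance series:
    sigma_t^2 = lam * sigma_{t-1}^2 + (1 - lam) * r_{t-1}^2."""
    var = np.empty_like(returns)
    var[0] = np.var(returns[:20])          # seed from the first month of data
    for t in range(1, len(returns)):
        var[t] = lam * var[t - 1] + (1.0 - lam) * returns[t - 1] ** 2
    return var

# Synthetic heavy-tailed daily returns standing in for TAIFEX futures data.
rng = np.random.default_rng(7)
rets = rng.standard_t(df=5, size=1500) * 0.01

sigma = np.sqrt(ewma_variance(rets))
var_99 = norm.ppf(0.99) * sigma            # one-day 99% VaR (margin proxy)

# Unconditional coverage check: the exceedance rate should be close to 1%.
exceedance_rate = np.mean(rets < -var_99)
print(f"mean VaR: {var_99.mean():.4f}, exceedance rate: {exceedance_rate:.3f}")
```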

  13. Developing points-based risk-scoring systems in the presence of competing risks.

    PubMed

    Austin, Peter C; Lee, Douglas S; D'Agostino, Ralph B; Fine, Jason P

    2016-09-30

    Predicting the occurrence of an adverse event over time is an important issue in clinical medicine. Clinical prediction models and associated points-based risk-scoring systems are popular statistical methods for summarizing the relationship between a multivariable set of patient risk factors and the risk of the occurrence of an adverse event. Points-based risk-scoring systems are popular amongst physicians as they permit a rapid assessment of patient risk without the use of computers or other electronic devices. The use of such points-based risk-scoring systems facilitates evidence-based clinical decision making. There is a growing interest in cause-specific mortality and in non-fatal outcomes. However, when considering these types of outcomes, one must account for competing risks whose occurrence precludes the occurrence of the event of interest. We describe how points-based risk-scoring systems can be developed in the presence of competing events. We illustrate the application of these methods by developing risk-scoring systems for predicting cardiovascular mortality in patients hospitalized with acute myocardial infarction. Code in the R statistical programming language is provided for the implementation of the described methods. © 2016 The Authors. Statistics in Medicine published by John Wiley & Sons Ltd.

  14. Applying a machine learning model using a locally preserving projection based feature regeneration algorithm to predict breast cancer risk

    NASA Astrophysics Data System (ADS)

    Heidari, Morteza; Zargari Khuzani, Abolfazl; Danala, Gopichandh; Mirniaharikandehei, Seyedehnafiseh; Qian, Wei; Zheng, Bin

    2018-03-01

    Both conventional and deep machine learning have been used to develop decision-support tools applied in medical imaging informatics. In order to take advantage of both conventional and deep learning approaches, this study aims to investigate the feasibility of applying a locally preserving projection (LPP) based feature regeneration algorithm to build a new machine learning classifier model to predict short-term breast cancer risk. First, a computer-aided image processing scheme was used to segment and quantify breast fibro-glandular tissue volume. Next, 44 initially computed image features related to bilateral mammographic tissue density asymmetry were extracted. Then, an LPP-based feature combination method was applied to regenerate a new operational feature vector using a maximal variance approach. Last, a k-nearest neighborhood (KNN) algorithm based machine learning classifier using the LPP-generated new feature vectors was developed to predict breast cancer risk. A testing dataset involving negative mammograms acquired from 500 women was used. Among them, 250 were positive and 250 remained negative in the next subsequent mammography screening. Applied to this dataset, the LPP-generated feature vector reduced the number of features from 44 to 4. Using a leave-one-case-out validation method, the area under the ROC curve produced by the KNN classifier significantly increased from 0.62 to 0.68 (p < 0.05) and the odds ratio was 4.60 with a 95% confidence interval of [3.16, 6.70]. The study demonstrated that this new LPP-based feature regeneration approach produced an optimal feature vector and yielded improved performance in predicting the risk of women having breast cancer detected in the next subsequent mammography screening.

  15. Consensus Modeling in Support of a Semi-Automated Read-Across Application (SOT)

    EPA Science Inventory

    Read-across is a widely used technique to help fill data gaps in a risk assessment. With the increasing availability of large amounts of computable in vitro and in vivo data on chemicals, it should be possible to build a variety of computer models to help guide a risk assessor in...

  16. DEVELOPMENT OF 3-D COMPUTER MODELS OF HUMAN LUNG MORPHOLOGY FOR IMPROVED RISK ASSESSMENT OF INHALED PARTICULATE MATTER

    EPA Science Inventory

    DEVELOPMENT OF 3-D COMPUTER MODELS OF HUMAN LUNG MORPHOLOGY FOR IMPROVED RISK ASSESSMENT OF INHALED PARTICULATE MATTER

    Jeffry D. Schroeter, Curriculum in Toxicology, University of North Carolina, Chapel Hill, NC 27599; Ted B. Martonen, ETD, NHEERL, USEPA, RTP, NC 27711; Do...

  17. Emerging systems biology approaches in nanotoxicology: Towards a mechanism-based understanding of nanomaterial hazard and risk

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Costa, Pedro M.; Fadeel, Bengt, E-mail: Bengt.Fade

    Engineered nanomaterials are being developed for a variety of technological applications. However, the increasing use of nanomaterials in society has led to concerns about their potential adverse effects on human health and the environment. During the first decade of nanotoxicological research, the realization has emerged that effective risk assessment of the multitudes of new nanomaterials would benefit from a comprehensive understanding of their toxicological mechanisms, which is difficult to achieve with traditional, low-throughput, single end-point oriented approaches. Therefore, systems biology approaches are being progressively applied within the nano(eco)toxicological sciences. This novel paradigm implies that the study of biological systems should be integrative, resulting in quantitative and predictive models of nanomaterial behaviour in a biological system. To this end, global ‘omics’ approaches with which to assess changes in genes, proteins, metabolites, etc., are deployed, allowing for computational modelling of the biological effects of nanomaterials. Here, we highlight omics and systems biology studies in nanotoxicology, aiming towards the implementation of a systems nanotoxicology and mechanism-based risk assessment of nanomaterials. - Highlights: • Systems nanotoxicology is a multi-disciplinary approach to quantitative modelling. • Transcriptomics, proteomics and metabolomics remain the most common methods. • Global “omics” techniques should be coupled to computational modelling approaches. • The discovery of nano-specific toxicity pathways and biomarkers is a prioritized goal. • Overall, experimental nanosafety research must strive for reproducibility and relevance.

  18. A First Step towards a Clinical Decision Support System for Post-traumatic Stress Disorders.

    PubMed

    Ma, Sisi; Galatzer-Levy, Isaac R; Wang, Xuya; Fenyö, David; Shalev, Arieh Y

    2016-01-01

    PTSD is distressing and debilitating, following a non-remitting course in about 10% to 20% of trauma survivors. Numerous risk indicators of PTSD have been identified, but individual-level prediction remains elusive. As an effort to bridge the gap between scientific discovery and practical application, we designed and implemented a clinical decision support pipeline to provide clinically relevant recommendations for trauma survivors. To meet the specific challenge of early prediction, this work uses data obtained within ten days of a traumatic event. The pipeline creates a personalized predictive model for each individual and computes quality metrics for each predictive model. Clinical recommendations are made based on both the prediction of the model and its quality, thus avoiding potentially detrimental recommendations based on insufficient information or a suboptimal model. The current pipeline outperforms acute stress disorder, a commonly used clinical risk factor for PTSD development, in terms of both sensitivity and specificity.

  19. Risk Factors for Addiction and Their Association with Model-Based Behavioral Control.

    PubMed

    Reiter, Andrea M F; Deserno, Lorenz; Wilbertz, Tilmann; Heinze, Hans-Jochen; Schlagenhauf, Florian

    2016-01-01

    Addiction shows familial aggregation, and previous endophenotype research suggests that healthy relatives of addicted individuals share altered behavioral and cognitive characteristics with individuals suffering from addiction. In this study we asked whether the impairments in behavioral control proposed for addiction, namely a shift from goal-directed, model-based toward habitual, model-free control, extend to an unaffected sample (n = 20) of adult children of alcohol-dependent fathers as compared to a sample without any personal or family history of alcohol addiction (n = 17). Using a sequential decision-making task designed to investigate model-free and model-based control, combined with a computational modeling analysis, we did not find any evidence for altered behavioral control in individuals with a positive family history of alcohol addiction. Independent of family history of alcohol dependence, we observed, however, that the interaction of two different risk factors for addiction, namely impulsivity and cognitive capacity, predicts the balance of model-free and model-based behavioral control. Post-hoc tests showed a positive association of model-based behavior with cognitive capacity in the lower, but not in the higher, impulsivity group of the original sample. In an independent sample of particularly high- vs. low-impulsive individuals, we confirmed the interaction effect of cognitive capacity and high vs. low impulsivity on model-based control. In the confirmation sample, a positive association of the model-based weighting parameter omega with cognitive capacity was observed in highly impulsive individuals, but not in low-impulsive individuals. Due to the moderate sample size of the study, further investigation of the association of risk factors for addiction with model-based behavior in larger samples is warranted.

  20. Exploring a new bilateral focal density asymmetry based image marker to predict breast cancer risk

    NASA Astrophysics Data System (ADS)

    Aghaei, Faranak; Mirniaharikandehei, Seyedehnafiseh; Hollingsworth, Alan B.; Wang, Yunzhi; Qiu, Yuchen; Liu, Hong; Zheng, Bin

    2017-03-01

    Although breast density has been widely considered an important breast cancer risk factor, it is not very effective for predicting the risk of developing breast cancer in the short term or of harboring cancer in mammograms. Building on our recent studies of short-term breast cancer risk stratification models based on bilateral mammographic density asymmetry, in this study we explored a new quantitative image marker based on bilateral focal density asymmetry to predict the risk of harboring cancers in mammograms. For this purpose, we assembled a testing dataset involving 100 positive and 100 negative cases. In each positive case, no solid masses were visible on the mammograms. We developed a computer-aided detection (CAD) scheme to automatically detect focal dense regions depicted on the two bilateral mammograms of the left and right breasts. The CAD scheme selects the focal dense region with the maximum size on each image and computes its asymmetry ratio. We used this focal density asymmetry as a new imaging marker to divide the testing cases into two groups of higher and lower focal density asymmetry. The first group included 70 cases, of which 62.9% were positive, while the second group included 130 cases, of which 43.1% were positive. The odds ratio was 2.24. As a result, this preliminary study supported the feasibility of applying a new focal density asymmetry based imaging marker to predict the risk of having mammography-occult cancers. The goal is to help radiologists detect early subtle cancers more effectively and accurately using mammography and/or other adjunctive imaging modalities in the future.
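
    The marker itself reduces to a simple ratio between the largest focal dense regions detected on the two breasts, followed by a threshold that splits cases into higher- and lower-asymmetry groups. The sketch below shows that final step only (region detection is assumed to have been done by the CAD scheme), and the threshold value is illustrative, not the one used in the study.

```python
def focal_density_asymmetry(left_region_size, right_region_size):
    """Asymmetry ratio of the largest focal dense regions of the two breasts.

    Returns a value in [0, 1]; 0 means perfectly symmetric, values near 1
    mean one breast has a much larger focal dense region than the other.
    """
    larger = max(left_region_size, right_region_size)
    if larger == 0:
        return 0.0
    return abs(left_region_size - right_region_size) / larger

def risk_group(asymmetry, threshold=0.5):
    """Split cases into higher/lower focal-asymmetry groups.

    The threshold is illustrative, not the cut point used in the study.
    """
    return "higher" if asymmetry >= threshold else "lower"

# Example: region sizes (in pixels) from the two CC-view mammograms.
a = focal_density_asymmetry(left_region_size=5200, right_region_size=1800)
print(round(a, 2), risk_group(a))
```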

  1. Cloud immersion building shielding factors for US residential structures.

    PubMed

    Dickson, E D; Hamby, D M

    2014-12-01

    This paper presents validated building shielding factors designed for contemporary US housing stock under an idealized, yet realistic, exposure scenario within a semi-infinite cloud of radioactive material. The building shielding factors are intended for use in emergency planning and level three probabilistic risk assessments for a variety of postulated radiological events in which a realistic assessment is necessary to better understand the potential risks for accident mitigation and emergency response planning. Factors are calculated from detailed computational housing-unit models using the general-purpose Monte Carlo N-Particle code, MCNP5, and are benchmarked against a series of narrow- and broad-beam measurements analyzing the shielding effectiveness of ten common general-purpose construction materials and ten shielding models representing the primary weather barriers (walls and roofs) of likely US housing stock. Each model was designed to scale based on common residential construction practices and includes, to the extent practical, all structurally significant components important for shielding against ionizing radiation. Calculations were performed for floor-specific locations as well as for computing a weighted-average representative building shielding factor for single- and multi-story detached homes, both with and without basements, as well as for single-wide manufactured housing units.

  2. The neural representation of unexpected uncertainty during value-based decision making.

    PubMed

    Payzan-LeNestour, Elise; Dunne, Simon; Bossaerts, Peter; O'Doherty, John P

    2013-07-10

    Uncertainty is an inherent property of the environment and a central feature of models of decision-making and learning. Theoretical propositions suggest that one form, unexpected uncertainty, may be used to rapidly adapt to changes in the environment, while being influenced by two other forms: risk and estimation uncertainty. While previous studies have reported neural representations of estimation uncertainty and risk, relatively little is known about unexpected uncertainty. Here, participants performed a decision-making task while undergoing functional magnetic resonance imaging (fMRI), which, in combination with a Bayesian model-based analysis, enabled us to separately examine each form of uncertainty. We found representations of unexpected uncertainty in multiple cortical areas, as well as in the noradrenergic brainstem nucleus locus coeruleus. Other unique cortical regions were found to encode risk, estimation uncertainty, and learning rate. Collectively, these findings support theoretical models in which several formally separable uncertainty computations determine the speed of learning. Copyright © 2013 Elsevier Inc. All rights reserved.

  3. Development of a High Resolution 3D Infant Stomach Model for Surgical Planning

    NASA Astrophysics Data System (ADS)

    Chaudry, Qaiser; Raza, S. Hussain; Lee, Jeonggyu; Xu, Yan; Wulkan, Mark; Wang, May D.

    Medical surgical procedures have not changed much during the past century, due in part to the lack of an accurate, low-cost workbench for testing any new improvement. Increasingly cheap and powerful computer technology has made computer-based surgery planning and training feasible. In our work, we have developed an accurate 3D stomach model, which aims to improve the surgical procedure that treats infant pediatric and neonatal gastro-esophageal reflux disease (GERD). We generate the 3-D infant stomach model based on in vivo computed tomography (CT) scans of an infant. CT is a widely used clinical imaging modality that is cheap, but with low spatial resolution. To improve the model accuracy, we use the high resolution Visible Human Project (VHP) data in model building. Next, we add soft muscle material properties to make the 3D model deformable. Then we use virtual reality techniques such as haptic devices to make the 3D stomach model deform in response to touch forces. This accurate 3D stomach model provides a workbench for testing new GERD treatment surgical procedures. It has the potential to reduce or eliminate the extensive cost associated with animal testing when improving any surgical procedure and, ultimately, to reduce the risk associated with infant GERD surgery.

  4. An Integrated Scenario Ensemble-Based Framework for Hurricane Evacuation Modeling: Part 2-Hazard Modeling.

    PubMed

    Blanton, Brian; Dresback, Kendra; Colle, Brian; Kolar, Randy; Vergara, Humberto; Hong, Yang; Leonardo, Nicholas; Davidson, Rachel; Nozick, Linda; Wachtendorf, Tricia

    2018-04-25

    Hurricane track and intensity can change rapidly in unexpected ways, thus making predictions of hurricanes and related hazards uncertain. This inherent uncertainty often translates into suboptimal decision-making outcomes, such as unnecessary evacuation. Representing this uncertainty is thus critical in evacuation planning and related activities. We describe a physics-based hazard modeling approach that (1) dynamically accounts for the physical interactions among hazard components and (2) captures hurricane evolution uncertainty using an ensemble method. This loosely coupled model system provides a framework for probabilistic water inundation and wind speed levels for a new, risk-based approach to evacuation modeling, described in a companion article in this issue. It combines the Weather Research and Forecasting (WRF) meteorological model, the Coupled Routing and Excess STorage (CREST) hydrologic model, and the ADvanced CIRCulation (ADCIRC) storm surge, tide, and wind-wave model to compute inundation levels and wind speeds for an ensemble of hurricane predictions. Perturbations to WRF's initial and boundary conditions and different model physics/parameterizations generate an ensemble of storm solutions, which are then used to drive the coupled hydrologic + hydrodynamic models. Hurricane Isabel (2003) is used as a case study to illustrate the ensemble-based approach. The inundation, river runoff, and wind hazard results are strongly dependent on the accuracy of the mesoscale meteorological simulations, which improves with decreasing lead time to hurricane landfall. The ensemble envelope brackets the observed behavior while providing "best-case" and "worst-case" scenarios for the subsequent risk-based evacuation model. © 2018 Society for Risk Analysis.

  5. Computational Planning in Facial Surgery.

    PubMed

    Zachow, Stefan

    2015-10-01

    This article reflects the research of the last two decades in computational planning for cranio-maxillofacial surgery. Model-guided and computer-assisted surgery planning has tremendously developed due to ever increasing computational capabilities. Simulators for education, planning, and training of surgery are often compared with flight simulators, where maneuvers are also trained to reduce a possible risk of failure. Meanwhile, digital patient models can be derived from medical image data with astonishing accuracy and thus can serve for model surgery to derive a surgical template model that represents the envisaged result. Computerized surgical planning approaches, however, are often still explorative, meaning that a surgeon tries to find a therapeutic concept based on his or her expertise using computational tools that are mimicking real procedures. Future perspectives of an improved computerized planning may be that surgical objectives will be generated algorithmically by employing mathematical modeling, simulation, and optimization techniques. Planning systems thus act as intelligent decision support systems. However, surgeons can still use the existing tools to vary the proposed approach, but they mainly focus on how to transfer objectives into reality. Such a development may result in a paradigm shift for future surgery planning. Thieme Medical Publishers 333 Seventh Avenue, New York, NY 10001, USA.

  6. An Agent-Based Epidemic Simulation of Social Behaviors Affecting HIV Transmission among Taiwanese Homosexuals

    PubMed Central

    2015-01-01

    Computational simulations are currently used to identify epidemic dynamics, to test potential prevention and intervention strategies, and to study the effects of social behaviors on HIV transmission. The author describes an agent-based epidemic simulation model of a network of individuals who participate in high-risk sexual practices, using number of partners, condom usage, and relationship length to distinguish between high- and low-risk populations. Two new concepts—free links and fixed links—are used to indicate tendencies among individuals who either have large numbers of short-term partners or stay in long-term monogamous relationships. An attempt was made to reproduce epidemic curves of reported HIV cases among male homosexuals in Taiwan prior to using the agent-based model to determine the effects of various policies on epidemic dynamics. Results suggest that when suitable adjustments are made based on available social survey statistics, the model accurately simulates real-world behaviors on a large scale. PMID:25815047
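
    A heavily simplified sketch of the kind of agent-based dynamics described above is shown below: agents in long-term monogamous partnerships ("fixed links") and agents with many short-term partners ("free links") mix over weekly time steps, with condom use reducing per-contact transmission. All parameter values (population size, transmission probability, condom efficacy, behavior shares) are hypothetical placeholders rather than the calibrated values from the Taiwanese study.

```python
import random

random.seed(1)

N, STEPS = 2000, 52                 # agents and weekly time steps (illustrative values)
P_TRANSMIT = 0.01                   # per-contact transmission probability (assumed)
CONDOM_EFFICACY = 0.9               # reduction in transmission risk when condoms are used

# Each agent: infected flag, condom-use probability, and behavior type.
agents = [{"infected": i < 10,                       # seed 10 infections
           "condom": random.uniform(0.2, 0.9),
           "free": random.random() < 0.3}            # "free-link" agents have many short-term partners
          for i in range(N)]

# Fixed links: long-term monogamous pairs among the remaining agents.
fixed_ids = [i for i, a in enumerate(agents) if not a["free"]]
random.shuffle(fixed_ids)
fixed_pairs = list(zip(fixed_ids[::2], fixed_ids[1::2]))

def contact(a, b):
    """One sexual contact between agents a and b; may transmit infection."""
    if agents[a]["infected"] == agents[b]["infected"]:
        return
    condom = random.random() < min(agents[a]["condom"], agents[b]["condom"])
    p = P_TRANSMIT * ((1 - CONDOM_EFFICACY) if condom else 1.0)
    if random.random() < p:
        agents[a]["infected"] = agents[b]["infected"] = True

free_ids = [i for i, a in enumerate(agents) if a["free"]]
for week in range(STEPS):
    for a, b in fixed_pairs:                        # stable partnerships
        contact(a, b)
    for a in free_ids:                              # short-term random partners
        contact(a, random.choice(free_ids))

print("prevalence after one year:", sum(a["infected"] for a in agents) / N)
```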

  7. Assessing the impact of the Lebanese National Polio Immunization Campaign using a population-based computational model.

    PubMed

    Alawieh, Ali; Sabra, Zahraa; Langley, E Farris; Bizri, Abdul Rahman; Hamadeh, Randa; Zaraket, Fadi A

    2017-11-25

    After the re-introduction of poliovirus to Syria in 2013, Lebanon was considered at high transmission risk due to its proximity to Syria and the high number of Syrian refugees. However, after a large-scale national immunization initiative, Lebanon was able to prevent a potential outbreak of polio among nationals and refugees. In this work, we used a computational individual-simulation model to assess the risk that poliovirus posed to Lebanon before and after the immunization campaign, and to quantitatively assess the healthcare impact of the campaign and the standards that need to be maintained nationally to prevent a future outbreak. Acute poliomyelitis surveillance in Lebanon, along with the design and coverage rate of the recent national polio immunization campaign, was reviewed from the records of the Lebanese Ministry of Public Health. Lebanese population demographics, including Syrian and Palestinian refugees, were reviewed to design individual-based models that predict the consequences of polio spread to Lebanon and evaluate the outcome of immunization campaigns. The model takes into account geographic, demographic and health-related features. Our simulations confirmed the high risk of polio outbreaks in Lebanon within 10 days of case introduction prior to the immunization campaign, and showed that the current immunization campaign significantly reduced the speed of the infection in the event poliomyelitis cases enter the country. A minimum of 90% national immunization coverage was found to be required to prevent exponential propagation of potential transmission. Both surveillance and immunization efforts should be maintained at high standards in Lebanon and other countries in the area to detect and limit any potential outbreak. The use of computational population simulation models can provide a quantitative approach to assess the impact of immunization campaigns and the burden of infectious diseases even in the context of population migration.
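
    The 90% coverage threshold reported above reflects the standard relationship between immunization coverage and the effective reproduction number. The schematic sketch below illustrates that relationship with a toy stochastic branching simulation; the assumed R0, generation count and outbreak threshold are illustrative and are not taken from the paper's individual-based model.

```python
import random

random.seed(2)

R0 = 6.0            # assumed basic reproduction number for wild poliovirus
GENERATIONS = 10
TRIALS = 1000

def outbreak_probability(coverage, threshold=100):
    """Fraction of introductions that grow beyond `threshold` cases within
    GENERATIONS transmission generations, given immunization coverage."""
    big = 0
    for _ in range(TRIALS):
        cases, total = 1, 1
        for _ in range(GENERATIONS):
            # Each case has int(R0) contacts; each contact is infected only if
            # unvaccinated, so the effective reproduction number is R0 * (1 - coverage).
            new = sum(1 for _ in range(cases) for _ in range(int(R0))
                      if random.random() < (1 - coverage))
            cases, total = new, total + new
            if cases == 0:
                break
        big += total > threshold
    return big / TRIALS

for cov in (0.70, 0.80, 0.90, 0.95):
    print(f"coverage {cov:.0%}: outbreak probability ~ {outbreak_probability(cov):.2f}")
```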

  8. MRI-Based Computational Fluid Dynamics in Experimental Vascular Models: Toward the Development of an Approach for Prediction of Cardiovascular Changes During Prolonged Space Missions

    NASA Technical Reports Server (NTRS)

    Spirka, T. A.; Myers, J. G.; Setser, R. M.; Halliburton, S. S.; White, R. D.; Chatzimavroudis, G. P.

    2005-01-01

    A priority of NASA is to identify and study possible risks to astronauts' health during prolonged space missions [1]. The goal is to develop a procedure for a preflight evaluation of the cardiovascular system of an astronaut and to forecast how it will be affected during the mission. To predict these changes, a computational cardiovascular model must be constructed. Although physiology data can be used to make a general model, a more desirable subject-specific model requires anatomical, functional, and flow data from the specific astronaut. MRI has the unique advantage of providing images with all of the above information, including three-directional velocity data which can be used as boundary conditions in a computational fluid dynamics (CFD) program [2,3]. MRI-based CFD is very promising for reproduction of the flow patterns of a specific subject and prediction of changes in the absence of gravity. The aim of this study was to test the feasibility of this approach by reconstructing the geometry of MRI-scanned arterial models and reproducing the MRI-measured velocities using CFD simulations on these geometries.

  9. Boosting structured additive quantile regression for longitudinal childhood obesity data.

    PubMed

    Fenske, Nora; Fahrmeir, Ludwig; Hothorn, Torsten; Rzehak, Peter; Höhle, Michael

    2013-07-25

    Childhood obesity and the investigation of its risk factors have become an important public health issue. Our work is based on and motivated by a German longitudinal study including 2,226 children with up to ten measurements of their body mass index (BMI) and risk factors from birth to the age of 10 years. We introduce boosting of structured additive quantile regression as a novel distribution-free approach for longitudinal quantile regression. The quantile-specific predictors of our model include conventional linear population effects, smooth nonlinear functional effects, varying-coefficient terms, and individual-specific effects, such as intercepts and slopes. Estimation is based on boosting, a computer-intensive inference method for highly complex models. We propose a component-wise functional gradient descent boosting algorithm that allows for penalized estimation of the large variety of different effects, particularly leading to individual-specific effects shrunken toward zero. This concept allows us to flexibly estimate the nonlinear age curves of upper quantiles of the BMI distribution, at both the population and the individual-specific level, adjusted for further risk factors, and to detect age-varying effects of categorical risk factors. Our model approach can be regarded as the quantile regression analog of Gaussian additive mixed models (or structured additive mean regression models), and we compare both model classes with respect to our obesity data.
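
    The core of the estimation procedure is component-wise functional gradient boosting on the check (pinball) loss: at each iteration the negative gradient of the quantile loss is computed at the current fit, each base learner is fitted to it, and only the best-fitting component is updated with a small step length. The sketch below illustrates this idea on simulated data with two simple least-squares base learners; it is not the mboost implementation used in the study, and all data and tuning values are made up.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 500
age = rng.uniform(0, 10, n)                      # child age in years (simulated)
x_risk = rng.binomial(1, 0.3, n)                 # a binary risk factor (simulated)
bmi = 16 + 0.3 * age + 1.5 * x_risk + rng.gamma(2, 0.5, n)   # skewed outcome

tau = 0.9                                        # upper BMI quantile of interest
nu, n_boost = 0.1, 200                           # step length and boosting iterations

# Base learners: simple polynomial/linear bases, fitted one component at a time.
designs = {"age": np.vander(age, 4), "risk": np.column_stack([np.ones(n), x_risk])}
coefs = {k: np.zeros(X.shape[1]) for k, X in designs.items()}
fit = np.full(n, np.quantile(bmi, tau))          # offset: empirical quantile

for _ in range(n_boost):
    # Negative gradient of the check (pinball) loss at the current fit.
    u = np.where(bmi > fit, tau, tau - 1.0)
    # Fit each base learner to u by least squares; keep only the best one.
    best, best_rss, best_beta = None, np.inf, None
    for name, X in designs.items():
        beta, *_ = np.linalg.lstsq(X, u, rcond=None)
        rss = np.sum((u - X @ beta) ** 2)
        if rss < best_rss:
            best, best_rss, best_beta = name, rss, beta
    coefs[best] += nu * best_beta                # component-wise update
    fit += nu * designs[best] @ best_beta

print("fraction of observations below fitted 90% quantile:",
      np.mean(bmi <= fit).round(3))
print("risk-factor component coefficients:", coefs["risk"].round(2))
```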

  10. Risk prediction and aversion by anterior cingulate cortex.

    PubMed

    Brown, Joshua W; Braver, Todd S

    2007-12-01

    The recently proposed error-likelihood hypothesis suggests that anterior cingulate cortex (ACC) and surrounding areas will become active in proportion to the perceived likelihood of an error. The hypothesis was originally derived from a computational model prediction. The same computational model now makes a further prediction that ACC will be sensitive not only to predicted error likelihood, but also to the predicted magnitude of the consequences, should an error occur. The product of error likelihood and predicted error consequence magnitude collectively defines the general "expected risk" of a given behavior in a manner analogous but orthogonal to subjective expected utility theory. New fMRI results from an incentive change signal task now replicate the error-likelihood effect, validate the further predictions of the computational model, and suggest why some segments of the population may fail to show an error-likelihood effect. In particular, error-likelihood effects and expected risk effects in general indicate greater sensitivity to earlier predictors of errors and are seen in risk-averse but not risk-tolerant individuals. Taken together, the results are consistent with an expected risk model of ACC and suggest that ACC may generally contribute to cognitive control by recruiting brain activity to avoid risk.
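
    For concreteness, the "expected risk" quantity described above is simply the product of predicted error likelihood and predicted error consequence magnitude, as in the hypothetical comparison below.

```python
# Illustrative only: "expected risk" as the product of predicted error likelihood
# and predicted error consequence magnitude (both numbers are hypothetical).
def expected_risk(p_error: float, consequence_magnitude: float) -> float:
    return p_error * consequence_magnitude

# A frequent-but-mild error source vs. a rare-but-severe one:
print(expected_risk(0.40, 1.0))   # 0.4
print(expected_risk(0.05, 10.0))  # 0.5 -> slightly higher expected risk
```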

  11. Integration of Human Reliability Analysis Models into the Simulation-Based Framework for the Risk-Informed Safety Margin Characterization Toolkit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boring, Ronald; Mandelli, Diego; Rasmussen, Martin

    2016-06-01

    This report presents an application of a computation-based human reliability analysis (HRA) framework called the Human Unimodel for Nuclear Technology to Enhance Reliability (HUNTER). HUNTER has been developed not as a standalone HRA method but rather as a framework that ties together different HRA methods to model the dynamic risk of human activities as part of an overall probabilistic risk assessment (PRA). While we have adopted particular methods to build an initial model, the HUNTER framework is meant to be intrinsically flexible to new pieces that achieve particular modeling goals. In the present report, the HUNTER implementation has the following goals:

    • Integration with a high-fidelity thermal-hydraulic model capable of modeling nuclear power plant behaviors and transients
    • Consideration of a PRA context
    • Incorporation of a solid psychological basis for operator performance
    • Demonstration of a functional dynamic model of a plant upset condition and appropriate operator response

    This report outlines these efforts and presents the case study of a station blackout scenario to demonstrate the various modules developed to date under the HUNTER research umbrella.

  12. Computer model predicting breakthrough febrile urinary tract infection in children with primary vesicoureteral reflux.

    PubMed

    Arlen, Angela M; Alexander, Siobhan E; Wald, Moshe; Cooper, Christopher S

    2016-10-01

    Factors influencing the decision to surgically correct vesicoureteral reflux (VUR) include risk of breakthrough febrile urinary tract infection (fUTI) or renal scarring, and decreased likelihood of spontaneous resolution. Improved identification of children at risk for recurrent fUTI may impact management decisions and allow for more individualized VUR management. We have developed and investigated the accuracy of a multivariable computational model to predict the probability of breakthrough fUTI in children with primary VUR. Children with primary VUR and detailed clinical and voiding cystourethrogram (VCUG) data were identified. Patient demographics, VCUG findings including grade, laterality, and bladder volume at onset of VUR, UTI history, presence of bladder-bowel dysfunction (BBD), and breakthrough fUTI were assessed. The VCUG dataset was randomized into a training set of 288 with a separate representational cross-validation set of 96. Various model types and architectures were investigated using neUROn++, a set of C++ programs. Two hundred fifty-five children (208 girls, 47 boys) diagnosed with primary VUR at a mean age of 3.1 years (±2.6) met all inclusion criteria. A total of 384 VCUGs were analyzed. Median follow-up was 24 months (interquartile range 12-52 months). Sixty-eight children (26.7%) experienced 90 breakthrough fUTI events. Dilating VUR, reflux occurring at low bladder volumes, BBD, and a history of multiple infections/fUTI were associated with breakthrough fUTI (Table). A 2-hidden-node neural network model had the best fit, with a receiver operating characteristic curve area of 0.755 for predicting breakthrough fUTI. The risk of recurrent febrile infections, renal parenchymal scarring, and likelihood of spontaneous resolution, as well as parental preference, all influence management of primary VUR. The genesis of UTI is multifactorial, making precise prediction of an individual child's risk of breakthrough fUTI challenging. Demonstrated risk factors for UTI include age, gender, VUR grade, reflux at low bladder volume, BBD, and UTI history. We developed a prognostic calculator using a multivariable model with 76% accuracy that can be deployed on the Internet, allowing input variables to be entered to calculate the odds of an individual child developing a breakthrough fUTI. A computational model using multiple variables, including bladder volume at onset of VUR, provides individualized prediction of children at risk for breakthrough fUTI. A web-based prognostic calculator based on this model will provide a useful tool for assessing personalized risk of breakthrough fUTI in children with primary VUR. Copyright © 2016 Journal of Pediatric Urology Company. Published by Elsevier Ltd. All rights reserved.
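
    The published model was built with neUROn++ in C++; the sketch below shows an analogous, deliberately small network (one hidden layer with two nodes) fitted and evaluated by ROC area with scikit-learn. The predictor names mirror the risk factors listed above, but the data are simulated and the coefficients used to generate them are arbitrary assumptions.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import roc_auc_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(4)
n = 384  # one row per VCUG, mirroring the dataset size above

# Simulated predictors (stand-ins for the clinical variables named above).
grade = rng.integers(1, 6, n)                 # VUR grade I-V
pct_volume = rng.uniform(0, 100, n)           # % expected bladder capacity at VUR onset
bbd = rng.binomial(1, 0.3, n)                 # bladder-bowel dysfunction
prior_futi = rng.binomial(1, 0.4, n)          # history of febrile UTI

# Hypothetical generating model for the outcome (breakthrough fUTI).
logit = -3 + 0.5 * grade - 0.02 * pct_volume + 0.8 * bbd + 1.0 * prior_futi
y = rng.binomial(1, 1 / (1 + np.exp(-logit)))
X = np.column_stack([grade, pct_volume, bbd, prior_futi])

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25,
                                          random_state=0, stratify=y)

# A deliberately small network: a single hidden layer with two nodes.
model = make_pipeline(StandardScaler(),
                      MLPClassifier(hidden_layer_sizes=(2,), max_iter=5000,
                                    random_state=0))
model.fit(X_tr, y_tr)
auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
print(f"ROC AUC on held-out VCUGs: {auc:.3f}")
```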

  13. A Cloud Computing Approach to Personal Risk Management: The Open Hazards Group

    NASA Astrophysics Data System (ADS)

    Graves, W. R.; Holliday, J. R.; Rundle, J. B.

    2010-12-01

    According to the California Earthquake Authority, only about 12% of current California residences are covered by any form of earthquake insurance, down from about 30% in 1996 following the 1994 M6.7 Northridge earthquake. Part of the reason for this decreasing rate of insurance uptake is the high deductible, either 10% or 15% of the value of the structure, and the relatively high cost of the premiums, as much as thousands of dollars per year. The earthquake insurance industry is composed of the CEA, a public-private partnership; modeling companies that produce damage and loss models similar to the FEMA HAZUS model; and financial companies such as the insurance, reinsurance, and investment banking companies in New York, London, the Cayman Islands, Zurich, Dubai, Singapore, and elsewhere. In setting earthquake insurance rates, financial companies rely on models like HAZUS that calculate risk and exposure. In California, the process begins with an official earthquake forecast by the Working Group on California Earthquake Probabilities. Modeling companies use these 30-year earthquake probabilities as inputs to their attenuation and damage models to estimate the possible damage factors from scenario earthquakes. Economic loss is then estimated from processes such as structural failure, lost economic activity, demand surge, and fire following the earthquake. Once the potential losses are known, rates can be set so that a target ruin probability of less than about 1% can be assured. Open Hazards Group was founded with the idea that the global public might be interested in a personal estimate of earthquake risk, computed using data supplied by the public, with models running in a cloud computing environment. These models process data from the ANSS catalog, updated at least daily, to produce rupture forecasts that are backtested with standard Reliability/Attributes and Receiver Operating Characteristic tests, among others. Models for attenuation and structural damage are then used in a computationally efficient workflow to produce real-time estimates of damage and loss for individual structures. All models are based on techniques that either have been published in the literature or will soon be published. Using these results, members of the public can gain an appreciation of their risk of exposure to damage from destructive earthquakes, information that has heretofore been available only to a few members of the financial and insurance industries.

  14. Human In Silico Drug Trials Demonstrate Higher Accuracy than Animal Models in Predicting Clinical Pro-Arrhythmic Cardiotoxicity.

    PubMed

    Passini, Elisa; Britton, Oliver J; Lu, Hua Rong; Rohrbacher, Jutta; Hermans, An N; Gallacher, David J; Greig, Robert J H; Bueno-Orovio, Alfonso; Rodriguez, Blanca

    2017-01-01

    Early prediction of cardiotoxicity is critical for drug development. Current animal models raise ethical and translational questions, and have limited accuracy in clinical risk prediction. Human-based computer models constitute a fast, cheap and potentially effective alternative to experimental assays, also facilitating translation to human. Key challenges include consideration of inter-cellular variability in drug responses and integration of computational and experimental methods in safety pharmacology. Our aim is to evaluate the ability of in silico drug trials in populations of human action potential (AP) models to predict clinical risk of drug-induced arrhythmias based on ion channel information, and to compare simulation results against experimental assays commonly used for drug testing. A control population of 1,213 human ventricular AP models in agreement with experimental recordings was constructed. In silico drug trials were performed for 62 reference compounds at multiple concentrations, using pore-block drug models (IC50/Hill coefficient). Drug-induced changes in AP biomarkers were quantified, together with occurrence of repolarization/depolarization abnormalities. Simulation results were used to predict clinical risk based on reports of Torsade de Pointes arrhythmias, and further evaluated in a subset of compounds through comparison with electrocardiograms from rabbit wedge preparations and Ca2+-transient recordings in human induced pluripotent stem cell-derived cardiomyocytes (hiPS-CMs). Drug-induced changes in silico vary in magnitude depending on the specific ionic profile of each model in the population, thus making it possible to identify cell sub-populations at higher risk of developing abnormal AP phenotypes. Models with low repolarization reserve (increased Ca2+/late Na+ currents and Na+/Ca2+ exchanger, reduced Na+/K+ pump) are highly vulnerable to drug-induced repolarization abnormalities, while those with reduced inward current density (fast/late Na+ and Ca2+ currents) exhibit high susceptibility to depolarization abnormalities. Repolarization abnormalities in silico predict clinical risk for all compounds with 89% accuracy. Drug-induced changes in biomarkers are in overall agreement across different assays: in silico AP duration changes reflect those observed in the rabbit QT interval and hiPS-CM Ca2+ transients, and simulated upstroke velocity captures variations in the rabbit QRS complex. Our results demonstrate that human in silico drug trials constitute a powerful methodology for prediction of clinical pro-arrhythmic cardiotoxicity, ready for integration in the existing drug safety assessment pipelines.
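
    The pore-block drug model referenced above scales each ionic current's conductance by the fraction of channels left unblocked at a given drug concentration, using the measured IC50 and Hill coefficient. A minimal sketch, with hypothetical IC50 values for two currents:

```python
def unblocked_fraction(concentration_uM, ic50_uM, hill=1.0):
    """Fraction of channels not blocked under the simple pore-block model:
    block = D^h / (D^h + IC50^h), so the conductance is scaled by 1 - block."""
    d, ic = float(concentration_uM), float(ic50_uM)
    block = d**hill / (d**hill + ic**hill)
    return 1.0 - block

# Hypothetical compound: hERG IC50 = 1 uM (h = 1), late-Na IC50 = 30 uM (h = 1.2).
for dose in (0.1, 1.0, 10.0):   # multiples of an assumed effective plasma concentration
    g_kr = unblocked_fraction(dose, ic50_uM=1.0)
    g_nal = unblocked_fraction(dose, ic50_uM=30.0, hill=1.2)
    print(f"{dose:5.1f} uM: IKr scaled by {g_kr:.2f}, late INa scaled by {g_nal:.2f}")
```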

  15. Assessing and predicting drug-induced anticholinergic risks: an integrated computational approach.

    PubMed

    Xu, Dong; Anderson, Heather D; Tao, Aoxiang; Hannah, Katia L; Linnebur, Sunny A; Valuck, Robert J; Culbertson, Vaughn L

    2017-11-01

    Anticholinergic (AC) adverse drug events (ADEs) are caused by inhibition of muscarinic receptors as a result of designated or off-target drug-receptor interactions. In practice, AC toxicity is assessed primarily based on clinician experience. The goal of this study was to evaluate a novel concept of integrating big pharmacological and healthcare data to assess clinical AC toxicity risks. AC toxicity scores (ATSs) were computed using drug-receptor inhibitions identified through pharmacological data screening. A longitudinal retrospective cohort study using medical claims data was performed to quantify AC clinical risks. ATS was compared with two previously reported toxicity measures. A quantitative structure-activity relationship (QSAR) model was established for rapid assessment and prediction of AC clinical risks. A total of 25 common medications, and 575,228 exposed and unexposed patients were analyzed. Our data indicated that ATS is more consistent with the trend of AC outcomes than other toxicity methods. Incorporating drug pharmacokinetic parameters into ATS yielded a QSAR model with excellent correlation to AC incident rate (R² = 0.83) and predictive performance (cross-validation Q² = 0.64). Good correlation and predictive performance (R² = 0.68 / Q² = 0.29) were also obtained for an M2 receptor-specific QSAR model and tachycardia, an M2 receptor-specific ADE. Albeit using a small medication sample size, our pilot data demonstrated the potential and feasibility of a new computational AC toxicity scoring approach driven by underlying pharmacology and big data analytics. Follow-up work is under way to further develop the ATS scoring approach and clinical toxicity predictive model using a large number of medications and clinical parameters.
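
    A schematic illustration of the scoring idea is given below: a composite anticholinergic toxicity score is formed as a weighted sum of muscarinic (M1-M5) receptor inhibition, and observed adverse-event incidence is then regressed on that score. The receptor weights, inhibition values and incidence rates are all invented for illustration and do not reproduce the published ATS or QSAR models.

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical receptor-inhibition data: fraction of M1-M5 muscarinic receptors
# inhibited at a typical therapeutic exposure, one row per drug.
drugs = [f"drug_{i}" for i in range(25)]
inhibition = rng.uniform(0, 1, size=(25, 5))          # columns: M1..M5
weights = np.array([0.3, 0.25, 0.2, 0.15, 0.1])       # assumed receptor weights

# Composite anticholinergic toxicity score (ATS): weighted receptor inhibition.
ats = inhibition @ weights

# Simulated AC adverse-event incidence rate, loosely driven by the score.
incidence = 2.0 + 8.0 * ats + rng.normal(0, 0.5, 25)

# Simple QSAR-style linear fit of incidence on ATS, with R^2.
X = np.column_stack([np.ones(25), ats])
beta, *_ = np.linalg.lstsq(X, incidence, rcond=None)
pred = X @ beta
r2 = 1 - np.sum((incidence - pred) ** 2) / np.sum((incidence - incidence.mean()) ** 2)
print(f"fitted incidence = {beta[0]:.2f} + {beta[1]:.2f} * ATS,  R^2 = {r2:.2f}")
```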

  16. Stochastic multi-objective model for optimal energy exchange optimization of networked microgrids with presence of renewable generation under risk-based strategies.

    PubMed

    Gazijahani, Farhad Samadi; Ravadanegh, Sajad Najafi; Salehi, Javad

    2018-02-01

    The inherent volatility and unpredictability of renewable generation and load demand pose considerable challenges for energy exchange optimization of microgrids (MGs). To address these challenges, this paper proposes a new risk-based multi-objective energy exchange optimization for networked MGs from economic and reliability standpoints under load consumption and renewable power generation uncertainties. Three risk-based strategies are distinguished using the conditional value at risk (CVaR) approach. The proposed model is specified with two distinct objective functions. The first minimizes operation and maintenance costs, the cost of power transactions between the upstream network and the MGs, and the cost of power losses, whereas the second minimizes the energy not supplied (ENS). Furthermore, a stochastic scenario-based approach is incorporated to handle the uncertainty, and the Kantorovich distance scenario reduction method is implemented to reduce the computational burden. Finally, the non-dominated sorting genetic algorithm (NSGA-II) is applied to minimize both objective functions simultaneously, and the best solution is extracted by a fuzzy satisfying method with respect to the risk-based strategies. To demonstrate the performance of the proposed model, it is applied to the modified IEEE 33-bus distribution system, and the results show that the presented approach is an efficient tool for energy exchange optimization of MGs.
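
    The CVaR machinery underlying the risk-based strategies can be summarized compactly: given discrete cost scenarios and their probabilities, the value at risk (VaR) is the alpha-quantile of cost and the CVaR is the expected cost in the worst (1 - alpha) tail. A minimal sketch with hypothetical dispatch-cost scenarios:

```python
import numpy as np

def cvar(costs, probs, alpha=0.95):
    """Value at risk and conditional value at risk of a discrete scenario
    cost distribution: CVaR is the expected cost in the worst (1 - alpha) tail."""
    order = np.argsort(costs)
    c, p = np.asarray(costs)[order], np.asarray(probs)[order]
    cdf = np.cumsum(p)
    var_idx = np.searchsorted(cdf, alpha)            # scenario at the alpha-quantile (VaR)
    # Probability mass attributed to the tail, scenario by scenario.
    tail_p = np.clip(cdf - alpha, 0, None)
    tail_mass = np.diff(np.concatenate([[0.0], tail_p]))
    return c[var_idx], float(tail_mass @ c) / (1 - alpha)

# Hypothetical operation-cost scenarios for one networked-microgrid dispatch.
rng = np.random.default_rng(6)
scenario_costs = rng.normal(1000, 150, 200) + rng.gamma(2, 50, 200)  # $/day
scenario_probs = np.full(200, 1 / 200)

var95, cvar95 = cvar(scenario_costs, scenario_probs, alpha=0.95)
print(f"expected cost {scenario_costs.mean():.0f}, VaR95 {var95:.0f}, CVaR95 {cvar95:.0f}")
```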

  17. Public health risk management case concerning the city of Isfahan according to a hypothetical release of HF from a chemical plant.

    PubMed

    Azari, Mansour R; Sadighzadeh, Asghar; Bayatian, Majid

    2018-06-19

    Accidents have happened in chemical industries all over the world, with serious consequences for adjacent heavily populated areas. In this study, the impact of a hypothetical event, the release of a considerable amount of hydrogen fluoride (HF), a strong irritant, into the atmosphere over the city of Isfahan from a strategic chemical plant, was simulated by computational fluid dynamics (CFD). In this model, the meteorological parameters were integrated over time and space, and dispersion of the pollutant was estimated for a probable accidental release of HF. According to the simulation results, HF clouds reached Isfahan in 20 min, exposed 80% of the general public to HF concentrations in the range of 0-34 ppm, and dissipated 240 min after the time of the incident. Assuming a uniform population density in the proximity of Isfahan, with a population of 1.75 million, 5% of the population (87,500 people) could be exposed for a few minutes to an HF concentration as high as 34 ppm. This exceeds the concentration considered Immediately Dangerous to Life and Health (30 ppm). This hypothetical evaluation of environmental exposure to HF and its potential health risks is instructive for risk management on behalf of the general public of Isfahan. Similar studies based on probable accidental scenarios, together with simulation models for the computation of dispersed pollutants, are recommended for risk evaluation and management in cities of developing countries experiencing rapid urbanization around industrial sites.

  18. Examining Equivalency of the Driver Risk Inventory Test Versions: Does It Matter Which Version I Use?

    ERIC Educational Resources Information Center

    Degiorgio, Lisa

    2015-01-01

    Equivalency of test versions is often assumed by counselors and evaluators. This study examined two versions, paper-pencil and computer based, of the Driver Risk Inventory, a DUI/DWI (driving under the influence/driving while intoxicated) risk assessment. An overview of computer-based testing and standards for equivalency is also provided. Results…

  19. Nigerian Library Staff and Their Perceptions of Health Risks Posed by Using Computer-Based Systems in University Libraries

    ERIC Educational Resources Information Center

    Uwaifo, Stephen Osahon

    2008-01-01

    Purpose: The paper seeks to examine the health risks faced when using computer-based systems by library staff in Nigerian libraries. Design/methodology/approach: The paper uses a survey research approach to carry out this investigation. Findings: The investigation reveals that the perceived health risk does not predict perceived ease of use of…

  20. Development and application of a complex numerical model and software for the computation of dose conversion factors for radon progenies.

    PubMed

    Farkas, Árpád; Balásházy, Imre

    2015-04-01

    A more exact determination of dose conversion factors associated with radon progeny inhalation has become possible due to advancements in epidemiological health risk estimates in recent years. Growing computational power and the development of numerical techniques allow dose conversion factors to be computed with increasing reliability. The objective of this study was to develop an integrated model and software, based on a self-developed airway deposition code, the authors' own bronchial dosimetry model and the computational methods accepted by the International Commission on Radiological Protection (ICRP), to calculate dose conversion coefficients for different exposure conditions. The model was tested by applying it to exposure and breathing conditions characteristic of mines and homes. The dose conversion factors were 8 and 16 mSv WLM(-1) for homes and mines when applying a stochastic deposition model combined with the ICRP dosimetry model (named the PM-A model), and 9 and 17 mSv WLM(-1) when applying the same deposition model combined with the authors' bronchial dosimetry model and the ICRP bronchiolar and alveolar-interstitial dosimetry model (called the PM-B model). User-friendly software for the computation of dose conversion factors has also been developed. The software allows one to compute conversion factors for a large range of exposure and breathing parameters and to perform sensitivity analyses. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  1. Computer-based video analysis identifies infants with absence of fidgety movements.

    PubMed

    Støen, Ragnhild; Songstad, Nils Thomas; Silberg, Inger Elisabeth; Fjørtoft, Toril; Jensenius, Alexander Refsum; Adde, Lars

    2017-10-01

    Background: Absence of fidgety movements (FMs) at 3 months' corrected age is a strong predictor of cerebral palsy (CP) in high-risk infants. This study evaluates the association between computer-based video analysis and the temporal organization of FMs assessed with the General Movement Assessment (GMA). Methods: Infants were eligible for this prospective cohort study if referred to a high-risk follow-up program in a participating hospital. Video recordings taken at 10-15 weeks post-term age were used for GMA and computer-based analysis. The variation of the spatial center of motion, derived from differences between subsequent video frames, was used for quantitative analysis. Results: Of 241 recordings from 150 infants, 48 (24.1%) were classified with absent or sporadic FMs using the GMA. The variation of the spatial center of motion (C_SD) during a recording was significantly lower in infants with normal FMs (0.320; 95% confidence interval (CI) 0.309, 0.330) than in those with absent or sporadic FMs (0.380; 95% CI 0.361, 0.398) (P<0.001). A triage model with C_SD thresholds chosen for a sensitivity of 90% and a specificity of 80% gave a 40% referral rate for GMA. Conclusion: Quantitative video analysis during the FM period can be used to triage infants at high risk of CP to early intervention or observational GMA.
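
    The quantitative feature described above is the centroid of frame-to-frame pixel differences and the variability of its position over a recording. The sketch below computes such a C_SD-like index for a synthetic video; the frame size, the moving bright patch standing in for the infant, and the normalization are illustrative assumptions, not the clinical software's implementation.

```python
import numpy as np

rng = np.random.default_rng(7)

def centroid_of_motion(prev, curr, eps=1e-9):
    """Centroid (x, y) of the absolute difference between consecutive frames."""
    diff = np.abs(curr.astype(float) - prev.astype(float))
    total = diff.sum() + eps
    ys, xs = np.mgrid[0:diff.shape[0], 0:diff.shape[1]]
    return (xs * diff).sum() / total, (ys * diff).sum() / total

# Synthetic video: 300 frames of 120x160 noise with a small moving bright patch.
frames = rng.integers(0, 20, size=(300, 120, 160)).astype(float)
for t in range(300):
    y, x = 60 + int(20 * np.sin(t / 10)), 80 + int(30 * np.cos(t / 15))
    frames[t, y:y + 5, x:x + 5] += 200

centroids = np.array([centroid_of_motion(frames[t - 1], frames[t])
                      for t in range(1, len(frames))])

# Variation of the spatial centre of motion over the recording (cf. C_SD above),
# here normalized by the frame dimensions.
c_sd = np.mean(np.std(centroids / [160, 120], axis=0))
print(f"C_SD-like index for this synthetic recording: {c_sd:.3f}")
```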

  2. Aqueduct: a methodology to measure and communicate global water risks

    NASA Astrophysics Data System (ADS)

    Gassert, Francis; Reig, Paul

    2013-04-01

    The Aqueduct Water Risk Atlas (Aqueduct) is a publicly available, global database and interactive tool that maps indicators of water related risks for decision makers worldwide. Aqueduct makes use of the latest geo-statistical modeling techniques to compute a composite index and translate the most recently available hydrological data into practical information on water related risks for companies, investors, and governments alike. Twelve global indicators are grouped into a Water Risk Framework designed in response to the growing concerns from private sector actors around water scarcity, water quality, climate change, and increasing demand for freshwater. The Aqueduct framework organizes indicators into three categories of risk that bring together multiple dimensions of water related risk into comprehensive aggregated scores and includes indicators of water stress, variability in supply, storage, flood, drought, groundwater, water quality and social conflict, addressing both spatial and temporal variation in water hazards. Indicators are selected based on relevance to water users, availability and robustness of global data sources, and expert consultation, and are collected from existing datasets or derived from a Global Land Data Assimilation System (GLDAS) based integrated water balance model. Indicators are normalized using a threshold approach, and composite scores are computed using a linear aggregation scheme that allows for dynamic weighting to capture users' unique exposure to water hazards. By providing consistent scores across the globe, the Aqueduct Water Risk Atlas enables rapid comparison across diverse aspects of water risk. Companies can use this information to prioritize actions, investors to leverage financial interest to improve water management, and governments to engage with the private sector to seek solutions for more equitable and sustainable water governance. The Aqueduct Water Risk Atlas enables practical applications of scientific data, helping non-expert audiences better understand and evaluate risks facing water users. This presentation will discuss the methodology used to combine the indicator values into aggregated risk scores and lessons learned from working with diverse audiences in academia, development institutions, and the public and private sectors.
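
    The scoring mechanics described above can be sketched as two steps: raw indicator values are mapped onto a common 0-5 scale using category thresholds, and the normalized scores are combined by a user-weighted linear aggregation. The indicators, thresholds and weights below are invented placeholders, not the published Aqueduct values.

```python
import numpy as np

def normalize(value, thresholds):
    """Map a raw indicator value onto a 0-5 risk score using category thresholds.
    `thresholds` are the upper bounds of the score bands 0-1, 1-2, ..., 4-5."""
    return float(np.interp(value, thresholds, np.arange(1, len(thresholds) + 1),
                           left=0.0, right=5.0))

# Hypothetical raw indicator values for one catchment.
raw = {"baseline_water_stress": 0.62,   # withdrawals / available supply
       "interannual_variability": 0.48,
       "flood_occurrence": 11.0,        # events per decade
       "drought_severity": 0.35}

thresholds = {"baseline_water_stress": [0.1, 0.2, 0.4, 0.8, 1.0],
              "interannual_variability": [0.25, 0.5, 0.75, 1.0, 1.5],
              "flood_occurrence": [1, 3, 9, 27, 81],
              "drought_severity": [0.2, 0.4, 0.6, 0.8, 1.0]}

# Dynamic weights express a particular user's exposure to each hazard.
weights = {"baseline_water_stress": 0.4, "interannual_variability": 0.2,
           "flood_occurrence": 0.25, "drought_severity": 0.15}

scores = {k: normalize(v, thresholds[k]) for k, v in raw.items()}
composite = sum(weights[k] * scores[k] for k in scores) / sum(weights.values())
print(scores)
print(f"aggregated water risk score: {composite:.2f} (0 = low, 5 = extreme)")
```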

  3. Computational mechanics research and support for aerodynamics and hydraulics at TFHRC, year 2 quarter 1 progress report.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lottes, S.A.; Bojanowski, C.; Shen, J.

    2012-04-09

    The computational fluid dynamics (CFD) and computational structural mechanics (CSM) focus areas at Argonne's Transportation Research and Analysis Computing Center (TRACC) initiated a project to support and complement the experimental programs at the Turner-Fairbank Highway Research Center (TFHRC) with high performance computing based analysis capabilities in August 2010. The project was established with a new interagency agreement between the Department of Energy and the Department of Transportation to provide collaborative research, development, and benchmarking of advanced three-dimensional computational mechanics analysis methods to the aerodynamics and hydraulics laboratories at TFHRC for a period of five years, beginning in October 2010. The analysis methods employ well-benchmarked and supported commercial computational mechanics software. Computational mechanics encompasses the areas of Computational Fluid Dynamics (CFD), Computational Wind Engineering (CWE), Computational Structural Mechanics (CSM), and Computational Multiphysics Mechanics (CMM) applied in Fluid-Structure Interaction (FSI) problems. The major areas of focus of the project are wind and water effects on bridges - superstructure, deck, cables, and substructure (including soil), primarily during storms and flood events - and the risks that these loads pose to structural failure. For flood events at bridges, another major focus of the work is assessment of the risk to bridges caused by scour of stream and riverbed material away from the foundations of a bridge. Other areas of current research include modeling of flow through culverts to improve design allowing for fish passage, modeling of the salt spray transport into bridge girders to address suitability of using weathering steel in bridges, and CFD analysis of the operation of the wind tunnel in the TFHRC wind engineering laboratory. This quarterly report documents technical progress on the project tasks for the period of October through December 2011.

  4. Computational mechanics research and support for aerodynamics and hydraulics at TFHRC, year 2 quarter 2 progress report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lottes, S.A.; Bojanowski, C.; Shen, J.

    2012-06-28

    The computational fluid dynamics (CFD) and computational structural mechanics (CSM) focus areas at Argonne's Transportation Research and Analysis Computing Center (TRACC) initiated a project to support and complement the experimental programs at the Turner-Fairbank Highway Research Center (TFHRC) with high performance computing based analysis capabilities in August 2010. The project was established with a new interagency agreement between the Department of Energy and the Department of Transportation to provide collaborative research, development, and benchmarking of advanced three-dimensional computational mechanics analysis methods to the aerodynamics and hydraulics laboratories at TFHRC for a period of five years, beginning in October 2010. The analysis methods employ well-benchmarked and supported commercial computational mechanics software. Computational mechanics encompasses the areas of Computational Fluid Dynamics (CFD), Computational Wind Engineering (CWE), Computational Structural Mechanics (CSM), and Computational Multiphysics Mechanics (CMM) applied in Fluid-Structure Interaction (FSI) problems. The major areas of focus of the project are wind and water effects on bridges - superstructure, deck, cables, and substructure (including soil), primarily during storms and flood events - and the risks that these loads pose to structural failure. For flood events at bridges, another major focus of the work is assessment of the risk to bridges caused by scour of stream and riverbed material away from the foundations of a bridge. Other areas of current research include modeling of flow through culverts to improve design allowing for fish passage, modeling of the salt spray transport into bridge girders to address suitability of using weathering steel in bridges, and CFD analysis of the operation of the wind tunnel in the TFHRC wind engineering laboratory. This quarterly report documents technical progress on the project tasks for the period of January through March 2012.

  5. Computational mechanics research and support for aerodynamics and hydraulics at TFHRC, year 1 quarter 3 progress report.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lottes, S.A.; Kulak, R.F.; Bojanowski, C.

    2011-08-26

    The computational fluid dynamics (CFD) and computational structural mechanics (CSM) focus areas at Argonne's Transportation Research and Analysis Computing Center (TRACC) initiated a project to support and complement the experimental programs at the Turner-Fairbank Highway Research Center (TFHRC) with high performance computing based analysis capabilities in August 2010. The project was established with a new interagency agreement between the Department of Energy and the Department of Transportation to provide collaborative research, development, and benchmarking of advanced three-dimensional computational mechanics analysis methods to the aerodynamics and hydraulics laboratories at TFHRC for a period of five years, beginning in October 2010. The analysis methods employ well-benchmarked and supported commercial computational mechanics software. Computational mechanics encompasses the areas of Computational Fluid Dynamics (CFD), Computational Wind Engineering (CWE), Computational Structural Mechanics (CSM), and Computational Multiphysics Mechanics (CMM) applied in Fluid-Structure Interaction (FSI) problems. The major areas of focus of the project are wind and water loads on bridges - superstructure, deck, cables, and substructure (including soil), primarily during storms and flood events - and the risks that these loads pose to structural failure. For flood events at bridges, another major focus of the work is assessment of the risk to bridges caused by scour of stream and riverbed material away from the foundations of a bridge. Other areas of current research include modeling of flow through culverts to assess them for fish passage, modeling of the salt spray transport into bridge girders to address suitability of using weathering steel in bridges, vehicle stability under high wind loading, and the use of electromagnetic shock absorbers to improve vehicle stability under high wind conditions. This quarterly report documents technical progress on the project tasks for the period of April through June 2011.

  6. Integrating emerging earth science technologies into disaster risk management: an enterprise architecture approach

    NASA Astrophysics Data System (ADS)

    Evans, J. D.; Hao, W.; Chettri, S. R.

    2014-12-01

    Disaster risk management has grown to rely on earth observations, multi-source data analysis, numerical modeling, and interagency information sharing. The practice and outcomes of disaster risk management will likely undergo further change as several emerging earth science technologies come of age: mobile devices; location-based services; ubiquitous sensors; drones; small satellites; satellite direct readout; Big Data analytics; cloud computing; Web services for predictive modeling, semantic reconciliation, and collaboration; and many others. Integrating these new technologies well requires developing and adapting them to meet current needs; but also rethinking current practice to draw on new capabilities to reach additional objectives. This requires a holistic view of the disaster risk management enterprise and of the analytical or operational capabilities afforded by these technologies. One helpful tool for this assessment, the GEOSS Architecture for the Use of Remote Sensing Products in Disaster Management and Risk Assessment (Evans & Moe, 2013), considers all phases of the disaster risk management lifecycle for a comprehensive set of natural hazard types, and outlines common clusters of activities and their use of information and computation resources. We are using these architectural views, together with insights from current practice, to highlight effective, interrelated roles for emerging earth science technologies in disaster risk management. These roles may be helpful in creating roadmaps for research and development investment at national and international levels.

  7. Arrhythmic risk biomarkers for the assessment of drug cardiotoxicity: from experiments to computer simulations

    PubMed Central

    Corrias, A.; Jie, X.; Romero, L.; Bishop, M. J.; Bernabeu, M.; Pueyo, E.; Rodriguez, B.

    2010-01-01

    In this paper, we illustrate how advanced computational modelling and simulation can be used to investigate drug-induced effects on cardiac electrophysiology and on specific biomarkers of pro-arrhythmic risk. To do so, we first perform a thorough literature review of proposed arrhythmic risk biomarkers from the ionic to the electrocardiogram levels. The review highlights the variety of proposed biomarkers, the complexity of the mechanisms of drug-induced pro-arrhythmia and the existence of significant animal species differences in drug-induced effects on cardiac electrophysiology. Predicting drug-induced pro-arrhythmic risk solely using experiments is challenging both preclinically and clinically, as attested by the rise in the cost of releasing new compounds to the market. Computational modelling and simulation has significantly contributed to the understanding of cardiac electrophysiology and arrhythmias over the last 40 years. In the second part of this paper, we illustrate how state-of-the-art open source computational modelling and simulation tools can be used to simulate multi-scale effects of drug-induced ion channel block in ventricular electrophysiology at the cellular, tissue and whole ventricular levels for different animal species. We believe that the use of computational modelling and simulation in combination with experimental techniques could be a powerful tool for the assessment of drug safety pharmacology. PMID:20478918

  8. Image based Monte Carlo Modeling for Computational Phantom

    NASA Astrophysics Data System (ADS)

    Cheng, Mengyun; Wang, Wen; Zhao, Kai; Fan, Yanchang; Long, Pengcheng; Wu, Yican

    2014-06-01

    Evaluating the effects of ionizing radiation and the risk of radiation exposure to the human body has become one of the most important issues in the radiation protection and radiotherapy fields, as it helps to avoid unnecessary radiation and decrease harm to the human body. To accurately evaluate the dose to the human body, it is necessary to construct more realistic computational phantoms. However, manual description and verification of models for Monte Carlo (MC) simulation are tedious, error-prone and time-consuming. In addition, it is difficult to locate and fix geometry errors, and difficult to describe material information and assign it to cells. MCAM (CAD/Image-based Automatic Modeling Program for Neutronics and Radiation Transport Simulation) was developed by the FDS Team (Advanced Nuclear Energy Research Team, http://www.fds.org.cn) as an interface program to achieve both CAD- and image-based automatic modeling. The advanced version (Version 6) of MCAM can automatically convert CT/segmented sectioned images into computational phantoms such as MCNP models. The image-based automatic modeling program (MCAM 6.0) has been tested on several medical and sectioned image sets and has been applied in the construction of Rad-HUMAN. Following manual segmentation and 3D reconstruction, a whole-body computational phantom of a Chinese adult female, called Rad-HUMAN, was created using MCAM 6.0 from sectioned images of a Chinese visible human dataset. Rad-HUMAN contains 46 organs/tissues, which faithfully represent the average anatomical characteristics of the Chinese female. The dose conversion coefficients (Dt/Ka) from kerma free-in-air to absorbed dose were calculated for Rad-HUMAN. Rad-HUMAN can be applied to predict and evaluate dose distributions in a treatment planning system (TPS), as well as radiation exposure of the human body in radiation protection.

  9. Patch-based models and algorithms for image processing: a review of the basic principles and methods, and their application in computed tomography.

    PubMed

    Karimi, Davood; Ward, Rabab K

    2016-10-01

    Image models are central to all image processing tasks. The great advancements in digital image processing would not have been possible without powerful models which, themselves, have evolved over time. In the past decade, "patch-based" models have emerged as one of the most effective models for natural images. Patch-based methods have outperformed other competing methods in many image processing tasks. These developments have come at a time when greater availability of powerful computational resources and growing concerns over the health risks of ionizing radiation encourage research on image processing algorithms for computed tomography (CT). The goal of this paper is to explain the principles of patch-based methods and to review some of their recent applications in CT. We first review the central concepts in patch-based image processing and explain some of the state-of-the-art algorithms, with a focus on aspects that are more relevant to CT. Then, we review some of the recent applications of patch-based methods in CT. Patch-based methods have already transformed the field of image processing, leading to state-of-the-art results in many applications. More recently, several studies have proposed patch-based algorithms for various image processing tasks in CT, from denoising and restoration to iterative reconstruction. Although these studies have reported good results, the true potential of patch-based methods for CT has not yet been fully appreciated. Patch-based methods can play a central role in image reconstruction and processing for CT. They have the potential to lead to substantial improvements in the current state of the art.
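
    To make the principle concrete, the sketch below implements a very small non-local-means-style patch denoiser: each pixel is replaced by a weighted average of pixels whose surrounding patches are similar. It illustrates the generic patch-based idea on a synthetic image and is not any specific CT algorithm from the reviewed literature.

```python
import numpy as np

rng = np.random.default_rng(8)

def patch_denoise(img, patch=5, search=7, h=20.0):
    """Tiny non-local-means-style denoiser: each pixel becomes a weighted average
    of nearby pixels whose surrounding patches look similar."""
    pad = patch // 2 + search // 2
    padded = np.pad(img, pad, mode="reflect")
    out = np.zeros_like(img, dtype=float)
    pr = patch // 2
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            ci, cj = i + pad, j + pad
            ref = padded[ci - pr:ci + pr + 1, cj - pr:cj + pr + 1]
            weights, values = [], []
            for di in range(-(search // 2), search // 2 + 1):
                for dj in range(-(search // 2), search // 2 + 1):
                    ni, nj = ci + di, cj + dj
                    cand = padded[ni - pr:ni + pr + 1, nj - pr:nj + pr + 1]
                    d2 = np.mean((ref - cand) ** 2)       # patch similarity
                    weights.append(np.exp(-d2 / h**2))
                    values.append(padded[ni, nj])
            out[i, j] = np.dot(weights, values) / np.sum(weights)
    return out

# Tiny synthetic "CT slice": a bright disc plus noise.
yy, xx = np.mgrid[0:64, 0:64]
clean = 100.0 * ((yy - 32) ** 2 + (xx - 32) ** 2 < 15**2)
noisy = clean + rng.normal(0, 20, clean.shape)
denoised = patch_denoise(noisy)
print("RMSE noisy   :", np.sqrt(np.mean((noisy - clean) ** 2)).round(1))
print("RMSE denoised:", np.sqrt(np.mean((denoised - clean) ** 2)).round(1))
```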

  10. Multi-stakeholder decision analysis and comparative risk assessment for reuse-recycle oriented e-waste management strategies: a game theoretic approach.

    PubMed

    Kaushal, Rajendra Kumar; Nema, Arvind K

    2013-09-01

    This article deals with assessment of the potential health risk posed by carcinogenic and non-carcinogenic substances, namely lead (Pb), cadmium (Cd), copper, chromium (CrVI), zinc, nickel and mercury, present in e-waste. A multi-objective, multi-stakeholder approach based on a strategic game theory model has been developed considering cost as well as human health risk. The trade-off due to the cost difference between a hazardous substances-free (HSF) and a hazardous substance (HS)-containing desktop computer, and the risk posed by them at the time of disposal, has been analyzed. The cancer risk due to dust inhalation for workers at a recycling site in Bangalore for Pb, Cr(VI) and Cd was found to be 4, 33 and 101 in 1 million respectively. Pb and Cr(VI) result in a very high risk owing to dust ingestion at slums near the recycling site: 175 and 81 in 1 million for children, and 24 and 11 in 1 million for adults respectively. The concentration of Pb at a battery workshop in Mayapuri, Delhi (hazard quotient = 3.178) was found to pose adverse health hazards. The government may impose an appropriate penalty on the land disposal of computer waste and/or may give an incentive to manufacturers for producing HSF computers through, for example, relaxing taxes, but there should be no such incentive for manufacturing HS-containing computers.

  11. Portals for Real-Time Earthquake Data and Forecasting: Challenge and Promise (Invited)

    NASA Astrophysics Data System (ADS)

    Rundle, J. B.; Holliday, J. R.; Graves, W. R.; Feltstykket, R.; Donnellan, A.; Glasscoe, M. T.

    2013-12-01

    Earthquake forecasts have been computed by a variety of countries world-wide for over two decades. For the most part, forecasts have been computed for insurance, reinsurance and underwriters of catastrophe bonds. However, recent events clearly demonstrate that mitigating personal risk is becoming the responsibility of individual members of the public. Open access to a variety of web-based forecasts, tools, utilities and information is therefore required. Portals for data and forecasts present particular challenges, and require the development of both apps and the client/server architecture to deliver the basic information in real time. The basic forecast model we consider is the Natural Time Weibull (NTW) method (JBR et al., Phys. Rev. E, 86, 021106, 2012). This model uses small earthquakes ('seismicity-based models') to forecast the occurrence of large earthquakes, via data-mining algorithms combined with the ANSS earthquake catalog. This method computes large earthquake probabilities using the number of small earthquakes that have occurred in a region since the last large earthquake. Localizing these forecasts in space so that global forecasts can be computed in real time presents special algorithmic challenges, which we describe in this talk. Using 25 years of data from the ANSS California-Nevada catalog of earthquakes, we compute real-time global forecasts at a grid scale of 0.1°. We analyze and monitor the performance of these models using the standard tests, which include the Reliability/Attributes and Receiver Operating Characteristic (ROC) tests. It is clear from much of the analysis that data quality is a major limitation on the accurate computation of earthquake probabilities. We discuss the challenges of serving up these datasets over the web on web-based platforms such as those at www.quakesim.org, www.e-decider.org, and www.openhazards.com.
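
    Schematically, the NTW approach conditions on the number of small earthquakes observed since the last large event ("natural time") and evaluates a Weibull distribution in that variable. The sketch below shows the shape of such a calculation with entirely made-up scale and shape parameters; the published method's calibration, magnitude thresholds and spatial localization are not reproduced here.

```python
import math

def ntw_probability(n_since, horizon, tau=1500.0, beta=1.4):
    """Schematic Natural-Time-Weibull conditional probability: given that n_since
    small earthquakes have occurred since the last large one, the chance that the
    next large event arrives within `horizon` additional small earthquakes,
    assuming a Weibull distribution in natural time with scale tau and shape beta
    (both values are illustrative assumptions)."""
    def cdf(n):
        return 1.0 - math.exp(-((n / tau) ** beta))
    survive = 1.0 - cdf(n_since)
    return (cdf(n_since + horizon) - cdf(n_since)) / survive

# Example: 1200 small events have occurred in a grid cell since the last large one;
# probability that the next ~400 small events bring a large earthquake.
print(f"{ntw_probability(1200, 400):.2f}")
```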

  12. Blood leakage detection during dialysis therapy based on fog computing with array photocell sensors and heteroassociative memory model

    PubMed Central

    Wu, Jian-Xing; Huang, Ping-Tzan; Li, Chien-Ming

    2018-01-01

    Blood leakage and blood loss are serious life-threatening complications occurring during dialysis therapy. These events have been of concern to both healthcare givers and patients. More than 40% of adult blood volume can be lost in just a few minutes, resulting in morbidities and mortality. The authors intend to propose the design of a warning tool for the detection of blood leakage/blood loss during dialysis therapy based on fog computing with an array of photocell sensors and a heteroassociative memory (HAM) model. Photocell sensors are arranged in an array on a flexible substrate to detect blood leakage via the resistance changes with illumination in the visible spectrum of 500–700 nm. The HAM model is implemented to design a virtual alarm unit using electricity changes in an embedded system. The proposed warning tool can indicate the risk level in both end-sensing units and remote monitor devices via a wireless network and fog/cloud computing. The animal experimental results (pig blood) demonstrate the feasibility. PMID:29515815

  13. Blood leakage detection during dialysis therapy based on fog computing with array photocell sensors and heteroassociative memory model.

    PubMed

    Wu, Jian-Xing; Huang, Ping-Tzan; Lin, Chia-Hung; Li, Chien-Ming

    2018-02-01

    Blood leakage and blood loss are serious life-threatening complications occurring during dialysis therapy. These events have been of concern to both healthcare givers and patients. More than 40% of adult blood volume can be lost in just a few minutes, resulting in morbidities and mortality. The authors intend to propose the design of a warning tool for the detection of blood leakage/blood loss during dialysis therapy based on fog computing with an array of photocell sensors and a heteroassociative memory (HAM) model. Photocell sensors are arranged in an array on a flexible substrate to detect blood leakage via the resistance changes with illumination in the visible spectrum of 500-700 nm. The HAM model is implemented to design a virtual alarm unit using electricity changes in an embedded system. The proposed warning tool can indicate the risk level in both end-sensing units and remote monitor devices via a wireless network and fog/cloud computing. The animal experimental results (pig blood) demonstrate the feasibility.

  14. Use of raster-based data layers to model spatial variation of seismotectonic data in probabilistic seismic hazard assessment

    NASA Astrophysics Data System (ADS)

    Zolfaghari, Mohammad R.

    2009-07-01

    Recent achievements in computer and information technology have provided the necessary tools to extend the application of probabilistic seismic hazard mapping from its traditional engineering use to many other applications. Examples of such applications are risk mitigation, disaster management, post-disaster recovery planning, and catastrophe loss estimation and risk management. Due to the lack of proper knowledge with regard to factors controlling seismic hazards, there are always uncertainties associated with all steps involved in developing and using seismic hazard models. While some of these uncertainties can be controlled by more accurate and reliable input data, the majority of the data and assumptions used in seismic hazard studies remain highly uncertain and contribute to the uncertainty of the final results. In this paper a new methodology for the assessment of seismic hazard is described. The proposed approach provides a practical facility for better capture of the spatial variations of seismological and tectonic characteristics, which allows better treatment of their uncertainties. In the proposed approach, GIS raster-based data models are used in order to model geographical features in a cell-based system. The cell-based source model proposed in this paper provides a framework for implementing many geographically referenced seismotectonic factors into seismic hazard modelling. Examples of such components are seismic source boundaries, rupture geometry, seismic activity rate, focal depth and the choice of attenuation functions. The proposed methodology provides improvements in several aspects of the standard analytical tools currently being used for assessment and mapping of regional seismic hazard, and it makes the best use of recent advancements in computer technology in both software and hardware. The proposed approach is well structured to be implemented using conventional GIS tools.

  15. Initial retrieval sequence and blending strategy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pemwell, D.L.; Grenard, C.E.

    1996-09-01

    This report documents the initial retrieval sequence and the methodology used to select it. Waste retrieval, storage, pretreatment and vitrification were modeled for candidate single-shell tank retrieval sequences. Performance of the sequences was measured by a set of metrics (for example, high-level waste glass volume, relative risk and schedule). Computer models were used to evaluate estimated glass volumes, process rates, retrieval dates, and blending strategy effects. The models were based on estimates of component inventories and concentrations, sludge wash factors and timing, retrieval annex limitations, etc.

  16. Cost-of-illness studies based on massive data: a prevalence-based, top-down regression approach.

    PubMed

    Stollenwerk, Björn; Welchowski, Thomas; Vogl, Matthias; Stock, Stephanie

    2016-04-01

    Despite the increasing availability of routine data, no analysis method has yet been presented for cost-of-illness (COI) studies based on massive data. We aim, first, to present such a method and, second, to assess the relevance of the associated gain in numerical efficiency. We propose a prevalence-based, top-down regression approach consisting of five steps: aggregating the data; fitting a generalized additive model (GAM); predicting costs via the fitted GAM; comparing predicted costs between prevalent and non-prevalent subjects; and quantifying the stochastic uncertainty via error propagation. To demonstrate the method, it was applied, in the context of chronic lung disease, to aggregated German sickness fund data (from 1999) covering over 7.3 million insured. To assess the gain in numerical efficiency, the computational time of the innovative approach was compared with that of corresponding GAMs applied to simulated individual-level data. Furthermore, the probability of model failure was modeled via logistic regression. Applying the innovative method was reasonably fast (19 min). In contrast, for patient-level data, computational time increased disproportionately with sample size. Furthermore, using patient-level data was accompanied by a substantial risk of model failure (about 80% for 6 million subjects). The gain in computational efficiency of the innovative COI method seems to be of practical relevance. Furthermore, it may yield more precise cost estimates.
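
    The five-step idea can be illustrated end to end on simulated claims data: aggregate individual records into cells, fit a smooth cost model to the cell means weighted by cell size, and compare predictions with prevalence switched on and off. In the sketch below a weighted polynomial regression stands in for the GAM smoother, and all data, cell definitions and coefficients are simulated assumptions.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(9)

# Step 1: simulate individual-level claims data and aggregate it into cells
# defined by age, sex and prevalence status (here: chronic lung disease).
n = 200_000
age = rng.integers(18, 95, n)
sex = rng.integers(0, 2, n)
prevalent = rng.binomial(1, 0.08 + 0.002 * (age - 18))
cost = 800 + 25 * age + 300 * sex + 2500 * prevalent + rng.gamma(2, 400, n)

df = pd.DataFrame({"age": age, "sex": sex, "prev": prevalent, "cost": cost})
cells = (df.groupby(["age", "sex", "prev"], as_index=False)
           .agg(mean_cost=("cost", "mean"), n=("cost", "size")))

# Step 2: fit a smooth cost model to the aggregated cells (weighted by cell size).
# A polynomial in age stands in for the GAM smoother used in the study.
X = np.column_stack([np.ones(len(cells)), cells.age, cells.age**2,
                     cells.sex, cells.prev])
w = cells.n.to_numpy()
y = cells.mean_cost.to_numpy()
beta = np.linalg.solve((X * w[:, None]).T @ X, (X * w[:, None]).T @ y)

# Steps 3-4: predict costs with prevalence switched on and off, and take the
# population-weighted difference as the excess (attributable) cost per case.
X_on, X_off = X.copy(), X.copy()
X_on[:, 4], X_off[:, 4] = 1.0, 0.0
excess = np.average(X_on @ beta - X_off @ beta, weights=w)
print(f"estimated excess annual cost per prevalent case: {excess:,.0f}")
```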

  17. Measuring the default risk of sovereign debt from the perspective of network

    NASA Astrophysics Data System (ADS)

    Chuang, Hongwei; Ho, Hwai-Chung

    2013-05-01

    Recently, there has been a growing interest in network research, especially in the fields of biology, computer science, and sociology. It is natural to address complex financial issues such as the European sovereign debt crisis from a network perspective. In this article, we construct a network model according to the debt-credit relations instead of using the conventional methodology to measure default risk. Based on the model, a risk index is examined using the quarterly reports of consolidated foreign claims from the Bank for International Settlements (BIS) and debt/GDP ratios among the reporting countries. The empirical results show that this index can help regulators and practitioners not only to determine the status of interconnectivity but also to gauge the degree of sovereign debt default risk. Our approach sheds new light on the investigation of quantifying systemic risk.
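
    One way to picture the construction is as a weighted, directed creditor-debtor network on which a recursive vulnerability index is computed: a country scores high if its own debt burden is high and if it is strongly exposed to other high-scoring countries. The exposure matrix, debt/GDP ratios and the fixed-point index below are illustrative inventions in the spirit of the BIS data, not the paper's actual index.

```python
import numpy as np

countries = ["GR", "PT", "IE", "ES", "IT", "DE", "FR"]
# Hypothetical exposure matrix E[i, j]: claims of banks in country i on country j
# (billions of USD), loosely in the spirit of the BIS consolidated banking statistics.
E = np.array([
    [0,   2,   1,   3,   4,   15,  20],
    [1,   0,   2,   8,   3,   30,  25],
    [2,   3,   0,   6,   4,   80,  40],
    [4,   70,  10,  0,   30,  140, 120],
    [6,   5,   8,   40,  0,   160, 300],
    [30,  35,  90,  180, 150, 0,   200],
    [50,  40,  30,  110, 350, 190, 0],
], dtype=float)
debt_to_gdp = np.array([1.6, 1.1, 1.0, 0.7, 1.2, 0.8, 0.9])   # assumed ratios

# Column-normalize: W[i, j] is the share of total claims on debtor j held by creditor i.
W = E / E.sum(axis=0, keepdims=True)

# Recursive vulnerability index: own debt burden plus exposure to risky debtors.
risk = debt_to_gdp / debt_to_gdp.sum()
for _ in range(100):                        # fixed-point iteration (contracting map)
    risk_new = 0.5 * debt_to_gdp / debt_to_gdp.sum() + 0.5 * W @ risk
    if np.allclose(risk_new, risk, atol=1e-12):
        break
    risk = risk_new

for c, r in sorted(zip(countries, risk), key=lambda t: -t[1]):
    print(f"{c}: {r:.3f}")
```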

  18. Development of a theory-guided pan-European computer-assisted safer sex intervention.

    PubMed

    Nöstlinger, Christiana; Borms, Ruth; Dec-Pietrowska, Joanna; Dias, Sonia; Rojas, Daniela; Platteau, Tom; Vanden Berghe, Wim; Kok, Gerjo

    2016-12-01

    HIV is a growing public health problem in Europe, with men-having-sex-with-men and migrants from endemic regions as the most affected key populations. More evidence on effective behavioral interventions to reduce sexual risk is needed. This article describes the systematic development of a theory-guided computer-assisted safer sex intervention, aiming at supporting people living with HIV in sexual risk reduction. We applied the Intervention Mapping (IM) protocol to develop this counseling intervention in the framework of a European multicenter study. We conducted a needs assessment guided by the information-motivation-behavioral (IMB) skills model, formulated change objectives and selected theory-based methods and practical strategies, i.e. interactive computer-assisted modules as supporting tools for provider-delivered counseling. Theoretical foundations were the IMB skills model, social cognitive theory and the transtheoretical model, complemented by dual process models of affective decision making to account for the specifics of sexual behavior. The counseling approach for delivering three individual sessions was tailored to participants' needs and contexts, adopting elements of motivational interviewing and cognitive-behavioral therapy. We implemented and evaluated the intervention using a randomized controlled trial combined with a process evaluation. IM provided a useful framework for developing a coherent intervention for heterogeneous target groups, which was feasible and effective across the culturally diverse settings. This article responds to the need for transparent descriptions of the development and content of evidence-based behavior change interventions as potential pillars of effective combination prevention strategies. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  19. MATILDA Version 2: Rough Earth TIALD Model for Laser Probabilistic Risk Assessment in Hilly Terrain - Part I

    DTIC Science & Technology

    2017-03-13

    support of airborne laser designator use during test and training exercises on military ranges. The initial MATILDA tool, MATILDA PRO Version-1.6.1...was based on the 2007 PRA model developed to perform range safety clearances for the UK Thermal Imaging Airborne Laser Designator (TIALD) system...AFRL Technical Reports. This Technical Report, designated Part I, contains documentation of the computational procedures for probabilistic fault

  20. A Data-Driven Framework for Incorporating New Tools for ...

    EPA Pesticide Factsheets

    This talk was given during the “Exposure-Based Toxicity Testing” session at the annual meeting of the International Society for Exposure Science. It provided an update on the state of the science and tools that may be employed in risk-based prioritization efforts. It outlined knowledge gained from the data provided using these high-throughput tools to assess chemical bioactivity and to predict chemical exposures and also identified future needs. It provided an opportunity to showcase ongoing research efforts within the National Exposure Research Laboratory and the National Center for Computational Toxicology within the Office of Research and Development to an international audience. The National Exposure Research Laboratory (NERL) Computational Exposure Division (CED) develops and evaluates data, decision-support tools, and models to be applied to media-specific or receptor-specific problem areas. CED uses modeling-based approaches to characterize exposures, evaluate fate and transport, and support environmental diagnostics/forensics with input from multiple data sources. It also develops media- and receptor-specific models, process models, and decision support tools for use both within and outside of EPA.

  1. PP087. Multicenter external validation and recalibration of a model for preconceptional prediction of recurrent early-onset preeclampsia.

    PubMed

    van Kuijk, Sander; Delahaije, Denise; Dirksen, Carmen; Scheepers, Hubertina C J; Spaanderman, Marc; Ganzevoort, W; Duvekot, Hans; Oudijk, M A; van Pampus, M G; Dadelszen, Peter von; Peeters, Louis L; Smiths, Luc

    2013-04-01

    In an earlier paper we reported on the development of a model aimed at the prediction of preeclampsia recurrence, based on variables obtained before the next pregnancy (fasting glucose, BMI, previous birth of a small-for-gestational-age infant, duration of the previous pregnancy, and the presence of hypertension). To externally validate and recalibrate the prediction model for the risk of recurrence of early-onset preeclampsia. We collected data about course and outcome of the next ongoing pregnancy in 229 women with a history of early-onset preeclampsia. Recurrence was defined as preeclampsia requiring delivery before 34 weeks. We computed risk of recurrence and assessed model performance. In addition, we constructed a table comparing sensitivity, specificity, and predictive values for different suggested risk thresholds. Early-onset preeclampsia recurred in 6.6% of women. The model systematically underestimated recurrence risk. The model's discriminative ability was modest; the area under the receiver operating characteristic curve was 58.9% (95% CI: 45.1 - 72.7). Using relevant risk thresholds, the model created groups that were only moderately different in terms of their average risk of recurrent preeclampsia (Table 1). Compared to an AUC of 65% in the development cohort, the discriminative ability of the model was diminished. It had inadequate performance to classify women into clinically relevant risk groups. Copyright © 2013. Published by Elsevier B.V.
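
    The validation metrics quoted above (AUC plus sensitivity, specificity and predictive values at candidate thresholds) can be reproduced on any cohort with a few lines of Python. The sketch below uses synthetic recurrence outcomes and predicted risks, and the threshold values are illustrative, not those suggested in the study.

        # External-validation metrics: AUC and a threshold table for a risk model.
        import numpy as np
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(0)
        y = rng.binomial(1, 0.066, size=229)                              # observed recurrence (~6.6%)
        p = np.clip(0.05 + 0.2 * y + rng.normal(0, 0.08, 229), 0, 1)      # synthetic predicted risks

        def threshold_table(y, p, thresholds=(0.05, 0.10, 0.20)):
            rows = []
            for t in thresholds:
                pred = (p >= t).astype(int)
                tp = np.sum((pred == 1) & (y == 1)); fp = np.sum((pred == 1) & (y == 0))
                fn = np.sum((pred == 0) & (y == 1)); tn = np.sum((pred == 0) & (y == 0))
                rows.append({
                    "threshold": t,
                    "sensitivity": tp / (tp + fn),
                    "specificity": tn / (tn + fp),
                    "PPV": tp / (tp + fp) if (tp + fp) else float("nan"),
                    "NPV": tn / (tn + fn) if (tn + fn) else float("nan"),
                })
            return rows

        print("AUC:", round(roc_auc_score(y, p), 3))
        for row in threshold_table(y, p):
            print(row)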

  2. Mapping snow depth return levels: smooth spatial modeling versus station interpolation

    NASA Astrophysics Data System (ADS)

    Blanchet, J.; Lehning, M.

    2010-12-01

    For adequate risk management in mountainous countries, hazard maps for extreme snow events are needed. This requires the computation of spatial estimates of return levels. In this article we use recent developments in extreme value theory and compare two main approaches for mapping snow depth return levels from in situ measurements. The first one is based on the spatial interpolation of pointwise extremal distributions (the so-called Generalized Extreme Value distribution, GEV henceforth) computed at station locations. The second one is new and based on the direct estimation of a spatially smooth GEV distribution with the joint use of all stations. We compare and validate the different approaches for modeling annual maximum snow depth measured at 100 sites in Switzerland during winters 1965-1966 to 2007-2008. The results show a better performance of the smooth GEV distribution fitting, in particular where the station network is sparser. Smooth return level maps can be computed from the fitted model without any further interpolation. Their regional variability can be revealed by removing the altitudinal dependent covariates in the model. We show how return levels and their regional variability are linked to the main climatological patterns of Switzerland.
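
    The pointwise building block of the first approach, fitting a GEV distribution to annual maxima at one station and reading off T-year return levels, can be sketched as follows; the smooth spatial GEV model itself is not reproduced here, and the annual maxima are synthetic stand-ins for the Swiss station data.

        # Fit a GEV to annual maximum snow depths and compute return levels.
        import numpy as np
        from scipy.stats import genextreme

        rng = np.random.default_rng(42)
        annual_maxima = rng.gumbel(loc=120.0, scale=35.0, size=43)   # synthetic, cm (winters 1966-2008)

        shape, loc, scale = genextreme.fit(annual_maxima)

        def return_level(T):
            """Snow depth exceeded on average once every T years."""
            return genextreme.ppf(1.0 - 1.0 / T, shape, loc=loc, scale=scale)

        for T in (10, 50, 100):
            print(f"{T}-year return level: {return_level(T):.1f} cm")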

  3. Development and validation of a generic finite element vehicle buck model for the analysis of driver rib fractures in real life nearside oblique frontal crashes.

    PubMed

    Iraeus, Johan; Lindquist, Mats

    2016-10-01

    Frontal crashes still account for approximately half of all fatalities in passenger cars, despite several decades of crash-related research. For serious injuries in this crash mode, several authors have listed the thorax as the most important. Computer simulation provides an effective tool to study crashes and evaluate injury mechanisms, and using stochastic input data, whole populations of crashes can be studied. The aim of this study was to develop a generic buck model and to validate this model on a population of real-life frontal crashes in terms of the risk of rib fracture. The study was conducted in four phases. In the first phase, real-life validation data were derived by analyzing NASS/CDS data to find the relationship between injury risk and crash parameters. In addition, available statistical distributions for the parameters were collected. In the second phase, a generic parameterized finite element (FE) model of a vehicle interior was developed based on laser scans from the A2MAC1 database. In the third phase, model parameters that could not be found in the literature were estimated using reverse engineering based on NCAP tests. Finally, in the fourth phase, the stochastic FE model was used to simulate a population of real-life crashes, and the result was compared to the validation data from phase one. The stochastic FE simulation model overestimates the risk of rib fracture, more for young occupants and less for senior occupants. However, if the effect of underestimation of rib fractures in the NASS/CDS material is accounted for using statistical simulations, the risk of rib fracture based on the stochastic FE model matches the risk based on the NASS/CDS data for senior occupants. The current version of the stochastic model can be used to evaluate new safety measures using a population of frontal crashes for senior occupants. Copyright © 2016 Elsevier Ltd. All rights reserved.

  4. Health-adjusted premium subsidies in the Netherlands.

    PubMed

    van de Ven, Wynand P M M; van Vliet, René C J A; Lamers, Leida M

    2004-01-01

    The Dutch government has decided to proceed with managed competition in health care. In this paper we report on progress made with health-based risk adjustment, a key issue in managed competition. In 2004 both Diagnostic Cost Groups (DCGs) computed from hospital diagnoses only and Pharmacy-based Cost Groups (PCGs) computed from out-patient prescription drugs are used to set the premium subsidies for competing risk-bearing sickness funds. These health-based risk adjusters appear to be effective and complementary. Risk selection is not a major problem in the Netherlands. Despite the progress made, we are still faced with a full research agenda for risk adjustment in the coming years.

  5. Shape optimization of pulsatile ventricular assist devices using FSI to minimize thrombotic risk

    NASA Astrophysics Data System (ADS)

    Long, C. C.; Marsden, A. L.; Bazilevs, Y.

    2014-10-01

    In this paper we perform shape optimization of a pediatric pulsatile ventricular assist device (PVAD). The device simulation is carried out using fluid-structure interaction (FSI) modeling techniques within a computational framework that combines FEM for fluid mechanics and isogeometric analysis for structural mechanics modeling. The PVAD FSI simulations are performed under realistic conditions (i.e., flow speeds, pressure levels, boundary conditions, etc.), and account for the interaction of air, blood, and a thin structural membrane separating the two fluid subdomains. The shape optimization study is designed to reduce thrombotic risk, a major clinical problem in PVADs. Thrombotic risk is quantified in terms of particle residence time in the device blood chamber. Methods to compute particle residence time in the context of moving spatial domains are presented in a companion paper published in the same issue (Comput Mech, doi: 10.1007/s00466-013-0931-y, 2013). The surrogate management framework, a derivative-free pattern search optimization method that relies on surrogates for increased efficiency, is employed in this work. For the optimization study shown here, particle residence time is used to define a suitable cost or objective function, while four adjustable design optimization parameters are used to define the device geometry. The FSI-based optimization framework is implemented in a parallel computing environment, and deployed with minimal user intervention. Using five SEARCH/ POLL steps the optimization scheme identifies a PVAD design with significantly better throughput efficiency than the original device.

  6. Hygrothermal Simulation: A Tool for Building Envelope Design Analysis

    Treesearch

    Samuel V. Glass; Anton TenWolde; Samuel L. Zelinka

    2013-01-01

    Is it possible to gauge the risk of moisture problems while designing the building envelope? This article provides a brief introduction to computer-based hygrothermal (heat and moisture) simulation, shows how simulation can be useful as a design tool, and points out a number of important considerations regarding model inputs and limitations. Hygrothermal simulation...

  7. Modeling Electrostatic Fields Generated by Internal Charging of Materials in Space Radiation Environments

    NASA Technical Reports Server (NTRS)

    Minow, Joseph I.

    2011-01-01

    Internal charging is a risk to spacecraft in energetic electron environments. The DICTAT and NUMIT computational codes are the most widely used engineering tools for evaluating internal charging of insulator materials exposed to these environments. Engineering tools are designed for rapid evaluation of ESD threats, but there is a need for more physics-based models for investigating the science of materials interactions with energetic electron environments. Current tools are limited by the physics included in the models and by ease of user implementation; additional development work is needed to improve the models.

  8. Linguistic and Cultural Adaptation of a Computer-Based Counseling Program (CARE+ Spanish) to Support HIV Treatment Adherence and Risk Reduction for People Living With HIV/AIDS: A Randomized Controlled Trial.

    PubMed

    Kurth, Ann E; Chhun, Nok; Cleland, Charles M; Crespo-Fierro, Michele; Parés-Avila, José A; Lizcano, John A; Norman, Robert G; Shedlin, Michele G; Johnston, Barbara E; Sharp, Victoria L

    2016-07-13

    Human immunodeficiency virus (HIV) disease in the United States disproportionately affects minorities, including Latinos. Barriers including language are associated with lower antiretroviral therapy (ART) adherence seen among Latinos, yet ART and interventions for clinic visit adherence are rarely developed or delivered in Spanish. The aim was to adapt a computer-based counseling tool, demonstrated to reduce HIV-1 viral load and sexual risk transmission in a population of English-speaking adults, for use during routine clinical visits for an HIV-positive Spanish-speaking population (CARE+ Spanish); the Technology Acceptance Model (TAM) was the theoretical framework guiding program development. A longitudinal randomized controlled trial was conducted from June 4, 2010 to March 29, 2012. Participants were recruited from a comprehensive HIV treatment center comprising three clinics in New York City. Eligibility criteria were (1) adults (age ≥18 years), (2) Latino birth or ancestry, (3) speaks Spanish (mono- or multilingual), and (4) on antiretrovirals. Linear and generalized mixed linear effects models were used to analyze primary outcomes, which included ART adherence, sexual transmission risk behaviors, and HIV-1 viral loads. Exit interviews were offered to purposively selected intervention participants to explore cultural acceptability of the tool among participants, and focus groups explored the acceptability and system efficiency issues among clinic providers, using the TAM framework. A total of 494 Spanish-speaking HIV clinic attendees were enrolled and randomly assigned to the intervention (arm A: n=253) or risk assessment-only control (arm B, n=241) group and followed up at 3-month intervals for one year. Gender distribution was 296 (68.4%) male, 110 (25.4%) female, and 10 (2.3%) transgender. By study end, 433 of 494 (87.7%) participants were retained. Although intervention participants had reduced viral loads, increased ART adherence and decreased sexual transmission risk behaviors over time, these findings were not statistically significant. We also conducted 61 qualitative exit interviews with participants and two focus groups with a total of 16 providers. A computer-based counseling tool grounded in the TAM theoretical model and delivered in Spanish was acceptable and feasible to implement in a high-volume HIV clinic setting. It was able to provide evidence-based, linguistically appropriate ART adherence support without requiring additional staff time, bilingual status, or translation services. We found that language preferences and cultural acceptability of a computer-based counseling tool exist on a continuum in our urban Spanish-speaking population. Theoretical frameworks of technology's usefulness for behavioral modification need further exploration in other languages and cultures. ClinicalTrials.gov NCT01013935; https://clinicaltrials.gov/ct2/show/NCT01013935 (Archived by WebCite at http://www.webcitation.org/6ikaD3MT7).

  9. Computational Modeling of Space Physiology

    NASA Technical Reports Server (NTRS)

    Lewandowski, Beth E.; Griffin, Devon W.

    2016-01-01

    The Digital Astronaut Project (DAP), within NASA's Human Research Program, develops and implements computational modeling for use in the mitigation of human health and performance risks associated with long-duration spaceflight. Over the past decade, DAP developed models to provide insights into spaceflight-related changes to the central nervous system, cardiovascular system and the musculoskeletal system. Examples of the models and their applications include biomechanical models applied to advanced exercise device development, bone fracture risk quantification for mission planning, accident investigation, bone health standards development, and occupant protection. The International Space Station (ISS), in its role as a testing ground for long-duration spaceflight, has been an important platform for obtaining human spaceflight data. DAP has used preflight, in-flight and post-flight data from short and long duration astronauts for computational model development and validation. Examples include preflight and post-flight bone mineral density data, muscle cross-sectional area, and muscle strength measurements. Results from computational modeling supplement space physiology research by informing experimental design. Using these computational models, DAP personnel can easily identify both important factors associated with a phenomenon and areas where data are lacking. This presentation will provide examples of DAP computational models, the data used in model development and validation, and applications of the model.

  10. Seismic hazard, risk, and design for South America

    USGS Publications Warehouse

    Petersen, Mark D.; Harmsen, Stephen; Jaiswal, Kishor; Rukstales, Kenneth S.; Luco, Nicolas; Haller, Kathleen; Mueller, Charles; Shumway, Allison

    2018-01-01

    We calculate seismic hazard, risk, and design criteria across South America using the latest data, models, and methods to support public officials, scientists, and engineers in earthquake risk mitigation efforts. Updated continental scale seismic hazard models are based on a new seismicity catalog, seismicity rate models, evaluation of earthquake sizes, fault geometry and rate parameters, and ground‐motion models. Resulting probabilistic seismic hazard maps show peak ground acceleration, modified Mercalli intensity, and spectral accelerations at 0.2 and 1 s periods for 2%, 10%, and 50% probabilities of exceedance in 50 yrs. Ground shaking soil amplification at each site is calculated by considering uniform soil that is applied in modern building codes or by applying site‐specific factors based on VS30 shear‐wave velocities determined through a simple topographic proxy technique. We use these hazard models in conjunction with the Prompt Assessment of Global Earthquakes for Response (PAGER) model to calculate economic and casualty risk. Risk is computed by incorporating the new hazard values amplified by soil, PAGER fragility/vulnerability equations, and LandScan 2012 estimates of population exposure. We also calculate building design values using the guidelines established in the building code provisions. Resulting hazard and associated risk is high along the northern and western coasts of South America, reaching damaging levels of ground shaking in Chile, western Argentina, western Bolivia, Peru, Ecuador, Colombia, Venezuela, and in localized areas distributed across the rest of the continent where historical earthquakes have occurred. Constructing buildings and other structures to account for strong shaking in these regions of high hazard and risk should mitigate losses and reduce casualties from effects of future earthquake strong ground shaking. National models should be developed by scientists and engineers in each country using the best available science.

  11. The Relative Importance of the Vadose Zone in Multimedia Risk Assessment Modeling Applied at a National Scale: An Analysis of Benzene Using 3MRA

    NASA Astrophysics Data System (ADS)

    Babendreier, J. E.

    2002-05-01

    Evaluating uncertainty and parameter sensitivity in environmental models can be a difficult task, even for low-order, single-media constructs driven by a unique set of site-specific data. The challenge of examining ever more complex, integrated, higher-order models is a formidable one, particularly in regulatory settings applied on a national scale. Quantitative assessment of uncertainty and sensitivity within integrated, multimedia models that simulate hundreds of sites, spanning multiple geographical and ecological regions, will ultimately require a systematic, comparative approach coupled with sufficient computational power. The Multimedia, Multipathway, and Multireceptor Risk Assessment Model (3MRA) is an important code being developed by the United States Environmental Protection Agency for use in site-scale risk assessment (e.g. hazardous waste management facilities). The model currently entails over 700 variables, 185 of which are explicitly stochastic. The 3MRA can start with a chemical concentration in a waste management unit (WMU). It estimates the release and transport of the chemical throughout the environment, and predicts associated exposure and risk. The 3MRA simulates multimedia (air, water, soil, sediments), pollutant fate and transport, multipathway exposure routes (food ingestion, water ingestion, soil ingestion, air inhalation, etc.), multireceptor exposures (resident, gardener, farmer, fisher, ecological habitats and populations), and resulting risk (human cancer and non-cancer effects, ecological population and community effects). The 3MRA collates the output for an overall national risk assessment, offering a probabilistic strategy as a basis for regulatory decisions. To facilitate model execution of 3MRA for purposes of conducting uncertainty and sensitivity analysis, a PC-based supercomputer cluster was constructed. Design of SuperMUSE, a 125 GHz Windows-based Supercomputer for Model Uncertainty and Sensitivity Evaluation is described, along with the conceptual layout of an accompanying java-based paralleling software toolset. Preliminary work is also reported for a scenario involving Benzene disposal that describes the relative importance of the vadose zone in driving risk levels for ecological receptors and human health. Incorporating landfills, waste piles, aerated tanks, surface impoundments, and land application units, the site-based data used in the analysis included 201 national facilities representing 419 site-WMU combinations.

  12. Systemic Inflammation-Based Biomarkers and Survival in HIV-Positive Subject With Solid Cancer in an Italian Multicenter Study.

    PubMed

    Raffetti, Elena; Donato, Francesco; Pezzoli, Chiara; Digiambenedetto, Simona; Bandera, Alessandra; Di Pietro, Massimo; Di Filippo, Elisa; Maggiolo, Franco; Sighinolfi, Laura; Fornabaio, Chiara; Castelnuovo, Filippo; Ladisa, Nicoletta; Castelli, Francesco; Quiros Roldan, Eugenia

    2015-08-15

    Recently, some systemic inflammation-based biomarkers have been demonstrated useful for predicting risk of death in patients with solid cancer independently of tumor characteristics. This study aimed to investigate the prognostic role of systemic inflammation-based biomarkers in HIV-infected patients with solid tumors and to propose a risk score for mortality in these subjects. Clinical and pathological data on solid AIDS-defining cancer (ADC) and non-AIDS-defining cancer (NADC), diagnosed between 1998 and 2012 in an Italian cohort, were analyzed. To evaluate the prognostic role of systemic inflammation- and nutrition-based markers, univariate and multivariable Cox regression models were applied. To compute the risk score equation, the patients were randomly assigned to a derivation and a validation sample. A total of 573 patients (76.3% males) with a mean age of 46.2 years (SD = 10.3) were enrolled. 178 patients died during a median of 3.2 years of follow-up. For solid NADCs, elevated Glasgow Prognostic Score, modified Glasgow Prognostic Score, neutrophil/lymphocyte ratio, platelet/lymphocyte ratio, and Prognostic Nutritional Index were independently associated with risk of death; for solid ADCs, none of these markers was associated with risk of death. For solid NADCs, we computed a mortality risk score on the basis of age at cancer diagnosis, intravenous drug use, and Prognostic Nutritional Index. The areas under the receiver operating characteristic curve were 0.67 (95% confidence interval: 0.58 to 0.75) in the derivation sample and 0.66 (95% confidence interval: 0.54 to 0.79) in the validation sample. Inflammatory biomarkers were associated with risk of death in HIV-infected patients with solid NADCs but not with ADCs.
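
    A hedged sketch of the multivariable Cox model and derived risk score described above, using the lifelines package on synthetic data; the covariate names (age at diagnosis, intravenous drug use, Prognostic Nutritional Index) come from the abstract, but the effect sizes and follow-up data are placeholders, not the cohort's fitted values.

        # Multivariable Cox model and a linear-predictor risk score on synthetic data.
        import numpy as np
        import pandas as pd
        from lifelines import CoxPHFitter

        rng = np.random.default_rng(1)
        n = 300
        df = pd.DataFrame({
            "age_dx": rng.normal(46, 10, n),        # age at cancer diagnosis
            "ivdu": rng.binomial(1, 0.3, n),        # intravenous drug use
            "pni": rng.normal(45, 8, n),            # Prognostic Nutritional Index
        })
        hazard = np.exp(0.03 * (df.age_dx - 46) + 0.5 * df.ivdu - 0.04 * (df.pni - 45))
        df["time_years"] = rng.exponential(5 / hazard)
        df["death"] = (df["time_years"] < 3.2).astype(int)      # administrative censoring
        df["time_years"] = df["time_years"].clip(upper=3.2)     # at ~median follow-up

        cph = CoxPHFitter()
        cph.fit(df, duration_col="time_years", event_col="death")
        cph.print_summary()

        risk_score = cph.predict_partial_hazard(df)   # mortality risk score (hazard scale)
        print(risk_score.head())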

  13. Computational Toxicology as Implemented by the U.S. EPA: Providing High Throughput Decision Support Tools for Screening and Assessing Chemical Exposure, Hazard and Risk

    EPA Science Inventory

    Computational toxicology is the application of mathematical and computer models to help assess chemical hazards and risks to human health and the environment. Supported by advances in informatics, high-throughput screening (HTS) technologies, and systems biology, the U.S. Environ...

  14. Metamodeling-based approach for risk assessment and cost estimation: Application to geological carbon sequestration planning

    NASA Astrophysics Data System (ADS)

    Sun, Alexander Y.; Jeong, Hoonyoung; González-Nicolás, Ana; Templeton, Thomas C.

    2018-04-01

    Carbon capture and storage (CCS) is being evaluated globally as a geoengineering measure for significantly reducing greenhouse gas emissions. However, long-term liability associated with potential leakage from these geologic repositories is perceived as a main barrier to entry by site operators. Risk quantification and impact assessment help CCS operators to screen candidate sites for suitability of CO2 storage. Leakage risks are highly site dependent, and a quantitative understanding and categorization of these risks can only be made possible through broad participation and deliberation of stakeholders, with the use of site-specific, process-based models as the decision basis. Online decision making, however, requires that scenarios be run in real time. In this work, a Python-based Leakage Assessment and Cost Estimation (PyLACE) web application was developed for quantifying financial risks associated with potential leakage from geologic carbon sequestration sites. PyLACE aims to assist a collaborative, analytic-deliberative decision-making process by automating metamodel creation, knowledge sharing, and online collaboration. In PyLACE, metamodeling, which is a process of developing faster-to-run surrogates of process-level models, is enabled using a special stochastic response surface method and Gaussian process regression. Both methods allow consideration of model parameter uncertainties and the use of that information to generate confidence intervals on model outputs. Training of the metamodels is delegated to a high performance computing cluster and is orchestrated by a set of asynchronous job scheduling tools for job submission and result retrieval. As a case study, the workflow and main features of PyLACE are demonstrated using a multilayer carbon storage model.

  15. A Quantitative Risk Assessment Model Involving Frequency and Threat Degree under Line-of-Business Services for Infrastructure of Emerging Sensor Networks.

    PubMed

    Jing, Xu; Hu, Hanwen; Yang, Huijun; Au, Man Ho; Li, Shuqin; Xiong, Naixue; Imran, Muhammad; Vasilakos, Athanasios V

    2017-03-21

    The prospect of Line-of-Business Services (LoBSs) for infrastructure of Emerging Sensor Networks (ESNs) is exciting. Access control remains a top challenge in this scenario as the service provider's server contains a lot of valuable resources. LoBSs' users are very diverse as they may come from a wide range of locations with vastly different characteristics. Cost of joining could be low and in many cases, intruders are eligible users conducting malicious actions. As a result, user access should be adjusted dynamically. Assessing LoBSs' risk dynamically based on both frequency and threat degree of malicious operations is therefore necessary. In this paper, we proposed a Quantitative Risk Assessment Model (QRAM) involving frequency and threat degree based on value at risk. To quantify the threat degree as an elementary intrusion effort, we amend the influence coefficient of risk indexes in the network security situation assessment model. To quantify threat frequency as intrusion trace effort, we make use of multiple behavior information fusion. Under the influence of intrusion trace, we adapt the historical simulation method of value at risk to dynamically assess LoBSs' risk. Simulation based on existing data is used to select appropriate parameters for QRAM. Our simulation results show that the duration influence on elementary intrusion effort is reasonable when the normalized parameter is 1000. Likewise, the time window of intrusion trace and the weight between objective risk and subjective risk can be set to 10 s and 0.5, respectively. While our focus is to develop QRAM for assessing the risk of LoBSs for infrastructure of ESNs dynamically involving frequency and threat degree, we believe it is also appropriate for other scenarios in cloud computing.
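
    The historical-simulation value-at-risk step can be illustrated as below. The construction of the loss from threat degree and frequency is a simplified stand-in for QRAM, not the paper's exact formulation; only the constants (normalization parameter 1000, 10 s trace window, 0.5 weight between objective and subjective risk) are taken from the abstract.

        # Historical-simulation VaR over synthetic malicious-operation records.
        import numpy as np

        rng = np.random.default_rng(7)
        threat_degree = rng.uniform(0.1, 1.0, size=500)     # elementary intrusion effort
        duration_s = rng.exponential(200.0, size=500)       # duration of each operation
        frequency = rng.poisson(3, size=500) + 1            # repeats inside the 10 s trace window

        norm = 1000.0                                       # duration normalization parameter
        objective_loss = threat_degree * (duration_s / norm) * frequency
        subjective_loss = rng.uniform(0.0, 1.0, size=500)   # stand-in for expert scoring

        w = 0.5                                             # objective/subjective weight
        loss = w * objective_loss + (1 - w) * subjective_loss

        var_95 = np.percentile(loss, 95)                    # 95% value at risk of the service
        print(f"Dynamic LoBS risk (95% VaR): {var_95:.3f}")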

  16. A Quantitative Risk Assessment Model Involving Frequency and Threat Degree under Line-of-Business Services for Infrastructure of Emerging Sensor Networks

    PubMed Central

    Jing, Xu; Hu, Hanwen; Yang, Huijun; Au, Man Ho; Li, Shuqin; Xiong, Naixue; Imran, Muhammad; Vasilakos, Athanasios V.

    2017-01-01

    The prospect of Line-of-Business Services (LoBSs) for infrastructure of Emerging Sensor Networks (ESNs) is exciting. Access control remains a top challenge in this scenario as the service provider’s server contains a lot of valuable resources. LoBSs’ users are very diverse as they may come from a wide range of locations with vastly different characteristics. Cost of joining could be low and in many cases, intruders are eligible users conducting malicious actions. As a result, user access should be adjusted dynamically. Assessing LoBSs’ risk dynamically based on both frequency and threat degree of malicious operations is therefore necessary. In this paper, we proposed a Quantitative Risk Assessment Model (QRAM) involving frequency and threat degree based on value at risk. To quantify the threat degree as an elementary intrusion effort, we amend the influence coefficient of risk indexes in the network security situation assessment model. To quantify threat frequency as intrusion trace effort, we make use of multiple behavior information fusion. Under the influence of intrusion trace, we adapt the historical simulation method of value at risk to dynamically assess LoBSs’ risk. Simulation based on existing data is used to select appropriate parameters for QRAM. Our simulation results show that the duration influence on elementary intrusion effort is reasonable when the normalized parameter is 1000. Likewise, the time window of intrusion trace and the weight between objective risk and subjective risk can be set to 10 s and 0.5, respectively. While our focus is to develop QRAM for assessing the risk of LoBSs for infrastructure of ESNs dynamically involving frequency and threat degree, we believe it is also appropriate for other scenarios in cloud computing. PMID:28335569

  17. Rover Slip Validation and Prediction Algorithm

    NASA Technical Reports Server (NTRS)

    Yen, Jeng

    2009-01-01

    A physics-based simulation has been developed for the Mars Exploration Rover (MER) mission that applies slope-induced wheel slippage to the rover location estimator. Using the digital elevation map from the stereo images, the computational method resolves the quasi-dynamic equations of motion that incorporate the actual wheel-terrain speed to estimate the gross velocity of the vehicle. Based on the empirical slippage measured by the Visual Odometry software of the rover, this algorithm computes two factors for the slip model by minimizing the distance between the predicted and actual vehicle locations, and then uses the model to predict the next drives. This technique, which has been deployed to operate the MER rovers in the extended mission periods, can accurately predict the rover position and attitude, mitigating the risk and uncertainties in path planning on high-slope areas.

  18. Synthetic Earthquake Statistics From Physical Fault Models for the Lower Rhine Embayment

    NASA Astrophysics Data System (ADS)

    Brietzke, G. B.; Hainzl, S.; Zöller, G.

    2012-04-01

    As of today, seismic risk and hazard estimates mostly use purely empirical, stochastic models of earthquake fault systems tuned specifically to the vulnerable areas of interest. Although such models allow for reasonable risk estimates, they fail to provide a link between the observed seismicity and the underlying physical processes. Solving a state-of-the-art, fully dynamic description of all relevant physical processes related to earthquake fault systems is likely not useful, since it comes with a large number of degrees of freedom, poor constraints on its model parameters and a huge computational effort. Here, quasi-static and quasi-dynamic physical fault simulators provide a compromise between physical completeness and computational affordability and aim at providing a link between basic physical concepts and statistics of seismicity. Within the framework of quasi-static and quasi-dynamic earthquake simulators we investigate a model of the Lower Rhine Embayment (LRE) that is based upon seismological and geological data. We present and discuss statistics of the spatio-temporal behavior of generated synthetic earthquake catalogs with respect to simplification (e.g. simple two-fault cases) as well as to complication (e.g. hidden faults, geometric complexity, heterogeneities of constitutive parameters).

  19. SECURITY MODELING FOR MARITIME PORT DEFENSE RESOURCE ALLOCATION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harris, S.; Dunn, D.

    2010-09-07

    Redeployment of existing law enforcement resources and optimal use of geographic terrain are examined for countering the threat of a maritime based small-vessel radiological or nuclear attack. The evaluation was based on modeling conducted by the Savannah River National Laboratory that involved the development of options for defensive resource allocation that can reduce the risk of a maritime based radiological or nuclear threat. A diverse range of potential attack scenarios has been assessed. As a result of identifying vulnerable pathways, effective countermeasures can be deployed using current resources. The modeling involved the use of the Automated Vulnerability Evaluation for Risks of Terrorism (AVERT®) software to conduct computer based simulation modeling. The models provided estimates for the probability of encountering an adversary based on allocated resources including response boats, patrol boats and helicopters over various environmental conditions including day, night, rough seas and various traffic flow rates.

  20. Cardiovascular risk assessment in elderly adults using SCORE OP model in a Latin American population: The experience from Ecuador.

    PubMed

    Sisa, Ivan

    2018-02-09

    Cardiovascular disease (CVD) mortality is predicted to increase in Latin American countries due to their rapidly aging populations. However, there is very little information about CVD risk assessment as a primary preventive measure in this high-risk population. We predicted the national risk of developing CVD in the Ecuadorian elderly population using the Systematic COronary Risk Evaluation in Older Persons (SCORE OP) High and Low models by risk categories/CVD risk region in 2009. Data on national cardiovascular risk factors were obtained from the Encuesta sobre Salud, Bienestar y Envejecimiento. We computed the predicted 5-year CVD risk and compared the extent of agreement and reclassification in stratifying high-risk individuals between the SCORE OP High and Low models. Analyses were done by risk categories, CVD risk region, and sex. In 2009, based on the SCORE OP Low model, almost 42% of elderly adults living in Ecuador were at high risk of suffering CVD over a 5-year period. The extent of agreement between the SCORE OP High and Low risk prediction models was moderate (Cohen's kappa test of 0.5), approximately 34% of individuals were reclassified into different risk categories, and a third of the population would benefit from a pharmacologic intervention to reduce CVD risk. Forty-two percent of elderly Ecuadorians were at high risk of suffering CVD over a 5-year period, indicating an urgent need to tailor primary preventive measures for this vulnerable and high-risk population. Copyright © 2017 Elsevier España, S.L.U. All rights reserved.
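
    The agreement and reclassification comparison between the two SCORE OP variants reduces to Cohen's kappa and a cross-tabulation of risk categories. The sketch below uses synthetic category assignments calibrated only loosely to the percentages quoted above; it is not the survey data.

        # Agreement and reclassification between two risk-category assignments.
        import numpy as np
        from sklearn.metrics import cohen_kappa_score

        rng = np.random.default_rng(3)
        cats = np.array(["low", "moderate", "high"])
        n = 2606                                                  # synthetic cohort size
        score_low = rng.choice(cats, size=n, p=[0.30, 0.28, 0.42])

        # Simulate the High-risk-region model re-grading about a third of subjects.
        score_high = score_low.copy()
        moved = rng.random(n) < 0.34
        score_high[moved] = rng.choice(cats, size=moved.sum())

        kappa = cohen_kappa_score(score_low, score_high)
        reclassified = np.mean(score_low != score_high) * 100
        print(f"Cohen's kappa: {kappa:.2f}; reclassified: {reclassified:.1f}%")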

  1. Using the basic reproduction number to assess the effects of climate change in the risk of Chagas disease transmission in Colombia.

    PubMed

    Cordovez, Juan M; Rendon, Lina Maria; Gonzalez, Camila; Guhl, Felipe

    2014-01-01

    The dynamics of vector-borne diseases has often been linked to climate change. However, the commonly complex dynamics of vector-borne diseases make it very difficult to predict risk based on vector or host distributions. The basic reproduction number (R0) integrates all factors that determine whether a pathogen can establish or not. To obtain R0 for complex vector-borne diseases one can use the next-generation matrix (NGM) approach. We used the NGM to compute R0 for Chagas disease in Colombia incorporating the effect of temperature in some of the transmission routes of Trypanosoma cruzi. We used R0 to generate a risk map of present conditions and a forecast risk map at 20 years from now based on mean annual temperature (data obtained from Worldclim). In addition we used the model to compute elasticity and sensitivity indexes on all model parameters and routes of transmission. We present this work as an approach to indicate which transmission pathways are more critical for disease transmission but acknowledge the fact that results and projections strongly depend on better knowledge of entomological parameters and transmission routes. We concluded that the highest contribution to R0 comes from transmission of the parasites from humans to vectors, which is a surprising result. In addition, parameters related to contacts between human and vectors and the efficiency of parasite transmission between them also show a prominent effect on R0.
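
    For readers unfamiliar with the NGM approach, the sketch below computes R0 as the spectral radius of F V^-1 for a minimal human-vector loop with a temperature-dependent biting rate. All rates and the temperature dependence are hypothetical illustration values, not the parameters estimated in the study.

        # Next-generation-matrix R0 for a simple human-vector transmission loop.
        import numpy as np

        def r0(temperature_c):
            a = 0.05 + 0.01 * (temperature_c - 20.0)   # biting rate, assumed to rise with temperature
            b_hv = 0.30        # prob. vector -> human transmission per bite
            b_vh = 0.06        # prob. human -> vector transmission per bite
            m = 3.0            # vectors per human
            gamma_h = 1 / 60.0 # human removal rate (1/days)
            mu_v = 1 / 120.0   # vector mortality rate (1/days)

            # New infections (F) and transitions (V) for the infected compartments (Ih, Iv).
            F = np.array([[0.0,           a * b_hv],
                          [a * b_vh * m,  0.0]])
            V = np.diag([gamma_h, mu_v])

            K = F @ np.linalg.inv(V)                 # next-generation matrix
            return max(abs(np.linalg.eigvals(K)))    # spectral radius = R0

        for t in (22, 26, 30):
            print(f"T = {t} C  ->  R0 = {r0(t):.2f}")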

  2. Radiation Physics for Space and High Altitude Air Travel

    NASA Technical Reports Server (NTRS)

    Cucinotta, F. A.; Wilson, J. W.; Goldhagen, P.; Saganti, P.; Shavers, M. R.; McKay, Gordon A. (Technical Monitor)

    2000-01-01

    Galactic cosmic rays (GCR) are of extra-solar origin consisting of high-energy hydrogen, helium, and heavy ions. The GCR are modified by physical processes as they traverse through the solar system, spacecraft shielding, atmospheres, and tissues producing copious amounts of secondary radiation including fragmentation products, neutrons, mesons, and muons. We discuss physical models and measurements relevant for estimating biological risks in space and high-altitude air travel. Ambient and internal spacecraft computational models for the International Space Station and a Mars mission are discussed. Risk assessment is traditionally based on linear addition of components. We discuss alternative models that include stochastic treatments of columnar damage by heavy ion tracks and multi-cellular damage following nuclear fragmentation in tissue.

  3. Applying a new computer-aided detection scheme generated imaging marker to predict short-term breast cancer risk

    NASA Astrophysics Data System (ADS)

    Mirniaharikandehei, Seyedehnafiseh; Hollingsworth, Alan B.; Patel, Bhavika; Heidari, Morteza; Liu, Hong; Zheng, Bin

    2018-05-01

    This study aims to investigate the feasibility of identifying a new quantitative imaging marker based on false-positives generated by a computer-aided detection (CAD) scheme to help predict short-term breast cancer risk. An image dataset including four-view mammograms acquired from 1044 women was retrospectively assembled. All mammograms were originally interpreted as negative by radiologists. In the next subsequent mammography screening, 402 women were diagnosed with breast cancer and 642 remained negative. An existing CAD scheme was applied ‘as is’ to process each image. From the CAD-generated results, four detection features were computed from each image: the total number of (1) initial detection seeds and (2) final detected false-positive regions, and the (3) average and (4) sum of detection scores. Then, by combining the features computed from the two bilateral images of the left and right breasts from either the craniocaudal or mediolateral oblique view, two logistic regression models were trained and tested using a leave-one-case-out cross-validation method to predict the likelihood of each testing case being positive in the next subsequent screening. The new prediction model yielded a maximum prediction accuracy with an area under the ROC curve of AUC = 0.65 ± 0.017 and a maximum adjusted odds ratio of 4.49 with a 95% confidence interval of (2.95, 6.83). The results also showed an increasing trend in the adjusted odds ratio and risk prediction scores (p < 0.01). Thus, this study demonstrated that CAD-generated false-positives might include valuable information, which needs to be further explored for identifying and/or developing more effective imaging markers for predicting short-term breast cancer risk.
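
    A hedged sketch of the prediction step: the CAD-derived features from the two bilateral views are combined in a logistic regression evaluated with leave-one-case-out cross-validation. The feature matrix below is synthetic noise, so the resulting AUC is not comparable to the reported 0.65.

        # Leave-one-case-out logistic regression over CAD-style detection features.
        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import LeaveOneOut
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(0)
        n = 200                                   # cases (stand-in for the 1044 women)
        X = rng.normal(size=(n, 8))               # 4 detection features x 2 bilateral views
        y = rng.binomial(1, 0.4, size=n)          # positive at next screening (synthetic)

        scores = np.empty(n)
        for train_idx, test_idx in LeaveOneOut().split(X):
            clf = LogisticRegression(max_iter=1000).fit(X[train_idx], y[train_idx])
            scores[test_idx] = clf.predict_proba(X[test_idx])[:, 1]

        print(f"Leave-one-case-out AUC: {roc_auc_score(y, scores):.3f}")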

  4. Model-based approach for quantitative estimates of skin, heart, and lung toxicity risk for left-side photon and proton irradiation after breast-conserving surgery.

    PubMed

    Tommasino, Francesco; Durante, Marco; D'Avino, Vittoria; Liuzzi, Raffaele; Conson, Manuel; Farace, Paolo; Palma, Giuseppe; Schwarz, Marco; Cella, Laura; Pacelli, Roberto

    2017-05-01

    Proton beam therapy represents a promising modality for left-side breast cancer (BC) treatment, but concerns have been raised about skin toxicity and poor cosmesis. The aim of this study is to apply skin normal tissue complication probability (NTCP) model for intensity modulated proton therapy (IMPT) optimization in left-side BC. Ten left-side BC patients undergoing photon irradiation after breast-conserving surgery were randomly selected from our clinical database. Intensity modulated photon (IMRT) and IMPT plans were calculated with iso-tumor-coverage criteria and according to RTOG 1005 guidelines. Proton plans were computed with and without skin optimization. Published NTCP models were employed to estimate the risk of different toxicity endpoints for skin, lung, heart and its substructures. Acute skin NTCP evaluation suggests a lower toxicity level with IMPT compared to IMRT when the skin is included in proton optimization strategy (0.1% versus 1.7%, p < 0.001). Dosimetric results show that, with the same level of tumor coverage, IMPT attains significant heart and lung dose sparing compared with IMRT. By NTCP model-based analysis, an overall reduction in the cardiopulmonary toxicity risk prediction can be observed for all IMPT compared to IMRT plans: the relative risk reduction from protons varies between 0.1 and 0.7 depending on the considered toxicity endpoint. Our analysis suggests that IMPT might be safely applied without increasing the risk of severe acute radiation induced skin toxicity. The quantitative risk estimates also support the potential clinical benefits of IMPT for left-side BC irradiation due to lower risk of cardiac and pulmonary morbidity. The applied approach might be relevant on the long term for the setup of cost-effectiveness evaluation strategies based on NTCP predictions.

  5. A modular inverse elastostatics approach to resolve the pressure-induced stress state for in vivo imaging based cardiovascular modeling.

    PubMed

    Peirlinck, Mathias; De Beule, Matthieu; Segers, Patrick; Rebelo, Nuno

    2018-05-28

    Patient-specific biomechanical modeling of the cardiovascular system is complicated by the presence of a physiological pressure load given that the imaged tissue is in a pre-stressed and -strained state. Neglect of this prestressed state into solid tissue mechanics models leads to erroneous metrics (e.g. wall deformation, peak stress, wall shear stress) which in their turn are used for device design choices, risk assessment (e.g. procedure, rupture) and surgery planning. It is thus of utmost importance to incorporate this deformed and loaded tissue state into the computational models, which implies solving an inverse problem (calculating an undeformed geometry given the load and the deformed geometry). Methodologies to solve this inverse problem can be categorized into iterative and direct methodologies, both having their inherent advantages and disadvantages. Direct methodologies are typically based on the inverse elastostatics (IE) approach and offer a computationally efficient single shot methodology to compute the in vivo stress state. However, cumbersome and problem-specific derivations of the formulations and non-trivial access to the finite element analysis (FEA) code, especially for commercial products, refrain a broad implementation of these methodologies. For that reason, we developed a novel, modular IE approach and implemented this methodology in a commercial FEA solver with minor user subroutine interventions. The accuracy of this methodology was demonstrated in an arterial tube and porcine biventricular myocardium model. The computational power and efficiency of the methodology was shown by computing the in vivo stress and strain state, and the corresponding unloaded geometry, for two models containing multiple interacting incompressible, anisotropic (fiber-embedded) and hyperelastic material behaviors: a patient-specific abdominal aortic aneurysm and a full 4-chamber heart model. Copyright © 2018 Elsevier Ltd. All rights reserved.

  6. An initial investigation on developing a new method to predict short-term breast cancer risk based on deep learning technology

    NASA Astrophysics Data System (ADS)

    Qiu, Yuchen; Wang, Yunzhi; Yan, Shiju; Tan, Maxine; Cheng, Samuel; Liu, Hong; Zheng, Bin

    2016-03-01

    In order to establish a new personalized breast cancer screening paradigm, it is critically important to accurately predict the short-term risk of a woman having image-detectable cancer after a negative mammographic screening. In this study, we developed and tested a novel short-term risk assessment model based on a deep learning method. During the experiment, a set of 270 "prior" negative screening cases was assembled. In the next sequential ("current") screening mammography, 135 cases were positive and 135 cases remained negative. These cases were randomly divided into a training set with 200 cases and a testing set with 70 cases. A deep learning based computer-aided diagnosis (CAD) scheme was then developed for the risk assessment, which consists of two modules: an adaptive feature identification module and a risk prediction module. The adaptive feature identification module is composed of three pairs of convolution-max-pooling layers, which contain 20, 10, and 5 feature maps respectively. The risk prediction module is implemented by a multilayer perceptron (MLP) classifier, which produces a risk score to predict the likelihood of the woman developing short-term mammography-detectable cancer. The results show that the new CAD-based risk model yielded a positive predictive value of 69.2% and a negative predictive value of 74.2%, with a total prediction accuracy of 71.4%. This study demonstrated that applying a new deep learning technology may have significant potential to develop a new short-term risk prediction scheme with improved performance in detecting early abnormal symptoms from negative mammograms.
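
    The two-module architecture described above (three convolution/max-pooling pairs with 20, 10 and 5 feature maps, followed by an MLP risk predictor) can be sketched in Keras as below; the input size, kernel sizes and training settings are assumptions for illustration, not values taken from the study.

        # Sketch of a small CNN + MLP risk-scoring model in the spirit of the abstract.
        from tensorflow import keras
        from tensorflow.keras import layers

        def build_risk_model(input_shape=(64, 64, 1)):
            model = keras.Sequential([
                keras.Input(shape=input_shape),
                layers.Conv2D(20, 5, activation="relu"), layers.MaxPooling2D(),   # 20 feature maps
                layers.Conv2D(10, 5, activation="relu"), layers.MaxPooling2D(),   # 10 feature maps
                layers.Conv2D(5, 3, activation="relu"),  layers.MaxPooling2D(),   # 5 feature maps
                layers.Flatten(),
                layers.Dense(64, activation="relu"),     # MLP risk-prediction module
                layers.Dense(1, activation="sigmoid"),   # likelihood of short-term cancer
            ])
            model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
            return model

        model = build_risk_model()
        model.summary()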

  7. Contaminant deposition building shielding factors for US residential structures.

    PubMed

    Dickson, Elijah; Hamby, David; Eckerman, Keith

    2017-10-10

    This paper presents validated building shielding factors designed for contemporary US housing-stock under an idealized, yet realistic, exposure scenario from contaminant deposition on the roof and surrounding surfaces. The building shielding factors are intended for use in emergency planning and level three probabilistic risk assessments for a variety of postulated radiological events in which a realistic assessment is necessary to better understand the potential risks for accident mitigation and emergency response planning. Factors are calculated from detailed computational housing-unit models using the general-purpose Monte Carlo N-Particle computational code, MCNP5, and are benchmarked against a series of narrow- and broad-beam measurements analyzing the shielding effectiveness of ten common general-purpose construction materials and ten shielding models representing the primary weather barriers (walls and roofs) of likely US housing-stock. Each model was designed to scale based on common residential construction practices and includes, to the extent practical, all structurally significant components important for shielding against ionizing radiation. Calculations were performed for floor-specific locations from contaminant deposition on the roof and surrounding ground, as well as for computing a weighted-average representative building shielding factor for single- and multi-story detached homes, both with and without basements, as well as for single-wide manufactured housing units. © 2017 IOP Publishing Ltd.

  8. Contaminant deposition building shielding factors for US residential structures.

    PubMed

    Dickson, E D; Hamby, D M; Eckerman, K F

    2015-06-01

    This paper presents validated building shielding factors designed for contemporary US housing-stock under an idealized, yet realistic, exposure scenario from contaminant deposition on the roof and surrounding surfaces. The building shielding factors are intended for use in emergency planning and level three probabilistic risk assessments for a variety of postulated radiological events in which a realistic assessment is necessary to better understand the potential risks for accident mitigation and emergency response planning. Factors are calculated from detailed computational housing-unit models using the general-purpose Monte Carlo N-Particle computational code, MCNP5, and are benchmarked against a series of narrow- and broad-beam measurements analyzing the shielding effectiveness of ten common general-purpose construction materials and ten shielding models representing the primary weather barriers (walls and roofs) of likely US housing-stock. Each model was designed to scale based on common residential construction practices and includes, to the extent practical, all structurally significant components important for shielding against ionizing radiation. Calculations were performed for floor-specific locations from contaminant deposition on the roof and surrounding ground, as well as for computing a weighted-average representative building shielding factor for single- and multi-story detached homes, both with and without basements, as well as for single-wide manufactured housing units.

  9. Patient- and cohort-specific dose and risk estimation for abdominopelvic CT: a study based on 100 patients

    NASA Astrophysics Data System (ADS)

    Tian, Xiaoyu; Li, Xiang; Segars, W. Paul; Frush, Donald P.; Samei, Ehsan

    2012-03-01

    The purpose of this work was twofold: (a) to estimate patient- and cohort-specific radiation dose and cancer risk index for abdominopelvic computed tomography (CT) scans; (b) to evaluate the effects of patient anatomical characteristics (size, age, and gender) and CT scanner model on dose and risk conversion coefficients. The study included 100 patient models (42 pediatric models, 58 adult models) and multi-detector array CT scanners from two commercial manufacturers (LightSpeed VCT, GE Healthcare; SOMATOM Definition Flash, Siemens Healthcare). A previously-validated Monte Carlo program was used to simulate organ dose for each patient model and each scanner, from which DLP-normalized effective dose (k factor) and DLP-normalized risk index (q factor) values were derived. The k factor showed an exponential decrease with increasing patient size. For a given gender, the q factor showed an exponential decrease with both increasing patient size and patient age. The discrepancies in k and q factors across scanners were on average 8% and 15%, respectively. This study demonstrates the feasibility of estimating patient-specific organ dose and cohort-specific effective dose and risk index in abdominopelvic CT requiring only the knowledge of patient size, gender, and age.
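
    The reported exponential dependence of the k factor on patient size can be captured with a two-parameter fit of the form k = a * exp(-b * d). The sketch below uses synthetic (diameter, k) pairs in place of the Monte Carlo results for the 100 patient models; the coefficients are invented.

        # Fit an exponential size dependence to synthetic k-factor data.
        import numpy as np
        from scipy.optimize import curve_fit

        def k_model(d_cm, a, b):
            return a * np.exp(-b * d_cm)

        rng = np.random.default_rng(5)
        d_cm = rng.uniform(15, 40, size=100)                       # effective trunk diameter
        k_true = 0.06 * np.exp(-0.045 * d_cm)                      # hypothetical true curve
        k_obs = k_true * rng.normal(1.0, 0.05, size=100)           # 5% scatter

        (a_fit, b_fit), _ = curve_fit(k_model, d_cm, k_obs, p0=(0.05, 0.04))
        print(f"k(d) ~ {a_fit:.3f} * exp(-{b_fit:.3f} d)")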

  10. AN OVERVIEW OF REDUCED ORDER MODELING TECHNIQUES FOR SAFETY APPLICATIONS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mandelli, D.; Alfonsi, A.; Talbot, P.

    2016-10-01

    The RISMC project is developing new advanced simulation-based tools to perform Computational Risk Analysis (CRA) for the existing fleet of U.S. nuclear power plants (NPPs). These tools numerically model not only the thermal-hydraulic behavior of the reactor's primary and secondary systems, but also external event temporal evolution and component/system ageing. Thus, this is not only a multi-physics problem but also a multi-scale problem (both spatial, µm-mm-m, and temporal, seconds-hours-years). As part of the RISMC CRA approach, a large number of computationally expensive simulation runs may be required. An important aspect is that even though computational power is growing, the overall computational cost of a RISMC analysis using brute-force methods may not be viable for certain cases. A solution being evaluated to address this computational issue is the use of reduced order modeling techniques. During FY2015, we investigated and applied reduced order modeling techniques to decrease the RISMC analysis computational cost by decreasing the number of simulation runs; for this analysis improvement we used surrogate models instead of the actual simulation codes. This article focuses on the use of reduced order modeling techniques that can be applied to RISMC analyses in order to generate, analyze, and visualize data. In particular, we focus on surrogate models that approximate the simulation results in a much shorter time (microseconds instead of hours/days).
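
    A minimal sketch of the surrogate-model idea: train a Gaussian-process regressor on a small number of expensive code runs and then query the surrogate instead of the code. The "expensive_code" function below is a toy stand-in for a thermal-hydraulic simulation, and the kernel choice is an assumption.

        # Gaussian-process surrogate standing in for an expensive simulation code.
        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF, ConstantKernel

        def expensive_code(x):
            """Toy stand-in: peak clad temperature as a function of two inputs."""
            return 900.0 + 150.0 * np.sin(3.0 * x[:, 0]) + 80.0 * x[:, 1] ** 2

        rng = np.random.default_rng(11)
        X_train = rng.uniform(0, 1, size=(40, 2))          # 40 affordable code runs
        y_train = expensive_code(X_train)

        gp = GaussianProcessRegressor(kernel=ConstantKernel() * RBF(), normalize_y=True)
        gp.fit(X_train, y_train)

        X_new = rng.uniform(0, 1, size=(10_000, 2))        # cheap surrogate evaluations
        y_pred, y_std = gp.predict(X_new, return_std=True)
        print("Surrogate prediction and uncertainty for the first point:",
              round(y_pred[0], 1), round(y_std[0], 1))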

  11. Uncertainty Modeling of Pollutant Transport in Atmosphere and Aquatic Route Using Soft Computing

    NASA Astrophysics Data System (ADS)

    Datta, D.

    2010-10-01

    Hazardous radionuclides are released as pollutants into the atmospheric and aquatic environment (ATAQE) during the normal operation of nuclear power plants. Atmospheric and aquatic dispersion models are routinely used to assess the impact of radionuclide releases from any nuclear facility, or of hazardous chemicals from any chemical plant, on the ATAQE. The effect of exposure to the hazardous nuclides or chemicals is measured in terms of risk. Uncertainty modeling is an integral part of the risk assessment. The paper focuses on the uncertainty modeling of pollutant transport in the atmospheric and aquatic environment using soft computing. Soft computing is used because of the lack of information on the parameters of the corresponding models. Soft computing in this domain mainly involves the use of fuzzy set theory to explore the uncertainty of the model parameters; this type of uncertainty is called epistemic uncertainty. Each uncertain input parameter of the model is described by a triangular membership function.
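
    A sketch of alpha-cut propagation of triangular fuzzy parameters through a toy dilution-factor formula follows; the one-line "dilution" function and the parameter ranges are illustrative assumptions, not those of an actual dispersion code.

        # Alpha-cut propagation of triangular fuzzy numbers through a toy transport model.
        import numpy as np

        def alpha_cut(tri, alpha):
            """Interval of a triangular fuzzy number (low, mode, high) at level alpha."""
            low, mode, high = tri
            return (low + alpha * (mode - low), high - alpha * (high - mode))

        def dilution(u, sigma_y, sigma_z):
            """Toy ground-level dilution factor ~ 1 / (pi * u * sigma_y * sigma_z)."""
            return 1.0 / (np.pi * u * sigma_y * sigma_z)

        wind_speed = (1.0, 3.0, 6.0)        # m/s, triangular fuzzy parameter
        sigma_y = (80.0, 120.0, 180.0)      # m
        sigma_z = (30.0, 60.0, 100.0)       # m

        for alpha in (0.0, 0.5, 1.0):
            u_lo, u_hi = alpha_cut(wind_speed, alpha)
            sy_lo, sy_hi = alpha_cut(sigma_y, alpha)
            sz_lo, sz_hi = alpha_cut(sigma_z, alpha)
            # Dilution decreases monotonically in every parameter, so the interval
            # endpoints map directly to the output interval.
            hi = dilution(u_lo, sy_lo, sz_lo)
            lo = dilution(u_hi, sy_hi, sz_hi)
            print(f"alpha = {alpha:.1f}: dilution factor in [{lo:.2e}, {hi:.2e}] s/m^3")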

  12. Applying under-sampling techniques and cost-sensitive learning methods on risk assessment of breast cancer.

    PubMed

    Hsu, Jia-Lien; Hung, Ping-Cheng; Lin, Hung-Yen; Hsieh, Chung-Ho

    2015-04-01

    Breast cancer is one of the most common causes of cancer mortality. Early detection through mammography screening could significantly reduce mortality from breast cancer. However, most screening methods consume a large amount of resources. We propose a computational model, based solely on personal health information, for breast cancer risk assessment. Our model can serve as a pre-screening program in low-cost settings. In our study, the dataset, consisting of 3976 records, was collected from Taipei City Hospital from 2008/01/01 to 2008/12/31. Based on this dataset, we first applied sampling techniques and a dimension reduction method to preprocess the data. Then, we constructed various kinds of classifiers (including basic classifiers, ensemble methods, and cost-sensitive methods) to predict the risk. The cost-sensitive method with a random forest classifier was able to achieve a recall (sensitivity) of 100%. At a recall of 100%, the precision (positive predictive value, PPV) and specificity of the cost-sensitive method with a random forest classifier were 2.9% and 14.87%, respectively. In this study, we built a breast cancer risk assessment model using data mining techniques. Our model has the potential to serve as an assisting tool in breast cancer screening.
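
    A hedged sketch of the cost-sensitive step using scikit-learn: class weights penalize missed cancer cases far more than false alarms, which pushes recall toward 100% at the expense of precision. The synthetic data and the 1:50 cost ratio are assumptions, not the hospital data or the authors' exact settings.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import precision_score, recall_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for an imbalanced personal-health-information dataset.
X, y = make_classification(n_samples=4000, n_features=12, weights=[0.95, 0.05],
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

# Cost-sensitive learning via class weights: misclassifying a positive case is
# penalized much more heavily than a false alarm (the 1:50 ratio is assumed).
clf = RandomForestClassifier(n_estimators=200, class_weight={0: 1, 1: 50},
                             random_state=0)
clf.fit(X_tr, y_tr)
y_pred = clf.predict(X_te)
print("recall   :", recall_score(y_te, y_pred))
print("precision:", precision_score(y_te, y_pred))
```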

  13. Pilot Study of a Computer-Based Parental Questionnaire and Visual Profile of Obesity Risk in Healthy Preschoolers.

    PubMed

    Davies, Marilyn A; Terhorst, Lauren; Zhang, Peng; Nakonechny, Amanda J; Nowalk, Mary Patricia

    2015-01-01

    This group field-tested a computer-based parental questionnaire entitled the Childhood Obesity Risk Questionnaire 2-5 (CORQ 2-5), designed to assess obesity risk in healthy preschoolers. The CORQ 2-5 generates a profile of seven obesity risk factors. Field studies provided good internal reliability data and evidence of discriminant validity for the CORQ 2-5. Pediatric nurse clinicians found the CORQ 2-5 profile to be clinically relevant. The CORQ 2-5 is a promising measure of obesity risk in preschoolers who attend community-based health centers for their well-child visits and who are not yet obese. The CORQ 2-5 is intended to guide provider-parental obesity risk discussions. Copyright © 2015 Elsevier Inc. All rights reserved.

  14. A Stochastic Model of Space Radiation Transport as a Tool in the Development of Time-Dependent Risk Assessment

    NASA Technical Reports Server (NTRS)

    Kim, Myung-Hee Y.; Nounu, Hatem N.; Ponomarev, Artem L.; Cucinotta, Francis A.

    2011-01-01

    A new computer model, the GCR Event-based Risk Model code (GERMcode), was developed to describe biophysical events from high-energy protons and heavy ions that have been studied at the NASA Space Radiation Laboratory (NSRL) [1] for the purpose of simulating space radiation biological effects. In the GERMcode, the biophysical description of the passage of heavy ions in tissue and shielding materials is made with a stochastic approach that includes both ion track structure and nuclear interactions. The GERMcode accounts for the major nuclear interaction processes of importance for describing heavy ion beams, including nuclear fragmentation, elastic scattering, and knockout-cascade processes by using the quantum multiple scattering fragmentation (QMSFRG) model [2]. The QMSFRG model has been shown to be in excellent agreement with available experimental data for nuclear fragmentation cross sections

  15. Prediction of HIV Sexual Risk Behaviors among Disadvantaged African American Adults using a Syndemic Conceptual Framework

    PubMed Central

    Nehl, Eric J.; Klein, Hugh; Sterk, Claire E.; Elifson, Kirk W.

    2015-01-01

    The focus of this paper is on HIV sexual risk taking among a community-based sample of disadvantaged African American adults. The objective is to examine multiple factors associated with sexual HIV risk behaviors within a syndemic conceptual framework. Face-to-face, computer-assisted, structured interviews were conducted with 1,535 individuals in Atlanta, Georgia. Bivariate analyses indicated a high level of association among the HIV sexual risks and other factors. Results from multivariate models indicated that gender, sexual orientation, relationship status, self-esteem, condom use self-efficacy, sex while the respondent was high, and sex while the partner was high were significant predictors of condomless sex. Additionally, a multivariate additive model of risk behaviors indicated that the number of health risks significantly increased the risk of condomless sex. This intersection of HIV sexual risk behaviors and their associations with various other behavioral, sociodemographic, and psychological functioning factors helps explain HIV risk taking among this sample of African American adults and highlights the need for research and practice that accounts for multiple health behaviors and problems. PMID:26188618

  16. A simulation study to quantify the impacts of exposure ...

    EPA Pesticide Factsheets

    A simulation study to quantify the impacts of exposure measurement error on air pollution health risk estimates in copollutant time-series models. The National Exposure Research Laboratory (NERL) Computational Exposure Division (CED) develops and evaluates data, decision-support tools, and models to be applied to media-specific or receptor-specific problem areas. CED uses modeling-based approaches to characterize exposures, evaluate fate and transport, and support environmental diagnostics/forensics with input from multiple data sources. It also develops media- and receptor-specific models, process models, and decision support tools for use both within and outside of EPA.

  17. Development of a GIS-based spill management information system.

    PubMed

    Martin, Paul H; LeBoeuf, Eugene J; Daniel, Edsel B; Dobbins, James P; Abkowitz, Mark D

    2004-08-30

    Spill Management Information System (SMIS) is a geographic information system (GIS)-based decision support system designed to effectively manage the risks associated with accidental or intentional releases of a hazardous material into an inland waterway. SMIS provides critical planning and impact information to emergency responders in anticipation of, or following such an incident. SMIS couples GIS and database management systems (DBMS) with the 2-D surface water model CE-QUAL-W2 Version 3.1 and the air contaminant model Computer-Aided Management of Emergency Operations (CAMEO) while retaining full GIS risk analysis and interpretive capabilities. Live 'real-time' data links are established within the spill management software to utilize current meteorological information and flowrates within the waterway. Capabilities include rapid modification of modeling conditions to allow for immediate scenario analysis and evaluation of 'what-if' scenarios. The functionality of the model is illustrated through a case study of the Cheatham Reach of the Cumberland River near Nashville, TN.

  18. Percent Mammographic Density and Dense Area as Risk Factors for Breast Cancer.

    PubMed

    Rauh, C; Hack, C C; Häberle, L; Hein, A; Engel, A; Schrauder, M G; Fasching, P A; Jud, S M; Ekici, A B; Loehberg, C R; Meier-Meitinger, M; Ozan, S; Schulz-Wendtland, R; Uder, M; Hartmann, A; Wachter, D L; Beckmann, M W; Heusinger, K

    2012-08-01

    Purpose: Mammographic characteristics are known to be correlated with breast cancer risk. Percent mammographic density (PMD), as assessed by computer-assisted methods, is an established risk factor for breast cancer. Along with this assessment, the absolute dense area (DA) of the breast is reported as well. The aim of this study was to assess the predictive value of DA concerning breast cancer risk in addition to other risk factors and in addition to PMD. Methods: We conducted a case-control study with hospital-based patients with a diagnosis of invasive breast cancer and healthy women as controls. A total of 561 patients and 376 controls with available mammographic density were included in this study. We describe the differences between cases and controls concerning the common risk factors BMI, parity, use of hormone replacement therapy (HRT) and menopause, and estimate the odds ratios for PMD and DA, adjusted for the mentioned risk factors. Furthermore, we compare the prediction models with each other to find out whether the addition of DA improves the model. Results: Mammographic density and DA were highly correlated with each other. Both variables were also correlated with the commonly known risk factors in the expected direction and strength; however, PMD (ρ = -0.56) was more strongly correlated with BMI than DA (ρ = -0.11). The group of women within the highest quartile of PMD had an OR of 2.12 (95 % CI: 1.25-3.62). This could not be seen for the fourth quartile concerning DA. However, the assessment of breast cancer risk could be improved by including DA in a prediction model in addition to common risk factors and PMD. Conclusions: The inclusion of the parameter DA into a prediction model for breast cancer in addition to established risk factors and PMD could improve breast cancer risk assessment. As DA is measured together with PMD in the process of computer-assisted assessment of PMD, it might be considered as one additional breast cancer risk factor obtained from breast imaging.
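
    The sketch below shows, on simulated data only, the kind of adjusted odds-ratio and model-comparison computation described above: a logistic model with BMI, HRT use and a top-PMD-quartile indicator, and a likelihood-ratio comparison after adding dense area. All variable names and values are assumptions, not the study data.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated case-control data for illustration only (no study data are used).
rng = np.random.default_rng(1)
n = 900
df = pd.DataFrame({
    "case": rng.integers(0, 2, n),
    "bmi": rng.normal(26, 4, n),
    "hrt": rng.integers(0, 2, n),
    "pmd_q4": rng.integers(0, 2, n),       # indicator: highest PMD quartile
    "dense_area": rng.normal(40, 15, n),   # absolute dense area [cm^2]
})

base = smf.logit("case ~ bmi + hrt + pmd_q4", data=df).fit(disp=0)
full = smf.logit("case ~ bmi + hrt + pmd_q4 + dense_area", data=df).fit(disp=0)

print("adjusted OR for top PMD quartile:", np.exp(base.params["pmd_q4"]))
# Likelihood-ratio statistic: does adding dense area improve the model?
print("LR statistic for adding DA:", 2 * (full.llf - base.llf))
```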

  19. Hemodynamic simulations in coronary aneurysms of children with Kawasaki disease

    NASA Astrophysics Data System (ADS)

    Sengupta, Dibyendu; Burns, Jane; Marsden, Alison

    2009-11-01

    Kawasaki disease (KD) is a serious pediatric illness affecting the cardiovascular system. One of the most serious complications of KD, occurring in about 25% of untreated cases, is the formation of large aneurysms in the coronary arteries, which put patients at risk for myocardial infarction. In this project we performed patient specific computational simulations of blood flow in aneurysmal left and right coronary arteries of a KD patient to gain an understanding about their hemodynamics. Models were constructed from CT data using custom software. Typical pulsatile flow waveforms were applied at the model inlets, while resistance and RCR lumped models were applied and compared at the outlets. Simulated pressure waveforms compared well with typical physiologic data. High wall shear stress values are found in the narrow region at the base of the aneurysm and low shear values occur in regions of recirculation. A Lagrangian approach has been adopted to perform particle tracking and compute particle residence time in the recirculation. Our long-term goal will be to develop links between hemodynamics and the risk for thrombus formation in order to assist in clinical decision-making.

  20. An efficient sampling strategy for selection of biobank samples using risk scores.

    PubMed

    Björk, Jonas; Malmqvist, Ebba; Rylander, Lars; Rignell-Hydbom, Anna

    2017-07-01

    The aim of this study was to suggest a new sample-selection strategy based on risk scores in case-control studies with biobank data. An ongoing Swedish case-control study on fetal exposure to endocrine disruptors and overweight in early childhood was used as the empirical example. Cases were defined as children with a body mass index (BMI) ⩾18 kg/m² (n=545) at four years of age, and controls as children with a BMI of ⩽17 kg/m² (n=4472 available). The risk of being overweight was modelled using logistic regression based on available covariates from the health examination and prior to selecting samples from the biobank. A risk score was estimated for each child and categorised as low (0-5%), medium (6-13%) or high (⩾14%) risk of being overweight. The final risk-score model, with smoking during pregnancy (p=0.001), birth weight (p<0.001), BMI of both parents (p<0.001 for both), type of residence (p=0.04) and economic situation (p=0.12), yielded an area under the receiver operating characteristic curve of 67% (n=3945 with complete data). The case group (n=416) had the following risk-score profile: low (12%), medium (46%) and high risk (43%). Twice as many controls were selected from each risk group, with further matching on sex. Computer simulations showed that the proposed selection strategy with stratification on risk scores yielded consistent improvements in statistical precision. Using risk scores based on available survey or register data as a basis for sample selection may improve possibilities to study heterogeneity of exposure effects in biobank-based studies.
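
    A minimal sketch of the proposed selection strategy, with assumed covariates, effect sizes, and cutoffs rather than the Swedish study data: fit a risk-score model on covariates available before biobank sampling, bin subjects into low/medium/high risk, and draw twice as many controls as cases within each risk group and sex stratum.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

# Illustrative sketch only: covariates, effect sizes, and cutoffs are assumptions.
rng = np.random.default_rng(2)
n = 5000
df = pd.DataFrame({
    "birth_weight": rng.normal(3500, 500, n),
    "parent_bmi": rng.normal(26, 4, n),
    "smoking": rng.integers(0, 2, n),
    "sex": rng.integers(0, 2, n),
})
lin = (-2.5 + 0.4 * df["smoking"] + 0.08 * (df["parent_bmi"] - 26)
       + 0.0004 * (df["birth_weight"] - 3500))
df["overweight"] = (rng.random(n) < 1.0 / (1.0 + np.exp(-lin))).astype(int)

# 1) Risk-score model fitted on covariates available before biobank sampling.
features = ["birth_weight", "parent_bmi", "smoking"]
model = LogisticRegression(max_iter=1000).fit(df[features], df["overweight"])
df["risk"] = model.predict_proba(df[features])[:, 1]
df["risk_group"] = pd.cut(df["risk"], bins=[0.0, 0.05, 0.13, 1.0],
                          labels=["low", "medium", "high"])

# 2) Within each risk group and sex, select twice as many controls as cases.
cases = df[df["overweight"] == 1]
controls = df[df["overweight"] == 0]
selected = []
for (grp, sex), block in cases.groupby(["risk_group", "sex"], observed=True):
    pool = controls[(controls["risk_group"] == grp) & (controls["sex"] == sex)]
    selected.append(pool.sample(min(2 * len(block), len(pool)), random_state=0))
print("cases:", len(cases), "controls selected:", sum(len(s) for s in selected))
```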

  1. Medical image computing for computer-supported diagnostics and therapy. Advances and perspectives.

    PubMed

    Handels, H; Ehrhardt, J

    2009-01-01

    Medical image computing has become one of the most challenging fields in medical informatics. In the image-based diagnostics of the future, software assistance will become more and more important, and image analysis systems integrating advanced image computing methods are needed to extract quantitative image parameters to characterize the state and changes of image structures of interest (e.g. tumors, organs, vessels, bones, etc.) in a reproducible and objective way. Furthermore, in the field of software-assisted and navigated surgery, medical image computing methods play a key role and have opened up new perspectives for patient treatment. However, further developments are needed to increase the degree of automation, accuracy, reproducibility and robustness. Moreover, the systems developed have to be integrated into the clinical workflow. For the development of advanced image computing systems, methods from different scientific fields have to be adapted and used in combination. The principal methodologies in medical image computing are the following: image segmentation, image registration, image analysis for quantification and computer-assisted image interpretation, modeling and simulation, as well as visualization and virtual reality. In particular, model-based image computing techniques open up new perspectives for the prediction of organ changes and risk analysis of patients and will gain importance in the diagnostics and therapy of the future. From a methodological point of view, the authors identify the following future trends and perspectives in medical image computing: development of optimized application-specific systems and integration into the clinical workflow, enhanced computational models for image analysis and virtual reality training systems, integration of different image computing methods, further integration of multimodal image data and biosignals, and advanced methods for 4D medical image computing. The development of image analysis systems for diagnostic support or operation planning is a complex interdisciplinary process. Image computing methods enable new insights into the patient's image data and have the future potential to improve medical diagnostics and patient treatment.

  2. Contribution of future urbanisation expansion to flood risk changes

    NASA Astrophysics Data System (ADS)

    Bruwier, Martin; Mustafa, Ahmed; Archambeau, Pierre; Erpicum, Sébastien; Pirotton, Michel; Teller, Jacques; Dewals, Benjamin

    2016-04-01

    Flood risk is expected to increase in the future due to climate change and urban development. Climate change modifies flood hazard, and urban development influences exposure and vulnerability to floods. While the influence of climate change on flood risk has been studied widely, the impact of urban development also needs to be considered in a sustainable flood risk management approach. The main goal of this study is to determine the sensitivity of future flood risk to different urban development scenarios at a relatively short time horizon in the River Meuse basin in Wallonia (Belgium). From the different scenarios, the expected impact of urban development on flood risk is assessed. Three urban expansion scenarios are developed up to 2030 based on a coupled cellular automata (CA) and agent-based (AB) urban expansion model: (i) business-as-usual, (ii) restrictive and (iii) extreme expansion scenarios. The main factor controlling these scenarios is the future urban land demand. Each urban expansion scenario is developed by considering, or not, high and/or medium flood hazard zones as a constraint for urban development. To assess the model's performance, it is calibrated for the Meuse River valley (Belgium) to simulate urban expansion between 1990 and 2000. Calibration results are then assessed by comparing the 2000 simulated land-use map and the actual 2000 land-use map. The flood damage estimation for each urban expansion scenario is determined for five flood discharges by overlaying the inundation map resulting from a hydraulic computation and the urban expansion map, and by using damage curves and specific prices. The hydraulic model Wolf2D has been extensively validated by comparisons between observations and computational results during flood events. This study focuses only on mobile and immobile prices for urban land, which are associated with the most severe damages caused by floods along the River Meuse. The findings of this study offer tools to guide urban expansion based on various policy visions to mitigate future flood risk along the Meuse River. In particular, we assess the impacts on future flood risk of the prohibition of urban development in high and/or medium flood hazard zones. Acknowledgements: The research was funded through the ARC grant for Concerted Research Actions, financed by the Wallonia-Brussels Federation.
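
    A simplified sketch of the damage-estimation step described above: an inundation depth map from the hydraulic model is overlaid with an urban land-use map from the expansion scenario, and a depth-damage curve converts depth into a fraction of each cell's value. The curve, the maps, and the unit value below are illustrative assumptions, not the study's damage functions or prices.

```python
import numpy as np

# Illustrative depth-damage curve for urban land (fractions are assumptions).
depth_pts = np.array([0.0, 0.5, 1.0, 2.0, 4.0])      # water depth [m]
damage_frac = np.array([0.0, 0.15, 0.35, 0.65, 0.9])  # fraction of value lost

def cell_damage(depth_m, is_urban, value_per_cell):
    """Damage in one raster cell: overlay of inundation depth and urban land use."""
    frac = np.interp(depth_m, depth_pts, damage_frac)
    return frac * value_per_cell * is_urban

# Overlay of a simulated inundation map with an urban-expansion scenario map.
depth_map = np.array([[0.0, 0.8], [1.6, 3.0]])        # from the hydraulic model
urban_map = np.array([[0, 1], [1, 1]])                # from the CA/AB scenario
total_damage = cell_damage(depth_map, urban_map, value_per_cell=2.5e5).sum()
print("scenario damage estimate:", total_damage, "(illustrative currency units)")
```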

  3. IAQ MODEL FOR WINDOWS - RISK VERSION 1.0 USER MANUAL

    EPA Science Inventory

    The manual describes the use of the computer model, RISK, to calculate individual exposure to indoor air pollutants from sources. The model calculates exposure due to individual, as opposed to population, activity patterns and source use. The model also provides the capability to...

  4. Improvements to the Ionizing Radiation Risk Assessment Program for NASA Astronauts

    NASA Technical Reports Server (NTRS)

    Semones, E. J.; Bahadori, A. A.; Picco, C. E.; Shavers, M. R.; Flores-McLaughlin, J.

    2011-01-01

    To perform dosimetry and risk assessment, NASA collects astronaut ionizing radiation exposure data from space flight, medical imaging and therapy, aviation training activities and prior occupational exposure histories. Career risk of exposure induced death (REID) from radiation is limited to 3 percent at a 95 percent confidence level. The Radiation Health Office at Johnson Space Center (JSC) is implementing a program to integrate the gathering, storage, analysis and reporting of astronaut ionizing radiation dose and risk data and records. This work has several motivations, including more efficient analyses and greater flexibility in testing and adopting new methods for evaluating risks. The foundation for these improvements is a set of software tools called the Astronaut Radiation Exposure Analysis System (AREAS). AREAS is a series of MATLAB(Registered TradeMark)-based dose and risk analysis modules that interface with an enterprise level SQL Server database by means of a secure web service. It communicates with other JSC medical and space weather databases to maintain data integrity and consistency across systems. AREAS is part of a larger NASA Space Medicine effort, the Mission Medical Integration Strategy, with the goal of collecting accurate, high-quality and detailed astronaut health data, and then securely, timely and reliably presenting it to medical support personnel. The modular approach to the AREAS design accommodates past, current, and future sources of data from active and passive detectors, space radiation transport algorithms, computational phantoms and cancer risk models. Revisions of the cancer risk model, new radiation detection equipment and improved anthropomorphic computational phantoms can be incorporated. Notable hardware updates include the Radiation Environment Monitor (which uses Medipix technology to report real-time, on-board dosimetry measurements), an updated Tissue-Equivalent Proportional Counter, and the Southwest Research Institute Radiation Assessment Detector. Also, the University of Florida hybrid phantoms, which are flexible in morphometry and positioning, are being explored as alternatives to the current NASA computational phantoms.

  5. The development of a 3D risk analysis method.

    PubMed

    I, Yet-Pole; Cheng, Te-Lung

    2008-05-01

    Much attention has been paid to quantitative risk analysis (QRA) research in recent years due to the increasingly severe disasters that have happened in the process industries. Owing to its calculation complexity, very few software packages, such as SAFETI, can really make the risk presentation meet practical requirements. However, the traditional risk presentation method, like the individual risk contour in SAFETI, is mainly based on the consequence analysis results of dispersion modeling, which usually assumes that the vapor cloud disperses over a constant ground roughness on flat terrain with no obstructions and no concentration fluctuations, which is quite different from the real situation of a chemical process plant. All these models usually over-predict the hazardous regions in order to maintain their conservativeness, which also increases the uncertainty of the simulation results. On the other hand, a more rigorous model such as a computational fluid dynamics (CFD) model can resolve the previous limitations; however, it cannot resolve the complexity of the risk calculations. In this research, a conceptual three-dimensional (3D) risk calculation method was proposed via the combination of the results of a series of CFD simulations with post-processing procedures to obtain 3D individual risk iso-surfaces. It is believed that such a technique will not be limited to risk analysis at ground level, but can also be extended to aerial, submarine, or space risk analyses in the near future.
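
    The toy sketch below illustrates the post-processing idea of building individual risk on a 3D grid by summing, over accident scenarios, the scenario frequency times a fatality probability derived from a consequence field. The concentration field and dose-response relation are stand-ins for CFD outputs and probit functions, and all numbers are assumptions.

```python
import numpy as np

# Individual risk on a 3D grid: sum over accident scenarios of the scenario
# frequency times the probability of fatality at each grid point.
nx, ny, nz = 40, 40, 20
x, y, z = np.meshgrid(np.linspace(0, 200, nx), np.linspace(0, 200, ny),
                      np.linspace(0, 50, nz), indexing="ij")

def toy_concentration(x, y, z, src):
    """Stand-in for a CFD-computed concentration field around a release source."""
    r2 = (x - src[0])**2 + (y - src[1])**2 + (z - src[2])**2
    return 1e4 * np.exp(-r2 / 800.0)

scenarios = [  # (frequency per year, source location) -- illustrative values
    (1e-4, (50.0, 60.0, 2.0)),
    (5e-5, (140.0, 120.0, 2.0)),
]
risk = np.zeros_like(x)
for freq, src in scenarios:
    conc = toy_concentration(x, y, z, src)
    p_fatality = 1.0 - np.exp(-conc / 5e3)   # toy dose-response relation
    risk += freq * p_fatality

# The 1e-6 per year iso-surface could then be extracted, e.g. with marching cubes.
print("maximum individual risk per year:", risk.max())
```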

  6. A protect solution for data security in mobile cloud storage

    NASA Astrophysics Data System (ADS)

    Yu, Xiaojun; Wen, Qiaoyan

    2013-03-01

    It is popular to access cloud storage with mobile devices. However, this application suffers from data security risks, especially data leakage and privacy violation problems. These risks exist not only in the cloud storage system, but also in the mobile client platform. To reduce the security risk, this paper proposes a new security solution that makes full use of searchable encryption and trusted computing technology. Given the performance limits of mobile devices, it proposes a trusted-proxy-based protection architecture. The basic design idea, deployment model, and key flows are detailed. Analysis of security and performance shows the advantages of the solution.

  7. Eigentumors for prediction of treatment failure in patients with early-stage breast cancer using dynamic contrast-enhanced MRI: a feasibility study

    NASA Astrophysics Data System (ADS)

    Chan, H. M.; van der Velden, B. H. M.; E Loo, C.; Gilhuijs, K. G. A.

    2017-08-01

    We present a radiomics model to discriminate between patients at low risk and those at high risk of treatment failure at long-term follow-up based on eigentumors: principal components computed from volumes encompassing tumors in washin and washout images of pre-treatment dynamic contrast-enhanced (DCE-) MR images. Eigentumors were computed from the images of 563 patients from the MARGINS study. Subsequently, a least absolute shrinkage selection operator (LASSO) selected candidates from the components that contained 90% of the variance of the data. The model for prediction of survival after treatment (median follow-up time 86 months) was based on logistic regression. Receiver operating characteristic (ROC) analysis was applied and area-under-the-curve (AUC) values were computed as measures of training and cross-validated performances. The discriminating potential of the model was confirmed using Kaplan-Meier survival curves and log-rank tests. From the 322 principal components that explained 90% of the variance of the data, the LASSO selected 28 components. The ROC curves of the model yielded AUC values of 0.88, 0.77 and 0.73, for the training, leave-one-out cross-validated and bootstrapped performances, respectively. The bootstrapped Kaplan-Meier survival curves confirmed significant separation for all tumors (P  <  0.0001). Survival analysis on immunohistochemical subgroups shows significant separation for the estrogen-receptor subtype tumors (P  <  0.0001) and the triple-negative subtype tumors (P  =  0.0039), but not for tumors of the HER2 subtype (P  =  0.41). The results of this retrospective study show the potential of early-stage pre-treatment eigentumors for use in prediction of treatment failure of breast cancer.
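
    The following sketch mirrors the described pipeline on synthetic data: principal components retaining 90% of the variance (the "eigentumors"), LASSO as a selector over the component scores, and logistic regression scored by ROC AUC. The synthetic latent-factor data are an assumption standing in for the MARGINS DCE-MRI volumes.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LassoCV, LogisticRegression
from sklearn.metrics import roc_auc_score

# Synthetic stand-in for flattened washin/washout tumor volumes (563 patients);
# three hidden latent factors play the role of prognostic image structure.
rng = np.random.default_rng(3)
latent = rng.normal(size=(563, 3))
X = latent @ rng.normal(size=(3, 4096)) + 0.5 * rng.normal(size=(563, 4096))
y = (latent[:, 0] + 0.3 * rng.normal(size=563) > 0).astype(int)

# 1) "Eigentumors": principal components retaining 90% of the variance.
pca = PCA(n_components=0.90, svd_solver="full")
scores = pca.fit_transform(X)

# 2) LASSO as a selector over the component scores.
lasso = LassoCV(cv=5).fit(scores, y)
selected = np.flatnonzero(lasso.coef_)
if selected.size == 0:                 # fallback in case LASSO zeroes everything
    selected = np.arange(scores.shape[1])

# 3) Logistic regression on the selected components, scored by ROC AUC.
clf = LogisticRegression(max_iter=1000).fit(scores[:, selected], y)
auc = roc_auc_score(y, clf.predict_proba(scores[:, selected])[:, 1])
print(f"{scores.shape[1]} components kept, {selected.size} selected, AUC = {auc:.2f}")
```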

  8. Evidence-based ergonomics education: Promoting risk factor awareness among office computer workers.

    PubMed

    Mani, Karthik; Provident, Ingrid; Eckel, Emily

    2016-01-01

    Work-related musculoskeletal disorders (WMSDs) related to computer work have become a serious public health concern. Literature revealed a positive association between computer use and WMSDs. The purpose of this evidence-based pilot project was to provide a series of evidence-based educational sessions on ergonomics to office computer workers to enhance the awareness of risk factors of WMSDs. Seventeen office computer workers who work for the National Board of Certification in Occupational Therapy volunteered for this project. Each participant completed a baseline and post-intervention ergonomics questionnaire and attended six educational sessions. The Rapid Office Strain Assessment and an ergonomics questionnaire were used for data collection. The post-intervention data revealed that 89% of participants were able to identify a greater number of risk factors and answer more questions correctly in knowledge tests of the ergonomics questionnaire. Pre- and post-intervention comparisons showed changes in work posture and behaviors (taking rest breaks, participating in exercise, adjusting workstation) of participants. The findings have implications for injury prevention in office settings and suggest that ergonomics education may yield positive knowledge and behavioral changes among computer workers.

  9. BRICK v0.2, a simple, accessible, and transparent model framework for climate and regional sea-level projections

    NASA Astrophysics Data System (ADS)

    Wong, Tony E.; Bakker, Alexander M. R.; Ruckert, Kelsey; Applegate, Patrick; Slangen, Aimée B. A.; Keller, Klaus

    2017-07-01

    Simple models can play pivotal roles in the quantification and framing of uncertainties surrounding climate change and sea-level rise. They are computationally efficient, transparent, and easy to reproduce. These qualities also make simple models useful for the characterization of risk. Simple model codes are increasingly distributed as open source, as well as actively shared and guided. Alas, computer codes used in the geosciences can often be hard to access, run, modify (e.g., with regards to assumptions and model components), and review. Here, we describe the simple model framework BRICK (Building blocks for Relevant Ice and Climate Knowledge) v0.2 and its underlying design principles. The paper adds detail to an earlier published model setup and discusses the inclusion of a land water storage component. The framework largely builds on existing models and allows for projections of global mean temperature as well as regional sea levels and coastal flood risk. BRICK is written in R and Fortran. BRICK gives special attention to the model values of transparency, accessibility, and flexibility in order to mitigate the above-mentioned issues while maintaining a high degree of computational efficiency. We demonstrate the flexibility of this framework through simple model intercomparison experiments. Furthermore, we demonstrate that BRICK is suitable for risk assessment applications by using a didactic example in local flood risk management.

  10. IEEE 1982. Proceedings of the international conference on cybernetics and society

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1982-01-01

    The following topics were dealt with: knowledge-based systems; risk analysis; man-machine interactions; human information processing; metaphor, analogy and problem-solving; manual control modelling; transportation systems; simulation; adaptive and learning systems; biocybernetics; cybernetics; mathematical programming; robotics; decision support systems; analysis, design and validation of models; computer vision; systems science; energy systems; environmental modelling and policy; pattern recognition; nuclear warfare; technological forecasting; artificial intelligence; the Turin shroud; optimisation; workloads. Abstracts of individual papers can be found under the relevant classification codes in this or future issues.

  11. Validating and Extending the Three Process Model of Alertness in Airline Operations

    PubMed Central

    Ingre, Michael; Van Leeuwen, Wessel; Klemets, Tomas; Ullvetter, Christer; Hough, Stephen; Kecklund, Göran; Karlsson, David; Åkerstedt, Torbjörn

    2014-01-01

    Sleepiness and fatigue are important risk factors in the transport sector and bio-mathematical sleepiness, sleep and fatigue modeling is increasingly becoming a valuable tool for assessing safety of work schedules and rosters in Fatigue Risk Management Systems (FRMS). The present study sought to validate the inner workings of one such model, Three Process Model (TPM), on aircrews and extend the model with functions to model jetlag and to directly assess the risk of any sleepiness level in any shift schedule or roster with and without knowledge of sleep timings. We collected sleep and sleepiness data from 136 aircrews in a real life situation by means of an application running on a handheld touch screen computer device (iPhone, iPod or iPad) and used the TPM to predict sleepiness with varying level of complexity of model equations and data. The results based on multilevel linear and non-linear mixed effects models showed that the TPM predictions correlated with observed ratings of sleepiness, but explorative analyses suggest that the default model can be improved and reduced to include only two-processes (S+C), with adjusted phases of the circadian process based on a single question of circadian type. We also extended the model with a function to model jetlag acclimatization and with estimates of individual differences including reference limits accounting for 50%, 75% and 90% of the population as well as functions for predicting the probability of any level of sleepiness for ecological assessment of absolute and relative risk of sleepiness in shift systems for safety applications. PMID:25329575
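
    A minimal two-process (S + C) sketch in the spirit of the reduced model discussed above: an exponentially declining homeostatic term during wakefulness plus a sinusoidal circadian term. All parameter values are illustrative defaults, not the study's mixed-effects estimates.

```python
import numpy as np

def homeostatic_S(hours_awake, S0=14.0, low=2.4, tau=18.2):
    """Exponential decline of the homeostatic component while awake (assumed values)."""
    return low + (S0 - low) * np.exp(-hours_awake / tau)

def circadian_C(clock_hour, amplitude=2.5, acrophase=16.8):
    """Sinusoidal circadian component peaking in the late afternoon (assumed values)."""
    return amplitude * np.cos(2 * np.pi * (clock_hour - acrophase) / 24.0)

def predicted_alertness(clock_hour, wake_hour=7.0):
    hours_awake = (clock_hour - wake_hour) % 24.0
    return homeostatic_S(hours_awake) + circadian_C(clock_hour)

for t in (9, 15, 21, 3):   # predicted alertness across a duty day and night
    print(f"{t:02d}:00 -> {predicted_alertness(t):.1f}")
```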

  12. A screening-level modeling approach to estimate nitrogen ...

    EPA Pesticide Factsheets

    This paper presents a screening-level modeling approach that can be used to rapidly estimate nutrient loading and assess the risk of exceeding numerical nutrient standards in surface waters, which can lead to potential classification as impaired for designated use. It can also be used to explore best management practice (BMP) implementation to reduce loading. The modeling framework uses a hybrid statistical and process-based approach to estimate sources of pollutants and their transport and decay in the terrestrial and aquatic parts of watersheds. The framework is developed in the ArcGIS environment and is based on the total maximum daily load (TMDL) balance model. Nitrogen (N) is currently addressed in the framework, referred to as WQM-TMDL-N. Loading for each catchment includes non-point sources (NPS) and point sources (PS). NPS loading is estimated using export coefficient or event mean concentration methods depending on the temporal scale, i.e., annual or daily. Loading from atmospheric deposition is also included. The probability that a nutrient load exceeds a target load is evaluated using probabilistic risk assessment, by including the uncertainty associated with the export coefficients of various land uses. The computed risk data can be visualized as spatial maps which show the load exceedance probability for all stream segments. In an application of this modeling approach to the Tippecanoe River watershed in Indiana, USA, total nitrogen (TN) loading and the risk of standard exceedance were estimated.
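
    As a hedged illustration of the probabilistic load-exceedance idea, the sketch below sums land-use areas times uncertain export coefficients (plus point sources and deposition) in a Monte Carlo loop and reports the probability of exceeding a target load. All areas, coefficients, and the target are assumptions, not WQM-TMDL-N inputs.

```python
import numpy as np

# Annual nitrogen load of one catchment as the sum of land-use areas times
# uncertain export coefficients, plus point sources and atmospheric deposition.
rng = np.random.default_rng(4)
areas_ha = {"cropland": 1200.0, "urban": 300.0, "forest": 900.0}
ec_mean = {"cropland": 15.0, "urban": 9.0, "forest": 2.0}   # kg N/ha/yr (assumed)
ec_sigma = 0.3               # lognormal spread of the export coefficients (assumed)
point_source = 4000.0        # kg N/yr
deposition = 1500.0          # kg N/yr
target_load = 30000.0        # kg N/yr, e.g. a TMDL target (assumed)

n_draws = 50_000
load = np.full(n_draws, point_source + deposition)
for lu, area in areas_ha.items():
    ec = rng.lognormal(mean=np.log(ec_mean[lu]), sigma=ec_sigma, size=n_draws)
    load += ec * area

print("P(load exceeds target) =", np.mean(load > target_load))
```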

  13. Risk assessment of occupational exposure to benzene using numerical simulation in a complex geometry of a reforming unit of petroleum refinery.

    PubMed

    Bayatian, Majid; Ashrafi, Khosro; Azari, Mansour Rezazadeh; Jafari, Mohammad Javad; Mehrabi, Yadollah

    2018-04-01

    There has been increasing concern about the continuous and sudden release of volatile organic pollutants from petroleum refineries and the resulting occupational and environmental exposures. Benzene is one of the most prevalent volatile compounds, and its potential toxicity in occupational and environmental settings has been addressed by many authors. Due to the complexities of sampling and analysis of benzene in routine and accidental situations, a reliable estimation of the benzene concentration in the outdoor setting of a refinery using computational fluid dynamics (CFD) could be instrumental for occupational exposure risk assessment. In the present work, a computational fluid dynamics model was applied for exposure risk assessment, with benzene considered to be released continuously from a reforming unit of a refinery. For the simulation of benzene dispersion, GAMBIT, FLUENT, and CFD-Post software are used for preprocessing, processing, and post-processing, respectively. Computational fluid dynamics validation was carried out by comparing the computed data with experimental measurements. Eventually, the chronic daily intake and lifetime cancer risk for routine operations through two seasons of a year are estimated using the simulation model. Root mean square errors are 0.19 and 0.17 for wind speed and concentration, respectively. Lifetime risk estimates for workers are 0.4-3.8 and 0.0096-0.25 per 1000 workers in stable and unstable atmospheric conditions, respectively. Exposure risk is unacceptable for the head of shift work, chief engineer, and general workers on 141 days (38.77%) of the year. The results of this study show that computational fluid dynamics is a useful tool for modeling benzene exposure in a complex geometry and can be used to estimate lifetime risks of occupational groups in a refinery setting.
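
    The sketch below applies the standard inhalation chronic-daily-intake and lifetime-cancer-risk formulas to a single, assumed benzene concentration taken from a CFD field. The exposure parameters and slope factor are illustrative assumptions, not the values used in the study.

```python
# Chronic daily intake (CDI) and lifetime cancer risk (LCR) from a simulated
# benzene concentration; all exposure parameters below are assumptions.
def chronic_daily_intake(conc_mg_m3, inhalation_m3_h=1.25, hours_day=8.0,
                         days_year=141, years=25, body_weight_kg=70.0,
                         averaging_days=70 * 365):
    """CDI in mg/(kg*day) for an occupational inhalation exposure."""
    return (conc_mg_m3 * inhalation_m3_h * hours_day * days_year * years) / \
           (body_weight_kg * averaging_days)

def lifetime_cancer_risk(cdi, slope_factor=2.9e-2):  # (mg/kg/day)^-1, assumed value
    return cdi * slope_factor

cdi = chronic_daily_intake(conc_mg_m3=0.5)   # 0.5 mg/m3 assumed from a CFD field
print("CDI =", cdi, "mg/kg/day")
print("LCR =", lifetime_cancer_risk(cdi), "per exposed worker")
```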

  14. Risk Assessment for Toxic Air Pollutants: A Citizen's Guide

    MedlinePlus

    ... from the source(s). Engineers use either monitors or computer models to estimate the amount of pollutant released ... measure how much of the pollutant is present. Computer models use mathematical equations that represent the processes ...

  15. Computing Optimal Stochastic Portfolio Execution Strategies: A Parametric Approach Using Simulations

    NASA Astrophysics Data System (ADS)

    Moazeni, Somayeh; Coleman, Thomas F.; Li, Yuying

    2010-09-01

    Computing optimal stochastic portfolio execution strategies under appropriate risk considerations presents a great computational challenge. We investigate a parametric approach for computing optimal stochastic strategies using Monte Carlo simulations. This approach allows a reduction in computational complexity by computing coefficients for a parametric representation of a stochastic dynamic strategy based on static optimization. Using this technique, constraints can be similarly handled using appropriate penalty functions. We illustrate the proposed approach by minimizing the expected execution cost and Conditional Value-at-Risk (CVaR).
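
    A toy sketch of evaluating one parametric execution strategy by Monte Carlo and scoring it with expected cost and CVaR; the equal-slice strategy, impact coefficient, and volatility are assumptions, and the cost model is deliberately simplistic compared with the paper's formulation.

```python
import numpy as np

def cvar(costs, alpha=0.95):
    """Conditional Value-at-Risk: mean cost in the worst (1 - alpha) tail."""
    var = np.quantile(costs, alpha)
    return costs[costs >= var].mean()

# Toy Monte Carlo evaluation of an execution strategy that sells an equal slice
# of the position each period; impact and volatility values are assumed.
rng = np.random.default_rng(5)
n_sims, n_periods = 20_000, 10
position, eta, sigma = 1e6, 2.5e-7, 0.02   # shares, temporary impact, price vol

trade = position / n_periods                       # shares sold per period
price_noise = rng.normal(0.0, sigma, size=(n_sims, n_periods))
remaining = position - trade * np.arange(1, n_periods + 1)
# Cost per simulation: temporary impact cost plus adverse price moves
# accumulated against the remaining position (permanent drift ignored).
cost = eta * trade * trade * n_periods + (price_noise * remaining).sum(axis=1)

print("expected cost:", cost.mean())
print("CVaR(95%)    :", cvar(cost, 0.95))
```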

  16. Using software security analysis to verify the secure socket layer (SSL) protocol

    NASA Technical Reports Server (NTRS)

    Powell, John D.

    2004-01-01

    The National Aeronautics and Space Administration (NASA) has tens of thousands of networked computer systems and applications. Software security vulnerabilities present risks such as lost or corrupted data, information theft, and unavailability of critical systems. These risks represent potentially enormous costs to NASA. The NASA Code Q research initiative 'Reducing Software Security Risk (RSSR) Through an Integrated Approach' offers, among its capabilities, formal verification of software security properties through the use of model based verification (MBV) to address software security risks. [1,2,3,4,5,6] MBV is a formal approach to software assurance that combines analysis of software, via abstract models, with technology, such as model checkers, that provides automation of the mechanical portions of the analysis process. This paper will discuss: the need for formal analysis to assure software systems with respect to software security and why testing alone cannot provide it; the means by which MBV with a Flexible Modeling Framework (FMF) accomplishes the necessary analysis task; and an example of FMF-style MBV in the verification of properties of the Secure Socket Layer (SSL) communication protocol as a demonstration.

  17. Surrogate modeling of joint flood risk across coastal watersheds

    NASA Astrophysics Data System (ADS)

    Bass, Benjamin; Bedient, Philip

    2018-03-01

    This study discusses the development and performance of a rapid prediction system capable of representing the joint rainfall-runoff and storm surge flood response of tropical cyclones (TCs) for probabilistic risk analysis. Due to the computational demand required for accurately representing storm surge with the high-fidelity ADvanced CIRCulation (ADCIRC) hydrodynamic model and its coupling with additional numerical models to represent rainfall-runoff, a surrogate or statistical model was trained to represent the relationship between hurricane wind- and pressure-field characteristics and their peak joint flood response typically determined from physics based numerical models. This builds upon past studies that have only evaluated surrogate models for predicting peak surge, and provides the first system capable of probabilistically representing joint flood levels from TCs. The utility of this joint flood prediction system is then demonstrated by improving upon probabilistic TC flood risk products, which currently account for storm surge but do not take into account TC associated rainfall-runoff. Results demonstrate the source apportionment of rainfall-runoff versus storm surge and highlight that slight increases in flood risk levels may occur due to the interaction between rainfall-runoff and storm surge as compared to the Federal Emergency Management Association's (FEMAs) current practices.

  18. A Strategy to Safely Live and Work in the Space Radiation Environment

    NASA Technical Reports Server (NTRS)

    Corbin, Barbara J.; Sulzman, Frank M.; Krenek, Sam

    2006-01-01

    The goal of the National Aeronautics and Space Agency and the Space Radiation Project is to ensure that astronauts can safely live and work in the space radiation environment. The space radiation environment poses both acute and chronic risks to crew health and safety, but unlike some other aspects of space travel, space radiation exposure has clinically relevant implications for the lifetime of the crew. The term safely means that risks are sufficiently understood such that acceptable limits on mission, post-mission and multi-mission consequences (for example, excess lifetime fatal cancer risk) can be defined. The Space Radiation Project strategy has several elements. The first element is to use a peer-reviewed research program to increase our mechanistic knowledge and genetic capabilities to develop tools for individual risk projection, thereby reducing our dependency on epidemiological data and population-based risk assessment. The second element is to use the NASA Space Radiation Laboratory to provide a ground-based facility to study the understanding of health effects/mechanisms of damage from space radiation exposure and the development and validation of biological models of risk, as well as methods for extrapolation to human risk. The third element is a risk modeling effort that integrates the results from research efforts into models of human risk to reduce uncertainties in predicting risk of carcinogenesis, central nervous system damage, degenerative tissue disease, and acute radiation effects. To understand the biological basis for risk, we must also understand the physical aspects of the crew environment. Thus the fourth element develops computer codes to predict radiation transport properties, evaluate integrated shielding technologies and provide design optimization recommendations for the design of human space systems. Understanding the risks and determining methods to mitigate the risks are keys to a successful radiation protection strategy.

  19. Towards a conceptual framework of OSH risk management in smart working environments based on smart PPE, ambient intelligence and the Internet of Things technologies.

    PubMed

    Podgórski, Daniel; Majchrzycka, Katarzyna; Dąbrowska, Anna; Gralewicz, Grzegorz; Okrasa, Małgorzata

    2017-03-01

    Recent developments in domains of ambient intelligence (AmI), Internet of Things, cyber-physical systems (CPS), ubiquitous/pervasive computing, etc., have led to numerous attempts to apply ICT solutions in the occupational safety and health (OSH) area. A literature review reveals a wide range of examples of smart materials, smart personal protective equipment and other AmI applications that have been developed to improve workers' safety and health. Because the use of these solutions modifies work methods, increases complexity of production processes and introduces high dynamism into thus created smart working environments (SWE), a new conceptual framework for dynamic OSH management in SWE is called for. A proposed framework is based on a new paradigm of OSH risk management consisting of real-time risk assessment and the capacity to monitor the risk level of each worker individually. A rationale for context-based reasoning in SWE and a respective model of the SWE-dedicated CPS are also proposed.

  20. FRAT-up, a Web-based fall-risk assessment tool for elderly people living in the community.

    PubMed

    Cattelani, Luca; Palumbo, Pierpaolo; Palmerini, Luca; Bandinelli, Stefania; Becker, Clemens; Chesani, Federico; Chiari, Lorenzo

    2015-02-18

    About 30% of people over 65 are subject to at least one unintentional fall a year. Fall prevention protocols and interventions can decrease the number of falls. To be effective, a prevention strategy requires a prior step to evaluate the fall risk of the subjects. Despite extensive research, existing assessment tools for fall risk have been insufficient for predicting falls. The goal of this study is to present a novel web-based fall-risk assessment tool (FRAT-up) and to evaluate its accuracy in predicting falls, within a context of community-dwelling persons aged 65 and up. FRAT-up is based on the assumption that a subject's fall risk is given by the contribution of their exposure to each of the known fall-risk factors. Many scientific studies have investigated the relationship between falls and risk factors. The majority of these studies adopted statistical approaches, usually providing quantitative information such as odds ratios. FRAT-up exploits these numerical results to compute how each single factor contributes to the overall fall risk. FRAT-up is based on a formal ontology that enlists a number of known risk factors, together with quantitative findings in terms of odds ratios. From such information, an automatic algorithm generates a rule-based probabilistic logic program, that is, a set of rules for each risk factor. The rule-based program takes the health profile of the subject (in terms of exposure to the risk factors) and computes the fall risk. A Web-based interface allows users to input health profiles and to visualize the risk assessment for the given subject. FRAT-up has been evaluated on the InCHIANTI Study dataset, a representative population-based study of older persons living in the Chianti area (Tuscany, Italy). We compared reported falls with predicted ones and computed performance indicators. The obtained area under curve of the receiver operating characteristic was 0.642 (95% CI 0.614-0.669), while the Brier score was 0.174. The Hosmer-Lemeshow test indicated statistical significance of miscalibration. FRAT-up is a web-based tool for evaluating the fall risk of people aged 65 or up living in the community. Validation results of fall risks computed by FRAT-up show that its performance is comparable to externally validated state-of-the-art tools. A prototype is freely available through a web-based interface. ClinicalTrials.gov NCT01331512 (The InChianti Follow-Up Study); http://clinicaltrials.gov/show/NCT01331512 (Archived by WebCite at http://www.webcitation.org/6UDrrRuaR).
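
    The sketch below shows one simple way to turn published odds ratios and a baseline probability into a subject-specific fall risk by multiplying the odds of the factors a subject is exposed to. FRAT-up itself compiles an ontology into a rule-based probabilistic logic program, so this multiplicative-odds combination and every number in it are illustrative assumptions.

```python
# Simplified odds-ratio combination; baseline probability and ORs are assumed.
baseline_p = 0.20                     # assumed fall probability with no risk factors
odds_ratios = {                       # assumed odds ratios from the literature
    "previous_falls": 2.8,
    "gait_problem": 2.1,
    "sedative_use": 1.5,
    "visual_impairment": 1.3,
}

def fall_risk(profile):
    """profile: dict mapping risk-factor name -> exposed (True/False)."""
    odds = baseline_p / (1.0 - baseline_p)
    for factor, exposed in profile.items():
        if exposed:
            odds *= odds_ratios[factor]
    return odds / (1.0 + odds)

subject = {"previous_falls": True, "gait_problem": False,
           "sedative_use": True, "visual_impairment": False}
print("predicted one-year fall risk:", round(fall_risk(subject), 2))
```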

  1. Crowd-Sourced Verification of Computational Methods and Data in Systems Toxicology: A Case Study with a Heat-Not-Burn Candidate Modified Risk Tobacco Product.

    PubMed

    Poussin, Carine; Belcastro, Vincenzo; Martin, Florian; Boué, Stéphanie; Peitsch, Manuel C; Hoeng, Julia

    2017-04-17

    Systems toxicology intends to quantify the effect of toxic molecules in biological systems and unravel their mechanisms of toxicity. The development of advanced computational methods is required for analyzing and integrating high throughput data generated for this purpose as well as for extrapolating predictive toxicological outcomes and risk estimates. To ensure the performance and reliability of the methods and verify conclusions from systems toxicology data analysis, it is important to conduct unbiased evaluations by independent third parties. As a case study, we report here the results of an independent verification of methods and data in systems toxicology by crowdsourcing. The sbv IMPROVER systems toxicology computational challenge aimed to evaluate computational methods for the development of blood-based gene expression signature classification models with the ability to predict smoking exposure status. Participants created/trained models on blood gene expression data sets including smokers/mice exposed to 3R4F (a reference cigarette) or noncurrent smokers/Sham (mice exposed to air). Participants applied their models on unseen data to predict whether subjects classify closer to smoke-exposed or nonsmoke-exposed groups. The data sets also included data from subjects that had been exposed to potential modified risk tobacco products (MRTPs) or that had switched to a MRTP after exposure to conventional cigarette smoke. The scoring of anonymized participants' predictions was done using predefined metrics. The top 3 performers' methods predicted class labels with area under the precision-recall curve scores above 0.9. Furthermore, although various computational approaches were used, the crowd's results confirmed our own data analysis outcomes with regards to the classification of MRTP-related samples. Mice exposed directly to a MRTP were classified closer to the Sham group. After switching to a MRTP, the confidence that subjects belonged to the smoke-exposed group decreased significantly. Smoking exposure gene signatures that contributed to the group separation included a core set of genes highly consistent across teams, such as AHRR, LRRN3, SASH1, and P2RY6. In conclusion, crowdsourcing constitutes a pertinent approach, complementary to the classical peer review process, for independently and unbiasedly verifying computational methods and data for risk assessment using systems toxicology.

  2. Patient-specific radiation dose and cancer risk estimation in CT: Part II. Application to patients

    PubMed Central

    Li, Xiang; Samei, Ehsan; Segars, W. Paul; Sturgeon, Gregory M.; Colsher, James G.; Toncheva, Greta; Yoshizumi, Terry T.; Frush, Donald P.

    2011-01-01

    Purpose: Current methods for estimating and reporting radiation dose from CT examinations are largely patient-generic; the body size and hence dose variation from patient to patient is not reflected. Furthermore, the current protocol designs rely on dose as a surrogate for the risk of cancer incidence, neglecting the strong dependence of risk on age and gender. The purpose of this study was to develop a method for estimating patient-specific radiation dose and cancer risk from CT examinations. Methods: The study included two patients (a 5-week-old female patient and a 12-year-old male patient), who underwent 64-slice CT examinations (LightSpeed VCT, GE Healthcare) of the chest, abdomen, and pelvis at our institution in 2006. For each patient, a nonuniform rational B-spline (NURBS) based full-body computer model was created based on the patient’s clinical CT data. Large organs and structures inside the image volume were individually segmented and modeled. Other organs were created by transforming an existing adult male or female full-body computer model (developed from visible human data) to match the framework defined by the segmented organs, referencing the organ volume and anthropometry data in ICRP Publication 89. A Monte Carlo program previously developed and validated for dose simulation on the LightSpeed VCT scanner was used to estimate patient-specific organ dose, from which effective dose and risks of cancer incidence were derived. Patient-specific organ dose and effective dose were compared with patient-generic CT dose quantities in current clinical use: the volume-weighted CT dose index (CTDIvol) and the effective dose derived from the dose-length product (DLP). Results: The effective dose for the CT examination of the newborn patient (5.7 mSv) was higher but comparable to that for the CT examination of the teenager patient (4.9 mSv) due to the size-based clinical CT protocols at our institution, which employ lower scan techniques for smaller patients. However, the overall risk of cancer incidence attributable to the CT examination was much higher for the newborn (2.4 in 1000) than for the teenager (0.7 in 1000). For the two pediatric-aged patients in our study, CTDIvol underestimated dose to large organs in the scan coverage by 30%–48%. The effective dose derived from DLP using published conversion coefficients differed from that calculated using patient-specific organ dose values by −57% to 13%, when the tissue weighting factors of ICRP 60 were used, and by −63% to 28%, when the tissue weighting factors of ICRP 103 were used. Conclusions: It is possible to estimate patient-specific radiation dose and cancer risk from CT examinations by combining a validated Monte Carlo program with patient-specific anatomical models that are derived from the patients’ clinical CT data and supplemented by transformed models of reference adults. With the construction of a large library of patient-specific computer models encompassing patients of all ages and weight percentiles, dose and risk can be estimated for any patient prior to or after a CT examination. Such information may aid in decisions for image utilization and can further guide the design and optimization of CT technologies and scan protocols. PMID:21361209
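
    As a small worked example of the dose-aggregation step, effective dose is the tissue-weighted sum of organ equivalent doses, E = sum over tissues of w_T x H_T. The sketch below uses a subset of the ICRP 103 tissue weighting factors with assumed organ doses, not the study's Monte Carlo output.

```python
# Effective dose as the tissue-weighted sum of organ equivalent doses,
# using a subset of ICRP 103 weighting factors; organ doses are assumed values.
W_T = {  # ICRP 103 tissue weighting factors (partial list)
    "breast": 0.12, "colon": 0.12, "lung": 0.12, "stomach": 0.12,
    "red_marrow": 0.12, "gonads": 0.08, "bladder": 0.04, "liver": 0.04,
    "thyroid": 0.04, "oesophagus": 0.04,
}

organ_dose_mSv = {"colon": 8.1, "stomach": 7.6, "liver": 7.9,
                  "bladder": 9.2, "red_marrow": 4.3}   # illustrative example values

effective_dose = sum(W_T[organ] * h for organ, h in organ_dose_mSv.items())
print("partial effective dose contribution:", round(effective_dose, 2), "mSv")
```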

  3. Patient-specific radiation dose and cancer risk estimation in CT: Part II. Application to patients

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li Xiang; Samei, Ehsan; Segars, W. Paul

    2011-01-15

    Purpose: Current methods for estimating and reporting radiation dose from CT examinations are largely patient-generic; the body size and hence dose variation from patient to patient is not reflected. Furthermore, the current protocol designs rely on dose as a surrogate for the risk of cancer incidence, neglecting the strong dependence of risk on age and gender. The purpose of this study was to develop a method for estimating patient-specific radiation dose and cancer risk from CT examinations. Methods: The study included two patients (a 5-week-old female patient and a 12-year-old male patient), who underwent 64-slice CT examinations (LightSpeed VCT, GE Healthcare) of the chest, abdomen, and pelvis at our institution in 2006. For each patient, a nonuniform rational B-spline (NURBS) based full-body computer model was created based on the patient's clinical CT data. Large organs and structures inside the image volume were individually segmented and modeled. Other organs were created by transforming an existing adult male or female full-body computer model (developed from visible human data) to match the framework defined by the segmented organs, referencing the organ volume and anthropometry data in ICRP Publication 89. A Monte Carlo program previously developed and validated for dose simulation on the LightSpeed VCT scanner was used to estimate patient-specific organ dose, from which effective dose and risks of cancer incidence were derived. Patient-specific organ dose and effective dose were compared with patient-generic CT dose quantities in current clinical use: the volume-weighted CT dose index (CTDIvol) and the effective dose derived from the dose-length product (DLP). Results: The effective dose for the CT examination of the newborn patient (5.7 mSv) was higher but comparable to that for the CT examination of the teenager patient (4.9 mSv) due to the size-based clinical CT protocols at our institution, which employ lower scan techniques for smaller patients. However, the overall risk of cancer incidence attributable to the CT examination was much higher for the newborn (2.4 in 1000) than for the teenager (0.7 in 1000). For the two pediatric-aged patients in our study, CTDIvol underestimated dose to large organs in the scan coverage by 30%-48%. The effective dose derived from DLP using published conversion coefficients differed from that calculated using patient-specific organ dose values by -57% to 13%, when the tissue weighting factors of ICRP 60 were used, and by -63% to 28%, when the tissue weighting factors of ICRP 103 were used. Conclusions: It is possible to estimate patient-specific radiation dose and cancer risk from CT examinations by combining a validated Monte Carlo program with patient-specific anatomical models that are derived from the patients' clinical CT data and supplemented by transformed models of reference adults. With the construction of a large library of patient-specific computer models encompassing patients of all ages and weight percentiles, dose and risk can be estimated for any patient prior to or after a CT examination. Such information may aid in decisions for image utilization and can further guide the design and optimization of CT technologies and scan protocols.

  4. Computational toxicology as implemented by the U.S. EPA: providing high throughput decision support tools for screening and assessing chemical exposure, hazard and risk.

    PubMed

    Kavlock, Robert; Dix, David

    2010-02-01

    Computational toxicology is the application of mathematical and computer models to help assess chemical hazards and risks to human health and the environment. Supported by advances in informatics, high-throughput screening (HTS) technologies, and systems biology, the U.S. Environmental Protection Agency EPA is developing robust and flexible computational tools that can be applied to the thousands of chemicals in commerce, and contaminant mixtures found in air, water, and hazardous-waste sites. The Office of Research and Development (ORD) Computational Toxicology Research Program (CTRP) is composed of three main elements. The largest component is the National Center for Computational Toxicology (NCCT), which was established in 2005 to coordinate research on chemical screening and prioritization, informatics, and systems modeling. The second element consists of related activities in the National Health and Environmental Effects Research Laboratory (NHEERL) and the National Exposure Research Laboratory (NERL). The third and final component consists of academic centers working on various aspects of computational toxicology and funded by the U.S. EPA Science to Achieve Results (STAR) program. Together these elements form the key components in the implementation of both the initial strategy, A Framework for a Computational Toxicology Research Program (U.S. EPA, 2003), and the newly released The U.S. Environmental Protection Agency's Strategic Plan for Evaluating the Toxicity of Chemicals (U.S. EPA, 2009a). Key intramural projects of the CTRP include digitizing legacy toxicity testing information toxicity reference database (ToxRefDB), predicting toxicity (ToxCast) and exposure (ExpoCast), and creating virtual liver (v-Liver) and virtual embryo (v-Embryo) systems models. U.S. EPA-funded STAR centers are also providing bioinformatics, computational toxicology data and models, and developmental toxicity data and models. The models and underlying data are being made publicly available through the Aggregated Computational Toxicology Resource (ACToR), the Distributed Structure-Searchable Toxicity (DSSTox) Database Network, and other U.S. EPA websites. While initially focused on improving the hazard identification process, the CTRP is placing increasing emphasis on using high-throughput bioactivity profiling data in systems modeling to support quantitative risk assessments, and in developing complementary higher throughput exposure models. This integrated approach will enable analysis of life-stage susceptibility, and understanding of the exposures, pathways, and key events by which chemicals exert their toxicity in developing systems (e.g., endocrine-related pathways). The CTRP will be a critical component in next-generation risk assessments utilizing quantitative high-throughput data and providing a much higher capacity for assessing chemical toxicity than is currently available.

  5. Soft-sensing model of temperature for aluminum reduction cell on improved twin support vector regression

    NASA Astrophysics Data System (ADS)

    Li, Tao

    2018-06-01

    The complexity of the aluminum electrolysis process makes the temperature of aluminum reduction cells difficult to measure directly, yet temperature is central to controlling aluminum production. To address this problem, and drawing on practical data from an aluminum plant, this paper presents a soft-sensing model of temperature for the aluminum electrolysis process based on Improved Twin Support Vector Regression (ITSVR). ITSVR avoids the slow learning speed of Support Vector Regression (SVR) and the over-fitting risk of Twin Support Vector Regression (TSVR) by introducing a regularization term into the objective function of TSVR, which enforces the structural risk minimization principle while keeping computational complexity low. The model then uses several other process parameters as auxiliary variables and predicts the temperature with ITSVR. Simulation results show that the ITSVR-based soft-sensing model requires less computation time and generalizes better.
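    For readers unfamiliar with the technique, one common way such a regularization term enters the first (down-bound) TSVR subproblem is shown below; this is a representative form under that assumption, and the exact formulation used by the author may differ.

    $$
    \min_{w_1,\,b_1,\,\xi}\;\; \tfrac{1}{2}\,\bigl\lVert y - e\varepsilon_1 - (A w_1 + e b_1)\bigr\rVert^{2} \;+\; C_1\, e^{\top}\xi \;+\; \tfrac{C_3}{2}\bigl(\lVert w_1\rVert^{2} + b_1^{2}\bigr)
    \quad\text{s.t.}\quad y - (A w_1 + e b_1) \;\ge\; e\varepsilon_1 - \xi,\;\; \xi \ge 0,
    $$

    where the added term $\tfrac{C_3}{2}(\lVert w_1\rVert^{2} + b_1^{2})$ is what implements structural risk minimization; the second (up-bound) subproblem is regularized symmetrically.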

  6. Scenario-based modeling for multiple allocation hub location problem under disruption risk: multiple cuts Benders decomposition approach

    NASA Astrophysics Data System (ADS)

    Yahyaei, Mohsen; Bashiri, Mahdi

    2017-12-01

    The hub location problem arises in a variety of domains such as transportation and telecommunication systems. In many real-world situations, hub facilities are subject to disruption. This paper deals with the multiple allocation hub location problem in the presence of facility failure. To model the problem, a two-stage stochastic formulation is developed. In the proposed model, the number of scenarios grows exponentially with the number of facilities. To alleviate this issue, two approaches are applied simultaneously. The first is to apply sample average approximation (SAA) to approximate the two-stage stochastic problem via sampling. The second is to apply the multiple cuts Benders decomposition approach to enhance computational performance. Numerical studies show the effective performance of the SAA in terms of optimality gap for small problem instances with numerous scenarios. Moreover, the performance of the multi-cut Benders decomposition is assessed through comparison with the classic version, and the computational results reveal the superiority of the multi-cut approach in terms of computational time and number of iterations.
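    To illustrate the SAA bounding idea referred to above, the sketch below runs the standard procedure on a toy two-stage problem (a newsvendor-style recourse with a closed-form sample optimum), not the hub location model itself: replicated sample problems give a statistical lower bound, a candidate first-stage decision evaluated on a large sample gives an upper bound, and their difference is the optimality-gap estimate. All numbers are illustrative assumptions.

    ```python
    # Hedged sketch of SAA lower/upper bounds and gap estimation on a toy recourse problem.
    import numpy as np

    rng = np.random.default_rng(0)
    c, q = 1.0, 4.0                      # first-stage unit cost, recourse (shortage) cost

    def sample_demand(n):                # scenario generator (assumed lognormal demand)
        return rng.lognormal(mean=3.0, sigma=0.4, size=n)

    def solve_saa(demands):              # closed-form newsvendor optimum for one sample
        x = np.quantile(demands, 1.0 - c / q)
        cost = c * x + q * np.mean(np.maximum(demands - x, 0.0))
        return x, cost

    # Lower-bound estimate: average optimal value over M replications of size N
    M, N = 20, 200
    reps = [solve_saa(sample_demand(N)) for _ in range(M)]
    lower = np.mean([cost for _, cost in reps])

    # Upper-bound estimate: evaluate one candidate first-stage decision on a large sample
    x_hat = reps[0][0]
    big = sample_demand(50_000)
    upper = c * x_hat + q * np.mean(np.maximum(big - x_hat, 0.0))

    print(f"LB ~ {lower:.2f}, UB ~ {upper:.2f}, gap estimate ~ {upper - lower:.2f}")
    ```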

  7. Computational mechanics research and support for aerodynamics and hydraulics at TFHRC year 1 quarter 4 progress report.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lottes, S.A.; Kulak, R.F.; Bojanowski, C.

    2011-12-09

    The computational fluid dynamics (CFD) and computational structural mechanics (CSM) focus areas at Argonne's Transportation Research and Analysis Computing Center (TRACC) initiated a project to support and complement the experimental programs at the Turner-Fairbank Highway Research Center (TFHRC) with high performance computing based analysis capabilities in August 2010. The project was established with a new interagency agreement between the Department of Energy and the Department of Transportation to provide collaborative research, development, and benchmarking of advanced three-dimensional computational mechanics analysis methods to the aerodynamics and hydraulics laboratories at TFHRC for a period of five years, beginning in October 2010. The analysis methods employ well-benchmarked and supported commercial computational mechanics software. Computational mechanics encompasses the areas of Computational Fluid Dynamics (CFD), Computational Wind Engineering (CWE), Computational Structural Mechanics (CSM), and Computational Multiphysics Mechanics (CMM) applied in Fluid-Structure Interaction (FSI) problems. The major areas of focus of the project are wind and water effects on bridges - superstructure, deck, cables, and substructure (including soil), primarily during storms and flood events - and the risks that these loads pose to structural failure. For flood events at bridges, another major focus of the work is assessment of the risk to bridges caused by scour of stream and riverbed material away from the foundations of a bridge. Other areas of current research include modeling of flow through culverts to assess them for fish passage, modeling of the salt spray transport into bridge girders to address suitability of using weathering steel in bridges, CFD analysis of the operation of the wind tunnel in the TFHRC wind engineering laboratory, vehicle stability under high wind loading, and the use of electromagnetic shock absorbers to improve vehicle stability under high wind conditions. This quarterly report documents technical progress on the project tasks for the period of July through September 2011.

  8. EVALUATION OF PHYSIOLOGY COMPUTER MODELS, AND THE FEASIBILITY OF THEIR USE IN RISK ASSESSMENT.

    EPA Science Inventory

    This project will evaluate the current state of quantitative models that simulate physiological processes, and how these models might be used in conjunction with the current use of PBPK and BBDR models in risk assessment. The work will include a literature search to identify...

  9. A technology path to tactical agent-based modeling

    NASA Astrophysics Data System (ADS)

    James, Alex; Hanratty, Timothy P.

    2017-05-01

    Wargaming is a process of thinking through and visualizing events that could occur during a possible course of action. Over the past 200 years, wargaming has matured into a set of formalized processes. One area of growing interest is the application of agent-based modeling. Agent-based modeling and its supporting technologies have the potential to introduce a third-generation wargaming capability to the Army, creating a positive overmatch decision-making capability. In its simplest form, agent-based modeling is a computational technique that helps the modeler understand and simulate how the "whole of a system" responds to change over time. It provides a decentralized method of looking at situations where individual agents are instantiated within an environment, interact with each other, and are empowered to make their own decisions. However, this technology is not without its own risks and limitations. This paper explores a technology roadmap, identifying research topics that could realize agent-based modeling within a tactical wargaming context.

  10. Head injury assessment of non-lethal projectile impacts: A combined experimental/computational method.

    PubMed

    Sahoo, Debasis; Robbe, Cyril; Deck, Caroline; Meyer, Frank; Papy, Alexandre; Willinger, Remy

    2016-11-01

    The main objective of this study is to develop a methodology for assessing the head injury risk of non-lethal projectile impacts based on experimental tests combined with numerical predictive head injury simulations. A total of 16 non-lethal projectile (NLP) impacts were conducted against a rigid force plate at three different ranges of impact velocity (120, 72 and 55 m/s), and the force/deformation-time data were used for the validation of the finite element (FE) NLP. Good agreement between experimental and simulation data was obtained during validation of the FE NLP, with a high correlation value (>0.98) and a peak force discrepancy of less than 3%. A state-of-the-art finite element head model with enhanced brain and skull material laws and specific head injury criteria was used for numerical computation of NLP impacts. Frontal and lateral FE NLP impacts to the head model at different velocities were performed under LS-DYNA. It is the very first time that the lethality of NLP is assessed by axonal strain computation to predict diffuse axonal injury (DAI) in NLP impacts to the head. In the case of temporo-parietal impact, the min-max risk of DAI is 0-86%. With a velocity above 99.2 m/s there is greater than 50% risk of DAI for temporo-parietal impacts. All the medium- and high-velocity impacts are susceptible to skull fracture, with a percentage risk higher than 90%. This study provides a tool for realistic injury (DAI and skull fracture) assessment during NLP impacts to the human head. Copyright © 2016 Elsevier Ltd. All rights reserved.
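    The velocity-at-50%-risk statement above is the kind of result obtained by fitting an injury risk curve to binary injury outcomes across impact conditions. The sketch below fits a logistic risk curve, risk(v) = 1/(1 + exp(-(a + b v))), to made-up velocity/injury pairs and reads off the velocity at 50% predicted risk; the data and fitted values are illustrative, not the study's.

    ```python
    # Hedged sketch: logistic injury-risk curve fitted to hypothetical impact-test outcomes.
    import numpy as np
    from scipy.optimize import curve_fit

    def risk(v, a, b):
        return 1.0 / (1.0 + np.exp(-(a + b * v)))

    velocity = np.array([55, 60, 72, 80, 90, 100, 110, 120], dtype=float)   # m/s
    injured  = np.array([0,  0,  0,  1,  0,  1,   1,   1], dtype=float)     # DAI yes/no (made up)

    (a, b), _ = curve_fit(risk, velocity, injured, p0=(-10.0, 0.1), maxfev=10_000)
    v50 = -a / b                            # velocity at 50% predicted risk
    print(f"estimated 50% DAI-risk velocity ~ {v50:.1f} m/s")
    ```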

  11. Risk Classification with an Adaptive Naive Bayes Kernel Machine Model.

    PubMed

    Minnier, Jessica; Yuan, Ming; Liu, Jun S; Cai, Tianxi

    2015-04-22

    Genetic studies of complex traits have uncovered only a small number of risk markers explaining a small fraction of heritability and adding little improvement to disease risk prediction. Standard single marker methods may lack power in selecting informative markers or estimating effects. Most existing methods also typically do not account for non-linearity. Identifying markers with weak signals and estimating their joint effects among many non-informative markers remains challenging. One potential approach is to group markers based on biological knowledge such as gene structure. If markers in a group tend to have similar effects, proper usage of the group structure could improve power and efficiency in estimation. We propose a two-stage method relating markers to disease risk by taking advantage of known gene-set structures. Imposing a naive Bayes kernel machine (KM) model, we estimate gene-set specific risk models that relate each gene-set to the outcome in stage I. The KM framework efficiently models potentially non-linear effects of predictors without requiring explicit specification of functional forms. In stage II, we aggregate information across gene-sets via a regularization procedure. Estimation and computational efficiency are further improved with kernel principal component analysis. Asymptotic results for model estimation and gene set selection are derived, and numerical studies suggest that the proposed procedure could outperform existing procedures for constructing genetic risk models.
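    The two-stage idea described above can be sketched with off-the-shelf tools: stage I fits one non-linear risk score per gene set (here approximated by kernel ridge regression), and stage II aggregates those scores with a regularized model. This is a simplified stand-in for the adaptive naive Bayes KM estimator, not the authors' exact procedure; in practice the stage I scores would be cross-fitted rather than computed on the same data used in stage II.

    ```python
    # Hedged sketch of a two-stage gene-set kernel machine risk model on synthetic data.
    import numpy as np
    from sklearn.kernel_ridge import KernelRidge
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(1)
    n = 300
    gene_sets = {"setA": slice(0, 10), "setB": slice(10, 25), "setC": slice(25, 40)}
    X = rng.normal(size=(n, 40))
    y = (X[:, 0] ** 2 + 0.5 * X[:, 12] + rng.normal(size=n) > 1.0).astype(int)

    # Stage I: one non-linear risk score per gene set (kernel ridge as a KM surrogate)
    scores = np.column_stack([
        KernelRidge(kernel="rbf", alpha=1.0, gamma=0.1).fit(X[:, idx], y).predict(X[:, idx])
        for idx in gene_sets.values()
    ])

    # Stage II: regularized aggregation of the gene-set scores
    final = LogisticRegression(penalty="l2", C=1.0).fit(scores, y)
    print("gene-set weights:", dict(zip(gene_sets, final.coef_[0].round(3))))
    ```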

  12. Quantifying uncertainty in Bayesian calibrated animal-to-human PBPK models with informative prior distributions

    EPA Science Inventory

    Understanding and quantifying the uncertainty of model parameters and predictions has gained more interest in recent years with the increased use of computational models in chemical risk assessment. Fully characterizing the uncertainty in risk metrics derived from linked quantita...

  13. Personalized dynamic prediction of death according to tumour progression and high-dimensional genetic factors: Meta-analysis with a joint model.

    PubMed

    Emura, Takeshi; Nakatochi, Masahiro; Matsui, Shigeyuki; Michimae, Hirofumi; Rondeau, Virginie

    2017-01-01

    Developing a personalized risk prediction model of death is fundamental for improving patient care and touches on the realm of personalized medicine. The increasing availability of genomic information and large-scale meta-analytic data sets for clinicians has motivated the extension of traditional survival prediction based on the Cox proportional hazards model. The aim of our paper is to develop a personalized risk prediction formula for death according to genetic factors and dynamic tumour progression status based on meta-analytic data. To this end, we extend the existing joint frailty-copula model to a model allowing for high-dimensional genetic factors. In addition, we propose a dynamic prediction formula to predict death given tumour progression events possibly occurring after treatment or surgery. For clinical use, we implement the computation software of the prediction formula in the joint.Cox R package. We also develop a tool to validate the performance of the prediction formula by assessing the prediction error. We illustrate the method with the meta-analysis of individual patient data on ovarian cancer patients.

  14. Comprehensive Computational Pathological Image Analysis Predicts Lung Cancer Prognosis.

    PubMed

    Luo, Xin; Zang, Xiao; Yang, Lin; Huang, Junzhou; Liang, Faming; Rodriguez-Canales, Jaime; Wistuba, Ignacio I; Gazdar, Adi; Xie, Yang; Xiao, Guanghua

    2017-03-01

    Pathological examination of histopathological slides is a routine clinical procedure for lung cancer diagnosis and prognosis. Although the classification of lung cancer has been updated to become more specific, only a small subset of the total morphological features are taken into consideration. The vast majority of the detailed morphological features of tumor tissues, particularly tumor cells' surrounding microenvironment, are not fully analyzed. The heterogeneity of tumor cells and close interactions between tumor cells and their microenvironments are closely related to tumor development and progression. The goal of this study is to develop morphological feature-based prediction models for the prognosis of patients with lung cancer. We developed objective and quantitative computational approaches to analyze the morphological features of pathological images for patients with NSCLC. Tissue pathological images were analyzed for 523 patients with adenocarcinoma (ADC) and 511 patients with squamous cell carcinoma (SCC) from The Cancer Genome Atlas lung cancer cohorts. The features extracted from the pathological images were used to develop statistical models that predict patients' survival outcomes in ADC and SCC, respectively. We extracted 943 morphological features from pathological images of hematoxylin and eosin-stained tissue and identified morphological features that are significantly associated with prognosis in ADC and SCC, respectively. Statistical models based on these extracted features stratified NSCLC patients into high-risk and low-risk groups. The models were developed from training sets and validated in independent testing sets: a predicted high-risk group versus a predicted low-risk group (for patients with ADC: hazard ratio = 2.34, 95% confidence interval: 1.12-4.91, p = 0.024; for patients with SCC: hazard ratio = 2.22, 95% confidence interval: 1.15-4.27, p = 0.017) after adjustment for age, sex, smoking status, and pathologic tumor stage. The results suggest that the quantitative morphological features of tumor pathological images predict prognosis in patients with lung cancer. Copyright © 2016 International Association for the Study of Lung Cancer. Published by Elsevier Inc. All rights reserved.

  15. Development of solute transport models in YMPYRÄ framework to simulate solute migration in military shooting and training areas

    NASA Astrophysics Data System (ADS)

    Warsta, L.; Karvonen, T.

    2017-12-01

    There are currently 25 shooting and training areas in Finland managed by the Finnish Defence Forces (FDF), where military activities can cause contamination of open waters and groundwater reservoirs. In the YMPYRÄ project, a computer software framework is being developed that combines existing open environmental data and proprietary information collected by FDF with computational models to investigate current and prevent future environmental problems. A data centric philosophy is followed in the development of the system, i.e. the models are updated and extended to handle available data from different areas. The results generated by the models are summarized as easily understandable flow and risk maps that can be opened in GIS programs and used in environmental assessments by experts. Substances investigated with the system include explosives and metals such as lead, and both surface and groundwater dominated areas can be simulated. The YMPYRÄ framework is composed of a three dimensional soil and groundwater flow model, several solute transport models and an uncertainty assessment system. Solute transport models in the framework include particle based, stream tube and finite volume based approaches. The models can be used to simulate solute dissolution from the source area, transport in the unsaturated layers to groundwater, and finally migration in groundwater to water extraction wells and springs. The models can simulate advection, dispersion, equilibrium adsorption on soil particles, solubility and dissolution from the solute phase, and dendritic solute decay chains. Correct numerical solutions were confirmed by comparing results to analytical 1D and 2D solutions and by comparing the numerical solutions to each other. The particle based and stream tube type solute transport models were useful because they complement the traditional finite volume based approach, which in certain circumstances produced numerical dispersion due to the piecewise solution of the governing equations on computational grids and required computationally intensive and, in some cases, unstable iterative solutions. The YMPYRÄ framework is being developed by the WaterHope, Gain Oy, and SITO Oy consulting companies and funded by FDF.

  16. Personalized modeling for real-time pressure ulcer prevention in sitting posture.

    PubMed

    Luboz, Vincent; Bailet, Mathieu; Boichon Grivot, Christelle; Rochette, Michel; Diot, Bruno; Bucki, Marek; Payan, Yohan

    2018-02-01

    Ischial pressure ulcer is an important risk for every paraplegic person and a major public health issue. Pressure ulcers appear following excessive compression of the buttock's soft tissues by bony structures, particularly the ischial and sacral bones. Current prevention techniques are mainly based on daily skin inspection to spot red patches or injuries. Nevertheless, most pressure ulcers occur internally and are difficult to detect early. Estimating internal strains within soft tissues could help to evaluate the risk of pressure ulcer. A subject-specific biomechanical model could be used to assess internal strains from measured skin surface pressures. However, a realistic 3D non-linear Finite Element buttock model, with different layers of tissue materials for skin, fat and muscles, requires between minutes and hours to compute, thereby precluding its use in a real-time daily prevention context. In this article, we propose to optimize these computations by using a reduced order modeling technique (ROM) based on proper orthogonal decompositions of the pressure and strain fields coupled with a machine learning method. ROM allows strains to be evaluated inside the model interactively (i.e. in less than a second) for any pressure field measured below the buttocks. In our case, with only 19 modes of variation of pressure patterns, an error of one percent is observed relative to the full-scale simulation when evaluating the strain field. This reduced model could therefore be the first step towards interactive pressure ulcer prevention in a daily set-up. Copyright © 2017 Tissue Viability Society. Published by Elsevier Ltd. All rights reserved.
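    The reduced-order workflow described above can be sketched in a few lines: proper orthogonal decomposition (POD) of offline pressure and strain snapshots, followed by a regression that maps pressure mode coefficients to strain mode coefficients so the strain field can be reconstructed interactively. The synthetic snapshot matrices below stand in for the finite element simulations; this is an illustration of the technique, not the authors' implementation.

    ```python
    # Hedged sketch: POD + regression reduced-order model on synthetic snapshot data.
    import numpy as np
    from sklearn.linear_model import Ridge

    rng = np.random.default_rng(2)
    n_snapshots, n_pressure, n_strain = 200, 500, 2000
    P = rng.normal(size=(n_snapshots, n_pressure))           # offline pressure snapshots
    S = P @ rng.normal(size=(n_pressure, n_strain)) * 1e-3   # matching strain snapshots

    k = 19                                                   # number of retained POD modes
    Up = np.linalg.svd(P - P.mean(0), full_matrices=False)[2][:k]   # pressure modes
    Us = np.linalg.svd(S - S.mean(0), full_matrices=False)[2][:k]   # strain modes

    reg = Ridge(alpha=1e-3).fit((P - P.mean(0)) @ Up.T, (S - S.mean(0)) @ Us.T)

    def predict_strain(pressure_map):
        """Online step: project the measured pressure, regress, reconstruct the strain field."""
        coeffs = reg.predict(((pressure_map - P.mean(0)) @ Up.T).reshape(1, -1))
        return coeffs @ Us + S.mean(0)

    print(predict_strain(rng.normal(size=n_pressure)).shape)   # (1, 2000)
    ```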

  17. Mammographic features and subsequent risk of breast cancer: a comparison of qualitative and quantitative evaluations in the Guernsey prospective studies.

    PubMed

    Torres-Mejía, Gabriela; De Stavola, Bianca; Allen, Diane S; Pérez-Gavilán, Juan J; Ferreira, Jorge M; Fentiman, Ian S; Dos Santos Silva, Isabel

    2005-05-01

    Mammographic features are known to be associated with breast cancer but the magnitude of the effect differs markedly from study to study. Methods to assess mammographic features range from subjective qualitative classifications to computer-automated quantitative measures. We used data from the UK Guernsey prospective studies to examine the relative value of these methods in predicting breast cancer risk. In all, 3,211 women aged ≥35 years who had a mammogram taken in 1986 to 1989 were followed-up to the end of October 2003, with 111 developing breast cancer during this period. Mammograms were classified using the subjective qualitative Wolfe classification and several quantitative mammographic features measured using computer-based techniques. Breast cancer risk was positively associated with high-grade Wolfe classification, percent breast density and area of dense tissue, and negatively associated with area of lucent tissue, fractal dimension, and lacunarity. Inclusion of the quantitative measures in the same model identified area of dense tissue and lacunarity as the best predictors of breast cancer, with risk increasing by 59% [95% confidence interval (95% CI), 29-94%] per SD increase in total area of dense tissue but declining by 39% (95% CI, 53-22%) per SD increase in lacunarity, after adjusting for each other and for other confounders. Comparison of models that included both the qualitative Wolfe classification and these two quantitative measures to models that included either the qualitative or the two quantitative variables showed that they all made significant contributions to prediction of breast cancer risk. These findings indicate that breast cancer risk is affected not only by the amount of mammographic density but also by the degree of heterogeneity of the parenchymal pattern and, presumably, by other features captured by the Wolfe classification.

  18. Human Injury Criteria for Underwater Blasts

    PubMed Central

    Lance, Rachel M.; Capehart, Bruce; Kadro, Omar; Bass, Cameron R.

    2015-01-01

    Underwater blasts propagate further and injure more readily than equivalent air blasts. Development of effective personal protection and countermeasures, however, requires knowledge of the currently unknown human tolerance to underwater blast. Current guidelines for prevention of underwater blast injury are not based on any organized injury risk assessment, human data or experimental data. The goal of this study was to derive injury risk assessments for underwater blast using well-characterized human underwater blast exposures in the open literature. The human injury dataset was compiled using 34 case reports on underwater blast exposure to 475 personnel, dating as early as 1916. Using severity ratings, computational reconstructions of the blasts, and survival information from a final set of 262 human exposures, injury risk models were developed for both injury severity and risk of fatality as functions of blast impulse and blast peak overpressure. Based on these human data, we found that the 50% risk of fatality from underwater blast occurred at 302±16 kPa-ms impulse. Conservatively, there is a 20% risk of pulmonary injury at a kilometer from a 20 kg charge. From a clinical point of view, this new injury risk model emphasizes the large distances possible for potential pulmonary and gut injuries in water compared with air. This risk value is the first impulse-based fatality risk calculated from human data. The large-scale inconsistency between the blast exposures in the case reports and the guidelines available in the literature prior to this study further underscored the need for this new guideline derived from the unique dataset of actual injuries in this study. PMID:26606655
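    To show how a risk curve anchored at the reported 50% fatality impulse can be evaluated, the sketch below uses a log-logistic dose-response form with the study's ~302 kPa-ms median; the steepness parameter is a made-up illustration, since the paper's fitted parameters are not reproduced here.

    ```python
    # Hedged sketch: evaluating a log-logistic fatality-risk curve anchored at I50.
    I50 = 302.0          # kPa-ms, 50% fatality impulse reported in the study
    beta = 6.0           # hypothetical steepness parameter (assumption)

    def fatality_risk(impulse_kpa_ms: float) -> float:
        """Log-logistic risk curve: risk = 1 / (1 + (I50 / I)^beta)."""
        return 1.0 / (1.0 + (I50 / impulse_kpa_ms) ** beta)

    for i in (150, 302, 600):
        print(f"impulse {i:4d} kPa-ms -> fatality risk ~ {fatality_risk(i):.2f}")
    ```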

  19. The EPA Comptox Chemistry Dashboard: A Web-Based Data Integration Hub for Toxicology Data (SOT)

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) Computational Toxicology Program integrates advances in biology, chemistry, and computer science to help prioritize chemicals for further research based on potential human health risks. This work involves computational and data drive...

  20. Individual and Network Interventions With Injection Drug Users in 5 Ukraine Cities

    PubMed Central

    Lehman, Wayne E. K.; Latkin, Carl A.; Dvoryak, Sergey; Brewster, John T.; Royer, Mark S.; Sinitsyna, Larisa

    2011-01-01

    Objectives. We evaluated the effects of an individual intervention versus a network intervention on HIV-related injection and sexual risk behaviors among street-recruited opiate injection drug users in 5 Ukraine cities. Methods. Between 2004 and 2006, 722 opiate injection drug users were recruited to participate in interventions that were either individually based or based on a social network model in which peer educators intervened with their network members. Audio computer-assisted self-interview techniques were used to interview participants at baseline and follow-up. Results. Multiple logistic analyses controlling for baseline injection and sexual risks revealed that both peer educators and network members in the network intervention reduced injection-related risk behaviors significantly more than did those in the individually based intervention and that peer educators increased condom use significantly more than did those in the individual intervention. Individual intervention participants, however, showed significantly greater improvements than did network members with respect to reductions in sexual risk behaviors. Conclusions. Social network interventions may be more effective than individually based interventions in changing injection risk behaviors among both peer educators and network members. The effectiveness of network interventions in changing sexual risk behaviors is less clear, probably owing to network composition and inhibitions regarding discussing sexual risk behaviors. PMID:20395584

  1. An improved approach for flight readiness certification: Methodology for failure risk assessment and application examples. Volume 2: Software documentation

    NASA Technical Reports Server (NTRS)

    Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.

    1992-01-01

    An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with engineering analysis to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in engineering analyses of failure phenomena, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which engineering analysis models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. Conventional engineering analysis models currently employed for design or failure prediction are used in this methodology. The PFA methodology is described and examples of its application are presented. Conventional approaches to failure risk evaluation for spaceflight systems are discussed, and the rationale for the approach taken in the PFA methodology is presented. The statistical methods, engineering models, and computer software used in fatigue failure mode applications are thoroughly documented.
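    The core idea, propagating parameter uncertainty through a conventional failure-phenomenon model to obtain a failure probability, can be sketched with a simple Basquin S-N fatigue life relation and Monte Carlo sampling. The distributions and constants below are illustrative assumptions, not values from the methodology report, and the sketch omits the PFA step of updating the distributions with test or flight experience.

    ```python
    # Hedged sketch: Monte Carlo estimate of a fatigue failure probability.
    import numpy as np

    rng = np.random.default_rng(3)
    n = 200_000
    required_cycles = 1.0e5

    # Uncertain inputs: stress amplitude (MPa), Basquin coefficient and exponent
    stress = rng.normal(300.0, 30.0, n)
    A = rng.lognormal(mean=np.log(1.0e13), sigma=0.3, size=n)
    m = rng.normal(3.0, 0.1, n)

    life = A * stress ** (-m)               # Basquin relation: N = A * S^(-m)
    p_fail = np.mean(life < required_cycles)
    print(f"estimated failure probability ~ {p_fail:.3e}")
    ```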

  2. An improved approach for flight readiness certification: Methodology for failure risk assessment and application examples, volume 1

    NASA Technical Reports Server (NTRS)

    Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.

    1992-01-01

    An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with engineering analysis to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in engineering analyses of failure phenomena, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which engineering analysis models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. Conventional engineering analysis models currently employed for design or failure prediction are used in this methodology. The PFA methodology is described and examples of its application are presented. Conventional approaches to failure risk evaluation for spaceflight systems are discussed, and the rationale for the approach taken in the PFA methodology is presented. The statistical methods, engineering models, and computer software used in fatigue failure mode applications are thoroughly documented.

  3. LLSURE: local linear SURE-based edge-preserving image filtering.

    PubMed

    Qiu, Tianshuang; Wang, Aiqi; Yu, Nannan; Song, Aimin

    2013-01-01

    In this paper, we propose a novel approach for performing high-quality edge-preserving image filtering. Based on a local linear model and using the principle of Stein's unbiased risk estimate as an estimator for the mean squared error from the noisy image only, we derive a simple explicit image filter which can filter out noise while preserving edges and fine-scale details. Moreover, this filter has a fast and exact linear-time algorithm whose computational complexity is independent of the filtering kernel size; thus, it can be applied to real time image processing tasks. The experimental results demonstrate the effectiveness of the new filter for various computer vision applications, including noise reduction, detail smoothing and enhancement, high dynamic range compression, and flash/no-flash denoising.
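    The local linear model at the heart of this family of filters can be sketched compactly: within each window the output is q = a*p + b, with a = var/(var + eps) and b = (1 - a)*mean, so high-variance (edge) regions pass through while flat regions are pulled toward the local mean. In the sketch below eps is fixed by hand; in LLSURE it is chosen by minimizing Stein's unbiased risk estimate, a step this simplified stand-in omits.

    ```python
    # Hedged sketch: local-linear (guided-filter-style) edge-preserving smoothing.
    import numpy as np
    from scipy.ndimage import uniform_filter

    def local_linear_filter(img, radius=3, eps=0.01):
        size = 2 * radius + 1
        mean = uniform_filter(img, size)
        var = uniform_filter(img * img, size) - mean * mean
        a = var / (var + eps)                 # edges (high variance) keep a ~ 1
        b = (1.0 - a) * mean                  # flat regions are pulled to the local mean
        # Average the per-window coefficients before applying them
        return uniform_filter(a, size) * img + uniform_filter(b, size)

    noisy = np.clip(np.eye(64) + 0.1 * np.random.default_rng(4).normal(size=(64, 64)), 0, 1)
    print(local_linear_filter(noisy).shape)
    ```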

  4. A randomised controlled trial testing a web-based, computer-tailored self-management intervention for people with or at risk for chronic obstructive pulmonary disease: a study protocol

    PubMed Central

    2013-01-01

    Background Chronic Obstructive Pulmonary Disease (COPD) is a major cause of morbidity and mortality. Effective self-management support interventions are needed to improve the health and functional status of people with COPD or at risk for COPD. Computer-tailored technology could be an effective way to provide this support. Methods/Design This paper presents the protocol of a randomised controlled trial testing the effectiveness of a web-based, computer-tailored self-management intervention to change health behaviours of people with or at risk for COPD. An intervention group will be compared to a usual care control group; the intervention group will receive a web-based, computer-tailored self-management intervention. Participants will be recruited from an online panel and through general practices. Outcomes will be measured at baseline and at 6 months. The primary outcomes will be smoking behaviour, measured as 7-day point prevalence abstinence, and physical activity, measured in minutes. Secondary outcomes will include dyspnoea score, quality of life, stages of change, intention to change behaviour and alternative smoking behaviour measures, including current smoking behaviour, 24-hour point prevalence abstinence, prolonged abstinence, continued abstinence and number of quit attempts. Discussion To the best of our knowledge, this will be the first randomised controlled trial to test the effectiveness of a web-based, computer-tailored self-management intervention for people with or at risk for COPD. The results will be important to explore the possible benefits of computer-tailored interventions for the self-management of people with or at risk for COPD and potentially other chronic health conditions. Dutch trial register NTR3421 PMID:23742208

  5. Computational Modeling of Fluid–Structure–Acoustics Interaction during Voice Production

    PubMed Central

    Jiang, Weili; Zheng, Xudong; Xue, Qian

    2017-01-01

    The paper presented a three-dimensional, first-principles-based fluid–structure–acoustics interaction computer model of voice production, which employed more realistic human laryngeal and vocal tract geometries. Self-sustained vibrations, the important convergent–divergent vibration pattern of the vocal folds, and entrainment of the two dominant vibratory modes were captured. Voice quality-associated parameters including the frequency, open quotient, skewness quotient, and flow rate of the glottal flow waveform were found to be well within the normal physiological ranges. The analogy between the vocal tract and a quarter-wave resonator was demonstrated. The acoustic perturbed flux and pressure inside the glottis were found to be at the same order as their incompressible counterparts, suggesting strong source–filter interactions during voice production. Such a high-fidelity computational model will be useful for investigating a variety of pathological conditions that involve complex vibrations, such as vocal fold paralysis, vocal nodules, and vocal polyps. The model is also an important step toward a patient-specific surgical planning tool that can serve as a no-risk trial and error platform for different procedures, such as injection of biomaterials and thyroplastic medialization. PMID:28243588
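    For reference, the quarter-wave resonator analogy noted above corresponds to the textbook formant estimate for a uniform tube closed at the glottis and open at the lips,

    $$
    f_n = \frac{(2n-1)\,c}{4L}, \qquad n = 1, 2, 3, \ldots
    $$

    which, with a speed of sound of roughly 350 m/s and a vocal tract length of about 0.17 m, places the first resonance near 500 Hz; these figures are the standard illustration of the analogy, not values taken from the paper.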

  6. Web based collaborative decision making in flood risk management

    NASA Astrophysics Data System (ADS)

    Evers, Mariele; Almoradie, Adrian; Jonoski, Andreja

    2014-05-01

    Stakeholder participation in the development of flood risk management (FRM) plans is essential since stakeholders often have a better understanding or knowledge of the potentials and limitations of their local area. Moreover, a participatory approach also creates trust amongst stakeholders, leading to a successful implementation of measures. Stakeholder participation, however, has its challenges and potential pitfalls that could lead to its premature termination. Such challenges and pitfalls include limited financial resources, the stakeholders' spatial distribution and their willingness to participate. Different types of participation in FRM may encounter different challenges. These types of participation in FRM can be classified into (1) Information and knowledge sharing (IKS), (2) Consultative participation (CP) or (3) Collaborative decision making (CDM) - the most challenging type of participation. An innovative approach to address these challenges and potential pitfalls is a web-based mobile or computer-aided environment for stakeholder participation, which enhances the remote interaction between participating entities such as stakeholders. This paper presents a framework and an implementation of a web-based CDM environment for the Alster catchment (Hamburg, Germany) and the Cranbrook catchment (London, UK). The CDM framework consists of two main stages: (1) Collaborative modelling and (2) Participatory decision making. This paper also highlights the stakeholder analyses, the modelling approach and the application of General Public License (GPL) technologies in developing the web-based environments. The environments were tested and evaluated through a series of stakeholder workshops. The overall results of the stakeholders' evaluations show that web-based environments can address the challenges and potential pitfalls of stakeholder participation and enhance participation in flood risk management. The web-based environment was developed within the DIANE-CM project (Decentralised Integrated Analysis and Enhancement of Awareness through Collaborative Modelling and Management of Flood Risk) of the 2nd ERANET CRUE funding initiative.

  7. Multiscale Mechanics of Articular Cartilage: Potentials and Challenges of Coupling Musculoskeletal, Joint, and Microscale Computational Models

    PubMed Central

    Halloran, J. P.; Sibole, S.; van Donkelaar, C. C.; van Turnhout, M. C.; Oomens, C. W. J.; Weiss, J. A.; Guilak, F.; Erdemir, A.

    2012-01-01

    Articular cartilage experiences significant mechanical loads during daily activities. Healthy cartilage provides the capacity for load bearing and regulates the mechanobiological processes for tissue development, maintenance, and repair. Experimental studies at multiple scales have provided a fundamental understanding of macroscopic mechanical function, evaluation of the micromechanical environment of chondrocytes, and the foundations for mechanobiological response. In addition, computational models of cartilage have offered a concise description of experimental data at many spatial levels under healthy and diseased conditions, and have served to generate hypotheses for the mechanical and biological function. Further, modeling and simulation provides a platform for predictive risk assessment, management of dysfunction, as well as a means to relate multiple spatial scales. Simulation-based investigation of cartilage comes with many challenges including both the computational burden and often insufficient availability of data for model development and validation. This review outlines recent modeling and simulation approaches to understand cartilage function from a mechanical systems perspective, and illustrates pathways to associate mechanics with biological function. Computational representations at single scales are provided from the body down to the microstructure, along with attempts to explore multiscale mechanisms of load sharing that dictate the mechanical environment of the cartilage and chondrocytes. PMID:22648577

  8. Motivating At-Risk Students through Computer-based Cooperative Learning Activities.

    ERIC Educational Resources Information Center

    Gan, Siowck-Lee

    1999-01-01

    Malaysian at-risk students trained in information-technology skills were appointed to lead cooperative-learning groups engaged in computer-search activities. Activities were structured to incorporate individual accountability, positive interdependence and interaction, collaborative skills, and group processing. Motivation, self-confidence,…

  9. Regional scale landslide risk assessment with a dynamic physical model - development, application and uncertainty analysis

    NASA Astrophysics Data System (ADS)

    Luna, Byron Quan; Vidar Vangelsten, Bjørn; Liu, Zhongqiang; Eidsvig, Unni; Nadim, Farrokh

    2013-04-01

    Landslide risk must be assessed at the appropriate scale in order to allow effective risk management. At the moment, few deterministic models exist that can perform all the computations required for a complete landslide risk assessment at a regional scale. This arises from the difficulty of precisely defining the location and volume of the released mass and from the inability of the models to compute the displacement for a large number of individual initiation areas, which is computationally prohibitive. This paper presents a medium-scale, dynamic physical model for rapid mass movements in mountainous and volcanic areas. The deterministic nature of the approach makes it possible to apply it to other sites since it considers the frictional equilibrium conditions for the initiation process, the rheological resistance of the displaced flow for the run-out process, and a fragility curve that links intensity to economic loss for each building. The model takes into account the triggering effect of an earthquake, intense rainfall and a combination of both (spatial and temporal). The run-out module of the model considers the flow as a 2-D continuum medium solving the equations of mass balance and momentum conservation. The model is embedded in an open-source geographical information system (GIS) environment, it is computationally efficient and it is transparent (understandable and comprehensible) for the end-user. The model was applied to a virtual region, assessing landslide hazard, vulnerability and risk. A Monte Carlo simulation scheme was applied to quantify, propagate and communicate the effects of uncertainty in input parameters on the final results. In this technique, the input distributions are recreated through sampling and the failure criteria are calculated for each stochastic realisation of the site properties. The model is able to identify the released volumes of the critical slopes and the areas threatened by the run-out intensity. The final outcome is an estimate of individual building damage and total economic risk. The research leading to these results has received funding from the European Community's Seventh Framework Programme [FP7/2007-2013] under grant agreement No 265138 New Multi-HAzard and MulTi-RIsK Assessment MethodS for Europe (MATRIX).
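    The Monte Carlo uncertainty propagation described above can be illustrated with a standard infinite-slope factor-of-safety criterion standing in for the paper's initiation module (the actual model also includes run-out and building fragility). All parameter distributions below are illustrative assumptions.

    ```python
    # Hedged sketch: Monte Carlo propagation of input uncertainty through an
    # infinite-slope factor-of-safety criterion.
    import numpy as np

    rng = np.random.default_rng(5)
    n = 100_000

    slope = np.radians(rng.normal(35.0, 2.0, n))        # slope angle
    cohesion = rng.lognormal(np.log(5.0e3), 0.3, n)     # effective cohesion, Pa
    phi = np.radians(rng.normal(30.0, 3.0, n))          # friction angle
    gamma, gamma_w = 19.0e3, 9.81e3                     # unit weights, N/m^3
    z = rng.uniform(1.0, 3.0, n)                        # failure-surface depth, m
    m = rng.uniform(0.0, 1.0, n)                        # saturation ratio of the soil column

    # Infinite-slope factor of safety
    fs = (cohesion + (gamma - m * gamma_w) * z * np.cos(slope) ** 2 * np.tan(phi)) \
         / (gamma * z * np.sin(slope) * np.cos(slope))

    print(f"probability of failure (FS < 1) ~ {np.mean(fs < 1.0):.3f}")
    ```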

  10. Topography- and nightlight-based national flood risk assessment in Canada

    NASA Astrophysics Data System (ADS)

    Elshorbagy, Amin; Bharath, Raja; Lakhanpal, Anchit; Ceola, Serena; Montanari, Alberto; Lindenschmidt, Karl-Erich

    2017-04-01

    In Canada, flood analysis and water resource management, in general, are tasks conducted at the provincial level; therefore, unified national-scale approaches to water-related problems are uncommon. In this study, a national-scale flood risk assessment approach is proposed and developed. The study focuses on using global and national datasets available with various resolutions to create flood risk maps. First, a flood hazard map of Canada is developed using topography-based parameters derived from digital elevation models, namely, elevation above nearest drainage (EAND) and distance from nearest drainage (DFND). This flood hazard mapping method is tested on a smaller area around the city of Calgary, Alberta, against a flood inundation map produced by the city using hydraulic modelling. Second, a flood exposure map of Canada is developed using a land-use map and the satellite-based nightlight luminosity data as two exposure parameters. Third, an economic flood risk map is produced, and subsequently overlaid with population density information to produce a socioeconomic flood risk map for Canada. All three maps of hazard, exposure, and risk are classified into five classes, ranging from very low to severe. A simple way to include flood protection measures in hazard estimation is also demonstrated using the example of the city of Winnipeg, Manitoba. This could be done for the entire country if information on flood protection across Canada were available. The evaluation of the flood hazard map shows that the topography-based method adopted in this study is both practical and reliable for large-scale analysis. Sensitivity analysis regarding the resolution of the digital elevation model is needed to identify the resolution that is fine enough for reliable hazard mapping, but coarse enough for computational tractability. The nightlight data are found to be useful for exposure and risk mapping in Canada; however, uncertainty analysis should be conducted to investigate the effect of the overglow phenomenon on flood risk mapping.
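    The two topography-based indices named above can be computed on a gridded digital elevation model with a distance transform: DFND is the distance to the nearest drainage cell, and EAND is the elevation difference to that same cell. The toy grid and single straight channel below are illustrative assumptions, not the study's national dataset.

    ```python
    # Hedged sketch: EAND and DFND on a synthetic DEM via a Euclidean distance transform.
    import numpy as np
    from scipy.ndimage import distance_transform_edt

    rng = np.random.default_rng(6)
    dem = np.cumsum(rng.normal(1.0, 0.5, size=(50, 50)), axis=0)   # synthetic elevations
    drainage = np.zeros_like(dem, dtype=bool)
    drainage[:, 25] = True                                         # one north-south channel

    # Distance to, and indices of, the nearest drainage cell for every grid cell
    dfnd, idx = distance_transform_edt(~drainage, return_indices=True)
    eand = dem - dem[idx[0], idx[1]]                               # elevation above nearest drainage

    print(f"mean DFND ~ {dfnd.mean():.1f} cells, mean EAND ~ {eand.mean():.2f} m")
    ```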

  11. Effects of a Computer-Based Early Reading Program on the Early Reading and Oral Language Skills of At-Risk Preschool Children

    ERIC Educational Resources Information Center

    Huffstetter, Mary; King, James R.; Onwuegbuzie, Anthony J.; Schneider, Jenifer J.; Powell-Smith, Kelly A.

    2010-01-01

    This study examined the effects of a computer-based early reading program (Headsprout Early Reading) on the oral language and early reading skills of at-risk preschool children. In a pretest-posttest control group design, 62 children were randomly assigned to receive supplemental instruction with Headsprout Early Reading (experimental group) or…

  12. Computational Toxicology at the US EPA

    EPA Science Inventory

    Computational toxicology is the application of mathematical and computer models to help assess chemical hazards and risks to human health and the environment. Supported by advances in informatics, high-throughput screening (HTS) technologies, and systems biology, EPA is developin...

  13. Computed tomography-based finite element analysis to assess fracture risk and osteoporosis treatment

    PubMed Central

    Imai, Kazuhiro

    2015-01-01

    Finite element analysis (FEA) is a computer technique for structural stress analysis developed in engineering mechanics. Over the past 40 years, FEA has been applied to investigate the structural behavior of human bones. As faster computers became available, improved FEA using 3-dimensional computed tomography (CT) was developed. This CT-based finite element analysis (CT/FEA) has provided clinicians with useful data. In this review, the mechanism of CT/FEA, validation studies of CT/FEA that evaluate its accuracy and reliability in human bones, and clinical application studies to assess fracture risk and the effects of osteoporosis medication are overviewed. PMID:26309819

  14. Physically-based Assessment of Tropical Cyclone Damage and Economic Losses

    NASA Astrophysics Data System (ADS)

    Lin, N.

    2012-12-01

    Estimating damage and economic losses caused by tropical cyclones (TC) is a topic of considerable research interest in many scientific fields, including meteorology, structural and coastal engineering, and actuarial sciences. One approach is based on the empirical relationship between TC characteristics and loss data. Another is to model the physical mechanism of TC-induced damage. In this talk we discuss the physically-based approach to predict TC damage and losses due to extreme wind and storm surge. We first present an integrated vulnerability model, which, for the first time, explicitly models the essential mechanisms causing wind damage to residential areas during storm passage, including windborne-debris impact and the pressure-debris interaction that may lead, in a chain reaction, to structural failures (Lin and Vanmarcke 2010; Lin et al. 2010a). This model can be used to predict the economic losses in a residential neighborhood (with hundreds of buildings) during a specific TC (Yau et al. 2011) or applied jointly with a TC risk model (e.g., Emanuel et al 2008) to estimate the expected losses over long time periods. Then we present a TC storm surge risk model that has been applied to New York City (Lin et al. 2010b; Lin et al. 2012; Aerts et al. 2012), Miami-Dade County, Florida (Klima et al. 2011), Galveston, Texas (Lickley, 2012), and other coastal areas around the world (e.g., Tampa, Florida; Persian Gulf; Darwin, Australia; Shanghai, China). These physically-based models are applicable to various coastal areas and have the capability to account for the change of the climate and coastal exposure over time. We also point out that, although made computationally efficient for risk assessment, these models are not suitable for regional or global analysis, which has been a focus of the empirically-based economic analysis (e.g., Hsiang and Narita 2012). A future research direction is to simplify the physically-based models, possibly through parameterization, and make connections to the global loss data and economic analysis.

  15. Failure of the Porcine Ascending Aorta: Multidirectional Experiments and a Unifying Microstructural Model

    PubMed Central

    Witzenburg, Colleen M.; Dhume, Rohit Y.; Shah, Sachin B.; Korenczuk, Christopher E.; Wagner, Hallie P.; Alford, Patrick W.; Barocas, Victor H.

    2017-01-01

    The ascending thoracic aorta is poorly understood mechanically, especially its risk of dissection. To make better predictions of dissection risk, more information about the multidimensional failure behavior of the tissue is needed, and this information must be incorporated into an appropriate theoretical/computational model. Toward the creation of such a model, uniaxial, equibiaxial, peel, and shear lap tests were performed on healthy porcine ascending aorta samples. Uniaxial and equibiaxial tests showed anisotropy with greater stiffness and strength in the circumferential direction. Shear lap tests showed catastrophic failure at shear stresses (150–200 kPa) much lower than uniaxial tests (750–2500 kPa), consistent with the low peel tension (∼60 mN/mm). A novel multiscale computational model, including both prefailure and failure mechanics of the aorta, was developed. The microstructural part of the model included contributions from a collagen-reinforced elastin sheet and interlamellar connections representing fibrillin and smooth muscle. Components were represented as nonlinear fibers that failed at a critical stretch. Multiscale simulations of the different experiments were performed, and the model, appropriately specified, agreed well with all experimental data, representing a uniquely complete structure-based description of aorta mechanics. In addition, our experiments and model demonstrate the very low strength of the aorta in radial shear, suggesting an important possible mechanism for aortic dissection. PMID:27893044

  16. High-Performance Computer Modeling of the Cosmos-Iridium Collision

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Olivier, S; Cook, K; Fasenfest, B

    2009-08-28

    This paper describes the application of a new, integrated modeling and simulation framework, encompassing the space situational awareness (SSA) enterprise, to the recent Cosmos-Iridium collision. This framework is based on a flexible, scalable architecture to enable efficient simulation of the current SSA enterprise, and to accommodate future advancements in SSA systems. In particular, the code is designed to take advantage of massively parallel, high-performance computer systems available, for example, at Lawrence Livermore National Laboratory. We will describe the application of this framework to the recent collision of the Cosmos and Iridium satellites, including (1) detailed hydrodynamic modeling of the satellite collision and resulting debris generation, (2) orbital propagation of the simulated debris and analysis of the increased risk to other satellites, (3) calculation of the radar and optical signatures of the simulated debris and modeling of debris detection with space surveillance radar and optical systems, (4) determination of simulated debris orbits from modeled space surveillance observations and analysis of the resulting orbital accuracy, and (5) comparison of these modeling and simulation results with Space Surveillance Network observations. We will also discuss the use of this integrated modeling and simulation framework to analyze the risks and consequences of future satellite collisions and to assess strategies for mitigating or avoiding future incidents, including the addition of new sensor systems, used in conjunction with the Space Surveillance Network, for improving space situational awareness.

  17. Risk Assessment in the 21st Century | Science Inventory | US ...

    EPA Pesticide Factsheets

    For the past ~50 years, risk assessment depended almost exclusively on animal testing for hazard identification and dose-response assessment. Originally sound and effective, this traditional approach is no longer adequate given the increasing dependence on chemical tools and the number of chemicals in commerce. This presentation provides an update on current progress in achieving the goals outlined in the NAS report on Toxicology Testing in the 21st Century, highlighting many of the advances led by the EPA. Topics covered include the evolution of the mode of action framework into a chemically agnostic adverse outcome pathway (AOP), a systems-based data framework that facilitates integration of modifiable factors (e.g., genetic variation, life stages), and an understanding of networks and mixtures. Further, the EDSP pivot is used to illustrate how AOPs drive development of predictive models for risk assessment based on assembly of high throughput assays representing AOP key elements. The birth of computational exposure science, capable of large-scale predictive exposure models, is reviewed. Although still in its infancy, the development of non-targeted analysis to begin addressing the exposome is also presented. Finally, the systems-based AEP is described, which integrates exposure, toxicokinetics and AOPs into a comprehensive framework.

  18. An improved approach for flight readiness certification: Probabilistic models for flaw propagation and turbine blade failure. Volume 1: Methodology and applications

    NASA Technical Reports Server (NTRS)

    Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.

    1992-01-01

    An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with analytical modeling of failure phenomena to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in analytical modeling, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which analytical models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. State-of-the-art analytical models currently employed for design, failure prediction, or performance analysis are used in this methodology. The rationale for the statistical approach taken in the PFA methodology is discussed, the PFA methodology is described, and examples of its application to structural failure modes are presented. The engineering models and computer software used in fatigue crack growth and fatigue crack initiation applications are thoroughly documented.

  19. An improved approach for flight readiness certification: Probabilistic models for flaw propagation and turbine blade failure. Volume 2: Software documentation

    NASA Technical Reports Server (NTRS)

    Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.

    1992-01-01

    An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with analytical modeling of failure phenomena to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in analytical modeling, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which analytical models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. State-of-the-art analytical models currently employed for design, failure prediction, or performance analysis are used in this methodology. The rationale for the statistical approach taken in the PFA methodology is discussed, the PFA methodology is described, and examples of its application to structural failure modes are presented. The engineering models and computer software used in fatigue crack growth and fatigue crack initiation applications are thoroughly documented.

  20. Computational modeling of the amphibian thyroid axis supported by targeted in vivo testing to advance quantitative adverse outcome pathway development

    EPA Science Inventory

    In vitro screening of chemicals for bioactivity together with computational modeling are beginning to replace animal toxicity testing in support of chemical risk assessment. To facilitate this transition, an amphibian thyroid axis model has been developed to describe thyroid home...

  1. Credibility Assessment of Deterministic Computational Models and Simulations for Space Biomedical Research and Operations

    NASA Technical Reports Server (NTRS)

    Mulugeta, Lealem; Walton, Marlei; Nelson, Emily; Myers, Jerry

    2015-01-01

    Human missions beyond low Earth orbit to destinations such as Mars and asteroids will expose astronauts to novel operational conditions that may pose health risks that are currently not well understood and perhaps unanticipated. In addition, there are limited clinical and research data to inform development and implementation of health risk countermeasures for these missions. Consequently, NASA's Digital Astronaut Project (DAP) is working to develop and implement computational models and simulations (M&S) to help predict and assess spaceflight health and performance risks, and enhance countermeasure development. In order to effectively accomplish these goals, the DAP evaluates its models and simulations via a rigorous verification, validation and credibility assessment process to ensure that the computational tools are sufficiently reliable both to inform research intended to mitigate potential risk and to guide countermeasure development. In doing so, DAP works closely with end-users, such as space life science researchers, to establish appropriate M&S credibility thresholds. We will present and demonstrate the process the DAP uses to vet computational M&S for space biomedical analysis using real M&S examples. We will also provide recommendations on how the larger space biomedical community can employ these concepts to enhance the credibility of their M&S codes.

  2. Assessment of cardiovascular risk profile based on measurement of tophus volume in patients with gout.

    PubMed

    Lee, Kyung-Ann; Ryu, Se-Ri; Park, Seong-Jun; Kim, Hae-Rim; Lee, Sang-Heon

    2018-05-01

    Hyperuricemia and gout are associated with increased risk of cardiovascular disease and metabolic syndrome. The aim of this study was to evaluate the correlation of total tophus volumes, measured using dual-energy computed tomography, with cardiovascular risk and the presence of metabolic syndrome. Dual-energy computed tomography datasets from 91 patients with a diagnosis of gout were analyzed retrospectively. Patients who received urate lowering therapy were excluded to avoid its effect on tophus volume. The total volumes of tophaceous deposition were quantified using automated volume assessment software. The 10-year cardiovascular risk using the Framingham Risk Score and metabolic syndrome based on the Third Adult Treatment Panel criteria were estimated. Fifty-five and 36 patients with positive and negative dual-energy computed tomography results, respectively, were assessed. Patients with positive dual-energy computed tomography results showed significantly higher systolic blood pressure, diastolic blood pressure, fasting glucose, and higher prevalence of chronic kidney disease, compared with those with negative dual-energy computed tomography results. The total tophus volumes were significantly correlated with the Framingham Risk Score and with the number of metabolic syndrome components (r = 0.22, p = 0.036 and r = 0.373, p < 0.001, respectively). The total tophus volume was one of the independent prognostic factors for the Framingham Risk Score in a multivariate analysis. This study showed the correlation of total tophus volumes with cardiovascular risk and metabolic syndrome-related comorbidities. A high urate burden could contribute to an unfavorable cardiovascular profile.

  3. Dynamic population flow based risk analysis of infectious disease propagation in a metropolis.

    PubMed

    Zhang, Nan; Huang, Hong; Duarte, Marlyn; Zhang, Junfeng Jim

    2016-09-01

    Knowledge on the characteristics of infectious disease propagation in metropolises plays a critical role in guiding public health intervention strategies to reduce death tolls, disease incidence, and possible economic losses. Based on the SIR model, we established a comprehensive spatiotemporal risk assessment model to compute infectious disease propagation within an urban setting using Beijing, China as a case study. The model was developed for a dynamic population distribution using actual data on location, density of residences and offices, and means of public transportation (e.g., subways, buses and taxis). We evaluated four influencing factors including biological, behavioral, environmental parameters and infectious sources. The model output resulted in a set of maps showing how the four influencing factors affected the trend and characteristics of airborne infectious disease propagation in Beijing. We compared the scenarios for the long-term dynamic propagation of infectious disease without governmental interventions versus scenarios with government intervention and hospital coordinated emergency responses. Lastly, the sensitivity of infection spread to the average number of people at different locations was analyzed. Based on our results, we provide valuable recommendations to governmental agencies and the public in order to minimize the disease propagation. Copyright © 2016 Elsevier Ltd. All rights reserved.
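
    As a minimal sketch of the epidemic core underlying such a model, the snippet below integrates a classic SIR system with SciPy; the transmission and recovery rates and initial conditions are illustrative, and the spatial population-flow and intervention layers described in the abstract are not modeled.

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    def sir(t, y, beta, gamma):
        s, i, r = y
        return [-beta * s * i, beta * s * i - gamma * i, gamma * i]

    beta, gamma = 0.4, 0.1          # illustrative transmission / recovery rates (per day)
    y0 = [0.999, 0.001, 0.0]        # initial susceptible / infectious / recovered fractions
    sol = solve_ivp(sir, (0, 160), y0, args=(beta, gamma), dense_output=True)

    t = np.linspace(0, 160, 161)
    s, i, r = sol.sol(t)
    print(f"peak infectious fraction {i.max():.3f} on day {t[i.argmax()]:.0f}")
    ```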

  4. Assessment of global and local region-based bilateral mammographic feature asymmetry to predict short-term breast cancer risk

    NASA Astrophysics Data System (ADS)

    Li, Yane; Fan, Ming; Cheng, Hu; Zhang, Peng; Zheng, Bin; Li, Lihua

    2018-01-01

    This study aims to develop and test a new imaging marker-based short-term breast cancer risk prediction model. An age-matched dataset of 566 screening mammography cases was used. All ‘prior’ images acquired in the two screening series were negative, while in the ‘current’ screening images, 283 cases were positive for cancer and 283 cases remained negative. For each case, two bilateral cranio-caudal view mammograms acquired from the ‘prior’ negative screenings were selected and processed by a computer-aided image processing scheme, which segmented the entire breast area into nine strip-based local regions, extracted the element regions using difference of Gaussian filters, and computed both global- and local-based bilateral asymmetrical image features. An initial feature pool included 190 features related to the spatial distribution and structural similarity of grayscale values, as well as of the magnitude and phase responses of multidirectional Gabor filters. Next, a short-term breast cancer risk prediction model based on a generalized linear model was built using an embedded stepwise regression analysis method to select features and a leave-one-case-out cross-validation method to predict the likelihood of each woman having image-detectable cancer in the next sequential mammography screening. The area under the receiver operating characteristic curve (AUC) significantly increased from 0.5863 ± 0.0237, when the model was trained on image features extracted from the global regions only, to 0.6870 ± 0.0220, when it was trained on features extracted from both the global and the matched local regions (p = 0.0001). The odds ratio values monotonically increased from 1.00 to 8.11 with a significantly increasing trend in slope (p = 0.0028) as the model-generated risk score increased. In addition, the AUC values were 0.6555 ± 0.0437, 0.6958 ± 0.0290, and 0.7054 ± 0.0529 for the three age groups of 37-49, 50-65, and 66-87 years old, respectively. AUC values of 0.6529 ± 0.1100, 0.6820 ± 0.0353, 0.6836 ± 0.0302 and 0.8043 ± 0.1067 were yielded for the four mammography density sub-groups (BIRADS from 1-4), respectively. This study demonstrated that bilateral asymmetry features extracted from local regions combined with the global region in bilateral negative mammograms could be used as a new imaging marker to assist in the prediction of short-term breast cancer risk.
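
    A minimal sketch of the modeling step, a generalized linear (logistic) model evaluated with leave-one-case-out cross-validation and scored by AUC, is shown below. It assumes scikit-learn and uses synthetic features in place of the bilateral asymmetry features, so the numbers it prints bear no relation to the study's results.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import LeaveOneOut, cross_val_predict
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(1)
    n_cases, n_features = 200, 12                 # synthetic stand-ins for the asymmetry features
    X = rng.normal(size=(n_cases, n_features))
    y = rng.integers(0, 2, size=n_cases)          # 1 = cancer at next screening, 0 = negative
    X[y == 1] += 0.4                              # inject a weak signal

    model = LogisticRegression(max_iter=1000)     # generalized linear model
    scores = cross_val_predict(model, X, y, cv=LeaveOneOut(), method="predict_proba")[:, 1]
    print(f"leave-one-case-out AUC: {roc_auc_score(y, scores):.3f}")
    ```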

  5. Software Safety Risk in Legacy Safety-Critical Computer Systems

    NASA Technical Reports Server (NTRS)

    Hill, Janice; Baggs, Rhoda

    2007-01-01

    Safety-critical computer systems must be engineered to meet system and software safety requirements. For legacy safety-critical computer systems, software safety requirements may not have been formally specified during development. When process-oriented software safety requirements are levied on a legacy system after the fact, where software development artifacts don't exist or are incomplete, the question becomes 'how can this be done?' The risks associated with only meeting certain software safety requirements in a legacy safety-critical computer system must be addressed should such systems be selected as candidates for reuse. This paper proposes a method for formally performing a software safety risk assessment that provides measurements of software safety for legacy systems, which may or may not have the suite of software engineering documentation now normally required. It relies upon the NASA Software Safety Standard, risk assessment methods based upon the Taxonomy-Based Questionnaire, and the application of reverse engineering CASE tools to produce original design documents for legacy systems.

  6. Consumers' behavior in quantitative microbial risk assessment for pathogens in raw milk: Incorporation of the likelihood of consumption as a function of storage time and temperature.

    PubMed

    Crotta, Matteo; Paterlini, Franco; Rizzi, Rita; Guitian, Javier

    2016-02-01

    Foodborne disease as a result of raw milk consumption is an increasing concern in Western countries. Quantitative microbial risk assessment models have been used to estimate the risk of illness due to different pathogens in raw milk. In these models, the duration and temperature of storage before consumption have a critical influence on the final outcome of the simulations and are usually described and modeled as independent distributions in the consumer phase module. We hypothesize that this assumption can result in the computation, during simulations, of extreme scenarios that ultimately lead to an overestimation of the risk. In this study, a sensorial analysis was conducted to replicate consumers' behavior. The results of the analysis were used to establish, by means of a logistic model, the relationship between time-temperature combinations and the probability that a serving of raw milk is actually consumed. To assess our hypothesis, 2 recently published quantitative microbial risk assessment models quantifying the risks of listeriosis and salmonellosis related to the consumption of raw milk were implemented. First, the default settings described in the publications were kept; second, the likelihood of consumption as a function of the length and temperature of storage was included. When results were compared, the density of computed extreme scenarios decreased significantly in the modified model; consequently, the probability of illness and the expected number of cases per year also decreased. Reductions of 11.6 and 12.7% in the proportion of computed scenarios in which a contaminated milk serving was consumed were observed for the first and the second study, respectively. Our results confirm that overlooking the time-temperature dependency may lead to an important overestimation of the risk. Furthermore, we provide estimates of this dependency that could easily be implemented in future quantitative microbial risk assessment models of raw milk pathogens. Copyright © 2016 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
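
    The sketch below illustrates the paper's central idea, down-weighting extreme storage scenarios with a logistic consumption probability inside a Monte Carlo risk simulation. The logistic coefficients, the growth model, and the dose-response parameters are all hypothetical and are not the published estimates.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    n = 100_000

    # Storage conditions sampled independently, as in a default consumer-phase module
    time_h = rng.uniform(0, 120, n)              # storage time (hours)
    temp_c = rng.uniform(4, 25, n)               # storage temperature (deg C)

    # Hypothetical logistic model: probability that a serving is actually consumed
    b0, b_time, b_temp = 6.0, -0.04, -0.15
    p_consume = 1.0 / (1.0 + np.exp(-(b0 + b_time * time_h + b_temp * temp_c)))
    consumed = rng.random(n) < p_consume

    # Toy pathogen growth during storage and exponential dose-response (illustrative)
    log10_growth = 0.002 * time_h * np.clip(temp_c - 4, 0, None)
    dose = 10 ** (rng.normal(-1, 1, n) + log10_growth)
    p_ill = 1 - np.exp(-1e-3 * dose)

    print(f"mean P(ill), all scenarios:     {p_ill.mean():.2e}")
    print(f"mean P(ill), consumed servings: {p_ill[consumed].mean():.2e}")
    ```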

  7. Risk Based Reservoir Operations Using Ensemble Streamflow Predictions for Lake Mendocino in Mendocino County, California

    NASA Astrophysics Data System (ADS)

    Delaney, C.; Mendoza, J.; Whitin, B.; Hartman, R. K.

    2017-12-01

    Ensemble Forecast Operations (EFO) is a risk-based approach to reservoir flood operations that incorporates ensemble streamflow predictions (ESPs) made by NOAA's California-Nevada River Forecast Center (CNRFC). With the EFO approach, each member of an ESP is individually modeled to forecast system conditions and calculate the risk of reaching critical operational thresholds. Reservoir release decisions are then computed that seek to keep forecasted risk within established risk tolerance levels. A water management model was developed for Lake Mendocino, a 111,000 acre-foot reservoir located near Ukiah, California, to evaluate the viability of the EFO alternative to improve water supply reliability without increasing downstream flood risk. Lake Mendocino is a dual use reservoir, which is owned and operated for flood control by the United States Army Corps of Engineers and is operated for water supply by the Sonoma County Water Agency. Due to recent changes in the operations of an upstream hydroelectric facility, this reservoir has suffered from water supply reliability issues since 2007. The EFO alternative was simulated using a 26-year (1985-2010) ESP hindcast generated by the CNRFC, which approximates flow forecasts for 61 ensemble members for a 15-day horizon. Model simulation results of the EFO alternative demonstrate a 36% increase in median end of water year (September 30) storage levels over existing operations. Additionally, model results show no increase in occurrence of flows above flood stage for points downstream of Lake Mendocino. This investigation demonstrates that the EFO alternative may be a viable approach for managing Lake Mendocino for multiple purposes (water supply, flood mitigation, ecosystems) and warrants further investigation through additional modeling and analysis.
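
    A minimal sketch of the EFO decision logic is shown below: each ensemble member is routed through a simple storage balance, forecasted risk is the fraction of members that reach a critical threshold, and the smallest release keeping that risk within tolerance is chosen. The inflow distribution, storage values, and tolerance are hypothetical, not Lake Mendocino data.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    n_members, horizon = 61, 15                              # ensemble size and forecast days
    inflow = rng.gamma(3.0, 1200.0, (n_members, horizon))    # acre-ft/day, illustrative

    storage0 = 70_000.0        # current storage (acre-ft)
    flood_pool = 111_000.0     # critical storage threshold (acre-ft)
    tolerance = 0.05           # acceptable probability of reaching the threshold

    def exceedance_risk(release):
        """Fraction of ensemble members that hit the flood pool under a constant release."""
        storage = storage0 + np.cumsum(inflow - release, axis=1)
        return np.mean(storage.max(axis=1) >= flood_pool)

    # Smallest constant release (acre-ft/day) that keeps forecasted risk within tolerance
    for release in np.arange(0, 5000, 50):
        if exceedance_risk(release) <= tolerance:
            print(f"release {release:.0f} acre-ft/day -> risk {exceedance_risk(release):.3f}")
            break
    ```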

  8. A randomized trial of computer-based communications using imagery and text information to alter representations of heart disease risk and motivate protective behaviour.

    PubMed

    Lee, Tarryn J; Cameron, Linda D; Wünsche, Burkhard; Stevens, Carey

    2011-02-01

    Advances in web-based animation technologies provide new opportunities to develop graphic health communications for dissemination throughout communities. We developed imagery and text contents of brief, computer-based programmes about heart disease risk, with both imagery and text contents guided by the common-sense model (CSM) of self-regulation. The imagery depicts a three-dimensional, beating heart tailored to user-specific information. A 2 × 2 × 4 factorial design was used to manipulate concrete imagery (imagery vs. no imagery) and conceptual information (text vs. no text) about heart disease risk in prevention-oriented programmes and assess changes in representations and behavioural motivations from baseline to 2 days, 2 weeks, and 4 weeks post-intervention. Sedentary young adults (N= 80) were randomized to view one of four programmes: imagery plus text, imagery only, text only, or control. Participants completed measures of risk representations, worry, and physical activity and healthy diet intentions and behaviours at baseline, 2 days post-intervention (except behaviours), and 2 weeks (intentions and behaviours only) and 4 weeks later. The imagery contents increased representational beliefs and mental imagery relating to heart disease, worry, and intentions at post-intervention. Increases in sense of coherence (understanding of heart disease) and worry were sustained after 1 month. The imagery contents also increased healthy diet efforts after 2 weeks. The text contents increased beliefs about causal factors, mental images of clogged arteries, and worry at post-intervention, and increased physical activity 2 weeks later and sense of coherence 1 month later. The CSM-based programmes induced short-term changes in risk representations and behaviour motivation. The combination of CSM-based text and imagery appears to be most effective in instilling risk representations that motivate protective behaviour. ©2010 The British Psychological Society.

  9. Validation of a multifactorial risk factor model used for predicting future caries risk with Nevada adolescents.

    PubMed

    Ditmyer, Marcia M; Dounis, Georgia; Howard, Katherine M; Mobley, Connie; Cappelli, David

    2011-05-20

    The objective of this study was to measure the validity and reliability of a multifactorial Risk Factor Model developed for use in predicting future caries risk in Nevada adolescents in a public health setting. This study examined retrospective data from an oral health surveillance initiative that screened over 51,000 students 13-18 years of age, attending public/private schools in Nevada across six academic years (2002/2003-2007/2008). The Risk Factor Model included ten demographic variables: exposure to fluoridation in the municipal water supply, environmental smoke exposure, race, age, locale (metropolitan vs. rural), tobacco use, Body Mass Index, insurance status, sex, and sealant application. Multiple regression was used in a previous study to establish which variables significantly contributed to caries risk. Follow-up logistic regression ascertained the weight of contribution and odds ratios of the ten variables. Researchers in this study computed sensitivity, specificity, positive predictive value (PVP), negative predictive value (PVN), and prevalence across all six years of screening to assess the validity of the Risk Factor Model. Subjects' overall mean caries prevalence across all six years was 66%. Average sensitivity across all six years was 79%; average specificity was 81%; average PVP was 89% and average PVN was 67%. Overall, the Risk Factor Model provided a relatively constant, valid measure of caries that could be used in conjunction with a comprehensive risk assessment in population-based screenings by school nurses/nurse practitioners, health educators, and physicians to guide them in assessing potential future caries risk for use in prevention and referral practices.
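
    The validity metrics reported here follow directly from a 2x2 classification table; the sketch below computes them from hypothetical counts chosen only so the output roughly matches the reported six-year averages, not from the Nevada data.

    ```python
    def screening_metrics(tp, fp, fn, tn):
        """Validity metrics for a risk model treated as a binary screen."""
        sensitivity = tp / (tp + fn)
        specificity = tn / (tn + fp)
        pvp = tp / (tp + fp)            # positive predictive value
        pvn = tn / (tn + fn)            # negative predictive value
        prevalence = (tp + fn) / (tp + fp + fn + tn)
        return sensitivity, specificity, pvp, pvn, prevalence

    # Hypothetical counts for one screening year (not the study's data)
    sens, spec, pvp, pvn, prev = screening_metrics(tp=5200, fp=640, fn=1380, tn=2780)
    print(f"sensitivity={sens:.2f} specificity={spec:.2f} "
          f"PVP={pvp:.2f} PVN={pvn:.2f} prevalence={prev:.2f}")
    ```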

  10. A Bayesian framework for early risk prediction in traumatic brain injury

    NASA Astrophysics Data System (ADS)

    Chaganti, Shikha; Plassard, Andrew J.; Wilson, Laura; Smith, Miya A.; Patel, Mayur B.; Landman, Bennett A.

    2016-03-01

    Early detection of risk is critical in determining the course of treatment in traumatic brain injury (TBI). Computed tomography (CT) acquired at admission has shown latent prognostic value in prior studies; however, no robust clinical risk predictions have been achieved based on the imaging data in large-scale TBI analysis. The major challenge lies in the lack of consistent and complete medical records for patients, and an inherent bias associated with the limited number of patient samples with high-risk outcomes in available TBI datasets. Herein, we propose a Bayesian framework with mutual information-based forward feature selection to handle this type of data. Using multi-atlas segmentation, 154 image-based features (capturing intensity, volume and texture) were computed over 22 ROIs in 1791 CT scans. These features were combined with 14 clinical parameters and converted into risk likelihood scores using Bayes modeling. We explore the prediction power of the image features versus the clinical measures for various risk outcomes. The imaging data alone were more predictive of outcomes than the clinical data (including Marshall CT classification) for discharge disposition with an area under the curve of 0.81 vs. 0.67, but less predictive than clinical data for discharge Glasgow Coma Scale (GCS) score with an area under the curve of 0.65 vs. 0.85. However, in both cases, combining imaging and clinical data increased the area under the curve to 0.86 for discharge disposition and 0.88 for discharge GCS score. In conclusion, CT data have meaningful prognostic value for TBI patients beyond what is captured in clinical measures and the Marshall CT classification.
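
    A minimal sketch of this kind of pipeline is shown below, assuming scikit-learn: features are ranked by mutual information with the outcome and a Gaussian naive Bayes model produces risk scores. A simple one-shot MI ranking stands in for the paper's greedy forward selection, and the data are synthetic.

    ```python
    import numpy as np
    from sklearn.feature_selection import mutual_info_classif
    from sklearn.naive_bayes import GaussianNB
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(3)
    n, n_img, n_clin = 500, 154, 14               # patients, image features, clinical features
    X = rng.normal(size=(n, n_img + n_clin))
    y = (X[:, :5].sum(axis=1) + rng.normal(0, 2, n) > 0).astype(int)   # synthetic outcome

    # Rank features by mutual information with the outcome and keep the top k
    mi = mutual_info_classif(X, y, random_state=0)
    top_k = np.argsort(mi)[::-1][:20]

    auc = cross_val_score(GaussianNB(), X[:, top_k], y, cv=5, scoring="roc_auc").mean()
    print(f"5-fold AUC with top-20 MI-selected features: {auc:.3f}")
    ```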

  11. Risk Assessment Methodology Based on the NISTIR 7628 Guidelines

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abercrombie, Robert K; Sheldon, Frederick T; Hauser, Katie R

    2013-01-01

    Earlier work describes computational models of critical infrastructure that allow an analyst to estimate the security of a system in terms of the impact of loss per stakeholder resulting from security breakdowns. Here, we consider how to identify, monitor and estimate risk impact and probability for different smart grid stakeholders. Our constructive method leverages currently available standards and defined failure scenarios. We utilize the National Institute of Standards and Technology (NIST) Interagency or Internal Reports (NISTIR) 7628 as a basis to apply Cyberspace Security Econometrics system (CSES) for comparing design principles and courses of action in making security-related decisions.

  12. FRAT-up, a Web-based Fall-Risk Assessment Tool for Elderly People Living in the Community

    PubMed Central

    Cattelani, Luca; Palumbo, Pierpaolo; Palmerini, Luca; Bandinelli, Stefania; Becker, Clemens; Chiari, Lorenzo

    2015-01-01

    Background About 30% of people over 65 are subject to at least one unintentional fall a year. Fall prevention protocols and interventions can decrease the number of falls. To be effective, a prevention strategy requires a prior step to evaluate the fall risk of the subjects. Despite extensive research, existing assessment tools for fall risk have been insufficient for predicting falls. Objective The goal of this study is to present a novel web-based fall-risk assessment tool (FRAT-up) and to evaluate its accuracy in predicting falls, within a context of community-dwelling persons aged 65 and up. Methods FRAT-up is based on the assumption that a subject’s fall risk is given by the contribution of their exposure to each of the known fall-risk factors. Many scientific studies have investigated the relationship between falls and risk factors. The majority of these studies adopted statistical approaches, usually providing quantitative information such as odds ratios. FRAT-up exploits these numerical results to compute how each single factor contributes to the overall fall risk. FRAT-up is based on a formal ontology that enlists a number of known risk factors, together with quantitative findings in terms of odds ratios. From such information, an automatic algorithm generates a rule-based probabilistic logic program, that is, a set of rules for each risk factor. The rule-based program takes the health profile of the subject (in terms of exposure to the risk factors) and computes the fall risk. A Web-based interface allows users to input health profiles and to visualize the risk assessment for the given subject. FRAT-up has been evaluated on the InCHIANTI Study dataset, a representative population-based study of older persons living in the Chianti area (Tuscany, Italy). We compared reported falls with predicted ones and computed performance indicators. Results The obtained area under curve of the receiver operating characteristic was 0.642 (95% CI 0.614-0.669), while the Brier score was 0.174. The Hosmer-Lemeshow test indicated statistical significance of miscalibration. Conclusions FRAT-up is a web-based tool for evaluating the fall risk of people aged 65 or up living in the community. Validation results of fall risks computed by FRAT-up show that its performance is comparable to externally validated state-of-the-art tools. A prototype is freely available through a web-based interface. Trial Registration ClinicalTrials.gov NCT01331512 (The InChianti Follow-Up Study); http://clinicaltrials.gov/show/NCT01331512 (Archived by WebCite at http://www.webcitation.org/6UDrrRuaR). PMID:25693419
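
    The central FRAT-up computation, combining literature odds ratios for the risk factors a subject is exposed to into a single fall probability, can be sketched as below. The baseline probability and odds ratios are hypothetical, and the real tool derives its rules from a formal ontology and a probabilistic logic program rather than a simple multiplicative combination.

    ```python
    # Hypothetical odds ratios for a few known fall-risk factors (not FRAT-up's ontology values)
    odds_ratios = {"previous_falls": 2.8, "gait_problems": 2.1,
                   "sedative_use": 1.7, "visual_impairment": 1.4}

    def fall_risk(exposures, baseline_prob=0.25):
        """Combine per-factor odds ratios multiplicatively on the odds scale,
        a simplification of FRAT-up's rule-based probabilistic logic program."""
        odds = baseline_prob / (1.0 - baseline_prob)
        for factor, exposed in exposures.items():
            if exposed:
                odds *= odds_ratios[factor]
        return odds / (1.0 + odds)

    profile = {"previous_falls": True, "gait_problems": False,
               "sedative_use": True, "visual_impairment": False}
    print(f"estimated one-year fall risk: {fall_risk(profile):.2f}")
    ```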

  13. Comptox Chemistry Dashboard: Web-Based Data Integration Hub for Environmental Chemistry and Toxicology Data (ACS Fall meeting 4 of 12)

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) Computational Toxicology Program integrates advances in biology, chemistry, exposure and computer science to help prioritize chemicals for further research based on potential human health risks. This work involves computational and da...

  14. Two-Step Approach for the Prediction of Future Type 2 Diabetes Risk

    PubMed Central

    Abdul-Ghani, Muhammad A.; Abdul-Ghani, Tamam; Stern, Michael P.; Karavic, Jasmina; Tuomi, Tiinamaija; Bo, Insoma; DeFronzo, Ralph A.; Groop, Leif

    2011-01-01

    OBJECTIVE To develop a model for the prediction of type 2 diabetes mellitus (T2DM) risk on the basis of a multivariate logistic model and 1-h plasma glucose concentration (1-h PG). RESEARCH DESIGN AND METHODS The model was developed in a cohort of 1,562 nondiabetic subjects from the San Antonio Heart Study (SAHS) and validated in 2,395 nondiabetic subjects in the Botnia Study. A risk score on the basis of anthropometric parameters, plasma glucose and lipid profile, and blood pressure was computed for each subject. Subjects with a risk score above a certain cut point were considered to represent high-risk individuals, and their 1-h PG concentration during the oral glucose tolerance test was used to further refine their future T2DM risk. RESULTS We used the San Antonio Diabetes Prediction Model (SADPM) to generate the initial risk score. A risk-score value of 0.065 was found to be an optimal cut point for initial screening and selection of high-risk individuals. A 1-h PG concentration >140 mg/dL in high-risk individuals (whose risk score was >0.065) was the optimal cut point for identification of subjects at increased risk. The two cut points had sensitivity, specificity, and positive predictive value of 77.8, 77.4, and 44.8% in the SAHS and 75.8, 71.6, and 11.9% in the Botnia Study, respectively. CONCLUSIONS A two-step model, based on the combination of the SADPM and 1-h PG, is a useful tool for the identification of high-risk Mexican-American and Caucasian individuals. PMID:21788628
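
    The two-step decision rule itself is simple enough to express directly. The sketch below applies the reported cut points (risk score > 0.065, then 1-h PG > 140 mg/dL); the SADPM risk-score calculation is not reproduced, and the example subjects are hypothetical.

    ```python
    def two_step_high_risk(risk_score, one_hour_pg_mg_dl,
                           score_cut=0.065, pg_cut=140.0):
        """Two-step rule from the abstract: flag subjects whose initial risk score
        exceeds the cut point AND whose 1-h plasma glucose exceeds 140 mg/dL.
        The risk score itself would come from the SADPM logistic model (not shown)."""
        if risk_score <= score_cut:
            return False                      # low initial risk; 1-h PG not used
        return one_hour_pg_mg_dl > pg_cut     # high risk confirmed by 1-h PG

    # Hypothetical subjects: (SADPM-style risk score, 1-h plasma glucose in mg/dL)
    subjects = [(0.03, 155.0), (0.09, 120.0), (0.12, 168.0)]
    for score, pg in subjects:
        flag = "increased risk" if two_step_high_risk(score, pg) else "not flagged"
        print(f"score {score:.3f}, 1-h PG {pg:.0f} -> {flag}")
    ```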

  15. The potential value of Clostridium difficile vaccine: an economic computer simulation model.

    PubMed

    Lee, Bruce Y; Popovich, Michael J; Tian, Ye; Bailey, Rachel R; Ufberg, Paul J; Wiringa, Ann E; Muder, Robert R

    2010-07-19

    Efforts are currently underway to develop a vaccine against Clostridium difficile infection (CDI). We developed two decision analytic Monte Carlo computer simulation models: (1) an Initial Prevention Model depicting the decision whether to administer C. difficile vaccine to patients at-risk for CDI and (2) a Recurrence Prevention Model depicting the decision whether to administer C. difficile vaccine to prevent CDI recurrence. Our results suggest that a C. difficile vaccine could be cost-effective over a wide range of C. difficile risk, vaccine costs, and vaccine efficacies, especially when used post-CDI treatment to prevent recurrent disease. (c) 2010 Elsevier Ltd. All rights reserved.
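
    A toy decision-analytic Monte Carlo comparison in the spirit of the Initial Prevention Model is sketched below; every probability, cost, efficacy range, and QALY weight is hypothetical, chosen only to show how an incremental cost-effectiveness ratio emerges from such a simulation.

    ```python
    import numpy as np

    rng = np.random.default_rng(11)
    n = 100_000

    p_cdi = rng.beta(5, 45, n)            # baseline CDI risk without vaccine (~10%, illustrative)
    efficacy = rng.uniform(0.5, 0.9, n)   # vaccine efficacy (illustrative range)
    cost_vaccine = 900.0                  # $ per vaccination course (illustrative)
    cost_cdi = rng.gamma(4, 3000, n)      # $ cost of a CDI episode (illustrative)
    qaly_loss_cdi = 0.05                  # QALYs lost per episode (illustrative)

    cdi_no_vax = rng.random(n) < p_cdi
    cdi_vax = rng.random(n) < p_cdi * (1 - efficacy)

    inc_cost = cost_vaccine + cdi_vax * cost_cdi - cdi_no_vax * cost_cdi
    inc_qaly = (cdi_no_vax.astype(float) - cdi_vax) * qaly_loss_cdi
    icer = inc_cost.mean() / inc_qaly.mean()
    print(f"incremental cost-effectiveness ratio: ${icer:,.0f} per QALY gained")
    ```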

  16. The Potential Value of Clostridium difficile Vaccine: An Economic Computer Simulation Model

    PubMed Central

    Lee, Bruce Y.; Popovich, Michael J.; Tian, Ye; Bailey, Rachel R.; Ufberg, Paul J.; Wiringa, Ann E.; Muder, Robert R.

    2010-01-01

    Efforts are currently underway to develop a vaccine against Clostridium difficile infection (CDI). We developed two decision analytic Monte Carlo computer simulation models: (1) an Initial Prevention Model depicting the decision whether to administer C. difficile vaccine to patients at-risk for CDI and (2) a Recurrence Prevention Model depicting the decision whether to administer C. difficile vaccine to prevent CDI recurrence. Our results suggest that a C. difficile vaccine could be cost-effective over a wide range of C. difficile risk, vaccine costs, and vaccine efficacies especially when being used post-CDI treatment to prevent recurrent disease. PMID:20541582

  17. Consequence of climate mitigation on the risk of hunger.

    PubMed

    Hasegawa, Tomoko; Fujimori, Shinichiro; Shin, Yonghee; Tanaka, Akemi; Takahashi, Kiyoshi; Masui, Toshihiko

    2015-06-16

    Climate change and mitigation measures have three major impacts on food consumption and the risk of hunger: (1) changes in crop yields caused by climate change; (2) competition for land between food crops and energy crops driven by the use of bioenergy; and (3) costs associated with mitigation measures taken to meet an emissions reduction target that keeps the global average temperature increase to 2 °C. In this study, we combined a global computable general equilibrium model and a crop model (M-GAEZ), and we quantified the three impacts on risk of hunger through 2050 based on the uncertainty range associated with 12 climate models and one economic and demographic scenario. The strong mitigation measures aimed at attaining the 2 °C target reduce the negative effects of climate change on yields but have large negative impacts on the risk of hunger due to mitigation costs in the low-income countries. We also found that in a strongly carbon-constrained world, the change in food consumption resulting from mitigation measures depends more strongly on the change in incomes than the change in food prices.

  18. Radiomics-based differentiation of lung disease models generated by polluted air based on X-ray computed tomography data.

    PubMed

    Szigeti, Krisztián; Szabó, Tibor; Korom, Csaba; Czibak, Ilona; Horváth, Ildikó; Veres, Dániel S; Gyöngyi, Zoltán; Karlinger, Kinga; Bergmann, Ralf; Pócsik, Márta; Budán, Ferenc; Máthé, Domokos

    2016-02-11

    Lung diseases (resulting from air pollution) require a widely accessible method for risk estimation and early diagnosis to ensure proper and responsive treatment. Radiomics-based fractal dimension analysis of X-ray computed tomography attenuation patterns in chest voxels of mice exposed to different air polluting agents was performed to model early stages of disease and establish differential diagnosis. To model different types of air pollution, BALBc/ByJ mouse groups were exposed to cigarette smoke combined with ozone or to sulphur dioxide gas, and a control group was established. Two weeks after exposure, the frequency distributions of image voxel attenuation data were evaluated. Specific cut-off ranges were defined to group voxels by attenuation. Cut-off ranges were binarized and their spatial pattern was associated with the calculated fractal dimension, then abstracted as a mathematical function relating fractal dimension to cut-off range. Nonparametric Kruskal-Wallis (KW) and Mann-Whitney post hoc (MWph) tests were used. Each cut-off range versus fractal dimension function plot was found to contain two distinctive Gaussian curves. The ratios of the Gaussian curve parameters differ significantly and are statistically distinguishable among the three exposure groups. A new radiomics evaluation method was established based on analysis of the fractal dimension of chest X-ray computed tomography data segments. The specific attenuation patterns calculated using our method may help diagnose and monitor certain lung diseases, such as chronic obstructive pulmonary disease (COPD), asthma, tuberculosis or lung carcinomas.
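
    The binarize-and-measure step can be illustrated with a box-counting estimate of fractal dimension on a single 2D slice. The fake attenuation data and cut-off range below are placeholders; the published analysis works on real 3D CT volumes with multiple cut-off ranges.

    ```python
    import numpy as np

    def box_counting_dimension(mask, sizes=(2, 4, 8, 16, 32)):
        """Estimate the fractal dimension of a 2D binary mask by box counting."""
        counts = []
        for s in sizes:
            h, w = (mask.shape[0] // s) * s, (mask.shape[1] // s) * s
            blocks = mask[:h, :w].reshape(h // s, s, w // s, s)
            counts.append(np.sum(blocks.any(axis=(1, 3))))
        # slope of log(count) versus log(1/size) gives the dimension
        slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
        return slope

    rng = np.random.default_rng(5)
    ct_slice = rng.normal(-500, 200, size=(256, 256))        # fake attenuation values (HU)
    mask = (ct_slice > -400) & (ct_slice < -200)             # one attenuation cut-off range
    print(f"estimated fractal dimension: {box_counting_dimension(mask):.2f}")
    ```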

  19. Capacity planning for electronic waste management facilities under uncertainty: multi-objective multi-time-step model development.

    PubMed

    Poonam Khanijo Ahluwalia; Nema, Arvind K

    2011-07-01

    Selection of optimum locations for new facilities and decisions regarding capacities at the proposed facilities are major concerns for municipal authorities/managers. The decision as to whether a single facility is preferred over multiple facilities of smaller capacities would vary with the relative priorities assigned to cost and to associated risks such as environmental risk, health risk, or risk perceived by society. Currently, management of waste streams such as computer waste is carried out using rudimentary practices and is flourishing as an unorganized sector, mainly as backyard workshops in many cities of developing nations such as India. Uncertainty in the quantification of computer waste generation is another major concern due to the informal setup of the present computer waste management scenario. Hence, there is a need to address uncertainty in waste generation quantities while simultaneously analyzing the tradeoffs between cost and associated risks. The present study aimed to address the above-mentioned issues with a multi-time-step, multi-objective decision-support model, which can address multiple objectives of cost, environmental risk, socially perceived risk and health risk, while selecting the optimum configuration of existing and proposed facilities (location and capacities).

  20. A Fast Method for Embattling Optimization of Ground-Based Radar Surveillance Network

    NASA Astrophysics Data System (ADS)

    Jiang, H.; Cheng, H.; Zhang, Y.; Liu, J.

    A growing number of space activities have created an orbital debris environment that poses increasing impact risks to existing space systems and human space flight. For the safety of in-orbit spacecraft, many observation facilities are needed to catalog space objects, especially in low Earth orbit. Surveillance of low Earth orbit objects relies mainly on ground-based radar; because of the limited capability of existing radar facilities, a large number of ground-based radars will need to be built in the next few years to meet current space surveillance demands. How to optimize the embattling (station layout) of a ground-based radar surveillance network is therefore a problem that needs to be solved. The traditional method for embattling optimization of a ground-based radar surveillance network relies mainly on detection simulation of all possible stations against cataloged data, a comprehensive comparative analysis of the simulation results using combinational methods, and selection of an optimal result as the station layout scheme. This method is time consuming for a single simulation and computationally complex for the combinational analysis; as the number of stations increases, the complexity of the optimization problem grows exponentially and cannot be handled by the traditional method. There has been no better way to solve this problem until now. In this paper, the target detection procedure was simplified. First, the space coverage of ground-based radar was simplified and a space-coverage projection model of radar facilities at different orbit altitudes was built; then a simplified model of objects crossing the radar coverage was established according to the characteristics of space object orbital motion. After these two simplification steps, the computational complexity of target detection was greatly reduced, and simulation results confirmed the correctness of the simplified model. In addition, the detection areas of the ground-based radar network can be computed easily with the simplified model, and the embattling of the network can then be optimized with an artificial intelligence algorithm, which greatly reduces the computational complexity. Compared with the traditional method, the proposed method greatly improves computational efficiency.
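
    To make the optimization step concrete, the sketch below selects a station layout greedily from a precomputed boolean coverage matrix. The coverage matrix is randomly generated here, and a greedy set-cover heuristic stands in for whatever artificial intelligence algorithm the authors actually used.

    ```python
    import numpy as np

    rng = np.random.default_rng(13)
    n_candidates, n_objects = 40, 5000

    # coverage[i, j] = True if candidate station i can detect object j
    # (in practice this would come from the simplified coverage-projection model)
    coverage = rng.random((n_candidates, n_objects)) < rng.uniform(0.02, 0.15, (n_candidates, 1))

    def greedy_layout(coverage, n_stations):
        chosen, covered = [], np.zeros(coverage.shape[1], dtype=bool)
        for _ in range(n_stations):
            gains = (coverage & ~covered).sum(axis=1)     # new objects each station would add
            best = int(np.argmax(gains))
            chosen.append(best)
            covered |= coverage[best]
        return chosen, covered.mean()

    stations, frac = greedy_layout(coverage, n_stations=8)
    print(f"selected stations {stations}, catalog coverage {frac:.1%}")
    ```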

  1. Proof-of-Concept Demonstrations for Computation-Based Human Reliability Analysis. Modeling Operator Performance During Flooding Scenarios

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Joe, Jeffrey Clark; Boring, Ronald Laurids; Herberger, Sarah Elizabeth Marie

    The United States (U.S.) Department of Energy (DOE) Light Water Reactor Sustainability (LWRS) program has the overall objective to help sustain the existing commercial nuclear power plants (NPPs). To accomplish this program objective, there are multiple LWRS “pathways,” or research and development (R&D) focus areas. One LWRS focus area is called the Risk-Informed Safety Margin Characterization (RISMC) pathway. Initial efforts under this pathway to combine probabilistic and plant multi-physics models to quantify safety margins and support business decisions also included HRA, but in a somewhat simplified manner. HRA experts at Idaho National Laboratory (INL) have been collaborating with other experts to develop a computational HRA approach, called the Human Unimodel for Nuclear Technology to Enhance Reliability (HUNTER), for inclusion into the RISMC framework. The basic premise of this research is to leverage applicable computational techniques, namely simulation and modeling, to develop and then, using RAVEN as a controller, seamlessly integrate virtual operator models (HUNTER) with 1) the dynamic computational MOOSE runtime environment that includes a full-scope plant model, and 2) the RISMC framework PRA models already in use. The HUNTER computational HRA approach is a hybrid approach that leverages past work from cognitive psychology, human performance modeling, and HRA, but it is also a significant departure from existing static and even dynamic HRA methods. This report is divided into five chapters that cover the development of an external flooding event test case and associated statistical modeling considerations.

  2. Optimizing global liver function in radiation therapy treatment planning

    NASA Astrophysics Data System (ADS)

    Wu, Victor W.; Epelman, Marina A.; Wang, Hesheng; Romeijn, H. Edwin; Feng, Mary; Cao, Yue; Ten Haken, Randall K.; Matuszak, Martha M.

    2016-09-01

    Liver stereotactic body radiation therapy (SBRT) patients differ in both pre-treatment liver function (e.g. due to degree of cirrhosis and/or prior treatment) and radiosensitivity, leading to high variability in potential liver toxicity with similar doses. This work investigates three treatment planning optimization models that minimize risk of toxicity: two consider both voxel-based pre-treatment liver function and local-function-based radiosensitivity with dose; one considers only dose. Each model optimizes a different objective function (varying in the complexity with which it captures the influence of dose on liver function) subject to the same dose constraints, and the models are tested on 2D synthesized and 3D clinical cases. The normal-liver-based objective functions are the linearized equivalent uniform dose (ℓEUD) (conventional 'ℓEUD model'), the so-called perfusion-weighted ℓEUD (fEUD) (proposed 'fEUD model'), and post-treatment global liver function (GLF) (proposed 'GLF model'), predicted by a new liver-perfusion-based dose-response model. The resulting ℓEUD, fEUD, and GLF plans delivering the same target ℓEUD are compared with respect to their post-treatment function and various dose-based metrics. Voxel-based portal venous liver perfusion, used as a measure of local function, is computed using DCE-MRI. In the cases used in our experiments, the GLF plan preserves up to 4.6% (7.5%) more liver function than the fEUD (ℓEUD) plan does in 2D cases, and up to 4.5% (5.6%) in 3D cases. The GLF and fEUD plans worsen the ℓEUD of functional liver on average by 1.0 Gy and 0.5 Gy in 2D and 3D cases, respectively. Liver perfusion information can be used during treatment planning to minimize the risk of toxicity by improving expected GLF; the degree of benefit varies with perfusion pattern. Although fEUD model optimization is computationally inexpensive and often achieves better GLF than ℓEUD model optimization does, the GLF model directly optimizes a more clinically relevant metric and can further improve fEUD plan quality.
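
    A minimal sketch of the distinction between the two simpler objectives is given below: ℓEUD is an unweighted mean of normal-liver voxel doses, while fEUD weights each voxel by its DCE-MRI perfusion value. The dose and perfusion arrays are synthetic, and the GLF dose-response model is not reproduced.

    ```python
    import numpy as np

    rng = np.random.default_rng(21)
    n_voxels = 50_000
    dose = rng.gamma(2.0, 8.0, n_voxels)          # Gy, synthetic normal-liver voxel doses
    perfusion = rng.gamma(3.0, 30.0, n_voxels)    # synthetic portal venous perfusion values

    leud = dose.mean()                                       # linearized EUD (unweighted mean dose)
    feud = np.sum(perfusion * dose) / np.sum(perfusion)      # perfusion-weighted EUD
    print(f"lEUD = {leud:.1f} Gy, fEUD = {feud:.1f} Gy")
    ```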

  3. The SERGISAI procedure for seismic risk assessment

    NASA Astrophysics Data System (ADS)

    Zonno, G.; Garcia-Fernandez, M.; Jimenez, M.J.; Menoni, S.; Meroni, F.; Petrini, V.

    The European project SERGISAI developed a computational tool where a methodology for seismic risk assessment at different geographical scales has been implemented. Experts of various disciplines, including seismologists, engineers, planners, geologists, and computer scientists, co-operated in an actual multidisciplinary process to develop this tool. Standard procedural codes, Geographical Information Systems (GIS), and Artificial Intelligence (AI) techniques compose the whole system, that will enable the end user to carry out a complete seismic risk assessment at three geographical scales: regional, sub-regional and local. At present, single codes or models that have been incorporated are not new in general, but the modularity of the prototype, based on a user-friendly front-end, offers potential users the possibility of updating or replacing any code or model if desired. The proposed procedure is a first attempt to integrate tools, codes and methods for assessing expected earthquake damage, and it was mainly designed to become a useful support for civil defence and land use planning agencies. Risk factors have been treated in the most suitable way for each one, in terms of level of detail, kind of parameters and units of measure. Identifying various geographical scales is not a mere question of dimension, since entities to be studied correspond to areas defined by administrative and geographical borders. The procedure was applied in the following areas: Toscana in Italy, for the regional scale, the Garfagnana area in Toscana, for the sub-regional scale, and a part of Barcelona city, Spain, for the local scale.

  4. Early assessment of proarrhythmic risk of drugs using the in vitro data and single-cell-based in silico models: proof of concept.

    PubMed

    Abbasi, Mitra; Small, Ben G; Patel, Nikunjkumar; Jamei, Masoud; Polak, Sebastian

    2017-02-01

    To determine the predictive performance of in silico models using drug-specific preclinical cardiac electrophysiology data to investigate drug-induced arrhythmia risk (e.g. Torsade de pointes (TdP)) in virtual human subjects. To assess drug proarrhythmic risk, we used a set of in vitro electrophysiological measurements describing ion channel inhibition triggered by the investigated drugs. The Cardiac Safety Simulator version 2.0 (CSS; Simcyp, Sheffield, UK) platform was used to simulate human left ventricular cardiac myocyte action potential models. This study shows the impact of drug concentration changes on particular ionic currents by using available experimental data. The simulation results express a safety margin in terms of the drug concentration threshold and log (threshold concentration / effective therapeutic plasma concentration (ETPC)). By reproducing the underlying biophysical characteristics of cardiac cells, we reproduced drug effects associated with cardiac arrhythmias (action potential duration (APD) and QT prolongation and TdP) that were observed in published 3D simulations, yet with much less computational burden.

  5. Virtual reality in radiology: virtual intervention

    NASA Astrophysics Data System (ADS)

    Harreld, Michael R.; Valentino, Daniel J.; Duckwiler, Gary R.; Lufkin, Robert B.; Karplus, Walter J.

    1995-04-01

    Intracranial aneurysms are the primary cause of non-traumatic subarachnoid hemorrhage. Morbidity and mortality remain high even with current endovascular intervention techniques. It is presently impossible to identify which aneurysms will grow and rupture; however, hemodynamics are thought to play an important role in aneurysm development. With this in mind, we have simulated blood flow in laboratory animals using three dimensional computational fluid dynamics software. The data output from these simulations is three dimensional, complex and transient. Visualization of 3D flow structures with standard 2D display is cumbersome, and may be better performed using a virtual reality system. We are developing a VR-based system for visualization of the computed blood flow and stress fields. This paper presents the progress to date and future plans for our clinical VR-based intervention simulator. The ultimate goal is to develop a software system that will be able to accurately model an aneurysm detected on clinical angiography, visualize this model in virtual reality, predict its future behavior, and give insight into the type of treatment necessary. An associated database will give historical and outcome information on prior aneurysms (including dynamic, structural, and categorical data) that will be matched to any current case, and assist in treatment planning (e.g., natural history vs. treatment risk, surgical vs. endovascular treatment risks, cure prediction, complication rates).

  6. Reproducibility of risk figures in 2nd-trimester maternal serum screening for down syndrome: comparison of 2 laboratories.

    PubMed

    Benn, Peter A; Makowski, Gregory S; Egan, James F X; Wright, Dave

    2006-11-01

    Analytical error affects 2nd-trimester maternal serum screening for Down syndrome risk estimation. We analyzed the between-laboratory reproducibility of risk estimates from 2 laboratories. Laboratory 1 used Bayer ACS180 immunoassays for alpha-fetoprotein (AFP) and human chorionic gonadotropin (hCG), Diagnostic Systems Laboratories (DSL) RIA for unconjugated estriol (uE3), and DSL enzyme immunoassay for inhibin-A (INH-A). Laboratory 2 used Beckman immunoassays for AFP, hCG, and uE3, and DSL enzyme immunoassay for INH-A. Analyte medians were separately established for each laboratory. We used the same computational algorithm for all risk calculations, and we used Monte Carlo methods for computer modeling. For 462 samples tested, risk figures from the 2 laboratories differed >2-fold for 44.7%, >5-fold for 7.1%, and >10-fold for 1.7%. Between-laboratory differences in analytes were greatest for uE3 and INH-A. The screen-positive rates were 9.3% for laboratory 1 and 11.5% for laboratory 2, with a significant difference in the patients identified as screen-positive vs screen-negative (McNemar test, P<0.001). Computer modeling confirmed the large between-laboratory risk differences. Differences in performance of assays and laboratory procedures can have a large effect on patient-specific risks. Screening laboratories should minimize test imprecision and ensure that each assay performs in a manner similar to that assumed in the risk computational algorithm.

  7. Reduced order models for prediction of groundwater quality impacts from CO₂ and brine leakage

    DOE PAGES

    Zheng, Liange; Carroll, Susan; Bianchi, Marco; ...

    2014-12-31

    A careful assessment of the risk associated with geologic CO₂ storage is critical to the deployment of large-scale storage projects. A potential risk is the deterioration of groundwater quality caused by the leakage of CO₂ and brine from deep subsurface reservoirs. In probabilistic risk assessment studies, numerical modeling is the primary tool employed to assess risk. However, the application of traditional numerical models to fully evaluate the impact of CO₂ leakage on groundwater can be computationally complex, demanding large processing times and resources, and involving large uncertainties. As an alternative, reduced order models (ROMs) can be used as highly efficient surrogates for the complex process-based numerical models. In this study, we represent the complex hydrogeological and geochemical conditions in a heterogeneous aquifer and subsequent risk by developing and using two separate ROMs. The first ROM is derived from a model that accounts for the heterogeneous flow and transport conditions in the presence of complex leakage functions for CO₂ and brine. The second ROM is obtained from models that feature similar, but simplified flow and transport conditions, and allow for a more complex representation of all relevant geochemical reactions. To quantify possible impacts to groundwater aquifers, the basic risk metric is taken as the aquifer volume in which the water quality of the aquifer may be affected by an underlying CO₂ storage project. The integration of the two ROMs provides an estimate of the impacted aquifer volume taking into account uncertainties in flow, transport and chemical conditions. These two ROMs can be linked in a comprehensive system level model for quantitative risk assessment of the deep storage reservoir, wellbore leakage, and shallow aquifer impacts to assess the collective risk of CO₂ storage projects.

  8. Lognormal Approximations of Fault Tree Uncertainty Distributions.

    PubMed

    El-Shanawany, Ashraf Ben; Ardron, Keith H; Walker, Simon P

    2018-01-26

    Fault trees are used in reliability modeling to create logical models of fault combinations that can lead to undesirable events. The output of a fault tree analysis (the top event probability) is expressed in terms of the failure probabilities of basic events that are input to the model. Typically, the basic event probabilities are not known exactly, but are modeled as probability distributions: therefore, the top event probability is also represented as an uncertainty distribution. Monte Carlo methods are generally used for evaluating the uncertainty distribution, but such calculations are computationally intensive and do not readily reveal the dominant contributors to the uncertainty. In this article, a closed-form approximation for the fault tree top event uncertainty distribution is developed, which is applicable when the uncertainties in the basic events of the model are lognormally distributed. The results of the approximate method are compared with results from two sampling-based methods: namely, the Monte Carlo method and the Wilks method based on order statistics. It is shown that the closed-form expression can provide a reasonable approximation to results obtained by Monte Carlo sampling, without incurring the computational expense. The Wilks method is found to be a useful means of providing an upper bound for the percentiles of the uncertainty distribution while being computationally inexpensive compared with full Monte Carlo sampling. The lognormal approximation method and Wilks's method appear attractive, practical alternatives for the evaluation of uncertainty in the output of fault trees and similar multilinear models. © 2018 Society for Risk Analysis.
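
    For the simplest case the closed form is easy to state: the top event of an AND gate is the product of independent lognormal basic-event probabilities and is therefore itself lognormal, with log-scale parameters obtained by summing. The sketch below uses made-up medians and error factors and compares that closed form with Monte Carlo; the article's general result for arbitrary fault trees and multilinear models is more involved.

    ```python
    import numpy as np

    # Basic-event probabilities as lognormals given by median and error factor (p95 / p50)
    medians = np.array([1e-3, 5e-4, 2e-3])
    error_factors = np.array([3.0, 5.0, 3.0])
    sigmas = np.log(error_factors) / 1.645          # EF = exp(1.645 * sigma)
    mus = np.log(medians)

    # AND gate: top event = product of basic events -> lognormal(sum(mu), sqrt(sum(sigma^2)))
    mu_top, sigma_top = mus.sum(), np.sqrt(np.sum(sigmas ** 2))
    p95_closed_form = np.exp(mu_top + 1.645 * sigma_top)

    # Monte Carlo check
    rng = np.random.default_rng(2)
    samples = np.prod(rng.lognormal(mus, sigmas, size=(200_000, 3)), axis=1)
    print(f"95th percentile of top event probability: "
          f"closed form {p95_closed_form:.2e}, Monte Carlo {np.percentile(samples, 95):.2e}")
    ```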

  9. Regulator of telomere elongation helicase 1 (RTEL1) rs6010620 polymorphism contribute to increased risk of glioma.

    PubMed

    Zhao, Wei; Bian, Yusong; Zhu, Wei; Zou, Peng; Tang, Guotai

    2014-06-01

    Regulator of telomere elongation helicase 1 (RTEL1) is critical for genome stability and tumor avoidance. Many studies have reported the associations of RTEL1 rs6010620 with glioma risk, but individually published results were inconclusive. This meta-analysis was performed to quantitatively summarize the evidence for such a relationship. The PubMed, Embase, and Web of Science were systematically searched to identify relevant studies. The odds ratio (OR) and 95 % confidence interval (95 % CI) were computed to estimate the strength of the association using a fixed or random effects model. Ten studies were eligible for meta-analysis including data on glioma with 6,490 cases and 9,288 controls. Overall, there was a significant association between RTEL1 rs6010620 polymorphism and glioma risk in all four genetic models (GG vs. AA: OR=1.87, 95 % CI=1.60-2.18, P heterogeneity=0.552; GA vs. AA: OR=1.30, 95 % CI=1.16-1.46, P heterogeneity=0.495; dominant model-GG+GA vs. AA: OR=1.46, 95 % CI=1.31-1.63, P heterogeneity=0.528; recessive model-GG vs. GA+AA: OR=1.36, 95 % CI=1.27-1.46, P heterogeneity=0.093). Subgroup analyses by ethnicity showed that RTEL1 rs6010620 polymorphism resulted in a higher risk of glioma among both Asians and Caucasians. In the stratified analysis by ethnicity and source of controls, significantly increased risk was observed for Asians and Europeans in all genetic models, population-based studies in all genetic models, and hospital-based studies in three genetic models (heterozygote comparison, homozygote comparison, and dominant model). Our meta-analysis suggested that RTEL1 rs6010620 polymorphism is likely to be associated with increased glioma risk, which lends further biological plausibility to these findings.
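
    The pooling arithmetic behind such a fixed-effects estimate can be sketched as inverse-variance weighting of per-study log odds ratios; the study values below are hypothetical and do not reproduce the ten studies analyzed.

    ```python
    import numpy as np

    # Hypothetical per-study odds ratios and 95% CIs (not the actual ten studies)
    or_i = np.array([1.9, 1.7, 2.1, 1.8, 1.6])
    ci_low = np.array([1.4, 1.2, 1.5, 1.3, 1.1])
    ci_high = np.array([2.6, 2.4, 2.9, 2.5, 2.3])

    log_or = np.log(or_i)
    se = (np.log(ci_high) - np.log(ci_low)) / (2 * 1.96)   # SE recovered from the CI width
    w = 1.0 / se ** 2                                      # inverse-variance weights (fixed effects)

    pooled = np.sum(w * log_or) / np.sum(w)
    pooled_se = np.sqrt(1.0 / np.sum(w))
    print(f"pooled OR = {np.exp(pooled):.2f} "
          f"(95% CI {np.exp(pooled - 1.96 * pooled_se):.2f}-{np.exp(pooled + 1.96 * pooled_se):.2f})")
    ```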

  10. Developing Computer Model-Based Assessment of Chemical Reasoning: A Feasibility Study

    ERIC Educational Resources Information Center

    Liu, Xiufeng; Waight, Noemi; Gregorius, Roberto; Smith, Erica; Park, Mihwa

    2012-01-01

    This paper reports a feasibility study on developing computer model-based assessments of chemical reasoning at the high school level. Computer models are Flash and NetLogo environments that make three domains in chemistry simultaneously available: macroscopic, submicroscopic, and symbolic. Students interact with computer models to answer assessment…

  11. Estimating radiation risk induced by CT screening for Korean population

    NASA Astrophysics Data System (ADS)

    Yang, Won Seok; Yang, Hye Jeong; Min, Byung In

    2017-02-01

    The purposes of this study are to estimate the radiation risks induced by chest/abdomen computed tomography (CT) screening for healthcare and to determine the cancer risk level of the Korean population compared to other populations. We used an ImPACT CT Patient Dosimetry Calculator to compute the organ effective dose induced by CT screening (chest, low-dose chest, abdomen/pelvis, and chest/abdomen/pelvis CT). A risk model was applied using principles based on the BEIR VII Report in order to estimate the lifetime attributable risk (LAR) using the Korean Life Table 2010. In addition, several countries, including Hong Kong, the United States (U.S.), and the United Kingdom (U.K.), were selected for comparison. Herein, each population exposed to a radiation dose of 100 mSv was classified according to country, gender and age. For the four CT screening protocols, the total effective doses calculated by ImPACT were 6.2, 1.5, 5.2 and 11.4 mSv, respectively. The LAR for Korean females was similar to that for Hong Kong females but lower than those for U.S. and U.K. females, except for women in their twenties. The LAR of Korean males was the highest for all types of CT screening. However, the differences in risk level were negligible because the values were quite low.

  12. Computational strategy for quantifying human pesticide exposure based upon a saliva measurement

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Timchalk, Charles; Weber, Thomas J.; Smith, Jordan N.

    The National Research Council of the National Academies report, Toxicity Testing in the 21st Century: A Vision and Strategy, highlighted the importance of quantitative exposure data for evaluating human toxicity risk and noted that biomonitoring is a critical tool for quantitatively evaluating exposure from both environmental and occupational settings. Direct measurement of chemical exposures using personal monitoring provides the most accurate estimation of a subject’s true exposure, and non-invasive methods have also been advocated for quantifying the pharmacokinetics and bioavailability of drugs and xenobiotics. In this regard, there is a need to identify chemicals that are readily cleared in saliva at concentrations that can be quantified to support the implementation of this approach. The current manuscript describes the use of computational modeling approaches that are closely coupled to in vivo and in vitro experiments to predict salivary uptake and clearance of xenobiotics. The primary mechanism by which xenobiotics leave the blood and enter saliva is thought to involve paracellular transport, passive transcellular diffusion, or transcellular active transport, with the majority of drugs and xenobiotics cleared from plasma into saliva by passive diffusion. The transcellular or paracellular diffusion of unbound chemicals in plasma to saliva has been computationally modeled using a combination of compartmental and physiologically based approaches. Of key importance for determining the plasma:saliva partitioning was the utilization of a modified Schmitt algorithm that calculates partitioning based upon the tissue composition, pH, chemical pKa and plasma protein-binding. Sensitivity analysis of key model parameters specifically identified that both protein-binding and pKa (for weak acids and bases) had the most significant impact on the determination of partitioning and that there were clear species dependent differences based upon physiological variance between rats and humans. Ongoing efforts are focused on extending this modeling strategy to an in vitro salivary acinar cell based system that will be utilized to experimentally determine and computationally predict salivary gland uptake and clearance for a broad range of xenobiotics. Hence, it is envisioned that a combination of salivary biomonitoring and computational modeling will enable the non-invasive measurement of both environmental and occupational exposure in human populations using saliva.
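
    The pH/pKa and protein-binding dependence highlighted by the sensitivity analysis is commonly captured by the classical pH-partition (Henderson-Hasselbalch) estimate of the saliva-to-plasma concentration ratio. The sketch below uses that textbook form rather than the modified Schmitt algorithm, and the example drug properties are hypothetical.

    ```python
    def saliva_plasma_ratio(pka, fu_plasma, fu_saliva=1.0,
                            ph_plasma=7.4, ph_saliva=6.5, base=True):
        """pH-partition estimate of the saliva:plasma ratio; only the unbound,
        un-ionized species is assumed to diffuse across the salivary epithelium."""
        if base:
            ionization_s = 1 + 10 ** (pka - ph_saliva)
            ionization_p = 1 + 10 ** (pka - ph_plasma)
        else:  # weak acid
            ionization_s = 1 + 10 ** (ph_saliva - pka)
            ionization_p = 1 + 10 ** (ph_plasma - pka)
        return (ionization_s / ionization_p) * (fu_plasma / fu_saliva)

    # Hypothetical weak base: pKa 8.0, 60% bound to plasma proteins
    print(f"predicted saliva:plasma ratio: {saliva_plasma_ratio(pka=8.0, fu_plasma=0.4):.2f}")
    ```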

  13. Computed Tomography Imaging Features in Acute Uncomplicated Stanford Type-B Aortic Dissection Predict Late Adverse Events.

    PubMed

    Sailer, Anna M; van Kuijk, Sander M J; Nelemans, Patricia J; Chin, Anne S; Kino, Aya; Huininga, Mark; Schmidt, Johanna; Mistelbauer, Gabriel; Bäumler, Kathrin; Chiu, Peter; Fischbein, Michael P; Dake, Michael D; Miller, D Craig; Schurink, Geert Willem H; Fleischmann, Dominik

    2017-04-01

    Medical treatment of initially uncomplicated acute Stanford type-B aortic dissection is associated with a high rate of late adverse events. Identification of individuals who potentially benefit from preventive endografting is highly desirable. The association of computed tomography imaging features with late adverse events was retrospectively assessed in 83 patients with acute uncomplicated Stanford type-B aortic dissection, followed over a median of 850 (interquartile range 247-1824) days. Adverse events were defined as fatal or nonfatal aortic rupture, rapid aortic growth (>10 mm/y), aneurysm formation (≥6 cm), organ or limb ischemia, or new uncontrollable hypertension or pain. Five significant predictors were identified using multivariable Cox regression analysis: connective tissue disease (hazard ratio [HR] 2.94, 95% confidence interval [CI]: 1.29-6.72; P =0.01), circumferential extent of false lumen in angular degrees (HR 1.03 per degree, 95% CI: 1.01-1.04, P =0.003), maximum aortic diameter (HR 1.10 per mm, 95% CI: 1.02-1.18, P =0.015), false lumen outflow (HR 0.999 per mL/min, 95% CI: 0.998-1.000; P =0.055), and number of intercostal arteries (HR 0.89 per n, 95% CI: 0.80-0.98; P =0.024). A prediction model was constructed to calculate patient specific risk at 1, 2, and 5 years and to stratify patients into high-, intermediate-, and low-risk groups. The model was internally validated by bootstrapping and showed good discriminatory ability with an optimism-corrected C statistic of 70.1%. Computed tomography imaging-based morphological features combined into a prediction model may be able to identify patients at high risk for late adverse events after an initially uncomplicated type-B aortic dissection. © 2017 American Heart Association, Inc.
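
    A sketch of how such a Cox-model prediction turns the reported hazard ratios into a patient-specific risk at a fixed time is given below. The baseline survival value, the reference covariate values, and the example patient are all hypothetical, since only the hazard ratios are given in the abstract.

    ```python
    import math

    # ln(hazard ratios) from the abstract (per unit of each predictor)
    beta = {
        "connective_tissue_disease": math.log(2.94),   # yes/no
        "false_lumen_degrees": math.log(1.03),         # per degree of circumference
        "max_aortic_diameter_mm": math.log(1.10),      # per mm
        "false_lumen_outflow_ml_min": math.log(0.999), # per mL/min
        "n_intercostal_arteries": math.log(0.89),      # per artery
    }

    def risk_at_time(patient, reference, s0):
        """Cox-model risk: 1 - S0(t) ** exp(linear predictor centered on a reference patient)."""
        lp = sum(beta[k] * (patient[k] - reference[k]) for k in beta)
        return 1.0 - s0 ** math.exp(lp)

    reference = {"connective_tissue_disease": 0, "false_lumen_degrees": 180,
                 "max_aortic_diameter_mm": 38, "false_lumen_outflow_ml_min": 400,
                 "n_intercostal_arteries": 8}
    patient = dict(reference, false_lumen_degrees=210, max_aortic_diameter_mm=45)

    s0_2yr = 0.80   # hypothetical 2-year baseline (reference-patient) event-free survival
    print(f"predicted 2-year risk of a late adverse event: {risk_at_time(patient, reference, s0_2yr):.0%}")
    ```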

  14. Analytical Algorithms to Quantify the Uncertainty in Remaining Useful Life Prediction

    NASA Technical Reports Server (NTRS)

    Sankararaman, Shankar; Saxena, Abhinav; Daigle, Matthew; Goebel, Kai

    2013-01-01

    This paper investigates the use of analytical algorithms to quantify the uncertainty in the remaining useful life (RUL) estimate of components used in aerospace applications. The prediction of RUL is affected by several sources of uncertainty, and it is important to systematically quantify their combined effect by computing the uncertainty in the RUL prediction in order to aid risk assessment, risk mitigation, and decision-making. While sampling-based algorithms have been conventionally used for quantifying the uncertainty in RUL, analytical algorithms are computationally cheaper and, in some cases, better suited for online decision-making. While exact analytical algorithms are available only for certain special cases (e.g., linear models with Gaussian variables), effective approximations can be made using the first-order second-moment method (FOSM), the first-order reliability method (FORM), and the inverse first-order reliability method (Inverse FORM). These methods can be used not only to calculate the entire probability distribution of RUL but also to obtain probability bounds on RUL. This paper explains these three methods in detail and illustrates them using the state-space model of a lithium-ion battery.
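
    The sketch below illustrates the cheapest of the three approximations, the first-order second-moment (FOSM) method, applied to a toy capacity-fade model. The model and its parameter values are invented for illustration and are not the battery state-space model used in the paper.

```python
import numpy as np

# First-order second-moment (FOSM) propagation of parameter uncertainty into a
# remaining-useful-life estimate: mean at the nominal inputs, variance from
# finite-difference sensitivities times input variances (independent inputs).

def rul_model(x):
    """Toy RUL model: cycles until capacity drops below a failure threshold."""
    initial_capacity, fade_per_cycle, threshold = x
    return (initial_capacity - threshold) / fade_per_cycle


def fosm(g, mean, std, eps=1e-6):
    """Return approximate (mean, std) of g(X) for independent inputs."""
    mean = np.asarray(mean, dtype=float)
    g0 = g(mean)
    grad = np.zeros_like(mean)
    for i in range(len(mean)):
        x = mean.copy()
        x[i] += eps
        grad[i] = (g(x) - g0) / eps          # finite-difference sensitivity
    var = np.sum((grad * np.asarray(std)) ** 2)
    return g0, np.sqrt(var)


mu = [2.0, 0.002, 1.4]        # Ah, Ah/cycle, Ah (illustrative values)
sigma = [0.05, 0.0002, 0.0]
m, s = fosm(rul_model, mu, sigma)
print(f"RUL approx. {m:.0f} cycles, std approx. {s:.0f} cycles")
```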

  15. Optimizing colorectal cancer screening by race and sex: Microsimulation analysis II to inform the American Cancer Society colorectal cancer screening guideline.

    PubMed

    Meester, Reinier G S; Peterse, Elisabeth F P; Knudsen, Amy B; de Weerdt, Anne C; Chen, Jennifer C; Lietz, Anna P; Dwyer, Andrea; Ahnen, Dennis J; Siegel, Rebecca L; Smith, Robert A; Zauber, Ann G; Lansdorp-Vogelaar, Iris

    2018-05-30

    Colorectal cancer (CRC) risk varies by race and sex. This study, 1 of 2 microsimulation analyses to inform the 2018 American Cancer Society CRC screening guideline, explored the influence of race and sex on optimal CRC screening strategies. Two Cancer Intervention and Surveillance Modeling Network microsimulation models, informed by US incidence data, were used to evaluate a variety of screening methods, ages to start and stop, and intervals for 4 demographic subgroups (black and white males and females) under 2 scenarios for the projected lifetime CRC risk for 40-year-olds: 1) assuming that risk had remained stable since the early screening era and 2) assuming that risk had increased proportionally to observed incidence trends under the age of 40 years. Model-based screening recommendations were based on the predicted level of benefit (life-years gained) and burden (required number of colonoscopies), the incremental burden-to-benefit ratio, and the relative efficiency in comparison with strategies with similar burdens. When lifetime CRC risk was assumed to be stable over time, the models differed in the recommended age to start screening for whites (45 vs 50 years) but consistently recommended screening from the age of 45 years for blacks. When CRC risk was assumed to be increased, the models recommended starting at the age of 45 years, regardless of race and sex. Strategies recommended under both scenarios included colonoscopy every 10 or 15 years, annual fecal immunochemical testing, and computed tomographic colonography every 5 years through the age of 75 years. Microsimulation modeling suggests that CRC screening should be considered from the age of 45 years for blacks and for whites if the lifetime risk has increased proportionally to the incidence for younger adults. Cancer 2018. © 2018 The Authors. Cancer published by Wiley Periodicals, Inc. on behalf of American Cancer Society.
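
    The following sketch shows the kind of benefit-burden bookkeeping described above: strategies are reduced to an efficient frontier, and incremental burden-to-benefit ratios (colonoscopies per life-year gained) are computed between successive frontier strategies. The strategy names and numbers are invented for illustration, not model outputs.

```python
# Efficient-frontier and incremental burden-to-benefit calculation for a small
# set of hypothetical screening strategies (values are made up for illustration).

strategies = {
    # name: (life-years gained per 1000, colonoscopies per 1000)
    "FIT annual, 45-75":    (330.0, 1900.0),
    "COL every 15y, 50-75": (310.0, 3200.0),
    "COL every 10y, 50-75": (350.0, 4100.0),
    "COL every 10y, 45-75": (355.0, 4700.0),
}

# Sort by burden and keep only strategies that strictly increase the benefit.
ordered = sorted(strategies.items(), key=lambda kv: kv[1][1])
frontier, best_benefit = [], float("-inf")
for name, (benefit, burden) in ordered:
    if benefit > best_benefit:
        frontier.append((name, benefit, burden))
        best_benefit = benefit

print("Efficient strategies and incremental burden-to-benefit ratios:")
prev = None
for name, benefit, burden in frontier:
    if prev is None:
        print(f"  {name}: {benefit:.0f} LYG, {burden:.0f} colonoscopies")
    else:
        ratio = (burden - prev[2]) / (benefit - prev[1])
        print(f"  {name}: +{benefit - prev[1]:.0f} LYG at "
              f"{ratio:.0f} additional colonoscopies per LYG")
    prev = (name, benefit, burden)
```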

  16. Prediction of breast cancer risk using a machine learning approach embedded with a locality preserving projection algorithm.

    PubMed

    Heidari, Morteza; Khuzani, Abolfazl Zargari; Hollingsworth, Alan B; Danala, Gopichandh; Mirniaharikandehei, Seyedehnafiseh; Qiu, Yuchen; Liu, Hong; Zheng, Bin

    2018-01-30

    In order to automatically identify a set of effective mammographic image features and build an optimal breast cancer risk stratification model, this study aims to investigate advantages of applying a machine learning approach embedded with a locality preserving projection (LPP) based feature combination and regeneration algorithm to predict short-term breast cancer risk. A dataset involving negative mammograms acquired from 500 women was assembled. This dataset was divided into two age-matched classes of 250 high risk cases in which cancer was detected in the next subsequent mammography screening and 250 low risk cases, which remained negative. First, a computer-aided image processing scheme was applied to segment fibro-glandular tissue depicted on mammograms and initially compute 44 features related to the bilateral asymmetry of mammographic tissue density distribution between left and right breasts. Next, a multi-feature fusion based machine learning classifier was built to predict the risk of cancer detection in the next mammography screening. A leave-one-case-out (LOCO) cross-validation method was applied to train and test the machine learning classifier embedded with the LPP algorithm, which generated a new operational vector with 4 features using a maximal variance approach in each LOCO process. Results showed a 9.7% increase in risk prediction accuracy when using this LPP-embedded machine learning approach. An increased trend of adjusted odds ratios was also detected in which odds ratios increased from 1.0 to 11.2. This study demonstrated that applying the LPP algorithm effectively reduced feature dimensionality, and yielded higher and potentially more robust performance in predicting short-term breast cancer risk.
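
    A minimal sketch of the leave-one-case-out scheme is given below. Because a locality preserving projection is not available in scikit-learn, a maximal-variance projection (PCA) to four components stands in for the LPP step, and the data are synthetic; the study's actual features and fusion classifier are not reproduced.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression

# Leave-one-case-out (LOCO) cross-validation with a 44-feature -> 4-feature
# maximal-variance projection refit in every fold. Synthetic data; PCA and
# logistic regression are illustrative stand-ins for the study's LPP step and
# multi-feature fusion classifier.

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 44))            # 44 bilateral-asymmetry features (synthetic)
y = np.r_[np.ones(250), np.zeros(250)]    # 250 high-risk, 250 low-risk cases

correct = 0
for i in range(len(y)):
    train = np.delete(np.arange(len(y)), i)
    proj = PCA(n_components=4).fit(X[train])         # new 4-feature vector per LOCO fold
    clf = LogisticRegression(max_iter=1000).fit(proj.transform(X[train]), y[train])
    correct += clf.predict(proj.transform(X[i:i + 1]))[0] == y[i]

print("LOCO accuracy:", correct / len(y))
```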

  17. Prediction of breast cancer risk using a machine learning approach embedded with a locality preserving projection algorithm

    NASA Astrophysics Data System (ADS)

    Heidari, Morteza; Zargari Khuzani, Abolfazl; Hollingsworth, Alan B.; Danala, Gopichandh; Mirniaharikandehei, Seyedehnafiseh; Qiu, Yuchen; Liu, Hong; Zheng, Bin

    2018-02-01

    In order to automatically identify a set of effective mammographic image features and build an optimal breast cancer risk stratification model, this study aims to investigate advantages of applying a machine learning approach embedded with a locality preserving projection (LPP) based feature combination and regeneration algorithm to predict short-term breast cancer risk. A dataset involving negative mammograms acquired from 500 women was assembled. This dataset was divided into two age-matched classes of 250 high risk cases in which cancer was detected in the next subsequent mammography screening and 250 low risk cases, which remained negative. First, a computer-aided image processing scheme was applied to segment fibro-glandular tissue depicted on mammograms and initially compute 44 features related to the bilateral asymmetry of mammographic tissue density distribution between left and right breasts. Next, a multi-feature fusion based machine learning classifier was built to predict the risk of cancer detection in the next mammography screening. A leave-one-case-out (LOCO) cross-validation method was applied to train and test the machine learning classifier embedded with the LPP algorithm, which generated a new operational vector with 4 features using a maximal variance approach in each LOCO process. Results showed a 9.7% increase in risk prediction accuracy when using this LPP-embedded machine learning approach. An increased trend of adjusted odds ratios was also detected in which odds ratios increased from 1.0 to 11.2. This study demonstrated that applying the LPP algorithm effectively reduced feature dimensionality, and yielded higher and potentially more robust performance in predicting short-term breast cancer risk.

  18. Risk assessment of sleeping disorder breathing based on upper airway centerline evaluation

    NASA Astrophysics Data System (ADS)

    Alsufyani, Noura; Shen, Rui; Cheng, Irene; Major, Paul

    2013-02-01

    One of the most important breathing disorders in childhood is obstructive sleep apnea syndrome, which affects 2-3% of children, and the reported failure rate of surgical treatment was as high as 54%. A possible reason for these respiratory complications is reduced dimensions of the upper airway, which are further compressed when muscle tone decreases during sleep. In this study, we use cone-beam computed tomography (CBCT) to assess the location or cause of the airway obstruction. To date, all studies analyzing the upper airway in subjects with sleeping disorder breathing were based on linear, area, or volumetric measurements, which are global computations and can easily ignore local significance. Skeletonization was initially introduced as a 3D modeling technique by which representative medial points of a model are extracted to generate centerlines for evaluation. Although centerlines have been commonly used in guiding surgical procedures, our novelty lies in comparing their geometric properties before and after surgery. We apply 3D data refinement, registration, and projection steps to quantify and localize the geometric deviation in target airway regions. Through cross-validation with corresponding subjects' therapy data, we expect to quantify the tolerance threshold beyond which reduced dimensions of the upper airway are not clinically significant. The ultimate goal is to utilize this threshold to identify patients at risk of complications. Outcome from this research will also help establish a predictive model for training and to estimate treatment success based on airway measurements prior to intervention. Preliminary results demonstrate the feasibility of our approach.
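
    The sketch below illustrates the final deviation-quantification step on two synthetic centerlines that are assumed to be already registered into a common frame; the CBCT segmentation, refinement, and registration steps of the study are not reproduced.

```python
import numpy as np
from scipy.spatial import cKDTree

# Quantify geometric deviation between two airway centerlines (e.g., pre- and
# post-treatment) after registration. The synthetic centerlines below stand in
# for CBCT-derived ones.

t = np.linspace(0.0, 1.0, 200)
pre = np.column_stack([np.zeros_like(t), 2.0 * np.sin(np.pi * t), 80.0 * t])   # mm
post = pre + np.column_stack([0.5 * t, 1.0 * t ** 2, np.zeros_like(t)])        # deviated copy

# For every point on the post-treatment centerline, find the distance to the
# closest pre-treatment point; summarize as mean and maximum deviation.
tree = cKDTree(pre)
dist, _ = tree.query(post)
print(f"mean deviation {dist.mean():.2f} mm, max deviation {dist.max():.2f} mm")
```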

  19. Security Investment in Contagious Networks.

    PubMed

    Hasheminasab, Seyed Alireza; Tork Ladani, Behrouz

    2018-01-16

    Security of systems is normally interdependent, in such a way that the security risks of one part affect other parts and threats spread through the vulnerable links in the network. The risks of the systems can therefore be mitigated through investments in the security of the interconnecting links. This article takes an innovative look at the problem of security investment of nodes on their vulnerable links in a given contagious network as a game-theoretic model that can be applied to a variety of applications, including information systems. In the proposed game model, each node computes its corresponding risk based on the value of its assets, vulnerabilities, and threats to determine the optimum level of security investments on its external links, respecting its limited budget. Furthermore, direct and indirect nonlinear influences of a node's security investment on the risks of other nodes are considered. The existence and uniqueness of the Nash equilibrium in the proposed game are also proved. Further analysis of the model in a practical case revealed that, by taking advantage of the investment effects of other players, perfectly rational players (i.e., those who use the utility function of the proposed game model) make more cost-effective decisions than selfish nonrational or semirational players. © 2018 Society for Risk Analysis.
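
    A much-simplified best-response sketch of this kind of interdependent security-investment game is given below. The exponential risk-attenuation form, the spillover weight, and all parameter values are assumptions for illustration; the paper's actual utility function and equilibrium analysis are not reproduced.

```python
import numpy as np

# Best-response dynamics for a toy interdependent security-investment game:
# each node's expected loss (asset value x threat) is attenuated exponentially
# by its own investment and, more weakly, by its neighbours' investments.

adjacency = np.array([[0, 1, 1],
                      [1, 0, 1],
                      [1, 1, 0]], dtype=float)
assets = np.array([10.0, 6.0, 8.0])     # value at risk per node (illustrative)
threat = np.array([0.3, 0.2, 0.4])      # baseline compromise probability
budget = 2.0                            # maximum investment per node
grid = np.linspace(0.0, budget, 41)     # candidate investment levels


def node_cost(i, own, invest):
    """Expected loss plus investment cost for node i, given others' investments."""
    spillover = adjacency[i] @ invest            # indirect protection from neighbours
    risk = assets[i] * threat[i] * np.exp(-(own + 0.5 * spillover))
    return risk + own


invest = np.zeros(3)
for _ in range(50):                              # iterate best responses to (near) fixed point
    new = invest.copy()
    for i in range(3):
        costs = [node_cost(i, x, invest) for x in grid]
        new[i] = grid[int(np.argmin(costs))]
    if np.allclose(new, invest, atol=1e-6):
        break
    invest = new

print("Approximate equilibrium investments:", invest.round(2))
```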

  20. Coupling of computer modeling with in vitro methodologies to reduce animal usage in toxicity testing.

    PubMed

    Clewell, H J

    1993-05-01

    The use of in vitro data to support the development of physiologically based pharmacokinetic (PBPK) models and to reduce the requirement for in vivo testing is demonstrated by three examples. In the first example, polychlorotrifluoroethylene, in vitro studies comparing metabolism and tissue response in rodents and primates made it possible to obtain definitive data for a human risk assessment without resorting to additional in vivo studies with primates. In the second example, a PBPK model for organophosphate esters was developed in which the parameters defining metabolism, tissue partitioning, and enzyme inhibition were all characterized by in vitro studies, and the rest of the model parameters were established from the literature. The resulting model was able to provide a coherent description of enzyme inhibition following both acute and chronic exposures in mice, rats, and humans. In the final example, the carcinogenic risk assessment for methylene chloride was refined by the incorporation of in vitro data on human metabolism into a PBPK model.
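
    The sketch below shows the general shape of such a PBPK model: a flow-limited system of mass-balance ODEs in which in vitro-derived metabolism parameters (Vmax, Km) and tissue:blood partition coefficients appear directly. All parameter values are illustrative and do not correspond to the chemicals discussed above.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Minimal flow-limited PBPK sketch with two tissue compartments (liver and a
# lumped "rest of body"). Parameter values are illustrative assumptions.

Q_liv, Q_rest = 90.0, 260.0              # blood flows (L/h)
V_bld, V_liv, V_rest = 5.0, 1.8, 60.0    # compartment volumes (L)
P_liv, P_rest = 3.0, 1.5                 # tissue:blood partition coefficients
Vmax, Km = 20.0, 0.5                     # in vitro-derived metabolism (mg/h, mg/L)


def pbpk(t, y):
    c_bld, a_liv, a_rest = y                     # blood conc. (mg/L), tissue amounts (mg)
    c_liv, c_rest = a_liv / V_liv, a_rest / V_rest
    # Venous return from each tissue at its partition-corrected concentration.
    dc_bld = (Q_liv * (c_liv / P_liv - c_bld) +
              Q_rest * (c_rest / P_rest - c_bld)) / V_bld
    metab = Vmax * (c_liv / P_liv) / (Km + c_liv / P_liv)   # saturable hepatic clearance
    da_liv = Q_liv * (c_bld - c_liv / P_liv) - metab
    da_rest = Q_rest * (c_bld - c_rest / P_rest)
    return [dc_bld, da_liv, da_rest]


# 10 mg intravenous bolus, simulated over 24 hours.
sol = solve_ivp(pbpk, (0.0, 24.0), [10.0 / V_bld, 0.0, 0.0],
                t_eval=np.linspace(0, 24, 7))
print("Blood concentration (mg/L):", sol.y[0].round(3))
```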

  1. Extreme value modelling of Ghana stock exchange index.

    PubMed

    Nortey, Ezekiel N N; Asare, Kwabena; Mettle, Felix Okoe

    2015-01-01

    Modelling of extreme events has always been of interest in fields such as hydrology and meteorology. However, after the recent global financial crises, appropriate models for modelling of such rare events leading to these crises have become quite essential in the finance and risk management fields. This paper models the extreme values of the Ghana stock exchange all-shares index (2000-2010) by applying the extreme value theory (EVT) to fit a model to the tails of the daily stock returns data. A conditional approach of the EVT was preferred and hence an ARMA-GARCH model was fitted to the data to correct for the effects of autocorrelation and conditional heteroscedastic terms present in the returns series, before the EVT method was applied. The Peak Over Threshold approach of the EVT, which fits a Generalized Pareto Distribution (GPD) model to excesses above a certain selected threshold, was employed. Maximum likelihood estimates of the model parameters were obtained and the model's goodness of fit was assessed graphically using Q-Q, P-P and density plots. The findings indicate that the GPD provides an adequate fit to the data of excesses. The sizes of the extreme daily Ghanaian stock market movements were then computed using the value at risk and expected shortfall risk measures at some high quantiles, based on the fitted GPD model.
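
    A minimal peaks-over-threshold sketch is shown below: a GPD is fitted to exceedances above a high threshold and used to compute value at risk and expected shortfall at the 99% quantile. Simulated heavy-tailed losses stand in for the ARMA-GARCH-filtered index returns used in the paper.

```python
import numpy as np
from scipy.stats import genpareto

# Peaks-over-threshold: fit a GPD to losses above a threshold u, then compute
# VaR and ES at a high quantile q using the standard tail formulas.

rng = np.random.default_rng(1)
losses = rng.standard_t(df=4, size=2500)          # heavy-tailed daily losses (synthetic)

u = np.quantile(losses, 0.95)                     # threshold at the 95th percentile
exceedances = losses[losses > u] - u
xi, _, beta = genpareto.fit(exceedances, floc=0)  # GPD shape and scale
zeta_u = exceedances.size / losses.size           # fraction of losses above u

q = 0.99                                          # target quantile
var_q = u + beta / xi * (((1 - q) / zeta_u) ** (-xi) - 1)
es_q = var_q / (1 - xi) + (beta - xi * u) / (1 - xi)
print(f"VaR(99%) = {var_q:.3f}, ES(99%) = {es_q:.3f}")
```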

  2. Computer simulation of stair falls to investigate scenarios in child abuse.

    PubMed

    Bertocci, G E; Pierce, M C; Deemer, E; Aguel, F

    2001-09-01

    To demonstrate the usefulness of computer simulation techniques in the investigation of pediatric stair falls. Since stair falls are a common falsely reported injury scenario in child abuse, our specific aim was to investigate the influence of stair characteristics on injury biomechanics of pediatric stair falls by using a computer simulation model. Our long-term goal is to use knowledge of biomechanics to aid in distinguishing between accidents and abuse. A computer simulation model of a 3-year-old child falling down stairs was developed using commercially available simulation software. This model was used to investigate the influence that stair characteristics have on biomechanical measures associated with injury risk. Since femur fractures occur in unintentional and abuse scenarios, biomechanical measures were focused on the lower extremities. The number and slope of steps and stair surface friction and elasticity were found to affect biomechanical measures associated with injury risk. Computer simulation techniques are useful for investigating the biomechanics of stair falls. Using our simulation model, we determined that stair characteristics have an effect on potential for lower extremity injuries. Although absolute values of biomechanical measures should not be relied on in an unvalidated model such as this, relationships between accident-environment factors and biomechanical measures can be studied through simulation. Future efforts will focus on model validation.

  3. An innovative computationally efficient hydromechanical coupling approach for fault reactivation in geological subsurface utilization

    NASA Astrophysics Data System (ADS)

    Adams, M.; Kempka, T.; Chabab, E.; Ziegler, M.

    2018-02-01

    Estimating the efficiency and sustainability of geological subsurface utilization, i.e., carbon capture and storage (CCS), requires an integrated risk assessment approach that considers the coupled processes that occur, among others the potential reactivation of existing faults. In this context, hydraulic and mechanical parameter uncertainties as well as different injection rates have to be considered and quantified to elaborate reliable environmental impact assessments. Consequently, the required sensitivity analyses consume significant computational time due to the high number of realizations that have to be carried out. Because of the high computational costs of two-way coupled simulations in large-scale 3D multiphase fluid flow systems, these are not applicable for the purpose of uncertainty and risk assessments. Hence, an innovative semi-analytical hydromechanical coupling approach for hydraulic fault reactivation will be introduced. This approach determines the void ratio evolution in representative fault elements using one preliminary base simulation, considering one model geometry and one set of hydromechanical parameters. The void ratio development is then approximated and related to one reference pressure at the base of the fault. The parametrization of the resulting functions is then directly implemented into a multiphase fluid flow simulator to carry out the semi-analytical coupling for the simulation of hydromechanical processes. Hereby, the iterative parameter exchange between the multiphase and mechanical simulators is omitted, since the update of porosity and permeability is controlled by one reference pore pressure at the fault base. The suggested procedure is capable of reducing the computational time required by coupled hydromechanical simulations of a multitude of injection rates by a factor of up to 15.
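
    The sketch below illustrates the core idea: the void ratio of a representative fault element is expressed as a fitted function of a single reference pressure at the fault base, from which porosity and permeability are updated inside the flow simulator. The tabulated base-run values and the Kozeny-Carman-type permeability scaling are assumptions for illustration.

```python
import numpy as np

# Semi-analytical coupling sketch: fit void ratio vs. reference pressure from one
# (hypothetical) fully coupled base run, then reuse that function to update the
# fault element's porosity and permeability in every flow-only realization.

# (reference pressure at fault base [MPa], void ratio) pairs from a hypothetical base run
p_ref = np.array([10.0, 12.0, 14.0, 16.0, 18.0])
void_ratio = np.array([0.150, 0.154, 0.160, 0.169, 0.181])

coeffs = np.polyfit(p_ref, void_ratio, deg=2)     # fitted once, reused in every realization


def update_porosity_permeability(p, k0=1e-16, e0=0.150):
    """Porosity and permeability of the fault element at reference pressure p [MPa]."""
    e = np.polyval(coeffs, p)
    phi = e / (1.0 + e)
    k = k0 * ((e / e0) ** 3) * ((1.0 + e0) / (1.0 + e))   # Kozeny-Carman-type scaling
    return phi, k


for p in (11.0, 15.0, 17.5):
    phi, k = update_porosity_permeability(p)
    print(f"p_ref = {p:4.1f} MPa -> porosity {phi:.4f}, permeability {k:.2e} m^2")
```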

  4. Computational hemodynamics of an implanted coronary stent based on three-dimensional cine angiography reconstruction.

    PubMed

    Chen, Mounter C Y; Lu, Po-Chien; Chen, James S Y; Hwang, Ned H C

    2005-01-01

    Coronary stents are supportive wire meshes that keep narrow coronary arteries patent, reducing the risk of restenosis. Despite the common use of coronary stents, approximately 20-35% of them fail due to restenosis. Flow phenomena adjacent to the stent may contribute to restenosis. Three-dimensional computational fluid dynamics (CFD) and reconstruction based on biplane cine angiography were used to assess coronary geometry and volumetric blood flows. A patient-specific left anterior descending (LAD) artery was reconstructed from single-plane x-ray imaging. With corresponding electrocardiographic signals, images from the same time phase were selected from the angiograms for dynamic three-dimensional reconstruction. The resultant three-dimensional LAD artery at end-diastole was adopted for detailed analysis. Both the geometries and flow fields, based on a computational model from CAE software (ANSYS and CATIA) and full three-dimensional Navier-Stokes equations in the CFD-ACE+ software, respectively, changed dramatically after stent placement. Flow fields showed a complex three-dimensional spiral motion due to arterial tortuosity. The corresponding wall shear stresses, pressure gradient, and flow field all varied significantly after stent placement. Combined angiography and CFD techniques allow more detailed investigation of flow patterns in various segments. The implanted stent(s) may be quantitatively studied from the proposed hemodynamic modeling approach.

  5. Brain injury tolerance limit based on computation of axonal strain.

    PubMed

    Sahoo, Debasis; Deck, Caroline; Willinger, Rémy

    2016-07-01

    Traumatic brain injury (TBI) has been the leading cause of death and permanent impairment over the last decades. In both severe and mild TBI, diffuse axonal injury (DAI) is the most common pathology and leads to axonal degeneration. Computation of axonal strain using a finite element head model in numerical simulation can elucidate the DAI mechanism and help to establish advanced head injury criteria. The main objective of this study is to develop a brain injury criterion based on computation of axonal strain. To achieve this objective, a state-of-the-art finite element head model with enhanced brain and skull material laws was used for numerical computation of real-world head trauma. New medical imaging data, such as fractional anisotropy and axonal fiber orientation from Diffusion Tensor Imaging (DTI) of 12 healthy patients, were implemented into the finite element brain model to improve the brain constitutive material law with a more efficient heterogeneous, anisotropic visco-hyperelastic material law. The brain behavior has been validated in terms of brain deformation against Hardy et al. (2001) and Hardy et al. (2007), and in terms of brain pressure against the Nahum et al. (1977) and Trosseille et al. (1992) experiments. Verification of model stability has been conducted as well. Further, 109 well-documented TBI cases were simulated and the axonal strain computed to derive a brain injury tolerance curve. Based on an in-depth statistical analysis of different intra-cerebral parameters (brain axonal strain rate, axonal strain, first principal strain, Von Mises strain, first principal stress, Von Mises stress, CSDM (0.10), CSDM (0.15) and CSDM (0.25)), it was shown that axonal strain was the most appropriate candidate parameter to predict DAI. The proposed brain injury tolerance limit for a 50% risk of DAI has been established at 14.65% axonal strain. This study provides a key step toward a realistic novel injury metric for DAI. Copyright © 2016 Elsevier Ltd. All rights reserved.
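
    To illustrate how such a tolerance limit can be used, the sketch below evaluates an injury-risk curve anchored at the reported 50% DAI risk at 14.65% axonal strain. The logistic form and its steepness parameter are assumptions for illustration, not results of the study's statistical analysis.

```python
import numpy as np

# Injury-risk curve anchored at the reported 50% DAI tolerance of 14.65% axonal
# strain. The logistic shape and the steepness K are assumed for illustration.

STRAIN_50 = 0.1465      # axonal strain at 50% risk of DAI (from the abstract)
K = 40.0                # assumed steepness of the risk curve


def dai_risk(axonal_strain):
    """Probability of diffuse axonal injury for a given peak axonal strain."""
    return 1.0 / (1.0 + np.exp(-K * (axonal_strain - STRAIN_50)))


for strain in (0.05, 0.10, 0.1465, 0.20):
    print(f"axonal strain {strain:.4f} -> DAI risk {dai_risk(strain):.2f}")
```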

  6. ADVANCED COMPUTATIONAL METHODS IN DOSE MODELING

    EPA Science Inventory

    The overall goal of the EPA-ORD NERL research program on Computational Toxicology (CompTox) is to provide the Agency with the tools of modern chemistry, biology, and computing to improve quantitative risk assessments and reduce uncertainties in the source-to-adverse outcome conti...

  7. Providing a parallel and distributed capability for JMASS using SPEEDES

    NASA Astrophysics Data System (ADS)

    Valinski, Maria; Driscoll, Jonathan; McGraw, Robert M.; Meyer, Bob

    2002-07-01

    The Joint Modeling And Simulation System (JMASS) is a Tri-Service simulation environment that supports engineering and engagement-level simulations. As JMASS is expanded to support other Tri-Service domains, the current set of modeling services must be expanded for High Performance Computing (HPC) applications by adding support for advanced time-management algorithms, parallel and distributed topologies, and high speed communications. By providing support for these services, JMASS can better address modeling domains requiring parallel computationally intense calculations, such as clutter, vulnerability and lethality calculations, and underwater-based scenarios. A risk reduction effort implementing some HPC services for JMASS using the SPEEDES (Synchronous Parallel Environment for Emulation and Discrete Event Simulation) Simulation Framework has recently concluded. As an artifact of the JMASS-SPEEDES integration, not only can HPC functionality be brought to the JMASS program through SPEEDES, but an additional HLA-based capability can be demonstrated that further addresses interoperability issues. The JMASS-SPEEDES integration provided a means of adding HLA capability to preexisting JMASS scenarios through an implementation of the standard JMASS port communication mechanism that allows players to communicate.

  8. Quantitative Microbial Risk Assessment Tutorial: Installation of Software for Watershed Modeling in Support of QMRA

    EPA Science Inventory

    This tutorial provides instructions for accessing, retrieving, and downloading the following software to install on a host computer in support of Quantitative Microbial Risk Assessment (QMRA) modeling: • SDMProjectBuilder (which includes the Microbial Source Module as part...

  9. Chemotherapy appointment scheduling under uncertainty using mean-risk stochastic integer programming.

    PubMed

    Alvarado, Michelle; Ntaimo, Lewis

    2018-03-01

    Oncology clinics are often burdened with scheduling large volumes of cancer patients for chemotherapy treatments under limited resources such as the number of nurses and chairs. These cancer patients require a series of appointments over several weeks or months and the timing of these appointments is critical to the treatment's effectiveness. Additionally, the appointment duration, the acuity levels of each appointment, and the availability of clinic nurses are uncertain. The timing constraints, stochastic parameters, rising treatment costs, and increased demand of outpatient oncology clinic services motivate the need for efficient appointment schedules and clinic operations. In this paper, we develop three mean-risk stochastic integer programming (SIP) models, referred to as SIP-CHEMO, for the problem of scheduling individual chemotherapy patient appointments and resources. These mean-risk models are presented and an algorithm is devised to improve computational speed. Computational results were conducted using a simulation model and results indicate that the risk-averse SIP-CHEMO model with the expected excess mean-risk measure can decrease patient waiting times and nurse overtime when compared to deterministic scheduling algorithms by 42 % and 27 %, respectively.
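
    The sketch below evaluates the expected-excess mean-risk measure, E[cost] + λ·E[(cost − target)+], on synthetic scenario costs for two hypothetical schedules; the actual SIP-CHEMO optimization of appointment times and nurse assignments is not reproduced.

```python
import numpy as np

# Expected-excess mean-risk measure on scenario costs: E[cost] + lam * E[(cost - target)+].
# Scenario costs are synthetic; this only illustrates how the risk measure ranks schedules.

rng = np.random.default_rng(2)


def mean_risk(scenario_costs, target, lam):
    """Expected cost plus a penalty on the expected excess above a target."""
    costs = np.asarray(scenario_costs, dtype=float)
    excess = np.clip(costs - target, 0.0, None)
    return costs.mean() + lam * excess.mean()


# Hypothetical waiting-time costs (minutes) of two candidate schedules over 1000 scenarios.
schedule_a = rng.gamma(shape=4.0, scale=10.0, size=1000)   # lower mean, longer tail
schedule_b = rng.normal(loc=45.0, scale=5.0, size=1000)    # higher mean, little tail risk

for name, costs in [("A", schedule_a), ("B", schedule_b)]:
    print(name, "risk-neutral:", round(costs.mean(), 1),
          " mean-risk:", round(mean_risk(costs, target=60.0, lam=2.0), 1))
```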

  10. pkCSM: Predicting Small-Molecule Pharmacokinetic and Toxicity Properties Using Graph-Based Signatures

    PubMed Central

    2015-01-01

    Drug development has a high attrition rate, with poor pharmacokinetic and safety properties a significant hurdle. Computational approaches may help minimize these risks. We have developed a novel approach (pkCSM) which uses graph-based signatures to develop predictive models of central ADMET properties for drug development. pkCSM performs as well or better than current methods. A freely accessible web server (http://structure.bioc.cam.ac.uk/pkcsm), which retains no information submitted to it, provides an integrated platform to rapidly evaluate pharmacokinetic and toxicity properties. PMID:25860834

  11. Biological Based Risk Assessment for Space Exploration

    NASA Technical Reports Server (NTRS)

    Cucinotta, Francis A.

    2011-01-01

    Exposures from galactic cosmic rays (GCR), made up of high-energy protons and high-energy and charge (HZE) nuclei, and solar particle events (SPEs), comprised largely of low- to medium-energy protons, are the primary health concern for astronauts on long-term space missions. Experimental studies have shown that HZE nuclei produce both qualitative and quantitative differences in biological effects compared to terrestrial radiation, making risk assessments for cancer and degenerative risks, such as central nervous system effects and heart disease, highly uncertain. The goal for space radiation protection at NASA is to reduce the uncertainties in risk assessments for Mars exploration to be small enough to ensure that acceptable levels of risk are not exceeded and to adequately assess the efficacy of mitigation measures such as shielding or biological countermeasures. We review the recent BEIR VII and UNSCEAR-2006 models of cancer risks and their uncertainties. These models are shown to have an inherent 2-fold uncertainty, as defined by the ratio of the 95% confidence level to the mean projection, even before radiation quality is considered. In order to overcome the uncertainties in these models, new approaches to risk assessment are warranted. We consider new computational biology approaches to modeling cancer risks. A basic program of research is described that ranges from stochastic descriptions of the physics and chemistry of radiation tracks and the biochemistry of metabolic pathways to the emerging biological understanding of cellular and tissue modifications leading to cancer.

  12. Development and validation of risk models to select ever-smokers for CT lung-cancer screening

    PubMed Central

    Katki, Hormuzd A.; Kovalchik, Stephanie A.; Berg, Christine D.; Cheung, Li C.; Chaturvedi, Anil K.

    2016-01-01

    Importance The US Preventive Services Task Force (USPSTF) recommends computed-tomography (CT) lung-cancer screening for ever-smokers ages 55-80 years who smoked at least 30 pack-years with no more than 15 years since quitting. However, selecting ever-smokers for screening using individualized lung-cancer risk calculations may be more effective and efficient than current USPSTF recommendations. Objective Comparison of modeled outcomes from risk-based CT lung-screening strategies versus USPSTF recommendations. Design/Setting/Participants Empirical risk models for lung-cancer incidence and death in the absence of CT screening using data on ever-smokers from the Prostate, Lung, Colorectal and Ovarian Cancer Screening Trial (PLCO; 1993-2009) control group. Covariates included age, education, sex, race, smoking intensity/duration/quit-years, Body Mass Index, family history of lung-cancer, and self-reported emphysema. Model validation in the chest radiography groups of the PLCO and the National Lung Screening Trial (NLST; 2002-2009), with additional validation of the death model in the National Health Interview Survey (NHIS; 1997-2001), a representative sample of the US. Models applied to US ever-smokers ages 50-80 (NHIS 2010-2012) to estimate outcomes of risk-based selection for CT lung-screening, assuming screening for all ever-smokers yields the percent changes in lung-cancer detection and death observed in the NLST. Exposure Annual CT lung-screening for 3 years. Main Outcomes and Measures Model validity: calibration (number of model-predicted cases divided by number of observed cases (Estimated/Observed)) and discrimination (Area-Under-Curve (AUC)). Modeled screening outcomes: estimated number of screen-avertable lung-cancer deaths, estimated screening effectiveness (number needed to screen (NNS) to prevent 1 lung-cancer death). Results Lung-cancer incidence and death risk models were well-calibrated in PLCO and NLST. The lung-cancer death model calibrated and discriminated well for US ever-smokers ages 50-80 (NHIS 1997-2001: Estimated/Observed=0.94, 95%CI=0.84-1.05; AUC=0.78, 95%CI=0.76-0.80). Under USPSTF recommendations, the models estimated 9.0 million US ever-smokers would qualify for lung-cancer screening and 46,488 (95%CI=43,924-49,053) lung-cancer deaths were estimated as screen-avertable over 5 years (estimated NNS=194, 95%CI=187-201). In contrast, risk-based selection screening the same number of ever-smokers (9.0 million) at highest 5-year lung-cancer risk (≥1.9%), was estimated to avert 20% more deaths (55,717; 95%CI=53,033-58,400) and was estimated to reduce the estimated NNS by 17% (NNS=162, 95%CI=157-166). Conclusions and Relevance Among a cohort of US ever-smokers age 50-80 years, application of a risk-based model for CT screening for lung cancer compared with a model based on USPSTF recommendations was estimated to be associated with a greater number of lung-cancer deaths prevented over 5 years along with a lower NNS to prevent 1 lung-cancer death. PMID:27179989
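
    The two headline validation and efficiency quantities above (the estimated/observed calibration ratio and the number needed to screen) reduce to simple calculations, sketched below. The NNS inputs come from the abstract; the calibration example uses synthetic data.

```python
import numpy as np

# Calibration (estimated/observed) and number needed to screen (NNS).
# NNS inputs are the figures quoted in the abstract; the calibration check below
# uses synthetic predictions and outcomes purely for illustration.


def estimated_over_observed(predicted_risks, observed_events):
    """Calibration: sum of predicted probabilities over the observed case count."""
    return np.sum(predicted_risks) / np.sum(observed_events)


def number_needed_to_screen(n_screened, deaths_averted):
    """Screening efficiency: people screened per lung-cancer death prevented."""
    return n_screened / deaths_averted


# From the abstract: 9.0 million ever-smokers; 46,488 vs 55,717 averted deaths.
print("NNS (USPSTF criteria):   ", round(number_needed_to_screen(9.0e6, 46488)))
print("NNS (risk-based, >=1.9%):", round(number_needed_to_screen(9.0e6, 55717)))

# Toy calibration check on synthetic data.
rng = np.random.default_rng(3)
risk = rng.uniform(0.0, 0.05, size=10000)
events = rng.random(10000) < risk
print("Estimated/Observed:", round(estimated_over_observed(risk, events), 2))
```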

  13. Development and Validation of Risk Models to Select Ever-Smokers for CT Lung Cancer Screening.

    PubMed

    Katki, Hormuzd A; Kovalchik, Stephanie A; Berg, Christine D; Cheung, Li C; Chaturvedi, Anil K

    2016-06-07

    The US Preventive Services Task Force (USPSTF) recommends computed tomography (CT) lung cancer screening for ever-smokers aged 55 to 80 years who have smoked at least 30 pack-years with no more than 15 years since quitting. However, selecting ever-smokers for screening using individualized lung cancer risk calculations may be more effective and efficient than current USPSTF recommendations. Comparison of modeled outcomes from risk-based CT lung-screening strategies vs USPSTF recommendations. Empirical risk models for lung cancer incidence and death in the absence of CT screening using data on ever-smokers from the Prostate, Lung, Colorectal, and Ovarian Cancer Screening Trial (PLCO; 1993-2009) control group. Covariates included age; education; sex; race; smoking intensity, duration, and quit-years; body mass index; family history of lung cancer; and self-reported emphysema. Model validation in the chest radiography groups of the PLCO and the National Lung Screening Trial (NLST; 2002-2009), with additional validation of the death model in the National Health Interview Survey (NHIS; 1997-2001), a representative sample of the United States. Models were applied to US ever-smokers aged 50 to 80 years (NHIS 2010-2012) to estimate outcomes of risk-based selection for CT lung screening, assuming screening for all ever-smokers yields the percent changes in lung cancer detection and death observed in the NLST. Annual CT lung screening for 3 years beginning at age 50 years. For model validity: calibration (number of model-predicted cases divided by number of observed cases [estimated/observed]) and discrimination (area under curve [AUC]). For modeled screening outcomes: estimated number of screen-avertable lung cancer deaths and estimated screening effectiveness (number needed to screen [NNS] to prevent 1 lung cancer death). Lung cancer incidence and death risk models were well calibrated in PLCO and NLST. The lung cancer death model calibrated and discriminated well for US ever-smokers aged 50 to 80 years (NHIS 1997-2001: estimated/observed = 0.94 [95%CI, 0.84-1.05]; AUC, 0.78 [95%CI, 0.76-0.80]). Under USPSTF recommendations, the models estimated 9.0 million US ever-smokers would qualify for lung cancer screening and 46,488 (95% CI, 43,924-49,053) lung cancer deaths were estimated as screen-avertable over 5 years (estimated NNS, 194 [95% CI, 187-201]). In contrast, risk-based selection screening of the same number of ever-smokers (9.0 million) at highest 5-year lung cancer risk (≥1.9%) was estimated to avert 20% more deaths (55,717 [95% CI, 53,033-58,400]) and was estimated to reduce the estimated NNS by 17% (NNS, 162 [95% CI, 157-166]). Among a cohort of US ever-smokers aged 50 to 80 years, application of a risk-based model for CT screening for lung cancer compared with a model based on USPSTF recommendations was estimated to be associated with a greater number of lung cancer deaths prevented over 5 years, along with a lower NNS to prevent 1 lung cancer death.

  14. Perception of young adults with sickle cell disease or sickle cell trait about participation in the CHOICES randomized controlled trial.

    PubMed

    Hershberger, Patricia E; Gallo, Agatha M; Molokie, Robert; Thompson, Alexis A; Suarez, Marie L; Yao, Yingwei; Wilkie, Diana J

    2016-06-01

    To gain an in-depth understanding of the perceptions of young adults with sickle cell disease and sickle cell trait about parenthood and participating in the CHOICES randomized controlled trial that used computer-based, educational programmes. In the USA, there is insufficient education to assure that all young adults with sickle cell disease or sickle cell trait understand genetic inheritance risks and reproductive options to make informed reproductive decisions. To address this educational need, we developed a computer-based, multimedia program (CHOICES) and reformatted usual care into a computer-based (e-Book) program. We then conducted a two-year randomized controlled trial that included a qualitative component that would deepen understanding of young adults' perceptions of parenthood and use of computer-based, educational programmes. A qualitative descriptive approach completed after a randomized controlled trial. Sixty-eight men and women of childbearing age participated in semi-structured interviews at the completion of the randomized controlled trial from 2012-2013. Thematic content analysis guided the qualitative description. Three main themes were identified: (1) increasing knowledge and new ways of thinking and behaving; (2) rethinking parenting plans; and (3) appraising the program design and delivery. Most participants reported increased knowledge and rethinking of their parenting plans and were supportive of computer-based learning. Some participants expressed difficulty in determining individual transmission risks. Participants perceived the computer programs as beneficial to their learning. Future development of an Internet-based educational programme is warranted, with emphasis on providing tailored education or memory boosters about individual transmission risks. © 2015 John Wiley & Sons Ltd.

  15. Chagas disease risk in Texas.

    PubMed

    Sarkar, Sahotra; Strutz, Stavana E; Frank, David M; Rivaldi, Chissa-Louise; Sissel, Blake; Sánchez-Cordero, Victor

    2010-10-05

    Chagas disease, caused by Trypanosoma cruzi, remains a serious public health concern in many areas of Latin America, including México. It is also endemic in Texas with an autochthonous canine cycle, abundant vectors (Triatoma species) in many counties, and established domestic and peridomestic cycles which make competent reservoirs available throughout the state. Yet, Chagas disease is not reportable in Texas, blood donor screening is not mandatory, and the serological profiles of human and canine populations remain unknown. The purpose of this analysis was to provide a formal risk assessment, including risk maps, which recommends the removal of these lacunae. The spatial relative risk of the establishment of autochthonous Chagas disease cycles in Texas was assessed using a five-stage analysis. 1. Ecological risk for Chagas disease was established at a fine spatial resolution using a maximum entropy algorithm that takes as input occurrence points of vectors and environmental layers. The analysis was restricted to triatomine vector species for which new data were generated through field collection and through collation of post-1960 museum records in both México and the United States with sufficiently low georeferenced error to be admissible given the spatial resolution of the analysis (1 arc-minute). The new data extended the distribution of vector species to 10 new Texas counties. The models predicted that Triatoma gerstaeckeri has a large region of contiguous suitable habitat in the southern United States and México, T. lecticularia has a diffuse suitable habitat distribution along both coasts of the same region, and T. sanguisuga has a disjoint suitable habitat distribution along the coasts of the United States. The ecological risk is highest in south Texas. 2. Incidence-based relative risk was computed at the county level using the Bayesian Besag-York-Mollié model and post-1960 T. cruzi incidence data. This risk is concentrated in south Texas. 3. The ecological and incidence-based risks were analyzed together in a multi-criteria dominance analysis of all counties and those counties in which there were as yet no reports of parasite incidence. Both analyses picked out counties in south Texas as those at highest risk. 4. As an alternative to the multi-criteria analysis, the ecological and incidence-based risks were compounded in a multiplicative composite risk model. Counties in south Texas emerged as those with the highest risk. 5. Risk as the relative expected exposure rate was computed using a multiplicative model for the composite risk and a scaled population county map for Texas. Counties with highest risk were those in south Texas and a few counties with high human populations in north, east, and central Texas showing that, though Chagas disease risk is concentrated in south Texas, it is not restricted to it. For all of Texas, Chagas disease should be designated as reportable, as it is in Arizona and Massachusetts. At least for south Texas, blood donor screening should be mandatory, and the serological profiles of human and canine populations should be established. It is also recommended that a joint initiative be undertaken by the United States and México to combat Chagas disease in the trans-border region. The methodology developed for this analysis can be easily exported to other geographical and disease contexts in which risk assessment is of potential value.
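
    The composite-risk step (stages 4 and 5) amounts to a multiplicative combination of the ecological and incidence-based risks, scaled by population, as sketched below with invented county values.

```python
import numpy as np

# Multiplicative composite risk: ecological suitability x incidence-based relative
# risk, then scaled by population to give a relative expected exposure rate.
# All county values below are made up for illustration.

counties = ["Hidalgo", "Webb", "Travis", "Harris"]
ecological_risk = np.array([0.9, 0.8, 0.4, 0.3])   # habitat-suitability based (0-1)
incidence_risk = np.array([0.8, 0.7, 0.2, 0.3])    # incidence-based relative risk, rescaled
population = np.array([0.87, 0.28, 1.29, 4.71])    # millions

composite = ecological_risk * incidence_risk
exposure = composite * population / population.sum()   # scaled-population weighting

for name, c, e in zip(counties, composite, exposure):
    print(f"{name:8s} composite risk {c:.2f}  relative expected exposure {e:.3f}")
```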

  16. The influence of the free space environment on the superlight-weight thermal protection system: conception, methods, and risk analysis

    NASA Astrophysics Data System (ADS)

    Yatsenko, Vitaliy; Falchenko, Iurii; Fedorchuk, Viktor; Petrushynets, Lidiia

    2016-07-01

    This report focuses on the results of the EU project "Superlight-weight thermal protection system for space application (LIGHT-TPS)". The bottom line is an analysis of the influence of the free space environment on the superlight-weight thermal protection system (TPS). The report focuses on new methods that are based on the following models: synergetic, physical, and computational. It concentrates on four approaches. The first concerns the synergetic approach. The synergetic approach to the solution of problems of self-controlled synthesis of structures and the creation of self-organizing technologies is considered in connection with the super-problem of creating materials with new functional properties. Synergetics methods and mathematical design are considered in relation to actual problems of materials science. The second approach describes how optimization methods can be used to determine material microstructures with optimized or targeted properties. This technique enables one to find unexpected microstructures with exotic behavior (e.g., negative thermal expansion coefficients). The third approach concerns the dynamic probabilistic risk analysis of TPS elements with complex characterizations of damage, using a physical model of the TPS system and a predictable level of ionizing radiation and space weather. Focus is given mainly to the TPS model, mathematical models for dynamic probabilistic risk assessment, and software for the modeling and prediction of the influence of the free space environment. The probabilistic risk assessment method for the TPS is presented considering some deterministic and stochastic factors. The last approach concerns experimental research on the temperature distribution over the surface of a honeycomb sandwich panel of size 150 x 150 x 20 mm during diffusion welding in vacuum. Equipment that provides alignment of temperature fields in a product for the formation of welded joints of equal strength is considered. Many tasks in computational materials science can be posed as optimization problems. The last approach is concerned with the generation of realizations of materials with specified but limited microstructural information: an intriguing inverse problem of both fundamental and practical importance. Computational models based upon the theories of molecular dynamics or quantum mechanics would enable the prediction and modification of fundamental materials properties. This problem is solved using deterministic and stochastic optimization techniques. The main optimization approaches in the frame of the EU project "Superlight-weight thermal protection system for space application" are discussed. An optimization approach to alloys for obtaining materials with the required properties, using modeling techniques and experimental data, is also considered. This report is supported by the EU project "Superlight-weight thermal protection system for space application (LIGHT-TPS)".

  17. A risk assessment method for multi-site damage

    NASA Astrophysics Data System (ADS)

    Millwater, Harry Russell, Jr.

    This research focused on developing probabilistic methods suitable for computing small probabilities of failure, e.g., 10^-6, of structures subject to multi-site damage (MSD). MSD is defined as the simultaneous development of fatigue cracks at multiple sites in the same structural element such that the fatigue cracks may coalesce to form one large crack. MSD is modeled as an array of collinear cracks with random initial crack lengths with the centers of the initial cracks spaced uniformly apart. The data used was chosen to be representative of aluminum structures. The structure is considered failed whenever any two adjacent cracks link up. A fatigue computer model is developed that can accurately and efficiently grow a collinear array of arbitrary length cracks from initial size until failure. An algorithm is developed to compute the stress intensity factors of all cracks considering all interaction effects. The probability of failure of two to 100 cracks is studied. Lower bounds on the probability of failure are developed based upon the probability of the largest crack exceeding a critical crack size. The critical crack size is based on the initial crack size that will grow across the ligament when the neighboring crack has zero length. The probability is evaluated using extreme value theory. An upper bound is based on the probability of the maximum sum of initial cracks being greater than a critical crack size. A weakest link sampling approach is developed that can accurately and efficiently compute small probabilities of failure. This methodology is based on predicting the weakest link, i.e., the two cracks to link up first, for a realization of initial crack sizes, and computing the cycles-to-failure using these two cracks. Criteria to determine the weakest link are discussed. Probability results using the weakest link sampling method are compared to Monte Carlo-based benchmark results. The results indicate that very small probabilities can be computed accurately in a few minutes using a Hewlett-Packard workstation.
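
    The extreme-value lower bound described above can be sketched directly: the probability that the largest of n initial cracks exceeds a critical size follows from the distribution of the maximum, and can be cross-checked by Monte Carlo. The lognormal crack-size distribution and the critical size below are assumptions, not the aluminum data of the dissertation.

```python
import numpy as np
from scipy.stats import lognorm

# Lower bound on failure probability: P(largest of n initial cracks > a_crit),
# computed exactly from the distribution of the maximum and checked by Monte Carlo.
# Distribution and a_crit are illustrative assumptions.

n_cracks = 20
a_crit = 2.5                             # critical initial crack size (mm), assumed
crack_dist = lognorm(s=0.5, scale=0.5)   # median 0.5 mm, assumed

# Exact bound via the CDF of the maximum of n i.i.d. crack sizes.
p_lower = 1.0 - crack_dist.cdf(a_crit) ** n_cracks

# Monte Carlo check of the same quantity.
rng = np.random.default_rng(4)
samples = crack_dist.rvs(size=(200000, n_cracks), random_state=rng)
p_mc = np.mean(samples.max(axis=1) > a_crit)

print(f"Lower-bound failure probability: exact {p_lower:.2e}, Monte Carlo {p_mc:.2e}")
```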

  18. Internal photon and electron dosimetry of the newborn patient—a hybrid computational phantom study

    NASA Astrophysics Data System (ADS)

    Wayson, Michael; Lee, Choonsik; Sgouros, George; Treves, S. Ted; Frey, Eric; Bolch, Wesley E.

    2012-03-01

    Estimates of radiation absorbed dose to organs of the nuclear medicine patient are a requirement for administered activity optimization and for stochastic risk assessment. Pediatric patients, and in particular the newborn child, represent that portion of the patient population where such optimization studies are most crucial owing to the enhanced tissue radiosensitivities and longer life expectancies of this patient subpopulation. In cases where whole-body CT imaging is not available, phantom-based calculations of radionuclide S values—absorbed dose to a target tissue per nuclear transformation in a source tissue—are required for dose and risk evaluation. In this study, a comprehensive model of electron and photon dosimetry of the reference newborn child is presented based on a high-resolution hybrid-voxel phantom from the University of Florida (UF) patient model series. Values of photon specific absorbed fraction (SAF) were assembled for both the reference male and female newborn using the radiation transport code MCNPX v2.6. Values of electron SAF were assembled in a unique and time-efficient manner whereby the collisional and radiative components of organ dose, for both self- and cross-dose terms, were computed separately. Doses to the newborn skeletal tissues were assessed via fluence-to-dose response functions reported for the first time in this study. Values of photon and electron SAFs were used to assemble a complete set of S values for some 16 radionuclides commonly associated with molecular imaging of the newborn. These values were then compared to those available in the OLINDA/EXM software. S value ratios for organ self-dose ranged from 0.46 to 1.42, while similar ratios for organ cross-dose varied from a low of 0.04 to a high of 3.49. These large discrepancies are due in large part to the simplistic organ modeling in the stylized newborn model used in the OLINDA/EXM software. A comprehensive model of internal dosimetry is presented in this study for the newborn nuclear medicine patient based upon the UF hybrid computational phantom. Photon dose response functions, photon and electron SAFs, and tables of radionuclide S values for the newborn child, both male and female, are given in a series of four electronic annexes available at stacks.iop.org/pmb/57/1433/mmedia. These values can be applied to optimization studies of image quality and stochastic risk for this most vulnerable class of pediatric patients.
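
    The assembly of an S value from specific absorbed fractions reduces to a yield-weighted sum over the radionuclide's emissions, as sketched below with placeholder emission and SAF values rather than results from the newborn phantom study.

```python
# S(target <- source) = sum over emissions of yield * energy * SAF, where the SAF
# (specific absorbed fraction, 1/kg) already includes division by the target mass.
# The emission spectrum and SAF numbers below are placeholders, not study values.

MEV_TO_J = 1.602176634e-13

# Hypothetical emissions: (yield per decay, energy in MeV, SAF in 1/kg for the
# chosen source -> target pair).
emissions = [
    (0.89, 0.1405, 0.012),   # gamma, mostly escaping the source organ
    (1.00, 0.0500, 0.250),   # beta (mean energy), mostly absorbed locally
]


def s_value(emissions):
    """Absorbed dose to target per decay in source (Gy per decay = J/kg per decay)."""
    return sum(y * e_mev * MEV_TO_J * saf for y, e_mev, saf in emissions)


s = s_value(emissions)
print(f"S value: {s:.3e} Gy per decay ({s * 3.6e9:.3e} Gy per MBq-hour)")
```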

  19. Improving the Efficiency of Abdominal Aortic Aneurysm Wall Stress Computations

    PubMed Central

    Zelaya, Jaime E.; Goenezen, Sevan; Dargon, Phong T.; Azarbal, Amir-Farzin; Rugonyi, Sandra

    2014-01-01

    An abdominal aortic aneurysm is a pathological dilation of the abdominal aorta, which carries a high mortality rate if ruptured. The most commonly used surrogate marker of rupture risk is the maximal transverse diameter of the aneurysm. More recent studies suggest that wall stress from models of patient-specific aneurysm geometries extracted, for instance, from computed tomography images may be a more accurate predictor of rupture risk and an important factor in AAA size progression. However, quantification of wall stress is typically computationally intensive and time-consuming, mainly due to the nonlinear mechanical behavior of the abdominal aortic aneurysm walls. These difficulties have limited the potential of computational models in clinical practice. To facilitate computation of wall stresses, we propose to use a linear approach that ensures equilibrium of wall stresses in the aneurysms. This proposed linear model approach is easy to implement and eliminates the burden of nonlinear computations. To assess the accuracy of our proposed approach to compute wall stresses, results from idealized and patient-specific model simulations were compared to those obtained using conventional approaches and to those of a hypothetical, reference abdominal aortic aneurysm model. For the reference model, wall mechanical properties and the initial unloaded and unstressed configuration were assumed to be known, and the resulting wall stresses were used as reference for comparison. Our proposed linear approach accurately approximates wall stresses for varying model geometries and wall material properties. Our findings suggest that the proposed linear approach could be used as an effective, efficient, easy-to-use clinical tool to estimate patient-specific wall stresses. PMID:25007052

  20. Child-related cognitions and affective functioning of physically abusive and comparison parents.

    PubMed

    Haskett, Mary E; Smith Scott, Susan; Grant, Raven; Ward, Caryn Sabourin; Robinson, Canby

    2003-06-01

    The goal of this research was to utilize the cognitive behavioral model of abusive parenting to select and examine risk factors to illuminate the unique and combined influences of social cognitive and affective variables in predicting abuse group membership. Participants included physically abusive parents (n=56) and a closely-matched group of comparison parents (n=62). Social cognitive risk variables measured were (a) parent's expectations for children's abilities and maturity, (b) parental attributions of intentionality of child misbehavior, and (c) parents' perceptions of their children's adjustment. Affective risk variables included (a) psychopathology and (b) parenting stress. A series of logistic regression models were constructed to test the individual, combined, and interactive effects of risk variables on abuse group membership. The full set of five risk variables was predictive of abuse status; however, not all variables were predictive when considered individually and interactions did not contribute significantly to prediction. A risk composite score computed for each parent based on the five risk variables significantly predicted abuse status. Wide individual differences in risk across the five variables were apparent within the sample of abusive parents. Findings were generally consistent with a cognitive behavioral model of abuse, with cognitive variables being more salient in predicting abuse status than affective factors. Results point to the importance of considering diversity in characteristics of abusive parents.

  1. Structural reliability assessment capability in NESSUS

    NASA Technical Reports Server (NTRS)

    Millwater, H.; Wu, Y.-T.

    1992-01-01

    The principal capabilities of NESSUS (Numerical Evaluation of Stochastic Structures Under Stress), an advanced computer code developed for probabilistic structural response analysis, are reviewed, and its structural reliability assessed. The code combines flexible structural modeling tools with advanced probabilistic algorithms in order to compute probabilistic structural response and resistance, component reliability and risk, and system reliability and risk. An illustrative numerical example is presented.

  2. Structural reliability assessment capability in NESSUS

    NASA Astrophysics Data System (ADS)

    Millwater, H.; Wu, Y.-T.

    1992-07-01

    The principal capabilities of NESSUS (Numerical Evaluation of Stochastic Structures Under Stress), an advanced computer code developed for probabilistic structural response analysis, are reviewed, and its structural reliability assessed. The code combines flexible structural modeling tools with advanced probabilistic algorithms in order to compute probabilistic structural response and resistance, component reliability and risk, and system reliability and risk. An illustrative numerical example is presented.

  3. An improved approach for flight readiness certification: Methodology for failure risk assessment and application examples. Volume 3: Structure and listing of programs

    NASA Technical Reports Server (NTRS)

    Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.

    1992-01-01

    An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with engineering analysis to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in engineering analyses of failure phenomena, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which engineering analysis models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. Conventional engineering analysis models currently employed for design of failure prediction are used in this methodology. The PFA methodology is described and examples of its application are presented. Conventional approaches to failure risk evaluation for spaceflight systems are discussed, and the rationale for the approach taken in the PFA methodology is presented. The statistical methods, engineering models, and computer software used in fatigue failure mode applications are thoroughly documented.

  4. The effect of model uncertainty on cooperation in sensorimotor interactions

    PubMed Central

    Grau-Moya, J.; Hez, E.; Pezzulo, G.; Braun, D. A.

    2013-01-01

    Decision-makers have been shown to rely on probabilistic models for perception and action. However, these models can be incorrect or partially wrong in which case the decision-maker has to cope with model uncertainty. Model uncertainty has recently also been shown to be an important determinant of sensorimotor behaviour in humans that can lead to risk-sensitive deviations from Bayes optimal behaviour towards worst-case or best-case outcomes. Here, we investigate the effect of model uncertainty on cooperation in sensorimotor interactions similar to the stag-hunt game, where players develop models about the other player and decide between a pay-off-dominant cooperative solution and a risk-dominant, non-cooperative solution. In simulations, we show that players who allow for optimistic deviations from their opponent model are much more likely to converge to cooperative outcomes. We also implemented this agent model in a virtual reality environment, and let human subjects play against a virtual player. In this game, subjects' pay-offs were experienced as forces opposing their movements. During the experiment, we manipulated the risk sensitivity of the computer player and observed human responses. We found not only that humans adaptively changed their level of cooperation depending on the risk sensitivity of the computer player but also that their initial play exhibited characteristic risk-sensitive biases. Our results suggest that model uncertainty is an important determinant of cooperation in two-player sensorimotor interactions. PMID:23945266

  5. Fatigue Assessment of Nickel-Titanium Peripheral Stents: Comparison of Multi-Axial Fatigue Models

    NASA Astrophysics Data System (ADS)

    Allegretti, Dario; Berti, Francesca; Migliavacca, Francesco; Pennati, Giancarlo; Petrini, Lorenza

    2018-03-01

    Peripheral Nickel-Titanium (NiTi) stents exploit super-elasticity to treat femoropopliteal artery atherosclerosis. The stent is subject to cyclic loads, which may lead to fatigue fracture and treatment failure. The complexity of the loading conditions and device geometry, coupled with the nonlinear material behavior, may induce multi-axial and non-proportional deformation. Finite element analysis can assess the fatigue risk, by comparing the device state of stress with the material fatigue limit. The most suitable fatigue model is not fully understood for NiTi devices, due to the material's complex thermo-mechanical behavior. This paper assesses the fatigue behavior of NiTi stents through computational models and experimental validation. Four different strain-based models are considered: the von Mises criterion and three critical plane models (Fatemi-Socie, Brown-Miller, and Smith-Watson-Topper models). Two stents, made of the same material but with different cell geometries, are manufactured, and their fatigue behavior is experimentally characterized. The comparison between experimental and numerical results highlights an overestimation of the failure risk by the von Mises criterion. On the contrary, the selected critical plane models, even if based on different damage mechanisms, give a better fatigue life estimation. Further investigations on crack propagation mechanisms of NiTi stents are required to properly select the most reliable fatigue model.
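
    For reference, two of the critical-plane damage parameters named above have simple textbook forms. The sketch below uses hypothetical stress/strain inputs and is not the authors' finite element post-processing; it only shows the Smith-Watson-Topper and Fatemi-Socie parameters for a single load cycle.

```python
def smith_watson_topper(sigma_n_max, eps_n_amp):
    """SWT damage parameter: maximum normal stress on the critical plane
    times the normal strain amplitude on that plane."""
    return sigma_n_max * eps_n_amp

def fatemi_socie(gamma_amp, sigma_n_max, sigma_yield, k=1.0):
    """Fatemi-Socie parameter: shear strain amplitude scaled by the
    normal-stress term (1 + k * sigma_n_max / sigma_yield)."""
    return gamma_amp * (1.0 + k * sigma_n_max / sigma_yield)

# Hypothetical cycle extracted for a single stent strut location
print(smith_watson_topper(sigma_n_max=450.0, eps_n_amp=0.004))   # MPa * strain
print(fatemi_socie(gamma_amp=0.006, sigma_n_max=450.0, sigma_yield=900.0, k=0.5))
```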

  7. Intelligent instrumentation applied in environment management

    NASA Astrophysics Data System (ADS)

    Magheti, Mihnea I.; Walsh, Patrick; Delassus, Patrick

    2005-06-01

    The use of information and communications technology in environment management and research has witnessed a renaissance in recent years. From optical sensor technology, expert systems, GIS and communications technologies to computer aided harvesting and yield prediction, these systems are increasingly used for emerging applications in the management of natural resources and biodiversity. This paper presents an environmental decision support system, used to monitor biodiversity and present a risk rating for the invasion of pests into the particular systems being examined. This system will utilise expert mobile technology coupled with artificial intelligence and predictive modelling, and will emphasize the potential for expansion into many areas of intelligent remote sensing and computer aided decision-making for environment management or certification. Monitoring and prediction in natural systems, harnessing the potential of computing and communication technologies, is an emerging capability within environmental management. This research will lead to the initiation of a hardware and software multi-tier decision support system for environment management, allowing an evaluation of areas for biodiversity or areas at risk from invasive species, based upon environmental factors/systems.

  8. NASA Strategy to Safely Live and Work in the Space Radiation Environment

    NASA Technical Reports Server (NTRS)

    Cucinotta, Francis A.; Wu, Honglu; Corbin, Barbara J.; Sulzman, Frank M.; Krenek, Sam

    2007-01-01

    In space, astronauts are constantly bombarded with energetic particles. The goal of the National Aeronautics and Space Administration (NASA) and the NASA Space Radiation Project is to ensure that astronauts can safely live and work in the space radiation environment. The space radiation environment poses both acute and chronic risks to crew health and safety, but unlike some other aspects of space travel, space radiation exposure has clinically relevant implications for the lifetime of the crew. Among the identified radiation risks are cancer, acute and late CNS damage, chronic and degenerative tissue disease, and acute radiation syndrome. The term "safely" means that risks are sufficiently understood such that acceptable limits on mission, post-mission and multi-mission consequences can be defined. The NASA Space Radiation Project strategy has several elements. The first element is to use a peer-reviewed research program to increase our mechanistic knowledge and genetic capabilities to develop tools for individual risk projection, thereby reducing our dependency on epidemiological data and population-based risk assessment. The second element is to use the NASA Space Radiation Laboratory to provide a ground-based facility to study the health effects/mechanisms of damage from space radiation exposure and the development and validation of biological models of risk, as well as methods for extrapolation to human risk. The third element is a risk modeling effort that integrates the results from research efforts into models of human risk to reduce uncertainties in predicting the identified radiation risks. To understand the biological basis for risk, we must also understand the physical aspects of the crew environment. Thus, the fourth element develops computer algorithms to predict radiation transport properties, evaluate integrated shielding technologies and provide optimization recommendations for the design of human space systems. Understanding the risks and determining methods to mitigate the risks are keys to a successful radiation protection strategy.

  9. Crowd Sensing-Enabling Security Service Recommendation for Social Fog Computing Systems

    PubMed Central

    Wu, Jun; Su, Zhou; Li, Jianhua

    2017-01-01

    Fog computing, shifting intelligence and resources from the remote cloud to edge networks, has the potential of providing low latency for the communication from sensing data sources to users. For the objects from the Internet of Things (IoT) to the cloud, it is a new trend that the objects establish social-like relationships with each other, which efficiently brings the benefits of developed sociality to a complex environment. As fog services become more sophisticated, it will become more convenient for fog users to share their own services, resources, and data via social networks. Meanwhile, the efficient social organization can enable more flexible, secure, and collaborative networking. Aforementioned advantages make the social network a potential architecture for fog computing systems. In this paper, we design an architecture for social fog computing, in which the services of fog are provisioned based on “friend” relationships. To the best of our knowledge, this is the first attempt to organize a fog computing system based on a social model. Meanwhile, social networking enhances the complexity and security risks of fog computing services, creating difficulties for security service recommendation in social fog computing. To address this, we propose a novel crowd sensing-enabling security service provisioning method to recommend security services accurately in social fog computing systems. Simulation results show the feasibility and efficiency of the crowd sensing-enabling security service recommendation method for social fog computing systems. PMID:28758943

  10. Crowd Sensing-Enabling Security Service Recommendation for Social Fog Computing Systems.

    PubMed

    Wu, Jun; Su, Zhou; Wang, Shen; Li, Jianhua

    2017-07-30

    Fog computing, shifting intelligence and resources from the remote cloud to edge networks, has the potential of providing low latency for the communication from sensing data sources to users. For the objects from the Internet of Things (IoT) to the cloud, it is a new trend that the objects establish social-like relationships with each other, which efficiently brings the benefits of developed sociality to a complex environment. As fog services become more sophisticated, it will become more convenient for fog users to share their own services, resources, and data via social networks. Meanwhile, the efficient social organization can enable more flexible, secure, and collaborative networking. Aforementioned advantages make the social network a potential architecture for fog computing systems. In this paper, we design an architecture for social fog computing, in which the services of fog are provisioned based on "friend" relationships. To the best of our knowledge, this is the first attempt to organize a fog computing system based on a social model. Meanwhile, social networking enhances the complexity and security risks of fog computing services, creating difficulties for security service recommendation in social fog computing. To address this, we propose a novel crowd sensing-enabling security service provisioning method to recommend security services accurately in social fog computing systems. Simulation results show the feasibility and efficiency of the crowd sensing-enabling security service recommendation method for social fog computing systems.

  11. Quantitative Microbial Risk Assessment Tutorial Installation of Software for Watershed Modeling in Support of QMRA - Updated 2017

    EPA Science Inventory

    This tutorial provides instructions for accessing, retrieving, and downloading the following software to install on a host computer in support of Quantitative Microbial Risk Assessment (QMRA) modeling: • QMRA Installation • SDMProjectBuilder (which includes the Microbial ...

  12. Big data and high-performance analytics in structural health monitoring for bridge management

    NASA Astrophysics Data System (ADS)

    Alampalli, Sharada; Alampalli, Sandeep; Ettouney, Mohammed

    2016-04-01

    Structural Health Monitoring (SHM) can be a vital tool for effective bridge management. Combining large data sets from multiple sources to create a data-driven decision-making framework is crucial for the success of SHM. This paper presents a big data analytics framework that combines multiple data sets correlated with functional relatedness to convert data into actionable information that empowers risk-based decision-making. The integrated data environment incorporates near real-time streams of semi-structured data from remote sensors, historical visual inspection data, and observations from structural analysis models to monitor, assess, and manage risks associated with aging bridge inventories. Accelerated processing of data sets is made possible by four technologies: cloud computing, relational database processing, NoSQL database support, and in-memory analytics. The framework is being validated on a railroad corridor that can be subjected to multiple hazards. The framework enables computation of reliability indices for critical bridge components and individual bridge spans. In addition, the framework includes a risk-based decision-making process that enumerates costs and consequences of poor bridge performance at span- and network-levels when rail networks are exposed to natural hazard events such as floods and earthquakes. Big data and high-performance analytics enable insights that assist bridge owners in addressing problems faster.

  13. The Pittsburgh Cervical Cancer Screening Model: a risk assessment tool.

    PubMed

    Austin, R Marshall; Onisko, Agnieszka; Druzdzel, Marek J

    2010-05-01

    Evaluation of cervical cancer screening has grown increasingly complex with the introduction of human papillomavirus (HPV) vaccination and newer screening technologies approved by the US Food and Drug Administration. The objective was to create a unique Pittsburgh Cervical Cancer Screening Model (PCCSM) that quantifies risk for histopathologic cervical precancer (cervical intraepithelial neoplasia [CIN] 2, CIN3, and adenocarcinoma in situ) and cervical cancer in an environment predominantly using newer screening technologies. The PCCSM is a dynamic Bayesian network consisting of 19 variables available in the laboratory information system, including patient history data (most recent HPV vaccination data), Papanicolaou test results, high-risk HPV results, procedure data, and histopathologic results. The model's graphic structure was based on the published literature. Results from 375 441 patient records from 2005 through 2008 were used to build and train the model. Additional data from 45 930 patients were used to test the model. The PCCSM compares risk quantitatively over time for histopathologically verifiable CIN2, CIN3, adenocarcinoma in situ, and cervical cancer in screened patients for each current cytology result category and for each HPV result. For each current cytology result, HPV test results affect risk; however, the degree of cytologic abnormality remains the largest positive predictor of risk. Prior history also alters the CIN2, CIN3, adenocarcinoma in situ, and cervical cancer risk for patients with common current cytology and HPV test results. The PCCSM can also generate negative risk projections, estimating the likelihood of the absence of histopathologic CIN2, CIN3, adenocarcinoma in situ, and cervical cancer in screened patients. The PCCSM is a dynamic Bayesian network that computes quantitative cervical disease risk estimates for patients undergoing cervical screening. Continuously updatable with current system data, the PCCSM provides a new tool to monitor cervical disease risk in the evolving postvaccination era.

  14. Model-Based Knowing: How Do Students Ground Their Understanding About Climate Systems in Agent-Based Computer Models?

    NASA Astrophysics Data System (ADS)

    Markauskaite, Lina; Kelly, Nick; Jacobson, Michael J.

    2017-12-01

    This paper gives a grounded cognition account of model-based learning of complex scientific knowledge related to socio-scientific issues, such as climate change. It draws on the results from a study of high school students learning about the carbon cycle through computational agent-based models and investigates two questions: First, how do students ground their understanding about the phenomenon when they learn and solve problems with computer models? Second, what are common sources of mistakes in students' reasoning with computer models? Results show that students ground their understanding in computer models in five ways: direct observation, straight abstraction, generalisation, conceptualisation, and extension. Students also incorporate into their reasoning their knowledge and experiences that extend beyond phenomena represented in the models, such as attitudes about unsustainable carbon emission rates, human agency, external events, and the nature of computational models. The most common difficulties of the students relate to seeing the modelled scientific phenomenon and connecting results from the observations with other experiences and understandings about the phenomenon in the outside world. An important contribution of this study is the constructed coding scheme for establishing different ways of grounding, which helps to understand some challenges that students encounter when they learn about complex phenomena with agent-based computer models.

  15. Clinical relevance of model based computer-assisted diagnosis and therapy

    NASA Astrophysics Data System (ADS)

    Schenk, Andrea; Zidowitz, Stephan; Bourquain, Holger; Hindennach, Milo; Hansen, Christian; Hahn, Horst K.; Peitgen, Heinz-Otto

    2008-03-01

    The ability to acquire and store radiological images digitally has made this data available to mathematical and scientific methods. With the step from subjective interpretation to reproducible measurements and knowledge, it is also possible to develop and apply models that give additional information which is not directly visible in the data. In this context, it is important to know the characteristics and limitations of each model. Four characteristics assure the clinical relevance of models for computer-assisted diagnosis and therapy: the ability to adapt to individual patients, treatment of errors and uncertainty, dynamic behavior, and in-depth evaluation. We demonstrate the development and clinical application of a model in the context of liver surgery. Here, a model of intrahepatic vascular structures is combined with patient-individual anatomical information from radiological images, which is limited in its degree of vascular detail. As a result, the model allows for a dedicated risk analysis and preoperative planning of oncologic resections as well as for living donor liver transplantations. The clinical relevance of the method was confirmed in several evaluation studies by our medical partners, and more than 2900 complex surgical cases have been analyzed since 2002.

  16. MO-G-304-01: FEATURED PRESENTATION: Expanding the Knowledge Base for Data-Driven Treatment Planning: Incorporating Patient Outcome Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robertson, SP; Quon, H; Cheng, Z

    2015-06-15

    Purpose: To extend the capabilities of knowledge-based treatment planning beyond simple dose queries by incorporating validated patient outcome models. Methods: From an analytic, relational database of 684 head and neck cancer patients, 372 patients were identified having dose data for both left and right parotid glands as well as baseline and follow-up xerostomia assessments. For each existing patient, knowledge-based treatment planning was simulated by querying the dose-volume histograms and geometric shape relationships (overlap volume histograms) for all other patients. Dose predictions were captured at normalized volume thresholds (NVT) of 0%, 10%, 20%, 30%, 40%, 50%, and 85% and were compared with the actual achieved doses using the Wilcoxon signed-rank test. Next, a logistic regression model was used to predict the maximum severity of xerostomia up to three months following radiotherapy. Baseline xerostomia scores were subtracted from follow-up assessments and were also included in the model. The relative risks from predicted doses and actual doses were computed and compared. Results: The predicted doses for both parotid glands were significantly less than the achieved doses (p < 0.0001), with differences ranging from 830 cGy ± 1270 cGy (0% NVT) to 1673 cGy ± 1197 cGy (30% NVT). The modelled risk of xerostomia ranged from 54% to 64% for achieved doses and from 33% to 51% for the dose predictions. Relative risks varied from 1.24 to 1.87, with maximum relative risk occurring at 85% NVT. Conclusions: Data-driven generation of treatment planning objectives without consideration of the underlying normal tissue complication probability may result in inferior plans, even if quality metrics indicate otherwise. Inclusion of complication models in knowledge-based treatment planning is necessary in order to close the feedback loop between radiotherapy treatments and patient outcomes. Future work includes advancing and validating complication models in the context of knowledge-based treatment planning. This work is supported by Philips Radiation Oncology Systems.

  17. Integrated risk/cost planning models for the US Air Traffic system

    NASA Technical Reports Server (NTRS)

    Mulvey, J. M.; Zenios, S. A.

    1985-01-01

    A prototype network planning model for the U.S. Air Traffic control system is described. The model encompasses the dual objectives of managing collision risks and transportation costs where traffic flows can be related to these objectives. The underlying structure is a network graph with nonseparable convex costs; the model is solved efficiently by capitalizing on its intrinsic characteristics. Two specialized algorithms for solving the resulting problems are described: (1) truncated Newton, and (2) simplicial decomposition. The feasibility of the approach is demonstrated using data collected from a control center in the Midwest. Computational results with different computer systems are presented, including a vector supercomputer (Cray X-MP). The risk/cost model has two primary uses: (1) as a strategic planning tool using aggregate flight information, and (2) as an integrated operational system for forecasting congestion and monitoring (controlling) flow throughout the U.S. In the latter case, access to a supercomputer is required due to the model's enormous size.

  18. Global Sampling for Integrating Physics-Specific Subsystems and Quantifying Uncertainties of CO2 Geological Sequestration

    DOE PAGES

    Sun, Y.; Tong, C.; Trainor-Guitten, W. J.; ...

    2012-12-20

    The risk of CO2 leakage from a deep storage reservoir into a shallow aquifer through a fault is assessed and studied using physics-specific computer models. The hypothetical CO2 geological sequestration system is composed of three subsystems: a deep storage reservoir, a fault in caprock, and a shallow aquifer, which are modeled respectively by considering sub-domain-specific physics. Supercritical CO2 is injected into the reservoir subsystem with uncertain permeabilities of reservoir, caprock, and aquifer, uncertain fault location, and injection rate (as a decision variable). The simulated pressure and CO2/brine saturation are connected to the fault-leakage model as a boundary condition. CO2 and brine fluxes from the fault-leakage model at the fault outlet are then imposed in the aquifer model as a source term. Moreover, uncertainties are propagated from the deep reservoir model, to the fault-leakage model, and eventually to the geochemical model in the shallow aquifer, thus contributing to risk profiles. To quantify the uncertainties and assess leakage-relevant risk, we propose a global sampling-based method to allocate sub-dimensions of uncertain parameters to sub-models. The risk profiles are defined and related to CO2 plume development for pH value and total dissolved solids (TDS) below the EPA's Maximum Contaminant Levels (MCL) for drinking water quality. A global sensitivity analysis is conducted to select the most sensitive parameters to the risk profiles. The resulting uncertainty of pH- and TDS-defined aquifer volume, which is impacted by CO2 and brine leakage, mainly results from the uncertainty of fault permeability. Subsequently, high-resolution, reduced-order models of risk profiles are developed as functions of all the decision variables and uncertain parameters in all three subsystems.

  20. Evaluation and recommendation of sensitivity analysis methods for application to Stochastic Human Exposure and Dose Simulation models.

    PubMed

    Mokhtari, Amirhossein; Christopher Frey, H; Zheng, Junyu

    2006-11-01

    Sensitivity analyses of exposure or risk models can help identify the most significant factors to aid in risk management or to prioritize additional research to reduce uncertainty in the estimates. However, sensitivity analysis is challenged by non-linearity, interactions between inputs, and multiple days or time scales. Selected sensitivity analysis methods are evaluated with respect to their applicability to human exposure models with such features using a testbed. The testbed is a simplified version of the US Environmental Protection Agency's Stochastic Human Exposure and Dose Simulation (SHEDS) model. The methods evaluated include the Pearson and Spearman correlation, sample and rank regression, analysis of variance, Fourier amplitude sensitivity test (FAST), and Sobol's method. The first five methods are known as "sampling-based" techniques, whereas the latter two methods are known as "variance-based" techniques. The main objective of the test cases was to identify the main and total contributions of individual inputs to the output variance. Sobol's method and FAST directly quantified these measures of sensitivity. Results show that sensitivity of an input typically changed when evaluated under different time scales (e.g., daily versus monthly). All methods provided similar insights regarding less important inputs; however, Sobol's method and FAST provided more robust insights with respect to sensitivity of important inputs compared to the sampling-based techniques. Thus, the sampling-based methods can be used in a screening step to identify unimportant inputs, followed by application of more computationally intensive refined methods to a smaller set of inputs. The implications of time variation in sensitivity results for risk management are briefly discussed.
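
    A minimal sketch of the variance-based idea (assuming NumPy), applied to a toy stand-in rather than the SHEDS model: first-order Sobol indices are estimated with a simple Monte Carlo "pick-freeze" scheme. The toy model and sample sizes are assumptions for illustration only.

```python
import numpy as np

def toy_exposure_model(x):
    """Toy stand-in for an exposure model: nonlinear, with an interaction term."""
    c, t, f = x[:, 0], x[:, 1], x[:, 2]   # e.g. concentration, contact time, frequency
    return c * t + 0.5 * c * f + 0.1 * t ** 2

def first_order_sobol(model, n=100_000, d=3, seed=1):
    """Monte Carlo 'pick-freeze' estimate of first-order Sobol indices:
    S_i is the fraction of output variance explained by input i alone."""
    rng = np.random.default_rng(seed)
    a = rng.uniform(size=(n, d))
    b = rng.uniform(size=(n, d))
    y_a, y_b = model(a), model(b)
    var_y = np.var(np.concatenate([y_a, y_b]))
    s = []
    for i in range(d):
        ba_i = b.copy()
        ba_i[:, i] = a[:, i]              # keep input i from sample A, redraw the rest
        s.append(np.mean(y_a * (model(ba_i) - y_b)) / var_y)
    return s

print(first_order_sobol(toy_exposure_model))   # one index per input
```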

  1. Risk-based maintenance of ethylene oxide production facilities.

    PubMed

    Khan, Faisal I; Haddara, Mahmoud R

    2004-05-20

    This paper discusses a methodology for the design of an optimum inspection and maintenance program. The methodology, called risk-based maintenance (RBM), is based on integrating a reliability approach and a risk assessment strategy to obtain an optimum maintenance schedule. First, the likely equipment failure scenarios are formulated. Out of the many likely failure scenarios, the most probable ones are subjected to a detailed study. Detailed consequence analysis is done for the selected scenarios. Subsequently, these failure scenarios are subjected to a fault tree analysis to determine their probabilities. Finally, risk is computed by combining the results of the consequence and the probability analyses. The calculated risk is compared against known acceptable criteria. The frequencies of the maintenance tasks are obtained by minimizing the estimated risk. A case study involving an ethylene oxide production facility is presented. Out of the five most hazardous units considered, the pipeline used for the transportation of ethylene is found to have the highest risk. Using available failure data and a lognormal reliability distribution function, human health risk factors are calculated. Both societal risk factors and individual risk factors exceeded the acceptable risk criteria. To determine an optimal maintenance interval, a reverse fault tree analysis was used. The maintenance interval was determined such that the original high risk is brought down to an acceptable level. A sensitivity analysis is also undertaken to study the impact of changing the distribution of the reliability model as well as the error in the distribution parameters on the maintenance interval.
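
    A minimal sketch of the core RBM loop (assuming NumPy and SciPy), not the authors' procedure: risk over an inspection interval is taken as the probability of failure within the interval times the consequence, and the longest interval that keeps risk below an acceptable criterion is selected. The lognormal parameters, consequence, and acceptance threshold are hypothetical.

```python
import numpy as np
from scipy.stats import lognorm

# Hypothetical lognormal time-to-failure model for a pipeline segment (years)
median_life, sigma = 12.0, 0.6
failure_time = lognorm(s=sigma, scale=median_life)

consequence = 5.0e6        # hypothetical consequence of failure (monetary equivalent)
acceptable_risk = 1.0e5    # hypothetical acceptable risk per inspection interval

def interval_risk(t_years):
    """Risk accumulated over one maintenance interval of length t_years,
    assuming the component is restored to 'as good as new' at each inspection."""
    return failure_time.cdf(t_years) * consequence

# Longest inspection interval that keeps risk below the acceptable criterion
candidates = np.arange(0.25, 10.25, 0.25)
feasible = [t for t in candidates if interval_risk(t) <= acceptable_risk]
print(f"Recommended maintenance interval: {max(feasible):.2f} years" if feasible
      else "No feasible interval; redesign or add safeguards")
```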

  2. Quantitative risk assessment system (QRAS)

    NASA Technical Reports Server (NTRS)

    Tan, Zhibin (Inventor); Mosleh, Ali (Inventor); Weinstock, Robert M (Inventor); Smidts, Carol S (Inventor); Chang, Yung-Hsien (Inventor); Groen, Francisco J (Inventor); Swaminathan, Sankaran (Inventor)

    2001-01-01

    A quantitative risk assessment system (QRAS) builds a risk model of a system for which risk of failure is being assessed, then analyzes the risk of the system corresponding to the risk model. The QRAS performs sensitivity analysis of the risk model by altering fundamental components and quantifications built into the risk model, then re-analyzes the risk of the system using the modifications. More particularly, the risk model is built by constructing a hierarchy, creating a mission timeline, quantifying failure modes, and building/editing event sequence diagrams. Multiplicities, dependencies, and redundancies of the system are included in the risk model. For analysis runs, a fixed baseline is first constructed and stored. This baseline contains the lowest level scenarios, preserved in event tree structure. The analysis runs, at any level of the hierarchy and below, access this baseline for quantitative risk computation as well as ranking of particular risks. A standalone Tool Box capability exists, allowing the user to store application programs within QRAS.
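
    A toy illustration of how lowest-level event-tree scenarios can be quantified and ranked: each end state's probability is the product of its branch probabilities, and scenarios leading to a loss end state are summed and ranked. The events, probabilities, and end-state logic below are hypothetical and do not come from QRAS.

```python
from itertools import product

# Hypothetical pivotal events: (event name, probability of the failure branch)
events = [("valve_fails_to_close",  1e-3),
          ("backup_pump_fails",     5e-2),
          ("operator_misses_alarm", 1e-1)]

# Enumerate all end states of the binary event tree; an end state records which
# pivotal events failed along that path.
scenarios = []
for outcome in product([True, False], repeat=len(events)):
    p = 1.0
    for (name, p_fail), failed in zip(events, outcome):
        p *= p_fail if failed else (1.0 - p_fail)
    # Loss end state (hypothetical logic): the valve fails AND either the backup
    # pump fails or the operator misses the alarm.
    loss = outcome[0] and (outcome[1] or outcome[2])
    scenarios.append((outcome, p, loss))

p_loss = sum(p for _, p, loss in scenarios if loss)
print(f"Probability of loss end states: {p_loss:.3e}")
for outcome, p, loss in sorted(scenarios, key=lambda s: -s[1]):
    if loss:
        print(outcome, f"{p:.3e}")   # ranking of contributing scenarios
```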

  3. A Summative Evaluation of the Effectiveness of Classroom-Embedded, Individualistic, Computer-Based Learning for Middle School Students Placed at Academic Risk in Schools with a High Proportion of Title I Eligible Students

    ERIC Educational Resources Information Center

    DeLoach, Regina M.

    2011-01-01

    The purpose of this "post hoc," summative evaluation was to evaluate the effectiveness of classroom-embedded, individualistic, computer-based learning for middle school students placed at academic risk in schools with a high proportion of Title I eligible students. Data were mined from existing school district databases. For data (n = 393)…

  4. The importance of bony impingement in restricting flexion after total knee arthroplasty: computer simulation model with clinical correlation.

    PubMed

    Mizu-Uchi, Hideki; Colwell, Clifford W; Fukagawa, Shingo; Matsuda, Shuichi; Iwamoto, Yukihide; D'Lima, Darryl D

    2012-10-01

    We constructed patient-specific models from computed tomography data after total knee arthroplasty to predict knee flexion based on implant-bone impingement. The maximum flexion before impingement between the femur and the tibial insert was computed using a musculoskeletal modeling program (KneeSIM; LifeModeler, Inc, San Clemente, California) during a weight-bearing deep knee bend. Postoperative flexion was measured in a clinical cohort of 21 knees (low-flex group: 6 knees with <100° of flexion and high-flex group: 15 size-matched knees with >125° of flexion at 2 years). Average predicted flexion angles were within 2° of clinical measurements for the high-flex group. In the low-flex group, 4 cases had impingement involving the bone cut at the posterior condyle, and the average predicted knee flexion was 102° compared with 93° measured clinically. These results indicate that the level of the distal femoral resection should be carefully planned and that exposed bone proximal to the tips of the posterior condyles of the femoral component should be removed if there is risk of impingement. Copyright © 2012 Elsevier Inc. All rights reserved.

  5. The EPA Comptox Chemistry Dashboard: A Web-Based Data ...

    EPA Pesticide Factsheets

    The U.S. Environmental Protection Agency (EPA) Computational Toxicology Program integrates advances in biology, chemistry, and computer science to help prioritize chemicals for further research based on potential human health risks. This work involves computational and data-driven approaches that integrate chemistry, exposure and biological data. As an outcome of these efforts, the National Center for Computational Toxicology (NCCT) has measured, assembled and delivered an enormous quantity and diversity of data for the environmental sciences including high-throughput in vitro screening data, in vivo and functional use data, exposure models and chemical databases with associated properties. A series of software applications and databases have been produced over the past decade to deliver these data but recent developments have focused on the development of a new software architecture that assembles the resources into a single platform. A new web application, the CompTox Chemistry Dashboard, provides access to data associated with ~720,000 chemical substances. These data include experimental and predicted physicochemical property data, bioassay screening data associated with the ToxCast program, product and functional use information and a myriad of related data of value to environmental scientists. The dashboard provides chemical-based searching based on chemical names, synonyms and CAS Registry Numbers. Flexible search capabilities allow for chemical identification.

  6. Computer-Based Molecular Modelling: Finnish School Teachers' Experiences and Views

    ERIC Educational Resources Information Center

    Aksela, Maija; Lundell, Jan

    2008-01-01

    Modern computer-based molecular modelling opens up new possibilities for chemistry teaching at different levels. This article presents a case study seeking insight into Finnish school teachers' use of computer-based molecular modelling in teaching chemistry, into the different working and teaching methods used, and their opinions about necessary…

  7. Idiosyncratic risk in the Dow Jones Eurostoxx50 Index

    NASA Astrophysics Data System (ADS)

    Daly, Kevin; Vo, Vinh

    2008-07-01

    Recent evidence by Campbell et al. [J.Y. Campbell, M. Lettau, B.G. Malkiel, Y. Xu, Have individual stocks become more volatile? An empirical exploration of idiosyncratic risk, The Journal of Finance (February) (2001)] shows an increase in firm-level volatility and a decline of the correlation among stock returns in the US. In relation to the Euro-Area stock markets, we find that both aggregate firm-level volatility and average stock market correlation have trended upwards. We estimate a linear model of the market risk-return relationship nested in an EGARCH(1, 1)-M model for conditional second moments. We then show that traditional estimates of the conditional risk-return relationship, which use ex-post excess-returns as the conditioning information set, lead to joint tests of the theoretical model (usually the ICAPM) and of the Efficient Market Hypothesis in its strong form. To overcome this problem we propose alternative measures of expected market risk based on implied volatility extracted from traded option prices and we discuss the conditions under which implied volatility depends solely on expected risk. We then regress market excess-returns on lagged market implied variance computed from implied market volatility to estimate the relationship between expected market excess-returns and expected market risk. We investigate whether, as predicted by the ICAPM, the expected market risk is the main factor in explaining the market risk premium and whether the latter is independent of aggregate idiosyncratic risk.
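
    A minimal sketch of the final regression step only (excess returns regressed on lagged implied variance), using NumPy and synthetic data in place of Eurostoxx50 returns and option-implied volatility; it is not the authors' EGARCH-M estimation, and the data-generating values are placeholders.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 500

# Synthetic stand-ins: option-implied volatility (annualised) and excess returns,
# where returns at t depend on implied variance observed at t-1.
implied_vol = 0.15 + 0.05 * np.abs(rng.standard_normal(n))
implied_var = implied_vol ** 2
lam_true = 2.0                                   # hypothetical risk-return slope
excess_ret = 0.001 + lam_true * np.roll(implied_var, 1) + 0.01 * rng.standard_normal(n)

# Regress excess returns at t on implied variance at t-1 (drop the first observation)
y = excess_ret[1:]
X = np.column_stack([np.ones(n - 1), implied_var[:-1]])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(f"intercept = {beta[0]:.4f}, slope on lagged implied variance = {beta[1]:.3f}")
```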

  8. Future cardiovascular disease in China: Markov model and risk factor scenario projections from the Coronary Heart Disease Policy Model-China

    PubMed Central

    Moran, Andrew; Gu, Dongfeng; Zhao, Dong; Coxson, Pamela; Wang, Y. Claire; Chen, Chung-Shiuan; Liu, Jing; Cheng, Jun; Bibbins-Domingo, Kirsten; Shen, Yu-Ming; He, Jiang; Goldman, Lee

    2010-01-01

    Background The relative effects of individual and combined risk factor trends on future cardiovascular disease in China have not been quantified in detail. Methods and Results Future risk factor trends in China were projected based on prior trends. Cardiovascular disease (coronary heart disease and stroke) in adults ages 35 to 84 years was projected from 2010 to 2030 using the Coronary Heart Disease Policy Model–China, a Markov computer simulation model. With risk factor levels held constant, projected annual cardiovascular events increased by >50% between 2010 and 2030 based on population aging and growth alone. Projected trends in blood pressure, total cholesterol, diabetes (increases), and active smoking (decline) would increase annual cardiovascular disease events by an additional 23%, an increase of approximately 21.3 million cardiovascular events and 7.7 million cardiovascular deaths over 2010 to 2030. Aggressively reducing active smoking in Chinese men to 20% prevalence in 2020 and 10% prevalence in 2030 or reducing mean systolic blood pressure by 3.8 mm Hg in men and women would counteract adverse trends in other risk factors by preventing cardiovascular events and 2.9 to 5.7 million total deaths over 2 decades. Conclusions Aging and population growth will increase cardiovascular disease by more than half over the coming 20 years, and projected unfavorable trends in blood pressure, total cholesterol, diabetes, and body mass index may accelerate the epidemic. National policy aimed at controlling blood pressure, smoking, and other risk factors would counteract the expected future cardiovascular disease epidemic in China. PMID:20442213
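
    A minimal Markov cohort sketch of the kind of projection described (assuming NumPy), not the CHD Policy Model-China itself: a cohort is propagated through hypothetical annual transition probabilities between "well", "living with CVD", and "dead" states, and two incidence scenarios are compared. All states, probabilities, and cohort sizes are illustrative assumptions.

```python
import numpy as np

# Health states: 0 = well, 1 = living with CVD, 2 = dead
# Hypothetical annual transition probabilities (each row sums to 1)
def transition_matrix(incidence):
    return np.array([[1 - incidence - 0.010, incidence, 0.010],
                     [0.0,                   0.950,     0.050],
                     [0.0,                   0.000,     1.000]])

def project(cohort, incidence, years=20):
    """Propagate a cohort over health states for `years` annual cycles and
    return the cumulative number of new CVD cases."""
    P = transition_matrix(incidence)
    state = np.array(cohort, dtype=float)
    new_cases = 0.0
    for _ in range(years):
        new_cases += state[0] * incidence      # well -> CVD transitions this cycle
        state = state @ P
    return new_cases

cohort_2010 = [1_000_000, 50_000, 0]           # hypothetical starting cohort
baseline = project(cohort_2010, incidence=0.008)
adverse  = project(cohort_2010, incidence=0.010)   # e.g. rising blood pressure / diabetes
print(f"Projected 20-year CVD cases: baseline {baseline:,.0f}, adverse trend {adverse:,.0f} "
      f"(+{100 * (adverse / baseline - 1):.0f}%)")
```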

  9. Prediction of near-term breast cancer risk using local region-based bilateral asymmetry features in mammography

    NASA Astrophysics Data System (ADS)

    Li, Yane; Fan, Ming; Li, Lihua; Zheng, Bin

    2017-03-01

    This study proposed a near-term breast cancer risk assessment model based on local region bilateral asymmetry features in mammography. The database includes 566 cases who underwent at least two sequential FFDM examinations. The 'prior' examinations in the series were all interpreted as negative (not recalled). In the "current" examination, 283 women were diagnosed with cancer and 283 remained negative. The ages of the cancer and negative cases were completely matched. These cases were divided into three subgroups according to age: 152 cases in the 37-49 age bracket, 220 cases in the 50-60 age bracket, and 194 cases in the 61-86 age bracket. For each image, two local regions including strip-based regions and difference-of-Gaussian basic element regions were segmented. After that, structural variation features among pixel values and structural similarity features were computed for strip regions. Meanwhile, positional features were extracted for basic element regions. The absolute subtraction value was computed between each feature of the left and right local regions. Next, a multi-layer perceptron classifier was implemented to assess the performance of the features for prediction. Features were then selected according to stepwise regression analysis. The AUC achieved 0.72, 0.75 and 0.71 for these 3 age-based subgroups, respectively. The maximum adjustable odds ratios were 12.4, 20.56 and 4.91 for these three groups, respectively. This study demonstrates that the local region-based bilateral asymmetry features extracted from CC-view mammography could provide useful information to predict near-term breast cancer risk.
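
    A minimal sketch of the classification step only (assuming NumPy and scikit-learn): absolute left-right feature differences are fed to a multi-layer perceptron and scored by AUC. The features here are synthetic stand-ins, not the strip-based or difference-of-Gaussian features described in the record.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n_cases, n_features = 566, 20

# Synthetic stand-ins for left/right local-region features; cancer cases are given
# slightly larger bilateral asymmetry on average.
left = rng.normal(size=(n_cases, n_features))
right = rng.normal(size=(n_cases, n_features))
labels = np.array([1] * 283 + [0] * 283)
right[labels == 1] += 0.8 * rng.normal(size=(283, n_features))

asymmetry = np.abs(left - right)               # bilateral asymmetry features

X_train, X_test, y_train, y_test = train_test_split(
    asymmetry, labels, test_size=0.3, stratify=labels, random_state=0)

clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
clf.fit(X_train, y_train)
auc = roc_auc_score(y_test, clf.predict_proba(X_test)[:, 1])
print(f"AUC on held-out cases: {auc:.2f}")
```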

  10. Improving Flood Predictions in Data-Scarce Basins

    NASA Astrophysics Data System (ADS)

    Vimal, Solomon; Zanardo, Stefano; Rafique, Farhat; Hilberts, Arno

    2017-04-01

    Flood modeling methodology at Risk Management Solutions Ltd. has evolved over several years with the development of continental scale flood risk models spanning most of Europe, the United States and Japan. Pluvial (rain-fed) and fluvial (river-fed) flood maps represent the basis for the assessment of regional flood risk. These maps are derived by solving the 1D energy balance equation for river routing and the 2D shallow water equation (SWE) for overland flow. The models are run with high performance computing and GPU-based solvers, as the time taken for simulation is large in such continental scale modeling. These results are validated with data from authorities and business partners, and have been used in the insurance industry for many years. While this methodology has been proven extremely effective in regions where the quality and availability of data are high, its application is very challenging in other regions where data are scarce. This is generally the case for low and middle income countries, where simpler approaches are needed for flood risk modeling and assessment. In this study we explore new methods to make use of modeling results obtained in data-rich contexts to improve predictive ability in data-scarce contexts. As an example, based on our modeled flood maps in data-rich countries, we identify statistical relationships between flood characteristics and topographic and climatic indicators, and test their generalization across physical domains. Moreover, we apply the Height Above Nearest Drainage (HAND) approach to estimate "probable" saturated areas for different return period flood events as functions of basin characteristics. This work falls into the well-established research field of Predictions in Ungauged Basins.

  11. Development and Sensitivity Analysis of a Frost Risk model based primarily on freely distributed Earth Observation data

    NASA Astrophysics Data System (ADS)

    Louka, Panagiota; Petropoulos, George; Papanikolaou, Ioannis

    2015-04-01

    The ability to map the spatiotemporal distribution of extreme climatic conditions, such as frost, is a significant tool in successful agricultural management and decision making. Nowadays, with the development of Earth Observation (EO) technology, it is possible to obtain accurate, timely and cost-effective information on the spatiotemporal distribution of frost conditions, particularly over large and otherwise inaccessible areas. The present study aimed at developing and evaluating a frost risk prediction model, exploiting primarily EO data from MODIS and ASTER sensors and ancillary ground observation data. For the evaluation of our model, a region in north-western Greece was selected as the test site and a detailed sensitivity analysis was implemented. The agreement between the model predictions and the observed (remotely sensed) frost frequency obtained by the MODIS sensor was evaluated thoroughly. Also, detailed comparisons of the model predictions were performed against reference frost ground observations acquired from the Greek Agricultural Insurance Organization (ELGA) over a period of 10 years (2000-2010). Overall, the results evidenced the ability of the model to reproduce the frost conditions reasonably well, following largely explainable patterns with respect to the study site and local weather characteristics. Implementation of our proposed frost risk model is based primarily on analysis of satellite imagery that is nowadays provided globally at no cost. It is also straightforward and computationally inexpensive, requiring much less effort in comparison to, for example, field surveying. Finally, the method is adaptable and can potentially be integrated with other high-resolution data available from both commercial and non-commercial vendors. Keywords: Sensitivity analysis, frost risk mapping, GIS, remote sensing, MODIS, Greece

  12. Risk Assessment in the 21st Century - Conference Abstract ...

    EPA Pesticide Factsheets

    For the past ~50 years, risk assessment has depended almost exclusively on animal testing for hazard identification and dose-response assessment. Although originally sound and effective, this traditional approach is no longer sufficient given the increasing dependence on chemical tools and the growing number of chemicals in commerce. This presentation provides an update on current progress in achieving the goals outlined in the NAS reports: “Toxicology Testing in the 21st Century”, “Exposure Science in the 21st Century”, and most recently, “Using 21st Century Science to Improve Risk-Related Evaluations.” The presentation highlights many of the advances led by the EPA. Topics covered include the evolution of the mode of action concept into the chemically agnostic adverse outcome pathway (AOP), a systems-based data framework that facilitates integration of modifiable factors (e.g., genetic variation, life stages), and an understanding of networks and mixtures. Further, the EDSP pivot is used to illustrate how AOPs drive development of predictive models for risk assessment based on assembly of high-throughput assays representing AOP key elements. The birth of computational exposure science, capable of large-scale predictive exposure models, is reviewed. Although still in its infancy, development of non-targeted analysis to begin addressing the exposome is presented, as is the systems-based AEP that integrates exposure, toxicokinetics and AOPs into a comprehensive framework

  13. Harnessing the theoretical foundations of the exponential and beta-Poisson dose-response models to quantify parameter uncertainty using Markov Chain Monte Carlo.

    PubMed

    Schmidt, Philip J; Pintar, Katarina D M; Fazil, Aamir M; Topp, Edward

    2013-09-01

    Dose-response models are the essential link between exposure assessment and computed risk values in quantitative microbial risk assessment, yet the uncertainty that is inherent to computed risks (because the dose-response model parameters are estimated using limited epidemiological data) is rarely quantified. Second-order risk characterization approaches incorporating uncertainty in dose-response model parameters can provide more complete information to decision-makers by separating variability and uncertainty to quantify the uncertainty in computed risks. Therefore, the objective of this work is to develop procedures to sample from posterior distributions describing uncertainty in the parameters of exponential and beta-Poisson dose-response models using Bayes's theorem and Markov Chain Monte Carlo (in OpenBUGS). The theoretical origins of the beta-Poisson dose-response model are used to identify a decomposed version of the model that enables Bayesian analysis without the need to evaluate Kummer confluent hypergeometric functions. Herein, it is also established that the beta distribution in the beta-Poisson dose-response model cannot address variation among individual pathogens, criteria to validate use of the conventional approximation to the beta-Poisson model are proposed, and simple algorithms to evaluate actual beta-Poisson probabilities of infection are investigated. The developed MCMC procedures are applied to analysis of a case study data set, and it is demonstrated that an important region of the posterior distribution of the beta-Poisson dose-response model parameters is attributable to the absence of low-dose data. This region includes beta-Poisson models for which the conventional approximation is especially invalid and in which many beta distributions have an extreme shape with questionable plausibility. © Her Majesty the Queen in Right of Canada 2013. Reproduced with the permission of the Minister of the Public Health Agency of Canada.
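
    For orientation, a minimal sketch (assuming NumPy) of Bayesian parameter uncertainty for the simpler exponential dose-response model, P(infection | dose) = 1 - exp(-r * dose), fitted to hypothetical dose-group data with a random-walk Metropolis sampler. The beta-Poisson case and the OpenBUGS analysis in the record are more involved than this.

```python
import numpy as np

# Hypothetical dose-group data: dose, number exposed, number infected
doses    = np.array([1e1, 1e2, 1e3, 1e4], dtype=float)
exposed  = np.array([20, 20, 20, 20])
infected = np.array([1, 4, 12, 19])

def log_likelihood(log_r):
    """Binomial log-likelihood under the exponential model P(inf|d) = 1 - exp(-r d)."""
    p = 1.0 - np.exp(-np.exp(log_r) * doses)
    p = np.clip(p, 1e-12, 1 - 1e-12)
    return np.sum(infected * np.log(p) + (exposed - infected) * np.log(1.0 - p))

# Random-walk Metropolis on log(r), with a flat prior on log(r)
rng = np.random.default_rng(3)
n_iter, step = 20_000, 0.25
chain = np.empty(n_iter)
current = np.log(1e-3)
current_ll = log_likelihood(current)
for i in range(n_iter):
    proposal = current + step * rng.standard_normal()
    proposal_ll = log_likelihood(proposal)
    if np.log(rng.uniform()) < proposal_ll - current_ll:
        current, current_ll = proposal, proposal_ll
    chain[i] = current

r_samples = np.exp(chain[5000:])               # discard burn-in
print(f"posterior median r = {np.median(r_samples):.2e}, "
      f"95% CrI = ({np.quantile(r_samples, 0.025):.2e}, {np.quantile(r_samples, 0.975):.2e})")
```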

  14. Electronic health record analysis via deep poisson factor models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Henao, Ricardo; Lu, James T.; Lucas, Joseph E.

    Electronic Health Record (EHR) phenotyping utilizes patient data captured through normal medical practice, to identify features that may represent computational medical phenotypes. These features may be used to identify at-risk patients and improve prediction of patient morbidity and mortality. We present a novel deep multi-modality architecture for EHR analysis (applicable to joint analysis of multiple forms of EHR data), based on Poisson Factor Analysis (PFA) modules. Each modality, composed of observed counts, is represented as a Poisson distribution, parameterized in terms of hidden binary units. Information from different modalities is shared via a deep hierarchy of common hidden units. Activation of these binary units occurs with probability characterized as Bernoulli-Poisson link functions, instead of more traditional logistic link functions. In addition, we demonstrate that PFA modules can be adapted to discriminative modalities. To compute model parameters, we derive efficient Markov Chain Monte Carlo (MCMC) inference that scales efficiently, with significant computational gains when compared to related models based on logistic link functions. To explore the utility of these models, we apply them to a subset of patients from the Duke-Durham patient cohort. We identified a cohort of over 12,000 patients with Type 2 Diabetes Mellitus (T2DM) based on diagnosis codes and laboratory tests out of our patient population of over 240,000. Examining the common hidden units uniting the PFA modules, we identify patient features that represent medical concepts. Experiments indicate that our learned features are better able to predict mortality and morbidity than clinical features identified previously in a large-scale clinical trial.

  15. Electronic health record analysis via deep poisson factor models

    DOE PAGES

    Henao, Ricardo; Lu, James T.; Lucas, Joseph E.; ...

    2016-01-01

    Electronic Health Record (EHR) phenotyping utilizes patient data captured through normal medical practice, to identify features that may represent computational medical phenotypes. These features may be used to identify at-risk patients and improve prediction of patient morbidity and mortality. We present a novel deep multi-modality architecture for EHR analysis (applicable to joint analysis of multiple forms of EHR data), based on Poisson Factor Analysis (PFA) modules. Each modality, composed of observed counts, is represented as a Poisson distribution, parameterized in terms of hidden binary units. Information from different modalities is shared via a deep hierarchy of common hidden units. Activation of these binary units occurs with probability characterized as Bernoulli-Poisson link functions, instead of more traditional logistic link functions. In addition, we demonstrate that PFA modules can be adapted to discriminative modalities. To compute model parameters, we derive efficient Markov Chain Monte Carlo (MCMC) inference that scales efficiently, with significant computational gains when compared to related models based on logistic link functions. To explore the utility of these models, we apply them to a subset of patients from the Duke-Durham patient cohort. We identified a cohort of over 12,000 patients with Type 2 Diabetes Mellitus (T2DM) based on diagnosis codes and laboratory tests out of our patient population of over 240,000. Examining the common hidden units uniting the PFA modules, we identify patient features that represent medical concepts. Experiments indicate that our learned features are better able to predict mortality and morbidity than clinical features identified previously in a large-scale clinical trial.

  16. Computational Fluid Dynamics Modeling of Symptomatic Intracranial Atherosclerosis May Predict Risk of Stroke Recurrence

    PubMed Central

    Leng, Xinyi; Scalzo, Fabien; Ip, Hing Lung; Johnson, Mark; Fong, Albert K.; Fan, Florence S. Y.; Chen, Xiangyan; Soo, Yannie O. Y.; Miao, Zhongrong; Liu, Liping; Feldmann, Edward; Leung, Thomas W. H.; Liebeskind, David S.; Wong, Ka Sing

    2014-01-01

    Background Patients with symptomatic intracranial atherosclerosis (ICAS) of ≥70% luminal stenosis are at high risk of stroke recurrence. We aimed to evaluate the relationships between hemodynamics of ICAS revealed by computational fluid dynamics (CFD) models and risk of stroke recurrence in this patient subset. Methods Patients with a symptomatic ICAS lesion of 70–99% luminal stenosis were screened and enrolled in this study. CFD models were reconstructed based on baseline computed tomographic angiography (CTA) source images, to reveal hemodynamics of the qualifying symptomatic ICAS lesions. Change of pressures across a lesion was represented by the ratio of post- and pre-stenotic pressures. Change of shear strain rates (SSR) across a lesion was represented by the ratio of SSRs at the stenotic throat and proximal normal vessel segment, similar for the change of flow velocities. Patients were followed up for 1 year. Results Overall, 32 patients (median age 65; 59.4% males) were recruited. The median pressure, SSR and velocity ratios for the ICAS lesions were 0.40 (−2.46–0.79), 4.5 (2.2–20.6), and 7.4 (5.2–12.5), respectively. SSR ratio (hazard ratio [HR] 1.027; 95% confidence interval [CI], 1.004–1.051; P = 0.023) and velocity ratio (HR 1.029; 95% CI, 1.002–1.056; P = 0.035) were significantly related to recurrent territorial ischemic stroke within 1 year by univariate Cox regression, respectively with the c-statistics of 0.776 (95% CI, 0.594–0.903; P = 0.014) and 0.776 (95% CI, 0.594–0.903; P = 0.002) in receiver operating characteristic analysis. Conclusions Hemodynamics of ICAS on CFD models reconstructed from routinely obtained CTA images may predict subsequent stroke recurrence in patients with a symptomatic ICAS lesion of 70–99% luminal stenosis. PMID:24818753

  17. Dose coefficients in pediatric and adult abdominopelvic CT based on 100 patient models.

    PubMed

    Tian, Xiaoyu; Li, Xiang; Segars, W Paul; Frush, Donald P; Paulson, Erik K; Samei, Ehsan

    2013-12-21

    Recent studies have shown the feasibility of estimating patient dose from a CT exam using CTDI(vol)-normalized-organ dose (denoted as h), DLP-normalized-effective dose (denoted as k), and DLP-normalized-risk index (denoted as q). However, previous studies were limited to a small number of phantom models. The purpose of this work was to provide dose coefficients (h, k, and q) across a large number of computational models covering a broad range of patient anatomy, age, size percentile, and gender. The study consisted of 100 patient computer models (age range, 0 to 78 y.o.; weight range, 2-180 kg) including 42 pediatric models (age range, 0 to 16 y.o.; weight range, 2-80 kg) and 58 adult models (age range, 18 to 78 y.o.; weight range, 57-180 kg). Multi-detector array CT scanners from two commercial manufacturers (LightSpeed VCT, GE Healthcare; SOMATOM Definition Flash, Siemens Healthcare) were included. A previously-validated Monte Carlo program was used to simulate organ dose for each patient model and each scanner, from which h, k, and q were derived. The relationships between h, k, and q and patient characteristics (size, age, and gender) were ascertained. The differences in conversion coefficients across the scanners were further characterized. CTDI(vol)-normalized-organ dose (h) showed an exponential decrease with increasing patient size. For organs within the image coverage, the average differences of h across scanners were less than 15%. That value increased to 29% for organs on the periphery or outside the image coverage, and to 8% for distributed organs, respectively. The DLP-normalized-effective dose (k) decreased exponentially with increasing patient size. For a given gender, the DLP-normalized-risk index (q) showed an exponential decrease with both increasing patient size and patient age. The average differences in k and q across scanners were 8% and 10%, respectively. This study demonstrated that the knowledge of patient information and CTDIvol/DLP values may be used to estimate organ dose, effective dose, and risk index in abdominopelvic CT based on the coefficients derived from a large population of pediatric and adult patients.

  18. Dose coefficients in pediatric and adult abdominopelvic CT based on 100 patient models

    NASA Astrophysics Data System (ADS)

    Tian, Xiaoyu; Li, Xiang; Segars, W. Paul; Frush, Donald P.; Paulson, Erik K.; Samei, Ehsan

    2013-12-01

    Recent studies have shown the feasibility of estimating patient dose from a CT exam using CTDIvol-normalized-organ dose (denoted as h), DLP-normalized-effective dose (denoted as k), and DLP-normalized-risk index (denoted as q). However, previous studies were limited to a small number of phantom models. The purpose of this work was to provide dose coefficients (h, k, and q) across a large number of computational models covering a broad range of patient anatomy, age, size percentile, and gender. The study consisted of 100 patient computer models (age range, 0 to 78 y.o.; weight range, 2-180 kg) including 42 pediatric models (age range, 0 to 16 y.o.; weight range, 2-80 kg) and 58 adult models (age range, 18 to 78 y.o.; weight range, 57-180 kg). Multi-detector array CT scanners from two commercial manufacturers (LightSpeed VCT, GE Healthcare; SOMATOM Definition Flash, Siemens Healthcare) were included. A previously-validated Monte Carlo program was used to simulate organ dose for each patient model and each scanner, from which h, k, and q were derived. The relationships between h, k, and q and patient characteristics (size, age, and gender) were ascertained. The differences in conversion coefficients across the scanners were further characterized. CTDIvol-normalized-organ dose (h) showed an exponential decrease with increasing patient size. For organs within the image coverage, the average differences of h across scanners were less than 15%. That value increased to 29% for organs on the periphery or outside the image coverage, and was 8% for distributed organs. The DLP-normalized-effective dose (k) decreased exponentially with increasing patient size. For a given gender, the DLP-normalized-risk index (q) showed an exponential decrease with both increasing patient size and patient age. The average differences in k and q across scanners were 8% and 10%, respectively. This study demonstrated that the knowledge of patient information and CTDIvol/DLP values may be used to estimate organ dose, effective dose, and risk index in abdominopelvic CT based on the coefficients derived from a large population of pediatric and adult patients.

  19. Using Computational Approaches to Improve Risk-Stratified Patient Management: Rationale and Methods

    PubMed Central

    Stone, Bryan L; Sakaguchi, Farrant; Sheng, Xiaoming; Murtaugh, Maureen A

    2015-01-01

    Background Chronic diseases affect 52% of Americans and consume 86% of health care costs. A small portion of patients consume most health care resources and costs. More intensive patient management strategies, such as case management, are usually more effective at improving health outcomes, but are also more expensive. To use limited resources efficiently, risk stratification is commonly used in managing patients with chronic diseases, such as asthma, chronic obstructive pulmonary disease, diabetes, and heart disease. Patients are stratified based on predicted risk with patients at higher risk given more intensive care. The current risk-stratified patient management approach has 3 limitations resulting in many patients not receiving the most appropriate care, unnecessarily increased costs, and suboptimal health outcomes. First, using predictive models for health outcomes and costs is currently the best method for forecasting individual patient’s risk. Yet, accuracy of predictive models remains poor causing many patients to be misstratified. If an existing model were used to identify candidate patients for case management, enrollment would miss more than half of those who would benefit most, but include others unlikely to benefit, wasting limited resources. Existing models have been developed under the assumption that patient characteristics primarily influence outcomes and costs, leaving physician characteristics out of the models. In reality, both characteristics have an impact. Second, existing models usually give neither an explanation why a particular patient is predicted to be at high risk nor suggestions on interventions tailored to the patient’s specific case. As a result, many high-risk patients miss some suitable interventions. Third, thresholds for risk strata are suboptimal and determined heuristically with no quality guarantee. Objective The purpose of this study is to improve risk-stratified patient management so that more patients will receive the most appropriate care. Methods This study will (1) combine patient, physician profile, and environmental variable features to improve prediction accuracy of individual patient health outcomes and costs; (2) develop the first algorithm to explain prediction results and suggest tailored interventions; (3) develop the first algorithm to compute optimal thresholds for risk strata; and (4) conduct simulations to estimate outcomes of risk-stratified patient management for various configurations. The proposed techniques will be demonstrated on a test case of asthma patients. Results We are currently in the process of extracting clinical and administrative data from an integrated health care system’s enterprise data warehouse. We plan to complete this study in approximately 5 years. Conclusions Methods developed in this study will help transform risk-stratified patient management for better clinical outcomes, higher patient satisfaction and quality of life, reduced health care use, and lower costs. PMID:26503357
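
    A minimal sketch of the threshold-based stratification the protocol describes (predicted risk mapped to care intensity) follows; the thresholds, labels, and risk scores are hypothetical, and the study's own contribution is an algorithm for choosing such thresholds optimally rather than heuristically.

      # Illustrative sketch only: assign care intensity from predicted risk using
      # fixed strata thresholds. Thresholds and risk scores are hypothetical.
      import numpy as np

      predicted_risk = np.array([0.02, 0.15, 0.40, 0.75, 0.08])  # model output per patient
      thresholds = [0.10, 0.30, 0.60]                            # strata boundaries (assumed)
      labels = ["routine care", "low-intensity follow-up",
                "nurse outreach", "case management"]

      strata = np.digitize(predicted_risk, thresholds)
      for risk, s in zip(predicted_risk, strata):
          print(f"risk={risk:.2f} -> {labels[s]}")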

  20. Occupational risk factors have to be considered in the definition of high-risk lung cancer populations.

    PubMed

    Wild, P; Gonzalez, M; Bourgkard, E; Courouble, N; Clément-Duchêne, C; Martinet, Y; Févotte, J; Paris, C

    2012-03-27

    The aim of this study was to compute attributable fractions (AF) to occupational factors in an area in North-Eastern France with high lung cancer rates and a past of mining and steel industry. A population-based case-control study among males aged 40-79 was conducted, including confirmed primary lung cancer cases from all hospitals of the study region. Controls were stratified by broad age-classes, district and socioeconomic classes. Detailed occupational and personal risk factors were obtained in face-to-face interviews. Cumulative occupational exposure indices were obtained from the questionnaires. Attributable fractions were computed from multiple unconditional logistic regression models. A total of 246 cases and 531 controls were included. The odds ratios (ORs) adjusted for cumulative smoking and family history of lung cancer increased significantly with the cumulative exposure indices to asbestos, polycyclic aromatic hydrocarbons and crystalline silica, and with exposure to diesel motor exhaust. The AF for occupational factors exceeded 50%, the most important contributors being crystalline silica and asbestos. These AFs are higher than most published figures, which may be due to the highly industrialised study area or to the methods used for exposure assessment. Occupational factors are important risk factors and should not be forgotten when defining high-risk lung cancer populations.
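
    The attributable-fraction calculation can be sketched with Miettinen's case-based formula, using the adjusted odds ratio as a rate-ratio approximation; the numerical inputs below are hypothetical and the exact formula used by the authors is not stated in the record.

      # Sketch of an attributable-fraction calculation in the spirit described above
      # (Miettinen's case-based formula). Prevalence and OR values are hypothetical.
      def attributable_fraction(or_adj: float, exposed_cases_prop: float) -> float:
          """AF = p_c * (OR - 1) / OR, where p_c is exposure prevalence among cases."""
          return exposed_cases_prop * (or_adj - 1.0) / or_adj

      print(attributable_fraction(or_adj=2.0, exposed_cases_prop=0.35))  # -> 0.175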

  1. Investigating Impact Metrics for Performance for the US EPA National Center for Computational Toxicology (ACS Fall meeting)

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) Computational Toxicology Program integrates advances in biology, chemistry, and computer science to help prioritize chemicals for further research based on potential human health risks. This work involves computational and data drive...

  2. Mixed-field GCR Simulations for Radiobiological Research using Ground Based Accelerators

    NASA Astrophysics Data System (ADS)

    Kim, Myung-Hee Y.; Rusek, Adam; Cucinotta, Francis

    Space radiation is comprised of a large number of particle types and energies, which have differential ionization power from high energy protons to high charge and energy (HZE) particles and secondary neutrons produced by galactic cosmic rays (GCR). Ground based accelerators such as the NASA Space Radiation Laboratory (NSRL) at Brookhaven National Laboratory (BNL) are used to simulate space radiation for radiobiology research and dosimetry, electronics parts, and shielding testing using mono-energetic beams for single ion species. As a tool to support research on new risk assessment models, we have developed a stochastic model of heavy ion beams and space radiation effects, the GCR Event-based Risk Model computer code (GERMcode). For radiobiological research on mixed-field space radiation, a new GCR simulator at NSRL is proposed. The NSRL-GCR simulator, which implements the rapid switching mode and the higher energy beam extraction to 1.5 GeV/u, can integrate multiple ions into a single simulation to create GCR Z-spectrum in major energy bins. After considering the GCR environment and energy limitations of NSRL, a GCR reference field is proposed after extensive simulation studies using the GERMcode. The GCR reference field is shown to reproduce the Z and LET spectra of GCR behind shielding within 20 percent accuracy compared to simulated full GCR environments behind shielding. A major challenge for space radiobiology research is to consider chronic GCR exposure of up to 3-years in relation to simulations with cell and animal models of human risks. We discuss possible approaches to map important biological time scales in experimental models using ground-based simulation with extended exposure of up to a few weeks and fractionation approaches at a GCR simulator.

  3. Mixed-field GCR Simulations for Radiobiological Research Using Ground Based Accelerators

    NASA Technical Reports Server (NTRS)

    Kim, Myung-Hee Y.; Rusek, Adam; Cucinotta, Francis A.

    2014-01-01

    Space radiation is comprised of a large number of particle types and energies, which have differential ionization power from high energy protons to high charge and energy (HZE) particles and secondary neutrons produced by galactic cosmic rays (GCR). Ground based accelerators such as the NASA Space Radiation Laboratory (NSRL) at Brookhaven National Laboratory (BNL) are used to simulate space radiation for radiobiology research and dosimetry, electronics parts, and shielding testing using mono-energetic beams for single ion species. As a tool to support research on new risk assessment models, we have developed a stochastic model of heavy ion beams and space radiation effects, the GCR Event-based Risk Model computer code (GERMcode). For radiobiological research on mixed-field space radiation, a new GCR simulator at NSRL is proposed. The NSRL-GCR simulator, which implements the rapid switching mode and the higher energy beam extraction to 1.5 GeV/u, can integrate multiple ions into a single simulation to create GCR Z-spectrum in major energy bins. After considering the GCR environment and energy limitations of NSRL, a GCR reference field is proposed after extensive simulation studies using the GERMcode. The GCR reference field is shown to reproduce the Z and LET spectra of GCR behind shielding within 20% accuracy compared to simulated full GCR environments behind shielding. A major challenge for space radiobiology research is to consider chronic GCR exposure of up to 3-years in relation to simulations with cell and animal models of human risks. We discuss possible approaches to map important biological time scales in experimental models using ground-based simulation with extended exposure of up to a few weeks and fractionation approaches at a GCR simulator.

  4. A simple computational algorithm of model-based choice preference.

    PubMed

    Toyama, Asako; Katahira, Kentaro; Ohira, Hideki

    2017-08-01

    A broadly used computational framework posits that two learning systems operate in parallel during the learning of choice preferences-namely, the model-free and model-based reinforcement-learning systems. In this study, we examined another possibility, through which model-free learning is the basic system and model-based information is its modulator. Accordingly, we proposed several modified versions of a temporal-difference learning model to explain the choice-learning process. Using the two-stage decision task developed by Daw, Gershman, Seymour, Dayan, and Dolan (2011), we compared their original computational model, which assumes a parallel learning process, and our proposed models, which assume a sequential learning process. Choice data from 23 participants showed a better fit with the proposed models. More specifically, the proposed eligibility adjustment model, which assumes that the environmental model can weight the degree of the eligibility trace, can explain choices better under both model-free and model-based controls and has a simpler computational algorithm than the original model. In addition, the forgetting learning model and its variation, which assume changes in the values of unchosen actions, substantially improved the fits to the data. Overall, we show that a hybrid computational model best fits the data. The parameters used in this model succeed in capturing individual tendencies with respect to both model use in learning and exploration behavior. This computational model provides novel insights into learning with interacting model-free and model-based components.
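
    A highly simplified stand-in for the eligibility-adjustment idea is sketched below: a temporal-difference update with eligibility traces in which a weight scales how strongly earlier choices are credited. It is not the authors' model; the weight, all parameters, and the trial structure are illustrative.

      # Minimal sketch (not the paper's exact model): TD learning with eligibility
      # traces, where a weight 'w' scales credit to earlier choices -- a simplified
      # stand-in for the "eligibility adjustment" idea. Parameters are hypothetical.
      import numpy as np

      n_states, n_actions = 3, 2
      Q = np.zeros((n_states, n_actions))
      e = np.zeros_like(Q)                  # eligibility traces
      alpha, gamma, lam, w = 0.1, 1.0, 0.9, 0.5

      def td_step(s, a, r, s_next, a_next, terminal=False):
          global e
          target = r if terminal else r + gamma * Q[s_next, a_next]
          delta = target - Q[s, a]
          e *= gamma * lam * w              # decay traces, scaled by the eligibility weight
          e[s, a] += 1.0
          Q[...] += alpha * delta * e

      # one illustrative two-stage trial: first-stage choice, then second-stage reward
      td_step(s=0, a=1, r=0.0, s_next=1, a_next=0)
      td_step(s=1, a=0, r=1.0, s_next=0, a_next=0, terminal=True)
      print(Q)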

  5. History of EPI Suite™ and future perspectives on chemical property estimation in US Toxic Substances Control Act new chemical risk assessments.

    PubMed

    Card, Marcella L; Gomez-Alvarez, Vicente; Lee, Wen-Hsiung; Lynch, David G; Orentas, Nerija S; Lee, Mari Titcombe; Wong, Edmund M; Boethling, Robert S

    2017-03-22

    Chemical property estimation is a key component in many industrial, academic, and regulatory activities, including in the risk assessment associated with the approximately 1000 new chemical pre-manufacture notices the United States Environmental Protection Agency (US EPA) receives annually. The US EPA evaluates fate, exposure and toxicity under the 1976 Toxic Substances Control Act (amended by the 2016 Frank R. Lautenberg Chemical Safety for the 21st Century Act), which does not require test data with new chemical applications. Though the submission of data is not required, the US EPA has, over the past 40 years, occasionally received chemical-specific data with pre-manufacture notices. The US EPA has been actively using this and publicly available data to develop and refine predictive computerized models, most of which are housed in EPI Suite™, to estimate chemical properties used in the risk assessment of new chemicals. The US EPA develops and uses models based on (quantitative) structure-activity relationships ([Q]SARs) to estimate critical parameters. As in any evolving field, (Q)SARs have experienced successes, suffered failures, and responded to emerging trends. Correlations of a chemical structure with its properties or biological activity were first demonstrated in the late 19th century and today have been encapsulated in a myriad of quantitative and qualitative SARs. The development and proliferation of the personal computer in the late 20th century gave rise to a quickly increasing number of property estimation models, and continually improved computing power and connectivity among researchers via the internet are enabling the development of increasingly complex models.

  6. THE FUTURE OF COMPUTER-BASED TOXICITY PREDICTION: MECHANISM-BASED MODELS VS. INFORMATION MINING APPROACHES

    EPA Science Inventory


    The Future of Computer-Based Toxicity Prediction:
    Mechanism-Based Models vs. Information Mining Approaches

    When we speak of computer-based toxicity prediction, we are generally referring to a broad array of approaches which rely primarily upon chemical structure ...

  7. BOADICEA breast cancer risk prediction model: updates to cancer incidences, tumour pathology and web interface

    PubMed Central

    Lee, A J; Cunningham, A P; Kuchenbaecker, K B; Mavaddat, N; Easton, D F; Antoniou, A C

    2014-01-01

    Background: The Breast and Ovarian Analysis of Disease Incidence and Carrier Estimation Algorithm (BOADICEA) is a risk prediction model that is used to compute probabilities of carrying mutations in the high-risk breast and ovarian cancer susceptibility genes BRCA1 and BRCA2, and to estimate the future risks of developing breast or ovarian cancer. In this paper, we describe updates to the BOADICEA model that extend its capabilities, make it easier to use in a clinical setting and yield more accurate predictions. Methods: We describe: (1) updates to the statistical model to include cancer incidences from multiple populations; (2) updates to the distributions of tumour pathology characteristics using new data on BRCA1 and BRCA2 mutation carriers and women with breast cancer from the general population; (3) improvements to the computational efficiency of the algorithm so that risk calculations now run substantially faster; and (4) updates to the model's web interface to accommodate these new features and to make it easier to use in a clinical setting. Results: We present results derived using the updated model, and demonstrate that the changes have a significant impact on risk predictions. Conclusion: All updates have been implemented in a new version of the BOADICEA web interface that is now available for general use: http://ccge.medschl.cam.ac.uk/boadicea/. PMID:24346285

  8. A Hydrological Modeling Framework for Flood Risk Assessment for Japan

    NASA Astrophysics Data System (ADS)

    Ashouri, H.; Chinnayakanahalli, K.; Chowdhary, H.; Sen Gupta, A.

    2016-12-01

    Flooding has been the most frequent natural disaster that claims lives and imposes significant economic losses to human societies worldwide. Japan, with an annual rainfall of up to approximately 4000 mm is extremely vulnerable to flooding. The focus of this research is to develop a macroscale hydrologic model for simulating flooding toward an improved understanding and assessment of flood risk across Japan. The framework employs a conceptual hydrological model, known as the Probability Distributed Model (PDM), as well as the Muskingum-Cunge flood routing procedure for simulating streamflow. In addition, a Temperature-Index model is incorporated to account for snowmelt and its contribution to streamflow. For an efficient calibration of the model, in terms of computational timing and convergence of the parameters, a set of A Priori parameters is obtained based on the relationships between the model parameters and the physical properties of watersheds. In this regard, we have implemented a particle tracking algorithm and a statistical model which use high resolution Digital Terrain Models to estimate different time related parameters of the model such as time to peak of the unit hydrograph. In addition, global soil moisture and depth data are used to generate A Priori estimation of maximum soil moisture capacity, an important parameter of the PDM model. Once the model is calibrated, its performance is examined during the Typhoon Nabi which struck Japan in September 2005 and caused severe flooding throughout the country. The model is also validated for the extreme precipitation event in 2012 which affected Kyushu. In both cases, quantitative measures show that simulated streamflow depicts good agreement with gauge-based observations. The model is employed to simulate thousands of possible flood events for the entire Japan which makes a basis for a comprehensive flood risk assessment and loss estimation for the flood insurance industry.
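
    The snowmelt component can be illustrated with a standard temperature-index (degree-day) relation, melt = DDF·(T - T0) capped by the available snow water; the degree-day factor and threshold temperature below are assumed values, not those calibrated in the framework.

      # Sketch of a temperature-index (degree-day) snowmelt term like the one the
      # framework couples to the PDM model. DDF and threshold are assumed values.
      def daily_snowmelt(temp_c: float, swe_mm: float,
                         ddf_mm_per_degc_day: float = 3.0, t_thresh_c: float = 0.0) -> float:
          """Melt (mm/day) = DDF * (T - T_threshold), capped by available snow water."""
          potential = ddf_mm_per_degc_day * max(temp_c - t_thresh_c, 0.0)
          return min(potential, swe_mm)

      print(daily_snowmelt(temp_c=4.0, swe_mm=50.0))  # 12.0 mm of melt contributed to runoff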

  9. A risk assessment framework for irrigated agriculture under climate change

    NASA Astrophysics Data System (ADS)

    Ronco, P.; Zennaro, F.; Torresan, S.; Critto, A.; Santini, M.; Trabucco, A.; Zollo, A. L.; Galluccio, G.; Marcomini, A.

    2017-12-01

    In several regions, but especially in semi-arid areas, the rising frequency, duration and intensity of drought events, mainly driven by climate change dynamics, are expected to dramatically reduce the current stocks of freshwater resources, limiting crop development and yield especially where agriculture largely depends on irrigation. The achievement of an affordable and sustainable equilibrium between available water resources and irrigation demand is essentially related to the planning and implementation of evidence-based adaptation strategies and actions. The present study proposed a state-of-the-art conceptual framework and computational methodology to assess the potential water scarcity risk, due to changes in climate trends and variability, on irrigated croplands. The model has been tested over the irrigated agriculture of Puglia Region, a semi-arid territory with the largest agricultural production in Southern Italy. The methodology, based on the Regional Risk Assessment (RRA) approach, has been applied within a scenario-based hazard framework. Regional climate projections, under alternative greenhouse gas concentration scenarios (RCP4.5 and RCP8.5) and for two different timeframes, 2021-2050 and 2041-2070 compared to the baseline 1976-2005 period, have been used to drive hydrological simulations of river inflow to the most important reservoirs serving irrigation purposes in Puglia. The novelty of the proposed RRA-based approach does not simply rely on the concept of risk as a combination of hazard, exposure and vulnerability, but rather elaborates detailed (scientific and conceptual) framing and computational description of these factors, to produce risk spatial pattern maps and related statistics distinguishing the most critical areas (risk hot spots). The application supported the identification of the most affected areas (i.e. Capitanata Reclamation Consortia under RCP8.5 2041-2070 scenario), crops (fruit trees and vineyards), and, finally, the vulnerability pattern of irrigation systems and networks. The implemented assessment singled out future perspectives of water scarcity risk levels for irrigated agriculture by the administrative extent where individual bodies are in charge of the coordination of public and private irrigation activities (i.e. Reclamation Consortia). Based on the outcomes of the proposed methodology, tailored and knowledge-based adaptation strategies and related actions can be developed, to reduce the risk at both agronomic level (i.e. preferring crops with low vulnerability score, such as olive groves) and at structural level (i.e. differentiating the water stocks and supplies and reducing losses and inefficiencies).
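
    A heavily simplified sketch of combining hazard, exposure, and vulnerability layers into a risk map is given below; the geometric aggregation and all grid values are assumptions for illustration, whereas the RRA methodology itself uses a more detailed, spatially explicit scoring.

      # Heavily simplified sketch of "risk = f(hazard, exposure, vulnerability)".
      # Aggregation rule and grid values are hypothetical, not the RRA scoring.
      import numpy as np

      hazard        = np.array([[0.8, 0.3], [0.6, 0.1]])   # normalized 0-1 per grid cell
      exposure      = np.array([[0.9, 0.5], [0.4, 0.2]])
      vulnerability = np.array([[0.7, 0.6], [0.5, 0.3]])

      risk = (hazard * exposure * vulnerability) ** (1.0 / 3.0)  # geometric aggregation (one common choice)
      print(np.round(risk, 2))   # higher values flag candidate "risk hot spots"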

  10. A review and comparison of Bayesian and likelihood-based inferences in beta regression and zero-or-one-inflated beta regression.

    PubMed

    Liu, Fang; Eugenio, Evercita C

    2018-04-01

    Beta regression is an increasingly popular statistical technique in medical research for modeling of outcomes that assume values in (0, 1), such as proportions and patient reported outcomes. When outcomes take values in the intervals [0,1), (0,1], or [0,1], zero-or-one-inflated beta (zoib) regression can be used. We provide a thorough review on beta regression and zoib regression in the modeling, inferential, and computational aspects via the likelihood-based and Bayesian approaches. We demonstrate the statistical and practical importance of correctly modeling the inflation at zero/one rather than ad hoc replacing them with values close to zero/one via simulation studies; the latter approach can lead to biased estimates and invalid inferences. We show via simulation studies that the likelihood-based approach is computationally faster in general than MCMC algorithms used in the Bayesian inferences, but runs the risk of non-convergence, large biases, and sensitivity to starting values in the optimization algorithm especially with clustered/correlated data, data with sparse inflation at zero and one, and data that warrant regularization of the likelihood. The disadvantages of the regular likelihood-based approach make the Bayesian approach an attractive alternative in these cases. Software packages and tools for fitting beta and zoib regressions in both the likelihood-based and Bayesian frameworks are also reviewed.
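
    The zoib construction can be written down directly: point masses at 0 and 1 plus a beta density on the open interval. The sketch below evaluates that mixed density with scipy; all parameter values are illustrative.

      # Minimal sketch of a zero-or-one-inflated beta (zoib) mixed density on [0, 1]:
      # point masses p0 at 0 and p1 at 1, and a Beta(a, b) density on (0, 1)
      # carrying the remaining probability. Parameter values are illustrative.
      from scipy.stats import beta

      def zoib_pdf(y, p0, p1, a, b):
          if y == 0.0:
              return p0                      # probability mass at zero
          if y == 1.0:
              return p1                      # probability mass at one
          return (1.0 - p0 - p1) * beta.pdf(y, a, b)

      for y in (0.0, 0.25, 0.9, 1.0):
          print(y, zoib_pdf(y, p0=0.10, p1=0.05, a=2.0, b=5.0))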

  11. Earthquake Hazard and Risk in Alaska

    NASA Astrophysics Data System (ADS)

    Black Porto, N.; Nyst, M.

    2014-12-01

    Alaska is one of the most seismically active and tectonically diverse regions in the United States. To examine risk, we have updated the seismic hazard model in Alaska. The current RMS Alaska hazard model is based on the 2007 probabilistic seismic hazard maps for Alaska (Wesson et al., 2007; Boyd et al., 2007). The 2015 RMS model will update several key source parameters, including: extending the earthquake catalog, implementing a new set of crustal faults, updating the subduction zone geometry and recurrence rate. First, we extend the earthquake catalog to 2013, decluster the catalog, and compute new background rates. We then create a crustal fault model, based on the Alaska 2012 fault and fold database. This new model increased the number of crustal faults from ten in 2007, to 91 faults in the 2015 model. This includes the addition of: the western Denali, Cook Inlet folds near Anchorage, and thrust faults near Fairbanks. Previously the subduction zone was modeled at a uniform depth. In this update, we model the intraslab as a series of deep stepping events. We also use the best available data, such as Slab 1.0, to update the geometry of the subduction zone. The city of Anchorage represents 80% of the risk exposure in Alaska. In the 2007 model, the hazard in Alaska was dominated by the frequent rate of magnitude 7 to 8 events (Gutenberg-Richter distribution), and large magnitude 8+ events had a low recurrence rate (Characteristic) and therefore did not contribute as much to the overall risk. We will review these recurrence rates, and will present the results and impact to Anchorage. We will compare our hazard update to the 2007 USGS hazard map, and discuss the changes and drivers for these changes. Finally, we will examine the impact model changes have on Alaska earthquake risk. Risk metrics considered include average annual loss, an annualized expected loss level used by insurers to determine the costs of earthquake insurance (and premium levels), and the loss exceedance probability curve used by insurers to address their solvency and manage their portfolio risk. We analyze risk profile changes in areas with large population density and for structures of economic and financial importance: the Trans-Alaska pipeline, industrial facilities in Valdez, and typical residential wood buildings in Anchorage, Fairbanks and Juneau.
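
    The Gutenberg-Richter recurrence relation referenced above, log10 N(M) = a - b·M, can be evaluated as below; the a- and b-values are placeholders, not those of the Alaska model.

      # Sketch of the Gutenberg-Richter recurrence relation: log10 N(M) = a - b*M,
      # where N(M) is the annual rate of events of magnitude >= M.
      # The a- and b-values below are hypothetical.
      def annual_rate(magnitude: float, a_value: float = 4.5, b_value: float = 1.0) -> float:
          return 10.0 ** (a_value - b_value * magnitude)

      for m in (7.0, 8.0):
          rate = annual_rate(m)
          print(f"M>={m}: {rate:.4f}/yr, mean recurrence ~{1.0 / rate:.0f} yr")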

  12. GPU-based RFA simulation for minimally invasive cancer treatment of liver tumours.

    PubMed

    Mariappan, Panchatcharam; Weir, Phil; Flanagan, Ronan; Voglreiter, Philip; Alhonnoro, Tuomas; Pollari, Mika; Moche, Michael; Busse, Harald; Futterer, Jurgen; Portugaller, Horst Rupert; Sequeiros, Roberto Blanco; Kolesnik, Marina

    2017-01-01

    Radiofrequency ablation (RFA) is one of the most popular and well-standardized minimally invasive cancer treatments (MICT) for liver tumours, employed where surgical resection has been contraindicated. Less-experienced interventional radiologists (IRs) require an appropriate planning tool for the treatment to help avoid incomplete treatment and so reduce the tumour recurrence risk. Although a few tools are available to predict the ablation lesion geometry, the process is computationally expensive. Also, in our implementation, a few patient-specific parameters are used to improve the accuracy of the lesion prediction. Advanced heterogeneous computing using personal computers, incorporating the graphics processing unit (GPU) and the central processing unit (CPU), is proposed to predict the ablation lesion geometry. The most recent GPU technology is used to accelerate the finite element approximation of Pennes' bioheat equation and a three-state cell model. Patient-specific input parameters are used in the bioheat model to improve accuracy of the predicted lesion. A fast GPU-based RFA solver is developed to predict the lesion by doing most of the computational tasks in the GPU, while reserving the CPU for concurrent tasks such as lesion extraction based on the heat deposition at each finite element node. The solver takes less than 3 min for a treatment duration of 26 min. When the model receives patient-specific input parameters, the deviation between real and predicted lesion is below 3 mm. A multi-centre retrospective study indicates that the fast RFA solver is capable of providing the IR with the predicted lesion in the short time period before the intervention begins when the patient has been clinically prepared for the treatment.
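
    For orientation, a very small 1-D explicit finite-difference version of Pennes' bioheat equation is sketched below. It is not the GPU finite-element solver; the tissue properties, perfusion term, and heat source are rough assumed values.

      # 1-D explicit finite-difference sketch of Pennes' bioheat equation:
      # rho*c*dT/dt = k*d2T/dx2 + w_b*rho_b*c_b*(T_a - T) + Q.
      # Properties and the heat source are illustrative, not the RFA solver's inputs.
      import numpy as np

      nx, dx, dt, steps = 51, 1e-3, 0.05, 200          # 5 cm domain, 10 s of heating
      rho, c, k = 1060.0, 3600.0, 0.5                  # tissue density, heat capacity, conductivity
      wb, rho_b, cb, Ta = 0.004, 1060.0, 3600.0, 37.0  # perfusion term parameters (assumed)
      Q = np.zeros(nx)
      Q[nx // 2] = 5e6                                 # localized RF heat source (W/m^3), assumed

      T = np.full(nx, 37.0)
      for _ in range(steps):
          lap = np.zeros(nx)
          lap[1:-1] = (T[2:] - 2 * T[1:-1] + T[:-2]) / dx**2
          T = T + dt / (rho * c) * (k * lap + wb * rho_b * cb * (Ta - T) + Q)
          T[0] = T[-1] = 37.0                          # fixed body-temperature boundaries

      print(f"peak temperature after {steps * dt:.0f} s: {T.max():.1f} C")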

  13. Trade-space Analysis for Constellations

    NASA Astrophysics Data System (ADS)

    Le Moigne, J.; Dabney, P.; de Weck, O. L.; Foreman, V.; Grogan, P.; Holland, M. P.; Hughes, S. P.; Nag, S.

    2016-12-01

    Traditionally, space missions have relied on relatively large and monolithic satellites, but in the past few years, under a changing technological and economic environment, including instrument and spacecraft miniaturization, scalable launchers, secondary launches as well as hosted payloads, there is growing interest in implementing future NASA missions as Distributed Spacecraft Missions (DSM). The objective of our project is to provide a framework that facilitates DSM Pre-Phase A investigations and optimizes DSM designs with respect to a priori science goals. In this first version of our Trade-space Analysis Tool for Constellations (TAT-C), we are investigating questions such as: "How many spacecraft should be included in the constellation? Which design has the best cost/risk value?" The main goals of TAT-C are to: handle multiple spacecraft sharing a mission objective, from SmallSats up through flagships; explore the variables trade space for pre-defined science, cost, and risk goals and pre-defined metrics; and optimize cost and performance across multiple instruments and platforms rather than one at a time. This paper describes the overall architecture of TAT-C including: a User Interface (UI) interacting with multiple users - scientists, mission designers or program managers; an Executive Driver gathering requirements from the UI, then formulating Trade-space Search Requests for the Trade-space Search Iterator first with inputs from the Knowledge Base, then, in collaboration with the Orbit & Coverage, Reduction & Metrics, and Cost & Risk modules, generating multiple potential architectures and their associated characteristics. TAT-C leverages the use of the Goddard Mission Analysis Tool (GMAT) to compute coverage and ancillary data, streamlining the computations by modeling orbits in a way that balances accuracy and performance. The current version of TAT-C includes uniform Walker constellations as well as ad hoc constellations, and its cost model represents an aggregate model consisting of Cost Estimating Relationships (CERs) from widely accepted models. The Knowledge Base supports both analysis and exploration, and the current GUI prototype automatically generates graphics representing metrics such as average revisit time or coverage as a function of cost.

  14. Predictive Models and Computational Toxicology

    EPA Science Inventory

    Understanding the potential health risks posed by environmental chemicals is a significant challenge elevated by the large number of diverse chemicals with generally uncharacterized exposures, mechanisms, and toxicities. The ToxCast computational toxicology research program was l...

  15. A Computational Model to Simulate Groundwater Seepage Risk in Support of Geotechnical Investigations of Levee and Dam Projects

    DTIC Science & Technology

    2013-03-01

    Allen 1974, 1978; Bridge and Leeder 1979; Mackey and Bridge 1992) that computes synthetic stratigraphy for a floodplain cross section. The model...typical of that used to record and communicate geologic information for engineering applications. The computed stratigraphy differentiates between...belt dimensions measured for two well-studied river systems: (A) the Linge River within the Rhine-Meuse Delta, Netherlands, and (B) the Lower

  16. The feasibility of universal DLP-to-risk conversion coefficients for body CT protocols

    NASA Astrophysics Data System (ADS)

    Li, Xiang; Samei, Ehsan; Segars, W. Paul; Paulson, Erik K.; Frush, Donald P.

    2011-03-01

    The effective dose associated with computed tomography (CT) examinations is often estimated from dose-length product (DLP) using scanner-independent conversion coefficients. Such conversion coefficients are available for a small number of examinations, each covering an entire region of the body (e.g., head, neck, chest, abdomen and/or pelvis). Similar conversion coefficients, however, do not exist for examinations that cover a single organ or a sub-region of the body, as in the case of a multi-phase liver examination. In this study, we extended the DLP-to-effective dose conversion coefficient (k factor) to a wide range of body CT protocols and derived the corresponding DLP-to-cancer risk conversion coefficient (q factor). An extended cardiac-torso (XCAT) computational model was used, which represented a reference adult male patient. A range of body CT protocols used in clinical practice were categorized into 10 protocol classes based on the anatomical regions examined. A validated Monte Carlo program was used to estimate the organ dose associated with each protocol class. Assuming the reference model to be 20 years old, effective dose and risk index (an index of the total risk for cancer incidence) were then calculated and normalized by DLP to obtain the k and q factors. The k and q factors varied across protocol classes; the coefficients of variation were 28% and 9%, respectively. The small variation exhibited by the q factor suggested the feasibility of universal q factors for a wide range of body CT protocols.
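
    The normalization and variability summary can be sketched as below: k = effective dose / DLP, q = risk index / DLP, and a coefficient of variation across protocol classes; the numbers are invented for illustration.

      # Sketch of the normalization and variability summary described above.
      # DLP, effective dose, and risk index values are hypothetical.
      import numpy as np

      dlp            = np.array([400.0, 650.0, 900.0])   # mGy*cm per protocol class
      effective_dose = np.array([6.0, 9.5, 13.0])        # mSv
      risk_index     = np.array([0.05, 0.08, 0.11])      # illustrative risk-index units

      k = effective_dose / dlp                            # DLP-to-effective-dose factor
      q = risk_index / dlp                                # DLP-to-risk factor
      cv = lambda x: x.std(ddof=1) / x.mean()             # coefficient of variation
      print(f"CV of k: {cv(k):.1%}, CV of q: {cv(q):.1%}")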

  17. Scope Complexity Options Risks Excursions (SCORE) Factor Mathematical Description.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gearhart, Jared Lee; Samberson, Jonell Nicole; Shettigar, Subhasini

    The purpose of the Scope, Complexity, Options, Risks, Excursions (SCORE) model is to estimate the relative complexity of design variants of future warhead options, resulting in scores. SCORE factors extend this capability by providing estimates of complexity relative to a base system (i.e., all design options are normalized to one weapon system). First, a clearly defined set of scope elements for a warhead option is established. The complexity of each scope element is estimated by Subject Matter Experts (SMEs), including a level of uncertainty, relative to a specific reference system. When determining factors, complexity estimates for a scope element can be directly tied to the base system or chained together via comparable scope elements in a string of reference systems that ends with the base system. The SCORE analysis process is a growing multi-organizational Nuclear Security Enterprise (NSE) effort, under the management of the NA-12 led Enterprise Modeling and Analysis Consortium (EMAC). Historically, it has provided the data elicitation, integration, and computation needed to support the out-year Life Extension Program (LEP) cost estimates included in the Stockpile Stewardship Management Plan (SSMP).
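
    The chaining of relative-complexity estimates back to the base system can be illustrated as a product of pairwise ratios along the reference chain; the ratios below are made up, and the actual SCORE mathematics also carries uncertainty estimates not shown here.

      # Illustrative sketch of chaining relative-complexity estimates to a base
      # system: the factor is the product of pairwise ratios along the chain.
      # The ratios are invented; SCORE also propagates SME uncertainty, omitted here.
      from math import prod

      # complexity of the element relative to the next system in the chain
      chain_ratios = [1.4, 0.9, 1.2]   # option vs ref A, ref A vs ref B, ref B vs base
      factor_vs_base = prod(chain_ratios)
      print(round(factor_vs_base, 3))  # 1.512, i.e. ~51% more complex than the base system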

  18. Ionospheric Slant Total Electron Content Analysis Using Global Positioning System Based Estimation

    NASA Technical Reports Server (NTRS)

    Komjathy, Attila (Inventor); Mannucci, Anthony J. (Inventor); Sparks, Lawrence C. (Inventor)

    2017-01-01

    A method, system, apparatus, and computer program product provide the ability to analyze ionospheric slant total electron content (TEC) using global navigation satellite systems (GNSS)-based estimation. Slant TEC is estimated for a given set of raypath geometries by fitting historical GNSS data to a specified delay model. The accuracy of the specified delay model is estimated by computing delay estimate residuals and plotting a behavior of the delay estimate residuals. An ionospheric threat model is computed based on the specified delay model. Ionospheric grid delays (IGDs) and grid ionospheric vertical errors (GIVEs) are computed based on the ionospheric threat model.
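
    The fit-and-residual step can be sketched with an ordinary least-squares fit of slant TEC observations to a simple planar delay model; the model form, coordinates, and synthetic data below are assumptions for illustration, not the patented delay model.

      # Sketch of "fit historical data to a delay model, then inspect residuals".
      # A planar vertical-delay model over local coordinates is assumed purely for
      # illustration; the patent's actual delay model is richer.
      import numpy as np

      rng = np.random.default_rng(0)
      lat, lon = rng.uniform(-5, 5, 200), rng.uniform(-5, 5, 200)   # pierce-point offsets (deg, synthetic)
      tec = 20.0 + 0.8 * lat - 0.3 * lon + rng.normal(0, 0.5, 200)  # synthetic TEC observations (TECU)

      A = np.column_stack([np.ones_like(lat), lat, lon])            # model: c0 + c1*lat + c2*lon
      coeffs, *_ = np.linalg.lstsq(A, tec, rcond=None)
      residuals = tec - A @ coeffs

      print("fitted coefficients:", np.round(coeffs, 2))
      print("residual std (TECU):", round(residuals.std(ddof=3), 2))  # feeds the error/threat bounding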

  19. Moving towards a new paradigm for global flood risk estimation

    NASA Astrophysics Data System (ADS)

    Troy, Tara J.; Devineni, Naresh; Lima, Carlos; Lall, Upmanu

    2013-04-01

    Traditional approaches to flood risk assessment are typically indexed to an instantaneous peak flow event at a specific recording gage on a river, and then extrapolated through hydraulic modeling of that peak flow to the potential area that is likely to be inundated. Recent research shows that property losses tend to be determined as much by the duration of flooding as by the depth and velocity of inundation. The existing notion of a flood return period based on just the instantaneous peak flow rate at a stream gauge consequently needs to be revisited, especially for floods due to persistent rainfall as seen recently in Thailand, Pakistan, the Ohio and the Mississippi Rivers, France, and Germany. Depending on the flood event type considered, different rainfall inducing mechanisms (tropical storm, local convection, frontal system, recurrent tropical waves) may be involved. Each of these will have a characteristic spatial scale, expression and orientation and temporal characteristics. We develop stochastic models that can reproduce these attributes with appropriate intensity-duration-frequency and spatial expression, and hence provide a basis for conditioning basin hydrologic attributes for flood risk assessment. Past work on Non-homogeneous Hidden Markov Models (NHMM) is used as a basis to develop this capability at regional scales. In addition, a dynamic hierarchical Bayesian network model that is continuous and not based on discretization to states is tested and compared against NHMM. The exogenous variables in these models comes from the analysis of key synoptic circulation patterns which will be used as predictors for the regional spatio-temporal models. The stochastic simulations of rainfall are then used as input to a flood modeling system, which consists of a series of physically based models. Rainfall-runoff generation is produced by the Variable Infiltration Capacity (VIC) model. When the modeled streamflow crosses a threshold, a full kinematic wave routing model is implemented at a finer resolution (<=1km) in order to more accurately model streamflow under flood conditions and estimate inundation. This approach allows for efficient computational simulation of the hydrology when not under potential for flooding with high-resolution flood wave modeling when there is flooding potential. We demonstrate the results of this flood risk estimation system for the Ohio River basin in the United States, a large river basin that is historically prone to flooding, with the intention of using it to do global flood risk assessment.

  20. Delivering an Informational Hub for Data at the National Center for Computational Toxicology (ACS Spring Meeting) 7 of 7

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) Computational Toxicology Program integrates advances in biology, chemistry, and computer science to help prioritize chemicals for further research based on potential human health risks. This work involves computational and data drive...

  1. Computeen: A Randomized Trial of a Preventive Computer and Psychosocial Skills Curriculum for At-Risk Adolescents

    ERIC Educational Resources Information Center

    Lang, Jason M.; Waterman, Jill; Baker, Bruce L.

    2009-01-01

    Computeen, a preventive technology and psychosocial skills development program for at-risk adolescents, was designed to improve computer skills, self-esteem, and school attitudes, and reduce behavior problems, by combining elements of community-based and empirically supported prevention programs. Fifty-five mostly Latino adolescents from 12 to 16…

  2. On the usage of ultrasound computational models for decision making under ambiguity

    NASA Astrophysics Data System (ADS)

    Dib, Gerges; Sexton, Samuel; Prowant, Matthew; Crawford, Susan; Diaz, Aaron

    2018-04-01

    Computer modeling and simulation is becoming pervasive within the non-destructive evaluation (NDE) industry as a convenient tool for designing and assessing inspection techniques. This raises a pressing need for developing quantitative techniques for demonstrating the validity and applicability of the computational models. Computational models provide deterministic results based on deterministic and well-defined input, or stochastic results based on inputs defined by probability distributions. However, computational models cannot account for the effects of personnel, procedures, and equipment, resulting in ambiguity about the efficacy of inspections based on guidance from computational models only. In addition, ambiguity arises when model inputs, such as the representation of realistic cracks, cannot be defined deterministically, probabilistically, or by intervals. In this work, Pacific Northwest National Laboratory demonstrates the ability of computational models to represent field measurements under known variabilities, and quantify the differences using maximum amplitude and power spectrum density metrics. Sensitivity studies are also conducted to quantify the effects of different input parameters on the simulation results.
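
    The two comparison metrics named above, maximum amplitude and power spectral density, can be computed as below for a simulated versus measured waveform; the A-scans here are synthetic placeholders.

      # Sketch of the two comparison metrics named above -- maximum amplitude and
      # power spectral density -- applied to synthetic "measured" vs "simulated" signals.
      import numpy as np
      from scipy.signal import welch

      fs = 50e6                                   # 50 MHz sampling rate, assumed
      t = np.arange(0, 2e-5, 1 / fs)
      measured  = np.sin(2 * np.pi * 5e6 * t) * np.exp(-((t - 1e-5) ** 2) / (2e-6) ** 2)
      simulated = 0.9 * measured + 0.02 * np.random.default_rng(1).normal(size=t.size)

      amp_diff_db = 20 * np.log10(np.abs(simulated).max() / np.abs(measured).max())
      f, psd_meas = welch(measured, fs=fs, nperseg=256)
      _, psd_sim  = welch(simulated, fs=fs, nperseg=256)

      print(f"max-amplitude difference: {amp_diff_db:.2f} dB")
      print(f"peak PSD frequency: {f[psd_meas.argmax()]/1e6:.1f} vs {f[psd_sim.argmax()]/1e6:.1f} MHz")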

  3. Forecasting risk along a river basin using a probabilistic and deterministic model for environmental risk assessment of effluents through ecotoxicological evaluation and GIS.

    PubMed

    Gutiérrez, Simón; Fernandez, Carlos; Barata, Carlos; Tarazona, José Vicente

    2009-12-20

    This work presents a computer model for Risk Assessment of Basins by Ecotoxicological Evaluation (RABETOX). The model is based on whole effluent toxicity testing and water flows along a specific river basin. It is capable of estimating the risk along a river segment using deterministic and probabilistic approaches. The Henares River Basin was selected as a case study to demonstrate the importance of seasonal hydrological variations in Mediterranean regions. As model inputs, two different ecotoxicity tests (the miniaturized Daphnia magna acute test and the D.magna feeding test) were performed on grab samples from 5 waste water treatment plant effluents. Also used as model inputs were flow data from the past 25 years, water velocity measurements and precise distance measurements using Geographical Information Systems (GIS). The model was implemented into a spreadsheet and the results were interpreted and represented using GIS in order to facilitate risk communication. To better understand the bioassays results, the effluents were screened through SPME-GC/MS analysis. The deterministic model, performed each month during one calendar year, showed a significant seasonal variation of risk while revealing that September represents the worst-case scenario with values up to 950 Risk Units. This classifies the entire area of study for the month of September as "sublethal significant risk for standard species". The probabilistic approach using Monte Carlo analysis was performed on 7 different forecast points distributed along the Henares River. A 0% probability of finding "low risk" was found at all forecast points with a more than 50% probability of finding "potential risk for sensitive species". The values obtained through both the deterministic and probabilistic approximations reveal the presence of certain substances, which might be causing sublethal effects in the aquatic species present in the Henares River.

  4. Epidemic dynamics on a risk-based evolving social network

    NASA Astrophysics Data System (ADS)

    Antwi, Shadrack; Shaw, Leah

    2013-03-01

    Social network models have been used to study how behavior affects the dynamics of an infection in a population. Motivated by HIV, we consider how a trade-off between benefits and risks of sexual connections determine network structure and disease prevalence. We define a stochastic network model with formation and breaking of links as changes in sexual contacts. Each node has an intrinsic benefit its neighbors derive from connecting to it. Nodes' infection status is not apparent to others, but nodes with more connections (higher degree) are assumed more likely to be infected. The probability to form and break links is determined by a payoff computed from the benefit and degree-dependent risk. The disease is represented by a SI (susceptible-infected) model. We study network and epidemic evolution via Monte Carlo simulation and analytically predict the behavior with a heterogeneous mean field approach. The dependence of network connectivity and infection threshold on parameters is determined, and steady state degree distribution and epidemic levels are obtained. We also study a situation where system-wide infection levels alter perception of risk and cause nodes to adjust their behavior. This is a case of an adaptive network, where node status feeds back to change network geometry.

  5. A Computational Linguistic Measure of Clustering Behavior on Semantic Verbal Fluency Task Predicts Risk of Future Dementia in the Nun Study

    PubMed Central

    Pakhomov, Serguei V.S.; Hemmy, Laura S.

    2014-01-01

    Generative semantic verbal fluency (SVF) tests show early and disproportionate decline relative to other abilities in individuals developing Alzheimer’s disease. Optimal performance on SVF tests depends on the efficiency of using clustered organization of semantically related items and the ability to switch between clusters. Traditional approaches to clustering and switching have relied on manual determination of clusters. We evaluated a novel automated computational linguistic approach for quantifying clustering behavior. Our approach is based on Latent Semantic Analysis (LSA) for computing strength of semantic relatedness between pairs of words produced in response to SVF test. The mean size of semantic clusters (MCS) and semantic chains (MChS) are calculated based on pairwise relatedness values between words. We evaluated the predictive validity of these measures on a set of 239 participants in the Nun Study, a longitudinal study of aging. All were cognitively intact at baseline assessment, measured with the CERAD battery, and were followed in 18 month waves for up to 20 years. The onset of either dementia or memory impairment were used as outcomes in Cox proportional hazards models adjusted for age and education and censored at follow up waves 5 (6.3 years) and 13 (16.96 years). Higher MCS was associated with 38% reduction in dementia risk at wave 5 and 26% reduction at wave 13, but not with the onset of memory impairment. Higher (+1 SD) MChS was associated with 39% dementia risk reduction at wave 5 but not wave 13, and association with memory impairment was not significant. Higher traditional SVF scores were associated with 22–29% memory impairment and 35–40% dementia risk reduction. SVF scores were not correlated with either MCS or MChS. Our study suggests that an automated approach to measuring clustering behavior can be used to estimate dementia risk in cognitively normal individuals. PMID:23845236
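
    A simplified version of the clustering measure is sketched below: consecutive responses are grouped into a cluster while the pairwise relatedness of adjacent words stays above a threshold, and the mean cluster size is reported. The similarity values stand in for LSA cosine similarities and the threshold is assumed.

      # Simplified sketch of the clustering measure: group consecutive responses
      # while adjacent-word relatedness stays above a threshold, then report the
      # mean cluster size. Similarities and threshold are hypothetical stand-ins
      # for LSA cosine similarities.
      def mean_cluster_size(adjacent_similarity, threshold=0.5):
          sizes, current = [], 1
          for sim in adjacent_similarity:
              if sim >= threshold:
                  current += 1              # same cluster continues
              else:
                  sizes.append(current)     # switch: close the cluster
                  current = 1
          sizes.append(current)
          return sum(sizes) / len(sizes)

      # similarities between consecutive animal names produced on an SVF test
      print(mean_cluster_size([0.8, 0.7, 0.2, 0.6, 0.1, 0.9, 0.85]))  # -> 2.67 (8 words, 3 clusters)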

  6. A computational linguistic measure of clustering behavior on semantic verbal fluency task predicts risk of future dementia in the nun study.

    PubMed

    Pakhomov, Serguei V S; Hemmy, Laura S

    2014-06-01

    Generative semantic verbal fluency (SVF) tests show early and disproportionate decline relative to other abilities in individuals developing Alzheimer's disease. Optimal performance on SVF tests depends on the efficiency of using clustered organization of semantically related items and the ability to switch between clusters. Traditional approaches to clustering and switching have relied on manual determination of clusters. We evaluated a novel automated computational linguistic approach for quantifying clustering behavior. Our approach is based on Latent Semantic Analysis (LSA) for computing strength of semantic relatedness between pairs of words produced in response to SVF test. The mean size of semantic clusters (MCS) and semantic chains (MChS) are calculated based on pairwise relatedness values between words. We evaluated the predictive validity of these measures on a set of 239 participants in the Nun Study, a longitudinal study of aging. All were cognitively intact at baseline assessment, measured with the Consortium to Establish a Registry for Alzheimer's Disease (CERAD) battery, and were followed in 18-month waves for up to 20 years. The onset of either dementia or memory impairment were used as outcomes in Cox proportional hazards models adjusted for age and education and censored at follow-up waves 5 (6.3 years) and 13 (16.96 years). Higher MCS was associated with 38% reduction in dementia risk at wave 5 and 26% reduction at wave 13, but not with the onset of memory impairment. Higher [+1 standard deviation (SD)] MChS was associated with 39% dementia risk reduction at wave 5 but not wave 13, and association with memory impairment was not significant. Higher traditional SVF scores were associated with 22-29% memory impairment and 35-40% dementia risk reduction. SVF scores were not correlated with either MCS or MChS. Our study suggests that an automated approach to measuring clustering behavior can be used to estimate dementia risk in cognitively normal individuals. Copyright © 2013 Elsevier Ltd. All rights reserved.

  7. CT-based manual segmentation and evaluation of paranasal sinuses.

    PubMed

    Pirner, S; Tingelhoff, K; Wagner, I; Westphal, R; Rilk, M; Wahl, F M; Bootz, F; Eichhorn, Klaus W G

    2009-04-01

    Manual segmentation of computed tomography (CT) datasets was performed for robot-assisted endoscope movement during functional endoscopic sinus surgery (FESS). Segmented 3D models are needed for the robots' workspace definition. A total of 50 preselected CT datasets were each segmented in 150-200 coronal slices with 24 landmarks being set. Three different colors for segmentation represent diverse risk areas. Extension and volumetric measurements were performed. Three-dimensional reconstruction was generated after segmentation. Manual segmentation took 8-10 h for each CT dataset. The mean volumes were: right maxillary sinus 17.4 cm(3), left side 17.9 cm(3), right frontal sinus 4.2 cm(3), left side 4.0 cm(3), total frontal sinuses 7.9 cm(3), sphenoid sinus right side 5.3 cm(3), left side 5.5 cm(3), total sphenoid sinus volume 11.2 cm(3). Our manually segmented 3D-models present the patient's individual anatomy with a special focus on structures in danger according to the diverse colored risk areas. For safe robot assistance, the high-accuracy models represent an average of the population for anatomical variations, extension and volumetric measurements. They can be used as a database for automatic model-based segmentation. None of the segmentation methods so far described provide risk segmentation. The robot's maximum distance to the segmented border can be adjusted according to the differently colored areas.

  8. Comparison of femoral strength and fracture risk index derived from DXA-based finite element analysis for stratifying hip fracture risk: A cross-sectional study.

    PubMed

    Yang, Shuman; Luo, Yunhua; Yang, Lang; Dall'Ara, Enrico; Eastell, Richard; Goertzen, Andrew L; McCloskey, Eugene V; Leslie, William D; Lix, Lisa M

    2018-05-01

    Dual-energy X-ray absorptiometry (DXA)-based finite element analysis (FEA) has been studied for assessment of hip fracture risk. Femoral strength (FS) is the maximum force that the femur can sustain before its weakest region reaches the yielding limit. Fracture risk index (FRI), which also considers subject-specific impact force, is defined as the ratio of von Mises stress induced by a sideways fall to the bone yield stress over the proximal femur. We compared risk stratification for prior hip fracture using FS and FRI derived from DXA-based FEA. The study cohort included women aged ≥65years undergoing baseline hip DXA, with femoral neck T-scores <-1 and no osteoporosis treatment; 324 cases had prior hip fracture and 655 controls had no prior fracture. Using anonymized DXA hip scans, we measured FS and FRI. Separate multivariable logistic regression models were used to estimate odds ratios (ORs), c-statistics and their 95% confidence intervals (95% CIs) for the association of hip fracture with FS and FRI. Increased hip fracture risk was associated with lower FS (OR per SD 1.36, 95% CI: 1.15, 1.62) and higher FRI (OR per SD 1.99, 95% CI: 1.63, 2.43) after adjusting for Fracture Risk Assessment Tool (FRAX) hip fracture probability computed with bone mineral density (BMD). The c-statistic for the model containing FS (0.69; 95% CI: 0.65, 0.72) was lower than the c-statistic for the model with FRI (0.77; 95% CI: 0.74, 0.80) or femoral neck BMD (0.74; 95% CI: 0.71, 0.77; all P<0.05). FS and FRI were independently associated with hip fracture, but there were differences in performance characteristics. Copyright © 2018 Elsevier Inc. All rights reserved.
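
    The fracture risk index definition quoted above can be sketched element-wise as the ratio of fall-induced von Mises stress to bone yield stress; taking the maximum over elements is one plausible summary, and all stress values below are hypothetical.

      # Sketch of the fracture risk index: ratio of fall-induced von Mises stress
      # to bone yield stress over proximal-femur elements. Values are hypothetical,
      # and the max-over-elements summary is one plausible choice, not the study's code.
      import numpy as np

      von_mises_stress = np.array([35.0, 60.0, 82.0, 48.0])    # MPa, sideways-fall load case
      yield_stress     = np.array([110.0, 95.0, 70.0, 120.0])  # MPa, element-wise yield limit

      fri = (von_mises_stress / yield_stress).max()
      print(f"FRI = {fri:.2f}  ({'predicted failure' if fri >= 1.0 else 'below yield'})")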

  9. LIFETIME LUNG CANCER RISKS ASSOCIATED WITH INDOOR RADON EXPOSURE BASED ON VARIOUS RADON RISK MODELS FOR CANADIAN POPULATION.

    PubMed

    Chen, Jing

    2017-04-01

    This study calculates and compares the lifetime lung cancer risks associated with indoor radon exposure based on well-known risk models in the literature; two risk models are from joint studies among miners and the other three models were developed from pooling studies on residential radon exposure from China, Europe and North America respectively. The aim of this article is to make clear that the various models are mathematical descriptions of epidemiologically observed real risks in different environmental settings. The risk from exposure to indoor radon is real and it is normal that variations could exist among different risk models even when they were applied to the same dataset. The results show that lifetime risk estimates vary significantly between the various risk models considered here: the model based on the European residential data provides the lowest risk estimates, while models based on the European miners and Chinese residential pooling with complete dosimetry give the highest values. The lifetime risk estimates based on the EPA/BEIR-VI model lie within this range and agree reasonably well with the averages of risk estimates from the five risk models considered in this study. © Crown copyright 2016.

  10. Modeling thrombin generation: plasma composition based approach.

    PubMed

    Brummel-Ziedins, Kathleen E; Everse, Stephen J; Mann, Kenneth G; Orfeo, Thomas

    2014-01-01

    Thrombin has multiple functions in blood coagulation and its regulation is central to maintaining the balance between hemorrhage and thrombosis. Empirical and computational methods that capture thrombin generation can provide advancements to current clinical screening of the hemostatic balance at the level of the individual. In any individual, procoagulant and anticoagulant factor levels together act to generate a unique coagulation phenotype (net balance) that is reflective of the sum of its developmental, environmental, genetic, nutritional and pharmacological influences. Defining such thrombin phenotypes may provide a means to track disease progression pre-crisis. In this review we briefly describe thrombin function, methods for assessing thrombin dynamics as a phenotypic marker, computationally derived thrombin phenotypes versus determined clinical phenotypes, the boundaries of normal range thrombin generation using plasma composition based approaches and the feasibility of these approaches for predicting risk.

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Keller, Todd M.; Benjamin, Jacob S.; Wright, Virginia L.

    This paper will describe a practical methodology for understanding the cyber risk of a digital asset. This research attempts to gain a greater understanding of the cyber risk posed by a hardware-based computer asset by considering it as a sum of its hardware and software based sub-components.

  12. Reward salience and risk aversion underlie differential ACC activity in substance dependence

    PubMed Central

    Alexander, William H.; Fukunaga, Rena; Finn, Peter; Brown, Joshua W.

    2015-01-01

    The medial prefrontal cortex, especially the dorsal anterior cingulate cortex (ACC), has long been implicated in cognitive control and error processing. Although the association between ACC and behavior has been established, it is less clear how ACC contributes to dysfunctional behavior such as substance dependence. Evidence from neuroimaging studies investigating ACC function in substance users is mixed, with some studies showing disengagement of ACC in substance dependent individuals (SDs), while others show increased ACC activity related to substance use. In this study, we investigate ACC function in SDs and healthy individuals performing a change signal task for monetary rewards. Using a priori predictions derived from a recent computational model of ACC, we find that ACC activity differs between SDs and controls in factors related to reward salience and risk aversion. Quantitative fits of a computational model to fMRI data reveal significant differences in best fit parameters for reward salience and risk preferences. Specifically, the ACC in SDs shows greater risk aversion, defined as concavity in the utility function, and greater attention to rewards relative to reward omission. Furthermore, across participants risk aversion and reward salience are positively correlated. The results clarify the role that ACC plays in both the reduced sensitivity to omitted rewards and greater reward valuation in SDs. Clinical implications of applying computational modeling in psychiatry are also discussed. PMID:26106528

  13. Reward salience and risk aversion underlie differential ACC activity in substance dependence.

    PubMed

    Alexander, William H; Fukunaga, Rena; Finn, Peter; Brown, Joshua W

    2015-01-01

    The medial prefrontal cortex, especially the dorsal anterior cingulate cortex (ACC), has long been implicated in cognitive control and error processing. Although the association between ACC and behavior has been established, it is less clear how ACC contributes to dysfunctional behavior such as substance dependence. Evidence from neuroimaging studies investigating ACC function in substance users is mixed, with some studies showing disengagement of ACC in substance dependent individuals (SDs), while others show increased ACC activity related to substance use. In this study, we investigate ACC function in SDs and healthy individuals performing a change signal task for monetary rewards. Using a priori predictions derived from a recent computational model of ACC, we find that ACC activity differs between SDs and controls in factors related to reward salience and risk aversion. Quantitative fits of a computational model to fMRI data reveal significant differences in best fit parameters for reward salience and risk preferences. Specifically, the ACC in SDs shows greater risk aversion, defined as concavity in the utility function, and greater attention to rewards relative to reward omission. Furthermore, across participants risk aversion and reward salience are positively correlated. The results clarify the role that ACC plays in both the reduced sensitivity to omitted rewards and greater reward valuation in SDs. Clinical implications of applying computational modeling in psychiatry are also discussed.
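
    The notion of "risk aversion as concavity in the utility function" can be illustrated with a short sketch. The power utility below is not the model fitted in the study; it simply shows that a more concave utility discounts a risky gamble more heavily relative to its expected value.

      import numpy as np

      # Illustrative sketch only: "risk aversion as concavity in the utility function".
      # A power utility u(x) = x**rho with 0 < rho < 1 is concave; the more concave,
      # the more a risky gamble is discounted relative to its expected value.
      def utility(x, rho):
          return np.power(x, rho)

      outcomes = np.array([0.0, 10.0])   # a 50/50 gamble between 0 and 10 units of reward
      probs = np.array([0.5, 0.5])

      for rho in (1.0, 0.7, 0.4):        # 1.0 = risk neutral; smaller = more risk averse
          eu = np.sum(probs * utility(outcomes, rho))    # expected utility of the gamble
          u_ev = utility(np.sum(probs * outcomes), rho)  # utility of the sure expected value
          print(f"rho={rho:.1f}: EU(gamble)={eu:.2f}  U(expected value)={u_ev:.2f}")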

  14. Impact of a primary care based intervention on breast cancer knowledge, risk perception and concern: A randomized, controlled trial

    PubMed Central

    Livaudais-Toman, Jennifer; Karliner, Leah S.; Tice, Jeffrey A.; Kerlikowske, Karla; Gregorich, Steven; Pérez-Stable, Eliseo J.; Pasick, Rena J.; Chen, Alice; Quinn, Jessica; Kaplan, Celia P.

    2015-01-01

    Purpose: To estimate the effects of a tablet-based, breast cancer risk education intervention for use in primary care settings (BreastCARE) on patients' breast cancer knowledge, risk perception and concern. Methods: From June 2011–August 2012, we enrolled women from two clinics, aged 40–74 years with no personal breast cancer history, and randomized them to the BreastCARE intervention group or to the control group. All patients completed a baseline telephone survey and risk assessment (via telephone for controls, via tablet computer in clinic waiting room prior to visit for intervention). All women were categorized as high or average risk based on the Referral Screening Tool, the Gail model or the Breast Cancer Surveillance Consortium model. Intervention patients and their physicians received an individualized risk report to discuss during the visit. All women completed a follow-up telephone survey 1–2 weeks after risk assessment. Post-test comparisons estimated differences at follow-up in breast cancer knowledge, risk perception and concern. Results: 580 intervention and 655 control women completed follow-up interviews. Mean age was 56 years (SD = 9). At follow-up, 73% of controls and 71% of intervention women correctly perceived their breast cancer risk and 22% of controls and 24% of intervention women were very concerned about breast cancer. Intervention patients had greater knowledge (≥75% correct answers) of breast cancer risk factors at follow-up (24% vs. 16%; p = 0.002). In multivariable analysis, there were no differences in correct risk perception or concern, but intervention patients had greater knowledge ([OR] = 1.62; 95% [CI] = 1.19–2.23). Conclusions: A simple, practical intervention involving physicians at the point of care can improve knowledge of breast cancer without increasing concern. Trial Registration: ClinicalTrials.gov identifier NCT01830933. PMID:26476466

  15. Impact of a primary care based intervention on breast cancer knowledge, risk perception and concern: A randomized, controlled trial.

    PubMed

    Livaudais-Toman, Jennifer; Karliner, Leah S; Tice, Jeffrey A; Kerlikowske, Karla; Gregorich, Steven; Pérez-Stable, Eliseo J; Pasick, Rena J; Chen, Alice; Quinn, Jessica; Kaplan, Celia P

    2015-12-01

    To estimate the effects of a tablet-based, breast cancer risk education intervention for use in primary care settings (BreastCARE) on patients' breast cancer knowledge, risk perception and concern. From June 2011-August 2012, we enrolled women from two clinics, aged 40-74 years with no personal breast cancer history, and randomized them to the BreastCARE intervention group or to the control group. All patients completed a baseline telephone survey and risk assessment (via telephone for controls, via tablet computer in clinic waiting room prior to visit for intervention). All women were categorized as high or average risk based on the Referral Screening Tool, the Gail model or the Breast Cancer Surveillance Consortium model. Intervention patients and their physicians received an individualized risk report to discuss during the visit. All women completed a follow-up telephone survey 1-2 weeks after risk assessment. Post-test comparisons estimated differences at follow-up in breast cancer knowledge, risk perception and concern. 580 intervention and 655 control women completed follow-up interviews. Mean age was 56 years (SD = 9). At follow-up, 73% of controls and 71% of intervention women correctly perceived their breast cancer risk and 22% of controls and 24% of intervention women were very concerned about breast cancer. Intervention patients had greater knowledge (≥75% correct answers) of breast cancer risk factors at follow-up (24% vs. 16%; p = 0.002). In multivariable analysis, there were no differences in correct risk perception or concern, but intervention patients had greater knowledge ([OR] = 1.62; 95% [CI] = 1.19-2.23). A simple, practical intervention involving physicians at the point of care can improve knowledge of breast cancer without increasing concern. ClinicalTrials.gov identifier NCT01830933. Copyright © 2015 Elsevier Ltd. All rights reserved.

  16. Computer-Assisted Diabetes Risk Assessment and Education (CADRAE) for Medically Vulnerable Populations in the Middle East: a Novel and Practical Method for Prevention

    PubMed Central

    Rowther, Armaan A.; Dykzeul, Brad; Billimek, John; Abuhassan, Deyana; Anderson, Craig; Lotfipour, Shahram

    2016-01-01

    The prevalence of diabetes in the Middle East is increasing rapidly due to urbanization, reduced levels of physical activity, and a nutritional transition toward increased consumption of fats and refined carbohydrates. Preventive strategies are of paramount importance to stemming the tide. Portable touch-screen computer technology may hold an answer for alleviating the burdens of cost, time, and training that limit the implementation of diabetes risk screening and intervention, especially among refugees and other vulnerable populations. The Computer-Assisted Diabetes Risk Assessment and Education (CADRAE) Arabic-language intervention program is proposed as a model method for practicing proactive type 2 diabetes prevention in resource-limited settings of the Middle East that combines the efficiency of risk-score screening methods, the advantages of portable computer interface, and the spirit of brief motivational interviewing. This paper aims to describe the theory and novel design of CADRAE—introduced at the Noor Al Hussein Foundation's Institute of Family Health in January 2014—as well as discuss opportunities and challenges for its implementation and evaluation in primary or emergency care settings. Features of CADRAE are elucidated in detail, including development, translation, conceptual framework, theoretical basis, method of risk assessment, brief intervention style, definition of outcomes, requirements for implementation, and potential means of evaluation and quality improvement. CADRAE offers the first example of portable computer technology integrating diabetes risk screening with behavior change counseling tailored for an Arabic-speaking population of mostly refugees and could offer a valuable model for researchers and policy makers of the Middle East as well as other resource-limited settings. PMID:26835181

  17. Television, computer, and video viewing; physical activity; and upper limb fracture risk in children: a population-based case control study.

    PubMed

    Ma, Deqiong; Jones, Graeme

    2003-11-01

    The effect of physical activity on upper limb fractures was examined in this population-based case control study with 321 age- and gender-matched pairs. Sports participation increased fracture risk in boys and decreased risk in girls. Television viewing had a deleterious dose-response association with wrist and forearm fractures while light physical activity was protective. The aim of this population-based case control study was to examine the association between television, computer, and video viewing; types and levels of physical activity; and upper limb fractures in children 9-16 years of age. A total of 321 fracture cases and 321 randomly selected individually matched controls were studied. Television, computer, and video viewing and types and levels of physical activity were determined by interview-administered questionnaire. Bone strength was assessed by DXA and metacarpal morphometry. In general, sports participation increased total upper limb fracture risk in boys and decreased risk in girls. Gender-specific risk estimates were significantly different for total, contact, noncontact, and high-risk sports participation as well as four individual sports (soccer, cricket, surfing, and swimming). In multivariate analysis, time spent on television, computer, and video viewing in both sexes was positively associated with wrist and forearm fracture risk (OR 1.6/category, 95% CI: 1.1-2.2), whereas days involved in light physical activity participation decreased fracture risk (OR 0.8/category, 95% CI: 0.7-1.0). Sports participation increased hand (OR 1.5/sport, 95% CI: 1.1-2.0) and upper arm (OR 29.8/sport, 95% CI: 1.7-535) fracture risk in boys only and decreased wrist and forearm fracture risk in girls only (OR 0.5/sport, 95% CI: 0.3-0.9). Adjustment for bone density and metacarpal morphometry did not alter these associations. There is gender discordance with regard to sports participation and fracture risk in children, which may reflect different approaches to sport. Importantly, television, computer, and video viewing has a dose-dependent association with wrist and forearm fractures, whereas light physical activity is protective. The mechanism is unclear but may involve bone-independent factors, or less likely, changes in bone quality not detected by DXA or metacarpal morphometry.

  18. Visual privacy by context: proposal and evaluation of a level-based visualisation scheme.

    PubMed

    Padilla-López, José Ramón; Chaaraoui, Alexandros Andre; Gu, Feng; Flórez-Revuelta, Francisco

    2015-06-04

    Privacy in image and video data has become an important subject since cameras are being installed in an increasing number of public and private spaces. Specifically, in assisted living, intelligent monitoring based on computer vision can allow one to provide risk detection and support services that increase people's autonomy at home. In the present work, a level-based visualisation scheme is proposed to provide visual privacy when human intervention is necessary, such as at telerehabilitation and safety assessment applications. Visualisation levels are dynamically selected based on the previously modelled context. In this way, different levels of protection can be provided, maintaining the necessary intelligibility required for the applications. Furthermore, a case study of a living room, where a top-view camera is installed, is presented. Finally, the performed survey-based evaluation indicates the degree of protection provided by the different visualisation models, as well as the personal privacy preferences and valuations of the users.

  19. Impact of Hydrogeological Uncertainty on Estimation of Environmental Risks Posed by Hydrocarbon Transportation Networks

    NASA Astrophysics Data System (ADS)

    Ciriello, V.; Lauriola, I.; Bonvicini, S.; Cozzani, V.; Di Federico, V.; Tartakovsky, Daniel M.

    2017-11-01

    Ubiquitous hydrogeological uncertainty undermines the veracity of quantitative predictions of soil and groundwater contamination due to accidental hydrocarbon spills from onshore pipelines. Such predictions, therefore, must be accompanied by quantification of predictive uncertainty, especially when they are used for environmental risk assessment. We quantify the impact of parametric uncertainty on quantitative forecasting of temporal evolution of two key risk indices, volumes of unsaturated and saturated soil contaminated by a surface spill of light nonaqueous-phase liquids. This is accomplished by treating the relevant uncertain parameters as random variables and deploying two alternative probabilistic models to estimate their effect on predictive uncertainty. A physics-based model is solved with a stochastic collocation method and is supplemented by a global sensitivity analysis. A second model represents the quantities of interest as polynomials of random inputs and has a virtually negligible computational cost, which enables one to explore any number of risk-related contamination scenarios. For a typical oil-spill scenario, our method can be used to identify key flow and transport parameters affecting the risk indices, to elucidate texture-dependent behavior of different soils, and to evaluate, with a degree of confidence specified by the decision-maker, the extent of contamination and the corresponding remediation costs.
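
    A schematic of the second, surrogate-based approach: run an expensive model at a few sampled parameter values, fit a low-order polynomial in the random inputs, and then evaluate the cheap surrogate for many scenarios. The toy model, parameter distributions and polynomial basis below are illustrative assumptions, not the study's actual physics.

      import numpy as np

      rng = np.random.default_rng(0)

      # Toy stand-in for the expensive physics-based model: contaminated volume as a
      # function of (log-)permeability k and porosity phi. Purely illustrative.
      def expensive_model(k, phi):
          return 50.0 + 30.0 * k - 40.0 * phi + 10.0 * k * phi

      # 1) A small "training" design where the expensive model is actually run.
      k_train = rng.normal(0.0, 1.0, 40)
      phi_train = rng.uniform(0.2, 0.4, 40)
      y_train = expensive_model(k_train, phi_train)

      # 2) Fit a second-order polynomial surrogate in the random inputs (least squares).
      def features(k, phi):
          return np.column_stack([np.ones_like(k), k, phi, k * phi, k**2, phi**2])

      coef, *_ = np.linalg.lstsq(features(k_train, phi_train), y_train, rcond=None)

      # 3) The surrogate is nearly free to evaluate, so many scenarios can be explored.
      k_mc = rng.normal(0.0, 1.0, 100_000)
      phi_mc = rng.uniform(0.2, 0.4, 100_000)
      risk_index = features(k_mc, phi_mc) @ coef
      print("95th percentile of contaminated volume ≈", round(np.percentile(risk_index, 95), 1))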

  20. Use of mechanistic simulations as a quantitative risk-ranking tool within the quality by design framework.

    PubMed

    Stocker, Elena; Toschkoff, Gregor; Sacher, Stephan; Khinast, Johannes G

    2014-11-20

    The purpose of this study is to evaluate the use of computer simulations for generating quantitative knowledge as a basis for risk ranking and mechanistic process understanding, as required by ICH Q9 on quality risk management systems. In this specific publication, the main focus is the demonstration of a risk assessment workflow, including a computer simulation for the generation of mechanistic understanding of active tablet coating in a pan coater. Process parameter screening studies are statistically planned under consideration of impacts on a potentially critical quality attribute, i.e., coating mass uniformity. Based on computer simulation data the process failure mode and effects analysis of the risk factors is performed. This results in a quantitative criticality assessment of process parameters and the risk priority evaluation of failure modes. The factor for a quantitative reassessment of the criticality and risk priority is the coefficient of variation, which represents the coating mass uniformity. The major conclusion drawn from this work is a successful demonstration of the integration of computer simulation in the risk management workflow leading to an objective and quantitative risk assessment. Copyright © 2014. Published by Elsevier B.V.
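
    A minimal sketch of how simulated coating-mass data might feed an FMEA-style ranking: the coefficient of variation from each simulated failure mode is mapped to a severity score and combined with assumed occurrence and detection scores into a risk priority number. All numbers and failure modes are hypothetical.

      import numpy as np

      # The coefficient of variation (CV) of coating mass, obtained from simulated runs,
      # is mapped to a severity score; occurrence and detection scores are assumed inputs.
      def cv(coating_mass):
          coating_mass = np.asarray(coating_mass, dtype=float)
          return coating_mass.std(ddof=1) / coating_mass.mean()

      def severity_from_cv(cv_value, thresholds=(0.02, 0.05, 0.10)):
          # Higher CV -> worse coating uniformity -> higher severity (1..4 here).
          return 1 + sum(cv_value > t for t in thresholds)

      failure_modes = {
          # name: (simulated per-tablet coating masses, occurrence, detection)
          "low spray rate": (np.random.default_rng(1).normal(10.0, 0.3, 500), 3, 2),
          "high pan speed": (np.random.default_rng(2).normal(10.0, 0.9, 500), 2, 3),
      }

      for name, (masses, occurrence, detection) in failure_modes.items():
          sev = severity_from_cv(cv(masses))
          rpn = sev * occurrence * detection   # classic risk priority number
          print(f"{name:15s} CV={cv(masses):.3f}  severity={sev}  RPN={rpn}")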

  1. TERRA: a computer code for simulating the transport of environmentally released radionuclides through agriculture

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baes, C.F. III; Sharp, R.D.; Sjoreen, A.L.

    1984-11-01

    TERRA is a computer code which calculates concentrations of radionuclides and ingrowing daughters in surface and root-zone soil, produce and feed, beef, and milk from a given deposition rate at any location in the conterminous United States. The code is fully integrated with seven other computer codes which together comprise a Computerized Radiological Risk Investigation System, CRRIS. Output from either the long range (>100 km) atmospheric dispersion code RETADD-II or the short range (<80 km) atmospheric dispersion code ANEMOS, in the form of radionuclide air concentrations and ground deposition rates by downwind location, serves as input to TERRA. User-defined deposition rates and air concentrations may also be provided as input to TERRA through use of the PRIMUS computer code. The environmental concentrations of radionuclides predicted by TERRA serve as input to the ANDROS computer code which calculates population and individual intakes, exposures, doses, and risks. TERRA incorporates models to calculate uptake from soil and atmospheric deposition on four groups of produce for human consumption and four groups of livestock feeds. During the environmental transport simulation, intermediate calculations of interception fraction for leafy vegetables, produce directly exposed to atmospherically depositing material, pasture, hay, and silage are made based on location-specific estimates of standing crop biomass. Pasture productivity is estimated by a model which considers the number and types of cattle and sheep, pasture area, and annual production of other forages (hay and silage) at a given location. Calculations are made of the fraction of grain imported from outside the assessment area. TERRA output includes the above calculations and estimated radionuclide concentrations in plant produce, milk, and a beef composite by location.

  2. Segmentation of risk structures for otologic surgery using the Probabilistic Active Shape Model (PASM)

    NASA Astrophysics Data System (ADS)

    Becker, Meike; Kirschner, Matthias; Sakas, Georgios

    2014-03-01

    Our research project investigates a multi-port approach for minimally-invasive otologic surgery. For planning such a surgery, an accurate segmentation of the risk structures is crucial. However, the segmentation of these risk structures is a challenging task: The anatomical structures are very small and some have a complex shape, low contrast and vary both in shape and appearance. Therefore, prior knowledge is needed which is why we apply model-based approaches. In the present work, we use the Probabilistic Active Shape Model (PASM), which is a more flexible and specific variant of the Active Shape Model (ASM), to segment the following risk structures: cochlea, semicircular canals, facial nerve, chorda tympani, ossicles, internal auditory canal, external auditory canal and internal carotid artery. For the evaluation we trained and tested the algorithm on 42 computed tomography data sets using leave-one-out tests. Visual assessment of the results shows in general a good agreement of manual and algorithmic segmentations. Further, we achieve a good Average Symmetric Surface Distance while the maximum error is comparatively large due to low contrast at start and end points. Last, we compare the PASM to the standard ASM and show that the PASM leads to a higher accuracy.
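
    The Average Symmetric Surface Distance used in the evaluation can be computed from two surfaces represented as point sets; a small sketch (assuming SciPy's KD-tree for nearest-neighbour queries and synthetic point clouds) is shown below.

      import numpy as np
      from scipy.spatial import cKDTree

      # Sketch of the Average Symmetric Surface Distance (ASSD) between two surfaces
      # given as point sets (e.g. vertices of the manual and algorithmic segmentations).
      def assd(points_a, points_b):
          d_ab = cKDTree(points_b).query(points_a)[0]   # each point in A to its nearest in B
          d_ba = cKDTree(points_a).query(points_b)[0]   # and vice versa
          return (d_ab.sum() + d_ba.sum()) / (len(d_ab) + len(d_ba))

      rng = np.random.default_rng(0)
      manual = rng.normal(size=(1000, 3))
      automatic = manual + rng.normal(scale=0.05, size=(1000, 3))   # small simulated error
      print(f"ASSD ≈ {assd(manual, automatic):.3f} (same units as the coordinates)")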

  3. How might Model-based Probabilities Extracted from Imperfect Models Guide Rational Decisions: The Case for non-probabilistic odds

    NASA Astrophysics Data System (ADS)

    Smith, Leonard A.

    2010-05-01

    This contribution concerns "deep" or "second-order" uncertainty, such as the uncertainty in our probability forecasts themselves. It asks the question: "Is it rational to take (or offer) bets using model-based probabilities as if they were objective probabilities?" If not, what alternative approaches for determining odds, perhaps non-probabilistic odds, might prove useful in practice, given the fact we know our models are imperfect? We consider the case where the aim is to provide sustainable odds: not to produce a profit but merely to rationally expect to break even in the long run. In other words, to run a quantified risk of ruin that is relatively small. Thus the cooperative insurance schemes of coastal villages provide a more appropriate parallel than a casino. A "better" probability forecast would lead to lower premiums charged and less volatile fluctuations in the cash reserves of the village. Note that the Bayesian paradigm does not constrain one to interpret model distributions as subjective probabilities, unless one believes the model to be empirically adequate for the task at hand. In geophysics, this is rarely the case. When a probability forecast is interpreted as the objective probability of an event, the odds on that event can be easily computed as one divided by the probability of the event, and one need not favour taking either side of the wager. (Here we are using "odds-for" not "odds-to", the difference being whether or not the stake is returned; odds of one to one are equivalent to odds of two for one.) The critical question is how to compute sustainable odds based on information from imperfect models. We suggest that this breaks the symmetry between the odds-on an event and the odds-against it. While a probability distribution can always be translated into odds, interpreting the odds on a set of events might result in "implied-probabilities" that sum to more than one. And/or the set of odds may be incomplete, not covering all events. We ask whether or not probabilities based on imperfect models can be expected to yield probabilistic odds which are sustainable. Evidence is provided that suggests this is not the case. Even with very good models (good in a Root-Mean-Square sense), the risk of ruin of probabilistic odds is significantly higher than might be expected. Methods for constructing model-based non-probabilistic odds which are sustainable are discussed. The aim here is to be relevant to real world decision support, and so unrealistic assumptions of equal knowledge, equal compute power, or equal access to information are to be avoided. Finally, the use of non-probabilistic odds as a method for communicating deep uncertainty (uncertainty in a probability forecast itself) is discussed in the context of other methods, such as stating one's subjective probability that the models will prove inadequate in each particular instance (that is, the Probability of a "Big Surprise").
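
    A short illustration of the odds conventions discussed above: converting between probabilities and "odds-for", and showing how a set of quoted odds can imply probabilities that sum to more than one. The quoted odds are arbitrary examples.

      # Illustrative conversion between probabilities and "odds-for" as used above:
      # odds-for an event = 1 / P(event), so odds of 2-for-1 correspond to P = 0.5.
      def odds_for(p):
          return 1.0 / p

      def implied_probability(odds):
          return 1.0 / odds

      # Quoting non-probabilistic odds may shorten both sides of a wager; the
      # implied probabilities then sum to more than one, breaking the usual symmetry.
      quoted_odds = {"event": 1.8, "not event": 2.1}   # assumed odds-for quotes
      total = sum(implied_probability(o) for o in quoted_odds.values())
      print(f"implied probabilities sum to {total:.3f}")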

  4. Impact of isotropic constitutive descriptions on the predicted peak wall stress in abdominal aortic aneurysms.

    PubMed

    Man, V; Polzer, S; Gasser, T C; Novotny, T; Bursa, J

    2018-03-01

    Biomechanics-based assessment of Abdominal Aortic Aneurysm (AAA) rupture risk has gained considerable scientific and clinical momentum. However, computation of peak wall stress (PWS) using state-of-the-art finite element models is time demanding. This study investigates which features of the constitutive description of AAA wall are decisive for achieving acceptable stress predictions in it. Influence of five different isotropic constitutive descriptions of AAA wall is tested; models reflect realistic non-linear, artificially stiff non-linear, or artificially stiff pseudo-linear constitutive descriptions of AAA wall. Influence of the AAA wall model is tested on idealized (n=4) and patient-specific (n=16) AAA geometries. Wall stress computations consider a (hypothetical) load-free configuration and include residual stresses homogenizing the stresses across the wall. Wall stress differences amongst the different descriptions were statistically analyzed. When the qualitatively similar non-linear response of the AAA wall with low initial stiffness and subsequent strain stiffening was taken into consideration, wall stress (and PWS) predictions did not change significantly. Keeping this non-linear feature when using an artificially stiff wall can save up to 30% of the computational time, without significant change in PWS. In contrast, a stiff pseudo-linear elastic model may underestimate the PWS and is not reliable for AAA wall stress computations. Copyright © 2018 IPEM. Published by Elsevier Ltd. All rights reserved.

  5. Delamination detection using methods of computational intelligence

    NASA Astrophysics Data System (ADS)

    Ihesiulor, Obinna K.; Shankar, Krishna; Zhang, Zhifang; Ray, Tapabrata

    2012-11-01

    A reliable delamination prediction scheme is indispensable in order to prevent potential risks of catastrophic failures in composite structures. The existence of delaminations changes the vibration characteristics of composite laminates and hence such indicators can be used to quantify the health characteristics of laminates. An approach for online health monitoring of in-service composite laminates is presented in this paper that relies on methods based on computational intelligence. Typical changes in the observed vibration characteristics (i.e. change in natural frequencies) are considered as inputs to identify the existence, location and magnitude of delaminations. The performance of the proposed approach is demonstrated using numerical models of composite laminates. Since this identification problem essentially involves the solution of an optimization problem, the use of finite element (FE) methods as the underlying tool for analysis turns out to be computationally expensive. A surrogate assisted optimization approach is hence introduced to contain the computational time within affordable limits. An artificial neural network (ANN) model with Bayesian regularization is used as the underlying approximation scheme while an improved rate of convergence is achieved using a memetic algorithm. However, building of ANN surrogate models usually requires large training datasets. K-means clustering is effectively employed to reduce the size of datasets. ANN is also used via inverse modeling to determine the position, size and location of delaminations using changes in measured natural frequencies. The results clearly highlight the efficiency and the robustness of the approach.

  6. Human Environmental Disease Network: A computational model to assess toxicology of contaminants.

    PubMed

    Taboureau, Olivier; Audouze, Karine

    2017-01-01

    During the past decades, many epidemiological, toxicological and biological studies have been performed to assess the role of environmental chemicals as potential toxicants associated with diverse human disorders. However, the relationships between diseases based on chemical exposure rarely have been studied by computational biology. We developed a human environmental disease network (EDN) to explore and suggest novel disease-disease and chemical-disease relationships. The presented scored EDN model is built upon the integration of systems biology and chemical toxicology using information on chemical contaminants and their disease relationships reported in the TDDB database. The resulting human EDN takes into consideration the level of evidence of the toxicant-disease relationships, allowing inclusion of some degrees of significance in the disease-disease associations. Such a network can be used to identify uncharacterized connections between diseases. Examples are discussed for type 2 diabetes (T2D). Additionally, this computational model allows confirmation of already known links between chemicals and diseases (e.g., between bisphenol A and behavioral disorders) and also reveals unexpected associations between chemicals and diseases (e.g., between chlordane and olfactory alteration), thus predicting which chemicals may be risk factors to human health. The proposed human EDN model allows exploration of common biological mechanisms of diseases associated with chemical exposure, helping us to gain insight into disease etiology and comorbidity. This computational approach is an alternative to animal testing supporting the 3R concept.

  7. A simulation model of IT risk on program trading

    NASA Astrophysics Data System (ADS)

    Xia, Bingying; Jiang, Wenbao; Luo, Guangxuan

    2015-12-01

    The biggest difficulty in measuring the IT risk of program trading is the lack of loss data. The approach currently taken by scholars is to collect reports of IT incidents at home and abroad from courts, networks and other public media, and to base the quantitative analysis of IT risk losses on the resulting database. However, a loss database built this way can only approximate the real situation and cannot provide a fundamental explanation of it. In this paper, building on the concepts and steps of Monte Carlo (MC) simulation, we apply the MC method within the "Program trading simulation system" developed by our team to simulate real program trading and to generate IT risk loss data through controlled IT failure experiments; at the end of the article, the effectiveness of the experimental data is verified. This approach overcomes the deficiencies of the traditional research method and addresses the shortage of IT risk data for quantitative research, while providing researchers with a simulation-based template for study design and process.
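
    A minimal sketch of the kind of Monte Carlo loss simulation described, assuming a Poisson failure frequency and lognormal per-failure losses; the parameters are placeholders rather than values from the program-trading simulation system.

      import numpy as np

      # Monte Carlo estimate of program-trading IT losses under assumed distributions.
      rng = np.random.default_rng(42)
      n_days = 100_000
      failures_per_day = rng.poisson(lam=0.2, size=n_days)   # assumed failure frequency
      daily_loss = np.array([
          rng.lognormal(mean=9.0, sigma=1.2, size=n).sum() if n else 0.0
          for n in failures_per_day
      ])

      print(f"expected daily loss ≈ {daily_loss.mean():,.0f}")
      print(f"99% quantile (a VaR-style figure) ≈ {np.quantile(daily_loss, 0.99):,.0f}")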

  8. Prediction of lung density changes after radiotherapy by cone beam computed tomography response markers and pre-treatment factors for non-small cell lung cancer patients.

    PubMed

    Bernchou, Uffe; Hansen, Olfred; Schytte, Tine; Bertelsen, Anders; Hope, Andrew; Moseley, Douglas; Brink, Carsten

    2015-10-01

    This study investigates the ability of pre-treatment factors and response markers extracted from standard cone-beam computed tomography (CBCT) images to predict the lung density changes induced by radiotherapy for non-small cell lung cancer (NSCLC) patients. Density changes in follow-up computed tomography scans were evaluated for 135 NSCLC patients treated with radiotherapy. Early response markers were obtained by analysing changes in lung density in CBCT images acquired during the treatment course. The ability of pre-treatment factors and CBCT markers to predict lung density changes induced by radiotherapy was investigated. Age and CBCT markers extracted at 10th, 20th, and 30th treatment fraction significantly predicted lung density changes in a multivariable analysis, and a set of response models based on these parameters were established. The correlation coefficient for the models was 0.35, 0.35, and 0.39, when based on the markers obtained at the 10th, 20th, and 30th fraction, respectively. The study indicates that younger patients without lung tissue reactions early into their treatment course may have minimal radiation induced lung density increase at follow-up. Further investigations are needed to examine the ability of the models to identify patients with low risk of symptomatic toxicity. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  9. Are computer and cell phone use associated with body mass index and overweight? A population study among twin adolescents.

    PubMed

    Lajunen, Hanna-Reetta; Keski-Rahkonen, Anna; Pulkkinen, Lea; Rose, Richard J; Rissanen, Aila; Kaprio, Jaakko

    2007-02-26

    Overweight in children and adolescents has reached dimensions of a global epidemic during recent years. Simultaneously, information and communication technology use has rapidly increased. A population-based sample of Finnish twins born in 1983-1987 (N = 4098) was assessed by self-report questionnaires at 17 y during 2000-2005. The association of overweight (defined by Cole's BMI-for-age cut-offs) with computer and cell phone use and ownership was analyzed by logistic regression and their association with BMI by linear regression models. The effect of twinship was taken into account by correcting for clustered sampling of families. All models were adjusted for gender, physical exercise, and parents' education and occupational class. The proportion of adolescents who did not have a computer at home decreased from 18% to 8% from 2000 to 2005. Compared to them, having a home computer (without an Internet connection) was associated with a higher risk of overweight (odds ratio 2.3, 95% CI 1.4 to 3.8) and BMI (beta coefficient 0.57, 95% CI 0.15 to 0.98). However, having a computer with an Internet connection was not associated with weight status. Belonging to the highest quintile (OR 1.8 95% CI 1.2 to 2.8) and second-highest quintile (OR 1.6 95% CI 1.1 to 2.4) of weekly computer use was positively associated with overweight. The proportion of adolescents without a personal cell phone decreased from 12% to 1% across 2000 to 2005. There was a positive linear trend of increasing monthly phone bill with BMI (beta 0.18, 95% CI 0.06 to 0.30), but the association of a cell phone bill with overweight was very weak. Time spent using a home computer was associated with an increased risk of overweight. Cell phone use correlated weakly with BMI. Increasing use of information and communication technology may be related to the obesity epidemic among adolescents.

  10. Space Shuttle Propulsion Systems Plume Modeling and Simulation for the Lift-Off Computational Fluid Dynamics Model

    NASA Technical Reports Server (NTRS)

    Strutzenberg, L. L.; Dougherty, N. S.; Liever, P. A.; West, J. S.; Smith, S. D.

    2007-01-01

    This paper details advances being made in the development of Reynolds-Averaged Navier-Stokes numerical simulation tools, models, and methods for the integrated Space Shuttle Vehicle at launch. The conceptual model and modeling approach described include the development of multiple computational models to appropriately analyze the potential debris transport for critical debris sources at Lift-Off. The conceptual model described herein involves the integration of propulsion analysis for the nozzle/plume flow with the overall 3D vehicle flowfield at Lift-Off. Debris Transport Analyses are being performed using the Shuttle Lift-Off models to assess the risk to the vehicle from Lift-Off debris and to appropriately prioritize mitigation of potential debris sources, continuing to reduce vehicle risk. These integrated simulations are being used to evaluate plume-induced debris environments where the multi-plume interactions with the launch facility can potentially accelerate debris particles toward the vehicle.

  11. Software for Building Models of 3D Objects via the Internet

    NASA Technical Reports Server (NTRS)

    Schramer, Tim; Jensen, Jeff

    2003-01-01

    The Virtual EDF Builder (where EDF signifies Electronic Development Fixture) is a computer program that facilitates the use of the Internet for building and displaying digital models of three-dimensional (3D) objects that ordinarily comprise assemblies of solid models created previously by use of computer-aided-design (CAD) programs. The Virtual EDF Builder resides on a Unix-based server computer. It is used in conjunction with a commercially available Web-based plug-in viewer program that runs on a client computer. The Virtual EDF Builder acts as a translator between the viewer program and a database stored on the server. The translation function includes the provision of uniform resource locator (URL) links to other Web-based computer systems and databases. The Virtual EDF builder can be used in two ways: (1) If the client computer is Unix-based, then it can assemble a model locally; the computational load is transferred from the server to the client computer. (2) Alternatively, the server can be made to build the model, in which case the server bears the computational load and the results are downloaded to the client computer or workstation upon completion.

  12. Rationale, design and baseline characteristics of a randomized controlled trial of a web-based computer-tailored physical activity intervention for adults from Quebec City.

    PubMed

    Boudreau, François; Walthouwer, Michel Jean Louis; de Vries, Hein; Dagenais, Gilles R; Turbide, Ginette; Bourlaud, Anne-Sophie; Moreau, Michel; Côté, José; Poirier, Paul

    2015-10-09

    The relationship between physical activity and cardiovascular disease (CVD) protection is well documented. Numerous factors (e.g. patient motivation, lack of facilities, physician time constraints) can contribute to poor PA adherence. Web-based computer-tailored interventions offer an innovative way to provide tailored feedback and to empower adults to engage in regular moderate- to vigorous-intensity PA. To describe the rationale, design and content of a web-based computer-tailored PA intervention for Canadian adults enrolled in a randomized controlled trial (RCT). 244 men and women aged between 35 and 70 years, without CVD or physical disability, not participating in regular moderate- to vigorous-intensity PA, and familiar with and having access to a computer at home, were recruited from the Quebec City Prospective Urban and Rural Epidemiological (PURE) study centre. Participants were randomized into two study arms: 1) an experimental group receiving the intervention and 2) a waiting list control group. The fully automated web-based computer-tailored PA intervention consists of seven 10- to 15-min sessions over an 8-week period. The theoretical underpinning of the intervention is based on the I-Change Model. The aim of the intervention was to reach a total of 150 min per week of moderate- to vigorous-intensity aerobic PA. This study will provide useful information before engaging in a large RCT to assess the long-term participation and maintenance of PA, the potential impact of regular PA on CVD risk factors and the cost-effectiveness of a web-based computer-tailored intervention. ISRCTN36353353 registered on 24/07/2014.

  13. Lung cancer risk prediction to select smokers for screening CT--a model based on the Italian COSMOS trial.

    PubMed

    Maisonneuve, Patrick; Bagnardi, Vincenzo; Bellomi, Massimo; Spaggiari, Lorenzo; Pelosi, Giuseppe; Rampinelli, Cristiano; Bertolotti, Raffaella; Rotmensz, Nicole; Field, John K; Decensi, Andrea; Veronesi, Giulia

    2011-11-01

    Screening with low-dose helical computed tomography (CT) has been shown to significantly reduce lung cancer mortality but the optimal target population and time interval to subsequent screening are yet to be defined. We developed two models to stratify individual smokers according to risk of developing lung cancer. We first used the number of lung cancers detected at baseline screening CT in the 5,203 asymptomatic participants of the COSMOS trial to recalibrate the Bach model, which we propose using to select smokers for screening. Next, we incorporated lung nodule characteristics and presence of emphysema identified at baseline CT into the Bach model and proposed the resulting multivariable model to predict lung cancer risk in screened smokers after baseline CT. Age and smoking exposure were the main determinants of lung cancer risk. The recalibrated Bach model accurately predicted lung cancers detected during the first year of screening. Presence of nonsolid nodules (RR = 10.1, 95% CI = 5.57-18.5), nodule size more than 8 mm (RR = 9.89, 95% CI = 5.84-16.8), and emphysema (RR = 2.36, 95% CI = 1.59-3.49) at baseline CT were all significant predictors of subsequent lung cancers. Incorporation of these variables into the Bach model increased the predictive value of the multivariable model (c-index = 0.759, internal validation). The recalibrated Bach model seems suitable for selecting the higher risk population for recruitment for large-scale CT screening. The Bach model incorporating CT findings at baseline screening could help defining the time interval to subsequent screening in individual participants. Further studies are necessary to validate these models.

  14. Iterative evaluation in a mobile counseling and testing program to reach people of color at risk for HIV--new strategies improve program acceptability, effectiveness, and evaluation capabilities.

    PubMed

    Spielberg, Freya; Kurth, Ann; Reidy, William; McKnight, Teka; Dikobe, Wame; Wilson, Charles

    2011-06-01

    This article highlights findings from an evaluation that explored the impact of mobile versus clinic-based testing, rapid versus central-lab based testing, incentives for testing, and the use of a computer counseling program to guide counseling and automate evaluation in a mobile program reaching people of color at risk for HIV. The program's results show that an increased focus on mobile outreach using rapid testing, incentives and health information technology tools may improve program acceptability, quality, productivity and timeliness of reports. This article describes program design decisions based on continuous quality assessment efforts. It also examines the impact of the Computer Assessment and Risk Reduction Education computer tool on HIV testing rates, staff perception of counseling quality, program productivity, and on the timeliness of evaluation reports. The article concludes with a discussion of implications for programmatic responses to the Centers for Disease Control and Prevention's HIV testing recommendations.

  15. ITERATIVE EVALUATION IN A MOBILE COUNSELING AND TESTING PROGRAM TO REACH PEOPLE OF COLOR AT RISK FOR HIV—NEW STRATEGIES IMPROVE PROGRAM ACCEPTABILITY, EFFECTIVENESS, AND EVALUATION CAPABILITIES

    PubMed Central

    Spielberg, Freya; Kurth, Ann; Reidy, William; McKnight, Teka; Dikobe, Wame; Wilson, Charles

    2016-01-01

    This article highlights findings from an evaluation that explored the impact of mobile versus clinic-based testing, rapid versus central-lab based testing, incentives for testing, and the use of a computer counseling program to guide counseling and automate evaluation in a mobile program reaching people of color at risk for HIV. The program’s results show that an increased focus on mobile outreach using rapid testing, incentives and health information technology tools may improve program acceptability, quality, productivity and timeliness of reports. This article describes program design decisions based on continuous quality assessment efforts. It also examines the impact of the Computer Assessment and Risk Reduction Education computer tool on HIV testing rates, staff perception of counseling quality, program productivity, and on the timeliness of evaluation reports. The article concludes with a discussion of implications for programmatic responses to the Centers for Disease Control and Prevention’s HIV testing recommendations. PMID:21689041

  16. DTREEv2, a computer-based support system for the risk assessment of genetically modified plants.

    PubMed

    Pertry, Ine; Nothegger, Clemens; Sweet, Jeremy; Kuiper, Harry; Davies, Howard; Iserentant, Dirk; Hull, Roger; Mezzetti, Bruno; Messens, Kathy; De Loose, Marc; de Oliveira, Dulce; Burssens, Sylvia; Gheysen, Godelieve; Tzotzos, George

    2014-03-25

    Risk assessment of genetically modified organisms (GMOs) remains a contentious area and a major factor influencing the adoption of agricultural biotech. Methodologically, in many countries, risk assessment is conducted by expert committees with little or no recourse to databases and expert systems that can facilitate the risk assessment process. In this paper we describe DTREEv2, a computer-based decision support system for the identification of hazards related to the introduction of GM-crops into the environment. DTREEv2 structures hazard identification and evaluation by means of an Event-Tree type of analysis. The system produces an output flagging identified hazards and potential risks. It is intended to be used for the preparation and evaluation of biosafety dossiers and, as such, its usefulness extends to researchers, risk assessors and regulators in government and industry. Copyright © 2013 Elsevier B.V. All rights reserved.
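
    The event-tree style of hazard identification can be sketched as a tree of yes/no questions whose branches either flag a potential hazard or terminate without a flag. The questions and flags below are illustrative only and are not taken from DTREEv2.

      # Minimal sketch of an event-tree style hazard screen: each node is a yes/no
      # question, and paths through the tree either flag a potential hazard or end
      # without a flag.
      tree = {
          "question": "Is the trait novel to the receiving environment?",
          "yes": {
              "question": "Could the trait increase persistence or invasiveness?",
              "yes": "FLAG: potential weediness hazard - assess further",
              "no": "no hazard flagged on this branch",
          },
          "no": "no hazard flagged on this branch",
      }

      def walk(node, answers):
          # answers maps question text to True/False for this dossier.
          while isinstance(node, dict):
              node = node["yes"] if answers[node["question"]] else node["no"]
          return node

      dossier = {
          "Is the trait novel to the receiving environment?": True,
          "Could the trait increase persistence or invasiveness?": True,
      }
      print(walk(tree, dossier))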

  17. Can new passenger cars reduce pedestrian lower extremity injury? A review of geometrical changes of front-end design before and after regulatory efforts.

    PubMed

    Nie, Bingbing; Zhou, Qing

    2016-10-02

    Pedestrian lower extremity represents the most frequently injured body region in car-to-pedestrian accidents. The European Directive concerning pedestrian safety was established in 2003 for evaluating pedestrian protection performance of car models. However, design changes have not been quantified since then. The goal of this study was to investigate front-end profiles of representative passenger car models and the potential influence on pedestrian lower extremity injury risk. The front-end styling of sedans and sport utility vehicles (SUV) released from 2008 to 2011 was characterized by the geometrical parameters related to pedestrian safety and compared to representative car models before 2003. The influence of geometrical design change on the resultant risk of injury to pedestrian lower extremity-that is, knee ligament rupture and long bone fracture-was estimated by a previously developed assessment tool assuming identical structural stiffness. Based on response surface generated from simulation results of a human body model (HBM), the tool provided kinematic and kinetic responses of pedestrian lower extremity resulted from a given car's front-end design. Newer passenger cars exhibited a "flatter" front-end design. The median value of the sedan models provided 87.5 mm less bottom depth, and the SUV models exhibited 94.7 mm less bottom depth. In the lateral impact configuration similar to that in the regulatory test methods, these geometrical changes tend to reduce the injury risk of human knee ligament rupture by 36.6 and 39.6% based on computational approximation. The geometrical changes did not significantly influence the long bone fracture risk. The present study reviewed the geometrical changes in car front-ends along with regulatory concerns regarding pedestrian safety. A preliminary quantitative benefit of the lower extremity injury reduction was estimated based on these geometrical features. Further investigation is recommended on the structural changes and inclusion of more accident scenarios.

  18. Suicide Risk by Military Occupation in the DoD Active Component Population

    ERIC Educational Resources Information Center

    Trofimovich, Lily; Reger, Mark A.; Luxton, David D.; Oetjen-Gerdes, Lynne A.

    2013-01-01

    Suicide risk based on occupational cohorts within the U.S. military was investigated. Rates of suicide based on military occupational categories were computed for the Department of Defense (DoD) active component population between 2001 and 2010. The combined infantry, gun crews, and seamanship specialist group was at increased risk of suicide…

  19. Using the Clinical Interview and Curriculum Based Measurement to Examine Risk Levels

    ERIC Educational Resources Information Center

    Ginsburg, Herbert P.; Lee, Young-Sun; Pappas, Sandra

    2016-01-01

    This paper investigates the power of the computer guided clinical interview (CI) and new curriculum based measurement (CBM) measures to identify and help children at risk of low mathematics achievement. We use data from large numbers of children in Kindergarten through Grade 3 to investigate the construct validity of CBM risk categories. The basic…

  20. Learning with Artificial Worlds: Computer-Based Modelling in the Curriculum.

    ERIC Educational Resources Information Center

    Mellar, Harvey, Ed.; And Others

    With the advent of the British National Curriculum, computer-based modeling has become an integral part of the school curriculum. This book is about modeling in education and providing children with computer tools to create and explore representations of the world. Members of the London Mental Models Group contributed their research: (1)…

  1. Privacy-preserving genomic testing in the clinic: a model using HIV treatment.

    PubMed

    McLaren, Paul J; Raisaro, Jean Louis; Aouri, Manel; Rotger, Margalida; Ayday, Erman; Bartha, István; Delgado, Maria B; Vallet, Yannick; Günthard, Huldrych F; Cavassini, Matthias; Furrer, Hansjakob; Doco-Lecompte, Thanh; Marzolini, Catia; Schmid, Patrick; Di Benedetto, Caroline; Decosterd, Laurent A; Fellay, Jacques; Hubaux, Jean-Pierre; Telenti, Amalio

    2016-08-01

    The implementation of genomic-based medicine is hindered by unresolved questions regarding data privacy and delivery of interpreted results to health-care practitioners. We used DNA-based prediction of HIV-related outcomes as a model to explore critical issues in clinical genomics. We genotyped 4,149 markers in HIV-positive individuals. Variants allowed for prediction of 17 traits relevant to HIV medical care, inference of patient ancestry, and imputation of human leukocyte antigen (HLA) types. Genetic data were processed under a privacy-preserving framework using homomorphic encryption, and clinical reports describing potentially actionable results were delivered to health-care providers. A total of 230 patients were included in the study. We demonstrated the feasibility of encrypting a large number of genetic markers, inferring patient ancestry, computing monogenic and polygenic trait risks, and reporting results under privacy-preserving conditions. The average execution time of a multimarker test on encrypted data was 865 ms on a standard computer. The proportion of tests returning potentially actionable genetic results ranged from 0 to 54%. The model of implementation presented herein informs on strategies to deliver genomic test results for clinical care. Data encryption to ensure privacy helps to build patient trust, a key requirement on the road to genomic-based medicine. Genet Med 18(8):814-822.
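
    The polygenic computations reported are, at their core, weighted sums of allele dosages, which is why an additively homomorphic scheme can evaluate them on ciphertexts. The sketch below shows the computation in the clear with entirely hypothetical markers and weights; it does not reproduce the study's encryption framework.

      # Sketch of the kind of polygenic computation the study performs, shown here in
      # the clear; because the score is a weighted sum of allele dosages, it is the
      # sort of operation an additively homomorphic scheme can evaluate on ciphertexts.
      # Marker names and weights below are entirely hypothetical.
      dosages = {"rsA": 1, "rsB": 0, "rsC": 2}            # 0/1/2 copies of the risk allele
      weights = {"rsA": 0.21, "rsB": -0.08, "rsC": 0.35}  # assumed per-allele effect sizes

      score = sum(weights[m] * dosages[m] for m in dosages)
      print(f"polygenic trait score = {score:.2f}")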

  2. Towards personalised management of atherosclerosis via computational models in vascular clinics: technology based on patient-specific simulation approach

    PubMed Central

    Di Tomaso, Giulia; Agu, Obiekezie; Pichardo-Almarza, Cesar

    2014-01-01

    The development of a new technology based on patient-specific modelling for personalised healthcare in the case of atherosclerosis is presented. Atherosclerosis is the main cause of death in the world and it has become a burden on clinical services as it manifests itself in many diverse forms, such as coronary artery disease, cerebrovascular disease/stroke and peripheral arterial disease. It is also a multifactorial, chronic and systemic process that lasts for a lifetime, putting enormous financial and clinical pressure on national health systems. In this Letter, the postulate is that the development of new technologies for healthcare using computer simulations can, in the future, be developed as in-silico management and support systems. These new technologies will be based on predictive models (including the integration of observations, theories and predictions across a range of temporal and spatial scales, scientific disciplines, key risk factors and anatomical sub-systems) combined with digital patient data and visualisation tools. Although the problem is extremely complex, a simulation workflow and an exemplar application of this type of technology for clinical use is presented, which is currently being developed by a multidisciplinary team following the requirements and constraints of the Vascular Service Unit at the University College Hospital, London. PMID:26609369

  3. Algorithms and analyses for stochastic optimization for turbofan noise reduction using parallel reduced-order modeling

    NASA Astrophysics Data System (ADS)

    Yang, Huanhuan; Gunzburger, Max

    2017-06-01

    Simulation-based optimization of acoustic liner design in a turbofan engine nacelle for noise reduction purposes can dramatically reduce the cost and time needed for experimental designs. Because uncertainties are inevitable in the design process, a stochastic optimization algorithm is posed based on the conditional value-at-risk measure so that an ideal acoustic liner impedance is determined that is robust in the presence of uncertainties. A parallel reduced-order modeling framework is developed that dramatically improves the computational efficiency of the stochastic optimization solver for a realistic nacelle geometry. The reduced stochastic optimization solver takes less than 500 seconds to execute. In addition, well-posedness and finite element error analyses of the state system and optimization problem are provided.
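
    The conditional value-at-risk measure referred to above can be sketched directly: CVaR at level alpha is the mean of the worst (1 - alpha) fraction of outcomes. The noise samples below are synthetic placeholders.

      import numpy as np

      # CVaR_alpha of a "higher is worse" quantity: the mean of the worst (1 - alpha)
      # fraction of sampled outcomes.
      def cvar(samples, alpha=0.9):
          samples = np.sort(np.asarray(samples, dtype=float))
          tail_start = int(np.ceil(alpha * samples.size))
          return samples[tail_start:].mean()

      rng = np.random.default_rng(0)
      # Hypothetical noise-level samples (dB) under uncertain operating conditions.
      noise = rng.normal(loc=80.0, scale=3.0, size=10_000)
      print(f"mean = {noise.mean():.1f} dB, CVaR_0.9 = {cvar(noise, 0.9):.1f} dB")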

  4. Number of Psychosocial Strengths Predicts Reduced HIV Sexual Risk Behaviors Above and Beyond Syndemic Problems Among Gay and Bisexual Men.

    PubMed

    Hart, Trevor A; Noor, Syed W; Adam, Barry D; Vernon, Julia R G; Brennan, David J; Gardner, Sandra; Husbands, Winston; Myers, Ted

    2017-10-01

    Syndemics research shows the additive effect of psychosocial problems on high-risk sexual behavior among gay and bisexual men (GBM). Psychosocial strengths may predict less engagement in high-risk sexual behavior. In a study of 470 ethnically diverse HIV-negative GBM, regression models were computed using number of syndemic psychosocial problems, number of psychosocial strengths, and serodiscordant condomless anal sex (CAS). The number of syndemic psychosocial problems correlated with serodiscordant CAS (RR = 1.51, 95% CI 1.18-1.92; p = 0.001). When adding the number of psychosocial strengths to the model, the effect of syndemic psychosocial problems became non-significant, but the number of strengths-based factors remained significant (RR = 0.67, 95% CI 0.53-0.86; p = 0.002). Psychosocial strengths may operate additively in the same way as syndemic psychosocial problems, but in the opposite direction. Consistent with theories of resilience, psychosocial strengths may be an important set of variables predicting sexual risk behavior that is largely missing from the current HIV behavioral literature.

  5. Evaluating sufficient similarity for drinking-water disinfection by-product (DBP) mixtures with bootstrap hypothesis test procedures.

    PubMed

    Feder, Paul I; Ma, Zhenxu J; Bull, Richard J; Teuschler, Linda K; Rice, Glenn

    2009-01-01

    In chemical mixtures risk assessment, the use of dose-response data developed for one mixture to estimate risk posed by a second mixture depends on whether the two mixtures are sufficiently similar. While evaluations of similarity may be made using qualitative judgments, this article uses nonparametric statistical methods based on the "bootstrap" resampling technique to address the question of similarity among mixtures of chemical disinfectant by-products (DBP) in drinking water. The bootstrap resampling technique is a general-purpose, computer-intensive approach to statistical inference that substitutes empirical sampling for theoretically based parametric mathematical modeling. Nonparametric, bootstrap-based inference involves fewer assumptions than parametric normal theory based inference. The bootstrap procedure is appropriate, at least in an asymptotic sense, whether or not the parametric, distributional assumptions hold, even approximately. The statistical analysis procedures in this article are initially illustrated with data from 5 water treatment plants (Schenck et al., 2009), and then extended using data developed from a study of 35 drinking-water utilities (U.S. EPA/AMWA, 1989), which permits inclusion of a greater number of water constituents and increased structure in the statistical models.
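
    A minimal sketch of the bootstrap idea applied to mixture comparison: resample each mixture's measured constituent profile and examine the resulting distribution of a dissimilarity statistic. The profiles and the statistic below are illustrative assumptions, not the article's actual procedures.

      import numpy as np

      rng = np.random.default_rng(0)

      # Mean absolute difference of constituent proportions as a simple dissimilarity.
      def dissimilarity(a, b):
          return np.mean(np.abs(a.mean(axis=0) - b.mean(axis=0)))

      def bootstrap_dist(a, b, n_boot=5000):
          stats = np.empty(n_boot)
          for i in range(n_boot):
              a_star = a[rng.integers(0, len(a), len(a))]   # resample rows with replacement
              b_star = b[rng.integers(0, len(b), len(b))]
              stats[i] = dissimilarity(a_star, b_star)
          return stats

      # Hypothetical DBP proportion profiles from two treatment plants (rows = samples).
      plant_1 = rng.dirichlet([5, 3, 2, 1], size=30)
      plant_2 = rng.dirichlet([5, 3, 2, 1], size=30)

      boot = bootstrap_dist(plant_1, plant_2)
      print(f"observed dissimilarity = {dissimilarity(plant_1, plant_2):.3f}")
      print(f"95% bootstrap interval = ({np.percentile(boot, 2.5):.3f}, {np.percentile(boot, 97.5):.3f})")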

  6. The degrees to which transtrochanteric rotational osteotomy moves the region of osteonecrotic femoral head out of the weight-bearing area as evaluated by computer simulation.

    PubMed

    Chen, Weng-Pin; Tai, Ching-Lung; Tan, Chih-Feng; Shih, Chun-Hsiung; Hou, Shun-Hsin; Lee, Mel S

    2005-01-01

    Transtrochanteric rotational osteotomy is a technically demanding procedure. Currently, pre-operative planning for the osteotomy is based mostly on X-ray images, so surgeons must mentally reconstruct the three-dimensional structure of the femoral head and the necrotic lesion. This study develops a simulation platform using computer models based on computed tomography images of the femoral head to evaluate the degree to which transtrochanteric rotational osteotomy moves the osteonecrotic region out of the weight-bearing area under stance and gait-cycle conditions. With this simulation procedure, surgeons can be better informed before surgery and the indication can be assessed more carefully. A case with osteonecrosis involving 15% of the femoral head was recruited, and virtual models with a lesion of the same size but at different locations were devised. Computer models were created using SolidWorks 2000 CAD software. The area ratio of the weight-bearing zone occupied by the necrotic lesion under two conditions, stance and gait cycle, was measured after surgical simulations. For the specific case and virtual models devised in this study, the computer simulation showed two findings: (1) Fewer degrees of rotation were needed to move the necrosis out of the weight-bearing zone in stance with anterior rotational osteotomy than with posterior rotational osteotomy; however, the necrotic region still overlapped the weight-bearing area during the gait cycle. (2) Because posterior rotation is less restricted than anterior rotation, posterior rotational osteotomies were often more effective at moving the necrotic region out of the weight-bearing area during the gait cycle. The computer simulation platform, which registers actual CT images, is a useful tool for assessing the direction and degrees of rotation needed for transtrochanteric rotational osteotomy. Although anterior rotational osteotomy was more effective at moving the necrosis out of the weight-bearing zone in stance for the models devised in this study, for lesions at other locations, and given that anterior rotation is inherently limited by the risk of vascular compromise, posterior rotational osteotomy may be more beneficial once the gait cycle is taken into account.
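
    The overlap calculation at the heart of such a simulation can be sketched with simple geometry: treat the femoral head as a sphere, the lesion and the weight-bearing zone as spherical caps, rotate the lesion, and measure how much of the weight-bearing zone it still occupies. The cap sizes, axes and angles below are arbitrary illustrations, not values registered from CT images.

```python
import numpy as np

rng = np.random.default_rng(3)

# Sample points on a unit sphere standing in for the femoral head surface.
pts = rng.normal(size=(100_000, 3))
pts /= np.linalg.norm(pts, axis=1, keepdims=True)

def cap_mask(points, center, half_angle_deg):
    """Points lying inside a spherical cap (angular radius) around `center`."""
    c = np.asarray(center, float)
    c /= np.linalg.norm(c)
    return points @ c >= np.cos(np.radians(half_angle_deg))

weight_bearing = cap_mask(pts, center=[0, 0, 1], half_angle_deg=40)   # superior zone

def rotation_x(deg):
    """Rotation about the x (anterior-posterior-like) axis."""
    a = np.radians(deg)
    return np.array([[1, 0, 0],
                     [0, np.cos(a), -np.sin(a)],
                     [0, np.sin(a),  np.cos(a)]])

# Anterosuperior lesion covering roughly 15% of the head surface (half-angle ~45 deg).
lesion_center = np.array([0.0, 0.5, 0.87])

for rotation_deg in (0, 30, 60, 90, 120):
    rotated_center = rotation_x(rotation_deg) @ lesion_center
    lesion = cap_mask(pts, rotated_center, half_angle_deg=45)
    overlap = np.mean(lesion & weight_bearing) / np.mean(weight_bearing)
    print(f"rotation {rotation_deg:3d} deg -> lesion occupies "
          f"{100 * overlap:.1f}% of the weight-bearing zone")
```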

  7. An intelligent and secure system for predicting and preventing Zika virus outbreak using Fog computing

    NASA Astrophysics Data System (ADS)

    Sareen, Sanjay; Gupta, Sunil Kumar; Sood, Sandeep K.

    2017-10-01

    Zika virus is a mosquito-borne pathogen that spreads very quickly in different parts of the world. In this article, we propose a system to prevent and control the spread of Zika virus disease by integrating Fog computing, cloud computing, mobile phones and Internet of things (IoT)-based sensor devices. Fog computing is used as an intermediary layer between the cloud and end users to reduce the latency and extra communication cost that are usually high in cloud-based systems. A fuzzy k-nearest neighbour classifier is used to diagnose possibly infected users, and the Google Maps web service is used to provide geographic positioning system (GPS)-based risk assessment to prevent an outbreak. Each Zika virus (ZikaV)-infected user, mosquito-dense site and breeding site is represented on the map, which helps government healthcare authorities control such risk-prone areas effectively and efficiently. The proposed system is deployed on the Amazon EC2 cloud and evaluated for performance and accuracy using a data set of 2 million users. Our system provides a high accuracy of 94.5% for the initial diagnosis of users according to their symptoms, along with appropriate GPS-based risk assessment.
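
    The classifier named above can be illustrated with a minimal fuzzy k-nearest-neighbour implementation (in the style of Keller's fuzzy k-NN) on synthetic symptom vectors; the features, labels and membership exponent are assumptions, and the fog/cloud deployment is not modelled.

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic symptom scores (fever, rash, joint pain, conjunctivitis) for training
# users labelled 1 = possibly Zika-infected, 0 = uninfected.
X_train = np.vstack([rng.normal(0.7, 0.15, size=(60, 4)),
                     rng.normal(0.3, 0.15, size=(60, 4))])
y_train = np.array([1] * 60 + [0] * 60)

def fuzzy_knn(x, X, y, k=5, m=2.0):
    """Return class membership degrees for query x (Keller-style fuzzy k-NN)."""
    d = np.linalg.norm(X - x, axis=1)
    nn = np.argsort(d)[:k]
    w = 1.0 / np.maximum(d[nn], 1e-9) ** (2.0 / (m - 1.0))   # inverse-distance weights
    memberships = {}
    for c in np.unique(y):
        memberships[int(c)] = np.sum(w * (y[nn] == c)) / np.sum(w)
    return memberships

query = np.array([0.65, 0.6, 0.7, 0.55])   # a new user's symptom vector
mu = fuzzy_knn(query, X_train, y_train)
print("membership degrees:", mu)
print("diagnosis:", "possibly infected" if mu[1] > mu[0] else "unlikely infected")
```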

  8. Assessment of the risk due to release of carbon fiber in civil aircraft accidents, phase 2

    NASA Technical Reports Server (NTRS)

    Pocinki, L.; Cornell, M. E.; Kaplan, L.

    1980-01-01

    The risk associated with the potential use of carbon fiber composite material in commercial jet aircraft is investigated. A simulation model developed to generate risk profiles for several airports is described. The risk profiles show the probability that the cost due to accidents in any year exceeds a given amount. The computer model simulates aircraft accidents with fire, release of fibers, their downwind transport and infiltration of buildings, equipment failures, and the resulting economic impact. The individual airport results were combined to yield the national risk profile.
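
    The risk-profile construction described above amounts to Monte Carlo sampling of annual accident costs and reading off exceedance probabilities. The sketch below shows that mechanic with an assumed Poisson accident rate and a lognormal per-accident cost; none of the numbers come from the NASA study.

```python
import numpy as np

rng = np.random.default_rng(5)

n_years = 100_000          # simulated years
accident_rate = 0.3        # mean fire accidents with fiber release per year (assumed)

# Cost of a single accident (USD): a lognormal stands in for the downwind transport,
# building infiltration and equipment-failure cost modeling in the full simulation.
def accident_cost(n):
    return rng.lognormal(mean=11.0, sigma=1.2, size=n)

annual_cost = np.zeros(n_years)
n_accidents = rng.poisson(accident_rate, size=n_years)
for year in range(n_years):
    if n_accidents[year]:
        annual_cost[year] = accident_cost(n_accidents[year]).sum()

# Risk profile: probability that the annual cost exceeds a given amount.
for threshold in (1e4, 1e5, 1e6, 1e7):
    p_exceed = np.mean(annual_cost > threshold)
    print(f"P(annual cost > ${threshold:,.0f}) = {p_exceed:.4f}")
```

    Plotting the exceedance probability against the threshold gives the risk profile for one airport; summing simulated airport costs year by year before thresholding gives the national profile.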

  9. Combining numerical simulations with time-domain random walk for pathogen risk assessment in groundwater

    NASA Astrophysics Data System (ADS)

    Cvetkovic, V.; Molin, S.

    2012-02-01

    We present a methodology that combines numerical simulations of groundwater flow and advective transport in heterogeneous porous media with analytical retention models for computing the infection risk probability from pathogens in aquifers. The methodology builds on the analytical results presented in [1,2] for utilising colloid filtration theory in a time-domain random walk (TDRW) framework. In uniform flow, the numerical simulations of advection yield results comparable to those of the analytical TDRW model for generating advection segments. Spatial variability of the attachment rate may be significant; however, it appears to affect risk differently depending on whether the flow is uniform or radially converging. Although numerous issues remain open regarding pathogen transport in aquifers at the field scale, the methodology presented here may be useful for screening purposes and may also serve as a basis for future studies that include greater complexity.
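
    A stripped-down version of the combination described above: travel times are drawn as sums of advective segment times (the time-domain random walk idea), colloid-filtration attachment is applied as exponential attenuation over that time, and an exponential dose-response model converts the surviving dose into infection risk. Every parameter value below is assumed for illustration.

```python
import numpy as np

rng = np.random.default_rng(6)

n_particles = 20_000       # pathogen "particles" released at the source
n_segments = 25            # advective segments between source and the well
mean_tau = 2.0             # mean travel time per segment, days (assumed)
k_a = 0.35                 # attachment (filtration) rate, 1/day (assumed)
r = 0.02                   # exponential dose-response parameter (assumed)
dose_scale = 50.0          # pathogens ingested per surviving particle (assumed)

# TDRW idea: total travel time is a sum of independent segment transition times.
# Exponential segment times stand in for heterogeneity-derived distributions.
travel_time = rng.exponential(mean_tau, size=(n_particles, n_segments)).sum(axis=1)

# Colloid filtration as a simple time-based attenuation: the probability a pathogen
# is never attached decays exponentially with its residence time.
survival = np.exp(-k_a * travel_time)
expected_dose = dose_scale * survival.mean()

# Exponential dose-response model for infection risk.
p_infection = 1.0 - np.exp(-r * expected_dose)
print(f"mean travel time = {travel_time.mean():.1f} d, "
      f"surviving fraction = {survival.mean():.3e}, "
      f"infection risk = {p_infection:.3e}")
```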

  10. NETWORK ASSISTED ANALYSIS TO REVEAL THE GENETIC BASIS OF AUTISM

    PubMed Central

    Liu, Li; Lei, Jing; Roeder, Kathryn

    2016-01-01

    While studies show that autism is highly heritable, the nature of the genetic basis of this disorder remains elusive. Based on the idea that highly correlated genes are functionally interrelated and more likely to affect risk, we develop a novel statistical tool to identify additional potential autism risk genes by combining genetic association scores with gene co-expression in specific brain regions and periods of development. The gene dependence network is estimated using a novel partial neighborhood selection (PNS) algorithm, in which node-specific properties are incorporated into network estimation for improved statistical and computational efficiency. A hidden Markov random field (HMRF) model is then adopted to combine the estimated network and the genetic association scores in a systematic manner. The proposed modeling framework can be naturally extended to incorporate additional structural information concerning the dependence between genes. Using currently available genetic association data from whole exome sequencing studies and brain gene expression levels, the proposed algorithm successfully identified 333 genes that plausibly affect autism risk. PMID:27134692
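
    The paper's PNS algorithm and HMRF step are specific to it, but the general notion of estimating a gene network by node-wise sparse regression can be sketched as below: standard Meinshausen-Buhlmann neighborhood selection on synthetic expression data, without the node-specific weighting that the PNS algorithm adds.

```python
import numpy as np
from sklearn.linear_model import LassoCV

rng = np.random.default_rng(11)

# Synthetic expression matrix: samples x genes, with a few correlated gene pairs.
n_samples, n_genes = 120, 30
X = rng.normal(size=(n_samples, n_genes))
X[:, 1] += 0.8 * X[:, 0]          # gene 1 co-expressed with gene 0
X[:, 5] += 0.7 * X[:, 4]          # gene 5 co-expressed with gene 4
X = (X - X.mean(0)) / X.std(0)

# Node-wise lasso neighborhood selection: regress each gene on all others and
# connect it to the genes receiving nonzero coefficients.
edges = set()
for j in range(n_genes):
    others = np.delete(np.arange(n_genes), j)
    fit = LassoCV(cv=5).fit(X[:, others], X[:, j])
    for k, coef in zip(others, fit.coef_):
        if abs(coef) > 1e-3:
            edges.add(tuple(sorted((j, int(k)))))

print(f"estimated {len(edges)} edges; sample:", sorted(edges)[:5])

# In the paper, the estimated network is combined with gene-level association
# scores via a hidden Markov random field; here we only list the neighbors of a
# hypothetical risk gene (gene 0) as its candidate partners.
neighbors_of_0 = sorted({a if b == 0 else b for a, b in edges if 0 in (a, b)})
print("genes linked to gene 0:", neighbors_of_0)
```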

  11. Implications of Nine Risk Prediction Models for Selecting Ever-Smokers for Computed Tomography Lung Cancer Screening.

    PubMed

    Katki, Hormuzd A; Kovalchik, Stephanie A; Petito, Lucia C; Cheung, Li C; Jacobs, Eric; Jemal, Ahmedin; Berg, Christine D; Chaturvedi, Anil K

    2018-05-15

    Lung cancer screening guidelines recommend using individualized risk models to refer ever-smokers for screening. However, different models select different screening populations, and the performance of each model in selecting ever-smokers for screening is unknown. The objective was to compare the U.S. screening populations selected by 9 lung cancer risk models (the Bach model; the Spitz model; the Liverpool Lung Project [LLP] model; the LLP Incidence Risk Model [LLPi]; the Hoggart model; the Prostate, Lung, Colorectal, and Ovarian Cancer Screening Trial Model 2012 [PLCOM2012]; the Pittsburgh Predictor; the Lung Cancer Risk Assessment Tool [LCRAT]; and the Lung Cancer Death Risk Assessment Tool [LCDRAT]) and to examine their predictive performance in 2 cohorts. The design comprised population-based prospective studies in the United States. Models selected U.S. screening populations by using data from the National Health Interview Survey from 2010 to 2012. Model performance was evaluated using data from 337 388 ever-smokers in the National Institutes of Health-AARP Diet and Health Study and 72 338 ever-smokers in the CPS-II (Cancer Prevention Study II) Nutrition Survey cohort. Measurements were model calibration (ratio of model-predicted to observed cases [expected-observed ratio]) and discrimination (area under the curve [AUC]). At a 5-year risk threshold of 2.0%, the models chose U.S. screening populations ranging from 7.6 million to 26 million ever-smokers. These disagreements occurred because, in both validation cohorts, 4 models (the Bach model, PLCOM2012, LCRAT, and LCDRAT) were well-calibrated (expected-observed ratio range, 0.92 to 1.12) and had higher AUCs (range, 0.75 to 0.79) than 5 models that generally overestimated risk (expected-observed ratio range, 0.83 to 3.69) and had lower AUCs (range, 0.62 to 0.75). The 4 best-performing models also had the highest sensitivity at a fixed specificity (and vice versa) and similar discrimination at a fixed risk threshold. These models showed better agreement on the size of the screening population (7.6 million to 10.9 million) and achieved consensus on 73% of persons chosen. A limitation is the lack of consensus on risk thresholds for screening. In conclusion, the 9 lung cancer risk models chose widely differing U.S. screening populations; however, 4 models (the Bach model, PLCOM2012, LCRAT, and LCDRAT) most accurately predicted risk and performed best in selecting ever-smokers for screening. Funding was provided by the Intramural Research Program of the National Institutes of Health/National Cancer Institute.
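
    The two validation metrics reported above, calibration as an expected-observed ratio and discrimination as AUC, are straightforward to compute once predicted risks and outcomes are in hand. The sketch below does so for two hypothetical models on simulated ever-smoker data; the numbers have no connection to the nine published models.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(7)
n = 50_000

# Synthetic "true" 5-year risk for a cohort of ever-smokers, and outcomes drawn
# from that risk.
true_risk = np.clip(rng.beta(1.2, 60, size=n), 0, 0.5)
outcome = rng.binomial(1, true_risk)

# Two hypothetical risk models: one well calibrated, one that overestimates risk.
pred_good = np.clip(true_risk * np.exp(rng.normal(0, 0.3, n)), 1e-5, 0.9)
pred_over = np.clip(2.5 * pred_good, 1e-5, 0.95)

def evaluate(pred, y, threshold=0.02):
    eo = pred.sum() / y.sum()                 # expected/observed ratio (calibration)
    auc = roc_auc_score(y, pred)              # discrimination
    selected = np.mean(pred >= threshold)     # fraction referred to screening
    return eo, auc, selected

for name, pred in [("well-calibrated", pred_good), ("overestimating", pred_over)]:
    eo, auc, frac = evaluate(pred, outcome)
    print(f"{name:16s}  E/O = {eo:.2f}  AUC = {auc:.3f}  "
          f"selected at 2% threshold: {100 * frac:.1f}%")
```

    An overestimating model inflates the expected-observed ratio and, at a fixed risk threshold, refers a much larger population to screening even when its ranking of individuals (AUC) is unchanged, which is the mechanism behind the disagreements reported above.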

  12. Simulation based planning of surgical interventions in pediatric cardiology

    NASA Astrophysics Data System (ADS)

    Marsden, Alison L.

    2013-10-01

    Hemodynamics plays an essential role in the progression and treatment of cardiovascular disease. However, while medical imaging provides increasingly detailed anatomical information, clinicians often have limited access to hemodynamic data that may be crucial to patient risk assessment and treatment planning. Computational simulations can now provide detailed hemodynamic data to augment clinical knowledge in both adult and pediatric applications. There is a particular need for simulation tools in pediatric cardiology, due to the wide variation in anatomy and physiology in congenital heart disease patients, necessitating individualized treatment plans. Despite great strides in medical imaging, enabling extraction of flow information from magnetic resonance and ultrasound imaging, simulations offer predictive capabilities that imaging alone cannot provide. Patient specific simulations can be used for in silico testing of new surgical designs, treatment planning, device testing, and patient risk stratification. Furthermore, simulations can be performed at no direct risk to the patient. In this paper, we outline the current state of the art in methods for cardiovascular blood flow simulation and virtual surgery. We then step through pressing challenges in the field, including multiscale modeling, boundary condition selection, optimization, and uncertainty quantification. Finally, we summarize simulation results of two representative examples from pediatric cardiology: single ventricle physiology, and coronary aneurysms caused by Kawasaki disease. These examples illustrate the potential impact of computational modeling tools in the clinical setting.
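
    One concrete example of the "boundary condition selection" challenge mentioned above is coupling a flow domain to a reduced-order outflow model. The sketch below integrates a three-element Windkessel (RCR) boundary condition driven by a prescribed flow waveform; the parameter values and waveform are illustrative, not patient-specific.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Three-element Windkessel (RCR) parameters in CGS units (illustrative values).
Rp, C, Rd = 500.0, 4.0e-4, 6500.0   # proximal resistance, compliance, distal resistance
T = 1.0                              # cardiac period, s

def inflow(t):
    """Prescribed periodic flow (mL/s): systolic half-sine, zero in diastole."""
    tc = t % T
    return 80.0 * np.sin(np.pi * tc / 0.35) if tc < 0.35 else 0.0

def rcr(t, y):
    """State y[0] is the pressure across the compliance (distal pressure)."""
    pc = y[0]
    q = inflow(t)
    return [(q - pc / Rd) / C]

sol = solve_ivp(rcr, [0, 10 * T], [1.0e5], max_step=1e-3, dense_output=True)

# Outlet pressure seen by the upstream flow solver: P = Rp * Q + Pc.
t_last = np.linspace(9 * T, 10 * T, 400)        # last cycle, approximately periodic
pc = sol.sol(t_last)[0]
p_outlet = np.array([Rp * inflow(t) for t in t_last]) + pc
mmHg = 1333.22                                   # dyn/cm^2 per mmHg
print(f"outlet pressure over one cycle: {p_outlet.min()/mmHg:.1f}-"
      f"{p_outlet.max()/mmHg:.1f} mmHg (mean {p_outlet.mean()/mmHg:.1f})")
```

    In a multiscale simulation, the pressure returned by such a lumped model is applied at the 3D outlet at each time step, closing the loop between the detailed and reduced-order domains.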

  13. Computational identification of gene–social environment interaction at the human IL6 locus

    PubMed Central

    Cole, Steven W.; Arevalo, Jesusa M. G.; Takahashi, Rie; Sloan, Erica K.; Lutgendorf, Susan K.; Sood, Anil K.; Sheridan, John F.; Seeman, Teresa E.

    2010-01-01

    To identify genetic factors that interact with social environments to impact human health, we used a bioinformatic strategy that couples expression array–based detection of environmentally responsive transcription factors with in silico discovery of regulatory polymorphisms to predict genetic loci that modulate transcriptional responses to stressful environments. Tests of one predicted interaction locus in the human IL6 promoter (SNP rs1800795) verified that it modulates transcriptional response to β-adrenergic activation of the GATA1 transcription factor in vitro. In vivo validation studies confirmed links between adverse social conditions and increased transcription of GATA1 target genes in primary neural, immune, and cancer cells. Epidemiologic analyses verified the health significance of those molecular interactions by documenting increased 10-year mortality risk associated with late-life depressive symptoms that occurred solely for homozygous carriers of the GATA1-sensitive G allele of rs1800795. Gating of depression-related mortality risk by IL6 genotype pertained only to inflammation-related causes of death and was associated with increased chronic inflammation as indexed by plasma C-reactive protein. Computational modeling of molecular interactions, in vitro biochemical analyses, in vivo animal modeling, and human molecular epidemiologic analyses thus converge in identifying β-adrenergic activation of GATA1 as a molecular pathway by which social adversity can alter human health risk selectively depending on individual genetic status at the IL6 locus. PMID:20176930

  14. The Material Supply Adjustment Process in RAMF-SM, Step 2

    DTIC Science & Technology

    2016-06-01

    The Risk Assessment and Mitigation Framework for Strategic Materials (RAMF-SM) is a suite of mathematical models and databases used to support the ... and computes material shortfalls. Several mathematical models and dozens of databases, encompassing thousands of data items, support the ...

  15. Dynamic Modeling and Soil Mechanics for Path Planning of the Mars Exploration Rovers

    NASA Technical Reports Server (NTRS)

    Trease, Brian; Arvidson, Raymond; Lindemann, Randel; Bennett, Keith; Zhou, Feng; Iagnemma, Karl; Senatore, Carmine; Van Dyke, Lauren

    2011-01-01

    To help minimize the risk of high sinkage and slippage during drives and to better understand soil properties and rover terramechanics from drive data, a multidisciplinary team was formed under the Mars Exploration Rover (MER) project to develop and utilize dynamic computer-based models for rover drives over realistic terrains. The resulting tool, named ARTEMIS (Adams-based Rover Terramechanics and Mobility Interaction Simulator), consists of the dynamic model, a library of terramechanics subroutines, and high-resolution digital elevation maps of the Mars surface. A 200-element model of the rovers was developed and validated against drop tests before launch, using MSC-Adams dynamic modeling software. Newly modeled terrain-rover interactions include the rut-formation effect of deformable soils, using the classical Bekker-Wong implementation of compaction resistances and bulldozing effects. The paper presents the details and implementation of the model with two case studies based on actual MER telemetry data. In its final form, ARTEMIS will be used in a predictive manner to assess terrain navigability and will become part of the overall effort in path planning and navigation for both Martian and lunar rovers.
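
    The Bekker-Wong terramechanics mentioned above rests on the classical pressure-sinkage law p = (k_c/b + k_phi) z^n; from it follow the rigid-wheel sinkage and the compaction (rut-formation) resistance. The sketch below evaluates those textbook relations with illustrative soil and wheel parameters, not actual MER or ARTEMIS values.

```python
import numpy as np

# Classical Bekker terramechanics relations (rigid wheel on deformable soil).
# Soil and wheel parameters below are illustrative, not MER/ARTEMIS values.
k_c = 1.4e3      # cohesive modulus, N/m^(n+1)
k_phi = 8.2e5    # frictional modulus, N/m^(n+2)
n = 1.0          # sinkage exponent
b = 0.16         # wheel width, m
D = 0.26         # wheel diameter, m
W = 110.0        # vertical load per wheel, N (illustrative)

k_eq = k_c / b + k_phi

# Static sinkage of a rigid wheel (Bekker):
z0 = (3.0 * W / ((3.0 - n) * k_eq * b * np.sqrt(D))) ** (2.0 / (2.0 * n + 1.0))

# Compaction (rut-formation) resistance from the pressure-sinkage law p = k_eq * z^n:
R_c = b * k_eq * z0 ** (n + 1.0) / (n + 1.0)

print(f"predicted sinkage: {100 * z0:.2f} cm")
print(f"compaction resistance: {R_c:.1f} N ({100 * R_c / W:.1f}% of wheel load)")
```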

  16. Is Computer-Aided Instruction an Effective Tier-One Intervention for Kindergarten Students at Risk for Reading Failure in an Applied Setting?

    ERIC Educational Resources Information Center

    Kreskey, Donna DeVaughn; Truscott, Stephen D.

    2016-01-01

    This study investigated the use of computer-aided instruction (CAI) as an intervention for kindergarten students at risk for reading failure. Headsprout Early Reading (Headsprout 2005), a type of CAI, provides Internet-based reading instruction incorporating the critical components of reading instruction cited by the National Reading Panel (NRP…

  17. A poisson process model for hip fracture risk.

    PubMed

    Schechner, Zvi; Luo, Gangming; Kaufman, Jonathan J; Siffert, Robert S

    2010-08-01

    Current assessment of fracture risk in osteoporosis relies primarily on measurement of bone mass, and estimation of fracture risk is most often evaluated using logistic or proportional hazards models. Notwithstanding the success of these models, there is still much uncertainty as to who will or will not suffer a fracture. This has led to a search for other components besides mass that affect bone strength. The purpose of this paper is to introduce a new mechanistic stochastic model that characterizes the risk of hip fracture in an individual. A Poisson process is used to model the occurrence of falls, which are assumed to occur at a rate lambda. The load induced by a fall is assumed to be a random variable that has a Weibull probability distribution. The combination of falls together with loads leads to a compound Poisson process. By retaining only those occurrences of the compound Poisson process that result in a hip fracture, a thinned Poisson process is defined, which is itself a Poisson process. The fall rate is modeled as an affine function of age, and hip strength is modeled as a power law function of bone mineral density (BMD). The risk of hip fracture can then be computed as a function of age and BMD. By extending the analysis to a Bayesian framework, the conditional densities of BMD given a prior fracture and no prior fracture can be computed and shown to be consistent with clinical observations. In addition, the conditional probabilities of fracture given a prior fracture and no prior fracture can also be computed and demonstrate results similar to clinical data. The model elucidates the fact that the hip fracture process is inherently random and that improvements in hip strength estimation over and above that provided by BMD operate in a highly "noisy" environment and may therefore have little ability to impact clinical practice.
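
    The compound/thinned Poisson construction described above can be written down directly: falls arrive at an age-dependent rate, each fall delivers a Weibull-distributed load, and a fracture occurs when a load exceeds a BMD-dependent strength, so the fracture process is itself Poisson with a thinned rate. The sketch below compares the resulting closed-form risk with a direct Monte Carlo simulation; all coefficients are stylized assumptions, not the paper's fitted values.

```python
import numpy as np

rng = np.random.default_rng(8)

def fall_rate(age):
    """Falls per year, an affine function of age (assumed coefficients)."""
    return max(0.0, -1.0 + 0.025 * age)

def hip_strength(bmd):
    """Femoral strength in newtons as a power law of BMD (assumed coefficients)."""
    return 8.0e3 * bmd ** 1.6

# Weibull-distributed impact load from a fall (assumed shape/scale, newtons).
weibull_shape, weibull_scale = 1.5, 2.5e3

def fracture_risk(age, bmd, horizon_years=5.0):
    """Closed-form risk from the thinned Poisson process with rate lam * p_exceed."""
    lam = fall_rate(age)
    strength = hip_strength(bmd)
    p_exceed = np.exp(-(strength / weibull_scale) ** weibull_shape)  # Weibull tail
    return 1.0 - np.exp(-lam * p_exceed * horizon_years)

def fracture_risk_mc(age, bmd, horizon_years=5.0, n_sim=50_000):
    """Monte Carlo check: simulate falls and their loads directly."""
    n_falls = rng.poisson(fall_rate(age) * horizon_years, size=n_sim)
    strength = hip_strength(bmd)
    fractured = np.zeros(n_sim, dtype=bool)
    has_falls = n_falls > 0
    max_load = np.array([rng.weibull(weibull_shape, k).max() * weibull_scale
                         for k in n_falls[has_falls]])
    fractured[has_falls] = max_load > strength
    return fractured.mean()

for age, bmd in [(70, 0.95), (75, 0.85), (80, 0.75)]:
    print(f"age {age}, BMD {bmd}: 5-y risk = {fracture_risk(age, bmd):.3%} "
          f"(MC {fracture_risk_mc(age, bmd):.3%})")
```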

  18. Proposing a Compartmental Model for Leprosy and Parameterizing Using Regional Incidence in Brazil.

    PubMed

    Smith, Rebecca Lee

    2016-08-01

    Hansen's disease (HD), or leprosy, is still considered a public health risk in much of Brazil. Understanding the dynamics of the infection at a regional level can aid in identification of targets to improve control. A compartmental continuous-time model for leprosy dynamics was designed based on understanding of the biology of the infection. The transmission coefficients for the model and the rate of detection were fit for each region using Approximate Bayesian Computation applied to paucibacillary and multibacillary incidence data over the period of 2000 to 2010, and model fit was validated on incidence data from 2011 to 2012. Regional variation was noted in detection rate, with cases in the Midwest estimated to be infectious for 10 years prior to detection compared to 5 years for most other regions. Posterior predictions for the model estimated that elimination of leprosy as a public health risk would require, on average, 44-45 years in the three regions with the highest prevalence. The model is easily adaptable to other settings, and can be studied to determine the efficacy of improved case finding on leprosy control.
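
    Approximate Bayesian Computation of the kind used above can be illustrated with a toy two-compartment transmission model and rejection sampling: draw parameters from priors, simulate detected incidence, and keep the draws whose simulated series lies close to the observed one. The model, priors, tolerance and "observed" data below are all fabricated for illustration and are far simpler than the paper's leprosy model.

```python
import numpy as np

rng = np.random.default_rng(9)

def simulate_incidence(beta, detection_rate, years=11, pop=1_000_000):
    """Toy discrete-time susceptible-infectious model returning yearly detected cases."""
    s, i = pop - 200.0, 200.0
    detected = []
    for _ in range(years):
        new_inf = beta * s * i / pop
        found = detection_rate * i
        s -= new_inf
        i += new_inf - found
        detected.append(found)
    return np.array(detected)

# "Observed" incidence generated from known parameters plus noise (stands in for
# the regional 2000-2010 case counts used in the study).
observed = simulate_incidence(0.4, 0.25) * rng.normal(1.0, 0.05, 11)

# ABC rejection: sample parameters from priors, keep those whose simulated
# incidence is close to the observed series.
n_draws, tolerance, accepted = 50_000, 0.15, []
beta_prior = rng.uniform(0.1, 1.0, n_draws)
det_prior = rng.uniform(0.05, 0.6, n_draws)
for beta, det in zip(beta_prior, det_prior):
    sim = simulate_incidence(beta, det)
    distance = np.sqrt(np.mean((np.log1p(sim) - np.log1p(observed)) ** 2))
    if distance < tolerance:
        accepted.append((beta, det))

accepted = np.array(accepted)
print(f"accepted {len(accepted)} of {n_draws} draws")
print("posterior mean transmission coefficient:", accepted[:, 0].mean())
print("posterior mean detection rate:          ", accepted[:, 1].mean())
print("implied years infectious before detection:", 1.0 / accepted[:, 1].mean())
```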

  19. Information-computational system for storage, search and analytical processing of environmental datasets based on the Semantic Web technologies

    NASA Astrophysics Data System (ADS)

    Titov, A.; Gordov, E.; Okladnikov, I.

    2009-04-01

    In this report we present the results of work devoted to the development of a working model of a software system for the storage, semantically enabled search and retrieval, processing and visualization of environmental datasets containing results of meteorological and air-pollution observations and mathematical climate modeling. A specially designed metadata standard for the machine-readable description of datasets related to the meteorology, climate and atmospheric pollution transport domains is introduced as one of the key system components. To provide semantic interoperability, the Resource Description Framework (RDF, http://www.w3.org/RDF/) technology has been chosen to realize the metadata description model in the form of an RDF Schema. The final version of the RDF Schema is implemented on the basis of widely used standards, such as the Dublin Core Metadata Element Set (http://dublincore.org/), the Directory Interchange Format (DIF, http://gcmd.gsfc.nasa.gov/User/difguide/difman.html), ISO 19139, etc. At present the system is available as a Web server (http://climate.risks.scert.ru/metadatabase/) based on the web-portal ATMOS engine [1] and implements dataset management functionality, including SeRQL-based semantic search as well as statistical analysis and visualization of selected data archives [2,3]. The core of the system is the Apache web server in conjunction with the Tomcat Java Servlet Container (http://jakarta.apache.org/tomcat/) and the Sesame Server (http://www.openrdf.org/), used as a database for RDF and RDF Schema. Statistical analysis of meteorological and climatic data with subsequent visualization of results is currently implemented for such datasets as the NCEP/NCAR Reanalysis, the NCEP/DOE AMIP-II Reanalysis, JMA/CRIEPI JRA-25, ECMWF ERA-40 and local measurements obtained from meteorological stations on the territory of Russia; this functionality is aimed primarily at characterizing regional climate dynamics. The proposed system represents a step in the development of a distributed collaborative information-computational environment to support multidisciplinary investigations of the Earth's regional environment [4]. Partial support of this work by SB RAS Integration Project 34, SB RAS Basic Program Project 4.5.2.2, APN Project CBA2007-08NSY and the FP6 Enviro-RISKS project (INCO-CT-2004-013427) is acknowledged.

    References:
    1. Gordov E.P., Lykosov V.N., Fazliev A.Z. Web portal on environmental sciences "ATMOS" // Advances in Geosciences, Vol. 8, 2006, pp. 33-38.
    2. Gordov E.P., Okladnikov I.G., Titov A.G. Development of elements of web based information-computational system supporting regional environment processes investigations // Journal of Computational Technologies, Vol. 12, Special Issue #3, 2007, pp. 20-28.
    3. Okladnikov I.G., Titov A.G., Melnikova V.N., Shulgina T.M. Web-system for processing and visualization of meteorological and climatic data // Journal of Computational Technologies, Vol. 13, Special Issue #3, 2008, pp. 64-69.
    4. Gordov E.P., Lykosov V.N. Development of information-computational infrastructure for integrated study of Siberia environment // Journal of Computational Technologies, Vol. 12, Special Issue #2, 2007, pp. 19-30.
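
    The metadata layer described above can be mimicked in a few lines: dataset descriptions stored as RDF triples and retrieved with a structured query. The sketch uses rdflib and SPARQL in place of the Sesame/SeRQL stack named in the abstract, and the property names in the example namespace are placeholders, not the project's actual RDF Schema.

```python
from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import DC, RDF

# A tiny metadata graph describing one climate dataset. The ATMOS namespace and
# its properties are simplified placeholders for the system's RDF Schema.
ATMOS = Namespace("http://example.org/atmos#")
g = Graph()
ds = URIRef("http://example.org/datasets/ncep-ncar-reanalysis")

g.add((ds, RDF.type, ATMOS.Dataset))
g.add((ds, DC.title, Literal("NCEP/NCAR Reanalysis")))
g.add((ds, DC.subject, Literal("meteorology")))
g.add((ds, ATMOS.variable, Literal("air temperature")))
g.add((ds, ATMOS.region, Literal("Siberia")))

# Semantic search: find datasets that provide air temperature for Siberia.
query = """
PREFIX atmos: <http://example.org/atmos#>
PREFIX dc: <http://purl.org/dc/elements/1.1/>
SELECT ?title WHERE {
    ?d a atmos:Dataset ;
       dc:title ?title ;
       atmos:variable "air temperature" ;
       atmos:region "Siberia" .
}
"""
for row in g.query(query):
    print("matched dataset:", row.title)
```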

  20. Network Model-Assisted Inference from Respondent-Driven Sampling Data

    PubMed Central

    Gile, Krista J.; Handcock, Mark S.

    2015-01-01

    Summary Respondent-Driven Sampling is a widely-used method for sampling hard-to-reach human populations by link-tracing over their social networks. Inference from such data requires specialized techniques because the sampling process is both partially beyond the control of the researcher, and partially implicitly defined. Therefore, it is not generally possible to directly compute the sampling weights for traditional design-based inference, and likelihood inference requires modeling the complex sampling process. As an alternative, we introduce a model-assisted approach, resulting in a design-based estimator leveraging a working network model. We derive a new class of estimators for population means and a corresponding bootstrap standard error estimator. We demonstrate improved performance compared to existing estimators, including adjustment for an initial convenience sample. We also apply the method and an extension to the estimation of HIV prevalence in a high-risk population. PMID:26640328
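
    For contrast with the model-assisted estimator introduced above, the sketch below computes the standard design-based RDS estimate (the inverse-degree-weighted, Volz-Heckathorn form) with a naive resampling bootstrap on synthetic data; it ignores the recruitment-tree dependence that the paper's working-model and bootstrap machinery are designed to handle.

```python
import numpy as np

rng = np.random.default_rng(10)

# Hypothetical respondent-driven sample: each participant reports a network
# degree and an HIV status indicator.
n = 500
degree = rng.integers(1, 40, size=n)                          # self-reported network size
hiv = rng.binomial(1, np.clip(0.05 + 0.004 * degree, 0, 1))   # prevalence rises with degree

def rds_ii(y, d):
    """Inverse-degree-weighted (Volz-Heckathorn / RDS-II) prevalence estimator."""
    w = 1.0 / d
    return np.sum(w * y) / np.sum(w)

estimate = rds_ii(hiv, degree)

# Naive resampling bootstrap for a standard error (a simplification that treats
# recruits as independent draws).
boot = np.array([rds_ii(hiv[idx], degree[idx])
                 for idx in rng.integers(0, n, size=(2000, n))])
se = boot.std(ddof=1)

print(f"sample proportion (unweighted): {hiv.mean():.3f}")
print(f"RDS-II estimate: {estimate:.3f} (bootstrap SE {se:.3f})")
```

    Because higher-degree members are over-sampled by link-tracing, the unweighted proportion is biased toward their outcomes; the inverse-degree weights correct for that, which is the design-based idea the paper's network model then refines.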
