Science.gov

Sample records for proportional hazards models

  1. Proportional Hazards Models of Graduation

    ERIC Educational Resources Information Center

    Chimka, Justin R.; Reed-Rhoads, Teri; Barker, Kash

    2008-01-01

    Survival analysis is a statistical tool used to describe the duration between events. Many processes in medical research, engineering, and economics can be described using survival analysis techniques. This research involves studying engineering college student graduation using Cox proportional hazards models. Among male students with American…

  2. Proportional hazards models with discrete frailty.

    PubMed

    Caroni, Chrys; Crowder, Martin; Kimber, Alan

    2010-07-01

    We extend proportional hazards frailty models for lifetime data to allow a negative binomial, Poisson, geometric, or other discrete distribution of the frailty variable. This might represent, for example, the unknown number of flaws in an item under test. Zero frailty corresponds to a limited failure model containing a proportion of units that never fail (long-term survivors). Ways of modifying the model to avoid this are discussed. The models are illustrated on a previously published set of data on failures of printed circuit boards and on new data on breaking strengths of samples of cord.
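
    The construction behind this class of models can be written compactly (a standard frailty identity, not reproduced from the paper): the marginal survival function is the probability generating function of the frailty evaluated at the baseline survival term, and the long-term survivor fraction is the probability of zero frailty.

```latex
% Marginal survival under a discrete frailty Z with probability generating
% function G_Z(s) = E[s^Z], given the conditional hazard Z * lambda_0(t) * exp(beta'x).
\[
  S(t \mid x) \;=\; E_Z\!\left[\exp\{-Z\,H_0(t)\,e^{\beta^{\top}x}\}\right]
              \;=\; G_Z\!\left(e^{-H_0(t)\exp(\beta^{\top}x)}\right),
  \qquad H_0(t) = \int_0^t \lambda_0(u)\,du .
\]
% If H_0(t) -> infinity, then S(t | x) -> P(Z = 0): the long-term survivors of the abstract.
```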

  3. Goodness-of-fit test for proportional subdistribution hazards model.

    PubMed

    Zhou, Bingqing; Fine, Jason; Laird, Glen

    2013-09-30

    This paper concerns using modified weighted Schoenfeld residuals to test the proportionality of subdistribution hazards for the Fine-Gray model, similar to the tests proposed by Grambsch and Therneau for independently censored data. We develop a score test for the time-varying coefficients based on the modified Schoenfeld residuals derived assuming a certain form of non-proportionality. The methods perform well in simulations and a real data analysis of breast cancer data, where the treatment effect exhibits non-proportional hazards.
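
    The ordinary-Cox analogue of this check, the Grambsch-Therneau test based on scaled Schoenfeld residuals, is readily available in software. A minimal Python sketch using the lifelines package is shown below; it tests proportionality for a standard Cox model, not the Fine-Gray subdistribution model of the paper, and uses lifelines' bundled example data.

```python
# Grambsch-Therneau-style proportionality check for an ordinary Cox model
# (illustrative only; lifelines has no Fine-Gray subdistribution implementation).
from lifelines import CoxPHFitter
from lifelines.datasets import load_rossi
from lifelines.statistics import proportional_hazard_test

df = load_rossi()                                   # 'week' = time, 'arrest' = event indicator
cph = CoxPHFitter().fit(df, duration_col="week", event_col="arrest")

# Score-type test of time-varying coefficients based on scaled Schoenfeld residuals;
# a small p-value for a covariate suggests its effect is non-proportional.
result = proportional_hazard_test(cph, df, time_transform="rank")
result.print_summary()
```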

  4. Sample size calculation for the proportional hazards cure model.

    PubMed

    Wang, Songfeng; Zhang, Jiajia; Lu, Wenbin

    2012-12-20

    In clinical trials with time-to-event endpoints, it is not uncommon to see a significant proportion of patients being cured (or long-term survivors), such as trials for non-Hodgkin lymphoma. The popularly used sample size formula derived under the proportional hazards (PH) model may not be appropriate for designing a survival trial with a cure fraction, because the PH model assumption may be violated. To account for a cure fraction, the PH cure model is widely used in practice, where a PH model is used for survival times of uncured patients and a logistic distribution is used for the probability of patients being cured. In this paper, we develop a sample size formula on the basis of the PH cure model by investigating the asymptotic distributions of the standard weighted log-rank statistics under the null and local alternative hypotheses. The derived sample size formula under the PH cure model is more flexible because it can be used to test differences in the short-term survival and/or the cure fraction. Furthermore, we investigate, through numerical examples, the impact of accrual methods and of the durations of the accrual and follow-up periods on the sample size calculation. The results show that ignoring the cure rate in sample size calculation can lead to either underpowered or overpowered studies. We evaluate the performance of the proposed formula by simulation studies and provide an example to illustrate its application with the use of data from a melanoma trial. PMID:22786805
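
    For reference, the "popularly used" formula the abstract contrasts against is Schoenfeld's required-events calculation under an ordinary PH model. A sketch is given below (standard textbook formula, not the cure-model extension developed in the paper; the function names are ours).

```python
# Schoenfeld-type required number of events for a two-sided log-rank test under
# an ordinary PH model (no cure fraction); illustrative helper, not the paper's formula.
import math
from scipy.stats import norm


def required_events(hazard_ratio, alpha=0.05, power=0.80, allocation=0.5):
    """Events needed to detect `hazard_ratio` with the given two-sided alpha and power."""
    z_a, z_b = norm.ppf(1 - alpha / 2), norm.ppf(power)
    return (z_a + z_b) ** 2 / (allocation * (1 - allocation) * math.log(hazard_ratio) ** 2)


def required_subjects(hazard_ratio, event_probability, **kwargs):
    """Convert events to subjects given the overall probability of observing an event."""
    return required_events(hazard_ratio, **kwargs) / event_probability


print(math.ceil(required_events(0.70)))           # about 247 events
print(math.ceil(required_subjects(0.70, 0.60)))   # about 412 subjects if 60% have events
```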

  5. Proportional Hazards Model with Covariate Measurement Error and Instrumental Variables

    PubMed Central

    Song, Xiao; Wang, Ching-Yun

    2014-01-01

    In biomedical studies, covariates with measurement error may occur in survival data. Existing approaches mostly require certain replications of the error-contaminated covariates, which may not be available in the data. In this paper, we develop a simple nonparametric correction approach for estimation of the regression parameters in the proportional hazards model using a subset of the sample where instrumental variables are observed. The instrumental variables are related to the covariates through a general nonparametric model, and no distributional assumptions are placed on the error and the underlying true covariates. We further propose a novel generalized method of moments nonparametric correction estimator to improve the efficiency over the simple correction approach. The efficiency gain can be substantial when the calibration subsample is small compared to the whole sample. The estimators are shown to be consistent and asymptotically normal. Performance of the estimators is evaluated via simulation studies and by an application to data from an HIV clinical trial. Estimation of the baseline hazard function is not addressed. PMID:25663724

  6. Analysis of the Proportional Hazards Model with Sparse Longitudinal Covariates

    PubMed Central

    Cao, Hongyuan; Churpek, Mathew M.; Zeng, Donglin; Fine, Jason P.

    2014-01-01

    Regression analysis of censored failure observations via the proportional hazards model permits time-varying covariates which are observed at death times. In practice, such longitudinal covariates are typically sparse and only measured at infrequent and irregularly spaced follow-up times. Full likelihood analyses of joint models for longitudinal and survival data impose stringent modelling assumptions which are difficult to verify in practice and which are complicated both inferentially and computationally. In this article, a simple kernel weighted score function is proposed with minimal assumptions. Two scenarios are considered: half kernel estimation in which observation ceases at the time of the event and full kernel estimation for data where observation may continue after the event, as with recurrent events data. It is established that these estimators are consistent and asymptotically normal. However, they converge at rates which are slower than the parametric rates which may be achieved with fully observed covariates, with the full kernel method achieving an optimal convergence rate which is superior to that of the half kernel method. Simulation results demonstrate that the large sample approximations are adequate for practical use and may yield improved performance relative to the last-value-carried-forward approach and the joint modelling method. The analysis of the data from a cardiac arrest study demonstrates the utility of the proposed methods. PMID:26576066

  7. A Mixture Proportional Hazards Model with Random Effects for Response Times in Tests

    ERIC Educational Resources Information Center

    Ranger, Jochen; Kuhn, Jörg-Tobias

    2016-01-01

    In this article, a new model for test response times is proposed that combines latent class analysis and the proportional hazards model with random effects in a similar vein as the mixture factor model. The model assumes the existence of different latent classes. In each latent class, the response times are distributed according to a…

  8. On Model Specification and Selection of the Cox Proportional Hazards Model*

    PubMed Central

    Lin, Chen-Yen; Halabi, Susan

    2013-01-01

    Prognosis plays a pivotal role in patient management and trial design. A useful prognostic model should correctly identify important risk factors and estimate their effects. In this article, we discuss several challenges in selecting prognostic factors and estimating their effects using the Cox proportional hazards model. Although it has a flexible semiparametric form, the Cox model is not entirely exempt from model misspecification. To minimize possible misspecification, instead of imposing the traditional linearity assumption, flexible modeling techniques have been proposed to accommodate nonlinear effects. We first review several existing nonparametric estimation and selection procedures and then present a numerical study to compare the performance between parametric and nonparametric procedures. We demonstrate the impact of model misspecification on variable selection and model prediction using a simulation study and an example from a phase III trial in prostate cancer. PMID:23784939

  9. Proportional hazards model for competing risks data with missing cause of failure.

    PubMed

    Hyun, Seunggeun; Lee, Jimin; Sun, Yanqing

    2012-07-01

    We consider the semiparametric proportional hazards model for the cause-specific hazard function in analysis of competing risks data with missing cause of failure. The inverse probability weighted equation and augmented inverse probability weighted equation are proposed for estimating the regression parameters in the model, and their theoretical properties are established for inference. Simulation studies demonstrate that the augmented inverse probability weighted estimator is doubly robust and the proposed method is appropriate for practical use. The simulations also compare the proposed estimators with the multiple imputation estimator of Lu and Tsiatis (2001). The application of the proposed method is illustrated using data from a bone marrow transplant study. PMID:22468017

  10. Comparing proportional hazards and accelerated failure time models for survival analysis.

    PubMed

    Orbe, Jesus; Ferreira, Eva; Núñez-Antón, Vicente

    2002-11-30

    This paper describes a method proposed for a censored linear regression model that can be used in the context of survival analysis. The method has the important characteristic of allowing estimation and inference without knowing the distribution of the duration variable. Moreover, it does not need the assumption of proportional hazards. Therefore, it can be an interesting alternative to the Cox proportional hazards model when this assumption does not hold. In addition, implementation and interpretation of the results are simple. In order to analyse the performance of this methodology, we apply it to two real examples and carry out a simulation study. We present its results together with those obtained with the traditional Cox model and parametric AFT models. The new proposal seems to lead to more precise results.
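
    A minimal illustration of the comparison discussed above, fitting a semiparametric Cox PH model and a parametric Weibull AFT model to the same right-censored data with the Python lifelines package (the data set and column names are lifelines' bundled example, not the paper's data):

```python
# Fit PH and AFT models to the same data and compare summaries and concordance.
from lifelines import CoxPHFitter, WeibullAFTFitter
from lifelines.datasets import load_rossi

df = load_rossi()

cph = CoxPHFitter().fit(df, duration_col="week", event_col="arrest")
aft = WeibullAFTFitter().fit(df, duration_col="week", event_col="arrest")

cph.print_summary()   # coefficients on the log hazard-ratio scale
aft.print_summary()   # coefficients on the log time (acceleration) scale

# Concordance gives one crude basis for comparing the two fits.
print(cph.concordance_index_, aft.concordance_index_)
```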

  11. A semi-parametric generalization of the Cox proportional hazards regression model: Inference and Applications

    PubMed Central

    Devarajan, Karthik; Ebrahimi, Nader

    2010-01-01

    The assumption of proportional hazards (PH) fundamental to the Cox PH model sometimes may not hold in practice. In this paper, we propose a generalization of the Cox PH model in terms of the cumulative hazard function taking a form similar to the Cox PH model, with the extension that the baseline cumulative hazard function is raised to a power function. Our model allows for interaction between covariates and the baseline hazard, and it also includes, for the two-sample problem, the case of two Weibull distributions and two extreme value distributions differing in both scale and shape parameters. The partial likelihood approach cannot be applied here to estimate the model parameters. We use the full likelihood approach via a cubic B-spline approximation for the baseline hazard to estimate the model parameters. A semi-automatic procedure for knot selection based on Akaike's Information Criterion is developed. We illustrate the applicability of our approach using real-life data. PMID:21076652

  12. On Estimation of Covariate-Specific Residual Time Quantiles under the Proportional Hazards Model

    PubMed Central

    Crouch, Luis Alexander; May, Susanne; Chen, Ying Qing

    2015-01-01

    Estimation and inference in time-to-event analysis typically focus on hazard functions and their ratios under the Cox proportional hazards model. These hazard functions, while popular in the statistical literature, are not always easily or intuitively communicated in clinical practice, such as in the settings of patient counseling or resource planning. Expressing and comparing quantiles of event times may allow for easier understanding. In this article we focus on residual time, i.e., the remaining time-to-event at an arbitrary time t given that the event has yet to occur by t. In particular, we develop estimation and inference procedures for covariate-specific quantiles of the residual time under the Cox model. Our methods and theory are assessed by simulations, and demonstrated in analysis of two real data sets. PMID:26058825
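
    A rough sketch of the quantity of interest (not the authors' estimator or its inference): given a fitted Cox model, a covariate-specific median residual time at t0 can be read off the predicted conditional survival curve as the smallest t > t0 with S(t|x)/S(t0|x) <= 0.5. The lifelines calls and column names below are illustrative assumptions, using lifelines' bundled example data.

```python
# Covariate-specific median residual time from a fitted Cox model (illustrative only).
import numpy as np
from lifelines import CoxPHFitter
from lifelines.datasets import load_rossi

df = load_rossi()
cph = CoxPHFitter().fit(df, duration_col="week", event_col="arrest")


def median_residual_time(fitter, covariate_row, t0, q=0.5):
    """Smallest t - t0 such that S(t|x)/S(t0|x) <= q, based on the fitted model."""
    surv = fitter.predict_survival_function(covariate_row)   # index: times, one column
    s = surv.iloc[:, 0]
    s_t0 = np.interp(t0, s.index.values, s.values)           # survival at t0
    conditional = s / s_t0
    eligible = conditional[(s.index.values > t0) & (conditional.values <= q)]
    return eligible.index[0] - t0 if len(eligible) else np.inf  # inf if the curve never drops far enough


subject = df.drop(columns=["week", "arrest"]).iloc[[0]]
print(median_residual_time(cph, subject, t0=10))
```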

  13. Cox Proportional Hazards Models for Modeling the Time to Onset of Decompression Sickness in Hypobaric Environments

    NASA Technical Reports Server (NTRS)

    Thompson, Laura A.; Chhikara, Raj S.; Conkin, Johnny

    2003-01-01

    In this paper we fit Cox proportional hazards models to a subset of data from the Hypobaric Decompression Sickness Databank. The data bank contains records on the time to decompression sickness (DCS) and venous gas emboli (VGE) for over 130,000 person-exposures to high altitude in chamber tests. The subset we use contains 1,321 records, with 87% censoring, and has the most recent experimental tests on DCS made available from Johnson Space Center. We build on previous analyses of this data set by considering more expanded models and more detailed model assessments specific to the Cox model. Our model - which is stratified on the quartiles of the final ambient pressure at altitude - includes the final ambient pressure at altitude as a nonlinear continuous predictor, the computed tissue partial pressure of nitrogen at altitude, and whether exercise was done at altitude. We conduct various assessments of our model, many of which are recently developed in the statistical literature, and conclude where the model needs improvement. We consider the addition of frailties to the stratified Cox model, but found that no significant gain was attained above a model that does not include frailties. Finally, we validate some of the models that we fit.
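
    The stratification device used here is easy to reproduce in general-purpose software. A minimal sketch with lifelines follows (stand-ins only: the bundled Rossi data and its 'age' variable play the role of the DCS records and final ambient pressure); each quartile stratum gets its own baseline hazard while regression coefficients are shared across strata.

```python
# Cox model stratified on quartiles of a continuous variable (illustrative only).
import pandas as pd
from lifelines import CoxPHFitter
from lifelines.datasets import load_rossi

df = load_rossi()
df["age_quartile"] = pd.qcut(df["age"], 4, labels=False)   # 0..3 quartile codes

cph = CoxPHFitter()
cph.fit(
    df.drop(columns=["age"]),          # keep the quartile version only, for simplicity
    duration_col="week",
    event_col="arrest",
    strata=["age_quartile"],           # separate baseline hazard per stratum
)
cph.print_summary()
```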

  14. Bayesian random threshold estimation in a Cox proportional hazards cure model.

    PubMed

    Zhao, Lili; Feng, Dai; Bellile, Emily L; Taylor, Jeremy M G

    2014-02-20

    In this paper, we develop a Bayesian approach to estimate a Cox proportional hazards model that allows a threshold in the regression coefficient, when some fraction of subjects are not susceptible to the event of interest. A data augmentation scheme with latent binary cure indicators is adopted to simplify the Markov chain Monte Carlo implementation. Given the binary cure indicators, the Cox cure model reduces to a standard Cox model and a logistic regression model. Furthermore, the threshold detection problem reverts to a threshold problem in a regular Cox model. The baseline cumulative hazard for the Cox model is formulated non-parametrically using counting processes with a gamma process prior. Simulation studies demonstrate that the method provides accurate point and interval estimates. Application to a data set of oropharynx cancer patients suggests a significant threshold in age at diagnosis such that the effect of gender on disease-specific survival changes after the threshold.

  15. [Clinical research XXII. From clinical judgment to Cox proportional hazards model].

    PubMed

    Pérez-Rodríguez, Marcela; Rivas-Ruiz, Rodolfo; Palacios-Cruz, Lino; Talavera, Juan O

    2014-01-01

    Survival analyses are commonly used to determine the time to an event (for example, death). However, they can also be used for other clinical outcomes, provided these are dichotomous, for example time to healing. These analyses consider the relationship with only one variable. The Cox proportional hazards model, by contrast, is a multivariate extension of survival analysis in which other covariates that could potentially confound the effect of the main maneuver studied, such as age, gender or disease stage, are taken into account. This analysis can include both quantitative and qualitative variables in the model. The measure of association used is called the hazard ratio (HR) or relative risk ratio, which is not the same as the relative risk or odds ratio (OR). The difference is that the HR refers to the probability that one of the groups develops the event earlier than the group with which it is compared. The Cox proportional hazards multivariate model is the most widely used in medicine when the phenomenon is studied in two dimensions: time and event.
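
    Written out, the model and the hazard ratio described above take the following form (standard notation, not taken from the article):

```latex
% Cox proportional hazards model and the hazard ratio for a one-unit increase in x_j.
\[
  h(t \mid x) = h_0(t)\,\exp(\beta_1 x_1 + \cdots + \beta_p x_p),
  \qquad
  \mathrm{HR}_j = \frac{h(t \mid x_1,\ldots,x_j+1,\ldots,x_p)}{h(t \mid x_1,\ldots,x_j,\ldots,x_p)}
                = e^{\beta_j}.
\]
% Under proportional hazards the HR does not depend on t, which is what distinguishes
% it from a risk ratio or odds ratio computed at a fixed time point.
```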

  16. ANALYSIS OF MULTIVARIATE FAILURE TIME DATA USING MARGINAL PROPORTIONAL HAZARDS MODEL.

    PubMed

    Chen, Ying; Chen, Kani; Ying, Zhiliang

    2010-01-01

    The marginal proportional hazards model is an important tool in the analysis of multivariate failure time data in the presence of censoring. We propose a method of estimation via the linear combinations of martingale residuals. The estimation and inference procedures are easy to implement numerically. The estimation is generally more accurate than the existing pseudo-likelihood approach: the size of efficiency gain can be considerable in some cases, and the maximum relative efficiency in theory is infinite. Consistency and asymptotic normality are established. Empirical evidence in support of the theoretical claims is shown in simulation studies. PMID:24307815

  17. A Flexible, Computationally Efficient Method for Fitting the Proportional Hazards Model to Interval-Censored Data

    PubMed Central

    Wang, Lianming; Hudgens, Michael G.; Qureshi, Zaina P.

    2015-01-01

    The proportional hazards (PH) model is currently the most popular regression model for analyzing time-to-event data. Despite its popularity, the analysis of interval-censored data under the PH model can be challenging using many available techniques. This paper presents a new method for analyzing interval-censored data under the PH model. The proposed approach uses a monotone spline representation to approximate the unknown nondecreasing cumulative baseline hazard function. Formulating the PH model in this fashion results in a finite number of parameters to estimate while maintaining substantial modeling flexibility. A novel expectation-maximization (EM) algorithm is developed for finding the maximum likelihood estimates of the parameters. The derivation of the EM algorithm relies on a two-stage data augmentation involving latent Poisson random variables. The resulting algorithm is easy to implement, robust to initialization, enjoys quick convergence, and provides closed-form variance estimates. The performance of the proposed regression methodology is evaluated through a simulation study, and is further illustrated using data from a large population-based randomized trial designed and sponsored by the United States National Cancer Institute. PMID:26393917

  18. Relaxing the independent censoring assumption in the Cox proportional hazards model using multiple imputation.

    PubMed

    Jackson, Dan; White, Ian R; Seaman, Shaun; Evans, Hannah; Baisley, Kathy; Carpenter, James

    2014-11-30

    The Cox proportional hazards model is frequently used in medical statistics. The standard methods for fitting this model rely on the assumption of independent censoring. Although this is sometimes plausible, we often wish to explore how robust our inferences are as this untestable assumption is relaxed. We describe how this can be carried out in a way that makes the assumptions accessible to all those involved in a research project. Estimation proceeds via multiple imputation, where censored failure times are imputed under user-specified departures from independent censoring. A novel aspect of our method is the use of bootstrapping to generate proper imputations from the Cox model. We illustrate our approach using data from an HIV-prevention trial and discuss how it can be readily adapted and applied in other settings. PMID:25060703

  19. Proportional hazards model with varying coefficients for length-biased data.

    PubMed

    Zhang, Feipeng; Chen, Xuerong; Zhou, Yong

    2014-01-01

    Length-biased data arise in many important applications including epidemiological cohort studies, cancer prevention trials and studies of labor economics. Such data are also often subject to right censoring due to loss of follow-up or the end of study. In this paper, we consider a proportional hazards model with varying coefficients for right-censored and length-biased data, which is used to study the nonlinear interaction effect of covariates with an exposure variable. A local estimating equation method is proposed for the unknown coefficients and the intercept function in the model. The asymptotic properties of the proposed estimators are established by using martingale theory and kernel smoothing techniques. Our simulation studies demonstrate that the proposed estimators have excellent finite-sample performance. The Channing House data are analyzed to demonstrate the applications of the proposed method. PMID:23649724

  20. Sparse estimation of Cox proportional hazards models via approximated information criteria.

    PubMed

    Su, Xiaogang; Wijayasinghe, Chalani S; Fan, Juanjuan; Zhang, Ying

    2016-09-01

    We propose a new sparse estimation method for Cox (1972) proportional hazards models by optimizing an approximated information criterion. The main idea involves approximation of the ℓ0 norm with a continuous or smooth unit dent function. The proposed method bridges the best subset selection and regularization by borrowing strength from both. It mimics the best subset selection using a penalized likelihood approach yet with no need of a tuning parameter. We further reformulate the problem with a reparameterization step so that it reduces to one unconstrained nonconvex yet smooth programming problem, which can be solved efficiently as in computing the maximum partial likelihood estimator (MPLE). Furthermore, the reparameterization tactic yields an additional advantage in terms of circumventing postselection inference. The oracle property of the proposed method is established. Both simulated experiments and empirical examples are provided for assessment and illustration. PMID:26873398

  1. Mediation Analysis with Survival Outcomes: Accelerated Failure Time vs. Proportional Hazards Models

    PubMed Central

    Gelfand, Lois A.; MacKinnon, David P.; DeRubeis, Robert J.; Baraldi, Amanda N.

    2016-01-01

    Objective: Survival time is an important type of outcome variable in treatment research. Currently, limited guidance is available regarding performing mediation analyses with survival outcomes, which generally do not have normally distributed errors, and contain unobserved (censored) events. We present considerations for choosing an approach, using a comparison of semi-parametric proportional hazards (PH) and fully parametric accelerated failure time (AFT) approaches for illustration. Method: We compare PH and AFT models and procedures in their integration into mediation models and review their ability to produce coefficients that estimate causal effects. Using simulation studies modeling Weibull-distributed survival times, we compare statistical properties of mediation analyses incorporating PH and AFT approaches (employing SAS procedures PHREG and LIFEREG, respectively) under varied data conditions, some including censoring. A simulated data set illustrates the findings. Results: AFT models integrate more easily than PH models into mediation models. Furthermore, mediation analyses incorporating LIFEREG produce coefficients that can estimate causal effects, and demonstrate superior statistical properties. Censoring introduces bias in the coefficient estimate representing the treatment effect on outcome—underestimation in LIFEREG, and overestimation in PHREG. With LIFEREG, this bias can be addressed using an alternative estimate obtained from combining other coefficients, whereas this is not possible with PHREG. Conclusions: When Weibull assumptions are not violated, there are compelling advantages to using LIFEREG over PHREG for mediation analyses involving survival-time outcomes. Irrespective of the procedures used, the interpretation of coefficients, effects of censoring on coefficient estimates, and statistical properties should be taken into account when reporting results. PMID:27065906

  2. Generating survival times to simulate Cox proportional hazards models with time-varying covariates.

    PubMed

    Austin, Peter C

    2012-12-20

    Simulations and Monte Carlo methods serve an important role in modern statistical research. They allow for an examination of the performance of statistical procedures in settings in which analytic and mathematical derivations may not be feasible. A key element in any statistical simulation is the existence of an appropriate data-generating process: one must be able to simulate data from a specified statistical model. We describe data-generating processes for the Cox proportional hazards model with time-varying covariates when event times follow an exponential, Weibull, or Gompertz distribution. We consider three types of time-varying covariates: first, a dichotomous time-varying covariate that can change at most once from untreated to treated (e.g., organ transplant); second, a continuous time-varying covariate such as cumulative exposure at a constant dose to radiation or to a pharmaceutical agent used for a chronic condition; third, a dichotomous time-varying covariate with a subject being able to move repeatedly between treatment states (e.g., current compliance or use of a medication). In each setting, we derive closed-form expressions that allow one to simulate survival times so that survival times are related to a vector of fixed or time-invariant covariates and to a single time-varying covariate. We illustrate the utility of our closed-form expressions for simulating event times by using Monte Carlo simulations to estimate the statistical power to detect as statistically significant the effect of different types of binary time-varying covariates. This is compared with the statistical power to detect as statistically significant a binary time-invariant covariate.
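
    The building block behind these derivations is the usual inverse-transform recipe: if U is uniform on (0, 1), solving H(T|x) = -log U for T gives an event time with the desired hazard. A sketch for the time-invariant covariate case is below (Bender-style closed forms for exponential, Weibull and Gompertz baselines; the paper's closed-form extensions to time-varying covariates are not reproduced here, and the parameter values are arbitrary).

```python
# Simulate Cox-model event times with time-invariant covariates by inverse transform.
import numpy as np

rng = np.random.default_rng(0)


def simulate_cox_times(x, beta, dist="weibull", lam=0.01, nu=1.5, alpha=0.05):
    """One event time per row of x under h(t|x) = h0(t) * exp(x @ beta)."""
    lp = np.asarray(x) @ np.asarray(beta)          # linear predictor
    u = rng.uniform(size=lp.shape[0])
    scale = lam * np.exp(lp)
    if dist == "exponential":                      # h0(t) = lam
        return -np.log(u) / scale
    if dist == "weibull":                          # h0(t) = lam * nu * t**(nu - 1)
        return (-np.log(u) / scale) ** (1.0 / nu)
    if dist == "gompertz":                         # h0(t) = lam * exp(alpha * t)
        return np.log(1.0 - alpha * np.log(u) / scale) / alpha
    raise ValueError(f"unknown baseline: {dist}")


x = rng.normal(size=(1000, 2))
times = simulate_cox_times(x, beta=[0.7, -0.3])
print(times.mean())
```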

  3. [Proportional hazards model of birth intervals among marriage cohorts since the 1960s].

    PubMed

    Otani, K

    1987-01-01

    With a view to investigating the possibility of an attitudinal change towards the timing of 1st and 2nd births, proportional hazards model analysis of the 1st and 2nd birth intervals and univariate life table analysis were both carried out. Results showed that love matches and conjugal families immediately after marriage are accompanied by a longer 1st birth interval than others, even after controlling for other independent variables. Marriage cohort analysis also shows a net effect on the relative risk of having a 1st birth. Marriage cohorts since the mid-1960s demonstrate a shorter 1st birth interval than the 1961-63 cohort. With regard to the 2nd birth interval, longer 1st birth intervals, arranged marriages, conjugal families immediately following marriage, and higher ages at 1st marriage of women tended to provoke a longer 2nd birth interval. There is no interaction between the 1st birth interval and marriage cohort. Once other independent variables were controlled, with the exception of the marriage cohorts of the early 1970s, the authors found no effect of marriage cohort on the relative risk of having a 2nd birth. This suggests that an attitudinal change towards the timing of births in this period was mainly restricted to that of a 1st birth. Fluctuations in the 2nd birth interval during the 1970-72 marriage cohort were scrutinized in detail. As a result, the authors found that conjugal families after marriage, wives with low educational status, women with husbands in white collar professions, women with white collar fathers, and wives with high age at 1st marriage who married during 1970-72 and had a 1st birth interval during 1972-74 suffered most from the pronounced rise in the 2nd birth interval. This might be due to the relatively high sensitivity to a change in socioeconomic status; the oil crisis occurring around the time of marriage and 1st birth induced a delay in the 2nd birth. The unanimous decrease in the 2nd birth interval among the 1973

  4. A flexible alternative to the Cox proportional hazards model for assessing the prognostic accuracy of hospice patient survival.

    PubMed

    Miladinovic, Branko; Kumar, Ambuj; Mhaskar, Rahul; Kim, Sehwan; Schonwetter, Ronald; Djulbegovic, Benjamin

    2012-01-01

    Prognostic models are often used to estimate the length of patient survival. The Cox proportional hazards model has traditionally been applied to assess the accuracy of prognostic models. However, it may be suboptimal because of its inflexibility in modeling the baseline survival function and because the proportional hazards assumption may be violated. The aim of this study was to use internal validation to compare the predictive power of a flexible Royston-Parmar family of survival functions with the Cox proportional hazards model. We applied the Palliative Performance Scale on a dataset of 590 hospice patients at the time of hospice admission. The retrospective data were obtained from the Lifepath Hospice and Palliative Care center in Hillsborough County, Florida, USA. The criteria used to evaluate and compare the models' predictive performance were the explained variation statistic R(2), the scaled Brier score, and the discrimination slope. The explained variation statistic demonstrated that overall the Royston-Parmar family of survival functions provided a better fit (R(2) =0.298; 95% CI: 0.236-0.358) than the Cox model (R(2) =0.156; 95% CI: 0.111-0.203). The scaled Brier scores and discrimination slopes were consistently higher under the Royston-Parmar model. Researchers involved in prognosticating patient survival are encouraged to consider the Royston-Parmar model as an alternative to Cox. PMID:23082220

  5. A proportional hazards regression model for the subdistribution with right-censored and left-truncated competing risks data.

    PubMed

    Zhang, Xu; Zhang, Mei-Jie; Fine, Jason

    2011-07-20

    With competing risks failure time data, one often needs to assess the covariate effects on the cumulative incidence probabilities. Fine and Gray proposed a proportional hazards regression model to directly model the subdistribution of a competing risk. They developed the estimating procedure for right-censored competing risks data, based on inverse probability of censoring weighting. Right-censored and left-truncated competing risks data sometimes occur in biomedical research. In this paper, we study the proportional hazards regression model for the subdistribution of a competing risk with right-censored and left-truncated data. We adopt a new weighting technique to estimate the parameters in this model. We have derived the large sample properties of the proposed estimators. To illustrate the application of the new method, we analyze the failure time data for children with acute leukemia. In this example, the failure times for children who had bone marrow transplants were left truncated.

  6. Testing Goodness-of-Fit for the Proportional Hazards Model based on Nested Case-Control Data

    PubMed Central

    Lu, Wenbin; Liu, Mengling; Chen, Yi-Hau

    2014-01-01

    Nested case-control sampling is a popular design for large epidemiological cohort studies due to its cost effectiveness. A number of methods have been developed for the estimation of the proportional hazards model with nested case-control data; however, the evaluation of modeling assumptions has received less attention. In this paper, we propose a class of goodness-of-fit test statistics for testing the proportional hazards assumption based on nested case-control data. The test statistics are constructed based on asymptotically mean-zero processes derived from Samuelsen's maximum pseudo-likelihood estimation method. In addition, we develop an innovative resampling scheme to approximate the asymptotic distribution of the test statistics while accounting for the dependent sampling scheme of the nested case-control design. Numerical studies are conducted to evaluate the performance of our proposed approach, and an application to the Wilms' Tumor Study is given to illustrate the methodology. PMID:25298193

  7. Assessing the prediction accuracy of cure in the Cox proportional hazards cure model: an application to breast cancer data.

    PubMed

    Asano, Junichi; Hirakawa, Akihiro; Hamada, Chikuma

    2014-01-01

    A cure rate model is a survival model incorporating the cure rate with the assumption that the population contains both uncured and cured individuals. It is a powerful statistical tool for prognostic studies, especially in cancer. The cure rate is important for making treatment decisions in clinical practice. The proportional hazards (PH) cure model can predict the cure rate for each patient. This contains a logistic regression component for the cure rate and a Cox regression component to estimate the hazard for uncured patients. A measure for quantifying the predictive accuracy of the cure rate estimated by the Cox PH cure model is required, as there has been a lack of previous research in this area. We used the Cox PH cure model for the breast cancer data; however, the area under the receiver operating characteristic curve (AUC) could not be estimated because many patients were censored. In this study, we used imputation-based AUCs to assess the predictive accuracy of the cure rate from the PH cure model. We examined the precision of these AUCs using simulation studies. The results demonstrated that the imputation-based AUCs were estimable and their biases were negligibly small in many cases, although ordinary AUC could not be estimated. Additionally, we introduced the bias-correction method of imputation-based AUCs and found that the bias-corrected estimate successfully compensated the overestimation in the simulation studies. We also illustrated the estimation of the imputation-based AUCs using breast cancer data.

  8. Estimating treatment effect in a proportional hazards model in randomized clinical trials with all-or-nothing compliance.

    PubMed

    Li, Shuli; Gray, Robert J

    2016-09-01

    We consider methods for estimating the treatment effect and/or the covariate by treatment interaction effect in a randomized clinical trial under noncompliance with time-to-event outcome. As in Cuzick et al. (2007), assuming that the patient population consists of three (possibly latent) subgroups based on treatment preference: the ambivalent group, the insisters, and the refusers, we estimate the effects among the ambivalent group. The parameters have causal interpretations under standard assumptions. The article contains two main contributions. First, we propose a weighted per-protocol (Wtd PP) estimator through incorporating time-varying weights in a proportional hazards model. In the second part of the article, under the model considered in Cuzick et al. (2007), we propose an EM algorithm to maximize a full likelihood (FL) as well as the pseudo likelihood (PL) considered in Cuzick et al. (2007). The E step of the algorithm involves computing the conditional expectation of a linear function of the latent membership, and the main advantage of the EM algorithm is that the risk parameters can be updated by fitting a weighted Cox model using standard software and the baseline hazard can be updated using closed-form solutions. Simulations show that the EM algorithm is computationally much more efficient than directly maximizing the observed likelihood. The main advantage of the Wtd PP approach is that it is more robust to model misspecifications among the insisters and refusers since the outcome model does not impose distributional assumptions among these two groups. PMID:26799700

  9. Effect of type traits on functional longevity of Czech Holstein cows estimated from a Cox proportional hazards model.

    PubMed

    Zavadilová, L; Němcová, E; Stípková, M

    2011-08-01

    Relationships between conformation traits and functional longevity in Holstein cows were evaluated using survival analysis. Functional longevity was defined as the number of days between the first calving and culling; that is, length of productive life. The data set consisted of 116,369 Holstein cows that first calved from 2003 to 2008. All cows used in the analysis were scored for conformation between d 30 and d 210 of their first lactation. The data included 48% censored records. Analyses were done separately for 20 linear descriptive type traits, 6 composite traits, and height at sacrum measured in centimeters. Cox proportional hazard models were fitted to analyze data. The hazard function was described as the product of a baseline hazard function and the time-independent effects of age at first calving and sire (random), and the time-dependent effects of stage of lactation and lactation number, herd, year and season, herd size, and 305-d milk production. The strongest relationship between a composite trait and functional longevity was for dairy form, followed by udder and final score. Among the descriptive type traits, the strongest relationships with longevity were found for body condition score, angularity, traits related to udder attachment, and udder depth. Foot and leg traits showed substantially lower effect on functional longevity, and the effect of foot angle was minimal. Functional longevity declined with decreased body condition score of cows. Cows with deep udders had significantly lower functional survival compared with cows with shallow udders. In addition, weak central ligament was associated with significant reduction of cow longevity. For dairy form and angularity, cows classified as very good were the worst with respect to longevity, whereas cows classified as poor were the best. An intermediate optimum was evident for rear legs rear view and rear legs set (side view), whereas cows with sickled legs had lower longevity than cows with straighter

  10. Risk Factors of Graft Survival After Diagnosis of Post-kidney Transplant Malignancy: Using Cox Proportional Hazard Model

    PubMed Central

    Rahimi Foroushani, Abbas; Salesi, Mahmoud; Rostami, Zohreh; Mehrazmay, Ali Reza; Mohammadi, Jamile; Einollahi, Behzad; Eshraghian, Mohammad Reza

    2015-01-01

    Background: All recipients of kidney transplantation, especially those with posttransplant malignancy, are at risk of long-term graft failure. Objectives: The purpose of our study was to evaluate the risk factors associated with graft survival after diagnosis of malignancy. Patients and Methods: To this end, we conducted a historical cohort study in Iran in which 266 cases with posttransplant malignancy were followed up from the diagnosis of malignancy until long-term graft loss or the date of last visit. These patients were taken as a census from 16 transplant centers in Iran during a 22-year follow-up period from October 1984 to December 2008. A Cox proportional hazards model was used to determine the important independent predictors of graft survival after malignancy. Results: At the end of the study, long-term graft failure was seen in 27 (10.2%) cases. One-year and 2-year graft survival after diagnosis of cancer were 93.6% and 91.7%, respectively. The univariate analysis showed that the incidence of chronic graft loss was significantly higher in male patients with solid cancers, withdrawal of the immunosuppressant regimen, no response to treatment, and tumor metastasis. The Cox model further indicated that the significant risk factors associated with graft survival were type of cancer (P < 0.0001), response to treatment (P < 0.0001, HR = 0.14, 95% CI: 0.06 - 0.32), metastasis (P < 0.0001, HR = 5.68, 95% CI: 2.24 - 14.42), and treatment modality (P = 0.0001). Conclusions: By controlling the modifiable risk factors and the modality of treatment identified in our study, physicians can deliver more effective treatment. PMID:26734477

  11. Using Swiss Webster mice to model Fetal Alcohol Spectrum Disorders (FASD): An analysis of multilevel time-to-event data through mixed-effects Cox proportional hazards models.

    PubMed

    Chi, Peter; Aras, Radha; Martin, Katie; Favero, Carlita

    2016-05-15

    Fetal Alcohol Spectrum Disorders (FASD) collectively describes the constellation of effects resulting from human alcohol consumption during pregnancy. Even with public awareness, the incidence of FASD is estimated to be upwards of 5% in the general population and is becoming a global health problem. The physical, cognitive, and behavioral impairments of FASD are recapitulated in animal models. Recently rodent models utilizing voluntary drinking paradigms have been developed that accurately reflect moderate consumption, which makes up the majority of FASD cases. The range in severity of FASD characteristics reflects the frequency, dose, developmental timing, and individual susceptibility to alcohol exposure. As most rodent models of FASD use C57BL/6 mice, there is a need to expand the stocks of mice studied in order to more fully understand the complex neurobiology of this disorder. To that end, we allowed pregnant Swiss Webster mice to voluntarily drink ethanol via the drinking in the dark (DID) paradigm throughout their gestation period. Ethanol exposure did not alter gestational outcomes as determined by no significant differences in maternal weight gain, maternal liquid consumption, litter size, or pup weight at birth or weaning. Despite seemingly normal gestation, ethanol-exposed offspring exhibit significantly altered timing to achieve developmental milestones (surface righting, cliff aversion, and open field traversal), as analyzed through mixed-effects Cox proportional hazards models. These results confirm Swiss Webster mice as a viable option to study the incidence and causes of ethanol-induced neurobehavioral alterations during development. Future studies in our laboratory will investigate the brain regions and molecules responsible for these behavioral changes.

  12. A new method to evaluate the effects of bacterial dosage, infection route and Vibrio strain in experimental challenges of Litopenaeus vannamei, based on the Cox proportional hazard model.

    PubMed

    Xia, Qing; Wang, Baojie; Liu, Mei; Jiang, Keyong; Wang, Lei

    2015-10-01

    In the shrimp challenge test the Vibrio dosage, infection route, and strain are considered as risk factors that result in mortality. Assessment of Vibrio/shrimp interactions, and disease dynamics following infection by Vibrio, are useful techniques needed for detailed studies on the control of risk factors. In this paper we used an application of the Cox proportional hazard model to assess relative survival probability, estimate mortality risk, and construct a prognostic model to assess predictions of estimated time to death. Results indicate that infection route was the most important prognostic factor contributing to mortality in the challenge test (β = 3.698, P < 0.000). The shrimp infection rate following injection was found to be 40.4 times greater than that following immersion (hazard ratio (HR) = 40.4; p = 0.000). Our results also indicated that the HR resulting in shrimp mortality following a high dose of 10(8) cfu/shrimp was significantly greater (HR = 5.9, P < 0.000), than that following a baseline dosage of 10(7) cfu/shrimp. Strain Vh was found to be more virulent than Strain Vp (HR = 4.8; P < 0.000). The prognostic index also indicated that the infection route is the most important prognostic factor contributing to mortality in the challenge test.

  13. Addressing Loss of Efficiency Due to Misclassification Error in Enriched Clinical Trials for the Evaluation of Targeted Therapies Based on the Cox Proportional Hazards Model

    PubMed Central

    Tsai, Chen-An; Lee, Kuan-Ting; Liu, Jen-pei

    2016-01-01

    A key feature of precision medicine is that it takes individual variability at the genetic or molecular level into account in determining the best treatment for patients diagnosed with diseases detected by recently developed novel biotechnologies. The enrichment design is an efficient design that enrolls only the patients testing positive for specific molecular targets and randomly assigns them to the targeted treatment or the concurrent control. However, there is no diagnostic device with perfect accuracy and precision for detecting molecular targets. In particular, the positive predictive value (PPV) can be quite low for rare diseases with low prevalence. Under the enrichment design, some patients testing positive for specific molecular targets may not in fact have the molecular targets, and the efficacy of the targeted therapy may be underestimated in the patients that actually do have them. To address the loss of efficiency due to misclassification error, we apply the discrete mixture modeling for time-to-event data proposed by Eng and Hanlon [8] to develop an inferential procedure, based on the Cox proportional hazards model, for the effect of the targeted treatment in the true-positive patients with the molecular targets. Our proposed procedure incorporates both the inaccuracy of diagnostic devices and the uncertainty of estimated accuracy measures. We employed the expectation-maximization algorithm in conjunction with the bootstrap technique for estimation of the hazard ratio and its variance. We report the results of simulation studies which empirically investigated the performance of the proposed method. Our proposed method is illustrated by a numerical example. PMID:27120450

  14. The role of social networks and media receptivity in predicting age of smoking initiation: a proportional hazards model of risk and protective factors.

    PubMed

    Unger, J B; Chen, X

    1999-01-01

    The increasing prevalence of adolescent smoking demonstrates the need to identify factors associated with early smoking initiation. Previous studies have shown that smoking by social network members and receptivity to pro-tobacco marketing are associated with smoking among adolescents. It is not clear, however, whether these variables also are associated with the age of smoking initiation. Using data from 10,030 California adolescents, this study identified significant correlates of age of smoking initiation using bivariate methods and a multivariate proportional hazards model. Age of smoking initiation was earlier among those adolescents whose friends, siblings, or parents were smokers, and among those adolescents who had a favorite tobacco advertisement, had received tobacco promotional items, or would be willing to use tobacco promotional items. Results suggest that the smoking behavior of social network members and pro-tobacco media influences are important determinants of age of smoking initiation. Because early smoking initiation is associated with higher levels of addiction in adulthood, tobacco control programs should attempt to counter these influences. PMID:10400276

  15. Comparison of Cox's and relative survival models when estimating the effects of prognostic factors on disease-specific mortality: a simulation study under proportional excess hazards.

    PubMed

    Le Teuff, Gwenaël; Abrahamowicz, Michal; Bolard, Philippe; Quantin, Catherine

    2005-12-30

    In many prognostic studies focusing on mortality of persons affected by a particular disease, the cause of death of individual patients is not recorded. In such situations, conventional survival analytical methods, such as the Cox proportional hazards regression model, do not allow one to discriminate the effects of prognostic factors on disease-specific mortality from their effects on all-causes mortality. In the last decade, the relative survival approach has been proposed to deal with analyses involving population-based cancer registries, where the problem of missing information on the cause of death is very common. However, some questions regarding the ability of the relative survival methods to accurately discriminate between the two sources of mortality remain open. In order to systematically assess the performance of the relative survival model proposed by Esteve et al., and to quantify its potential advantages over Cox model analyses, we carried out a series of simulation experiments, based on the population-based colon cancer registry in the French region of Burgundy. Simulations showed a systematic bias induced by the 'crude' conventional Cox model analyses when individual causes of death are unknown. In simulations where only about 10 per cent of patients died of causes other than colon cancer, the Cox model over-estimated the effects of male gender and oldest age category by about 17 and 13 per cent, respectively, with the coverage rate of the 95 per cent CI for the latter estimate as low as 65 per cent. In contrast, the effect of higher cancer stages was under-estimated by 8-28 per cent. In contrast to crude survival, the relative survival model largely reduced such problems and handled well even such challenging tasks as separating the opposite effects of the same variable on cancer-related versus other-causes mortality. Specifically, in all the cases discussed above, the relative bias in the estimates from the Esteve et al.'s model was

  16. A multilevel excess hazard model to estimate net survival on hierarchical data allowing for non-linear and non-proportional effects of covariates.

    PubMed

    Charvat, Hadrien; Remontet, Laurent; Bossard, Nadine; Roche, Laurent; Dejardin, Olivier; Rachet, Bernard; Launoy, Guy; Belot, Aurélien

    2016-08-15

    The excess hazard regression model is an approach developed for the analysis of cancer registry data to estimate net survival, that is, the survival of cancer patients that would be observed if cancer was the only cause of death. Cancer registry data typically possess a hierarchical structure: individuals from the same geographical unit share common characteristics such as proximity to a large hospital that may influence access to and quality of health care, so that their survival times might be correlated. As a consequence, correct statistical inference regarding the estimation of net survival and the effect of covariates should take this hierarchical structure into account. It becomes particularly important as many studies in cancer epidemiology aim at studying the effect on the excess mortality hazard of variables, such as deprivation indexes, often available only at the ecological level rather than at the individual level. We developed here an approach to fit a flexible excess hazard model including a random effect to describe the unobserved heterogeneity existing between different clusters of individuals, and with the possibility to estimate non-linear and time-dependent effects of covariates. We demonstrated the overall good performance of the proposed approach in a simulation study that assessed the impact on parameter estimates of the number of clusters, their size and their level of unbalance. We then used this multilevel model to describe the effect of a deprivation index defined at the geographical level on the excess mortality hazard of patients diagnosed with cancer of the oral cavity. Copyright © 2016 John Wiley & Sons, Ltd. PMID:26924122

  18. Ethnicity, education, and the non-proportional hazard of first marriage in Turkey.

    PubMed

    Gore, DeAnna L; Carlson, Elwood

    2010-07-01

    This study uses the 1998 Turkish Demographic and Health Survey to estimate non-proportional piecewise-constant hazards for first marriage among women in Turkey by education and ethnicity, with controls for region of residence and rural-urban migration. At low education levels Kurdish speakers married earlier than women who spoke Turkish or other languages, but at high education levels Kurdish women delayed marriage more than other women. This reversal across education groups furnishes a new illustration of the minority-group-status hypothesis specifically focused on marriage as the first step in the family formation process. The ethnic contrast concerned only marriage timing in Turkey, not proportions ever marrying. Eventual marriage remained nearly universal for all groups of women. This means that an assumption of proportional duration hazards (widespread in contemporary research) across the whole range of marriage-forming ages should be replaced by models with non-proportional duration hazards.

  19. Comparing a marginal structural model with a Cox proportional hazard model to estimate the effect of time-dependent drug use in observational studies: statin use for primary prevention of cardiovascular disease as an example from the Rotterdam Study.

    PubMed

    de Keyser, Catherine E; Leening, Maarten J G; Romio, Silvana A; Jukema, J Wouter; Hofman, Albert; Ikram, M Arfan; Franco, Oscar H; Stijnen, Theo; Stricker, Bruno H

    2014-11-01

    When studying the causal effect of drug use in observational data, marginal structural modeling (MSM) can be used to adjust for time-dependent confounders that are affected by previous treatment. The objective of this study was to compare traditional Cox proportional hazards models (with and without time-dependent covariates) with MSM to study causal effects of time-dependent drug use. The example of primary prevention of cardiovascular disease (CVD) with statins was examined using up to 17.7 years of follow-up from 4,654 participants of the observational prospective population-based Rotterdam Study. In the MSM, the weights were based on measurements of established cardiovascular risk factors and co-morbidity. In general, we could not demonstrate important differences between the results from the Cox models and MSM. Results from the analysis of duration of statin use suggested that substantial residual confounding by indication was not accounted for during the period shortly after statin initiation. In conclusion, although on theoretical grounds MSM is an elegant technique, the lack of data on the precise time-dependent confounders, such as the indication for treatment or other considerations of the prescribing physician, jeopardizes the calculation of valid weights. Confounding remains a hurdle in observational effectiveness research on preventive drugs with a multitude of prescription determinants.
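
    For readers unfamiliar with the weighting, the generic stabilized inverse-probability-of-treatment weight used to fit a marginal structural Cox model has the form below (general textbook form; the Rotterdam Study analysis built its specific weight model from established cardiovascular risk factors and co-morbidity, as stated above).

```latex
% Stabilized weight for subject i at time t; A_k is treatment (e.g., statin use) in
% interval k, \bar{L}_{ik} the history of time-dependent confounders, V_i baseline covariates.
\[
  sw_i(t) \;=\; \prod_{k=0}^{t}
    \frac{\Pr\!\left(A_k = a_{ik} \mid \bar{A}_{k-1} = \bar{a}_{i,k-1},\, V_i\right)}
         {\Pr\!\left(A_k = a_{ik} \mid \bar{A}_{k-1} = \bar{a}_{i,k-1},\, \bar{L}_{ik}\right)}.
\]
% The weighted Cox fit targets a causal hazard ratio only under exchangeability,
% positivity and consistency, which is why valid weight models matter so much here.
```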

  20. PSHREG: a SAS macro for proportional and nonproportional subdistribution hazards regression.

    PubMed

    Kohl, Maria; Plischke, Max; Leffondré, Karen; Heinze, Georg

    2015-02-01

    We present a new SAS macro %pshreg that can be used to fit a proportional subdistribution hazards model for survival data subject to competing risks. Our macro first modifies the input data set appropriately and then applies SAS's standard Cox regression procedure, PROC PHREG, using weights and a counting-process style of specifying survival times to the modified data set. The modified data set can also be used to estimate cumulative incidence curves for the event of interest. The application of PROC PHREG has several advantages, e.g., it directly enables the user to apply the Firth correction, which has been proposed as a solution to the problem of undefined (infinite) maximum likelihood estimates in Cox regression, frequently encountered in small-sample analyses. Deviation from proportional subdistribution hazards can be detected by inspecting Schoenfeld-type residuals and testing the correlation of these residuals with time, or by including interactions of covariates with functions of time. We illustrate the application of these extended methods for competing risk regression using our macro, which is freely available at: http://cemsiis.meduniwien.ac.at/en/kb/science-research/software/statistical-software/pshreg, by means of an analysis of a real chronic kidney disease study. We discuss differences in features and capabilities of %pshreg and the recent (January 2014) SAS PROC PHREG implementation of proportional subdistribution hazards modelling.

  1. Sleepiness and prediction of driver impairment in simulator studies using a Cox proportional hazard approach.

    PubMed

    Vadeby, Anna; Forsman, Asa; Kecklund, Göran; Akerstedt, Torbjörn; Sandberg, David; Anund, Anna

    2010-05-01

    Cox proportional hazards models were used to study relationships between the event of a driver leaving the lane because of sleepiness and different indicators of sleepiness. To elucidate the performance of the different indicators, five models were developed with Cox proportional hazards regression on a data set from a simulator study. The models consisted of physiological indicators and indicators from driving data, both stand-alone and in combination. The models were compared on two data sets by means of sensitivity and specificity, and their ability to predict lane departure was studied. In conclusion, a combination of blink indicators based on the ratio between blink amplitude and peak closing velocity of the eyelid (A/PCV) (or blink amplitude and peak opening velocity of the eyelid, A/POV), standard deviation of lateral position, and standard deviation of lateral acceleration relative to the road (ddy) was the most sensitive approach, with a sensitivity of 0.80. This is also supported by the fact that driving data only show the impairment of driving performance, while blink data have a closer relation to sleepiness. Thus, an effective sleepiness warning system may be based on a combination of lane variability measures and variables related to eye movements (particularly slow eye closure) in order to have both high sensitivity (many correct warnings) and acceptable specificity (few false alarms).

  2. Patient-specific meta-analysis for risk assessment using multivariate proportional hazards regression

    PubMed Central

    Crager, Michael R.; Tang, Gong

    2015-01-01

    We propose a method for assessing an individual patient’s risk of a future clinical event using clinical trial or cohort data and Cox proportional hazards regression, combining the information from several studies using meta-analysis techniques. The method combines patient-specific estimates of the log cumulative hazard across studies, weighting by the relative precision of the estimates, using either fixed- or random-effects meta-analysis calculations. Risk assessment can be done for any future patient using a few key summary statistics determined once and for all from each study. Generalizations of the method to logistic regression and linear models are immediate. We evaluate the methods using simulation studies and illustrate their application using real data. PMID:26664111
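
    The combination step can be illustrated with a short numpy sketch of fixed-effect and DerSimonian-Laird random-effects pooling of study-specific estimates (here, a patient's estimated log cumulative hazard from each study). The numbers are hypothetical, and the variance estimator is one common choice, not necessarily the authors' exact implementation.

      import numpy as np

      # Hypothetical patient-specific log cumulative hazard estimates and standard errors from 3 studies
      est = np.array([-1.20, -0.95, -1.40])
      se  = np.array([0.30, 0.25, 0.40])

      w = 1.0 / se**2                                   # inverse-variance weights
      fixed = np.sum(w * est) / np.sum(w)               # fixed-effect pooled estimate
      fixed_se = np.sqrt(1.0 / np.sum(w))

      # DerSimonian-Laird estimate of between-study variance
      Q = np.sum(w * (est - fixed) ** 2)
      k = len(est)
      tau2 = max(0.0, (Q - (k - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))

      w_r = 1.0 / (se**2 + tau2)                        # random-effects weights
      random_eff = np.sum(w_r * est) / np.sum(w_r)
      random_se = np.sqrt(1.0 / np.sum(w_r))

      print(fixed, fixed_se, random_eff, random_se)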

  3. Crossing Hazard Functions in Common Survival Models

    PubMed Central

    Zhang, Jiajia; Peng, Yingwei

    2010-01-01

    Crossing hazard functions have extensive applications in modeling survival data. However, existing studies in the literature mainly focus on comparing crossed hazard functions and estimating the time at which the hazard functions cross, and there is little theoretical work on conditions under which hazard functions from a model will have a crossing. In this paper, we investigate crossing status of hazard functions from the proportional hazards (PH) model, the accelerated hazard (AH) model, and the accelerated failure time (AFT) model. We provide and prove conditions under which the hazard functions from the AH and the AFT models have no crossings or a single crossing. A few examples are also provided to demonstrate how the conditions can be used to determine crossing status of hazard functions from the three models. PMID:20613974
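
    For reference, in one common parameterization (sign conventions vary) the three models specify the hazard given a covariate vector x as

      \lambda_{\mathrm{PH}}(t \mid x)  = \lambda_0(t)\, e^{\beta' x},
      \qquad
      \lambda_{\mathrm{AFT}}(t \mid x) = \lambda_0\!\left(t\, e^{\beta' x}\right) e^{\beta' x},
      \qquad
      \lambda_{\mathrm{AH}}(t \mid x)  = \lambda_0\!\left(t\, e^{\beta' x}\right),

    so that under the PH model the hazard ratio is constant and two hazard curves cannot cross, whereas under the AFT and AH models the possibility and number of crossings depend on the shape of the baseline hazard \lambda_0.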

  4. Progress in studying scintillator proportionality: Phenomenological model

    SciTech Connect

    Bizarri, Gregory; Cherepy, Nerine; Choong, Woon-Seng; Hull, Giulia; Moses, William; Payne, Stephen; Singh, Jai; Valentine, John; Vasilev, Andrey; Williams, Richard

    2009-04-30

    We present a model to describe the origin of non-proportional dependence of scintillator light yield on the energy of an ionizing particle. The non-proportionality is discussed in terms of energy relaxation channels and their linear and non-linear dependences on the deposited energy. In this approach, the scintillation response is described as a function of the deposited energy and the kinetic rates of each relaxation channel. This mathematical framework allows both a qualitative interpretation and a quantitative fitting representation of the scintillation non-proportionality response as a function of the kinetic rates. This method was successfully applied to thallium-doped sodium iodide measured with SLYNCI, a new facility using the Compton coincidence technique. Finally, attention is given to the physical meaning of the dominant relaxation channels, and to the potential causes responsible for the scintillation non-proportionality. We find that thallium-doped sodium iodide behaves as if non-proportionality is due to competition between radiative recombination and non-radiative Auger processes.

  5. NASA CONNECT: Proportionality: Modeling the Future

    NASA Technical Reports Server (NTRS)

    2000-01-01

    'Proportionality: Modeling the Future' is the sixth of seven programs in the 1999-2000 NASA CONNECT series. Produced by NASA Langley Research Center's Office of Education, NASA CONNECT is an award-winning series of instructional programs designed to enhance the teaching of math, science and technology concepts in grades 5-8. NASA CONNECT establishes the 'connection' between the mathematics, science, and technology concepts taught in the classroom and NASA research. Each program in the series supports the national mathematics, science, and technology standards; includes a resource-rich teacher guide; and uses a classroom experiment and web-based activity to complement and enhance the math, science, and technology concepts presented in the program. NASA CONNECT is FREE and the programs in the series are in the public domain. Visit our web site and register: http://connect.larc.nasa.gov. In 'Proportionality: Modeling the Future', students will examine how patterns, measurement, ratios, and proportions are used in the research, development, and production of airplanes.

  6. Boron-10 Lined Proportional Counter Model Validation

    SciTech Connect

    Lintereur, Azaree T.; Siciliano, Edward R.; Kouzes, Richard T.

    2012-06-30

    The Department of Energy Office of Nuclear Safeguards (NA-241) is supporting the project “Coincidence Counting With Boron-Based Alternative Neutron Detection Technology” at Pacific Northwest National Laboratory (PNNL) for the development of an alternative neutron coincidence counter. The goal of this project is to design, build and demonstrate a boron-lined proportional tube-based alternative system in the configuration of a coincidence counter. This report discusses the validation studies performed to establish the degree of accuracy of the computer modeling methods currently used to simulate the response of boron-lined tubes. This is the precursor to developing models for the uranium neutron coincidence collar under Task 2 of this project.

  7. Proportional Reasoning of Preservice Elementary Education Majors: An Epistemic Model of the Proportional Reasoning Construct.

    ERIC Educational Resources Information Center

    Fleener, M. Jayne

    Current research and learning theory suggest that a hierarchy of proportional reasoning exists that can be tested. Using G. Vergnaud's four complexity variables (structure, content, numerical characteristics, and presentation) and T. E. Kieren's model of rational number knowledge building, an epistemic model of proportional reasoning was…

  8. Augmented mixed models for clustered proportion data

    PubMed Central

    Bandyopadhyay, Dipankar; Galvis, Diana M; Lachos, Victor H

    2015-01-01

    Often in biomedical research, we deal with continuous (clustered) proportion responses ranging between zero and one that quantify the disease status of the cluster units. Interestingly, the study population might also consist of relatively disease-free as well as highly diseased subjects, contributing to proportion values in the interval [0, 1]. Regression on a variety of parametric densities with support lying in (0, 1), such as beta regression, can assess important covariate effects. However, they are deemed inappropriate due to the presence of zeros and/or ones. To circumvent this, we introduce a general class of proportion densities and further augment the probabilities of zero and one to this density, controlling for the clustering. Our approach is Bayesian and presents a computationally convenient framework amenable to available freeware. Bayesian case-deletion influence diagnostics based on q-divergence measures are automatic from the Markov chain Monte Carlo output. The methodology is illustrated using both simulation studies and application to a real dataset from a clinical periodontology study. PMID:25491718
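
    The augmentation idea can be sketched with a simple zero-one-inflated beta density (an illustrative special case of a general proportion density, with hypothetical parameters rather than the authors' full hierarchical model):

      import numpy as np
      from scipy.stats import beta

      def zoib_logpdf(y, p0, p1, a, b):
          """Log-density: P(Y=0)=p0, P(Y=1)=p1, and Beta(a, b) on (0, 1) carrying the remaining mass."""
          y = np.asarray(y, dtype=float)
          out = np.empty_like(y)
          out[y == 0.0] = np.log(p0)
          out[y == 1.0] = np.log(p1)
          inside = (y > 0.0) & (y < 1.0)
          out[inside] = np.log(1.0 - p0 - p1) + beta.logpdf(y[inside], a, b)
          return out

      print(zoib_logpdf([0.0, 0.35, 1.0], p0=0.10, p1=0.05, a=2.0, b=5.0))

    In the regression version, p0, p1 and the mean of the continuous component are each linked to covariates (and to cluster-level random effects), which is the role played by the Bayesian hierarchy described above.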

  9. Mathematically modelling proportions of Japanese populations by industry

    NASA Astrophysics Data System (ADS)

    Hirata, Yoshito

    2016-10-01

    I propose a mathematical model for temporal changes of proportions for industrial sectors. I prove that the model keeps the proportions for the primary, the secondary, and the tertiary sectors between 0 and 100% and preserves their total as 100%. The model fits the Japanese historical data between 1950 and 2005 for the population proportions by industry very well. The model also predicts that the proportion for the secondary industry becomes negligible, falling below 1% by around 2080.

  10. Estimating the relative hazard by the ratio of logarithms of event-free proportions.

    PubMed

    Perneger, Thomas V

    2008-09-01

    Clinical trials typically examine associations between an intervention and the occurrence of a clinical event. The association is often reported as a relative risk, more rarely as an odds ratio. Unfortunately, when the scientific interest lies with the ratio of incidence rates, both these statistics are inaccurate: the odds ratio is too extreme, and the relative risk too conservative. These biases are particularly strong when the outcomes are common. This paper describes an alternative statistic, the ratio of logarithms of event-free proportions (or relative log survival), which is simple to compute yet unbiased vis-à-vis the relative hazard. A formula to compute the sampling error of this statistic is also provided. Multivariate analysis can be conducted using complementary log-log regression. Precise knowledge of event occurrence times is not required for these analyses. Relative log survival may be particularly useful for meta-analyses of trials in which the proportion of events varies between studies.
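
    In symbols, with event-free proportions p1 and p0 in the two arms, the statistic is ln(p1)/ln(p0). The sketch below computes it together with a delta-method confidence interval; the variance approximation is a standard binomial delta-method expression assumed here for illustration, not necessarily the exact formula given in the paper.

      import numpy as np

      def relative_log_survival(events1, n1, events0, n0, z=1.96):
          p1 = 1.0 - events1 / n1            # event-free proportion, intervention arm
          p0 = 1.0 - events0 / n0            # event-free proportion, control arm
          rls = np.log(p1) / np.log(p0)      # approximates the relative hazard

          # Delta-method variance of log(-log p_hat): (1 - p) / (n * p * (log p)^2)
          v1 = (1.0 - p1) / (n1 * p1 * np.log(p1) ** 2)
          v0 = (1.0 - p0) / (n0 * p0 * np.log(p0) ** 2)
          se_log = np.sqrt(v1 + v0)
          return rls, (rls * np.exp(-z * se_log), rls * np.exp(z * se_log))

      print(relative_log_survival(events1=60, n1=200, events0=90, n0=200))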

  11. Identifying and modeling safety hazards

    SciTech Connect

    Daniels, Jesse; Bahill, Terry; Werner, Paul W.

    2000-03-29

    The hazard model described in this paper is designed to accept data over the Internet from distributed databases. A hazard object template is used to ensure that all necessary descriptors are collected for each object. Three methods for combining the data are compared and contrasted. Three methods are used for handling the three types of interactions between the hazard objects.

  12. Empirical study of correlated survival times for recurrent events with proportional hazards margins and the effect of correlation and censoring

    PubMed Central

    2013-01-01

    Background In longitudinal studies where subjects experience recurrent incidents over a period of time, such as respiratory infections, fever or diarrhea, statistical methods are required to take into account the within-subject correlation. Methods For repeated events data with censored failure times, the independent increment (AG), marginal (WLW) and conditional (PWP) models are three multiple failure models that generalize Cox's proportional hazards model. In this paper, we review the efficiency, accuracy and robustness of all three models under simulated scenarios with varying degrees of within-subject correlation, censoring levels, maximum number of possible recurrences and sample size. We also study the methods' performance on a real dataset from a cohort study with bronchial obstruction. Results We find substantial differences between the methods, and no single method is optimal. AG and PWP seem to be preferable to WLW for low correlation levels, but the situation reverses for high correlations. Conclusions All methods are robust to censoring, worsen with increasing numbers of recurrences, and share a bias problem which, among other consequences, makes asymptotic normal confidence intervals not fully reliable, although these are well developed theoretically. PMID:23883000
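
    As an illustration of how the Andersen-Gill (AG) model is typically set up, the sketch below arranges recurrent-event data in counting-process (start, stop] form and fits it with the Python lifelines package; the toy data and column names are hypothetical, and a robust (cluster) variance, which AG analyses usually add, is omitted here for brevity.

      import pandas as pd
      from lifelines import CoxTimeVaryingFitter

      # One row per at-risk interval per child; event = 1 if an infection ended the interval
      ag = pd.DataFrame({
          "id":       [1, 1, 1, 2, 2],
          "start":    [0, 30, 75, 0, 50],
          "stop":     [30, 75, 120, 50, 90],
          "event":    [1, 1, 0, 1, 0],
          "crowding": [1, 1, 1, 0, 0],
      })

      ctv = CoxTimeVaryingFitter()
      ctv.fit(ag, id_col="id", start_col="start", stop_col="stop", event_col="event")
      ctv.print_summary()

    The WLW and PWP models use the same covariates but define the risk sets differently (total time with a separate stratum per event number for WLW; gap or total time restricted to subjects who experienced the previous event for PWP).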

  13. Experiments to Determine Whether Recursive Partitioning (CART) or an Artificial Neural Network Overcomes Theoretical Limitations of Cox Proportional Hazards Regression

    NASA Technical Reports Server (NTRS)

    Kattan, Michael W.; Hess, Kenneth R.

    1998-01-01

    New computationally intensive tools for medical survival analyses include recursive partitioning (also called CART) and artificial neural networks. A challenge that remains is to better understand the behavior of these techniques in an effort to know when they will be effective tools. Theoretically they may overcome limitations of the traditional multivariable survival technique, the Cox proportional hazards regression model. Experiments were designed to test whether the new tools would, in practice, overcome these limitations. Two datasets in which theory suggests CART and the neural network should outperform the Cox model were selected. The first was a published leukemia dataset manipulated to have a strong interaction that CART should detect. The second was a published cirrhosis dataset with pronounced nonlinear effects that a neural network should fit. Repeated sampling of 50 training and testing subsets was applied to each technique. The concordance index C was calculated as a measure of predictive accuracy by each technique on the testing dataset. In the interaction dataset, CART outperformed Cox (P less than 0.05) with a C improvement of 0.1 (95% CI, 0.08 to 0.12). In the nonlinear dataset, the neural network outperformed the Cox model (P less than 0.05), but by a very slight amount (0.015). As predicted by theory, CART and the neural network were able to overcome limitations of the Cox model. Experiments like these are important to increase our understanding of when one of these new techniques will outperform the standard Cox model. Further research is necessary to predict which technique will do best a priori and to assess the magnitude of superiority.
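
    The concordance index C used to compare the techniques can be computed for any risk score. A minimal sketch with the lifelines utility (toy numbers, not the study's data) is shown below; note that the function expects scores oriented so that larger values mean longer survival, hence the sign flip for a hazard-type score.

      from lifelines.utils import concordance_index

      times  = [5, 8, 12, 20, 25]          # observed follow-up times in the test set
      risk   = [2.1, 1.7, 1.9, 0.4, 0.2]   # model output: higher = higher predicted hazard
      events = [1, 1, 0, 1, 0]             # 1 = event observed, 0 = censored

      c = concordance_index(times, [-r for r in risk], events)
      print(round(c, 3))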

  14. Augmented mixed beta regression models for periodontal proportion data

    PubMed Central

    Galvis, Diana M.; Bandyopadhyay, Dipankar; Lachos, Victor H.

    2014-01-01

    Continuous (clustered) proportion data often arise in various domains of medicine and public health where the response variable of interest is a proportion (or percentage) quantifying disease status for the cluster units, ranging between zero and one. However, because of the presence of relatively disease-free as well as heavily diseased subjects in any study, the proportion values can lie in the interval [0, 1]. While beta regression can be adapted to assess covariate effects in these situations, its versatility is often challenged by the presence or excess of zeros and ones, since the beta support lies in the open interval (0, 1). To circumvent this, we augment the probabilities of zero and one with the beta density, controlling for the clustering effect. Our approach is Bayesian with the ability to borrow information across various stages of the complex model hierarchy and produces a computationally convenient framework amenable to available freeware. The marginal likelihood is tractable and can be used to develop Bayesian case-deletion influence diagnostics based on q-divergence measures. Both simulation studies and application to a real dataset from a clinical periodontology study quantify the gain in model fit and parameter estimation over other ad hoc alternatives and provide quantitative insight into assessing the true covariate effects on the proportion responses. PMID:24764045

  15. Proportion cured models applied to 23 cancer sites in Norway.

    PubMed

    Cvancarova, Milada; Aagnes, Bjarte; Fosså, Sophie D; Lambert, Paul C; Møller, Bjørn; Bray, Freddie

    2013-04-01

    Statistical cure is reached when a group of patients has the same mortality as cancer-free individuals. Cure models predict the cured proportion and the median survival of fatal cases. Cure models have seldom been applied and tested systematically across all major cancer sites. Incidence and follow-up data on 23 cancer sites recorded at the Cancer Registry of Norway 1963-2007 were obtained. Mixture cure models were fitted to obtain trends and up-to-date estimates (based on the period approach), assuming cured and uncured groups exist. The model converged for cancers of the mouth and pharynx, oesophagus, stomach, colon, rectum, liver, gallbladder, pancreas, lung and trachea, ovary, kidney, bladder, CNS, non-Hodgkin lymphoma (only for males) and leukemia. The proportion of cured patients increased 1963-2002 for both sexes, with the largest changes (in percent) seen for leukemia (46.4 and 46.7) and CNS (35.9 and 42.0), males given first. Median survival time for the uncured cases increased for colon and rectal cancer, and there was a three-fold increase in median survival time for patients with fatal ovarian cancers. Cancers of the bladder and CNS had the highest up-to-date proportion cured (in percent), 67.4 and 64.0, respectively; pancreas and liver were amongst the lowest (5.7 and 9.9, respectively). Cure models are useful when monitoring progress in cancer care, but must be applied and interpreted with caution. The absolute estimates of the cure proportion are speculative and should not be calculated where cure is not medically anticipated.
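
    Schematically, a mixture cure model writes the overall survival as S(t) = pi + (1 - pi) * Su(t), where pi is the cured proportion and Su is the survival function of the uncured (fatal) cases. The sketch below uses an exponential Su purely as an illustrative assumption; the registry models described above are framed in terms of relative survival rather than this toy form.

      import numpy as np

      def mixture_cure_survival(t, pi, rate):
          """S(t) = pi + (1 - pi) * exp(-rate * t): cure fraction pi, exponential uncured survival."""
          t = np.asarray(t, dtype=float)
          return pi + (1.0 - pi) * np.exp(-rate * t)

      def median_survival_uncured(rate):
          """Median survival time of the uncured cases under the exponential choice."""
          return np.log(2.0) / rate

      # Example: 60% cured, uncured cases with a median survival of 2 years
      print(mixture_cure_survival([1, 5, 20], pi=0.60, rate=np.log(2.0) / 2.0))
      print(median_survival_uncured(np.log(2.0) / 2.0))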

  17. Two models for evaluating landslide hazards

    USGS Publications Warehouse

    Davis, J.C.; Chung, C.-J.; Ohlmacher, G.C.

    2006-01-01

    Two alternative procedures for estimating landslide hazards were evaluated using data on topographic digital elevation models (DEMs) and bedrock lithologies in an area adjacent to the Missouri River in Atchison County, Kansas, USA. The two procedures are based on the likelihood ratio model but utilize different assumptions. The empirical likelihood ratio model is based on non-parametric empirical univariate frequency distribution functions under an assumption of conditional independence, while the multivariate logistic discriminant model assumes that likelihood ratios can be expressed in terms of logistic functions. The relative hazards of occurrence of landslides were estimated by an empirical likelihood ratio model and by multivariate logistic discriminant analysis. Predictor variables consisted of grids containing topographic elevations, slope angles, and slope aspects calculated from a 30-m DEM. An integer grid of coded bedrock lithologies taken from digitized geologic maps was also used as a predictor variable. Both statistical models yield relative estimates in the form of the proportion of total map area predicted to already contain or to be the site of future landslides. The stabilities of the estimates were checked by cross-validation of results from random subsamples, using each of the two procedures. Cell-by-cell comparisons of hazard maps made by the two models show that the two sets of estimates are virtually identical. This suggests that the empirical likelihood ratio and the logistic discriminant analysis models are robust with respect to the conditional independence assumption and the logistic function assumption, respectively, and that either model can be used successfully to evaluate landslide hazards.

  18. Computer Model Locates Environmental Hazards

    NASA Technical Reports Server (NTRS)

    2008-01-01

    Catherine Huybrechts Burton founded San Francisco-based Endpoint Environmental (2E) LLC in 2005 while she was a student intern and project manager at Ames Research Center with NASA's DEVELOP program. The 2E team created the Tire Identification from Reflectance model, which algorithmically processes satellite images using turnkey technology to retain only the darkest parts of an image. This model allows 2E to locate piles of rubber tires, which often are stockpiled illegally and cause hazardous environmental conditions and fires.

  19. Models of volcanic eruption hazards

    SciTech Connect

    Wohletz, K.H.

    1992-01-01

    Volcanic eruptions pose an ever-present but poorly constrained hazard to life and property for geothermal installations in volcanic areas. Because eruptions occur sporadically and may limit field access, quantitative and systematic field studies of eruptions are difficult to complete. Circumventing this difficulty, laboratory models and numerical simulations are pivotal in building our understanding of eruptions. For example, the results of fuel-coolant interaction experiments show that magma-water interaction controls many eruption styles. Applying these results, increasing numbers of field studies now document and interpret the role of external water in eruptions. Similarly, numerical simulations solve the fundamental physics of high-speed fluid flow and give quantitative predictions that elucidate the complexities of pyroclastic flows and surges. A primary goal of these models is to guide geologists in searching for critical field relationships and making their interpretations. Coupled with field work, modeling is beginning to allow more quantitative and predictive volcanic hazard assessments.

  20. Modeling lahar behavior and hazards

    USGS Publications Warehouse

    Manville, Vernon; Major, Jon J.; Fagents, Sarah A.

    2013-01-01

    Lahars are highly mobile mixtures of water and sediment of volcanic origin that are capable of traveling tens to >100 km at speeds exceeding tens of km/hr. Such flows are among the most serious ground-based hazards at many volcanoes because of their sudden onset, rapid advance rates, long runout distances, high energy, ability to transport large volumes of material, and tendency to flow along existing river channels where populations and infrastructure are commonly concentrated. They can grow in volume and peak discharge through erosion and incorporation of external sediment and/or water, inundate broad areas, and leave deposits many meters thick. Furthermore, lahars can recur for many years to decades after an initial volcanic eruption, as fresh pyroclastic material is eroded and redeposited during rainfall events, resulting in a spatially and temporally evolving hazard. Improving understanding of the behavior of these complex, gravitationally driven, multi-phase flows is key to mitigating the threat to communities at lahar-prone volcanoes. However, their complexity and evolving nature pose significant challenges to developing the models of flow behavior required for delineating their hazards and hazard zones.

  1. Proportional and scale change models to project failures of mechanical components with applications to space station

    NASA Technical Reports Server (NTRS)

    Taneja, Vidya S.

    1996-01-01

    In this paper we develop the mathematical theory of proportional and scale change models to perform reliability analysis. The results obtained will be applied to the Reaction Control System (RCS) thruster valves on an orbiter. With the advent of extended EVAs associated with PROX OPS (ISSA & MIR) and docking, the loss of a thruster valve now takes on an expanded safety significance. Previous studies assume a homogeneous population of components, with each component having the same failure rate. However, as various components experience different stresses and are exposed to different environments, their failure rates change with time. In this paper we model the reliability of thruster valves by treating these valves as a censored repairable system. The model for each valve takes the form of a nonhomogeneous process with an intensity function that is treated either as a proportional hazards model or as a scale-change random-effects hazard model. Each component has an associated z, an independent realization of the random variable Z from a distribution G(z). This unobserved quantity z can be used to describe heterogeneity systematically. For the various models, methods for estimating the model parameters from censored data are developed. Available field data (from previously flown flights) are from non-renewable systems. The failure rate estimated from such data will need to be modified for renewable systems such as thruster valves.
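
    The repairable-system formulation can be illustrated by simulating a nonhomogeneous Poisson process whose intensity is scaled by a unit-specific frailty z, here with a power-law baseline intensity and a gamma frailty chosen purely for illustration (the paper's models and parameter values are not reproduced here).

      import numpy as np

      rng = np.random.default_rng(0)

      def simulate_nhpp_powerlaw(alpha, beta, z, t_end):
          """Event times of an NHPP with intensity z * alpha * beta * t**(beta - 1) on (0, t_end].

          The cumulative intensity is Lambda(t) = z * alpha * t**beta; events are generated by
          drawing a Poisson count, placing points uniformly on (0, Lambda(t_end)], and inverting Lambda.
          """
          lam_end = z * alpha * t_end ** beta
          n = rng.poisson(lam_end)
          u = np.sort(rng.uniform(0.0, lam_end, size=n))
          return (u / (z * alpha)) ** (1.0 / beta)

      # Three heterogeneous valves: frailty z drawn from a gamma distribution with mean 1
      histories = [simulate_nhpp_powerlaw(alpha=0.05, beta=1.4, z=rng.gamma(2.0, 0.5), t_end=100.0)
                   for _ in range(3)]
      print([len(h) for h in histories])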

  2. Accelerated Hazards Mixture Cure Model

    PubMed Central

    Zhang, Jiajia; Peng, Yingwei

    2010-01-01

    We propose a new cure model for survival data with a surviving or cure fraction. The new model is a mixture cure model where the covariate effects on the proportion of cure and the distribution of the failure time of uncured patients are separately modeled. Unlike the existing mixture cure models, the new model allows covariate effects on the failure time distribution of uncured patients to be negligible at time zero and to increase as time goes by. Such a model is particularly useful in some cancer treatments when the treatment effect increases gradually from zero, and the existing models usually cannot handle this situation properly. We develop a rank-based semiparametric estimation method to obtain the maximum likelihood estimates of the parameters in the model. We compare it with existing models and methods via a simulation study, and apply the model to a breast cancer data set. The numerical studies show that the new model provides a useful addition to the cure model literature. PMID:19697127

  3. Modelling boron-lined proportional counter response to neutrons.

    PubMed

    Shahri, A; Ghal-Eh, N; Etaati, G R

    2013-09-01

    A detailed Monte Carlo simulation of the response of a boron-lined proportional counter to a neutron source is presented. MCNP4C simulations and experimental data for different source-moderator geometries are given for comparison. The influence of different irradiation geometries and boron-lining thicknesses on the detector response has been studied.

  4. The Identification and Validation Process of Proportional Reasoning Attributes: An Application of a Proportional Reasoning Modeling Framework

    ERIC Educational Resources Information Center

    Tjoe, Hartono; de la Torre, Jimmy

    2014-01-01

    In this paper, we discuss the process of identifying and validating students' abilities to think proportionally. More specifically, we describe the methodology we used to identify these proportional reasoning attributes, beginning with the selection and review of relevant literature on proportional reasoning. We then continue with the…

  5. Modeling and Hazard Analysis Using STPA

    NASA Astrophysics Data System (ADS)

    Ishimatsu, Takuto; Leveson, Nancy; Thomas, John; Katahira, Masa; Miyamoto, Yuko; Nakao, Haruka

    2010-09-01

    A joint research project between MIT and JAXA/JAMSS is investigating the application of a new hazard analysis to the system and software in the HTV. Traditional hazard analysis focuses on component failures but software does not fail in this way. Software most often contributes to accidents by commanding the spacecraft into an unsafe state (e.g., turning off the descent engines prematurely) or by not issuing required commands. That makes the standard hazard analysis techniques of limited usefulness on software-intensive systems, a category that describes most spacecraft built today. STPA is a new hazard analysis technique based on systems theory rather than reliability theory. It treats safety as a control problem rather than a failure problem. The goal of STPA, which is to create a set of scenarios that can lead to a hazard, is the same as FTA, but STPA includes a broader set of potential scenarios, including those in which no failures occur but problems arise due to unsafe and unintended interactions among the system components. STPA also provides more guidance to the analysts than traditional fault tree analysis. Functional control diagrams are used to guide the analysis. In addition, JAXA uses a model-based system engineering development environment (created originally by Leveson and called SpecTRM) which also assists in the hazard analysis. One of the advantages of STPA is that it can be applied early in the system engineering and development process in a safety-driven design process where hazard analysis drives the design decisions rather than waiting until reviews identify problems that are then costly or difficult to fix. It can also be applied in an after-the-fact analysis and hazard assessment, which is what we did in this case study. This paper describes the experimental application of STPA to the JAXA HTV in order to determine the feasibility and usefulness of the new hazard analysis technique. Because the HTV was originally developed using fault tree analysis

  6. Wind shear modeling for aircraft hazard definition

    NASA Technical Reports Server (NTRS)

    Frost, W.; Camp, D. W.; Wang, S. T.

    1978-01-01

    Mathematical models of wind profiles were developed for use in fast-time and manned flight simulation studies aimed at defining and eliminating wind shear hazards. A set of wind profiles and associated wind shear characteristics for stable and neutral boundary layers, thunderstorms, and frontal winds potentially encounterable by aircraft in the terminal area is given. Engineering models of wind shear for direct hazard analysis are presented in mathematical formulae, graphs, tables, and computer lookup routines. The wind profile data utilized to establish the models are described in terms of location, method of acquisition, time of observation, and number of data points up to 500 m. Recommendations, engineering interpretations and guidelines for use of the data are given, and the range of applicability of the wind shear models is described.

  7. A General Semiparametric Hazards Regression Model: Efficient Estimation and Structure Selection

    PubMed Central

    Tong, Xingwei; Zhu, Liang; Leng, Chenlei; Leisenring, Wendy; Robison, Leslie L.

    2014-01-01

    We consider a general semiparametric hazards regression model that encompasses Cox's proportional hazards model and the accelerated failure time model for survival analysis. To overcome the nonexistence of the maximum likelihood, we derive a kernel-smoothed profile likelihood function, and prove that the resulting estimates of the regression parameters are consistent and achieve semiparametric efficiency. In addition, we develop penalized structure selection techniques to determine which covariates constitute the accelerated failure time model and which covariates constitute the proportional hazards model. The proposed method is able to estimate the model structure consistently and model parameters efficiently. Furthermore, variance estimation is straightforward. The proposed estimation performs well in simulation studies and is applied to the analysis of a real data set. PMID:23824784

  8. Fitting Proportional Odds Models to Educational Data in Ordinal Logistic Regression Using Stata, SAS and SPSS

    ERIC Educational Resources Information Center

    Liu, Xing

    2008-01-01

    The proportional odds (PO) model, which is also called cumulative odds model (Agresti, 1996, 2002 ; Armstrong & Sloan, 1989; Long, 1997, Long & Freese, 2006; McCullagh, 1980; McCullagh & Nelder, 1989; Powers & Xie, 2000; O'Connell, 2006), is one of the most commonly used models for the analysis of ordinal categorical data and comes from the class…

  9. Partial proportional odds model-an alternate choice for analyzing pedestrian crash injury severities.

    PubMed

    Sasidharan, Lekshmi; Menéndez, Mónica

    2014-11-01

    The conventional methods for crash injury severity analyses include either treating the severity data as ordered (e.g. ordered logit/probit models) or non-ordered (e.g. multinomial models). The ordered models require the data to meet the proportional odds assumption, according to which the predictors can only have the same effect on different levels of the dependent variable, which is often not the case with crash injury severities. On the other hand, non-ordered analyses completely ignore the inherent hierarchical nature of crash injury severities. Therefore, treating the crash severity data as either ordered or non-ordered results in violating some of the key principles. To address these concerns, this paper explores the application of a partial proportional odds (PPO) model to bridge the gap between ordered and non-ordered severity modeling frameworks. The PPO model allows the covariates that meet the proportional odds assumption to affect different crash severity levels with the same magnitude, whereas the covariates that do not meet the proportional odds assumption can have different effects on different severity levels. This study is based on a five-year (2008-2012) national pedestrian safety dataset for Switzerland. A comparison between the application of PPO models, ordered logit models, and multinomial logit models for pedestrian injury severity evaluation is also included here. The study shows that PPO models outperform the other models considered based on different evaluation criteria. Hence, it is a viable method for analyzing pedestrian crash injury severities.
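
    The structure of the partial proportional odds model can be made concrete with a small numpy sketch: covariates in x_po get one coefficient shared across severity thresholds, while covariates in x_ppo get threshold-specific coefficients. The parameter values below are illustrative only, not the Swiss estimates.

      import numpy as np

      def ppo_probabilities(x_po, x_ppo, alpha, beta, gamma):
          """Category probabilities under a partial proportional odds model.

          alpha: (J-1,) thresholds; beta: coefficients shared across thresholds (for x_po);
          gamma: (J-1, len(x_ppo)) threshold-specific coefficients (for x_ppo).
          P(Y > j | x) = expit(alpha[j] + x_po @ beta + x_ppo @ gamma[j]).
          """
          logits = alpha + x_po @ beta + gamma @ x_ppo
          exceed = 1.0 / (1.0 + np.exp(-logits))           # P(Y > j) for j = 0..J-2
          exceed = np.concatenate(([1.0], exceed, [0.0]))  # P(Y > -1) = 1, P(Y > J-1) = 0
          return exceed[:-1] - exceed[1:]                  # P(Y = j)

      # Three severity levels; one shared covariate and one covariate violating proportional odds
      probs = ppo_probabilities(x_po=np.array([1.0]), x_ppo=np.array([0.5]),
                                alpha=np.array([0.2, -1.0]),
                                beta=np.array([0.3]),
                                gamma=np.array([[0.4], [0.9]]))
      print(probs, probs.sum())

    One known caveat of this formulation is that unconstrained threshold-specific coefficients can, for some covariate values, yield non-monotone exceedance probabilities (and hence negative category probabilities), so fitted PPO models are usually checked for this.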

  10. Experimental Concepts for Testing Seismic Hazard Models

    NASA Astrophysics Data System (ADS)

    Marzocchi, W.; Jordan, T. H.

    2015-12-01

    Seismic hazard analysis is the primary interface through which useful information about earthquake rupture and wave propagation is delivered to society. To account for the randomness (aleatory variability) and limited knowledge (epistemic uncertainty) of these natural processes, seismologists must formulate and test hazard models using the concepts of probability. In this presentation, we will address the scientific objections that have been raised over the years against probabilistic seismic hazard analysis (PSHA). Owing to the paucity of observations, we must rely on expert opinion to quantify the epistemic uncertainties of PSHA models (e.g., in the weighting of individual models from logic-tree ensembles of plausible models). The main theoretical issue is a frequentist critique: subjectivity is immeasurable; ergo, PSHA models cannot be objectively tested against data; ergo, they are fundamentally unscientific. We have argued (PNAS, 111, 11973-11978) that the Bayesian subjectivity required for casting epistemic uncertainties can be bridged with the frequentist objectivity needed for pure significance testing through "experimental concepts." An experimental concept specifies collections of data, observed and not yet observed, that are judged to be exchangeable (i.e., with a joint distribution independent of the data ordering) when conditioned on a set of explanatory variables. We illustrate, through concrete examples, experimental concepts useful in the testing of PSHA models for ontological errors in the presence of aleatory variability and epistemic uncertainty. In particular, we describe experimental concepts that lead to exchangeable binary sequences that are statistically independent but not identically distributed, showing how the Bayesian concept of exchangeability generalizes the frequentist concept of experimental repeatability. We also address the issue of testing PSHA models using spatially correlated data.

  11. Decision-Tree Models of Categorization Response Times, Choice Proportions, and Typicality Judgments

    ERIC Educational Resources Information Center

    Lafond, Daniel; Lacouture, Yves; Cohen, Andrew L.

    2009-01-01

    The authors present 3 decision-tree models of categorization adapted from T. Trabasso, H. Rollins, and E. Shaughnessy (1971) and use them to provide a quantitative account of categorization response times, choice proportions, and typicality judgments at the individual-participant level. In Experiment 1, the decision-tree models were fit to…

  12. Frequencies as proportions: Using a teaching model based on Pirie and Kieren's model of mathematical understanding

    NASA Astrophysics Data System (ADS)

    Wright, Vince

    2014-03-01

    Pirie and Kieren (1989, For the Learning of Mathematics, 9(3), 7-11; 1992, Journal of Mathematical Behavior, 11, 243-257; 1994a, Educational Studies in Mathematics, 26, 61-86; 1994b, For the Learning of Mathematics, 14(1), 39-43) created a model (P-K) that describes a dynamic and recursive process by which learners develop their mathematical understanding. The model was adapted to create the teaching model used in the New Zealand Numeracy Development Projects (Ministry of Education, 2007). A case study of a 3-week sequence of instruction with a group of eight 12- and 13-year-old students provided the data. The teacher/researcher used folding back to materials and images and progressing from materials to imaging to number properties to assist students to develop their understanding of frequencies as proportions. The data show that successful implementation of the model is dependent on the teacher noticing and responding to the layers of understanding demonstrated by the students and the careful selection of materials, problems and situations. It supports the use of the model as a useful part of teachers' instructional strategies and the importance of pedagogical content knowledge to the quality of the way the model is used.

  13. Fuzzy portfolio model with fuzzy-input return rates and fuzzy-output proportions

    NASA Astrophysics Data System (ADS)

    Tsaur, Ruey-Chyn

    2015-02-01

    In the finance market, a short-term investment strategy is usually applied in portfolio selection in order to reduce investment risk; however, the economy is uncertain and the investment period is short. Further, an investor has incomplete information for selecting a portfolio with crisp proportions for each chosen security. In this paper we present a new method of constructing a fuzzy portfolio model for the parameters of fuzzy-input return rates and fuzzy-output proportions, based on possibilistic mean-standard deviation models. Furthermore, we consider both excess and shortage of investment in different economic periods by using a fuzzy constraint for the sum of the fuzzy proportions, and we also account for the risks of securities investment and the vagueness of incomplete information during periods of economic depression in the portfolio selection. Finally, we present a numerical example of a portfolio selection problem to illustrate the proposed model, and a sensitivity analysis is performed based on the results.

  14. Phylogenetic Tree Reconstruction Accuracy and Model Fit when Proportions of Variable Sites Change across the Tree

    PubMed Central

    Grievink, Liat Shavit; Penny, David; Hendy, Michael D.; Holland, Barbara R.

    2010-01-01

    Commonly used phylogenetic models assume a homogeneous process through time in all parts of the tree. However, it is known that these models can be too simplistic as they do not account for nonhomogeneous lineage-specific properties. In particular, it is now widely recognized that as constraints on sequences evolve, the proportion and positions of variable sites can vary between lineages, causing heterotachy. The extent to which this model misspecification affects tree reconstruction is still unknown. Here, we evaluate the effect of changes in the proportions and positions of variable sites on model fit and tree estimation. We consider 5 current models of nucleotide sequence evolution in a Bayesian Markov chain Monte Carlo framework as well as maximum parsimony (MP). We show that for a tree with 4 lineages where 2 nonsister taxa undergo a change in the proportion of variable sites, tree reconstruction under the best-fitting model, which is chosen using a relative test, often results in the wrong tree. In this case, we found that an absolute test of model fit is a better predictor of tree estimation accuracy. We also found further evidence that MP is not immune to heterotachy. In addition, we show that increased sampling of taxa that have undergone a change in proportion and positions of variable sites is critical for accurate tree reconstruction. PMID:20525636

  15. A comparative study on splitting criteria of a survival tree based on the Cox proportional model.

    PubMed

    Shimokawa, Asanao; Kawasaki, Yohei; Miyaoka, Etsuo

    2016-01-01

    We treat situations in which the effect of covariates on the hazard differs between subgroups of patients. To handle this situation, we consider a hybrid of the Cox model and a tree-structured model. Through simulation studies, we compared several splitting criteria for constructing this hybrid model. As a result, the criterion based on the improvement in the negative maximum partial log-likelihood obtained by splitting showed good performance in many situations. We also present the results obtained by applying this tree model in an actual medical research study to show its utility. PMID:26043356
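
    The best-performing criterion can be sketched as the gain in the maximized Cox partial log-likelihood when the parent node is split into two child nodes, each refitted with the same covariates. The helper below uses the Python lifelines package as one convenient implementation; the column names are hypothetical, and node-size/event-count safeguards are omitted.

      import pandas as pd
      from lifelines import CoxPHFitter

      def split_improvement(df, split_col, threshold, duration_col="time",
                            event_col="event", covariates=("treatment",)):
          """(ll_left + ll_right) - ll_parent for one candidate split of the data."""
          cols = [duration_col, event_col, *covariates]

          def loglik(data):
              cph = CoxPHFitter()
              cph.fit(data[cols], duration_col=duration_col, event_col=event_col)
              return cph.log_likelihood_

          left = df[df[split_col] <= threshold]
          right = df[df[split_col] > threshold]
          return loglik(left) + loglik(right) - loglik(df)

    A tree-growing procedure would evaluate this quantity over candidate covariates and thresholds and split where the improvement is largest, which mirrors the kind of criterion that performed well in the simulations.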

  16. Regression model estimation of early season crop proportions: North Dakota, some preliminary results

    NASA Technical Reports Server (NTRS)

    Lin, K. K. (Principal Investigator)

    1982-01-01

    To estimate crop proportions early in the season, an approach is proposed based on: use of a regression-based prediction equation to obtain an a priori estimate for specific major crop groups; modification of this estimate using current-year LANDSAT and weather data; and a breakdown of the major crop groups into specific crops by regression models. Results from the development and evaluation of appropriate regression models for the first portion of the proposed approach are presented. The results show that the model predicts 1980 crop proportions very well at both county and crop reporting district levels. In terms of planted acreage, the model underpredicted the 1980 published planted acreage at the county level by 9.1 percent. At the crop reporting district level it predicted the 1980 published planted acreage almost exactly, overpredicting it by just 0.92 percent.

  17. Satellite image collection modeling for large area hazard emergency response

    NASA Astrophysics Data System (ADS)

    Liu, Shufan; Hodgson, Michael E.

    2016-08-01

    Timely collection of critical hazard information is the key to intelligent and effective hazard emergency response decisions. Satellite remote sensing imagery provides an effective way to collect critical information. Natural hazards, however, often have large impact areas - larger than a single satellite scene. Additionally, the hazard impact area may be discontinuous, particularly in flooding or tornado hazard events. In this paper, a spatial optimization model is proposed to solve the large-area satellite image acquisition planning problem in the context of hazard emergency response. In the model, a large hazard impact area is represented as multiple polygons, and image collection priorities for different portions of the impact area are addressed. The optimization problem is solved with an exact algorithm. Application results demonstrate that the proposed method can address the satellite image acquisition planning problem. A spatial decision support system supporting the optimization model was developed. Several examples of image acquisition problems are used to demonstrate the complexity of the problem and derive optimized solutions.

  18. Lahar Hazard Modeling at Tungurahua Volcano, Ecuador

    NASA Astrophysics Data System (ADS)

    Sorensen, O. E.; Rose, W. I.; Jaya, D.

    2003-04-01

    LAHARZ, a program that delineates lahar-hazard zones using a digital elevation model (DEM), was used to construct a hazard map for the volcano. The 10 meter resolution DEM was constructed for Tungurahua Volcano using scanned topographic lines obtained from the GIS Department at the Escuela Politécnica Nacional, Quito, Ecuador. The steep topographic gradients and rapid downcutting of most rivers draining the edifice prevent the deposition of lahars on the lower flanks of Tungurahua. Modeling confirms the high degree of flow channelization in the deep Tungurahua canyons. Inundation zones observed and shown by LAHARZ at Baños allow identification of safe zones within the city that would provide safety even from the largest-magnitude lahar expected.

  19. Incident duration modeling using flexible parametric hazard-based models.

    PubMed

    Li, Ruimin; Shang, Pan

    2014-01-01

    Assessing and prioritizing the duration time and effects of traffic incidents on major roads present significant challenges for road network managers. This study examines the effect of numerous factors associated with various types of incidents on their duration and proposes an incident duration prediction model. Several parametric accelerated failure time hazard-based models were examined, including Weibull, log-logistic, log-normal, and generalized gamma, as well as all models with gamma heterogeneity and flexible parametric hazard-based models with degrees of freedom ranging from one to ten, by analyzing a traffic incident dataset obtained from the Incident Reporting and Dispatching System in Beijing in 2008. Results show that different factors significantly affect different incident time phases, whose best distributions were diverse. Given the best hazard-based models of each incident time phase, the prediction result can be reasonable for most incidents. The results of this study can aid traffic incident management agencies not only in implementing strategies that would reduce incident duration, and thus reduce congestion, secondary incidents, and the associated human and economic losses, but also in effectively predicting incident duration time.
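
    Several of the parametric accelerated failure time families examined above have direct Python implementations, which makes a quick AIC-based comparison easy to sketch (hypothetical file and column names, not the Beijing data; the generalized gamma and the flexible spline-based models would need additional tooling).

      import pandas as pd
      from lifelines import WeibullAFTFitter, LogLogisticAFTFitter, LogNormalAFTFitter

      df = pd.read_csv("incidents.csv")   # assumed columns: duration_min, cleared, lanes, severity, rain

      candidates = {
          "weibull":      WeibullAFTFitter(),
          "log-logistic": LogLogisticAFTFitter(),
          "log-normal":   LogNormalAFTFitter(),
      }

      aic = {}
      for name, model in candidates.items():
          model.fit(df, duration_col="duration_min", event_col="cleared")
          aic[name] = model.AIC_

      print(aic, "best by AIC:", min(aic, key=aic.get))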

  1. Operation Reliability Assessment for Cutting Tools by Applying a Proportional Covariate Model to Condition Monitoring Information

    PubMed Central

    Cai, Gaigai; Chen, Xuefeng; Li, Bing; Chen, Baojia; He, Zhengjia

    2012-01-01

    The reliability of cutting tools is critical to machining precision and production efficiency. The conventional statistic-based reliability assessment method aims at providing a general and overall estimation of reliability for a large population of identical units under given and fixed conditions. However, it has limited effectiveness in depicting the operational characteristics of a cutting tool. To overcome this limitation, this paper proposes an approach to assess the operation reliability of cutting tools. A proportional covariate model is introduced to construct the relationship between operation reliability and condition monitoring information. The wavelet packet transform and an improved distance evaluation technique are used to extract sensitive features from vibration signals, and a covariate function is constructed based on the proportional covariate model. Ultimately, the failure rate function of the cutting tool being assessed is calculated using the baseline covariate function obtained from a small sample of historical data. Experimental results and a comparative study show that the proposed method is effective for assessing the operation reliability of cutting tools. PMID:23201980

  2. Operation reliability assessment for cutting tools by applying a proportional covariate model to condition monitoring information.

    PubMed

    Cai, Gaigai; Chen, Xuefeng; Li, Bing; Chen, Baojia; He, Zhengjia

    2012-09-25

    The reliability of cutting tools is critical to machining precision and production efficiency. The conventional statistic-based reliability assessment method aims at providing a general and overall estimation of reliability for a large population of identical units under given and fixed conditions. However, it has limited effectiveness in depicting the operational characteristics of a cutting tool. To overcome this limitation, this paper proposes an approach to assess the operation reliability of cutting tools. A proportional covariate model is introduced to construct the relationship between operation reliability and condition monitoring information. The wavelet packet transform and an improved distance evaluation technique are used to extract sensitive features from vibration signals, and a covariate function is constructed based on the proportional covariate model. Ultimately, the failure rate function of the cutting tool being assessed is calculated using the baseline covariate function obtained from a small sample of historical data. Experimental results and a comparative study show that the proposed method is effective for assessing the operation reliability of cutting tools.

  3. Natural Phenomena Hazards Modeling Project. Extreme wind/tornado hazard models for Department of Energy sites. Revision 1

    SciTech Connect

    Coats, D.W.; Murray, R.C.

    1985-08-01

    Lawrence Livermore National Laboratory (LLNL) has developed seismic and wind hazard models for the Office of Nuclear Safety (ONS), Department of Energy (DOE). The work is part of a three-phase effort aimed at establishing uniform building design criteria for seismic and wind hazards at DOE sites throughout the United States. This report summarizes the final wind/tornado hazard models recommended for each site and the methodology used to develop these models. Final seismic hazard models have been published separately by TERA Corporation. In the final phase, it is anticipated that the DOE will use the hazard models to establish uniform criteria for the design and evaluation of critical facilities. 19 refs., 3 figs., 9 tabs.

  4. Effects of Variable Inflationary Conditions on an Inventory Model with Inflation-Proportional Demand Rate

    NASA Astrophysics Data System (ADS)

    Mirzazadeh, Abolfazl

    2009-08-01

    In most previous research, the inflation rate has been considered constant and well known over the time horizon, although the future rate of inflation is inherently uncertain and unstable and is difficult to predict accurately. Therefore, a time-varying inventory model for deteriorating items with allowable shortages is developed in this paper. The inflation rates (internal and external) are time-dependent and the demand rate is inflation-proportional. The inventory level is described by differential equations over the time horizon and the present value method is used. A numerical example is given to explain the results. Some particular cases that follow from the main problem are discussed, and their results are compared with the main model using numerical examples. It is found that shortages increase considerably in comparison with the case without variable inflationary conditions.

  5. Age at marriage in Malaysia: a hazard model of marriage timing.

    PubMed

    Anderson, K H; Hill, M A; Butler, J S

    1987-08-01

    "This paper estimates a proportional hazards model for the timing of age at marriage of women in Malaysia. We hypothesize that age at marriage responds significantly to differences in male and female occupations, race, and age. We find considerable empirical support for the relevance of economic variables in determining age at marriage as well as evidence of strong differences in marriage patterns across races." PMID:12280709

  6. Physical vulnerability modelling in natural hazard risk assessment

    NASA Astrophysics Data System (ADS)

    Douglas, J.

    2007-04-01

    An evaluation of the risk to an exposed element from a hazardous event requires a consideration of the element's vulnerability, which expresses its propensity to suffer damage. This concept allows the assessed level of hazard to be translated to an estimated level of risk and is often used to evaluate the risk from earthquakes and cyclones. However, for other natural perils, such as mass movements, coastal erosion and volcanoes, the incorporation of vulnerability within risk assessment is not well established and consequently quantitative risk estimations are not often made. This impedes the study of the relative contributions from different hazards to the overall risk at a site. Physical vulnerability is poorly modelled for many reasons: the cause of human casualties (from the event itself rather than by building damage); lack of observational data on the hazard, the elements at risk and the induced damage; the complexity of the structural damage mechanisms; the temporal and geographical scales; and the ability to modify the hazard level. Many of these causes are related to the nature of the peril; therefore, for some hazards, such as coastal erosion, the benefits of considering an element's physical vulnerability may be limited. However, for hazards such as volcanoes and mass movements the modelling of vulnerability should be improved by, for example, following the efforts made in earthquake risk assessment. In particular, additional observational data on induced building damage and the hazardous event should be routinely collected and correlated, and numerical modelling of building behaviour during a damaging event should be attempted.

  7. A model based on crowdsourcing for detecting natural hazards

    NASA Astrophysics Data System (ADS)

    Duan, J.; Ma, C.; Zhang, J.; Liu, S.; Liu, J.

    2015-12-01

    Remote sensing technology provides a new method for the detection, early warning, mitigation and relief of natural hazards. Given the suddenness and unpredictability of the location of natural hazards, as well as the practical demands of hazards work, this article proposes an evaluation model for the remote sensing detection of natural hazards based on crowdsourcing. Firstly, using a crowdsourcing model and drawing on the Internet and the power of hundreds of millions of Internet users, the model provides visual interpretation of high-resolution remote sensing images of hazard areas and collects massive amounts of valuable disaster data; secondly, it adopts a dynamic voting consistency strategy to evaluate the disaster data provided by the crowdsourcing workers; thirdly, it pre-estimates the disaster severity with a pre-evaluation model based on regional buffers; lastly, it activates the corresponding expert system according to the forecast results. This model breaks the boundaries between geographic information professionals and the public, realizes public participation and citizen science, and improves the accuracy and timeliness of hazard assessment results.

  8. 2015 USGS Seismic Hazard Model for Induced Seismicity

    NASA Astrophysics Data System (ADS)

    Petersen, M. D.; Mueller, C. S.; Moschetti, M. P.; Hoover, S. M.; Ellsworth, W. L.; Llenos, A. L.; Michael, A. J.

    2015-12-01

    Over the past several years, the seismicity rate has increased markedly in multiple areas of the central U.S. Studies have tied the majority of this increased activity to wastewater injection in deep wells and hydrocarbon production. These earthquakes are induced by human activities that change rapidly based on economic and policy decisions, making them difficult to forecast. Our 2014 USGS National Seismic Hazard Model and previous models are intended to provide the long-term hazard (2% probability of exceedance in 50 years) and are based on seismicity rates and patterns observed mostly from tectonic earthquakes. However, potentially induced earthquakes were identified in 14 regions that were not included in the earthquake catalog used for constructing the 2014 model. We recognized the importance of considering these induced earthquakes in a separate hazard analysis, and as a result in April 2015 we released preliminary models that explored the impact of this induced seismicity on the hazard. Several factors are important in determining the hazard from induced seismicity: period of the catalog that optimally forecasts the next year's activity, earthquake magnitude-rate distribution, earthquake location statistics, maximum magnitude, ground motion models, and industrial drivers such as injection rates. The industrial drivers are not currently available in a form that we can implement in a 1-year model. Hazard model inputs have been evaluated by a broad group of scientists and engineers to assess the range of acceptable models. Results indicate that next year's hazard is significantly higher by more than a factor of three in Oklahoma, Texas, and Colorado compared to the long-term 2014 hazard model. These results have raised concern about the impacts of induced earthquakes on the built environment and have led to many engineering and policy discussions about how to mitigate these effects for the more than 7 million people that live near areas of induced seismicity.

  9. Distributed Proportional-spatial Derivative control of nonlinear parabolic systems via fuzzy PDE modeling approach.

    PubMed

    Wang, Jun-Wei; Wu, Huai-Ning; Li, Han-Xiong

    2012-06-01

    In this paper, a distributed fuzzy control design based on Proportional-spatial Derivative (P-sD) is proposed for the exponential stabilization of a class of nonlinear spatially distributed systems described by parabolic partial differential equations (PDEs). Initially, a Takagi-Sugeno (T-S) fuzzy parabolic PDE model is proposed to accurately represent the nonlinear parabolic PDE system. Then, based on the T-S fuzzy PDE model, a novel distributed fuzzy P-sD state feedback controller is developed by combining the PDE theory and the Lyapunov technique, such that the closed-loop PDE system is exponentially stable with a given decay rate. The sufficient condition on the existence of an exponentially stabilizing fuzzy controller is given in terms of a set of spatial differential linear matrix inequalities (SDLMIs). A recursive algorithm based on the finite-difference approximation and the linear matrix inequality (LMI) techniques is also provided to solve these SDLMIs. Finally, the developed design methodology is successfully applied to the feedback control of the Fitz-Hugh-Nagumo equation.
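
    The paper's controller is synthesized from a T-S fuzzy PDE model through spatial differential LMIs, which is beyond a short snippet. The sketch below only illustrates the underlying idea of distributed proportional plus spatial-derivative (P-sD) state feedback on a simple 1-D parabolic PDE with a destabilizing linear reaction term; the plant, the gains Kp and Kd, and the discretization are assumptions for illustration, not the paper's design.

```python
import numpy as np

L, N, T, dt = 1.0, 101, 2.0, 1e-4
x = np.linspace(0.0, L, N); dx = x[1] - x[0]
eps, a = 0.01, 3.0          # diffusion coefficient and destabilizing reaction gain (assumed)
Kp, Kd = 5.0, 0.1           # proportional and spatial-derivative feedback gains (assumed)

def simulate(controlled):
    u = np.sin(np.pi * x)                        # initial profile, zero Dirichlet boundaries
    for _ in range(int(T / dt)):
        uxx = np.zeros_like(u)
        uxx[1:-1] = (u[2:] - 2 * u[1:-1] + u[:-2]) / dx**2   # second spatial derivative
        ux = np.gradient(u, dx)                              # first spatial derivative
        ctrl = -(Kp * u + Kd * ux) if controlled else 0.0    # distributed P-sD feedback
        u = u + dt * (eps * uxx + a * u + ctrl)              # explicit Euler step
        u[0] = u[-1] = 0.0
    return np.abs(u).max()

print("final sup-norm, open loop :", simulate(False))   # grows (unstable plant)
print("final sup-norm, P-sD loop :", simulate(True))    # decays toward zero
```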

  10. Natural Phenomena Hazards Modeling Project: Flood hazard models for Department of Energy sites

    SciTech Connect

    Savy, J.B.; Murray, R.C.

    1988-05-01

    For eight sites, the evaluation of flood hazards was considered in two steps. First, a screening assessment was performed to determine whether flood hazards may impact DOE operations. The screening analysis consisted of a preliminary flood hazard assessment that provides an initial estimate of the site design basis. The second step involves a review of the vulnerability of on-site facilities by the site manager; based on the results of the preliminary flood hazard assessment and a review of site operations, the manager can decide whether flood hazards should be considered a part of the design basis. The scope of the preliminary flood hazard analysis was restricted to evaluating the flood hazards that may exist in proximity to a site. The analysis does not involve an assessment of the potential of encroachment of flooding at specific on-site locations. Furthermore, the screening analysis does not consider localized flooding at a site due to precipitation (i.e., local run-off, storm sewer capacity, roof drainage). These issues were reserved for consideration by the DOE site manager. 9 refs., 18 figs.

  11. A high-resolution global flood hazard model

    NASA Astrophysics Data System (ADS)

    Sampson, Christopher C.; Smith, Andrew M.; Bates, Paul B.; Neal, Jeffrey C.; Alfieri, Lorenzo; Freer, Jim E.

    2015-09-01

    Floods are a natural hazard that affect communities worldwide, but to date the vast majority of flood hazard research and mapping has been undertaken by wealthy developed nations. As populations and economies have grown across the developing world, so too has demand from governments, businesses, and NGOs for modeled flood hazard data in these data-scarce regions. We identify six key challenges faced when developing a flood hazard model that can be applied globally and present a framework methodology that leverages recent cross-disciplinary advances to tackle each challenge. The model produces return period flood hazard maps at ˜90 m resolution for the whole terrestrial land surface between 56°S and 60°N, and results are validated against high-resolution government flood hazard data sets from the UK and Canada. The global model is shown to capture between two thirds and three quarters of the area determined to be at risk in the benchmark data without generating excessive false positive predictions. When aggregated to ˜1 km, mean absolute error in flooded fraction falls to ˜5%. The full complexity global model contains an automatically parameterized subgrid channel network, and comparison to both a simplified 2-D only variant and an independently developed pan-European model shows the explicit inclusion of channels to be a critical contributor to improved model performance. While careful processing of existing global terrain data sets enables reasonable model performance in urban areas, adoption of forthcoming next-generation global terrain data sets will offer the best prospect for a step-change improvement in model performance.
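
    As a rough illustration of the kind of binary-pattern validation and flooded-fraction comparison described above, the sketch below computes a hit rate, a false-alarm ratio, and the mean absolute error of flooded fraction after aggregating fine cells into coarser blocks. The grids are synthetic stand-ins, not the model or benchmark data.

```python
import numpy as np

rng = np.random.default_rng(0)
benchmark = rng.random((900, 900)) < 0.2                  # "observed" wet cells (synthetic)
model = benchmark ^ (rng.random((900, 900)) < 0.05)       # model agrees except on 5% of cells

hits = np.sum(model & benchmark)
misses = np.sum(~model & benchmark)
false_alarms = np.sum(model & ~benchmark)
print("hit rate:", hits / (hits + misses))
print("false-alarm ratio:", false_alarms / (hits + false_alarms))

# aggregate fine cells into 10 x 10 blocks and compare flooded fractions
agg = lambda a: a.reshape(90, 10, 90, 10).mean(axis=(1, 3))
print("MAE of flooded fraction at coarse scale:", np.abs(agg(model) - agg(benchmark)).mean())
```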

  12. Probabilistic modelling of rainfall induced landslide hazard assessment

    NASA Astrophysics Data System (ADS)

    Kawagoe, S.; Kazama, S.; Sarukkalige, P. R.

    2010-06-01

    To evaluate the frequency and distribution of landslide hazards over Japan, this study uses a probabilistic model based on multiple logistic regression analysis. The study focuses on several important physical parameters, namely hydraulic, geographical and geological parameters, which are considered to be influential in the occurrence of landslides. Sensitivity analysis confirmed that the hydrological parameter (hydraulic gradient) is the most influential factor in the occurrence of landslides. Therefore, the hydraulic gradient is used as the main hydraulic parameter; it is a dynamic factor that includes the effect of heavy rainfall and its return period. Using the constructed spatial data sets, a multiple logistic regression model is applied and landslide hazard probability maps are produced showing the spatial-temporal distribution of landslide hazard probability over Japan. To represent the landslide hazard on different temporal scales, extreme precipitation with return periods of 5, 30 and 100 years is used for the evaluation. The results show that the highest landslide hazard probability exists in the mountain ranges on the western side of Japan (Japan Sea side), including the Hida, Kiso, Iide and Asahi mountain ranges, the southern parts of the Chugoku and Kyushu mountain ranges, the Dewa mountain range and the Hokuriku region. The landslide hazard probability maps developed in this study will assist authorities, policy makers and decision makers responsible for infrastructural planning and development, as they can identify landslide-susceptible areas and thus decrease landslide damage through proper preparation.
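
    A minimal sketch of the core statistical step, multiple logistic regression for landslide occurrence with the hydraulic gradient as the dominant predictor, is given below. The data are synthetic and the covariates and coefficients are placeholders, not the study's Japan-wide data set.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 2000
hydraulic_gradient = rng.uniform(0.0, 1.0, n)      # dynamic factor tied to rainfall return period (assumed)
slope_angle = rng.uniform(0.0, 45.0, n)            # geographical parameter, degrees (assumed)
relief = rng.uniform(0.0, 500.0, n)                # relative relief in metres (assumed)

logit_p = -4.0 + 6.0 * hydraulic_gradient + 0.05 * slope_angle + 0.002 * relief
landslide = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit_p))).astype(float)

X = sm.add_constant(np.column_stack([hydraulic_gradient, slope_angle, relief]))
fit = sm.Logit(landslide, X).fit(disp=0)
print(fit.params)   # intercept and coefficients; positive coefficients raise hazard probability

new_cell = np.array([[1.0, 0.8, 30.0, 200.0]])     # constant, gradient, slope, relief
print("predicted landslide probability:", fit.predict(new_cell)[0])
```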

  13. Additive interaction in survival analysis: use of the additive hazards model.

    PubMed

    Rod, Naja Hulvej; Lange, Theis; Andersen, Ingelise; Marott, Jacob Louis; Diderichsen, Finn

    2012-09-01

    It is a widely held belief in public health and clinical decision-making that interventions or preventive strategies should be aimed at patients or population subgroups where most cases could potentially be prevented. To identify such subgroups, deviation from additivity of absolute effects is the relevant measure of interest. Multiplicative survival models, such as the Cox proportional hazards model, are often used to estimate the association between exposure and risk of disease in prospective studies. In Cox models, deviations from additivity have usually been assessed by surrogate measures of additive interaction derived from multiplicative models-an approach that is both counter-intuitive and sometimes invalid. This paper presents a straightforward and intuitive way of assessing deviation from additivity of effects in survival analysis by use of the additive hazards model. The model directly estimates the absolute size of the deviation from additivity and provides confidence intervals. In addition, the model can accommodate both continuous and categorical exposures and models both exposures and potential confounders on the same underlying scale. To illustrate the approach, we present an empirical example of interaction between education and smoking on risk of lung cancer. We argue that deviations from additivity of effects are important for public health interventions and clinical decision-making, and such estimations should be encouraged in prospective studies on health. A detailed implementation guide of the additive hazards model is provided in the appendix.
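
    The paper works with an additive hazards model rather than the multiplicative Cox model. lifelines does not implement the Lin-Ying semiparametric additive model, so the sketch below uses its Aalen additive fitter as a stand-in to estimate absolute covariate effects and a product (interaction) term on the hazard scale; the data and effect sizes are synthetic.

```python
import numpy as np
import pandas as pd
from lifelines import AalenAdditiveFitter

rng = np.random.default_rng(2)
n = 1000
smoking = rng.integers(0, 2, n)
low_education = rng.integers(0, 2, n)
hazard = 0.01 + 0.02 * smoking + 0.01 * low_education + 0.015 * smoking * low_education
time = rng.exponential(1.0 / hazard)
event = (time < 30.0).astype(int)                   # administrative censoring at 30 years
time = np.minimum(time, 30.0)

df = pd.DataFrame({"T": time, "E": event, "smoking": smoking,
                   "low_education": low_education,
                   "interaction": smoking * low_education})
aaf = AalenAdditiveFitter(fit_intercept=True, coef_penalizer=0.1)
aaf.fit(df, duration_col="T", event_col="E")
# cumulative regression coefficients; their slope approximates the additive effect on the hazard
print(aaf.cumulative_hazards_.tail(1))
```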

  14. A conflict model for the international hazardous waste disposal dispute.

    PubMed

    Hu, Kaixian; Hipel, Keith W; Fang, Liping

    2009-12-15

    A multi-stage conflict model is developed to analyze international hazardous waste disposal disputes. More specifically, the ongoing toxic waste conflicts are divided into two stages consisting of the dumping prevention and dispute resolution stages. The modeling and analyses, based on the methodology of graph model for conflict resolution (GMCR), are used in both stages in order to grasp the structure and implications of a given conflict from a strategic viewpoint. Furthermore, a specific case study is investigated for the Ivory Coast hazardous waste conflict. In addition to the stability analysis, sensitivity and attitude analyses are conducted to capture various strategic features of this type of complicated dispute.

  15. Checking Fine and Gray Subdistribution Hazards Model with Cumulative Sums of Residuals

    PubMed Central

    Li, Jianing; Scheike, Thomas H.; Zhang, Mei-Jie

    2015-01-01

    Recently, Fine and Gray (1999) proposed a semi-parametric proportional regression model for the subdistribution hazard function which has been used extensively for analyzing competing risks data. However, failure of model adequacy could lead to severe bias in parameter estimation, and only a limited contribution has been made to check the model assumptions. In this paper, we present a class of analytical methods and graphical approaches for checking the assumptions of Fine and Gray’s model. The proposed goodness-of-fit test procedures are based on the cumulative sums of residuals, which validate the model in three aspects: (1) proportionality of hazard ratio, (2) the linear functional form and (3) the link function. For each assumption test, we provide a p-value and a visualized plot against the null hypothesis using a simulation-based approach. We also consider an omnibus test for overall evaluation against any model misspecification. The proposed tests perform well in simulation studies and are illustrated with two real data examples. PMID:25421251
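
    There is no widely used Python implementation of the Fine-Gray subdistribution hazards model, so the sketch below only illustrates the closely related residual-based check of the proportionality assumption for an ordinary Cox model, using lifelines' Schoenfeld-residual-based test on its bundled Rossi recidivism data. It is an analogue of, not an implementation of, the authors' procedure.

```python
from lifelines import CoxPHFitter
from lifelines.datasets import load_rossi
from lifelines.statistics import proportional_hazard_test

rossi = load_rossi()
cph = CoxPHFitter().fit(rossi, duration_col="week", event_col="arrest")

# Schoenfeld-type residuals tested against a time transform, one test per covariate
result = proportional_hazard_test(cph, rossi, time_transform="rank")
result.print_summary()    # small p-values flag covariates that violate proportional hazards
```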

  16. Checking Fine and Gray subdistribution hazards model with cumulative sums of residuals.

    PubMed

    Li, Jianing; Scheike, Thomas H; Zhang, Mei-Jie

    2015-04-01

    Recently, Fine and Gray (J Am Stat Assoc 94:496-509, 1999) proposed a semi-parametric proportional regression model for the subdistribution hazard function which has been used extensively for analyzing competing risks data. However, failure of model adequacy could lead to severe bias in parameter estimation, and only a limited contribution has been made to check the model assumptions. In this paper, we present a class of analytical methods and graphical approaches for checking the assumptions of Fine and Gray's model. The proposed goodness-of-fit test procedures are based on the cumulative sums of residuals, which validate the model in three aspects: (1) proportionality of hazard ratio, (2) the linear functional form and (3) the link function. For each assumption test, we provide a p-value and a visualized plot against the null hypothesis using a simulation-based approach. We also consider an omnibus test for overall evaluation against any model misspecification. The proposed tests perform well in simulation studies and are illustrated with two real data examples.

  17. Agent-based Modeling with MATSim for Hazards Evacuation Planning

    NASA Astrophysics Data System (ADS)

    Jones, J. M.; Ng, P.; Henry, K.; Peters, J.; Wood, N. J.

    2015-12-01

    Hazard evacuation planning requires robust modeling tools and techniques, such as least cost distance or agent-based modeling, to gain an understanding of a community's potential to reach safety before event (e.g. tsunami) arrival. Least cost distance modeling provides a static view of the evacuation landscape with an estimate of travel times to safety from each location in the hazard space. With this information, practitioners can assess a community's overall ability for timely evacuation. More information may be needed if evacuee congestion creates bottlenecks in the flow patterns. Dynamic movement patterns are best explored with agent-based models that simulate movement of and interaction between individual agents as evacuees through the hazard space, reacting to potential congestion areas along the evacuation route. The multi-agent transport simulation model MATSim is an agent-based modeling framework that can be applied to hazard evacuation planning. Developed jointly by universities in Switzerland and Germany, MATSim is open-source software written in Java and freely available for modification or enhancement. We successfully used MATSim to illustrate tsunami evacuation challenges in two island communities in California, USA, that are impacted by limited escape routes. However, working with MATSim's data preparation, simulation, and visualization modules in an integrated development environment requires a significant investment of time to develop the software expertise to link the modules and run a simulation. To facilitate our evacuation research, we packaged the MATSim modules into a single application tailored to the needs of the hazards community. By exposing the modeling parameters of interest to researchers in an intuitive user interface and hiding the software complexities, we bring agent-based modeling closer to practitioners and provide access to the powerful visual and analytic information that this modeling can provide.

  18. Modeling the proportion of cut slopes rock on forest roads using artificial neural network and ordinal linear regression.

    PubMed

    Babapour, R; Naghdi, R; Ghajar, I; Ghodsi, R

    2015-07-01

    The rock proportion of the subsoil directly influences the cost of embankment in forest road construction. Therefore, developing a reliable framework for estimating the rock ratio prior to road planning could lead to lighter excavation and lower-cost operations. Prediction of the rock proportion was subjected to statistical analysis using an Artificial Neural Network (ANN) in MATLAB and five link functions of ordinal logistic regression (OLR), according to rock type and terrain slope properties. In addition to bedrock and slope maps, more than 100 samples of rock proportion, observed by geologists, were collected from the available bedrock of every slope class. Four predictive models were developed for rock proportion, employing the independent variables and applying both the selected probit link function of OLR and layer-recurrent and feed-forward back-propagation neural networks. In the ANN, different numbers of neurons were considered for the hidden layer(s). Goodness-of-fit measures showed that the ANN models produced better results than OLR, with R2 = 0.72 and root mean square error = 0.42. Furthermore, in order to show the applicability of the proposed approach and to illustrate the variability of rock proportion resulting from the model application, the optimum models were applied to a mountainous forest where a forest road network had been constructed in the past.
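
    The sketch below mirrors the OLR-versus-ANN comparison on synthetic data: an ordinal probit model for rock-proportion classes and a small feed-forward network scored by R2 and RMSE. The predictors, class cut points and effect sizes are placeholders, not the study's field data.

```python
import numpy as np
import pandas as pd
from statsmodels.miscmodels.ordinal_model import OrderedModel
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import r2_score, mean_squared_error

rng = np.random.default_rng(3)
n = 150
slope_class = rng.integers(1, 6, n)                 # terrain slope class, 1..5 (assumed)
rock_type = rng.integers(0, 3, n)                   # bedrock category, 3 types (assumed)
latent = 0.8 * slope_class + 0.6 * rock_type + rng.normal(0, 1, n)
rock_class = pd.cut(latent, bins=[-np.inf, 2, 3.5, 5, np.inf], labels=False)  # ordinal classes 0..3

X = pd.DataFrame({"slope_class": slope_class, "rock_type": rock_type})

# ordinal regression with a probit link
olr = OrderedModel(rock_class, X, distr="probit").fit(method="bfgs", disp=0)
print(olr.params)

# small feed-forward network treated as a regressor on the ordinal codes
ann = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0).fit(X, rock_class)
pred = ann.predict(X)
print("ANN R2:", r2_score(rock_class, pred),
      "RMSE:", mean_squared_error(rock_class, pred) ** 0.5)
```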

  19. Toward Building a New Seismic Hazard Model for Mainland China

    NASA Astrophysics Data System (ADS)

    Rong, Y.; Xu, X.; Chen, G.; Cheng, J.; Magistrale, H.; Shen, Z.

    2015-12-01

    At present, the only publicly available seismic hazard model for mainland China was generated by Global Seismic Hazard Assessment Program in 1999. We are building a new seismic hazard model by integrating historical earthquake catalogs, geological faults, geodetic GPS data, and geology maps. To build the model, we construct an Mw-based homogeneous historical earthquake catalog spanning from 780 B.C. to present, create fault models from active fault data using the methodology recommended by Global Earthquake Model (GEM), and derive a strain rate map based on the most complete GPS measurements and a new strain derivation algorithm. We divide China and the surrounding regions into about 20 large seismic source zones based on seismotectonics. For each zone, we use the tapered Gutenberg-Richter (TGR) relationship to model the seismicity rates. We estimate the TGR a- and b-values from the historical earthquake data, and constrain corner magnitude using the seismic moment rate derived from the strain rate. From the TGR distributions, 10,000 to 100,000 years of synthetic earthquakes are simulated. Then, we distribute small and medium earthquakes according to locations and magnitudes of historical earthquakes. Some large earthquakes are distributed on active faults based on characteristics of the faults, including slip rate, fault length and width, and paleoseismic data, and the rest to the background based on the distributions of historical earthquakes and strain rate. We evaluate available ground motion prediction equations (GMPE) by comparison to observed ground motions. To apply appropriate GMPEs, we divide the region into active and stable tectonics. The seismic hazard will be calculated using the OpenQuake software developed by GEM. To account for site amplifications, we construct a site condition map based on geology maps. The resulting new seismic hazard map can be used for seismic risk analysis and management, and business and land-use planning.
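
    As an illustration of the simulation step described above, the sketch below draws synthetic magnitudes from a tapered Gutenberg-Richter (TGR) distribution by numerically inverting its survival function in seismic moment. The b-value, minimum magnitude and corner magnitude are placeholders rather than the study's estimates.

```python
import numpy as np

b, m_min, m_corner = 1.0, 5.0, 8.0
beta = 2.0 * b / 3.0                                   # TGR exponent in seismic moment
moment = lambda m: 10.0 ** (1.5 * m + 9.05)            # Hanks-Kanamori moment in N*m

m_grid = np.linspace(m_min, 9.5, 2000)
M0, Mt, Mc = moment(m_grid), moment(m_min), moment(m_corner)
ccdf = (Mt / M0) ** beta * np.exp((Mt - M0) / Mc)      # TGR survival function above m_min

rng = np.random.default_rng(4)
u = rng.random(100_000)
# the survival function decreases with magnitude, so interpolate on the reversed arrays
mags = np.interp(u, ccdf[::-1], m_grid[::-1])
print("fraction of synthetic events with M >= 7:", np.mean(mags >= 7.0))
```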

  20. Self-organization, the cascade model, and natural hazards

    PubMed Central

    Turcotte, Donald L.; Malamud, Bruce D.; Guzzetti, Fausto; Reichenbach, Paola

    2002-01-01

    We consider the frequency-size statistics of two natural hazards, forest fires and landslides. Both appear to satisfy power-law (fractal) distributions to a good approximation under a wide variety of conditions. Two simple cellular-automata models have been proposed as analogs for this observed behavior, the forest fire model for forest fires and the sand pile model for landslides. The behavior of these models can be understood in terms of a self-similar inverse cascade. For the forest fire model the cascade consists of the coalescence of clusters of trees; for the sand pile model the cascade consists of the coalescence of metastable regions. PMID:11875206
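
    A minimal sketch of the forest-fire cellular automaton referenced above (in the Drossel-Schwabl spirit) is given below: trees grow with probability p, lightning strikes occupied sites with probability f, the struck tree's whole cluster burns, and the burned-cluster sizes can then be inspected for approximate power-law scaling. The grid size and rates are arbitrary illustrative choices.

```python
import numpy as np
from scipy.ndimage import label

rng = np.random.default_rng(5)
N, p, f, steps = 128, 0.05, 0.0005, 2000
forest = np.zeros((N, N), dtype=bool)
fire_sizes = []

for _ in range(steps):
    forest |= rng.random((N, N)) < p                      # tree growth on empty sites
    strikes = forest & (rng.random((N, N)) < f)           # lightning on occupied sites
    if strikes.any():
        clusters, _ = label(forest)                       # 4-connected tree clusters
        for cid in np.unique(clusters[strikes]):
            size = np.sum(clusters == cid)
            fire_sizes.append(size)
            forest[clusters == cid] = False               # the whole struck cluster burns

sizes = np.array(fire_sizes)
print("fires:", sizes.size, " largest:", sizes.max(), " mean size:", sizes.mean())
```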

  1. Probabilistic modelling of rainfall induced landslide hazard assessment

    NASA Astrophysics Data System (ADS)

    Kawagoe, S.; Kazama, S.; Sarukkalige, P. R.

    2010-01-01

    To evaluate the frequency and distribution of landslide hazards over Japan, this study uses a probabilistic model based on multiple logistic regression analysis. The study focuses on several important physical parameters, namely hydraulic, geographical and geological parameters, which are considered to be influential in the occurrence of landslides. Sensitivity analysis confirmed that the hydrological parameter (hydraulic gradient) is the most influential factor in the occurrence of landslides. Therefore, the hydraulic gradient is used as the main hydraulic parameter; it is a dynamic factor that includes the effect of heavy rainfall and its return period. Using the constructed spatial data sets, a multiple logistic regression model is applied and landslide susceptibility maps are produced showing the spatial-temporal distribution of landslide hazard susceptibility over Japan. To represent the susceptibility on different temporal scales, extreme precipitation with return periods of 5, 30 and 100 years is used for the evaluation. The results show that the highest landslide hazard susceptibility exists in the mountain ranges on the western side of Japan (Japan Sea side), including the Hida, Kiso, Iide and Asahi mountain ranges, the southern parts of the Chugoku and Kyushu mountain ranges, the Dewa mountain range and the Hokuriku region. The landslide hazard susceptibility maps developed in this study will assist authorities, policy makers and decision makers responsible for infrastructural planning and development, as they can identify landslide-susceptible areas and thus decrease landslide damage through proper preparation.

  2. Simulation meets reality: Chemical hazard models in real world use

    SciTech Connect

    Newsom, D.E.

    1992-01-01

    In 1989 the US Department of Transportation (DOT), Federal Emergency Management Agency (FEMA), and US Environmental Protection Agency (EPA) released a set of models for the analysis of chemical hazards on personal computers. The models, known collectively as ARCHIE (Automated Resource for Chemical Hazard Incident Evaluation), have been distributed free of charge to thousands of emergency planners and analysts in state governments, Local Emergency Planning Committees (LEPCs), and industry. Under DOT and FEMA sponsorship, Argonne National Laboratory (ANL) conducted workshops in 1990 and 1991 to train federal, state, and local government and industry personnel, both end users and other trainers, in the use of the models. As a result of these distribution and training efforts, ARCHIE has received substantial use by state, local, and industrial emergency management personnel.

  3. Natural Phenomena Hazards Modeling Project: Preliminary flood hazards estimates for screening Department of Energy sites, Albuquerque Operations Office

    SciTech Connect

    McCann, M.W. Jr.; Boissonnade, A.C.

    1988-05-01

    As part of an ongoing program, Lawrence Livermore National Laboratory (LLNL) is directing the Natural Phenomena Hazards Modeling Project (NPHMP) on behalf of the Department of Energy (DOE). A major part of this effort is the development of probabilistic definitions of natural phenomena hazards: seismic, wind, and flood. In this report the first phase of the evaluation of flood hazards at DOE sites is described. Unlike seismic and wind events, floods may not present a significant threat to the operations of all DOE sites. For example, at some sites physical circumstances may exist that effectively preclude the occurrence of flooding. As a result, consideration of flood hazards may not be required as part of the site design basis. In this case it is not necessary to perform a detailed flood hazard study at all DOE sites, such as those conducted for other natural phenomena hazards, seismic and wind. The scope of the preliminary flood hazard analysis is restricted to evaluating the flood hazards that may exist in proximity to a site. The analysis does not involve an assessment of the potential encroachment of flooding on-site at individual facility locations. Furthermore, the preliminary flood hazard assessment does not consider localized flooding at a site due to precipitation (i.e., local run-off, storm sewer capacity, roof drainage). These issues are reserved for consideration by the DOE site manager. 11 refs., 84 figs., 61 tabs.

  4. Disproportionate Proximity to Environmental Health Hazards: Methods, Models, and Measurement

    PubMed Central

    Maantay, Juliana A.; Brender, Jean D.

    2011-01-01

    We sought to provide a historical overview of methods, models, and data used in the environmental justice (EJ) research literature to measure proximity to environmental hazards and potential exposure to their adverse health effects. We explored how the assessment of disproportionate proximity and exposure has evolved from comparing the prevalence of minority or low-income residents in geographic entities hosting pollution sources and discrete buffer zones to more refined techniques that use continuous distances, pollutant fate-and-transport models, and estimates of health risk from toxic exposure. We also reviewed analytical techniques used to determine the characteristics of people residing in areas potentially exposed to environmental hazards and emerging geostatistical techniques that are more appropriate for EJ analysis than conventional statistical methods. We concluded by providing several recommendations regarding future research and data needs for EJ assessment that would lead to more reliable results and policy solutions. PMID:21836113

  5. Context effects: the proportional difference model and the reflection of preference.

    PubMed

    Gonzalez-Vallejo, Claudia; Reid, Aaron A; Schiltz, Joel

    2003-09-01

    The reflection effect (D. Kahneman & A. Tversky, 1979) was investigated using the stochastic model of choice developed by C. Gonzalez-Vallejo (2002). The model assumes that individuals make trade-offs among attribute values by relying on a difference variable. The model also specifies a threshold representing individual proclivities to react to attribute differences. Two experiments demonstrated that changes in risk attitudes, from a gain to a loss situation, depended on the stimuli as well as on individuals' thresholds. Thresholds were generally lower in losses than in gains, indicating a risk-taking tendency. Thresholds were also lower when participants were endowed with greater savings. Model testing revealed better fits for the stochastic model than for cumulative prospect theory (A. Tversky & D. Kahneman, 1992).

  6. Assessment and indirect adjustment for confounding by smoking in cohort studies using relative hazards models.

    PubMed

    Richardson, David B; Laurier, Dominique; Schubauer-Berigan, Mary K; Tchetgen Tchetgen, Eric; Cole, Stephen R

    2014-11-01

    Workers' smoking histories are not measured in many occupational cohort studies. Here we discuss the use of negative control outcomes to detect and adjust for confounding in analyses that lack information on smoking. We clarify the assumptions necessary to detect confounding by smoking and the additional assumptions necessary to indirectly adjust for such bias. We illustrate these methods using data from 2 studies of radiation and lung cancer: the Colorado Plateau cohort study (1950-2005) of underground uranium miners (in which smoking was measured) and a French cohort study (1950-2004) of nuclear industry workers (in which smoking was unmeasured). A cause-specific relative hazards model is proposed for estimation of indirectly adjusted associations. Among the miners, the proposed method suggests no confounding by smoking of the association between radon and lung cancer--a conclusion supported by adjustment for measured smoking. Among the nuclear workers, the proposed method suggests substantial confounding by smoking of the association between radiation and lung cancer. Indirect adjustment for confounding by smoking resulted in an 18% decrease in the adjusted estimated hazard ratio, yet this cannot be verified because smoking was unmeasured. Assumptions underlying this method are described, and a cause-specific proportional hazards model that allows easy implementation using standard software is presented.
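
    The closing sentence refers to a cause-specific proportional hazards model that is easy to fit with standard software; the usual implementation treats competing events as censoring when modelling the cause of interest. The sketch below shows that pattern on entirely synthetic data; the variable names, dose scale, and effect sizes are placeholders, not the cohort data described above.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(6)
n = 3000
radiation = rng.exponential(50.0, n)                      # cumulative dose, illustrative units
t_lung = rng.exponential(1.0 / (0.002 * np.exp(0.004 * radiation)))   # lung-cancer latency
t_other = rng.exponential(1.0 / 0.004)                    # competing cause of death
t_admin = np.full(n, 40.0)                                # administrative censoring at 40 years

time = np.minimum.reduce([t_lung, t_other, t_admin])
cause = np.select([t_lung == time, t_other == time], [1, 2], default=0)

# cause-specific Cox model: the competing cause (2) and administrative censoring are both censored
df = pd.DataFrame({"time": time, "lung_cancer": (cause == 1).astype(int), "radiation": radiation})
CoxPHFitter().fit(df, duration_col="time", event_col="lung_cancer").print_summary()
```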

  7. Assessment and Indirect Adjustment for Confounding by Smoking in Cohort Studies Using Relative Hazards Models

    PubMed Central

    Richardson, David B.; Laurier, Dominique; Schubauer-Berigan, Mary K.; Tchetgen, Eric Tchetgen; Cole, Stephen R.

    2014-01-01

    Workers' smoking histories are not measured in many occupational cohort studies. Here we discuss the use of negative control outcomes to detect and adjust for confounding in analyses that lack information on smoking. We clarify the assumptions necessary to detect confounding by smoking and the additional assumptions necessary to indirectly adjust for such bias. We illustrate these methods using data from 2 studies of radiation and lung cancer: the Colorado Plateau cohort study (1950–2005) of underground uranium miners (in which smoking was measured) and a French cohort study (1950–2004) of nuclear industry workers (in which smoking was unmeasured). A cause-specific relative hazards model is proposed for estimation of indirectly adjusted associations. Among the miners, the proposed method suggests no confounding by smoking of the association between radon and lung cancer—a conclusion supported by adjustment for measured smoking. Among the nuclear workers, the proposed method suggests substantial confounding by smoking of the association between radiation and lung cancer. Indirect adjustment for confounding by smoking resulted in an 18% decrease in the adjusted estimated hazard ratio, yet this cannot be verified because smoking was unmeasured. Assumptions underlying this method are described, and a cause-specific proportional hazards model that allows easy implementation using standard software is presented. PMID:25245043

  8. Recent Experiences in Aftershock Hazard Modelling in New Zealand

    NASA Astrophysics Data System (ADS)

    Gerstenberger, M.; Rhoades, D. A.; McVerry, G.; Christophersen, A.; Bannister, S. C.; Fry, B.; Potter, S.

    2014-12-01

    The occurrence of several sequences of earthquakes in New Zealand in the last few years has meant that GNS Science has gained significant recent experience in aftershock hazard and forecasting. First was the Canterbury sequence of events which began in 2010 and included the destructive Christchurch earthquake of February, 2011. This sequence is occurring in what was a moderate-to-low hazard region of the National Seismic Hazard Model (NSHM): the model on which the building design standards are based. With the expectation that the sequence would produce a 50-year hazard estimate in exceedance of the existing building standard, we developed a time-dependent model that combined short-term (STEP & ETAS) and longer-term (EEPAS) clustering with time-independent models. This forecast was combined with the NSHM to produce a forecast of the hazard for the next 50 years. This has been used to revise building design standards for the region and has contributed to planning of the rebuilding of Christchurch in multiple aspects. An important contribution to this model comes from the inclusion of EEPAS, which allows for clustering on the scale of decades. EEPAS is based on three empirical regressions that relate the magnitudes, times of occurrence, and locations of major earthquakes to regional precursory scale increases in the magnitude and rate of occurrence of minor earthquakes. A second important contribution comes from the long-term rate to which seismicity is expected to return in 50-years. With little seismicity in the region in historical times, a controlling factor in the rate is whether-or-not it is based on a declustered catalog. This epistemic uncertainty in the model was allowed for by using forecasts from both declustered and non-declustered catalogs. With two additional moderate sequences in the capital region of New Zealand in the last year, we have continued to refine our forecasting techniques, including the use of potential scenarios based on the aftershock

  9. Development of hazard-compatible building fragility and vulnerability models

    USGS Publications Warehouse

    Karaca, E.; Luco, N.

    2008-01-01

    We present a methodology for transforming the structural and non-structural fragility functions in HAZUS into a format that is compatible with conventional seismic hazard analysis information. The methodology makes use of the building capacity (or pushover) curves and related building parameters provided in HAZUS. Instead of the capacity spectrum method applied in HAZUS, building response is estimated by inelastic response history analysis of corresponding single-degree-of-freedom systems under a large number of earthquake records. Statistics of the building response are used with the damage state definitions from HAZUS to derive fragility models conditioned on spectral acceleration values. Using the developed fragility models for structural and nonstructural building components, with corresponding damage state loss ratios from HAZUS, we also derive building vulnerability models relating spectral acceleration to repair costs. Whereas in HAZUS the structural and nonstructural damage states are treated as if they are independent, our vulnerability models are derived assuming "complete" nonstructural damage whenever the structural damage state is complete. We show the effects of considering this dependence on the final vulnerability models. The use of spectral acceleration (at selected vibration periods) as the ground motion intensity parameter, coupled with the careful treatment of uncertainty, makes the new fragility and vulnerability models compatible with conventional seismic hazard curves and hence useful for extensions to probabilistic damage and loss assessment.
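
    A minimal sketch of deriving a fragility model conditioned on spectral acceleration is given below: binary exceed/not-exceed outcomes, standing in for the damage-state results of the inelastic single-degree-of-freedom analyses, are fitted with a lognormal fragility curve via probit regression on log spectral acceleration. The data, median, and dispersion are synthetic assumptions.

```python
import numpy as np
import statsmodels.api as sm
from scipy.stats import norm

rng = np.random.default_rng(7)
n = 400
sa = rng.uniform(0.05, 2.0, n)                        # spectral acceleration (g)
true_median, true_beta = 0.6, 0.5                     # "true" fragility parameters (assumed)
p_exceed = norm.cdf(np.log(sa / true_median) / true_beta)
exceed = (rng.random(n) < p_exceed).astype(float)     # damage-state exceedance indicator

# probit regression: P(exceed | Sa) = Phi(b0 + b1 * ln Sa), a lognormal fragility curve
X = sm.add_constant(np.log(sa))
fit = sm.Probit(exceed, X).fit(disp=0)
beta_hat = 1.0 / fit.params[1]
median_hat = np.exp(-fit.params[0] * beta_hat)
print(f"fitted median Sa = {median_hat:.2f} g, dispersion beta = {beta_hat:.2f}")
```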

  10. Simple model relating recombination rates and non-proportional light yield in scintillators

    SciTech Connect

    Moses, William W.; Bizarri, Gregory; Singh, Jai; Vasil'ev, Andrey N.; Williams, Richard T.

    2008-09-24

    We present a phenomenological approach to derive an approximate expression for the local light yield along a track as a function of the rate constants of different kinetic orders of radiative and quenching processes for excitons and electron-hole pairs excited by an incident γ-ray in a scintillating crystal. For excitons, the radiative and quenching processes considered are linear and binary, and for electron-hole pairs a ternary (Auger type) quenching process is also taken into account. The local light yield (Y_L) in photons per MeV is plotted as a function of the deposited energy, -dE/dx (keV/cm) at any point x along the track length. This model formulation achieves a certain simplicity by using two coupled rate equations. We discuss the approximations that are involved. There are a sufficient number of parameters in this model to fit local light yield profiles needed for qualitative comparison with experiment.
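
    A minimal sketch of the two-coupled-rate-equation idea is given below: excitons with linear radiative decay and binary quenching, electron-hole pairs with an additional ternary (Auger-type) quenching channel, and a local light yield computed as the radiative output per unit of initial excitation. All rate constants, the exciton/electron-hole split, and the mapping from -dE/dx to initial density are assumptions, not the paper's fitted values.

```python
import numpy as np
from scipy.integrate import solve_ivp

A1, B2 = 1.0, 0.05             # exciton rates: linear radiative, binary quenching (assumed units)
a1, b2, c3 = 0.5, 0.02, 0.002  # e-h pair rates: linear radiative, binary, ternary Auger (assumed)

def rates(t, y):
    nx, neh = y
    return [-A1 * nx - B2 * nx**2,
            -a1 * neh - b2 * neh**2 - c3 * neh**3]

for dEdx in (1.0, 10.0, 100.0):            # stand-in for the local excitation density along the track
    n0 = dEdx
    sol = solve_ivp(rates, (0.0, 50.0), [0.6 * n0, 0.4 * n0], max_step=0.01)
    nx, neh = sol.y
    photons = np.trapz(A1 * nx + a1 * neh, sol.t)   # radiative channels integrated over time
    print(f"-dE/dx proxy = {dEdx:6.1f}  ->  relative local light yield = {photons / n0:.3f}")
```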

  11. Adjusting multistate capture-recapture models for misclassification bias: manatee breeding proportions

    USGS Publications Warehouse

    Kendall, W.L.; Hines, J.E.; Nichols, J.D.

    2003-01-01

    Matrix population models are important tools for research and management of populations. Estimating the parameters of these models is an important step in applying them to real populations. Multistate capture-recapture methods have provided a useful means for estimating survival and parameters of transition between locations or life history states but have mostly relied on the assumption that the state occupied by each detected animal is known with certainty. Nevertheless, in some cases animals can be misclassified. Using multiple capture sessions within each period of interest, we developed a method that adjusts estimates of transition probabilities for bias due to misclassification. We applied this method to 10 years of sighting data for a population of Florida manatees (Trichechus manatus latirostris) in order to estimate the annual probability of transition from nonbreeding to breeding status. Some sighted females were unequivocally classified as breeders because they were clearly accompanied by a first-year calf. The remainder were classified, sometimes erroneously, as nonbreeders because an attendant first-year calf was not observed or was classified as more than one year old. We estimated a conditional breeding probability of 0.31 ± 0.04 (estimate ± 1 SE) when we ignored misclassification bias, and 0.61 ± 0.09 when we accounted for misclassification.

  12. Final Report: First Principles Modeling of Mechanisms Underlying Scintillator Non-Proportionality

    SciTech Connect

    Aberg, Daniel; Sadigh, Babak; Zhou, Fei

    2015-01-01

    This final report presents work carried out on the project “First Principles Modeling of Mechanisms Underlying Scintillator Non-Proportionality” at Lawrence Livermore National Laboratory during 2013-2015. The scope of the work was to further the physical understanding of the microscopic mechanisms behind scintillator nonproportionality that effectively limits the achievable detector resolution. Thereby, crucial quantitative data for these processes as input to large-scale simulation codes has been provided. In particular, this project was divided into three tasks: (i) Quantum mechanical rates of non-radiative quenching, (ii) The thermodynamics of point defects and dopants, and (iii) Formation and migration of self-trapped polarons. The progress and results of each of these subtasks are detailed.

  13. Toward a user's toolkit for modeling scintillator proportionality and light yield

    NASA Astrophysics Data System (ADS)

    Li, Qi

    Intrinsic nonproportionality is a material-dependent phenomenon that sets an ultimate limit on energy resolution of radiation detectors. In general, anything that causes light yield to change along the particle track (e.g., the primary electron track in gamma-ray detectors) contributes to nonproportionality. Most of the physics of nonproportionality lies in the host-transport and transfer-to-activator term. The main physical phenomena involved are carrier diffusion, trapping, drift in internal electric fields, and nonlinear rates of radiative and nonradiative recombination. Some complexity is added by the now well-established fact that the electron temperature is changing during important parts of the physical processes listed above. It has consequences, but is tractable by application of electron-phonon interaction theory and first-principles calculation of trap structures checked by experiment. Determination of coefficients and rate "constants" as functions of electron temperature Te for diffusion, D(Te(t)); capture on multiple (i) radiative and nonradiative centers, A1i(Te(t)); bimolecular exciton formation, B2(Te(t)); and nonlinear quenching, K2(Te(t)), K3(Te(t)), in specific scintillator materials will enable computational prediction of energy-dependent response from standard rate equations solved in the electron track for initial excitation distributions calculated by standard methods such as Geant4. Te(t) itself is a function of time. Determination of these parameters can be combined with models describing carrier transport in scintillators, which makes it possible to build a user's toolkit for analyzing any existing and potential scintillators. In the dissertation, progress in calculating electronic structure of traps and activators, diffusion coefficients and rate functions, and testing the model will be described.

  14. Household hazardous waste disposal to landfill: using LandSim to model leachate migration.

    PubMed

    Slack, Rebecca J; Gronow, Jan R; Hall, David H; Voulvoulis, Nikolaos

    2007-03-01

    Municipal solid waste (MSW) landfill leachate contains a number of aquatic pollutants. A specific MSW stream often referred to as household hazardous waste (HHW) can be considered to contribute a large proportion of these pollutants. This paper describes the use of the LandSim (Landfill Performance Simulation) modelling program to assess the environmental consequences of leachate release from a generic MSW landfill in receipt of co-disposed HHW. Heavy metals and organic pollutants were found to migrate into the zones beneath a model landfill site over a 20,000-year period. Arsenic and chromium were found to exceed European Union and US-EPA drinking water standards at the unsaturated zone/aquifer interface, with levels of mercury and cadmium exceeding minimum reporting values (MRVs). The findings demonstrate the pollution potential arising from HHW disposal with MSW. PMID:17046126

  16. Modelling and calculations of the response of tissue equivalent proportional counter to charged particles.

    PubMed

    Nikjoo, H; Uehara, S; Pinsky, L; Cucinotta, Francis A

    2007-01-01

    Space activities in earth orbit or in deep space pose challenges to the estimation of risk factors for both astronauts and instrumentation. In space, risk from exposure to ionising radiation is one of the main factors limiting manned space exploration. Therefore, characterising the radiation environment in terms of the types of radiations and the quantity of radiation that the astronauts are exposed to is of critical importance in planning space missions. In this paper, calculations of the response of TEPC to protons and carbon ions were reported. The calculations have been carried out using Monte Carlo track structure simulation codes for the walled and the wall-less TEPC counters. The model simulates nonhomogenous tracks in the sensitive volume of the counter and accounts for direct and indirect events. Calculated frequency- and dose-averaged lineal energies 0.3 MeV-1 GeV protons are presented and compared with the experimental data. The calculation of quality factors (QF) were made using individual track histories. Additionally, calculations of absolute frequencies of energy depositions in cylindrical targets, 100 nm height by 100 nm diameter, when randomly positioned and oriented in water irradiated with 1 Gy of protons of energy 0.3-100 MeV, is presented. The distributions show the clustering properties of protons of different energies in a 100 nm by 100 nm cylinder. PMID:17513858
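
    A small worked example of the frequency- and dose-averaged lineal energies quoted above is sketched below: lineal energy is the single-event energy deposition divided by the mean chord length of the site, and the two averages are the first moment and the ratio of the second to the first moment. The deposition sample, the site size, and the spherical-site assumption are illustrative only (the paper's targets are 100 nm cylinders and TEPC-sized volumes).

```python
import numpy as np

rng = np.random.default_rng(8)
eps_keV = rng.lognormal(mean=0.0, sigma=0.8, size=10_000)   # energy deposited per event, keV (synthetic)
site_diameter_um = 1.0                                      # simulated site diameter (assumed)
mean_chord_um = 2.0 / 3.0 * site_diameter_um                # Cauchy mean chord of a sphere, 2d/3

y = eps_keV / mean_chord_um                                  # lineal energy per event (keV/um)
y_F = y.mean()                                               # frequency-averaged lineal energy
y_D = (y ** 2).mean() / y.mean()                             # dose-averaged lineal energy
print(f"y_F = {y_F:.2f} keV/um, y_D = {y_D:.2f} keV/um")
```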

  18. Nonlinear relative-proportion-based route adjustment process for day-to-day traffic dynamics: modeling, equilibrium and stability analysis

    NASA Astrophysics Data System (ADS)

    Zhu, Wenlong; Ma, Shoufeng; Tian, Junfang; Li, Geng

    2016-11-01

    Travelers' route adjustment behaviors in a congested road traffic network can be viewed as a dynamic game process among travelers. Proportional-Switch Adjustment Process (PSAP) models have been extensively investigated to characterize travelers' route choice behaviors; PSAP has a concise structure and an intuitive behavior rule. Unfortunately, most existing models have some limitations, such as the flow over-adjustment problem of the discrete PSAP model and the problem of adjustment driven by absolute cost differences between routes. This paper proposes a relative-Proportion-based Route Adjustment Process (rePRAP) that maintains the advantages of PSAP and overcomes these limitations. The rePRAP describes the situation in which travelers on a higher-cost route switch to lower-cost alternatives at a rate that depends solely on the relative cost differences between the higher-cost route and its alternatives. It is verified to be consistent with the principle of the rational behavior adjustment process. The equivalence among user equilibrium, the stationary path flow pattern and the stationary link flow pattern is established, which can be used to judge whether a given network traffic flow has reached user equilibrium (UE) by detecting whether the link flow pattern is stationary. The stability theorem is proved by the Lyapunov function approach. A simple example is tested to demonstrate the effectiveness of the rePRAP model.
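
    The sketch below is a toy day-to-day adjustment on a two-route network in the spirit of a relative-proportion switching rule: flow leaves the dearer route at a rate scaled by the relative cost difference. The cost functions, the sensitivity, and the exact update rule are illustrative assumptions, not the paper's rePRAP dynamics, but the fixed point is the user equilibrium with equal route costs.

```python
demand = 10.0
cost = lambda f1, f2: (10.0 + 2.0 * f1, 15.0 + 1.0 * f2)   # linear route cost functions (assumed)
alpha = 0.5                                                # adjustment sensitivity (assumed)

f1 = demand                                                # day 0: everyone on route 1
for day in range(60):
    c1, c2 = cost(f1, demand - f1)
    if c1 > c2:                                            # switch away from the dearer route,
        f1 -= alpha * f1 * (c1 - c2) / c1                  # at a rate set by the relative cost gap
    else:
        f1 += alpha * (demand - f1) * (c2 - c1) / c2

print("equilibrium flows:", f1, demand - f1)
print("equilibrium costs:", cost(f1, demand - f1))         # equal costs at user equilibrium
```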

  19. Comparison of Proportional and On/Off Solar Collector Loop Control Strategies Using a Dynamic Collector Model

    SciTech Connect

    Schiller, Steven R.; Warren, Mashuri L.; Auslander, David M.

    1980-11-01

    In this paper, common control strategies used to regulate the flow of liquid through flat-plate solar collectors are discussed and evaluated using a dynamic collector model. Performance of all strategies is compared using different set points, flow rates, insolation levels and patterns, and ambient temperature conditions. The unique characteristic of the dynamic collector model is that it includes the effect of collector capacitance. Short term temperature response and the energy-storage capability of collector capacitance are shown to play significant roles in comparing on/off and proportional controllers. Inclusion of these effects has produced considerably more realistic simulations than any generated by steady-state models. Finally, simulations indicate relative advantages and disadvantages of both types of controllers, conditions under which each performs better, and the importance of pump cycling and controller set points on total energy collection.
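
    A rough sketch of the comparison described above is given below: a one-node collector model with thermal capacitance is driven by a half-sine solar day, and the pump is governed either by an on/off thermostat or by a proportional controller; the energy collected and the number of pump switches are reported. All parameters, set points, and gains are illustrative assumptions, not the paper's values.

```python
import numpy as np

A, C = 4.0, 20_000.0              # collector area (m2) and thermal capacitance (J/K), assumed
eta0, UL = 0.75, 6.0              # optical efficiency and loss coefficient (W/m2K), assumed
cp, T_tank, T_amb = 4186.0, 45.0, 15.0
I_sun = lambda t: max(0.0, 900.0 * np.sin(np.pi * t / 36_000.0))   # 10-hour solar day (W/m2)

def simulate(controller, dt=10.0, t_end=36_000.0):
    T, collected, switches, pump_prev = T_tank, 0.0, 0, 0.0
    for k in range(int(t_end / dt)):
        mdot = controller(T)                                 # pump flow rate (kg/s)
        q_gain = A * (eta0 * I_sun(k * dt) - UL * (T - T_amb))
        q_load = mdot * cp * (T - T_tank)
        T += dt * (q_gain - q_load) / C                      # one-node dynamic (capacitance) model
        collected += dt * max(q_load, 0.0)
        switches += int(mdot > 0.0) != int(pump_prev > 0.0)
        pump_prev = mdot
    return collected / 3.6e6, switches                       # kWh collected, pump on/off switches

on_off = lambda T: 0.05 if T > T_tank + 5.0 else 0.0         # single-threshold on/off rule (assumed)
proportional = lambda T: float(np.clip(0.01 * (T - T_tank), 0.0, 0.05))
print("on/off      : %.2f kWh, %d pump switches" % simulate(on_off))
print("proportional: %.2f kWh, %d pump switches" % simulate(proportional))
```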

  20. The application of models to the assessment of fire hazard from consumer products

    NASA Astrophysics Data System (ADS)

    Bukowski, R. W.

    1985-08-01

    The differences among models of fire, fire hazard, and fire risk are described. The use of field, zone, and network models for fire hazard assessment is discussed. A number of available single and multiple compartment models are described. Key considerations with respect to the use of the current models by the Consumer Product Safety Commission for hazard assessment from upholstered furniture and mattress fires is presented. Modifications necessary to improve the capability of these models for hazard assessments are identified. Model validation, output presentation, and data sources are discussed. Recommendations on specific models for the sponsor to consider for further study and use are provided.

  1. A multimodal location and routing model for hazardous materials transportation.

    PubMed

    Xie, Yuanchang; Lu, Wei; Wang, Wen; Quadrifoglio, Luca

    2012-08-15

    The recent US Commodity Flow Survey data suggest that transporting hazardous materials (HAZMAT) often involves multiple modes, especially for long-distance transportation. However, not much research has been conducted on HAZMAT location and routing on a multimodal transportation network. Most existing HAZMAT location and routing studies focus exclusively on single mode (either highways or railways). Motivated by the lack of research on multimodal HAZMAT location and routing and the fact that there is an increasing demand for it, this research proposes a multimodal HAZMAT model that simultaneously optimizes the locations of transfer yards and transportation routes. The developed model is applied to two case studies of different network sizes to demonstrate its applicability. The results are analyzed and suggestions for future research are provided.

  3. Hazard based models for freeway traffic incident duration.

    PubMed

    Tavassoli Hojati, Ahmad; Ferreira, Luis; Washington, Simon; Charles, Phil

    2013-03-01

    Assessing and prioritising cost-effective strategies to mitigate the impacts of traffic incidents and accidents on non-recurrent congestion on major roads represents a significant challenge for road network managers. This research examines the influence of numerous factors associated with incidents of various types on their duration. It presents a comprehensive traffic incident data mining and analysis by developing an incident duration model based on twelve months of incident data obtained from the Australian freeway network. Parametric accelerated failure time (AFT) survival models of incident duration were developed, including log-logistic, lognormal, and Weibull, considering both fixed and random parameters, as well as a Weibull model with gamma heterogeneity. The Weibull AFT models with random parameters were appropriate for modelling incident duration arising from crashes and hazards. A Weibull model with gamma heterogeneity was most suitable for modelling incident duration of stationary vehicles. Significant variables affecting incident duration include characteristics of the incidents (severity, type, towing requirements, etc.), and location, time of day, and traffic characteristics of the incident. Moreover, the findings reveal no significant effects of infrastructure and weather on incident duration. A significant and unique contribution of this paper is that the durations of each type of incident are uniquely different and respond to different factors. The results of this study are useful for traffic incident management agencies to implement strategies to reduce incident duration, leading to reduced congestion, secondary incidents, and the associated human and economic losses.
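
    A minimal sketch of the parametric AFT comparison described above (fixed-parameter variants only; the random-parameter and gamma-heterogeneity models need more specialised tooling) is given below using lifelines on a synthetic incident log. The variable names and effect sizes are placeholders, not the Australian data.

```python
import numpy as np
import pandas as pd
from lifelines import WeibullAFTFitter, LogNormalAFTFitter, LogLogisticAFTFitter

rng = np.random.default_rng(9)
n = 1500
crash = rng.integers(0, 2, n)                       # incident type: crash vs hazard (assumed)
towing = rng.integers(0, 2, n)                      # towing required
peak_hour = rng.integers(0, 2, n)

scale = np.exp(3.0 + 0.5 * crash + 0.4 * towing + 0.2 * peak_hour)   # duration scale, minutes
duration = scale * rng.weibull(1.3, n)
df = pd.DataFrame({"duration": duration, "observed": 1,              # all durations fully observed
                   "crash": crash, "towing": towing, "peak_hour": peak_hour})

for Fitter in (WeibullAFTFitter, LogNormalAFTFitter, LogLogisticAFTFitter):
    m = Fitter().fit(df, duration_col="duration", event_col="observed")
    print(Fitter.__name__, "AIC:", round(m.AIC_, 1))    # lower AIC = better-fitting distribution
```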

  5. Modeling population exposures to outdoor sources of hazardous air pollutants.

    PubMed

    Ozkaynak, Halûk; Palma, Ted; Touma, Jawad S; Thurman, James

    2008-01-01

    Accurate assessment of human exposures is an important part of environmental health effects research. However, most air pollution epidemiology studies rely upon imperfect surrogates of personal exposures, such as information based on available central-site outdoor concentration monitoring or modeling data. In this paper, we examine the limitations of using outdoor concentration predictions instead of modeled personal exposures for over 30 gaseous and particulate hazardous air pollutants (HAPs) in the US. The analysis uses the results from an air quality dispersion model (the ASPEN or Assessment System for Population Exposure Nationwide model) and an inhalation exposure model (the HAPEM or Hazardous Air Pollutant Exposure Model, Version 5), applied by the U.S. Environmental Protection Agency during the 1999 National Air Toxics Assessment (NATA) in the US. Our results show that the total predicted chronic exposure concentrations of outdoor HAPs from all sources are lower than the modeled ambient concentrations by about 20% on average for most gaseous HAPs and by about 60% on average for most particulate HAPs (mainly due to the exclusion of indoor sources from our modeling analysis and lower infiltration of particles indoors). On the other hand, the HAPEM/ASPEN concentration ratio averages for on-road mobile source exposures were found to be greater than 1 (around 1.20) for most mobile-source related HAPs (e.g., 1,3-butadiene, acetaldehyde, benzene, formaldehyde), reflecting the importance of near-roadway and commuting environments on personal exposures to HAPs. The distribution of the ratios of personal to ambient concentrations was found to be skewed for a number of the VOCs and reactive HAPs associated with major source emissions, indicating the importance of personal mobility factors. We conclude that the increase in personal exposures from the corresponding predicted ambient levels tends to occur near locations where there are either major emission sources of HAPs

  6. VHub - Cyberinfrastructure for volcano eruption and hazards modeling and simulation

    NASA Astrophysics Data System (ADS)

    Valentine, G. A.; Jones, M. D.; Bursik, M. I.; Calder, E. S.; Gallo, S. M.; Connor, C.; Carn, S. A.; Rose, W. I.; Moore-Russo, D. A.; Renschler, C. S.; Pitman, B.; Sheridan, M. F.

    2009-12-01

    Volcanic risk is increasing as populations grow in active volcanic regions, and as national economies become increasingly intertwined. In addition to their significance to risk, volcanic eruption processes form a class of multiphase fluid dynamics with rich physics on many length and time scales. Risk significance, physics complexity, and the coupling of models to complex dynamic spatial datasets all demand the development of advanced computational techniques and interdisciplinary approaches to understand and forecast eruption dynamics. Innovative cyberinfrastructure is needed to enable global collaboration and novel scientific creativity, while simultaneously enabling computational thinking in real-world risk mitigation decisions - an environment where quality control, documentation, and traceability are key factors. Supported by NSF, we are developing a virtual organization, referred to as VHub, to address this need. Overarching goals of the VHub project are: Dissemination. Make advanced modeling and simulation capabilities and key data sets readily available to researchers, students, and practitioners around the world. Collaboration. Provide a mechanism for participants not only to be users but also co-developers of modeling capabilities, and contributors of experimental and observational data sets for use in modeling and simulation, in a collaborative environment that reaches far beyond local work groups. Comparison. Facilitate comparison between different models in order to provide the practitioners with guidance for choosing the "right" model, depending upon the intended use, and provide a platform for multi-model analysis of specific problems and incorporation into probabilistic assessments. Application. Greatly accelerate access and application of a wide range of modeling tools and related data sets to agencies around the world that are charged with hazard planning, mitigation, and response. Education. Provide resources that will promote the training of the

  7. Lava flow hazard at Nyiragongo volcano, D.R.C.. 1. Model calibration and hazard mapping

    NASA Astrophysics Data System (ADS)

    Favalli, Massimiliano; Chirico, Giuseppe D.; Papale, Paolo; Pareschi, Maria Teresa; Boschi, Enzo

    2009-05-01

    The 2002 eruption of Nyiragongo volcano constitutes the most outstanding case ever of lava flow in a big town. It also represents one of the very rare cases of direct casualties from lava flows, which had high velocities of up to tens of kilometers per hour. As in the 1977 eruption, which is the only other eccentric eruption of the volcano in more than 100 years, lava flows were emitted from several vents along a N-S system of fractures extending for more than 10 km, from which they propagated mostly towards Lake Kivu and Goma, a town of about 500,000 inhabitants. We assessed the lava flow hazard on the entire volcano and in the towns of Goma (D.R.C.) and Gisenyi (Rwanda) through numerical simulations of probable lava flow paths. Lava flow paths are computed based on the steepest descent principle, modified by stochastically perturbing the topography to take into account the capability of lava flows to override topographic obstacles, fill topographic depressions, and spread over the topography. Code calibration and the definition of the expected lava flow length and vent opening probability distributions were done based on the 1977 and 2002 eruptions. The final lava flow hazard map shows that the eastern sector of Goma devastated in 2002 represents the area of highest hazard on the flanks of the volcano. The second highest hazard sector in Goma is the area of propagation of the western lava flow in 2002. The town of Gisenyi is subject to moderate to high hazard due to its proximity to the alignment of fractures active in 1977 and 2002. In a companion paper (Chirico et al., Bull Volcanol, in this issue, 2008) we use numerical simulations to investigate the possibility of reducing lava flow hazard through the construction of protective barriers, and formulate a proposal for the future development of the town of Goma.
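
    The path computation described above rests on the steepest descent principle with stochastic perturbation of the topography. The sketch below illustrates that idea in Python on a toy DEM; the noise amplitude, grid, and number of Monte Carlo runs are hypothetical, and this is not the authors' code.

      # Minimal sketch of steepest descent with stochastic topographic perturbation.
      # The DEM, perturbation amplitude, and run count are hypothetical.
      import numpy as np

      def steepest_descent_path(dem, start, max_steps=10_000):
          """Follow the steepest downhill 8-neighbour direction until a pit or the DEM edge."""
          path = [start]
          r, c = start
          for _ in range(max_steps):
              window = dem[max(r - 1, 0):r + 2, max(c - 1, 0):c + 2]
              dr, dc = np.unravel_index(np.argmin(window), window.shape)
              nr, nc = max(r - 1, 0) + dr, max(c - 1, 0) + dc
              if (nr, nc) == (r, c):          # local minimum: flow stops
                  break
              r, c = nr, nc
              path.append((r, c))
              if r in (0, dem.shape[0] - 1) or c in (0, dem.shape[1] - 1):
                  break                        # reached the DEM edge
          return path

      rng = np.random.default_rng(1)
      x = np.linspace(0, 10, 200)
      dem = np.add.outer(x, 0.3 * x)          # toy sloping topography
      vent = (5, 100)

      # Monte Carlo over perturbed topographies: the noise lets paths override small obstacles.
      hit_count = np.zeros_like(dem)
      n_runs = 200
      for _ in range(n_runs):
          perturbed = dem + rng.normal(0.0, 0.05, dem.shape)   # hypothetical noise amplitude
          for r, c in steepest_descent_path(perturbed, vent):
              hit_count[r, c] += 1
      inundation_probability = hit_count / n_runs
      print(inundation_probability.max())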

  8. Preliminary deformation model for National Seismic Hazard map of Indonesia

    NASA Astrophysics Data System (ADS)

    Meilano, Irwan; Susilo; Gunawan, Endra; Sarsito, Dina; Prijatna, Kosasih; Abidin, Hasanuddin Z.; Efendi, Joni

    2015-04-01

    A preliminary deformation model for Indonesia's National Seismic Hazard (NSH) map is constructed from block rotation and strain accumulation functions in an elastic half-space. Deformation due to rigid body motion is estimated by rotating six tectonic blocks in Indonesia. The interseismic deformation due to subduction is estimated by assuming coupling on the subduction interface, while deformation at active faults is calculated by assuming that each fault segment slips beneath a locking depth, or in combination with creep in a shallower part. This research shows that rigid body motion dominates the deformation pattern with magnitudes of more than 15 mm/year, except in the narrow areas near subduction zones and active faults where significant deformation reaches up to 25 mm/year.

  9. Preliminary deformation model for National Seismic Hazard map of Indonesia

    SciTech Connect

    Meilano, Irwan; Gunawan, Endra; Sarsito, Dina; Prijatna, Kosasih; Abidin, Hasanuddin Z.; Susilo; Efendi, Joni

    2015-04-24

    A preliminary deformation model for Indonesia's National Seismic Hazard (NSH) map is constructed from block rotation and strain accumulation functions in an elastic half-space. Deformation due to rigid body motion is estimated by rotating six tectonic blocks in Indonesia. The interseismic deformation due to subduction is estimated by assuming coupling on the subduction interface, while deformation at active faults is calculated by assuming that each fault segment slips beneath a locking depth, or in combination with creep in a shallower part. This research shows that rigid body motion dominates the deformation pattern with magnitudes of more than 15 mm/year, except in the narrow areas near subduction zones and active faults where significant deformation reaches up to 25 mm/year.
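
    The rigid-block-rotation component mentioned in both records can be illustrated with the standard relation v = ω × r between a block's Euler (rotation) vector and the surface velocity of a point on it. The sketch below is a minimal Python illustration; the Euler pole and rotation rate are hypothetical, not values from the Indonesian NSH model.

      # Minimal sketch of the rigid-block-rotation velocity v = omega x r.
      # Pole location and rate are hypothetical illustration values.
      import numpy as np

      R_EARTH = 6.371e6  # metres

      def block_velocity(lat_deg, lon_deg, pole_lat_deg, pole_lon_deg, rate_deg_per_myr):
          """East/north surface velocity (mm/yr) of a point on a rigid rotating block."""
          lat, lon = np.radians([lat_deg, lon_deg])
          plat, plon = np.radians([pole_lat_deg, pole_lon_deg])
          w = np.radians(rate_deg_per_myr) / 1e6                     # rad per year
          omega = w * np.array([np.cos(plat) * np.cos(plon),
                                np.cos(plat) * np.sin(plon),
                                np.sin(plat)])
          r = R_EARTH * np.array([np.cos(lat) * np.cos(lon),
                                  np.cos(lat) * np.sin(lon),
                                  np.sin(lat)])
          v = np.cross(omega, r)                                     # metres per year (ECEF)
          east = np.array([-np.sin(lon), np.cos(lon), 0.0])
          north = np.array([-np.sin(lat) * np.cos(lon), -np.sin(lat) * np.sin(lon), np.cos(lat)])
          return v @ east * 1000.0, v @ north * 1000.0               # mm/yr

      # Example: a point in central Java under a hypothetical block Euler vector.
      print(block_velocity(-7.5, 110.4, pole_lat_deg=50.0, pole_lon_deg=-90.0, rate_deg_per_myr=0.34))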

  10. A modeling framework for investment planning in interdependent infrastructures in multi-hazard environments.

    SciTech Connect

    Brown, Nathanael J. K.; Gearhart, Jared Lee; Jones, Dean A.; Nozick, Linda Karen; Prince, Michael

    2013-09-01

    Currently, much of protection planning is conducted separately for each infrastructure and hazard. Limited funding requires a balance of expenditures between terrorism and natural hazards based on potential impacts. This report documents the results of a Laboratory Directed Research & Development (LDRD) project that created a modeling framework for investment planning in interdependent infrastructures focused on multiple hazards, including terrorism. To develop this framework, three modeling elements were integrated: natural hazards, terrorism, and interdependent infrastructures. For natural hazards, a methodology was created for specifying events consistent with regional hazards. For terrorism, we modeled the terrorists' actions based on assumptions regarding their knowledge, goals, and target identification strategy. For infrastructures, we focused on predicting post-event performance due to specific terrorist attacks and natural hazard events, tempered by appropriate infrastructure investments. We demonstrate the utility of this framework with various examples, including protection of electric power, roadway, and hospital networks.

  11. Nankai-Tokai subduction hazard for catastrophe risk modeling

    NASA Astrophysics Data System (ADS)

    Spurr, D. D.

    2010-12-01

    The historical record of Nankai subduction zone earthquakes includes nine event sequences over the last 1300 years. Typical characteristic behaviour is evident, with segments rupturing either co-seismically or as two large earthquakes less than 3 yrs apart (active phase), followed by periods of low seismicity lasting 90-150 yrs or more. Despite the long historical record, the recurrence behaviour and consequent seismic hazard remain uncertain and controversial. In 2005 the Headquarters for Earthquake Research Promotion (HERP) published models for hundreds of faults as part of an official Japanese seismic hazard map. The HERP models have been widely adopted in part or full both within Japan and by the main international catastrophe risk model companies. The time-dependent recurrence modelling we adopt for the Nankai faults departs considerably from HERP in three main areas: ■ A “Linked System” (LS) source model is used to simulate the strong correlation between segment ruptures evident in the historical record, whereas the HERP recurrence estimates assume the Nankai, Tonankai and Tokai segments rupture independently. The LS component models all historical events with a common rupture recurrence cycle for the three segments. System rupture probabilities are calculated assuming BPT behaviour and parameter uncertainties assessed from the full 1300 yr historical record. ■ An independent, “Tokai Only” (TO) rupture source is used specifically to model potential “Tokai only” earthquakes. There are widely diverging views on the possibility of this segment rupturing independently. Although all historical Tokai ruptures appear to have been composite Tonankai-Tokai earthquakes, the available data do not preclude the possibility of future “Tokai only” events. The HERP model also includes “Tokai only” earthquakes but the recurrence parameters are based on historical composite Tonankai-Tokai ruptures and do not appear to recognise the complex tectonic
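
    The BPT (Brownian Passage Time) recurrence behaviour referred to above can be illustrated with the conditional probability of rupture in a coming time window given the elapsed time since the last event; the BPT distribution is an inverse Gaussian with mean recurrence T and aperiodicity α. The sketch below uses scipy; the recurrence numbers are hypothetical, not HERP or LS-model parameters.

      # Minimal sketch of a time-dependent (BPT) conditional rupture probability.
      # Mean recurrence, aperiodicity, and elapsed time below are hypothetical.
      from scipy.stats import invgauss

      def bpt_conditional_probability(mean_recurrence, aperiodicity, elapsed, horizon):
          """P(rupture within `horizon` yrs | no rupture for `elapsed` yrs)."""
          lam = mean_recurrence / aperiodicity**2                 # inverse-Gaussian shape parameter
          dist = invgauss(mu=mean_recurrence / lam, scale=lam)    # scipy parameterisation
          survival = 1.0 - dist.cdf(elapsed)
          return (dist.cdf(elapsed + horizon) - dist.cdf(elapsed)) / survival

      # Hypothetical example: 110-yr mean recurrence, aperiodicity 0.25,
      # 70 years since the last event, probability of rupture in the next 30 years.
      print(bpt_conditional_probability(110.0, 0.25, elapsed=70.0, horizon=30.0))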

  12. Estimation of the Proportion of Underachieving Students in Compulsory Secondary Education in Spain: An Application of the Rasch Model.

    PubMed

    Veas, Alejandro; Gilar, Raquel; Miñano, Pablo; Castejón, Juan-Luis

    2016-01-01

    There are very few studies in Spain that treat underachievement rigorously, and those that do are typically related to gifted students. The present study examined the proportion of underachieving students using the Rasch measurement model. A sample of 643 first-year high school students (mean age = 12.09; SD = 0.47) from 8 schools in the province of Alicante (Spain) completed the Battery of Differential and General Skills (Badyg), and these students' General Points Average (GPAs) were recovered by teachers. Dichotomous and Partial credit Rasch models were performed. After adjusting the measurement instruments, the individual underachievement index provided a total sample of 181 underachieving students, or 28.14% of the total sample across the ability levels. This study confirms that the Rasch measurement model can accurately estimate the construct validity of both the intelligence test and the academic grades for the calculation of underachieving students. Furthermore, the present study constitutes a pioneer framework for the estimation of the prevalence of underachievement in Spain. PMID:26973586

  13. Estimation of the Proportion of Underachieving Students in Compulsory Secondary Education in Spain: An Application of the Rasch Model

    PubMed Central

    Veas, Alejandro; Gilar, Raquel; Miñano, Pablo; Castejón, Juan-Luis

    2016-01-01

    There are very few studies in Spain that treat underachievement rigorously, and those that do are typically related to gifted students. The present study examined the proportion of underachieving students using the Rasch measurement model. A sample of 643 first-year high school students (mean age = 12.09; SD = 0.47) from 8 schools in the province of Alicante (Spain) completed the Battery of Differential and General Skills (Badyg), and these students' General Points Average (GPAs) were recovered by teachers. Dichotomous and Partial credit Rasch models were performed. After adjusting the measurement instruments, the individual underachievement index provided a total sample of 181 underachieving students, or 28.14% of the total sample across the ability levels. This study confirms that the Rasch measurement model can accurately estimate the construct validity of both the intelligence test and the academic grades for the calculation of underachieving students. Furthermore, the present study constitutes a pioneer framework for the estimation of the prevalence of underachievement in Spain. PMID:26973586
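
    For readers unfamiliar with the Rasch model used in these two records, the dichotomous form gives the probability of a correct response as a logistic function of the difference between person ability and item difficulty. The sketch below is a minimal Python illustration with made-up abilities and difficulties; it is not the authors' estimation code (they fitted the models to Badyg and GPA data).

      # Minimal sketch of the dichotomous Rasch model:
      # P(correct) = exp(theta - b) / (1 + exp(theta - b)).
      import numpy as np

      def rasch_probability(theta, b):
          """Probability that a person of ability theta answers an item of difficulty b correctly."""
          return 1.0 / (1.0 + np.exp(-(theta - b)))

      abilities = np.array([-1.0, 0.0, 1.5])        # hypothetical person abilities (logits)
      difficulties = np.array([-0.5, 0.3, 1.0])     # hypothetical item difficulties (logits)

      # Expected score of each person on the 3-item test: sum of item probabilities.
      expected_scores = rasch_probability(abilities[:, None], difficulties[None, :]).sum(axis=1)
      print(expected_scores)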

  14. Research collaboration, hazard modeling and dissemination in volcanology with Vhub

    NASA Astrophysics Data System (ADS)

    Palma Lizana, J. L.; Valentine, G. A.

    2011-12-01

    Vhub (online at vhub.org) is a cyberinfrastructure for collaboration in volcanology research, education, and outreach. One of the core objectives of this project is to accelerate the transfer of research tools to organizations and stakeholders charged with volcano hazard and risk mitigation (such as observatories). Vhub offers a clearinghouse for computational models of volcanic processes and data analysis, documentation of those models, and capabilities for online collaborative groups focused on issues such as code development, configuration management, benchmarking, and validation. A subset of simulations is already available for online execution, eliminating the need to download and compile locally. In addition, Vhub is a platform for sharing presentations and other educational material in a variety of media formats, which are useful in teaching university-level volcanology. VHub also has wikis, blogs and group functions around specific topics to encourage collaboration and discussion. In this presentation we provide examples of the vhub capabilities, including: (1) tephra dispersion and block-and-ash flow models; (2) shared educational materials; (3) online collaborative environment for different types of research, including field-based studies and plume dispersal modeling; (4) workshops. Future goals include implementation of middleware to allow access to data and databases that are stored and maintained at various institutions around the world. All of these capabilities can be exercised with a user-defined level of privacy, ranging from completely private (only shared and visible to specified people) to completely public. The volcanological community is encouraged to use the resources of vhub and also to contribute models, datasets, and other items that authors would like to disseminate. The project is funded by the US National Science Foundation and includes a core development team at University at Buffalo, Michigan Technological University, and University

  15. Landslide-Generated Tsunami Model for Quick Hazard Assessment

    NASA Astrophysics Data System (ADS)

    Franz, M.; Rudaz, B.; Locat, J.; Jaboyedoff, M.; Podladchikov, Y.

    2015-12-01

    Alpine regions are likely to be at risk from landslide-induced tsunamis, because of the proximity between lakes and potential instabilities and the concentration of the population in valleys and on the lake shores. In particular, dam lakes are often surrounded by steep slopes and frequently affect the stability of the banks. In order to assess this phenomenon comprehensively, together with the induced risks, we have developed a 2.5D numerical model which aims to simulate the propagation of the landslide, the generation and the propagation of the wave and eventually the spread on the shores or the associated downstream flow. To perform this task, the process is done in three steps. Firstly, the geometry of the sliding mass is constructed using the Sloping Local Base Level (SLBL) concept. Secondly, the propagation of this volume is performed using a model based on viscous flow equations. Finally, the wave generation and its propagation are simulated using the shallow water equations stabilized by the Lax-Friedrichs scheme. The transition between wet and dry bed is performed by the combination of the two latter sets of equations. The proper behavior of our model is demonstrated by (1) numerical tests from Toro (2001), and (2) comparison with a real event where the horizontal run-up distance is known (Nicolet landslide, Quebec, Canada). The model is of particular interest due to its ability to quickly produce the 2.5D geometric model of the landslide, the tsunami simulation and, consequently, the hazard assessment.
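
    The wave-propagation step described above uses the shallow water equations stabilised by the Lax-Friedrichs scheme. The sketch below shows that scheme in one dimension on a toy initial hump of water; grid spacing, time step, and initial condition are hypothetical, and the authors' 2.5D implementation is not reproduced.

      # Minimal sketch: 1D shallow-water equations advanced with the Lax-Friedrichs scheme.
      # Grid, time step, and initial condition are hypothetical illustration values.
      import numpy as np

      g = 9.81
      nx, dx, dt, nsteps = 400, 5.0, 0.1, 500

      h = np.ones(nx)                      # water depth (m)
      h[180:220] += 2.0                    # initial wave hump
      hu = np.zeros(nx)                    # momentum h*u

      def flux(h, hu):
          u = hu / h
          return np.array([hu, hu * u + 0.5 * g * h**2])

      for _ in range(nsteps):
          U = np.array([h, hu])
          F = flux(h, hu)
          # Lax-Friedrichs: U_i^{n+1} = (U_{i+1} + U_{i-1})/2 - dt/(2 dx) (F_{i+1} - F_{i-1})
          U_new = U.copy()
          U_new[:, 1:-1] = 0.5 * (U[:, 2:] + U[:, :-2]) - dt / (2 * dx) * (F[:, 2:] - F[:, :-2])
          U_new[:, 0], U_new[:, -1] = U_new[:, 1], U_new[:, -2]   # simple outflow boundaries
          h, hu = U_new

      print(h.max())   # maximum depth after nsteps, a crude proxy for wave amplitude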

  16. Methodology Using MELCOR Code to Model Proposed Hazard Scenario

    SciTech Connect

    Gavin Hawkley

    2010-07-01

    This study demonstrates a methodology for using the MELCOR code to model a proposed hazard scenario within a building containing radioactive powder, and the subsequent evaluation of a leak path factor (LPF), the fraction of respirable material that escapes the facility into the outside environment, implicit in the scenario. The LPF evaluation analyzes the basis and applicability of an assumed standard multiplication of 0.5 × 0.5 (in which 0.5 represents the amount of material assumed to leave one area and enter another) for calculating an LPF value. The outside release depends upon the ventilation/filtration system, both filtered and unfiltered, and on other pathways from the building, such as doorways (both open and closed). This study shows how the multiple LPFs from the building interior can be evaluated in a combinatory process in which a total LPF is calculated, thus addressing the assumed multiplication and allowing for the designation and assessment of a respirable source term (ST) for later consequence analysis, in which the propagation of material released into the atmosphere can be modeled, the dose received by a receptor placed downwind can be estimated, and the distance adjusted to maintain such exposures as low as reasonably achievable (ALARA). This study also briefly addresses particle characteristics that affect atmospheric particle dispersion, and compares this dispersion with the LPF methodology.
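
    The combinatory evaluation of leak path factors can be illustrated with simple arithmetic: when material must pass through building compartments in series, the total LPF is the product of the per-compartment factors, which can then be compared with the assumed 0.5 × 0.5 value. The numbers below are hypothetical, not results from the MELCOR analysis.

      # Minimal arithmetic sketch of combining per-compartment leak path factors.
      # All values are hypothetical illustration numbers.
      room_to_corridor = 0.35      # fraction of airborne material leaving the room
      corridor_to_outside = 0.20   # fraction escaping the building envelope (doors, unfiltered paths)

      total_lpf = room_to_corridor * corridor_to_outside
      assumed_lpf = 0.5 * 0.5

      respirable_inventory_g = 100.0                      # hypothetical airborne respirable mass
      source_term_g = respirable_inventory_g * total_lpf  # mass available for downwind dispersion

      print(f"modelled LPF = {total_lpf:.3f} vs assumed {assumed_lpf:.2f}; ST = {source_term_g:.1f} g")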

  17. Simulated hazards of losing infection-free status in a Dutch BHV1 model.

    PubMed

    Vonk Noordegraaf, A; Labrovic, A; Frankena, K; Pfeiffer, D U; Nielen, M

    2004-01-30

    A compulsory eradication programme for bovine herpesvirus 1 (BHV1) was implemented in the Netherlands in 1998. At the start of the programme, about 25% of the dairy herds were certified BHV1-free. Simulation models have played an important role in the decision-making process associated with BHV1 eradication. Our objective in this study was to improve understanding of model behaviour (as part of internal validation) regarding loss by herds of the BHV1-free certificate. Using a Cox proportional hazards model, the association between farm characteristics and the risk of certificate loss during simulation was quantified. The overall fraction of initially certified herds experiencing certificate loss during simulation was 3.0% over 6.5 years. Factors that increased the risk of earlier certificate loss in the final multivariable Cox model were higher 'yearly number of cattle purchased', 'farm density within a 1 km radius' and 'cattle density within a 1 km radius'. The qualitative behaviour of the risk factors we found agreed with observations in field studies. PMID:15154684
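
    A Cox proportional hazards fit of the kind described above can be sketched as follows with the Python lifelines package (an assumption; the paper does not name its software). The herd data and column names are synthetic and purely illustrative.

      # Minimal sketch (not the authors' code) of a Cox PH fit relating herd
      # characteristics to time until loss of the BHV1-free certificate.
      import numpy as np
      import pandas as pd
      from lifelines import CoxPHFitter

      rng = np.random.default_rng(42)
      n = 1000
      cattle_purchased = rng.poisson(5, n)
      farm_density = rng.uniform(0, 5, n)

      # Synthetic time-to-certificate-loss (years), censored at 6.5 years of simulation.
      baseline = rng.exponential(60, n)
      time = baseline * np.exp(-0.15 * cattle_purchased - 0.2 * farm_density)
      observed = (time <= 6.5).astype(int)
      time = np.minimum(time, 6.5)

      herds = pd.DataFrame({
          "years_to_loss": time,
          "lost_certificate": observed,
          "cattle_purchased_per_year": cattle_purchased,
          "farm_density_1km": farm_density,
      })

      cph = CoxPHFitter()
      cph.fit(herds, duration_col="years_to_loss", event_col="lost_certificate")
      cph.print_summary()   # hazard ratios > 1 indicate earlier certificate loss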

  18. The influence of mapped hazards on risk beliefs: A proximity-based modeling approach

    PubMed Central

    Severtson, Dolores J.; Burt, James E.

    2013-01-01

    Interview findings suggest perceived proximity to mapped hazards influences risk beliefs when people view environmental hazard maps. For dot maps, four attributes of mapped hazards influenced beliefs: hazard value, proximity, prevalence, and dot patterns. In order to quantify the collective influence of these attributes for viewers' perceived or actual map locations, we present a model to estimate proximity-based hazard or risk (PBH) and share study results that indicate how modeled PBH and map attributes influenced risk beliefs. The randomized survey study among 447 university students assessed risk beliefs for 24 dot maps that systematically varied by the four attributes. Maps depicted water test results for a fictitious hazardous substance in private residential wells and included a designated “you live here” location. Of the nine variables that assessed risk beliefs, the numerical susceptibility variable was most consistently and strongly related to map attributes and PBH. Hazard value, location in or out of a clustered dot pattern, and distance had the largest effects on susceptibility. Sometimes, hazard value interacted with other attributes, e.g. distance had stronger effects on susceptibility for larger than smaller hazard values. For all combined maps, PBH explained about the same amount of variance in susceptibility as did attributes. Modeled PBH may have utility for studying the influence of proximity to mapped hazards on risk beliefs, protective behavior, and other dependent variables. Further work is needed to examine these influences for more realistic maps and representative study samples. PMID:22053748

  19. Physically Based Landslide Hazard Model: Method and Issues

    NASA Astrophysics Data System (ADS)

    Dhakal, A. S.; Sidle, R. C.

    An Integrated Dynamic Slope Stability Model (IDSSM) that integrates GIS with topographic, distributed hydrologic and vegetation models to assess slope stability at a basin scale is described to address the issues related to prediction of landslide hazards with physically based landslide models. Data limitations, one of the major problems, range from a lack of spatially distributed data on soil depth, soil physical and engineering properties, and vegetation root strength to the need for better digital elevation models to characterize topography. Often, point data and their averages, such as for soil depth and soil cohesion, need to be used as the representative values at the element scale. These factors result in a great degree of uncertainty in the simulation results. Since factors related to landsliding have different degrees of importance in causing landsliding, the introduced uncertainties may not be identical for all variables. The sensitivities of different parameters associated with landsliding were examined using the IDSSM. Since many variables are important for landslide occurrence, the effects of most of the soil and vegetation parameters were evaluated. To test for parameter uncertainty, one variable was altered while the others were held constant, and cumulative areas (percentage of the drainage area) with a safety factor less than certain values were compared. The sensitivity analysis suggests that the safety factor is most sensitive to changes in soil cohesion, soil depth, and internal friction angle. Changes in hydraulic conductivity greatly influenced the ground water table and thus slope stability. Parameters such as soil unit weight and tree surcharge were less sensitive to landsliding. Considering the possible fine spatial variation of soil depth and hydraulic conductivity in a forest soil, these two factors seem to produce large uncertainties. In forest soil, the presence of macropores and preferential flow presents

  20. Suppressing epileptic activity in a neural mass model using a closed-loop proportional-integral controller

    NASA Astrophysics Data System (ADS)

    Wang, Junsong; Niebur, Ernst; Hu, Jinyu; Li, Xiaoli

    2016-06-01

    Closed-loop control is a promising deep brain stimulation (DBS) strategy that could be used to suppress high-amplitude epileptic activity. However, there are currently no analytical approaches to determine the stimulation parameters for effective and safe treatment protocols. Proportional-integral (PI) control is the most extensively used closed-loop control scheme in the field of control engineering because of its simple implementation and perfect performance. In this study, we took Jansen’s neural mass model (NMM) as a test bed to develop a PI-type closed-loop controller for suppressing epileptic activity. A graphical stability analysis method was employed to determine the stabilizing region of the PI controller in the control parameter space, which provided a theoretical guideline for the choice of the PI control parameters. Furthermore, we established the relationship between the parameters of the PI controller and the parameters of the NMM in the form of a stabilizing region, which provided insights into the mechanisms that may suppress epileptic activity in the NMM. The simulation results demonstrated the validity and effectiveness of the proposed closed-loop PI control scheme.
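
    A minimal discrete-time sketch of a proportional-integral feedback loop of the general form discussed above is given below, with a first-order linear plant standing in for the neural mass model output. The gains and plant constants are hypothetical, not values from the stabilising region derived in the paper.

      # Minimal sketch of a discrete-time PI control loop on a toy first-order plant.
      # Gains, plant constants, and the drive term are hypothetical.
      import numpy as np

      kp, ki = 2.0, 5.0          # hypothetical PI gains
      dt, nsteps = 0.001, 5000
      setpoint = 0.0             # desired (low-amplitude) output

      y = 1.0                    # plant output, starts at a high "epileptiform" level
      integral = 0.0
      trace = np.empty(nsteps)

      for k in range(nsteps):
          error = setpoint - y
          integral += error * dt
          u = kp * error + ki * integral          # PI control law (the stimulation signal)
          # toy first-order plant: dy/dt = -a*y + b*u + drive
          y += dt * (-5.0 * y + 10.0 * u + 2.0)
          trace[k] = y

      print(trace[-1])   # the output decays toward the setpoint when the closed loop is stable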

  1. Suppressing epileptic activity in a neural mass model using a closed-loop proportional-integral controller

    PubMed Central

    Wang, Junsong; Niebur, Ernst; Hu, Jinyu; Li, Xiaoli

    2016-01-01

    Closed-loop control is a promising deep brain stimulation (DBS) strategy that could be used to suppress high-amplitude epileptic activity. However, there are currently no analytical approaches to determine the stimulation parameters for effective and safe treatment protocols. Proportional-integral (PI) control is the most extensively used closed-loop control scheme in the field of control engineering because of its simple implementation and perfect performance. In this study, we took Jansen’s neural mass model (NMM) as a test bed to develop a PI-type closed-loop controller for suppressing epileptic activity. A graphical stability analysis method was employed to determine the stabilizing region of the PI controller in the control parameter space, which provided a theoretical guideline for the choice of the PI control parameters. Furthermore, we established the relationship between the parameters of the PI controller and the parameters of the NMM in the form of a stabilizing region, which provided insights into the mechanisms that may suppress epileptic activity in the NMM. The simulation results demonstrated the validity and effectiveness of the proposed closed-loop PI control scheme. PMID:27273563

  2. Evaluating the hazard from Siding Spring dust: Models and predictions

    NASA Astrophysics Data System (ADS)

    Christou, A.

    2014-12-01

    Long-period comet C/2013 A1 (Siding Spring) will pass at a distance of ~140 thousand km (9e-4 AU) - about a third of a lunar distance - from the centre of Mars, closer to this planet than any known comet has come to the Earth since records began. Closest approach is expected to occur at 18:30 UT on the 19th October. This provides an opportunity for a "free" flyby of a different type of comet than those investigated by spacecraft so far, including comet 67P/Churyumov-Gerasimenko currently under scrutiny by the Rosetta spacecraft. At the same time, the passage of the comet through Martian space will create the opportunity to study the reaction of the planet's upper atmosphere to a known natural perturbation. The flip-side of the coin is the risk to Mars-orbiting assets, both existing (NASA's Mars Odyssey & Mars Reconnaissance Orbiter and ESA's Mars Express) and in transit (NASA's MAVEN and ISRO's Mangalyaan), posed by high-speed cometary dust potentially impacting spacecraft surfaces. Much work has already gone into assessing this hazard and devising mitigating measures in the precious little warning time given to characterise this object before the Mars encounter. In this presentation, we will provide an overview of how the meteoroid stream and comet coma dust impact models have evolved since the comet's discovery and discuss lessons learned should similar circumstances arise in the future.

  3. Frequencies as Proportions: Using a Teaching Model Based on Pirie and Kieren's Model of Mathematical Understanding

    ERIC Educational Resources Information Center

    Wright, Vince

    2014-01-01

    Pirie and Kieren (1989, "For the Learning of Mathematics", 9(3), 7-11; 1992, "Journal of Mathematical Behavior", 11, 243-257; 1994a, "Educational Studies in Mathematics", 26, 61-86; 1994b, "For the Learning of Mathematics", 14(1), 39-43) created a model (P-K) that describes a dynamic and recursive process by which…

  4. Modelling Inland Flood Events for Hazard Maps in Taiwan

    NASA Astrophysics Data System (ADS)

    Ghosh, S.; Nzerem, K.; Sassi, M.; Hilberts, A.; Assteerawatt, A.; Tillmanns, S.; Mathur, P.; Mitas, C.; Rafique, F.

    2015-12-01

    Taiwan experiences significant inland flooding, driven by torrential rainfall from plum rain storms and typhoons during summer and fall. Data from the last 13 to 16 years show that about 3,000 buildings were damaged by such floods annually, with losses of US$0.41 billion (Water Resources Agency). This long, narrow island nation with mostly hilly/mountainous topography is located in the tropical-subtropical zone, with an annual average typhoon-hit frequency of 3-4 (Central Weather Bureau) and annual average precipitation of 2,502 mm (WRA) - 2.5 times the world's average. Spatial and temporal distributions of countrywide precipitation are uneven, with very high local extreme rainfall intensities. Annual average precipitation is 3,000-5,000 mm in the mountainous regions, 78% of it falls in May-October, and the 1-hour to 3-day maximum rainfalls are about 85 to 93% of the world records (WRA). Rivers in Taiwan are short with small upstream areas and high runoff coefficients of watersheds. These rivers have the steepest slopes, the shortest response times with rapid flows, and the largest peak flows as well as specific flood peak discharges (WRA) in the world. RMS has recently developed a countrywide inland flood model for Taiwan, producing hazard return period maps at 1 arcsec grid resolution. These can be the basis for evaluating and managing flood risk, its economic impacts, and insured flood losses. The model is initiated with sub-daily historical meteorological forcings and calibrated to daily discharge observations at about 50 river gauges over the period 2003-2013. Simulations of hydrologic processes, via rainfall-runoff and routing models, are subsequently performed based on a 10000 year set of stochastic forcing. The rainfall-runoff model is a physically based, continuous, semi-distributed model for catchment hydrology. The 1-D wave propagation hydraulic model considers catchment runoff in routing and describes large-scale transport processes along the river. It also accounts for reservoir storage

  5. Hidden Markov models for estimating animal mortality from anthropogenic hazards

    EPA Science Inventory

    Carcasses searches are a common method for studying the risk of anthropogenic hazards to wildlife, including non-target poisoning and collisions with anthropogenic structures. Typically, numbers of carcasses found must be corrected for scavenging rates and imperfect detection. ...

  6. Closed-loop control of epileptiform activities in a neural population model using a proportional-derivative controller

    NASA Astrophysics Data System (ADS)

    Wang, Jun-Song; Wang, Mei-Li; Li, Xiao-Li; Ernst, Niebur

    2015-03-01

    Epilepsy is believed to be caused by a lack of balance between excitation and inhibition in the brain. A promising strategy for the control of the disease is closed-loop brain stimulation. How to determine the stimulation control parameters for effective and safe treatment protocols remains, however, an unsolved question. To constrain the complex dynamics of the biological brain, we use a neural population model (NPM). We propose that a proportional-derivative (PD) type closed-loop control can successfully suppress epileptiform activities. First, we determine the stability of root loci, which reveals that the dynamical mechanism underlying epilepsy in the NPM is the loss of homeostatic control caused by the lack of balance between excitation and inhibition. Then, we design a PD-type closed-loop controller to stabilize the unstable NPM such that the homeostatic equilibriums are maintained; we show that epileptiform activities are successfully suppressed. A graphical approach is employed to determine the stabilizing region of the PD controller in the parameter space, providing a theoretical guideline for the selection of the PD control parameters. Furthermore, we establish the relationship between the control parameters and the model parameters in the form of stabilizing regions to help understand the mechanism of suppressing epileptiform activities in the NPM. Simulations show that the PD-type closed-loop control strategy can effectively suppress epileptiform activities in the NPM. Project supported by the National Natural Science Foundation of China (Grant Nos. 61473208, 61025019, and 91132722), ONR MURI N000141010278, and NIH grant R01EY016281.

  7. Conceptual geoinformation model of natural hazards risk assessment

    NASA Astrophysics Data System (ADS)

    Kulygin, Valerii

    2016-04-01

    Natural hazards are the major threat to safe interactions between nature and society. The assessment of natural hazard impacts and their consequences is important in spatial planning and resource management. Today there is a challenge to advance our understanding of how socio-economic and climate changes will affect the frequency and magnitude of hydro-meteorological hazards and associated risks. However, the impacts from different types of natural hazards on various marine and coastal economic activities are not of the same type. In this study, the conceptual geomodel of risk assessment is presented to highlight the differentiation by the type of economic activities in extreme events risk assessment. The marine and coastal ecosystems are considered as the objects of management, on the one hand, and as the place of natural hazards' origin, on the other hand. One of the key elements in describing such systems is the spatial characterization of their components. Assessment of ecosystem state is based on ecosystem indicators (indexes), which are used to identify changes over time. The scenario approach is utilized to account for the spatio-temporal dynamics and uncertainty factors. Two types of scenarios are considered: scenarios of using ecosystem services by economic activities and scenarios of extreme events and related hazards. The reported study was funded by RFBR, according to the research project No. 16-35-60043 mol_a_dk.

  8. Standards and Guidelines for Numerical Models for Tsunami Hazard Mitigation

    NASA Astrophysics Data System (ADS)

    Titov, V.; Gonzalez, F.; Kanoglu, U.; Yalciner, A.; Synolakis, C. E.

    2006-12-01

    An increased number of nations around the world need to develop tsunami mitigation plans which invariably involve inundation maps for warning guidance and evacuation planning. There is the risk that inundation maps may be produced with older or untested methodology, as there are currently no standards for modeling tools. In the aftermath of the 2004 megatsunami, some models were used to model inundation for Cascadia events with results much larger than sediment records and existing state-of-the-art studies suggest, leading to confusion among emergency managers. Incorrectly assessing tsunami impact is hazardous, as recent events in 2006 in Tonga, Kythira, Greece and Central Java have suggested (Synolakis and Bernard, 2006). To calculate tsunami currents, forces and runup on coastal structures, and inundation of coastlines, one must numerically calculate the evolution of the tsunami wave from the deep ocean to its target site. No matter what the numerical model, validation (the process of ensuring that the model solves the parent equations of motion accurately) and verification (the process of ensuring that the model used represents geophysical reality appropriately) are both essential. Validation ensures that the model performs well in a wide range of circumstances and is accomplished through comparison with analytical solutions. Verification ensures that the computational code performs well over a range of geophysical problems. A few analytic solutions have been validated themselves with laboratory data. Even fewer existing numerical models have been both validated with the analytical solutions and verified with both laboratory measurements and field measurements, thus establishing a gold standard for numerical codes for inundation mapping. While there is in principle no absolute certainty that a numerical code that has performed well in all the benchmark tests will also produce correct inundation predictions with any given source motions, validated codes

  9. Understanding Recession and Self-Rated Health with the Partial Proportional Odds Model: An Analysis of 26 Countries

    PubMed Central

    Mayer, Adam; Foster, Michelle

    2015-01-01

    Introduction Self-rated health is demonstrated to vary substantially by both personal socio-economic status and national economic conditions. However, studies investigating the combined influence of individual and country level economic indicators across several countries in the context of recent global recession are limited. This paper furthers our knowledge of the effect of recession on health at both the individual and national level. Methods Using the Life in Transition II study, which provides data from 19,759 individuals across 26 European nations, we examine the relationship between self-rated health, personal economic experiences, and macro-economic change. Data analyses include, but are not limited to, the partial proportional odds model, which permits the effect of predictors to vary across different levels of our dependent variable. Results Household experiences with recession, especially a loss of staple good consumption, are associated with lower self-rated health. Most individual-level experiences with recession, such as a job loss, have relatively small negative effects on perceived health; the effect of individual or household economic hardship is strongest in high income nations. Our findings also suggest that macroeconomic growth improves self-rated health in low-income nations but has no effect in high-income nations. Individuals with the greatest probability of “good” self-rated health reside in wealthy countries ($23,910 to $50,870 GNI per capita). Conclusion Both individual and national economic variables are predictive of self-rated health. Personal and household experiences are most consequential for self-rated health in high income nations, while macroeconomic growth is most consequential in low-income nations. PMID:26513660
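
    The partial proportional odds structure mentioned in the Methods can be written as P(Y > j | x) = logistic(τ_j + x_p β + x_np γ_j), where β is common to all thresholds (the proportional part) and γ_j varies by threshold (the non-proportional part). The sketch below evaluates these probabilities in Python with hypothetical coefficients, not estimates from the Life in Transition II analysis.

      # Minimal sketch of partial proportional odds (generalized ordered logit) probabilities.
      # Thresholds and coefficients are hypothetical illustration values.
      import numpy as np

      def logistic(z):
          return 1.0 / (1.0 + np.exp(-z))

      taus = np.array([2.0, 0.5, -1.0])        # thresholds for K = 4 health categories
      beta = -0.6                               # proportional effect (e.g. household hardship)
      gammas = np.array([-0.1, -0.4, -0.8])     # threshold-specific effects (e.g. job loss)

      def category_probabilities(hardship, job_loss):
          exceed = logistic(taus + beta * hardship + gammas * job_loss)   # P(Y > j), j = 1..3
          exceed = np.concatenate(([1.0], exceed, [0.0]))                 # P(Y > 0) = 1, P(Y > K) = 0
          return -np.diff(exceed)                                         # P(Y = j) for j = 1..4

      print(category_probabilities(hardship=1.0, job_loss=0.0))
      print(category_probabilities(hardship=1.0, job_loss=1.0))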

  10. Modelling the costs of natural hazards in games

    NASA Astrophysics Data System (ADS)

    Bostenaru-Dan, M.

    2012-04-01

    City are looked for today, including a development at the University of Torino called SimTorino, which simulates the development of the city in the next 20 years. The connection to another games genre as video games, the board games, will be investigated, since there are games on construction and reconstruction of a cathedral and its tower and a bridge in an urban environment of the middle ages based on the two novels of Ken Follett, "Pillars of the Earth" and "World Without End" and also more recent games, such as "Urban Sprawl" or the Romanian game "Habitat", dealing with the man-made hazard of demolition. A review of these games will be provided based on first hand playing experience. In games like "World without End" or "Pillars of the Earth", just like in the recently popular games of Zynga on social networks, construction management is done through providing "building" an item out of stylised materials, such as "stone", "sand" or more specific ones as "nail". Such approach could be used also for retrofitting buildings for earthquakes, in the series of "upgrade", not just for extension as it is currently in games, and this is what our research is about. "World without End" includes a natural disaster not so analysed today but which was judged by the author as the worst of manhood: the Black Death. The Black Death has effects and costs as well, not only modelled through action cards, but also on the built environment, by buildings remaining empty. On the other hand, games such as "Habitat" rely on role playing, which has been recently recognised as a way to bring games theory to decision making through the so-called contribution of drama, a way to solve conflicts through balancing instead of weighting, and thus related to Analytic Hierarchy Process. The presentation aims to also give hints on how to design a game for the problem of earthquake retrofit, translating the aims of the actors in such a process into role playing. Games are also employed in teaching of urban

  11. Expert elicitation for a national-level volcano hazard model

    NASA Astrophysics Data System (ADS)

    Bebbington, Mark; Stirling, Mark; Cronin, Shane; Wang, Ting; Jolly, Gill

    2016-04-01

    The quantification of volcanic hazard at national level is a vital pre-requisite to placing volcanic risk on a platform that permits meaningful comparison with other hazards such as earthquakes. New Zealand has up to a dozen dangerous volcanoes, with the usual mixed degrees of knowledge concerning their temporal and spatial eruptive history. Information on the 'size' of the eruptions, be it in terms of VEI, volume or duration, is sketchy at best. These limitations and the need for a uniform approach lend themselves to a subjective hazard analysis via expert elicitation. Approximately 20 New Zealand volcanologists provided estimates for the size of the next eruption from each volcano and, conditional on this, its location, timing and duration. Opinions were likewise elicited from a control group of statisticians, seismologists and (geo)chemists, all of whom had at least heard the term 'volcano'. The opinions were combined via the Cooke classical method. We will report on the preliminary results from the exercise.

  12. Quantitative physical models of volcanic phenomena for hazards assessment of critical infrastructures

    NASA Astrophysics Data System (ADS)

    Costa, Antonio

    2016-04-01

    Volcanic hazards may have destructive effects on economy, transport, and natural environments at both local and regional scale. Hazardous phenomena include pyroclastic density currents, tephra fall, gas emissions, lava flows, debris flows and avalanches, and lahars. Volcanic hazards assessment is based on available information to characterize potential volcanic sources in the region of interest and to determine whether specific volcanic phenomena might reach a given site. Volcanic hazards assessment is focussed on estimating the distances that volcanic phenomena could travel from potential sources and their intensity at the considered site. Epistemic and aleatory uncertainties strongly affect the resulting hazards assessment. Within the context of critical infrastructures, volcanic eruptions are rare natural events that can create severe hazards. In addition to being rare events, evidence of many past volcanic eruptions is poorly preserved in the geologic record. The models used for describing the impact of volcanic phenomena generally represent a range of model complexities, from simplified physics based conceptual models to highly coupled thermo fluid dynamical approaches. Modelling approaches represent a hierarchy of complexity, which reflects increasing requirements for well characterized data in order to produce a broader range of output information. In selecting models for the hazard analysis related to a specific phenomenon, questions that need to be answered by the models must be carefully considered. Independently of the model, the final hazards assessment strongly depends on input derived from detailed volcanological investigations, such as mapping and stratigraphic correlations. For each phenomenon, an overview of currently available approaches for the evaluation of future hazards will be presented with the aim to provide a foundation for future work in developing an international consensus on volcanic hazards assessment methods.

  13. The identification and validation process of proportional reasoning attributes: an application of a cognitive diagnosis modeling framework

    NASA Astrophysics Data System (ADS)

    Tjoe, Hartono; de la Torre, Jimmy

    2014-06-01

    In this paper, we discuss the process of identifying and validating students' abilities to think proportionally. More specifically, we describe the methodology we used to identify these proportional reasoning attributes, beginning with the selection and review of relevant literature on proportional reasoning. We then continue with the deliberation and resolution of differing views by mathematics researchers, mathematics educators, and middle school mathematics teachers of what should be learned theoretically and what can be taught practically in everyday classroom settings. We also present the initial development of proportional reasoning items as part of the two-phase validation process of the previously identified attributes. In particular, we detail in the first phase of the validation process our collaboration with middle school mathematics teachers in the creation of prototype items and the verification of each item-attribute specification in consideration of the most common ways (among many different ways) in which middle school students would have solved these prototype items themselves. In the second phase of the validation process, we elaborate our think-aloud interview procedure in the search for evidence of whether students generally solved the prototype items in the way they were expected to.
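
    One common item response function in the cognitive diagnosis framework mentioned above is the DINA model, in which the probability of a correct answer depends on whether a student has mastered all attributes an item requires. The abstract does not state which specific CDM the authors used, so the sketch below is illustrative only, with a hypothetical Q-matrix and slip/guess parameters.

      # Minimal sketch of a DINA-type item response function.
      # Q-matrix, slip, and guess values are hypothetical.
      import numpy as np

      Q = np.array([[1, 0, 1],      # item-by-attribute Q-matrix: which proportional-reasoning
                    [0, 1, 1],      # attributes each item requires
                    [1, 1, 0]])
      slip = np.array([0.10, 0.15, 0.20])    # P(incorrect | all required attributes mastered)
      guess = np.array([0.20, 0.25, 0.10])   # P(correct | some required attribute missing)

      def dina_probabilities(alpha):
          """P(correct answer) on each item for a student with attribute profile alpha."""
          eta = np.all(alpha >= Q, axis=1).astype(float)    # 1 if all required attributes mastered
          return (1 - slip) ** eta * guess ** (1 - eta)

      print(dina_probabilities(np.array([1, 0, 1])))   # masters attributes 1 and 3 only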

  14. Computer models used to support cleanup decision-making at hazardous and radioactive waste sites

    SciTech Connect

    Moskowitz, P.D.; Pardi, R.; DePhillips, M.P.; Meinhold, A.F.

    1992-07-01

    Massive efforts are underway to clean up hazardous and radioactive waste sites located throughout the US. To help determine cleanup priorities, computer models are being used to characterize the source, transport, fate and effects of hazardous chemicals and radioactive materials found at these sites. Although the US Environmental Protection Agency (EPA), the US Department of Energy (DOE), and the US Nuclear Regulatory Commission (NRC) have provided preliminary guidance to promote the use of computer models for remediation purposes, no agency has produced directed guidance on models that must be used in these efforts. To identify what models are actually being used to support decision-making at hazardous and radioactive waste sites, a project jointly funded by EPA, DOE and NRC was initiated. The purpose of this project was to: (1) identify models being used for hazardous and radioactive waste site assessment purposes; and (2) describe and classify these models. This report presents the results of this study.

  15. Strip Diagrams: Illuminating Proportions

    ERIC Educational Resources Information Center

    Cohen, Jessica S.

    2013-01-01

    Proportional reasoning is both complex and layered, making it challenging to define. Lamon (1999) identified characteristics of proportional thinkers, such as being able to understand covariance of quantities; distinguish between proportional and nonproportional relationships; use a variety of strategies flexibly, most of which are nonalgorithmic,…

  16. Modelling the costs of natural hazards in games

    NASA Astrophysics Data System (ADS)

    Bostenaru-Dan, M.

    2012-04-01

    City are looked for today, including a development at the University of Torino called SimTorino, which simulates the development of the city in the next 20 years. The connection to another games genre as video games, the board games, will be investigated, since there are games on construction and reconstruction of a cathedral and its tower and a bridge in an urban environment of the middle ages based on the two novels of Ken Follett, "Pillars of the Earth" and "World Without End" and also more recent games, such as "Urban Sprawl" or the Romanian game "Habitat", dealing with the man-made hazard of demolition. A review of these games will be provided based on first hand playing experience. In games like "World without End" or "Pillars of the Earth", just like in the recently popular games of Zynga on social networks, construction management is done through providing "building" an item out of stylised materials, such as "stone", "sand" or more specific ones as "nail". Such approach could be used also for retrofitting buildings for earthquakes, in the series of "upgrade", not just for extension as it is currently in games, and this is what our research is about. "World without End" includes a natural disaster not so analysed today but which was judged by the author as the worst of manhood: the Black Death. The Black Death has effects and costs as well, not only modelled through action cards, but also on the built environment, by buildings remaining empty. On the other hand, games such as "Habitat" rely on role playing, which has been recently recognised as a way to bring games theory to decision making through the so-called contribution of drama, a way to solve conflicts through balancing instead of weighting, and thus related to Analytic Hierarchy Process. The presentation aims to also give hints on how to design a game for the problem of earthquake retrofit, translating the aims of the actors in such

  17. A time-dependent probabilistic seismic-hazard model for California

    USGS Publications Warehouse

    Cramer, C.H.; Petersen, M.D.; Cao, T.; Toppozada, Tousson R.; Reichle, M.

    2000-01-01

    For the purpose of sensitivity testing and illuminating nonconsensus components of time-dependent models, the California Department of Conservation, Division of Mines and Geology (CDMG) has assembled a time-dependent version of its statewide probabilistic seismic hazard (PSH) model for California. The model incorporates available consensus information from within the earth-science community, except for a few faults or fault segments where consensus information is not available. For these latter faults, published information has been incorporated into the model. As in the 1996 CDMG/U.S. Geological Survey (USGS) model, the time-dependent models incorporate three multisegment ruptures: a 1906, an 1857, and a southern San Andreas earthquake. Sensitivity tests are presented to show the effect on hazard and expected damage estimates of (1) intrinsic (aleatory) sigma, (2) multisegment (cascade) vs. independent segment (no cascade) ruptures, and (3) time-dependence vs. time-independence. Results indicate that (1) differences in hazard and expected damage estimates between time-dependent and independent models increase with decreasing intrinsic sigma, (2) differences in hazard and expected damage estimates between full cascading and not cascading are insensitive to intrinsic sigma, (3) differences in hazard increase with increasing return period (decreasing probability of occurrence), and (4) differences in moment-rate budgets increase with decreasing intrinsic sigma and with the degree of cascading, but are within the expected uncertainty in PSH time-dependent modeling and do not always significantly affect hazard and expected damage estimates.

  18. A Remote Sensing Based Approach For Modeling and Assessing Glacier Hazards

    NASA Astrophysics Data System (ADS)

    Huggel, C.; Kääb, A.; Salzmann, N.; Haeberli, W.; Paul, F.

    Glacier-related hazards such as ice avalanches and glacier lake outbursts can pose a significant threat to population and installations in high mountain regions. They are well documented in the Swiss Alps and the high data density is used to build up systematic knowledge of glacier hazard locations and potentials. Experiences from long research activities thereby form an important basis for ongoing hazard monitoring and assessment. However, in the context of environmental changes in general, and the highly dynamic physical environment of glaciers in particular, historical experience may increasingly lose its significance with respect to the impact zones of hazardous processes. On the other hand, in large and remote high mountains such as the Himalayas, exact information on the location and potential of glacier hazards is often missing. Therefore, it is crucial to develop hazard monitoring and assessment concepts including area-wide applications. Remote sensing techniques offer a powerful tool to narrow current information gaps. The present contribution proposes an approach structured in (1) detection, (2) evaluation and (3) modeling of glacier hazards. Remote sensing data is used as the main input to (1). Algorithms taking advantage of multispectral, high-resolution data are applied for detecting glaciers and glacier lakes. Digital terrain modeling, and classification and fusion of panchromatic and multispectral satellite imagery, are performed in (2) to evaluate the hazard potential of possible hazard sources detected in (1). The locations found in (1) and (2) are used as input to (3). The models developed in (3) simulate the processes of lake outbursts and ice avalanches based on hydrological flow modeling and empirical values for average trajectory slopes. A probability-related function allows the model to indicate areas with lower and higher risk of being affected by catastrophic events. Application of the models for recent ice avalanches and lake outbursts show

  19. Comparison of the historical record of earthquake hazard with seismic-hazard models for New Zealand and the continental United States

    USGS Publications Warehouse

    Stirling, M.; Petersen, M.

    2006-01-01

    We compare the historical record of earthquake hazard experienced at 78 towns and cities (sites) distributed across New Zealand and the continental United States with the hazard estimated from the national probabilistic seismic-hazard (PSH) models for the two countries. The two PSH models are constructed with similar methodologies and data. Our comparisons show a tendency for the PSH models to slightly exceed the historical hazard in New Zealand and westernmost continental United States interplate regions, but show lower hazard than that of the historical record in the continental United States intraplate region. Factors such as non-Poissonian behavior, parameterization of active fault data in the PSH calculations, and uncertainties in estimation of ground-motion levels from historical felt intensity data for the interplate regions may have led to the higher-than-historical levels of hazard at the interplate sites. In contrast, the less-than-historical hazard for the remaining continental United States (intraplate) sites may be largely due to site conditions not having been considered at the intraplate sites, and uncertainties in correlating ground-motion levels to historical felt intensities. The study also highlights the importance of evaluating PSH models at more than one region, because the conclusions reached on the basis of a solely interplate or intraplate study would be very different.

  20. Potential of weight of evidence modelling for gully erosion hazard assessment in Mbire District - Zimbabwe

    NASA Astrophysics Data System (ADS)

    Dube, F.; Nhapi, I.; Murwira, A.; Gumindoga, W.; Goldin, J.; Mashauri, D. A.

    Gully erosion is an environmental concern particularly in areas where landcover has been modified by human activities. This study assessed the extent to which the potential of gully erosion could be successfully modelled as a function of seven environmental factors (landcover, soil type, distance from river, distance from road, Sediment Transport Index (STI), Stream Power Index (SPI) and Wetness Index (WI)) using a GIS-based Weight of Evidence Modelling (WEM) in the Mbire District of Zimbabwe. Results show that of the seven studied factors affecting gully erosion, five were significantly correlated (p < 0.05) to gully occurrence, namely landcover, soil type, distance from river, STI and SPI. Two factors, WI and distance from road, were not significantly correlated to gully occurrence (p > 0.05). A gully erosion hazard map showed that 78% of the very high hazard class area is within a distance of 250 m from rivers. Model validation indicated that 70% of the validation set of gullies fell in the high hazard and very high hazard classes. The resulting map of areas susceptible to gully erosion has a prediction accuracy of 67.8%. The predictive capability of the weight of evidence model in this study suggests that landcover, soil type, distance from river, STI and SPI are useful in creating a gully erosion hazard map but may not be sufficient to produce a valid map of gully erosion hazard.
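
    For readers unfamiliar with weight-of-evidence modelling, the sketch below shows the standard positive/negative weight calculation for a single evidence class. The cell counts are hypothetical and not taken from the Mbire dataset.

      import math

      def weights_of_evidence(n_class_with_gully, n_class, n_gully_total, n_total):
          """Positive/negative weights for one evidence class (e.g. a landcover
          type), following the standard weight-of-evidence formulation.
          Counts are grid cells; a small epsilon avoids log(0)."""
          eps = 0.5
          p_b_d  = (n_class_with_gully + eps) / (n_gully_total + eps)                       # P(B|D)
          p_b_nd = (n_class - n_class_with_gully + eps) / (n_total - n_gully_total + eps)   # P(B|~D)
          w_plus  = math.log(p_b_d / p_b_nd)
          w_minus = math.log((1 - p_b_d) / (1 - p_b_nd))
          return w_plus, w_minus, w_plus - w_minus   # contrast = W+ - W-

      # Hypothetical counts: 120 of 400 gully cells fall on bare soil,
      # which covers 2,000 of the 50,000 cells in the study area.
      print(weights_of_evidence(120, 2000, 400, 50000))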

  1. [Hazard evaluation modeling of particulate matters emitted by coal-fired boilers and case analysis].

    PubMed

    Shi, Yan-Ting; Du, Qian; Gao, Jian-Min; Bian, Xin; Wang, Zhi-Pu; Dong, He-Ming; Han, Qiang; Cao, Yang

    2014-02-01

    In order to evaluate the hazard of PM2.5 emitted by various boilers, in this paper, segmentation of particulate matter with sizes below 2.5 μm was performed based on the formation mechanisms and the hazard level to human beings and the environment. Meanwhile, taking into account the mass concentration, number concentration, enrichment factor of Hg, and content of Hg element in different coal ashes, a comprehensive model aimed at evaluating the hazard of PM2.5 emitted by coal-fired boilers was established in this paper. Finally, using field experimental data from the previous literature, a case analysis of the evaluation model was conducted, and the concept of a hazard reduction coefficient was proposed, which can be used to evaluate the performance of dust removers.

  2. Snakes as hazards: modelling risk by chasing chimpanzees.

    PubMed

    McGrew, William C

    2015-04-01

    Snakes are presumed to be hazards to primates, including humans, by the snake detection hypothesis (Isbell in J Hum Evol 51:1-35, 2006; Isbell, The fruit, the tree, and the serpent. Why we see so well, 2009). Quantitative, systematic data to test this idea are lacking for the behavioural ecology of living great apes and human foragers. An alternative proxy is snakes encountered by primatologists seeking, tracking, and observing wild chimpanzees. We present 4 years of such data from Mt. Assirik, Senegal. We encountered 14 species of snakes a total of 142 times. Almost two-thirds of encounters were with venomous snakes. Encounters occurred most often in forest and least often in grassland, and more often in the dry season. The hypothesis seems to be supported, if frequency of encounter reflects selective risk of morbidity or mortality.

  3. Probabilistic seismic hazard study based on active fault and finite element geodynamic models

    NASA Astrophysics Data System (ADS)

    Kastelic, Vanja; Carafa, Michele M. C.; Visini, Francesco

    2016-04-01

    We present a probabilistic seismic hazard analysis (PSHA) that is based exclusively on active faults and geodynamic finite element input models, whereas seismic catalogues were used only in a posterior comparison. We applied the developed model in the External Dinarides, a slow-deforming thrust-and-fold belt at the contact between Adria and Eurasia. Our method consists of establishing two earthquake rupture forecast models: (i) a geological active fault input (GEO) model and (ii) a finite element (FEM) model. The GEO model is based on an active fault database that provides information on fault location and its geometric and kinematic parameters together with estimations of its slip rate. By default, in this model all deformation is set to be released along the active faults. The FEM model is based on a numerical geodynamic model developed for the region of study. In this model the deformation is released not only along the active faults but also in the volumetric continuum elements. From both models we calculated the corresponding activity rates, earthquake rates and final expected peak ground accelerations. We investigated both the source model and the earthquake model uncertainties by varying the main active fault and earthquake rate calculation parameters and constructing corresponding branches of the seismic hazard logic tree. Hazard maps and UHS curves have been produced for horizontal ground motion on bedrock conditions (VS30 ≥ 800 m/s), thereby not considering local site amplification effects. The hazard was computed over a 0.2° spaced grid considering 648 branches of the logic tree and the mean value of the 10% probability of exceedance in 50 years hazard level, while the 5th and 95th percentiles were also computed to investigate the model limits. We conducted a sensitivity analysis to determine which of the input parameters influence the final hazard results, and to what extent. The results of such comparison evidence the deformation model and

  4. Teamwork tools and activities within the hazard component of the Global Earthquake Model

    NASA Astrophysics Data System (ADS)

    Pagani, M.; Weatherill, G.; Monelli, D.; Danciu, L.

    2013-05-01

    The Global Earthquake Model (GEM) is a public-private partnership aimed at supporting and fostering a global community of scientists and engineers working in the fields of seismic hazard and risk assessment. In the hazard sector, in particular, GEM recognizes the importance of local ownership and leadership in the creation of seismic hazard models. For this reason, over the last few years, GEM has been promoting different activities in the context of seismic hazard analysis ranging, for example, from regional projects targeted at the creation of updated seismic hazard studies to the development of a new open-source seismic hazard and risk calculation software called OpenQuake-engine (http://globalquakemodel.org). In this communication we will provide a tour of the various activities completed, such as the new ISC-GEM Global Instrumental Catalogue, and of currently ongoing initiatives, such as the development of a suite of tools for building PSHA input models. Discussion, comments and criticism from colleagues in the audience will be highly appreciated.

  5. Coincidence Proportional Counter

    DOEpatents

    Manley, J H

    1950-11-21

    A coincidence proportional counter having a plurality of collecting electrodes so disposed as to measure the range or energy spectrum of an ionizing particle-emitting source such as an alpha source, is disclosed.

  6. Estimating piecewise exponential frailty model with changing prior for baseline hazard function

    NASA Astrophysics Data System (ADS)

    Thamrin, Sri Astuti; Lawi, Armin

    2016-02-01

    Piecewise exponential models provide a very flexible framework for modelling univariate survival data. They can be used to estimate the effects of different covariates that influence the survival times. Although in a strict sense it is a parametric model, a piecewise exponential hazard can approximate any shape of a parametric baseline hazard. In the parametric baseline hazard, the hazard function for each individual may depend on a set of risk factors or explanatory variables. However, it usually does not include all such variables that are known or measurable, and these variables become interesting to consider. This unknown and unobservable risk factor of the hazard function is often termed the individual's heterogeneity or frailty. This paper analyses the effects of unobserved population heterogeneity in patients' survival times. The issue of model choice through variable selection is also considered. A sensitivity analysis is conducted to assess the influence of the prior for each parameter. We used the Markov Chain Monte Carlo method to compute the Bayesian estimator on kidney infection data. The results obtained show that sex and frailty are substantially associated with survival in this study and that the models are quite sensitive to the choice of the two different priors.
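
    A minimal sketch of the piecewise exponential idea, assuming hypothetical cut points and hazard rates; a multiplicative frailty z would simply scale the cumulative hazard, giving S(t)**z.

      import numpy as np

      def piecewise_exp_survival(t, cut_points, hazards):
          """Survival function S(t) = exp(-H(t)) for a piecewise-constant
          baseline hazard. cut_points are interval boundaries (ascending,
          after time 0); hazards has one rate per interval."""
          edges = np.concatenate(([0.0], cut_points, [np.inf]))
          cum = 0.0
          for lam, lo, hi in zip(hazards, edges[:-1], edges[1:]):
              cum += lam * max(0.0, min(t, hi) - lo)
              if t <= hi:
                  break
          return np.exp(-cum)

      # Hypothetical baseline: hazard 0.02/day for the first 30 days, then 0.005/day.
      print(piecewise_exp_survival(60, np.array([30.0]), np.array([0.02, 0.005])))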

  7. Applying the Land Use Portfolio Model with Hazus to analyse risk from natural hazard events

    USGS Publications Warehouse

    Dinitz, Laura B.; Taketa, Richard A.

    2013-01-01

    This paper describes and demonstrates the integration of two geospatial decision-support systems for natural-hazard risk assessment and management. Hazus is a risk-assessment tool developed by the Federal Emergency Management Agency to identify risks and estimate the severity of risk from natural hazards. The Land Use Portfolio Model (LUPM) is a risk-management tool developed by the U.S. Geological Survey to evaluate plans or actions intended to reduce risk from natural hazards. We analysed three mitigation policies for one earthquake scenario in the San Francisco Bay area to demonstrate the added value of using Hazus and the LUPM together. The demonstration showed that Hazus loss estimates can be input to the LUPM to obtain estimates of losses avoided through mitigation, rates of return on mitigation investment, and measures of uncertainty. Together, they offer a more comprehensive approach to help with decisions for reducing risk from natural hazards.

  8. The influence of hazard models on GIS-based regional risk assessments and mitigation policies

    USGS Publications Warehouse

    Bernknopf, R.L.; Rabinovici, S.J.M.; Wood, N.J.; Dinitz, L.B.

    2006-01-01

    Geographic information systems (GIS) are important tools for understanding and communicating the spatial distribution of risks associated with natural hazards in regional economies. We present a GIS-based decision support system (DSS) for assessing community vulnerability to natural hazards and evaluating potential mitigation policy outcomes. The Land Use Portfolio Modeler (LUPM) integrates earth science and socioeconomic information to predict the economic impacts of loss-reduction strategies. However, the potential use of such systems in decision making may be limited when multiple but conflicting interpretations of the hazard are available. To explore this problem, we conduct a policy comparison using the LUPM to test the sensitivity of three available assessments of earthquake-induced lateral-spread ground failure susceptibility in a coastal California community. We find that the uncertainty regarding the interpretation of the science inputs can influence the development and implementation of natural hazard management policies. Copyright © 2006 Inderscience Enterprises Ltd.

  9. Development and Analysis of a Hurricane Hazard Model for Disaster Risk Assessment in Central America

    NASA Astrophysics Data System (ADS)

    Pita, G. L.; Gunasekera, R.; Ishizawa, O. A.

    2014-12-01

    Hurricane and tropical storm activity in Central America has consistently caused thousands of casualties, significant population displacement, and substantial property and infrastructure losses over the past decades. As a component to estimate future potential losses, we present a new regional probabilistic hurricane hazard model for Central America. Currently, there are very few openly available hurricane hazard models for Central America. The resulting hazard model will be used in conjunction with exposure and vulnerability components as part of a World Bank project to create country disaster risk profiles that will help improve risk estimation and provide decision makers with better tools to quantify disaster risk. This paper describes the hazard model methodology, which involves the development of a wind field model that simulates the gust speeds at terrain height at a fine resolution. The HURDAT dataset has been used in this study to create synthetic events that assess average hurricane landfall angles and their variability at each location. The hazard model then estimates the average track angle at multiple geographical locations in order to provide a realistic range of possible hurricane paths that will be used for risk analyses in all the Central-American countries. This probabilistic hurricane hazard model is also useful for relating synthetic wind estimates to loss and damage data to develop and calibrate existing empirical building vulnerability curves. To assess the accuracy and applicability, modeled results are evaluated against historical events, their tracks and wind fields. Deeper analyses of results are also presented with a special reference to Guatemala. The findings, interpretations, and conclusions expressed in this paper are entirely those of the authors. They do not necessarily represent the views of the International Bank for Reconstruction and Development/World Bank and its affiliated organizations, or those of the

  10. Computer models used to support cleanup decision-making at hazardous and radioactive waste sites

    SciTech Connect

    Moskowitz, P.D.; Pardi, R.; DePhillips, M.P.; Meinhold, A.F.

    1992-12-01

    Massive efforts are underway to clean up hazardous and radioactive waste sites located throughout the United States. To help determine cleanup priorities, computer models are being used to characterize the source, transport, fate, and effects of hazardous chemicals and radioactive materials found at these sites. Although the US Environmental Protection Agency (EPA), the US Department of Energy (DOE), and the US Nuclear Regulatory Commission (NRC) have provided preliminary guidance to promote the use of computer models for remediation purposes, no agency has produced directed guidance on models that must be used in these efforts. As a result, model selection is currently done on an ad hoc basis. This is administratively ineffective and costly, and can also result in technically inconsistent decision-making. To identify what models are actually being used to support decision-making at hazardous and radioactive waste sites, a project jointly funded by EPA, DOE, and NRC was initiated. The purpose of this project was to: (1) identify models being used for hazardous and radioactive waste site assessment purposes; and (2) describe and classify these models. This report presents the results of this study. A mail survey was conducted to identify models in use. The survey was sent to approximately 550 persons engaged in the cleanup of hazardous and radioactive waste sites; 87 individuals responded. They represented organizations including federal agencies, national laboratories, and contractor organizations. The respondents identified 127 computer models that were being used to help support cleanup decision-making. There were a few models that appeared to be used across a large number of sites (e.g., RESRAD). In contrast, the survey results also suggested that most sites were using models which were not reported in use elsewhere. Information is presented on the types of models being used and the characteristics of the models in use.

  11. Three multimedia models used at hazardous and radioactive waste sites

    SciTech Connect

    Moskowitz, P.D.; Pardi, R.; Fthenakis, V.M.; Holtzman, S.; Sun, L.C.; Rambaugh, J.O.; Potter, S.

    1996-02-01

    Multimedia models are used commonly in the initial phases of the remediation process where technical interest is focused on determining the relative importance of various exposure pathways. This report provides an approach for evaluating and critically reviewing the capabilities of multimedia models. This study focused on three specific models: MEPAS Version 3.0, MMSOILS Version 2.2, and PRESTO-EPA-CPG Version 2.0. These models evaluate the transport and fate of contaminants from source to receptor through more than a single pathway. The presence of radioactive and mixed wastes at a site poses special problems. Hence, in this report, restrictions associated with the selection and application of multimedia models for sites contaminated with radioactive and mixed wastes are highlighted. This report begins with a brief introduction to the concept of multimedia modeling, followed by an overview of the three models. The remaining chapters present more technical discussions of the issues associated with each compartment and their direct application to the specific models. In these analyses, the following components are discussed: source term; air transport; ground water transport; overland flow, runoff, and surface water transport; food chain modeling; exposure assessment; dosimetry/risk assessment; uncertainty; default parameters. The report concludes with a description of evolving updates to the model; these descriptions were provided by the model developers.

  12. New Activities of the U.S. National Tsunami Hazard Mitigation Program, Mapping and Modeling Subcommittee

    NASA Astrophysics Data System (ADS)

    Wilson, R. I.; Eble, M. C.

    2013-12-01

    The U.S. National Tsunami Hazard Mitigation Program (NTHMP) comprises representatives from coastal states and federal agencies who, under the guidance of NOAA, work together to develop protocols and products to help communities prepare for and mitigate tsunami hazards. Within the NTHMP are several subcommittees responsible for complementary aspects of tsunami assessment, mitigation, education, warning, and response. The Mapping and Modeling Subcommittee (MMS) comprises state and federal scientists who specialize in tsunami source characterization, numerical tsunami modeling, inundation map production, and warning forecasting. Until September 2012, much of the work of the MMS was authorized through the Tsunami Warning and Education Act, an Act that has since expired but the spirit of which is being adhered to in parallel with reauthorization efforts. Over the past several years, the MMS has developed guidance and best practices for states and territories to produce accurate and consistent tsunami inundation maps for community level evacuation planning, and has conducted benchmarking of numerical inundation models. Recent tsunami events have highlighted the need for other types of tsunami hazard analyses and products for improving evacuation planning, vertical evacuation, maritime planning, land-use planning, building construction, and warning forecasts. As the program responsible for producing accurate and consistent tsunami products nationally, the NTHMP-MMS is initiating a multi-year plan to accomplish the following: 1) Create and build on existing demonstration projects that explore new tsunami hazard analysis techniques and products, such as maps identifying areas of strong currents and potential damage within harbors as well as probabilistic tsunami hazard analysis for land-use planning. 2) Develop benchmarks for validating new numerical modeling techniques related to current velocities and landslide sources. 3) Generate guidance and protocols for

  13. Modeling Exposure to Persistent Chemicals in Hazard and Risk Assessment

    SciTech Connect

    Cowan-Ellsberry, Christina E.; McLachlan, Michael S.; Arnot, Jon A.; MacLeod, Matthew; McKone, Thomas E.; Wania, Frank

    2008-11-01

    Fate and exposure modeling has not thus far been explicitly used in the risk profile documents prepared to evaluate significant adverse effect of candidate chemicals for either the Stockholm Convention or the Convention on Long-Range Transboundary Air Pollution. However, we believe models have considerable potential to improve the risk profiles. Fate and exposure models are already used routinely in other similar regulatory applications to inform decisions, and they have been instrumental in building our current understanding of the fate of POP and PBT chemicals in the environment. The goal of this paper is to motivate the use of fate and exposure models in preparing risk profiles in the POP assessment procedure by providing strategies for incorporating and using models. The ways that fate and exposure models can be used to improve and inform the development of risk profiles include: (1) Benchmarking the ratio of exposure and emissions of candidate chemicals to the same ratio for known POPs, thereby opening the possibility of combining this ratio with the relative emissions and relative toxicity to arrive at a measure of relative risk. (2) Directly estimating the exposure of the environment, biota and humans to provide information to complement measurements, or where measurements are not available or are limited. (3) To identify the key processes and chemical and/or environmental parameters that determine the exposure; thereby allowing the effective prioritization of research or measurements to improve the risk profile. (4) Predicting future time trends including how quickly exposure levels in remote areas would respond to reductions in emissions. Currently there is no standardized consensus model for use in the risk profile context. Therefore, to choose the appropriate model the risk profile developer must evaluate how appropriate an existing model is for a specific setting and whether the assumptions and input data are relevant in the context of the application

  14. Adaptation through proportion

    NASA Astrophysics Data System (ADS)

    Xiong, Liyang; Shi, Wenjia; Tang, Chao

    2016-08-01

    Adaptation is a ubiquitous feature in biological sensory and signaling networks. It has been suggested that adaptive systems may follow certain simple design principles across diverse organisms, cells and pathways. One class of networks that can achieve adaptation utilizes an incoherent feedforward control, in which two parallel signaling branches exert opposite but proportional effects on the output at steady state. In this paper, we generalize this adaptation mechanism by establishing a steady-state proportionality relationship among a subset of nodes in a network. Adaptation can be achieved by using any two nodes in the sub-network to respectively regulate the output node positively and negatively. We focus on enzyme networks and first identify basic regulation motifs consisting of two and three nodes that can be used to build small networks with proportional relationships. Larger proportional networks can then be constructed modularly similar to LEGOs. Our method provides a general framework to construct and analyze a class of proportional and/or adaptation networks with arbitrary size, flexibility and versatile functional features.
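
    A toy numerical illustration of the incoherent feedforward mechanism described above, with made-up kinetics chosen only so that the output is proportional to the input through the buffer node and therefore adapts to step changes; none of the rate forms are taken from the paper.

      import numpy as np

      def simulate_iffl(input_signal, dt=0.01, t_end=40.0):
          """Minimal incoherent feedforward sketch: an input I activates both a
          buffer B and the output O, while B represses O. At steady state
          O ~ I / B ~ constant, so O adapts to step changes in I."""
          n = int(t_end / dt)
          t = np.linspace(0.0, t_end, n)
          B = np.zeros(n); O = np.zeros(n)
          B[0], O[0] = 1.0, 1.0
          for i in range(1, n):
              I = input_signal(t[i])
              dB = I - B[i-1]                       # B tracks the input
              dO = I / max(B[i-1], 1e-9) - O[i-1]   # O driven by I, repressed by B
              B[i] = B[i-1] + dt * dB
              O[i] = O[i-1] + dt * dO
          return t, B, O

      # Step the input from 1 to 3 at t = 10: O spikes transiently, then
      # returns toward its pre-step level (perfect adaptation in this toy model).
      t, B, O = simulate_iffl(lambda time: 1.0 if time < 10 else 3.0)
      print(round(O[-1], 2))  # close to 1.0 again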

  15. The Balance-Scale Task Revisited: A Comparison of Statistical Models for Rule-Based and Information-Integration Theories of Proportional Reasoning.

    PubMed

    Hofman, Abe D; Visser, Ingmar; Jansen, Brenda R J; van der Maas, Han L J

    2015-01-01

    We propose and test three statistical models for the analysis of children's responses to the balance scale task, a seminal task to study proportional reasoning. We use a latent class modelling approach to formulate a rule-based latent class model (RB LCM) following from a rule-based perspective on proportional reasoning and a new statistical model, the Weighted Sum Model, following from an information-integration approach. Moreover, a hybrid LCM using item covariates is proposed, combining aspects of both a rule-based and information-integration perspective. These models are applied to two different datasets, a standard paper-and-pencil test dataset (N = 779), and a dataset collected within an online learning environment that included direct feedback, time-pressure, and a reward system (N = 808). For the paper-and-pencil dataset the RB LCM resulted in the best fit, whereas for the online dataset the hybrid LCM provided the best fit. The standard paper-and-pencil dataset yielded more evidence for distinct solution rules than the online data set in which quantitative item characteristics are more prominent in determining responses. These results shed new light on the discussion on sequential rule-based and information-integration perspectives of cognitive development. PMID:26505905

  16. The Balance-Scale Task Revisited: A Comparison of Statistical Models for Rule-Based and Information-Integration Theories of Proportional Reasoning

    PubMed Central

    Hofman, Abe D.; Visser, Ingmar; Jansen, Brenda R. J.; van der Maas, Han L. J.

    2015-01-01

    We propose and test three statistical models for the analysis of children’s responses to the balance scale task, a seminal task to study proportional reasoning. We use a latent class modelling approach to formulate a rule-based latent class model (RB LCM) following from a rule-based perspective on proportional reasoning and a new statistical model, the Weighted Sum Model, following from an information-integration approach. Moreover, a hybrid LCM using item covariates is proposed, combining aspects of both a rule-based and information-integration perspective. These models are applied to two different datasets, a standard paper-and-pencil test dataset (N = 779), and a dataset collected within an online learning environment that included direct feedback, time-pressure, and a reward system (N = 808). For the paper-and-pencil dataset the RB LCM resulted in the best fit, whereas for the online dataset the hybrid LCM provided the best fit. The standard paper-and-pencil dataset yielded more evidence for distinct solution rules than the online data set in which quantitative item characteristics are more prominent in determining responses. These results shed new light on the discussion on sequential rule-based and information-integration perspectives of cognitive development. PMID:26505905

  17. The Framework of a Coastal Hazards Model - A Tool for Predicting the Impact of Severe Storms

    USGS Publications Warehouse

    Barnard, Patrick L.; O'Reilly, Bill; van Ormondt, Maarten; Elias, Edwin; Ruggiero, Peter; Erikson, Li H.; Hapke, Cheryl; Collins, Brian D.; Guza, Robert T.; Adams, Peter N.; Thomas, Julie

    2009-01-01

    The U.S. Geological Survey (USGS) Multi-Hazards Demonstration Project in Southern California (Jones and others, 2007) is a five-year project (FY2007-FY2011) integrating multiple USGS research activities with the needs of external partners, such as emergency managers and land-use planners, to produce products and information that can be used to create more disaster-resilient communities. The hazards being evaluated include earthquakes, landslides, floods, tsunamis, wildfires, and coastal hazards. For the Coastal Hazards Task of the Multi-Hazards Demonstration Project in Southern California, the USGS is leading the development of a modeling system for forecasting the impact of winter storms threatening the entire Southern California shoreline from Pt. Conception to the Mexican border. The modeling system, run in real-time or with prescribed scenarios, will incorporate atmospheric information (that is, wind and pressure fields) with a suite of state-of-the-art physical process models (that is, tide, surge, and wave) to enable detailed prediction of currents, wave height, wave runup, and total water levels. Additional research-grade predictions of coastal flooding, inundation, erosion, and cliff failure will also be performed. Initial model testing, performance evaluation, and product development will be focused on a severe winter-storm scenario developed in collaboration with the Winter Storm Working Group of the USGS Multi-Hazards Demonstration Project in Southern California. Additional offline model runs and products will include coastal-hazard hindcasts of selected historical winter storms, as well as additional severe winter-storm simulations based on statistical analyses of historical wave and water-level data. The coastal-hazards model design will also be appropriate for simulating the impact of storms under various sea level rise and climate-change scenarios. The operational capabilities of this modeling system are designed to provide emergency planners with

  18. Modelling in infectious diseases: between haphazard and hazard.

    PubMed

    Neuberger, A; Paul, M; Nizar, A; Raoult, D

    2013-11-01

    Modelling of infectious diseases is difficult, if not impossible. No epidemic has ever been truly predicted, rather than being merely noticed when it was already ongoing. Modelling the future course of an epidemic is similarly tenuous, as exemplified by ominous predictions during the last influenza pandemic leading to exaggerated national responses. The continuous evolution of microorganisms, the introduction of new pathogens into the human population and the interactions of a specific pathogen with the environment, vectors, intermediate hosts, reservoir animals and other microorganisms are far too complex to be predictable. Our environment is changing at an unprecedented rate, and human-related factors, which are essential components of any epidemic prediction model, are difficult to foresee in our increasingly dynamic societies. Any epidemiological model is, by definition, an abstraction of the real world, and fundamental assumptions and simplifications are therefore required. Indicator-based surveillance methods and, more recently, Internet biosurveillance systems can detect and monitor outbreaks of infections more rapidly and accurately than ever before. As the interactions between microorganisms, humans and the environment are too numerous and unexpected to be accurately represented in a mathematical model, we argue that prediction and model-based management of epidemics in their early phase are quite unlikely to become the norm. PMID:23879334

  19. Efficient pan-European flood hazard modelling through a combination of statistical and physical models

    NASA Astrophysics Data System (ADS)

    Paprotny, Dominik; Morales Nápoles, Oswaldo

    2016-04-01

    Low-resolution hydrological models are often applied to calculate extreme river discharges and delineate flood zones on continental and global scales. Still, the computational expense is very large and often limits the extent and depth of such studies. Here, we present a quick yet similarly accurate procedure for flood hazard assessment in Europe. Firstly, a statistical model based on Bayesian Networks is used. It describes the joint distribution of annual maxima of daily discharges of European rivers with variables describing the geographical characteristics of their catchments. It was quantified with 75,000 station-years of river discharge, as well as climate, terrain and land use data. The model's predictions of average annual maxima or discharges with certain return periods are of similar performance to physical rainfall-runoff models applied at continental scale. A database of discharge scenarios - return periods under present and future climate - was prepared for the majority of European rivers. Secondly, those scenarios were used as boundary conditions for the one-dimensional (1D) hydrodynamic model SOBEK. Utilizing 1D instead of 2D modelling conserved computational time, yet gave satisfactory results. The resulting pan-European flood map was contrasted with some local high-resolution studies. Indeed, the comparison shows that, overall, the methods presented here gave similar or better alignment with local studies than a previously released pan-European flood map.
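
    The Bayesian-network step itself is not reproduced here; the sketch below shows the simpler end of the same task, estimating the discharge for a given return period from annual maxima, using a Gumbel fit as a stand-in. The synthetic record and parameter values are hypothetical.

      import numpy as np
      from scipy import stats

      def discharge_for_return_period(annual_maxima, return_period_years):
          """Fit a Gumbel (EV1) distribution to annual maximum discharges and
          return the discharge with the given return period, i.e. the
          (1 - 1/T) quantile. A stand-in for the statistical-model step."""
          loc, scale = stats.gumbel_r.fit(annual_maxima)
          return stats.gumbel_r.ppf(1.0 - 1.0 / return_period_years, loc=loc, scale=scale)

      # Hypothetical 60-year record of annual maxima (m^3/s) at one gauge.
      series = stats.gumbel_r.rvs(loc=800, scale=150, size=60, random_state=0)
      print(round(discharge_for_return_period(series, 100)))  # ~100-year discharge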

  20. Comparing the European (SHARE) and the reference Italian seismic hazard models

    NASA Astrophysics Data System (ADS)

    Visini, Francesco; Meletti, Carlo; D'Amico, Vera; Rovida, Andrea; Stucchi, Massimiliano

    2016-04-01

    A probabilistic seismic hazard evaluation for Europe has been recently released by the SHARE project (www.share-eu.org, Giardini et al., 2013; Woessner et al., 2015). A comparison between SHARE results for Italy and the official Italian seismic hazard model (MPS04, Stucchi et al., 2011), currently adopted by the building code, has been carried out to identify the main input elements that produce the differences between the two models. The SHARE model shows increased expected values (up to 70%) with respect to the MPS04 model for PGA with 10% probability of exceedance in 50 years. However, looking in detail at all output parameters of both the models, we observe that for spectral periods greater than 0.3 s, the reference PSHA for Italy proposes higher values than the SHARE model over many large areas. This behaviour is mainly guided by the adoption of recent ground-motion prediction equations (GMPEs) that estimate higher values for PGA and for accelerations with periods lower than 0.3 s and lower values for higher periods with respect to older GMPEs used in MPS04. Another important set of tests consisted in analyzing separately the PSHA results obtained by the three source models adopted in SHARE (i.e., area sources, fault sources with background, and a refined smoothed seismicity model), whereas MPS04 only used area sources. Results show that, besides the strong impact of the GMPEs, the differences in the seismic hazard estimates among the three source models are relevant and, in particular, for some selected test sites, the fault-based model returns the lowest estimates of seismic hazard. This result raises questions about the completeness of the fault database, its parameterization and assessment of activity rates, as well as about the impact of the threshold magnitude between faults and background. Giardini D. et al., 2013. Seismic Hazard Harmonization in Europe (SHARE): Online Data Resource, doi:10.12686/SED-00000001-SHARE. Stucchi M. et al., 2011. Seismic Hazard

  1. The Integrated Nursing Pathway: An Innovative Collaborative Model to Increase the Proportion of Baccalaureate-Prepared Nurses.

    PubMed

    Goode, Colleen J; Preheim, Gayle J; Bonini, Susan; Case, Nancy K; VanderMeer, Jennifer; Iannelli, Gina

    2016-01-01

    This manuscript describes a collaborative, seamless program between a community college and a university college of nursing designed to increase the number of nurses prepared with a baccalaureate degree. The three-year Integrated Nursing Pathway provides community college students with a non-nursing associate degree, early introduction to nursing, and seamless progression through BSN education. The model includes dual admission and advising and is driven by the need for collaboration with community colleges, the need to increase the percentage of racial-ethnic minority students, the shortage of faculty, and employer preferences for BSN graduates. PMID:27209872

  2. Modeling and Prediction of Wildfire Hazard in Southern California, Integration of Models with Imaging Spectrometry

    NASA Technical Reports Server (NTRS)

    Roberts, Dar A.; Church, Richard; Ustin, Susan L.; Brass, James A. (Technical Monitor)

    2001-01-01

    Large urban wildfires throughout southern California have caused billions of dollars of damage and significant loss of life over the last few decades. Rapid urban growth along the wildland interface, high fuel loads and a potential increase in the frequency of large fires due to climatic change suggest that the problem will worsen in the future. Improved fire spread prediction and reduced uncertainty in assessing fire hazard would be significant, both economically and socially. Current problems in the modeling of fire spread include the role of plant community differences, spatial heterogeneity in fuels and spatio-temporal changes in fuels. In this research, we evaluated the potential of Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) and Airborne Synthetic Aperture Radar (AIRSAR) data for providing improved maps of wildfire fuel properties. Analysis concentrated in two areas of Southern California, the Santa Monica Mountains and Santa Barbara Front Range. Wildfire fuel information can be divided into four basic categories: fuel type, fuel load (live green and woody biomass), fuel moisture and fuel condition (live vs senesced fuels). To map fuel type, AVIRIS data were used to map vegetation species using Multiple Endmember Spectral Mixture Analysis (MESMA) and Binary Decision Trees. Green live biomass and canopy moisture were mapped using AVIRIS through analysis of the 980 nm liquid water absorption feature and compared to alternate measures of moisture and field measurements. Woody biomass was mapped using L and P band cross polarimetric data acquired in 1998 and 1999. Fuel condition was mapped using spectral mixture analysis to map green vegetation (green leaves), nonphotosynthetic vegetation (NPV; stems, wood and litter), shade and soil. Summaries describing the potential of hyperspectral and SAR data for fuel mapping are provided by Roberts et al. and Dennison et al. To utilize remotely sensed data to assess fire hazard, fuel-type maps were translated

  3. Keep It in Proportion.

    ERIC Educational Resources Information Center

    Snider, Richard G.

    1985-01-01

    The ratio factors approach involves recognizing a given fraction, then multiplying so that units cancel. This approach, which is grounded in concrete operational thinking patterns, provides a standard for science ratio and proportion problems. Examples are included for unit conversions, mole problems, molarity, speed/density problems, and…
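
    A worked illustration of the ratio-factors idea, multiplying by fractions equal to one so that units cancel. The numbers below are generic classroom examples, not taken from the article.

      # Unit conversion (illustrative): convert 72 km/h to m/s.
      speed_km_per_h = 72
      speed_m_per_s = speed_km_per_h * (1000 / 1) * (1 / 3600)   # km->m, h->s
      print(speed_m_per_s)   # 20.0 m/s

      # Molarity (illustrative): 4.0 g of NaOH (molar mass ~40 g/mol) in 0.50 L.
      moles = 4.0 * (1 / 40.0)          # g * (mol / g)
      molarity = moles / 0.50           # mol / L
      print(molarity)                   # 0.2 M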

  4. Incorporating induced seismicity in the 2014 United States National Seismic Hazard Model: results of the 2014 workshop and sensitivity studies

    USGS Publications Warehouse

    Petersen, Mark D.; Mueller, Charles S.; Moschetti, Morgan P.; Hoover, Susan M.; Rubinstein, Justin L.; Llenos, Andrea L.; Michael, Andrew J.; Ellsworth, William L.; McGarr, Arthur F.; Holland, Austin A.; Anderson, John G.

    2015-01-01

    The U.S. Geological Survey National Seismic Hazard Model for the conterminous United States was updated in 2014 to account for new methods, input models, and data necessary for assessing the seismic ground shaking hazard from natural (tectonic) earthquakes. The U.S. Geological Survey National Seismic Hazard Model project uses probabilistic seismic hazard analysis to quantify the rate of exceedance for earthquake ground shaking (ground motion). For the 2014 National Seismic Hazard Model assessment, the seismic hazard from potentially induced earthquakes was intentionally not considered because we had not determined how to properly treat these earthquakes for the seismic hazard analysis. The phrases “potentially induced” and “induced” are used interchangeably in this report; however, it is acknowledged that this classification is based on circumstantial evidence and scientific judgment. For the 2014 National Seismic Hazard Model update, the potentially induced earthquakes were removed from the NSHM’s earthquake catalog, and the documentation states that we would consider alternative models for including induced seismicity in a future version of the National Seismic Hazard Model. As part of the process of incorporating induced seismicity into the seismic hazard model, we evaluate the sensitivity of the seismic hazard from induced seismicity to five parts of the hazard model: (1) the earthquake catalog, (2) earthquake rates, (3) earthquake locations, (4) earthquake Mmax (maximum magnitude), and (5) earthquake ground motions. We describe alternative input models for each of the five parts that represent differences in scientific opinions on induced seismicity characteristics. In this report, however, we do not weight these input models to come up with a preferred final model. Instead, we present a sensitivity study showing uniform seismic hazard maps obtained by applying the alternative input models for induced seismicity. The final model will be released after

  5. Contribution of physical modelling to climate-driven landslide hazard mapping: an alpine test site

    NASA Astrophysics Data System (ADS)

    Vandromme, R.; Desramaut, N.; Baills, A.; Hohmann, A.; Grandjean, G.; Sedan, O.; Mallet, J. P.

    2012-04-01

    The aim of this work is to develop a methodology for integrating climate change scenarios, and especially their precipitation component, into quantitative hazard assessment. The effects of climate change will be different depending on both the location of the site and the type of landslide considered. Indeed, mass movements can be triggered by different factors. This paper describes a methodology to address this issue and shows an application on an alpine test site. Mechanical approaches represent a solution for quantitative landslide susceptibility and hazard modeling. However, as the quantity and the quality of data are generally very heterogeneous at a regional scale, it is necessary to take into account the uncertainty in the analysis. In this perspective, a new hazard modeling method is developed and integrated into a program named ALICE. This program integrates mechanical stability analysis through GIS software taking into account data uncertainty. This method proposes a quantitative classification of landslide hazard and offers a useful tool to gain time and efficiency in hazard mapping. However, an expert-based approach is still necessary to finalize the maps. Indeed, it is the only way to take into account some influential factors in slope stability, such as heterogeneity of the geological formations or effects of anthropic interventions. To go further, the alpine test site (Barcelonnette area, France) is being used to integrate climate change scenarios, and especially their precipitation component, into the ALICE program with the help of a hydrological model (GARDENIA) and the regional climate model REMO (Jacob, 2001). From a DEM, land-cover map, geology, geotechnical data, and so forth, the program classifies hazard zones depending on geotechnics and different hydrological contexts varying in time. This communication, realized within the framework of the Safeland project, is supported by the European Commission under the 7th Framework Programme for Research and Technological
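
    ALICE's exact stability formulation is not given in the abstract; as an illustration of the kind of mechanical slope-stability calculation typically coupled to a GIS, the sketch below uses a generic infinite-slope factor of safety with a simple groundwater ratio. All parameter values are hypothetical.

      import math

      def infinite_slope_fs(cohesion_kpa, phi_deg, slope_deg, soil_depth_m,
                            unit_weight_kn_m3=19.0, water_ratio=0.5,
                            unit_weight_water=9.81):
          """Infinite-slope factor of safety with the water table at
          water_ratio * soil_depth above the failure plane. FS < 1 means
          the slope is predicted to fail. Illustrative formulation only."""
          beta = math.radians(slope_deg)
          phi = math.radians(phi_deg)
          normal_stress = unit_weight_kn_m3 * soil_depth_m * math.cos(beta) ** 2
          pore_pressure = unit_weight_water * water_ratio * soil_depth_m * math.cos(beta) ** 2
          shear_stress = unit_weight_kn_m3 * soil_depth_m * math.sin(beta) * math.cos(beta)
          resisting = cohesion_kpa + (normal_stress - pore_pressure) * math.tan(phi)
          return resisting / shear_stress

      # Hypothetical 30-degree slope, 2 m of soil, c = 5 kPa, phi = 32 degrees.
      print(round(infinite_slope_fs(5.0, 32.0, 30.0, 2.0), 2))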

  6. Modeling downwind hazards after an accidental release of chlorine trifluoride

    SciTech Connect

    Lombardi, D.A.; Cheng, Meng-Dawn

    1996-05-01

    A module simulating ClF3 chemical reactions with water vapor and thermodynamic processes in the atmosphere after an accidental release has been developed. This module was linked to HGSYSTEM. Initial model runs simulate the rapid formation of HF and ClO2 after an atmospheric release of ClF3. At distances beyond the first several meters from the release point, HF and ClO2 concentrations pose a greater threat to human health than do ClF3 concentrations. For most of the simulations, ClF3 concentrations rapidly fall below the IDLH. For releases occurring in ambient conditions with low relative humidity and/or ambient temperature, ClF3 concentrations exceed the IDLH up to almost 500 m. The performance of this model needs to be determined for the potential release scenarios that will be considered. These release scenarios are currently being developed.

  7. Large area application of a corn hazard model. [Soviet Union

    NASA Technical Reports Server (NTRS)

    Ashburn, P.; Taylor, T. W. (Principal Investigator)

    1981-01-01

    An application test of the crop calendar portion of a corn (maize) stress indicator model developed by the early warning, crop condition assessment component of AgRISTARS was performed over the corn for grain producing regions of the U.S.S.R. during the 1980 crop year using real data. Performance of the crop calendar submodel was favorable; efficiency gains in meteorological data analysis time were on a magnitude of 85 to 90 percent.

  8. Recent Progress in Understanding Natural-Hazards-Generated TEC Perturbations: Measurements and Modeling Results

    NASA Astrophysics Data System (ADS)

    Komjathy, A.; Yang, Y. M.; Meng, X.; Verkhoglyadova, O. P.; Mannucci, A. J.; Langley, R. B.

    2015-12-01

    Natural hazards, including earthquakes, volcanic eruptions, and tsunamis, have been significant threats to humans throughout recorded history. The Global Positioning System satellites have become primary sensors to measure signatures associated with such natural hazards. These signatures typically include GPS-derived seismic deformation measurements, co-seismic vertical displacements, and real-time GPS-derived ocean buoy positioning estimates. Another way to use GPS observables is to compute the ionospheric total electron content (TEC) to measure and monitor post-seismic ionospheric disturbances caused by earthquakes, volcanic eruptions, and tsunamis. Research at the University of New Brunswick (UNB) laid the foundations to model the three-dimensional ionosphere at NASA's Jet Propulsion Laboratory by ingesting ground- and space-based GPS measurements into the state-of-the-art Global Assimilative Ionosphere Modeling (GAIM) software. As an outcome of the UNB and NASA research, new and innovative GPS applications have been invented, including the use of ionospheric measurements to detect tiny fluctuations in the GPS signals between the spacecraft and GPS receivers caused by natural hazards occurring on or near the Earth's surface. We will show examples of early detection of natural-hazard-generated ionospheric signatures using ground-based and space-borne GPS receivers. We will also discuss recent results from the U.S. Real-time Earthquake Analysis for Disaster Mitigation Network (READI) exercises utilizing our algorithms. By studying the propagation properties of ionospheric perturbations generated by natural hazards along with applying sophisticated first-principles physics-based modeling, we are on track to develop new technologies that can potentially save human lives and minimize property damage. It is also expected that ionospheric monitoring of TEC perturbations might become an integral part of existing natural hazards warning systems.

  9. Landslides! Engaging students in natural hazards and STEM principles through the exploration of landslide analog models

    NASA Astrophysics Data System (ADS)

    Gochis, E. E.; Lechner, H. N.; Brill, K. A.; Lerner, G.; Ramos, E.

    2014-12-01

    Graduate students at Michigan Technological University developed the "Landslides!" activity to engage middle & high school students participating in summer engineering programs in a hands-on exploration of geologic engineering and STEM (Science, Technology, Engineering and Math) principles. The inquiry-based lesson plan is aligned to Next Generation Science Standards and is appropriate for 6th-12th grade classrooms. During the activity students focus on the factors contributing to landslide development and the engineering practices used to mitigate slope stability hazards. Students begin by comparing different soil types and by developing predictions of how sediment type may contribute to differences in slope stability. Working in groups, students then build tabletop hill-slope models from the various materials in order to engage in evidence-based reasoning and test their predictions by adding groundwater until each group's modeled slope fails. Lastly, students elaborate on their understanding of landslides by designing 'engineering solutions' to mitigate the hazards observed in each model. Post-evaluations from students demonstrate that they enjoyed the hands-on nature of the activity and the application of engineering principles to mitigate a modeled natural hazard.

  10. DEVELOPMENT AND ANALYSIS OF AIR QUALITY MODELING SIMULATIONS FOR HAZARDOUS AIR POLLUTANTS

    EPA Science Inventory

    The concentrations of five hazardous air pollutants were simulated using the Community Multi Scale Air Quality (CMAQ) modeling system. Annual simulations were performed over the continental United States for the entire year of 2001 to support human exposure estimates. Results a...

  11. Multiwire proportional chamber development

    NASA Technical Reports Server (NTRS)

    Doolittle, R. F.; Pollvogt, U.; Eskovitz, A. J.

    1973-01-01

    The development of large area multiwire proportional chambers, to be used as high resolution spatial detectors in cosmic ray experiments, is described. A readout system was developed which uses a directly coupled, lumped element delay-line whose characteristics are independent of the MWPC design. A complete analysis of the delay-line and the readout electronic system shows that a spatial resolution of about 0.1 mm can be reached with the MWPC operating in the strictly proportional region. This was confirmed by measurements with a small MWPC and Fe-55 X-rays. A simplified analysis was carried out to estimate the theoretical limit of spatial resolution due to delta-rays, spread of the discharge along the anode wire, and inclined trajectories. To calculate the gas gain of MWPCs of different geometrical configurations, a method was developed which is based on knowledge of the first Townsend coefficient of the chamber gas.
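
    The gas-gain method is described only in terms of the first Townsend coefficient; a generic sketch of that relation (gain as the exponential of the coefficient integrated along the avalanche path) follows, with a made-up coefficient profile that is not taken from the report.

      import numpy as np

      # Gas gain from the first Townsend coefficient: M = exp( integral alpha(x) dx )
      # along the avalanche path. The alpha(x) profile below is purely illustrative.
      x = np.linspace(0.0, 0.02, 200)              # cm, last 0.02 cm before the anode wire
      alpha = 600.0 * (x / 0.02) ** 2              # ionizations per cm (hypothetical profile)
      integral = np.sum(0.5 * (alpha[1:] + alpha[:-1]) * np.diff(x))   # trapezoid rule
      print(round(float(np.exp(integral))))        # gas gain M, roughly 55 here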

  12. Monitor proportional counter

    NASA Technical Reports Server (NTRS)

    Weisskopf, M. C.

    1979-01-01

    An Uhuru class Ar-CO2 gas filled proportional counter sealed with a 1.5 mil beryllium window and sensitive to X-rays in the energy bandwidth from 1.5 to 22 keV is presented. This device is coaligned with the X-ray telescope aboard the Einstein Observatory and takes data as a normal part of the Observatory operations.

  13. Global river flood hazard maps: hydraulic modelling methods and appropriate uses

    NASA Astrophysics Data System (ADS)

    Townend, Samuel; Smith, Helen; Molloy, James

    2014-05-01

    Flood hazard is not well understood or documented in many parts of the world. Consequently, the (re-)insurance sector now needs to better understand where the potential for considerable river flooding aligns with significant exposure. For example, international manufacturing companies are often attracted to countries with emerging economies, meaning that events such as the 2011 Thailand floods have resulted in many multinational businesses with assets in these regions incurring large, unexpected losses. This contribution addresses and critically evaluates the hydraulic methods employed to develop a consistent global scale set of river flood hazard maps, used to fill the knowledge gap outlined above. The basis of the modelling approach is an innovative, bespoke 1D/2D hydraulic model (RFlow) which has been used to model a global river network of over 5.3 million kilometres. Estimated flood peaks at each of these model nodes are determined using an empirically based rainfall-runoff approach linking design rainfall to design river flood magnitudes. The hydraulic model is used to determine extents and depths of floodplain inundation following river bank overflow. From this, deterministic flood hazard maps are calculated for several design return periods between 20 years and 1,500 years. Firstly, we will discuss the rationale behind the appropriate hydraulic modelling methods and inputs chosen to produce a consistent global-scale river flood hazard map. This will highlight how a model designed to work with global datasets can be more favourable for hydraulic modelling at the global scale and why using innovative techniques customised for broad scale use are preferable to modifying existing hydraulic models. Similarly, the advantages and disadvantages of both 1D and 2D modelling will be explored and balanced against the time, computer and human resources available, particularly when using a Digital Surface Model at 30m resolution. Finally, we will suggest some
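
    RFlow's empirical rainfall-runoff link is not specified in the abstract; as a generic illustration of linking design rainfall to a design flood peak, the sketch below uses the rational method with hypothetical catchment values (a formula normally reserved for small catchments, used here only to show the shape of such a link).

      def rational_method_peak_flow(runoff_coeff, rainfall_intensity_mm_hr, area_km2):
          """Rational method Q = C * i * A, a simple empirical rainfall-runoff
          link between design rainfall and design peak discharge (m^3/s).
          Not necessarily the relationship used in RFlow."""
          area_m2 = area_km2 * 1e6
          intensity_m_s = rainfall_intensity_mm_hr / 1000.0 / 3600.0
          return runoff_coeff * intensity_m_s * area_m2

      # Hypothetical catchment: C = 0.4, 25 mm/h design rainfall, 15 km^2.
      print(round(rational_method_peak_flow(0.4, 25.0, 15.0), 1))   # ~41.7 m^3/s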

  14. Exploring the Differences Between the European (SHARE) and the Reference Italian Seismic Hazard Models

    NASA Astrophysics Data System (ADS)

    Visini, F.; Meletti, C.; D'Amico, V.; Rovida, A.; Stucchi, M.

    2014-12-01

    The recent release of the probabilistic seismic hazard assessment (PSHA) model for Europe by the SHARE project (Giardini et al., 2013, www.share-eu.org) arises questions about the comparison between its results for Italy and the official Italian seismic hazard model (MPS04; Stucchi et al., 2011) adopted by the building code. The goal of such a comparison is identifying the main input elements that produce the differences between the two models. It is worthwhile to remark that each PSHA is realized with data and knowledge available at the time of the release. Therefore, even if a new model provides estimates significantly different from the previous ones that does not mean that old models are wrong, but probably that the current knowledge is strongly changed and improved. Looking at the hazard maps with 10% probability of exceedance in 50 years (adopted as the standard input in the Italian building code), the SHARE model shows increased expected values with respect to the MPS04 model, up to 70% for PGA. However, looking in detail at all output parameters of both the models, we observe a different behaviour for other spectral accelerations. In fact, for spectral periods greater than 0.3 s, the current reference PSHA for Italy proposes higher values than the SHARE model for many and large areas. This observation suggests that this behaviour could not be due to a different definition of seismic sources and relevant seismicity rates; it mainly seems the result of the adoption of recent ground-motion prediction equations (GMPEs) that estimate higher values for PGA and for accelerations with periods lower than 0.3 s and lower values for higher periods with respect to old GMPEs. Another important set of tests consisted in analysing separately the PSHA results obtained by the three source models adopted in SHARE (i.e., area sources, fault sources with background, and a refined smoothed seismicity model), whereas MPS04 only uses area sources. Results seem to confirm the

  15. Rainfall Hazards Prevention based on a Local Model Forecasting System

    NASA Astrophysics Data System (ADS)

    Buendia, F.; Ojeda, B.; Buendia Moya, G.; Tarquis, A. M.; Andina, D.

    2009-04-01

    Rainfall is one of the most important events of human life and society. Some rainfall phenomena, such as floods or hailstorms, are a threat to agriculture, business and even life. Although meteorological observatories have methods to detect and warn about these kinds of events, prediction techniques based on synoptic measurements still need to be improved to achieve feasible medium-term forecasts. Any deviation in the measurements or in the model description makes the forecast diverge in time from the real atmosphere evolution. In this paper the advances in a local rainfall forecasting system based on time series estimation with General Regression Neural Networks are presented. The system is introduced, explaining the measurements, methodology and the current state of development. The aim of the work is to provide a complementary criterion to current forecast systems, based on daily atmosphere observation and tracking over a given place.
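
    A minimal sketch of the General Regression Neural Network (Nadaraya-Watson kernel regression) estimator underlying such a forecasting system, applied to a made-up rainfall series; the embedding dimension of three and the bandwidth are assumptions, not values from the paper.

      import numpy as np

      def grnn_predict(train_x, train_y, query_x, sigma=1.0):
          """GRNN / Nadaraya-Watson kernel regression: the prediction is a
          Gaussian-weighted average of the training targets. Used here as a
          one-step-ahead time-series estimator."""
          d2 = np.sum((train_x - query_x) ** 2, axis=1)
          w = np.exp(-d2 / (2.0 * sigma ** 2))
          return np.sum(w * train_y) / np.sum(w)

      # Hypothetical monthly rainfall series (mm); predict the next value from
      # the previous three observations.
      rain = np.array([30, 55, 80, 60, 20, 15, 40, 70, 90, 65, 25, 10, 35], dtype=float)
      X = np.array([rain[i:i + 3] for i in range(len(rain) - 3)])
      y = rain[3:]
      print(round(grnn_predict(X, y, rain[-3:], sigma=20.0), 1))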

  16. Neotectonic deformation models for probabilistic seismic hazard: a study in the External Dinarides

    NASA Astrophysics Data System (ADS)

    Kastelic, Vanja; Carafa, Michele M. C.; Visini, Francesco

    2016-06-01

    In Europe, common input data types for seismic hazard evaluation include earthquake catalogues, seismic zonation models and ground motion models, all with well-constrained epistemic uncertainties. In contrast, neotectonic deformation models and their related uncertainties are rarely considered in earthquake forecasting and seismic hazard studies. In this study, for the first time in Europe, we developed a seismic hazard model based exclusively on active fault and geodynamic deformation models. We applied it to the External Dinarides, a slow-deforming fold-and-thrust belt in the Central Mediterranean. The two deformation models furnish consistent long-term earthquake rates above the Mw 4.7 threshold on a latitude/longitude grid with 0.2° spacing. Results suggest that the use of deformation models is a valid alternative to empirical-statistical approaches in earthquake forecasting in slow-deforming regions of Europe. Furthermore, we show that the variability of different deformation models has a comparable effect on the peak ground motion acceleration uncertainty as do the ground motion prediction equations.

  17. Building a risk-targeted regional seismic hazard model for South-East Asia

    NASA Astrophysics Data System (ADS)

    Woessner, J.; Nyst, M.; Seyhan, E.

    2015-12-01

    The last decade has tragically shown the social and economic vulnerability of countries in South-East Asia to earthquake hazard and risk. While many disaster mitigation programs and initiatives to improve societal earthquake resilience are under way with the focus on saving lives and livelihoods, the risk management sector is challenged to develop appropriate models to cope with the economic consequences and impact on the insurance business. We present the source model and ground-motion model components suitable for a South-East Asia earthquake risk model covering Indonesia, Malaysia, the Philippines and the Indochinese countries. The source model builds upon refined modelling approaches to characterize 1) seismic activity on crustal faults from geologic and geodetic data, 2) seismicity along the interfaces of subduction zones and within the slabs, and 3) earthquakes not occurring on mapped fault structures. We elaborate on building a self-consistent rate model for the hazardous crustal fault systems (e.g. Sumatra fault zone, Philippine fault zone) as well as the subduction zones, and showcase characteristics and sensitivities arising from existing uncertainties in the rate and hazard space, using a well-selected suite of ground motion prediction equations. Finally, we analyze the source model by quantifying the contribution by source type (e.g., subduction zone, crustal fault) to typical risk metrics (e.g., return-period losses, average annual loss) and reviewing their relative impact on various lines of business.

  18. How new fault data and models affect seismic hazard results? Examples from southeast Spain

    NASA Astrophysics Data System (ADS)

    Gaspar-Escribano, Jorge M.; Belén Benito, M.; Staller, Alejandra; Ruiz Barajas, Sandra; Quirós, Ligia E.

    2016-04-01

    In this work, we study the impact of different approaches to incorporating faults in a seismic hazard assessment analysis. Firstly, we consider two different methods to distribute the seismicity of the study area between faults and area sources, based on magnitude partitioning and on moment rate distribution. We use two recurrence models to characterize fault activity: the characteristic earthquake model and the modified Gutenberg-Richter exponential frequency-magnitude distribution. An application of the work is developed in the region of Murcia (southeastern Spain), due to the availability of fault data and because it is one of the areas of Spain with the highest seismic hazard. The parameters used to model fault sources are derived from paleoseismological and field studies obtained from the literature and online repositories. Additionally, for some significant faults only, geodetically derived slip rates are used to compute recurrence periods. The results of all the seismic hazard computations carried out using different models and data are represented in maps of expected peak ground accelerations for a return period of 475 years. Maps of coefficients of variation are presented to constrain the variability of the end results with respect to the different input models and values. Additionally, the different hazard maps obtained in this study are compared with the seismic hazard maps obtained in previous work for the entire Spanish territory and more specifically for the region of Murcia. This work is developed in the context of the MERISUR project (ref. CGL2013-40492-R), with funding from the Spanish Ministry of Economy and Competitiveness.

  19. Seismic hazard assessment for Myanmar: Earthquake model database, ground-motion scenarios, and probabilistic assessments

    NASA Astrophysics Data System (ADS)

    Chan, C. H.; Wang, Y.; Thant, M.; Maung Maung, P.; Sieh, K.

    2015-12-01

    We have constructed an earthquake and fault database, conducted a series of ground-shaking scenarios, and proposed seismic hazard maps for all of Myanmar and hazard curves for selected cities. Our earthquake database integrates the ISC, ISC-GEM and global ANSS Comprehensive Catalogues, and includes harmonized magnitude scales without duplicate events. Our active fault database includes active fault data from previous studies. Using the parameters from these updated databases (i.e., the Gutenberg-Richter relationship, slip rate, maximum magnitude and the elapsed time since the last events), we have determined the earthquake recurrence models of seismogenic sources. To evaluate ground-shaking behaviour in different tectonic regimes, we conducted a series of tests by matching the modelled ground motions to the felt intensities of earthquakes. Through the case of the 1975 Bagan earthquake, we determined that the ground motion prediction equations (GMPEs) of Atkinson and Moore (2003) best fit the behaviour of subduction events. Also, the 2011 Tarlay and 2012 Thabeikkyin events suggested that the GMPEs of Akkar and Cagnan (2010) fit crustal earthquakes best. We thus incorporated the best-fitting GMPEs and site conditions based on Vs30 (the average shear-wave velocity down to 30 m depth), obtained from analysis of topographic slope and microtremor array measurements, to assess seismic hazard. The hazard is highest in regions close to the Sagaing Fault and along the western coast of Myanmar, as the seismic sources there produce earthquakes at short intervals and/or their last events occurred a long time ago. The hazard curves for the cities of Bago, Mandalay, Sagaing, Taungoo and Yangon show higher hazards for sites close to an active fault or with a low Vs30, e.g., downtown Sagaing and the Shwemawdaw Pagoda in Bago.

  20. The hazards of the changing hazard of dialysis modalities.

    PubMed

    Argyropoulos, Christos P; Unruh, Mark L

    2014-11-01

    The impact of the dialysis modality on patient survival has received considerable epidemiological attention, with most studies suggesting an early benefit favoring peritoneal dialysis over hemodialysis. Kumar et al. report the relative outcomes of the two modalities in incident patients followed by an accountable care organization. Using advanced statistical techniques for non-proportional hazards survival models, the authors corroborate the early benefit of peritoneal dialysis for the first 3 years and equivalent outcomes thereafter.

  1. Using the RBFN model and GIS technique to assess wind erosion hazards of Inner Mongolia, China

    NASA Astrophysics Data System (ADS)

    Shi, Huading; Liu, Jiyuan; Zhuang, Dafang; Hu, Yunfeng

    2006-08-01

    Soil wind erosion is the primary process and the main driving force behind land desertification and sand-dust storms in the arid and semi-arid areas of Northern China, and it has attracted considerable research attention. This paper selects the Inner Mongolia autonomous region as the research area, quantifies the various indicators affecting soil wind erosion, uses GIS technology to extract the spatial data, and constructs an RBFN (Radial Basis Function Network) model for assessing wind erosion hazard. After training on sample data representing the different levels of wind erosion hazard, we obtain the model parameters and then assess the hazard. The results show that wind erosion hazard is very severe in the southern parts of Inner Mongolia, varies from moderate to severe in the central counties, and is slight in the east. Comparison with other studies shows that the result is consistent with actual conditions, demonstrating the reasonableness and applicability of the RBFN model.
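
    As a rough illustration of how an RBFN assessment model can be assembled, the sketch below builds Gaussian radial-basis features around k-means centres and feeds them to a linear classifier. The indicator matrix, hazard labels, number of centres and kernel width are hypothetical stand-ins; the original study derives its indicators from GIS layers and its own training samples.

        import numpy as np
        from sklearn.cluster import KMeans
        from sklearn.linear_model import LogisticRegression

        # Hypothetical indicator matrix (e.g. wind speed, soil texture, vegetation
        # cover, aridity) and hazard-level labels for the training samples.
        rng = np.random.default_rng(1)
        X = rng.normal(size=(300, 4))
        y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)

        # Hidden layer: Gaussian radial basis functions centred on k-means centres.
        centers = KMeans(n_clusters=10, n_init=10, random_state=0).fit(X).cluster_centers_

        def rbf_features(X, centers=centers, gamma=1.0):
            d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
            return np.exp(-gamma * d2)

        # Output layer: a linear classifier on the radial-basis activations.
        clf = LogisticRegression(max_iter=1000).fit(rbf_features(X), y)
        print("training accuracy:", clf.score(rbf_features(X), y))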

  2. Delta method and bootstrap in linear mixed models to estimate a proportion when no event is observed: application to intralesional resection in bone tumor surgery.

    PubMed

    Francq, Bernard G; Cartiaux, Olivier

    2016-09-10

    Resecting bone tumors requires good cutting accuracy to reduce the occurrence of local recurrence, an occurrence that is considerably reduced with navigation technology. The estimation of extreme proportions is challenging, especially with small or moderate sample sizes. When no success is observed, the commonly used binomial proportion confidence interval is not suitable, while the rule of three provides a simple solution. Unfortunately, these approaches are unable to differentiate between different unobserved events. Different delta methods and bootstrap procedures are compared in univariate and linear mixed models, with simulations and real data, assuming normality. The delta method on the z-score and the parametric bootstrap provide similar results, but the delta method requires the estimation of the covariance matrix of the estimates. In mixed models, the observed Fisher information matrix with unbounded variance components should be preferred. The parametric bootstrap, easier to apply, outperforms the delta method for larger sample sizes but may be computationally costly. Copyright © 2016 John Wiley & Sons, Ltd.
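
    For intuition, the sketch below contrasts the rule of three with a simple parametric bootstrap for an unobserved event, here the probability that a normally distributed cutting error exceeds a tolerance that no sample has crossed. The data, tolerance and normal model are illustrative assumptions, and the bootstrap is a simplified univariate version of the procedures compared in the paper.

        import numpy as np
        from scipy.stats import norm

        def rule_of_three_upper(n):
            """One-sided 95% upper bound for a proportion when zero events
            are observed in n independent trials (approximately 3/n)."""
            return 3.0 / n

        def bootstrap_exceedance_upper(data, threshold, n_boot=10000, seed=0):
            """Parametric-bootstrap 95% upper bound on P(X > threshold),
            assuming normally distributed data (simplified univariate sketch)."""
            rng = np.random.default_rng(seed)
            n = len(data)
            mu, sigma = np.mean(data), np.std(data, ddof=1)
            p_star = np.empty(n_boot)
            for b in range(n_boot):
                resample = rng.normal(mu, sigma, size=n)     # simulate from the fitted model
                p_star[b] = norm.sf(threshold, resample.mean(), resample.std(ddof=1))
            return float(np.percentile(p_star, 95))

        rng = np.random.default_rng(1)
        errors = rng.normal(0.5, 0.6, size=30)               # hypothetical cutting errors (mm)
        print(rule_of_three_upper(len(errors)))              # 0.1
        print(bootstrap_exceedance_upper(errors, threshold=3.0))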

  3. Advances in National Capabilities for Consequence Assessment Modeling of Airborne Hazards

    SciTech Connect

    Nasstrom, J; Sugiyama, G; Foster, K; Larsen, S; Kosovic, B; Eme, B; Walker, H; Goldstein, P; Lundquist, J; Pobanz, B; Fulton, J

    2007-11-26

    This paper describes ongoing advancement of airborne hazard modeling capabilities in support of multiple agencies through the National Atmospheric Release Advisory Center (NARAC) and the Interagency Atmospheric Modeling and Atmospheric Assessment Center (IMAAC). A suite of software tools developed by Lawrence Livermore National Laboratory (LLNL) and collaborating organizations includes simple stand-alone, local-scale plume modeling tools for end users' computers, Web- and Internet-based software to access advanced 3-D flow and atmospheric dispersion modeling tools and expert analysis from the national center at LLNL, and state-of-the-science high-resolution urban models and event reconstruction capabilities.

  4. New Elements To Consider When Modeling the Hazards Associated with Botulinum Neurotoxin in Food

    PubMed Central

    Mura, Ivan; Malakar, Pradeep K.; Walshaw, John; Peck, Michael W.; Barker, G. C.

    2015-01-01

    Botulinum neurotoxins (BoNTs) produced by the anaerobic bacterium Clostridium botulinum are the most potent biological substances known to mankind. BoNTs are the agents responsible for botulism, a rare condition affecting the neuromuscular junction and causing a spectrum of diseases ranging from mild cranial nerve palsies to acute respiratory failure and death. BoNTs are a potential biowarfare threat and a public health hazard, since outbreaks of foodborne botulism are caused by the ingestion of preformed BoNTs in food. Currently, mathematical models relating to the hazards associated with C. botulinum, which are largely empirical, make major contributions to botulinum risk assessment. Evaluated using statistical techniques, these models simulate the response of the bacterium to environmental conditions. Though empirical models have been successfully incorporated into risk assessments to support food safety decision making, this process includes significant uncertainties so that relevant decision making is frequently conservative and inflexible. Progression involves encoding into the models cellular processes at a molecular level, especially the details of the genetic and molecular machinery. This addition drives the connection between biological mechanisms and botulism risk assessment and hazard management strategies. This review brings together elements currently described in the literature that will be useful in building quantitative models of C. botulinum neurotoxin production. Subsequently, it outlines how the established form of modeling could be extended to include these new elements. Ultimately, this can offer further contributions to risk assessments to support food safety decision making. PMID:26350137

  5. Application of decision tree model for the ground subsidence hazard mapping near abandoned underground coal mines.

    PubMed

    Lee, Saro; Park, Inhye

    2013-09-30

    Subsidence of ground caused by underground mines poses hazards to human life and property. This study analyzed ground-subsidence hazard using factors that can affect ground subsidence and a decision tree approach within a geographic information system (GIS). The study area was Taebaek, Gangwon-do, Korea, where many abandoned underground coal mines exist. Spatial data, topography, geology, and various ground-engineering data for the subsidence area were collected and compiled in a database for mapping ground-subsidence hazard (GSH). The subsidence area was randomly split 50/50 for training and validation of the models. A data-mining classification technique was applied to the GSH mapping, and decision trees were constructed using the chi-squared automatic interaction detector (CHAID) and the quick, unbiased, and efficient statistical tree (QUEST) algorithms. The frequency ratio model was also applied to the GSH mapping for comparison with a probabilistic model. The resulting GSH maps were validated using area-under-the-curve (AUC) analysis with the subsidence area data that had not been used for training the model. The highest accuracy was achieved by the decision tree model using the CHAID algorithm (94.01%), compared with the QUEST algorithm (90.37%) and the frequency ratio model (86.70%). These accuracies are higher than previously reported results for decision trees. Decision tree methods can therefore be used efficiently for GSH analysis and might be widely used for prediction of various spatial events.
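
    The overall workflow (train on half of the inventory, validate with the area under the ROC curve on the other half) can be sketched as follows. scikit-learn grows CART trees rather than CHAID or QUEST, and the factor matrix and labels below are synthetic placeholders, so this is only an analogue of the published procedure.

        import numpy as np
        from sklearn.model_selection import train_test_split
        from sklearn.tree import DecisionTreeClassifier
        from sklearn.metrics import roc_auc_score

        # Synthetic stand-ins for GIS-derived factors (e.g. slope, depth of mining,
        # groundwater level, distance to workings) and subsidence occurrence labels.
        rng = np.random.default_rng(0)
        X = rng.normal(size=(1000, 5))
        y = (X[:, 0] - 0.8 * X[:, 2] + rng.normal(scale=0.5, size=1000) > 0).astype(int)

        # 50/50 split for training and validation, mirroring the study design.
        X_tr, X_va, y_tr, y_va = train_test_split(X, y, test_size=0.5, random_state=42)

        # CART analogue of the CHAID/QUEST trees used in the study.
        tree = DecisionTreeClassifier(max_depth=4, random_state=0).fit(X_tr, y_tr)

        # Area-under-the-curve validation on the held-out half.
        auc = roc_auc_score(y_va, tree.predict_proba(X_va)[:, 1])
        print(f"validation AUC: {auc:.3f}")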

  6. A probabilistic tornado wind hazard model for the continental United States

    SciTech Connect

    Hossain, Q; Kimball, J; Mensing, R; Savy, J

    1999-04-19

    A probabilistic tornado wind hazard model for the continental United States (CONUS) is described. The model incorporates both aleatory (random) and epistemic uncertainties associated with quantifying the tornado wind hazard parameters. The temporal occurrence of tornadoes within the CONUS is assumed to be a Poisson process. A spatial distribution of tornado touchdown locations is developed empirically, based on the observed historical events within the CONUS. The hazard model is an areal probability model that takes into consideration the size and orientation of the facility, the length and width of the tornado damage area (idealized as a rectangle and dependent on the tornado intensity scale), wind speed variation within the damage area, tornado intensity classification errors (i.e., errors in assigning a Fujita intensity scale based on surveyed damage), and the tornado path direction. Epistemic uncertainties in describing the distributions of the aleatory variables are accounted for by using more than one distribution model to describe aleatory variations. The epistemic uncertainties are based on inputs from a panel of experts. A computer program, TORNADO, has been developed incorporating this model; features of this program are also presented.
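
    The core probabilistic idea, Poisson occurrences in time combined with a chance that a damage footprint overlaps the facility, can be illustrated with a deliberately simplified sketch. The uniform spatial density and all numerical values below are assumptions for illustration only; the actual model uses empirical touchdown densities, intensity-dependent path geometry, wind-speed variation and classification errors.

        import numpy as np

        def tornado_strike_probability(annual_rate, region_area_km2,
                                       mean_damage_area_km2, facility_area_km2, years):
            """Probability of at least one tornado strike on a facility during an
            exposure period, assuming Poisson occurrences in time and a uniform
            spatial density of touchdowns (a strong simplification)."""
            # Expected strikes per year: regional rate times the chance that a
            # random damage footprint overlaps the facility footprint.
            strike_rate = annual_rate * (mean_damage_area_km2 + facility_area_km2) / region_area_km2
            return 1.0 - np.exp(-strike_rate * years)

        # Hypothetical numbers: 20 tornadoes/yr over a 200,000 km^2 region,
        # 5 km^2 mean damage area, 0.1 km^2 facility, 50-year exposure.
        print(tornado_strike_probability(20, 2.0e5, 5.0, 0.1, 50))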

  7. New Elements To Consider When Modeling the Hazards Associated with Botulinum Neurotoxin in Food.

    PubMed

    Ihekwaba, Adaoha E C; Mura, Ivan; Malakar, Pradeep K; Walshaw, John; Peck, Michael W; Barker, G C

    2016-01-01

    Botulinum neurotoxins (BoNTs) produced by the anaerobic bacterium Clostridium botulinum are the most potent biological substances known to mankind. BoNTs are the agents responsible for botulism, a rare condition affecting the neuromuscular junction and causing a spectrum of diseases ranging from mild cranial nerve palsies to acute respiratory failure and death. BoNTs are a potential biowarfare threat and a public health hazard, since outbreaks of foodborne botulism are caused by the ingestion of preformed BoNTs in food. Currently, mathematical models relating to the hazards associated with C. botulinum, which are largely empirical, make major contributions to botulinum risk assessment. Evaluated using statistical techniques, these models simulate the response of the bacterium to environmental conditions. Though empirical models have been successfully incorporated into risk assessments to support food safety decision making, this process includes significant uncertainties so that relevant decision making is frequently conservative and inflexible. Progression involves encoding into the models cellular processes at a molecular level, especially the details of the genetic and molecular machinery. This addition drives the connection between biological mechanisms and botulism risk assessment and hazard management strategies. This review brings together elements currently described in the literature that will be useful in building quantitative models of C. botulinum neurotoxin production. Subsequently, it outlines how the established form of modeling could be extended to include these new elements. Ultimately, this can offer further contributions to risk assessments to support food safety decision making. PMID:26350137

  8. Critical load analysis in hazard assessment of metals using a Unit World Model.

    PubMed

    Gandhi, Nilima; Bhavsar, Satyendra P; Diamond, Miriam L

    2011-09-01

    A Unit World approach has been used extensively to rank chemicals for their hazards and to understand differences in chemical behavior. Whereas the fate and effects of an organic chemical in a Unit World Model (UWM) analysis vary systematically according to one variable (fraction of organic carbon), and the chemicals have a singular ranking regardless of environmental characteristics, metals can change their hazard ranking according to freshwater chemistry, notably pH and dissolved organic carbon (DOC). Consequently, developing a UWM approach for metals requires selecting a series of representative freshwater chemistries, based on an understanding of the sensitivity of model results to this chemistry. Here we analyze results from a UWM for metals with the goal of informing the selection of appropriate freshwater chemistries for a UWM. The UWM loosely couples the biotic ligand model (BLM) to a geochemical speciation model (Windermere Humic Adsorption Model [WHAM]) and then to the multi-species fate transport-speciation (Transpec) model. The UWM is applied to estimate the critical load (CL) of cationic metals Cd, Cu, Ni, Pb, and Zn, using three lake chemistries that vary in trophic status, pH, and other parameters. The model results indicated a difference of four orders of magnitude in particle-to-total dissolved partitioning (K(d)) that translated into minimal differences in fate because of the short water residence time used. However, a maximum 300-fold difference was calculated in Cu toxicity among the three chemistries and three aquatic organisms. Critical loads were lowest (greatest hazard) in the oligotrophic water chemistry and highest (least hazard) in the eutrophic water chemistry, despite the highest fraction of free metal ion as a function of total metal occurring in the mesotrophic system, where toxicity was ameliorated by competing cations. Water hardness, DOC, and pH had the greatest influence on CL, because of the influence of these factors on aquatic

  9. Proportional counter radiation camera

    DOEpatents

    Borkowski, C.J.; Kopp, M.K.

    1974-01-15

    A gas-filled proportional counter camera that images photon-emitting sources is described. A two-dimensional, position-sensitive proportional multiwire counter is provided as the detector. The counter consists of a high-voltage anode screen sandwiched between orthogonally disposed planar arrays of multiple parallel strung, resistively coupled cathode wires. Two terminals from each of the cathode arrays are connected to separate timing circuitry to obtain separate X and Y coordinate signal values from pulse shape measurements to define the position of an event within the counter arrays which may be recorded by various means for data display. The counter is further provided with a linear drift field which effectively enlarges the active gas volume of the counter and constrains the recoil electrons produced from ionizing radiation entering the counter to drift perpendicularly toward the planar detection arrays. A collimator is interposed between a subject to be imaged and the counter to transmit only the radiation from the subject which has a perpendicular trajectory with respect to the planar cathode arrays of the detector. (Official Gazette)

  10. Impact of fault models on probabilistic seismic hazard assessment: the example of the West Corinth rift.

    NASA Astrophysics Data System (ADS)

    Chartier, Thomas; Scotti, Oona; Boiselet, Aurelien; Lyon-Caen, Hélène

    2016-04-01

    Including faults in probabilistic seismic hazard assessment tends to increase the degree of uncertainty in the results due to the intrinsically uncertain nature of the fault data. This is especially the case in the low to moderate seismicity regions of Europe, where slow-slipping faults are difficult to characterize. In order to better understand the key parameters that control the uncertainty in the fault-related hazard computations, we propose to build an analytic tool that provides a clear link between the different components of the fault-related hazard computations and their impact on the results. This will allow us to identify the important parameters that need to be better constrained in order to reduce the resulting uncertainty in hazard, and also to provide a more hazard-oriented strategy for collecting relevant fault parameters in the field. The tool will be illustrated through the example of the West Corinth rift fault models. Recent work performed in the gulf has shown the complexity of the normal faulting system that is accommodating the extensional deformation of the rift. A logic-tree approach is proposed to account for this complexity and the multiplicity of scientifically defendable interpretations. At the nodes of the logic tree, different options are explored for each step of the fault-related seismic hazard computation. The first nodes represent the uncertainty in the geometries of the faults and their slip rates, which can derive from different data and methodologies. The subsequent node explores, for a given geometry/slip rate of faults, different earthquake rupture scenarios that may occur in the complex network of faults. The idea is to allow the possibility of several fault segments breaking together in a single rupture scenario. To build these multiple-fault-segment scenarios, two approaches are considered: one based on simple rules (i.e. minimum distance between faults) and a second one that relies on physically

  11. Masked Proportional Routing

    NASA Technical Reports Server (NTRS)

    Wolpert, David

    2004-01-01

    Masked proportional routing is an improved procedure for choosing links between adjacent nodes of a network for the purpose of transporting an entity from a source node ("A") to a destination node ("B"). The entity could be, for example, a physical object to be shipped, in which case the nodes would represent waypoints and the links would represent roads or other paths between waypoints. For another example, the entity could be a message or packet of data to be transmitted from A to B, in which case the nodes could be computer-controlled switching stations and the links could be communication channels between the stations. In yet another example, an entity could represent a workpiece while links and nodes could represent, respectively, manufacturing processes and stages in the progress of the workpiece towards a finished product. More generally, the nodes could represent states of an entity and the links could represent allowed transitions of the entity. The purpose of masked proportional routing and of related prior routing procedures is to schedule transitions of entities from their initial states ("A") to their final states ("B") in such a manner as to minimize a cost or to attain some other measure of optimality or efficiency. Masked proportional routing follows a distributed (in the sense of decentralized) approach to probabilistically or deterministically choosing the links. It was developed to satisfy a need for a routing procedure that 1. Does not always choose the same link(s), even for two instances characterized by identical estimated values of associated cost functions; 2. Enables a graceful transition from one set of links to another set of links as the circumstances of operation of the network change over time; 3. Is preferably amenable to separate optimization of different portions of the network; 4. Is preferably usable in a network in which some of the routing decisions are made by one or more other procedure(s); 5. Preferably does not cause an

  12. An animal model to study toxicity of central nervous system therapy for childhood acute lymphoblastic leukemia: Effects on growth and craniofacial proportion

    SciTech Connect

    Schunior, A.; Zengel, A.E.; Mullenix, P.J.; Tarbell, N.J.; Howes, A.; Tassinari, M.S. )

    1990-10-15

    Many long-term survivors of childhood acute lymphoblastic leukemia have short stature, as well as craniofacial and dental abnormalities, as side effects of central nervous system prophylactic therapy. An animal model is presented to assess these adverse effects on growth. Cranial irradiation (1000 cGy) with and without prednisolone (18 mg/kg i.p.) and methotrexate (2 mg/kg i.p.) was administered to 17- and 18-day-old Sprague-Dawley male and female rats. Animals were weighed 3 times/week. Final body weight and body length were measured at 150 days of age. Femur length and craniofacial dimensions were measured directly from the bones, using calipers. For all exposed groups there was a permanent suppression of weight gain with no catch-up growth or normal adolescent growth spurt. Body length was reduced for all treated groups, as were the ratios of body weight to body length and cranial length to body length. Animals subjected to cranial irradiation exhibited microcephaly, whereas those that received a combination of radiation and chemotherapy demonstrated altered craniofacial proportions in addition to microcephaly. Changes in growth patterns and skeletal proportions exhibited sexually dimorphic characteristics. The results indicate that cranial irradiation is a major factor in the growth failure of exposed rats, but chemotherapeutic agents contribute significantly to the outcome of growth and craniofacial dimensions.

  13. The Pedestrian Evacuation Analyst: geographic information systems software for modeling hazard evacuation potential

    USGS Publications Warehouse

    Jones, Jeanne M.; Ng, Peter; Wood, Nathan J.

    2014-01-01

    Recent disasters such as the 2011 Tohoku, Japan, earthquake and tsunami; the 2013 Colorado floods; and the 2014 Oso, Washington, mudslide have raised awareness of catastrophic, sudden-onset hazards that arrive within minutes of the events that trigger them, such as local earthquakes or landslides. Due to the limited amount of time between generation and arrival of sudden-onset hazards, evacuations are typically self-initiated, on foot, and across the landscape (Wood and Schmidtlein, 2012). Although evacuation to naturally occurring high ground may be feasible in some vulnerable communities, evacuation modeling has demonstrated that other communities may require vertical-evacuation structures within a hazard zone, such as berms or buildings, if at-risk individuals are to survive some types of sudden-onset hazards (Wood and Schmidtlein, 2013). Researchers use both static least-cost-distance (LCD) and dynamic agent-based models to assess the pedestrian evacuation potential of vulnerable communities. Although both types of models help to understand the evacuation landscape, LCD models provide a more general overview that is independent of population distributions, which may be difficult to quantify given the dynamic spatial and temporal nature of populations (Wood and Schmidtlein, 2012). Recent LCD efforts related to local tsunami threats have focused on an anisotropic (directionally dependent) path distance modeling approach that incorporates travel directionality, multiple travel speed assumptions, and cost surfaces that reflect variations in slope and land cover (Wood and Schmidtlein, 2012, 2013). The Pedestrian Evacuation Analyst software implements this anisotropic path-distance approach for pedestrian evacuation from sudden-onset hazards, with a particular focus at this time on local tsunami threats. The model estimates evacuation potential based on elevation, direction of movement, land cover, and travel speed and creates a map showing travel times to safety (a

  14. Gated strip proportional detector

    DOEpatents

    Morris, Christopher L.; Idzorek, George C.; Atencio, Leroy G.

    1987-01-01

    A gated strip proportional detector includes a gas tight chamber which encloses a solid ground plane, a wire anode plane, a wire gating plane, and a multiconductor cathode plane. The anode plane amplifies the amount of charge deposited in the chamber by a factor of up to 10^6. The gating plane allows only charge within a narrow strip to reach the cathode. The cathode plane collects the charge allowed to pass through the gating plane on a set of conductors perpendicular to the open-gated region. By scanning the open-gated region across the chamber and reading out the charge collected on the cathode conductors after a suitable integration time for each location of the gate, a two-dimensional image of the intensity of the ionizing radiation incident on the detector can be made.

  15. Gated strip proportional detector

    DOEpatents

    Morris, C.L.; Idzorek, G.C.; Atencio, L.G.

    1985-02-19

    A gated strip proportional detector includes a gas tight chamber which encloses a solid ground plane, a wire anode plane, a wire gating plane, and a multiconductor cathode plane. The anode plane amplifies the amount of charge deposited in the chamber by a factor of up to 10^6. The gating plane allows only charge within a narrow strip to reach the cathode. The cathode plane collects the charge allowed to pass through the gating plane on a set of conductors perpendicular to the open-gated region. By scanning the open-gated region across the chamber and reading out the charge collected on the cathode conductors after a suitable integration time for each location of the gate, a two-dimensional image of the intensity of the ionizing radiation incident on the detector can be made.

  16. Data Model for Multi Hazard Risk Assessment Spatial Support Decision System

    NASA Astrophysics Data System (ADS)

    Andrejchenko, Vera; Bakker, Wim; van Westen, Cees

    2014-05-01

    The goal of the CHANGES Spatial Decision Support System is to support end-users in making decisions related to risk reduction measures for areas at risk from multiple hydro-meteorological hazards. The crucial parts in the design of the system are the user requirements, the data model, the data storage and management, and the relationships between the objects in the system. The implementation of the data model is carried out entirely with an open source database management system with a spatial extension. The web application is implemented using open source geospatial technologies, with PostGIS as the database, Python for scripting, and Geoserver and JavaScript libraries for visualization and the client-side user interface. The model can handle information from different study areas (currently, study areas from France, Romania, Italy and Poland are considered). Furthermore, the data model handles information about administrative units, projects accessible by different types of users, user-defined hazard types (floods, snow avalanches, debris flows, etc.), hazard intensity maps of different return periods, spatial probability maps, elements-at-risk maps (buildings, land parcels, linear features etc.), and economic and population vulnerability information dependent on the hazard type and the type of element at risk, in the form of vulnerability curves. The system has an inbuilt database of vulnerability curves, but users can also add their own. Included in the model is the management of a combination of different scenarios (e.g. related to climate change, land use change or population change) and alternatives (possible risk-reduction measures), as well as data structures for saving the calculated economic or population loss or exposure per element at risk, aggregation of the loss and exposure using the administrative unit maps, and finally, producing the risk maps. The risk data can be used for cost-benefit analysis (CBA) and multi-criteria evaluation (SMCE). The

  17. Earthquake Rate Models for Evolving Induced Seismicity Hazard in the Central and Eastern US

    NASA Astrophysics Data System (ADS)

    Llenos, A. L.; Ellsworth, W. L.; Michael, A. J.

    2015-12-01

    Injection-induced earthquake rates can vary rapidly in space and time, which presents significant challenges to traditional probabilistic seismic hazard assessment methodologies that are based on a time-independent model of mainshock occurrence. To help society cope with rapidly evolving seismicity, the USGS is developing one-year hazard models for areas of induced seismicity in the central and eastern US to forecast the shaking due to all earthquakes, including aftershocks which are generally omitted from hazards assessments (Petersen et al., 2015). However, the spatial and temporal variability of the earthquake rates make them difficult to forecast even on time-scales as short as one year. An initial approach is to use the previous year's seismicity rate to forecast the next year's seismicity rate. However, in places such as northern Oklahoma the rates vary so rapidly over time that a simple linear extrapolation does not accurately forecast the future, even when the variability in the rates is modeled with simulations based on an Epidemic-Type Aftershock Sequence (ETAS) model (Ogata, JASA, 1988) to account for earthquake clustering. Instead of relying on a fixed time period for rate estimation, we explore another way to determine when the earthquake rate should be updated. This approach could also objectively identify new areas where the induced seismicity hazard model should be applied. We will estimate the background seismicity rate by optimizing a single set of ETAS aftershock triggering parameters across the most active induced seismicity zones -- Oklahoma, Guy-Greenbrier, the Raton Basin, and the Azle-Dallas-Fort Worth area -- with individual background rate parameters in each zone. The full seismicity rate, with uncertainties, can then be estimated using ETAS simulations and changes in rate can be detected by applying change point analysis in ETAS transformed time with methods already developed for Poisson processes.
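
    The clustering component referred to above is the ETAS conditional intensity, which can be written down compactly. The sketch below evaluates a purely temporal version for a toy catalogue; the parameter values and the catalogue itself are illustrative, not the ones calibrated for the induced-seismicity zones.

        import numpy as np

        def etas_intensity(t, event_times, event_mags, mu, K, alpha, c, p, m_ref):
            """Temporal ETAS conditional intensity (Ogata, 1988):
            lambda(t) = mu + sum_i K * exp(alpha*(M_i - m_ref)) / (t - t_i + c)**p,
            summed over events that occurred before time t (spatial terms omitted)."""
            past = event_times < t
            trig = K * np.exp(alpha * (event_mags[past] - m_ref)) \
                   / (t - event_times[past] + c) ** p
            return mu + trig.sum()

        # Hypothetical catalogue (times in days) and parameter values.
        times = np.array([1.0, 3.2, 3.3, 10.5, 40.0])
        mags = np.array([4.1, 5.0, 3.5, 4.4, 3.2])
        print(etas_intensity(41.0, times, mags,
                             mu=0.02, K=0.05, alpha=1.5, c=0.01, p=1.1, m_ref=3.0))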

  18. Hazard Ranking System and toxicological risk assessment models yield different results

    SciTech Connect

    Dehghani, T.; Sells, G. . CER-CLA Site Assessment Div.)

    1993-09-01

    A major goal of the Superfund Site Assessment program is identifying hazardous waste sites that pose unacceptable risks to human health and the environment. To accomplish this, EPA developed the Hazard Ranking System (HRS), a mathematical model used to assess the relative risks associated with actual or potential releases of hazardous wastes from a site. HRS is a scoring system based on factors grouped into three categories--likelihood of release, waste characteristics and targets. Values for the factor categories are multiplied, then normalized to 100 points to obtain a pathway score. Four pathways--groundwater, surface water, air migration and soil exposure--are evaluated and scored. The final HRS score is obtained by combining pathway scores using a root-mean-square method. HRS is intended to be a screening tool for measuring relative, rather than absolute, risk. The Superfund site assessment program usually requires at least two studies of a potential hazardous waste site before it is proposed for listing on the NPL. The initial study, or preliminary assessment (PA), is a limited-scope evaluation based on available historical information and data that can be gathered readily during a site reconnaissance.
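
    The root-mean-square combination of the four pathway scores described above is a one-line calculation; the sketch below shows it with hypothetical pathway scores (each pathway score is itself normalized to a 0-100 scale).

        import math

        def hrs_site_score(s_gw, s_sw, s_soil, s_air):
            """Combine the four pathway scores (each 0-100) with the
            root-mean-square rule to obtain the final site score."""
            return math.sqrt((s_gw**2 + s_sw**2 + s_soil**2 + s_air**2) / 4.0)

        # Hypothetical pathway scores: a site dominated by its groundwater pathway.
        print(round(hrs_site_score(s_gw=80, s_sw=30, s_soil=10, s_air=0), 2))  # ~43.01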

  19. What is proportional reasoning, anyway?

    NASA Astrophysics Data System (ADS)

    Subero, Keron; Kanim, Stephen

    2007-10-01

    There appears to be a correlation between some measures of scientific reasoning skills and gain on conceptual measures of student understanding of introductory physics such as the Force Concept Inventory. At NMSU, we have established a correlation between pretest scores on proportional reasoning tasks and student performance on conceptual post-tests in the introductory lab. Proponents of a Piagetian model of cognitive development would call these scientific reasoning skills "operational capacities" that signal the last transition in human intellectual growth from "Concrete Operational" to "Formal" reasoning. Seen in this light, the correlations described above suggest a cognitive "deficit" associated with development. We are exploring the possibility that proportional reasoning may in fact be a blanket term to describe many smaller elements of skills which students often seem to lack. In this talk, I will present some initial results from our investigation.

  20. Seismic hazard assessment of Sub-Saharan Africa using geodetic strain rate models

    NASA Astrophysics Data System (ADS)

    Poggi, Valerio; Pagani, Marco; Weatherill, Graeme; Garcia, Julio; Durrheim, Raymond J.; Mavonga Tuluka, Georges

    2016-04-01

    The East African Rift System (EARS) is the major active tectonic feature of the Sub-Saharan Africa (SSA) region. Although the seismicity level of such a divergent plate boundary can be described as moderate, several earthquakes have been reported in historical times causing a non-negligible level of damage, albeit mostly due to the high vulnerability of the local buildings and structures. Formulation and enforcement of national seismic codes is therefore an essential future risk mitigation strategy. Nonetheless, a reliable risk assessment cannot be done without the calibration of an updated seismic hazard model for the region. Unfortunately, the major issue in assessing seismic hazard in Sub-Saharan Africa is the lack of basic information needed to construct source and ground motion models. The historical earthquake record is largely incomplete, while the instrumental catalogue is complete down to sufficient magnitude only for a relatively short time span. In addition, mapping of seismogenically active faults is still an ongoing program. Recent studies have identified major seismogenic lineaments, but there is a substantial lack of kinematic information for intermediate-to-small scale tectonic features, information that is essential for the proper calibration of earthquake recurrence models. To compensate for this lack of information, we experiment with the use of a strain rate model recently developed by Stamps et al. (2015) in the framework of an earthquake hazard and risk project along the EARS supported by USAID and jointly carried out by GEM and AfricaArray. We use the inferred geodetic strain rates to derive estimates of total scalar moment release, subsequently used to constrain earthquake recurrence relationships for both area (as distributed seismicity) and fault source models. The rates obtained indirectly from strain rates and those derived more classically from the available seismic catalogues are then compared and combined into a single mixed earthquake recurrence model.
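
    The conversion from geodetic strain rate to an earthquake recurrence model can be sketched with a Kostrov-type moment balance. In the snippet below, the shear modulus, seismogenic thickness, zone dimensions, b-value and magnitude bounds are generic assumptions used only for illustration; the project itself uses calibrated values and more refined recurrence forms.

        import numpy as np

        def moment_rate_from_strain(strain_rate_per_yr, area_m2, thickness_m,
                                    shear_modulus_pa=3.0e10):
            """Kostrov-type conversion of a geodetic strain rate acting over a
            crustal volume into a scalar seismic moment rate (N*m per year)."""
            return 2.0 * shear_modulus_pa * thickness_m * area_m2 * strain_rate_per_yr

        def gr_activity_rate(moment_rate, b=1.0, m_min=4.5, m_max=7.5):
            """Annual rate of events with M >= m_min for a truncated
            Gutenberg-Richter distribution whose expected moment release
            balances the input moment rate (numerical moment balance)."""
            mags = np.linspace(m_min, m_max, 2001)
            beta = b * np.log(10.0)
            pdf = beta * np.exp(-beta * (mags - m_min)) / (1.0 - np.exp(-beta * (m_max - m_min)))
            moments = 10.0 ** (1.5 * mags + 9.05)          # Hanks & Kanamori (N*m)
            mean_moment = np.sum(pdf * moments) * (mags[1] - mags[0])
            return moment_rate / mean_moment

        # Hypothetical 100 km x 50 km zone, 15 km seismogenic thickness,
        # strain rate of 1e-8 per year.
        mdot = moment_rate_from_strain(1.0e-8, 100e3 * 50e3, 15e3)
        print(mdot, gr_activity_rate(mdot))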

  1. Accelerated Hazards Model based on Parametric Families Generalized with Bernstein Polynomials

    PubMed Central

    Chen, Yuhui; Hanson, Timothy; Zhang, Jiajia

    2015-01-01

    A transformed Bernstein polynomial that is centered at standard parametric families, such as Weibull or log-logistic, is proposed for use in the accelerated hazards model. This class provides a convenient way towards creating a Bayesian non-parametric prior for smooth densities, blending the merits of parametric and non-parametric methods, that is amenable to standard estimation approaches. For example, optimization methods in SAS or R can yield the posterior mode and asymptotic covariance matrix. This novel nonparametric prior is employed in the accelerated hazards model, which is further generalized to time-dependent covariates. The proposed approach fares considerably better than previous approaches in simulations; data on the effectiveness of biodegradable carmustine polymers against recurrent malignant brain gliomas are investigated. PMID:24261450

  2. Rockfall Hazard Analysis From Discrete Fracture Network Modelling with Finite Persistence Discontinuities

    NASA Astrophysics Data System (ADS)

    Lambert, Cédric; Thoeni, Klaus; Giacomini, Anna; Casagrande, Davide; Sloan, Scott

    2012-09-01

    Developing an accurate representation of the rock mass fabric is a key element in rock fall hazard analysis. The orientation, persistence and density of fractures control the volume and shape of unstable blocks or compartments. In this study, the discrete fracture modelling technique and digital photogrammetry were used to accurately depict the fabric. A volume distribution of unstable blocks was derived combining polyhedral modelling and kinematic analyses. For each block size, probabilities of failure and probabilities of propagation were calculated. A complete energy distribution was obtained by considering, for each block size, its occurrence in the rock mass, its probability of falling, its probability to reach a given location, and the resulting distribution of energies at each location. This distribution was then used with an energy-frequency diagram to assess the hazard.

  3. The Impact of the Subduction Modeling Beneath Calabria on Seismic Hazard

    NASA Astrophysics Data System (ADS)

    Morasca, P.; Johnson, W. J.; Del Giudice, T.; Poggi, P.; Traverso, C.; Parker, E. J.

    2014-12-01

    The aim of this work is to better understand the influence of subduction beneath Calabria on seismic hazard, as very little is known about present-day kinematics and the seismogenic potential of the slab interface in the Calabrian Arc region. This evaluation is significant because, depending on stress conditions, subduction zones can vary from being fully coupled to almost entirely decoupled, with important consequences for the seismic hazard assessment. Although the debate is still open about the current kinematics of the plates and microplates lying in the region and the degree of coupling of the Ionian lithosphere beneath Calabria, GPS data suggest that this subduction is locked in its interface sector. The lack of instrumentally recorded thrust earthquakes also suggests this zone is locked. The current seismotectonic model developed for the Italian national territory is simplified in this area and does not reflect the possibility of a locked subduction beneath Calabria that could produce infrequent, but very large, earthquakes associated with the subduction interface. Because of this we have conducted an independent seismic source analysis to take into account the influence of subduction as part of a regional seismic hazard analysis. Our final model includes two separate provinces for the subduction beneath Calabria: inslab and interface. From a geometrical point of view, the interface province is modeled with a depth between 20 and 50 km and a dip of 20°, while the inslab one dips 70° between 50 and 100 km. Following recent interpretations, we take into account that the interface subduction is possibly locked and, in such a case, large events could occur as characteristic earthquakes. The results of the PSHA show that the subduction beneath the Calabrian region has an influence on the total hazard for this region, especially for long return periods. Regional seismotectonic models for this region should account for subduction.

  4. Forward induced seismic hazard assessment: application to a synthetic seismicity catalogue from hydraulic stimulation modelling

    NASA Astrophysics Data System (ADS)

    Hakimhashemi, Amir Hossein; Yoon, Jeoung Seok; Heidbach, Oliver; Zang, Arno; Grünthal, Gottfried

    2014-07-01

    The Mw 3.2 seismic event induced in 2006 by fluid injection at the Basel geothermal site in Switzerland was the starting point for an ongoing discussion in Europe on the potential risk of hydraulic stimulation in general. In particular, further development of mitigation strategies for induced seismic events of economic concern became a central topic in geosciences and geoengineering. Here, we present a workflow to assess the hazard of induced seismicity in terms of the occurrence rate of induced seismic events. The workflow is called Forward Induced Seismic Hazard Assessment (FISHA) as it combines the results of forward hydromechanical-numerical models with methods of time-dependent probabilistic seismic hazard assessment. To exemplify FISHA, we use simulations of four different fluid injection types with various injection parameters, i.e. injection rate, duration and style of injection. The hydromechanical-numerical model applied in this study represents a geothermal reservoir with pre-existing fractures in which a routine for viscous fluid flow in porous media is implemented; flow- and pressure-driven failures of the rock matrix and of pre-existing fractures are simulated, and the corresponding seismic moment magnitudes are computed. The resulting synthetic catalogues of induced seismicity, including event location, occurrence time and magnitude, are used to calibrate the magnitude of completeness Mc and the parameters a and b of the frequency-magnitude relation. These are used to estimate the time-dependent occurrence rate of induced seismic events for each fluid injection scenario. In contrast to other mitigation strategies that rely on real-time data or already obtained catalogues, we can perform various synthetic experiments with the same initial conditions. Thus, the advantage of FISHA is that it can quantify hazard from numerical experiments and recommend a priori a stimulation type that lowers the occurrence rate of induced seismic events. The FISHA workflow is rather
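
    Calibrating a and b from a synthetic catalogue reduces to the classical maximum-likelihood estimate of the b-value. The sketch below applies Aki's estimator to synthetic exponentially distributed magnitudes; the completeness magnitude, true b-value and catalogue size are illustrative, and the optional Utsu binning correction applies when magnitudes are reported in discrete bins.

        import numpy as np

        def gutenberg_richter_fit(mags, m_c, dm=0.0):
            """Maximum-likelihood b-value (Aki, 1965; Utsu correction dm/2 for
            binned magnitudes) and a-value of log10 N(>=m) = a - b*m for the
            events above the completeness magnitude m_c."""
            m = np.asarray(mags)
            m = m[m >= m_c]
            b = np.log10(np.e) / (m.mean() - (m_c - dm / 2.0))
            a = np.log10(len(m)) + b * m_c      # counts over the catalogue duration
            return a, b

        # Hypothetical synthetic catalogue from one stimulation scenario.
        rng = np.random.default_rng(0)
        true_b, m_c = 1.2, 0.5
        mags = m_c + rng.exponential(scale=np.log10(np.e) / true_b, size=2000)
        a, b = gutenberg_richter_fit(mags, m_c)
        print(b, 10 ** (a - b * 2.0))   # b-value and expected count of M >= 2 events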

  5. Hydrological-hydraulic model cascading for pan-European flood hazard mapping

    NASA Astrophysics Data System (ADS)

    Alfieri, Lorenzo; Salamon, Peter; Bianchi, Alessandra; Pappenberger, Florian; Wetterhall, Fredrik

    2013-04-01

    Flood hazard maps at trans-national and continental scale have potential for a large number of applications, ranging from climate change studies and support to emergency planning for major flood crises, to early damage assessment and urban development, among others. However, such maps are usually available at rather coarse resolution, which limits their application to rough assessments. At finer resolution, maps are often limited to country boundaries, due to limited data sharing and specific cooperation programs at the trans-national level. The European Floods Directive 2007/60/EC requires EU Member States to map the potential flood extent for all water courses by the end of 2013. In this work we derive a pan-European flood hazard map at 100 m resolution, covering most of the European territory. The proposed approach is based on expanding the cascade model presented by Barredo et al. (2007). First, a pan-European distributed rainfall-runoff model with a resolution of 5x5 km is set up and calibrated using discharge observations at 481 gauging sites. Then, using a 21-year meteorological climatology, we derive a long-term discharge simulation. A generalized extreme value distribution is fitted to estimate flood peaks with a 100-year return period for each river pixel in the model. These data are downscaled to the river network at 100 m resolution and design flood hydrographs are derived for the 100-year return period event along the entire pan-European river network. Design flood hydrographs are then used to perform small-scale floodplain hydraulic simulations every 5 km along the river network using a two-dimensional hydraulic model. Finally, output maps of more than 35000 hydraulic simulations are merged into a pan-European flood hazard map. The quality of this map is evaluated for selected areas against the flood hazard maps provided by national/regional authorities. Finally, limitations of the approach and future directions of research are discussed.
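
    The extreme value step can be reproduced in a few lines with scipy; the annual-maxima series below is synthetic and the shape, location and scale values are arbitrary placeholders. With only 21 years of simulated climatology, the 100-year estimate carries a wide confidence interval, which is one reason such maps remain suited mainly to rough assessments.

        import numpy as np
        from scipy.stats import genextreme

        # Synthetic 21-year series of annual maximum discharge (m^3/s) for one pixel.
        rng = np.random.default_rng(0)
        annual_maxima = genextreme.rvs(c=-0.1, loc=800.0, scale=200.0, size=21,
                                       random_state=rng)

        # Fit a GEV distribution and read off the 100-year flood peak.
        shape, loc, scale = genextreme.fit(annual_maxima)
        q100 = genextreme.ppf(1.0 - 1.0 / 100.0, shape, loc=loc, scale=scale)
        print(f"100-year peak discharge: {q100:.0f} m^3/s")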

  6. Interpretation of laser/multi-sensor data for short range terrain modeling and hazard detection

    NASA Technical Reports Server (NTRS)

    Messing, B. S.

    1980-01-01

    A terrain modeling algorithm that would reconstruct the sensed ground images formed by the triangulation scheme, and classify as unsafe any terrain feature that would pose a hazard to a roving vehicle is described. This modeler greatly reduces quantization errors inherent in a laser/sensing system through the use of a thinning algorithm. Dual filters are employed to separate terrain steps from the general landscape, simplifying the analysis of terrain features. A crosspath analysis is utilized to detect and avoid obstacles that would adversely affect the roll of the vehicle. Computer simulations of the rover on various terrains examine the performance of the modeler.

  7. A seismic source zone model for the seismic hazard assessment of Slovakia

    NASA Astrophysics Data System (ADS)

    Hók, Jozef; Kysel, Robert; Kováč, Michal; Moczo, Peter; Kristek, Jozef; Kristeková, Miriam; Šujan, Martin

    2016-06-01

    We present a new seismic source zone model for the seismic hazard assessment of Slovakia based on a new seismotectonic model of the territory of Slovakia and adjacent areas. The seismotectonic model has been developed using a new Slovak earthquake catalogue (SLOVEC 2011), successive division of the large-scale geological structures into tectonic regions, seismogeological domains and seismogenic structures. The main criteria for definitions of regions, domains and structures are the age of the last tectonic consolidation of geological structures, thickness of lithosphere, thickness of crust, geothermal conditions, current tectonic regime and seismic activity. The seismic source zones are presented on a 1:1,000,000 scale map.

  8. Benchmarking Computational Fluid Dynamics Models for Application to Lava Flow Simulations and Hazard Assessment

    NASA Astrophysics Data System (ADS)

    Dietterich, H. R.; Lev, E.; Chen, J.; Cashman, K. V.; Honor, C.

    2015-12-01

    Recent eruptions in Hawai'i, Iceland, and Cape Verde highlight the need for improved lava flow models for forecasting and hazard assessment. Existing models used for lava flow simulation range in assumptions, complexity, and the degree to which they have been validated against analytical solutions, experiments, and natural observations. In order to assess the capabilities of existing models and test the development of new codes, we conduct a benchmarking study of computational fluid dynamics models for lava flows, including VolcFlow, OpenFOAM, Flow3D, and COMSOL. Using new benchmark scenarios defined in Cordonnier et al. (2015) as a guide, we model Newtonian, Herschel-Bulkley and cooling flows over inclined planes, obstacles, and digital elevation models with a wide range of source conditions. Results are compared to analytical theory, analogue and molten basalt experiments, and measurements from natural lava flows. Our study highlights the strengths and weaknesses of each code, including accuracy and computational costs, and provides insights regarding code selection. We apply the best-fit codes to simulate the lava flows in Harrat Rahat, a predominantly mafic volcanic field in Saudi Arabia. Input parameters are assembled from rheology and volume measurements of past flows using geochemistry, crystallinity, and present-day lidar and photogrammetric digital elevation models. With these data, we use our verified models to reconstruct historic and prehistoric events, in order to assess the hazards posed by lava flows for Harrat Rahat.

  9. Fuzzy multi-objective chance-constrained programming model for hazardous materials transportation

    NASA Astrophysics Data System (ADS)

    Du, Jiaoman; Yu, Lean; Li, Xiang

    2016-04-01

    Hazardous materials transportation is an important and pressing public safety issue. Based on the shortest path model, this paper presents a fuzzy multi-objective programming model that minimizes the transportation risk to life, travel time and fuel consumption. First, we present the risk model, travel time model and fuel consumption model. Furthermore, we formulate a chance-constrained programming model within the framework of credibility theory, in which the lengths of arcs in the transportation network are assumed to be fuzzy variables. A hybrid intelligent algorithm integrating fuzzy simulation and a genetic algorithm is designed for finding a satisfactory solution. Finally, some numerical examples are given to demonstrate the efficiency of the proposed model and algorithm.

  10. A new approach to hazardous materials transportation risk analysis: decision modeling to identify critical variables.

    PubMed

    Clark, Renee M; Besterfield-Sacre, Mary E

    2009-03-01

    We take a novel approach to analyzing hazardous materials transportation risk in this research. Previous studies analyzed this risk from an operations research (OR) or quantitative risk assessment (QRA) perspective by minimizing or calculating risk along a transport route. Further, even though the majority of incidents occur when containers are unloaded, the research has not focused on transportation-related activities, including container loading and unloading. In this work, we developed a decision model of a hazardous materials release during unloading using actual data and an exploratory data modeling approach. Previous studies have had a theoretical perspective in terms of identifying and advancing the key variables related to this risk, and there has not been a focus on probability and statistics-based approaches for doing this. Our decision model empirically identifies the critical variables using an exploratory methodology for a large, highly categorical database involving latent class analysis (LCA), loglinear modeling, and Bayesian networking. Our model identified the most influential variables and countermeasures for two consequences of a hazmat incident, dollar loss and release quantity, and is one of the first models to do this. The most influential variables were found to be related to the failure of the container. In addition to analyzing hazmat risk, our methodology can be used to develop data-driven models for strategic decision making in other domains involving risk.

  11. Perspectives on open access high resolution digital elevation models to produce global flood hazard layers

    NASA Astrophysics Data System (ADS)

    Sampson, Christopher; Smith, Andrew; Bates, Paul; Neal, Jeffrey; Trigg, Mark

    2015-12-01

    Global flood hazard models have recently become a reality thanks to the release of open access global digital elevation models, the development of simplified and highly efficient flow algorithms, and the steady increase in computational power. In this commentary we argue that although the availability of open access global terrain data has been critical in enabling the development of such models, the relatively poor resolution and precision of these data now limit significantly our ability to estimate flood inundation and risk for the majority of the planet's surface. The difficulty of deriving an accurate 'bare-earth' terrain model due to the interaction of vegetation and urban structures with the satellite-based remote sensors means that global terrain data are often poorest in the areas where people, property (and thus vulnerability) are most concentrated. Furthermore, the current generation of open access global terrain models are over a decade old and many large floodplains, particularly those in developing countries, have undergone significant change in this time. There is therefore a pressing need for a new generation of high resolution and high vertical precision open access global digital elevation models to allow significantly improved global flood hazard models to be developed.

  12. A new approach to hazardous materials transportation risk analysis: decision modeling to identify critical variables.

    PubMed

    Clark, Renee M; Besterfield-Sacre, Mary E

    2009-03-01

    We take a novel approach to analyzing hazardous materials transportation risk in this research. Previous studies analyzed this risk from an operations research (OR) or quantitative risk assessment (QRA) perspective by minimizing or calculating risk along a transport route. Further, even though the majority of incidents occur when containers are unloaded, the research has not focused on transportation-related activities, including container loading and unloading. In this work, we developed a decision model of a hazardous materials release during unloading using actual data and an exploratory data modeling approach. Previous studies have had a theoretical perspective in terms of identifying and advancing the key variables related to this risk, and there has not been a focus on probability and statistics-based approaches for doing this. Our decision model empirically identifies the critical variables using an exploratory methodology for a large, highly categorical database involving latent class analysis (LCA), loglinear modeling, and Bayesian networking. Our model identified the most influential variables and countermeasures for two consequences of a hazmat incident, dollar loss and release quantity, and is one of the first models to do this. The most influential variables were found to be related to the failure of the container. In addition to analyzing hazmat risk, our methodology can be used to develop data-driven models for strategic decision making in other domains involving risk. PMID:19087232

  13. Global Hydrological Hazard Evaluation System (Global BTOP) Using Distributed Hydrological Model

    NASA Astrophysics Data System (ADS)

    Gusyev, M.; Magome, J.; Hasegawa, A.; Takeuchi, K.

    2015-12-01

    A global hydrological hazard evaluation system based on the BTOP models (Global BTOP) is introduced; it quantifies flood and drought hazards with simulated river discharges globally for historical, near real-time monitoring and climate change impact studies. The BTOP model utilizes a modified topographic index concept and simulates rainfall-runoff processes including snowmelt, overland flow, soil moisture in the root and unsaturated zones, sub-surface flow, and river flow routing. The current global BTOP is constructed from global data on a 10-min grid and can be used to conduct river basin analysis on local, regional, and global scales. To reduce the impact of the coarse resolution, topographical features of global BTOP were obtained using a river network upscaling algorithm that preserves fine-resolution characteristics of the 3-arcsec HydroSHEDS and 30-arcsec Hydro1K datasets. In addition, GLCC-IGBP land cover (USGS) and the DSMW (FAO) were used for the root zone depth and soil properties, respectively. The long-term seasonal potential evapotranspiration within the BTOP model was estimated by the Shuttleworth-Wallace model using the CRU TS3.1 climate forcing data and GIMMS-NDVI (UMD/GLCF). The global BTOP was run with globally available precipitation such as the APHRODITE dataset and showed good statistical performance compared to global and local river discharge data in major river basins. From the simulated daily river discharges at each grid cell, flood peak discharges of selected return periods were obtained using the Gumbel distribution with L-moments, and the hydrological drought hazard was quantified using the standardized runoff index (SRI). For dynamic (near real-time) applications, the global BTOP model is run with GSMaP-NRT global precipitation and the simulated daily river discharges are utilized in a prototype near-real-time discharge simulation system (GFAS-Streamflow), which is used to issue flood peak discharge alerts globally. The global BTOP system and GFAS
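
    A small sketch of the flood-frequency step described above: Gumbel parameters estimated from the first two sample L-moments of annual maximum discharges, then a T-year return level. The discharge sample is hypothetical.

```python
# Gumbel fit via L-moments and return-level calculation (illustrative data).
import numpy as np

def gumbel_lmoments(annual_maxima):
    """Estimate Gumbel location/scale from the first two sample L-moments."""
    x = np.sort(np.asarray(annual_maxima, dtype=float))
    n = len(x)
    b0 = x.mean()
    b1 = np.sum((np.arange(n) / (n - 1)) * x) / n      # probability-weighted moment
    l1, l2 = b0, 2.0 * b1 - b0                          # first two L-moments
    scale = l2 / np.log(2.0)
    loc = l1 - 0.5772156649 * scale                     # Euler-Mascheroni constant
    return loc, scale

def gumbel_return_level(loc, scale, T):
    """Discharge exceeded on average once every T years."""
    return loc - scale * np.log(-np.log(1.0 - 1.0 / T))

q_max = [820, 950, 610, 1340, 1120, 760, 980, 1510, 890, 1040]  # m3/s, hypothetical
loc, scale = gumbel_lmoments(q_max)
print("100-year peak discharge:", round(gumbel_return_level(loc, scale, 100.0), 1), "m3/s")
```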

  14. CyberShake: A Physics-Based Seismic Hazard Model for Southern California

    USGS Publications Warehouse

    Graves, R.; Jordan, T.H.; Callaghan, S.; Deelman, E.; Field, E.; Juve, G.; Kesselman, C.; Maechling, P.; Mehta, G.; Milner, K.; Okaya, D.; Small, P.; Vahi, K.

    2011-01-01

    CyberShake, as part of the Southern California Earthquake Center's (SCEC) Community Modeling Environment, is developing a methodology that explicitly incorporates deterministic source and wave propagation effects within seismic hazard calculations through the use of physics-based 3D ground motion simulations. To calculate a waveform-based seismic hazard estimate for a site of interest, we begin with Uniform California Earthquake Rupture Forecast, Version 2.0 (UCERF2.0) and identify all ruptures within 200 km of the site of interest. We convert the UCERF2.0 rupture definition into multiple rupture variations with differing hypocenter locations and slip distributions, resulting in about 415,000 rupture variations per site. Strain Green Tensors are calculated for the site of interest using the SCEC Community Velocity Model, Version 4 (CVM4), and then, using reciprocity, we calculate synthetic seismograms for each rupture variation. Peak intensity measures are then extracted from these synthetics and combined with the original rupture probabilities to produce probabilistic seismic hazard curves for the site. Being explicitly site-based, CyberShake directly samples the ground motion variability at that site over many earthquake cycles (i. e., rupture scenarios) and alleviates the need for the ergodic assumption that is implicitly included in traditional empirically based calculations. Thus far, we have simulated ruptures at over 200 sites in the Los Angeles region for ground shaking periods of 2 s and longer, providing the basis for the first generation CyberShake hazard maps. Our results indicate that the combination of rupture directivity and basin response effects can lead to an increase in the hazard level for some sites, relative to that given by a conventional Ground Motion Prediction Equation (GMPE). Additionally, and perhaps more importantly, we find that the physics-based hazard results are much more sensitive to the assumed magnitude-area relations and
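
    The hazard-curve assembly described above can be sketched as follows: for each rupture, the probability of exceeding a ground-motion level is estimated from its simulated rupture variations and weighted by the rupture's annual rate. The rates and simulated intensity measures below are hypothetical stand-ins, not CyberShake outputs.

```python
# Site hazard curve from per-rupture simulated intensity measures (illustrative).
import numpy as np

rng = np.random.default_rng(0)
# annual occurrence rate and simulated spectral accelerations (g) per rupture
ruptures = [
    {"rate": 1e-3, "sa": rng.lognormal(mean=np.log(0.10), sigma=0.5, size=400)},
    {"rate": 5e-4, "sa": rng.lognormal(mean=np.log(0.25), sigma=0.6, size=400)},
    {"rate": 2e-4, "sa": rng.lognormal(mean=np.log(0.40), sigma=0.6, size=400)},
]

im_levels = np.logspace(-2, 0, 25)             # ground-motion levels of interest (g)
annual_rate = np.zeros_like(im_levels)
for rup in ruptures:
    p_exceed = (rup["sa"][None, :] > im_levels[:, None]).mean(axis=1)
    annual_rate += rup["rate"] * p_exceed

prob_50yr = 1.0 - np.exp(-annual_rate * 50.0)  # Poisson assumption
print("P(SA > 0.1 g in 50 yr):", round(np.interp(0.1, im_levels, prob_50yr), 3))
```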

  15. Beyond Flood Hazard Maps: Detailed Flood Characterization with Remote Sensing, GIS and 2d Modelling

    NASA Astrophysics Data System (ADS)

    Santillan, J. R.; Marqueso, J. T.; Makinano-Santillan, M.; Serviano, J. L.

    2016-09-01

    Flooding is considered one of the most destructive natural disasters, so understanding floods and assessing the risks associated with them is becoming increasingly important. In the Philippines, Remote Sensing (RS) and Geographic Information System (GIS) are the two main technologies used in the nationwide modelling and mapping of flood hazards. Although the currently available high resolution flood hazard maps have become very valuable, their use for flood preparedness and mitigation can be maximized by enhancing the layers of information these maps portray. In this paper, we present an approach based on RS, GIS and two-dimensional (2D) flood modelling to generate new flood layers (in addition to the usual flood depth and hazard layers) that are also very useful in flood disaster management, such as flood arrival times, flood velocities, flood duration, flood recession times, and the percentage of a given flood event period during which a particular location is inundated. The availability of these new layers of flood information is crucial for better decision making before, during, and after the occurrence of a flood disaster. The generation of these new flood characteristic layers is illustrated using the Cabadbaran River Basin in Mindanao, Philippines as the case study area. It is envisioned that these detailed maps can be considered as additional inputs in flood disaster risk reduction and management in the Philippines.
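
    A minimal sketch of how the additional layers named above can be derived from per-cell depth time series produced by a 2D model. The wet threshold, time step and depth series are hypothetical.

```python
# Derive arrival time, recession time, duration and inundated-time percentage per cell.
import numpy as np

def flood_layers(depth_series, dt_hours=0.5, wet_threshold=0.05):
    """depth_series: array (n_steps, n_cells) of water depths in metres."""
    wet = depth_series >= wet_threshold                       # inundated or not
    ever_wet = wet.any(axis=0)
    first = np.argmax(wet, axis=0)                            # first wet step
    last = depth_series.shape[0] - 1 - np.argmax(wet[::-1], axis=0)
    arrival_h = np.where(ever_wet, first * dt_hours, np.nan)
    recession_h = np.where(ever_wet, last * dt_hours, np.nan)
    duration_h = wet.sum(axis=0) * dt_hours                   # total wet time
    pct_inundated = 100.0 * wet.mean(axis=0)                  # % of event period
    return arrival_h, recession_h, duration_h, pct_inundated

depths = np.random.default_rng(1).random((48, 5)) * 0.3       # toy 24-hour event
print(flood_layers(depths))
```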

  16. Near-Field Probabilistic Seismic Hazard Analysis of Metropolitan Tehran Using Region-Specific Directivity Models

    NASA Astrophysics Data System (ADS)

    Yazdani, Azad; Nicknam, Ahmad; Dadras, Ehsan Yousefi; Eftekhari, Seyed Nasrollah

    2016-09-01

    Ground motions are affected by directivity effects at near-fault regions which result in low-frequency cycle pulses at the beginning of the velocity time history. The directivity features of near-fault ground motions can lead to significant increase in the risk of earthquake-induced damage on engineering structures. The ordinary probabilistic seismic hazard analysis (PSHA) does not take into account such effects; recent studies have thus proposed new frameworks to incorporate directivity effects in PSHA. The objective of this study is to develop the seismic hazard mapping of Tehran City according to near-fault PSHA procedure for different return periods. To this end, the directivity models required in the modified PSHA were developed based on a database of the simulated ground motions. The simulated database was used in this study because there are no recorded near-fault data in the region to derive purely empirically based pulse prediction models. The results show that the directivity effects can significantly affect the estimate of regional seismic hazard.

  17. Towards inclusion of dynamic slip features in stochastic models for probabilistic (tsunami) hazard analysis.

    NASA Astrophysics Data System (ADS)

    Murphy, S.; Scala, A.; Herrero, A.; Lorito, S.; Nielsen, S. B.; Festa, G.; Trasatti, E.; Tonini, R.; Molinari, I.; Romano, F.

    2015-12-01

    Stochastic slip modelling based on general scaling features with uniform slip probability over the fault plane is commonly employed in tsunami and seismic hazard analysis. However, dynamic rupture effects driven by specific fault geometry and frictional conditions can potentially control the slip probability. Unfortunately, dynamic simulations can be computationally intensive, preventing their extensive use for hazard analysis. The aim of this study is to produce a stochastic model that incorporates slip features observed in dynamic simulations. Taking a Tohoku-like fault as a case study, numerous 2D spectral element dynamic simulations are performed using a variety of pre-stress distributions. Comparing the slip distributions generated from these simulations to traditional stochastic slip models, we find that the stochastic models generally under-represent slip near the free surface. This is an important feature for tsunami hazard, given the very large slip at shallow depth observed for the 2011 Tohoku earthquake. To incorporate dynamic features in the stochastic modelling we generate a depth-dependent "transfer function" based on comparisons between the dynamic and stochastic models. Assuming that the differences between stochastic and dynamic slip distributions are predominantly depth dependent and not along strike, the transfer function is then applied to stochastic source models over a 3D geometry of the Tohoku fault. Comparing maximum tsunami wave height along the Japanese coast using a traditional stochastic model and one modified by the transfer function, we find that the inclusion of the transfer function leads to the occurrence of more extreme events. Applying this function to the traditional stochastic slip distribution as a depth-dependent PDF for the slip may allow for an approximated but efficient incorporation of regionally specific dynamic features in a modified source model, to be used specifically when a significant number of slip scenarios need to be produced, e
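
    A toy illustration of the depth-dependent "transfer function" idea: amplify shallow slip in a stochastic slip distribution while preserving the total moment. The functional form, parameters and slip values are hypothetical, not those derived in the study.

```python
# Apply a depth-dependent amplification to a stochastic slip vector (illustrative).
import numpy as np

def apply_transfer_function(slip, depth_km, scale_depth_km=15.0, surface_boost=2.0):
    """Amplify slip near the free surface while preserving the summed slip."""
    weight = 1.0 + (surface_boost - 1.0) * np.exp(-depth_km / scale_depth_km)
    modified = slip * weight
    return modified * slip.sum() / modified.sum()    # rescale to the original total

depth_km = np.linspace(2.0, 50.0, 10)                # subfault depths
slip = np.random.default_rng(2).gamma(2.0, 2.0, 10)  # stochastic slip, metres
print(np.round(apply_transfer_function(slip, depth_km), 2))
```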

  18. A model standardized risk assessment protocol for use with hazardous waste sites.

    PubMed Central

    Marsh, G M; Day, R

    1991-01-01

    This paper presents a model standardized risk assessment protocol (SRAP) for use with hazardous waste sites. The proposed SRAP focuses on the degree and patterns of evidence that exist for a significant risk to human populations from exposure to a hazardous waste site. The SRAP was designed with at least four specific goals in mind: to organize the available scientific data on a specific site and to highlight important gaps in this knowledge; to facilitate rational, cost-effective decision making about the best distribution of available manpower and resources; to systematically classify sites roughly according to the level of risk they pose to surrounding human populations; and to promote an improved level of communication among professionals working in the area of waste site management and between decision makers and the local population. PMID:2050062

  19. Detailed Flood Modeling and Hazard Assessment from Storm Tides, Rainfall and Sea Level Rise

    NASA Astrophysics Data System (ADS)

    Orton, P. M.; Hall, T. M.; Georgas, N.; Conticello, F.; Cioffi, F.; Lall, U.; Vinogradov, S. V.; Blumberg, A. F.

    2014-12-01

    A flood hazard assessment has been conducted for the Hudson River from New York City to Troy at the head of tide, using a three-dimensional hydrodynamic model and merging hydrologic inputs and storm tides from tropical and extra-tropical cyclones, as well as spring freshet floods. Our recent work showed that neglecting freshwater flows leads to underestimation of peak water levels at up-river sites and neglecting stratification (typical with two-dimensional modeling) leads to underestimation all along the Hudson. The hazard assessment framework utilizes a representative climatology of over 1000 synthetic tropical cyclones (TCs) derived from a statistical-stochastic TC model, and historical extra-tropical cyclones and freshets from 1950-present. Hydrodynamic modeling is applied with seasonal variations in mean sea level and ocean and estuary stratification. The model is the Stevens ECOM model and is separately used for operational ocean forecasts on the NYHOPS domain (http://stevens.edu/NYHOPS). For the synthetic TCs, an Artificial Neural Network/ Bayesian multivariate approach is used for rainfall-driven freshwater inputs to the Hudson, translating the TC attributes (e.g. track, SST, wind speed) directly into tributary stream flows (see separate presentation by Cioffi for details). Rainfall intensity has been rising in recent decades in this region, and here we will also examine the sensitivity of Hudson flooding to future climate warming-driven increases in storm precipitation. The hazard assessment is being repeated for several values of sea level, as projected for future decades by the New York City Panel on Climate Change. Recent studies have given widely varying estimates of the present-day 100-year flood at New York City, from 2.0 m to 3.5 m, and special emphasis will be placed on quantifying our study's uncertainty.

  20. Multiple Landslide-Hazard Scenarios Modeled for the Oakland-Berkeley Area, Northern California

    USGS Publications Warehouse

    Pike, Richard J.; Graymer, Russell W.

    2008-01-01

    With the exception of Los Angeles, perhaps no urban area in the United States is more at risk from landsliding, triggered by either precipitation or earthquake, than the San Francisco Bay region of northern California. By January each year, seasonal winter storms usually bring moisture levels of San Francisco Bay region hillsides to the point of saturation, after which additional heavy rainfall may induce landslides of various types and levels of severity. In addition, movement at any time along one of several active faults in the area may generate an earthquake large enough to trigger landslides. The danger to life and property rises each year as local populations continue to expand and more hillsides are graded for development of residential housing and its supporting infrastructure. The chapters in the text consist of: *Introduction by Russell W. Graymer *Chapter 1 Rainfall Thresholds for Landslide Activity, San Francisco Bay Region, Northern California by Raymond C. Wilson *Chapter 2 Susceptibility to Deep-Seated Landsliding Modeled for the Oakland-Berkeley Area, Northern California by Richard J. Pike and Steven Sobieszczyk *Chapter 3 Susceptibility to Shallow Landsliding Modeled for the Oakland-Berkeley Area, Northern California by Kevin M. Schmidt and Steven Sobieszczyk *Chapter 4 Landslide Hazard Modeled for the Cities of Oakland, Piedmont, and Berkeley, Northern California, from a M=7.1 Scenario Earthquake on the Hayward Fault Zone by Scott B. Miles and David K. Keefer *Chapter 5 Synthesis of Landslide-Hazard Scenarios Modeled for the Oakland-Berkeley Area, Northern California by Richard J. Pike The plates consist of: *Plate 1 Susceptibility to Deep-Seated Landsliding Modeled for the Oakland-Berkeley Area, Northern California by Richard J. Pike, Russell W. Graymer, Sebastian Roberts, Naomi B. Kalman, and Steven Sobieszczyk *Plate 2 Susceptibility to Shallow Landsliding Modeled for the Oakland-Berkeley Area, Northern California by Kevin M. Schmidt and Steven

  1. Development of hydrogeological modelling approaches for assessment of consequences of hazardous accidents at nuclear power plants

    SciTech Connect

    Rumynin, V.G.; Mironenko, V.A.; Konosavsky, P.K.; Pereverzeva, S.A.

    1994-07-01

    This paper introduces some modeling approaches for predicting the influence of hazardous accidents at nuclear reactors on groundwater quality. Possible pathways for radioactive releases from nuclear power plants were considered to conceptualize boundary conditions for solving the subsurface radionuclide transport problems. Some approaches to incorporate physical and chemical interactions into transport simulators have been developed. The hydrogeological forecasts were based on numerical and semi-analytical scale-dependent models. They have been applied to assess the possible impact of nuclear power plants designed in Russia on groundwater reservoirs.

  2. NASA/MSFC multilayer diffusion models and computer program for operational prediction of toxic fuel hazards

    NASA Technical Reports Server (NTRS)

    Dumbauld, R. K.; Bjorklund, J. R.; Bowers, J. F.

    1973-01-01

    The NASA/MSFC multilayer diffusion models are described, which are used in applying meteorological information to the estimation of toxic fuel hazards resulting from the launch of rocket vehicles and from accidental cold spills and leaks of toxic fuels. Background information, definitions of terms, and a description of the multilayer concept are presented, along with formulas for determining the buoyant rise of hot exhaust clouds or plumes from conflagrations, and descriptions of the multilayer diffusion models. A brief description of the computer program is given, and sample problems and their solutions are included. Derivations of the cloud rise formulas, user instructions, and computer program output lists are also included.
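
    The NASA/MSFC multilayer models themselves are not reproduced here; as an illustrative stand-in, the sketch below evaluates a basic single-layer Gaussian plume, the kind of building block that such diffusion models extend. All input values are hypothetical.

```python
# Basic Gaussian plume concentration with ground reflection (illustrative stand-in).
import numpy as np

def gaussian_plume(Q, u, y, z, H, sigma_y, sigma_z):
    """Concentration [kg/m3] from a continuous point source.

    Q: emission rate [kg/s], u: wind speed [m/s], H: effective release height [m],
    sigma_y/sigma_z: dispersion coefficients [m] at the downwind distance of interest."""
    lateral = np.exp(-y**2 / (2 * sigma_y**2))
    vertical = (np.exp(-(z - H)**2 / (2 * sigma_z**2))
                + np.exp(-(z + H)**2 / (2 * sigma_z**2)))    # ground reflection term
    return Q / (2 * np.pi * u * sigma_y * sigma_z) * lateral * vertical

# concentration on the plume centreline at ground level, ~2 km downwind
print(gaussian_plume(Q=5.0, u=4.0, y=0.0, z=0.0, H=150.0, sigma_y=160.0, sigma_z=80.0), "kg/m3")
```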

  3. Development Of An Open System For Integration Of Heterogeneous Models For Flood Forecasting And Hazard Mitigation

    NASA Astrophysics Data System (ADS)

    Chang, W.; Tsai, W.; Lin, F.; Lin, S.; Lien, H.; Chung, T.; Huang, L.; Lee, K.; Chang, C.

    2008-12-01

    During a typhoon or heavy storm event, various forecasting models can technically predict rainfall intensity, water level variation in rivers, and flooding in urban areas. In practice, however, two issues tend to restrict the further application of these models as a decision support system (DSS) for hazard mitigation. The first is the difficulty of integrating heterogeneous models: one has to account for the differing formats of the models, such as input files, output files, computational requirements, and so on. The second is that, owing to the heterogeneity of models and systems, developing a DSS requires a user-friendly interface or platform that hides the complexity of the various tools from users. Users are expected to be governmental officials rather than professional experts, so a complicated DSS interface is not acceptable. Based on these considerations, in the present study we develop an open system for integrating several simulation models for flood forecasting by adopting the FEWS (Flood Early Warning System) platform developed by WL | Delft Hydraulics. It allows us to link heterogeneous models effectively and provides suitable display modules. In addition, FEWS has been adopted by the Water Resource Agency (WRA), Taiwan as the standard operational system for river flooding management, which means this work can easily be integrated with practical cases. In the present study, based on the FEWS platform, the basin rainfall-runoff model, the SOBEK channel-routing model, and an estuary tide forecasting model are linked and integrated through the physical connection of model initial and boundary definitions. The work flow of the integrated processes of the models is shown in Fig. 1. This differs from the typical single-model linking used in FEWS, which only aims at data exchange without much physical consideration. So it really

  4. A Simple Model for Probabilistic Seismic Hazard Analysis of Induced Seismicity Associated With Deep Geothermal Systems

    NASA Astrophysics Data System (ADS)

    Schlittenhardt, Joerg; Spies, Thomas; Kopera, Juergen; Morales Aviles, Wilhelm

    2014-05-01

    In the research project MAGS (Microseismic activity of geothermal systems) funded by the German Federal Ministry of Environment (BMU) a simple model was developed to determine seismic hazard as the probability of the exceedance of ground motion of a certain size. Such estimates of the annual frequency of exceedance of prescriptive limits of e.g. seismic intensities or ground motions are needed for the planning and licensing, but likewise for the development and operation of deep geothermal systems. For the development of the proposed model well established probabilistic seismic hazard analysis (PSHA) methods for the estimation of the hazard for the case of natural seismicity were adapted to the case of induced seismicity. Important differences between induced and natural seismicity had to be considered. These include significantly smaller magnitudes, depths and source to site distances of the seismic events and, hence, different ground motion prediction equations (GMPE) that had to be incorporated to account for the seismic amplitude attenuation with distance as well as differences in the stationarity of the underlying tectonic and induced processes. Appropriate GMPE's in terms of PGV (peak ground velocity) were tested and selected from the literature. The proposed model and its application to the case of induced seismicity observed during the circulation period (operation phase of the plant) at geothermal sites in Germany will be presented. Using GMPE's for PGV has the advantage to estimate hazard in terms of velocities of ground motion, which can be linked to engineering regulations (e.g. German DIN 4150) which give prescriptive standards for the effects of vibrations on buildings and people. It is thus possible to specify the probability of exceedance of such prescriptive standard values and to decide whether they can be accepted or not. On the other hand hazard curves for induced and natural seismicity can be compared to study the impact at a site. Preliminary
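
    A compact sketch of the adapted PSHA recipe outlined above: combine an activity rate per magnitude bin with a lognormal PGV prediction equation and read off the annual frequency of exceeding a prescriptive PGV limit (e.g. a DIN 4150-type value). The GMPE coefficients, rates, distance and limit below are hypothetical.

```python
# Annual exceedance frequency of a PGV threshold for induced seismicity (illustrative).
import numpy as np
from scipy.stats import norm

def ln_pgv_median(mag, dist_km, c0=-4.5, c1=1.2, c2=-1.4):
    """Toy GMPE: ln of median PGV [m/s] as a function of magnitude and distance."""
    return c0 + c1 * mag + c2 * np.log(dist_km + 1.0)

mags = np.arange(1.0, 3.6, 0.5)              # induced events are small
annual_rates = 30.0 * 10 ** (-1.0 * mags)    # Gutenberg-Richter-like bin rates
dist_km, sigma_ln = 3.0, 0.7                 # hypocentral distance, GMPE scatter
pgv_limit = 0.005                            # 5 mm/s, illustrative limit

rate_exceed = sum(
    rate * norm.sf((np.log(pgv_limit) - ln_pgv_median(m, dist_km)) / sigma_ln)
    for m, rate in zip(mags, annual_rates)
)
print("Annual frequency of PGV > 5 mm/s:", round(rate_exceed, 4))
```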

  5. An approach for modeling thermal destruction of hazardous wastes in circulating fluidized bed incinerator.

    PubMed

    Patil, M P; Sonolikar, R L

    2008-10-01

    This paper presents a detailed computational fluid dynamics (CFD) based approach for modeling thermal destruction of hazardous wastes in a circulating fluidized bed (CFB) incinerator. The model is based on an Euler-Lagrangian approach in which the gas phase (continuous phase) is treated in an Eulerian reference frame, whereas the waste particulate (dispersed phase) is treated in a Lagrangian reference frame. The reaction chemistry has been modeled through a mixture fraction/PDF approach. The conservation equations for mass, momentum, energy, mixture fraction and other closure equations have been solved using the general purpose CFD code FLUENT 4.5. A finite volume method on a structured grid has been used for solution of the governing equations. The model provides detailed information on the hydrodynamics (gas velocity, particulate trajectories), gas composition (CO, CO2, O2) and temperature inside the riser. The model also allows different operating scenarios to be examined in an efficient manner. PMID:19697764

  6. A spatio-temporal model for probabilistic seismic hazard zonation of Tehran

    NASA Astrophysics Data System (ADS)

    Hashemi, Mahdi; Alesheikh, Ali Asghar; Zolfaghari, Mohammad Reza

    2013-08-01

    A precondition for all disaster management steps, building damage prediction, and construction code development is a hazard assessment that shows the exceedance probabilities of different ground motion levels at a site, considering different near- and far-field earthquake sources. The seismic sources are usually categorized as time-independent area sources and time-dependent fault sources. While the former incorporates small and medium events, the latter takes into account only large characteristic earthquakes. In this article, a probabilistic approach is proposed to aggregate the effects of time-dependent and time-independent sources on seismic hazard. The methodology is then applied to generate three probabilistic seismic hazard maps of Tehran for 10%, 5%, and 2% exceedance probabilities in 50 years. The results indicate an increase in peak ground acceleration (PGA) values toward the southeastern part of the study area, and the PGA variations are mostly controlled by the shear wave velocities across the city. In addition, the implementation of the methodology takes advantage of GIS capabilities, especially raster-based analyses and representations. During the estimation of the PGA exceedance rates, the emphasis has been placed on incorporating the effects of different attenuation relationships and seismic source models by using a logic tree.
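
    The logic-tree step can be sketched simply: hazard curves computed with alternative attenuation relationships (or source models) are combined with branch weights to give the final exceedance rates. The curves and weights below are hypothetical placeholders.

```python
# Weighted combination of logic-tree branch hazard curves (illustrative).
import numpy as np

pga_levels = np.array([0.05, 0.10, 0.20, 0.30, 0.40])           # g
branches = {
    # branch weight and annual exceedance rates (e.g. from different GMPEs)
    "gmpe_A": (0.5, np.array([2e-2, 8e-3, 2e-3, 8e-4, 3e-4])),
    "gmpe_B": (0.3, np.array([3e-2, 1e-2, 3e-3, 1e-3, 4e-4])),
    "gmpe_C": (0.2, np.array([1e-2, 5e-3, 1e-3, 5e-4, 2e-4])),
}

combined = sum(w * rates for w, rates in branches.values())
prob_50yr = 1.0 - np.exp(-combined * 50.0)                       # Poisson assumption
for pga, p in zip(pga_levels, prob_50yr):
    print(f"PGA {pga:.2f} g: {100 * p:.1f}% in 50 years")
```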

  7. Monitoring and forecast of hydro meteorological hazards basing on data of distant assay and mathematical modeling

    NASA Astrophysics Data System (ADS)

    Sapunov, Valentin; Dikinis, Alexandr; Voronov, Nikolai

    2014-05-01

    The Russian Federation, with its vast area, has a low density of land-based meteorological observation points. The monitoring network is not sufficient for effective forecasting of weather dynamics and extreme situations. With the increase in extreme situations and incidents such as hurricanes (which have doubled since the beginning of the XXI century), reconstruction of the monitoring network is needed. The basis for such progress is remote monitoring using aircraft and satellites, supplementing land-based contact monitoring at existing points and stations. The interaction of contact and remote observations can make hydrometeorological data and predictions more refined and significant. Traditional physical methods should be supplemented by new biological methods. According to our research, animals are able to anticipate extreme hazards of natural and anthropogenic origin, based on an interaction between biological matter and a probable physical field that is still under preliminary study. For example, it was animals that anticipated the fall of the Chelyabinsk meteorite in 2013. Adding biological indication to the complex of meteorological data may increase the significance of hazard prediction. Uniting all data and approaches may become the basis of the proposed mathematical hydrometeorological weather models. Introducing the reported complex of methods into practice may decrease losses from hydrometeorological risks and hazards and increase the stability of the country's economy.

  8. SCEC/CME CyberShake: Probabilistic Seismic Hazard Analysis Using 3D Seismic Waveform Modeling

    NASA Astrophysics Data System (ADS)

    Callaghan, S.; Maechling, P. J.; Cui, Y.; Faerman, M.; Field, E.; Graves, R.; Gupta, N.; Gupta, V.; Jordan, T. H.; Kesselman, C.; Mehta, G.; Okaya, D.; Vahi, K.; Zhao, L.

    2005-12-01

    Researchers on the SCEC Community Modeling Environment (SCEC/CME) Project are calculating Probabilistic Seismic Hazard Curves for several sites in the Los Angeles area. The hazard curves calculated in this study use Intensity Measure Relationships (IMRs) based on 3D ground motion simulations rather than on attenuation relationships. State-of-the-art Probabilistic Seismic Hazard Analysis (PSHA) is currently conducted using IMRs that use empirically-based attenuation relationships. These attenuation relationships represent relatively simple analytical models based on the regression of observed data. However, it is widely believed that significant improvements in SHA will rely on the use of more physics-based, waveform modeling. In fact, a more physics-based approach to PSHA was endorsed in a recent assessment of earthquake science by National Research Council (2003). In order to introduce the use of 3D seismic waveform modeling into PSHA hazard curve calculations, the SCEC/CME CyberShake group is integrating state-of-the-art PSHA software tools (OpenSHA), SCEC-developed geophysical models (SCEC CVM3.0), validated anelastic wave modeling (AWM) software, and state-of-the-art computational technologies including high performance computing and grid-based scientific workflows in an effort to develop an OpenSHA-compatible 3D waveform-based IMR component. This will allow researchers to combine a new class of waveform-based IMRs with the large number of existing PSHA components, such as Earthquake Rupture Forecasts (ERF's), that are currently implemented in the OpenSHA system. To calculate a probabilistic hazard curve for a site of interest, we use the OpenSHA implementation of the NSHMP-2002 ERF and identify all ruptures within 200km of the site of interest. For each of these ruptures, we convert the NSHMP-2002 rupture definition into one, or more, Ruptures with Slip Time History (Rupture Variations) using newly developed Rupture Generator software. Strain Green Tensors are

  9. Mark-specific Hazard Ratio Model with Multivariate Continuous Marks: An Application to Vaccine Efficacy

    PubMed Central

    Gilbert, P. B.

    2014-01-01

    Summary In randomized placebo-controlled preventive HIV vaccine efficacy trials, an objective is to evaluate the relationship between vaccine efficacy to prevent infection and genetic distances of the exposing HIV strains to the multiple HIV sequences included in the vaccine construct, where the set of genetic distances is considered as the continuous multivariate ‘mark’ observed in infected subjects only. This research develops a multivariate mark-specific hazard ratio model in the competing risks failure time analysis framework for the assessment of mark-specific vaccine efficacy. It allows improved efficiency of estimation by employing the semiparametric method of maximum profile likelihood estimation in the vaccine-to-placebo mark density ratio model. The model also enables the use of a more efficient estimation method for the overall log hazard ratio in the Cox model. Additionally, we propose testing procedures to evaluate two relevant hypotheses concerning mark-specific vaccine efficacy. The asymptotic properties and finite-sample performance of the inferential procedures are investigated. Finally, we apply the proposed methods to data collected in the Thai RV144 HIV vaccine efficacy trial. PMID:23421613

  10. The Role of Sister Cities' Staff Exchanges in Developing "Learning Cities": Exploring Necessary and Sufficient Conditions in Social Capital Development Utilizing Proportional Odds Modeling.

    PubMed

    Buckley, Patrick Henry; Takahashi, Akio; Anderson, Amy

    2015-07-01

    In the last half century former international adversaries have become cooperators through networking and knowledge sharing for decision making aimed at improving quality of life and sustainability; nowhere has this been more striking than at the urban level, where such activity is seen as a key component in building "learning cities" through the development of social capital. Although mega-cities have been leaders in such efforts, mid-sized cities with lesser resource endowments have striven to follow by focusing on more frugal sister city type exchanges. The underlying thesis of our research is that great value can be derived from city-to-city exchanges through social capital development. However, such a study must differentiate between necessary and sufficient conditions. Past studies assumed necessary conditions were met and immediately jumped to demonstrating the existence of structural relationships by measuring networking, while further assuming that the existence of such relationships demonstrated a parallel development of cognitive social capital. Our research addresses this lacuna by stepping back and critically examining these assumptions. To accomplish this goal we use a Proportional Odds Modeling with a Cumulative Logit Link approach to demonstrate the existence of a common latent structure, hence asserting that necessary conditions are met. PMID:26114245
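
    A minimal sketch of a proportional-odds (cumulative logit) model of the kind named above, fitted with statsmodels' OrderedModel. The ordinal outcome and the two predictors are simulated stand-ins for the survey items, not the study's data.

```python
# Proportional odds / cumulative logit model on simulated ordinal survey data.
import numpy as np
import pandas as pd
from statsmodels.miscmodels.ordinal_model import OrderedModel

rng = np.random.default_rng(42)
n = 300
df = pd.DataFrame({
    "exchange_visits": rng.poisson(2, n),      # structural social-capital ties
    "shared_projects": rng.poisson(1, n),
})
latent = (0.6 * df["exchange_visits"] + 0.9 * df["shared_projects"]
          + rng.logistic(size=n))
# ordered response: perceived cognitive social capital (low < mid < high)
df["trust"] = pd.cut(latent, bins=[-np.inf, 1.5, 3.5, np.inf],
                     labels=["low", "mid", "high"])

model = OrderedModel(df["trust"], df[["exchange_visits", "shared_projects"]],
                     distr="logit")
result = model.fit(method="bfgs", disp=False)
print(result.summary())
```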

  11. The Role of Sister Cities’ Staff Exchanges in Developing “Learning Cities”: Exploring Necessary and Sufficient Conditions in Social Capital Development Utilizing Proportional Odds Modeling

    PubMed Central

    Buckley, Patrick Henry; Takahashi, Akio; Anderson, Amy

    2015-01-01

    In the last half century former international adversaries have become cooperators through networking and knowledge sharing for decision making aimed at improving quality of life and sustainability; nowhere has this been more striking than at the urban level, where such activity is seen as a key component in building “learning cities” through the development of social capital. Although mega-cities have been leaders in such efforts, mid-sized cities with lesser resource endowments have striven to follow by focusing on more frugal sister city type exchanges. The underlying thesis of our research is that great value can be derived from city-to-city exchanges through social capital development. However, such a study must differentiate between necessary and sufficient conditions. Past studies assumed necessary conditions were met and immediately jumped to demonstrating the existence of structural relationships by measuring networking, while further assuming that the existence of such relationships demonstrated a parallel development of cognitive social capital. Our research addresses this lacuna by stepping back and critically examining these assumptions. To accomplish this goal we use a Proportional Odds Modeling with a Cumulative Logit Link approach to demonstrate the existence of a common latent structure, hence asserting that necessary conditions are met. PMID:26114245

  12. The Role of Sister Cities' Staff Exchanges in Developing "Learning Cities": Exploring Necessary and Sufficient Conditions in Social Capital Development Utilizing Proportional Odds Modeling.

    PubMed

    Buckley, Patrick Henry; Takahashi, Akio; Anderson, Amy

    2015-06-24

    In the last half century former international adversaries have become cooperators through networking and knowledge sharing for decision making aimed at improving quality of life and sustainability; nowhere has this been more striking than at the urban level, where such activity is seen as a key component in building "learning cities" through the development of social capital. Although mega-cities have been leaders in such efforts, mid-sized cities with lesser resource endowments have striven to follow by focusing on more frugal sister city type exchanges. The underlying thesis of our research is that great value can be derived from city-to-city exchanges through social capital development. However, such a study must differentiate between necessary and sufficient conditions. Past studies assumed necessary conditions were met and immediately jumped to demonstrating the existence of structural relationships by measuring networking, while further assuming that the existence of such relationships demonstrated a parallel development of cognitive social capital. Our research addresses this lacuna by stepping back and critically examining these assumptions. To accomplish this goal we use a Proportional Odds Modeling with a Cumulative Logit Link approach to demonstrate the existence of a common latent structure, hence asserting that necessary conditions are met.

  13. Web-based Services for Earth Observing and Model Data in National Applications and Hazards

    NASA Astrophysics Data System (ADS)

    Kafatos, M.; Boybeyi, Z.; Cervone, G.; di, L.; Sun, D.; Yang, C.; Yang, R.

    2005-12-01

    The ever-growing large volumes of Earth system science data, collected by Earth observing platforms, in situ stations and as model output data, are increasingly being used by discipline scientists and by wider classes of users. In particular, applications of Earth system science data to environmental and hazards applications, as well as other national applications, require tailored or specialized data, as well as web-based tools and infrastructure. The latter are driven by applications and usage drivers which include ease of access, visualization of complex data, ease of producing value-added data, GIS and open source analysis usage, metadata, etc. Here we present different aspects of such web-based services and access, and discuss several applications in the hazards and environmental areas, including earthquake signatures and observations and model runs of hurricanes. Examples and lessons learned from the Mid-Atlantic Geospatial Information Consortium will be presented. We discuss a NASA-funded, open source on-line data analysis system that is being applied to climate studies for the ESIP Federation. Since enhanced, this project and the next-generation Metadata Integrated Data Analysis System allow users not only to identify data but also to generate new data products on-the-fly. The functionalities extend from limited predefined functions to sophisticated functions described by general-purpose GrADS (Grid Analysis and Display System) commands. The Federation system also allows third party data products to be combined with local data. Software components are available for converting the output from MIDAS (OPeNDAP) into OGC compatible software. The on-going Grid efforts at CEOSR and LAITS in the School of Computational Sciences (SCS) include enhancing the functions of Globus to provide support for a geospatial system so the system can share the computing power to handle problems with different peak access times and improve the stability and flexibility of a rapid

  14. TRENT2D WG: a smart web infrastructure for debris-flow modelling and hazard assessment

    NASA Astrophysics Data System (ADS)

    Zorzi, Nadia; Rosatti, Giorgio; Zugliani, Daniel; Rizzi, Alessandro; Piffer, Stefano

    2016-04-01

    Mountain regions are naturally exposed to geomorphic flows, which involve large amounts of sediment and induce significant morphological modifications. The physical complexity of this class of phenomena represents a challenging issue for modelling, leading to elaborate theoretical frameworks and sophisticated numerical techniques. In general, geomorphic-flow models have proved to be valid tools in hazard assessment and management. However, model complexity seems to represent one of the main obstacles to the diffusion of advanced modelling tools among practitioners and stakeholders, although the EU Flood Directive (2007/60/EC) requires risk management and assessment to be based on "best practices and best available technologies". Furthermore, several cutting-edge models are not particularly user-friendly and multiple stand-alone software packages are needed to pre- and post-process modelling data. For all these reasons, users often resort to quicker and rougher approaches, possibly leading to unreliable results. Therefore, some effort seems necessary to overcome these drawbacks, with the purpose of supporting and encouraging a widespread diffusion of the most reliable, although sophisticated, modelling tools. With this aim, this work presents TRENT2D WG, a new smart modelling solution for the state-of-the-art model TRENT2D (Armanini et al., 2009; Rosatti and Begnudelli, 2013), which simulates debris flows and hyperconcentrated flows adopting a two-phase description over a mobile bed. TRENT2D WG is a web infrastructure joining the advantages offered by the SaaS (Software as a Service) software-delivery model and by WebGIS technology, and hosting a complete and user-friendly working environment for modelling. In order to develop TRENT2D WG, the model TRENT2D was converted into a service and exposed on a cloud server, transferring computational burdens from the user hardware to a high-performing server and reducing computational time. Then, the system was equipped with an

  15. Testing seismic hazard models with Be-10 exposure ages for precariously balanced rocks

    NASA Astrophysics Data System (ADS)

    Rood, D. H.; Anooshehpoor, R.; Balco, G.; Brune, J.; Brune, R.; Ludwig, L. Grant; Kendrick, K.; Purvance, M.; Saleeby, I.

    2012-04-01

    Currently, the only empirical tool available to test maximum earthquake ground motions spanning timescales of 10 ky-1 My is the use of fragile geologic features, including precariously balanced rocks (PBRs). The ages of PBRs together with their areal distribution and mechanical stability ("fragility") constrain probabilistic seismic hazard analysis (PSHA) over long timescales; pertinent applications include the USGS National Seismic Hazard Maps (NSHM) and tests for ground motion models (e.g., Cybershake). Until recently, age constraints for PBRs were limited to varnish microlamination (VML) dating techniques and sparse cosmogenic nuclide data; however, VML methods yield minimum limiting ages for individual rock surfaces, and the interpretations of cosmogenic nuclide data were ambiguous because they did not account for the exhumation history of the PBRs or the complex shielding of cosmic rays. We have recently published a robust method for the exposure dating of PBRs combining Be-10 profiles, a numerical model, and a three-dimensional model for each PBR constructed using photogrammetry (Balco et al., 2011, Quaternary Geochronology). Here, we use this method to calculate new exposure ages and fragilities for 6 PBRs in southern California (USA) near the San Andreas, San Jacinto, and Elsinore faults at the Lovejoy Buttes, Round Top, Pacifico, Beaumont South, Perris, and Benton Road sites (in addition to the recently published age of 18.7 +/- 2.8 ka for a PBR at the Grass Valley site). We combine our ages and fragilities for each PBR, and use these data to test the USGS 2008 NSHM PGA with 2% in 50 year probability, USGS 2008 PSHA deaggregations, and basic hazard curves from USGS 2002 NSHM data.

  16. Testing seismic hazard models with Be-10 exposure ages for precariously balanced rocks

    NASA Astrophysics Data System (ADS)

    Rood, D. H.; Anooshehpoor, R.; Balco, G.; Biasi, G. P.; Brune, J. N.; Brune, R.; Grant Ludwig, L.; Kendrick, K. J.; Purvance, M.; Saleeby, I.

    2012-12-01

    Currently, the only empirical tool available to test maximum earthquake ground motions spanning timescales of 10 ky-1 My is the use of fragile geologic features, including precariously balanced rocks (PBRs). The ages of PBRs together with their areal distribution and mechanical stability ("fragility") constrain probabilistic seismic hazard analysis (PSHA) over long timescales; pertinent applications include the USGS National Seismic Hazard Maps (NSHM) and tests for ground motion models (e.g., Cybershake). Until recently, age constraints for PBRs were limited to varnish microlamination (VML) dating techniques and sparse cosmogenic nuclide data; however, VML methods yield minimum limiting ages for individual rock surfaces, and the interpretations of cosmogenic nuclide data were ambiguous because they did not account for the exhumation history of the PBRs or the complex shielding of cosmic rays. We have recently published a robust method for the exposure dating of PBRs combining Be-10 profiles, a numerical model, and a three-dimensional shape model for each PBR constructed using photogrammetry (Balco et al., 2011, Quaternary Geochronology). Here, we use our published method to calculate new exposure ages for PBRs at 6 sites in southern California near the San Andreas, San Jacinto, and Elsinore faults, including: Lovejoy Buttes (9 +/- 1 ka), Round Top (35 +/- 1 ka), Pacifico (19 +/- 1 ka, but with a poor fit to data), Beaumont South (17 +/- 2 ka), Perris (24 +/- 2 ka), and Benton Road (40 +/- 1 ka), in addition to the recently published age of 18.5 +/- 2.0 ka for a PBR at the Grass Valley site. We combine our ages and fragilities for each PBR, and use these data to test the USGS 2008 NSHM PGA with 2% in 50 year probability, USGS 2008 PSHA deaggregations, and basic hazard curves from USGS 2002 NSHM data. Precariously balanced rock in southern California

  17. Predicting the Survival Time for Bladder Cancer Using an Additive Hazards Model in Microarray Data

    PubMed Central

    TAPAK, Leili; MAHJUB, Hossein; SADEGHIFAR, Majid; SAIDIJAM, Massoud; POOROLAJAL, Jalal

    2016-01-01

    Background: One substantial part of microarray studies is to predict patients’ survival based on their gene expression profile. Variable selection techniques are powerful tools to handle high dimensionality in the analysis of microarray data. However, these techniques have not been investigated in the competing risks setting. This study aimed to investigate the performance of four sparse variable selection methods in estimating the survival time. Methods: The data included 1381 gene expression measurements and clinical information from 301 patients with bladder cancer operated on in the years 1987 to 2000 in hospitals in Denmark, Sweden, Spain, France, and England. Four methods, the least absolute shrinkage and selection operator, smoothly clipped absolute deviation, the smooth integration of counting and absolute deviation, and the elastic net, were utilized for simultaneous variable selection and estimation under an additive hazards model. The criteria of area under the ROC curve, Brier score and c-index were used to compare the methods. Results: The median follow-up time for all patients was 47 months. The elastic net approach was found to outperform the other methods. The elastic net had the lowest integrated Brier score (0.137±0.07) and the greatest median over-time AUC and C-index (0.803±0.06 and 0.779±0.13, respectively). Five out of 19 genes selected by the elastic net were significant (P<0.05) under an additive hazards model. It was indicated that the expression of RTN4, SON, IGF1R and CDC20 decreases the survival time, while the expression of SMARCAD1 increases it. Conclusion: The elastic net had higher capability than the other methods for the prediction of survival time in patients with bladder cancer in the presence of competing risks, based on the additive hazards model. PMID:27114989
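
    An elastic-net-penalized additive hazards model is not available in common Python libraries, so the sketch below uses an elastic-net-penalized Cox model from lifelines purely as a stand-in to illustrate penalized variable selection on high-dimensional survival data. The expression matrix and survival times are simulated, not the study data.

```python
# Stand-in: elastic-net-penalized Cox regression for gene selection (illustrative).
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(7)
n, p = 200, 30
X = pd.DataFrame(rng.normal(size=(n, p)), columns=[f"gene_{i}" for i in range(p)])
risk = 0.8 * X["gene_0"] - 0.6 * X["gene_1"]           # only two informative genes
T = rng.exponential(np.exp(-risk))                     # simulated survival times
E = rng.random(n) < 0.7                                # ~70% observed events
df = X.assign(duration=T, event=E.astype(int))

cph = CoxPHFitter(penalizer=0.2, l1_ratio=0.5)         # elastic-net penalty
cph.fit(df, duration_col="duration", event_col="event")
selected = cph.params_[cph.params_.abs() > 1e-3]       # crude selection threshold
print("Selected genes:\n", selected)
```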

  18. Visual Manipulatives for Proportional Reasoning.

    ERIC Educational Resources Information Center

    Moore, Joyce L.; Schwartz, Daniel L.

    The use of a visual representation in learning about proportional relations was studied, examining students' understandings of the invariance of a multiplicative relation on both sides of a proportion equation and the invariance of the structural relations that exist in different semantic types of proportion problems. Subjects were 49 high-ability…

  19. Atmospheric Electrical Modeling in Support of the NASA F-106 Storm Hazards Project

    NASA Technical Reports Server (NTRS)

    Helsdon, John H., Jr.

    1988-01-01

    A recently developed storm electrification model (SEM) is used to investigate the operating environment of the F-106 airplane during the NASA Storm Hazards Project. The model is 2-D, time dependent and uses a bulkwater microphysical parameterization scheme. Electric charges and fields are included, and the model is fully coupled dynamically, microphysically and electrically. One flight showed that a high electric field was developed at the aircraft's operating altitude (28 kft) and that a strong electric field would also be found below 20 kft; however, this low-altitude, high-field region was associated with the presence of small hail, posing a hazard to the aircraft. An operational procedure to increase the frequency of low-altitude lightning strikes was suggested. To further the understanding of lightning within the cloud environment, a parameterization of the lightning process was included in the SEM. It accounted for the initiation, propagation, termination, and charge redistribution associated with an intracloud discharge. Finally, a randomized lightning propagation scheme was developed, and the effects of cloud particles on the initiation of lightning investigated.

  20. Development of tsunami hazard maps for the Mentawai Islands, Indonesia, using heterogeneous slip models

    NASA Astrophysics Data System (ADS)

    Griffin, J.; Pranantyo, I. R.; Kongko, W.; Haunan, A.; Horspool, N.; Maemunah, I.; Natawidjaja, D.; Latief, H.; Cummins, P. R.

    2013-12-01

    Heterogeneous distribution of slip during megathrust earthquakes has been shown to significantly affect the spatial distribution of tsunami height in both numerical studies and field observations. This means that tsunami hazard maps generated using uniform slip distributions in their tsunami source models may underestimate tsunami inundation in some locations compared with real events of the same magnitude in the same location. In order to more completely define areas that may be inundated during a tsunami it is important to consider how different possible distributions of slip will impact different parts of the coastline. We generate tsunami inundation maps for the Mentawai Islands, West Sumatra, Indonesia, from a composite suite of possible source models that are consistent with current knowledge of the source region. First, a suite of earthquake source models with randomly distributed slip along the Mentawai Segment of the Sunda Subduction Zone are generated. From this suite we select source models that generate vertical deformation consistent with that observed in coral palaeogeodetic records of previous ruptures of the Mentawai Segment. Tsunami inundation is modelled using high resolution elevation data for selected source models and the results compiled to generate a maximum tsunami inundation zone. This allows us to constrain the slip distribution beneath the Mentawai Islands, where coral palaeogeodetic data is available, while allowing greater variation in the slip distribution away from the islands, in particular near the trench where large slip events can generate large tsunami. This method also allows us to consider high slip events on deeper portions of the megathrust between the Mentawai Islands and the Sumatran Mainland, which give greater tsunami inundation on the eastern part of the Mentawai Islands and the west coast of Sumatra compared with near-trench events. By accounting for uncertainty in slip distribution, the resulting hazard maps give a

  1. "Developing a multi hazard air quality forecasting model for Santiago, Chile"

    NASA Astrophysics Data System (ADS)

    Mena, M. A.; Delgado, R.; Hernandez, R.; Saide, P. E.; Cienfuegos, R.; Pinochet, J. I.; Molina, L. T.; Carmichael, G. R.

    2013-05-01

    Santiago, Chile has reduced annual particulate matter from 69 ug/m3 (in 1989) to 25 ug/m3 (in 2012), mostly by forcing industry, the transport sector, and the residential heating sector to adopt stringent emission standards in order to operate on bad air days. Statistical forecasting has been used to predict bad air days and pollution control measures in Santiago, Chile, for almost two decades. Recently an operational PM2.5 deterministic model has been implemented using WRF-Chem. The model was developed by the University of Iowa and is run at the Chilean Meteorological Office. The model configuration includes high resolution emissions gridding (2 km) and updated population distribution using 2008 data from LANDSCAN. The model is run using a 2-day spinup with a 5-day forecast. This model has allowed a preventive approach to pollution control measures, as episodes are the result of multiple days of bad dispersion. Decreeing air pollution control measures in advance of bad air days resulted in a reduction of 40% in alert days (80 ug/m3 mean 24-h PM2.5) and 66% in "preemergency" days (110 ug/m3 mean 24-h PM2.5) from 2011 to 2012, despite similar meteorological conditions. This model will be deployed under a recently funded Center for Natural Disaster Management, and will include other meteorological hazards such as flooding, high temperature, storm waves, landslides, and UV radiation, among other parameters. This paper will present the results of operational air quality forecasting, and the methodology that will be used to transform WRF-Chem into a multi-hazard forecasting system.
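
    A trivial sketch of the episode-declaration step implied above: classify forecast days against the quoted 24-h PM2.5 thresholds (80 ug/m3 "alert", 110 ug/m3 "preemergency"). The forecast values are invented for illustration.

```python
# Classify forecast days by the 24-h PM2.5 episode thresholds (illustrative values).
forecast_pm25 = {"Mon": 55.0, "Tue": 84.0, "Wed": 123.0, "Thu": 95.0, "Fri": 40.0}

def episode_level(pm25_24h):
    if pm25_24h >= 110.0:
        return "preemergency"
    if pm25_24h >= 80.0:
        return "alert"
    return "no episode"

for day, value in forecast_pm25.items():
    print(day, value, "->", episode_level(value))
```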

  2. A "mental models" approach to the communication of subsurface hydrology and hazards

    NASA Astrophysics Data System (ADS)

    Gibson, Hazel; Stewart, Iain S.; Pahl, Sabine; Stokes, Alison

    2016-05-01

    Communicating information about geological and hydrological hazards relies on appropriately worded communications targeted at the needs of the audience. But what are these needs, and how does the geoscientist discern them? This paper adopts a psychological "mental models" approach to assess the public perception of the geological subsurface, presenting the results of attitudinal studies and surveys in three communities in the south-west of England. The findings reveal important preconceptions and misconceptions regarding the impact of hydrological systems and hazards on the geological subsurface, notably in terms of the persistent conceptualisation of underground rivers and the inferred relations between flooding and human activity. The study demonstrates how such mental models can provide geoscientists with empirical, detailed and generalised data on perceptions surrounding an issue, as well as reveal unexpected outliers in perception that they may not have considered relevant, but which nevertheless may locally influence communication. Using this approach, geoscientists can develop information messages that more directly engage local concerns and create open engagement pathways based on dialogue, which in turn allow both geoscience "experts" and local "non-experts" to come together and understand each other more effectively.

  3. Modelling diameter distributions of two-cohort forest stands with various proportions of dominant species: a two-component mixture model approach.

    PubMed

    Podlaski, Rafał; Roesch, Francis A

    2014-03-01

    In recent years finite-mixture models have been employed to approximate and model empirical diameter at breast height (DBH) distributions. We used two-component mixtures of either the Weibull distribution or the gamma distribution for describing the DBH distributions of mixed-species, two-cohort forest stands, to analyse the relationships between the DBH components, age cohorts and dominant species, and to assess the significance of differences between the mixture distributions and the kernel density estimates. The data consisted of plots from the Świętokrzyski National Park (Central Poland) and areas close to and including the North Carolina section of the Great Smoky Mountains National Park (USA; southern Appalachians). The fit of the mixture Weibull model to empirical DBH distributions had a precision similar to that of the mixture gamma model; a slightly less accurate estimate was obtained with the kernel density estimator. Generally, in the two-cohort, two-storied, multi-species stands in the southern Appalachians, the two-component DBH structure was associated with age cohort and dominant species. The 1st DBH component of the mixture model was associated with the 1st dominant species (sp1), occurring in the young age cohort (e.g., sweetgum, eastern hemlock); and to a lesser degree, the 2nd DBH component was associated with the 2nd dominant species (sp2), occurring in the old age cohort (e.g., loblolly pine, red maple). In the two-cohort, partly multilayered stands in the Świętokrzyski National Park, the DBH structure was usually associated only with age cohorts (the two dominant species often occurred in both the young and old age cohorts). When empirical DBH distributions representing stands of complex structure are approximated using mixture models, the convergence of the estimation process is often significantly dependent on the starting strategies. Depending on the number of DBHs measured, three methods for choosing the initial values are recommended: min.k/max.k, 0.5/1.5/mean
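
    A compact sketch of fitting a two-component Weibull mixture to DBH data by direct maximum likelihood; the study's estimation procedure is summarised here by a single numerical optimisation with deliberately chosen starting values. The DBH sample is simulated.

```python
# Two-component Weibull mixture fitted by numerical maximum likelihood (simulated DBH data).
import numpy as np
from scipy.optimize import minimize
from scipy.stats import weibull_min

dbh = np.concatenate([weibull_min.rvs(2.0, scale=12.0, size=150, random_state=1),
                      weibull_min.rvs(3.5, scale=40.0, size=100, random_state=2)])

def neg_log_lik(theta):
    w = 1.0 / (1.0 + np.exp(-theta[0]))                  # mixing proportion in (0, 1)
    c1, s1, c2, s2 = np.exp(theta[1:])                   # shapes/scales kept positive
    pdf = (w * weibull_min.pdf(dbh, c1, scale=s1)
           + (1.0 - w) * weibull_min.pdf(dbh, c2, scale=s2))
    return -np.sum(np.log(pdf + 1e-300))

# starting values matter for convergence, as noted in the abstract
start = np.array([0.0, np.log(2.0), np.log(10.0), np.log(3.0), np.log(35.0)])
fit = minimize(neg_log_lik, start, method="Nelder-Mead",
               options={"maxiter": 5000, "xatol": 1e-6, "fatol": 1e-6})
w_hat = 1.0 / (1.0 + np.exp(-fit.x[0]))
print("mixing proportion:", round(w_hat, 2), "component params:", np.exp(fit.x[1:]).round(2))
```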

  4. Application of a Data Mining Model and Its Cross Application for Landslide Hazard Analysis: a Case Study in Malaysia

    NASA Astrophysics Data System (ADS)

    Pradhan, Biswajeet; Lee, Saro; Shattri, Mansor

    This paper deals with landslide hazard analysis and cross-application using Geographic Information System (GIS) and remote sensing data for Cameron Highland, Penang Island and Selangor in Malaysia. The aim of this study was to cross-apply and verify a spatial probabilistic model for landslide hazard analysis. Landslide locations were identified in the study area from interpretation of aerial photographs and field surveys. Topographical/geological data and satellite images were collected and processed using GIS and image processing tools. Ten landslide-inducing parameters were considered for the landslide hazard analysis: topographic slope, aspect, curvature and distance from drainage, all derived from the topographic database; geology and distance from lineament, derived from the geologic database; land use from Landsat satellite images; soil from the soil database; precipitation amount, derived from the rainfall database; and the vegetation index value from SPOT satellite images. These factors were analyzed using an artificial neural network model to generate the landslide hazard map. Each factor's weight was determined by the back-propagation training method. The landslide hazard indices were then calculated using the trained back-propagation weights, and finally the landslide hazard map was generated using GIS tools. Landslide hazard maps were drawn for these three areas using the artificial neural network model trained not only on the data for that area but also using the parameter weights calculated from each of the other two areas (nine maps in all), as a cross-check of the validity of the method. For verification, the results of the analyses were compared, in each study area, with actual landslide locations. The verification results showed sufficient agreement between the presumptive hazard map and the existing data on landslide areas.
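
    The abstract names the technique (a back-propagation network trained on ten factor layers, whose output is mapped as a hazard index) without implementation detail. The following minimal Python sketch illustrates the general idea using scikit-learn as a stand-in for the authors' own network; the file names, factor layout and layer size are assumptions, not taken from the paper.

      # Hypothetical sketch: train a small back-propagation network on ten
      # landslide-inducing factors and use its output as a relative hazard index.
      import numpy as np
      from sklearn.neural_network import MLPClassifier

      X = np.load("factors.npy")      # shape (n_cells, 10): slope, aspect, curvature, ...
      y = np.load("landslides.npy")   # shape (n_cells,): 1 = mapped landslide, 0 = stable

      model = MLPClassifier(hidden_layer_sizes=(10,), solver="adam",
                            max_iter=2000, random_state=0)
      model.fit(X, y)

      hazard_index = model.predict_proba(X)[:, 1]   # relative hazard per grid cell
      np.save("hazard_index.npy", hazard_index)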

  5. Quantification of Inter-Tsunami Model Variability for Hazard Assessment Studies

    NASA Astrophysics Data System (ADS)

    Catalan, P. A.; Alcantar, A.; Cortés, P. I.

    2014-12-01

    There is a wide range of numerical models capable of modeling tsunamis, most of which have been properly validated and verified against standard benchmark cases and particular field or laboratory case studies. Consequently, these models are regularly used by scientists and consulting companies as essential tools in estimating the tsunami hazard on coastal communities, often treating the model results in a deterministic way. Most of these models are derived from the same set of equations, typically the Non-Linear Shallow Water Equations, to which ad-hoc terms are added to include physical effects such as friction, the Coriolis force, and others. However, these models are rarely used in unison to address the variability in the results. Therefore, in this contribution, we perform a large number of simulations using a set of numerical models and quantify the variability in the results. In order to reduce the influence of input data on the results, a single tsunami scenario is used over a common bathymetry. Next, we perform model comparisons to assess sensitivity to changes in grid resolution and Manning roughness coefficients. Results are presented both as intra-model comparisons (sensitivity to changes using the same model) and as inter-model comparisons (sensitivity to changing models). For the case tested, it was observed that most models reproduced the arrival and periodicity of the tsunami waves fairly consistently. However, variations in amplitude, characterized by the standard deviation between model runs, could be as large as the mean signal. This level of variability is considered too large for deterministic assessment, reinforcing the idea that uncertainty needs to be included in such studies.
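
    The variability measure described above (standard deviation between model runs compared with the ensemble mean) reduces to a simple calculation. A minimal Python sketch follows; the array name, file and shape are assumptions for illustration only.

      # Hypothetical sketch: characterise inter-model variability of simulated
      # tsunami amplitude as the standard deviation across model runs,
      # compared point by point with the ensemble mean (assumed non-zero).
      import numpy as np

      # runs[i, j] = maximum amplitude from model i at coastal point j (illustrative)
      runs = np.load("model_amplitudes.npy")

      mean = runs.mean(axis=0)
      std = runs.std(axis=0, ddof=1)
      ratio = std / mean                      # ~1 where spread is as large as the signal

      print("points where variability equals or exceeds the mean:",
            int(np.sum(ratio >= 1.0)))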

  6. Socio-economic vulnerability to natural hazards - proposal for an indicator-based model

    NASA Astrophysics Data System (ADS)

    Eidsvig, U.; McLean, A.; Vangelsten, B. V.; Kalsnes, B.; Ciurean, R. L.; Argyroudis, S.; Winter, M.; Corominas, J.; Mavrouli, O. C.; Fotopoulou, S.; Pitilakis, K.; Baills, A.; Malet, J. P.

    2012-04-01

    Vulnerability assessment, with respect to natural hazards, is a complex process that must consider multiple dimensions of vulnerability, including both physical and social factors. Physical vulnerability refers to conditions of physical assets, and may be modeled by the intensity and magnitude of the hazard, the degree of physical protection provided by the natural and built environment, and the physical robustness of the exposed elements. Social vulnerability refers to the underlying factors leading to the inability of people, organizations, and societies to withstand impacts from natural hazards. Social vulnerability models can be used in combination with physical vulnerability models to estimate both direct losses, i.e. losses that occur during and immediately after the impact, and indirect losses, i.e. long-term effects of the event. The direct impact of a landslide typically includes casualties and damage to buildings and infrastructure, while indirect losses may include, for example, business closures or limitations in public services. The direct losses are often assessed using physical vulnerability indicators (e.g. construction material, height of buildings), while indirect losses are mainly assessed using social indicators (e.g. economic resources, demographic conditions). Within the EC-FP7 SafeLand research project, an indicator-based method was proposed to assess relative socio-economic vulnerability to landslides. The indicators represent the underlying factors which influence a community's ability to prepare for, deal with, and recover from the damage associated with landslides. The proposed model includes indicators representing demographic, economic and social characteristics as well as indicators representing the degree of preparedness and recovery capacity. Although the model focuses primarily on the indirect losses, it could easily be extended to include more physical indicators which account for the direct losses. Each indicator is individually
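
    Indicator-based vulnerability models of this kind typically combine normalised indicators through a weighted aggregation. The Python sketch below illustrates that general pattern only; the indicator names, scaling and weights are illustrative assumptions, not the SafeLand values.

      # Hypothetical sketch: combine normalised socio-economic indicators into a
      # relative vulnerability index by weighted averaging (weights illustrative).
      indicators = {            # each indicator already scaled to 0..1, 1 = most vulnerable
          "population_density": 0.7,
          "economic_resources": 0.4,
          "preparedness":       0.6,
          "recovery_capacity":  0.5,
      }
      weights = {
          "population_density": 0.3,
          "economic_resources": 0.3,
          "preparedness":       0.2,
          "recovery_capacity":  0.2,
      }

      vulnerability = sum(weights[k] * indicators[k] for k in indicators)
      print(f"relative socio-economic vulnerability: {vulnerability:.2f}")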

  7. Modeling of the Sedimentary Interbedded Basalt Stratigraphy for the Idaho National Laboratory Probabilistic Seismic Hazard Analysis

    SciTech Connect

    Suzette Payne

    2007-08-01

    This report summarizes how the effects of the sedimentary interbedded basalt stratigraphy were modeled in the probabilistic seismic hazard analysis (PSHA) of the Idaho National Laboratory (INL). Drill holes indicate the bedrock beneath INL facilities is composed of about 1.1 km of alternating layers of basalt rock and loosely consolidated sediments. Alternating layers of hard rock and “soft” loose sediments tend to attenuate seismic energy more than uniform rock does, due to scattering and damping. The INL PSHA incorporated the effects of the sedimentary interbedded basalt stratigraphy by developing site-specific shear (S) wave velocity profiles. The profiles were used in the PSHA to model the near-surface site response by developing site-specific stochastic attenuation relationships.

  8. Modeling of the Sedimentary Interbedded Basalt Stratigraphy for the Idaho National Laboratory Probabilistic Seismic Hazard Analysis

    SciTech Connect

    Suzette Payne

    2006-04-01

    This report summarizes how the effects of the sedimentary interbedded basalt stratigraphy were modeled in the probabilistic seismic hazard analysis (PSHA) of the Idaho National Laboratory (INL). Drill holes indicate the bedrock beneath INL facilities is composed of about 1.1 km of alternating layers of basalt rock and loosely consolidated sediments. Alternating layers of hard rock and “soft” loose sediments tend to attenuate seismic energy more than uniform rock does, due to scattering and damping. The INL PSHA incorporated the effects of the sedimentary interbedded basalt stratigraphy by developing site-specific shear (S) wave velocity profiles. The profiles were used in the PSHA to model the near-surface site response by developing site-specific stochastic attenuation relationships.

  9. A Risk Assessment Model for Water Resources: releases of dangerous and hazardous substances.

    PubMed

    Rebelo, Anabela; Ferra, Isabel; Gonçalves, Isolina; Marques, Albertina M

    2014-07-01

    Many dangerous and hazardous substances are used, transported and handled daily in diverse situations, from domestic use to industrial processing, and during those operations, spills or other anomalous situations may occur that can lead to contaminant releases followed by contamination of surface water or groundwater through direct or indirect pathways. When dealing with this problem, rapid, technically sound decisions are desirable, and the use of complex methods may not be able to deliver information quickly. This work describes a simple conceptual model based on multi-criteria analysis, involving a strategic appraisal for contamination risk assessment, to support local authorities in rapid technical decisions. The model involves a screening for environmental risk sources, focussing on persistent, bioaccumulative and toxic (PBT) substances that may be discharged into water resources. It is a simple tool that can be used to follow up actual accident scenarios in real time and to support daily activities, such as site inspections.

  10. Tsunami hazard assessment in El Salvador, Central America, from seismic sources through flooding numerical models.

    NASA Astrophysics Data System (ADS)

    Álvarez-Gómez, J. A.; Aniel-Quiroga, Í.; Gutiérrez-Gutiérrez, O. Q.; Larreynaga, J.; González, M.; Castro, M.; Gavidia, F.; Aguirre-Ayerbe, I.; González-Riancho, P.; Carreño, E.

    2013-11-01

    El Salvador is the smallest and most densely populated country in Central America; its coast has an approximate length of 320 km, 29 municipalities and more than 700 000 inhabitants. In El Salvador there were 15 recorded tsunamis between 1859 and 2012, 3 of them causing damage and resulting in hundreds of victims. Hazard assessment is commonly based on propagation numerical models for earthquake-generated tsunamis and can be approached through both probabilistic and deterministic methods. A deterministic approximation has been applied in this study as it provides essential information for coastal planning and management. The objective of the research was twofold: on the one hand the characterization of the threat over the entire coast of El Salvador, and on the other the computation of flooding maps for the three main localities of the Salvadorian coast. For the latter we developed high-resolution flooding models. For the former, due to the extension of the coastal area, we computed maximum elevation maps, and from the elevation in the near shore we computed an estimation of the run-up and the flooded area using empirical relations. We have considered local sources located in the Middle America Trench, characterized seismotectonically, and distant sources in the rest of the Pacific Basin, using historical and recent earthquakes and tsunamis. We used a hybrid finite differences-finite volumes numerical model in this work, based on the linear and non-linear shallow water equations, to simulate a total of 24 earthquake-generated tsunami scenarios. Our results show that at the western Salvadorian coast, run-up values higher than 5 m are common, while in the eastern area, approximately from La Libertad to the Gulf of Fonseca, the run-up values are lower. The areas more exposed to flooding are the lowlands in the Lempa River delta and the Barra de Santiago Western Plains. The results of the empirical approximation used for the whole country are similar to the results

  11. Tsunami hazard assessment in El Salvador, Central America, from seismic sources through flooding numerical models

    NASA Astrophysics Data System (ADS)

    Álvarez-Gómez, J. A.; Aniel-Quiroga, Í.; Gutiérrez-Gutiérrez, O. Q.; Larreynaga, J.; González, M.; Castro, M.; Gavidia, F.; Aguirre-Ayerbe, I.; González-Riancho, P.; Carreño, E.

    2013-05-01

    El Salvador is the smallest and most densely populated country in Central America; its coast has an approximate length of 320 km, 29 municipalities and more than 700 000 inhabitants. In El Salvador there have been 15 recorded tsunamis between 1859 and 2012, 3 of them causing damage and hundreds of victims. The hazard assessment is commonly based on propagation numerical models for earthquake-generated tsunamis and can be approached with both probabilistic and deterministic methods. A deterministic approximation has been applied in this study as it provides essential information for coastal planning and management. The objective of the research was twofold: on the one hand the characterization of the threat over the entire coast of El Salvador, and on the other the computation of flooding maps for the three main localities of the Salvadorian coast. For the latter we developed high-resolution flooding models. For the former, due to the extension of the coastal area, we computed maximum elevation maps and, from the elevation in the near-shore, an estimation of the run-up and the flooded area using empirical relations. We have considered local sources located in the Middle America Trench, characterized seismotectonically, and distant sources in the rest of the Pacific basin, using historical and recent earthquakes and tsunamis. We used a hybrid finite differences-finite volumes numerical model in this work, based on the linear and non-linear shallow water equations, to simulate a total of 24 earthquake-generated tsunami scenarios. In the western Salvadorian coast, run-up values higher than 5 m are common, while in the eastern area, approximately from La Libertad to the Gulf of Fonseca, the run-up values are lower. The areas more exposed to flooding are the lowlands in the Lempa River delta and the Barra de Santiago Western Plains. The results of the empirical approximation used for the whole country are similar to the results obtained with the high resolution

  12. A First Comparison of Multiple Probability Hazard Outputs from Three Global Flood Models

    NASA Astrophysics Data System (ADS)

    Trigg, M. A.; Bates, P. D.; Fewtrell, T. J.; Yamazaki, D.; Pappenberger, F.; Winsemius, H.

    2014-12-01

    With research advances in algorithms, remote sensing data sets and computing power, global flood models are now a practical reality. There are a number of different research models currently available or in development, and as these models mature and output becomes available for use, there is great interest in how these different models compare and how useful they may be at different scales. At the kick-off meeting of the Global Flood Partnership (GFP) in March 2014, the need to compare these new global flood models was identified as a research priority, both for developers of the models and for users of the output. The GFP is an informal network of scientists and practitioners from public, private and international organisations providing or using global flood monitoring, modelling and forecasting (http://portal.gdacs.org/Global-Flood-Partnership). On behalf of the GFP, the Willis Research Network is undertaking this comparison research, and the work presented here is the result of the first phase of this comparison for three models: CaMa-Flood, GLOFRIS and ECMWF. The comparison analysis is undertaken for the entire African continent, identified by GFP members as the best location to facilitate data sharing by model teams and where there was the most interest from potential users of the model outputs. Initial analysis results include flooded area for a range of hazard return periods (25, 50, 100, 250, 500, 1000 years), and this is also compared against catchment sizes and climatic zones. Results will be discussed in the context of the different model structures and input data used, while also addressing scale issues and practicalities of use. Finally, plans for the validation of the models against microwave and optical remote sensing data will be outlined.
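
    A like-for-like comparison of this kind boils down to tabulating flooded area per model and return period from each model's hazard grid. The Python sketch below shows that step only; the file naming, cell size and wet/dry threshold are assumptions for illustration, not the study's actual data layout.

      # Hypothetical sketch: flooded area per return period from each global
      # model's hazard raster (cells flagged wet above a nominal depth).
      import numpy as np

      cell_area_km2 = 0.81                      # ~900 m cells; illustrative value
      return_periods = (25, 50, 100, 250, 500, 1000)
      models = ("CaMa-Flood", "GLOFRIS", "ECMWF")

      for model in models:
          for rp in return_periods:
              depth = np.load(f"{model}_rp{rp}.npy")        # flood depth grid (m)
              area = np.count_nonzero(depth > 0.0) * cell_area_km2
              print(model, rp, f"{area:.0f} km2")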

  13. Integrating expert opinion with modelling for quantitative multi-hazard risk assessment in the Eastern Italian Alps

    NASA Astrophysics Data System (ADS)

    Chen, Lixia; van Westen, Cees J.; Hussin, Haydar; Ciurean, Roxana L.; Turkington, Thea; Chavarro-Rincon, Diana; Shrestha, Dhruba P.

    2016-11-01

    Extreme rainfall events are the main triggering causes of hydro-meteorological hazards in mountainous areas, where development is often constrained by the limited space suitable for construction. In these areas, hazard and risk assessments are fundamental for risk mitigation, especially for preventive planning, risk communication and emergency preparedness. Multi-hazard risk assessment in mountainous areas at local and regional scales remains a major challenge because of the lack of data related to past events and causal factors, and because of the interactions between different types of hazards. The lack of data leads to a high level of uncertainty in the application of quantitative methods for hazard and risk assessment. Therefore, a systematic approach is required to combine these quantitative methods with expert-based assumptions and decisions. In this study, a quantitative multi-hazard risk assessment was carried out in the Fella River valley, prone to debris flows and floods, in the north-eastern Italian Alps. The main steps include data collection and development of inventory maps, definition of hazard scenarios, hazard assessment in terms of temporal and spatial probability calculation and intensity modelling, elements-at-risk mapping, estimation of asset values and the number of people, physical vulnerability assessment, the generation of risk curves and annual risk calculation. To compare the risk for each type of hazard, risk curves were generated for debris flows, river floods and flash floods. Uncertainties were expressed as minimum, average and maximum values of temporal and spatial probability, replacement costs of assets, population numbers, and physical vulnerability. These result in minimum, average and maximum risk curves. To validate this approach, a back analysis was conducted using the extreme hydro-meteorological event that occurred in August 2003 in the Fella River valley. The results show a good performance when compared to the historical damage reports.
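
    The "risk curve and annual risk" step mentioned above corresponds to integrating a curve of loss versus annual exceedance probability. A minimal Python sketch follows, using entirely illustrative numbers (not the Fella River values); repeating it with the minimum, average and maximum inputs would bracket the uncertainty as described.

      # Hypothetical sketch: integrate a risk curve (loss vs. annual exceedance
      # probability) with the trapezoidal rule to get average annual loss.
      import numpy as np

      annual_exceedance_prob = np.array([0.04, 0.02, 0.01, 0.004, 0.001])  # per year
      loss_meur = np.array([0.5, 2.0, 6.0, 15.0, 40.0])                    # million EUR

      # Sort by increasing probability and integrate loss over probability.
      p = annual_exceedance_prob[::-1]
      l = loss_meur[::-1]
      aal = np.sum(0.5 * (l[1:] + l[:-1]) * np.diff(p))
      print(f"average annual loss ~ {aal:.2f} MEUR/yr")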

  14. Development of models to inform a national Daily Landslide Hazard Assessment for Great Britain

    NASA Astrophysics Data System (ADS)

    Dijkstra, Tom A.; Reeves, Helen J.; Dashwood, Claire; Pennington, Catherine; Freeborough, Katy; Mackay, Jonathan D.; Uhlemann, Sebastian S.; Chambers, Jonathan E.; Wilkinson, Paul B.

    2015-04-01

    were combined with records of observed landslide events to establish which antecedent effective precipitation (AEP) signatures of different durations could be used as a pragmatic proxy for the occurrence of landslides. It was established that 1-, 7-, and 90-day AEP provided the most significant correlations, and these were used to calculate the probability of at least one landslide occurring. The method was then extended over the period 2006 to 2014 and the results were evaluated against observed occurrences. It is recognised that AEP is a relatively poor proxy for simulating effective stress conditions along potential slip surfaces. However, the temporal pattern of landslide probability compares well to the observed occurrences and provides a potential benefit to assist with the DLHA. Further work is continuing to fine-tune the model for landslide type, better spatial resolution of effective precipitation input and cross-reference to models that capture changes in water balance and conditions along slip surfaces. The latter is facilitated by intensive research at several field laboratories, such as the Hollin Hill site in Yorkshire, England. At this site, a decade of activity has generated a broad range of research and a wealth of data. This paper reports on one example of recent work: the characterisation of near-surface hydrology using infiltration experiments in which hydrological pathways are captured, among others, by electrical resistivity tomography. This research, which has further developed our understanding of soil moisture movement in a heterogeneous landslide complex, has highlighted the importance of establishing detailed ground models to enable determination of landslide potential at high resolution. In turn, the knowledge gained through this research is used to enhance the expertise for the daily landslide hazard assessments at a national scale.
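
    The AEP proxy described above is straightforward to emulate: accumulate effective precipitation over 1-, 7- and 90-day windows and relate the result to observed landslide days with a probability model. The Python sketch below uses a logistic regression as a generic stand-in for the model actually fitted; the column and file names are assumptions.

      # Hypothetical sketch: 1-, 7- and 90-day antecedent effective precipitation
      # (AEP) from a daily record, related to landslide days via logistic regression.
      import pandas as pd
      from sklearn.linear_model import LogisticRegression

      df = pd.read_csv("daily_records.csv", parse_dates=["date"])  # date, eff_precip, landslide
      for window in (1, 7, 90):
          df[f"aep_{window}d"] = df["eff_precip"].rolling(window, min_periods=window).sum()
      df = df.dropna()

      X = df[["aep_1d", "aep_7d", "aep_90d"]]
      y = df["landslide"]                      # 1 = at least one landslide reported that day
      prob_model = LogisticRegression().fit(X, y)
      df["p_landslide"] = prob_model.predict_proba(X)[:, 1]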

  15. Model uncertainties of the 2002 update of California seismic hazard maps

    USGS Publications Warehouse

    Cao, T.; Petersen, M.D.; Frankel, A.D.

    2005-01-01

    In this article we present and explore the source and ground-motion model uncertainty and parametric sensitivity for the 2002 update of the California probabilistic seismic hazard maps. Our approach is to implement a Monte Carlo simulation that allows for independent sampling from fault to fault in each simulation. The source-distance-dependent characteristics of the uncertainty maps of seismic hazard are explained by the fundamental uncertainty patterns from four basic test cases, in which the uncertainties from one-fault and two-fault systems are studied in detail. The California coefficient of variation (COV, ratio of the standard deviation to the mean) map for peak ground acceleration (10% probability of exceedance in 50 years) shows lower values (0.1-0.15) along the San Andreas fault system and other class A faults than along class B faults (0.2-0.3). High COV values (0.4-0.6) are found around the Garlock, Anacapa-Dume, and Palos Verdes faults in southern California and around the Maacama fault and Cascadia subduction zone in northern California.
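
    The COV statistic used in that study is simply the standard deviation over the mean of hazard values obtained from repeated sampling. The Python sketch below illustrates the Monte Carlo pattern at a single site; the `pga_from_sample` function and the slip-rate distribution are placeholders, not the actual 2002 source or ground-motion models.

      # Hypothetical sketch: Monte Carlo estimate of the coefficient of variation
      # (COV = std/mean) of a site's 10%-in-50-yr PGA under an uncertain parameter.
      import numpy as np

      rng = np.random.default_rng(1)

      def pga_from_sample(slip_rate_mm_yr):
          # Placeholder hazard calculation: PGA grows weakly with sampled slip rate.
          return 0.3 * (slip_rate_mm_yr / 20.0) ** 0.3

      slip_rates = rng.lognormal(mean=np.log(20.0), sigma=0.4, size=5000)
      pga_samples = pga_from_sample(slip_rates)

      cov = pga_samples.std(ddof=1) / pga_samples.mean()
      print(f"COV of PGA at this site: {cov:.2f}")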

  16. Uncertainty quantification in satellite-driven modeling to forecast lava flow hazards

    NASA Astrophysics Data System (ADS)

    Ganci, Gaetana; Bilotta, Giuseppe; Cappello, Annalisa; Herault, Alexis; Zago, Vito; Del Negro, Ciro

    2016-04-01

    Over the last decades, satellite-based remote sensing and data processing techniques have proved well suited to complement field observations, providing timely event detection for effusive volcanic events as well as extraction of parameters that allow lava flow tracking. In parallel with this, physics-based models for lava flow simulations have improved enormously and are now capable of fast, accurate simulations, which are increasingly driven by, or validated using, satellite-derived parameters such as lava flow discharge rates. Together, these capabilities represent a prompt strategy with immediate applications to the real-time monitoring and hazard assessment of effusive eruptions, but two important key issues still need to be addressed to improve its effectiveness: (i) the provision of source term parameters and their uncertainties, and (ii) how uncertainties in source terms propagate into the model outputs. We here address these topics by considering uncertainties in satellite-derived products obtained by the HOTSAT thermal monitoring system (e.g. hotspot pixels, radiant heat flux, effusion rate) and evaluating how these uncertainties affect lava flow hazard scenarios by inputting them into the MAGFLOW physics-based model for lava flow simulations. Particular attention is given to the effects of topography and cloud cover on satellite-derived products, as well as to the frequency of their acquisitions (GEO vs LEO). We also investigate how the DEM resolution impacts the final scenarios from both the numerical and physical points of view. To evaluate these effects, three well-documented eruptions that occurred at Mt Etna are taken into account: a short-lived paroxysmal event, i.e. the 11-13 Jan 2011 lava fountain; a long-lasting eruption, i.e. the 2008-2009 eruption; and a short effusive event, i.e. the 14-24 July 2006 eruption.

  17. Lava Flow Hazard Modeling during the 2014-2015 Fogo eruption, Cape Verde

    NASA Astrophysics Data System (ADS)

    Del Negro, C.; Cappello, A.; Ganci, G.; Calvari, S.; Perez, N. M.; Hernandez Perez, P. A.; Victoria, S. S.; Cabral, J.

    2015-12-01

    Satellite remote sensing techniques and lava flow forecasting models have been combined to allow an ensemble response during effusive crises at poorly monitored volcanoes. Here, we use the HOTSAT volcano hot spot detection system, which works with satellite thermal infrared data, and the MAGFLOW lava flow emplacement model, which considers the way in which effusion rate changes during an eruption, to forecast lava flow hazards during the 2014-2015 Fogo eruption. In many ways this was one of the major effusive eruption crises of recent years, since the lava flows actually invaded populated areas. HOTSAT is used to promptly analyze MODIS and SEVIRI data to output hot spot location, lava thermal flux, and effusion rate estimation. We use this output to drive the MAGFLOW simulations of lava flow paths and to continuously update the flow simulations. Satellite-derived TADR estimates can be obtained in real time, and lava flow simulations of several days of eruption can be calculated in a few minutes, making such a combined approach of paramount importance for providing timely forecasts of the areas that a lava flow could possibly inundate. In addition, such forecasting scenarios can be continuously updated in response to changes in the eruptive activity as detected by satellite imagery. We also show how Landsat-8 OLI and EO-1 ALI images complement the field observations for tracking the flow front position through time, and add considerable data on lava flow advancement to validate the results of numerical simulations. Our results thus demonstrate how the combination of satellite remote sensing and lava flow modeling can be effectively used during eruptive crises to produce realistic lava flow hazard scenarios and to assist local authorities in making decisions during a volcanic eruption.

  18. Geodesy- and geology-based slip-rate models for the Western United States (excluding California) national seismic hazard maps

    USGS Publications Warehouse

    Petersen, Mark D.; Zeng, Yuehua; Haller, Kathleen M.; McCaffrey, Robert; Hammond, William C.; Bird, Peter; Moschetti, Morgan; Shen, Zhengkang; Bormann, Jayne; Thatcher, Wayne

    2014-01-01

    The 2014 National Seismic Hazard Maps for the conterminous United States incorporate additional uncertainty in the fault slip-rate parameter that controls earthquake-activity rates, relative to what was applied in previous versions of the hazard maps. This additional uncertainty is accounted for by new geodesy- and geology-based slip-rate models for the Western United States. Models that were considered include an updated geologic model based on expert opinion and four combined inversion models informed by both geologic and geodetic input. The two block models considered indicate significantly higher slip rates than the expert-opinion model and the two fault-based combined inversion models. For the hazard maps, we apply 20 percent weight, with equal weighting, for the two fault-based models. Off-fault geodetic-based models were not considered in this version of the maps. Resulting changes to the hazard maps are generally less than 0.05 g (acceleration of gravity). Future research will improve the maps and interpret differences between the new models.

  19. Weather modeling for hazard and consequence assessment operations during the 2006 Winter Olympic Games

    NASA Astrophysics Data System (ADS)

    Hayes, P.; Trigg, J. L.; Stauffer, D.; Hunter, G.; McQueen, J.

    2006-05-01

    Consequence assessment (CA) operations are those processes that attempt to mitigate negative impacts of incidents involving hazardous materials such as chemical, biological, radiological, nuclear, and high explosive (CBRNE) agents, facilities, weapons, or transportation. Incident types range from accidental spillage of chemicals at/en route to/from a manufacturing plant, to the deliberate use of radiological or chemical material as a weapon in a crowded city. The impacts of these incidents are highly variable, from little or no impact to catastrophic loss of life and property. Local and regional scale atmospheric conditions strongly influence atmospheric transport and dispersion processes in the boundary layer, and the extent and scope of the spread of dangerous materials in the lower levels of the atmosphere. Therefore, CA personnel charged with managing the consequences of CBRNE incidents must have detailed knowledge of current and future weather conditions to accurately model potential effects. A meteorology team was established at the U.S. Defense Threat Reduction Agency (DTRA) to provide weather support to CA personnel operating DTRA's CA tools, such as the Hazard Prediction and Assessment Capability (HPAC) tool. The meteorology team performs three main functions: 1) regular provision of meteorological data for use by personnel using HPAC, 2) determination of the best performing medium-range model forecast for the 12 - 48 hour timeframe and 3) provision of real-time help-desk support to users regarding acquisition and use of weather in HPAC CA applications. The normal meteorology team operations were expanded during a recent modeling project which took place during the 2006 Winter Olympic Games. The meteorology team took advantage of special weather observation datasets available in the domain of the Winter Olympic venues and undertook a project to improve weather modeling at high resolution. The varied and complex terrain provided a special challenge to the

  20. Challenges in understanding, modelling, and mitigating Lake Outburst Flood Hazard: experiences from Central Asia

    NASA Astrophysics Data System (ADS)

    Mergili, Martin; Schneider, Demian; Andres, Norina; Worni, Raphael; Gruber, Fabian; Schneider, Jean F.

    2010-05-01

    Lake Outburst Floods can evolve from complex process chains like avalanches of rock or ice that produce flood waves in a lake which may overtop and eventually breach glacial, morainic, landslide, or artificial dams. Rising lake levels can lead to progressive incision and destabilization of a dam, to enhanced ground water flow (piping), or even to hydrostatic failure of ice dams, which can cause sudden outflow of accumulated water. These events often have a highly destructive potential because a large amount of water is released in a short time, with a high capacity to erode loose debris, leading to a powerful debris flow with a long travel distance. The best-known example of a lake outburst flood is the Vajont event (Northern Italy, 1963), where a landslide rushed into an artificial lake which spilled over and caused a flood leading to almost 2000 fatalities. Hazards from the failure of landslide dams are often (not always) fairly manageable: most breaches occur in the first few days or weeks after the landslide event, and the rapid construction of a spillway - though problematic - has solved some hazardous situations (e.g. in the case of the Hattian landslide in 2005 in Pakistan). Older dams, like Usoi dam (Lake Sarez) in Tajikistan, are usually fairly stable, though landslides into the lakes may create flood waves that overtop and eventually weaken the dams. The analysis and the mitigation of glacial lake outburst flood (GLOF) hazard remains a challenge. A number of GLOFs resulting in fatalities and severe damage have occurred during the previous decades, particularly in the Himalayas and in the mountains of Central Asia (Pamir, Tien Shan). The source area is usually far away from the area of impact and events occur at very long intervals or as singularities, so that the population at risk is usually not prepared. Even though potentially hazardous lakes can be identified relatively easily with remote sensing and field work, modeling and predicting of GLOFs (and also

  1. Financial Distress Prediction Using Discrete-time Hazard Model and Rating Transition Matrix Approach

    NASA Astrophysics Data System (ADS)

    Tsai, Bi-Huei; Chang, Chih-Huei

    2009-08-01

    Previous studies used a constant cut-off indicator to distinguish distressed firms from non-distressed ones in one-stage prediction models. However, the distress cut-off indicator must shift with economic conditions rather than remain fixed over time. This study focuses on Taiwanese listed firms and develops financial distress prediction models based upon a two-stage method. First, this study employs firm-specific financial ratios and market factors to measure the probability of financial distress based on discrete-time hazard models. Second, this paper further focuses on macroeconomic factors and applies a rating transition matrix approach to determine the distress cut-off indicator. The prediction models are developed using the training sample from 1987 to 2004, and their levels of accuracy are compared on the test sample from 2005 to 2007. As for the one-stage prediction model, the model incorporating macroeconomic factors does not perform better than the one without them. This suggests that accuracy is not improved for one-stage models which pool the firm-specific and macroeconomic factors together. With regard to the two-stage models, the negative credit cycle index implies a worse economic status during the test period, so the distress cut-off point is adjusted upward based on this negative credit cycle index. After the two-stage models employ the adjusted cut-off point to discriminate the distressed firms from the non-distressed ones, their misclassification error becomes lower than that of the one-stage models. The two-stage models presented in this paper have incremental usefulness in predicting financial distress.
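
    A discrete-time hazard model of the kind used in the first stage is, in practice, a logistic regression fitted to firm-period observations. The Python sketch below illustrates only that first stage; the column names and file are assumptions, and the second-stage cut-off adjustment from the rating transition matrix is indicated only as a comment.

      # Hypothetical sketch of the first stage: a discrete-time hazard model as a
      # logistic regression on firm-period data, giving each firm-period a
      # distress probability. Column names are illustrative.
      import pandas as pd
      from sklearn.linear_model import LogisticRegression

      panel = pd.read_csv("firm_periods.csv")          # one row per firm and quarter
      X = panel[["roa", "leverage", "excess_return", "size"]]
      y = panel["distress_next_period"]                # 1 if distress occurs next period

      hazard = LogisticRegression(max_iter=1000).fit(X, y)
      panel["p_distress"] = hazard.predict_proba(X)[:, 1]

      # Second stage (not shown): shift the cut-off applied to p_distress according
      # to a credit-cycle index derived from a rating transition matrix.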

  2. Probabilistic forecasts of debris-flow hazard at the regional scale with a combination of models.

    NASA Astrophysics Data System (ADS)

    Malet, Jean-Philippe; Remaître, Alexandre

    2015-04-01

    Debris flows are one of the many active slope-forming processes in the French Alps, where rugged and steep slopes mantled by various slope deposits offer a great potential for triggering hazardous events. A quantitative assessment of debris-flow hazard requires the estimation, in a probabilistic framework, of the spatial probability of occurrence of source areas, the spatial probability of runout areas, the temporal frequency of events, and their intensity. The main objective of this research is to propose a pipeline for the estimation of these quantities at the regional scale using a chain of debris-flow models. The work uses the experimental site of the Barcelonnette Basin (South French Alps), where 26 active torrents have produced more than 150 debris-flow events since 1850, to develop and validate the methodology. First, a susceptibility assessment is performed to identify the debris-flow-prone source areas. The most frequently used approach is the combination of environmental factors with GIS procedures and statistical techniques, integrating, or not, detailed event inventories. Based on a 5 m DEM and its derivatives, and information on slope lithology, engineering soils and land cover, the possible source areas are identified with a statistical logistic regression model. The performance of the statistical model is evaluated against the observed distribution of debris-flow events recorded after 1850 in the study area. The source areas in the three most active torrents (Riou-Bourdoux, Faucon, Sanières) are well identified by the model. Results are less convincing for three other active torrents (Bourget, La Valette and Riou-Chanal); this could be related to the type of debris-flow triggering mechanism, as the model seems to better spot the open-slope debris-flow source areas (e.g. scree slopes) but appears to be less efficient for the identification of landslide-induced debris flows. Second, a susceptibility assessment is performed to estimate the possible runout distance

  3. Numerical modelling for real-time forecasting of marine oil pollution and hazard assessment

    NASA Astrophysics Data System (ADS)

    De Dominicis, Michela; Pinardi, Nadia; Bruciaferri, Diego; Liubartseva, Svitlana

    2015-04-01

    (MEDESS4MS) system, which is an integrated operational multi-model oil spill prediction service that can be used by different users to run simulations of oil spills at sea, even in real time, through a web portal. The MEDESS4MS system gathers different oil spill modelling systems and data from meteorological and ocean forecasting systems, as well as operational information on response equipment, together with environmental and socio-economic sensitivity maps. MEDSLIK-II has also been used to provide an assessment of the hazard stemming from operational oil ship discharges in the Southern Adriatic and Northern Ionian (SANI) Seas. Operational pollution from ships constitutes a movable hazard whose magnitude changes dynamically as a result of a number of external parameters varying in space and time (temperature, wind, sea currents). Simulations of oil releases have been performed with realistic oceanographic currents, and the results show that the oil pollution hazard distribution has an inherent spatial and temporal variability related to the variability of the specific flow field.

  4. Multiple Ways to Solve Proportions

    ERIC Educational Resources Information Center

    Ercole, Leslie K.; Frantz, Marny; Ashline, George

    2011-01-01

    When solving problems involving proportions, students may intuitively draw on strategies that connect to their understanding of fractions, decimals, and percents. These two statements--"Instruction in solving proportions should include methods that have a strong intuitive basis" and "Teachers should begin instruction with more intuitive…

  5. Modeling the combustion behavior of hazardous waste in a rotary kiln incinerator.

    PubMed

    Yang, Yongxiang; Pijnenborg, Marc J A; Reuter, Markus A; Verwoerd, Joep

    2005-01-01

    Hazardous wastes have complex physical forms and chemical compositions and are normally incinerated in rotary kilns for safe disposal and energy recovery. In the rotary kiln, the multi-feed stream and the wide variation of thermal, physical, and chemical properties of the wastes cause the incineration system to be highly heterogeneous, with severe temperature fluctuations and unsteady combustion chemistry. Incomplete combustion is often the consequence, and the process is difficult to control. In this article, modeling of the waste combustion using computational fluid dynamics (CFD) is described. Through CFD simulation, gas flow and mixing, turbulent combustion, and heat transfer inside the incinerator were predicted and visualized. As the first step, the waste in its various forms was modeled as a hydrocarbon-based virtual fuel mixture. The combustion of the simplified waste was then simulated with a seven-gas combustion model within a CFD framework. A comparison was made with a previous global three-gas combustion model, from which no chemical behavior can be derived. The distribution of temperature and chemical species was investigated. The waste combustion model was validated with temperature measurements. Various operating conditions and their influence on the incineration performance were then simulated. Through this research, a better process understanding and potential optimization of the design were attained.

  6. Developing Sustainable Modeling Software and Necessary Data Repository for Volcanic Hazard Analysis -- Some Lessons Learnt

    NASA Astrophysics Data System (ADS)

    Patra, A. K.; Connor, C.; Webley, P.; Jones, M.; Charbonnier, S. J.; Connor, L.; Gallo, S.; Bursik, M. I.; Valentine, G.; Hughes, C. G.; Aghakhani, H.; Renschler, C. S.; Kosar, T.

    2014-12-01

    We report here on an effort to improve the sustainability, robustness and usability of the core modeling and simulation tools housed in the collaboratory VHub.org and used in the study of complex volcanic behavior. In particular, we focus on tools that support large-scale mass flows (TITAN2D), ash deposition/transport and dispersal (Tephra2 and PUFF), and lava flows (Lava2). These tools have become very popular in the community, especially due to the availability of an online usage modality. The redevelopment of the tools to take advantage of new hardware and software advances was a primary thrust for the effort. However, as work started, we reoriented the effort to also take advantage of significant new opportunities for supporting the complex workflows and use of distributed data resources that will enable effective and efficient hazard analysis.

  7. Risk assessment framework of fate and transport models applied to hazardous waste sites

    SciTech Connect

    Hwang, S.T.

    1993-06-01

    Risk assessment is an increasingly important part of the decision-making process in the cleanup of hazardous waste sites. Despite guidelines from regulatory agencies and considerable research efforts to reduce uncertainties in risk assessments, many issues remain unanswered. This paper presents new research results pertaining to fate and transport models, which will be useful in estimating exposure concentrations and will help reduce uncertainties in risk assessment. These developments include approaches for (1) estimating the degree of emissions and concentration levels of volatile pollutants during the use of contaminated water, (2) estimating the dermal absorption of organic chemicals present in the soil matrix, and (3) estimating steady-state, near-field contaminant concentrations in the aquifer within a waste boundary.

  8. Coupling Radar Rainfall Estimation and Hydrological Modelling For Flash-flood Hazard Mitigation

    NASA Astrophysics Data System (ADS)

    Borga, M.; Creutin, J. D.

    Flood risk mitigation is accomplished through managing either or both the hazard and vulnerability. Flood hazard may be reduced through structural measures which alter the frequency of flood levels in the area. The vulnerability of a community to flood loss can be mitigated through changing or regulating land use and through flood warning and effective emergency response. When dealing with flash-flood hazard, it is generally accepted that the most effective way (and in many instances the only affordable in a sustainable perspective) to mitigate the risk is by reducing the vulnerability of the involved communities, in particular by implementing flood warning systems and community self-help programs. However, both the inherent characteristics of the atmospheric and hydrologic processes involved in flash-flooding and the changing societal needs provide a tremendous challenge to traditional flood forecasting and warning concepts. In fact, the targets of these systems are traditionally localised like urbanised sectors or hydraulic structures. Given the small spatial scale that characterises flash floods and the development of dispersed urbanisation, transportation, green tourism and water sports, human lives and property are exposed to flash flood risk in a scattered manner. This must be taken into consideration in flash flood warning strategies and the investigated region should be considered as a whole and every section of the drainage network as a potential target for hydrological warnings. Radar technology offers the potential to provide information describing rain intensities almost continuously in time and space. Recent research results indicate that coupling radar information to distributed hydrologic modelling can provide hydrologic forecasts at all potentially flooded points of a region. Nevertheless, very few flood warning services use radar data more than on a qualitative basis. After a short review of current understanding in this area, two

  9. Development of a Probabilistic Tornado Wind Hazard Model for the Continental United States Volume I: Main Report

    SciTech Connect

    Boissonnade, A; Hossain, Q; Kimball, J

    2000-07-20

    Since the mid-1980s, assessment of the wind and tornado risks at the Department of Energy (DOE) high and moderate hazard facilities has been based on the straight wind/tornado hazard curves given in UCRL-53526 (Coats, 1985). These curves were developed using a methodology that utilized a model, developed by McDonald, for severe winds at sub-tornado wind speeds and a separate model, developed by Fujita, for tornado wind speeds. For DOE sites not covered in UCRL-53526, wind and tornado hazard assessments are based on the criteria outlined in DOE-STD-1023-95 (DOE, 1996), utilizing the methodology in UCRL-53526. Subsequent to the publication of UCRL-53526, in a study sponsored by the Nuclear Regulatory Commission (NRC), the Pacific Northwest Laboratory developed tornado wind hazard curves for the contiguous United States, NUREG/CR-4461 (Ramsdell, 1986). Because of the different modeling assumptions and underlying data used to develop the tornado wind information, the wind speeds at specified exceedance levels, at a given location, based on the methodology in UCRL-53526 are different from those based on the methodology in NUREG/CR-4461. In 1997, Lawrence Livermore National Laboratory (LLNL) was funded by the DOE to review the current methodologies for characterizing tornado wind hazards and to develop a state-of-the-art wind/tornado characterization methodology based on probabilistic hazard assessment techniques and current historical wind data. This report describes the process of developing the methodology and the database of relevant tornado information needed to implement the methodology. It also presents the tornado wind hazard curves obtained from the application of the method to DOE sites throughout the contiguous United States.

  10. A comparative analysis of hazard models for predicting debris flows in Madison County, VA

    USGS Publications Warehouse

    Morrissey, Meghan M.; Wieczorek, Gerald F.; Morgan, Benjamin A.

    2001-01-01

    During the rainstorm of June 27, 1995, roughly 330-750 mm of rain fell within a sixteen-hour period, initiating floods and over 600 debris flows in a small area (130 km²) of Madison County, Virginia. Field studies showed that the majority (70%) of these debris flows initiated with a thickness of 0.5 to 3.0 m in colluvium on slopes from 17° to 41° (Wieczorek et al., 2000). This paper evaluated and compared the approaches of SINMAP, LISA, and Iverson's (2000) transient response model for slope stability analysis by applying each model to the landslide data from Madison County. Of these three stability models, only Iverson's transient response model evaluated stability conditions as a function of time and depth. Iverson's model would be the preferred method of the three to evaluate landslide hazards on a regional scale in areas prone to rain-induced landslides, as it considers both the transient and spatial response of pore pressure in its calculation of slope stability. The stability calculation used in SINMAP and LISA is similar and utilizes probability distribution functions for certain parameters. Unlike SINMAP, which only considers soil cohesion, internal friction angle and rainfall-rate distributions, LISA allows the use of distributed data for all parameters, so it is preferred over SINMAP for evaluating slope stability. Results from all three models suggested similar soil and hydrologic properties for triggering the landslides that occurred during the 1995 storm in Madison County, Virginia. The colluvium probably had cohesion of less than 2 kPa. The root-soil system is above the failure plane, and consequently root strength and tree surcharge had negligible effect on slope stability. The result that the final location of the water table was near the ground surface is supported by the water budget analysis of the rainstorm conducted by Smith et al. (1996).
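
    All three approaches compared above build on infinite-slope limit equilibrium. As a generic illustration only (not the paper's exact formulation), the Python sketch below evaluates the classic infinite-slope factor of safety with values roughly in the range reported for the Madison County colluvium; the soil unit weight and saturation fraction are assumed.

      # Hypothetical sketch: infinite-slope factor of safety of the kind underlying
      # SINMAP-, LISA- and Iverson-type analyses (illustrative parameter values).
      import math

      def factor_of_safety(c_kpa, phi_deg, slope_deg, z_m, m_sat,
                           gamma_soil=19.0, gamma_w=9.81):
          """Classic infinite-slope FS; m_sat is the saturated fraction of soil depth."""
          theta = math.radians(slope_deg)
          phi = math.radians(phi_deg)
          resisting = (c_kpa
                       + (gamma_soil - m_sat * gamma_w) * z_m
                       * math.cos(theta) ** 2 * math.tan(phi))
          driving = gamma_soil * z_m * math.sin(theta) * math.cos(theta)
          return resisting / driving

      # Fully saturated 1.5 m colluvium on a 30 degree slope with c ~ 2 kPa: FS < 1.
      print(factor_of_safety(c_kpa=2.0, phi_deg=30.0, slope_deg=30.0, z_m=1.5, m_sat=1.0))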

  11. Development of algal interspecies correlation estimation models for chemical hazard assessment.

    PubMed

    Brill, Jessica L; Belanger, Scott E; Chaney, Joel G; Dyer, Scott D; Raimondo, Sandy; Barron, Mace G; Pittinger, Charles A

    2016-09-01

    Web-based Interspecies Correlation Estimation (ICE) is an application developed to predict the acute toxicity of a chemical from one species to another taxon. Web-ICE models use the acute toxicity value for a surrogate species to predict effect values for other species, thus potentially filling in data gaps for a variety of environmental assessment purposes. Web-ICE has historically been dominated by aquatic and terrestrial animal prediction models; models for algal species were essentially absent and are addressed in the present study. Public and private sector-held algal toxicity data were compiled and reviewed for quality based on relevant aspects of the individual studies. Interspecies correlations were constructed from the most commonly tested algal genera for a broad spectrum of chemicals. The ICE regressions were developed based on acute 72-h and 96-h endpoint values involving 1647 unique studies on 476 unique chemicals encompassing 40 genera and 70 species of green, blue-green, and diatom algae. Acceptance criteria for algal ICE models were established prior to evaluation of individual models and included a minimum sample size of 3, a statistically significant regression slope, and a slope estimation parameter ≥0.65. A total of 186 ICE models were possible at the genus level, with 21 meeting quality criteria; 264 ICE models were developed at the species level, with 32 meeting quality criteria. Algal ICE models will have broad utility in screening environmental hazard assessments, data gap filling in certain regulatory scenarios, and as supplemental information to derive species sensitivity distributions. Environ Toxicol Chem 2016;35:2368-2378. Published 2016 Wiley Periodicals Inc. on behalf of SETAC. This article is a US government work and, as such, is in the public domain in the United States of America.
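
    The core of an ICE model is a regression of one taxon's toxicity on a surrogate's across shared chemicals, usually on log scales, screened against acceptance criteria like those listed above. The Python sketch below illustrates that idea with invented toxicity values; the acceptance checks mirror the criteria stated in the abstract, but the data and the log-log form are assumptions for illustration.

      # Hypothetical sketch: regress log10 toxicity of a predicted algal taxon on a
      # surrogate taxon, then screen against simple acceptance criteria.
      import numpy as np
      from scipy import stats

      surrogate_ec50 = np.array([0.8, 3.5, 12.0, 55.0, 210.0])   # mg/L, illustrative
      predicted_ec50 = np.array([1.1, 2.9, 18.0, 40.0, 350.0])   # mg/L, illustrative

      res = stats.linregress(np.log10(surrogate_ec50), np.log10(predicted_ec50))

      accepted = (len(surrogate_ec50) >= 3) and (res.pvalue < 0.05) and (res.slope >= 0.65)
      print(f"slope={res.slope:.2f}, p={res.pvalue:.3f}, accepted={accepted}")

      # Predict toxicity for a new chemical from its surrogate value (back-transform).
      new_surrogate = 25.0
      predicted = 10 ** (res.intercept + res.slope * np.log10(new_surrogate))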

  12. Lava flow hazard modeling during the 2014-2015 Fogo eruption, Cape Verde

    NASA Astrophysics Data System (ADS)

    Cappello, Annalisa; Ganci, Gaetana; Calvari, Sonia; Pérez, Nemesio M.; Hernández, Pedro A.; Silva, Sónia V.; Cabral, Jeremias; Del Negro, Ciro

    2016-04-01

    Satellite remote sensing techniques and lava flow forecasting models have been combined to enable a rapid response during effusive crises at poorly monitored volcanoes. Here we used the HOTSAT satellite thermal monitoring system and the MAGFLOW lava flow emplacement model to forecast lava flow hazards during the 2014-2015 Fogo eruption. In many ways this was one of the major effusive eruption crises of recent years, since the lava flows actually invaded populated areas. Combining satellite data and modeling allowed mapping of the probable evolution of lava flow fields while the eruption was ongoing and rapidly gaining as much relevant information as possible. HOTSAT was used to promptly analyze MODIS and SEVIRI data to output hot spot location, lava thermal flux, and effusion rate estimation. This output was used to drive the MAGFLOW simulations of lava flow paths and to continuously update flow simulations. We also show how Landsat 8 OLI and EO-1 ALI images complement the field observations for tracking the flow front position through time and adding considerable data on lava flow advancement to validate the results of numerical simulations. The integration of satellite data and modeling offers great promise in providing a unified and efficient system for global assessment and real-time response to effusive eruptions, including (i) the current state of the effusive activity, (ii) the probable evolution of the lava flow field, and (iii) the potential impact of lava flows.

  13. An investigation on the modelling of kinetics of thermal decomposition of hazardous mercury wastes.

    PubMed

    Busto, Yailen; M G Tack, Filip; Peralta, Luis M; Cabrera, Xiomara; Arteaga-Pérez, Luis E

    2013-09-15

    The kinetics of mercury removal from solid wastes generated by chlor-alkali plants were studied. The reaction-order and model-free methods with an isoconversional approach were used to estimate the kinetic parameters and reaction mechanism that apply to the thermal decomposition of hazardous mercury wastes. As a first approach to the understanding of thermal decomposition for this type of system (poly-disperse and multi-component), a novel scheme of six reactions was proposed to represent the behaviour of mercury compounds in the solid matrix during the treatment. An integration-optimization algorithm was used in the screening of nine mechanistic models to develop kinetic expressions that best describe the process. The kinetic parameters were calculated by fitting each of these models to the experimental data. It was demonstrated that the D₁ diffusion mechanism appeared to govern the process at 250°C and high residence times, whereas at 450°C a combination of the diffusion mechanism (D₁) and the third-order reaction mechanism (F3) fitted the kinetics of the conversions. The developed models can be applied in engineering calculations to dimension the installations and determine the optimal conditions to treat a mercury-containing sludge.

  14. Converting HAZUS capacity curves to seismic hazard-compatible building fragility functions: effect of hysteretic models

    USGS Publications Warehouse

    Ryu, Hyeuk; Luco, Nicolas; Baker, Jack W.; Karaca, Erdem

    2008-01-01

    A methodology was recently proposed for the development of hazard-compatible building fragility models using parameters of capacity curves and damage state thresholds from HAZUS (Karaca and Luco, 2008). In that methodology, HAZUS curvilinear capacity curves were used to define nonlinear dynamic SDOF models that were subjected to nonlinear time history analysis instead of the capacity spectrum method. In this study, we construct a multilinear capacity curve with negative stiffness after an ultimate (capping) point for the nonlinear time history analysis, as an alternative to the curvilinear model provided in HAZUS. As an illustration, we propose parameter values of the multilinear capacity curve for a moderate-code low-rise steel moment-resisting frame building (labeled S1L in HAZUS). To determine the final parameter values, we perform nonlinear time history analyses of SDOF systems with various parameter values and investigate their effects on the resulting fragility functions through sensitivity analysis. The findings improve capacity curves and thereby fragility and/or vulnerability models for generic types of structures.
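
    For readers unfamiliar with the term, a "multilinear capacity curve with negative stiffness after a capping point" is just a piecewise-linear backbone whose strength drops beyond the ultimate point. The Python sketch below defines such a backbone; the control-point values are purely illustrative, not the S1L parameters proposed in the paper.

      # Hypothetical sketch: a multilinear SDOF backbone with negative post-capping
      # stiffness, of the kind substituted for the HAZUS curvilinear capacity curve.
      import numpy as np

      # Control points (spectral displacement m, spectral acceleration g); illustrative.
      backbone_sd = np.array([0.0, 0.02, 0.08, 0.20])
      backbone_sa = np.array([0.0, 0.10, 0.15, 0.05])   # strength drops after capping

      def capacity(sd):
          """Piecewise-linear capacity; residual strength is held beyond the last point."""
          return np.interp(sd, backbone_sd, backbone_sa)

      print(capacity(0.05), capacity(0.30))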

  15. A global vegetation corrected SRTM DEM for use in hazard modelling

    NASA Astrophysics Data System (ADS)

    Bates, P. D.; O'Loughlin, F.; Neal, J. C.; Durand, M. T.; Alsdorf, D. E.; Paiva, R. C. D.

    2015-12-01

    We present the methodology and results from the development of a near-global 'bare-earth' Digital Elevation Model (DEM) derived from the Shuttle Radar Topography Mission (SRTM) data. Digital Elevation Models are the most important input for hazard modelling, as the DEM quality governs the accuracy of the model outputs. While SRTM is currently the best near-globally [60N to 60S] available DEM, it requires adjustments to reduce the vegetation contamination and make it useful for hazard modelling over heavily vegetated areas (e.g. tropical wetlands). Unlike previous methods of accounting for vegetation contamination, which concentrated on correcting relatively small areas and usually applied a static adjustment, we account for vegetation contamination globally and apply a spatially varying correction based on information about canopy height and density. Our new 'bare-earth' SRTM DEM combines multiple remote sensing datasets, including ICESat GLA14 ground elevations, the vegetation continuous field dataset as a proxy for the penetration depth of SRTM, and a global vegetation height map, to remove the vegetation artefacts present in the original SRTM DEM. In creating the final 'bare-earth' SRTM DEM dataset, we produced three different 'bare-earth' SRTM products. The first applies global parameters, while the second and third products apply parameters that are regionalised based on either climatic zones or vegetation types, respectively. We also tested two different canopy density proxies of different spatial resolution. Using ground elevations obtained from the ICESat GLA14 satellite altimeter, we calculate the residual errors for the raw SRTM and the three 'bare-earth' SRTM products and compare performances. The three 'bare-earth' products all show large improvements over the raw SRTM in vegetated areas, with the overall mean bias reduced by between 75 and 92% from 4.94 m to 0.40 m. The overall standard deviation is reduced by between 29 and 33% from 7.12 m to 4.80 m. As
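
    The spatially varying correction amounts to subtracting, at each pixel, some fraction of the canopy height that depends on canopy density. The Python sketch below illustrates that pattern only; the file names and, in particular, the linear penetration relationship are assumptions for illustration, not the parameters fitted in the study.

      # Hypothetical sketch of a spatially varying vegetation correction: subtract a
      # density-dependent fraction of canopy height from the raw SRTM elevations.
      import numpy as np

      srtm = np.load("srtm.npy")                 # raw SRTM elevations (m)
      canopy_height = np.load("height.npy")      # global vegetation height map (m)
      canopy_density = np.load("density.npy")    # vegetation continuous field, 0..1

      # Assumed relation: the radar penetrates less of the canopy where it is denser.
      penetration_fraction = 1.0 - 0.6 * canopy_density
      bare_earth = srtm - canopy_height * (1.0 - penetration_fraction)

      np.save("srtm_bare_earth.npy", bare_earth)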

  16. Modelling Active Faults in Probabilistic Seismic Hazard Analysis (PSHA) with OpenQuake: Definition, Design and Experience

    NASA Astrophysics Data System (ADS)

    Weatherill, Graeme; Garcia, Julio; Poggi, Valerio; Chen, Yen-Shin; Pagani, Marco

    2016-04-01

    The Global Earthquake Model (GEM) has, since its inception in 2009, made many contributions to the practice of seismic hazard modeling in different regions of the globe. The OpenQuake-engine (hereafter referred to simply as OpenQuake), GEM's open-source software for calculation of earthquake hazard and risk, has found application in many countries, spanning a diversity of tectonic environments. GEM itself has produced a database of national and regional seismic hazard models, harmonizing into OpenQuake's own definition the varied seismogenic sources found therein. The characterization of active faults in probabilistic seismic hazard analysis (PSHA) is at the centre of this process, motivating many of the developments in OpenQuake and presenting hazard modellers with the challenge of reconciling seismological, geological and geodetic information for the different regions of the world. Faced with these challenges, and from the experience gained in the process of harmonizing existing models of seismic hazard, four critical issues are addressed. The challenge GEM has faced in the development of software is how to define a representation of an active fault (both in terms of geometry and earthquake behaviour) that is sufficiently flexible to adapt to different tectonic conditions and levels of data completeness. By exploring the different fault typologies supported by OpenQuake we illustrate how seismic hazard calculations can, and do, take into account complexities such as geometrical irregularity of faults in the prediction of ground motion, highlighting some of the potential pitfalls and inconsistencies that can arise. This exploration leads to the second main challenge in active fault modeling, what elements of the fault source model impact most upon the hazard at a site, and when does this matter? Through a series of sensitivity studies we show how different configurations of fault geometry, and the corresponding characterisation of near-fault phenomena (including

  17. Mathematical models and methods of risk assessment in ecologically hazardous industries

    SciTech Connect

    Mikhalevich, V.S.; Knopov, P.S.; Golodnikov, A.N.

    1994-11-01

    Analysis of critical industrial situations leading to accidents or catastrophes has shown that the main factors responsible for accidents include technological inadequacy of ecologically hazardous facilities, equipment design errors, and insufficient preventive maintenance of facilities with an enhanced level of environmental hazard. The scale of the accident after-effects depends essentially on the location of the ecologically hazardous facility, the timely development of preventive measures, and the prompt implementation of these measures in an emergency, in compliance with strict deadlines for decision making.

  18. Spatial Distributed Seismicity Model of Seismic Hazard Mapping in the North-China Region: A Comparison with the GSHAP

    NASA Astrophysics Data System (ADS)

    Zhong, Q.; Shi, B.; Meng, L.

    2010-12-01

    North China is one of the most seismically active regions in mainland China. Moderate to large earthquakes have occurred here throughout history, resulting in huge losses of human life and property. With the probabilistic seismic hazard analysis (PSHA) approach, we investigate the influence of different seismic environments, incorporating both near-surface soil properties and distributed historical and modern seismicity. A simplified seismic source model, derived with consideration of regional active fault distributions, is presented for the North China region. The spatially distributed seismicity model of PSHA is used to calculate the level of ground motion likely to be exceeded in a given time period. Following the circular Gaussian smoothing procedure of Frankel (1995), we propose in the PSHA calculation a fault-rupture-oriented elliptical Gaussian smoothing, with the assumption that earthquakes occur on faults or in the fault zones of past earthquakes, to delineate the potential seismic zones (Lapajine et al., 2003). This is combined with regional active fault strike directions and the seismicity distribution patterns. The Next Generation Attenuation (NGA) model (Boore et al., 2007) is used to generate hazard maps for PGA with 2%, 5%, and 10% probability of being exceeded in 50 years, and the resultant hazard map is compared with the result given by the Global Seismic Hazard Assessment Project (GSHAP). There is general agreement in PGA distribution patterns between the results of this study and the GSHAP map, which used the same seismic source zones. However, peak ground accelerations predicted in this study are typically 10-20% less than those of the GSHAP, and the seismic source models, such as the fault distributions and regional seismicity used in the GSHAP, seem to be oversimplified. We believe this study represents an improvement on prior seismic hazard evaluations for the region. In addition to the updated input data, we believe that, by
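
    The sketch below illustrates the general idea of smoothing gridded earthquake counts with an elliptical Gaussian kernel aligned with an assumed fault strike (Frankel-style smoothing with an anisotropic kernel). It is a schematic illustration only; the grid, strike and bandwidths are hypothetical and this is not the implementation used in the study.

        import numpy as np

        def elliptical_gaussian_smooth(counts, xg, yg, strike_deg, sigma_par, sigma_perp):
            # counts : gridded earthquake counts; xg, yg : cell-centre coordinates (km)
            theta = np.deg2rad(strike_deg)
            rates = np.zeros_like(counts, dtype=float)
            nx, ny = counts.shape
            for i in range(nx):
                for j in range(ny):
                    dx = xg - xg[i, j]
                    dy = yg - yg[i, j]
                    u = dx * np.cos(theta) + dy * np.sin(theta)    # along-strike offset
                    v = -dx * np.sin(theta) + dy * np.cos(theta)   # strike-normal offset
                    w = np.exp(-0.5 * ((u / sigma_par) ** 2 + (v / sigma_perp) ** 2))
                    rates[i, j] = np.sum(counts * w) / np.sum(w)
            return rates

        # Hypothetical 20 x 20 grid with a single cluster of five epicentres
        xg, yg = np.meshgrid(np.arange(20) * 10.0, np.arange(20) * 10.0, indexing="ij")
        counts = np.zeros((20, 20))
        counts[10, 10] = 5
        rates = elliptical_gaussian_smooth(counts, xg, yg, strike_deg=45.0,
                                           sigma_par=40.0, sigma_perp=15.0)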

  19. AschFlow - A dynamic landslide run-out model for medium scale hazard analysis.

    NASA Astrophysics Data System (ADS)

    Luna, Byron Quan; Blahut, Jan; van Asch, Theo; van Westen, Cees; Kappes, Melanie

    2015-04-01

    Landslide and debris flow hazard assessments require a scale-dependent analysis in order to mitigate damage and other negative consequences at the respective scales of occurrence. Medium or large scale landslide run-out modelling for many possible landslide initiation areas has been a cumbersome task in the past. This arises from the difficulty of precisely defining the location and volume of the released mass and from the inability of run-out models to compute the displacement for a large number of individual initiation areas (computationally exhaustive). Most of the existing physically based run-out models have complications in handling such situations, and therefore empirical methods have been used as a practical means to predict landslide mobility at a medium scale (1:10,000 to 1:50,000). In this context, a simple medium scale numerical model for rapid mass movements in urban and mountainous areas was developed. The deterministic nature of the approach makes it possible to calculate the velocity, height and increase in mass by erosion, resulting in the estimation of various forms of impacts exerted by debris flows at the medium scale. The established and implemented model ("AschFlow") is a 2-D one-phase continuum model that simulates the entrainment, spreading and deposition process of a landslide or debris flow at a medium scale. The flow is thus treated as a single-phase material, whose behavior is controlled by rheology (e.g. Voellmy or Bingham). The developed regional model "AschFlow" was applied and evaluated in well documented areas with known past debris flow events.
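
    For readers unfamiliar with the rheologies mentioned above, the sketch below shows the usual form of the Voellmy basal resistance per unit flow mass: a Coulomb friction term plus a velocity-dependent turbulent term. The parameter values are hypothetical and this is not the AschFlow implementation.

        import numpy as np

        def voellmy_deceleration(u, h, slope_deg, mu=0.1, xi=500.0, g=9.81):
            # Basal resistance per unit mass (m/s^2): Coulomb friction + turbulent drag.
            # mu (dry friction) and xi (turbulence coefficient, m/s^2) are hypothetical.
            theta = np.deg2rad(slope_deg)
            h = max(h, 1e-3)                 # guard against (near-)dry cells
            return mu * g * np.cos(theta) + g * u ** 2 / (xi * h)

        # Hypothetical cell: 3 m deep flow moving at 8 m/s on a 20 degree slope
        print(voellmy_deceleration(u=8.0, h=3.0, slope_deg=20.0))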

  20. Simulation of the 1992 Tessina landslide by a cellular automata model and future hazard scenarios

    NASA Astrophysics Data System (ADS)

    Avolio, MV; Di Gregorio, Salvatore; Mantovani, Franco; Pasuto, Alessandro; Rongo, Rocco; Silvano, Sandro; Spataro, William

    Cellular Automata are a powerful tool for modelling natural and artificial systems, which can be described in terms of local interactions of their constituent parts. Some types of landslides, such as debris/mud flows, match these requirements. The 1992 Tessina landslide has characteristics (slow mud flows) which make it appropriate for modelling by means of Cellular Automata, except for the initial phase of detachment, which is caused by a rotational movement that has no effect on the mud flow path. This paper presents the Cellular Automata approach for modelling slow mud/debris flows, the results of simulation of the 1992 Tessina landslide and future hazard scenarios based on the volumes of masses that could be mobilised in the future. They were obtained by adapting the Cellular Automata Model called SCIDDICA, which has been validated for very fast landslides. SCIDDICA was applied by adapting the general model to the peculiarities of the Tessina landslide. The simulations obtained by this initial model were satisfactory for forecasting the surface covered by mud. Calibration of the model, which was obtained from simulation of the 1992 event, was used for forecasting flow expansion during possible future reactivation. For this purpose two simulations concerning the collapse of about 1 million m3 of material were tested. In one of these, the presence of a containment wall built in 1992 for the protection of the Tarcogna hamlet was inserted. The results obtained identified the conditions of high risk affecting the villages of Funes and Lamosano and show that this Cellular Automata approach can have a wide range of applications for different types of mud/debris flows.

  1. Hazardous Waste

    MedlinePlus

    ... you throw these substances away, they become hazardous waste. Some hazardous wastes come from products in our homes. Our garbage can include such hazardous wastes as old batteries, bug spray cans and paint ...

  2. Hazardous material life-cycle cost model. System user's guide. Version 1. 2. Final report, October 1992-September 1993

    SciTech Connect

    LaFleur, B.J.; Jaeger, J.A.; Hermansen, L.A.

    1994-09-01

    The Hazardous Material Life-Cycle Cost Model (HMLCCM) was developed to estimate the total life-cycle costs of using various hazardous materials in the construction, maintenance, and repair of U.S. naval systems and facilities. The model estimates those costs derived from the need to protect the health and safety of workplace personnel and the need to protect the environment. The purpose of this guide is to provide users with a detailed description of the system as well as the basic structure and features of the HMLCCM and instructions on how to use the system. This report is an updated version of the original user's guide. Changes include added functionality, including permissible exposure levels (a portion of the OSHA Z-Table) and on-line access to Material Safety Data Sheets (MSDS) based on the current Hazardous Information System (HMIS).

  3. Quantitative hazard assessment at Vulcano (Aeolian islands): integration of geology, event statistics and physical modelling

    NASA Astrophysics Data System (ADS)

    Dellino, Pierfrancesco; de Astis, Gianfilippo; La Volpe, Luigi; Mele, Daniela; Sulpizio, Roberto

    2010-05-01

    The analysis of stratigraphy and of pyroclastic deposit particle features allowed the reconstruction of the volcanic history of La Fossa di Vulcano. An eruptive scenario driven by superficial phreatomagmatic explosions emerged. A statistical analysis of the pyroclastic successions led to the definition of a repetitive sequence of dilute pyroclastic density currents as the most probable events at short term, followed by fallout of dense ballistic blocks. The scale of such events is related to the amount of magma involved in each explosion. Events involving a million cubic meters of magma are probable in view of what happened in the most recent eruptions. They led to the formation of dilute pyroclastic density currents hundreds of meters thick, moving down the volcano slope at velocities exceeding 50 m/sec. The dispersion of density currents affected the whole Vulcano Porto area, the Vulcanello area and also overrode the Fossa Caldera's rim, spreading over the Piano area. Similarly, older pyroclastic deposits erupted at different times (Piano Grotte dei Rossi formation, ~20-7.7 ka) from vents within La Fossa Caldera and before La Fossa Cone formation. They also were phreatomagmatic in origin and fed dilute pyroclastic density currents (PDC). They represent the eruptions with the highest magnitude on the Island. Therefore, for the aim of hazard assessment, these deposits from La Fossa Cone and La Fossa Caldera were used to depict eruptive scenarios at short term and at long term. On the basis of physical models that make use of pyroclastic deposit particle features, the impact parameters for each scenario have been calculated. They are the dynamic pressure and particle volumetric concentration of density currents, and the impact energy of ballistic blocks. On this basis, a quantitative hazard map is presented, which could be of direct use for territory planning and for the calculation of the expected damage.

  4. Transient deterministic shallow landslide modeling: Requirements for susceptibility and hazard assessments in a GIS framework

    USGS Publications Warehouse

    Godt, J.W.; Baum, R.L.; Savage, W.Z.; Salciarini, D.; Schulz, W.H.; Harp, E.L.

    2008-01-01

    Application of transient deterministic shallow landslide models over broad regions for hazard and susceptibility assessments requires information on rainfall, topography and the distribution and properties of hillside materials. We survey techniques for generating the spatial and temporal input data for such models and present an example using a transient deterministic model that combines an analytic solution to assess the pore-pressure response to rainfall infiltration with an infinite-slope stability calculation. Pore-pressures and factors of safety are computed on a cell-by-cell basis and can be displayed or manipulated in a grid-based GIS. Input data are high-resolution (1.8 m) topographic information derived from LiDAR data and simple descriptions of initial pore-pressure distribution and boundary conditions for a study area north of Seattle, Washington. Rainfall information is taken from a previously defined empirical rainfall intensity-duration threshold and material strength and hydraulic properties were measured both in the field and laboratory. Results are tested by comparison with a shallow landslide inventory. Comparison of results with those from static infinite-slope stability analyses assuming fixed water-table heights shows that the spatial prediction of shallow landslide susceptibility is improved using the transient analyses; moreover, results can be depicted in terms of the rainfall intensity and duration known to trigger shallow landslides in the study area.
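
    A minimal sketch of an infinite-slope factor-of-safety calculation with a pressure-head term, of the kind combined cell-by-cell with an analytic infiltration solution in models like the one described above; all parameter values are hypothetical, and in practice psi would come from the transient pore-pressure response to rainfall.

        import numpy as np

        def infinite_slope_fs(slope_deg, z, psi, c=4000.0, phi_deg=33.0,
                              gamma_s=19000.0, gamma_w=9810.0):
            # slope_deg : slope angle (deg);  z : depth of the failure surface (m)
            # psi       : pressure head at depth z (m)
            # c, phi_deg, gamma_s, gamma_w : cohesion (Pa), friction angle (deg),
            #                                soil and water unit weights (N/m^3) -- hypothetical
            a = np.deg2rad(slope_deg)
            phi = np.deg2rad(phi_deg)
            return (np.tan(phi) / np.tan(a)
                    + (c - psi * gamma_w * np.tan(phi)) / (gamma_s * z * np.sin(a) * np.cos(a)))

        # Hypothetical cell: 35 degree slope, failure surface at 1.5 m, 0.5 m pressure head
        print(infinite_slope_fs(slope_deg=35.0, z=1.5, psi=0.5))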

  5. Non-Volcanic release of CO2 in Italy: quantification, conceptual models and gas hazard

    NASA Astrophysics Data System (ADS)

    Chiodini, G.; Cardellini, C.; Caliro, S.; Avino, R.

    2011-12-01

    Central and South Italy are characterized by the presence of many reservoirs naturally recharged by CO2 of deep provenance. In the western sector, the reservoirs feed hundreds of gas emissions at the surface. Many studies in recent years were devoted to (i) elaborating a map of CO2 Earth degassing of the region; (ii) assessing the gas hazard; (iii) developing methods suitable for the measurement of the gas fluxes from different types of emissions; (iv) elaborating the conceptual model of Earth degassing and its relation with the seismic activity of the region and (v) developing physical numerical models of CO2 air dispersion. The main results obtained are: 1) A general, regional map of CO2 Earth degassing in Central Italy has been elaborated. The total flux of CO2 in the area has been estimated at ~10 Mt/a, which is released to the atmosphere through numerous dangerous gas emissions or by degassing spring waters (~10% of the CO2 globally estimated to be released by the Earth through volcanic activity). 2) An on-line, open-access, georeferenced database of the main CO2 emissions (~250) was set up (http://googas.ov.ingv.it). CO2 fluxes > 100 t/d characterise 14% of the degassing sites, while CO2 fluxes from 100 t/d to 10 t/d have been estimated for about 35% of the gas emissions. 3) The sites of the gas emissions are not suitable for life: the gas causes many accidents to animals and people. In order to mitigate the gas hazard, a specific model of CO2 air dispersion has been developed and applied to the main degassing sites. A relevant application regarded Mefite d'Ansanto, southern Apennines, which is the largest natural emission of low-temperature CO2-rich gases from a non-volcanic environment ever measured on Earth (~2000 t/d). Under low wind conditions, the gas flows along a narrow natural channel producing a persistent gas river which has killed many people and animals over the years. The application of the physical numerical model allowed us to

  6. Comparative hazard analysis and toxicological modeling of diverse nanomaterials using the embryonic zebrafish (EZ) metric of toxicity

    DOE PAGES

    Harper, Bryan; Thomas, Dennis G.; Chikkagoudar, Satish; Baker, Nathan A.; Tang, Kaizhi; Heredia-Langner, Alejandro; Lins, Roberto D.; Harper, Stacey

    2015-06-04

    The integration of rapid assays, large data sets, informatics and modeling can overcome current barriers in understanding nanomaterial structure-toxicity relationships by providing a weight-of-the-evidence mechanism to generate hazard rankings for nanomaterials. Here we present the use of a rapid, low-cost assay to perform screening-level toxicity evaluations of nanomaterials in vivo. Calculated EZ Metric scores, a combined measure of morbidity and mortality, were established at realistic exposure levels and used to develop a predictive model of nanomaterial toxicity. Hazard ranking and clustering analysis of 68 diverse nanomaterials revealed distinct patterns of toxicity related to both core composition and outermost surface chemistry of nanomaterials. The resulting clusters guided the development of a predictive model of gold nanoparticle toxicity to embryonic zebrafish. In addition, our findings suggest that risk assessments based on the size and core composition of nanomaterials alone may be wholly inappropriate, especially when considering complex engineered nanomaterials. These findings reveal the need to expeditiously increase the availability of quantitative measures of nanomaterial hazard and broaden the sharing of that data and knowledge to support predictive modeling. In addition, research should continue to focus on methodologies for developing predictive models of nanomaterial hazard based on sub-lethal responses to low dose exposures.

  7. Comparison of empirical and data driven hydrometeorological hazard models on coastal cities of São Paulo, Brazil

    NASA Astrophysics Data System (ADS)

    Koga-Vicente, A.; Friedel, M. J.

    2010-12-01

    Every year thousands of people are affected by floods and landslide hazards caused by rainstorms. The problem is more serious in tropical developing countries because of high susceptibility, resulting from the large amount of energy available to form storms, and high vulnerability due to poor economic and social conditions. Predictive models of hazards are important tools to manage this kind of risk. In this study, a comparison of two different modeling approaches was made for predicting hydrometeorological hazards in 12 cities on the coast of São Paulo, Brazil, from 1994 to 2003. In the first approach, an empirical multiple linear regression (MLR) model was developed and used; the second approach used a type of unsupervised nonlinear artificial neural network called a self-organizing map (SOM). By using twenty-three independent variables of susceptibility (precipitation, soil type, slope, elevation, and regional atmospheric system scale) and vulnerability (distribution and total population, income and educational characteristics, poverty intensity, human development index), binary hazard responses were obtained. Model performance by cross-validation indicated that the respective MLR and SOM model accuracy was about 67% and 80%. Prediction accuracy can be improved by the addition of information, but the SOM approach is preferred because of sparse data and highly nonlinear relations among the independent variables.

  8. Comparative hazard analysis and toxicological modeling of diverse nanomaterials using the embryonic zebrafish (EZ) metric of toxicity

    SciTech Connect

    Harper, Bryan; Thomas, Dennis G.; Chikkagoudar, Satish; Baker, Nathan A.; Tang, Kaizhi; Heredia-Langner, Alejandro; Lins, Roberto D.; Harper, Stacey

    2015-06-04

    The integration of rapid assays, large data sets, informatics and modeling can overcome current barriers in understanding nanomaterial structure-toxicity relationships by providing a weight-of-the-evidence mechanism to generate hazard rankings for nanomaterials. Here we present the use of a rapid, low-cost assay to perform screening-level toxicity evaluations of nanomaterials in vivo. Calculated EZ Metric scores, a combined measure of morbidity and mortality, were established at realistic exposure levels and used to develop a predictive model of nanomaterial toxicity. Hazard ranking and clustering analysis of 68 diverse nanomaterials revealed distinct patterns of toxicity related to both core composition and outermost surface chemistry of nanomaterials. The resulting clusters guided the development of a predictive model of gold nanoparticle toxicity to embryonic zebrafish. In addition, our findings suggest that risk assessments based on the size and core composition of nanomaterials alone may be wholly inappropriate, especially when considering complex engineered nanomaterials. These findings reveal the need to expeditiously increase the availability of quantitative measures of nanomaterial hazard and broaden the sharing of that data and knowledge to support predictive modeling. In addition, research should continue to focus on methodologies for developing predictive models of nanomaterial hazard based on sub-lethal responses to low dose exposures.

  9. Combining SLBL routine with landslide-generated tsunami model for a quick hazard assessment tool

    NASA Astrophysics Data System (ADS)

    Franz, Martin; Rudaz, Benjamin; Jaboyedoff, Michel; Podladchikov, Yury

    2016-04-01

    Regions with steep topography are potentially subject to landslide-induced tsunami, because of the proximity between lakes, rivers, sea shores and potential instabilities. The concentration of the population and infrastructures on the water body shores and downstream valleys could lead to catastrophic consequences. In order to comprehensively assess this phenomenon together with the induced risks, we have developed a tool which allows the construction of the landslide geometry, and which is able to simulate its propagation, the generation and the propagation of the wave and eventually the spread on the shores or the associated downstream flow. The tool is developed in the Matlab© environment, with a graphical user interface (GUI) to select the parameters in a user-friendly manner. The whole process is done in three steps involving different methods. Firstly, the geometry of the sliding mass is constructed using the Sloping Local Base Level (SLBL) concept. Secondly, the propagation of this volume is performed using a model based on viscous flow equations. Finally, the wave generation and its propagation are simulated using the shallow water equations stabilized by the Lax-Friedrichs scheme. The transition between wet and dry bed is performed by the combination of the two latter sets of equations. The intensity map is based on the criterion of flooding in Switzerland provided by the OFEG and results from the multiplication of the velocity and the depth obtained during the simulation. The tool can be used for hazard assessment in the case of well-known landslides, where the SLBL routine can be constrained and checked for realistic construction of the geometrical model. In less-known cases, various failure plane geometries can be automatically built within a given range, and thus a multi-scenario approach is used. In any case, less-known parameters such as the landslide velocity, its run-out distance, etc. can also be set to vary within given ranges, leading to multi
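
    The sketch below shows a single Lax-Friedrichs update for the 1-D shallow water equations, the scheme named above for stabilising the wave-propagation step; it is a schematic one-dimensional illustration, not the tool's Matlab implementation, and the boundary treatment and initial state are assumptions.

        import numpy as np

        def lax_friedrichs_step(h, hu, dx, dt, g=9.81):
            # One time step for the 1-D shallow water equations (h: depth, hu: momentum).
            u = np.where(h > 1e-6, hu / np.maximum(h, 1e-6), 0.0)   # velocity on wet cells
            f1 = hu                                # mass flux
            f2 = hu * u + 0.5 * g * h ** 2         # momentum flux
            h_new = np.empty_like(h)
            hu_new = np.empty_like(hu)
            h_new[1:-1] = 0.5 * (h[2:] + h[:-2]) - dt / (2 * dx) * (f1[2:] - f1[:-2])
            hu_new[1:-1] = 0.5 * (hu[2:] + hu[:-2]) - dt / (2 * dx) * (f2[2:] - f2[:-2])
            h_new[[0, -1]] = h_new[[1, -2]]        # simple transmissive boundaries (assumed)
            hu_new[[0, -1]] = hu_new[[1, -2]]
            return np.maximum(h_new, 0.0), hu_new

        # Hypothetical dam-break-like initial state on a 1 km reach
        x = np.linspace(0.0, 1000.0, 201)
        h = np.where(x < 500.0, 2.0, 0.5)
        hu = np.zeros_like(h)
        h, hu = lax_friedrichs_step(h, hu, dx=5.0, dt=0.5)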

  10. Long-Term Slip History Discriminates Among Occurrence Models for Seismic Hazard Assessment

    NASA Astrophysics Data System (ADS)

    Fitzenz, D. D.; Ferry, M. A.; Jalobeanu, A.

    2010-12-01

    Today, the probabilistic seismic hazard assessment (PSHA) community relies on one or a combination of stochastic models to compute occurrence probabilities for large earthquakes. Considerable efforts have been devoted to extracting the maximum information from long catalogues of large earthquakes (CLE) based on instrumental, historical, archeological and paleoseismological data (Biasi et al., 2009; Parsons, 2008; Rhoades and Van Dissen, 2003). However, the models remain insufficiently constrained by these rare single-event data. Therefore, the selection of the models and their respective weights is necessarily left to the judgment of a panel of experts (WGCEP, 2003). Since cumulative slip data with high temporal and spatial resolution are now available, we propose here a new approach to incorporate these pieces of evidence of mid- to long-term fault behavior into the next generation of PSHA: the Cumulative Offset-Based Bayesian Recurrence Analysis (COBBRA). Applied to the Jordan Valley segment of the Dead Sea Fault, the method yields the best combination of occurrence models for full-segment ruptures given the available single-event and cumulative data. Not only does our method provide data-driven, objective weights for the competing models, but it also allows us to rule out time-independence and to compute the cumulative probability of occurrence for the next full-segment event reflecting all available data. References: Biasi, G. P. & Weldon, R. J., II. Bull. Seism. Soc. Am. 99, 471-498, doi:10.1785/0120080287 (2009). Parsons, T. J. Geophys. Res., 113, doi:10.1029/2007JB004,998.216 (2008). Rhoades, D. A., and R. J. V. Dissen, New Zealand Journal of Geology & Geophysics, 46, 479-488 (2003). Working Group On California Earthquake Probabilities. Earthquake Probabilities in the San Francisco Bay Region: 2002-2031. (2003).
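
    A minimal sketch of how data-driven weights for competing occurrence models can be obtained from their (log) marginal likelihoods, in the spirit of the objective model weighting described above; the numerical values are hypothetical and this is not the COBBRA implementation.

        import numpy as np

        def model_weights(log_marginal_likelihoods, prior=None):
            # Posterior weights of competing occurrence models given log marginal likelihoods.
            logml = np.asarray(log_marginal_likelihoods, dtype=float)
            if prior is None:
                prior = np.full(logml.shape, 1.0 / logml.size)   # equal prior weights
            logpost = np.log(prior) + logml
            logpost -= logpost.max()                             # stabilise the exponentials
            w = np.exp(logpost)
            return w / w.sum()

        # Hypothetical log marginal likelihoods for three competing occurrence models
        print(model_weights([-42.3, -40.1, -41.0]))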

  11. Comparison of smoothing methods for the development of a smoothed seismicity model for Alaska and the implications for seismic hazard

    NASA Astrophysics Data System (ADS)

    Moschetti, M. P.; Mueller, C. S.; Boyd, O. S.; Petersen, M. D.

    2013-12-01

    In anticipation of the update of the Alaska seismic hazard maps (ASHMs) by the U. S. Geological Survey, we report progress on the comparison of smoothed seismicity models developed using fixed and adaptive smoothing algorithms, and investigate the sensitivity of seismic hazard to the models. While fault-based sources, such as those for great earthquakes in the Alaska-Aleutian subduction zone and for the ~10 shallow crustal faults within Alaska, dominate the seismic hazard estimates for locations near to the sources, smoothed seismicity rates make important contributions to seismic hazard away from fault-based sources and where knowledge of recurrence and magnitude is not sufficient for use in hazard studies. Recent developments in adaptive smoothing methods and statistical tests for evaluating and comparing rate models prompt us to investigate the appropriateness of adaptive smoothing for the ASHMs. We develop smoothed seismicity models for Alaska using fixed and adaptive smoothing methods and compare the resulting models by calculating and evaluating the joint likelihood test. We use the earthquake catalog, and associated completeness levels, developed for the 2007 ASHM to produce fixed-bandwidth-smoothed models with smoothing distances varying from 10 to 100 km and adaptively smoothed models. Adaptive smoothing follows the method of Helmstetter et al. and defines a unique smoothing distance for each earthquake epicenter from the distance to the nth nearest neighbor. The consequence of the adaptive smoothing methods is to reduce smoothing distances, causing locally increased seismicity rates, where seismicity rates are high and to increase smoothing distances where seismicity is sparse. We follow guidance from previous studies to optimize the neighbor number (n-value) by comparing model likelihood values, which estimate the likelihood that the observed earthquake epicenters from the recent catalog are derived from the smoothed rate models. We compare likelihood
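
    A minimal sketch of the adaptive-bandwidth idea described above: each epicentre receives a smoothing distance equal to the distance to its n-th nearest neighbour, optionally clipped to a plausible range. The coordinates, neighbour number and clipping limits below are hypothetical.

        import numpy as np

        def adaptive_bandwidths(x, y, n_neighbor=3, d_min=10.0, d_max=100.0):
            # Smoothing distance per epicentre = distance (km) to its n-th nearest neighbour,
            # clipped to [d_min, d_max]; small where seismicity is dense, large where sparse.
            pts = np.column_stack([x, y])
            d = np.sqrt(((pts[:, None, :] - pts[None, :, :]) ** 2).sum(axis=-1))
            d_sorted = np.sort(d, axis=1)          # column 0 is the zero self-distance
            bw = d_sorted[:, n_neighbor]
            return np.clip(bw, d_min, d_max)

        # Hypothetical projected epicentre coordinates (km)
        x = np.array([0.0, 5.0, 7.0, 12.0, 50.0])
        y = np.array([0.0, 1.0, 3.0, 2.0, 40.0])
        print(adaptive_bandwidths(x, y))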

  12. Comparison of smoothing methods for the development of a smoothed seismicity model for Alaska and the implications for seismic hazard

    USGS Publications Warehouse

    Moschetti, Morgan P.; Mueller, Charles S.; Boyd, Oliver S.; Petersen, Mark D.

    2014-01-01

    In anticipation of the update of the Alaska seismic hazard maps (ASHMs) by the U. S. Geological Survey, we report progress on the comparison of smoothed seismicity models developed using fixed and adaptive smoothing algorithms, and investigate the sensitivity of seismic hazard to the models. While fault-based sources, such as those for great earthquakes in the Alaska-Aleutian subduction zone and for the ~10 shallow crustal faults within Alaska, dominate the seismic hazard estimates for locations near to the sources, smoothed seismicity rates make important contributions to seismic hazard away from fault-based sources and where knowledge of recurrence and magnitude is not sufficient for use in hazard studies. Recent developments in adaptive smoothing methods and statistical tests for evaluating and comparing rate models prompt us to investigate the appropriateness of adaptive smoothing for the ASHMs. We develop smoothed seismicity models for Alaska using fixed and adaptive smoothing methods and compare the resulting models by calculating and evaluating the joint likelihood test. We use the earthquake catalog, and associated completeness levels, developed for the 2007 ASHM to produce fixed-bandwidth-smoothed models with smoothing distances varying from 10 to 100 km and adaptively smoothed models. Adaptive smoothing follows the method of Helmstetter et al. and defines a unique smoothing distance for each earthquake epicenter from the distance to the nth nearest neighbor. The consequence of the adaptive smoothing methods is to reduce smoothing distances, causing locally increased seismicity rates, where seismicity rates are high and to increase smoothing distances where seismicity is sparse. We follow guidance from previous studies to optimize the neighbor number (n-value) by comparing model likelihood values, which estimate the likelihood that the observed earthquake epicenters from the recent catalog are derived from the smoothed rate models. We compare likelihood

  13. Calibration of strong motion models for Central America region and its use in seismic hazard assessment

    NASA Astrophysics Data System (ADS)

    Climent, A.; Benito, M. B.; Piedra, R.; Lindholm, C.; Gaspar-Escribano, J.

    2013-05-01

    We present the results of a study aimed at choosing the most suitable strong-motion models for seismic hazard analysis in the Central America (CA) region. After a careful revision of the state of the art, different models developed for subduction and volcanic crustal zones, in tectonic environments similar to those of CA, were selected. These models were calibrated with accelerograms recorded in Costa Rica, Nicaragua and El Salvador. The peak ground acceleration PGA and spectral acceleration SA (T) derived from the records were compared with the ones predicted by the models under similar conditions of magnitude, distance and soil. The type of magnitude (Ms, Mb, Mw), distance (Rhyp, Rrup, etc.) and ground motion parameter (maximum horizontal component, geometric mean, etc.) was taken into account in the comparison with the real data. As a result of the analysis, the models which present the best fit with the local data were identified. These models have been applied for carrying out seismic hazard analysis in the region, in the framework of the RESIS II project financed by the Norwegian Foreign Department and also by the Spanish project SISMOCAES. The methodology followed is based on the direct comparison of PGA and 5% damped SA response values extracted from actual records with the corresponding acceleration values predicted by the selected ground-motion models for similar magnitude, distance and soil conditions. Residuals between observed and predicted values for PGA and SA (1 sec) are calculated and plotted as a function of distance and magnitude, analyzing their deviation from the mean value. Moreover, and most importantly, a statistical analysis of the normalized residuals was carried out using the criteria proposed by Scherbaum et al. (2004), which consist of categorizing ground motion models based on a likelihood parameter that reflects the goodness-of-fit of the median values as well as the shape of the underlying distribution of ground motion residuals. Considering
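
    A minimal sketch of the residual computation underlying this kind of calibration: normalized residuals of observed versus predicted ln(PGA), which should be roughly standard normal for a well-fitting ground-motion model (the quantity behind the Scherbaum et al., 2004 likelihood criterion). The example values are hypothetical.

        import numpy as np

        def normalized_residuals(obs_pga, median_pga, sigma_ln):
            # z = (ln(obs) - ln(model median)) / sigma_ln; roughly standard normal
            # if the ground-motion model fits the local data well.
            return (np.log(obs_pga) - np.log(median_pga)) / sigma_ln

        # Hypothetical records (PGA in g) and model predictions with a total sigma of 0.6 ln units
        z = normalized_residuals(np.array([0.12, 0.30, 0.05]),
                                 np.array([0.10, 0.25, 0.08]),
                                 0.6)
        print(z.mean(), z.std())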

  14. Proportional smile design using the recurring esthetic dental (red) proportion.

    PubMed

    Ward, D H

    2001-01-01

    Dentists have needed an objective way in which to evaluate a smile. A method for determining the ideal size and position of the anterior teeth has been presented here. Use of the FIVE to evaluate the RED proportion and the width-to-height ratio, tempered with sound clinical judgment, gives pleasing and consistent results. With the diversity that exists in nature, rarely does the final result follow all the mathematical rules of proportional smile design. This approach may serve as a foundation on which to base initial smile design, however. When one begins to understand the relationship between beauty, mathematics, and the surrounding world, one begins to appreciate their interdependence.

  15. Hazard function theory for nonstationary natural hazards

    NASA Astrophysics Data System (ADS)

    Read, L.; Vogel, R. M.

    2015-12-01

    Studies from the natural hazards literature indicate that many natural processes, including wind speeds, landslides, wildfires, precipitation, streamflow and earthquakes, show evidence of nonstationary behavior such as trends in magnitudes through time. Traditional probabilistic analysis of natural hazards based on partial duration series (PDS) generally assumes stationarity in the magnitudes and arrivals of events, i.e. that the probability of exceedance is constant through time. Given evidence of trends and the consequent expected growth in devastating impacts from natural hazards across the world, new methods are needed to characterize their probabilistic behavior. The field of hazard function analysis (HFA) is ideally suited to this problem because its primary goal is to describe changes in the exceedance probability of an event over time. HFA is widely used in medicine, manufacturing, actuarial statistics, reliability engineering, economics, and elsewhere. HFA provides a rich theory to relate the natural hazard event series (x) with its failure time series (t), enabling computation of corresponding average return periods and reliabilities associated with nonstationary event series. This work investigates the suitability of HFA to characterize nonstationary natural hazards whose PDS magnitudes are assumed to follow the widely applied Poisson-GP model. We derive a 2-parameter Generalized Pareto hazard model and demonstrate how metrics such as reliability and average return period are impacted by nonstationarity and discuss the implications for planning and design. Our theoretical analysis linking hazard event series x, with corresponding failure time series t, should have application to a wide class of natural hazards.
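
    For concreteness, the hazard function of a 2-parameter Generalized Pareto distribution takes a particularly simple closed form, h(x) = 1/(sigma + xi*x); a short sketch is given below. The parametrization and example values are assumptions for illustration, not the derivation in the paper.

        import numpy as np

        def gp_hazard(x, sigma, xi):
            # Hazard function of the Generalized Pareto distribution with scale sigma > 0
            # and shape xi: h(x) = 1 / (sigma + xi * x), for x >= 0
            # (and x < -sigma/xi when xi < 0).
            return 1.0 / (sigma + xi * np.asarray(x, dtype=float))

        # Hypothetical PDS exceedance magnitudes; xi > 0 gives a hazard decreasing in x
        print(gp_hazard(np.linspace(0.0, 5.0, 6), sigma=1.2, xi=0.15))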

  16. Modeling and hazard mapping of complex cascading mass movement processes: the case of glacier lake 513, Carhuaz, Peru

    NASA Astrophysics Data System (ADS)

    Schneider, Demian; Huggel, Christian; García, Javier; Ludeña, Sebastian; Cochachin, Alejo

    2013-04-01

    The Cordilleras in Peru are especially vulnerable to, and affected by, impacts from climate change. Local communities and cities often exist directly within the reach of major hazard potentials such as lake outburst floods (aluviones), mud-/debris flows (huaycos) or large rock-/ice avalanches. Such events have repeatedly and severely affected these regions over the last decades and since the last century, and thousands of people have been killed. One of the most recent events in the Cordillera Blanca occurred on 11 April 2010, when a rock/ice avalanche from the top of Hualcán mountain, NE of the town of Carhuaz, impacted the glacier lake 513 (Laguna 513), caused displacement waves and triggered an outburst flood wave. The flow repeatedly transformed from debris flow to hyperconcentrated flow and eventually caused significant damage in Carhuaz. This event was the motivation to start early warning and prevention efforts to reduce risks related to ice/rock avalanches and glacier lake outburst floods (GLOF). One of the basic components of an early warning system is the assessment, understanding and communication of relevant hazards and risks. Here we report on the methodology and results of generating GLOF-related hazard maps for Carhuaz based on numerical modeling and field work. This exercise required an advanced concept and implementation of different mass movement models. Specifically, numerical models were applied for simulating avalanche flow, avalanche lake impact, displacement wave generation and lake overtopping, and eventually flow propagation of the outburst flood with changing rheology between debris flow and hyperconcentrated flows. We adopted a hazard mapping procedure slightly adjusted from guidelines developed in Switzerland and in the Andes region. A methodology has thereby been developed to translate results from numerical mass movement modeling into hazard maps. The resulting hazard map was verified and adjusted during field work. This study shows

  17. Predictive models in hazard assessment of Great Lakes contaminants for fish

    USGS Publications Warehouse

    Passino, Dora R. May

    1986-01-01

    A hazard assessment scheme was developed and applied to predict potential harm to aquatic biota of nearly 500 organic compounds detected by gas chromatography/mass spectrometry (GC/MS) in Great Lakes fish. The frequency of occurrence and estimated concentrations of compounds found in lake trout (Salvelinus namaycush) and walleyes (Stizostedion vitreum vitreum) were compared with available manufacturing and discharge information. Bioconcentration potential of the compounds was estimated from available data or from calculations of quantitative structure-activity relationships (QSAR). Investigators at the National Fisheries Research Center-Great Lakes also measured the acute toxicity (48-h EC50's) of 35 representative compounds to Daphnia pulex and compared the results with acute toxicity values generated by QSAR. The QSAR-derived toxicities for several chemicals underestimated the actual acute toxicity by one or more orders of magnitude. A multiple regression of log EC50 on log water solubility and molecular volume proved to be a useful predictive model. Additional models providing insight into toxicity incorporate solvatochromic parameters that measure dipolarity/polarizability, hydrogen bond acceptor basicity, and hydrogen bond donor acidity of the solute (toxicant).
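
    A minimal sketch of the kind of multiple regression mentioned above (log EC50 regressed on log water solubility and molecular volume), fitted by ordinary least squares; the data values are entirely hypothetical and not taken from the study.

        import numpy as np

        # Hypothetical training set: log10 water solubility, molecular volume, log10 EC50
        log_sol = np.array([-1.2, -2.5, -3.1, -0.4, -4.0])
        mol_vol = np.array([140.0, 210.0, 260.0, 95.0, 310.0])
        log_ec50 = np.array([1.8, 0.9, 0.2, 2.4, -0.5])

        # Multiple linear regression with intercept, solved by ordinary least squares
        X = np.column_stack([np.ones_like(log_sol), log_sol, mol_vol])
        coef, *_ = np.linalg.lstsq(X, log_ec50, rcond=None)
        b0, b_sol, b_vol = coef

        # Predicted log EC50 for a new compound from its solubility and molecular volume
        log_ec50_new = b0 + b_sol * (-2.0) + b_vol * 180.0
        print(log_ec50_new)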

  18. Geochemical transformations and modeling of two deep-well injected hazardous wastes

    USGS Publications Warehouse

    Roy, W.R.; Seyler, B.; Steele, J.D.; Mravik, S.C.; Moore, D.M.; Krapac, I.G.; Peden, J.M.; Griffin, R.A.

    1991-01-01

    Two liquid hazardous wastes (an alkaline brine-like solution and a dilute acidic waste) were mixed with finely ground rock samples of three injection-related lithologies (sandstone, dolomite, and siltstone) for 155 to 230 days at 325 K and 10.8 MPa. The pH and inorganic chemical composition of the alkaline waste were not significantly altered by any of the rock samples after 230 days of mixing. The acidic waste was neutralized as a consequence of carbonate dissolution, ion exchange, or clay-mineral dissolution, and hence was transformed into a nonhazardous waste. Mixing the alkaline waste with the solid phases yielded several reaction products: brucite, Mg(OH)2; calcite, CaCO3; and possibly a type of sodium metasilicate. Clay-like minerals formed in the sandstone, and hydrotalcite, Mg6Al2CO3(OH)16·4H2O, may have formed in the siltstone at trace levels. Mixing the alkaline waste with a synthetic brine yielded brucite, calcite, and whewellite (CaC2O4·H2O). The thermodynamic model PHRQPITZ predicted that brucite and calcite would precipitate from solution in the dolomite and siltstone mixtures and in the alkaline waste-brine system. The dilute acidic waste did not significantly alter the mineralogical composition of the three rock types after 155 days of contact. The model PHREEQE indicated that the calcite was thermodynamically stable in the dolomite and siltstone mixtures.

  19. Modeling survival in colon cancer: a methodological review

    PubMed Central

    Ahmed, Farid E; Vos, Paul W; Holbert, Don

    2007-01-01

    The Cox proportional hazards model is the most widely used model for survival analysis because of its simplicity. The fundamental assumption in this model is the proportionality of the hazard function. When this condition is not met, other modifications or other models must be used for analysis of survival data. We illustrate in this review several methodological approaches to deal with the violation of the proportionality assumption, using survival in colon cancer as an illustrative example. PMID:17295918
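
    As a minimal illustration of fitting a Cox model and checking the proportionality assumption in practice, the sketch below assumes the Python lifelines package and a small hypothetical data set; the review itself is method-agnostic and this is not the authors' analysis.

        import pandas as pd
        from lifelines import CoxPHFitter

        # Hypothetical survival data: follow-up time (months), event indicator, covariates
        df = pd.DataFrame({
            "time":  [12, 30, 45, 7, 60, 22, 15, 38, 50, 9],
            "event": [1, 0, 1, 1, 0, 1, 0, 1, 0, 1],
            "age":   [62, 55, 70, 48, 66, 59, 72, 50, 64, 57],
            "stage": [2, 1, 3, 2, 1, 3, 2, 1, 2, 3],
        })

        cph = CoxPHFitter()
        cph.fit(df, duration_col="time", event_col="event")

        # Scaled-Schoenfeld-residual-based check of the proportional hazards assumption;
        # a small p-value for a covariate suggests its effect varies with time.
        cph.check_assumptions(df, p_value_threshold=0.05)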

  20. A threshold hazard model for estimating serious infection risk following anti-tumor necrosis factor therapy in rheumatoid arthritis patients.

    PubMed

    Fu, Bo; Lunt, Mark; Galloway, James; Dixon, Will; Hyrich, Kimme; Symmons, Deborah

    2013-03-11

    Over recent years, novel biologic agents have been developed for the treatment of rheumatoid arthritis. The most common type of biologic agent in use in the United Kingdom is the anti-tumor necrosis factor inhibitor class. To fully appreciate the potential risks of anti-tumor necrosis factor therapy in patients, knowledge about the baseline hazard (risk pattern) and the characteristics of patients associated with serious infection is important. We propose a nonproportional hazard model for estimating the infection risk, by including drug exposure history information in the baseline hazard. We found that the infection risk reaches a peak within 1 month after drug exposure starts and then declines steadily for nearly 2 years before stabilizing.

  1. Landslide hazard assessment along a mountain highway in the Indian Himalayan Region (IHR) using remote sensing and computational models

    NASA Astrophysics Data System (ADS)

    Krishna, Akhouri P.; Kumar, Santosh

    2013-10-01

    Landslide hazard assessments using computational models, such as an artificial neural network (ANN) and frequency ratio (FR), were carried out covering one of the important mountain highways in the Central Himalaya of the Indian Himalayan Region (IHR). Landslide influencing factors were either calculated or extracted from spatial databases, including recent remote sensing data from LANDSAT TM, the CARTOSAT digital elevation model (DEM) and the Tropical Rainfall Measuring Mission (TRMM) satellite for rainfall data. The ANN was implemented using a multi-layered feed-forward architecture with different input, output and hidden layers. This model, based on the back-propagation algorithm, derived weights for all possible landslide parameters and causative factors considered. The training sites for landslide-prone and non-prone areas were identified and verified through details gathered from remote sensing and other sources. Frequency ratio (FR) models are based on observed relationships between the distribution of landslides and each landslide-related factor. The FR model implementation proved useful for assessing the spatial relationships between landslide locations and the factors contributing to their occurrence. The above computational models generated respective landslide hazard susceptibility maps for the study area. This further allowed the simulation of landslide hazard maps at a medium scale using a GIS platform and remote sensing data. Upon validation and accuracy checks, it was observed that both models produced good results, with FR having some edge over ANN-based mapping. Such statistical and functional models led to a better understanding of the relationships between the landslides and preparatory factors, as well as ensuring lower levels of subjectivity compared to qualitative approaches.
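
    A minimal sketch of the frequency ratio calculation for one causative factor: the share of landslide cells in each factor class divided by the share of all cells in that class. The class counts are hypothetical.

        import numpy as np

        def frequency_ratio(landslide_cells, total_cells):
            # FR per factor class = (share of landslide cells in the class)
            #                       / (share of all cells in the class);
            # FR > 1 means the class is over-represented among landslides.
            ls = np.asarray(landslide_cells, dtype=float)
            tot = np.asarray(total_cells, dtype=float)
            return (ls / ls.sum()) / (tot / tot.sum())

        # Hypothetical slope classes: 0-15, 15-30, 30-45, >45 degrees
        fr = frequency_ratio(landslide_cells=[5, 40, 120, 35],
                             total_cells=[50000, 80000, 60000, 10000])
        # A cell's susceptibility index is then the sum of the FR values of the classes it
        # falls into, across all causative factors.
        print(fr)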

  2. Bayesian inference on proportional elections.

    PubMed

    Brunello, Gabriel Hideki Vatanabe; Nakano, Eduardo Yoshio

    2015-01-01

    Polls for majoritarian voting systems usually show estimates of the percentage of votes for each candidate. However, proportional vote systems do not necessarily guarantee that the candidate with the highest percentage of votes will be elected. Thus, traditional methods used in majoritarian elections cannot be applied to proportional elections. In this context, the purpose of this paper was to perform Bayesian inference on proportional elections considering the Brazilian system of seat distribution. More specifically, a methodology to estimate the probability that a given party will have representation in the Chamber of Deputies was developed. Inferences were made in a Bayesian scenario using the Monte Carlo simulation technique, and the developed methodology was applied to data from the Brazilian elections for Members of the Legislative Assembly and Federal Chamber of Deputies in 2010. A performance rate was also presented to evaluate the efficiency of the methodology. Calculations and simulations were carried out using the free R statistical software.

  3. Bayesian Inference on Proportional Elections

    PubMed Central

    Brunello, Gabriel Hideki Vatanabe; Nakano, Eduardo Yoshio

    2015-01-01

    Polls for majoritarian voting systems usually show estimates of the percentage of votes for each candidate. However, proportional vote systems do not necessarily guarantee that the candidate with the highest percentage of votes will be elected. Thus, traditional methods used in majoritarian elections cannot be applied to proportional elections. In this context, the purpose of this paper was to perform Bayesian inference on proportional elections considering the Brazilian system of seat distribution. More specifically, a methodology to estimate the probability that a given party will have representation in the Chamber of Deputies was developed. Inferences were made in a Bayesian scenario using the Monte Carlo simulation technique, and the developed methodology was applied to data from the Brazilian elections for Members of the Legislative Assembly and Federal Chamber of Deputies in 2010. A performance rate was also presented to evaluate the efficiency of the methodology. Calculations and simulations were carried out using the free R statistical software. PMID:25786259

  4. Implications of different digital elevation models and preprocessing techniques to delineate debris flow inundation hazard zones in El Salvador

    NASA Astrophysics Data System (ADS)

    Anderson, E. R.; Griffin, R.; Irwin, D.

    2013-12-01

    Heavy rains and steep, volcanic slopes in El Salvador cause numerous landslides every year, posing a persistent threat to the population, economy and environment. Although potential debris inundation hazard zones have been delineated using digital elevation models (DEMs), some disparities exist between the simulated zones and actual affected areas. Moreover, these hazard zones have only been identified for volcanic lahars and not the shallow landslides that occur nearly every year. This is despite the availability of tools to delineate a variety of landslide types (e.g., the USGS-developed LAHARZ software). Limitations in DEM spatial resolution, age of the data, and hydrological preprocessing techniques can contribute to inaccurate hazard zone definitions. This study investigates the impacts of using different elevation models and pit filling techniques on the final debris hazard zone delineations, in an effort to determine which combination of methods most closely agrees with observed landslide events. In particular, a national DEM digitized from topographic sheets from the 1970s and 1980s provides an elevation product at a 10-meter resolution. Both natural and anthropogenic modifications of the terrain limit the accuracy of current landslide hazard assessments derived from this source. Global products from the Shuttle Radar Topography Mission (SRTM) and the Advanced Spaceborne Thermal Emission and Reflection Radiometer Global DEM (ASTER GDEM) offer more recent data but at the cost of spatial resolution. New data derived from the NASA Uninhabited Aerial Vehicle Synthetic Aperture Radar (UAVSAR) in 2013 provide the opportunity to update hazard zones at a higher spatial resolution (approximately 6 meters). Hydrological filling of sinks or pits for current hazard zone simulation has previously been achieved through ArcInfo Spatial Analyst. Such hydrological processing typically only fills pits and can lead to drastic modifications of original elevation values

  5. Tsunami Hazard Assessment: Source regions of concern to U.S. interests derived from NOAA Tsunami Forecast Model Development

    NASA Astrophysics Data System (ADS)

    Eble, M. C.; uslu, B. U.; Wright, L.

    2013-12-01

    Synthetic tsunamis generated from source regions around the Pacific Basin are analyzed in terms of their relative impact on United States coastal locations. The region of tsunami origin is as important as the expected magnitude and the predicted inundation for understanding tsunami hazard. The NOAA Center for Tsunami Research has developed high-resolution tsunami models capable of predicting tsunami arrival time and amplitude of waves at each location. These models have been used to conduct tsunami hazard assessments to assess maximum impact and tsunami inundation, for use by local communities in education and evacuation map development. Hazard assessment studies conducted for Los Angeles, San Francisco, Crescent City, Hilo, and Apra Harbor are combined with results of tsunami forecast model development at each of seventy-five locations. A complete hazard assessment identifies every possible tsunami variation from a pre-computed propagation database. Study results indicate that the Eastern Aleutian Islands and Alaska are the most likely regions to produce the largest impact on the West Coast of the United States, while the East Philippines and Mariana trench regions impact Apra Harbor, Guam. Hawaii appears to be impacted equally by South America, Alaska and the Kuril Islands.

  6. Application of physical erosion modelling to derive off-site muddy flood hazard

    NASA Astrophysics Data System (ADS)

    Annika Arevalo, Sarah; Schmidt, Jürgen

    2015-04-01

    Muddy floods are local inundation events after heavy rain storms. They occur inside watersheds before the runoff reaches a river. The sediment is eroded from agricultural fields and transported with the surface runoff into adjacent residential areas. The environment where muddy floods occur is very small-scale. The damages related to muddy floods are caused by the runoff water (flooded houses and cellars) and the transported sediment that is deposited on infrastructure and private properties. There are a variety of factors that drive the occurrence of muddy floods. The spatial extent is rather small and the distribution is very heterogeneous. This makes the prediction of the precise locations that are endangered by muddy flooding a challenge. The aim of this investigation is to identify potential hazard areas that might suffer muddy flooding from modelled soil erosion data. For the German state of Saxony there is a modelled map of soil erosion and particle transport available. The model applied is EROSION 3D. The spatial resolution is a 20 m raster and the conditions assumed are a 10-year rainfall event on uncovered agricultural soils. A digital land-use map is compiled, containing the outer borders of potential risk elements (residential and industrial areas, streets, railroads, etc.) that can be damaged by muddy flooding. The land-use map is merged with the transported sediment map calculated with EROSION 3D. The result precisely depicts the locations where high amounts of sediment might be transported into urban areas under worst-case conditions. This map was validated with observed muddy flood events, which proved to coincide very well with areas predicted to have a potentially high sediment input.

  7. Examining school-based bullying interventions using multilevel discrete time hazard modeling.

    PubMed

    Ayers, Stephanie L; Wagaman, M Alex; Geiger, Jennifer Mullins; Bermudez-Parsai, Monica; Hedberg, E C

    2012-10-01

    Although schools have been trying to address bullying by utilizing different approaches that stop or reduce the incidence of bullying, little remains known about what specific intervention strategies are most successful in reducing bullying in the school setting. Using the social-ecological framework, this paper examines school-based disciplinary interventions often used to deliver consequences to deter the reoccurrence of bullying and aggressive behaviors among school-aged children. Data for this study are drawn from the School-Wide Information System (SWIS) with the final analytic sample consisting of 1,221 students in grades K - 12 who received an office disciplinary referral for bullying during the first semester. Using Kaplan-Meier Failure Functions and Multi-level discrete time hazard models, determinants of the probability of a student receiving a second referral over time were examined. Of the seven interventions tested, only Parent-Teacher Conference (AOR = 0.65, p < .01) and Loss of Privileges (AOR = 0.71, p < .10) were significant in reducing the rate of the reoccurrence of bullying and aggressive behaviors. By using a social-ecological framework, schools can develop strategies that deter the reoccurrence of bullying by identifying key factors that enhance a sense of connection between the students' mesosystems as well as utilizing disciplinary strategies that take into consideration student's microsystem roles.
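
    A minimal single-level sketch of the discrete-time hazard idea used above: expand each student into one row per period at risk (person-period format) and fit a logistic regression of the event indicator on time and an intervention covariate. It assumes pandas and statsmodels, uses entirely hypothetical data, and omits the multilevel structure of the actual analysis.

        import numpy as np
        import pandas as pd
        import statsmodels.api as sm

        # Hypothetical students: weeks until a second referral (or censoring), whether the
        # second referral occurred, and whether a parent-teacher conference was used.
        students = pd.DataFrame({
            "weeks": [3, 8, 5, 10, 2, 7],
            "second_referral": [1, 0, 1, 0, 1, 1],
            "parent_teacher_conf": [0, 1, 0, 1, 0, 1],
        })

        # Person-period expansion: one row per student per week at risk; the event
        # indicator is 1 only in the final week and only if the event occurred.
        rows = []
        for _, s in students.iterrows():
            for week in range(1, int(s["weeks"]) + 1):
                rows.append({
                    "week": week,
                    "event": int(s["second_referral"] == 1 and week == s["weeks"]),
                    "parent_teacher_conf": s["parent_teacher_conf"],
                })
        pp = pd.DataFrame(rows)

        # Discrete-time hazard model: logistic regression of the event on time and covariate
        X = sm.add_constant(pp[["week", "parent_teacher_conf"]])
        fit = sm.Logit(pp["event"], X).fit(disp=0)
        print(np.exp(fit.params))   # odds ratios, analogous to the AORs quoted above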

  8. Modeling of Natural Coastal Hazards in Puerto Rico in Support of Emergency Management and Coastal Planning

    NASA Astrophysics Data System (ADS)

    Mercado, A., Jr.

    2015-12-01

    The island of Puerto Rico is not only located in the so-called Caribbean hurricane alley, but is also located in a tsunami-prone region. Both phenomena have affected the island. For the past few years we have undertaken the task of upgrading the available coastal flood maps for storm surges and tsunamis. This has been done taking advantage of new Lidar-derived, high-resolution topography and bathymetry and state-of-the-art models (MOST for tsunamis and ADCIRC/SWAN for storm surges). The tsunami inundation maps have been converted to evacuation maps. For tsunamis, we are also preparing hazard maps for tsunami currents inside ports, bays, and marinas. The storm surge maps include two scenarios of sea level rise: 0.5 and 1.0 m above Mean High Water. All maps have been adopted by the Puerto Rico State Emergency Management Agency, and are publicly available through the Internet. It is the purpose of this presentation to summarize how this has been done, the spin-off applications the maps have generated, and how we plan to improve coastal flooding predictions.

  9. Methodologies for the assessment of earthquake-triggered landslides hazard. A comparison of Logistic Regression and Artificial Neural Network models.

    NASA Astrophysics Data System (ADS)

    García-Rodríguez, M. J.; Malpica, J. A.; Benito, B.

    2009-04-01

    In recent years, interest in landslide hazard assessment studies has increased substantially. They are appropriate for evaluation and mitigation plan development in landslide-prone areas. There are several techniques available for landslide hazard research at a regional scale. Generally, they can be classified into two groups: qualitative and quantitative methods. Most qualitative methods tend to be subjective, since they depend on expert opinions and represent hazard levels in descriptive terms. On the other hand, quantitative methods are objective and they are commonly used due to the correlation between the instability factors and the location of the landslides. Within this group, statistical approaches and new heuristic techniques based on artificial intelligence (artificial neural network (ANN), fuzzy logic, etc.) provide rigorous analysis to assess landslide hazard over large regions. However, they depend on qualitative and quantitative data, scale, types of movements and the characteristic factors used. We analysed and compared an approach for assessing earthquake-triggered landslide hazard using logistic regression (LR) and artificial neural networks (ANN) with a back-propagation learning algorithm. One application has been developed in El Salvador, a Central American country where earthquake-triggered landslides are common phenomena. In a first phase, we analysed the susceptibility and hazard associated with the seismic scenario of the 13 January 2001 earthquake. We calibrated the models using data from the landslide inventory for this scenario. These analyses require input variables representing physical parameters that contribute to the initiation of slope instability, for example slope gradient, elevation, aspect, mean annual precipitation, lithology, land use, and terrain roughness, while the occurrence or non-occurrence of landslides is considered as the dependent variable. The results of the landslide susceptibility analysis are checked using landslide

  10. Numerical modeling of marine Gravity data for tsunami hazard zone mapping

    NASA Astrophysics Data System (ADS)

    Porwal, Nipun

    2012-07-01

    A tsunami is a series of ocean waves with very long wavelengths, ranging from 10 to 500 km; tsunamis therefore behave as shallow-water waves and are hard to predict with many methods. Bottom pressure recorders of the Poseidon class are considered a preeminent means of detecting tsunami waves, but the acoustic modems of ocean bottom pressure (OBP) sensors placed in the vicinity of trenches deeper than 6000 m fail to relay OBP data to surface buoys. This paper therefore develops a numerical model of gravity field coefficients from the Bureau Gravimetrique International (BGI), which do not by themselves play a central role in geodesy, satellite orbit computation, and geophysics, but from which high-resolution ocean bottom pressure (OBP) data are generated by mathematical transformation using normalized Legendre polynomials. Ten years of real-time, sea-level-monitored OBP data at 0.3° by 1° spatial resolution, produced with a Kalman filter (kf080) by the Estimating the Circulation and Climate of the Ocean (ECCO) project, have been correlated with the OBP data derived from the gravity field coefficients, supporting a feasibility study of future space-based tsunami detection and of the identification of the most suitable sites for placing OBP sensors near deep trenches. The Levitus climatological temperature and salinity are assimilated into a version of the MITgcm using the adjoint method to obtain the sea-height component. TOPEX/Poseidon satellite altimetry, surface momentum, heat, and freshwater fluxes from the NCEP reanalysis product, and the dynamic ocean topography DOT_DNSCMSS08_EGM08 are then used to interpret sea-bottom elevation. All datasets are then combined in the raster calculator of ArcGIS 9.3 using Boolean intersection algebra and proximity analysis tools together with a high-resolution sea-floor topographic map. Finally, the tsunami-prone areas and the suitable BPR sites identified in this research are validated using a passive microwave radiometry system for tsunami hazard zone

  11. On the predictive information criteria for model determination in seismic hazard analysis

    NASA Astrophysics Data System (ADS)

    Varini, Elisa; Rotondi, Renata

    2016-04-01

    estimate, but it is hardly applicable to data which are not independent given the parameters (Watanabe, J. Mach. Learn. Res., 2010). A solution is given by the Ando and Tsay criterion, where the joint density may be decomposed into the product of the conditional densities (Ando and Tsay, Int. J. Forecast., 2010). The above-mentioned criteria are global summary measures of model performance, but a more detailed analysis may be required to discover the reasons for poor global performance. In that case, a retrospective predictive analysis is performed on each individual observation. In this study we performed a Bayesian analysis of Italian data sets using four versions of a long-term hazard model known as the stress release model (Vere-Jones, J. Physics Earth, 1978; Bebbington and Harte, Geophys. J. Int., 2003; Varini and Rotondi, Environ. Ecol. Stat., 2015). We then illustrate the results on their performance as evaluated by the Bayes factor, predictive information criteria, and retrospective predictive analysis.
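
    For readers unfamiliar with the model class compared here, a common textbook parameterization of the stress release model (following Vere-Jones, 1978) is sketched below; the notation is generic and does not reproduce the four model versions analysed in the study.

        % Hedged sketch of a standard stress release model parameterization.
        % X(t): unobserved stress level; rho: tectonic loading rate; S(t): stress released
        % by the events before time t; lambda(t) is the conditional intensity (hazard rate).
        X(t) = X(0) + \rho t - S(t), \qquad S(t) = \sum_{t_i < t} X_i,
        \qquad \lambda(t) = \exp\{\alpha + \beta X(t)\}.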

  13. Numerical Stress Field Modelling: from geophysical observations toward volcano hazard assessment

    NASA Astrophysics Data System (ADS)

    Currenti, Gilda; Coco, Armando; Privitera, Emanuela

    2015-04-01

    Numerical results show the contribution of groundwater head gradients associated with topographically induced flow and pore-pressure changes, providing a quantitative estimate of the deformation and failure of the volcano edifice. The comparison between the predictions of the model and the observations can provide valuable insights into the stress state of the volcano and, hence, into the likelihood of an impending eruption. This innovative approach opens up new perspectives in geodetic inverse modelling and lays the basis for future developments in volcano hazard assessment based on a critical combination of geophysical observations and numerical modelling.

  14. Proportional Reasoning with a Pyramid

    ERIC Educational Resources Information Center

    Mamolo, Ami; Sinclair, Margaret; Whiteley, Walter J.

    2011-01-01

    Proportional reasoning pops up in math class in a variety of places, such as while making scaled drawings; finding equivalent fractions; converting units of measurement; comparing speeds, prices, and rates; and comparing lengths, areas, and volume. Students need to be exposed to a variety of representations to develop a sound understanding of this…

  15. Social Justice and Proportional Reasoning

    ERIC Educational Resources Information Center

    Simic-Muller, Ksenija

    2015-01-01

    Ratio and proportional reasoning tasks abound that have connections to real-world situations. Examples in this article demonstrate how textbook tasks can easily be transformed into authentic real-world problems that shed light on issues of equity and fairness, such as population growth and crime rates. A few ideas are presented on how teachers can…

  16. Saving Money Using Proportional Reasoning

    ERIC Educational Resources Information Center

    de la Cruz, Jessica A.; Garney, Sandra

    2016-01-01

    It is beneficial for students to discover intuitive strategies, as opposed to the teacher presenting strategies to them. Certain proportional reasoning tasks are more likely to elicit intuitive strategies than other tasks. The strategies that students are apt to use when approaching a task, as well as the likelihood of a student's success or…

  17. A spatiotemporal optimization model for the evacuation of the population exposed to flood hazard

    NASA Astrophysics Data System (ADS)

    Alaeddine, H.; Serrhini, K.; Maizia, M.

    2015-03-01

    Managing the crisis caused by natural disasters, and especially by floods, requires the development of effective evacuation systems. An effective evacuation system must take into account certain constraints, including those related to traffic network, accessibility, human resources and material equipment (vehicles, collecting points, etc.). The main objective of this work is to provide assistance to technical services and rescue forces in terms of accessibility by offering itineraries relating to rescue and evacuation of people and property. We consider in this paper the evacuation of an urban area of medium size exposed to the hazard of flood. In case of inundation, most people will be evacuated using their own vehicles. Two evacuation types are addressed in this paper: (1) a preventive evacuation based on a flood forecasting system and (2) an evacuation during the disaster based on flooding scenarios. The two study sites on which the developed evacuation model is applied are the Tours valley (Fr, 37), which is protected by a set of dikes (preventive evacuation), and the Gien valley (Fr, 45), which benefits from a low rate of flooding (evacuation before and during the disaster). Our goal is to construct, for each of these two sites, a chronological evacuation plan, i.e., computing for each individual the departure date and the path to reach the assembly point (also called shelter) according to a priority list established for this purpose. The evacuation plan must avoid the congestion on the road network. Here we present a spatiotemporal optimization model (STOM) dedicated to the evacuation of the population exposed to natural disasters and more specifically to flood risk.
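
    At its simplest, the scheduling problem described above amounts to assigning each evacuee a route to a shelter and a staggered departure time so that the network is not overloaded. The sketch below is a naive illustration on a tiny, hypothetical road graph; it is not the STOM formulation of the paper, and all node names and travel times are invented.

        # Minimal sketch: staggered departures along shortest routes to a shelter (illustrative only).
        import networkx as nx

        G = nx.Graph()
        G.add_weighted_edges_from([          # (node, node, travel time in minutes)
            ("A", "B", 5), ("B", "C", 4), ("C", "shelter", 3), ("A", "D", 7), ("D", "shelter", 6),
        ])
        households = {"A": 3, "B": 2}         # hypothetical number of vehicles per origin node
        slot = 0
        for origin, n_vehicles in households.items():
            path = nx.shortest_path(G, origin, "shelter", weight="weight")
            for _ in range(n_vehicles):
                print(f"departure at t = {slot * 2} min, route: {' -> '.join(path)}")
                slot += 1                     # naive staggering to limit simultaneous departures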

  18. Using a ballistic-caprock model for developing a volcanic projectiles hazard map at Santorini caldera

    NASA Astrophysics Data System (ADS)

    Konstantinou, Konstantinos

    2015-04-01

    Volcanic Ballistic Projectiles (VBPs) are rock/magma fragments of variable size that are ejected from active vents during explosive eruptions. VBPs follow almost parabolic trajectories that are influenced by gravity and drag forces before they reach their impact point on the Earth's surface. Owing to their high temperatures and kinetic energies, VBPs can potentially cause human casualties and severe damage to buildings, as well as trigger fires. Since the Minoan eruption, the Santorini caldera has produced several smaller (VEI = 2-3) vulcanian eruptions, the last of which occurred in 1950, while in 2011 it also experienced significant deformation/seismicity even though no eruption eventually occurred. In this work, an eruptive model appropriate for vulcanian eruptions is used to estimate initial conditions (ejection height, velocity) for VBPs assuming a broad range of gas concentrations/overpressures in the vent. These initial conditions are then inserted into a ballistic model for the purpose of calculating the maximum range of VBPs for different VBP sizes (0.35-3 m), with the drag coefficient varying as a function of VBP speed and air density varying as a function of altitude. In agreement with previous studies, a zone of reduced drag is also included in the ballistic calculations, determined from the size of the vents that were active in the Kameni islands during previous eruptions (< 1 km). Results show that the horizontal range of VBPs varies between 0.9 and 3 km and depends greatly on the gas concentration, the extent of the reduced-drag zone, and the size of the VBP. Hazard maps are then constructed by taking into account the maximum horizontal range values as well as potential locations of eruptive vents along a NE-SW direction around the Kameni islands (the so-called "Kameni line").
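
    The ballistic part of such a calculation integrates gravity plus a velocity-dependent drag force along the trajectory. The sketch below is a minimal illustration with an assumed constant air density and drag coefficient (the study lets both vary) and invented constants; it is not the eruptive model used in the paper.

        # Minimal sketch: ballistic projectile trajectory with quadratic drag (illustrative values).
        import numpy as np

        g = 9.81            # gravity (m/s^2)
        rho_air = 1.0       # air density (kg/m^3), held constant here for simplicity
        Cd = 1.0            # drag coefficient, held constant here
        d = 1.0             # VBP diameter (m)
        rho_rock = 2500.0   # VBP density (kg/m^3)
        A = np.pi * (d / 2) ** 2
        m = rho_rock * (4 / 3) * np.pi * (d / 2) ** 3

        v0, angle = 150.0, np.deg2rad(45)        # assumed ejection speed (m/s) and angle
        x, z = 0.0, 0.0
        vx, vz = v0 * np.cos(angle), v0 * np.sin(angle)
        dt = 0.01
        while z >= 0.0:
            speed = np.hypot(vx, vz)
            drag = 0.5 * rho_air * Cd * A * speed / m   # drag deceleration per unit velocity
            vx -= drag * vx * dt
            vz -= (g + drag * vz) * dt
            x += vx * dt
            z += vz * dt
        print(f"approximate horizontal range: {x:.0f} m")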

  19. Landslide tsunami hazard in New South Wales, Australia: novel observations from 3D modelling

    NASA Astrophysics Data System (ADS)

    Power, Hannah; Clarke, Samantha; Hubble, Tom

    2015-04-01

    This paper examines the potential tsunami inundation generated by submarine mass failures at two case study sites on the New South Wales coast of Australia. Two submarine mass failure events are investigated: the Bulli Slide and the Shovel Slide. Both slides are located approximately 65 km southeast of Sydney and 60 km east of the township of Wollongong. The Bulli Slide (~20 km3) and the Shovel Slide (7.97 km3) correspond to the two largest identified erosional-surface submarine landslide scars of the NSW continental margin (Glenn et al. 2008; Clarke 2014) and represent examples of large to very large submarine landslide scars. The Shovel Slide is a moderately thick (80-165 m), moderately wide to wide (4.4 km) slide located in 880 m water depth, and the Bulli Slide is an extremely thick (200-425 m), very wide (8.9 km) slide located in 1500 m water depth. Previous work on the east Australian margin (Clarke et al., 2014) and elsewhere (Harbitz et al., 2013) suggests that submarine landslides similar to the Bulli Slide or the Shovel Slide are volumetrically large enough and occur at shallow enough water depths (400-2500 m) to generate substantial tsunamis that could cause widespread damage on the east Australian coast and threaten coastal communities (Burbidge et al. 2008; Clarke 2014; Talukder and Volker 2014). Currently, the tsunamigenic potential of these two slides has only been investigated using 2D modelling (Clarke 2014), and to date it has been difficult to establish with certainty the onshore tsunami surge characteristics for these submarine landslides. To address this knowledge gap, the forecast inundation resulting from these two mass failure events was investigated using a three-dimensional model (ANUGA) that predicts water flow resulting from natural hazard events such as tsunami (Nielsen et al., 2005). The ANUGA model solves the two-dimensional shallow water wave equations and accurately models the process of wetting and drying thus

  20. The SCEC Community Modeling Environment(SCEC/CME): A Collaboratory for Seismic Hazard Analysis

    NASA Astrophysics Data System (ADS)

    Maechling, P. J.; Jordan, T. H.; Minster, J. B.; Moore, R.; Kesselman, C.

    2005-12-01

    The SCEC Community Modeling Environment (SCEC/CME) Project is an NSF-supported Geosciences/IT partnership that is actively developing an advanced information infrastructure for system-level earthquake science in Southern California. This partnership includes SCEC, USC's Information Sciences Institute (ISI), the San Diego Supercomputer Center (SDSC), the Incorporated Research Institutions for Seismology (IRIS), and the U.S. Geological Survey. The goal of the SCEC/CME is to develop seismological applications and information technology (IT) infrastructure to support the development of Seismic Hazard Analysis (SHA) programs and other geophysical simulations. The SHA application programs developed on the Project include a Probabilistic Seismic Hazard Analysis system called OpenSHA. OpenSHA computational elements that are currently available include a collection of attenuation relationships, and several Earthquake Rupture Forecasts (ERFs). Geophysicists in the collaboration have also developed Anelastic Wave Models (AWMs) using both finite-difference and finite-element approaches. Earthquake simulations using these codes have been run for a variety of earthquake sources. Rupture Dynamic Model (RDM) codes have also been developed that simulate friction-based fault slip. The SCEC/CME collaboration has also developed IT software and hardware infrastructure to support the development, execution, and analysis of these SHA programs. To support computationally expensive simulations, we have constructed a grid-based scientific workflow system. Using the SCEC grid, project collaborators can submit computations from the SCEC/CME servers to High Performance Computers at USC and TeraGrid High Performance Computing Centers. Data generated and archived by the SCEC/CME are stored in a digital library system, the Storage Resource Broker (SRB). This system provides a robust and secure system for maintaining the association between the data sets and their metadata. To provide an easy

  1. Numerical modeling of debris avalanches at Nevado de Toluca (Mexico): implications for hazard evaluation and mapping

    NASA Astrophysics Data System (ADS)

    Grieco, F.; Capra, L.; Groppelli, G.; Norini, G.

    2007-05-01

    The present study concerns the numerical modeling of debris avalanches on Nevado de Toluca Volcano (Mexico) using the TITAN2D simulation software, and its application to the creation of hazard maps. Nevado de Toluca is an andesitic to dacitic stratovolcano of Late Pliocene-Holocene age, located in central México near the cities of Toluca and México City; its past activity has endangered an area that today has more than 25 million inhabitants. The present work is based upon data collected during extensive field work aimed at producing the geological map of Nevado de Toluca at 1:25,000 scale. The activity of the volcano developed from 2.6 Ma until 10.5 ka with both effusive and explosive events; Nevado de Toluca has had long phases of inactivity characterized by erosion and by the emplacement of debris flow and debris avalanche deposits on its flanks. The largest epiclastic events in the history of the volcano are extensive debris flows and debris avalanches that occurred between 1 Ma and 50 ka, during a prolonged hiatus in eruptive activity. Other minor events happened mainly during the most recent volcanic activity (less than 50 ka), characterized by magmatic and tectonically induced instability of the summit dome complex. According to the most recent tectonic analysis, the active transtensive kinematics of the E-W Tenango Fault System had a strong influence on the preferential directions of the last three documented lateral collapses, which generated the Arroyo Grande and Zaguàn debris avalanche deposits towards the east and the Nopal debris avalanche deposit towards the west. The analysis of the data collected during the field work made it possible to create a detailed GIS database of the spatial and temporal distribution of debris avalanche deposits on the volcano. Flow models, performed with the TITAN2D software developed by GMFG at Buffalo, were entirely based upon the information stored in the geological database. The modeling software is built upon equations

  2. First look at changes in flood hazard in the Inter-Sectoral Impact Model Intercomparison Project ensemble

    PubMed Central

    Dankers, Rutger; Arnell, Nigel W.; Clark, Douglas B.; Falloon, Pete D.; Fekete, Balázs M.; Gosling, Simon N.; Heinke, Jens; Kim, Hyungjun; Masaki, Yoshimitsu; Satoh, Yusuke; Stacke, Tobias; Wada, Yoshihide; Wisser, Dominik

    2014-01-01

    Climate change due to anthropogenic greenhouse gas emissions is expected to increase the frequency and intensity of precipitation events, which is likely to affect the probability of flooding into the future. In this paper we use river flow simulations from nine global hydrology and land surface models to explore uncertainties in the potential impacts of climate change on flood hazard at global scale. As an indicator of flood hazard we looked at changes in the 30-y return level of 5-d average peak flows under representative concentration pathway RCP8.5 at the end of this century. Not everywhere does climate change result in an increase in flood hazard: decreases in the magnitude and frequency of the 30-y return level of river flow occur at roughly one-third (20–45%) of the global land grid points, particularly in areas where the hydrograph is dominated by the snowmelt flood peak in spring. In most model experiments, however, an increase in flooding frequency was found in more than half of the grid points. The current 30-y flood peak is projected to occur in more than 1 in 5 y across 5–30% of land grid points. The large-scale patterns of change are remarkably consistent among impact models and even the driving climate models, but at local scale and in individual river basins there can be disagreement even on the sign of change, indicating large modeling uncertainty which needs to be taken into account in local adaptation studies. PMID:24344290
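
    The 30-y return level used above as the flood-hazard indicator is commonly estimated by fitting an extreme-value distribution to annual maxima of the 5-d mean flow. The sketch below illustrates this on synthetic data rather than the nine-model ensemble of the paper; all values are placeholders.

        # Minimal sketch: 30-year return level of 5-day average peak flows via a GEV fit
        # (synthetic daily flows; the paper uses simulated river flows from impact models).
        import numpy as np
        from scipy.stats import genextreme

        rng = np.random.default_rng(1)
        daily_flow = rng.gamma(shape=2.0, scale=50.0, size=365 * 40)   # 40 synthetic years
        years = daily_flow.reshape(40, 365)

        # Annual maxima of the 5-day moving-average flow.
        kernel = np.ones(5) / 5
        annual_max_5d = np.array([np.convolve(y, kernel, mode="valid").max() for y in years])

        c, loc, scale = genextreme.fit(annual_max_5d)
        return_level_30y = genextreme.isf(1.0 / 30.0, c, loc, scale)   # exceeded on average once in 30 y
        print(f"30-year return level: {return_level_30y:.1f}")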

  3. Decision support model for assessing aquifer pollution hazard and prioritizing groundwater resources management in the wet Pampa plain, Argentina.

    PubMed

    Lima, M Lourdes; Romanelli, Asunción; Massone, Héctor E

    2013-06-01

    This paper gives an account of the implementation of a decision support system for assessing aquifer pollution hazard and prioritizing subwatersheds for groundwater resources management in the southeastern Pampa plain of Argentina. The use of this system is demonstrated with an example from Dulce Stream Basin (1,000 km² encompassing 27 subwatersheds), which has high level of agricultural activities and extensive available data regarding aquifer geology. In the logic model, aquifer pollution hazard is assessed as a function of two primary topics: groundwater and soil conditions. This logic model shows the state of each evaluated landscape with respect to aquifer pollution hazard based mainly on the parameters of the DRASTIC and GOD models. The decision model allows prioritizing subwatersheds for groundwater resources management according to three main criteria including farming activities, agrochemical application, and irrigation use. Stakeholder participation, through interviews, in combination with expert judgment was used to select and weight each criterion. The resulting subwatershed priority map, by combining the logic and decision models, allowed identifying five subwatersheds in the upper and middle basin as the main aquifer protection areas. The results reasonably fit the natural conditions of the basin, identifying those subwatersheds with shallow water depth, loam-loam silt texture soil media and pasture land cover in the middle basin, and others with intensive agricultural activity, coinciding with the natural recharge area to the aquifer system. Major difficulties and some recommendations of applying this methodology in real-world situations are discussed.

  4. First look at changes in flood hazard in the Inter-Sectoral Impact Model Intercomparison Project ensemble.

    PubMed

    Dankers, Rutger; Arnell, Nigel W; Clark, Douglas B; Falloon, Pete D; Fekete, Balázs M; Gosling, Simon N; Heinke, Jens; Kim, Hyungjun; Masaki, Yoshimitsu; Satoh, Yusuke; Stacke, Tobias; Wada, Yoshihide; Wisser, Dominik

    2014-03-01

    Climate change due to anthropogenic greenhouse gas emissions is expected to increase the frequency and intensity of precipitation events, which is likely to affect the probability of flooding into the future. In this paper we use river flow simulations from nine global hydrology and land surface models to explore uncertainties in the potential impacts of climate change on flood hazard at global scale. As an indicator of flood hazard we looked at changes in the 30-y return level of 5-d average peak flows under representative concentration pathway RCP8.5 at the end of this century. Not everywhere does climate change result in an increase in flood hazard: decreases in the magnitude and frequency of the 30-y return level of river flow occur at roughly one-third (20-45%) of the global land grid points, particularly in areas where the hydrograph is dominated by the snowmelt flood peak in spring. In most model experiments, however, an increase in flooding frequency was found in more than half of the grid points. The current 30-y flood peak is projected to occur in more than 1 in 5 y across 5-30% of land grid points. The large-scale patterns of change are remarkably consistent among impact models and even the driving climate models, but at local scale and in individual river basins there can be disagreement even on the sign of change, indicating large modeling uncertainty which needs to be taken into account in local adaptation studies.

  5. Students' Understanding of Proportional, Inverse Proportional, and Affine Functions: Two Studies on the Role of External Representations

    ERIC Educational Resources Information Center

    De Bock, Dirk; Van Dooren, Wim; Verschaffel, Lieven

    2015-01-01

    We investigated students' understanding of proportional, inverse proportional, and affine functions and the way this understanding is affected by various external representations. In a first study, we focus on students' ability to model textual descriptions of situations with different kinds of representations of proportional, inverse…

  6. Quantifying the uncertainty in site amplification modeling and its effects on site-specific seismic-hazard estimation in the upper Mississippi embayment and adjacent areas

    USGS Publications Warehouse

    Cramer, C.H.

    2006-01-01

    The Mississippi embayment, located in the central United States, and its thick deposits of sediments (over 1 km in places) have a large effect on earthquake ground motions. Several previous studies have addressed how these thick sediments might modify probabilistic seismic-hazard maps. The high seismic hazard associated with the New Madrid seismic zone makes it particularly important to quantify the uncertainty in modeling site amplification to better represent earthquake hazard in seismic-hazard maps. The methodology of the Memphis urban seismic-hazard-mapping project (Cramer et al., 2004) is combined with the reference profile approach of Toro and Silva (2001) to better estimate seismic hazard in the Mississippi embayment. Improvements over previous approaches include using the 2002 national seismic-hazard model, fully probabilistic hazard calculations, calibration of site amplification with improved nonlinear soil-response estimates, and estimates of uncertainty. Comparisons are made with the results of several previous studies, and estimates of uncertainty inherent in site-amplification modeling for the upper Mississippi embayment are developed. I present new seismic-hazard maps for the upper Mississippi embayment with the effects of site geology incorporating these uncertainties.

  7. Mathematical Decision Models Applied for Qualifying and Planning Areas Considering Natural Hazards and Human Dealing

    NASA Astrophysics Data System (ADS)

    Anton, Jose M.; Grau, Juan B.; Tarquis, Ana M.; Sanchez, Elena; Andina, Diego

    2014-05-01

    The authors have been involved in the use of mathematical decision models (MDM) to improve knowledge and planning for large natural or administrative areas in which natural soils, climate, and agricultural and forest uses were the main factors, but where human resources and outcomes also mattered and natural hazards were relevant. In one line of work they contributed to the qualification of lands of the Community of Madrid (CM), an administrative area in the centre of Spain containing a band of mountains to the north, part of the Iberian plateau and river terraces in its centre, and the Madrid metropolis, starting from an official UPM study for the CM that qualified lands using a FAO model requiring minimums for a whole set of soil science criteria. From these criteria the authors first built a complementary additive qualification, and later attempted an intermediate qualification combining both using fuzzy logic. The authors were also involved, together with colleagues from Argentina and elsewhere who are in contact with local planners, in the consideration of regions and the selection of management entities for them. At these general levels they adopted multi-criteria MDM, using a weighted PROMETHEE and an ELECTRE-I with the same elicited weights for the criteria and data, and, alongside these, AHP with Expert Choice based on pairwise comparisons among similar criteria structured in two levels. The alternatives depend on the case study, and these areas with monsoon climates have natural hazards that are decisive for their selection and qualification, via an initial matrix used for ELECTRE and PROMETHEE. For the natural area of Arroyos Menores, south of the town of Rio Cuarto, with the subarea of La Colacha to the north, the loess lands are rich but now suffer from water erosion forming regressive ditches that are spoiling them, and land-use alternatives must consider soil conservation and hydraulic management actions. The soils may be used in diverse, non-compatible ways, such as autochthonous forest, high-value forest, traditional

  8. Modeling Information Accumulation in Psychological Tests Using Item Response Times

    ERIC Educational Resources Information Center

    Ranger, Jochen; Kuhn, Jörg-Tobias

    2015-01-01

    In this article, a latent trait model is proposed for the response times in psychological tests. The latent trait model is based on the linear transformation model and subsumes popular models from survival analysis, such as the proportional hazards model and the proportional odds model. The core of the model is the assumption that an unspecified monotone…
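
    As background for the model class mentioned, the linear transformation model is commonly written as below; the notation is generic and not taken from the article.

        % Generic form of the linear transformation model for a response time T:
        % H is an unspecified monotone increasing transformation, beta the covariate/trait
        % effects, and the distribution of the error term determines the special case.
        H(T) = -\beta^{\top} x + \varepsilon,
        \qquad \varepsilon \sim \text{extreme value} \;\Rightarrow\; \text{proportional hazards},
        \qquad \varepsilon \sim \text{logistic} \;\Rightarrow\; \text{proportional odds}.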

  9. Assessment of erosion hazard after recurrence fires with the RUSLE 3D MODEL

    NASA Astrophysics Data System (ADS)

    Vecín-Arias, Daniel; Palencia, Covadonga; Fernández Raga, María

    2016-04-01

    The objective of this work is to determine whether there is more soil erosion after recurrent forest fires in an area. To that end, an area of 22,130 ha in the northwest of the Iberian Peninsula was studied because it has a high frequency of fires. The erosion hazard was assessed at several points in time using Geographic Information Systems (GIS). The area was divided into several plots according to the number of times they have been burnt in the past 15 years. Because a detailed study of such a large area is complex and information is not available annually, it was necessary to select the most relevant moments. In August 2012 the most aggressive and extensive fire in the area occurred, so the study focused on the erosion hazard for 2011 and 2014, the dates before and after the 2012 fire for which orthophotos are available. The RUSLE3D model (Revised Universal Soil Loss Equation) was used to calculate erosion-loss maps. This model improves on the traditional USLE (Wischmeier and Smith, 1965) because it accounts for the influence of concavity/convexity (Renard et al., 1997) and improves the estimation of the slope factor LS (Renard et al., 1991). It is also one of the most commonly used models in the literature (Mitasova et al., 1996; Terranova et al., 2009). The tools used are free and accessible, using the GIS "gvSIG" (http://www.gvsig.com/es), and the metadata were taken from the Spatial Data Infrastructure of Spain webpage (IDEE, 2016). However, the RUSLE model has many critics, with some authors suggesting that it only serves to carry out comparisons between areas and not to calculate absolute soil-loss values. These authors argue that in field measurements the actual recovered eroded soil can amount to about one-third of the values obtained with the model (Šúri et al., 2002). The study of the area shows that the error detected by the critics could come from
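
    For reference, RUSLE-type models estimate mean annual soil loss as a product of empirical factors, A = R * K * LS * C * P. The raster sketch below uses placeholder factor values rather than the study's data and is only an illustration of the calculation.

        # Minimal sketch: RUSLE-style soil-loss raster, A = R * K * LS * C * P (illustrative values).
        import numpy as np

        shape = (100, 100)                       # hypothetical raster grid
        R = np.full(shape, 900.0)                # rainfall erosivity factor
        K = np.full(shape, 0.03)                 # soil erodibility factor
        LS = np.random.default_rng(2).uniform(0.5, 8.0, shape)  # slope length/steepness factor
        C = np.full(shape, 0.2)                  # cover-management factor (changes after a fire)
        P = np.ones(shape)                       # support-practice factor

        A = R * K * LS * C * P                   # soil loss per cell (units depend on the factor units)
        print(f"mean modelled soil loss: {A.mean():.1f}")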

  10. Subduction zone and crustal dynamics of western Washington; a tectonic model for earthquake hazards evaluation

    USGS Publications Warehouse

    Stanley, Dal; Villaseñor, Antonio; Benz, Harley

    1999-01-01

    The Cascadia subduction zone is extremely complex in the western Washington region, involving local deformation of the subducting Juan de Fuca plate and complicated block structures in the crust. It has been postulated that the Cascadia subduction zone could be the source for a large thrust earthquake, possibly as large as M9.0. Large intraplate earthquakes from within the subducting Juan de Fuca plate beneath the Puget Sound region have accounted for most of the energy release in this century and future such large earthquakes are expected. Added to these possible hazards is clear evidence for strong crustal deformation events in the Puget Sound region near faults such as the Seattle fault, which passes through the southern Seattle metropolitan area. In order to understand the nature of these individual earthquake sources and their possible interrelationship, we have conducted an extensive seismotectonic study of the region. We have employed P-wave velocity models developed using local earthquake tomography as a key tool in this research. Other information utilized includes geological, paleoseismic, gravity, magnetic, magnetotelluric, deformation, seismicity, focal mechanism and geodetic data. Neotectonic concepts were tested and augmented through use of anelastic (creep) deformation models based on thin-plate, finite-element techniques developed by Peter Bird, UCLA. These programs model anelastic strain rate, stress, and velocity fields for given rheological parameters, variable crust and lithosphere thicknesses, heat flow, and elevation. Known faults in western Washington and the main Cascadia subduction thrust were incorporated in the modeling process. Significant results from the velocity models include delineation of a previously studied arch in the subducting Juan de Fuca plate. The axis of the arch is oriented in the direction of current subduction and asymmetrically deformed due to the effects of a northern buttress mapped in the velocity models. This

  11. System Dynamics Model to develop resilience management strategies for lifelines exposed to natural hazards

    NASA Astrophysics Data System (ADS)

    Pagano, Alessandro; Pluchinotta, Irene; Giordano, Raffaele; Vurro, Michele

    2016-04-01

    Resilience has recently become a key concept and a crucial paradigm in the analysis of the impacts of natural disasters, mainly concerning Lifeline Systems (LS). Indeed, the traditional risk management approaches require a precise knowledge of all potential hazards and a full understanding of the interconnections among different infrastructures, based on past events and trends analysis. Nevertheless, due to the inner complexity of LS, their interconnectedness and the dynamic context in which they operate (i.e. technology, economy and society), it is difficult to gain a complete comprehension of the processes influencing vulnerabilities and threats. Therefore, resilience thinking addresses the complexities of large integrated systems and the uncertainty of future threats, emphasizing the absorbing, adapting and responsive behavior of the system. Resilience thinking approaches are focused on the capability of the system to deal with the unforeseeable. The increasing awareness of the role played by LS has led governmental agencies and institutions to develop resilience management strategies. Risk-prone areas, such as cities, are highly dependent on infrastructures providing essential services that support societal functions, safety, economic prosperity and quality of life. Among the LS, drinking water supply is critical for supporting citizens during emergency and recovery, since a disruption could have a range of serious societal impacts. A very well-known method to assess LS resilience is the TOSE approach. The most interesting feature of this approach is the integration of four dimensions: Technical, Organizational, Social and Economic. All four dimensions contribute to the resilience level of an infrastructural system and should therefore be quantitatively assessed. Several studies have underlined that the lack of integration among the different dimensions composing the resilience concept may contribute to mismanagement of LS in the case of natural disasters

  13. Hydrology Analysis and Modelling for Klang River Basin Flood Hazard Map

    NASA Astrophysics Data System (ADS)

    Sidek, L. M.; Rostam, N. E.; Hidayah, B.; Roseli, ZA; Majid, W. H. A. W. A.; Zahari, N. Z.; Salleh, S. H. M.; Ahmad, R. D. R.; Ahmad, M. N.

    2016-03-01

    Flooding, a common environmental hazard worldwide, has increased in recent times as a result of climate change and urbanization, with the effects felt more strongly in developing countries. As a result, the exposure of Tenaga Nasional Berhad (TNB) substations to flooding has increased rapidly, because existing substations are located in flood-prone areas. By understanding the impact of floods on its substations, TNB has provided non-structural mitigation through the integration of the flood hazard map with its substations. Hydrological analysis is the essential part, providing the runoff that serves as input for the hydraulic modelling.

  14. Assessment of groundwater contamination risk using hazard quantification, a modified DRASTIC model and groundwater value, Beijing Plain, China.

    PubMed

    Wang, Junjie; He, Jiangtao; Chen, Honghan

    2012-08-15

    Groundwater contamination risk assessment is an effective tool for groundwater management. Most existing risk assessment methods only consider the basic contamination process based upon evaluations of hazards and aquifer vulnerability. In view of groundwater exploitation potentiality, including the value of contamination-threatened groundwater could provide relatively objective and targeted results to aid in decision making. This study describes a groundwater contamination risk assessment method that integrates hazards, intrinsic vulnerability and groundwater value. The hazard harmfulness was evaluated by quantifying contaminant properties and infiltrating contaminant load, the intrinsic aquifer vulnerability was evaluated using a modified DRASTIC model and the groundwater value was evaluated based on groundwater quality and aquifer storage. Two groundwater contamination risk maps were produced by combining the above factors: a basic risk map and a value-weighted risk map. The basic risk map was produced by overlaying the hazard map and the intrinsic vulnerability map. The value-weighted risk map was produced by overlaying the basic risk map and the groundwater value map. Relevant validation was completed by contaminant distributions and site investigation. Using Beijing Plain, China, as an example, thematic maps of the three factors and the two risks were generated. The thematic maps suggested that landfills, gas stations and oil depots, and industrial areas were the most harmful potential contamination sources. The western and northern parts of the plain were the most vulnerable areas and had the highest groundwater value. Additionally, both the basic and value-weighted risk classes in the western and northern parts of the plain were the highest, indicating that these regions should deserve the priority of concern. Thematic maps should be updated regularly because of the dynamic characteristics of hazards. Subjectivity and validation means in assessing the
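
    The vulnerability component of such an assessment is a weighted index overlay. The sketch below uses the conventional (unmodified) DRASTIC weights with random placeholder ratings; the paper applies a modified DRASTIC model, so these values are illustrative only.

        # Minimal sketch: DRASTIC-style vulnerability index (conventional weights, placeholder ratings).
        import numpy as np

        # Ratings (1-10) per grid cell for the seven DRASTIC parameters.
        shape = (50, 50)
        rng = np.random.default_rng(3)
        ratings = {p: rng.integers(1, 11, shape) for p in "DRASTIC"}

        # Conventional DRASTIC weights: Depth to water, Recharge, Aquifer media, Soil media,
        # Topography, Impact of vadose zone, hydraulic Conductivity.
        weights = {"D": 5, "R": 4, "A": 3, "S": 2, "T": 1, "I": 5, "C": 3}

        index = sum(weights[p] * ratings[p] for p in "DRASTIC")   # higher = more vulnerable
        print(index.min(), index.max())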

  16. Characterizing the danger of in-channel river hazards using LIDAR and a 2D hydrodynamic model

    NASA Astrophysics Data System (ADS)

    Strom, M. A.; Pasternack, G. B.

    2014-12-01

    Despite many injuries and deaths each year worldwide, no analytically rigorous attempt exists to characterize and quantify the dangers to boaters, swimmers, fishermen, and other river enthusiasts. While designed by expert boaters, the International Scale of River Difficulty provides a whitewater classification that uses qualitative descriptions and subjective scoring. The purpose of this study was to develop an objective characterization of in-channel hazard dangers across spatial scales from a single boulder to an entire river segment for application over a wide range of discharges and use in natural hazard assessment and mitigation, recreational boating safety, and river science. A process-based conceptualization of river hazards was developed, and algorithms were programmed in R to quantify the associated dangers. Danger indicators included the passage proximity and reaction time posed to boats and swimmers in a river by three hazards: emergent rocks, submerged rocks, and hydraulic jumps or holes. The testbed river was a 12.2 km mixed bedrock-alluvial section of the upper South Yuba River between Lake Spaulding and Washington, CA in the Sierra Mountains. The segment has a mean slope of 1.63%, with 8 reaches varying from 1.07% to 3.30% slope and several waterfalls. Data inputs to the hazard analysis included sub-decimeter aerial color imagery, airborne LIDAR of the river corridor, bathymetric data, flow inputs, and a stage-discharge relation for the end of the river segment. A key derived data product was the location and configuration of boulders and boulder clusters as these were potential hazards. Two-dimensional hydrodynamic modeling was used to obtain the meter-scale spatial pattern of depth and velocity at discharges ranging from baseflow to modest flood stages. Results were produced for four discharges and included the meter-scale spatial pattern of the passage proximity and reaction time dangers for each of the three hazards investigated. These results

  17. Photodetectors for Scintillator Proportionality Measurement

    SciTech Connect

    Moses, William W.; Choong, Woon-Seng; Hull, Giulia; Payne, Steve; Cherepy, Nerine; Valentine, J.D.

    2010-10-18

    We evaluate photodetectors for use in a Compton Coincidence apparatus designed for measuring scintillator proportionality. There are many requirements placed on the photodetector in these systems, including active area, linearity, and the ability to accurately measure low light levels (which implies high quantum efficiency and high signal-to-noise ratio). Through a combination of measurement and Monte Carlo simulation, we evaluate a number of potential photodetectors, especially photomultiplier tubes and hybrid photodetectors. Of these, we find that the most promising devices available are photomultiplier tubes with high (~50%) quantum efficiency, although hybrid photodetectors with high quantum efficiency would be preferable.

  18. A hydro-sedimentary modeling system for flash flood propagation and hazard estimation under different agricultural practices

    NASA Astrophysics Data System (ADS)

    Kourgialas, N. N.; Karatzas, G. P.

    2014-03-01

    A modeling system for the estimation of flash flood flow velocity and sediment transport is developed in this study. The system comprises three components: (a) a modeling framework based on the hydrological model HSPF, (b) the hydrodynamic module of the hydraulic model MIKE 11 (quasi-2-D), and (c) the advection-dispersion module of MIKE 11 as a sediment transport model. An important parameter in hydraulic modeling is Manning's coefficient, an indicator of channel resistance that is directly dependent on riparian vegetation changes. Riparian vegetation's effect on flood propagation parameters such as water depth (inundation), discharge, flow velocity, and sediment transport load is investigated in this study. Based on the obtained results, when the weed-cutting percentage is increased, the flood wave depth decreases while flow discharge, velocity and sediment transport load increase. The proposed modeling system is used to evaluate and illustrate the flood hazard for different riparian vegetation cutting scenarios. For the estimation of flood hazard, a combination of the flood propagation characteristics of water depth, flow velocity and sediment load was used. Next, a well-balanced selection of the most appropriate agricultural cutting practices of riparian vegetation was performed. Ultimately, the model results obtained for different agricultural cutting practice scenarios can be employed to create flood protection measures for flood-prone areas. The proposed methodology was applied to the downstream part of a small Mediterranean river basin in Crete, Greece.
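
    Manning's coefficient enters the hydraulics through Manning's equation, V = (1/n) R^(2/3) S^(1/2). The short sketch below illustrates, with assumed values rather than the MIKE 11 configuration of the study, how a larger roughness coefficient (denser riparian vegetation) reduces mean flow velocity.

        # Minimal sketch: Manning's equation V = (1/n) * R^(2/3) * S^(1/2) (SI units, illustrative values).
        def manning_velocity(n, hydraulic_radius_m, slope):
            """Mean flow velocity (m/s) for roughness n, hydraulic radius R (m), and energy slope S."""
            return (1.0 / n) * hydraulic_radius_m ** (2.0 / 3.0) * slope ** 0.5

        # Denser riparian vegetation -> larger n -> slower flow and deeper inundation.
        for n in (0.030, 0.050, 0.080):
            print(n, round(manning_velocity(n, hydraulic_radius_m=1.5, slope=0.002), 2), "m/s")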

  19. A Model (Formula) for Deriving A Hazard Index of Rail-Highway Grade Crossings.

    ERIC Educational Resources Information Center

    Coburn, James Minton

    The purpose of this research was to compile data for use as related information in the education of drivers, and to derive a formula for computing a hazard index for rail-highway intersections. Data for the study were compiled from: (1) all crossings on which field data were collected, (2) reports of 642 accidents, and (3) data collected from…

  20. MODELS TO ESTIMATE VOLATILE ORGANIC HAZARDOUS AIR POLLUTANT EMISSIONS FROM MUNICIPAL SEWER SYSTEMS

    EPA Science Inventory

    Emissions from municipal sewers are usually omitted from hazardous air pollutant (HAP) emission inventories. This omission may result from a lack of appreciation for the potential emission impact and/or from inadequate emission estimation procedures. This paper presents an analys...

  1. Conveying Flood Hazard Risk Through Spatial Modeling: A Case Study for Hurricane Sandy-Affected Communities in Northern New Jersey.

    PubMed

    Artigas, Francisco; Bosits, Stephanie; Kojak, Saleh; Elefante, Dominador; Pechmann, Ildiko

    2016-10-01

    The accurate forecast from Hurricane Sandy sea surge was the result of integrating the most sophisticated environmental monitoring technology available. This stands in contrast to the limited information and technology that exists at the community level to translate these forecasts into flood hazard levels on the ground at scales that are meaningful to property owners. Appropriately scaled maps with high levels of certainty can be effectively used to convey exposure to flood hazard at the community level. This paper explores the most basic analysis and data required to generate a relatively accurate flood hazard map to convey inundation risk due to sea surge. A Boolean overlay analysis of four input layers: elevation and slope derived from LiDAR data and distances from streams and catch basins derived from aerial photography and field reconnaissance were used to create a spatial model that explained 55 % of the extent and depth of the flood during Hurricane Sandy. When a ponding layer was added to the previous model to account for depressions that would fill and spill over to nearby areas, the new model explained almost 70 % of the extent and depth of the flood. The study concludes that fairly accurate maps can be created with readily available information and that it is possible to infer a great deal about risk of inundation at the property level, from flood hazard maps. The study goes on to conclude that local communities are encouraged to prepare for disasters, but in reality because of the existing Federal emergency management framework there is very little incentive to do so. PMID:27342852
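
    A Boolean overlay of this kind reduces to combining thresholded rasters with logical operators. The sketch below uses placeholder thresholds and random layers, not the values calibrated against the Hurricane Sandy flood extent; it only illustrates the form of the analysis.

        # Minimal sketch: Boolean overlay of four raster criteria to flag flood-hazard cells
        # (thresholds and layers are placeholders, not the study's calibrated values).
        import numpy as np

        rng = np.random.default_rng(4)
        shape = (200, 200)
        elevation = rng.uniform(0, 5, shape)          # m above sea level (would come from LiDAR)
        slope = rng.uniform(0, 10, shape)             # degrees (would come from LiDAR)
        dist_stream = rng.uniform(0, 500, shape)      # m to nearest stream
        dist_catch_basin = rng.uniform(0, 300, shape) # m to nearest catch basin

        hazard = (
            (elevation < 2.0) &
            (slope < 2.0) &
            ((dist_stream < 100.0) | (dist_catch_basin < 50.0))
        )
        print(f"fraction of cells flagged as flood hazard: {hazard.mean():.2f}")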

  2. Utilizing NASA Earth Observations to Model Volcanic Hazard Risk Levels in Areas Surrounding the Copahue Volcano in the Andes Mountains

    NASA Astrophysics Data System (ADS)

    Keith, A. M.; Weigel, A. M.; Rivas, J.

    2014-12-01

    Copahue is a stratovolcano located along the rim of the Caviahue Caldera near the Chile-Argentina border in the Andes Mountain Range. There are several small towns located in proximity of the volcano with the two largest being Banos Copahue and Caviahue. During its eruptive history, it has produced numerous lava flows, pyroclastic flows, ash deposits, and lahars. This isolated region has steep topography and little vegetation, rendering it poorly monitored. The need to model volcanic hazard risk has been reinforced by recent volcanic activity that intermittently released several ash plumes from December 2012 through May 2013. Exposure to volcanic ash is currently the main threat for the surrounding populations as the volcano becomes more active. The goal of this project was to study Copahue and determine areas that have the highest potential of being affected in the event of an eruption. Remote sensing techniques were used to examine and identify volcanic activity and areas vulnerable to experiencing volcanic hazards including volcanic ash, SO2 gas, lava flow, pyroclastic density currents and lahars. Landsat 7 Enhanced Thematic Mapper Plus (ETM+), Landsat 8 Operational Land Imager (OLI), EO-1 Advanced Land Imager (ALI), Terra Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER), Shuttle Radar Topography Mission (SRTM), ISS ISERV Pathfinder, and Aura Ozone Monitoring Instrument (OMI) products were used to analyze volcanic hazards. These datasets were used to create a historic lava flow map of the Copahue volcano by identifying historic lava flows, tephra, and lahars both visually and spectrally. Additionally, a volcanic risk and hazard map for the surrounding area was created by modeling the possible extent of ash fallout, lahars, lava flow, and pyroclastic density currents (PDC) for future eruptions. These model results were then used to identify areas that should be prioritized for disaster relief and evacuation orders.

  4. Conveying Flood Hazard Risk Through Spatial Modeling: A Case Study for Hurricane Sandy-Affected Communities in Northern New Jersey

    NASA Astrophysics Data System (ADS)

    Artigas, Francisco; Bosits, Stephanie; Kojak, Saleh; Elefante, Dominador; Pechmann, Ildiko

    2016-10-01

    The accurate forecast from Hurricane Sandy sea surge was the result of integrating the most sophisticated environmental monitoring technology available. This stands in contrast to the limited information and technology that exists at the community level to translate these forecasts into flood hazard levels on the ground at scales that are meaningful to property owners. Appropriately scaled maps with high levels of certainty can be effectively used to convey exposure to flood hazard at the community level. This paper explores the most basic analysis and data required to generate a relatively accurate flood hazard map to convey inundation risk due to sea surge. A Boolean overlay analysis of four input layers: elevation and slope derived from LiDAR data and distances from streams and catch basins derived from aerial photography and field reconnaissance were used to create a spatial model that explained 55 % of the extent and depth of the flood during Hurricane Sandy. When a ponding layer was added to the previous model to account for depressions that would fill and spill over to nearby areas, the new model explained almost 70 % of the extent and depth of the flood. The study concludes that fairly accurate maps can be created with readily available information and that it is possible to infer a great deal about risk of inundation at the property level, from flood hazard maps. The study goes on to conclude that local communities are encouraged to prepare for disasters, but in reality because of the existing Federal emergency management framework there is very little incentive to do so.

  5. Aqueous and tissue residue-based interspecies correlation estimation models provide conservative hazard estimates for aromatic compounds.

    PubMed

    Bejarano, Adriana C; Barron, Mace G

    2016-01-01

    Interspecies correlation estimation (ICE) models were developed for 30 nonpolar aromatic compounds to allow comparison of prediction accuracy between 2 data compilation approaches. Type 1 models used data combined across studies, and type 2 models used data combined only within studies. Target lipid (TLM) ICE models were also developed using target lipid concentrations of the type 2 model dataset (type 2-TLM). Analyses were performed to assess model prediction uncertainty introduced by each approach. Most statistically significant models (90%; 266 models total) had mean square errors < 0.27 and adjusted coefficients of determination (adj R(2) ) > 0.59, with the lowest amount of variation in mean square errors noted for type 2-TLM followed by type 2 models. Cross-validation success (>0.62) across most models (86% of all models) confirmed the agreement between ICE predicted and observed values. Despite differences in model predictive ability, most predicted values across all 3 ICE model types were within a 2-fold difference of the observed values. As a result, no statistically significant differences (p > 0.05) were found between most ICE-based and empirical species sensitivity distributions (SSDs). In most cases hazard concentrations were within or below the 95% confidence intervals of the direct-empirical SSD-based values, regardless of model choice. Interspecies correlation estimation-based 5th percentile (HC5) values showed a 200- to 900-fold increase as the log KOW increased from 2 to 5.3. Results indicate that ICE models for aromatic compounds provide a statistically based approach for deriving conservative hazard estimates for protecting aquatic life. PMID:26184086
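
    An ICE model is essentially a least-squares regression between log-transformed toxicity values of a surrogate species and a predicted species. The sketch below uses synthetic values rather than the paper's 30-compound dataset; the fitted slope and intercept are illustrative, not reported results.

        # Minimal sketch: interspecies correlation estimation (ICE) as a log10-log10 regression
        # (synthetic toxicity values; coefficients are not from the paper).
        import numpy as np

        surrogate_lc50 = np.array([0.5, 1.2, 3.4, 10.0, 25.0, 60.0])   # mg/L, surrogate species
        predicted_lc50 = np.array([0.8, 1.5, 5.0, 12.0, 30.0, 90.0])   # mg/L, predicted species

        slope, intercept = np.polyfit(np.log10(surrogate_lc50), np.log10(predicted_lc50), 1)

        def ice_predict(surrogate_value_mg_l):
            """Predict the second species' LC50 from the surrogate's LC50 via the fitted ICE model."""
            return 10 ** (intercept + slope * np.log10(surrogate_value_mg_l))

        print(f"predicted LC50 for a surrogate LC50 of 5 mg/L: {ice_predict(5.0):.2f} mg/L")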

  7. Incisors’ proportions in smile esthetics

    PubMed Central

    Alsulaimani, Fahad F; Batwa, Waeil

    2013-01-01

    Aims: To determine whether alteration of the maxillary central and lateral incisors’ length and width, respectively, would affect perceived smile esthetics and to validate the most esthetic length and width, respectively, for the central and lateral incisors. Materials and Methods: Photographic manipulation was undertaken to produce two sets of photographs, each set of four photographs showing the altered width of the lateral incisor and the altered length of the central incisor. The eight produced photographs were assessed by laypeople, dentists and orthodontists. Results: Alteration in the incisors’ proportion affected the relative smile attractiveness for laypeople (n=124), dentists (n=115) and orthodontists (n=68); dentists and orthodontists did not accept lateral width reduction of more than 0.5 mm (P<0.01), which suggests that the lateral to central incisor width ratio ranges from 54% to 62%. However, laypeople did not accept lateral width reduction of more than 1 mm (P<0.01), widening the range to be from 48% to 62%. All groups had zero tolerance for changes in central crown length (P<0.01). Conclusion: All participants recognized changes in the central incisors’ length. For lateral incisors, laypeople were more tolerant than dentists and orthodontists. This suggests that changing incisors’ proportions affects the relative smile attractiveness. PMID:24987650

  8. Cumulative Hazard Ratio Estimation for Treatment Regimes in Sequentially Randomized Clinical Trials

    PubMed Central

    Tang, Xinyu; Wahed, Abdus S.

    2014-01-01

    The proportional hazards model is widely used in survival analysis to allow adjustment for baseline covariates. The proportional hazards assumption may not be valid for treatment regimes that depend on intermediate responses to prior treatments received, and it is not clear how such a model can be adapted to clinical trials employing more than one randomization. Furthermore, since treatment is modified post-baseline, the hazards are unlikely to be proportional across treatment regimes. Although Lokhnygina and Helterbrand (Biometrics 63: 422–428, 2007) introduced the Cox regression method for two-stage randomization designs, their method can only be applied to test the equality of two treatment regimes that share the same maintenance therapy. Moreover, their method does not allow auxiliary variables to be included in the model nor does it account for treatment effects that are not constant over time. In this article, we propose a model that assumes proportionality across covariates within each treatment regime but not across treatment regimes. Comparisons among treatment regimes are performed by testing the log ratio of the estimated cumulative hazards. The ratio of the cumulative hazards across treatment regimes is estimated using a weighted Breslow-type statistic. A simulation study was conducted to evaluate the performance of the estimators and proposed tests. PMID:26085847
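
    As a rough illustration of comparing regimes through cumulative hazards rather than hazard ratios, the sketch below computes unweighted Nelson-Aalen estimates for two synthetic regimes and takes the log of their ratio at a fixed time point; the inverse-probability-weighted Breslow-type estimator and its variance used in the article are not reproduced here.

        import numpy as np

        rng = np.random.default_rng(1)

        def cum_hazard(times, events, t0):
            """Nelson-Aalen estimate of the cumulative hazard at time t0
            (assumes no tied event times; for illustration only)."""
            order = np.argsort(times)
            times = np.asarray(times, dtype=float)[order]
            events = np.asarray(events, dtype=bool)[order]
            n, H = len(times), 0.0
            for i, (t, d) in enumerate(zip(times, events)):
                if t > t0:
                    break
                if d:
                    H += 1.0 / (n - i)   # increment dN(t) / Y(t)
            return H

        # Synthetic survival data for two treatment regimes (months, ~20% censoring).
        t_a, e_a = rng.exponential(12, 200), rng.random(200) < 0.8
        t_b, e_b = rng.exponential(18, 200), rng.random(200) < 0.8

        t0 = 12.0
        log_chr = np.log(cum_hazard(t_a, e_a, t0) / cum_hazard(t_b, e_b, t0))
        print(f"log cumulative-hazard ratio at t = {t0} months: {log_chr:.3f}")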

  9. Influence of Climate Change on Flood Hazard using Climate Informed Bayesian Hierarchical Model in Johnson Creek River

    NASA Astrophysics Data System (ADS)

    Zarekarizi, M.; Moradkhani, H.

    2015-12-01

    Extreme events have been shown to be affected by climate change, influencing hydrologic simulations for which stationarity is usually a main assumption. Studies have discussed that this assumption can lead to large bias in model estimations and consequently to higher flood hazard. Motivated by the importance of non-stationarity, we determined how the exceedance probabilities have changed over time in Johnson Creek River, Oregon. This could help estimate the probability of failure of a structure that was primarily designed, according to common practice, to resist less likely floods. Therefore, we built a climate-informed Bayesian hierarchical model in which non-stationarity was considered in the modeling framework. Principal component analysis shows that the North Atlantic Oscillation (NAO), Western Pacific Index (WPI) and Eastern Asia (EA) indices mostly affect stream flow in this river. We modeled flood extremes using the peaks over threshold (POT) method rather than the conventional annual maximum flood (AMF) approach, mainly because it allows the model to be based on more information. We used available threshold selection methods to select a suitable threshold for the study area. Accounting for non-stationarity, model parameters vary through time with the climate indices. We developed several model scenarios and chose the one that best explained the variation in the data based on performance measures. We also estimated return periods under the non-stationarity condition. Results show that ignoring non-stationarity could underestimate the flood hazard by up to a factor of four, which could increase the probability of an in-stream structure being overtopped.
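
    The peaks-over-threshold building block of such an analysis can be sketched as follows: exceedances above a threshold are fitted with a generalized Pareto distribution and converted into a return level. The snippet uses synthetic data and a stationary fit; in the nonstationary Bayesian hierarchical setting described above, the GPD parameters would instead be regressed on the climate indices.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)

        # Synthetic daily flows (m^3/s) for ~30 years and an illustrative threshold.
        flows = rng.gamma(shape=2.0, scale=15.0, size=30 * 365)
        u = np.quantile(flows, 0.98)
        exceedances = flows[flows > u] - u
        lam = len(exceedances) / 30.0                     # exceedances per year

        # Stationary GPD fit to the exceedances (location fixed at zero).
        xi, _, sigma = stats.genpareto.fit(exceedances, floc=0)

        def return_level(T, u, xi, sigma, lam):
            """T-year return level for a POT/GPD model (xi != 0)."""
            return u + (sigma / xi) * ((lam * T) ** xi - 1.0)

        print(f"100-yr return level (stationary fit): {return_level(100, u, xi, sigma, lam):.1f} m^3/s")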

  10. Models of magma-aquifer interactions and their implications for hazard assessment

    NASA Astrophysics Data System (ADS)

    Strehlow, Karen; Gottsmann, Jo; Tumi Gudmundsson, Magnús

    2014-05-01

    Interactions of magmatic and hydrological systems are manifold, complex and poorly understood. On the one side they bear a significant hazard potential in the form of phreatic explosions or by causing "dry" effusive eruptions to turn into explosive phreatomagmatic events. On the other side, they can equally serve to reduce volcanic risk, as resulting geophysical signals can help to forecast eruptions. It is therefore necessary to put efforts towards answering some outstanding questions regarding magma - aquifer interactions. Our research addresses these problems from two sides. Firstly, aquifers respond to magmatic activity and they can also become agents of unrest themselves. Therefore, monitoring the hydrology can provide a valuable window into subsurface processes in volcanic areas. Changes in temperature and strain conditions, seismic excitation or the injection of magmatic fluids into hydrothermal systems are just a few of the proposed processes induced by magmatic activity that affect the local hydrology. Interpretations of unrest signals as groundwater responses are described for many volcanoes and include changes in water table levels, changes in temperature or composition of hydrothermal waters and pore pressure-induced ground deformation. Volcano observatories can track these hydrological effects for example with potential field investigations or the monitoring of wells. To fully utilise these indicators as monitoring and forecasting tools, however, it is necessary to improve our understanding of the ongoing mechanisms. Our hydrogeophysical study uses finite element analysis to quantitatively test proposed mechanisms of aquifer excitation and the resultant geophysical signals. Secondly, volcanic activity is influenced by the presence of groundwater, including phreatomagmatic and phreatic eruptions. We focus here on phreatic explosions at hydrothermal systems. At least two of these impulsive events occurred in 2013: In August at the Icelandic volcano

  11. Modeling retrospective attribution of responsibility to hazard-managing institutions: an example involving a food contamination incident.

    PubMed

    Johnson, Branden B; Hallman, William K; Cuite, Cara L

    2015-03-01

    Perceptions of institutions that manage hazards are important because they can affect how the public responds to hazard events. Antecedents of trust judgments have received far more attention than antecedents of attributions of responsibility for hazard events. We build upon a model of retrospective attribution of responsibility to individuals to examine these relationships regarding five classes of institutions that bear responsibility for food safety: producers (e.g., farmers), processors (e.g., packaging firms), watchdogs (e.g., government agencies), sellers (e.g., supermarkets), and preparers (e.g., restaurants). A nationally representative sample of 1,200 American adults completed an Internet-based survey in which a hypothetical scenario involving contamination of diverse foods with Salmonella served as the stimulus event. Perceived competence and good intentions of the institution moderately decreased attributions of responsibility. A stronger factor was whether an institution was deemed (potentially) aware of the contamination and free to act to prevent or mitigate it. Responsibility was rated higher the more aware and free the institution. This initial model for attributions of responsibility to impersonal institutions (as opposed to individual responsibility) merits further development.

  12. A Fault-based Crustal Deformation Model for UCERF3 and Its Implication to Seismic Hazard Analysis

    NASA Astrophysics Data System (ADS)

    Zeng, Y.; Shen, Z.

    2012-12-01

    shear zone and northern Walker Lane. This implies a significant increase in seismic hazard in the eastern California and northern Walker Lane region, but decreased seismic hazard in the southern San Andreas area, relative to the current model used in the USGS 2008 seismic hazard map evaluation. Overall the geodetic model suggests an increase in total regional moment rate of 24% compared with the UCERF2 model and the 150-yr California earthquake catalog. However not all the increases are seismic so the seismic/aseismic slip rate ratios are critical for future seismic hazard assessment.

  13. An integrated approach to flood hazard assessment on alluvial fans using numerical modeling, field mapping, and remote sensing

    USGS Publications Warehouse

    Pelletier, J.D.; Mayer, L.; Pearthree, P.A.; House, P.K.; Demsey, K.A.; Klawon, J.K.; Vincent, K.R.

    2005-01-01

    Millions of people in the western United States live near the dynamic, distributary channel networks of alluvial fans where flood behavior is complex and poorly constrained. Here we test a new comprehensive approach to alluvial-fan flood hazard assessment that uses four complementary methods: two-dimensional raster-based hydraulic modeling, satellite-image change detection, field-based mapping of recent flood inundation, and surficial geologic mapping. Each of these methods provides spatial detail lacking in the standard method and each provides critical information for a comprehensive assessment. Our numerical model simultaneously solves the continuity equation and Manning's equation (Chow, 1959) using an implicit numerical method. It provides a robust numerical tool for predicting flood flows using the large, high-resolution Digital Elevation Models (DEMs) necessary to resolve the numerous small channels on the typical alluvial fan. Inundation extents and flow depths of historic floods can be reconstructed with the numerical model and validated against field- and satellite-based flood maps. A probabilistic flood hazard map can also be constructed by modeling multiple flood events with a range of specified discharges. This map can be used in conjunction with a surficial geologic map to further refine floodplain delineation on fans. To test the accuracy of the numerical model, we compared model predictions of flood inundation and flow depths against field- and satellite-based flood maps for two recent extreme events on the southern Tortolita and Harquahala piedmonts in Arizona. Model predictions match the field- and satellite-based maps closely. Probabilistic flood hazard maps based on the 10 yr, 100 yr, and maximum floods were also constructed for the study areas using stream gage records and paleoflood deposits. The resulting maps predict spatially complex flood hazards that strongly reflect small-scale topography and are consistent with surficial geology. In
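
    The hydraulic core of such a raster model rests on Manning's equation for flow velocity combined with mass continuity. The sketch below evaluates a single cell with assumed values; the implicit raster solver over a high-resolution DEM described above is, of course, far more involved.

        def manning_velocity(n, hydraulic_radius, slope):
            """Mean flow velocity (m/s) from Manning's equation in SI units:
            v = (1/n) * R**(2/3) * S**(1/2)."""
            return (1.0 / n) * hydraulic_radius ** (2.0 / 3.0) * slope ** 0.5

        # Illustrative values for shallow sheet flow on a fan surface (assumed).
        n = 0.035        # Manning roughness coefficient
        depth = 0.3      # flow depth (m); for wide shallow flow, R is roughly the depth
        slope = 0.02     # dimensionless energy slope
        v = manning_velocity(n, depth, slope)
        q = v * depth    # unit discharge per metre of width (m^2/s), from continuity
        print(f"velocity = {v:.2f} m/s, unit discharge = {q:.3f} m^2/s")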

  14. Methodology to assess potential glint and glare hazards from concentrating solar power plants : analytical models and experimental validation.

    SciTech Connect

    Diver, Richard B., Jr.; Ghanbari, Cheryl M.; Ho, Clifford Kuofei

    2010-04-01

    With growing numbers of concentrating solar power systems being designed and developed, glint and glare from concentrating solar collectors and receivers is receiving increased attention as a potential hazard or distraction for motorists, pilots, and pedestrians. This paper provides analytical methods to evaluate the irradiance originating from specularly and diffusely reflecting sources as a function of distance and characteristics of the source. Sample problems are provided for both specular and diffuse sources, and validation of the models is performed via testing. In addition, a summary of safety metrics is compiled from the literature to evaluate the potential hazards of calculated irradiances from glint and glare. Previous safety metrics have focused on prevention of permanent eye damage (e.g., retinal burn). New metrics used in this paper account for temporary flash blindness, which can occur at irradiance values several orders of magnitude lower than the irradiance values required for irreversible eye damage.
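
    For the diffuse case, a first-order estimate of the irradiance at an observer can be written as an inverse-square falloff from a Lambertian reflector. The sketch below is a simplified illustration with assumed values, not the paper's validated formulation; the result would then be compared against retinal-burn and flash-blindness metrics.

        import math

        def diffuse_irradiance(dni, reflectivity, area, distance, cos_theta=1.0):
            """Approximate irradiance (W/m^2) at distance d from a diffusely
            (Lambertian) reflecting surface: E ~ rho * DNI * A * cos(theta) / (pi d^2).
            Simplified sketch; ignores atmospheric attenuation and source geometry."""
            return reflectivity * dni * area * cos_theta / (math.pi * distance ** 2)

        dni = 1000.0   # direct normal irradiance (W/m^2), typical clear-sky value
        E = diffuse_irradiance(dni, reflectivity=0.9, area=10.0, distance=200.0)
        print(f"irradiance at 200 m: {E:.3f} W/m^2")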

  15. Debris flow hazard assessment by integrated modeling of landslide triggering and propagation: application to the Messina Province, Italy

    NASA Astrophysics Data System (ADS)

    Stancanelli, L. M.; Peres, D. J.; Cavallaro, L.; Cancelliere, A.; Foti, E.

    2014-12-01

    During the last decades an increase in catastrophic debris flow events has been recorded across the Italian territory, mainly due to the increase of settlements and human activities in mountain areas. Considering the large extent of debris flow prone areas, non-structural protection strategies should preferably be implemented because of the economic constraints associated with structural mitigation measures. In such a framework hazard assessment methodologies play a key role, representing useful tools for the development of emergency management policies. The aim of the present study is to apply an integrated debris flow hazard assessment methodology, where rainfall probabilistic analysis and physically-based landslide triggering and propagation models are combined. In particular, the probabilistic rainfall analysis provides the forcing scenarios of different return periods, which are then used as input to a model based on the combination of the USGS TRIGRS and the FLO-2D codes. The TRIGRS model (Baum et al., 2008; 2010), developed for analyzing shallow landslide triggering, is based on an analytical solution of linearized forms of the Richards' infiltration equation and an infinite-slope stability calculation to estimate the timing and locations of slope failures, while FLO-2D (O'Brien 1986) is a two-dimensional finite difference model that simulates debris flow propagation following a mono-phase approach, based on the empirical quadratic rheological relation developed by O'Brien and Julien (1985). Various aspects of the combination of the models are analyzed, with a particular focus on the possible variations of triggered amounts compatible with a given return period. The methodology is applied to the case study area of the Messina Province in Italy, which has been recently struck by severe events, such as the one of 1 October 2009 which hit the Giampilieri Village causing 37 fatalities. Results are analyzed to assess the potential hazard that may affect the densely
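
    The triggering side of such a model chain ultimately reduces, cell by cell, to an infinite-slope factor of safety that compares resisting and driving stresses under a transient pressure head. The snippet below is a sketch in the spirit of that calculation with assumed soil parameters; it is not the TRIGRS code itself.

        import math

        def factor_of_safety(c, phi_deg, slope_deg, depth, psi,
                             gamma_s=20.0, gamma_w=9.81):
            """Infinite-slope factor of safety at depth z with pressure head psi (m).
            Units: c in kPa, angles in degrees, depth in m, unit weights in kN/m^3."""
            phi, a = math.radians(phi_deg), math.radians(slope_deg)
            return (math.tan(phi) / math.tan(a)
                    + (c - psi * gamma_w * math.tan(phi))
                    / (gamma_s * depth * math.sin(a) * math.cos(a)))

        # Same hillslope cell, dry versus wetted by infiltration (values assumed).
        print(factor_of_safety(c=5.0, phi_deg=32.0, slope_deg=35.0, depth=2.0, psi=0.0))  # ~1.16
        print(factor_of_safety(c=5.0, phi_deg=32.0, slope_deg=35.0, depth=2.0, psi=1.5))  # <1, failure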

  16. The Prospect of using Three-Dimensional Earth Models To Improve Nuclear Explosion Monitoring and Ground Motion Hazard Assessment

    SciTech Connect

    Antoun, T; Harris, D; Lay, T; Myers, S C; Pasyanos, M E; Richards, P; Rodgers, A J; Walter, W R; Zucca, J J

    2008-02-11

    The last ten years have brought rapid growth in the development and use of three-dimensional (3D) seismic models of earth structure at crustal, regional and global scales. In order to explore the potential for 3D seismic models to contribute to important societal applications, Lawrence Livermore National Laboratory (LLNL) hosted a 'Workshop on Multi-Resolution 3D Earth Models to Predict Key Observables in Seismic Monitoring and Related Fields' on June 6 and 7, 2007 in Berkeley, California. The workshop brought together academic, government and industry leaders in the research programs developing 3D seismic models and methods for the nuclear explosion monitoring and seismic ground motion hazard communities. The workshop was designed to assess the current state of work in 3D seismology and to discuss a path forward for determining if and how 3D earth models and techniques can be used to achieve measurable increases in our capabilities for monitoring underground nuclear explosions and characterizing seismic ground motion hazards. This paper highlights some of the presentations, issues, and discussions at the workshop and proposes a path by which to begin quantifying the potential contribution of progressively refined 3D seismic models in critical applied arenas.

  17. Parametric Hazard Function Estimation.

    1999-09-13

    Version 00 Phaze performs statistical inference calculations on a hazard function (also called a failure rate or intensity function) based on reported failure times of components that are repaired and restored to service. Three parametric models are allowed: the exponential, linear, and Weibull hazard models. The inference includes estimation (maximum likelihood estimators and confidence regions) of the parameters and of the hazard function itself, testing of hypotheses such as increasing failure rate, and checking of the model assumptions.
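
    For the Weibull (power-law) hazard case, the maximum likelihood estimators have a closed form when a repairable unit is observed over a fixed interval (0, T]. The sketch below implements that textbook estimator on hypothetical failure times; it illustrates the model class, not the PHAZE code.

        import numpy as np

        def powerlaw_process_mle(failure_times, T):
            """Closed-form MLEs for a power-law (Weibull-hazard) intensity
            w(t) = (beta/theta) * (t/theta)**(beta - 1), observed on (0, T]."""
            t = np.asarray(failure_times, dtype=float)
            n = len(t)
            beta = n / np.sum(np.log(T / t))
            theta = T / n ** (1.0 / beta)
            return beta, theta

        # Hypothetical failure times (hours) of a repaired component over 1000 h.
        times = [80, 210, 350, 520, 640, 730, 810, 900, 960]
        beta, theta = powerlaw_process_mle(times, T=1000.0)
        trend = "increasing" if beta > 1 else "decreasing"
        print(f"beta = {beta:.2f} ({trend} failure rate), theta = {theta:.1f} h")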

  18. Inter-Neighborhood Migration, Race, and Environmental Hazards: Modeling Micro-Level Processes of Environmental Inequality

    PubMed Central

    Crowder, Kyle; Downey, Liam

    2009-01-01

    This study combines data from the Panel Study of Income Dynamics with neighborhood-level industrial hazard data from the Environmental Protection Agency to examine the extent and sources of environmental inequality at the individual level. Results indicate that profound racial and ethnic differences in proximity to industrial pollution persist when differences in individual education, household income, and other micro-level characteristics are controlled. Examination of underlying migration patterns further reveals that black and Latino householders move into neighborhoods with significantly higher hazard levels than do comparable whites, and that racial differences in proximity to neighborhood pollution are maintained more by these disparate mobility destinations than by differential effects of pollution on the decision to move. PMID:20503918

  19. Hazard function theory for nonstationary natural hazards

    NASA Astrophysics Data System (ADS)

    Read, L. K.; Vogel, R. M.

    2015-11-01

    Impact from natural hazards is a shared global problem that causes tremendous loss of life and property, economic cost, and damage to the environment. Increasingly, many natural processes show evidence of nonstationary behavior including wind speeds, landslides, wildfires, precipitation, streamflow, sea levels, and earthquakes. Traditional probabilistic analysis of natural hazards based on peaks over threshold (POT) generally assumes stationarity in the magnitudes and arrivals of events, i.e. that the probability of exceedance of some critical event is constant through time. Given increasing evidence of trends in natural hazards, new methods are needed to characterize their probabilistic behavior. The well-developed field of hazard function analysis (HFA) is ideally suited to this problem because its primary goal is to describe changes in the exceedance probability of an event over time. HFA is widely used in medicine, manufacturing, actuarial statistics, reliability engineering, economics, and elsewhere. HFA provides a rich theory to relate the natural hazard event series (X) with its failure time series (T), enabling computation of corresponding average return periods, risk and reliabilities associated with nonstationary event series. This work investigates the suitability of HFA to characterize nonstationary natural hazards whose POT magnitudes are assumed to follow the widely applied Generalized Pareto (GP) model. We derive the hazard function for this case and demonstrate how metrics such as reliability and average return period are impacted by nonstationarity and discuss the implications for planning and design. Our theoretical analysis linking hazard event series X, with corresponding failure time series T, should have application to a wide class of natural hazards with rich opportunities for future extensions.

  20. Hazard function theory for nonstationary natural hazards

    NASA Astrophysics Data System (ADS)

    Read, Laura K.; Vogel, Richard M.

    2016-04-01

    Impact from natural hazards is a shared global problem that causes tremendous loss of life and property, economic cost, and damage to the environment. Increasingly, many natural processes show evidence of nonstationary behavior including wind speeds, landslides, wildfires, precipitation, streamflow, sea levels, and earthquakes. Traditional probabilistic analysis of natural hazards based on peaks over threshold (POT) generally assumes stationarity in the magnitudes and arrivals of events, i.e., that the probability of exceedance of some critical event is constant through time. Given increasing evidence of trends in natural hazards, new methods are needed to characterize their probabilistic behavior. The well-developed field of hazard function analysis (HFA) is ideally suited to this problem because its primary goal is to describe changes in the exceedance probability of an event over time. HFA is widely used in medicine, manufacturing, actuarial statistics, reliability engineering, economics, and elsewhere. HFA provides a rich theory to relate the natural hazard event series (X) with its failure time series (T), enabling computation of corresponding average return periods, risk, and reliabilities associated with nonstationary event series. This work investigates the suitability of HFA to characterize nonstationary natural hazards whose POT magnitudes are assumed to follow the widely applied generalized Pareto model. We derive the hazard function for this case and demonstrate how metrics such as reliability and average return period are impacted by nonstationarity and discuss the implications for planning and design. Our theoretical analysis linking hazard random variable X with corresponding failure time series T should have application to a wide class of natural hazards with opportunities for future extensions.
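
    One convenient consequence of the generalized Pareto assumption is that the hazard function of the POT magnitudes has a simple closed form, h(x) = 1 / (sigma + xi * x). The snippet below checks this identity numerically; it is a small illustration of the distributional building block, not the authors' full derivation linking X and the failure time series T.

        import numpy as np
        from scipy.stats import genpareto

        xi, sigma = 0.2, 5.0                       # assumed GPD shape and scale
        x = np.linspace(0.0, 30.0, 7)

        h_closed = 1.0 / (sigma + xi * x)          # closed-form hazard
        h_numeric = genpareto.pdf(x, xi, scale=sigma) / genpareto.sf(x, xi, scale=sigma)
        print(np.allclose(h_closed, h_numeric))    # True; hazard decreases in x for xi > 0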

  1. Hazard function theory for nonstationary natural hazards

    DOE PAGES

    Read, Laura K.; Vogel, Richard M.

    2016-04-11

    Impact from natural hazards is a shared global problem that causes tremendous loss of life and property, economic cost, and damage to the environment. Increasingly, many natural processes show evidence of nonstationary behavior including wind speeds, landslides, wildfires, precipitation, streamflow, sea levels, and earthquakes. Traditional probabilistic analysis of natural hazards based on peaks over threshold (POT) generally assumes stationarity in the magnitudes and arrivals of events, i.e., that the probability of exceedance of some critical event is constant through time. Given increasing evidence of trends in natural hazards, new methods are needed to characterize their probabilistic behavior. The well-developed field of hazard function analysis (HFA) is ideally suited to this problem because its primary goal is to describe changes in the exceedance probability of an event over time. HFA is widely used in medicine, manufacturing, actuarial statistics, reliability engineering, economics, and elsewhere. HFA provides a rich theory to relate the natural hazard event series (X) with its failure time series (T), enabling computation of corresponding average return periods, risk, and reliabilities associated with nonstationary event series. This work investigates the suitability of HFA to characterize nonstationary natural hazards whose POT magnitudes are assumed to follow the widely applied generalized Pareto model. We derive the hazard function for this case and demonstrate how metrics such as reliability and average return period are impacted by nonstationarity and discuss the implications for planning and design. As a result, our theoretical analysis linking hazard random variable X with corresponding failure time series T should have application to a wide class of natural hazards with opportunities for future extensions.

  2. Hazard assessment of the Gschliefgraben earth flow (Austria) based on monitoring data and evolution modelling

    NASA Astrophysics Data System (ADS)

    Poisel, R.; Preh, A.; Hofmann, R.; Schiffer, M.; Sausgruber, Th.

    2009-04-01

    A rock slide onto the clayey-silty-sandy-pebbly masses in the Gschliefgraben (Upper Austria province, Lake Traunsee), which occurred in 2006, together with the humid autumn of 2007, triggered an earth flow comprising a volume of up to 5 million m³ and moving with a maximum displacement velocity of 5 m/day during the winter of 2007-2008. The possible damage was estimated at up to 60 million € due to the possible destruction of houses and of a road to a settlement with heavy tourism. Exploratory drillings revealed that the moving mass consists of an alternate bedding of thicker, less permeable clayey-silty layers and thinner, more permeable silty-sandy-pebbly layers. The movement front ran ahead in the creek bed. Therefore it was assumed that water played an important role and that the earth flow moved due to soaking of water into the ground from the area of the rock slide downslope. Inclinometer measurements showed that the uppermost, less permeable layer was sliding on a thin, more permeable layer. The movement process was analysed by numerical models (FLAC) and by conventional calculations in order to assess the hazard. The coupled flow and mechanical models showed that sections of the less permeable layer soaked with water were sliding on the thin, more permeable layer due to excessive watering out of the more permeable layer. These sections were thrust over the downward lying, less soaked areas, which therefore had higher strength. The material thrust over the downward lying, less soaked areas, together with the moving front of pore water pressures, caused the downward material to fail and to be thrust over the downslope lying material at a distance of some 50 m. Thus a cyclic process was created without any indication of a sudden sliding of the complete less permeable layer. Nevertheless, the inhabitants of 15 houses had to be evacuated for safety reasons. They could return to their homes after displacement velocities had decreased. Displacement monitoring by GPS showed that

  3. Gaussian mixture models for measuring local change down-track in LWIR imagery for explosive hazard detection

    NASA Astrophysics Data System (ADS)

    Spain, Christopher J.; Anderson, Derek T.; Keller, James M.; Popescu, Mihail; Stone, Kevin E.

    2011-06-01

    Burying objects below the ground can potentially alter their thermal properties. Moreover, there is often soil disturbance associated with recently buried objects. An intensity video frame image generated by an infrared camera in the medium and long wavelengths often locally varies in the presence of buried explosive hazards. Our approach to automatically detecting these anomalies is to estimate a background model of the image sequence. Pixel values that do not conform to the background model may represent local changes in thermal or soil signature caused by buried objects. Herein, we present a Gaussian mixture model-based technique to estimate the statistical model of background pixel values. The background model is used to detect anomalous pixel values on the road while a vehicle is moving. Foreground pixel confidence values are projected into the UTM coordinate system and a UTM confidence map is built. Different operating levels are explored and the connected component algorithm is then used to extract islands that are subjected to size, shape and orientation filters. We are currently using this approach as a feature in a larger multi-algorithm fusion system. However, in this article we also present results for using this algorithm as a stand-alone detector algorithm in order to further explore its value in detecting buried explosive hazards.
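
    The background-modeling step can be illustrated with an off-the-shelf Gaussian mixture fit to "normal" pixel intensities, flagging pixels whose likelihood under that model falls below a chosen cutoff. The sketch below uses synthetic intensities and an assumed threshold; the UTM projection, connected-component extraction and shape filters described above are omitted.

        import numpy as np
        from sklearn.mixture import GaussianMixture

        rng = np.random.default_rng(0)

        # Synthetic background intensities from undisturbed road surface (two modes).
        background = np.concatenate([rng.normal(100, 5, 5000),
                                     rng.normal(140, 8, 3000)]).reshape(-1, 1)
        gmm = GaussianMixture(n_components=2, random_state=0).fit(background)

        # Pixels with low log-likelihood under the background model are anomalies.
        threshold = np.quantile(gmm.score_samples(background), 0.01)   # assumed cutoff
        new_pixels = np.array([[102.0], [138.0], [190.0]])
        is_anomaly = gmm.score_samples(new_pixels) < threshold
        print(is_anomaly)   # e.g. [False False  True]: the last pixel is flagged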

  4. Landslide Hazard Assessment and Mapping in the Guil Catchment (Queyras, Southern French Alps): From Landslide Inventory to Susceptibility Modelling

    NASA Astrophysics Data System (ADS)

    Roulleau, Louise; Bétard, François; Carlier, Benoît; Lissak, Candide; Fort, Monique

    2016-04-01

    Landslides are common natural hazards in the Southern French Alps, where they may affect human lives and cause severe damages to infrastructures. As a part of the SAMCO research project dedicated to risk evaluation in mountain areas, this study focuses on the Guil river catchment (317 km2), Queyras, to assess landslide hazard poorly studied until now. In that area, landslides are mainly occasional, low amplitude phenomena, with limited direct impacts when compared to other hazards such as floods or snow avalanches. However, when interacting with floods during extreme rainfall events, landslides may have indirect consequences of greater importance because of strong hillslope-channel connectivity along the Guil River and its tributaries (i.e. positive feedbacks). This specific morphodynamic functioning reinforces the need to have a better understanding of landslide hazards and their spatial distribution at the catchment scale to prevent local population from disasters with multi-hazard origin. The aim of this study is to produce a landslide susceptibility mapping at 1:50 000 scale as a first step towards global estimation of landslide hazard and risk. The three main methodologies used for assessing landslide susceptibility are qualitative (i.e. expert opinion), deterministic (i.e. physics-based models) and statistical methods (i.e. probabilistic models). Due to the rapid development of geographical information systems (GIS) during the last two decades, statistical methods are today widely used because they offer a greater objectivity and reproducibility at large scales. Among them, multivariate analyses are considered as the most robust techniques, especially the logistic regression method commonly used in landslide susceptibility mapping. However, this method like others is strongly dependent on the accuracy of the input data to avoid significant errors in the final results. In particular, a complete and accurate landslide inventory is required before the modelling
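
    A logistic-regression susceptibility model of the kind referred to above can be sketched in a few lines: per-cell predictors are regressed against landslide presence or absence from an inventory, and the fitted probabilities form the susceptibility map. The data and predictors below are hypothetical placeholders.

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(0)
        n_cells = 1000

        # Hypothetical per-cell predictors: slope (deg), distance to stream (m),
        # and a land-cover flag; response: landslide presence from an inventory.
        X = np.column_stack([rng.uniform(0, 45, n_cells),
                             rng.uniform(0, 500, n_cells),
                             rng.integers(0, 2, n_cells)])
        y = (X[:, 0] / 45 + rng.normal(0, 0.3, n_cells) > 0.7).astype(int)

        model = LogisticRegression(max_iter=1000).fit(X, y)
        susceptibility = model.predict_proba(X)[:, 1]      # per-cell probability
        print(f"mean modeled susceptibility: {susceptibility.mean():.2f}")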

  5. A hazard-based duration model for analyzing crossing behavior of cyclists and electric bike riders at signalized intersections.

    PubMed

    Yang, Xiaobao; Huan, Mei; Abdel-Aty, Mohamed; Peng, Yichuan; Gao, Ziyou

    2015-01-01

    This paper presents a hazard-based duration approach to investigate riders' waiting times, violation hazards, associated risk factors, and their differences between cyclists and electric bike riders at signalized intersections. A total of 2322 two-wheeled riders approaching the intersections during red light periods were observed in Beijing, China. The data were classified into censored and uncensored data to distinguish between safe crossing and red-light running behavior. The results indicated that the red-light crossing behavior of most riders was dependent on waiting time. They were inclined to terminate waiting behavior and run against the traffic light as the waiting duration increased. Over half of the observed riders could not endure waiting 49 s or longer, while 25% of the riders could endure 97 s or longer. Rider type, gender, waiting position, conformity tendency and crossing traffic volume were identified to have significant effects on riders' waiting times and violation hazards. Electric bike riders were found to be more sensitive to external risk factors such as other riders' crossing behavior and crossing traffic volume than cyclists. Moreover, unobserved heterogeneity was examined in the proposed models. The findings of this paper can explain when and why cyclists and electric bike riders run against the red light at intersections. The results of this paper are useful for traffic design and management agencies in implementing strategies to enhance the safety of riders. PMID:25463942
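
    A generic hazard-based duration model for right-censored waiting times can be fitted with standard survival tooling, as sketched below using the lifelines package on synthetic data. This is only a stand-in illustration of the model family (a Cox proportional hazards fit); the paper's specific duration models and heterogeneity terms are not reproduced.

        import numpy as np
        import pandas as pd
        from lifelines import CoxPHFitter

        rng = np.random.default_rng(0)
        n = 500

        # Synthetic data: waiting time (s) at the red light; violation = 1 means the
        # rider crossed against the light, 0 means they waited (right-censored).
        df = pd.DataFrame({
            "waiting_time": rng.exponential(60, n),
            "violation": rng.integers(0, 2, n),
            "ebike": rng.integers(0, 2, n),          # 1 = electric bike, 0 = bicycle
            "traffic_volume": rng.normal(0, 1, n),   # standardized crossing volume
        })

        cph = CoxPHFitter()
        cph.fit(df, duration_col="waiting_time", event_col="violation")
        cph.print_summary()    # hazard ratios for rider type and traffic volume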

  7. Challenging the principle of proportionality.

    PubMed

    Andersson, Anna-Karin Margareta

    2016-04-01

    The first objective of this article is to examine one aspect of the principle of proportionality (PP) as advanced by Alan Gewirth in his 1978 book Reason and Morality. Gewirth claims that being capable of exercising agency to some minimal degree is a property that justifies having at least prima facie rights not to be killed. However, according to the PP, before the being possesses the capacity for exercising agency to that minimal degree, the extent of her rights depends on the extent to which she approaches possession of agential capacities. One interpretation of the PP holds that variations in degree of possession of the physical constitution necessary to exercise agency are morally relevant. The other interpretation holds that only variations in degree of actual mental capacity are morally relevant. The first of these interpretations is vastly more problematic than the other. The second objective is to argue that according to the most plausible interpretation of the PP, the fetus' level of development before at least the 20th week of pregnancy does not affect the fetus' moral rights status. I then suggest that my argument is not restricted to such fetuses, although extending my argument to more developed fetuses requires caution. PMID:26839114

  8. A spatial hazard model for cluster detection on continuous indicators of disease: application to somatic cell score.

    PubMed

    Gay, Emilie; Senoussi, Rachid; Barnouin, Jacques

    2007-01-01

    Methods for spatial cluster detection dealing with diseases quantified by continuous variables are few, whereas several diseases are better approached by continuous indicators. For example, subclinical mastitis of the dairy cow is evaluated using a continuous marker of udder inflammation, the somatic cell score (SCS). Consequently, this study proposed to analyze spatialized risk and cluster components of herd SCS through a new method based on a spatial hazard model. The dataset included annual SCS for 34 142 French dairy herds for the year 2000, and important SCS risk factors: mean parity, percentage of winter and spring calvings, and herd size. The model allowed the simultaneous estimation of the effects of known risk factors and of potential spatial clusters on SCS, and the mapping of the estimated clusters and their range. Mean parity and winter and spring calvings were significantly associated with subclinical mastitis risk. The model with the presence of 3 clusters was highly significant, and the 3 clusters were attractive, i.e. closeness to cluster center increased the occurrence of high SCS. The three localizations were the following: close to the city of Troyes in the northeast of France; around the city of Limoges in the center-west; and in the southwest close to the city of Tarbes. The semi-parametric method based on spatial hazard modeling applies to continuous variables, and takes account of both risk factors and potential heterogeneity of the background population. This tool allows a quantitative detection but assumes a spatially specified form for clusters.

  9. On the development of a seismic source zonation model for seismic hazard assessment in western Saudi Arabia

    NASA Astrophysics Data System (ADS)

    Zahran, Hani M.; Sokolov, Vladimir; Roobol, M. John; Stewart, Ian C. F.; El-Hadidy Youssef, Salah; El-Hadidy, Mahmoud

    2016-07-01

    A new seismic source model has been developed for the western part of the Arabian Peninsula, which has experienced considerable earthquake activity in the historical past and in recent times. The data used for the model include an up-to-date seismic catalog, results of recent studies of Cenozoic faulting in the area, aeromagnetic anomaly and gravity maps, geological maps, and miscellaneous information on volcanic activity. The model includes 18 zones ranging along the Red Sea and the Arabian Peninsula from the Gulf of Aqaba and the Dead Sea in the north to the Gulf of Aden in the south. The seismic source model developed in this study may be considered as one of the basic branches in a logic tree approach for seismic hazard assessment in Saudi Arabia and adjacent territories.

  10. On Modeling the Radiation Hazards Along the Trajectories of Space Vehicles for Various Purposes

    NASA Astrophysics Data System (ADS)

    Grichshenko, Valentina

    2016-07-01

    The paper discusses the results of simulating the radiation hazard along the trajectories of low-orbit spacecraft for various purposes, as well as geostationary and navigation satellites. Criteria for the reliability of memory cells in space are developed, taking into account the influence of cosmic rays (CR) and differences in the geophysical and geomagnetic conditions along the spacecraft (SV) orbit. Numerical values of the vertical geomagnetic rigidity and the CR flux, together with an assessment of correlated failures of memory cells along low-orbit spacecraft trajectories, are presented. The results obtained are used to forecast the radiation situation along the SV orbit and the reliability of memory cells in space, and to optimize the nominal equipment kit and payload of Kazakhstan spacecraft.

  11. Snow-avalanche modeling and hazard level assessment using statistical and physical modeling, DSS and WebGIS: case study from Czechia

    NASA Astrophysics Data System (ADS)

    Blahut, J.; Balek, J.; Juras, R.; Klimes, J.; Klose, Z.; Roubinek, J.; Pavlasek, J.

    2014-12-01

    Snow-avalanche modeling and hazard level assessment are important issues to be solved within mountain regions worldwide. In Czechia, there are two mountain ranges (Krkonoše and Jeseníky Mountains), which suffer from regular avalanche activity every year. The Mountain Rescue Service is responsible for issuing avalanche bulletins. However, its approaches are still lacking objective assessments and procedures for hazard level estimation. This lack is mainly caused by the absence of an expert avalanche information system. This paper presents preliminary results from a project funded by the Ministry of Interior of the Czech Republic. This project is focused on the development of an information system for snow-avalanche hazard level forecasting. It is composed of three main modules, which should act as a Decision Support System (DSS) for the Mountain Rescue Service. Firstly, a snow-avalanche susceptibility model is used for delimiting areas where avalanches can occur, based on accurate statistical analyses. For that purpose a vast database is used, containing more than 1100 avalanche events from 1961/62 to the present. Secondly, physical modeling of the avalanches is being performed on avalanche paths using the RAMMS modeling code. Regular paths, where avalanches occur every year, and irregular paths are being assessed. Their footprint is being updated using return period information for each path. Thirdly, snow distribution and stability models (distributed HBV-ETH, Snowtran 3D, Snowpack and Alpine 3D) are used to assess the critical conditions for avalanche release. For calibration of the models, meteorological, snow-cover and snowpit data are used. Those three parts are being coupled in a WebGIS platform used as the principal component of the DSS in snow-avalanche hazard level assessment.

  12. Hazardous materials

    MedlinePlus

    ... people how to work with hazardous materials and waste. There are many different kinds of hazardous materials, including: Chemicals, like some that are used for cleaning Drugs, like chemotherapy to treat cancer Radioactive material that is used for x-rays or ...

  13. Modeling hydrologic and geomorphic hazards across post-fire landscapes using a self-organizing map approach

    USGS Publications Warehouse

    Friedel, M.J.

    2011-01-01

    Few studies attempt to model the range of possible post-fire hydrologic and geomorphic hazards because of the sparseness of data and the coupled, nonlinear, spatial, and temporal relationships among landscape variables. In this study, a type of unsupervised artificial neural network, called a self-organized map (SOM), is trained using data from 540 burned basins in the western United States. The sparsely populated data set includes variables from independent numerical landscape categories (climate, land surface form, geologic texture, and post-fire condition), independent landscape classes (bedrock geology and state), and dependent initiation processes (runoff, landslide, and runoff and landslide combination) and responses (debris flows, floods, and no events). Pattern analysis of the SOM-based component planes is used to identify and interpret relations among the variables. Application of the Davies-Bouldin criteria following k-means clustering of the SOM neurons identified eight conceptual regional models for focusing future research and empirical model development. A split-sample validation on 60 independent basins (not included in the training) indicates that simultaneous predictions of initiation process and response types are at least 78% accurate. As climate shifts from wet to dry conditions, forecasts across the burned landscape reveal a decreasing trend in the total number of debris flow, flood, and runoff events with considerable variability among individual basins. These findings suggest the SOM may be useful in forecasting real-time post-fire hazards, and long-term post-recovery processes and effects of climate change scenarios. © 2011.
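
    The SOM-plus-clustering pipeline can be sketched with the third-party minisom package and scikit-learn, as below: train a map on standardized basin variables, then cluster the neuron code vectors with k-means. The data, grid size, and k = 8 are placeholders echoing the study's setup, not its actual inputs.

        import numpy as np
        from minisom import MiniSom          # third-party package, assumed installed
        from sklearn.cluster import KMeans

        rng = np.random.default_rng(0)

        # Hypothetical standardized basin-level variables for 540 burned basins.
        X = rng.normal(0, 1, (540, 6))

        som = MiniSom(10, 10, X.shape[1], sigma=1.5, learning_rate=0.5, random_seed=0)
        som.train_random(X, 5000)

        # Cluster the trained neuron weights into conceptual regional models.
        codebook = som.get_weights().reshape(-1, X.shape[1])
        labels = KMeans(n_clusters=8, n_init=10, random_state=0).fit_predict(codebook)
        print(f"{len(set(labels))} neuron clusters (candidate regional models)")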

  14. Planning ahead for asteroid and comet hazard mitigation, phase 1: parameter space exploration and scenario modeling

    SciTech Connect

    Plesko, Catherine S; Clement, R Ryan; Weaver, Robert P; Bradley, Paul A; Huebner, Walter F

    2009-01-01

    The mitigation of impact hazards resulting from Earth-approaching asteroids and comets has received much attention in the popular press. However, many questions remain about the near-term and long-term feasibility and appropriate application of all proposed methods. Recent and ongoing ground- and space-based observations of small solar-system body composition and dynamics have revolutionized our understanding of these bodies (e.g., Ryan (2000), Fujiwara et al. (2006), and Jedicke et al. (2006)). Ongoing increases in computing power and algorithm sophistication make it possible to calculate the response of these inhomogeneous objects to proposed mitigation techniques. Here we present the first phase of a comprehensive hazard mitigation planning effort undertaken by Southwest Research Institute and Los Alamos National Laboratory. We begin by reviewing the parameter space of the object's physical and chemical composition and trajectory. We then use the radiation hydrocode RAGE (Gittings et al. 2008), Monte Carlo N-Particle (MCNP) radiation transport (see Clement et al., this conference), and N-body dynamics codes to explore the effects these variations in object properties have on the coupling of energy into the object from a variety of mitigation techniques, including deflection and disruption by nuclear and conventional munitions, and a kinetic impactor.

  15. STakeholder-Objective Risk Model (STORM): Determining the aggregated risk of multiple contaminant hazards in groundwater well catchments

    NASA Astrophysics Data System (ADS)

    Enzenhoefer, R.; Binning, P. J.; Nowak, W.

    2015-09-01

    Risk is often defined as the product of probability, vulnerability and value. Drinking water supply from groundwater abstraction is often at risk due to multiple hazardous land use activities in the well catchment. Each hazard might or might not introduce contaminants into the subsurface at any point in time, which then affects the pumped quality upon transport through the aquifer. In such situations, estimating the overall risk is not trivial, and three key questions emerge: (1) How to aggregate the impacts from different contaminants and spill locations to an overall, cumulative impact on the value at risk? (2) How to properly account for the stochastic nature of spill events when converting the aggregated impact to a risk estimate? (3) How will the overall risk and subsequent decision making depend on stakeholder objectives, where stakeholder objectives refer to the values at risk, risk attitudes and risk metrics that can vary between stakeholders. In this study, we provide a STakeholder-Objective Risk Model (STORM) for assessing the total aggregated risk. Our concept is a quantitative, probabilistic and modular framework for simulation-based risk estimation. It rests on the source-pathway-receptor concept, mass-discharge-based aggregation of stochastically occurring spill events, accounts for uncertainties in the involved flow and transport models through Monte Carlo simulation, and can address different stakeholder objectives. We illustrate the application of STORM in a numerical test case inspired by a German drinking water catchment. As one may expect, the results depend strongly on the chosen stakeholder objectives, but they are equally sensitive to different approaches for risk aggregation across different hazards, contaminant types, and over time.
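
    The aggregation-plus-stochasticity idea can be illustrated with a tiny Monte Carlo sketch: each hazard spills (or not) in a given year, the per-hazard impacts at the well are summed, and the risk metric is the probability that the aggregate exceeds a limit. The probabilities and impact factors below are assumed placeholders, not STORM's calibrated inputs.

        import numpy as np

        rng = np.random.default_rng(0)
        n_sim = 10_000

        # Assumed annual spill probabilities and per-spill impact at the well,
        # expressed as a fraction of a regulatory limit (three hazards).
        p_spill = np.array([0.05, 0.10, 0.02])
        impact = np.array([0.4, 0.2, 0.9])

        spills = rng.random((n_sim, len(p_spill))) < p_spill    # which hazards spill
        total_impact = (spills * impact).sum(axis=1)             # aggregated impact

        # One possible stakeholder-dependent risk metric: probability of exceedance.
        print(f"P(aggregate impact exceeds the limit) ~ {np.mean(total_impact > 1.0):.4f}")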

  16. Climate change impact assessment on Veneto and Friuli Plain groundwater. Part I: an integrated modeling approach for hazard scenario construction.

    PubMed

    Baruffi, F; Cisotto, A; Cimolino, A; Ferri, M; Monego, M; Norbiato, D; Cappelletto, M; Bisaglia, M; Pretner, A; Galli, A; Scarinci, A; Marsala, V; Panelli, C; Gualdi, S; Bucchignani, E; Torresan, S; Pasini, S; Critto, A; Marcomini, A

    2012-12-01

    Climate change impacts on water resources, particularly groundwater, is a highly debated topic worldwide, triggering international attention and interest from both researchers and policy makers due to its relevant link with European water policy directives (e.g. 2000/60/EC and 2007/118/EC) and related environmental objectives. The understanding of long-term impacts of climate variability and change is therefore a key challenge in order to address effective protection measures and to implement sustainable management of water resources. This paper presents the modeling approach adopted within the Life+ project TRUST (Tool for Regional-scale assessment of groUndwater Storage improvement in adaptation to climaTe change) in order to provide climate change hazard scenarios for the shallow groundwater of high Veneto and Friuli Plain, Northern Italy. Given the aim to evaluate potential impacts on water quantity and quality (e.g. groundwater level variation, decrease of water availability for irrigation, variations of nitrate infiltration processes), the modeling approach integrated an ensemble of climate, hydrologic and hydrogeologic models running from the global to the regional scale. Global and regional climate models and downscaling techniques were used to make climate simulations for the reference period 1961-1990 and the projection period 2010-2100. The simulation of the recent climate was performed using observed radiative forcings, whereas the projections have been done prescribing the radiative forcings according to the IPCC A1B emission scenario. The climate simulations and the downscaling, then, provided the precipitation, temperatures and evapo-transpiration fields used for the impact analysis. Based on downscaled climate projections, 3 reference scenarios for the period 2071-2100 (i.e. the driest, the wettest and the mild year) were selected and used to run a regional geomorphoclimatic and hydrogeological model. The final output of the model ensemble produced

  17. Low speed wind tunnel investigation of span load alteration, forward-located spoilers, and splines as trailing-vortex-hazard alleviation devices on a transport aircraft model

    NASA Technical Reports Server (NTRS)

    Croom, D. R.; Dunham, R. E., Jr.

    1975-01-01

    The effectiveness of a forward-located spoiler, a spline, and span load alteration due to a flap configuration change as trailing-vortex-hazard alleviation methods was investigated. For the transport aircraft model in the normal approach configuration, the results indicate that either a forward-located spoiler or a spline is effective in reducing the trailing-vortex hazard. The results also indicate that large changes in span loading, due to retraction of the outboard flap, may be an effective method of reducing the trailing-vortex hazard.

  18. CalTOX, a multimedia total exposure model for hazardous-waste sites; Part 1, Executive summary

    SciTech Connect

    McKone, T.E.

    1993-06-01

    CalTOX has been developed as a spreadsheet model to assist in health-risk assessments that address contaminated soils and the contamination of adjacent air, surface water, sediments, and ground water. The modeling effort includes a multimedia transport and transformation model, exposure scenario models, and efforts to quantify and reduce uncertainty in multimedia, multiple-pathway exposure models. This report provides an overview of the CalTOX model components, lists the objectives of the model, describes the philosophy under which the model was developed, identifies the chemical classes for which the model can be used, and describes critical sensitivities and uncertainties. The multimedia transport and transformation model is a dynamic model that can be used to assess time-varying concentrations of contaminants introduced initially to soil layers or for contaminants released continuously to air or water. This model assists the user in examining how chemical and landscape properties impact both the ultimate route and quantity of human contact. Multimedia, multiple pathway exposure models are used in the CalTOX model to estimate average daily potential doses within a human population in the vicinity of a hazardous substances release site. The exposure models encompass twenty-three exposure pathways. The exposure assessment process consists of relating contaminant concentrations in the multimedia model compartments to contaminant concentrations in the media with which a human population has contact (personal air, tap water, foods, household dusts, soils, etc.). The average daily dose is the product of the exposure concentrations in these contact media and an intake or uptake factor that relates the concentrations to the distributions of potential dose within the population.
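
    The dose bookkeeping described in the last sentences amounts to multiplying a contact-medium concentration by an intake (or uptake) factor for each pathway and summing. The sketch below uses invented numbers for three of the twenty-three pathways simply to show the arithmetic.

        # Average daily potential dose per pathway = concentration x intake factor
        # (all values below are illustrative, normalized per kg body weight per day).
        pathways = {
            "tap water":    {"conc": 0.002,  "unit": "mg/L",   "intake": 2.0 / 70},    # L/day per kg
            "personal air": {"conc": 0.0005, "unit": "mg/m^3", "intake": 20.0 / 70},   # m^3/day per kg
            "soil":         {"conc": 5.0,    "unit": "mg/kg",  "intake": 1e-4 / 70},   # kg/day per kg
        }

        doses = {name: p["conc"] * p["intake"] for name, p in pathways.items()}
        total = sum(doses.values())
        for name, d in doses.items():
            print(f"{name:12s}: {d:.2e} mg/kg-day")
        print(f"{'total':12s}: {total:.2e} mg/kg-day")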

  19. Advances in Landslide Hazard Forecasting: Evaluation of Global and Regional Modeling Approach

    NASA Technical Reports Server (NTRS)

    Kirschbaum, Dalia B.; Adler, Robert; Hone, Yang; Kumar, Sujay; Peters-Lidard, Christa; Lerner-Lam, Arthur

    2010-01-01

    A prototype global satellite-based landslide hazard algorithm has been developed to identify areas that exhibit a high potential for landslide activity by combining a calculation of landslide susceptibility with satellite-derived rainfall estimates. A recent evaluation of this algorithm framework found that while this tool represents an important first step in larger-scale landslide forecasting efforts, it requires several modifications before it can be fully realized as an operational tool. The evaluation finds that the landslide forecasting may be more feasible at a regional scale. This study draws upon a prior work's recommendations to develop a new approach for considering landslide susceptibility and forecasting at the regional scale. This case study uses a database of landslides triggered by Hurricane Mitch in 1998 over four countries in Central America: Guatemala, Honduras, El Salvador and Nicaragua. A regional susceptibility map is calculated from satellite and surface datasets using a statistical methodology. The susceptibility map is tested with a regional rainfall intensity-duration triggering relationship and results are compared to the global algorithm framework for the Hurricane Mitch event. The statistical results suggest that this regional investigation provides one plausible way to approach some of the data and resolution issues identified in the global assessment, providing more realistic landslide forecasts for this case study. Evaluation of landslide hazards for this extreme event helps to identify several potential improvements of the algorithm framework, but also highlights several remaining challenges for the algorithm assessment, transferability and performance accuracy. Evaluation challenges include representation errors from comparing susceptibility maps of different spatial resolutions, biases in event-based landslide inventory data, and limited nonlandslide event data for more comprehensive evaluation. Additional factors that may improve

  20. CAirTOX, An inter-media transfer model for assessing indirect exposures to hazardous air contaminants

    SciTech Connect

    McKone, T.E.

    1994-01-01

    Risk assessment is a quantitative evaluation of information on potential health hazards of environmental contaminants and the extent of human exposure to these contaminants. As applied to toxic chemical emissions to air, risk assessment involves four interrelated steps. These are (1) determination of source concentrations or emission characteristics, (2) exposure assessment, (3) toxicity assessment, and (4) risk characterization. These steps can be carried out with assistance from analytical models in order to estimate the potential risk associated with existing and future releases. CAirTOX has been developed as a spreadsheet model to assist in making these types of calculations. CAirTOX follows an approach that has been incorporated into the CalTOX model, which was developed for the California Department of Toxic Substances Control. With CAirTOX, we can address how contaminants released to an air basin can lead to contamination of soil, food, surface water, and sediments. The modeling effort includes a multimedia transport and transformation model, exposure scenario models, and efforts to quantify uncertainty in multimedia, multiple-pathway exposure assessments. The capacity to explicitly address uncertainty has been incorporated into the model in two ways. First, the spreadsheet form of the model makes it compatible with Monte-Carlo add-on programs that are available for uncertainty analysis. Second, all model inputs are specified in terms of an arithmetic mean and coefficient of variation so that uncertainty analyses can be carried out.
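
    The second mechanism (all inputs specified as an arithmetic mean and coefficient of variation) can be illustrated with a short Monte Carlo sketch; the lognormal input distributions, parameter names and numerical values below are assumptions for illustration, not CAirTOX defaults.

        import numpy as np

        rng = np.random.default_rng(42)

        def sample_lognormal(mean, cv, n):
            """Lognormal samples with a given arithmetic mean and coefficient of variation."""
            sigma2 = np.log(1.0 + cv**2)          # variance of the underlying normal
            mu = np.log(mean) - 0.5 * sigma2      # mean of the underlying normal
            return rng.lognormal(mu, np.sqrt(sigma2), n)

        n = 10_000
        emission = sample_lognormal(mean=1.2, cv=0.5, n=n)    # hypothetical emission rate, g/s
        transfer = sample_lognormal(mean=3e-7, cv=0.8, n=n)   # hypothetical air-to-food transfer factor
        intake = sample_lognormal(mean=0.02, cv=0.3, n=n)     # hypothetical intake factor, kg/kg-day

        dose = emission * transfer * intake
        print("median dose:", np.median(dose))
        print("95th percentile dose:", np.percentile(dose, 95))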

  1. Implementation of NGA-West2 ground motion models in the 2014 U.S. National Seismic Hazard Maps

    USGS Publications Warehouse

    Rezaeian, Sanaz; Petersen, Mark D.; Moschetti, Morgan P.; Powers, Peter; Harmsen, Stephen C.; Frankel, Arthur D.

    2014-01-01

    The U.S. National Seismic Hazard Maps (NSHMs) have been an important component of seismic design regulations in the United States for the past several decades. These maps present earthquake ground shaking intensities at specified probabilities of being exceeded over a 50-year time period. The previous version of the NSHMs was developed in 2008; during 2012 and 2013, scientists at the U.S. Geological Survey have been updating the maps based on their assessment of the “best available science,” resulting in the 2014 NSHMs. The update includes modifications to the seismic source models and the ground motion models (GMMs) for sites across the conterminous United States. This paper focuses on updates in the Western United States (WUS) due to the use of new GMMs for shallow crustal earthquakes in active tectonic regions developed by the Next Generation Attenuation (NGA-West2) project. Individual GMMs, their weighted combination, and their impact on the hazard maps relative to 2008 are discussed. In general, the combined effects of lower medians and increased standard deviations in the new GMMs have caused only small changes, within 5–20%, in the probabilistic ground motions for most sites across the WUS compared to the 2008 NSHMs.
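
    The probabilistic convention used by the maps can be sketched in a few lines: under a Poisson assumption, an annual rate of exceedance translates into a probability of exceedance over 50 years, and a hazard curve can be interpolated at a target probability; the hazard-curve values below are invented for illustration and are not taken from the 2014 NSHMs.

        import numpy as np

        # hypothetical hazard curve: PGA levels (g) and annual rates of exceedance
        pga = np.array([0.05, 0.1, 0.2, 0.4, 0.8])
        annual_rate = np.array([2e-2, 8e-3, 2e-3, 4e-4, 5e-5])

        # Poisson assumption: P(at least one exceedance in t years) = 1 - exp(-rate * t)
        t = 50.0
        p50 = 1.0 - np.exp(-annual_rate * t)

        # ground motion with 2% probability of exceedance in 50 years
        # (interpolating log-rate against log-PGA, the usual hazard-curve convention)
        target_rate = -np.log(1.0 - 0.02) / t
        pga_2in50 = np.exp(np.interp(np.log(target_rate),
                                     np.log(annual_rate[::-1]), np.log(pga[::-1])))
        print("50-yr exceedance probabilities:", np.round(p50, 3))
        print(f"PGA with 2% in 50 yr: {pga_2in50:.2f} g")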

  2. Atmospheric electrical modeling in support of the NASA F106 Storm Hazards Project

    NASA Technical Reports Server (NTRS)

    Helsdon, J. H.

    1986-01-01

    With the use of composite (non-metallic) materials and microelectronics becoming more prevalent in the construction of both military and commercial aircraft, the control systems have become more susceptible to damage or failure from electromagnetic transients. One source of such transients is the lightning discharge. In order to study the effects of the lightning discharge on the vital components of an aircraft, NASA Langley Research Center has undertaken a Storm Hazards Program in which a specially instrumented F106B jet aircraft is flown into active thunderstorms with the intention of being struck by lightning. One of the specific purposes of the program is to quantify the environmental conditions which are conducive to aircraft lightning strikes.

  3. Marine natural hazards in coastal zone: observations, analysis and modelling (Plinius Medal Lecture)

    NASA Astrophysics Data System (ADS)

    Didenkulova, Ira

    2010-05-01

    Giant surface waves approaching the coast frequently cause extensive coastal flooding, destruction of coastal structures and loss of life. Such waves can be generated by various phenomena: strong storms and cyclones, underwater earthquakes, high-speed ferries, and aerial and submarine landslides. The most famous examples of such events are the catastrophic tsunami in the Indian Ocean of 26 December 2004 and hurricane Katrina (28 August 2005) in the Atlantic Ocean. The huge storm in the Baltic Sea on 9 January 2005, which produced unexpectedly long waves in many areas of the Baltic Sea, and the unusually high surges created by long waves from high-speed ferries should also be mentioned as examples of regional marine natural hazards connected with extensive runup of certain types of waves. The processes of wave shoaling and runup for all these different marine natural hazards (tsunami, coastal freak waves, ship waves) are studied based on rigorous solutions of nonlinear shallow-water theory. The key and novel results presented here are: i) parameterization of basic formulas for extreme runup characteristics of bell-shaped waves, showing that they depend only weakly on the initial wave shape, which is usually unknown in real sea conditions; ii) runup analysis of periodic asymmetric waves with a steep front, as such waves penetrate inland over larger distances and with larger velocities than symmetric waves; iii) statistical analysis of irregular wave runup demonstrating that nearshore wave nonlinearity does not influence the probability distribution of the velocity of the moving shoreline and its moments, but does influence the vertical displacement of the moving shoreline (runup). Wave runup on convex beaches and in narrow bays, which allow abnormal wave amplification, is also discussed. The analytical results described are used to explain the observed extreme runup of tsunami, freak (sneaker) waves and ship waves on different coasts

  4. Internal structure and volcanic hazard potential of Mt Tongariro, New Zealand, from 3D gravity and magnetic models

    NASA Astrophysics Data System (ADS)

    Miller, Craig A.; Williams-Jones, Glyn

    2016-06-01

    A new 3D geophysical model of the Mt Tongariro Volcanic Massif (TgVM), New Zealand, provides a high resolution view of the volcano's internal structure and hydrothermal system, from which we derive implications for volcanic hazards. Geologically constrained 3D inversions of potential field data provide a greater level of insight into the volcanic structure than is possible from unconstrained models. A complex region of gravity highs and lows (± 6 mGal) is set within a broader, ~ 20 mGal gravity low. A magnetic high (1300 nT) is associated with Mt Ngauruhoe, while a substantial, thick, demagnetised area occurs to the north, coincident with a gravity low and interpreted as representing the hydrothermal system. The hydrothermal system is constrained to the west by major faults, interpreted as an impermeable barrier to fluid migration, and extends to basement depth. These faults are considered low probability areas for future eruption sites, as there is little to indicate they have acted as magmatic pathways. Where the hydrothermal system coincides with steep topographic slopes, there is an increased likelihood of landslides, and the newly delineated hydrothermal system maps the area most likely to experience phreatic eruptions. Such eruptions, while small on a global scale, are important hazards at the TgVM as it is a popular hiking area with hundreds of visitors per day in close proximity to eruption sites. The model shows that the volume of volcanic material erupted over the lifespan of the TgVM is five to six times greater than previous estimates, suggesting a higher rate of magma supply, in line with global rates of andesite production. We suggest that our model of physical property distribution can be used to provide constraints for other models of dynamic geophysical processes occurring at the TgVM.

  5. Bi-Objective Modelling for Hazardous Materials Road-Rail Multimodal Routing Problem with Railway Schedule-Based Space-Time Constraints.

    PubMed

    Sun, Yan; Lang, Maoxiang; Wang, Danzhu

    2016-01-01

    The transportation of hazardous materials is always accompanied by considerable risk that will impact public and environmental security. As an efficient and reliable transportation organization, a multimodal service should participate in the transportation of hazardous materials. In this study, we focus on transporting hazardous materials through the multimodal service network and explore the hazardous materials multimodal routing problem from the operational level of network planning. To formulate this problem more practicably, minimizing the total generalized costs of transporting the hazardous materials and the social risk along the planned routes are set as the optimization objectives. Meanwhile, the following formulation characteristics will be comprehensively modelled: (1) specific customer demands; (2) multiple hazardous material flows; (3) capacitated schedule-based rail service and uncapacitated time-flexible road service; and (4) environmental risk constraint. A bi-objective mixed integer nonlinear programming model is first built to formulate the routing problem that combines the formulation characteristics above. Then linear reformulations are developed to linearize and improve the initial model so that it can be effectively solved by exact solution algorithms on standard mathematical programming software. By utilizing the normalized weighted sum method, we can generate the Pareto solutions to the bi-objective optimization problem for a specific case. Finally, a large-scale empirical case study from the Beijing-Tianjin-Hebei Region in China is presented to demonstrate the feasibility of the proposed methods in dealing with the practical problem. Various scenarios are also discussed in the case study. PMID:27483294

  8. Reproductive Hazards

    MedlinePlus

    ... and female reproductive systems play a role in pregnancy. Problems with these systems can affect fertility and ... a reproductive hazard can cause different effects during pregnancy, depending on when she is exposed. During the ...

  9. Coastal Hazards.

    ERIC Educational Resources Information Center

    Vandas, Steve

    1998-01-01

    Focuses on hurricanes and tsunamis and uses these topics to address other parts of the science curriculum. In addition to a discussion on beach erosion, a poster is provided that depicts these natural hazards that threaten coastlines. (DDR)

  10. Predictive Modeling of Chemical Hazard by Integrating Numerical Descriptors of Chemical Structures and Short-term Toxicity Assay Data

    PubMed Central

    Rusyn, Ivan; Sedykh, Alexander; Guyton, Kathryn Z.; Tropsha, Alexander

    2012-01-01

    Quantitative structure-activity relationship (QSAR) models are widely used for in silico prediction of in vivo toxicity of drug candidates or environmental chemicals, adding value to candidate selection in drug development or in a search for less hazardous and more sustainable alternatives for chemicals in commerce. The development of traditional QSAR models is enabled by numerical descriptors representing the inherent chemical properties that can be easily defined for any number of molecules; however, traditional QSAR models often have limited predictive power due to the lack of data and complexity of in vivo endpoints. Although it has been indeed difficult to obtain experimentally derived toxicity data on a large number of chemicals in the past, the results of quantitative in vitro screening of thousands of environmental chemicals in hundreds of experimental systems are now available and continue to accumulate. In addition, publicly accessible toxicogenomics data collected on hundreds of chemicals provide another dimension of molecular information that is potentially useful for predictive toxicity modeling. These new characteristics of molecular bioactivity arising from short-term biological assays, i.e., in vitro screening and/or in vivo toxicogenomics data, can now be exploited in combination with chemical structural information to generate hybrid QSAR-like quantitative models to predict human toxicity and carcinogenicity. Using several case studies, we illustrate the benefits of a hybrid modeling approach, namely improvements in the accuracy of models, enhanced interpretation of the most predictive features, and expanded applicability domain for wider chemical space coverage. PMID:22387746
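
    A minimal sketch of the hybrid idea is shown below: chemical descriptors and short-term bioassay features are simply concatenated before fitting a classifier. The random placeholder data (so the cross-validated score is expected to sit near 0.5) and the choice of a random-forest learner are assumptions for illustration, not the specific datasets or models used in the case studies.

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(0)
        n_chem = 500
        X_descriptors = rng.normal(size=(n_chem, 200))   # chemical structure descriptors (placeholder)
        X_bioassays = rng.normal(size=(n_chem, 50))      # in vitro screening / toxicogenomics features (placeholder)
        y = rng.integers(0, 2, size=n_chem)              # in vivo toxicity outcome (placeholder labels)

        # hybrid model: concatenate the two feature blocks before training
        X_hybrid = np.hstack([X_descriptors, X_bioassays])

        model = RandomForestClassifier(n_estimators=300, random_state=0)
        scores = cross_val_score(model, X_hybrid, y, cv=5, scoring="roc_auc")
        print("cross-validated ROC AUC:", scores.mean())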

  11. Application of time-independent and time-dependent occurrence models on the seismic hazard estimations in the Marmara region, Turkey

    NASA Astrophysics Data System (ADS)

    Murru, M.; Akinci, A.; Console, R.; Falcone, G.; Pucci, S.

    2014-12-01

    We show the effect of time-independent and time-dependent occurrence models on seismic hazard estimates. The time-dependence is introduced by 1) the Brownian Passage Time (BPT) probability model, which is based on a simple physical model of the earthquake cycle, and 2) the fusion of the BPT renewal model with a physical model that considers the earthquake probability perturbation for interacting faults by static Coulomb stress changes. We treat the uncertainties in the fault parameters (e.g. slip rate, characteristic magnitude and aperiodicity) of the statistical distribution associated with each examined fault source by a Monte Carlo technique. For a comparison among the results obtained from the three different models, we give the probabilities of occurrence of earthquakes Mw > 6.5 for individual fault sources in the Marmara region, over the future 5, 10, 30 and 50 years, starting on January 1, 2013, considering the 10th, 50th and 90th percentiles of the Monte Carlo distribution. In order to evaluate the impact of the earthquake probability models on ground motion hazard, we attempt to calculate fault-based probabilistic seismic hazard maps (PSHA) of mean Peak Ground Acceleration (PGA) having 10% probability of exceedance in 50 years for rock site conditions. We adopted only one Ground Motion Prediction Equation (GMPE) for the active shallow crustal region for assessing the ground shaking hazard in the Marmara region. We observed that the impact of the different occurrence models on the seismic hazard estimate of selected sites is quite high: the hazard may increase by more than 70% or decrease by as much as 70%, depending on the applied model and the selected site. This difference mostly depends on the time elapsed since the latest major earthquake on a specific fault. We demonstrate that the estimated average recurrence time and the associated magnitude, together with the elapsed time, are crucial parameters in the earthquake probability calculations.
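
    The renewal calculation at the core of the BPT model can be sketched as follows: the conditional probability that a fault with mean recurrence time mu, aperiodicity alpha and a given elapsed time ruptures within the next window of years. The fault parameters below are hypothetical, and the Coulomb-stress interaction and the Monte Carlo treatment of parameter uncertainty described above are omitted.

        import numpy as np
        from scipy.stats import norm

        def bpt_cdf(t, mu, alpha):
            """CDF of the Brownian Passage Time (inverse Gaussian) distribution with
            mean recurrence mu and aperiodicity alpha."""
            t = np.asarray(t, dtype=float)
            u1 = (t / mu - 1.0) * np.sqrt(mu / (alpha**2 * t))
            u2 = (t / mu + 1.0) * np.sqrt(mu / (alpha**2 * t))
            return norm.cdf(u1) + np.exp(2.0 / alpha**2) * norm.cdf(-u2)

        def conditional_probability(mu, alpha, elapsed, window):
            """P(rupture within `window` years | no rupture during the `elapsed` years so far)."""
            num = bpt_cdf(elapsed + window, mu, alpha) - bpt_cdf(elapsed, mu, alpha)
            den = 1.0 - bpt_cdf(elapsed, mu, alpha)
            return num / den

        # hypothetical fault: 250-yr mean recurrence, aperiodicity 0.5, 200 yr since the last event
        print(conditional_probability(mu=250.0, alpha=0.5, elapsed=200.0, window=30.0))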

  12. Kalman-predictive-proportional-integral-derivative (KPPID)

    SciTech Connect

    Fluerasu, A.; Sutton, M.

    2004-12-17

    With third generation synchrotron X-ray sources, it is possible to acquire detailed structural information about the system under study with time resolution orders of magnitude faster than was possible a few years ago. These advances have generated many new challenges for changing and controlling the state of the system on very short time scales, in a uniform and controlled manner. For our particular X-ray experiments on crystallization or order-disorder phase transitions in metallic alloys, we need to change the sample temperature by hundreds of degrees as fast as possible while avoiding over- or undershooting. To achieve this, we designed and implemented a computer-controlled temperature tracking system which combines standard Proportional-Integral-Derivative (PID) feedback, thermal modeling and finite difference thermal calculations (feedforward), and Kalman filtering of the temperature readings in order to reduce the noise. The resulting Kalman-Predictive-Proportional-Integral-Derivative (KPPID) algorithm allows us to obtain accurate control, to minimize the response time and to avoid over/undershooting, even in systems with inherently noisy temperature readings and time delays. The KPPID temperature controller was successfully implemented at the Advanced Photon Source at Argonne National Laboratory and was used to perform coherent and time-resolved X-ray diffraction experiments.
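
    A minimal sketch of the control structure described above (Kalman-filtered measurement, feedforward from a plant model, plus PID correction) is given below; the first-order thermal plant, gains and noise levels are invented for illustration and do not reproduce the actual KPPID implementation.

        import numpy as np

        rng = np.random.default_rng(1)

        # very simple first-order thermal plant used as a stand-in for the real furnace
        tau, gain, dt, T_amb = 20.0, 2.0, 0.1, 25.0    # time constant (s), K per unit power, step (s), ambient (C)

        # scalar Kalman filter state (temperature assumed to vary slowly between steps)
        q, r = 1e-3, 4.0                               # process / measurement noise variances
        x_hat, p = T_amb, 1.0

        # PID gains and state
        kp, ki, kd = 1.5, 0.4, 0.2
        integral, prev_err = 0.0, 0.0

        T_true, setpoint = T_amb, 300.0
        for _ in range(3000):
            # noisy thermocouple reading, then Kalman predict/update to de-noise it
            z = T_true + rng.normal(0.0, 2.0)
            p += q
            k = p / (p + r)
            x_hat += k * (z - x_hat)
            p *= 1.0 - k

            # feedforward (power needed to hold the setpoint) plus PID correction
            err = setpoint - x_hat
            integral += err * dt
            deriv = (err - prev_err) / dt
            prev_err = err
            power = (setpoint - T_amb) / gain + kp * err + ki * integral + kd * deriv

            # plant update
            T_true += dt * (-(T_true - T_amb) + gain * max(power, 0.0)) / tau

        print(f"final temperature: {T_true:.1f} C (setpoint {setpoint} C)")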

  13. Hydrological risks in anthropized watersheds: modeling of hazard, vulnerability and impacts on population from south-west of Madagascar

    NASA Astrophysics Data System (ADS)

    Mamy Rakotoarisoa, Mahefa; Fleurant, Cyril; Taibi, Nuscia; Razakamanana, Théodore

    2016-04-01

    Hydrological risks, especially floods, are recurrent in the Fiherenana watershed in southwest Madagascar. The city of Toliara, located at the outlet of the river basin, is subjected each year to hurricane hazards and floods. The stakes are of major importance in this part of the island. This study begins with an analysis of the hazard, collecting all existing hydro-climatic data on the catchment. It then seeks to determine trends, despite the significant lack of data, using simple statistical models (decomposition of time series). Two approaches are then used to assess the vulnerability of the city of Toliara and the surrounding villages. The first is a static approach, based on land surveys and the use of GIS. The second relies on a multi-agent-based simulation model. The first step is the mapping of a vulnerability index obtained by combining several static criteria. This is a microscale indicator (the scale used is the individual house). For each house, several vulnerability criteria are considered, such as the potential water depth, the flow rate, or the architectural typology of the building. In the second part, simulations involving agents are used in order to evaluate the degree of vulnerability of homes to flooding. Agents are individual entities to which behaviours can be assigned in order to simulate a given phenomenon. The aim is not to attach a criterion to the house as a physical building, such as its architectural typology or its strength, but to estimate the chances of the occupants of the house escaping a catastrophic flood. For this purpose, various settings and scenarios are compared. Some scenarios are designed to take into account the effect of certain decisions made by the responsible entities (information and awareness campaigns for the villagers, for example). The simulation consists of two essential parts taking place simultaneously in time: simulation of the rise of water and the flow using

  14. Spatio-temporal hazard estimation in the Auckland Volcanic Field, New Zealand, with a new event-order model

    NASA Astrophysics Data System (ADS)

    Bebbington, Mark S.; Cronin, Shane J.

    2011-01-01

    The Auckland Volcanic Field (AVF) with 49 eruptive centres in the last c. 250 ka presents many challenges to our understanding of distributed volcanic field construction and evolution. We re-examine the age constraints within the AVF and perform a correlation exercise matching the well-dated record of tephras from cores distributed throughout the field to the most likely source volcanoes, using thickness and location information and a simple attenuation model. Combining this augmented age information with known stratigraphic constraints, we produce a new age-order algorithm for the field, with errors incorporated using a Monte Carlo procedure. Analysis of the new age model discounts earlier appreciations of spatio-temporal clustering in the AVF. Instead the spatial and temporal aspects appear independent; hence the location of the last eruption provides no information about the next location. The temporal hazard intensity in the field has been highly variable, with over 63% of its centres formed in a high-intensity period between 40 and 20 ka. Another, smaller, high-intensity period may have occurred at the field onset, while the latest event, at 504 ± 5 years B.P., erupted 50% of the entire field's volume. This emphasises the lack of steady-state behaviour that characterises the AVF, which may also be the case in longer-lived fields with a lower dating resolution. Spatial hazard intensity in the AVF under the new age model shows a strong NE-SW structural control of volcanism that may reflect deep-seated crustal or subduction zone processes and matches the orientation of the Taupo Volcanic Zone to the south.

  15. Mount St. Helens a decade after the 1980 eruptions: magmatic models, chemical cycles, and a revised hazards assessment

    USGS Publications Warehouse

    Pallister, J.S.; Hoblitt, R.P.; Crandell, D.R.; Mullineaux, D.R.

    1992-01-01

    Available geophysical and geologic data provide a simplified model of the current magmatic plumbing system of Mount St. Helens (MSH). This model and new geochemical data are the basis for the revised hazards assessment presented here. The assessment is weighted by the style of eruptions and the chemistry of magmas erupted during the past 500 years, the interval for which the most detailed stratigraphic and geochemical data are available. This interval includes the Kalama (A.D. 1480-1770s?), Goat Rocks (A.D. 1800-1857), and current eruptive periods. In each of these periods, silica content decreased, then increased. The Kalama is a large amplitude chemical cycle (SiO2: 57%-67%), produced by mixing of arc dacite, which is depleted in high field-strength and incompatible elements, with enriched (OIB-like) basalt. The Goat Rocks and current cycles are of small amplitude (SiO2: 61%-64% and 62%-65%) and are related to the fluid dynamics of magma withdrawal from a zoned reservoir. The cyclic behavior is used to forecast future activity. The 1980-1986 chemical cycle, and consequently the current eruptive period, appears to be virtually complete. This inference is supported by the progressively decreasing volumes and volatile contents of magma erupted since 1980, both changes that suggest a decreasing potential for a major explosive eruption in the near future. However, recent changes in seismicity and a series of small gas-release explosions (beginning in late 1989 and accompanied by eruption of a minor fraction of relatively low-silica tephra on 6 January and 5 November 1990) suggest that the current eruptive period may continue to produce small explosions and that a small amount of magma may still be present within the conduit. The gas-release explosions occur without warning and pose a continuing hazard, especially in the crater area. An eruption as large or larger than that of 18 May 1980 (~0.5 km3 dense-rock equivalent) probably will occur only if magma rises from

  16. The egg-sharing model for human therapeutic cloning research: managing donor selection criteria, the proportion of shared oocytes allocated to research, and amount of financial subsidy given to the donor.

    PubMed

    Heng, Boon Chin; Tong, Guo Qing; Stojkovic, Miodrag

    2006-01-01

    Recent advances in human therapeutic cloning made by Hwang and colleagues have opened up new avenues of therapy for various human diseases. However, the major bottleneck of this new technology is the severe shortage of human donor oocytes. Egg-sharing in return for subsidized fertility treatment has been suggested as an ethically justifiable and practical solution to overcome the shortage of donor oocytes for therapeutic cloning. Because the utilization of shared oocytes in therapeutic cloning research does not result in any therapeutic benefit to a second party, this would necessitate a different management strategy compared to their use for the assisted conception of infertile women who are unable to produce any oocytes of their own. It is proposed that the pool of prospective egg-sharers in therapeutic cloning research be limited only to younger women (below 30 years of age) with indications for either male partner sub-fertility or tubal blockage. With regards to the proportion of the shared gametes being allocated to research, a threshold number of retrieved oocytes should be set that if not exceeded, would result in the patient being automatically removed from the egg-sharing scheme. Any excess supernumerary oocyte above this threshold number can be contributed to science, and allocation should be done in a randomized manner. Perhaps, a total of 10 retrieved oocytes from the patient may be considered a suitable threshold, since the chances of conception are unlikely to be impaired. With regards to the amount of subsidy being given to the patient, it is suggested that the proportion of financial subsidy should be equal to the proportion of the patient's oocytes being allocated to research. No doubt, the promise of future therapeutic benefit may be offered to the patient instead of financial subsidy. However, this is ethically controversial because therapeutic cloning has not yet been demonstrated to be a viable model of clinical therapy and any promises made to

  17. Digital elevation models in the marine domain: investigating the offshore tsunami hazard from submarine landslides

    NASA Astrophysics Data System (ADS)

    Tappin, David R.

    2015-04-01

    the resolution necessary to identify the hazard from landslides, particularly along convergent margins where this hazard is the greatest. Multibeam mapping of the deep seabed requires low frequency sound sources that, because of their corresponding low resolution, cannot produce the detail required to identify the finest scale features. In addition, outside of most countries, there are not the repeat surveys that allow seabed changes to be identified. Perhaps only Japan has such data. In the near future, as research budgets shrink and ship time becomes ever more expensive, new strategies will have to be used to make best use of the vessels available. Remote AUV technology is almost certainly the answer, and should be increasingly utilised to map the seabed while the mother ship is better used to carry out other duties, such as sampling or seismic data acquisition. This will have the advantage in the deep ocean of acquiring higher resolution data from high frequency multibeams. This talk presents a number of projects that show the evolution of the use of MBES in mapping submarine landslides since the PNG tsunami. Data from PNG is presented, together with data from Japan, Hawaii and the NE Atlantic. New multibeam acquisition methodologies are also discussed.

  18. A multiple imputation approach to the analysis of interval-censored failure time data with the additive hazards model

    PubMed Central

    Chen, Ling; Sun, Jianguo

    2013-01-01

    This paper discusses regression analysis of interval-censored failure time data, which occur in many fields including demographical, epidemiological, financial, medical, and sociological studies. For the problem, we focus on the situation where the survival time of interest can be described by the additive hazards model and a multiple imputation approach is presented for inference. A major advantage of the approach is its simplicity and it can be easily implemented by using the existing software packages for right-censored failure time data. Extensive simulation studies are conducted which indicate that the approach performs well for practical situations and is comparable to the existing methods. The methodology is applied to a set of interval-censored failure time data arising from an AIDS clinical trial. PMID:25419022
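
    The impute-then-pool idea can be sketched on synthetic data as follows; a Cox fit from the lifelines package is used here purely as a stand-in fitter for the imputed (now exact) event times, since the paper itself works with the additive hazards model, and the per-imputation estimates are combined with Rubin's rules.

        import numpy as np
        import pandas as pd
        from lifelines import CoxPHFitter

        rng = np.random.default_rng(0)
        n, m_imputations = 300, 20

        # synthetic interval-censored data: the event is only known to fall between two visits
        x = rng.normal(size=n)
        true_t = rng.exponential(scale=np.exp(-0.5 * x))
        left = np.floor(true_t / 0.5) * 0.5          # last visit before the event
        right = left + 0.5                           # first visit after the event

        estimates, variances = [], []
        for _ in range(m_imputations):
            # impute an exact event time uniformly within each censoring interval
            t_imp = rng.uniform(left, right)
            df = pd.DataFrame({"T": t_imp, "E": np.ones(n), "x": x})
            fit = CoxPHFitter().fit(df, duration_col="T", event_col="E")
            estimates.append(fit.params_["x"])
            variances.append(fit.standard_errors_["x"] ** 2)

        # Rubin's rules: pooled estimate plus within- and between-imputation variance
        est = np.mean(estimates)
        total_var = np.mean(variances) + (1 + 1 / m_imputations) * np.var(estimates, ddof=1)
        print(f"pooled log-hazard ratio: {est:.3f} +/- {np.sqrt(total_var):.3f}")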

  19. Modelling absorption and dilution of unconfined releases of hazardous gases by water curtains or monitors

    SciTech Connect

    Fthenakis, V.M.; Blewitt, D.N.; Hague, W.J.

    1995-05-01

    OSHA Process Safety Management guidelines suggest that a facility operator investigate and document a plan for installing systems to detect, contain, or mitigate accidental releases if such systems are not already in place. In addition, proposed EPA 112(r) regulations would require such analysis. This paper illustrates how mathematical modelling can aid such an evaluation and describes some recent enhancements of the HGSPRAY model: (1) Adding algorithms for modeling NH3 and LNG mitigation; (2) Modeling spraying of releases with fire water monitors encircling the point of release; (3) Combining wind tunnel modeling with mathematical modeling; and (4) Linking HGSPRAY and BEGADAS. Case studies are presented as examples of how HGSPRAY can aid the design of water spray systems for mitigation of toxic gases (e.g., HF, NH3) or dilution/dispersion of flammable vapors (e.g., LNG).

  20. Estimation of lithofacies proportions using well and well test data

    SciTech Connect

    Hu, L.Y.; Blanc, G.; Noetinger, B.

    1996-12-31

    A crucial step of the commonly used geostatistical methods for modeling heterogeneous reservoirs (e.g. the sequential indicator simulation and the truncated Gaussian functions) is the estimation of the lithofacies local proportion (or probability density) functions. Well-test derived permeabilities show good correlation with lithofacies proportions around wells. Integrating well and well-test data in estimating lithofacies proportions could permit the building of more realistic models of reservoir heterogeneity. However, this integration is difficult because of the different natures and measurement scales of these two types of data. This paper presents a two-step approach to integrating well and well-test data into heterogeneous reservoir modeling. First, lithofacies proportions in well-test investigation areas are estimated using a new kriging algorithm called KISCA. KISCA consists of jointly kriging the proportions of all lithofacies in a well-test investigation area so that the corresponding well-test derived permeability is respected through a weighted power averaging of lithofacies permeabilities. For multiple well-tests, an iterative process is used in KISCA to account for their interaction. After this, the estimated proportions are combined with lithofacies indicators at wells for estimating proportion (or probability density) functions over the entire reservoir field using a classical kriging method. Some numerical examples were considered to test the proposed method for estimating lithofacies proportions. In addition, a synthetic lithofacies reservoir model was generated and a well-test simulation was performed. The comparison between the experimental and estimated proportions in the well-test investigation area demonstrates the validity of the proposed method.
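
    The well-test constraint in KISCA is honoured through a weighted power average of the lithofacies permeabilities; the averaging step itself can be sketched as follows, with hypothetical proportions, permeabilities and averaging exponents.

        import numpy as np

        def power_average(proportions, permeabilities, omega):
            """Weighted power average k_eff = (sum_i p_i * k_i**omega)**(1/omega).
            omega = 1 gives the arithmetic mean, omega = -1 the harmonic mean, and
            omega -> 0 approaches the geometric mean."""
            p = np.asarray(proportions, dtype=float)
            k = np.asarray(permeabilities, dtype=float)
            return np.sum(p * k**omega) ** (1.0 / omega)

        p = [0.5, 0.3, 0.2]        # hypothetical proportions of sand, silt, shale in the investigation area
        k = [500.0, 50.0, 0.1]     # hypothetical lithofacies permeabilities, mD
        for omega in (1.0, 0.5, -1.0):
            print(omega, power_average(p, k, omega))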

  1. A Coupled Damage and Reaction Model for Simulating Energetic Material Response to Impact Hazards

    SciTech Connect

    BAER,MELVIN R.; DRUMHELLER,D.S.; MATHESON,E.R.

    1999-09-01

    The Baer-Nunziato multiphase reactive theory for a granulated bed of energetic material is extended to allow for dynamic damage processes that generate new surfaces as well as porosity. The Second Law of Thermodynamics is employed to constrain the constitutive forms of the mass, momentum, and energy exchange functions as well as those for the mechanical damage model, ensuring that the models will be dissipative. The focus here is on the constitutive forms of the exchange functions. The mechanical constitutive modeling is discussed in a companion paper. The mechanical damage model provides dynamic surface area and porosity information needed by the exchange functions to compute combustion rates and interphase momentum and energy exchange rates. The models are implemented in the CTH shock physics code and used to simulate delayed detonations due to impacts in a bed of granulated energetic material and an undamaged cylindrical sample.

  2. Great paleoearthquakes of the central Himalaya and their implications for seismotectonic models and seismic hazard assessment

    NASA Astrophysics Data System (ADS)

    Yule, D.; Lave, J.; Kumar, S.; Wesnousky, S.

    2007-12-01

    Himalaya in over 500 years and that Mw 7.5-8.4 earthquakes are the 'moderate' earthquakes. Further study to constrain the lateral extent and recurrence of the great paleoearthquakes of the central Himalaya is critical to answer important questions about the Himalaya earthquake cycle and the seismic hazard facing the rapidly urbanizing population of the region.

  3. Applications of the seismic hazard model of Italy: from a new building code to the L'Aquila trial against seismologists

    NASA Astrophysics Data System (ADS)

    Meletti, C.

    2013-05-01

    In 2003, a large national project for updating the seismic hazard map and the seismic zoning in Italy started, according to the rules fixed by an Ordinance of the Italian Prime Minister. New input elements for probabilistic seismic hazard assessment were compiled: the earthquake catalogue, the seismogenic zonation, the catalogue completeness, and a set of new attenuation relationships. The map of expected PGA on rock soil conditions with 10% probability of exceedance is the new reference seismic hazard map for Italy (http://zonesismiche.mi.ingv.it). Subsequently, nine further probabilities of exceedance, the uniform hazard spectra up to 2 seconds, and the disaggregation of the PGA were also released. A comprehensive seismic hazard model that fully describes the seismic hazard in Italy was then available, accessible through a webGIS application (http://esse1-gis.mi.ingv.it/en.php). The detailed information makes it possible to change the approach for evaluating the proper seismic action for design: from a zone-dependent approach (in Italy there were 4 seismic zones, each one with a single design spectrum) to a site-dependent approach in which the design spectrum is defined at each site of a grid of about 11000 points covering the whole national territory. The new building code became mandatory only after the 6 April 2009 L'Aquila earthquake, the first strong event in Italy after the release of the seismic hazard map. The large number of recordings and the values of the experienced accelerations suggested comparisons between the recorded spectra and the spectra defined in the seismic codes. Even if such comparisons could be robust only after several consecutive 50-year periods of observation, and in a probabilistic approach a single observation cannot by itself validate or invalidate the hazard estimate, some of the comparisons that can be undertaken between the observed ground motions and the hazard model used for the seismic code have been performed and have shown that the

  4. Hazardous gas dispersion: a CFD model accounting for atmospheric stability classes.

    PubMed

    Pontiggia, M; Derudi, M; Busini, V; Rota, R

    2009-11-15

    Nowadays, thanks to increasing CPU power, the use of Computational Fluid Dynamics (CFD) is rapidly gaining ground in the industrial risk assessment area as well, replacing integral models when particular situations, such as those involving complex terrain or large obstacles, are involved. Nevertheless, commercial CFD codes usually do not provide specific turbulence models for simulating atmospheric stratification effects, which are accounted for by integral models through the well-known stability-class approach. In this work, a new approach able to take atmospheric stability features into account in CFD simulations has been developed and validated by comparison with available experimental data. PMID:19619939

  6. AIR QUALITY MODELING OF HAZARDOUS POLLUTANTS: CURRENT STATUS AND FUTURE DIRECTIONS

    EPA Science Inventory

    The paper presents a review of current air toxics modeling applications and discusses possible advanced approaches. Many applications require the ability to predict hot spots from industrial sources or large roadways that are needed for community health and Environmental Justice...

  7. Models Show Subsurface Cracking May Complicate Groundwater Cleanup at Hazardous Waste Sites

    EPA Science Inventory

    Chlorinated solvents like trichloroethylene contaminate groundwater at numerous sites nationwide. This modeling study, conducted at the Air Force Institute of Technology, shows that subsurface cracks, either natural or due to the presence of the contaminant itself, may result in...

  8. Applying the Land Use Portfolio Model to Estimate Natural-Hazard Loss and Risk - A Hypothetical Demonstration for Ventura County, California

    USGS Publications Warehouse

    Dinitz, Laura B.

    2008-01-01

    With costs of natural disasters skyrocketing and populations increasingly settling in areas vulnerable to natural hazards, society is challenged to better allocate its limited risk-reduction resources. In 2000, Congress passed the Disaster Mitigation Act, amending the Robert T. Stafford Disaster Relief and Emergency Assistance Act (Robert T. Stafford Disaster Relief and Emergency Assistance Act, Pub. L. 93-288, 1988; Federal Emergency Management Agency, 2002, 2008b; Disaster Mitigation Act, 2000), mandating that State, local, and tribal communities prepare natural-hazard mitigation plans to qualify for pre-disaster mitigation grants and post-disaster aid. The Federal Emergency Management Agency (FEMA) was assigned to coordinate and implement hazard-mitigation programs, and it published information about specific mitigation-plan requirements and the mechanisms (through the Hazard Mitigation Grant Program-HMGP) for distributing funds (Federal Emergency Management Agency, 2002). FEMA requires that each community develop a mitigation strategy outlining long-term goals to reduce natural-hazard vulnerability, mitigation objectives and specific actions to reduce the impacts of natural hazards, and an implementation plan for those actions. The implementation plan should explain methods for prioritizing, implementing, and administering the actions, along with a 'cost-benefit review' justifying the prioritization. FEMA, along with the National Institute of Building Sciences (NIBS), supported the development of HAZUS ('Hazards U.S.'), a geospatial natural-hazards loss-estimation tool, to help communities quantify potential losses and to aid in the selection and prioritization of mitigation actions. HAZUS was expanded to a multiple-hazard version, HAZUS-MH, that combines population, building, and natural-hazard science and economic data and models to estimate physical damages, replacement costs, and business interruption for specific natural-hazard scenarios. HAZUS
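
    The kind of scenario loss and mitigation comparison that such tools support can be sketched in a few lines; the asset inventory, damage probabilities and mitigation costs below are invented for illustration and do not reproduce the Land Use Portfolio Model's actual formulation or HAZUS-MH damage functions.

        # each tuple: (replacement cost $, probability of damage in the scenario,
        #              damage ratio, mitigation cost $, damage ratio if mitigated)
        assets = [
            (2_000_000, 0.10, 0.6, 150_000, 0.25),
            (5_000_000, 0.05, 0.8, 400_000, 0.40),
            (1_000_000, 0.20, 0.5,  80_000, 0.20),
        ]

        expected_loss = sum(c * p * d for c, p, d, _, _ in assets)
        expected_loss_mitigated = sum(c * p * dm for c, p, _, _, dm in assets)
        mitigation_cost = sum(m for _, _, _, m, _ in assets)
        print(f"expected scenario loss:      ${expected_loss:,.0f}")
        print(f"  ... with mitigation:       ${expected_loss_mitigated:,.0f}")
        print(f"mitigation cost vs benefit:  ${mitigation_cost:,.0f} vs ${expected_loss - expected_loss_mitigated:,.0f}")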

  9. Maximum magnitude (Mmax) in the central and eastern United States for the 2014 U.S. Geological Survey Hazard Model

    USGS Publications Warehouse

    Wheeler, Russell L.

    2016-01-01

    Probabilistic seismic‐hazard assessment (PSHA) requires an estimate of Mmax, the moment magnitude M of the largest earthquake that could occur within a specified area. Sparse seismicity hinders Mmax estimation in the central and eastern United States (CEUS) and tectonically similar regions worldwide (stable continental regions [SCRs]). A new global catalog of moderate‐to‐large SCR earthquakes is analyzed with minimal assumptions about enigmatic geologic controls on SCR Mmax. An earlier observation that SCR earthquakes of M 7.0 and larger occur in young (250–23 Ma) passive continental margins and associated rifts but not in cratons is not strongly supported by the new catalog. SCR earthquakes of M 7.5 and larger are slightly more numerous and reach slightly higher M in young passive margins and rifts than in cratons. However, overall histograms of M from young margins and rifts and from cratons are statistically indistinguishable. This conclusion is robust under uncertainties in M, the locations of SCR boundaries, and which of two available global SCR catalogs is used. The conclusion stems largely from recent findings that (1) large southeast Asian earthquakes once thought to be SCR were in actively deforming crust and (2) long escarpments in cratonic Australia were formed by prehistoric faulting. The 2014 seismic‐hazard model of the U.S. Geological Survey represents CEUS Mmax as four‐point probability distributions. The distributions have weighted averages of M 7.0 in cratons and M 7.4 in passive margins and rifts. These weighted averages are consistent with Mmax estimates of other SCR PSHAs of the CEUS, southeastern Canada, Australia, and India.

  11. A dynamic approach for the impact of a toxic gas dispersion hazard considering human behaviour and dispersion modelling.

    PubMed

    Lovreglio, Ruggiero; Ronchi, Enrico; Maragkos, Georgios; Beji, Tarek; Merci, Bart

    2016-11-15

    The release of toxic gases due to natural/industrial accidents or terrorist attacks in populated areas can have tragic consequences. To prevent and evaluate the effects of these disasters, different approaches and modelling tools have been introduced in the literature. These instruments are valuable tools for risk managers doing risk assessment of threatened areas. Despite the significant improvements in hazard assessment in case of toxic gas dispersion, these analyses do not generally include the impact of human behaviour and people movement during emergencies. This work aims at providing an approach which considers both modelling of gas dispersion and evacuation movement in order to improve the accuracy of risk assessment for disasters involving toxic gases. The approach is applied to a hypothetical scenario including a ship releasing nitrogen dioxide (NO2) on a crowd attending a music festival. The difference between the results obtained with existing static methods (people do not move) and a dynamic approach (people move away from the danger) which considers people movement with different degrees of sophistication (either a simple linear path or more complex behavioural modelling) is discussed. PMID:27343142
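
    The gap between static and dynamic exposure estimates can be illustrated with a much simpler stand-in for the CFD field: the sketch below integrates the dose received by a receptor that stays put versus one that walks crosswind away from a ground-level Gaussian plume; all source, dispersion and movement parameters are invented for illustration.

        import numpy as np

        def plume_concentration(x, y, q=50.0, u=3.0):
            """Ground-level concentration (kg/m^3) of a Gaussian plume with the source at the
            origin and wind along +x; dispersion coefficients grow linearly with distance.
            A crude stand-in for the CFD concentration field used in the study."""
            x = np.maximum(x, 1.0)                   # avoid the singularity at the source
            sigma_y, sigma_z = 0.08 * x, 0.06 * x
            return (q / (np.pi * u * sigma_y * sigma_z)) * np.exp(-0.5 * (y / sigma_y) ** 2)

        dt, t_end, walk_speed = 1.0, 600.0, 1.3      # s, s, m/s
        start = np.array([300.0, 10.0])              # receptor position downwind of the release

        dose_static, dose_moving = 0.0, 0.0
        pos = start.copy()
        for _ in range(int(t_end / dt)):
            dose_static += plume_concentration(*start) * dt
            dose_moving += plume_concentration(*pos) * dt
            pos[1] += walk_speed * dt                # the moving receptor walks crosswind, away from the plume axis
        print(f"static receptor dose: {dose_static:.3e} kg s/m^3")
        print(f"moving receptor dose: {dose_moving:.3e} kg s/m^3")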

  12. Coupling photogrammetric data with DFN-DEM model for rock slope hazard assessment

    NASA Astrophysics Data System (ADS)

    Donze, Frederic; Scholtes, Luc; Bonilla-Sierra, Viviana; Elmouttie, Marc

    2013-04-01

    Structural and mechanical analyses of rock masses are key components of rock slope stability assessment. The complementary use of photogrammetric techniques [Poropat, 2001] and coupled DFN-DEM models [Harthong et al., 2012] provides a methodology that can be applied to complex 3D configurations. The DFN-DEM formulation [Scholtès & Donzé, 2012a,b] has been chosen for modeling since it can explicitly take into account the fracture sets. Analyses conducted in 3D can produce very complex and unintuitive failure mechanisms. Therefore, a modeling strategy must be established in order to identify the key features which control the stability. For this purpose, a realistic case is presented to show the overall methodology from the photogrammetry acquisition to the mechanical modeling. By combining Sirovision and YADE Open DEM [Kozicki & Donzé, 2008, 2009], it can be shown that even for large camera-to-rock-slope ranges (tested at about one kilometer), the accuracy of the data is sufficient to assess the role of the structures in the stability of a jointed rock slope. In this case, on-site stereo pairs of 2D images were taken to create 3D surface models. Then, digital identification of structural features on the unstable block zone was performed with the Sirojoint software [Sirovision, 2010]. After acquiring the numerical topography, the 3D digitalized and meshed surface was imported into the YADE Open DEM platform to define the studied rock mass as a closed (manifold) volume serving as the bounding volume for numerical modeling. The discontinuities were then imported into the model as meshed planar elliptic surfaces. The model was then submitted to gravity loading. During this step, high values of cohesion were assigned to the discontinuities in order to avoid failure or block displacements triggered by inertial effects. To assess the respective role of the pre-existing discontinuities in the block stability, different configurations have been tested, as well as different degrees of

  13. A Nonparametric Bayesian Approach to Seismic Hazard Modeling Using the ETAS Framework

    NASA Astrophysics Data System (ADS)

    Ross, G.

    2015-12-01

    The epidemic-type aftershock sequence (ETAS) model is one of the most popular tools for modeling seismicity and quantifying risk in earthquake-prone regions. Under the ETAS model, the occurrence times of earthquakes are treated as a self-exciting Poisson process where each earthquake briefly increases the probability of subsequent earthquakes occurring soon afterwards, which captures the fact that large mainshocks tend to produce long sequences of aftershocks. A triggering kernel controls the amount by which the probability increases based on the magnitude of each earthquake, and the rate at which it then decays over time. This triggering kernel is usually chosen heuristically, to match the parametric form of the modified Omori law for aftershock decay. However, recent work has questioned whether this is an appropriate choice. Since the choice of kernel has a large impact on the predictions made by the ETAS model, avoiding misspecification is crucially important. We present a novel nonparametric version of ETAS which avoids making parametric assumptions, and instead learns the correct specification from the data itself. Our approach is based on the Dirichlet process, which is a modern class of Bayesian prior distribution which allows for efficient inference over an infinite dimensional space of functions. We show how our nonparametric ETAS model can be fit to data, and present results demonstrating that the fit is greatly improved compared to the standard parametric specification. Additionally, we explain how our model can be used to perform probabilistic declustering of earthquake catalogs, to classify earthquakes as being either aftershocks or mainshocks, and to learn the causal relations between pairs of earthquakes.
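
    For reference, the standard parametric ETAS conditional intensity (modified Omori triggering kernel) that the nonparametric version replaces can be written down in a few lines; the parameter values and toy catalogue below are hypothetical.

        import numpy as np

        def etas_intensity(t, event_times, event_mags, mu=0.2, K=0.05,
                           alpha=1.0, c=0.01, p=1.1, m0=3.0):
            """Conditional intensity lambda(t) = mu + sum over past events of
            K * exp(alpha * (m_i - m0)) * (t - t_i + c)**(-p)."""
            past = event_times < t
            dt = t - event_times[past]
            return mu + np.sum(K * np.exp(alpha * (event_mags[past] - m0)) * (dt + c) ** (-p))

        # toy catalogue: a mainshock at t = 10 days (M 6.5) followed by two aftershocks
        times = np.array([10.0, 10.3, 11.0])
        mags = np.array([6.5, 4.2, 4.8])
        for t in (10.5, 12.0, 20.0):
            print(t, etas_intensity(t, times, mags))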

  14. Monitoring and forecasting of hazardous hydrometeorological phenomena on the basis of conjuctive use of remote sensing data and the results of numerical modeling

    NASA Astrophysics Data System (ADS)

    Voronov, Nikolai; Dikinis, Alexandr

    2015-04-01

    Modern remote sensing (RS) technologies open wide opportunities for monitoring and for increasing the accuracy and lead time of forecasts of hazardous hydrometeorological phenomena. RS data do not supersede ground-based observations, but they make it possible to solve new problems in hydrological and meteorological monitoring and forecasting. In particular, the data of satellite, aviation or radar observations may be used to increase the spatial and temporal resolution of hydrometeorological observations. What also seems very promising is the conjunctive use of remote sensing data, ground-based observations and the output of hydrodynamical weather models, which allows a significant increase in the accuracy and lead time of forecasts of hazardous hydrometeorological phenomena. Modern technologies for monitoring and forecasting hazardous hydrometeorological phenomena on the basis of the conjunctive use of satellite, aviation and ground-based observations, as well as the output data of hydrodynamical weather models, are considered. It is noted that an important and promising monitoring method is bioindication - surveillance of the response of the biota to external influences and of the behaviour of animals that are able to sense impending natural disasters. Implementation of the described approaches makes it possible to significantly reduce both the damage caused by particular hazardous hydrological and meteorological phenomena and the general level of hydrometeorological vulnerability of various facilities and of the RF economy as a whole.

  15. Citizens' Perceptions of Flood Hazard Adjustments: An Application of the Protective Action Decision Model

    ERIC Educational Resources Information Center

    Terpstra, Teun; Lindell, Michael K.

    2013-01-01

    Although research indicates that adoption of flood preparations among Europeans is low, only a few studies have attempted to explain citizens' preparedness behavior. This article applies the Protective Action Decision Model (PADM) to explain flood preparedness intentions in the Netherlands. Survey data (N = 1,115) showed that…

  16. Life-Stage Physiologically-Based Pharmacokinetic (PBPK) Model Applications to Screen Environmental Hazards.

    EPA Science Inventory

    This presentation discusses methods used to extrapolate from in vitro high-throughput screening (HTS) toxicity data for an endocrine pathway to in vivo for early life stages in humans, and the use of a life stage PBPK model to address rapidly changing physiological parameters. A...

  17. Estimation of Wildlife Hazard Levels Using Interspecies Correlation Models and Standard Laboratory Rodent Toxicity Data

    EPA Science Inventory

    Toxicity data from laboratory rodents are widely available and frequently used in human health assessments as an animal model. We explore the possibility of using single rodent acute toxicity values to predict chemical toxicity to a diversity of wildlife species and to estimate ...

  18. Development of Algal Interspecies Correlation Estimation Models for Chemical Hazard Assessment

    EPA Science Inventory

    Web-based Interspecies Correlation Estimation (ICE) is an application developed to predict the acute toxicity of a chemical from 1 species to another taxon. Web-ICE models use the acute toxicity value for a surrogate species to predict effect values for other species, thus potent...

  19. Applying Distributed, Coupled Hydrological Slope-Stability Models for Landslide Hazard Assessments

    NASA Astrophysics Data System (ADS)

    Godt, J. W.; Baum, R. L.; Lu, N.; Savage, W. Z.; McKenna, J. P.

    2006-12-01

    Application of distributed, coupled hydrological slope-stability models requires knowledge of hydraulic and material-strength properties at the scale of landslide processes. We describe results from a suite of laboratory and field tests that were used to define the soil-water characteristics of landslide-prone colluvium on the steep coastal bluffs in the Seattle, Washington area and then use these results in a coupled model. Many commonly used tests to determine soil-water characteristics are performed for the drying process. Because most soils display a pronounced hysteresis in the relation between moisture content and matric suction, results from such tests may not accurately describe the soil-water characteristics for the wetting process during rainfall infiltration. Open-tube capillary-rise and constant-flow permeameter tests on bluff colluvium were performed in the laboratory to determine the soil-water characteristic curves (SWCC) and unsaturated hydraulic conductivity functions (HCF) for the wetting process. Field tests using a borehole permeameter were used to determine the saturated hydraulic conductivity of colluvial materials. Measurements of pore-water response to rainfall were used in an inverse numerical modeling procedure to determine the in-situ hydraulic parameters of hillside colluvium at the scale of the instrument installation. Comparison of laboratory and field results shows that although both techniques generally produce SWCCs and HCFs with similar shapes, differences in bulk density among field and lab tests yield differences in saturated moisture content and saturated hydraulic conductivity. We use these material properties in an application of a new version of a distributed transient slope stability model (TRIGRS) that accounts for the effects of the unsaturated zone on the infiltration process. Applied over a LiDAR-based digital landscape of part of the Seattle area for an hourly rainfall history known to trigger shallow landslides, the
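
    The stability side of such coupled models is typically an infinite-slope factor of safety driven by the transient pressure head supplied by the infiltration model; a minimal sketch in that spirit (following the commonly used Iverson-type form, with hypothetical soil parameters rather than the Seattle colluvium values) is given below.

        import numpy as np

        def factor_of_safety(slope_deg, depth, psi, cohesion=4000.0, phi_deg=33.0,
                             gamma_s=19000.0, gamma_w=9810.0):
            """Infinite-slope factor of safety at depth `depth` (m) for pressure head `psi`
            (m, positive below the water table, negative for suction):
            FS = tan(phi)/tan(beta) + [c - psi*gamma_w*tan(phi)] / (gamma_s*depth*sin(beta)*cos(beta)),
            with cohesion in N/m^2 and unit weights in N/m^3."""
            beta, phi = np.radians(slope_deg), np.radians(phi_deg)
            frictional = np.tan(phi) / np.tan(beta)
            cohesive = (cohesion - psi * gamma_w * np.tan(phi)) / (gamma_s * depth * np.sin(beta) * np.cos(beta))
            return frictional + cohesive

        # hypothetical colluvium on a 40-degree bluff with a failure surface 1.5 m deep
        for psi in (-0.5, 0.0, 0.75):     # suction, tension-saturated, positive pore pressure (m)
            print(psi, round(factor_of_safety(40.0, 1.5, psi), 2))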

  20. Tsunami hazard assessment along the French Mediterranean coast : detailed modeling of tsunami impacts for the ALDES project

    NASA Astrophysics Data System (ADS)

    Quentel, E.; Loevenbruck, A.; Hébert, H.

    2012-04-01

    The catastrophic 2004 tsunami drew the international community's attention to tsunami risk in all basins where tsunamis occurred but no warning system exists. Consequently, under the coordination of UNESCO, France decided to create a regional center, called CENALT, for the north-east Atlantic and the western Mediterranean. This warning system, which should be operational by 2012, is set up by the CEA in collaboration with the SHOM and the CNRS. The French authorities are in charge of the top-down alert system including the local alert dissemination. In order to prepare the appropriate means and measures, they initiated the ALDES (Alerte Descendante) project to which the CEA also contributes. It aims at examining along the French Mediterranean coast the tsunami risk related to earthquakes and landslides. In addition to the evaluation at regional scale, it includes the detailed studies of 3 selected sites; the local alert system will be designed for one of them : the French Riviera. In this project, our main task at CEA consists in assessing tsunami hazard related to seismic sources using numerical modeling. Past tsunamis have affected the west Mediterranean coast but are too few and poorly documented to provide a suitable database. Thus, a synthesis of earthquakes representative of the tsunamigenic seismic activity and prone to induce the largest impact to the French coast is performed based on historical data, seismotectonics and first order models. The North Africa Margin, the Ligurian and the South Tyrrhenian Seas are considered as the main tsunamigenic zones. In order to forecast the most important plausible effects, the magnitudes are estimated by enhancing to some extent the largest known values. Our hazard estimation is based on the simulation of the induced tsunamis scenarios performed with the CEA code. The 3 sites have been chosen according to the regional hazard studies, coastal typology elements and the appropriate DTMs (Digital Terrain Models). The

  1. Experimental and Numerical Modelling of CO2 Atmospheric Dispersion in Hazardous Gas Emission Sites.

    NASA Astrophysics Data System (ADS)

    Gasparini, A.; sainz Gracia, A. S.; Grandia, F.; Bruno, J.

    2015-12-01

    Under stable atmospheric conditions and/or in the presence of topographic depressions, CO2 concentrations can reach high values, resulting in lethal effects to living organisms. The distribution of denser-than-air gases released from the underground is governed by gravity, turbulence and dispersion. Once emitted, the gas distribution is initially driven by buoyancy and a gas cloud accumulates on the ground (gravitational phase); with time the density gradient becomes less important due to dispersion or mixing and the gas distribution is mainly governed by wind and atmospheric turbulence (passive dispersion phase). Natural analogues provide evidence of the impact of CO2 leakage. Dangerous CO2 concentrations in the atmosphere related to underground emissions have occasionally been reported, although the conditions favouring the persistence of such concentrations are barely studied. In this work, the dynamics of CO2 in the atmosphere after ground emission are assessed to quantify the potential risk. Two approaches have been followed: (1) direct measurement of air concentration in a natural emission site, where formation of a "CO2 lake" is common, and (2) numerical atmospheric modelling. Two sites with different morphology were studied: (a) the Cañada Real site, a flat terrain in the Volcanic Field of Campo de Calatrava (Spain); (b) the Solforata di Pomezia site, a rough terrain in the Alban Hills Volcanic Region (Italy). The comparison between field data and model calculations reveals that numerical dispersion models are capable of predicting the formation of CO2 accumulation over the ground as a consequence of underground gas emission. Therefore, atmospheric modelling could be included as a valuable methodology in the risk assessment of leakage in natural degassing systems and in CCS projects. Conclusions from this work provide clues on whether leakage may be a real risk for humans and under which conditions this risk needs to be included in the risk assessment.

  2. Uniting Mandelbrot’s Noah and Joseph Effects in Toy Models of Natural Hazard Time Series

    NASA Astrophysics Data System (ADS)

    Credgington, D.; Watkins, N. W.; Chapman, S. C.; Rosenberg, S. J.; Sanchez, R.

    2009-12-01

    The forecasting of extreme events is a highly topical, cross-disciplinary problem. One aspect which is potentially tractable even when the events themselves are stochastic is the probability of a "burst" of a given size and duration, defined as the area between a time series and a constant threshold. Many natural time series depart from the simplest, Brownian, case, and in the 1960s Mandelbrot developed the use of fractals to describe these departures. In particular he proposed two kinds of fractal model to capture the way in which natural data is often persistent in time (his "Joseph effect", common in hydrology and exemplified by fractional Brownian motion) and/or prone to heavy-tailed jumps (the "Noah effect", typical of economic index time series, for which he gave Levy flights as an exemplar). Much of the earlier modelling, however, has emphasised one of the Noah and Joseph parameters (the tail exponent mu, or one derived from the temporal behaviour such as the power spectral exponent beta) at the expense of the other. I will describe work [1] in which we applied a simple self-affine stable model, linear fractional stable motion (LFSM), which unifies both effects to better describe natural data, in this case from space physics. I will show how we have resolved some contradictions seen in earlier work, where purely Joseph or Noah descriptions had been sought. I will also show recent work [2] using numerical simulations of LFSM and simple analytic scaling arguments to study the problem of the area between a fractional Levy model time series and a threshold. [1] Watkins et al, Space Science Reviews [2005] [2] Watkins et al, Physical Review E [2009
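
    A minimal sketch of the burst statistic described above, i.e. the area of a contiguous excursion of a time series above a constant threshold together with its duration. The synthetic series here is a heavy-tailed random walk used only as a stand-in for linear fractional stable motion:

        # Minimal sketch: sizes and durations of "bursts", defined as contiguous
        # excursions of a time series above a constant threshold, with the burst
        # size taken as the area between the series and the threshold. The synthetic
        # series (a Student-t random walk) is only a heavy-tailed stand-in, not LFSM.
        import numpy as np

        rng = np.random.default_rng(0)
        x = np.cumsum(rng.standard_t(df=1.5, size=10_000))   # heavy-tailed increments
        threshold = np.percentile(x, 90)

        bursts = []
        above = x > threshold
        i = 0
        while i < len(x):
            if above[i]:
                j = i
                while j < len(x) and above[j]:
                    j += 1
                duration = j - i                              # samples spent above threshold
                size = np.sum(x[i:j] - threshold)             # area between series and threshold
                bursts.append((duration, size))
                i = j
            else:
                i += 1

        print(f"{len(bursts)} bursts; largest size = {max(s for _, s in bursts):.1f}")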

  3. Optimal Design in and Hazards of Linearization of Langmuir's Nonlinear Model.

    ERIC Educational Resources Information Center

    Harrison, Ferrin; Katti, S. K.

    Langmuir's model is studied for the situation where epsilon is independently and identically normally distributed. The "Y/x" versus "Y" plot had a 90% mid-range that did not contain the true curve in a vast portion of the range of "x". The "1/Y" versus "1/chi" plot had undefined expected values, and this problem worsens as sample size increases.…
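
    For reference, one common parameterization of Langmuir's model and the two linearizations discussed above (the exact algebraic form is an assumption, since the abstract does not reproduce it, and the abstract's "1/chi" presumably denotes 1/x). With additive normal error on Y, both transformations distort the error structure, and 1/Y has no finite expectation when the error places mass at Y near zero:

        \[
          Y \;=\; \frac{a b x}{1 + b x} + \varepsilon,
          \qquad \varepsilon \sim N(0, \sigma^2),
        \]
        \[
          \frac{Y}{x} \;=\; a b - b\,Y
          \quad\text{(the $Y/x$ versus $Y$ plot)},
          \qquad
          \frac{1}{Y} \;=\; \frac{1}{a} + \frac{1}{a b}\cdot\frac{1}{x}
          \quad\text{(the $1/Y$ versus $1/x$ plot)}.
        \]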

  4. Predictive modeling of hazardous waste landfill total above-ground biomass using passive optical and LIDAR remotely sensed data

    NASA Astrophysics Data System (ADS)

    Hadley, Brian Christopher

    This dissertation assessed remotely sensed data and geospatial modeling technique(s) to map the spatial distribution of total above-ground biomass present on the surface of the Savannah River National Laboratory's (SRNL) Mixed Waste Management Facility (MWMF) hazardous waste landfill. Ordinary least squares (OLS) regression, regression kriging, and tree-structured regression were employed to model the empirical relationship between in-situ measured Bahia (Paspalum notatum Flugge) and Centipede [Eremochloa ophiuroides (Munro) Hack.] grass biomass against an assortment of explanatory variables extracted from fine spatial resolution passive optical and LIDAR remotely sensed data. Explanatory variables included: (1) discrete channels of visible, near-infrared (NIR), and short-wave infrared (SWIR) reflectance, (2) spectral vegetation indices (SVI), (3) spectral mixture analysis (SMA) modeled fractions, (4) narrow-band derivative-based vegetation indices, and (5) LIDAR derived topographic variables (i.e. elevation, slope, and aspect). Results showed that a linear combination of the first- (1DZ_DGVI), second- (2DZ_DGVI), and third-derivative of green vegetation indices (3DZ_DGVI) calculated from hyperspectral data recorded over the 400--960 nm wavelengths of the electromagnetic spectrum explained the largest percentage of statistical variation (R2 = 0.5184) in the total above-ground biomass measurements. In general, the topographic variables did not correlate well with the MWMF biomass data, accounting for less than five percent of the statistical variation. It was concluded that tree-structured regression represented the optimum geospatial modeling technique due to a combination of model performance and efficiency/flexibility factors.
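
    A minimal sketch of a first-derivative green vegetation index of the kind listed among the explanatory variables. The dissertation's exact 1DZ_DGVI definition is not reproduced here; integrating the absolute first derivative of reflectance over the 400-960 nm range stated above is an assumption, and the spectrum is synthetic:

        # Minimal sketch of a first-derivative green vegetation index (1DZ_DGVI-style):
        # integrate the absolute first derivative of the reflectance spectrum over
        # the 400-960 nm range mentioned in the abstract. Illustrative only.
        import numpy as np

        def first_derivative_vi(wavelengths_nm, reflectance, lo=400.0, hi=960.0):
            """Approximate integral of |dR/dlambda| between lo and hi (nm)."""
            mask = (wavelengths_nm >= lo) & (wavelengths_nm <= hi)
            wl, r = wavelengths_nm[mask], reflectance[mask]
            dr_dl = np.gradient(r, wl)                 # first derivative of reflectance
            return np.trapz(np.abs(dr_dl), wl)

        # Toy spectrum: 5 nm bands with a synthetic red edge around 700-750 nm.
        wl = np.arange(400.0, 961.0, 5.0)
        refl = 0.05 + 0.45 / (1.0 + np.exp(-(wl - 720.0) / 15.0))
        print(f"first-derivative VI = {first_derivative_vi(wl, refl):.3f}")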

  5. Probabilistic seismic hazard in the San Francisco Bay area based on a simplified viscoelastic cycle model of fault interactions

    USGS Publications Warehouse

    Pollitz, F.F.; Schwartz, D.P.

    2008-01-01

    We construct a viscoelastic cycle model of plate boundary deformation that includes the effect of time-dependent interseismic strain accumulation, coseismic strain release, and viscoelastic relaxation of the substrate beneath the seismogenic crust. For a given fault system, time-averaged stress changes at any point (not on a fault) are constrained to zero; that is, kinematic consistency is enforced for the fault system. The dates of last rupture, mean recurrence times, and the slip distributions of the (assumed) repeating ruptures are key inputs into the viscoelastic cycle model. This simple formulation allows construction of stress evolution at all points in the plate boundary zone for purposes of probabilistic seismic hazard analysis (PSHA). Stress evolution is combined with a Coulomb failure stress threshold at representative points on the fault segments to estimate the times of their respective future ruptures. In our PSHA we consider uncertainties in a four-dimensional parameter space: the rupture periodicities, slip distributions, times of last earthquake (for prehistoric ruptures) and Coulomb failure stress thresholds. We apply this methodology to the San Francisco Bay region using a recently determined fault chronology of area faults. Assuming single-segment rupture scenarios, we find that future rupture probabilities of area faults in the coming decades are the highest for the southern Hayward, Rodgers Creek, and northern Calaveras faults. This conclusion is qualitatively similar to that of the Working Group on California Earthquake Probabilities, but the probabilities derived here are significantly higher. Given that fault rupture probabilities are highly model-dependent, no single model should be used to assess time-dependent rupture probabilities. We suggest that several models, including the present one, be used in a comprehensive PSHA methodology, as was done by the Working Group on California Earthquake Probabilities.

  6. Transient hazard model using radar data for predicting debris flows in Madison County, Virginia

    USGS Publications Warehouse

    Morrissey, M.M.; Wieczorek, G.F.; Morgan, B.A.

    2004-01-01

    During the rainstorm of June 27, 1995, roughly 330-750 mm of rain fell within a 16-hour period, initiating floods and over 600 debris flows in a small area (130 km2) of Madison County, VA. We developed a distributed version of Iverson's transient response model for regional slope stability analysis for the Madison County debris flows. This version of the model evaluates pore-pressure head response and factor of safety on a regional scale in areas prone to rainfall-induced shallow (<2-3 m) landslides. These calculations used soil properties of shear strength and hydraulic conductivity from laboratory measurements of soil samples collected from field sites where debris flows initiated. Rainfall data collected by radar every 6 minutes provided a basis for calculating the temporal variation of slope stability during the storm. The results demonstrate that the spatial and temporal variation of the factor of safety correlates with the movement of the storm cell. When the rainstorm was treated as two separate rainfall events and a larger hydraulic conductivity and friction angle than the laboratory values were used, the timing and location of landslides predicted by the model were in closer agreement with eyewitness observations of debris flows. Application of spatially variable initial pre-storm water table depth and soil properties may improve both the spatial and temporal prediction of instability.
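
    A minimal sketch of the infinite-slope factor-of-safety calculation that underlies transient-response models of this kind, with the pore-pressure head entering the cohesion term (an Iverson-style form is assumed; parameter values are illustrative, not the Madison County laboratory measurements):

        # Minimal sketch: infinite-slope factor of safety with a transient pressure
        # head, in the spirit of Iverson's transient response model. Illustrative values.
        import numpy as np

        def factor_of_safety(slope_deg, depth_m, psi_m, c_kpa, phi_deg,
                             gamma_soil=20.0, gamma_w=9.81):
            """FS = tan(phi)/tan(a) + [c - psi*gamma_w*tan(phi)] / (gamma_s*Z*sin(a)*cos(a))."""
            a = np.radians(slope_deg)
            phi = np.radians(phi_deg)
            frictional = np.tan(phi) / np.tan(a)
            cohesive = (c_kpa - psi_m * gamma_w * np.tan(phi)) / (
                gamma_soil * depth_m * np.sin(a) * np.cos(a))
            return frictional + cohesive

        # Rising pore-pressure head during the storm lowers FS toward failure (FS < 1).
        for psi in (0.0, 0.5, 1.0):   # pressure head in metres at the failure depth
            print(psi, round(factor_of_safety(35.0, 1.5, psi, c_kpa=4.0, phi_deg=33.0), 2))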

  7. Proportional Reasoning and the Visually Impaired

    ERIC Educational Resources Information Center

    Hilton, Geoff; Hilton, Annette; Dole, Shelley L.; Goos, Merrilyn; O'Brien, Mia

    2012-01-01

    Proportional reasoning is an important aspect of formal thinking that is acquired during the developmental years that approximate the middle years of schooling. Students who fail to acquire sound proportional reasoning often experience difficulties in subjects that require quantitative thinking, such as science, technology, engineering, and…

  8. CCSSM Challenge: Graphing Ratio and Proportion

    ERIC Educational Resources Information Center

    Kastberg, Signe E.; D'Ambrosio, Beatriz S.; Lynch-Davis, Kathleen; Mintos, Alexia; Krawczyk, Kathryn

    2013-01-01

    A renewed emphasis was placed on ratio and proportional reasoning in the middle grades in the Common Core State Standards for Mathematics (CCSSM). The expectation for students includes the ability to not only compute and then compare and interpret the results of computations in context but also interpret ratios and proportions as they are…

  9. Working Memory Mechanism in Proportional Quantifier Verification

    ERIC Educational Resources Information Center

    Zajenkowski, Marcin; Szymanik, Jakub; Garraffa, Maria

    2014-01-01

    The paper explores the cognitive mechanisms involved in the verification of sentences with proportional quantifiers (e.g. "More than half of the dots are blue"). The first study shows that the verification of proportional sentences is more demanding than the verification of sentences such as: "There are seven blue and eight yellow…

  10. A combined M5P tree and hazard-based duration model for predicting urban freeway traffic accident durations.

    PubMed

    Lin, Lei; Wang, Qian; Sadek, Adel W

    2016-06-01

    The duration of freeway traffic accidents is an important factor, which affects traffic congestion, environmental pollution, and secondary accidents. Among previous studies, the M5P algorithm has been shown to be an effective tool for predicting incident duration. M5P builds a tree-based model, like the traditional classification and regression tree (CART) method, but with multiple linear regression models as its leaves. The problem with M5P for accident duration prediction, however, is that whereas linear regression assumes that the conditional distribution of accident durations is normally distributed, the distribution for a "time-to-an-event" is almost certainly nonsymmetrical. A hazard-based duration model (HBDM) is a better choice for this kind of "time-to-event" modeling scenario, and given this, HBDMs have previously been applied to analyze and predict traffic accident durations. Previous research, however, has not applied HBDMs for accident duration prediction in association with clustering or classification of the dataset to minimize data heterogeneity. The current paper proposes a novel approach for accident duration prediction, which improves on the original M5P tree algorithm through the construction of an M5P-HBDM model, in which the leaves of the M5P tree model are HBDMs instead of linear regression models. Such a model offers the advantage of minimizing data heterogeneity through dataset classification, and avoids the need for the incorrect assumption of normality for traffic accident durations. The proposed model was then tested on two freeway accident datasets. For each dataset, the first 500 records were used to train the following three models: (1) an M5P tree; (2) an HBDM; and (3) the proposed M5P-HBDM, and the remainder of the data were used for testing. The results show that the proposed M5P-HBDM managed to identify more significant and meaningful variables than either M5P or HBDMs. Moreover, the M5P-HBDM had the lowest overall mean
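
    A minimal sketch of a parametric hazard-based duration model for incident durations, here a Weibull distribution fitted by maximum likelihood; the paper's exact HBDM specification and covariates are not reproduced, and the durations below are synthetic:

        # Minimal sketch: a parametric hazard-based duration model for incident
        # durations (Weibull fit by maximum likelihood). Durations are synthetic.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)
        durations_min = rng.weibull(1.4, size=500) * 45.0   # synthetic durations (minutes)

        # Fit a two-parameter Weibull (location fixed at 0).
        shape, loc, scale = stats.weibull_min.fit(durations_min, floc=0.0)

        def hazard(t):
            """Hazard rate h(t) = f(t) / S(t) of the fitted Weibull."""
            f = stats.weibull_min.pdf(t, shape, loc, scale)
            s = stats.weibull_min.sf(t, shape, loc, scale)
            return f / s

        print(f"shape={shape:.2f}, scale={scale:.1f} min, h(30)={hazard(30.0):.4f} per min")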

  11. Predicting the hazardous dose of industrial chemicals in warm-blooded species using machine learning-based modelling approaches.

    PubMed

    Gupta, S; Basant, N; Singh, K P

    2015-06-01

    The hazardous dose of a chemical (HD50) is an emerging and acceptable test statistic for the safety/risk assessment of chemicals. Since it is derived using the experimental toxicity values of the chemical in several test species, it is highly cumbersome, time and resource intensive. In this study, three machine learning-based QSARs were established for predicting the HD50 of chemicals in warm-blooded species following the OECD guidelines. A data set comprising HD50 values of 957 chemicals was used to develop SDT, DTF and DTB QSAR models. The diversity in chemical structures and nonlinearity in the data were verified. Several validation coefficients were derived to test the predictive and generalization abilities of the constructed QSARs. The chi-path descriptors were identified as the most influential in all three QSARs. The DTF and DTB performed relatively better than the SDT model and yielded r(2) values of 0.928 and 0.959 between the measured and predicted HD50 values in the complete data set. Substructure alerts responsible for the toxicity of the chemicals were identified. The results suggest the appropriateness of the developed QSARs for reliably predicting the HD50 values of chemicals, and they can be used for screening of new chemicals for their safety/risk assessment for regulatory purposes.
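
    A minimal sketch of tree-ensemble QSAR regression of a toxicity endpoint on molecular descriptors; scikit-learn's random forest and gradient boosting are used here only as generic stand-ins for the ensemble tree methods named in the abstract, and the descriptor matrix is synthetic rather than real chemical data:

        # Minimal sketch: tree-based QSAR regression of a toxicity endpoint (e.g. HD50)
        # on molecular descriptors, with a synthetic descriptor matrix.
        import numpy as np
        from sklearn.ensemble import RandomForestRegressor, GradientBoostingRegressor
        from sklearn.model_selection import train_test_split
        from sklearn.metrics import r2_score

        rng = np.random.default_rng(7)
        X = rng.normal(size=(957, 20))                      # 957 chemicals x 20 descriptors (synthetic)
        y = 1.5 * X[:, 0] - X[:, 3] ** 2 + 0.3 * rng.normal(size=957)   # synthetic endpoint

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
        for name, model in [("forest", RandomForestRegressor(random_state=0)),
                            ("boosted", GradientBoostingRegressor(random_state=0))]:
            model.fit(X_tr, y_tr)
            print(name, round(r2_score(y_te, model.predict(X_te)), 3))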

  12. Mapping hazard from urban non-point pollution: a screening model to support sustainable urban drainage planning.

    PubMed

    Mitchell, Gordon

    2005-01-01

    Non-point sources of pollution are difficult to identify and control, and are one of the main reasons that urban rivers fail to reach the water quality objectives set for them. Whilst sustainable drainage systems (SuDS) are available to help combat this diffuse pollution, they are mostly installed in areas of new urban development. However, SuDS must also be installed in existing built areas if diffuse loadings are to be reduced. Advice on where best to locate SuDS within existing built areas is limited, hence a semi-distributed stochastic GIS-model was developed to map small-area basin-wide loadings of 18 key stormwater pollutants. Load maps are combined with information on surface water quality objectives to permit mapping of diffuse pollution hazard to beneficial uses of receiving waters. The model thus aids SuDS planning and strategic management of urban diffuse pollution. The identification of diffuse emission 'hot spots' within a water quality objectives framework is consistent with the 'combined' (risk assessment) approach to pollution control advocated by the EU Water Framework Directive. PMID:15572076

  13. Prevention strategy for vibration hazards by portable power tools, national forest model of comprehensive prevention system in Japan.

    PubMed

    Yamada, S; Sakakibara, H

    1998-04-01

    In the 1950s, the introduction of portable power tools into the production process of many industries began on a large scale around the world and resulted in many cases of occupational vibration syndrome after the 1960s. There was an urgent worldwide need to undertake preventive steps, medical assessment and therapy. At the end of 1964, our investigation began in Japanese national forests, and then in mines and stone quarries. The Japanese Association of Industrial Hygiene established a "Committee for Local Vibration Hazards" (1965), and many researchers in the medical and technological fields joined this Committee. After 10 years, a comprehensive system for the prevention of vibration syndrome was established in the national forestry. It consists of 1) improvements in vibrating tools, 2) hygienic regulation of operation time with an alternative working system, 3) a health care system involving early medical checks, early therapy and age limitations in operation of vibrating tools, 4) protection against cold in the workplace and while commuting, and 5) education and training for health and safety. The prevention strategy for vibration syndrome in our national forests is to establish a comprehensive prevention system in cooperation among researchers in the medical and technological fields, workers and administration. The Ministry of Labor presented that strategy as a good model of prevention for other industries (1976). New designs for this model were developed and adapted according to the special conditions of each industry. Thus a comprehensive system for the prevention of vibration syndrome developed successfully from the late 1970s to the 1980s in Japan.

  14. Vulnerability of the Dover Strait to coseismic tsunami hazards: insights from numerical modelling

    NASA Astrophysics Data System (ADS)

    Roger, J.; Gunnell, Y.

    2012-02-01

    On 1580 April 6, a large earthquake shook the eastern English Channel and its shores, with numerous casualties and significant destruction documented. Some reports suggest that it was followed by a tsunami. Meanwhile, earthquake magnitudes of MW= 7 have been deemed possible on intraplate fault systems in neighbouring Benelux. This study aims to determine the possibility of a MW > 5.5 magnitude earthquake generating a tsunami in the Dover Strait, one of the world's busiest seaways. In a series of numerical models focusing on sensitivity analysis, earthquake source parameters for the Dover Strait are constrained by palaeoseismological evidence and historical accounts, producing maps of wave heights and analysis of frequencies based on six strategically located virtual tide gauges. Of potential concern to engineering geologists, a maximum credible scenario is also tested for MW= 6.9. For earthquakes with MW of 5.5, none of the fault models we tested produced a tsunami on neighbouring shores. However, for earthquakes with MW 6.9, both extensional and thrusting events produced tsunami waves with open-water amplitudes of up to 1.5 m, and higher amplitudes might be expected in regions where waves are amplified by regional nearshore bathymetry. Sensitivity to parameter choice is emphasized but a pattern of densely inhabited coastal hotspots liable to tsunami-related damage because of bathymetric forcing factors is consistently obtained.

  15. The Prospect of using Three-Dimensional Earth Models To Improve Nuclear Explosion Monitoring and Ground Motion Hazard Assessment

    SciTech Connect

    Zucca, J J; Walter, W R; Rodgers, A J; Richards, P; Pasyanos, M E; Myers, S C; Lay, T; Harris, D; Antoun, T

    2008-11-19

    The last ten years have brought rapid growth in the development and use of three-dimensional (3D) seismic models of Earth structure at crustal, regional and global scales. In order to explore the potential for 3D seismic models to contribute to important societal applications, Lawrence Livermore National Laboratory (LLNL) hosted a 'Workshop on Multi-Resolution 3D Earth Models to Predict Key Observables in Seismic Monitoring and Related Fields' on June 6 and 7, 2007 in Berkeley, California. The workshop brought together academic, government and industry leaders in the research programs developing 3D seismic models and methods for the nuclear explosion monitoring and seismic ground motion hazard communities. The workshop was designed to assess the current state of work in 3D seismology and to discuss a path forward for determining if and how 3D Earth models and techniques can be used to achieve measurable increases in our capabilities for monitoring underground nuclear explosions and characterizing seismic ground motion hazards. This paper highlights some of the presentations, issues, and discussions at the workshop and proposes two specific paths by which to begin quantifying the potential contribution of progressively refined 3D seismic models in critical applied arenas. Seismic monitoring agencies are tasked with detection, location, and characterization of seismic activity in near real time. In the case of nuclear explosion monitoring or seismic hazard, decisions to further investigate a suspect event or to launch disaster relief efforts may rely heavily on real-time analysis and results. Because these are weighty decisions, monitoring agencies are regularly called upon to meticulously document and justify every aspect of their monitoring system. In order to meet this level of scrutiny and maintain operational robustness requirements, only mature technologies are considered for operational monitoring systems, and operational technology necessarily lags

  16. New Time-independent and Time-dependent Seismic Source Models for the Calabria Region (Italy) for the Probabilistic Seismic Hazard Maps

    NASA Astrophysics Data System (ADS)

    Akinci, Aybige; Burrato, Pierfrancesco; Falcone, Giuseppe; Mariucci, Maria Teresa; Murru, Maura; Tiberti, Mara Monica; Vannoli, Paola

    2015-04-01

    The present study is carried out in the framework of the S2-2014 COBAS Project "Constraining Observations into Seismic Hazard", co-funded by the Civil Protection Department of the Presidency of Council of Ministers (DPC) within the general agreement DPC-INGV for the period 2012-2021. The two areas identified as priority areas in the first phase of the activities by the 2012-2021 Agreement DPC-INGV, namely the Po Plain and the Southern Apennines from the Molise-Lazio to the Basilicata-Calabria borders, require different strategies for calculating "the best seismic hazard". In this study we develop new time-independent and time-dependent seismic source models for the Calabria region starting from the new version of the DISS (Database of Individual Seismogenic Sources). This version of the DISS database contains remarkable new data and information on the seismogenic sources and their parameterizations in the Calabria region. The probability of earthquake occurrence is calculated by developing models of seismicity-derived hazard sources, and models of earthquakes on faults/seismogenic sources. Four different classes of earthquake source models are developed for inclusion in the PSHA maps: (1) shallow crustal background seismicity; (2) a special zone that accounts for deep background seismicity (many earthquakes deeper than 30 kilometers occur beneath the Calabrian Arc and may have caused considerable damage in the Calabria region; these earthquakes have different ground-motion properties than shallow earthquakes); (3) uniform background source zones; (4) finite faults/seismogenic sources as defined in the previous activity. The first three models are based on the earthquake catalog and characterize the hazard from earthquakes Mw>4.7. In most cases, the faults contribute most to the hazard for earthquakes larger than Mw5.5. The earthquake occurrence on the faults is modeled both as a time-independent Poisson process and by introducing the various renewal
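
    A minimal sketch of the contrast between the two occurrence models named above: a 30-year rupture probability under a time-independent Poisson model versus a simple renewal model, conditioned on the time elapsed since the last rupture. A Brownian passage time (inverse Gaussian) recurrence distribution is assumed here, and the parameter values are illustrative, not Calabrian fault parameters:

        # Minimal sketch: 30-year rupture probability under (a) a time-independent
        # Poisson model and (b) a Brownian passage time (inverse Gaussian) renewal
        # model conditioned on the elapsed time since the last earthquake.
        import numpy as np
        from scipy import stats

        mean_recurrence = 500.0     # years (illustrative)
        elapsed = 300.0             # years since the last rupture (illustrative)
        window = 30.0               # forecast window, years

        # (a) Poisson: probability does not depend on elapsed time.
        p_poisson = 1.0 - np.exp(-window / mean_recurrence)

        # (b) Renewal: BPT with mean mu and aperiodicity alpha, via scipy's invgauss
        # parameterization (shape = alpha**2, scale = mu / alpha**2).
        alpha = 0.5
        dist = stats.invgauss(mu=alpha**2, scale=mean_recurrence / alpha**2)
        p_renewal = (dist.cdf(elapsed + window) - dist.cdf(elapsed)) / dist.sf(elapsed)

        print(f"Poisson: {p_poisson:.3f}   renewal (BPT): {p_renewal:.3f}")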

  17. Potential hazards to embryo implantation: A human endometrial in vitro model to identify unwanted antigestagenic actions of chemicals

    SciTech Connect

    Fischer, L.; Deppert, W.R.; Pfeifer, D.; Stanzel, S.; Weimer, M.; Hanjalic-Beck, A.; Stein, A.; Straßer, M.; Zahradnik, H.P.; Schaefer, W.R.

    2012-05-01

    Embryo implantation is a crucial step in human reproduction and depends on the timely development of a receptive endometrium. The human endometrium is unique among adult tissues due to its dynamic alterations during each menstrual cycle. It hosts the implantation process which is governed by progesterone, whereas 17β-estradiol regulates the preceding proliferation of the endometrium. The receptors for both steroids are targets for drugs and endocrine disrupting chemicals. Chemicals with unwanted antigestagenic actions are potentially hazardous to embryo implantation since many pharmaceutical antiprogestins adversely affect endometrial receptivity. This risk can be addressed by human tissue-specific in vitro assays. As working basis we compiled data on chemicals interacting with the PR. In our experimental work, we developed a flexible in vitro model based on human endometrial Ishikawa cells. Effects of antiprogestin compounds on pre-selected target genes were characterized by sigmoidal concentration–response curves obtained by RT-qPCR. The estrogen sulfotransferase (SULT1E1) was identified as the most responsive target gene by microarray analysis. The agonistic effect of progesterone on SULT1E1 mRNA was concentration-dependently antagonized by RU486 (mifepristone) and ZK137316 and, with lower potency, by 4-nonylphenol, bisphenol A and apigenin. The negative control methyl acetoacetate showed no effect. The effects of progesterone and RU486 were confirmed on the protein level by Western blotting. We demonstrated proof of principle that our Ishikawa model is suitable to study quantitatively effects of antiprogestin-like chemicals on endometrial target genes in comparison to pharmaceutical reference compounds. This test is useful for hazard identification and may contribute to reduce animal studies. -- Highlights: ► We compare progesterone receptor-mediated endometrial effects of chemicals and drugs. ► 4-Nonylphenol, bisphenol A and apigenin exert weak
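
    A minimal sketch of fitting a sigmoidal (four-parameter logistic) concentration-response curve of the kind used above to characterize antiprogestin effects on a target gene; the data points are synthetic, not the RT-qPCR measurements from the study:

        # Minimal sketch: four-parameter logistic concentration-response fit.
        import numpy as np
        from scipy.optimize import curve_fit

        def four_pl(conc, bottom, top, ec50, hill):
            """Four-parameter logistic response as a function of concentration."""
            return bottom + (top - bottom) / (1.0 + (conc / ec50) ** hill)

        conc = np.array([1e-9, 1e-8, 1e-7, 1e-6, 1e-5, 1e-4])    # mol/L
        resp = np.array([0.98, 0.95, 0.80, 0.45, 0.15, 0.08])    # relative mRNA level (synthetic)

        params, _ = curve_fit(four_pl, conc, resp, p0=[0.05, 1.0, 1e-6, 1.0])
        print(f"EC50 = {params[2]:.2e} M, Hill slope = {params[3]:.2f}")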

  18. Coastal Digital Elevation Models (DEMs) for tsunami hazard assessment on the French coasts

    NASA Astrophysics Data System (ADS)

    Maspataud, Aurélie; Biscara, Laurie; Hébert, Hélène; Schmitt, Thierry; Créach, Ronan

    2015-04-01

    Building precise and up-to-date coastal DEMs is a prerequisite for accurate modeling and forecasting of hydrodynamic processes at local scale. Marine flooding, originating from tsunamis, storm surges or waves, is one of them. Some high resolution DEMs are being generated for multiple coast configurations (gulf, embayment, strait, estuary, harbor approaches, low-lying areas…) along French Atlantic and Channel coasts. This work is undertaken within the framework of the TANDEM project (Tsunamis in the Atlantic and the English ChaNnel: Definition of the Effects through numerical Modeling) (2014-2017). DEM boundaries were defined considering the vicinity of French civil nuclear facilities, site-effect considerations and potential tsunamigenic sources. Those were identified from available historical observations. Seamless integrated topographic and bathymetric coastal DEMs will be used by institutions taking part in the study to simulate expected wave height at regional and local scale on the French coasts, for a set of defined scenarios. The main tasks were (1) the development of a new DEM production capacity, (2) the release of high-resolution, high-precision digital field models referenced to vertical reference frameworks, which require (3) horizontal and vertical datum conversions (all source elevation data need to be transformed to a common datum), on the basis of (4) the building of (national and/or local) conversion grids of datum relationships based on known measurements. Challenges in coastal DEM development concern good practices throughout model development that can help minimize uncertainties. This is particularly true as scattered elevation data with variable density, from multiple sources (national hydrographic services, state and local government agencies, research organizations and private engineering companies) and from many different types (paper fieldsheets to be digitized, single beam echo sounder, multibeam sonar, airborne laser

  19. Evaluation of model-predicted hazardous air pollutants (HAPs) near a mid-sized U.S. airport

    NASA Astrophysics Data System (ADS)

    Vennam, Lakshmi Pradeepa; Vizuete, William; Arunachalam, Saravanan

    2015-10-01

    Accurate modeling of aircraft-emitted pollutants in the vicinity of airports is essential to study the impact on local air quality and to address policy- and health-impact-related questions. To quantify air quality impacts of airport-related hazardous air pollutants (HAPs), we carried out a fine-scale (4 × 4 km horizontal resolution) Community Multiscale Air Quality (CMAQ) model simulation at the T.F. Green airport in Providence (PVD), Rhode Island. We considered temporally and spatially resolved aircraft emissions from the new Aviation Environmental Design Tool (AEDT). These model predictions were then evaluated with observations from a field campaign focused on assessing HAPs near the PVD airport. The annual normalized mean error (NME) was in the range of 36-70% for all HAPs except acrolein (>70%). The addition of highly resolved aircraft emissions showed only marginally incremental improvements in performance (1-2% decrease in NME) for some HAPs (formaldehyde, xylene). When compared to a coarser 36 × 36 km grid resolution, the 4 × 4 km grid resolution did improve performance by up to 5-20% NME for formaldehyde and acetaldehyde. The change in power setting (from the traditional International Civil Aviation Organization (ICAO) 7% to the 4% suggested by observational studies) doubled the aircraft idling emissions of HAPs, but led to only a 2% decrease in NME. Overall modeled aircraft-attributable contributions are in the range of 0.5-28% near a mid-sized airport grid-cell, with maximum impacts seen only within 4-16 km from the airport grid-cell. Comparison of CMAQ predictions with HAP estimates from EPA's National Air Toxics Assessment (NATA) did show similar annual mean concentrations and equally poor performance. Current estimates of HAPs for PVD are a challenge for modeling systems, and refinements in our ability to simulate aircraft emissions have made only incremental improvements. Even with unrealistic increases in HAPs aviation emissions the model
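
    A minimal sketch of the normalized mean error statistic quoted above, using one common definition, NME = sum(|model - obs|) / sum(obs) x 100%; the concentration values below are synthetic:

        # Minimal sketch: normalized mean error (NME) between modeled and observed
        # concentrations. Values are synthetic, not the PVD field campaign data.
        import numpy as np

        def nme_percent(model, obs):
            model, obs = np.asarray(model, float), np.asarray(obs, float)
            return 100.0 * np.sum(np.abs(model - obs)) / np.sum(obs)

        obs = [0.42, 0.55, 0.31, 0.60]      # observed HAP concentrations (e.g. ug/m3)
        mod = [0.30, 0.70, 0.25, 0.95]      # modeled concentrations at the same sites
        print(f"NME = {nme_percent(mod, obs):.1f}%")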

  20. Improvement of ash plume monitoring, modeling and hazard assessment in the MED-SUV project

    NASA Astrophysics Data System (ADS)

    Coltelli, Mauro; Andronico, Daniele; Boselli, Antonella; Corradini, Stefano; Costa, Antonio; Donnadieu, Franck; Leto, Giuseppe; Macedonio, Giovanni; Merucci, Luca; Neri, Augusto; Pecora, Emilio; Prestifilippo, Michele; Scarlato, Piergiorgio; Scollo, Simona; Spinelli, Nicola; Spata, Gaetano; Taddeucci, Jacopo; Wang, Xuan; Zanmar Sanchez, Ricardo

    2014-05-01

    Volcanic ash clouds produced by explosive eruptions pose a serious problem for civil aviation, road transportation and other human activities. Etna volcano has produced more than 200 explosive eruptions of small and medium size in the last 35 years. The INGV, responsible for monitoring the volcano, has since 2006 developed a specific system for forecasting and monitoring Etna's volcanic ash plumes in collaboration with several national and international institutions. Between 12 January 2011 and 31 December 2013 Etna produced forty-six basaltic lava fountains. Every paroxysm produced an eruption column ranging from a few kilometers up to eleven kilometers in height above sea level. The ash clouds contaminated the controlled airspace (CTR) of the Catania and Reggio Calabria airports and caused tephra fallout on eastern Sicily, sometimes disrupting the operations of these airports. In order to give prompt and detailed warnings to the Aviation and Civil Protection authorities, ash plume monitoring at Osservatorio Etneo, the INGV department in Catania, is carried out using multispectral (from visible to infrared) satellite and ground-based video-surveillance images; seismic and infrasound signals processed in real time; a Doppler RADAR (Voldorad IIB) able to detect the eruption column in all weather conditions; and a LIDAR (AMPLE) for retrieving backscattering and depolarization values of the ash clouds. Forecasting is performed by running tephra dispersal models using weather forecast data, and then plotting results on maps published on a dedicated website. 24/7 Control Room operators were able to inform Aviation and Civil Protection operators in a timely manner for effective aviation safety management. A variety of multidisciplinary activities are planned in the MED-SUV project with reference to volcanic ash observations and studies. These include: 1) physical and analogue laboratory experiments on ash dispersal and aggregation; 2) integration of satellite data (e.g. METEOSAT, MODIS) and ground

  1. Proportion of recovered waterfowl bands reported

    USGS Publications Warehouse

    Geis, A.D.; Atwood, E.L.

    1961-01-01

    Data from the annual mail survey of waterfowl hunters in the United States were used to estimate the total numbers of banded waterfowl that were shot. These estimates were compared with Banding Office records to estimate the proportion of recovered bands that was reported. On the average, about two banded birds were recovered for each one reported. The proportion reported was higher for some areas and for some species than for others. The proportion reported was higher when more of the reports came through employees of conservation agencies.
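
    As a worked illustration of the reporting-rate arithmetic summarized above (only the roughly two-recovered-per-one-reported ratio comes from the abstract; the example counts are hypothetical):

        \[
          \widehat{p}_{\text{report}}
          \;=\; \frac{\text{bands reported}}{\text{bands estimated recovered}}
          \;\approx\; \tfrac{1}{2},
          \qquad
          \widehat{\text{recoveries}}
          \;=\; \frac{\text{bands reported}}{\widehat{p}_{\text{report}}}
          \quad\text{(e.g. } 150 / 0.5 = 300\text{)}.
        \]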

  2. Nuclear subsurface explosion modeling and hydrodynamic fragmentation simulation of hazardous asteroids

    NASA Astrophysics Data System (ADS)

    Premaratne, Pavithra Dhanuka

    Disruption and fragmentation of an asteroid using nuclear explosive devices (NEDs) is a highly complex yet practical solution to mitigating the impact threat of asteroids with short warning time. A Hypervelocity Asteroid Intercept Vehicle (HAIV) concept, developed at the Asteroid Deflection Research Center (ADRC), consists of a primary vehicle that acts as a kinetic impactor and a secondary vehicle that houses NEDs. The kinetic impactor (lead vehicle) strikes the asteroid, creating a crater. The secondary vehicle will immediately enter the crater and detonate its nuclear payload, creating a blast wave powerful enough to fragment the asteroid. Nuclear subsurface explosion modeling and hydrodynamic simulation has been a challenging research goal that paves the way for an array of mission-critical information. A mesh-free hydrodynamic simulation method, Smoothed Particle Hydrodynamics (SPH), was utilized to obtain both qualitative and quantitative solutions for explosion efficiency. Commercial fluid dynamics packages such as AUTODYN, along with in-house GPU-accelerated SPH algorithms, were used to validate and optimize high-energy explosion dynamics for a variety of test cases. Energy coupling from the NED to the target body was also examined to determine the effectiveness of
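
    A minimal sketch of the core SPH idea mentioned above: density at each particle as a kernel-weighted sum over neighbours, here with a standard 3D cubic spline kernel. This illustrates the mesh-free summation only and is not the ADRC or AUTODYN implementation:

        # Minimal sketch of SPH density summation with a 3D cubic spline kernel.
        import numpy as np

        def cubic_spline_w(r, h):
            """Standard 3D cubic spline kernel W(r, h)."""
            q = r / h
            sigma = 1.0 / (np.pi * h**3)
            w = np.where(q < 1.0, 1.0 - 1.5 * q**2 + 0.75 * q**3,
                np.where(q < 2.0, 0.25 * (2.0 - q)**3, 0.0))
            return sigma * w

        rng = np.random.default_rng(3)
        pos = rng.uniform(0.0, 1.0, size=(200, 3))      # particle positions (m)
        mass = np.full(200, 2.0)                        # particle masses (kg)
        h = 0.1                                         # smoothing length (m)

        # rho_i = sum_j m_j * W(|x_i - x_j|, h)
        r_ij = np.linalg.norm(pos[:, None, :] - pos[None, :, :], axis=-1)
        rho = (mass[None, :] * cubic_spline_w(r_ij, h)).sum(axis=1)
        print(f"mean SPH density = {rho.mean():.1f} kg/m^3")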