Sample records for hazard rate function

  1. Interval Estimation of Seismic Hazard Parameters

    NASA Astrophysics Data System (ADS)

    Orlecka-Sikora, Beata; Lasocki, Stanislaw

    2017-03-01

The paper considers Poisson temporal occurrence of earthquakes and presents a way to integrate the uncertainties of the estimates of the mean activity rate and of the magnitude cumulative distribution function into the interval estimation of the most widely used seismic hazard functions, such as the exceedance probability and the mean return period. The proposed algorithm can be used either when the Gutenberg-Richter model of magnitude distribution is accepted or when nonparametric estimation is in use. When the Gutenberg-Richter model is used, the interval estimation of its parameters is based on the asymptotic normality of the maximum likelihood estimator. When the nonparametric kernel estimation of magnitude distribution is used, we propose the iterated bias-corrected and accelerated method for interval estimation based on the smoothed bootstrap and second-order bootstrap samples. The changes that the integrated approach introduces into the interval estimation of the seismic hazard functions, relative to an approach that neglects the uncertainty of the mean activity rate estimates, were studied using Monte Carlo simulations and two real-data examples. The results indicate that the uncertainty of the mean activity rate significantly affects the interval estimates of the hazard functions only when the product of the activity rate and the time period for which the hazard is estimated is no more than 5.0. When this product exceeds 5.0, the impact of the uncertainty of the cumulative distribution function of magnitude dominates the impact of the uncertainty of the mean activity rate in the aggregated uncertainty of the hazard functions, and the interval estimates with and without inclusion of the uncertainty of the mean activity rate converge. The presented algorithm is generic and can also be applied to capture the propagation of uncertainty of estimates that are parameters of a multiparameter function onto this function.
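Under the Poisson occurrence model described above, the two hazard functions named in the abstract have closed forms: the exceedance probability is 1 − exp(−λT(1 − F(m))) and the mean return period is 1/(λ(1 − F(m))). A minimal sketch follows; the activity rate, time window, and Gutenberg-Richter parameters are illustrative placeholders, not values from the paper.

```python
import math

def exceedance_probability(activity_rate, t, magnitude_cdf, m):
    """P that at least one event with magnitude > m occurs within time t,
    for a Poisson process with the given mean activity rate."""
    lam_m = activity_rate * (1.0 - magnitude_cdf(m))  # rate of events exceeding m
    return 1.0 - math.exp(-lam_m * t)

def mean_return_period(activity_rate, magnitude_cdf, m):
    """Mean waiting time between events exceeding magnitude m."""
    return 1.0 / (activity_rate * (1.0 - magnitude_cdf(m)))

# Illustrative Gutenberg-Richter magnitude CDF, truncated below at m_min.
def gr_cdf(m, beta=2.0, m_min=2.0):
    return 1.0 - math.exp(-beta * (m - m_min)) if m >= m_min else 0.0

p = exceedance_probability(activity_rate=10.0, t=1.0, magnitude_cdf=gr_cdf, m=4.0)
rp = mean_return_period(activity_rate=10.0, magnitude_cdf=gr_cdf, m=4.0)
```

Note the product λT(1 − F(m)) is exactly the quantity whose size (above or below 5.0) governs which source of uncertainty dominates in the abstract's findings.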

  2. [Hazard function and life table: an introduction to the failure time analysis].

    PubMed

    Matsushita, K; Inaba, H

    1987-04-01

    Failure time analysis has become popular in demographic studies. It can be viewed as a part of regression analysis with limited dependent variables as well as a special case of event history analysis and multistate demography. The idea of hazard function and failure time analysis, however, has not been properly introduced to nor commonly discussed by demographers in Japan. The concept of hazard function in comparison with life tables is briefly described, where the force of mortality is interchangeable with the hazard rate. The basic idea of failure time analysis is summarized for the cases of exponential distribution, normal distribution, and proportional hazard models. The multiple decrement life table is also introduced as an example of lifetime data analysis with cause-specific hazard rates.
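The correspondence between a life table and a discrete hazard rate described above can be sketched directly: the conditional probability of dying in an age interval, given survival to its start, is q_x = d_x / l_x. A toy example (the life-table numbers are invented for illustration):

```python
# Toy life table: l_x = survivors at exact age x (per 100,000 births).
ages =      [0, 1, 5, 10, 20, 40, 60, 80]
survivors = [100000, 98500, 98200, 98000, 97500, 95000, 85000, 50000]

# Discrete hazard: probability of dying in [x, x_next) given alive at x.
hazards = []
for i in range(len(ages) - 1):
    d = survivors[i] - survivors[i + 1]   # deaths d_x in the interval
    hazards.append(d / survivors[i])      # q_x = d_x / l_x
```

In the continuous-time limit this conditional rate becomes the force of mortality, which, as the abstract notes, is interchangeable with the hazard rate.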

  3. Estimation of age- and stage-specific Catalan breast cancer survival functions using US and Catalan survival data

    PubMed Central

    2009-01-01

Background During the last part of the 1990s the chance of surviving breast cancer increased. Changes in survival functions reflect a mixture of effects: both the introduction of adjuvant treatments and early screening with mammography played a role in the decline in mortality. Evaluating the contribution of these interventions using mathematical models requires survival functions before and after their introduction. Furthermore, the required survival functions may differ by age group and are related to disease stage at diagnosis. Sometimes detailed information is not available, as was the case for the region of Catalonia (Spain); one may then derive the functions using information from other geographical areas. This work presents the methodology used to estimate age- and stage-specific Catalan breast cancer survival functions from scarce Catalan survival data by adapting the age- and stage-specific US functions. Methods Cubic splines were used to smooth the data and obtain continuous hazard rate functions. Afterwards, we fitted a Poisson model to derive hazard ratios; the model included time as a covariate. The hazard ratios were then applied to US survival functions detailed by age and stage to obtain the Catalan estimates. Results We started by estimating the hazard ratios for Catalonia versus the USA before and after the introduction of screening. The hazard ratios were then multiplied by the age- and stage-specific breast cancer hazard rates from the USA to obtain the Catalan hazard rates. We also compared breast cancer survival in Catalonia and the USA in two time periods, before cancer control interventions (USA 1975–79, Catalonia 1980–89) and after (USA and Catalonia 1990–2001). Survival in Catalonia in the 1980–89 period was worse than in the USA during 1975–79, but the differences disappeared in 1990–2001. Conclusion Our results suggest that access to better treatments and quality of care contributed to large improvements in survival in Catalonia.
On the other hand, we obtained detailed breast cancer survival functions that will be used for modeling the effect of screening and adjuvant treatments in Catalonia. PMID:19331670

  4. Confidence intervals for the first crossing point of two hazard functions.

    PubMed

    Cheng, Ming-Yen; Qiu, Peihua; Tan, Xianming; Tu, Dongsheng

    2009-12-01

The phenomenon of crossing hazard rates is common in clinical trials with time-to-event endpoints. Many methods have been proposed for testing equality of hazard functions against a crossing-hazards alternative; however, relatively few approaches are available in the literature for point or interval estimation of the crossing time point. The problem of constructing confidence intervals for the first crossing time point of two hazard functions is considered in this paper. After reviewing a recent procedure based on Cox proportional hazards modeling with a Box-Cox transformation of the time to event, a nonparametric procedure using a kernel smoothing estimate of the hazard ratio is proposed. Both procedures are evaluated by Monte Carlo simulations and applied to two clinical trial datasets.
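A crude numerical version of locating a first crossing point, offered as a sketch rather than the authors' procedure: scan the difference of two hazard functions on a grid and refine the first sign change by bisection. The Weibull hazards below (one decreasing, one increasing) are illustrative stand-ins for estimated hazard curves.

```python
def weibull_hazard(t, shape, scale=1.0):
    """Weibull hazard h(t) = (shape/scale) * (t/scale)**(shape - 1)."""
    return (shape / scale) * (t / scale) ** (shape - 1.0)

def first_crossing(h1, h2, t_lo=1e-6, t_hi=10.0, n=10000, tol=1e-8):
    """First t in (t_lo, t_hi) where h1 - h2 changes sign; grid scan + bisection."""
    step = (t_hi - t_lo) / n
    prev_t, prev_d = t_lo, h1(t_lo) - h2(t_lo)
    for i in range(1, n + 1):
        t = t_lo + i * step
        d = h1(t) - h2(t)
        if prev_d * d < 0.0:          # sign change: refine by bisection
            a, b = prev_t, t
            while b - a > tol:
                m = 0.5 * (a + b)
                if (h1(a) - h2(a)) * (h1(m) - h2(m)) <= 0.0:
                    b = m
                else:
                    a = m
            return 0.5 * (a + b)
        prev_t, prev_d = t, d
    return None

h_dec = lambda t: weibull_hazard(t, shape=0.5)   # decreasing hazard
h_inc = lambda t: weibull_hazard(t, shape=2.0)   # increasing hazard
t_cross = first_crossing(h_dec, h_inc)
```

For these two hazards the crossing solves 0.5·t^(−0.5) = 2t, i.e. t = 0.25^(2/3), which the numerical search recovers.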

  5. Nonparametric change point estimation for survival distributions with a partially constant hazard rate.

    PubMed

    Brazzale, Alessandra R; Küchenhoff, Helmut; Krügel, Stefanie; Schiergens, Tobias S; Trentzsch, Heiko; Hartl, Wolfgang

    2018-04-05

    We present a new method for estimating a change point in the hazard function of a survival distribution assuming a constant hazard rate after the change point and a decreasing hazard rate before the change point. Our method is based on fitting a stump regression to p values for testing hazard rates in small time intervals. We present three real data examples describing survival patterns of severely ill patients, whose excess mortality rates are known to persist far beyond hospital discharge. For designing survival studies in these patients and for the definition of hospital performance metrics (e.g. mortality), it is essential to define adequate and objective end points. The reliable estimation of a change point will help researchers to identify such end points. By precisely knowing this change point, clinicians can distinguish between the acute phase with high hazard (time elapsed after admission and before the change point was reached), and the chronic phase (time elapsed after the change point) in which hazard is fairly constant. We show in an extensive simulation study that maximum likelihood estimation is not robust in this setting, and we evaluate our new estimation strategy including bootstrap confidence intervals and finite sample bias correction.
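The core of the stump-regression idea can be sketched on toy data: fit a two-level step function to interval-wise p-values by exhaustive search over split points and keep the split minimising the residual sum of squares. This is only a schematic of the method described above; the times and p-values are invented.

```python
def stump_changepoint(x, y):
    """Fit a stump (two-level step function) to (x, y): return the split x[k]
    minimising the residual sum of squares when y is approximated by one mean
    before the split and another mean after it."""
    best_sse, best_split = float("inf"), None
    for k in range(1, len(y)):
        left, right = y[:k], y[k:]
        m1 = sum(left) / len(left)
        m2 = sum(right) / len(right)
        sse = sum((v - m1) ** 2 for v in left) + sum((v - m2) ** 2 for v in right)
        if sse < best_sse:
            best_sse, best_split = sse, x[k]
    return best_split

# Toy p-value series: small p (hazard still changing) before t = 40,
# roughly uniform p (constant hazard) afterwards.
times = list(range(10, 100, 10))
pvals = [0.01, 0.03, 0.02, 0.45, 0.60, 0.52, 0.70, 0.48, 0.55]
changepoint = stump_changepoint(times, pvals)
```

In the paper's setting the y-values are p-values from tests of a constant hazard in small time intervals; the fitted split then estimates the change point between the decreasing-hazard and constant-hazard phases.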

  6. Survival Analysis of Patients with End Stage Renal Disease

    NASA Astrophysics Data System (ADS)

    Urrutia, J. D.; Gayo, W. S.; Bautista, L. A.; Baccay, E. B.

    2015-06-01

This paper provides a survival analysis of End Stage Renal Disease (ESRD) using Kaplan-Meier estimates and the Weibull distribution. The data were obtained from the records of V. L. Makabali Memorial Hospital with respect to time t (patient's age), covariates such as developed secondary disease (Pulmonary Congestion and Cardiovascular Disease), gender, and the event of interest: the death of ESRD patients. Survival and hazard rates were estimated using NCSS for the Weibull distribution and SPSS for the Kaplan-Meier estimates. Both lead to the same conclusion: with respect to time, the hazard rate increases and the survival rate decreases for ESRD patients diagnosed with Pulmonary Congestion, Cardiovascular Disease, or both. The analysis also shows that female patients have a greater risk of death compared with males. The probability of risk was given by the equation R = 1 − e^(−H(t)), where e^(−H(t)) is the survival function and H(t) the cumulative hazard function, which was estimated using Cox regression.
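The quantities used here can be reproduced on toy data: a Kaplan-Meier product-limit estimate S(t), from which the cumulative hazard is H(t) = −ln S(t) and the risk is R(t) = 1 − e^(−H(t)) = 1 − S(t). A minimal sketch with invented censored observations (the abstract's Cox-regression step is not reproduced):

```python
# Toy right-censored data: (time, event) with event=1 for death, 0 for censored.
data = sorted([(2, 1), (3, 1), (3, 0), (5, 1), (8, 0), (9, 1)])

def kaplan_meier(data):
    """Return [(t, S(t))] at each event time for right-censored data,
    using the product-limit estimator S(t) = prod(1 - d_i / n_i)."""
    n = len(data)
    s, out, at_risk, i = 1.0, [], n, 0
    while i < n:
        t = data[i][0]
        deaths = sum(1 for (u, e) in data if u == t and e == 1)
        ties = sum(1 for (u, _) in data if u == t)
        if deaths:
            s *= (1.0 - deaths / at_risk)
            out.append((t, s))
        at_risk -= ties
        i += ties
    return out

km = kaplan_meier(data)
# Risk via cumulative hazard: R(t) = 1 - exp(-H(t)) with H(t) = -ln S(t),
# which reduces to R(t) = 1 - S(t).
```

The convention here counts censored observations at a death time as still at risk at that time, the usual Kaplan-Meier treatment of ties.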

  7. Comparison of hypertabastic survival model with other unimodal hazard rate functions using a goodness-of-fit test.

    PubMed

    Tahir, M Ramzan; Tran, Quang X; Nikulin, Mikhail S

    2017-05-30

We studied the problem of testing a hypothesized distribution in survival regression models when the data are right censored and survival times are influenced by covariates. A modified chi-squared type test, known as the Nikulin-Rao-Robson statistic, is applied for the comparison of accelerated failure time models. This statistic is used to test the goodness-of-fit of the hypertabastic survival model and four other unimodal hazard rate functions. The results of the simulation study showed that the hypertabastic distribution can be used as an alternative to the log-logistic and log-normal distributions. In statistical modeling, because of the flexible shape of its hazard functions, this distribution can also be used as a competitor of the Birnbaum-Saunders and inverse Gaussian distributions. The results for the real data application are shown. Copyright © 2017 John Wiley & Sons, Ltd.

  8. The impact of legislation on divorce: a hazard function approach.

    PubMed

    Kidd, M P

    1995-01-01

    "The paper examines the impact of the introduction of no-fault divorce legislation in Australia. The approach used is rather novel, a hazard model of the divorce rate is estimated with the role of legislation captured via a time-varying covariate. The paper concludes that contrary to U.S. empirical evidence, no-fault divorce legislation appears to have had a positive impact upon the divorce rate in Australia." excerpt

  9. Some relevant parameters for assessing fire hazards of combustible mine materials using laboratory scale experiments

    PubMed Central

    Litton, Charles D.; Perera, Inoka E.; Harteis, Samuel P.; Teacoach, Kara A.; DeRosa, Maria I.; Thomas, Richard A.; Smith, Alex C.

    2018-01-01

    When combustible materials ignite and burn, the potential for fire growth and flame spread represents an obvious hazard, but during these processes of ignition and flaming, other life hazards present themselves and should be included to ensure an effective overall analysis of the relevant fire hazards. In particular, the gases and smoke produced both during the smoldering stages of fires leading to ignition and during the advanced flaming stages of a developing fire serve to contaminate the surrounding atmosphere, potentially producing elevated levels of toxicity and high levels of smoke obscuration that render the environment untenable. In underground mines, these hazards may be exacerbated by the existing forced ventilation that can carry the gases and smoke to locations far-removed from the fire location. Clearly, materials that require high temperatures (above 1400 K) and that exhibit low mass loss during thermal decomposition, or that require high heat fluxes or heat transfer rates to ignite represent less of a hazard than materials that decompose at low temperatures or ignite at low levels of heat flux. In order to define and quantify some possible parameters that can be used to assess these hazards, small-scale laboratory experiments were conducted in a number of configurations to measure: 1) the toxic gases and smoke produced both during non-flaming and flaming combustion; 2) mass loss rates as a function of temperature to determine ease of thermal decomposition; and 3) mass loss rates and times to ignition as a function of incident heat flux. This paper describes the experiments that were conducted, their results, and the development of a set of parameters that could possibly be used to assess the overall fire hazard of combustible materials using small scale laboratory experiments. PMID:29599565

  10. Some relevant parameters for assessing fire hazards of combustible mine materials using laboratory scale experiments.

    PubMed

    Litton, Charles D; Perera, Inoka E; Harteis, Samuel P; Teacoach, Kara A; DeRosa, Maria I; Thomas, Richard A; Smith, Alex C

    2018-04-15

    When combustible materials ignite and burn, the potential for fire growth and flame spread represents an obvious hazard, but during these processes of ignition and flaming, other life hazards present themselves and should be included to ensure an effective overall analysis of the relevant fire hazards. In particular, the gases and smoke produced both during the smoldering stages of fires leading to ignition and during the advanced flaming stages of a developing fire serve to contaminate the surrounding atmosphere, potentially producing elevated levels of toxicity and high levels of smoke obscuration that render the environment untenable. In underground mines, these hazards may be exacerbated by the existing forced ventilation that can carry the gases and smoke to locations far-removed from the fire location. Clearly, materials that require high temperatures (above 1400 K) and that exhibit low mass loss during thermal decomposition, or that require high heat fluxes or heat transfer rates to ignite represent less of a hazard than materials that decompose at low temperatures or ignite at low levels of heat flux. In order to define and quantify some possible parameters that can be used to assess these hazards, small-scale laboratory experiments were conducted in a number of configurations to measure: 1) the toxic gases and smoke produced both during non-flaming and flaming combustion; 2) mass loss rates as a function of temperature to determine ease of thermal decomposition; and 3) mass loss rates and times to ignition as a function of incident heat flux. This paper describes the experiments that were conducted, their results, and the development of a set of parameters that could possibly be used to assess the overall fire hazard of combustible materials using small scale laboratory experiments.

  11. A hazard rate analysis of fertility using duration data from Malaysia.

    PubMed

    Chang, C

    1988-01-01

Data from the Malaysia Fertility and Family Planning Survey (MFLS) of 1974 were used to investigate the effects of biological and socioeconomic variables on fertility based on the hazard rate model. Another study objective was to investigate the robustness of the findings of Trussell et al. (1985) by comparing the findings of this study with theirs. The hazard rate of conception for the jth fecundable spell of the ith woman, h_ij, is determined by duration dependence, t_ij, measured by the waiting time to conception; unmeasured heterogeneity, HET_i; the time-invariant variables Y_i (race, cohort, education, age at marriage); and the time-varying variables X_ij (age, parity, opportunity cost, income, child mortality, child sex composition). In this study, all the time-varying variables were constant over a spell. An asymptotic chi-squared test for the equality of constant hazard rates across birth orders, allowing for time-invariant variables and heterogeneity, showed the importance of time-varying variables and duration dependence. Under the assumption of fixed-effects heterogeneity and a Weibull distribution for the waiting time to conception, the empirical results revealed a negative parity effect, a negative impact from male children, and a positive effect from child mortality on the hazard rate of conception. The estimates of step functions for the hazard rate of conception showed parity-dependent fertility control, evidence of heterogeneity, and the possibility of nonmonotonic duration dependence. In a hazard rate model with piecewise-linear-segment duration dependence, socioeconomic variables such as cohort, child mortality, income, and race had significant effects after controlling for the length of the preceding birth interval. The duration dependence was consistent with the common finding, i.e., first increasing and then decreasing at a slow rate. The effects of education and opportunity cost on fertility were insignificant.
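The structure of such a duration model can be sketched as a Weibull baseline hazard scaled by exp(X'β), the usual proportional-hazards form. This is a schematic of the model class, not the paper's estimates; the shape, scale, covariates, and coefficients below are all invented for illustration.

```python
import math

def conception_hazard(t, covariates, betas, shape=1.3, scale=12.0):
    """Weibull baseline hazard scaled by exp(X'beta): a proportional-hazards
    sketch of a waiting-time-to-conception model (all numbers illustrative)."""
    baseline = (shape / scale) * (t / scale) ** (shape - 1.0)
    linear_predictor = sum(b * x for b, x in zip(betas, covariates))
    return baseline * math.exp(linear_predictor)

# Illustrative covariates: (parity, child-mortality indicator).
h_no_loss   = conception_hazard(6.0, covariates=[2, 0], betas=[-0.10, 0.25])
h_with_loss = conception_hazard(6.0, covariates=[2, 1], betas=[-0.10, 0.25])
```

The sign pattern mirrors the abstract's findings: a negative parity coefficient lowers the hazard, while a positive child-mortality coefficient raises it by a factor exp(β) at every duration.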

  12. Estimation of hazard function and its associated factors in gastric cancer patients using wavelet and kernel smoothing methods.

    PubMed

    Ahmadi, Azadeh; Roudbari, Masoud; Gohari, Mahmood Reza; Hosseini, Bistoon

    2012-01-01

Increasing mortality rates of gastric cancer in Iran and the world in recent years reveal the necessity of studies on this disease. Here, the hazard function for gastric cancer patients was estimated using wavelet and kernel methods, and some related factors were assessed. Ninety-five gastric cancer patients in Fayazbakhsh Hospital between 1996 and 2003 were studied. The effects of patients' age, gender, stage of disease and treatment method on patients' lifetimes were assessed. For data analyses, survival analyses using the wavelet method and the log-rank test in R software were used. Nearly 25.3% of patients were female. Fourteen percent had surgical treatment and the rest had treatment without surgery. Three fourths died and the rest were censored. Almost 9.5% of patients were in early stages of the disease, 53.7% in the locally advanced stage and 36.8% in the metastatic stage. Hazard function estimation with the wavelet method showed a significant difference between stages of disease (P<0.001) and did not reveal any significant difference for age, gender or treatment method. Only stage of disease had an effect on hazard, and most patients were diagnosed in late stages of the disease, which is possibly one of the main reasons for the high hazard rate and low survival. Therefore, public education about the symptoms of the disease through the media, together with regular tests and screening for early diagnosis, seems necessary.

  13. Time prediction of failure a type of lamps by using general composite hazard rate model

    NASA Astrophysics Data System (ADS)

    Riaman; Lesmana, E.; Subartini, B.; Supian, S.

    2018-03-01

This paper discusses basic survival model estimation to obtain the average predicted value of lamp failure time. The estimate is for a parametric model, the general composite hazard rate model. The random time variable model used as the basis is the exponential distribution model, which has a constant hazard function. In this case, we discuss an example of survival model estimation for a composite hazard function, using an exponential model as its basis. The model is estimated by estimating its parameters through the construction of the survival function and the empirical cumulative function. The model obtained is then used to predict the average failure time for the type of lamp. By grouping the data into several intervals with the average failure value at each interval, and then calculating the average failure time of the model based on each interval, the p value obtained from the test result is 0.3296.
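For the exponential basis mentioned above, S(t) = exp(−λt) with constant hazard λ, the maximum-likelihood estimate from fully observed failure times is λ̂ = n / Σt_i, and the predicted mean failure time is 1/λ̂. A minimal sketch with invented lamp data (not the paper's dataset):

```python
# Toy lamp failure times in hours, all observed failures (no censoring).
failure_times = [920.0, 1150.0, 870.0, 1340.0, 1010.0, 980.0]

# For an exponential model S(t) = exp(-lam * t), the maximum-likelihood
# estimate of the rate is the number of failures over total time on test;
# the predicted mean failure time is its reciprocal.
lam_hat = len(failure_times) / sum(failure_times)
mean_failure_time = 1.0 / lam_hat
```

For the composite model of the paper, this constant-hazard estimate serves only as the baseline component; the composite hazard modifies it interval by interval.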

  14. Separating spatial search and efficiency rates as components of predation risk

    PubMed Central

    DeCesare, Nicholas J.

    2012-01-01

    Predation risk is an important driver of ecosystems, and local spatial variation in risk can have population-level consequences by affecting multiple components of the predation process. I use resource selection and proportional hazard time-to-event modelling to assess the spatial drivers of two key components of risk—the search rate (i.e. aggregative response) and predation efficiency rate (i.e. functional response)—imposed by wolves (Canis lupus) in a multi-prey system. In my study area, both components of risk increased according to topographic variation, but anthropogenic features affected only the search rate. Predicted models of the cumulative hazard, or risk of a kill, underlying wolf search paths validated well with broad-scale variation in kill rates, suggesting that spatial hazard models provide a means of scaling up from local heterogeneity in predation risk to population-level dynamics in predator–prey systems. Additionally, I estimated an integrated model of relative spatial predation risk as the product of the search and efficiency rates, combining the distinct contributions of spatial heterogeneity to each component of risk. PMID:22977145

  15. Marginal regression approach for additive hazards models with clustered current status data.

    PubMed

    Su, Pei-Fang; Chi, Yunchan

    2014-01-15

Current status data arise naturally from tumorigenicity experiments, epidemiology studies, biomedicine, econometrics and demographic and sociology studies. Moreover, clustered current status data may occur with animals from the same litter in tumorigenicity experiments or with subjects from the same family in epidemiology studies. Because the only information extracted from current status data is whether the survival times are before or after the monitoring or censoring times, the nonparametric maximum likelihood estimator of the survival function converges at a rate of n^(1/3) to a complicated limiting distribution. Hence, semiparametric regression models such as the additive hazards model have been extended for independent current status data to derive the test statistics, whose distributions converge at a rate of n^(1/2), for testing the regression parameters. However, a straightforward application of these statistical methods to clustered current status data is not appropriate because intracluster correlation needs to be taken into account. Therefore, this paper proposes two estimating functions for estimating the parameters in the additive hazards model for clustered current status data. The comparative results from simulation studies are presented, and the application of the proposed estimating functions to one real data set is illustrated. Copyright © 2013 John Wiley & Sons, Ltd.

  16. Clinical and multiple gene expression variables in survival analysis of breast cancer: Analysis with the hypertabastic survival model

    PubMed Central

    2012-01-01

    Background We explore the benefits of applying a new proportional hazard model to analyze survival of breast cancer patients. As a parametric model, the hypertabastic survival model offers a closer fit to experimental data than Cox regression, and furthermore provides explicit survival and hazard functions which can be used as additional tools in the survival analysis. In addition, one of our main concerns is utilization of multiple gene expression variables. Our analysis treats the important issue of interaction of different gene signatures in the survival analysis. Methods The hypertabastic proportional hazards model was applied in survival analysis of breast cancer patients. This model was compared, using statistical measures of goodness of fit, with models based on the semi-parametric Cox proportional hazards model and the parametric log-logistic and Weibull models. The explicit functions for hazard and survival were then used to analyze the dynamic behavior of hazard and survival functions. Results The hypertabastic model provided the best fit among all the models considered. Use of multiple gene expression variables also provided a considerable improvement in the goodness of fit of the model, as compared to use of only one. By utilizing the explicit survival and hazard functions provided by the model, we were able to determine the magnitude of the maximum rate of increase in hazard, and the maximum rate of decrease in survival, as well as the times when these occurred. We explore the influence of each gene expression variable on these extrema. Furthermore, in the cases of continuous gene expression variables, represented by a measure of correlation, we were able to investigate the dynamics with respect to changes in gene expression. Conclusions We observed that use of three different gene signatures in the model provided a greater combined effect and allowed us to assess the relative importance of each in determination of outcome in this data set. 
These results point to the potential to combine gene signatures to a greater effect in cases where each gene signature represents some distinct aspect of the cancer biology. Furthermore we conclude that the hypertabastic survival models can be an effective survival analysis tool for breast cancer patients. PMID:23241496

  17. Analysis of mean seismic ground motion and its uncertainty based on the UCERF3 geologic slip rate model with uncertainty for California

    USGS Publications Warehouse

    Zeng, Yuehua

    2018-01-01

    The Uniform California Earthquake Rupture Forecast v.3 (UCERF3) model (Field et al., 2014) considers epistemic uncertainty in fault‐slip rate via the inclusion of multiple rate models based on geologic and/or geodetic data. However, these slip rates are commonly clustered about their mean value and do not reflect the broader distribution of possible rates and associated probabilities. Here, we consider both a double‐truncated 2σ Gaussian and a boxcar distribution of slip rates and use a Monte Carlo simulation to sample the entire range of the distribution for California fault‐slip rates. We compute the seismic hazard following the methodology and logic‐tree branch weights applied to the 2014 national seismic hazard model (NSHM) for the western U.S. region (Petersen et al., 2014, 2015). By applying a new approach developed in this study to the probabilistic seismic hazard analysis (PSHA) using precomputed rates of exceedance from each fault as a Green’s function, we reduce the computer time by about 10^5‐fold and apply it to the mean PSHA estimates with 1000 Monte Carlo samples of fault‐slip rates to compare with results calculated using only the mean or preferred slip rates. The difference in the mean probabilistic peak ground motion corresponding to a 2% in 50‐yr probability of exceedance is less than 1% on average over all of California for both the Gaussian and boxcar probability distributions for slip‐rate uncertainty but reaches about 18% in areas near faults compared with that calculated using the mean or preferred slip rates. The average uncertainties in 1σ peak ground‐motion level are 5.5% and 7.3% of the mean with the relative maximum uncertainties of 53% and 63% for the Gaussian and boxcar probability density function (PDF), respectively.
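The slip-rate sampling step can be sketched as follows: draw rates either from a Gaussian truncated at ±2σ (via rejection sampling) or from a boxcar (uniform) distribution over the same interval, then feed each draw into the hazard calculation. The mean and σ below are illustrative placeholders, not UCERF3 values.

```python
import random

def sample_slip_rate(mean, sigma, dist="gaussian", rng=random):
    """Draw one slip rate: Gaussian truncated at +/- 2 sigma, or a boxcar
    (uniform) over the same interval (illustrative of the sampling step)."""
    lo, hi = mean - 2.0 * sigma, mean + 2.0 * sigma
    if dist == "boxcar":
        return rng.uniform(lo, hi)
    while True:  # rejection sampling for the double-truncated Gaussian
        x = rng.gauss(mean, sigma)
        if lo <= x <= hi:
            return x

rng = random.Random(0)
samples = [sample_slip_rate(5.0, 1.0, "gaussian", rng) for _ in range(2000)]
```

In the study's workflow each such sample replaces the preferred slip rate in a precomputed exceedance-rate "Green's function", so the full hazard integral need not be recomputed per draw.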

  18. Prognostic Value of Improved Kidney Function After Transcatheter Aortic Valve Implantation for Aortic Stenosis.

    PubMed

    Nijenhuis, Vincent Johan; Peper, Joyce; Vorselaars, Veronique M M; Swaans, Martin J; De Kroon, Thom; Van der Heyden, Jan A S; Rensing, Benno J W M; Heijmen, Robin; Bos, Willem-Jan W; Ten Berg, Jurrien M

    2018-05-15

Transcatheter aortic valve implantation (TAVI) is associated with acute kidney injury (AKI), but can also improve kidney function (IKF). We assessed the effects of kidney function changes, in relation to baseline kidney function, on 2-year clinical outcomes after TAVI. In total, 639 consecutive patients with aortic stenosis who underwent TAVI were stratified into 3 groups according to the ratio of serum creatinine post- to pre-TAVI: IKF (≤0.80; n = 95 [15%]), stable kidney function (0.80 to 1.5; n = 477 [75%]), and AKI (≥1.5; n = 67 [10%]). Different AKI risk scores were compared using receiver-operating characteristic analysis. Median follow-up was 24 (8 to 44) months. At 3 months, the increase in estimated glomerular filtration rate in the IKF group remained, and the decreased estimated glomerular filtration rate in the AKI group recovered. Compared with a stable kidney function, AKI showed a higher 2-year mortality rate (adjusted hazard ratio [HR] 3.69, 95% confidence interval [CI] 2.43 to 5.62) and IKF a lower mortality rate (adjusted HR 0.53, 95% CI 0.30 to 0.93). AKI also predicted major and life-threatening bleeding (adjusted odds ratio 2.94, 95% CI 1.27 to 6.78). Independent predictors of AKI were chronic kidney disease and pulmonary hypertension. Independent predictors of IKF were female gender, a preserved kidney function, absence of atrial fibrillation, and hemoglobin level. Established AKI risk scores performed moderately and did not differentiate between AKI and IKF. In conclusion, AKI is transient and independently associated with a higher mortality rate, whereas IKF is sustained and associated with a lower mortality rate. These effects are independent of baseline kidney function. Further studies are warranted to investigate the role of IKF and generate a dedicated prediction model. Copyright © 2018 Elsevier Inc. All rights reserved.

  19. Vulnerabilities, Influences and Interaction Paths: Failure Data for Integrated System Risk Analysis

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Fleming, Land

    2006-01-01

    We describe graph-based analysis methods for identifying and analyzing cross-subsystem interaction risks from subsystem connectivity information. By discovering external and remote influences that would be otherwise unexpected, these methods can support better communication among subsystem designers at points of potential conflict and to support design of more dependable and diagnosable systems. These methods identify hazard causes that can impact vulnerable functions or entities if propagated across interaction paths from the hazard source to the vulnerable target. The analysis can also assess combined impacts of And-Or trees of disabling influences. The analysis can use ratings of hazards and vulnerabilities to calculate cumulative measures of the severity and importance. Identification of cross-subsystem hazard-vulnerability pairs and propagation paths across subsystems will increase coverage of hazard and risk analysis and can indicate risk control and protection strategies.
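The propagation analysis described above reduces, at its simplest, to reachability in a directed graph of subsystem influences: a hazard source can affect every vulnerable target reachable from it along interaction paths. A hedged sketch with an invented interaction graph (the subsystem names are hypothetical, not from the NASA report):

```python
from collections import deque

# Illustrative subsystem interaction graph: directed edges = influence paths.
edges = {
    "power_bus":    ["avionics", "pump"],
    "pump":         ["coolant_loop"],
    "coolant_loop": ["avionics"],
    "avionics":     [],
}

def reachable_targets(graph, hazard_source):
    """BFS: every subsystem a hazard at `hazard_source` could propagate to."""
    seen, queue = set(), deque([hazard_source])
    while queue:
        node = queue.popleft()
        for nxt in graph.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

hit = reachable_targets(edges, "power_bus")
```

The methods in the paper go further, weighting paths by hazard and vulnerability ratings and evaluating And-Or trees of disabling influences; plain reachability is only the first coverage step.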

  20. Driver Vigilance in Automated Vehicles: Hazard Detection Failures Are a Matter of Time.

    PubMed

    Greenlee, Eric T; DeLucia, Patricia R; Newton, David C

    2018-06-01

The primary aim of the current study was to determine whether monitoring the roadway for hazards during automated driving results in a vigilance decrement. Although automated vehicles are relatively novel, the nature of human-automation interaction within them has the classic hallmarks of a vigilance task: drivers must maintain attention for prolonged periods of time to detect and respond to rare and unpredictable events, for example, roadway hazards that automation may be ill equipped to detect. Given the similarity to traditional vigilance tasks, we predicted that drivers of a simulated automated vehicle would demonstrate a vigilance decrement in hazard detection performance. Participants "drove" a simulated automated vehicle for 40 minutes; during that time, their task was to monitor the roadway for hazards. As predicted, the hazard detection rate declined precipitously, and reaction times slowed as the drive progressed. Further, subjective ratings of workload and task-related stress indicated that sustained monitoring is demanding and distressing, and that maintaining task engagement is a challenge. Monitoring the roadway for potential hazards during automated driving results in workload, stress, and performance decrements similar to those observed in traditional vigilance tasks. To the degree that vigilance is required of automated vehicle drivers, performance errors and associated safety risks are likely to occur as a function of time on task. Vigilance should be a focal safety concern in the development of vehicle automation.

  1. Skier triggering of backcountry avalanches with skilled route selection

    NASA Astrophysics Data System (ADS)

    Sinickas, Alexandra; Haegeli, Pascal; Jamieson, Bruce

    2015-04-01

    Jamieson (2009) provided numerical estimates for the baseline probabilities of triggering an avalanche by a backcountry skier making fresh tracks without skilled route selection as a function of the North American avalanche danger scale (i.e., hazard levels Low, Moderate, Considerable, High and Extreme). Using the results of an expert survey, he showed that triggering probabilities while skiing directly up, down or across a trigger zone without skilled route selection increase roughly by a factor of 10 with each step of the North American avalanche danger scale (i.e. hazard level). The objective of the present study is to examine the effect of skilled route selection on the relationship between triggering probability and hazard level. To assess the effect of skilled route selection on triggering probability by hazard level, we analysed avalanche hazard assessments as well as reports of skiing activity and triggering of avalanches from 11 Canadian helicopter and snowcat operations during two winters (2012-13 and 2013-14). These reports were submitted to the daily information exchange among Canadian avalanche safety operations, and reflect professional decision-making and route selection practices of guides leading groups of skiers. We selected all skier-controlled or accidentally triggered avalanches with a destructive size greater than size 1 according to the Canadian avalanche size classification, triggered by any member of a guided group (guide or guest). These operations forecast the avalanche hazard daily for each of three elevation bands: alpine, treeline and below treeline. In contrast to the 2009 study, an exposure was defined as a group skiing within any one of the three elevation bands, and consequently within a hazard rating, for the day (~4,300 ratings over two winters). 
For example, a group that skied below treeline (rated Moderate) and at treeline (rated Considerable) in one day would receive one count for exposure to Moderate hazard and one count for exposure to Considerable hazard. While the absolute values for triggering probability cannot be compared to the 2009 study because of different definitions of exposure, our preliminary results suggest that with skilled route selection the triggering probability is similar at all hazard levels, except for Extreme, for which there are few exposures. This suggests that the guiding teams of backcountry skiing operations effectively control the risk of triggering avalanches through skilled route selection. Groups were exposed relatively evenly to Low hazard (1275 times, or 29% of total exposure), Moderate hazard (1450 times, or 33%) and Considerable hazard (1215 times, or 28%). At higher levels, exposure dropped to roughly 380 times (9% of total exposure) for High hazard and only 13 times (0.3%) for Extreme hazard. We assess the sensitivity of the results to some of our key assumptions.

  2. Modeling the Risk of Fire/Explosion Due to Oxidizer/Fuel Leaks in the Ares I Interstage

    NASA Technical Reports Server (NTRS)

    Ring, Robert W.; Stott, James E.; Hales, Christy

    2008-01-01

    A significant flight hazard associated with liquid propellants, such as those used in the upper stage of NASA's new Ares I launch vehicle, is the possibility of leakage of hazardous fluids resulting in a catastrophic fire/explosion. The enclosed and vented interstage of the Ares I contains numerous oxidizer and fuel supply lines as well as ignition sources. The potential for fire/explosion due to leaks during ascent depends on the relative concentrations of hazardous and inert fluids within the interstage along with other variables such as pressure, temperature, leak rates, and fluid outgassing rates. This analysis improves on previous NASA Probabilistic Risk Assessment (PRA) estimates of the probability of deflagration, in which many of the variables pertinent to the problem were not explicitly modeled as a function of time. This paper presents the modeling methodology developed to analyze these risks.

  3. Quality-of-life-adjusted hazard of death: a formulation of the quality-adjusted life-years model of use in benefit-risk assessment.

    PubMed

    Garcia-Hernandez, Alberto

    2014-03-01

    Although the quality-adjusted life-years (QALY) model is standard in health technology assessment, quantitative methods are less frequent but increasingly used for benefit-risk assessment (BRA) at earlier stages of drug development. A frequent challenge when implementing metrics for BRA is to weigh the importance of effects on a chronic condition against the risk of severe events during the trial. The lifetime component of the QALY model has a counterpart in the BRA context, namely, the risk of dying during the study. We present a new concept, which we have called the quality-of-life-adjusted hazard of death: the hazard of death function that a subject is willing to accept, in place of the baseline hazard, in exchange for an improvement in his or her chronic health status. We show that if the assumptions of the linear QALY model hold, the excess mortality rate tolerated by a subject for a chronic health improvement is inversely proportional to the mean residual life. This result leads to a new representation of the linear QALY model in terms of hazard rate functions and allows utilities obtained using standard methods involving trade-offs of life duration to be translated into thresholds of tolerated mortality risk during a short period of time, thereby avoiding direct trade-offs based on small probabilities of events during the study, which are known to lead to bias and variability. Copyright © 2014 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.

  4. Chlorine hazard evaluation for the zinc-chlorine electric vehicle battery. Final technical report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zalosh, R.G.; Bajpai, S.N.; Short, T.P.

    1980-04-01

    An evaluation of the hazards associated with conceivable accidental chlorine releases from zinc-chlorine electric vehicle batteries is presented. Since commercial batteries are not yet available, this hazard assessment is based both on theoretical chlorine dispersion models and on small-scale and large-scale spill tests with chlorine hydrate. Six spill tests involving chlorine hydrate indicate that the danger zone in which chlorine vapor concentrations intermittently exceed 100 ppm extends at least 23 m directly downwind of a spill onto a warm road surface. Chlorine concentration data from the hydrate spill tests compare favorably with calculations based on a quasi-steady area source dispersion model and empirical estimates of the hydrate decomposition rate. The theoretical dispersion model has been combined with assumed hydrate spill probabilities and current motor vehicle accident statistics in order to project expected chlorine-induced fatality rates. These calculations indicate that expected chlorine fatality rates are several times higher in a city with a warm and calm climate than in a colder and windier city. Calculated chlorine-induced fatality rate projections for various climates are presented as a function of hydrate spill probability in order to illustrate the degree of vehicle/battery crashworthiness required to maintain chlorine-induced fatality rates below current vehicle fatality rates due to fires and asphyxiations.

  5. A New Seismic Hazard Model for Mainland China

    NASA Astrophysics Data System (ADS)

    Rong, Y.; Xu, X.; Chen, G.; Cheng, J.; Magistrale, H.; Shen, Z. K.

    2017-12-01

    We are developing a new seismic hazard model for Mainland China by integrating historical earthquake catalogs, geological faults, geodetic GPS data, and geology maps. To build the model, we construct an Mw-based homogeneous historical earthquake catalog spanning from 780 B.C. to present, create fault models from active fault data, and derive a strain rate model based on the most complete GPS measurements and a new strain derivation algorithm. We divide China and the surrounding regions into about 20 large seismic source zones. For each zone, a tapered Gutenberg-Richter (TGR) magnitude-frequency distribution is used to model the seismic activity rates. The a- and b-values of the TGR distribution are calculated using observed earthquake data, while the corner magnitude is constrained independently using the seismic moment rate inferred from the geodetically based strain rate model. Small- and medium-sized earthquakes are distributed within the source zones following the location and magnitude patterns of historical earthquakes. Some of the larger earthquakes are distributed onto active faults, based on their geological characteristics such as slip rate, fault length, down-dip width, and various paleoseismic data. The remaining larger earthquakes are then placed into the background. A new set of magnitude-rupture scaling relationships is developed based on earthquake data from China and vicinity. We evaluate and select appropriate ground motion prediction equations by comparing them with observed ground motion data and performing residual analysis. To implement the modeling workflow, we develop a tool that builds upon the functionalities of GEM's Hazard Modeler's Toolkit. The GEM OpenQuake software is used to calculate seismic hazard at various ground motion periods and various return periods. To account for site amplification, we construct a site condition map based on geology.
The resulting new seismic hazard maps can be used for seismic risk analysis and management.
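
    The tapered Gutenberg-Richter (TGR) step above can be sketched as follows. A common (Kagan-style) form expresses the complementary CDF in seismic moment, with the taper set by a corner magnitude; the zone parameters below (rate, b-value, corner magnitude) are illustrative placeholders, not values from the model.

```python
import math

def moment_Nm(mw):
    """Scalar seismic moment (N*m) from moment magnitude (Hanks-Kanamori)."""
    return 10 ** (1.5 * mw + 9.05)

def tgr_ccdf(mw, mw_min, b, mw_corner):
    """Tapered Gutenberg-Richter complementary CDF (Kagan-style),
    evaluated in seismic moment with beta = (2/3) * b."""
    beta = 2.0 / 3.0 * b
    m, m_min, m_c = moment_Nm(mw), moment_Nm(mw_min), moment_Nm(mw_corner)
    return (m_min / m) ** beta * math.exp((m_min - m) / m_c)

# Illustrative zone parameters (not from the paper).
rate_min = 2.0  # events/yr with Mw >= 5.0
for mw in (5.0, 6.0, 7.0, 8.0):
    n_tgr = rate_min * tgr_ccdf(mw, 5.0, b=1.0, mw_corner=8.0)
    n_gr = rate_min * 10 ** (-1.0 * (mw - 5.0))  # untapered GR, for comparison
    print(f"Mw>={mw}: TGR {n_tgr:.5f}/yr vs GR {n_gr:.5f}/yr")
```

    Below the corner magnitude the TGR and plain GR rates nearly coincide; approaching it, the exponential taper pulls the TGR rate below the GR extrapolation, which is how a geodetic moment-rate constraint limits the largest events.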

  6. The West Virginia university forest hazard rating study: the hazards of hazard rating

    Treesearch

    Ray R. Hicks, Jr.; David E. Fosbroke; Shrivenkar Kosuri; Charles B. Yuill

    1991-01-01

    The West Virginia University (WVU) Forest is a 7,600-acre tract located along the leading edge of gypsy moth infestation. The hazard rating study at the WVU Forest serves three objectives. First, hazard rating is being used to determine the extent and distribution of damage that can be expected when gypsy moth defoliation occurs. Second, susceptibility and...

  7. Studies on Synthesis of Electrochemically Exfoliated Functionalized Graphene and Polylactic Acid/Ferric Phytate Functionalized Graphene Nanocomposites as New Fire Hazard Suppression Materials.

    PubMed

    Feng, Xiaming; Wang, Xin; Cai, Wei; Qiu, Shuilai; Hu, Yuan; Liew, Kim Meow

    2016-09-28

    Practical application of functionalized graphene in polymeric nanocomposites is hampered by the lack of cost-effective and eco-friendly methods for its production. Here, we report a facile and green electrochemical approach for preparing ferric phytate functionalized graphene (f-GNS), simultaneously utilizing biobased phytic acid as both electrolyte and modifier for the first time. Due to the presence of phytic acid, electrochemical exfoliation yields lightly oxidized graphene sheets (C/O ratio of 14.8) that are tens of micrometers across. Successful functionalization of graphene was confirmed by the appearance of phosphorus and iron peaks in the X-ray photoelectron spectrum. Further, high-performance polylactic acid/f-GNS nanocomposites are readily fabricated by a convenient masterbatch strategy. Notably, inclusion of well-dispersed f-GNS dramatically suppressed the fire hazards of polylactic acid in terms of reduced peak heat-release rate (decreased by 40%), low CO yield, and formation of a highly graphitized protective char layer. Moreover, obvious improvements in the crystallization rate and thermal conductivity of the polylactic acid nanocomposites were observed, highlighting their promising potential for practical application. This novel strategy of simultaneous exfoliation and functionalization of graphene demonstrates a simple yet effective approach to fabricating graphene-based flame retardants.

  8. A new variable interval schedule with constant hazard rate and finite time range.

    PubMed

    Bugallo, Mehdi; Machado, Armando; Vasconcelos, Marco

    2018-05-27

    We propose a new variable interval (VI) schedule that achieves constant probability of reinforcement in time while using a bounded range of intervals. By sampling each trial duration from a uniform distribution ranging from 0 to 2T seconds, and then applying a reinforcement rule that depends linearly on trial duration, the schedule alternates reinforced and unreinforced trials, each less than 2T seconds, while preserving a constant hazard function. © 2018 Society for the Experimental Analysis of Behavior.
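
    The constant-hazard property can be checked empirically. The sketch below assumes one simple linear rule, reinforcing a trial of duration d with probability 1 - d/(2T); this is an illustrative reading, not necessarily the paper's exact rule. Under it, the probability of reinforcement per unit time, conditional on the trial still running, works out to the constant 1/(2T), which the person-time hazard estimator below recovers.

```python
import random
random.seed(1)

T = 10.0       # schedule parameter; trial durations ~ U(0, 2T)
L = 2 * T
N = 200_000

# Assumed linear rule (illustrative): reinforce a trial of duration d
# with probability 1 - d/(2T).
trials = []
for _ in range(N):
    d = random.uniform(0.0, L)
    reinforced = random.random() < 1.0 - d / L
    trials.append((d, reinforced))

def hazard_estimate(trials, b, w):
    """Reinforcements per unit of at-risk time in the bin [b, b+w)."""
    events = sum(1 for d, r in trials if r and b <= d < b + w)
    exposure = sum(min(d, b + w) - b for d, _ in trials if d > b)
    return events / exposure

for b in (0.0, 5.0, 10.0, 15.0):
    print(f"hazard in [{b}, {b + 2.0}): {hazard_estimate(trials, b, 2.0):.4f}")
# Every bin should be close to 1/(2T) = 0.05.
```

    With 200,000 simulated trials, each binned estimate lands within sampling error of 1/(2T), i.e. the estimated hazard is flat across the bounded range of intervals.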

  9. Fire Detection Tradeoffs as a Function of Vehicle Parameters

    NASA Technical Reports Server (NTRS)

    Urban, David L.; Dietrich, Daniel L.; Brooker, John E.; Meyer, Marit E.; Ruff, Gary A.

    2016-01-01

    Fire survivability depends on the detection of and response to a fire before it has produced an unacceptable environment in the vehicle. This detection time is the result of interplay between the fire's burning and growth rates; the vehicle size; the detection system design; the transport time to the detector (controlled by the level of mixing in the vehicle); and the rate at which the life support system filters the atmosphere, potentially removing the detected species or particles. Given the large differences in critical vehicle parameters (volume, mixing rate, and filtration rate), the detection approach that works for a large vehicle (e.g., the ISS) may not be the best choice for a smaller crew capsule. This paper examines the impact of vehicle size and environmental control and life support system parameters on the detectability of fires in comparison to the hazard they present. A lumped element model was developed that considers smoke, heat, and toxic product release rates in comparison to mixing and filtration rates in the vehicle. Recent work has quantified the production rate of smoke and several hazardous species from overheated spacecraft polymers. These results are used as the input data set in the lumped element model, in combination with the transport behavior of major toxic products released by overheating spacecraft materials, to evaluate the alarm thresholds necessary to enable an appropriate response to the fire hazard.
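
    A lumped element balance of this kind can be sketched as a single well-mixed volume with a constant release source and a filtration sink, V dC/dt = S - Q C, which gives the time for a released species to reach an alarm threshold. All numbers below are hypothetical placeholders, not values from the paper.

```python
import math

def time_to_alarm(S, V, Q, c_alarm):
    """Time (s) for a well-mixed volume V (m^3) with source S (mg/s) and
    filtration/removal flow Q (m^3/s) to reach concentration c_alarm (mg/m^3).
    From V dC/dt = S - Q*C  =>  C(t) = (S/Q) * (1 - exp(-Q*t/V)).
    Returns None if the steady state S/Q never reaches c_alarm."""
    c_ss = S / Q
    if c_alarm >= c_ss:
        return None
    return -(V / Q) * math.log(1.0 - c_alarm / c_ss)

# Hypothetical numbers, for illustration only.
S, c_alarm = 0.5, 2.0  # mg/s smoke release; mg/m^3 alarm threshold
print("large vehicle:", time_to_alarm(S, V=900.0, Q=0.10, c_alarm=c_alarm))
print("small capsule:", time_to_alarm(S, V=10.0, Q=0.05, c_alarm=c_alarm))
```

    With these placeholder numbers the small capsule reaches the threshold in well under a minute while the large vehicle takes over an hour, illustrating why alarm thresholds must be tuned to vehicle volume and filtration rate.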

  10. Investigation of automated and interactive crack measurement systems : final report, May 27, 2008.

    DOT National Transportation Integrated Search

    2008-05-27

    Currently adopted manual distress surveys involve a high degree of subjectivity, low production rates, and exposure to hazardous conditions. FDOT has acquired and validated a multi-purpose survey vehicle (MPSV). However, when an automated crack di...

  11. Change in motor function and adverse health outcomes in older African-Americans.

    PubMed

    Buchman, Aron S; Wilson, Robert S; Leurgans, Sue E; Bennett, David A; Barnes, Lisa L

    2015-10-01

    We tested whether declining motor function accelerates with age in older African-Americans. Eleven motor performances were assessed annually in 513 older African-Americans. During 5 years of follow-up, linear mixed-effects models showed that motor function declined by about 0.03 units/year (estimate, -0.026, p<0.001), about 4% more rapidly for each additional year of age at baseline. A proportional hazards model showed that both baseline motor function level and its rate of change were independent predictors of death and incident disability (all p's<0.001). These models showed that the additional annual amount of motor decline in 85-year-old persons at baseline versus 65-year-old persons was associated with a 1.5-fold higher rate of death and a 3-fold higher rate of developing Katz disability. The rate of declining motor function accelerates with increasing age, and its rate of decline predicts adverse health outcomes in older African-Americans. Copyright © 2015 Elsevier Inc. All rights reserved.
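
    As a sketch of the arithmetic behind such proportional hazards estimates: a Cox coefficient translates into a hazard ratio by exponentiation, and its confidence interval follows from the coefficient's standard error. The coefficient and standard error below are invented for illustration; only the 1.5-fold figure echoes the abstract.

```python
import math

def hazard_ratio(beta, se, delta=1.0, z=1.96):
    """Hazard ratio and 95% CI for a `delta`-unit covariate increase,
    given a Cox model coefficient `beta` and its standard error `se`."""
    hr = math.exp(beta * delta)
    lo = math.exp((beta - z * se) * delta)
    hi = math.exp((beta + z * se) * delta)
    return hr, lo, hi

# Illustrative: a coefficient of ln(1.5)/20 per year of baseline age
# reproduces a 1.5-fold mortality rate for 85- vs 65-year-olds.
beta_age = math.log(1.5) / 20.0
hr, lo, hi = hazard_ratio(beta_age, se=0.005, delta=20.0)
print(f"HR per 20 years of age: {hr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

    The same arithmetic converts any reported Cox coefficient into hazard ratios of the kind quoted in abstracts like this one.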

  12. Change in Motor Function and Adverse Health Outcomes in Older African Americans

    PubMed Central

    Buchman, Aron S.; Wilson, Robert S.; Leurgans, Sue E.; Bennett, David A.; Barnes, Lisa L.

    2015-01-01

    Objective We tested whether declining motor function accelerates with age in older African Americans. Methods Eleven motor performances were assessed annually in 513 older African Americans. Results During 5 years of follow-up, linear mixed-effects models showed that motor function declined by about 0.03 units/yr (estimate, −0.026, p<0.001), about 4% more rapidly for each additional year of age at baseline. A proportional hazards model showed that both baseline motor function level and its rate of change were independent predictors of death and incident disability (all p's <0.001). These models showed that the additional annual amount of motor decline in 85-year-old persons at baseline versus 65-year-old persons was associated with a 1.5-fold higher rate of death and a 3-fold higher rate of developing Katz disability. Conclusions The rate of declining motor function accelerates with increasing age, and its rate of decline predicts adverse health outcomes in older African Americans. PMID:26209439

  13. Effect of Contract Compliance Rate to a Fourth-Generation Telehealth Program on the Risk of Hospitalization in Patients With Chronic Kidney Disease: Retrospective Cohort Study.

    PubMed

    Hung, Chi-Sheng; Lee, Jenkuang; Chen, Ying-Hsien; Huang, Ching-Chang; Wu, Vin-Cent; Wu, Hui-Wen; Chuang, Pao-Yu; Ho, Yi-Lwun

    2018-01-24

    Chronic kidney disease (CKD) is prevalent in Taiwan and is associated with high all-cause mortality. We have shown in a previous paper that a fourth-generation telehealth program is associated with lower all-cause mortality compared to usual care, with a hazard ratio of 0.866 (95% CI 0.837-0.896). This study aimed to evaluate the effect of renal function status on hospitalization among patients receiving this program and to examine the relationship between contract compliance rate to the program and risk of hospitalization in patients with CKD. We retrospectively analyzed 715 patients receiving the telehealth care program. Contract compliance rate was defined as the percentage of days covered by the telehealth service before hospitalization. Patients were stratified into three groups according to renal function status: (1) normal renal function, (2) CKD, or (3) end-stage renal disease (ESRD) and on maintenance dialysis. The outcome measurements were first cardiovascular and all-cause hospitalizations. The association between contract compliance rate, renal function status, and hospitalization risk was analyzed with a Cox proportional hazards model with time-dependent covariates. The median follow-up duration was 694 days (IQR 338-1163). Contract compliance rate had a triphasic relationship with cardiovascular and all-cause hospitalizations. Patients with low or very high contract compliance rates were associated with a higher risk of hospitalization. Patients with CKD or ESRD were also associated with a higher risk of hospitalization. Moreover, we observed a significant interaction between the effects of renal function status and contract compliance rate on the risk of hospitalization: patients with ESRD, who were on dialysis, had an increased risk of hospitalization at a lower contract compliance rate, compared with patients with normal renal function or CKD.
Our study showed that there was a triphasic relationship between contract compliance rate to the telehealth program and risk of hospitalization. Renal function status was associated with risk of hospitalization among these patients, and there was a significant interaction with contract compliance rate. ©Chi-Sheng Hung, Jenkuang Lee, Ying-Hsien Chen, Ching-Chang Huang, Vin-Cent Wu, Hui-Wen Wu, Pao-Yu Chuang, Yi-Lwun Ho. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 24.01.2018.

  14. Mental disorder and long-term risk of mortality: 41 years of follow-up of a population sample in Stockholm, Sweden.

    PubMed

    Lundin, A; Modig, K; Halldin, J; Carlsson, A C; Wändell, P; Theobald, H

    2016-08-01

    An increased mortality risk associated with mental disorder has been reported for patients, but few studies are based on random samples with interview-based psychiatric diagnoses. Part of the increased mortality for those with mental disorder may be attributable to worse somatic health or hazardous health behaviour, consequences of the disorder, but somatic health information is commonly lacking in psychiatric samples. This study aims to examine the long-term mortality risk for psychiatric diagnoses in a general population sample and to assess mediation by somatic ill health and hazardous health behaviour. We used a double-phase stratified random sample of individuals aged 18-65 in Stockholm County 1970-1971 linked to vital records. The first-phase sample comprised 32 186 individuals screened with a postal questionnaire; the second phase comprised 1896 individuals (920 men and 976 women) who participated in a full-day examination (participation rate 88%). The baseline examination included both a semi-structured interview with a psychiatrist, with mental disorders diagnosed according to the 8th revision of the International Classification of Diseases (ICD-8), and a clinical somatic examination, including measures of body composition (BMI), hypertension, fasting blood glucose, pulmonary function and self-reported tobacco smoking. Information on vital status was obtained from the Total Population Register for the years 1970-2011. Associations with mortality were studied with Cox proportional hazards analyses. A total of 883 deaths occurred among the participants during the 41-year follow-up. Increased mortality rates were found for ICD-8 functional psychoses (hazard ratio, HR = 2.22, 95% confidence interval (95% CI): 1.15-4.30); psycho-organic symptoms (HR = 1.94, 95% CI: 1.31-2.87); depressive neuroses (HR = 1.71, 95% CI: 1.23-2.39); alcohol use disorder (HR = 1.91, 95% CI: 1.40-2.61); drug dependence (HR = 3.71, 95% CI: 1.80-7.65) and psychopathy (HR = 2.88, 95% CI: 1.02-8.16).
Non-participants (n = 349) had mortality rates similar to participants (HR = 0.98, 95% CI: 0.81-1.18). In subgroup analyses of those with psychoses, depression or alcohol use disorder, adjusting for the potential mediators smoking and pulmonary function produced only slight changes in the HRs. This study confirms the increased mortality risk for several psychiatric diagnoses reported in follow-up studies of American, Finnish and Swedish population-based samples. Only a small part of the increased mortality hazard was attributable to differences in somatic health or hazardous health behaviour measured at baseline.

  15. Effect of time of day and duration into shift on hazardous exposures to biological fluids.

    PubMed

    Macias, D J; Hafner, J; Brillman, J C; Tandberg, D

    1996-06-01

    To determine whether hospital employee biological hazardous exposure rates varied with time of day or increased with time interval into shift. This was a retrospective occurrence report review conducted at a university hospital with an emergency medicine residency program. Health care worker biological hazardous exposure data over a 30-month period were reviewed. Professional status, date, time, and type of exposure (needlestick, laceration, splash), time interval into shift of exposure, and hospital location of exposure were recorded. Hourly employee counts and risky procedure counts were matched by location with each reported exposure, to determine hourly rates of biological hazardous exposures. Analysis of 411 recorded exposures demonstrated that more people were exposed between 9:00 AM and 11:00 AM (p < 0.05), yet the exposure risk did not vary significantly when expressed as the number of exposures per worker or per procedure. Of the 393 exposures with data describing time interval into shift when the exposure occurred, significant numbers of exposures occurred during the first hour and at shift's end [when corrected for exposures per worker (p < 0.05) or exposures per procedure (p < 0.05)]. While the number of exposures is increased in the AM hours, the exposure rate (as a function of workers or procedures) does not vary with time of day. However, the exposure rate is increased during the first hour and last 2 hours of a shift. Efforts to increase worker precautions at the beginning and end of shifts are warranted.

  16. Micro-foundation using percolation theory of the finite time singular behavior of the crash hazard rate in a class of rational expectation bubbles

    NASA Astrophysics Data System (ADS)

    Seyrich, Maximilian; Sornette, Didier

    2016-04-01

    We present a plausible micro-founded model for the previously postulated power law finite time singular form of the crash hazard rate in the Johansen-Ledoit-Sornette (JLS) model of rational expectation bubbles. The model is based on a percolation picture of the network of traders and the concept that clusters of connected traders share the same opinion. The key ingredient is the notion that a shift of position from buyer to seller of a sufficiently large group of traders can trigger a crash. This provides a formula to estimate the crash hazard rate by summation, over percolation clusters above a minimum size, of a power s^a (with a > 1) of the cluster sizes s, similarly to a generalized percolation susceptibility. The power s^a of cluster sizes emerges from the super-linear dependence of group activity as a function of group size, previously documented in the literature. The crash hazard rate exhibits explosive finite time singular behavior when the control parameter (fraction of occupied sites, or density of traders in the network) approaches the percolation threshold pc. Realistic dynamics are generated by modeling the density of traders on the percolation network by an Ornstein-Uhlenbeck process, whose memory controls the spontaneous excursion of the control parameter close to the critical region of bubble formation. Our numerical simulations recover the main stylized properties of the JLS model, with intermittent explosive super-exponential bubbles interrupted by crashes.
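
    The susceptibility-like sum over clusters can be illustrated with a minimal site-percolation experiment: occupy grid sites with probability p, extract cluster sizes, and sum s^a over clusters above a minimum size. The grid size, exponent a, and size cutoff below are arbitrary choices, and the sketch omits the Ornstein-Uhlenbeck dynamics of the trader density.

```python
import random
from collections import deque

def cluster_sizes(n, p, seed):
    """Site percolation on an n x n grid: occupy each site with prob p,
    then return connected-cluster sizes (4-neighbour connectivity, BFS)."""
    rng = random.Random(seed)
    occ = [[rng.random() < p for _ in range(n)] for _ in range(n)]
    seen = [[False] * n for _ in range(n)]
    sizes = []
    for i in range(n):
        for j in range(n):
            if occ[i][j] and not seen[i][j]:
                size, q = 0, deque([(i, j)])
                seen[i][j] = True
                while q:
                    x, y = q.popleft()
                    size += 1
                    for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        u, v = x + dx, y + dy
                        if 0 <= u < n and 0 <= v < n and occ[u][v] and not seen[u][v]:
                            seen[u][v] = True
                            q.append((u, v))
                sizes.append(size)
    return sizes

def crash_hazard_proxy(n, p, a=1.5, s_min=3, seed=0):
    """Sum of s^a over clusters of size >= s_min, per site: a generalized
    susceptibility that grows sharply as p nears the percolation threshold."""
    return sum(s ** a for s in cluster_sizes(n, p, seed) if s >= s_min) / n ** 2

for p in (0.30, 0.45, 0.55):  # 2D site percolation threshold pc ~ 0.593
    print(f"p={p}: hazard proxy {crash_hazard_proxy(200, p):.3f}")
```

    With the seeded run, the proxy grows sharply between p = 0.30 and p = 0.55 as large clusters emerge on approach to the threshold, mimicking the explosive growth of the crash hazard rate near pc.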

  17. Automated design synthesis of robotic/human workcells for improved manufacturing system design in hazardous environments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Williams, Joshua M.

    Manufacturing tasks that are deemed too hazardous for workers require the use of automation, robotics, and/or other remote handling tools. The associated hazards may be radiological or nonradiological, and based on the characteristics of the environment and processing, a design may necessitate robotic labor, human labor, or both. Other factors such as cost, ergonomics, maintenance, and efficiency also affect task allocation and other design choices. Handling the tradeoffs among these factors can be complex, and lack of experience can be an issue when trying to determine whether and which feasible automation/robotics options exist. To address this problem, we utilize common engineering design approaches adapted for manufacturing system design in hazardous environments. We limit our scope to the conceptual and embodiment design stages, specifically a computational algorithm for concept generation and early design evaluation. For concept generation, we first develop the functional model or function structure for the process, using the common 'verb-noun' format for describing function. A common language or functional basis for manufacturing was developed and utilized to formalize function descriptions and guide rules for function decomposition. Potential components for embodiment are also grouped in terms of this functional language and are stored in a database. The properties of each component are given as quantitative and qualitative criteria. Operators are also rated for task-relevant criteria, which are used to address task compatibility. Through the gathering of process requirements/constraints, construction of the component database, and development of the manufacturing basis and rule set, design knowledge is stored and available for computer use.
Thus, once the higher-level process functions are defined, the computer can automate the synthesis of new design concepts through alternating steps of embodiment and function structure updates/decomposition. In the process, criteria guide the function allocation of components/operators and help ensure compatibility and feasibility. Through multiple function assignment options and varied function structures, multiple design concepts are created. All of the generated designs are then evaluated on a number of relevant criteria: cost, dose, ergonomics, hazards, efficiency, etc. These criteria are computed using physical properties/parameters of each system, based on the qualities an engineer would use to make evaluations. Nuclear processes such as oxide conversion and electrorefining are utilized to aid algorithm development and provide test cases for the completed program. Through our approach, we capture design knowledge related to manufacturing and other operations in hazardous environments to enable a computational program to automatically generate and evaluate system design concepts.

  18. Prediction of cardiovascular outcome by estimated glomerular filtration rate and estimated creatinine clearance in the high-risk hypertension population of the VALUE trial.

    PubMed

    Ruilope, Luis M; Zanchetti, Alberto; Julius, Stevo; McInnes, Gordon T; Segura, Julian; Stolt, Pelle; Hua, Tsushung A; Weber, Michael A; Jamerson, Ken

    2007-07-01

    Reduced renal function is predictive of poor cardiovascular outcomes but the predictive value of different measures of renal function is uncertain. We compared the value of estimated creatinine clearance, using the Cockcroft-Gault formula, with that of estimated glomerular filtration rate (GFR), using the Modification of Diet in Renal Disease (MDRD) formula, as predictors of cardiovascular outcome in 15 245 high-risk hypertensive participants in the Valsartan Antihypertensive Long-term Use Evaluation (VALUE) trial. For the primary end-point, the three secondary end-points and for all-cause death, outcomes were compared for individuals with baseline estimated creatinine clearance and estimated GFR < 60 ml/min and ≥ 60 ml/min using hazard ratios and 95% confidence intervals. Coronary heart disease, left ventricular hypertrophy, age, sex and treatment effects were included as covariates in the model. For each end-point considered, the risk in individuals with poor renal function at baseline was greater than in those with better renal function. Estimated creatinine clearance (Cockcroft-Gault) was significantly predictive only of all-cause death [hazard ratio = 1.223, 95% confidence interval (CI) = 1.076-1.390; P = 0.0021] whereas estimated GFR was predictive of all outcomes except stroke. Hazard ratios (95% CIs) for estimated GFR were: primary cardiac end-point, 1.497 (1.332-1.682), P < 0.0001; myocardial infarction, 1.501 (1.254-1.796), P < 0.0001; congestive heart failure, 1.699 (1.435-2.013), P < 0.0001; stroke, 1.152 (0.952-1.394) P = 0.1452; and all-cause death, 1.231 (1.098-1.380), P = 0.0004. These results indicate that estimated glomerular filtration rate calculated with the MDRD formula is more informative than estimated creatinine clearance (Cockcroft-Gault) in the prediction of cardiovascular outcomes.
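
    The two renal function estimators compared in the study are standard published formulas, sketched below. Note the MDRD coefficient of 175 applies to IDMS-calibrated creatinine assays (the originally published formula used 186); which calibration the trial used is not stated in the abstract.

```python
def cockcroft_gault(age, weight_kg, scr_mg_dl, female):
    """Estimated creatinine clearance, ml/min (Cockcroft-Gault)."""
    crcl = (140 - age) * weight_kg / (72.0 * scr_mg_dl)
    return crcl * 0.85 if female else crcl

def mdrd_egfr(age, scr_mg_dl, female, black=False):
    """Estimated GFR, ml/min/1.73 m^2 (re-expressed 4-variable MDRD,
    IDMS-traceable creatinine; the original assay version used 186)."""
    egfr = 175.0 * scr_mg_dl ** -1.154 * age ** -0.203
    if female:
        egfr *= 0.742
    if black:
        egfr *= 1.212
    return egfr

# Example: a 70-year-old, 60 kg woman with serum creatinine 1.2 mg/dl.
print(f"CrCl (Cockcroft-Gault): {cockcroft_gault(70, 60, 1.2, True):.1f} ml/min")
print(f"eGFR (MDRD): {mdrd_egfr(70, 1.2, True):.1f} ml/min/1.73m^2")
```

    For this example patient both estimates fall below the study's 60 ml/min cutoff, though the two formulas can disagree near the threshold, which is why their predictive values differ.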

  19. Hazard Management Dealt by Safety Professionals in Colleges: The Impact of Individual Factors.

    PubMed

    Wu, Tsung-Chih; Chen, Chi-Hsiang; Yi, Nai-Wen; Lu, Pei-Chen; Yu, Shan-Chi; Wang, Chien-Peng

    2016-12-03

    Identifying, evaluating, and controlling workplace hazards are important functions of safety professionals (SPs). The purpose of this study was to investigate the content and frequency of hazard management dealt with by safety professionals in colleges. The authors also explored the effects of organizational and individual factors on SPs' perception of the frequency of hazard management. To achieve this objective, the researchers conducted a survey, mailing questionnaires to 200 SPs in colleges selected by simple random sampling and receiving 144 valid responses (response rate = 72%). Exploratory factor analysis indicated that the hazard management scale (HMS) comprised five factors: physical hazards, biological hazards, social and psychological hazards, ergonomic hazards, and chemical hazards. Moreover, the top 10 hazards that safety professionals were most likely to deal with (in order of most to least frequent) were: organic solvents, illumination, other chemicals, machinery and equipment, fire and explosion, electricity, noise, specific chemicals, human error, and lifting/carrying. Finally, one-way multivariate analysis of variance (MANOVA) indicated that four individual factors affected the perceived frequency of hazard management with both statistical and practical significance: job tenure in the college of employment, type of certification, gender, and overall job tenure. SPs within colleges and industries can now discuss plans revolving around these five areas instead of having to deal with all of the separate hazards.

  20. Hazard Management Dealt by Safety Professionals in Colleges: The Impact of Individual Factors

    PubMed Central

    Wu, Tsung-Chih; Chen, Chi-Hsiang; Yi, Nai-Wen; Lu, Pei-Chen; Yu, Shan-Chi; Wang, Chien-Peng

    2016-01-01

    Identifying, evaluating, and controlling workplace hazards are important functions of safety professionals (SPs). The purpose of this study was to investigate the content and frequency of hazard management dealt with by safety professionals in colleges. The authors also explored the effects of organizational and individual factors on SPs' perception of the frequency of hazard management. To achieve this objective, the researchers conducted a survey, mailing questionnaires to 200 SPs in colleges selected by simple random sampling and receiving 144 valid responses (response rate = 72%). Exploratory factor analysis indicated that the hazard management scale (HMS) comprised five factors: physical hazards, biological hazards, social and psychological hazards, ergonomic hazards, and chemical hazards. Moreover, the top 10 hazards that safety professionals were most likely to deal with (in order of most to least frequent) were: organic solvents, illumination, other chemicals, machinery and equipment, fire and explosion, electricity, noise, specific chemicals, human error, and lifting/carrying. Finally, one-way multivariate analysis of variance (MANOVA) indicated that four individual factors affected the perceived frequency of hazard management with both statistical and practical significance: job tenure in the college of employment, type of certification, gender, and overall job tenure. SPs within colleges and industries can now discuss plans revolving around these five areas instead of having to deal with all of the separate hazards. PMID:27918474

  1. Memory Hazard Functions: A Vehicle for Theory Development and Test

    ERIC Educational Resources Information Center

    Chechile, Richard A.

    2006-01-01

    A framework is developed to rigorously test an entire class of memory retention functions by examining hazard properties. Evidence is provided that the memory hazard function is not monotonically decreasing. Yet most of the proposals for retention functions, which have emerged from the psychological literature, imply that memory hazard is…

  2. Determinants of self-rated health: could health status explain the association between self-rated health and mortality?

    PubMed

    Murata, Chiyoe; Kondo, Takaaki; Tamakoshi, Koji; Yatsuya, Hiroshi; Toyoshima, Hideaki

    2006-01-01

    The purpose of this study was to investigate factors related to self-rated health and to mortality among 2490 community-living elderly people. Respondents were followed for 7.3 years for all-cause mortality. To compare the relative impact of each variable, we employed logistic regression analysis for self-rated health and Cox hazard analysis for mortality. Cox analyses stratified by gender, follow-up period, age group, and functional status were also employed. This series of analyses found that the factors associated with self-rated health and with mortality were not identical. Psychological factors such as perceived isolation at home or 'ikigai' (one aspect of psychological well-being) were associated with self-rated health only. Age, functional status, and social relations were associated both with self-rated health and with mortality after controlling for possible confounders. Illnesses and functional status accounted for 35-40% of the variance in fair/poor self-rated health. Differences by gender and functional status were observed in the factors related to self-rated health. Overall, the effect of self-rated health on mortality was stronger for people with no functional impairment, for shorter follow-up periods, and for the young-old age group. Although illnesses and functional status were major determinants of self-rated health, economic, psychological, and social factors were also related to self-rated health.

  3. Occupational-level interactions between physical hazards and cognitive ability and skill requirements in predicting injury incidence rates.

    PubMed

    Ford, Michael T; Wiggins, Bryan K

    2012-07-01

    Interactions between occupational-level physical hazards and cognitive ability and skill requirements were examined as predictors of injury incidence rates as reported by the U.S. Bureau of Labor Statistics. Based on ratings provided in the Occupational Information Network (O*NET) database, results across 563 occupations indicate that physical hazards at the occupational level were strongly related to injury incidence rates. Also, as expected, the physical hazard-injury rate relationship was stronger among occupations with high cognitive ability and skill requirements. In addition, there was an unexpected main effect such that occupations with high cognitive ability and skill requirements had lower injury rates even after controlling for physical hazards. The main effect of cognitive ability and skill requirements, combined with the interaction with physical hazards, resulted in unexpectedly high injury rates for low-ability and low-skill occupations with low physical hazard levels. Substantive and methodological explanations for these interactions and their theoretical and practical implications are offered. Results suggest that organizations and occupational health and safety researchers and practitioners should consider the occupational level of analysis and interactions between physical hazards and cognitive requirements in future research and practice when attempting to understand and prevent injuries.

  4. Monte Carlo simulation for slip rate sensitivity analysis in Cimandiri fault area

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pratama, Cecep, E-mail: great.pratama@gmail.com; Meilano, Irwan; Nugraha, Andri Dian

    Slip rate is used to estimate the earthquake recurrence relationship, which most strongly influences the hazard level. We examine the contribution of slip rate to Peak Ground Acceleration (PGA) in probabilistic seismic hazard maps (10% probability of exceedance in 50 years, or a 500-year return period). The hazard curve of PGA has been investigated for Sukabumi using PSHA (Probabilistic Seismic Hazard Analysis). We observe that the largest influence on the hazard estimate is the crustal fault. A Monte Carlo approach has been developed to assess the sensitivity, and the properties of the Monte Carlo simulations have been assessed. The uncertainty and coefficient of variation of the slip rate for the Cimandiri Fault area have been calculated. We observe that the seismic hazard estimate is sensitive to the fault slip rate, with a seismic hazard uncertainty of about 0.25 g. For a specific site, we found the seismic hazard estimate for Sukabumi to be between 0.4904 and 0.8465 g, with uncertainty between 0.0847 and 0.2389 g and COV between 17.7% and 29.8%.
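
    The slip-rate sensitivity analysis described above can be sketched as a Monte Carlo loop: sample the slip rate from its uncertainty distribution, push each sample through the hazard calculation, and summarize the spread of the resulting PGA estimates. A minimal sketch, assuming a normally distributed slip rate and substituting a purely hypothetical linear slip-rate-to-PGA model for the full PSHA computation:

```python
import random
import statistics

def mc_pga_sensitivity(slip_mean_mm_yr, slip_cov, n=20_000, seed=1):
    """Monte Carlo sketch: propagate slip-rate uncertainty into a PGA
    estimate and report the mean, standard deviation, and COV."""
    rng = random.Random(seed)
    pga = []
    for _ in range(n):
        slip = max(rng.gauss(slip_mean_mm_yr, slip_cov * slip_mean_mm_yr), 0.0)
        # Hypothetical hazard model: PGA (in g) proportional to slip rate.
        pga.append(0.13 * slip)
    mean = statistics.fmean(pga)
    sd = statistics.stdev(pga)
    return mean, sd, sd / mean

mean_g, sd_g, cov = mc_pga_sensitivity(slip_mean_mm_yr=5.0, slip_cov=0.25)
# With a linear model the output COV simply tracks the input COV (~25%),
# the same range as the 17.7%-29.8% COV reported for the Cimandiri area.
```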

  5. Boulder Distributions at Legacy Landing Sites: Assessing Regolith Production Rates and Landing Site Hazards

    NASA Technical Reports Server (NTRS)

    Watkins, R. N.; Jolliff, B. L.; Lawrence, S. J.; Hayne, P. O.; Ghent, R. R.

    2017-01-01

    Understanding how the distribution of boulders on the lunar surface changes over time is key to understanding small-scale erosion processes and the rate at which rocks become regolith. Boulders degrade over time, primarily as a result of micrometeorite bombardment, so their residence time at the surface can inform the rate at which rocks become regolith or become buried within it. Because of the gradual degradation of exposed boulders, we expect the boulder population around an impact crater to decrease as crater age increases. Boulder distributions around craters of varying ages are needed to understand regolith production rates, and Lunar Reconnaissance Orbiter Camera (LROC) Narrow Angle Camera (NAC) images provide one of the best tools for conducting these studies. Using NAC images to assess how the distribution of boulders varies as a function of crater age provides key constraints on boulder erosion processes. Boulders also represent a potential hazard that must be addressed in the planning of future lunar landings: a boulder under a landing leg can contribute to deck tilt, and boulders can damage spacecraft during landing. Using orbital data to characterize boulder populations at locations where landers have safely touched down (Apollo, Luna, Surveyor, Chang'e-3) provides validation for landed mission hazard avoidance planning. Additionally, counting boulders at legacy landing sites is useful because: 1) LROC has extensive coverage of these sites at high resolutions (approximately 0.5 meters per pixel); 2) returned samples from craters at these sites have been radiometrically dated, allowing assessment of how boulder distributions vary as a function of crater age; and 3) surface photos at these sites can be correlated with remote sensing measurements.

  6. Association of Pulse Wave Velocity With Chronic Kidney Disease Progression and Mortality: Findings From the CRIC Study (Chronic Renal Insufficiency Cohort).

    PubMed

    Townsend, Raymond R; Anderson, Amanda Hyre; Chirinos, Julio A; Feldman, Harold I; Grunwald, Juan E; Nessel, Lisa; Roy, Jason; Weir, Matthew R; Wright, Jackson T; Bansal, Nisha; Hsu, Chi-Yuan

    2018-06-01

    Patients with chronic kidney diseases (CKDs) are at risk for further loss of kidney function and death, which occur despite reasonable blood pressure treatment. To determine whether arterial stiffness influences CKD progression and death, independent of blood pressure, we conducted a prospective cohort study of CKD patients enrolled in the CRIC study (Chronic Renal Insufficiency Cohort). Using carotid-femoral pulse wave velocity (PWV), we examined the relationship between PWV and end-stage kidney disease (ESRD), ESRD or halving of estimated glomerular filtration rate, or death from any cause. The 2795 participants we enrolled had a mean age of 60 years, 56.4% were men, 47.3% had diabetes mellitus, and the average estimated glomerular filtration rate at entry was 44.4 mL/min per 1.73 m2. During follow-up, there were 504 ESRD events, 628 ESRD or halving of estimated glomerular filtration rate events, and 394 deaths. Patients with the highest tertile of PWV (>10.3 m/s) were at higher risk for ESRD (hazard ratio [95% confidence interval], 1.37 [1.05-1.80]), ESRD or 50% decline in estimated glomerular filtration rate (hazard ratio [95% confidence interval], 1.25 [0.98-1.58]), or death (hazard ratio [95% confidence interval], 1.72 [1.24-2.38]). PWV is a significant predictor of CKD progression and death in people with impaired kidney function. Incorporation of PWV measurements may help define better the risks for these important health outcomes in patients with CKDs. Interventions that reduce aortic stiffness deserve study in people with CKD. © 2018 American Heart Association, Inc.

  7. Temporal expectancy in the context of a theory of visual attention.

    PubMed

    Vangkilde, Signe; Petersen, Anders; Bundesen, Claus

    2013-10-19

    Temporal expectation is expectation with respect to the timing of an event such as the appearance of a certain stimulus. In this paper, temporal expectancy is investigated in the context of the theory of visual attention (TVA), and we begin by summarizing the foundations of this theoretical framework. Next, we present a parametric experiment exploring the effects of temporal expectation on perceptual processing speed in cued single-stimulus letter recognition with unspeeded motor responses. The length of the cue-stimulus foreperiod was exponentially distributed with one of six hazard rates varying between blocks. We hypothesized that this manipulation would result in a distinct temporal expectation in each hazard rate condition. Stimulus exposures were varied such that both the temporal threshold of conscious perception (t0, in ms) and the perceptual processing speed (v, in letters s^-1) could be estimated using TVA. We found that the temporal threshold t0 was unaffected by temporal expectation, but the perceptual processing speed v was a strikingly linear function of the logarithm of the hazard rate of the stimulus presentation. We argue that the effects on the v values were generated by changes in perceptual biases, suggesting that our perceptual biases are directly related to our temporal expectations.
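
    The hazard rate used here is the instantaneous event rate given that the event has not yet occurred, h(t) = f(t)/(1 - F(t)). For an exponentially distributed foreperiod this hazard is constant over time, which is what makes varying the rate parameter between blocks a clean manipulation of temporal expectation. A small numerical check of that property (the rate values are illustrative only):

```python
import math

def exp_hazard(t, rate):
    """Hazard h(t) = f(t) / (1 - F(t)) for an exponential distribution."""
    pdf = rate * math.exp(-rate * t)
    survival = math.exp(-rate * t)
    return pdf / survival

# The exponential is the unique continuous distribution with a flat
# hazard: h(t) equals the rate parameter at every time point.
hazards = [exp_hazard(t, rate=4.0) for t in (0.1, 0.5, 2.0)]
```

    The reported finding is then that the fitted processing speed v grows linearly in the logarithm of this constant per-block hazard rate.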

  8. Screening for Frailty in Older Patients With Early-Stage Solid Tumors: A Prospective Longitudinal Evaluation of Three Different Geriatric Tools.

    PubMed

    Biganzoli, Laura; Mislang, Anna Rachelle; Di Donato, Samantha; Becheri, Dimitri; Biagioni, Chiara; Vitale, Stefania; Sanna, Giuseppina; Zafarana, Elena; Gabellini, Stefano; Del Monte, Francesca; Mori, Elena; Pozzessere, Daniele; Brunello, Antonella; Luciani, Andrea; Laera, Letizia; Boni, Luca; Di Leo, Angelo; Mottino, Giuseppe

    2017-07-01

    Frailty increases the risk of adverse health outcomes and/or dying when exposed to a stressor, and routine frailty assessment is recommended to guide treatment decisions. The Balducci frailty criteria (BFC) and Fried frailty criteria (FFC) are commonly used, but both are time-consuming. A Vulnerable Elders Survey-13 (VES-13) score of ≥7, a simple and resource-conserving function-based scoring system, may be used instead. This prospective study evaluates the performance of VES-13 in parallel with the BFC and FFC to identify frailty in elderly patients with early-stage cancer. Patients aged ≥70 years with early-stage solid tumors were classified as frail/nonfrail based on BFC (≥1 criteria), FFC (≥3 criteria), and VES-13 (score ≥7). All patients were assessed for functional decline and death. We evaluated 185 patients. FFC yielded a 17% frailty rate, whereas BFC and VES-13 both yielded 25%, with poor concordance between the three geriatric tools. FFC (hazard ratio = 1.99, p = .003) and VES-13 (hazard ratio = 2.81, p < .001) strongly discriminated for functional decline, whereas BFC (hazard ratio = 3.29, p < .001) had the highest discriminatory rate for deaths. BFC and VES-13 remained prognostic for overall survival in multivariate analysis correcting for age, tumor type, stage, and systemic treatment. A VES-13 score of ≥7 is a valuable discriminating tool for predicting functional decline or death and can be used as a frailty-screening tool among older cancer patients in centers with limited resources to conduct a comprehensive geriatric assessment. © The Author 2017. Published by Oxford University Press on behalf of The Gerontological Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  9. An empirical method for estimating travel times for wet volcanic mass flows

    USGS Publications Warehouse

    Pierson, Thomas C.

    1998-01-01

    Travel times for wet volcanic mass flows (debris avalanches and lahars) can be forecast as a function of distance from source when the approximate flow rate (peak discharge near the source) can be estimated beforehand. The near-source flow rate is primarily a function of initial flow volume, which should be possible to estimate to an order of magnitude on the basis of geologic, geomorphic, and hydrologic factors at a particular volcano. Least-squares best fits to plots of flow-front travel time as a function of distance from source provide predictive second-degree polynomial equations with high coefficients of determination for four broad size classes of flow based on near-source flow rate: extremely large flows (>1 000 000 m3/s), very large flows (10 000–1 000 000 m3/s), large flows (1000–10 000 m3/s), and moderate flows (100–1000 m3/s). A strong nonlinear correlation that exists between initial total flow volume and flow rate for "instantaneously" generated debris flows can be used to estimate near-source flow rates in advance. Differences in geomorphic controlling factors among different flows in the data sets have relatively little effect on the strong nonlinear correlations between travel time and distance from source. Differences in flow type may be important, especially for extremely large flows, but this could not be evaluated here. At a given distance away from a volcano, travel times can vary by approximately an order of magnitude depending on flow rate. The method can provide emergency-management officials a means for estimating time windows for evacuation of communities located in hazard zones downstream from potentially hazardous volcanoes.
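
    The predictive equations described above are least-squares fits of a second-degree polynomial, travel time t = a·d² + b·d + c, within each flow-rate class. A minimal sketch using hypothetical arrival observations (not data from the paper) for a single class:

```python
import numpy as np

# Hypothetical flow-front observations for one flow-rate class:
# distance from source (km) versus flow-front travel time (min).
distance_km = np.array([5.0, 10.0, 20.0, 40.0, 60.0, 80.0])
travel_min = np.array([8.0, 16.0, 34.0, 77.0, 125.0, 185.0])

# Second-degree polynomial fit: t = a*d**2 + b*d + c
coeffs = np.polyfit(distance_km, travel_min, deg=2)
predict = np.poly1d(coeffs)

# Coefficient of determination, analogous to the paper's high R^2 values
resid = travel_min - predict(distance_km)
r2 = 1.0 - np.sum(resid**2) / np.sum((travel_min - travel_min.mean())**2)
```

    With the fitted polynomial in hand, forecasting an arrival window at a downstream community reduces to evaluating `predict` at that community's distance from source.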

  10. Maximum Likelihood Estimations and EM Algorithms with Length-biased Data

    PubMed Central

    Qin, Jing; Ning, Jing; Liu, Hao; Shen, Yu

    2012-01-01

    Length-biased sampling has been well recognized in economics, industrial reliability, etiology, epidemiology, genetics, and cancer screening studies. Length-biased right-censored data have a unique structure different from traditional survival data, and the nonparametric and semiparametric estimation and inference methods for traditional survival data are not directly applicable to them. We propose new expectation-maximization algorithms for estimation based on full likelihoods involving infinite-dimensional parameters under three settings for length-biased data: estimating the nonparametric distribution function, estimating the nonparametric hazard function under an increasing failure rate constraint, and jointly estimating the baseline hazard function and the covariate coefficients under the Cox proportional hazards model. Extensive empirical simulation studies show that the maximum likelihood estimators perform well with moderate sample sizes and are more efficient than the estimating-equation approaches. The proposed estimates are also more robust to various right-censoring mechanisms. We prove the strong consistency of the estimators, and establish the asymptotic normality of the semiparametric maximum likelihood estimators under the Cox model using modern empirical process theory. We apply the proposed methods to a prevalent cohort medical study. Supplemental materials are available online. PMID:22323840
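
    Length bias means a subject's chance of being sampled is proportional to its duration, so the observed density is g(x) = x·f(x)/μ, where μ is the mean of the underlying density f. A small simulation sketch of this effect (illustrative only; it is not the paper's EM machinery): for Exp(1) durations the length-biased mean is E[X²]/E[X] = 2.

```python
import random
import statistics

rng = random.Random(7)

# Underlying durations: Exp(1), with mean 1.
pool = [rng.expovariate(1.0) for _ in range(100_000)]

# Length-biased sampling: draw with probability proportional to duration.
biased = rng.choices(pool, weights=pool, k=100_000)

unbiased_mean = statistics.fmean(pool)   # close to 1.0
biased_mean = statistics.fmean(biased)   # close to 2.0 = E[X^2]/E[X]
```

    The doubling of the observed mean is exactly the bias the paper's likelihood-based estimators are built to undo.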

  11. Mussel-inspired functionalization of electrochemically exfoliated graphene: Based on self-polymerization of dopamine and its suppression effect on the fire hazards and smoke toxicity of thermoplastic polyurethane.

    PubMed

    Cai, Wei; Wang, Junling; Pan, Ying; Guo, Wenwen; Mu, Xiaowei; Feng, Xiaming; Yuan, Bihe; Wang, Xin; Hu, Yuan

    2018-06-15

    The ability of graphene to suppress the fire hazards and smoke toxicity of polymer composites has been seriously limited by both mass production and weak interfacial interaction. Though electrochemical preparation provides a viable route to mass production, exfoliated graphene cannot bond strongly with polar polymer chains. Herein, mussel-inspired functionalization of electrochemically exfoliated graphene was successfully performed, and the product was added to a polar thermoplastic polyurethane (TPU) matrix. As confirmed by SEM images of the fracture surface, the functionalized graphene, possessing abundant hydroxyl groups, forms strong chain interactions with TPU. With the incorporation of 2.0 wt% f-GNS, the peak heat release rate (pHRR), total heat release (THR), specific extinction area (SEA), and smoke production rate (SPR) of the TPU composites decreased by approximately 59.4%, 27.1%, 31.9%, and 26.7%, respectively. A probable flame-retardant mechanism was hypothesized: well-dispersed f-GNS creates a tortuous path and, acting as a barrier, hinders the exchange of degradation products. Large quantities of degradation products gather around the f-GNS and react with the flame retardant to produce a cross-linked, highly graphitized residual char. This simple functionalization of electrochemically exfoliated graphene advances the application of graphene in flame-retardant composites. Copyright © 2018 Elsevier B.V. All rights reserved.

  12. Risk factors for stent graft thrombosis after transjugular intrahepatic portosystemic shunt creation.

    PubMed

    Jahangiri, Younes; Kerrigan, Timothy; Li, Lei; Prosser, Dominik; Brar, Anantnoor; Righetti, Johnathan; Schenning, Ryan C; Kaufman, John A; Farsad, Khashayar

    2017-12-01

    To identify risk factors for stent graft thrombosis after transjugular intrahepatic portosystemic shunt (TIPS) creation, patients who underwent TIPS creation between June 2003 and January 2016 with follow-up assessing stent graft patency were included (n=174). Baseline comorbidities, liver function, procedural details, and follow-up liver function tests were analyzed in association with the hazard of thrombosis on follow-up. Competing-risk Cox regression models were used, with liver transplant after TIPS creation as the competing risk. One-, 2- and 5-year primary patency rates were 94.1%, 91.7% and 78.2%, respectively. Patient age [sub-hazard ratio (sHR): 1.13; P=0.001], body mass index (BMI) <30 (sHR: 33.08; P=0.008) and a higher post-TIPS portosystemic pressure gradient (sHR: 1.14; P=0.023) were significantly associated with TIPS thrombosis in multivariate analysis. A higher rate of TIPS thrombosis was observed in those for whom the procedure was clinically unsuccessful (P=0.014). A significant increase in the incidence of thrombosis was noted with increasing tertiles of post-TIPS portosystemic gradients (P value for trend=0.017). Older age, lower BMI and higher post-TIPS portosystemic gradients were associated with higher hazards of shunt thrombosis after TIPS creation using stent grafts. Higher rates of shunt thrombosis were seen in patients for whom TIPS creation was clinically unsuccessful. The association between TIPS thrombosis and higher post-TIPS portosystemic gradients may indicate impaired flow through the shunt, a finding which may be technical or anatomic in nature and should be assessed before procedure completion.

  13. The limits on the usefulness of erosion hazard ratings

    Treesearch

    R. M. Rice; P. D. Gradek

    1984-01-01

    Although erosion-hazard ratings are often used to guide forest practices, those used in California from 1974 to 1982 have been inadequate for estimating erosion potential. To improve the erosion-hazard rating procedure, separate estimating equations were used for different situations. The ratings were partitioned according to yarding method, erosional process, and...

  14. SEISRISK II; a computer program for seismic hazard estimation

    USGS Publications Warehouse

    Bender, Bernice; Perkins, D.M.

    1982-01-01

    The computer program SEISRISK II calculates probabilistic ground motion values for use in seismic hazard mapping. SEISRISK II employs a model that allows earthquakes to occur as points within source zones and as finite-length ruptures along faults. It assumes that earthquake occurrences have a Poisson distribution, that occurrence rates remain constant during the time period considered, that ground motion resulting from an earthquake is a known function of magnitude and distance, that seismically homogeneous source zones are defined, that fault locations are known, that fault rupture lengths depend on magnitude, and that earthquake rates as a function of magnitude are specified for each source. SEISRISK II calculates for each site on a grid of sites the level of ground motion that has a specified probability of being exceeded during a given time period. The program was designed to process a large (essentially unlimited) number of sites and sources efficiently and has been used to produce regional and national maps of seismic hazard. It is a substantial revision of an earlier program, SEISRISK I, which has never been documented. SEISRISK II runs considerably faster and gives more accurate results than the earlier program, and in addition includes rupture length and acceleration variability, which were not contained in the original version. We describe the model and how it is implemented in the computer program, and provide a flowchart and listing of the code.
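
    The Poisson occurrence assumption above yields the standard relation between the annual exceedance rate λ, the exposure time T, and the probability of at least one exceedance, P = 1 - exp(-λT), with mean return period 1/λ. A short sketch of this relation (the function names are ours, not part of SEISRISK II):

```python
import math

def prob_exceedance(annual_rate, years):
    """P(at least one exceedance in `years`) under a Poisson model."""
    return 1.0 - math.exp(-annual_rate * years)

def rate_for_probability(prob, years):
    """Invert the relation: the annual rate giving `prob` in `years`."""
    return -math.log(1.0 - prob) / years

# The common mapping level "10% probability of exceedance in 50 years"
rate = rate_for_probability(0.10, 50.0)
return_period = 1.0 / rate  # about 475 years
```

    This inversion is why "10% in 50 years" maps are conventionally described as roughly 475-year (often rounded to 500-year) return-period maps.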

  15. Stand hazard rating for central Idaho forests

    Treesearch

    Robert Steele; Ralph E. Williams; Julie C. Weatherby; Elizabeth D. Reinhardt; James T. Hoffman; R. W. Thier

    1996-01-01

    Growing concern over sustainability of central ldaho forests has created a need to assess the health of forest stands on a relative basis. A stand hazard rating was developed as a composite of 11 individual ratings to compare the health hazards of different stands. The composite rating includes Douglas-fir beetle, mountain pine beetle, western pine beetle, spruce...

  16. 77 FR 72905 - Pipeline Safety: Random Drug Testing Rate; Contractor MIS Reporting; and Obtaining DAMIS Sign-In...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-12-06

    ... Drug and Alcohol Management Information System (DAMIS) to operators, but will make the user name and... DAMIS Sign-In Information AGENCY: Pipeline and Hazardous Materials Safety Administration (PHMSA), DOT... testing information must be submitted for contractors performing or ready to perform covered functions...

  17. Seismic hazard maps for Haiti

    USGS Publications Warehouse

    Frankel, Arthur; Harmsen, Stephen; Mueller, Charles; Calais, Eric; Haase, Jennifer

    2011-01-01

    We have produced probabilistic seismic hazard maps of Haiti for peak ground acceleration and response spectral accelerations that include the hazard from the major crustal faults, subduction zones, and background earthquakes. The hazard from the Enriquillo-Plantain Garden, Septentrional, and Matheux-Neiba fault zones was estimated using fault slip rates determined from GPS measurements. The hazard from the subduction zones along the northern and southeastern coasts of Hispaniola was calculated from slip rates derived from GPS data and the overall plate motion. Hazard maps were made for a firm-rock site condition and for a grid of shallow shear-wave velocities estimated from topographic slope. The maps show substantial hazard throughout Haiti, with the highest values along the Enriquillo-Plantain Garden and Septentrional fault zones. The Matheux-Neiba Fault exhibits high hazard in the maps for 2% probability of exceedance in 50 years, although its slip rate is poorly constrained.

  18. Connoted hazard and perceived importance of fluorescent, neon, and standard safety colors.

    PubMed

    Zielinska, O A; Mayhorn, C B; Wogalter, M S

    2017-11-01

    The perceived hazard and rated importance of standard safety, fluorescent, and neon colors are investigated. Colors are used in warnings to enhance hazard communication. Red has consistently been rated as the highest in perceived hazard. Orange, yellow, and black are the next highest in connoted hazard; however, there is discrepancy in their ordering. Safety standards, such as ANSI Z535.1, also list colors to convey important information, but little research has examined the perceived importance of colors. In addition to standard safety colors, fluorescent colors are more commonly used in warnings. Understanding hazard and importance perceptions of standard safety and fluorescent colors is necessary to create effective warnings. Ninety participants rated and ranked a total of 33 colors on both perceived hazard and perceived importance. Rated highest were the safety red colors from the American National Standard Institute (ANSI), International Organization for Standardization (ISO), and Federal Highway Administration (FHWA) together with three fluorescent colors (orange, yellow, and yellow-green) from 3M on both dimensions. Rankings were similar to ratings except that fluorescent orange was the highest on perceived hazard, while fluorescent orange and safety red from the ANSI were ranked as the highest in perceived importance. Fluorescent colors convey hazard and importance levels as high as the standard safety red colors. Implications for conveying hazard and importance in warnings through color are discussed. Copyright © 2017 Elsevier Ltd. All rights reserved.

  19. Support Vector Hazards Machine: A Counting Process Framework for Learning Risk Scores for Censored Outcomes.

    PubMed

    Wang, Yuanjia; Chen, Tianle; Zeng, Donglin

    2016-01-01

    Learning risk scores to predict dichotomous or continuous outcomes using machine learning approaches has been studied extensively. However, how to learn risk scores for time-to-event outcomes subject to right censoring has received little attention until recently. Existing approaches rely on inverse probability weighting or rank-based regression, which may be inefficient. In this paper, we develop a new support vector hazards machine (SVHM) approach to predict censored outcomes. Our method is based on predicting the counting process associated with the time-to-event outcomes among subjects at risk via a series of support vector machines. Introducing counting processes to represent time-to-event data leads to a connection between support vector machines in supervised learning and hazards regression in standard survival analysis. To account for different at-risk populations at observed event times, a time-varying offset is used in estimating risk scores. The resulting optimization is a convex quadratic programming problem that can easily incorporate non-linearity using the kernel trick. We demonstrate an interesting link from the profiled empirical risk function of SVHM to the Cox partial likelihood. We then formally show that SVHM is optimal in discriminating the covariate-specific hazard function from the population-average hazard function, and establish the consistency and learning rate of the predicted risk using the estimated risk scores. Simulation studies show improved prediction accuracy of the event times using SVHM compared with existing machine learning methods and standard conventional approaches. Finally, we analyze data from two real-world biomedical studies in which clinical markers and neuroimaging biomarkers are used to predict the age at onset of a disease, and demonstrate the superiority of SVHM in distinguishing high-risk from low-risk subjects.

  20. Risk and safety perception on urban and rural roads: Effects of environmental features, driver age and risk sensitivity.

    PubMed

    Cox, Jolene A; Beanland, Vanessa; Filtness, Ashleigh J

    2017-10-03

    The ability to detect changing visual information is a vital component of safe driving. In addition to detecting changing visual information, drivers must also interpret its relevance to safety. Environmental changes considered to have high safety relevance will likely demand greater attention and more timely responses than those considered to have lower safety relevance. The aim of this study was to explore factors that are likely to influence perceptions of risk and safety regarding changing visual information in the driving environment. Factors explored were the environment in which the change occurs (i.e., urban vs. rural), the type of object that changes, and the driver's age, experience, and risk sensitivity. Sixty-three licensed drivers aged 18-70 years completed a hazard rating task, which required them to rate the perceived hazardousness of changing specific elements within urban and rural driving environments. Three attributes of potential hazards were systematically manipulated: the environment (urban, rural); the type of object changed (road sign, car, motorcycle, pedestrian, traffic light, animal, tree); and its inherent safety risk (low risk, high risk). Inherent safety risk was manipulated by either varying the object's placement, on/near or away from the road, or altering an infrastructure element that would require a change to driver behavior. Participants also completed two driving-related risk perception tasks, rating their relative crash risk and perceived risk of aberrant driving behaviors. Driver age was not significantly associated with hazard ratings, but individual differences in perceived risk of aberrant driving behaviors predicted hazard ratings, suggesting that general driving-related risk sensitivity plays a strong role in safety perception. 
In both urban and rural scenes, there were significant associations between hazard ratings and inherent safety risk, with low-risk changes perceived as consistently less hazardous than high-risk changes; however, the effect was larger for urban environments. There were also effects of object type, with certain objects rated as consistently more safety relevant. In urban scenes, changes involving pedestrians were rated significantly more hazardous than all other objects, and in rural scenes, changes involving animals were rated as significantly more hazardous. Notably, hazard ratings were found to be higher in urban compared with rural driving environments, even when changes were matched between environments. This study demonstrates that drivers perceive rural roads as less risky than urban roads, even when similar scenarios occur in both environments. Age did not affect hazard ratings. Instead, the findings suggest that the assessment of risk posed by hazards is influenced more by individual differences in risk sensitivity. This highlights the need for driver education to account for appraisal of hazards' risk and relevance, in addition to hazard detection, when considering factors that promote road safety.

  1. State-space modeling of the relationship between air quality and mortality.

    PubMed

    Murray, C J; Nelson, C R

    2000-07-01

    A portion of a population is assumed to be at risk, with the mortality hazard varying with atmospheric conditions including total suspended particulates (TSP). This at-risk population is not observed and the hazard function is unknown; we wish to estimate these from mortality count and atmospheric variables. Consideration of population dynamics leads to a state-space representation, allowing the Kalman Filter (KF) to be used for estimation. A harvesting effect is thus implied; high mortality is followed by lower mortality until the population is replenished by new arrivals. The model is applied to daily data for Philadelphia, PA, 1973-1990. The estimated hazard function rises with the level of TSP and at extremes of temperature and also reflects a positive interaction between TSP and temperature. The estimated at-risk population averages about 480 and varies seasonally. We find that lags of TSP are statistically significant, but the presence of negative coefficients suggests their role may be partially statistical rather than biological. In the population dynamics framework, the natural metric for health damage from air pollution is its impact on life expectancy. The range of hazard rates over the sample period is 0.07 to 0.085, corresponding to life expectancies of 14.3 and 11.8 days, respectively.
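The closing life-expectancy figures are simply the reciprocal of the hazard rate: for a constant daily hazard h, remaining lifetime is exponentially distributed with mean 1/h. A minimal check of the reported numbers:

```python
# Life expectancy implied by a constant daily mortality hazard rate:
# with constant hazard h per day, remaining lifetime is exponentially
# distributed with mean 1/h days.

def life_expectancy_days(hazard_per_day: float) -> float:
    """Mean of an exponential lifetime with the given daily hazard rate."""
    return 1.0 / hazard_per_day

for h in (0.07, 0.085):
    print(f"hazard {h}/day -> life expectancy {life_expectancy_days(h):.1f} days")
# hazard 0.07/day -> 14.3 days; hazard 0.085/day -> 11.8 days
```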

  2. Association between Late-Life Social Activity and Motor Decline in Older Adults

    PubMed Central

    Buchman, Aron S.; Boyle, Patricia A.; Wilson, Robert S.; Fleischman, Debra A.; Leurgans, Sue; Bennett, David A.

    2009-01-01

Background Loss of motor function is a common consequence of aging, but little is known about factors that predict idiopathic motor decline. Methods We studied 906 persons without dementia, history of stroke or Parkinson's disease participating in the Rush Memory and Aging Project. At baseline, they rated their frequency of participation in common social activities. Outcome was annual change in global motor function, based on nine measures of muscle strength and nine motor performances. Results Mean social activity score at baseline was 2.6 (SD=0.58), with higher scores indicating more frequent participation in social activities. In a generalized estimating equation model, controlling for age, sex and education, motor function declined by about 0.05 unit/year [95%CI (-0.057, -0.041)]. Each 1-point decrease in social activity was associated with about a 33% more rapid rate of decline in motor function [Estimate, 0.016; 95%CI (0.003, 0.029); p=0.017]. This amount of annual motor decline was associated with a more than 40% increased risk of death (Hazard Ratio: 1.44; 95%CI: 1.30, 1.60) and a 65% increased risk of incident Katz disability (Hazard Ratio: 1.65; 95%CI: 1.48, 1.83). The association of social activity with change in motor function did not vary along demographic lines and was unchanged after controlling for potential confounders including late-life physical and cognitive activity, disability, global cognition, depressive symptoms, body composition and chronic medical conditions [Estimate, 0.025; 95%CI (0.005, 0.045); p=0.010]. Conclusion Less frequent participation in social activities is associated with a more rapid rate of motor decline in old age. PMID:19546415

  3. Temporal expectancy in the context of a theory of visual attention

    PubMed Central

    Vangkilde, Signe; Petersen, Anders; Bundesen, Claus

    2013-01-01

Temporal expectation is expectation with respect to the timing of an event such as the appearance of a certain stimulus. In this paper, temporal expectancy is investigated in the context of the theory of visual attention (TVA), and we begin by summarizing the foundations of this theoretical framework. Next, we present a parametric experiment exploring the effects of temporal expectation on perceptual processing speed in cued single-stimulus letter recognition with unspeeded motor responses. The length of the cue–stimulus foreperiod was exponentially distributed with one of six hazard rates varying between blocks. We hypothesized that this manipulation would result in a distinct temporal expectation in each hazard rate condition. Stimulus exposures were varied such that both the temporal threshold of conscious perception (t0 ms) and the perceptual processing speed (v letters s⁻¹) could be estimated using TVA. We found that the temporal threshold t0 was unaffected by temporal expectation, but the perceptual processing speed v was a strikingly linear function of the logarithm of the hazard rate of the stimulus presentation. We argue that the effects on the v values were generated by changes in perceptual biases, suggesting that our perceptual biases are directly related to our temporal expectations. PMID:24018716

  4. Association between late-life social activity and motor decline in older adults.

    PubMed

    Buchman, Aron S; Boyle, Patricia A; Wilson, Robert S; Fleischman, Debra A; Leurgans, Sue; Bennett, David A

    2009-06-22

Loss of motor function is a common consequence of aging, but little is known about the factors that predict idiopathic motor decline. Our objective was to test the hypothesis that late-life social activity is related to the rate of change in motor function in old age. Longitudinal cohort study with a mean follow-up of 4.9 years with 906 persons without stroke, Parkinson disease, or dementia participating in the Rush Memory and Aging Project. At baseline, participants rated the frequency of their current participation in common social activities, from which a summary measure of social activity was derived. The main outcome measure was annual change in a composite measure of global motor function, based on 9 measures of muscle strength and 9 motor performances. Mean (SD) social activity score at baseline was 2.6 (0.58), with higher scores indicating more frequent participation in social activities. In a generalized estimating equation model, controlling for age, sex, and education, global motor function declined by approximately 0.05 U/y (95% confidence interval [CI], -0.057 to -0.041). Each 1-point decrease in social activity was associated with approximately a 33% more rapid rate of decline in motor function (estimate, 0.016; 95% CI, 0.003 to 0.029 [P = .02]). The effect of each 1-point decrease in the social activity score at baseline on the rate of change in global motor function was the same as being approximately 5 years older at baseline (age estimate, -0.003; 95% CI, -0.004 to -0.002 [P<.001]). Furthermore, this amount of motor decline per year was associated with a more than 40% increased risk of death (hazard ratio, 1.44; 95% CI, 1.30 to 1.60) and a 65% increased risk of incident Katz disability (hazard ratio, 1.65; 95% CI, 1.48 to 1.83). 
The association of social activity with the rate of global motor decline did not vary along demographic lines and was unchanged (estimate, 0.025; 95% CI, 0.005 to 0.045 [P = .01]) after controlling for potential confounders including late-life physical and cognitive activity, disability, global cognition, depressive symptoms, body composition, and chronic medical conditions. Less frequent participation in social activities is associated with a more rapid rate of motor function decline in old age.

  5. Enclosure fire hazard analysis using relative energy release criteria. [burning rate and combustion control]

    NASA Technical Reports Server (NTRS)

    Coulbert, C. D.

    1978-01-01

    A method for predicting the probable course of fire development in an enclosure is presented. This fire modeling approach uses a graphic plot of five fire development constraints, the relative energy release criteria (RERC), to bound the heat release rates in an enclosure as a function of time. The five RERC are flame spread rate, fuel surface area, ventilation, enclosure volume, and total fuel load. They may be calculated versus time based on the specified or empirical conditions describing the specific enclosure, the fuel type and load, and the ventilation. The calculation of these five criteria, using the common basis of energy release rates versus time, provides a unifying framework for the utilization of available experimental data from all phases of fire development. The plot of these criteria reveals the probable fire development envelope and indicates which fire constraint will be controlling during a criteria time period. Examples of RERC application to fire characterization and control and to hazard analysis are presented along with recommendations for the further development of the concept.

  6. SEE Design Guide and Requirements for Electrical Deadfacing

    NASA Technical Reports Server (NTRS)

    Berki, Joe M.; Sargent, Noel; Kauffman, W. (Technical Monitor)

    2002-01-01

    The purpose of this design guide is to present information for understanding and mitigating the potential hazards associated with de-mating and mating powered electrical connectors on space flight vehicles. The process of staging is a necessary function in the launching of space vehicles and in the deployment of satellites, and now in manned assembly of systems in space. During this electrical interconnection process, various environments may be encountered that warrant the restriction of the voltage and current present across the pins of an electrical connector prior to separation, mating, or in a static open non-mated configuration. This process is called deadfacing. These potentially hazardous environments range from the obvious explosive fuel vapors and human shock hazards to multiple Electro-Magnetic Interference (EMI) phenomena related to the rapid rate of change in current, as well as exposure to Radio Frequency (RF) fields.

  7. The rockfall hazard rating system.

    DOT National Transportation Integrated Search

    1991-11-01

    The development and dissemination of the Rockfall Hazard Rating System (RHRS) is complete. RHRS is intended to be a proactive tool that will allow transportation agencies to address rationally their rockfall hazards instead of simply reacting to rock...

  8. [Occupational hazards survey of specially supervised enterprises during 2011-2012 in one district of Shenzhen, China].

    PubMed

    Zhang, Hongsheng; Zhang, Xianxing; Zhang, Chu; Liu, Song; He, Jian-Feng

    2014-04-01

    To analyze the results of an occupational hazards survey of specially supervised enterprises (156 enterprise-times) during 2011-2012 in one district of Shenzhen, China, to identify changes in occupational hazards in these enterprises, and to put forward countermeasures for the prevention and control of occupational hazards. Occupational hazards monitoring results for specially supervised enterprises (156 enterprise-times) during 2011-2012 were included. Comparisons were performed between different years, industries, occupational hazards, and sizes of enterprises. A total of 1274 monitoring sites from these specially supervised enterprises were included, of which the qualification rate was 73.55% (937/1274); the noise monitoring sites showed the lowest qualification rate. The overall qualification rate in 2012 (70.37%) was significantly lower than that in 2011 (80.94%) (χ² = 15.38, P < 0.01). In the electronics industry, the qualification rate in 2012 was significantly lower than that in 2011 (χ² = 11.27, P = 0.001). Comparison of various hazards across industries indicated that electronic enterprises and furniture enterprises had the lowest qualification rate in noise monitoring, printing enterprises had the lowest qualification rate in organic solvent monitoring, and furniture enterprises had the lowest qualification rate in dust monitoring. Comparison between different sizes of enterprises indicated that the qualification rate of large and medium enterprises in 2012 was significantly lower than that in 2011, while the qualification rate of small enterprises in 2012 was significantly higher than that in 2011 (P < 0.01 or P < 0.05). In the prevention and control of occupational hazards in specially supervised enterprises, special attention should be paid to the control of organic solvents in printing enterprises and of noise and dust in furniture enterprises.

  9. Idiopathic Pulmonary Fibrosis: Gender-Age-Physiology Index Stage for Predicting Future Lung Function Decline.

    PubMed

    Salisbury, Margaret L; Xia, Meng; Zhou, Yueren; Murray, Susan; Tayob, Nabihah; Brown, Kevin K; Wells, Athol U; Schmidt, Shelley L; Martinez, Fernando J; Flaherty, Kevin R

    2016-02-01

    Idiopathic pulmonary fibrosis is a progressive lung disease with variable course. The Gender-Age-Physiology (GAP) Index and staging system uses clinical variables to stage mortality risk. It is unknown whether clinical staging predicts future decline in pulmonary function. We assessed whether the GAP stage predicts future pulmonary function decline and whether interval pulmonary function change predicts mortality after accounting for stage. Patients with idiopathic pulmonary fibrosis (N = 657) were identified retrospectively at three tertiary referral centers, and baseline GAP stages were assessed. Mixed models were used to describe average trajectories of FVC and diffusing capacity of the lung for carbon monoxide (Dlco). Multivariable Cox proportional hazards models were used to assess whether declines in pulmonary function ≥ 10% in 6 months predict mortality after accounting for GAP stage. Over a 2-year period, GAP stage was not associated with differences in yearly lung function decline. After accounting for stage, a 10% decrease in FVC or Dlco over 6 months independently predicted death or transplantation (FVC hazard ratio, 1.37; Dlco hazard ratio, 1.30; both, P ≤ .03). Patients with GAP stage 2 with declining pulmonary function experienced a survival profile similar to patients with GAP stage 3, with 1-year event-free survival of 59.3% (95% CI, 49.4-67.8) vs 56.9% (95% CI, 42.2-69.1). Baseline GAP stage predicted death or lung transplantation but not the rate of future pulmonary function decline. After accounting for GAP stage, a decline of ≥ 10% over 6 months independently predicted death or lung transplantation. Copyright © 2016 American College of Chest Physicians. Published by Elsevier Inc. All rights reserved.

  10. Risk factors for stent graft thrombosis after transjugular intrahepatic portosystemic shunt creation

    PubMed Central

    Jahangiri, Younes; Kerrigan, Timothy; Li, Lei; Prosser, Dominik; Brar, Anantnoor; Righetti, Johnathan; Schenning, Ryan C.; Kaufman, John A.

    2017-01-01

    Background To identify risk factors of stent graft thrombosis after transjugular intrahepatic portosystemic shunt (TIPS) creation. Methods Patients who underwent TIPS creation between June 2003 and January 2016 and with follow-up assessing stent graft patency were included (n=174). Baseline comorbidities, liver function, procedural details and follow-up liver function tests were analyzed in association with hazards of thrombosis on follow-up. Competing-risk Cox regression models were used, considering liver transplant after TIPS creation as the competing risk variable. Results One-, 2- and 5-year primary patency rates were 94.1%, 91.7% and 78.2%, respectively. Patient age [sub-hazard ratio (sHR): 1.13; P=0.001], body mass index (BMI) <30 (sHR: 33.08; P=0.008) and a higher post-TIPS portosystemic pressure gradient (sHR: 1.14; P=0.023) were significantly associated with TIPS thrombosis in multivariate analysis. A higher rate of TIPS thrombosis was observed in those for whom the procedure was clinically unsuccessful (P=0.014). A significant increase in incidence of thrombosis was noted with increasing tertiles of post-TIPS portosystemic gradients (P value for trend=0.017). Conclusions Older age, lower BMI and higher post-TIPS portosystemic gradients were associated with higher hazards of shunt thrombosis after TIPS creation using stent grafts. Higher rates of shunt thrombosis were seen in patients for whom TIPS creation was clinically unsuccessful. The association between TIPS thrombosis and higher post-TIPS portosystemic gradients may indicate impaired flow through the shunt, a finding which may be technical or anatomic in nature and should be assessed before procedure completion. PMID:29399518

  11. Semi-parametric regression model for survival data: graphical visualization with R

    PubMed Central

    2016-01-01

    The Cox proportional hazards model is a semi-parametric model that leaves its baseline hazard function unspecified. The rationale for using the Cox model is that (I) assuming a specific parametric form for the underlying hazard function is stringent and often unrealistic, and (II) researchers are typically interested only in how the hazard changes with covariates (the relative hazard). A Cox regression model can be fitted easily with the coxph() function in the survival package. A stratified Cox model may be used for a covariate that violates the proportional hazards assumption. The relative importance of covariates in a population can be examined with the rankhazard package in R. Hazard ratio curves for continuous covariates can be visualized using the smoothHR package; such a curve helps to better understand the effect that each continuous covariate has on the outcome. The population attributable fraction is a classic quantity in epidemiology for evaluating the impact of a risk factor on the occurrence of an event in the population. In survival analysis, the adjusted/unadjusted attributable fraction can be plotted against survival time to obtain the attributable fraction function. PMID:28090517
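The point that the baseline hazard is left unspecified can be made concrete: in the Cox partial likelihood only the relative hazards exp(βx) appear, so the baseline cancels. A pure-Python sketch on invented toy data (the abstract itself works with R's coxph(); this only illustrates the likelihood, not that function):

```python
import math

# Hypothetical toy data, invented for illustration: four subjects, all with
# observed (uncensored) events, and a single binary covariate x.
times = [2.0, 3.0, 5.0, 8.0]
x     = [1,   0,   1,   0]

def cox_log_partial_likelihood(beta, times, x):
    """log PL(beta) = sum over events i of
       beta*x_i - log( sum_{j in risk set at t_i} exp(beta*x_j) ).
       The baseline hazard cancels and never appears in this expression."""
    order = sorted(range(len(times)), key=lambda i: times[i])
    ll = 0.0
    for k, i in enumerate(order):
        risk = order[k:]  # subjects whose observation time is >= t_i
        ll += beta * x[i] - math.log(sum(math.exp(beta * x[j]) for j in risk))
    return ll

# At beta = 0 each event contributes -log(|risk set|); the risk sets shrink
# 4 -> 3 -> 2 -> 1, so log PL(0) = -log(4*3*2*1) = -log 24.
print(cox_log_partial_likelihood(0.0, times, x))
```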

  12. Updated hazard rate equations for dual safeguard systems.

    PubMed

    Rothschild, Marc

    2007-04-11

    A previous paper by this author [M.J. Rothschild, Updated hazard rate equation for single safeguards, J. Hazard. Mater. 130 (1-2) (2006) 15-20] showed that commonly used analytical methods for quantifying failure rates overestimate the risk in some circumstances. This can lead the analyst to mistakenly believe that a given operation presents an unacceptable risk. For a single safeguard system, a formula was presented in that paper that accurately evaluates the risk over a wide range of conditions. This paper expands on that analysis by evaluating the failure rate for dual safeguard systems. The safeguards can be activated at the same time or at staggered times, and a safeguard may provide an indication of whether it was successful upon a challenge, or its status may go undetected. These combinations were evaluated using a Monte Carlo simulation, and empirical formulas for evaluating the hazard rate were developed from this analysis. It is shown that having the safeguards activate at the same time while providing positive feedback of their individual actions is the most effective arrangement for reducing the hazard rate. The hazard rate can also be reduced by staggering the testing schedules of the safeguards.
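The benefit of staggered testing can be illustrated with a deliberately simplified model (not the paper's equations): assume each safeguard fails dangerously at a constant rate λ and is restored at periodic proof tests, so its unavailability ramps up between tests, and a demand is unprotected only when both safeguards are down. Averaging the product of the two unavailabilities over a test cycle shows that the staggered schedule gives the lower mean failure probability:

```python
# Deliberately simplified sketch (not the paper's own formulas): two identical
# safeguards, each failing dangerously at constant rate lam and restored at
# periodic proof tests every tau time units, so a safeguard's unavailability
# grows as roughly lam * (time since its last test). A demand is unprotected
# only if both are down, so we average u1(t) * u2(t) over one test cycle.

def avg_dual_pfd(lam: float, tau: float, offset: float, n: int = 100_000) -> float:
    """Midpoint-rule average of the product of the two unavailabilities."""
    total = 0.0
    for k in range(n):
        t = tau * (k + 0.5) / n
        u1 = lam * (t % tau)             # safeguard 1, tested at 0, tau, 2*tau, ...
        u2 = lam * ((t - offset) % tau)  # safeguard 2, test schedule shifted
        total += u1 * u2
    return total / n

lam, tau = 1e-2, 1.0
simultaneous = avg_dual_pfd(lam, tau, offset=0.0)      # analytic: (lam*tau)**2 / 3
staggered    = avg_dual_pfd(lam, tau, offset=tau / 2)  # analytic: 5*(lam*tau)**2 / 24
print(simultaneous, staggered, staggered < simultaneous)  # staggering helps
```

In this toy model staggering by half the test interval cuts the mean dual failure probability from (λτ)²/3 to 5(λτ)²/24, consistent with the abstract's qualitative conclusion that staggered testing reduces the hazard rate.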

  13. Competing risk models in reliability systems, an exponential distribution model with Bayesian analysis approach

    NASA Astrophysics Data System (ADS)

    Iskandar, I.

    2018-03-01

    The exponential distribution is the most widely used distribution in reliability analysis. It is well suited to representing the lengths of life in many cases and has a simple statistical form. Its defining characteristic is a constant hazard rate, and it is the special case of the Weibull distribution with shape parameter equal to one. In this paper, our aim is to introduce the basic notions that constitute an exponential competing risks model in reliability analysis using a Bayesian approach, and to present the corresponding analytic methods. The cases are limited to models with independent causes of failure. A non-informative prior distribution is used in our analysis. We describe the likelihood function, followed by the posterior distribution and the estimation of the point, interval, hazard function, and reliability quantities. The net probability of failure if only one specific risk is present, the crude probability of failure due to a specific risk in the presence of other causes, and partial crude probabilities are also included.
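The constant-hazard property that defines the exponential distribution can be verified directly from h(t) = f(t)/S(t); a minimal numeric check:

```python
import math

# For an exponential lifetime with rate lam:
#   density  f(t) = lam * exp(-lam * t)
#   survival S(t) = exp(-lam * t)
# so the hazard h(t) = f(t) / S(t) = lam, constant in t (this is the
# Weibull special case with shape parameter 1).

def exp_hazard(lam: float, t: float) -> float:
    pdf = lam * math.exp(-lam * t)   # f(t)
    surv = math.exp(-lam * t)        # S(t) = P(T > t)
    return pdf / surv

lam = 0.5
print([round(exp_hazard(lam, t), 10) for t in (0.1, 1.0, 10.0)])  # all equal lam
```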

  14. Hazardous drinking and military community functioning: identifying mediating risk factors.

    PubMed

    Foran, Heather M; Heyman, Richard E; Slep, Amy M Smith

    2011-08-01

    Hazardous drinking is a serious societal concern in military populations. Efforts to reduce hazardous drinking among military personnel have been limited in effectiveness. There is a need for a deeper understanding of how community-based prevention models apply to hazardous drinking in the military. Community-wide prevention efforts may be most effective in targeting community functioning (e.g., support from formal agencies, community cohesion) that impacts hazardous drinking via other proximal risk factors. The goal of the current study is to inform community-wide prevention efforts by testing a model of community functioning and mediating risk factors of hazardous drinking among active duty U.S. Air Force personnel. A large, representative survey sample of U.S. Air Force active duty members (N = 52,780) was collected at 82 bases worldwide. Hazardous drinking was assessed with the widely used Alcohol Use Disorders Identification Test (Saunders, Aasland, Babor, de la Fuente, & Grant, 1993). A variety of individual, family, and community measures were also assessed. Structural equation modeling was used to test a hypothesized model of community functioning, mediating risk factors and hazardous drinking. Depressive symptoms, perceived financial stress, and satisfaction with the U.S. Air Force were identified as significant mediators of the link between community functioning and hazardous drinking for men and women. Relationship satisfaction was also identified as a mediator for men. These results provide a framework for further community prevention research and suggest that prevention efforts geared at increasing aspects of community functioning (e.g., the U.S. Air Force Community Capacity model) may indirectly lead to reductions in hazardous drinking through other proximal risk factors.

  15. Association of Proteinuria and Incident Atrial Fibrillation in Patients With Intact and Reduced Kidney Function.

    PubMed

    Molnar, Amber O; Eddeen, Anan Bader; Ducharme, Robin; Garg, Amit X; Harel, Ziv; McCallum, Megan K; Perl, Jeffrey; Wald, Ron; Zimmerman, Deborah; Sood, Manish M

    2017-07-06

    Early evidence suggests proteinuria is independently associated with incident atrial fibrillation (AF). We sought to investigate whether the association of proteinuria with incident AF is altered by kidney function. Retrospective cohort study using administrative healthcare databases in Ontario, Canada (2002-2015). A total of 736 666 patients aged ≥40 years not receiving dialysis and with no previous history of AF were included. Proteinuria was defined using the urine albumin-to-creatinine ratio (ACR) and kidney function by the estimated glomerular filtration rate (eGFR). The primary outcome was time to AF. Cox proportional hazards models were used to determine the hazard ratio for AF censored for death, dialysis, kidney transplant, or end of follow-up. Fine-Gray models were used to determine the subdistribution hazard ratio for AF, with death as a competing event. Median follow-up was 6 years and 44 809 patients developed AF. In adjusted models, ACR and eGFR were associated with AF (P < 0.0001). The association of proteinuria with AF differed based on kidney function (ACR × eGFR interaction, P < 0.0001). Overt proteinuria (ACR, 120 mg/mmol) was associated with greater AF risk in patients with intact (eGFR, 120) versus reduced (eGFR, 30) kidney function (adjusted hazard ratios, 4.5 [95% CI, 4.0-5.1] and 2.6 [95% CI, 2.4-2.8], respectively; referent ACR 0 and eGFR 120). Results were similar in competing risk analyses. Proteinuria increases the risk of incident AF markedly in patients with intact kidney function compared with those with decreased kidney function. Screening and preventative strategies should consider proteinuria as an independent risk factor for AF. © 2017 The Authors. Published on behalf of the American Heart Association, Inc., by Wiley.

  16. The 2014 update to the National Seismic Hazard Model in California

    USGS Publications Warehouse

    Powers, Peter; Field, Edward H.

    2015-01-01

    The 2014 update to the U. S. Geological Survey National Seismic Hazard Model in California introduces a new earthquake rate model and new ground motion models (GMMs) that give rise to numerous changes to seismic hazard throughout the state. The updated earthquake rate model is the third version of the Uniform California Earthquake Rupture Forecast (UCERF3), wherein the rates of all ruptures are determined via a self-consistent inverse methodology. This approach accommodates multifault ruptures and reduces the overprediction of moderate earthquake rates exhibited by the previous model (UCERF2). UCERF3 introduces new faults, changes to slip or moment rates on existing faults, and adaptively smoothed gridded seismicity source models, all of which contribute to significant changes in hazard. New GMMs increase ground motion near large strike-slip faults and reduce hazard over dip-slip faults. The addition of very large strike-slip ruptures and decreased reverse fault rupture rates in UCERF3 further enhances these effects.

  17. 76 FR 3307 - Hazardous Materials: Harmonization With the United Nations Recommendations, International...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-19

    ... Compatibility Group S indicates that hazardous effects from accidental functioning are limited to the extent the... package is capable of containing any hazardous effects in the event of an accidental functioning of its... demonstrate that any hazardous effects are confined within a package. In the ANPRM, we invited commenters to...

  18. Estimating latent time of maturation and survival costs of reproduction in continuous time from capture-recapture data

    USGS Publications Warehouse

    Ergon, T.; Yoccoz, N.G.; Nichols, J.D.; Thomson, David L.; Cooch, Evan G.; Conroy, Michael J.

    2009-01-01

    In many species, age or time of maturation and survival costs of reproduction may vary substantially within and among populations. We present a capture-mark-recapture model to estimate the latent individual trait distribution of time of maturation (or other irreversible transitions) as well as survival differences associated with the two states (representing costs of reproduction). Maturation can take place at any point in continuous time, and mortality hazard rates for each reproductive state may vary according to continuous functions over time. Although we explicitly model individual heterogeneity in age/time of maturation, we make the simplifying assumption that death hazard rates do not vary among individuals within groups of animals. However, the estimates of the maturation distribution are fairly robust against individual heterogeneity in survival as long as there is no individual-level correlation between mortality hazards and latent time of maturation. We apply the model to biweekly capture-recapture data of overwintering field voles (Microtus agrestis) in cyclically fluctuating populations to estimate time of maturation and survival costs of reproduction. Results show that onset of seasonal reproduction is particularly late and survival costs of reproduction are particularly large in declining populations.

  19. Knowledge-based geographic information systems on the Macintosh computer: a component of the GypsES project

    Treesearch

    Gregory Elmes; Thomas Millette; Charles B. Yuill

    1991-01-01

    GypsES, a decision-support and expert system for the management of Gypsy Moth addresses five related research problems in a modular, computer-based project. The modules are hazard rating, monitoring, prediction, treatment decision and treatment implementation. One common component is a geographic information system designed to function intelligently. We refer to this...

  20. B-value and slip rate sensitivity analysis for PGA value in Lembang fault and Cimandiri fault area

    NASA Astrophysics Data System (ADS)

    Pratama, Cecep; Ito, Takeo; Meilano, Irwan; Nugraha, Andri Dian

    2017-07-01

    We examine the contributions of slip rate and b-value to Peak Ground Acceleration (PGA) in probabilistic seismic hazard maps (10% probability of exceedance in 50 years, or a 500-year return period). Hazard curves of PGA have been investigated for Sukabumi and Bandung using PSHA (Probabilistic Seismic Hazard Analysis). We observe that the largest influence on the hazard estimates comes from crustal faults. A Monte Carlo approach has been developed to assess the sensitivity. The uncertainty and coefficient of variation from slip rate and b-value in the Lembang and Cimandiri fault areas have been calculated. We observe that seismic hazard estimates are sensitive to fault slip rate and b-value, with uncertainty results of 0.25 g and 0.1-0.2 g, respectively. For specific sites, we found seismic hazard estimates of 0.49 ± 0.13 g with COV 27% for Sukabumi and 0.39 ± 0.05 g with COV 13% for Bandung.
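The return-period arithmetic behind such hazard maps follows from the Poisson occurrence model, P(at least one exceedance in T years) = 1 − exp(−T/T_R); a quick check shows that 10% in 50 years corresponds to roughly 475 years, commonly rounded to ~500:

```python
import math

# Poisson occurrence model used in PSHA hazard maps:
# P(at least one exceedance in T years) = 1 - exp(-T / T_R),
# so the return period implied by exceedance probability p over T years is
# T_R = -T / ln(1 - p).

def return_period(p_exceed: float, t_years: float) -> float:
    return -t_years / math.log(1.0 - p_exceed)

# 10% in 50 years -> about 475 years (often rounded to ~500).
print(round(return_period(0.10, 50.0), 1))
```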

  1. Survival analysis for the missing censoring indicator model using kernel density estimation techniques

    PubMed Central

    Subramanian, Sundarraman

    2008-01-01

    This article concerns asymptotic theory for a new estimator of a survival function in the missing censoring indicator model of random censorship. Specifically, the large sample results for an inverse probability-of-non-missingness weighted estimator of the cumulative hazard function, so far not available, are derived, including an almost sure representation with rate for a remainder term, and uniform strong consistency with rate of convergence. The estimator is based on a kernel estimate for the conditional probability of non-missingness of the censoring indicator. Expressions for its bias and variance, in turn leading to an expression for the mean squared error as a function of the bandwidth, are also obtained. The corresponding estimator of the survival function, whose weak convergence is derived, is asymptotically efficient. A numerical study, comparing the performances of the proposed and two other currently existing efficient estimators, is presented. PMID:18953423

  2. Survival analysis for the missing censoring indicator model using kernel density estimation techniques.

    PubMed

    Subramanian, Sundarraman

    2006-01-01

    This article concerns asymptotic theory for a new estimator of a survival function in the missing censoring indicator model of random censorship. Specifically, the large sample results for an inverse probability-of-non-missingness weighted estimator of the cumulative hazard function, so far not available, are derived, including an almost sure representation with rate for a remainder term, and uniform strong consistency with rate of convergence. The estimator is based on a kernel estimate for the conditional probability of non-missingness of the censoring indicator. Expressions for its bias and variance, in turn leading to an expression for the mean squared error as a function of the bandwidth, are also obtained. The corresponding estimator of the survival function, whose weak convergence is derived, is asymptotically efficient. A numerical study, comparing the performances of the proposed and two other currently existing efficient estimators, is presented.
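A minimal discrete sketch of the inverse-probability-weighting idea behind the estimator above (not the paper's kernel-based construction; the function name and inputs are illustrative): in a Nelson-Aalen-type sum, each death whose censoring indicator is observed is up-weighted by the reciprocal of its estimated probability of non-missingness.

```python
def ipw_cumulative_hazard(times, deltas, observed, pi):
    """Inverse probability-of-non-missingness weighted cumulative
    hazard (Nelson-Aalen-type sketch, no ties).
    times: follow-up times; deltas: censoring indicator (1 = death)
    where observed; observed: 1 if the indicator is non-missing;
    pi[i]: estimated P(indicator observed | T_i) -- kernel-estimated
    in the paper, supplied here as given numbers.
    Returns (time, H(time)) at each usable event time."""
    n = len(times)
    order = sorted(range(n), key=lambda i: times[i])
    at_risk, H, curve = n, 0.0, []
    for i in order:
        if observed[i] and deltas[i]:
            H += (1.0 / pi[i]) / at_risk  # weighted increment d/Y(t)
            curve.append((times[i], H))
        at_risk -= 1
    return curve

# with all indicators observed and pi = 1, this reduces to Nelson-Aalen
print(ipw_cumulative_hazard([1, 2, 3, 4], [1, 1, 0, 1],
                            [1, 1, 1, 1], [1.0, 1.0, 1.0, 1.0]))
```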

  3. Significant events in low-level flow conditions hazardous to aircraft

    NASA Technical Reports Server (NTRS)

    Alexander, M. B.; Camp, D. W.

    1983-01-01

Atmospheric parameters recorded during high surface winds are analyzed to determine the magnitude, frequency, duration, and simultaneity of occurrence of low-level flow conditions known to be hazardous to the ascent and descent of conventional aircraft and the space shuttle. Graphic and tabular presentations of mean and extreme values and simultaneous occurrences of turbulence (gustiness and a gust factor), wind shear (speed and direction), and vertical motion (updrafts and downdrafts), along with associated temperature inversions, are included as functions of tower height, layer, and/or distance for six 5-sec intervals (one interval every 100 sec) of parameters sampled simultaneously at a rate of 10 speeds, directions, and temperatures per second during an approximately 10-min period.

  4. Influence of Lung Function and Sleep-disordered Breathing on All-Cause Mortality. A Community-based Study.

    PubMed

    Putcha, Nirupama; Crainiceanu, Ciprian; Norato, Gina; Samet, Jonathan; Quan, Stuart F; Gottlieb, Daniel J; Redline, Susan; Punjabi, Naresh M

    2016-10-15

    Whether sleep-disordered breathing (SDB) severity and diminished lung function act synergistically to heighten the risk of adverse health outcomes remains a topic of significant debate. The current study sought to determine whether the association between lower lung function and mortality would be stronger in those with increasing severity of SDB in a community-based cohort of middle-aged and older adults. Full montage home sleep testing and spirometry data were analyzed on 6,173 participants of the Sleep Heart Health Study. Proportional hazards models were used to calculate risk for all-cause mortality, with FEV 1 and apnea-hypopnea index (AHI) as the primary exposure indicators along with several potential confounders. All-cause mortality rate was 26.9 per 1,000 person-years in those with SDB (AHI ≥5 events/h) and 18.2 per 1,000 person-years in those without (AHI <5 events/h). For every 200-ml decrease in FEV 1 , all-cause mortality increased by 11.0% in those without SDB (hazard ratio, 1.11; 95% confidence interval, 1.08-1.13). In contrast, for every 200-ml decrease in FEV 1 , all-cause mortality increased by only 6.0% in participants with SDB (hazard ratio, 1.06; 95% confidence interval, 1.04-1.09). Additionally, the incremental influence of lung function on all-cause mortality was less with increasing severity of SDB (P value for interaction between AHI and FEV 1 , 0.004). Lung function was associated with risk for all-cause mortality. The incremental contribution of lung function to mortality diminishes with increasing severity of SDB.

  5. Influence of Lung Function and Sleep-disordered Breathing on All-Cause Mortality. A Community-based Study

    PubMed Central

    Putcha, Nirupama; Crainiceanu, Ciprian; Norato, Gina; Samet, Jonathan; Quan, Stuart F.; Gottlieb, Daniel J.; Redline, Susan

    2016-01-01

    Rationale: Whether sleep-disordered breathing (SDB) severity and diminished lung function act synergistically to heighten the risk of adverse health outcomes remains a topic of significant debate. Objectives: The current study sought to determine whether the association between lower lung function and mortality would be stronger in those with increasing severity of SDB in a community-based cohort of middle-aged and older adults. Methods: Full montage home sleep testing and spirometry data were analyzed on 6,173 participants of the Sleep Heart Health Study. Proportional hazards models were used to calculate risk for all-cause mortality, with FEV1 and apnea–hypopnea index (AHI) as the primary exposure indicators along with several potential confounders. Measurements and Main Results: All-cause mortality rate was 26.9 per 1,000 person-years in those with SDB (AHI ≥5 events/h) and 18.2 per 1,000 person-years in those without (AHI <5 events/h). For every 200-ml decrease in FEV1, all-cause mortality increased by 11.0% in those without SDB (hazard ratio, 1.11; 95% confidence interval, 1.08–1.13). In contrast, for every 200-ml decrease in FEV1, all-cause mortality increased by only 6.0% in participants with SDB (hazard ratio, 1.06; 95% confidence interval, 1.04–1.09). Additionally, the incremental influence of lung function on all-cause mortality was less with increasing severity of SDB (P value for interaction between AHI and FEV1, 0.004). Conclusions: Lung function was associated with risk for all-cause mortality. The incremental contribution of lung function to mortality diminishes with increasing severity of SDB. PMID:27105053
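Under the proportional hazards model used in the two records above, a hazard ratio quoted per 200-ml FEV1 decrement scales multiplicatively to larger decrements. A minimal sketch (the function name is illustrative):

```python
def scaled_hazard_ratio(hr_per_unit: float, k: float) -> float:
    """Under a Cox model, a hazard ratio per one unit of exposure
    scales multiplicatively: the HR for k units is hr_per_unit ** k."""
    return hr_per_unit ** k

# HR 1.11 per 200-ml FEV1 decrease (no-SDB group in the abstract)
# implies, per full litre (5 x 200 ml):
print(round(scaled_hazard_ratio(1.11, 5), 2))  # -> 1.69
# versus the SDB group's 1.06 per 200 ml:
print(round(scaled_hazard_ratio(1.06, 5), 2))  # -> 1.34
```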

  6. Geodesy- and geology-based slip-rate models for the Western United States (excluding California) national seismic hazard maps

    USGS Publications Warehouse

    Petersen, Mark D.; Zeng, Yuehua; Haller, Kathleen M.; McCaffrey, Robert; Hammond, William C.; Bird, Peter; Moschetti, Morgan; Shen, Zhengkang; Bormann, Jayne; Thatcher, Wayne

    2014-01-01

The 2014 National Seismic Hazard Maps for the conterminous United States incorporate more uncertainty in the fault slip-rate parameters that control earthquake-activity rates than was applied in previous versions of the hazard maps. This additional uncertainty is accounted for by new geodesy- and geology-based slip-rate models for the Western United States. Models that were considered include an updated geologic model based on expert opinion and four combined inversion models informed by both geologic and geodetic input. The two block models considered indicate significantly higher slip rates than the expert-opinion model and the two fault-based combined inversion models. For the hazard maps, we apply 20 percent weight, with equal weighting, for the two fault-based models. Off-fault geodetic-based models were not considered in this version of the maps. Resulting changes to the hazard maps are generally less than 0.05 g (acceleration of gravity). Future research will improve the maps and interpret differences between the new models.
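The combination of alternative slip-rate models described above can be sketched as a logic-tree-style weighted average of their hazard curves; the function name, weights, and exceedance rates below are illustrative, not the NSHM implementation:

```python
def weighted_hazard(curves, weights):
    """Weighted mean of annual exceedance-rate curves from
    alternative input models (logic-tree branch combination)."""
    assert abs(sum(weights) - 1.0) < 1e-9, "branch weights must sum to 1"
    return [sum(w * c[i] for w, c in zip(weights, curves))
            for i in range(len(curves[0]))]

# hypothetical exceedance rates from a geologic model and two
# fault-based inversion models, at two ground-motion levels:
combined = weighted_hazard(
    [[0.010, 0.0010], [0.012, 0.0014], [0.008, 0.0006]],
    [0.6, 0.2, 0.2])
print(combined)
```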

  7. The NHLBI LAM Registry: Prognostic physiological and radiological biomarkers emerge from a 15-year prospective longitudinal analysis.

    PubMed

    Gupta, Nishant; Lee, Hye-Seung; Ryu, Jay H; Taveira-DaSilva, Angelo M; Beck, Gerald J; Lee, Jar-Chi; McCarthy, Kevin; Finlay, Geraldine A; Brown, Kevin K; Ruoss, Stephen J; Avila, Nilo A; Moss, Joel; McCormack, Francis X

    2018-06-22

    The natural history of lymphangioleiomyomatosis is mainly derived from retrospective cohort analyses and remains incompletely understood. A National Institutes of Health LAM Registry was established to define the natural history and identify prognostic biomarkers that can help guide management and decision-making in patients with LAM. A linear mixed effects model was employed to compute the rate of decline of FEV1, and identify variables impacting FEV1 decline among 217 registry patients who enrolled from 1998-2001. Prognostic variables associated with progression to death/lung transplantation were identified using a Cox proportional hazard model. Mean annual decline of FEV1 was 89±53 ml/year, and remained remarkably constant regardless of baseline lung function. FEV1 decline was more rapid in those with greater cyst profusion on CT scan (p=0.02), and in premenopausal subjects (118ml/year) compared to postmenopausal subjects (74ml/year), (p=0.003). There were 26 deaths and 43 lung transplants during the evaluation period. Estimated 5-, 10-, 15-, and 20-year transplant-free survival rates were 95%, 85%, 75%, and 64%, respectively. Postmenopausal status (hazard ratio 0.30, p=0.0002) and higher baseline FEV1 (hazard ratio 0.97, p=0.008) or DLCO (hazard ratio 0.97, p=0.001) were independently associated with a lower risk of progression to death or lung transplantation. The median transplant-free survival in patients with LAM is greater than 20 years. Menopausal status as well as structural and physiological markers of disease severity significantly affect the rate of decline of FEV1 and progression to death or lung transplantation in LAM. Copyright © 2018. Published by Elsevier Inc.

  8. Analysis of occupational health hazards and associated risks in fuzzy environment: a case research in an Indian underground coal mine.

    PubMed

    Samantra, Chitrasen; Datta, Saurav; Mahapatra, Siba Sankar

    2017-09-01

    This paper presents a unique hierarchical structure on various occupational health hazards including physical, chemical, biological, ergonomic and psychosocial hazards, and associated adverse consequences in relation to an underground coal mine. The study proposes a systematic health hazard risk assessment methodology for estimating extent of hazard risk using three important measuring parameters: consequence of exposure, period of exposure and probability of exposure. An improved decision making method using fuzzy set theory has been attempted herein for converting linguistic data into numeric risk ratings. The concept of 'centre of area' method for generalized triangular fuzzy numbers has been explored to quantify the 'degree of hazard risk' in terms of crisp ratings. Finally, a logical framework for categorizing health hazards into different risk levels has been constructed on the basis of distinguished ranges of evaluated risk ratings (crisp). Subsequently, an action requirement plan has been suggested, which could provide guideline to the managers for successfully managing health hazard risks in the context of underground coal mining exercise.
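For an ordinary triangular fuzzy number (a, b, c), the centre-of-area defuzzification mentioned above reduces to the centroid (a + b + c)/3; the paper's generalized variant may additionally involve the number's height. A sketch, with an illustrative mapping of a linguistic rating to a fuzzy number:

```python
def centroid_tfn(a: float, b: float, c: float) -> float:
    """Centre-of-area (centroid) defuzzification of a triangular
    fuzzy number (a, b, c): x* = (a + b + c) / 3."""
    return (a + b + c) / 3.0

# hypothetical linguistic risk rating "high" mapped to (5, 7, 9):
print(centroid_tfn(5, 7, 9))  # -> 7.0
```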

  9. [Noise hazard and hearing loss in workers in automotive component manufacturing industry in Guangzhou, China].

    PubMed

    Wang, Zhi; Liang, Jiabin; Rong, Xing; Zhou, Hao; Duan, Chuanwei; Du, Weijia; Liu, Yimin

    2015-12-01

To investigate noise hazard and its influence on hearing loss among workers in the automotive component manufacturing industry, noise levels in the workplaces of automotive component manufacturing enterprises were measured and hearing examinations were performed for workers, to analyze the features and exposure levels of noise in each process as well as the influence on workers' hearing loss. Among the manufacturing processes for different products in this industry, the manufacturing of automobile hubs and of suspension and steering systems had the highest degrees of noise hazard, with over-standard rates of 79.8% and 57.1%, respectively. Among the different technical processes for automotive component manufacturing, punching and casting had the highest degrees of noise hazard, with over-standard rates of 65.0% and 50%, respectively. Workers engaged in automotive air conditioning system manufacturing had the highest rate of abnormal hearing (up to 3.1%). In the automotive component manufacturing industry, noise levels seriously exceed the standard. Although the rate of abnormal hearing is lower than the average for the automobile manufacturing industry in China, it tends to increase gradually. Sufficient emphasis should be placed on the noise hazard in this industry.

  10. Tracking Temporal Hazard in the Human Electroencephalogram Using a Forward Encoding Model

    PubMed Central

    2018-01-01

    Abstract Human observers automatically extract temporal contingencies from the environment and predict the onset of future events. Temporal predictions are modeled by the hazard function, which describes the instantaneous probability for an event to occur given it has not occurred yet. Here, we tackle the question of whether and how the human brain tracks continuous temporal hazard on a moment-to-moment basis, and how flexibly it adjusts to strictly implicit variations in the hazard function. We applied an encoding-model approach to human electroencephalographic data recorded during a pitch-discrimination task, in which we implicitly manipulated temporal predictability of the target tones by varying the interval between cue and target tone (i.e. the foreperiod). Critically, temporal predictability either was driven solely by the passage of time (resulting in a monotonic hazard function) or was modulated to increase at intermediate foreperiods (resulting in a modulated hazard function with a peak at the intermediate foreperiod). Forward-encoding models trained to predict the recorded EEG signal from different temporal hazard functions were able to distinguish between experimental conditions, showing that implicit variations of temporal hazard bear tractable signatures in the human electroencephalogram. Notably, this tracking signal was reconstructed best from the supplementary motor area, underlining this area’s link to cognitive processing of time. Our results underline the relevance of temporal hazard to cognitive processing and show that the predictive accuracy of the encoding-model approach can be utilized to track abstract time-resolved stimuli. PMID:29740594
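The hazard function described above (the instantaneous probability of an event given that it has not occurred yet) can be sketched in discrete time; the three-foreperiod example is illustrative and shows why hazard rises with elapsed time even under a flat foreperiod distribution:

```python
def discrete_hazard(probs):
    """Discrete-time hazard: h_t = p_t / (1 - sum_{s<t} p_s),
    i.e., the probability of the event now, given non-occurrence."""
    hazards, survived = [], 1.0
    for p in probs:
        hazards.append(p / survived)
        survived -= p
    return hazards

# three equally likely foreperiods: hazard rises even though the
# distribution is flat -- the classic monotonic hazard of elapsed time
print(discrete_hazard([1/3, 1/3, 1/3]))  # ~[0.333, 0.5, 1.0]
```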

  11. Sexual Victimization and Hazardous Drinking Among Heterosexual and Sexual Minority Women

    PubMed Central

    Szalacha, Laura A.; Johnson, Timothy P.; Kinnison, Kelly E.; Wilsnack, Sharon C.; Cho, Young

    2010-01-01

    Aims Although research shows that sexual minority women report high rates of lifetime sexual victimization and high rates of hazardous drinking, investigators have yet to explore the relationships between sexual victimization and hazardous drinking in this population. In addition, because rates of these problems may vary within the sexual minority population, we examined and compared relationships between sexual victimization and hazardous drinking in exclusively heterosexual and sexual minority (mostly heterosexual, bisexual, mostly lesbian and exclusively lesbian) women. Method Data from 548 participants in the National Study of Health and Life Experiences of Women and 405 participants in the Chicago Health and Life Experiences of Women study were pooled to address these relationships. We compared hazardous drinking, childhood sexual abuse (CSA), adult sexual assault (ASA), and revictimization (both CSA and ASA) across the five sexual identity subgroups. We then fit a multilevel general linear model to examine group differences in the relationships between hazardous drinking and sexual victimization and to test for potential interactions between victimization and identity on hazardous drinking. Results Sexual minority women reported higher levels of hazardous drinking and higher rates of CSA and sexual revictimization than did exclusively heterosexual women. Revictimization was the strongest predictor of hazardous drinking among women who identified as mostly heterosexual and mostly lesbian. Conclusions This study extends previous research by examining associations between sexual victimization and hazardous drinking in heterosexual and sexual minority women and by exploring within-group variations in these associations among sexual minority women. Higher rates of lifetime sexual victimization and revictimization may help to explain sexual minority women’s heightened risk for hazardous drinking. 
The findings highlight the need for additional research that examines the meanings of sexual identity labels to more fully understand differences in risk within groups of sexual minority women as well as how sexual identity may affect responses to and interpretations of sexual victimization. PMID:20692771

  12. Incorporating induced seismicity in the 2014 United States National Seismic Hazard Model: results of the 2014 workshop and sensitivity studies

    USGS Publications Warehouse

    Petersen, Mark D.; Mueller, Charles S.; Moschetti, Morgan P.; Hoover, Susan M.; Rubinstein, Justin L.; Llenos, Andrea L.; Michael, Andrew J.; Ellsworth, William L.; McGarr, Arthur F.; Holland, Austin A.; Anderson, John G.

    2015-01-01

    The U.S. Geological Survey National Seismic Hazard Model for the conterminous United States was updated in 2014 to account for new methods, input models, and data necessary for assessing the seismic ground shaking hazard from natural (tectonic) earthquakes. The U.S. Geological Survey National Seismic Hazard Model project uses probabilistic seismic hazard analysis to quantify the rate of exceedance for earthquake ground shaking (ground motion). For the 2014 National Seismic Hazard Model assessment, the seismic hazard from potentially induced earthquakes was intentionally not considered because we had not determined how to properly treat these earthquakes for the seismic hazard analysis. The phrases “potentially induced” and “induced” are used interchangeably in this report, however it is acknowledged that this classification is based on circumstantial evidence and scientific judgment. For the 2014 National Seismic Hazard Model update, the potentially induced earthquakes were removed from the NSHM’s earthquake catalog, and the documentation states that we would consider alternative models for including induced seismicity in a future version of the National Seismic Hazard Model. As part of the process of incorporating induced seismicity into the seismic hazard model, we evaluate the sensitivity of the seismic hazard from induced seismicity to five parts of the hazard model: (1) the earthquake catalog, (2) earthquake rates, (3) earthquake locations, (4) earthquake Mmax (maximum magnitude), and (5) earthquake ground motions. We describe alternative input models for each of the five parts that represent differences in scientific opinions on induced seismicity characteristics. In this report, however, we do not weight these input models to come up with a preferred final model. Instead, we present a sensitivity study showing uniform seismic hazard maps obtained by applying the alternative input models for induced seismicity. 
The final model will be released after further consideration of the reliability and scientific acceptability of each alternative input model. Forecasting the seismic hazard from induced earthquakes is fundamentally different from forecasting the seismic hazard for natural, tectonic earthquakes. This is because the spatio-temporal patterns of induced earthquakes are reliant on economic forces and public policy decisions regarding extraction and injection of fluids. As such, the rates of induced earthquakes are inherently variable and nonstationary. Therefore, we only make maps based on an annual rate of exceedance rather than the 50-year rates calculated for previous U.S. Geological Survey hazard maps.

  13. Visual motion perception predicts driving hazard perception ability.

    PubMed

    Lacherez, Philippe; Au, Sandra; Wood, Joanne M

    2014-02-01

    To examine the basis of previous findings of an association between indices of driving safety and visual motion sensitivity and to examine whether this association could be explained by low-level changes in visual function. A total of 36 visually normal participants (aged 19-80 years) completed a battery of standard vision tests including visual acuity, contrast sensitivity and automated visual fields and two tests of motion perception including sensitivity for movement of a drifting Gabor stimulus and sensitivity for displacement in a random dot kinematogram (Dmin ). Participants also completed a hazard perception test (HPT), which measured participants' response times to hazards embedded in video recordings of real-world driving, which has been shown to be linked to crash risk. Dmin for the random dot stimulus ranged from -0.88 to -0.12 log minutes of arc, and the minimum drift rate for the Gabor stimulus ranged from 0.01 to 0.35 cycles per second. Both measures of motion sensitivity significantly predicted response times on the HPT. In addition, while the relationship involving the HPT and motion sensitivity for the random dot kinematogram was partially explained by the other visual function measures, the relationship with sensitivity for detection of the drifting Gabor stimulus remained significant even after controlling for these variables. These findings suggest that motion perception plays an important role in the visual perception of driving-relevant hazards independent of other areas of visual function and should be further explored as a predictive test of driving safety. Future research should explore the causes of reduced motion perception to develop better interventions to improve road safety. © 2012 The Authors. Acta Ophthalmologica © 2012 Acta Ophthalmologica Scandinavica Foundation.

  14. Hazard rating forest stands for gypsy moth

    Treesearch

    Ray R., Jr. Hicks

    1991-01-01

    A gypsy moth hazard exists when forest conditions prevail that are conducive to extensive damage from gypsy moth. Combining forest hazard rating with information on insect population trends provides the basis for predicting the probability (risk) of an event occurring. The likelihood of defoliation is termed susceptibility and the probability of damage (mortality,...

  15. Forest vegetation simulation tools and forest health assessment

    Treesearch

    Richard M. Teck; Melody Steele

    1995-01-01

A Stand Hazard Rating System for Central Idaho forests has been incorporated into the Central Idaho Prognosis variant of the Forest Vegetation Simulator to evaluate how insect, disease, and fire hazards within the Deadwood River Drainage change over time. A custom interface, BOISE.COMPUTE.PR, has been developed so hazard ratings can be electronically downloaded...

  16. Hazard Function Estimation with Cause-of-Death Data Missing at Random.

    PubMed

    Wang, Qihua; Dinse, Gregg E; Liu, Chunling

    2012-04-01

    Hazard function estimation is an important part of survival analysis. Interest often centers on estimating the hazard function associated with a particular cause of death. We propose three nonparametric kernel estimators for the hazard function, all of which are appropriate when death times are subject to random censorship and censoring indicators can be missing at random. Specifically, we present a regression surrogate estimator, an imputation estimator, and an inverse probability weighted estimator. All three estimators are uniformly strongly consistent and asymptotically normal. We derive asymptotic representations of the mean squared error and the mean integrated squared error for these estimators and we discuss a data-driven bandwidth selection method. A simulation study, conducted to assess finite sample behavior, demonstrates that the proposed hazard estimators perform relatively well. We illustrate our methods with an analysis of some vascular disease data.

  17. Role of familial factors in late-onset Alzheimer disease as a function of age.

    PubMed

    Wu, Z; Kinslow, C; Pettigrew, K D; Rapoport, S I; Schapiro, M B

    1998-09-01

Whereas early-onset Alzheimer disease (AD; usually onset at age <50 years) has been linked to genetic mutations on chromosomes 1, 14, and 21, the degree of familial contribution to late-onset AD is unclear. Further, it is uncertain whether subgroups of late-onset AD exist. To examine the influence of familial factors as a function of age in late-onset AD, we investigated lifetime risks and age-specific hazard rates of AD-like illness among late-onset AD probands' and controls' first-degree relatives, using questionnaires and medical records. As part of a longitudinal study on aging and AD, we studied 78 AD probands with age of onset ≥50 years (28 "definite" and 50 "probable" AD according to NINCDS/ADRDA criteria) and 101 healthy old controls seen since 1981. Both probands and controls were screened rigorously with medical tests and brain imaging and seen regularly until autopsy. Multiple informants and medical records were used for first-degree relatives. Among first-degree relatives, 49 secondary cases of AD-like illness were found for the AD probands' relatives (391 relatives 40 years old or older) compared with 20 cases among controls' relatives (456 relatives 40 years old or older). Relatives of AD probands had a significantly increased lifetime risk of AD-like illness of 52.8±11.4% by age 94 years, compared with a lifetime risk in relatives of controls of 22.1±5.8% by age 90 years. Age-specific hazard rates in relatives of AD probands increased until the 75-79-year age interval and then decreased; in contrast, the age-specific hazard rates increased in relatives of controls after the 80-84-year age interval. To determine whether a dividing line exists within late-onset AD, several cutoff ages were used in our study to compare cumulative risk curves of AD-like illness between relatives of late-onset probands and relatives of late-late-onset probands.
Differences in the pattern of cumulative incidence of AD in relatives showed that 67-71 years is the range for a dividing line between late- and late-late-onset AD. Age-specific hazard rates of AD in relatives supported a difference between late- and late-late-onset. Whereas these rates increased until the 75-79-year age interval and then decreased in late-onset AD, the rates began increasing after the 65-69-year age interval and through the oldest age interval in both late-late-onset AD and control groups. Our results support the concept that familial factors exist in late-onset AD and that different familial factors may exist in late-onset AD subgroups.
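Lifetime (cumulative) risk relates to the age-interval hazard rates discussed above through the survival product. A sketch with illustrative hazards (not the study's values), assuming no competing risks:

```python
def cumulative_risk(interval_hazards):
    """Cumulative risk from per-age-interval hazards, assuming no
    competing risks: 1 - prod(1 - h_i)."""
    surv = 1.0
    for h in interval_hazards:
        surv *= (1.0 - h)
    return 1.0 - surv

# hypothetical per-interval hazards of AD-like illness across
# successive age intervals:
print(round(cumulative_risk([0.02, 0.05, 0.10, 0.20]), 3))
```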

  18. LAV@HAZARD: a Web-GIS Framework for Real-Time Forecasting of Lava Flow Hazards

    NASA Astrophysics Data System (ADS)

    Del Negro, C.; Bilotta, G.; Cappello, A.; Ganci, G.; Herault, A.

    2014-12-01

    Crucial to lava flow hazard assessment is the development of tools for real-time prediction of flow paths, flow advance rates, and final flow lengths. Accurate prediction of flow paths and advance rates requires not only rapid assessment of eruption conditions (especially effusion rate) but also improved models of lava flow emplacement. Here we present the LAV@HAZARD web-GIS framework, which combines spaceborne remote sensing techniques and numerical simulations for real-time forecasting of lava flow hazards. By using satellite-derived discharge rates to drive a lava flow emplacement model, LAV@HAZARD allows timely definition of parameters and maps essential for hazard assessment, including the propagation time of lava flows and the maximum run-out distance. We take advantage of the flexibility of the HOTSAT thermal monitoring system to process satellite images coming from sensors with different spatial, temporal and spectral resolutions. HOTSAT was designed to ingest infrared satellite data acquired by the MODIS and SEVIRI sensors to output hot spot location, lava thermal flux and discharge rate. We use LAV@HAZARD to merge this output with the MAGFLOW physics-based model to simulate lava flow paths and to update, in a timely manner, flow simulations. Thus, any significant changes in lava discharge rate are included in the predictions. A significant benefit in terms of computational speed was obtained thanks to the parallel implementation of MAGFLOW on graphic processing units (GPUs). All this useful information has been gathered into the LAV@HAZARD platform which, due to the high degree of interactivity, allows generation of easily readable maps and a fast way to explore alternative scenarios. We will describe and demonstrate the operation of this framework using a variety of case studies pertaining to Mt Etna, Sicily. Although this study was conducted on Mt Etna, the approach used is designed to be applicable to other volcanic areas around the world.

  19. Rate of change in renal function and mortality in elderly treated hypertensive patients.

    PubMed

    Chowdhury, Enayet K; Langham, Robyn G; Ademi, Zanfina; Owen, Alice; Krum, Henry; Wing, Lindon M H; Nelson, Mark R; Reid, Christopher M

    2015-07-07

    Evidence relating the rate of change in renal function, measured as eGFR, after antihypertensive treatment in elderly patients to clinical outcome is sparse. This study characterized the rate of change in eGFR after commencement of antihypertensive treatment in an elderly population, the factors associated with eGFR rate change, and the rate's association with all-cause and cardiovascular mortality. Data from the Second Australian National Blood Pressure study were used, where 6083 hypertensive participants aged ≥65 years were enrolled during 1995-1997 and followed for a median of 4.1 years (in-trial). Following the Second Australian National Blood Pressure study, participants were followed-up for a further median 6.9 years (post-trial). The annual rate of change in the eGFR was calculated in 4940 participants using creatinine measurements during the in-trial period and classified into quintiles (Q) on the basis of the following eGFR changes: rapid decline (Q1), decline (Q2), stable (Q3), increase (Q4), and rapid increase (Q5). A rapid decline in eGFR in comparison with those with stable eGFRs during the in-trial period was associated with older age, living in a rural area, wider pulse pressure at baseline, receiving diuretic-based therapy, taking multiple antihypertensive drugs, and having blood pressure <140/90 mmHg during the study. However, a rapid increase in eGFR was observed in younger women and those with a higher cholesterol level. After adjustment for baseline and in-trial covariates, Cox-proportional hazard models showed a significantly greater risk for both all-cause (hazard ratio, 1.28; 95% confidence interval, 1.09 to 1.52; P=0.003) and cardiovascular (hazard ratio, 1.40; 95% confidence interval, 1.11 to 1.76; P=0.004) mortality in the rapid decline group compared with the stable group over a median of 7.2 years after the last eGFR measure. No significant association with mortality was observed for a rapid increase in eGFR. 
In elderly persons with treated hypertension, a rapid decline in eGFR is associated with a higher risk of mortality. Copyright © 2015 by the American Society of Nephrology.

  20. Exploring the effects of driving experience on hazard awareness and risk perception via real-time hazard identification, hazard classification, and rating tasks.

    PubMed

    Borowsky, Avinoam; Oron-Gilad, Tal

    2013-10-01

This study investigated the effects of driving experience on hazard awareness and risk perception skills. These topics have previously been investigated separately; here, a novel approach is suggested in which hazard awareness and risk perception are examined concurrently. Young, newly qualified drivers, experienced drivers, and a group of commercial drivers, namely taxi drivers, performed three consecutive tasks: (1) observed 10 short movies of real-world driving situations and were asked to press a button each time they identified a hazardous situation; (2) observed one of three possible sub-sets of 8 movies (out of the 10 they had seen earlier) for the second time, and were asked to categorize them into an arbitrary number of clusters according to the similarity in their hazardous situation; and (3) observed the same sub-set for a third time and following each movie were asked to rate its level of hazardousness. The first task is considered a real-time identification task while the other two are performed using hindsight. During the real-time task, participants' eye movements were recorded. Results showed that taxi drivers were more sensitive to hidden hazards than the other driver groups and that young-novices were the least sensitive. Young-novice drivers also relied heavily on materialized hazards in their categorization structure. In addition, it emerged that risk perception was derived from two major components: the likelihood of a crash and the severity of its outcome. Yet, the outcome was rarely considered under time pressure (i.e., in real-time hazard identification tasks). Using hindsight, when drivers were provided with the opportunity to rate the movies' hazardousness more freely (rating task), they considered both components. Otherwise, in the categorization task, they usually chose the severity of the crash outcome as their dominant criterion. Theoretical and practical implications are discussed. Copyright © 2013 Elsevier Ltd. All rights reserved.

  1. Step 1: Human System Integration Pilot-Technology Interface Requirements for Weather Management

    NASA Technical Reports Server (NTRS)

    2005-01-01

This document defines technology interface requirements for Hazardous Weather Avoidance. Technology concepts in use by the Access 5 Weather Management Work Package were considered. Beginning with the Human System Integration (HSI) high-level functional requirement for Hazardous Weather Avoidance, and Hazardous Weather Avoidance technology elements, HSI requirements for the interface to the pilot were identified. Results of the analysis describe (1) the information required by the pilot to have knowledge of hazardous weather, and (2) the control capability needed by the pilot to obtain hazardous weather information. Fundamentally, these requirements provide the candidate Hazardous Weather Avoidance technology concepts with the necessary human-related elements to make them compatible with human capabilities and limitations. The results of the analysis describe how Hazardous Weather Avoidance operations and functions should interface with the pilot to provide the necessary Weather Management functionality to the UA-pilot system. Requirements and guidelines for Hazardous Weather Avoidance are partitioned into four categories: (1) Planning En Route, (2) Encountering Hazardous Weather En Route, (3) Planning to Destination, and (4) Diversion Planning to an Alternate Airport. Each requirement is stated and is supported with a rationale and associated reference(s).

  2. Preclinical Alzheimer's disease and longitudinal driving decline.

    PubMed

    Roe, Catherine M; Babulal, Ganesh M; Head, Denise M; Stout, Sarah H; Vernon, Elizabeth K; Ghoshal, Nupur; Garland, Brad; Barco, Peggy P; Williams, Monique M; Johnson, Ann; Fierberg, Rebecca; Fague, M Scot; Xiong, Chengjie; Mormino, Elizabeth; Grant, Elizabeth A; Holtzman, David M; Benzinger, Tammie L S; Fagan, Anne M; Ott, Brian R; Carr, David B; Morris, John C

    2017-01-01

Links between preclinical AD and driving difficulty onset would support the use of driving performance as an outcome in primary and secondary prevention trials among older adults (OAs). We examined whether AD biomarkers predicted the onset of driving difficulties among OAs. 104 OAs (65+ years) with normal cognition took part in biomarker measurements, a road test, clinical and psychometric batteries, and self-reported their driving habits. Higher values of CSF tau/Aβ42 and ptau181/Aβ42 ratios, but not uptake on PIB amyloid imaging (p=.12), predicted time to a rating of Marginal or Fail on the driving test using Cox proportional hazards models. Hazard ratios (95% confidence interval) were 5.75 (1.70-19.53), p=.005, for CSF tau/Aβ42 and 6.19 (1.75-21.88), p=.005, for CSF ptau181/Aβ42. Preclinical AD predicted time to receiving a Marginal or Fail rating on an on-road driving test. Driving performance shows promise as a functional outcome in AD prevention trials.

  3. Just-in-time training of dental responders in a simulated pandemic immunization response exercise.

    PubMed

    Colvard, Michael D; Hirst, Jeremy L; Vesper, Benjamin J; DeTella, George E; Tsagalis, Mila P; Roberg, Mary J; Peters, David E; Wallace, Jimmy D; James, James J

    2014-06-01

The reauthorization of the Pandemic and All-Hazards Preparedness Act in 2013 incorporated the dental profession and dental professionals into the federal legislation governing public health response to pandemics and all-hazard situations. Work is now necessary to expand the processes needed to incorporate and train oral health care professionals for pandemic and all-hazard response events. A just-in-time (JIT) training exercise and immunization drill using an ex vivo porcine model system was conducted to demonstrate the rapidity with which dental professionals can respond to a pandemic influenza scenario. Medical history documentation, vaccination procedures, and patient throughput and error rates of 15 dental responders were evaluated by trained nursing staff and emergency response personnel. The average throughput (22.33/hr) and medical error rates (7 of 335; 2.08%) of the dental responders were similar to those found in analogous influenza mass vaccination clinics previously conducted using certified public health nurses. The dental responder immunization drill validated the capacity and capability of dental professionals to function as a valuable immunization resource. The ex vivo porcine model system used for JIT training can serve as a simple and inexpensive training tool to update pandemic responders' immunization techniques and procedures supporting inoculation protocols.

  4. Hazard Function Estimation with Cause-of-Death Data Missing at Random

    PubMed Central

    Wang, Qihua; Dinse, Gregg E.; Liu, Chunling

    2010-01-01

    Hazard function estimation is an important part of survival analysis. Interest often centers on estimating the hazard function associated with a particular cause of death. We propose three nonparametric kernel estimators for the hazard function, all of which are appropriate when death times are subject to random censorship and censoring indicators can be missing at random. Specifically, we present a regression surrogate estimator, an imputation estimator, and an inverse probability weighted estimator. All three estimators are uniformly strongly consistent and asymptotically normal. We derive asymptotic representations of the mean squared error and the mean integrated squared error for these estimators and we discuss a data-driven bandwidth selection method. A simulation study, conducted to assess finite sample behavior, demonstrates that the proposed hazard estimators perform relatively well. We illustrate our methods with an analysis of some vascular disease data. PMID:22267874
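    The complete-case version of such a kernel hazard estimator (without the missing-indicator corrections the paper develops) can be sketched in a few lines: smooth the Nelson-Aalen increments d_i/Y(t_i) with a kernel. The function names, the Epanechnikov kernel choice, and the toy data below are illustrative assumptions, not the authors' code.

    ```python
    # Kernel-smoothed hazard estimate from right-censored data: a minimal
    # complete-case sketch. The paper's estimators additionally handle
    # censoring indicators that are missing at random; this sketch assumes
    # every indicator is observed.
    def kernel_hazard(times, events, grid, bandwidth):
        """Smooth the Nelson-Aalen increments d_i / Y(t_i) with an
        Epanechnikov kernel to estimate the hazard rate on `grid`."""
        def epanechnikov(u):
            return 0.75 * (1.0 - u * u) if abs(u) < 1.0 else 0.0

        data = sorted(zip(times, events))
        n = len(data)
        estimates = []
        for t in grid:
            h = 0.0
            for i, (ti, di) in enumerate(data):
                at_risk = n - i          # subjects with T >= ti (sorted, distinct times)
                if di:                   # only observed deaths contribute
                    u = (t - ti) / bandwidth
                    h += epanechnikov(u) / (bandwidth * at_risk)
            estimates.append(h)
        return estimates

    # Toy data: times with event indicator 1 = death observed, 0 = censored.
    times  = [2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0, 9.0]
    events = [1,   1,   0,   1,   1,   0,   1,   1]
    hazard = kernel_hazard(times, events, grid=[4.0, 6.0], bandwidth=2.0)
    ```

    In the paper's setting, the inner increment would instead be weighted by an estimated probability that the death was due to the cause of interest (regression surrogate, imputation, or inverse probability weighting).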

  5. Rating the Risks.

    ERIC Educational Resources Information Center

    Slovic, Paul; And Others

    1979-01-01

    Explains how people arrive at personal hazard assessments. Explores why people overestimate some hazards and underestimate others. Examines risk ratings for activities and technologies such as nuclear power, motor vehicles, pesticides, and vaccinations. (MA)

  6. Relation of worsened renal function during hospitalization for heart failure to long-term outcomes and rehospitalization.

    PubMed

    Lanfear, David E; Peterson, Edward L; Campbell, Janis; Phatak, Hemant; Wu, David; Wells, Karen; Spertus, John A; Williams, L Keoki

    2011-01-01

Worsened renal function (WRF) during heart failure (HF) hospitalization is associated with in-hospital mortality, but there are limited data regarding its relation to long-term outcomes after discharge. The influence of WRF resolution is also unknown. This retrospective study analyzed patients who received care from a large health system and had a primary hospital discharge diagnosis of HF from January 2000 to June 2008. Renal function was estimated from creatinine levels during hospitalization. The first available value was considered baseline. WRF was defined as a creatinine increase ≥ 0.3 mg/dl on any subsequent hospital day compared to baseline. Persistent WRF was defined as having WRF at discharge. Proportional hazards regression, adjusting for baseline renal function and potential confounding factors, was used to assess time to rehospitalization or death. Of 2,465 patients who survived to discharge, 887 (36%) developed WRF. Median follow-up was 2.1 years. In adjusted models, WRF was associated with higher rates of postdischarge death or rehospitalization (hazard ratio [HR] 1.12, 95% confidence interval [CI] 1.02 to 1.22). Of those with WRF, 528 (60%) had persistent WRF, whereas 359 (40%) recovered. Persistent WRF was significantly associated with higher postdischarge event rates (HR 1.14, 95% CI 1.02 to 1.27), whereas transient WRF showed only a nonsignificant trend toward risk (HR 1.09, 95% CI 0.96 to 1.24). In conclusion, in patients surviving hospitalization for HF, WRF was associated with increased long-term mortality and rehospitalization, particularly if renal function did not recover by the time of discharge. Copyright © 2011 Elsevier Inc. All rights reserved.

  7. Big Data Toolsets to Pharmacometrics: Application of Machine Learning for Time‐to‐Event Analysis

    PubMed Central

    Gong, Xiajing; Hu, Meng

    2018-01-01

Additional value can be potentially created by applying big data tools to address pharmacometric problems. The performances of machine learning (ML) methods and the Cox regression model were evaluated based on simulated time-to-event data synthesized under various preset scenarios, i.e., with linear vs. nonlinear and dependent vs. independent predictors in the proportional hazard function, or with high-dimensional data featured by a large number of predictor variables. Our results showed that ML-based methods outperformed the Cox model in prediction performance as assessed by concordance index and in identifying the preset influential variables for high-dimensional data. The prediction performances of ML-based methods are also less sensitive to data size and censoring rates than the Cox regression model. In conclusion, ML-based methods provide a powerful tool for time-to-event analysis, with a built-in capacity for high-dimensional data and better performance when the predictor variables assume nonlinear relationships in the hazard function. PMID:29536640
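    The concordance index used here to compare methods is simple to state: among comparable pairs, count how often the predicted risks order the failure times correctly. A minimal sketch of Harrell's c-index follows (illustrative toy data; not the authors' implementation):

    ```python
    # Harrell's concordance index for right-censored survival data: the
    # fraction of comparable pairs whose predicted risks are correctly
    # ordered (risk ties count 1/2). A minimal illustrative sketch.
    def concordance_index(times, events, risks):
        concordant, comparable = 0.0, 0
        n = len(times)
        for i in range(n):
            for j in range(n):
                # A pair is comparable if subject i is observed to fail
                # strictly before subject j's (possibly censored) time.
                if events[i] and times[i] < times[j]:
                    comparable += 1
                    if risks[i] > risks[j]:
                        concordant += 1.0
                    elif risks[i] == risks[j]:
                        concordant += 0.5
        return concordant / comparable

    times  = [5, 3, 8, 2, 7]
    events = [1, 1, 0, 1, 1]          # 1 = event observed, 0 = censored
    risks  = [0.4, 0.7, 0.1, 0.9, 0.3]  # higher risk should mean earlier failure
    cindex = concordance_index(times, events, risks)
    ```

    A c-index of 0.5 corresponds to random ordering and 1.0 to perfect ordering, which is how simulated scenarios with known truth can rank ML methods against the Cox model.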

  8. Italian Case Studies Modelling Complex Earthquake Sources In PSHA

    NASA Astrophysics Data System (ADS)

    Gee, Robin; Peruzza, Laura; Pagani, Marco

    2017-04-01

    This study presents two examples of modelling complex seismic sources in Italy, done in the framework of regional probabilistic seismic hazard assessment (PSHA). The first case study is for an area centred around Collalto Stoccaggio, a natural gas storage facility in Northern Italy, located within a system of potentially seismogenic thrust faults in the Venetian Plain. The storage exploits a depleted natural gas reservoir located within an actively growing anticline, which is likely driven by the Montello Fault, the underlying blind thrust. This fault has been well identified by microseismic activity (M<2) detected by a local seismometric network installed in 2012 (http://rete-collalto.crs.inogs.it/). At this time, no correlation can be identified between the gas storage activity and local seismicity, so we proceed with a PSHA that considers only natural seismicity, where the rates of earthquakes are assumed to be time-independent. The source model consists of faults and distributed seismicity to consider earthquakes that cannot be associated to specific structures. All potentially active faults within 50 km of the site are considered, and are modelled as 3D listric surfaces, consistent with the proposed geometry of the Montello Fault. Slip rates are constrained using available geological, geophysical and seismological information. We explore the sensitivity of the hazard results to various parameters affected by epistemic uncertainty, such as ground motions prediction equations with different rupture-to-site distance metrics, fault geometry, and maximum magnitude. The second case is an innovative study, where we perform aftershock probabilistic seismic hazard assessment (APSHA) in Central Italy, following the Amatrice M6.1 earthquake of August 24th, 2016 (298 casualties) and the subsequent earthquakes of Oct 26th and 30th (M6.1 and M6.6 respectively, no deaths). 
The aftershock hazard is modelled using a fault source with complex geometry, based on literature data and field evidence associated with the August mainshock. Earthquake activity rates during the very first weeks after the deadly earthquake were used to calibrate an Omori-Utsu decay curve, and the magnitude distribution of aftershocks is assumed to follow a Gutenberg-Richter distribution. We apply uniform and non-uniform spatial distributions of the seismicity across the fault source, modulating the rates as a decreasing function of distance from the mainshock. The hazard results are computed for short exposure periods (1 month, before the occurrence of the October earthquakes) and compared to the background hazard given by law (MPS04), and to observations at some reference sites. We also show the results of disaggregation computed for the city of Amatrice. Finally, we attempt to update the results in light of the new "main" events that occurred afterwards in the region. All source modelling and hazard calculations are performed using the OpenQuake engine. We discuss the novelties of these works, and the benefits and limitations of both analyses, particularly in such different contexts of seismic hazard.
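    As a rough illustration of the aftershock-rate ingredient, the modified Omori (Omori-Utsu) law gives a decaying rate n(t) = K/(c + t)^p, whose integral yields expected counts over a short exposure window. The parameter values below are invented for illustration, not the calibrated Amatrice values:

    ```python
    # Omori-Utsu aftershock rate and its integral: a sketch of how early
    # aftershock counts can feed a short-exposure hazard estimate.
    # K, c, p below are purely illustrative, not calibrated values.
    import math

    def omori_rate(t, K, c, p):
        """Aftershock rate n(t) = K / (c + t)**p, t in days after the mainshock."""
        return K / (c + t) ** p

    def expected_count(t1, t2, K, c, p):
        """Expected number of aftershocks in [t1, t2] (closed form)."""
        if p == 1.0:
            return K * (math.log(c + t2) - math.log(c + t1))
        return K / (1.0 - p) * ((c + t2) ** (1.0 - p) - (c + t1) ** (1.0 - p))

    # Expected aftershocks in the first month vs. the following month:
    first  = expected_count(0.0, 30.0, K=120.0, c=0.05, p=1.1)
    second = expected_count(30.0, 60.0, K=120.0, c=0.05, p=1.1)
    ```

    Combining such a time-decaying rate with a Gutenberg-Richter magnitude distribution gives the time-dependent source model that a short-exposure APSHA calculation consumes.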

  9. Suicide rates across income levels: Retrospective cohort data on 1 million participants collected between 2003 and 2013 in South Korea.

    PubMed

    Lee, Sang-Uk; Oh, In-Hwan; Jeon, Hong Jin; Roh, Sungwon

    2017-06-01

    The relation of income and socioeconomic status with suicide rates remains unclear. Most previous studies have focused on the relationship between suicide rates and macroeconomic factors (e.g., economic growth rate). Therefore, we aimed to identify the relationship between individuals' socioeconomic position and suicide risk. We analyzed suicide mortality rates across socioeconomic positions to identify potential trends using observational data on suicide mortality collected between January 2003 and December 2013 from 1,025,340 national health insurance enrollees. We followed the subjects for 123.5 months on average. Socioeconomic position was estimated using insurance premium levels. To examine the hazard ratios of suicide mortality in various socioeconomic positions, we used Cox proportional hazard models. We found that the hazard ratios of suicide showed an increasing trend as socioeconomic position decreased. After adjusting for gender, age, geographic location, and disability level, Medicaid recipients had the highest suicide hazard ratio (2.28; 95% CI, 1.87-2.77). Among the Medicaid recipients, men had higher hazard ratios than women (2.79; 95% CI, 2.17-3.59 vs. 1.71; 95% CI, 1.25-2.34). Hazard ratios also varied across age groups. The highest hazard ratio was found in the 40-59-year-old group (3.19; 95% CI, 2.31-4.43), whereas the lowest ratio was found in those 60 years and older (1.44; 95% CI, 1.09-1.87). Our results illuminate the relationship between socioeconomic position and suicide rates and can be used to design and implement future policies on suicide prevention. Copyright © 2017 The Authors. Production and hosting by Elsevier B.V. All rights reserved.

  10. [A dynamic study of sentinel surveillance for occupational hazard in typical industrial enterprises in Guangzhou, China, from 2012 to 2014].

    PubMed

    Wang, Zhi; Rong, Xing; Li, Yongqin; Zeng, Wenfeng; Du, Weijia; Liu, Yimin

    2015-08-01

To perform a sampling survey of occupational hazards in typical industrial enterprises in Guangzhou, China, by means of sentinel surveillance, to understand the classification, distribution, and concentration/intensity of occupational hazards as well as the characteristics and development of occupational health management in Guangzhou, and to provide a scientific basis for occupational health supervision. Fifteen enterprises in the information technology (IT), shipbuilding, chemical, leather and footwear, and auto repair industries were enrolled as subjects. Dynamic surveillance of occupational hazards and occupational health management was performed in the workplaces of those enterprises. The overall overproof rate of occupational hazards in the 15 sentinel enterprises from 2012 to 2014 was 6.16% (45/731). There was no significant difference in the overproof rate between the three years (P > 0.05). During the three years, enterprises in the shipbuilding industry had significantly higher overproof rates than those in other industries (P < 0.05). According to the results of the occupational health management questionnaire, the overall coincidence rate of survey items was 57.88% (393/679); enterprises in the IT industry had significantly lower coincidence rates than those in other industries in 2012 and 2014 (47.62%, 29.63%; P < 0.05), while enterprises in the leather and footwear industry had significantly lower coincidence rates than those in other industries in 2013 (40.63%; P < 0.05). The enterprises in the shipbuilding industry are the key to the prevention and control of occupational hazards in Guangzhou. To strengthen surveillance of occupational health in workplaces in Guangzhou, it is important to enhance occupational health supervision among small and micro enterprises and to develop continuous sentinel surveillance of occupational hazards in key industries.

  11. Use of raster-based data layers to model spatial variation of seismotectonic data in probabilistic seismic hazard assessment

    NASA Astrophysics Data System (ADS)

    Zolfaghari, Mohammad R.

    2009-07-01

Recent achievements in computer and information technology have provided the necessary tools to extend the application of probabilistic seismic hazard mapping from its traditional engineering use to many other applications. Examples of such applications are risk mitigation, disaster management, post-disaster recovery planning, catastrophe loss estimation, and risk management. Because knowledge of the factors controlling seismic hazard is incomplete, uncertainties are associated with every step involved in developing and using seismic hazard models. While some of these uncertainties can be controlled by more accurate and reliable input data, most of the data and assumptions used in seismic hazard studies remain highly uncertain and contribute to the uncertainty of the final results. In this paper a new methodology for the assessment of seismic hazard is described. The proposed approach provides a practical means of capturing spatial variations of seismological and tectonic characteristics, which allows better treatment of their uncertainties. In the proposed approach, GIS raster-based data models are used to represent geographical features in a cell-based system. The cell-based source model proposed in this paper provides a framework for implementing many geographically referenced seismotectonic factors in seismic hazard modelling; examples of such components are seismic source boundaries, rupture geometry, seismic activity rate, focal depth, and the choice of attenuation functions. The proposed methodology improves several aspects of the standard analytical tools currently used for the assessment and mapping of regional seismic hazard, makes good use of recent advancements in computer software and hardware, and is well structured for implementation with conventional GIS tools.

  12. Slope stability susceptibility evaluation parameter (SSEP) rating scheme - An approach for landslide hazard zonation

    NASA Astrophysics Data System (ADS)

    Raghuvanshi, Tarun Kumar; Ibrahim, Jemal; Ayalew, Dereje

    2014-11-01

In this paper a new slope susceptibility evaluation parameter (SSEP) rating scheme is presented, developed as an expert evaluation approach for landslide hazard zonation. The SSEP rating scheme considers the intrinsic and external triggering parameters that are responsible for slope instability. The intrinsic parameters considered are slope geometry, slope material (rock or soil type), structural discontinuities, landuse and landcover, and groundwater. External triggering parameters such as seismicity, rainfall, and manmade activities are also considered. In the SSEP empirical technique, numerical ratings are assigned to each of the intrinsic and triggering parameters on the basis of logical judgments acquired from experience of studies of intrinsic and external triggering factors and their relative impact in inducing slope instability. Further, the distribution of maximum SSEP ratings is based on their relative order of importance in contributing to slope instability. Finally, summation of all ratings for the intrinsic and triggering parameters, based on actual observation, gives the expected degree of landslide hazard in a given land unit. This information may be used to develop a landslide hazard zonation (LHZ) map. The SSEP technique was applied in the area around Wurgessa Kebelle of North Wollo Zonal Administration, Amhara National Regional State, in northern Ethiopia, some 490 km from Addis Ababa. The results obtained indicate that 8.33% of the area falls under moderate hazard, 83.33% under high hazard, and 8.34% under very high hazard. Further, in order to validate the LHZ map prepared during the study, active landslides and potential instability areas delineated through inventory mapping were overlain on it. All active landslides and potential instability areas fall within the very high and high hazard zones.
Thus, the satisfactory agreement confirms the rationality of the considered governing parameters and of the adopted SSEP technique, tools, and procedures in developing the landslide hazard map of the study area.
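    The final summation step can be sketched as a small routine that maps a land unit's summed parameter ratings to a hazard class. The ratings and class boundaries below are invented for illustration; the published SSEP scheme defines its own maximum ratings and thresholds.

    ```python
    # Summing intrinsic and triggering parameter ratings into a hazard
    # class, in the spirit of the SSEP scheme. All numbers here are
    # invented for illustration, not the published SSEP values.
    def ssep_hazard_class(ratings, thresholds=(3.5, 5.0, 7.5)):
        """Map the summed SSEP rating of a land unit to a hazard class."""
        total = sum(ratings.values())
        low, moderate, high = thresholds
        if total <= low:
            return "low"
        if total <= moderate:
            return "moderate"
        if total <= high:
            return "high"
        return "very high"

    unit = {
        "slope_geometry": 1.8,       # intrinsic parameters
        "slope_material": 1.2,
        "discontinuities": 1.5,
        "landuse_landcover": 0.8,
        "groundwater": 0.9,
        "seismicity": 0.6,           # external triggering parameters
        "rainfall": 1.0,
        "manmade_activity": 0.4,
    }
    hazard_class = ssep_hazard_class(unit)
    ```

    Running this per mapped land unit and colouring units by class is essentially how a zonation map is assembled from the ratings.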

  13. Symptomatic BK Virus Infection Is Associated with Kidney Function Decline and Poor Overall Survival in Allogeneic Hematopoietic Stem Cell Recipients

    PubMed Central

    Abudayyeh, Ala; Hamdi, Amir; Lin, Heather; Abdelrahim, Maen; Rondon, Gabriela; Andersson, Borje S; Afrough, Aimaz; Martinez, Charles S; Tarrand, Jeffrey J; Kontoyiannis, Dimitrios P.; Marin, David; Gaber, A. Osama; Salahudeen, Abdulla; Oran, Betul; Chemaly, Roy F.; Olson, Amanda; Jones, Roy; Popat, Uday; Champlin, Richard E; Shpall, Elizabeth J.; Winkelmayer, Wolfgang C.; Rezvani, Katayoun

    2017-01-01

Nephropathy due to BK virus (BKV) infection is an evolving challenge in patients undergoing hematopoietic stem cell transplantation (HSCT). We hypothesized that BKV infection was a marker of kidney function decline and a poor prognostic factor in HSCT recipients who experience this complication. In this retrospective study, we analyzed all patients who underwent their first allogeneic HSCT at our institution between 2004 and 2012. We evaluated the incidence of persistent kidney function decline, which was defined as a confirmed reduction in estimated glomerular filtration rate of at least 25% from baseline using the CKD-EPI equation. Cox proportional hazards regression was used to model the cause-specific hazard of kidney function decline, and Fine and Gray's method was used to account for the competing risk of death. Among 2477 recipients of a first allogeneic HSCT, BK viruria was detected in 25% (n=629) and kidney function decline in 944 (38.1%). On multivariate analysis, after adjusting for age, sex, acute graft-versus-host disease, chronic graft-versus-host disease, preparative conditioning regimen, and graft source, BK viruria remained a significant risk factor for kidney function decline (P<0.001). In addition, patients with BKV infection and kidney function decline experienced worse overall survival. After allogeneic HSCT, BKV infection was strongly and independently associated with subsequent kidney function decline and worse patient survival. PMID:26608093
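    The competing-risks quantity behind this kind of analysis, the cumulative incidence of kidney function decline with death as a competing event, can be sketched with an Aalen-Johansen-style estimator. The toy data below are invented; this is not the study's cohort or its Fine-Gray implementation.

    ```python
    # Cumulative incidence of an event in the presence of a competing
    # risk (Aalen-Johansen style): each event-of-interest time contributes
    # S(t-) * d/n, where S is overall event-free survival. Toy data only.
    def cumulative_incidence(times, causes, event_cause, horizon):
        """causes: 0 = censored, 1 = event of interest, 2 = competing event.
        Returns the cumulative incidence of `event_cause` at `horizon`."""
        data = sorted(zip(times, causes))
        n = len(data)
        surv = 1.0        # overall event-free survival just before t
        cif = 0.0
        at_risk = n
        for t, cause in data:
            if t > horizon:
                break
            if cause == event_cause:
                cif += surv * (1.0 / at_risk)
            if cause != 0:                     # any event reduces survival
                surv *= 1.0 - 1.0 / at_risk
            at_risk -= 1                       # subject leaves the risk set
        return cif

    times  = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
    causes = [1,   2,   1,   0,   2,   1]   # decline, death, decline, ...
    cif_decline = cumulative_incidence(times, causes, event_cause=1, horizon=5.0)
    ```

    Unlike one minus a Kaplan-Meier curve that censors deaths, this quantity never overstates the probability of decline when many patients die first, which is why competing-risk methods matter in transplant cohorts.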

  14. Fall Hazards Within Senior Independent Living: A Case-Control Study.

    PubMed

    Kim, Daejin; Portillo, Margaret

    2018-01-01

The main purpose of this research was to identify significant relationships between environmental hazards and falls among older adults. Falls present a major health risk to older persons; identifying the environmental hazards that increase fall risk can support fall prevention strategies and safer residential environments for older adults. The research included a retrospective analysis of 449 fall incident reports in two case-control buildings. In the homes of 88 older adults residing in independent living, an observational study was conducted to identify environmental hazards using two assessment tools, the Westmead Home Safety Assessment (WeHSA) and resident interviews. A fall history analysis indicated that falls that occurred in the bathroom were significantly associated with hospitalization. The observational study revealed that the bathroom was the most common place for environmental hazards. The research showed that, with increasing age and use of mobility assistive aids, there was a corresponding increase in the total number of environmental hazards. Home hazards were significantly and independently associated with the incidence rate of falls: the high-fall-rate building contained more environmental hazards than the low-fall-rate building while controlling for residents' age and mobility. The current study provides empirical evidence of the link between environmental hazards and falls among older adults, which is useful for developing effective fall intervention design strategies.

  15. Flooding Hazard Maps of Different Land Uses in Subsidence Area

    NASA Astrophysics Data System (ADS)

    Lin, Yongjun; Chang, Hsiangkuan; Tan, Yihchi

    2017-04-01

This study develops flooding hazard maps for different land uses in the subsidence area of southern Taiwan. These areas are low-lying due to subsidence caused by over-pumping of groundwater for aquaculture. As a result, flooding due to storm surges and extreme rainfall is frequent in this area and is expected to become more frequent in the future. The main land uses there are residence, fruit trees, and aquaculture, and hazard maps are developed for each. The factors affecting the hazard of each land use are as follows. For residence, flooding depth, duration of flooding, and the rising rate of the water surface level determine the degree of hazard: high flooding depth, long duration of flooding, and a fast-rising water surface make residents harder to evacuate. For fruit trees, flooding depth and duration of flooding matter most because of root hypoxia. For aquaculture, flooding depth matters most because high flooding depth may flush fish out of the fish ponds. An overland flow model is used to simulate the hydraulic parameters for these factors, i.e., flooding depth, rising rate of the water surface level, and duration of flooding. From these factors, hazard maps for the different land uses can be made, and highly hazardous areas can be delineated in the subsidence areas.

  16. Neuropsychological Correlates of Hazard Perception in Older Adults.

    PubMed

    McInerney, Katalina; Suhr, Julie

    2016-03-01

Hazard perception, the ability to identify and react to hazards while driving, is of growing importance in driving research, given its strong relationship to real-world driving variables. Furthermore, although poor hazard perception is associated with novice drivers, recent research suggests that it also declines with advanced age. In the present study, we examined the neuropsychological correlates of hazard perception in a healthy older adult sample. A total of 68 adults age 60 and older who showed no signs of dementia and were active drivers completed a battery of neuropsychological tests as well as a hazard perception task. Tests included the Repeatable Battery for the Assessment of Neuropsychological Status, Wechsler Test of Adult Reading, Trail Making Test, Block Design, Useful Field of View, and the Delis-Kaplan Executive Function System Color Word Interference Test. Hazard perception errors were related to visuospatial/constructional skills, processing speed, memory, and executive functioning skills, with a battery of tests across these domains accounting for 36.7% of the variance in hazard perception errors. Executive functioning, particularly Trail Making Test part B, emerged as a strong predictor of hazard perception ability. Consistent with prior work showing the relationship of neuropsychological performance to other measures of driving ability, neuropsychological performance was associated with hazard perception skill. Future studies should examine the relationship of neuropsychological changes in adults who are showing driving impairment and/or cognitive changes associated with Mild Cognitive Impairment or dementia.

  17. Rate of Change in Renal Function and Mortality in Elderly Treated Hypertensive Patients

    PubMed Central

    Langham, Robyn G.; Ademi, Zanfina; Owen, Alice; Krum, Henry; Wing, Lindon M.H.; Nelson, Mark R.; Reid, Christopher M.

    2015-01-01

    Background and objectives Evidence relating the rate of change in renal function, measured as eGFR, after antihypertensive treatment in elderly patients to clinical outcome is sparse. This study characterized the rate of change in eGFR after commencement of antihypertensive treatment in an elderly population, the factors associated with eGFR rate change, and the rate’s association with all-cause and cardiovascular mortality. Design, setting, participants, & measurements Data from the Second Australian National Blood Pressure study were used, where 6083 hypertensive participants aged ≥65 years were enrolled during 1995–1997 and followed for a median of 4.1 years (in-trial). Following the Second Australian National Blood Pressure study, participants were followed-up for a further median 6.9 years (post-trial). The annual rate of change in the eGFR was calculated in 4940 participants using creatinine measurements during the in-trial period and classified into quintiles (Q) on the basis of the following eGFR changes: rapid decline (Q1), decline (Q2), stable (Q3), increase (Q4), and rapid increase (Q5). Results A rapid decline in eGFR in comparison with those with stable eGFRs during the in-trial period was associated with older age, living in a rural area, wider pulse pressure at baseline, receiving diuretic-based therapy, taking multiple antihypertensive drugs, and having blood pressure <140/90 mmHg during the study. However, a rapid increase in eGFR was observed in younger women and those with a higher cholesterol level. After adjustment for baseline and in-trial covariates, Cox-proportional hazard models showed a significantly greater risk for both all-cause (hazard ratio, 1.28; 95% confidence interval, 1.09 to 1.52; P=0.003) and cardiovascular (hazard ratio, 1.40; 95% confidence interval, 1.11 to 1.76; P=0.004) mortality in the rapid decline group compared with the stable group over a median of 7.2 years after the last eGFR measure. 
No significant association with mortality was observed for a rapid increase in eGFR. Conclusions In elderly persons with treated hypertension, a rapid decline in eGFR is associated with a higher risk of mortality. PMID:25901093

  18. Rates of Atrial Fibrillation in Black Versus White Patients With Pacemakers.

    PubMed

    Kamel, Hooman; Kleindorfer, Dawn O; Bhave, Prashant D; Cushman, Mary; Levitan, Emily B; Howard, George; Soliman, Elsayed Z

    2016-02-12

    Black US residents experience higher rates of ischemic stroke than white residents but have lower rates of clinically apparent atrial fibrillation (AF), a strong risk factor for stroke. It is unclear whether black persons truly have less AF or simply more undiagnosed AF. We obtained administrative claims data from state health agencies regarding all emergency department visits and hospitalizations in California, Florida, and New York. We identified a cohort of patients with pacemakers, the regular interrogation of which reduces the likelihood of undiagnosed AF. We compared rates of documented AF or atrial flutter at follow-up visits using Kaplan-Meier survival statistics and Cox proportional hazards models adjusted for demographic characteristics and vascular risk factors. We identified 10 393 black and 91 380 white patients without documented AF or atrial flutter before or at the index visit for pacemaker implantation. During 3.7 (±1.8) years of follow-up, black patients had a significantly lower rate of AF (21.4%; 95% CI 19.8-23.2) than white patients (25.5%; 95% CI 24.9-26.0). After adjustment for demographic characteristics and comorbidities, black patients had a lower hazard of AF (hazard ratio 0.91; 95% CI 0.86-0.96), a higher hazard of atrial flutter (hazard ratio 1.29; 95% CI 1.11-1.49), and a lower hazard of the composite of AF or atrial flutter (hazard ratio 0.94; 95% CI 0.88-0.99). In a population-based sample of patients with pacemakers, black patients had a lower rate of AF compared with white patients. These findings indicate that the persistent racial disparities in rates of ischemic stroke are likely to be related to factors other than undiagnosed AF. © 2016 The Authors. Published on behalf of the American Heart Association, Inc., by Wiley Blackwell.

  19. Deaggregation of Probabilistic Ground Motions in the Central and Eastern United States

    USGS Publications Warehouse

    Harmsen, S.; Perkins, D.; Frankel, A.

    1999-01-01

    Probabilistic seismic hazard analysis (PSHA) is a technique for estimating the annual rate of exceedance of a specified ground motion at a site due to known and suspected earthquake sources. The relative contributions of the various sources to the total seismic hazard are determined as a function of their occurrence rates and their ground-motion potential. The separation of the exceedance contributions into bins whose base dimensions are magnitude and distance is called deaggregation. We have deaggregated the hazard analyses for the new USGS national probabilistic ground-motion hazard maps (Frankel et al., 1996). For points on a 0.2° grid in the central and eastern United States (CEUS), we show color maps of the geographical variation of mean and modal magnitudes (M̄, M̂) and distances (D̄, D̂) for ground motions having a 2% chance of exceedance in 50 years. These maps are displayed for peak horizontal acceleration and for spectral response accelerations of 0.2, 0.3, and 1.0 sec. We tabulate M̄, D̄, M̂, and D̂ for 49 CEUS cities for 0.2- and 1.0-sec response. Thus, these maps and tables are PSHA-derived estimates of the potential earthquakes that dominate seismic hazard at short and intermediate periods in the CEUS. The contribution to hazard of the New Madrid and Charleston sources dominates over much of the CEUS: for 0.2-sec response, over 40% of the area; for 1.0-sec response, over 80% of the area. For 0.2-sec response, D̄ ranges from 20 to 200 km; for 1.0 sec, from 30 to 600 km. For sites influenced by New Madrid or Charleston, D̄ is less than the distance to these sources, and M̄ is less than the characteristic magnitude of these sources, because averaging takes into account the effect of smaller-magnitude and closer sources. On the other hand, D̂ is directly the distance to New Madrid or Charleston, and M̂ for 0.2- and 1.0-sec response corresponds to the dominating source over much of the CEUS.
For some cities in the North Atlantic states, short-period seismic hazard is apt to be controlled by local seismicity, whereas intermediate period (1.0 sec) hazard is commonly controlled by regional seismicity, such as that of the Charlevoix seismic zone.
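
    The mean and modal deaggregation statistics described above can be computed directly from a binned matrix of hazard contributions. The sketch below is a minimal illustration; the magnitude, distance, and contribution values are invented, not the paper's results.

```python
# Minimal deaggregation sketch: mean and modal (M, D) from (magnitude,
# distance, contribution) bins. All bin values below are illustrative.

def deaggregate(bins):
    """bins: list of (magnitude, distance_km, contribution) tuples, where
    each contribution is that source bin's share of the exceedance rate."""
    total = sum(c for _, _, c in bins)
    mean_m = sum(m * c for m, _, c in bins) / total
    mean_d = sum(d * c for _, d, c in bins) / total
    modal_m, modal_d, _ = max(bins, key=lambda b: b[2])  # largest-share bin
    return mean_m, mean_d, modal_m, modal_d

bins = [
    (5.0,  30.0, 0.20),   # local moderate seismicity
    (6.0, 100.0, 0.15),
    (7.7, 250.0, 0.65),   # distant characteristic source (New Madrid-like)
]
mean_m, mean_d, modal_m, modal_d = deaggregate(bins)
```

    Note that in this toy case the contribution-weighted mean magnitude (about 6.9) falls below the modal magnitude (7.7), mirroring the abstract's point that averaging over smaller, closer sources pulls M̄ below the characteristic magnitude of the dominating source.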

  20. Applying the natural disasters vulnerability evaluation model to the March 2011 north-east Japan earthquake and tsunami.

    PubMed

    Ruiz Estrada, Mario Arturo; Yap, Su Fei; Park, Donghyun

    2014-07-01

    Natural hazards have a potentially large impact on economic growth, but measuring their economic impact is subject to a great deal of uncertainty. The central objective of this paper is to demonstrate a model, the natural disasters vulnerability evaluation (NDVE) model, that can be used to evaluate the impact of natural hazards on gross national product growth. The model is based on five basic indicators: natural hazards growth rates (αi), the national natural hazards vulnerability rate (ΩT), the natural disaster devastation magnitude rate (Π), the economic desgrowth rate (i.e. shrinkage of the economy) (δ), and the NHV surface. In addition, we apply the NDVE model to the north-east Japan earthquake and tsunami of March 2011 to evaluate its impact on the Japanese economy. © 2014 The Author(s). Disasters © Overseas Development Institute, 2014.

  1. On the importance of accounting for competing risks in pediatric cancer trials designed to delay or avoid radiotherapy: I. Basic concepts and first analyses.

    PubMed

    Tai, Bee-Choo; Grundy, Richard G; Machin, David

    2010-04-01

    In trials designed to delay or avoid irradiation among children with malignant brain tumor, although irradiation after disease progression is an important event, patients who have disease progression may decline radiotherapy (RT), or those without disease progression may opt for elective RT. To accurately describe the cumulative need for RT in such instances, it is crucial to account for these distinct events and to evaluate how each contributes to the delay or advancement of irradiation via a competing risks analysis. We describe the summary of competing events in such trials using competing risks methods based on cumulative incidence functions and Gray's test. The results obtained are contrasted with standard survival methods based on Kaplan-Meier curves, cause-specific hazard functions and log-rank test. The Kaplan-Meier method overestimates all event-specific rates. The cause-specific hazard analysis showed reduction in hazards for all events (A: RT after progression; B: no RT after progression; C: elective RT) among children with ependymoma. For event A, a higher cumulative incidence was reported for ependymoma. Although Gray's test failed to detect any difference (p = 0.331) between histologic subtypes, the log-rank test suggested marginal evidence (p = 0.057). Similarly, for event C, the log-rank test found stronger evidence of reduction in hazard among those with ependymoma (p = 0.005) as compared with Gray's test (p = 0.086). To evaluate treatment differences, failing to account for competing risks using appropriate methodology may lead to incorrect interpretations.
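
    The contrast the authors draw between 1 − Kaplan-Meier and the cumulative incidence function can be illustrated in a few lines of code. This is a minimal sketch under simplifying assumptions (distinct event times, one subject per time, no ties); the event labels echo the abstract's events A and B, but the data are invented.

```python
# Hedged sketch of why 1 - Kaplan-Meier overestimates event-specific rates
# when competing events are present. Toy data with distinct event times.

def cif_and_km(times, events, target):
    """events[i] is an event label, or None if subject i is censored.
    Returns (cumulative incidence of `target`, naive 1 - KM that treats
    competing events as censoring) at the last observed time."""
    data = sorted(zip(times, events))
    at_risk = len(data)
    surv_all = 1.0   # overall event-free survival S(t-)
    cif = 0.0        # Aalen-Johansen cumulative incidence of `target`
    km_target = 1.0  # KM "survival" when competing events are censored
    for t, e in data:
        if e is not None:
            if e == target:
                cif += surv_all * (1.0 / at_risk)  # S(t-) * hazard at t
                km_target *= 1.0 - 1.0 / at_risk
            surv_all *= 1.0 - 1.0 / at_risk
        at_risk -= 1
    return cif, 1.0 - km_target

cif, naive = cif_and_km([1, 2, 3, 4, 5, 6],
                        ['A', 'B', 'A', 'B', 'A', None], 'A')
# cif ≈ 0.5, naive ≈ 0.6875: the naive estimate overstates the incidence
```

    The naive estimate exceeds the cumulative incidence because censoring the competing events removes them from the risk set without acknowledging that they preclude the target event, which is exactly the overestimation the abstract describes.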

  2. Installation Restoration Program Records Search for Kingsley Field, Oregon.

    DTIC Science & Technology

    1982-06-01

    The Hazard Assessment Rating Methodology (HARM) is now used for all Air Force IRP studies. To maintain consistency, AFESC had their on-call contractors review... Installation History D. Industrial Facilities E. POL Storage Tanks F. Abandoned Tanks G. Oil/Water Separators H. Site Hazard Rating Methodology I. Site... and implementing regulations. The purpose of DOD policy is to control the migration of hazardous material contaminants from DOD installations.

  3. Clinical Aspects of Type-1 Long-QT Syndrome by Location, Coding Type, and Biophysical Function of Mutations Involving the KCNQ1 Gene

    PubMed Central

    Moss, Arthur J.; Shimizu, Wataru; Wilde, Arthur A.M.; Towbin, Jeffrey A.; Zareba, Wojciech; Robinson, Jennifer L.; Qi, Ming; Vincent, G. Michael; Ackerman, Michael J.; Kaufman, Elizabeth S.; Hofman, Nynke; Seth, Rahul; Kamakura, Shiro; Miyamoto, Yoshihiro; Goldenberg, Ilan; Andrews, Mark L.; McNitt, Scott

    2012-01-01

    Background Type-1 long-QT syndrome (LQTS) is caused by loss-of-function mutations in the KCNQ1-encoded IKs cardiac potassium channel. We evaluated the effect of location, coding type, and biophysical function of KCNQ1 mutations on the clinical phenotype of this disorder. Methods and Results We investigated the clinical course in 600 patients with 77 different KCNQ1 mutations in 101 proband-identified families derived from the US portion of the International LQTS Registry (n=425), the Netherlands’ LQTS Registry (n=93), and the Japanese LQTS Registry (n=82). The Cox proportional hazards survivorship model was used to evaluate the independent contribution of clinical and genetic factors to the first occurrence of time-dependent cardiac events from birth through age 40 years. The clinical characteristics, distribution of mutations, and overall outcome event rates were similar in patients enrolled from the 3 geographic regions. Biophysical function of the mutations was categorized according to dominant-negative (>50%) or haploinsufficiency (≤50%) reduction in cardiac repolarizing IKs potassium channel current. Patients with transmembrane versus C-terminus mutations (hazard ratio, 2.06; P<0.001) and those with mutations having dominant-negative versus haploinsufficiency ion channel effects (hazard ratio, 2.26; P<0.001) were at increased risk for cardiac events, and these genetic risks were independent of traditional clinical risk factors. Conclusions This genotype–phenotype study indicates that in type-1 LQTS, mutations located in the transmembrane portion of the ion channel protein and the degree of ion channel dysfunction caused by the mutations are important independent risk factors influencing the clinical course of this disorder. PMID:17470695

  4. Religion and mortality among the community-dwelling elderly.

    PubMed Central

    Oman, D; Reed, D

    1998-01-01

    OBJECTIVES: This study analyzed the prospective association between attending religious services and all-cause mortality to determine whether the association is explainable by 6 confounding factors: demographics, health status, physical functioning, health habits, social functioning and support, and psychological state. METHODS: The association between self-reported religious attendance and subsequent mortality over 5 years for 1931 older residents of Marin County, California, was examined by proportional hazards regression. Interaction terms of religion with social support were used to explore whether other forms of social support could substitute for religion and diminish its protective effect. RESULTS: Persons who attended religious services had lower mortality than those who did not (age- and sex-adjusted relative hazard [RH] = 0.64; 95% confidence interval [CI] = 0.52, 0.78). Multivariate adjustment reduced this relationship only slightly (RH = 0.76; 95% CI = 0.62, 0.94), primarily by including physical functioning and social support. Contrary to hypothesis, religious attendance tended to be slightly more protective for those with high social support. CONCLUSIONS: Lower mortality rates for those who attend religious services are only partly explained by the 6 possible confounders listed above. Psychodynamic and other explanations need further investigation. PMID:9772846

  5. Lunar mission safety and rescue: Hazards analysis and safety requirements

    NASA Technical Reports Server (NTRS)

    1971-01-01

    The results of the hazards analysis are presented; the analysis was concerned only with hazards to personnel, not with loss of equipment or property. Hazards characterization includes the definition of a hazard, the hazard levels, and the hazard groups. The analysis methodology is described in detail. The methodology was used to prepare the top-level functional flow diagrams, to perform the first-level hazards assessment, and to develop a list of conditions and situations requiring individual hazard studies. The results of all 39 individual hazard studies are presented.

  6. An EM-based semi-parametric mixture model approach to the regression analysis of competing-risks data.

    PubMed

    Ng, S K; McLachlan, G J

    2003-04-15

    We consider a mixture model approach to the regression analysis of competing-risks data. Attention is focused on inference concerning the effects of factors on both the probability of occurrence and the hazard rate conditional on each of the failure types. These two quantities are specified in the mixture model using the logistic model and the proportional hazards model, respectively. We propose a semi-parametric mixture method to estimate the logistic and regression coefficients jointly, whereby the component-baseline hazard functions are completely unspecified. Estimation is based on maximum likelihood on the basis of the full likelihood, implemented via an expectation-conditional maximization (ECM) algorithm. Simulation studies are performed to compare the performance of the proposed semi-parametric method with a fully parametric mixture approach. The results show that when the component-baseline hazard is monotonic increasing, the semi-parametric and fully parametric mixture approaches are comparable for mildly and moderately censored samples. When the component-baseline hazard is not monotonic increasing, the semi-parametric method consistently provides less biased estimates than a fully parametric approach and is comparable in efficiency in the estimation of the parameters for all levels of censoring. The methods are illustrated using a real data set of prostate cancer patients treated with different dosages of the drug diethylstilbestrol. Copyright 2003 John Wiley & Sons, Ltd.

  7. Predictive value of myocardial perfusion single-photon emission computed tomography and the impact of renal function on cardiac death.

    PubMed

    Hakeem, Abdul; Bhatti, Sabha; Dillie, Kathryn Sullivan; Cook, Jeffrey R; Samad, Zainab; Roth-Cline, Michelle D; Chang, Su Min

    2008-12-09

    Patients with chronic kidney disease (CKD) have worse cardiovascular outcomes than those without CKD. The prognostic utility of myocardial perfusion single-photon emission CT (MPS) in patients with varying degrees of renal dysfunction and the impact of CKD on cardiac death prediction in patients undergoing MPS have not been investigated. We followed up 1652 consecutive patients who underwent stress MPS (32% exercise, 95% gated) for cardiac death for a mean of 2.15±0.8 years. MPS defects were defined by the summed stress score (normal: summed stress score <4; abnormal: summed stress score ≥4). Ischemia was defined as a summed stress score ≥4 plus a summed difference score ≥2, and scar was defined as a summed stress score ≥4 plus a summed difference score <2. Renal function was calculated with the Modification of Diet in Renal Disease equation. CKD (estimated glomerular filtration rate <60 mL/min per 1.73 m²) was present in 36%. Cardiac death increased with worsening levels of perfusion defects across the entire spectrum of renal function. Presence of ischemia was independently predictive of cardiac death, all-cause mortality, and nonfatal myocardial infarction. Patients with normal MPS and CKD had higher unadjusted cardiac death event rates than those with no CKD and normal MPS (2.7% versus 0.8%, P=0.001). Multivariate Cox proportional hazards models revealed that both perfusion defects (hazard ratio 1.90, 95% CI 1.47 to 2.46) and CKD (hazard ratio 1.96, 95% CI 1.29 to 2.95) were independent predictors of cardiac death after accounting for risk factors, left ventricular dysfunction, pharmacological stress, and symptom status. Both MPS and CKD had incremental power for cardiac death prediction over baseline risk factors and left ventricular dysfunction (global χ² 207.5 versus 169.3, P<0.0001). MPS provides effective risk stratification across the entire spectrum of renal function.
    Renal dysfunction is also an important independent predictor of cardiac death in patients undergoing MPS. Renal function and MPS have additive value in risk-stratifying patients with suspected coronary artery disease. Patients with CKD appear to have a less benign prognosis than those without CKD, even in the presence of a normal scan.
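
    The score-based definitions in the abstract (summed stress score, SSS; summed difference score, SDS) translate directly into a small classifier. A minimal sketch; the function name is ours, not from the paper.

```python
# Classify an MPS study from its summed stress score (sss) and summed
# difference score (sds), following the thresholds quoted in the abstract:
# normal: sss < 4; ischemia: sss >= 4 and sds >= 2; scar: sss >= 4, sds < 2.

def classify_mps(sss, sds):
    if sss < 4:
        return "normal"
    return "ischemia" if sds >= 2 else "scar"
```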

  8. Exponential decline of aftershocks of the M7.9 1868 great Kau earthquake, Hawaii, through the 20th century

    USGS Publications Warehouse

    Klein, F.W.; Wright, Tim

    2008-01-01

    The remarkable catalog of Hawaiian earthquakes going back to the 1820s is based on missionary diaries, newspaper accounts, and instrumental records and spans the great M7.9 Kau earthquake of April 1868 and its aftershock sequence. The earthquake record since 1868 defines a smooth curve, complete to M5.2, of the declining rate into the 21st century, after five short volcanic swarms are removed. A single aftershock curve fits the earthquake record, even with numerous M6 and 7 main shocks and eruptions. The timing of some moderate earthquakes may be controlled by magmatic stresses, but their overall long-term rate reflects that of aftershocks of the Kau earthquake. The 1868 earthquake is, therefore, the largest and most controlling stress event in the 19th and 20th centuries. We fit both the modified Omori (power law) and stretched exponential (SE) functions to the earthquakes. We found that the modified Omori law is a good fit to the M ≥ 5.2 earthquake rate for the first 10 years or so and that the more rapidly declining SE function fits better thereafter, as supported by three statistical tests. The switch to exponential decay suggests that a possible change in aftershock physics may occur, from rate-and-state fault friction, with no change in the stress rate, to viscoelastic stress relaxation. The 61-year exponential decay constant is at the upper end of the range of geodetic relaxation times seen after other global earthquakes. Modeling deformation in Hawaii is beyond the scope of this paper, but a simple interpretation of the decay suggests an effective viscosity of 10¹⁹ to 10²⁰ Pa s pertains in the volcanic spreading of Hawaii's flanks. The rapid decline in earthquake rate poses questions for seismic hazard estimates in an area that is cited as one of the most hazardous in the United States.
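
    The two decay laws compared above can be written compactly. A minimal sketch: the modified Omori parameters (K, c, p) and the stretch exponent q are illustrative, and the stretched-exponential form shown is one common parameterization; only the 61-year decay constant is taken from the abstract.

```python
import math

# Two aftershock-rate decay laws (illustrative parameters, not the
# fitted Hawaii values except the 61-year decay constant).

def omori_rate(t, K=100.0, c=0.1, p=1.0):
    """Modified Omori law: n(t) = K / (c + t)**p, t in years."""
    return K / (c + t) ** p

def stretched_exp_rate(t, n0=100.0, tau=61.0, q=0.5):
    """Stretched exponential: n(t) = n0 * exp(-(t / tau)**q)."""
    return n0 * math.exp(-((t / tau) ** q))
```

    Fitting both forms to a catalog and comparing them with statistical tests, as the authors did, determines where the power-law regime ends and the exponential regime takes over.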

  9. Self-Regulation and Executive Functioning as Related to Survival in Motor Neuron Disease: Preliminary Findings.

    PubMed

    Garcia-Willingham, Natasha E; Roach, Abbey R; Kasarskis, Edward J; Segerstrom, Suzanne C

    2018-05-16

    Disease progression varies widely among patients with motor neuron disease (MND). Patients with MND and coexisting dementia have shorter survival. However, the implications of mild cognitive and behavioral difficulties are unclear. The present study examined the relative contribution of executive functioning and self-regulation difficulties to survival over a 6-year period among patients with MND, who scored largely within normal limits on cognitive and behavioral indices. Patients with MND (N=37, age=59.97±11.57, 46% female) completed the Wisconsin Card Sorting Task (WCST) as an executive functioning perseveration index. The Behavior Rating Inventory of Executive Functions (BRIEF-A) was used as a behavioral measure of self-regulation in two subdomains: self-regulatory behavior (Behavioral Regulation) and self-regulatory problem-solving (Metacognition). Cox proportional hazard regression analyses were used. In total, 23 patients died during follow-up. In Cox proportional hazard regressions adjusted for a priori covariates, each 10-point T-score increment in patient-reported BRIEF-A self-regulatory behavior and problem-solving difficulties increased mortality risk by 94% and 103%, respectively (adjusted HR=1.94, 95% CI [1.07, 3.52]; adjusted HR=2.03, 95% CI [1.19, 3.48]). In sensitivity analyses, patient-reported self-regulatory problem-solving remained significant independent of disease severity and a priori covariates (adjusted HR=1.68, 95% CI [1.01, 2.78]), though the predictive value of self-regulatory behavior was attenuated in adjusted models (HR=1.67, 95% CI [0.85, 3.27]). Caregiver-reported BRIEF-A ratings of patients and WCST perseverative errors did not significantly predict survival. Preliminary evidence suggests patient-reported self-regulatory problem-solving difficulties indicate poorer prognosis in MND. Further research is needed to uncover mechanisms that negatively affect patient survival.

  10. Physical frailty is associated with incident mild cognitive impairment in community-based older persons.

    PubMed

    Boyle, Patricia A; Buchman, Aron S; Wilson, Robert S; Leurgans, Sue E; Bennett, David A

    2010-02-01

    To test the hypothesis that physical frailty is associated with risk of mild cognitive impairment (MCI). Prospective, observational cohort study. Approximately 40 retirement communities across the Chicago metropolitan area. More than 750 older persons without cognitive impairment at baseline. Physical frailty, based on four components (grip strength, timed walk, body composition, and fatigue), was assessed at baseline, and cognitive function was assessed annually. Proportional hazards models adjusted for age, sex, and education were used to examine the association between physical frailty and the risk of incident MCI, and mixed effect models were used to examine the association between frailty and the rate of change in cognition. During up to 12 years of annual follow-up, 305 of 761 (40%) persons developed MCI. In a proportional hazards model adjusted for age, sex, and education, physical frailty was associated with a high risk of incident MCI, such that each one-unit increase in physical frailty was associated with a 63% increase in the risk of MCI (hazard ratio=1.63; 95% confidence interval=1.27-2.08). This association persisted in analyses that required MCI to persist for at least 1 year and after controlling for depressive symptoms, disability, vascular risk factors, and vascular diseases. Furthermore, a higher level of physical frailty was associated with a faster rate of decline in global cognition and five cognitive systems (episodic memory, semantic memory, working memory, perceptual speed, and visuospatial abilities). Physical frailty is associated with risk of MCI and a rapid rate of cognitive decline in aging.
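
    Per-unit hazard ratios like the 1.63 reported above scale multiplicatively, because a Cox model is log-linear in its covariates: the HR for a k-unit covariate difference is the per-unit HR raised to the power k. A small worked sketch using the frailty HR quoted in the abstract.

```python
import math

# How a per-unit hazard ratio from a Cox model scales with covariate
# difference. 1.63 is the per-unit frailty HR quoted in the abstract.

hr_per_unit = 1.63
beta = math.log(hr_per_unit)          # implied Cox log-hazard coefficient
hr_half_unit = math.exp(0.5 * beta)   # HR for a 0.5-unit increase
hr_two_units = math.exp(2.0 * beta)   # HR for a 2-unit increase (= 1.63**2)
```

    So the reported 63% increase in risk per unit compounds: a two-unit difference in the frailty score implies roughly a 166% increase in risk (1.63² ≈ 2.66).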

  11. Assessment of natural radionuclides and its radiological hazards from tiles made in Nigeria

    NASA Astrophysics Data System (ADS)

    Joel, E. S.; Maxwell, O.; Adewoyin, O. O.; Ehi-Eromosele, C. O.; Embong, Z.; Saeed, M. A.

    2018-03-01

    Activity concentrations in 10 different brands of tiles made in Nigeria were analyzed using a high-purity germanium gamma detector, and hazard indices such as the absorbed dose rate, radium equivalent activity, external hazard index (Hex), internal hazard index (Hin), annual effective dose (mSv/y), gamma activity index (Iγ), and alpha index (Iα) were determined. The results showed that the average activity concentrations of the radionuclides (226Ra, 232Th and 40K) are within the recommended limits. The average radium equivalent is within the recommended limit of 370 Bq/kg. The mean values for the absorbed dose rate (D), the external and internal hazard indices, the annual effective dose rate (AEDR), the gamma activity index, and the alpha index were 169.22 nGy h⁻¹, 0.95 and 1.14, 1.59 mSv/y, 1.00, and 0.34, respectively. The results established that radiological hazards such as the absorbed dose rate, internal hazard index, annual effective dose rate, gamma activity index, and alpha index for some samples are slightly close to, or above, internationally recommended values. When the present results were compared with tile samples from other countries, the concentrations were found to be similar; nevertheless, the authors recommend proper radiation monitoring of some tiles made in Nigeria before use, owing to possible long-term health effects.
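
    The radium equivalent activity and hazard indices referenced above are conventionally computed from the 226Ra, 232Th, and 40K activity concentrations with fixed UNSCEAR-style coefficients. The coefficients below are the widely used conventional ones, an assumption on our part rather than values quoted from this paper.

```python
# Conventional (UNSCEAR-style) radiological index formulas; activities
# in Bq/kg. Coefficients are the standard literature values (assumed,
# not taken from this paper).

def radium_equivalent(a_ra, a_th, a_k):
    return a_ra + 1.43 * a_th + 0.077 * a_k    # recommended limit: 370 Bq/kg

def external_hazard_index(a_ra, a_th, a_k):
    return a_ra / 370.0 + a_th / 259.0 + a_k / 4810.0   # Hex <= 1 recommended

def internal_hazard_index(a_ra, a_th, a_k):
    return a_ra / 185.0 + a_th / 259.0 + a_k / 4810.0   # Hin <= 1 recommended
```

    The 370 Bq/kg radium equivalent limit mentioned in the abstract corresponds to Hex = 1 under these formulas.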

  12. Integrated survival analysis using an event-time approach in a Bayesian framework

    USGS Publications Warehouse

    Walsh, Daniel P.; Dreitz, VJ; Heisey, Dennis M.

    2015-01-01

    Event-time or continuous-time statistical approaches have been applied throughout the biostatistical literature and have led to numerous scientific advances. However, these techniques have traditionally relied on knowing failure times. This has limited application of these analyses, particularly within the ecological field, where fates of marked animals may be unknown. To address these limitations, we developed an integrated approach within a Bayesian framework to estimate hazard rates in the face of unknown fates. We combine failure/survival times from individuals whose fates are known, and whose times may be interval-censored, with information from those whose fates are unknown, and we model the process of detecting animals with unknown fates. This provides the foundation for our integrated model and permits necessary parameter estimation. We provide the Bayesian model and its derivation, and we use simulation techniques to investigate the properties and performance of our approach under several scenarios. Lastly, we apply our estimation technique using a piece-wise constant hazard function to investigate the effects of year, age, chick size and sex, sex of the tending adult, and nesting habitat on mortality hazard rates of chicks of the endangered mountain plover (Charadrius montanus). Traditional models were inappropriate for this analysis because fates of some individual chicks were unknown due to failed radio transmitters. Simulations revealed that biases of posterior mean estimates were minimal (≤ 4.95%), and posterior distributions behaved as expected, with the RMSE of the estimates decreasing as sample sizes, detection probability, and survival increased. We determined that mortality hazard rates for plover chicks were highest at <5 days old and were lower for chicks with larger birth weights and/or whose nest was within agricultural habitats.
Based on its performance, our approach greatly expands the range of problems for which event-time analyses can be used by eliminating the need for having completely known fate data.
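
    The piece-wise constant hazard function used above has a closed-form survival curve: S(t) = exp(−Λ(t)), where the cumulative hazard Λ(t) accumulates rate × duration over each interval. A minimal sketch with illustrative breakpoints and rates (not the plover estimates).

```python
import math

# Survival under a piece-wise constant hazard. `breaks` are the interval
# endpoints and `rates` the hazard within [0, t1), [t1, t2), ..., so
# len(rates) == len(breaks) + 1. Values below are illustrative only.

def survival(t, breaks, rates):
    cum_hazard = 0.0
    start = 0.0
    for end, rate in zip(breaks, rates):
        if t <= end:
            return math.exp(-(cum_hazard + rate * (t - start)))
        cum_hazard += rate * (end - start)
        start = end
    return math.exp(-(cum_hazard + rates[-1] * (t - start)))

# e.g. a higher chick mortality hazard before day 5, lower afterward:
s10 = survival(10.0, breaks=[5.0], rates=[0.10, 0.02])
```

    Here s10 = exp(−(0.10·5 + 0.02·5)) = exp(−0.6), matching the abstract's pattern of a high early-age hazard followed by a lower one.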

  13. Integrated survival analysis using an event-time approach in a Bayesian framework.

    PubMed

    Walsh, Daniel P; Dreitz, Victoria J; Heisey, Dennis M

    2015-02-01

    Event-time or continuous-time statistical approaches have been applied throughout the biostatistical literature and have led to numerous scientific advances. However, these techniques have traditionally relied on knowing failure times. This has limited application of these analyses, particularly within the ecological field, where fates of marked animals may be unknown. To address these limitations, we developed an integrated approach within a Bayesian framework to estimate hazard rates in the face of unknown fates. We combine failure/survival times from individuals whose fates are known, and whose times may be interval-censored, with information from those whose fates are unknown, and we model the process of detecting animals with unknown fates. This provides the foundation for our integrated model and permits necessary parameter estimation. We provide the Bayesian model and its derivation, and we use simulation techniques to investigate the properties and performance of our approach under several scenarios. Lastly, we apply our estimation technique using a piece-wise constant hazard function to investigate the effects of year, age, chick size and sex, sex of the tending adult, and nesting habitat on mortality hazard rates of chicks of the endangered mountain plover (Charadrius montanus). Traditional models were inappropriate for this analysis because fates of some individual chicks were unknown due to failed radio transmitters. Simulations revealed that biases of posterior mean estimates were minimal (≤ 4.95%), and posterior distributions behaved as expected, with the RMSE of the estimates decreasing as sample sizes, detection probability, and survival increased. We determined that mortality hazard rates for plover chicks were highest at <5 days old and were lower for chicks with larger birth weights and/or whose nest was within agricultural habitats.
Based on its performance, our approach greatly expands the range of problems for which event-time analyses can be used by eliminating the need for having completely known fate data.

  14. Hazard function theory for nonstationary natural hazards

    NASA Astrophysics Data System (ADS)

    Read, L. K.; Vogel, R. M.

    2015-11-01

    Impact from natural hazards is a shared global problem that causes tremendous loss of life and property, economic cost, and damage to the environment. Increasingly, many natural processes show evidence of nonstationary behavior including wind speeds, landslides, wildfires, precipitation, streamflow, sea levels, and earthquakes. Traditional probabilistic analysis of natural hazards based on peaks over threshold (POT) generally assumes stationarity in the magnitudes and arrivals of events, i.e. that the probability of exceedance of some critical event is constant through time. Given increasing evidence of trends in natural hazards, new methods are needed to characterize their probabilistic behavior. The well-developed field of hazard function analysis (HFA) is ideally suited to this problem because its primary goal is to describe changes in the exceedance probability of an event over time. HFA is widely used in medicine, manufacturing, actuarial statistics, reliability engineering, economics, and elsewhere. HFA provides a rich theory to relate the natural hazard event series (X) with its failure time series (T), enabling computation of corresponding average return periods, risk and reliabilities associated with nonstationary event series. This work investigates the suitability of HFA to characterize nonstationary natural hazards whose POT magnitudes are assumed to follow the widely applied Generalized Pareto (GP) model. We derive the hazard function for this case and demonstrate how metrics such as reliability and average return period are impacted by nonstationarity and discuss the implications for planning and design. Our theoretical analysis linking hazard event series X, with corresponding failure time series T, should have application to a wide class of natural hazards with rich opportunities for future extensions.

  15. Hazard function theory for nonstationary natural hazards

    NASA Astrophysics Data System (ADS)

    Read, Laura K.; Vogel, Richard M.

    2016-04-01

    Impact from natural hazards is a shared global problem that causes tremendous loss of life and property, economic cost, and damage to the environment. Increasingly, many natural processes show evidence of nonstationary behavior including wind speeds, landslides, wildfires, precipitation, streamflow, sea levels, and earthquakes. Traditional probabilistic analysis of natural hazards based on peaks over threshold (POT) generally assumes stationarity in the magnitudes and arrivals of events, i.e., that the probability of exceedance of some critical event is constant through time. Given increasing evidence of trends in natural hazards, new methods are needed to characterize their probabilistic behavior. The well-developed field of hazard function analysis (HFA) is ideally suited to this problem because its primary goal is to describe changes in the exceedance probability of an event over time. HFA is widely used in medicine, manufacturing, actuarial statistics, reliability engineering, economics, and elsewhere. HFA provides a rich theory to relate the natural hazard event series (X) with its failure time series (T), enabling computation of corresponding average return periods, risk, and reliabilities associated with nonstationary event series. This work investigates the suitability of HFA to characterize nonstationary natural hazards whose POT magnitudes are assumed to follow the widely applied generalized Pareto model. We derive the hazard function for this case and demonstrate how metrics such as reliability and average return period are impacted by nonstationarity and discuss the implications for planning and design. Our theoretical analysis linking hazard random variable X with corresponding failure time series T should have application to a wide class of natural hazards with opportunities for future extensions.
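
    For the generalized Pareto magnitude model discussed above, the hazard function follows directly from the definition h(x) = f(x)/S(x). With survival S(x) = (1 + ξx/σ)^(−1/ξ) and density f(x) = (1/σ)(1 + ξx/σ)^(−1/ξ−1), the algebra collapses to h(x) = 1/(σ + ξx). The sketch below checks this numerically with illustrative parameter values; the paper's own derivation may use different notation.

```python
# Hazard function of the Generalized Pareto distribution, two ways:
# closed form 1/(sigma + xi*x) versus f(x)/S(x) from the definition.
# sigma (scale) and xi (shape) values in the test are illustrative.

def gp_hazard(x, sigma, xi):
    return 1.0 / (sigma + xi * x)

def gp_hazard_from_definition(x, sigma, xi):
    s = (1.0 + xi * x / sigma) ** (-1.0 / xi)                     # survival
    f = (1.0 / sigma) * (1.0 + xi * x / sigma) ** (-1.0 / xi - 1)  # density
    return f / s
```

    One consequence worth noting: for ξ > 0 (heavy tail) the hazard decreases with magnitude, while for ξ < 0 (bounded tail) it increases as x approaches the upper bound.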

  16. Report on the "Shakedown" test of Oregon's rockfall hazard rating system.

    DOT National Transportation Integrated Search

    1989-04-01

    The Oregon Rockfall Hazard Rating System (RHRS) was field-tested at over 50 locations statewide to determine where clarification and improvements to the system were needed. Field use of the system demonstrated many areas where refinements were valuable. ...

  17. Characteristics and predictors of home injury hazards among toddlers in Wenzhou, China: a community-based cross-sectional study

    PubMed Central

    2014-01-01

    Background Home hazards are associated with toddlers receiving unintentional home injuries (UHI). These result not only in physical and psychological difficulties for children, but also in economic losses and additional stress for their families. Few researchers have paid systematic attention to predictors of home hazards among toddlers. The purpose of this study is firstly to describe the characteristics of homes with hazards and secondly to explore the predictive relationship of child, parent and family factors to home hazards among toddlers aged 24–47 months in Wenzhou, China. Methods Random cluster sampling was employed to select 366 parents with children aged 24–47 months from 13 kindergartens between March and April of 2012. Four instruments assessed home hazards, demographics, parents’ awareness of UHI, and family functioning. Results Descriptive statistics showed that the mean number of home hazards was 12.29 (SD = 6.39). Nine kinds of home hazards were identified in over 50% of households, including plastic bags (74.3%), coin buttons (69.1%), and toys with small components (66.7%). Multivariate linear regression revealed that the predictors of home hazards were the child’s age, the child’s residential status and family functioning (b = .19, 2.02, -.07; p < .01, < .05 and < .01, respectively). Conclusions The results showed that a higher number of home hazards was significantly attributed to older toddlers, migrant toddlers and poorer family functioning. This suggests that health care providers should focus on vulnerable families and help parents assess home hazards. Further study is needed to identify interventions for managing home hazards for toddlers in China. PMID:24953678

  18. Rockfall Hazard Process Assessment: [Project Summary]

    DOT National Transportation Integrated Search

    2017-10-01

    The Montana Department of Transportation (MDT) implemented its Rockfall Hazard Rating System (RHRS) between 2003 and 2005, obtaining information on the state's rock slopes and their associated hazards. The RHRS data facilitated decision-making in an ...

  19. Compliance with OSHA's respiratory protection standard in hospitals.

    PubMed

    Krishnan, U; Janicak, C A

    1999-01-01

    This study examined the incidence of violations of occupational safety and health standards for respiratory protection in hospitals. Data from Occupational Safety and Health Administration inspections that occurred in hospitals and resulted in violations of the respiratory protection standards were examined. From July 1, 1990, to June 30, 1995, complaint rates for hazards in the workplace increased significantly. During 1990-1991, the tuberculosis hazard complaint inspection rate was approximately 5 complaints per 1000 complaint inspections conducted; during 1994-1995 it was approximately 76 per 1000, a more than 15-fold increase. During this same period, the percentage of respiratory protection violations in relation to all violations doubled. Increased employee awareness of the hazards and current safety laws could have contributed to the increased frequency of employee complaints, leading to increases in inspections, violations, and fines. Employers must adhere to current safety and health requirements, specifically as they pertain to respiratory hazards and tuberculosis.

  20. Cumulative Incidence of Cancer among HIV-infected Individuals in North America

    PubMed Central

    Silverberg, Michael J.; Lau, Bryan; Achenbach, Chad J.; Jing, Yuezhou; Althoff, Keri N.; D’Souza, Gypsyamber; Engels, Eric A.; Hessol, Nancy; Brooks, John T.; Burchell, Ann N.; Gill, M. John; Goedert, James J.; Hogg, Robert; Horberg, Michael A.; Kirk, Gregory D.; Kitahata, Mari M.; Korthuis, Phillip T.; Mathews, William C.; Mayor, Angel; Modur, Sharada P.; Napravnik, Sonia; Novak, Richard M.; Patel, Pragna; Rachlis, Anita R.; Sterling, Timothy R.; Willig, James H.; Justice, Amy C.; Moore, Richard D.; Dubrow, Robert

    2016-01-01

    Background: Cancer is increasingly common among HIV patients given improved survival. Objective: To examine calendar trends in cumulative cancer incidence and hazard rate by HIV status. Design: Cohort study. Setting: North American AIDS Cohort Collaboration on Research and Design during 1996–2009. Patients: 86,620 HIV-infected and 196,987 uninfected adults. Measurements: We estimated cancer-type-specific cumulative incidence by age 75 years by HIV status and calendar era, and examined calendar trends in cumulative incidence and hazard rates. Results: Cumulative incidences (%) of cancer by age 75 (HIV+/HIV−) were: Kaposi sarcoma (KS), 4.4/0.01; non-Hodgkin’s lymphoma (NHL), 4.5/0.7; lung, 3.4/2.8; anal, 1.5/0.1; colorectal, 1.0/1.5; liver, 1.1/0.4; Hodgkin lymphoma (HL), 0.9/0.1; melanoma, 0.5/0.6; and oral cavity/pharyngeal, 0.8/0.8. Among HIV-infected subjects, we observed decreasing calendar trends in cumulative incidence and hazard rate for KS and NHL. For anal, colorectal and liver cancers, increasing cumulative incidence, but not hazard rate trends, were due to the decreasing mortality rate trend (−9% per year), allowing greater opportunity to be diagnosed with these cancer types. Despite decreasing hazard rate trends for lung cancer, HL, and melanoma, we did not observe cumulative incidence trends due to the compensating effect of the declining mortality rate on cumulative incidence. Limitations: Secular trends in screening, smoking, and viral co-infections were not evaluated. Conclusions: Our analytic approach helped disentangle the effects of improved survival and changing cancer-specific hazard rates on cumulative incidence trends among HIV patients. Cumulative cancer incidence by age 75, approximating lifetime risk in HIV patients, may have clinical utility in this population. The high cumulative incidences by age 75 for KS, NHL, and lung cancer support early and sustained ART and smoking cessation. Primary Funding Source: National Institutes of Health. PMID:26436616
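    The interplay the abstract describes, where a falling competing mortality rate raises cumulative incidence even without a change in the cancer hazard itself, can be sketched with a discrete-time competing-risks recursion. The per-year hazards below are made-up illustrative values, not the study's estimates:

```python
def cumulative_incidence(cancer_hazard, death_hazard, years):
    """Discrete-time cumulative incidence of cancer in the presence of the
    competing risk of death (an Aalen-Johansen-style recursion).

    cancer_hazard, death_hazard: per-year event probabilities, assumed constant
    here for illustration; years: horizon, e.g. up to age 75.
    """
    surv = 1.0   # probability of being alive and cancer-free
    ci = 0.0
    for _ in range(years):
        ci += surv * cancer_hazard        # newly diagnosed this year
        surv *= 1.0 - cancer_hazard - death_hazard
    return ci

# Lower competing mortality raises cumulative incidence even when the cancer
# hazard is unchanged -- the effect the abstract describes.
assert cumulative_incidence(0.002, 0.01, 50) > cumulative_incidence(0.002, 0.03, 50)
```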

  1. 75 FR 9018 - Pipeline Safety: Random Drug Testing Rate

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-02-26

    ... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket ID PHMSA-2010-0034] Pipeline Safety: Random Drug Testing Rate AGENCY: Pipeline and Hazardous Materials... pipelines and operators of liquefied natural gas facilities must select and test a percentage of covered...

  2. 77 FR 2606 - Pipeline Safety: Random Drug Testing Rate

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-01-18

    ... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket ID PHMSA-2012-0004] Pipeline Safety: Random Drug Testing Rate AGENCY: Pipeline and Hazardous Materials... pipelines and operators of liquefied natural gas facilities must select and test a percentage of covered...

  3. Earthquake Rate Models for Evolving Induced Seismicity Hazard in the Central and Eastern US

    NASA Astrophysics Data System (ADS)

    Llenos, A. L.; Ellsworth, W. L.; Michael, A. J.

    2015-12-01

    Injection-induced earthquake rates can vary rapidly in space and time, which presents significant challenges to traditional probabilistic seismic hazard assessment methodologies that are based on a time-independent model of mainshock occurrence. To help society cope with rapidly evolving seismicity, the USGS is developing one-year hazard models for areas of induced seismicity in the central and eastern US to forecast the shaking due to all earthquakes, including aftershocks, which are generally omitted from hazard assessments (Petersen et al., 2015). However, the spatial and temporal variability of the earthquake rates makes them difficult to forecast even on timescales as short as one year. An initial approach is to use the previous year's seismicity rate to forecast the next year's seismicity rate. However, in places such as northern Oklahoma the rates vary so rapidly over time that a simple linear extrapolation does not accurately forecast the future, even when the variability in the rates is modeled with simulations based on an Epidemic-Type Aftershock Sequence (ETAS) model (Ogata, JASA, 1988) to account for earthquake clustering. Instead of relying on a fixed time period for rate estimation, we explore another way to determine when the earthquake rate should be updated. This approach could also objectively identify new areas where the induced seismicity hazard model should be applied. We will estimate the background seismicity rate by optimizing a single set of ETAS aftershock triggering parameters across the most active induced seismicity zones -- Oklahoma, Guy-Greenbrier, the Raton Basin, and the Azle-Dallas-Fort Worth area -- with individual background rate parameters in each zone. The full seismicity rate, with uncertainties, can then be estimated using ETAS simulations, and changes in rate can be detected by applying change point analysis in ETAS-transformed time with methods already developed for Poisson processes.
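    The ETAS conditional intensity the abstract relies on is a background rate plus Omori-law aftershock contributions from every prior event, scaled exponentially by magnitude. A minimal sketch with assumed (not fitted) parameter values:

```python
import math

def etas_intensity(t, events, mu, K, alpha, c, p, m_ref):
    """Conditional intensity lambda(t) of an ETAS process (Ogata, 1988):
    background rate mu plus an Omori-law term K*exp(alpha*(m_i - m_ref))
    * (t - t_i + c)**(-p) for each prior event.

    events: list of (time, magnitude) pairs.  All parameter values used
    below are illustrative, not fitted to any catalog.
    """
    rate = mu
    for t_i, m_i in events:
        if t_i < t:
            rate += K * math.exp(alpha * (m_i - m_ref)) * (t - t_i + c) ** (-p)
    return rate

catalog = [(0.0, 4.5), (2.0, 3.2)]
lam = etas_intensity(5.0, catalog, mu=0.1, K=0.05, alpha=1.0, c=0.01, p=1.1, m_ref=3.0)
assert lam > 0.1   # the triggered rate always exceeds the background rate
```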

  4. Reduced survival in adult cystic fibrosis despite attenuated lung function decline.

    PubMed

    Keating, Claire; Poor, Armeen D; Liu, Xinhua; Chiuzan, Codruta; Backenroth, Daniel; Zhang, Yuan; DiMango, Emily

    2017-01-01

    There are limited data on disease progression and survival in adult-diagnosis cystic fibrosis (CF). This study evaluates change in lung function over time and rates of death/lung transplant in adult-diagnosis CF. The CF Foundation Patient Registry was reviewed for patients diagnosed 1993-2003. Rate of FEV1 decline was calculated up to 2010 for age groups 6-11, 12-17, and 18 and above. The Kaplan-Meier method was used to calculate 10- and 15-year survival rates for patients diagnosed as adults. Cox proportional hazards models using predictors affecting disease progression and survival without transplant were run. Between 1993 and 2003, 11,884 patients were diagnosed with CF, of whom 2848 were ages 6 and older. Annual rate of change of FEV1% predicted over 5 years differed by diagnosis age group: -1.42% per year for ages 6-11, -2.04% for ages 12-17 and -1.13% for ages 18-65 (p<0.0001). Pseudomonas aeruginosa infection was associated with faster rates of lung function decline in all age groups. Survival without transplant for CF patients diagnosed at ≥18 years was 76% and 65% by 10 and 15 years, respectively. Of adults with FEV1 of >70% predicted at diagnosis, 95% were alive without transplant at 10 years, whereas of those with FEV1 <40% predicted at diagnosis, 31% were alive without transplant at 10 years. Lung function declines at a slower rate in adult-diagnosis CF. However, particularly in those with low lung function at diagnosis, rates of death or transplant in adult-diagnosis CF after 10 and 15 years are not negligible. Copyright © 2016 European Cystic Fibrosis Society. Published by Elsevier B.V. All rights reserved.

  5. Effect of Early Referral to Specialist in Dementia on Institutionalization and Functional Decline: Findings from a Population-Based Study.

    PubMed

    Pimouguet, Clément; Le-Goff, Mélanie; Rizzuto, Debora; Berr, Claudine; Leffondré, Karen; Pérès, Karine; Dartigues, Jean François; Helmer, Catherine

    2016-01-01

    Although early diagnosis has been hypothesized to benefit both patients and caregivers, studies evaluating the effect of early dementia diagnosis have until now been lacking. To investigate the influence of early specialist referral for dementia on the risk of institutionalization and functional decline in Activities of Daily Living (ADL). Incident dementia cases were screened in a prospective population-based cohort, the Three-City Study, and initial specialist consultation for cognitive complaint was assessed at dementia diagnosis. Proportional hazards regression and illness-death models were used to test the association between specialist referral and, respectively, institutionalization and functional decline. Only one third of the incident individuals with dementia had consulted a specialist for cognitive problems early (36%). After adjustment for potential confounders (including cognitive and functional decline) and the competing risk of death, participants who had consulted a specialist early in the disease course had a higher rate of institutionalization than those who did not (Hazard Ratio = 2.00, 95% Confidence Interval (CI): 1.09-3.64). But early specialist referral was not associated with further functional decline (HR = 1.09, 95% CI: 0.71-1.67). Early specialist referral in dementia is associated with increased risk of institutionalization but not with functional decline in ADL. These findings suggest that early care referral in dementia may be a marker of concern for patients and/or caregivers; subsequent medical and social care could be suboptimal or inappropriate to allow patients to stay longer at home.

  6. Kaplan-Meier Meets Chemical Kinetics: Intrinsic Rate of SOD1 Amyloidogenesis Decreased by Subset of ALS Mutations and Cannot Fully Explain Age of Disease Onset.

    PubMed

    Abdolvahabi, Alireza; Shi, Yunhua; Rasouli, Sanaz; Croom, Corbin M; Aliyan, Amir; Martí, Angel A; Shaw, Bryan F

    2017-06-21

    Over 150 mutations in SOD1 (superoxide dismutase-1) cause amyotrophic lateral sclerosis (ALS), presumably by accelerating SOD1 amyloidogenesis. Like many nucleation processes, SOD1 fibrillization is stochastic (in vitro), which inhibits the determination of aggregation rates (and obscures whether rates correlate with patient phenotypes). Here, we diverged from classical chemical kinetics and used Kaplan-Meier estimators to quantify the probability of apo-SOD1 fibrillization (in vitro) from ~10³ replicate amyloid assays of wild-type (WT) SOD1 and nine ALS variants. The probability of apo-SOD1 fibrillization (expressed as a Hazard ratio) is increased by certain ALS-linked SOD1 mutations but is decreased or remains unchanged by other mutations. Despite this diversity, Hazard ratios of fibrillization correlated linearly with (and for three mutants, approximately equaled) Hazard ratios of patient survival (R² = 0.67; Pearson's r = 0.82). No correlation exists between Hazard ratios of fibrillization and age of initial onset of ALS (R² = 0.09). Thus, Hazard ratios of fibrillization might explain rates of disease progression but not onset. Classical kinetic metrics of fibrillization, i.e., mean lag time and propagation rate, did not correlate as strongly with phenotype (and ALS mutations did not uniformly accelerate mean rate of nucleation or propagation). A strong correlation was found, however, between mean ThT fluorescence at lag time and patient survival (R² = 0.93); oligomers of SOD1 with weaker fluorescence correlated with shorter survival. This study suggests that SOD1 mutations trigger ALS by altering a property of SOD1 or its oligomers other than the intrinsic rate of amyloid nucleation (e.g., oligomer stability; rates of intercellular propagation; affinity for membrane surfaces; and maturation rate).
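    The Kaplan-Meier estimator over replicate assays, as used here, can be sketched directly; the lag times below are invented toy data, not measurements from the study:

```python
def kaplan_meier(times, observed):
    """Kaplan-Meier survival estimate.

    times: time to fibrillization (or censoring) for each replicate assay;
    observed: True if fibrillization occurred, False if right-censored.
    Returns [(t, S(t))] at each distinct observed-event time.
    """
    order = sorted(range(len(times)), key=lambda i: times[i])
    at_risk = len(times)
    surv, curve = 1.0, []
    i = 0
    while i < len(order):
        t = times[order[i]]
        events = censored = 0
        while i < len(order) and times[order[i]] == t:   # group ties at t
            if observed[order[i]]:
                events += 1
            else:
                censored += 1
            i += 1
        if events:
            surv *= 1.0 - events / at_risk
            curve.append((t, surv))
        at_risk -= events + censored
    return curve

# Toy data: lag times (h) for five replicate assays, one censored at 48 h.
curve = kaplan_meier([10, 12, 12, 30, 48], [True, True, True, True, False])
```

    On this toy data the curve steps down at 10, 12, and 30 h, reaching S = 0.2 after the four observed events.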

  7. Probabilistic Appraisal of Earthquake Hazard Parameters Deduced from a Bayesian Approach in the Northwest Frontier of the Himalayas

    NASA Astrophysics Data System (ADS)

    Yadav, R. B. S.; Tsapanos, T. M.; Bayrak, Yusuf; Koravos, G. Ch.

    2013-03-01

    A straightforward Bayesian statistic is applied in five broad seismogenic source zones of the northwest frontier of the Himalayas to estimate the earthquake hazard parameters (maximum regional magnitude Mmax, β value of the G-R relationship and seismic activity rate or intensity λ). For this purpose, a reliable earthquake catalogue which is homogeneous for MW ≥ 5.0 and complete during the period 1900 to 2010 is compiled. The Hindukush-Pamir Himalaya zone has been further divided into two seismic zones of shallow (h ≤ 70 km) and intermediate depth (h > 70 km) according to the variation of seismicity with depth in the subduction zone. The earthquake hazard parameters estimated by the Bayesian approach are more stable and reliable, with lower standard deviations, than those from other approaches, but the technique is more time consuming. In this study, quantiles of functions of distributions of true and apparent magnitudes for future time intervals of 5, 10, 20, 50 and 100 years are calculated with confidence limits for probability levels of 50, 70 and 90% in all seismogenic source zones. The zones of estimated Mmax greater than 8.0 are related to the Sulaiman-Kirthar ranges, Hindukush-Pamir Himalaya and Himalayan Frontal Thrusts belt, suggesting these are the most seismically hazardous regions in the examined area. The lowest value of Mmax (6.44) has been calculated for the Northern-Pakistan and Hazara syntaxis zone, which also has the lowest estimated activity rate (0.0023 events/day) compared to the other zones. The Himalayan Frontal Thrusts belt exhibits a higher earthquake magnitude (8.01) in the next 100 years at the 90% probability level than the other zones, which reveals that this zone is more vulnerable to the occurrence of a great earthquake. The results obtained in this study are directly useful for probabilistic seismic hazard assessment in the examined region of the Himalaya.
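    Given a mean activity rate λ and a Gutenberg-Richter β, the Poisson exceedance probability underlying such hazard statements has a simple closed form. A sketch with illustrative (unfitted) parameters and an untruncated G-R magnitude distribution, which ignores the Mmax truncation the paper estimates:

```python
import math

def exceedance_probability(lam, beta, m_min, m, T):
    """P(at least one event with magnitude >= m in T years), assuming Poisson
    occurrence with total rate lam above m_min and an (untruncated)
    Gutenberg-Richter magnitude distribution with parameter beta.
    """
    rate_m = lam * math.exp(-beta * (m - m_min))   # rate of events >= m
    return 1.0 - math.exp(-rate_m * T)

# Illustrative values only, not the paper's fitted parameters.
p = exceedance_probability(lam=4.0, beta=2.0, m_min=5.0, m=7.0, T=50.0)
assert 0.0 < p < 1.0
```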

  8. Reintervention for stent occlusion after bilateral self-expandable metallic stent placement for malignant hilar biliary obstruction.

    PubMed

    Inoue, Tadahisa; Naitoh, Itaru; Okumura, Fumihiro; Ozeki, Takanori; Anbe, Kaiki; Iwasaki, Hiroyasu; Nishie, Hirotada; Mizushima, Takashi; Sano, Hitoshi; Nakazawa, Takahiro; Yoneda, Masashi; Joh, Takashi

    2016-11-01

    Endoscopic reintervention for stent occlusions following bilateral self-expandable metallic stent (SEMS) placement for malignant hilar biliary obstruction (MHBO) is challenging, and time to recurrent biliary obstruction (RBO) of the revisionary stent remains unclear. We aimed to clarify a suitable reintervention method for stent occlusions following bilateral SEMS placement for MHBO. Between 2002 and 2014, 52 consecutive patients with MHBO who underwent endoscopic reintervention for stent occlusion after bilateral SEMS placement were enrolled at two university hospitals and one tertiary care referral center. We retrospectively evaluated the technical and functional success rates of the reinterventions, and the time to RBO of the revisionary stents. Technical and functional success rates of the reinterventions were 92% (48/52) and 90% (43/48), respectively. Univariate analysis did not determine any significant predictive factors for technical and functional failures. Median time to RBO of the revisionary stents was 68 days. Median time to RBO was significantly longer for revisionary SEMS placement than for plastic stent placement (131 days vs 47 days, respectively; log-rank test, P = 0.005). Revisionary SEMS placement was the only independent factor that was significantly associated with a longer time to RBO of the revisionary stent in the multivariate Cox proportional hazards analysis (hazard ratio 0.37; 95% confidence interval 0.14-0.95; P = 0.039). Revisionary SEMS placement is a suitable endoscopic reintervention method for stent occlusion following bilateral SEMS placement from the perspective of time to RBO of the revisionary stent. © 2016 Japan Gastroenterological Endoscopy Society.

  9. Influence of visual clutter on the effect of navigated safety inspection: a case study on elevator installation.

    PubMed

    Liao, Pin-Chao; Sun, Xinlu; Liu, Mei; Shih, Yu-Nien

    2018-01-11

    Navigated safety inspection based on task-specific checklists can increase the hazard detection rate, theoretically with interference from scene complexity. Visual clutter, a proxy of scene complexity, can theoretically impair visual search performance, but its impact on the effect of safety inspection performance remains to be explored for the optimization of navigated inspection. This research aims to explore whether the relationship between working memory and hazard detection rate is moderated by visual clutter. Based on a perceptive model of hazard detection, we: (a) developed a mathematical influence model for construction hazard detection; (b) designed an experiment to observe the performance of hazard detection rate with adjusted working memory under different levels of visual clutter, while using an eye-tracking device to observe participants' visual search processes; (c) utilized logistic regression to analyze the developed model under various visual clutter. The effect of a strengthened working memory on the detection rate through increased search efficiency is more apparent in high visual clutter. This study confirms the role of visual clutter in construction-navigated inspections, thus serving as a foundation for the optimization of inspection planning.
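    The moderation effect under study, visual clutter changing the working-memory/detection-rate relationship, corresponds to an interaction term in a logistic model. A sketch with assumed coefficients (not the study's fitted values):

```python
import math

def detection_probability(working_memory, visual_clutter, beta):
    """Logistic model of hazard detection with a working-memory x clutter
    interaction term (the moderation the study tests).  The beta
    coefficients are illustrative, not fitted values."""
    b0, b_wm, b_vc, b_int = beta
    z = (b0 + b_wm * working_memory + b_vc * visual_clutter
         + b_int * working_memory * visual_clutter)
    return 1.0 / (1.0 + math.exp(-z))

beta = (-1.0, 0.8, -0.5, 0.4)
# Under these assumed coefficients, the benefit of strengthened working memory
# is larger in high clutter: the interaction term amplifies the slope in WM.
gain_high = detection_probability(2, 3, beta) - detection_probability(1, 3, beta)
gain_low = detection_probability(2, 1, beta) - detection_probability(1, 1, beta)
assert gain_high > gain_low
```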

  10. An empirical Bayes approach for the Poisson life distribution.

    NASA Technical Reports Server (NTRS)

    Canavos, G. C.

    1973-01-01

    A smooth empirical Bayes estimator is derived for the intensity parameter (hazard rate) in the Poisson distribution as used in life testing. The reliability function is also estimated either by using the empirical Bayes estimate of the parameter, or by obtaining the expectation of the reliability function. The behavior of the empirical Bayes procedure is studied through Monte Carlo simulation in which estimates of mean-squared errors of the empirical Bayes estimators are compared with those of conventional estimators such as minimum variance unbiased or maximum likelihood. Results indicate a significant reduction in mean-squared error of the empirical Bayes estimators over the conventional variety.
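    A fully parametric Bayes analogue of this idea, using a conjugate gamma prior on the Poisson hazard rate in place of the paper's smooth empirical Bayes estimator (the hyperparameters here are assumed, not estimated from data), can be sketched as:

```python
import math

def posterior_rate(x_failures, t_exposure, a=1.0, b=1.0):
    """Posterior mean of the Poisson hazard rate under a Gamma(a, b) prior:
    the posterior is Gamma(a + x, b + t), so the mean is (a + x)/(b + t)."""
    return (a + x_failures) / (b + t_exposure)

def expected_reliability(mission_time, x_failures, t_exposure, a=1.0, b=1.0):
    """Posterior expectation E[exp(-lambda * mission_time)] under the Gamma
    posterior, i.e. the expectation of the reliability function rather than
    a plug-in estimate -- the second option the abstract mentions."""
    return (1.0 + mission_time / (b + t_exposure)) ** -(a + x_failures)

lam_hat = posterior_rate(3, 100.0)
# The posterior-expected reliability exceeds the plug-in exp(-lam_hat * t)
# because exp(-lambda*t) is convex in lambda (Jensen's inequality).
assert expected_reliability(10.0, 3, 100.0) > math.exp(-lam_hat * 10.0)
```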

  11. GPS Imaging of Time-Variable Earthquake Hazard: The Hilton Creek Fault, Long Valley California

    NASA Astrophysics Data System (ADS)

    Hammond, W. C.; Blewitt, G.

    2016-12-01

    The Hilton Creek Fault, in Long Valley, California is a down-to-the-east normal fault that bounds the eastern edge of the Sierra Nevada/Great Valley microplate, and lies half inside and half outside the magmatically active caldera. Despite the dense coverage with GPS networks, the rapid and time-variable surface deformation attributable to sporadic magmatic inflation beneath the resurgent dome makes it difficult to use traditional geodetic methods to estimate the slip rate of the fault. While geologic studies identify cumulative offset, constrain timing of past earthquakes, and constrain a Quaternary slip rate to within 1-5 mm/yr, it is not currently possible to use geologic data to evaluate how the potential for slip correlates with transient caldera inflation. To estimate time-variable seismic hazard of the fault we estimate its instantaneous slip rate from GPS data using a new set of algorithms for robust estimation of velocity and strain rate fields and fault slip rates. From the GPS time series, we use the robust MIDAS algorithm to obtain time series of velocity that are highly insensitive to the effects of seasonality, outliers and steps in the data. We then use robust imaging of the velocity field to estimate a gridded time variable velocity field. Then we estimate fault slip rate at each time using a new technique that forms ad-hoc block representations that honor fault geometries, network complexity, connectivity, but does not require labor-intensive drawing of block boundaries. The results are compared to other slip rate estimates that have implications for hazard over different time scales. Time invariant long term seismic hazard is proportional to the long term slip rate accessible from geologic data. 
Contemporary time-invariant hazard, however, may differ from the long-term rate, and is estimated from the geodetic velocity field after correcting for the effects of magmatic inflation in the caldera using a published model of a dipping ellipsoidal magma chamber. Contemporary time-variable hazard can be estimated from the time-variable slip rate derived from the evolving GPS velocity field.
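    The robustness to outliers, steps and seasonality that MIDAS gets from median-based slopes can be illustrated with a simplified median-of-pairwise-slopes estimator; the real MIDAS algorithm differs in its pair selection and uncertainty estimates, and the station data below are synthetic:

```python
import statistics

def midas_like_velocity(t, y, pair_gap=1.0, tol=0.05):
    """Median of displacement slopes over ~pair_gap-year baselines: a crude
    sketch of the MIDAS idea, where the median makes the trend estimate
    insensitive to outliers and steps in the time series."""
    slopes = []
    for i in range(len(t)):
        for j in range(i + 1, len(t)):
            if abs((t[j] - t[i]) - pair_gap) < tol:
                slopes.append((y[j] - y[i]) / (t[j] - t[i]))
    return statistics.median(slopes)

# Synthetic station: 2 mm/yr trend plus a 10 mm step (e.g. an offset).
t = [k / 10 for k in range(50)]                     # 5 years, 0.1-yr sampling
y = [2.0 * tk + (10.0 if tk > 2.5 else 0.0) for tk in t]
v = midas_like_velocity(t, y)
assert abs(v - 2.0) < 1e-6   # median recovers the trend despite the step
```

    A least-squares fit to the same series would be biased by the step; the median of one-year slopes is not.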

  12. Man-rating of a launch vehicle

    NASA Astrophysics Data System (ADS)

    Soeffker, D.

    Analysis techniques for hazard identification, classification, and control, developed for Spacelab, are presented. Hazards were classified as catastrophic (leading to crew or vehicle loss), critical (could lead to serious injury or damage), and controlled (counteracted by design). All nonmetallic materials were rated for flammability in oxygen-enriched atmospheres, toxic offgassing, and odor. Any element with less than 200-mission capability was rated life-limited.

  13. Can a tailored exercise and home hazard reduction program reduce the rate of falls in community dwelling older people with cognitive impairment: protocol paper for the i-FOCIS randomised controlled trial.

    PubMed

    Close, Jacqueline C T; Wesson, Jacqueline; Sherrington, Catherine; Hill, Keith D; Kurrle, Sue; Lord, Stephen R; Brodaty, Henry; Howard, Kirsten; Gitlin, Laura N; O'Rourke, Sandra D; Clemson, Lindy

    2014-08-15

    The rate of falls in community dwelling older people with cognitive impairment (CI) is twice that of a cognitively intact population, with almost two thirds of people with CI falling annually. Studies indicate that exercise involving balance and/or a home hazard reduction program are effective in preventing falls in cognitively intact older people. However the potential benefit of these interventions in reducing falls in people with CI has not been established. This randomised controlled trial will determine whether a tailored exercise and home hazard reduction program can reduce the rate of falls in community dwelling older people with CI. We will determine whether the intervention has beneficial effects on a range of physical and psychological outcome measures as well as quality of life of participants and their carers. A health economic analysis examining the cost and potential benefits of the program will also be undertaken. Three hundred and sixty people aged 65 years or older living in the community with CI will be recruited to participate in the trial. Each will have an identifiable carer with a minimum of 3.5 hours of face to face contact each week. Participants will undergo an assessment at baseline with retests at 6 and 12 months. Participants allocated to the intervention group will participate in an exercise and home hazard reduction program tailored to their cognitive and physical abilities. The primary outcome measure will be the rate of falls which will be measured using monthly falls calendars. Secondary outcome measures will include the risk of falling, quality of life, measures of physical and cognitive function, fear of falling and planned and unplanned use of health services. Carers will be followed up to determine carer burden, coping strategies and quality of life. 
The study will determine the impact of this tailored intervention in reducing the rate of falls in community dwelling older people with CI as well as the cost-effectiveness and adherence to the program. The results will have direct implications for the design and implementation of interventions for this high-risk group of older people. The protocol for this study is registered with the Australian New Zealand Clinical Trials Registry - ACTRN12614000603617.

  14. A first hazard analysis of the Harrat Ash Shamah volcanic field, Syria-Jordan Borderline

    NASA Astrophysics Data System (ADS)

    Cagnan, Zehra; Akkar, Sinan; Moghimi, Saed

    2017-04-01

    The northernmost part of the Saudi Cenozoic Volcanic Fields, the 100,000 km² Harrat Ash Shamah, has hosted some of the most recent volcanic eruptions along the Syria-Jordan borderline. With the rapid growth of the cities in this region, exposure to any potential renewed volcanism has increased considerably. We present here a first-order probabilistic hazard analysis of new vent formation and subsequent lava flow from Harrat Ash Shamah. The 733 visible eruption vent sites were utilized to develop a probability density function for new eruption sites using Gaussian kernel smoothing. This revealed a NNW-striking zone of high spatial hazard surrounding the cities of Amman and Irbid in Jordan. The temporal eruption recurrence rate is estimated to be approximately one vent per 3500 years, but the temporal record of the field is so poorly constrained that the bounds on the recurrence interval range from 70 yrs to 17,700 yrs. A Poisson temporal model is employed within the scope of this study. In order to treat the uncertainties associated with the spatio-temporal models as well as the size of the area affected by the lava flow, the logic tree approach is adopted. For the Syria-Jordan borderline, the spatial variation of volcanic hazard is computed, as well as the uncertainty associated with these estimates.
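    The Gaussian kernel smoothing step can be sketched as a 2-D kernel density estimate over vent coordinates; the locations and bandwidth below are invented for illustration, not the study's data:

```python
import math

def vent_density(x, y, vents, bandwidth):
    """2-D Gaussian kernel density estimate of new-vent spatial intensity:
    each mapped vent contributes an isotropic Gaussian bump of width
    `bandwidth`, and the sum is normalized to integrate to one."""
    norm = 1.0 / (2.0 * math.pi * bandwidth ** 2 * len(vents))
    return norm * sum(
        math.exp(-((x - vx) ** 2 + (y - vy) ** 2) / (2.0 * bandwidth ** 2))
        for vx, vy in vents
    )

vents = [(0.0, 0.0), (1.0, 0.0), (0.5, 1.0)]
# Estimated density is highest near the vent cluster and decays away from it.
assert vent_density(0.5, 0.3, vents, 1.0) > vent_density(10.0, 10.0, vents, 1.0)
```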

  15. Big Data Toolsets to Pharmacometrics: Application of Machine Learning for Time-to-Event Analysis.

    PubMed

    Gong, Xiajing; Hu, Meng; Zhao, Liang

    2018-05-01

    Additional value can be potentially created by applying big data tools to address pharmacometric problems. The performances of machine learning (ML) methods and the Cox regression model were evaluated based on simulated time-to-event data synthesized under various preset scenarios, i.e., with linear vs. nonlinear and dependent vs. independent predictors in the proportional hazard function, or with high-dimensional data featured by a large number of predictor variables. Our results showed that ML-based methods outperformed the Cox model in prediction performance as assessed by concordance index and in identifying the preset influential variables for high-dimensional data. The prediction performances of ML-based methods are also less sensitive to data size and censoring rates than the Cox regression model. In conclusion, ML-based methods provide a powerful tool for time-to-event analysis, with a built-in capacity for high-dimensional data and better performance when the predictor variables assume nonlinear relationships in the hazard function. © 2018 The Authors. Clinical and Translational Science published by Wiley Periodicals, Inc. on behalf of American Society for Clinical Pharmacology and Therapeutics.
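    The concordance index used above to compare the ML methods with the Cox model can be computed directly; a minimal sketch of Harrell's C on invented toy data:

```python
def concordance_index(times, events, risk_scores):
    """Harrell's concordance index: over comparable pairs (the earlier time is
    an observed event), the fraction where the higher risk score belongs to
    the subject who failed earlier; ties in score count 0.5."""
    concordant = comparable = 0.0
    n = len(times)
    for i in range(n):
        for j in range(n):
            if events[i] and times[i] < times[j]:   # i failed while j at risk
                comparable += 1
                if risk_scores[i] > risk_scores[j]:
                    concordant += 1
                elif risk_scores[i] == risk_scores[j]:
                    concordant += 0.5
    return concordant / comparable

# Perfectly ranked toy data (highest risk fails first) gives C = 1.0.
assert concordance_index([2, 4, 6], [True, True, False], [3.0, 2.0, 1.0]) == 1.0
```

    C = 0.5 corresponds to random ranking, which is why it serves as the baseline when comparing models.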

  16. Rockfall Hazard Process Assessment : Final Project Report

    DOT National Transportation Integrated Search

    2017-10-01

    After a decade of using the Rockfall Hazard Rating System (RHRS), the Montana Department of Transportation (MDT) sought a reassessment of their rockfall hazard evaluation process. Their prior system was a slightly modified version of the RHRS and was...

  17. Hazard function theory for nonstationary natural hazards

    DOE PAGES

    Read, Laura K.; Vogel, Richard M.

    2016-04-11

    Impact from natural hazards is a shared global problem that causes tremendous loss of life and property, economic cost, and damage to the environment. Increasingly, many natural processes show evidence of nonstationary behavior including wind speeds, landslides, wildfires, precipitation, streamflow, sea levels, and earthquakes. Traditional probabilistic analysis of natural hazards based on peaks over threshold (POT) generally assumes stationarity in the magnitudes and arrivals of events, i.e., that the probability of exceedance of some critical event is constant through time. Given increasing evidence of trends in natural hazards, new methods are needed to characterize their probabilistic behavior. The well-developed field of hazard function analysis (HFA) is ideally suited to this problem because its primary goal is to describe changes in the exceedance probability of an event over time. HFA is widely used in medicine, manufacturing, actuarial statistics, reliability engineering, economics, and elsewhere. HFA provides a rich theory to relate the natural hazard event series (X) with its failure time series (T), enabling computation of corresponding average return periods, risk, and reliabilities associated with nonstationary event series. This work investigates the suitability of HFA to characterize nonstationary natural hazards whose POT magnitudes are assumed to follow the widely applied generalized Pareto model. We derive the hazard function for this case and demonstrate how metrics such as reliability and average return period are impacted by nonstationarity and discuss the implications for planning and design. As a result, our theoretical analysis linking hazard random variable X with corresponding failure time series T should have application to a wide class of natural hazards with opportunities for future extensions.

  18. Hazard function theory for nonstationary natural hazards

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Read, Laura K.; Vogel, Richard M.

    Impact from natural hazards is a shared global problem that causes tremendous loss of life and property, economic cost, and damage to the environment. Increasingly, many natural processes show evidence of nonstationary behavior including wind speeds, landslides, wildfires, precipitation, streamflow, sea levels, and earthquakes. Traditional probabilistic analysis of natural hazards based on peaks over threshold (POT) generally assumes stationarity in the magnitudes and arrivals of events, i.e., that the probability of exceedance of some critical event is constant through time. Given increasing evidence of trends in natural hazards, new methods are needed to characterize their probabilistic behavior. The well-developed field of hazard function analysis (HFA) is ideally suited to this problem because its primary goal is to describe changes in the exceedance probability of an event over time. HFA is widely used in medicine, manufacturing, actuarial statistics, reliability engineering, economics, and elsewhere. HFA provides a rich theory to relate the natural hazard event series (X) with its failure time series (T), enabling computation of corresponding average return periods, risk, and reliabilities associated with nonstationary event series. This work investigates the suitability of HFA to characterize nonstationary natural hazards whose POT magnitudes are assumed to follow the widely applied generalized Pareto model. We derive the hazard function for this case, demonstrate how metrics such as reliability and average return period are impacted by nonstationarity, and discuss the implications for planning and design. As a result, our theoretical analysis linking the hazard random variable X with the corresponding failure time series T should have application to a wide class of natural hazards, with opportunities for future extensions.

  19. Renal Salvage with Renal Artery Stenting Improves Long-term Survival.

    PubMed

    Modrall, J Gregory; Trimmer, Clayton; Tsai, Shirling; Kirkwood, Melissa L; Ali, Mujtaba; Rectenwald, John E; Timaran, Carlos H; Rosero, Eric B

    2017-11-01

    The Cardiovascular Outcomes in Renal Atherosclerotic Lesions (CORAL) Trial cast doubt on the benefits of renal artery stenting (RAS). However, the outcomes for patients with chronic kidney disease (CKD) were not analyzed separately in the CORAL Trial. We hypothesized that patients who experienced a significant improvement in renal function after RAS would have improved long-term survival, compared with patients whose renal function was not improved by stenting. This single-center retrospective study included 60 patients with stage 3 or worse CKD and renal artery occlusive disease who were treated with RAS for renal salvage. Patients were categorized as "responders" or "nonresponders" based on postoperative changes in estimated glomerular filtration rate (eGFR) after RAS. "Responders" were those patients with an improvement of at least 20% in eGFR over baseline; all others were categorized as "nonresponders." Survival was analyzed using the Kaplan-Meier method. Cox proportional hazards regression was used to identify predictors of long-term survival. The median age of the cohort was 66 years (interquartile range [IQR], 60-73). Median preoperative eGFR was 34 mL/min/1.73 m² (IQR, 24-45). At late follow-up (median 35 months, IQR, 22-97 months), 16 of 60 patients (26.7%) were categorized as "responders" with a median increase in postoperative eGFR of 40% (IQR, 21-67). Long-term survival was superior for responders, compared with nonresponders (P = 0.046 by log-rank test). Cox proportional hazards regression identified improved renal function after RAS as the only significant predictor of increased long-term survival (hazard ratio = 0.235, 95% confidence interval = 0.075-0.733; P = 0.0126 for improved versus worsened renal function after RAS). Successful salvage of renal function by RAS is associated with improved long-term survival. These data provide an important counterargument to the prior negative clinical trials that found no benefit to RAS.
Published by Elsevier Inc.
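
    The survival comparison above rests on the Kaplan-Meier estimator; a minimal self-contained sketch (hypothetical follow-up times and events, not the study's data):

```python
# Minimal Kaplan-Meier product-limit estimator (hypothetical data)
def kaplan_meier(times, events):
    """Return [(t, S(t))] at each distinct event time.
    times: follow-up (e.g. months); events: 1 = death observed, 0 = censored."""
    pairs = sorted(zip(times, events))
    n_at_risk = len(pairs)
    surv, out, i = 1.0, [], 0
    while i < len(pairs):
        t = pairs[i][0]
        deaths = leaving = 0
        while i < len(pairs) and pairs[i][0] == t:
            deaths += pairs[i][1]   # deaths at this time
            leaving += 1            # deaths + censorings leave the risk set
            i += 1
        if deaths:
            surv *= 1.0 - deaths / n_at_risk
            out.append((t, surv))
        n_at_risk -= leaving
    return out

print(kaplan_meier([5, 8, 8, 12, 20], [1, 1, 0, 0, 1]))
```

    By convention, a censoring tied with a death at the same time is treated as occurring just after it, which is why the risk set shrinks only after the whole tie group is processed.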

  20. PERSONNEL PROTECTION THROUGH RECONNAISSANCE ROBOTICS AT SUPERFUND REMEDIAL SITES

    EPA Science Inventory

    Investigation, mitigation, and clean-up of hazardous materials at Superfund sites normally require on-site workers to perform hazardous and sometimes potentially dangerous functions. Such functions include site surveys and the reconnaissance for airborne and buried toxic environme...

  1. DEMONSTRATION OF AUTONOMOUS AIR MONITORING THROUGH ROBOTICS

    EPA Science Inventory

    Hazardous and/or tedious functions are often performed by on-site workers during investigation, mitigation and clean-up of hazardous substances. These functions include site surveys, sampling and analysis, excavation, and treatment and preparation of wastes for shipment to chemic...

  2. Using strain rates to forecast seismic hazards

    USGS Publications Warehouse

    Evans, Eileen

    2017-01-01

    One essential component in forecasting seismic hazards is observing the gradual accumulation of tectonic strain along faults before this strain is suddenly released as earthquakes. Typically, seismic hazard models are based on geologic estimates of slip rates along faults and historical records of seismic activity, neither of which records actively accumulating strain. But this strain can be estimated by geodesy: the precise measurement of tiny position changes of Earth’s surface, obtained from GPS, interferometric synthetic aperture radar (InSAR), or a variety of other instruments.

  3. Measuring disease progression in early Parkinson disease: the National Institutes of Health Exploratory Trials in Parkinson Disease (NET-PD) experience.

    PubMed

    Parashos, Sotirios A; Luo, Sheng; Biglan, Kevin M; Bodis-Wollner, Ivan; He, Bo; Liang, Grace S; Ross, G Webster; Tilley, Barbara C; Shulman, Lisa M

    2014-06-01

    Optimizing assessments of rate of progression in Parkinson disease (PD) is important in designing clinical trials, especially of potential disease-modifying agents. To examine the value of measures of impairment, disability, and quality of life in assessing progression in early PD. Inception cohort analysis of data from 413 patients with early, untreated PD who were enrolled in 2 multicenter, randomized, double-blind clinical trials. Participants were randomly assigned to 1 of 5 treatments (67 received creatine, 66 received minocycline, 71 received coenzyme Q10, 71 received GPI-1485, and 138 received placebo). We assessed the association between the rates of change in measures of impairment, disability, and quality of life and time to initiation of symptomatic treatment. Time between baseline assessment and need for the initiation of symptomatic pharmaceutical treatment for PD was the primary indicator of disease progression. After adjusting for baseline confounding variables with regard to the Unified Parkinson's Disease Rating Scale (UPDRS) Part II score, the UPDRS Part III score, the modified Rankin Scale score, level of education, and treatment group, we assessed the rate of change for the following measurements: the UPDRS Part II score; the UPDRS Part III score; the Schwab and England Independence Scale score (which measures activities of daily living); the Total Functional Capacity scale; the 39-item Parkinson's Disease Questionnaire, summary index, and activities of daily living subscale; and version 2 of the 12-item Short Form Health Survey Physical Summary and Mental Summary. Variables reaching the statistical threshold in univariate analysis were entered into a multivariable Cox proportional hazards model using time to symptomatic treatment as the dependent variable. 
More rapid change (ie, worsening) in the UPDRS Part II score (hazard ratio, 1.15 [95% CI, 1.08-1.22] for 1 scale unit change per 6 months), the UPDRS Part III score (hazard ratio, 1.09 [95% CI, 1.06-1.13] for 1 scale unit change per 6 months), and the Schwab and England Independence Scale score (hazard ratio, 1.29 [95% CI, 1.12-1.48] for 5 percentage point change per 6 months) was associated with earlier need for symptomatic therapy. In early PD, the UPDRS Part II score and Part III score and the Schwab and England Independence Scale score can be used to measure disease progression, whereas the 39-item Parkinson's Disease Questionnaire and summary index, Total Functional Capacity scale, and the 12-item Short Form Health Survey Physical Summary and Mental Summary are not sensitive to change. clinicaltrials.gov Identifiers: NCT00063193 and NCT00076492.
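
    One way to read hazard ratios reported "per scale unit": assuming the covariate enters the Cox log hazard linearly, the hazard ratio compounds multiplicatively with the size of the change. Using the abstract's UPDRS Part II estimate of 1.15 per unit per 6 months:

```python
# Under a proportional hazards model with a linear covariate effect,
# HR for a k-unit change = (HR per unit) ** k
hr_per_unit = 1.15              # UPDRS Part II, per 1 scale-unit change per 6 months
hr_3_units = hr_per_unit ** 3   # implied HR for a 3-unit-per-6-months decline
print(round(hr_3_units, 3))     # → 1.521
```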

  4. Evaluation of hazard and integrity monitor functions for integrated alerting and notification using a sensor simulation framework

    NASA Astrophysics Data System (ADS)

    Bezawada, Rajesh; Uijt de Haag, Maarten

    2010-04-01

    This paper discusses the results of an initial evaluation study of hazard and integrity monitor functions for use with integrated alerting and notification. The Hazard and Integrity Monitor (HIM) (i) allocates information sources within the Integrated Intelligent Flight Deck (IIFD) to required functionality (like conflict detection and avoidance) and determines required performance of these information sources as part of that function; (ii) monitors or evaluates the required performance of the individual information sources and performs consistency checks among various information sources; (iii) integrates the information to establish tracks of potential hazards that can be used for the conflict probes or conflict prediction for various time horizons including the 10, 5, 3, and <3 minutes used in our scenario; (iv) detects and assesses the class of the hazard and provides possible resolutions. The HIM monitors the operation-dependent performance parameters related to the potential hazards in a manner similar to the Required Navigation Performance (RNP). Various HIM concepts have been implemented and evaluated using a previously developed sensor simulator/synthesizer. Within the simulation framework, various inputs to the IIFD and its subsystems are simulated, synthesized from actual collected data, or played back from actual flight test sensor data. The framework and HIM functions are implemented in Simulink®, a modeling language developed by The MathWorks™. This modeling language allows for test and evaluation of various sensor and communication link configurations as well as the inclusion of feedback from the pilot on the performance of the aircraft.
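
    A toy version of the consistency check in item (ii) might look like the following; the sensor pairing, values, and 3-sigma gate are all hypothetical, not drawn from the paper:

```python
# Hypothetical cross-check between two redundant measurements of the same
# quantity: flag inconsistency when they disagree by more than k times the
# combined 1-sigma uncertainty (k = 3 is an arbitrary illustrative gate)
def consistent(x1, sigma1, x2, sigma2, k=3.0):
    return abs(x1 - x2) <= k * (sigma1 ** 2 + sigma2 ** 2) ** 0.5

print(consistent(100.0, 1.0, 101.5, 1.0))  # → True  (within the gate)
print(consistent(100.0, 1.0, 110.0, 1.0))  # → False (sources disagree)
```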

  5. Documentation of hazards and safety perceptions for mechanized logging operations in east central Alabama

    Treesearch

    R. M. Bordas; G. A. Davis; B. L. Hopkins; R. E. Thomas; Robert B. Rummer

    2001-01-01

    The logging industry remains one of the most hazardous in the nation. Despite more stringent safety regulations and improvements in equipment safety features, the rate of logging fatalities has decreased at a much lower rate than the decrease in the rate of illnesses and injuries in the same occupation. The objective of this research was to identify and assess the...

  6. Smoothing spline ANOVA frailty model for recurrent event data.

    PubMed

    Du, Pang; Jiang, Yihua; Wang, Yuedong

    2011-12-01

    Gap time hazard estimation is of particular interest in recurrent event data. This article proposes a fully nonparametric approach for estimating the gap time hazard. Smoothing spline analysis of variance (ANOVA) decompositions are used to model the log gap time hazard as a joint function of gap time and covariates, and general frailty is introduced to account for between-subject heterogeneity and within-subject correlation. We estimate the nonparametric gap time hazard function and parameters in the frailty distribution using a combination of the Newton-Raphson procedure, the stochastic approximation algorithm (SAA), and the Markov chain Monte Carlo (MCMC) method. The convergence of the algorithm is guaranteed by decreasing the step size of parameter update and/or increasing the MCMC sample size along iterations. A model selection procedure is also developed to identify negligible components in a functional ANOVA decomposition of the log gap time hazard. We evaluate the proposed methods with simulation studies and illustrate their use through the analysis of bladder tumor data. © 2011, The International Biometric Society.
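
    For orientation, a far simpler nonparametric hazard estimate than the article's spline-ANOVA frailty model is a kernel smooth of Nelson-Aalen increments (Ramlau-Hansen style). The sketch below uses simulated, uncensored gap times and ignores covariates and frailty entirely; it only illustrates the idea of smoothing event-time data:

```python
import numpy as np

# Simulated gap times with a constant true hazard of 0.5 (mean gap = 2.0)
rng = np.random.default_rng(0)
gaps = np.sort(rng.exponential(scale=2.0, size=2000))
n = len(gaps)
increments = 1.0 / (n - np.arange(n))  # Nelson-Aalen jumps dN / Y at each event time

def hazard(t, bw=0.5):
    # Gaussian-kernel smooth of the Nelson-Aalen increments at time t
    w = np.exp(-0.5 * ((t - gaps) / bw) ** 2) / (bw * np.sqrt(2 * np.pi))
    return float(np.sum(w * increments))

print(hazard(1.0))  # should land near the true constant hazard 0.5
```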

  7. Advanced statistical methods to study the effects of gastric tube and non-invasive ventilation on functional decline and survival in amyotrophic lateral sclerosis.

    PubMed

    Atassi, Nazem; Cudkowicz, Merit E; Schoenfeld, David A

    2011-07-01

    A few studies suggest that non-invasive ventilation (NIV) and gastric tube (G-tube) may have a positive impact on survival but the effect on functional decline is unclear. Confounding by indication may have produced biased estimates of the benefit seen in some of these retrospective studies. The objective of this study was to evaluate the effects of G-tube and NIV on survival and functional decline using advanced statistical models that adjust for confounding by indications. A database of 331 subjects enrolled in previous clinical trials in ALS was available for analysis. Marginal structural models (MSM) were used to compare the mortality hazards and ALSFRS-R slopes between treatment and non-treatment groups, after adjusting for confounding by indication. Results showed that the placement of a G-tube was associated with an additional 1.42 units/month decline in the ALSFRS-R slope (p < 0.0001) and increased mortality hazard of 0.28 (p = 0.02). The use of NIV had no significant effect on ALSFRS-R decline or mortality. In conclusion, marginal structural models can be used to adjust for confounding by indication in retrospective ALS studies. G-tube placement could be followed by a faster rate of functional decline and increased mortality. Our results may suffer from some of the limitations of retrospective analyses.
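
    Marginal structural models are typically estimated with inverse-probability-of-treatment weights (IPTW). A toy sketch of the weighting step, with assumed rather than estimated propensity scores and invented outcome values (in practice the propensities come from a model of treatment given the confounders):

```python
# Hypothetical data: treatment indicator, outcome (e.g. a decline rate),
# and P(treatment | confounders) for each subject -- all invented
treated    = [1,   1,   1,   0,   0,   0]
outcome    = [4.0, 5.0, 6.0, 2.0, 3.0, 4.0]
propensity = [0.8, 0.6, 0.5, 0.4, 0.3, 0.2]

def weighted_mean(vals, wts):
    return sum(v * w for v, w in zip(vals, wts)) / sum(wts)

# Weight each subject by the inverse probability of the treatment actually received
wts = [1 / p if a else 1 / (1 - p) for a, p in zip(treated, propensity)]
effect = (weighted_mean([y for y, a in zip(outcome, treated) if a],
                        [w for w, a in zip(wts, treated) if a])
          - weighted_mean([y for y, a in zip(outcome, treated) if not a],
                          [w for w, a in zip(wts, treated) if not a]))
print(round(effect, 3))  # weighted treated-vs-untreated difference
```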

  8. Toward an Application Guide for Safety Integrity Level Allocation in Railway Systems.

    PubMed

    Ouedraogo, Kiswendsida Abel; Beugin, Julie; El-Koursi, El-Miloudi; Clarhaut, Joffrey; Renaux, Dominique; Lisiecki, Frederic

    2018-02-02

    The work in the article presents the development of an application guide based on feedback and comments stemming from various railway actors on their practices of SIL allocation to railway safety-related functions. The initial generic methodology for SIL allocation has been updated to be applied to railway rolling stock safety-related functions in order to solve the SIL concept application issues. Various actors dealing with railway SIL allocation problems are the intended target of the methodology; its principles will be summarized in this article with a focus on modifications and precisions made in order to establish a practical guide for railway safety authorities. The methodology is based on the flowchart formalism used in CSM (common safety method) European regulation. It starts with the use of quantitative safety requirements, particularly tolerable hazard rates (THR). THR apportioning rules are applied. On the one hand, the rules are related to classical logical combinations of safety-related functions preventing hazard occurrence. On the other hand, to take into account technical conditions (last safety weak link, functional dependencies, technological complexity, etc.), specific rules implicitly used in existing practices are defined for readjusting some THR values. The SIL allocation process based on apportioned and validated THR values is finally illustrated through the example of "emergency brake" subsystems. Some specific SIL allocation rules are also defined and illustrated. © 2018 Society for Risk Analysis.
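
    A minimal sketch of additive THR apportionment followed by THR-to-SIL banding, assuming the usual EN 50129 / IEC 61508 table of failure-rate bands (the article's readjustment rules for technical conditions are not reproduced here, and the equal split is an arbitrary choice):

```python
# SIL bands by tolerable hazard rate, in failures per hour
# (the commonly cited EN 50129 / IEC 61508 table)
def sil_from_thr(thr):
    for sil, lower in ((4, 1e-9), (3, 1e-8), (2, 1e-7), (1, 1e-6)):
        if lower <= thr < lower * 10:
            return sil
    raise ValueError("THR outside the SIL 1-4 bands: %g" % thr)

# If failure of any one of the contributing functions leads to the hazard,
# the sub-THRs must sum to no more than the system THR (additive apportionment)
system_thr = 1e-7                # tolerable hazard rate for the hazard, per hour
sub_thrs = [system_thr / 2] * 2  # equal split over two contributing functions

assert sum(sub_thrs) <= system_thr
print([sil_from_thr(t) for t in sub_thrs])  # → [3, 3]
```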

  9. Epidemiology for hazard rating of white pine blister rust

    Treesearch

    Eugene P. Van Arsdel; Brian W. Geils; Paul J. Zambino

    2006-01-01

    The ability to assess the potential for a severe infestation of white pine blister rust is an important management tool. Successful hazard rating requires a proper understanding of blister rust epidemiology, including environmental and genetic factors. For the blister rust caused by Cronartium ribicola, climate and meteorology, and the ecology,...

  10. Hazardous Drinking and Military Community Functioning: Identifying Mediating Risk Factors

    ERIC Educational Resources Information Center

    Foran, Heather M.; Heyman, Richard E.; Slep, Amy M. Smith

    2011-01-01

    Objective: Hazardous drinking is a serious societal concern in military populations. Efforts to reduce hazardous drinking among military personnel have been limited in effectiveness. There is a need for a deeper understanding of how community-based prevention models apply to hazardous drinking in the military. Community-wide prevention efforts may…

  11. 49 CFR 171.1 - Applicability of Hazardous Materials Regulations (HMR) to persons and functions.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... package or container or on a package or container containing a residue of a hazardous material. (5... bracing a hazardous materials package in a freight container or transport vehicle. (13) Segregating a hazardous materials package in a freight container or transport vehicle from incompatible cargo. (14...

  12. 49 CFR 171.1 - Applicability of Hazardous Materials Regulations (HMR) to persons and functions.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... package or container or on a package or container containing a residue of a hazardous material. (5... bracing a hazardous materials package in a freight container or transport vehicle. (13) Segregating a hazardous materials package in a freight container or transport vehicle from incompatible cargo. (14...

  13. 49 CFR 171.1 - Applicability of Hazardous Materials Regulations (HMR) to persons and functions.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... package or container or on a package or container containing a residue of a hazardous material. (5... bracing a hazardous materials package in a freight container or transport vehicle. (13) Segregating a hazardous materials package in a freight container or transport vehicle from incompatible cargo. (14...

  14. 49 CFR 171.1 - Applicability of Hazardous Materials Regulations (HMR) to persons and functions.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... package or container or on a package or container containing a residue of a hazardous material. (5... bracing a hazardous materials package in a freight container or transport vehicle. (13) Segregating a hazardous materials package in a freight container or transport vehicle from incompatible cargo. (14...

  15. Hazard recognition in mining: A psychological perspective. Information circular/1995

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Perdue, C.W.; Kowalski, K.M.; Barrett, E.A.

    1995-07-01

    This U.S. Bureau of Mines report considers, from a psychological perspective, the perceptual process by which miners recognize and respond to mining hazards. It proposes that if the hazard recognition skills of miners can be improved, mining accidents may be reduced to a significant degree. Prior studies of hazard perception in mining are considered, as are relevant studies from investigations of military target identification, pilot and gunnery officer training, transportation safety, and automobile operator behavior, as well as research into sensory functioning and visual information processing. A general model of hazard perception is introduced, and selected concepts from the psychology of perception that are applicable to the detection of mining hazards are reviewed. Hazard recognition is discussed as a function of the perceptual cues available to the miner as well as the cognitive resources and strategies employed by the miner. The development of expertise in responding to hazards is related to individual differences in the experience, aptitude, and personality of the worker. Potential applications to miner safety and training are presented.

  16. Comparison of Characteristics and Outcomes of Asymptomatic Versus Symptomatic Left Ventricular Dysfunction in Subjects 65 Years Old or Older (from the Cardiovascular Health Study)

    PubMed Central

    Pandhi, Jay; Gottdiener, John S.; Bartz, Traci M.; Kop, Willem J.; Mehra, Mandeep R.

    2014-01-01

    Although asymptomatic left ventricular (LV) systolic dysfunction (ALVSD) is common, its phenotype and prognosis for incident heart failure (HF) and mortality are insufficiently understood. Echocardiography was done in 5,649 participants in the Cardiovascular Health Study (age 73.0 ± 5.6 years, 57.6% women). The clinical characteristics and cardiovascular risk factors of the participants with ALVSD were compared to those with normal LV function (ejection fraction ≥55%) and with symptomatic LV systolic dysfunction (SLVSD; ejection fraction <55% and a history of HF). Cox proportional hazards models were used to estimate the risk of incident HF and mortality in those with ALVSD. Also, comparisons were made among the LV ejection fraction subgroups using previously validated cutoff values (<45% and 45% to 55%), adjusting for the demographic and cardiovascular disease risk factors. Those with ALVSD (7.3%) were more likely to have cardiovascular risk factors than those in the reference group (without LV dysfunction or symptomatic HF) but less likely than those with SLVSD. The HF rate was 24 occurrences per 1,000 person-years in the reference group and 57 occurrences per 1,000 person-years in those with ALVSD. The HF rate was 45 occurrences per 1,000 person-years for those with ALVSD and mildly impaired LV dysfunction and 93 occurrences per 1,000 person-years for those with ALVSD and moderate to severe LV dysfunction. The mortality rate was 51 deaths per 1,000 person-years in the reference group, 90 deaths per 1,000 person-years in the ALVSD group, and 156 deaths per 1,000 person-years in the SLVSD group. Adjusting for covariates, compared to the reference group, ALVSD was associated with an increased risk of incident HF (hazard ratio 1.60, 95% confidence interval 1.35 to 1.91), cardiovascular mortality (hazard ratio 2.13, 95% confidence interval 1.81 to 2.51), and all-cause mortality (hazard ratio 1.46, 95% confidence interval 1.29 to 1.64).
In conclusion, subjects with ALVSD are characterized by a greater prevalence of cardiovascular risk factors and co-morbidities than those with normal LV function and without HF. However, the prevalence is lower than in those with SLVSD. Patients with ALVSD are at an increased risk of HF and mortality, particularly those with greater severity of LV impairment. PMID:21575752

  17. Comparison of characteristics and outcomes of asymptomatic versus symptomatic left ventricular dysfunction in subjects 65 years old or older (from the Cardiovascular Health Study).

    PubMed

    Pandhi, Jay; Gottdiener, John S; Bartz, Traci M; Kop, Willem J; Mehra, Mandeep R

    2011-06-01

    Although asymptomatic left ventricular (LV) systolic dysfunction (ALVSD) is common, its phenotype and prognosis for incident heart failure (HF) and mortality are insufficiently understood. Echocardiography was done in 5,649 participants in the Cardiovascular Health Study (age 73.0 ± 5.6 years, 57.6% women). The clinical characteristics and cardiovascular risk factors of the participants with ALVSD were compared to those with normal LV function (ejection fraction ≥55%) and with symptomatic LV systolic dysfunction (SLVSD; ejection fraction <55% and a history of HF). Cox proportional hazards models were used to estimate the risk of incident HF and mortality in those with ALVSD. Also, comparisons were made among the LV ejection fraction subgroups using previously validated cutoff values (<45% and 45% to 55%), adjusting for the demographic and cardiovascular disease risk factors. Those with ALVSD (7.3%) were more likely to have cardiovascular risk factors than those in the reference group (without LV dysfunction or symptomatic HF) but less likely than those with SLVSD. The HF rate was 24 occurrences per 1,000 person-years in the reference group and 57 occurrences per 1,000 person-years in those with ALVSD. The HF rate was 45 occurrences per 1,000 person-years for those with ALVSD and mildly impaired LV dysfunction and 93 occurrences per 1,000 person-years for those with ALVSD and moderate to severe LV dysfunction. The mortality rate was 51 deaths per 1,000 person-years in the reference group, 90 deaths per 1,000 person-years in the ALVSD group, and 156 deaths per 1,000 person-years in the SLVSD group. Adjusting for covariates, compared to the reference group, ALVSD was associated with an increased risk of incident HF (hazard ratio 1.60, 95% confidence interval 1.35 to 1.91), cardiovascular mortality (hazard ratio 2.13, 95% confidence interval 1.81 to 2.51), and all-cause mortality (hazard ratio 1.46, 95% confidence interval 1.29 to 1.64). 
In conclusion, subjects with ALVSD are characterized by a greater prevalence of cardiovascular risk factors and co-morbidities than those with normal LV function and without HF. However, the prevalence is lower than in those with SLVSD. Patients with ALVSD are at an increased risk of HF and mortality, particularly those with greater severity of LV impairment. Copyright © 2011 Elsevier Inc. All rights reserved.

  18. [An in vitro study on toxic effect of vanadium-titanium-magnetite dust on alveolar macrophage in rabbits].

    PubMed

    Song, Y; Chen, Q; Guan, Y

    1998-11-01

    To study the toxic effect of vanadium-titanium-magnetite (VTM) dust on alveolar macrophage (AM) and its hazardous extent. Survival rates, morphology and function of AM were compared in rabbits exposed to dust of VTM, vanadium oxide, titanium dioxide and silica at various doses and exposure durations, using in vitro cell culture and putamen membrane cover glass transmission electron microscopy, and changes in activities of lactic dehydrogenase (LDH) and acid phosphatase (ACP) in cell culture were measured. Exposure to all the four kinds of dust could lead to decrease in survival rate of AM, increase in activities of LDH and ACP in the cell culture, and changes in their morphology and function to the extent dependent on the nature of dust. Toxic effect of exposure to VTM dust was lower than that to vanadium oxide and silica, but higher than that to titanium dioxide, which had slight toxic effect.

  19. Worth of data and natural disaster insurance

    USGS Publications Warehouse

    Attanasi, E.D.; Karlinger, M.R.

    1979-01-01

    The Federal Government in the past has provided medical and economic aid to victims of earthquakes and floods. However, regulating the use of hazard-prone areas would probably be more efficient. One way to implement such land use regulation is through the national flood and earthquake insurance program. Because insurance firms base their premium rates on available information, the benefits from additional data used to improve parameter estimates of the probability distribution (governing actual disaster events) can be computed from the changes in premiums induced by the additional data. An insurance firm is assumed to set rates so as to trade off penalties of overestimation and underestimation of expected damages. A Bayesian preposterior analysis is applied to determine the worth of additional data, as measured by changes in consumers’ surplus, by examining the effects of changes in premiums as a function of a longer hydrologic record.
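
    A conjugate Gamma-Poisson toy example of the underlying idea that a longer record tightens the rate estimate a premium would be based on; the prior and event counts are invented, and this is not the paper's preposterior analysis:

```python
# Gamma(a, b) prior on the annual disaster rate: prior mean a / b
a, b = 2.0, 10.0   # invented prior: mean rate 0.2 events/yr

# Posterior after observing extra_events in extra_years of additional record:
# Gamma(a + events, b + years) -- posterior variance shrinks as the record grows
for extra_years, extra_events in ((0, 0), (10, 2), (50, 10)):
    post_mean = (a + extra_events) / (b + extra_years)
    post_var = (a + extra_events) / (b + extra_years) ** 2
    print(extra_years, round(post_mean, 3), round(post_var, 4))
```

    With these invented counts the posterior mean stays at 0.2 while the variance falls from 0.02 to about 0.0033, which is the sense in which more data are "worth" something to the rate-setter.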

  20. Simulation program for estimating statistical power of Cox's proportional hazards model assuming no specific distribution for the survival time.

    PubMed

    Akazawa, K; Nakamura, T; Moriguchi, S; Shimada, M; Nose, Y

    1991-07-01

    Small sample properties of the maximum partial likelihood estimates for Cox's proportional hazards model depend on the sample size, the true values of regression coefficients, covariate structure, censoring pattern and possibly baseline hazard functions. Therefore, it would be difficult to construct a formula or table to calculate the exact power of a statistical test for the treatment effect in any specific clinical trial. The simulation program, written in SAS/IML, described in this paper uses Monte-Carlo methods to provide estimates of the exact power for Cox's proportional hazards model. For illustrative purposes, the program was applied to real data obtained from a clinical trial performed in Japan. Since the program does not assume any specific function for the baseline hazard, it is, in principle, applicable to any censored survival data as long as they follow Cox's proportional hazards model.
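
    The simulate-and-test logic described above can be sketched in a few lines; this uses a two-arm exponential model with administrative censoring and a Wald test on the log rate ratio rather than the authors' SAS/IML Cox fit, so it illustrates the Monte-Carlo approach, not their program. All parameters are made up:

```python
import math, random

random.seed(1)

def one_trial(n_per_arm=100, rate0=0.1, hr=0.6, followup=24.0):
    """Simulate one two-arm trial; return True if the effect is detected."""
    d = [0, 0]; exposure = [0.0, 0.0]
    for arm, rate in ((0, rate0), (1, rate0 * hr)):
        for _ in range(n_per_arm):
            t = random.expovariate(rate)
            if t <= followup:                    # event observed
                d[arm] += 1; exposure[arm] += t
            else:                                # administratively censored
                exposure[arm] += followup
    # MLE of log hazard ratio under exponential hazards, with Wald test
    log_hr = math.log(d[1] / exposure[1]) - math.log(d[0] / exposure[0])
    se = math.sqrt(1 / d[1] + 1 / d[0])
    return abs(log_hr / se) > 1.96

power = sum(one_trial() for _ in range(500)) / 500
print(power)  # Monte-Carlo power estimate for these made-up design values
```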

  1. Assessing the prediction accuracy of a cure model for censored survival data with long-term survivors: Application to breast cancer data.

    PubMed

    Asano, Junichi; Hirakawa, Akihiro

    2017-01-01

    The Cox proportional hazards cure model is a survival model incorporating a cure rate with the assumption that the population contains both uncured and cured individuals. It contains a logistic regression for the cure rate, and a Cox regression to estimate the hazard for uncured patients. A single predictive model for both the cure and hazard can be developed by using a cure model that simultaneously predicts the cure rate and hazards for uncured patients; however, model selection is a challenge because of the lack of a measure for quantifying the predictive accuracy of a cure model. Recently, we developed an area under the receiver operating characteristic curve (AUC) for determining the cure rate in a cure model (Asano et al., 2014), but the hazards measure for uncured patients was not resolved. In this article, we propose novel C-statistics that are weighted by the patients' cure status (i.e., cured, uncured, or censored cases) for the cure model. The operating characteristics of the proposed C-statistics and their confidence interval were examined by simulation analyses. We also illustrate methods for predictive model selection and for further interpretation of variables using the proposed AUCs and C-statistics via application to breast cancer data.
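
    The cure-model structure described above can be written as a two-component mixture: population survival S(t) = π + (1 − π)·Su(t), where π is the cure fraction and Su the survival of uncured patients. A sketch with illustrative parameters, using exponential survival for the uncured (the article's model uses a Cox regression for this part):

```python
import math

pi, rate = 0.3, 0.2   # illustrative cure fraction and uncured hazard rate

def s_pop(t):
    # Mixture: cured patients never fail; uncured follow exp(-rate * t)
    return pi + (1 - pi) * math.exp(-rate * t)

for t in (0, 5, 20, 100):
    print(t, round(s_pop(t), 3))
```

    The survival curve plateaus at the cure fraction π instead of decaying to zero, which is the signature feature that motivates separate predictive measures for the cure and hazard parts.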

  2. Increased Earthquake Rates in the Central and Eastern US Portend Higher Earthquake Hazards

    NASA Astrophysics Data System (ADS)

    Llenos, A. L.; Rubinstein, J. L.; Ellsworth, W. L.; Mueller, C. S.; Michael, A. J.; McGarr, A.; Petersen, M. D.; Weingarten, M.; Holland, A. A.

    2014-12-01

    Since 2009 the central and eastern United States has experienced an unprecedented increase in the rate of M≥3 earthquakes that is unlikely to be due to natural variation. Where the rates have increased so has the seismic hazard, making it important to understand these changes. Areas with significant seismicity increases are limited to areas where oil and gas production takes place. By far the largest contributor to the seismicity increase is Oklahoma, where recent studies suggest that these rate changes may be due to fluid injection (e.g., Keranen et al., Geology, 2013; Science, 2014). Moreover, the area of increased seismicity in northern Oklahoma that began in 2013 coincides with the Mississippi Lime play, where well completions greatly increased the year before the seismicity increase. This suggests a link to oil and gas production either directly or from the disposal of significant amounts of produced water within the play. For the purpose of assessing the hazard due to these earthquakes, should they be treated differently from natural earthquakes? Previous studies suggest that induced seismicity may differ from natural seismicity in clustering characteristics or frequency-magnitude distributions (e.g., Bachmann et al., GJI, 2011; Llenos and Michael, BSSA, 2013). These differences could affect time-independent hazard computations, which typically assume that clustering and size distribution remain constant. In Oklahoma, as well as other areas of suspected induced seismicity, we find that earthquakes since 2009 tend to be considerably more clustered in space and time than before 2009. However differences between various regional and national catalogs leave unclear whether there are significant changes in magnitude distribution. Whether they are due to natural or industrial causes, the increased earthquake rates in these areas could increase the hazard in ways that are not accounted for in current hazard assessment practice.
Clearly the possibility of induced earthquakes needs to be considered in seismic hazard assessments.

  3. Prospective Cohort Study of Work Functioning Impairment and Subsequent Absenteeism Among Japanese Workers.

    PubMed

    Fujino, Yoshihisa; Shazuki, Shuichiro; Izumi, Hiroyuki; Uehara, Masamichi; Muramatsu, Keiji; Kubo, Tatsuhiko; Oyama, Ichiro; Matsuda, Shinya

    2016-07-01

    This study examined the association between work functioning impairment, as measured by the work functioning impairment scale (WFun), and subsequent sick leave. A prospective cohort study was conducted at a manufacturer in Japan, and 1263 employees participated. Information on sick leave was gathered during an 18-month follow-up period. The hazard ratios (HRs) of long-term sick leave were substantially increased for those with a WFun score greater than 25 (HR = 3.99, P = 0.003). The incidence rate ratios (IRRs) of days of short-term absence increased gradually with WFun score (IRR = 1.18, P < 0.001 for subjects with WFun over 25 compared with those with WFun of 14 or less). Assessing work functioning impairment is a useful way of classifying risk for future sick leave among employees.
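
    The incidence rate ratio reported above compares event rates per unit of person-time between exposure groups. A minimal sketch of the calculation, with all counts hypothetical (the study's raw person-time is not reproduced here):

```python
def incidence_rate_ratio(events_a, person_time_a, events_b, person_time_b):
    """IRR: the event rate in group A divided by the rate in group B,
    where each rate is events per unit of person-time."""
    rate_a = events_a / person_time_a
    rate_b = events_b / person_time_b
    return rate_a / rate_b

# Hypothetical counts: 30 absence days over 1000 person-months in the
# high-WFun group vs. 25 days over 983 person-months in the low group.
irr = incidence_rate_ratio(30, 1000, 25, 983)
```

    An IRR above 1 indicates a higher absence rate in the first group; the study's IRR of 1.18 corresponds to an 18% higher rate of short-term absence days.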

  4. Occupational skin diseases in Washington State, 1989 through 1993: using workers' compensation data to identify cutaneous hazards.

    PubMed

    Kaufman, J D; Cohen, M A; Sama, S R; Shields, J W; Kalat, J

    1998-07-01

    This study sought to characterize occupational dermatoses and cutaneous hazards. Workers' compensation claims filed for skin disease in the Washington State Fund were analyzed for 1989 through 1993; incidence rates for industries and employers were calculated, and cutaneous hazards associated with the highest rates were identified. A total of 7445 claims were filed for skin disorders, principally contact dermatitis; 675 (9.1%) involved more than 3 missed work-days. The rate of accepted skin disorder claims was 1.0 per 1000 full-time employee-years. The highest incidence rates (4.6 to 30.7 accepted claims per 1000 full-time employee-years) were in certain manufacturing industries (plastics related, concrete products, aircraft parts, sporting goods, and boat building), wholesale farm product raw materials, automotive glass replacement, and beauty shops. Seven of the 10 employers with the highest incidence rates (19.6 to 85.5 accepted claims per 1000 full-time employee-years) used fiber-reinforced plastics (composites) and exposed workers to epoxy and other resin systems associated with contact dermatitis. Workers' compensation data identify known and emerging workplace cutaneous hazards and show promise for targeting prevention efforts.

  5. Occupational skin diseases in Washington State, 1989 through 1993: using workers' compensation data to identify cutaneous hazards.

    PubMed Central

    Kaufman, J D; Cohen, M A; Sama, S R; Shields, J W; Kalat, J

    1998-01-01

    OBJECTIVES: This study sought to characterize occupational dermatoses and cutaneous hazards. METHODS: Workers' compensation claims filed for skin disease in the Washington State Fund were analyzed for 1989 through 1993; incidence rates for industries and employers were calculated, and cutaneous hazards associated with the highest rates were identified. RESULTS: A total of 7445 claims were filed for skin disorders, principally contact dermatitis; 675 (9.1%) involved more than 3 missed work-days. The rate of accepted skin disorder claims was 1.0 per 1000 full-time employee-years. The highest incidence rates (4.6 to 30.7 accepted claims per 1000 full-time employee-years) were in certain manufacturing industries (plastics related, concrete products, aircraft parts, sporting goods, and boat building), wholesale farm product raw materials, automotive glass replacement, and beauty shops. Seven of the 10 employers with the highest incidence rates (19.6 to 85.5 accepted claims per 1000 full-time employee-years) used fiber-reinforced plastics (composites) and exposed workers to epoxy and other resin systems associated with contact dermatitis. CONCLUSIONS: Workers' compensation data identify known and emerging workplace cutaneous hazards and show promise for targeting prevention efforts. PMID:9663152

  6. Comparison of smoothing methods for the development of a smoothed seismicity model for Alaska and the implications for seismic hazard

    NASA Astrophysics Data System (ADS)

    Moschetti, M. P.; Mueller, C. S.; Boyd, O. S.; Petersen, M. D.

    2013-12-01

    In anticipation of the update of the Alaska seismic hazard maps (ASHMs) by the U.S. Geological Survey, we report progress on the comparison of smoothed seismicity models developed using fixed and adaptive smoothing algorithms, and investigate the sensitivity of seismic hazard to the models. While fault-based sources, such as those for great earthquakes in the Alaska-Aleutian subduction zone and for the ~10 shallow crustal faults within Alaska, dominate the seismic hazard estimates for locations near the sources, smoothed seismicity rates make important contributions to seismic hazard away from fault-based sources and where knowledge of recurrence and magnitude is not sufficient for use in hazard studies. Recent developments in adaptive smoothing methods and statistical tests for evaluating and comparing rate models prompt us to investigate the appropriateness of adaptive smoothing for the ASHMs. We develop smoothed seismicity models for Alaska using fixed and adaptive smoothing methods and compare the resulting models by calculating and evaluating the joint likelihood test. We use the earthquake catalog, and associated completeness levels, developed for the 2007 ASHM to produce fixed-bandwidth-smoothed models with smoothing distances varying from 10 to 100 km, as well as adaptively smoothed models. Adaptive smoothing follows the method of Helmstetter et al. and defines a unique smoothing distance for each earthquake epicenter from the distance to the nth nearest neighbor. The consequence of the adaptive smoothing methods is to reduce smoothing distances where seismicity rates are high, causing locally increased seismicity rates, and to increase smoothing distances where seismicity is sparse. We follow guidance from previous studies to optimize the neighbor number (n-value) by comparing model likelihood values, which estimate the likelihood that the observed earthquake epicenters from the recent catalog are derived from the smoothed rate models. 
We compare likelihood values from all rate models to rank the smoothing methods. We find that adaptively smoothed seismicity models yield better likelihood values than the fixed smoothing models. Holding all other (source and ground motion) models constant, we calculate seismic hazard curves for all points across Alaska on a 0.1 degree grid, using the adaptively smoothed and fixed smoothed seismicity models separately. Because adaptively smoothed models concentrate seismicity near the earthquake epicenters where seismicity rates are high, the corresponding hazard values are higher, locally, but reduced with distance from observed seismicity, relative to the hazard from fixed-bandwidth models. We suggest that adaptively smoothed seismicity models be considered for implementation in the update to the ASHMs because of their improved likelihood estimates relative to fixed smoothing methods; however, concomitant increases in seismic hazard will cause significant changes in regions of high seismicity, such as near the subduction zone, northeast of Kotzebue, and along the NNE trending zone of seismicity in the Alaskan interior.
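
    The adaptive smoothing described above can be sketched in a few lines: each epicenter's bandwidth is the distance to its n-th nearest neighbour, and the rate at any point is a sum of Gaussian kernels. A minimal sketch under those assumptions (toy coordinates; the Helmstetter et al. kernel normalisation and geographic distance handling are simplified):

```python
import math

def adaptive_bandwidths(epicenters, n=3):
    """For each epicenter, use the distance to its n-th nearest
    neighbour as the smoothing distance (adaptive bandwidth)."""
    bandwidths = []
    for i, (xi, yi) in enumerate(epicenters):
        dists = sorted(math.hypot(xi - xj, yi - yj)
                       for j, (xj, yj) in enumerate(epicenters) if j != i)
        bandwidths.append(dists[n - 1])
    return bandwidths

def smoothed_rate(x, y, epicenters, bandwidths):
    """Smoothed seismicity rate at (x, y): each event contributes a
    2-D Gaussian kernel with its own adaptive bandwidth."""
    rate = 0.0
    for (xi, yi), d in zip(epicenters, bandwidths):
        r2 = (x - xi) ** 2 + (y - yi) ** 2
        rate += math.exp(-r2 / (2 * d * d)) / (2 * math.pi * d * d)
    return rate
```

    Because tightly clustered epicenters get small bandwidths, the rate is concentrated near observed seismicity, while isolated events are smoothed over larger distances, which is exactly the behaviour the abstract describes.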

  7. Comparison of smoothing methods for the development of a smoothed seismicity model for Alaska and the implications for seismic hazard

    USGS Publications Warehouse

    Moschetti, Morgan P.; Mueller, Charles S.; Boyd, Oliver S.; Petersen, Mark D.

    2014-01-01

    In anticipation of the update of the Alaska seismic hazard maps (ASHMs) by the U.S. Geological Survey, we report progress on the comparison of smoothed seismicity models developed using fixed and adaptive smoothing algorithms, and investigate the sensitivity of seismic hazard to the models. While fault-based sources, such as those for great earthquakes in the Alaska-Aleutian subduction zone and for the ~10 shallow crustal faults within Alaska, dominate the seismic hazard estimates for locations near the sources, smoothed seismicity rates make important contributions to seismic hazard away from fault-based sources and where knowledge of recurrence and magnitude is not sufficient for use in hazard studies. Recent developments in adaptive smoothing methods and statistical tests for evaluating and comparing rate models prompt us to investigate the appropriateness of adaptive smoothing for the ASHMs. We develop smoothed seismicity models for Alaska using fixed and adaptive smoothing methods and compare the resulting models by calculating and evaluating the joint likelihood test. We use the earthquake catalog, and associated completeness levels, developed for the 2007 ASHM to produce fixed-bandwidth-smoothed models with smoothing distances varying from 10 to 100 km, as well as adaptively smoothed models. Adaptive smoothing follows the method of Helmstetter et al. and defines a unique smoothing distance for each earthquake epicenter from the distance to the nth nearest neighbor. The consequence of the adaptive smoothing methods is to reduce smoothing distances where seismicity rates are high, causing locally increased seismicity rates, and to increase smoothing distances where seismicity is sparse. We follow guidance from previous studies to optimize the neighbor number (n-value) by comparing model likelihood values, which estimate the likelihood that the observed earthquake epicenters from the recent catalog are derived from the smoothed rate models. 
We compare likelihood values from all rate models to rank the smoothing methods. We find that adaptively smoothed seismicity models yield better likelihood values than the fixed smoothing models. Holding all other (source and ground motion) models constant, we calculate seismic hazard curves for all points across Alaska on a 0.1 degree grid, using the adaptively smoothed and fixed smoothed seismicity models separately. Because adaptively smoothed models concentrate seismicity near the earthquake epicenters where seismicity rates are high, the corresponding hazard values are higher, locally, but reduced with distance from observed seismicity, relative to the hazard from fixed-bandwidth models. We suggest that adaptively smoothed seismicity models be considered for implementation in the update to the ASHMs because of their improved likelihood estimates relative to fixed smoothing methods; however, concomitant increases in seismic hazard will cause significant changes in regions of high seismicity, such as near the subduction zone, northeast of Kotzebue, and along the NNE trending zone of seismicity in the Alaskan interior.

  8. Why is Probabilistic Seismic Hazard Analysis (PSHA) still used?

    NASA Astrophysics Data System (ADS)

    Mulargia, Francesco; Stark, Philip B.; Geller, Robert J.

    2017-03-01

    Even though it has never been validated by objective testing, Probabilistic Seismic Hazard Analysis (PSHA) has been widely used for almost 50 years by governments and industry in applications with lives and property hanging in the balance, such as deciding safety criteria for nuclear power plants, making official national hazard maps, developing building code requirements, and determining earthquake insurance rates. PSHA rests on assumptions now known to conflict with earthquake physics; many damaging earthquakes, including the 1988 Spitak, Armenia, event and the 2011 Tohoku, Japan, event, have occurred in regions rated relatively low-risk by PSHA hazard maps. No extant method, including PSHA, produces reliable estimates of seismic hazard. Earthquake hazard mitigation should be recognized to be inherently political, involving a tradeoff between uncertain costs and uncertain risks. Earthquake scientists, engineers, and risk managers can make important contributions to the hard problem of allocating limited resources wisely, but government officials and stakeholders must take responsibility for the risks of accidents due to natural events that exceed the adopted safety criteria.

  9. Atmospheric Ionizing Radiation (AIR) ER-2 Preflight Analysis

    NASA Technical Reports Server (NTRS)

    Tai, Hsiang; Wilson, John W.; Maiden, D. L.

    1998-01-01

    Atmospheric ionizing radiation (AIR) produces chemically active radicals in biological tissues that alter the cell function or result in cell death. The AIR ER-2 flight measurements will enable scientists to study the radiation risk associated with the high-altitude operation of a commercial supersonic transport. The ER-2 radiation measurement flights will follow predetermined, carefully chosen courses to provide an appropriate database matrix which will enable the evaluation of predictive modeling techniques. Explicit scientific results such as dose rate, dose equivalent rate, magnetic cutoff, neutron flux, and air ionization rate associated with those flights are predicted by using the AIR model. Through these flight experiments, we will further increase our knowledge and understanding of the AIR environment and our ability to assess the risk from the associated hazard.

  10. Hypoglycaemia with oral antidiabetic drugs: results from prescription-event monitoring cohorts of rosiglitazone, pioglitazone, nateglinide and repaglinide.

    PubMed

    Vlckova, Veronika; Cornelius, Victoria; Kasliwal, Rachna; Wilton, Lynda; Shakir, Saad A W

    2009-01-01

    Hypoglycaemia is an acute complication associated with intensive treatment of patients with diabetes mellitus. This complication poses a major challenge in diabetes management. Furthermore, severe hypoglycaemia may be life threatening. Although hypoglycaemia is more often associated with insulin treatment, oral hypoglycaemic agents have the potential to trigger hypoglycaemia. The aim of this study was to quantify the incidence of hypoglycaemic events and to describe the pattern of these incident events during the first 9 months of treatment with four oral antidiabetic drugs, rosiglitazone, pioglitazone, nateglinide and repaglinide, prescribed in general practice in England. We used data collected for prescription-event monitoring (PEM) studies of rosiglitazone, pioglitazone, nateglinide and repaglinide. PEM is an observational, non-interventional, inception cohort study. Observation time for each patient and the incidence rate (IR) of hypoglycaemia per 1000 patient-years of treatment were calculated for each drug cohort. Smoothed hazard estimates were plotted over time. Case/non-case analysis was performed to describe and compare patients who had at least one hypoglycaemic event in the first 9 months of treatment with those who did not. The total number of patients included in the analysis was 14,373, 12,768, 4,549 and 5,727 in the rosiglitazone, pioglitazone, nateglinide and repaglinide cohorts, respectively. Of these, 276 patients experienced at least one episode of hypoglycaemia. The IR was between 50% and 100% higher in patients receiving treatment with meglitinides compared with those treated with the thiazolidinediones (TZDs) [IR = 9.94, 9.64, 15.71 and 20.32 per 1000 patient-years for rosiglitazone, pioglitazone, nateglinide and repaglinide, respectively]. 
The plot of the hazard function and the estimated shape parameter from the Weibull regression model showed that pioglitazone, nateglinide and repaglinide had non-constant (decreasing) hazards over time, whereas the hazard for rosiglitazone-treated patients was approximately constant over time. Nateglinide and repaglinide had similarly shaped hazard functions, indicating a significantly higher number of hypoglycaemic episodes shortly after starting treatment. Hypoglycaemia was reported more frequently for women treated with TZDs than for men. This analysis shows that the frequency of reported hypoglycaemia within the study cohorts was relatively low. The rates of hypoglycaemia were not equal between drug classes. Treatment with nateglinide or repaglinide was characterized by a higher incidence of hypoglycaemia at the beginning of treatment. Further investigation is necessary to assess whether women treated with TZDs are more prone to hypoglycaemia than men. Findings from this study should be considered alongside other clinical and pharmacoepidemiological studies.
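
    The Weibull shape parameter discussed above determines whether the hazard falls, stays flat, or rises over time. A minimal sketch of the generic Weibull hazard (not the study's fitted model):

```python
def weibull_hazard(t, shape, scale=1.0):
    """Weibull hazard h(t) = (k/lam) * (t/lam)**(k - 1), with shape k
    and scale lam. k < 1 gives a decreasing hazard (many events early,
    as reported for the meglitinides); k = 1 gives a constant hazard
    (exponential, as for rosiglitazone); k > 1 an increasing hazard."""
    return (shape / scale) * (t / scale) ** (shape - 1)
```

    For example, with shape 0.5 the hazard at t = 4 is half the hazard at t = 1, matching the pattern of hypoglycaemic episodes concentrated shortly after treatment start.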

  11. Cumulative Incidence of Cancer Among Persons With HIV in North America: A Cohort Study.

    PubMed

    Silverberg, Michael J; Lau, Bryan; Achenbach, Chad J; Jing, Yuezhou; Althoff, Keri N; D'Souza, Gypsyamber; Engels, Eric A; Hessol, Nancy A; Brooks, John T; Burchell, Ann N; Gill, M John; Goedert, James J; Hogg, Robert; Horberg, Michael A; Kirk, Gregory D; Kitahata, Mari M; Korthuis, Philip T; Mathews, William C; Mayor, Angel; Modur, Sharada P; Napravnik, Sonia; Novak, Richard M; Patel, Pragna; Rachlis, Anita R; Sterling, Timothy R; Willig, James H; Justice, Amy C; Moore, Richard D; Dubrow, Robert

    2015-10-06

    Cancer is increasingly common among persons with HIV. To examine calendar trends in cumulative cancer incidence and hazard rate by HIV status. Cohort study. North American AIDS Cohort Collaboration on Research and Design during 1996 to 2009. 86 620 persons with HIV and 196 987 uninfected adults. Cancer type-specific cumulative incidence by age 75 years and calendar trends in cumulative incidence and hazard rates, each by HIV status. Cumulative incidences of cancer by age 75 years for persons with and without HIV, respectively, were as follows: Kaposi sarcoma, 4.4% and 0.01%; non-Hodgkin lymphoma, 4.5% and 0.7%; lung cancer, 3.4% and 2.8%; anal cancer, 1.5% and 0.05%; colorectal cancer, 1.0% and 1.5%; liver cancer, 1.1% and 0.4%; Hodgkin lymphoma, 0.9% and 0.09%; melanoma, 0.5% and 0.6%; and oral cavity/pharyngeal cancer, 0.8% and 0.8%. Among persons with HIV, calendar trends in cumulative incidence and hazard rate decreased for Kaposi sarcoma and non-Hodgkin lymphoma. For anal, colorectal, and liver cancer, cumulative incidence trends increased while hazard rate trends did not, owing to the decreasing mortality rate trend (-9% per year), which allowed greater opportunity for cancer to be diagnosed. Despite decreasing hazard rate trends for lung cancer, Hodgkin lymphoma, and melanoma, corresponding decreases in cumulative incidence were not seen because of the compensating effect of the declining mortality rate. Secular trends in screening, smoking, and viral co-infections were not evaluated. Cumulative cancer incidence by age 75 years, approximating lifetime risk in persons with HIV, may have clinical utility in this population. The high cumulative incidences by age 75 years for Kaposi sarcoma, non-Hodgkin lymphoma, and lung cancer support early and sustained antiretroviral therapy and smoking cessation.
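
    Hazard rates and cumulative incidence of the kind compared above can be estimated nonparametrically from time-to-event data. A minimal sketch using the Nelson–Aalen cumulative hazard on toy data (not the cohort's data, and ignoring the competing risk of death that the study accounts for):

```python
import math

def nelson_aalen(times, events):
    """Nelson-Aalen cumulative hazard from (time, event) pairs, where
    event=1 means the outcome occurred and 0 means censored.
    Returns a list of (event time, H(t)) with H(t) = sum of d_i/n_i."""
    data = sorted(zip(times, events))
    n = len(data)
    H, curve, at_risk, i = 0.0, [], n, 0
    while i < n:
        t = data[i][0]
        d = c = 0
        while i < n and data[i][0] == t:  # tally deaths/censorings at t
            if data[i][1]:
                d += 1
            else:
                c += 1
            i += 1
        if d:
            H += d / at_risk
            curve.append((t, H))
        at_risk -= d + c
    return curve

def cumulative_incidence(H):
    """Convert a cumulative hazard into a cumulative incidence."""
    return 1 - math.exp(-H)
```

    The conversion at the end illustrates the abstract's point: cumulative incidence depends not only on the hazard but on how long subjects remain at risk, so falling mortality can raise cumulative incidence even when the hazard rate is flat or falling.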

  12. Cognitive-Behavioral Therapy to Prevent Relapse in Pediatric Responders to Pharmacotherapy for Major Depressive Disorder

    PubMed Central

    Kennard, Betsy D.; Emslie, Graham J.; Mayes, Taryn L.; Nightingale-Teresi, Jeanne; Nakonezny, Paul A.; Hughes, Jennifer L.; Jones, Jessica M.; Tao, Rongrong; Stewart, Sunita M.; Jarrett, Robin B.

    2010-01-01

    Objective We present results of a feasibility test of a sequential treatment strategy using continuation phase cognitive-behavioral therapy (CBT) to prevent relapse in youths with major depressive disorder (MDD) who have responded to acute phase pharmacotherapy. Method Forty-six youths (ages 11–18 years) who had responded to 12 weeks of treatment with fluoxetine were randomized to receive either 6 months of continued antidepressant medication management (MM) or antidepressant MM plus relapse prevention CBT (MM+CBT). Primary outcome was time to relapse, defined as a Childhood Depression Rating Scale-Revised score of 40 or higher and 2 weeks of symptom worsening, or clinical deterioration warranting alteration of treatment to prevent full relapse. Results Cox proportional hazards regression, adjusting for depression severity at randomization and for the hazard of relapsing by age across the trial, revealed that participants in the MM treatment group had a significantly greater risk for relapse than those in the MM+CBT treatment group (hazard ratio = 8.80; 95% confidence interval 1.01–76.89; χ2 = 3.86, p = .049) during 6 months of continuation treatment. In addition, patient satisfaction was significantly higher in the MM+CBT group. No differences were found between the two treatment groups in attrition rate, serious adverse events, or overall global functioning. Conclusions These preliminary results suggest that, among pharmacotherapy responders, continuation phase CBT reduces the risk for relapse roughly eightfold compared with antidepressant medication management alone during the 6-month continuation phase. PMID:18978634

  13. Hazard Identification, Risk Assessment, and Control Measures as an Effective Tool of Occupational Health Assessment of Hazardous Process in an Iron Ore Pelletizing Industry.

    PubMed

    Rout, B K; Sikdar, B K

    2017-01-01

    With the growing number of iron ore pelletization industries in India, various impacts on environment and health in relation to the workplace will rise. Therefore, understanding the hazardous process is crucial in the development of effective control measures. Hazard Identification, Risk Assessment, and Control measures (HIRAC) acts as an effective tool of Occupational Health Assessment. The aim of the study was to identify all the possible hazards at different workplaces of an iron ore pelletizing industry, to conduct an occupational health risk assessment, to calculate the risk rating based on the risk matrix, and to compare the risk rating before and after the control measures. The research was a cross-sectional study done from March to December 2015 in an iron ore pelletizing industry located in Odisha, India. Data from the survey were collected by inspecting the workplace, recording responses of employees regarding possible hazards in their workplace, and reviewing the department procedure manual, work instructions, standard operating procedures, previous incident reports, material safety data sheets, the first aid/injury register, and health records of employees. A total of 116 hazards were identified. Results of the paired-samples t-test showed that the mean risk rating differed significantly before control measures were taken (M = 9.13, SD = 5.99) and after (M = 2.80, SD = 1.38) at the 0.0001 level of significance (t = 12.6428, df = 115, N = 116, P < 0.0001, 95% CI for mean difference 5.34 to 7.32). On average, risk ratings were about 6.33 points lower after control measures were taken. The hazards having high-risk rating and above were reduced to a level considered As Low as Reasonably Practicable (ALARP) when the control measures were applied, thereby reducing the occurrence of injury or disease in the workplace.
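
    The paired-samples t statistic used above can be computed directly from per-hazard before/after risk ratings. A minimal sketch with made-up ratings for five hazards (the study's raw data are not reproduced here):

```python
import math

def paired_t(before, after):
    """Paired-samples t statistic: t = mean(d) / (sd(d)/sqrt(n)),
    where d are the per-item differences (before - after).
    Returns (t, degrees of freedom)."""
    d = [b - a for b, a in zip(before, after)]
    n = len(d)
    mean_d = sum(d) / n
    var_d = sum((x - mean_d) ** 2 for x in d) / (n - 1)  # sample variance
    return mean_d / math.sqrt(var_d / n), n - 1

# Hypothetical before/after risk ratings for five hazards:
t, df = paired_t([10, 8, 12, 9, 14], [3, 2, 4, 3, 4])
```

    A large positive t with df = n - 1 indicates that ratings dropped consistently after the control measures, which is the pattern the study reports for its 116 hazards.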

  14. 77 FR 64759 - Rescission of 10-Day Agency Discretionary Period in Assigning Unsatisfactory Safety Ratings

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-23

    ... hazardous materials carriers must cease operation after receiving a proposed unsatisfactory safety rating... Carrier Safety Act of 1990 (1990 Act) (section 15 of the Sanitary Food Transportation Act of 1990, Pub. L. 101-500, 104 Stat. 1218) amended the Hazardous Materials Transportation Act to prohibit motor carriers...

  15. RATES OF IRON OXIDATION AND ARSENIC SORPTION DURING GROUND WATER-SURFACE WATER MIXING AT A HAZARDOUS WASTE SITE

    EPA Science Inventory

    The fate of arsenic discharged from contaminated ground water to a pond at a hazardous waste site is controlled, in part, by the rate of ferrous iron oxidation-precipitation and arsenic sorption. Laboratory experiments were conducted using site-derived water to assess the impact...

  16. Development of a satellite-based hazard rating system for Dendroctonus frontalis (Coleoptera: Scolytidae) in the Ouachita Mountains of Arkansas

    Treesearch

    Stephen Cook; Shane Cherry; Karen Humes; James Guldin; Christopher Williams

    2007-01-01

    The southern pine beetle, Dendroctonus frontalis Zimmermann (Coleoptera: Scolytidae), is the most damaging forest insect pest of pines (Pinus spp.) throughout the southeastern United States. Hazard rating schemes have been developed for D. frontalis, but for these schemes to be accurate and effective, they...

  17. Natural hazard fatalities in Switzerland from 1946 to 2015

    NASA Astrophysics Data System (ADS)

    Badoux, Alexandre; Andres, Norina; Techel, Frank; Hegg, Christoph

    2016-12-01

    A database of fatalities caused by natural hazard processes in Switzerland was compiled for the period between 1946 and 2015. Using information from the Swiss flood and landslide damage database and the Swiss destructive avalanche database, the data set was extended back in time and more hazard processes were added by conducting an in-depth search of newspaper reports. The new database now covers all natural hazards common in Switzerland, categorised into seven process types: flood, landslide, rockfall, lightning, windstorm, avalanche and other processes (e.g. ice avalanches, earthquakes). Included were all fatal accidents associated with natural hazard processes in which victims did not expose themselves to an important danger on purpose. The database contains information on 635 natural hazard events causing 1023 fatalities, which corresponds to a mean of 14.6 victims per year. The most common causes of death were snow avalanches (37 %), followed by lightning (16 %), floods (12 %), windstorms (10 %), rockfall (8 %), landslides (7 %) and other processes (9 %). About 50 % of all victims died in one of the 507 single-fatality events; the other half were killed in the 128 multi-fatality events. The number of natural hazard fatalities that occurred annually during our 70-year study period ranged from 2 to 112 and exhibited a distinct decrease over time. While the number of victims in the first three decades (until 1975) ranged from 191 to 269 per decade, it ranged from 47 to 109 in the four following decades. This overall decrease was mainly driven by a considerable decline in the number of avalanche and lightning fatalities. About 75 % of victims were males in all natural hazard events considered together, and this ratio was roughly maintained in all individual process categories except landslides (lower) and other processes (higher). 
The ratio of male to female victims was most likely to be balanced when deaths occurred at home (in or near a building), a situation that mainly occurred in association with landslides and avalanches. The average age of victims of natural hazards was 35.9 years and, accordingly, the age groups with the largest number of victims were the 20-29 and 30-39 year-old groups, which in combination represented 34 % of all fatalities. It appears that the overall natural hazard mortality rate in Switzerland over the past 70 years has been relatively low in comparison to rates in other countries or rates of other types of fatal accidents in Switzerland. However, a large variability in mortality rates was observed within the country with considerably higher rates in Alpine environments.

  18. 44 CFR 64.3 - Flood Insurance Maps.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... with water surface elevations determined A0 Area of special flood hazards having shallow water depths... insurance rating purposes AH Areas of special flood hazards having shallow water depths and/or unpredictable... of special flood hazards having shallow water depths and/or unpredictable flow paths between (1) and...

  19. Computation of nonparametric convex hazard estimators via profile methods.

    PubMed

    Jankowski, Hanna K; Wellner, Jon A

    2009-05-01

    This paper proposes a profile likelihood algorithm to compute the nonparametric maximum likelihood estimator of a convex hazard function. The maximisation is performed in two steps: first, the support reduction algorithm is used to maximise the likelihood over all hazard functions with a given point of minimum (or antimode); then it is shown that the profile (or partially maximised) likelihood is quasi-concave as a function of the antimode, so that a bisection algorithm can be applied to find the maximum of the profile likelihood, and hence also the global maximum. The new algorithm is illustrated using both artificial and real data, including lifetime data for Canadian males and females.
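
    The outer step described above, maximising a quasi-concave profile likelihood over the antimode, can be carried out with any bracketing search. A minimal sketch using golden-section search on a generic unimodal function (the inner support-reduction step, which produces the profile likelihood itself, is not reproduced):

```python
def maximize_unimodal(f, lo, hi, tol=1e-8):
    """Golden-section search for the maximiser of a unimodal
    (quasi-concave) function f on [lo, hi]. Here f plays the role of
    the partially maximised (profile) likelihood evaluated at a
    candidate antimode."""
    invphi = (5 ** 0.5 - 1) / 2  # 1/golden ratio ~ 0.618
    a, b = lo, hi
    c, d = b - invphi * (b - a), a + invphi * (b - a)
    while b - a > tol:
        if f(c) > f(d):
            b, d = d, c          # maximiser lies in [a, d]
            c = b - invphi * (b - a)
        else:
            a, c = c, d          # maximiser lies in [c, b]
            d = a + invphi * (b - a)
    return (a + b) / 2
```

    Quasi-concavity is exactly what such a search needs: it guarantees a single bracketable peak, so the interval can be shrunk by a constant factor each iteration without risk of discarding the maximiser.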

  20. Sports eyewear protective standards.

    PubMed

    Dain, Stephen J

    2016-01-01

    Eye injuries sustained during sport comprise up to 20 per cent of all injuries to the eye serious enough for medical attention to be sought. The prevalence of eye injuries in sport is not easily assessed due to lack of authoritative participation rates, so most studies report total numbers in a time period. The evidence on the proportion of all ocular injuries that are from sport is reviewed. The relative frequencies in different sports are compared in a qualitative manner and the sports with greater numbers of ocular injuries are detailed. In common with occupational injuries to the eye, most sports eye injuries are considered preventable. The hierarchy of action for occupational risk is detailed and adapted to use in a sports scenario. All the available international, regional and national standards on sports eye protection are detailed and their provisions compared. The major function of the standards is to provide adequate protection against the hazard of the sport concerned. These are detailed and compared as a function of energy transfer. Eye protection must not introduce additional or secondary hazards (for instance, fracturing into sharp fragments on impact) and not introduce features that would deter the wearing of eye protection (for instance, restricting field of view to impede playing the sport). The provisions of the standards intended to limit secondary hazards are detailed and compared. The need for future work in standards writing and the activities of the International Standardization Organization in sports eye protection are detailed. © 2016 Optometry Australia.

  1. The natural progression and outcomes of adrenal incidentaloma: a systematic review and meta-analysis.

    PubMed

    Loh, Huai H; Yee, Anne; Loh, Huai S; Sukor, Norlela; Kamaruddin, Nor A

    2017-03-01

    The long-term outcome of patients with adrenal incidentaloma (AI) is unknown. The aim of this study was to systematically summarize the follow-up and outcome of clinically silent AI in patients who do not undergo surgery. All major databases and English-language medical literature published from 1998 to May 2015 were systematically searched for publications on AI. The primary endpoint was hormonal hyperfunction; secondary endpoints were time from diagnosis to study endpoint and the outcome of adrenalectomy. Meta-analysis was performed using both qualitative and quantitative approaches. A total of 11 publications were included. The total sample size was 1298 patients. Mean follow-up duration was 44.2 months. There were 82 patients confirmed to have subclinical Cushing's syndrome at diagnosis, with 1.79% new cases at the end of follow-up (95% CI, 0.002 to 0.045). The incidence of Cushing's syndrome was 0.7% (95% CI, 0.001 to 0.013) and of pheochromocytoma 0.4% (95% CI, 0.001 to 0.008). The mean tumor size was 2.52 cm, with a mean increment of 0.03 cm, to 2.9 cm at the end of follow-up. About 3% of patients underwent surgery (95% CI, 0.01 to 0.05), but none because of primary adrenal malignancy. The times of greatest risk of developing Cushing's syndrome and pheochromocytoma were between months 36 and 42 (hazard rate 14%) and between months 48 and 54 (hazard rate 7%), respectively. Malignant change in non-functioning AI is rare. The risk of developing overt disease over the follow-up period is low. A less stringent imaging and functional work-up interval can be considered.
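
    The interval hazard rates quoted above (e.g. 14% between months 36 and 42) are of the form events per interval among those still at risk. A minimal actuarial sketch, with all counts hypothetical:

```python
def interval_hazard(events, at_risk_start, withdrawals=0):
    """Actuarial interval hazard: events during the interval divided by
    the effective number at risk, counting those who withdrew (were
    censored) during the interval as at risk for half of it."""
    effective_at_risk = at_risk_start - withdrawals / 2
    return events / effective_at_risk
```

    For instance, with a hypothetical 100 patients still under follow-up entering a 6-month interval and 14 developing the outcome, the interval hazard is 0.14; the hazard rises over a specific interval (as reported here for months 36-42) when events cluster there relative to the shrinking at-risk pool.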

  2. Lean body mass predicts long-term survival in Chinese patients on peritoneal dialysis.

    PubMed

    Huang, Jenq-Wen; Lien, Yu-Chung; Wu, Hon-Yen; Yen, Chung-Jen; Pan, Chun-Chun; Hung, Tsai-Wei; Su, Chi-Ting; Chiang, Chih-Kang; Cheng, Hui-Teng; Hung, Kuan-Yu

    2013-01-01

    Reduced lean body mass (LBM) is one of the main indicators in malnutrition inflammation syndrome among patients on dialysis. However, the influence of LBM on peritoneal dialysis (PD) patients' outcomes and the factors related to increasing LBM are seldom reported. We enrolled 103 incident PD patients between 2002 and 2003, and followed them until December 2011. Clinical characteristics, PD-associated parameters, residual renal function, and serum chemistry profiles of each patient were collected at 1 month and 1 year after initiating PD. LBM was estimated using creatinine index corrected with body weight. Multiple linear regression analysis, Kaplan-Meier survival analysis, and Cox regression proportional hazard analysis were used to define independent variables and compare survival between groups. Using the median LBM value (70% for men and 64% for women), patients were divided into group 1 (n = 52; low LBM) and group 2 (n = 51; high LBM). Group 1 patients had higher rates of peritonitis (1.6 vs. 1.1/100 patient months; p<0.05) and hospitalization (14.6 vs. 9.7/100 patient months; p<0.05). Group 1 patients also had shorter overall survival and technique survival (p<0.01). Each percentage point increase in LBM reduced the hazard ratio for mortality by 8% after adjustment for diabetes, age, sex, and body mass index (BMI). Changes in residual renal function and protein catabolic rate were independently associated with changes in LBM in the first year of PD. LBM serves as a good parameter in addition to BMI to predict the survival of patients on PD. Preserving residual renal function and increasing protein intake can increase LBM.

  3. Lean Body Mass Predicts Long-Term Survival in Chinese Patients on Peritoneal Dialysis

    PubMed Central

    Huang, Jenq-Wen; Lien, Yu-Chung; Wu, Hon-Yen; Yen, Chung-Jen; Pan, Chun-Chun; Hung, Tsai-Wei; Su, Chi-Ting; Chiang, Chih-Kang; Cheng, Hui-Teng; Hung, Kuan-Yu

    2013-01-01

    Background Reduced lean body mass (LBM) is one of the main indicators in malnutrition inflammation syndrome among patients on dialysis. However, the influence of LBM on peritoneal dialysis (PD) patients’ outcomes and the factors related to increasing LBM are seldom reported. Methods We enrolled 103 incident PD patients between 2002 and 2003, and followed them until December 2011. Clinical characteristics, PD-associated parameters, residual renal function, and serum chemistry profiles of each patient were collected at 1 month and 1 year after initiating PD. LBM was estimated using creatinine index corrected with body weight. Multiple linear regression analysis, Kaplan–Meier survival analysis, and Cox regression proportional hazard analysis were used to define independent variables and compare survival between groups. Results Using the median LBM value (70% for men and 64% for women), patients were divided into group 1 (n = 52; low LBM) and group 2 (n = 51; high LBM). Group 1 patients had higher rates of peritonitis (1.6 vs. 1.1/100 patient months; p<0.05) and hospitalization (14.6 vs. 9.7/100 patient months; p<0.05). Group 1 patients also had shorter overall survival and technique survival (p<0.01). Each percentage point increase in LBM reduced the hazard ratio for mortality by 8% after adjustment for diabetes, age, sex, and body mass index (BMI). Changes in residual renal function and protein catabolic rate were independently associated with changes in LBM in the first year of PD. Conclusions LBM serves as a good parameter in addition to BMI to predict the survival of patients on PD. Preserving residual renal function and increasing protein intake can increase LBM. PMID:23372806

  4. Toward Building a New Seismic Hazard Model for Mainland China

    NASA Astrophysics Data System (ADS)

    Rong, Y.; Xu, X.; Chen, G.; Cheng, J.; Magistrale, H.; Shen, Z.

    2015-12-01

    At present, the only publicly available seismic hazard model for mainland China was generated by the Global Seismic Hazard Assessment Program in 1999. We are building a new seismic hazard model by integrating historical earthquake catalogs, geological faults, geodetic GPS data, and geology maps. To build the model, we construct an Mw-based homogeneous historical earthquake catalog spanning from 780 B.C. to the present, create fault models from active fault data using the methodology recommended by the Global Earthquake Model (GEM), and derive a strain rate map based on the most complete GPS measurements and a new strain derivation algorithm. We divide China and the surrounding regions into about 20 large seismic source zones based on seismotectonics. For each zone, we use the tapered Gutenberg-Richter (TGR) relationship to model the seismicity rates. We estimate the TGR a- and b-values from the historical earthquake data, and constrain the corner magnitude using the seismic moment rate derived from the strain rate. From the TGR distributions, 10,000 to 100,000 years of synthetic earthquakes are simulated. Then, we distribute small and medium earthquakes according to the locations and magnitudes of historical earthquakes. Some large earthquakes are distributed on active faults based on characteristics of the faults, including slip rate, fault length and width, and paleoseismic data; the rest are assigned to the background based on the distributions of historical earthquakes and strain rate. We evaluate available ground motion prediction equations (GMPEs) by comparison to observed ground motions. To apply appropriate GMPEs, we divide the region into active and stable tectonic domains. The seismic hazard will be calculated using the OpenQuake software developed by GEM. To account for site amplifications, we construct a site condition map based on geology maps. The resulting new seismic hazard map can be used for seismic risk analysis and management, and for business and land-use planning.
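The tapered Gutenberg-Richter relationship mentioned above is a pure Gutenberg-Richter power law in seismic moment, multiplied by an exponential roll-off at a corner moment, so that rates of very large events fall below the untapered extrapolation. A minimal sketch with illustrative (not fitted) parameters, not taken from the model described in the record:

```python
import math

def moment(m):
    """Scalar seismic moment (N*m) from moment magnitude (Hanks-Kanamori)."""
    return 10 ** (1.5 * m + 9.05)

def tgr_rate(m, m_min, rate_min, b, m_corner):
    """Annual rate of events with magnitude >= m under a tapered
    Gutenberg-Richter distribution: power law in moment with exponent
    beta = 2b/3, tapered exponentially beyond the corner moment."""
    beta = 2.0 * b / 3.0
    M, Mt, Mc = moment(m), moment(m_min), moment(m_corner)
    return rate_min * (Mt / M) ** beta * math.exp((Mt - M) / Mc)

# Hypothetical single source zone: 2 events/yr above Mw 5, b = 1, corner Mw 8.
r5 = tgr_rate(5.0, m_min=5.0, rate_min=2.0, b=1.0, m_corner=8.0)
r7 = tgr_rate(7.0, m_min=5.0, rate_min=2.0, b=1.0, m_corner=8.0)
r85 = tgr_rate(8.5, m_min=5.0, rate_min=2.0, b=1.0, m_corner=8.0)
```

Constraining the corner magnitude from the geodetic moment rate, as the record describes, amounts to choosing `m_corner` so that the integrated moment release of the distribution matches the strain-rate-derived budget.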

  5. Probabilistic Seismic Hazard Maps for Ecuador

    NASA Astrophysics Data System (ADS)

    Mariniere, J.; Beauval, C.; Yepes, H. A.; Laurence, A.; Nocquet, J. M.; Alvarado, A. P.; Baize, S.; Aguilar, J.; Singaucho, J. C.; Jomard, H.

    2017-12-01

    A probabilistic seismic hazard study is conducted for Ecuador, a country facing high seismic hazard from both megathrust subduction earthquakes and shallow crustal moderate-to-large earthquakes. Building on the knowledge produced in recent years in historical seismicity, earthquake catalogs, active tectonics, geodynamics, and geodesy, several alternative earthquake recurrence models are developed. An area source model is first proposed, based on the seismogenic crustal and inslab sources defined in Yepes et al. (2016). A slightly different segmentation is proposed for the subduction interface with respect to Yepes et al. (2016). Three earthquake catalogs are used to account for the numerous uncertainties in the modeling of frequency-magnitude distributions. The hazard maps obtained highlight several source zones enclosing fault systems that exhibit low seismic activity, not representative of the geological and/or geodetic slip rates. Consequently, a fault model is derived, including faults with an earthquake recurrence model inferred from geological and/or geodetic slip rate estimates. The geodetic slip rates on the set of simplified faults are estimated from a GPS horizontal velocity field (Nocquet et al. 2014); assumptions on the aseismic component of the deformation are required. Combining these alternative earthquake models in a logic tree, and using a set of selected ground-motion prediction equations adapted to Ecuador's different tectonic contexts, a mean hazard map is obtained. Hazard maps corresponding to the 16th and 84th percentiles are also derived, highlighting the zones where uncertainties on the hazard are highest.

  6. Survival Data and Regression Models

    NASA Astrophysics Data System (ADS)

    Grégoire, G.

    2014-12-01

    We start this chapter by introducing some basic elements for the analysis of censored survival data. Then we focus on right-censored data and develop two types of regression models. The first concerns the so-called accelerated failure time (AFT) models, which are parametric models where a function of a parameter depends linearly on the covariates. The second is a semiparametric model, where the covariates enter in a multiplicative form in the expression of the hazard rate function. The main statistical tool for analysing these regression models is the maximum likelihood methodology and, although we recall some essential results about the ML theory, we refer to the chapter "Logistic Regression" for a more detailed presentation.
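The two model families in this record can be contrasted through the hazard rate function h(t) = f(t)/S(t). A minimal sketch, using a Weibull hazard as the parametric baseline and the multiplicative (proportional-hazards) form for the semiparametric case; the shapes and coefficients are illustrative only:

```python
import math

def weibull_hazard(t, shape, scale):
    """Hazard rate h(t) = f(t)/S(t) of a Weibull distribution:
    increasing for shape > 1, constant for shape = 1 (exponential),
    decreasing for shape < 1."""
    return (shape / scale) * (t / scale) ** (shape - 1)

def cox_hazard(t, x, beta, baseline=lambda t: weibull_hazard(t, 1.5, 10.0)):
    """Semiparametric multiplicative form: covariates scale an
    (in principle arbitrary) baseline hazard, h(t|x) = h0(t)*exp(x'beta)."""
    return baseline(t) * math.exp(sum(b * xi for b, xi in zip(beta, x)))

# Proportionality: the hazard ratio between two covariate profiles
# does not depend on t, because the baseline h0(t) cancels.
hr = cox_hazard(5.0, [1.0], [0.7]) / cox_hazard(5.0, [0.0], [0.7])  # exp(0.7)
```

In an AFT model, by contrast, covariates rescale time itself rather than multiplying the hazard, which is why the two families generally coincide only for the Weibull case.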

  7. Hazardous Waste Site Remediation, Neighborhood Change, and Neighborhood Quality.

    PubMed Central

    Greenberg, M; Schneider, D

    1994-01-01

    We tested the hypothesis that neighborhoods with hazardous waste sites may no longer be undesirable places to live if they have been at least partly remediated. We collected 377 questionnaires (42% response rate) administered within one-half mile of the number 1, 4, and 12 hazardous waste sites on the National Priority List (Superfund). These neighborhoods were rated higher quality than neighborhoods with unremediated hazardous waste sites and about the same as neighborhoods in northern New Jersey and the United States as a whole. Newer residents considered these formerly tainted areas to be opportunities to upgrade their housing and living conditions. Long-term residents retained the negative image of the blemished neighborhood. PMID:9679112

  8. On the development of weighting factors for ballast ranking prioritization & development of the relationship and rate of defective segments based on volume of missing ballast

    NASA Astrophysics Data System (ADS)

    Cronin, John

    This thesis explores the effects of missing ballast on track behavior and degradation. As ballast is an integral part of the track structure, the hypothesized effect of missing ballast is that defects will be more common which in turn leads to more derailments. In order to quantify the volume of missing ballast, remote sensing technologies were used to provide an accurate profile of the ballast. When the existing profile is compared to an idealized profile, the area of missing ballast can be computed. The area is then subdivided into zones which represent the area in which the ballast performs a key function in the track structure. These areas are then extrapolated into the volume of missing ballast for each zone based on the distance between collected profiles. In order to emphasize the key functions that the zones previously created perform, weighting factors were developed based on common risk-increasing hazards, such as curves and heavy axle loads, which are commonly found on railways. These weighting factors are applied to the specified zones' missing ballast volume when such a hazard exists in that segment of track. Another set of weighting factors were developed to represent the increased risk, or preference for lower risk, for operational factors such as the transport of hazardous materials or for being a key route. Through these weighting factors, ballast replenishment can be prioritized to focus on the areas that pose a higher risk of derailments and their associated costs. For the special cases where the risk or aversion to risk comes from what is being transported, such as the case with hazardous materials or passengers, an economic risk assessment was completed in order to quantify the risk associated with their transport. This economic risk assessment looks at the increased costs associated with incidents that occur and how they compare to incidents which do not directly involve the special cargos. 
In order to provide support for the use of the previously developed weightings as well as to quantify the actual impact that missing ballast has on the rate of geometry defects, analyses which quantified the risk of missing ballast were performed. In addition to quantifying the rate of defects, analyses were performed which looked at the impact associated with curved track, how the location of missing ballast impacts the rate of geometry defects and how the combination of the two compared with the previous analyses. Through this research, the relationship between the volume of missing ballast and ballast-related defects has been identified and quantified. This relationship is positive for the aggregate of all ballast-related defects but does not always exist for individual defects which occasionally have unique behavior. For the non-ballast defects, a relationship between missing ballast and their rate of occurrence did not always appear to exist. The impact of curves was apparent, showing that the rate of defects was either similar to or exceeded the rate of defects for tangent track. For the analyses which looked at the location of ballast in crib or shoulder, the results were quite similar to the previous analyses. The development, application and improvements of a risk-based ballast maintenance prioritization system provides a relatively low-cost and effective method to improve the operational safety for all railroads.

  9. Association of High-Dose Ibuprofen Use, Lung Function Decline, and Long-Term Survival in Children with Cystic Fibrosis.

    PubMed

    Konstan, Michael W; VanDevanter, Donald R; Sawicki, Gregory S; Pasta, David J; Foreman, Aimee J; Neiman, Evgueni A; Morgan, Wayne J

    2018-04-01

    Cystic fibrosis deaths result primarily from lung function loss, so chronic respiratory therapies, intended to preserve lung function, are cornerstones of cystic fibrosis care. Although treatment-associated reduction in rate of lung function loss should ultimately improve cystic fibrosis survival, no such relationship has been described for any chronic cystic fibrosis therapy. In part, this is because the ages of most rapid lung function decline-early adolescence-precede the median age of cystic fibrosis deaths by more than a decade. To study associations of high-dose ibuprofen treatment with the rate of forced expiratory volume in 1 second decline and mortality among children followed in the Epidemiologic Study of Cystic Fibrosis and subsequently in the U.S. Cystic Fibrosis Foundation Patient Registry. We performed a matched cohort study using data from Epidemiologic Study of Cystic Fibrosis. Exposure was defined as high-dose ibuprofen use reported at ≥80% of encounters over 2 years. Unexposed children were matched to exposed children 5:1 using propensity scores on the basis of demographic, clinical, and treatment covariates. The rate of decline of percent predicted forced expiratory volume in 1 second during the 2-year follow-up period was estimated by mixed-effects modeling with random slopes and intercepts. Survival over 16 follow-up years in the U.S. Cystic Fibrosis Foundation Patient Registry was compared between treatment groups by using proportional hazards modeling controlling for matching and covariates. We included 775 high-dose ibuprofen users and 3,665 nonusers who were well matched on demographic, clinical, and treatment variables. 
High-dose ibuprofen users declined on average 1.10 percent predicted forced expiratory volume in 1 second/yr (95% confidence interval; 0.51, 1.69) during the 2-year treatment period, whereas nonusers declined at a rate of 1.76 percent predicted forced expiratory volume in 1 second/yr (95% confidence interval; 1.48, 2.04) during the corresponding 2-year period, a 37.5% slower decline among users compared with nonusers (95% confidence interval; 0.4%, 71.3%; P = 0.046). The users had better subsequent survival (P < 0.001): the unadjusted and adjusted hazard ratios for mortality (high-dose ibuprofen/non-high-dose ibuprofen) (95% confidence interval) were 0.75 (0.64, 0.87) and 0.82 (0.69, 0.96). In a propensity-score matched cohort study of children with cystic fibrosis, we observed an association between high-dose ibuprofen use and both slower lung function decline and improved long-term survival. These results are consistent with the hypothesis that treatment-associated reduction of lung function decline in children with cystic fibrosis leads to improved survival.

  10. Updated Colombian Seismic Hazard Map

    NASA Astrophysics Data System (ADS)

    Eraso, J.; Arcila, M.; Romero, J.; Dimate, C.; Bermúdez, M. L.; Alvarado, C.

    2013-05-01

    The Colombian seismic hazard map used by the National Building Code (NSR-98) in effect until 2009 was developed in 1996. Since then, the National Seismological Network of Colombia has improved in both coverage and technology, providing fifteen years of additional seismic records. These improvements have allowed a better understanding of the regional geology and tectonics, which, together with destructive seismic activity in Colombia, has motivated the interest in and the need for a new seismic hazard assessment in this country. Taking advantage of new instrumental information sources such as new broadband stations of the National Seismological Network, new historical seismicity data, the availability of standardized global databases, and, in general, advances in models and techniques, a new Colombian seismic hazard map was developed. A PSHA model was applied, chosen because it incorporates the effects of all seismic sources that may affect a particular site while handling the uncertainties introduced by the parameters and assumptions defined in this kind of study. First, the seismic source geometries and a complete and homogeneous seismic catalog were defined, and the occurrence-rate parameters of each seismic source were calculated, establishing a national seismotectonic model. Several attenuation-distance relationships were selected depending on the type of seismicity considered. The seismic hazard was estimated using the CRISIS2007 software created by the Engineering Institute of the Universidad Nacional Autónoma de México (UNAM, National Autonomous University of Mexico). A uniform grid with 0.1° spacing was used to calculate the peak ground acceleration (PGA) and response spectral values at 0.1, 0.2, 0.3, 0.5, 0.75, 1, 1.5, 2, 2.5 and 3.0 seconds with return periods of 75, 225, 475, 975 and 2475 years. For each site, a uniform hazard spectrum and exceedance rate curves were calculated.
With the results, it is possible to determine environments and scenarios where the seismic hazard is a function of distance and magnitude, and also the principal seismic sources that contribute to the hazard at each site (disaggregation). This project was conducted by the Servicio Geológico Colombiano (Colombian Geological Survey) and the Universidad Nacional de Colombia (National University of Colombia), with the collaboration of national and foreign experts and the National System of Prevention and Attention of Disasters (SNPAD). It is worth noting that this new seismic hazard map was used in the updated national building code (NSR-10). A new process is ongoing to improve the map and present the seismic hazard in terms of intensity. This requires new knowledge of site effects, at both local and regional scales, and checking existing acceleration-to-intensity relationships and developing new ones, in order to obtain results that are more understandable and useful for a wider range of users, not only in the engineering field but also for risk assessment and management institutions, researchers and the general community.
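The exceedance rate curves produced per site combine, over all contributing sources, each rupture scenario's annual rate with the probability that its ground motion exceeds a given level. A minimal sketch with a toy lognormal ground-motion model standing in for a real GMPE; the scenario list and sigma are hypothetical:

```python
import math

def normal_sf(z):
    """Standard normal survival function, via the complementary error function."""
    return 0.5 * math.erfc(z / math.sqrt(2.0))

def exceedance_rate(a, scenarios, sigma_ln=0.6):
    """Annual rate of PGA > a at a site: sum over (annual_rate, median_pga)
    rupture scenarios of rate * P(exceedance | scenario), assuming
    lognormally distributed ground motion about the scenario median."""
    lam = 0.0
    for rate, median in scenarios:
        z = (math.log(a) - math.log(median)) / sigma_ln
        lam += rate * normal_sf(z)
    return lam

# Hypothetical scenarios: (annual rate, median PGA in g at the site).
scen = [(0.2, 0.05), (0.05, 0.12), (0.01, 0.30)]
lam = exceedance_rate(0.1, scen)
# Poisson occurrence gives the exceedance probability over an exposure time.
p50 = 1.0 - math.exp(-lam * 50.0)
```

Evaluating the curve at the rate corresponding to a chosen return period (e.g. 1/475 per year) yields the mapped design ground motion.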

  11. Preharvest food safety.

    PubMed

    Childers, A B; Walsh, B

    1996-07-23

    Preharvest food safety is essential for the protection of our food supply. The production and transport of livestock and poultry play an integral part in the safety of these food products. The goals of this safety assurance include freedom from pathogenic microorganisms, disease, and parasites, and from potentially harmful residues and physical hazards. Its functions should be based on hazard analysis and critical control points (HACCP) from producer to slaughter plant, with emphasis on prevention of identifiable hazards rather than on removal of contaminated products. The production goal is to minimize infection and ensure freedom from potentially harmful residues and physical hazards. The marketing goal is control of exposure to pathogens and stress. Both groups should have functional HACCP management programs which include personnel training and certification of producers. These programs must cover production procedures, chemical usage, feeding, treatment practices, drug usage, assembly and transportation, and animal identification. Plans must use risk assessment principles, and the procedures must be defined. Other elements would include preslaughter certification, environmental protection, control of chemical hazards, live-animal drug-testing procedures, and identification of physical hazards.

  12. 49 CFR 173.12 - Exceptions for shipment of waste materials.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... impracticable, an equivalent (except for closure) open head drum may be used for the hazardous waste. (b) Lab....101 Hazardous Materials Table may be used in place of specific chemical names, when two or more... exceeding 4 L (1 gallon) rated capacity, or metal or plastic, not exceeding 20 L (5.3 gallons) rated...

  13. Estimating Sedimentation from an Erosion-Hazard Rating

    Treesearch

    R.M. Rice; S.A. Sherbin

    1977-01-01

    Data from two watersheds in northern California were used to develop an interpretation of the erosion hazard rating (EHR) of the Coast Forest District as amount of sedimentation. For the Caspar Creek Experimental Watershed (North Fork and South Fork), each EHR unit was estimated as equivalent to 0.0543 cubic yards per acre per year, on undisturbed forest. Experience...

  14. Estimating sedimentation from an erosion-hazard rating

    Treesearch

    R. M. Rice; S. A. Sherbin

    1977-01-01

    Data from two watersheds in northern California were used to develop an interpretation of the erosion-hazard rating (EHR) of the Coast Forest District as amount of sedimentation. For the Caspar Creek Experimental Watershed (North Fork and South Fork), each EHR unit was estimated as equivalent to 0.0543 cubic yards per acre per year, on undisturbed forest. Experience...

  15. The influence of image valence on visual attention and perception of risk in drivers.

    PubMed

    Jones, M P; Chapman, P; Bailey, K

    2014-12-01

    Currently there is little research into the relationship between emotion and driving in the context of advertising and distraction. Research that has looked into this also has methodological limitations that could be affecting the results rather than emotional processing (Trick et al., 2012). The current study investigated the relationship between image valence and risk perception, eye movements and physiological reactions. Participants watched hazard perception clips which had emotional images from the international affective picture system overlaid onto them. They rated how hazardous or safe they felt, whilst eye movements, galvanic skin response and heart rate were recorded. Results suggested that participants were more aware of potential hazards when a neutral image had been shown, in comparison to positive and negative valenced images; that is, participants showed higher subjective ratings of risk, larger physiological responses and marginally longer fixation durations when viewing a hazard after a neutral image, but this effect was attenuated after emotional images. It appears that emotional images reduce sensitivity to potential hazards, and we suggest that future studies could apply these findings to higher fidelity paradigms such as driving simulators. Copyright © 2014 Elsevier Ltd. All rights reserved.

  16. Physics Simulation Software for Autonomous Propellant Loading and Gas House Autonomous System Monitoring

    NASA Technical Reports Server (NTRS)

    Regalado Reyes, Bjorn Constant

    2015-01-01

    1. Kennedy Space Center (KSC) is developing a mobile launching system with autonomous propellant loading capabilities for liquid-fueled rockets. An autonomous system will be responsible for monitoring and controlling the storage, loading and transferring of cryogenic propellants. The Physics Simulation Software will reproduce the sensor data seen during the delivery of cryogenic fluids including valve positions, pressures, temperatures and flow rates. The simulator will provide insight into the functionality of the propellant systems and demonstrate the effects of potential faults. This will provide verification of the communications protocols and the autonomous system control. 2. The High Pressure Gas Facility (HPGF) stores and distributes hydrogen, nitrogen, helium and high pressure air. The hydrogen and nitrogen are stored in cryogenic liquid state. The cryogenic fluids pose several hazards to operators and the storage and transfer equipment. Constant monitoring of pressures, temperatures and flow rates are required in order to maintain the safety of personnel and equipment during the handling and storage of these commodities. The Gas House Autonomous System Monitoring software will be responsible for constantly observing and recording sensor data, identifying and predicting faults and relaying hazard and operational information to the operators.

  17. Considerations in comparing the U.S. Geological Survey one‐year induced‐seismicity hazard models with “Did You Feel It?” and instrumental data

    USGS Publications Warehouse

    White, Isabel; Liu, Taojun; Luco, Nicolas; Liel, Abbie

    2017-01-01

    The recent steep increase in seismicity rates in Oklahoma, southern Kansas, and other parts of the central United States led the U.S. Geological Survey (USGS) to develop, for the first time, a probabilistic seismic hazard forecast for one year (2016) that incorporates induced seismicity. In this study, we explore a process to ground‐truth the hazard model by comparing it with two databases of observations: modified Mercalli intensity (MMI) data from the “Did You Feel It?” (DYFI) system and peak ground acceleration (PGA) values from instrumental data. Because the 2016 hazard model was heavily based on earthquake catalogs from 2014 to 2015, this initial comparison utilized observations from these years. Annualized exceedance rates were calculated with the DYFI and instrumental data for direct comparison with the model. These comparisons required assessment of the options for converting hazard model results and instrumental data from PGA to MMI for comparison with the DYFI data. In addition, to account for known differences that affect the comparisons, the instrumental PGA and DYFI data were declustered, and the hazard model was adjusted for local site conditions. With these adjustments, examples at sites with the most data show reasonable agreement in the exceedance rates. However, the comparisons were complicated by the spatial and temporal completeness of the instrumental and DYFI observations. Furthermore, most of the DYFI responses are in the MMI II–IV range, whereas the hazard model is oriented toward forecasts at higher ground‐motion intensities, usually above about MMI IV. Nevertheless, the study demonstrates some of the issues that arise in making these comparisons, thereby informing future efforts to ground‐truth and improve hazard modeling for induced‐seismicity applications.

  18. Real-Time Hazard Detection and Avoidance Demonstration for a Planetary Lander

    NASA Technical Reports Server (NTRS)

    Epp, Chirold D.; Robertson, Edward A.; Carson, John M., III

    2014-01-01

    The Autonomous Landing Hazard Avoidance Technology (ALHAT) Project is chartered to develop, and mature to a Technology Readiness Level (TRL) of six, an autonomous system combining guidance, navigation and control with terrain sensing and recognition functions for crewed, cargo, and robotic planetary landing vehicles. In addition to precision landing close to a pre-mission defined landing location, the ALHAT System must be capable of autonomously identifying and avoiding surface hazards in real time to enable a safe landing under any lighting conditions. This paper provides an overview of the recent results of the ALHAT closed-loop hazard detection and avoidance flight demonstrations on the Morpheus Vertical Testbed (VTB) at the Kennedy Space Center, including results and lessons learned. This effort is also described in the context of a technology path in support of future crewed and robotic planetary exploration missions based upon the core sensing functions of the ALHAT system: Terrain Relative Navigation (TRN), Hazard Detection and Avoidance (HDA), and Hazard Relative Navigation (HRN).

  19. Further thoughts on the utility of risk matrices.

    PubMed

    Ball, David J; Watt, John

    2013-11-01

    Risk matrices are commonly encountered devices for rating hazards in numerous areas of risk management. Part of their popularity is predicated on their apparent simplicity and transparency. Recent research, however, has identified serious mathematical defects and inconsistencies. This article further examines the reliability and utility of risk matrices for ranking hazards, specifically in the context of public leisure activities including travel. We find that (1) different risk assessors may assign vastly different ratings to the same hazard, (2) even following lengthy reflection and learning, scatter remains high, and (3) the underlying drivers of disparate ratings relate to fundamentally different worldviews, beliefs, and a panoply of psychosocial factors that are seldom explicitly acknowledged. It appears that risk matrices, when used in this context, may be creating no more than an artificial and even untrustworthy picture of the relative importance of hazards, which may be of little or no benefit to those trying to manage risk effectively and rationally. © 2013 Society for Risk Analysis.

  20. Superiority of Serum Cystatin C Over Creatinine in Prediction of Long-Term Prognosis at Discharge From ICU.

    PubMed

    Ravn, Bo; Prowle, John R; Mårtensson, Johan; Martling, Claes-Roland; Bell, Max

    2017-09-01

    Renal outcomes after critical illness are seldom assessed despite the strong correlation between chronic kidney disease and survival. Outside hospital, renal dysfunction is more strongly associated with mortality when assessed by serum cystatin C than by creatinine. The relationship between creatinine and longer-term mortality might be particularly weak in survivors of critical illness. Retrospective observational cohort study. In 3,077 adult ICU survivors, we compared ICU discharge cystatin C and creatinine and their association with 1-year mortality. Exclusions were death within 72 hours of ICU discharge, ICU stay less than 24 hours, and end-stage renal disease. None. During ICU admission, serum cystatin C and creatinine diverged, so that by ICU discharge, almost twice as many patients had glomerular filtration rate less than 60 mL/min/1.73 m² when estimated from cystatin C compared with glomerular filtration rate estimated from creatinine, 44% versus 26%. In 743 patients without acute kidney injury, where ICU discharge renal function should reflect ongoing baseline, discharge glomerular filtration rate estimated from creatinine consistently overestimated follow-up glomerular filtration rate estimated from creatinine, whereas ICU discharge glomerular filtration rate estimated from cystatin C well matched follow-up chronic kidney disease status. By 1 year, 535 (17.4%) had died. In survival analysis adjusted for age, sex, and comorbidity, cystatin C was near-linearly associated with increased mortality, hazard ratio 1.78 (95% CI, 1.46-2.18), 75th versus 25th centile. Conversely, creatinine demonstrated a J-shaped relationship with mortality, so that in the majority of patients there was no significant association with survival, hazard ratio 1.03 (0.87-1.2), 75th versus 25th centile. After adjustment for both creatinine and cystatin C levels, higher discharge creatinine was then associated with lower long-term mortality.
In contrast to creatinine, cystatin C consistently associated with long-term mortality, identifying patients at both high and low risk, and better correlated with follow-up renal function. Conversely, lower creatinine relative to cystatin C appeared to confer adverse prognosis, confounding creatinine interpretation in isolation. Cystatin C warrants further investigation as a more meaningful measure of renal function after critical illness.

  1. Executive function, but not memory, associates with incident coronary heart disease and stroke.

    PubMed

    Rostamian, Somayeh; van Buchem, Mark A; Westendorp, Rudi G J; Jukema, J Wouter; Mooijaart, Simon P; Sabayan, Behnam; de Craen, Anton J M

    2015-09-01

    To evaluate the association of performance in cognitive domains executive function and memory with incident coronary heart disease and stroke in older participants without dementia. We included 3,926 participants (mean age 75 years, 44% male) at risk for cardiovascular diseases from the Prospective Study of Pravastatin in the Elderly at Risk (PROSPER) with Mini-Mental State Examination score ≥24 points. Scores on the Stroop Color-Word Test (selective attention) and the Letter Digit Substitution Test (processing speed) were converted to Z scores and averaged into a composite executive function score. Likewise, scores of the Picture Learning Test (immediate and delayed memory) were transformed into a composite memory score. Associations of executive function and memory were longitudinally assessed with risk of coronary heart disease and stroke using multivariable Cox regression models. During 3.2 years of follow-up, incidence rates of coronary heart disease and stroke were 30.5 and 12.4 per 1,000 person-years, respectively. In multivariable models, participants in the lowest third of executive function, as compared to participants in the highest third, had 1.85-fold (95% confidence interval [CI] 1.39-2.45) higher risk of coronary heart disease and 1.51-fold (95% CI 0.99-2.30) higher risk of stroke. Participants in the lowest third of memory had no increased risk of coronary heart disease (hazard ratio 0.99, 95% CI 0.74-1.32) or stroke (hazard ratio 0.87, 95% CI 0.57-1.32). Lower executive function, but not memory, is associated with higher risk of coronary heart disease and stroke. Lower executive function, as an independent risk indicator, might better reflect brain vascular pathologies. © 2015 American Academy of Neurology.

  2. Hazard function theory for nonstationary natural hazards

    NASA Astrophysics Data System (ADS)

    Read, L.; Vogel, R. M.

    2015-12-01

    Studies from the natural hazards literature indicate that many natural processes, including wind speeds, landslides, wildfires, precipitation, streamflow and earthquakes, show evidence of nonstationary behavior such as trends in magnitudes through time. Traditional probabilistic analysis of natural hazards based on partial duration series (PDS) generally assumes stationarity in the magnitudes and arrivals of events, i.e. that the probability of exceedance is constant through time. Given evidence of trends and the consequent expected growth in devastating impacts from natural hazards across the world, new methods are needed to characterize their probabilistic behavior. The field of hazard function analysis (HFA) is ideally suited to this problem because its primary goal is to describe changes in the exceedance probability of an event over time. HFA is widely used in medicine, manufacturing, actuarial statistics, reliability engineering, economics, and elsewhere. HFA provides a rich theory to relate the natural hazard event series (x) with its failure time series (t), enabling computation of corresponding average return periods and reliabilities associated with nonstationary event series. This work investigates the suitability of HFA to characterize nonstationary natural hazards whose PDS magnitudes are assumed to follow the widely applied Poisson-GP model. We derive a 2-parameter Generalized Pareto hazard model and demonstrate how metrics such as reliability and average return period are impacted by nonstationarity and discuss the implications for planning and design. Our theoretical analysis linking hazard event series x, with corresponding failure time series t, should have application to a wide class of natural hazards.
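The Poisson-GP model named above has a simple closed-form hazard, which makes the link between the event series and return periods concrete. A minimal sketch for the stationary case (parameter values are illustrative; the paper's contribution is precisely to let these parameters vary with time, which this sketch does not do):

```python
import math


def gp_hazard(x, sigma, xi):
    """Hazard rate of a Generalized Pareto (GP) excess at level x >= 0.

    With survival function S(x) = (1 + xi * x / sigma) ** (-1 / xi),
    the hazard h(x) = f(x) / S(x) simplifies to 1 / (sigma + xi * x).
    """
    return 1.0 / (sigma + xi * x)


def average_return_period(x, rate, sigma, xi):
    """Average return period of an exceedance above x under a stationary
    Poisson-GP model: arrivals are Poisson with annual rate `rate`,
    and excess magnitudes are GP(sigma, xi)."""
    survival = (1.0 + xi * x / sigma) ** (-1.0 / xi)
    return 1.0 / (rate * survival)
```

In the nonstationary extension studied by the authors, `sigma` and `rate` become functions of time, and the same quantities are evaluated along the fitted trend.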

  3. Right Atrial Deformation in Predicting Outcomes in Pediatric Pulmonary Hypertension.

    PubMed

    Jone, Pei-Ni; Schäfer, Michal; Li, Ling; Craft, Mary; Ivy, D Dunbar; Kutty, Shelby

    2017-12-01

    Elevated right atrial (RA) pressure is a risk factor for mortality, and RA size is prognostic of adverse outcomes in pulmonary hypertension (PH). There are limited data on phasic RA function (reservoir, conduit, and pump) in pediatric PH. We sought to (1) evaluate RA function in pediatric PH patients compared with controls, (2) compare the RA deformation indices with Doppler indices of diastolic dysfunction, functional capacity, biomarkers, invasive hemodynamics, and right ventricular functional indices, and (3) evaluate the potential of RA deformation indices to predict clinical outcomes. Sixty-six PH patients (mean age 7.9±4.7 years) were compared with 36 controls (7.7±4.4 years). RA and right ventricular deformation indices were obtained using 2-dimensional speckle tracking (2DCPA; TomTec, Germany). RA strain, strain rates, emptying fraction, and right ventricular longitudinal strain were measured. RA function was impaired in PH patients versus controls (P < 0.001). There were significant associations between RA function and invasive hemodynamics (P < 0.01). RA reservoir, pump function, the rate of RA filling, and atrial minimum volume predicted adverse clinical outcomes (hazard ratio [HR], 0.15; confidence interval [CI], 0.03-0.73; P < 0.01; HR, 0.05; CI, 0.003-0.43; P < 0.004; HR, 0.04; CI, 0.006-0.56; P < 0.01; and HR, 8.6; CI, 1.6-37.2; P < 0.01, respectively). RA deformation properties are significantly altered in pediatric PH patients. Progressive worsening of RA reservoir and conduit functions is related to changes in right ventricular diastolic dysfunction. RA reservoir function, pump function, the rate of atrial filling, and atrial minimum volume emerged as outcome predictors in pediatric PH. © 2017 American Heart Association, Inc.

  4. Estimating piecewise exponential frailty model with changing prior for baseline hazard function

    NASA Astrophysics Data System (ADS)

    Thamrin, Sri Astuti; Lawi, Armin

    2016-02-01

    Piecewise exponential models provide a very flexible framework for modelling univariate survival data. They can be used to estimate the effects of different covariates that influence survival. Although in a strict sense it is a parametric model, a piecewise exponential hazard can approximate any shape of a parametric baseline hazard. In the parametric baseline hazard, the hazard function for each individual may depend on a set of risk factors or explanatory variables. However, this set usually does not include all relevant variables, whether known or measurable, so the remaining variables become interesting to consider. This unknown and unobservable risk factor of the hazard function is often termed the individual's heterogeneity, or frailty. This paper analyses the effects of unobserved population heterogeneity in patients' survival times. The issue of model choice through variable selection is also considered. A sensitivity analysis is conducted to assess the influence of the prior for each parameter. We used the Markov Chain Monte Carlo method in computing the Bayesian estimator on kidney infection data. The results show that sex and frailty are substantially associated with survival in this study, and that the models are quite sensitive to the choice between the two priors.
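The flexibility of the piecewise exponential model comes from letting the hazard be constant within intervals but free to change across them. A minimal sketch of that building block (cut points and rates below are illustrative, and this is not the authors' Bayesian MCMC frailty implementation):

```python
import math


def cumulative_hazard(t, cuts, rates):
    """Cumulative hazard H(t) for a piecewise exponential model.

    cuts  : interval boundaries [a1, a2, ...], with 0 as implied start
    rates : constant hazard rate in each interval; len(rates) == len(cuts) + 1
    """
    H, start = 0.0, 0.0
    for cut, lam in zip(cuts, rates):
        if t <= cut:
            return H + lam * (t - start)   # t falls inside this interval
        H += lam * (cut - start)           # accumulate the full interval
        start = cut
    return H + rates[-1] * (t - start)     # t is beyond the last cut


def survival(t, cuts, rates):
    """Survival function S(t) = exp(-H(t))."""
    return math.exp(-cumulative_hazard(t, cuts, rates))
```

A frailty term would multiply each individual's rates by an unobserved positive factor; the Bayesian treatment in the paper places priors on the rates and on the frailty distribution.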

  5. A Minimum Assumption Tornado-Hazard Probability Model.

    NASA Astrophysics Data System (ADS)

    Schaefer, Joseph T.; Kelly, Donald L.; Abbey, Robert F.

    1986-12-01

    One of the principal applications of climatological tornado data is tornado-hazard assessment. To perform such a hazard-potential determination, historical tornado characteristics in either a regional or local area are compiled. A model is then used to determine a site-specific point probability of a tornado greater than a specified intensity occurring. Various models require different climatological input. However, knowledge of the mean values of tornado track width, tornado track length, tornado-affected area, and tornado occurrence rate, as functions of both tornado intensity and geographic area, along with a violence frequency distribution, enables most of the models to be applied. The NSSFC-NRC tornado data base is used to supply input for the determination of these parameters over the United States. This climatic data base has undergone extensive updating and quality control since it was last reported. For track parameters, internally redundant data were used to check consistency. Further, reports which deviated significantly from the mean were individually checked. Intensity data have been compared with the University of Chicago DAPPLE tornado base. All tornadoes whose recorded intensities differed by more than one category were reclassified by an independent scientist so that the two data sets are consistent.

  6. CERCLA and EPCRA Continuous Release Reporting

    EPA Pesticide Factsheets

    Congress established reportable quantities for Superfund hazardous substances. A continuous release of a hazardous substance is defined as being without interruption or abatement and stable in quantity and rate.

  7. United States National Seismic Hazard Maps

    USGS Publications Warehouse

    Petersen, M.D.; ,

    2008-01-01

    The U.S. Geological Survey's maps of earthquake shaking hazards provide information essential to creating and updating the seismic design provisions of building codes and insurance rates used in the United States. Periodic revisions of these maps incorporate the results of new research. Buildings, bridges, highways, and utilities built to meet modern seismic design provisions are better able to withstand earthquakes, not only saving lives but also enabling critical activities to continue with less disruption. These maps can also help people assess the hazard to their homes or places of work and can also inform insurance rates.

  8. Prognostic value of myocardial ischemia and necrosis in depressed left ventricular function: a multicenter stress cardiac magnetic resonance registry.

    PubMed

    Husser, Oliver; Monmeneu, Jose V; Bonanad, Clara; Lopez-Lereu, Maria P; Nuñez, Julio; Bosch, Maria J; Garcia, Carlos; Sanchis, Juan; Chorro, Francisco J; Bodi, Vicente

    2014-09-01

    The incremental prognostic value of inducible myocardial ischemia over necrosis derived by stress cardiac magnetic resonance in depressed left ventricular function is unknown. We determined the prognostic value of necrosis and ischemia in patients with depressed left ventricular function referred for dipyridamole stress perfusion magnetic resonance. In a multicenter registry using stress magnetic resonance, the presence (≥ 2 segments) of late enhancement and perfusion defects and their association with major events (cardiac death and nonfatal infarction) was determined. In 391 patients, perfusion defect or late enhancement were present in 224 (57%) and 237 (61%), respectively. During follow-up (median, 96 weeks), 47 major events (12%) occurred: 25 cardiac deaths and 22 myocardial infarctions. Patients with major events displayed a larger extent of perfusion defects (6 segments vs 3 segments; P < .001) but not late enhancement (5 segments vs 3 segments; P = .1). Major event rate was significantly higher in the presence of perfusion defects (17% vs 5%; P = .0005) but not of late enhancement (14% vs 9%; P = .1). Patients were categorized into 4 groups: absence of perfusion defect and absence of late enhancement (n = 124), presence of late enhancement and absence of perfusion defect (n = 43), presence of perfusion defect and presence of late enhancement (n = 195), absence of late enhancement and presence of perfusion defect (n = 29). Event rate was 5%, 7%, 16%, and 24%, respectively (P for trend = .003). In a multivariate regression model, only perfusion defect (hazard ratio = 2.86; 95% confidence interval, 1.37-5.95; P = .002) but not late enhancement (hazard ratio = 1.70; 95% confidence interval, 0.90-3.22; P = .105) predicted events. In depressed left ventricular function, the presence of inducible ischemia is the strongest predictor of major events. Copyright © 2014 Sociedad Española de Cardiología. Published by Elsevier España. All rights reserved.

  9. Frequency of Leaving the House and Mortality from Age 70 to 95.

    PubMed

    Jacobs, Jeremy M; Hammerman-Rozenberg, Aliza; Stessman, Jochanan

    2018-01-01

    To determine the association between frequency of leaving the house and mortality. Prospective follow-up of an age-homogenous, representative, community-dwelling birth cohort (born 1920-21) from the Jerusalem Longitudinal Study (1990-2015). Home. Individuals aged 70 (n = 593), 78 (n = 973), 85 (n = 1164), and 90 (n = 645), examined in 1990, 1998, 2005, and 2010, respectively. Frequency of leaving the house, defined as daily (6-7/week), often (2-5/week), and rarely (≤1/week); geriatric assessment; all-cause mortality (2010-15). Kaplan-Meier survival charts and proportional hazards models adjusted for social (sex, marital status, financial status, loneliness), functional (sex, self-rated health, fatigue, depression, physical activity, activity of daily living difficulty), and medical (sex, chronic pain, visual impairment, hearing impairment, diabetes mellitus, hypertension, ischemic heart disease, chronic kidney disease) covariates. At ages 70, 78, 85, and 90, frequency of going out daily was 87.0%, 80.6%, 65.6%, and 48.4%; often was 6.4%, 9.5%, 17.4%, and 11.3%; and rarely was 6.6%, 10.0%, 17.0%, and 40.3% respectively. Decreasing frequency of going out was associated with negative social, functional, and medical characteristics. Survival rates were lowest among those leaving rarely and highest among those going out daily throughout follow-up. Similarly, compared with rarely leaving the house, unadjusted mortality hazard ratios (HRs) were lowest among subjects leaving daily and remained significant after adjustment for social, functional and medical covariates. Among subjects leaving often, unadjusted HRs showed a similar effect of smaller magnitude, with attenuation of significance after adjustment in certain models. Findings were unchanged after excluding subjects dying within 6 months of follow-up. 
In community-dwelling elderly adults aged 70 to 90, leaving the house daily was associated with lower mortality risk, independent of social, functional, or medical status. © 2017, Copyright the Authors Journal compilation © 2017, The American Geriatrics Society.
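The Kaplan-Meier survival charts mentioned above can be reproduced from (time, event) pairs with the standard product-limit estimator. A compact sketch (the data values are hypothetical; event = 1 marks a death, 0 marks censoring):

```python
def kaplan_meier(times, events):
    """Product-limit (Kaplan-Meier) survival estimate.

    Returns a list of (time, S(time)) pairs at each distinct death time.
    Censored observations leave the risk set without a survival drop.
    """
    data = sorted(zip(times, events))
    at_risk = len(data)
    s = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = ties = 0
        while i < len(data) and data[i][0] == t:
            deaths += data[i][1]
            ties += 1
            i += 1
        if deaths:
            s *= 1.0 - deaths / at_risk   # multiply in the step at time t
            curve.append((t, s))
        at_risk -= ties                   # deaths and censorings leave the risk set
    return curve
```

The adjusted analyses in the study would then layer Cox proportional hazards regression on top of curves like these to control for the social, functional, and medical covariates.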

  10. Hydrological risks in anthropized watersheds: modeling of hazard, vulnerability and impacts on population from south-west of Madagascar

    NASA Astrophysics Data System (ADS)

    Mamy Rakotoarisoa, Mahefa; Fleurant, Cyril; Taibi, Nuscia; Razakamanana, Théodore

    2016-04-01

    Hydrological risks, especially for floods, are recurrent on the Fiherenana watershed - southwest of Madagascar. The city of Toliara, which is located at the outlet of the river basin, is subjected each year to hurricane hazards and floods. The stakes are of major importance in this part of the island. This study begins with the analysis of hazard by collecting all existing hydro-climatic data on the catchment. It then seeks to determine trends, despite the significant lack of data, using simple statistical models (decomposition of time series). Then, two approaches are conducted to assess the vulnerability of the city of Toliara and the surrounding villages. First, a static approach, from surveys of land and the use of GIS are used. Then, the second method is the use of a multi-agent-based simulation model. The first step is the mapping of a vulnerability index which is the arrangement of several static criteria. This is a microscale indicator (the scale used is the housing). For each House, there are several criteria of vulnerability, which are the potential water depth, the flow rate, or the architectural typology of the buildings. For the second part, simulations involving scenes of agents are used in order to evaluate the degree of vulnerability of homes from flooding. Agents are individual entities to which we can assign behaviours on purpose to simulate a given phenomenon. The aim is not to give a criterion to the house as physical building, such as its architectural typology or its strength. The model wants to know the chances of the occupants of the house to escape from a catastrophic flood. For this purpose, we compare various settings and scenarios. Some scenarios are conducted to take into account the effect of certain decision made by the responsible entities (Information and awareness of the villagers for example). 
The simulation consists of two essential parts taking place simultaneously in time: simulation of the rise of water and the flow using classical hydrological functions and multi agent system (transfer function and production function) and the simulation of the behaviour of the people facing the arrival of hazard.

  11. A Marine Hazardous Substances Data System. Volume 2.

    DTIC Science & Technology

    1985-12-01

    substances are considered by the Task III panel to exhibit the greatest potential for occupational health effects and warrant the greatest precautions for...Hazards Branch 1111 N NIOSH Registry of Toxic Effects of Chemical Substances 1121 P NIOSH/OSHA Pocket Guide to Chemical Hazards [61 U Undocumented Source...NAS Hazard Rating (Liquid or Vapor Irritant / Solid Irritant / Poisons): 0 No effect / No effect / No effect; 1 Slight effect / Causes skin / Slightly toxic

  12. 49 CFR 171.1 - Applicability of Hazardous Materials Regulations (HMR) to persons and functions.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... transportation of hazardous materials in commerce and to pre-transportation and transportation functions. (a..., reconditions, repairs, or tests a packaging or a component of a packaging that is represented, marked...

  13. The toxicity, in vitro, of silicon carbide whiskers.

    PubMed

    Vaughan, G L; Jordan, J; Karr, S

    1991-10-01

    To mouse cells in culture, SiC whiskers (SiCW) and asbestos are similarly cytotoxic, disrupting cell membranes and killing cells. Both shorten cell generation time, increase the rate of DNA synthesis, increase total cell DNA content, and cause a loss in growth control often associated with malignant cellular transformation. Within the narrow size range of materials examined, the amount of damage appeared to be more a function of the number of whiskers present than of their size. Silicon carbide whiskers, if mishandled, may pose a serious health hazard to humans.

  14. Annosus Root Disease Hazard Rating, Detection, and Management Strategies in the Southeastern United States

    Treesearch

    S. A. Alexander

    1989-01-01

    Annosus root disease (ARD) is the major root disease of pines in the southeastern United States, where severely affected trees exhibit growth loss. Assessing the potential damage of ARD is essential for making effective disease control and management decisions. A soil hazard rating system developed to identify potential for tree mortality is described. The Annosus...

  15. Estimating survival probabilities by exposure levels: utilizing vital statistics and complex survey data with mortality follow-up.

    PubMed

    Landsman, V; Lou, W Y W; Graubard, B I

    2015-05-20

    We present a two-step approach for estimating hazard rates and, consequently, survival probabilities, by levels of general categorical exposure. The resulting estimator utilizes three sources of data: vital statistics data and census data are used at the first step to estimate the overall hazard rate for a given combination of gender and age group, and cohort data constructed from a nationally representative complex survey with linked mortality records are used at the second step to divide the overall hazard rate by exposure levels. We present an explicit expression for the resulting estimator and consider two methods for variance estimation that account for complex multistage sample design: (1) the leaving-one-out jackknife method, and (2) the Taylor linearization method, which provides an analytic formula for the variance estimator. The methods are illustrated with smoking and all-cause mortality data from the US National Health Interview Survey Linked Mortality Files, and the proposed estimator is compared with a previously studied crude hazard rate estimator that uses survey data only. The advantages of a two-step approach and possible extensions of the proposed estimator are discussed. Copyright © 2015 John Wiley & Sons, Ltd.
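The two-step logic, an overall hazard from vital statistics divided across exposure levels using survey information, can be sketched as follows. This is a simplified reading of the idea, not the authors' exact estimator or its variance machinery; the prevalence vector and survey-based relative rates below are hypothetical inputs:

```python
def exposure_hazards(overall_rate, prevalence, relative_rates):
    """Split an overall hazard rate into exposure-specific rates.

    overall_rate   : hazard for a gender/age group from vital statistics
    prevalence     : survey-estimated exposure-level proportions pi_k
    relative_rates : survey-estimated relative hazards rho_k by level

    Solves lambda_k = overall_rate * rho_k / sum_j(pi_j * rho_j),
    so that the prevalence-weighted average recovers overall_rate.
    """
    norm = sum(p * r for p, r in zip(prevalence, relative_rates))
    return [overall_rate * r / norm for r in relative_rates]
```

Survival probabilities by level then follow as S_k(t) = exp(-lambda_k * t) for a constant-hazard age band.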

  16. Edoxaban versus warfarin in patients with atrial fibrillation.

    PubMed

    Giugliano, Robert P; Ruff, Christian T; Braunwald, Eugene; Murphy, Sabina A; Wiviott, Stephen D; Halperin, Jonathan L; Waldo, Albert L; Ezekowitz, Michael D; Weitz, Jeffrey I; Špinar, Jindřich; Ruzyllo, Witold; Ruda, Mikhail; Koretsune, Yukihiro; Betcher, Joshua; Shi, Minggao; Grip, Laura T; Patel, Shirali P; Patel, Indravadan; Hanyok, James J; Mercuri, Michele; Antman, Elliott M

    2013-11-28

    Edoxaban is a direct oral factor Xa inhibitor with proven antithrombotic effects. The long-term efficacy and safety of edoxaban as compared with warfarin in patients with atrial fibrillation is not known. We conducted a randomized, double-blind, double-dummy trial comparing two once-daily regimens of edoxaban with warfarin in 21,105 patients with moderate-to-high-risk atrial fibrillation (median follow-up, 2.8 years). The primary efficacy end point was stroke or systemic embolism. Each edoxaban regimen was tested for noninferiority to warfarin during the treatment period. The principal safety end point was major bleeding. The annualized rate of the primary end point during treatment was 1.50% with warfarin (median time in the therapeutic range, 68.4%), as compared with 1.18% with high-dose edoxaban (hazard ratio, 0.79; 97.5% confidence interval [CI], 0.63 to 0.99; P<0.001 for noninferiority) and 1.61% with low-dose edoxaban (hazard ratio, 1.07; 97.5% CI, 0.87 to 1.31; P=0.005 for noninferiority). In the intention-to-treat analysis, there was a trend favoring high-dose edoxaban versus warfarin (hazard ratio, 0.87; 97.5% CI, 0.73 to 1.04; P=0.08) and an unfavorable trend with low-dose edoxaban versus warfarin (hazard ratio, 1.13; 97.5% CI, 0.96 to 1.34; P=0.10). The annualized rate of major bleeding was 3.43% with warfarin versus 2.75% with high-dose edoxaban (hazard ratio, 0.80; 95% CI, 0.71 to 0.91; P<0.001) and 1.61% with low-dose edoxaban (hazard ratio, 0.47; 95% CI, 0.41 to 0.55; P<0.001). 
The corresponding annualized rates of death from cardiovascular causes were 3.17% versus 2.74% (hazard ratio, 0.86; 95% CI, 0.77 to 0.97; P=0.01), and 2.71% (hazard ratio, 0.85; 95% CI, 0.76 to 0.96; P=0.008), and the corresponding rates of the key secondary end point (a composite of stroke, systemic embolism, or death from cardiovascular causes) were 4.43% versus 3.85% (hazard ratio, 0.87; 95% CI, 0.78 to 0.96; P=0.005), and 4.23% (hazard ratio, 0.95; 95% CI, 0.86 to 1.05; P=0.32). Both once-daily regimens of edoxaban were noninferior to warfarin with respect to the prevention of stroke or systemic embolism and were associated with significantly lower rates of bleeding and death from cardiovascular causes. (Funded by Daiichi Sankyo Pharma Development; ENGAGE AF-TIMI 48 ClinicalTrials.gov number, NCT00781391.).

  17. Determinants of job turnover of young men and women in the United States: a hazard rate analysis.

    PubMed

    Donohue, J J

    1988-01-01

    Hazard models were used to examine the expected job tenure of male and female entrants to the full-time labor force after they appear to have completed their full-time education. Other analysts who have examined the relative quit rates of men and women have not limited their analyses to the 1st job, but they have implicitly assumed that hazard rates from 1st jobs are completely representative of hazard rates from any random nth job. This is 1 of the most important and questionable aspects of their implicit assumption that job terminations can be treated as semi-Markov processes. The basic goal is to analyze the hazard rates for a set of workers who have in some sense terminated their primary tie to education and have shifted toward a primary commitment to the labor force. The compilation of the durations of 1st full-time (20 or more hours/week) jobs yielded a sample of 1431 men and 1527 women. Female workers on average had about a half-year less education than the men: 12.47 years compared to 12.89 years. The percentage of workers with less than a high school education was similar for men (18.1%) and women (18.6%). The percentage of workers with 18 or more years of education was almost 6 times as high for men as for women: 2.73% versus 0.46%. The racial composition of the sample reflected the higher labor force participation rates of black women over white women. For the male sample, 73.2% of the workers were white and 25.7% were black. For the female sample, 70.7% were white and 28.3% were black. For the period 1968-71, female full-time workers quit their 1st job after completing school at substantially higher rates than male workers. This finding was robust to several different model specifications and selection criteria, as well as to estimations with and without duration dependence and with and without corrections for unobserved heterogeneity. 
While changes were not marked, increasing the definition of full-time employment from 20-30 hours reduced overall quit rates and tended to widen the tenure gap between men and women workers. Treating layoffs as completed spells of work raised overall quit rates and tended to narrow slightly the male-female tenure differential. Also contrary to the other microdata studies, the following were among the results: increased education had a significant and negative effect on quitting for both men and women; the unemployment rate had a significant, negative effect on quit rates for men; the hazard rates for women did not decline monotonically with duration but increased sharply after 18 months; and nonwhites did not have lower rates than whites.
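Findings such as the sharp rise in the women's hazard after 18 months come from examining hazard rates interval by interval. A minimal life-table-style sketch (durations and censoring flags are hypothetical, and no actuarial half-interval adjustment is made for spells censored within an interval):

```python
def interval_hazards(durations, events, width):
    """Life-table style hazard by duration interval.

    durations : spell lengths (e.g. months in 1st job)
    events    : 1 if the spell ended in a quit, 0 if censored
    width     : interval width in the same time units

    Returns (interval_start, hazard) pairs, where hazard is the share
    of spells still at risk at the interval start that quit within it.
    """
    horizon = max(durations)
    hazards = []
    start = 0.0
    while start < horizon:
        end = start + width
        at_risk = sum(1 for d in durations if d >= start)
        quits = sum(1 for d, e in zip(durations, events)
                    if start <= d < end and e)
        hazards.append((start, quits / at_risk if at_risk else 0.0))
        start = end
    return hazards
```

A non-monotone sequence of these interval hazards is exactly the kind of pattern that motivates models with duration dependence rather than a constant-hazard (exponential) specification.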

  18. Fuels planning: science synthesis and integration; forest structure and fire hazard fact sheet 02: fire hazard

    Treesearch

    Rocky Mountain Research Station USDA Forest Service

    2004-01-01

    Fire hazard reflects the potential fire behavior and magnitude of effects as a function of fuel conditions. This fact sheet discusses crown fuels, surface fuels, and ground fuels and their contribution and involvement in wildland fire. Other publications in this series...

  19. Pyrotechnic hazards classification and evaluation program. Phase 3, segments 1-4: Investigation of sensitivity test methods and procedures for pyrotechnic hazards evaluation and classification, part A

    NASA Technical Reports Server (NTRS)

    1971-01-01

    The findings, conclusions, and recommendations relative to the investigations conducted to evaluate tests for classifying pyrotechnic materials and end items as to their hazard potential are presented. Information required to establish an applicable means of determining the potential hazards of pyrotechnics is described. Hazard evaluations are based on the peak overpressure or impulse resulting from the explosion as a function of distance from the source. Other hazard classification tests include dust ignition sensitivity, impact ignition sensitivity, spark ignition sensitivity, and differential thermal analysis.

  20. Weather Avoidance Using Route Optimization as a Decision Aid: An AWIN Topical Study. Phase 1

    NASA Technical Reports Server (NTRS)

    1998-01-01

    The aviation community is faced with reducing the fatal aircraft accident rate by 80 percent within 10 years. This must be achieved even with ever-increasing traffic and a changing National Airspace System. This is not just an altruistic goal, but a real necessity, if our growing level of commerce is to continue. Honeywell Technology Center's topical study, "Weather Avoidance Using Route Optimization as a Decision Aid", addresses these pressing needs. The goal of this program is to use route optimization and user interface technologies to develop a prototype decision aid for dispatchers and pilots. This decision aid will suggest possible diversions through single or multiple weather hazards and present weather information with a human-centered design. At the conclusion of the program, we will have a laptop prototype decision aid that will be used to demonstrate concepts to industry for integration into commercialized products for dispatchers and/or pilots. With weather a factor in 30% of aircraft accidents, our program will prevent accidents by strategically avoiding weather hazards in flight. By supplying more relevant weather information in a human-centered format, along with the tools to generate flight plans around weather, aircraft exposure to weather hazards can be reduced. Our program directly addresses NASA's five-year investment areas of Strategic Weather Information and Weather Operations (simulation/hazard characterization and crew/dispatch/ATC hazard monitoring, display, and decision support) (NASA Aeronautics Safety Investment Strategy: Weather Investment Recommendations, April 15, 1997). This program comprises two phases; Phase I concluded December 31, 1998. This first phase defined weather data requirements, lateral routing algorithms, and conceptual displays for a user-centered design. Phase II runs from January 1999 through September 1999. 
    The second phase integrates vertical routing into the lateral optimizer and combines the user interface into a prototype software testbed. Phase II concludes with a dispatcher and pilot evaluation of the route optimizer decision aid. This document describes work completed in Phase I under contract with NASA Langley, August 1998 - December 1998. This report includes: (1) discussion of how weather hazards were identified in partnership with experts, and how weather hazards were prioritized; (2) static representations of display layouts for the integrated planning function; (3) the cost function for the 2D route optimizer; (4) discussion of the method, access to the raw data, and the results of the flight deck user information requirements definition; (5) itemized display format requirements identified for representing weather hazards in a route planning aid.

  1. Probabilistic properties of injection induced seismicity - implications for the seismic hazard analysis

    NASA Astrophysics Data System (ADS)

    Lasocki, Stanislaw; Urban, Pawel; Kwiatek, Grzegorz; Martinez-Garzón, Particia

    2017-04-01

    Injection induced seismicity (IIS) is an undesired dynamic rockmass response to massive fluid injections. This includes reactions, among others, to hydro-fracturing for shale gas exploitation. Complexity and changeability of technological factors that induce IIS, may result in significant deviations of the observed distributions of seismic process parameters from the models, which perform well in natural, tectonic seismic processes. Classic formulations of probabilistic seismic hazard analysis in natural seismicity assume the seismic marked point process to be a stationary Poisson process, whose marks - magnitudes are governed by a Gutenberg-Richter born exponential distribution. It is well known that the use of an inappropriate earthquake occurrence model and/or an inappropriate of magnitude distribution model leads to significant systematic errors of hazard estimates. It is therefore of paramount importance to check whether the mentioned, commonly used in natural seismicity assumptions on the seismic process, can be safely used in IIS hazard problems or not. Seismicity accompanying shale gas operations is widely studied in the framework of the project "Shale Gas Exploration and Exploitation Induced Risks" (SHEER). Here we present results of SHEER project investigations of such seismicity from Oklahoma and of a proxy of such seismicity - IIS data from The Geysers geothermal field. We attempt to answer to the following questions: • Do IIS earthquakes follow the Gutenberg-Richter distribution law, so that the magnitude distribution can be modelled by an exponential distribution? • Is the occurrence process of IIS earthquakes Poissonian? Is it segmentally Poissonian? If yes, how are these segments linked to cycles of technological operations? Statistical tests indicate that the Gutenberg-Richter relation born exponential distribution model for magnitude is, in general, inappropriate. 
The magnitude distribution can be complex and multimodal, with no ready-to-use functional model. We therefore recommend using non-parametric kernel estimators of the magnitude distribution in hazard analyses. The earthquake occurrence process of IIS is not a Poisson process. When earthquake occurrences are influenced by a multitude of inducing factors, the interevent time distribution can be modelled by the Weibull distribution, supporting a negative ageing property of the process. When earthquake occurrences are due to a specific injection activity, the earthquake rate depends directly on the injection rate and responds immediately to its changes. Furthermore, this response is not limited to correlated variations of the seismic activity; it also involves significant changes in the shape of the interevent time distribution. Unlike the event rate, the shape of the magnitude distribution does not exhibit correlation with the injection rate. This work was supported within the SHEER ("Shale Gas Exploration and Exploitation Induced Risks") project funded from the Horizon 2020 R&I Framework Programme, call H2020-LCE 16-2014-1, and within statutory activities No. 3841/E-41/S/2016 of the Ministry of Science and Higher Education of Poland.
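The Weibull interevent-time model mentioned above can be sketched as follows. This is an illustrative maximum-likelihood fit in pure Python, not the authors' code; the bisection bounds are assumptions. A fitted shape k < 1 yields a decreasing hazard rate, i.e. the negative-ageing property described in the abstract.

```python
import math

def weibull_mle(times, lo=0.05, hi=20.0, iters=80):
    """Fit Weibull shape k and scale lam to positive interevent times by
    maximum likelihood. The profile equation for the shape,
    sum(t^k ln t)/sum(t^k) - 1/k - mean(ln t) = 0, is solved by bisection."""
    logs = [math.log(t) for t in times]
    mean_log = sum(logs) / len(logs)

    def g(k):
        num = sum(t ** k * math.log(t) for t in times)
        den = sum(t ** k for t in times)
        return num / den - 1.0 / k - mean_log

    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if g(mid) > 0:          # g is increasing in k
            hi = mid
        else:
            lo = mid
    k = 0.5 * (lo + hi)
    lam = (sum(t ** k for t in times) / len(times)) ** (1.0 / k)
    return k, lam

def weibull_hazard(t, k, lam):
    """Hazard rate h(t) = (k/lam) * (t/lam)**(k-1); it decreases with t
    when k < 1 (the 'negative ageing' regime)."""
    return (k / lam) * (t / lam) ** (k - 1.0)
```

With a shape estimate below 1, the hazard of the next event declines as time since the last event grows, consistent with clustering rather than Poisson behaviour.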

  2. Assessing landslide susceptibility, hazards and sediment yield in the Río El Estado watershed, Pico de Orizaba volcano, Mexico

    NASA Astrophysics Data System (ADS)

    Legorreta Paulin, G.; Bursik, M. I.; Lugo Hubp, J.; Aceves Quesada, J. F.

    2014-12-01

This work provides an overview of the on-going research project (Grant SEP-CONACYT # 167495) from the Institute of Geography at the National Autonomous University of Mexico (UNAM) that seeks to conduct a multi-temporal landslide inventory, analyze the distribution of landslides, and characterize landforms that are prone to slope instability by using Geographic Information Systems (GIS). The study area is the Río El Estado watershed, which covers 5.2 km2 and lies on the southwestern flank of Pico de Orizaba volcano. The watershed was studied by using aerial photographs, fieldwork, and an adaptation of the Landslide Hazard Zonation Protocol of the Washington State Department of Natural Resources, USA. In total, 107 gravitational slope failures of six types were recognized: shallow landslides, debris avalanches, deep-seated landslides, debris flows, earthflows, and rock falls. The analysis divided the watershed into 12 mass-wasting landforms on which gravitational processes occur: inner gorges, headwalls, active scarps of deep-seated landslides, meanders, plains, rockfalls, non-rule-identified inner gorges, non-rule-identified headwalls, non-rule-identified converging hillslopes, and three types of hillslopes classified by gradient: low, moderate, and high. For each landform, the landslide area rate and the landslide frequency rate were calculated, as well as the overall hazard rating. The slope-stability hazard ratings range from low to very high; the overall hazard rating for this watershed was very high. Shallow slides were then selected, and the area and volume of individual landslides were retrieved from the watershed landslide inventory geodatabase to establish an empirical relationship between area and volume that takes the form of a power law. This relationship was used to estimate the total volume of landslides in the study area.
The findings are important for understanding the long-term evolution of the southwestern flank stream system of Pico de Orizaba, and may prove useful in the assessment of landslide susceptibility and hazard in volcanic terrains.
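An area-volume power law of the kind described above is typically fitted by ordinary least squares in log-log space. The sketch below uses invented coefficient values for its test data; the study's actual fitted exponent is not reproduced here.

```python
import math

def fit_power_law(areas, volumes):
    """Least-squares fit of log V = log(alpha) + gamma * log(A),
    i.e. the power law V = alpha * A**gamma."""
    xs = [math.log(a) for a in areas]
    ys = [math.log(v) for v in volumes]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    gamma = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    alpha = math.exp(my - gamma * mx)
    return alpha, gamma

def total_volume(areas, alpha, gamma):
    """Estimate total landslide volume from an inventory of areas,
    the way the abstract describes."""
    return sum(alpha * a ** gamma for a in areas)
```

Once alpha and gamma are fitted from the subset of landslides with measured volumes, `total_volume` applies the relationship to every inventoried area.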

  3. Photovoice in the workplace: A participatory method to give voice to workers to identify health and safety hazards and promote workplace change-a study of university custodians.

    PubMed

    Flum, Marian R; Siqueira, Carlos Eduardo; DeCaro, Anthony; Redway, Scott

    2010-11-01

Photovoice, a photographic participatory action research methodology, was used in a workplace setting to assess hazards that were creating extremely high injury and incident rates among university custodians and to promote conditions to eliminate or reduce those hazards. University custodians participated in a Photovoice project to identify, categorize, and prioritize occupational hazards and to discuss and propose solutions to these problems. Results were presented to management and to all custodians for further discussion. The effort was led by a worker-based, union-sponsored participatory evaluation team in partnership with a university researcher. Visual depiction of hazardous tasks and exposures among custodians and management focused primarily on improper or unsafe equipment, awkward postures, lifting hazards, and electrical hazards. The process of taking pictures and presenting them created an ongoing discussion among workers and management regarding the need for change and for process improvements, and resulted in greater interest and activity regarding occupational health among the workers. In a follow-up evaluation 1 year later, a number of hazards identified through Photovoice had been corrected, and injury rates for custodians had decreased from 39% to 26%. Photovoice can be an important tool, not only for identifying occupational hazards, but also for empowering workers to be more active around health and safety, and it may facilitate important changes in the workplace. © 2010 Wiley-Liss, Inc.

  4. The role of gender in the association between self-rated health and mortality among older adults in Santiago, Chile: A cohort study

    PubMed Central

    Moreno, Ximena; Albala, Cecilia; Lera, Lydia; Sánchez, Hugo; Fuentes-García, Alejandra; Dangour, Alan D.

    2017-01-01

Background Previous studies on the role of gender in the association between self-rated health and mortality have shown contrasting results. This study aimed to determine the importance of gender in the association between self-rated health and mortality among older people in Santiago, Chile. Methods A 10-year follow-up of 1066 people aged 60 or more, from the Chilean cohort of the Study of Health, Ageing and Well-Being. Self-rated health was assessed in face-to-face interviews through a single general question, along with socio-demographic and health status information. Cox proportional hazards and flexible parametric models for survival analyses were employed. Results By the end of follow-up, 30.7% of women and 39.4% of men had died. The adjusted hazard ratio of poor self-rated health, compared to good self-rated health, was 1.92 (95% CI 1.29–2.86). In models stratified by gender, an increased risk of mortality was observed among women who rated their health as poor (HR = 2.21, 95% CI 1.43–3.40), but not among men (HR = 1.04, 95% CI 0.58–1.86). Age was associated with mortality in both groups; for men, functional limitation and underweight were also risk factors, and obesity was a protective factor. Conclusions Compared to older women who rated their health as good, older women who rated their health as poor had a 2-fold increased risk of mortality over the subsequent 10 years. These findings stress the importance of incorporating a gender perspective into health programmes, including those focused on older people, in order to address the different elements that increase, in the long run, the risk of dying among older women and men. PMID:28719627

  5. An integrated user-friendly ArcMAP tool for bivariate statistical modeling in geoscience applications

    NASA Astrophysics Data System (ADS)

    Jebur, M. N.; Pradhan, B.; Shafri, H. Z. M.; Yusof, Z.; Tehrany, M. S.

    2014-10-01

Modeling and classification difficulties are fundamental issues in natural hazard assessment. A geographic information system (GIS) is a domain that requires users to use various tools to perform different types of spatial modeling. Bivariate statistical analysis (BSA) assists in hazard modeling. To perform this analysis, several calculations are required and the user has to transfer data from one format to another. Most researchers perform these calculations manually by using Microsoft Excel or other programs. This process is time consuming and carries a degree of uncertainty. The lack of proper tools to implement BSA in a GIS environment prompted this study. In this paper, a user-friendly tool for the BSA technique, BSM (bivariate statistical modeler), is proposed. Three popular BSA techniques, namely the frequency ratio, weights-of-evidence, and evidential belief function models, are implemented in the newly proposed ArcMAP tool. The tool is programmed in Python and provides a simple graphical user interface, which facilitates the improvement of model performance. The proposed tool implements BSA automatically, thus allowing numerous variables to be examined. To validate the capability and accuracy of this program, a pilot test area in Malaysia was selected and all three models were tested by using the proposed program. The area under the curve is used to measure the success rate and prediction rate. Results demonstrate that the proposed program executes BSA with reasonable accuracy. The proposed BSA tool can be used in numerous applications, such as natural hazard, mineral potential, hydrological, and other engineering and environmental applications.
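Of the three techniques the tool implements, the frequency ratio is the simplest to illustrate: for each class of a causative factor, it is the proportion of landslide (or other hazard) occurrences falling in that class divided by the proportion of the study area the class occupies. The sketch below is a generic illustration with invented pixel counts, not code from the BSM tool.

```python
def frequency_ratio(class_pixels, hazard_pixels):
    """Frequency ratio per factor class.
    class_pixels:  dict class -> total pixels of the class in the study area
    hazard_pixels: dict class -> hazard-occurrence pixels in the class
    FR > 1 marks a class more susceptible than the area average."""
    total_pix = sum(class_pixels.values())
    total_haz = sum(hazard_pixels.values())
    fr = {}
    for cls, n in class_pixels.items():
        h = hazard_pixels.get(cls, 0)
        fr[cls] = (h / total_haz) / (n / total_pix) if h else 0.0
    return fr
```

Summing the FR values of the classes a pixel belongs to, over all factors, gives the susceptibility index that the AUC validation then scores.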

  6. An integrated user-friendly ArcMAP tool for bivariate statistical modelling in geoscience applications

    NASA Astrophysics Data System (ADS)

    Jebur, M. N.; Pradhan, B.; Shafri, H. Z. M.; Yusoff, Z. M.; Tehrany, M. S.

    2015-03-01

Modelling and classification difficulties are fundamental issues in natural hazard assessment. A geographic information system (GIS) is a domain that requires users to use various tools to perform different types of spatial modelling. Bivariate statistical analysis (BSA) assists in hazard modelling. To perform this analysis, several calculations are required and the user has to transfer data from one format to another. Most researchers perform these calculations manually by using Microsoft Excel or other programs. This process is time-consuming and carries a degree of uncertainty. The lack of proper tools to implement BSA in a GIS environment prompted this study. In this paper, a user-friendly tool for the BSA technique, the bivariate statistical modeler (BSM), is proposed. Three popular BSA techniques, namely the frequency ratio, weight-of-evidence (WoE), and evidential belief function (EBF) models, are implemented in the newly proposed ArcMAP tool. The tool is programmed in Python and provides a simple graphical user interface (GUI), which facilitates the improvement of model performance. The proposed tool implements BSA automatically, thus allowing numerous variables to be examined. To validate the capability and accuracy of this program, a pilot test area in Malaysia was selected and all three models were tested by using the proposed program. The area under the curve (AUC) is used to measure the success rate and prediction rate. Results demonstrate that the proposed program executes BSA with reasonable accuracy. The proposed BSA tool can be used in numerous applications, such as natural hazard, mineral potential, hydrological, and other engineering and environmental applications.

  7. Preliminary Seismic Probabilistic Tsunami Hazard Map for Italy

    NASA Astrophysics Data System (ADS)

    Lorito, Stefano; Selva, Jacopo; Basili, Roberto; Grezio, Anita; Molinari, Irene; Piatanesi, Alessio; Romano, Fabrizio; Tiberti, Mara Monica; Tonini, Roberto; Bonini, Lorenzo; Michelini, Alberto; Macias, Jorge; Castro, Manuel J.; González-Vida, José Manuel; de la Asunción, Marc

    2015-04-01

We present a preliminary release of the first seismic probabilistic tsunami hazard map for Italy. The map aims to become an important tool for the Italian Department of Civil Protection (DPC), as well as a support tool for the NEAMTWS Tsunami Service Provider, the Centro Allerta Tsunami (CAT) at INGV, Rome. The map shows the offshore maximum tsunami elevation expected for several average return periods. Both crustal and subduction earthquakes are considered. The probability for each scenario (location, depth, mechanism, source size, magnitude and temporal rate) is defined on a uniform grid covering the entire Mediterranean for crustal earthquakes and on the plate interface for subduction earthquakes. Activity rates are assigned from seismic catalogues and based on a tectonic regionalization of the Mediterranean area. The methodology explores the associated aleatory uncertainty through the innovative application of an event tree. The main sources of epistemic uncertainty are also addressed, although in a preliminary way. The whole procedure relies on a database of pre-calculated Gaussian-shaped Green's functions for the sea level elevation, to be used also as a real-time hazard assessment tool by CAT. Tsunami simulations are performed using the non-linear shallow water multi-GPU code HySEA over a 30 arcsec bathymetry (from the SRTM30+ dataset), and the maximum elevations are stored at the 50-m isobath and then extrapolated to a depth of 1 m using Green's law. This work is partially funded by project ASTARTE - Assessment, Strategy And Risk Reduction for Tsunamis in Europe - FP7-ENV2013 6.4-3, Grant 603839, and by the Italian flagship project RITMARE.
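Green's law, used above to carry the stored 50-m-isobath elevations into shallower water, scales wave amplitude with the fourth root of the depth ratio. A one-line sketch:

```python
def greens_law_amplitude(eta1, h1, h2):
    """Green's law shoaling: eta2 = eta1 * (h1 / h2) ** 0.25,
    where eta1 is the amplitude at depth h1 and eta2 at depth h2."""
    return eta1 * (h1 / h2) ** 0.25
```

Extrapolating from the 50-m isobath to 1-m depth therefore amplifies the stored elevation by 50**0.25, roughly a factor of 2.66.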

  8. Reduced Risk of Importing Ebola Virus Disease because of Travel Restrictions in 2014: A Retrospective Epidemiological Modeling Study.

    PubMed

    Otsuki, Shiori; Nishiura, Hiroshi

An epidemic of Ebola virus disease (EVD) from 2013-16 posed a serious risk of global spread during its early growth phase. A post-epidemic evaluation of the effectiveness of travel restrictions has yet to be conducted. The present study aimed to estimate the effectiveness of travel restrictions in reducing the risk of importation from mid-August to September 2014, using a simple hazard-based statistical model. The hazard rate was modeled as an inverse function of the effective distance, an excellent predictor of disease spread, which was calculated from the airline transportation network. By analyzing datasets of the date of EVD case importation from the 15th of July to the 15th of September 2014, and assuming that the network structure changed from the 8th of August 2014 because of travel restrictions, parameters that characterized the hazard rate were estimated. The absolute and relative risk reductions due to travel restrictions were estimated to be less than 1% and about 20%, respectively, for all models tested. Effectiveness estimates among African countries were greater than those for countries outside Africa. The travel restrictions were not effective enough to prevent the global spread of Ebola virus disease. It is more efficient to control the spread of disease locally during an early phase of an epidemic than to attempt to control the epidemic at international borders. Capacity building for local containment and coordinated and expedited international cooperation are essential to reduce the risk of global transmission.
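The hazard model described above can be sketched as follows. The effective-distance formula for a single mobility link follows Brockmann and Helbing's definition (d = 1 - ln p); the proportionality constant and the exponential importation-risk form below are illustrative assumptions, not the paper's fitted model.

```python
import math

def effective_distance(p):
    """Effective distance of a link traversed with mobility probability p
    (0 < p <= 1): d_eff = 1 - ln(p). Rarely used connections are 'far'."""
    return 1.0 - math.log(p)

def importation_risk(d_eff, c, t_days):
    """Constant hazard rate lambda = c / d_eff (inverse effective distance,
    c is an illustrative constant); probability of at least one imported
    case by time t under an exponential waiting-time model."""
    lam = c / d_eff
    return 1.0 - math.exp(-lam * t_days)
```

Travel restrictions enter such a model by shrinking link probabilities p, which lengthens effective distances and lowers the importation hazard.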

  9. Reduced Risk of Importing Ebola Virus Disease because of Travel Restrictions in 2014: A Retrospective Epidemiological Modeling Study

    PubMed Central

    Otsuki, Shiori

    2016-01-01

Background An epidemic of Ebola virus disease (EVD) from 2013–16 posed a serious risk of global spread during its early growth phase. A post-epidemic evaluation of the effectiveness of travel restrictions has yet to be conducted. The present study aimed to estimate the effectiveness of travel restrictions in reducing the risk of importation from mid-August to September 2014, using a simple hazard-based statistical model. Methodology/Principal Findings The hazard rate was modeled as an inverse function of the effective distance, an excellent predictor of disease spread, which was calculated from the airline transportation network. By analyzing datasets of the date of EVD case importation from the 15th of July to the 15th of September 2014, and assuming that the network structure changed from the 8th of August 2014 because of travel restrictions, parameters that characterized the hazard rate were estimated. The absolute and relative risk reductions due to travel restrictions were estimated to be less than 1% and about 20%, respectively, for all models tested. Effectiveness estimates among African countries were greater than those for countries outside Africa. Conclusions The travel restrictions were not effective enough to prevent the global spread of Ebola virus disease. It is more efficient to control the spread of disease locally during an early phase of an epidemic than to attempt to control the epidemic at international borders. Capacity building for local containment and coordinated and expedited international cooperation are essential to reduce the risk of global transmission. PMID:27657544

  10. The effect of performance feedback on drivers' hazard perception ability and self-ratings.

    PubMed

    Horswill, Mark S; Garth, Megan; Hill, Andrew; Watson, Marcus O

    2017-04-01

Drivers' hazard perception ability has been found to predict crash risk, and novice drivers appear to be particularly poor at this skill. This competency appears to develop only slowly with experience, which could partially be a result of poor-quality performance feedback. We report an experiment in which we provided high-quality artificial feedback on individual drivers' performance in a validated video-based hazard perception test via either: (1) a graph-based comparison of hazard perception response times between the test-taker, the average driver, and an expert driver; (2) a video-based comparison between the same groups; or (3) both. All three types of feedback resulted in both an improvement in hazard perception performance and a reduction in self-rated hazard perception skill, compared with a no-feedback control group. Video-based and graph-based feedback combined resulted in a greater improvement in hazard perception performance than either of the individual components, which did not differ from one another. All three types of feedback eliminated participants' self-enhancement bias for hazard perception skill. Participants in both conditions involving video feedback judged their intervention significantly more likely to improve their real-world driving than did participants in the no-feedback control group. While all three forms of feedback had some value, the combined video and graph feedback intervention appeared to be the most effective across all outcome measures. Copyright © 2017 Elsevier Ltd. All rights reserved.

  11. Butt rot defect and potential hazard in lodgepole pine on selected California recreational areas

    Treesearch

    Lee A. Paine

    1966-01-01

    Within the area sampled, potentially hazardous lodgepole pine were common on recreational sites. The incidence of decayed and mechanically weak trees was correlated with fire damage. Two-thirds of fire-scarred trees were decayed; one-third were rated potentially hazardous. Fire scars occurred roughly in proportion to level of plot recreational use.

  12. The Role of Environmental Hazard in Mothers' Beliefs about Appropriate Supervision

    ERIC Educational Resources Information Center

    Damashek, Amy; Borduin, Charles; Ronis, Scott

    2014-01-01

    Understanding factors that influence mothers' beliefs about appropriate levels of supervision for their children may assist in efforts to reduce child injury rates. This study examined the interaction of child (i.e. age, gender, and injury risk behavior) and maternal perception of environmental hazard (i.e. hazard level, injury likelihood,…

  13. 77 FR 43002 - Hazardous Waste Management System: Identification and Listing of Hazardous Waste Amendment

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-23

    ... Subjects in 40 CFR Part 261 Environmental protection, Hazardous waste, Recycling, and Reporting and... a maximum annual rate of 200 cubic yards per year must be disposed in a lined Subtitle D landfill... forth in paragraph 1, Phillips 66 can dispose of the processed sludge in a lined Subtitle D landfill...

  14. 75 FR 9647 - National Emission Standards for Hazardous Air Pollutants for Reciprocating Internal Combustion...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-03

    ...EPA is promulgating national emission standards for hazardous air pollutants for existing stationary compression ignition reciprocating internal combustion engines that either are located at area sources of hazardous air pollutant emissions or that have a site rating of less than or equal to 500 brake horsepower and are located at major sources of hazardous air pollutant emissions. In addition, EPA is promulgating national emission standards for hazardous air pollutants for existing non-emergency stationary compression ignition engines greater than 500 brake horsepower that are located at major sources of hazardous air pollutant emissions. Finally, EPA is revising the provisions related to startup, shutdown, and malfunction for the engines that were regulated previously by these national emission standards for hazardous air pollutants.

  15. Natural hazard fatalities in Switzerland from 1946 to 2015

    NASA Astrophysics Data System (ADS)

    Andres, Norina; Badoux, Alexandre; Techel, Frank

    2017-04-01

Switzerland, located in the middle of the Alps, is prone to several different natural hazards which regularly cause fatalities. To explore temporal trends as well as demographic and spatial patterns in the number of natural hazard fatalities, a database comprising all natural hazard events causing fatalities was compiled for the years 1946 to 2015. The new database includes avalanche, flood, lightning, windstorm, landslide, debris flow, rockfall, earthquake and ice avalanche processes. Two existing databases were incorporated and the resulting dataset was extended by a comprehensive newspaper search. In total, the database contains 635 natural hazard events causing 1023 fatalities. The database does not include victims who deliberately exposed themselves to significant danger (e.g. in high-risk sports). The most common causes of death were snow avalanches (37%), followed by lightning (16%), floods (12%), windstorms (10%), rockfall (8%), landslides (7%) and other processes (9%). On average, about 14.6 fatalities occurred each year. A distinct decrease in natural hazard fatalities was observed over the last 70 years, mostly due to the decline in the number of avalanche and lightning fatalities. Thus, nearly three times as many people were killed by natural hazard processes from 1946 to 1980 as from 1981 to 2015. Normalisation of the fatality data by population resulted in a clearly declining annual crude mortality rate: 3.9 deaths per million persons for the first 35 years and 1.1 deaths per million persons for the second 35 years of the study period. The average age of the victims was approximately 36 years, and about 75% were males. Most people were killed in summer (JJA, 42%) and winter (DJF, 32%). Furthermore, almost two-thirds of the fatalities took place in the afternoon and evening. The spatial distribution of the natural hazard fatalities over Switzerland was quite homogeneous.
However, mountainous parts of the country (Prealps, Alps) were somewhat more prone to fatal events than the Swiss Plateau and the Jura. The overall natural hazard mortality rate in Switzerland over the past 70 years appears to have been relatively low in comparison to rates in other countries or to rates of other types of fatal accidents in Switzerland. Nevertheless, the collected data provide a valuable basis for analysis and help authorities to better identify higher-risk demographic groups and regions, and accordingly target these to reduce the number of victims.
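The crude mortality rate normalisation used above is simple arithmetic: deaths divided by person-years, scaled to one million inhabitants. The fatality count and mean population in the example are rough illustrative assumptions (roughly three-quarters of the 1023 deaths fell in the first half of the period, and mid-century Switzerland had on the order of 5-6 million inhabitants), not figures quoted by the study.

```python
def deaths_per_million_per_year(fatalities, mean_population, years):
    """Crude mortality rate: fatalities per million inhabitants per year."""
    return fatalities / (mean_population * years) * 1e6

# Illustrative assumptions: ~767 deaths in 1946-1980, mean population ~5.6 M.
rate_first_half = deaths_per_million_per_year(767, 5.6e6, 35)
```

With these assumed inputs the rate comes out near the 3.9 deaths per million per year reported for the first 35 years.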

  16. Yonsei nomogram: A predictive model of new-onset chronic kidney disease after on-clamp partial nephrectomy in patients with T1 renal tumors.

    PubMed

    Abdel Raheem, Ali; Shin, Tae Young; Chang, Ki Don; Santok, Glen Denmer R; Alenzi, Mohamed Jayed; Yoon, Young Eun; Ham, Won Sik; Han, Woong Kyu; Choi, Young Deuk; Rha, Koon Ho

    2018-06-19

To develop a predictive nomogram for chronic kidney disease-free survival probability in the long term after partial nephrectomy. A retrospective analysis was carried out of 698 patients with T1 renal tumors undergoing partial nephrectomy at a tertiary academic institution. A multivariable Cox regression analysis was carried out based on parameters proven to have an impact on postoperative renal function. Patients with incomplete data, <12 months of follow-up or preoperative chronic kidney disease stage III or greater were excluded. The study end-points were to identify independent risk factors for new-onset chronic kidney disease development, as well as to construct a predictive model for chronic kidney disease-free survival probability after partial nephrectomy. The median age was 52 years, median tumor size was 2.5 cm and mean warm ischemia time was 28 min. A total of 91 patients (13.1%) developed new-onset chronic kidney disease at a median follow-up of 60 months. The chronic kidney disease-free survival rates at 1, 3, 5 and 10 years were 97.1%, 94.4%, 85.3% and 70.6%, respectively. On multivariable Cox regression analysis, age (hazard ratio 1.041, P = 0.001), male sex (hazard ratio 1.653, P < 0.001), diabetes mellitus (hazard ratio 1.921, P = 0.046), tumor size (hazard ratio 1.331, P < 0.001) and preoperative estimated glomerular filtration rate (hazard ratio 0.937, P < 0.001) were independent predictors of new-onset chronic kidney disease. The C-index for chronic kidney disease-free survival was 0.853 (95% confidence interval 0.815-0.895). We developed a novel nomogram for predicting the 5-year chronic kidney disease-free survival probability after on-clamp partial nephrectomy. This model might play an important role in partial nephrectomy decision-making and in follow-up planning after surgery. External validation of our nomogram in a larger cohort of patients should be considered. © 2018 The Japanese Urological Association.
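Nomogram predictions of this kind come from the Cox model's survival function, S(t | x) = S0(t)^exp(beta . x). The sketch below shows that generic formula; the baseline survival and coefficients in the test are hypothetical placeholders, not the fitted Yonsei values.

```python
import math

def cox_survival(baseline_survival_t, betas, covariates):
    """Cox proportional hazards survival at time t for covariate vector x:
    S(t | x) = S0(t) ** exp(sum_i beta_i * x_i)."""
    linear_predictor = sum(b * x for b, x in zip(betas, covariates))
    return baseline_survival_t ** math.exp(linear_predictor)
```

With all covariates at their reference values (linear predictor 0), the prediction equals the baseline survival; a positive linear predictor, e.g. from male sex or diabetes in the model above, lowers the predicted disease-free survival.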

  17. Failure Time Distributions: Estimates and Asymptotic Results.

    DTIC Science & Technology

    1980-01-01

A parametric family of distributions is proposed for approximating life distributions whose hazard rate is bathtub-shaped, and the limiting distributions of the models are studied. Because of this generality, the possible limit laws for the maximum form a very large family.
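A common way to obtain a bathtub-shaped hazard rate, sketched here purely as an illustration and not as the report's actual parametric family, is to add a decreasing Weibull hazard (infant mortality, shape < 1) to an increasing one (wear-out, shape > 1).

```python
def weibull_hazard(t, shape, scale):
    """Weibull hazard rate h(t) = (k/lam) * (t/lam)**(k-1)."""
    return (shape / scale) * (t / scale) ** (shape - 1.0)

def bathtub_hazard(t):
    """Additive competing-risks hazard: a decreasing early-failure term
    (shape 0.5) plus an increasing wear-out term (shape 3.0). The
    parameter values are illustrative only."""
    return weibull_hazard(t, 0.5, 1.0) + weibull_hazard(t, 3.0, 10.0)
```

The sum is high at small t, passes through a flat useful-life trough, and rises again at large t, which is the bathtub shape.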

  18. Executive Function and Remission of Geriatric Depression: The Role of Semantic Strategy

    PubMed Central

    Morimoto, Sarah Shizuko; Gunning, Faith M.; Murphy, Christopher F.; Kanellopoulos, Dora; Kelly, Robert E.; Alexopoulos, George S.

    2013-01-01

BACKGROUND This study tested the hypothesis that use of a semantic organizational strategy in approaching the Mattis Dementia Rating Scale (MDRS) Complex Verbal Initiation/Perseveration (I/P) task, a test of semantic fluency, is the function specifically associated with remission of late-life depression. METHOD 70 elders with major depression participated in a 12-week escitalopram treatment trial. Neuropsychological performance was assessed at baseline after a 2-week drug washout period. Patients with a Hamilton Depression Rating Scale score less than or equal to 7 for two consecutive weeks and who no longer met DSM-IV criteria were considered remitted. Cox proportional hazards survival analysis was used to examine the relationship between subtests of the I/P, other neuropsychological domains and remission rate. Participants' performance on the Complex Verbal I/P was coded for perseverations and use of semantic strategy. RESULTS The relationship between performance on the Complex Verbal I/P and remission rate was significant. No other subtest of the MDRS I/P evidenced this association. There was no significant relationship of speed, confrontation naming, verbal memory or perseveration with remission rate. Remitters' use of verbal strategy was significantly greater than that of non-remitters. CONCLUSIONS Geriatric depressed patients who showed decrements in performance on a semantic fluency task had poorer remission rates than those who performed adequately on this measure. Executive impairment in verbal strategy explained performance. This finding supports the concept that executive functioning exerts a "top down" effect on other basic cognitive processes, perhaps as a result of the frontostriatal network dysfunction implicated in geriatric depression. PMID:20808124

  19. Long-term volcanic hazard forecasts based on Somma-Vesuvio past eruptive activity

    NASA Astrophysics Data System (ADS)

    Lirer, Lucio; Petrosino, Paola; Alberico, Ines; Postiglione, Immacolata

    2001-02-01

Distributions of pyroclastic deposits from the main explosive events at Somma-Vesuvio during the 8,000-year B.P.-A.D. 1906 time-span have been analysed to provide maps of volcanic hazard for long-term eruption forecasting. In order to define hazard ratings, the spatial distributions and loads (kg/m2) exerted by the fall deposits on the roofs of buildings have been considered; a load higher than 300 kg/m2 is defined as destructive. The load/frequency relationship (frequency being the number of times that an area has been impacted by the deposition of fall deposits) is considered a suitable parameter for differentiating areas according to hazard rating. Using past fall deposit distributions as the basis for future eruptive scenarios, the total area that could be affected by the products of a future Vesuvio explosive eruption is 1,500 km2. The perivolcanic area (274 km2) has the greatest hazard rating because it could be buried by pyroclastic flow deposits from 0.5 m up to several tens of metres in thickness. Currently, the perivolcanic area also carries the highest risk because of the high exposed value, mainly arising from the high population density.
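The 300 kg/m2 destructive-load threshold can be related to fall-deposit thickness through the bulk density of the tephra. The density value below is a typical assumption for loose fall deposits, not a figure taken from the paper.

```python
TEPHRA_DENSITY = 1200.0   # kg/m3, assumed typical bulk density (illustrative)
DESTRUCTIVE_LOAD = 300.0  # kg/m2, destructive threshold used in the study

def roof_load(thickness_m, density=TEPHRA_DENSITY):
    """Static load (kg/m2) exerted on a flat roof by a fall deposit:
    load = thickness * bulk density."""
    return thickness_m * density

def is_destructive(thickness_m, density=TEPHRA_DENSITY):
    """True if the deposit load reaches the destructive threshold."""
    return roof_load(thickness_m, density) >= DESTRUCTIVE_LOAD
```

Under this assumed density, a deposit of about 0.25 m already reaches the destructive threshold, which is why even moderate fall thicknesses matter for the hazard rating.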

  20. Identification of emergent off-nominal operational requirements during conceptual architecting of the more electric aircraft

    NASA Astrophysics Data System (ADS)

    Armstrong, Michael James

Increases in power demands and changes in the design practices of overall equipment manufacturers have led to a new paradigm in vehicle systems definition. The development of unique power systems architectures is of increasing importance to overall platform feasibility and must be pursued early in the aircraft design process. Many vehicle systems architecture trades must be conducted concurrently with platform definition. Given the increased complexity introduced during conceptual design, accurate predictions of unit-level sizing requirements must be made, and architecture-specific emergent requirements must be identified that arise from the complex integrated effect of unit behaviors. Off-nominal operating scenarios present sizing-critical requirements to the aircraft vehicle systems. These requirements are architecture specific and emergent. Standard, heuristically defined failure mitigation is sufficient for sizing traditional and evolutionary architectures. However, architecture concepts that vary significantly in structure and composition require that unique failure mitigation strategies be defined for accurate estimation of unit-level requirements. Identifying these off-nominal emergent operational requirements requires extensions to traditional safety and reliability tools and the systematic identification of optimal performance degradation strategies. The discrete operational constraints posed by traditional Functional Hazard Assessment (FHA) are replaced by continuous relationships between function loss and operational hazard. These relationships pose the objective function for hazard minimization. Load shedding optimization is performed for all statistically significant failures by varying the allocation of functional capability throughout the vehicle systems architecture.
Expressing hazards, and thereby reliability requirements, as continuous relationships with the magnitude and duration of functional failure requires augmentations to the traditional means for system safety assessment (SSA). The traditional two-state, discrete system reliability assessment proves insufficient. Reliability is, therefore, handled in an analog fashion: as a function of magnitude of failure and failure duration. A series of metrics is introduced to characterize system performance in terms of analog hazard probabilities. These include analog and cumulative system and functional risk, hazard correlation, and extensions to the traditional component importance metrics. Continuous FHA, load shedding optimization, and analog SSA constitute the SONOMA process (Systematic Off-Nominal Requirements Analysis). Analog system safety metrics inform both architecture optimization (changes in unit level capability and reliability) and architecture augmentation (changes in architecture structure and composition). This process was applied for two vehicle systems concepts (conventional and 'more-electric') in terms of loss/hazard relationships with varying degrees of fidelity. Application of this process shows that the traditional assumptions regarding the structure of the function loss vs. hazard relationship apply undue design bias to functions and components during exploratory design. This bias is illustrated in terms of inaccurate estimations of the system and function level risk and unit level importance. It was also shown that off-nominal emergent requirements must be defined specific to each architecture concept. Quantitative comparisons of architecture specific off-nominal performance were obtained which provide evidence of the need for accurate definition of load shedding strategies during architecture exploratory design.
Formally expressing performance degradation strategies in terms of the minimization of a continuous hazard space enhances the system architect's ability to accurately predict sizing-critical emergent requirements concurrent to architecture definition. Furthermore, the methods and frameworks generated here provide a structured and flexible means for eliciting these architecture specific requirements during the performance of architecture trades.

  1. Assessing hail risk for a building portfolio by generating stochastic events

    NASA Astrophysics Data System (ADS)

    Nicolet, Pierrick; Choffet, Marc; Demierre, Jonathan; Imhof, Markus; Jaboyedoff, Michel; Nguyen, Liliane; Voumard, Jérémie

    2015-04-01

Among the natural hazards affecting buildings, hail is one of the most costly and is nowadays a major concern for building insurance companies. In Switzerland, several costly events were reported in recent years, among them the July 2011 event, which cost the Aargauer public insurance company (north-western Switzerland) around 125 million EUR. This study presents new developments in a stochastic model that aims at evaluating the risk for a building portfolio. Thanks to insurance and meteorological radar data of the 2011 Aargauer event, vulnerability curves are proposed by comparing the damage rate to the radar intensity (i.e. the maximum hailstone size reached during the event, deduced from the radar signal). From these data, vulnerability is defined by a two-step process. The first step defines the probability for a building to be affected (i.e. to claim damages), while the second, if the building is affected, attributes a damage rate to the building from a probability distribution specific to the intensity class. To assess the risk, stochastic events are then generated by summing a set of Gaussian functions with 6 random parameters (X and Y location, maximum hailstone size, standard deviation, eccentricity and orientation). The location of these functions is constrained by a general event shape and by the position of the previously defined functions of the same event. For each generated event, the total cost is calculated in order to obtain a distribution of event costs. The general event parameters (shape, size, …) as well as the distribution of the Gaussian parameters are inferred from two radar intensity maps, namely that of the aforementioned event and a second from an event which occurred in 2009. After a large number of simulations, the hailstone size distribution obtained in different regions is compared to the distribution inferred from pre-existing hazard maps, built from a larger set of radar data.
The simulation parameters are then adjusted by trial and error in order to best reproduce the expected distributions. The mean annual risk obtained using the model is also compared to the mean annual risk calculated directly from the hazard maps. According to the first results, the return period of an event inducing a total damage cost equal to or greater than 125 million EUR for the Aargauer insurance company would be around 10 to 40 years.
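The event-generation step described above (summing Gaussian functions with 6 random parameters) can be sketched as follows; the parameter ranges, grid, and number of cells are hypothetical placeholders, not the study's calibrated values:

```python
import numpy as np

rng = np.random.default_rng(42)

def gaussian_cell(X, Y, x0, y0, h_max, sigma, ecc, theta):
    """Anisotropic 2-D Gaussian footprint of one hail cell (hailstone-size field)."""
    ct, st = np.cos(theta), np.sin(theta)
    xr = (X - x0) * ct + (Y - y0) * st          # rotate into the cell's frame
    yr = -(X - x0) * st + (Y - y0) * ct
    return h_max * np.exp(-0.5 * ((xr / sigma) ** 2 + (yr / (sigma * ecc)) ** 2))

def simulate_event(n_cells=8, extent=50.0, n_grid=200):
    """Stochastic event = sum of Gaussian functions with 6 random parameters each."""
    x = np.linspace(0.0, extent, n_grid)
    X, Y = np.meshgrid(x, x)
    field = np.zeros_like(X)
    for _ in range(n_cells):
        field += gaussian_cell(
            X, Y,
            x0=rng.uniform(0, extent),          # X location (km)
            y0=rng.uniform(0, extent),          # Y location (km)
            h_max=rng.uniform(1.0, 6.0),        # maximum hailstone size (cm)
            sigma=rng.uniform(1.0, 5.0),        # standard deviation (km)
            ecc=rng.uniform(0.3, 1.0),          # eccentricity
            theta=rng.uniform(0.0, np.pi),      # orientation (rad)
        )
    return field

field = simulate_event()
print(field.shape)
```

In the full model each simulated field would then be intersected with the building portfolio and passed through the two-step vulnerability model to yield one sample of the event-cost distribution.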

  2. Geospatial assessment of ecological functions and flood-related risks on floodplains along major rivers in the Puget Sound Basin, Washington

    USGS Publications Warehouse

    Konrad, Christopher P.

    2015-01-01

Ecological functions and flood-related risks were assessed for floodplains along the 17 major rivers flowing into Puget Sound Basin, Washington. The assessment addresses five ecological functions and five components of flood-related risk at two spatial resolutions, fine and coarse. The fine-resolution assessment compiled spatial attributes of floodplains from existing, publicly available sources and integrated the attributes into 10-meter rasters for each function, hazard, or exposure. The raster values generally represent different types of floodplains with regard to each function, hazard, or exposure rather than the degree of function, hazard, or exposure. The coarse-resolution assessment tabulates attributes from the fine-resolution assessment for larger floodplain units, which are floodplains associated with 0.1- to 21-kilometer-long segments of major rivers. The coarse-resolution assessment also derives indices that can be used to compare function or risk among different floodplain units and to develop normative (based on observed distributions) standards. The products of the assessment are available online as geospatial datasets (Konrad, 2015; http://dx.doi.org/10.5066/F7DR2SJC).

  3. Testing healthy immigrant effects among late life immigrants in the United States: using multiple indicators.

    PubMed

    Choi, Sunha H

    2012-04-01

    This study tested a healthy immigrant effect (HIE) and postimmigration health status changes among late life immigrants. Using three waves of the Second Longitudinal Study of Aging (1994-2000) and the linked mortality file through 2006, this study compared (a) chronic health conditions, (b) longitudinal trajectories of self-rated health, (c) longitudinal trajectories of functional impairments, and (d) mortality between three groups (age 70+): (i) late life immigrants with less than 15 years in the United States (n = 133), (ii) longer term immigrants (n = 672), and (iii) U.S.-born individuals (n = 8,642). Logistic and Poisson regression, hierarchical generalized linear modeling, and survival analyses were conducted. Late life immigrants were less likely to suffer from cancer, had lower numbers of chronic conditions at baseline, and displayed lower hazards of mortality during the 12-year follow-up. However, their self-rated health and functional status were worse than those of their counterparts over time. A HIE was only partially supported among older adults.

  4. Right ventricular function in heart failure with preserved ejection fraction: a community-based study.

    PubMed

    Mohammed, Selma F; Hussain, Imad; AbouEzzeddine, Omar F; Abou Ezzeddine, Omar F; Takahama, Hiroyuki; Kwon, Susan H; Forfia, Paul; Roger, Véronique L; Redfield, Margaret M

    2014-12-23

    The prevalence and clinical significance of right ventricular (RV) systolic dysfunction (RVD) in patients with heart failure and preserved ejection fraction (HFpEF) are not well characterized. Consecutive, prospectively identified HFpEF (Framingham HF criteria, ejection fraction ≥50%) patients (n=562) from Olmsted County, Minnesota, underwent echocardiography at HF diagnosis and follow-up for cause-specific mortality and HF hospitalization. RV function was categorized by tertiles of tricuspid annular plane systolic excursion and by semiquantitative (normal, mild RVD, or moderate to severe RVD) 2-dimensional assessment. Whether RVD was defined by semiquantitative assessment or tricuspid annular plane systolic excursion ≤15 mm, HFpEF patients with RVD were more likely to have atrial fibrillation, pacemakers, and chronic diuretic therapy. At echocardiography, patients with RVD had slightly lower left ventricular ejection fraction, worse diastolic dysfunction, lower blood pressure and cardiac output, higher pulmonary artery systolic pressure, and more severe RV enlargement and tricuspid valve regurgitation. After adjustment for age, sex, pulmonary artery systolic pressure, and comorbidities, the presence of any RVD by semiquantitative assessment was associated with higher all-cause (hazard ratio=1.35; 95% confidence interval, 1.03-1.77; P=0.03) and cardiovascular (hazard ratio=1.85; 95% confidence interval, 1.20-2.80; P=0.006) mortality and higher first (hazard ratio=1.99; 95% confidence interval, 1.35-2.90; P=0.0006) and multiple (hazard ratio=1.81; 95% confidence interval, 1.18-2.78; P=0.007) HF hospitalization rates. RVD defined by tricuspid annular plane systolic excursion values showed similar but weaker associations with mortality and HF hospitalizations. In the community, RVD is common in HFpEF patients, is associated with clinical and echocardiographic evidence of more advanced HF, and is predictive of poorer outcomes. © 2014 American Heart Association, Inc.

  5. Increased Rate of Hospitalization for Diabetes and Residential Proximity of Hazardous Waste Sites

    PubMed Central

    Kouznetsova, Maria; Huang, Xiaoyu; Ma, Jing; Lessner, Lawrence; Carpenter, David O.

    2007-01-01

Background Epidemiologic studies suggest that there may be an association between environmental exposure to persistent organic pollutants (POPs) and diabetes. Objective The aim of this study was to test the hypothesis that residential proximity to POP-contaminated waste sites results in increased rates of hospitalization for diabetes. Methods We determined the number of hospitalized patients 25–74 years of age diagnosed with diabetes in New York State exclusive of New York City for the years 1993–2000. Descriptive statistics and negative binomial regression were used to compare diabetes hospitalization rates in individuals who resided in ZIP codes containing or abutting hazardous waste sites containing POPs (“POP” sites); ZIP codes containing hazardous waste sites but with wastes other than POPs (“other” sites); and ZIP codes without any identified hazardous waste sites (“clean” sites). Results Compared with the hospitalization rates for diabetes in clean sites, the rate ratios for diabetes discharges for people residing in POP sites and “other” sites, after adjustment for potential confounders, were 1.23 [95% confidence interval (CI), 1.15–1.32] and 1.25 (95% CI, 1.16–1.34), respectively. In a subset of POP sites along the Hudson River, where there is higher income, less smoking, better diet, and more exercise, the rate ratio was 1.36 (95% CI, 1.26–1.47) compared to clean sites. Conclusions After controlling for major confounders, we found a statistically significant increase in the rate of hospitalization for diabetes among the population residing in the ZIP codes containing toxic waste sites. PMID:17366823

  6. Extended Cox regression model: The choice of time function

    NASA Astrophysics Data System (ADS)

    Isik, Hatice; Tutkun, Nihal Ata; Karasoy, Durdu

    2017-07-01

The Cox regression model (CRM), which takes into account the effect of censored observations, is one of the most widely used models in survival analysis for evaluating the effects of covariates. Proportional hazards (PH), which requires a constant hazard ratio over time, is the key assumption of the CRM. The extended CRM provides a test of the PH assumption, by including a time-dependent covariate, and serves as an alternative model in the case of nonproportional hazards. In this study, different types of real data sets are used to choose the time function, and the differences between time functions are analyzed and discussed.
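The role of the time function g(t) can be made concrete with a short sketch. In the extended model the hazard is h(t|x) = h0(t)·exp((β + γ·g(t))·x), so the hazard ratio HR(t) = exp(β + γ·g(t)) evolves however g(t) dictates. The coefficients below are invented for illustration only:

```python
import numpy as np

beta, gamma = 0.5, -0.2           # hypothetical main and time-interaction coefficients
t = np.array([0.5, 1.0, 2.0, 5.0, 10.0])

# Three common choices of time function in the extended Cox model.
time_functions = {
    "t": t,                                   # linear decay/growth of log-HR
    "log(t)": np.log(t),                      # slow drift of log-HR
    "heaviside(t-2)": (t >= 2.0).astype(float),  # step change at t = 2
}

for name, g in time_functions.items():
    hr = np.exp(beta + gamma * g)             # time-varying hazard ratio HR(t)
    print(f"g(t) = {name:>15}: HR(t) = {np.round(hr, 3)}")
```

Under PH (γ = 0) all three choices collapse to the constant HR = exp(β); a significant γ for any g(t) signals a PH violation, and the fitted shapes differ by choice of g(t), which is exactly the comparison the study makes.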

  7. Hypermetabolism is a deleterious prognostic factor in patients with amyotrophic lateral sclerosis.

    PubMed

    Jésus, P; Fayemendy, P; Nicol, M; Lautrette, G; Sourisseau, H; Preux, P-M; Desport, J-C; Marin, B; Couratier, P

    2018-01-01

The aim of this study was to investigate patients with amyotrophic lateral sclerosis in order to determine their nutritional, neurological and respiratory parameters, and survival according to metabolic level. Nutritional assessment included resting energy expenditure (REE) measured by indirect calorimetry [hypermetabolism if REE variation (ΔREE) > 10%] and fat mass (FM) using impedancemetry. Neurological assessment included the Amyotrophic Lateral Sclerosis Functional Rating Scale-Revised score. Survival analysis used the Kaplan-Meier method and multivariate Cox model. A total of 315 patients were analysed. Median age at diagnosis was 65.9 years and 55.2% of patients were hypermetabolic. With regard to the metabolic level (ΔREE: < 10%, 10-20% and >20%), patients with ΔREE > 20% initially had a lower FM (29.7% vs. 32.1% in those with ΔREE ≤10%; P = 0.0054). During follow-up, the median slope of Amyotrophic Lateral Sclerosis Functional Rating Scale-Revised tended to worsen more in patients with ΔREE > 20% (-1.4 vs. -1.0 points/month in those with ΔREE ≤10%; P = 0.07). Overall median survival since diagnosis was 18.4 months. ΔREE > 20% tended to increase the risk of dying compared with ΔREE ≤10% (hazard ratio, 1.33; P = 0.055). In multivariate analysis, an increased REE:FM ratio was independently associated with death (hazard ratio, 1.005; P = 0.001). Hypermetabolism is present in more than half of patients with amyotrophic lateral sclerosis. It modifies the body composition at diagnosis, and patients with hypermetabolism >20% have a worse prognosis than those without hypermetabolism. © 2017 EAN.

  8. California Fault Parameters for the National Seismic Hazard Maps and Working Group on California Earthquake Probabilities 2007

    USGS Publications Warehouse

    Wills, Chris J.; Weldon, Ray J.; Bryant, W.A.

    2008-01-01

This report describes development of fault parameters for the 2007 update of the National Seismic Hazard Maps and the Working Group on California Earthquake Probabilities (WGCEP, 2007). These reference parameters are contained within a database intended to be a source of values for use by scientists interested in producing either seismic hazard or deformation models to better understand the current seismic hazards in California. These parameters include descriptions of the geometry and rates of movements of faults throughout the state. These values are intended to provide a starting point for development of more sophisticated deformation models which include known rates of movement on faults as well as geodetic measurements of crustal movement and the rates of movements of the tectonic plates. The values will be used in developing the next generation of the time-independent National Seismic Hazard Maps, and the time-dependent seismic hazard calculations being developed for the WGCEP. Due to the multiple uses of this information, development of these parameters has been coordinated between USGS, CGS and SCEC. SCEC provided the database development and editing tools, in consultation with USGS, Golden. This database has been implemented in Oracle and supports electronic access (e.g., for on-the-fly access). A GUI-based application has also been developed to aid in populating the database. Both the continually updated 'living' version of this database, as well as any locked-down official releases (e.g., used in a published model for calculating earthquake probabilities or seismic shaking hazards) are part of the USGS Quaternary Fault and Fold Database http://earthquake.usgs.gov/regional/qfaults/ . CGS has been primarily responsible for updating and editing of the fault parameters, with extensive input from USGS and SCEC scientists.

  9. Survivorship analysis when cure is a possibility: a Monte Carlo study.

    PubMed

    Goldman, A I

    1984-01-01

Parametric survivorship analyses of clinical trials commonly involve the assumption of a hazard function constant with time. When the empirical curve obviously levels off, one can modify the hazard function model by use of a Gompertz or Weibull distribution with hazard decreasing over time. Some cancer treatments are thought to cure some patients within a short time of initiation. Then, instead of all patients having the same hazard, decreasing over time, a biologically more appropriate model assumes that an unknown proportion (1 - pi) have constant high risk whereas the remaining proportion (pi) have essentially no risk. This paper discusses the maximum likelihood estimation of pi and the power curves of the likelihood ratio test. Monte Carlo studies provide results for a variety of simulated trials; empirical data illustrate the methods.
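A minimal Monte Carlo version of this mixture ("cure") model can be sketched as follows. The true values of pi and the hazard, the censoring time, and the crude grid-search maximizer are all stand-ins for illustration; a real analysis would use a proper optimizer as in the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate one trial: a fraction pi_true is cured (no risk); the remainder
# fails with constant hazard lam_true; follow-up is censored at time tau.
pi_true, lam_true, tau, n = 0.3, 0.4, 8.0, 2000
cured = rng.random(n) < pi_true
t_fail = rng.exponential(1.0 / lam_true, n)
time = np.where(cured, tau, np.minimum(t_fail, tau))
event = (~cured) & (t_fail < tau)

def neg_log_lik(pi, lam):
    # death at t:      contribution (1 - pi) * lam * exp(-lam * t)
    # censored at tau: contribution pi + (1 - pi) * exp(-lam * tau)
    ll_events = np.log((1 - pi) * lam) - lam * time[event]
    ll_censored = np.log(pi + (1 - pi) * np.exp(-lam * tau)) * np.sum(~event)
    return -(ll_events.sum() + ll_censored)

# Grid-search MLE over (pi, lambda).
pis = np.linspace(0.01, 0.99, 99)
lams = np.linspace(0.05, 2.0, 100)
nll = np.array([[neg_log_lik(p, l) for l in lams] for p in pis])
i, j = np.unravel_index(nll.argmin(), nll.shape)
print(f"pi_hat = {pis[i]:.2f}, lambda_hat = {lams[j]:.3f}")
```

With enough patients and follow-up, the estimated cure fraction lands near the truth; the paper's power analysis asks how reliably the likelihood ratio test distinguishes this mixture from a single-hazard model.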

  10. Spaceflight Toxicology

    NASA Technical Reports Server (NTRS)

    Meyers, Valerie

    2008-01-01

    This viewgraph presentation provides a review of NASA Johnson Space Center's Toxicology program. The mission of this program is to protect crews from toxic exposures during spaceflight. The presentation reviews some of the health hazards. A toxicological hazard level chart is presented that reviews the rating of hazard level, irritancy, systemic effects and containability. The program also participates in the Lunar Airborne Dust Toxicity Advisory Group.

  11. A smoothed stochastic earthquake rate model considering seismicity and fault moment release for Europe

    NASA Astrophysics Data System (ADS)

    Hiemer, S.; Woessner, J.; Basili, R.; Danciu, L.; Giardini, D.; Wiemer, S.

    2014-08-01

We present a time-independent gridded earthquake rate forecast for the European region including Turkey. The spatial component of our model is based on kernel density estimation techniques, which we applied to both past earthquake locations and fault moment release on mapped crustal faults and subduction zone interfaces with assigned slip rates. Our forecast relies on the assumption that the locations of past seismicity are a good guide to future seismicity, and that future large-magnitude events are more likely to occur in the vicinity of known faults. We show that the optimal weighted sum of the corresponding two spatial densities depends on the magnitude range considered. The kernel bandwidths and density weighting function are optimized using retrospective likelihood-based forecast experiments. We computed earthquake activity rates (a- and b-value) of the truncated Gutenberg-Richter distribution separately for crustal and subduction seismicity based on a maximum likelihood approach that considers the spatial and temporal completeness history of the catalogue. The final annual rate of our forecast is purely driven by the maximum likelihood fit of activity rates to the catalogue data, whereas its spatial component incorporates contributions from both earthquake and fault moment-rate densities. Our model constitutes one branch of the earthquake source model logic tree of the 2013 European seismic hazard model released by the EU-FP7 project `Seismic HAzard haRmonization in Europe' (SHARE) and contributes to the assessment of epistemic uncertainties in earthquake activity rates. We performed retrospective and pseudo-prospective likelihood consistency tests to underline the reliability of our model and SHARE's area source model (ASM) using the testing algorithms applied in the Collaboratory for the Study of Earthquake Predictability (CSEP).
We comparatively tested our model's forecasting skill against the ASM and find a statistically significant improvement in performance for testing periods of 10-20 yr. The testing results suggest that our model is a viable candidate model for long-term forecasting on timescales of years to decades for the European region.

  12. Earthquake Hazard Assessment: an Independent Review

    NASA Astrophysics Data System (ADS)

    Kossobokov, Vladimir

    2016-04-01

Seismic hazard assessment (SHA), from term-less (probabilistic PSHA or deterministic DSHA) to time-dependent (t-DASH), including short-term earthquake forecast/prediction (StEF), is not an easy task: it implies a delicate application of statistics to data of limited size and different accuracy. Regretfully, in many cases of SHA, t-DASH, and StEF, the claims of a high potential and efficiency of the methodology are based on a flawed application of statistics and are hardly suitable for communication to decision makers. The necessity and possibility of applying the modified tools of Earthquake Prediction Strategies (in particular, the Error Diagram, introduced by G.M. Molchan in the early 1990s for the evaluation of SHA, and the Seismic Roulette null hypothesis as a measure of the alerted space) are evident, and such testing must be done before claiming hazardous areas and/or times. The set of errors, i.e. the rates of failure and of the alerted space-time volume, compared to those obtained in the same number of random-guess trials, permits evaluating the effectiveness of an SHA method and determining the optimal choice of its parameters in regard to specified cost-benefit functions. This and other information obtained in such testing may supply us with a realistic estimate of confidence in SHA results and related recommendations on the level of risk for decision making in regard to engineering design, insurance, and emergency management. These basics of SHA evaluation are exemplified with a few cases of misleading "seismic hazard maps", "precursors", and "forecast/prediction methods".
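The error-diagram evaluation can be sketched as: sweep an alarm threshold, and for each threshold plot the rate of failures-to-predict ν against the alerted fraction of space τ, comparing against the ν = 1 − τ random-guess diagonal. The alarm function and event catalogue below are synthetic stand-ins, so the resulting curve has no skill by construction:

```python
import numpy as np

rng = np.random.default_rng(2)

n_cells = 1000
alarm = rng.random(n_cells)                  # hypothetical alarm function per cell
event_cells = rng.integers(0, n_cells, 50)   # cells hosting the target events

taus, nus = [], []
for thr in np.sort(alarm)[::-1]:             # sweep threshold from strict to loose
    alerted = alarm >= thr
    taus.append(alerted.mean())                      # tau: alerted fraction of space
    nus.append(1.0 - alerted[event_cells].mean())    # nu: rate of failures to predict

# Random-guess baseline ("Seismic Roulette"): nu = 1 - tau on average.
# A method with real skill traces a curve significantly below that diagonal.
print(taus[-1], nus[-1])   # loosest threshold: everything alerted, no misses
```

Note the two trivial endpoints: alerting nothing gives (τ, ν) = (0, 1) and alerting everything gives (1, 0); only the distance of the intermediate curve from the diagonal, judged against random-guess trials, measures effectiveness.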

  13. Religiosity as a protective factor for hazardous drinking and drug use among sexual minority and heterosexual women: Findings from the National Alcohol Survey

    PubMed Central

    Drabble, Laurie; Trocki, Karen F.; Klinger, Jamie L.

    2016-01-01

Objective Despite research documenting disparities in risk for alcohol-related problems among sexual minority women, few studies explore potential protective factors within this population. This study examines how religiosity may function as a protective or risk factor for alcohol problems or other substance use among sexual minorities compared to heterosexuals. Method Data from 11,169 women who responded to sexual identity and sexual behavior questions from three population-based National Alcohol Survey waves (2000, 2005, 2010) were utilized for analyses of religiosity in relation to lifetime drinking, past-year hazardous drinking, and past-year drug use. Results Religiosity was significantly greater among exclusively heterosexual women compared to all sexual minority groups (lesbian, bisexual and heterosexual women who report same-sex partners). Lesbians reported the lowest rates of affiliation with religions/denominations discouraging alcohol use. Past-year hazardous drinking and use of any illicit drugs were significantly lower among exclusively heterosexual women compared to all sexual minority groups. High religiosity was associated with lifetime alcohol abstention and was found to be protective against hazardous drinking and drug use among both sexual minority and heterosexual women. Reporting religious norms unfavorable to drinking was protective against hazardous drinking among exclusively heterosexual women but not sexual minority women. Conclusions Findings reveal the importance of considering sexual minority status in the evaluation of religion or spirituality as protective among women. Future studies should explore religiosity in the context of other individual and environmental factors, such as positive identity development and community-level acceptance, which may be salient to resiliency among sexual minorities. PMID:26857897

  14. A New Lifetime Distribution with Bathtub and Unimodal Hazard Function

    NASA Astrophysics Data System (ADS)

    Barriga, Gladys D. C.; Louzada-Neto, Francisco; Cancho, Vicente G.

    2008-11-01

In this paper we propose a new lifetime distribution which accommodates bathtub-shaped, unimodal, increasing and decreasing hazard functions. Some particular cases are derived, including the standard Weibull distribution. Maximum likelihood estimation is considered for estimating the three parameters present in the model. The methodology is illustrated on a real data set of industrial devices on a life test.
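The specific three-parameter distribution is the paper's own, but the bathtub shape itself is easy to illustrate: superposing a decreasing-hazard Weibull (shape < 1, early failures) and an increasing-hazard Weibull (shape > 1, wear-out) gives a hazard that falls, bottoms out, and rises again. All parameter values below are arbitrary:

```python
import numpy as np

def weibull_hazard(t, k, lam):
    """Weibull hazard h(t) = (k/lam) * (t/lam)**(k-1)."""
    return (k / lam) * (t / lam) ** (k - 1)

t = np.linspace(0.05, 5.0, 200)

# Decreasing component (k < 1) plus increasing component (k > 1) -> bathtub.
h = weibull_hazard(t, k=0.5, lam=1.0) + weibull_hazard(t, k=3.0, lam=2.0)

i_min = h.argmin()
print(f"hazard decreases to t = {t[i_min]:.2f}, then increases (bathtub shape)")
```

A single standard Weibull can only be monotone (or constant, at k = 1), which is why distributions like the one proposed here, able to produce bathtub and unimodal hazards as well, are useful for reliability data.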

  15. Mortality among subjects with chronic obstructive pulmonary disease or asthma at two respiratory disease clinics in Ontario

    PubMed Central

    Finkelstein, Murray M; Chapman, Kenneth R; McIvor, R Andrew; Sears, Malcolm R

    2011-01-01

BACKGROUND: Chronic obstructive pulmonary disease (COPD) and asthma are common; however, mortality rates among individuals with these diseases are not well studied in North America. OBJECTIVE: To investigate mortality rates and risk factors for premature death among subjects with COPD. METHODS: Subjects were identified from the lung function testing databases of two academic respiratory disease clinics in Hamilton and Toronto, Ontario. Mortality was ascertained by linkage to the Ontario mortality registry between 1992 and 2002, inclusive. Standardized mortality ratios were computed. Poisson regression of standardized mortality ratios and proportional hazards regression were performed to examine the multivariate effect of risk factors on the standardized mortality ratios and mortality hazards. RESULTS: Compared with the Ontario population, all-cause mortality was approximately doubled among subjects with COPD, but was lower than expected among subjects with asthma. The risk of mortality in patients with COPD was related to cigarette smoking, to the presence of comorbid conditions of ischemic heart disease and diabetes, and to Global Initiative for Chronic Obstructive Lung Disease severity scores. Individuals living closer to traffic sources showed an elevated risk of death compared with those who lived further away from traffic sources. CONCLUSIONS: Mortality rates among subjects diagnosed with COPD were substantially elevated. There were several deaths attributed to asthma among subjects in the present study; however, overall, patients with asthma demonstrated lower mortality rates than the general population. Subjects with COPD need to be managed with attention devoted to both their respiratory disorders and related comorbidities. PMID:22187688

  16. Relocating San Miguel Volcanic Seismic Events for Receiver Functions and Tomographic Models

    NASA Astrophysics Data System (ADS)

    Patlan, E.; Velasco, A. A.; Konter, J.

    2009-12-01

    The San Miguel volcano lies near the city of San Miguel, El Salvador (13.43N and -88.26W). San Miguel volcano, an active stratovolcano, presents a significant natural hazard for the city of San Miguel. Furthermore, the internal state and activity of volcanoes remains an important component to understanding volcanic hazard. The main technology for addressing volcanic hazards and processes is through the analysis of data collected from the deployment of seismic sensors that record ground motion. Six UTEP seismic stations were deployed around San Miguel volcano from 2007-2008 to define the magma chamber and assess the seismic and volcanic hazard. We utilize these data to develop images of the earth structure beneath the volcano, studying the volcanic processes by identifying different sources, and investigating the role of earthquakes and faults in controlling the volcanic processes. We will calculate receiver functions to determine the thickness of San Miguel volcano internal structure, within the Caribbean plate. Crustal thicknesses will be modeled using calculated receiver functions from both theoretical and hand-picked P-wave arrivals. We will use this information derived from receiver functions, along with P-wave delay times, to map the location of the magma chamber.

  17. Risk of infective endocarditis in patients with systemic lupus erythematosus in Taiwan: a nationwide population-based study.

    PubMed

    Chang, Y S; Chang, C C; Chen, Y H; Chen, W S; Chen, J H

    2017-10-01

Objectives Patients with systemic lupus erythematosus are considered vulnerable to infective endocarditis and prophylactic antibiotics are recommended before an invasive dental procedure. However, the evidence is insufficient. This nationwide population-based study evaluated the risk and related factors of infective endocarditis in systemic lupus erythematosus. Methods We identified 12,102 systemic lupus erythematosus patients from the National Health Insurance research-oriented database, and compared the incidence rate of infective endocarditis with that among 48,408 non-systemic lupus erythematosus controls. A Cox multivariable proportional hazards model was employed to evaluate the risk of infective endocarditis in the systemic lupus erythematosus cohort. Results After a mean follow-up of more than six years, the systemic lupus erythematosus cohort had a significantly higher incidence rate of infective endocarditis (42.58 vs 4.32 per 100,000 person-years, incidence rate ratio = 9.86, p < 0.001) than that of the control cohort. The systemic lupus erythematosus cohort aged 60 years or older had a lower relative risk (adjusted hazard ratio 11.64) than the cohort younger than 60 years (adjusted hazard ratio 15.82). Cox multivariate proportional hazards analysis revealed that heart disease (hazard ratio = 5.71, p < 0.001), chronic kidney disease (hazard ratio = 2.98, p = 0.034), receiving a dental procedure within 30 days (hazard ratio = 36.80, p < 0.001), and intravenous steroid therapy within 30 days (hazard ratio = 39.59, p < 0.001) were independent risk factors for infective endocarditis in systemic lupus erythematosus patients. Conclusions A higher risk of infective endocarditis was observed in systemic lupus erythematosus patients.
Risk factors for infective endocarditis in the systemic lupus erythematosus cohort included heart disease, chronic kidney disease, steroid pulse therapy within 30 days, and a recent invasive dental procedure within 30 days.

  18. Submarine Landslide Hazards Offshore Southern Alaska: Seismic Strengthening Versus Rapid Sedimentation

    NASA Astrophysics Data System (ADS)

    Sawyer, D.; Reece, R.; Gulick, S. P. S.; Lenz, B. L.

    2017-12-01

The southern Alaskan offshore margin is prone to submarine landslides and tsunami hazards due to seismically active plate boundaries and extreme sedimentation rates from glacially enhanced mountain erosion. We examine the submarine landslide potential with new shear strength measurements acquired by Integrated Ocean Drilling Program Expedition 341 on the continental slope and Surveyor Fan. These data reveal lower than expected sediment strength. Contrary to other active margins where seismic strengthening enhances slope stability, the high-sedimentation margin offshore southern Alaska behaves like a passive margin from a shear strength perspective. We interpret that seismic strengthening occurs but is offset by high sedimentation rates and overpressure within the slope and Surveyor Fan. This conclusion is supported because shear strength follows an expected active margin profile outside of the fan, where background sedimentation rates occur. More broadly, seismically active margins with wet-based glaciers are susceptible to submarine landslide hazards because of the combination of high sedimentation rates and earthquake shaking.

  19. HMPT: Hazardous Waste Transportation Live 27928, Test 27929

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Simpson, Lewis Edward

    2016-03-17

    HMPT: Hazardous Waste Transportation (Live 27928, suggested one time and associated Test 27929, required initially and every 36 months) addresses the Department of Transportation (DOT) function-specific training requirements of the hazardous materials packagings and transportation (HMPT) Los Alamos National Laboratory (LANL) lab-wide training. This course addresses the requirements of the DOT that are unique to hazardous waste shipments. Appendix B provides the Title 40 Code of Federal Regulations (CFR) reference material needed for this course.

  20. Application-driven ground motion prediction equation for seismic hazard assessments in non-cratonic moderate-seismicity areas

    NASA Astrophysics Data System (ADS)

    Bindi, D.; Cotton, F.; Kotha, S. R.; Bosse, C.; Stromeyer, D.; Grünthal, G.

    2017-09-01

    We present a ground motion prediction equation (GMPE) for probabilistic seismic hazard assessments (PSHA) in low-to-moderate seismicity areas, such as Germany. Starting from the NGA-West2 flat-file (Ancheta et al. in Earthquake Spectra 30:989-1005, 2014), we develop a model tailored to the hazard application in terms of data selection and implemented functional form. In light of this hazard application, the GMPE is derived for hypocentral distance (along with the Joyner-Boore one), selecting recordings at sites with vs30 ≥ 360 m/s, distances within 300 km, and magnitudes in the range 3 to 8 (7.4 being the maximum magnitude for the PSHA in the target area). Moreover, the complexity of the considered functional form reflects the availability of information in the target area. The median predictions are compared with those from the NGA-West2 models and with one recent European model, using Sammon's maps constructed for different scenarios. Despite the simplification in the functional form, the assessed epistemic uncertainty in the GMPE median is of the order of that affecting the NGA-West2 models for the magnitude range of interest for the hazard application. On the other hand, the simplification of the functional form led to an increase in the apparent aleatory variability. In conclusion, the GMPE developed in this study is tailored to the needs of applications in low-to-moderate seismicity areas and for short return periods (e.g., 475 years); its application in studies where the hazard involves magnitudes above 7.4, or long return periods, is not advised.
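Since the abstract does not reproduce the fitted coefficients, the following sketch only illustrates the generic shape of such a magnitude-distance model; `C0`..`C4`, `M_REF`, and `H_EFF` are hypothetical placeholders, not the published GMPE:

```python
import math

# Illustrative ground-motion model of the common form
#   ln(Y) = c0 + c1*M + c2*(M - Mref)**2 + c3*ln(R) + c4*R
# where Y is a ground-motion intensity measure, M is magnitude and
# R an effective hypocentral distance. All coefficients below are
# hypothetical placeholders, NOT values derived in the study.

C0, C1, C2, C3, C4 = -1.5, 1.2, -0.1, -1.0, -0.003
M_REF, H_EFF = 5.0, 6.0  # reference magnitude, pseudo-depth in km (assumed)

def ln_median_gm(magnitude: float, hypo_dist_km: float) -> float:
    """Natural log of the median ground motion for one scenario."""
    r_eff = math.sqrt(hypo_dist_km ** 2 + H_EFF ** 2)  # avoids the near-source singularity
    return (C0 + C1 * magnitude + C2 * (magnitude - M_REF) ** 2
            + C3 * math.log(r_eff) + C4 * r_eff)

# Median predictions grow with magnitude and decay with distance:
near = ln_median_gm(6.0, 10.0)
far = ln_median_gm(6.0, 100.0)
print(near > far)  # True: predicted shaking decreases with distance
```

A hazard integration would evaluate a form like this over all magnitude-distance scenarios, weighted by their occurrence rates.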

  1. The Very High Premature Mortality Rate among Active Professional Wrestlers Is Primarily Due to Cardiovascular Disease

    PubMed Central

    Herman, Christopher W.; Conlon, Anna S. C.; Rubenfire, Melvyn; Burghardt, Andrew R.; McGregor, Stephen J.

    2014-01-01

    Purpose Recently, much media attention has been given to the premature deaths in professional wrestlers. Since no formal studies exist that have statistically examined the probability of premature mortality in professional wrestlers, we determined survival estimates for active wrestlers over the past quarter century to establish the factors contributing to the premature mortality of these individuals. Methods Data including cause of death were obtained from public records and wrestling publications for wrestlers who were active between January 1, 1985 and December 31, 2011. 557 males were considered consistently active wrestlers during this time period. Published 2007 mortality rates from the Centers for Disease Control were used to compare the general population to the wrestlers by age, BMI, time period, and cause of death. Survival estimates and Cox hazard regression models were fit to determine incident premature deaths and factors associated with lower survival. Cumulative incidence function (CIF) estimates given years wrestled were obtained using a competing risks model for cause of death. Results The mortality for all wrestlers over the 26-year study period was 0.007 deaths per person-year, or 708 per 100,000 per year, and 16% of deaths occurred below age 50 years. Among wrestlers, the leading cause of death based on CIF was cardiovascular-related (38%). For cardiovascular-related deaths, drug overdose-related deaths and cancer deaths, wrestler mortality rates were respectively 15.1, 122.7 and 6.4 times greater than those of males in the general population. Survival estimates from hazard models indicated that BMI is significantly associated with the hazard of death from total time wrestling (p<0.0001). Conclusion Professional wrestlers are more likely to die prematurely from cardiovascular disease compared to the general population, and morbidly obese wrestlers are especially at risk.
Results from this study may be useful for professional wrestlers, as well as wellness policy and medical care implementation. PMID:25372569
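The cause-specific cumulative incidence function (CIF) used above accounts for competing causes of death. A minimal sketch of the standard nonparametric (Aalen-Johansen) estimator, assuming no tied event times; the event data are toy values, not the wrestler cohort:

```python
# Nonparametric cause-specific cumulative incidence (Aalen-Johansen style)
# for competing causes of death. The times/causes below are toy data,
# not the wrestler cohort itself.

def cumulative_incidence(times, causes, cause_of_interest, horizon):
    """CIF at `horizon` for one cause, treating other causes as competing risks.
    `times` are event/censoring times; `causes` use 0 for censoring."""
    n = len(times)
    order = sorted(range(n), key=lambda i: times[i])
    at_risk, surv, cif = n, 1.0, 0.0
    for i in order:
        t, c = times[i], causes[i]
        if t > horizon:
            break
        if c != 0:  # any death is an event for overall survival
            if c == cause_of_interest:
                cif += surv * (1 / at_risk)  # probability of this cause at time t
            surv *= 1 - 1 / at_risk          # overall survival drops at every death
        at_risk -= 1                          # censored subjects also leave the risk set
    return cif

# Toy cohort: cause 1 = cardiovascular, cause 2 = other, 0 = censored.
times = [2, 3, 5, 7, 8, 10, 12, 15]
causes = [1, 2, 1, 0, 1, 2, 0, 0]
print(cumulative_incidence(times, causes, cause_of_interest=1, horizon=10))
```

Unlike 1 minus a Kaplan-Meier curve per cause, the cause-specific CIFs sum (with overall survival) to one, which is why they are preferred for cause-of-death analyses.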

  2. Echocardiographic aortic valve calcification and outcomes in women and men with aortic stenosis

    PubMed Central

    Thomassen, Henrik K; Cioffi, Giovanni; Gerdts, Eva; Einarsen, Eigir; Midtbø, Helga Bergljot; Mancusi, Costantino; Cramariuc, Dana

    2017-01-01

    Objective Sex differences in risk factors of aortic valve calcification (AVC) by echocardiography have not been reported from a large prospective study in aortic stenosis (AS). Methods AVC was assessed using a prognostically validated visual score and grouped into none/mild or moderate/severe AVC in 1725 men and women with asymptomatic AS in the Simvastatin Ezetimibe in Aortic Stenosis study. The severity of AS was assessed by the energy loss index (ELI) taking pressure recovery in the aortic root into account. Results More men than women had moderate/severe AVC at baseline despite less severe AS by ELI (p<0.01). Moderate/severe AVC at baseline was independently associated with lower aortic compliance and more severe AS in both sexes, and with increased high-sensitive C reactive protein (hs-CRP) only in men (all p<0.01). In Cox regression analyses, moderate/severe AVC at baseline was associated with a 2.5-fold (95% CI 1.64 to 3.80) higher hazard rate of major cardiovascular events in women, and a 2.2-fold higher hazard rate in men (95% CI 1.54 to 3.17) (both p<0.001), after adjustment for age, hypertension, study treatment, aortic compliance, left ventricular (LV) mass and systolic function, AS severity and hs-CRP. Moderate/severe AVC at baseline also predicted a 1.8-fold higher hazard rate of all-cause mortality in men (95% CI 1.04 to 3.06, p<0.05) independent of age, AS severity, LV mass and aortic compliance, but not in women. Conclusion In conclusion, AVC scored by echocardiography has sex-specific characteristics in AS. Moderate/severe AVC is associated with higher cardiovascular morbidity in both sexes, and with higher all-cause mortality in men. Trial registration number ClinicalTrials.gov identifier: NCT00092677 PMID:28698175

  3. Echocardiographic aortic valve calcification and outcomes in women and men with aortic stenosis.

    PubMed

    Thomassen, Henrik K; Cioffi, Giovanni; Gerdts, Eva; Einarsen, Eigir; Midtbø, Helga Bergljot; Mancusi, Costantino; Cramariuc, Dana

    2017-10-01

    Sex differences in risk factors of aortic valve calcification (AVC) by echocardiography have not been reported from a large prospective study in aortic stenosis (AS). AVC was assessed using a prognostically validated visual score and grouped into none/mild or moderate/severe AVC in 1725 men and women with asymptomatic AS in the Simvastatin Ezetimibe in Aortic Stenosis study. The severity of AS was assessed by the energy loss index (ELI) taking pressure recovery in the aortic root into account. More men than women had moderate/severe AVC at baseline despite less severe AS by ELI (p<0.01). Moderate/severe AVC at baseline was independently associated with lower aortic compliance and more severe AS in both sexes, and with increased high-sensitive C reactive protein (hs-CRP) only in men (all p<0.01). In Cox regression analyses, moderate/severe AVC at baseline was associated with a 2.5-fold (95% CI 1.64 to 3.80) higher hazard rate of major cardiovascular events in women, and a 2.2-fold higher hazard rate in men (95% CI 1.54 to 3.17) (both p<0.001), after adjustment for age, hypertension, study treatment, aortic compliance, left ventricular (LV) mass and systolic function, AS severity and hs-CRP. Moderate/severe AVC at baseline also predicted a 1.8-fold higher hazard rate of all-cause mortality in men (95% CI 1.04 to 3.06, p<0.05) independent of age, AS severity, LV mass and aortic compliance, but not in women. In conclusion, AVC scored by echocardiography has sex-specific characteristics in AS. Moderate/severe AVC is associated with higher cardiovascular morbidity in both sexes, and with higher all-cause mortality in men. ClinicalTrials.gov identifier: NCT00092677. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  4. The effect of sibutramine prescribing in routine clinical practice on cardiovascular outcomes: a cohort study in the United Kingdom

    PubMed Central

    Hayes, J F; Bhaskaran, K; Batterham, R; Smeeth, L; Douglas, I

    2015-01-01

    Background/Objectives: The marketing authorization for the weight loss drug sibutramine was suspended in 2010 following a major trial that showed increased rates of non-fatal myocardial infarction and cerebrovascular events in patients with pre-existing cardiovascular disease. In routine clinical practice, sibutramine was already contraindicated in patients with cardiovascular disease and so the relevance of these influential clinical trial findings to the ‘real World' population of patients receiving or eligible for the drug is questionable. We assessed rates of myocardial infarction and cerebrovascular events in a cohort of patients prescribed sibutramine or orlistat in the United Kingdom. Subjects/Methods: A cohort of patients prescribed weight loss medication was identified within the Clinical Practice Research Datalink. Rates of myocardial infarction or cerebrovascular event, and all-cause mortality were compared between patients prescribed sibutramine and similar patients prescribed orlistat, using both a multivariable Cox proportional hazard model, and propensity score-adjusted model. Possible effect modification by pre-existing cardiovascular disease and cardiovascular risk factors was assessed. Results: Patients prescribed sibutramine (N=23 927) appeared to have an elevated rate of myocardial infarction or cerebrovascular events compared with those taking orlistat (N=77 047; hazard ratio 1.69, 95% confidence interval 1.12–2.56). However, subgroup analysis showed the elevated rate was larger in those with pre-existing cardiovascular disease (hazard ratio 4.37, 95% confidence interval 2.21–8.64), compared with those with no cardiovascular disease (hazard ratio 1.52, 95% confidence interval 0.92–2.48, P-interaction=0.0076). All-cause mortality was not increased in those prescribed sibutramine (hazard ratio 0.67, 95% confidence interval 0.34–1.32). 
Conclusions: Sibutramine was associated with increased rates of acute cardiovascular events in people with pre-existing cardiovascular disease, but there was a low absolute risk in those without. Sibutramine's marketing authorization may have, therefore, been inappropriately withdrawn for people without cardiovascular disease. PMID:25971925

  5. The effect of sibutramine prescribing in routine clinical practice on cardiovascular outcomes: a cohort study in the United Kingdom.

    PubMed

    Hayes, J F; Bhaskaran, K; Batterham, R; Smeeth, L; Douglas, I

    2015-09-01

    The marketing authorization for the weight loss drug sibutramine was suspended in 2010 following a major trial that showed increased rates of non-fatal myocardial infarction and cerebrovascular events in patients with pre-existing cardiovascular disease. In routine clinical practice, sibutramine was already contraindicated in patients with cardiovascular disease and so the relevance of these influential clinical trial findings to the 'real World' population of patients receiving or eligible for the drug is questionable. We assessed rates of myocardial infarction and cerebrovascular events in a cohort of patients prescribed sibutramine or orlistat in the United Kingdom. A cohort of patients prescribed weight loss medication was identified within the Clinical Practice Research Datalink. Rates of myocardial infarction or cerebrovascular event, and all-cause mortality were compared between patients prescribed sibutramine and similar patients prescribed orlistat, using both a multivariable Cox proportional hazard model, and propensity score-adjusted model. Possible effect modification by pre-existing cardiovascular disease and cardiovascular risk factors was assessed. Patients prescribed sibutramine (N=23,927) appeared to have an elevated rate of myocardial infarction or cerebrovascular events compared with those taking orlistat (N=77,047; hazard ratio 1.69, 95% confidence interval 1.12-2.56). However, subgroup analysis showed the elevated rate was larger in those with pre-existing cardiovascular disease (hazard ratio 4.37, 95% confidence interval 2.21-8.64), compared with those with no cardiovascular disease (hazard ratio 1.52, 95% confidence interval 0.92-2.48, P-interaction=0.0076). All-cause mortality was not increased in those prescribed sibutramine (hazard ratio 0.67, 95% confidence interval 0.34-1.32). 
Sibutramine was associated with increased rates of acute cardiovascular events in people with pre-existing cardiovascular disease, but there was a low absolute risk in those without. Sibutramine's marketing authorization may have, therefore, been inappropriately withdrawn for people without cardiovascular disease.

  6. Estimating animal mortality from anthropogenic hazards

    EPA Science Inventory

    Carcass searches are a common method for studying the risk of anthropogenic hazards to wildlife, including non-target poisoning and collisions with anthropogenic structures. Typically, numbers of carcasses found must be corrected for scavenging rates and imperfect detection. Para...

  7. Rockfall Hazard Process Assessment : Implementation Report

    DOT National Transportation Integrated Search

    2017-10-01

    The Montana Department of Transportation (MDT) commissioned a new research program to improve assessment and management of its rock slope assets. The Department implemented a Rockfall Hazard Rating System (RHRS) program in 2005 and wished to add valu...

  8. Impact of renal function on ischemic stroke and major bleeding rates in nonvalvular atrial fibrillation patients treated with warfarin or rivaroxaban: a retrospective cohort study using real-world evidence.

    PubMed

    Weir, Matthew R; Berger, Jeffrey S; Ashton, Veronica; Laliberté, François; Brown, Kip; Lefebvre, Patrick; Schein, Jeffrey

    2017-10-01

    Renal dysfunction is associated with increased risk of cardiovascular disease and is an independent predictor of stroke and systemic embolism. Nonvalvular atrial fibrillation (NVAF) patients with renal dysfunction may face a particularly high risk of thromboembolism and bleeding. The current retrospective cohort study was designed to assess the impact of renal function on ischemic stroke and major bleeding rates in NVAF patients in the real-world setting (outside a clinical trial). Medical claims and Electronic Health Records were retrieved retrospectively from Optum's Integrated Claims-Clinical de-identified dataset from May 2011 to August 2014. Patients with NVAF treated with warfarin (2468) or rivaroxaban (1290) were selected. Each treatment cohort was stratified by baseline estimated creatinine clearance (eCrCl) levels. Confounding adjustments were made using inverse probability of treatment weights (IPTWs). Incidence rates and hazard ratios of ischemic stroke and major bleeding events were calculated for both cohorts. Overall, patients treated with rivaroxaban had an ischemic stroke incidence rate of 1.9 per 100 person-years (PY) while patients treated with warfarin had a rate of 4.2 per 100 PY (HR = 0.41 [0.21-0.80], p = .009). Rivaroxaban patients with an eCrCl below 50 mL/min (N = 229) had an ischemic stroke rate of 0.8 per 100 PY, while the rate for the warfarin cohort (N = 647) was 6.0 per 100 PY (HR = 0.09 [0.01-0.72], p = .02). For the other renal function levels (i.e. eCrCl 50-80 and ≥80 mL/min) HRs indicated no statistically significant differences in ischemic stroke risks. Bleeding events did not differ significantly between cohorts stratified by renal function. Ischemic stroke rates were significantly lower in the overall NVAF population for rivaroxaban vs. warfarin users, including patients with eCrCl below 50 mL/min. For all renal function groups, major bleeding risks were not statistically different between treatment groups.

  9. Retirement as Meaningful: Positive Retirement Stereotypes Associated with Longevity

    PubMed Central

    Ng, Reuben; Allore, Heather G.; Monin, Joan K.; Levy, Becca R.

    2016-01-01

    Studies examining the association between retirement and health have produced mixed results. This may be due to previous studies treating retirement as merely a change in job status rather than a transition associated with stereotypes or societal beliefs (e.g., retirement is a time of mental decline or retirement is a time of growth). To examine whether these stereotypes are associated with health, we studied retirement stereotypes and survival over a 23-year period among 1,011 older adults. As predicted by stereotype embodiment theory, it was found that positive stereotypes about physical health during retirement showed a survival advantage of 4.5 years (hazard ratio = 0.88, p = .022) and positive stereotypes about mental health during retirement tended to show a survival advantage of 2.5 years (hazard ratio = 0.87, p = .034). Models adjusted for relevant covariates such as age, gender, race, employment status, functional health, and self-rated health. These results suggest that retirement preparation could benefit from considering retirement stereotypes. PMID:27346893

  10. Application of fuzzy logic approach for wind erosion hazard mapping in Laghouat region (Algeria) using remote sensing and GIS

    NASA Astrophysics Data System (ADS)

    Saadoud, Djouher; Hassani, Mohamed; Martin Peinado, Francisco José; Guettouche, Mohamed Saïd

    2018-06-01

    Wind erosion is one of the most serious environmental problems in Algeria that threatens human activities and socio-economic development. The main goal of this study is to apply a fuzzy logic approach to wind erosion sensitivity mapping in the Laghouat region, Algeria. Six causative factors, obtained by applying fuzzy membership functions to each used parameter, are considered: soil, vegetation cover, wind factor, soil dryness, land topography and land cover sensitivity. Different fuzzy operators (AND, OR, SUM, PRODUCT, and GAMMA) are applied to generate the wind-erosion hazard map. Success rate curves reveal that the fuzzy gamma (γ) operator, with γ equal to 0.9, gives the best prediction accuracy, with an area under the curve of 85.2%. The resulting wind-erosion sensitivity map delineates the area into zones of five relative sensitivity classes: very high, high, moderate, low and very low. The estimated result was verified by field measurements and the high statistically significant value of a chi-square test.
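The fuzzy operators named above combine per-layer membership values cell by cell. A minimal sketch, assuming the gamma operation follows the standard Zimmermann-Zysno form (SUM^γ × PRODUCT^(1-γ)); the membership values are hypothetical, not the actual Laghouat layer values:

```python
from functools import reduce

# Fuzzy overlay operators as used in GIS hazard mapping.

def fuzzy_and(memberships):   # AND: minimum operator
    return min(memberships)

def fuzzy_or(memberships):    # OR: maximum operator
    return max(memberships)

def fuzzy_product(memberships):
    return reduce(lambda a, b: a * b, memberships)

def fuzzy_sum(memberships):   # algebraic sum: 1 - prod(1 - mu_i)
    return 1 - reduce(lambda a, b: a * b, (1 - m for m in memberships))

def fuzzy_gamma(memberships, gamma=0.9):
    """Compromise between the dilating SUM and the shrinking PRODUCT."""
    return fuzzy_sum(memberships) ** gamma * fuzzy_product(memberships) ** (1 - gamma)

mu = [0.8, 0.6, 0.9]  # hypothetical layer memberships for one map cell
print(fuzzy_gamma(mu, gamma=0.9))
```

Because the gamma result always lies between PRODUCT and SUM, tuning γ (here 0.9, as in the study) lets the map maker balance over- and under-estimation of sensitivity.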

  11. Earthquake Hazard and Risk in Alaska

    NASA Astrophysics Data System (ADS)

    Black Porto, N.; Nyst, M.

    2014-12-01

    Alaska is one of the most seismically active and tectonically diverse regions in the United States. To examine risk, we have updated the seismic hazard model in Alaska. The current RMS Alaska hazard model is based on the 2007 probabilistic seismic hazard maps for Alaska (Wesson et al., 2007; Boyd et al., 2007). The 2015 RMS model will update several key source parameters, including extending the earthquake catalog, implementing a new set of crustal faults, and updating the subduction zone geometry and recurrence rate. First, we extend the earthquake catalog to 2013, decluster the catalog, and compute new background rates. We then create a crustal fault model, based on the Alaska 2012 fault and fold database. This new model increased the number of crustal faults from ten in 2007 to 91 faults in the 2015 model. This includes the addition of the western Denali fault, Cook Inlet folds near Anchorage, and thrust faults near Fairbanks. Previously the subduction zone was modeled at a uniform depth. In this update, we model the intraslab as a series of deep stepping events. We also use the best available data, such as Slab 1.0, to update the geometry of the subduction zone. The city of Anchorage represents 80% of the risk exposure in Alaska. In the 2007 model, the hazard in Alaska was dominated by the frequent rate of magnitude 7 to 8 events (Gutenberg-Richter distribution), and large magnitude 8+ events had a low recurrence rate (Characteristic) and therefore did not contribute as strongly to the overall risk. We will review these recurrence rates, and will present the results and impact for Anchorage. We will compare our hazard update to the 2007 USGS hazard map, and discuss the changes and drivers for these changes. Finally, we will examine the impact model changes have on Alaska earthquake risk. 
Risk metrics considered include average annual loss, an annualized expected loss level used by insurers to determine the costs of earthquake insurance (and premium levels), and the loss exceedance probability curve used by insurers to address their solvency and manage their portfolio risk. We analyze risk profile changes in areas with large population density and for structures of economic and financial importance: the Trans-Alaska pipeline, industrial facilities in Valdez, and typical residential wood buildings in Anchorage, Fairbanks and Juneau.

  12. Administrative goals and safety standards for hazard control on forested recreation sites

    Treesearch

    Lee A. Paine

    1973-01-01

    For efficient control of tree hazard on recreation sites, a specific administrative goal must be selected. A safety standard designed to achieve the selected goal and a uniform hazard-rating procedure will then promote a consistent level of safety at an acceptable cost. Safety standards can be established with the aid of data for past years, and dollar evaluations are...

  13. Reliability analysis using an exponential power model with bathtub-shaped failure rate function: a Bayes study.

    PubMed

    Shehla, Romana; Khan, Athar Ali

    2016-01-01

    Models with bathtub-shaped hazard function have been widely accepted in the field of reliability and medicine and are particularly useful in reliability-related decision making and cost analysis. In this paper, the exponential power model, capable of assuming an increasing as well as a bathtub-shaped hazard rate, is studied. This article makes a Bayesian study of the model and simultaneously shows how posterior simulations based on Markov chain Monte Carlo algorithms can be straightforward and routine in R. The study is carried out for complete as well as censored data, under the assumption of weakly informative priors for the parameters. In addition, inferential interest focuses on the posterior distribution of non-linear functions of the parameters. The model has also been extended to include continuous explanatory variables, and the R code is illustrated throughout. Two real data sets are considered for illustrative purposes.
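To sketch why this model suits bathtub-shaped data, the hazard rate in one common (Smith-Bain) parameterization of the exponential power distribution can be written down directly; the parameter values below are assumed for illustration, not estimates from the paper's data sets:

```python
import math

# Hazard rate of the exponential power distribution in the Smith-Bain
# parameterization: S(t) = exp(1 - exp((lam*t)**alpha)), which gives
#   h(t) = alpha * lam * (lam*t)**(alpha - 1) * exp((lam*t)**alpha).
# For alpha < 1 this hazard is bathtub-shaped; for alpha >= 1 it is
# increasing. Parameter values here are illustrative only.

def hazard(t: float, alpha: float, lam: float) -> float:
    x = lam * t
    return alpha * lam * x ** (alpha - 1) * math.exp(x ** alpha)

alpha, lam = 0.5, 0.1  # alpha < 1 -> bathtub shape (assumed values)
rates = [hazard(t, alpha, lam) for t in (1, 10, 50, 200, 400)]
bathtub = rates[0] > rates[1] and rates[-2] < rates[-1]
print(bathtub)  # True: early decrease, late increase
```

The early-decreasing, late-increasing pattern is what makes such models attractive for burn-in and wear-out cost analyses mentioned above.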

  14. Visual and cognitive predictors of driving safety in Parkinson's disease patients

    PubMed Central

    Amick, M.M.; Grace, J.; Ott, B.R.

    2012-01-01

    This study assessed the clinical utility of contrast sensitivity (CS) relative to attention, executive function, and visuospatial abilities for predicting driving safety in participants with Parkinson's disease (PD). Twenty-five, non-demented PD patients completed measures of contrast sensitivity, visuospatial skills, executive functions, and attention. All PD participants also underwent a formal on-road driving evaluation. Of the 25 participants, 11 received a marginal or unsafe rating on the road test. Poorer driving performance was associated with worse performance on measures of CS, visuospatial constructions, set shifting, and attention. While impaired driving was associated with a range of cognitive and visual abilities, only a composite measure of executive functioning and visuospatial abilities, and not CS or attentional skills, predicted driving performance. These findings suggest that neuropsychological tests, which are multifactorial in nature and require visual perception and visual spatial judgments are the most useful screening measures for hazardous driving in PD patients. PMID:17851032

  15. Visual and cognitive predictors of driving safety in Parkinson's disease patients.

    PubMed

    Amick, M M; Grace, J; Ott, B R

    2007-11-01

    This study assessed the clinical utility of contrast sensitivity (CS) relative to attention, executive function, and visuospatial abilities for predicting driving safety in participants with Parkinson's disease (PD). Twenty-five, non-demented PD patients completed measures of contrast sensitivity, visuospatial skills, executive functions, and attention. All PD participants also underwent a formal on-road driving evaluation. Of the 25 participants, 11 received a marginal or unsafe rating on the road test. Poorer driving performance was associated with worse performance on measures of CS, visuospatial constructions, set shifting, and attention. While impaired driving was associated with a range of cognitive and visual abilities, only a composite measure of executive functioning and visuospatial abilities, and not CS or attentional skills, predicted driving performance. These findings suggest that neuropsychological tests, which are multifactorial in nature and require visual perception and visual spatial judgments are the most useful screening measures for hazardous driving in PD patients.

  16. Programming and Isolation of Highly Pure Physiologically and Pharmacologically Functional Sinus-Nodal Bodies from Pluripotent Stem Cells

    PubMed Central

    Jung, Julia Jeannine; Husse, Britta; Rimmbach, Christian; Krebs, Stefan; Stieber, Juliane; Steinhoff, Gustav; Dendorfer, Andreas; Franz, Wolfgang-Michael; David, Robert

    2014-01-01

    Summary Therapeutic approaches for “sick sinus syndrome” rely on electrical pacemakers, which lack hormone responsiveness and bear hazards such as infection and battery failure. These issues may be overcome via “biological pacemakers” derived from pluripotent stem cells (PSCs). Here, we show that forward programming of PSCs with the nodal cell inducer TBX3 plus an additional Myh6-promoter-based antibiotic selection leads to cardiomyocyte aggregates consisting of >80% physiologically and pharmacologically functional pacemaker cells. These induced sinoatrial bodies (iSABs) exhibited highly increased beating rates (300–400 bpm), coming close to those found in mouse hearts, and were able to robustly pace myocardium ex vivo. Our study introduces iSABs as highly pure, functional nodal tissue that is derived from PSCs and may be important for future cell therapies and drug testing in vitro. PMID:24936448

  17. Depression and Liver Transplant Survival.

    PubMed

    Meller, William; Welle, Nicole; Sutley, Kristen; Thurber, Steven

    Patients who underwent liver transplantation and experienced clinical depression have heretofore evinced lower survival rates than nondepressed counterparts. We investigated the hypothesis that transplant patients who seek and obtain medical treatment for depression would circumvent the previously reported reduction in survival. A total of 765 patients with liver transplants were scrutinized for complications following transplantation. Of these, 104 patients experienced posttransplant depression, as manifested by diagnosis and treatment by medical personnel. Survival analyses were conducted comparing hazard and survival curves for these selected individuals and the remainder of transplant patients. Contrary to prior data and consistent with the aforementioned hypothesis, median survival durations, survival curves, and hazard functions (controlling for age) showed prolonged posttransplant survival for the depressed patients. The improved survival for the depressed patients may simply be related to an amelioration of depressive symptoms via antidepressant medications. However, this interpretation would only be congruent with reduced hazard, not survival elevated beyond the norm (median) for other transplant participants. Assuming the reliability and generalizability of our findings, a reasonable and compelling interpretation is that, combined with the effectiveness of antidepressant medications, seeking and receiving treatment for depression is a proxy measure of a more global pattern of adherence to recommended posttransplant medical regimens. Copyright © 2017 The Academy of Psychosomatic Medicine. Published by Elsevier Inc. All rights reserved.

  18. Prevalence of hazardous exposures in veterinary practice

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wiggins, P.; Schenker, M.B.; Green, R.

    1989-01-01

    All female graduates of a major U.S. veterinary school were surveyed by mailed questionnaire to obtain details of work practice and hazard exposure during the most recent year worked and during all pregnancies. Exposure questions were based on previously implicated occupational hazards which included anesthetic gases, radiation, zoonoses, prostaglandins, vaccines, physical trauma, and pesticides. The response rate was 86% (462/537). We found that practice type and pregnancy status were major determinants of hazard exposure within the veterinary profession. Small-animal practitioners reported the highest rates of exposure to anesthetic gas (94%), X-ray (90%), and pesticides (57%). Large-animal practitioners reported greater rates of trauma (64%) and potential exposure to prostaglandins (92%), Brucella abortus vaccine (23%), and carbon monoxide (18%). Potentially hazardous workplace practices or equipment were common. Forty-one percent of respondents who reported taking X-rays did not wear film badges, and 76% reported physically restraining animals for X-ray procedures. Twenty-seven percent of the respondents exposed to anesthetic gases worked at facilities which did not have waste anesthetic gas scavenging systems. Women who worked as veterinarians during a pregnancy attempted to reduce exposures to X-rays, insecticides, and other potentially hazardous exposures. Some potentially hazardous workplace exposures are common in veterinary practice, and measures to educate workers and to reduce these exposures should not await demonstration of adverse health effects.

  19. 32 CFR 172.2 - Applicability and scope.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... chemical processing. The recycling of hazardous materials or hazardous waste shall be accomplished with due...-bearing scrap and those items that may be used again for their original purposes or functions without any...

  20. 32 CFR 172.2 - Applicability and scope.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... chemical processing. The recycling of hazardous materials or hazardous waste shall be accomplished with due...-bearing scrap and those items that may be used again for their original purposes or functions without any...

  1. 32 CFR 172.2 - Applicability and scope.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... chemical processing. The recycling of hazardous materials or hazardous waste shall be accomplished with due...-bearing scrap and those items that may be used again for their original purposes or functions without any...

  2. 32 CFR 172.2 - Applicability and scope.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... chemical processing. The recycling of hazardous materials or hazardous waste shall be accomplished with due...-bearing scrap and those items that may be used again for their original purposes or functions without any...

  3. 32 CFR 172.2 - Applicability and scope.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... chemical processing. The recycling of hazardous materials or hazardous waste shall be accomplished with due...-bearing scrap and those items that may be used again for their original purposes or functions without any...

  4. Verbal collision avoidance messages during simulated driving: perceived urgency, alerting effectiveness and annoyance.

    PubMed

    Baldwin, Carryl L

    2011-04-01

    Matching the perceived urgency of an alert with the relative hazard level of the situation is critical for effective alarm response. Two experiments describe the impact of acoustic and semantic parameters on ratings of perceived urgency, annoyance and alerting effectiveness and on alarm response speed. Within a simulated driving context, participants rated and responded to collision avoidance system (CAS) messages spoken by a female or male voice (experiments 1 and 2, respectively). Results indicated greater perceived urgency and faster alarm response times as intensity increased from -2 dB signal to noise (S/N) ratio to +10 dB S/N, although annoyance ratings increased as well. CAS semantic content interacted with alarm intensity, indicating that at lower intensity levels participants paid more attention to the semantic content. Results indicate that both acoustic and semantic parameters independently and interactively impact CAS alert perceptions in divided attention conditions and this work can inform auditory alarm design for effective hazard matching. STATEMENT OF RELEVANCE: Results indicate that both acoustic parameters and semantic content can be used to design collision warnings with a range of urgency levels. Further, these results indicate that verbal warnings tailored to a specific hazard situation may improve hazard-matching capabilities without substantial trade-offs in perceived annoyance.

  5. Lava flow hazards and risk assessment on Mauna Loa Volcano, Hawaii

    NASA Astrophysics Data System (ADS)

    Trusdell, Frank A.

    "It is profoundly significant that the Hawaiians of Ka'u did not fear or cringe before, or hate, the power and destructive violence of Mauna Loa. They took unto them this huge mountain as their mother, and measured their personal dignity and powers in terms of its majesty and drama." (Pukui and Handy, 1952) The Island of Hawai'i is the fastest-growing region in the State of Hawai'i with over 100,000 residents. Because the population continues to grow at a rate of 3% per annum, more and more construction will occur on the flanks of active volcanoes. Since the last eruption of Mauna Loa in 1984, $2.3 billion has been invested in new construction on the volcano's flanks, posing an inevitable hazard to the people living there. Part of the mission of the U.S. Geological Survey's Hawaiian Volcano Observatory is to make the public aware of these hazards. Recent mapping has shown that lava flows on Mauna Loa have covered its surface area at a rate of 30-40% every 1000 years. Average effusion rates of up to 12 million cubic meters per day during eruptions, combined with slopes >10 degrees, increase the risk for the population of South Kona. Studies of Mauna Loa's long-term eruptive history will lead to more accurate volcanic hazard assessments and enable us to refine the boundaries between the hazard zones. Our work thus serves as a guide for land-use planners and developers to make more informed decisions for the future. Land-use planning is a powerful way to minimize risk in hazardous areas.

  6. Effects of uric acid on kidney function decline differ depending on baseline kidney function in type 2 diabetic patients.

    PubMed

    Hanai, Ko; Tauchi, Eriko; Nishiwaki, Yui; Mori, Tomomi; Yokoyama, Yoichi; Uchigata, Yasuko; Babazono, Tetsuya

    2018-05-30

    Most existing data regarding effects of uric acid (UA) on diabetic kidney disease have considered patients with preserved kidney function. We examined a hypothesis that there are differences in the effects of serum UA levels on the decline in kidney function depending on baseline kidney function in diabetic patients. In this historical cohort study, 7033 type 2 diabetic patients were analyzed and classified into two groups as follows: nonchronic kidney disease (non-CKD), with an estimated glomerular filtration rate (eGFR) ≥60 mL/min/1.73 m2 (n = 4994), and CKD, with an eGFR <60 mL/min/1.73 m2 (n = 2039). The composite endpoint was a ≥30% decrease in eGFR from baseline or the initiation of renal replacement therapy. The hazard ratio (HR) of serum UA levels at baseline was estimated using multivariate Cox proportional hazards models. There was a significant interaction between UA levels and baseline eGFR with respect to the endpoint (P < 0.001). The HRs of 1 mg/dL increase in UA levels were 1.13 [95% confidence interval (CI) 1.05-1.22, P = 0.002] and 0.93 (95% CI 0.88-0.99, P = 0.02) in the non-CKD and CKD groups, respectively. When patients were classified by quintile of UA levels, the HRs of those in the 5th quintile (versus 1st quintile) were 1.64 (95% CI 1.23-2.18, P < 0.001) and 0.76 (95% CI 0.58-0.99, P = 0.05) in the non-CKD and CKD groups, respectively. The effects of UA on kidney function decline might differ depending on baseline kidney function in type 2 diabetic patients. High UA levels are the prognostic factor only in patients with preserved kidney function.
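Because a Cox model is linear in the log hazard, the per-unit hazard ratios reported above compound multiplicatively over larger covariate increments. A minimal sketch (the 1.13 figure is the abstract's non-CKD estimate; the 3 mg/dL increment is a hypothetical illustration):

```python
# Under a Cox model the log hazard is linear in the covariate, so the
# hazard ratio for a k-unit increase is the per-unit HR raised to the k.
hr_per_unit = 1.13      # per 1 mg/dL serum UA, non-CKD group (from the abstract)
k = 3                   # hypothetical 3 mg/dL increase
hr_k_units = hr_per_unit ** k   # 1.13**3, about 1.443
```
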

  7. The Average Hazard Ratio - A Good Effect Measure for Time-to-event Endpoints when the Proportional Hazard Assumption is Violated?

    PubMed

    Rauch, Geraldine; Brannath, Werner; Brückner, Matthias; Kieser, Meinhard

    2018-05-01

    In many clinical trial applications, the endpoint of interest corresponds to a time-to-event endpoint. In this case, group differences are usually expressed by the hazard ratio. Group differences are commonly assessed by the logrank test, which is optimal under the proportional hazard assumption. However, there are many situations in which this assumption is violated. Especially in applications where a full population and several subgroups or a composite time-to-first-event endpoint and several components are considered, the proportional hazard assumption usually does not simultaneously hold true for all test problems under investigation. As an alternative effect measure, Kalbfleisch and Prentice proposed the so-called 'average hazard ratio'. The average hazard ratio is based on a flexible weighting function to modify the influence of time and has a meaningful interpretation even in the case of non-proportional hazards. Despite this favorable property, it is hardly ever used in practice, whereas the standard hazard ratio is commonly reported in clinical trials regardless of whether the proportional hazard assumption holds true or not. There are two main approaches to constructing corresponding estimators and tests for the average hazard ratio: the first relies on weighted Cox regression and the second on a simple plug-in estimator. The aim of this work is to give a systematic comparison of these two approaches and the standard logrank test for different time-to-event settings with proportional and non-proportional hazards and to illustrate the pros and cons in application. We conduct a systematic comparative study based on Monte Carlo simulations and a real clinical trial example. Our results suggest that the properties of the average hazard ratio depend on the underlying weighting function. The two approaches to construct estimators and related tests show very similar performance for adequately chosen weights. 
In general, the average hazard ratio defines a more valid effect measure than the standard hazard ratio under non-proportional hazards and the corresponding tests provide a power advantage over the common logrank test. As non-proportional hazards are often met in clinical practice and the average hazard ratio tests often outperform the common logrank test, this approach should be used more routinely in applications. Schattauer GmbH.
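The core idea can be illustrated numerically: when the pointwise hazard ratio varies over time, a weighted average of it is still a well-defined summary. This is a simplified sketch with hypothetical piecewise-constant hazards and a uniform weight standing in for the survival-based weights of Kalbfleisch and Prentice:

```python
# Simplified illustration of an "average hazard ratio" under non-proportional
# hazards (hypothetical piecewise-constant hazards; a uniform weight replaces
# the survival-based weighting function of Kalbfleisch & Prentice).
def hazard_treat(t):
    return 0.5 if t < 1.0 else 0.1   # strong early effect that fades

def hazard_control(t):
    return 0.25                      # constant hazard

# The pointwise ratio is 2.0 before t=1 and 0.4 after: clearly non-proportional.
n, t_max = 10000, 2.0
dt = t_max / n
avg_hr = sum(hazard_treat(i * dt) / hazard_control(i * dt) * dt
             for i in range(n)) / t_max   # uniform-weight average, here 1.2
```

A single fitted proportional-hazards ratio would hide the early-versus-late reversal that the time-varying ratio makes explicit.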

  8. Complex Dynamic Processes in Sign Tracking With an Omission Contingency (Negative Automaintenance)

    PubMed Central

    Killeen, Peter R.

    2008-01-01

    Hungry pigeons received food periodically, signaled by the onset of a keylight. Key pecks aborted the feeding. Subjects responded for thousands of trials, despite the contingent nonreinforcement, with varying probability as the intertrial interval was varied. Hazard functions showed the dominant tendency to be perseveration in responding and not responding. Once perseveration was accounted for, a linear operator model of associative conditioning further improved predictions. Response rates during trials were correlated with the prior probabilities of a response. Rescaled range analyses showed that the behavioral trajectories were a kind of fractional Brownian motion. PMID:12561133
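The hazard-function analysis described above can be sketched in discrete form: the empirical hazard at trial k is the fraction of subjects still "at risk" at k that stop at k. A minimal sketch with hypothetical run lengths (not the paper's data):

```python
# Discrete empirical hazard function: events at k divided by the number
# still at risk at k (run lengths below are hypothetical illustrations).
def empirical_hazard(stop_trials):
    at_risk = len(stop_trials)
    hazard = {}
    for k in sorted(set(stop_trials)):
        events = stop_trials.count(k)
        hazard[k] = events / at_risk
        at_risk -= events
    return hazard

runs = [1, 1, 2, 2, 2, 3, 5]     # trial at which each run of responding ended
h = empirical_hazard(runs)       # {1: 2/7, 2: 3/5, 3: 1/2, 5: 1.0}
```

A flat empirical hazard would indicate memoryless stopping; the perseveration reported in the abstract corresponds to hazards that deviate systematically from flatness.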

  9. Complex dynamic processes in sign tracking with an omission contingency (negative automaintenance).

    PubMed

    Killeen, Peter R

    2003-01-01

    Hungry pigeons received food periodically, signaled by the onset of a keylight. Key pecks aborted the feeding. Subjects responded for thousands of trials, despite the contingent nonreinforcement, with varying probability as the intertrial interval was varied. Hazard functions showed the dominant tendency to be perseveration in responding and not responding. Once perseveration was accounted for, a linear operator model of associative conditioning further improved predictions. Response rates during trials were correlated with the prior probabilities of a response. Rescaled range analyses showed that the behavioral trajectories were a kind of fractional Brownian motion.

  10. A FORTRAN program for multivariate survival analysis on the personal computer.

    PubMed

    Mulder, P G

    1988-01-01

    In this paper a FORTRAN program is presented for multivariate survival or life table regression analysis in a competing-risks situation. The relevant failure rate (for example, a particular disease or mortality rate) is modelled as a log-linear function of a vector of (possibly time-dependent) explanatory variables. The explanatory variables may also include the variable time itself, which is useful for parameterizing piecewise exponential time-to-failure distributions in a Gompertz-like or Weibull-like way as a more efficient alternative to Cox's proportional hazards model. Maximum likelihood estimates of the coefficients of the log-linear relationship are obtained from the iterative Newton-Raphson method. The program runs on a personal computer under DOS; running time is quite acceptable, even for large samples.
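The Newton-Raphson scheme used by the program can be reduced to a one-parameter sketch: exponential failure times with a log-linear rate lambda = exp(beta) and no covariates (hypothetical, uncensored data; the actual program handles covariate vectors and competing risks):

```python
import math

# One-parameter sketch of Newton-Raphson MLE for a log-linear failure rate:
# lambda = exp(beta), exponential times, all events observed (hypothetical data).
times = [0.5, 1.2, 0.3, 2.0, 0.9]
n, total = len(times), sum(times)

beta = 0.0
for _ in range(50):
    score = n - math.exp(beta) * total    # d(log-likelihood)/d(beta)
    info = math.exp(beta) * total         # observed information (-2nd derivative)
    beta += score / info                  # Newton-Raphson update

rate = math.exp(beta)                     # converges to the closed form n / total
```

In this reduced case the MLE has the closed form n / sum(times); the iteration matters once covariates make the score equations nonlinear.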

  11. Genetic drift and mutational hazard in the evolution of salamander genomic gigantism.

    PubMed

    Mohlhenrich, Erik Roger; Mueller, Rachel Lockridge

    2016-12-01

    Salamanders have the largest nuclear genomes among tetrapods and, excepting lungfishes, among vertebrates as a whole. Lynch and Conery (2003) have proposed the mutational-hazard hypothesis to explain variation in genome size and complexity. Under this hypothesis, noncoding DNA imposes a selective cost by increasing the target for degenerative mutations (i.e., the mutational hazard). Expansion of noncoding DNA, and thus genome size, is driven by increased levels of genetic drift and/or decreased mutation rates; the former determines the efficiency with which purifying selection can remove excess DNA, whereas the latter determines the level of mutational hazard. Here, we test the hypothesis that salamanders have experienced stronger long-term, persistent genetic drift than frogs, a related clade with more typically sized vertebrate genomes. To test this hypothesis, we compared dN/dS and Kr/Kc values of protein-coding genes between these clades. Our results do not support this hypothesis; we find that salamanders have not experienced stronger genetic drift than frogs. Additionally, we find evidence consistent with a lower nucleotide substitution rate in salamanders. This result, along with previous work showing lower rates of small deletion and ectopic recombination in salamanders, suggests that a lower mutational hazard may contribute to genomic gigantism in this clade. © 2016 The Author(s). Evolution © 2016 The Society for the Study of Evolution.

  12. Lava-flow hazard on the SE flank of Mt. Etna (Southern Italy)

    NASA Astrophysics Data System (ADS)

    Crisci, G. M.; Iovine, G.; Di Gregorio, S.; Lupiano, V.

    2008-11-01

    A method for mapping lava-flow hazard on the SE flank of Mt. Etna (Sicily, Southern Italy) by applying the Cellular Automata model SCIARA-fv is described, together with the techniques employed for calibration and validation through a parallel Genetic Algorithm. The study area is partly urbanised; it has repeatedly been affected by lava flows from flank eruptions in historical time, and shows evidence of a dominant SSE-trending fracture system. Moreover, a dormant deep-seated gravitational deformation, associated with a larger volcano-tectonic phenomenon, affects the whole south-eastern flank of the volcano. The Etnean 2001 Mt. Calcarazzi lava-flow event has been selected for model calibration, while validation has been performed by considering the 2002 Linguaglossa and the 1991-93 Valle del Bove events — suitable data for back analysis being available for these recent eruptions. Quantitative evaluation of the simulations, with respect to the real events, has been performed by means of two fitness functions, which consider either the areas affected by the lava flows, or both the areas and the eruption duration. Sensitivity analyses are in progress to thoroughly evaluate the role of parameters, topographic input data, and mesh geometry on model performance, though preliminary results have already given encouraging indications of model robustness. In order to evaluate lava-flow hazard in the study area, a regular grid of 340 possible vents, uniformly covering the study area and located at 500 m intervals, has been hypothesised. For each vent, a statistically significant number of simulations has been planned, adopting combinations of durations, lava volumes, and effusion-rate functions selected by considering available volcanological data. Performed simulations have been stored in a GIS environment for subsequent analyses and map elaboration. 
Probabilities of activation, empirically based on past behaviour of the volcano, can be assigned to each vent of the grid by considering its elevation, location with respect to the volcanic edifice, and proximity to its main weakness zones. Similarly, different probabilities can be assigned to the simulated event types (combinations of durations and lava volumes, and the effusion-rate functions considered). In such a way, an implicit assumption is made that the volcanic style will not dramatically change in the near future. Depending on the adopted criteria for probability evaluation, different maps of lava-flow hazard can be compiled by taking into account both the overlapping of the simulated lava flows and their assumed probabilities, and by finally ranking computed values into a few relative classes. The adopted methodology allows rapid exploration of changes in lava-flow hazard as a function of varying probabilities of occurrence, by simply re-processing the database of the simulations stored in the GIS. For Civil Protection purposes, in case of expected imminent opening of a vent in a given sector of the volcano, re-processing may help forecast, in real time, the areas likely to be affected, and thus better manage the eruptive crisis. Moreover, further simulations can be added to the GIS database whenever new event types are recognised to be of interest. In this paper, three examples of maps of lava-flow hazard for the SE flank of Mt. 
Etna are presented: the first has been realised without assigning any probability to the performed simulations, by simply counting the frequencies of lava flows affecting each site; in the second map, information on past eruptions is taken into account, and probabilities are empirically attributed to each simulation based on location of vents and types of eruption; in the third one, a stronger role is ascribed to the main SSE-trending weakness zone, which crosses the study area between Nicolosi and Trecastagni, associated with the right flank of the above-cited deep-seated deformation. Despite being only preliminary (as based on a sub-set of the overall planned simulations), the maps clearly depict the most hazardous sectors of the volcano, which have been identified by applying the coupled modelling-GIS method here described.

  13. Submarine landslide and tsunami hazards offshore southern Alaska: Seismic strengthening versus rapid sedimentation

    NASA Astrophysics Data System (ADS)

    Sawyer, Derek E.; Reece, Robert S.; Gulick, Sean P. S.; Lenz, Brandi L.

    2017-08-01

    The southern Alaskan offshore margin is prone to submarine landslides and tsunami hazards due to seismically active plate boundaries and extreme sedimentation rates from glacially enhanced mountain erosion. We examine the submarine landslide potential with new shear strength measurements acquired by Integrated Ocean Drilling Program Expedition 341 on the continental slope and Surveyor Fan. These data reveal lower than expected sediment strength. Contrary to other active margins where seismic strengthening enhances slope stability, the high-sedimentation margin offshore southern Alaska behaves like a passive margin from a shear strength perspective. We interpret that seismic strengthening occurs but is offset by high sedimentation rates and overpressure. This conclusion is supported by shear strength measurements outside of the fan that follow an active-margin trend. More broadly, seismically active margins with wet-based glaciers are susceptible to submarine landslide hazards because of the combination of high sedimentation rates and earthquake shaking.

  14. Occupational Exposures in the Oil and Gas Extraction Industry: State of the Science and Research Recommendations

    PubMed Central

    Witter, Roxana Z.; Tenney, Liliana; Clark, Suzanne; Newman, Lee S.

    2015-01-01

    The oil and gas extraction industry is rapidly growing due to horizontal drilling and high volume hydraulic fracturing (HVHF). This growth has provided new jobs and economic stimulus. The industry occupational fatality rate is 2.5 times higher than that of the construction industry and 7 times higher than that of general industry; however, reported injury rates are lower than in the construction industry, suggesting that injuries are under-reported. Some workers are exposed to crystalline silica at hazardous levels, above occupational health standards. Other hazards (particulate, benzene, noise, radiation) exist. In this article, we review occupational fatality and injury rate data; discuss research looking at root causes of fatal injuries and hazardous exposures; review interventions aimed at improving occupational health and safety; and discuss information gaps and areas of needed research. We also describe Wyoming efforts to improve occupational safety in this industry, as a case example. PMID:24634090

  15. HAZARDOUS AIR POLLUTANTS: WET REMOVAL RATES AND MECHANISMS

    EPA Science Inventory

    Fourteen hazardous organic air pollutants were evaluated for their potentials to be wet deposited by precipitation scavenging. This effort included a survey of solubilities (Henry's Law constants) in the literature, measurement of solubilities of three selected species, developme...

  16. 40 CFR 264.279 - Recordkeeping.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) SOLID WASTES (CONTINUED) STANDARDS FOR OWNERS AND OPERATORS OF HAZARDOUS WASTE TREATMENT, STORAGE, AND DISPOSAL FACILITIES Land Treatment § 264.279 Recordkeeping. The owner or operator must include hazardous waste application dates and rates in...

  17. 40 CFR 264.279 - Recordkeeping.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) SOLID WASTES (CONTINUED) STANDARDS FOR OWNERS AND OPERATORS OF HAZARDOUS WASTE TREATMENT, STORAGE, AND DISPOSAL FACILITIES Land Treatment § 264.279 Recordkeeping. The owner or operator must include hazardous waste application dates and rates in...

  18. 40 CFR 264.279 - Recordkeeping.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) SOLID WASTES (CONTINUED) STANDARDS FOR OWNERS AND OPERATORS OF HAZARDOUS WASTE TREATMENT, STORAGE, AND DISPOSAL FACILITIES Land Treatment § 264.279 Recordkeeping. The owner or operator must include hazardous waste application dates and rates in...

  19. 40 CFR 264.279 - Recordkeeping.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) SOLID WASTES (CONTINUED) STANDARDS FOR OWNERS AND OPERATORS OF HAZARDOUS WASTE TREATMENT, STORAGE, AND DISPOSAL FACILITIES Land Treatment § 264.279 Recordkeeping. The owner or operator must include hazardous waste application dates and rates in...

  20. Determination of the fire hazards of mine materials using a radiant panel.

    PubMed

    Harteis, S P; Litton, C D; Thomas, R A

    2016-01-01

    The objective of this study was to develop a laboratory-scale method to rank the ignition and fire hazards of commonly used underground mine materials and to eliminate the need for the expensive large-scale tests that are currently being used. A radiant-panel apparatus was used to determine the materials' relevant thermal characteristics: time to ignition, critical heat flux for ignition, heat of gasification, and mass-loss rate. Three thermal parameters (TRP, TP1, and TP4) were derived from the data and subsequently used to rank the combined ignition and fire hazards of the combustible materials from low hazard to high hazard. The results compared favorably with the thermal and ignition hazards of similar materials reported in the literature and support this approach as a simpler one for quantifying these combustible hazards.
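The role of a thermal response parameter (TRP) and critical heat flux (CHF) in ranking ignitability can be sketched with the common thick-solid ignition relation, t_ig ~ (TRP / (q_ext - CHF))^2; the numbers below are illustrative, not the study's measurements:

```python
# Hedged sketch of the thick-solid ignition relation often paired with a
# thermal response parameter (TRP): higher TRP or higher critical heat flux
# (CHF) means longer time to ignition (illustrative values only).
def time_to_ignition(trp, q_ext, chf):
    if q_ext <= chf:
        return float("inf")    # below the critical flux, ignition never occurs
    return (trp / (q_ext - chf)) ** 2

t = time_to_ignition(trp=200.0, q_ext=50.0, chf=10.0)   # (200/40)**2 = 25.0
```
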

  1. A Model for Generating Multi-hazard Scenarios

    NASA Astrophysics Data System (ADS)

    Lo Jacomo, A.; Han, D.; Champneys, A.

    2017-12-01

    Communities in mountain areas are often subject to risk from multiple hazards, such as earthquakes, landslides, and floods. Each hazard has its own different rate of onset, duration, and return period. Multiple hazards tend to complicate the combined risk due to their interactions. Prioritising interventions for minimising risk in this context is challenging. We developed a probabilistic multi-hazard model to help inform decision making in multi-hazard areas. The model is applied to a case study region in the Sichuan province in China, using information from satellite imagery and in-situ data. The model is not intended as a predictive model, but rather as a tool which takes stakeholder input and can be used to explore plausible hazard scenarios over time. By using a Monte Carlo framework and varying uncertain parameters for each of the hazards, the model can be used to explore the effect of different mitigation interventions aimed at reducing the disaster risk within an uncertain hazard context.
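The Monte Carlo framing can be sketched very simply: treat each hazard as an independent Poisson process with an assumed return period and count yearly occurrences over a simulated horizon. The return periods and the independence assumption below are illustrative placeholders, not the paper's calibrated inputs:

```python
import math
import random

# Hedged Monte Carlo sketch: yearly hazard occurrences drawn from independent
# Poisson processes (return periods and independence are assumptions made
# only for illustration; the paper's model includes hazard interactions).
random.seed(0)
return_period = {"earthquake": 50.0, "landslide": 10.0, "flood": 5.0}

def simulate(years=1000):
    counts = {h: 0 for h in return_period}
    for _ in range(years):
        for hazard, T in return_period.items():
            # P(at least one event in a year) = 1 - exp(-1/T)
            if random.random() < 1.0 - math.exp(-1.0 / T):
                counts[hazard] += 1
    return counts

counts = simulate()   # frequent floods, rare earthquakes, as the rates imply
```

Varying the return periods across runs, as the abstract describes, would then expose how sensitive any mitigation ranking is to those uncertain inputs.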

  2. Probabilistic seismic hazard study based on active fault and finite element geodynamic models

    NASA Astrophysics Data System (ADS)

    Kastelic, Vanja; Carafa, Michele M. C.; Visini, Francesco

    2016-04-01

    We present a probabilistic seismic hazard analysis (PSHA) that is exclusively based on active faults and geodynamic finite element input models, whereas seismic catalogues were used only in a posterior comparison. We applied the developed model in the External Dinarides, a slow-deforming thrust-and-fold belt at the contact between Adria and Eurasia. Our method consists of establishing two earthquake rupture forecast models: (i) a geological active fault input (GEO) model and (ii) a finite element (FEM) model. The GEO model is based on an active fault database that provides information on fault location and its geometric and kinematic parameters, together with estimates of its slip rate. By default, in this model all deformation is set to be released along the active faults. The FEM model is based on a numerical geodynamic model developed for the region of study. In this model the deformation is released not only along the active faults but also in the volumetric continuum elements. From both models we calculated the corresponding activity rates, earthquake rates, and expected peak ground accelerations. We investigated both the source model and the earthquake model uncertainties by varying the main active fault and earthquake rate calculation parameters, constructing corresponding branches of the seismic hazard logic tree. Hazard maps and UHS curves have been produced for horizontal ground motion on bedrock conditions (VS30 ≥ 800 m/s), thereby not considering local site amplification effects. The hazard was computed over a 0.2° spaced grid, considering 648 branches of the logic tree and the mean value of the 10% probability of exceedance in 50 years hazard level, while the 5th and 95th percentiles were also computed to investigate the model limits. We conducted a sensitivity analysis to determine which input parameters influence the final hazard results, and to what degree. 
The results of this comparison show that the deformation model, with its internal variability, together with the choice of the ground motion prediction equations (GMPEs), are the most influential parameters. Both have a significant effect on the hazard results. Thus, good knowledge of the existence of active faults and of their geometric and activity characteristics is of key importance. We also show that PSHA models based exclusively on active faults and geodynamic inputs, which are thus not dependent on past earthquake occurrences, provide a valid method for seismic hazard calculation.
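The hazard level quoted above, "10% probability of exceedance in 50 years", follows the standard Poisson convention in PSHA; the corresponding mean annual exceedance rate and return period can be recovered directly:

```python
import math

# Standard PSHA Poisson relation: P(exceedance in t years) = 1 - exp(-r * t).
# Solving for the mean annual rate r given a probability-of-exceedance target.
def annual_rate(poe, years):
    return -math.log(1.0 - poe) / years

r = annual_rate(0.10, 50)     # mean annual exceedance rate
return_period = 1.0 / r       # the familiar ~475-year return period
```
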

  3. Diabetes mellitus is associated with adverse structural and functional cardiac remodelling in chronic heart failure with reduced ejection fraction.

    PubMed

    Walker, Andrew Mn; Patel, Peysh A; Rajwani, Adil; Groves, David; Denby, Christine; Kearney, Lorraine; Sapsford, Robert J; Witte, Klaus K; Kearney, Mark T; Cubbon, Richard M

    2016-09-01

    Diabetes mellitus is associated with an increased risk of death and hospitalisation in patients with chronic heart failure. Better understanding of potential underlying mechanisms may aid the development of diabetes mellitus-specific chronic heart failure therapeutic strategies. Prospective observational cohort study of 628 patients with chronic heart failure associated with left ventricular systolic dysfunction receiving contemporary evidence-based therapy. Indices of cardiac structure and function, along with symptoms and biochemical parameters, were compared in patients with and without diabetes mellitus at study recruitment and 1 year later. Patients with diabetes mellitus (24.2%) experienced higher rates of all-cause [hazard ratio, 2.3 (95% confidence interval, 1.8-3.0)] and chronic heart failure-specific mortality and hospitalisation despite comparable pharmacological and device-based therapies. At study recruitment, patients with diabetes mellitus were more symptomatic, required greater diuretic doses and more frequently had radiologic evidence of pulmonary oedema, despite higher left ventricular ejection fraction. They also exhibited echocardiographic evidence of increased left ventricular wall thickness and pulmonary arterial pressure. Diabetes mellitus was associated with reduced indices of heart rate variability and increased heart rate turbulence. During follow-up, patients with diabetes mellitus experienced less beneficial left ventricular remodelling and greater deterioration in renal function. Diabetes mellitus is associated with features of adverse structural and functional cardiac remodelling in patients with chronic heart failure. © The Author(s) 2016.

  4. Causal Mediation Analysis for the Cox Proportional Hazards Model with a Smooth Baseline Hazard Estimator.

    PubMed

    Wang, Wei; Albert, Jeffrey M

    2017-08-01

    An important problem within the social, behavioral, and health sciences is how to partition an exposure effect (e.g. treatment or risk factor) among specific pathway effects and to quantify the importance of each pathway. Mediation analysis based on the potential outcomes framework is an important tool to address this problem and we consider the estimation of mediation effects for the proportional hazards model in this paper. We give precise definitions of the total effect, natural indirect effect, and natural direct effect in terms of the survival probability, hazard function, and restricted mean survival time within the standard two-stage mediation framework. To estimate the mediation effects on different scales, we propose a mediation formula approach in which simple parametric models (fractional polynomials or restricted cubic splines) are utilized to approximate the baseline log cumulative hazard function. Simulation study results demonstrate low bias of the mediation effect estimators and close-to-nominal coverage probability of the confidence intervals for a wide range of complex hazard shapes. We apply this method to the Jackson Heart Study data and conduct sensitivity analysis to assess the impact on the mediation effects inference when the no unmeasured mediator-outcome confounding assumption is violated.
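The restricted mean survival time used above as an effect scale is just the integral of the survival function up to a horizon tau. A minimal numeric sketch for a Weibull survival curve with hypothetical parameters (the paper instead approximates the baseline log cumulative hazard with fractional polynomials or restricted cubic splines):

```python
import math

# Hedged sketch: restricted mean survival time (RMST) = integral of S(t) on
# [0, tau], computed by the midpoint rule for a hypothetical Weibull curve.
def survival(t, shape=1.5, scale=2.0):
    return math.exp(-((t / scale) ** shape))

def rmst(tau, n=100_000):
    dt = tau / n
    return sum(survival((i + 0.5) * dt) * dt for i in range(n))
```

As tau grows, the RMST approaches the unrestricted mean, scale * Gamma(1 + 1/shape), which for these parameters is about 1.805.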

  5. Profiling the careers of Thoroughbred horses racing in Hong Kong between 2000 and 2010.

    PubMed

    Velie, B D; Stewart, B D; Lam, K; Wade, C M; Hamilton, N A

    2013-11-01

    Research in Thoroughbred racehorses is often specific to horses from a given racing population or region. In order to investigate trends in racehorse careers across populations accurately, population-specific benchmarks for performance outcomes must be established. To provide summary statistics for performance outcomes for Thoroughbreds racing in Hong Kong between 2000 and 2010 and to document and provide evidence on the current differences in racing careers across sexes and regions of origin for horses racing in Hong Kong. Performance data on the population of Thoroughbreds racing in Hong Kong between 3 September 2000 and 12 March 2011 (n = 4950) were acquired and used to describe and compare the careers of Thoroughbred racehorses in Hong Kong. Career length, number of career starts and number of spells from racing per year were evaluated. Kaplan-Meier survival curves, stratified by sex, age group, country of origin and region of origin were produced for career length. A Cox proportional hazards model was fitted to assess factors influencing the risk of retirement from racing in Hong Kong. Log-rank tests for equality of career length survivor functions showed significant differences (P<0.001) across sexes, age groups, countries of origin and regions of origin. An increased age at first start in Hong Kong tended to increase the hazard rate for retirement from racing in Hong Kong, whereas greater earnings per race and originating from Europe tended to reduce the hazard rate for racing retirement. Differences in career outcomes within a racing population appear to be influenced partly by the region from which a horse originates, with specific effects on each performance outcome also varying between regions. Future research should take into account these potential differences when comparing results across populations. © 2013 EVJ Ltd.

  6. Individuality and universality in the growth-division laws of single E. coli cells

    NASA Astrophysics Data System (ADS)

    Kennard, Andrew S.; Osella, Matteo; Javer, Avelino; Grilli, Jacopo; Nghe, Philippe; Tans, Sander J.; Cicuta, Pietro; Cosentino Lagomarsino, Marco

    2016-01-01

    The mean size of exponentially dividing Escherichia coli cells in different nutrient conditions is known to depend on the mean growth rate only. However, the joint fluctuations relating cell size, doubling time, and individual growth rate are only starting to be characterized. Recent studies in bacteria reported a universal trend where the spread in both size and doubling times is a linear function of the population means of these variables. Here we combine experiments and theory and use scaling concepts to elucidate the constraints posed by the second observation on the division control mechanism and on the joint fluctuations of sizes and doubling times. We found that scaling relations based on the means collapse both size and doubling-time distributions across different conditions and explain how the shape of their joint fluctuations deviates from the means. Our data on these joint fluctuations highlight the importance of cell individuality: Single cells do not follow the dependence observed for the means between size and either growth rate or inverse doubling time. Our calculations show that these results emerge from a broad class of division control mechanisms requiring a certain scaling form of the "division hazard rate function," which defines the probability rate of dividing as a function of measurable parameters. This "model free" approach gives a rationale for the universal body-size distributions observed in microbial ecosystems across many microbial species, presumably dividing with multiple mechanisms. Additionally, our experiments show a crossover between fast and slow growth in the relation between individual-cell growth rate and division time, which can be understood in terms of different regimes of genome replication control.
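
    The "division hazard rate function" named above defines the probability rate of dividing as a function of measurable parameters such as cell size. The sketch below simulates this idea with a hypothetical power-law hazard h(x) = k·x^β; the functional form and constants are illustrative assumptions, not the specific scaling form inferred in the paper.

```python
import math
import random

# Sketch of a division-hazard simulation: a cell grows exponentially and,
# in each small time step dt, divides with probability h(size) * dt.
# h(x) = k * x**beta is a hypothetical choice for illustration only.

def sample_division_size(x0, growth_rate, k=1.0, beta=4.0, dt=1e-4,
                         rng=random.Random(1)):
    """Return the size at which a cell born at size x0 divides."""
    x = x0
    while True:
        hazard = k * x ** beta           # probability rate of dividing at size x
        if rng.random() < hazard * dt:   # division fires in this interval?
            return x
        x *= math.exp(growth_rate * dt)  # exponential single-cell growth

# Division sizes for 200 simulated cells born at size 1.0
sizes = [sample_division_size(1.0, 1.0) for _ in range(200)]
mean_size = sum(sizes) / len(sizes)
print(f"mean division size: {mean_size:.2f}")
```

    A steeply size-dependent hazard like this concentrates division sizes near a set point, which is the sense in which the hazard rate function acts as a division control mechanism.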

  7. An estimator of the survival function based on the semi-Markov model under dependent censorship.

    PubMed

    Lee, Seung-Yeoun; Tsai, Wei-Yann

    2005-06-01

    Lee and Wolfe (Biometrics vol. 54 pp. 1176-1178, 1998) proposed the two-stage sampling design for testing the assumption of independent censoring, which involves further follow-up of a subset of lost-to-follow-up censored subjects. They also proposed an adjusted estimator of the survivor function for a proportional hazards model under the dependent censoring model. In this paper, a new estimator of the survivor function is proposed for the semi-Markov model under dependent censorship on the basis of the two-stage sampling data. The consistency and the asymptotic distribution of the proposed estimator are derived. The estimation procedure is illustrated with an example from a lung cancer clinical trial, and simulation results are reported for the mean squared errors of estimators under a proportional hazards model and two different nonproportional hazards models.

  8. Total lightning characteristics of recent hazardous weather events in Japan

    NASA Astrophysics Data System (ADS)

    Hobara, Y.; Kono, S.; Ogawa, T.; Heckman, S.; Stock, M.; Liu, C.

    2017-12-01

    In recent years, total lightning (IC + CG) activity has attracted considerable attention as a way to improve the prediction of hazardous weather phenomena (hail, wind gusts, tornadoes, heavy precipitation). Sudden increases of the total lightning flash rate preceding hazardous weather, the so-called lightning jump (LJ), have been reported in several studies and are one of the promising precursors. Although increases in the frequency and intensity of these extreme weather events have been reported in Japan, the relationship between these events and total lightning has not yet been studied intensively. In this paper, we demonstrate recent results from the Japanese total lightning detection network (JTLN) in relation to hazardous weather events that occurred in Japan in the period 2014-2016. Automatic thunderstorm cell tracking was carried out based on very high spatial and temporal resolution X-band MP radar echo data (1 min and 250 m) to correlate with total lightning activity. The results are promising: the total lightning flash rate tends to increase about 10-40 minutes before the onset of extreme weather events. We also present differences in the lightning characteristics of thunderstorm cells between hazardous and non-hazardous weather events, which is vital information for improving prediction efficiency.
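
    A lightning jump is a sudden increase in a cell's total flash rate. A minimal detector in the spirit of the widely used 2-sigma algorithm (Schultz et al., 2009) can be sketched as follows; the thresholding details are a simplified assumption and the flash-rate series is synthetic, not JTLN data.

```python
# Sketch of a simple lightning-jump detector: flag a jump when the rate of
# change of the total flash rate exceeds the recent mean by 2 standard
# deviations. Simplified from the 2-sigma idea of Schultz et al. (2009).

def lightning_jumps(flash_rates, sigma_level=2.0):
    """flash_rates: total flashes per minute, one value per time step.
    Returns indices into flash_rates where a lightning jump is flagged."""
    jumps = []
    # DFRDT: time rate of change of the flash rate
    dfrdt = [b - a for a, b in zip(flash_rates, flash_rates[1:])]
    for i in range(5, len(dfrdt)):
        window = dfrdt[i - 5:i]                  # previous 5 rate changes
        mean = sum(window) / len(window)
        var = sum((x - mean) ** 2 for x in window) / len(window)
        threshold = mean + sigma_level * var ** 0.5
        if dfrdt[i] > 0 and dfrdt[i] > threshold:
            jumps.append(i + 1)                  # index into flash_rates
    return jumps

# Quiet storm cell, then a sudden surge of total lightning activity
rates = [2, 3, 2, 3, 2, 3, 2, 3, 20, 35, 40]
print(lightning_jumps(rates))  # → [8], the onset of the surge
```

    In an operational setting the same test would run per tracked cell, using the radar-based cell tracking described above to assemble each cell's flash-rate history.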

  9. Hidden Markov models for estimating animal mortality from anthropogenic hazards

    EPA Science Inventory

    Carcass searches are a common method for studying the risk of anthropogenic hazards to wildlife, including non-target poisoning and collisions with anthropogenic structures. Typically, numbers of carcasses found must be corrected for scavenging rates and imperfect detection. ...

  10. 40 CFR 265.1084 - Waste determination procedures.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... to determine the organic biodegradation efficiency (Rbio) for a treated hazardous waste. (i) The... Where: Rbio = Organic biodegradation efficiency, percent. Fbio = Fraction of organic biodegraded as... to determine the actual organic mass biodegradation rate (MRbio) for a treated hazardous waste. (i...

  11. NYC CV Pilot Deployment : Safety Management Plan : New York City.

    DOT National Transportation Integrated Search

    2016-04-22

    This safety management plan identifies preliminary safety hazards associated with the New York City Connected Vehicle Pilot Deployment project. Each of the hazards is rated, and a plan for managing the risks through detailed design and deployment is ...

  12. National-Level Multi-Hazard Risk Assessments in Sub-Saharan Africa

    NASA Astrophysics Data System (ADS)

    Murnane, R. J.; Balog, S.; Fraser, S. A.; Jongman, B.; Van Ledden, M.; Phillips, E.; Simpson, A.

    2017-12-01

    National-level risk assessments can provide important baseline information for decision-making on risk management and risk financing strategies. In this study, multi-hazard risk assessments were undertaken for 9 countries in Sub-Saharan Africa: Cape Verde, Ethiopia, Kenya, Niger, Malawi, Mali, Mozambique, Senegal and Uganda. The assessment was part of the Building Disaster Resilience in Sub-Saharan Africa Program and aimed at supporting the development of multi-risk financing strategies to help African countries make informed decisions to mitigate the socio-economic, fiscal and financial impacts of disasters. The assessments considered hazards and exposures consistent with the years 2010 and 2050. We worked with multiple firms to develop the hazard, exposure and vulnerability data and the risk results. The hazards include: coastal flood, drought, earthquake, landslide, riverine flood, tropical cyclone wind and storm surge, and volcanoes. For hazards expected to vary with climate, the 2050 hazard is based on the IPCC RCP 6.0. Geolocated exposure data for 2010 and 2050 at a 15 arc second (~0.5 km) resolution includes: structures as a function of seven development patterns; transportation networks including roads, bridges, tunnels and rail; critical facilities such as schools, hospitals, energy facilities and government buildings; crops; population; and gross domestic product (GDP). The 2050 exposure values for population are based on the IPCC SSP 2. Values for other exposure data are a function of population change. Vulnerability was based on openly available vulnerability functions. Losses were based on replacement values (e.g., cost/m2 or cost/km). Risk results are provided in terms of annual average loss and a variety of return periods at the national and Admin 1 levels. Assessments of recent historical events are used to validate the model results. In the future, it would be useful to use hazard footprints of historical events for validation purposes.
The results will be visualized in a set of national risk profile documents intended to form the basis for conversations with governments on risk reduction and risk financing strategies.
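
    Risk results expressed as annual average loss can be recovered from return-period losses by integrating the loss-exceedance curve. The sketch below uses hypothetical return periods and loss values, not results from this assessment.

```python
# Sketch: estimating average annual loss (AAL) from losses at a set of
# return periods, via trapezoidal integration of the loss-exceedance curve.
# Return periods and losses below are illustrative placeholders.

def average_annual_loss(return_periods, losses):
    """return_periods: years, ascending; losses: loss at each return period."""
    # Return periods become annual exceedance frequencies (descending)
    freqs = [1.0 / rp for rp in return_periods]
    aal = 0.0
    for i in range(len(freqs) - 1):
        # trapezoid between consecutive exceedance frequencies
        aal += 0.5 * (losses[i] + losses[i + 1]) * (freqs[i] - freqs[i + 1])
    return aal

rps  = [10, 50, 100, 250, 500]          # years
loss = [5.0, 40.0, 90.0, 180.0, 260.0]  # e.g. million USD
print(f"AAL ≈ {average_annual_loss(rps, loss):.2f} million USD")
```

    This ignores losses below the shortest and above the longest return period, so in practice the curve is extended at both ends before integrating.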

  13. A versatile test for equality of two survival functions based on weighted differences of Kaplan-Meier curves.

    PubMed

    Uno, Hajime; Tian, Lu; Claggett, Brian; Wei, L J

    2015-12-10

    With censored event time observations, the logrank test is the most popular tool for testing the equality of two underlying survival distributions. Although this test is asymptotically distribution free, it may not be powerful when the proportional hazards assumption is violated. Various other novel testing procedures have been proposed, which generally are derived by assuming a class of specific alternative hypotheses with respect to the hazard functions. The test considered by Pepe and Fleming (1989) is based on a linear combination of weighted differences of the two Kaplan-Meier curves over time and is a natural tool to assess the difference of two survival functions directly. In this article, we take a similar approach but choose weights that are proportional to the observed standardized difference of the estimated survival curves at each time point. The new proposal automatically makes weighting adjustments empirically. The new test statistic is aimed at a one-sided general alternative hypothesis and is distributed with a short right tail under the null hypothesis but with a heavy tail under the alternative. The results from extensive numerical studies demonstrate that the new procedure performs well under various general alternatives with a caution of a minor inflation of the type I error rate when the sample size is small or the number of observed events is small. The survival data from a recent cancer comparative study are utilized for illustrating the implementation of the process. Copyright © 2015 John Wiley & Sons, Ltd.

  14. Accelerated decline of renal function in type 2 diabetes following severe hypoglycemia.

    PubMed

    Tsujimoto, Tetsuro; Yamamoto-Honda, Ritsuko; Kajio, Hiroshi; Kishimoto, Miyako; Noto, Hiroshi; Hachiya, Remi; Kimura, Akio; Kakei, Masafumi; Noda, Mitsuhiko

    2016-01-01

    This study aimed to evaluate whether the pronounced elevation in blood pressure during severe hypoglycemia is associated with subsequent renal insufficiency. We conducted a 3-year cohort study to assess the clinical course of renal function in type 2 diabetes patients with or without a blood pressure surge during severe hypoglycemia. Of 111 type 2 diabetes patients with severe hypoglycemia, 76 exhibited an extremely high systolic blood pressure before treatment, whereas 35 demonstrated no such increase (179.1 ± 27.7 mmHg vs. 131.1 ± 20.2 mmHg, P<0.001). At 12 h after treatment, systolic blood pressure did not differ significantly (131.5 ± 30.7 mmHg vs. 123.5 ± 20.7 mmHg; P=0.39). The estimated glomerular filtration rate (GFR) before and at the time of severe hypoglycemia did not differ significantly between the two groups. A multivariate Cox proportional hazards regression analysis revealed that a blood pressure surge during severe hypoglycemia was independently associated with a composite outcome of a more than 15 mL/min/1.73 m(2) decrease in the estimated GFR and initiation of chronic dialysis (hazard ratio, 2.68; 95% confidence interval, 1.12-6.38; P=0.02). Renal function after severe hypoglycemia was significantly worse in type 2 diabetes patients with a blood pressure surge during severe hypoglycemia than in those without. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  15. Time-dependent resilience assessment and improvement of urban infrastructure systems

    NASA Astrophysics Data System (ADS)

    Ouyang, Min; Dueñas-Osorio, Leonardo

    2012-09-01

    This paper introduces an approach to assess and improve the time-dependent resilience of urban infrastructure systems, where resilience is defined as the systems' ability to resist various possible hazards, absorb the initial damage from hazards, and recover to normal operation one or multiple times during a time period T. For different values of T and its position relative to current time, there are three forms of resilience: previous resilience, current potential resilience, and future potential resilience. This paper mainly discusses the third form that takes into account the systems' future evolving processes. Taking the power transmission grid in Harris County, Texas, USA as an example, the time-dependent features of resilience and the effectiveness of some resilience-inspired strategies, including enhancement of situational awareness, management of consumer demand, and integration of distributed generators, are all simulated and discussed. Results show a nonlinear nature of resilience as a function of T, which may exhibit a transition from an increasing function to a decreasing function at either a threshold of post-blackout improvement rate, a threshold of load profile with consumer demand management, or a threshold number of integrated distributed generators. These results are further confirmed by studying a typical benchmark system such as the IEEE RTS-96. Such common trends indicate that some resilience strategies may enhance infrastructure system resilience in the short term, but if not managed well, they may compromise practical utility system resilience in the long run.
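
    A common way to formalize resilience over a period T is the ratio of the system's actual performance, integrated over T, to its target performance over the same period. The sketch below (with a synthetic performance curve) illustrates that idea; it is a simplification, not the paper's exact formulation.

```python
# Sketch of a time-dependent resilience metric for an infrastructure system:
# actual performance integrated over the period T, divided by the target
# performance over T. The performance curve (a drop at a hazard, followed
# by recovery to normal operation) is synthetic.

def resilience(performance, target):
    """performance: system performance at each time step; target: nominal level."""
    return sum(performance) / (target * len(performance))

# Normal operation, a hazard strikes, initial damage is absorbed, recovery
perf = [100, 100, 40, 55, 70, 85, 100, 100, 100, 100]
print(f"resilience over T = {resilience(perf, 100):.3f}")
```

    Because the metric depends on how much of T falls before, during, and after disruptions, it naturally varies with the choice of T, which is the nonlinearity the paper explores.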

  16. Time-dependent resilience assessment and improvement of urban infrastructure systems.

    PubMed

    Ouyang, Min; Dueñas-Osorio, Leonardo

    2012-09-01

    This paper introduces an approach to assess and improve the time-dependent resilience of urban infrastructure systems, where resilience is defined as the systems' ability to resist various possible hazards, absorb the initial damage from hazards, and recover to normal operation one or multiple times during a time period T. For different values of T and its position relative to current time, there are three forms of resilience: previous resilience, current potential resilience, and future potential resilience. This paper mainly discusses the third form that takes into account the systems' future evolving processes. Taking the power transmission grid in Harris County, Texas, USA as an example, the time-dependent features of resilience and the effectiveness of some resilience-inspired strategies, including enhancement of situational awareness, management of consumer demand, and integration of distributed generators, are all simulated and discussed. Results show a nonlinear nature of resilience as a function of T, which may exhibit a transition from an increasing function to a decreasing function at either a threshold of post-blackout improvement rate, a threshold of load profile with consumer demand management, or a threshold number of integrated distributed generators. These results are further confirmed by studying a typical benchmark system such as the IEEE RTS-96. Such common trends indicate that some resilience strategies may enhance infrastructure system resilience in the short term, but if not managed well, they may compromise practical utility system resilience in the long run.

  17. A global probabilistic tsunami hazard assessment from earthquake sources

    USGS Publications Warehouse

    Davies, Gareth; Griffin, Jonathan; Lovholt, Finn; Glimsdal, Sylfest; Harbitz, Carl; Thio, Hong Kie; Lorito, Stefano; Basili, Roberto; Selva, Jacopo; Geist, Eric L.; Baptista, Maria Ana

    2017-01-01

    Large tsunamis occur infrequently but have the capacity to cause enormous numbers of casualties, damage to the built environment and critical infrastructure, and economic losses. A sound understanding of tsunami hazard is required to underpin management of these risks, and while tsunami hazard assessments are typically conducted at regional or local scales, globally consistent assessments are required to support international disaster risk reduction efforts, and can serve as a reference for local and regional studies. This study presents a global-scale probabilistic tsunami hazard assessment (PTHA), extending previous global-scale assessments based largely on scenario analysis. Only earthquake sources are considered, as they represent about 80% of the recorded damaging tsunami events. Globally extensive estimates of tsunami run-up height are derived at various exceedance rates, and the associated uncertainties are quantified. Epistemic uncertainties in the exceedance rates of large earthquakes often lead to large uncertainties in tsunami run-up. Deviations between modelled tsunami run-up and event observations are quantified, and found to be larger than suggested in previous studies. Accounting for these deviations in PTHA is important, as it leads to a pronounced increase in predicted tsunami run-up for a given exceedance rate.
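
    Under the Poisson occurrence assumption common to probabilistic hazard assessment, an exceedance rate λ (events per year) converts to an exceedance probability over an exposure time T via P = 1 − exp(−λT). The rate and time horizon below are illustrative values, not results from this study.

```python
import math

# Poisson conversion between an annual exceedance rate and the probability
# of at least one exceedance during an exposure time of T years.

def exceedance_probability(rate_per_year, years):
    """P(at least one exceedance in `years`) for a Poisson process."""
    return 1.0 - math.exp(-rate_per_year * years)

# A run-up height exceeded at a rate of 1/500 per year, over a 50-year horizon
p = exceedance_probability(1 / 500, 50)
print(f"P(exceedance in 50 yr) = {p:.3f}")
```

    For small λT the probability is close to λT itself; the exponential form matters mainly for long horizons or frequent events.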

  18. Blood urea nitrogen/creatinine ratio identifies a high-risk but potentially reversible form of renal dysfunction in patients with decompensated heart failure.

    PubMed

    Brisco, Meredith A; Coca, Steven G; Chen, Jennifer; Owens, Anjali Tiku; McCauley, Brian D; Kimmel, Stephen E; Testani, Jeffrey M

    2013-03-01

    Identifying reversible renal dysfunction (RD) in the setting of heart failure is challenging. The goal of this study was to evaluate whether elevated admission blood urea nitrogen/creatinine ratio (BUN/Cr) could identify decompensated heart failure patients likely to experience improvement in renal function (IRF) with treatment. Consecutive hospitalizations with a discharge diagnosis of heart failure were reviewed. IRF was defined as ≥20% increase and worsening renal function as ≥20% decrease in estimated glomerular filtration rate. IRF occurred in 31% of the 896 patients meeting eligibility criteria. Higher admission BUN/Cr was associated with in-hospital IRF (odds ratio, 1.5 per 10 increase; 95% confidence interval [CI], 1.3-1.8; P<0.001), an association persisting after adjustment for baseline characteristics (odds ratio, 1.4; 95% CI, 1.1-1.8; P=0.004). However, higher admission BUN/Cr was also associated with post-discharge worsening renal function (odds ratio, 1.4; 95% CI, 1.1-1.8; P=0.011). Notably, in patients with an elevated admission BUN/Cr, the risk of death associated with RD (estimated glomerular filtration rate <45) was substantial (hazard ratio, 2.2; 95% CI, 1.6-3.1; P<0.001). However, in patients with a normal admission BUN/Cr, RD was not associated with increased mortality (hazard ratio, 1.2; 95% CI, 0.67-2.0; P=0.59; p interaction=0.03). An elevated admission BUN/Cr identifies decompensated patients with heart failure likely to experience IRF with treatment, providing proof of concept that reversible RD may be a discernible entity. However, this improvement seems to be largely transient, and RD, in the setting of an elevated BUN/Cr, remains strongly associated with death. Further research is warranted to develop strategies for the optimal detection and treatment of these high-risk patients.

  19. Blood Urea Nitrogen/Creatinine Ratio Identifies a High-Risk but Potentially Reversible Form of Renal Dysfunction in Patients With Decompensated Heart Failure

    PubMed Central

    Brisco, Meredith A.; Coca, Steven G.; Chen, Jennifer; Owens, Anjali Tiku; McCauley, Brian D.; Kimmel, Stephen E.; Testani, Jeffrey M.

    2014-01-01

    Background Identifying reversible renal dysfunction (RD) in the setting of heart failure is challenging. The goal of this study was to evaluate whether elevated admission blood urea nitrogen/creatinine ratio (BUN/Cr) could identify decompensated heart failure patients likely to experience improvement in renal function (IRF) with treatment. Methods and Results Consecutive hospitalizations with a discharge diagnosis of heart failure were reviewed. IRF was defined as ≥20% increase and worsening renal function as ≥20% decrease in estimated glomerular filtration rate. IRF occurred in 31% of the 896 patients meeting eligibility criteria. Higher admission BUN/Cr was associated with in-hospital IRF (odds ratio, 1.5 per 10 increase; 95% confidence interval [CI], 1.3–1.8; P<0.001), an association persisting after adjustment for baseline characteristics (odds ratio, 1.4; 95% CI, 1.1–1.8; P=0.004). However, higher admission BUN/Cr was also associated with post-discharge worsening renal function (odds ratio, 1.4; 95% CI, 1.1–1.8; P=0.011). Notably, in patients with an elevated admission BUN/Cr, the risk of death associated with RD (estimated glomerular filtration rate <45) was substantial (hazard ratio, 2.2; 95% CI, 1.6–3.1; P<0.001). However, in patients with a normal admission BUN/Cr, RD was not associated with increased mortality (hazard ratio, 1.2; 95% CI, 0.67–2.0; P=0.59; p interaction=0.03). Conclusions An elevated admission BUN/Cr identifies decompensated patients with heart failure likely to experience IRF with treatment, providing proof of concept that reversible RD may be a discernible entity. However, this improvement seems to be largely transient, and RD, in the setting of an elevated BUN/Cr, remains strongly associated with death. Further research is warranted to develop strategies for the optimal detection and treatment of these high-risk patients. PMID:23325460

  20. The Spatial Assessment of the Current Seismic Hazard State for Hard Rock Underground Mines

    NASA Astrophysics Data System (ADS)

    Wesseloo, Johan

    2018-06-01

    Mining-induced seismic hazard assessment is an important component in the management of safety and financial risk in mines. As the seismic hazard is a response to the mining activity, it is non-stationary and variable both in space and time. This paper presents an approach for implementing a probabilistic seismic hazard assessment to assess the current hazard state of a mine. Each of the components of the probabilistic seismic hazard assessment is considered within the context of hard rock underground mines. The focus of this paper is the assessment of the in-mine hazard distribution and does not consider the hazard to nearby public or structures. A rating system and methodologies to present hazard maps, for the purpose of communicating to different stakeholders in the mine, i.e. mine managers, technical personnel and the work force, are developed. The approach allows one to update the assessment with relative ease and within short time periods as new data become available, enabling the monitoring of the spatial and temporal change in the seismic hazard.

  1. Assessment of the detectability of geo-hazards using Google Earth applied to the Three Parallel Rivers Area, Yunnan province of China

    NASA Astrophysics Data System (ADS)

    Voermans, Michiel; Mao, Zhun; Baartman, Jantiene EM; Stokes, Alexia

    2017-04-01

    Anthropogenic activities such as hydropower, mining and road construction in mountainous areas can induce and intensify mass-wasting geo-hazards (e.g. landslides, gullies, rockslides). This undermines local safety and socio-economic development, and endangers biodiversity at a larger scale. To date, the data and knowledge needed to construct geo-hazard databases for further assessment are lacking. This applies in particular to countries with recently emerged rapid economic growth, where there is no previous hazard documentation and where the means to gain data from e.g. intensive fieldwork or VHR satellite imagery and DEM processing are lacking. Google Earth (GE, https://www.google.com/earth/) is a freely available and relatively simple virtual globe, map and geographical information program that is potentially useful for detecting geo-hazards. This research aimed at (i) testing the capability of Google Earth to detect locations of geo-hazards and (ii) identifying factors affecting the diagnostic quality of the detection, including the effects of geo-hazard dimensions, environmental setting, and the professional background and effort of GE users. This was tested on nine geo-hazard sites along road segments in the Three Parallel Rivers Area in the Yunnan province of China, where geo-hazards occur frequently. Along each road site, the position and size of each geo-hazard was measured in situ. Next, independent diagnosers with varying professional experience (students, researchers, engineers etc.) were invited to detect geo-hazard occurrence along each of the sites via GE. Finally, the inventory and diagnostic data were compared. Rates of detected geo-hazards among the 30 diagnosers ranged from 10% to 48%. No strong correlations were found between the type and size of the geo-hazards and their detection rates. Contrary to expectation, the diagnosers' years of expertise also made no difference.
    Meanwhile, the amount of time spent by a diagnoser positively influenced detectability. GE proved to be a useful tool for detecting mainly larger geo-hazards when diligently applied, and is therefore applicable for identifying geo-hazard hotspots. Its usability for further assessments, such as sediment delivery estimation, is questionable, and further research should be carried out to reveal its full potential.

  2. PCP METHODOLOGY FOR DETERMINING DOSE RATES FOR SMALL GRAM QUANTITIES IN SHIPPING PACKAGINGS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nathan, S.

    The Small Gram Quantity (SGQ) concept is based on the understanding that small amounts of hazardous materials, in this case radioactive materials, are significantly less hazardous than large amounts of the same materials. This study describes a methodology designed to estimate an SGQ for several neutron- and gamma-emitting isotopes that can be shipped in a package compliant with the external radiation level limits of 10 CFR Part 71. These regulations require that packaging for the shipment of radioactive materials perform, under both normal and accident conditions, the essential functions of material containment and subcriticality, and maintain external radiation levels within regulatory limits. 10 CFR 71.33(b)(1), (2) and (3) state that radioactive and fissile materials must be identified and that their maximum quantity and chemical and physical forms be included in an application. Furthermore, the U.S. Federal Regulations require that an application contain an evaluation demonstrating that the package (i.e., the packaging and its contents) satisfies the external radiation standards for all packages (10 CFR 71.31(2), 71.35(a) and 71.47). By placing the contents in a He leak-tight containment vessel, and limiting the mass to ensure subcriticality, the first two essential functions are readily met. Some isotopes emit sufficiently strong photon radiation that small amounts of material can yield a large external dose rate. Quantifying the dose rate for a proposed content is a challenging issue for the SGQ approach. It is essential to quantify the external radiation levels from several common gamma and neutron sources that can be safely placed in a specific packaging, to ensure compliance with federal regulations. The Packaging Certification Program (PCP) Methodology for Determining Dose Rate for Small Gram Quantities in Shipping Packagings described in this report provides bounding mass limits for a set of proposed SGQ isotopes.
    Methodology calculations were performed to estimate external radiation levels for the 9977 shipping package, using the MCNP radiation transport code to develop a set of response multipliers (Green's functions) for dose per particle for each neutron and photon spectral group. The source spectrum for each isotope, generated using the ORIGEN-S and RASTA computer codes, was folded with the response multipliers to generate the dose rate per gram of each isotope in the 9977 shipping package and its associated shielded containers. The maximum amount of a single isotope that can be shipped within the regulatory limits of 10 CFR 71.47 for dose rate at the surface of the package is determined. If a package contains a mixture of isotopes, its acceptability for shipment can be determined by a sum-of-fractions approach. Furthermore, the results of this analysis can easily be extended to additional radioisotopes by simply evaluating the neutron and/or photon spectra of those isotopes and folding the spectral data with the Green's functions provided.
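
    The sum-of-fractions acceptability check for mixed contents mentioned above can be sketched as follows: each isotope's mass is divided by its single-isotope limit, and the package is acceptable when the fractions sum to no more than 1. The gram limits below are hypothetical placeholders, not values from the PCP methodology.

```python
# Sketch of a sum-of-fractions check for a mixture of isotopes in one package.
# mass_limits holds hypothetical single-isotope SGQ gram limits, NOT values
# derived from the 9977 shielding calculations described in the report.

def mixture_acceptable(contents, mass_limits):
    """contents: {isotope: grams in package}; mass_limits: {isotope: limit, g}.
    Returns (sum of fractions, True if the mixture may be shipped)."""
    fraction_sum = sum(grams / mass_limits[iso] for iso, grams in contents.items())
    return fraction_sum, fraction_sum <= 1.0

limits   = {"Co-60": 0.010, "Cs-137": 0.500}   # hypothetical gram limits
contents = {"Co-60": 0.004, "Cs-137": 0.200}   # proposed package contents
total, ok = mixture_acceptable(contents, limits)
print(f"sum of fractions = {total:.2f}, acceptable = {ok}")
```

    The same structure extends to any number of isotopes, since each contributes one fraction of its own limit.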

  3. Seismic‐hazard forecast for 2016 including induced and natural earthquakes in the central and eastern United States

    USGS Publications Warehouse

    Petersen, Mark D.; Mueller, Charles; Moschetti, Morgan P.; Hoover, Susan M.; Llenos, Andrea L.; Ellsworth, William L.; Michael, Andrew J.; Rubinstein, Justin L.; McGarr, Arthur F.; Rukstales, Kenneth S.

    2016-01-01

    The U.S. Geological Survey (USGS) has produced a one‐year (2016) probabilistic seismic‐hazard assessment for the central and eastern United States (CEUS) that includes contributions from both induced and natural earthquakes that are constructed with probabilistic methods using alternative data and inputs. This hazard assessment builds on our 2016 final model (Petersen et al., 2016) by adding sensitivity studies, illustrating hazard in new ways, incorporating new population data, and discussing potential improvements. The model considers short‐term seismic activity rates (primarily 2014–2015) and assumes that the activity rates will remain stationary over short time intervals. The final model considers different ways of categorizing induced and natural earthquakes by incorporating two equally weighted earthquake rate submodels that are composed of alternative earthquake inputs for catalog duration, smoothing parameters, maximum magnitudes, and ground‐motion models. These alternatives represent uncertainties on how we calculate earthquake occurrence and the diversity of opinion within the science community. In this article, we also test sensitivity to the minimum moment magnitude between M 4 and M 4.7 and the choice of applying a declustered catalog with b=1.0 rather than the full catalog with b=1.3. We incorporate two earthquake rate submodels: in the informed submodel we classify earthquakes as induced or natural, and in the adaptive submodel we do not differentiate. The alternative submodel hazard maps both depict high hazard and these are combined in the final model. Results depict several ground‐shaking measures as well as intensity and include maps showing a high‐hazard level (1% probability of exceedance in 1 year or greater). 
    Ground motions reach 0.6g horizontal peak ground acceleration (PGA) in north‐central Oklahoma and southern Kansas, and about 0.2g PGA in the Raton basin of Colorado and New Mexico, in central Arkansas, and in north‐central Texas near Dallas–Fort Worth. The chance of experiencing ground motions corresponding to modified Mercalli intensity (MMI) VI or greater earthquake shaking is 2%–12% per year in north‐central Oklahoma, southern Kansas, and the New Madrid region, similar to the chance of damage at sites in high‐hazard portions of California caused by natural earthquakes. Hazard is also significant in the Raton basin of Colorado/New Mexico; north‐central Arkansas; Dallas–Fort Worth, Texas; and a few other areas. Hazard probabilities are much lower (by about half or more) for exceeding MMI VII or VIII. Hazard is 3‐ to 10‐fold higher near some areas of active induced earthquakes than in the 2014 USGS National Seismic Hazard Model (NSHM), which did not consider induced earthquakes. This study, in conjunction with the LandScan™ database (2013), indicates that about 8 million people live in areas of active injection wells that have a greater than 1% chance of experiencing damaging ground shaking (MMI≥VI) in 2016. The final model has high uncertainty, and engineers, regulators, and industry should use these assessments cautiously to make informed decisions on mitigating the potential effects of induced and natural earthquakes.

  4. Physical Activity in Advanced Age: Physical Activity, Function, and Mortality in Advanced Age: A Longitudinal Follow Up (LiLACS NZ).

    PubMed

    Mace Firebaugh, Casey; Moyes, Simon; Jatrana, Santosh; Rolleston, Anna; Kerse, Ngaire

    2018-01-18

    The relationship between physical activity, function, and mortality is not established in advanced age. Physical activity, function, and mortality were followed in a cohort of Māori and non-Māori adults living in advanced age over a period of six years. Generalised linear regression models were used to analyse the association between physical activity and NEADL scores, while Kaplan-Meier survival analysis and Cox proportional hazards models were used to assess the association between physical activity and mortality. The hazard ratio for mortality in the least active physical activity quartile, compared with the most active quartile, was 4.1 for Māori and 1.8 for non-Māori. There was an inverse relationship between physical activity and mortality, with lower hazard ratios for mortality at higher levels of physical activity. Higher levels of physical activity were associated with lower mortality and higher functional status in adults of advanced age.
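    The Kaplan-Meier analysis mentioned above estimates survival as the product of per-event-time survival fractions. A minimal pure-Python sketch (the data and function name are illustrative, not from the LiLACS NZ study):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival curve.
    times:  follow-up time for each subject
    events: 1 = death observed at that time, 0 = censored
    Returns a list of (time, survival probability) at each event time."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = sum(e for tt, e in data if tt == t)
        n_with_t = sum(1 for tt, _ in data if tt == t)
        if deaths:
            # Multiply by the fraction surviving this event time.
            surv *= 1.0 - deaths / n_at_risk
            curve.append((t, surv))
        n_at_risk -= n_with_t  # deaths and censorings both leave the risk set
        i += n_with_t
    return curve

# Toy data: four subjects, one censored at t=2.
curve = kaplan_meier([1, 2, 2, 3], [1, 0, 1, 1])
```

    A Cox model, as used in the study for hazard ratios, would in practice come from a survival library rather than hand-rolled code.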

  5. PACKAGING CERTIFICATION PROGRAM METHODOLOGY FOR DETERMINING DOSE RATES FOR SMALL GRAM QUANTITIES IN SHIPPING PACKAGINGS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nathan, S.; Loftin, B.; Abramczyk, G.

    The Small Gram Quantity (SGQ) concept is based on the understanding that small amounts of hazardous materials, in this case radioactive materials (RAM), are significantly less hazardous than large amounts of the same materials. This paper describes a methodology designed to estimate an SGQ for several neutron- and gamma-emitting isotopes that can be shipped in a package compliant with the external radiation level limits of 10 CFR Part 71. These regulations require packaging for the shipment of radioactive materials, under both normal and accident conditions, to perform the essential functions of material containment and subcriticality and to maintain external radiation levels within the specified limits. By placing the contents in a helium leak-tight containment vessel and limiting the mass to ensure subcriticality, the first two essential functions are readily met. Some isotopes, however, emit sufficiently strong photon radiation that small amounts of material can yield a large dose rate outside the package, and quantifying the dose rate for a proposed content is a challenging issue for the SGQ approach. It is essential to quantify external radiation levels from several common gamma and neutron sources that can be safely placed in a specific packaging, to ensure compliance with federal regulations. The Packaging Certification Program (PCP) Methodology for Determining Dose Rate for Small Gram Quantities in Shipping Packagings provides bounding shielding calculations that define mass limits compliant with 10 CFR 71.47 for a set of proposed SGQ isotopes. The approach is based on energy superposition, with dose response calculated for a set of spectral groups for a baseline physical packaging configuration. The methodology uses the MCNP radiation transport code to evaluate a family of neutron and photon spectral groups, with the 9977 shipping package and its associated shielded containers as the base case. 
This yields a set of 'dose per particle' multipliers for each spectral group. For a given isotope, the source spectrum is folded with the response for each group, and the summed contribution from all isotopes determines the total dose from the RAM in the container.
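    The folding step described above reduces to a simple weighted sum. The spectral-group names and multiplier values below are hypothetical placeholders; the real multipliers come from the MCNP runs for the baseline package configuration:

```python
# Hypothetical dose-per-particle multipliers (dose rate per unit emission rate)
# for a few illustrative spectral groups. Real values are produced by MCNP
# transport calculations for the specific packaging configuration.
DOSE_PER_PARTICLE = {
    "gamma_0.5MeV": 2.1e-9,
    "gamma_1.0MeV": 4.0e-9,
    "neutron_fast": 1.2e-8,
}

def external_dose_rate(source_spectrum):
    """Energy superposition: fold a source spectrum (particles/s per spectral
    group) with the per-group dose response and sum the contributions."""
    return sum(DOSE_PER_PARTICLE[g] * rate for g, rate in source_spectrum.items())

# Example spectrum for a hypothetical isotope content.
total = external_dose_rate({"gamma_0.5MeV": 1e9, "neutron_fast": 1e8})
```

    Summing such totals over all isotopes in a proposed content, and comparing against the 10 CFR 71.47 limits, is what bounds the allowable mass.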

  6. Why people do what they do to protect against earthquake risk: perceptions of hazard adjustment attributes.

    PubMed

    Lindell, Michael K; Arlikatti, Sudha; Prater, Carla S

    2009-08-01

    This study examined respondents' self-reported adoption of 16 hazard adjustments (preimpact actions to reduce danger to persons and property), their perceptions of those adjustments' attributes, and the correlations of those perceived attributes with respondents' demographic characteristics. The sample comprised 561 randomly selected residents from three cities in Southern California exposed to high seismic risk and three cities in Western Washington exposed to moderate seismic risk. The results show that hazard adjustment perceptions were defined by hazard-related attributes and resource-related attributes. More significantly, respondents showed a significant degree of consensus in their ratings of those attributes and used them to differentiate among the hazard adjustments, as indicated by statistically significant differences among the hazard adjustment profiles. Finally, there were many significant correlations between respondents' demographic characteristics and the perceived characteristics of hazard adjustments, but few consistent patterns among these correlations.

  7. Construction Management Training in the Navy Seabees

    DTIC Science & Technology

    1992-01-01

    classroom training in developing a variety of skills. Skills attained are recorded under the Personnel Readiness Capability Program (PRCP) and...Functional Skill 090.2) - Hands-on safety course required for all crew leaders and project supervisors. e- Hazard Communication (094.1) - Federal...Hazard Communication Training Program required by 29 CFR 1910.1200. This course is required for all personnel. Those exposed to hazardous chemicals

  8. RiskScape: a new tool for comparing risk from natural hazards (Invited)

    NASA Astrophysics Data System (ADS)

    Stirling, M. W.; King, A.

    2010-12-01

    RiskScape is a joint venture between New Zealand's GNS Science and NIWA, and represents a comprehensive and easy-to-use tool for multi-hazard risk and impact analysis. It has basic GIS functionality, with import/export functions for use with GIS software. Five natural hazards have been implemented in RiskScape to date: river flood, earthquake, volcanic ash, tsunami and wind storm. The software converts hazard exposure information into the likely impacts for a region, for example damage and replacement costs, casualties, economic losses, disruption, and number of people affected. It can therefore be used to assist with risk management, land-use planning, building codes and design, risk identification, prioritization of risk reduction and mitigation, determination of "best use" risk-reduction investment, evacuation and contingency planning, awareness raising, public information, realistic scenarios for exercises, and hazard event response. Three geographically disparate pilot regions, each exposed to a different mix of natural hazards, have been used to develop and trial RiskScape in New Zealand. Future (phase II) development of RiskScape will include the following hazards: landslides (both rainfall- and earthquake-triggered), storm surges, pyroclastic flows and lahars, and climate change effects. While RiskScape development has thus far focussed on scenario-based risk, future work will advance the software toward probabilistic solutions.

  9. A Hybrid Ground-Motion Prediction Equation for Earthquakes in Western Alberta

    NASA Astrophysics Data System (ADS)

    Spriggs, N.; Yenier, E.; Law, A.; Moores, A. O.

    2015-12-01

    Estimation of ground-motion amplitudes that may be produced by future earthquakes constitutes the foundation of seismic hazard assessment and earthquake-resistant structural design. This is typically done using a prediction equation that quantifies amplitudes as a function of key seismological variables such as magnitude, distance and site condition. In this study, we develop a hybrid empirical prediction equation for earthquakes in western Alberta, where evaluation of seismic hazard associated with induced seismicity is of particular interest. We use peak ground motions and response spectra from recorded seismic events to model the regional source and attenuation attributes. The available empirical data are limited in the magnitude range of engineering interest (M>4). Therefore, we combine empirical data with a simulation-based model in order to obtain seismologically informed predictions for moderate-to-large magnitude events. The methodology is two-fold. First, we investigate the shape of geometrical spreading in Alberta. We supplement the seismic data with ground motions obtained from mining/quarry blasts, in order to gain insights into the regional attenuation over a wide distance range. A comparison of ground-motion amplitudes for earthquakes and mining/quarry blasts shows that both event types decay at similar rates with distance and demonstrate a significant Moho-bounce effect. In the second stage, we calibrate the source and attenuation parameters of a simulation-based prediction equation to match the available amplitude data from seismic events. We model the geometrical spreading using a trilinear function with attenuation rates obtained from the first stage, and calculate coefficients of anelastic attenuation and site amplification via regression analysis. This provides a hybrid ground-motion prediction equation that is calibrated for observed motions in western Alberta and is applicable to moderate-to-large magnitude events.
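    A trilinear geometric-spreading function of the kind described above can be sketched as follows. This is a minimal illustration; the hinge distances and slopes below are placeholder values (a Moho-bounce-style flattening between two hinges), not the calibrated Alberta coefficients, which the abstract does not report.

```python
import math

def trilinear_spreading(R, r1=70.0, r2=140.0, b1=-1.3, b2=0.1, b3=-0.5):
    """Trilinear geometric-spreading term (contribution to log10 amplitude
    as a function of distance R, in km). Piecewise log-linear with hinge
    distances r1 and r2; continuity is enforced by accumulating each
    segment's contribution up to its hinge."""
    if R <= r1:
        return b1 * math.log10(R)
    if R <= r2:
        return b1 * math.log10(r1) + b2 * math.log10(R / r1)
    return (b1 * math.log10(r1) + b2 * math.log10(r2 / r1)
            + b3 * math.log10(R / r2))
```

    The middle segment's near-flat (or positive) slope mimics the Moho-bounce effect the abstract describes, where amplitudes stop decaying over an intermediate distance range before attenuation resumes.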

  10. Factors influencing to earthquake caused economical losses on urban territories

    NASA Astrophysics Data System (ADS)

    Nurtaev, B.; Khakimov, S.

    2005-12-01

    This paper discusses the assessment of earthquake-related economic losses in urban territories of Uzbekistan, taking into account the damage-forming factors that increase or reduce losses. Vulnerability factors for buildings and facilities were classified, and the most important were selected from a total of 50. The factors were ranked by level of impact and by their weight function in the loss assessment. One group of damage-forming factors covers seismic hazard assessment and the design, construction and maintenance of buildings and facilities. The other is formed by city-planning characteristics and includes the density of construction and population, the area of soft soils, the presence of liquefaction-susceptible soils, etc. Weight functions and interval values were assigned to all these factors by group, and methodical recommendations for loss assessment taking the factors into account were developed. This makes it possible to carry out preventive measures for the protection of vulnerable territories and to differentiate the cost assessment of each region according to territorial peculiarities and damage value. Using the developed method, we ranked cities by risk level, which allowed us to establish ratings of the general vulnerability of urban territories and, on that basis, to make optimal decisions oriented toward loss mitigation and increased safety of the population. The technique can also be used by insurance companies for zoning of territory, for developing effective land-use schemes and rational town planning, and for the economic valuation of territory in work connected with seismic hazard estimation. 
Further improvement of the technique for rating cities by level of earthquake damage will improve the quality of construction and the rational placement of buildings, and will act as an economic stimulus for increasing the seismic resistance of buildings.

  11. Association of Physical Activity History With Physical Function and Mortality in Old Age

    PubMed Central

    Koster, Annemarie; Valkeinen, Heli; Patel, Kushang V.; Bandinelli, Stefania; Guralnik, Jack M.; Ferrucci, Luigi

    2016-01-01

    Background. We examined whether physical activity in early adulthood, late midlife, and old age as well as cumulative physical activity history are associated with changes in physical functioning and mortality in old age. Methods. Data are from participants aged 65 years or older enrolled in the InCHIANTI study who were followed up from 1998–2000 to 2007–2008 (n = 1,149). At baseline, participants recalled their physical activity levels at ages 20–40, 40–60, and in the previous year, and they were categorized as physically inactive, moderately active, and physically active. Physical performance was assessed with the Short Physical Performance Battery and self-reported mobility disability was evaluated at the 3-, 6- and 9-year follow-up. Mortality follow-up was assessed until the end of 2010. Results. Physical inactivity at baseline was associated with greater decline in Short Physical Performance Battery score (mean 9-year change: −2.72, 95% CI: −3.08, −2.35 vs −0.98, 95% CI: −1.57, −0.39) and greater rate of incident mobility disability (hazard ratio 4.66, 95% CI 1.14–19.07) and mortality (hazard ratio 2.18, 95% CI 1.01–4.70) compared to physically active participants at baseline. Being physically active throughout adulthood was associated with smaller decline in physical performance as well as with lower risk of incident mobility disability and premature death compared with those who had been less active during their adult life. Conclusions. Higher cumulative physical activity over the life course was associated with less decline in physical performance and reduced rate of incident mobility disability and mortality in older ages. PMID:26290538

  12. A rainfall risk analysis thanks to a GIS-based estimation of urban vulnerability

    NASA Astrophysics Data System (ADS)

    Renard, Florent; Chapon, Pierre-Marie

    2010-05-01

    The urban community of Lyon, situated in France in the north of the Rhône valley, comprises 1.2 million inhabitants within 515 km². With such a concentration of assets, policy makers and local elected officials attach great importance to the management of hydrological risks, particularly given the inherent characteristics of the territory. While the hazards associated with these risks in the Lyon territory have been the subject of numerous analyses, studies of the vulnerability of Greater Lyon are rare and share shortcomings that impair their validity. We recall that risk is classically viewed as the relationship between the probability of occurrence of hazards and vulnerability. In this article, vulnerability comprises two components. The first is the sensitivity of the stakes exposed to hydrological hazards such as urban runoff, that is, their propensity to suffer damage during a flood (Gleize and Reghezza, 2007). The second is their relative importance in the functioning of the community: not all stakes make the same contribution to Greater Lyon. For example, damage to urban furniture such as a bus shelter is less harmful to the activities of the urban area than damage to transport infrastructure (Renard and Chapon, 2010). This communication proposes to assess the vulnerability of the Lyon urban area to hydrological hazards. The territory comprises human, environmental and material stakes. The first part of this work identifies all these stakes as exhaustively as possible. It is then necessary to build a "vulnerability index" (Tixier et al., 2006), using multicriteria decision-aid methods to evaluate the two components of vulnerability: sensitivity and contribution to the functioning of the community. 
The results of the overall vulnerability assessment are then presented and coupled with various water-related hazards, such as runoff associated with heavy rains, to locate areas of risk in the urban area. Targets that share the same rank on this vulnerability index do not possess the same importance or the same sensitivity to flood hazard. The second part of this work therefore defines the priorities and sensitivities of the different targets based on expert judgment. Multicriteria decision methods are used to prioritize elements and are thus suited to modelling the sensitivity of the stakes of Greater Lyon (Griot, 2008). The purpose of these methods is the assessment of priorities among the different components of the situation; Thomas Saaty's analytic hierarchy process (1980) is the most frequently used because of its many advantages. On this basis, formal calculations of the priorities and sensitivities of the elements were conducted from expert judgments: in semi-structured interviews, the 38 experts in our sample compared stakes pairwise to judge which seemed relatively more important, and proceeded in the same manner to determine the stakes' sensitivity to flood hazard. The consistency of the experts' answers was validated by calculating a coherence ratio, and their results were aggregated to provide priority functions (based on the relative importance of each stake) and sensitivity functions (based on the relative sensitivity of each stake). From these priority and sensitivity functions, the general vulnerability function is obtained. The vulnerability functions define the importance of the stakes of Greater Lyon and their sensitivity to hydrological hazards. The global vulnerability function, obtained from the sensitivity and priority functions, shows the great importance of human stakes (75%). 
Environmental targets account for 12% of the global vulnerability function, as do material stakes. However, the environmental and material stakes do not carry the same weight in the priority and sensitivity functions: environmental stakes appear more important than material ones (17% versus 5% in the priority function) but less sensitive to a hydrological hazard (6% versus 20% in the sensitivity function). Priority and sensitivity functions are likewise established for all stakes at all levels. The stakes are then converted to a 100-metre grid, standardizing the collection framework and the heterogeneous data to allow comparison. The result is a detailed, consistent and objective picture of the vulnerability of the Greater Lyon territory. Finally, to obtain a direct reading of risk as the combination of hazard and vulnerability, the two maps are overlaid.
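    Saaty's analytic hierarchy process, cited above, derives a priority (or sensitivity) vector from a matrix of pairwise comparisons. A minimal sketch using the common geometric-mean approximation; the example matrix is invented, as the study's actual expert judgments are not reproduced here:

```python
from math import prod

def ahp_priorities(M):
    """Approximate AHP priority vector from a pairwise-comparison matrix M
    (M[i][j] = how much more important stake i is judged than stake j),
    using the geometric-mean-of-rows method, normalised to sum to 1."""
    n = len(M)
    row_gm = [prod(row) ** (1.0 / n) for row in M]  # geometric mean per row
    total = sum(row_gm)
    return [g / total for g in row_gm]

# Invented example: one stake judged 3x as important as the other.
weights = ahp_priorities([[1.0, 3.0], [1.0 / 3.0, 1.0]])  # -> roughly [0.75, 0.25]
```

    In a full AHP workflow one would also compute Saaty's consistency ratio from the matrix's principal eigenvalue, matching the "coherence ratio" validation step described above.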

  13. Extended GTST-MLD for aerospace system safety analysis.

    PubMed

    Guo, Chiming; Gong, Shiyu; Tan, Lin; Guo, Bo

    2012-06-01

    The hazards caused by complex interactions in aerospace systems have become a problem that urgently needs to be addressed. This article introduces a method for identifying aerospace system hazard interactions during the design stage, based on an extended GTST-MLD (goal tree-success tree-master logic diagram). GTST-MLD is a functional modeling framework with a simple architecture. Ontology is used to extend GTST-MLD's ability to describe system interactions by adding system design knowledge and past accident experience. At both the functional and the equipment level, this approach can help technicians detect potential hazard interactions. Finally, a case study illustrates the method. © 2011 Society for Risk Analysis.

  14. Sexual orientation differences in the relationship between victimization and hazardous drinking among women in the National Alcohol Survey

    PubMed Central

    Drabble, Laurie; Trocki, Karen F.; Hughes, Tonda L.; Korcha, Rachael A.; Lown, Anne E.

    2013-01-01

    This study examined relationships between past experiences of victimization (sexual abuse and physical abuse in childhood, sexual abuse and physical abuse in adulthood, and lifetime victimization) and hazardous drinking among sexual minority women compared to exclusively heterosexual women. Data were from 11,169 women responding to sexual identity and sexual behavior questions from three National Alcohol Survey waves: 2000 (n=3,880), 2005 (n=3,464) and 2010 (n=3,825). A hazardous drinking index was constructed from five dichotomous variables (5+ drinking in the past year, drinking two or more drinks daily, drinking to intoxication in the past year, two or more lifetime dependence symptoms, and two or more lifetime drinking-related negative consequences). Exclusively heterosexual women were compared to three groups of sexual minority women: lesbian, bisexual, and women who identified as heterosexual but reported same-sex partners. Each of the sexual minority groups reported significantly higher rates of lifetime victimization (59.1% of lesbians, 76% of bisexuals, and 64.4% of heterosexual women reporting same-sex partners) than exclusively heterosexual women (42.3%). Odds for hazardous drinking among sexual minority women were attenuated when measures of victimization were included in the regression models. Sexual minority groups had significantly higher odds of hazardous drinking, even after controlling for demographic and victimization variables: lesbian (ORadj=2.0, CI=1.1–3.9, p<.01); bisexual (ORadj=1.8, CI=1.0–3.3, p<.05); heterosexual with same-sex partners (ORadj=2.7, CI=1.7–4.3, p<.001). Higher rates of victimization likely contribute to, but do not fully explain, higher rates of hazardous drinking among sexual minority women. PMID:23438246

  15. The beliefs about pros and cons of drinking and intention to change among hazardous and moderate alcohol users: a population-based cross-sectional study.

    PubMed

    Ansker, Fredrik G; Helgason, Asgeir R; Ahacic, Kozma

    2014-08-01

    Fundamental to supporting hazardous alcohol users are the rationales for reducing alcohol intake highlighted by the users themselves. This study analyses the relative importance of beliefs about pros and cons of drinking in relation to having an intention to reduce intake among both hazardous and moderate alcohol users. Intention to change was assessed in a representative sample of Stockholm's population (n = 4278, response rate 56.5%). Alcohol use was assessed using the Alcohol Use Disorders Identification Test measure. A decisional balance inventory was used to examine various beliefs about the pros and cons of drinking, which covered affect changes, social gains and losses, and possible adverse effects. Independent correlations were determined by logistic regression using a backward exclusion procedure (P > 0.05). Higher ratings of importance were generally related to intent, whether or not the contrast was with having no intent or already having made a reduction. This was especially true for hazardous users. Only two beliefs were independently correlated with change among hazardous users: 'Drinking could get me addicted' and 'Drinking makes me more relaxed/less tense' (pseudo-R2 < 0.1). Among moderate users, there was no uniform pattern in the relationships. Unexpectedly, hazardous users with an intent to change rated pro arguments as more important than those with no intent to change. Of the investigated pros and cons, only a few were independently related to intention to change drinking behaviour. These arguments provide interesting topics in consultations. Little support was found for any rational decision making behind the intention to reduce alcohol intake. © The Author 2014. Published by Oxford University Press on behalf of the European Public Health Association.

  16. Improved Performance and Safety for High Energy Batteries Through Use of Hazard Anticipation and Capacity Prediction

    NASA Technical Reports Server (NTRS)

    Atwater, Terrill

    1993-01-01

    Predicting the capacity remaining in used high-rate, high-energy batteries provides important information to the user. Knowledge of the capacity remaining in used batteries results in better utilization, which translates into improved readiness and cost savings through complete, efficient use. High-rate batteries, due to their chemical nature, are highly sensitive to misuse (i.e., over-discharge or very high-rate discharge). Battery failure due to misuse or manufacturing defects could be disastrous. Since high-rate, high-energy batteries are expensive and energetic, a reliable method of predicting both failures and remaining energy has been actively sought. Due to concerns over safety, the behavior of lithium/sulphur dioxide cells at different temperatures and current drains was examined. The main thrust of this effort was to determine failure conditions for incorporation in hazard anticipation circuitry. In addition, capacity prediction formulas were developed from test data, and a process that performs continuous, real-time hazard anticipation and capacity prediction was developed. The introduction of this process into microchip technology will enable the production of reliable, safe, and efficient high-energy batteries.

  17. [An analysis of occupational hazard in manufacturing industry in Guangzhou, China, in 2013].

    PubMed

    Zhang, Haihong; Li, Yongqin; Zhou, Hailin; Rong, Xing; Zhu, Shaofang; He, Yinan; Zhai, Ran; Liu, Yiming

    2015-08-01

    To provide data for occupational health supervision by analyzing the occupational health status of the manufacturing industry in Guangzhou, China. An occupational health investigation was performed in 280 enterprises randomly selected from 8 industries based on industry stratification. According to the occupational health standards, 198 of the 280 enterprises were supervised and monitored. Sample testing was performed in 3–5 workplaces per enterprise where workers were exposed to the highest concentration/intensity of occupational hazards for the longest time. Comparative analyses of the rates at which hazard levels exceeded limits were performed among enterprises, workplaces, and testing items from different industries. The concentrations of occupational hazards in 42.93% (85/198) of enterprises and 22.96% (200/871) of workplaces were above the limit values. The most severe hazards were noise in the shipbuilding and wooden furniture industries and welding fumes in the shipbuilding industry. Less than 30% of enterprises were able to provide occupational health examinations and periodic test reports on workplace occupational hazards. The proportion of workers with abnormal occupational health examination results requiring reexamination reached 6.63% (832/12 549); they were mostly from the shipbuilding, wooden furniture, and chemical industries. Occupational health supervision should be strengthened in enterprises, and hazards from noise and dust should be selectively controlled or reduced. The publication of relevant occupational health data and information by enterprises should be promoted to enhance social supervision.

  18. Suicide Following Deliberate Self-Harm.

    PubMed

    Olfson, Mark; Wall, Melanie; Wang, Shuai; Crystal, Stephen; Gerhard, Tobias; Blanco, Carlos

    2017-08-01

    The authors sought to identify risk factors for repeat self-harm and completed suicide over the following year among adults with deliberate self-harm. A national cohort of Medicaid-financed adults clinically diagnosed with deliberate self-harm (N=61,297) was followed for up to 1 year. Repeat self-harm per 1,000 person-years and suicide rates per 100,000 person-years (based on cause of death information from the National Death Index) were determined. Hazard ratios of repeat self-harm and suicide were estimated by Cox proportional hazard models. During the 12 months after nonfatal self-harm, the rate of repeat self-harm was 263.2 per 1,000 person-years and the rate of completed suicide was 439.1 per 100,000 person-years, or 37.2 times higher than in a matched general population cohort. The hazard of suicide was higher after initial self-harm events involving violent as compared with nonviolent methods (hazard ratio=7.5, 95% CI=5.5-10.1), especially firearms (hazard ratio=15.86, 95% CI=10.7-23.4; computed with poisoning as reference), and to a lesser extent after events of patients who had recently received outpatient mental health care (hazard ratio=1.6, 95% CI=1.2-2.0). Compared with self-harm patients using nonviolent methods, those who used violent methods were at significantly increased risk of suicide during the first 30 days after the initial event (hazard ratio=17.5, 95% CI=11.2-27.3), but not during the following 335 days. Adults treated for deliberate self-harm frequently repeat self-harm in the following year. Patients who use a violent method for their initial self-harm, especially firearms, have an exceptionally high risk of suicide, particularly right after the initial event, which highlights the importance of careful assessment and close follow-up of this group.

  19. Measurement of natural radioactivity and assessment of radiation hazard indices in soil samples at Pengerang, Kota Tinggi, Johor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hassan, Nur Nazihah; Khoo, Kok Siong

    The Pengerang area consists of a mix of private plantations, individual residential lots and state land leased for agriculture-related activities. The analysis was conducted to determine baseline specific activities and radiation hazard indices in the area surrounding Pengerang, which will be developed into a major downstream hub for oil and gas. The aims of this preliminary study were 1) to determine the specific activities of ²³⁸U, ²³²Th, ²²⁶Ra and ⁴⁰K in soil samples from six selected areas by gamma-ray spectrometry and 2) to calculate the radiation hazard indices. The specific activities of the samples ranged from 7.08±5.01 to 36.29±25.72 Bq/kg, 5.62±3.98 to 34.53±24.07 Bq/kg, 4.75±3.42 to 24.76±17.66 Bq/kg and 10.58±7.51 to 101.25±72.00 Bq/kg for ²³⁸U, ²³²Th, ²²⁶Ra and ⁴⁰K, respectively. These values were well within the range reported by UNSCEAR. The study also examined the radiation hazard indices; the mean values obtained were 48.49±28.06 Bq/kg for the Radium Equivalent Activity (Raeq), 0.34 for the Representative Level Index (Iγ), 21.83 nGy/h for the absorbed dose rate (D), 0.27 mSv/y for the Annual Effective Dose Rate (Deff), and 0.13 and 0.18 for the External Hazard Index (Hex) and Internal Hazard Index (Hin), respectively. These calculated hazard indices were used to estimate the potential radiological health risk in soil, and the associated dose rates were well below their permissible limits. The overall findings indicate no radiological threat to the health of the population in the study area.
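    The indices reported above have standard definitions in the radiometric-survey literature; the abstract does not spell them out, so the formulas below assume the conventional radium-equivalent and external/internal hazard index weightings, shown here as a sketch:

```python
def radium_equivalent(a_ra, a_th, a_k):
    """Radium equivalent activity Raeq (Bq/kg) from specific activities of
    226Ra, 232Th and 40K, using the conventional weighting."""
    return a_ra + 1.43 * a_th + 0.077 * a_k

def external_hazard_index(a_ra, a_th, a_k):
    """Hex; values below 1 conventionally indicate no significant hazard."""
    return a_ra / 370.0 + a_th / 259.0 + a_k / 4810.0

def internal_hazard_index(a_ra, a_th, a_k):
    """Hin; tighter 226Ra divisor accounts for inhaled radon progeny."""
    return a_ra / 185.0 + a_th / 259.0 + a_k / 4810.0
```

    Under the usual criterion, Hex and Hin below 1 (and Raeq below 370 Bq/kg) indicate no significant radiological hazard, consistent with the mean values reported above.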

  20. Morbidity, mortality and economic burden of renal impairment in cardiac intensive care.

    PubMed

    Chew, D P; Astley, C; Molloy, D; Vaile, J; De Pasquale, C G; Aylward, P

    2006-03-01

    Moderate to severe impairment of renal function has emerged as a potent risk factor for adverse short- and long-term outcomes among patients presenting with cardiac disease. We sought to define the clinical, late mortality and economic burden of this risk factor among patients presenting to cardiac intensive care. A clinical audit of patients presenting to cardiac intensive care was undertaken between July 2002 and June 2003. All patients presenting with cardiac diagnoses were included in the study. Baseline creatinine levels were assessed in all patients. Late mortality was assessed by the interrogation of the National Death Register. Renal impairment was defined as estimated glomerular filtration rate <60 mL/min per 1.73 m2, as calculated by the Modified Diet in Renal Disease formula. In-hospital and late outcomes were compared by Cox proportional hazards modelling, adjusting for known confounders. A matched analysis and attributable risk calculation were undertaken to assess the proportion of late mortality accounted for by impairment of renal function and other known negative prognostic factors. The in-hospital total cost associated with renal impairment was assessed by linear regression. Glomerular filtration rate <60 mL/min per 1.73 m2 was evident in 33.0% of this population. Among these patients, in-hospital and late mortality were substantially increased: risk ratio 13.2; 95% CI 3.0-58.1; P < 0.001 and hazard ratio 6.2; 95% CI 3.6-10.7; P < 0.001, respectively. In matched analysis, renal impairment to this level was associated with 42.1% of all the late deaths observed. Paradoxically, patients with renal impairment were more conservatively managed, but their hospitalizations were associated with an excess adjusted in-hospital cost of $A1676. Impaired renal function is associated with a striking clinical and economic burden among patients presenting to cardiac intensive care. 
As a marker for future risk, renal function accounts for a substantial proportion of the burden of late mortality. The burden of risk suggests a greater potential opportunity for improvement of outcomes through optimisation of therapeutic strategies.
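
    The record above defines renal impairment as an estimated GFR below 60 mL/min per 1.73 m2 computed with the Modified Diet in Renal Disease (MDRD) formula. A minimal sketch of the 4-variable MDRD equation (the IDMS-traceable version with the 175 coefficient; coefficients are the published values, shown here purely for illustration and not taken from this study):

```python
def mdrd_egfr(creatinine_mg_dl: float, age_years: float,
              female: bool, black: bool) -> float:
    """4-variable MDRD estimated GFR in mL/min per 1.73 m^2
    (IDMS-traceable version; leading coefficient 175)."""
    egfr = 175.0 * creatinine_mg_dl ** -1.154 * age_years ** -0.203
    if female:
        egfr *= 0.742  # published sex adjustment
    if black:
        egfr *= 1.212  # published race adjustment
    return egfr

def renally_impaired(egfr: float) -> bool:
    # Threshold used in the study above: eGFR < 60 mL/min per 1.73 m^2
    return egfr < 60.0
```

    For example, a 70-year-old non-Black man with a serum creatinine of 2.0 mg/dL falls well below the 60 mL/min per 1.73 m2 threshold, while a 40-year-old with creatinine 0.9 mg/dL does not.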

  1. How much does a reminder letter increase cervical screening among under-screened women in NSW?

    PubMed

    Morrell, Stephen; Taylor, Richard; Zeckendorf, Sue; Niciak, Amanda; Wain, Gerard; Ross, Jayne

    2005-02-01

To evaluate a direct mail-out campaign to increase Pap screening rates in women who have not had a test in 48 months. Ninety thousand under-screened women were randomised to be mailed a 48-month reminder letter to have a Pap test (n=60,000), or not to be mailed a letter (n=30,000). Differences in Pap test rates were assessed by Kaplan-Meier survival analysis, by chi-square tests of significance between Pap test rates in letter versus no-letter groups, and by proportional hazards regression modelling of predictors of a Pap test with letter versus no-letter as the main study variable. T-tests were conducted on mean time to Pap test to assess whether time to Pap test differed significantly between the intervention and control groups. In the 90 days following each mail-out, Pap test rates in the letter group were significantly higher than in the no-letter group, by approximately two percentage points. After controlling for potential confounders, the hazard ratio of a Pap test within 90 days of a mail-out in the letter group was 1.5 compared with 1.0 in the no-letter group. Hazard ratios of having a Pap test within 90 days decreased significantly with time since last Pap test (p<0.0001); were significantly higher than 1.0 for most non-metropolitan areas of NSW compared with metropolitan areas; and increased significantly with age (p<0.0001). Pap test hazard ratios were not associated with socio-economic status of area of residence, but the hazard ratio was significantly higher than 1.0 if the reminder letter was sent after the Christmas/New Year break. No significant differences in mean time to Pap test were found between the letter and no-letter groups. Being sent a reminder letter is associated with higher Pap testing rates in under-screened women.
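
    Time-to-Pap-test in this trial was summarised with Kaplan-Meier survival analysis. A bare-bones product-limit estimator in plain Python (a toy sketch for illustration, not the study's code; real analyses would use a vetted survival package):

```python
def kaplan_meier(times, events):
    """Product-limit estimate of the survival function S(t).
    times: observation times; events: 1 = event observed, 0 = censored.
    Returns a list of (event_time, survival) step points."""
    pairs = sorted(zip(times, events))
    n_at_risk = len(pairs)
    surv = 1.0
    steps = []
    i = 0
    while i < len(pairs):
        t = pairs[i][0]
        d = 0  # events at time t
        c = 0  # censorings at time t
        while i < len(pairs) and pairs[i][0] == t:
            if pairs[i][1]:
                d += 1
            else:
                c += 1
            i += 1
        if d:
            surv *= 1.0 - d / n_at_risk  # multiply in this time's survival factor
            steps.append((t, surv))
        n_at_risk -= d + c
    return steps
```

    On the toy data `times=[1, 2, 3, 4]`, `events=[1, 1, 0, 1]`, the estimate steps down at times 1, 2, and 4, with the censored observation at time 3 removed from the risk set but contributing no step.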

  2. 2018 one‐year seismic hazard forecast for the central and eastern United States from induced and natural earthquakes

    USGS Publications Warehouse

    Petersen, Mark D.; Mueller, Charles; Moschetti, Morgan P.; Hoover, Susan M.; Rukstales, Kenneth S.; McNamara, Daniel E.; Williams, Robert A.; Shumway, Allison; Powers, Peter; Earle, Paul; Llenos, Andrea L.; Michael, Andrew J.; Rubinstein, Justin L.; Norbeck, Jack; Cochran, Elizabeth S.

    2018-01-01

This article describes the U.S. Geological Survey (USGS) 2018 one‐year probabilistic seismic hazard forecast for the central and eastern United States from induced and natural earthquakes. For consistency, the updated 2018 forecast is developed using the same probabilistic seismicity‐based methodology as applied in the two previous forecasts. Rates of M≥3.0 earthquakes across the United States grew rapidly between 2008 and 2015 but have steadily declined over the past 3 years, especially in areas of Oklahoma and southern Kansas where fluid injection has decreased. The seismicity pattern in 2017 was complex, with earthquakes more spatially dispersed than in the previous years. Some areas of west‐central Oklahoma experienced increased activity rates where industrial activity increased. Earthquake rates in Oklahoma (429 earthquakes of M≥3 and 4 of M≥4), the Raton basin (Colorado/New Mexico border; six earthquakes of M≥3), and the New Madrid seismic zone (11 earthquakes of M≥3) continue to be higher than historical levels. Almost all of these earthquakes occurred within the highest-hazard regions of the 2017 forecast. Even though rates declined over the past 3 years, the short‐term hazard for damaging ground shaking across much of Oklahoma remains high because of continuing rates of smaller earthquakes that are still hundreds of times higher than at any time in the state's history. Fine details and variability between the 2016–2018 forecasts are obscured by significant uncertainties in the input model. These short‐term hazard levels are similar to those of active regions in California. During 2017, M≥3 earthquakes also occurred in or near Ohio, West Virginia, Missouri, Kentucky, Tennessee, Arkansas, Illinois, Oklahoma, Kansas, Colorado, New Mexico, Utah, and Wyoming.

  3. Comparing the 12-month patency of low- versus high-pressure dilation in failing arteriovenous fistulae: A prospective multicenter trial (YOROI study).

    PubMed

    Wakamoto, Koki; Doi, Shigehiro; Nakashima, Ayumu; Kawai, Toru; Kyuden, Yasufumi; Naito, Takayuki; Asai, Mariko; Takahashi, Shunsuke; Murakami, Masaaki; Masaki, Takao

    2018-03-01

    This study was performed to investigate the effect of the balloon dilation pressure on the 12-month patency rate in patients with failed arteriovenous fistulas undergoing hemodialysis. In this multicenter, prospective, randomized trial, the 4-mm-diameter YOROI balloon was used for dilation of stenotic lesions. The balloons were inflated to a pressure of 8 atm (low-pressure group) or 30 atm to achieve complete expansion (high-pressure group). The 12-month patency rate after balloon angioplasty was analyzed by the Kaplan-Meier method and log-rank test and/or a Cox proportional hazard model. We also investigated the dilation pressure required to achieve complete expansion in the high-pressure group. In total, 71 patients were enrolled and allocated to either the low-pressure group (n = 34) or the high-pressure group (n = 37). The 12-month patency rates showed no significant difference between the low- and high-pressure groups (47% and 49%, respectively; p = 0.87). In the low-pressure group, the patency rate was not different between patients with complete dilation and residual stenosis (44% and 50%, respectively; p = 0.87). The Cox proportional hazard model revealed that the 12-month patency rate was associated with the stenosis diameter (hazard ratio 0.36; p = 0.001) and the presence of diabetes (hazard ratio 0.33; p = 0.018). Finally, the pressure required to achieve complete dilation was ≤20 atm in 76% of patients and ≤30 atm in 97% of patients. One patient required a dilation pressure of >30 atm. The patency rate does not differ between low-pressure dilation and high-pressure dilation.
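
    Patency in this trial was compared with the log-rank test alongside Cox modelling. A minimal two-sample log-rank chi-square statistic in plain Python (an illustrative sketch, not the study's code; a real analysis would use a vetted library such as lifelines or R's survival package):

```python
def logrank_statistic(times1, events1, times2, events2):
    """Two-sample log-rank chi-square statistic (1 degree of freedom).
    times*: observation times; events*: 1 = event, 0 = censored."""
    data = [(t, e, 0) for t, e in zip(times1, events1)] + \
           [(t, e, 1) for t, e in zip(times2, events2)]
    event_times = sorted({t for t, e, _ in data if e})
    obs_minus_exp = 0.0
    var = 0.0
    for t in event_times:
        # risk sets and event counts at time t
        n1 = sum(1 for tt, _, g in data if tt >= t and g == 0)
        n2 = sum(1 for tt, _, g in data if tt >= t and g == 1)
        d1 = sum(1 for tt, e, g in data if tt == t and e and g == 0)
        d2 = sum(1 for tt, e, g in data if tt == t and e and g == 1)
        n, d = n1 + n2, d1 + d2
        obs_minus_exp += d1 - d * n1 / n          # observed minus expected, group 1
        if n > 1:
            var += d * (n1 / n) * (n2 / n) * (n - d) / (n - 1)
    return obs_minus_exp ** 2 / var if var > 0 else 0.0
```

    Two identical survival experiences give a statistic of zero; clearly separated groups push the statistic above the 3.84 critical value for p < 0.05 at 1 df.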

  4. Preliminary volcanic hazards evaluation for Los Alamos National Laboratory Facilities and Operations : current state of knowledge and proposed path forward

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Keating, Gordon N.; Schultz-Fellenz, Emily S.; Miller, Elizabeth D.

    2010-09-01

    The integration of available information on the volcanic history of the region surrounding Los Alamos National Laboratory indicates that the Laboratory is at risk from volcanic hazards. Volcanism in the vicinity of the Laboratory is unlikely within the lifetime of the facility (ca. 50–100 years) but cannot be ruled out. This evaluation provides a preliminary estimate of recurrence rates for volcanic activity. If further assessment of the hazard is deemed beneficial to reduce risk uncertainty, the next step would be to convene a formal probabilistic volcanic hazards assessment.

  5. An evaluation of the relative fire hazards of jet A and jet B for commercial flight

    NASA Technical Reports Server (NTRS)

    Hibbard, R. R.; Hacker, P. T.

    1973-01-01

    The relative fire hazards of Jet A and Jet B aircraft fuels are evaluated. The evaluation is based on a consideration of the presence of and/or the generation of flammable mixtures in fuel systems, the ignition characteristics, and the flame propagation rates for the two fuel types. Three distinct aircraft operating regimes where fuel type may be a factor in fire hazards are considered. These are: (1) ground handling and refueling, (2) flight, and (3) crash. The evaluation indicates that the overall fire hazards for Jet A are less than for Jet B fuel.

  6. Semantic organizational strategy predicts verbal memory and remission rate of geriatric depression.

    PubMed

    Morimoto, Sarah Shizuko; Gunning, Faith M; Kanellopoulos, Dora; Murphy, Christopher F; Klimstra, Sibel A; Kelly, Robert E; Alexopoulos, George S

    2012-05-01

This study tests the hypothesis that the use of semantic organizational strategy during the free-recall phase of a verbal memory task predicts remission of geriatric depression. Sixty-five older patients with major depression participated in a 12-week escitalopram treatment trial. Neuropsychological performance was assessed at baseline after a 2-week drug washout period. The Hopkins Verbal Learning Test-Revised was used to assess verbal learning and memory. Remission was defined as a Hamilton Depression Rating Scale score of ≤ 7 for 2 consecutive weeks and no longer meeting the DSM-IV-TR criteria for major depression. The association between the number of clusters used at the final learning trial (trial 3) and remission was examined using Cox's proportional hazards survival analysis. The relationship between the number of clusters utilized in the final learning trial and the number of words recalled after a 25-min delay was examined in a regression with age and education as covariates. A higher number of clusters utilized predicted remission (hazard ratio, 1.26; 95% confidence interval, 1.04-1.54; χ2 = 4.23, df = 3, p = 0.04). There was a positive relationship between the total number of clusters used by the end of the third learning trial and the total number of words recalled at the delayed recall trial (F(3,58) = 7.93; p < 0.001). Effective semantic strategy use at baseline on a verbal list learning task by older depressed patients was associated with higher rates of remission with antidepressant treatment. This result provides support for previous findings indicating that measures of executive functioning at baseline are useful in predicting antidepressant response. Copyright © 2011 John Wiley & Sons, Ltd.

  7. 2017 One‐year seismic‐hazard forecast for the central and eastern United States from induced and natural earthquakes

    USGS Publications Warehouse

    Petersen, Mark D.; Mueller, Charles; Moschetti, Morgan P.; Hoover, Susan M.; Shumway, Allison; McNamara, Daniel E.; Williams, Robert; Llenos, Andrea L.; Ellsworth, William L.; Rubinstein, Justin L.; McGarr, Arthur F.; Rukstales, Kenneth S.

    2017-01-01

    We produce a one‐year 2017 seismic‐hazard forecast for the central and eastern United States from induced and natural earthquakes that updates the 2016 one‐year forecast; this map is intended to provide information to the public and to facilitate the development of induced seismicity forecasting models, methods, and data. The 2017 hazard model applies the same methodology and input logic tree as the 2016 forecast, but with an updated earthquake catalog. We also evaluate the 2016 seismic‐hazard forecast to improve future assessments. The 2016 forecast indicated high seismic hazard (greater than 1% probability of potentially damaging ground shaking in one year) in five focus areas: Oklahoma–Kansas, the Raton basin (Colorado/New Mexico border), north Texas, north Arkansas, and the New Madrid Seismic Zone. During 2016, several damaging induced earthquakes occurred in Oklahoma within the highest hazard region of the 2016 forecast; all of the 21 moment magnitude (M) ≥4 and 3 M≥5 earthquakes occurred within the highest hazard area in the 2016 forecast. Outside the Oklahoma–Kansas focus area, two earthquakes with M≥4 occurred near Trinidad, Colorado (in the Raton basin focus area), but no earthquakes with M≥2.7 were observed in the north Texas or north Arkansas focus areas. Several observations of damaging ground‐shaking levels were also recorded in the highest hazard region of Oklahoma. The 2017 forecasted seismic rates are lower in regions of induced activity due to lower rates of earthquakes in 2016 compared with 2015, which may be related to decreased wastewater injection caused by regulatory actions or by a decrease in unconventional oil and gas production. Nevertheless, the 2017 forecasted hazard is still significantly elevated in Oklahoma compared to the hazard calculated from seismicity before 2009.
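
    The "greater than 1% probability of potentially damaging ground shaking in one year" threshold in these forecasts follows from the Poisson occurrence model: for an annual exceedance rate λ, the probability of at least one exceedance in time t is 1 − exp(−λt), and the mean return period is 1/λ. A small sketch of both quantities:

```python
import math

def exceedance_probability(rate_per_year: float, years: float = 1.0) -> float:
    """Poisson probability of at least one exceedance in the given time window."""
    return 1.0 - math.exp(-rate_per_year * years)

def mean_return_period(rate_per_year: float) -> float:
    """Mean return period (years) is the reciprocal of the annual rate."""
    return 1.0 / rate_per_year
```

    For instance, an annual rate of 0.01 gives roughly a 1% one-year exceedance probability (a 100-year return period), but about a 39% probability over a 50-year window.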

  8. Risk factors for hazardous events in olfactory-impaired patients.

    PubMed

    Pence, Taylor S; Reiter, Evan R; DiNardo, Laurence J; Costanzo, Richard M

    2014-10-01

    Normal olfaction provides essential cues to allow early detection and avoidance of potentially hazardous situations. Thus, patients with impaired olfaction may be at increased risk of experiencing certain hazardous events such as cooking or house fires, delayed detection of gas leaks, and exposure to or ingestion of toxic substances. To identify risk factors and potential trends over time in olfactory-related hazardous events in patients with impaired olfactory function. Retrospective cohort study of 1047 patients presenting to a university smell and taste clinic between 1983 and 2013. A total of 704 patients had both clinical olfactory testing and a hazard interview and were studied. On the basis of olfactory function testing results, patients were categorized as normosmic (n = 161), mildly hyposmic (n = 99), moderately hyposmic (n = 93), severely hyposmic (n = 142), and anosmic (n = 209). Patient evaluation including interview, examination, and olfactory testing. Incidence of specific olfaction-related hazardous events (ie, burning pots and/or pans, starting a fire while cooking, inability to detect gas leaks, inability to detect smoke, and ingestion of toxic substances or spoiled foods) by degree of olfactory impairment. The incidence of having experienced any hazardous event progressively increased with degree of impairment: normosmic (18.0%), mildly hyposmic (22.2%), moderately hyposmic (31.2%), severely hyposmic (32.4%), and anosmic (39.2%). Over 3 decades there was no significant change in the overall incidence of hazardous events. 
Analysis of demographic data (age, sex, race, smoking status, and etiology) revealed significant differences in the incidence of hazardous events based on age (among 397 patients <65 years, 148 [37.3%] with hazardous event, vs 31 of 146 patients ≥65 years [21.3%]; P < .001), sex (among 278 women, 106 [38.1%] with hazardous event, vs 73 of 265 men [27.6%]; P = .009), and race (among 98 African Americans, 41 [41.8%] with hazardous event, vs 134 of 434 whites [30.9%]; P = .04). Increased level of olfactory impairment portends an increased risk of experiencing a hazardous event. Risk is further impacted by individuals' age, sex, and race. These results may assist health care practitioners in counseling patients on the risks associated with olfactory impairment.

  9. Performance Analysis: Work Control Events Identified January - August 2010

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    De Grange, C E; Freeman, J W; Kerr, C E

    2011-01-14

This performance analysis evaluated 24 events that occurred at LLNL from January through August 2010. The analysis identified areas of potential work control process and/or implementation weaknesses and several common underlying causes. Human performance improvement and safety culture factors were part of the causal analysis of each event and were analyzed. The collective significance of all events in 2010, as measured by the occurrence reporting significance category and by the proportion of events that have been reported to the DOE ORPS under the "management concerns" reporting criteria, does not appear to have increased. The frequency of reporting in each of the significance categories has not changed in 2010 compared to the previous four years. There is no change indicating a trend in the significance category, and there has been no increase in the proportion of occurrences reported in the higher significance category. Also, the frequency of events, 42 events reported through August 2010, is not greater than in previous years and is below the average of 63 occurrences per year at LLNL since 2006. Over the previous four years, an average of 43% of LLNL's reported occurrences have been reported as either "management concerns" or "near misses." In 2010, 29% of the occurrences have been reported as "management concerns" or "near misses." This rate indicates that LLNL is now reporting fewer "management concern" and "near miss" occurrences compared to the previous four years. From 2008 to the present, LLNL senior management has undertaken a series of initiatives to strengthen the work planning and control system with the primary objective to improve worker safety. In 2008, the LLNL Deputy Director established the Work Control Integrated Project Team to develop the core requirements and graded elements of an institutional work planning and control system.
By the end of that year this system was documented and implementation had begun. In 2009, training of the workforce began, and as of the time of this report more than 50% of authorized Integration Work Sheets (IWS) use the activity-based planning process. In 2010, LSO independently reviewed the work planning and control process and confirmed to the Laboratory that the Integrated Safety Management (ISM) System was implemented. LLNL conducted a cross-directorate management self-assessment of work planning and control and is developing actions to respond to the issues identified. Ongoing efforts to strengthen the work planning and control process and to improve the quality of LLNL work packages are in progress: completion of remaining actions in response to the 2009 DOE Office of Health, Safety, and Security (HSS) evaluation of LLNL's ISM System; scheduling more than 14 work planning and control self-assessments in FY11; continuing to align subcontractor work control with the Institutional work planning and control system; and continuing to maintain the electronic IWS application. The 24 events included in this analysis were caused by errors in the first four of the five ISMS functions. The most frequent cause was errors in analyzing the hazards (Function 2). The second most frequent cause was errors occurring when defining the work (Function 1), followed by errors during the performance of work (Function 4). Interestingly, very few errors in developing controls (Function 3) resulted in events. This leads one to conclude that if improvements are made to defining the scope of work and analyzing the potential hazards, LLNL may reduce the frequency or severity of events. Analysis of the 24 events resulted in the identification of ten common causes. Some events had multiple causes, resulting in 39 causes being identified across the 24 events.
The most frequent cause was workers, supervisors, or experts believing they understood the work and the hazards when their understanding was incomplete. The second most frequent cause was unclear, incomplete, or confusing documents directing the work. Together, these two causes were mentioned 17 times and contributed to 13 of the events. All of the events with the cause of "workers, supervisors, or experts believing they understood the work and the hazards but their understanding was incomplete" had this error in the first two ISMS functions: define the work and analyze the hazard. This means that these causes result in the scope of work being ill-defined or the hazard(s) improperly analyzed. Incomplete implementation of these functional steps leads to the hazards not being controlled. The causes are then manifested in events when the work is conducted. The process to operate safely relies on accurately defining the scope of work. This review has identified a number of examples of latent organizational weakness in the execution of work control processes.

  10. The Hawaiian Volcano Observatory's current approach to forecasting lava flow hazards (Invited)

    NASA Astrophysics Data System (ADS)

    Kauahikaua, J. P.

    2013-12-01

Hawaiian Volcanoes are best known for their frequent basaltic eruptions, which typically start with fast-moving channelized `a`a flows fed by high eruption rates. If the flows continue, they generally transition into pahoehoe flows, fed by lower eruption rates, after a few days to weeks. Kilauea Volcano's ongoing eruption illustrates this--since 1986, effusion at Kilauea has mostly produced pahoehoe. The current state of lava flow simulation is quite advanced, but the simplicity of the models means that they are most appropriately used during the first, most vigorous, days to weeks of an eruption--during the effusion of `a`a flows. Colleagues at INGV in Catania have shown decisively that MAGFLOW simulations utilizing satellite-derived eruption rates can be effective at estimating hazards during the initial periods of an eruption crisis. However, the algorithms do not simulate the complexity of pahoehoe flows. Forecasts of lava flow hazards are the most common form of volcanic hazard assessments made in Hawai`i. Communications with emergency managers over the last decade have relied on simple steepest-descent line maps, coupled with empirical lava flow advance rate information, to portray the imminence of lava flow hazard to nearby communities. Lavasheds, calculated as watersheds, are used as a broader context for the future flow paths and to advise on the utility of diversion efforts, should they be contemplated. The key is to communicate the uncertainty of any approach used to formulate a forecast and, if the forecast uses simple tools, these communications can be fairly straightforward. The calculation of steepest-descent paths and lavasheds relies on the accuracy of the digital elevation model (DEM) used, so the choice of DEM is critical. In Hawai`i, the best choice is not the most recent but is a 1980s-vintage 10-m DEM--more recent LIDAR and satellite radar DEMs are referenced to the ellipsoid and include vegetation effects.
On low-slope terrain, steepest descent lines calculated on a geoid-based DEM may differ significantly from those calculated on an ellipsoid-based DEM. Good estimates of lava flow advance rates can be obtained from empirical compilations of historical advance rates of Hawaiian lava flows. In this way, rates appropriate for observed flow types (`a`a or pahoehoe, channelized or not) can be applied. Eruption rate is arguably the most important factor, while slope is also significant for low eruption rates. Eruption rate, however, remains the most difficult parameter to estimate during an active eruption. The simplicity of the HVO approach is its major benefit. How much better can lava-flow advance be forecast for all types of lava flows? Will the improvements outweigh the increased uncertainty propagated through the simulation calculations? HVO continues to improve and evaluate its lava flow forecasting tools to provide better hazard assessments to emergency personnel.
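
    The steepest-descent line maps described above can be approximated with a simple D8-style downhill trace over a gridded DEM. A toy sketch (illustrative only; operational tools must also handle sinks, flat areas, and geoid- versus ellipsoid-referenced DEMs, as the record notes):

```python
import math

def steepest_descent_path(dem, start):
    """Follow the steepest downhill of the 8 neighbors until a pit or the edge.
    dem: 2-D list of elevations; start: (row, col). Returns the visited cells."""
    rows, cols = len(dem), len(dem[0])
    path = [start]
    r, c = start
    while True:
        best, best_drop = None, 0.0
        for dr in (-1, 0, 1):
            for dc in (-1, 0, 1):
                if dr == dc == 0:
                    continue
                nr, nc = r + dr, c + dc
                if 0 <= nr < rows and 0 <= nc < cols:
                    # drop per unit distance; diagonals are sqrt(2) cells away
                    dist = math.hypot(dr, dc)
                    drop = (dem[r][c] - dem[nr][nc]) / dist
                    if drop > best_drop:
                        best, best_drop = (nr, nc), drop
        if best is None:  # pit: no strictly lower neighbor
            return path
        r, c = best
        path.append(best)
```

    On a plane tilted toward the origin, the path runs diagonally straight downslope, which is why DEM accuracy (and vertical datum) dominates the quality of the resulting flow-path forecast.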

  11. Preliminary Earthquake Hazard Map of Afghanistan

    USGS Publications Warehouse

    Boyd, Oliver S.; Mueller, Charles S.; Rukstales, Kenneth S.

    2007-01-01

    Introduction Earthquakes represent a serious threat to the people and institutions of Afghanistan. As part of a United States Agency for International Development (USAID) effort to assess the resource potential and seismic hazards of Afghanistan, the Seismic Hazard Mapping group of the United States Geological Survey (USGS) has prepared a series of probabilistic seismic hazard maps that help quantify the expected frequency and strength of ground shaking nationwide. To construct the maps, we do a complete hazard analysis for each of ~35,000 sites in the study area. We use a probabilistic methodology that accounts for all potential seismic sources and their rates of earthquake activity, and we incorporate modeling uncertainty by using logic trees for source and ground-motion parameters. See the Appendix for an explanation of probabilistic seismic hazard analysis and discussion of seismic risk. Afghanistan occupies a southward-projecting, relatively stable promontory of the Eurasian tectonic plate (Ambraseys and Bilham, 2003; Wheeler and others, 2005). Active plate boundaries, however, surround Afghanistan on the west, south, and east. To the west, the Arabian plate moves northward relative to Eurasia at about 3 cm/yr. The active plate boundary trends northwestward through the Zagros region of southwestern Iran. Deformation is accommodated throughout the territory of Iran; major structures include several north-south-trending, right-lateral strike-slip fault systems in the east and, farther to the north, a series of east-west-trending reverse- and strike-slip faults. This deformation apparently does not cross the border into relatively stable western Afghanistan. In the east, the Indian plate moves northward relative to Eurasia at a rate of about 4 cm/yr. A broad, transpressional plate-boundary zone extends into eastern Afghanistan, trending southwestward from the Hindu Kush in northeast Afghanistan, through Kabul, and along the Afghanistan-Pakistan border. 
Deformation here is expressed as a belt of major, north-northeast-trending, left-lateral strike-slip faults and abundant seismicity. The seismicity intensifies farther to the northeast and includes a prominent zone of deep earthquakes associated with northward subduction of the Indian plate beneath Eurasia that extends beneath the Hindu Kush and Pamirs Mountains. Production of the seismic hazard maps is challenging because the geological and seismological data required to produce a seismic hazard model are limited. The data that are available for this project include historical seismicity and poorly constrained slip rates on only a few of the many active faults in the country. Much of the hazard is derived from a new catalog of historical earthquakes: from 1964 to the present, with magnitude equal to or greater than about 4.5, and with depth between 0 and 250 kilometers. We also include four specific faults in the model: the Chaman fault with an assigned slip rate of 10 mm/yr, the Central Badakhshan fault with an assigned slip rate of 12 mm/yr, the Darvaz fault with an assigned slip rate of 7 mm/yr, and the Hari Rud fault with an assigned slip rate of 2 mm/yr. For these faults and for shallow seismicity less than 50 km deep, we incorporate published ground-motion estimates from tectonically active regions of western North America, Europe, and the Middle East. Ground-motion estimates for deeper seismicity are derived from data in subduction environments. We apply estimates derived for tectonic regions where subduction is the main tectonic process for intermediate-depth seismicity between 50- and 250-km depth. Within the framework of these limitations, we have developed a preliminary probabilistic seismic-hazard assessment of Afghanistan, the type of analysis that underpins the seismic components of modern building codes in the United States. The assessment includes maps of estimated peak ground-acceleration (PGA), 0.2-second spectral acceleration (SA), and 1.0-second SA.

  12. The 2014 United States National Seismic Hazard Model

    USGS Publications Warehouse

    Petersen, Mark D.; Moschetti, Morgan P.; Powers, Peter; Mueller, Charles; Haller, Kathleen; Frankel, Arthur; Zeng, Yuehua; Rezaeian, Sanaz; Harmsen, Stephen; Boyd, Oliver; Field, Edward; Chen, Rui; Rukstales, Kenneth S.; Luco, Nicolas; Wheeler, Russell; Williams, Robert; Olsen, Anna H.

    2015-01-01

    New seismic hazard maps have been developed for the conterminous United States using the latest data, models, and methods available for assessing earthquake hazard. The hazard models incorporate new information on earthquake rupture behavior observed in recent earthquakes; fault studies that use both geologic and geodetic strain rate data; earthquake catalogs through 2012 that include new assessments of locations and magnitudes; earthquake adaptive smoothing models that more fully account for the spatial clustering of earthquakes; and 22 ground motion models, some of which consider more than double the shaking data applied previously. Alternative input models account for larger earthquakes, more complicated ruptures, and more varied ground shaking estimates than assumed in earlier models. The ground motions, for levels applied in building codes, differ from the previous version by less than ±10% over 60% of the country, but can differ by ±50% in localized areas. The models are incorporated in insurance rates, risk assessments, and as input into the U.S. building code provisions for earthquake ground shaking.

  13. HMPT: Basic Radioactive Material Transportation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hypes, Philip A.

    2016-02-29

    Hazardous Materials and Packaging and Transportation (HMPT): Basic Radioactive Material Transportation Live (#30462, suggested one time) and Test (#30463, required initially and every 36 months) address the Department of Transportation’s (DOT’s) function-specific [required for hazardous material (HAZMAT) handlers, packagers, and shippers] training requirements of the HMPT Los Alamos National Laboratory (LANL) Labwide training. This course meets the requirements of 49 CFR 172, Subpart H, Section 172.704(a)(ii), Function-Specific Training.

  14. Using hazard functions to assess changes in processing capacity in an attentional cuing paradigm.

    PubMed

    Wenger, Michael J; Gibson, Bradley S

    2004-08-01

    Processing capacity--defined as the relative ability to perform mental work in a unit of time--is a critical construct in cognitive psychology and is central to theories of visual attention. The unambiguous use of the construct, experimentally and theoretically, has been hindered by both conceptual confusions and the use of measures that are at best only coarsely mapped to the construct. However, more than 25 years ago, J. T. Townsend and F. G. Ashby (1978) suggested that the hazard function on the response time (RT) distribution offered a number of conceptual advantages as a measure of capacity. The present study suggests that a set of statistical techniques, well-known outside the cognitive and perceptual literatures, offers the ability to perform hypothesis tests on RT-distribution hazard functions. These techniques are introduced, and their use is illustrated in application to data from the contingent attentional capture paradigm.
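
    The capacity measure Townsend and Ashby proposed is the hazard function on the RT distribution, h(t) = f(t)/S(t): the instantaneous probability of completing processing at time t given it has not yet completed. A crude binned estimator from raw response times (a sketch only; the techniques the paper introduces are smoothed estimators that support formal hypothesis tests):

```python
def binned_hazard(rts, bin_width):
    """Discrete hazard estimate per bin:
    events in bin / (number still at risk at bin start x bin width)."""
    rts = sorted(rts)
    n = len(rts)
    t, i, hazard = 0.0, 0, []
    while i < n:
        at_risk = n - i      # responses not yet completed at bin start
        d = 0                # responses completing within this bin
        while i < n and rts[i] < t + bin_width:
            d += 1
            i += 1
        hazard.append((t, d / (at_risk * bin_width)))
        t += bin_width
    return hazard
```

    A rising estimated hazard over t is the signature of increasing completion pressure; comparing hazard functions across cuing conditions is the kind of capacity comparison the paper motivates.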

  15. Compliance with Electrical and Fire Protection Standards of U.S. Controlled and Occupied Facilities in Afghanistan

    DTIC Science & Technology

    2013-07-18

are subject to damage and abrasion (figure 5). The use of an extension cord instead of fixed wiring creates the possibility of fire, electrical shock...Medical Clinic has an adjacent warehouse that is of a higher hazard and is not separated by 1-hour fire resistance rated construction. KMC 17-May...higher hazard and is not separated by 1-hour fire resistance rated construction. DynCorp NO Awaiting USG Decision DI has not received governmental

  16. Hazardous medical waste generation in Greece: case studies from medical facilities in Attica and from a small insular hospital.

    PubMed

    Komilis, Dimitrios; Katsafaros, Nikolaos; Vassilopoulos, Panagiotis

    2011-08-01

The accurate calculation of the unit generation rates and composition of medical waste generated from medical facilities is necessary in order to design medical waste treatment systems. In this work, the unit medical waste generation rates of 95 public and private medical facilities in the Attica region were calculated based on daily weight records from a central medical waste incineration facility. The calculated medical waste generation rates (in kg bed(-1) day(-1)) varied widely, with average values at 0.27 ± 113% and 0.24 ± 121% for public and private medical facilities, respectively. The hazardous medical waste generation was measured, at the source, in the 40-bed hospital of the island of Ikaria for a period of 42 days during a 6-month period. The average hazardous medical waste generation rate was 1.204 kg occupied bed(-1) day(-1) or 0.33 kg (official) bed(-1) day(-1). From the above amounts, 54% resulted from the patients' rooms (solid and liquid wastes combined), 24% from the emergency department (solid waste), 17% from the clinical pathology lab and 6% from the X-ray lab. On average, 17% of the total hazardous medical waste was solely infectious. In conclusion, no correlation between the number of beds and the unit medical waste generation rate could be established. Each hospital should be studied separately, since medical waste generation and composition depend on the number and type of departments/laboratories at each hospital, the number of external patients and the number of occupied beds.
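
    The kg bed(-1) day(-1) unit above is a simple ratio of total waste mass to bed-days, which is why the choice of denominator (official versus occupied beds) changes the reported rate so much. A minimal sketch of the bookkeeping, with hypothetical numbers rather than the study's data:

```python
def unit_generation_rate(total_kg: float, beds: int, days: int) -> float:
    """Medical waste generation rate in kg per bed per day."""
    return total_kg / (beds * days)
```

    For example, 540 kg collected over 30 days in a 100-bed facility gives 0.18 kg bed(-1) day(-1); using a smaller occupied-bed count in the denominator would raise the rate accordingly.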

  17. Post-Discharge Worsening Renal Function in Patients with Type 2 Diabetes and Recent Acute Coronary Syndrome.

    PubMed

    Morici, Nuccia; Savonitto, Stefano; Ponticelli, Claudio; Schrieks, Ilse C; Nozza, Anna; Cosentino, Francesco; Stähli, Barbara E; Perrone Filardi, Pasquale; Schwartz, Gregory G; Mellbin, Linda; Lincoff, A Michael; Tardif, Jean-Claude; Grobbee, Diederick E

    2017-09-01

    Worsening renal function during hospitalization for an acute coronary syndrome is strongly predictive of in-hospital and long-term outcome. However, the role of post-discharge worsening renal function has never been investigated in this setting. We considered the placebo cohort of the AleCardio trial comparing aleglitazar with standard medical therapy among patients with type 2 diabetes mellitus and a recent acute coronary syndrome. Patients who had died or had been admitted to hospital for heart failure before the 6-month follow-up, as well as patients without complete renal function data, were excluded, leaving 2776 patients for the analysis. Worsening renal function was defined as a >20% reduction in estimated glomerular filtration rate from discharge to 6 months, or progression to macroalbuminuria. The Cox regression analysis was used to determine the prognostic impact of 6-month renal deterioration on the composite of all-cause death and hospitalization for heart failure. Worsening renal function occurred in 204 patients (7.34%). At a median follow-up of 2 years the estimated rates of death and hospitalization for heart failure per 100 person-years were 3.45 (95% confidence interval [CI], 2.46-6.36) for those with worsening renal function, versus 1.43 (95% CI, 1.14-1.79) for patients with stable renal function. At the adjusted analysis worsening renal function was associated with the composite endpoint (hazard ratio 2.65; 95% CI, 1.57-4.49; P <.001). Post-discharge worsening renal function is not infrequent among patients with type 2 diabetes and acute coronary syndromes with normal or mildly depressed renal function, and is a strong predictor of adverse cardiovascular events. Copyright © 2017 Elsevier Inc. All rights reserved.
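    The study's outcome definition lends itself to a direct sketch. The following is a hedged illustration (the function name and signature are ours, not the authors' code; the >20% eGFR-drop and macroalbuminuria criteria are the paper's stated definition):

```python
# Hedged sketch of the study's outcome definition: post-discharge worsening
# renal function = >20% reduction in estimated glomerular filtration rate
# (eGFR) from discharge to 6 months, or progression to macroalbuminuria.
# The function name and call signature are illustrative, not the authors' code.

def worsening_renal_function(egfr_discharge: float,
                             egfr_6mo: float,
                             macroalbuminuria_6mo: bool = False) -> bool:
    """True if eGFR fell by more than 20% or macroalbuminuria developed."""
    relative_drop = (egfr_discharge - egfr_6mo) / egfr_discharge
    return relative_drop > 0.20 or macroalbuminuria_6mo

print(worsening_renal_function(80.0, 60.0))  # 25% drop -> True
print(worsening_renal_function(80.0, 70.0))  # 12.5% drop -> False
```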

  18. Social Hazards as Manifested Workplace Discrimination and Health (Vietnamese and Ukrainian Female and Male Migrants in Czechia)

    PubMed Central

    Drbohlav, Dušan; Dzúrová, Dagmar

    2017-01-01

    Social hazards as one of the dimensions of workplace discrimination are a potential social determinant of health inequalities. The aim of this study was to investigate relations between self-reported health and social hazard characteristics (defined as discrimination as such, violence or threat of violence, time pressure or work overload, and risk of accident) among Vietnamese and Ukrainian migrants (males and females) in Czechia by age, education level and marital status. This study is based on data from a survey of 669 immigrants in Czechia in 2013. Logistic regression analysis indicates that the given independent variables (the given social hazards and socio-demographic characteristics), as predictors of the quality of self-reported health, are more important for immigrant females than for males, irrespective of citizenship, albeit only for some of them and to differing extents. We found that exposure to the selected social hazards in the workplace leads to worse self-rated health, especially for females. On the other hand, no statistically significant relationship was found between poor self-rated health and discrimination as such. Reality calls for more research and, consequently, better policies and practices in the field of health inequalities. PMID:28994700

  19. Social Hazards as Manifested Workplace Discrimination and Health (Vietnamese and Ukrainian Female and Male Migrants in Czechia).

    PubMed

    Drbohlav, Dušan; Dzúrová, Dagmar

    2017-10-10

    Social hazards as one of the dimensions of workplace discrimination are a potential social determinant of health inequalities. The aim of this study was to investigate relations between self-reported health and social hazard characteristics (defined as discrimination as such, violence or threat of violence, time pressure or work overload, and risk of accident) among Vietnamese and Ukrainian migrants (males and females) in Czechia by age, education level and marital status. This study is based on data from a survey of 669 immigrants in Czechia in 2013. Logistic regression analysis indicates that the given independent variables (the given social hazards and socio-demographic characteristics), as predictors of the quality of self-reported health, are more important for immigrant females than for males, irrespective of citizenship, albeit only for some of them and to differing extents. We found that exposure to the selected social hazards in the workplace leads to worse self-rated health, especially for females. On the other hand, no statistically significant relationship was found between poor self-rated health and discrimination as such. Reality calls for more research and, consequently, better policies and practices in the field of health inequalities.

  20. Fault-based PSHA of an active tectonic region characterized by low deformation rates: the case of the Lower Rhine Graben

    NASA Astrophysics Data System (ADS)

    Vanneste, Kris; Vleminckx, Bart; Camelbeeck, Thierry

    2016-04-01

    The Lower Rhine Graben (LRG) is one of the few regions in intraplate NW Europe where seismic activity can be linked to active faults, yet probabilistic seismic hazard assessments of this region have hitherto been based on area-source models, in which the LRG is modeled as a single or a small number of seismotectonic zones with uniform seismicity. While fault-based PSHA has become common practice in more active regions of the world (e.g., California, Japan, New Zealand, Italy), knowledge of active faults has lagged behind in other regions, due to an incomplete tectonic inventory, a low level of seismicity, a lack of systematic fault parameterization, or a combination thereof. In the past few years, efforts have increasingly been directed toward the inclusion of fault sources in PSHA in these regions as well, in order to predict hazard on a more physically sound basis. In Europe, the EC project SHARE ("Seismic Hazard Harmonization in Europe", http://www.share-eu.org/) represented an important step forward in this regard. In the framework of this project, we previously compiled the first parameterized fault model for the LRG that can be applied in PSHA. We defined 15 fault sources based on major stepovers, bifurcations, gaps, and important changes in strike, dip direction or slip rate. Based on the available data, we were able to place reasonable bounds on the parameters required for time-independent PSHA: length, width, strike, dip, rake, slip rate, and maximum magnitude. With long-term slip rates remaining below 0.1 mm/yr, the LRG can be classified as a low-deformation-rate structure. Information on recurrence interval and elapsed time since the last major earthquake is lacking for most faults, impeding time-dependent PSHA.
We consider different models to construct the magnitude-frequency distribution (MFD) of each fault: a slip-rate constrained form of the classical truncated Gutenberg-Richter MFD (Anderson & Luco, 1983) versus a characteristic MFD following Youngs & Coppersmith (1985). The summed Anderson & Luco fault MFDs show a remarkably good agreement with the MFD obtained from the historical and instrumental catalog for the entire LRG, whereas the summed Youngs & Coppersmith MFD clearly underpredicts low to moderate magnitudes, but yields higher occurrence rates for M > 6.3 than would be obtained by simple extrapolation of the catalog MFD. The moment rate implied by the Youngs & Coppersmith MFDs is about three times higher, but is still within the range allowed by current GPS uncertainties. Using the open-source hazard engine OpenQuake (http://openquake.org/), we compute hazard maps for return periods of 475, 2475, and 10,000 yr, and for spectral periods of 0 s (PGA) and 1 s. We explore the impact of various parameter choices, such as MFD model, GMPE distance metric, and inclusion of a background zone to account for lower magnitudes, and we also compare the results with hazard maps based on area-source models. References: Anderson, J. G., and J. E. Luco (1983), Consequences of slip rate constraints on earthquake occurrence relations, Bull. Seismol. Soc. Am., 73(2), 471-496. Youngs, R. R., and K. J. Coppersmith (1985), Implications of fault slip rates and earthquake recurrence models to probabilistic seismic hazard estimates, Bull. Seismol. Soc. Am., 75(4), 939-964.

  1. Don't Forget Kīlauea: Explosive Hazards at an Ocean Island Basaltic Volcano

    NASA Astrophysics Data System (ADS)

    Swanson, D. A.; Houghton, B. F.

    2015-12-01

    Kīlauea alternates between periods of high and low magma supply rate, each period lasting centuries. The low rate is only a few percent of the high rate. High supply rate, typified by the past 200 years, leads to frequent lava flows, elevated SO2 emission, and relatively low-hazard Hawaiian-style explosive activity (lava fountains, spattering). Periods of low magma supply are very different. They accompany formation and maintenance of a deep caldera, the floor of which is at or below the water table, and are characterized by phreatomagmatic and phreatic explosive eruptions largely powered by external water. The low magma supply rate results in few lava flows and reduced SO2 output. Studies of explosive deposits from the past two periods of low magma supply (~200 BCE-1000 CE and ~1500-1800 CE) indicate that VEIs calculated from isopach maps can range up to a low 3. Clast-size studies suggest that subplinian column heights can reach >10 km (most recently in 1790), though more frequent column heights are ~5-8 km. Pyroclastic density currents (PDCs) present severe proximal hazards; a PDC in 1790 killed a few hundred people in an area of Hawai'i Volcanoes National Park today visited by 5000 people daily. Ash in columns less than about 5 km a.s.l. is confined to the trade-wind regime and advects southwest. Ash in higher columns enters the jet stream and is transported east and southeast of the summit caldera. Recurrence of such column heights today would present aviation hazards, which, for an isolated state dependent on air transport, could have an especially deleterious economic impact. There is currently no way to estimate when a period of low magma supply, a deep caldera, and powerful explosive activity will return. Hazard assessments must take into account the cyclic nature of Kīlauea's eruptive activity, not just its present status; consequently, assessments for periods of high and low magma supply rates should be made in parallel to cover all eventualities.

  2. Survival and weak chaos.

    PubMed

    Nee, Sean

    2018-05-01

    Survival analysis in biology and reliability theory in engineering concern the dynamical functioning of bio/electro/mechanical units. Here we incorporate effects of chaotic dynamics into the classical theory. Dynamical systems theory now distinguishes strong and weak chaos. Strong chaos generates Type II survivorship curves entirely as a result of the internal operation of the system, without any age-independent, external, random forces of mortality. Weak chaos exhibits (a) intermittency and (b) Type III survivorship, defined as a decreasing per capita mortality rate: engineering explicitly defines this pattern of decreasing hazard as 'infant mortality'. Weak chaos generates two phenomena from the normal functioning of the same system. First, infant mortality, sensu engineering, without any external explanatory factors, such as manufacturing defects, which is followed by increased average longevity of survivors. Second, sudden failure of units during their normal period of operation, before the onset of age-dependent mortality arising from senescence. The relevance of these phenomena encompasses, for example: no-fault-found failure of electronic devices; high rates of human early spontaneous miscarriage/abortion; runaway pacemakers; sudden cardiac death in young adults; bipolar disorder; and epilepsy.
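    The three survivorship types mentioned above map directly onto the shape parameter of a Weibull hazard. As a hedged illustration (standard reliability theory, not code from the paper), h(t) = (k/λ)(t/λ)^(k-1) is decreasing for k < 1 (Type III, "infant mortality"), constant for k = 1 (Type II), and increasing for k > 1 (senescence):

```python
# Hedged illustration (not from the paper): the Weibull hazard function
# h(t) = (k/lam) * (t/lam)**(k-1) reproduces the survivorship types
# discussed above: k < 1 decreasing hazard (Type III / infant mortality),
# k = 1 constant hazard (Type II), k > 1 increasing hazard (senescence).

def weibull_hazard(t, k, lam=1.0):
    return (k / lam) * (t / lam) ** (k - 1)

ts = [0.5, 1.0, 2.0, 4.0]
infant = [weibull_hazard(t, k=0.5) for t in ts]    # decreasing
constant = [weibull_hazard(t, k=1.0) for t in ts]  # flat
aging = [weibull_hazard(t, k=2.0) for t in ts]     # increasing

assert all(a > b for a, b in zip(infant, infant[1:]))
assert all(abs(c - constant[0]) < 1e-12 for c in constant)
assert all(a < b for a, b in zip(aging, aging[1:]))
```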

  3. Survival and weak chaos

    PubMed Central

    2018-01-01

    Survival analysis in biology and reliability theory in engineering concern the dynamical functioning of bio/electro/mechanical units. Here we incorporate effects of chaotic dynamics into the classical theory. Dynamical systems theory now distinguishes strong and weak chaos. Strong chaos generates Type II survivorship curves entirely as a result of the internal operation of the system, without any age-independent, external, random forces of mortality. Weak chaos exhibits (a) intermittency and (b) Type III survivorship, defined as a decreasing per capita mortality rate: engineering explicitly defines this pattern of decreasing hazard as ‘infant mortality’. Weak chaos generates two phenomena from the normal functioning of the same system. First, infant mortality—sensu engineering—without any external explanatory factors, such as manufacturing defects, which is followed by increased average longevity of survivors. Second, sudden failure of units during their normal period of operation, before the onset of age-dependent mortality arising from senescence. The relevance of these phenomena encompasses, for example: no-fault-found failure of electronic devices; high rates of human early spontaneous miscarriage/abortion; runaway pacemakers; sudden cardiac death in young adults; bipolar disorder; and epilepsy. PMID:29892407

  4. Social networks, social support, and mortality among older people in Japan.

    PubMed

    Sugisawa, H; Liang, J; Liu, X

    1994-01-01

    This study examined the effects of social networks and social support on the mortality of a national probability sample of 2,200 elderly Japanese persons during a three-year period. The direct and indirect effects of social relationships were assessed by using hazard rate models in conjunction with ordinary least squares regressions. Among the five measures of social relationships, social participation is shown to have a strong impact on mortality, and this effect remains statistically significant when other factors are considered. Social participation, social support, and feelings of loneliness are found to have indirect effects on the mortality of the Japanese elders through their linkages with chronic diseases, functional status, and self-rated health. On the other hand, marital status and social contacts are not shown to have statistically significant effects on the risk of dying, either directly or indirectly.

  5. Quality of life at sea in Polish seafarer's evaluation.

    PubMed

    Jeżewska, Maria; Grubman-Nowak, Marta; Moryś, Joanna

    2015-01-01

    Work at sea is highly burdening, hazardous and stressful. Environmental, physical, and psychosociological factors have a great impact on a seafarer's quality of life and work. The research is part of a broader psychological project performed on people working at sea in Poland during the period 2011-2014. This report presents the self-evaluation of life quality conducted by a total of 1,700 Polish seafarers who took part in the study. The average age of the group was 45. The following methods were used: the WHOQOL-BREF and the "Survey for people working at sea". Polish seafarers gave the highest ratings to their social relationships (16.27), followed by psychological functioning (15.62) and environment (15.51). The physical domain received the lowest rating (14.63). The results show that the quality of life of Polish seafarers is quite high.

  6. Aiding alternatives assessment with an uncertainty-tolerant hazard scoring method.

    PubMed

    Faludi, Jeremy; Hoang, Tina; Gorman, Patrick; Mulvihill, Martin

    2016-11-01

    This research developed a single-score system to simplify and clarify decision-making in chemical alternatives assessment, accounting for uncertainty. Today, assessing alternatives to hazardous constituent chemicals is a difficult task: rather than comparing alternatives by a single definitive score, many independent toxicological variables must be considered at once, and data gaps are rampant. Thus, most hazard assessments are only comprehensible to toxicologists, but business leaders and politicians need simple scores to make decisions. In addition, they must balance hazard against other considerations, such as product functionality, and they must be aware of the high degrees of uncertainty in chemical hazard data. This research proposes a transparent, reproducible method to translate eighteen hazard endpoints into a simple numeric score with quantified uncertainty, alongside a similar product functionality score, to aid decisions between alternative products. The scoring method uses Clean Production Action's GreenScreen as a guide, but with a different method of score aggregation. It provides finer differentiation between scores than GreenScreen's four-point scale, and it displays uncertainty quantitatively in the final score. Displaying uncertainty also illustrates which alternatives are early in product development versus well-defined commercial products. This paper tested the proposed assessment method through a case study in the building industry, assessing alternatives to spray polyurethane foam insulation containing methylene diphenyl diisocyanate (MDI). The new hazard scoring method successfully identified trade-offs between different alternatives, showing finer resolution than GreenScreen Benchmarking. Sensitivity analysis showed that different weighting schemes in hazard scores had almost no effect on alternatives ranking, compared to uncertainty from data gaps. Copyright © 2016 Elsevier Ltd. All rights reserved.
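    The general shape of such an aggregation can be sketched in a few lines. This is a hypothetical illustration of the idea only: the endpoints, the treatment of data gaps, and the variance propagation below are our assumptions, not the paper's actual aggregation rule (which differs from GreenScreen's in its own specific way):

```python
import math

# Hypothetical sketch: collapsing several per-endpoint hazard scores into a
# single score with propagated uncertainty. Treating a data gap as a
# maximally uncertain midpoint is our assumption, not the paper's rule.

def aggregate(scores):
    """scores: list of (value, std) per endpoint, values in [0, 1];
    a missing endpoint is (None, None). Returns (mean score, propagated std)."""
    filled = [(v, s) if v is not None else (0.5, 0.5) for v, s in scores]
    n = len(filled)
    mean = sum(v for v, _ in filled) / n
    # variance of a mean of independent scores: sum of variances / n**2
    std = math.sqrt(sum(s ** 2 for _, s in filled)) / n
    return mean, std

# Two measured endpoints and one data gap: the gap dominates the uncertainty
score, unc = aggregate([(0.9, 0.05), (0.4, 0.1), (None, None)])
```

The sketch already exhibits the paper's qualitative finding: with a data gap present, the propagated uncertainty is driven almost entirely by the gap term rather than by the measured endpoints' spread.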

  7. Central US earthquake catalog for hazard maps of Memphis, Tennessee

    USGS Publications Warehouse

    Wheeler, R.L.; Mueller, C.S.

    2001-01-01

    An updated version of the catalog that was used for the current national probabilistic seismic-hazard maps would suffice for production of large-scale hazard maps of the Memphis urban area. Deaggregation maps provide guidance as to the area that a catalog for calculating Memphis hazard should cover. For the future, the Nuttli and local network catalogs could be examined for earthquakes not presently included in the catalog. Additional work on aftershock removal might reduce hazard uncertainty. Graphs of decadal and annual earthquake rates suggest completeness at and above magnitude 3 for the last three or four decades. Any additional work on completeness should consider the effects of rapid, local population changes during the Nation's westward expansion. ?? 2001 Elsevier Science B.V. All rights reserved.

  8. Structured Light-Based Hazard Detection For Planetary Surface Navigation

    NASA Technical Reports Server (NTRS)

    Nefian, Ara; Wong, Uland Y.; Dille, Michael; Bouyssounouse, Xavier; Edwards, Laurence; To, Vinh; Deans, Matthew; Fong, Terry

    2017-01-01

    This paper describes a structured light-based sensor for hazard avoidance in planetary environments. The system presented here can also be used in terrestrial applications constrained by reduced onboard power and computational complexity and low illumination conditions. The sensor is based on a calibrated camera and laser dot projector system. The onboard hazard avoidance system determines the position of the projected dots in the image and, through a triangulation process, detects potential hazards. The paper presents the design parameters for this sensor and describes the image-based solution for hazard avoidance. The system presented here was tested extensively in day and night conditions in lunar analogue environments. The current system achieves over a 97% detection rate with a 1.7% false-alarm rate over 2000 images.
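    The triangulation step can be sketched with the standard disparity geometry. This is a hedged illustration of the general principle, not the paper's algorithm: with a known baseline between projector and camera, depth follows from similar triangles as Z = f·B/d, and a dot whose recovered depth deviates from the expected ground surface beyond a tolerance is flagged. All numbers and thresholds below are invented for the example:

```python
# Hedged sketch of structured-light triangulation for hazard flagging.
# Z = f * B / d (focal length in pixels, baseline in metres, disparity in
# pixels) is the standard stereo/structured-light depth relation; the
# tolerance and all numeric values are illustrative assumptions.

def depth_from_disparity(focal_px, baseline_m, disparity_px):
    return focal_px * baseline_m / disparity_px

def is_hazard(measured_depth_m, expected_depth_m, tol_m=0.10):
    """Flag a projected dot whose depth departs from the expected surface."""
    return abs(measured_depth_m - expected_depth_m) > tol_m

z = depth_from_disparity(focal_px=800.0, baseline_m=0.3, disparity_px=120.0)
flag = is_hazard(z, expected_depth_m=2.2)  # 2.0 m vs 2.2 m expected -> flagged
```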

  9. A general framework for parametric survival analysis.

    PubMed

    Crowther, Michael J; Lambert, Paul C

    2014-12-30

    Parametric survival models are being increasingly used as an alternative to the Cox model in biomedical research. Through direct modelling of the baseline hazard function, we can gain greater understanding of the risk profile of patients over time, obtaining absolute measures of risk. Commonly used parametric survival models, such as the Weibull, make restrictive assumptions of the baseline hazard function, such as monotonicity, which is often violated in clinical datasets. In this article, we extend the general framework of parametric survival models proposed by Crowther and Lambert (Journal of Statistical Software 53:12, 2013), to incorporate relative survival, and robust and cluster robust standard errors. We describe the general framework through three applications to clinical datasets, in particular, illustrating the use of restricted cubic splines, modelled on the log hazard scale, to provide a highly flexible survival modelling framework. Through the use of restricted cubic splines, we can derive the cumulative hazard function analytically beyond the boundary knots, resulting in a combined analytic/numerical approach, which substantially improves the estimation process compared with only using numerical integration. User-friendly Stata software is provided, which significantly extends parametric survival models available in standard software. Copyright © 2014 John Wiley & Sons, Ltd.
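    The analytic tail behaviour described above can be illustrated with a minimal sketch: beyond the boundary knots a restricted cubic spline on the log-hazard scale reduces to a linear function of log time, so the hazard becomes a power function of time and its integral, the cumulative hazard, has a closed form. The coefficients below are hypothetical, not fitted values from the paper or its Stata software:

```python
import math

# Hedged sketch of the combined analytic/numerical idea: beyond the last
# boundary knot, log h(t) = g0 + g1*log(t), i.e. h(t) = e**g0 * t**g1,
# which integrates in closed form. g0, g1 are hypothetical coefficients.

g0, g1 = -1.2, 0.4

def hazard(t):
    return math.exp(g0 + g1 * math.log(t))

def cumulative_hazard_analytic(t0, t1):
    # integral of e**g0 * t**g1 dt over [t0, t1], valid for g1 != -1
    return math.exp(g0) * (t1 ** (g1 + 1) - t0 ** (g1 + 1)) / (g1 + 1)

def cumulative_hazard_numeric(t0, t1, n=100_000):
    # midpoint-rule quadrature, standing in for the numerical fallback
    dt = (t1 - t0) / n
    return sum(hazard(t0 + (i + 0.5) * dt) for i in range(n)) * dt

analytic = cumulative_hazard_analytic(1.0, 5.0)
numeric = cumulative_hazard_numeric(1.0, 5.0)
assert abs(analytic - numeric) < 1e-4 * analytic
survival = math.exp(-analytic)   # S(t) = exp(-H(t)) over the interval
```

The closed form is what removes the need for quadrature in the tails, which is where purely numerical integration of a spline hazard is slowest and least stable.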

  10. 78 FR 21136 - Changes in Flood Hazard Determinations

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-04-09

    ... zone designations, or the regulatory floodway (hereinafter referred to as flood hazard determinations), as shown on the Flood Insurance Rate Maps (FIRMs), and where applicable, in the supporting Flood... appeals to the Chief Executive Officer of the community as listed in the table below. FOR FURTHER...

  11. 78 FR 35300 - Changes in Flood Hazard Determinations

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-12

    ... zone designations, or the regulatory floodway (hereinafter referred to as flood hazard determinations), as shown on the Flood Insurance Rate Maps (FIRMs), and where applicable, in the supporting Flood... appeals to the Chief Executive Officer of the community as listed in the table below. FOR FURTHER...

  12. 77 FR 17573 - Hazard Communication

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-26

    ... rate instead would have the effect of lowering the costs to $161 million per year and increasing the.... Information on chronic effects of exposure to hazardous chemicals helps employees recognize signs and symptoms... required. The current standard covers every type of health effect that may occur, including both acute and...

  13. 40 CFR 265.1084 - Waste determination procedures.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... biodegradation efficiency (Rbio) for a treated hazardous waste. (i) The fraction of organics biodegraded (Fbio... biodegradation efficiency, percent. Fbio = Fraction of organic biodegraded as determined in accordance with the... biodegradation rate (MRbio) for a treated hazardous waste. (i) The MRbio shall be determined based on results for...

  14. 40 CFR 265.1084 - Waste determination procedures.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... biodegradation efficiency (Rbio) for a treated hazardous waste. (i) The fraction of organics biodegraded (Fbio... biodegradation efficiency, percent. Fbio = Fraction of organic biodegraded as determined in accordance with the... biodegradation rate (MRbio) for a treated hazardous waste. (i) The MRbio shall be determined based on results for...

  15. 40 CFR 265.1084 - Waste determination procedures.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... biodegradation efficiency (Rbio) for a treated hazardous waste. (i) The fraction of organics biodegraded (Fbio... biodegradation efficiency, percent. Fbio = Fraction of organic biodegraded as determined in accordance with the... biodegradation rate (MRbio) for a treated hazardous waste. (i) The MRbio shall be determined based on results for...

  16. 40 CFR 265.1084 - Waste determination procedures.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... biodegradation efficiency (Rbio) for a treated hazardous waste. (i) The fraction of organics biodegraded (Fbio... biodegradation efficiency, percent. Fbio = Fraction of organic biodegraded as determined in accordance with the... biodegradation rate (MRbio) for a treated hazardous waste. (i) The MRbio shall be determined based on results for...

  17. 78 FR 20336 - Changes in Flood Hazard Determinations

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-04-04

    ...] Changes in Flood Hazard Determinations AGENCY: Federal Emergency Management Agency, DHS. ACTION: Final... minimum that are required. They should not be construed to mean that the community must change any... flood insurance premium rates for new buildings, and for the contents in those buildings. The changes in...

  18. Hazardous Waste Sites, Stress, and Neighborhood Quality in USA.

    ERIC Educational Resources Information Center

    Greenberg, Michael; And Others

    1994-01-01

    Reports the results of a survey from seven neighborhoods adjacent to hazardous waste sites. The survey asked residents to rate their neighborhoods, compare the quality of their present neighborhoods to the quality of their previous neighborhoods, and report on stresses associated with their neighborhood. (LZ)

  19. Seismic hazard, risk, and design for South America

    USGS Publications Warehouse

    Petersen, Mark D.; Harmsen, Stephen; Jaiswal, Kishor; Rukstales, Kenneth S.; Luco, Nicolas; Haller, Kathleen; Mueller, Charles; Shumway, Allison

    2018-01-01

    We calculate seismic hazard, risk, and design criteria across South America using the latest data, models, and methods to support public officials, scientists, and engineers in earthquake risk mitigation efforts. Updated continental scale seismic hazard models are based on a new seismicity catalog, seismicity rate models, evaluation of earthquake sizes, fault geometry and rate parameters, and ground-motion models. Resulting probabilistic seismic hazard maps show peak ground acceleration, modified Mercalli intensity, and spectral accelerations at 0.2 and 1 s periods for 2%, 10%, and 50% probabilities of exceedance in 50 yrs. Ground shaking soil amplification at each site is calculated by considering uniform soil that is applied in modern building codes or by applying site-specific factors based on VS30 shear-wave velocities determined through a simple topographic proxy technique. We use these hazard models in conjunction with the Prompt Assessment of Global Earthquakes for Response (PAGER) model to calculate economic and casualty risk. Risk is computed by incorporating the new hazard values amplified by soil, PAGER fragility/vulnerability equations, and LandScan 2012 estimates of population exposure. We also calculate building design values using the guidelines established in the building code provisions. Resulting hazard and associated risk is high along the northern and western coasts of South America, reaching damaging levels of ground shaking in Chile, western Argentina, western Bolivia, Peru, Ecuador, Colombia, Venezuela, and in localized areas distributed across the rest of the continent where historical earthquakes have occurred. Constructing buildings and other structures to account for strong shaking in these regions of high hazard and risk should mitigate losses and reduce casualties from effects of future earthquake strong ground shaking.
National models should be developed by scientists and engineers in each country using the best available science.

  20. Evaluation of a Home-Based Environmental and Educational Intervention to Improve Health in Vulnerable Households: Southeastern Pennsylvania Lead and Healthy Homes Program.

    PubMed

    Mankikar, Deepa; Campbell, Carla; Greenberg, Rachael

    2016-09-09

    This evaluation examined whether participation in a home-based environmental educational intervention would reduce exposure to health and safety hazards and asthma-related medical visits. The home intervention program focused on vulnerable, low-income households, where children had asthma, were at risk for lead poisoning, or faced multiple unsafe housing conditions. Home visitors conducted two home visits, two months apart, consisting of an environmental home assessment, Healthy Homes education, and distribution of Healthy Homes supplies. Measured outcomes included changes in participant knowledge and awareness of environmental home-based hazards, the rate of children's asthma-related medical use, and the presence of asthma triggers and safety hazards. Analysis of 2013-2014 baseline and post-intervention program data for a cohort of 150 families revealed a significantly lower three-month rate (p < 0.05) of children's asthma-related doctor visits and hospital admissions at program completion. In addition, there were significantly fewer reports of the presence of home-based hazards, including basement or roof leaks (p = 0.011), plumbing leaks (p = 0.019), and use of an oven to heat the home (p < 0.001). Participants' pre- and post-test scores showed significant improvement (p < 0.05) in knowledge and awareness of home hazards. Comprehensive home interventions may effectively reduce environmental home hazards and improve the health of asthmatic children in the short term.

  1. The Model for End-stage Liver Disease score is potentially a useful predictor of hyperkalemia occurrence among hospitalized angiotensin receptor blocker users.

    PubMed

    Sheen, S S; Park, R W; Yoon, D; Shin, G-T; Kim, H; Park, I-W

    2015-02-01

    Angiotensin receptor blockers (ARBs) are medications commonly used for treating conditions such as hypertension. However, ARBs are frequently associated with hyperkalemia, a potentially critical adverse event, in high-risk patients. Although both the liver and the kidney are major elimination routes of ARBs, the relationship between hepatorenal function and ARB-related hyperkalemia has not yet been investigated. The purpose of this study was to evaluate the risk of hyperkalemia, in terms of various hepatorenal functions, for hospitalized patients newly initiated on ARB treatment. We evaluated ARB-related hyperkalemia in a cohort of 5530 hospitalized patients, who had not previously used ARBs, between 12 April 2004 and 31 May 2012. Hepatorenal function was assessed by the Model for End-stage Liver Disease (MELD) score: patients were categorized into four MELD scoring groups, and the groups were compared with one another for hyperkalemia risk. The MELD score was significantly different between the hyperkalemic and non-hyperkalemic groups (independent t-test, P < 0.001). The MELD score 10-14, 15-19 and ≥ 20 groups showed higher risks of hyperkalemia than the lowest MELD score group {log-rank test, P < 0.001; multiple Cox proportional hazard model, hazard ratios 1.478 (P = 0.003), 2.285 (P < 0.001) and 3.024 (P < 0.001), respectively}. The MELD score showed stronger predictive performance for hyperkalemia than either serum creatinine or estimated glomerular filtration rate alone, and good predictive performance for ARB-related hyperkalemia among hospitalized patients. The clinical implications and reasons for these findings merit future investigation. © 2014 John Wiley & Sons Ltd.
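    For readers unfamiliar with the score, the classical (pre-2016) MELD formula can be sketched as follows. This is a hedged illustration of the widely published definition, not code from the study; verify coefficients and lab-value bounds against the primary definition before any clinical use:

```python
import math

# Hedged sketch of the classical MELD formula:
# MELD = 9.57*ln(creatinine) + 3.78*ln(bilirubin) + 11.2*ln(INR) + 6.43,
# with each lab value floored at 1.0 and creatinine capped at 4.0 mg/dL.
# Illustrative only; not the study's code.

def meld(creatinine_mg_dl, bilirubin_mg_dl, inr):
    cr = min(max(creatinine_mg_dl, 1.0), 4.0)
    bili = max(bilirubin_mg_dl, 1.0)
    inr = max(inr, 1.0)
    return (9.57 * math.log(cr) + 3.78 * math.log(bili)
            + 11.2 * math.log(inr) + 6.43)

# Normal labs floor to the minimum score of 6.43; worse hepatorenal
# function raises the score, matching the risk gradient reported above.
assert abs(meld(0.8, 0.9, 1.0) - 6.43) < 1e-9
```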

  2. The population in China’s earthquake-prone areas has increased by over 32 million along with rapid urbanization

    NASA Astrophysics Data System (ADS)

    He, Chunyang; Huang, Qingxu; Dou, Yinyin; Tu, Wei; Liu, Jifu

    2016-07-01

    Accurate assessments of the population exposed to seismic hazard are crucial in seismic risk mapping. Recent rapid urbanization in China has resulted in substantial changes in the size and structure of the population exposed to seismic hazard. Using the latest population census data and seismic maps, this work investigated spatiotemporal changes in the exposure of the population in the most seismically hazardous areas (MSHAs) in China from 1990 to 2010. In the context of rapid urbanization and massive rural-to-urban migration, nearly one-tenth of the Chinese population in 2010 lived in MSHAs. From 1990 to 2010, the MSHA population increased by 32.53 million at a significantly higher rate of change (33.6%) than the national average rate (17.7%). The elderly population in MSHAs increased by 81.4%, which is much higher than the group’s national growth rate of 58.9%. Greater attention should be paid to the demographic changes in earthquake-prone areas in China.
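    The reported increase and rate of change imply the baseline and final MSHA populations, which can be checked with simple arithmetic:

```python
increase = 32.53e6      # reported MSHA population increase, 1990-2010
growth_rate = 0.336     # reported MSHA rate of change over the same period

pop_1990 = increase / growth_rate   # implied 1990 MSHA population
pop_2010 = pop_1990 + increase      # implied 2010 MSHA population

print(round(pop_1990 / 1e6, 1))     # ~96.8 million
print(round(pop_2010 / 1e6, 1))     # ~129.3 million, roughly a tenth of China's 2010 population
```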

  3. The prevalence of selected potentially hazardous workplace exposures in the US: findings from the 2010 National Health Interview Survey.

    PubMed

    Calvert, Geoffrey M; Luckhaupt, Sara E; Sussell, Aaron; Dahlhamer, James M; Ward, Brian W

    2013-06-01

    Assess the national prevalence of current workplace exposure to potential skin hazards, secondhand smoke (SHS), and outdoor work among various industry and occupation groups. Also, assess the national prevalence of chronic workplace exposure to vapors, gas, dust, and fumes (VGDF) among these groups. Data were obtained from the 2010 National Health Interview Survey (NHIS). NHIS is a multistage probability sample survey of the civilian non-institutionalized population of the US. Prevalence rates and their variances were calculated using SUDAAN to account for the complex NHIS sample design. The data for 2010 were available for 17,524 adults who worked in the 12 months that preceded interview. The highest prevalence rates of hazardous workplace exposures were typically in agriculture, mining, and construction. The prevalence rate of frequent handling of or skin contact with chemicals, and of non-smokers frequently exposed to SHS at work was highest in mining and construction. Outdoor work was most common in agriculture (85%), construction (73%), and mining (65%). Finally, frequent occupational exposure to VGDF was most common among mining (67%), agriculture (53%), and construction workers (51%). We identified industries and occupations with the highest prevalence of potentially hazardous workplace exposures, and provided targets for investigation and intervention activities. Copyright © 2012 Wiley Periodicals, Inc.

  4. The Prevalence of Selected Potentially Hazardous Workplace Exposures in the US: Findings From the 2010 National Health Interview Survey

    PubMed Central

    Calvert, Geoffrey M.; Luckhaupt, Sara E.; Sussell, Aaron; Dahlhamer, James M.; Ward, Brian W.

    2015-01-01

    Objective Assess the national prevalence of current workplace exposure to potential skin hazards, secondhand smoke (SHS), and outdoor work among various industry and occupation groups. Also, assess the national prevalence of chronic workplace exposure to vapors, gas, dust, and fumes (VGDF) among these groups. Methods Data were obtained from the 2010 National Health Interview Survey (NHIS). NHIS is a multistage probability sample survey of the civilian non-institutionalized population of the US. Prevalence rates and their variances were calculated using SUDAAN to account for the complex NHIS sample design. Results The data for 2010 were available for 17,524 adults who worked in the 12 months that preceded interview. The highest prevalence rates of hazardous workplace exposures were typically in agriculture, mining, and construction. The prevalence rate of frequent handling of or skin contact with chemicals, and of non-smokers frequently exposed to SHS at work was highest in mining and construction. Outdoor work was most common in agriculture (85%), construction (73%), and mining (65%). Finally, frequent occupational exposure to VGDF was most common among mining (67%), agriculture (53%), and construction workers (51%). Conclusion We identified industries and occupations with the highest prevalence of potentially hazardous workplace exposures, and provided targets for investigation and intervention activities. PMID:22821700

  5. Comparing methods to combine functional loss and mortality in clinical trials for amyotrophic lateral sclerosis

    PubMed Central

    van Eijk, Ruben PA; Eijkemans, Marinus JC; Rizopoulos, Dimitris

    2018-01-01

    Objective Amyotrophic lateral sclerosis (ALS) clinical trials based on single end points only partially capture the full treatment effect when both function and mortality are affected, and may falsely dismiss efficacious drugs as futile. We aimed to investigate the statistical properties of several strategies for the simultaneous analysis of function and mortality in ALS clinical trials. Methods Based on the Pooled Resource Open-Access ALS Clinical Trials (PRO-ACT) database, we simulated longitudinal patterns of functional decline, defined by the revised amyotrophic lateral sclerosis functional rating scale (ALSFRS-R) and conditional survival time. Different treatment scenarios with varying effect sizes were simulated with follow-up ranging from 12 to 18 months. We considered the following analytical strategies: 1) Cox model; 2) linear mixed effects (LME) model; 3) omnibus test based on Cox and LME models; 4) composite time-to-6-point decrease or death; 5) combined assessment of function and survival (CAFS); and 6) test based on joint modeling framework. For each analytical strategy, we calculated the empirical power and sample size. Results Both Cox and LME models have increased false-negative rates when treatment exclusively affects either function or survival. The joint model has superior power compared to other strategies. The composite end point increases false-negative rates among all treatment scenarios. To detect a 15% reduction in ALSFRS-R decline and 34% decline in hazard with 80% power after 18 months, the Cox model requires 524 patients, the LME model 794 patients, the omnibus test 526 patients, the composite end point 1,274 patients, the CAFS 576 patients and the joint model 464 patients. Conclusion Joint models have superior statistical power to analyze simultaneous effects on survival and function and may circumvent pitfalls encountered by other end points. 
Optimizing trial end points is essential, as selecting suboptimal outcomes may disguise important treatment clues. PMID:29593436
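    The sample sizes quoted for the survival end point can be roughly cross-checked with Schoenfeld's approximation for the number of events needed in a two-arm log-rank/Cox comparison. This is a generic sketch under standard assumptions (two-sided 5% alpha, 1:1 allocation), not the simulation-based procedure the authors used; converting events to patients additionally requires an event probability, which they took from PRO-ACT.

```python
import math
from statistics import NormalDist

def schoenfeld_events(hazard_ratio, alpha=0.05, power=0.80, allocation=0.5):
    """Approximate number of events required for a two-arm log-rank/Cox
    comparison (Schoenfeld's formula)."""
    z_alpha = NormalDist().inv_cdf(1.0 - alpha / 2.0)
    z_beta = NormalDist().inv_cdf(power)
    p = allocation
    return (z_alpha + z_beta) ** 2 / (p * (1.0 - p) * math.log(hazard_ratio) ** 2)

# A 34% reduction in hazard corresponds to HR = 0.66.
events = math.ceil(schoenfeld_events(0.66))   # ~182 events at 80% power
```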

  6. Hazard and operability study of the multi-function Waste Tank Facility. Revision 1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hughes, M.E.

    1995-05-15

    The Multi-Function Waste Tank Facility (MWTF) East site will be constructed on the west side of the 200E area and the MWTF West site will be constructed in the SW quadrant of the 200W site in the Hanford Area. This is a description of facility hazards that site personnel or the general public could potentially be exposed to during operation. A list of preliminary Design Basis Accidents was developed.

  7. Architecture-Led Safety Analysis of the Joint Multi-Role (JMR) Joint Common Architecture (JCA) Demonstration System

    DTIC Science & Technology

    2015-12-01

    relevant system components (i.e., their component type declarations) have been annotated with EMV2 error source or propagation declarations and hazard…contributors. They are recorded as EMV2 annotations for each of the ASSA. Figure 40 shows a sampling of potential hazard contributors by the functional…

  8. Active, capable, and potentially active faults - a paleoseismic perspective

    USGS Publications Warehouse

    Machette, M.N.

    2000-01-01

    Maps of faults (geologically defined source zones) may portray seismic hazards in a wide range of completeness depending on which types of faults are shown. Three fault terms - active, capable, and potential - are used in a variety of ways for different reasons or applications. Nevertheless, to be useful for seismic-hazards analysis, fault maps should encompass a time interval that includes several earthquake cycles. For example, if the common recurrence in an area is 20,000-50,000 years, then maps should include faults that are 50,000-100,000 years old (two to five typical earthquake cycles), thus allowing for temporal variability in slip rate and recurrence intervals. Conversely, in more active areas such as plate boundaries, maps showing faults that are <10,000 years old should include those with at least 2 to as many as 20 paleoearthquakes. For the International Lithosphere Programs' Task Group II-2 Project on Major Active Faults of the World our maps and database will show five age categories and four slip rate categories that allow one to select differing time spans and activity rates for seismic-hazard analysis depending on tectonic regime. The maps are accompanied by a database that describes evidence for Quaternary faulting, geomorphic expression, and paleoseismic parameters (slip rate, recurrence interval and time of most recent surface faulting). These maps and databases provide an inventory of faults that would be defined as active, capable, and potentially active for seismic-hazard assessments.

  9. Selection and properties of alternative forming fluids for TRISO fuel kernel production

    NASA Astrophysics Data System (ADS)

    Baker, M. P.; King, J. C.; Gorman, B. P.; Marshall, D. W.

    2013-01-01

    Current Very High Temperature Reactor (VHTR) designs incorporate TRi-structural ISOtropic (TRISO) fuel, which consists of a spherical fissile fuel kernel surrounded by layers of pyrolytic carbon and silicon carbide. An internal sol-gel process forms the fuel kernel using wet chemistry to produce uranium oxyhydroxide gel spheres by dropping a cold precursor solution into a hot column of trichloroethylene (TCE). Over time, gelation byproducts inhibit complete gelation, and the TCE must be purified or discarded. The resulting TCE waste stream contains both radioactive and hazardous materials and is thus considered a mixed hazardous waste. Changing the forming fluid to a non-hazardous alternative could greatly improve the economics of TRISO fuel kernel production. Selection criteria for a replacement forming fluid narrowed a list of ~10,800 chemicals to yield ten potential replacement forming fluids: 1-bromododecane, 1-bromotetradecane, 1-bromoundecane, 1-chlorooctadecane, 1-chlorotetradecane, 1-iododecane, 1-iodododecane, 1-iodohexadecane, 1-iodooctadecane, and squalane. The density, viscosity, and surface tension for each potential replacement forming fluid were measured as a function of temperature between 25 °C and 80 °C. Calculated settling velocities and heat transfer rates give an overall column height approximation. 1-bromotetradecane, 1-chlorooctadecane, and 1-iodododecane show the greatest promise as replacements, and future tests will verify their ability to form satisfactory fuel kernels.

  10. Pouch functional outcomes after restorative proctocolectomy with ileal-pouch reconstruction in patients with ulcerative colitis: Japanese multi-center nationwide cohort study.

    PubMed

    Uchino, Motoi; Ikeuchi, Hiroki; Sugita, Akira; Futami, Kitaro; Watanabe, Toshiaki; Fukushima, Kouhei; Tatsumi, Kenji; Koganei, Kazutaka; Kimura, Hideaki; Hata, Keisuke; Takahashi, Kenichi; Watanabe, Kazuhiro; Mizushima, Tsunekazu; Funayama, Yuji; Higashi, Daijiro; Araki, Toshimitsu; Kusunoki, Masato; Ueda, Takeshi; Koyama, Fumikazu; Itabashi, Michio; Nezu, Riichiro; Suzuki, Yasuo

    2018-05-01

    Although several complications capable of causing pouch failure may develop after restorative proctocolectomy (RPC) for ulcerative colitis (UC), the incidences and causes are conflicting and vary according to country, race and institution. To avoid pouch failure, this study aimed to evaluate the rate of pouch failure and its risk factors in UC patients over the past decade via a nationwide cohort study. We conducted a retrospective, observational, multicenter study that included 13 institutions in Japan. Patients who underwent RPC between January 2005 and December 2014 were included. The characteristics and backgrounds of the patients before and during surgery and their postoperative courses and complications were reviewed. A total of 2376 patients were evaluated over 6.7 ± 3.5 years of follow-up. Twenty-seven non-functional pouches were observed, and the functional pouch rate was 98.9% after RPC. Anastomotic leakage (odds ratio, 9.1) was selected as a risk factor for a non-functional pouch. The cumulative pouch failure rate was 4.2%/10 years. A change in diagnosis to Crohn's disease/indeterminate colitis (hazard ratio, 13.2) was identified as an independent risk factor for pouch failure. The significant risk factor for a non-functional pouch was anastomotic leakage. The optimal staged surgical procedure should be selected according to a patient's condition to avoid anastomotic failure during RPC. Changes in diagnosis after RPC confer a substantial risk of pouch failure. Additional cohort studies are needed to obtain an understanding of the long-standing clinical course of and proper treatment for pouch failure.

  11. Among nonagenarians, congruence between self-rated and proxy-rated health was low but both predicted mortality.

    PubMed

    Vuorisalmi, Merja; Sarkeala, Tytti; Hervonen, Antti; Jylhä, Marja

    2012-05-01

    The congruence between self-rated global health (SRH) and proxy-rated global health (PRH), the factors associated with congruence between SRH and PRH, and their associations with mortality are examined using data from the Vitality 90+ study. The data consist of 213 pairs of subjects--aged 90 years and older--and proxies. The relationship between SRH and PRH was analyzed by chi-square test and Cohen's kappa. Logistic regression analysis was used to find out the factors that are associated with the congruence between health ratings. The association between SRH and PRH with mortality was studied using Cox proportional hazard models. The subjects rated their health more negatively than the proxies. Kappa value indicated only slight congruence between SRH and PRH, and they also predicted mortality differently. Good self-reported functional ability was associated with congruence between SRH and PRH. The results imply that the evaluation processes of SRH and PRH differ, and the measures are not directly interchangeable. Both measures are useful health indicators in very old age but SRH cannot be replaced by PRH in analyses. Copyright © 2012 Elsevier Inc. All rights reserved.
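    The congruence statistic used here, Cohen's kappa, corrects observed agreement for the agreement expected by chance. A minimal sketch with a hypothetical 2x2 agreement table (the counts are illustrative, not the study's data):

```python
def cohens_kappa(table):
    """Cohen's kappa from a square agreement table
    (rows: rater 1 categories, columns: rater 2 categories)."""
    n = float(sum(sum(row) for row in table))
    p_observed = sum(table[i][i] for i in range(len(table))) / n
    p_expected = sum((sum(table[i]) / n) * (sum(row[i] for row in table) / n)
                     for i in range(len(table)))
    return (p_observed - p_expected) / (1.0 - p_expected)

# Hypothetical good/poor health ratings by subject (rows) and proxy (columns):
kappa = cohens_kappa([[20, 5], [10, 15]])   # ~0.4
```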

  12. Predicting EMP hazard: Lessons from studies with inhaled fibrous and non-fibrous nano- and micro-particles.

    PubMed

    Oberdörster, Günter; Graham, Uschi

    2018-05-08

    Inhalation exposure to elongated cleavage fragments occurring at mineral and rock mining and crushing operations raises important questions regarding potential health effects given their resemblance to fibers with known adverse health effects like amphibole asbestos. Thus, a major goal for establishing a toxicity profile for elongate mineral particles (EMPs) is to identify and characterize a suspected hazard and characterize a risk by examining together results of hazard and exposure assessment. This will require not only knowledge about biokinetics of inhaled EMPs but also about underlying mechanisms of effects induced by retained EMPs. In vitro toxicity assays with predictive power for in vivo effects have been established as useful screening tools for toxicological characterization of particulate materials including EMPs. Important determinants of physiological/toxicological mechanisms are physico-chemical and functional properties of inhaled particulate materials. Of the physico-chemical (intrinsic) properties, size, shape and surface characteristics are well known to affect toxicological responses; functional properties include (i) solubility/dissolution rate in physiological fluid simulants in vitro and following inhalation in vivo; (ii) ROS-inducing capacity in vitro and in vivo determined as specific particle surface reactivity; (iii) bioprocessing in vivo. A key parameter for all is the dose and duration of exposure, requiring to establish exposure-dose-response relationships. Examples of studies with fibrous and non-fibrous particles are discussed to illustrate the relevancy of evaluating extrinsic and intrinsic particle properties for predicting in vivo responses of new particulate materials. This will allow hazard and risk ranking/grouping based on a comparison to toxicologically well-characterized positive and negative benchmarks. 
Future efforts should be directed at developing and validating new approaches using in vitro (non-animal) studies for establishing a complete risk assessment for EMPs. Further comparative in-depth analyses with analytical and ultra-high resolution technology examining bioprocessing events at target organ sites have proven highly successful to identify biotransformations in target cells at near atomic level. In the case of EMPs, such analyses can be essential to separate benign from harmful ones. Copyright © 2018. Published by Elsevier Inc.

  13. Lung cancer incidence and survival among HIV-infected and uninfected women and men.

    PubMed

    Hessol, Nancy A; Martínez-Maza, Otoniel; Levine, Alexandra M; Morris, Alison; Margolick, Joseph B; Cohen, Mardge H; Jacobson, Lisa P; Seaberg, Eric C

    2015-06-19

    To determine the lung cancer incidence and survival time among HIV-infected and uninfected women and men. Two longitudinal studies of HIV infection in the United States. Data from 2549 women in the Women's Interagency HIV Study (WIHS) and 4274 men in the Multicenter AIDS Cohort Study (MACS), all with a history of cigarette smoking, were analyzed. Lung cancer incidence rates and incidence rate ratios were calculated using Poisson regression analyses. Survival time was assessed using Kaplan-Meier and Cox proportional-hazard analyses. Thirty-seven women and 23 men developed lung cancer (46 HIV-infected and 14 HIV-uninfected) during study follow-up. In multivariable analyses, the factors found to be independently associated with a higher lung cancer incidence rate were older age, less education, 10 or more pack-years of smoking, and a prior diagnosis of AIDS pneumonia (vs. HIV-uninfected women). In an adjusted Cox model that allowed different hazard functions for each cohort, a history of injection drug use was associated with shorter survival, and a lung cancer diagnosis after 2001 was associated with longer survival. In an adjusted Cox model restricted to HIV-infected participants, a nadir CD4 lymphocyte cell count less than 200 was associated with shorter survival time. Our data suggest that pulmonary damage and inflammation associated with HIV infection may be causative for the increased risk of lung cancer. Encouraging and assisting younger HIV-infected smokers to quit and to sustain cessation of smoking is imperative to reduce the lung cancer burden in this population.
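    A crude incidence rate ratio of the kind estimated here (before Poisson regression adjustment) is the ratio of cases per person-time in the two groups, with a Wald confidence interval on the log scale. The counts below are illustrative only; the study does not report person-years in this abstract.

```python
import math

def rate_ratio(cases_exp, pt_exp, cases_unexp, pt_unexp, z=1.96):
    """Crude incidence rate ratio with a 95% Wald CI on the log scale."""
    irr = (cases_exp / pt_exp) / (cases_unexp / pt_unexp)
    se_log = math.sqrt(1.0 / cases_exp + 1.0 / cases_unexp)
    lo = math.exp(math.log(irr) - z * se_log)
    hi = math.exp(math.log(irr) + z * se_log)
    return irr, lo, hi

# Illustrative person-time: 46 HIV-infected cases over 30,000 person-years
# vs. 14 HIV-uninfected cases over 20,000 person-years.
irr, lo, hi = rate_ratio(46, 30000, 14, 20000)   # IRR ~2.19
```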

  14. Why are well-educated Muscovites more likely to survive? Understanding the biological pathways.

    PubMed

    Todd, Megan A; Shkolnikov, Vladimir M; Goldman, Noreen

    2016-05-01

    There are large socioeconomic disparities in adult mortality in Russia, although the biological mechanisms are not well understood. With data from the study of Stress, Aging, and Health in Russia (SAHR), we use Gompertz hazard models to assess the relationship between educational attainment and mortality among older adults in Moscow and to evaluate biomarkers associated with inflammation, neuroendocrine function, heart rate variability, and clinical cardiovascular and metabolic risk as potential mediators of that relationship. We do this by assessing the extent to which the addition of biomarker variables into hazard models of mortality attenuates the association between educational attainment and mortality. We find that an additional year of education is associated with about 5% lower risk of age-specific all-cause and cardiovascular mortality. Inflammation biomarkers are best able to account for this relationship, explaining 25% of the education-all-cause mortality association, and 35% of the education-cardiovascular mortality association. Clinical markers perform next best, accounting for 13% and 23% of the relationship between education and all-cause and cardiovascular mortality, respectively. Although heart rate biomarkers are strongly associated with subsequent mortality, they explain very little of the education-mortality link. Neuroendocrine biomarkers fail to account for any portion of the link. These findings suggest that inflammation may be important for understanding mortality disparities by socioeconomic status. Copyright © 2016. Published by Elsevier Ltd.
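    The Gompertz model used here specifies a mortality hazard that rises exponentially with age, with covariates such as education acting proportionally on the baseline level. A minimal sketch (parameter values are illustrative, not estimates from SAHR):

```python
import math

def gompertz_hazard(t, a, b):
    """Gompertz hazard h(t) = a * exp(b * t): level a, ageing rate b."""
    return a * math.exp(b * t)

def gompertz_survival(t, a, b):
    """Survival from the integrated hazard: S(t) = exp(-(a/b)(exp(b t) - 1))."""
    return math.exp(-(a / b) * (math.exp(b * t) - 1.0))

# Illustrative parameters: hazard doubles roughly every 8 years.
a, b = 0.01, math.log(2) / 8

# A ~5% lower risk per year of education scales a by exp(beta * years),
# with beta approximately ln(0.95).
```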

  15. The severity of toxic reactions to ephedra: comparisons to other botanical products and national trends from 1993-2002.

    PubMed

    Woolf, Alan D; Watson, William A; Smolinske, Susan; Litovitz, Toby

    2005-01-01

    Ephedra is a botanical product widely used to enhance alertness, as a weight loss aid, and as a decongestant. Its reported adverse effects led the Food and Drug Administration (FDA) to ban ephedra-containing products in the United States in 2004. This study's purpose was to compare the toxicity of botanical products containing ephedra with that of nonephedra products. The Toxic Exposure Surveillance System (TESS), a national poison center database, was utilized to determine the number and outcomes of cases involving botanical products reported from 1993-2002. Cases listing both a botanical product and any other drugs or chemicals were excluded a priori. Ten-year hazard rates (moderate outcomes + major outcomes + deaths per 1000 exposures) were used to compare botanical product categories. There were 21,533 toxic exposures with definitive medical outcomes reported over the 10 years in which a botanical product was the only substance involved. Of these, 4306 (19.9%) had moderate or major medical outcomes and there were two deaths, for an overall hazard score of 200 per 1000 exposures. The number of ephedra reports to poison centers increased 150-fold over the 10-year period. The hazard rate for products that contained only ephedra was 250 per 1000 exposures and 267 per 1000 exposures for products that contained ephedra and additional ingredients, whereas the hazard score for nonephedra botanical products was 96 per 1000 exposures. The rate ratios for multibotanical products with ephedra (RR 1.33; 95% C.I. 1.27-1.40) and for single-ingredient ephedra products (RR 1.25; 95% C.I. 1.11-1.40) were both two to six times higher than those of other common botanical products. Yohimbe-containing products had the highest hazard score (417) and rate ratio (2.08; 95% C.I. 1.59-2.80). Ephedra-containing botanical products accounted for a significant number of toxic exposures with severe medical outcomes reported to poison centers. 
Hazard rate analysis suggests poison center-reported events involving ephedra-containing botanical products were much more likely to result in severe medical outcomes than those involving nonephedra-containing botanical products. These data support recommendations by policymakers that the sale of ephedra should be prohibited to protect consumers. Our data suggest that the botanical product, yohimbe, may also be associated with unacceptably high risks of toxicity and should receive close scrutiny from health policymakers.
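    The hazard score defined above is simply severe outcomes per 1000 exposures, and the reported category scores reproduce the reported rate ratios when taken relative to the overall score of 200 (an assumption the figures strongly imply):

```python
def hazard_score(moderate_major_outcomes, deaths, exposures):
    """TESS-style hazard score: (moderate + major outcomes + deaths)
    per 1000 exposures."""
    return 1000.0 * (moderate_major_outcomes + deaths) / exposures

overall = hazard_score(4306, 2, 21533)   # ~200, matching the reported overall score

print(250.0 / overall * 200.0 / 200.0)   # noqa: illustrative only
print(250.0 / 200.0)   # 1.25  (single-ingredient ephedra, reported RR 1.25)
print(267.0 / 200.0)   # 1.335 (multibotanical ephedra, close to reported RR 1.33)
```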

  16. 78 FR 45941 - Changes in Flood Hazard Determinations

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-30

    ... (hereinafter referred to as flood hazard determinations) as shown on the indicated Letter of Map Revision (LOMR... Insurance Rate Maps (FIRMs), and in some cases the Flood Insurance Study (FIS) reports, currently in effect... respective Community Map Repository address listed in the table below and online through the FEMA Map Service...

  17. Factors in Perception of Tornado Hazard: An Exploratory Study.

    ERIC Educational Resources Information Center

    de Man, Anton; Simpson-Housley, Paul

    1987-01-01

    Administered questionnaire on tornado hazard to 142 adults. Results indicated that subject's gender and education level were best predictors of perceived probability of tornado recurrence; that ratings of severity of potential damage were related to education level; and that gender accounted for significant percentage of variance in anxiety…

  18. Self-Monitoring and Reactivity in the Modification of Cigarette Smoking.

    ERIC Educational Resources Information Center

    Abrams, David B.; Wilson, G. Terence

    1979-01-01

    Subjects were assigned to conditions based on smoking rates: self-monitoring nicotine plus health hazard information; self-monitoring cigarettes plus health information; and self-monitoring cigarettes with no health information. Nicotine self-monitoring groups showed greater reactivity. Exposure to health hazard information had no effect. (Author)

  19. Energy dissipation for flat-sloped stepped spillways using new inception point relationship

    USDA-ARS?s Scientific Manuscript database

    Transforming from a rural to an urban landscape has created a change in hazard classification for many earthen embankments. As a result, these facilities provide inadequate spillway capacity for the upgraded hazard rating. To bring these dams into compliance with state and federal dam safety regul...

  20. 75 FR 5261 - Waybill Data Reporting for Toxic Inhalation Hazards

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-02-02

    ... monitor traffic flows and rate trends in the industry, and to develop evidence in Board proceedings. The... submitted to include all traffic movements designated as a TIH (Toxic Inhalation Hazard). The revised... Board to assess more accurately TIH traffic within the United States, and specifically would be...

  1. 78 FR 8177 - Proposed Flood Hazard Determinations

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-05

    ... insurance agents and others to calculate appropriate flood insurance premium rates for new buildings and the contents of those buildings. DATES: Comments are to be submitted on or before May 6, 2013. ADDRESSES: The... buildings built after the FIRM and FIS report become effective. The communities affected by the flood hazard...

  2. Evaluation of the ToxRTool's ability to rate the reliability of toxicological data for human health hazard assessments

    EPA Science Inventory

    Regulatory agencies often utilize results from peer reviewed publications for hazard assessments. A problem in doing so is the lack of well-accepted tools to objectively, efficiently and systematically assess the quality of published toxicological studies. Herein, we evaluated the...

  3. 40 CFR 266.106 - Standards to control metals emissions.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... HAZARDOUS WASTE MANAGEMENT FACILITIES Hazardous Waste Burned in Boilers and Industrial Furnaces § 266.106... implemented by limiting feed rates of the individual metals to levels during the trial burn (for new... screening limit for the worst-case stack. (d) Tier III and Adjusted Tier I site-specific risk assessment...

  4. 40 CFR 266.106 - Standards to control metals emissions.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... HAZARDOUS WASTE MANAGEMENT FACILITIES Hazardous Waste Burned in Boilers and Industrial Furnaces § 266.106... implemented by limiting feed rates of the individual metals to levels during the trial burn (for new... screening limit for the worst-case stack. (d) Tier III and Adjusted Tier I site-specific risk assessment...

  5. Energy Drinks: A New Health Hazard for Adolescents

    ERIC Educational Resources Information Center

    Pennington, Nicole; Johnson, Molly; Delaney, Elizabeth; Blankenship, Mary Beth

    2010-01-01

    A new hazard for adolescents is the negative health effects of energy drink consumption. Adolescents are consuming these types of drinks at an alarming amount and rate. Specific effects that have been reported by adolescents include jitteriness, nervousness, dizziness, the inability to focus, difficulty concentrating, gastrointestinal upset, and…

  6. Comparison of methods for estimating the attributable risk in the context of survival analysis.

    PubMed

    Gassama, Malamine; Bénichou, Jacques; Dartois, Laureen; Thiébaut, Anne C M

    2017-01-23

    The attributable risk (AR) measures the proportion of disease cases that can be attributed to an exposure in the population. Several definitions and estimation methods have been proposed for survival data. Using simulations, we compared four methods for estimating AR defined in terms of survival functions: two nonparametric methods based on Kaplan-Meier's estimator, one semiparametric based on Cox's model, and one parametric based on the piecewise constant hazards model, as well as one simpler method based on estimated exposure prevalence at baseline and Cox's model hazard ratio. We considered a fixed binary exposure with varying exposure probabilities and strengths of association, and generated event times from a proportional hazards model with constant or monotonic (decreasing or increasing) Weibull baseline hazard, as well as from a nonproportional hazards model. We simulated 1,000 independent samples of size 1,000 or 10,000. The methods were compared in terms of mean bias, mean estimated standard error, empirical standard deviation and 95% confidence interval coverage probability at four equally spaced time points. Under proportional hazards, all five methods yielded unbiased results regardless of sample size. Nonparametric methods displayed greater variability than other approaches. All methods showed satisfactory coverage except for nonparametric methods at the end of follow-up for a sample size of 1,000 especially. With nonproportional hazards, nonparametric methods yielded similar results to those under proportional hazards, whereas semiparametric and parametric approaches that both relied on the proportional hazards assumption performed poorly. These methods were applied to estimate the AR of breast cancer due to menopausal hormone therapy in 38,359 women of the E3N cohort. 
    In practice, our study suggests using the semiparametric or parametric approaches to estimate AR as a function of time in cohort studies if the proportional hazards assumption appears appropriate.
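    The "simpler method based on estimated exposure prevalence at baseline and Cox's model hazard ratio" is essentially Levin's formula; a minimal sketch (the prevalence and hazard ratio below are illustrative, not estimates from the E3N cohort):

```python
def attributable_risk(prevalence, hazard_ratio):
    """Levin-type population attributable risk from baseline exposure
    prevalence and a relative-risk measure (here, a Cox hazard ratio)."""
    excess = prevalence * (hazard_ratio - 1.0)
    return excess / (1.0 + excess)

# Illustrative values: 30% exposed at baseline, HR = 1.5
ar = attributable_risk(0.30, 1.5)   # ~0.13, i.e. ~13% of cases attributable
```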

  7. Analysis and design of randomised clinical trials involving competing risks endpoints.

    PubMed

    Tai, Bee-Choo; Wee, Joseph; Machin, David

    2011-05-19

    In randomised clinical trials involving time-to-event outcomes, the failures concerned may be events of an entirely different nature and as such define a classical competing risks framework. In designing and analysing clinical trials involving such endpoints, it is important to account for the competing events, and evaluate how each contributes to the overall failure. An appropriate choice of statistical model is important for adequate determination of sample size. We describe how competing events may be summarised in such trials using cumulative incidence functions and Gray's test. The statistical modelling of competing events using proportional cause-specific and subdistribution hazard functions, and the corresponding procedures for sample size estimation are outlined. These are illustrated using data from a randomised clinical trial (SQNP01) of patients with advanced (non-metastatic) nasopharyngeal cancer. In this trial, treatment has no effect on the competing event of loco-regional recurrence. Thus the effects of treatment on the hazard of distant metastasis were similar via both the cause-specific (unadjusted csHR = 0.43, 95% CI 0.25 - 0.72) and subdistribution (unadjusted subHR 0.43; 95% CI 0.25 - 0.76) hazard analyses, in favour of concurrent chemo-radiotherapy followed by adjuvant chemotherapy. Adjusting for nodal status and tumour size did not alter the results. The results of the logrank test (p = 0.002) comparing the cause-specific hazards and the Gray's test (p = 0.003) comparing the cumulative incidences also led to the same conclusion. However, the subdistribution hazard analysis requires many more subjects than the cause-specific hazard analysis to detect the same magnitude of effect. The cause-specific hazard analysis is appropriate for analysing competing risks outcomes when treatment has no effect on the cause-specific hazard of the competing event. It requires fewer subjects than the subdistribution hazard analysis for a similar effect size. 
However, if the main and competing events are influenced in opposing directions by an intervention, a subdistribution hazard analysis may be warranted.
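    The cumulative incidence functions summarised in this record can be estimated nonparametrically. A minimal sketch of the Aalen-Johansen estimator for one failure cause follows; this is an illustration of the general technique, not the trial's actual analysis code, and the function name is mine:

```python
def cumulative_incidence(times, events, cause):
    """Aalen-Johansen estimate of the cumulative incidence function for
    one failure cause. times: follow-up times; events: 0 = censored,
    1, 2, ... = cause of failure. Returns [(t, CIF(t)), ...]."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv = 1.0            # overall Kaplan-Meier survival just before t
    cif = 0.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        d_cause = d_any = n_cens = 0
        while i < len(data) and data[i][0] == t:   # gather ties at time t
            if data[i][1] == 0:
                n_cens += 1
            else:
                d_any += 1
                if data[i][1] == cause:
                    d_cause += 1
            i += 1
        if d_any > 0:
            cif += surv * d_cause / n_at_risk      # mass assigned to `cause`
            surv *= 1.0 - d_any / n_at_risk
        curve.append((t, cif))
        n_at_risk -= d_any + n_cens
    return curve
```

    With no censoring this reduces to the simple proportion of subjects failing from the given cause by time t; the cause-specific CIFs never sum to more than one, unlike naive one-minus-Kaplan-Meier curves computed per cause.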

  8. Brain Perivascular Spaces as Biomarkers of Vascular Risk: Results from the Northern Manhattan Study.

    PubMed

    Gutierrez, J; Elkind, M S V; Dong, C; Di Tullio, M; Rundek, T; Sacco, R L; Wright, C B

    2017-05-01

    Dilated perivascular spaces in the brain are associated with greater arterial pulsatility. We hypothesized that perivascular spaces identify individuals at higher risk for systemic and cerebral vascular events. Stroke-free participants in the population-based Northern Manhattan Study had brain MR imaging performed and were followed for myocardial infarction, any stroke, and death. Imaging analyses distinguished perivascular spaces from presumably ischemic lesions. Perivascular spaces were further subdivided into lesions with diameters of ≤3 mm (small perivascular spaces) and >3 mm (large perivascular spaces). We calculated relative rates of events with Poisson models and hazard ratios with Cox proportional models. The Northern Manhattan Study participants who had MR imaging data available for review (n = 1228; 59% women; 65% Hispanic; mean age, 71 ± 9 years) were followed for an average of 9 ± 2 years. Participants in the highest tertile of the small perivascular space score had a higher relative rate of all deaths (relative rate, 1.38; 95% CI, 1.01-1.91), vascular death (relative rate, 1.87; 95% CI, 1.12-3.14), myocardial infarction (relative rate, 2.08; 95% CI, 1.01-4.31), any stroke (relative rate, 1.79; 95% CI, 1.03-3.11), and any vascular event (relative rate, 1.74; 95% CI, 1.18-2.56). After we adjusted for confounders, there was a higher risk of vascular death (hazard ratio, 1.06; 95% CI, 1.01-1.11), myocardial infarction (hazard ratio, 2.22; 95% CI, 1.12-4.42), and any vascular event (hazard ratio, 1.04; 95% CI, 1.01-1.08) with higher small perivascular space scores. In this multiethnic, population-based study, participants with a high burden of small perivascular spaces had an increased risk of vascular events. By gaining pathophysiologic insight into the mechanism of perivascular space dilation, we may be able to propose novel therapies to better prevent vascular disorders in the population. © 2017 by American Journal of Neuroradiology.

  9. Space-Time Earthquake Rate Models for One-Year Hazard Forecasts in Oklahoma

    NASA Astrophysics Data System (ADS)

    Llenos, A. L.; Michael, A. J.

    2017-12-01

    The recent one-year seismic hazard assessments for natural and induced seismicity in the central and eastern US (CEUS) (Petersen et al., 2016, 2017) rely on earthquake rate models based on declustered catalogs (i.e., catalogs with foreshocks and aftershocks removed), as is common practice in probabilistic seismic hazard analysis. However, standard declustering can remove over 90% of some induced sequences in the CEUS. Some of these earthquakes may still be capable of causing damage or concern (Petersen et al., 2015, 2016). The choices of whether and how to decluster can lead to seismicity rate estimates that vary by up to factors of 10-20 (Llenos and Michael, AGU, 2016). Therefore, in order to improve the accuracy of hazard assessments, we are exploring ways to make forecasts based on full, rather than declustered, catalogs. We focus on Oklahoma, where earthquake rates began increasing in late 2009, mainly in central Oklahoma, and ramped up substantially in 2013 with the expansion of seismicity into northern Oklahoma and southern Kansas. We develop earthquake rate models using the space-time Epidemic-Type Aftershock Sequence (ETAS) model (Ogata, JASA, 1988; Ogata, AISM, 1998; Zhuang et al., JASA, 2002), which characterizes both the background seismicity rate and aftershock triggering. We examine changes in the model parameters over time, focusing particularly on the background rate, which reflects earthquakes that are triggered by external driving forces such as fluid injection rather than by other earthquakes. After the model parameters are fit to the seismicity data from a given year, forecasts of the full catalog for the following year can be made using a suite of 100,000 ETAS model simulations based on those parameters. To evaluate this approach, we develop pseudo-prospective yearly forecasts for Oklahoma for 2013-2016 and compare them with the observations using standard Collaboratory for the Study of Earthquake Predictability tests for consistency.
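    The simulation step described here rests on generating synthetic catalogs from fitted ETAS parameters. A toy temporal-only branching sketch is below; the parameter values and function names are illustrative defaults I am supplying, not the fitted Oklahoma values, and the real model is space-time rather than time-only:

```python
import math
import random

def poisson_sample(lam, rng):
    """Draw from Poisson(lam) via Knuth's method (fine for small means)."""
    limit = math.exp(-lam)
    k, prod = 0, 1.0
    while True:
        prod *= rng.random()
        if prod <= limit:
            return k
        k += 1

def simulate_etas(mu=0.2, K=0.1, alpha=1.8, c=0.01, p=1.2,
                  b=1.0, m0=3.0, T=365.0, seed=42):
    """Branching simulation of a temporal ETAS catalog on [0, T) days.
    mu: background rate/day; K, alpha: aftershock productivity;
    c, p: Omori-law parameters (p > 1); b: Gutenberg-Richter b-value;
    m0: magnitude of completeness. Returns time-sorted [(t, mag), ...]."""
    rng = random.Random(seed)
    beta = b * math.log(10.0)
    # background events: homogeneous Poisson process via exponential gaps
    queue = []
    t = rng.expovariate(mu)
    while t < T:
        queue.append(t)
        t += rng.expovariate(mu)
    catalog = []
    while queue:
        t0 = queue.pop()
        mag = m0 + rng.expovariate(beta)       # Gutenberg-Richter magnitude
        catalog.append((t0, mag))
        # expected offspring count grows exponentially with magnitude
        n_kids = poisson_sample(K * math.exp(alpha * (mag - m0)), rng)
        for _ in range(n_kids):
            u = rng.random()                   # inverse-CDF Omori delay
            dt = c * ((1.0 - u) ** (-1.0 / (p - 1.0)) - 1.0)
            if t0 + dt < T:
                queue.append(t0 + dt)
    catalog.sort()
    return catalog
```

    Running many such simulations with the same fitted parameters and tallying event counts per bin is the essence of the forecast suite the abstract describes.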

  10. National information network and database system of hazardous waste management in China

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ma Hongchang

    1996-12-31

    Industries in China generate large volumes of hazardous waste, which makes it essential for the nation to pay more attention to hazardous waste management. National laws and regulations, waste surveys, and manifest tracking and permission systems have been initiated. Some centralized hazardous waste disposal facilities are under construction. China's National Environmental Protection Agency (NEPA) has also obtained valuable information on hazardous waste management from developed countries. To effectively share this information with local environmental protection bureaus, NEPA developed a national information network and database system for hazardous waste management. This information network will have such functions as information collection, inquiry, and connection. The long-term objective is to establish and develop a national and local hazardous waste management information network. This network will significantly help decision makers and researchers because it will be easy to obtain information (e.g., experiences of developed countries in hazardous waste management) to enhance hazardous waste management in China. The information network consists of five parts: technology consulting, import-export management, regulation inquiry, waste survey, and literature inquiry.

  11. Time-dependent seismic hazard analysis for the Greater Tehran and surrounding areas

    NASA Astrophysics Data System (ADS)

    Jalalalhosseini, Seyed Mostafa; Zafarani, Hamid; Zare, Mehdi

    2018-01-01

    This study presents a time-dependent approach to seismic hazard in Tehran and the surrounding areas. Hazard is evaluated by combining background seismic activity with larger earthquakes that may emanate from fault segments. Using available historical and paleoseismological data or empirical relations, the recurrence times and maximum magnitudes of characteristic earthquakes on the major faults have been explored. The Brownian passage time (BPT) distribution has been used to calculate equivalent fictitious seismicity rates for major faults in the region. To include ground motion uncertainty, a logic tree and five ground motion prediction equations have been selected based on their applicability in the region. Finally, hazard maps have been presented.
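    The BPT renewal calculation named in this record amounts to evaluating the inverse-Gaussian CDF with mean recurrence interval and aperiodicity as parameters. A hedged sketch of the conditional rupture probability follows; the parameter values in the usage example are illustrative, not the paper's fault data:

```python
import math

def _phi(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bpt_cdf(t, mu, alpha):
    """CDF of the Brownian passage time (inverse Gaussian) distribution,
    parameterized by mean recurrence interval mu and aperiodicity alpha."""
    if t <= 0.0:
        return 0.0
    lam = mu / alpha ** 2                  # inverse-Gaussian shape parameter
    a = math.sqrt(lam / t)
    return (_phi(a * (t / mu - 1.0))
            + math.exp(2.0 * lam / mu) * _phi(-a * (t / mu + 1.0)))

def conditional_probability(t_elapsed, dt, mu, alpha):
    """P(rupture in (t_elapsed, t_elapsed + dt] | quiet through t_elapsed)."""
    survival = 1.0 - bpt_cdf(t_elapsed, mu, alpha)
    return (bpt_cdf(t_elapsed + dt, mu, alpha)
            - bpt_cdf(t_elapsed, mu, alpha)) / survival
```

    Equating the conditional probability over a forecast window to that of a Poisson process yields the "equivalent fictitious" rate the abstract mentions: rate = -ln(1 - p) / dt.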

  12. A 10-Year Follow-Up Study of Social Ties and Functional Health among the Old: The AGES Project

    PubMed Central

    Murata, Chiyoe; Saito, Tami; Tsuji, Taishi; Saito, Masashige

    2017-01-01

    In Asian nations, family ties are considered important. However, it is not clear what happens among older people who lack such ties. To investigate the association between social ties and functional health, we used longitudinal data from the Aichi Gerontological Evaluation Study (AGES) project. Functionally independent older people at baseline (N = 14,088) in 10 municipalities were followed from 2003 to 2013. Social ties were assessed by asking about social support exchange with family, relatives, friends, or neighbors. Cox proportional hazard models were employed to investigate the association between social ties and the onset of functional disability, adjusting for age, health status, and living arrangement. We found that social ties with co-residing family members, and those with friends or neighbors, independently protected functional health, with hazard ratios of 0.81 and 0.85 among men. Among women, ties with friends or neighbors had a stronger effect on health compared to their male counterparts, with a hazard ratio of 0.89. The fact that social ties with friends or neighbors are associated with a lower risk of functional decline, independent of family support, underscores the importance of promoting social ties, especially among those lacking family ties. PMID:28671627

  13. Directing driver attention with augmented reality cues

    PubMed Central

    Rusch, Michelle L.; Schall, Mark C.; Gavin, Patrick; Lee, John D.; Dawson, Jeffrey D.; Vecera, Shaun; Rizzo, Matthew

    2013-01-01

    This simulator study evaluated the effects of augmented reality (AR) cues designed to direct the attention of experienced drivers to roadside hazards. Twenty-seven healthy middle-aged licensed drivers with a range of attention capacity participated in a 54 mile (1.5 hour) drive in an interactive fixed-base driving simulator. Each participant received AR cues to potential roadside hazards in six simulated straight (9 mile long) rural roadway segments. Drivers were evaluated on response time for detecting a potentially hazardous event, detection accuracy for target (hazard) and non-target objects, and headway with respect to the hazards. Results showed no negative outcomes associated with interference. AR cues did not impair perception of non-target objects, including for drivers with lower attentional capacity. Results showed near significant response time benefits for AR cued hazards. AR cueing increased response rate for detecting pedestrians and warning signs but not vehicles. AR system false alarms and misses did not impair driver responses to potential hazards. PMID:24436635

  14. The association between mental health, physical function, and hemodialysis mortality.

    PubMed

    Knight, Eric L; Ofsthun, Norma; Teng, Ming; Lazarus, J Michael; Curhan, Gary C

    2003-05-01

    Mortality rates for individuals on chronic hemodialysis remain very high; therefore, strategies are needed to identify individuals at greatest risk for mortality so preventive strategies can be implemented. One such approach is to stratify individuals by self-reported mental health and physical function. Examining these parameters at baseline, and over time, may help identify individuals at greater risk for mortality. We enrolled 14,815 individuals with end-stage renal disease (ESRD) and followed these individuals for up to 2 years. The mean age was 61.0 +/- 15.4 years (range, 20 to 96 years) and 31% were African Americans. The SF-36 Health Survey was administered 1 to 3 months after hemodialysis initiation and 6 months later. We examined the associations between the initial SF-36 Health Survey mental component summary (MCS) and physical component summary (PCS) scores and mortality during the follow-up period, and examined the associations between 6-month decline in PCS and MCS scores and subsequent mortality. We also examined the interactions between age and MCS and PCS scores. The general population-based mean of each of these scores was 50 with a standard deviation of 10. The main outcome measurement was death. Self-reported baseline mental health (MCS score) and physical function (PCS score) were both independently associated with increased mortality, and 6-month decline in these parameters was also associated with increased mortality. The multivariate hazard ratios for 1-year mortality for MCS scores of less than 30, 30 to 39, and 40 to 49 were 1.48 (95% CI, 1.32 to 1.64), 1.23 (95% CI, 1.14 to 1.32) and 1.18 (95% CI, 1.10 to 1.26) compared with a MCS score of 50 or more. The hazard ratios for PCS scores of less than 20, 20 to 29, and 30 to 39 were 1.97 (95% CI, 1.64 to 2.36), 1.62 (95% CI, 1.36 to 1.92), and 1.32 (95% CI, 1.11 to 1.57) compared with a PCS score of 50 or more. 
Six-month declines in self-reported mental health (hazard ratio, 1.07; 95% CI, 1.02 to 1.12, per 10-point decline in MCS score) and physical function (hazard ratio, 1.25; 95% CI, 1.18 to 1.33, per 10-point decline in PCS score) were also both significantly associated with an additional increase in mortality beyond baseline risk. We also found a significant interaction between age and physical function (P = 0.02). Specifically, there was a graded response between the PCS score category and mortality in most age strata, but this relationship was not observed in the oldest age group (85 years or older). In individuals newly initiated on chronic hemodialysis, self-reported baseline mental health and physical function are important, independent predictors of mortality, and there is a graded relationship between these parameters and mortality risk. Following these parameters over time provides additional information on mortality risk. One must also consider age when interpreting the relationship between physical function and mortality.

  15. Airflow Hazard Visualization for Helicopter Pilots: Flight Simulation Study Results

    NASA Technical Reports Server (NTRS)

    Aragon, Cecilia R.; Long, Kurtis R.

    2005-01-01

    Airflow hazards such as vortices or low level wind shear have been identified as a primary contributing factor in many helicopter accidents. US Navy ships generate airwakes over their decks, creating potentially hazardous conditions for shipboard rotorcraft launch and recovery. Recent sensor developments may enable the delivery of airwake data to the cockpit, where visualizing the hazard data may improve safety and possibly extend ship/helicopter operational envelopes. A prototype flight-deck airflow hazard visualization system was implemented on a high-fidelity rotorcraft flight dynamics simulator. Experienced helicopter pilots, including pilots from all five branches of the military, participated in a usability study of the system. Data was collected both objectively from the simulator and subjectively from post-test questionnaires. Results of the data analysis are presented, demonstrating a reduction in crash rate and other trends that illustrate the potential of airflow hazard visualization to improve flight safety.

  16. Seismic hazard in the Intermountain West

    USGS Publications Warehouse

    Haller, Kathleen; Moschetti, Morgan P.; Mueller, Charles; Rezaeian, Sanaz; Petersen, Mark D.; Zeng, Yuehua

    2015-01-01

    The 2014 national seismic-hazard model for the conterminous United States incorporates new scientific results and important model adjustments. The current model includes updates to the historical catalog, which is spatially smoothed using both fixed-length and adaptive-length smoothing kernels. Fault-source characterization improved by adding faults, revising rates of activity, and incorporating new results from combined inversions of geologic and geodetic data. The update also includes a new suite of published ground motion models. Changes in probabilistic ground motion are generally less than 10% in most of the Intermountain West compared to the prior assessment, and ground-motion hazard in four Intermountain West cities illustrates the range and magnitude of change in the region. Seismic hazard at reference sites in Boise and Reno increased as much as 10%, whereas hazard in Salt Lake City decreased 5–6%. The largest change was in Las Vegas, where hazard increased 32–35%.

  17. Understanding the Risk Factors of Trauma Center Closures

    PubMed Central

    Shen, Yu-Chu; Hsia, Renee Y.; Kuzma, Kristen

    2011-01-01

    Objectives We analyze whether hazard rates of shutting down trauma centers are higher due to financial pressures or in areas with vulnerable populations (such as minorities or the poor). Materials and Methods This is a retrospective study of all hospitals with trauma center services in urban areas in the continental US between 1990 and 2005, identified from the American Hospital Association Annual Surveys. These data were linked with Medicare cost reports and supplemented with other sources, including the Area Resource File. We analyze the hazard rates of trauma center closures along several dimensions of risk factors using discrete-time proportional hazard models. Results The number of trauma center closures increased from 1990 to 2005, with a total of 339 during this period. The hazard rate of closing trauma centers in hospitals with a negative profit margin is 1.38 times higher than in hospitals without a negative profit margin (P < 0.01). Hospitals receiving more generous Medicare reimbursements face a lower hazard of shutting down trauma centers (ratio: 0.58, P < 0.01) than those receiving below-average reimbursement. Hospitals in areas with higher health maintenance organization penetration face a higher hazard of trauma center closure (ratio: 2.06, P < 0.01). Finally, hospitals in areas with higher shares of minorities face a higher risk of trauma center closure (ratio: 1.69, P < 0.01). Medicaid load and uninsured populations, however, are not risk factors for higher rates of closure after we control for other financial and community characteristics. Conclusions Our findings indicate how current proposals to cut public spending could exacerbate trauma center closures, particularly in areas with high shares of minorities. In addition, given the negative effect of health maintenance organizations on trauma center survival, the growth of the Medicaid managed care population should be monitored.
Finally, high shares of Medicaid patients or uninsured populations are not, by themselves, independent risk factors for closure as long as financial pressures are mitigated. Targeted policy interventions, and further research into the causes, are needed to address these systems-level disparities. PMID:19704354

  18. Understanding the risk factors of trauma center closures: do financial pressure and community characteristics matter?

    PubMed

    Shen, Yu-Chu; Hsia, Renee Y; Kuzma, Kristen

    2009-09-01

    We analyze whether hazard rates of shutting down trauma centers are higher due to financial pressures or in areas with vulnerable populations (such as minorities or the poor). This is a retrospective study of all hospitals with trauma center services in urban areas in the continental US between 1990 and 2005, identified from the American Hospital Association Annual Surveys. These data were linked with Medicare cost reports and supplemented with other sources, including the Area Resource File. We analyze the hazard rates of trauma center closures along several dimensions of risk factors using discrete-time proportional hazard models. The number of trauma center closures increased from 1990 to 2005, with a total of 339 during this period. The hazard rate of closing trauma centers in hospitals with a negative profit margin is 1.38 times higher than in hospitals without a negative profit margin (P < 0.01). Hospitals receiving more generous Medicare reimbursements face a lower hazard of shutting down trauma centers (ratio: 0.58, P < 0.01) than those receiving below-average reimbursement. Hospitals in areas with higher health maintenance organization penetration face a higher hazard of trauma center closure (ratio: 2.06, P < 0.01). Finally, hospitals in areas with higher shares of minorities face a higher risk of trauma center closure (ratio: 1.69, P < 0.01). Medicaid load and uninsured populations, however, are not risk factors for higher rates of closure after we control for other financial and community characteristics. Our findings indicate how current proposals to cut public spending could exacerbate trauma center closures, particularly in areas with high shares of minorities. In addition, given the negative effect of health maintenance organizations on trauma center survival, the growth of the Medicaid managed care population should be monitored.
Finally, high shares of Medicaid patients or uninsured populations are not, by themselves, independent risk factors for closure as long as financial pressures are mitigated. Targeted policy interventions, and further research into the causes, are needed to address these systems-level disparities.

  19. Deadly Cold: Health Hazards Due to Cold Weather. An Information Paper by the Subcommittee on Health and Long-Term Care of the Select Committee on Aging. House of Representatives, Ninety-Eighth Congress, Second Session (February 1984).

    ERIC Educational Resources Information Center

    Congress of the U.S., Washington, DC. House Select Committee on Aging.

    This paper, on the health hazards of cold weather for elderly persons, presents information from various sources on the death rates in winter throughout the United States. After reviewing the scope of the problem, specific health hazards associated with cold weather are discussed, i.e., hypothermia, fires, carbon monoxide poisoning, and influenza…

  20. Job Loss, Unemployment and the Incidence of Hazardous Drinking during the Late 2000s Recession in Europe among Adults Aged 50-64 Years.

    PubMed

    Bosque-Prous, Marina; Espelt, Albert; Sordo, Luis; Guitart, Anna M; Brugal, M Teresa; Bravo, Maria J

    2015-01-01

    To estimate the incidence of hazardous drinking in middle-aged people during an economic recession and to ascertain whether individual job loss and contextual changes in unemployment influence the incidence rate in that period. Longitudinal study based on two waves of the SHARE project (Survey of Health, Ageing and Retirement in Europe). Individuals aged 50-64 years from 11 European countries who were not hazardous drinkers at baseline (n = 7,615) were selected for this study. We estimated the cumulative incidence of hazardous drinking (≥40 g and ≥20 g of pure alcohol on average in men and women, respectively) between 2006 and 2012. In the statistical analysis, multilevel Poisson regression models with robust variance were fitted to obtain risk ratios (RR) and their 95% confidence intervals (95% CI). Over a 6-year period, 505 subjects became hazardous drinkers, a cumulative incidence of 6.6 per 100 persons between 2006 and 2012 (95% CI: 6.1-7.2). Age [RR = 1.02 (95% CI: 1.00-1.04)] and becoming unemployed [RR = 1.55 (95% CI: 1.08-2.23)] were independently associated with a higher risk of becoming a hazardous drinker. Conversely, poorer self-perceived health was associated with a lower risk of becoming a hazardous drinker [RR = 0.75 (95% CI: 0.60-0.95)]. At the country level, an increase in the unemployment rate during the study period [RR = 1.32 (95% CI: 1.17-1.50)] and greater increases in household disposable income [RR = 0.97 (95% CI: 0.95-0.99)] were associated with the risk of becoming a hazardous drinker. Job loss among middle-aged individuals during the economic recession was positively associated with becoming a hazardous drinker. Changes in country-level variables were also related to this drinking pattern.

  1. Analyzing Distributed Functions in an Integrated Hazard Analysis

    NASA Technical Reports Server (NTRS)

    Morris, A. Terry; Massie, Michael J.

    2010-01-01

    Large scale integration of today's aerospace systems is achievable through the use of distributed systems. Validating the safety of distributed systems is significantly more difficult than for centralized systems because of the complexity of the interactions between simultaneously active components. Integrated hazard analysis (IHA), a process used to identify unacceptable risks and to provide a means of controlling them, can be applied to either centralized or distributed systems. IHA, though, must be tailored to fit the particular system being analyzed. Distributed systems, for instance, must be analyzed for hazards in terms of the functions that rely on them. This paper will describe systems-oriented IHA techniques (as opposed to traditional failure-event or reliability techniques) that should be employed for distributed systems in aerospace environments. Special considerations will be addressed when dealing with specific distributed systems such as active thermal control, electrical power, command and data handling, and software systems (including the interaction with fault management systems). Because of the significance of second-order effects in large scale distributed systems, the paper will also describe how to analyze the effects of secondary functions on other secondary functions through the use of channelization.

  2. Combined effects of family history of CVD and heart rate on ischemic stroke incidence among Inner Mongolians in China.

    PubMed

    Zhou, Yipeng; Tian, Yunfan; Zhong, Chongke; Batu, Buren; Xu, Tian; Li, Hongmei; Zhang, Mingzhi; Wang, Aili; Zhang, Yonghong

    2016-05-01

    This study aimed to evaluate the combined effects of family history of cardiovascular diseases (FHCVD) and heart rate on ischemic stroke incidence among Inner Mongolians in China. A prospective cohort study was conducted among 2589 participants aged 20 years and older from Inner Mongolia, China. The participants were divided into four groups according to FHCVD status and heart rate and followed up from June 2002 to July 2012. Cox proportional hazards models were used to evaluate the combined effects of FHCVD and heart rate on the incidence of ischemic stroke. A total of 76 ischemic strokes occurred during the follow-up period. The observed ischemic stroke cases tended to be older and male, and had a higher prevalence of smoking, drinking, hypertension and FHCVD, as well as higher systolic and diastolic blood pressures at baseline, compared with those who did not experience ischemic stroke. The age- and gender-adjusted hazard ratio (95% confidence interval) of ischemic stroke in participants with both FHCVD and heart rate ≥ 80 was 2.89 (1.51-5.53), compared with those without FHCVD and with heart rate < 80. After multiple adjustment, the association between ischemic stroke risk and both FHCVD and heart rate ≥ 80 remained statistically significant (hazard ratio, 2.47; 95% confidence interval: 1.22-5.01). Our main finding, that participants with both FHCVD and faster heart rate have the highest risk of ischemic stroke, suggests that faster heart rate may increase the risk of ischemic stroke among people with FHCVD.

  3. Two models for evaluating landslide hazards

    USGS Publications Warehouse

    Davis, J.C.; Chung, C.-J.; Ohlmacher, G.C.

    2006-01-01

    Two alternative procedures for estimating landslide hazards were evaluated using data on topographic digital elevation models (DEMs) and bedrock lithologies in an area adjacent to the Missouri River in Atchison County, Kansas, USA. The two procedures are based on the likelihood ratio model but utilize different assumptions. The empirical likelihood ratio model is based on non-parametric empirical univariate frequency distribution functions under an assumption of conditional independence, while the multivariate logistic discriminant model assumes that likelihood ratios can be expressed in terms of logistic functions. The relative hazards of occurrence of landslides were estimated by an empirical likelihood ratio model and by multivariate logistic discriminant analysis. Predictor variables consisted of grids containing topographic elevations, slope angles, and slope aspects calculated from a 30-m DEM. An integer grid of coded bedrock lithologies taken from digitized geologic maps was also used as a predictor variable. Both statistical models yield relative estimates in the form of the proportion of total map area predicted to already contain or to be the site of future landslides. The stabilities of estimates were checked by cross-validation of results from random subsamples, using each of the two procedures. Cell-by-cell comparisons of hazard maps made by the two models show that the two sets of estimates are virtually identical. This suggests that the empirical likelihood ratio and the logistic discriminant analysis models are robust with respect to the conditional independence assumption and the logistic function assumption, respectively, and that either model can be used successfully to evaluate landslide hazards. © 2006.
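    The empirical likelihood ratio approach in this record can be sketched for a single categorical predictor such as lithology class. The toy data, class names, and function names below are mine, not the Atchison County computation:

```python
from collections import Counter

def likelihood_ratios(classes, labels):
    """Empirical likelihood ratio LR(c) = P(c | landslide) / P(c | stable)
    for one categorical predictor, from per-cell observations
    (labels: 1 = landslide cell, 0 = stable cell)."""
    slide = Counter(c for c, y in zip(classes, labels) if y == 1)
    stable = Counter(c for c, y in zip(classes, labels) if y == 0)
    n1, n0 = sum(slide.values()), sum(stable.values())
    return {c: (slide[c] / n1) / (stable[c] / n0)
            for c in set(classes) if slide[c] > 0 and stable[c] > 0}

def posterior_odds(prior_odds, lrs):
    """Under conditional independence, posterior odds of a landslide are
    the prior odds times the product of the per-layer likelihood ratios."""
    odds = prior_odds
    for lr in lrs:
        odds *= lr
    return odds
```

    The logistic discriminant alternative replaces the product of empirical ratios with a fitted logistic function of the same predictors, which is why the two sets of estimates can agree so closely when the conditional independence assumption roughly holds.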

  4. Exposure and Dosimetry Considerations for Adverse Outcome Pathways (AOPs) (NIH-AOP)

    EPA Science Inventory

    Risk is a function of both hazard and exposure. Toxicokinetic (TK) models can determine whether chemical exposures produce potentially hazardous tissue concentrations. Whether or not the molecular initiating event (MIE) in an Adverse Outcome Pathway (AOP) occurs depends on both e...

  5. The Relationship Between Caregiving and Mortality After Accounting for Time-Varying Caregiver Status and Addressing the Healthy Caregiver Hypothesis.

    PubMed

    Fredman, Lisa; Lyons, Jennifer G; Cauley, Jane A; Hochberg, Marc; Applebaum, Katie M

    2015-09-01

    Previous studies have shown inconsistent associations between caregiving and mortality. This may be due to analyzing caregiver status at baseline only, and to the fact that better health is probably related to taking on caregiving responsibilities and continuing in that role. The latter is termed the Healthy Caregiver Hypothesis, similar to the Healthy Worker Effect in occupational epidemiology. We applied common approaches from occupational epidemiology to evaluate the association between caregiving and mortality, including treating caregiving as time-varying and lagging exposure by up to 5 years. Caregiving status among 1,068 women (baseline mean age = 81.0 years; 35% caregivers) participating in the Caregiver-Study of Osteoporotic Fractures study was assessed at five interviews conducted between 1999 and 2009. Mortality was determined through January 2012. Cox proportional hazards models were used to estimate hazard ratios and 95% confidence intervals adjusted for sociodemographics, perceived stress, and functional limitations. A total of 483 participants died during follow-up (38.8% and 48.7% of baseline caregivers and noncaregivers, respectively). Using baseline caregiving status, the association with mortality was 0.77 (0.62-0.95). Models of time-varying caregiving status showed a more pronounced reduction in mortality in current caregivers (hazard ratio = 0.54, 0.38-0.75), which diminished with longer lag periods (3-year lag hazard ratio = 0.68, 0.52-0.88; 5-year lag hazard ratio = 0.76, 0.60-0.95). Overall, caregivers had lower mortality rates than noncaregivers in all analyses. These associations were sensitive to the lag period, indicating that the timing of leaving caregiving does influence this relationship and should be considered in future investigations. © The Author 2015. Published by Oxford University Press on behalf of The Gerontological Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
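    Treating exposure as time-varying with a lag, as described in this record, amounts to expanding each participant's interview data into person-period records before fitting the survival model. A hypothetical sketch follows; the record layout and function name are assumptions of mine, not the study's code:

```python
def person_periods(waves, death_year, end_year, lag=0):
    """Expand one participant's interview data into person-year records.
    waves: dict {year: 1 if caregiving at that interview else 0}.
    Exposure in year t is the most recent status reported at or
    before t - lag. Returns [(year, exposed, died_this_year), ...]."""
    years = sorted(waves)
    records = []
    stop = min(death_year, end_year)
    for year in range(years[0], stop + 1):
        # most recent wave at or before (year - lag)
        known = [y for y in years if y <= year - lag]
        if not known:
            continue          # exposure not yet defined under this lag
        exposed = waves[max(known)]
        records.append((year, exposed, int(year == death_year)))
    return records
```

    Lagging the exposure this way prevents health declines that force someone out of caregiving shortly before death from being counted as "noncaregiver" person-time, which is the mechanism behind the Healthy Caregiver Hypothesis.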

  6. Simple estimation procedures for regression analysis of interval-censored failure time data under the proportional hazards model.

    PubMed

    Sun, Jianguo; Feng, Yanqin; Zhao, Hui

    2015-01-01

    Interval-censored failure time data occur in many fields, including epidemiological and medical studies as well as financial and sociological studies, and many authors have investigated their analysis (Sun, The statistical analysis of interval-censored failure time data, 2006; Zhang, Stat Modeling 9:321-343, 2009). In particular, a number of procedures have been developed for regression analysis of interval-censored data arising from the proportional hazards model (Finkelstein, Biometrics 42:845-854, 1986; Huang, Ann Stat 24:540-568, 1996; Pan, Biometrics 56:199-203, 2000). For most of these procedures, however, one drawback is that they involve estimation of both the regression parameters and the baseline cumulative hazard function. In this paper, we propose two simple estimation approaches that do not need estimation of the baseline cumulative hazard function. The asymptotic properties of the resulting estimates are given, and an extensive simulation study is conducted, indicating that they work well in practical situations.

  7. Fire danger index efficiency as a function of fuel moisture and fire behavior.

    PubMed

    Torres, Fillipe Tamiozzo Pereira; Romeiro, Joyce Machado Nunes; Santos, Ana Carolina de Albuquerque; de Oliveira Neto, Ricardo Rodrigues; Lima, Gumercindo Souza; Zanuncio, José Cola

    2018-08-01

    Assessment of the performance of forest fire hazard indices is important for prevention and management strategies, such as planning prescribed burnings, public notifications and firefighting resource allocation. The objective of this study was to evaluate the performance of fire hazard indices considering fire behavior variables and susceptibility expressed by the moisture of combustible material. Controlled burns were carried out at different times, and information on meteorological conditions, characteristics of the combustible material and fire behavior variables was recorded. All variables analyzed (fire behavior and fuel moisture content) can be explained by the prediction indices. The Brazilian EVAP/P showed the best performance, both at predicting the moisture content of the fuel material and fire behavior variables, and the Canadian system showed the best performance at predicting the rate of spread. The coherence of the correlations between the indices and the variables analyzed makes the methodology, which can be applied anywhere, important for decision-making in regions with no records or with only unreliable forest fire data. Copyright © 2018 Elsevier B.V. All rights reserved.

  8. Natural radioelement concentrations in the soil of the Mila region of Algeria

    NASA Astrophysics Data System (ADS)

    Bramki, Amina; Ramdhane, Mourad; Benrachi, Fatima

    2017-09-01

    In this study, the levels of natural and artificial radioactivity were measured in soil samples collected at various depths in El-Athmania, an agricultural region of Mila, Algeria. Activity concentrations of the radionuclides of concern were determined by gamma-ray spectrometry using a high-purity germanium detector. The activity concentrations of 226Ra, 232Th and 40K were found to be unchanged as a function of depth, ranging from 23.72±2.37 to 65.47±5.06 Bq.kg-1 for 226Ra, from 26.45±0.78 to 27.10±0.80 Bq.kg-1 for 232Th and from 220.80±10.01 to 260.70±8.24 Bq.kg-1 for 40K, respectively. To evaluate the radiological hazard of the radioactivity in the samples, the radium equivalent activity (Raeq), the absorbed dose rate (D), the annual effective dose, and the external (Hex) and internal (Hin) hazard indices were calculated. The mean excess lifetime cancer risk observed in this study is below the world mean value.
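    The hazard quantities named above are typically computed from the three activity concentrations with widely used UNSCEAR-style coefficients. The sketch below uses those standard coefficients; the study's exact constants may differ slightly.

```python
def hazard_indices(a_ra, a_th, a_k):
    """Common radiological hazard quantities from activity concentrations
    (Bq/kg) of 226Ra, 232Th and 40K, using widely cited UNSCEAR-style
    coefficients (a sketch; the paper's exact constants may differ).
    """
    ra_eq = a_ra + 1.43 * a_th + 0.077 * a_k             # radium equivalent, Bq/kg
    d     = 0.462 * a_ra + 0.604 * a_th + 0.0417 * a_k   # absorbed dose rate, nGy/h
    h_ex  = a_ra / 370 + a_th / 259 + a_k / 4810         # external hazard index
    h_in  = a_ra / 185 + a_th / 259 + a_k / 4810         # internal hazard index
    # Outdoor annual effective dose (mSv/y): 8760 h/y, 0.2 occupancy factor,
    # 0.7 Sv/Gy dose conversion factor, nGy -> mSv.
    aed   = d * 8760 * 0.2 * 0.7 * 1e-6
    return ra_eq, d, h_ex, h_in, aed

# Lowest reported concentrations from the abstract:
ra_eq, d, h_ex, h_in, aed = hazard_indices(23.72, 26.45, 220.80)
```

    A soil is conventionally regarded as safe for building materials when Raeq stays below 370 Bq/kg and both hazard indices stay below unity, which holds for these values.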

  9. Bias correction in the hierarchical likelihood approach to the analysis of multivariate survival data.

    PubMed

    Jeon, Jihyoun; Hsu, Li; Gorfine, Malka

    2012-07-01

    Frailty models are useful for measuring unobserved heterogeneity in risk of failure across clusters and for providing cluster-specific risk prediction. In a frailty model, the latent frailties shared by members within a cluster are assumed to act multiplicatively on the hazard function. In order to obtain parameter and frailty variate estimates, we consider the hierarchical likelihood (H-likelihood) approach (Ha, Lee and Song, 2001. Hierarchical-likelihood approach for frailty models. Biometrika 88, 233-243), in which the latent frailties are treated as "parameters" and estimated jointly with other parameters of interest. We find that the H-likelihood estimators perform well when the censoring rate is low; however, they are substantially biased when the censoring rate is moderate to high. In this paper, we propose a simple and easy-to-implement bias correction method for the H-likelihood estimators under a shared frailty model. We also extend the method to a multivariate frailty model, which incorporates a complex dependence structure within clusters. We conduct an extensive simulation study and show that the proposed approach performs very well for censoring rates as high as 80%. We also illustrate the method with a breast cancer data set. Since the H-likelihood is the same as the penalized likelihood function, the proposed bias correction method is also applicable to the penalized likelihood estimators.
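    A shared frailty model of the kind described can be simulated directly: each cluster draws one gamma frailty with mean 1, and every member's hazard is multiplied by it. This is a generic simulation sketch (hypothetical parameter values), not the authors' bias-corrected estimator.

```python
import math
import random

def simulate_cluster(n, beta, lam, theta, censor_time, rng):
    """Simulate one cluster under a shared gamma frailty model:
    hazard_i(t) = lam * w * exp(beta * x_i), with w ~ Gamma(1/theta, theta)
    (mean 1, variance theta) shared by all cluster members.
    """
    w = rng.gammavariate(1.0 / theta, theta)  # shared frailty, E[w] = 1
    data = []
    for _ in range(n):
        x = rng.random()
        t = rng.expovariate(lam * w * math.exp(beta * x))  # latent event time
        event = int(t <= censor_time)                      # administrative censoring
        data.append((min(t, censor_time), event, x))
    return data

rng = random.Random(42)
clusters = [simulate_cluster(4, beta=0.5, lam=0.1, theta=1.0,
                             censor_time=10.0, rng=rng) for _ in range(250)]
censoring_rate = 1 - sum(e for c in clusters for _, e, _ in c) / 1000
```

    Shortening `censor_time` raises the censoring rate, which is exactly the regime in which the paper reports the uncorrected H-likelihood estimators become biased.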

  10. The Type of Payment and Working Conditions.

    PubMed

    Rhee, Kyung Yong; Kim, Young Sun; Cho, Yoon Ho

    2015-12-01

    The type of payment is a basic working condition and one of the important factors affecting the health of employees. Conventional research in occupational safety and health treats only physical, chemical, biological, and ergonomic factors as the main hazards; managerial factors and basic working conditions such as working hours and the type of payment are neglected. This paper aimed to investigate, as a heuristic study, the association between the type of payment and exposure to various hazardous factors. The third Korean Working Conditions Survey (KWCS), conducted by the Occupational Safety and Health Research Institute in 2011, was used for this study. Among the total sample of 50,032 economically active persons, 34,788 employees were considered for analysis. This study examined the relation between three types of payment (basic fixed salary and wage, piece rate, and extra payment for bad and dangerous working conditions) and exposure to hazardous factors such as vibration, noise, temperature, chemical contact, and working at very high speeds. Multivariate regression analysis was used to measure the effect of the type of payment on working hours exposed to hazards. The results showed that the proportion of employees with a basic fixed salary was 94.5%, the proportion with piece rates was 38.6%, and the proportion who received extra payment for hazardous working conditions was 11.7%. Piece-rate payment was associated with exposure to working with tight deadlines and stressful jobs. This study had some limitations because the KWCS was a cross-sectional survey.

  11. All-Cause, Cardiovascular, and Cancer Mortality Rates in Postmenopausal White, Black, Hispanic, and Asian Women With and Without Diabetes in the United States

    PubMed Central

    Ma, Yunsheng; Hébert, James R.; Balasubramanian, Raji; Wedick, Nicole M.; Howard, Barbara V.; Rosal, Milagros C.; Liu, Simin; Bird, Chloe E.; Olendzki, Barbara C.; Ockene, Judith K.; Wactawski-Wende, Jean; Phillips, Lawrence S.; LaMonte, Michael J.; Schneider, Kristin L.; Garcia, Lorena; Ockene, Ira S.; Merriam, Philip A.; Sepavich, Deidre M.; Mackey, Rachel H.; Johnson, Karen C.; Manson, JoAnn E.

    2013-01-01

    Using data from the Women's Health Initiative (1993–2009; n = 158,833 participants, of whom 84.1% were white, 9.2% were black, 4.1% were Hispanic, and 2.6% were Asian), we compared all-cause, cardiovascular, and cancer mortality rates in white, black, Hispanic, and Asian postmenopausal women with and without diabetes. Cox proportional hazard models were used for the comparison from which hazard ratios and 95% confidence intervals were computed. Within each racial/ethnic subgroup, women with diabetes had an approximately 2–3 times higher risk of all-cause, cardiovascular, and cancer mortality than did those without diabetes. However, the hazard ratios for mortality outcomes were not significantly different between racial/ethnic subgroups. Population attributable risk percentages (PARPs) take into account both the prevalence of diabetes and hazard ratios. For all-cause mortality, whites had the lowest PARP (11.1, 95% confidence interval (CI): 10.1, 12.1), followed by Asians (12.9, 95% CI: 4.7, 20.9), blacks (19.4, 95% CI: 15.0, 23.7), and Hispanics (23.2, 95% CI: 14.8, 31.2). To our knowledge, the present study is the first to show that hazard ratios for mortality outcomes were not significantly different between racial/ethnic subgroups when stratified by diabetes status. Because of the “amplifying” effect of diabetes prevalence, efforts to reduce racial/ethnic disparities in the rate of death from diabetes should focus on prevention of diabetes. PMID:24045960
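    The PARP figures above combine exposure prevalence with the hazard ratio. A common closed form for this is Levin's formula; the study's exact estimator may differ, so this is an illustrative sketch with hypothetical inputs, not the study's numbers.

```python
def levin_parp(prevalence, hr):
    """Population attributable risk percent via Levin's formula:
    PARP = 100 * p*(HR - 1) / (1 + p*(HR - 1)).
    Combines exposure prevalence with the rate ratio, so a common exposure
    with a modest HR can outrank a rare exposure with a large HR.
    """
    excess = prevalence * (hr - 1.0)
    return 100.0 * excess / (1.0 + excess)

# Hypothetical illustration (NOT the study's data): the same HR of 2.5
# yields a larger PARP in a group where diabetes is more prevalent.
print(round(levin_parp(0.07, 2.5), 1))   # -> 9.5
print(round(levin_parp(0.15, 2.5), 1))   # -> 18.4
```

    This "amplifying" effect of prevalence is why the abstract concludes that reducing disparities in diabetes-related mortality should focus on preventing diabetes itself.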

  12. A Bimodal Hybrid Model for Time-Dependent Probabilistic Seismic Hazard Analysis

    NASA Astrophysics Data System (ADS)

    Yaghmaei-Sabegh, Saman; Shoaeifar, Nasser; Shoaeifar, Parva

    2018-03-01

    The evaluation of evidence provided by geological studies and historical catalogs indicates that in some seismic regions and faults, multiple large earthquakes occur in clusters; the occurrences of large earthquakes are then followed by periods of quiescence during which only small-to-moderate earthquakes take place. Clustering of large earthquakes is the most distinguishable departure from the assumption, made in conventional seismic hazard analysis, of a constant hazard of random earthquake occurrence. In the present study, a time-dependent recurrence model is proposed to describe series of large earthquakes that occur in clusters. The model is flexible enough to reflect the quasi-periodic behavior of large earthquakes with long-term clustering, and can be used in time-dependent probabilistic seismic hazard analysis for engineering purposes. In this model, the time-dependent hazard is estimated by a hazard function comprising three parts: a decreasing hazard associated with the last cluster of large earthquakes, an increasing hazard associated with the next cluster of large earthquakes, and a constant hazard for the random occurrence of small-to-moderate earthquakes. In the final part of the paper, the time-dependent seismic hazard of the New Madrid Seismic Zone at different time intervals is calculated for illustrative purposes.
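    The three-part structure of the hazard function can be sketched numerically. The functional forms and parameter values below are hypothetical, chosen only to illustrate the shape, and are not the paper's fitted model.

```python
import math

def cluster_hazard(t, h_bg, a, k_dec, b, k_inc, t_next):
    """Illustrative three-part time-dependent hazard rate:
      - a * exp(-k_dec * t): hazard decaying after the last large-event cluster,
      - b * exp(k_inc * (t - t_next)), capped at b: hazard building toward the
        next cluster expected around time t_next,
      - h_bg: constant background hazard of small-to-moderate events.
    (Hypothetical functional forms, not the paper's fitted model.)
    """
    decreasing = a * math.exp(-k_dec * t)
    increasing = b * min(1.0, math.exp(k_inc * (t - t_next)))
    return h_bg + decreasing + increasing

# Right after a cluster the decaying term dominates; near the expected time
# of the next cluster the increasing term dominates.
early = cluster_hazard(1.0,  h_bg=0.01, a=0.2, k_dec=0.5, b=0.3, k_inc=0.1, t_next=100.0)
late  = cluster_hazard(95.0, h_bg=0.01, a=0.2, k_dec=0.5, b=0.3, k_inc=0.1, t_next=100.0)
```

    Between the two clusters the total hazard bottoms out near the constant background level, reproducing the "bimodal" shape the model is named for.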

  13. Evolution of biological sequences implies an extreme value distribution of type I for both global and local pairwise alignment scores.

    PubMed

    Bastien, Olivier; Maréchal, Eric

    2008-08-07

    Confidence in pairwise alignments of biological sequences, obtained by various methods such as Blast or Smith-Waterman, is critical for automatic analyses of genomic data. Two statistical models have been proposed. In the asymptotic limit of long sequences, the Karlin-Altschul model is based on the computation of a P-value, assuming that the number of high-scoring matching regions above a threshold is Poisson distributed. Alternatively, the Lipman-Pearson model is based on the computation of a Z-value from a random score distribution obtained by a Monte-Carlo simulation. Z-values allow the deduction of an upper bound on the P-value (1/Z-value2) following the TULIP theorem. Simulated Z-value distributions are known to fit a Gumbel law. This remarkable property had not been demonstrated and had no obvious biological support. We built a model of evolution of sequences based on aging, as meant in Reliability Theory, using the fact that the amount of information shared between an initial sequence and the sequences in its lineage (i.e., mutual information in Information Theory) is a decreasing function of time. This quantity is simply measured by a sequence alignment score. In systems aging, the failure rate is related to the system's longevity. The system can be a machine with structured components, or a living entity or population. "Reliability" refers to the ability to operate properly according to a standard. Here, the "reliability" of a sequence refers to the ability to conserve a sufficient functional level at the folded and maturated protein level (positive selection pressure). Homologous sequences were considered as systems 1) having a high redundancy of information, reflected by the magnitude of their alignment scores, 2) whose components are the amino acids, which can independently be damaged by random DNA mutations. 
From these assumptions, we deduced that the information shared at each amino acid position evolves at a constant rate, corresponding to the information hazard rate, and that pairwise sequence alignment scores should follow a Gumbel distribution whose parameters have some theoretical rationale. In particular, one parameter corresponds to the information hazard rate. The extreme value distribution of alignment scores, assessed from high-scoring segment pairs following the Karlin-Altschul model, can also be deduced from Reliability Theory applied to molecular sequences. It reflects the redundancy of information between homologous sequences under functional conservation pressure. This model also provides a link between concepts of biological sequence analysis and of systems biology.
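    The two statistical objects the abstract relies on, the type-I extreme value (Gumbel) law and the TULIP P-value bound, are compact enough to write down directly. A minimal sketch, with the 1/Z² bound taken as stated in the abstract:

```python
import math

def gumbel_sf(x, mu, beta):
    """Survival function of the type-I extreme value (Gumbel) law:
    P(X > x) = 1 - exp(-exp(-(x - mu) / beta))."""
    return 1.0 - math.exp(-math.exp(-(x - mu) / beta))

def tulip_pvalue_bound(z):
    """Upper bound on the P-value deduced from a Z-value, per the TULIP
    theorem quoted in the abstract: P <= 1 / Z**2."""
    return 1.0 / (z * z)

# A Z-value is the alignment score standardized against a Monte-Carlo
# distribution of shuffled-sequence scores: z = (score - mean) / std.
print(tulip_pvalue_bound(10.0))   # -> 0.01
```

    In practice one would estimate mu and beta from the simulated score distribution and compare the Gumbel tail probability with the coarser 1/Z² bound.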

  14. Mortality and cardiovascular diseases risk in patients with Barrett's oesophagus: a population-based nationwide cohort study.

    PubMed

    Erichsen, R; Horvath-Puho, E; Lund, J L; Dellon, E S; Shaheen, N J; Pedersen, L; Davey Smith, G; Sørensen, H T

    2017-04-01

    Patients with Barrett's oesophagus may be at increased risk of mortality overall, and cardiovascular disease has been suggested as the main underlying cause of death. To examine cause-specific mortality and risk of cardiovascular events among patients with Barrett's oesophagus. Utilising existing Danish data sources (1997-2011), we identified all patients with histologically verified Barrett's oesophagus (n = 13 435) and 123 526 members of the general population matched by age, sex and individual comorbidities. We calculated cause-specific mortality rates and incidence rates of cardiovascular diseases. We then compared rates between patients with Barrett's oesophagus and the general population comparison cohort, using stratified Cox proportional hazard regression. Patients with Barrett's oesophagus had a 71% increased risk of overall mortality. The cause-specific mortality rate per 1000 person-years for patients with Barrett's oesophagus was 8.5 for cardiovascular diseases, 14.7 for non-oesophageal cancers, and 5.4 for oesophageal cancer. Compared to the general population cohort, corresponding hazard ratios were 1.26 (95% confidence interval (CI): 1.15-1.38), 1.77 (95% CI: 1.65-1.90), and 19.4 (95% CI: 16.1-23.4), respectively. The incidence rates of cardiovascular diseases per 1000 person-years for Barrett's oesophagus patients and for persons from the general population cohort, respectively, varied from 0.4 and 0.2 for subarachnoid bleeding (hazard ratio 1.10, 95% CI: 0.87-1.39) to 8.1 and 5.9 for congestive heart failure (hazard ratio 1.33, 95% CI: 1.21-1.46). Prophylactic measures targeted at cardiovascular diseases and non-oesophageal cancers potentially could be more important than measures against oesophageal cancer, for improving prognosis among patients with Barrett's oesophagus. © 2017 John Wiley & Sons Ltd.

  15. Recent international regulations: low dose-low rate radiation protection and the demise of reason.

    PubMed

    Okkalides, Demetrios

    2008-01-01

    The radiation protection measures suggested by the International Commission on Radiological Protection (ICRP), national regulating bodies and experts have been becoming ever more strict, despite the dearth of information supporting the Linear No-Threshold (LNT) model or any adverse effects of Low Dose Low Rate (LDLR) irradiation. This tendency arises from the disproportionate response of human society to hazards that are currently in fashion, and is unreasonable. The 1 mSv/year dose limit for the public suggested by the ICRP corresponds to a 1/18,181 detriment-adjusted cancer risk, much lower than other hazards faced by modern societies, such as driving and smoking, which carry corresponding risks of 1/2,100 and 1/2,000. Even the worldwide rate of fatal work accidents is higher, at 1/8,065. Such excessive safety measures against minimal risks from man-made radiation sources divert resources from very real and much greater hazards. In addition, they undermine research and development of radiation technology and tend to subjugate science and the quest for understanding nature to phobic practices.

  16. Novel PEPA-functionalized graphene oxide for fire safety enhancement of polypropylene

    NASA Astrophysics Data System (ADS)

    You Xu, Jia; Liu, Jie; Li, Kai Dan; Miao, Lei; Tanemura, Sakae

    2015-04-01

    Polypropylene (PP) is a general-purpose plastic, but some applications are constrained by its high flammability; flame-retardant PP is therefore in urgent demand. In this article, intumescent flame retardant PP (IFRPP) composites with enhanced fire safety were prepared using 1-oxo-4-hydroxymethyl-2,6,7-trioxa-1-phosphabicyclo [2.2.2] octane (PEPA) functionalized graphene oxide (PGO) as a synergist. The PGO was prepared through a mild chemical reaction by the covalent attachment of a caged-structure organic compound, PEPA, onto GO nanosheets using toluene diisocyanate (TDI) as the intermediary agent. The novel PEPA-functionalized graphene oxide not only improves the heat resistance of GO but also renders it hydrophobic, which leads to even distribution in PP. In our case, a 7 wt% addition of PGO as one of the fillers for the IFRPP composite significantly reduces its flammability and fire hazards compared with PEPA alone, as shown by improvements in the peak heat release rate (PHRR), total heat release, peak smoke release rate (PSRR) and total smoke release, suggesting its great potential as an IFR synergist in industry. The effect is mainly attributed to the barrier action of the unburned graphene sheets, which, together with the decomposition products of PEPA and TDI, promotes the formation of graphitized carbon and inhibits heat and gas release.

  17. Relationship between chemical structure and the occupational asthma hazard of low molecular weight organic compounds

    PubMed Central

    Jarvis, J; Seed, M; Elton, R; Sawyer, L; Agius, R

    2005-01-01

    Aims: To investigate quantitatively the relationships between chemical structure and reported occupational asthma hazard for low molecular weight (LMW) organic compounds; to develop and validate a model linking asthma hazard with chemical substructure; and to generate mechanistic hypotheses that might explain the relationships. Methods: A learning dataset used 78 LMW chemical asthmagens reported in the literature before 1995, and 301 control compounds with recognised occupational exposures and hazards other than respiratory sensitisation. The chemical structures of the asthmagens and control compounds were characterised by the presence of chemical substructure fragments. Odds ratios were calculated for these fragments to determine which were associated with a likelihood of being reported as an occupational asthmagen. Logistic regression modelling was used to identify the independent contribution of these substructures. A post-1995 set of 21 asthmagens and 77 controls was selected to externally validate the model. Results: Nitrogen- or oxygen-containing functional groups such as isocyanate, amine, acid anhydride, and carbonyl were associated with an occupational asthma hazard, particularly when the functional group was present twice or more in the same molecule. A logistic regression model using only statistically significant independent variables for occupational asthma hazard correctly assigned 90% of the model development set. The external validation showed a sensitivity of 86% and specificity of 99%. Conclusions: Although a wide variety of chemical structures are associated with occupational asthma, bifunctional reactivity is strongly associated with occupational asthma hazard across a range of chemical substructures. This suggests that chemical cross-linking is an important molecular mechanism leading to the development of occupational asthma. 
The logistic regression model is freely available on the internet and may offer a useful but inexpensive adjunct to the prediction of occupational asthma hazard. PMID:15778257
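    The per-fragment odds ratios described above come from 2x2 tables of fragment presence among asthmagens versus controls. A minimal sketch with a Woolf-type confidence interval; the counts below are hypothetical illustrations, not the paper's data.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio for a substructure fragment with a Woolf-type 95% CI.

    a: asthmagens containing the fragment    b: asthmagens without it
    c: controls containing the fragment      d: controls without it
    """
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)      # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts (illustrative only): suppose 20 of 78 asthmagens and
# 10 of 301 controls carry a given fragment.
or_, lo, hi = odds_ratio_ci(20, 58, 10, 291)
```

    Fragments whose interval excludes 1 would then be candidates for the multivariable logistic regression step, which isolates their independent contributions.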

  18. Hazardous-Materials Robot

    NASA Technical Reports Server (NTRS)

    Stone, Henry W.; Edmonds, Gary O.

    1995-01-01

    Remotely controlled mobile robot used to locate, characterize, identify, and eventually mitigate incidents involving hazardous-materials spills/releases. Possesses number of innovative features, allowing it to perform mission-critical functions such as opening and unlocking doors and sensing for hazardous materials. Provides safe means for locating and identifying spills and eliminates risks of injury associated with use of manned entry teams. Current version of vehicle, called HAZBOT III, also features unique mechanical and electrical design enabling vehicle to operate safely within combustible atmosphere.

  19. Home safety measures and the risk of unintentional injury among young children: a multicentre case–control study

    PubMed Central

    LeBlanc, John C.; Pless, I. Barry; King, W. James; Bawden, Harry; Bernard-Bonnin, Anne-Claude; Klassen, Terry; Tenenbein, Milton

    2006-01-01

    Background Young children may sustain injuries when exposed to certain hazards in the home. To better understand the relation between several childproofing strategies and the risk of injuries to children in the home, we undertook a multicentre case–control study in which we compared hazards in the homes of children with and without injuries. Methods We conducted this case-control study using records from 5 pediatric hospital emergency departments for the 2-year period 1995–1996. The 351 case subjects were children aged 7 years and less who presented with injuries from falls, burns or scalds, ingestions or choking. The matched control subjects were children who presented during the same period with acute non-injury-related conditions. A home visitor, blinded to case-control status, assessed 19 injury hazards at the children's homes. Results Hazards found in the homes included baby walkers (21% of homes with infants), no functioning smoke alarm (17% of homes) and no fire extinguisher (51% of homes). Cases did not differ from controls in the mean proportion of home hazards. After controlling for siblings, maternal education and employment, we found that cases differed from controls for 5 hazards: the presence of a baby walker (odds ratio [OR] 9.0, 95% confidence interval [CI] 1.1–71.0), the presence of choking hazards within a child's reach (OR 2.0, 95% CI 1.0–3.7), no child-resistant lids in bathroom (OR 1.6, 95% CI 1.0–2.5), no smoke alarm (OR 3.2, 95% CI 1.4–7.7) and no functioning smoke alarm (OR 1.7, 95% CI 1.0–2.8). Interpretation Homes of children with injuries differed from those of children without injuries in the proportions of specific hazards for falls, choking, poisoning and burns, with a striking difference noted for the presence of a baby walker. 
In addition to counselling parents about specific hazards, clinicians should consider that the presence of some hazards may indicate an increased risk for home injuries beyond those directly related to the hazard found. Families with any home hazard may be candidates for interventions to childproof against other types of home hazards. PMID:16998079

  20. Delayed seizures after intracerebral haemorrhage

    PubMed Central

    Rattani, Abbas; Anderson, Christopher D.; Ayres, Alison M.; Gurol, Edip M.; Greenberg, Steven M.; Rosand, Jonathan; Viswanathan, Anand

    2016-01-01

    Late seizures after intracerebral haemorrhage occur after the initial acute haemorrhagic insult subsides, and represent one of its most feared long-term sequelae. Both susceptibility to late seizures and their functional impact remain poorly characterized. We sought to: (i) compare patients with new-onset late seizures (i.e. delayed seizures), with those who experienced a recurrent late seizure following an immediately post-haemorrhagic seizure; and (ii) investigate the effect of late seizures on long-term functional performance after intracerebral haemorrhage. We performed prospective longitudinal follow-up of consecutive intracerebral haemorrhage survivors presenting to a single tertiary care centre. We tested for association with seizures the following neuroimaging and genetic markers of cerebral small vessel disease: APOE variants ε2/ε4, computed tomography-defined white matter disease, magnetic resonance imaging-defined white matter hyperintensities volume and cerebral microbleeds. Cognitive performance was measured using the Modified Telephone Interview for Cognitive Status, and functional performance using structured questionnaires obtained every 6 months. We performed time-to-event analysis using separate Cox models for risk to develop delayed and recurrent seizures, as well as for functional decline risk (mortality, incident dementia, and loss of functional independence) after intracerebral haemorrhage. A total of 872 survivors of intracerebral haemorrhage were enrolled and followed for a median of 3.9 years. Early seizure developed in 86 patients, 42 of whom went on to experience recurrent seizures. Admission Glasgow Coma Scale, increasing haematoma volume and cortical involvement were associated with recurrent seizure risk (all P < 0.01). Recurrent seizures were not associated with long-term functional outcome (P = 0.67). 
Delayed seizures occurred in 37 patients, corresponding to an estimated incidence of 0.8% per year (95% confidence interval 0.5–1.2%). Factors associated with delayed seizures included cortical involvement on index haemorrhage (hazard ratio 1.63, P = 0.036), pre-haemorrhage dementia (hazard ratio 1.36, P = 0.044), history of multiple prior lobar haemorrhages (hazard ratio 2.50, P = 0.038), exclusively lobar microbleeds (hazard ratio 2.22, P = 0.008) and presence of ≥ 1 APOE ε4 copies (hazard ratio 1.95, P = 0.020). Delayed seizures were associated with worse long-term functional outcome (hazard ratio 1.83, P = 0.005), but the association was removed by adjusting for neuroimaging and genetic markers of cerebral small vessel disease. Delayed seizures after intracerebral haemorrhage are associated with different risk factors, when compared to recurrent seizures. They are also associated with worse functional outcome, but this finding appears to be related to underlying small vessel disease. Further investigations into the connections between small vessel disease and delayed seizures are warranted. PMID:27497491
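    An incidence figure like "0.8% per year" is events divided by person-years of follow-up. The sketch below uses a log-normal approximate confidence interval and an assumed person-years denominator inferred from the reported rate (the abstract does not report it), so the interval will not exactly match the study's.

```python
import math

def poisson_rate_ci(events, person_years, z=1.96):
    """Incidence rate with a log-normal approximate CI:
    rate = events / person_years, CI = rate * exp(+/- z / sqrt(events))."""
    rate = events / person_years
    half = z / math.sqrt(events)
    return rate, rate * math.exp(-half), rate * math.exp(half)

# 37 delayed-seizure events over an ASSUMED ~4,600 person-years of follow-up
# (denominator inferred from the reported 0.8%/year, not given in the abstract):
rate, lo, hi = poisson_rate_ci(37, 4600)
print(f"{100*rate:.1f}% per year ({100*lo:.1f}-{100*hi:.1f})")
```

    With more events the interval tightens as 1/sqrt(events); exact Poisson limits, as often reported in publications, differ slightly from this approximation.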

  1. Functional form diagnostics for Cox's proportional hazards model.

    PubMed

    León, Larry F; Tsai, Chih-Ling

    2004-03-01

    We propose a new type of residual and an easily computed functional form test for the Cox proportional hazards model. The proposed test is a modification of the omnibus test for testing the overall fit of a parametric regression model, developed by Stute, González Manteiga, and Presedo Quindimil (1998, Journal of the American Statistical Association 93, 141-149), and is based on what we call censoring consistent residuals. In addition, we develop residual plots that can be used to identify the correct functional forms of covariates. We compare our test with the functional form test of Lin, Wei, and Ying (1993, Biometrika 80, 557-572) in a simulation study. The practical application of the proposed residuals and functional form test is illustrated using both a simulated data set and a real data set.

  2. Cardiovascular and other causes of death as a function of lifestyle habits in a quasi extinct middle-aged male population. A 50-year follow-up study.

    PubMed

    Menotti, Alessandro; Puddu, Paolo Emilio; Maiani, Giuseppe; Catasta, Giovina

    2016-05-01

    To relate major causes of death to lifestyle habits in an almost extinct middle-aged male population, a population of 1712 males aged 40-59 was examined and followed up for 50 years. Baseline smoking habits, physical activity at work and dietary habits were related to 50-year mortality, subdivided into 12 simple and 3 composite causes of death, using Cox proportional hazards models. Duration of survival was related to the same characteristics by a multiple linear regression model. The death rate over 50 years was 97.5%. Of the 12 simple groups of causes of death, 6 were related to smoking habits, 3 to physical activity and 4 to dietary habits. Among the composite groups of causes of death, hazard ratios (and their 95% confidence limits) for never smokers versus smokers were 0.68 (0.57-0.81) for major cardiovascular diseases, 0.65 (0.52-0.81) for all cancers, and 0.72 (0.64-0.81) for all-cause deaths. Hazard ratios for vigorous physical activity at work versus sedentary work were 0.63 (0.49-0.80) for major cardiovascular diseases, 1.01 (0.72-1.41) for all cancers, and 0.76 (0.64-0.90) for all-cause deaths. Hazard ratios for a Mediterranean diet versus a non-Mediterranean diet were 0.68 (0.54-0.86) for major cardiovascular diseases, 0.54 (0.40-0.73) for all cancers, and 0.67 (0.57-0.78) for all-cause deaths. Life expectancy was 12 years longer for men with the 3 best behaviors than for those with the 3 worst behaviors. Some lifestyle habits are strongly related to lifetime mortality. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  3. Hemoglobin level in older persons and incident Alzheimer disease: prospective cohort analysis.

    PubMed

    Shah, R C; Buchman, A S; Wilson, R S; Leurgans, S E; Bennett, D A

    2011-07-19

    To test the hypothesis that level of hemoglobin is associated with incident Alzheimer disease (AD). A total of 881 community-dwelling older persons participating in the Rush Memory and Aging Project, without dementia and with a measure of hemoglobin level, underwent annual cognitive assessments and clinical evaluations for AD. During an average of 3.3 years of follow-up, 113 persons developed AD. In a Cox proportional hazards model adjusted for age, sex, and education, there was a nonlinear relationship between baseline hemoglobin level and AD risk, such that both higher and lower levels of hemoglobin were associated with increased risk (hazard ratio [HR] for the quadratic of hemoglobin 1.06, 95% confidence interval [CI] 1.01-1.11). Findings were unchanged after controlling for multiple covariates. When compared to participants with clinically normal hemoglobin (n = 717), participants with anemia (n = 154) had a 60% increased hazard of developing AD (95% CI 1.02-2.52), as did participants with clinically high hemoglobin (n = 10, HR 3.39, 95% CI 1.25-9.20). Linear mixed-effects models showed that lower and higher hemoglobin levels were associated with a greater rate of global cognitive decline (parameter estimate for quadratic of hemoglobin = -0.008, SE 0.002, p < 0.001). Compared to participants with clinically normal hemoglobin, participants with anemia had a -0.061 z score unit annual decline in global cognitive function (SE 0.012, p < 0.001), as did participants with clinically high hemoglobin (-0.090 unit/year, SE 0.038, p = 0.018). In older persons without dementia, both lower and higher hemoglobin levels are associated with an increased hazard of developing AD and more rapid cognitive decline.
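    A quadratic hemoglobin term with a positive coefficient makes the log-hazard a parabola, so risk rises on both sides of a nadir. The coefficients below are hypothetical (the abstract reports only the HR of 1.06 for the quadratic term); the sketch only illustrates how such a U-shaped hazard ratio is evaluated.

```python
import math

def u_shaped_hr(h, b1, b2, h_ref):
    """Hazard ratio at hemoglobin level h relative to h_ref under a Cox
    model with linear + quadratic terms:
    HR = exp(b1*(h - h_ref) + b2*(h**2 - h_ref**2)).
    Coefficients b1, b2 here are hypothetical, not the study's estimates."""
    return math.exp(b1 * (h - h_ref) + b2 * (h ** 2 - h_ref ** 2))

# With b2 > 0 the log-hazard is a parabola with its nadir at h = -b1/(2*b2);
# risk increases on both sides of that minimum.
b1, b2 = -1.62, 0.058            # hypothetical values: nadir near 14 g/dL
nadir = -b1 / (2 * b2)
low  = u_shaped_hr(11.0, b1, b2, nadir)   # anemic range -> HR above 1
high = u_shaped_hr(17.0, b1, b2, nadir)   # clinically high range -> HR above 1
```

    This is why the study can report elevated hazards for both anemia and clinically high hemoglobin from a single fitted model.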

  4. Synopsis of Precision Landing and Hazard Avoidance (PL&HA) Capabilities for Space Exploration

    NASA Technical Reports Server (NTRS)

    Robertson, Edward A.

    2017-01-01

    Until recently, robotic exploration missions to the Moon, Mars, and other solar system bodies relied upon controlled blind landings. Because terrestrial techniques for terrain relative navigation (TRN) had not yet been evolved to support space exploration, landing dispersions were driven by the capabilities of inertial navigation systems combined with surface relative altimetry and velocimetry. Lacking tight control over the actual landing location, mission success depended on the statistical vetting of candidate landing areas within the predicted landing dispersion ellipse based on orbital reconnaissance data, combined with the ability of the spacecraft to execute a controlled landing in terms of touchdown attitude, attitude rates, and velocity. In addition, the sensors, algorithms, and processing technologies required to perform autonomous hazard detection and avoidance in real time during the landing sequence were not yet available. Over the past decade, NASA has invested substantial resources on the development, integration, and testing of autonomous precision landing and hazard avoidance (PL&HA) capabilities. In addition to substantially improving landing accuracy and safety, these autonomous PL&HA functions also offer access to targets of interest located within more rugged and hazardous terrain. Optical TRN systems are baselined on upcoming robotic landing missions to the Moon and Mars, and NASA JPL is investigating the development of a comprehensive PL&HA system for a Europa lander. These robotic missions will demonstrate and mature PL&HA technologies that are considered essential for future human exploration missions. PL&HA technologies also have applications to rendezvous and docking/berthing with other spacecraft, as well as proximity navigation, contact, and retrieval missions to smaller bodies with microgravity environments, such as asteroids.

  5. Hemoglobin level in older persons and incident Alzheimer disease

    PubMed Central

    Buchman, A.S.; Wilson, R.S.; Leurgans, S.E.; Bennett, D.A.

    2011-01-01

Objective: To test the hypothesis that level of hemoglobin is associated with incident Alzheimer disease (AD). Methods: A total of 881 community-dwelling older persons participating in the Rush Memory and Aging Project, without dementia and with a measure of hemoglobin level, underwent annual cognitive assessments and clinical evaluations for AD. Results: During an average of 3.3 years of follow-up, 113 persons developed AD. In a Cox proportional hazards model adjusted for age, sex, and education, there was a nonlinear relationship between baseline hemoglobin level and AD risk, such that both higher and lower levels of hemoglobin were associated with increased AD risk (hazard ratio [HR] for the quadratic of hemoglobin 1.06, 95% confidence interval [CI] 1.01–1.11). Findings were unchanged after controlling for multiple covariates. When compared to participants with clinically normal hemoglobin (n = 717), participants with anemia (n = 154) had a 60% increased hazard of developing AD (95% CI 1.02–2.52), as did participants with clinically high hemoglobin (n = 10, HR 3.39, 95% CI 1.25–9.20). Linear mixed-effects models showed that lower and higher hemoglobin levels were associated with a greater rate of global cognitive decline (parameter estimate for quadratic of hemoglobin = −0.008, SE 0.002, p < 0.001). Compared to participants with clinically normal hemoglobin, participants with anemia had a −0.061 z score unit annual decline in global cognitive function (SE 0.012, p < 0.001), as did participants with clinically high hemoglobin (−0.090 unit/year, SE 0.038, p = 0.018). Conclusions: In older persons without dementia, both lower and higher hemoglobin levels are associated with an increased hazard of developing AD and more rapid cognitive decline. PMID:21753176
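The quadratic hemoglobin term in the Cox model above implies a U-shaped risk profile. A minimal sketch, using the reported quadratic HR of 1.06 and a hypothetical linear coefficient chosen so that risk is lowest at a normal hemoglobin of 14 g/dL (an assumption for illustration, not a value from the paper):

```python
import math

# Quadratic covariate in a Cox model: log relative hazard = b1*h + b2*h**2.
# b2 = ln(1.06) comes from the reported HR for the quadratic term; b1 is a
# hypothetical value chosen so the parabola's minimum (-b1 / (2*b2)) falls
# at a normal hemoglobin of 14 g/dL.
b2 = math.log(1.06)
b1 = -2.0 * b2 * 14.0

def relative_hazard(h, ref=14.0):
    """Hazard relative to the reference hemoglobin level (g/dL)."""
    lp = b1 * h + b2 * h * h
    lp_ref = b1 * ref + b2 * ref * ref
    return math.exp(lp - lp_ref)

# Both low (anemic) and high hemoglobin raise the hazard relative to normal:
for h in (10.0, 14.0, 18.0):
    print(f"hgb {h:4.1f} g/dL -> relative hazard {relative_hazard(h):.2f}")
```

The U-shape follows directly from the positive quadratic coefficient; the actual coefficients of the fitted model are not reported in the abstract.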

  6. 40 CFR 1600.2 - Organization.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 32 2010-07-01 2010-07-01 false Organization. 1600.2 Section 1600.2 Protection of Environment CHEMICAL SAFETY AND HAZARD INVESTIGATION BOARD ORGANIZATION AND FUNCTIONS OF THE CHEMICAL SAFETY AND HAZARD INVESTIGATION BOARD § 1600.2 Organization. (a) The CSB's Board consists of five...

  7. 40 CFR 1600.6 - Office location.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 32 2010-07-01 2010-07-01 false Office location. 1600.6 Section 1600.6 Protection of Environment CHEMICAL SAFETY AND HAZARD INVESTIGATION BOARD ORGANIZATION AND FUNCTIONS OF THE CHEMICAL SAFETY AND HAZARD INVESTIGATION BOARD § 1600.6 Office location. The principal offices of the...

  8. Simulation-Based Probabilistic Tsunami Hazard Analysis: Empirical and Robust Hazard Predictions

    NASA Astrophysics Data System (ADS)

    De Risi, Raffaele; Goda, Katsuichiro

    2017-08-01

    Probabilistic tsunami hazard analysis (PTHA) is the prerequisite for rigorous risk assessment and thus for decision-making regarding risk mitigation strategies. This paper proposes a new simulation-based methodology for tsunami hazard assessment for a specific site of an engineering project along the coast, or, more broadly, for a wider tsunami-prone region. The methodology incorporates numerous uncertain parameters that are related to geophysical processes by adopting new scaling relationships for tsunamigenic seismic regions. Through the proposed methodology it is possible to obtain either a tsunami hazard curve for a single location, that is the representation of a tsunami intensity measure (such as inundation depth) versus its mean annual rate of occurrence, or tsunami hazard maps, representing the expected tsunami intensity measures within a geographical area, for a specific probability of occurrence in a given time window. In addition to the conventional tsunami hazard curve that is based on an empirical statistical representation of the simulation-based PTHA results, this study presents a robust tsunami hazard curve, which is based on a Bayesian fitting methodology. The robust approach allows a significant reduction of the number of simulations and, therefore, a reduction of the computational effort. Both methods produce a central estimate of the hazard as well as a confidence interval, facilitating the rigorous quantification of the hazard uncertainties.
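The tsunami hazard curve described above (intensity measure versus mean annual rate of exceedance) can be sketched empirically; the scenario depths and rates below are hypothetical placeholders, not values from the paper:

```python
import math

# Toy empirical hazard curve from simulation-based PTHA (hypothetical data):
# each scenario has a simulated inundation depth (m) and a mean annual rate.
scenarios = [(0.5, 1 / 50), (1.0, 1 / 100), (2.0, 1 / 500), (4.0, 1 / 2000)]

def exceedance_rate(im):
    """Mean annual rate of the tsunami intensity measure exceeding `im`."""
    return sum(rate for depth, rate in scenarios if depth > im)

def exceedance_prob(im, years):
    """Poisson probability of at least one exceedance of `im` in `years`."""
    return 1.0 - math.exp(-exceedance_rate(im) * years)

# Hazard curve: the rate of exceedance decreases with inundation depth.
for im in (0.0, 0.5, 1.0, 2.0):
    print(f"depth > {im:3.1f} m: rate {exceedance_rate(im):.4f}/yr, "
          f"P(50 yr) = {exceedance_prob(im, 50):.3f}")
```

The robust Bayesian fitting in the paper replaces this direct counting with a parametric fit to the simulated exceedance rates, which is what allows the reduction in the number of simulations.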

  9. Drought impact functions as intermediate step towards drought damage assessment

    NASA Astrophysics Data System (ADS)

    Bachmair, Sophie; Svensson, Cecilia; Prosdocimi, Ilaria; Hannaford, Jamie; Helm Smith, Kelly; Svoboda, Mark; Stahl, Kerstin

    2016-04-01

While damage or vulnerability functions for floods and seismic hazards have gained considerable attention, there is comparably little knowledge on drought damage or loss. On the one hand, this is due to the complexity of the drought hazard, which affects different domains of the hydrological cycle and different sectors of human activity. Hence, a single hazard indicator is likely not able to fully capture this multifaceted hazard. On the other hand, drought impacts are often non-structural and hard to quantify or monetize. Examples are impaired navigability of streams, restrictions on domestic water use, reduced hydropower production, reduced tree growth, and irreversible deterioration/loss of wetlands. Apart from reduced crop yield, data about drought damage or loss with adequate spatial and temporal resolution are scarce, making the development of drought damage functions difficult. As an intermediate step towards drought damage functions we exploit text-based reports on drought impacts from the European Drought Impact report Inventory and the US Drought Impact Reporter to derive surrogate information for drought damage or loss. First, text-based information on drought impacts is converted into time series of absence versus presence of impacts, or number of impact occurrences. Second, meaningful hydro-meteorological indicators characterizing drought intensity are identified. Third, different statistical models are tested as link functions relating drought hazard indicators with drought impacts: 1) logistic regression for drought impacts coded as a binary response variable; and 2) mixture/hurdle models (zero-inflated/zero-altered negative binomial regression) and an ensemble regression tree approach for modeling the number of drought impact occurrences. 
Testing the predictability of (number of) drought impact occurrences based on cross-validation revealed a good agreement between observed and modeled (number of) impacts for regions at the scale of federal states or provinces with good data availability. Impact functions representing localized drought impacts are more challenging to construct given that less data is available, yet may provide information that more directly addresses stakeholders' needs. Overall, our study contributes insights into how drought intensity translates into ecological and socioeconomic impacts, and how such information may be used for enhancing drought monitoring and early warning.

  10. Seismic hazard in Hawaii: High rate of large earthquakes and probabilistic ground-motion maps

    USGS Publications Warehouse

    Klein, F.W.; Frankel, A.D.; Mueller, C.S.; Wesson, R.L.; Okubo, P.G.

    2001-01-01

    The seismic hazard and earthquake occurrence rates in Hawaii are locally as high as those near the most hazardous faults elsewhere in the United States. We have generated maps of peak ground acceleration (PGA) and spectral acceleration (SA) (at 0.2, 0.3 and 1.0 sec, 5% critical damping) at 2% and 10% exceedance probabilities in 50 years. The highest hazard is on the south side of Hawaii Island, as indicated by the MI 7.0, MS 7.2, and MI 7.9 earthquakes, which occurred there since 1868. Probabilistic values of horizontal PGA (2% in 50 years) on Hawaii's south coast exceed 1.75g. Because some large earthquake aftershock zones and the geometry of flank blocks slipping on subhorizontal decollement faults are known, we use a combination of spatially uniform sources in active flank blocks and smoothed seismicity in other areas to model seismicity. Rates of earthquakes are derived from magnitude distributions of the modern (1959-1997) catalog of the Hawaiian Volcano Observatory's seismic network supplemented by the historic (1868-1959) catalog. Modern magnitudes are ML measured on a Wood-Anderson seismograph or MS. Historic magnitudes may add ML measured on a Milne-Shaw or Bosch-Omori seismograph or MI derived from calibrated areas of MM intensities. Active flank areas, which by far account for the highest hazard, are characterized by distributions with b slopes of about 1.0 below M 5.0 and about 0.6 above M 5.0. The kinked distribution means that large earthquake rates would be grossly underestimated by extrapolating small earthquake rates, and that longer catalogs are essential for estimating or verifying the rates of large earthquakes. Flank earthquakes thus follow a semicharacteristic model, which is a combination of background seismicity and an excess number of large earthquakes. 
Flank earthquakes are geometrically confined to rupture zones on the volcano flanks by barriers such as rift zones and the seaward edge of the volcano, which may be expressed by a magnitude distribution similar to that including characteristic earthquakes. The island chain northwest of Hawaii Island is seismically and volcanically much less active. We model its seismic hazard with a combination of a linearly decaying ramp fit to the cataloged seismicity and spatially smoothed seismicity with a smoothing half-width of 10 km. We use a combination of up to four attenuation relations for each map because for either PGA or SA, there is no single relation that represents ground motion for all distance and magnitude ranges. Great slumps and landslides visible on the ocean floor correspond to catastrophes with effective energy magnitudes ME above 8.0. A crude estimate of their frequency suggests that the probabilistic earthquake hazard is at least an order of magnitude higher for flank earthquakes than that from submarine slumps.
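The kinked magnitude distribution (b ≈ 1.0 below M 5.0, b ≈ 0.6 above) can be illustrated with a short sketch; the a-value is a hypothetical placeholder, and the point is only how far a naive extrapolation of the small-earthquake slope underestimates large-earthquake rates:

```python
# Kinked magnitude-frequency distribution (hypothetical a-value): cumulative
# annual rate N(>=M) follows Gutenberg-Richter 10**(a - b*M) with b = 1.0
# below M 5.0 and b = 0.6 above, joined continuously at the kink.
A, M_KINK, B_LO, B_HI = 4.0, 5.0, 1.0, 0.6

def rate_kinked(m):
    """Cumulative annual rate of earthquakes with magnitude >= m."""
    if m <= M_KINK:
        return 10 ** (A - B_LO * m)
    # continue from the rate at the kink with the shallower slope
    return 10 ** (A - B_LO * M_KINK - B_HI * (m - M_KINK))

def rate_extrapolated(m):
    """Naive extrapolation of the small-earthquake slope to all magnitudes."""
    return 10 ** (A - B_LO * m)

# Extrapolating b = 1.0 past the kink underestimates M 7 rates by 10**0.8,
# roughly a factor of six, regardless of the assumed a-value:
print(rate_kinked(7.0) / rate_extrapolated(7.0))
```

The underestimation factor 10^((B_LO − B_HI)·(M − M_KINK)) grows with magnitude, which is why the abstract stresses that longer catalogs are essential for verifying large-earthquake rates.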

  11. Expected hazards and hospital beds in host cities of the 2014 FIFA World Cup in Brazil.

    PubMed

    Miranda, Elaine Silva; Shoaf, Kimberley; Silva, Raulino Sabino da; Freitas, Carolina Figueiredo; Osorio-de-Castro, Claudia Garcia Serpa

    2017-06-12

    Planning for mass gatherings involves health system preparedness based on an understanding of natural and technological hazards identified through prior risk assessment. We present the expected hazards reported by health administrators of the host cities for the 2014 FIFA World Cup in Brazil and discuss the hazards considering minimal available public hospital beds in the 12 cities at the time of the event. Four different groups of respondents were interviewed: pharmaceutical service administrators and overall health administrators at both the municipal and hospital levels. The hospital bed occupancy rate was calculated, based on the Brazilian Health Informatics Department (DATASUS). The number of surplus beds was calculated using parameters from the literature regarding surge and mass casualty needs and number of unoccupied beds. In all groups, physical injuries ranked first, followed by emerging and endemic diseases. Baseline occupancy rates were high (95%CI: 0.93-2.19) in all 12 cities. Total shortage, considering all the cities, ranged from -47,670 (for surges) to -60,569 beds (for mass casualties). The study can contribute to discussions on mass-gathering preparedness.

  12. Fault Specific Seismic Hazard Maps as Input to Loss Reserves Calculation for Attica Buildings

    NASA Astrophysics Data System (ADS)

    Deligiannakis, Georgios; Papanikolaou, Ioannis; Zimbidis, Alexandros; Roberts, Gerald

    2014-05-01

    Greece is prone to various natural disasters, such as wildfires, floods, landslides and earthquakes, due to the special environmental and geological conditions prevailing at tectonic plate boundaries. Earthquakes are the predominant risk, in terms of damages and casualties, in the Greek territory. The historical record of earthquakes in Greece has been published by various researchers, providing useful data for seismic hazard assessment of Greece. However, the completeness of the historical record in Greece, despite being one of the longest worldwide, reaches only 500 years for M ≥ 7.3 and less than 200 years for M ≥ 6.5. Considering that active faults in the area have recurrence intervals of a few hundred to several thousand years, it is clear that many active faults have not been activated during the completeness period covered by the historical records. New seismic hazard assessment methodologies tend to follow fault-specific approaches, where seismic sources are geologically constrained active faults, in order to address problems related to the incompleteness of the historical record, obtain higher spatial resolution, and calculate realistic source-to-site distances, since seismic sources are very accurately located. Fault-specific approaches provide quantitative assessments, as they measure fault slip rates from geological data, providing a more reliable estimate of seismic hazard. We used a fault-specific seismic hazard assessment approach for the region of Attica. The method of seismic hazard mapping from geological fault throw-rate data combined three major factors: (1) empirical relationships linking fault rupture length, earthquake magnitude and coseismic slip; (2) the radii of the VI, VII, VIII and IX isoseismals on the Modified Mercalli (MM) intensity scale; and (3) attenuation-amplification functions for seismic shaking on bedrock compared to basin-filling sediments. 
We explicitly modeled 22 active faults that could affect the region of Attica, including Athens, using detailed data derived from published papers, neotectonic maps and fieldwork observations. Moreover, we incorporated background seismicity models from the historic record and also the distribution of subduction zone earthquakes, for the integration of strong deep earthquakes that could also affect the Attica region. We created 4 high spatial resolution seismic hazard maps for the region of Attica, one for each of the intensities VII - X (MM). These maps offer a locality-specific shaking recurrence record, which represents the long-term shaking record in a more complete way, since they incorporate several seismic cycles of the active faults that could affect Attica. Each one of these high resolution seismic hazard maps displays both the spatial distribution and the recurrence, over a specific time period, of the relevant intensity. Time-independent probabilities were extracted based on these average recurrence intervals, using the stationary Poisson model P = 1 - e^(-Λt). The Λ value was provided by the recurrence of the intensities, as displayed in the seismic hazard maps. However, insurance contracts usually lack detailed spatial information and refer to the Postal Code level, akin to CRESTA zones. To this end, a time-independent probability of shaking at intensities VII - X was calculated for every Postal Code, for a given time period, using the Poisson model. The reserves calculation on a buildings portfolio combines the probability of events of specific intensities within the Postal Codes with the buildings' characteristics, such as the building construction type and the insured value. 
We propose a standard approach for the reserves calculation K(t) for a specific time period: K(t) = x2 · [x1·y1·P1(t) + x1·y2·P2(t) + x1·y3·P3(t) + x1·y4·P4(t)], which is a function of the probabilities of occurrence of the seismic intensities VII - X (P1(t) - P4(t)) over the same period, the value of the building x1, the insured value x2, and the characteristics of the building, such as the construction type, age, height and use of property (y1 - y4). Furthermore, a stochastic approach is also adopted in order to obtain the relevant reserve value K(t) for the specific time period. This calculation draws a set of simulations from the Poisson random variable and then takes the respective expectations.
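The reserves formula above can be sketched numerically, combining the Poisson probabilities P = 1 - e^(-Λt) with building-specific weights; all rates, values and weights below are hypothetical placeholders, not calibrated figures from the study:

```python
import math

# Sketch of the reserves calculation K(t): P_i(t) are time-independent
# Poisson probabilities 1 - exp(-lam_i * t) of shaking at intensities
# VII-X in the postal code, x1 the building value, x2 the insured value,
# and y1-y4 weights for the building characteristics. All numeric inputs
# below are hypothetical.
def poisson_prob(lam, t):
    """Probability of at least one event of annual rate `lam` within t years."""
    return 1.0 - math.exp(-lam * t)

def reserves(t, x1, x2, lams, ys):
    """K(t) = x2 * sum_i x1 * y_i * P_i(t)."""
    return x2 * sum(x1 * y * poisson_prob(lam, t) for lam, y in zip(lams, ys))

# Hypothetical annual recurrence rates of intensities VII-X and weights:
lams = (1 / 50, 1 / 200, 1 / 800, 1 / 3000)
ys = (0.02, 0.05, 0.15, 0.40)            # damage weights by intensity
k = reserves(10, x1=1.0, x2=200_000, lams=lams, ys=ys)
print(f"reserve over 10 years: {k:,.0f}")
```

Here x1 is normalized to 1.0 so the reserve scales directly with the insured value x2; the stochastic variant mentioned in the abstract would replace each P_i(t) with simulated Poisson event counts and average the resulting reserves.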

  13. Evaluation of a Home-Based Environmental and Educational Intervention to Improve Health in Vulnerable Households: Southeastern Pennsylvania Lead and Healthy Homes Program

    PubMed Central

    Mankikar, Deepa; Campbell, Carla; Greenberg, Rachael

    2016-01-01

    This evaluation examined whether participation in a home-based environmental educational intervention would reduce exposure to health and safety hazards and asthma-related medical visits. The home intervention program focused on vulnerable, low-income households, where children had asthma, were at risk for lead poisoning, or faced multiple unsafe housing conditions. Home visitors conducted two home visits, two months apart, consisting of an environmental home assessment, Healthy Homes education, and distribution of Healthy Homes supplies. Measured outcomes included changes in participant knowledge and awareness of environmental home-based hazards, rate of children’s asthma-related medical use, and the presence of asthma triggers and safety hazards. Analysis of 2013–2014 baseline and post-intervention program data for a cohort of 150 families revealed a significantly lower three-month rate (p < 0.05) of children’s asthma-related doctor visits and hospital admissions at program completion. In addition, there were significantly reduced reports of the presence of home-based hazards, including basement or roof leaks (p = 0.011), plumbing leaks (p = 0.019), and use of an oven to heat the home (p < 0.001). Participants’ pre- and post- test scores showed significant improvement (p < 0.05) in knowledge and awareness of home hazards. Comprehensive home interventions may effectively reduce environmental home hazards and improve the health of asthmatic children in the short term. PMID:27618087

  14. Harvesting rockfall hazard evaluation parameters from Google Earth Street View

    NASA Astrophysics Data System (ADS)

    Partsinevelos, Panagiotis; Agioutantis, Zacharias; Tripolitsiotis, Achilles; Steiakakis, Chrysanthos; Mertikas, Stelios

    2015-04-01

    Rockfall incidents along highways and railways prove extremely dangerous for properties, infrastructure and human lives. Several qualitative metrics, such as the Rockfall Hazard Rating System (RHRS) and the Colorado Rockfall Hazard Rating System (CRHRS), have been established to estimate rockfall potential and provide risk maps in order to control and monitor rockfall incidents. The implementation of such metrics for efficient and reliable risk modeling requires accurate knowledge of multi-parametric attributes such as the geological, geotechnical and topographic parameters of the study area. The Missouri Rockfall Hazard Rating System (MORH RS) identifies the most potentially problematic areas using digital video logging for the determination of parameters like slope height and angle, face irregularities, etc. This study aims to harvest, in a semi-automated approach, geometric and qualitative measures through open source platforms that may provide 3-dimensional views of the areas of interest. More specifically, the Street View platform from Google Maps is hereby used to provide essential information that can be used towards 3-dimensional reconstruction of slopes along highways. The potential of image capturing along a programmable virtual route to provide the input data for photogrammetric processing is also evaluated. Moreover, qualitative characterization of the geological and geotechnical status, based on the Street View images, is performed. These attributes are then integrated to deliver a GIS-based rockfall hazard map. The 3-dimensional models are compared to actual photogrammetric measures in a rockfall-prone area in Crete, Greece, while in-situ geotechnical characterization is also used to compare and validate the hazard risk. 
This work is considered the first step towards the exploitation of open source platforms to improve road safety and the development of an operational system where authorized agencies (i.e., civil protection) will be able to acquire near-real-time hazard maps based on video images retrieved from open source platforms, operational unmanned aerial vehicles, and/or simple video recordings from users. This work has been performed under the framework of the "Cooperation 2011" project ISTRIA (11_SYN_9_13989) funded from the Operational Program "Competitiveness and Entrepreneurship" (co-funded by the European Regional Development Fund (ERDF)) and managed by the Greek General Secretariat for Research and Technology.

  15. In-Flight Decision-Making by General Aviation Pilots Operating in Areas of Extreme Thunderstorms.

    PubMed

    Boyd, Douglas D

    2017-12-01

    General aviation (comprised mainly of noncommercial, light aircraft) accounts for 94% of civil aviation fatalities in the United States. Although thunderstorms are hazardous to light aircraft, little research has been undertaken on in-flight pilot decision-making regarding their avoidance. The study objectives were: 1) to determine if the thunderstorm accident rate has declined over the last two decades; and 2) to assess in-flight (en route/landing) airman decision-making regarding adherence to FAA separation minima from thunderstorms. Thunderstorm-related accidents were identified from the NTSB database. To determine the real-time thunderstorm proximity/relative position of en route/arriving aircraft, airplane locations from a flight-tracking (FlightAware®) website were overlaid on a graphical weather image. Statistics employed Poisson and Chi-squared analyses. The thunderstorm-related accident rate was undiminished over the 1996-2014 period. In a prospective analysis, the majority (en route 77%, landing 93%) of flights violated the FAA-recommended separation distance from extreme convection. Of these, 79% and 69% (en route and landing, respectively) selected a route downwind of the thunderstorm rather than a less hazardous upwind flight path. Using a mathematical product of binary (separation distance, relative aircraft-thunderstorm position) and nominal (thunderstorm-free egress area) parameters, airmen were more likely to operate in the thunderstorm hazard zone for landings than for en route operations. The thunderstorm-related accident rate, carrying a 70% fatality rate, remains unabated, largely reflecting nonadherence to the FAA-recommended separation minima and selection of a more hazardous route (downwind) for circumnavigation of extreme convective weather. These findings argue for additional emphasis in ab initio pilot training/recurrency on thunderstorm hazards and safe practices (separation distance and flight path). Boyd DD. 
In-flight decision-making by general aviation pilots operating in areas of extreme thunderstorms. Aerosp Med Hum Perform. 2017; 88(12):1066-1072.

  16. Variation in Major Depressive Disorder Onset by Place of Origin Among U.S. Latinos.

    PubMed

    Lee, Sungkyu; Park, Yangjin

    2017-09-01

    Using a nationally representative sample of 2514 U.S. Latinos, this study examined the extent to which major depressive disorder (MDD) onset differs by place of origin and the factors associated with it. The Kaplan-Meier method estimated the survival and hazard functions for MDD onset by place of origin, and Cox proportional hazards models identified its associative factors. Approximately 13% of the sample had experienced MDD in their lifetimes. Cuban respondents showed the highest survival function, while Puerto Ricans showed the lowest. With the entire sample, the smoothed hazard function showed that the risk of MDD onset peaked in the late 20s and early 80s. Puerto Rican respondents showed the highest risk of MDD during their 20s and 30s, whereas Cuban respondents showed a relatively stable pattern over time. The results from the Cox proportional hazards model indicated that age, sex, and marital status were significantly related to MDD onset (p < .05). In addition, the effect of U.S.-born status on MDD onset was greater among Mexican respondents than among Puerto Ricans. Findings from the present study demonstrate that different Latino subgroups experience different and unique patterns of MDD onset over time. Future research should account for the role of immigration status in examining MDD onset.
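The Kaplan-Meier estimator used above can be sketched in a few lines; the ages below are hypothetical, purely to illustrate the survival-function calculation with right-censoring:

```python
# Minimal Kaplan-Meier survival estimator (illustrative, not the study's
# data): times are ages at MDD onset for events, or ages at interview for
# right-censored respondents who never experienced MDD.
def kaplan_meier(times, events):
    """Return [(time, S(t))] at each observed event time."""
    data = sorted(zip(times, events))       # ties sort censored (0) first
    n_at_risk = len(data)
    surv, out = 1.0, []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = sum(1 for tt, e in data if tt == t and e)
        ties = sum(1 for tt, _ in data if tt == t)
        if deaths:
            surv *= 1.0 - deaths / n_at_risk
            out.append((t, surv))
        n_at_risk -= ties                   # drop everyone observed at t
        i += ties
    return out

# Five hypothetical respondents: onset at 22, 25, 31; censored at 28 and 40.
curve = kaplan_meier([22, 25, 28, 31, 40], [1, 1, 0, 1, 0])
print(curve)
```

The Cox proportional hazards model in the study goes one step further, relating this survival experience to covariates (age, sex, marital status) through a multiplicative effect on the hazard.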

  17. Faster Blood Flow Rate Does Not Improve Circuit Life in Continuous Renal Replacement Therapy: A Randomized Controlled Trial.

    PubMed

    Fealy, Nigel; Aitken, Leanne; du Toit, Eugene; Lo, Serigne; Baldwin, Ian

    2017-10-01

    To determine whether blood flow rate influences circuit life in continuous renal replacement therapy. Prospective randomized controlled trial. Single-center tertiary-level ICU. Critically ill adults requiring continuous renal replacement therapy. Patients were randomized to receive one of two blood flow rates: 150 or 250 mL/min. The primary outcome was circuit life measured in hours. Circuit and patient data were collected until each circuit clotted or was ceased electively for nonclotting reasons. Data for clotted circuits are presented as median (interquartile range) and compared using the Mann-Whitney U test. Survival probability for clotted circuits was compared using the log-rank test. Circuit clotting data were analyzed for repeated events using hazard ratios. One hundred patients were randomized, with 96 completing the study (150 mL/min, n = 49; 250 mL/min, n = 47) using 462 circuits (245 run at 150 mL/min and 217 run at 250 mL/min). Median circuit life for the first circuit (clotted) was similar for both groups (150 mL/min: 9.1 hr [5.5-26 hr] vs 10 hr [4.2-17 hr]; p = 0.37). Continuous renal replacement therapy using a blood flow rate set at 250 mL/min was not more likely to cause clotting compared with 150 mL/min (hazard ratio, 1.00 [0.60-1.69]; p = 0.68). Gender, body mass index, weight, vascular access type, length, site, and mode of continuous renal replacement therapy or international normalized ratio had no effect on clotting risk. Continuous renal replacement therapy without anticoagulation was more likely to cause clotting compared with use of heparin strategies (hazard ratio, 1.62; p = 0.003). A longer activated partial thromboplastin time (hazard ratio, 0.98; p = 0.002) was associated with a reduced likelihood of circuit clotting, whereas a decreased platelet count (hazard ratio, 1.19; p = 0.03) was associated with an increased likelihood. There was no difference in circuit life whether using blood flow rates of 250 or 150 mL/min during continuous renal replacement therapy.

  18. Mortality Measurement at Advanced Ages: A Study of the Social Security Administration Death Master File

    PubMed Central

    Gavrilov, Leonid A.; Gavrilova, Natalia S.

    2011-01-01

    Accurate estimates of mortality at advanced ages are essential to improving forecasts of mortality and the population size of the oldest old age group. However, estimation of hazard rates at extremely old ages poses serious challenges to researchers: (1) the observed mortality deceleration may be at least partially an artifact of mixing different birth cohorts with different mortality (heterogeneity effect); (2) standard assumptions of hazard rate estimates may be invalid when risk of death is extremely high at old ages; and (3) ages of very old people may be exaggerated. One way of obtaining estimates of mortality at extreme ages is to pool together international records of persons surviving to extreme ages with subsequent efforts of strict age validation. This approach helps researchers to resolve the third of the above-mentioned problems but does not resolve the first two because of inevitable data heterogeneity when data for people belonging to different birth cohorts and countries are pooled together. In this paper we propose an alternative approach, which gives an opportunity to resolve the first two problems by compiling data for more homogeneous single-year birth cohorts with hazard rates measured at narrow (monthly) age intervals. Possible ways of resolving the third problem of hazard rate estimation are elaborated. This approach is based on data from the Social Security Administration Death Master File (DMF). Some birth cohorts covered by the DMF could be studied by the method of extinct generations. Availability of month of birth and month of death information provides a unique opportunity to obtain hazard rate estimates for every month of age. Study of several single-year extinct birth cohorts shows that the mortality trajectory at advanced ages follows the Gompertz law up to the ages 102–105 years without a noticeable deceleration. 
Earlier reports of mortality deceleration (deviation of mortality from the Gompertz law) at ages below 100 appear to be artifacts of mixing together several birth cohorts with different mortality levels and using cross-sectional instead of cohort data. Age exaggeration and crude assumptions applied to mortality estimates at advanced ages may also contribute to mortality underestimation at very advanced ages. PMID:22308064
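Checking monthly hazard estimates against the Gompertz law, as described above, amounts to testing log-linearity of the hazard in age: h(x) = a·e^(bx) implies that ln h(x) is a straight line with slope b. A minimal sketch on synthetic data with assumed parameters:

```python
import math

# Gompertz check on monthly hazard estimates (synthetic data): since
# h(x) = a * exp(b * x) is log-linear in age, an ordinary least-squares
# fit of ln(hazard) against age recovers the slope b. The parameters
# below are hypothetical, per month of age.
A, B = 0.01, 0.008

ages = list(range(0, 120))                   # months past some advanced age
hazards = [A * math.exp(B * x) for x in ages]

def fit_slope(xs, ys):
    """Ordinary least-squares slope of ys on xs."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

b_hat = fit_slope(ages, [math.log(h) for h in hazards])
print(f"fitted Gompertz slope: {b_hat:.4f}")
```

With real cohort data the monthly hazard would be estimated as deaths divided by person-months of exposure in each age interval, and deviations of the fitted residuals from zero at the oldest ages would indicate genuine deceleration rather than heterogeneity artifacts.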

  19. An academic approach to climate change emergency preparedness.

    PubMed

    Trask, Jeffrey A

    To achieve effective emergency management and business continuity, all hazards should be considered during the planning and preparedness process. In recent years, several new hazards have attracted the attention of Emergency Management and Business Continuity practitioners. Climate change presents a unique challenge. Practitioners must rely on historical data combined with scientific projections to guide their planning and preparedness efforts. This article examines how an academic institution's emergency management programme can plan successfully for this hazard by focusing on best practices in the area of building cross-departmental and cross-jurisdictional relationships. Examples of scientific data related to the hazard of climate change are presented along with the latest guidance from the Federal Emergency Management Agency encouraging planning for future hazards. The article presents a functional exercise in which this hazard was prominently featured, along with testimony from subject matter experts. Recommendations for emergency management and business continuity programmes are also provided.

  20. Expert assessment of the resilience of drinking water and sanitation systems to climate-related hazards.

    PubMed

    Luh, Jeanne; Royster, Sarah; Sebastian, Daniel; Ojomo, Edema; Bartram, Jamie

    2017-08-15

    We conducted an expert assessment to obtain expert opinions on the relative global resilience of ten drinking water and five sanitation technologies to the following six climate-related hazards: drought, decreased inter-annual precipitation, flood, superstorm flood, wind damage, and saline intrusion. Resilience scores ranged from 1.7 to 9.9 out of a maximum resilience of 10, with high scores corresponding to high resilience. We find that for some climate-related hazards, such as drought, technologies demonstrated a large range in resilience, indicating that the choice of water and sanitation technologies is important for areas prone to drought. On the other hand, the range of resilience scores for superstorm flooding was much smaller, particularly for sanitation technologies, suggesting that the choice of technology is less of a determinant of functionality for superstorm flooding as compared to other climate-related hazards. For drinking water technologies, only treated piped utility-managed systems that use surface water had resilience scores >6.0 for all hazards, while protected dug wells were found to be one of the least resilient technologies, consistently scoring <5.0 for all hazards except wind damage. In general, sanitation technologies were found to have low to medium resilience, suggesting that sanitation systems need to be adapted to ensure functionality during and after climate-related hazards. The results of the study can be used to help communities decide which technologies are best suited for the climate-related challenges they face and help in future adaptation planning. Copyright © 2017 Elsevier B.V. All rights reserved.

  1. Programmatic Environmental Assessment (EA) for Hazardous Materials Removal at F. E. Warren Air Force Base, Wyoming

    DTIC Science & Technology

    2013-05-31

    ...ACM). The FEW Environmental Planning Function (EPF) conducted the analysis of this proposed action. 2. PURPOSE AND NEED FOR ACTION. The... EPF determined that the proposed action has the potential to affect Air Quality, Occupational Safety and Health, Cultural Resources, and Hazardous Waste...

  2. 40 CFR 117.11 - General applicability.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ...-Scene Coordinator pursuant to 40 CFR part 1510 (the National Oil and Hazardous Substances Pollution Plan) or 33 CFR 153.10(e) (Pollution by Oil and Hazardous Substances) or in accordance with applicable... § 165.7 of Title 14 of the State of California Administrative Code; (g) From a properly functioning...

  3. 40 CFR 117.11 - General applicability.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ...-Scene Coordinator pursuant to 40 CFR part 1510 (the National Oil and Hazardous Substances Pollution Plan) or 33 CFR 153.10(e) (Pollution by Oil and Hazardous Substances) or in accordance with applicable... § 165.7 of Title 14 of the State of California Administrative Code; (g) From a properly functioning...

  4. 40 CFR 117.11 - General applicability.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ...-Scene Coordinator pursuant to 40 CFR part 1510 (the National Oil and Hazardous Substances Pollution Plan) or 33 CFR 153.10(e) (Pollution by Oil and Hazardous Substances) or in accordance with applicable... § 165.7 of Title 14 of the State of California Administrative Code; (g) From a properly functioning...

  5. The Mw 7.7 Bhuj earthquake: Global lessons for earthquake hazard in intra-plate regions

    USGS Publications Warehouse

    Schweig, E.; Gomberg, J.; Petersen, M.; Ellis, M.; Bodin, P.; Mayrose, L.; Rastogi, B.K.

    2003-01-01

    The Mw 7.7 Bhuj earthquake occurred in the Kachchh District of the State of Gujarat, India on 26 January 2001, and was one of the most damaging intraplate earthquakes ever recorded. This earthquake is in many ways similar to the three great New Madrid earthquakes that occurred in the central United States in 1811-1812. An Indo-US team is studying the similarities and differences of these sequences in order to learn lessons for earthquake hazard in intraplate regions. Herein we present some preliminary conclusions from that study. Both the Kutch and New Madrid regions have rift-type geotectonic settings. In both regions the strain rates are of the order of 10⁻⁹/yr, and attenuation of seismic waves, as inferred from observations of intensity and liquefaction, is low. These strain rates predict recurrence intervals for Bhuj- or New Madrid-sized earthquakes of several thousand years or more. In contrast, intervals estimated from paleoseismic studies and from other independent data are significantly shorter, probably hundreds of years. Taken together, these observations may suggest that earthquakes relax high ambient stresses that are locally concentrated by rheologic heterogeneities, rather than by plate-tectonic loading. The latter model generally underlies the basic assumptions made in earthquake hazard assessment: that the long-term average rate of energy released by earthquakes is determined by the tectonic loading rate, which thus implies an inherent average periodicity of earthquake occurrence. Interpreting the observations in terms of the former model may therefore require re-examining the basic assumptions of hazard assessment.
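    The recurrence-interval arithmetic implied by the quoted strain rate can be sketched as a back-of-the-envelope calculation: recurrence ≈ coseismic strain drop / tectonic strain rate, with strain drop taken as stress drop divided by shear modulus. The stress-drop and shear-modulus values below are illustrative assumptions, not figures from the study.

```python
# Rough recurrence interval implied by an intraplate strain rate of ~1e-9/yr.
# Stress drops of 1-10 MPa and a 30 GPa shear modulus are generic crustal
# assumptions, not values reported in the abstract above.

SHEAR_MODULUS = 3.0e10   # Pa, typical crustal value (assumed)
STRAIN_RATE = 1.0e-9     # per year, as quoted for the Kutch and New Madrid regions

for stress_drop_mpa in (1.0, 3.0, 10.0):
    strain_drop = stress_drop_mpa * 1.0e6 / SHEAR_MODULUS   # dimensionless strain
    recurrence = strain_drop / STRAIN_RATE                  # years to re-accumulate it
    print(f"{stress_drop_mpa:>4} MPa stress drop -> ~{recurrence:,.0f} yr recurrence")
```

    Even the smallest assumed stress drop yields recurrence intervals of tens of thousands of years, consistent with the abstract's "several thousand years or more" and far longer than the paleoseismically inferred hundreds of years.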

  6. Marital status and survival in patients with rectal cancer: A population-based STROBE cohort study.

    PubMed

    Li, Zhuyue; Wang, Kang; Zhang, Xuemei; Wen, Jin

    2018-05-01

    To examine the impact of marital status on overall survival (OS) and rectal cancer-specific survival (RCSS) in aged patients, we used the Surveillance, Epidemiology and End Results database to identify aged patients (>65 years) with early stage rectal cancer (RC) (T1-T4, N0, M0) in the United States from 2004 to 2010. Propensity score matching was conducted at a 1:1 ratio to avoid potential confounding factors. We used Kaplan-Meier analysis to compare OS and RCSS between married and unmarried patients, Cox proportional hazards regressions to obtain hazard ratios for OS, and a proportional subdistribution hazard model to calculate hazard ratios for RCSS. In total, 5196 patients were included. The married aged patients (2598 [50%]) had a better crude 5-year overall survival rate (64.2% vs 57.3%, P < .001) and a higher crude 5-year cancer-specific survival rate (80% vs 75.9%, P < .001) than the unmarried (2598 [50%]). In multivariate analyses, married patients had significantly lower overall mortality than unmarried patients (HR = 0.77, 95% CI = 0.71-0.83, P < .001), while aged married patients had no cancer-specific survival benefit versus unmarried aged patients (HR = 0.92, 95% CI = 0.81-1.04, P = .17). Among the older population, married patients with early stage RC had better OS than the unmarried, while current evidence showed that marital status might have no protective effect on cancer-specific survival.
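    The Kaplan-Meier comparison described in the abstract rests on the product-limit estimator for right-censored follow-up data, which can be sketched in a few lines. The event times and censoring flags below are hypothetical, not the SEER cohort.

```python
# Minimal Kaplan-Meier (product-limit) estimator sketch for right-censored data.

def kaplan_meier(times, events):
    """Return (time, survival) pairs at each distinct event time.

    times  : follow-up time for each subject
    events : 1 if the event (e.g. death) occurred, 0 if censored
    """
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = removed = 0
        # group all subjects sharing this follow-up time
        while i < len(data) and data[i][0] == t:
            deaths += data[i][1]
            removed += 1
            i += 1
        if deaths:                             # survival only drops at event times
            surv *= 1.0 - deaths / n_at_risk
            curve.append((t, surv))
        n_at_risk -= removed                   # events and censorings leave the risk set
    return curve

# Hypothetical follow-up data: times in years, 1 = died, 0 = censored.
for t, s in kaplan_meier([2, 3, 3, 5, 6, 8, 9, 11], [1, 1, 0, 1, 0, 1, 1, 0]):
    print(t, round(s, 3))
```

    In the full analysis, a log-rank test or the Cox model mentioned above would then compare the married and unmarried curves.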

  7. Increasing seismicity in the U. S. midcontinent: Implications for earthquake hazard

    USGS Publications Warehouse

    Ellsworth, William L.; Llenos, Andrea L.; McGarr, Arthur F.; Michael, Andrew J.; Rubinstein, Justin L.; Mueller, Charles S.; Petersen, Mark D.; Calais, Eric

    2015-01-01

    Earthquake activity in parts of the central United States has increased dramatically in recent years. The space-time distribution of the increased seismicity, as well as numerous published case studies, indicates that the increase is of anthropogenic origin, principally driven by injection of wastewater coproduced with oil and gas from tight formations. Enhanced oil recovery and long-term production also contribute to seismicity at a few locations. Preliminary hazard models indicate that areas experiencing the highest rate of earthquakes in 2014 have a short-term (one-year) hazard comparable to or higher than the hazard in the source region of tectonic earthquakes in the New Madrid and Charleston seismic zones.

  8. Earthquake shaking hazard estimates and exposure changes in the conterminous United States

    USGS Publications Warehouse

    Jaiswal, Kishor S.; Petersen, Mark D.; Rukstales, Kenneth S.; Leith, William S.

    2015-01-01

    A large portion of the population of the United States lives in areas vulnerable to earthquake hazards. This investigation aims to quantify the population and infrastructure within the conterminous U.S. that are exposed to varying levels of earthquake ground motion by systematically analyzing the last four cycles of the U.S. Geological Survey's (USGS) National Seismic Hazard Models (published in 1996, 2002, 2008, and 2014). Using the 2013 LandScan data, we estimate the numbers of people who are exposed to potentially damaging ground motions (peak ground accelerations at or above 0.1g). At least 28 million (~9% of the total population) may experience 0.1g level of shaking at relatively frequent intervals (annual rate of 1 in 72 years, or 50% probability of exceedance (PE) in 50 years), 57 million (~18% of the total population) may experience this level of shaking at moderately frequent intervals (annual rate of 1 in 475 years, or 10% PE in 50 years), and 143 million (~46% of the total population) may experience such shaking at relatively infrequent intervals (annual rate of 1 in 2,475 years, or 2% PE in 50 years). We also show that a significant number of critical infrastructure facilities are located in high earthquake-hazard areas (Modified Mercalli Intensity ≥ VII with a moderately frequent recurrence interval).
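    The pairings above between probability of exceedance over 50 years and return period (50%/72 yr, 10%/475 yr, 2%/2,475 yr) follow from the Poisson assumption PE = 1 − exp(−rate × T). A small sketch recovers the quoted figures:

```python
import math

def return_period(pe, horizon_years):
    """Mean return period implied by a probability of exceedance (PE)
    over a time horizon, assuming Poisson (memoryless) occurrence."""
    annual_rate = -math.log(1.0 - pe) / horizon_years
    return 1.0 / annual_rate

for pe in (0.50, 0.10, 0.02):
    print(f"{pe:.0%} PE in 50 years -> ~{return_period(pe, 50):.0f}-year return period")
```

    Note that 1/475 per year is an exceedance *rate*, not an exact annual probability; the two nearly coincide only because the rates are small.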

  9. Preliminary Considerations for Classifying Hazards of Unmanned Aircraft Systems

    NASA Technical Reports Server (NTRS)

    Hayhurst, Kelly J.; Maddalon, Jeffrey M.; Miner, Paul S.; Szatkowski, George N.; Ulrey, Michael L.; DeWalt, Michael P.; Spitzer, Cary R.

    2007-01-01

    The use of unmanned aircraft in national airspace has been characterized as the next great step forward in the evolution of civil aviation. To make routine and safe operation of these aircraft a reality, a number of technological and regulatory challenges must be overcome. This report discusses some of the regulatory challenges with respect to deriving safety and reliability requirements for unmanned aircraft. In particular, definitions of hazards and their classification are discussed and applied to a preliminary functional hazard assessment of a generic unmanned system.

  10. Low Dose Radiation Cancer Risks: Epidemiological and Toxicological Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    David G. Hoel, PhD

    2012-04-19

    The basic purpose of this one-year research grant was to extend the two-stage clonal expansion (TSCE) model of carcinogenesis to exposures other than the usual single acute exposure. The TSCE model incorporates the biological process of carcinogenesis, which involves two mutations and the clonal proliferation of the intermediate cells, in a stochastic, mathematical way. The current TSCE model serves the general purpose of acute exposure modelling but requires numerical computation of both the survival and hazard functions. The primary objective of this research project was to develop analytical expressions for the survival function and the hazard function of the occurrence of the first cancer cell for acute, continuous, and multiple exposure cases within the framework of the piecewise-constant-parameter TSCE model of carcinogenesis. For acute exposure and multiple exposures in an acute series, either only the first mutation rate is allowed to vary with dose, or all parameters are allowed to be dose dependent; for multiple continuous exposures, all parameters are allowed to vary with dose. With these analytical functions, it becomes easy to evaluate cancer risks, and they allow one to deal with various exposure patterns in cancer risk assessment. A second objective was to apply the TSCE model with varying continuous exposures to the cancer studies of inhaled plutonium in beagle dogs. Using step functions to estimate the retention functions of the pulmonary exposure to plutonium, the multiple-exposure versions of the TSCE model were to be used to estimate the beagle dog lung cancer risks. The mathematical equations of the multiple-exposure versions of the TSCE model were developed; a draft manuscript, which is attached, provides the results of this mathematical work. The application work using the beagle dog data from plutonium exposure has not been completed because the research project did not continue beyond its first year.
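    The survival-hazard relationship underlying the piecewise-constant-parameter framework can be illustrated generically. This is the standard relation S(t) = exp(−∫₀ᵗ h(u) du) for a piecewise-constant hazard, not the TSCE solution itself; the cut points and rates below are made up.

```python
import math

def survival(t, cuts, rates):
    """S(t) = exp(-cumulative hazard) for a piecewise-constant hazard.

    cuts  = [c1, c2, ...] splits time into [0,c1), [c1,c2), ..., [c_n, inf)
    rates = hazard rate on each interval; len(rates) == len(cuts) + 1
    """
    cum_hazard, prev = 0.0, 0.0
    for c, h in zip(cuts + [math.inf], rates):
        if t <= c:
            cum_hazard += h * (t - prev)   # partial final segment
            break
        cum_hazard += h * (c - prev)       # full segment
        prev = c
    return math.exp(-cum_hazard)

# Hazard 0.1/yr for the first year, 0.3/yr afterwards (illustrative numbers):
print(survival(2.0, [1.0], [0.1, 0.3]))   # exp(-(0.1*1 + 0.3*1)) = exp(-0.4)
```

    In an exposure setting, each exposure interval would contribute its own segment of constant parameters, which is exactly why closed-form survival and hazard expressions simplify risk evaluation.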

  11. Kidney function and sudden cardiac death in the community: The Atherosclerosis Risk in Communities (ARIC) Study.

    PubMed

    Suzuki, Takeki; Agarwal, Sunil K; Deo, Rajat; Sotoodehnia, Nona; Grams, Morgan E; Selvin, Elizabeth; Calkins, Hugh; Rosamond, Wayne; Tomaselli, Gordon; Coresh, Josef; Matsushita, Kunihiro

    2016-10-01

    Individuals with chronic kidney disease, particularly those requiring dialysis, are at high risk of sudden cardiac death (SCD). However, comprehensive data for the full spectrum of kidney function and SCD risk in the community are sparse. Furthermore, newly developed equations for estimated glomerular filtration rate (eGFR) and novel filtration markers might add further insight into the role of kidney function in SCD. We investigated the associations of baseline eGFRs using serum creatinine, cystatin C, or both (eGFRcr, eGFRcys, and eGFRcr-cys); cystatin C itself; and β2-microglobulin (B2M) with SCD (205 cases through 2001) among 13,070 black and white ARIC participants at baseline during 1990-1992, using Cox regression models accounting for potential confounders. Low eGFR was independently associated with SCD risk: for example, the hazard ratio for eGFR <45 versus ≥90 mL/min/1.73 m² was 3.71 (95% CI 1.74-7.90) with eGFRcr, 5.40 (2.97-9.83) with eGFRcr-cys, and 5.24 (3.01-9.11) with eGFRcys. When eGFRcr and eGFRcys were included together in a single model, the association was only significant for eGFRcys. When the three eGFRs, cystatin C, and B2M were divided into quartiles, B2M demonstrated the strongest association with SCD (hazard ratio for fourth quartile vs first quartile 3.48 [2.03-5.96] vs ≤2.7 for the other kidney markers). Kidney function was independently and robustly associated with SCD in the community, particularly when cystatin C or B2M was used. These results suggest the potential value of kidney function as a risk factor for SCD and the advantage of novel filtration markers over eGFRcr in this context. Copyright © 2016 Elsevier Inc. All rights reserved.

  12. Time-dependent landslide probability mapping

    USGS Publications Warehouse

    Campbell, Russell H.; Bernknopf, Richard L.; ,

    1993-01-01

    Case studies where the time of failure is known for rainfall-triggered debris flows can be used to estimate the parameters of a hazard model in which the probability of failure is a function of time. As an example, a time-dependent function for the conditional probability of a soil slip is estimated from independent variables representing hillside morphology, approximations of material properties, and the duration and rate of rainfall. If probabilities are calculated in a GIS (geographic information system) environment, the spatial distribution of the result for any given hour can be displayed on a map. Although the probability levels in this example are uncalibrated, the method offers a potential for evaluating different physical models and different earth-science variables by comparing the map distribution of predicted probabilities with inventory maps for different areas and different storms. If linked with spatial and temporal socio-economic variables, this method could be used for short-term risk assessment.

  13. Reliability Analysis and Optimal Release Problem Considering Maintenance Time of Software Components for an Embedded OSS Porting Phase

    NASA Astrophysics Data System (ADS)

    Tamura, Yoshinobu; Yamada, Shigeru

    OSS (open source software) systems, which serve as key components of critical infrastructure in our social life, are still ever-expanding. Embedded OSS systems in particular have been gaining a lot of attention in the embedded system area, e.g., Android, BusyBox, and TRON. However, poor handling of quality problems and customer support hinders the progress of embedded OSS. Also, it is difficult for developers to assess the reliability and portability of embedded OSS on a single-board computer. In this paper, we propose a method of software reliability assessment based on flexible hazard rates for embedded OSS. We also analyze actual data of software failure-occurrence time-intervals to show numerical examples of software reliability assessment for embedded OSS. Moreover, we compare the proposed hazard rate model for embedded OSS with typical conventional hazard rate models using goodness-of-fit comparison criteria. Furthermore, we discuss the optimal software release problem for the porting phase based on the total expected software maintenance cost.
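    Comparing hazard rate models by goodness-of-fit, as the paper does, can be sketched by fitting a constant-hazard (exponential) model and a Weibull hazard to failure time-intervals and comparing AIC. The failure data and the crude grid search below are illustrative assumptions, not the paper's flexible hazard rate model.

```python
import math

def weibull_loglik(times, k):
    """Profile log-likelihood of a Weibull(shape=k, scale=lam) lifetime model,
    with lam set to its closed-form MLE given k: lam^k = mean(t^k).
    The exponential (constant hazard rate) model is the special case k = 1."""
    n = len(times)
    lam = (sum(t**k for t in times) / n) ** (1.0 / k)
    # sum((t/lam)^k) == n at the profiled scale, so the last term is just -n
    return (n * math.log(k) - n * k * math.log(lam)
            + (k - 1) * sum(math.log(t) for t in times) - n)

# Hypothetical software failure time-intervals (not data from the paper).
times = [0.5, 1.2, 1.9, 2.4, 3.1, 3.8, 4.6, 5.9]

ll_exp = weibull_loglik(times, 1.0)               # constant-hazard baseline
k_hat, ll_wei = max(((k / 100, weibull_loglik(times, k / 100))
                     for k in range(20, 400)),    # shapes 0.20 .. 3.99
                    key=lambda p: p[1])
aic_exp = 2 * 1 - 2 * ll_exp                      # one free parameter
aic_wei = 2 * 2 - 2 * ll_wei                      # two free parameters
print(f"k_hat={k_hat:.2f}  AIC(exp)={aic_exp:.2f}  AIC(Weibull)={aic_wei:.2f}")
```

    A shape estimate above 1 would indicate an increasing hazard rate; the lower-AIC model is preferred, mirroring the comparison-criteria approach described in the abstract.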

  14. Apixaban versus warfarin in patients with atrial fibrillation.

    PubMed

    Granger, Christopher B; Alexander, John H; McMurray, John J V; Lopes, Renato D; Hylek, Elaine M; Hanna, Michael; Al-Khalidi, Hussein R; Ansell, Jack; Atar, Dan; Avezum, Alvaro; Bahit, M Cecilia; Diaz, Rafael; Easton, J Donald; Ezekowitz, Justin A; Flaker, Greg; Garcia, David; Geraldes, Margarida; Gersh, Bernard J; Golitsyn, Sergey; Goto, Shinya; Hermosillo, Antonio G; Hohnloser, Stefan H; Horowitz, John; Mohan, Puneet; Jansky, Petr; Lewis, Basil S; Lopez-Sendon, Jose Luis; Pais, Prem; Parkhomenko, Alexander; Verheugt, Freek W A; Zhu, Jun; Wallentin, Lars

    2011-09-15

    Vitamin K antagonists are highly effective in preventing stroke in patients with atrial fibrillation but have several limitations. Apixaban is a novel oral direct factor Xa inhibitor that has been shown to reduce the risk of stroke in a similar population in comparison with aspirin. In this randomized, double-blind trial, we compared apixaban (at a dose of 5 mg twice daily) with warfarin (target international normalized ratio, 2.0 to 3.0) in 18,201 patients with atrial fibrillation and at least one additional risk factor for stroke. The primary outcome was ischemic or hemorrhagic stroke or systemic embolism. The trial was designed to test for noninferiority, with key secondary objectives of testing for superiority with respect to the primary outcome and to the rates of major bleeding and death from any cause. The median duration of follow-up was 1.8 years. The rate of the primary outcome was 1.27% per year in the apixaban group, as compared with 1.60% per year in the warfarin group (hazard ratio with apixaban, 0.79; 95% confidence interval [CI], 0.66 to 0.95; P<0.001 for noninferiority; P=0.01 for superiority). The rate of major bleeding was 2.13% per year in the apixaban group, as compared with 3.09% per year in the warfarin group (hazard ratio, 0.69; 95% CI, 0.60 to 0.80; P<0.001), and the rates of death from any cause were 3.52% and 3.94%, respectively (hazard ratio, 0.89; 95% CI, 0.80 to 0.99; P=0.047). The rate of hemorrhagic stroke was 0.24% per year in the apixaban group, as compared with 0.47% per year in the warfarin group (hazard ratio, 0.51; 95% CI, 0.35 to 0.75; P<0.001), and the rate of ischemic or uncertain type of stroke was 0.97% per year in the apixaban group and 1.05% per year in the warfarin group (hazard ratio, 0.92; 95% CI, 0.74 to 1.13; P=0.42). In patients with atrial fibrillation, apixaban was superior to warfarin in preventing stroke or systemic embolism, caused less bleeding, and resulted in lower mortality. 
(Funded by Bristol-Myers Squibb and Pfizer; ARISTOTLE ClinicalTrials.gov number, NCT00412984.).

  15. 46 CFR 153.316 - Special cargo pumproom ventilation rate.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... CARGOES SHIPS CARRYING BULK LIQUID, LIQUEFIED GAS, OR COMPRESSED GAS HAZARDOUS MATERIALS Design and Equipment Cargo Handling Space Ventilation § 153.316 Special cargo pumproom ventilation rate. When Table 1...

  16. 46 CFR 153.316 - Special cargo pumproom ventilation rate.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... CARGOES SHIPS CARRYING BULK LIQUID, LIQUEFIED GAS, OR COMPRESSED GAS HAZARDOUS MATERIALS Design and Equipment Cargo Handling Space Ventilation § 153.316 Special cargo pumproom ventilation rate. When Table 1...

  17. 46 CFR 153.316 - Special cargo pumproom ventilation rate.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... CARGOES SHIPS CARRYING BULK LIQUID, LIQUEFIED GAS, OR COMPRESSED GAS HAZARDOUS MATERIALS Design and Equipment Cargo Handling Space Ventilation § 153.316 Special cargo pumproom ventilation rate. When Table 1...

  18. 46 CFR 153.316 - Special cargo pumproom ventilation rate.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... CARGOES SHIPS CARRYING BULK LIQUID, LIQUEFIED GAS, OR COMPRESSED GAS HAZARDOUS MATERIALS Design and Equipment Cargo Handling Space Ventilation § 153.316 Special cargo pumproom ventilation rate. When Table 1...

  19. 46 CFR 153.316 - Special cargo pumproom ventilation rate.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... CARGOES SHIPS CARRYING BULK LIQUID, LIQUEFIED GAS, OR COMPRESSED GAS HAZARDOUS MATERIALS Design and Equipment Cargo Handling Space Ventilation § 153.316 Special cargo pumproom ventilation rate. When Table 1...

  20. Home blood pressure predicts stroke incidence among older adults with impaired physical function: the Ohasama study.

    PubMed

    Murakami, Keiko; Asayama, Kei; Satoh, Michihiro; Hosaka, Miki; Matsuda, Ayako; Inoue, Ryusuke; Tsubota-Utsugi, Megumi; Murakami, Takahisa; Nomura, Kyoko; Kikuya, Masahiro; Metoki, Hirohito; Imai, Yutaka; Ohkubo, Takayoshi

    2017-12-01

    Several observational studies have found modifying effects of functional status on the association between conventional office blood pressure (BP) and adverse outcomes. We aimed to examine whether the association between higher BP and stroke was attenuated or inverted among older adults with impaired function using self-measured home BP measurements. We followed 501 Japanese community-dwelling adults aged at least 60 years (mean age, 68.6 years) with no history of stroke. Multivariate-adjusted hazard ratios for 1-SD increase in home BP and office BP measurements were calculated by the Cox proportional hazards model. Functional status was assessed by self-reported physical function. During a median follow-up of 11.5 years, first strokes were observed in 47 participants. Higher home SBP, but not office SBP, was significantly associated with increased risk of stroke among both 349 participants with normal physical function and 152 participants with impaired physical function [hazard ratio (95% confidence interval) per 14.4-mmHg increase: 1.74 (1.12-2.69) and 1.77 (1.06-2.94), respectively], with no significant interaction for physical function (P = 0.56). Higher home DBP, but not office DBP, was also significantly associated with increased risk of stroke (P ≤ 0.029) irrespective of physical function (all P > 0.05 for interaction). Neither home BP nor office BP was significantly associated with all-cause mortality irrespective of physical function. Higher home BP was associated with increased risk of stroke even among those with impaired physical function. Measurements of home BP would be useful for stroke prevention, even after physical function decline.
