Science.gov

Sample records for accurate risk estimates

  1. Do We Know Whether Researchers and Reviewers are Estimating Risk and Benefit Accurately?

    PubMed

    Hey, Spencer Phillips; Kimmelman, Jonathan

    2016-10-01

    Accurate estimation of risk and benefit is integral to good clinical research planning, ethical review, and study implementation. Some commentators have argued that various actors in clinical research systems are prone to biased or arbitrary risk/benefit estimation. In this commentary, we suggest the evidence supporting such claims is very limited. Most prior work has imputed risk/benefit beliefs based on past behavior or goals, rather than directly measuring them. We describe an approach - forecast analysis - that would enable direct and effective measurement of the quality of risk/benefit estimation. We then consider some objections and limitations to the forecasting approach. PMID:27197044

  2. Do We Know Whether Researchers and Reviewers are Estimating Risk and Benefit Accurately?

    PubMed

    Hey, Spencer Phillips; Kimmelman, Jonathan

    2016-10-01

    Accurate estimation of risk and benefit is integral to good clinical research planning, ethical review, and study implementation. Some commentators have argued that various actors in clinical research systems are prone to biased or arbitrary risk/benefit estimation. In this commentary, we suggest the evidence supporting such claims is very limited. Most prior work has imputed risk/benefit beliefs based on past behavior or goals, rather than directly measuring them. We describe an approach - forecast analysis - that would enable direct and effective measurement of the quality of risk/benefit estimation. We then consider some objections and limitations to the forecasting approach.

  3. Estimating risk.

    PubMed

    2016-07-01

    A free mobile phone app has been launched providing nurses and other hospital clinicians with a simple way to identify high-risk surgical patients. The app is a phone version of the Surgical Outcome Risk Tool (SORT), originally developed for online use with computers by researchers from the National Confidential Enquiry into Patient Outcome and Death and the University College London Hospital Surgical Outcomes Research Centre. SORT uses information about patients' health and planned surgical procedures to estimate the risk of death within 30 days of an operation. The percentages are only estimates, taking into account the general risks of the procedures and some information about patients, and should not be confused with patient-specific estimates in individual cases. PMID:27369709

  4. Rosiglitazone: can meta-analysis accurately estimate excess cardiovascular risk given the available data? Re-analysis of randomized trials using various methodologic approaches

    PubMed Central

    Friedrich, Jan O; Beyene, Joseph; Adhikari, Neill KJ

    2009-01-01

    statistically significant. Conclusion: We have shown that alternative reasonable methodological approaches to the rosiglitazone meta-analysis can yield increased or decreased risks that are either statistically significant or not significant at the p = 0.05 level for both myocardial infarction and cardiovascular death. Completion of ongoing trials may help to generate more accurate estimates of rosiglitazone's effect on cardiovascular outcomes. However, given that almost all point estimates suggest harm rather than benefit and the availability of alternative agents, the use of rosiglitazone may greatly decline prior to more definitive safety data being generated. PMID:19134216

  5. How accurately are maximal metabolic equivalents estimated based on the treadmill workload in healthy people and asymptomatic subjects with cardiovascular risk factors?

    PubMed

    Maeder, M T; Muenzer, T; Rickli, H; Brunner-La Rocca, H P; Myers, J; Ammann, P

    2008-08-01

    Maximal exercise capacity expressed as metabolic equivalents (METs) is rarely directly measured (measured METs; mMETs) but estimated from maximal workload (estimated METs; eMETs). We assessed the accuracy of predicting mMETs by eMETs in asymptomatic subjects. Thirty-four healthy volunteers without cardiovascular risk factors (controls) and 90 patients with at least one risk factor underwent cardiopulmonary exercise testing using individualized treadmill ramp protocols. The equation of the American College of Sports Medicine (ACSM) was employed to calculate eMETs. Despite a close correlation between eMETs and mMETs (patients: r = 0.82, controls: r = 0.88; p < 0.001 for both), eMETs were higher than mMETs in both patients [11.7 (8.9 - 13.4) vs. 8.2 (7.0 - 10.6) METs; p < 0.001] and controls [17.0 (16.2 - 18.2) vs. 15.6 (14.2 - 17.0) METs; p < 0.001]. The absolute [2.5 (1.6 - 3.7) vs. 1.3 (0.9 - 2.1) METs; p < 0.001] and the relative [28 (19 - 47) vs. 9 (6 - 14) %; p < 0.001] difference between eMETs and mMETs was higher in patients. In patients, ratio limits of agreement of 1.33 (×/÷ 1.40) between eMETs and mMETs were obtained, whereas the ratio limits of agreement were 1.09 (×/÷ 1.13) in controls. The ACSM equation is associated with a significant overestimation of mMETs in young and fit subjects, which is markedly more pronounced in older and less fit subjects with cardiovascular risk factors.
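
    The ACSM treadmill equations referenced in this record are not reproduced in the abstract; the sketch below uses the commonly cited ACSM walking and running forms (VO2 = 0.1·speed + 1.8·speed·grade + 3.5 and VO2 = 0.2·speed + 0.9·speed·grade + 3.5 ml/kg/min, with 1 MET = 3.5 ml/kg/min) to compute eMETs from a treadmill workload. Treat it as a minimal illustration of the estimation step under those assumed equations, not the study's individualized ramp protocol.

```python
def acsm_estimated_mets(speed_m_per_min: float, grade_fraction: float, running: bool) -> float:
    """Estimate METs from treadmill workload with the commonly cited ACSM equations.

    speed_m_per_min : treadmill speed in m/min (1 mph is roughly 26.8 m/min)
    grade_fraction  : treadmill grade as a fraction (e.g. 0.05 for 5%)
    running         : True for the running equation, False for walking
    """
    if running:
        # Running: VO2 (ml/kg/min) = 0.2*speed + 0.9*speed*grade + 3.5
        vo2 = 0.2 * speed_m_per_min + 0.9 * speed_m_per_min * grade_fraction + 3.5
    else:
        # Walking: VO2 (ml/kg/min) = 0.1*speed + 1.8*speed*grade + 3.5
        vo2 = 0.1 * speed_m_per_min + 1.8 * speed_m_per_min * grade_fraction + 3.5
    return vo2 / 3.5  # 1 MET = 3.5 ml O2 per kg per min


# Example: running at 8 km/h (about 133.3 m/min) on a 2% grade
print(round(acsm_estimated_mets(133.3, 0.02, running=True), 1))  # roughly 9.3 METs
```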

  6. Accurate Biomass Estimation via Bayesian Adaptive Sampling

    NASA Technical Reports Server (NTRS)

    Wheeler, Kevin R.; Knuth, Kevin H.; Castle, Joseph P.; Lvov, Nikolay

    2005-01-01

    The following concepts were introduced: a) Bayesian adaptive sampling for solving biomass estimation; b) Characterization of MISR Rahman model parameters conditioned upon MODIS landcover; c) Rigorous non-parametric Bayesian approach to analytic mixture model determination; d) Unique U.S. asset for science product validation and verification.

  7. 31 CFR 205.24 - How are accurate estimates maintained?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 31 Money and Finance: Treasury; Treasury-State Agreement; § 205.24 How are accurate estimates maintained? (a) If a State has knowledge that an estimate does not reasonably correspond to the State's cash needs for a Federal assistance...

  8. Micromagnetometer calibration for accurate orientation estimation.

    PubMed

    Zhang, Zhi-Qiang; Yang, Guang-Zhong

    2015-02-01

    Micromagnetometers, together with inertial sensors, are widely used for attitude estimation for a wide variety of applications. However, appropriate sensor calibration, which is essential to the accuracy of attitude reconstruction, must be performed in advance. Thus far, many different magnetometer calibration methods have been proposed to compensate for errors such as scale, offset, and nonorthogonality. They have also been used to obviate magnetic errors due to soft and hard iron. However, in order to combine the magnetometer with the inertial sensor for attitude reconstruction, the alignment difference between the magnetometer and the axes of the inertial sensor must be determined as well. This paper proposes a practical means of sensor error correction by simultaneous consideration of sensor errors, magnetic errors, and alignment difference. We take the summation of the offset and hard iron error as the combined bias and then amalgamate the alignment difference and all the other errors as a transformation matrix. A two-step approach is presented to determine the combined bias and transformation matrix separately. In the first step, the combined bias is determined by finding an optimal ellipsoid that can best fit the sensor readings. In the second step, the intrinsic relationships of the raw sensor readings are explored to estimate the transformation matrix as a homogeneous linear least-squares problem. Singular value decomposition is then applied to estimate both the transformation matrix and magnetic vector. The proposed method is then applied to calibrate our sensor node. Although there is no ground truth for the combined bias and transformation matrix for our node, the consistency of calibration results among different trials and less than 3° root-mean-square error for orientation estimation have been achieved, which illustrates the effectiveness of the proposed sensor calibration method for practical applications. PMID:25265625
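
    The record's first calibration step, finding an ellipsoid that best fits the raw readings, can be illustrated in simplified form. The sketch below assumes scale and soft-iron errors are small enough that a sphere fit suffices, so it recovers only the combined bias by linear least squares; the full ellipsoid fit and the SVD-based estimation of the transformation matrix are not reproduced.

```python
import numpy as np

def fit_sphere_bias(readings: np.ndarray):
    """Least-squares sphere fit to raw magnetometer samples (N x 3).

    Solves |m - b|^2 = r^2, rewritten as 2 m.b + (r^2 - |b|^2) = |m|^2,
    which is linear in the unknowns. Returns (bias, radius).
    """
    A = np.hstack([2.0 * readings, np.ones((readings.shape[0], 1))])
    y = np.sum(readings ** 2, axis=1)
    sol, *_ = np.linalg.lstsq(A, y, rcond=None)
    bias = sol[:3]
    radius = np.sqrt(sol[3] + bias @ bias)
    return bias, radius

# Synthetic check: samples on a 50 uT sphere offset by a known bias, plus noise
rng = np.random.default_rng(0)
true_bias = np.array([12.0, -7.0, 3.0])
dirs = rng.normal(size=(500, 3))
dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
samples = true_bias + 50.0 * dirs + rng.normal(scale=0.5, size=(500, 3))
print(fit_sphere_bias(samples)[0])  # close to [12, -7, 3]
```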

  9. Quantifying Accurate Calorie Estimation Using the "Think Aloud" Method

    ERIC Educational Resources Information Center

    Holmstrup, Michael E.; Stearns-Bruening, Kay; Rozelle, Jeffrey

    2013-01-01

    Objective: Clients often have limited time in a nutrition education setting. An improved understanding of the strategies used to accurately estimate calories may help to identify areas of focused instruction to improve nutrition knowledge. Methods: A "Think Aloud" exercise was recorded during the estimation of calories in a standard dinner meal…

  10. Risk estimates for bone

    SciTech Connect

    Schlenker, R.A.

    1981-01-01

    The primary sources of information on the skeletal effects of internal emitters in humans are the US radium cases with occupational and medical exposures to ²²⁶Ra and ²²⁸Ra and the German patients injected with ²²⁴Ra primarily for treatment of ankylosing spondylitis and tuberculosis. During the past decade, dose-response data from both study populations have been used by committees, e.g., the BEIR committees, to estimate risks at low dose levels. NCRP Committee 57 and its task groups are now engaged in making risk estimates for internal emitters. This paper presents brief discussions of the radium data, the results of some new analyses and suggestions for expressing risk estimates in a form appropriate to radiation protection.

  11. Accurate Parameter Estimation for Unbalanced Three-Phase System

    PubMed Central

    Chen, Yuan

    2014-01-01

    Smart grid is an intelligent power generation and control console in modern electricity networks, where the unbalanced three-phase power system is the commonly used model. Here, parameter estimation for this system is addressed. After converting the three-phase waveforms into a pair of orthogonal signals via the α β-transformation, the nonlinear least squares (NLS) estimator is developed for accurately finding the frequency, phase, and voltage parameters. The estimator is realized by the Newton-Raphson scheme, whose global convergence is studied in this paper. Computer simulations show that the mean square error performance of NLS method can attain the Cramér-Rao lower bound. Moreover, our proposal provides more accurate frequency estimation when compared with the complex least mean square (CLMS) and augmented CLMS. PMID:25162056

  12. Accurate parameter estimation for unbalanced three-phase system.

    PubMed

    Chen, Yuan; So, Hing Cheung

    2014-01-01

    Smart grid is an intelligent power generation and control console in modern electricity networks, where the unbalanced three-phase power system is the commonly used model. Here, parameter estimation for this system is addressed. After converting the three-phase waveforms into a pair of orthogonal signals via the α β-transformation, the nonlinear least squares (NLS) estimator is developed for accurately finding the frequency, phase, and voltage parameters. The estimator is realized by the Newton-Raphson scheme, whose global convergence is studied in this paper. Computer simulations show that the mean square error performance of NLS method can attain the Cramér-Rao lower bound. Moreover, our proposal provides more accurate frequency estimation when compared with the complex least mean square (CLMS) and augmented CLMS.
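
    A minimal sketch of the general workflow described in the two records above: convert three-phase samples to the orthogonal pair with the Clarke (αβ) transformation and fit the common frequency by nonlinear least squares. SciPy's trust-region solver stands in here for the paper's Newton-Raphson implementation, and all signal parameters are illustrative.

```python
import numpy as np
from scipy.optimize import least_squares

fs = 5000.0                       # sampling rate (Hz), illustrative
t = np.arange(1000) / fs
f0, phi = 50.2, 0.3               # "unknown" ground truth for the demo

# Unbalanced three-phase waveforms (different amplitude on each phase)
a = 1.00 * np.cos(2 * np.pi * f0 * t + phi)
b = 0.85 * np.cos(2 * np.pi * f0 * t + phi - 2 * np.pi / 3)
c = 1.10 * np.cos(2 * np.pi * f0 * t + phi + 2 * np.pi / 3)

# Clarke (alpha-beta) transformation to a pair of orthogonal signals
alpha = (2.0 / 3.0) * (a - 0.5 * b - 0.5 * c)
beta = (1.0 / np.sqrt(3.0)) * (b - c)

def residuals(p):
    # Common frequency, separate amplitude and phase for each orthogonal channel
    f, amp_a, amp_b, ph_a, ph_b = p
    model_alpha = amp_a * np.cos(2 * np.pi * f * t + ph_a)
    model_beta = amp_b * np.sin(2 * np.pi * f * t + ph_b)
    return np.concatenate([model_alpha - alpha, model_beta - beta])

fit = least_squares(residuals, x0=[50.0, 1.0, 1.0, 0.0, 0.0])
print(fit.x[0])   # estimated frequency, close to 50.2 Hz
```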

  13. An Accurate Link Correlation Estimator for Improving Wireless Protocol Performance

    PubMed Central

    Zhao, Zhiwei; Xu, Xianghua; Dong, Wei; Bu, Jiajun

    2015-01-01

    Wireless link correlation has shown significant impact on the performance of various sensor network protocols. Many works have been devoted to exploiting link correlation for protocol improvements. However, the effectiveness of these designs heavily relies on the accuracy of link correlation measurement. In this paper, we investigate state-of-the-art link correlation measurement and analyze the limitations of existing works. We then propose a novel lightweight and accurate link correlation estimation (LACE) approach based on the reasoning of link correlation formation. LACE combines both long-term and short-term link behaviors for link correlation estimation. We implement LACE as a stand-alone interface in TinyOS and incorporate it into both routing and flooding protocols. Simulation and testbed results show that LACE: (1) achieves more accurate and lightweight link correlation measurements than the state-of-the-art work; and (2) greatly improves the performance of protocols exploiting link correlation. PMID:25686314

  14. An accurate link correlation estimator for improving wireless protocol performance.

    PubMed

    Zhao, Zhiwei; Xu, Xianghua; Dong, Wei; Bu, Jiajun

    2015-02-12

    Wireless link correlation has shown significant impact on the performance of various sensor network protocols. Many works have been devoted to exploiting link correlation for protocol improvements. However, the effectiveness of these designs heavily relies on the accuracy of link correlation measurement. In this paper, we investigate state-of-the-art link correlation measurement and analyze the limitations of existing works. We then propose a novel lightweight and accurate link correlation estimation (LACE) approach based on the reasoning of link correlation formation. LACE combines both long-term and short-term link behaviors for link correlation estimation. We implement LACE as a stand-alone interface in TinyOS and incorporate it into both routing and flooding protocols. Simulation and testbed results show that LACE: (1) achieves more accurate and lightweight link correlation measurements than the state-of-the-art work; and (2) greatly improves the performance of protocols exploiting link correlation.
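
    LACE itself is not specified in these abstracts, so the sketch below only illustrates the quantity being estimated: pairwise link correlation computed from hypothetical binary packet-reception traces of two receivers overhearing the same broadcasts.

```python
import numpy as np

def link_correlation(rx_i: np.ndarray, rx_j: np.ndarray) -> dict:
    """Estimate link correlation from two binary reception traces.

    rx_i, rx_j : arrays of 0/1, one entry per broadcast packet, indicating
    whether receiver i / receiver j overheard that packet.
    """
    p_i, p_j = rx_i.mean(), rx_j.mean()
    p_joint = np.mean(rx_i * rx_j)
    return {
        "cond_p_j_given_i": p_joint / p_i if p_i > 0 else float("nan"),
        "pearson": np.corrcoef(rx_i, rx_j)[0, 1],
        "independence_baseline": p_i * p_j,   # expected joint PRR if links were uncorrelated
        "joint_prr": p_joint,
    }

# Hypothetical traces: losses tend to happen on the same packets (correlated links)
rx_i = np.array([1, 1, 0, 1, 0, 1, 1, 1, 0, 1, 1, 0, 1, 1, 1, 0, 1, 1, 1, 1])
rx_j = np.array([1, 1, 0, 1, 0, 1, 1, 0, 0, 1, 1, 0, 1, 1, 1, 0, 1, 1, 0, 1])
print(link_correlation(rx_i, rx_j))
```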

  15. Accurate estimation of sigma(exp 0) using AIRSAR data

    NASA Technical Reports Server (NTRS)

    Holecz, Francesco; Rignot, Eric

    1995-01-01

    During recent years signature analysis, classification, and modeling of Synthetic Aperture Radar (SAR) data as well as estimation of geophysical parameters from SAR data have received a great deal of interest. An important requirement for the quantitative use of SAR data is the accurate estimation of the backscattering coefficient sigma(exp 0). In terrain with relief variations radar signals are distorted due to the projection of the scene topography into the slant range-Doppler plane. The effect of these variations is to change the physical size of the scattering area, leading to errors in the radar backscatter values and incidence angle. For this reason the local incidence angle, derived from sensor position and Digital Elevation Model (DEM) data must always be considered. Especially in the airborne case, the antenna gain pattern can be an additional source of radiometric error, because the radar look angle is not known precisely as a result of the aircraft motions and the local surface topography. Consequently, radiometric distortions due to the antenna gain pattern must also be corrected for each resolution cell, by taking into account aircraft displacements (position and attitude) and position of the backscatter element, defined by the DEM data. In this paper, a method to derive an accurate estimation of the backscattering coefficient using NASA/JPL AIRSAR data is presented. The results are evaluated in terms of geometric accuracy, radiometric variations of sigma(exp 0), and precision of the estimated forest biomass.

  16. Accurate photometric redshift probability density estimation - method comparison and application

    NASA Astrophysics Data System (ADS)

    Rau, Markus Michael; Seitz, Stella; Brimioulle, Fabrice; Frank, Eibe; Friedrich, Oliver; Gruen, Daniel; Hoyle, Ben

    2015-10-01

    We introduce an ordinal classification algorithm for photometric redshift estimation, which significantly improves the reconstruction of photometric redshift probability density functions (PDFs) for individual galaxies and galaxy samples. As a use case we apply our method to CFHTLS galaxies. The ordinal classification algorithm treats distinct redshift bins as ordered values, which improves the quality of photometric redshift PDFs, compared with non-ordinal classification architectures. We also propose a new single value point estimate of the galaxy redshift, which can be used to estimate the full redshift PDF of a galaxy sample. This method is competitive in terms of accuracy with contemporary algorithms, which stack the full redshift PDFs of all galaxies in the sample, but requires orders of magnitude less storage space. The methods described in this paper greatly improve the log-likelihood of individual object redshift PDFs, when compared with a popular neural network code (ANNZ). In our use case, this improvement reaches 50 per cent for high-redshift objects (z ≥ 0.75). We show that using these more accurate photometric redshift PDFs will lead to a reduction in the systematic biases by up to a factor of 4, when compared with less accurate PDFs obtained from commonly used methods. The cosmological analyses we examine and find improvement upon are the following: gravitational lensing cluster mass estimates, modelling of angular correlation functions and modelling of cosmic shear correlation functions.

  17. Accurate Satellite-Derived Estimates of Tropospheric Ozone Radiative Forcing

    NASA Technical Reports Server (NTRS)

    Joiner, Joanna; Schoeberl, Mark R.; Vasilkov, Alexander P.; Oreopoulos, Lazaros; Platnick, Steven; Livesey, Nathaniel J.; Levelt, Pieternel F.

    2008-01-01

    Estimates of the radiative forcing due to anthropogenically-produced tropospheric O3 are derived primarily from models. Here, we use tropospheric ozone and cloud data from several instruments in the A-train constellation of satellites as well as information from the GEOS-5 Data Assimilation System to accurately estimate the instantaneous radiative forcing from tropospheric O3 for January and July 2005. We improve upon previous estimates of tropospheric ozone mixing ratios from a residual approach using the NASA Earth Observing System (EOS) Aura Ozone Monitoring Instrument (OMI) and Microwave Limb Sounder (MLS) by incorporating cloud pressure information from OMI. Since we cannot distinguish between natural and anthropogenic sources with the satellite data, our estimates reflect the total forcing due to tropospheric O3. We focus specifically on the magnitude and spatial structure of the cloud effect on both the short- and long-wave radiative forcing. The estimates presented here can be used to validate present day O3 radiative forcing produced by models.

  18. Accurate estimators of correlation functions in Fourier space

    NASA Astrophysics Data System (ADS)

    Sefusatti, E.; Crocce, M.; Scoccimarro, R.; Couchman, H. M. P.

    2016-08-01

    Efficient estimators of Fourier-space statistics for large numbers of objects rely on fast Fourier transforms (FFTs), which are affected by aliasing from unresolved small-scale modes due to the finite FFT grid. Aliasing takes the form of a sum over images, each of them corresponding to the Fourier content displaced by increasing multiples of the sampling frequency of the grid. These spurious contributions limit the accuracy in the estimation of Fourier-space statistics, and are typically ameliorated by simultaneously increasing grid size and discarding high-frequency modes. This results in inefficient estimates for, e.g., the power spectrum when the desired systematic biases are well under the per cent level. We show that using interlaced grids removes odd images, which include the dominant contribution to aliasing. In addition, we discuss the choice of interpolation kernel used to define density perturbations on the FFT grid and demonstrate that using higher order interpolation kernels than the standard Cloud-In-Cell algorithm results in significant reduction of the remaining images. We show that combining fourth-order interpolation with interlacing gives very accurate Fourier amplitudes and phases of density perturbations. This results in power spectrum and bispectrum estimates that have systematic biases below 0.01 per cent all the way to the Nyquist frequency of the grid, thus maximizing the use of unbiased Fourier coefficients for a given grid size and greatly reducing systematics for applications to large cosmological data sets.

  19. Accurate heart rate estimation from camera recording via MUSIC algorithm.

    PubMed

    Fouladi, Seyyed Hamed; Balasingham, Ilangko; Ramstad, Tor Audun; Kansanen, Kimmo

    2015-01-01

    In this paper, we propose an algorithm to extract heart rate frequency from a video camera using the Multiple SIgnal Classification (MUSIC) algorithm. This leads to improved accuracy of the estimated heart rate frequency in cases where the performance is limited by the number of samples and the frame rate. Monitoring vital signs remotely can be exploited for both non-contact physiological and psychological diagnosis. The color variation recorded by ordinary cameras is used for heart rate monitoring. The orthogonality between signal space and noise space is used to find a more accurate heart rate frequency in comparison with traditional methods. It is shown via experimental results that the limitation of previous methods can be overcome by using subspace methods. PMID:26738015
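
    MUSIC is a standard subspace method, so the record's core idea can be sketched directly: build a sample correlation matrix from the mean-removed colour-variation trace, split signal and noise subspaces by eigendecomposition, and scan a pseudospectrum over the plausible heart-rate band. The trace below is synthetic and the embedding dimension and band limits are illustrative, not the paper's settings.

```python
import numpy as np

def music_spectrum(x, fs, subspace_dim, n_sources, freqs):
    """MUSIC pseudospectrum of a 1-D real signal.

    x            : signal samples (e.g. a mean green-channel trace)
    fs           : frame rate (Hz)
    subspace_dim : embedding dimension m of the correlation matrix
    n_sources    : assumed number of real sinusoidal components
    freqs        : candidate frequencies (Hz) to scan
    """
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    m = subspace_dim
    snaps = np.array([x[i:i + m] for i in range(len(x) - m + 1)])
    R = snaps.T @ snaps / snaps.shape[0]          # sample correlation matrix
    w, V = np.linalg.eigh(R)                      # eigenvalues in ascending order
    noise = V[:, : m - 2 * n_sources]             # noise subspace (2 eigvecs per real sinusoid)
    k = np.arange(m)
    pseudo = []
    for f in freqs:
        a = np.exp(2j * np.pi * f / fs * k)       # steering vector
        pseudo.append(1.0 / np.real(np.conj(a) @ (noise @ (noise.conj().T @ a))))
    return np.array(pseudo)

# Synthetic pulse trace: 1.2 Hz (72 bpm) component plus noise, sampled at 30 fps
fs = 30.0
t = np.arange(0, 20, 1 / fs)
x = 0.05 * np.sin(2 * np.pi * 1.2 * t) + 0.03 * np.random.default_rng(1).normal(size=t.size)
freqs = np.linspace(0.7, 3.0, 500)                # 42-180 bpm search band
hr_hz = freqs[np.argmax(music_spectrum(x, fs, subspace_dim=40, n_sources=1, freqs=freqs))]
print(round(hr_hz * 60, 1), "bpm")                # close to 72 bpm
```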

  20. Accurate Orientation Estimation Using AHRS under Conditions of Magnetic Distortion

    PubMed Central

    Yadav, Nagesh; Bleakley, Chris

    2014-01-01

    Low cost, compact attitude heading reference systems (AHRS) are now being used to track human body movements in indoor environments by estimation of the 3D orientation of body segments. In many of these systems, heading estimation is achieved by monitoring the strength of the Earth's magnetic field. However, the Earth's magnetic field can be locally distorted due to the proximity of ferrous and/or magnetic objects. Herein, we propose a novel method for accurate 3D orientation estimation using an AHRS, comprised of an accelerometer, gyroscope and magnetometer, under conditions of magnetic field distortion. The system performs online detection and compensation for magnetic disturbances, due to, for example, the presence of ferrous objects. The magnetic distortions are detected by exploiting variations in magnetic dip angle, relative to the gravity vector, and in magnetic strength. We investigate and show the advantages of using both magnetic strength and magnetic dip angle for detecting the presence of magnetic distortions. The correction method is based on a particle filter, which performs the correction using an adaptive cost function and by adapting the variance during particle resampling, so as to place more emphasis on the results of dead reckoning of the gyroscope measurements and less on the magnetometer readings. The proposed method was tested in an indoor environment in the presence of various magnetic distortions and under various accelerations (up to 3 g). In the experiments, the proposed algorithm achieves <2° static peak-to-peak error and <5° dynamic peak-to-peak error, significantly outperforming previous methods. PMID:25347584
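
    A minimal sketch of the detection idea only: flag samples whose field magnitude or dip angle (relative to the gravity direction from the accelerometer) departs from reference values. The particle-filter correction described in the record is not reproduced, and the thresholds are placeholders.

```python
import numpy as np

def magnetic_distortion_flags(mag, acc, b_ref, dip_ref_deg,
                              mag_tol=5.0, dip_tol_deg=5.0):
    """Flag samples whose magnetic field looks distorted.

    mag, acc    : (N, 3) magnetometer (uT) and accelerometer (gravity direction) samples
    b_ref       : expected local field strength (uT)
    dip_ref_deg : expected angle between the field and the gravity vector (degrees)
    Returns a boolean array, True where a distortion is suspected.
    """
    b_norm = np.linalg.norm(mag, axis=1)
    g_hat = acc / np.linalg.norm(acc, axis=1, keepdims=True)
    m_hat = mag / b_norm[:, None]
    # Angle between the measured field and the gravity vector
    dip = np.degrees(np.arccos(np.clip(np.sum(m_hat * g_hat, axis=1), -1.0, 1.0)))
    return (np.abs(b_norm - b_ref) > mag_tol) | (np.abs(dip - dip_ref_deg) > dip_tol_deg)
```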

  1. Simulation model accurately estimates total dietary iodine intake.

    PubMed

    Verkaik-Kloosterman, Janneke; van 't Veer, Pieter; Ocké, Marga C

    2009-07-01

    One problem with estimating iodine intake is the lack of detailed data about the discretionary use of iodized kitchen salt and iodization of industrially processed foods. To be able to take into account these uncertainties in estimating iodine intake, a simulation model combining deterministic and probabilistic techniques was developed. Data from the Dutch National Food Consumption Survey (1997-1998) and an update of the Food Composition database were used to simulate 3 different scenarios: Dutch iodine legislation until July 2008, Dutch iodine legislation after July 2008, and a potential future situation. Results from studies measuring iodine excretion during the former legislation are comparable with the iodine intakes estimated with our model. For both former and current legislation, iodine intake was adequate for a large part of the Dutch population, but some young children (<5%) were at risk of intakes that were too low. In the scenario of a potential future situation using lower salt iodine levels, the percentage of the Dutch population with intakes that were too low increased (almost 10% of young children). To keep iodine intakes adequate, salt iodine levels should not be decreased, unless many more foods will contain iodized salt. Our model should be useful in predicting the effects of food reformulation or fortification on habitual nutrient intakes.

  2. Estimates of radiogenic cancer risks

    SciTech Connect

    Puskin, J.S.; Nelson, C.B.

    1995-07-01

    A methodology recently developed by the U.S. EPA for estimating the carcinogenic risks from ionizing radiation is described. For most cancer sites, the risk model is one in which age-specific, relative risk coefficients are obtained by taking a geometric mean of the coefficients derived from the atomic bomb survivor data using two different methods for transporting risks from the Japanese to the U.S. population. The risk models are applied to estimate organ-specific risks per unit dose for a stationary population with mortality rates governed by 1980 U.S. vital statistics. With the exception of breast cancer, low-LET radiogenic cancer risk estimates are reduced by a factor of 2 at low doses and dose rates compared to acute high dose exposure conditions. For low dose (or dose rate) conditions, the risk of inducing a premature cancer death from uniform, whole body, low-LET irradiation is calculated to be 5.1 × 10⁻² Gy⁻¹. Neglecting nonfatal skin cancers, the corresponding incidence risk is 7.6 × 10⁻² Gy⁻¹. High-LET (alpha particle) risks are presumed to increase linearly with dose and to be independent of dose rate. High-LET risks are estimated to be 20 times the low-LET risks estimated under low dose rate conditions, except for leukemia and breast cancer where RBEs of 1 and 10 are adopted, respectively. 29 refs., 3 tabs.
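
    Using only the coefficients quoted in this abstract, the risk figures can be applied as simple multiplications; this is a back-of-the-envelope illustration with a hypothetical dose, not the EPA methodology itself.

```python
# Risk coefficients quoted in the abstract (uniform whole-body irradiation)
LOW_LET_FATAL_PER_GY = 5.1e-2        # low dose / low dose rate, premature cancer death
LOW_LET_INCIDENCE_PER_GY = 7.6e-2    # incidence, excluding nonfatal skin cancers
ALPHA_RBE = 20.0                     # high-LET (alpha) multiplier, except leukemia and breast

dose_gy = 0.01                       # hypothetical 10 mGy low-LET dose
print("premature cancer death risk:", LOW_LET_FATAL_PER_GY * dose_gy)      # 5.1e-4
print("cancer incidence risk:      ", LOW_LET_INCIDENCE_PER_GY * dose_gy)  # 7.6e-4
```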

  3. [Medical insurance estimation of risks].

    PubMed

    Dunér, H

    1975-11-01

    The purpose of insurance medicine is to make a prognostic estimate of medical risk-factors in persons who apply for life, health, or accident insurance. Established risk-groups with a calculated average mortality and morbidity form the basis for premium rates and insurance terms. In most cases the applicant is accepted for insurance after a self-assessment of his health. Only around one per cent of the applications are refused, but there are cases in which the premium is raised, temporarily or permanently. It is often a matter of rough estimate, since the knowledge of the long-term prognosis for many diseases is incomplete. The insurance companies' rules for the estimation of risk are revised at intervals of three or four years. The estimate of risk as regards life insurance has been gradually liberalised, while the medical conditions for health insurance have become stricter owing to an increase in the claims rate.

  4. Reinforcing flood-risk estimation.

    PubMed

    Reed, Duncan W

    2002-07-15

    Flood-frequency estimation is inherently uncertain. The practitioner applies a combination of gauged data, scientific method and hydrological judgement to derive a flood-frequency curve for a particular site. The resulting estimate can be thought fully satisfactory only if it is broadly consistent with all that is reliably known about the flood-frequency behaviour of the river. The paper takes as its main theme the search for information to strengthen a flood-risk estimate made from peak flows alone. Extra information comes in many forms, including documentary and monumental records of historical floods, and palaeological markers. Meteorological information is also useful, although rainfall rarity is difficult to assess objectively and can be a notoriously unreliable indicator of flood rarity. On highly permeable catchments, groundwater levels present additional data. Other types of information are relevant to judging hydrological similarity when the flood-frequency estimate derives from data pooled across several catchments. After highlighting information sources, the paper explores a second theme: that of consistency in flood-risk estimates. Following publication of the Flood estimation handbook, studies of flood risk are now using digital catchment data. Automated calculation methods allow estimates by standard methods to be mapped basin-wide, revealing anomalies at special sites such as river confluences. Such mapping presents collateral information of a new character. Can this be used to achieve flood-risk estimates that are coherent throughout a river basin? PMID:12804255

  5. Reinforcing flood-risk estimation.

    PubMed

    Reed, Duncan W

    2002-07-15

    Flood-frequency estimation is inherently uncertain. The practitioner applies a combination of gauged data, scientific method and hydrological judgement to derive a flood-frequency curve for a particular site. The resulting estimate can be thought fully satisfactory only if it is broadly consistent with all that is reliably known about the flood-frequency behaviour of the river. The paper takes as its main theme the search for information to strengthen a flood-risk estimate made from peak flows alone. Extra information comes in many forms, including documentary and monumental records of historical floods, and palaeological markers. Meteorological information is also useful, although rainfall rarity is difficult to assess objectively and can be a notoriously unreliable indicator of flood rarity. On highly permeable catchments, groundwater levels present additional data. Other types of information are relevant to judging hydrological similarity when the flood-frequency estimate derives from data pooled across several catchments. After highlighting information sources, the paper explores a second theme: that of consistency in flood-risk estimates. Following publication of the Flood estimation handbook, studies of flood risk are now using digital catchment data. Automated calculation methods allow estimates by standard methods to be mapped basin-wide, revealing anomalies at special sites such as river confluences. Such mapping presents collateral information of a new character. Can this be used to achieve flood-risk estimates that are coherent throughout a river basin?

  6. Fast and Accurate Learning When Making Discrete Numerical Estimates.

    PubMed

    Sanborn, Adam N; Beierholm, Ulrik R

    2016-04-01

    Many everyday estimation tasks have an inherently discrete nature, whether the task is counting objects (e.g., a number of paint buckets) or estimating discretized continuous variables (e.g., the number of paint buckets needed to paint a room). While Bayesian inference is often used for modeling estimates made along continuous scales, discrete numerical estimates have not received as much attention, despite their common everyday occurrence. Using two tasks, a numerosity task and an area estimation task, we invoke Bayesian decision theory to characterize how people learn discrete numerical distributions and make numerical estimates. Across three experiments with novel stimulus distributions we found that participants fell between two common decision functions for converting their uncertain representation into a response: drawing a sample from their posterior distribution and taking the maximum of their posterior distribution. While this was consistent with the decision function found in previous work using continuous estimation tasks, surprisingly the prior distributions learned by participants in our experiments were much more adaptive: When making continuous estimates, participants have required thousands of trials to learn bimodal priors, but in our tasks participants learned discrete bimodal and even discrete quadrimodal priors within a few hundred trials. This makes discrete numerical estimation tasks good testbeds for investigating how people learn and make estimates. PMID:27070155

  7. Fast and Accurate Learning When Making Discrete Numerical Estimates.

    PubMed

    Sanborn, Adam N; Beierholm, Ulrik R

    2016-04-01

    Many everyday estimation tasks have an inherently discrete nature, whether the task is counting objects (e.g., a number of paint buckets) or estimating discretized continuous variables (e.g., the number of paint buckets needed to paint a room). While Bayesian inference is often used for modeling estimates made along continuous scales, discrete numerical estimates have not received as much attention, despite their common everyday occurrence. Using two tasks, a numerosity task and an area estimation task, we invoke Bayesian decision theory to characterize how people learn discrete numerical distributions and make numerical estimates. Across three experiments with novel stimulus distributions we found that participants fell between two common decision functions for converting their uncertain representation into a response: drawing a sample from their posterior distribution and taking the maximum of their posterior distribution. While this was consistent with the decision function found in previous work using continuous estimation tasks, surprisingly the prior distributions learned by participants in our experiments were much more adaptive: When making continuous estimates, participants have required thousands of trials to learn bimodal priors, but in our tasks participants learned discrete bimodal and even discrete quadrimodal priors within a few hundred trials. This makes discrete numerical estimation tasks good testbeds for investigating how people learn and make estimates.

  8. Fast and Accurate Learning When Making Discrete Numerical Estimates

    PubMed Central

    Sanborn, Adam N.; Beierholm, Ulrik R.

    2016-01-01

    Many everyday estimation tasks have an inherently discrete nature, whether the task is counting objects (e.g., a number of paint buckets) or estimating discretized continuous variables (e.g., the number of paint buckets needed to paint a room). While Bayesian inference is often used for modeling estimates made along continuous scales, discrete numerical estimates have not received as much attention, despite their common everyday occurrence. Using two tasks, a numerosity task and an area estimation task, we invoke Bayesian decision theory to characterize how people learn discrete numerical distributions and make numerical estimates. Across three experiments with novel stimulus distributions we found that participants fell between two common decision functions for converting their uncertain representation into a response: drawing a sample from their posterior distribution and taking the maximum of their posterior distribution. While this was consistent with the decision function found in previous work using continuous estimation tasks, surprisingly the prior distributions learned by participants in our experiments were much more adaptive: When making continuous estimates, participants have required thousands of trials to learn bimodal priors, but in our tasks participants learned discrete bimodal and even discrete quadrimodal priors within a few hundred trials. This makes discrete numerical estimation tasks good testbeds for investigating how people learn and make estimates. PMID:27070155
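
    The two decision functions named in the preceding records are easy to state concretely: draw a sample from the posterior, or take its maximum (the MAP estimate). The discrete bimodal posterior below is arbitrary and purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# A discrete (here bimodal) posterior over possible counts 0..9, purely illustrative
values = np.arange(10)
posterior = np.array([0.01, 0.05, 0.20, 0.10, 0.02, 0.02, 0.10, 0.25, 0.20, 0.05])
posterior = posterior / posterior.sum()

# Decision function 1: draw a sample from the posterior ("probability matching")
sampled_estimate = rng.choice(values, p=posterior)

# Decision function 2: take the maximum of the posterior (MAP estimate)
map_estimate = values[np.argmax(posterior)]

print(sampled_estimate, map_estimate)
```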

  9. Bioaccessibility tests accurately estimate bioavailability of lead to quail

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Hazards of soil-borne Pb to wild birds may be more accurately quantified if the bioavailability of that Pb is known. To better understand the bioavailability of Pb, we incorporated Pb-contaminated soils or Pb acetate into diets for Japanese quail (Coturnix japonica), fed the quail for 15 days, and ...

  10. BIOACCESSIBILITY TESTS ACCURATELY ESTIMATE BIOAVAILABILITY OF LEAD TO QUAIL

    EPA Science Inventory

    Hazards of soil-borne Pb to wild birds may be more accurately quantified if the bioavailability of that Pb is known. To better understand the bioavailability of Pb to birds, we measured blood Pb concentrations in Japanese quail (Coturnix japonica) fed diets containing Pb-contami...

  11. Does more accurate exposure prediction necessarily improve health effect estimates?

    PubMed

    Szpiro, Adam A; Paciorek, Christopher J; Sheppard, Lianne

    2011-09-01

    A unique challenge in air pollution cohort studies and similar applications in environmental epidemiology is that exposure is not measured directly at subjects' locations. Instead, pollution data from monitoring stations at some distance from the study subjects are used to predict exposures, and these predicted exposures are used to estimate the health effect parameter of interest. It is usually assumed that minimizing the error in predicting the true exposure will improve health effect estimation. We show in a simulation study that this is not always the case. We interpret our results in light of recently developed statistical theory for measurement error, and we discuss implications for the design and analysis of epidemiologic research.

  12. Accurate feature detection and estimation using nonlinear and multiresolution analysis

    NASA Astrophysics Data System (ADS)

    Rudin, Leonid; Osher, Stanley

    1994-11-01

    A program for feature detection and estimation using nonlinear and multiscale analysis was completed. The state-of-the-art edge detection was combined with multiscale restoration (as suggested by the first author) and robust results in the presence of noise were obtained. Successful applications to numerous images of interest to DOD were made. Also, a new market in the criminal justice field was developed, based, in part, on this work.

  13. Bioaccessibility tests accurately estimate bioavailability of lead to quail

    USGS Publications Warehouse

    Beyer, W. Nelson; Basta, Nicholas T; Chaney, Rufus L.; Henry, Paula F.; Mosby, David; Rattner, Barnett A.; Scheckel, Kirk G.; Sprague, Dan; Weber, John

    2016-01-01

    Hazards of soil-borne Pb to wild birds may be more accurately quantified if the bioavailability of that Pb is known. To better understand the bioavailability of Pb to birds, we measured blood Pb concentrations in Japanese quail (Coturnix japonica) fed diets containing Pb-contaminated soils. Relative bioavailabilities were expressed by comparison with blood Pb concentrations in quail fed a Pb acetate reference diet. Diets containing soil from five Pb-contaminated Superfund sites had relative bioavailabilities from 33%-63%, with a mean of about 50%. Treatment of two of the soils with phosphorus significantly reduced the bioavailability of Pb. Bioaccessibility of Pb in the test soils was then measured in six in vitro tests and regressed on bioavailability. They were: the “Relative Bioavailability Leaching Procedure” (RBALP) at pH 1.5, the same test conducted at pH 2.5, the “Ohio State University In vitro Gastrointestinal” method (OSU IVG), the “Urban Soil Bioaccessible Lead Test”, the modified “Physiologically Based Extraction Test” and the “Waterfowl Physiologically Based Extraction Test.” All regressions had positive slopes. Based on criteria of slope and coefficient of determination, the RBALP pH 2.5 and OSU IVG tests performed very well. Speciation by X-ray absorption spectroscopy demonstrated that, on average, most of the Pb in the sampled soils was sorbed to minerals (30%), bound to organic matter (24%), or present as Pb sulfate (18%). Additional Pb was associated with P (chloropyromorphite, hydroxypyromorphite and tertiary Pb phosphate), and with Pb carbonates, leadhillite (a lead sulfate carbonate hydroxide), and Pb sulfide. The formation of chloropyromorphite reduced the bioavailability of Pb and the amendment of Pb-contaminated soils with P may be a thermodynamically favored means to sequester Pb.

  14. Bioaccessibility tests accurately estimate bioavailability of lead to quail.

    PubMed

    Beyer, W Nelson; Basta, Nicholas T; Chaney, Rufus L; Henry, Paula F P; Mosby, David E; Rattner, Barnett A; Scheckel, Kirk G; Sprague, Daniel T; Weber, John S

    2016-09-01

    Hazards of soil-borne lead (Pb) to wild birds may be more accurately quantified if the bioavailability of that Pb is known. To better understand the bioavailability of Pb to birds, the authors measured blood Pb concentrations in Japanese quail (Coturnix japonica) fed diets containing Pb-contaminated soils. Relative bioavailabilities were expressed by comparison with blood Pb concentrations in quail fed a Pb acetate reference diet. Diets containing soil from 5 Pb-contaminated Superfund sites had relative bioavailabilities from 33% to 63%, with a mean of approximately 50%. Treatment of 2 of the soils with phosphorus (P) significantly reduced the bioavailability of Pb. Bioaccessibility of Pb in the test soils was then measured in 6 in vitro tests and regressed on bioavailability: the relative bioavailability leaching procedure at pH 1.5, the same test conducted at pH 2.5, the Ohio State University in vitro gastrointestinal method, the urban soil bioaccessible lead test, the modified physiologically based extraction test, and the waterfowl physiologically based extraction test. All regressions had positive slopes. Based on criteria of slope and coefficient of determination, the relative bioavailability leaching procedure at pH 2.5 and Ohio State University in vitro gastrointestinal tests performed very well. Speciation by X-ray absorption spectroscopy demonstrated that, on average, most of the Pb in the sampled soils was sorbed to minerals (30%), bound to organic matter (24%), or present as Pb sulfate (18%). Additional Pb was associated with P (chloropyromorphite, hydroxypyromorphite, and tertiary Pb phosphate) and with Pb carbonates, leadhillite (a lead sulfate carbonate hydroxide), and Pb sulfide. The formation of chloropyromorphite reduced the bioavailability of Pb, and the amendment of Pb-contaminated soils with P may be a thermodynamically favored means to sequester Pb. Environ Toxicol Chem 2016;35:2311-2319. Published 2016 Wiley Periodicals Inc. on behalf of
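
    The regression step described in the bioaccessibility records above (in vitro bioaccessibility regressed on in vivo relative bioavailability, judged by slope and coefficient of determination) follows the ordinary simple-linear-regression recipe. The values below are made up for illustration and are not the study's data.

```python
import numpy as np
from scipy.stats import linregress

# Hypothetical values (percent), NOT the study's data: relative bioavailability of Pb
# in several test soils and the bioaccessibility returned by one in vitro extraction.
bioavailability = np.array([33.0, 41.0, 48.0, 55.0, 63.0])
bioaccessibility = np.array([30.0, 44.0, 50.0, 58.0, 66.0])

fit = linregress(bioavailability, bioaccessibility)
print(f"slope={fit.slope:.2f}, r^2={fit.rvalue**2:.2f}")  # positive slope, high r^2
```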

  15. New ventures require accurate risk analyses and adjustments.

    PubMed

    Eastaugh, S R

    2000-01-01

    For new business ventures to succeed, healthcare executives need to conduct robust risk analyses and develop new approaches to balance risk and return. Risk analysis involves examination of objective risks and harder-to-quantify subjective risks. Mathematical principles applied to investment portfolios also can be applied to a portfolio of departments or strategic business units within an organization. The ideal business investment would have a high expected return and a low standard deviation. Nonetheless, both conservative and speculative strategies should be considered in determining an organization's optimal service line and helping the organization manage risk.

  16. New Cardiovascular Risk Factors and Their Use for an Accurate Cardiovascular Risk Assessment in Hypertensive Patients

    PubMed Central

    TAUTU, Oana-Florentina; DARABONT, Roxana; ONCIUL, Sebastian; DEACONU, Alexandru; COMANESCU, Ioana; ANDREI, Radu Dan; DRAGOESCU, Bogdan; CINTEZA, Mircea; DOROBANTU, Maria

    2014-01-01

    Objectives: To analyze the predictive value of new cardiovascular (CV) risk factors for CV risk assessment in the adult Romanian hypertensive (HT) population. Methods: Hypertensive adults aged 40-65 years, identified in the nationally representative SEPHAR II survey, were evaluated by anthropometric, BP and arterial stiffness measurements: aortic pulse wave velocity (PWVao), aortic augmentation index (AIXao), return time (RT) and central systolic blood pressure (SBPao), 12-lead ECGs and laboratory workup. Values above the 4th quartile of mean SBP's standard deviation (s.d.) defined increased BP variability. Log(TG/HDL-cholesterol) defined the atherogenic index of plasma (AIP). Serum uric acid levels above 5.70 mg/dl for women and 7.0 mg/dl for males defined hyperuricemia (HUA). CV risk was assessed based on the SCORE chart for high CV risk countries. Binary logistic regression using a stepwise likelihood ratio method (adjustments for major confounders and collinearity analysis) was used in order to validate predictors of the high and very high CV risk class. Results: The mean SBP value of the study group was 148.46±19.61 mmHg. Over forty percent of hypertensives had a high or very high CV risk. Predictors of the high/very high CV risk category validated by regression analysis were: increased visit-to-visit BP variability (OR: 2.49; 95%CI: 1.67-3.73), PWVao (OR: 1.12; 95%CI: 1.02-1.22), RT (OR: 0.95; 95% CI: 0.93-0.98), SBPao (OR: 1.01; 95%CI: 1.01-1.03) and AIP (OR: 7.08; 95%CI: 3.91-12.82). Conclusion: The results of our study suggest that the new CV risk factors such as increased BP variability, arterial stiffness indices and AIP are useful tools for a more accurate identification of hypertensive patients at high and very high CV risk. PMID:25705267
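
    The atherogenic index of plasma defined in the abstract, log(TG/HDL-cholesterol), and the binary logistic regression step can be sketched as follows. The cohort is simulated, the outcome model is invented for the demo, and scikit-learn's default solver stands in for the stepwise likelihood-ratio procedure used in the study.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)
n = 200

# Hypothetical hypertensive cohort: TG and HDL (mg/dl), visit-to-visit SBP s.d., PWVao (m/s)
tg = rng.lognormal(mean=np.log(140), sigma=0.35, size=n)
hdl = rng.normal(50, 10, size=n).clip(25, 90)
sbp_sd = rng.normal(12, 4, size=n).clip(2, None)
pwvao = rng.normal(9.5, 1.8, size=n).clip(5, None)

aip = np.log10(tg / hdl)                                           # atherogenic index of plasma
high_bp_var = (sbp_sd > np.quantile(sbp_sd, 0.75)).astype(float)   # above the 4th quartile

# Hypothetical outcome: membership in the high / very high SCORE risk class
logit = -6.0 + 2.0 * aip + 0.9 * high_bp_var + 0.3 * pwvao
y = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X = np.column_stack([aip, high_bp_var, pwvao])
model = LogisticRegression().fit(X, y)
print(np.exp(model.coef_))                                         # odds ratios per predictor
```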

  17. Thinking Concretely Increases the Perceived Likelihood of Risks: The Effect of Construal Level on Risk Estimation.

    PubMed

    Lermer, Eva; Streicher, Bernhard; Sachs, Rainer; Raue, Martina; Frey, Dieter

    2016-03-01

    Recent findings on construal level theory (CLT) suggest that abstract thinking leads to a lower estimated probability of an event occurring compared to concrete thinking. We applied this idea to the risk context and explored the influence of construal level (CL) on the overestimation of small and underestimation of large probabilities for risk estimates concerning a vague target person (Study 1 and Study 3) and personal risk estimates (Study 2). We were specifically interested in whether the often-found overestimation of small probabilities could be reduced with abstract thinking, and whether the often-found underestimation of large probabilities could be reduced with concrete thinking. The results showed that CL influenced risk estimates. In particular, a concrete mindset led to higher risk estimates compared to an abstract mindset for several adverse events, including events with small and large probabilities. This suggests that CL manipulation can indeed be used for improving the accuracy of lay people's estimates of small and large probabilities. Moreover, the results suggest that professional risk managers' risk estimates of common events (thus with a relatively high probability) could be improved by adopting a concrete mindset. However, the abstract manipulation did not lead managers to estimate extremely unlikely events more accurately. Potential reasons for different CL manipulation effects on risk estimates' accuracy between lay people and risk managers are discussed.

  18. A fast and accurate frequency estimation algorithm for sinusoidal signal with harmonic components

    NASA Astrophysics Data System (ADS)

    Hu, Jinghua; Pan, Mengchun; Zeng, Zhidun; Hu, Jiafei; Chen, Dixiang; Tian, Wugang; Zhao, Jianqiang; Du, Qingfa

    2016-10-01

    Frequency estimation is a fundamental problem in many applications, such as traditional vibration measurement, power system supervision, and microelectromechanical system sensors control. In this paper, a fast and accurate frequency estimation algorithm is proposed to deal with the low efficiency problem in traditional methods. The proposed algorithm consists of coarse and fine frequency estimation steps, and we demonstrate that it is more efficient than conventional searching methods to achieve coarse frequency estimation (locating the peak of the FFT amplitude spectrum) by applying a modified zero-crossing technique. Thus, the proposed estimation algorithm requires fewer hardware and software resources and can achieve even higher efficiency when the experimental data increase. Experimental results with a modulated magnetic signal show that the root mean square error of frequency estimation is below 0.032 Hz with the proposed algorithm, which has lower computational complexity and better global performance than conventional frequency estimation methods.
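
    The paper's modified zero-crossing step is not described in enough detail here to reproduce; the sketch below illustrates the general coarse-plus-fine pattern instead, using the FFT amplitude peak as the coarse estimate and standard parabolic interpolation of the neighbouring bins as the refinement.

```python
import numpy as np

def estimate_frequency(x, fs):
    """Coarse FFT-peak estimate refined by parabolic (quadratic) interpolation."""
    x = x - np.mean(x)
    spec = np.abs(np.fft.rfft(x * np.hanning(len(x))))
    k = int(np.argmax(spec[1:])) + 1                  # coarse step: peak bin (skip DC)
    a, b, c = spec[k - 1], spec[k], spec[k + 1]
    delta = 0.5 * (a - c) / (a - 2 * b + c)           # fine step: sub-bin offset
    return (k + delta) * fs / len(x)

fs = 1000.0
t = np.arange(4096) / fs
x = np.sin(2 * np.pi * 123.37 * t) + 0.3 * np.sin(2 * np.pi * 246.74 * t)  # with a harmonic
print(estimate_frequency(x, fs))   # close to 123.37 Hz
```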

  19. Development of Classification and Story Building Data for Accurate Earthquake Damage Estimation

    NASA Astrophysics Data System (ADS)

    Sakai, Yuki; Fukukawa, Noriko; Arai, Kensuke

    We investigated a method of developing classification and story building data from a census population database in order to estimate earthquake damage more accurately, especially in urban areas, presuming that there is a correlation between the numbers of non-wooden or high-rise buildings and the population. We formulated equations for estimating the numbers of wooden houses, low-to-mid-rise (1-9 story) and high-rise (over 10 story) non-wooden buildings in a 1-km mesh from night-time and daytime population databases, based on the building data we investigated and collected in 20 selected meshes in the Kanto area. We could accurately estimate the numbers of the three building classes with the formulated equations, but in some special cases, such as the apartment-block mesh, the estimated values are quite different from the actual values.

  20. Estimating the re-identification risk of clinical data sets

    PubMed Central

    2012-01-01

    Background De-identification is a common way to protect patient privacy when disclosing clinical data for secondary purposes, such as research. One type of attack that de-identification protects against is linking the disclosed patient data with public and semi-public registries. Uniqueness is a commonly used measure of re-identification risk under this attack. If uniqueness can be measured accurately then the risk from this kind of attack can be managed. In practice, it is often not possible to measure uniqueness directly, therefore it must be estimated. Methods We evaluated the accuracy of uniqueness estimators on clinically relevant data sets. Four candidate estimators were identified because they were evaluated in the past and found to have good accuracy or because they were new and not evaluated comparatively before: the Zayatz estimator, slide negative binomial estimator, Pitman’s estimator, and mu-argus. A Monte Carlo simulation was performed to evaluate the uniqueness estimators on six clinically relevant data sets. We varied the sampling fraction and the uniqueness in the population (the value being estimated). The median relative error and inter-quartile range of the uniqueness estimates was measured across 1000 runs. Results There was no single estimator that performed well across all of the conditions. We developed a decision rule which selected between the Pitman, slide negative binomial and Zayatz estimators depending on the sampling fraction and the difference between estimates. This decision rule had the best consistent median relative error across multiple conditions and data sets. Conclusion This study identified an accurate decision rule that can be used by health privacy researchers and disclosure control professionals to estimate uniqueness in clinical data sets. The decision rule provides a reliable way to measure re-identification risk. PMID:22776564
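
    Population uniqueness has to be estimated, but sample uniqueness, the quantity such estimators extrapolate from, can be computed directly from the disclosed data. A minimal sketch with hypothetical quasi-identifiers follows; the Pitman, Zayatz and negative binomial estimators themselves are more involved and are not shown.

```python
import pandas as pd

# Hypothetical quasi-identifiers in a disclosed clinical sample
sample = pd.DataFrame({
    "year_of_birth": [1950, 1950, 1987, 1987, 1963, 1990, 1990, 1990],
    "sex":           ["F",  "F",  "M",  "M",  "F",  "M",  "M",  "F"],
    "region":        ["NE", "NE", "SW", "SW", "NE", "SW", "SW", "NE"],
})

class_sizes = sample.groupby(list(sample.columns)).size()
sample_uniques = int((class_sizes == 1).sum())       # equivalence classes of size 1
print(f"{sample_uniques} sample-unique records out of {len(sample)} "
      f"({sample_uniques / len(sample):.0%})")
```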

  1. On the accurate estimation of gap fraction during daytime with digital cover photography

    NASA Astrophysics Data System (ADS)

    Hwang, Y. R.; Ryu, Y.; Kimm, H.; Macfarlane, C.; Lang, M.; Sonnentag, O.

    2015-12-01

    Digital cover photography (DCP) has emerged as an indirect method to obtain gap fraction accurately. Thus far, however, the intervention of subjectivity, such as determining the camera relative exposure value (REV) and the threshold in the histogram, has hindered computing accurate gap fraction. Here we propose a novel method that enables us to measure gap fraction accurately during daytime under various sky conditions by DCP. The novel method computes gap fraction using a single unsaturated DCP raw image, which is corrected for scattering effects by canopies, and a sky image reconstructed from the raw-format image. To test the sensitivity of the gap fraction derived with the novel method to diverse REVs, solar zenith angles and canopy structures, we took photos at one-hour intervals between sunrise and midday under dense and sparse canopies with REVs from 0 to -5. The novel method showed little variation of gap fraction across different REVs in both dense and sparse canopies across a diverse range of solar zenith angles. The perforated panel experiment, which was used to test the accuracy of the estimated gap fraction, confirmed that the novel method resulted in accurate and consistent gap fractions across different hole sizes, gap fractions and solar zenith angles. These findings highlight that the novel method opens new opportunities to estimate gap fraction accurately during daytime from sparse to dense canopies, which will be useful in monitoring LAI precisely and validating satellite remote sensing LAI products efficiently.
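
    A minimal sketch of the final computation only: once a sky/canopy mask is available, gap fraction is simply the sky-pixel fraction. A plain intensity threshold on a synthetic image stands in here for the record's scattering correction and sky reconstruction, which are exactly the steps that remove the subjectivity of choosing such a threshold.

```python
import numpy as np

def gap_fraction(image_blue: np.ndarray, sky_threshold: float) -> float:
    """Fraction of sky pixels in an upward-looking cover photograph.

    image_blue    : 2-D array of blue-channel intensities from an unsaturated image
    sky_threshold : intensity above which a pixel is classified as sky
    """
    sky = image_blue > sky_threshold
    return float(sky.mean())

# Hypothetical 100x100 image: bright sky with a darker canopy patch
rng = np.random.default_rng(3)
img = rng.normal(0.8, 0.05, size=(100, 100))
img[20:70, 10:60] = rng.normal(0.2, 0.05, size=(50, 50))     # canopy block
print(round(gap_fraction(img, sky_threshold=0.5), 3))        # about 0.75
```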

  2. Accurate Estimation of the Entropy of Rotation-Translation Probability Distributions.

    PubMed

    Fogolari, Federico; Dongmo Foumthuim, Cedrix Jurgal; Fortuna, Sara; Soler, Miguel Angel; Corazza, Alessandra; Esposito, Gennaro

    2016-01-12

    The estimation of rotational and translational entropies in the context of ligand binding has been the subject of long-time investigations. The high dimensionality (six) of the problem and the limited amount of sampling often prevent the required resolution to provide accurate estimates by the histogram method. Recently, the nearest-neighbor distance method has been applied to the problem, but the solutions provided either address rotation and translation separately, therefore lacking correlations, or use a heuristic approach. Here we address rotational-translational entropy estimation in the context of nearest-neighbor-based entropy estimation, solve the problem numerically, and provide an exact and an approximate method to estimate the full rotational-translational entropy.

  3. Polynomial fitting of DT-MRI fiber tracts allows accurate estimation of muscle architectural parameters.

    PubMed

    Damon, Bruce M; Heemskerk, Anneriet M; Ding, Zhaohua

    2012-06-01

    Fiber curvature is a functionally significant muscle structural property, but its estimation from diffusion-tensor magnetic resonance imaging fiber tracking data may be confounded by noise. The purpose of this study was to investigate the use of polynomial fitting of fiber tracts for improving the accuracy and precision of fiber curvature (κ) measurements. Simulated image data sets were created in order to provide data with known values for κ and pennation angle (θ). Simulations were designed to test the effects of increasing inherent fiber curvature (3.8, 7.9, 11.8 and 15.3 m⁻¹), signal-to-noise ratio (50, 75, 100 and 150) and voxel geometry (13.8- and 27.0-mm³ voxel volume with isotropic resolution; 13.5-mm³ volume with an aspect ratio of 4.0) on κ and θ measurements. In the originally reconstructed tracts, θ was estimated accurately under most curvature and all imaging conditions studied; however, the estimates of κ were imprecise and inaccurate. Fitting the tracts to second-order polynomial functions provided accurate and precise estimates of κ for all conditions except very high curvature (κ = 15.3 m⁻¹), while preserving the accuracy of the θ estimates. Similarly, polynomial fitting of in vivo fiber tracking data reduced the κ values of fitted tracts from those of unfitted tracts and did not change the θ values. Polynomial fitting of fiber tracts allows accurate estimation of physiologically reasonable values of κ, while preserving the accuracy of θ estimation.
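
    The fitting-then-curvature step can be sketched as follows: fit each coordinate to a second-order polynomial of arc length and evaluate the standard space-curve curvature κ = |r′ × r″| / |r′|³. The tract below is synthetic (a noisy arc of known curvature), not DT-MRI data.

```python
import numpy as np

def fitted_curvature(points: np.ndarray, order: int = 2) -> float:
    """Mean curvature (1/m) of a fiber tract after polynomial fitting.

    points : (N, 3) tract coordinates in metres, ordered along the tract.
    """
    # Arc-length parameterisation of the raw (noisy) tract
    s = np.concatenate([[0.0], np.cumsum(np.linalg.norm(np.diff(points, axis=0), axis=1))])
    coeffs = [np.polyfit(s, points[:, d], order) for d in range(3)]
    d1 = np.array([np.polyval(np.polyder(c, 1), s) for c in coeffs])   # r'(s), shape 3 x N
    d2 = np.array([np.polyval(np.polyder(c, 2), s) for c in coeffs])   # r''(s)
    kappa = np.linalg.norm(np.cross(d1.T, d2.T), axis=1) / np.linalg.norm(d1, axis=0) ** 3
    return float(kappa.mean())

# Synthetic tract: arc of a 0.125 m circle (curvature 8 m^-1) with a small z drift and noise
radius = 0.125
theta = np.linspace(0, 0.5, 60)
tract = np.column_stack([radius * np.cos(theta), radius * np.sin(theta), 0.02 * theta])
tract += np.random.default_rng(7).normal(scale=2e-4, size=tract.shape)
print(fitted_curvature(tract))   # close to 8 m^-1 (slightly less because of the z drift)
```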

  4. Skin Temperature Over the Carotid Artery, an Accurate Non-invasive Estimation of Near Core Temperature

    PubMed Central

    Imani, Farsad; Karimi Rouzbahani, Hamid Reza; Goudarzi, Mehrdad; Tarrahi, Mohammad Javad; Ebrahim Soltani, Alireza

    2016-01-01

    Background: During anesthesia, continuous body temperature monitoring is essential, especially in children. Anesthesia can increase the risk of loss of body temperature by three to four times. Hypothermia in children results in increased morbidity and mortality. Since the measurement points of the core body temperature are not easily accessible, near-core sites, like the rectum, are used. Objectives: The purpose of this study was to measure skin temperature over the carotid artery and compare it with the rectum temperature, in order to propose a model for accurate estimation of near core body temperature. Patients and Methods: In total, 124 patients aged 2 - 6 years, undergoing elective surgery, were selected. The temperature of the rectum and of the skin over the carotid artery was measured. Then, the patients were randomly divided into two groups (each including 62 subjects), namely the modeling (MG) and validation (VG) groups. First, in the modeling group, the average temperatures of the rectum and of the skin over the carotid artery were measured separately. The appropriate model was determined according to the significance of the model’s coefficients. The obtained model was used to predict the rectum temperature in the second group (VG group). Correlation of the predicted values with the real values (the measured rectum temperature) in the second group was investigated. Also, the difference in the average values of these two groups was examined in terms of significance. Results: In the modeling group, the average rectum and carotid temperatures were 36.47 ± 0.54°C and 35.45 ± 0.62°C, respectively. The final model was obtained as follows: rectum temperature = 0.561 × carotid temperature + 16.583. The predicted value was calculated based on the regression model and then compared with the measured rectum value, which showed no significant difference (P = 0.361). Conclusions: The present study was the first research, in which rectum temperature was compared with that
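
    The reported regression model can be applied directly; a small worked example with an arbitrary skin-temperature reading follows.

```python
def estimate_rectal_temp(carotid_skin_temp_c: float) -> float:
    """Rectal temperature predicted from skin temperature over the carotid artery,
    using the regression model reported in the record."""
    return 0.561 * carotid_skin_temp_c + 16.583

print(round(estimate_rectal_temp(35.5), 2))  # e.g. 35.5 C over the carotid -> about 36.5 C
```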

  5. Mental health disorders among individuals with mental retardation: challenges to accurate prevalence estimates.

    PubMed Central

    Kerker, Bonnie D.; Owens, Pamela L.; Zigler, Edward; Horwitz, Sarah M.

    2004-01-01

    OBJECTIVES: The objectives of this literature review were to assess current challenges to estimating the prevalence of mental health disorders among individuals with mental retardation (MR) and to develop recommendations to improve such estimates for this population. METHODS: The authors identified 200 peer-reviewed articles, book chapters, government documents, or reports from national and international organizations on the mental health status of people with MR. Based on the study's inclusion criteria, 52 articles were included in the review. RESULTS: Available data reveal inconsistent estimates of the prevalence of mental health disorders among those with MR, but suggest that some mental health conditions are more common among these individuals than in the general population. Two main challenges to identifying accurate prevalence estimates were found: (1) health care providers have difficulty diagnosing mental health conditions among individuals with MR; and (2) methodological limitations of previous research inhibit confidence in study results. CONCLUSIONS: Accurate prevalence estimates are necessary to ensure the availability of appropriate treatment services. To this end, health care providers should receive more training regarding the mental health treatment of individuals with MR. Further, government officials should discuss mechanisms of collecting nationally representative data, and the research community should utilize consistent methods with representative samples when studying mental health conditions in this population. PMID:15219798

  6. Accurate estimation of forest carbon stocks by 3-D remote sensing of individual trees.

    PubMed

    Omasa, Kenji; Qiu, Guo Yu; Watanuki, Kenichi; Yoshimi, Kenji; Akiyama, Yukihide

    2003-03-15

    Forests are one of the most important carbon sinks on Earth. However, owing to the complex structure, variable geography, and large area of forests, accurate estimation of forest carbon stocks is still a challenge for both site surveying and remote sensing. For these reasons, the Kyoto Protocol requires the establishment of methodologies for estimating the carbon stocks of forests (Kyoto Protocol, Article 5). A possible solution to this challenge is to remotely measure the carbon stocks of every tree in an entire forest. Here, we present a methodology for estimating carbon stocks of a Japanese cedar forest by using a high-resolution, helicopter-borne 3-dimensional (3-D) scanning lidar system that measures the 3-D canopy structure of every tree in a forest. Results show that a digital image (10-cm mesh) of woody canopy can be acquired. The treetop can be detected automatically with a reasonable accuracy. The absolute error ranges for tree height measurements are within 42 cm. Allometric relationships of height to carbon stocks then permit estimation of total carbon storage by measurement of carbon stocks of every tree. Thus, we suggest that our methodology can be used to accurately estimate the carbon stocks of Japanese cedar forests at a stand scale. Periodic measurements will reveal changes in forest carbon stocks.
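    As a worked illustration of the final step (height-to-carbon allometry), the sketch below assumes a generic power-law allometry with placeholder coefficients; the study's calibrated allometric relationship for Japanese cedar is not given in the abstract.

```python
import numpy as np

def carbon_per_tree(height_m, a=1.5e-4, b=2.5):
    """Placeholder power-law allometry: carbon stock per tree (t C) = a * height**b.
    The coefficients a and b are illustrative, not those calibrated in the study."""
    return a * np.asarray(height_m, dtype=float) ** b

# Treetop heights (m) detected from the 10-cm canopy model (hypothetical values).
heights = np.array([18.2, 20.1, 22.7, 19.4])
print("stand total (t C):", carbon_per_tree(heights).sum().round(2))
```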

  7. A Method to Accurately Estimate the Muscular Torques of Human Wearing Exoskeletons by Torque Sensors

    PubMed Central

    Hwang, Beomsoo; Jeon, Doyoung

    2015-01-01

    In exoskeletal robots, the quantification of the user’s muscular effort is important to recognize the user’s motion intentions and evaluate motor abilities. In this paper, we attempt to estimate users’ muscular efforts accurately using joint torque sensors, which contain measurements of the dynamic effects of the human body, such as the inertial, Coriolis, and gravitational torques, as well as the torque produced by active muscular effort. It is important to extract the dynamic effects of the user’s limb accurately from the measured torque. The user’s limb dynamics are formulated and a convenient method of identifying user-specific parameters is suggested for estimating the user’s muscular torque in robotic exoskeletons. Experiments were carried out on a wheelchair-integrated lower limb exoskeleton, EXOwheel, which was equipped with torque sensors in the hip and knee joints. The proposed methods were evaluated by 10 healthy participants during body weight-supported gait training. The experimental results show that the torque sensors are able to estimate the muscular torque accurately under both relaxed and activated muscle conditions. PMID:25860074
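    A minimal single-link sketch of the subtraction described above, with illustrative segment parameters in place of the user-specific values identified in the paper (a single link has no Coriolis term):

```python
import numpy as np

def muscular_torque(tau_sensor, q, ddq, m=3.5, l_c=0.22, I_c=0.06, g=9.81):
    """Subtract the limb segment's inertial and gravitational torques from the joint
    torque sensor reading to isolate the torque produced by active muscular effort.
    q is the joint angle from the vertical (rad), ddq the angular acceleration (rad/s^2);
    m, l_c, I_c are illustrative segment parameters, identified per user in the paper."""
    tau_passive = (I_c + m * l_c ** 2) * ddq + m * g * l_c * np.sin(q)
    return tau_sensor - tau_passive

# e.g. a knee joint: 14 N.m measured at 0.4 rad flexion and 2.5 rad/s^2 acceleration.
print(round(muscular_torque(14.0, q=0.4, ddq=2.5), 2))
```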

  8. A method to accurately estimate the muscular torques of human wearing exoskeletons by torque sensors.

    PubMed

    Hwang, Beomsoo; Jeon, Doyoung

    2015-04-09

    In exoskeletal robots, the quantification of the user's muscular effort is important to recognize the user's motion intentions and evaluate motor abilities. In this paper, we attempt to estimate users' muscular efforts accurately using joint torque sensors, which contain measurements of the dynamic effects of the human body, such as the inertial, Coriolis, and gravitational torques, as well as the torque produced by active muscular effort. It is important to extract the dynamic effects of the user's limb accurately from the measured torque. The user's limb dynamics are formulated and a convenient method of identifying user-specific parameters is suggested for estimating the user's muscular torque in robotic exoskeletons. Experiments were carried out on a wheelchair-integrated lower limb exoskeleton, EXOwheel, which was equipped with torque sensors in the hip and knee joints. The proposed methods were evaluated by 10 healthy participants during body weight-supported gait training. The experimental results show that the torque sensors are able to estimate the muscular torque accurately under both relaxed and activated muscle conditions.

  9. Exposure Estimation and Interpretation of Occupational Risk: Enhanced Information for the Occupational Risk Manager.

    PubMed

    Waters, Martha; McKernan, Lauralynn; Maier, Andrew; Jayjock, Michael; Schaeffer, Val; Brosseau, Lisa

    2015-01-01

    The fundamental goal of this article is to describe, define, and analyze the components of the risk characterization process for occupational exposures. Current methods are described for the probabilistic characterization of exposure, including newer techniques that have increasing applications for assessing data from occupational exposure scenarios. In addition, because the probability of health effects reflects variability in the exposure estimate as well as in the dose-response curve, integrated consideration of the variability surrounding both components of the risk characterization provides greater information to the occupational hygienist. Probabilistic tools provide a more informed view of exposure as compared to the use of discrete point estimates for these inputs to the risk characterization process. Active use of such tools for exposure and risk assessment will lead to a scientifically supported worker health protection program. Understanding the bases for an occupational risk assessment and focusing on important sources of variability and uncertainty enable characterizing occupational risk in terms of a probability, rather than as a binary decision of acceptable or unacceptable risk. A critical review of existing methods highlights several conclusions: (1) exposure estimates and the dose-response are impacted by both variability and uncertainty, and a well-developed risk characterization reflects and communicates this consideration; (2) occupational risk is probabilistic in nature and most accurately considered as a distribution, not a point estimate; and (3) occupational hygienists have a variety of tools available to incorporate concepts of risk characterization into occupational health and practice.
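    As a small illustration of treating exposure as a distribution rather than a point estimate, the sketch below draws a lognormal exposure profile and reports an exceedance probability; the geometric mean, geometric standard deviation, and limit value are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical task exposure profile and occupational exposure limit (mg/m^3).
gm, gsd, oel = 0.04, 2.5, 0.1
exposures = rng.lognormal(mean=np.log(gm), sigma=np.log(gsd), size=100_000)

print("median (a point estimate):  ", round(np.median(exposures), 3))
print("95th percentile:            ", round(np.percentile(exposures, 95), 3))
print("P(exposure exceeds the OEL):", round((exposures > oel).mean(), 3))
```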

  10. Exposure Estimation and Interpretation of Occupational Risk: Enhanced Information for the Occupational Risk Manager

    PubMed Central

    Waters, Martha; McKernan, Lauralynn; Maier, Andrew; Jayjock, Michael; Schaeffer, Val; Brosseau, Lisa

    2015-01-01

    The fundamental goal of this article is to describe, define, and analyze the components of the risk characterization process for occupational exposures. Current methods are described for the probabilistic characterization of exposure, including newer techniques that have increasing applications for assessing data from occupational exposure scenarios. In addition, because the probability of health effects reflects variability in the exposure estimate as well as in the dose-response curve, integrated consideration of the variability surrounding both components of the risk characterization provides greater information to the occupational hygienist. Probabilistic tools provide a more informed view of exposure as compared to the use of discrete point estimates for these inputs to the risk characterization process. Active use of such tools for exposure and risk assessment will lead to a scientifically supported worker health protection program. Understanding the bases for an occupational risk assessment and focusing on important sources of variability and uncertainty enable characterizing occupational risk in terms of a probability, rather than as a binary decision of acceptable or unacceptable risk. A critical review of existing methods highlights several conclusions: (1) exposure estimates and the dose-response are impacted by both variability and uncertainty, and a well-developed risk characterization reflects and communicates this consideration; (2) occupational risk is probabilistic in nature and most accurately considered as a distribution, not a point estimate; and (3) occupational hygienists have a variety of tools available to incorporate concepts of risk characterization into occupational health and practice. PMID:26302336

  11. Novel serologic biomarkers provide accurate estimates of recent Plasmodium falciparum exposure for individuals and communities

    PubMed Central

    Helb, Danica A.; Tetteh, Kevin K. A.; Felgner, Philip L.; Skinner, Jeff; Hubbard, Alan; Arinaitwe, Emmanuel; Mayanja-Kizza, Harriet; Ssewanyana, Isaac; Kamya, Moses R.; Beeson, James G.; Tappero, Jordan; Smith, David L.; Crompton, Peter D.; Rosenthal, Philip J.; Dorsey, Grant; Drakeley, Christopher J.; Greenhouse, Bryan

    2015-01-01

    Tools to reliably measure Plasmodium falciparum (Pf) exposure in individuals and communities are needed to guide and evaluate malaria control interventions. Serologic assays can potentially produce precise exposure estimates at low cost; however, current approaches based on responses to a few characterized antigens are not designed to estimate exposure in individuals. Pf-specific antibody responses differ by antigen, suggesting that selection of antigens with defined kinetic profiles will improve estimates of Pf exposure. To identify novel serologic biomarkers of malaria exposure, we evaluated responses to 856 Pf antigens by protein microarray in 186 Ugandan children, for whom detailed Pf exposure data were available. Using data-adaptive statistical methods, we identified combinations of antibody responses that maximized information on an individual’s recent exposure. Responses to three novel Pf antigens accurately classified whether an individual had been infected within the last 30, 90, or 365 d (cross-validated area under the curve = 0.86–0.93), whereas responses to six antigens accurately estimated an individual’s malaria incidence in the prior year. Cross-validated incidence predictions for individuals in different communities provided accurate stratification of exposure between populations and suggest that precise estimates of community exposure can be obtained from sampling a small subset of that community. In addition, serologic incidence predictions from cross-sectional samples characterized heterogeneity within a community similarly to 1 y of continuous passive surveillance. Development of simple ELISA-based assays derived from the successful selection strategy outlined here offers the potential to generate rich epidemiologic surveillance data that will be widely accessible to malaria control programs. PMID:26216993
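    A toy version of the cross-validated classification reported above, with synthetic antibody responses and plain logistic regression standing in for the 856-antigen array and the data-adaptive selection procedure used in the study:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n = 186                                       # cohort size taken from the abstract
recent = rng.integers(0, 2, size=n)           # synthetic label: infected in the last 90 d
# Three synthetic antibody responses whose means shift with recent exposure.
X = rng.normal(loc=recent[:, None] * np.array([1.0, 0.7, 0.5]), scale=1.0, size=(n, 3))

auc = cross_val_score(LogisticRegression(), X, recent, cv=5, scoring="roc_auc")
print("cross-validated AUC:", auc.mean().round(2))
```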

  12. Procedure for estimating orbital debris risks

    NASA Technical Reports Server (NTRS)

    Crafts, J. L.; Lindberg, J. P.

    1985-01-01

    A procedure for estimating the potential orbital debris risk to the world's populace from payloads or spent stages left in orbit on future missions is presented. This approach provides a consistent, but simple, procedure to assess the risk due to random reentry with an adequate accuracy level for making programmatic decisions on planned low Earth orbit missions.

  13. The challenge of accurately quantifying future megadrought risk in the American Southwest

    NASA Astrophysics Data System (ADS)

    Coats, Sloan; Mankin, Justin S.

    2016-09-01

    American Southwest (ASW) megadroughts represent decadal-scale periods of dry conditions, the near-term risks of which arise from natural low-frequency hydroclimate variability and anthropogenic forcing. A large single-climate-model ensemble indicates that anthropogenic forcing increases near-term ASW megadrought risk by a factor of 100; however, accurate risk assessment remains a challenge. At the global scale, we find that anthropogenic forcing may alter the variability driving megadroughts over 55% of land areas, undermining accurate assessments of their risk. For the remaining areas, current ensembles are too small to characterize megadroughts' driving variability. For example, constraining uncertainty in near-term ASW megadrought risk to 5 percentage points with high confidence requires 287 simulations. Such ensemble sizes are beyond current computational and storage resources, and these limitations suggest that constraining errors in near-term megadrought risk projections with high confidence—even in places where underlying variability is stationary—is not currently possible.
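    The ensemble-size figure can be checked roughly with a standard binomial sample-size calculation; the assumed baseline risk near 25% (or equivalently 75%) and the 95% confidence level are illustrative, so this is a back-of-the-envelope check rather than the paper's own calculation.

```python
import math

def ensemble_size(p, half_width, z=1.96):
    """Simulations needed so that a risk estimate near p is constrained to +/- half_width
    (normal approximation to a binomial proportion)."""
    return math.ceil(z ** 2 * p * (1 - p) / half_width ** 2)

print(ensemble_size(p=0.25, half_width=0.05))  # -> 289, the order of the 287 quoted above
```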

  14. Estimating the Effective Permittivity for Reconstructing Accurate Microwave-Radar Images.

    PubMed

    Lavoie, Benjamin R; Okoniewski, Michal; Fear, Elise C

    2016-01-01

    We present preliminary results from a method for estimating the optimal effective permittivity for reconstructing microwave-radar images. Using knowledge of how microwave-radar images are formed, we identify characteristics that are typical of good images, and define a fitness function to measure the relative image quality. We build a polynomial interpolant of the fitness function in order to identify the most likely permittivity values of the tissue. To make the estimation process more efficient, the polynomial interpolant is constructed using a locally and dimensionally adaptive sampling method that is a novel combination of stochastic collocation and polynomial chaos. Examples, using a series of simulated, experimental and patient data collected using the Tissue Sensing Adaptive Radar system, which is under development at the University of Calgary, are presented. These examples show how, using our method, accurate images can be reconstructed starting with only a broad estimate of the permittivity range.
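    A schematic of the reconstruct-score-interpolate idea: a surrogate fitness curve stands in for scoring an actual radar reconstruction, and a local quadratic refinement stands in for the adaptive stochastic-collocation/polynomial-chaos interpolant used in the paper.

```python
import numpy as np

def image_fitness(eps_r):
    """Surrogate for 'reconstruct a radar image at permittivity eps_r and score it';
    a smooth curve peaking near eps_r = 9 stands in for the real fitness function."""
    return np.exp(-((eps_r - 9.0) / 2.5) ** 2)

samples = np.linspace(2.0, 20.0, 7)        # coarse sweep over the plausible range
scores = image_fitness(samples)
i = int(np.argmax(scores))                  # best coarse sample (assumed interior here)
a, b, _ = np.polyfit(samples[i - 1:i + 2], scores[i - 1:i + 2], deg=2)
print("estimated optimal effective permittivity:", round(-b / (2.0 * a), 2))
```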

  15. Estimating the Effective Permittivity for Reconstructing Accurate Microwave-Radar Images

    PubMed Central

    Lavoie, Benjamin R.; Okoniewski, Michal; Fear, Elise C.

    2016-01-01

    We present preliminary results from a method for estimating the optimal effective permittivity for reconstructing microwave-radar images. Using knowledge of how microwave-radar images are formed, we identify characteristics that are typical of good images, and define a fitness function to measure the relative image quality. We build a polynomial interpolant of the fitness function in order to identify the most likely permittivity values of the tissue. To make the estimation process more efficient, the polynomial interpolant is constructed using a locally and dimensionally adaptive sampling method that is a novel combination of stochastic collocation and polynomial chaos. Examples, using a series of simulated, experimental and patient data collected using the Tissue Sensing Adaptive Radar system, which is under development at the University of Calgary, are presented. These examples show how, using our method, accurate images can be reconstructed starting with only a broad estimate of the permittivity range. PMID:27611785

  16. Estimating the Effective Permittivity for Reconstructing Accurate Microwave-Radar Images.

    PubMed

    Lavoie, Benjamin R; Okoniewski, Michal; Fear, Elise C

    2016-01-01

    We present preliminary results from a method for estimating the optimal effective permittivity for reconstructing microwave-radar images. Using knowledge of how microwave-radar images are formed, we identify characteristics that are typical of good images, and define a fitness function to measure the relative image quality. We build a polynomial interpolant of the fitness function in order to identify the most likely permittivity values of the tissue. To make the estimation process more efficient, the polynomial interpolant is constructed using a locally and dimensionally adaptive sampling method that is a novel combination of stochastic collocation and polynomial chaos. Examples, using a series of simulated, experimental and patient data collected using the Tissue Sensing Adaptive Radar system, which is under development at the University of Calgary, are presented. These examples show how, using our method, accurate images can be reconstructed starting with only a broad estimate of the permittivity range. PMID:27611785

  17. Accurate estimation of object location in an image sequence using helicopter flight data

    NASA Technical Reports Server (NTRS)

    Tang, Yuan-Liang; Kasturi, Rangachar

    1994-01-01

    In autonomous navigation, it is essential to obtain a three-dimensional (3D) description of the static environment in which the vehicle is traveling. For a rotorcraft conducting low-altitude flight, this description is particularly useful for obstacle detection and avoidance. In this paper, we address the problem of 3D position estimation for static objects from a monocular sequence of images captured from a low-altitude flying helicopter. Since the environment is static, it is well known that the optical flow in the image will produce a radiating pattern from the focus of expansion. We propose a motion analysis system which utilizes the epipolar constraint to accurately estimate 3D positions of scene objects in a real world image sequence taken from a low-altitude flying helicopter. Results show that this approach gives good estimates of object positions near the rotorcraft's intended flight-path.

  18. Effective Echo Detection and Accurate Orbit Estimation Algorithms for Space Debris Radar

    NASA Astrophysics Data System (ADS)

    Isoda, Kentaro; Sakamoto, Takuya; Sato, Toru

    Orbit estimation of space debris, objects of no inherent value orbiting the earth, is a task that is important for avoiding collisions with spacecraft. The Kamisaibara Spaceguard Center radar system was built in 2004 as the first radar facility in Japan devoted to the observation of space debris. In order to detect the smaller debris, coherent integration is effective in improving SNR (Signal-to-Noise Ratio). However, it is difficult to apply coherent integration to real data because the motions of the targets are unknown. An effective algorithm is proposed for echo detection and orbit estimation of the faint echoes from space debris. The characteristics of the evaluation function are utilized by the algorithm. Experiments show the proposed algorithm improves SNR by 8.32dB and enables estimation of orbital parameters accurately to allow for re-tracking with a single radar.
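    For orientation, the SNR benefit of ideal coherent integration is simply the number of pulses integrated (once the unknown target motion has been compensated), so a gain of 8.32 dB corresponds to roughly seven pulses' worth of integration:

```python
import numpy as np

# Ideal coherent integration of N pulses adds echo amplitudes in phase, so SNR grows by
# a factor of N, i.e. 10*log10(N) dB, once the target motion has been compensated.
for n in (4, 7, 16):
    print(f"{n:2d} pulses -> {10 * np.log10(n):.2f} dB gain")
```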

  19. Parameter Estimation of Ion Current Formulations Requires Hybrid Optimization Approach to Be Both Accurate and Reliable

    PubMed Central

    Loewe, Axel; Wilhelms, Mathias; Schmid, Jochen; Krause, Mathias J.; Fischer, Fathima; Thomas, Dierk; Scholz, Eberhard P.; Dössel, Olaf; Seemann, Gunnar

    2016-01-01

    Computational models of cardiac electrophysiology provided insights into arrhythmogenesis and paved the way toward tailored therapies in the last years. To fully leverage in silico models in future research, these models need to be adapted to reflect pathologies, genetic alterations, or pharmacological effects, however. A common approach is to leave the structure of established models unaltered and estimate the values of a set of parameters. Today’s high-throughput patch clamp data acquisition methods require robust, unsupervised algorithms that estimate parameters both accurately and reliably. In this work, two classes of optimization approaches are evaluated: gradient-based trust-region-reflective and derivative-free particle swarm algorithms. Using synthetic input data and different ion current formulations from the Courtemanche et al. electrophysiological model of human atrial myocytes, we show that neither of the two schemes alone succeeds to meet all requirements. Sequential combination of the two algorithms did improve the performance to some extent but not satisfactorily. Thus, we propose a novel hybrid approach coupling the two algorithms in each iteration. This hybrid approach yielded very accurate estimates with minimal dependency on the initial guess using synthetic input data for which a ground truth parameter set exists. When applied to measured data, the hybrid approach yielded the best fit, again with minimal variation. Using the proposed algorithm, a single run is sufficient to estimate the parameters. The degree of superiority over the other investigated algorithms in terms of accuracy and robustness depended on the type of current. In contrast to the non-hybrid approaches, the proposed method proved to be optimal for data of arbitrary signal to noise ratio. The hybrid algorithm proposed in this work provides an important tool to integrate experimental data into computational models both accurately and robustly allowing to assess the often non
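    A compact sketch of the coupling idea, assuming a toy Boltzmann-activation current, a basic particle swarm, and SciPy's trust-region-reflective least-squares solver; the current formulation, swarm hyperparameters, and coupling details are simplified stand-ins for the scheme evaluated in the paper.

```python
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(0)

# Toy "ion current": a Boltzmann activation times a linear driving force, with
# parameters g, E_rev, V_half, k (stand-in for the Courtemanche formulations).
V = np.linspace(-80.0, 40.0, 60)
p_true = np.array([0.12, -85.0, -20.0, 7.0])

def model(p, V):
    g, E, Vh, k = p
    return g * (V - E) / (1.0 + np.exp((Vh - V) / k))

data = model(p_true, V) + rng.normal(scale=0.02, size=V.size)
resid = lambda p: model(p, V) - data
lo = np.array([0.0, -120.0, -60.0, 1.0])
hi = np.array([1.0, -40.0, 20.0, 20.0])

# Hybrid loop: a small particle swarm explores globally, and in every iteration the
# swarm's best member is polished by the gradient-based trust-region-reflective solver.
pos = rng.uniform(lo, hi, size=(30, 4))
vel = np.zeros_like(pos)
pbest = pos.copy()
pcost = np.array([np.sum(resid(p) ** 2) for p in pos])
for _ in range(15):
    g_idx = int(np.argmin(pcost))
    gbest = least_squares(resid, pbest[g_idx], bounds=(lo, hi), method="trf").x
    pbest[g_idx], pcost[g_idx] = gbest, np.sum(resid(gbest) ** 2)
    vel = (0.7 * vel
           + 1.5 * rng.random(pos.shape) * (pbest - pos)
           + 1.5 * rng.random(pos.shape) * (gbest - pos))
    pos = np.clip(pos + vel, lo, hi)
    cost = np.array([np.sum(resid(p) ** 2) for p in pos])
    better = cost < pcost
    pbest[better], pcost[better] = pos[better], cost[better]

print("true parameters:     ", p_true)
print("estimated parameters:", np.round(pbest[int(np.argmin(pcost))], 3))
```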

  20. Intraocular lens power estimation by accurate ray tracing for eyes underwent previous refractive surgeries

    NASA Astrophysics Data System (ADS)

    Yang, Que; Wang, Shanshan; Wang, Kai; Zhang, Chunyu; Zhang, Lu; Meng, Qingyu; Zhu, Qiudong

    2015-08-01

    For normal eyes without history of any ocular surgery, traditional equations for calculating intraocular lens (IOL) power, such as SRK-T, Holladay, Higis, SRK-II, et al., all were relativley accurate. However, for eyes underwent refractive surgeries, such as LASIK, or eyes diagnosed as keratoconus, these equations may cause significant postoperative refractive error, which may cause poor satisfaction after cataract surgery. Although some methods have been carried out to solve this problem, such as Hagis-L equation[1], or using preoperative data (data before LASIK) to estimate K value[2], no precise equations were available for these eyes. Here, we introduced a novel intraocular lens power estimation method by accurate ray tracing with optical design software ZEMAX. Instead of using traditional regression formula, we adopted the exact measured corneal elevation distribution, central corneal thickness, anterior chamber depth, axial length, and estimated effective lens plane as the input parameters. The calculation of intraocular lens power for a patient with keratoconus and another LASIK postoperative patient met very well with their visual capacity after cataract surgery.

  1. Accurate and quantitative polarization-sensitive OCT by unbiased birefringence estimator with noise-stochastic correction

    NASA Astrophysics Data System (ADS)

    Kasaragod, Deepa; Sugiyama, Satoshi; Ikuno, Yasushi; Alonso-Caneiro, David; Yamanari, Masahiro; Fukuda, Shinichi; Oshika, Tetsuro; Hong, Young-Joo; Li, En; Makita, Shuichi; Miura, Masahiro; Yasuno, Yoshiaki

    2016-03-01

    Polarization sensitive optical coherence tomography (PS-OCT) is a functional extension of OCT that contrasts the polarization properties of tissues. It has been applied to ophthalmology, cardiology, etc. Proper quantitative imaging is required for widespread clinical utility. However, the conventional method of averaging to improve the signal-to-noise ratio (SNR) and the contrast of the phase retardation (or birefringence) images introduces a noise bias offset from the true value. This bias reduces the effectiveness of birefringence contrast for a quantitative study. Although coherent averaging of Jones matrix tomography has been widely utilized and has improved the image quality, the fundamental limitation of the nonlinear dependency of phase retardation and birefringence on SNR was not overcome. Hence the birefringence obtained by PS-OCT was still not accurate enough for quantitative imaging. The nonlinear effect of SNR on phase retardation and birefringence measurement was previously formulated in detail for a Jones matrix OCT (JM-OCT) [1]. Based on this, we had developed a maximum a-posteriori (MAP) estimator and quantitative birefringence imaging was demonstrated [2]. However, this first version of the estimator had a theoretical shortcoming. It did not take into account the stochastic nature of the SNR of the OCT signal. In this paper, we present an improved version of the MAP estimator which takes into account the stochastic property of SNR. This estimator uses a probability distribution function (PDF) of the true local retardation, which is proportional to birefringence, under a specific set of measurements of the birefringence and SNR. The PDF was pre-computed by a Monte-Carlo (MC) simulation based on the mathematical model of JM-OCT before the measurement. A comparison between this new MAP estimator, our previous MAP estimator [2], and the standard mean estimator is presented. The comparisons are performed both by numerical simulation and in vivo measurements of anterior and
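    The PDF-lookup idea can be sketched as follows; the additive-noise forward model is a deliberately crude stand-in for the JM-OCT measurement model, and the stochastic distribution of SNR itself (the paper's key refinement) is omitted.

```python
import numpy as np

rng = np.random.default_rng(0)
grid = np.linspace(0.0, np.pi / 2, 91)        # candidate true retardation values (rad)
edges = np.linspace(0.0, np.pi / 2, 91)       # measurement histogram bins

def simulate_measurement(delta_true, snr, n=5000):
    """Toy forward model: measured retardation = true value + noise whose spread grows
    as SNR falls, clipped to the physical range (a crude stand-in for the JM-OCT model)."""
    return np.clip(delta_true + rng.normal(scale=1.0 / np.sqrt(snr), size=n), 0.0, np.pi / 2)

# Pre-compute P(measured | true) at one SNR by Monte Carlo, as the paper does on a grid.
snr = 20.0
pdf = np.stack([np.histogram(simulate_measurement(d, snr), bins=edges, density=True)[0]
                for d in grid])

def map_estimate(delta_measured):
    """Return the candidate true retardation that maximizes the pre-computed likelihood."""
    k = int(np.clip(np.searchsorted(edges, delta_measured) - 1, 0, len(edges) - 2))
    return grid[int(np.argmax(pdf[:, k]))]

print(round(map_estimate(0.15), 3))
```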

  2. READSCAN: a fast and scalable pathogen discovery program with accurate genome relative abundance estimation

    PubMed Central

    Rashid, Mamoon; Pain, Arnab

    2013-01-01

    Summary: READSCAN is a highly scalable parallel program to identify non-host sequences (of potential pathogen origin) and estimate their genome relative abundance in high-throughput sequence datasets. READSCAN accurately classified human and viral sequences on a 20.1 million reads simulated dataset in <27 min using a small Beowulf compute cluster with 16 nodes (Supplementary Material). Availability: http://cbrc.kaust.edu.sa/readscan Contact: arnab.pain@kaust.edu.sa or raeece.naeem@gmail.com Supplementary information: Supplementary data are available at Bioinformatics online. PMID:23193222

  3. Toward an Accurate Estimate of the Exfoliation Energy of Black Phosphorus: A Periodic Quantum Chemical Approach.

    PubMed

    Sansone, Giuseppe; Maschio, Lorenzo; Usvyat, Denis; Schütz, Martin; Karttunen, Antti

    2016-01-01

    The black phosphorus (black-P) crystal is formed of covalently bound layers of phosphorene stacked together by weak van der Waals interactions. An experimental measurement of the exfoliation energy of black-P is not available presently, making theoretical studies the most important source of information for the optimization of phosphorene production. Here, we provide an accurate estimate of the exfoliation energy of black-P on the basis of multilevel quantum chemical calculations, which include the periodic local Møller-Plesset perturbation theory of second order, augmented by higher-order corrections, which are evaluated with finite clusters mimicking the crystal. Very similar results are also obtained by density functional theory with the D3-version of Grimme's empirical dispersion correction. Our estimate of the exfoliation energy for black-P of -151 meV/atom is substantially larger than that of graphite, suggesting the need for different strategies to generate isolated layers for these two systems. PMID:26651397

  4. Toward an Accurate Estimate of the Exfoliation Energy of Black Phosphorus: A Periodic Quantum Chemical Approach.

    PubMed

    Sansone, Giuseppe; Maschio, Lorenzo; Usvyat, Denis; Schütz, Martin; Karttunen, Antti

    2016-01-01

    The black phosphorus (black-P) crystal is formed of covalently bound layers of phosphorene stacked together by weak van der Waals interactions. An experimental measurement of the exfoliation energy of black-P is not available presently, making theoretical studies the most important source of information for the optimization of phosphorene production. Here, we provide an accurate estimate of the exfoliation energy of black-P on the basis of multilevel quantum chemical calculations, which include the periodic local Møller-Plesset perturbation theory of second order, augmented by higher-order corrections, which are evaluated with finite clusters mimicking the crystal. Very similar results are also obtained by density functional theory with the D3-version of Grimme's empirical dispersion correction. Our estimate of the exfoliation energy for black-P of -151 meV/atom is substantially larger than that of graphite, suggesting the need for different strategies to generate isolated layers for these two systems.

  5. Accurate Estimation of Carotid Luminal Surface Roughness Using Ultrasonic Radio-Frequency Echo

    NASA Astrophysics Data System (ADS)

    Kitamura, Kosuke; Hasegawa, Hideyuki; Kanai, Hiroshi

    2012-07-01

    It would be useful to measure the minute surface roughness of the carotid arterial wall to detect the early stage of atherosclerosis. In conventional ultrasonography, the axial resolution of a B-mode image depends on the ultrasonic wavelength of 150 µm at 10 MHz because a B-mode image is constructed using the amplitude of the radio-frequency (RF) echo. Therefore, the surface roughness caused by atherosclerosis in an early stage cannot be measured using a conventional B-mode image obtained by ultrasonography because the roughness is 10-20 µm. We have realized accurate transcutaneous estimation of such a minute surface profile using the lateral motion of the carotid arterial wall, which is estimated by block matching of received ultrasonic signals. However, the width of the region where the surface profile is estimated depends on the magnitude of the lateral displacement of the carotid arterial wall (i.e., if the lateral displacement of the arterial wall is 1 mm, the surface profile is estimated in a region of 1 mm in width). In this study, the width was increased by combining surface profiles estimated using several ultrasonic beams. In the present study, we first measured a fine wire, whose diameter was 13 µm, using ultrasonic equipment to obtain an ultrasonic beam profile for determination of the optimal kernel size for block matching based on the correlation between RF echoes. Second, we estimated the lateral displacement and surface profile of a phantom, which had a saw tooth profile on its surface, and compared the surface profile measured by ultrasound with that measured by a laser profilometer. Finally, we estimated the lateral displacement and surface roughness of the carotid arterial wall of three healthy subjects (24-, 23-, and 23-year-old males) using the proposed method.
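    A minimal block-matching sketch of the displacement-estimation step (integer beam-line lags only, synthetic frames); the study goes further, estimating the surface profile from the measured lateral motion and widening the profiled region by combining several ultrasonic beams.

```python
import numpy as np

def lateral_shift(frame_a, frame_b, kernel=16, search=8):
    """Block matching between two RF frames (rows = depth samples, columns = beam lines):
    a kernel from frame_a is compared against laterally shifted candidates in frame_b,
    and the lag with the maximum normalized cross-correlation is returned (in lines)."""
    ref = frame_a[:, search:search + kernel]
    ref = (ref - ref.mean()).ravel()
    best_lag, best_rho = 0, -np.inf
    for lag in range(-search, search + 1):
        cand = frame_b[:, search + lag:search + lag + kernel]
        cand = (cand - cand.mean()).ravel()
        rho = ref @ cand / (np.linalg.norm(ref) * np.linalg.norm(cand) + 1e-12)
        if rho > best_rho:
            best_rho, best_lag = rho, lag
    return best_lag

# Toy check: frame_b is frame_a displaced by 3 beam lines.
rng = np.random.default_rng(0)
frame_a = rng.normal(size=(256, 48))
frame_b = np.roll(frame_a, 3, axis=1)
print(lateral_shift(frame_a, frame_b))  # -> 3
```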

  6. Lamb mode selection for accurate wall loss estimation via guided wave tomography

    SciTech Connect

    Huthwaite, P.; Ribichini, R.; Lowe, M. J. S.; Cawley, P.

    2014-02-18

    Guided wave tomography offers a method to accurately quantify wall thickness losses in pipes and vessels caused by corrosion. This is achieved using ultrasonic waves transmitted over distances of approximately 1–2m, which are measured by an array of transducers and then used to reconstruct a map of wall thickness throughout the inspected region. To achieve accurate estimations of remnant wall thickness, it is vital that a suitable Lamb mode is chosen. This paper presents a detailed evaluation of the fundamental modes, S{sub 0} and A{sub 0}, which are of primary interest in guided wave tomography thickness estimates since the higher order modes do not exist at all thicknesses, to compare their performance using both numerical and experimental data while considering a range of challenging phenomena. The sensitivity of A{sub 0} to thickness variations was shown to be superior to that of S{sub 0}; however, the attenuation of A{sub 0} when a liquid loading was present was much higher than that of S{sub 0}. A{sub 0} was also less sensitive to the presence of coatings on the surface than S{sub 0}.

  7. Accurate Estimation of the Intrinsic Dimension Using Graph Distances: Unraveling the Geometric Complexity of Datasets

    NASA Astrophysics Data System (ADS)

    Granata, Daniele; Carnevale, Vincenzo

    2016-08-01

    The collective behavior of a large number of degrees of freedom can often be described by a handful of variables. This observation justifies the use of dimensionality reduction approaches to model complex systems and motivates the search for a small set of relevant “collective” variables. Here, we analyze this issue by focusing on the optimal number of variables needed to capture the salient features of a generic dataset and develop a novel estimator for the intrinsic dimension (ID). By approximating geodesics with minimum distance paths on a graph, we analyze the distribution of pairwise distances around the maximum and exploit its dependency on the dimensionality to obtain an ID estimate. We show that the estimator does not depend on the shape of the intrinsic manifold and is highly accurate, even for exceedingly small sample sizes. We apply the method to several relevant datasets from image recognition databases and protein multiple sequence alignments and discuss possible interpretations for the estimated dimension in light of the correlations among input variables and of the information content of the dataset.

  8. Accurate Estimation of the Intrinsic Dimension Using Graph Distances: Unraveling the Geometric Complexity of Datasets

    PubMed Central

    Granata, Daniele; Carnevale, Vincenzo

    2016-01-01

    The collective behavior of a large number of degrees of freedom can often be described by a handful of variables. This observation justifies the use of dimensionality reduction approaches to model complex systems and motivates the search for a small set of relevant “collective” variables. Here, we analyze this issue by focusing on the optimal number of variables needed to capture the salient features of a generic dataset and develop a novel estimator for the intrinsic dimension (ID). By approximating geodesics with minimum distance paths on a graph, we analyze the distribution of pairwise distances around the maximum and exploit its dependency on the dimensionality to obtain an ID estimate. We show that the estimator does not depend on the shape of the intrinsic manifold and is highly accurate, even for exceedingly small sample sizes. We apply the method to several relevant datasets from image recognition databases and protein multiple sequence alignments and discuss possible interpretations for the estimated dimension in light of the correlations among input variables and of the information content of the dataset. PMID:27510265

  9. Accurate Estimation of the Intrinsic Dimension Using Graph Distances: Unraveling the Geometric Complexity of Datasets.

    PubMed

    Granata, Daniele; Carnevale, Vincenzo

    2016-01-01

    The collective behavior of a large number of degrees of freedom can often be described by a handful of variables. This observation justifies the use of dimensionality reduction approaches to model complex systems and motivates the search for a small set of relevant "collective" variables. Here, we analyze this issue by focusing on the optimal number of variables needed to capture the salient features of a generic dataset and develop a novel estimator for the intrinsic dimension (ID). By approximating geodesics with minimum distance paths on a graph, we analyze the distribution of pairwise distances around the maximum and exploit its dependency on the dimensionality to obtain an ID estimate. We show that the estimator does not depend on the shape of the intrinsic manifold and is highly accurate, even for exceedingly small sample sizes. We apply the method to several relevant datasets from image recognition databases and protein multiple sequence alignments and discuss possible interpretations for the estimated dimension in light of the correlations among input variables and of the information content of the dataset. PMID:27510265

  10. Removing the thermal component from heart rate provides an accurate VO2 estimation in forest work.

    PubMed

    Dubé, Philippe-Antoine; Imbeau, Daniel; Dubeau, Denise; Lebel, Luc; Kolus, Ahmet

    2016-05-01

    Heart rate (HR) was monitored continuously in 41 forest workers performing brushcutting or tree planting work. 10-min seated rest periods were imposed during the workday to estimate the HR thermal component (ΔHRT) per Vogt et al. (1970, 1973). VO2 was measured using a portable gas analyzer during a morning submaximal step-test conducted at the work site, during a work bout over the course of the day (range: 9-74 min), and during an ensuing 10-min rest pause taken at the worksite. The VO2 values estimated from the measured HR and from the corrected HR (thermal component removed) were compared to the VO2 measured during work and rest. Varied levels of HR thermal component (ΔHRTavg range: 0-38 bpm), originating from a wide range of ambient thermal conditions, thermal clothing insulation worn, and physical load exerted during work, were observed. Using raw HR significantly overestimated measured work VO2 by 30% on average (range: 1%-64%). 74% of the VO2 prediction error variance was explained by the HR thermal component. The VO2 estimated from corrected HR was not statistically different from the measured VO2. Work VO2 can be estimated accurately in the presence of thermal stress using Vogt et al.'s method, which can be implemented easily by the practitioner with inexpensive instruments.
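    A sketch of the correction applied to an individual HR-VO2 calibration; all readings below are hypothetical and serve only to show how removing the thermal component lowers the estimate.

```python
import numpy as np

# Individual HR-to-VO2 calibration from the morning step test (hypothetical readings).
step_hr = np.array([85.0, 102.0, 118.0, 133.0])    # bpm
step_vo2 = np.array([0.9, 1.4, 1.9, 2.4])          # L/min
slope, intercept = np.polyfit(step_hr, step_vo2, 1)

def work_vo2(work_hr, delta_hr_thermal):
    """Estimate work VO2 from heart rate after removing the thermal component, in the
    spirit of Vogt et al.'s approach described above; inputs are illustrative."""
    return slope * (work_hr - delta_hr_thermal) + intercept

print("raw HR estimate (L/min):      ", round(slope * 140 + intercept, 2))
print("corrected HR estimate (L/min):", round(work_vo2(140, delta_hr_thermal=18), 2))
```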

  11. MIDAS robust trend estimator for accurate GPS station velocities without step detection

    NASA Astrophysics Data System (ADS)

    Blewitt, Geoffrey; Kreemer, Corné; Hammond, William C.; Gazeaux, Julien

    2016-03-01

    Automatic estimation of velocities from GPS coordinate time series is becoming required to cope with the exponentially increasing flood of available data, but problems detectable to the human eye are often overlooked. This motivates us to find an automatic and accurate estimator of trend that is resistant to common problems such as step discontinuities, outliers, seasonality, skewness, and heteroscedasticity. Developed here, Median Interannual Difference Adjusted for Skewness (MIDAS) is a variant of the Theil-Sen median trend estimator, for which the ordinary version is the median of slopes vij = (xj-xi)/(tj-ti) computed between all data pairs i > j. For normally distributed data, Theil-Sen and least squares trend estimates are statistically identical, but unlike least squares, Theil-Sen is resistant to undetected data problems. To mitigate both seasonality and step discontinuities, MIDAS selects data pairs separated by 1 year. This condition is relaxed for time series with gaps so that all data are used. Slopes from data pairs spanning a step function produce one-sided outliers that can bias the median. To reduce bias, MIDAS removes outliers and recomputes the median. MIDAS also computes a robust and realistic estimate of trend uncertainty. Statistical tests using GPS data in the rigid North American plate interior show ±0.23 mm/yr root-mean-square (RMS) accuracy in horizontal velocity. In blind tests using synthetic data, MIDAS velocities have an RMS accuracy of ±0.33 mm/yr horizontal, ±1.1 mm/yr up, with a 5th percentile range smaller than all 20 automatic estimators tested. Considering its general nature, MIDAS has the potential for broader application in the geosciences.
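    A simplified MIDAS-style estimator is easy to sketch; the version below keeps only the one-year pairing, the median, and one round of outlier trimming, whereas the full estimator also handles data gaps and reports a robust trend uncertainty.

```python
import numpy as np

def midas_like_trend(t_years, x, tol_days=30):
    """Simplified MIDAS-style trend: median of slopes from data pairs ~1 year apart,
    followed by one round of outlier trimming and a second median."""
    t, x = np.asarray(t_years, float), np.asarray(x, float)
    slopes = []
    for i, ti in enumerate(t):
        j = np.flatnonzero(np.abs((t - ti) - 1.0) < tol_days / 365.25)  # pairs ~1 yr later
        slopes.extend((x[j] - x[i]) / (t[j] - t[i]))
    slopes = np.asarray(slopes)
    med = np.median(slopes)
    sigma = 1.4826 * np.median(np.abs(slopes - med))        # robust spread (scaled MAD)
    trimmed = slopes[np.abs(slopes - med) < 2.0 * sigma]    # drop one-sided outliers
    return np.median(trimmed if trimmed.size else slopes)

# Synthetic weekly series: 3 mm/yr trend, seasonal cycle, noise, and an undetected step.
rng = np.random.default_rng(0)
t = np.arange(0.0, 5.0, 1.0 / 52.0)
x = 3.0 * t + 2.0 * np.sin(2 * np.pi * t) + rng.normal(0.0, 1.0, t.size)
x[t > 2.5] += 10.0
print("MIDAS-like velocity (mm/yr):", round(midas_like_trend(t, x), 2))
```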

  12. Methods for accurate estimation of net discharge in a tidal channel

    USGS Publications Warehouse

    Simpson, M.R.; Bland, R.

    2000-01-01

    Accurate estimates of net residual discharge in tidally affected rivers and estuaries are possible because of recently developed ultrasonic discharge measurement techniques. Previous discharge estimates using conventional mechanical current meters and methods based on stage/discharge relations or water slope measurements often yielded errors that were as great as or greater than the computed residual discharge. Ultrasonic measurement methods consist of: 1) the use of ultrasonic instruments for the measurement of a representative 'index' velocity used for in situ estimation of mean water velocity and 2) the use of the acoustic Doppler current discharge measurement system to calibrate the index velocity measurement data. Methods used to calibrate (rate) the index velocity to the channel velocity measured using the Acoustic Doppler Current Profiler are the most critical factors affecting the accuracy of net discharge estimation. The index velocity first must be related to mean channel velocity and then used to calculate instantaneous channel discharge. Finally, discharge is low-pass filtered to remove the effects of the tides. An ultrasonic velocity meter discharge-measurement site in a tidally affected region of the Sacramento-San Joaquin Rivers was used to study the accuracy of the index velocity calibration procedure. Calibration data consisting of ultrasonic velocity meter index velocity and concurrent acoustic Doppler discharge measurement data were collected during three time periods. Two sets of data were collected during a spring tide (monthly maximum tidal current) and one set was collected during a neap tide (monthly minimum tidal current). The relative magnitudes of instrumental errors, acoustic Doppler discharge measurement errors, and calibration errors were evaluated. Calibration error was found to be the most significant source of error in estimating net discharge. Using a comprehensive calibration method, net discharge estimates developed from the three
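    A schematic of the index-velocity rating and tidal filtering steps, with hypothetical calibration points, a single M2 constituent, and a simple running mean in place of the low-pass filter used in practice:

```python
import numpy as np

# Hypothetical rating data: index velocity from the ultrasonic meter vs. mean channel
# velocity from concurrent ADCP discharge measurements (m/s).
index_v = np.array([0.15, 0.42, 0.78, 1.05, 1.31])
adcp_mean_v = np.array([0.18, 0.47, 0.86, 1.17, 1.44])
slope, intercept = np.polyfit(index_v, adcp_mean_v, 1)      # the index-velocity rating

def instantaneous_discharge(index_velocity, area_m2):
    """Rated mean channel velocity times channel area gives instantaneous discharge."""
    return (slope * index_velocity + intercept) * area_m2

# Fifteen-minute series over five days: a 0.4 m/s M2 tidal signal on a small net flow.
t_hr = np.arange(0.0, 120.0, 0.25)
idx_v = 0.05 + 0.40 * np.sin(2 * np.pi * t_hr / 12.42)
q = instantaneous_discharge(idx_v, area_m2=850.0)

# Low-pass filter (here a ~25 h running mean) strips the tides, leaving net discharge.
window = int(25 / 0.25)
net_q = np.convolve(q, np.ones(window) / window, mode="valid")
print("net discharge estimate (m^3/s):", round(net_q.mean(), 1))
```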

  13. MIDAS robust trend estimator for accurate GPS station velocities without step detection

    PubMed Central

    Kreemer, Corné; Hammond, William C.; Gazeaux, Julien

    2016-01-01

    Abstract Automatic estimation of velocities from GPS coordinate time series is becoming required to cope with the exponentially increasing flood of available data, but problems detectable to the human eye are often overlooked. This motivates us to find an automatic and accurate estimator of trend that is resistant to common problems such as step discontinuities, outliers, seasonality, skewness, and heteroscedasticity. Developed here, Median Interannual Difference Adjusted for Skewness (MIDAS) is a variant of the Theil‐Sen median trend estimator, for which the ordinary version is the median of slopes vij = (xj–xi)/(tj–ti) computed between all data pairs i > j. For normally distributed data, Theil‐Sen and least squares trend estimates are statistically identical, but unlike least squares, Theil‐Sen is resistant to undetected data problems. To mitigate both seasonality and step discontinuities, MIDAS selects data pairs separated by 1 year. This condition is relaxed for time series with gaps so that all data are used. Slopes from data pairs spanning a step function produce one‐sided outliers that can bias the median. To reduce bias, MIDAS removes outliers and recomputes the median. MIDAS also computes a robust and realistic estimate of trend uncertainty. Statistical tests using GPS data in the rigid North American plate interior show ±0.23 mm/yr root‐mean‐square (RMS) accuracy in horizontal velocity. In blind tests using synthetic data, MIDAS velocities have an RMS accuracy of ±0.33 mm/yr horizontal, ±1.1 mm/yr up, with a 5th percentile range smaller than all 20 automatic estimators tested. Considering its general nature, MIDAS has the potential for broader application in the geosciences. PMID:27668140

  14. MIDAS robust trend estimator for accurate GPS station velocities without step detection

    PubMed Central

    Kreemer, Corné; Hammond, William C.; Gazeaux, Julien

    2016-01-01

    Abstract Automatic estimation of velocities from GPS coordinate time series is becoming required to cope with the exponentially increasing flood of available data, but problems detectable to the human eye are often overlooked. This motivates us to find an automatic and accurate estimator of trend that is resistant to common problems such as step discontinuities, outliers, seasonality, skewness, and heteroscedasticity. Developed here, Median Interannual Difference Adjusted for Skewness (MIDAS) is a variant of the Theil‐Sen median trend estimator, for which the ordinary version is the median of slopes vij = (xj–xi)/(tj–ti) computed between all data pairs i > j. For normally distributed data, Theil‐Sen and least squares trend estimates are statistically identical, but unlike least squares, Theil‐Sen is resistant to undetected data problems. To mitigate both seasonality and step discontinuities, MIDAS selects data pairs separated by 1 year. This condition is relaxed for time series with gaps so that all data are used. Slopes from data pairs spanning a step function produce one‐sided outliers that can bias the median. To reduce bias, MIDAS removes outliers and recomputes the median. MIDAS also computes a robust and realistic estimate of trend uncertainty. Statistical tests using GPS data in the rigid North American plate interior show ±0.23 mm/yr root‐mean‐square (RMS) accuracy in horizontal velocity. In blind tests using synthetic data, MIDAS velocities have an RMS accuracy of ±0.33 mm/yr horizontal, ±1.1 mm/yr up, with a 5th percentile range smaller than all 20 automatic estimators tested. Considering its general nature, MIDAS has the potential for broader application in the geosciences.

  15. Accurate Relative Location Estimates for the North Korean Nuclear Tests Using Empirical Slowness Corrections

    NASA Astrophysics Data System (ADS)

    Gibbons, S. J.; Pabian, F.; Näsholm, S. P.; Kværna', T.; Mykkeltveit, S.

    2016-10-01

    modified velocity gradients reduce the residuals, the relative location uncertainties, and the sensitivity to the combination of stations used. The traveltime gradients appear to be overestimated for the regional phases, and teleseismic relative location estimates are likely to be more accurate despite an apparent lower precision. Calibrations for regional phases are essential given that smaller magnitude events are likely not to be recorded teleseismically. We discuss the implications for the absolute event locations. Placing the 2006 event under a local maximum of overburden at 41.293°N, 129.105°E would imply a location of 41.299°N, 129.075°E for the January 2016 event, providing almost optimal overburden for the later four events.

  16. Estimating Terrorist Risk with Possibility Theory

    SciTech Connect

    J.L. Darby

    2004-11-30

    This report summarizes techniques that use possibility theory to estimate the risk of terrorist acts. These techniques were developed under the sponsorship of the Department of Homeland Security (DHS) as part of the National Infrastructure Simulation Analysis Center (NISAC) project. The techniques have been used to estimate the risk of various terrorist scenarios to support NISAC analyses during 2004. The techniques are based on the Logic Evolved Decision (LED) methodology developed over the past few years by Terry Bott and Steve Eisenhawer at LANL. [LED] The LED methodology involves the use of fuzzy sets, possibility theory, and approximate reasoning. LED captures the uncertainty due to vagueness and imprecision that is inherent in the fidelity of the information available for terrorist acts; probability theory cannot capture these uncertainties. This report does not address the philosophy supporting the development of nonprobabilistic approaches, and it does not discuss possibility theory in detail. The references provide a detailed discussion of these subjects. [Shafer] [Klir and Yuan] [Dubois and Prade] Suffice to say that these approaches were developed to address types of uncertainty that cannot be addressed by a probability measure. An earlier report discussed in detail the problems with using a probability measure to evaluate terrorist risk. [Darby Methodology]. Two related techniques are discussed in this report: (1) a numerical technique, and (2) a linguistic technique. The numerical technique uses traditional possibility theory applied to crisp sets, while the linguistic technique applies possibility theory to fuzzy sets. Both of these techniques as applied to terrorist risk for NISAC applications are implemented in software called PossibleRisk. The techniques implemented in PossibleRisk were developed specifically for use in estimating terrorist risk for the NISAC program. The LEDTools code can be used to perform the same linguistic evaluation as

  17. Accurate estimation of human body orientation from RGB-D sensors.

    PubMed

    Liu, Wu; Zhang, Yongdong; Tang, Sheng; Tang, Jinhui; Hong, Richang; Li, Jintao

    2013-10-01

    Accurate estimation of human body orientation can significantly enhance the analysis of human behavior, which is a fundamental task in the field of computer vision. However, existing orientation estimation methods cannot handle the various body poses and appearances. In this paper, we propose an innovative RGB-D-based orientation estimation method to address these challenges. By utilizing the RGB-D information, which can be real time acquired by RGB-D sensors, our method is robust to cluttered environment, illumination change and partial occlusions. Specifically, efficient static and motion cue extraction methods are proposed based on the RGB-D superpixels to reduce the noise of depth data. Since it is hard to discriminate all the 360 (°) orientation using static cues or motion cues independently, we propose to utilize a dynamic Bayesian network system (DBNS) to effectively employ the complementary nature of both static and motion cues. In order to verify our proposed method, we build a RGB-D-based human body orientation dataset that covers a wide diversity of poses and appearances. Our intensive experimental evaluations on this dataset demonstrate the effectiveness and efficiency of the proposed method. PMID:23893759

  18. Accurate estimation of motion blur parameters in noisy remote sensing image

    NASA Astrophysics Data System (ADS)

    Shi, Xueyan; Wang, Lin; Shao, Xiaopeng; Wang, Huilin; Tao, Zhong

    2015-05-01

    The relative motion between the remote sensing satellite sensor and objects is one of the most common causes of remote sensing image degradation. It seriously weakens image data interpretation and information extraction. In practice, the point spread function (PSF) should be estimated first for image restoration. Identifying the motion blur direction and length accurately is crucial for estimating the PSF and restoring the image with precision. In general, the regular light-and-dark stripes in the spectrum can be employed to obtain the parameters by using the Radon transform. However, serious noise in actual remote sensing images often makes the stripes indistinct. The parameters then become difficult to calculate, and the error of the result is relatively large. In this paper, an improved motion blur parameter identification method for noisy remote sensing images is proposed to solve this problem. The spectrum characteristics of noisy remote sensing images are analyzed first. An interactive image segmentation method based on graph theory, called GrabCut, is adopted to effectively extract the edge of the light center in the spectrum. The motion blur direction is estimated by applying the Radon transform to the segmentation result. In order to reduce random error, a method based on whole-column statistics is used when calculating the blur length. Finally, the Lucy-Richardson algorithm is applied to restore remote sensing images of the Moon after the blur parameters have been estimated. The experimental results verify the effectiveness and robustness of our algorithm.
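    The blur-length step lends itself to a small demonstration. The sketch below synthesizes a horizontal blur and recovers its length from the position of the first spectral null in the row-averaged (whole-column) spectrum; the direction-estimation steps (GrabCut segmentation and the Radon transform) and the Lucy-Richardson restoration are not reproduced, and all parameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
n, blur_len = 256, 12                      # image size and true horizontal blur length (px)
scene = rng.normal(size=(n, n))            # white-noise scene keeps the spectrum flat
kernel = np.zeros(n)
kernel[:blur_len] = 1.0 / blur_len
blurred = np.real(np.fft.ifft(np.fft.fft(scene, axis=1) * np.fft.fft(kernel)[None, :], axis=1))
blurred += rng.normal(scale=0.01, size=blurred.shape)      # sensor noise

# Whole-column statistics: average the log-magnitude spectrum over rows, then locate the
# first spectral null along the horizontal-frequency axis; its position gives the length.
profile = np.log(np.abs(np.fft.fft(blurred, axis=1))).mean(axis=0)[1:n // 2]
u0 = int(np.argmin(profile[:40])) + 1       # first null, searched near DC
print("estimated blur length (px):", round(n / u0, 1))      # true value: 12
```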

  19. Efficient and accurate estimation of relative order tensors from λ- maps

    NASA Astrophysics Data System (ADS)

    Mukhopadhyay, Rishi; Miao, Xijiang; Shealy, Paul; Valafar, Homayoun

    2009-06-01

    The rapid increase in the availability of RDC data from multiple alignment media in recent years has necessitated the development of more sophisticated analyses that extract the RDC data's full information content. This article presents an analysis of the distribution of RDCs from two media (2D-RDC data), using the information obtained from a λ-map. This article also introduces an efficient algorithm, which leverages these findings to extract the order tensors for each alignment medium using unassigned RDC data in the absence of any structural information. The results of applying this 2D-RDC analysis method to synthetic and experimental data are reported in this article. The relative order tensor estimates obtained from the 2D-RDC analysis are compared to order tensors obtained from the program REDCAT after using assignment and structural information. The final comparisons indicate that the relative order tensors estimated from the unassigned 2D-RDC method very closely match the results from methods that require assignment and structural information. The presented method is successful even in cases with small datasets. The results of analyzing experimental RDC data for the protein 1P7E are presented to demonstrate the potential of the presented work in accurately estimating the principal order parameters from RDC data that incompletely sample the RDC space. In addition to the new algorithm, a discussion of the uniqueness of the solutions is presented; no more than two clusters of distinct solutions have been shown to satisfy each λ-map.

  20. Accurate estimation of human body orientation from RGB-D sensors.

    PubMed

    Liu, Wu; Zhang, Yongdong; Tang, Sheng; Tang, Jinhui; Hong, Richang; Li, Jintao

    2013-10-01

    Accurate estimation of human body orientation can significantly enhance the analysis of human behavior, which is a fundamental task in the field of computer vision. However, existing orientation estimation methods cannot handle the various body poses and appearances. In this paper, we propose an innovative RGB-D-based orientation estimation method to address these challenges. By utilizing the RGB-D information, which can be real time acquired by RGB-D sensors, our method is robust to cluttered environment, illumination change and partial occlusions. Specifically, efficient static and motion cue extraction methods are proposed based on the RGB-D superpixels to reduce the noise of depth data. Since it is hard to discriminate all the 360 (°) orientation using static cues or motion cues independently, we propose to utilize a dynamic Bayesian network system (DBNS) to effectively employ the complementary nature of both static and motion cues. In order to verify our proposed method, we build a RGB-D-based human body orientation dataset that covers a wide diversity of poses and appearances. Our intensive experimental evaluations on this dataset demonstrate the effectiveness and efficiency of the proposed method.

  1. Accurate estimation of the RMS emittance from single current amplifier data

    SciTech Connect

    Stockli, Martin P.; Welton, R.F.; Keller, R.; Letchford, A.P.; Thomae, R.W.; Thomason, J.W.G.

    2002-05-31

    This paper presents the SCUBEEx rms emittance analysis, a self-consistent, unbiased elliptical exclusion method, which combines traditional data-reduction methods with statistical methods to obtain accurate estimates for the rms emittance. Rather than considering individual data, the method tracks the average current density outside a well-selected, variable boundary to separate the measured beam halo from the background. The average outside current density is assumed to be part of a uniform background and not part of the particle beam. Therefore the average outside current is subtracted from the data before evaluating the rms emittance within the boundary. As the boundary area is increased, the average outside current and the inside rms emittance form plateaus when all data containing part of the particle beam are inside the boundary. These plateaus mark the smallest acceptable exclusion boundary and provide unbiased estimates for the average background and the rms emittance. Small, trendless variations within the plateaus allow for determining the uncertainties of the estimates caused by variations of the measured background outside the smallest acceptable exclusion boundary. The robustness of the method is established with complementary variations of the exclusion boundary. This paper presents a detailed comparison between traditional data reduction methods and SCUBEEx by analyzing two complementary sets of emittance data obtained with a Lawrence Berkeley National Laboratory and an ISIS H{sup -} ion source.
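    A toy reconstruction of the exclusion-boundary sweep; the real SCUBEEx analysis works with properly oriented phase-space ellipses, locates the plateaus explicitly, and derives uncertainties from their variations, none of which is reproduced here.

```python
import numpy as np

def rms_emittance(x, xp, q):
    """RMS emittance of phase-space samples (x, x') weighted by signal q."""
    w = q / q.sum()
    dx, dxp = x - np.average(x, weights=w), xp - np.average(xp, weights=w)
    s_xx, s_pp = np.average(dx ** 2, weights=w), np.average(dxp ** 2, weights=w)
    s_xp = np.average(dx * dxp, weights=w)
    return np.sqrt(max(s_xx * s_pp - s_xp ** 2, 0.0))

def scubeex_like(x, xp, d, scales):
    """Sweep an elliptical exclusion boundary: the mean signal outside each boundary is
    treated as uniform background and subtracted everywhere (negative residuals are kept
    so that noise averages out), then the rms emittance inside is evaluated. A plateau
    versus boundary size marks the unbiased estimate."""
    r2 = (x / x.std()) ** 2 + (xp / xp.std()) ** 2
    eps = []
    for s in scales:
        inside = r2 <= s ** 2
        q = d - d[~inside].mean()
        eps.append(rms_emittance(x[inside], xp[inside], q[inside]))
    return np.array(eps)

# Synthetic scan: a Gaussian beam (sigma_x = 4 mm, sigma_x' = 9 mrad) on a uniform
# background with a little measurement noise -- all values are illustrative.
rng = np.random.default_rng(0)
x, xp = np.meshgrid(np.linspace(-20, 20, 81), np.linspace(-40, 40, 81))
d = np.exp(-(x ** 2 / 32.0 + xp ** 2 / 162.0)) + 0.02 + rng.normal(0, 0.005, x.shape)
print(np.round(scubeex_like(x.ravel(), xp.ravel(), d.ravel(), np.linspace(0.8, 2.2, 8)), 1))
```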

  2. Quick and accurate estimation of the elastic constants using the minimum image method

    NASA Astrophysics Data System (ADS)

    Tretiakov, Konstantin V.; Wojciechowski, Krzysztof W.

    2015-04-01

    A method for determining the elastic properties using the minimum image method (MIM) is proposed and tested on a model system of particles interacting through the Lennard-Jones (LJ) potential. The elastic constants of the LJ system are determined in the thermodynamic limit, N → ∞, using the Monte Carlo (MC) method in the NVT and NPT ensembles. The simulation results show that when determining the elastic constants, the contribution of long-range interactions cannot be ignored, because doing so would lead to erroneous results. In addition, the simulations have revealed that including the further interactions of each particle with all of its minimum image neighbors, even in the case of small systems, leads to results that are very close to the values of the elastic constants in the thermodynamic limit. This enables a quick and accurate estimation of the elastic constants using very small samples.

  3. Pitfalls in accurate estimation of overdiagnosis: implications for screening policy and compliance.

    PubMed

    Feig, Stephen A

    2013-01-01

    Stories in the public media that 30 to 50% of screen-detected breast cancers are overdiagnosed dissuade women from being screened because overdiagnosed cancers would never result in death if undetected yet do result in unnecessary treatment. However, such concerns are unwarranted because the frequency of overdiagnosis, when properly calculated, is only 0 to 5%. In the previous issue of Breast Cancer Research, Duffy and Parmar report that accurate estimation of the rate of overdiagnosis recognizes the effect of lead time on detection rates and the consequent requirement for an adequate number of years of follow-up. These indispensable elements were absent from highly publicized studies that overestimated the frequency of overdiagnosis.

  4. A Simple yet Accurate Method for the Estimation of the Biovolume of Planktonic Microorganisms.

    PubMed

    Saccà, Alessandro

    2016-01-01

    Determining the biomass of microbial plankton is central to the study of fluxes of energy and materials in aquatic ecosystems. This is typically accomplished by applying proper volume-to-carbon conversion factors to group-specific abundances and biovolumes. A critical step in this approach is the accurate estimation of biovolume from two-dimensional (2D) data such as those available through conventional microscopy techniques or flow-through imaging systems. This paper describes a simple yet accurate method for the assessment of the biovolume of planktonic microorganisms, which works with any image analysis system allowing for the measurement of linear distances and the estimation of the cross sectional area of an object from a 2D digital image. The proposed method is based on Archimedes' principle about the relationship between the volume of a sphere and that of a cylinder in which the sphere is inscribed, plus a coefficient of 'unellipticity' introduced here. Validation and careful evaluation of the method are provided using a variety of approaches. The new method proved to be highly precise with all convex shapes characterised by approximate rotational symmetry, and combining it with an existing method specific for highly concave or branched shapes allows covering the great majority of cases with good reliability. Thanks to its accuracy, consistency, and low resources demand, the new method can conveniently be used in substitution of any extant method designed for convex shapes, and can readily be coupled with automated cell imaging technologies, including state-of-the-art flow-through imaging devices. PMID:27195667
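
    The geometric idea above can be made concrete under one plausible reading: by Archimedes' result, a sphere fills two thirds of its circumscribing cylinder, so the volume of a roughly convex, rotationally symmetric cell can be approximated from its projected area A and its width d as V ≈ (2/3)·A·d, scaled by the 'unellipticity' coefficient. The exact definition of that coefficient is given in the paper; the helper below is only an assumed sketch, not the published formula.

    ```python
    def biovolume_estimate(area_um2, width_um, unellipticity=1.0):
        """Approximate biovolume (um^3) of a convex, rotationally symmetric cell.

        area_um2      projected (cross-sectional) area from the 2D image, um^2
        width_um      linear dimension across the rotation axis (minor axis), um
        unellipticity correction coefficient from the paper (assumption: ~1 for
                      spheres and prolate spheroids)

        Sanity check: for a sphere of radius r, area = pi*r**2 and width = 2*r,
        so (2/3)*area*width = (4/3)*pi*r**3, the exact sphere volume.
        """
        return (2.0 / 3.0) * area_um2 * width_um * unellipticity
    ```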

  6. Evaluating methods for estimating existential risks.

    PubMed

    Tonn, Bruce; Stiefel, Dorian

    2013-10-01

    Researchers and commissions contend that the risk of human extinction is high, but none of these estimates have been based upon a rigorous methodology suitable for estimating existential risks. This article evaluates several methods that could be used to estimate the probability of human extinction. Traditional methods evaluated include: simple elicitation; whole evidence Bayesian; evidential reasoning using imprecise probabilities; and Bayesian networks. Three innovative methods are also considered: influence modeling based on environmental scans; simple elicitation using extinction scenarios as anchors; and computationally intensive possible-worlds modeling. Evaluation criteria include: level of effort required by the probability assessors; level of effort needed to implement the method; ability of each method to model the human extinction event; ability to incorporate scientific estimates of contributory events; transparency of the inputs and outputs; acceptability to the academic community (e.g., with respect to intellectual soundness, familiarity, verisimilitude); credibility and utility of the outputs of the method to the policy community; difficulty of communicating the method's processes and outputs to nonexperts; and accuracy in other contexts. The article concludes by recommending that researchers assess the risks of human extinction by combining these methods. PMID:23551083

  7. Accurate biopsy-needle depth estimation in limited-angle tomography using multi-view geometry

    NASA Astrophysics Data System (ADS)

    van der Sommen, Fons; Zinger, Sveta; de With, Peter H. N.

    2016-03-01

    Recently, compressed-sensing based algorithms have enabled volume reconstruction from projection images acquired over a relatively small angle (θ < 20°). These methods enable accurate depth estimation of surgical tools with respect to anatomical structures. However, they are computationally expensive and time consuming, rendering them unattractive for image-guided interventions. We propose an alternative approach for depth estimation of biopsy needles during image-guided interventions, in which we split the problem into two parts and solve them independently: needle-depth estimation and volume reconstruction. The complete proposed system consists of these two steps, preceded by needle extraction. First, we detect the biopsy needle in the projection images and remove it by interpolation. Next, we exploit epipolar geometry to find point-to-point correspondences in the projection images and triangulate the 3D position of the needle in the volume. Finally, we use the interpolated projection images to reconstruct the local anatomical structures and indicate the position of the needle within this volume. For validation of the algorithm, we recorded a full CT scan of a phantom with an inserted biopsy needle. The performance of our approach ranges from a median error of 2.94 mm for a distributed viewing angle of 1° down to an error of 0.30 mm for an angle larger than 10°. Based on the results of this initial phantom study, we conclude that multi-view geometry offers an attractive alternative to time-consuming iterative methods for the depth estimation of surgical tools during C-arm-based image-guided interventions.
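
    The epipolar step described above reduces, for each matched needle point, to a standard two-view triangulation. A minimal linear (DLT) sketch is given below, assuming calibrated 3x4 projection matrices for the two C-arm poses and matched pixel coordinates of the needle tip; the names and the source of the calibration are assumptions, not the authors' implementation.

    ```python
    import numpy as np

    def triangulate_point(P1, P2, uv1, uv2):
        """Linear (DLT) triangulation of one 3-D point from two views.

        P1, P2: 3x4 projection matrices of the two C-arm poses (from calibration).
        uv1, uv2: (u, v) pixel coordinates of the same needle point in each image.
        Returns the 3-D point in the common world frame.
        """
        u1, v1 = uv1
        u2, v2 = uv2
        A = np.vstack([
            u1 * P1[2] - P1[0],
            v1 * P1[2] - P1[1],
            u2 * P2[2] - P2[0],
            v2 * P2[2] - P2[1],
        ])
        # homogeneous solution = right singular vector with the smallest singular value
        _, _, Vt = np.linalg.svd(A)
        X = Vt[-1]
        return X[:3] / X[3]
    ```

    The needle depth then follows from expressing the reconstructed 3-D point in the coordinate frame of the reconstructed anatomical volume.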

  8. Accurate Estimation of the Fine Layering Effect on the Wave Propagation in the Carbonate Rocks

    NASA Astrophysics Data System (ADS)

    Bouchaala, F.; Ali, M. Y.

    2014-12-01

    The attenuation of a seismic wave during its propagation can be divided into two main parts: scattering and intrinsic attenuation. Scattering is an elastic redistribution of the energy due to medium heterogeneities, whereas intrinsic attenuation is an inelastic phenomenon, mainly due to fluid-grain friction during the wave passage. Intrinsic attenuation is directly related to the physical characteristics of the medium, so this parameter can be used for medium characterization and fluid detection, which is beneficial for the oil and gas industry. Intrinsic attenuation is estimated by subtracting the scattering from the total attenuation, so its accuracy depends directly on the accuracy of both the total attenuation and the scattering. The total attenuation can be estimated from the recorded waves using in-situ methods such as the spectral ratio and frequency-shift methods. The scattering is estimated by treating the heterogeneities as a succession of stacked layers, each characterized by a single density and velocity. The accuracy of the scattering estimate depends strongly on the layer thicknesses, especially for media composed of carbonate rocks, which are known for their strong heterogeneity. Previous studies proposed assumptions for the choice of the layer thickness, but these showed limitations, especially in the case of carbonate rocks. In this study we established a relationship between the layer thickness and the propagation frequency through a mathematical development of the generalized O'Doherty-Anstey formula. We validated this relationship with synthetic tests and with real data from a VSP survey carried out over an onshore oilfield in the emirate of Abu Dhabi in the United Arab Emirates, composed primarily of carbonate rocks. The results showed the utility of our relationship for an accurate estimation of the scattering.

  9. How to estimate your tolerance for risk

    SciTech Connect

    Mackay, J.A.

    1996-12-31

    Risk tolerance is used to calculate the Risk Adjusted Value (RAV) of a proposed investment. The RAV incorporates both the expected value and the risk attitude for a particular investment, taking into consideration the concern for catastrophic financial loss as well as the chance of success, the cost, and the value if successful. Uncertainty can be incorporated into all of the above variables. Often a project is more valuable to a corporation if a partial working interest is taken rather than the entire working interest. The RAV can be used to calculate the optimum working interest and the value of that diversification. To estimate the Apparent Risk Tolerance (ART) of an individual, division, or corporation, several methods can be employed: (1) ART can be calculated from the working interests selected in prior investment decisions. (2) ART can be estimated from a selection of working interests by the decision maker in a proposed portfolio of projects. (3) ART can be approximated from data released to the Securities and Exchange Commission (SEC) in the annual 10-K supplements (for both your company and possible partners). (4) ART can be assigned based on corporate size, budget, or activity. Examples are provided for the various methods to identify risk tolerance and apply it in making optimum working interest calculations for individual projects and portfolios.
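
    The abstract does not reproduce the RAV formula itself, so the sketch below uses the common exponential-utility (Cozzolino-style) form, in which the risk tolerance R sets the utility scale, and finds the optimum working interest by a simple grid search; the numbers are illustrative, not from the paper.

    ```python
    import numpy as np

    def rav(working_interest, p_success, value_success, cost_failure, risk_tolerance):
        """Risk Adjusted Value under an exponential-utility (Cozzolino-style) model.

        value_success : NPV at 100% working interest if the project succeeds
        cost_failure  : loss at 100% working interest if it fails (positive number)
        risk_tolerance: the R parameter; a larger R means a more risk-tolerant decision maker
        """
        w, R = working_interest, risk_tolerance
        return -R * np.log(p_success * np.exp(-w * value_success / R)
                           + (1.0 - p_success) * np.exp(w * cost_failure / R))

    # optimum working interest by grid search (illustrative values, $MM)
    w_grid = np.linspace(0.01, 1.0, 100)
    ravs = [rav(w, p_success=0.3, value_success=50.0, cost_failure=10.0,
                risk_tolerance=20.0) for w in w_grid]
    w_best = w_grid[int(np.argmax(ravs))]
    print(f"optimum working interest ~ {w_best:.2f}, RAV ~ {max(ravs):.2f} $MM")
    ```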

  10. Risk Estimation Methodology for Launch Accidents.

    SciTech Connect

    Clayton, Daniel James; Lipinski, Ronald J.; Bechtel, Ryan D.

    2014-02-01

    As compact and lightweight power sources with reliable, long lives, Radioisotope Power Systems (RPSs) have made space missions to explore the solar system possible. Because hazardous material can be released during a launch accident, the potential health risk of an accident must be quantified so that appropriate launch approval decisions can be made. One part of the risk estimation involves modeling the response of the RPS to potential accident environments. Because the full RPS response depends on dynamic variables that are too complex to model deterministically, the evaluation is performed stochastically with a Monte Carlo simulation. The potential consequences can be determined by modeling the transport of the hazardous material in the environment and through human biological pathways. The consequence analysis results are summed and weighted by appropriate likelihood values to give a collection of probabilistic results for the estimation of the potential health risk. This information is used to guide RPS designs, spacecraft designs, mission architecture, or launch procedures to reduce the risk where possible, as well as to inform decision makers of the potential health risks resulting from the use of RPSs for space missions.
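
    A minimal sketch of the stochastic structure described above, with placeholder distributions and numbers that are not taken from any actual safety analysis: sample accident environments, map each sample to a release and then to a consequence, and weight the mean consequence by the accident likelihood.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def simulate_release(n):
        """Toy response model: fraction of inventory released in each sampled accident environment."""
        severity = rng.lognormal(mean=0.0, sigma=1.0, size=n)   # sampled accident severity
        return np.clip(0.001 * severity, 0.0, 1.0)              # placeholder response curve

    def consequence(release_fraction):
        """Toy consequence model: expected health effects per unit release fraction."""
        return 2.0 * release_fraction                           # placeholder transport/dose factor

    p_accident = 1e-3                      # assumed probability of this accident scenario
    samples = simulate_release(100_000)
    mean_consequence = consequence(samples).mean()
    risk = p_accident * mean_consequence   # likelihood-weighted expected consequence
    print(f"expected risk contribution of this scenario: {risk:.2e}")
    ```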

  11. Can student health professionals accurately estimate alcohol content in commonly occurring drinks?

    PubMed Central

    Sinclair, Julia; Searle, Emma

    2016-01-01

    Objectives: Correct identification of alcohol as a contributor to, or comorbidity of, many psychiatric diseases requires health professionals to be competent and confident in taking an accurate alcohol history. Being able to estimate (or calculate) the alcohol content of commonly consumed drinks is a prerequisite for quantifying levels of alcohol consumption. The aim of this study was to assess this ability in medical and nursing students. Methods: A cross-sectional survey of 891 medical and nursing students across different years of training was conducted. Students were asked the alcohol content of 10 different alcoholic drinks after seeing a slide of each drink (with picture, volume, and percentage of alcohol by volume) for 30 s. Results: Overall, the mean number of correctly estimated drinks (out of the 10 tested) was 2.4, increasing to just over 3 if a 10% margin of error was allowed. Wine and premium-strength beers were underestimated by over 50% of students. Those who drank alcohol themselves, or who were further on in their clinical training, did better on the task, but overall the levels remained low. Conclusions: Knowledge of, or the ability to work out, the alcohol content of commonly consumed drinks is poor, and further research is needed to understand the reasons for this and the impact it may have on the likelihood of undertaking screening or initiating treatment. PMID:27536344
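
    For context, the arithmetic the students were effectively asked to perform is simple: in UK units (1 unit = 10 ml, roughly 8 g, of pure ethanol), the alcohol content of a drink follows directly from its volume and its ABV. The helper below is a generic illustration, not part of the study materials.

    ```python
    def uk_alcohol_units(volume_ml, abv_percent):
        """UK units of alcohol in a drink: volume (ml) x ABV (%) / 1000."""
        return volume_ml * abv_percent / 1000.0

    # e.g. a 250 ml glass of 13% ABV wine
    print(uk_alcohol_units(250, 13))   # -> 3.25 units
    ```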

  12. Ocean Lidar Measurements of Beam Attenuation and a Roadmap to Accurate Phytoplankton Biomass Estimates

    NASA Astrophysics Data System (ADS)

    Hu, Yongxiang; Behrenfeld, Mike; Hostetler, Chris; Pelon, Jacques; Trepte, Charles; Hair, John; Slade, Wayne; Cetinic, Ivona; Vaughan, Mark; Lu, Xiaomei; Zhai, Pengwang; Weimer, Carl; Winker, David; Verhappen, Carolus C.; Butler, Carolyn; Liu, Zhaoyan; Hunt, Bill; Omar, Ali; Rodier, Sharon; Lifermann, Anne; Josset, Damien; Hou, Weilin; MacDonnell, David; Rhew, Ray

    2016-06-01

    Beam attenuation coefficient, c, provides an important optical index of plankton standing stocks, such as phytoplankton biomass and total particulate carbon concentration. Unfortunately, c has proven difficult to quantify through remote sensing. Here, we introduce an innovative approach for estimating c using lidar depolarization measurements and diffuse attenuation coefficients from ocean color products or lidar measurements of Brillouin scattering. The new approach is based on a theoretical formula established from Monte Carlo simulations that links the depolarization ratio of sea water to the ratio of the diffuse attenuation Kd and the beam attenuation c (i.e., a multiple-scattering factor). On July 17, 2014, the CALIPSO satellite was tilted 30° off-nadir for one nighttime orbit in order to minimize ocean surface backscatter and demonstrate the lidar ocean subsurface measurement concept from space. Depolarization ratios of ocean subsurface backscatter were measured accurately. Beam attenuation coefficients computed from the depolarization ratio measurements compare well with empirical estimates from ocean color measurements. We further verify the beam attenuation coefficient retrievals using aircraft-based high spectral resolution lidar (HSRL) data that are collocated with in-water optical measurements.

  13. mBEEF: An accurate semi-local Bayesian error estimation density functional

    NASA Astrophysics Data System (ADS)

    Wellendorff, Jess; Lundgaard, Keld T.; Jacobsen, Karsten W.; Bligaard, Thomas

    2014-04-01

    We present a general-purpose meta-generalized gradient approximation (MGGA) exchange-correlation functional generated within the Bayesian error estimation functional framework [J. Wellendorff, K. T. Lundgaard, A. Møgelhøj, V. Petzold, D. D. Landis, J. K. Nørskov, T. Bligaard, and K. W. Jacobsen, Phys. Rev. B 85, 235149 (2012)]. The functional is designed to give reasonably accurate density functional theory (DFT) predictions of a broad range of properties in materials physics and chemistry, while exhibiting a high degree of transferability. Particularly, it improves upon solid cohesive energies and lattice constants over the BEEF-vdW functional without compromising high performance on adsorption and reaction energies. We thus expect it to be particularly well-suited for studies in surface science and catalysis. An ensemble of functionals for error estimation in DFT is an intrinsic feature of exchange-correlation models designed this way, and we show how the Bayesian ensemble may provide a systematic analysis of the reliability of DFT based simulations.

  14. Greater contrast in Martian hydrological history from more accurate estimates of paleodischarge

    NASA Astrophysics Data System (ADS)

    Jacobsen, R. E.; Burr, D. M.

    2016-09-01

    Correlative width-discharge relationships from the Missouri River Basin are commonly used to estimate fluvial paleodischarge on Mars. However, hydraulic geometry provides alternative, and causal, width-discharge relationships derived from broader samples of channels, including those in reduced-gravity (submarine) environments. Comparison of these relationships implies that causal relationships from hydraulic geometry should yield more accurate and more precise discharge estimates. Our remote analysis of a Martian-terrestrial analog channel, combined with in situ discharge data, substantiates this implication. Applied to Martian features, these results imply that paleodischarges of interior channels of Noachian-Hesperian (~3.7 Ga) valley networks have been underestimated by a factor of several, whereas paleodischarges for smaller fluvial deposits of the Late Hesperian-Early Amazonian (~3.0 Ga) have been overestimated. Thus, these new paleodischarges significantly magnify the contrast between early and late Martian hydrologic activity. Width-discharge relationships from hydraulic geometry represent validated tools for quantifying fluvial input near candidate landing sites of upcoming missions.
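
    A minimal sketch of how a hydraulic-geometry width-discharge power law W = a·Q^b can be fitted in log space and then inverted to estimate paleodischarge from a measured channel width; the calibration values below are placeholders, not the terrestrial, submarine, or Martian data used in the study.

    ```python
    import numpy as np

    def fit_width_discharge(width_m, discharge_m3s):
        """Fit W = a * Q**b in log space; returns (a, b)."""
        b, log_a = np.polyfit(np.log(discharge_m3s), np.log(width_m), 1)
        return np.exp(log_a), b

    def paleodischarge(width_m, a, b):
        """Invert the power law to estimate discharge from an observed channel width."""
        return (width_m / a) ** (1.0 / b)

    # illustrative calibration data (not from the study)
    W = np.array([12.0, 25.0, 60.0, 140.0])      # channel widths, m
    Q = np.array([30.0, 120.0, 700.0, 4000.0])   # discharges, m^3/s
    a, b = fit_width_discharge(W, Q)
    print(paleodischarge(80.0, a, b))            # discharge implied by an 80 m wide channel
    ```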

  15. Impact of microbial count distributions on human health risk estimates.

    PubMed

    Duarte, A S R; Nauta, M J

    2015-02-16

    Quantitative microbiological risk assessment (QMRA) is influenced by the choice of the probability distribution used to describe pathogen concentrations, as this may eventually have a large effect on the distribution of doses at exposure. When fitting a probability distribution to microbial enumeration data, several factors may have an impact on the accuracy of that fit. Analysis of the best statistical fits of different distributions alone does not provide a clear indication of the impact in terms of risk estimates. Thus, in this study we focus on the impact of fitting microbial distributions on risk estimates, at two different concentration scenarios and at a range of prevalence levels. By using five different parametric distributions, we investigate whether different characteristics of a good fit are crucial for an accurate risk estimate. Among the factors studied are the importance of accounting for the Poisson randomness in counts, the difference between treating "true" zeroes as such or as censored below a limit of quantification (LOQ), and the importance of making the correct assumption about the underlying distribution of concentrations. By running a simulation experiment with zero-inflated Poisson-lognormal distributed data and an existing QMRA model from retail to consumer level, it was possible to assess the difference between the expected risk and the risk estimated using a lognormal, a zero-inflated lognormal, a Poisson-gamma, a zero-inflated Poisson-gamma and a zero-inflated Poisson-lognormal distribution. We show that the impact of the choice of different probability distributions to describe concentrations at retail on risk estimates depends on both concentration and prevalence levels. We also show that the use of an LOQ should be done consciously, especially when zero-inflation is not used. In general, zero-inflation does not necessarily improve the absolute risk estimation, but performance of zero-inflated distributions in QMRA tends to be
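
    A minimal sketch of the kind of data-generating model named above, a zero-inflated Poisson-lognormal: each unit is a true zero with some probability, contaminated units have lognormally distributed concentrations, and the observed plate count is Poisson given the concentration and the analysed test portion. Parameter values are placeholders, not those of the simulation experiment.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def zip_poisson_lognormal_counts(n, p_zero, mu_log10, sigma_log10, test_portion_g):
        """Simulate plate counts from a zero-inflated Poisson-lognormal model.

        p_zero             probability that a unit is a 'true' zero (not contaminated)
        mu_log10, sigma_log10  mean and sd of log10 concentration (CFU/g) for contaminated units
        test_portion_g     mass analysed per unit
        """
        contaminated = rng.random(n) >= p_zero
        conc = 10 ** rng.normal(mu_log10, sigma_log10, size=n)     # CFU/g
        expected = np.where(contaminated, conc * test_portion_g, 0.0)
        return rng.poisson(expected)

    counts = zip_poisson_lognormal_counts(10_000, p_zero=0.7, mu_log10=0.5,
                                          sigma_log10=0.8, test_portion_g=1.0)
    print((counts == 0).mean())   # observed zero fraction: true zeros plus Poisson zeros
    ```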

  16. A rapid, economical, and accurate method to determining the physical risk of storm marine inundations using sedimentary evidence

    NASA Astrophysics Data System (ADS)

    Nott, Jonathan F.

    2015-04-01

    The majority of physical risk assessments from storm surge inundations are derived from synthetic time series generated from short climate records, which can often result in inaccuracies and are time-consuming and expensive to develop. A new method is presented here for the wet tropics region of northeast Australia. It uses lidar-generated topographic cross sections of beach ridge plains, which have been demonstrated to be deposited by marine inundations generated by tropical cyclones. Extreme value theory statistics are applied to data derived from the cross sections to generate return period plots for a given location. The results suggest that previous methods to estimate return periods using synthetic data sets have underestimated the magnitude/frequency relationship by at least an order of magnitude. The new method promises to be a more rapid, economical, and accurate assessment of the physical risk of these events.
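
    A minimal sketch of the extreme-value step described above, assuming the ridge cross sections have already been converted to per-event inundation magnitudes and an average event rate is known: fit a GEV distribution and convert it to return levels. The inputs and the event-rate conversion are illustrative assumptions, not the paper's data or exact procedure.

    ```python
    import numpy as np
    from scipy.stats import genextreme

    # per-event inundation magnitudes (e.g. ridge crest elevations, m) -- placeholder values
    magnitudes = np.array([2.1, 2.4, 2.6, 2.9, 3.1, 3.4, 3.8, 4.3, 4.9, 5.8])
    events_per_year = 10 / 600.0          # e.g. 10 ridges deposited over ~600 years

    shape, loc, scale = genextreme.fit(magnitudes)
    for T in (100, 500, 1000):            # return periods, years
        # per-event exceedance probability that yields one exceedance per T years on average
        p_exceed = 1.0 / (T * events_per_year)
        level = genextreme.ppf(1.0 - p_exceed, shape, loc=loc, scale=scale)
        print(f"{T}-yr return level ~ {level:.1f} m")
    ```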

  17. Auditory risk estimates for youth target shooting

    PubMed Central

    Meinke, Deanna K.; Murphy, William J.; Finan, Donald S.; Lankford, James E.; Flamme, Gregory A.; Stewart, Michael; Soendergaard, Jacob; Jerome, Trevor W.

    2015-01-01

    Objective: To characterize the impulse noise exposure and auditory risk for youth recreational firearm users engaged in outdoor target shooting events. The youth shooting positions are typically standing or sitting at a table, which places the firearm closer to the ground or a reflective surface compared with adult shooters. Design: Acoustic characteristics were examined and the auditory risk estimates were evaluated using contemporary damage-risk criteria for unprotected adult listeners and the 120-dB peak limit suggested by the World Health Organization (1999) for children. Study sample: Impulses were generated by 26 firearm/ammunition configurations representing rifles, shotguns, and pistols used by youth. Measurements were obtained relative to a youth shooter's left ear. Results: All firearms generated peak levels that exceeded the 120-dB peak limit suggested by the WHO for children. In general, shooting from the seated position over a tabletop increases the peak levels and LAeq8 and reduces the unprotected maximum permissible exposures (MPEs) for both rifles and pistols. Pistols pose the greatest auditory risk when fired over a tabletop. Conclusion: Youth should use smaller-caliber weapons, preferably from the standing position, and always wear hearing protection when engaging in shooting activities to reduce the risk of auditory damage. PMID:24564688

  18. Accurate Visual Heading Estimation at High Rotation Rate Without Oculomotor or Static-Depth Cues

    NASA Technical Reports Server (NTRS)

    Stone, Leland S.; Perrone, John A.; Null, Cynthia H. (Technical Monitor)

    1995-01-01

    It has been claimed that either oculomotor or static depth cues are needed to provide the signals about self-rotation necessary for accurate heading estimation at rotation rates above approximately 1 deg/s. We tested this hypothesis by simulating self-motion along a curved path with the eyes fixed in the head (plus or minus 16 deg/s of rotation). Curvilinear motion offers two advantages: 1) heading remains constant in retinotopic coordinates, and 2) there is no visual-oculomotor conflict (both actual and simulated eye position remain stationary). We simulated 400 ms of rotation combined with 16 m/s of translation at fixed angles with respect to gaze towards two vertical planes of random dots initially 12 and 24 m away, with a field of view of 45 degrees. Four subjects were asked to fixate a central cross and to respond whether they were translating to the left or right of straight-ahead gaze. From the psychometric curves, heading bias (mean) and precision (semi-interquartile range) were derived. The mean bias over 2-5 runs was 3.0, 4.0, -2.0, and -0.4 deg for the first author and three naive subjects, respectively (positive indicating towards the rotation direction). The mean precision was 2.0, 1.9, 3.1, and 1.6 deg, respectively. The ability of observers to make relatively accurate and precise heading judgments, despite the large rotational flow component, refutes the view that extra-flow-field information is necessary for human visual heading estimation at high rotation rates. Our results support models that process combined translational/rotational flow to estimate heading, but should not be construed to suggest that other cues do not play an important role when they are available to the observer.

  19. Extreme Earthquake Risk Estimation by Hybrid Modeling

    NASA Astrophysics Data System (ADS)

    Chavez, M.; Cabrera, E.; Ashworth, M.; Garcia, S.; Emerson, D.; Perea, N.; Salazar, A.; Moulinec, C.

    2012-12-01

    The estimation of the hazard and the economic consequences, i.e. the risk, associated with the occurrence of extreme-magnitude earthquakes in the neighborhood of urban or lifeline infrastructure, such as the 11 March 2011 Mw 9 Tohoku, Japan, earthquake, represents a complex challenge, as it involves the propagation of seismic waves in large volumes of the earth's crust, from unusually large seismic source ruptures up to the infrastructure location. The large number of casualties and huge economic losses observed for those earthquakes, some of which have a frequency of occurrence of hundreds or thousands of years, calls for the development of new paradigms and methodologies in order to generate better estimates both of the seismic hazard and of its consequences and, if possible, to estimate the probability distributions of their ground intensities and of their economic impacts (direct and indirect losses), in order to implement technological and economic policies to mitigate and reduce those consequences as much as possible. Here we propose a hybrid modeling approach which uses 3D seismic wave propagation (3DWP) and neural network (NN) modeling in order to estimate the seismic risk of extreme earthquakes. The 3DWP modeling is achieved by using a 3D finite difference code run on the ~100,000-core Blue Gene/Q supercomputer of the STFC Daresbury Laboratory in the UK, combined with empirical Green's function (EGF) techniques and NN algorithms. In particular, the 3DWP is used to generate broadband samples of the 3D wave propagation of extreme (plausible) earthquake scenarios corresponding to synthetic seismic sources, and to enlarge those samples by using feed-forward NNs. We present the results of the validation of the proposed hybrid modeling for Mw 8 subduction events, and show examples of its application to the estimation of the hazard and the economic consequences for extreme Mw 8.5 subduction earthquake scenarios with seismic sources in the Mexican

  20. Optimization of Correlation Kernel Size for Accurate Estimation of Myocardial Contraction and Relaxation

    NASA Astrophysics Data System (ADS)

    Honjo, Yasunori; Hasegawa, Hideyuki; Kanai, Hiroshi

    2012-07-01

    rates estimated using different kernel sizes were examined using the normalized mean-squared error of the estimated strain rate relative to the actual one obtained by the 1D phase-sensitive method. Compared with conventional kernel sizes, this result shows the potential of the proposed correlation kernel to enable more accurate measurement of the strain rate. In the in vivo measurement, the regional instantaneous velocities and strain rates in the radial direction of the heart wall were analyzed in detail at an extremely high temporal resolution (frame rate of 860 Hz). In this study, transitions between contraction and relaxation could be detected by 2D tracking. These results indicate the potential of this method for high-accuracy estimation of strain rates and for detailed analyses of the physiological function of the myocardium.

  1. How accurately can we estimate energetic costs in a marine top predator, the king penguin?

    PubMed

    Halsey, Lewis G; Fahlman, Andreas; Handrich, Yves; Schmidt, Alexander; Woakes, Anthony J; Butler, Patrick J

    2007-01-01

    King penguins (Aptenodytes patagonicus) are one of the greatest consumers of marine resources. However, while their influence on the marine ecosystem is likely to be significant, only an accurate knowledge of their energy demands will indicate their true food requirements. Energy consumption has been estimated for many marine species using the heart rate-rate of oxygen consumption (fH-VO2) technique, and the technique has been applied successfully to answer eco-physiological questions. However, previous studies on the energetics of king penguins, based on developing or applying this technique, have raised a number of issues about the degree of validity of the technique for this species. These include the predictive validity of the present fH-VO2 equations across different seasons and individuals and during different modes of locomotion. In many cases, these issues also apply to other species for which the fH-VO2 technique has been applied. In the present study, the accuracy of three prediction equations for king penguins was investigated based on validity studies and on estimates of VO2 from published field fH data. The major conclusions from the present study are: (1) in contrast to that for walking, the fH-VO2 relationship for swimming king penguins is not affected by body mass; (2) prediction equation (1), log(VO2) = -0.279 + 1.24 log(fH) + 0.0237 t - 0.0157 log(fH) t, derived in a previous study, is the most suitable equation presently available for estimating VO2 in king penguins for all locomotory and nutritional states. A number of possible problems associated with producing an fH-VO2 relationship are discussed in the present study. Finally, a statistical method to include easy-to-measure morphometric characteristics, which may improve the accuracy of fH-VO2 prediction equations, is explained. PMID:17363231
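
    For illustration, prediction equation (1) quoted above can be transcribed directly into code. Base-10 logarithms are assumed, and the abstract does not define the units of fH and VO2 or the meaning of the covariate t; those definitions are given in the source paper.

    ```python
    import math

    def predicted_vo2(f_h, t):
        """Prediction equation (1) from the abstract:
        log(VO2) = -0.279 + 1.24*log(fH) + 0.0237*t - 0.0157*log(fH)*t
        Base-10 logs assumed; units of fH and VO2 and the meaning of t are as in the source paper.
        """
        log_fh = math.log10(f_h)
        log_vo2 = -0.279 + 1.24 * log_fh + 0.0237 * t - 0.0157 * log_fh * t
        return 10 ** log_vo2
    ```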

  2. Relating space radiation environments to risk estimates

    NASA Technical Reports Server (NTRS)

    Curtis, Stanley B.

    1993-01-01

    A number of considerations must go into the process of determining the risk of deleterious effects of space radiation to travelers. Among them are (1) determination of the components of the radiation environment (particle species, fluxes, and energy spectra) that the travelers will encounter, (2) determination of the effects of shielding provided by the spacecraft and the bodies of the travelers, which modify the incident particle spectra and mix of particles, and (3) determination of the relevant biological effects of the radiation in the organs of interest. The latter can then lead to an estimation of risk from a given space scenario. Clearly, the process spans many scientific disciplines, from solar and cosmic ray physics to radiation transport theory to the multistage problem of the induction by radiation of initial lesions in living material and their evolution via physical, chemical, and biological processes at the molecular, cellular, and tissue levels to produce the end point of importance.

  3. The Impact of Perceived Frailty on Surgeons’ Estimates of Surgical Risk

    PubMed Central

    Ferguson, Mark K.; Farnan, Jeanne; Hemmerich, Josh A.; Slawinski, Kris; Acevedo, Julissa; Small, Stephen

    2015-01-01

    Background Physicians are only moderately accurate in estimating surgical risk based on clinical vignettes. We assessed the impact of perceived frailty by measuring the influence of a short video of a standardized patient on surgical risk estimates. Methods Thoracic surgeons and cardiothoracic trainees estimated the risk of major complications for lobectomy based on clinical vignettes of varied risk categories (low, average, high). After each vignette, subjects viewed a randomly selected video of a standardized patient exhibiting either vigorous or frail behavior, then re-estimated risk. Subjects were asked to rate 5 vignettes paired with 5 different standardized patients. Results Seventy-one physicians participated. Initial risk estimates varied according to the vignette risk category: low, 15.2 ± 11.2% risk; average, 23.7 ± 16.1%; high, 37.3 ± 18.9%; p<0.001 by ANOVA. Concordant information in vignettes and videos moderately altered estimates (high risk vignette, frail video: 10.6 ± 27.5% increase in estimate, p=0.006; low risk vignette, vigorous video: 14.5 ± 45.0% decrease, p=0.009). Discordant findings influenced risk estimates more substantially (high risk vignette, vigorous video: 21.2 ± 23.5% decrease in second risk estimate, p<0.001; low risk vignette, frail video: 151.9 ± 209.8% increase, p<0.001). Conclusions Surgeons differentiated relative risk of lobectomy based on clinical vignettes. The effect of viewing videos was small when vignettes and videos were concordant; the effect was more substantial when vignettes and videos were discordant. The information will be helpful in training future surgeons in frailty recognition and risk estimation. PMID:24932570

  4. Crop area estimation based on remotely-sensed data with an accurate but costly subsample

    NASA Technical Reports Server (NTRS)

    Gunst, R. F.

    1983-01-01

    Alternatives to sampling-theory stratified and regression estimators of crop production and timber biomass were examined. An alternative estimator which is viewed as especially promising is the errors-in-variables regression estimator. Investigations established the need for caution with this estimator when the ratio of the two error variances is not precisely known.

  5. Software risk estimation and management techniques at JPL

    NASA Technical Reports Server (NTRS)

    Hihn, J.; Lum, K.

    2002-01-01

    In this talk we discuss how uncertainty has been incorporated into the JPL software model through probability-based estimates, how risk is addressed, and how cost risk is currently being explored via a variety of approaches, from traditional risk lists to detailed WBS-based risk estimates to the Defect Detection and Prevention (DDP) tool.

  6. IMPROVED RISK ESTIMATES FOR CARBON TETRACHLORIDE

    SciTech Connect

    Benson, Janet M.; Springer, David L.

    1999-12-31

    Carbon tetrachloride has been used extensively within the DOE nuclear weapons facilities. Rocky Flats was formerly the largest-volume consumer of CCl4 in the United States, using 5000 gallons in 1977 alone (Ripple, 1992). At the Hanford site, several hundred thousand gallons of CCl4 were discharged between 1955 and 1973 into underground cribs for storage. Levels of CCl4 in groundwater at highly contaminated sites at the Hanford facility have exceeded the drinking water standard of 5 ppb by several orders of magnitude (Illman, 1993). High levels of CCl4 at these facilities represent a potential health hazard for workers conducting cleanup operations and for surrounding communities. The level of CCl4 cleanup required at these sites and the associated costs are driven by current human health risk estimates, which assume that CCl4 is a genotoxic carcinogen. The overall purpose of these studies was to improve the scientific basis for assessing the health risk associated with human exposure to CCl4. Specific research objectives of this project were to: (1) compare the rates of CCl4 metabolism by rats, mice and hamsters in vivo and extrapolate those rates to man based on parallel studies of the metabolism of CCl4 by rat, mouse, hamster and human hepatic microsomes in vitro; (2) using hepatic microsome preparations, determine the role of specific cytochrome P450 isoforms in CCl4-mediated toxicity and the effects of repeated inhalation and ingestion of CCl4 on these isoforms; and (3) evaluate the toxicokinetics of inhaled CCl4 in rats, mice and hamsters. This information has been used to improve the physiologically based pharmacokinetic (PBPK) model for CCl4 originally developed by Paustenbach et al. (1988) and more recently revised by Thrall and Kenny (1996). Another major objective of the project was to provide scientific evidence that CCl4, like chloroform, is a hepatocarcinogen only when exposure results in cell damage, cell killing and regenerative proliferation. In

  7. Determining Sample Size for Accurate Estimation of the Squared Multiple Correlation Coefficient.

    ERIC Educational Resources Information Center

    Algina, James; Olejnik, Stephen

    2000-01-01

    Discusses determining sample size for estimation of the squared multiple correlation coefficient and presents regression equations that permit determination of the sample size for estimating this parameter for up to 20 predictor variables. (SLD)

  8. A robust and accurate center-frequency estimation (RACE) algorithm for improving motion estimation performance of SinMod on tagged cardiac MR images without known tagging parameters.

    PubMed

    Liu, Hong; Wang, Jie; Xu, Xiangyang; Song, Enmin; Wang, Qian; Jin, Renchao; Hung, Chih-Cheng; Fei, Baowei

    2014-11-01

    A robust and accurate center-frequency (CF) estimation (RACE) algorithm for improving the performance of the local sine-wave modeling (SinMod) method, which is a good motion estimation method for tagged cardiac magnetic resonance (MR) images, is proposed in this study. The RACE algorithm can automatically, effectively, and efficiently produce a very appropriate CF estimate for the SinMod method, under the circumstance that the specified tagging parameters are unknown, on account of two key techniques: (1) the well-known mean-shift algorithm, which can provide accurate and rapid CF estimation; and (2) an original two-direction-combination strategy, which can further enhance the accuracy and robustness of CF estimation. Several other available CF estimation algorithms are included for comparison. Several validation approaches that can work on real data without ground truth are specially designed. Experimental results on in vivo human cardiac data demonstrate the significance of accurate CF estimation for SinMod, and validate the effectiveness of RACE in improving the motion estimation performance of SinMod.

  9. Estimation of health risks from radiation exposures

    SciTech Connect

    Randolph, M.L.

    1983-08-01

    An informal presentation is given of the cancer and genetic risks from exposures to ionizing radiations. The risks from plausible radiation exposures are shown to be comparable to other commonly encountered risks.

  10. Estimation method of point spread function based on Kalman filter for accurately evaluating real optical properties of photonic crystal fibers.

    PubMed

    Shen, Yan; Lou, Shuqin; Wang, Xin

    2014-03-20

    The evaluation accuracy of real optical properties of photonic crystal fibers (PCFs) is determined by the accurate extraction of air hole edges from microscope images of cross sections of practical PCFs. A novel estimation method of point spread function (PSF) based on Kalman filter is presented to rebuild the micrograph image of the PCF cross-section and thus evaluate real optical properties for practical PCFs. Through tests on both artificially degraded images and microscope images of cross sections of practical PCFs, we prove that the proposed method can achieve more accurate PSF estimation and lower PSF variance than the traditional Bayesian estimation method, and thus also reduce the defocus effect. With this method, we rebuild the microscope images of two kinds of commercial PCFs produced by Crystal Fiber and analyze the real optical properties of these PCFs. Numerical results are in accord with the product parameters.

  11. Bayesian parameter estimation of a k-ε model for accurate jet-in-crossflow simulations

    DOE PAGES

    Ray, Jaideep; Lefantzi, Sophia; Arunajatesan, Srinivasan; Dechant, Lawrence

    2016-05-31

    Reynolds-averaged Navier–Stokes models are not very accurate for high-Reynolds-number compressible jet-in-crossflow interactions. The inaccuracy arises from the use of inappropriate model parameters and model-form errors in the Reynolds-averaged Navier–Stokes model. In this study, the hypothesis is pursued that Reynolds-averaged Navier–Stokes predictions can be significantly improved by using parameters inferred from experimental measurements of a supersonic jet interacting with a transonic crossflow.

  12. Accurate state estimation for a hydraulic actuator via a SDRE nonlinear filter

    NASA Astrophysics Data System (ADS)

    Strano, Salvatore; Terzo, Mario

    2016-06-01

    The state estimation in hydraulic actuators is a fundamental tool for the detection of faults or a valid alternative to the installation of sensors. Due to the hard nonlinearities that characterize hydraulic actuators, the performance of linear or linearization-based state estimation techniques is strongly limited. In order to overcome these limits, this paper focuses on an alternative nonlinear estimation method based on the State-Dependent Riccati Equation (SDRE). The technique is able to fully take into account the system nonlinearities and the measurement noise. A fifth-order nonlinear model is derived and employed for the synthesis of the estimator. Simulations and experimental tests have been conducted, and comparisons with the widely used Extended Kalman Filter (EKF) are illustrated. The results show the effectiveness of the SDRE-based technique for applications characterized by non-negligible nonlinearities such as dead zones and friction.

  13. The GFR and GFR decline cannot be accurately estimated in type 2 diabetics.

    PubMed

    Gaspari, Flavio; Ruggenenti, Piero; Porrini, Esteban; Motterlini, Nicola; Cannata, Antonio; Carrara, Fabiola; Jiménez Sosa, Alejandro; Cella, Claudia; Ferrari, Silvia; Stucchi, Nadia; Parvanova, Aneliya; Iliev, Ilian; Trevisan, Roberto; Bossi, Antonio; Zaletel, Jelka; Remuzzi, Giuseppe

    2013-07-01

    There are no adequate studies that have formally tested the performance of different estimating formulas in patients with type 2 diabetes both with and without overt nephropathy. Here we evaluated the agreement between baseline GFRs, GFR changes at month 6, and long-term GFR decline measured by iohexol plasma clearance or estimated by 15 creatinine-based formulas in 600 type 2 diabetics followed for a median of 4.0 years. Ninety patients were hyperfiltering. The number of those identified by the estimation formulas ranged from 0 to 24; 58 were not identified by any formula. Baseline GFR was significantly underestimated and a 6-month GFR reduction was missed in hyperfiltering patients. Long-term GFR decline was also underestimated by all formulas in the whole study group and in hyper-, normo-, and hypofiltering patients considered separately. Five formulas generated positive slopes in hyperfiltering patients. Baseline concordance correlation coefficients and total deviation indexes ranged from 32.1% to 92.6% and from 0.21 to 0.53, respectively. Concordance correlation coefficients between estimated and measured long-term GFR decline ranged from -0.21 to 0.35. The agreement between estimated and measured values was also poor within each subgroup considered separately. Thus, our study questions the use of any estimation formula to identify hyperfiltering patients and to monitor renal disease progression and response to treatment in type 2 diabetics without overt nephropathy.
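
    For reference, the concordance correlation coefficient reported above is Lin's coefficient, which penalizes both poor correlation and systematic bias between estimated and measured values; a minimal implementation is sketched below with illustrative numbers (this is not the authors' code).

    ```python
    import numpy as np

    def concordance_correlation(measured, estimated):
        """Lin's concordance correlation coefficient between two paired series."""
        x = np.asarray(measured, dtype=float)
        y = np.asarray(estimated, dtype=float)
        mx, my = x.mean(), y.mean()
        vx, vy = x.var(), y.var()                 # population variances
        cov = ((x - mx) * (y - my)).mean()
        return 2.0 * cov / (vx + vy + (mx - my) ** 2)

    # e.g. measured iohexol GFR vs. formula-estimated GFR (ml/min/1.73 m^2, illustrative values)
    print(concordance_correlation([95, 110, 130, 80, 60], [90, 100, 115, 85, 70]))
    ```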

  14. Unilateral Prostate Cancer Cannot be Accurately Predicted in Low-Risk Patients

    SciTech Connect

    Isbarn, Hendrik; Karakiewicz, Pierre I.; Vogel, Susanne

    2010-07-01

    Purpose: Hemiablative therapy (HAT) is increasing in popularity for treatment of patients with low-risk prostate cancer (PCa). The validity of this therapeutic modality, which exclusively treats PCa within a single prostate lobe, rests on accurate staging. We tested the accuracy of unilaterally unremarkable biopsy findings in cases of low-risk PCa patients who are potential candidates for HAT. Methods and Materials: The study population consisted of 243 men with clinical stage ≤T2a, a prostate-specific antigen (PSA) concentration of <10 ng/ml, a biopsy-proven Gleason sum of ≤6, and a maximum of 2 ipsilateral positive biopsy results out of 10 or more cores. All men underwent a radical prostatectomy, and pathology stage was used as the gold standard. Univariable and multivariable logistic regression models were tested for significant predictors of unilateral, organ-confined PCa. These predictors consisted of PSA, %fPSA (defined as the quotient of free [uncomplexed] PSA divided by the total PSA), clinical stage (T2a vs. T1c), gland volume, and number of positive biopsy cores (2 vs. 1). Results: Despite unilateral stage at biopsy, bilateral or even non-organ-confined PCa was reported in 64% of all patients. In multivariable analyses, no variable could clearly and independently predict the presence of unilateral PCa. This was reflected in an overall accuracy of 58% (95% confidence interval, 50.6-65.8%). Conclusions: Two-thirds of patients with unilateral low-risk PCa, confirmed by clinical stage and biopsy findings, have bilateral or non-organ-confined PCa at radical prostatectomy. This alarming finding questions the safety and validity of HAT.

  15. Accurate estimate of α variation and isotope shift parameters in Na and Mg+

    NASA Astrophysics Data System (ADS)

    Sahoo, B. K.

    2010-12-01

    We present accurate calculations of fine-structure constant variation coefficients and isotope shifts in Na and Mg+ using the relativistic coupled-cluster method. In our approach, we are able to discover the roles of various correlation effects explicitly to all orders in these calculations. Most of the results, especially for the excited states, are reported for the first time. It is possible to ascertain suitable anchor and probe lines for the studies of possible variation in the fine-structure constant by using the above results in the considered systems.

  16. Precision Pointing Control to and Accurate Target Estimation of a Non-Cooperative Vehicle

    NASA Technical Reports Server (NTRS)

    VanEepoel, John; Thienel, Julie; Sanner, Robert M.

    2006-01-01

    In 2004, NASA began investigating a robotic servicing mission for the Hubble Space Telescope (HST). Such a mission would not only require estimates of the HST attitude and rates in order to achieve capture by the proposed Hubble Robotic Vehicle (HRV), but also precision control to achieve the desired rate and maintain the orientation to successfully dock with HST. To generalize the situation, HST is the target vehicle and HRV is the chaser. This work presents a nonlinear approach for estimating the body rates of a non-cooperative target vehicle, and coupling this estimation to a control scheme. Non-cooperative in this context relates to the target vehicle no longer having the ability to maintain attitude control or transmit attitude knowledge.

  17. Some recommendations for an accurate estimation of Lanice conchilega density based on tube counts

    NASA Astrophysics Data System (ADS)

    van Hoey, Gert; Vincx, Magda; Degraer, Steven

    2006-12-01

    The tube building polychaete Lanice conchilega is a common and ecologically important species in intertidal and shallow subtidal sands. It builds a characteristic tube with ragged fringes and can retract rapidly into its tube to depths of more than 20 cm. Therefore, it is very difficult to sample L. conchilega individuals, especially with a Van Veen grab. Consequently, many studies have used tube counts as estimates of real densities. This study reports on some aspects to be considered when using tube counts as a density estimate of L. conchilega, based on intertidal and subtidal samples. Due to its accuracy and independence of sampling depth, the tube method is considered the prime method to estimate the density of L. conchilega. However, caution is needed when analyzing samples with fragile young individuals and samples from areas where temporary physical disturbance is likely to occur.

  18. Accurate State Estimation and Tracking of a Non-Cooperative Target Vehicle

    NASA Technical Reports Server (NTRS)

    Thienel, Julie K.; Sanner, Robert M.

    2006-01-01

    Autonomous space rendezvous scenarios require knowledge of the target vehicle state in order to safely dock with the chaser vehicle. Ideally, the target vehicle state information is derived from telemetered data, or with the use of known tracking points on the target vehicle. However, if the target vehicle is non-cooperative and does not have the ability to maintain attitude control, or transmit attitude knowledge, the docking becomes more challenging. This work presents a nonlinear approach for estimating the body rates of a non-cooperative target vehicle, and coupling this estimation to a tracking control scheme. The approach is tested with the robotic servicing mission concept for the Hubble Space Telescope (HST). Such a mission would not only require estimates of the HST attitude and rates, but also precision control to achieve the desired rate and maintain the orientation to successfully dock with HST.

  19. A microbial clock provides an accurate estimate of the postmortem interval in a mouse model system

    PubMed Central

    Metcalf, Jessica L; Wegener Parfrey, Laura; Gonzalez, Antonio; Lauber, Christian L; Knights, Dan; Ackermann, Gail; Humphrey, Gregory C; Gebert, Matthew J; Van Treuren, Will; Berg-Lyons, Donna; Keepers, Kyle; Guo, Yan; Bullard, James; Fierer, Noah; Carter, David O; Knight, Rob

    2013-01-01

    Establishing the time since death is critical in every death investigation, yet existing techniques are susceptible to a range of errors and biases. For example, forensic entomology is widely used to assess the postmortem interval (PMI), but errors can range from days to months. Microbes may provide a novel method for estimating PMI that avoids many of these limitations. Here we show that postmortem microbial community changes are dramatic, measurable, and repeatable in a mouse model system, allowing PMI to be estimated within approximately 3 days over 48 days. Our results provide a detailed understanding of bacterial and microbial eukaryotic ecology within a decomposing corpse system and suggest that microbial community data can be developed into a forensic tool for estimating PMI. DOI: http://dx.doi.org/10.7554/eLife.01104.001 PMID:24137541

  20. Fast and accurate probability density estimation in large high dimensional astronomical datasets

    NASA Astrophysics Data System (ADS)

    Gupta, Pramod; Connolly, Andrew J.; Gardner, Jeffrey P.

    2015-01-01

    Astronomical surveys will generate measurements of hundreds of attributes (e.g. color, size, shape) on hundreds of millions of sources. Analyzing these large, high-dimensional data sets will require efficient algorithms for data analysis. An example of this is probability density estimation, which is at the heart of many classification problems such as the separation of stars and quasars based on their colors. Popular density estimation techniques use binning or kernel density estimation. Kernel density estimation has a small memory footprint but often requires large computational resources. Binning has small computational requirements, but it is usually implemented with multi-dimensional arrays, which leads to memory requirements that scale exponentially with the number of dimensions. Hence both techniques do not scale well to large data sets in high dimensions. We present an alternative approach of binning implemented with hash tables (BASH tables). This approach uses the sparseness of data in the high-dimensional space to ensure that the memory requirements are small. However, hashing requires some extra computation, so a priori it is not clear whether the reduction in memory requirements will lead to increased computational requirements. Through an implementation of BASH tables in C++ we show that the additional computational requirements of hashing are negligible. Hence this approach has small memory and computational requirements. We apply our density estimation technique to photometric selection of quasars using non-parametric Bayesian classification and show that the accuracy of the classification is the same as that of earlier approaches. Since the BASH table approach is one to three orders of magnitude faster than the earlier approaches, it may be useful in various other applications of density estimation in astrostatistics.
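
    The paper's implementation is in C++, but the hash-table binning idea itself is compact enough to sketch in a few lines of Python: only occupied bins are stored, so memory grows with the number of occupied cells rather than exponentially with dimension, and a density query divides the bin count by N times the bin volume. The names and the fixed bin width below are illustrative choices, not the paper's design.

    ```python
    from collections import defaultdict
    import numpy as np

    class HashBinDensity:
        """Sparse multidimensional histogram density estimator (hash-table binning)."""

        def __init__(self, bin_width):
            self.h = float(bin_width)
            self.counts = defaultdict(int)   # only occupied bins are stored
            self.n = 0

        def _key(self, point):
            return tuple(np.floor(np.asarray(point) / self.h).astype(int))

        def fit(self, data):
            for p in data:
                self.counts[self._key(p)] += 1
            self.n += len(data)
            return self

        def density(self, point):
            bin_volume = self.h ** len(point)
            return self.counts.get(self._key(point), 0) / (self.n * bin_volume)

    # e.g. a 5-dimensional attribute (color) space
    data = np.random.default_rng(2).normal(size=(100_000, 5))
    est = HashBinDensity(bin_width=0.5).fit(data)
    print(est.density(np.zeros(5)))
    ```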

  1. Crop area estimation based on remotely-sensed data with an accurate but costly subsample

    NASA Technical Reports Server (NTRS)

    Gunst, R. F.

    1985-01-01

    Research activities conducted under the auspices of National Aeronautics and Space Administration Cooperative Agreement NCC 9-9 are discussed. During this contract period, research efforts were concentrated in two primary areas. The first area is an investigation of the use of measurement error models as alternatives to least squares regression estimators of crop production or timber biomass. The second primary area of investigation is the estimation of the mixing proportion of two-component mixture models. This report lists publications, technical reports, submitted manuscripts, and oral presentations generated by these research efforts. Possible areas of future research are mentioned.

  2. Spectral estimation from laser scanner data for accurate color rendering of objects

    NASA Astrophysics Data System (ADS)

    Baribeau, Rejean

    2002-06-01

    Estimation methods are studied for the recovery of the spectral reflectance across the visible range from sensing at just three discrete laser wavelengths. Methods based on principal component analysis and on spline interpolation are judged by the CIE94 color differences for some reference data sets. These include the Macbeth color checker, the OSA-UCS color charts, some artist pigments, and a collection of miscellaneous surface colors. The optimal three sampling wavelengths are also investigated. It is found that color can be estimated with an average accuracy of ΔE94 = 2.3 when the optimal wavelengths of 455 nm, 540 nm, and 610 nm are used.
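
    A minimal sketch of the principal-component approach described above: reflectance spectra are modeled as a mean plus a few basis spectra, and the basis weights are solved from the measurements at the three laser wavelengths. The training spectra and wavelength grid below are placeholders; the paper's reference data sets and optimal wavelengths would be substituted in practice.

    ```python
    import numpy as np

    def pca_basis(training_reflectances, n_components=3):
        """Mean spectrum and first principal-component basis vectors (rows = spectra)."""
        mean = training_reflectances.mean(axis=0)
        _, _, Vt = np.linalg.svd(training_reflectances - mean, full_matrices=False)
        return mean, Vt[:n_components]            # basis shape: (n_components, n_wavelengths)

    def reconstruct_spectrum(samples, sample_idx, mean, basis):
        """Estimate a full reflectance spectrum from measurements at a few wavelengths.

        samples    : measured reflectances at the laser wavelengths
        sample_idx : indices of those wavelengths in the full wavelength grid
        """
        A = basis[:, sample_idx].T                # (n_samples, n_components)
        w, *_ = np.linalg.lstsq(A, np.asarray(samples) - mean[sample_idx], rcond=None)
        return mean + w @ basis

    # illustrative use: 400-700 nm grid, lasers near 455, 540 and 610 nm
    wavelengths = np.arange(400, 701, 10)
    training = np.random.default_rng(3).uniform(0.05, 0.9, size=(200, wavelengths.size))  # placeholder spectra
    mean, basis = pca_basis(training)
    idx = [int(np.argmin(np.abs(wavelengths - wl))) for wl in (455, 540, 610)]
    spectrum = reconstruct_spectrum([0.2, 0.5, 0.4], idx, mean, basis)
    ```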

  3. Accurate radiocarbon age estimation using "early" measurements: a new approach to reconstructing the Paleolithic absolute chronology

    NASA Astrophysics Data System (ADS)

    Omori, Takayuki; Sano, Katsuhiro; Yoneda, Minoru

    2014-05-01

    This paper presents new correction approaches for "early" radiocarbon ages to reconstruct the Paleolithic absolute chronology. In order to discuss the time-space distribution of the replacement of archaic humans, including Neanderthals in Europe, by modern humans, a massive data set covering a wide area is needed. Today, several radiocarbon databases focused on the Paleolithic have been published and used for chronological studies. From the viewpoint of current analytical technology, however, any such database contains unreliable results that make interpretation of radiocarbon dates difficult. Most of these unreliable ages were published in the early days of radiocarbon analysis. In recent years, new analytical methods to determine highly accurate dates have been developed. Ultrafiltration and ABOx-SC methods, as new sample pretreatments for bone and charcoal respectively, have attracted attention because they can remove imperceptible contaminants and yield reliably accurate ages. In order to evaluate the reliability of "early" data, we investigated the differences and variability of radiocarbon ages under different pretreatments, and attempted to develop correction functions for the assessment of reliability. The corrected ages are expected to be more reliable and applicable to chronological research together with recent ages. Here, we introduce the methodological frameworks and archaeological applications.

  4. How Accurate and Robust Are the Phylogenetic Estimates of Austronesian Language Relationships?

    PubMed Central

    Greenhill, Simon J.; Drummond, Alexei J.; Gray, Russell D.

    2010-01-01

    We recently used computational phylogenetic methods on lexical data to test between two scenarios for the peopling of the Pacific. Our analyses of lexical data supported a pulse-pause scenario of Pacific settlement in which the Austronesian speakers originated in Taiwan around 5,200 years ago and rapidly spread through the Pacific in a series of expansion pulses and settlement pauses. We claimed that there was high congruence between traditional language subgroups and those observed in the language phylogenies, and that the estimated age of the Austronesian expansion at 5,200 years ago was consistent with the archaeological evidence. However, the congruence between the language phylogenies and the evidence from historical linguistics was not quantitatively assessed using tree comparison metrics. The robustness of the divergence time estimates to different calibration points was also not investigated exhaustively. Here we address these limitations by using a systematic tree comparison metric to calculate the similarity between the Bayesian phylogenetic trees and the subgroups proposed by historical linguistics, and by re-estimating the age of the Austronesian expansion using only the most robust calibrations. The results show that the Austronesian language phylogenies are highly congruent with the traditional subgroupings, and the date estimates are robust even when calculated using a restricted set of historical calibrations. PMID:20224774

  5. Risk analysis and Monte Carlo simulation applied to the generation of drilling AFE estimates

    SciTech Connect

    Peterson, S.K.; Murtha, J.A.; Schneider, F.F.

    1995-06-01

    This paper presents a method for developing an authorization-for-expenditure (AFE)-generating model and illustrates the technique with a specific offshore field development case study. The model combines Monte Carlo simulation and statistical analysis of historical drilling data to generate more accurate, risked AFE estimates. In addition to the general method, two examples of making AFE time estimates for North Sea wells with the presented techniques are given.
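    A minimal sketch of the general idea (Monte Carlo sampling of per-phase durations fitted to historical data, multiplied by a day rate to yield risked P10/P50/P90 AFE figures) is given below. All distributions, parameters, and the day rate are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(42)
n_trials = 10_000

# Hypothetical per-phase distributions, as if fitted to historical drilling data.
drilling_days = rng.lognormal(mean=np.log(30), sigma=0.25, size=n_trials)
completion_days = rng.lognormal(mean=np.log(12), sigma=0.35, size=n_trials)
trouble_days = rng.exponential(scale=3.0, size=n_trials)   # non-productive time

day_rate_usd = 250_000   # assumed spread rate; replace with contract data
total_cost = (drilling_days + completion_days + trouble_days) * day_rate_usd

# Risked AFE figures are read off the simulated cost distribution.
p10, p50, p90 = np.percentile(total_cost, [10, 50, 90])
print(f"P10 ${p10/1e6:.1f}M  P50 ${p50/1e6:.1f}M  P90 ${p90/1e6:.1f}M")
```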

  6. Accurate estimation of influenza epidemics using Google search data via ARGO.

    PubMed

    Yang, Shihao; Santillana, Mauricio; Kou, S C

    2015-11-24

    Accurate real-time tracking of influenza outbreaks helps public health officials make timely and meaningful decisions that could save lives. We propose an influenza tracking model, ARGO (AutoRegression with GOogle search data), that uses publicly available online search data. In addition to having a rigorous statistical foundation, ARGO outperforms all previously available Google-search-based tracking models, including the latest version of Google Flu Trends, even though it uses only low-quality search data as input from publicly available Google Trends and Google Correlate websites. ARGO not only incorporates the seasonality in influenza epidemics but also captures changes in people's online search behavior over time. ARGO is also flexible, self-correcting, robust, and scalable, making it a potentially powerful tool that can be used for real-time tracking of other social events at multiple temporal and spatial resolutions.

  7. Accurate estimation of influenza epidemics using Google search data via ARGO.

    PubMed

    Yang, Shihao; Santillana, Mauricio; Kou, S C

    2015-11-24

    Accurate real-time tracking of influenza outbreaks helps public health officials make timely and meaningful decisions that could save lives. We propose an influenza tracking model, ARGO (AutoRegression with GOogle search data), that uses publicly available online search data. In addition to having a rigorous statistical foundation, ARGO outperforms all previously available Google-search-based tracking models, including the latest version of Google Flu Trends, even though it uses only low-quality search data as input from publicly available Google Trends and Google Correlate websites. ARGO not only incorporates the seasonality in influenza epidemics but also captures changes in people's online search behavior over time. ARGO is also flexible, self-correcting, robust, and scalable, making it a potentially powerful tool that can be used for real-time tracking of other social events at multiple temporal and spatial resolutions. PMID:26553980

  8. Accurate estimation of influenza epidemics using Google search data via ARGO

    PubMed Central

    Yang, Shihao; Santillana, Mauricio; Kou, S. C.

    2015-01-01

    Accurate real-time tracking of influenza outbreaks helps public health officials make timely and meaningful decisions that could save lives. We propose an influenza tracking model, ARGO (AutoRegression with GOogle search data), that uses publicly available online search data. In addition to having a rigorous statistical foundation, ARGO outperforms all previously available Google-search–based tracking models, including the latest version of Google Flu Trends, even though it uses only low-quality search data as input from publicly available Google Trends and Google Correlate websites. ARGO not only incorporates the seasonality in influenza epidemics but also captures changes in people’s online search behavior over time. ARGO is also flexible, self-correcting, robust, and scalable, making it a potentially powerful tool that can be used for real-time tracking of other social events at multiple temporal and spatial resolutions. PMID:26553980
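    A simplified sketch of an ARGO-style estimator is shown below: autoregressive lags of the flu signal are stacked with search-volume features and fit with an L1-penalized regression. The synthetic data, lag length, and penalty are assumptions; the published model additionally transforms the incidence data and retrains dynamically, which is not reproduced here.

```python
import numpy as np
from sklearn.linear_model import Lasso

def build_design(ili, trends, n_lags=52):
    """Stack autoregressive lags of the flu signal with search-query features."""
    rows, targets = [], []
    for t in range(n_lags, len(ili)):
        rows.append(np.concatenate([ili[t - n_lags:t], trends[t]]))
        targets.append(ili[t])
    return np.array(rows), np.array(targets)

# Hypothetical inputs: weekly ILI rates and a (weeks x queries) matrix of
# search volumes for flu-related queries.
rng = np.random.default_rng(1)
ili = np.abs(np.sin(np.arange(300) * 2 * np.pi / 52)) + rng.normal(0, 0.05, 300)
trends = np.clip(ili[:, None] + rng.normal(0, 0.1, (300, 20)), 0, None)

X, y = build_design(ili, trends)
model = Lasso(alpha=0.01).fit(X[:-52], y[:-52])      # train on earlier seasons
rmse = np.sqrt(np.mean((model.predict(X[-52:]) - y[-52:]) ** 2))
print("held-out RMSE:", rmse)
```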

  9. Raman spectroscopy for highly accurate estimation of the age of refrigerated porcine muscle

    NASA Astrophysics Data System (ADS)

    Timinis, Constantinos; Pitris, Costas

    2016-03-01

    The high water content of meat, combined with all the nutrients it contains, makes it vulnerable to spoilage at all stages of production and storage, even when refrigerated at 5 °C. A non-destructive and in situ tool for meat sample testing, which could provide an accurate indication of the storage time of meat, would be very useful for the control of meat quality as well as for consumer safety. The proposed solution is based on Raman spectroscopy, which is non-invasive and can be applied in situ. For the purposes of this project, 42 meat samples from 14 animals were obtained and three Raman spectra per sample were collected every two days for two weeks. The spectra were subsequently processed and the sample age was calculated using a set of linear differential equations. In addition, the samples were classified in categories corresponding to the age in 2-day steps (i.e., 0, 2, 4, 6, 8, 10, 12 or 14 days old), using linear discriminant analysis and cross-validation. Contrary to other studies, where the samples were simply grouped into two categories (higher or lower quality, suitable or unsuitable for human consumption, etc.), in this study the age was predicted with a mean error of ~1 day (20%) or classified, in 2-day steps, with 100% accuracy. Although Raman spectroscopy has been used in the past for the analysis of meat samples, the proposed methodology predicts the sample age far more accurately than any previous report in the literature.
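    For the classification step described above, a hedged sketch of linear discriminant analysis with cross-validation over 2-day age classes is given below. The spectra here are synthetic stand-ins with an assumed age-dependent drift; real preprocessed Raman spectra would replace them.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

# Hypothetical stand-in data: rows are preprocessed Raman spectra, labels are
# storage ages in 2-day steps; real spectra would replace this synthetic block.
rng = np.random.default_rng(0)
ages = np.repeat(np.arange(0, 16, 2), 30)                 # 0, 2, ..., 14 days
spectra = rng.normal(size=(ages.size, 100))
spectra += np.outer(ages, np.linspace(0.0, 0.05, 100))    # age-dependent drift

lda = LinearDiscriminantAnalysis()
scores = cross_val_score(lda, spectra, ages, cv=5)        # cross-validated accuracy
print("mean accuracy over folds:", scores.mean())
```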

  10. Are satellite based rainfall estimates accurate enough for crop modelling under Sahelian climate?

    NASA Astrophysics Data System (ADS)

    Ramarohetra, J.; Sultan, B.

    2012-04-01

    Agriculture is considered the most climate-dependent human activity. In West Africa, and especially in the sudano-sahelian zone, rain-fed agriculture - which represents 93% of cultivated areas and is the means of support of 70% of the active population - is highly vulnerable to precipitation variability. To better understand and anticipate climate impacts on agriculture, crop models - which estimate crop yield from climate information (e.g. rainfall, temperature, insolation, humidity) - have been developed. These crop models are useful (i) in ex ante analysis to quantify the impact of implementing different strategies - crop management (e.g. choice of varieties, sowing date), crop insurance or medium-range weather forecasts - on yields, (ii) for early warning systems and (iii) to assess future food security. Yet the successful application of these models depends on the accuracy of their climatic drivers. In the sudano-sahelian zone, the quality of precipitation estimates is therefore a key factor in understanding and anticipating climate impacts on agriculture via crop modelling and yield estimation. Different kinds of precipitation estimates can be used. Ground measurements have long time series but suffer from insufficient network density, a large proportion of missing values, delays in reporting, and limited availability. An answer to these shortcomings may lie in the field of remote sensing, which provides satellite-based precipitation estimates. However, satellite-based rainfall estimates (SRFE) are not a direct measurement but rather an estimation of precipitation. Used as input for crop models, they determine the performance of the simulated yields; hence SRFE require validation. The SARRAH crop model is used to model three different varieties of pearl millet (HKP, MTDO, Souna3) in a square degree centred on 13.5°N and 2.5°E, in Niger. Eight satellite-based rainfall daily products (PERSIANN, CMORPH, TRMM 3b42-RT, GSMAP MKV+, GPCP, TRMM 3b42v6, RFEv2 and

  11. Techniques for accurate estimation of net discharge in a tidal channel

    USGS Publications Warehouse

    Simpson, Michael R.; Bland, Roger

    1999-01-01

    An ultrasonic velocity meter discharge-measurement site in a tidally affected region of the Sacramento-San Joaquin rivers was used to study the accuracy of the index velocity calibration procedure. Calibration data consisting of ultrasonic velocity meter index velocity and concurrent acoustic Doppler discharge measurement data were collected during three time periods. The relative magnitude of equipment errors, acoustic Doppler discharge measurement errors, and calibration errors were evaluated. Calibration error was the most significant source of error in estimating net discharge. Using a comprehensive calibration method, net discharge estimates developed from the three sets of calibration data differed by less than an average of 4 cubic meters per second. Typical maximum flow rates during the data-collection period averaged 750 cubic meters per second.
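    The essence of an index-velocity calibration can be sketched as a regression of ADCP-derived mean velocity on the meter's index velocity, with net discharge then rated from the continuous index record. The sketch below uses hypothetical calibration pairs and an assumed cross-sectional area; the comprehensive calibration method evaluated in the study is more involved.

```python
import numpy as np

# Hypothetical calibration pairs: ultrasonic meter index velocity (m/s) and
# concurrent ADCP-measured discharge (m^3/s) at a tidal site of known area.
index_velocity = np.array([-0.9, -0.5, -0.1, 0.2, 0.6, 1.0, 1.3])
adcp_discharge = np.array([-640.0, -350.0, -60.0, 150.0, 430.0, 720.0, 930.0])
area_m2 = 700.0                                   # assumed channel cross-section

mean_velocity = adcp_discharge / area_m2
slope, intercept = np.polyfit(index_velocity, mean_velocity, 1)

def rated_discharge(index_v):
    """Discharge rated from the index-velocity record via the fitted line."""
    return area_m2 * (intercept + slope * index_v)

# Net (tidally averaged) discharge over a record of index velocities.
record = np.array([-0.8, -0.3, 0.1, 0.5, 0.9, 0.4, -0.2])
print("net discharge:", rated_discharge(record).mean(), "m^3/s")
```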

  12. Plant DNA Barcodes Can Accurately Estimate Species Richness in Poorly Known Floras

    PubMed Central

    Costion, Craig; Ford, Andrew; Cross, Hugh; Crayn, Darren; Harrington, Mark; Lowe, Andrew

    2011-01-01

    Background Widespread uptake of DNA barcoding technology for vascular plants has been slow due to the relatively poor resolution of species discrimination (∼70%) and low sequencing and amplification success of one of the two official barcoding loci, matK. Studies to date have mostly focused on finding a solution to these intrinsic limitations of the markers, rather than posing questions that can maximize the utility of DNA barcodes for plants with the current technology. Methodology/Principal Findings Here we test the ability of plant DNA barcodes using the two official barcoding loci, rbcLa and matK, plus an alternative barcoding locus, trnH-psbA, to estimate the species diversity of trees in a tropical rainforest plot. Species discrimination accuracy was similar to findings from previous studies but species richness estimation accuracy proved higher, up to 89%. All combinations which included the trnH-psbA locus performed better at both species discrimination and richness estimation than matK, which showed little enhanced species discriminatory power when concatenated with rbcLa. The utility of the trnH-psbA locus is limited, however, by intraspecific variation that in some angiosperm families occurs as an inversion obscuring the monophyly of species. Conclusions/Significance We demonstrate for the first time, using a case study, the potential of plant DNA barcodes for the rapid estimation of species richness in taxonomically poorly known areas or cryptic populations, revealing a powerful new tool for rapid biodiversity assessment. The combination of the rbcLa and trnH-psbA loci performed better for this purpose than any two-locus combination that included matK. We show that although DNA barcodes fail to discriminate all species of plants, new perspectives and methods on biodiversity value and quantification may overshadow some of these shortcomings by applying barcode data in new ways. PMID:22096501

  13. Accurate distortion estimation and optimal bandwidth allocation for scalable H.264 video transmission over MIMO systems.

    PubMed

    Jubran, Mohammad K; Bansal, Manu; Kondi, Lisimachos P; Grover, Rohan

    2009-01-01

    In this paper, we propose an optimal strategy for the transmission of scalable video over packet-based multiple-input multiple-output (MIMO) systems. The scalable extension of H.264/AVC that provides a combined temporal, quality and spatial scalability is used. For given channel conditions, we develop a method for the estimation of the distortion of the received video and propose different error concealment schemes. We show the accuracy of our distortion estimation algorithm in comparison with simulated wireless video transmission with packet errors. In the proposed MIMO system, we employ orthogonal space-time block codes (O-STBC) that guarantee independent transmission of different symbols within the block code. In the proposed constrained bandwidth allocation framework, we use the estimated end-to-end decoder distortion to optimally select the application layer parameters, i.e., quantization parameter (QP) and group of pictures (GOP) size, and physical layer parameters, i.e., rate-compatible turbo (RCPT) code rate and symbol constellation. Results show the substantial performance gain by using different symbol constellations across the scalable layers as compared to a fixed constellation.

  14. Compact and accurate linear and nonlinear autoregressive moving average model parameter estimation using laguerre functions.

    PubMed

    Chon, K H; Cohen, R J; Holstein-Rathlou, N H

    1997-01-01

    A linear and nonlinear autoregressive moving average (ARMA) identification algorithm is developed for modeling time series data. The algorithm uses Laguerre expansion of kernels (LEK) to estimate Volterra-Wiener kernels. However, instead of estimating linear and nonlinear system dynamics via moving average models, as is the case for the Volterra-Wiener analysis, we propose an ARMA model-based approach. The proposed algorithm is essentially the same as LEK, but this algorithm is extended to include past values of the output as well. Thus, all of the advantages associated with using the Laguerre function remain with our algorithm; but, by extending the algorithm to the linear and nonlinear ARMA model, a significant reduction in the number of Laguerre functions can be made, compared with the Volterra-Wiener approach. This translates into a more compact system representation and makes the physiological interpretation of higher order kernels easier. Furthermore, simulation results show better performance of the proposed approach in estimating the system dynamics than LEK in certain cases, and it remains effective in the presence of significant additive measurement noise. PMID:9236985

  15. Evaluation of the sample needed to accurately estimate outcome-based measurements of dairy welfare on farm.

    PubMed

    Endres, M I; Lobeck-Luchterhand, K M; Espejo, L A; Tucker, C B

    2014-01-01

    Dairy welfare assessment programs are becoming more common on US farms. Outcome-based measurements, such as locomotion, hock lesion, hygiene, and body condition scores (BCS), are included in these assessments. The objective of the current study was to investigate the proportion of cows in the pen or subsamples of pens on a farm needed to provide an accurate estimate of the previously mentioned measurements. In experiment 1, we evaluated cows in 52 high pens (50 farms) for lameness using a 1- to 5-scale locomotion scoring system (1 = normal and 5 = severely lame; 24.4 and 6% of animals were scored ≥ 3 or ≥ 4, respectively). Cows were also given a BCS using a 1- to 5-scale, where 1 = emaciated and 5 = obese; cows were rarely thin (BCS ≤ 2; 0.10% of cows) or fat (BCS ≥ 4; 0.11% of cows). Hygiene scores were assessed on a 1- to 5-scale with 1 = clean and 5 = severely dirty; 54.9% of cows had a hygiene score ≥ 3. Hock injuries were classified as 1 = no lesion, 2 = mild lesion, and 3 = severe lesion; 10.6% of cows had a score of 3. Subsets of data were created with 10 replicates of random sampling that represented 100, 90, 80, 70, 60, 50, 40, 30, 20, 15, 10, 5, and 3% of the cows measured/pen. In experiment 2, we scored the same outcome measures on all cows in lactating pens from 12 farms and evaluated using pen subsamples: high; high and fresh; high, fresh, and hospital; and high, low, and hospital. For both experiments, the association between the estimates derived from all subsamples and entire pen (experiment 1) or herd (experiment 2) prevalence was evaluated using linear regression. To be considered a good estimate, 3 criteria must be met: R(2)>0.9, slope = 1, and intercept = 0. In experiment 1, on average, recording 15% of the pen represented the percentage of clinically lame cows (score ≥ 3), whereas 30% needed to be measured to estimate severe lameness (score ≥ 4). Only 15% of the pen was needed to estimate the percentage of the herd with a hygiene
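    The validation logic described above (regressing subsample estimates against whole-pen prevalence and checking R(2)>0.9, slope = 1, and intercept = 0) can be sketched as follows. The pens, score distributions, and the 15% sampling fraction used here are synthetic assumptions for illustration, not the study's data.

```python
import numpy as np
from scipy import stats

def subsample_prevalence(scores, fraction, threshold, n_rep=10, rng=None):
    """Mean prevalence of score >= threshold over random subsamples of a pen."""
    rng = rng or np.random.default_rng(0)
    n = max(1, int(round(fraction * scores.size)))
    return np.mean([
        np.mean(rng.choice(scores, size=n, replace=False) >= threshold)
        for _ in range(n_rep)
    ])

# Hypothetical locomotion scores (1-5) for 50 pens of 150 cows each.
rng = np.random.default_rng(1)
pens = [rng.choice([1, 2, 3, 4, 5], size=150, p=[0.40, 0.35, 0.15, 0.07, 0.03])
        for _ in range(50)]

full = np.array([np.mean(pen >= 3) for pen in pens])            # whole-pen prevalence
sub = np.array([subsample_prevalence(pen, 0.15, 3, rng=rng) for pen in pens])

slope, intercept, r, *_ = stats.linregress(full, sub)
print(f"R^2={r**2:.3f} slope={slope:.2f} intercept={intercept:.3f}")
```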

  16. Relating space radiation environments to risk estimates

    SciTech Connect

    Curtis, S.B.

    1991-10-01

    This lecture will provide a bridge from the physical energy or LET spectra as might be calculated in an organ to the risk of carcinogenesis, a particular concern for extended missions to the moon or beyond to Mars. Topics covered will include (1) LET spectra expected from galactic cosmic rays, (2) probabilities that individual cell nuclei in the body will be hit by heavy galactic cosmic ray particles, (3) the conventional methods of calculating risks from a mixed environment of high and low LET radiation, (4) an alternate method which provides certain advantages using fluence-related risk coefficients (risk cross sections), and (5) directions for future research and development of these ideas.
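    Item (4) above, the fluence-based alternative, amounts to summing particle fluence times a risk cross section over LET bins. The sketch below illustrates only that bookkeeping; every number in it (fluences, cross sections, binning) is a placeholder, not data from the lecture.

```python
import numpy as np

# Hypothetical LET spectrum behind shielding: particle fluence per LET bin
# (particles / cm^2 / year) and assumed risk cross sections (cm^2) per bin.
let_bins_keV_um = np.array([10.0, 50.0, 100.0, 200.0, 500.0])
fluence = np.array([3e6, 4e5, 8e4, 1e4, 9e2])
risk_cross_section = np.array([1e-12, 8e-12, 3e-11, 7e-11, 1e-10])

# Fluence-based risk estimate: sum over LET bins of fluence x risk cross section.
excess_risk_per_year = np.sum(fluence * risk_cross_section)
print(f"excess risk per year: {excess_risk_per_year:.2e}")
```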

  17. Accurate Estimation of Airborne Ultrasonic Time-of-Flight for Overlapping Echoes

    PubMed Central

    Sarabia, Esther G.; Llata, Jose R.; Robla, Sandra; Torre-Ferrero, Carlos; Oria, Juan P.

    2013-01-01

    In this work, an analysis of the transmission of ultrasonic signals generated by piezoelectric sensors for air applications is presented. Based on this analysis, an ultrasonic response model is obtained for its application to the recognition of objects and structured environments for navigation by autonomous mobile robots. This model enables the analysis of the ultrasonic response that is generated using a pair of sensors in transmitter-receiver configuration using the pulse-echo technique. This is very interesting for recognizing surfaces that simultaneously generate a multiple echo response. This model takes into account the effect of the radiation pattern, the resonant frequency of the sensor, the number of cycles of the excitation pulse, the dynamics of the sensor and the attenuation with distance in the medium. This model has been developed, programmed and verified through a battery of experimental tests. Using this model a new procedure for obtaining accurate time of flight is proposed. This new method is compared with traditional ones, such as threshold or correlation, to highlight its advantages and drawbacks. Finally the advantages of this method are demonstrated for calculating multiple times of flight when the echo is formed by several overlapping echoes. PMID:24284774

  18. Accurate estimation of airborne ultrasonic time-of-flight for overlapping echoes.

    PubMed

    Sarabia, Esther G; Llata, Jose R; Robla, Sandra; Torre-Ferrero, Carlos; Oria, Juan P

    2013-01-01

    In this work, an analysis of the transmission of ultrasonic signals generated by piezoelectric sensors for air applications is presented. Based on this analysis, an ultrasonic response model is obtained for its application to the recognition of objects and structured environments for navigation by autonomous mobile robots. This model enables the analysis of the ultrasonic response that is generated using a pair of sensors in transmitter-receiver configuration using the pulse-echo technique. This is very interesting for recognizing surfaces that simultaneously generate a multiple echo response. This model takes into account the effect of the radiation pattern, the resonant frequency of the sensor, the number of cycles of the excitation pulse, the dynamics of the sensor and the attenuation with distance in the medium. This model has been developed, programmed and verified through a battery of experimental tests. Using this model a new procedure for obtaining accurate time of flight is proposed. This new method is compared with traditional ones, such as threshold or correlation, to highlight its advantages and drawbacks. Finally the advantages of this method are demonstrated for calculating multiple times of flight when the echo is formed by several overlapping echoes. PMID:24284774
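    The "traditional" correlation method against which the proposed procedure is compared can be sketched in a few lines: the time of flight is taken from the peak of the cross-correlation between the received signal and the emitted burst. The sampling rate, burst shape, and delay below are assumed values; the paper's model-based method for overlapping echoes is not reproduced here.

```python
import numpy as np

def tof_by_correlation(received, emitted, fs):
    """Estimate time of flight from the peak of the cross-correlation between
    the received signal and the emitted pulse (single-echo case)."""
    corr = np.correlate(received, emitted, mode="full")
    lag = np.argmax(corr) - (len(emitted) - 1)
    return lag / fs

# Hypothetical 40 kHz burst sampled at 1 MHz, echo delayed by 2.5 ms.
fs, f0 = 1_000_000, 40_000
t = np.arange(0, 200e-6, 1 / fs)
pulse = np.sin(2 * np.pi * f0 * t) * np.hanning(t.size)   # 8-cycle windowed burst
delay_samples = int(2.5e-3 * fs)
rx = np.zeros(5000)
rx[delay_samples:delay_samples + pulse.size] += 0.3 * pulse
rx += np.random.default_rng(0).normal(0, 0.02, rx.size)   # measurement noise

print("estimated TOF:", tof_by_correlation(rx, pulse, fs) * 1e3, "ms")
```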

  19. A weighted genetic risk score using all known susceptibility variants to estimate rheumatoid arthritis risk

    PubMed Central

    Yarwood, Annie; Han, Buhm; Raychaudhuri, Soumya; Bowes, John; Lunt, Mark; Pappas, Dimitrios A; Kremer, Joel; Greenberg, Jeffrey D; Plenge, Robert; Worthington, Jane; Barton, Anne; Eyre, Steve

    2015-01-01

    Background There is currently great interest in the incorporation of genetic susceptibility loci into screening models to identify individuals at high risk of disease. Here, we present the first risk prediction model including all 46 known genetic loci associated with rheumatoid arthritis (RA). Methods A weighted genetic risk score (wGRS) was created using 45 RA non-human leucocyte antigen (HLA) susceptibility loci, imputed amino acids at HLA-DRB1 (positions 11, 71 and 74), HLA-DPB1 (position 9) and HLA-B (position 9), and gender. The wGRS was tested in 11 366 RA cases and 15 489 healthy controls. The risk of developing RA was estimated using logistic regression by dividing the wGRS into quintiles. The ability of the wGRS to discriminate between cases and controls was assessed by receiver operator characteristic analysis and discrimination improvement tests. Results Individuals in the highest risk group showed significantly increased odds of developing anti-cyclic citrullinated peptide-positive RA compared to the lowest risk group (OR 27.13, 95% CI 23.70 to 31.05). The wGRS was validated in an independent cohort that showed similar results (area under the curve 0.78, OR 18.00, 95% CI 13.67 to 23.71). Comparison of the full wGRS with a wGRS in which HLA amino acids were replaced by an HLA tag single-nucleotide polymorphism showed a significant loss of sensitivity and specificity. Conclusions Our study suggests that in RA, even when using all known genetic susceptibility variants, prediction performance remains modest; while this is insufficiently accurate for general population screening, it may prove of more use in targeted studies. Our study has also highlighted the importance of including HLA variation in risk prediction models. PMID:24092415
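    The construction of a weighted genetic risk score reduces, in essence, to a weighted sum of risk-allele counts using log odds ratios as weights, with individuals then binned into quintiles. The sketch below illustrates that arithmetic with simulated genotypes and odds ratios; the HLA amino-acid terms and gender covariate of the published model are omitted.

```python
import numpy as np

def weighted_grs(genotypes, log_odds_ratios):
    """Weighted genetic risk score: risk-allele counts (0/1/2) weighted by the
    log odds ratio of each susceptibility variant."""
    return genotypes @ log_odds_ratios

# Hypothetical inputs: 45 non-HLA loci with simulated odds ratios and
# per-individual risk-allele counts.
rng = np.random.default_rng(0)
odds_ratios = rng.uniform(1.05, 1.3, size=45)
weights = np.log(odds_ratios)
genotypes = rng.integers(0, 3, size=(1000, 45))

scores = weighted_grs(genotypes, weights)
quintile = np.digitize(scores, np.quantile(scores, [0.2, 0.4, 0.6, 0.8]))
print("individuals per risk quintile:", np.bincount(quintile))
```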

  20. Voxel-based registration of simulated and real patient CBCT data for accurate dental implant pose estimation

    NASA Astrophysics Data System (ADS)

    Moreira, António H. J.; Queirós, Sandro; Morais, Pedro; Rodrigues, Nuno F.; Correia, André Ricardo; Fernandes, Valter; Pinho, A. C. M.; Fonseca, Jaime C.; Vilaça, João. L.

    2015-03-01

    The success of dental implant-supported prosthesis is directly linked to the accuracy obtained during the implant's pose estimation (position and orientation). Although traditional impression techniques and recent digital acquisition methods are acceptably accurate, a simultaneously fast, accurate and operator-independent methodology is still lacking. Hereto, an image-based framework is proposed to estimate the patient-specific implant's pose using cone-beam computed tomography (CBCT) and prior knowledge of the implanted model. The pose estimation is accomplished in a three-step approach: (1) a region-of-interest is extracted from the CBCT data using 2 operator-defined points at the implant's main axis; (2) a simulated CBCT volume of the known implanted model is generated through Feldkamp-Davis-Kress reconstruction and coarsely aligned to the defined axis; and (3) a voxel-based rigid registration is performed to optimally align both patient and simulated CBCT data, extracting the implant's pose from the optimal transformation. Three experiments were performed to evaluate the framework: (1) an in silico study using 48 implants distributed through 12 tridimensional synthetic mandibular models; (2) an in vitro study using an artificial mandible with 2 dental implants acquired with an i-CAT system; and (3) two clinical case studies. The results showed positional errors of 67+/-34 μm and 108 μm, and angular misfits of 0.15+/-0.08° and 1.4°, for experiments 1 and 2, respectively. Moreover, in experiment 3, visual assessment of the clinical data showed a coherent alignment of the reference implant. Overall, a novel image-based framework for implant pose estimation from CBCT data was proposed, showing accurate results in agreement with dental prosthesis modelling requirements.

  1. Exposure corrected risk estimates for childhood product related injuries.

    PubMed

    Senturia, Y D; Binns, H J; Christoffel, K K; Tanz, R R

    1993-08-01

    This study assesses the effect of exposure correction on injury risk estimates for children, using Chicago-area survey data on age-specific exposure of children to seven products: amusement park rides, sleds, bunkbeds, skateboards, fireworks, toboggans, and air guns and rifles. National Electronic Injury Surveillance System estimates for 1987 were used as numerators with two denominators: (i) uncorrected age-specific U.S. Census estimates for 1987 and (ii) these estimates corrected for exposure. Except for bunkbeds, skateboards and sleds, corrected injury risk decreased as age increased. Uncorrected population injury rates underestimated the risk posed to product-using children, especially those who are youngest and those who use skateboards.
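    The exposure correction itself is simple arithmetic: the same injury numerator is divided either by the census population or by the population scaled by the surveyed fraction of product users. The sketch below shows the calculation with made-up numbers, not the NEISS or survey data used in the study.

```python
import pandas as pd

# Hypothetical inputs: national injury estimates (numerator), census population,
# and the fraction of each age group exposed to the product (from survey data).
df = pd.DataFrame({
    "age_group":    ["0-4", "5-9", "10-14"],
    "injuries":     [1200, 3400, 5200],          # NEISS-style estimates
    "population":   [18.0e6, 18.5e6, 17.0e6],
    "exposed_frac": [0.05, 0.30, 0.55],          # product users in age group
})

# Rates per 100,000: uncorrected uses the whole population as the denominator,
# exposure-corrected uses only the product-using population.
df["rate_uncorrected"] = df.injuries / df.population * 1e5
df["rate_exposure_corrected"] = df.injuries / (df.population * df.exposed_frac) * 1e5
print(df[["age_group", "rate_uncorrected", "rate_exposure_corrected"]])
```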

  2. An Energy-Efficient Strategy for Accurate Distance Estimation in Wireless Sensor Networks

    PubMed Central

    Tarrío, Paula; Bernardos, Ana M.; Casar, José R.

    2012-01-01

    In line with recent research efforts made to conceive energy saving protocols and algorithms and power sensitive network architectures, in this paper we propose a transmission strategy to minimize the energy consumption in a sensor network when using a localization technique based on the measurement of the strength (RSS) or the time of arrival (TOA) of the received signal. In particular, we find the transmission power and the packet transmission rate that jointly minimize the total consumed energy, while ensuring at the same time a desired accuracy in the RSS or TOA measurements. We also propose some corrections to these theoretical results to take into account the effects of shadowing and packet loss in the propagation channel. The proposed strategy is shown to be effective in realistic scenarios providing energy savings with respect to other transmission strategies, and also guaranteeing a given accuracy in the distance estimations, which will serve to guarantee a desired accuracy in the localization result. PMID:23202218
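    For the RSS case, distance is typically recovered by inverting a log-distance path-loss model, and averaging more packets reduces the variance of the RSS estimate at an energy cost, which is the trade-off the proposed strategy optimizes. The sketch below is a generic illustration of that inversion with assumed propagation parameters, not the paper's optimization itself.

```python
import numpy as np

def rss_to_distance(rss_dbm, tx_power_dbm, path_loss_exp=2.5, d0=1.0, pl_d0=40.0):
    """Invert the log-distance path-loss model to estimate range from RSS.

    rss_dbm       : averaged received signal strength (dBm)
    tx_power_dbm  : transmission power (dBm)
    path_loss_exp : environment-dependent exponent (assumed value)
    pl_d0         : path loss at reference distance d0 (assumed value)
    """
    path_loss = tx_power_dbm - rss_dbm
    return d0 * 10 ** ((path_loss - pl_d0) / (10 * path_loss_exp))

# Averaging k packets reduces the variance of the RSS estimate by 1/k; more
# packets mean better distance accuracy but higher energy consumption.
rng = np.random.default_rng(0)
true_rss = 0.0 - 40.0 - 10 * 2.5 * np.log10(8.0)   # 8 m range, 0 dBm transmitter
for k in (1, 10, 100):
    measured = true_rss + rng.normal(0, 4.0, size=k).mean()   # 4 dB shadowing
    print(k, "packets ->", round(rss_to_distance(measured, 0.0), 2), "m")
```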

  3. Accurate automatic estimation of total intracranial volume: a nuisance variable with less nuisance.

    PubMed

    Malone, Ian B; Leung, Kelvin K; Clegg, Shona; Barnes, Josephine; Whitwell, Jennifer L; Ashburner, John; Fox, Nick C; Ridgway, Gerard R

    2015-01-01

    Total intracranial volume (TIV/ICV) is an important covariate for volumetric analyses of the brain and brain regions, especially in the study of neurodegenerative diseases, where it can provide a proxy of maximum pre-morbid brain volume. The gold-standard method is manual delineation of brain scans, but this requires careful work by trained operators. We evaluated Statistical Parametric Mapping 12 (SPM12) automated segmentation for TIV measurement in place of manual segmentation and also compared it with SPM8 and FreeSurfer 5.3.0. For T1-weighted MRI acquired from 288 participants in a multi-centre clinical trial in Alzheimer's disease we find a high correlation between SPM12 TIV and manual TIV (R(2)=0.940, 95% Confidence Interval (0.924, 0.953)), with a small mean difference (SPM12 40.4±35.4ml lower than manual, amounting to 2.8% of the overall mean TIV in the study). The correlation with manual measurements (the key aspect when using TIV as a covariate) for SPM12 was significantly higher (p<0.001) than for either SPM8 (R(2)=0.577 CI (0.500, 0.644)) or FreeSurfer (R(2)=0.801 CI (0.744, 0.843)). These results suggest that SPM12 TIV estimates are an acceptable substitute for labour-intensive manual estimates even in the challenging context of multiple centres and the presence of neurodegenerative pathology. We also briefly discuss some aspects of the statistical modelling approaches to adjust for TIV. PMID:25255942

  4. [Research on maize multispectral image accurate segmentation and chlorophyll index estimation].

    PubMed

    Wu, Qian; Sun, Hong; Li, Min-zan; Song, Yuan-yuan; Zhang, Yan-e

    2015-01-01

    In order to rapidly acquire maize growing information in the field, a non-destructive method of maize chlorophyll content index measurement was developed based on multi-spectral imaging and image processing technology. The experiment was conducted at Yangling in Shaanxi province of China, and the crop was Zheng-dan 958 planted in an experiment field of about 1 000 m X 600 m. Firstly, a 2-CCD multi-spectral image monitoring system was used to acquire the canopy images. The system was based on a dichroic prism, allowing precise separation of the visible (Blue (B), Green (G), Red (R): 400-700 nm) and near-infrared (NIR, 760-1 000 nm) bands. The multispectral images were output as RGB and NIR images via the system, which was fixed vertically to the ground with a vertical distance of 2 m and an angular field of 50°. The SPAD index of each sample was measured synchronously to represent the chlorophyll content index. Secondly, after image smoothing using an adaptive smooth filtering algorithm, the NIR maize image was selected to segment the maize leaves from the background, because a large difference was observed in the gray histogram between plant and soil background. The NIR image segmentation algorithm followed preliminary and accurate segmentation steps: (1) The results of the Otsu image segmentation method and the variable threshold algorithm were compared, and the latter proved better for corn plant and weed segmentation. As a result, the variable threshold algorithm based on local statistics was selected for the preliminary image segmentation. Expansion and corrosion (dilation and erosion) were used to optimize the segmented image. (2) The region labeling algorithm was used to segment corn plants from soil and weed background with an accuracy of 95.59%. Then the multi-spectral image of the maize canopy was accurately segmented in the R, G and B bands separately. Thirdly, image parameters were abstracted based on the segmented visible and NIR images. The average gray

  5. How to Estimate Epidemic Risk from Incomplete Contact Diaries Data?

    PubMed Central

    Mastrandrea, Rossana; Barrat, Alain

    2016-01-01

    Social interactions shape the patterns of spreading processes in a population. Techniques such as diaries or proximity sensors allow the collection of data about encounters and the construction of networks of contacts between individuals. The contact networks obtained from these different techniques are however quantitatively different. Here, we first show how these discrepancies affect the prediction of the epidemic risk when these data are fed to numerical models of epidemic spread: low participation rate, under-reporting of contacts and overestimation of contact durations in contact diaries with respect to sensor data determine important differences in the outcomes of the corresponding simulations, with, for instance, an enhanced sensitivity to initial conditions. Most importantly, we investigate if and how information gathered from contact diaries can be used in such simulations in order to yield an accurate description of the epidemic risk, assuming that data from sensors represent the ground truth. The contact networks built from contact sensors and diaries present indeed several structural similarities: this suggests the possibility to construct, using only the contact diary network information, a surrogate contact network such that simulations using this surrogate network give the same estimation of the epidemic risk as simulations using the contact sensor network. We present and compare several methods to build such surrogate data, and show that it is indeed possible to obtain a good agreement between the outcomes of simulations using surrogate and sensor data, as long as the contact diary information is complemented by publicly available data describing the heterogeneity of the durations of human contacts. PMID:27341027

  6. Moving towards a new paradigm for global flood risk estimation

    NASA Astrophysics Data System (ADS)

    Troy, Tara J.; Devineni, Naresh; Lima, Carlos; Lall, Upmanu

    2013-04-01

    model is implemented at a finer resolution (<=1km) in order to more accurately model streamflow under flood conditions and estimate inundation. This approach allows for efficient computational simulation of the hydrology when not under potential for flooding with high-resolution flood wave modeling when there is flooding potential. We demonstrate the results of this flood risk estimation system for the Ohio River basin in the United States, a large river basin that is historically prone to flooding, with the intention of using it to do global flood risk assessment.

  7. Thermal Conductivities in Solids from First Principles: Accurate Computations and Rapid Estimates

    NASA Astrophysics Data System (ADS)

    Carbogno, Christian; Scheffler, Matthias

    In spite of significant research efforts, a first-principles determination of the thermal conductivity κ at high temperatures has remained elusive. Boltzmann transport techniques that account for anharmonicity perturbatively become inaccurate under such conditions. Ab initio molecular dynamics (MD) techniques using the Green-Kubo (GK) formalism capture the full anharmonicity, but can become prohibitively costly to converge in time and size. We developed a formalism that accelerates such GK simulations by several orders of magnitude and thus enables their application within the limited time and length scales accessible in ab initio MD. For this purpose, we determine the effective harmonic potential occurring during the MD, and the associated temperature-dependent phonon properties and lifetimes. Interpolation in reciprocal and frequency space then allows extrapolation to the macroscopic scale. For both force-field and ab initio MD, we validate this approach by computing κ for Si and ZrO2, two materials known for their particularly harmonic and anharmonic character, respectively. Finally, we demonstrate how these techniques facilitate reasonable estimates of κ from existing MD calculations at virtually no additional computational cost.

  8. Accurate Estimation of Protein Folding and Unfolding Times: Beyond Markov State Models.

    PubMed

    Suárez, Ernesto; Adelman, Joshua L; Zuckerman, Daniel M

    2016-08-01

    Because standard molecular dynamics (MD) simulations are unable to access time scales of interest in complex biomolecular systems, it is common to "stitch together" information from multiple shorter trajectories using approximate Markov state model (MSM) analysis. However, MSMs may require significant tuning and can yield biased results. Here, by analyzing some of the longest protein MD data sets available (>100 μs per protein), we show that estimators constructed based on exact non-Markovian (NM) principles can yield significantly improved mean first-passage times (MFPTs) for protein folding and unfolding. In some cases, MSM bias of more than an order of magnitude can be corrected when identical trajectory data are reanalyzed by non-Markovian approaches. The NM analysis includes "history" information, higher order time correlations compared to MSMs, that is available in every MD trajectory. The NM strategy is insensitive to fine details of the states used and works well when a fine time-discretization (i.e., small "lag time") is used. PMID:27340835

  9. Accurate estimation of normal incidence absorption coefficients with confidence intervals using a scanning laser Doppler vibrometer

    NASA Astrophysics Data System (ADS)

    Vuye, Cedric; Vanlanduit, Steve; Guillaume, Patrick

    2009-06-01

    When using optical measurements of the sound fields inside a glass tube, near the material under test, to estimate the reflection and absorption coefficients, not only these acoustical parameters but also confidence intervals can be determined. The sound fields are visualized using a scanning laser Doppler vibrometer (SLDV). In this paper the influence of different test signals on the quality of the results, obtained with this technique, is examined. The amount of data gathered during one measurement scan makes a thorough statistical analysis possible leading to the knowledge of confidence intervals. The use of a multi-sine, constructed on the resonance frequencies of the test tube, shows to be a very good alternative for the traditional periodic chirp. This signal offers the ability to obtain data for multiple frequencies in one measurement, without the danger of a low signal-to-noise ratio. The variability analysis in this paper clearly shows the advantages of the proposed multi-sine compared to the periodic chirp. The measurement procedure and the statistical analysis are validated by measuring the reflection ratio at a closed end and comparing the results with the theoretical value. Results of the testing of two building materials (an acoustic ceiling tile and linoleum) are presented and compared to supplier data.

  10. Political risk in fair market value estimates

    SciTech Connect

    Gruy, H.J.; Hartsock, J.H.

    1996-09-01

    Political risk arises from unstable governments, commercial establishments and infrastructure as well as labor unrest. All these factors vary from country to country and from time to time. Banks and insurance companies quantify these risks, but they are reluctant to divulge their opinions for fear of alienating possible customers that have been assigned a high risk. An investment in a fixed property such as an oil and gas lease, concession or other mineral interest is subject to political risk. No one will deny that money to be received several years in the future has a greater value today in a country with a stable government, stable tax regime, a sound economy and reliable labor force than in a Third World country where a revolution is brewing. Even in stable countries, the risk of tax law changes, exorbitant environmental production regulations and cleanup costs may vary. How do these factors affect fair market value and how are these calculations made? An important consideration discussed in this paper is the treatment of capital investments.

  11. Wind effect on PV module temperature: Analysis of different techniques for an accurate estimation.

    NASA Astrophysics Data System (ADS)

    Schwingshackl, Clemens; Petitta, Marcello; Ernst Wagner, Jochen; Belluardo, Giorgio; Moser, David; Castelli, Mariapina; Zebisch, Marc; Tetzlaff, Anke

    2013-04-01

    temperature estimation using meteorological parameters. References:
    [1] Skoplaki, E. et al., 2008: A simple correlation for the operating temperature of photovoltaic modules of arbitrary mounting, Solar Energy Materials & Solar Cells 92, 1393-1402
    [2] Skoplaki, E. et al., 2008: Operating temperature of photovoltaic modules: A survey of pertinent correlations, Renewable Energy 34, 23-29
    [3] Koehl, M. et al., 2011: Modeling of the nominal operating cell temperature based on outdoor weathering, Solar Energy Materials & Solar Cells 95, 1638-1646
    [4] Mattei, M. et al., 2005: Calculation of the polycrystalline PV module temperature using a simple method of energy balance, Renewable Energy 31, 553-567
    [5] Kurtz, S. et al.: Evaluation of high-temperature exposure of rack-mounted photovoltaic modules

  12. Parametric Estimation in a Recurrent Competing Risks Model

    PubMed Central

    Peña, Edsel A.

    2014-01-01

    A resource-efficient approach to making inferences about the distributional properties of the failure times in a competing risks setting is presented. Efficiency is gained by observing recurrences of the competing risks over a random monitoring period. The resulting model is called the recurrent competing risks model (RCRM) and is coupled with two repair strategies whenever the system fails. Maximum likelihood estimators of the parameters of the marginal distribution functions associated with each of the competing risks and also of the system lifetime distribution function are presented. Estimators are derived under perfect and partial repair strategies. Consistency and asymptotic properties of the estimators are obtained. The estimation methods are applied to a data set of failures for cars under warranty. Simulation studies are used to ascertain the small sample properties and the efficiency gains of the resulting estimators. PMID:25346751

  13. A new set of atomic radii for accurate estimation of solvation free energy by Poisson-Boltzmann solvent model.

    PubMed

    Yamagishi, Junya; Okimoto, Noriaki; Morimoto, Gentaro; Taiji, Makoto

    2014-11-01

    The Poisson-Boltzmann implicit solvent (PB) is widely used to estimate the solvation free energies of biomolecules in molecular simulations. An optimized set of atomic radii (PB radii) is an important parameter for PB calculations, which determines the distribution of dielectric constants around the solute. We here present new PB radii for the AMBER protein force field to accurately reproduce the solvation free energies obtained from explicit solvent simulations. The presented PB radii were optimized using results from explicit solvent simulations of the large systems. In addition, we discriminated PB radii for N- and C-terminal residues from those for nonterminal residues. The performances using our PB radii showed high accuracy for the estimation of solvation free energies at the level of the molecular fragment. The obtained PB radii are effective for the detailed analysis of the solvation effects of biomolecules.

  14. Resources for global risk assessment: the International Toxicity Estimates for Risk (ITER) and Risk Information Exchange (RiskIE) databases.

    PubMed

    Wullenweber, Andrea; Kroner, Oliver; Kohrman, Melissa; Maier, Andrew; Dourson, Michael; Rak, Andrew; Wexler, Philip; Tomljanovic, Chuck

    2008-11-15

    The rate of chemical synthesis and use has outpaced the development of risk values and the resolution of risk assessment methodology questions. In addition, available risk values derived by different organizations may vary due to scientific judgments, mission of the organization, or use of more recently published data. Further, each organization derives values for a unique chemical list so it can be challenging to locate data on a given chemical. Two Internet resources are available to address these issues. First, the International Toxicity Estimates for Risk (ITER) database (www.tera.org/iter) provides chronic human health risk assessment data from a variety of organizations worldwide in a side-by-side format, explains differences in risk values derived by different organizations, and links directly to each organization's website for more detailed information. It is also the only database that includes risk information from independent parties whose risk values have undergone independent peer review. Second, the Risk Information Exchange (RiskIE) is a database of in progress chemical risk assessment work, and includes non-chemical information related to human health risk assessment, such as training modules, white papers and risk documents. RiskIE is available at http://www.allianceforrisk.org/RiskIE.htm, and will join ITER on National Library of Medicine's TOXNET (http://toxnet.nlm.nih.gov/). Together, ITER and RiskIE provide risk assessors essential tools for easily identifying and comparing available risk data, for sharing in progress assessments, and for enhancing interaction among risk assessment groups to decrease duplication of effort and to harmonize risk assessment procedures across organizations.

  15. Resources for global risk assessment: The International Toxicity Estimates for Risk (ITER) and Risk Information Exchange (RiskIE) databases

    SciTech Connect

    Wullenweber, Andrea; Kroner, Oliver; Kohrman, Melissa; Maier, Andrew; Dourson, Michael; Rak, Andrew; Wexler, Philip; Tomljanovic, Chuck

    2008-11-15

    The rate of chemical synthesis and use has outpaced the development of risk values and the resolution of risk assessment methodology questions. In addition, available risk values derived by different organizations may vary due to scientific judgments, mission of the organization, or use of more recently published data. Further, each organization derives values for a unique chemical list so it can be challenging to locate data on a given chemical. Two Internet resources are available to address these issues. First, the International Toxicity Estimates for Risk (ITER) database (www.tera.org/iter) provides chronic human health risk assessment data from a variety of organizations worldwide in a side-by-side format, explains differences in risk values derived by different organizations, and links directly to each organization's website for more detailed information. It is also the only database that includes risk information from independent parties whose risk values have undergone independent peer review. Second, the Risk Information Exchange (RiskIE) is a database of in progress chemical risk assessment work, and includes non-chemical information related to human health risk assessment, such as training modules, white papers and risk documents. RiskIE is available at http://www.allianceforrisk.org/RiskIE.htm, and will join ITER on National Library of Medicine's TOXNET (http://toxnet.nlm.nih.gov/). Together, ITER and RiskIE provide risk assessors essential tools for easily identifying and comparing available risk data, for sharing in progress assessments, and for enhancing interaction among risk assessment groups to decrease duplication of effort and to harmonize risk assessment procedures across organizations.

  16. Estimating Fire Risks at Industrial Nuclear Facilities

    SciTech Connect

    Coutts, D.A.

    1999-07-12

    The Savannah River Site (SRS) has a wide variety of nuclear production facilities that include chemical processing facilities, machine shops, production reactors, and laboratories. Current safety documentation must be maintained for the nuclear facilities at SRS. Fire Risk Analyses (FRAs) are used to support the safety documentation basis. These FRAs present the frequency that specified radiological and chemical consequences will be exceeded. The consequence values are based on mechanistic models assuming specific fire protection features fail to function as designed.

  17. Estimating Risk: Stereotype Amplification and the Perceived Risk of Criminal Victimization

    PubMed Central

    QUILLIAN, LINCOLN; PAGER, DEVAH

    2010-01-01

    This paper considers the process by which individuals estimate the risk of adverse events, with particular attention to the social context in which risk estimates are formed. We compare subjective probability estimates of crime victimization to actual victimization experiences among respondents from the 1994 to 2002 waves of the Survey of Economic Expectations (Dominitz and Manski 2002). Using zip code identifiers, we then match these survey data to local area characteristics from the census. The results show that: (1) the risk of criminal victimization is significantly overestimated relative to actual rates of victimization or other negative events; (2) neighborhood racial composition is strongly associated with perceived risk of victimization, whereas actual victimization risk is driven by nonracial neighborhood characteristics; and (3) white respondents appear more strongly affected by racial composition than nonwhites in forming their estimates of risk. We argue these results support a model of stereotype amplification in the formation of risk estimates. Implications for persistent racial inequality are considered. PMID:20686631

  18. Incentives Increase Participation in Mass Dog Rabies Vaccination Clinics and Methods of Coverage Estimation Are Assessed to Be Accurate

    PubMed Central

    Steinmetz, Melissa; Czupryna, Anna; Bigambo, Machunde; Mzimbiri, Imam; Powell, George; Gwakisa, Paul

    2015-01-01

    In this study we show that incentives (dog collars and owner wristbands) are effective at increasing owner participation in mass dog rabies vaccination clinics and we conclude that household questionnaire surveys and the mark-re-sight (transect survey) method for estimating post-vaccination coverage are accurate when all dogs, including puppies, are included. Incentives were distributed during central-point rabies vaccination clinics in northern Tanzania to quantify their effect on owner participation. In villages where incentives were handed out participation increased, with an average of 34 more dogs being vaccinated. Through economies of scale, this represents a reduction in the cost-per-dog of $0.47. This represents the price-threshold under which the cost of the incentive used must fall to be economically viable. Additionally, vaccination coverage levels were determined in ten villages through the gold-standard village-wide census technique, as well as through two cheaper and quicker methods (randomized household questionnaire and the transect survey). Cost data were also collected. Both non-gold standard methods were found to be accurate when puppies were included in the calculations, although the transect survey and the household questionnaire survey over- and under-estimated the coverage respectively. Given that additional demographic data can be collected through the household questionnaire survey, and that its estimate of coverage is more conservative, we recommend this method. Despite the use of incentives the average vaccination coverage was below the 70% threshold for eliminating rabies. We discuss the reasons and suggest solutions to improve coverage. Given recent international targets to eliminate rabies, this study provides valuable and timely data to help improve mass dog vaccination programs in Africa and elsewhere. PMID:26633821

  19. Incentives Increase Participation in Mass Dog Rabies Vaccination Clinics and Methods of Coverage Estimation Are Assessed to Be Accurate.

    PubMed

    Minyoo, Abel B; Steinmetz, Melissa; Czupryna, Anna; Bigambo, Machunde; Mzimbiri, Imam; Powell, George; Gwakisa, Paul; Lankester, Felix

    2015-12-01

    In this study we show that incentives (dog collars and owner wristbands) are effective at increasing owner participation in mass dog rabies vaccination clinics and we conclude that household questionnaire surveys and the mark-re-sight (transect survey) method for estimating post-vaccination coverage are accurate when all dogs, including puppies, are included. Incentives were distributed during central-point rabies vaccination clinics in northern Tanzania to quantify their effect on owner participation. In villages where incentives were handed out participation increased, with an average of 34 more dogs being vaccinated. Through economies of scale, this represents a reduction in the cost-per-dog of $0.47. This represents the price-threshold under which the cost of the incentive used must fall to be economically viable. Additionally, vaccination coverage levels were determined in ten villages through the gold-standard village-wide census technique, as well as through two cheaper and quicker methods (randomized household questionnaire and the transect survey). Cost data were also collected. Both non-gold standard methods were found to be accurate when puppies were included in the calculations, although the transect survey and the household questionnaire survey over- and under-estimated the coverage respectively. Given that additional demographic data can be collected through the household questionnaire survey, and that its estimate of coverage is more conservative, we recommend this method. Despite the use of incentives the average vaccination coverage was below the 70% threshold for eliminating rabies. We discuss the reasons and suggest solutions to improve coverage. Given recent international targets to eliminate rabies, this study provides valuable and timely data to help improve mass dog vaccination programs in Africa and elsewhere.

  20. Incentives Increase Participation in Mass Dog Rabies Vaccination Clinics and Methods of Coverage Estimation Are Assessed to Be Accurate.

    PubMed

    Minyoo, Abel B; Steinmetz, Melissa; Czupryna, Anna; Bigambo, Machunde; Mzimbiri, Imam; Powell, George; Gwakisa, Paul; Lankester, Felix

    2015-12-01

    In this study we show that incentives (dog collars and owner wristbands) are effective at increasing owner participation in mass dog rabies vaccination clinics and we conclude that household questionnaire surveys and the mark-re-sight (transect survey) method for estimating post-vaccination coverage are accurate when all dogs, including puppies, are included. Incentives were distributed during central-point rabies vaccination clinics in northern Tanzania to quantify their effect on owner participation. In villages where incentives were handed out participation increased, with an average of 34 more dogs being vaccinated. Through economies of scale, this represents a reduction in the cost-per-dog of $0.47. This represents the price-threshold under which the cost of the incentive used must fall to be economically viable. Additionally, vaccination coverage levels were determined in ten villages through the gold-standard village-wide census technique, as well as through two cheaper and quicker methods (randomized household questionnaire and the transect survey). Cost data were also collected. Both non-gold standard methods were found to be accurate when puppies were included in the calculations, although the transect survey and the household questionnaire survey over- and under-estimated the coverage respectively. Given that additional demographic data can be collected through the household questionnaire survey, and that its estimate of coverage is more conservative, we recommend this method. Despite the use of incentives the average vaccination coverage was below the 70% threshold for eliminating rabies. We discuss the reasons and suggest solutions to improve coverage. Given recent international targets to eliminate rabies, this study provides valuable and timely data to help improve mass dog vaccination programs in Africa and elsewhere. PMID:26633821
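    The mark-re-sight (transect) coverage estimate mentioned above is, at its simplest, the proportion of sighted dogs carrying the vaccination mark. A hedged sketch with hypothetical counts and a Wald confidence interval is given below; the study's survey design and cost analysis are not reproduced.

```python
import numpy as np

def transect_coverage(marked_seen, total_seen, z=1.96):
    """Vaccination coverage from a mark-re-sight transect survey: the fraction
    of dogs sighted that carry the mark (e.g. a collar), with a Wald 95% CI."""
    p = marked_seen / total_seen
    se = np.sqrt(p * (1 - p) / total_seen)
    return p, max(0.0, p - z * se), min(1.0, p + z * se)

# Hypothetical transect counts for one village, puppies included.
p, lo, hi = transect_coverage(marked_seen=143, total_seen=230)
print(f"estimated coverage {p:.1%} (95% CI {lo:.1%} - {hi:.1%})")
```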

  1. Risk estimates for neonatal myotonic dystrophy.

    PubMed Central

    Glanz, A; Fraser, F C

    1984-01-01

    Children who inherit the autosomal dominant gene for myotonic dystrophy from their mother rather than their father may develop the severe neonatal type rather than the late onset type. The families of 22 neonatal type probands and 59 late onset type probands were studied to determine the risk of occurrence and recurrence of the neonatal type. The frequency of the neonatal type in sibs of neonatal type probands was 29%, or 37% if neonatal deaths are counted as affected. This is significantly higher than the 6% frequency of the neonatal type found in the offspring of affected women not ascertained through a child with the neonatal type. These data suggest that certain women carrying the gene for myotonic dystrophy are predisposed to have children affected with the neonatal type rather than the late onset type. The female near relatives of these women do not seem to share this predisposition. The data should be useful for genetic counseling. PMID:6748014

  2. Sensitivity of health risk estimates to air quality adjustment procedure

    SciTech Connect

    Whitfield, R.G.

    1997-06-30

    This letter is a summary of risk results associated with exposure estimates using two-parameter Weibull and quadratic air quality adjustment procedures (AQAPs). New exposure estimates were developed for children and child-occurrences, six urban areas, and five alternative air quality scenarios. In all cases, the Weibull and quadratic results are compared to previous results, which are based on a proportional AQAP.

  3. Towards more accurate life cycle risk management through integration of DDP and PRA

    NASA Technical Reports Server (NTRS)

    Cornford, Steven L.; Paulos, Todd; Meshkat, Leila; Feather, Martin

    2003-01-01

    The focus of this paper is on the integration of PRA and DDP. The intent is twofold: to extend risk-based decision making through more of the lifecycle, and to lead to improved risk modeling (hence better informed decision making) wherever it is applied, most especially in the early phases as designs begin to mature.

  4. Reservoir evaluation of thin-bedded turbidites and hydrocarbon pore thickness estimation for an accurate quantification of resource

    NASA Astrophysics Data System (ADS)

    Omoniyi, Bayonle; Stow, Dorrik

    2016-04-01

    One of the major challenges in the assessment of and production from turbidite reservoirs is to take full account of thin and medium-bedded turbidites (<10 cm and <30 cm, respectively). Although such thinner, low-pay sands may comprise a significant proportion of the reservoir succession, they can go unnoticed by conventional analysis and so negatively impact reserve estimation, particularly in fields producing from prolific thick-bedded turbidite reservoirs. Field development plans often take little note of such thin beds, which are therefore bypassed by mainstream production. In fact, the trapped and bypassed fluids can be vital where maximising field value and optimising production are key business drivers. We have studied in detail a succession of thin-bedded turbidites associated with thicker-bedded reservoir facies in the North Brae Field, UKCS, using a combination of conventional logs and cores to assess the significance of thin-bedded turbidites in computing hydrocarbon pore thickness (HPT). This quantity, being an indirect measure of thickness, is critical for an accurate estimation of original-oil-in-place (OOIP). By using a combination of conventional and unconventional logging analysis techniques, we obtain three different results for the reservoir intervals studied. These results include estimated net sand thickness, average sand thickness, and their distribution trend within a 3D structural grid. The net sand thickness varies from 205 to 380 ft, and HPT ranges from 21.53 to 39.90 ft. We observe that an integrated approach (neutron-density cross plots conditioned to cores) to HPT quantification reduces the associated uncertainties significantly, resulting in estimation of 96% of actual HPT. Further work will focus on assessing the 3D dynamic connectivity of the low-pay sands with the surrounding thick-bedded turbidite facies.
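
    As a rough illustration of how HPT feeds into volumetrics, the sketch below sums thickness x porosity x hydrocarbon saturation over net-sand intervals, the standard definition of hydrocarbon pore thickness; the interval values are invented and are not the North Brae data.

```python
# Hydrocarbon pore thickness as commonly defined:
# HPT = sum_i h_i * phi_i * (1 - Sw_i) over net-sand intervals.
# The intervals below are illustrative only.

intervals = [
    # (thickness_ft, porosity, water_saturation)
    (12.0, 0.18, 0.35),
    (4.0,  0.15, 0.40),   # a thin bed that conventional cutoffs might drop
    (25.0, 0.21, 0.30),
]

net_sand_ft = sum(h for h, _, _ in intervals)
hpt_ft = sum(h * phi * (1.0 - sw) for h, phi, sw in intervals)
print(f"net sand = {net_sand_ft:.1f} ft, HPT = {hpt_ft:.2f} ft")
```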

  5. Performance evaluation of ocean color satellite models for deriving accurate chlorophyll estimates in the Gulf of Saint Lawrence

    NASA Astrophysics Data System (ADS)

    Montes-Hugo, M.; Bouakba, H.; Arnone, R.

    2014-06-01

    The understanding of phytoplankton dynamics in the Gulf of the Saint Lawrence (GSL) is critical for managing major fisheries off the Canadian East coast. In this study, the accuracy of two atmospheric correction techniques (NASA standard algorithm, SA, and Kuchinke's spectral optimization, KU) and three ocean color inversion models (Carder's empirical for SeaWiFS (Sea-viewing Wide Field-of-View Sensor), EC, Lee's quasi-analytical, QAA, and Garver-Siegel-Maritorena semi-empirical, GSM) for estimating the phytoplankton absorption coefficient at 443 nm (aph(443)) and the chlorophyll concentration (chl) in the GSL is examined. Each model was validated based on SeaWiFS images and shipboard measurements obtained during May of 2000 and April 2001. In general, aph(443) estimates derived from coupling KU and QAA models presented the smallest differences with respect to in situ determinations as measured by high-pressure liquid chromatography (median absolute bias per cruise up to 0.005, RMSE up to 0.013). A change in the inversion approach used for estimating aph(443) values produced up to a 43.4% increase in prediction error as inferred from the median relative bias per cruise. Likewise, the impact of applying different atmospheric correction schemes was secondary and represented an additive error of up to 24.3%. By using the SeaDAS (SeaWiFS Data Analysis System) default value for the optical cross section of phytoplankton (i.e., a*ph(443) = aph(443)/chl = 0.056 m2 mg-1), the median relative bias of our chl estimates, as derived from the most accurate spaceborne aph(443) retrievals and with respect to in situ determinations, increased up to 29%.
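
    Converting a retrieved aph(443) to chl with the SeaDAS default cross section amounts to a single division; a minimal sketch follows, where the aph(443) value is an arbitrary example rather than a retrieval from the study.

```python
# chl = aph(443) / a*ph(443), using the SeaDAS default a*ph(443) = 0.056 m2/mg
# cited in the abstract. The aph(443) value is a hypothetical retrieval.

A_STAR_PH_443 = 0.056   # m^2 per mg chl (SeaDAS default optical cross section)
aph_443 = 0.020         # m^-1, hypothetical satellite retrieval

chl = aph_443 / A_STAR_PH_443
print(f"chl = {chl:.2f} mg m^-3")
```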

  6. Model-based Small Area Estimates of Cancer Risk Factors and Screening Behaviors - Small Area Estimates

    Cancer.gov

    These model-based estimates use two surveys, the Behavioral Risk Factor Surveillance System (BRFSS) and the National Health Interview Survey (NHIS). The two surveys are combined using novel statistical methodology.

  7. Estimation of Hypertension Risk from Lifestyle Factors and Health Profile: A Case Study

    PubMed Central

    2014-01-01

    Hypertension is a highly prevalent risk factor for cardiovascular disease and it can also lead to other diseases which seriously harm human health. Screening the risks and finding a clinical model for estimating the risk of onset, maintenance, or the prognosis of hypertension are of great importance to the prevention or treatment of the disease, especially if the indicator can be derived from a simple health profile. In this study, we investigate a chronic disease questionnaire data set of 6563 rural citizens in East China and identify a clinical signature that can assess the risk of hypertension easily and accurately. The signature achieves an accuracy of about 83% on the external test dataset, with an AUC of 0.91. Our study demonstrates that a combination of simple lifestyle features can sufficiently reflect the risk of hypertension onset. This finding provides potential guidance for disease prevention and control, as well as for the development of home care and home-care technologies. PMID:25019099

  8. Estimating and Mapping the Population at Risk of Sleeping Sickness

    PubMed Central

    Franco, José R.; Paone, Massimo; Diarra, Abdoulaye; Ruiz-Postigo, José Antonio; Fèvre, Eric M.; Mattioli, Raffaele C.; Jannin, Jean G.

    2012-01-01

    Background Human African trypanosomiasis (HAT), also known as sleeping sickness, persists as a public health problem in several sub-Saharan countries. Evidence-based, spatially explicit estimates of population at risk are needed to inform planning and implementation of field interventions, monitor disease trends, raise awareness and support advocacy. Comprehensive, geo-referenced epidemiological records from HAT-affected countries were combined with human population layers to map five categories of risk, ranging from “very high” to “very low,” and to estimate the corresponding at-risk population. Results Approximately 70 million people distributed over a surface of 1.55 million km2 are estimated to be at different levels of risk of contracting HAT. Trypanosoma brucei gambiense accounts for 82.2% of the population at risk, the remaining 17.8% being at risk of infection from T. b. rhodesiense. Twenty-one million people live in areas classified as moderate to very high risk, where more than 1 HAT case per 10,000 inhabitants per annum is reported. Discussion Updated estimates of the population at risk of sleeping sickness were made, based on quantitative information on the reported cases and the geographic distribution of human population. Due to substantial methodological differences, it is not possible to make direct comparisons with previous figures for at-risk population. By contrast, it will be possible to explore trends in the future. The presented maps of different HAT risk levels will help to develop site-specific strategies for control and surveillance, and to monitor progress achieved by ongoing efforts aimed at the elimination of sleeping sickness. PMID:23145192

  9. Accurate recovery of 4D left ventricular deformations using volumetric B-splines incorporating phase based displacement estimates

    NASA Astrophysics Data System (ADS)

    Chen, Jian; Tustison, Nicholas J.; Amini, Amir A.

    2006-03-01

    In this paper, an improved framework for estimation of 3-D left-ventricular deformations from tagged MRI is presented. Contiguous short- and long-axis tagged MR images are collected and are used within a 4-D B-Spline based deformable model to determine 4-D displacements and strains. An initial 4-D B-spline model fitted to sparse tag line data is first constructed by minimizing a 4-D Chamfer distance potential-based energy function for aligning isoparametric planes of the model with tag line locations; subsequently, dense virtual tag lines based on 2-D phase-based displacement estimates and the initial model are created. A final 4-D B-spline model with increased knots is fitted to the virtual tag lines. From the final model, we can extract accurate 3-D myocardial deformation fields and corresponding strain maps which are local measures of non-rigid deformation. Lagrangian strains in simulated data are derived which show improvement over our previous work. The method is also applied to 3-D tagged MRI data collected in a canine.

  10. Can endocranial volume be estimated accurately from external skull measurements in great-tailed grackles (Quiscalus mexicanus)?

    PubMed Central

    Palmstrom, Christin R.

    2015-01-01

    There is an increasing need to validate and collect data approximating brain size on individuals in the field to understand what evolutionary factors drive brain size variation within and across species. We investigated whether we could accurately estimate endocranial volume (a proxy for brain size), as measured by computerized tomography (CT) scans, using external skull measurements and/or by filling skulls with beads and pouring them out into a graduated cylinder for male and female great-tailed grackles. We found that while females had higher correlations than males, estimations of endocranial volume from external skull measurements or beads did not tightly correlate with CT volumes. We found no accuracy in the ability of external skull measures to predict CT volumes because the prediction intervals for most data points overlapped extensively. We conclude that we are unable to detect individual differences in endocranial volume using external skull measurements. These results emphasize the importance of validating and explicitly quantifying the predictive accuracy of brain size proxies for each species and each sex. PMID:26082858

  11. [Application of spatial relative risk estimation in communicable disease risk evaluation].

    PubMed

    Zhang, Yewu; Guo, Qing; Wang, Xiaofeng; Yu, Meng; Su, Xuemei; Dong, Yan; Zhang, Chunxi

    2015-05-01

    This paper summarizes the application of an adaptive kernel density algorithm to the spatial relative risk estimation of communicable diseases, using reported data on infectious diarrhea (other than cholera, dysentery, typhoid and paratyphoid) in Ludian county and the surrounding area in Yunnan province in 2013. Statistically significant fluctuations in the estimated risk function were identified through the use of asymptotic tolerance contours, and finally these data were visualized through disease mapping. The results of spatial relative risk estimation and disease mapping showed that high risk areas were in southeastern Shaoyang next to Ludian. Therefore, the spatial relative risk estimation of disease using an adaptive kernel density algorithm together with disease mapping techniques is a powerful method for identifying high risk populations and areas. PMID:26080648
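
    The core of the method is a ratio of case density to background (population) density evaluated over space. The sketch below substitutes scipy's fixed-bandwidth gaussian_kde for the paper's adaptive kernel and uses synthetic coordinates, so it only illustrates the density-ratio idea.

```python
# Spatial (log) relative-risk surface as the ratio of a case density to a
# population density. The paper uses an adaptive kernel; scipy's
# fixed-bandwidth gaussian_kde is used here as a stand-in, on synthetic data.
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)
population = rng.normal(loc=[0.0, 0.0], scale=1.0, size=(500, 2))  # background
cases = rng.normal(loc=[0.8, 0.5], scale=0.7, size=(120, 2))       # reported cases

f_cases = gaussian_kde(cases.T)
f_pop = gaussian_kde(population.T)

# Evaluate the log relative risk on a grid; positive values flag elevated risk.
xs, ys = np.meshgrid(np.linspace(-3, 3, 50), np.linspace(-3, 3, 50))
grid = np.vstack([xs.ravel(), ys.ravel()])
log_rr = np.log(f_cases(grid)) - np.log(f_pop(grid))
print("max log relative risk on the grid:", round(float(log_rr.max()), 2))
```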

  13. On cancer risk estimation of urban air pollution.

    PubMed Central

    Törnqvist, M; Ehrenberg, L

    1994-01-01

    The usefulness of data from various sources for a cancer risk estimation of urban air pollution is discussed. Considering the irreversibility of initiations, a multiplicative model is preferred for solid tumors. As has been concluded for exposure to ionizing radiation, the multiplicative model, in comparison with the additive model, predicts a relatively larger number of cases at high ages, with enhanced underestimation of risks by short follow-up times in disease-epidemiological studies. For related reasons, the extrapolation of risk from animal tests on the basis of daily absorbed dose per kilogram body weight or per square meter surface area without considering differences in life span may lead to an underestimation, and agreements with epidemiologically determined values may be fortuitous. Considering these possibilities, the most likely lifetime risks of cancer death at the average exposure levels in Sweden were estimated for certain pollution fractions or indicator compounds in urban air. The risks amount to approximately 50 deaths per 100,000 for inhaled particulate organic material (POM), with a contribution from ingested POM about three times larger; alkenes and butadiene each account for roughly 20 deaths per 100,000 individuals. Also, benzene and formaldehyde are expected to be associated with considerable risk increments. Comparative potency methods were applied for POM and alkenes. Due to incompleteness of the list of compounds considered and the uncertainties of the above estimates, the total risk calculation from urban air has not been attempted here. PMID:7821292
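
    The additive-versus-multiplicative distinction drives the age pattern mentioned above; a toy comparison is sketched below with invented baseline rates and excess terms, chosen only to show why a relative (multiplicative) excess concentrates cases at older ages.

```python
# Toy comparison of additive vs multiplicative excess-risk projection.
# Baseline cancer death rates and the two excess terms are invented numbers,
# chosen only to show the age pattern discussed in the abstract.

baseline = {50: 50.0, 60: 150.0, 70: 400.0, 80: 700.0}   # deaths per 100,000 per year

additive_excess = 20.0        # extra deaths per 100,000 at every age
relative_excess = 0.15        # +15% of the baseline at every age

for age, rate in baseline.items():
    add = additive_excess
    mult = relative_excess * rate
    print(f"age {age}: additive +{add:.0f}, multiplicative +{mult:.0f} per 100,000")
# The multiplicative excess grows with the rising baseline, so short follow-up
# (which misses the oldest ages) underestimates lifetime risk more severely.
```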

  14. Accurately Predicting Future Reading Difficulty for Bilingual Latino Children at Risk for Language Impairment

    ERIC Educational Resources Information Center

    Petersen, Douglas B.; Gillam, Ronald B.

    2013-01-01

    Sixty-three bilingual Latino children who were at risk for language impairment were administered reading-related measures in English and Spanish (letter identification, phonological awareness, rapid automatized naming, and sentence repetition) and descriptive measures including English language proficiency (ELP), language ability (LA),…

  15. TIMP2•IGFBP7 biomarker panel accurately predicts acute kidney injury in high-risk surgical patients

    PubMed Central

    Gunnerson, Kyle J.; Shaw, Andrew D.; Chawla, Lakhmir S.; Bihorac, Azra; Al-Khafaji, Ali; Kashani, Kianoush; Lissauer, Matthew; Shi, Jing; Walker, Michael G.; Kellum, John A.

    2016-01-01

    BACKGROUND Acute kidney injury (AKI) is an important complication in surgical patients. Existing biomarkers and clinical prediction models underestimate the risk for developing AKI. We recently reported data from two trials of 728 and 408 critically ill adult patients in whom urinary TIMP2•IGFBP7 (NephroCheck, Astute Medical) was used to identify patients at risk of developing AKI. Here we report a preplanned analysis of surgical patients from both trials to assess whether urinary tissue inhibitor of metalloproteinase 2 (TIMP-2) and insulin-like growth factor–binding protein 7 (IGFBP7) accurately identify surgical patients at risk of developing AKI. STUDY DESIGN We enrolled adult surgical patients at risk for AKI who were admitted to one of 39 intensive care units across Europe and North America. The primary end point was moderate-severe AKI (equivalent to KDIGO [Kidney Disease Improving Global Outcomes] stages 2–3) within 12 hours of enrollment. Biomarker performance was assessed using the area under the receiver operating characteristic curve, integrated discrimination improvement, and category-free net reclassification improvement. RESULTS A total of 375 patients were included in the final analysis of whom 35 (9%) developed moderate-severe AKI within 12 hours. The area under the receiver operating characteristic curve for [TIMP-2]•[IGFBP7] alone was 0.84 (95% confidence interval, 0.76–0.90; p < 0.0001). Biomarker performance was robust in sensitivity analysis across predefined subgroups (urgency and type of surgery). CONCLUSION For postoperative surgical intensive care unit patients, a single urinary TIMP2•IGFBP7 test accurately identified patients at risk for developing AKI within the ensuing 12 hours and its inclusion in clinical risk prediction models significantly enhances their performance. LEVEL OF EVIDENCE Prognostic study, level I. PMID:26816218

  16. Estimating the gas transfer velocity: a prerequisite for more accurate and higher resolution GHG fluxes (lower Aare River, Switzerland)

    NASA Astrophysics Data System (ADS)

    Sollberger, S.; Perez, K.; Schubert, C. J.; Eugster, W.; Wehrli, B.; Del Sontro, T.

    2013-12-01

    Currently, carbon dioxide (CO2) and methane (CH4) emissions from lakes, reservoirs and rivers are readily investigated due to the global warming potential of those gases and the role these inland waters play in the carbon cycle. However, there is a lack of high spatiotemporally-resolved emission estimates, and how to accurately assess the gas transfer velocity (K) remains controversial. In anthropogenically-impacted systems where run-of-river reservoirs disrupt the flow of sediments by increasing the erosion and load accumulation patterns, the resulting production of carbonic greenhouse gases (GH-C) is likely to be enhanced. The GH-C flux is thus counteracting the terrestrial carbon sink in these environments that act as net carbon emitters. The aim of this project was to determine the GH-C emissions from a medium-sized river heavily impacted by several impoundments and channelization through a densely-populated region of Switzerland. Estimating gas emission from rivers is not trivial and recently several models have been put forth to do so; therefore a second goal of this project was to compare the river emission models available with direct measurements. Finally, we further validated the modeled fluxes by using a combined approach with water sampling, chamber measurements, and highly temporal GH-C monitoring using an equilibrator. We conducted monthly surveys along the 120 km of the lower Aare River where we sampled for dissolved CH4 ('manual' sampling) at a 5-km sampling resolution, and measured gas emissions directly with chambers over a 35 km section. We calculated fluxes (F) via the boundary layer equation (F=K×(Cw-Ceq)) that uses the water-air GH-C concentration (C) gradient (Cw-Ceq) and K, which is the most sensitive parameter. K was estimated using 11 different models found in the literature with varying dependencies on: river hydrology (n=7), wind (2), heat exchange (1), and river width (1). We found that chamber fluxes were always higher than boundary
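
    The boundary-layer flux equation quoted above is simple enough to show directly; the sketch below uses placeholder values for K and the concentrations, not measurements from the Aare survey.

```python
# Boundary-layer flux F = K * (Cw - Ceq), as written in the abstract.
# The numbers below are placeholders, not measurements from the river survey.

def gas_flux(k_m_per_day, c_water, c_equilibrium):
    """Flux across the water-air interface; units follow the inputs
    (e.g. mmol m^-3 concentrations and m d^-1 for K give mmol m^-2 d^-1)."""
    return k_m_per_day * (c_water - c_equilibrium)

K = 4.0          # gas transfer velocity, m d^-1 (hypothetical)
Cw = 0.50        # dissolved CH4 in the river, mmol m^-3 (hypothetical)
Ceq = 0.003      # concentration in equilibrium with the atmosphere (hypothetical)

print(f"CH4 flux = {gas_flux(K, Cw, Ceq):.2f} mmol m^-2 d^-1")
```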

  17. SU-E-J-208: Fast and Accurate Auto-Segmentation of Abdominal Organs at Risk for Online Adaptive Radiotherapy

    SciTech Connect

    Gupta, V; Wang, Y; Romero, A; Heijmen, B; Hoogeman, M; Myronenko, A; Jordan, P

    2014-06-01

    Purpose: Various studies have demonstrated that online adaptive radiotherapy by real-time re-optimization of the treatment plan can improve organs-at-risk (OARs) sparing in the abdominal region. Its clinical implementation, however, requires fast and accurate auto-segmentation of OARs in CT scans acquired just before each treatment fraction. Autosegmentation is particularly challenging in the abdominal region due to the frequently observed large deformations. We present a clinical validation of a new auto-segmentation method that uses fully automated non-rigid registration for propagating abdominal OAR contours from planning to daily treatment CT scans. Methods: OARs were manually contoured by an expert panel to obtain ground truth contours for repeat CT scans (3 per patient) of 10 patients. For the non-rigid alignment, we used a new non-rigid registration method that estimates the deformation field by optimizing local normalized correlation coefficient with smoothness regularization. This field was used to propagate planning contours to repeat CTs. To quantify the performance of the auto-segmentation, we compared the propagated and ground truth contours using two widely used metrics, the Dice coefficient (Dc) and the Hausdorff distance (Hd). The proposed method was benchmarked against translation and rigid alignment based auto-segmentation. Results: For all organs, the auto-segmentation performed better than the baseline (translation) with an average processing time of 15 s per fraction CT. The overall improvements ranged from 2% (heart) to 32% (pancreas) in Dc, and 27% (heart) to 62% (spinal cord) in Hd. For liver, kidneys, gall bladder, stomach, spinal cord and heart, Dc above 0.85 was achieved. Duodenum and pancreas were the most challenging organs with both showing relatively larger spreads and medians of 0.79 and 2.1 mm for Dc and Hd, respectively. Conclusion: Based on the achieved accuracy and computational time we conclude that the investigated auto
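
    For reference, the two metrics used above can be computed as follows; the sketch uses tiny synthetic binary masks rather than the clinical contours, and the symmetric Hausdorff distance is taken as the maximum of the two directed distances.

```python
# Dice coefficient and (symmetric) Hausdorff distance between two binary
# segmentation masks, the two metrics named in the abstract. The masks here
# are tiny synthetic arrays, not the clinical contours.
import numpy as np
from scipy.spatial.distance import directed_hausdorff

gt = np.zeros((20, 20), dtype=bool)
gt[5:15, 5:15] = True                 # "ground truth" organ
auto = np.zeros((20, 20), dtype=bool)
auto[6:16, 4:14] = True               # "propagated" contour, shifted by 1 voxel

dice = 2.0 * np.logical_and(gt, auto).sum() / (gt.sum() + auto.sum())

gt_pts = np.argwhere(gt)              # voxel coordinates of each mask
auto_pts = np.argwhere(auto)
hd = max(directed_hausdorff(gt_pts, auto_pts)[0],
         directed_hausdorff(auto_pts, gt_pts)[0])

print(f"Dice = {dice:.3f}, Hausdorff = {hd:.2f} voxels")
```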

  18. Development of a new, robust and accurate, spectroscopic metric for scatterer size estimation in optical coherence tomography (OCT) images

    NASA Astrophysics Data System (ADS)

    Kassinopoulos, Michalis; Pitris, Costas

    2016-03-01

    The modulations appearing on the backscattering spectrum originating from a scatterer are related to its diameter as described by Mie theory for spherical particles. Many metrics for Spectroscopic Optical Coherence Tomography (SOCT) take advantage of this observation in order to enhance the contrast of Optical Coherence Tomography (OCT) images. However, none of these metrics has achieved high accuracy when calculating the scatterer size. In this work, Mie theory was used to further investigate the relationship between the degree of modulation in the spectrum and the scatterer size. From this study, a new spectroscopic metric, the bandwidth of the Correlation of the Derivative (COD) was developed which is more robust and accurate, compared to previously reported techniques, in the estimation of scatterer size. The self-normalizing nature of the derivative and the robustness of the first minimum of the correlation as a measure of its width, offer significant advantages over other spectral analysis approaches especially for scatterer sizes above 3 μm. The feasibility of this technique was demonstrated using phantom samples containing 6, 10 and 16 μm diameter microspheres as well as images of normal and cancerous human colon. The results are very promising, suggesting that the proposed metric could be implemented in OCT spectral analysis for measuring nuclear size distribution in biological tissues. A technique providing such information would be of great clinical significance since it would allow the detection of nuclear enlargement at the earliest stages of precancerous development.

  19. Optimization of tissue physical parameters for accurate temperature estimation from finite-element simulation of radiofrequency ablation

    NASA Astrophysics Data System (ADS)

    Subramanian, Swetha; Mast, T. Douglas

    2015-09-01

    Computational finite element models are commonly used for the simulation of radiofrequency ablation (RFA) treatments. However, the accuracy of these simulations is limited by the lack of precise knowledge of tissue parameters. In this technical note, an inverse solver based on the unscented Kalman filter (UKF) is proposed to optimize values for specific heat, thermal conductivity, and electrical conductivity resulting in accurately simulated temperature elevations. A total of 15 RFA treatments were performed on ex vivo bovine liver tissue. For each RFA treatment, 15 finite-element simulations were performed using a set of deterministically chosen tissue parameters to estimate the mean and variance of the resulting tissue ablation. The UKF was implemented as an inverse solver to recover the specific heat, thermal conductivity, and electrical conductivity corresponding to the measured area of the ablated tissue region, as determined from gross tissue histology. These tissue parameters were then employed in the finite element model to simulate the position- and time-dependent tissue temperature. Results show good agreement between simulated and measured temperature.

  20. Optimization of tissue physical parameters for accurate temperature estimation from finite-element simulation of radiofrequency ablation.

    PubMed

    Subramanian, Swetha; Mast, T Douglas

    2015-10-01

    Computational finite element models are commonly used for the simulation of radiofrequency ablation (RFA) treatments. However, the accuracy of these simulations is limited by the lack of precise knowledge of tissue parameters. In this technical note, an inverse solver based on the unscented Kalman filter (UKF) is proposed to optimize values for specific heat, thermal conductivity, and electrical conductivity resulting in accurately simulated temperature elevations. A total of 15 RFA treatments were performed on ex vivo bovine liver tissue. For each RFA treatment, 15 finite-element simulations were performed using a set of deterministically chosen tissue parameters to estimate the mean and variance of the resulting tissue ablation. The UKF was implemented as an inverse solver to recover the specific heat, thermal conductivity, and electrical conductivity corresponding to the measured area of the ablated tissue region, as determined from gross tissue histology. These tissue parameters were then employed in the finite element model to simulate the position- and time-dependent tissue temperature. Results show good agreement between simulated and measured temperature. PMID:26352462

  1. Accurate Risk Assessment of Patients with Asymptomatic Hematuria for the Presence of Bladder Cancer

    PubMed Central

    Cha, Eugene K.; Tirsar, Lenuta-Ancuta; Schwentner, Christian; Hennenlotter, Joerg; Christos, Paul J.; Stenzl, Arnulf; Mian, Christine; Martini, Thomas; Pycha, Armin; Shariat, Shahrokh F.; Schmitz-Dräger, Bernd J.

    2014-01-01

    Purpose Bladder cancer is frequently diagnosed during a workup for hematuria. However, most patients with microscopic hematuria and many with gross hematuria are not appropriately referred to urologists. We hypothesized that in patients presenting with asymptomatic hematuria, the risk of having bladder cancer can be predicted with high accuracy. Towards this end, we analyzed risk factors in patients with asymptomatic hematuria and developed a nomogram for the prediction of bladder cancer presence. Methods Data from 1,182 consecutive subjects without a history of bladder cancer undergoing initial evaluation for asymptomatic hematuria were collected at three centers. Clinical risk factors including age, gender, smoking status, and degree of hematuria were recorded. All subjects underwent standard workup including voided cytology, upper tract imaging, and cystourethroscopy. Factors associated with the presence of bladder cancer were evaluated by univariable and multivariable logistic regression analyses. The multivariable analysis was used to construct a nomogram. Internal validation was performed using 200 bootstrap samples. Results Of the 1,182 subjects who presented with asymptomatic hematuria, 245 (20.7%) had bladder cancer. Increasing age (OR=1.03, p<0.0001), smoking history (OR=3.72, p<0.0001), gross hematuria (OR=1.71, p=0.002), and positive cytology (OR=14.71, p<0.0001) were independent predictors of bladder cancer presence. The multivariable model achieved 83.1% accuracy for predicting the presence of bladder cancer. Conclusions Bladder cancer presence can be predicted with high accuracy in patients who present with asymptomatic hematuria. We developed a nomogram to help optimize referral patterns (i.e., timing and prioritization) of patients with asymptomatic hematuria. PMID:23124847

  2. Estimating wildfire risk on a Mojave Desert landscape using remote sensing and field sampling

    USGS Publications Warehouse

    Van Linn, Peter F.; Nussear, Kenneth E.; Esque, Todd C.; DeFalco, Lesley A.; Inman, Richard D.; Abella, Scott R.

    2013-01-01

    Predicting wildfires that affect broad landscapes is important for allocating suppression resources and guiding land management. Wildfire prediction in the south-western United States is of specific concern because of the increasing prevalence and severe effects of fire on desert shrublands and the current lack of accurate fire prediction tools. We developed a fire risk model to predict fire occurrence in a north-eastern Mojave Desert landscape. First we developed a spatial model using remote sensing data to predict fuel loads based on field estimates of fuels. We then modelled fire risk (interactions of fuel characteristics and environmental conditions conducive to wildfire) using satellite imagery, our model of fuel loads, and spatial data on ignition potential (lightning strikes and distance to roads), topography (elevation and aspect) and climate (maximum and minimum temperatures). The risk model was developed during a fire year at our study landscape and validated at a nearby landscape; model performance was accurate and similar at both sites. This study demonstrates that remote sensing techniques used in combination with field surveys can accurately predict wildfire risk in the Mojave Desert and may be applicable to other arid and semiarid lands where wildfires are prevalent.

  3. Underestimating the Alcohol Content of a Glass of Wine: The Implications for Estimates of Mortality Risk

    PubMed Central

    Britton, Annie; O’Neill, Darragh; Bell, Steven

    2016-01-01

    Aims Increases in glass sizes and wine strength over the last 25 years in the UK are likely to have led to an underestimation of alcohol intake in population studies. We explore whether this probable misclassification affects the association between average alcohol intake and risk of mortality from all causes, cardiovascular disease and cancer. Methods Self-reported alcohol consumption in 1997–1999 among 7010 men and women in the Whitehall II cohort of British civil servants was linked to the risk of mortality until mid-2015. A conversion factor of 8 g of alcohol per wine glass (1 unit) was compared with a conversion of 16 g per wine glass (2 units). Results When applying a higher alcohol content conversion for wine consumption, the proportion of heavy/very heavy drinkers increased from 28% to 41% for men and 15% to 28% for women. There was a significantly increased risk of very heavy drinking compared with moderate drinking for deaths from all causes and cancer before and after change in wine conversion; however, the hazard ratios were reduced when a higher wine conversion was used. Conclusions In this population-based study, assuming higher alcohol content in wine glasses changed the estimates of mortality risk. We propose that investigator-led cohorts need to revisit conversion factors based on more accurate estimates of alcohol content in wine glasses. Prospectively, researchers need to collect more detailed information on alcohol including serving sizes and strength. Short summary The alcohol content in a wine glass is likely to be underestimated in population surveys as wine strength and serving size have increased in recent years. We demonstrate that in a large cohort study, this underestimation affects estimates of mortality risk. Investigator-led cohorts need to revisit conversion factors based on more accurate estimates of alcohol content in wine glasses. PMID:27261472
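
    The effect of the conversion factor is easiest to see on a hypothetical drinker; in the sketch below only the 8 g-per-unit definition and the two per-glass conversions (8 g and 16 g) come from the abstract, while the weekly consumption is invented.

```python
# How the per-glass conversion changes estimated intake (1 UK unit = 8 g alcohol).
# The weekly consumption below is illustrative.

glasses_per_week = 14
for grams_per_glass, label in [(8, "survey assumption"), (16, "larger/stronger glass")]:
    grams = glasses_per_week * grams_per_glass
    units = grams / 8.0
    print(f"{label}: {grams} g/week = {units:.0f} units/week")
# 14 glasses/week doubles from 14 to 28 units when the stronger conversion is
# applied, moving a nominally moderate drinker into a heavier category.
```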

  4. Effects of exposure uncertainty on estimation of radon risks

    SciTech Connect

    Chambers, D.B.; Lowe, L.M.; Stager, R.H.; Reilly, P.M.; Duport, P.

    1992-12-31

    Estimates of lung-cancer risk from exposure to radon daughters are largely based on epidemiological studies of underground miners. The reliability of exposure data for these miners is a cause for concern, as actual workplace measurements of radon and/or radon-daughter levels are either sparse or absent for the early years of mining, when much of the exposure occurred.

  5. Estimates of endemic waterborne risks from community-intervention studies.

    PubMed

    Calderon, Rebecca L; Craun, Gunther F

    2006-01-01

    The nature and magnitude of endemic waterborne disease are not well characterized in the United States. Epidemiologic studies of various designs can provide an estimate of the waterborne attributable risk along with other types of information. Community drinking water systems frequently improve their operations and may change drinking water treatment and their major source of water. In the United States, many of these treatment changes are the result of regulations promulgated under the Safe Drinking Water Act. A community-intervention study design takes advantage of these "natural" experiments to assess changes in health risks. In this paper, we review the community-intervention studies that have assessed changes in waterborne gastroenteritis risks among immunocompetent populations in industrialized countries. Published results are available from two studies in Australia, one study in the United Kingdom, and one study in the United States. Preliminary results from two other US studies are also available. Although the current information is limited, the risks reported in these community-intervention studies can help inform the national estimate of endemic waterborne gastroenteritis. Information is provided about endemic waterborne risks for unfiltered surface water sources and a groundwater under the influence of surface water. Community-intervention studies with recommended study modifications should be conducted to better estimate the benefits associated with improved drinking water treatment. PMID:16895087

  6. Neoplastic potential of gastric irradiation. IV. Risk estimates

    SciTech Connect

    Griem, M.L.; Justman, J.; Weiss, L.

    1984-12-01

    No significant tumor increase was found in the initial analysis of patients irradiated for peptic ulcer and followed through 1962. A preliminary study was undertaken 22 years later to estimate the risk of cancer due to gastric irradiation for peptic ulcer disease. A population of 2,049 irradiated patients and 763 medically managed patients has been identified. A relative risk of 3.7 was found for stomach cancer and an initial risk estimate of 5.5 × 10^-6 excess stomach cancers per person-rad was calculated. A more complete follow-up is in progress to further elucidate this observation and decrease the ascertainment bias; however, preliminary data are in agreement with the Japanese atomic bomb reports.
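
    Applied as a simple risk coefficient, the figure above scales with collective dose; in the sketch below the cohort size is taken from the abstract but the mean gastric dose is an invented placeholder.

```python
# Illustrative use of the reported risk coefficient (5.5e-6 excess stomach
# cancers per person-rad). The mean gastric dose is an invented value.

risk_per_person_rad = 5.5e-6
persons = 2049                 # irradiated patients in the study population
mean_dose_rad = 1500.0         # hypothetical mean gastric dose per patient (rad)

expected_excess = risk_per_person_rad * persons * mean_dose_rad
print(f"expected excess stomach cancers = {expected_excess:.1f}")
```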

  7. Estimation of myocardial volume at risk from CT angiography

    NASA Astrophysics Data System (ADS)

    Zhu, Liangjia; Gao, Yi; Mohan, Vandana; Stillman, Arthur; Faber, Tracy; Tannenbaum, Allen

    2011-03-01

    The determination of myocardial volume at risk distal to coronary stenosis provides important information for prognosis and treatment of coronary artery disease. In this paper, we present a novel computational framework for estimating the myocardial volume at risk in computed tomography angiography (CTA) imagery. Initially, epicardial and endocardial surfaces, and coronary arteries are extracted using an active contour method. Then, the extracted coronary arteries are projected onto the epicardial surface, and each point on this surface is associated with its closest coronary artery using the geodesic distance measurement. The likely myocardial region at risk on the epicardial surface caused by a stenosis is approximated by the region in which all its inner points are associated with the sub-branches distal to the stenosis on the coronary artery tree. Finally, the likely myocardial volume at risk is approximated by the volume in between the region at risk on the epicardial surface and its projection on the endocardial surface, which is expected to yield computational savings over risk volume estimation using the entire image volume. Furthermore, we expect increased accuracy since, as compared to prior work using the Euclidean distance, we employ the geodesic distance in this work. The experimental results demonstrate the effectiveness of the proposed approach on pig heart CTA datasets.

  8. Estimating Non-stationary Flood Risk in a Changing Climate

    NASA Astrophysics Data System (ADS)

    Yu, X.; Cohn, T. A.; Stedinger, J. R.

    2015-12-01

    Flood risk is usually described by a probability distribution for annual maximum streamflow which is assumed not to change with time. Federal, state and local governments in the United States are demanding guidance on flood frequency estimates that account for climate change. If a trend exists in a peak flow series, ignoring it could result in large quantile estimator bias, while trying to estimate a trend will increase the flood quantile estimator's variance. Thus the issue is, what bias-variance tradeoff should we accept? This paper discusses approaches to flood frequency analysis (FFA) when flood series have trends. GCMs describe how annual runoff might vary over sub-continental scales, but this information is nearly useless for FFA in small watersheds. An LP3 Monte Carlo analysis and a re-sampling study of 100-year flood estimation (25- and 50-year projections) compare the performance of five methods: (1) FFA as prescribed in national guidelines (Bulletin 17B), which assumes the flood series is stationary and follows a log-Pearson type III (LP3) distribution; (2) fitting an LP3 distribution with time-varying parameters that include future trends in mean and perhaps variance, where slopes are assumed known; (3) fitting an LP3 distribution with time-varying parameters that capture future trends in mean and perhaps variance, where slopes are estimated from annual peak flow series; (4) employing only the most recent 30 years of flood records to fit an LP3 distribution; and (5) applying a safety factor to the 100-year flood estimator (e.g. a 25% increase). The 100-year flood estimator of method 2 has the smallest log-space mean squared error, though it is unlikely that the true trend would be known. Method 3 is only recommended over method 1 for large trends (≥ 0.5% per year). The 100-year flood estimators of methods 1, 4, and 5 often have poor accuracy. Clearly, flood risk assessment will be a challenge in an uncertain world.

  9. Measurement of total risk of spontaneous abortion: the virtue of conditional risk estimation.

    PubMed

    Modvig, J; Schmidt, L; Damsgaard, M T

    1990-12-01

    The concepts, methods, and problems of measuring spontaneous abortion risk are reviewed. The problems touched on include the process of pregnancy verification, the changes in risk by gestational age and maternal age, and the presence of induced abortions. Methods used in studies of spontaneous abortion risk include biochemical assays as well as life table technique, although the latter appears in two different forms. The consequences of using either of these are discussed. It is concluded that no study design so far is appropriate for measuring the total risk of spontaneous abortion from early conception to the end of the 27th week. It is proposed that pregnancy may be considered to consist of two or three specific periods and that different study designs should concentrate on measuring the conditional risk within each period. A careful estimate using this principle leads to an estimate of total risk of spontaneous abortion of 0.33.
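
    Combining period-specific (conditional) risks into a total follows the usual survival-product rule; in the sketch below the combination formula is standard, while the individual period risks are invented values chosen so the total lands near the 0.33 figure.

```python
# Combining conditional risks over successive pregnancy periods into a total
# risk of spontaneous abortion: total = 1 - prod(1 - p_i). The period risks
# below are hypothetical values, not estimates from the paper.

period_risks = [0.25, 0.08, 0.03]   # e.g. early, mid, late periods (assumed)

survival = 1.0
for p in period_risks:
    survival *= (1.0 - p)           # probability of reaching the next period unaffected
total_risk = 1.0 - survival
print(f"total risk = {total_risk:.2f}")
```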

  10. NRC committee provides new risk estimates for exposure to radon

    SciTech Connect

    Not Available

    1988-03-01

    A new set of age-specific estimates describing the increased risk of lung cancer following exposure to radon was released in January by a National Research Council committee. The revised estimates result from new statistical techniques used to analyze previously collected data. In a study jointly sponsored by the Environmental Protection Agency (EPA) and the Nuclear Regulatory Commission, the committee concluded that lifetime exposure to one working level month (WLM) of radon per year, a standard measure used by radiation experts, increases an individual's chances of dying from lung cancer by 1.5 times compared with someone exposed only to background levels of radon. The committee estimated that, for every 1 million people exposed over a lifetime to one WLM of radon, about 350 additional deaths would occur due to lung cancer. The committee found that lung cancer risks associated with radon increased with increasing length of exposure. Moreover, it said that 15 years after exposure to radon has ended, the risk of lung cancer from the exposure declines to half the original risk.

  11. Urban micro-scale flood risk estimation with parsimonious hydraulic modelling and census data

    NASA Astrophysics Data System (ADS)

    Arrighi, C.; Brugioni, M.; Castelli, F.; Franceschini, S.; Mazzanti, B.

    2013-05-01

    The adoption of 2007/60/EC Directive requires European countries to implement flood hazard and flood risk maps by the end of 2013. Flood risk is the product of flood hazard, vulnerability and exposure, all three to be estimated with comparable level of accuracy. The route to flood risk assessment is consequently much more than hydraulic modelling of inundation, that is hazard mapping. While hazard maps have already been implemented in many countries, quantitative damage and risk maps are still at a preliminary level. A parsimonious quasi-2-D hydraulic model is here adopted, having many advantages in terms of easy set-up. It is here evaluated as being accurate in flood depth estimation in urban areas with a high-resolution and up-to-date Digital Surface Model (DSM). The accuracy, estimated by comparison with marble-plate records of a historic flood in the city of Florence, is characterized in the downtown's most flooded area by a bias of a very few centimetres and a determination coefficient of 0.73. The average risk is found to be about 14 € m-2 yr-1, corresponding to about 8.3% of residents' income. The spatial distribution of estimated risk highlights a complex interaction between the flood pattern and the building characteristics. As a final example application, the estimated risk values have been used to compare different retrofitting measures. Proceeding through the risk estimation steps, a new micro-scale potential damage assessment method is proposed. This is based on the georeferenced census system as the optimal compromise between spatial detail and open availability of socio-economic data. The results of flood risk assessment at the census section scale resolve most of the risk spatial variability, and they can be easily aggregated to whatever upper scale is needed given that they are geographically defined as contiguous polygons. Damage is calculated through stage-damage curves, starting from census data on building type and function, for the main
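
    Expressing risk as an expected annual damage implies integrating damage over annual exceedance probability; the sketch below does this with a trapezoidal approximation over a handful of assumed return-period scenarios, so the numbers are illustrative rather than the Florence results.

```python
# Expected annual damage (EAD) as the integral of damage over annual
# exceedance probability, approximated with the trapezoidal rule. The return
# periods and per-m^2 damages below are assumed, not the Florence results.
import numpy as np

return_periods = np.array([30.0, 100.0, 200.0, 500.0])   # years (assumed scenarios)
damage_eur_m2 = np.array([150.0, 400.0, 600.0, 900.0])   # damage per m^2 (assumed)

p = 1.0 / return_periods[::-1]      # exceedance probabilities, ascending
d = damage_eur_m2[::-1]             # damages ordered to match p

ead = sum(0.5 * (d[i] + d[i + 1]) * (p[i + 1] - p[i]) for i in range(len(p) - 1))
print(f"EAD = {ead:.1f} EUR per m^2 per year")
```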

  12. Estimating Skin Cancer Risk: Evaluating Mobile Computer-Adaptive Testing

    PubMed Central

    Djaja, Ngadiman; Janda, Monika; Olsen, Catherine M; Whiteman, David C

    2016-01-01

    Background Response burden is a major detriment to questionnaire completion rates. Computer adaptive testing may offer advantages over non-adaptive testing, including reduction of numbers of items required for precise measurement. Objective Our aim was to compare the efficiency of non-adaptive (NAT) and computer adaptive testing (CAT) facilitated by Partial Credit Model (PCM)-derived calibration to estimate skin cancer risk. Methods We used a random sample from a population-based Australian cohort study of skin cancer risk (N=43,794). All 30 items of the skin cancer risk scale were calibrated with the Rasch PCM. A total of 1000 cases generated following a normal distribution (mean [SD] 0 [1]) were simulated using three Rasch models with three fixed-item (dichotomous, rating scale, and partial credit) scenarios, respectively. We calculated the comparative efficiency and precision of CAT and NAT (shortening of questionnaire length and the count difference number ratio less than 5% using independent t tests). Results We found that use of CAT led to smaller person standard error of the estimated measure than NAT, with substantially higher efficiency but no loss of precision, reducing response burden by 48%, 66%, and 66% for dichotomous, Rating Scale Model, and PCM models, respectively. Conclusions CAT-based administrations of the skin cancer risk scale could substantially reduce participant burden without compromising measurement precision. A mobile computer adaptive test was developed to help people efficiently assess their skin cancer risk. PMID:26800642

  13. A Review of Expertise and Judgment Processes for Risk Estimation

    SciTech Connect

    R. L. Boring

    2007-06-01

    A major challenge of risk and reliability analysis for human errors or hardware failures is the need to enlist expert opinion in areas for which adequate operational data are not available. Experts enlisted in this capacity provide probabilistic estimates of reliability, typically comprised of a measure of central tendency and uncertainty bounds. While formal guidelines for expert elicitation are readily available, they largely fail to provide a theoretical basis for expertise and judgment. This paper reviews expertise and judgment in the context of risk analysis; overviews judgment biases, the role of training, and multivariate judgments; and provides guidance on the appropriate use of atomistic and holistic judgment processes.

  14. Observing Volcanic Thermal Anomalies from Space: How Accurate is the Estimation of the Hotspot's Size and Temperature?

    NASA Astrophysics Data System (ADS)

    Zaksek, K.; Pick, L.; Lombardo, V.; Hort, M. K.

    2015-12-01

    Measuring the heat emission from active volcanic features on the basis of infrared satellite images contributes to the volcano's hazard assessment. Because these thermal anomalies only occupy a small fraction (< 1 %) of a typically resolved target pixel (e.g. from Landsat 7, MODIS) the accurate determination of the hotspot's size and temperature is however problematic. Conventionally this is overcome by comparing observations in at least two separate infrared spectral wavebands (Dual-Band method). We investigate the resolution limits of this thermal un-mixing technique by means of a uniquely designed indoor analog experiment. Therein the volcanic feature is simulated by an electrical heating alloy of 0.5 mm diameter installed on a plywood panel of high emissivity. Two thermographic cameras (VarioCam high resolution and ImageIR 8300 by Infratec) record images of the artificial heat source in wavebands comparable to those available from satellite data. These range from the short-wave infrared (1.4-3 µm) over the mid-wave infrared (3-8 µm) to the thermal infrared (8-15 µm). In the conducted experiment the pixel fraction of the hotspot was successively reduced by increasing the camera-to-target distance from 3 m to 35 m. On the basis of an individual target pixel the expected decrease of the hotspot pixel area with distance at a relatively constant wire temperature of around 600 °C was confirmed. The deviation of the hotspot's pixel fraction yielded by the Dual-Band method from the theoretically calculated one was found to be within 20 % up until a target distance of 25 m. This means that a reliable estimation of the hotspot size is only possible if the hotspot is larger than about 3 % of the pixel area, a resolution boundary most remotely sensed volcanic hotspots fall below. Future efforts will focus on the investigation of a resolution limit for the hotspot's temperature by varying the alloy's amperage. Moreover, the un-mixing results for more realistic multi
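
    The Dual-Band retrieval described above treats each pixel as a two-component mixture and inverts radiances in two wavebands for the hot fraction and temperature. The sketch below sets up that inversion with a Planck radiance function; the band wavelengths, background temperature and simulated hotspot are assumed values, not the experiment's calibration.

```python
# Sketch of the Dual-Band idea: recover the hot-subpixel fraction f and
# temperature T_hot from radiances in two wavebands, assuming each pixel is
# a two-component mixture. All numerical values here are assumptions.
import numpy as np
from scipy.optimize import fsolve

H, C, KB = 6.626e-34, 2.998e8, 1.381e-23   # Planck/Boltzmann constants (SI)

def planck(lam_m, temp_k):
    """Spectral radiance B(lambda, T) in W sr^-1 m^-3."""
    return (2.0 * H * C**2 / lam_m**5) / np.expm1(H * C / (lam_m * KB * temp_k))

lam1, lam2 = 3.9e-6, 11.0e-6     # mid-wave and thermal infrared bands (assumed)
T_bg = 300.0                     # background temperature, K (assumed)

# Simulate a pixel: 2% of its area at 873 K (~600 C), the rest at background.
f_true, T_true = 0.02, 873.0
R1 = f_true * planck(lam1, T_true) + (1 - f_true) * planck(lam1, T_bg)
R2 = f_true * planck(lam2, T_true) + (1 - f_true) * planck(lam2, T_bg)

def residuals(x):
    f, t_hot = x
    return [f * planck(lam1, t_hot) + (1 - f) * planck(lam1, T_bg) - R1,
            f * planck(lam2, t_hot) + (1 - f) * planck(lam2, T_bg) - R2]

f_est, T_est = fsolve(residuals, x0=[0.1, 600.0])
print(f"recovered fraction {f_est:.3f}, temperature {T_est:.0f} K")
```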

  15. Estimation of radiation risk for astronauts on the Moon

    NASA Astrophysics Data System (ADS)

    Kuznetsov, N. V.; Nymmik, R. A.; Panasyuk, M. I.; Denisov, A. N.; Sobolevsky, N. M.

    2012-05-01

    The problem of estimating the risk of radiation for humans on the Moon is discussed, taking into account the probabilistic nature of occurrence of solar particle events. Calculations of the expected values of tissue-averaged equivalent dose rates, which are created by galactic and solar cosmic-ray particle fluxes on the lunar surface behind shielding, are made for different durations of lunar missions.

  16. Estimating relative risks for common outcome using PROC NLP.

    PubMed

    Yu, Binbing; Wang, Zhuoqiao

    2008-05-01

    In cross-sectional or cohort studies with binary outcomes, it is biologically interpretable and of interest to estimate the relative risk or prevalence ratio, especially when the response rates are not rare. Several methods have been used to estimate the relative risk, among which the log-binomial models yield the maximum likelihood estimate (MLE) of the parameters. Because of restrictions on the parameter space, the log-binomial models often run into convergence problems. Some remedies, e.g., the Poisson and Cox regressions, have been proposed. However, these methods may give out-of-bound predicted response probabilities. In this paper, a new computation method using the SAS Nonlinear Programming (NLP) procedure is proposed to find the MLEs. The proposed NLP method was compared to the COPY method, a modified method to fit the log-binomial model. Issues in the implementation are discussed. For illustration, both methods were applied to data on the prevalence of microalbuminuria (micro-protein leakage into urine) for kidney disease patients from the Diabetes Control and Complications Trial. The sample SAS macro for calculating relative risk is provided in the appendix.
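
    One of the remedies mentioned above, Poisson regression, can be paired with robust standard errors to estimate the relative risk directly (the "modified Poisson" approach); the sketch below is a Python stand-in using statsmodels rather than the SAS PROC NLP method of the paper, and the data are simulated.

```python
# Relative-risk regression for a common binary outcome via "modified Poisson"
# (Poisson GLM with robust standard errors), one of the remedies the abstract
# mentions. This is a Python stand-in, not the SAS PROC NLP approach; the
# data are simulated.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 2000
exposed = rng.integers(0, 2, n)
p = np.where(exposed == 1, 0.30, 0.15)        # true relative risk = 2.0
y = rng.binomial(1, p)

X = sm.add_constant(exposed.astype(float))
fit = sm.GLM(y, X, family=sm.families.Poisson()).fit(cov_type="HC0")

rr = np.exp(fit.params[1])                    # exp(beta) estimates the relative risk
lo, hi = np.exp(fit.conf_int()[1])            # robust 95% confidence interval
print(f"estimated RR = {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```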

  17. Improved risk estimates for carbon tetrachloride. 1998 annual progress report

    SciTech Connect

    Benson, J.M.; Springer, D.L.; Thrall, K.D.

    1998-06-01

    The overall purpose of these studies is to improve the scientific basis for assessing the cancer risk associated with human exposure to carbon tetrachloride. Specifically, the toxicokinetics of inhaled carbon tetrachloride is being determined in rats, mice and hamsters. Species differences in the metabolism of carbon tetrachloride by rats, mice and hamsters are being determined in vivo and in vitro using tissues and microsomes from these rodent species and man. Dose-response relationships will be determined in all studies. The information will be used to improve the current physiologically based pharmacokinetic model for carbon tetrachloride. The authors will also determine whether carbon tetrachloride is a hepatocarcinogen only when exposure results in cell damage, cell killing, and regenerative cell proliferation. In combination, the results of these studies will provide the types of information needed to enable a refined risk estimate for carbon tetrachloride under EPA's new guidelines for cancer risk assessment.

  18. Risk estimation based on chromosomal aberrations induced by radiation

    NASA Technical Reports Server (NTRS)

    Durante, M.; Bonassi, S.; George, K.; Cucinotta, F. A.

    2001-01-01

    The presence of a causal association between the frequency of chromosomal aberrations in peripheral blood lymphocytes and the risk of cancer has been substantiated recently by epidemiological studies. Cytogenetic analyses of crew members of the Mir Space Station have shown that a significant increase in the frequency of chromosomal aberrations can be detected after flight, and that such an increase is likely to be attributed to the radiation exposure. The risk of cancer can be estimated directly from the yields of chromosomal aberrations, taking into account some aspects of individual susceptibility and other factors unrelated to radiation. However, the use of an appropriate technique for the collection and analysis of chromosomes and the choice of the structural aberrations to be measured are crucial in providing sound results. Based on the fraction of aberrant lymphocytes detected before and after flight, the relative risk after a long-term Mir mission is estimated to be about 1.2-1.3. The new technique of mFISH can provide useful insights into the quantification of risk on an individual basis.

  19. Exploration of diffusion kernel density estimation in agricultural drought risk analysis: a case study in Shandong, China

    NASA Astrophysics Data System (ADS)

    Chen, W.; Shao, Z.; Tiong, L. K.

    2015-11-01

    Drought has caused the most widespread damage in China, accounting for over 50% of the total affected area nationwide in recent decades. In this paper, a Standardized Precipitation Index (SPI)-based drought risk study is conducted using historical rainfall data from 19 weather stations in Shandong province, China. A kernel density based method is adopted to carry out the risk analysis. A comparison between bivariate Gaussian kernel density estimation (GKDE) and diffusion kernel density estimation (DKDE) is carried out to analyze the effect of drought intensity and drought duration. The results show that DKDE is relatively more accurate and free of boundary leakage. Combined with GIS techniques, the drought risk is mapped, revealing the spatial and temporal variation of agricultural drought for corn in Shandong. The estimation provides a different way to study the occurrence frequency and severity of drought risk from multiple perspectives.

  20. A comparison of genetic risk score with family history for estimating prostate cancer risk

    PubMed Central

    Helfand, Brian T

    2016-01-01

    Prostate cancer (PCa) testing is recommended by most authoritative groups for high-risk men including those with a family history of the disease. However, family history information is often limited by patient knowledge and clinician intake, and thus, many men are incorrectly assigned to different risk groups. Alternate methods to assess PCa risk are required. In this review, we discuss how genetic variants, referred to as PCa-risk single-nucleotide polymorphisms (SNPs), can be used to calculate a genetic risk score (GRS). GRS assigns a relatively unique value to all men based on the number of PCa-risk SNPs that an individual carries. This GRS value can provide a more precise estimate of a man's PCa risk. This is particularly relevant in situations when an individual is unaware of his family history. In addition, GRS has utility and can provide a more precise estimate of risk even among men with a positive family history. It can even distinguish risk among relatives with the same degree of family relationships. Taken together, this review serves to provide support for the clinical utility of GRS as an independent test to provide supplemental information to family history. As such, GRS can serve as a platform to help guide shared decision-making processes regarding the timing and frequency of PCa testing and biopsies. PMID:27004541
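
    A GRS of this kind is just a (possibly weighted) count of risk alleles; the sketch below uses made-up SNP names, weights and genotypes purely to show the calculation.

```python
# A genetic risk score as a weighted count of risk alleles. SNP names,
# per-SNP weights (log odds ratios) and genotypes are invented.

log_or = {"rsA": 0.10, "rsB": 0.18, "rsC": 0.07, "rsD": 0.12}   # hypothetical weights
genotype = {"rsA": 2, "rsB": 1, "rsC": 0, "rsD": 1}             # risk-allele counts (0/1/2)

grs = sum(genotype[snp] * weight for snp, weight in log_or.items())
print(f"weighted GRS = {grs:.2f}")
# An unweighted score is simply sum(genotype.values()); either way each man
# gets a risk estimate that does not depend on knowing his family history.
```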

  1. Seismic Risk Assessment and Loss Estimation for Tbilisi City

    NASA Astrophysics Data System (ADS)

    Tsereteli, Nino; Alania, Victor; Varazanashvili, Otar; Gugeshashvili, Tengiz; Arabidze, Vakhtang; Arevadze, Nika; Tsereteli, Emili; Gaphrindashvili, Giorgi; Gventcadze, Alexander; Goguadze, Nino; Vephkhvadze, Sophio

    2013-04-01

    The proper assessment of seismic risk is of crucial importance for the protection of society and for sustainable economic development of the city, as it is an essential part of seismic risk reduction. Estimating seismic risk and losses is a complicated task: there is always a knowledge deficit regarding the real seismic hazard, local site effects, the inventory of elements at risk, and infrastructure vulnerability, especially in developing countries. Recently, great efforts were made in the frame of the EMME (Earthquake Model for the Middle East Region) project, where work packages WP1, WP2, WP3 and WP4 addressed gaps related to seismic hazard assessment and vulnerability analysis. Finally, in the frame of work package WP5 "City Scenario", additional work in this direction was done, including detailed investigation of local site conditions and of the active fault (3D) beneath Tbilisi. For the estimation of economic losses, an algorithm was prepared taking into account the obtained inventory. The long-term usage of buildings is very complex and relates to their reliability and durability. The long-term usage and durability of a building is described by the concept of depreciation. Depreciation of an entire building is calculated by summing the products of the depreciation rates of individual construction units and the corresponding value of these units within the building. This method of calculation is based on the assumption that depreciation is proportional to the building's (construction's) useful life. We used this methodology to create a matrix that provides a way to evaluate the depreciation rates of buildings of different types and construction periods and to determine their corresponding value. Finally, losses were estimated for shaking with 10%, 5% and 2% probability of exceedance in 50 years. Losses resulting from a scenario earthquake (an earthquake with the maximum possible magnitude) were also estimated.
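
    A minimal sketch of the depreciation calculation described above, in which the depreciation of a whole building is the sum of individual unit depreciation rates weighted by each unit's share of the building value; the unit names and numbers are hypothetical.

      # Hypothetical construction units: share of the building's value and the
      # depreciation rate of each unit (illustrative numbers only).
      units = [
          {"name": "foundation", "value_share": 0.20, "depreciation_rate": 0.10},
          {"name": "walls",      "value_share": 0.35, "depreciation_rate": 0.25},
          {"name": "roof",       "value_share": 0.15, "depreciation_rate": 0.40},
          {"name": "services",   "value_share": 0.30, "depreciation_rate": 0.50},
      ]

      # Building depreciation = sum over units of (depreciation rate x value share).
      building_depreciation = sum(u["value_share"] * u["depreciation_rate"] for u in units)
      print(f"Overall building depreciation: {building_depreciation:.1%}")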

  2. Risk cross sections and their application to risk estimation in the galactic cosmic-ray environment.

    PubMed

    Curtis, S B; Nealy, J E; Wilson, J W

    1995-01-01

    Radiation risk cross sections (i.e. risks per particle fluence) are discussed in the context of estimating the risk of radiation-induced cancer on long-term space flights from the galactic cosmic radiation outside the confines of the earth's magnetic field. Such quantities are useful for handling effects not seen after low-LET radiation. Since appropriate cross-section functions for cancer induction for each particle species are not yet available, the conventional quality factor is used as an approximation to obtain numerical results for risks of excess cancer mortality. Risks are obtained for seven of the most radiosensitive organs as determined by the ICRP [stomach, colon, lung, bone marrow (BFO), bladder, esophagus and breast], beneath 10 g/cm2 aluminum shielding at solar minimum. Spectra are obtained for excess relative risk for each cancer per LET interval by calculating the average fluence-LET spectrum for the organ and converting to risk by multiplying by a factor proportional to R gamma L Q(L) before integrating over L, the unrestricted LET. Here R gamma is the risk coefficient for low-LET radiation (excess relative mortality per Sv) for the particular organ in question. The total risks of excess cancer mortality obtained are 1.3 and 1.1% to female and male crew, respectively, for a 1-year exposure at solar minimum. Uncertainties in these values are estimated to range between factors of 4 and 15 and are dominated by the biological uncertainties in the risk coefficients for low-LET radiation and in the LET (or energy) dependence of the risk cross sections (as approximated by the quality factor). The direct substitution of appropriate risk cross sections will eventually circumvent entirely the need to calculate, measure or use absorbed dose, equivalent dose and quality factor for such a high-energy charged-particle environment. PMID:7997515
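
    The conversion described above can be sketched numerically as follows, assuming the ICRP 60 quality factor Q(L), the standard fluence-to-dose conversion for unit-density tissue, an illustrative organ fluence-LET spectrum and an illustrative low-LET risk coefficient R_gamma; none of these inputs are the paper's actual values.

      import numpy as np

      def icrp60_quality_factor(L):
          """ICRP 60 quality factor as a function of unrestricted LET L (keV/um)."""
          L = np.asarray(L, dtype=float)
          return np.where(L < 10.0, 1.0,
                          np.where(L <= 100.0, 0.32 * L - 2.2, 300.0 / np.sqrt(L)))

      L = np.logspace(-1, 3, 400)              # LET grid, keV/um
      phi = 1.0e4 * np.exp(-L / 50.0)          # illustrative differential fluence dPhi/dL, cm^-2 per (keV/um)

      # Absorbed dose per unit LET: D = 1.602e-9 * L[keV/um] * Phi[cm^-2] Gy for unit-density tissue
      dose_per_L = 1.602e-9 * L * phi
      dose_eq_per_L = dose_per_L * icrp60_quality_factor(L)

      # Trapezoidal integration over L gives the organ dose equivalent (Sv)
      H = float(np.sum(0.5 * (dose_eq_per_L[1:] + dose_eq_per_L[:-1]) * np.diff(L)))

      R_gamma = 0.05                           # illustrative excess relative risk per Sv for the organ
      print(f"Organ dose equivalent ~ {H:.3f} Sv, excess relative risk ~ {R_gamma * H:.4f}")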

  3. Risk cross sections and their application to risk estimation in the galactic cosmic-ray environment

    NASA Technical Reports Server (NTRS)

    Curtis, S. B.; Nealy, J. E.; Wilson, J. W.; Chatterjee, A. (Principal Investigator)

    1995-01-01

    Radiation risk cross sections (i.e. risks per particle fluence) are discussed in the context of estimating the risk of radiation-induced cancer on long-term space flights from the galactic cosmic radiation outside the confines of the earth's magnetic field. Such quantities are useful for handling effects not seen after low-LET radiation. Since appropriate cross-section functions for cancer induction for each particle species are not yet available, the conventional quality factor is used as an approximation to obtain numerical results for risks of excess cancer mortality. Risks are obtained for seven of the most radiosensitive organs as determined by the ICRP [stomach, colon, lung, bone marrow (BFO), bladder, esophagus and breast], beneath 10 g/cm2 aluminum shielding at solar minimum. Spectra are obtained for excess relative risk for each cancer per LET interval by calculating the average fluence-LET spectrum for the organ and converting to risk by multiplying by a factor proportional to R gamma L Q(L) before integrating over L, the unrestricted LET. Here R gamma is the risk coefficient for low-LET radiation (excess relative mortality per Sv) for the particular organ in question. The total risks of excess cancer mortality obtained are 1.3 and 1.1% to female and male crew, respectively, for a 1-year exposure at solar minimum. Uncertainties in these values are estimated to range between factors of 4 and 15 and are dominated by the biological uncertainties in the risk coefficients for low-LET radiation and in the LET (or energy) dependence of the risk cross sections (as approximated by the quality factor). The direct substitution of appropriate risk cross sections will eventually circumvent entirely the need to calculate, measure or use absorbed dose, equivalent dose and quality factor for such a high-energy charged-particle environment.

  4. Estimation of value at risk and conditional value at risk using normal mixture distributions model

    NASA Astrophysics Data System (ADS)

    Kamaruzzaman, Zetty Ain; Isa, Zaidi

    2013-04-01

    Normal mixture distribution models have been successfully applied in financial time series analysis. In this paper, we estimate the return distribution, value at risk (VaR) and conditional value at risk (CVaR) for monthly and weekly rates of return of the FTSE Bursa Malaysia Kuala Lumpur Composite Index (FBMKLCI) from July 1990 until July 2010 using a two-component univariate normal mixture distribution model. First, we present the application of the normal mixture distribution model in empirical finance, where we fit our real data. Second, we present its application in risk analysis, where we use the model to evaluate value at risk (VaR) and conditional value at risk (CVaR), with model validation for both risk measures. The empirical results provide evidence that the two-component normal mixture distribution model fits the data well and performs better in estimating VaR and CVaR, as it captures the stylized facts of non-normality and leptokurtosis in the return distribution.
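
    A minimal sketch on synthetic data, fitting a two-component Gaussian mixture with scikit-learn and estimating VaR and CVaR by Monte Carlo sampling from the fitted mixture; this mirrors the idea in the abstract but is not the authors' exact estimation procedure.

      import numpy as np
      from sklearn.mixture import GaussianMixture

      rng = np.random.default_rng(1)
      returns = 0.02 * rng.standard_t(df=4, size=1000)       # synthetic heavy-tailed weekly returns

      gmm = GaussianMixture(n_components=2, random_state=0).fit(returns.reshape(-1, 1))

      # Monte Carlo VaR and CVaR at the 95% level from the fitted mixture
      samples, _ = gmm.sample(100_000)
      samples = samples.ravel()
      q05 = np.quantile(samples, 0.05)
      var_95 = -q05                                          # loss exceeded 5% of the time
      cvar_95 = -samples[samples <= q05].mean()              # mean loss in the worst 5% of cases
      print(f"VaR(95%) = {var_95:.4f}, CVaR(95%) = {cvar_95:.4f}")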

  5. Quasi-likelihood estimation for relative risk regression models.

    PubMed

    Carter, Rickey E; Lipsitz, Stuart R; Tilley, Barbara C

    2005-01-01

    For a prospective randomized clinical trial with two groups, the relative risk can be used as a measure of treatment effect and is directly interpretable as the ratio of success probabilities in the new treatment group versus the placebo group. For a prospective study with many covariates and a binary outcome (success or failure), relative risk regression may be of interest. If we model the log of the success probability as a linear function of covariates, the regression coefficients are log-relative risks. However, using such a log-linear model with a Bernoulli likelihood can lead to convergence problems in the Newton-Raphson algorithm. This is likely to occur when the success probabilities are close to one. A constrained likelihood method proposed by Wacholder (1986, American Journal of Epidemiology 123, 174-184), also has convergence problems. We propose a quasi-likelihood method of moments technique in which we naively assume the Bernoulli outcome is Poisson, with the mean (success probability) following a log-linear model. We use the Poisson maximum likelihood equations to estimate the regression coefficients without constraints. Using method of moment ideas, one can show that the estimates using the Poisson likelihood will be consistent and asymptotically normal. We apply these methods to a double-blinded randomized trial in primary biliary cirrhosis of the liver (Markus et al., 1989, New England Journal of Medicine 320, 1709-1713). PMID:15618526
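
    A minimal sketch of the "naive Poisson" idea on synthetic data, using statsmodels: a Poisson GLM with a log link fitted to a Bernoulli outcome, with a robust (sandwich) covariance so the standard errors do not rely on the Poisson variance assumption.

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(2)
      n = 500
      treatment = rng.integers(0, 2, size=n)
      age = rng.normal(50.0, 10.0, size=n)

      # Synthetic binary outcome with a log-linear success probability
      p = np.exp(-1.2 + 0.3 * treatment + 0.01 * (age - 50.0))
      y = rng.binomial(1, np.clip(p, 0.0, 1.0))

      X = sm.add_constant(np.column_stack([treatment, age]))
      fit = sm.GLM(y, X, family=sm.families.Poisson()).fit(cov_type="HC0")  # robust standard errors
      print(f"Estimated relative risk for treatment: {np.exp(fit.params[1]):.2f}")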

  6. Leukemia risk associated with benzene exposure in the pliofilm cohort. II. Risk estimates.

    PubMed

    Paxton, M B; Chinchilli, V M; Brett, S M; Rodricks, J V

    1994-04-01

    The detailed work histories of the individual workers composing the Pliofilm cohort represent a unique resource for estimating the dose-response for leukemia that may follow occupational exposure to benzene. In this paper, we report the results of analyzing the updated Pliofilm cohort using the proportional hazards model, a more sophisticated technique that uses more of the available exposure data than the conditional logistic model used by Rinsky et al. The more rigorously defined exposure estimates derived by Paustenbach et al. are consistent with those of Crump and Allen in giving estimates of the slope of the leukemogenic dose-response that are not as steep as the slope resulting from the exposure estimates of Rinsky et al. We consider estimates of 0.3-0.5 additional leukemia deaths per thousand workers with 45 ppm-years of cumulative benzene exposure to be the best estimates currently available of leukemia risk from occupational exposure to benzene. These risks were estimated in the proportional hazards model when the exposure estimates of Crump and Allen or of Paustenbach et al. were used to derive a cumulative concentration-by-time metric. PMID:8008924
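
    The proportional hazards analysis can be sketched as follows with the lifelines library on a synthetic cohort; the cumulative-exposure values, event generation and coefficients are made up and bear no relation to the Pliofilm data.

      import numpy as np
      import pandas as pd
      from lifelines import CoxPHFitter

      rng = np.random.default_rng(3)
      n = 1000
      ppm_years = rng.gamma(shape=2.0, scale=20.0, size=n)          # synthetic cumulative exposure
      latent_time = rng.exponential(scale=40.0, size=n) * np.exp(-0.004 * ppm_years)
      event = (latent_time < 30.0).astype(int)                      # administrative censoring at 30 years
      duration = np.minimum(latent_time, 30.0)

      df = pd.DataFrame({"duration": duration, "event": event, "ppm_years": ppm_years})
      cph = CoxPHFitter().fit(df, duration_col="duration", event_col="event")
      cph.print_summary()   # hazard ratio per ppm-year of cumulative exposure (synthetic data)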

  7. Cancer Risk Estimates from Space Flight Estimated Using Yields of Chromosome Damage in Astronaut's Blood Lymphocytes

    NASA Technical Reports Server (NTRS)

    George, Kerry A.; Rhone, J.; Chappell, L. J.; Cucinotta, F. A.

    2011-01-01

    To date, cytogenetic damage has been assessed in blood lymphocytes from more than 30 astronauts before and after they participated in long-duration space missions of three months or more on board the International Space Station. Chromosome damage was assessed using fluorescence in situ hybridization whole chromosome analysis techniques. For all individuals, the frequency of chromosome damage measured within a month of return from space was higher than their preflight yield, and biodosimetry estimates were within the range expected from physical dosimetry. Follow up analyses have been performed on most of the astronauts at intervals ranging from around 6 months to many years after flight, and the cytogenetic effects of repeat long-duration missions have so far been assessed in four individuals. Chromosomal aberrations in peripheral blood lymphocytes have been validated as biomarkers of cancer risk and cytogenetic damage can therefore be used to characterize excess health risk incurred by individual crewmembers after their respective missions. Traditional risk assessment models are based on epidemiological data obtained on Earth in cohorts exposed predominantly to acute doses of gamma-rays, and the extrapolation to the space environment is highly problematic, involving very large uncertainties. Cytogenetic damage could play a key role in reducing uncertainty in risk estimation because it is incurred directly in the space environment, using specimens from the astronauts themselves. Relative cancer risks were estimated from the biodosimetry data using the quantitative approach derived from the European Study Group on Cytogenetic Biomarkers and Health database. Astronauts were categorized into low, medium, or high tertiles according to their yield of chromosome damage. Age adjusted tertile rankings were used to estimate cancer risk and results were compared with values obtained using traditional modeling approaches. Individual tertile rankings increased after space

  8. Data Sources for the Model-based Small Area Estimates of Cancer Risk Factors and Screening Behaviors - Small Area Estimates

    Cancer.gov

    The model-based estimates of important cancer risk factors and screening behaviors are obtained by combining the responses to the Behavioral Risk Factor Surveillance System (BRFSS) and the National Health Interview Survey (NHIS).

  9. Estimating Worker Risk Levels Using Accident/Incident Data

    SciTech Connect

    Kenoyer, Judson L.; Stenner, Robert D.; Andrews, William B.; Scherpelz, Robert I.; Aaberg, Rosanne L.

    2000-09-26

    The purpose of the work described in this report was to identify methods that are currently being used in the Department of Energy (DOE) complex to identify and control hazards/risks in the workplace, evaluate them in terms of their effectiveness in reducing risk to the workers, and to develop a preliminary method that could be used to predict the relative risks to workers performing proposed tasks using some of the current methodology. This report describes some of the performance indicators (i.e., safety metrics) that are currently being used to track relative levels of workplace safety in the DOE complex, how these fit into an Integrated Safety Management (ISM) system, some strengths and weaknesses of using a statistically based set of indicators, and methods to evaluate them. Also discussed are methods used to reduce risk to the workers and some of the techniques that appear to be working in the process of establishing a condition of continuous improvement. The results of these methods will be used in future work involved with the determination of modifying factors for a more complex model. The preliminary method to predict the relative risk level to workers during an extended future time period is based on a currently used performance indicator that uses several factors tracked in the CAIRS. The relative risks for workers in a sample (but real) facility on the Hanford site are estimated for a time period of twenty years and are based on workforce predictions. This is the first step in developing a more complex model that will incorporate other modifying factors related to the workers, work environment and status of the ISM system to adjust the preliminary prediction.

  10. Estimation of Hail Risk in the UK and Europe

    NASA Astrophysics Data System (ADS)

    Robinson, Eric; Parker, Melanie; Higgs, Stephanie

    2016-04-01

    Observations of hail events in Europe, and in the UK especially, are relatively limited. In order to determine hail risk it is therefore necessary to use information beyond the historical record alone. One such methodology is to leverage reanalysis data, in this case ERA-Interim, along with a numerical model (WRF) to recreate the past state of the atmosphere. Relevant atmospheric properties can be extracted and used in a regression model to determine the hail probability for each day contained within the reanalyses. The results presented here are based on a regression model using convective available potential energy, deep-layer shear and weather type. Combined, these parameters represent the probability of severe thunderstorm, and in turn hail, activity. Once the probability of hail occurring on each day is determined, this can be used as the basis of a stochastic catalogue for the estimation of hail risk.
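
    A minimal sketch of such a daily hail-probability regression on synthetic predictors (CAPE, deep-layer shear and a categorical weather type); the coefficients and data are illustrative, not those of the study.

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(4)
      n_days = 2000
      cape = rng.gamma(shape=2.0, scale=400.0, size=n_days)      # J/kg, synthetic
      shear = rng.gamma(shape=2.0, scale=6.0, size=n_days)       # m/s, synthetic deep-layer shear
      weather_type = rng.integers(0, 5, size=n_days)             # synthetic circulation type

      logit = -6.0 + 0.002 * cape + 0.1 * shear                  # illustrative "true" relationship
      hail_day = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

      wt_dummies = (weather_type[:, None] == np.arange(5)).astype(float)  # simple one-hot encoding
      X = np.column_stack([cape, shear, wt_dummies])
      model = LogisticRegression(max_iter=1000).fit(X, hail_day)
      daily_hail_probability = model.predict_proba(X)[:, 1]      # basis for a stochastic hail catalogue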

  11. Global Building Inventory for Earthquake Loss Estimation and Risk Management

    USGS Publications Warehouse

    Jaiswal, Kishor; Wald, David; Porter, Keith

    2010-01-01

    We develop a global database of building inventories using a taxonomy of global building types for use in near-real-time post-earthquake loss estimation and pre-earthquake risk analysis, for the U.S. Geological Survey’s Prompt Assessment of Global Earthquakes for Response (PAGER) program. The database is available for public use, subject to peer review, scrutiny, and open enhancement. On a country-by-country level, it contains estimates of the distribution of building types categorized by material, lateral force-resisting system, and occupancy type (residential or nonresidential, urban or rural). The database draws on and harmonizes numerous sources: (1) UN statistics, (2) UN Habitat’s demographic and health survey (DHS) database, (3) national housing censuses, (4) the World Housing Encyclopedia and (5) other literature.

  12. Estimation of risks associated with paediatric cochlear implantation.

    PubMed

    Johnston, J Cyne; Smith, Andrée Durieux; Fitzpatrick, Elizabeth; O'Connor, Annette; Angus, Douglas; Benzies, Karen; Schramm, David

    2010-09-01

    The objectives of this study were to estimate the rates of complications associated with paediatric cochlear implantation use: a) at one Canadian cochlear implant (CI) centre, and b) in the published literature. It comprised a retrospective hospital-based chart review and a concurrent review of complications in the published literature. There were 224 children who had undergone surgery from 1994 to June 2007. Results indicate that the rates of complications at the local Canadian paediatric CI centre are not significantly different from the literature rates for all examined complication types. This hospital-based retrospective chart review and review of the literature provide readers with an estimation of the risks to aid in evidence-based decision-making surrounding paediatric cochlear implantation.

  13. 41 CFR 102-80.50 - Are Federal agencies responsible for identifying/estimating risks and for appropriate risk...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Environmental Management Risks and Risk Reduction Strategies § 102-80.50 Are Federal agencies responsible for... identify and estimate safety and environmental management risks and appropriate risk reduction strategies... environmental management risks and report or correct the situation, as appropriate. Federal agencies must...

  14. Quaternion-based unscented Kalman filter for accurate indoor heading estimation using wearable multi-sensor system.

    PubMed

    Yuan, Xuebing; Yu, Shuai; Zhang, Shengzhi; Wang, Guoping; Liu, Sheng

    2015-01-01

    Inertial navigation based on micro-electromechanical system (MEMS) inertial measurement units (IMUs) has attracted numerous researchers due to its high reliability and independence. Heading estimation, as one of the most important parts of inertial navigation, has been a research focus in this field. Heading estimation using magnetometers is perturbed by magnetic disturbances, such as indoor concrete structures and electronic equipment. The MEMS gyroscope is also used for heading estimation; however, its accuracy degrades over time. In this paper, a wearable multi-sensor system has been designed to obtain high-accuracy indoor heading estimation, based on a quaternion-based unscented Kalman filter (UKF) algorithm. The proposed multi-sensor system, which includes one three-axis accelerometer, three single-axis gyroscopes, one three-axis magnetometer and one microprocessor, minimizes size and cost. The wearable multi-sensor system was fixed on the waist of a pedestrian and on a quadrotor unmanned aerial vehicle (UAV) for heading estimation experiments in our college building. The results show that the mean heading estimation errors are less than 10° and 5° for the multi-sensor system fixed on the waist of the pedestrian and on the quadrotor UAV, respectively, compared to the reference path. PMID:25961384
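
    For context, the heading itself is simply the yaw angle of the orientation quaternion that a filter such as the UKF maintains; the small helper below shows the standard ZYX (aerospace) extraction and is not the authors' filter code.

      import numpy as np

      def heading_from_quaternion(qw, qx, qy, qz):
          """Yaw (heading) in degrees from a unit quaternion, ZYX convention."""
          yaw = np.arctan2(2.0 * (qw * qz + qx * qy), 1.0 - 2.0 * (qy**2 + qz**2))
          return np.degrees(yaw)

      # Example: quaternion for a 30 degree rotation about the vertical axis
      half = np.radians(30.0) / 2.0
      print(heading_from_quaternion(np.cos(half), 0.0, 0.0, np.sin(half)))  # ~30.0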

  15. Quaternion-Based Unscented Kalman Filter for Accurate Indoor Heading Estimation Using Wearable Multi-Sensor System

    PubMed Central

    Yuan, Xuebing; Yu, Shuai; Zhang, Shengzhi; Wang, Guoping; Liu, Sheng

    2015-01-01

    Inertial navigation based on micro-electromechanical system (MEMS) inertial measurement units (IMUs) has attracted numerous researchers due to its high reliability and independence. Heading estimation, as one of the most important parts of inertial navigation, has been a research focus in this field. Heading estimation using magnetometers is perturbed by magnetic disturbances, such as indoor concrete structures and electronic equipment. The MEMS gyroscope is also used for heading estimation; however, its accuracy degrades over time. In this paper, a wearable multi-sensor system has been designed to obtain high-accuracy indoor heading estimation, based on a quaternion-based unscented Kalman filter (UKF) algorithm. The proposed multi-sensor system, which includes one three-axis accelerometer, three single-axis gyroscopes, one three-axis magnetometer and one microprocessor, minimizes size and cost. The wearable multi-sensor system was fixed on the waist of a pedestrian and on a quadrotor unmanned aerial vehicle (UAV) for heading estimation experiments in our college building. The results show that the mean heading estimation errors are less than 10° and 5° for the multi-sensor system fixed on the waist of the pedestrian and on the quadrotor UAV, respectively, compared to the reference path. PMID:25961384

  16. Quaternion-based unscented Kalman filter for accurate indoor heading estimation using wearable multi-sensor system.

    PubMed

    Yuan, Xuebing; Yu, Shuai; Zhang, Shengzhi; Wang, Guoping; Liu, Sheng

    2015-05-07

    Inertial navigation based on micro-electromechanical system (MEMS) inertial measurement units (IMUs) has attracted numerous researchers due to its high reliability and independence. Heading estimation, as one of the most important parts of inertial navigation, has been a research focus in this field. Heading estimation using magnetometers is perturbed by magnetic disturbances, such as indoor concrete structures and electronic equipment. The MEMS gyroscope is also used for heading estimation; however, its accuracy degrades over time. In this paper, a wearable multi-sensor system has been designed to obtain high-accuracy indoor heading estimation, based on a quaternion-based unscented Kalman filter (UKF) algorithm. The proposed multi-sensor system, which includes one three-axis accelerometer, three single-axis gyroscopes, one three-axis magnetometer and one microprocessor, minimizes size and cost. The wearable multi-sensor system was fixed on the waist of a pedestrian and on a quadrotor unmanned aerial vehicle (UAV) for heading estimation experiments in our college building. The results show that the mean heading estimation errors are less than 10° and 5° for the multi-sensor system fixed on the waist of the pedestrian and on the quadrotor UAV, respectively, compared to the reference path.

  17. Developing accurate survey methods for estimating population sizes and trends of the critically endangered Nihoa Millerbird and Nihoa Finch.

    USGS Publications Warehouse

    Gorresen, P. Marcos; Camp, Richard J.; Brinck, Kevin W.; Farmer, Chris

    2012-01-01

    Point-transect surveys indicated that millerbirds were more abundant than shown by the strip-transect method, and were estimated at 802 birds in 2010 (95%CI = 652 – 964) and 704 birds in 2011 (95%CI = 579 – 837). Point-transect surveys yielded population estimates with improved precision which will permit trends to be detected in shorter time periods and with greater statistical power than is available from strip-transect survey methods. Mean finch population estimates and associated uncertainty were not markedly different among the three survey methods, but the performance of models used to estimate density and population size are expected to improve as the data from additional surveys are incorporated. Using the point-transect survey, the mean finch population size was estimated at 2,917 birds in 2010 (95%CI = 2,037 – 3,965) and 2,461 birds in 2011 (95%CI = 1,682 – 3,348). Preliminary testing of the line-transect method in 2011 showed that it would not generate sufficient detections to effectively model bird density, and consequently, relatively precise population size estimates. Both species were fairly evenly distributed across Nihoa and appear to occur in all or nearly all available habitat. The time expended and area traversed by observers was similar among survey methods; however, point-transect surveys do not require that observers walk a straight transect line, thereby allowing them to avoid culturally or biologically sensitive areas and minimize the adverse effects of recurrent travel to any particular area. In general, point-transect surveys detect more birds than strip-survey methods, thereby improving precision and resulting population size and trend estimation. The method is also better suited for the steep and uneven terrain of Nihoa.

  18. Survivorship models for estimating the risk of decompression sickness.

    PubMed

    Kumar, K V; Powell, M R

    1994-07-01

    Several approaches have been used for modeling the incidence of decompression sickness (DCS), such as Hill's dose-response and logistic regression. Most of these methods do not include the time-to-onset information in the model. Survival analysis (failure time analysis) is appropriate when the time to onset of an event is of interest. The applicability of survival analysis for modeling the risk of DCS is illustrated by using data obtained from hypobaric chamber exposures simulating extravehicular activities (n = 426). Univariate analyses of incidence-free survival proportions were obtained for Doppler-detectable circulating microbubbles (CMB), symptoms of DCS and test aborts. A log-linear failure time regression model with the 360-min half-time tissue ratio (TR) as covariate was constructed, and estimated probabilities for various TR values were calculated. Further regression analysis including CMB status in the model showed significant improvement (p < 0.05) in the estimation of DCS over the previous model. Since DCS is dependent on the exposure pressure as well as the duration of exposure, we recommend the use of survival analysis for modeling the risk of DCS. PMID:7945136
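
    A hedged sketch of a parametric failure-time regression with the tissue ratio (TR) as covariate, using a Weibull accelerated failure-time model from the lifelines library as a stand-in for the paper's log-linear failure time model; the data are synthetic.

      import numpy as np
      import pandas as pd
      from lifelines import WeibullAFTFitter

      rng = np.random.default_rng(5)
      n = 400
      TR = rng.uniform(1.2, 2.0, size=n)                         # synthetic 360-min half-time tissue ratios
      latent_onset = 300.0 * rng.weibull(1.5, size=n) * np.exp(-1.5 * (TR - 1.2))  # minutes, synthetic
      dcs = (latent_onset < 240.0).astype(int)                   # exposure (and observation) ends at 240 min
      time = np.minimum(latent_onset, 240.0)

      df = pd.DataFrame({"time": time, "dcs": dcs, "TR": TR})
      aft = WeibullAFTFitter().fit(df, duration_col="time", event_col="dcs")
      print(aft.predict_median(pd.DataFrame({"TR": [1.4, 1.8]})))  # predicted median time-to-DCS by TR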

  19. Assignment of Calibration Information to Deeper Phylogenetic Nodes is More Effective in Obtaining Precise and Accurate Divergence Time Estimates.

    PubMed

    Mello, Beatriz; Schrago, Carlos G

    2014-01-01

    Divergence time estimation has become an essential tool for understanding macroevolutionary events. Molecular dating aims to obtain reliable inferences, which, within a statistical framework, means jointly increasing the accuracy and precision of estimates. Bayesian dating methods exhibit the property of a linear relationship between uncertainty and estimated divergence dates. This relationship occurs even if the number of sites approaches infinity and places a limit on the maximum precision of node ages. However, how the placement of calibration information may affect the precision of divergence time estimates remains an open question. In this study, relying on simulated and empirical data, we investigated how the location of calibration within a phylogeny affects the accuracy and precision of time estimates. We found that calibration priors set at median and deep phylogenetic nodes were associated with higher precision values compared to analyses involving calibration at the shallowest node. The results were independent of the tree symmetry. An empirical mammalian dataset produced results that were consistent with those generated by the simulated sequences. Assigning time information to the deeper nodes of a tree is crucial to guarantee the accuracy and precision of divergence times. This finding highlights the importance of the appropriate choice of outgroups in molecular dating. PMID:24855333

  20. Health risks in wastewater irrigation: comparing estimates from quantitative microbial risk analyses and epidemiological studies.

    PubMed

    Mara, D D; Sleigh, P A; Blumenthal, U J; Carr, R M

    2007-03-01

    The combination of standard quantitative microbial risk analysis (QMRA) techniques and 10,000-trial Monte Carlo risk simulations was used to estimate the human health risks associated with the use of wastewater for unrestricted and restricted crop irrigation. A risk of rotavirus infection of 10^-2 per person per year (pppy) was used as the reference level of acceptable risk. Using the model scenario of involuntary soil ingestion for restricted irrigation, the risk of rotavirus infection is approximately 10^-2 pppy when the wastewater contains ≤10^6 Escherichia coli per 100 ml and when local agricultural practices are highly mechanised. For labour-intensive agriculture the risk of rotavirus infection is approximately 10^-2 pppy when the wastewater contains ≤10^5 E. coli per 100 ml; however, the wastewater quality should be ≤10^4 E. coli per 100 ml when children under 15 are exposed. With the model scenario of lettuce consumption for unrestricted irrigation, the use of wastewaters containing ≤10^4 E. coli per 100 ml results in a rotavirus infection risk of approximately 10^-2 pppy; however, again based on epidemiological evidence from Mexico, the current WHO guideline level of ≤1,000 E. coli per 100 ml should be retained for root crops eaten raw. PMID:17402278
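
    A minimal Monte Carlo sketch of the QMRA calculation, assuming a lognormal per-event rotavirus dose (placeholder parameters) and the beta-Poisson dose-response parameters commonly cited for rotavirus; none of these inputs are taken from the paper.

      import numpy as np

      rng = np.random.default_rng(6)
      n_persons = 10_000

      # Placeholder exposure model: rotavirus organisms ingested per exposure event
      dose = rng.lognormal(mean=np.log(0.01), sigma=1.0, size=n_persons)
      events_per_year = 150

      # Approximate beta-Poisson dose-response (alpha and N50 commonly cited for rotavirus)
      alpha, n50 = 0.253, 6.17
      p_single = 1.0 - (1.0 + dose * (2.0 ** (1.0 / alpha) - 1.0) / n50) ** (-alpha)
      p_annual = 1.0 - (1.0 - p_single) ** events_per_year

      print(f"Median annual infection risk:  {np.median(p_annual):.1e}")
      print(f"95th percentile annual risk:   {np.quantile(p_annual, 0.95):.1e}")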

  1. Time-to-Compromise Model for Cyber Risk Reduction Estimation

    SciTech Connect

    Miles A. McQueen; Wayne F. Boyer; Mark A. Flynn; George A. Beitel

    2005-09-01

    We propose a new model for estimating the time to compromise a system component that is visible to an attacker. The model provides an estimate of the expected value of the time-to-compromise as a function of known and visible vulnerabilities, and attacker skill level. The time-to-compromise random process model is a composite of three subprocesses associated with attacker actions aimed at the exploitation of vulnerabilities. In a case study, the model was used to aid in a risk reduction estimate between a baseline Supervisory Control and Data Acquisition (SCADA) system and the baseline system enhanced through a specific set of control system security remedial actions. For our case study, the total number of system vulnerabilities was reduced by 86% but the dominant attack path was through a component where the number of vulnerabilities was reduced by only 42% and the time-to-compromise of that component was increased by only 13% to 30% depending on attacker skill level.

  2. How Accurate Are German Work-Time Data? A Comparison of Time-Diary Reports and Stylized Estimates

    ERIC Educational Resources Information Center

    Otterbach, Steffen; Sousa-Poza, Alfonso

    2010-01-01

    This study compares work time data collected by the German Time Use Survey (GTUS) using the diary method with stylized work time estimates from the GTUS, the German Socio-Economic Panel, and the German Microcensus. Although on average the differences between the time-diary data and the interview data are not large, our results show that significant…

  3. Prospect theory based estimation of drivers' risk attitudes in route choice behaviors.

    PubMed

    Zhou, Lizhen; Zhong, Shiquan; Ma, Shoufeng; Jia, Ning

    2014-12-01

    This paper applied prospect theory (PT) to describe drivers' route choice behavior under Variable Message Signs (VMS), which present visual traffic information to assist drivers in making route choice decisions. Rich empirical data from questionnaires and field observations were used to estimate the parameters of PT. To make the parameters better reflect drivers' attitudes, drivers were classified into different types according to the significant factors influencing their behavior. Based on the travel time distribution of the alternative routes and the route choice results from the questionnaire, the parameterized value function of each category was estimated, representing drivers' risk attitudes and choice characteristics. The empirical verification showed that the estimates were acceptable and effective. The results showed that drivers' risk attitudes and route choice characteristics could be captured by PT under real-time information shown on VMS. For practical application, once drivers' route choice characteristics and parameters are identified, their route choice behavior under different road conditions can be predicted accurately, which forms the basis for formulating and implementing traffic guidance measures for targeted traffic management. Moreover, the heterogeneous risk attitudes among drivers should be considered when releasing traffic information and regulating traffic flow.
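
    For reference, the canonical prospect-theory value and probability-weighting functions (with Tversky and Kahneman's 1992 parameter estimates) look as follows; the paper estimates its own parameters for each driver class, so these numbers are only generic defaults.

      import numpy as np

      def pt_value(x, alpha=0.88, beta=0.88, lam=2.25):
          """Prospect-theory value function: concave for gains, convex and steeper for losses."""
          x = np.asarray(x, dtype=float)
          return np.where(x >= 0.0, x ** alpha, -lam * (-x) ** beta)

      def pt_weight(p, gamma=0.61):
          """Probability weighting function (gains): overweights small probabilities."""
          return p ** gamma / (p ** gamma + (1.0 - p) ** gamma) ** (1.0 / gamma)

      print(pt_value(5.0), pt_value(-5.0))   # a 5-min loss looms larger than a 5-min gain
      print(pt_weight(0.1))                  # a 10% chance is weighted as roughly 0.19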

  4. The number of alleles at a microsatellite defines the allele frequency spectrum and facilitates fast accurate estimation of theta.

    PubMed

    Haasl, Ryan J; Payseur, Bret A

    2010-12-01

    Theoretical work focused on microsatellite variation has produced a number of important results, including the expected distribution of repeat sizes and the expected squared difference in repeat size between two randomly selected samples. However, closed-form expressions for the sampling distribution and frequency spectrum of microsatellite variation have not been identified. Here, we use coalescent simulations of the stepwise mutation model to develop gamma and exponential approximations of the microsatellite allele frequency spectrum, a distribution central to the description of microsatellite variation across the genome. For both approximations, the parameter of biological relevance is the number of alleles at a locus, which we express as a function of θ, the population-scaled mutation rate, based on simulated data. Discovered relationships between θ, the number of alleles, and the frequency spectrum support the development of three new estimators of microsatellite θ. The three estimators exhibit roughly similar mean squared errors (MSEs) and all are biased. However, across a broad range of sample sizes and θ values, the MSEs of these estimators are frequently lower than all other estimators tested. The new estimators are also reasonably robust to mutation that includes step sizes greater than one. Finally, our approximation to the microsatellite allele frequency spectrum provides a null distribution of microsatellite variation. In this context, a preliminary analysis of the effects of demographic change on the frequency spectrum is performed. We suggest that simulations of the microsatellite frequency spectrum under evolutionary scenarios of interest may guide investigators to the use of relevant and sometimes novel summary statistics.

  5. Declining bioavailability and inappropriate estimation of risk of persistent compounds

    SciTech Connect

    Kelsey, J.W.; Alexander, M.

    1997-03-01

    Earthworms (Eisenia foetida) assimilated decreasing amounts of atrazine, phenanthrene, and naphthalene that had been incubated for increasing periods of time in sterile soil. The amount of atrazine and phenanthrene removed from soil by mild extractants also decreased with time. The declines in bioavailability of the three compounds to earthworms and of naphthalene to bacteria were not reflected by analysis involving vigorous methods of solvent extraction; similar results for bioavailability of phenanthrene and 4-nitrophenol to bacteria were obtained in a previous study conducted at this laboratory. The authors suggest that regulations based on vigorous extractions for the analyses of persistent organic pollutants in soil do not appropriately estimate exposure or risk to susceptible populations.

  6. How do we measure dose and estimate risk?

    NASA Astrophysics Data System (ADS)

    Hoeschen, Christoph; Regulla, Dieter; Schlattl, Helmut; Petoussi-Henss, Nina; Li, Wei Bo; Zankl, Maria

    2011-03-01

    Radiation exposure due to medical imaging is a topic of emerging importance. In Europe this topic has been addressed for a long time, and in other countries it is becoming increasingly important and has attracted public interest in recent years. This is mainly because the average dose per person in developed countries is increasing rapidly as three-dimensional imaging becomes more widely available and useful for diagnosis. This paper introduces the most common dose quantities used to characterize medical radiation exposure, discusses the usual ways of determining these quantities, and considers how these values are linked to radiation risk estimation. For this last aspect, the paper refers to the linear no-threshold theory for an imaging application.

  7. 41 CFR 102-80.50 - Are Federal agencies responsible for identifying/estimating risks and for appropriate risk...

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... (Continued) FEDERAL MANAGEMENT REGULATION REAL PROPERTY 80-SAFETY AND ENVIRONMENTAL MANAGEMENT Safety and Environmental Management Risks and Risk Reduction Strategies § 102-80.50 Are Federal agencies responsible for... identify and estimate safety and environmental management risks and appropriate risk reduction...

  8. 41 CFR 102-80.50 - Are Federal agencies responsible for identifying/estimating risks and for appropriate risk...

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... (Continued) FEDERAL MANAGEMENT REGULATION REAL PROPERTY 80-SAFETY AND ENVIRONMENTAL MANAGEMENT Safety and Environmental Management Risks and Risk Reduction Strategies § 102-80.50 Are Federal agencies responsible for... identify and estimate safety and environmental management risks and appropriate risk reduction...

  9. 41 CFR 102-80.50 - Are Federal agencies responsible for identifying/estimating risks and for appropriate risk...

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... (Continued) FEDERAL MANAGEMENT REGULATION REAL PROPERTY 80-SAFETY AND ENVIRONMENTAL MANAGEMENT Safety and Environmental Management Risks and Risk Reduction Strategies § 102-80.50 Are Federal agencies responsible for... identify and estimate safety and environmental management risks and appropriate risk reduction...

  10. 41 CFR 102-80.50 - Are Federal agencies responsible for identifying/estimating risks and for appropriate risk...

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... (Continued) FEDERAL MANAGEMENT REGULATION REAL PROPERTY 80-SAFETY AND ENVIRONMENTAL MANAGEMENT Safety and Environmental Management Risks and Risk Reduction Strategies § 102-80.50 Are Federal agencies responsible for... identify and estimate safety and environmental management risks and appropriate risk reduction...

  11. An assessment of ecological and case-control methods for estimating lung cancer risk due to indoor radon

    SciTech Connect

    Stidley, C.A.; Samet, J.M.

    1992-12-31

    Studies of underground miners indicate that indoor radon is an important cause of lung cancer. This finding has raised concern that exposure to radon also causes lung cancer in the general population. Epidemiological studies, including both case-control and ecological approaches, have directly addressed the risks of indoor residential radon; many more case-control studies are in progress. Ecological studies that associate lung-cancer rates with typical indoor radon levels in various geographic areas have not consistently shown positive associations. The results of purportedly negative ecological studies have been used as a basis for questioning the hazards of indoor radon exposure. Because of potentially serious methodologic flaws for testing hypotheses, we examined the ecological method as a tool for assessing lung-cancer risk from indoor radon exposure. We developed a simulation approach that utilizes the Environmental Protection Agency (EPA) radon survey data to assign exposures to individuals within counties. Using the computer-generated data, we compared risk estimates obtained by ecological regression methods with those obtained from other regression methods and with the "true" risks used to generate the data. For many of these simulations, the ecological models, while fitting the summary data well, gave risk estimates that differed considerably from the true risks. For some models, the risk estimates were negatively correlated with exposure, although the assumed relationship was positive. Attempts to improve the ecological models by adding smoking variables, including interaction terms, did not always improve the estimates of risk, which are easily affected by model misspecification. Because exposure situations used in the simulations are realistic, our results show that ecological methods may not accurately estimate the lung-cancer risk associated with indoor radon exposure.

  12. Estimation of Tsunami Risk for the Caribbean Coast

    NASA Astrophysics Data System (ADS)

    Zahibo, N.

    2004-05-01

    The tsunami problem for the coast of the Caribbean basin is discussed. The historical data on tsunamis in the Caribbean Sea are briefly presented. Numerical simulation of potential tsunamis in the Caribbean Sea is performed in the framework of nonlinear shallow-water theory. The tsunami wave height distribution along the Caribbean coast is computed. These results are used to estimate the far-field tsunami potential of various coastal locations in the Caribbean Sea. Five zones with low tsunami risk are selected based on prognostic computations: the bay "Golfo de Batabano" and the coast of the province "Ciego de Avila" in Cuba, the Nicaraguan coast (between Bluefields and Puerto Cabezas), the border between Mexico and Belize, and the bay "Golfo de Venezuela" in Venezuela. The analysis of historical data confirms that there have been no tsunamis in the selected zones. The wave attenuation in the Caribbean Sea is also investigated; wave amplitude decreases by an order of magnitude when the tsunami source is located up to 1000 km from the coastal location. Both factors, wave attenuation and wave height distribution, should be taken into account in the planned warning system for the Caribbean Sea. In particular, the problem of tsunami risk for the Lesser Antilles, including Guadeloupe, is discussed.

  13. How accurate and precise are limited sampling strategies in estimating exposure to mycophenolic acid in people with autoimmune disease?

    PubMed

    Abd Rahman, Azrin N; Tett, Susan E; Staatz, Christine E

    2014-03-01

    Mycophenolic acid (MPA) is a potent immunosuppressant agent, which is increasingly being used in the treatment of patients with various autoimmune diseases. Dosing to achieve a specific target MPA area under the concentration-time curve from 0 to 12 h post-dose (AUC12) is likely to lead to better treatment outcomes in patients with autoimmune disease than a standard fixed-dose strategy. This review summarizes the available published data around concentration monitoring strategies for MPA in patients with autoimmune disease and examines the accuracy and precision of methods reported to date using limited concentration-time points to estimate MPA AUC12. A total of 13 studies were identified that assessed the correlation between single time points and MPA AUC12 and/or examined the predictive performance of limited sampling strategies in estimating MPA AUC12. The majority of studies investigated mycophenolate mofetil (MMF) rather than the enteric-coated mycophenolate sodium (EC-MPS) formulation of MPA. Correlations between MPA trough concentrations and MPA AUC12 estimated by full concentration-time profiling ranged from 0.13 to 0.94 across ten studies, with the highest associations (r^2 = 0.90-0.94) observed in lupus nephritis patients. Correlations were generally higher in autoimmune disease patients compared with renal allograft recipients and higher after MMF compared with EC-MPS intake. Four studies investigated use of a limited sampling strategy to predict MPA AUC12 determined by full concentration-time profiling. Three studies used a limited sampling strategy consisting of a maximum combination of three sampling time points with the latest sample drawn 3-6 h after MMF intake, whereas the remaining study tested all combinations of sampling times. MPA AUC12 was best predicted when three samples were taken at pre-dose and at 1 and 3 h post-dose with a mean bias and imprecision of 0.8 and 22.6 % for multiple linear regression analysis and of -5.5 and 23.0 % for

  14. Soil-ecological risks for soil degradation estimation

    NASA Astrophysics Data System (ADS)

    Trifonova, Tatiana; Shirkin, Leonid; Kust, German; Andreeva, Olga

    2016-04-01

    Soil degradation includes the worsening of soil properties and quality, primarily in terms of productivity and the quality of ecosystem services. Complete destruction of the soil cover and/or termination of the functioning of soil organic life are considered extreme stages of soil degradation, and for fragile ecosystems they are normally treated within the desertification, land degradation and drought (DLDD) framework. A block model of the ecotoxic effects that generate soil and ecosystem degradation has been developed from long-term field and laboratory research on sod-podzolic soils contaminated with waste containing heavy metals. The model highlights soil degradation mechanisms caused by the direct and indirect impact of ecotoxicants on the "phytocenosis-soil" system and their combination, which frequently produces a synergistic effect. The sequence of changes can be formalized as a theory of change (a succession of interrelated events). Several stages are distinguished, from the leaching (release) of heavy metals from waste and their migration down the soil profile to decreased phytoproductivity and changes in phytocenosis composition. Decreased phytoproductivity reduces the amount of cellulose introduced into the soil. The described feedback mechanism acts as a factor of sod-podzolic soil self-purification and stability. It has been shown that, using a phytomass productivity index that integrally reflects the worsening of the soil property complex, it is possible to construct dose-response relationships and to determine critical load levels for the phytocenosis and the corresponding soil-ecological risks. Soil-ecological risk in the "phytocenosis-soil" system means probable negative changes and the loss of some ecosystem functions during the transformation of the energy of dead organic matter into new biomass. Soil-ecological risk estimation is

  15. Accurate spike estimation from noisy calcium signals for ultrafast three-dimensional imaging of large neuronal populations in vivo

    PubMed Central

    Deneux, Thomas; Kaszas, Attila; Szalay, Gergely; Katona, Gergely; Lakner, Tamás; Grinvald, Amiram; Rózsa, Balázs; Vanzetta, Ivo

    2016-01-01

    Extracting neuronal spiking activity from large-scale two-photon recordings remains challenging, especially in mammals in vivo, where large noises often contaminate the signals. We propose a method, MLspike, which returns the most likely spike train underlying the measured calcium fluorescence. It relies on a physiological model including baseline fluctuations and distinct nonlinearities for synthetic and genetically encoded indicators. Model parameters can be either provided by the user or estimated from the data themselves. MLspike is computationally efficient thanks to its original discretization of probability representations; moreover, it can also return spike probabilities or samples. Benchmarked on extensive simulations and real data from seven different preparations, it outperformed state-of-the-art algorithms. Combined with the finding obtained from systematic data investigation (noise level, spiking rate and so on) that photonic noise is not necessarily the main limiting factor, our method allows spike extraction from large-scale recordings, as demonstrated on acousto-optical three-dimensional recordings of over 1,000 neurons in vivo. PMID:27432255

  16. RADON EXPOSURE ASSESSMENT AND DOSIMETRY APPLIED TO EPIDEMIOLOGY AND RISK ESTIMATION

    EPA Science Inventory

    Epidemiological studies of underground miners provide the primary basis for radon risk estimates for indoor exposures as well as mine exposures. A major source of uncertainty in these risk estimates is the uncertainty in radon progeny exposure estimates for the miners. In addit...

  17. Shorter sampling periods and accurate estimates of milk volume and components are possible for pasture based dairy herds milked with automated milking systems.

    PubMed

    Kamphuis, Claudia; Burke, Jennie K; Taukiri, Sarah; Petch, Susan-Fay; Turner, Sally-Anne

    2016-08-01

    Dairy cows grazing pasture and milked using automated milking systems (AMS) have lower milking frequencies than indoor fed cows milked using AMS. Therefore, milk recording intervals used for herd testing indoor fed cows may not be suitable for cows on pasture based farms. We hypothesised that accurate standardised 24 h estimates could be determined for AMS herds with milk recording intervals of less than the Gold Standard (48 h), but that the optimum milk recording interval would depend on the herd average for milking frequency. The Gold Standard protocol was applied on five commercial dairy farms with AMS, between December 2011 and February 2013. From 12 milk recording test periods, involving 2211 cow-test days and 8049 cow milkings, standardised 24 h estimates for milk volume and milk composition were calculated for the Gold Standard protocol and compared with those collected during nine alternative sampling scenarios, including six shorter sampling periods and three in which a fixed number of milk samples per cow were collected. Results indicate that a 48 h milk recording protocol is unnecessarily long for collecting accurate estimates during milk recording on pasture based AMS farms. Collection of only two milk samples per cow was optimal in terms of high concordance correlation coefficients for milk volume and components and a low proportion of missed cow-test days. Further research is required to determine the effects of diurnal variations in milk composition on standardised 24 h estimates for milk volume and components, before a protocol based on a fixed number of samples could be considered. Based on the results of this study, New Zealand has adopted a split protocol for herd testing based on the average milking frequency for the herd (NZ Herd Test Standard 8100:2015). PMID:27600967

  18. Methodology for the Model-based Small Area Estimates of Cancer Risk Factors and Screening Behaviors - Small Area Estimates

    Cancer.gov

    This model-based approach uses data from both the Behavioral Risk Factor Surveillance System (BRFSS) and the National Health Interview Survey (NHIS) to produce estimates of the prevalence rates of cancer risk factors and screening behaviors at the state, health service area, and county levels.

  19. The role of models in estimating consequences as part of the risk assessment process.

    PubMed

    Forde-Folle, K; Mitchell, D; Zepeda, C

    2011-08-01

    The degree of disease risk represented by the introduction, spread, or establishment of one or several diseases through the importation of animals and animal products is assessed by importing countries through an analysis of risk. The components of a risk analysis include hazard identification, risk assessment, risk management, and risk communication. A risk assessment starts with identification of the hazard(s) and then continues with four interrelated steps: release assessment, exposure assessment, consequence assessment, and risk estimation. Risk assessments may be either qualitative or quantitative. This paper describes how, through the integration of epidemiological and economic models, the potential adverse biological and economic consequences of exposure can be quantified.

  20. State Estimates of Adolescent Cigarette Use and Perceptions of Risk of Smoking: 2012 and 2013

    MedlinePlus

    ... with an inverse association between use and risk perceptions (i.e., the prevalence of use is lower ...

  1. Prognostic models and risk scores: can we accurately predict postoperative nausea and vomiting in children after craniotomy?

    PubMed

    Neufeld, Susan M; Newburn-Cook, Christine V; Drummond, Jane E

    2008-10-01

    Postoperative nausea and vomiting (PONV) is a problem for many children after craniotomy. Prognostic models and risk scores help identify who is at risk for an adverse event such as PONV to help guide clinical care. The purpose of this article is to assess whether an existing prognostic model or risk score can predict PONV in children after craniotomy. The concepts of transportability, calibration, and discrimination are presented to identify what is required to have a valid tool for clinical use. Although previous work may inform clinical practice and guide future research, existing prognostic models and risk scores do not appear to be options for predicting PONV in children undergoing craniotomy. However, until risk factors are further delineated, followed by the development and validation of prognostic models and risk scores that include children after craniotomy, clinical judgment in the context of current research may serve as a guide for clinical care in this population. PMID:18939320

  2. Cancer risk estimation caused by radiation exposure during endovascular procedure

    NASA Astrophysics Data System (ADS)

    Kang, Y. H.; Cho, J. H.; Yun, W. S.; Park, K. H.; Kim, H. G.; Kwon, S. M.

    2014-05-01

    The objective of this study was to identify the radiation exposure dose of patients, as well as of staff, caused by fluoroscopy during C-arm-assisted vascular surgery, and to estimate the carcinogenic risk due to such exposure. The study was conducted in 71 patients (53 men and 18 women) who underwent vascular surgical intervention at the division of vascular surgery of the University Hospital from November 2011 to April 2012. A mobile C-arm device was used, and the radiation exposure dose of the patient (dose-area product, DAP) was calculated. The effective dose of staff participating in the surgery was measured by attaching optically stimulated luminescence dosimeters to their radiation protectors during the vascular surgical operation. The mean DAP value of the patients was 308.7 Gy cm2, and the maximum value was 3085 Gy cm2. When converted to effective dose, the resulting mean was 6.2 mSv and the maximum effective dose was 61.7 millisievert (mSv). The effective dose of the staff was 3.85 mSv, while that of the radiation technician was 1.04 mSv and that of the nurse was 1.31 mSv. The all-cancer incidence for the operator corresponds to 2355 per 100,000 persons, meaning that about 1 in 42 persons is likely to develop cancer. In conclusion, vascular surgeons, as supervisors of fluoroscopy, should keep radiation protection of the patient, the staff, and all participants in the intervention in mind, and should themselves understand the effects of radiation, in order to prevent invisible danger during the intervention and to minimize harm.
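
    A back-of-the-envelope check of the dose figures reported above: the mean values imply a DAP-to-effective-dose conversion of roughly 0.02 mSv per Gy cm2, which also reproduces the reported maximum. The conversion coefficient is inferred from the abstract, not an independently validated value.

      mean_dap = 308.7                 # Gy cm2, reported mean dose-area product
      mean_effective_dose = 6.2        # mSv, reported mean effective dose

      k = mean_effective_dose / mean_dap           # implied conversion coefficient, mSv per Gy cm2
      max_dap = 3085.0
      print(f"Implied conversion: {k:.3f} mSv per Gy cm2")
      print(f"Estimated maximum effective dose: {max_dap * k:.1f} mSv")  # ~62 mSv, close to the reported 61.7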

  3. Estimating Risk of Alcohol Dependence Using Alcohol Screening Scores*

    PubMed Central

    Rubinsky, Anna D.; Kivlahan, Daniel R.; Volk, Robert J.; Maynard, Charles; Bradley, Katharine A.

    2010-01-01

    Brief alcohol counseling interventions can reduce alcohol consumption and related morbidity among non-dependent risky drinkers, but more intensive alcohol treatment is recommended for persons with alcohol dependence. This study evaluated whether scores on common alcohol screening tests could identify patients likely to have current alcohol dependence so that more appropriate follow-up assessment and/or intervention could be offered. This cross-sectional study used secondary data from 392 male and 927 female adult family medicine outpatients (1993–1994). Likelihood ratios were used to empirically identify and evaluate ranges of scores of the AUDIT, the AUDIT-C, two single-item questions about frequency of binge drinking, and the CAGE questionnaire for detecting DSM-IV past-year alcohol dependence. Based on the prevalence of past-year alcohol dependence in this sample (men: 12.2%; women: 5.8%), zones of the AUDIT and AUDIT-C identified wide variability in the post-screening risk of alcohol dependence in men and women, even among those who screened positive for alcohol misuse. Among men, AUDIT zones 5–10, 11–14 and 15–40 were associated with post-screening probabilities of past-year alcohol dependence ranging from 18–87%, and AUDIT-C zones 5–6, 7–9 and 10–12 were associated with probabilities ranging from 22–75%. Among women, AUDIT zones 3–4, 5–8, 9–12 and 13–40 were associated with post-screening probabilities of past-year alcohol dependence ranging from 6–94%, and AUDIT-C zones 3, 4–6, 7–9 and 10–12 were associated with probabilities ranging from 9–88%. AUDIT or AUDIT-C scores could be used to estimate the probability of past-year alcohol dependence among patients who screen positive for alcohol misuse and inform clinical decision-making. PMID:20042299
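
    The post-screening probabilities in the abstract come from standard likelihood-ratio arithmetic; a small worked sketch is shown below, using the sample prevalences from the abstract and a hypothetical likelihood ratio of 10 for a high score zone.

      def post_test_probability(prevalence, likelihood_ratio):
          """Convert a pre-test probability to a post-test probability via odds."""
          pre_odds = prevalence / (1.0 - prevalence)
          post_odds = pre_odds * likelihood_ratio
          return post_odds / (1.0 + post_odds)

      # Prevalence of past-year alcohol dependence from the abstract; the LR of 10 is hypothetical.
      print(f"Men:   {post_test_probability(0.122, 10):.0%}")   # ~58%
      print(f"Women: {post_test_probability(0.058, 10):.0%}")   # ~38%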

  4. Subcutaneous nerve activity is more accurate than the heart rate variability in estimating cardiac sympathetic tone in ambulatory dogs with myocardial infarction

    PubMed Central

    Chan, Yi-Hsin; Tsai, Wei-Chung; Shen, Changyu; Han, Seongwook; Chen, Lan S.; Lin, Shien-Fong; Chen, Peng-Sheng

    2015-01-01

    Background We recently reported that subcutaneous nerve activity (SCNA) can be used to estimate sympathetic tone. Objectives To test the hypothesis that left thoracic SCNA is more accurate than heart rate variability (HRV) in estimating cardiac sympathetic tone in ambulatory dogs with myocardial infarction (MI). Methods We used an implanted radiotransmitter to study left stellate ganglion nerve activity (SGNA), vagal nerve activity (VNA), and thoracic SCNA in 9 dogs at baseline and up to 8 weeks after MI. HRV was determined by time-domain, frequency-domain and non-linear analyses. Results The correlation coefficients between integrated SGNA and SCNA averaged 0.74 (95% confidence interval (CI), 0.41–1.06) at baseline and 0.82 (95% CI, 0.63–1.01) after MI (P<.05 for both). The absolute values of these correlation coefficients were significantly larger than those between SGNA and the time-domain, frequency-domain and non-linear HRV measures, respectively, at baseline (P<.05 for all) and after MI (P<.05 for all). There was a clear increase in SGNA and SCNA at 2, 4, 6 and 8 weeks after MI, while HRV parameters showed no significant changes. Significant circadian variations were noted in SCNA, SGNA and all HRV parameters both at baseline and after MI. Atrial tachycardia (AT) episodes were invariably preceded by increases in SCNA and SGNA, which rose progressively over the 120, 90, 60 and 30 s preceding AT onset. No such changes in HRV parameters were observed before AT onset. Conclusion SCNA is more accurate than HRV in estimating cardiac sympathetic tone in ambulatory dogs with MI. PMID:25778433

  5. Estimation of Ten-Year Survival of Patients with Pulmonary Tuberculosis Based on the Competing Risks Model in Iran

    PubMed Central

    Kazempour-Dizaji, Mehdi; Tabarsi, Payam; Zayeri, Farid

    2016-01-01

    Background: Tuberculosis (TB) is a chronic bacterial disease which, despite the availability of effective drug strategies, remains a serious health problem worldwide. Estimation of the survival rate is an appropriate indicator of prognosis in patients with pulmonary TB. Therefore, this research was designed with the aim of accurately estimating the survival of patients by taking both death and relapse into consideration. Materials and Methods: Based on a retrospective cohort study, information on 2,299 patients with pulmonary TB who had been referred to and treated at Masih Daneshvari Hospital from 2005 to 2015 was reviewed. To estimate the survival of patients with pulmonary TB, a competing risks model, which treated death and relapse as competing events, was used. In addition, the effect of factors affecting the cumulative incidence function (CIF) of death and relapse was also examined. Results: The analysis of risk factors for the CIF of death and relapse showed that patients’ age, marital status, contact with TB patients, adverse drug effects, imprisonment and HIV positivity affected the CIF of death, while sex, marital status, imprisonment and HIV positivity affected the CIF of relapse (P <0.05). With death and relapse treated as competing events, survival in this group of patients in the first, third, fifth and tenth year after treatment was estimated at 39%, 14%, 7% and 0%, respectively. Conclusion: The use of a competing risks model in the survival analysis of patients with pulmonary TB, with competing events taken into account, enables more accurate estimation of survival. PMID:27403177
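
    For readers unfamiliar with the cumulative incidence function used above, the sketch below computes a simple nonparametric (Aalen-Johansen style) CIF for one event type in the presence of a competing event; the data are synthetic and the event coding is an assumption (0 = censored, 1 = death, 2 = relapse).

      # Sketch: nonparametric cumulative incidence function (CIF) for one event
      # type in the presence of a competing event. Synthetic data; event codes:
      # 0 = censored, 1 = death, 2 = relapse.
      import numpy as np

      def cumulative_incidence(times, events, event_of_interest, eval_times):
          times = np.asarray(times, dtype=float)
          events = np.asarray(events)
          order = np.argsort(times)
          times, events = times[order], events[order]

          n = len(times)
          at_risk = n
          surv = 1.0        # all-cause survival just before the current time
          cif = 0.0
          curve = []
          i = 0
          for t in eval_times:
              while i < n and times[i] <= t:
                  d_interest = np.sum((times == times[i]) & (events == event_of_interest))
                  d_any = np.sum((times == times[i]) & (events != 0))
                  cif += surv * d_interest / at_risk
                  surv *= 1.0 - d_any / at_risk
                  tied = int(np.sum(times == times[i]))
                  at_risk -= tied
                  i += tied
              curve.append(round(cif, 3))
          return curve

      t = [1, 2, 2, 3, 5, 6, 7, 9, 10, 10]
      e = [1, 2, 0, 1, 1, 0, 2, 1, 0, 1]
      print(cumulative_incidence(t, e, event_of_interest=1, eval_times=[3, 5, 10]))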

  6. State Estimates of Adolescent Marijuana Use and Perceptions of Risk of Harm from Marijuana Use: 2013 and 2014

    MedlinePlus

    ... 2014 estimates to 2012–2013 estimates). However, youth perceptions of great risk of harm from monthly marijuana ... change. State Estimates of Adolescent Marijuana Use and Perceptions of Risk of Harm From Marijuana Use: 2013 ...

  7. Indoor radon and lung cancer. Estimating the risks

    SciTech Connect

    Samet, J.M.

    1992-01-01

    Radon is ubiquitous in indoor environments. Epidemiologic studies of underground miners with exposure to radon and experimental evidence have established that radon causes lung cancer. The finding that this naturally occurring carcinogen is present in the air of homes and other buildings has raised concern about the lung cancer risk to the general population from radon. I review current approaches for assessing the risk of indoor radon, emphasizing the extrapolation of the risks for miners to the general population. Although uncertainties are inherent in this risk assessment, the present evidence warrants identifying homes that have unacceptably high concentrations.

  8. Indoor radon and lung cancer. Estimating the risks.

    PubMed Central

    Samet, J. M.

    1992-01-01

    Radon is ubiquitous in indoor environments. Epidemiologic studies of underground miners with exposure to radon and experimental evidence have established that radon causes lung cancer. The finding that this naturally occurring carcinogen is present in the air of homes and other buildings has raised concern about the lung cancer risk to the general population from radon. I review current approaches for assessing the risk of indoor radon, emphasizing the extrapolation of the risks for miners to the general population. Although uncertainties are inherent in this risk assessment, the present evidence warrants identifying homes that have unacceptably high concentrations. PMID:1734594

  9. A systematic approach for the accurate non-invasive estimation of blood glucose utilizing a novel light-tissue interaction adaptive modelling scheme

    NASA Astrophysics Data System (ADS)

    Rybynok, V. O.; Kyriacou, P. A.

    2007-10-01

    Diabetes is one of the biggest health challenges of the 21st century. The obesity epidemic, sedentary lifestyles and an ageing population mean prevalence of the condition is currently doubling every generation. Diabetes is associated with serious chronic ill health, disability and premature mortality. Long-term complications including heart disease, stroke, blindness, kidney disease and amputations make the greatest contribution to the costs of diabetes care. Many of these long-term effects could be avoided with earlier, more effective monitoring and treatment. Currently, blood glucose can only be monitored through the use of invasive techniques. To date there is no widely accepted and readily available non-invasive monitoring technique to measure blood glucose, despite many attempts. This paper addresses one of the most difficult non-invasive monitoring problems, that of blood glucose, and proposes a novel approach that enables accurate, calibration-free estimation of glucose concentration in blood. The approach is based on spectroscopic techniques and a new adaptive modelling scheme. The theoretical implementation and the effectiveness of the adaptive modelling scheme for this application are described, and a detailed mathematical evaluation is employed to show that such a scheme is capable of accurately extracting the concentration of glucose from a complex biological medium.

  10. Properties of alkali-metal atoms and alkaline-earth-metal ions for an accurate estimate of their long-range interactions

    NASA Astrophysics Data System (ADS)

    Kaur, Jasmeet; Nandy, D. K.; Arora, Bindiya; Sahoo, B. K.

    2015-01-01

    Accurate knowledge of the interaction potentials between alkali-metal atoms and alkaline-earth ions is very useful in studies of cold atom physics. Here we carry out a systematic theoretical study of the long-range interactions of the Li, Na, K, and Rb alkali-metal atoms with the Ca+, Ba+, Sr+, and Ra+ alkaline-earth ions, largely motivated by their importance in a number of applications. These interactions are expressed as a power series in the inverse of the internuclear separation R. Both the dispersion and induction components of these interactions are determined accurately from the algebraic coefficients corresponding to each power in the series. Ultimately, these coefficients are expressed in terms of the electric multipole polarizabilities of the above-mentioned systems, which are calculated using matrix elements obtained from a relativistic coupled-cluster method together with core contributions from the random-phase approximation. We also compare our estimated polarizabilities with other available theoretical and experimental results to verify the accuracy of our calculations. In addition, we evaluate the lifetimes of the first two low-lying states of the ions using the same matrix elements. Graphical representations of the dispersion coefficients versus R are given for all of the alkaline-earth ions interacting with Rb.
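
    For background, the long-range expansion referred to above has the standard atom-ion multipole form sketched below (in atomic units); this is generic textbook structure rather than an expression quoted from the paper, with alpha_d denoting the static dipole polarizability of the neutral atom.

      % Schematic long-range potential between a neutral atom and a singly charged ion
      % (atomic units). C_4 is the induction coefficient; C_6, C_8 are dispersion
      % coefficients of the kind tabulated in the work above.
      V(R) \simeq -\frac{C_4}{R^4} - \frac{C_6}{R^6} - \frac{C_8}{R^8} - \cdots,
      \qquad C_4 = \frac{\alpha_d}{2}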

  11. CCSI Risk Estimation: An Application of Expert Elicitation

    SciTech Connect

    Engel, David W.; Dalton, Angela C.

    2012-10-01

    The Carbon Capture Simulation Initiative (CCSI) is a multi-laboratory simulation-driven effort to develop carbon capture technologies with the goal of accelerating commercialization and adoption in the near future. One of the key CCSI technical challenges is representing and quantifying the inherent uncertainty and risks associated with developing, testing, and deploying the technology in simulated and real operational settings. To address this challenge, the CCSI Element 7 team developed a holistic risk analysis and decision-making framework. The purpose of this report is to document the CCSI Element 7 structured systematic expert elicitation to identify additional risk factors. We review the significance of and established approaches to expert elicitation, describe the CCSI risk elicitation plan and implementation strategies, and conclude by discussing the next steps and highlighting the contribution of risk elicitation toward the achievement of the overarching CCSI objectives.

  12. Estimation of radiation risk from screening mammography: Recent trends and comparison with expected benefits

    SciTech Connect

    Feig, S.A.; Ehrlich, S.M.

    1990-03-01

    On the basis of recent epidemiologic studies, the National Institutes of Health in 1985 provided a new estimate for radiation risk to the breast that employed a relative risk model and acknowledged greater dependence on age at exposure. Lifetime risks from a single mammogram may be calculated from this estimate and are lower than those based on the previous 1977 National Cancer Institute estimate. Possible years of life expectancy lost from annual mammography beginning at age 40 years may also be calculated and are negligible compared with estimates for years of life expectancy gained from such screening.

  13. Estimating the concordance probability in a survival analysis with a discrete number of risk groups.

    PubMed

    Heller, Glenn; Mo, Qianxing

    2016-04-01

    A clinical risk classification system is an important component of a treatment decision algorithm. A measure used to assess the strength of a risk classification system is discrimination, and when the outcome is survival time, the most commonly applied global measure of discrimination is the concordance probability. The concordance probability represents the pairwise probability of lower patient risk given longer survival time. The c-index and the concordance probability estimate have been used to estimate the concordance probability when patient-specific risk scores are continuous. In the current paper, the concordance probability estimate and an inverse probability censoring weighted c-index are modified to account for discrete risk scores. Simulations are generated to assess the finite sample properties of the concordance probability estimate and the weighted c-index. An application of these measures of discriminatory power to a metastatic prostate cancer risk classification system is examined.
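
    As a minimal illustration of the concordance probability discussed above, the sketch below computes a pairwise c-index for censored survival data with discrete risk scores; the data and the convention of counting tied scores as half-concordant are assumptions for illustration, not the modified estimators proposed in the paper.

      # Sketch: pairwise c-index for censored survival data with discrete risk
      # scores. A pair is usable when the shorter observed time ended in an event;
      # a higher risk score is expected to go with shorter survival. Tied scores
      # count as half-concordant (one common convention).
      from itertools import combinations

      def c_index(times, events, risk_scores):
          concordant, usable = 0.0, 0
          for i, j in combinations(range(len(times)), 2):
              a, b = (i, j) if times[i] < times[j] else (j, i)   # a has the shorter time
              if times[a] == times[b] or events[a] == 0:
                  continue   # not usable: tied times, or shorter time censored
              usable += 1
              if risk_scores[a] > risk_scores[b]:
                  concordant += 1.0
              elif risk_scores[a] == risk_scores[b]:
                  concordant += 0.5
          return concordant / usable

      t = [5, 8, 3, 12, 7, 2]
      e = [1, 0, 1, 1, 0, 1]    # 1 = event, 0 = censored
      r = [3, 1, 3, 1, 2, 2]    # discrete risk group (higher = riskier)
      print(round(c_index(t, e, r), 3))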

  14. NEED FOR INDIVIDUAL CANCER RISK ESTIMATES IN X-RAY AND NUCLEAR MEDICINE IMAGING.

    PubMed

    Mattsson, Sören

    2016-06-01

    To facilitate the justification of an X-ray or nuclear medicine investigation and for informing patients, it is desirable that the individual patient's radiation dose and potential cancer risk can be prospectively assessed and documented. The current dose-reporting is based on effective dose, which ignores body size and does not reflect the strong dependence of risk on the age at exposure. Risk estimations should better be done through individual organ dose assessments, which need careful exposure characterisation as well as anatomical description of the individual patient. In nuclear medicine, reference biokinetic models should also be replaced with models describing individual physiological states and biokinetics. There is a need to adjust population-based cancer risk estimates to the possible risk of leukaemia and solid tumours for the individual depending on age and gender. The article summarises reasons for individual cancer risk estimates and gives examples of methods and results of such estimates. PMID:26994092

  15. Estimation of the Disease Burden Attributable to 11 Risk Factors in Hubei Province, China: A Comparative Risk Assessment.

    PubMed

    Cui, Fangfang; Zhang, Lan; Yu, Chuanhua; Hu, Songbo; Zhang, Yunquan

    2016-01-01

    In order to estimate the health losses caused by common risk factors in the Hubei province, China, we calculated the deaths and disability-adjusted life years (DALYs) attributable to 11 risk factors. We estimated the exposure distributions of risk factors in Hubei Province in 2013 from the monitoring system on chronic disease and related risk factors, combined with relative risk (RR) in order to calculate the population attributable fraction. Deaths and DALYs attributed to the selected risk factors were then estimated together with cause-specific deaths and DALYs. In total, 53.39% of the total deaths and 36.23% of the total DALYs in Hubei were a result of the 11 selected risk factors. The top five risk factors were high blood pressure, smoking, high body mass index, diet low in fruits and alcohol use, accounting for 14.68%, 12.57%, 6.03%, 3.90% and 3.19% of total deaths, respectively, and 9.41%, 7.22%, 4.42%, 2.51% and 2.44% of total DALYs, respectively. These risk factors, especially high blood pressure, smoking and high body mass index, significantly influenced quality of life, causing a large number of deaths and DALYs. The burden of chronic disease could be substantially reduced if these risk factors were effectively controlled, which would allow people to enjoy healthier lives. PMID:27669279
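
    The attributable deaths and DALYs above rest on the population attributable fraction. A minimal sketch of that calculation (Levin's formula for a dichotomous risk factor) is given below; the prevalence, relative risk, and death count are illustrative placeholders, not the Hubei estimates.

      # Sketch: population attributable fraction (PAF) and attributable deaths for
      # a single dichotomous risk factor (Levin's formula). Illustrative inputs.
      def population_attributable_fraction(prevalence: float, relative_risk: float) -> float:
          excess = prevalence * (relative_risk - 1.0)
          return excess / (excess + 1.0)

      p_exposed = 0.25       # assumed prevalence of the risk factor
      rr = 2.5               # assumed relative risk of cause-specific death
      total_deaths = 100_000

      paf = population_attributable_fraction(p_exposed, rr)
      print(f"PAF = {paf:.1%}, attributable deaths ~ {paf * total_deaths:.0f}")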

  16. Estimation of the Disease Burden Attributable to 11 Risk Factors in Hubei Province, China: A Comparative Risk Assessment

    PubMed Central

    Cui, Fangfang; Zhang, Lan; Yu, Chuanhua; Hu, Songbo; Zhang, Yunquan

    2016-01-01

    In order to estimate the health losses caused by common risk factors in the Hubei province, China, we calculated the deaths and disability-adjusted life years (DALYs) attributable to 11 risk factors. We estimated the exposure distributions of risk factors in Hubei Province in 2013 from the monitoring system on chronic disease and related risk factors, combined with relative risk (RR) in order to calculate the population attributable fraction. Deaths and DALYs attributed to the selected risk factors were then estimated together with cause-specific deaths and DALYs. In total, 53.39% of the total deaths and 36.23% of the total DALYs in Hubei were a result of the 11 selected risk factors. The top five risk factors were high blood pressure, smoking, high body mass index, diet low in fruits and alcohol use, accounting for 14.68%, 12.57%, 6.03%, 3.90% and 3.19% of total deaths, respectively, and 9.41%, 7.22%, 4.42%, 2.51% and 2.44% of total DALYs, respectively. These risk factors, especially high blood pressure, smoking and high body mass index, significantly influenced quality of life, causing a large number of deaths and DALYs. The burden of chronic disease could be substantially reduced if these risk factors were effectively controlled, which would allow people to enjoy healthier lives. PMID:27669279

  17. The Impact of a Frailty Education Module on Surgical Resident Estimates of Lobectomy Risk

    PubMed Central

    Ferguson, Mark K.; Thompson, Katherine; Huisingh-Scheetz, Megan; Farnan, Jeanne; Hemmerich, Joshua; Acevedo, Julissa; Small, Stephen

    2015-01-01

    Background Frailty is a risk factor for adverse events after surgery. Residents’ ability to recognize frailty is underdeveloped. We assessed the influence of a frailty education module on surgical residents’ estimates of lobectomy risk. Methods Traditional track cardiothoracic surgery residents were randomized to take an on-line short course on frailty (experimental group) or to receive no training (control group). Residents read a clinical vignette, made an initial risk estimate of major complications for lobectomy, and rated clinical factors on their importance to their estimates. They viewed a video of a standardized patient portraying the patient in the vignette, randomly selected to exhibit either vigorous or frail behavior, and provided a final risk estimate. After rating 5 vignettes, they completed a test on their frailty knowledge. Results Forty-one residents participated (20 in the experimental group). Initial risk estimates were similar between the groups. The experimental group rated clinical factors as “very important” in their initial risk estimates more often than did the control group (47.6% vs 38.5%; p<0.001). Viewing videos resulted in a significant change from initial to final risk estimates (frail: 50±75% increase, p=0.008; vigorous: 14±32% decrease, p=0.043). The magnitude of change in risk estimates was greater for the experimental group (10.0±8.1 vs 5.1±7.7; p<0.001). The experimental group answered more frailty test questions correctly (93.7% vs 75.2%; p<0.001). Conclusions A frailty education module improved resident knowledge of frailty and influenced surgical risk estimates. Training in frailty may help educate residents in frailty recognition and surgical risk assessment. PMID:26004924

  18. Estimating the subjective risks of driving simulator accidents.

    PubMed

    Dixit, Vinayak; Harrison, Glenn W; Rutström, E Elisabet

    2014-01-01

    We examine the subjective risks of driving behavior using a controlled virtual reality experiment. Use of a driving simulator allows us to observe choices over risky alternatives that are presented to the individual in a naturalistic manner, with many of the cues one would find in the field, while retaining the type of controls one expects from a laboratory environment. The subject was tasked with making a left-hand turn across oncoming traffic, and the experimenter controlled the headways of the oncoming vehicles. Subjects were rewarded for making a successful turn and lost income if they crashed. The experimental design provided opportunities for subjects to develop subjective beliefs about when it would be safe to turn, and it also elicited their attitudes towards risk. A simple structural model explains behavior and shows evidence of heterogeneity in both the subjective beliefs that subjects formed and their risk attitudes. We find that subjective beliefs change with experience in the task and with the driver's skill. A significant difference was observed in the perceived probability of turning successfully between inexperienced drivers who did and did not crash, even though there was no significant difference in risk attitudes between the two groups. We use experimental economics to design controlled, incentive-compatible tasks that provide an opportunity to evaluate the impact on driver safety of subjects' subjective beliefs about when it would be safe to turn as well as their attitudes towards risk. This method could be used to help insurance companies determine risk premia associated with risk attitudes or beliefs about crashing, to better incentivize safe driving.

  19. Multicentre validation of the Geneva Risk Score for hospitalised medical patients at risk of venous thromboembolism. Explicit ASsessment of Thromboembolic RIsk and Prophylaxis for Medical PATients in SwitzErland (ESTIMATE).

    PubMed

    Nendaz, M; Spirk, D; Kucher, N; Aujesky, D; Hayoz, D; Beer, J H; Husmann, M; Frauchiger, B; Korte, W; Wuillemin, W A; Jäger, K; Righini, M; Bounameaux, H

    2014-03-01

    There is a need to validate risk assessment tools for hospitalised medical patients at risk of venous thromboembolism (VTE). We investigated whether a predefined cut-off of the Geneva Risk Score, as compared to the Padua Prediction Score, accurately distinguishes low-risk from high-risk patients regardless of the use of thromboprophylaxis. In the multicentre, prospective Explicit ASsessment of Thromboembolic RIsk and Prophylaxis for Medical PATients in SwitzErland (ESTIMATE) cohort study, 1,478 hospitalised medical patients were enrolled of whom 637 (43%) did not receive thromboprophylaxis. The primary endpoint was symptomatic VTE or VTE-related death at 90 days. The study is registered at ClinicalTrials.gov, number NCT01277536. According to the Geneva Risk Score, the cumulative rate of the primary endpoint was 3.2% (95% confidence interval [CI] 2.2-4.6%) in 962 high-risk vs 0.6% (95% CI 0.2-1.9%) in 516 low-risk patients (p=0.002); among patients without prophylaxis, this rate was 3.5% vs 0.8% (p=0.029), respectively. In comparison, the Padua Prediction Score yielded a cumulative rate of the primary endpoint of 3.5% (95% CI 2.3-5.3%) in 714 high-risk vs 1.1% (95% CI 0.6-2.3%) in 764 low-risk patients (p=0.002); among patients without prophylaxis, this rate was 3.2% vs 1.5% (p=0.130), respectively. Negative likelihood ratio was 0.28 (95% CI 0.10-0.83) for the Geneva Risk Score and 0.51 (95% CI 0.28-0.93) for the Padua Prediction Score. In conclusion, among hospitalised medical patients, the Geneva Risk Score predicted VTE and VTE-related mortality and compared favourably with the Padua Prediction Score, particularly for its accuracy to identify low-risk patients who do not require thromboprophylaxis.

  20. Biomechanical Risk Estimates for Mild Traumatic Brain Injury

    PubMed Central

    Funk, J. R.; Duma, S. M.; Manoogian, S. J.; Rowson, S.

    2007-01-01

    The objective of this study was to characterize the risk of mild traumatic brain injury (MTBI) in living humans based on a large set of head impact data taken from American football players at the collegiate level. Real-time head accelerations were recorded from helmet-mounted accelerometers designed to stay in contact with the player’s head. Over 27,000 head impacts were recorded, including four impacts resulting in MTBI. Parametric risk curves were developed by normalizing MTBI incidence data by head impact exposure data. An important finding of this research is that living humans, at least in the setting of collegiate football, sustain much more significant head impacts without apparent injury than previously thought. The following preliminary nominal injury assessment reference values associated with a 10% risk of MTBI are proposed: a peak linear head acceleration of 165 g, a HIC of 400, and a peak angular head acceleration of 9000 rad/s2. PMID:18184501
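
    As a sketch of how a parametric risk curve of the kind described above can be constructed, the code below fits a logistic injury-risk curve to synthetic head-impact data and reads off the acceleration associated with a 10% risk; the data and fitted numbers are illustrative and will not reproduce the study's reference values.

      # Sketch: fit a logistic MTBI-risk curve to head-impact exposure data and
      # solve for the linear acceleration at 10% risk. Synthetic data only.
      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(0)
      accel = rng.uniform(20, 200, size=5000)                  # peak linear acceleration (g)
      p_true = 1.0 / (1.0 + np.exp(-(accel - 170.0) / 15.0))   # assumed underlying risk curve
      injury = (rng.random(5000) < p_true).astype(int)         # 1 = MTBI observed

      model = LogisticRegression().fit(accel.reshape(-1, 1), injury)
      b0, b1 = model.intercept_[0], model.coef_[0][0]

      target = 0.10
      accel_10pct = (np.log(target / (1 - target)) - b0) / b1  # invert the logistic
      print(f"Estimated acceleration at 10% MTBI risk ~ {accel_10pct:.0f} g")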

  1. Probabilistic methodology for estimating radiation-induced cancer risk

    SciTech Connect

    Dunning, D.E. Jr.; Leggett, R.W.; Williams, L.R.

    1981-01-01

    The RICRAC computer code was developed at Oak Ridge National Laboratory to provide a versatile and convenient methodology for radiation risk assessment. The code allows as input essentially any dose pattern commonly encountered in risk assessments for either acute or chronic exposures, and it includes consideration of the age structure of the exposed population. Results produced by the analysis include the probability of one or more radiation-induced cancer deaths in a specified population, expected numbers of deaths, and expected years of life lost as a result of premature fatalities. These calculations include consideration of competing risks of death from all other causes. The program also generates a probability frequency distribution of the expected number of cancers in any specified cohort resulting from a given radiation dose. The methods may be applied to any specified population and dose scenario.
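
    A minimal sketch of the first quantity listed above: if the number of radiation-induced cancer deaths in a cohort is approximately Poisson distributed, the probability of one or more deaths is 1 - exp(-expected number). The dose, risk coefficient, and cohort size below are illustrative placeholders, not RICRAC inputs or outputs.

      # Sketch: probability of one or more radiation-induced cancer deaths in a
      # cohort, assuming the count of induced deaths is approximately Poisson.
      import math

      cohort_size = 10_000
      mean_dose_sv = 0.01      # assumed average dose per person (Sv)
      risk_per_sv = 5.5e-2     # assumed nominal fatal-risk coefficient (per Sv)

      expected_deaths = cohort_size * mean_dose_sv * risk_per_sv
      p_at_least_one = 1.0 - math.exp(-expected_deaths)

      print(f"Expected radiation-induced deaths: {expected_deaths:.2f}")
      print(f"P(>= 1 induced death) = {p_at_least_one:.3f}")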

  2. Space Radiation Cancer, Circulatory Disease and CNS Risks for Near Earth Asteroid and Mars Missions: Uncertainty Estimates for Never-Smokers

    NASA Technical Reports Server (NTRS)

    Cucinotta, Francis A.; Chappell, Lori J.; Wang, Minli; Kim, Myung-Hee

    2011-01-01

    The uncertainties in estimating the health risks from galactic cosmic rays (GCR) and solar particle events (SPE) are a major limitation to the length of space missions and the evaluation of potential risk mitigation approaches. NASA limits astronaut exposures to a 3% risk of exposure-induced cancer death (REID), and protects against uncertainties in risk projections using an assessment of 95% confidence intervals after propagating the error from all model factors (environment and organ exposure, risk coefficients, dose-rate modifiers, and quality factors). Because there are potentially significant late mortality risks from diseases of the circulatory system and central nervous system (CNS), which are less well defined than cancer risks, the cancer REID limit is not necessarily conservative. In this report, we discuss estimates of lifetime risks from space radiation and describe new estimates of model uncertainties. The key updates to the NASA risk projection model are: 1) revised values for low-LET risk coefficients for tissue-specific cancer incidence, with incidence rates transported to an average U.S. population to estimate the probability of Risk of Exposure Induced Cancer (REIC) and REID; 2) an analysis of smoking-attributable cancer risks for never-smokers, which shows significantly reduced lung cancer risk as well as overall cancer risks from radiation compared to the risks estimated for the average U.S. population; 3) derivation of track-structure-based quality functions that depend on particle fluence, charge number Z, and kinetic energy E; 4) the assignment of a smaller maximum in the quality function for leukemia than for solid cancers; 5) the demonstration that the ICRP tissue weights over-estimate cancer risks from SPEs by a factor of 2 or more, so that summing cancer risks for each tissue is recommended as a more accurate approach to estimate SPE cancer risks; and 6) additional considerations on circulatory and CNS disease risks. Our analysis shows that an individual's

  3. Aggregate versus Individual-Level Sexual Behavior Assessment: How Much Detail Is Needed to Accurately Estimate HIV/STI Risk?

    ERIC Educational Resources Information Center

    Pinkerton, Steven D.; Galletly, Carol L.; McAuliffe, Timothy L.; DiFranceisco, Wayne; Raymond, H. Fisher; Chesson, Harrell W.

    2010-01-01

    The sexual behaviors of HIV/sexually transmitted infection (STI) prevention intervention participants can be assessed on a partner-by-partner basis: in aggregate (i.e., total numbers of sex acts, collapsed across partners) or using a combination of these two methods (e.g., assessing five partners in detail and any remaining partners in aggregate).…

  4. CubeSat mission design software tool for risk estimating relationships

    NASA Astrophysics Data System (ADS)

    Gamble, Katharine Brumbaugh; Lightsey, E. Glenn

    2014-09-01

    In an effort to make the CubeSat risk estimation and management process more scientific, a software tool has been created that enables mission designers to estimate mission risks. CubeSat mission designers are able to input mission characteristics, such as form factor, mass, development cycle, and launch information, in order to determine the mission risk root causes which historically present the highest risk for their mission. Historical data was collected from the CubeSat community and analyzed to provide a statistical background to characterize these Risk Estimating Relationships (RERs). This paper develops and validates the mathematical model based on the same cost estimating relationship methodology used by the Unmanned Spacecraft Cost Model (USCM) and the Small Satellite Cost Model (SSCM). The RER development uses general error regression models to determine the best fit relationship between root cause consequence and likelihood values and the input factors of interest. These root causes are combined into seven overall CubeSat mission risks which are then graphed on the industry-standard 5×5 Likelihood-Consequence (L-C) chart to help mission designers quickly identify areas of concern within their mission. This paper is the first to document not only the creation of a historical database of CubeSat mission risks, but, more importantly, the scientific representation of Risk Estimating Relationships.

  5. Development and validation of risk prediction equations to estimate future risk of blindness and lower limb amputation in patients with diabetes: cohort study

    PubMed Central

    Coupland, Carol

    2015-01-01

    Study question Is it possible to develop and externally validate risk prediction equations to estimate the 10 year risk of blindness and lower limb amputation in patients with diabetes aged 25-84 years? Methods This was a prospective cohort study using routinely collected data from general practices in England contributing to the QResearch and Clinical Practice Research Datalink (CPRD) databases during the study period 1998-2014. The equations were developed using 763 QResearch practices (n=454 575 patients with diabetes) and validated in 254 different QResearch practices (n=142 419) and 357 CPRD practices (n=206 050). Cox proportional hazards models were used to derive separate risk equations for blindness and amputation in men and women that could be evaluated at 10 years. Measures of calibration and discrimination were calculated in the two validation cohorts. Study answer and limitations Risk prediction equations to quantify absolute risk of blindness and amputation in men and women with diabetes have been developed and externally validated. In the QResearch derivation cohort, 4822 new cases of lower limb amputation and 8063 new cases of blindness occurred during follow-up. The risk equations were well calibrated in both validation cohorts. Discrimination was good in men in the external CPRD cohort for amputation (D statistic 1.69, Harrell’s C statistic 0.77) and blindness (D statistic 1.40, Harrell’s C statistic 0.73), with similar results in women and in the QResearch validation cohort. The algorithms are based on variables that patients are likely to know or that are routinely recorded in general practice computer systems. They can be used to identify patients at high risk for prevention or further assessment. Limitations include lack of formally adjudicated outcomes, information bias, and missing data. What this study adds Patients with type 1 or type 2 diabetes are at increased risk of blindness and amputation but generally do not have accurate
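
    As a loose sketch of how risk equations of this kind are derived and applied, the code below fits a Cox proportional hazards model to synthetic data with the lifelines package and evaluates a 10-year risk for a hypothetical patient; the covariates, coefficients, and data are invented and bear no relation to the QResearch/CPRD equations.

      # Sketch: fit a Cox proportional hazards model on synthetic data and
      # evaluate a 10-year event risk for a new patient. Uses lifelines; all
      # columns and values are hypothetical.
      import numpy as np
      import pandas as pd
      from lifelines import CoxPHFitter

      rng = np.random.default_rng(7)
      n = 2000
      df = pd.DataFrame({
          "age": rng.uniform(25, 84, n),
          "hba1c": rng.normal(8.0, 1.5, n),
      })
      rate = 0.01 * np.exp(0.03 * (df["age"] - 55) + 0.2 * (df["hba1c"] - 8))
      time_to_event = rng.exponential(1.0 / rate)
      df["duration"] = np.minimum(time_to_event, 10.0)      # administrative censoring at 10 years
      df["event"] = (time_to_event <= 10.0).astype(int)

      cph = CoxPHFitter()
      cph.fit(df, duration_col="duration", event_col="event")

      new_patient = pd.DataFrame({"age": [70.0], "hba1c": [9.5]})
      surv_10y = cph.predict_survival_function(new_patient, times=[10.0]).iloc[0, 0]
      print(f"Estimated 10-year risk: {1 - surv_10y:.1%}")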

  6. Assessing uncertainty in published risk estimates using hexavalent chromium and lung cancer mortality as an example

    EPA Science Inventory

    Introduction: The National Research Council recommended quantitative evaluation of uncertainty in effect estimates for risk assessment. This analysis considers uncertainty across model forms and model parameterizations with hexavalent chromium [Cr(VI)] and lung cancer mortality a...

  7. Derivation and validation of a prediction rule for estimating advanced colorectal neoplasm risk in average-risk Chinese.

    PubMed

    Cai, Quan-Cai; Yu, En-Da; Xiao, Yi; Bai, Wen-Yuan; Chen, Xing; He, Li-Ping; Yang, Yu-Xiu; Zhou, Ping-Hong; Jiang, Xue-Liang; Xu, Hui-Min; Fan, Hong; Ge, Zhi-Zheng; Lv, Nong-Hua; Huang, Zhi-Gang; Li, You-Ming; Ma, Shu-Ren; Chen, Jie; Li, Yan-Qing; Xu, Jian-Ming; Xiang, Ping; Yang, Li; Lin, Fu-Lin; Li, Zhao-Shen

    2012-03-15

    No prediction rule is currently available for advanced colorectal neoplasms, defined as invasive cancer, an adenoma of 10 mm or more, a villous adenoma, or an adenoma with high-grade dysplasia, in average-risk Chinese. In this study between 2006 and 2008, a total of 7,541 average-risk Chinese persons aged 40 years or older who had complete colonoscopy were included. The derivation and validation cohorts consisted of 5,229 and 2,312 persons, respectively. A prediction rule was developed from a logistic regression model and then internally and externally validated. The prediction rule comprised 8 variables (age, sex, smoking, diabetes mellitus, green vegetables, pickled food, fried food, and white meat), with scores ranging from 0 to 14. Among the participants with low-risk (≤3) or high-risk (>3) scores in the validation cohort, the risks of advanced neoplasms were 2.6% and 10.0% (P < 0.001), respectively. If colonoscopy was used only for persons with high risk, 80.3% of persons with advanced neoplasms would be detected while the number of colonoscopies would be reduced by 49.2%. The prediction rule had good discrimination (area under the receiver operating characteristic curve = 0.74, 95% confidence interval: 0.70, 0.78) and calibration (P = 0.77) and, thus, provides accurate risk stratification for advanced neoplasms in average-risk Chinese. PMID:22328705

  8. Estimated exposure to phthalates in cosmetics and risk assessment.

    PubMed

    Koo, Hyun Jung; Lee, Byung Mu

    2004-12-01

    Some phthalates such as di(2-ethylhexyl) phthalate (DEHP) and dibutyl phthalate (DBP) and their metabolites are suspected of producing teratogenic or endocrine-disrupting effects. To predict possible human exposure to phthalates in cosmetics, the levels of DEHP, diethyl phthalate (DEP), DBP, and butylbenzyl phthalate (BBP) were determined by high-performance liquid chromatography (HPLC) in 102 branded hair sprays, perfumes, deodorants, and nail polishes. DBP was detected in 19 of the 21 nail polishes and in 11 of the 42 perfumes, and DEP was detected in 24 of the 42 perfumes and 2 of the 8 deodorants. Median exposure levels to phthalates in cosmetics by dermal absorption were estimated to be 0.0006 µg/kg body weight (bw)/d for DEHP, 0.6 µg/kg bw/d for DEP, and 0.103 µg/kg bw/d for DBP. Furthermore, if phthalates in cosmetics were assumed to be absorbed exclusively via 100% inhalation, the median daily exposure levels were estimated to be 0.026 µg/kg bw/d for DEHP, 81.471 µg/kg bw/d for DEP, and 22.917 µg/kg bw/d for DBP, which are far lower than the regulation levels set by the Scientific Committee on Toxicity, Ecotoxicity, and the Environment (CSTEE) (37 µg/kg bw/d, DEHP), the Agency for Toxic Substances and Disease Registry (ATSDR) (7000 µg/kg bw/d, DEP), and the International Programme on Chemical Safety (IPCS) (66 µg/kg bw/d, DBP), respectively. Based on these data, hazard indices (HI, daily exposure level/regulation level) were calculated to be 0.0007 for DEHP, 0.012 for DEP, and 0.347 for DBP. These data suggest that the estimated exposures to phthalates in the cosmetics examined are relatively small. However, total exposure from several sources may be greater and requires further investigation.
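
    The hazard indices above are simply the estimated daily exposure divided by the corresponding reference level; the sketch below reproduces that arithmetic using the inhalation exposure estimates and reference values quoted in the abstract (all in µg/kg bw/day).

      # Sketch: hazard index (HI) = estimated daily exposure / reference level,
      # using the inhalation exposures and reference levels quoted above
      # (all in ug/kg bw/day).
      exposure = {"DEHP": 0.026, "DEP": 81.471, "DBP": 22.917}
      reference = {"DEHP": 37.0, "DEP": 7000.0, "DBP": 66.0}

      for phthalate, dose in exposure.items():
          print(f"{phthalate}: HI = {dose / reference[phthalate]:.4f}")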

  9. Cardiovascular risk factors and estimated 10-year risk of fatal cardiovascular events using various equations in Greeks with metabolic syndrome.

    PubMed

    Chimonas, Theodoros; Athyros, Vassilios G; Ganotakis, Emmanouel; Nicolaou, Vassilios; Panagiotakos, Demosthenes B; Mikhailidis, Dimitri P; Elisaf, Moses

    2010-01-01

    We investigated cardiovascular disease (CVD) risk factors in 1501 Greeks (613 men and 888 women, aged 40-65 years), referred to outpatient clinics, with metabolic syndrome (MetS) and without diabetes mellitus or CVD. The 10-year risk of fatal CVD events was calculated using the European Society of Cardiology Systematic Coronary Risk Estimation (ESC SCORE), HellenicSCORE, and Framingham equations. Raised blood pressure (BP) and hypertriglyceridemia were more common in men (89.6% vs 84.2% and 86.8% vs 74.2%, respectively; P < .001). Low high-density lipoprotein cholesterol (HDL-C) and abdominal obesity were more common in women (58.2% vs 66.2% and 85.8% vs 97.1%, respectively; P < .001). The 10-year risk of fatal CVD events using HellenicSCORE was higher in men (6.3% +/- 4.3% vs 2.7% +/- 2.1%; P < .001). ESC SCORE and Framingham yielded similar results. The risk equations gave similar assessments in a European Mediterranean population, except that HellenicSCORE classified more MetS women as requiring risk modification. This might justify local risk engine evaluation in event-based studies. (ClinicalTrials.gov ID: NCT00416741).

  10. Disease Risk Estimation by Combining Case–Control Data with Aggregated Information on the Population at Risk

    PubMed Central

    Chang, Xiaohui; Waagepetersen, Rasmus; Yu, Herbert; Ma, Xiaomei; Holford, Theodore R.; Wang, Rong; Guan, Yongtao

    2016-01-01

    Summary We propose a novel statistical framework by supplementing case–control data with summary statistics on the population at risk for a subset of risk factors. Our approach is to first form two unbiased estimating equations, one based on the case–control data and the other on both the case data and the summary statistics, and then optimally combine them to derive another estimating equation to be used for the estimation. The proposed method is computationally simple and more efficient than standard approaches based on case–control data alone. We also establish asymptotic properties of the resulting estimator, and investigate its finite-sample performance through simulation. As a substantive application, we apply the proposed method to investigate risk factors for endometrial cancer, by using data from a recently completed population-based case–control study and summary statistics from the Behavioral Risk Factor Surveillance System, the Population Estimates Program of the US Census Bureau, and the Connecticut Department of Transportation. PMID:25351292

  11. Estimated occupational risk from bioaerosols generated during land application of Class B biosolids

    Technology Transfer Automated Retrieval System (TEKTRAN)

    It has been speculated that bioaerosols generated during land application of biosolids pose a serious occupational risk, but few scientific studies have been performed to assess levels of aerosolization of microorganisms from biosolids and to estimate the occupational risks of infection. This study ...

  12. Bayesian Framework for Water Quality Model Uncertainty Estimation and Risk Management

    EPA Science Inventory

    A formal Bayesian methodology is presented for integrated model calibration and risk-based water quality management using Bayesian Monte Carlo simulation and maximum likelihood estimation (BMCML). The primary focus is on lucid integration of model calibration with risk-based wat...

  13. Estimating risks to aquatic life using quantile regression

    USGS Publications Warehouse

    Schmidt, Travis S.; Clements, William H.; Cade, Brian S.

    2012-01-01

    One of the primary goals of biological assessment is to assess whether contaminants or other stressors limit the ecological potential of running waters. It is important to interpret responses to contaminants relative to other environmental factors, but necessity or convenience limit quantification of all factors that influence ecological potential. In these situations, the concept of limiting factors is useful for data interpretation. We used quantile regression to measure risks to aquatic life exposed to metals by including all regression quantiles (τ  =  0.05–0.95, by increments of 0.05), not just the upper limit of density (e.g., 90th quantile). We measured population densities (individuals/0.1 m2) of 2 mayflies (Rhithrogena spp., Drunella spp.) and a caddisfly (Arctopsyche grandis), aqueous metal mixtures (Cd, Cu, Zn), and other limiting factors (basin area, site elevation, discharge, temperature) at 125 streams in Colorado. We used a model selection procedure to test which factor was most limiting to density. Arctopsyche grandis was limited by other factors, whereas metals limited most quantiles of density for the 2 mayflies. Metals reduced mayfly densities most at sites where other factors were not limiting. Where other factors were limiting, low mayfly densities were observed despite metal concentrations. Metals affected mayfly densities most at quantiles above the mean and not just at the upper limit of density. Risk models developed from quantile regression showed that mayfly densities observed at background metal concentrations are improbable when metal mixtures are at US Environmental Protection Agency criterion continuous concentrations. We conclude that metals limit potential density, not realized average density. The most obvious effects on mayfly populations were at upper quantiles and not mean density. Therefore, we suggest that policy developed from mean-based measures of effects may not be as useful as policy based on the concept of
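
    A minimal sketch of the quantile-regression idea above, using statsmodels on synthetic data: the upper quantiles of density are modelled as a function of a stressor rather than the mean. Variable names, data, and the choice of the 0.90 quantile are illustrative assumptions.

      # Sketch: quantile regression of organism density on a stressor, estimating
      # an upper quantile (limiting-factor view) and the median. Synthetic data.
      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      rng = np.random.default_rng(1)
      metal = rng.uniform(0, 10, 300)
      # density limited from above by the stressor; other unmeasured factors act below
      density = (50.0 - 4.0 * metal) * rng.random(300)
      df = pd.DataFrame({"metal": metal, "density": density})

      fit_90 = smf.quantreg("density ~ metal", df).fit(q=0.90)   # near the upper limit
      fit_50 = smf.quantreg("density ~ metal", df).fit(q=0.50)   # median response
      print(fit_90.params, fit_50.params, sep="\n")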

  14. Uncertainties in Estimates of the Risks of Late Effects from Space Radiation

    NASA Technical Reports Server (NTRS)

    Cucinotta, F. A.; Schimmerling, W.; Wilson, J. W.; Peterson, L. E.; Saganti, P.; Dicelli, J. F.

    2002-01-01

    The health risks faced by astronauts from space radiation include cancer, cataracts, hereditary effects, and non-cancer morbidity and mortality risks related to the diseases of old age. Methods used to project risks in low-Earth orbit are of questionable merit for exploration missions because of the limited radiobiology data and knowledge of galactic cosmic ray (GCR) heavy ions, which causes estimates of the risk of late effects to be highly uncertain. Risk projections involve a product of many biological and physical factors, each of which has a differential range of uncertainty due to lack of data and knowledge. Within the linear-additivity model, we use Monte-Carlo sampling from subjective uncertainty distributions in each factor to obtain a maximum likelihood estimate of the overall uncertainty in risk projections. The resulting methodology is applied to several human space exploration mission scenarios including ISS, a lunar station, a deep space outpost, and Mars missions of duration 360, 660, and 1000 days. The major results are the quantification of the uncertainties in current risk estimates, the identification of factors that dominate risk projection uncertainties, and the development of a method to quantify candidate approaches to reduce uncertainties or mitigate risks. The large uncertainties in GCR risk projections lead to probability distributions of risk that mask any potential risk reduction using the "optimization" of shielding materials or configurations. In contrast, the design of shielding optimization approaches for solar particle events and trapped protons can be made at this time, and promising technologies can be shown to have merit using our approach. The methods used also make it possible to express risk management objectives in terms of quantitative objectives, i.e., the number of days in space without exceeding a given risk level within well-defined confidence limits.
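
    The Monte-Carlo step described above can be sketched as follows: each factor of a multiplicative risk model is sampled from a subjective uncertainty distribution and the resulting spread in risk is summarized. The factors and distributions below are illustrative placeholders, not the model's actual uncertainty assignments.

      # Sketch: Monte Carlo propagation of subjective uncertainties through a
      # multiplicative risk model. Distributions are illustrative placeholders.
      import numpy as np

      rng = np.random.default_rng(42)
      n = 100_000

      baseline_risk = 0.03 * rng.lognormal(mean=0.0, sigma=0.2, size=n)  # nominal 3% risk
      dose_factor   = rng.normal(loc=1.0, scale=0.10, size=n)            # organ-dose uncertainty
      quality       = rng.lognormal(mean=0.0, sigma=0.5, size=n)         # quality-factor uncertainty
      dose_rate_mod = rng.uniform(0.5, 1.5, size=n)                      # dose-rate modifier

      risk = baseline_risk * dose_factor * quality * dose_rate_mod
      lo, med, hi = np.percentile(risk, [2.5, 50, 97.5])
      print(f"Risk median {med:.3f}, 95% interval [{lo:.3f}, {hi:.3f}]")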

  15. REVIEW OF DRAFT REVISED BLUE BOOK ON ESTIMATING CANCER RISKS FROM EXPOSURE TO IONIZING RADIATION

    EPA Science Inventory

    In 1994, EPA published a report, referred to as the “Blue Book,” which lays out EPA’s current methodology for quantitatively estimating radiogenic cancer risks. A follow-on report made minor adjustments to the previous estimates and presented a partial analysis of the uncertainti...

  16. ASSESSMENT OF METHODS FOR ESTIMATING RISK TO BIRDS FROM INGESTION OF CONTAMINATED GRIT PARTICLES (FINAL REPORT)

    EPA Science Inventory

    The report evaluates approaches for estimating the probability of ingestion by birds of contaminated particles such as pesticide granules or lead particles (i.e. shot or bullet fragments). In addition, it presents an approach for using this information to estimate the risk of mo...

  17. Estimating Toxicity Pathway Activating Doses for High Throughput Chemical Risk Assessments

    EPA Science Inventory

    Estimating a Toxicity Pathway Activating Dose (TPAD) from in vitro assays as an analog to a reference dose (RfD) derived from in vivo toxicity tests would facilitate high throughput risk assessments of thousands of data-poor environmental chemicals. Estimating a TPAD requires def...

  18. Risk estimations, risk factors, and genetic variants associated with Alzheimer's disease in selected publications from the Framingham Heart Study.

    PubMed

    Weinstein, Galit; Wolf, Philip A; Beiser, Alexa S; Au, Rhoda; Seshadri, Sudha

    2013-01-01

    The study of Alzheimer's disease (AD) in the Framingham Heart Study (FHS), a multi-generational, community-based population study, began nearly four decades ago. In this overview, we highlight findings from seven prior publications that examined lifetime risk estimates for AD, environmental risk factors for AD, circulating and imaging markers of aging-related brain injury, and explorations on the genetics underlying AD. First, we describe estimations of the lifetime risk of AD. These estimates are distinguished from other measures of disease burden and have substantial public health implications. We then describe prospective studies of environmental AD risk factors: one examined the association between plasma levels of omega-3 fatty-acid and risk of incident AD, the other explored the association of diabetes to this risk in subsamples with specific characteristics. With evidence of inflammation as an underlying mechanism, we also describe findings from a study that compared the effects of serum cytokines and spontaneous production of peripheral blood mononuclear cell cytokines on AD risk. Investigating AD related endophenotypes increases sensitivity in identifying risk factors and can be used to explore pathophysiologic pathways between a risk factor and the disease. We describe findings of an association between large volume of white matter hyperintensities and a specific pattern of cognitive deficits in non-demented participants. Finally, we summarize our findings from two genetic studies: The first used genome-wide association (GWA) and family-based association methods to explore the genetic basis of cognitive and structural brain traits. The second is a large meta-analysis GWA study of AD, in which novel loci of AD susceptibility were found. Together, these findings demonstrate the FHS multi-directional efforts in investigating dementia and AD. PMID:22796871

  19. An exploration of spatial risk assessment for soil protection: estimating risk and establishing priority areas for soil protection.

    PubMed

    Kibblewhite, M G; Bellamy, P H; Brewer, T R; Graves, A R; Dawson, C A; Rickson, R J; Truckell, I; Stuart, J

    2014-03-01

    Methods for the spatial estimation of risk of harm to soil by erosion by water and wind and by soil organic matter decline are explored. Rates of harm are estimated for combinations of soil type and land cover (as a proxy for hazard frequency) and used to estimate risk of soil erosion and loss of soil organic carbon (SOC) for 1 km² pixels. Scenarios are proposed for defining the acceptability of risk of harm to soil: the most precautionary one corresponds to no net harm after natural regeneration of soil (i.e. a 1 in 20 chance of exceeding an erosion rate of <1 t ha⁻¹ y⁻¹ and an SOC content decline of 0 kg t⁻¹ y⁻¹ for mineral soils and a carbon stock decline of 0 t ha⁻¹ y⁻¹ for organic soils). Areas at higher and lower than possible acceptable risk are mapped. The veracity of boundaries is compromised if areas of unacceptable risk are mapped to administrative boundaries. Errors in monitoring change in risk of harm to soil and inadequate information on risk reduction measures' efficacy, at landscape scales, make it impossible to use or monitor quantitative targets for risk reduction adequately. The consequences for priority area definition of expressing varying acceptable risk of harm to soil as a varying probability of exceeding a fixed level of harm, or, a varying level of harm being exceeded with a fixed probability, are discussed. Soil data and predictive models for rates of harm to soil would need considerable development and validation to implement a priority area approach robustly. PMID:24412915

  20. An exploration of spatial risk assessment for soil protection: estimating risk and establishing priority areas for soil protection.

    PubMed

    Kibblewhite, M G; Bellamy, P H; Brewer, T R; Graves, A R; Dawson, C A; Rickson, R J; Truckell, I; Stuart, J

    2014-03-01

    Methods for the spatial estimation of risk of harm to soil by erosion by water and wind and by soil organic matter decline are explored. Rates of harm are estimated for combinations of soil type and land cover (as a proxy for hazard frequency) and used to estimate risk of soil erosion and loss of soil organic carbon (SOC) for 1 km² pixels. Scenarios are proposed for defining the acceptability of risk of harm to soil: the most precautionary one corresponds to no net harm after natural regeneration of soil (i.e. a 1 in 20 chance of exceeding an erosion rate of <1 t ha⁻¹ y⁻¹ and an SOC content decline of 0 kg t⁻¹ y⁻¹ for mineral soils and a carbon stock decline of 0 t ha⁻¹ y⁻¹ for organic soils). Areas at higher and lower than possible acceptable risk are mapped. The veracity of boundaries is compromised if areas of unacceptable risk are mapped to administrative boundaries. Errors in monitoring change in risk of harm to soil and inadequate information on risk reduction measures' efficacy, at landscape scales, make it impossible to use or monitor quantitative targets for risk reduction adequately. The consequences for priority area definition of expressing varying acceptable risk of harm to soil as a varying probability of exceeding a fixed level of harm, or, a varying level of harm being exceeded with a fixed probability, are discussed. Soil data and predictive models for rates of harm to soil would need considerable development and validation to implement a priority area approach robustly.

  1. Overview of Risk-Estimation Tools for Primary Prevention of Cardiovascular Diseases in European Populations.

    PubMed

    Gorenoi, Vitali; Hagen, Anja

    2015-06-01

    To identify persons at high risk for cardiovascular disease (CVD), special tools (scores, charts, graphics or computer programs) for CVD risk assessment based on the levels of certain risk factors have been constructed. The applicability of these instruments depends on the derivation cohorts, the risk factors and endpoints considered, the statistical methods applied, and the formats used. This review addresses the risk-estimation tools for primary prevention of CVD that are potentially relevant for European populations. The risk-estimation tools were identified using two previously published systematic reviews as well as a literature search in MEDLINE and a manual search. Only instruments were considered that were derived from cohorts of at least 1,000 participants of one gender without pre-existing CVD, enable risk assessment for a period of at least 5 years, were designed for an age range of at least 25 years, and were published after the year 2000. A number of risk-estimation tools for CVD derived from single European cohorts, from several European cohorts, and from non-European cohorts were identified. From a clinical perspective, the preferable instruments seem to be those developed recently for the population of interest, which use easily accessible measures and show high discriminating ability. Instruments restricting risk estimation to certain cardiovascular events, recalibrated high-accuracy tools, or tools derived from European populations with a similar risk-factor distribution and CVD incidence are the second choice. In younger people, calculating relative risk or cardiovascular-age equivalence measures may be of more benefit.

  2. Quantitative Risk reduction estimation Tool For Control Systems, Suggested Approach and Research Needs

    SciTech Connect

    Miles McQueen; Wayne Boyer; Mark Flynn; Sam Alessi

    2006-03-01

    For the past year we have applied a variety of risk assessment technologies to evaluate the risk to critical infrastructure from cyber attacks on control systems. More recently, we identified the need for a stand alone control system risk reduction estimation tool to provide owners and operators of control systems with a more useable, reliable, and credible method for managing the risks from cyber attack. Risk is defined as the probability of a successful attack times the value of the resulting loss, typically measured in lives and dollars. Qualitative and ad hoc techniques for measuring risk do not provide sufficient support for cost benefit analyses associated with cyber security mitigation actions. To address the need for better quantitative risk reduction models we surveyed previous quantitative risk assessment research; evaluated currently available tools; developed new quantitative techniques [17] [18]; implemented a prototype analysis tool to demonstrate how such a tool might be used; used the prototype to test a variety of underlying risk calculational engines (e.g. attack tree, attack graph); and identified technical and research needs. We concluded that significant gaps still exist and difficult research problems remain for quantitatively assessing the risk to control system components and networks, but that a useable quantitative risk reduction estimation tool is not beyond reach.

  3. Space Radiation Heart Disease Risk Estimates for Lunar and Mars Missions

    NASA Technical Reports Server (NTRS)

    Cucinotta, Francis A.; Chappell, Lori; Kim, Myung-Hee

    2010-01-01

    The NASA Space Radiation Program performs research on the risks of late effects from space radiation for cancer, neurological disorders, cataracts, and heart disease. For mortality risks, an aggregate over all risks should be considered, as well as a projection of the life loss per radiation-induced death. We report on a triple-detriment life-table approach to combining cancer and heart disease risks. Epidemiology results show extensive heterogeneity between populations for distinct components of the overall heart disease risk, including hypertension, ischaemic heart disease, stroke, and cerebrovascular diseases. We report an update to our previous estimates for heart disease (ICD9 390-429) and stroke (ICD9 430-438), and other subgroups, using recent meta-analysis results for various cohorts exposed to low-LET radiation. Results for multiplicative and additive risk transfer models are considered using baseline rates for US males and females. Uncertainty analysis indicated heart mortality risks as low as zero, assuming a threshold dose for deterministic effects, and projections approaching one-third of the overall cancer risk. Median life-loss-per-death estimates were significantly less than those for solid cancers and leukemias. Critical research questions for improving risk estimates for heart disease are distinctions in mechanisms at high doses (>2 Gy) versus low to moderate doses (<2 Gy), and data and basic understanding of radiation dose-rate and quality effects and of individual sensitivity.

  4. Breast Cancer Risk Estimation Using Parenchymal Texture Analysis in Digital Breast Tomosynthesis

    SciTech Connect

    Ikejimba, Lynda C.; Kontos, Despina; Maidment, Andrew D. A.

    2010-10-11

    Mammographic parenchymal texture has been shown to correlate with genetic markers of developing breast cancer. Digital breast tomosynthesis (DBT) is a novel x-ray imaging technique in which tomographic images of the breast are reconstructed from multiple source projections acquired at different angles of the x-ray tube. Compared to digital mammography (DM), DBT eliminates breast tissue overlap, offering superior parenchymal tissue visualization. We hypothesize that texture analysis in DBT could potentially provide a better assessment of parenchymal texture and ultimately result in more accurate assessment of breast cancer risk. As a first step towards validating this hypothesis, we investigated the association between DBT parenchymal texture and breast percent density (PD), a known breast cancer risk factor, and compared it to DM. Bilateral DBT and DM images from 71 women participating in a breast cancer screening trial were analyzed. Filtered-backprojection was used to reconstruct DBT tomographic planes in 1 mm increments with 0.22 mm in-plane resolution. Corresponding DM images were acquired at 0.1 mm pixel resolution. Retroareolar regions of interest (ROIs) equivalent to 2.5 cm³ were segmented from the DBT images and corresponding 2.5 cm² ROIs were segmented from the DM images. Breast PD was mammographically estimated using the Cumulus scale. Overall, DBT texture features demonstrated a stronger correlation than DM to PD. The Pearson correlation coefficients for DBT were r = 0.40 (p<0.001) for contrast and r = -0.52 (p<0.001) for homogeneity; the corresponding DM correlations were r = 0.26 (p = 0.002) and r = -0.33 (p<0.001). Multiple linear regression of the texture features versus breast PD also demonstrated significantly stronger associations in DBT (R² = 0.39) compared to DM (R² = 0.33). We attribute these observations to the superior parenchymal tissue visualization in DBT. Our study is the first to perform DBT texture analysis in a
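
    As a sketch of the kind of texture features named above (contrast, homogeneity), the code below computes gray-level co-occurrence matrix (GLCM) features for a synthetic region of interest with scikit-image; it assumes the graycomatrix/graycoprops names used in recent scikit-image releases (older releases spell them greycomatrix/greycoprops), and the image data are random, not DBT data.

      # Sketch: GLCM contrast and homogeneity for a synthetic ROI, the type of
      # parenchymal texture feature correlated with percent density above.
      import numpy as np
      from skimage.feature import graycomatrix, graycoprops

      rng = np.random.default_rng(0)
      roi = rng.integers(0, 64, size=(128, 128), dtype=np.uint8)   # synthetic 6-bit ROI

      glcm = graycomatrix(roi, distances=[1], angles=[0, np.pi / 2],
                          levels=64, symmetric=True, normed=True)
      contrast = graycoprops(glcm, "contrast").mean()
      homogeneity = graycoprops(glcm, "homogeneity").mean()
      print(f"contrast={contrast:.2f}, homogeneity={homogeneity:.3f}")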

  5. Uncertainties in estimates of the risks of late effects from space radiation.

    PubMed

    Cucinotta, F A; Schimmerling, W; Wilson, J W; Peterson, L E; Saganti, P B; Dicello, J F

    2004-01-01

    Methods used to project risks in low-Earth orbit are of questionable merit for exploration missions because of the limited radiobiology data and knowledge of galactic cosmic ray (GCR) heavy ions, which cause estimates of the risk of late effects to be highly uncertain. Risk projections involve a product of many biological and physical factors, each of which has a differential range of uncertainty due to lack of data and knowledge. Using the linear-additivity model for radiation risks, we use Monte-Carlo sampling from subjective uncertainty distributions in each factor to obtain an estimate of the overall uncertainty in risk projections. The resulting methodology is applied to several human space exploration mission scenarios, including a deep space outpost and Mars missions with durations of 360, 660, and 1000 days. The major results are the quantification of the uncertainties in current risk estimates, the identification of factors that dominate risk projection uncertainties, and the development of a method to quantify candidate approaches to reduce uncertainties or mitigate risks. The large uncertainties in GCR risk projections lead to probability distributions of risk that mask any potential risk reduction using the "optimization" of shielding materials or configurations. In contrast, the design of shielding optimization approaches for solar particle events and trapped protons can be made at this time and promising technologies can be shown to have merit using our approach. The methods used also make it possible to express risk management objectives in terms of quantitative metrics, e.g., the number of days in space without exceeding a given risk level within well-defined confidence limits.
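
    A minimal sketch of the Monte-Carlo uncertainty propagation described above, using assumed subjective distributions for a few illustrative factors; the point estimate and distribution parameters are placeholders, not the paper's.

      import numpy as np

      rng = np.random.default_rng(42)
      n = 100_000

      point_risk = 0.032                                            # hypothetical point estimate of fatal risk
      quality_factor = rng.lognormal(mean=0.0, sigma=0.5, size=n)   # radiation-quality uncertainty
      ddref = rng.lognormal(mean=0.0, sigma=0.3, size=n)            # dose-rate effectiveness factor
      transfer = rng.normal(loc=1.0, scale=0.2, size=n)             # risk-transfer model uncertainty

      risk_samples = point_risk * quality_factor * transfer / ddref
      lo, median, hi = np.percentile(risk_samples, [2.5, 50, 97.5])
      print(f"median {median:.3f}, 95% interval [{lo:.3f}, {hi:.3f}]")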

  6. Uncertainties in estimates of the risks of late effects from space radiation

    NASA Technical Reports Server (NTRS)

    Cucinotta, F. A.; Schimmerling, W.; Wilson, J. W.; Peterson, L. E.; Saganti, P. B.; Dicello, J. F.

    2004-01-01

    Methods used to project risks in low-Earth orbit are of questionable merit for exploration missions because of the limited radiobiology data and knowledge of galactic cosmic ray (GCR) heavy ions, which cause estimates of the risk of late effects to be highly uncertain. Risk projections involve a product of many biological and physical factors, each of which has a differential range of uncertainty due to lack of data and knowledge. Using the linear-additivity model for radiation risks, we use Monte-Carlo sampling from subjective uncertainty distributions in each factor to obtain an estimate of the overall uncertainty in risk projections. The resulting methodology is applied to several human space exploration mission scenarios, including a deep space outpost and Mars missions with durations of 360, 660, and 1000 days. The major results are the quantification of the uncertainties in current risk estimates, the identification of factors that dominate risk projection uncertainties, and the development of a method to quantify candidate approaches to reduce uncertainties or mitigate risks. The large uncertainties in GCR risk projections lead to probability distributions of risk that mask any potential risk reduction using the "optimization" of shielding materials or configurations. In contrast, the design of shielding optimization approaches for solar particle events and trapped protons can be made at this time and promising technologies can be shown to have merit using our approach. The methods used also make it possible to express risk management objectives in terms of quantitative metrics, e.g., the number of days in space without exceeding a given risk level within well-defined confidence limits. Published by Elsevier Ltd on behalf of COSPAR.

  7. A simple procedure for estimating pseudo risk ratios from exposure to non-carcinogenic chemical mixtures.

    PubMed

    Scinicariello, Franco; Portier, Christopher

    2016-03-01

    Non-cancer risk assessment traditionally assumes a threshold of effect, below which there is a negligible risk of an adverse effect. The Agency for Toxic Substances and Disease Registry derives health-based guidance values known as Minimal Risk Levels (MRLs) as estimates of the toxicity threshold for non-carcinogens. Although the definition of an MRL, as well as of EPA reference dose values (RfD and RfC), is a level that corresponds to "negligible risk," these values represent daily exposure doses or concentrations, not risks. We present a new approach to calculate the risk at specific exposure doses for chemical mixtures; the assumption in this approach is to assign de minimis risk at the MRL. The assigned risk enables the estimation of parameters in an exponential model, providing a complete dose-response curve for each compound from the chosen point of departure to zero. We estimated parameters for 27 chemicals. The value of k, which determines the shape of the dose-response curve, was moderately insensitive to the choice of the risk at the MRL. The approach presented here allows for the calculation of the risk from a single substance or the combined risk from multiple chemical exposures in a community. The methodology is applicable to points of departure derived from quantal data, such as benchmark dose analyses, or from data that can be transformed into probabilities, such as the lowest-observed-adverse-effect level. The individual risks are used to calculate risk ratios that can facilitate comparison and cost-benefit analyses of environmental contamination control strategies. PMID:25667015
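
    The abstract does not give the exact parameterization, so the sketch below assumes a generic Weibull-type exponential model R(d) = 1 - exp(-(d/alpha)^k), anchored at two points: the benchmark response at the point of departure and an assigned de minimis risk at the MRL. All numeric inputs are hypothetical.

      import math

      def fit_exponential(pod, bmr, mrl, de_minimis=1e-4):
          """Fit R(d) = 1 - exp(-(d/alpha)**k) through (pod, bmr) and (mrl, de_minimis)."""
          k = math.log(math.log(1 - bmr) / math.log(1 - de_minimis)) / math.log(pod / mrl)
          alpha = pod / (-math.log(1 - bmr)) ** (1.0 / k)
          return alpha, k

      def risk(dose, alpha, k):
          return 1 - math.exp(-(dose / alpha) ** k)

      alpha, k = fit_exponential(pod=1.0, bmr=0.10, mrl=0.01)   # hypothetical doses, mg/kg/day
      print("k = %.2f, risk at 0.05 mg/kg/day = %.2e" % (k, risk(0.05, alpha, k)))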

  8. Uncertainties in estimates of the risks of late effects from space radiation.

    PubMed

    Cucinotta, F A; Schimmerling, W; Wilson, J W; Peterson, L E; Saganti, P B; Dicello, J F

    2004-01-01

    Methods used to project risks in low-Earth orbit are of questionable merit for exploration missions because of the limited radiobiology data and knowledge of galactic cosmic ray (GCR) heavy ions, which cause estimates of the risk of late effects to be highly uncertain. Risk projections involve a product of many biological and physical factors, each of which has a differential range of uncertainty due to lack of data and knowledge. Using the linear-additivity model for radiation risks, we use Monte-Carlo sampling from subjective uncertainty distributions in each factor to obtain an estimate of the overall uncertainty in risk projections. The resulting methodology is applied to several human space exploration mission scenarios, including a deep space outpost and Mars missions with durations of 360, 660, and 1000 days. The major results are the quantification of the uncertainties in current risk estimates, the identification of factors that dominate risk projection uncertainties, and the development of a method to quantify candidate approaches to reduce uncertainties or mitigate risks. The large uncertainties in GCR risk projections lead to probability distributions of risk that mask any potential risk reduction using the "optimization" of shielding materials or configurations. In contrast, the design of shielding optimization approaches for solar particle events and trapped protons can be made at this time and promising technologies can be shown to have merit using our approach. The methods used also make it possible to express risk management objectives in terms of quantitative metrics, e.g., the number of days in space without exceeding a given risk level within well-defined confidence limits. PMID:15881779

  9. A simple procedure for estimating pseudo risk ratios from exposure to non-carcinogenic chemical mixtures.

    PubMed

    Scinicariello, Franco; Portier, Christopher

    2016-03-01

    Non-cancer risk assessment traditionally assumes a threshold of effect, below which there is a negligible risk of an adverse effect. The Agency for Toxic Substances and Disease Registry derives health-based guidance values known as Minimal Risk Levels (MRLs) as estimates of the toxicity threshold for non-carcinogens. Although the definition of an MRL, as well as of EPA reference dose values (RfD and RfC), is a level that corresponds to "negligible risk," these values represent daily exposure doses or concentrations, not risks. We present a new approach to calculate the risk at specific exposure doses for chemical mixtures; the assumption in this approach is to assign de minimis risk at the MRL. The assigned risk enables the estimation of parameters in an exponential model, providing a complete dose-response curve for each compound from the chosen point of departure to zero. We estimated parameters for 27 chemicals. The value of k, which determines the shape of the dose-response curve, was moderately insensitive to the choice of the risk at the MRL. The approach presented here allows for the calculation of the risk from a single substance or the combined risk from multiple chemical exposures in a community. The methodology is applicable to points of departure derived from quantal data, such as benchmark dose analyses, or from data that can be transformed into probabilities, such as the lowest-observed-adverse-effect level. The individual risks are used to calculate risk ratios that can facilitate comparison and cost-benefit analyses of environmental contamination control strategies.

  10. Breast Cancer Risk Estimation with Artificial Neural Networks Revisited: Discrimination and Calibration

    PubMed Central

    Ayer, Turgay; Alagoz, Oguzhan; Chhatwal, Jagpreet; Shavlik, Jude W.; Kahn, Charles E.; Burnside, Elizabeth S.

    2010-01-01

    Background Discriminating malignant breast lesions from benign ones and accurately predicting the risk of breast cancer for individual patients are critical in successful clinical decision-making. In the past, several artificial neural network (ANN) models have been developed for breast cancer risk prediction. All of these studies reported discrimination performance, but none assessed calibration, which is an equally important measure for accurate risk prediction. In this study, we evaluated whether an ANN trained on a large prospectively collected dataset of consecutive mammography findings can discriminate between benign and malignant disease and accurately predict the probability of breast cancer for individual patients. Methods Our dataset consisted of 62,219 consecutively collected mammography findings matched with the Wisconsin State Cancer Reporting System. We built a three-layer feedforward ANN with 1000 hidden-layer nodes. We trained and tested our ANN using ten-fold cross validation to predict the risk of breast cancer. We used the area under the receiver operating characteristic curve (AUC), sensitivity, and specificity to evaluate the discriminative performance of the radiologists and our ANN. We assessed the accuracy of risk prediction (i.e. calibration) of our ANN using the Hosmer–Lemeshow (H-L) goodness-of-fit test. Results Our ANN demonstrated superior discrimination, AUC = 0.965, as compared to the radiologists, AUC = 0.939 (P < 0.001). Our ANN was also well-calibrated, as shown by an H-L goodness-of-fit P-value of 0.13. Conclusion Our ANN can effectively discriminate malignant abnormalities from benign ones and accurately predict the risk of breast cancer for individual abnormalities. PMID:20564067
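
    For readers unfamiliar with the two evaluation axes above, here is a minimal sketch on synthetic predictions of how discrimination (AUC) and calibration (Hosmer-Lemeshow across risk deciles) can be computed; the data, prevalence, and group count are illustrative, not the study's.

      import numpy as np
      from scipy.stats import chi2
      from sklearn.metrics import roc_auc_score

      rng = np.random.default_rng(1)
      y_prob = np.clip(rng.beta(2, 30, size=5000), 1e-3, 1 - 1e-3)  # synthetic predicted risks
      y_true = rng.binomial(1, y_prob)                              # outcomes drawn from those risks

      print("AUC:", roc_auc_score(y_true, y_prob))

      # Hosmer-Lemeshow: compare observed vs expected events within predicted-risk deciles
      edges = np.quantile(y_prob, np.linspace(0, 1, 11))
      groups = np.digitize(y_prob, edges[1:-1])
      hl = 0.0
      for g in range(10):
          mask = groups == g
          n_g, obs, exp = mask.sum(), y_true[mask].sum(), y_prob[mask].sum()
          hl += (obs - exp) ** 2 / (exp * (1.0 - exp / n_g))
      print("H-L p-value:", chi2.sf(hl, df=8))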

  11. Spatial Estimation of Populations at Risk from Radiological Dispersion Device Terrorism Incidents

    SciTech Connect

    Regens, J.L.; Gunter, J.T.

    2008-07-01

    Delineation of the location and size of the population potentially at risk of exposure to ionizing radiation is one of the key analytical challenges in accurately estimating the severity of the potential health effects associated with a radiological terrorism incident. Regardless of spatial scale, the geographical units for which population data commonly are collected rarely coincide with the geographical scale necessary for effective incident management and medical response. This paper identifies major government and commercial open sources of U.S. population data and presents a GIS-based approach for allocating publicly available population data, including age distributions, to geographical units appropriate for planning and implementing incident management and medical response strategies. In summary: The gravity model offers a straightforward, empirical tool for estimating population flows, especially when geographical areas are relatively well-defined in terms of accessibility and spatial separation. This is particularly important for several reasons. First, the spatial scale for the area impacted by an RDD terrorism event is unlikely to fully match the spatial scale of available population data. That is, the plume spread typically will not uniformly overlay the impacted area. Second, the number of people within the impacted area varies as a function of whether an attack occurs during the day or night. For example, the population of a central business district or industrial area typically is larger during the day, while predominately residential areas have larger night-time populations. As a result, interpolation techniques are needed that link population data to geographical units and allocate those data by time frame at a spatial scale relevant to enhancing preparedness and response. The gravity model's main advantage is that it efficiently allocates readily available, open source population data to geographical units appropriate for planning and implementing
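
    The abstract does not specify the gravity model's exact form, so the sketch below assumes a common textbook variant in which a source population is allocated to destination zones in proportion to zone attractiveness and inversely to a power of distance; all numbers and names are hypothetical.

      import numpy as np

      def gravity_allocation(source_pop, attractiveness, distance_km, beta=2.0):
          """Allocate source_pop across zones proportional to attractiveness / distance**beta."""
          weights = attractiveness / np.power(distance_km, beta)
          return source_pop * weights / weights.sum()

      daytime_jobs = np.array([12_000.0, 4_500.0, 800.0])   # hypothetical zone attractiveness
      distance_km = np.array([2.0, 5.0, 9.0])               # distance from a residential block to each zone
      print(gravity_allocation(3_000, daytime_jobs, distance_km))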

  12. ESTIMATING RISK TO CALIFORNIA ENERGY INFRASTRUCTURE FROM PROJECTED CLIMATE CHANGE

    SciTech Connect

    Sathaye, Jayant; Dale, Larry; Larsen, Peter; Fitts, Gary; Koy, Kevin; Lewis, Sarah; Lucena, Andre

    2011-06-22

    This report outlines the results of a study of the impact of climate change on the energy infrastructure of California and the San Francisco Bay region, including impacts on power plant generation; transmission line and substation capacity during heat spells; wildfires near transmission lines; sea level encroachment upon power plants, substations, and natural gas facilities; and peak electrical demand. Some end-of-century impacts were projected: Expected warming will decrease gas-fired generator efficiency. The maximum statewide coincident loss is projected at 10.3 gigawatts (with current power plant infrastructure and population), an increase of 6.2 percent over current temperature-induced losses. By the end of the century, electricity demand for almost all summer days is expected to exceed the current ninetieth percentile per-capita peak load. As much as 21 percent growth is expected in ninetieth percentile peak demand (per-capita, exclusive of population growth). When generator losses are included in the demand, the ninetieth percentile peaks may increase up to 25 percent. As the climate warms, California's peak supply capacity will need to grow faster than the population. Substation capacity is projected to decrease an average of 2.7 percent. A 5°C (9°F) air temperature increase (the average increase predicted for hot days in August) will diminish the capacity of a fully-loaded transmission line by an average of 7.5 percent. The potential exposure of transmission lines to wildfire is expected to increase with time. We have identified some lines whose probability of exposure to fire is expected to increase by as much as 40 percent. Up to 25 coastal power plants and 86 substations are at risk of flooding (or partial flooding) due to sea level rise.

  13. Comparison of Paper-and-Pencil versus Web Administration of the Youth Risk Behavior Survey (YRBS): Risk Behavior Prevalence Estimates

    ERIC Educational Resources Information Center

    Eaton, Danice K.; Brener, Nancy D.; Kann, Laura; Denniston, Maxine M.; McManus, Tim; Kyle, Tonja M.; Roberts, Alice M.; Flint, Katherine H.; Ross, James G.

    2010-01-01

    The authors examined whether paper-and-pencil and Web surveys administered in the school setting yield equivalent risk behavior prevalence estimates. Data were from a methods study conducted by the Centers for Disease Control and Prevention (CDC) in spring 2008. Intact classes of 9th- or 10th-grade students were assigned randomly to complete a…

  14. Estimation of cancer risks and benefits associated with a potential increased consumption of fruits and vegetables.

    PubMed

    Reiss, Richard; Johnston, Jason; Tucker, Kevin; DeSesso, John M; Keen, Carl L

    2012-12-01

    The current paper provides an analysis of the potential number of cancer cases that might be prevented if half the U.S. population increased its fruit and vegetable consumption by one serving each per day. This number is contrasted with an upper-bound estimate of concomitant cancer cases that might be theoretically attributed to the intake of pesticide residues arising from the same additional fruit and vegetable consumption. The cancer prevention estimates were derived using a published meta-analysis of nutritional epidemiology studies. The cancer risks were estimated using U.S. Environmental Protection Agency (EPA) methods, cancer potency estimates from rodent bioassays, and pesticide residue sampling data from the U.S. Department of Agriculture (USDA). The resulting estimates are that approximately 20,000 cancer cases per year could be prevented by increasing fruit and vegetable consumption, while up to 10 cancer cases per year could be caused by the added pesticide consumption. These estimates have significant uncertainties (e.g., potential residual confounding in the fruit and vegetable epidemiologic studies and reliance on rodent bioassays for cancer risk). However, the overwhelming difference between benefit and risk estimates provides confidence that consumers should not be concerned about cancer risks from consuming conventionally-grown fruits and vegetables.
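
    A back-of-the-envelope sketch of the benefit-versus-risk contrast described above; the population size, baseline incidence, relative reduction, and residue unit risk below are assumed placeholders chosen only to reproduce the order-of-magnitude contrast, not the paper's inputs.

      # Illustrative inputs only; the paper's meta-analysis and EPA residue calculations are far more detailed.
      population = 150_000_000                # roughly half the U.S. population
      baseline_annual_cancer_rate = 0.004     # assumed annual cancer incidence per person
      relative_risk_reduction = 0.033         # assumed ~3% relative reduction from one extra serving/day
      lifetime_residue_unit_risk = 5e-6       # assumed upper-bound lifetime risk from the added residue intake

      prevented_per_year = population * baseline_annual_cancer_rate * relative_risk_reduction
      caused_per_year = population * lifetime_residue_unit_risk / 70   # spread the lifetime risk over ~70 y

      print(f"prevented ~{prevented_per_year:,.0f} cases/year vs caused ~{caused_per_year:,.1f} cases/year")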

  15. Assessment of the value of a genetic risk score in improving the estimation of coronary risk

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The American Heart Association has established criteria for the evaluation of novel markers of cardiovascular risk. In accordance with these criteria, we assessed the association between a multi-locus genetic risk score (GRS) and incident coronary heart disease (CHD), and evaluated whether this GRS ...

  16. Mobile Applications for Type 2 Diabetes Risk Estimation: a Systematic Review.

    PubMed

    Fijacko, Nino; Brzan, Petra Povalej; Stiglic, Gregor

    2015-10-01

    Screening for chronic diseases like type 2 diabetes can be done using different methods and various risk tests. This study presents a review of type 2 diabetes risk estimation mobile applications, focusing on their functionality and the availability of information on the underlying risk calculators. Only 9 out of 31 reviewed mobile applications, featured in three major mobile application stores, disclosed the name of the risk calculator used for assessing the risk of type 2 diabetes. Even more concerning, none of the reviewed applications mentioned that they collect data from users to improve the performance of their risk estimation calculators or offer users descriptive statistics of the results from users who have already used the application. For that purpose, the questionnaires used for calculation of risk should be upgraded by including information on the most recent blood sugar level measurements from users. Although mobile applications represent great future potential for health applications, developers still do not put enough emphasis on informing the user of the underlying methods used to estimate the risk for a specific clinical condition. PMID:26303152

  17. Mobile Applications for Type 2 Diabetes Risk Estimation: a Systematic Review.

    PubMed

    Fijacko, Nino; Brzan, Petra Povalej; Stiglic, Gregor

    2015-10-01

    Screening for chronic diseases like type 2 diabetes can be done using different methods and various risk tests. This study presents a review of type 2 diabetes risk estimation mobile applications, focusing on their functionality and the availability of information on the underlying risk calculators. Only 9 out of 31 reviewed mobile applications, featured in three major mobile application stores, disclosed the name of the risk calculator used for assessing the risk of type 2 diabetes. Even more concerning, none of the reviewed applications mentioned that they collect data from users to improve the performance of their risk estimation calculators or offer users descriptive statistics of the results from users who have already used the application. For that purpose, the questionnaires used for calculation of risk should be upgraded by including information on the most recent blood sugar level measurements from users. Although mobile applications represent great future potential for health applications, developers still do not put enough emphasis on informing the user of the underlying methods used to estimate the risk for a specific clinical condition.

  18. Mathematical Models for Estimating the Risks of Bovine Spongiform Encephalopathy (BSE).

    PubMed

    Al-Zoughool, Mustafa; Cottrell, David; Elsaadany, Susie; Murray, Noel; Oraby, Tamer; Smith, Robert; Krewski, Daniel

    2015-01-01

    When the bovine spongiform encephalopathy (BSE) epidemic first emerged in the United Kingdom in the mid 1980s, the etiology of animal prion diseases was largely unknown. Risk management efforts to control the disease were also subject to uncertainties regarding the extent of BSE infections and the future course of the epidemic. As understanding of BSE increased, mathematical models were developed to estimate the risk of BSE infection and to predict reductions in risk in response to BSE control measures. Risk models of BSE-transmission dynamics determined disease persistence in cattle herds and the relative infectivity of cattle prior to the onset of clinical disease. These BSE models helped in understanding key epidemiological features of BSE transmission and dynamics, such as the incubation period distribution and age-dependent susceptibility to infection with the BSE agent. This review summarizes different mathematical models and methods that have been used to estimate the risk of BSE, and discusses how such risk projection models have informed risk assessment and management of BSE. This review also provides some general insights on how mathematical models of the type discussed here may be used to estimate risks of emerging zoonotic diseases when biological data on transmission of the etiological agent are limited.

  19. Mathematical Models for Estimating the Risks of Bovine Spongiform Encephalopathy (BSE).

    PubMed

    Al-Zoughool, Mustafa; Cottrell, David; Elsaadany, Susie; Murray, Noel; Oraby, Tamer; Smith, Robert; Krewski, Daniel

    2015-01-01

    When the bovine spongiform encephalopathy (BSE) epidemic first emerged in the United Kingdom in the mid 1980s, the etiology of animal prion diseases was largely unknown. Risk management efforts to control the disease were also subject to uncertainties regarding the extent of BSE infections and the future course of the epidemic. As understanding of BSE increased, mathematical models were developed to estimate the risk of BSE infection and to predict reductions in risk in response to BSE control measures. Risk models of BSE-transmission dynamics determined disease persistence in cattle herds and the relative infectivity of cattle prior to the onset of clinical disease. These BSE models helped in understanding key epidemiological features of BSE transmission and dynamics, such as the incubation period distribution and age-dependent susceptibility to infection with the BSE agent. This review summarizes different mathematical models and methods that have been used to estimate the risk of BSE, and discusses how such risk projection models have informed risk assessment and management of BSE. This review also provides some general insights on how mathematical models of the type discussed here may be used to estimate risks of emerging zoonotic diseases when biological data on transmission of the etiological agent are limited. PMID:26158300

  20. Prophylactic radiotherapy against heterotopic ossification following internal fixation of acetabular fractures: a comparative estimate of risk

    PubMed Central

    Nasr, P; Yip, G; Scaife, J E; House, T; Thomas, S J; Harris, F; Owen, P J; Hull, P

    2014-01-01

    Objective: Radiotherapy (RT) is effective in preventing heterotopic ossification (HO) around acetabular fractures requiring surgical reconstruction. We audited outcomes and estimated risks from RT prophylaxis and the alternatives of indometacin or no prophylaxis. Methods: 34 patients underwent reconstruction of acetabular fractures through a posterior approach, followed by an 8-Gy single fraction. The mean age was 44 years. The mean time from surgery to RT was 1.1 days. The major RT risk is radiation-induced fatal cancer. The International Commission on Radiological Protection (ICRP) method was used to estimate risk, and compared with a method (Trott and Kemprad) specifically for estimating RT risk for benign disease. These were compared with risks associated with indometacin and no prophylaxis. Results: 28 patients (82%) developed no HO; 6 developed Brooker Class I; and none developed Class II–IV HO. The ICRP method suggests a risk of fatal cancer in the range of 1 in 1000 to 1 in 10,000; the Trott and Kemprad method suggests 1 in 3000. For younger patients, this may rise to 1 in 2000; and for elderly patients, it may fall to 1 in 6000. The risk of death from gastric bleeding or perforation from indometacin is 1 in 180 to 1 in 900 in older patients. Without prophylaxis, the risk of death from reoperation to remove HO is 1 in 4000 to 1 in 30,000. Conclusion: These results are encouraging, are consistent with much larger series, and endorse our multidisciplinary management. Risk estimates can be used in discussion with patients. Advances in knowledge: The risk from RT prophylaxis is small; it is safer than indometacin and substantially overlaps with the range for no prophylaxis. PMID:25089852

  1. Estimating risk at a Superfund site using passive sampling devices as biological surrogates in human health risk models.

    PubMed

    Allan, Sarah E; Sower, Gregory J; Anderson, Kim A

    2011-10-01

    Passive sampling devices (PSDs) sequester the freely dissolved fraction of lipophilic contaminants, mimicking passive chemical uptake and accumulation by biomembranes and lipid tissues. Public Health Assessments that inform the public about health risks from exposure to contaminants through consumption of resident fish are generally based on tissue data, which can be difficult to obtain and requires destructive sampling. The purpose of this study is to apply PSD data in a Public Health Assessment to demonstrate that PSDs can be used as a biological surrogate to evaluate potential human health risks and elucidate spatio-temporal variations in risk. PSDs were used to measure polycyclic aromatic hydrocarbons (PAHs) in the Willamette River; upriver, downriver and within the Portland Harbor Superfund megasite for 3 years during wet and dry seasons. Based on an existing Public Health Assessment for this area, concentrations of PAHs in PSDs were substituted for fish tissue concentrations. PSD measured PAH concentrations captured the magnitude, range and variability of PAH concentrations reported for fish/shellfish from Portland Harbor. Using PSD results in place of fish data revealed an unacceptable risk level for cancer in all seasons but no unacceptable risk for non-cancer endpoints. Estimated cancer risk varied by several orders of magnitude based on season and location. Sites near coal tar contamination demonstrated the highest risk, particularly during the dry season and remediation activities. Incorporating PSD data into Public Health Assessments provides specific spatial and temporal contaminant exposure information that can assist public health professionals in evaluating human health risks.
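
    As a sketch of how a PSD-derived concentration can be run through a standard EPA-style ingestion risk equation in place of fish-tissue data, the following uses assumed exposure parameters and an assumed slope factor; none of these values are taken from the Public Health Assessment discussed above.

      def excess_cancer_risk(conc_mg_per_kg, intake_kg_per_day, slope_factor,
                             exposure_years=30, body_weight_kg=70, lifetime_years=70):
          """EPA-style lifetime excess cancer risk from fish ingestion."""
          chronic_daily_intake = (conc_mg_per_kg * intake_kg_per_day * exposure_years) / \
                                 (body_weight_kg * lifetime_years)   # mg/kg-day, averaged over a lifetime
          return chronic_daily_intake * slope_factor

      # hypothetical PSD-derived benzo[a]pyrene-equivalent concentration and an assumed oral slope factor
      print(excess_cancer_risk(conc_mg_per_kg=0.002, intake_kg_per_day=0.017, slope_factor=7.3))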

  2. Estimating risk at a Superfund site using passive sampling devices as biological surrogates in human health risk models

    PubMed Central

    Allan, Sarah E.; Sower, Gregory J.; Anderson, Kim A.

    2013-01-01

    Passive sampling devices (PSDs) sequester the freely dissolved fraction of lipophilic contaminants, mimicking passive chemical uptake and accumulation by biomembranes and lipid tissues. Public Health Assessments that inform the public about health risks from exposure to contaminants through consumption of resident fish are generally based on tissue data, which can be difficult to obtain and require destructive sampling. The purpose of this study is to apply PSD data in a Public Health Assessment to demonstrate that PSDs can be used as a biological surrogate to evaluate potential human health risks and elucidate spatio-temporal variations in risk. PSDs were used to measure polycyclic aromatic hydrocarbons (PAHs) in the Willamette River; upriver, downriver and within the Portland Harbor Superfund megasite for three years during wet and dry seasons. Based on an existing Public Health Assessment for this area, concentrations of PAHs in PSDs were substituted for fish tissue concentrations. PSD measured PAH concentrations captured the magnitude, range and variability of PAH concentrations reported for fish/shellfish from Portland Harbor. Using PSD results in place of fish data revealed an unacceptable risk level for cancer in all seasons but no unacceptable risk for non-cancer endpoints. Estimated cancer risk varied by several orders of magnitude based on season and location. Sites near coal tar contamination demonstrated the highest risk, particularly during the dry season and remediation activities. Incorporating PSD data into Public Health Assessments provides specific spatial and temporal contaminant exposure information that can assist public health professionals in evaluating human health risks. PMID:21741671

  3. Inhalation exposure or body burden? Better way of estimating risk--An application of PBPK model.

    PubMed

    Majumdar, Dipanjali; Dutta, Chirasree; Sen, Subha

    2016-01-01

    We aim to establish a new way of estimating the risk from internal dose or body burden due to benzene exposure in human subjects, utilizing a physiologically based pharmacokinetic (PBPK) model. We also intend to verify its applicability to human subjects exposed to different levels of benzene. We estimated personal inhalation exposure to benzene for two occupational groups, namely petrol pump workers and car drivers, with respect to a control group that was only environmentally exposed. Benzene in personal air was pre-concentrated on charcoal followed by chemical desorption and analysis by gas chromatography equipped with a flame ionization detector (GC-FID). We selected urinary trans,trans-muconic acid (t,t-MA) as the biomarker of benzene exposure and measured its concentration using solid phase extraction followed by high performance liquid chromatography (HPLC). Our estimated inhalation exposure to benzene was 137.5, 97.9 and 38.7 μg/m³ for petrol pump workers, car drivers and the environmentally exposed control group respectively, which resulted in urinary t,t-MA levels of 145.4±55.3, 112.6±63.5 and 60.0±34.9 μg g⁻¹ of creatinine for the groups in the same order. We deduced a derivation for estimation of body burden from urinary metabolite concentration using the PBPK model. Estimation of the internal dose or body burden of benzene in human subjects has been made for the first time by the measurement of t,t-MA as a urinary metabolite using a physiologically based pharmacokinetic (PBPK) model as a tool. The weight-adjusted total body burden of benzene was estimated to be 17.6, 11.1 and 5.0 μg kg⁻¹ of body weight for petrol pump workers, drivers and the environmentally exposed control group, respectively, using this method. We computed the carcinogenic risk using both the estimated internal benzene body burden and external exposure values using the conventional method. Our study results show that the internal dose or body burden is not proportional to the level of exposure rather have a

  4. Inhalation exposure or body burden? Better way of estimating risk--An application of PBPK model.

    PubMed

    Majumdar, Dipanjali; Dutta, Chirasree; Sen, Subha

    2016-01-01

    We aim to establish a new way of estimating the risk from internal dose or body burden due to benzene exposure in human subjects, utilizing a physiologically based pharmacokinetic (PBPK) model. We also intend to verify its applicability to human subjects exposed to different levels of benzene. We estimated personal inhalation exposure to benzene for two occupational groups, namely petrol pump workers and car drivers, with respect to a control group that was only environmentally exposed. Benzene in personal air was pre-concentrated on charcoal followed by chemical desorption and analysis by gas chromatography equipped with a flame ionization detector (GC-FID). We selected urinary trans,trans-muconic acid (t,t-MA) as the biomarker of benzene exposure and measured its concentration using solid phase extraction followed by high performance liquid chromatography (HPLC). Our estimated inhalation exposure to benzene was 137.5, 97.9 and 38.7 μg/m³ for petrol pump workers, car drivers and the environmentally exposed control group respectively, which resulted in urinary t,t-MA levels of 145.4±55.3, 112.6±63.5 and 60.0±34.9 μg g⁻¹ of creatinine for the groups in the same order. We deduced a derivation for estimation of body burden from urinary metabolite concentration using the PBPK model. Estimation of the internal dose or body burden of benzene in human subjects has been made for the first time by the measurement of t,t-MA as a urinary metabolite using a physiologically based pharmacokinetic (PBPK) model as a tool. The weight-adjusted total body burden of benzene was estimated to be 17.6, 11.1 and 5.0 μg kg⁻¹ of body weight for petrol pump workers, drivers and the environmentally exposed control group, respectively, using this method. We computed the carcinogenic risk using both the estimated internal benzene body burden and external exposure values using the conventional method. Our study results show that the internal dose or body burden is not proportional to the level of exposure rather have a

  5. Obesity phenotype and coronary heart disease risk as estimated by the Framingham risk score.

    PubMed

    Park, Yong Soon; Kim, Jun-Su

    2012-03-01

    There are conflicting data as to whether general or abdominal obesity is a better predictor of cardiovascular risk. This cross-sectional study involved 4,573 subjects aged 30 to 74 yr who participated in the Fourth Korea National Health and Nutrition Examination Survey conducted in 2008. Obesity phenotype was classified by means of body mass index (BMI) and waist circumference (WC), and participants were categorized into 4 groups. Individuals' 10-yr risk of coronary heart disease (CHD) was determined from the Framingham risk score. Subjects with obese WC had a higher proportion of high risk for CHD compared to the normal WC group, irrespective of BMI level. Relative to subjects with normal BMI/normal WC, the adjusted odds ratios (ORs) of the normal BMI/obese WC group (OR 2.93 [1.70, 5.04] and OR 3.10 [1.49, 6.46]) for CHD risk in males were higher than those of the obese BMI/obese WC group (OR 1.91 [1.40, 2.61] and OR 1.70 [1.16, 2.47]), whereas the adjusted ORs of the obese BMI/obese WC group (OR 1.94 [1.24, 3.04] and OR 3.92 [1.75, 8.78]) were higher than the others in females. Subjects with obese BMI/normal WC were not significantly associated with 10-yr CHD risk in men (P = 0.449 and P = 0.067) or women (P = 0.702 and P = 0.658). WC is associated with increased CHD risk regardless of the level of BMI. Men with normal BMI and obese WC tend to be more strongly associated with CHD risk than those with obese BMI and obese WC.
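
    A minimal sketch, on synthetic data, of how adjusted odds ratios for a high Framingham-risk outcome can be estimated across obesity phenotype indicators with logistic regression; the variables, effect sizes, and covariates below are invented, not the survey's.

      import numpy as np
      import pandas as pd
      import statsmodels.api as sm

      rng = np.random.default_rng(7)
      n = 4573
      df = pd.DataFrame({
          "obese_wc": rng.binomial(1, 0.30, n),
          "obese_bmi": rng.binomial(1, 0.35, n),
          "age": rng.integers(30, 75, n),
      })
      # synthetic outcome: high 10-yr CHD risk generated with assumed effect sizes
      logit_p = -3.0 + 0.03 * (df["age"] - 50) + 1.0 * df["obese_wc"] + 0.2 * df["obese_bmi"]
      df["high_chd_risk"] = rng.binomial(1, (1.0 / (1.0 + np.exp(-logit_p))).to_numpy())

      X = sm.add_constant(df[["obese_wc", "obese_bmi", "age"]])
      fit = sm.Logit(df["high_chd_risk"], X).fit(disp=0)
      print(np.exp(fit.params))   # adjusted odds ratios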

  6. FY 2000 Buildings Energy Savings Estimates under Uncertainty: Developing Approaches for Incorporating Risk into Buildings Program Energy Efficiency Estimates

    SciTech Connect

    Anderson, Dave M.

    2002-11-18

    This report is one of two that re-examines the forecasted impact of individual programs currently within the Buildings Technology Program (BT) and the Weatherization and Intergovernmental Program (WIP) that appeared in the FY2000 Presidential Budget request. This report develops potential methods for allowing inherent risk to be captured in the program benefits analysis. Note that the FY2000 budget request was originally analyzed under the former Office of Building Technology, State and Community Programs (BTS), where BT and WIP were previously combined. Throughout the document, reference will be made to the predecessor of the BT and WIP programs, BTS, as FY2000 reflected that organization. A companion report outlines the effects of re-estimating the FY 2000 budget request based on overlaying program data from subsequent years, essentially revised out-year forecasts. That report shows that year-to-year long-term projections of primary energy savings can vary widely as models improve and programs change. Those point estimates are not influenced by uncertainty or risk. This report develops potential methods for allowing inherent risk to affect the benefits analysis via Monte Carlo simulation.

  7. Accounting for Ecosystem Alteration Doubles Estimates of Conservation Risk in the Conterminous United States

    PubMed Central

    Swaty, Randy; Blankenship, Kori; Hagen, Sarah; Fargione, Joseph; Smith, Jim; Patton, Jeannie

    2011-01-01

    Previous national and global conservation assessments have relied on habitat conversion data to quantify conservation risk. However, in addition to habitat conversion to crop production or urban uses, ecosystem alteration (e.g., from logging, conversion to plantations, biological invasion, or fire suppression) is a large source of conservation risk. We add data quantifying ecosystem alteration on unconverted lands to arrive at a more accurate depiction of conservation risk for the conterminous United States. We quantify ecosystem alteration using a recent national assessment based on remote sensing of current vegetation compared with modeled reference natural vegetation conditions. Highly altered (but not converted) ecosystems comprise 23% of the conterminous United States, such that the number of critically endangered ecoregions in the United States is 156% higher than when calculated using habitat conversion data alone. Increased attention to natural resource management will be essential to address widespread ecosystem alteration and reduce conservation risk. PMID:21850248

  8. Estimated Risk Level of Unified Stereotactic Body Radiation Therapy Dose Tolerance Limits for Spinal Cord.

    PubMed

    Grimm, Jimm; Sahgal, Arjun; Soltys, Scott G; Luxton, Gary; Patel, Ashish; Herbert, Scott; Xue, Jinyu; Ma, Lijun; Yorke, Ellen; Adler, John R; Gibbs, Iris C

    2016-04-01

    A literature review of more than 200 stereotactic body radiation therapy spine articles from the past 20 years found only a single article that provided dose-volume data and outcomes for each spinal cord of a clinical dataset: the Gibbs 2007 article (Gibbs et al, 2007(1)), which essentially contains the first 100 stereotactic body radiation therapy (SBRT) spine treatments from Stanford University Medical Center. The dataset is modeled and compared in detail to the rest of the literature review, which found 59 dose tolerance limits for the spinal cord in 1-5 fractions. We partitioned these limits into a unified format of high-risk and low-risk dose tolerance limits. To estimate the corresponding risk level of each limit we used the Gibbs 2007 clinical spinal cord dose-volume data for 102 spinal metastases in 74 patients treated by spinal radiosurgery. In all, 50 of the patients were previously irradiated to a median dose of 40Gy in 2-3Gy fractions and 3 patients developed treatment-related myelopathy. These dose-volume data were digitized into the dose-volume histogram (DVH) Evaluator software tool where parameters of the probit dose-response model were fitted using the maximum likelihood approach (Jackson et al, 1995(3)). Based on this limited dataset, for de novo cases the unified low-risk dose tolerance limits yielded an estimated risk of spinal cord injury of ≤1% in 1-5 fractions, and the high-risk limits yielded an estimated risk of ≤3%. The QUANTEC Dmax limits of 13Gy in a single fraction and 20Gy in 3 fractions had less than 1% risk estimated from this dataset, so we consider these among the low-risk limits. In the previously irradiated cohort, the estimated risk levels for 10 and 14Gy maximum cord dose limits in 5 fractions are 0.4% and 0.6%, respectively. Longer follow-up and more patients are required to improve the risk estimates and provide more complete validation. PMID:27000514
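
    A minimal sketch of fitting a probit dose-response model by maximum likelihood on synthetic dose-complication data, in the spirit of the DVH Evaluator fit described above; the doses, "true" parameters, and starting values are illustrative only.

      import numpy as np
      from scipy.optimize import minimize
      from scipy.stats import norm

      rng = np.random.default_rng(3)
      dose = rng.uniform(8, 24, size=100)                    # hypothetical maximum cord doses (Gy)
      p_true = norm.cdf((dose - 25.0) / (0.2 * 25.0))        # assumed underlying dose response
      events = rng.binomial(1, p_true)                       # simulated complication outcomes

      def neg_log_likelihood(params):
          td50, m = params
          p = np.clip(norm.cdf((dose - td50) / (m * td50)), 1e-9, 1 - 1e-9)
          return -np.sum(events * np.log(p) + (1 - events) * np.log(1 - p))

      fit = minimize(neg_log_likelihood, x0=[20.0, 0.3], method="Nelder-Mead")
      print("fitted TD50 = %.1f Gy, m = %.2f" % tuple(fit.x))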

  9. Estimated Risk Level of Unified Stereotactic Body Radiation Therapy Dose Tolerance Limits for Spinal Cord.

    PubMed

    Grimm, Jimm; Sahgal, Arjun; Soltys, Scott G; Luxton, Gary; Patel, Ashish; Herbert, Scott; Xue, Jinyu; Ma, Lijun; Yorke, Ellen; Adler, John R; Gibbs, Iris C

    2016-04-01

    A literature review of more than 200 stereotactic body radiation therapy spine articles from the past 20 years found only a single article that provided dose-volume data and outcomes for each spinal cord of a clinical dataset: the Gibbs 2007 article (Gibbs et al, 2007(1)), which essentially contains the first 100 stereotactic body radiation therapy (SBRT) spine treatments from Stanford University Medical Center. The dataset is modeled and compared in detail to the rest of the literature review, which found 59 dose tolerance limits for the spinal cord in 1-5 fractions. We partitioned these limits into a unified format of high-risk and low-risk dose tolerance limits. To estimate the corresponding risk level of each limit we used the Gibbs 2007 clinical spinal cord dose-volume data for 102 spinal metastases in 74 patients treated by spinal radiosurgery. In all, 50 of the patients were previously irradiated to a median dose of 40Gy in 2-3Gy fractions and 3 patients developed treatment-related myelopathy. These dose-volume data were digitized into the dose-volume histogram (DVH) Evaluator software tool where parameters of the probit dose-response model were fitted using the maximum likelihood approach (Jackson et al, 1995(3)). Based on this limited dataset, for de novo cases the unified low-risk dose tolerance limits yielded an estimated risk of spinal cord injury of ≤1% in 1-5 fractions, and the high-risk limits yielded an estimated risk of ≤3%. The QUANTEC Dmax limits of 13Gy in a single fraction and 20Gy in 3 fractions had less than 1% risk estimated from this dataset, so we consider these among the low-risk limits. In the previously irradiated cohort, the estimated risk levels for 10 and 14Gy maximum cord dose limits in 5 fractions are 0.4% and 0.6%, respectively. Longer follow-up and more patients are required to improve the risk estimates and provide more complete validation.

  10. Measurement of natural radionuclides in Malaysian bottled mineral water and consequent health risk estimation

    SciTech Connect

    Priharti, W.; Samat, S. B.; Yasir, M. S.

    2015-09-25

    The radionuclides ²²⁶Ra, ²³²Th and ⁴⁰K were measured in ten mineral water samples; from the radioactivity obtained, the ingestion doses for infants, children and adults were calculated and the cancer risk for adults was estimated. Results showed that the calculated ingestion doses for the three age categories are much lower than the average worldwide ingestion exposure of 0.29 mSv/y and the estimated cancer risk is much lower than the cancer risk of 8.40 × 10⁻³ (estimated from the total natural radiation dose of 2.40 mSv/y). The present study concludes that the bottled mineral water produced in Malaysia is safe for daily human consumption.
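
    A minimal sketch of the ingestion-dose and linear-risk arithmetic implied above; the activity concentrations are hypothetical, and the dose coefficients and detriment factor are only order-of-magnitude placeholders that should be checked against ICRP/UNSCEAR tables before any real use.

      annual_intake_l = 730                     # assumed ~2 L/day consumption
      activity_bq_per_l = {"Ra-226": 0.05, "Th-232": 0.03, "K-40": 1.2}           # hypothetical measurements
      dose_coeff_sv_per_bq = {"Ra-226": 2.8e-7, "Th-232": 2.3e-7, "K-40": 6.2e-9}  # adult ingestion, approximate

      annual_dose_sv = sum(activity_bq_per_l[nuc] * dose_coeff_sv_per_bq[nuc] * annual_intake_l
                           for nuc in activity_bq_per_l)
      lifetime_risk = annual_dose_sv * 70 * 0.055   # ~70-y intake, nominal detriment ~5.5%/Sv
      print("annual ingestion dose: %.3f mSv" % (annual_dose_sv * 1e3))
      print("estimated lifetime cancer risk: %.1e" % lifetime_risk)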

  11. Measurement of natural radionuclides in Malaysian bottled mineral water and consequent health risk estimation

    NASA Astrophysics Data System (ADS)

    Priharti, W.; Samat, S. B.; Yasir, M. S.

    2015-09-01

    The radionuclides of 226Ra, 232Th and 40K were measured in ten mineral water samples, of which from the radioactivity obtained, the ingestion doses for infants, children and adults were calculated and the cancer risk for the adult was estimated. Results showed that the calculated ingestion doses for the three age categories are much lower than the average worldwide ingestion exposure of 0.29 mSv/y and the estimated cancer risk is much lower than the cancer risk of 8.40 × 10-3 (estimated from the total natural radiation dose of 2.40 mSv/y). The present study concludes that the bottled mineral water produced in Malaysia is safe for daily human consumption.

  12. Estimates of radiation doses in tissue and organs and risk of excess cancer in the single-course radiotherapy patients treated for ankylosing spondylitis in England and Wales

    SciTech Connect

    Fabrikant, J.I.; Lyman, J.T.

    1982-02-01

    The estimates of absorbed doses of x rays and excess risk of cancer in bone marrow and heavily irradiated sites are extremely crude and are based on very limited data and on a number of assumptions. Some of these assumptions may later prove to be incorrect, but it is probable that they are correct to within a factor of 2. The excess cancer risk estimates calculated compare well with the most reliable epidemiological surveys thus far studied. This is particularly important for cancers of heavily irradiated sites with long latent periods. The mean followup period for the patients was 16.2 y, and an increase in cancers of heavily irradiated sites may appear in these patients in the 1970s in tissues and organs with long latent periods for the induction of cancer. The accuracy of these estimates is severely limited by the inadequacy of information on doses absorbed by the tissues at risk in the irradiated patients. The information on absorbed dose is essential for an accurate assessment of dose-cancer incidence analysis. Furthermore, in this valuable series of irradiated patients, the information on radiation dosimetry on the radiotherapy charts is central to any reliable determination of somatic risks of radiation with regard to carcinogenesis in man. The work necessary to obtain these data is under way; only when they are available can more precise estimates of risk of cancer induction by radiation in man be obtained.

  13. Consolidating Risk Estimates for Radiation-Induced Complications in Individual Patient: Late Rectal Toxicity

    SciTech Connect

    Prior, Phillip; Devisetty, Kiran; Tarima, Sergey S.; Lawton, Colleen A.F.; Semenenko, Vladimir A.

    2012-05-01

    Purpose: To test the feasibility of a new approach to synthesize published normal tissue complication data using late rectal toxicity in prostate cancer as an example. Methods and Materials: A data survey was performed to identify the published reports on the dose-response relationships for late rectal toxicity. The risk estimates for Grade 1 or greater, Grade 2 or greater, and Grade 3 or greater toxicity were obtained for a test cohort of patients treated at our institution. The influence of the potential factors that might have affected the reported toxicity levels was investigated. The studies that did not conform to the general data trends were excluded, and single, combined risk estimates were derived for each patient and toxicity level. Results: A total of 21 studies of nonoverlapping patient populations were identified. Three studies provided dose-response models for more than one level of toxicity. Of these 21 studies, 6, 14, and 5 were used to derive the initial risk estimates for Grade 1, 2, and 3 or greater toxicity, respectively. A comparison of risk estimates between the studies reporting rectal bleeding and rectal toxicity (bleeding plus other symptoms) or between studies with follow-up <36 months and ≥36 months did not reveal significant differences (p ≥ .29 for all comparisons). After excluding three reports that did not conform to the general data trends, the combined risk estimates were derived from 5 reports (647 patients), 11 reports (3,369 patients), and 5 reports (1,330 patients) for Grade 1, 2, and 3 or greater toxicity, respectively. Conclusions: The proposed approach is feasible and allows for more systematic use of published dose-response data to estimate the complication risks for the individual patient.
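
    The paper's exact combination rule is not given in the abstract; the sketch below assumes simple inverse-variance pooling of per-patient risk estimates from several published models, with made-up risks and cohort sizes, purely to illustrate the idea of deriving a single combined estimate.

      import numpy as np

      risk = np.array([0.08, 0.12, 0.10, 0.15, 0.09])      # hypothetical Grade >=2 risks for one patient
      n_patients = np.array([210, 560, 320, 1100, 400])     # cohort sizes behind each published model
      var = risk * (1 - risk) / n_patients                  # rough binomial variance for each estimate
      weights = 1.0 / var
      combined = np.sum(weights * risk) / np.sum(weights)
      print("combined risk estimate: %.3f" % combined)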

  14. The Estimate of Risk of Adolescent Sexual Offense Recidivism (ERASOR): preliminary psychometric data.

    PubMed

    Worling, James R

    2004-06-01

    The Estimate of Risk of Adolescent Sexual Offense Recidivism (ERASOR) is an empirically guided checklist designed to assist clinicians in estimating the short-term risk of a sexual reoffense for youth aged 12-18 years. The ERASOR provides objective coding instructions for 25 risk factors (16 dynamic and 9 static). To investigate the psychometric properties, risk ratings were collected from 28 clinicians who evaluated 136 adolescent males (aged 12-18 years) following comprehensive clinical assessments. Preliminary psychometric data (i.e., interrater agreement, item-total correlation, internal consistency) were found to be supportive of the reliability and item composition of the tool. ERASOR ratings also significantly discriminated adolescents based on whether or not they had previously been sanctioned for a prior sexual offense. PMID:15326883

  15. Performance of default risk model with barrier option framework and maximum likelihood estimation: Evidence from Taiwan

    NASA Astrophysics Data System (ADS)

    Chou, Heng-Chih; Wang, David

    2007-11-01

    We investigate the performance of a default risk model based on the barrier option framework with maximum likelihood estimation. We provide empirical validation of the model by showing that implied default barriers are statistically significant for a sample of construction firms in Taiwan over the period 1994-2004. We find that our model dominates the commonly adopted alternatives: the Merton model, the Z-score model and the ZETA model. Moreover, we test the n-year-ahead prediction performance of the model and find evidence that the prediction accuracy of the model improves as the forecast horizon decreases. Finally, we assess the effect of estimated default risk on equity returns and find that default risk is able to explain equity returns and that default risk is a variable worth considering in asset-pricing tests, above and beyond size and book-to-market.
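
    The study's barrier-option likelihood is more involved than can be shown here; as a minimal related sketch, the code below computes the probability that asset value falls below a debt barrier at the horizon in a Merton-style lognormal setting, with purely illustrative inputs.

      import numpy as np
      from scipy.stats import norm

      def default_probability(v0, barrier, mu, sigma, horizon):
          """P(V_T < barrier) when asset value V follows geometric Brownian motion."""
          d2 = (np.log(v0 / barrier) + (mu - 0.5 * sigma**2) * horizon) / (sigma * np.sqrt(horizon))
          return norm.cdf(-d2)

      print(default_probability(v0=120.0, barrier=100.0, mu=0.05, sigma=0.25, horizon=1.0))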

  16. A methodology for estimating risks associated with landslides of contaminated soil into rivers.

    PubMed

    Göransson, Gunnel; Norrman, Jenny; Larson, Magnus; Alén, Claes; Rosén, Lars

    2014-02-15

    Urban areas adjacent to surface water are exposed to soil movements such as erosion and slope failures (landslides). A landslide is a potential mechanism for mobilisation and spreading of pollutants. This mechanism is in general not included in environmental risk assessments for contaminated sites, and the consequences associated with contamination in the soil are typically not considered in landslide risk assessments. This study suggests a methodology to estimate the environmental risks associated with landslides in contaminated sites adjacent to rivers. The methodology is probabilistic and allows for datasets with large uncertainties and the use of expert judgements, providing quantitative estimates of probabilities for defined failures. The approach is illustrated by a case study along the river Göta Älv, Sweden, where failures are defined and probabilities for those failures are estimated. Failures are defined from a pollution perspective and in terms of exceeding environmental quality standards (EQSs) and acceptable contaminant loads. Models are then suggested to estimate probabilities of these failures. A landslide analysis is carried out to assess landslide probabilities based on data from a recent landslide risk classification study along the river Göta Älv. The suggested methodology is meant to be a supplement to either landslide risk assessment (LRA) or environmental risk assessment (ERA), providing quantitative estimates of the risks associated with landslide in contaminated sites. The proposed methodology can also act as a basis for communication and discussion, thereby contributing to intersectoral management solutions. From the case study it was found that the defined failures are governed primarily by the probability of a landslide occurring. The overall probabilities for failure are low; however, if a landslide occurs the probabilities of exceeding EQS are high and the probability of having at least a 10% increase in the contamination load

  17. A methodology for estimating risks associated with landslides of contaminated soil into rivers.

    PubMed

    Göransson, Gunnel; Norrman, Jenny; Larson, Magnus; Alén, Claes; Rosén, Lars

    2014-02-15

    Urban areas adjacent to surface water are exposed to soil movements such as erosion and slope failures (landslides). A landslide is a potential mechanism for mobilisation and spreading of pollutants. This mechanism is in general not included in environmental risk assessments for contaminated sites, and the consequences associated with contamination in the soil are typically not considered in landslide risk assessments. This study suggests a methodology to estimate the environmental risks associated with landslides in contaminated sites adjacent to rivers. The methodology is probabilistic and allows for datasets with large uncertainties and the use of expert judgements, providing quantitative estimates of probabilities for defined failures. The approach is illustrated by a case study along the river Göta Älv, Sweden, where failures are defined and probabilities for those failures are estimated. Failures are defined from a pollution perspective and in terms of exceeding environmental quality standards (EQSs) and acceptable contaminant loads. Models are then suggested to estimate probabilities of these failures. A landslide analysis is carried out to assess landslide probabilities based on data from a recent landslide risk classification study along the river Göta Älv. The suggested methodology is meant to be a supplement to either landslide risk assessment (LRA) or environmental risk assessment (ERA), providing quantitative estimates of the risks associated with landslide in contaminated sites. The proposed methodology can also act as a basis for communication and discussion, thereby contributing to intersectoral management solutions. From the case study it was found that the defined failures are governed primarily by the probability of a landslide occurring. The overall probabilities for failure are low; however, if a landslide occurs the probabilities of exceeding EQS are high and the probability of having at least a 10% increase in the contamination load

  18. Estimated human health risks of disposing of nonhazardous oil field waste in salt caverns

    SciTech Connect

    Tomasko, D.; Elcock, D.; Veil, J.

    1997-09-01

    Argonne National Laboratory (ANL) has completed an evaluation of the possibility that adverse human health effects (carcinogenic and noncarcinogenic) could result from exposure to contaminants released from nonhazardous oil field wastes (NOW) disposed in domal salt caverns. In this assessment, several steps were used to evaluate potential human health risks: identifying potential contaminants of concern, determining how humans could be exposed to these contaminants, assessing the contaminants' toxicities, estimating contaminant intakes, and, finally, calculating human cancer and noncancer risks.

  19. Examining the effects of air pollution composition on within region differences in PM2.5 mortality risk estimates

    EPA Science Inventory

    Multi-city population-based epidemiological studies have observed significant heterogeneity in both the magnitude and direction of city-specific risk estimates, but tended to focus on regional differences in PM2.5 mortality risk estimates. Interpreting differences in risk estimat...

  20. Estimation of flood risk for cultural heritage in an art city

    NASA Astrophysics Data System (ADS)

    Arrighi, Chiara; Brugioni, Marcello; Franceschini, Serena; Castelli, Fabio; Mazzanti, Bernardo

    2015-04-01

    Flood risk assessment in art cities poses many challenges because of the presence of cultural heritage at risk, a damage category whose value is hard to monetize. Valuing cultural assets is a complex task, usually requiring more effort than a rough estimation of restoration costs. The lack of an adequate risk evaluation of cultural assets may also create serious practical and political obstacles to implementing structural mitigation measures. The aim of the work is to perform a first analysis of the risk to cultural heritage while avoiding a full quantification of exposure. Here we present a case study of broad importance, the art city of Florence (Italy), affected by a devastating flood in 1966. In previous studies the estimated flood risk, neglecting damages to cultural heritage, was about 53 Mio€/year. Nevertheless, Florence hosts 176 buildings officially classified as cultural heritage and thousands of paintings, sculptures and ancient books. Proceeding similarly to the commonly accepted flood risk assessment method, the annual expected loss in terms of cultural heritage/artworks is estimated.
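
    The risk figure cited above is an expected annual loss, i.e. scenario damages integrated over annual exceedance probability. The sketch below shows that integration for a few flood scenarios; the return periods and damage values are hypothetical placeholders, not the Florence estimates.

```python
import numpy as np

# Expected annual damage (EAD): integrate scenario damages D(p) over annual
# exceedance probability p. Scenario damages below are hypothetical placeholders.

return_periods = np.array([30, 100, 200, 500])        # years
exceedance_p = 1.0 / return_periods                    # annual exceedance probability
damage_MEuro = np.array([20.0, 120.0, 300.0, 700.0])   # damage per scenario (assumed)

# probabilities must be increasing for the trapezoidal integration
order = np.argsort(exceedance_p)
ead = np.trapz(damage_MEuro[order], exceedance_p[order])
print(f"Expected annual damage ~ {ead:.1f} MEuro/year")
```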

  1. How accurately can students estimate their performance on an exam and how does this relate to their actual performance on the exam?

    NASA Astrophysics Data System (ADS)

    Rebello, N. Sanjay

    2012-02-01

    Research has shown that students' beliefs regarding their own abilities in math and science can influence their performance in these disciplines. I investigated the relationship between students' estimated and actual performance on five exams in a second-semester calculus-based physics class. Students were given about 72 hours after the completion of each of the five exams to estimate their individual score and the class mean score on each exam. They received extra credit worth 1% of the exam points for estimating their own score within 2% of the actual score, and another 1% of extra credit for estimating the class mean score within 2% of the correct value. I compared students' individual and class-mean score estimates with the actual scores to investigate the relationship between estimation accuracy and exam performance, as well as trends over the semester.

  2. Normal Tissue Complication Probability Estimation by the Lyman-Kutcher-Burman Method Does Not Accurately Predict Spinal Cord Tolerance to Stereotactic Radiosurgery

    SciTech Connect

    Daly, Megan E.; Luxton, Gary; Choi, Clara Y.H.; Gibbs, Iris C.; Chang, Steven D.; Adler, John R.; Soltys, Scott G.

    2012-04-01

    Purpose: To determine whether normal tissue complication probability (NTCP) analyses of the human spinal cord by use of the Lyman-Kutcher-Burman (LKB) model, supplemented by linear-quadratic modeling to account for the effect of fractionation, predict the risk of myelopathy from stereotactic radiosurgery (SRS). Methods and Materials: From November 2001 to July 2008, 24 spinal hemangioblastomas in 17 patients were treated with SRS. Of the tumors, 17 received 1 fraction with a median dose of 20 Gy (range, 18-30 Gy) and 7 received 20 to 25 Gy in 2 or 3 sessions, with cord maximum doses of 22.7 Gy (range, 17.8-30.9 Gy) and 22.0 Gy (range, 20.2-26.6 Gy), respectively. By use of conventional values for α/β, volume parameter n, 50% complication probability dose TD50, and inverse slope parameter m, a computationally simplified implementation of the LKB model was used to calculate the biologically equivalent uniform dose and NTCP for each treatment. Exploratory calculations were performed with alternate values of α/β and n. Results: In this study 1 case (4%) of myelopathy occurred. The LKB model using radiobiological parameters from Emami and the logistic model with parameters from Schultheiss overestimated complication rates, predicting 13 complications (54%) and 18 complications (75%), respectively. An increase in the volume parameter (n), to assume greater parallel organization, improved the predictive value of the models. Maximum-likelihood LKB fitting of α/β and n yielded better predictions (0.7 complications), with n = 0.023 and α/β = 17.8 Gy. Conclusions: The spinal cord tolerance to the dosimetry of SRS is higher than predicted by the LKB model using any set of accepted parameters. Only a high α/β value in the LKB model and only a large volume effect in the logistic model with Schultheiss data could explain the low number of complications observed. This finding emphasizes that radiobiological models
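
    For reference, a minimal sketch of the standard LKB calculation (generalized EUD followed by a probit dose-response) is given below. The DVH and parameter values are illustrative, chosen near the commonly quoted Emami/Burman spinal-cord set, and the linear-quadratic fractionation correction used in the study is omitted.

```python
from math import erf, sqrt

# Minimal sketch of the Lyman-Kutcher-Burman (LKB) NTCP calculation. Parameter
# values are illustrative placeholders, not the study's fitted values.

def lkb_ntcp(dvh_doses, dvh_volumes, td50, m, n):
    """NTCP from a differential DVH via the generalized EUD and the Lyman probit model.

    dvh_doses   -- dose to each DVH bin (Gy)
    dvh_volumes -- fractional organ volume in each bin (sums to 1)
    """
    # generalized equivalent uniform dose (gEUD) with volume parameter n
    geud = sum(v * d ** (1.0 / n) for d, v in zip(dvh_doses, dvh_volumes)) ** n
    t = (geud - td50) / (m * td50)
    return 0.5 * (1.0 + erf(t / sqrt(2.0)))   # standard normal CDF

# toy two-bin cord DVH: 10% of the volume at 20 Gy, 90% at 5 Gy (hypothetical)
ntcp = lkb_ntcp([20.0, 5.0], [0.1, 0.9], td50=66.5, m=0.175, n=0.05)
print(f"NTCP = {ntcp:.3%}")
```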

  3. Estimate of Space Radiation-Induced Cancer Risks for International Space Station Orbits

    SciTech Connect

    Wu, H.; Atwell, W.; Cucinotta, F.A.; Yang, C.

    1996-03-01

    Excess cancer risks from exposures to space radiation are estimated for various orbits of the International Space Station (ISS). Organ exposures are computed with the transport codes, BRYNTRN and HZETRN, and the computerized anatomical male and computerized anatomical female models. Cancer risk coefficients in the National Council on Radiation Protection and Measurements report No. 98 are used to generate lifetime excess cancer incidence and cancer mortality after a one-month mission to ISS. The generated data are tabulated to serve as a quick reference for assessment of radiation risk to astronauts on ISS missions.

  4. Estimate of Space Radiation-Induced Cancer Risks for International Space Station Orbits

    NASA Technical Reports Server (NTRS)

    Wu, Honglu; Atwell, William; Cucinotta, Francis A.; Yang, Chui-hsu

    1996-01-01

    Excess cancer risks from exposures to space radiation are estimated for various orbits of the International Space Station (ISS). Organ exposures are computed with the transport codes, BRYNTRN and HZETRN, and the computerized anatomical male and computerized anatomical female models. Cancer risk coefficients in the National Council on Radiation Protection and Measurements report No. 98 are used to generate lifetime excess cancer incidence and cancer mortality after a one-month mission to ISS. The generated data are tabulated to serve as a quick reference for assessment of radiation risk to astronauts on ISS missions.

  5. Estimated occupational risk from bioaerosols generated during land application of class B biosolids.

    PubMed

    Tanner, Benjamin D; Brooks, John P; Gerba, Charles P; Haas, Charles N; Josephson, Karen L; Pepper, Ian L

    2008-01-01

    Some speculate that bioaerosols from land application of biosolids pose occupational risks, but few studies have assessed aerosolization of microorganisms from biosolids or estimated occupational risks of infection. This study investigated levels of microorganisms in air immediately downwind of land application operations and estimated occupational risks from aerosolized microorganisms. In all, more than 300 air samples were collected downwind of biosolids application sites at various locations within the United States. Coliform bacteria, coliphages, and heterotrophic plate count (HPC) bacteria were enumerated from air and biosolids at each site. Concentrations of coliforms relative to Salmonella and concentrations of coliphage relative to enteroviruses in biosolids were used, in conjunction with levels of coliforms and coliphages measured in air during this study, to estimate exposure to Salmonella and enteroviruses in air. The HPC bacteria were ubiquitous in air near land application sites whether or not biosolids were being applied, and concentrations were positively correlated to windspeed. Coliform bacteria were detected only when biosolids were being applied to land or loaded into land applicators. Coliphages were detected in few air samples, and only when biosolids were being loaded into land applicators. In general, environmental parameters had little impact on concentrations of microorganisms in air immediately downwind of land application. The method of land application was most correlated to aerosolization. From this large body of data, the occupational risk of infection from bioaerosols was estimated to be 0.78 to 2.1%/yr. Extraordinary exposure scenarios carried an estimated annual risk of infection of up to 34%, with viruses posing the greatest threat. Risks from aerosolized microorganisms at biosolids land application sites appear to be lower than those at wastewater treatment plants, based on previously reported literature. PMID:18948485
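
    The exposure step described above scales measured indicator levels in air by the pathogen-to-indicator ratio in the biosolids and then applies a dose-response model. The sketch below illustrates that calculation with an exponential dose-response model; every number, including the dose-response parameter r, is a hypothetical placeholder.

```python
import math

# Sketch of the indicator-ratio approach: pathogen levels in air are inferred from
# measured indicator levels in air scaled by the pathogen-to-indicator ratio in the
# biosolids, then converted to an annual infection risk. All values are assumed.

coliphage_in_air = 0.05        # PFU per m^3 of air near the applicator (assumed)
virus_per_coliphage = 1e-3     # enterovirus : coliphage ratio in biosolids (assumed)
inhalation_rate = 1.0          # m^3 of air inhaled per working hour (assumed)
hours_per_day = 8
exposure_days_per_year = 50
r = 0.01                       # exponential dose-response parameter (assumed)

daily_dose = coliphage_in_air * virus_per_coliphage * inhalation_rate * hours_per_day
p_daily = 1.0 - math.exp(-r * daily_dose)
p_annual = 1.0 - (1.0 - p_daily) ** exposure_days_per_year

print(f"daily risk {p_daily:.1e}, annual occupational risk {p_annual:.1e}")
```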

  6. Estimating cancer risk from dental cone-beam CT exposures based on skin dosimetry

    NASA Astrophysics Data System (ADS)

    Pauwels, Ruben; Cockmartin, Lesley; Ivanauskaité, Deimante; Urbonienė, Ausra; Gavala, Sophia; Donta, Catherine; Tsiklakis, Kostas; Jacobs, Reinhilde; Bosmans, Hilde; Bogaerts, Ria; Horner, Keith; SEDENTEXCT Project Consortium, The

    2014-07-01

    The aim of this study was to measure entrance skin doses on patients undergoing cone-beam computed tomography (CBCT) examinations, to establish conversion factors between skin and organ doses, and to estimate cancer risk from CBCT exposures. 266 patients (age 8-83) were included, involving three imaging centres. CBCT scans were acquired using the SCANORA 3D (Soredex, Tuusula, Finland) and NewTom 9000 (QR, Verona, Italy). Eight thermoluminescent dosimeters were attached to the patient's skin at standardized locations. Using previously published organ dose estimations on various CBCTs with an anthropomorphic phantom, correlation factors to convert skin dose to organ doses were calculated and applied to estimate patient organ doses. The BEIR VII age- and gender-dependent dose-risk model was applied to estimate the lifetime attributable cancer risk. For the SCANORA 3D, average skin doses over the eight locations varied between 484 and 1788 µGy. For the NewTom 9000 the range was between 821 and 1686 µGy for Centre 1 and between 292 and 2325 µGy for Centre 2. Entrance skin dose measurements demonstrated the combined effect of exposure and patient factors on the dose. The lifetime attributable cancer risk, expressed as the probability to develop a radiation-induced cancer, varied between 2.7 per million (age >60) and 9.8 per million (age 8-11) with an average of 6.0 per million. On average, the risk for female patients was 40% higher. The estimated radiation risk was primarily influenced by the age at exposure and the gender, pointing out the continuing need for justification and optimization of CBCT exposures, with a specific focus on children.
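
    The risk-estimation chain described above can be sketched as follows: a measured entrance skin dose, phantom-derived skin-to-organ conversion factors, and organ-specific risk coefficients in the spirit of BEIR VII. The conversion factors and risk coefficients in the sketch are hypothetical placeholders, not the study's values.

```python
# Sketch of the dose-reconstruction chain: skin dose -> organ doses -> lifetime
# attributable risk (LAR). All factors below are hypothetical placeholders.

skin_dose_uGy = 1200.0   # average entrance skin dose for one CBCT scan (assumed)

organ_dose_per_skin_dose = {    # unitless conversion factors (assumed)
    "thyroid": 0.15, "salivary glands": 0.60, "brain": 0.10, "red bone marrow": 0.05,
}
lar_per_mGy = {                 # cancer cases per million exposed per mGy (assumed)
    "thyroid": 8.0, "salivary glands": 2.0, "brain": 1.0, "red bone marrow": 5.0,
}

lar_total = 0.0
for organ, factor in organ_dose_per_skin_dose.items():
    organ_dose_mGy = skin_dose_uGy * factor / 1000.0   # convert µGy to mGy
    lar_total += organ_dose_mGy * lar_per_mGy[organ]

print(f"estimated LAR ~ {lar_total:.2f} cancers per million scans")
```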

  7. Heritability Estimates Identify a Substantial Genetic Contribution to Risk and Outcome of Intracerebral Hemorrhage

    PubMed Central

    Devan, William J.; Falcone, Guido J.; Anderson, Christopher D.; Jagiella, Jeremiasz M.; Schmidt, Helena; Hansen, Björn M.; Jimenez-Conde, Jordi; Giralt-Steinhauer, Eva; Cuadrado-Godia, Elisa; Soriano, Carolina; Ayres, Alison M.; Schwab, Kristin; Kassis, Sylvia Baedorf; Valant, Valerie; Pera, Joanna; Urbanik, Andrzej; Viswanathan, Anand; Rost, Natalia S.; Goldstein, Joshua N.; Freudenberger, Paul; Stögerer, Eva-Maria; Norrving, Bo; Tirschwell, David L.; Selim, Magdy; Brown, Devin L.; Silliman, Scott L.; Worrall, Bradford B.; Meschia, James F.; Kidwell, Chelsea S.; Montaner, Joan; Fernandez-Cadenas, Israel; Delgado, Pilar; Greenberg, Steven M.; Roquer, Jaume; Lindgren, Arne; Slowik, Agnieszka; Schmidt, Reinhold; Woo, Daniel; Rosand, Jonathan; Biffi, Alessandro

    2013-01-01

    Background and Purpose Previous studies suggest that genetic variation plays a substantial role in occurrence and evolution of intracerebral hemorrhage (ICH). Genetic contribution to disease can be determined by calculating heritability using family-based data, but such an approach is impractical for ICH because of lack of large pedigree-based studies. However, a novel analytic tool based on genome-wide data allows heritability estimation from unrelated subjects. We sought to apply this method to provide heritability estimates for ICH risk, severity, and outcome. Methods We analyzed genome-wide genotype data for 791 ICH cases and 876 controls, and determined heritability as the proportion of variation in phenotype attributable to captured genetic variants. Contribution to heritability was separately estimated for the APOE (encoding apolipoprotein E) gene, an established genetic risk factor, and for the rest of the genome. Analyzed phenotypes included ICH risk, admission hematoma volume, and 90-day mortality. Results ICH risk heritability was estimated at 29% (SE, 11%) for non-APOE loci and at 15% (SE, 10%) for APOE. Heritability for 90-day ICH mortality was 41% for non-APOE loci and 10% (SE, 9%) for APOE. Genetic influence on hematoma volume was also substantial: admission volume heritability was estimated at 60% (SE, 70%) for non-APOE loci and at 12% (SE, 4%) for APOE. Conclusions Genetic variation plays a substantial role in ICH risk, outcome, and hematoma volume. Previously reported risk variants account for only a portion of inherited genetic influence on ICH pathophysiology, pointing to additional loci yet to be identified. PMID:23559261

  8. Estimating cancer risk from dental cone-beam CT exposures based on skin dosimetry.

    PubMed

    Pauwels, Ruben; Cockmartin, Lesley; Ivanauskaité, Deimante; Urbonienė, Ausra; Gavala, Sophia; Donta, Catherine; Tsiklakis, Kostas; Jacobs, Reinhilde; Bosmans, Hilde; Bogaerts, Ria; Horner, Keith

    2014-07-21

    The aim of this study was to measure entrance skin doses on patients undergoing cone-beam computed tomography (CBCT) examinations, to establish conversion factors between skin and organ doses, and to estimate cancer risk from CBCT exposures. 266 patients (age 8-83) were included, involving three imaging centres. CBCT scans were acquired using the SCANORA 3D (Soredex, Tuusula, Finland) and NewTom 9000 (QR, Verona, Italy). Eight thermoluminescent dosimeters were attached to the patient's skin at standardized locations. Using previously published organ dose estimations on various CBCTs with an anthropomorphic phantom, correlation factors to convert skin dose to organ doses were calculated and applied to estimate patient organ doses. The BEIR VII age- and gender-dependent dose-risk model was applied to estimate the lifetime attributable cancer risk. For the SCANORA 3D, average skin doses over the eight locations varied between 484 and 1788 µGy. For the NewTom 9000 the range was between 821 and 1686 µGy for Centre 1 and between 292 and 2325 µGy for Centre 2. Entrance skin dose measurements demonstrated the combined effect of exposure and patient factors on the dose. The lifetime attributable cancer risk, expressed as the probability to develop a radiation-induced cancer, varied between 2.7 per million (age >60) and 9.8 per million (age 8-11) with an average of 6.0 per million. On average, the risk for female patients was 40% higher. The estimated radiation risk was primarily influenced by the age at exposure and the gender, pointing out the continuing need for justification and optimization of CBCT exposures, with a specific focus on children. PMID:24957710

  9. Micro-scale flood risk estimation in historic centres: a case study in Florence, Italy

    NASA Astrophysics Data System (ADS)

    Castelli, Fabio; Arrighi, Chiara; Brugioni, Marcello; Franceschini, Serena; Mazzanti, Bernardo

    2013-04-01

    The route to flood risk assessment is much more than hydraulic modelling of inundation, that is, hazard mapping. Flood risk is the product of flood hazard, vulnerability and exposure, all three to be estimated with a comparable level of accuracy. While hazard maps have already been implemented in many countries, quantitative damage and risk maps are still at a preliminary level. Currently one of the main challenges in flood damage estimation resides in the scarce availability of socio-economic data characterizing the monetary value of the exposed assets. When these open public data are available, the variability of their level of detail drives the need to merge different sources and to select an appropriate scale of analysis. In this work a parsimonious quasi-2D hydraulic model is adopted, which has many advantages in terms of easy set-up. In order to represent the geometry of the study domain, a high-resolution and up-to-date Digital Surface Model (DSM) is used. The accuracy in flood depth estimation is evaluated by comparison with marble-plate records of a historic flood in the city of Florence (Italy). The accuracy is characterized in the most heavily flooded downtown area by a bias of a very few centimetres and a determination coefficient of 0.73. The average risk is found to be about 14 €/m²·year, corresponding to about 8.3% of residents' income. The spatial distribution of estimated risk highlights a complex interaction between the flood pattern and the building characteristics. Proceeding through the risk estimation steps, a new micro-scale potential damage assessment method is proposed. This method is based on the georeferenced census system, considered an optimal compromise between spatial detail and open availability of socio-economic data. The census section system consists of geographically contiguous polygons that usually coincide with building blocks in dense urban areas. The results of flood risk assessment at the census section scale resolve most of

  10. Estimated risk from exposure to radon decay products in US homes

    SciTech Connect

    Nero, A.V. Jr.

    1986-05-01

    Recent analyses now permit direct estimation of the risks of lung cancer from radon decay products in US homes. Analysis of data from indoor monitoring in single-family homes yields a tentative frequency distribution of annual-average ²²²Rn concentrations averaging 55 Bq m⁻³ and having 2% of homes exceeding 300 Bq m⁻³. Application of the results of occupational epidemiological studies, either directly or using recent advances in lung dosimetry, to indoor exposures suggests that the average indoor concentration entails a lifetime risk of lung cancer of 0.3% or about 10% of the total risk of lung cancer. The risk to individuals occupying the homes with 300 Bq m⁻³ or more for their lifetimes is estimated to exceed 2%, with risks from the homes with thousands of Bq m⁻³ correspondingly higher, even exceeding the total risk of premature death due to cigarette smoking. The potential for such average and high-level risks in ordinary homes forces development of a new perspective on environmental exposures.

  11. An Evidenced-Based Approach for Estimating Decompression Sickness Risk in Aircraft Operations

    NASA Technical Reports Server (NTRS)

    Robinson, Ronald R.; Dervay, Joseph P.; Conkin, Johnny

    1999-01-01

    Estimating the risk of decompression sickness (DCS) in aircraft operations remains a challenge, making the reduction of this risk through the development of operationally acceptable denitrogenation schedules difficult. In addition, the medical recommendations which are promulgated are often not supported by rigorous evaluation of the available data, but are instead arrived at by negotiation with the aircraft operations community, are adapted from other similar aircraft operations, or are based upon the opinion of the local medical community. We present a systematic approach for defining DCS risk in aircraft operations by analyzing the data available for a specific aircraft, flight profile, and aviator population. Once the risk of DCS in a particular aircraft operation is known, appropriate steps can be taken to reduce this risk to a level acceptable to the applicable aviation community. Using this technique will allow any aviation medical community to arrive at the best estimate of DCS risk for its specific mission and aviator population and will allow systematic reevaluation of the decisions regarding DCS risk reduction when additional data are available.

  12. Cancer risk estimation for mixtures of coal tars and benzo(a)pyrene

    SciTech Connect

    Gaylor, D.W.; Culp, S.J.; Goldstein, L.S.; Beland, F.A.

    2000-02-01

    Two-year chronic bioassays were conducted by using B6C3F1 female mice fed several concentrations of two different mixtures of coal tars from manufactured gas waste sites or benzo(a)pyrene (BaP). The purpose of the study was to obtain estimates of cancer potency of coal tar mixtures, by using conventional regulatory methods, for use in manufactured gas waste site remediation. A secondary purpose was to investigate the validity of using the concentration of a single potent carcinogen, in this case benzo(a)pyrene, to estimate the relative risk for a coal tar mixture. The study has shown that BaP dominates the cancer risk when its concentration is greater than 6,300 ppm in the coal tar mixture. In this case the most sensitive tissue site is the forestomach. Using low-dose linear extrapolation, the lifetime cancer risk for humans is estimated to be: Risk < 1.03 × 10⁻⁴ (ppm coal tar in total diet) + 240 × 10⁻⁴ (ppm BaP in total diet), based on forestomach tumors. If the BaP concentration in the coal tar mixture is less than 6,300 ppm, the more likely case, then lung tumors provide the largest estimated upper limit of risk, Risk < 2.55 × 10⁻⁴ (ppm coal tar in total diet), with no contribution of BaP to lung tumors. The upper limit of the cancer potency (slope factor) for lifetime oral exposure to benzo(a)pyrene is 1.2 × 10⁻³ per µg per kg body weight per day from this Good Laboratory Practice (GLP) study compared with the current value of 7.3 × 10⁻³ per µg per kg body weight per day listed in the US EPA Integrated Risk Information System.
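
    The two upper-bound expressions quoted above can be applied directly, as in the sketch below; the dietary concentrations passed in are arbitrary examples.

```python
# Direct transcription of the upper-bound risk expressions quoted above, evaluated
# for hypothetical dietary concentrations of coal tar and BaP.

def risk_high_bap(ppm_coal_tar_in_diet, ppm_bap_in_diet):
    """Upper-bound lifetime risk (forestomach tumors) when BaP > 6,300 ppm of the mixture."""
    return 1.03e-4 * ppm_coal_tar_in_diet + 240e-4 * ppm_bap_in_diet

def risk_low_bap(ppm_coal_tar_in_diet):
    """Upper-bound lifetime risk (lung tumors) when BaP < 6,300 ppm of the mixture."""
    return 2.55e-4 * ppm_coal_tar_in_diet

print(f"low-BaP mixture,  0.01 ppm coal tar in diet: {risk_low_bap(0.01):.1e}")
print(f"high-BaP mixture, 0.01 ppm coal tar + 1e-4 ppm BaP: {risk_high_bap(0.01, 1e-4):.1e}")
```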

  13. Quantitative assessment of the microbial risk of leafy greens from farm to consumption: preliminary framework, data, and risk estimates.

    PubMed

    Danyluk, Michelle D; Schaffner, Donald W

    2011-05-01

    This project was undertaken to relate what is known about the behavior of Escherichia coli O157:H7 under laboratory conditions and integrate this information with what is known regarding the 2006 E. coli O157:H7 spinach outbreak in the context of a quantitative microbial risk assessment. The risk model explicitly assumes that all contamination arises from exposure in the field. Extracted data, models, and user inputs were entered into an Excel spreadsheet, and the modeling software @RISK was used to perform Monte Carlo simulations. The model predicts that cut leafy greens that are temperature abused will support the growth of E. coli O157:H7, and populations of the organism may increase by as much as 1 log CFU/day under optimal temperature conditions. When the risk model used a starting level of -1 log CFU/g, with 0.1% of incoming servings contaminated, the predicted numbers of cells per serving were within the range of best available estimates of pathogen levels during the outbreak. The model predicts that levels in the field of -1 log CFU/g and 0.1% prevalence could have resulted in an outbreak approximately the size of the 2006 E. coli O157:H7 outbreak. This quantitative microbial risk assessment model represents a preliminary framework that identifies available data and provides initial risk estimates for pathogenic E. coli in leafy greens. Data gaps include retail storage times, correlations between storage time and temperature, determining the importance of E. coli O157:H7 in leafy greens lag time models, and validation of the importance of cross-contamination during the washing process.
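
    A stripped-down version of the Monte Carlo logic described above is sketched below: a small fraction of servings is contaminated at a field level around -1 log CFU/g, grows by up to 1 log CFU/day during temperature abuse, and the resulting dose per serving is tallied. The distributions, serving size, and random seed are hypothetical placeholders, not the @RISK model inputs.

```python
import random

# Minimal Monte Carlo sketch of the leafy-greens dose model. All distributions and
# the serving size are assumed placeholders.

random.seed(1)
N = 100_000
serving_g = 85.0
prevalence = 0.001                 # fraction of servings contaminated (as in the text)
doses = []
for _ in range(N):
    if random.random() > prevalence:
        continue                                    # uncontaminated serving
    log_conc = random.gauss(-1.0, 0.5)              # log CFU/g at harvest (assumed)
    abuse_days = random.uniform(0.0, 2.0)           # days of temperature abuse (assumed)
    growth = random.uniform(0.0, 1.0) * abuse_days  # up to 1 log CFU/day
    doses.append(10 ** (log_conc + growth) * serving_g)

print(f"{len(doses)} contaminated servings; median dose "
      f"{sorted(doses)[len(doses) // 2]:.1f} CFU/serving")
```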

  14. “Any Condomless Anal Intercourse” is No Longer an Accurate Measure of HIV Sexual risk Behavior in Gay and Other Men Who have Sex with Men

    PubMed Central

    Jin, Fengyi; Prestage, Garrett P.; Mao, Limin; Poynten, I. Mary; Templeton, David J.; Grulich, Andrew E.; Zablotska, Iryna

    2015-01-01

    Background: Condomless anal intercourse (CLAI) has long been recognized as the primary mode of sexual transmission of HIV in gay and other men who have sex with men (MSM). A variety of measures of CLAI have been commonly used in behavioral surveillance for HIV risk and to forecast trends in HIV infection. However, gay and other MSM’s sexual practices have changed as the understanding of the disease and of treatment options has advanced. In the present paper, we argue that summary measures such as “any CLAI” do not accurately measure HIV sexual risk behavior. Methods: Participants were 1,427 HIV-negative men from the Health in Men cohort study run from 2001 to 2007 in Sydney, Australia, with six-monthly interviews. At each interview, detailed quantitative data on the number of episodes of insertive and receptive CLAI in the last 6 months were collected, separated by partner type (regular vs. casual) and partners’ HIV status (negative, positive, and HIV status unknown). Results: A total of 228,064 episodes of CLAI were reported during the study period with a mean of 44 episodes per year per participant (median: 14). The great majority of CLAI episodes were with a regular partner (92.6%), most of them with HIV-negative regular partners (84.8%). Participants were more likely to engage in insertive CLAI with casual than with regular partners (66.7 vs. 55.3% of all acts of CLAI with each partner type, p < 0.001). Men were more likely to report CLAI in the receptive position with HIV-negative and HIV status unknown partners than with HIV-positive partners (p < 0.001 for both regular and casual partners). Conclusion: Gay and other MSM engaging in CLAI demonstrate clear patterns of HIV risk reduction behavior. As HIV prevention enters the era of antiretroviral-based biomedical approaches, using all forms of CLAI indiscriminately as a measure of HIV behavioral risk is not helpful in understanding the current drivers of HIV transmission in the community. PMID:25774158

  15. A review of methods to estimate cause-specific mortality in presence of competing risks

    USGS Publications Warehouse

    Heisey, Dennis M.; Patterson, Brent R.

    2006-01-01

    Estimating cause-specific mortality is often of central importance for understanding the dynamics of wildlife populations. Despite such importance, methodology for estimating and analyzing cause-specific mortality has received little attention in wildlife ecology during the past 20 years. The issue of analyzing cause-specific, mutually exclusive events in time is not unique to wildlife. In fact, this general problem has received substantial attention in human biomedical applications within the context of biostatistical survival analysis. Here, we consider cause-specific mortality from a modern biostatistical perspective. This requires carefully defining what we mean by cause-specific mortality and then providing an appropriate hazard-based representation as a competing risks problem. This leads to the general solution of cause-specific mortality as the cumulative incidence function (CIF). We describe the appropriate generalization of the fully nonparametric staggered-entry Kaplan–Meier survival estimator to cause-specific mortality via the nonparametric CIF estimator (NPCIFE), which in many situations offers an attractive alternative to the Heisey–Fuller estimator. An advantage of the NPCIFE is that it lends itself readily to risk factor analysis with standard software for the Cox proportional hazards model. The competing risks–based approach also clarifies issues regarding another intuitive but erroneous "cause-specific mortality" estimator based on the Kaplan–Meier survival estimator and commonly seen in the life sciences literature.
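
    The nonparametric CIF estimator mentioned above accumulates, at each event time, the overall survival just before that time multiplied by the cause-specific hazard increment. A minimal sketch with made-up data (distinct event times, cause 0 denoting censoring) is shown below.

```python
import numpy as np

# Nonparametric cumulative incidence function (CIF) sketch:
# CIF_k(t) = sum over event times t_i <= t of S(t_i-) * d_ki / n_i,
# where S is the overall (all-cause) Kaplan-Meier survival.
# The data are made-up illustration values.

times  = np.array([2.0, 3.0, 4.0, 5.0, 7.0, 8.0, 9.0, 12.0])
causes = np.array([1,   0,   2,   1,   2,   0,   1,   0])   # 1 = cause of interest, 0 = censored

def cif(times, causes, cause_of_interest):
    order = np.argsort(times)
    times, causes = times[order], causes[order]
    n = len(times)
    surv = 1.0            # overall survival just before the current time
    cum_inc = 0.0
    out = []
    for i, (t, c) in enumerate(zip(times, causes)):
        at_risk = n - i
        if c == cause_of_interest:
            cum_inc += surv * (1.0 / at_risk)
        if c != 0:                                # any event reduces overall survival
            surv *= 1.0 - 1.0 / at_risk
        out.append((t, cum_inc))
    return out

for t, ci in cif(times, causes, cause_of_interest=1):
    print(f"t = {t:4.1f}  CIF_1 = {ci:.3f}")
```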

  16. Estimators of annual probability of infection for quantitative microbial risk assessment.

    PubMed

    Karavarsamis, N; Hamilton, A J

    2010-06-01

    Four estimators of annual infection probability were compared pertinent to Quantitative Microbial Risk Analysis (QMRA). A stochastic model, the Gold Standard, was used as the benchmark. It is a product of independent daily infection probabilities which in turn are based on daily doses. An alternative and commonly used estimator, here referred to as the Naïve, assumes a single daily infection probability from a single value of daily dose. The typical use of this estimator in stochastic QMRA involves the generation of a distribution of annual infection probabilities, but since each of these is based on a single realisation of the dose distribution, the resultant annual infection probability distribution simply represents a set of inaccurate estimates. While the medians of both distributions were within an order of magnitude for our test scenario, the 95th percentiles, which are sometimes used in QMRA as conservative estimates of risk, differed by around one order of magnitude. The other two estimators examined, the Geometric and Arithmetic, are closely related to the Naïve and use the same equation, and both proved to be poor estimators. Lastly, this paper proposes a simple adjustment to the Gold Standard equation accommodating periodic infection probabilities when the daily infection probabilities are unknown.
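
    The contrast between the Gold Standard and the Naïve estimator can be written in a few lines, as in the sketch below: the Gold Standard multiplies 365 daily survival probabilities driven by a variable daily dose, whereas the Naïve estimator raises a single daily probability to the power 365. The dose distribution and the exponential dose-response parameter are hypothetical placeholders.

```python
import numpy as np

# Gold Standard vs Naive annual infection probability. Dose distribution and
# dose-response parameter r are assumed placeholders.

rng = np.random.default_rng(0)
r = 0.01                                                       # exponential dose-response parameter (assumed)
daily_doses = rng.lognormal(mean=-4.0, sigma=1.5, size=365)    # organisms ingested per day (assumed)

p_daily = 1.0 - np.exp(-r * daily_doses)
p_gold = 1.0 - np.prod(1.0 - p_daily)          # Gold Standard: product over 365 days

single_dose = rng.lognormal(mean=-4.0, sigma=1.5)
p_naive = 1.0 - (1.0 - (1.0 - np.exp(-r * single_dose))) ** 365

print(f"Gold Standard annual risk:            {p_gold:.2e}")
print(f"Naive annual risk (one realisation):  {p_naive:.2e}")
```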

  17. Prevalence Estimates of Health Risk Behaviors of Immigrant Latino Men Who Have Sex with Men

    ERIC Educational Resources Information Center

    Rhodes, Scott D.; McCoy, Thomas P.; Hergenrather, Kenneth C.; Vissman, Aaron T.; Wolfson, Mark; Alonzo, Jorge; Bloom, Fred R.; Alegria-Ortega, Jose; Eng, Eugenia

    2012-01-01

    Purpose: Little is known about the health status of rural immigrant Latino men who have sex with men (MSM). These MSM comprise a subpopulation that tends to remain "hidden" from both researchers and practitioners. This study was designed to estimate the prevalence of tobacco, alcohol, and drug use, and sexual risk behaviors of Latino MSM living in…

  18. AN INFORMATIC APPROACH TO ESTIMATING ECOLOGICAL RISKS POSED BY PHARMACEUTICAL USE

    EPA Science Inventory

    A new method for estimating risks of human prescription pharmaceuticals based on information found in regulatory filings as well as scientific and trade literature is described in a presentation at the Pharmaceuticals in the Environment Workshop in Las Vegas, NV, August 23-25, 20...

  19. Risk estimates for radiation-induced cancer and radiation protection standards

    SciTech Connect

    Sinclair, W.K.

    1989-11-01

    At low doses, the primary biological effects of concern are stochastic in nature, i.e., they are more probable at higher doses, but their severity is independent of the dose. In the last decade, new epidemiological information on radiation-induced cancer in humans has become available. In the Japanese survivors, three new cycles of data (11 yr of experience) have accumulated, and a revised dosimetry system (DS86) has been introduced. UNSCEAR (United Nations Scientific Committee on the Effects of Atomic Radiation) reevaluated the risk of cancer from all human sources, which include other human populations such as those treated for ankylosing spondylitis and for cancer of the cervix. UNSCEAR has also evaluated the cancer risk for each of nine organs. For radiation protection purposes (low doses and dose rates, adult populations mainly), nominal values of risk since the 1977-80 period have been approximately 1%/Sv. This value will need to be increased in the light of the new estimates. Also, risk estimates for various tissues must be reconsidered, and weighting factors used by the International Commission on Radiological Protection need to be reexamined. Recommendations on occupational and public dose limits must also be reconsidered. The National Council on Radiation Protection and Measurements is in a comparatively good position with a recently produced set of recommendations that had higher cancer risk estimates in mind.

  20. Estimate of the risk in radiation therapy due to unwanted neutrons

    SciTech Connect

    Swanson, W.P.

    1980-03-01

    The integral dose of accelerator-produced leakage neutrons to patients undergoing high-energy photon therapy is estimated and compared to other sources of integral dose. The leakage neutron component contributes about 5 g rad (1 rad = 10⁻² Gy) for a typical treatment course of 5000 rad. When averaged over a 70-kg tissue volume, the corresponding dose amounts to only 0.36 rad. From this, the risk of inducing fatal malignancies by leakage neutrons is estimated to be about 50 × 10⁻⁶ per year following treatment. This is compared to other risks to which the patient is unavoidably exposed, and it is argued that the unwanted neutrons pose such small additional risk that regulatory intervention is not warranted. This assessment is performed without reference to neutron RBE or quality factor.

  1. Value at risk estimation with entropy-based wavelet analysis in exchange markets

    NASA Astrophysics Data System (ADS)

    He, Kaijian; Wang, Lijun; Zou, Yingchao; Lai, Kin Keung

    2014-08-01

    In recent years, exchange markets have become increasingly integrated. Fluctuations and risks across different exchange markets exhibit co-moving and complex dynamics. In this paper we propose entropy-based multivariate wavelet approaches to analyze the multiscale characteristics in the multidimensional domain and further improve the reliability of Value at Risk estimation. Wavelet analysis is used to construct an entropy-based multiscale portfolio Value at Risk estimation algorithm that accounts for the multiscale dynamic correlation. An entropy measure, together with an error-minimization principle, is proposed for selecting the best basis when determining the wavelet family and the decomposition level to use. The empirical studies conducted in this paper provide positive evidence of the superior performance of the proposed approach, using the closely related Chinese Renminbi and European Euro exchange markets.
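
    A highly simplified sketch of the multiscale idea is given below: portfolio returns are decomposed with a discrete wavelet transform (using the PyWavelets package), the energy contribution of each scale is reported, and a one-day historical Value at Risk is read off the return distribution. The entropy-based basis selection and the multivariate correlation modelling of the full method are omitted, and the return series are simulated placeholders.

```python
import numpy as np
import pywt   # PyWavelets

# Simulated daily returns standing in for the CNY and EUR exchange-rate series.
rng = np.random.default_rng(42)
cny = rng.normal(0.0, 0.004, 1024)
eur = rng.normal(0.0, 0.007, 1024)
weights = np.array([0.5, 0.5])
portfolio = weights[0] * cny + weights[1] * eur

# Discrete wavelet decomposition and scale-wise energy shares.
coeffs = pywt.wavedec(portfolio, 'db4', level=4)    # [approx, detail_4, ..., detail_1]
energies = [np.sum(c ** 2) for c in coeffs]
total = sum(energies)
for name, e in zip(['A4', 'D4', 'D3', 'D2', 'D1'], energies):
    print(f"scale {name}: {100 * e / total:5.1f}% of energy")

# One-day 95% historical Value at Risk (fraction of portfolio value).
var_95 = -np.quantile(portfolio, 0.05)
print(f"1-day 95% VaR = {var_95:.4f}")
```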

  2. Prevalence Estimates of Health Risk Behaviors of Immigrant Latino Men Who Have Sex With Men

    PubMed Central

    Rhodes, Scott D.; McCoy, Thomas P.; Hergenrather, Kenneth C.; Vissman, Aaron T.; Wolfson, Mark; Alonzo, Jorge; Bloom, Fred R.; Alegría-Ortega, Jose; Eng, Eugenia

    2011-01-01

    Purpose Little is known about the health status of rural immigrant Latino men who have sex with men (MSM). These MSM comprise a subpopulation that tends to remain “hidden” from both researchers and practitioners. This study was designed to estimate the prevalence of tobacco, alcohol, and drug use, and sexual risk behaviors of Latino MSM living in rural North Carolina. Methods A community-based participatory research (CBPR) partnership used respondent-driven sampling (RDS) to identify, recruit, and enroll Latino MSM to participate in an interviewer-administered behavioral assessment. RDS weighted prevalence of risk behaviors was estimated using the RDS Analysis Tool. Data collection occurred in 2008. Results A total of 190 Latino MSM was reached; the average age was 25.5 years old and nearly 80% reported being from Mexico. Prevalence estimates of smoking everyday and past 30-day heavy episodic drinking were 6.5% and 35.0%, respectively. Prevalence estimates of past 12-month marijuana and cocaine use were 56.0% and 27.1%, respectively. Past 3-month prevalence estimates of sex with at least one woman, multiple male partners, and inconsistent condom use were 21.2%, 88.9%, and 54.1%, respectively. Conclusions Respondents had low rates of tobacco use and club drug use, and high rates of sexual risk behaviors. Although this study represents an initial step in documenting the health risk behaviors of immigrant Latino MSM who are part of a new trend in Latino immigration to the southeastern US, a need exists for further research, including longitudinal studies to understand the trajectory of risk behavior among immigrant Latino MSM. PMID:22236317

  3. Estimating the accumulation of chemicals in an estuarine food web: A case study for evaluation of future ecological and human health risks

    SciTech Connect

    Iannuzzi, T.J.; Finley, B.L.

    1995-12-31

    A model was constructed and calibrated for estimating the accumulation of sediment-associated nonionic organic chemicals, including selected PCBs and PCDD/Fs, in a simplified food web of the tidal Passaic River, New Jersey. The model was used to estimate concentrations of several chemicals in infaunal invertebrates, forage fish, blue crab, and adult finfish in the River as part of a screening-level risk assessment that was conducted during the preliminary phase of a CERCLA Remedial Investigation/Feasibility Study (RI/FS). Subsequent tissue-residue data were collected to evaluate the performance of the model, and to calibrate the model for multiple chemicals of concern in the River. A follow-up program of data collection was designed to support a more detailed risk assessment. The objectives of calibrating the model are to supplement the extant tissue-residue data that are available for risk assessment, and to evaluate future scenarios of bioaccumulation (and potential ecological and human health risk) under various future conditions in the River. Results to date suggest that the model performs well for the simplified food web that exists in the Passaic River. A case study was constructed to demonstrate the application of the model for future predictions of ecological risk. These preliminary results suggest that the model is sufficiently sensitive and accurate for estimating variations of bioaccumulation under varying degrees of source control or other future conditions.

  4. Quantitative microbial risk assessment combined with hydrodynamic modelling to estimate the public health risk associated with bathing after rainfall events.

    PubMed

    Eregno, Fasil Ejigu; Tryland, Ingun; Tjomsland, Torulv; Myrmel, Mette; Robertson, Lucy; Heistad, Arve

    2016-04-01

    This study investigated the public health risk from exposure to infectious microorganisms at Sandvika recreational beaches, Norway and dose-response relationships by combining hydrodynamic modelling with Quantitative Microbial Risk Assessment (QMRA). Meteorological and hydrological data were collected to produce a calibrated hydrodynamic model using Escherichia coli as an indicator of faecal contamination. Based on average concentrations of reference pathogens (norovirus, Campylobacter, Salmonella, Giardia and Cryptosporidium) relative to E. coli in Norwegian sewage from previous studies, the hydrodynamic model was used for simulating the concentrations of pathogens at the local beaches during and after a heavy rainfall event, using three different decay rates. The simulated concentrations were used as input for QMRA and the public health risk was estimated as probability of infection from a single exposure of bathers during the three consecutive days after the rainfall event. The level of risk on the first day after the rainfall event was acceptable for the bacterial and parasitic reference pathogens, but high for the viral reference pathogen at all beaches, and severe at Kalvøya-small and Kalvøya-big beaches, supporting the advice of avoiding swimming in the day(s) after heavy rainfall. The study demonstrates the potential of combining discharge-based hydrodynamic modelling with QMRA in the context of bathing water as a tool to evaluate public health risk and support beach management decisions. PMID:26802355
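
    The dose-response step of such a QMRA can be sketched as below, with an exponential model for protozoa-like reference pathogens and an approximate beta-Poisson model for bacteria-like ones. The ingested volume, simulated concentration, and dose-response parameters are assumptions for illustration, not the values used in the study.

```python
import math

# Single-exposure dose-response sketch for a bathing event. All parameter values
# and the ingested volume are assumed placeholders.

volume_ingested_mL = 50.0                 # water swallowed per bathing event (assumed)
conc_per_100mL = 2.0                      # simulated pathogen concentration (assumed)
dose = conc_per_100mL * volume_ingested_mL / 100.0

def p_inf_exponential(dose, r=0.004):
    """Exponential model, e.g. for a Cryptosporidium-like pathogen (r assumed)."""
    return 1.0 - math.exp(-r * dose)

def p_inf_beta_poisson(dose, alpha=0.145, n50=896.0):
    """Approximate beta-Poisson model, e.g. for a Campylobacter-like pathogen (parameters assumed)."""
    return 1.0 - (1.0 + dose * (2.0 ** (1.0 / alpha) - 1.0) / n50) ** (-alpha)

print(f"P(infection), exponential:  {p_inf_exponential(dose):.3e}")
print(f"P(infection), beta-Poisson: {p_inf_beta_poisson(dose):.3e}")
```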

  5. Estimated GFR associates with cardiovascular risk factors independently of measured GFR.

    PubMed

    Mathisen, Ulla Dorte; Melsom, Toralf; Ingebretsen, Ole C; Jenssen, Trond; Njølstad, Inger; Solbu, Marit D; Toft, Ingrid; Eriksen, Bjørn O

    2011-05-01

    Estimation of the GFR (eGFR) using creatinine- or cystatin C-based equations is imperfect, especially when the true GFR is normal or near-normal. Modest reductions in eGFR from the normal range variably predict cardiovascular morbidity. If eGFR associates not only with measured GFR (mGFR) but also with cardiovascular risk factors, the effects of these non-GFR-related factors might bias the association between eGFR and outcome. To investigate these potential non-GFR-related associations between eGFR and cardiovascular risk factors, we measured GFR by iohexol clearance in a sample from the general population (age 50 to 62 years) without known cardiovascular disease, diabetes, or kidney disease. Even after adjustment for mGFR, eGFR associated with traditional cardiovascular risk factors in multiple regression analyses. More risk factors influenced cystatin C-based eGFR than creatinine-based eGFR, adjusted for mGFR, and some of the risk factors exhibited nonlinear effects in generalized additive models (P<0.05). These results suggest that eGFR, calculated using standard creatinine- or cystatin C-based equations, partially depends on factors other than the true GFR. Thus, estimates of cardiovascular risk associated with small changes in eGFR must be interpreted with caution.
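
    As an example of the kind of creatinine-based estimating equation referred to above, the sketch below implements the commonly published 2009 CKD-EPI creatinine formula; it is not necessarily the exact equation used in this study.

```python
# Sketch of the 2009 CKD-EPI creatinine equation (coefficients as commonly published).

def ckd_epi_creatinine(scr_mg_dl, age_years, female, black=False):
    """Estimated GFR in mL/min/1.73 m^2 from serum creatinine (mg/dL)."""
    kappa = 0.7 if female else 0.9
    alpha = -0.329 if female else -0.411
    egfr = (141.0
            * min(scr_mg_dl / kappa, 1.0) ** alpha
            * max(scr_mg_dl / kappa, 1.0) ** -1.209
            * 0.993 ** age_years)
    if female:
        egfr *= 1.018
    if black:
        egfr *= 1.159
    return egfr

print(f"{ckd_epi_creatinine(0.9, 55, female=True):.1f} mL/min/1.73 m^2")
```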

  6. Estimates of prevalence and risk associated with inattention and distraction based upon in situ naturalistic data.

    PubMed

    Dingus, Thomas A

    2014-01-01

    By using in situ naturalistic driving data, estimates of prevalence and risk can be made regarding driver populations' secondary task distractions and crash rates. Through metadata analysis, three populations of drivers (i.e., adult light vehicle, teenaged light vehicle, and adult heavy vehicle) were compared regarding frequency of secondary task behavior and the associated risk for safety-critical incidents. Relative risk estimates provide insight into the risk associated with engaging in a single task. When such risk is considered in combination with frequency of use, it sheds additional light on those secondary tasks that create the greatest overall risk to driving safety. The results show that secondary tasks involving manual typing, texting, dialing, reaching for an object, or reading are dangerous for all three populations. Additionally, novice teen drivers have difficulty in several tasks that the other two populations do not, including eating and external distractions. Truck drivers also perform a number of risky "mobile office" types of tasks, including writing, not seen in the other populations. Implications are described for policy makers and designers of in-vehicle and nomadic, portable systems. PMID:24776227

  7. Estimates of auditory risk from outdoor impulse noise. II: Civilian firearms.

    PubMed

    Flamme, Gregory A; Wong, Adam; Liebe, Kevin; Lynd, James

    2009-01-01

    Firearm impulses are common noise exposures in the United States. This study records, describes and analyzes impulses produced outdoors by civilian firearms with respect to the amount of auditory risk they pose to the unprotected listener under various listening conditions. Risk estimates were obtained using three contemporary damage risk criteria (DRC) including a waveform parameter-based approach (peak SPL and B-duration), an energy-based criterion (A-weighted SEL and equivalent continuous level) and a physiological model (AHAAH). Results from these DRC were converted into a number of maximum permissible unprotected exposures to facilitate interpretation. Acoustic characteristics of firearm impulses differed substantially across guns, ammunition, and microphone location. The type of gun, ammunition and the microphone location all significantly affected estimates of auditory risk from firearms. Vast differences in maximum permissible exposures were observed; the rank order of the differences varied with the source of the impulse. Unprotected exposure to firearm noise is not recommended, but people electing to fire a gun without hearing protection should be advised to minimize auditory risk through careful selection of ammunition and shooting environment. Small-caliber guns with long barrels and guns loaded with the least powerful ammunition tend to be associated with the least auditory risk. PMID:19805933

  8. Estimating risks of heat strain by age and sex: a population-level simulation model.

    PubMed

    Glass, Kathryn; Tait, Peter W; Hanna, Elizabeth G; Dear, Keith

    2015-05-18

    Individuals living in hot climates face health risks from hyperthermia due to excessive heat. Heat strain is influenced by weather exposure and by individual characteristics such as age, sex, body size, and occupation. To explore the population-level drivers of heat strain, we developed a simulation model that scales up individual risks of heat storage (estimated using Myrup and Morgan's man model "MANMO") to a large population. Using Australian weather data, we identify high-risk weather conditions together with individual characteristics that increase the risk of heat stress under these conditions. The model identifies elevated risks in children and the elderly, with females aged 75 and older those most likely to experience heat strain. Risk of heat strain in males does not increase as rapidly with age, but is greatest on hot days with high solar radiation. Although cloudy days are less dangerous for the wider population, older women still have an elevated risk of heat strain on hot cloudy days or when indoors during high temperatures. Simulation models provide a valuable method for exploring population level risks of heat strain, and a tool for evaluating public health and other government policy interventions.

  9. Cancer risk estimation in Digital Breast Tomosynthesis using GEANT4 Monte Carlo simulations and voxel phantoms.

    PubMed

    Ferreira, P; Baptista, M; Di Maria, S; Vaz, P

    2016-05-01

    The aim of this work was to estimate the risk of radiation-induced cancer following the Portuguese breast screening recommendations for Digital Mammography (DM) when applied to Digital Breast Tomosynthesis (DBT) and to evaluate how the risk of inducing cancer could influence the energy used in breast diagnostic exams. The organ doses were calculated by Monte Carlo simulations using a female voxel phantom and considering the acquisition of 25 projection images. Single organ cancer incidence risks were calculated in order to assess the total effective radiation-induced cancer risk. The screening strategy techniques considered were: DBT in Cranio-Caudal (CC) view and two-view DM (CC and Mediolateral Oblique (MLO)). The risk of cancer incidence following the Portuguese screening guidelines (screening every two years in the age range of 50-80 years) was calculated by assuming a single CC DBT acquisition view as a standalone screening strategy and compared with two-view DM. The difference in the total effective risk between DBT and DM is quite low. Nevertheless, in DBT an increased risk for the lung is observed with respect to DM. The lung is also the organ that is mainly affected when non-optimal beam energy (in terms of image quality and absorbed dose) is used instead of an optimal one. The use of non-optimal energies could increase the risk of lung cancer incidence by a factor of about 2. PMID:27133140

  10. Estimating Risks of Heat Strain by Age and Sex: A Population-Level Simulation Model

    PubMed Central

    Glass, Kathryn; Tait, Peter W.; Hanna, Elizabeth G.; Dear, Keith

    2015-01-01

    Individuals living in hot climates face health risks from hyperthermia due to excessive heat. Heat strain is influenced by weather exposure and by individual characteristics such as age, sex, body size, and occupation. To explore the population-level drivers of heat strain, we developed a simulation model that scales up individual risks of heat storage (estimated using Myrup and Morgan’s man model “MANMO”) to a large population. Using Australian weather data, we identify high-risk weather conditions together with individual characteristics that increase the risk of heat stress under these conditions. The model identifies elevated risks in children and the elderly, with females aged 75 and older those most likely to experience heat strain. Risk of heat strain in males does not increase as rapidly with age, but is greatest on hot days with high solar radiation. Although cloudy days are less dangerous for the wider population, older women still have an elevated risk of heat strain on hot cloudy days or when indoors during high temperatures. Simulation models provide a valuable method for exploring population level risks of heat strain, and a tool for evaluating public health and other government policy interventions. PMID:25993102

  11. Vertebral Strength and Estimated Fracture Risk Across the BMI Spectrum in Women.

    PubMed

    Bachmann, Katherine N; Bruno, Alexander G; Bredella, Miriam A; Schorr, Melanie; Lawson, Elizabeth A; Gill, Corey M; Singhal, Vibha; Meenaghan, Erinne; Gerweck, Anu V; Eddy, Kamryn T; Ebrahimi, Seda; Koman, Stuart L; Greenblatt, James M; Keane, Robert J; Weigel, Thomas; Dechant, Esther; Misra, Madhusmita; Klibanski, Anne; Bouxsein, Mary L; Miller, Karen K

    2016-02-01

    Somewhat paradoxically, fracture risk, which depends on applied loads and bone strength, is elevated in both anorexia nervosa and obesity at certain skeletal sites. Factor-of-risk (Φ), the ratio of applied load to bone strength, is a biomechanically based method to estimate fracture risk; theoretically, higher Φ reflects increased fracture risk. We estimated vertebral strength (linear combination of integral volumetric bone mineral density [Int.vBMD] and cross-sectional area from quantitative computed tomography [QCT]), vertebral compressive loads, and Φ at L4 in 176 women (65 anorexia nervosa, 45 lean controls, and 66 obese). Using biomechanical models, applied loads were estimated for: 1) standing; 2) arms flexed 90°, holding 5 kg in each hand (holding); 3) 45° trunk flexion, 5 kg in each hand (lifting); 4) 20° trunk right lateral bend, 10 kg in right hand (bending). We also investigated associations of Int.vBMD and vertebral strength with lean mass (from dual-energy X-ray absorptiometry [DXA]) and visceral adipose tissue (VAT, from QCT). Women with anorexia nervosa had lower, whereas obese women had similar, Int.vBMD and estimated vertebral strength compared with controls. Vertebral loads were highest in obesity and lowest in anorexia nervosa for standing, holding, and lifting (p < 0.0001) but were highest in anorexia nervosa for bending (p < 0.02). Obese women had highest Φ for standing and lifting, whereas women with anorexia nervosa had highest Φ for bending (p < 0.0001). Obese and anorexia nervosa subjects had higher Φ for holding than controls (p < 0.03). Int.vBMD and estimated vertebral strength were associated positively with lean mass (R = 0.28 to 0.45, p ≤ 0.0001) in all groups combined and negatively with VAT (R = -[0.36 to 0.38], p < 0.003) within the obese group. Therefore, women with anorexia nervosa had higher estimated vertebral fracture risk (Φ) for holding and bending because of inferior vertebral strength. Despite similar
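
    The factor-of-risk logic described above is simply the ratio of an activity-specific applied load to the estimated vertebral strength. The sketch below illustrates it; the body weight, load fractions, and strength value are hypothetical placeholders, not the subject-specific estimates from the study.

```python
# Factor of risk Phi = applied vertebral compressive load / vertebral compressive
# strength, compared across activities. All numbers are assumed placeholders.

body_weight_N = 60.0 * 9.81                # 60 kg subject (assumed)
vertebral_strength_N = 4000.0              # estimated L4 compressive strength (assumed)

applied_loads_N = {
    "standing": 0.55 * body_weight_N,                        # trunk weight above L4 (assumed fraction)
    "holding 5 kg in each hand": 0.55 * body_weight_N + 3.0 * (10.0 * 9.81),  # moment-arm amplification factor assumed
    "45 deg trunk flexion, 5 kg per hand": 2200.0,           # from a simple lever-arm model (assumed)
}

for activity, load in applied_loads_N.items():
    phi = load / vertebral_strength_N
    print(f"{activity:38s} Phi = {phi:.2f}")
```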

  12. Bayesian estimation of the relative toxicity of (239)Pu and (226)Ra with dependent competing risks

    NASA Astrophysics Data System (ADS)

    Xiao, Shili

    The purpose of this dissertation research is to compare the toxicity of the alpha-emitting, bone-seeking radionuclides ²³⁹Pu and ²²⁶Ra, develop a model for radiation-induced osteosarcomas, and analyze the survival data of beagles exposed to these radionuclides. This research integrates the knowledge of radiation protection, survival theory and methods (competing risks, maximum likelihood estimation, and Bayesian techniques), numerical integration techniques (Monte Carlo, Lattice rule and Gauss-quadrature) and object-oriented programming in C++. The outline of this research is: (1) survival data preprocessing, (2) model identification and selection, (3) introduction of the FGM model, the dependent competing risk model created by Farlie, Gumbel and Morgenstern, to the study of survival data with dependent competing risks: osteosarcomas and other diseases, development of the crude density of the FGM model and construction of the likelihood function for the FGM model, (4) Bayesian estimates of the posterior marginal density of the toxicity ratio in the FGM model using several numerical integration techniques (Monte Carlo, Lattice rule and Gaussian Quadrature), (5) construction of the likelihood function for the independent competing risk model, Bayesian estimate of the posterior marginal density of toxicity ratio in the model using the Monte Carlo method, which is compared with the posterior marginal densities for the toxicity ratio obtained from the FGM model, (6) Bayesian estimates of all other parameters in the FGM model using the Monte Carlo method, (7) comparison of the cumulative hazard for ²³⁹Pu calculated according to the model with Nelson's cumulative hazard plot under Bayesian point estimates of parameters and the mean activity in each injection level, (8) comparison of the toxicity of plutonium in osteosarcoma with that of radium under Bayesian point estimates of parameters and a selected activity of 0.85 µCi, (9) discussion of Bayesian prediction of the

  13. Impact of provision of cardiovascular disease risk estimates to healthcare professionals and patients: a systematic review

    PubMed Central

    Usher-Smith, Juliet A; Silarova, Barbora; Schuit, Ewoud; GM Moons, Karel; Griffin, Simon J

    2015-01-01

    Objective: To systematically review whether the provision of information on cardiovascular disease (CVD) risk to healthcare professionals and patients impacts their decision-making, behaviour and ultimately patient health. Design: A systematic review. Data sources: An electronic literature search of MEDLINE and PubMed from 01/01/2004 to 01/06/2013 with no language restriction and manual screening of reference lists of systematic reviews on similar topics and all included papers. Eligibility criteria for selecting studies: (1) Primary research published in a peer-reviewed journal; (2) inclusion of participants with no history of CVD; (3) intervention strategy consisted of provision of a CVD risk model estimate to either professionals or patients; and (4) the only difference between the intervention group and control group (or the only intervention in the case of before-after studies) was the provision of a CVD risk model estimate. Results: After duplicates were removed, the initial electronic search identified 9671 papers. We screened 196 papers at title and abstract level and included 17 studies. The heterogeneity of the studies limited the analysis, but together they showed that provision of risk information to patients improved the accuracy of risk perception without decreasing quality of life or increasing anxiety, but had little effect on lifestyle. Providing risk information to physicians increased prescribing of lipid-lowering and blood pressure medication, with greatest effects in those with CVD risk >20% (relative risk for change in prescribing 2.13 (1.02 to 4.63) and 2.38 (1.11 to 5.10) respectively). Overall, there was a trend towards reductions in cholesterol and blood pressure and a statistically significant reduction in modelled CVD risk (−0.39% (−0.71 to −0.07)) after, on average, 12 months. Conclusions: There seems to be evidence that providing CVD risk model estimates to professionals and patients improves perceived CVD risk and medical prescribing

  14. Validation of model-based estimates (synthetic estimates) of the prevalence of risk factors for coronary heart disease for wards in England.

    PubMed

    Scarborough, Peter; Allender, Steven; Rayner, Mike; Goldacre, Michael

    2009-06-01

    Several sets of model-based estimates (synthetic estimates) of the prevalence of risk factors for coronary heart disease for small areas in England have been developed. These have been used in policy documents to indicate which areas are in need of intervention. In general, these models have not been subjected to validity assessment. This paper describes a validity assessment of 16 sets of synthetic estimates, by comparison of the models with national, regional and local survey-based estimates, and local mortality rate estimates. Model-based estimates of the prevalence of smoking, low fruit and vegetable consumption, obesity, hypertension and raised cholesterol are found to be valid.

  15. Health risk estimates for groundwater and soil contamination in the Slovak Republic: a convenient tool for identification and mapping of risk areas.

    PubMed

    Fajčíková, K; Cvečková, V; Stewart, A; Rapant, S

    2014-10-01

    We undertook a quantitative estimation of health risks to residents living in the Slovak Republic and exposed to contaminated groundwater (ingestion by adult population) and/or soils (ingestion by adult and child population). Potential risk areas were mapped to give a visual presentation at basic administrative units of the country (municipalities, districts, regions) for easy discussion with policy and decision-makers. The health risk estimates were calculated by US EPA methods, applying threshold values for chronic risk and non-threshold values for cancer risk. The potential health risk was evaluated for As, Ba, Cd, Cu, F, Hg, Mn, NO₃⁻, Pb, Sb, Se and Zn for groundwater and As, B, Ba, Be, Cd, Cu, F, Hg, Mn, Mo, Ni, Pb, Sb, Se and Zn for soils. An increased health risk was identified mainly in historical mining areas highly contaminated by geogenic-anthropogenic sources (ore deposit occurrence, mining, metallurgy). Arsenic and antimony were the most significant elements in relation to health risks from groundwater and soil contamination in the Slovak Republic contributing a significant part of total chronic risk levels. Health risk estimation for soil contamination has highlighted the significance of exposure through soil ingestion in children. Increased cancer risks from groundwater and soil contamination by arsenic were noted in several municipalities and districts throughout the country in areas with significantly high arsenic levels in the environment. This approach to health risk estimations and visualization represents a fast, clear and convenient tool for delineation of risk areas at national and local levels.

  16. European risk assessment of LAS in agricultural soil revisited: species sensitivity distribution and risk estimates.

    PubMed

    Jensen, John; Smith, Stephen R; Krogh, Paul Henning; Versteeg, Donald J; Temara, Ali

    2007-10-01

    Linear alkylbenzene sulphonate (LAS) is used at a rate of approximately 430,000 tons/y in Western Europe, mainly in laundry detergents. It is present in sewage sludge (70-5,600 mg/kg; 5-95th percentile) because of its high usage per capita, its sorption and precipitation in primary settlers, and its lack of degradation in anaerobic digesters. Immediately after amendment, calculated and measured concentrations are <1 to 60 mg LAS/kg soil. LAS biodegrades rapidly in soil with primary and ultimate half-lives of up to 7 and 30 days, respectively. Calculated residual concentrations after the averaging time (30 days) are 0.24-18 mg LAS/kg soil. The long-term ecotoxicity to soil microbiota is relatively low (EC10 >or=26 mg sludge-associated LAS/kg soil). An extensive review of the invertebrate and plant ecotoxicological data, combined with a probabilistic assessment approach, led to a PNEC value of 35 mg LAS/kg soil, i.e. the 5th percentile (HC5) of the species sensitivity distribution (lognormal distribution of the EC10 and NOEC values). Risk ratios were identified to fall within a range of 0.01 (median LAS concentration in sludge) to 0.1 (95th percentile) and always below 0.5 (maximum LAS concentration measured in sludge) according to various scenarios covering different factors such as local sewage influent concentration, water hardness, and sewage sludge stabilisation process. Based on the present information, it can be concluded that LAS does not represent an ecological risk in Western Europe when applied via normal sludge amendment to agricultural soil. PMID:17765285
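
    A minimal sketch of the probabilistic step described above, assuming a lognormal species sensitivity distribution fitted to chronic NOEC/EC10 values: the HC5 is the 5th percentile of the fitted distribution, and a risk ratio compares an exposure concentration against it. The toxicity values and exposure concentration below are invented for illustration, not the data set used in the paper.

```python
import numpy as np
from scipy import stats

# Illustrative chronic NOEC/EC10 values (mg LAS/kg soil) -- made-up numbers.
noec = np.array([40.0, 55.0, 90.0, 120.0, 160.0, 250.0, 310.0, 480.0])

log_noec = np.log10(noec)
mu, sigma = log_noec.mean(), log_noec.std(ddof=1)

# HC5: 5th percentile of the fitted lognormal species sensitivity distribution.
hc5 = 10 ** (mu + stats.norm.ppf(0.05) * sigma)

# Risk characterisation ratio for an illustrative post-amendment soil concentration.
pec = 5.0  # mg LAS/kg soil (illustrative)
print(f"HC5 = {hc5:.1f} mg/kg, risk ratio PEC/HC5 = {pec / hc5:.3f}")
```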

  17. Radiobiologic risk estimation from dental radiology. Part II. Cancer incidence and fatality

    SciTech Connect

    Underhill, T.E.; Kimura, K.; Chilvarquer, I.; McDavid, W.D.; Langlais, R.P.; Preece, J.W.; Barnwell, G.

    1988-08-01

    With the use of the measured absorbed doses from part I of this article, the specific radiobiologic risk to the patient from (1) five different panoramic machines with rare-earth screens, (2) a 20-film complete-mouth survey with E-speed film, long round cone, (3) a 20-film complete-mouth survey with E-speed film, long rectangular cone, (4) a 4-film interproximal survey with E-speed film, long round cone, and (5) a 4-film interproximal survey with E-speed film, long rectangular cone, was calculated. The estimated risks are expressed in two ways: the probability of radiation-induced cancer in specific organs per million examinations and the probability of expression of a fatal cancer per million examinations. The highest risks calculated were from the complete-mouth survey with the use of round collimation. The lowest risks calculated were from panoramic radiography and four interproximal radiographs with rectangular collimation.
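
    A hedged sketch of the per-examination risk arithmetic described: each organ's dose is multiplied by an organ-specific risk coefficient, summed, and scaled to one million examinations. The doses and coefficients below are placeholders, not the values measured in part I of the article.

```python
# Hedged sketch of per-examination cancer risk arithmetic; doses and risk
# coefficients are placeholders, not the paper's data.

organ_dose_mGy = {            # mean absorbed dose per examination (illustrative)
    "thyroid": 0.30,
    "bone_marrow": 0.06,
    "salivary_glands": 1.20,
}
risk_per_mGy = {              # fatal-cancer risk coefficients per mGy (illustrative)
    "thyroid": 7.5e-7,
    "bone_marrow": 2.0e-6,
    "salivary_glands": 2.5e-7,
}

risk_per_exam = sum(organ_dose_mGy[o] * risk_per_mGy[o] for o in organ_dose_mGy)
print(f"Estimated fatal cancers per million examinations: {risk_per_exam * 1e6:.2f}")
```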

  18. Radiobiologic risk estimation from dental radiology. Part II. Cancer incidence and fatality.

    PubMed

    Underhill, T E; Kimura, K; Chilvarquer, I; McDavid, W D; Langlais, R P; Preece, J W; Barnwell, G

    1988-08-01

    With the use of the measured absorbed doses from part I of this article, the specific radiobiologic risk to the patient from (1) five different panoramic machines with rare-earth screens, (2) a 20-film complete-mouth survey with E-speed film, long round cone, (3) a 20-film complete-mouth survey with E-speed film, long rectangular cone, (4) a 4-film interproximal survey with E-speed film, long round cone, and (5) a 4-film interproximal survey with E-speed film, long rectangular cone, was calculated. The estimated risks are expressed in two ways: the probability of radiation-induced cancer in specific organs per million examinations and the probability of expression of a fatal cancer per million examinations. The highest risks calculated were from the complete-mouth survey with the use of round collimation. The lowest risks calculated were from panoramic radiography and four interproximal radiographs with rectangular collimation.

  19. A Methodological Approach to Small Area Estimation for the Behavioral Risk Factor Surveillance System.

    PubMed

    Pierannunzi, Carol; Xu, Fang; Wallace, Robyn C; Garvin, William; Greenlund, Kurt J; Bartoli, William; Ford, Derek; Eke, Paul; Town, G Machell

    2016-07-14

    Public health researchers have used a class of statistical methods to calculate prevalence estimates for small geographic areas with few direct observations. Many researchers have used Behavioral Risk Factor Surveillance System (BRFSS) data as a basis for their models. The aims of this study were to 1) describe a new BRFSS small area estimation (SAE) method and 2) investigate the internal and external validity of the BRFSS SAEs it produced. The BRFSS SAE method uses 4 data sets (the BRFSS, the American Community Survey Public Use Microdata Sample, Nielsen Claritas population totals, and the Missouri Census Geographic Equivalency File) to build a single weighted data set. Our findings indicate that internal and external validity tests were successful across many estimates. The BRFSS SAE method is one of several methods that can be used to produce reliable prevalence estimates in small geographic areas.
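
    The BRFSS SAE method combines survey microdata with external population totals into a single weighted data set; a common building block for that kind of reweighting is raking (iterative proportional fitting), sketched below on synthetic data. This illustrates the general technique only, not the published BRFSS SAE procedure.

```python
import numpy as np

def rake(weights, categories, control_totals, n_iter=50):
    """Iterative proportional fitting: adjust unit weights so that weighted
    category totals match external control totals on each dimension."""
    w = weights.astype(float)
    for _ in range(n_iter):
        for dim, targets in control_totals.items():
            cats = categories[dim]
            for level, target in targets.items():
                mask = cats == level
                current = w[mask].sum()
                if current > 0:
                    w[mask] *= target / current
    return w

# Synthetic example: 6 respondents raked to sex and age-group control totals.
weights = np.ones(6)
categories = {
    "sex": np.array(["F", "F", "M", "M", "M", "F"]),
    "age": np.array(["18-44", "45+", "18-44", "45+", "45+", "18-44"]),
}
control_totals = {"sex": {"F": 520, "M": 480}, "age": {"18-44": 430, "45+": 570}}

final = rake(weights, categories, control_totals)
print(final.round(1), round(final.sum()))  # weighted totals now match the controls
```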

  20. A Methodological Approach to Small Area Estimation for the Behavioral Risk Factor Surveillance System

    PubMed Central

    Xu, Fang; Wallace, Robyn C.; Garvin, William; Greenlund, Kurt J.; Bartoli, William; Ford, Derek; Eke, Paul; Town, G. Machell

    2016-01-01

    Public health researchers have used a class of statistical methods to calculate prevalence estimates for small geographic areas with few direct observations. Many researchers have used Behavioral Risk Factor Surveillance System (BRFSS) data as a basis for their models. The aims of this study were to 1) describe a new BRFSS small area estimation (SAE) method and 2) investigate the internal and external validity of the BRFSS SAEs it produced. The BRFSS SAE method uses 4 data sets (the BRFSS, the American Community Survey Public Use Microdata Sample, Nielsen Claritas population totals, and the Missouri Census Geographic Equivalency File) to build a single weighted data set. Our findings indicate that internal and external validity tests were successful across many estimates. The BRFSS SAE method is one of several methods that can be used to produce reliable prevalence estimates in small geographic areas. PMID:27418213

  1. Risk estimates for deterministic health effects of inhaled weapons grade plutonium.

    PubMed

    Scott, Bobby R; Peterson, Vern L

    2003-09-01

    Risk estimates for deterministic effects of inhaled weapons-grade plutonium (WG Pu) are needed to evaluate potential serious harm to (1) U.S. Department of Energy nuclear workers from accidental or other work-place releases of WG Pu; and (2) the public from terrorist actions resulting in the release of WG Pu to the environment. Deterministic health effects (the most serious radiobiological consequences to humans) can arise when large amounts of WG Pu are taken into the body. Inhalation is considered the most likely route of intake during work-place accidents or during a nuclear terrorism incident releasing WG Pu to the environment. Our current knowledge about radiation-related harm is insufficient for generating precise estimates of risk for a given WG Pu exposure scenario. This relates largely to uncertainties associated with currently available risk and dosimetry models. Thus, rather than generating point estimates of risk, distributions that account for variability/uncertainty are needed to properly characterize potential harm to humans from a given WG Pu exposure scenario. In this manuscript, we generate and summarize risk distributions for deterministic radiation effects in the lungs of nuclear workers from inhaled WG Pu particles (standard isotopic mix). These distributions were developed using NUREG/CR-4214 risk models and time-dependent, dose conversion factor data based on Publication 30 of the International Commission on Radiological Protection. Dose conversion factors based on ICRP Publication 30 are more relevant to deterministic effects than are the dose conversion factors based on ICRP Publication 66, which relate to targets for stochastic effects. Risk distributions that account for NUREG/CR-4214 parameter and model uncertainties were generated using the Monte Carlo method. Risks were evaluated for both lethality (from radiation pneumonitis) and morbidity (due to radiation-induced respiratory dysfunction) and were found to depend strongly on absorbed
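
    A hedged sketch of the Monte Carlo propagation described, assuming the two-parameter Weibull hazard form commonly quoted for deterministic effects, risk = 1 - exp(-ln 2 * (D/D50)^V). The parameter distributions and lung-dose distribution below are placeholders, not NUREG/CR-4214 values.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

# Placeholder parameter distributions (not NUREG/CR-4214 values): median lethal
# dose D50 and shape factor V for radiation pneumonitis, plus a lognormal
# distribution of absorbed lung dose for the exposure scenario.
d50 = rng.normal(loc=10.0, scale=1.5, size=n)              # Gy
v = rng.normal(loc=5.0, scale=1.0, size=n)                 # shape factor
dose = rng.lognormal(mean=np.log(6.0), sigma=0.4, size=n)  # Gy to lung

# Two-parameter Weibull hazard form: risk = 1 - exp(-ln(2) * (dose / D50)^V)
hazard = np.log(2.0) * (dose / np.clip(d50, 1e-3, None)) ** np.clip(v, 1.0, None)
risk = 1.0 - np.exp(-hazard)

print("median lethality risk:", round(float(np.median(risk)), 3))
print("5th-95th percentile:", np.round(np.percentile(risk, [5, 95]), 3))
```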

  2. Estimating Risk of Natural Gas Portfolios by Using GARCH-EVT-Copula Model.

    PubMed

    Tang, Jiechen; Zhou, Chao; Yuan, Xinyu; Sriboonchitta, Songsak

    2015-01-01

    This paper concentrates on estimating the risk of Title Transfer Facility (TTF) Hub natural gas portfolios by using the GARCH-EVT-copula model. We first use the univariate ARMA-GARCH model to model each natural gas return series. Second, the extreme value distribution (EVT) is fitted to the tails of the residuals to model marginal residual distributions. Third, multivariate Gaussian copula and Student t-copula are employed to describe the natural gas portfolio risk dependence structure. Finally, we simulate N portfolios and estimate value at risk (VaR) and conditional value at risk (CVaR). Our empirical results show that, for an equally weighted portfolio of five natural gases, the VaR and CVaR values obtained from the Student t-copula are larger than those obtained from the Gaussian copula. Moreover, when minimizing the portfolio risk, the optimal natural gas portfolio weights are found to be similar across the multivariate Gaussian copula and Student t-copula and different confidence levels. PMID:26351652
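
    A simplified sketch of the final simulation step of a GARCH-EVT-copula analysis: draw dependent uniforms from a Student t copula, map them through marginal distributions, and read off portfolio VaR and CVaR. The correlation matrix, degrees of freedom and marginal distributions below are placeholders; the paper additionally fits ARMA-GARCH models and EVT tails for each gas before this step.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_sim, nu = 50_000, 6                 # copula degrees of freedom (placeholder)

# Placeholder correlation matrix for 3 return series (the paper uses 5 gases).
R = np.array([[1.0, 0.6, 0.4],
              [0.6, 1.0, 0.5],
              [0.4, 0.5, 1.0]])
L = np.linalg.cholesky(R)

# 1) Simulate from a Student-t copula.
z = rng.standard_normal((n_sim, 3)) @ L.T
w = rng.chisquare(nu, size=(n_sim, 1)) / nu
u = stats.t.cdf(z / np.sqrt(w), df=nu)            # dependent uniforms in (0, 1)

# 2) Map uniforms through marginal distributions (placeholder Student-t margins;
#    the paper would use ARMA-GARCH-filtered residuals with EVT tails here).
returns = stats.t.ppf(u, df=5) * 0.02             # daily returns, ~2% scale

# 3) Equally weighted portfolio risk measures.
port = returns.mean(axis=1)
var95 = -np.quantile(port, 0.05)
cvar95 = -port[port <= np.quantile(port, 0.05)].mean()
print(f"95% VaR = {var95:.4f}, 95% CVaR = {cvar95:.4f}")
```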

  3. Estimating Risk of Natural Gas Portfolios by Using GARCH-EVT-Copula Model

    PubMed Central

    Tang, Jiechen; Zhou, Chao; Yuan, Xinyu; Sriboonchitta, Songsak

    2015-01-01

    This paper concentrates on estimating the risk of Title Transfer Facility (TTF) Hub natural gas portfolios by using the GARCH-EVT-copula model. We first use the univariate ARMA-GARCH model to model each natural gas return series. Second, the extreme value distribution (EVT) is fitted to the tails of the residuals to model marginal residual distributions. Third, multivariate Gaussian copula and Student t-copula are employed to describe the natural gas portfolio risk dependence structure. Finally, we simulate N portfolios and estimate value at risk (VaR) and conditional value at risk (CVaR). Our empirical results show that, for an equally weighted portfolio of five natural gases, the VaR and CVaR values obtained from the Student t-copula are larger than those obtained from the Gaussian copula. Moreover, when minimizing the portfolio risk, the optimal natural gas portfolio weights are found to be similar across the multivariate Gaussian copula and Student t-copula and different confidence levels. PMID:26351652

  4. Estimating Risk of Natural Gas Portfolios by Using GARCH-EVT-Copula Model.

    PubMed

    Tang, Jiechen; Zhou, Chao; Yuan, Xinyu; Sriboonchitta, Songsak

    2015-01-01

    This paper concentrates on estimating the risk of Title Transfer Facility (TTF) Hub natural gas portfolios by using the GARCH-EVT-copula model. We first use the univariate ARMA-GARCH model to model each natural gas return series. Second, the extreme value distribution (EVT) is fitted to the tails of the residuals to model marginal residual distributions. Third, multivariate Gaussian copula and Student t-copula are employed to describe the natural gas portfolio risk dependence structure. Finally, we simulate N portfolios and estimate value at risk (VaR) and conditional value at risk (CVaR). Our empirical results show that, for an equally weighted portfolio of five natural gases, the VaR and CVaR values obtained from the Student t-copula are larger than those obtained from the Gaussian copula. Moreover, when minimizing the portfolio risk, the optimal natural gas portfolio weights are found to be similar across the multivariate Gaussian copula and Student t-copula and different confidence levels.

  5. 49 CFR Appendix G to Part 222 - Excess Risk Estimates for Public Highway-Rail Grade Crossings

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    Appendix G to 49 CFR Part 222, "Excess Risk Estimates for Public Highway-Rail Grade Crossings," tabulates excess risk estimates by warning type (ban effects/train horn effectiveness).

  6. 49 CFR Appendix G to Part 222 - Excess Risk Estimates for Public Highway-Rail Grade Crossings

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    Appendix G to 49 CFR Part 222, "Excess Risk Estimates for Public Highway-Rail Grade Crossings," tabulates excess risk estimates by warning type (ban effects/train horn effectiveness).

  7. 49 CFR Appendix G to Part 222 - Excess Risk Estimates for Public Highway-Rail Grade Crossings

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    Appendix G to 49 CFR Part 222, "Excess Risk Estimates for Public Highway-Rail Grade Crossings," tabulates excess risk estimates by warning type (ban effects/train horn effectiveness).

  8. 49 CFR Appendix G to Part 222 - Excess Risk Estimates for Public Highway-Rail Grade Crossings

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    Appendix G to 49 CFR Part 222, "Excess Risk Estimates for Public Highway-Rail Grade Crossings," tabulates excess risk estimates by warning type (ban effects/train horn effectiveness).

  9. Estimating the risks of cancer mortality and genetic defects resulting from exposures to low levels of ionizing radiation

    SciTech Connect

    Buhl, T.E.; Hansen, W.R.

    1984-05-01

    Estimators for calculating the risk of cancer and genetic disorders induced by exposure to ionizing radiation have been recommended by the US National Academy of Sciences Committee on the Biological Effects of Ionizing Radiations, the UN Scientific Committee on the Effects of Atomic Radiation, and the International Committee on Radiological Protection. These groups have also considered the risks of somatic effects other than cancer. The US National Council on Radiation Protection and Measurements has discussed risk estimate procedures for radiation-induced health effects. The recommendations of these national and international advisory committees are summarized and compared in this report. Based on this review, two procedures for risk estimation are presented for use in radiological assessments performed by the US Department of Energy under the National Environmental Policy Act of 1969 (NEPA). In the first procedure, age- and sex-averaged risk estimators calculated with US average demographic statistics would be used with estimates of radiation dose to calculate the projected risk of cancer and genetic disorders that would result from the operation being reviewed under NEPA. If more site-specific risk estimators are needed, and the demographic information is available, a second procedure is described that would involve direct calculation of the risk estimators using recommended risk-rate factors. The computer program REPCAL has been written to perform this calculation and is described in this report. 25 references, 16 tables.
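
    A minimal sketch of the first procedure described: an age- and sex-averaged risk estimator is formed by weighting age- and sex-specific risk-rate factors with population fractions, and is then applied to a collective dose. The factors and demographic fractions below are invented placeholders, not the values used by REPCAL.

```python
# Hedged sketch of a population-averaged risk estimator; risk-rate factors and
# demographic fractions are illustrative placeholders.

# Lifetime fatal-cancer risk per person-Sv, by (sex, age group) -- illustrative.
risk_rate = {
    ("F", "0-19"): 1.1e-1, ("F", "20-64"): 5.0e-2, ("F", "65+"): 1.5e-2,
    ("M", "0-19"): 9.0e-2, ("M", "20-64"): 4.0e-2, ("M", "65+"): 1.2e-2,
}
# Fraction of the population in each (sex, age group) -- illustrative, sums to 1.
pop_fraction = {
    ("F", "0-19"): 0.12, ("F", "20-64"): 0.30, ("F", "65+"): 0.09,
    ("M", "0-19"): 0.13, ("M", "20-64"): 0.29, ("M", "65+"): 0.07,
}

averaged_estimator = sum(risk_rate[k] * pop_fraction[k] for k in risk_rate)
collective_dose_person_sv = 25.0   # illustrative facility collective dose
print(f"averaged risk estimator: {averaged_estimator:.3e} per person-Sv")
print(f"projected excess cancers: {averaged_estimator * collective_dose_person_sv:.2f}")
```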

  10. [Estimation of the risk of Alzheimer type dementia based on "Prognostic Questionnaire"].

    PubMed

    Bidzan, Leszek; Bidzan, Mariola

    2002-01-01

    The aim of this study was to verify the usefulness of the "Prognostic Questionnaire" for estimating the risk of Alzheimer-type dementia. The baseline sample comprised 286 persons who were assessed in 1991-1993 with the scales that make up the "Prognostic Questionnaire". Of the 286 persons originally enrolled, 163 took part in the follow-up examination. An MMSE score of 25 points or lower prompted a full psychiatric examination aimed at confirming or excluding Alzheimer-type dementia. Possible Alzheimer disease was diagnosed in 23 persons. The sensitivity and specificity of the "Prognostic Questionnaire" were determined for several cut-off values. The study demonstrated the usefulness of jointly interpreting the results of the several clinical scales that make up the "Prognostic Questionnaire" for estimating the risk of Alzheimer-type dementia. PMID:12647439

  11. Quantitative Cyber Risk Reduction Estimation Methodology for a Small Scada Control System

    SciTech Connect

    Miles A. McQueen; Wayne F. Boyer; Mark A. Flynn; George A. Beitel

    2006-01-01

    We propose a new methodology for obtaining a quick quantitative measurement of the risk reduction achieved when a control system is modified with the intent to improve cyber security defense against external attackers. The proposed methodology employs a directed graph called a compromise graph, where the nodes represent stages of a potential attack and the edges represent the expected time-to-compromise for differing attacker skill levels. Time-to-compromise is modeled as a function of known vulnerabilities and attacker skill level. The methodology was used to calculate risk reduction estimates for a specific SCADA system and for a specific set of control system security remedial actions. Despite an 86% reduction in the total number of vulnerabilities, the estimated time-to-compromise was increased only by about 3 to 30% depending on target and attacker skill level.
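
    A minimal sketch of the compromise-graph idea: nodes are attack stages, edge weights are expected times-to-compromise for a given attacker skill level, and the overall estimate is the shortest cumulative time from an entry node to the target. The graph and weights below are invented for illustration.

```python
import heapq

def shortest_time_to_compromise(graph, start, target):
    """Dijkstra over a compromise graph whose edge weights are expected
    times-to-compromise (in days) for a given attacker skill level."""
    best = {start: 0.0}
    heap = [(0.0, start)]
    while heap:
        t, node = heapq.heappop(heap)
        if node == target:
            return t
        if t > best.get(node, float("inf")):
            continue
        for nxt, dt in graph.get(node, {}).items():
            nt = t + dt
            if nt < best.get(nxt, float("inf")):
                best[nxt] = nt
                heapq.heappush(heap, (nt, nxt))
    return float("inf")

# Invented example graph for a medium-skill attacker (days per edge).
graph = {
    "internet": {"dmz_server": 5.0, "vpn": 12.0},
    "dmz_server": {"historian": 9.0},
    "vpn": {"control_lan": 4.0},
    "historian": {"control_lan": 6.0},
    "control_lan": {"plc": 3.0},
}
print("expected days to compromise PLC:",
      shortest_time_to_compromise(graph, "internet", "plc"))
```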

  12. Uncertainties in estimating health risks associated with exposure to ionising radiation.

    PubMed

    Preston, R Julian; Boice, John D; Brill, A Bertrand; Chakraborty, Ranajit; Conolly, Rory; Hoffman, F Owen; Hornung, Richard W; Kocher, David C; Land, Charles E; Shore, Roy E; Woloschak, Gayle E

    2013-09-01

    The information for the present discussion on the uncertainties associated with estimation of radiation risks and probability of disease causation was assembled for the recently published NCRP Report No. 171 on this topic. This memorandum provides a timely overview of the topic, given that quantitative uncertainty analysis is the state of the art in health risk assessment and given its potential importance to developments in radiation protection. Over the past decade the increasing volume of epidemiology data and the supporting radiobiology findings have aided in the reduction of uncertainty in the risk estimates derived. However, it is equally apparent that there remain significant uncertainties related to dose assessment, low dose and low dose-rate extrapolation approaches (e.g. the selection of an appropriate dose and dose-rate effectiveness factor), the biological effectiveness where considerations of the health effects of high-LET and lower-energy low-LET radiations are required and the transfer of risks from a population for which health effects data are available to one for which such data are not available. The impact of radiation on human health has focused in recent years on cancer, although there has been a decided increase in the data for noncancer effects together with more reliable estimates of the risk following radiation exposure, even at relatively low doses (notably for cataracts and cardiovascular disease). New approaches for the estimation of hereditary risk have been developed with the use of human data whenever feasible, although the current estimates of heritable radiation effects still are based on mouse data because of an absence of effects in human studies. Uncertainties associated with estimation of these different types of health effects are discussed in a qualitative and semi-quantitative manner as appropriate. The way forward would seem to require additional epidemiological studies, especially studies of low dose and low dose

  13. Uncertainties in estimating health risks associated with exposure to ionising radiation.

    PubMed

    Preston, R Julian; Boice, John D; Brill, A Bertrand; Chakraborty, Ranajit; Conolly, Rory; Hoffman, F Owen; Hornung, Richard W; Kocher, David C; Land, Charles E; Shore, Roy E; Woloschak, Gayle E

    2013-09-01

    The information for the present discussion on the uncertainties associated with estimation of radiation risks and probability of disease causation was assembled for the recently published NCRP Report No. 171 on this topic. This memorandum provides a timely overview of the topic, given that quantitative uncertainty analysis is the state of the art in health risk assessment and given its potential importance to developments in radiation protection. Over the past decade the increasing volume of epidemiology data and the supporting radiobiology findings have aided in the reduction of uncertainty in the risk estimates derived. However, it is equally apparent that there remain significant uncertainties related to dose assessment, low dose and low dose-rate extrapolation approaches (e.g. the selection of an appropriate dose and dose-rate effectiveness factor), the biological effectiveness where considerations of the health effects of high-LET and lower-energy low-LET radiations are required and the transfer of risks from a population for which health effects data are available to one for which such data are not available. The impact of radiation on human health has focused in recent years on cancer, although there has been a decided increase in the data for noncancer effects together with more reliable estimates of the risk following radiation exposure, even at relatively low doses (notably for cataracts and cardiovascular disease). New approaches for the estimation of hereditary risk have been developed with the use of human data whenever feasible, although the current estimates of heritable radiation effects still are based on mouse data because of an absence of effects in human studies. Uncertainties associated with estimation of these different types of health effects are discussed in a qualitative and semi-quantitative manner as appropriate. The way forward would seem to require additional epidemiological studies, especially studies of low dose and low dose

  14. Estimating risks of importation and local transmission of Zika virus infection

    PubMed Central

    Nah, Kyeongah; Mizumoto, Kenji; Miyamatsu, Yuichiro; Yasuda, Yohei; Kinoshita, Ryo

    2016-01-01

    Background. An international spread of Zika virus (ZIKV) infection has attracted global attention. ZIKV is conveyed by a mosquito vector, Aedes species, which also acts as the vector species of dengue and chikungunya viruses. Methods. Arrival time of ZIKV importation (i.e., the time at which the first imported case was diagnosed) in each imported country was collected from publicly available data sources. Employing a survival analysis model in which the hazard is an inverse function of the effective distance as informed by the airline transportation network data, and using dengue and chikungunya virus transmission data, risks of importation and local transmission were estimated. Results. A total of 78 countries with imported case(s) have been identified, with the arrival time ranging from 1 to 44 weeks since the first ZIKV was identified in Brazil, 2015. Whereas the risk of importation was well explained by the airline transportation network data, the risk of local transmission appeared to be best captured by additionally accounting for the presence of dengue and chikungunya viruses. Discussion. The risk of importation may be high given continued global travel of mildly infected travelers but, considering that the public health concerns over ZIKV infection stems from microcephaly, it is more important to focus on the risk of local and widespread transmission that could involve pregnant women. The predicted risk of local transmission was frequently seen in tropical and subtropical countries with dengue or chikungunya epidemic experience. PMID:27069825
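
    A minimal sketch of the importation-hazard structure described, assuming a constant hazard inversely proportional to effective distance, so the probability that a country has diagnosed an imported case by week t is 1 - exp(-beta * t / d_eff). The effective distances and hazard scale below are illustrative, not the paper's estimates.

```python
import numpy as np

# Illustrative effective distances from the source country (dimensionless)
# and a hazard scale beta -- placeholders, not the paper's fitted values.
effective_distance = {"country_A": 2.1, "country_B": 4.8, "country_C": 9.5}
beta = 0.35

def importation_probability(d_eff, weeks, beta=beta):
    """P(first imported case diagnosed by `weeks`) under a constant hazard
    h = beta / d_eff, i.e. an exponential arrival-time model."""
    hazard = beta / d_eff
    return 1.0 - np.exp(-hazard * weeks)

for country, d in effective_distance.items():
    print(country, round(float(importation_probability(d, weeks=44)), 3))
```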

  15. Estimating risks of importation and local transmission of Zika virus infection.

    PubMed

    Nah, Kyeongah; Mizumoto, Kenji; Miyamatsu, Yuichiro; Yasuda, Yohei; Kinoshita, Ryo; Nishiura, Hiroshi

    2016-01-01

    Background. An international spread of Zika virus (ZIKV) infection has attracted global attention. ZIKV is conveyed by a mosquito vector, Aedes species, which also acts as the vector species of dengue and chikungunya viruses. Methods. Arrival time of ZIKV importation (i.e., the time at which the first imported case was diagnosed) in each imported country was collected from publicly available data sources. Employing a survival analysis model in which the hazard is an inverse function of the effective distance as informed by the airline transportation network data, and using dengue and chikungunya virus transmission data, risks of importation and local transmission were estimated. Results. A total of 78 countries with imported case(s) have been identified, with the arrival time ranging from 1 to 44 weeks since the first ZIKV was identified in Brazil, 2015. Whereas the risk of importation was well explained by the airline transportation network data, the risk of local transmission appeared to be best captured by additionally accounting for the presence of dengue and chikungunya viruses. Discussion. The risk of importation may be high given continued global travel of mildly infected travelers but, considering that the public health concerns over ZIKV infection stems from microcephaly, it is more important to focus on the risk of local and widespread transmission that could involve pregnant women. The predicted risk of local transmission was frequently seen in tropical and subtropical countries with dengue or chikungunya epidemic experience.

  16. Estimating risks of importation and local transmission of Zika virus infection.

    PubMed

    Nah, Kyeongah; Mizumoto, Kenji; Miyamatsu, Yuichiro; Yasuda, Yohei; Kinoshita, Ryo; Nishiura, Hiroshi

    2016-01-01

    Background. An international spread of Zika virus (ZIKV) infection has attracted global attention. ZIKV is conveyed by a mosquito vector, Aedes species, which also acts as the vector species of dengue and chikungunya viruses. Methods. Arrival time of ZIKV importation (i.e., the time at which the first imported case was diagnosed) in each imported country was collected from publicly available data sources. Employing a survival analysis model in which the hazard is an inverse function of the effective distance as informed by the airline transportation network data, and using dengue and chikungunya virus transmission data, risks of importation and local transmission were estimated. Results. A total of 78 countries with imported case(s) have been identified, with the arrival time ranging from 1 to 44 weeks since the first ZIKV was identified in Brazil, 2015. Whereas the risk of importation was well explained by the airline transportation network data, the risk of local transmission appeared to be best captured by additionally accounting for the presence of dengue and chikungunya viruses. Discussion. The risk of importation may be high given continued global travel of mildly infected travelers but, considering that the public health concerns over ZIKV infection stems from microcephaly, it is more important to focus on the risk of local and widespread transmission that could involve pregnant women. The predicted risk of local transmission was frequently seen in tropical and subtropical countries with dengue or chikungunya epidemic experience. PMID:27069825

  17. Cancer risk estimates from radiation therapy for heterotopic ossification prophylaxis after total hip arthroplasty

    SciTech Connect

    Mazonakis, Michalis; Berris, Theoharris; Damilakis, John; Lyraraki, Efrossyni

    2013-10-15

    Purpose: Heterotopic ossification (HO) is a frequent complication following total hip arthroplasty. This study was conducted to calculate the radiation dose to organs-at-risk and estimate the probability of cancer induction from radiotherapy for HO prophylaxis. Methods: Hip irradiation for HO with a 6 MV photon beam was simulated with the aid of a Monte Carlo model. A realistic humanoid phantom representing an average adult patient was implemented in Monte Carlo environment for dosimetric calculations. The average out-of-field radiation dose to stomach, liver, lung, prostate, bladder, thyroid, breast, uterus, and ovary was calculated. The organ-equivalent-dose to colon, that was partly included within the treatment field, was also determined. Organ dose calculations were carried out using three different field sizes. The dependence of organ doses upon the block insertion into primary beam for shielding colon and prosthesis was investigated. The lifetime attributable risk for cancer development was estimated using organ, age, and gender-specific risk coefficients. Results: For a typical target dose of 7 Gy, organ doses varied from 1.0 to 741.1 mGy by the field dimensions and organ location relative to the field edge. Blocked field irradiations resulted in a dose range of 1.4–146.3 mGy. The most probable detriment from open field treatment of male patients was colon cancer with a high risk of 564.3 × 10⁻⁵ to 837.4 × 10⁻⁵ depending upon the organ dose magnitude and the patient's age. The corresponding colon cancer risk for female patients was (372.2–541.0) × 10⁻⁵. The probability of bladder cancer development was more than 113.7 × 10⁻⁵ and 110.3 × 10⁻⁵ for males and females, respectively. The cancer risk range to other individual organs was reduced to (0.003–68.5) × 10⁻⁵. Conclusions: The risk for cancer induction from radiation therapy for HO prophylaxis after total hip arthroplasty varies considerably by the

  18. The economic value of reducing environmental health risks: Contingent valuation estimates of the value of information

    SciTech Connect

    Krieger, D.J.; Hoehn, J.P.

    1999-05-01

    Obtaining economically consistent values for changes in low probability health risks continues to be a challenge for contingent valuation (CV) as well as for other valuation methods. One of the cited conditions for economic consistency is that estimated values be sensitive to the scope (differences in quantity or quality) of a good described in a CV application. The alleged limitations of CV pose a particular problem for environmental managers who must often make decisions that affect human health risks. This paper demonstrates that a well-designed CV application can elicit scope sensitive values even for programs that provide conceptually complex goods such as risk reduction. Specifically, it finds that the amount sport anglers are willing to pay for information about chemical residues in fish varies systematically with informativeness--a relationship suggested by the theory of information value.

  19. Recent estimates of cancer risk from low-let ionizing radiation and radiation protection limits

    NASA Astrophysics Data System (ADS)

    Sinclair, Warren K.

    1992-07-01

    Estimates of the risk of cancer induction, formerly about 1%/Sv, formed the basis of ICRP radiation protection limits in 1977. They have now increased to about 4-5%/Sv for low doses. These increases are based mainly on new data for the Japanese survivors of the A-bombs of 1945. They result from the accumulation of 11 years more of data on solid tumors, the revisions in the dosimetry of those exposed and improvement in statistical methods and projections. The application of a dose rate effectiveness factor between effects at high dose rate and those at low dose and dose rate is also an important consideration. Not only has the total risk changed but also the distribution of risk among organs. Thus the effective dose equivalent may require modification. These changes are modifying ICRP and NCRP thinking about recommendations on protection limits, especially for radiation workers.

  20. Forest fire risk estimation from time series analysis of NOAA NDVI data

    NASA Astrophysics Data System (ADS)

    Gabban, Andrea; Liberta, Giorgio; San-Miguel-Ayanz, Jesus; Barbosa, Paulo

    2004-02-01

    The values of the Normalized Difference Vegetation Index obtained from NOAA Advanced Very High Resolution Radiometer (AVHRR) have often been used for forestry applications, including the assessment of fire risk. Forest fire risk estimates were based mainly on the decrease of NDVI values during the summer in areas subject to summer drought. However, the inter-annual variability of the vegetation response has never been extensively taken into account. The present work was based on the assumption that Mediterranean vegetation is adapted to summer drought and one possible estimator of the vegetation stress was the inter-annual variability of the vegetation status, as reflected by NDVI values. This article presents a novel methodology for the assessment of fire risk based on the comparison of the current NDVI values, on a given area, with the historical values along a time series of 13 years. The first part of the study is focused on the characterization of the Minimum and Maximum long term daily images. The second part is centered on the best method to compare the long term Maximum and Minimum with the current NDVI. A statistical index, Dynamic Relative Greenness (DRG), was tested as a novel potential fire risk indicator.
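
    A minimal sketch of a relative-greenness style index consistent with the description: the current NDVI is rescaled between the long-term minimum and maximum for each pixel, with low values indicating stressed, more fire-prone vegetation. The exact DRG definition in the paper may differ; the arrays below are toy values.

```python
import numpy as np

def dynamic_relative_greenness(ndvi_now, ndvi_min, ndvi_max):
    """Relative-greenness style index: 0 when a pixel is at its historical
    minimum NDVI, 100 at its historical maximum."""
    span = np.clip(ndvi_max - ndvi_min, 1e-6, None)
    return np.clip(100.0 * (ndvi_now - ndvi_min) / span, 0.0, 100.0)

# Toy 2x2 pixel example.
ndvi_now = np.array([[0.30, 0.55], [0.42, 0.20]])
ndvi_min = np.array([[0.15, 0.40], [0.30, 0.18]])
ndvi_max = np.array([[0.60, 0.75], [0.65, 0.55]])
print(dynamic_relative_greenness(ndvi_now, ndvi_min, ndvi_max))
```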

  1. Estimates of Radiation Doses and Cancer Risk from Food Intake in Korea.

    PubMed

    Moon, Eun-Kyeong; Ha, Wi-Ho; Seo, Songwon; Jin, Young Woo; Jeong, Kyu Hwan; Yoon, Hae-Jung; Kim, Hyoung-Soo; Hwang, Myung-Sil; Choi, Hoon; Lee, Won Jin

    2016-01-01

    The aim of this study was to estimate internal radiation doses and lifetime cancer risk from food ingestion. Radiation doses from food intake were calculated using the Korea National Health and Nutrition Examination Survey and the measured radioactivity of (134)Cs, (137)Cs, and (131)I from the Ministry of Food and Drug Safety in Korea. The total number of measured data points was 8,496 (3,643 for agricultural products, 644 for livestock products, 43 for milk products, 3,193 for marine products, and 973 for processed food). Cancer risk was calculated by multiplying the estimated committed effective dose and the detriment adjusted nominal risk coefficients recommended by the International Commission on Radiological Protection. The lifetime committed effective doses from the daily diet range from 2.957 to 3.710 mSv. Excess lifetime cancer risks are 14.4-18.1, 0.4-0.5, and 1.8-2.3 per 100,000 for all solid cancers combined, thyroid cancer, and leukemia, respectively.
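
    A hedged sketch of the multiplication described: excess lifetime risk equals committed effective dose times a detriment-adjusted nominal risk coefficient. The coefficient used below (5.5 × 10⁻² per Sv for all cancers, whole population) is the commonly quoted ICRP Publication 103 value and is included only for illustration; the study applies outcome-specific coefficients.

```python
# Hedged sketch: excess lifetime risk = committed effective dose x
# detriment-adjusted nominal risk coefficient. The coefficient below is the
# commonly quoted ICRP 103 whole-population value, used only as an illustration.

committed_dose_msv = 3.7          # lifetime committed effective dose from diet
cancer_coeff_per_sv = 5.5e-2      # detriment-adjusted nominal risk, all cancers

excess_risk = (committed_dose_msv / 1000.0) * cancer_coeff_per_sv
print(f"excess lifetime cancer risk: {excess_risk:.1e} "
      f"({excess_risk * 1e5:.1f} per 100,000)")
```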

  2. Genetic risk and longitudinal disease activity in systemic lupus erythematosus using targeted maximum likelihood estimation.

    PubMed

    Gianfrancesco, M A; Balzer, L; Taylor, K E; Trupin, L; Nititham, J; Seldin, M F; Singer, A W; Criswell, L A; Barcellos, L F

    2016-09-01

    Systemic lupus erythematosus (SLE) is a chronic autoimmune disease associated with genetic and environmental risk factors. However, the extent to which genetic risk is causally associated with disease activity is unknown. We utilized longitudinal targeted maximum likelihood estimation to estimate the causal association between a genetic risk score (GRS) comprising 41 established SLE variants and clinically important disease activity as measured by the validated Systemic Lupus Activity Questionnaire (SLAQ) in a multiethnic cohort of 942 individuals with SLE. We did not find evidence of a clinically important SLAQ score difference (>4.0) for individuals with a high GRS compared with those with a low GRS across nine time points after controlling for sex, ancestry, renal status, dialysis, disease duration, treatment, depression, smoking and education, as well as time-dependent confounding of missing visits. Individual single-nucleotide polymorphism (SNP) analyses revealed that 12 of the 41 variants were significantly associated with clinically relevant changes in SLAQ scores across time points eight and nine after controlling for multiple testing. Results based on sophisticated causal modeling of longitudinal data in a large patient cohort suggest that individual SLE risk variants may influence disease activity over time. Our findings also emphasize a role for other biological or environmental factors. PMID:27467283

  3. Estimates of Radiation Doses and Cancer Risk from Food Intake in Korea

    PubMed Central

    2016-01-01

    The aim of this study was to estimate internal radiation doses and lifetime cancer risk from food ingestion. Radiation doses from food intake were calculated using the Korea National Health and Nutrition Examination Survey and the measured radioactivity of 134Cs, 137Cs, and 131I from the Ministry of Food and Drug Safety in Korea. The total number of measured data points was 8,496 (3,643 for agricultural products, 644 for livestock products, 43 for milk products, 3,193 for marine products, and 973 for processed food). Cancer risk was calculated by multiplying the estimated committed effective dose and the detriment adjusted nominal risk coefficients recommended by the International Commission on Radiological Protection. The lifetime committed effective doses from the daily diet range from 2.957 to 3.710 mSv. Excess lifetime cancer risks are 14.4-18.1, 0.4-0.5, and 1.8-2.3 per 100,000 for all solid cancers combined, thyroid cancer, and leukemia, respectively. PMID:26770031

  4. Injury Risk Estimation Expertise: Cognitive-Perceptual Mechanisms of ACL-IQ.

    PubMed

    Petushek, Erich J; Cokely, Edward T; Ward, Paul; Myer, Gregory D

    2015-06-01

    Instrument-based biomechanical movement analysis is an effective injury screening method but relies on expensive equipment and time-consuming analysis. Screening methods that rely on visual inspection and perceptual skill for prognosticating injury risk provide an alternative approach that can significantly reduce cost and time. However, substantial individual differences exist in skill when estimating injury risk performance via observation. The underlying perceptual-cognitive mechanisms of injury risk identification were explored to better understand the nature of this skill and provide a foundation for improving performance. Quantitative structural and process modeling of risk estimation indicated that superior performance was largely mediated by specific strategies and skills (e.g., irrelevant information reduction), and independent of domain-general cognitive abilities (e.g., mental rotation, general decision skill). These cognitive models suggest that injury prediction expertise (i.e., ACL-IQ) is a trainable skill, and provide a foundation for future research and applications in training, decision support, and ultimately clinical screening investigations.

  5. Estimates of Radiation Doses and Cancer Risk from Food Intake in Korea.

    PubMed

    Moon, Eun-Kyeong; Ha, Wi-Ho; Seo, Songwon; Jin, Young Woo; Jeong, Kyu Hwan; Yoon, Hae-Jung; Kim, Hyoung-Soo; Hwang, Myung-Sil; Choi, Hoon; Lee, Won Jin

    2016-01-01

    The aim of this study was to estimate internal radiation doses and lifetime cancer risk from food ingestion. Radiation doses from food intake were calculated using the Korea National Health and Nutrition Examination Survey and the measured radioactivity of (134)Cs, (137)Cs, and (131)I from the Ministry of Food and Drug Safety in Korea. The total number of measured data points was 8,496 (3,643 for agricultural products, 644 for livestock products, 43 for milk products, 3,193 for marine products, and 973 for processed food). Cancer risk was calculated by multiplying the estimated committed effective dose and the detriment adjusted nominal risk coefficients recommended by the International Commission on Radiological Protection. The lifetime committed effective doses from the daily diet range from 2.957 to 3.710 mSv. Excess lifetime cancer risks are 14.4-18.1, 0.4-0.5, and 1.8-2.3 per 100,000 for all solid cancers combined, thyroid cancer, and leukemia, respectively. PMID:26770031

  6. Estimation of sport fish harvest for risk and hazard assessment of environmental contaminants

    SciTech Connect

    Poston, T.M.; Strenge, D.L.

    1989-01-01

    Consumption of contaminated fish flesh can be a significant route of human exposure to hazardous chemicals. Estimation of exposure resulting from the consumption of fish requires knowledge of fish consumption and contaminant levels in the edible portion of fish. Realistic figures of sport fish harvest are needed to estimate consumption. Estimates of freshwater sport fish harvest were developed from a review of 72 articles and reports. Descriptive statistics based on fishing pressure were derived from harvest data for four distinct groups of freshwater sport fish in three water types: streams, lakes, and reservoirs. Regression equations were developed to relate harvest to surface area fished where data bases were sufficiently large. Other aspects of estimating human exposure to contaminants in fish flesh that are discussed include use of bioaccumulation factors for trace metals and organic compounds. Using the bioaccumulation factor and the concentration of contaminants in water as variables in the exposure equation may also lead to less precise estimates of tissue concentration. For instance, muscle levels of contaminants may not increase proportionately with increases in water concentrations, leading to overestimation of risk. In addition, estimates of water concentration may be variable or expressed in a manner that does not truly represent biological availability of the contaminant. These factors are discussed. 45 refs., 1 fig., 7 tabs.
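
    A minimal sketch of the exposure equation discussed: fish tissue concentration is approximated as water concentration times a bioaccumulation factor, then combined with an annual harvest (consumption) rate and body weight. All parameter values below are illustrative placeholders.

```python
# Hedged sketch of the fish-ingestion exposure equation discussed above;
# all parameter values are illustrative placeholders.

def fish_ingestion_dose(c_water_mg_per_l, baf_l_per_kg, harvest_kg_per_yr,
                        bw_kg=70.0):
    """Average daily dose (mg/kg-day) from consuming sport-caught fish."""
    c_tissue = c_water_mg_per_l * baf_l_per_kg      # mg contaminant per kg fish
    intake_kg_per_day = harvest_kg_per_yr / 365.0
    return c_tissue * intake_kg_per_day / bw_kg

dose = fish_ingestion_dose(c_water_mg_per_l=5e-5,   # trace metal in water
                           baf_l_per_kg=5e4,        # bioaccumulation factor
                           harvest_kg_per_yr=9.0)   # harvested and eaten
print(f"estimated dose: {dose:.2e} mg/kg-day")
```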

  7. Estimating the risks of smoking, air pollution, and passive smoke on acute respiratory conditions

    SciTech Connect

    Ostro, B.D.

    1989-06-01

    Five years of the annual Health Interview Survey, conducted by the National Center for Health Statistics, are used to estimate the effects of air pollution, smoking, and environmental tobacco smoke on respiratory restrictions in activity for adults, and bed disability for children. After adjusting for several socioeconomic factors, the multiple regression estimates indicate that an independent and statistically significant association exists between these three forms of air pollution and respiratory morbidity. The comparative risks of these exposures are computed and the plausibility of the relative risks is examined by comparing the equivalent doses with actual measurements of exposure taken in the homes of smokers. The results indicate that: (1) smokers will have a 55-75% excess in days with respiratory conditions severe enough to cause reductions in normal activity; (2) a 1 microgram increase in fine particulate matter air pollution is associated with a 3% excess in acute respiratory disease; and (3) a pack-a-day smoker will increase respiratory restricted days for a nonsmoking spouse by 20% and increase the number of bed disability days for young children living in the household by 20%. The results also indicate that the estimates of the effects of secondhand smoking on children are improved when the mother's work status is known and incorporated into the exposure estimate.

  8. Estimating the risks of smoking, air pollution, and passive smoke on acute respiratory conditions.

    PubMed

    Ostro, B D

    1989-06-01

    Five years of the annual Health Interview Survey, conducted by the National Center for Health Statistics, are used to estimate the effects of air pollution, smoking, and environmental tobacco smoke on respiratory restrictions in activity for adults, and bed disability for children. After adjusting for several socioeconomic factors, the multiple regression estimates indicate that an independent and statistically significant association exists between these three forms of air pollution and respiratory morbidity. The comparative risks of these exposures are computed and the plausibility of the relative risks is examined by comparing the equivalent doses with actual measurements of exposure taken in the homes of smokers. The results indicate that: (1) smokers will have a 55-75% excess in days with respiratory conditions severe enough to cause reductions in normal activity; (2) a 1 microgram increase in fine particulate matter air pollution is associated with a 3% excess in acute respiratory disease; and (3) a pack-a-day smoker will increase respiratory restricted days for a nonsmoking spouse by 20% and increase the number of bed disability days for young children living in the household by 20%. The results also indicate that the estimates of the effects of secondhand smoking on children are improved when the mother's work status is known and incorporated into the exposure estimate.

  9. Problems and solutions in the estimation of genetic risks from radiation and chemicals

    SciTech Connect

    Russell, W. L.

    1980-01-01

    Extensive investigations with mice on the effects of various physical and biological factors, such as dose rate, sex and cell stage, on radiation-induced mutation have provided an evaluation of the genetic hazards of radiation in man. The mutational results obtained in both sexes with progressive lowering of the radiation dose rate have permitted estimation of the mutation frequency expected under the low-level radiation conditions of most human exposure. Supplementing the studies on mutation frequency are investigations on the phenotypic effects of mutations in mice, particularly anatomical disorders of the skeleton, which allow an estimation of the degree of human handicap associated with the occurrence of parallel defects in man. Estimation of the genetic risk from chemical mutagens is much more difficult, and the research is much less advanced. Results on transmitted mutations in mice indicate a poor correlation with mutation induction in non-mammalian organisms.

  10. Estimate of the risks of disposing nonhazardous oil field wastes into salt caverns

    SciTech Connect

    Tomasko, D.; Elcock, D.; Veil, J.

    1997-12-31

    Argonne National Laboratory (ANL) has completed an evaluation of the possibility that adverse human health effects (carcinogenic and noncarcinogenic) could result from exposure to contaminants released from nonhazardous oil field wastes (NOW) disposed in domal salt caverns. Potential human health risks associated with hazardous substances (arsenic, benzene, cadmium, and chromium) in NOW were assessed under four postclosure cavern release scenarios: inadvertent cavern intrusion, failure of the cavern seal, failure of the cavern through cracks or leaky interbeds, and a partial collapse of the cavern roof. To estimate potential human health risks for these scenarios, contaminant concentrations at the receptor were calculated using a one-dimensional solution to an advection/dispersion equation that included first order degradation. Assuming a single, generic salt cavern and generic oil-field wastes, the best-estimate excess cancer risks ranged from 1.7 × 10⁻¹² to 1.1 × 10⁻⁸ and hazard indices (referring to noncancer health effects) ranged from 7 × 10⁻⁹ to 7 × 10⁻⁴. Under worst-case conditions in which the probability of cavern failure is 1.0, excess cancer risks ranged from 4.9 × 10⁻⁹ to 1.7 × 10⁻⁵ and hazard indices ranged from 7.0 × 10⁻⁴ to 0.07. Even under worst-case conditions, the risks are within the US Environmental Protection Agency (EPA) target range for acceptable exposure levels. From a human health risk perspective, salt caverns can, therefore, provide an acceptable disposal method for NOW.
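
    A minimal numerical sketch of the transport model described: one-dimensional advection and dispersion with first-order degradation, dC/dt = D d²C/dx² - v dC/dx - λC, solved here by explicit finite differences. The study uses an analytical solution; all parameters below are placeholders.

```python
import numpy as np

# Explicit finite-difference sketch of 1-D advection-dispersion with
# first-order decay; parameters and source terms are placeholders.

L_m, nx = 500.0, 251
dx = L_m / (nx - 1)
D, v, lam = 1.0, 0.05, 1e-4          # m2/d, m/d, 1/d (illustrative)
dt = 0.25 * dx**2 / D                # conservative explicit time step
c = np.zeros(nx)
c[0] = 1.0                           # constant-concentration source at x = 0

t_end, t = 3650.0, 0.0               # simulate ~10 years
while t < t_end:
    adv = -v * (c[1:-1] - c[:-2]) / dx                   # upwind advection
    disp = D * (c[2:] - 2 * c[1:-1] + c[:-2]) / dx**2    # dispersion
    c[1:-1] += dt * (adv + disp - lam * c[1:-1])         # decay term included
    c[0], c[-1] = 1.0, c[-2]                             # boundary conditions
    t += dt

x_rec = 200.0                        # receptor distance (illustrative)
print(f"relative concentration at {x_rec:.0f} m after 10 y: "
      f"{c[int(x_rec / dx)]:.3e}")
```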

  11. Risk estimation for future glacier lake outburst floods based on local land-use changes

    NASA Astrophysics Data System (ADS)

    Nussbaumer, S.; Schaub, Y.; Huggel, C.; Walz, A.

    2014-06-01

    Effects of climate change are particularly strong in high-mountain regions. Most visibly, glaciers are shrinking at a rapid pace, and as a consequence, glacier lakes are forming or growing. At the same time the stability of mountain slopes is reduced by glacier retreat, permafrost thaw and other factors, resulting in an increasing landslide hazard which can potentially impact lakes and therewith trigger far-reaching and devastating outburst floods. To manage risks from existing or future lakes, strategies need to be developed to plan in time for adequate risk reduction measures at a local level. However, methods to assess risks from future lake outbursts are not available and need to be developed to evaluate both future hazard and future damage potential. Here a method is presented to estimate future risks related to glacier lake outbursts for a local site in southern Switzerland (Naters, Valais). To generate two hazard scenarios, glacier shrinkage and lake formation modelling was applied, combined with simple flood modelling and field work. Furthermore, a land-use model was developed to quantify and allocate land-use changes based on local-to-regional storylines and three scenarios of land-use driving forces. Results are conceptualized in a matrix of three land-use and two hazard scenarios for the year 2045, and show the distribution of risk in the community of Naters, including high and very high risk areas. The study underlines the importance of combined risk management strategies focusing on land-use planning, on vulnerability reduction, as well as on structural measures (where necessary) to effectively reduce future risks related to lake outburst floods.

  12. Local land-use change based risk estimation for future glacier lake outburst flood

    NASA Astrophysics Data System (ADS)

    Nussbaumer, S.; Huggel, C.; Schaub, Y.; Walz, A.

    2013-08-01

    Effects of climate change are particularly strong in high-mountain regions. Most visibly, glaciers are shrinking at a rapid pace, and as a consequence, glacier lakes are forming or growing. At the same time the stability of mountain slopes is reduced by glacier retreat, permafrost thaw and other factors, resulting in an increasing risk of landslides which can potentially impact lakes and therewith trigger far reaching and devastating outburst floods. To manage risks from existing or future lakes, strategies need to be developed to plan in time for adequate risk reduction measures at a local level. However, methods to assess risks from future lake outbursts are not available. It is actually a challenge to develop methods to evaluate both, future hazard potential and future damage potential. Here we present an analysis of future risks related to glacier lake outbursts for a local site in southern Switzerland (Naters, Valais). To estimate two hazard scenarios, we used glacier shrinkage and lake formation modelling, simple flood modelling and field work. Further we developed a land-use model to quantify and allocate land-use changes based on local-to-regional storylines and three scenarios of land-use driving forces. Results are conceptualized in a matrix of three land-use and two hazard scenarios for a time period of 2045, and show the distribution of risk in the community of Naters, including high and very high risk areas. The study corroborates the importance of land-use planning to effectively reduce future risks related to lake outburst floods.

  13. Radiation-Induced Leukemia at Doses Relevant to Radiation Therapy: Modeling Mechanisms and Estimating Risks

    NASA Technical Reports Server (NTRS)

    Shuryak, Igor; Sachs, Rainer K.; Hlatky, Lynn; Little, Mark P.; Hahnfeldt, Philip; Brenner, David J.

    2006-01-01

    Because many cancer patients are diagnosed earlier and live longer than in the past, second cancers induced by radiation therapy have become a clinically significant issue. An earlier biologically based model that was designed to estimate risks of high-dose radiation induced solid cancers included initiation of stem cells to a premalignant state, inactivation of stem cells at high radiation doses, and proliferation of stem cells during cellular repopulation after inactivation. This earlier model predicted the risks of solid tumors induced by radiation therapy but overestimated the corresponding leukemia risks. Methods: To extend the model to radiation-induced leukemias, we analyzed in addition to cellular initiation, inactivation, and proliferation a repopulation mechanism specific to the hematopoietic system: long-range migration through the blood stream of hematopoietic stem cells (HSCs) from distant locations. Parameters for the model were derived from HSC biologic data in the literature and from leukemia risks among atomic bomb survivors who were subjected to much lower radiation doses. Results: Proliferating HSCs that migrate from sites distant from the high-dose region include few preleukemic HSCs, thus decreasing the high-dose leukemia risk. The extended model for leukemia provides risk estimates that are consistent with epidemiologic data for leukemia risk associated with radiation therapy over a wide dose range. For example, when applied to an earlier case-control study of 110,000 women undergoing radiotherapy for uterine cancer, the model predicted an excess relative risk (ERR) of 1.9 for leukemia among women who received a large inhomogeneous fractionated external beam dose to the bone marrow (mean = 14.9 Gy), consistent with the measured ERR (2.0, 95% confidence interval [CI] = 0.2 to 6.4; from 3.6 cases expected and 11 cases observed). As a corresponding example for brachytherapy, the predicted ERR of 0.80 among women who received an inhomogeneous low

  14. Sensitivity Analysis of Median Lifetime on Radiation Risks Estimates for Cancer and Circulatory Disease amongst Never-Smokers

    NASA Technical Reports Server (NTRS)

    Chappell, Lori J.; Cucinotta, Francis A.

    2011-01-01

    Radiation risks are estimated in a competing risk formalism where age or time after exposure estimates of increased risks for cancer and circulatory diseases are folded with a probability to survive to a given age. The survival function, also called the life-table, changes with calendar year, gender, smoking status and other demographic variables. An outstanding problem in risk estimation is the method of risk transfer between exposed populations and a second population where risks are to be estimated. Approaches used to transfer risks are based on: 1) multiplicative risk transfer models, in which excess risks are proportional to background disease rates; and 2) additive risk transfer models, in which excess risks are independent of background rates. In addition, a Mixture model is often considered where the multiplicative and additive transfer assumptions are given weighted contributions. We studied the influence of the survival probability on the risk of exposure-induced cancer and circulatory disease morbidity and mortality in the Multiplicative transfer model and the Mixture model. Risks for never-smokers (NS) compared to the average U.S. population are estimated to be reduced between 30% and 60% depending on model assumptions. Lung cancer is the major contributor to the reduction for NS, with additional contributions from circulatory diseases and cancers of the stomach, liver, bladder, oral cavity, esophagus, colon, a portion of the solid cancer remainder, and leukemia. Greater improvements in risk estimates for NSs are possible, and would be dependent on improved understanding of risk transfer models, and elucidating the role of space radiation on the various stages of disease formation (e.g. initiation, promotion, and progression).
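
    A minimal sketch contrasting the transfer models described: multiplicative transfer scales background disease rates by an excess relative risk, additive transfer adds an excess absolute rate independent of background, and a mixture model weights the two. The background rates, ERR, EAR and mixture weight below are invented for illustration.

```python
# Hedged sketch of multiplicative, additive and mixture risk transfer; the
# background rates, ERR and EAR values are invented placeholders.

background_rate = {"average_US": 60e-5, "never_smoker": 20e-5}  # lung cancer, per year
err_per_sv = 0.8          # excess relative risk per Sv (illustrative)
ear_per_sv = 25e-5        # excess absolute rate per Sv per year (illustrative)
dose_sv = 0.5
mix_weight = 0.7          # weight on the multiplicative component

for pop, b in background_rate.items():
    multiplicative = b * err_per_sv * dose_sv
    additive = ear_per_sv * dose_sv
    mixture = mix_weight * multiplicative + (1 - mix_weight) * additive
    print(f"{pop:>13}: mult={multiplicative:.1e}, add={additive:.1e}, "
          f"mix={mixture:.1e} excess cases per person-year")
```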

  15. Biokinetic and dosimetric modelling in the estimation of radiation risks from internal emitters.

    PubMed

    Harrison, John

    2009-06-01

    The International Commission on Radiological Protection (ICRP) has developed biokinetic and dosimetric models that enable the calculation of organ and tissue doses for a wide range of radionuclides. These are used to calculate equivalent and effective dose coefficients (dose in Sv Bq(-1) intake), considering occupational and environmental exposures. Dose coefficients have also been given for a range of radiopharmaceuticals used in diagnostic medicine. Using equivalent and effective dose, exposures from external sources and from different radionuclides can be summed for comparison with dose limits, constraints and reference levels that relate to risks from whole-body radiation exposure. Risk estimates are derived largely from follow-up studies of the survivors of the atomic bombings at Hiroshima and Nagasaki in 1945. New dose coefficients will be required following the publication in 2007 of new ICRP recommendations. ICRP biokinetic and dosimetric models are subject to continuing review and improvement, although it is arguable that the degree of sophistication of some of the most recent models is greater than required for the calculation of effective dose to a reference person for the purposes of regulatory control. However, the models are also used in the calculation of best estimates of doses and risks to individuals, in epidemiological studies and to determine probability of cancer causation. Models are then adjusted to best fit the characteristics of the individuals and population under consideration. For example, doses resulting from massive discharges of strontium-90 and other radionuclides to the Techa River from the Russian Mayak plutonium plant in the early years of its operation are being estimated using models adapted to take account of measurements on local residents and other population-specific data. Best estimates of doses to haemopoietic bone marrow, in utero and postnatally, are being used in epidemiological studies of radiation-induced leukaemia

  16. Estimates of radiological risk from depleted uranium weapons in war scenarios.

    PubMed

    Durante, Marco; Pugliese, Mariagabriella

    2002-01-01

    Several weapons used during the recent conflict in Yugoslavia contain depleted uranium, including missiles and armor-piercing incendiary rounds. Health concerns are related to the use of these weapons, because of the heavy-metal toxicity and radioactivity of uranium. Although chemical toxicity is considered the more important source of health risk related to uranium, radiation exposure has been allegedly related to cancers among veterans of the Balkan conflict, and uranium munitions are a possible source of contamination in the environment. Actual measurements of radioactive contamination are needed to assess the risk. In this paper, a computer simulation is proposed to estimate radiological risk related to different exposure scenarios. Doses caused by inhalation of radioactive aerosols and by ground contamination induced by Tomahawk missile impact are simulated using a Gaussian plume model (HOTSPOT code). Environmental contamination and committed dose to the population resident in contaminated areas are predicted by a food-web model (RESRAD code). Small values of committed effective dose equivalent appear to be associated with missile impacts (50-y CEDE < 5 mSv) or with population exposure by water-independent pathways (50-y CEDE < 80 mSv). The greatest hazard is related to water contamination under conditions of effective leaching of uranium into the groundwater (50-y CEDE < 400 mSv). Even in this worst-case scenario, the chemical toxicity largely predominates over radiological risk. These computer simulations suggest that little radiological risk is associated with the use of depleted uranium weapons.

  17. Mathematical model to estimate risk of calcium-containing renal stones

    NASA Technical Reports Server (NTRS)

    Pietrzyk, R. A.; Feiveson, A. H.; Whitson, P. A.

    1999-01-01

    BACKGROUND/AIMS: Astronauts exposed to microgravity during the course of spaceflight undergo physiologic changes that alter the urinary environment so as to increase the risk of renal stone formation. This study was undertaken to identify a simple method with which to evaluate the potential risk of renal stone development during spaceflight. METHOD: We used a large database of urinary risk factors obtained from 323 astronauts before and after spaceflight to generate a mathematical model with which to predict the urinary supersaturation of calcium stone forming salts. RESULT: This model, which involves the fewest possible analytical variables (urinary calcium, citrate, oxalate, phosphorus, and total volume), reliably and accurately predicted the urinary supersaturation of the calcium stone forming salts when compared to results obtained from a group of 6 astronauts who collected urine during flight. CONCLUSIONS: The use of this model will simplify both routine medical monitoring during spaceflight as well as the evaluation of countermeasures designed to minimize renal stone development. This model also can be used for Earth-based applications in which access to analytical resources is limited.

  18. Estimated Insulin Sensitivity and Cardiovascular Disease Risk Factors in Adolescents with and without Type 1 Diabetes

    PubMed Central

    Specht, Brian J; Wadwa, R Paul; Snell-Bergeon, Janet K; Nadeau, Kristen J; Bishop, Franziska K; Maahs, David M.

    2012-01-01

    Objective To test the hypotheses that cardiovascular disease (CVD) risk factors are similar in non-diabetic adolescents and in adolescents with type 1 diabetes (T1D) in the most insulin-sensitive (IS) tertile, and that CVD risk factors are more atherogenic with decreasing IS in adolescents with T1D. Study design Insulin sensitivity in adolescents with T1D (n=292; age=15.4±2.1 years; duration=8.8±3.0 years; HbA1c=8.9±1.6%) and non-diabetic (non-DM) controls (n=89; age=15.4±2.1 years) was estimated using the model: loge(IS)=4.64725 – 0.02032(waist, cm) – 0.09779(HbA1c, %) – 0.00235(triglycerides, mg/dl). CVD risk factors (blood pressure; fasting total, LDL, and HDL-cholesterol; hs-CRP; and BMI Z-score) were compared between all non-DM adolescents and those with T1D in the most IS tertile, and then examined for a linear trend by IS tertile in adolescents with T1D, adjusted for sex, race/ethnicity, and Tanner stage. Results Estimated IS was significantly lower in adolescents with T1D than in those without (T1D=7.8±2.4, non-DM=11.5±2.9; p<0.0001). CVD risk factors were similar in non-DM adolescents and in the most IS tertile of adolescents with T1D, except for higher HDL-c and DBP in those with T1D (p<0.05). Among adolescents with T1D, all CVD risk factors except HDL-c were more atherogenic across decreasing IS tertiles in linear regression analysis (p<0.05). Conclusion Adolescents with T1D who are the most IS have CVD risk factors similar to those of non-DM adolescents. CVD risk factors become more atherogenic as IS decreases in adolescents with T1D. IS may be an important therapeutic target for reducing CVD risk factors in adolescents with T1D. PMID:22921593
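
    The estimating equation is given explicitly in the abstract, so it can be evaluated directly; a minimal sketch follows, with hypothetical input values.

        import math

        def estimated_insulin_sensitivity(waist_cm, hba1c_pct, triglycerides_mg_dl):
            """loge(IS) = 4.64725 - 0.02032*waist - 0.09779*HbA1c - 0.00235*triglycerides."""
            log_is = 4.64725 - 0.02032 * waist_cm - 0.09779 * hba1c_pct - 0.00235 * triglycerides_mg_dl
            return math.exp(log_is)

        # Hypothetical adolescent with T1D: waist 75 cm, HbA1c 8.9%, triglycerides 90 mg/dl
        print(round(estimated_insulin_sensitivity(75, 8.9, 90), 1))   # ~7.7, near the reported T1D mean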

  19. Risk of water contamination by nitrogen in Canada as estimated by the IROWC-N model.

    PubMed

    De Jong, R; Drury, C F; Yang, J Y; Campbell, C A

    2009-07-01

    With increasing amounts of nitrogen (N) being added to farmland in the form of fertilizer and manure to optimize crop yields, and more broadly, to meet the growing demands for food, feed and energy, there are public concerns regarding its possible negative impact on the environment. An optimal balance between N requirements for production versus efficient N use is required, so as to minimize N losses from the agricultural system. An agri-environmental indicator i.e., the Indicator of the Risk of Water Contamination by Nitrogen (IROWC-N) was developed to assess the risk of N moving from agricultural areas into groundwater and/or nearby surface water bodies. The indicator linked the quantity of mineral nitrogen remaining in the soil at harvest, i.e., the Residual Soil Nitrogen (RSN) indicator, and the subsequent climatic conditions during the winter period. The results were assessed in terms of nitrate lost through leaching and nitrate concentration in the drainage water, expressed in five IROWC-N risk classes. Unlike previous versions of the indicator, the current model provided a more complete description of the soil-water balance, including the calculation of rainfall interception by crops, surface runoff, actual evapotranspiration and soil-water contents. Consequently, the current IROWC-N estimates differed markedly from those obtained previously. Between 1981 and 2006, the risk of water contamination by N in Canada was small, and reflected what was happening in the three Prairie provinces where 85% of Canada's farmland is located. However, the aggregated IROWC-N index, which is a combination of all five risk classes, increased steadily by 2.3% per year, from 6.7 in 1981 to 10.6 in 2006. The proportion of farmland in the very low IROWC-N risk class decreased from 88 to 78%; correspondingly, the proportion in the low risk class increased from 2 to 12%. The proportion of farmland in the moderate-, high- and very high-risk classes changed by less than 3% over time

  1. Examining the effects of air pollution composition on within region differences in PM2.5 mortality risk estimates.

    PubMed

    Baxter, Lisa K; Duvall, Rachelle M; Sacks, Jason

    2013-01-01

    Multi-city population-based epidemiological studies have observed significant heterogeneity in both the magnitude and direction of city-specific risk estimates, but tended to focus on regional differences in PM2.5 mortality risk estimates. Interpreting differences in risk estimates is complicated by city-to-city heterogeneity observed within regions due to city-to-city variations in the PM2.5 composition and the concentration of gaseous pollutants. We evaluate whether variations in PM2.5 composition and gaseous pollutant concentrations have a role in explaining the heterogeneity in PM2.5 mortality risk estimates observed in 27 US cities from 1997 to 2002. Within each region, we select the two cities with the largest and smallest mortality risk estimate. We compare for each region the within- and between-city concentrations and correlations of PM2.5 constituents and gaseous pollutants. We also attempt to identify source factors through principal component analysis (PCA) for each city. The results of this analysis indicate that identifying a PM constituent(s) that explains the differences in the PM2.5 mortality risk estimates is not straightforward. The difference in risk estimates between cities in the same region may be attributed to a group of pollutants, possibly those related to local sources such as traffic.

  2. Longer genotypically-estimated leukocyte telomere length is associated with increased adult glioma risk.

    PubMed

    Walsh, Kyle M; Codd, Veryan; Rice, Terri; Nelson, Christopher P; Smirnov, Ivan V; McCoy, Lucie S; Hansen, Helen M; Elhauge, Edward; Ojha, Juhi; Francis, Stephen S; Madsen, Nils R; Bracci, Paige M; Pico, Alexander R; Molinaro, Annette M; Tihan, Tarik; Berger, Mitchel S; Chang, Susan M; Prados, Michael D; Jenkins, Robert B; Wiemels, Joseph L; Samani, Nilesh J; Wiencke, John K; Wrensch, Margaret R

    2015-12-15

    Telomere maintenance has emerged as an important molecular feature with impacts on adult glioma susceptibility and prognosis. Whether longer or shorter leukocyte telomere length (LTL) is associated with glioma risk remains elusive and is often confounded by the effects of age and patient treatment. We sought to determine if genotypically-estimated LTL is associated with glioma risk and if inherited single nucleotide polymorphisms (SNPs) that are associated with LTL are glioma risk factors. Using a Mendelian randomization approach, we assessed differences in genotypically-estimated relative LTL in two independent glioma case-control datasets from the UCSF Adult Glioma Study (652 patients and 3735 controls) and The Cancer Genome Atlas (478 non-overlapping patients and 2559 controls). LTL estimates were based on a weighted linear combination of subject genotype at eight SNPs, previously associated with LTL in the ENGAGE Consortium Telomere Project. Mean estimated LTL was 31bp (5.7%) longer in glioma patients than controls in discovery analyses (P = 7.82x10-8) and 27bp (5.0%) longer in glioma patients than controls in replication analyses (1.48x10-3). Glioma risk increased monotonically with each increasing septile of LTL (O.R.=1.12; P = 3.83x10-12). Four LTL-associated SNPs were significantly associated with glioma risk in pooled analyses, including those in the telomerase component genes TERC (O.R.=1.14; 95% C.I.=1.03-1.28) and TERT (O.R.=1.39; 95% C.I.=1.27-1.52), and those in the CST complex genes OBFC1 (O.R.=1.18; 95% C.I.=1.05-1.33) and CTC1 (O.R.=1.14; 95% C.I.=1.02-1.28). Future work is needed to characterize the role of the CST complex in gliomagenesis and further elucidate the complex balance between ageing, telomere length, and molecular carcinogenesis.

  3. Longer genotypically-estimated leukocyte telomere length is associated with increased adult glioma risk

    PubMed Central

    Walsh, Kyle M.; Codd, Veryan; Rice, Terri; Nelson, Christopher P.; Smirnov, Ivan V.; McCoy, Lucie S.; Hansen, Helen M.; Elhauge, Edward; Ojha, Juhi; Francis, Stephen S.; Madsen, Nils R.; Bracci, Paige M.; Pico, Alexander R.; Molinaro, Annette M.; Tihan, Tarik; Berger, Mitchel S.; Chang, Susan M.; Prados, Michael D.; Jenkins, Robert B.; Wiemels, Joseph L.; Samani, Nilesh J.; Wiencke, John K.; Wrensch, Margaret R.

    2015-01-01

    Telomere maintenance has emerged as an important molecular feature with impacts on adult glioma susceptibility and prognosis. Whether longer or shorter leukocyte telomere length (LTL) is associated with glioma risk remains elusive and is often confounded by the effects of age and patient treatment. We sought to determine if genotypically-estimated LTL is associated with glioma risk and if inherited single nucleotide polymorphisms (SNPs) that are associated with LTL are glioma risk factors. Using a Mendelian randomization approach, we assessed differences in genotypically-estimated relative LTL in two independent glioma case-control datasets from the UCSF Adult Glioma Study (652 patients and 3735 controls) and The Cancer Genome Atlas (478 non-overlapping patients and 2559 controls). LTL estimates were based on a weighted linear combination of subject genotype at eight SNPs, previously associated with LTL in the ENGAGE Consortium Telomere Project. Mean estimated LTL was 31bp (5.7%) longer in glioma patients than controls in discovery analyses (P = 7.82×10-8) and 27bp (5.0%) longer in glioma patients than controls in replication analyses (1.48×10-3). Glioma risk increased monotonically with each increasing septile of LTL (O.R.=1.12; P = 3.83×10-12). Four LTL-associated SNPs were significantly associated with glioma risk in pooled analyses, including those in the telomerase component genes TERC (O.R.=1.14; 95% C.I.=1.03-1.28) and TERT (O.R.=1.39; 95% C.I.=1.27-1.52), and those in the CST complex genes OBFC1 (O.R.=1.18; 95% C.I.=1.05-1.33) and CTC1 (O.R.=1.14; 95% C.I.=1.02-1.28). Future work is needed to characterize the role of the CST complex in gliomagenesis and further elucidate the complex balance between ageing, telomere length, and molecular carcinogenesis. PMID:26646793

  4. Patient-specific radiation dose and cancer risk estimation in CT: Part I. Development and validation of a Monte Carlo program

    SciTech Connect

    Li Xiang; Samei, Ehsan; Segars, W. Paul; Sturgeon, Gregory M.; Colsher, James G.; Toncheva, Greta; Yoshizumi, Terry T.; Frush, Donald P.

    2011-01-15

    Purpose: Radiation-dose awareness and optimization in CT can greatly benefit from a dose-reporting system that provides dose and risk estimates specific to each patient and each CT examination. As the first step toward patient-specific dose and risk estimation, this article aimed to develop a method for accurately assessing radiation dose from CT examinations. Methods: A Monte Carlo program was developed to model a CT system (LightSpeed VCT, GE Healthcare). The geometry of the system, the energy spectra of the x-ray source, the three-dimensional geometry of the bowtie filters, and the trajectories of source motions during axial and helical scans were explicitly modeled. To validate the accuracy of the program, a cylindrical phantom was built to enable dose measurements at seven different radial distances from its central axis. Simulated radial dose distributions in the cylindrical phantom were validated against ion chamber measurements for single axial scans at all combinations of tube potential and bowtie filter settings. The accuracy of the program was further validated using two anthropomorphic phantoms (a pediatric one-year-old phantom and an adult female phantom). Computer models of the two phantoms were created based on their CT data and were voxelized for input into the Monte Carlo program. Simulated dose at various organ locations was compared against measurements made with thermoluminescent dosimetry chips for both single axial and helical scans. Results: For the cylindrical phantom, simulations differed from measurements by -4.8% to 2.2%. For the two anthropomorphic phantoms, the discrepancies between simulations and measurements ranged between (-8.1%, 8.1%) and (-17.2%, 13.0%) for the single axial scans and the helical scans, respectively. Conclusions: The authors developed an accurate Monte Carlo program for assessing radiation dose from CT examinations. When combined with computer models of actual patients, the program can provide accurate dose estimates for individual patients.

  5. Two-compartment, two-sample technique for accurate estimation of effective renal plasma flow: Theoretical development and comparison with other methods

    SciTech Connect

    Lear, J.L.; Feyerabend, A.; Gregory, C.

    1989-08-01

    Discordance between effective renal plasma flow (ERPF) measurements from radionuclide techniques that use single versus multiple plasma samples was investigated. In particular, the authors determined whether effects of variations in distribution volume (Vd) of iodine-131 iodohippurate on measurement of ERPF could be ignored, an assumption implicit in the single-sample technique. The influence of Vd on ERPF was found to be significant, a factor indicating an important and previously unappreciated source of error in the single-sample technique. Therefore, a new two-compartment, two-plasma-sample technique was developed on the basis of the observations that while variations in Vd occur from patient to patient, the relationship between intravascular and extravascular components of Vd and the rate of iodohippurate exchange between the components are stable throughout a wide range of physiologic and pathologic conditions. The new technique was applied in a series of 30 studies in 19 patients. Results were compared with those achieved with the reference, single-sample, and slope-intercept techniques. The new two-compartment, two-sample technique yielded estimates of ERPF that more closely agreed with the reference multiple-sample method than either the single-sample or slope-intercept techniques.

  6. Relative risk estimation of Chikungunya disease in Malaysia: An analysis based on Poisson-gamma model

    NASA Astrophysics Data System (ADS)

    Samat, N. A.; Ma'arof, S. H. Mohd Imam

    2015-05-01

    Disease mapping is a method for displaying the geographical distribution of disease occurrence, which generally involves the use and interpretation of a map to show the incidence of certain diseases. Relative risk (RR) estimation is one of the most important issues in disease mapping. This paper begins with a brief overview of Chikungunya disease. This is followed by a review of the classical model used in disease mapping, based on the standardized morbidity ratio (SMR), which we then apply to our Chikungunya data. We then fit an extension of the classical model, which we refer to as a Poisson-Gamma model, in which prior distributions for the relative risks are assumed known. Both sets of results are displayed and compared using maps; the Poisson-Gamma model yields a smoother map with fewer extreme values of estimated relative risk. Extensions of this work will consider other methods that overcome the drawbacks of the existing ones, in order to inform and direct government strategy for monitoring and controlling Chikungunya disease.
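
    A minimal sketch (not the authors' code) contrasting the classical SMR estimate with the Poisson-Gamma posterior mean; the observed and expected counts and the prior parameters are hypothetical.

        import numpy as np

        observed = np.array([0, 3, 12, 45], dtype=float)    # hypothetical Chikungunya case counts
        expected = np.array([1.2, 2.5, 10.0, 30.0])         # expected counts from reference rates

        smr = observed / expected                            # classical SMR relative-risk estimate

        # Poisson-Gamma model: O_i ~ Poisson(E_i * RR_i), RR_i ~ Gamma(alpha, beta);
        # the posterior mean shrinks unstable SMRs (small expected counts) toward alpha/beta.
        alpha, beta = 2.0, 2.0                               # hypothetical prior with mean RR = 1
        rr_post = (alpha + observed) / (beta + expected)

        print("SMR:          ", np.round(smr, 2))
        print("Poisson-Gamma:", np.round(rr_post, 2))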

  7. Developing a utility decision framework to evaluate predictive models in breast cancer risk estimation

    PubMed Central

    Wu, Yirong; Abbey, Craig K.; Chen, Xianqiao; Liu, Jie; Page, David C.; Alagoz, Oguzhan; Peissig, Peggy; Onitilo, Adedayo A.; Burnside, Elizabeth S.

    2015-01-01

    Combining imaging and genetic information to predict disease presence and progression is being codified into an emerging discipline called “radiogenomics.” Optimal evaluation methodologies for radiogenomics have not been well established. We aim to develop a decision framework based on utility analysis to assess predictive models for breast cancer diagnosis. We garnered Gail risk factors, single nucleotide polymorphisms (SNPs), and mammographic features from a retrospective case-control study. We constructed three logistic regression models built on different sets of predictive features: (1) Gail, (2) Gail + Mammo, and (3) Gail + Mammo + SNP. We then generated receiver operating characteristic (ROC) curves for the three models. After assigning utility values to each category of outcome (true negatives, false positives, false negatives, and true positives), we identified the operating points on the ROC curves that maximize the expected utility of breast cancer diagnosis. We performed McNemar’s test based on the threshold levels at the optimal operating points, and found that SNPs and mammographic features played a significant role in breast cancer risk estimation. Our study, combining utility analysis and McNemar’s test, provides a decision framework for evaluating predictive models in breast cancer risk estimation. PMID:26835489
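
    A minimal sketch of the utility step described above, on simulated data: each decision threshold on the ROC curve is scored by its expected utility and the maximizing threshold is reported. The prevalence, scores, and utility values are hypothetical, not those of the study.

        import numpy as np

        rng = np.random.default_rng(0)
        y = rng.binomial(1, 0.3, size=2000)                        # 1 = cancer (hypothetical prevalence)
        score = np.clip(rng.normal(0.4 + 0.3 * y, 0.15), 0, 1)     # model-predicted risk score

        U_TP, U_FP, U_TN, U_FN = 80.0, -5.0, 0.0, -100.0           # hypothetical outcome utilities

        def expected_utility(threshold):
            call_positive = score >= threshold
            per_case = np.where(call_positive,
                                np.where(y == 1, U_TP, U_FP),
                                np.where(y == 1, U_FN, U_TN))
            return per_case.mean()

        best_t = max(np.linspace(0, 1, 101), key=expected_utility)
        print(f"optimal threshold {best_t:.2f}, expected utility {expected_utility(best_t):.2f}")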

  8. Estimating the Risk of ABO Hemolytic Disease of the Newborn in Lagos

    PubMed Central

    Akanmu, Alani Sulaimon; Oyedeji, Olufemi Abiola; Adeyemo, Titilope Adenike; Ogbenna, Ann Abiola

    2015-01-01

    Background. ABO hemolytic disease of the newborn (HDN) is the most common hemolytic consequence of maternofetal blood group incompatibility, mostly restricted to non-group-O babies of group O mothers with immune anti-A or anti-B antibodies. Aim. We estimated the risk of ABO HDN with a view to determining the need for routine screening for ABO incompatibility between mother and fetus. Materials and Methods. The prevalence of ABO blood group phenotypes in blood donors at the donor clinic of the Lagos University Teaching Hospital was determined, and arithmetic methods were used to estimate population ABO gene frequencies. We then estimated the proportion of pregnancies in which a group O mother carries a non-group-O baby and the risk that maternofetal ABO incompatibility will cause clinical ABO HDN. Results. Blood from 9138 donors was ABO typed; 54.3%, 23%, 19.4%, and 3.3% were blood groups O, A, B, and AB, respectively. Calculated gene frequencies were 0.1416, 0.1209, and 0.7375 for the A, B, and O genes, respectively. It was estimated that 14.3% of deliveries will result in a blood group O woman giving birth to a child who is non-group-O. Approximately 4.3% of deliveries are likely to be affected by ABO HDN, with 2.7% prone to moderately severe to severe hemolysis. PMID:26491605
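
    The 14.3% figure can be reproduced from the reported frequencies with a short Hardy-Weinberg calculation; a worked check using only the numbers quoted above follows.

        # A group-O mother always transmits an O allele, so her child is non-group-O
        # only if the paternal allele is A or B.
        p_A, p_B, p_O = 0.1416, 0.1209, 0.7375    # gene frequencies reported above
        freq_group_O = 0.543                       # phenotype frequency of group O donors

        p_non_O_child_given_O_mother = p_A + p_B
        share = freq_group_O * p_non_O_child_given_O_mother
        print(f"{share:.1%} of deliveries: group-O mother carrying a non-group-O child")   # 14.3%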

  9. Estimating the Size of Populations at High Risk for HIV Using Respondent-Driven Sampling Data

    PubMed Central

    Handcock, Mark S.; Gile, Krista J.; Mar, Corinne M.

    2015-01-01

    Summary The study of hard-to-reach populations presents significant challenges. Typically, a sampling frame is not available, and population members are difficult to identify or recruit from broader sampling frames. This is especially true of populations at high risk for HIV/AIDS. Respondent-driven sampling (RDS) is often used in such settings with the primary goal of estimating the prevalence of infection. In such populations, the number of people at risk for infection and the number of people infected are of fundamental importance. This article presents a case-study of the estimation of the size of the hard-to-reach population based on data collected through RDS. We study two populations of female sex workers and men-who-have-sex-with-men in El Salvador. The approach is Bayesian and we consider different forms of prior information, including using the UNAIDS population size guidelines for this region. We show that the method is able to quantify the amount of information on population size available in RDS samples. As separate validation, we compare our results to those estimated by extrapolating from a capture–recapture study of El Salvadorian cities. The results of our case-study are largely comparable to those of the capture–recapture study when they differ from the UNAIDS guidelines. Our method is widely applicable to data from RDS studies and we provide a software package to facilitate this. PMID:25585794

  10. Comparison of two methods for estimating absolute risk of prostate cancer based on SNPs and family history

    PubMed Central

    Hsu, Fang-Chi; Sun, Jielin; Zhu, Yi; Kim, Seong-Tae; Jin, Tao; Zhang, Zheng; Wiklund, Fredrik; Kader, A. Karim; Zheng, S. Lilly; Isaacs, William; Grönberg, Henrik; Xu, Jianfeng

    2010-01-01

    Disease risk-associated single nucleotide polymorphisms (SNPs) identified from genome-wide association studies have the potential to be used for disease risk prediction. An important feature of these risk-associated SNPs is their weak individual effect but stronger cumulative effect on disease risk. Several approaches are commonly used to model the combined effect in risk prediction but their performance is unclear. We compared two methods to model the combined effect of 14 prostate cancer (PCa) risk-associated SNPs and family history for the estimation of absolute risk for PCa in a population-based case-control study in Sweden (2,899 cases and 1,722 controls). Method 1 weighs each risk allele equally, simply counting the number of risk alleles, while Method 2 weighs each risk SNP by its respective odds ratio. We found considerable differences between the two methods. Absolute risk estimates from Method 1 were generally higher than those from Method 2, especially among men at higher risk. The difference in the overall discriminative performance, measured by the area under the curve (AUC) of the receiver operating characteristic, was small between Method 1 (0.614) and Method 2 (0.618), P = 0.20. However, the performance of these two methods in identifying high-risk individuals (two-fold or three-fold higher than average risk), measured by positive predictive values (PPV), was higher for Method 2 than for Method 1. In conclusion, these results suggest that Method 2 is superior to Method 1 in estimating absolute risk if the purpose of risk prediction is to identify high-risk individuals. PMID:20332264
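
    A minimal sketch (with hypothetical odds ratios and genotypes) of the two scoring schemes being compared: an unweighted risk-allele count versus a log-odds-ratio-weighted score.

        import numpy as np

        odds_ratios = np.array([1.10, 1.25, 1.15, 1.30])   # hypothetical per-allele ORs for four SNPs
        risk_alleles = np.array([2, 1, 0, 1])               # risk-allele counts for one individual

        score_method1 = int(risk_alleles.sum())                       # Method 1: equal weights
        score_method2 = float(risk_alleles @ np.log(odds_ratios))     # Method 2: OR-based weights

        print(score_method1, round(score_method2, 3))
        # Either score is converted to absolute risk by calibrating score strata
        # (e.g. percentiles) against population incidence and family history.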

  11. Patient-specific radiation dose and cancer risk estimation in CT: Part II. Application to patients

    SciTech Connect

    Li Xiang; Samei, Ehsan; Segars, W. Paul; Sturgeon, Gregory M.; Colsher, James G.; Toncheva, Greta; Yoshizumi, Terry T.; Frush, Donald P.

    2011-01-15

    Purpose: Current methods for estimating and reporting radiation dose from CT examinations are largely patient-generic; the body size and hence dose variation from patient to patient is not reflected. Furthermore, the current protocol designs rely on dose as a surrogate for the risk of cancer incidence, neglecting the strong dependence of risk on age and gender. The purpose of this study was to develop a method for estimating patient-specific radiation dose and cancer risk from CT examinations. Methods: The study included two patients (a 5-week-old female patient and a 12-year-old male patient), who underwent 64-slice CT examinations (LightSpeed VCT, GE Healthcare) of the chest, abdomen, and pelvis at our institution in 2006. For each patient, a nonuniform rational B-spline (NURBS) based full-body computer model was created based on the patient's clinical CT data. Large organs and structures inside the image volume were individually segmented and modeled. Other organs were created by transforming an existing adult male or female full-body computer model (developed from visible human data) to match the framework defined by the segmented organs, referencing the organ volume and anthropometry data in ICRP Publication 89. A Monte Carlo program previously developed and validated for dose simulation on the LightSpeed VCT scanner was used to estimate patient-specific organ dose, from which effective dose and risks of cancer incidence were derived. Patient-specific organ dose and effective dose were compared with patient-generic CT dose quantities in current clinical use: the volume-weighted CT dose index (CTDIvol) and the effective dose derived from the dose-length product (DLP). Results: The effective dose for the CT examination of the newborn patient (5.7 mSv) was higher than but comparable to that for the CT examination of the teenager patient (4.9 mSv) due to the size-based clinical CT protocols at our institution, which employ lower scan techniques for smaller patients.

  12. Estimating the Risk of Renal Stone Events During Long-Duration Spaceflight

    NASA Technical Reports Server (NTRS)

    Reyes, David; Kerstman, Eric; Locke, James

    2014-01-01

    Introduction: Given the bone loss and increased urinary calcium excretion in the microgravity environment, persons participating in long-duration spaceflight may have an increased risk for renal stone formation. Renal stones are often an incidental finding of abdominal imaging studies done for other reasons. Thus, some crewmembers may have undiscovered, asymptomatic stones prior to their mission. Methods: An extensive literature search was conducted concerning the natural history of asymptomatic renal stones. For comparison, simulations were done using the Integrated Medical Model (IMM). The IMM is an evidence-based decision support tool that provides risk analysis and has the capability to optimize medical systems for missions by minimizing the occurrence of adverse mission outcomes such as evacuation and loss of crew life within specified mass and volume constraints. Results: The literature of the natural history of asymptomatic renal stones in the general medical population shows that the probability of symptomatic event is 8% to 34% at 1 to 3 years for stones < 7 mm. Extrapolated to a 6-month mission, for stones < 5 to 7 mm, the risk for any stone event is about 4 to 6%, with a 0.7% to 4% risk for intervention, respectively. IMM simulations compare favorably with risk estimates garnered from the terrestrial literature. The IMM forecasts that symptomatic renal stones may be one of the top drivers for medical evacuation of an International Space Station (ISS) mission. Discussion: Although the likelihood of a stone event is low, the consequences could be severe due to limitations of current ISS medical capabilities. Therefore, these risks need to be quantified to aid planning, limit crew morbidity and mitigate mission impacts. This will be especially critical for missions beyond earth orbit, where evacuation may not be an option.

  13. Measurement error affects risk estimates for recruitment to the Hudson River stock of striped bass.

    PubMed

    Dunning, Dennis J; Ross, Quentin E; Munch, Stephan B; Ginzburg, Lev R

    2002-06-01

    We examined the consequences of ignoring the distinction between measurement error and natural variability in an assessment of risk to the Hudson River stock of striped bass posed by entrainment at the Bowline Point, Indian Point, and Roseton power plants. Risk was defined as the probability that recruitment of age-1+ striped bass would decline by 80% or more, relative to the equilibrium value, at least once during the time periods examined (1, 5, 10, and 15 years). Measurement error, estimated using two abundance indices from independent beach seine surveys conducted on the Hudson River, accounted for 50% of the variability in one index and 56% of the variability in the other. If a measurement error of 50% was ignored and all of the variability in abundance was attributed to natural causes, the risk that recruitment of age-1+ striped bass would decline by 80% or more after 15 years was 0.308 at the current level of entrainment mortality (11%). However, the risk decreased almost tenfold (0.032) if a measurement error of 50% was considered. The change in risk attributable to decreasing the entrainment mortality rate from 11 to 0% was very small (0.009) and similar in magnitude to the change in risk associated with an action proposed in Amendment #5 to the Interstate Fishery Management Plan for Atlantic striped bass (0.006)--an increase in the instantaneous fishing mortality rate from 0.33 to 0.4. The proposed increase in fishing mortality was not considered an adverse environmental impact, which suggests that potentially costly efforts to reduce entrainment mortality on the Hudson River stock of striped bass are not warranted.
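
    A minimal sketch of the variance-partitioning idea behind this result: if half of the observed variability in the abundance index is measurement error, the process variance driving real declines is correspondingly smaller, and the simulated risk of an 80% decline drops sharply. The toy lognormal recruitment model below is illustrative, not the authors' population model.

        import numpy as np

        rng = np.random.default_rng(1)
        sigma_observed = 0.6          # SD of log abundance index (hypothetical)
        measurement_share = 0.5       # fraction of variance attributed to measurement error

        def decline_risk(sigma_process, years=15, n_sim=20000):
            # P(recruitment falls below 20% of equilibrium at least once in `years` years)
            log_rec = rng.normal(0.0, sigma_process, size=(n_sim, years))
            return float((np.exp(log_rec) < 0.2).any(axis=1).mean())

        print("all variability treated as natural:", decline_risk(sigma_observed))
        print("measurement error removed:        ",
              decline_risk(sigma_observed * np.sqrt(1 - measurement_share)))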

  14. Estimating drought risk across Europe from reported drought impacts, drought indices, and vulnerability factors

    NASA Astrophysics Data System (ADS)

    Blauhut, Veit; Stahl, Kerstin; Stagge, James Howard; Tallaksen, Lena M.; De Stefano, Lucia; Vogt, Jürgen

    2016-07-01

    Drought is one of the most costly natural hazards in Europe. Due to its complexity, drought risk, meant as the combination of the natural hazard and societal vulnerability, is difficult to define and challenging to detect and predict, as the impacts of drought are very diverse, covering the breadth of socioeconomic and environmental systems. Pan-European maps of drought risk could inform the elaboration of guidelines and policies to address its documented severity and impact across borders. This work tests the capability of commonly applied drought indices and vulnerability factors to predict annual drought impact occurrence for different sectors and macro regions in Europe and combines information on past drought impacts, drought indices, and vulnerability factors into estimates of drought risk at the pan-European scale. This hybrid approach bridges the gap between traditional vulnerability assessment and probabilistic impact prediction in a statistical modelling framework. Multivariable logistic regression was applied to predict the likelihood of impact occurrence on an annual basis for particular impact categories and European macro regions. The results indicate sector- and macro-region-specific sensitivities of drought indices, with the Standardized Precipitation Evapotranspiration Index (SPEI) for a 12-month accumulation period as the overall best hazard predictor. Vulnerability factors have only limited ability to predict drought impacts as single predictors, with information about land use and water resources being the best vulnerability-based predictors. The application of the hybrid approach revealed strong regional and sector-specific differences in drought risk across Europe. The majority of the best predictor combinations rely on a combination of SPEI for shorter and longer accumulation periods, and a combination of information on land use and water resources. The added value of integrating regional vulnerability information with drought risk prediction
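
    A minimal sketch of the statistical core of this hybrid approach: a multivariable logistic regression that predicts annual impact occurrence from a drought-hazard index and a vulnerability factor. The data, variable names, and coefficients are simulated for illustration only.

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(2)
        n = 300
        spei12 = rng.normal(0.0, 1.0, n)                 # 12-month SPEI (negative = drier)
        irrigated_share = rng.uniform(0.0, 0.6, n)       # hypothetical vulnerability factor
        logit = -0.5 - 1.8 * spei12 - 2.0 * irrigated_share
        impact_reported = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

        X = np.column_stack([spei12, irrigated_share])
        model = LogisticRegression().fit(X, impact_reported)
        prob = model.predict_proba([[-2.0, 0.1]])[0, 1]   # dry year, little irrigation
        print(f"predicted P(impact) = {prob:.2f}")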

  15. Estimating drought risk across Europe from reported drought impacts, hazard indicators and vulnerability factors

    NASA Astrophysics Data System (ADS)

    Blauhut, V.; Stahl, K.; Stagge, J. H.; Tallaksen, L. M.; De Stefano, L.; Vogt, J.

    2015-12-01

    Drought is one of the most costly natural hazards in Europe. Due to its complexity, drought risk, the combination of the natural hazard and societal vulnerability, is difficult to define and challenging to detect and predict, as the impacts of drought are very diverse, covering the breadth of socioeconomic and environmental systems. Pan-European maps of drought risk could inform the elaboration of guidelines and policies to address its documented severity and impact across borders. This work (1) tests the capability of commonly applied hazard indicators and vulnerability factors to predict annual drought impact occurrence for different sectors and macro regions in Europe and (2) combines information on past drought impacts, drought hazard indicators, and vulnerability factors into estimates of drought risk at the pan-European scale. This "hybrid approach" bridges the gap between traditional vulnerability assessment and probabilistic impact forecast in a statistical modelling framework. Multivariable logistic regression was applied to predict the likelihood of impact occurrence on an annual basis for particular impact categories and European macro regions. The results indicate sector- and macro region specific sensitivities of hazard indicators, with the Standardised Precipitation Evapotranspiration Index for a twelve month aggregation period (SPEI-12) as the overall best hazard predictor. Vulnerability factors have only limited ability to predict drought impacts as single predictor, with information about landuse and water resources as best vulnerability-based predictors. (3) The application of the "hybrid approach" revealed strong regional (NUTS combo level) and sector specific differences in drought risk across Europe. The majority of best predictor combinations rely on a combination of SPEI for shorter and longer aggregation periods, and a combination of information on landuse and water resources. The added value of integrating regional vulnerability information

  16. Uncertainty in the estimation of benzene risks: application of an uncertainty taxonomy to risk assessments based on an epidemiology study of rubber hydrochloride workers.

    PubMed Central

    Byrd, D M; Barfield, E T

    1989-01-01

    This paper reviews 14 risk assessments that use the data from descriptions by Rinsky, Young, and co-workers of benzene-associated leukemias among a group of rubber hydrochloride workers in Ohio. The leukemogenic risks of benzene estimated in these assessments differ. The assessors use different assumptions (parameters, confounding factors, or formulas), which account for the differences in risk. The purpose of the review is to determine whether the major source of uncertainty in assessments of benzene risk arises from data, method, or concept. The results show that methodological differences dominate the other two potential sources with respect to impact on risk magnitude. PMID:2792047

  17. Estimating the risk of collisions between bicycles and motor vehicles at signalized intersections.

    PubMed

    Wang, Yinhai; Nihan, Nancy L

    2004-05-01

    Collisions between bicycles and motor vehicles have caused severe life and property losses in many countries. The majority of bicycle-motor vehicle (BMV) accidents occur at intersections. In order to reduce the number of BMV accidents at intersections, a substantial understanding of the causal factors for the collisions is required. In this study, intersection BMV accidents were classified into three types based on the movements of the involved motor vehicles and bicycles. The three BMV accident classifications were through motor vehicle related collisions, left-turn motor vehicle related collisions, and right-turn motor vehicle related collisions. A methodology for estimating these BMV accident risks was developed based on probability theory. A significant difference between this proposed methodology and most current approaches is that the proposed approach explicitly relates the risk of each specific BMV accident type to its related flows. The methodology was demonstrated using a 4-year (1992-1995) data set collected from 115 signalized intersections in the Tokyo Metropolitan area. This data set contains BMV accident data, bicycle flow data, motor vehicle flow data, traffic control data, and geometric data for each intersection approach. For each BMV risk model, an independent explanatory variable set was chosen according to the characteristics of the accident type. Three negative binomial regression models (one corresponding to each BMV accident type) were estimated using the maximum likelihood method. The coefficient value and its significance level were estimated for each selected variable. The negative binomial dispersion parameters for all the three models were significant at 0.01 levels. This supported the choice of the negative binomial regression over the Poisson regression for the quantitative analyses in this study.
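
    A minimal sketch of the modelling step described above: a negative binomial regression of accident counts on the logarithms of the related flows. The intersection data are simulated, not the Tokyo data set.

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(3)
        n = 115
        bike_flow = rng.uniform(100, 2000, n)            # bicycles per day (hypothetical)
        motor_flow = rng.uniform(1000, 30000, n)         # motor vehicles per day (hypothetical)
        mu = np.exp(-9.0 + 0.6 * np.log(bike_flow) + 0.5 * np.log(motor_flow))
        accidents = rng.poisson(mu * rng.gamma(2.0, 0.5, n))   # overdispersed multi-year counts

        X = sm.add_constant(np.column_stack([np.log(bike_flow), np.log(motor_flow)]))
        result = sm.GLM(accidents, X, family=sm.families.NegativeBinomial()).fit()
        print(result.params)      # intercept and flow elasticities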

  18. Estimates of coextinction risk: how anuran parasites respond to the extinction of their hosts.

    PubMed

    Campião, Karla Magalhães; de Aquino Ribas, Augusto Cesar; Cornell, Stephen J; Begon, Michael; Tavares, Luiz Eduardo Roland

    2015-12-01

    Amphibians are known as the most threatened vertebrate group. One of the outcomes of a species' extinction is the coextinction of its dependents. Here, we estimate the extinction risk of helminth parasites of South American anurans. Parasite coextinction probabilities were modeled, assuming parasite specificity and host vulnerability to extinction as determinants. Parasite species associated with few hosts were the most prone to extinction, and extinction risk varied amongst helminth species of different taxonomic groups and life-cycle complexity. Considering host vulnerability in the model decreased the extinction probability of most parasite species. However, parasite specificity and host vulnerability combined to increase the extinction probabilities of 44% of the helminth species reported in a single anuran species.

  19. Risk estimation of infectious diseases determines the effectiveness of the control strategy

    NASA Astrophysics Data System (ADS)

    Zhang, Haifeng; Zhang, Jie; Li, Ping; Small, Michael; Wang, Binghong

    2011-05-01

    Whether or not to be vaccinated is usually a voluntary decision, determined by many factors ranging from societal ones (such as religious belief and human rights) to individual preferences (including psychology and altruism). When facing outbreaks of infectious diseases, different people often estimate the risk differently: some are willing to vaccinate, while others are willing to take their chances. In this paper, we establish two different risk assessment systems using the technique of dynamic programming, and then compare the effects of the two systems on disease prevention on complex networks. In one, the perceived probability of being infected is the same for every individual (uniform case); in the other, it is positively correlated with an individual's degree (preferential case). We show that these two risk assessment systems can yield completely different results for, among other things, the effectiveness of disease control and the time evolution of the number of infections.

  20. Risk information in support of cost estimates for the Baseline Environmental Management Report (BEMR). Section 1

    SciTech Connect

    Gelston, G.M.; Jarvis, M.F.; Warren, B.R.; Von Berg, R.

    1995-06-01

    The Pacific Northwest Laboratory (PNL)(1) effort on the overall Baseline Environmental Management Report (BEMR) project consists of four installation-specific work components performed in succession. These components include (1) development of source terms, (2) collection of data and preparation of environmental settings reports, (3) calculation of unit risk factors, and (4) utilization of the unit risk factors in the Automated Remedial Action Methodology (ARAM) for computation of target concentrations and cost estimates. This report documents work completed for the Nevada Test Site, Nevada, for components 2 and 3. The product of this phase of the BEMR project is the development of unit factors (i.e., unit transport factors, unit exposure factors, and unit risk factors). Thousands of these unit factors are generated and fill approximately one megabyte of computer information per installation. The final unit risk factors (URF) are transmitted electronically to BEMR-Cost task personnel as input to a computer program (ARAM). Abstracted files and exhibits of the URF information are included in this report. These visual formats are intended to provide a sample of the final task deliverable (the URF files), which can be easily read without a computer.

  1. Waste management programmatic environmental impact statement methodology for estimating human health risks

    SciTech Connect

    Bergenback, B.; Blaylock, B.P.; Legg, J.L.

    1995-05-01

    The US Department of Energy (DOE) has produced large quantities of radioactive and hazardous waste during years of nuclear weapons production. As a result, a large number of sites across the DOE Complex have become chemically and/or radiologically contaminated. In 1990, the Secretary of Energy charged the DOE Office of Environmental Restoration and Waste Management (EM) with the task of preparing a Programmatic Environmental Impact Statement (PEIS). The PEIS should identify and assess the potential environmental impacts of implementing several integrated Environmental Restoration (ER) and Waste Management (WM) alternatives. The determination and integration of appropriate remediation activities and sound waste management practices are vital for minimizing adverse human health impacts during site cleanup and waste management programs. This report documents the PEIS risk assessment methodology used to evaluate human health risks posed by WM activities. The methodology presents a programmatic cradle-to-grave risk assessment for EM program activities. A unit dose approach is used to estimate risks posed by WM activities and is the subject of this document.

  2. Arsenic risk mapping in Bangladesh: a simulation technique of cokriging estimation from regional count data.

    PubMed

    Hassan, M Manzurul; Atkins, Peter J

    2007-10-01

    Risk analysis that uses spatial interpolation methods to map from a regional database onto a continuous surface is of contemporary interest. Groundwater arsenic poisoning in Bangladesh and its impact on human health has been one of the "biggest environmental health disasters" in recent years. It is ironic that so many tubewells have been installed in recent times for pathogen-free drinking water, but the water pumped is often contaminated with toxic levels of arsenic. This paper seeks to analyse the spatial pattern of arsenic risk by mapping composite "problem regions" in southwest Bangladesh. It also examines the cokriging interpolation method in analysing the suitability of isopleth maps for different risk areas. GIS-based data processing and spatial analysis were used for this research, along with state-of-the-art decision-making techniques. Apart from the GIS-based buffering and overlay mapping operations, a cokriging interpolation method was adopted because of its exact interpolation capacity. The paper presents an interpolation of regional estimates of arsenic data for spatial risk mapping that overcomes the areal bias problem for administrative boundaries. Moreover, the functionality of the cokriging method demonstrates the suitability of isopleth maps that are easy to read.

  3. Use of binary logistic regression technique with MODIS data to estimate wild fire risk

    NASA Astrophysics Data System (ADS)

    Fan, Hong; Di, Liping; Yang, Wenli; Bonnlander, Brian; Li, Xiaoyan

    2007-11-01

    Many forest fires occur across the globe each year, destroying life and property and strongly impacting ecosystems. In recent years, wildland fires and altered fire disturbance regimes have become a significant management and science problem affecting ecosystems and the wildland/urban interface across the United States and globally. In this paper, we discuss the estimation of 504 probability models for forecasting fire risk for 14 fuel types and 12 months, at lead times of one day, one week, and one month, using 19 years of historical fire data in addition to meteorological and vegetation variables. MODIS land products are utilized as a major data source, and binary logistic regression was adopted to estimate fire forecast probabilities. To better model the change in fire risk across seasons, spatial and temporal stratification strategies were applied. To explore the possibility of real-time prediction, the Matlab distributed computing toolbox was used to accelerate the predictions. Finally, this study evaluates and validates the predictions against collected ground truth. The validation results indicate that these fire risk models achieve nearly 70% prediction accuracy and that MODIS data are a potential data source for near-real-time fire risk prediction.

  4. The new risk estimates and exposures to radon in high grade uranium mines

    SciTech Connect

    Brown, L.D.

    1992-09-01

    The DS86 dosimetry used by the Radiation Effects Research Foundation, in re-evaluating radiation exposure risks from the atom bomb survivor lifespan studies, has led directly to a significant reduction in the maximum permissible dose recommended by the International Commission on Radiological Protection. In the case of uranium miners the contribution to the total dose resulting from the inhalation of radon daughters continues to be assessed in accordance with the procedures recommended in ICRP Report 47, which means that there has been no change in the maximum permissible radon daughter exposure limit. ICRP intends to review this situation once the new report of their task force on lung dosimetry has been adopted. This paper suggests that the direct epidemiological data on the risks of radon daughter inhalation is more satisfactory than indirect estimates of risk based on a lung dosimetry model, and that this direct evidence does not justify any increase in the presently accepted risk factor associated with radon daughter inhalation.

  5. Overall risk estimation for nonreactor nuclear facilities and implementation of safety goals

    SciTech Connect

    Kim, K.S.; Bradley, R.F.

    1993-06-01

    A typical safety analysis report (SAR) contains estimated frequencies and consequences of various design basis accidents (DBAs). However, the results are organized and presented in such a way that they cannot readily be summed with mathematical rigor to express a total or overall risk. This paper describes a mathematical formalism for deriving total risk indicators. The mathematical formalism is based on the complementary cumulative distribution function (CCDF), or exceedance probability, of the radioactivity release fraction and the individual radiation dose. A simple protocol is presented for establishing exceedance probabilities from the results of DBA analyses typically available from an SAR. The exceedance probability of the release fraction can be a useful indicator for gaining insights into the capability of confinement barriers, the characteristics of source terms, and the scope of the SAR. Fatality risks comparable to the DOE Safety Goals can be derived from the exceedance probability of individual doses. Example case analyses are presented to illustrate the use of the proposed protocol and mathematical formalism. The methodology is finally applied to proposed risk guidelines for individual accident events to show that these guidelines would be within the DOE Safety Goals.
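
    A minimal sketch of the exceedance-probability (CCDF) construction described here: the frequencies and consequences of individual design basis accidents, as typically tabulated in an SAR, are combined into the probability that a given dose is exceeded in a year. All frequencies and doses below are illustrative placeholders.

        import numpy as np

        freq_per_year = np.array([1e-2, 1e-3, 1e-4, 1e-6])    # DBA frequencies (hypothetical)
        dose_rem = np.array([0.1, 1.0, 5.0, 25.0])            # corresponding individual doses (hypothetical)

        def exceedance_probability(d, years=1.0):
            # Sum the rates of all accidents whose dose exceeds d, then convert the
            # Poisson rate into a probability for the exposure period.
            rate = freq_per_year[dose_rem > d].sum()
            return 1.0 - np.exp(-rate * years)

        for d in (0.05, 0.5, 2.0, 10.0):
            print(f"P(dose > {d} rem in 1 y) = {exceedance_probability(d):.2e}")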

  6. Social and economic factors of the natural risk increasing: estimation of the Russian regions

    NASA Astrophysics Data System (ADS)

    Petrova, E.

    2004-04-01

    This study is an attempt to assess quantitatively the social and economic factors that determine the vulnerability of Russian regions to natural risk, to trace the spatial differences in the considered factors, and to group the regions by their similarity. To indicate regional differences in social and economic development, equipment condition, accumulation of dangerous substances, and social trouble, four of the most suitable parameters were estimated: per capita Gross Regional Product (GRP), capital consumption, volume of total toxic waste, and crime rate. An increase in the first parameter reduces vulnerability; increases in the last three raise it. Using multidimensional cluster analysis, five types of regions were identified for Russia according to the similarity of the considered parameters. Each type is characterized by a higher value of a single (rarely two) parameter, which appears sufficient to increase natural risk in these regions in the near future. Only a few regions, belonging to the fifth type, proved to have a rather high GRP and relatively low values of the other parameters. A negative correlation was found between the number of natural disasters (ND) and per capita GRP in cases where some parameters reached anomalously high values. The distinctions between regions in terms of which parameters prevail in increasing natural risk help risk managers decide where to focus their efforts.

  7. A global building inventory for earthquake loss estimation and risk management

    USGS Publications Warehouse

    Jaiswal, K.; Wald, D.; Porter, K.

    2010-01-01

    We develop a global database of building inventories using a taxonomy of global building types for use in near-real-time post-earthquake loss estimation and pre-earthquake risk analysis, for the U.S. Geological Survey's Prompt Assessment of Global Earthquakes for Response (PAGER) program. The database is available for public use, subject to peer review, scrutiny, and open enhancement. On a country-by-country level, it contains estimates of the distribution of building types categorized by material, lateral force-resisting system, and occupancy type (residential or nonresidential, urban or rural). The database draws on and harmonizes numerous sources: (1) UN statistics, (2) UN Habitat's demographic and health survey (DHS) database, (3) national housing censuses, (4) the World Housing Encyclopedia, and (5) other literature. © 2010, Earthquake Engineering Research Institute.

  8. Spatially interpolated disease prevalence estimation using collateral indicators of morbidity and ecological risk.

    PubMed

    Congdon, Peter

    2013-10-01

    This paper considers estimation of disease prevalence for small areas (neighbourhoods) when the available observations on prevalence are for an alternative partition of a region, such as service areas. Interpolation to neighbourhoods uses a kernel method extended to take account of two types of collateral information. The first is morbidity and service use data, such as hospital admissions, observed for neighbourhoods. Variations in morbidity and service use are expected to reflect prevalence. The second type of collateral information is ecological risk factors (e.g., pollution indices) that are expected to explain variability in prevalence in service areas, but are typically observed only for neighbourhoods. An application estimates neighbourhood asthma prevalence in a London health region comprising 562 neighbourhoods and 189 service (primary care) areas. PMID:24129116

  9. Nonparametric estimation and classification using radial basis function nets and empirical risk minimization.

    PubMed

    Krzyzak, A; Linder, T; Lugosi, C

    1996-01-01

    Studies convergence properties of radial basis function (RBF) networks for a large class of basis functions, and reviews the methods and results related to this topic. The authors obtain the network parameters through empirical risk minimization. The authors show the optimal nets to be consistent in the problem of nonlinear function approximation and in nonparametric classification. For the classification problem the authors consider two approaches: the selection of the RBF classifier via nonlinear function estimation and the direct method of minimizing the empirical error probability. The tools used in the analysis include distribution-free nonasymptotic probability inequalities and covering numbers for classes of functions.

  10. Risk estimation of HNA-3 incompatibility and alloimmunization in Thai populations.

    PubMed

    Nathalang, Oytip; Intharanut, Kamphon; Siriphanthong, Kanokpol; Nathalang, Siriporn; Leetrakool, Nipapan

    2015-01-01

    Severe transfusion-related acute lung injury (TRALI) is often due to antibodies in blood components directed against human neutrophil antigen (HNA)-3a. This study aimed to report the genotype frequencies of the HNA-3 system and to estimate the potential risk of HNA-3 incompatibility and alloimmunization in two Thai populations. Eight hundred DNA samples obtained from 500 unrelated healthy blood donors at the National Blood Centre, Thai Red Cross Society, Bangkok and 300 samples from the Blood Bank, Faculty of Medicine, Chiang Mai University, Chiang Mai, Thailand were included. HNA-3 genotyping was performed using an in-house polymerase chain reaction with sequence-specific primer (PCR-SSP) technique. The observed frequencies of the HNA-3a/3a, HNA-3a/3b, and HNA-3b/3b genotypes were 0.528, 0.380, and 0.092 in central Thais and 0.600, 0.350, and 0.050 in northern Thais, respectively. The frequencies were used to estimate HNA-3 incompatibility and the risk of HNA-3a alloimmunization. HNA-3 incompatibility in central Thais (33.28%) was higher than that in northern Thais (28.75%), corresponding to a significantly higher probability of HNA-3a alloimmunization (P<0.05), similar to Japanese and Chinese populations. This study showed a high risk of HNA-3 incompatibility and alloimmunization, especially in central Thai blood donors. Molecular-based identification of the HNA-3 genotype of female donors is suggested to reduce the risk of TRALI following plasma and whole blood allogeneic transfusion.
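
    The incompatibility figures quoted above can be reproduced directly from the reported genotype frequencies under a simple random donor-recipient pairing model (a transfusion is counted as incompatible when the donor carries an HNA-3 antigen the recipient lacks). The sketch below is only a back-of-the-envelope check of that arithmetic; the pairing model is an assumption consistent with the published percentages.

      # Reproduce the HNA-3 incompatibility figures from the published genotype frequencies.
      def hna3_incompatibility(f_aa, f_ab, f_bb):
          """Return (overall incompatibility, HNA-3a alloimmunization risk)."""
          donor_has_3a = f_aa + f_ab          # donor expresses HNA-3a
          donor_has_3b = f_ab + f_bb          # donor expresses HNA-3b
          recipient_lacks_3a = f_bb           # HNA-3b/3b recipients
          recipient_lacks_3b = f_aa           # HNA-3a/3a recipients
          incompatible = (recipient_lacks_3a * donor_has_3a
                          + recipient_lacks_3b * donor_has_3b)
          alloimmunization_3a = recipient_lacks_3a * donor_has_3a
          return incompatible, alloimmunization_3a

      central = hna3_incompatibility(0.528, 0.380, 0.092)   # approx (0.3328, 0.0835)
      northern = hna3_incompatibility(0.600, 0.350, 0.050)  # approx (0.2875, 0.0475)
      print(central, northern)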

  11. Estimation of mortality and morbidity risk of radical cystectomy using POSSUM and the Portsmouth predictor equation

    PubMed Central

    Morizane, Shuichi; Honda, Masashi; Isoyama, Tadahiro; Koumi, Tsutomu; Ono, Kouji; Kadowaki, Hiroyuki; Sejima, Takehiro; Takenaka, Atsushi

    2015-01-01

    Introduction The Physiological and Operative Severity Score for the enumeration of Mortality and Morbidity (POSSUM) and the Portsmouth predictor equation (P-POSSUM) are simple scoring systems used to estimate the risk of complications and death postoperatively. We investigated the use of these scores to predict the postoperative risk in patients undergoing radical cystectomy (RC). Material and methods In this retrospective study, we enrolled 280 patients who underwent RC for invasive bladder cancer between January 2003 and December 2011. Morbidity and mortality were predicted using the POSSUM and P-POSSUM equations. We further assessed the ability of the POSSUM and P-POSSUM to predict the mortality and morbidity risk in RC patients with a Clavien–Dindo classification of surgical complications of grade II or higher. Results The observed morbidity and mortality rates were 58.9% (165 patients) and 1.8% (5 patients), respectively. Predicted morbidity using POSSUM was 49.2% (138 patients) compared to the 58.9% (165 patients) observed (P <0.0001). Compared to the observed death rate of 1.8% (5 patients), predicted mortality using POSSUM and P-POSSUM was 12.1% (34 patients) and 3.9% (11 patients), respectively (P <0.0001 and P = 0.205). The mortality risk estimated by P-POSSUM was not significantly different from the observed mortality rate. Conclusions The results of this study supported the efficacy of POSSUM combined with P-POSSUM to predict morbidity and mortality in patients undergoing RC. Further prospective studies are needed to better determine the usefulness of POSSUM and P-POSSUM for a comparative audit in urological patients undergoing RC. PMID:26568864
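
    For orientation, the POSSUM and P-POSSUM predictions are obtained from logistic equations in the physiological score (PS) and operative severity score (OS). The sketch below uses the coefficients usually quoted for the original publications; treat them, and the example scores, as illustrative assumptions to be verified against the primary sources before any clinical use.

      import math

      def possum_morbidity(ps, os_):
          # POSSUM morbidity: ln(R/(1-R)) = -5.91 + 0.16*PS + 0.19*OS (quoted coefficients)
          return 1 / (1 + math.exp(-(-5.91 + 0.16 * ps + 0.19 * os_)))

      def possum_mortality(ps, os_):
          # POSSUM mortality: ln(R/(1-R)) = -7.04 + 0.13*PS + 0.16*OS (quoted coefficients)
          return 1 / (1 + math.exp(-(-7.04 + 0.13 * ps + 0.16 * os_)))

      def p_possum_mortality(ps, os_):
          # P-POSSUM mortality: ln(R/(1-R)) = -9.065 + 0.1692*PS + 0.1550*OS (quoted coefficients)
          return 1 / (1 + math.exp(-(-9.065 + 0.1692 * ps + 0.1550 * os_)))

      # Hypothetical example scores for a single patient
      ps, os_ = 20, 17
      print(possum_morbidity(ps, os_), possum_mortality(ps, os_), p_possum_mortality(ps, os_))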

  12. Daily salt intake estimated by overnight urine collections indicates a high cardiovascular disease risk in Thailand.

    PubMed

    Yokokawa, Hirohide; Yuasa, Motoyuki; Nedsuwan, Supalert; Moolphate, Saiyud; Fukuda, Hiroshi; Kitajima, Tsutomu; Minematsu, Kazuo; Tanimura, Susumu; Marui, Eiji

    2016-01-01

    This cross-sectional study (February 2012 to March 2013) was conducted to estimate daily salt intake and basic characteristics among 793 community-dwelling participants at high risk of cardiovascular disease (Framingham risk score >15%), who had visited diabetes or hypertension clinics at health centres in the Muang district, Chiang Rai, Thailand. We performed descriptive analysis of baseline data and used an automated analyser to estimate average 24-hour salt intake from 3 days of overnight urine collection. Participants were divided into two groups based on median estimated daily salt intake. Mean age and the proportion of males were 65.2 years and 37.6% in the higher salt intake group (≥10.0 g/day, n=362), and 67.5 years and 42.7% in the lower salt intake group (<10.0 g/day, n=431), respectively (p=0.01, p<0.01). The higher salt intake group included more patients with a family history of hypertension and antihypertensive drug use, a lower proportion with an ideal body mass index (18.5-24.9), higher exercise frequency (≥2 times weekly), and lower awareness of high salt intake. Among higher salt intake participants, those with lower awareness of high salt intake were younger and more often had a family history of hypertension, relative to those with more awareness. Our data indicated that families often share lifestyles involving high salt intake, and discrepancies between actual salt intake and awareness of high salt intake may indicate a need for salt reduction interventions aimed at the family level. Awareness of actual salt intake should be improved for each family. PMID:26965760

  13. Impact of alternative metrics on estimates of extent of occurrence for extinction risk assessment.

    PubMed

    Joppa, Lucas N; Butchart, Stuart H M; Hoffmann, Michael; Bachman, Steve P; Akçakaya, H Resit; Moat, Justin F; Böhm, Monika; Holland, Robert A; Newton, Adrian; Polidoro, Beth; Hughes, Adrian

    2016-04-01

    In International Union for Conservation of Nature (IUCN) Red List assessments, extent of occurrence (EOO) is a key measure of extinction risk. However, the way assessors estimate EOO from maps of species' distributions is inconsistent among assessments of different species and among major taxonomic groups. Assessors often estimate EOO from the area of mapped distribution, but these maps often exclude areas that are not habitat in idiosyncratic ways and are not created at the same spatial resolutions. We assessed the impact on extinction risk categories of applying different methods (minimum convex polygon, alpha hull) for estimating EOO for 21,763 species of mammals, birds, and amphibians. Overall, the percentage of threatened species requiring down-listing to a lower category of threat (taking into account other Red List criteria under which they qualified) spanned 11-13% for all species combined (14-15% for mammals, 7-8% for birds, and 12-15% for amphibians). These down-listings resulted from larger estimates of EOO and depended on the EOO calculation method. Using birds as an example, we found that 14% of threatened and near threatened species could require down-listing based on the minimum convex polygon (MCP) approach, an approach that is now recommended by IUCN. Other metrics (such as alpha hull) had marginally smaller impacts. Our results suggest that uniformly applying the MCP approach may lead to a one-time down-listing of hundreds of species but ultimately ensure consistency across assessments and realign the calculation of EOO with the theoretical basis on which the metric was founded. PMID:26183938
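
    A minimal sketch of the minimum convex polygon (MCP) calculation discussed above: EOO is taken as the area of the convex hull of the occurrence points. The toy coordinates are assumed to be in an equal-area projection (metres); real Red List workflows also handle geodesic areas and the alpha-hull alternative.

      import numpy as np
      from scipy.spatial import ConvexHull

      def eoo_mcp_area_km2(xy_metres):
          """Extent of occurrence (km^2) as the area of the minimum convex polygon."""
          hull = ConvexHull(np.asarray(xy_metres, dtype=float))
          return hull.volume / 1e6   # for 2-D input, ConvexHull.volume is the area

      # Toy occurrence records (x, y) in metres, equal-area projection assumed
      points = [(0, 0), (120_000, 10_000), (90_000, 80_000), (20_000, 60_000), (50_000, 30_000)]
      print(round(eoo_mcp_area_km2(points), 1), "km^2")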

  15. The performance of different propensity-score methods for estimating differences in proportions (risk differences or absolute risk reductions) in observational studies.

    PubMed

    Austin, Peter C

    2010-09-10

    Propensity score methods are increasingly being used to estimate the effects of treatments on health outcomes using observational data. There are four methods for using the propensity score to estimate treatment effects: covariate adjustment using the propensity score, stratification on the propensity score, propensity-score matching, and inverse probability of treatment weighting (IPTW) using the propensity score. When outcomes are binary, the effect of treatment on the outcome can be described using odds ratios, relative risks, risk differences, or the number needed to treat. Several clinical commentators suggested that risk differences and numbers needed to treat are more meaningful for clinical decision making than are odds ratios or relative risks. However, there is a paucity of information about the relative performance of the different propensity-score methods for estimating risk differences. We conducted a series of Monte Carlo simulations to examine this issue. We examined bias, variance estimation, coverage of confidence intervals, mean-squared error (MSE), and type I error rates. A doubly robust version of IPTW had superior performance compared with the other propensity-score methods. It resulted in unbiased estimation of risk differences, treatment effects with the lowest standard errors, confidence intervals with the correct coverage rates, and correct type I error rates. Stratification, matching on the propensity score, and covariate adjustment using the propensity score resulted in minor to modest bias in estimating risk differences. Estimators based on IPTW had lower MSE compared with other propensity-score methods. Differences between IPTW and propensity-score matching may reflect that these two methods estimate the average treatment effect and the average treatment effect for the treated, respectively.
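
    The following sketch illustrates the IPTW estimator of a risk difference on simulated data: a propensity score is fitted by logistic regression, each subject is weighted by the inverse probability of the treatment actually received, and weighted outcome proportions are compared. The data-generating model and variable names are illustrative assumptions, not the simulation design used in the paper.

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(0)
      n = 5000
      x = rng.normal(size=(n, 3))                                   # confounders
      p_treat = 1 / (1 + np.exp(-(0.4 * x[:, 0] - 0.3 * x[:, 1])))  # toy treatment model
      z = rng.binomial(1, p_treat)                                  # treatment indicator
      p_event = 1 / (1 + np.exp(-(-1.0 + 0.5 * x[:, 0] + 0.8 * z))) # toy outcome model
      y = rng.binomial(1, p_event)                                  # binary outcome

      ps = LogisticRegression().fit(x, z).predict_proba(x)[:, 1]    # estimated propensity score
      w = z / ps + (1 - z) / (1 - ps)                               # inverse probability of treatment weights

      risk_treated = np.sum(w * z * y) / np.sum(w * z)
      risk_control = np.sum(w * (1 - z) * y) / np.sum(w * (1 - z))
      print("IPTW risk difference:", risk_treated - risk_control)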

  16. Assessing uncertainty in published risk estimates using hexavalent chromium and lung cancer mortality as an example [Presentation 2015

    EPA Science Inventory

    Introduction: The National Research Council recommended quantitative evaluation of uncertainty in effect estimates for risk assessment. This analysis considers uncertainty across model forms and model parameterizations with hexavalent chromium [Cr(VI)] and lung cancer mortality a...

  17. Estimating Geographical Variation in the Risk of Zoonotic Plasmodium knowlesi Infection in Countries Eliminating Malaria

    PubMed Central

    Shearer, Freya M.; Huang, Zhi; Weiss, Daniel J.; Wiebe, Antoinette; Gibson, Harry S.; Battle, Katherine E.; Pigott, David M.; Brady, Oliver J.; Putaporntip, Chaturong; Jongwutiwes, Somchai; Lau, Yee Ling; Manske, Magnus; Amato, Roberto; Elyazar, Iqbal R. F.; Vythilingam, Indra; Bhatt, Samir; Gething, Peter W.; Singh, Balbir; Golding, Nick; Hay, Simon I.

    2016-01-01

    Background Infection by the simian malaria parasite, Plasmodium knowlesi, can lead to severe and fatal disease in humans, and is the most common cause of malaria in parts of Malaysia. Despite being a serious public health concern, the geographical distribution of P. knowlesi malaria risk is poorly understood because the parasite is often misidentified as one of the human malarias. Human cases have been confirmed in at least nine Southeast Asian countries, many of which are making progress towards eliminating the human malarias. Understanding the geographical distribution of P. knowlesi is important for identifying areas where malaria transmission will continue after the human malarias have been eliminated. Methodology/Principal Findings A total of 439 records of P. knowlesi infections in humans, macaque reservoir and vector species were collated. To predict spatial variation in disease risk, a model was fitted using records from countries where the infection data coverage is high. Predictions were then made throughout Southeast Asia, including regions where infection data are sparse. The resulting map predicts areas of high risk for P. knowlesi infection in a number of countries that are forecast to be malaria-free by 2025 (Malaysia, Cambodia, Thailand and Vietnam) as well as countries projected to be eliminating malaria (Myanmar, Laos, Indonesia and the Philippines). Conclusions/Significance We have produced the first map of P. knowlesi malaria risk, at a fine-scale resolution, to identify priority areas for surveillance based on regions with sparse data and high estimated risk. Our map provides an initial evidence base to better understand the spatial distribution of this disease and its potential wider contribution to malaria incidence. Considering malaria elimination goals, areas for prioritised surveillance are identified. PMID:27494405

  18. The potential of plasma miRNAs for diagnosis and risk estimation of colorectal cancer.

    PubMed

    Chen, Wang-Yang; Zhao, Xiao-Juan; Yu, Zhi-Fu; Hu, Fu-Lan; Liu, Yu-Peng; Cui, Bin-Bin; Dong, Xin-Shu; Zhao, Ya-Shuang

    2015-01-01

    Circulating microRNAs (miRNAs) are recognized as potential non-invasive biomarkers for colorectal cancer (CRC) detection and prediction. However, the association between plasma miRNA expression and CRC risk has rarely been analyzed. We therefore conducted this study to evaluate the value of plasma miRNAs for CRC diagnosis and risk estimation. Fasting blood samples from 100 CRC patients and 79 cancer-free controls were collected. Plasma miR-106a, miR-20a, miR-27b, miR-92a and miR-29a levels were detected by RT-qPCR. Sensitivity and specificity were used to evaluate the diagnostic value of the miRNAs for CRC. Univariate and multivariate logistic regression were used to analyze the association between miRNA expression and CRC risk. miR-106a and miR-20a were elevated in patients with CRC. The sensitivity of miR-106a was 74.00% and the specificity was 44.40% at a cutoff value of 2.03. For miR-20a, the sensitivity was 46.00% and the specificity was 73.42% when 2.44 was used as the cutoff value. High expression of plasma miR-106a increased CRC risk by 1.80-fold. Plasma miR-106a and miR-20a may serve as noninvasive biomarkers for detecting CRC, and high expression of miR-106a was associated with CRC risk. PMID:26261602

  19. Semi-analytical estimation of wellbore leakage risk during CO2 sequestration in Ottawa County, Michigan

    NASA Astrophysics Data System (ADS)

    Guo, B.; Matteo, E. N.; Elliot, T. R.; Nogues, J. P.; Deng, H.; Fitts, J. P.; Pollak, M.; Bielicki, J.; Wilson, E.; Celia, M. A.; Peters, C. A.

    2011-12-01

    Using the semi-analytical ELSA model, wellbore leakage risk is estimated for CO2 injection into either the Mt. Simon or St. Peter formations, which are part of the Michigan Sedimentary Basin that lies beneath Ottawa County, MI. ELSA is a vertically integrated subsurface modeling tool that can be used to simulate both supercritical CO2 plume distribution/migration and pressure-induced brine displacement during CO2 injection. A composite 3D subsurface domain was constructed for the ELSA simulations based on estimated permeabilities for formation layers, as well as GIS databases containing subsurface stratigraphy, active and inactive wells, and potential interactions with subsurface activities. These include potable aquifers, oil and gas reservoirs, and waste injection sites, which represent potential liabilities if encountered by brine or supercritical CO2 displaced from the injection formation. Overall, the 3D subsurface domain encompasses an area of 1500 km2 to a depth of 2 km and contains over 3,000 wells. The permeabilities of abandoned wells are derived from a ranking system based on available well data, including historical records and well logs. This distribution is then randomly sampled in Monte Carlo simulations that are used to generate a probability map for subsurface interferences or atmospheric release resulting from leakage of CO2 and/or brine from the injection formation. This method serves as the basis for comparative testing between various injection scenarios, as well as for comparing the relative risk of leakage between injection formations or storage sites.
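
    A highly simplified sketch of the Monte Carlo step described above: each abandoned well intersected by the plume footprint is assigned a permeability drawn from an assumed distribution, and the fraction of realizations containing at least one high-permeability ("leaky") well is reported. The distribution parameters, cutoff, and footprint size are placeholders, not ELSA inputs.

      import numpy as np

      rng = np.random.default_rng(5)
      n_sims, wells_in_footprint = 10_000, 40        # wells intersected by the plume (placeholder)
      leaky_cutoff = -12.0                           # log10 permeability treated as leaky (placeholder)

      # Sample log10 permeability for every well in every realization (placeholder distribution)
      log10_k = rng.normal(-15.0, 1.5, size=(n_sims, wells_in_footprint))
      any_leaky = (log10_k > leaky_cutoff).any(axis=1)
      print("P(at least one leaky well in footprint) =", any_leaky.mean())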

  20. A framework for estimating radiation-related cancer risks in Japan from the 2011 Fukushima nuclear accident.

    PubMed

    Walsh, L; Zhang, W; Shore, R E; Auvinen, A; Laurier, D; Wakeford, R; Jacob, P; Gent, N; Anspaugh, L R; Schüz, J; Kesminiene, A; van Deventer, E; Tritscher, A; del Rosario Pérez, M

    2014-11-01

    We present here a methodology for health risk assessment adopted by the World Health Organization that provides a framework for estimating risks from the Fukushima nuclear accident after the March 11, 2011 Japanese major earthquake and tsunami. Substantial attention has been given to the possible health risks associated with human exposure to radiation from damaged reactors at the Fukushima Daiichi nuclear power station. Cumulative doses were estimated and applied for each post-accident year of life, based on a reference level of exposure during the first year after the earthquake. A lifetime cumulative dose of twice the first year dose was estimated for the primary radionuclide contaminants ((134)Cs and (137)Cs), based on Chernobyl data, relative abundances of cesium isotopes, and cleanup efforts. Risks for particularly radiosensitive cancer sites (leukemia, thyroid and breast cancer), as well as the combined risk for all solid cancers, were considered. The male and female cumulative risks of cancer incidence attributed to radiation doses from the accident, for those exposed at various ages, were estimated in terms of the lifetime attributable risk (LAR). Calculations of LAR were based on recent Japanese population statistics for cancer incidence and current radiation risk models from the Life Span Study of Japanese A-bomb survivors. Cancer risks over an initial period of 15 years after first exposure were also considered. LAR results were also given as a percentage of the lifetime baseline risk (i.e., the cancer risk in the absence of radiation exposure from the accident). The LAR results were based on either a reference first year dose (10 mGy) or a reference lifetime dose (20 mGy) so that risk assessment may be applied for relocated and non-relocated members of the public, as well as for adult male emergency workers. The results show that the major contribution to LAR from the reference lifetime dose comes from the first year dose. For a dose of 10 mGy in
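
    For intuition only, the sketch below shows the general shape of a lifetime attributable risk (LAR) calculation: excess risk is accumulated over attained ages as baseline rate x excess relative risk x survival. The baseline rates, survival curve, and linear ERR slope are toy placeholders, not the Japanese population statistics or Life Span Study models used in the paper.

      import numpy as np

      ages = np.arange(30, 90)                           # attained ages after exposure at age 30
      baseline_rate = 1e-3 * np.exp(0.05 * (ages - 30))  # toy baseline cancer incidence rate
      survival = np.exp(-0.0005 * (ages - 30) ** 2)      # toy survival curve
      err_per_gy = 0.5                                   # toy linear excess relative risk per Gy
      dose_gy = 0.01                                     # reference first-year dose (10 mGy)

      # Toy LAR: sum of baseline rate x ERR(dose) x survival over attained ages
      lar = np.sum(baseline_rate * err_per_gy * dose_gy * survival)
      print(f"Toy LAR for {dose_gy * 1000:.0f} mGy: {lar:.5f}")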

  2. Estimating the Risk of Chronic Pain: Development and Validation of a Prognostic Model (PICKUP) for Patients with Acute Low Back Pain

    PubMed Central

    Traeger, Adrian C.; Henschke, Nicholas; Hübscher, Markus; Williams, Christopher M.; Kamper, Steven J.; Maher, Christopher G.; Moseley, G. Lorimer; McAuley, James H.

    2016-01-01

    Background Low back pain (LBP) is a major health problem. Globally it is responsible for the most years lived with disability. The most problematic type of LBP is chronic LBP (pain lasting longer than 3 mo); it has a poor prognosis and is costly, and interventions are only moderately effective. Targeting interventions according to risk profile is a promising approach to prevent the onset of chronic LBP. Developing accurate prognostic models is the first step. No validated prognostic models are available to accurately predict the onset of chronic LBP. The primary aim of this study was to develop and validate a prognostic model to estimate the risk of chronic LBP. Methods and Findings We used the PROGRESS framework to specify a priori methods, which we published in a study protocol. Data from 2,758 patients with acute LBP attending primary care in Australia between 5 November 2003 and 15 July 2005 (development sample, n = 1,230) and between 10 November 2009 and 5 February 2013 (external validation sample, n = 1,528) were used to develop and externally validate the model. The primary outcome was chronic LBP (ongoing pain at 3 mo). In all, 30% of the development sample and 19% of the external validation sample developed chronic LBP. In the external validation sample, the primary model (PICKUP) discriminated between those who did and did not develop chronic LBP with acceptable performance (area under the receiver operating characteristic curve 0.66 [95% CI 0.63 to 0.69]). Although model calibration was also acceptable in the external validation sample (intercept = −0.55, slope = 0.89), some miscalibration was observed for high-risk groups. The decision curve analysis estimated that, if decisions to recommend further intervention were based on risk scores, screening could lead to a net reduction of 40 unnecessary interventions for every 100 patients presenting to primary care compared to a “treat all” approach. Limitations of the method include the model being

  3. Estimation of Wild Fire Risk Area based on Climate and Maximum Entropy in Korean Peninsular

    NASA Astrophysics Data System (ADS)

    Kim, T.; Lim, C. H.; Song, C.; Lee, W. K.

    2015-12-01

    The number of forest fires, and the accompanying human injuries and physical damage, has increased with more frequent droughts. In this study, forest fire danger zones in Korea are estimated in order to predict and prepare for future forest fire hazard regions. The MaxEnt (Maximum Entropy) model, which estimates the probability distribution of a given state, is used to delineate forest fire hazard regions. The MaxEnt model was developed primarily for the analysis of species distributions, but its applicability to various natural disasters is gaining recognition. Detailed forest fire occurrence data collected by MODIS over the past 5 years (2010-2014) are used as occurrence records, and meteorological, topographic, and vegetation data are used as environmental variables. In particular, various meteorological variables are used to examine the impact of climate, including annual average temperature, annual precipitation, dry-season precipitation, annual effective humidity, dry-season effective humidity, and an aridity index. The model was judged valid based on an AUC (Area Under the Curve) value of 0.805, which is used to assess predictive accuracy in the MaxEnt model, and the predicted forest fire locations corresponded closely with the actual forest fire distribution map. Meteorological variables such as effective humidity showed the greatest contribution, and topographic variables such as TWI (Topographic Wetness Index) and slope also contributed to forest fire occurrence. As a result, the east coast and the southern part of the Korean peninsula were predicted to have a high forest fire risk, whereas high-altitude mountain areas and the west coast appeared to be relatively safe. These results are similar to those of former studies, indicating high forest fire risk in accessible areas and reflecting the climatic characteristics of the eastern and southern regions in the dry season. To sum up, we estimated the forest fire hazard zone with existing forest fire locations and environmental variables and had
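
    The AUC check mentioned above can be illustrated with a rank-based AUC comparing model scores at fire-occurrence points against scores at background points. The score distributions below are synthetic stand-ins for MaxEnt output.

      import numpy as np

      rng = np.random.default_rng(1)
      scores_fire = rng.beta(4, 2, size=500)        # toy model output at fire locations
      scores_background = rng.beta(2, 4, size=2000) # toy model output at random background points

      def auc(pos, neg):
          """Probability that a random positive outranks a random negative (ties count 0.5)."""
          pos, neg = np.asarray(pos), np.asarray(neg)
          greater = (pos[:, None] > neg[None, :]).sum()
          ties = (pos[:, None] == neg[None, :]).sum()
          return (greater + 0.5 * ties) / (len(pos) * len(neg))

      print("AUC:", round(auc(scores_fire, scores_background), 3))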

  4. Improving the Estimation of Celiac Disease Sibling Risk by Non-HLA Genes

    PubMed Central

    Izzo, Valentina; Pinelli, Michele; Tinto, Nadia; Esposito, Maria Valeria; Cola, Arturo; Sperandeo, Maria Pia; Tucci, Francesca; Cocozza, Sergio; Greco, Luigi; Sacchetti, Lucia

    2011-01-01

    Celiac Disease (CD) is a polygenic trait, and HLA genes explain less than half of the genetic variation. Through large GWAS, more than 40 associated non-HLA genes have been identified, but they make only a small contribution to the heritability of the disease. The aim of this study is to improve the estimate of CD risk in siblings by adding a small set of non-HLA genes to HLA. One hundred fifty-seven Italian families with a confirmed CD case, at least one other sib, and both parents were recruited. Among 249 sibs, 29 developed CD in a 6-year follow-up period. All individuals were typed for HLA and 10 SNPs in non-HLA genes: CCR1/CCR3 (rs6441961), IL12A/SCHIP1 and IL12A (rs17810546 and rs9811792), TAGAP (rs1738074), RGS1 (rs2816316), LPP (rs1464510), OLIG3 (rs2327832), REL (rs842647), IL2/IL21 (rs6822844), SH2B3 (rs3184504). Three associated SNPs (in the LPP, REL, and RGS1 genes) were identified through the Transmission Disequilibrium Test, and a Bayesian approach was used to assign a score (BS) to each detected HLA+SNPs genotype combination. We then classified CD sibs as at low or at high risk if their BS was respectively < or ≥ the median BS value within each HLA risk group. A larger number (72%) of CD sibs showed a BS ≥ the median value and had a more than twofold higher OR than CD sibs with a BS value < the median (OR = 2.53, p = 0.047). Our HLA+SNPs genotype classification showed both a higher negative predictive value (95% vs 91%) and diagnostic sensitivity (79% vs 45%) than HLA alone. In conclusion, the estimate of CD risk by the HLA+SNPs approach, even if not applicable to prevention, could be a precious tool to improve the prediction of the disease in a cohort of first degree relatives, particularly in the low HLA risk groups. PMID:22087237

  5. Estimation of wildfire size and risk changes due to fuels treatments

    USGS Publications Warehouse

    Cochrane, M.A.; Moran, C.J.; Wimberly, M.C.; Baer, A.D.; Finney, M.A.; Beckendorf, K.L.; Eidenshink, J.; Zhu, Z.

    2012-01-01

    Human land use practices, altered climates, and shifting forest and fire management policies have increased the frequency of large wildfires several-fold. Mitigation of potential fire behaviour and fire severity has increasingly been attempted through pre-fire alteration of wildland fuels using mechanical treatments and prescribed fires. Despite annual treatment of more than a million hectares of land, quantitative assessments of the effectiveness of existing fuel treatments at reducing the size of actual wildfires, or of how they might alter the risk of burning across landscapes, are currently lacking. Here, we present a method for estimating spatial probabilities of burning as a function of extant fuels treatments for any wildland fire-affected landscape. We examined the landscape effects of more than 72 000 ha of wildland fuel treatments involved in 14 large wildfires that burned 314 000 ha of forests in nine US states between 2002 and 2010. Fuels treatments altered the probability of fire occurrence both positively and negatively across landscapes, effectively redistributing fire risk by changing surface fire spread rates and reducing the likelihood of crowning behaviour. Trade-offs are created between the formation of large areas with low probabilities of increased burning and smaller, well-defined regions with reduced fire risk.

  6. Estimation of the Transportation Risks for the Spent Fuel in Korea for Various Transportation Scenarios

    SciTech Connect

    Jongtae, Jeong; Cho, D.K.; Choi, H.J.; Choi, J.W.

    2008-07-01

    According to the long-term management strategy for spent fuels in Korea, they will be transported from the spent fuel pools at each nuclear power plant to the central interim storage facility (CISF), which is to start operation in 2016. Therefore, we have to determine safe and economical logistics for the transportation of these spent fuels by considering their transportation risks and costs. In this study, we developed four transportation scenarios by considering the type of transportation casks and the transport means, in order to suggest safe and economical transportation logistics for spent fuels in Korea. We also estimated and compared the transportation risks for these four scenarios. From the results of this study, we found that all four transportation scenarios have a very low radiological risk, with manageable safety and health consequences. The results of this study, together with the transportation costs for the four scenarios, can be used as basic data for the development of safe and economical logistics for the transportation of spent fuels in Korea in the near future. (authors)

  7. Identification and estimation of socioeconomic impacts resulting from perceived risks and changing images; An annotated bibliography

    SciTech Connect

    Nieves, L.A.; Wernette, D.R.; Hemphill, R.C.; Mohiudden, S.; Corso, J.

    1990-02-01

    In 1982, the US Congress passed the Nuclear Waste Policy Act to initiate the process of choosing a location to permanently store high-level nuclear waste; subsequent amendments designated Yucca Mountain, Nevada, as the only location to be studied as a candidate site for such a repository. The original act and its amendments established the grant mechanism by which the state of Nevada could finance an investigation of the potential socioeconomic impacts that could result from the installation and operation of this facility. Over the past three years, the Office of Civilian Radioactive Waste Management (OCRWM or RW) in the US Department of Energy (DOE) has approved grant requests by Nevada to perform this investigation. This report is intended to update and enhance a literature review conducted by the Human Affairs Research Center (HARC) for the Basalt Waste Isolation Project that dealt with the psychological and sociological processes underlying risk perception. It provides additional information on the HARC work, covers a subsequent step in the impact-estimation process, and translates risk perception into decisions and behaviors with economic consequences. It also covers recently developed techniques for assessing the nature and magnitude of impacts caused by environmental changes, focusing on those impacts caused by changes in perceived risks.

  8. Towards a US national estimate of the risk of endemic waterborne disease--sero-epidemiologic studies.

    PubMed

    Casemore, David

    2006-01-01

    Worldwide literature on serological methods and sero-surveys on waterborne pathogens has been reviewed. Outbreak investigation and research reports have also been examined to aid understanding of the serological response and transmission dynamics. The aim was to seek an estimate of seroprevalence and to determine if this could inform the US national estimate of risk for endemic waterborne infection associated with public water supplies. Antibody responses indicate infection, both symptomatic and asymptomatic, so probably give a truer indication of prevalence. Outbreak data can probably be regarded as the upper bound for seroprevalence estimations. Antibody is not necessarily protective per se but is a good indicator for at least partial resistance to symptomatic infection; absence of antibody will normally imply susceptibility. Pathogens transmitted by water are commonly transmitted by other routes. However, the fact that other transmission routes are more common does not detract from the potential protective effect of immunity when waterborne transmission occurs. Data indicate that seroprevalence varies widely, reflecting geographic, social and hygiene factors, but is generally greater where surface water sources are used rather than groundwater. Areas of low seroprevalence may expect a high attack rate in the event of contamination of their water supply.

  9. Fast and Accurate Construction of Confidence Intervals for Heritability.

    PubMed

    Schweiger, Regev; Kaufman, Shachar; Laaksonen, Reijo; Kleber, Marcus E; März, Winfried; Eskin, Eleazar; Rosset, Saharon; Halperin, Eran

    2016-06-01

    Estimation of heritability is fundamental in genetic studies. Recently, heritability estimation using linear mixed models (LMMs) has gained popularity because these estimates can be obtained from unrelated individuals collected in genome-wide association studies. Typically, heritability estimation under LMMs uses the restricted maximum likelihood (REML) approach. Existing methods for the construction of confidence intervals and estimators of SEs for REML rely on asymptotic properties. However, these assumptions are often violated because of the bounded parameter space, statistical dependencies, and limited sample size, leading to biased estimates and inflated or deflated confidence intervals. Here, we show that the estimation of confidence intervals by state-of-the-art methods is inaccurate, especially when the true heritability is relatively low or relatively high. We further show that these inaccuracies occur in datasets including thousands of individuals. Such biases are present, for example, in estimates of heritability of gene expression in the Genotype-Tissue Expression project and of lipid profiles in the Ludwigshafen Risk and Cardiovascular Health study. We also show that often the probability that the genetic component is estimated as 0 is high even when the true heritability is bounded away from 0, emphasizing the need for accurate confidence intervals. We propose a computationally efficient method, ALBI (accurate LMM-based heritability bootstrap confidence intervals), for estimating the distribution of the heritability estimator and for constructing accurate confidence intervals. Our method can be used as an add-on to existing methods for estimating heritability and variance components, such as GCTA, FaST-LMM, GEMMA, or EMMAX. PMID:27259052

  10. Use of health effect risk estimates and uncertainty in formal regulatory proceedings: a case study involving atmospheric particulates

    SciTech Connect

    Habegger, L.J.; Oezkaynak, A.H.

    1984-01-01

    Coal combustion particulates are released to the atmosphere by power plants supplying electrical energy to the nuclear fuel cycle. This paper presents estimates of the public health risks associated with the release of these particulates at a rate corresponding to the annual nuclear fuel production requirements of a nuclear power plant. Utilization of these risk assessments as a new component in the formal evaluation of total risks from nuclear power plants is discussed. 23 references, 3 tables.

  11. Estimating Hurricane Rates in a Changing Climate

    NASA Astrophysics Data System (ADS)

    Coughlin, K.; Laepple, T.; Rowlands, D.; Jewson, S.; Bellone, E.

    2009-04-01

    The estimation of hurricane risk is of life-and-death importance for millions of people living on the western coast of the Atlantic. Risk Management Solutions (RMS) provides products and services for the quantification and management of many catastrophe risks, including the risks associated with Atlantic hurricanes. Of particular importance in the modeling of hurricane risk is the estimation of future hurricane rates. We are interested in making accurate estimates, over the next 5 years, of the underlying rates of Atlantic hurricanes that make landfall. This presentation discusses the methodology used in making these estimates. Specifically, we discuss the importance of estimating the changing environments, both local and global, that affect hurricane formation and development. Our methodology combines statistical modeling, physical insight and modeling, and expert opinion to provide RMS with accurate estimates of the underlying rate of landfalling hurricanes in the Atlantic.

  12. Estimated Reduction in Cancer Risk due to PAH Exposures If Source Control Measures during the 2008 Beijing Olympics Were Sustained

    PubMed Central

    Jia, Yuling; Stone, Dave; Wang, Wentao; Schrlau, Jill; Tao, Shu; Massey Simonich, Staci L.

    2011-01-01

    Background The 2008 Beijing Olympic Games provided a unique case study to investigate the effect of source control measures on the reduction in air pollution, and associated inhalation cancer risk, in a Chinese megacity. Objectives We measured 17 carcinogenic polycyclic aromatic hydrocarbons (PAHs) and estimated the lifetime excess inhalation cancer risk during different periods of the Beijing Olympic Games, to assess the effectiveness of source control measures in reducing PAH-induced inhalation cancer risks. Methods PAH concentrations were measured in samples of particulate matter ≤ 2.5 μm in aerodynamic diameter (PM2.5) collected during the Beijing Olympic Games, and the associated inhalation cancer risks were estimated using a point-estimate approach based on relative potency factors. Results We estimated the number of lifetime excess cancer cases due to exposure to the 17 carcinogenic PAHs [12 priority pollutant PAHs and five high-molecular-weight (302 Da) PAHs (MW 302 PAHs)] to range from 6.5 to 518 per million people for the source control period concentrations and from 12.2 to 964 per million people for the nonsource control period concentrations. This would correspond to a 46% reduction in estimated inhalation cancer risk due to source control measures, if these measures were sustained over time. Benzo[b]fluoranthene, dibenz[a,h]anthracene, benzo[a]pyrene, and dibenzo[a,l]pyrene were the most carcinogenic PAH species evaluated. Total excess inhalation cancer risk would be underestimated by 23% if we did not include the five MW 302 PAHs in the risk calculation. Conclusions Source control measures, such as those imposed during the 2008 Beijing Olympics, can significantly reduce the inhalation cancer risk associated with PAH exposure in Chinese megacities similar to Beijing. MW 302 PAHs are a significant contributor to the estimated overall inhalation cancer risk. PMID:21632310
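
    A minimal sketch of the relative-potency-factor point estimate described above: each PAH concentration is converted to a benzo[a]pyrene-equivalent, the equivalents are summed, and the total is multiplied by an inhalation unit risk. The concentrations, potency factors, and unit risk below are placeholder values for illustration, not the measured Beijing data or the factors used in the study.

      # Placeholder ambient concentrations (ng/m3) and relative potency factors (BaP = 1 by definition)
      conc_ng_m3 = {"benzo[a]pyrene": 5.0, "dibenz[a,h]anthracene": 1.0, "benzo[b]fluoranthene": 8.0}
      rpf = {"benzo[a]pyrene": 1.0, "dibenz[a,h]anthracene": 10.0, "benzo[b]fluoranthene": 0.8}
      unit_risk_per_ng_m3 = 1e-6          # placeholder lifetime inhalation unit risk per ng/m3 BaP-eq

      bap_eq = sum(conc_ng_m3[p] * rpf[p] for p in conc_ng_m3)       # BaP-equivalent concentration
      excess_cases_per_million = bap_eq * unit_risk_per_ng_m3 * 1e6
      print(f"BaP-eq = {bap_eq:.1f} ng/m3 -> {excess_cases_per_million:.1f} excess cases per million")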

  13. Strategy Guideline. Accurate Heating and Cooling Load Calculations

    SciTech Connect

    Burdick, Arlan

    2011-06-01

    This guide presents the key criteria required to create accurate heating and cooling load calculations and offers examples of the implications when inaccurate adjustments are applied to the HVAC design process. The guide shows, through realistic examples, how various defaults and arbitrary safety factors can lead to significant increases in the load estimate. Emphasis is placed on the risks incurred from inaccurate adjustments or ignoring critical inputs of the load calculation.

  15. Dementia risk estimates associated with measures of depression: a systematic review and meta-analysis

    PubMed Central

    Anstey, Kaarin J

    2015-01-01

    Objectives To perform a systematic review of reported HRs for all-cause dementia, Alzheimer's disease (AD) and vascular dementia (VaD) associated with late-life depression and with depressive symptomatology on specific screening instruments at specific thresholds. Design Meta-analysis with meta-regression. Setting and participants PubMed, PsycInfo, and Cochrane databases were searched through 28 February 2014. Articles reporting HRs for incident all-cause dementia, AD and VaD based on published clinical criteria using validated measures of clinical depression or symptomatology from prospective studies of general populations of adults were selected by consensus among multiple reviewers. Studies that did not use clinical dementia diagnoses or validated instruments for the assessment of depression were excluded. Data were extracted by two reviewers and reviewed by two other independent reviewers. The most specific analyses possible, using continuous symptomatology ratings and categorical measures of clinical depression and focusing on single instruments with defined reported cut-offs, were conducted. Primary outcome measures HRs for all-cause dementia, AD, and VaD were computed where possible for continuous depression scores, or for major depression assessed with single or comparable validated instruments. Results Searches yielded 121 301 articles, of which 36 (0.03%) were eligible. Included studies provided a combined sample size of 66 532 individuals including 6593 cases of dementia, 2797 cases of AD and 585 cases of VaD. The increased risk associated with depression did not significantly differ by type of dementia and ranged from 83% to 104% for diagnostic thresholds consistent with major depression. Risks associated with continuous depression symptomatology measures were consistent with those for clinical thresholds. Conclusions Late-life depression is consistently and similarly associated with a twofold increased risk of dementia. The precise risk estimates produced in this study for
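
    Pooled estimates of this kind are typically obtained with a random-effects model; the sketch below applies DerSimonian-Laird pooling to log hazard ratios recovered from study-level HRs and 95% CIs. The example HRs are made up for illustration and are not the studies included in this review.

      import numpy as np

      hr = np.array([1.9, 1.7, 2.3, 1.5])                     # toy study-level hazard ratios
      ci_low = np.array([1.2, 1.1, 1.4, 0.9])
      ci_high = np.array([3.0, 2.6, 3.8, 2.5])

      y = np.log(hr)                                          # effect sizes on the log scale
      se = (np.log(ci_high) - np.log(ci_low)) / (2 * 1.96)    # SEs recovered from 95% CIs
      w_fixed = 1 / se**2

      # DerSimonian-Laird between-study variance tau^2
      q = np.sum(w_fixed * (y - np.sum(w_fixed * y) / np.sum(w_fixed))**2)
      c = np.sum(w_fixed) - np.sum(w_fixed**2) / np.sum(w_fixed)
      tau2 = max(0.0, (q - (len(y) - 1)) / c)

      w = 1 / (se**2 + tau2)                                  # random-effects weights
      pooled = np.sum(w * y) / np.sum(w)
      pooled_se = np.sqrt(1 / np.sum(w))
      print("Pooled HR %.2f (95%% CI %.2f-%.2f)" %
            (np.exp(pooled), np.exp(pooled - 1.96 * pooled_se), np.exp(pooled + 1.96 * pooled_se)))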

  16. Improved risk estimates for carbon tetrachloride. Annual progress report, October 1, 1996--September 30, 1997

    SciTech Connect

    Benson, J.M.

    1997-10-27

    Carbon tetrachloride (CCl{sub 4}) has been used extensively within the Department of Energy (DOE) nuclear weapons facilities. Rocky Flats was formerly the largest volume user of CCl{sub 4} in the US, with 5,000 gallons used there in 1977 alone. At the Hanford site, several hundred thousand gallons of CCl{sub 4} were discharged between 1955 and 1973 into underground cribs for storage. Levels of CCl{sub 4} in groundwater at highly contaminated sites at the Hanford facility have exceeded the drinking water standard of 5 ppb by several orders of magnitude. High levels of CCl{sub 4} at these facilities represent a potential health hazard for workers conducting cleanup operations and for surrounding communities. The level of CCl{sub 4} cleanup required at these sites and the associated costs are driven by current human health risk estimates, which assume that CCl{sub 4} is a genotoxic carcinogen. The overall purpose of these studies is to improve the scientific basis for assessing the health risk associated with human exposure to CCl{sub 4}. Specifically, the authors will determine the toxicokinetics of inhaled and ingested CCl{sub 4} in F344/Crl rats, B6C3F1 mice, and Syrian hamsters. They will also evaluate species differences in the metabolism of CCl{sub 4} by rats, mice, hamsters, and man. Dose-response relationships will be determined in all these studies. This information will be used to improve the physiologically based pharmacokinetic (PBPK) model for CCl{sub 4} originally developed by Paustenbach et al. (1988) and more recently revised by Thrall and Kenny (1996). They will also provide scientific evidence that CCl{sub 4}, like chloroform, is a hepatocarcinogen only when exposure results in cell damage, cell killing, and regenerative cell proliferation. In combination, the studies outlined in this proposal will provide the exact types of information needed to enable refined cancer risk estimates for CCl{sub 4} under the new guidelines for risk assessment proposed by the

  17. Simulation-extrapolation method to address errors in atomic bomb survivor dosimetry on solid cancer and leukaemia mortality risk estimates, 1950-2003.

    PubMed

    Allodji, Rodrigue S; Schwartz, Boris; Diallo, Ibrahima; Agbovon, Césaire; Laurier, Dominique; de Vathaire, Florent

    2015-08-01

    Analyses of the Life Span Study (LSS) of Japanese atomic bombing survivors have routinely incorporated corrections for additive classical measurement errors using regression calibration. Recently, several studies reported that the simulation-extrapolation method (SIMEX) is slightly more accurate than the simple regression calibration method (RCAL). In the present paper, the SIMEX and RCAL methods have been used to address errors in atomic bomb survivor dosimetry on solid cancer and leukaemia mortality risk estimates. For instance, it is shown that using the SIMEX method, the ERR/Gy is increased by about 29% for all solid cancer deaths using a linear model compared to the RCAL method, and the corrected EAR 10(-4) person-years at 1 Gy (the linear term) is decreased by about 8%, while the corrected quadratic term (EAR 10(-4) person-years/Gy(2)) is increased by about 65% for leukaemia deaths based on a linear-quadratic model. The results with the SIMEX method are slightly higher than published values. The observed differences are probably due to the fact that with the RCAL method the dosimetric data were only partially corrected, while all doses were considered with the SIMEX method. Therefore, one should be careful when comparing the estimated risks, and it may be useful to use several correction techniques in order to obtain a range of corrected estimates, rather than to rely on a single technique. This work will help improve the risk estimates derived from LSS data and make the development of radiation protection standards more reliable. PMID:25894839
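
    The SIMEX idea can be illustrated on a toy linear dose-response with additive classical measurement error: extra simulated error is added at increasing levels lambda, the naive slope is re-estimated at each level, and the trend is extrapolated back to lambda = -1. The synthetic data and quadratic extrapolant below are illustrative assumptions; the LSS analyses use far richer dosimetry and risk models.

      import numpy as np

      rng = np.random.default_rng(2)
      n, beta_true, sigma_u = 2000, 0.5, 0.3
      x_true = rng.uniform(0, 2, n)                    # true dose (Gy)
      w = x_true + rng.normal(0, sigma_u, n)           # observed dose with classical error
      y = beta_true * x_true + rng.normal(0, 0.5, n)   # toy response (e.g. excess rate)

      def slope(x, y):
          return np.cov(x, y, bias=True)[0, 1] / np.var(x)

      lambdas = np.array([0.0, 0.5, 1.0, 1.5, 2.0])    # added-error levels
      b = 50                                           # remeasurement simulations per lambda
      naive = [np.mean([slope(w + np.sqrt(lam) * rng.normal(0, sigma_u, n), y)
                        for _ in range(b)]) for lam in lambdas]

      # Quadratic extrapolation of slope(lambda) back to lambda = -1 (no measurement error)
      coef = np.polyfit(lambdas, naive, 2)
      print("naive:", round(slope(w, y), 3), "SIMEX:", round(np.polyval(coef, -1.0), 3),
            "true:", beta_true)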

  19. Flood risk assessment in France: comparison of extreme flood estimation methods (EXTRAFLO project, Task 7)

    NASA Astrophysics Data System (ADS)

    Garavaglia, F.; Paquet, E.; Lang, M.; Renard, B.; Arnaud, P.; Aubert, Y.; Carre, J.

    2013-12-01

    In flood risk assessment, methods can be divided into two families: deterministic methods and probabilistic methods. In the French hydrological community, probabilistic methods have historically been preferred to deterministic ones. A French research project named EXTRAFLO (RiskNat Program of the French National Research Agency, https://extraflo.cemagref.fr) currently deals with design values for extreme rainfall and floods. The objective of this project is to compare the main methods used in France for estimating extreme values of rainfall and floods, in order to obtain a better grasp of their respective fields of application. In this framework we present the results of Task 7 of the EXTRAFLO project. Focusing on French watersheds, we compare the main extreme flood estimation methods used in the French context: (i) standard flood frequency analysis (Gumbel and GEV distributions), (ii) regional flood frequency analysis (regional Gumbel and GEV distributions), (iii) local and regional flood frequency analysis improved by historical information (Naulet et al., 2005), (iv) simplified probabilistic methods based on rainfall information (i.e. the Gradex method (CFGB, 1994), Agregee method (Margoum, 1992) and Speed method (Cayla, 1995)), (v) flood frequency analysis by a continuous simulation approach based on rainfall information (i.e. the Schadex method (Paquet et al., 2013, Garavaglia et al., 2010) and Shyreg method (Lavabre et al., 2003)) and (vi) a multifractal approach. The main result of this comparative study is that probabilistic methods based on additional information (i.e. regional, historical and rainfall information) provide better estimations than standard flood frequency analysis. Another interesting result is that the differences between the various extreme flood quantile estimations of the compared methods increase with return period, remaining relatively moderate up to 100-year return levels. Results and discussions are here illustrated throughout with the example
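
    As a pointer to family (i) above, the sketch below fits a Gumbel distribution to a synthetic series of annual maximum discharges and reads off T-year return levels. The annual maxima are simulated; none of the regional, historical, or rainfall-based refinements compared in the project are included.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(3)
      # Synthetic 60-year series of annual maximum discharges (m3/s)
      annual_max = stats.gumbel_r.rvs(loc=300, scale=80, size=60, random_state=rng)

      loc, scale = stats.gumbel_r.fit(annual_max)           # standard Gumbel frequency analysis
      for T in (10, 100, 1000):
          q = stats.gumbel_r.ppf(1 - 1 / T, loc=loc, scale=scale)
          print(f"{T:>4}-year flood: {q:,.0f} m3/s")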

  20. Combining Radiation Epidemiology With Molecular Biology-Changing From Health Risk Estimates to Therapeutic Intervention.

    PubMed

    Abend, Michael; Port, Matthias

    2016-08-01

    The authors herein summarize six presentations dedicated to the key session "molecular radiation epidemiology" of the ConRad meeting 2015. These presentations were chosen in order to highlight the promise when combining conventional radiation epidemiology with molecular biology. Conventional radiation epidemiology uses dose estimates for risk predictions on health. However, combined with molecular biology, dose-dependent bioindicators of effect hold the promise to improve clinical diagnostics and to provide target molecules for potential therapeutic intervention. One out of the six presentations exemplified the use of radiation-induced molecular changes as biomarkers of exposure by measuring stable chromosomal translocations. The remaining five presentations focused on molecular changes used as bioindicators of the effect. These bioindicators of the effect could be used for diagnostic purposes on colon cancers (genomic instability), thyroid cancer (CLIP2), or head and neck squamous cell cancers. Therapeutic implications of gene expression changes were examined in Chernobyl thyroid cancer victims and Mayak workers. PMID:27356062

  1. Schistosomiasis risk estimation in Minas Gerais State, Brazil, using environmental data and GIS techniques.

    PubMed

    Guimarães, Ricardo J P S; Freitas, Corina C; Dutra, Luciano V; Moura, Ana C M; Amaral, Ronaldo S; Drummond, Sandra C; Scholte, Ronaldo G C; Carvalho, Omar S

    2008-01-01

    The influence of climate and environmental variables on the distribution of schistosomiasis has been assessed in several previous studies. Geographical Information Systems (GIS) have also recently been tested as a tool for better understanding spatial disease distribution. The objective of this paper is to further develop GIS technology for modeling and control of schistosomiasis using meteorological and social variables and introducing new potential environment-related variables, particularly those produced by recently launched orbital sensors such as the Moderate Resolution Imaging Spectroradiometer (MODIS) and the Shuttle Radar Topography Mission (SRTM). Three different scenarios were analyzed, and despite a relatively low coefficient of determination, the standard deviation of the risk estimates was considered adequate for public health needs. The main variables selected as important for modeling purposes were topographic elevation, summer minimum temperature, the NDVI vegetation index, and the social index HDI91. PMID:18692017

  3. Improved Radiation Dosimetry/Risk Estimates to Facilitate Environmental Management of Plutonium-Contaminated Sites

    SciTech Connect

    Scott, Bobby R.; Tokarskaya, Zoya B.; Zhuntova, Galina V.; Osovets, Sergey V.; Syrchikov, Victor A.; Belyaeva, Zinaida D.

    2007-12-14

    This report summarizes 4 years of research achievements in this Office of Science (BER), U.S. Department of Energy (DOE) project. The research described was conducted by scientists and supporting staff at the Lovelace Respiratory Research Institute (LRRI)/Lovelace Biomedical and Environmental Research Institute (LBERI) and the Southern Urals Biophysics Institute (SUBI). All project objectives and goals were achieved. A major focus was on obtaining improved cancer risk estimates for exposure via inhalation of plutonium (Pu) isotopes in the workplace (DOE radiation workers) and the environment (public exposures to Pu-contaminated soil). A major finding was that low doses and dose rates of gamma rays can significantly suppress cancer induction by alpha radiation from inhaled Pu isotopes. The suppression relates to stimulation of the body's natural defenses, including immunity against cancer cells and selective apoptosis, which removes precancerous and other aberrant cells.

  4. Bayesian Risk Mapping and Model-Based Estimation of Schistosoma haematobium–Schistosoma mansoni Co-distribution in Côte d'Ivoire

    PubMed Central

    Chammartin, Frédérique; Houngbedji, Clarisse A.; Hürlimann, Eveline; Yapi, Richard B.; Silué, Kigbafori D.; Soro, Gotianwa; Kouamé, Ferdinand N.; N'Goran, Eliézer K.; Utzinger, Jürg; Raso, Giovanna; Vounatsou, Penelope

    2014-01-01

    Background: Schistosoma haematobium and Schistosoma mansoni are blood flukes that cause urogenital and intestinal schistosomiasis, respectively. In Côte d'Ivoire, both species are endemic and control efforts are being scaled up. Accurate knowledge of the geographical distribution, including delineation of high-risk areas, is a central feature for spatial targeting of interventions. Thus far, model-based predictive risk mapping of schistosomiasis has relied on historical data for separate parasite species. Methodology: We analyzed data pertaining to Schistosoma infection among school-aged children obtained from a national, cross-sectional survey conducted between November 2011 and February 2012. More than 5,000 children in 92 schools across Côte d'Ivoire participated. Bayesian geostatistical multinomial models were developed to assess infection risk, including S. haematobium–S. mansoni co-infection. The predicted risk of schistosomiasis was utilized to estimate the number of children that need preventive chemotherapy with praziquantel according to World Health Organization guidelines. Principal Findings: We estimated that 8.9% of school-aged children in Côte d'Ivoire are affected by schistosomiasis; 5.3% with S. haematobium and 3.8% with S. mansoni. Approximately 2 million annualized praziquantel treatments would be required for preventive chemotherapy at the health district level. The distinct spatial patterns of S. haematobium and S. mansoni imply that co-infection is of little importance across the country. Conclusions/Significance: We provide a comprehensive analysis of the spatial distribution of schistosomiasis risk among school-aged children in Côte d'Ivoire and a strong empirical basis for rational targeting of control interventions. PMID:25522007

  5. Effect of water resource development and management on lymphatic filariasis, and estimates of populations at risk.

    PubMed

    Erlanger, Tobias E; Keiser, Jennifer; Caldas De Castro, Marcia; Bos, Robert; Singer, Burton H; Tanner, Marcel; Utzinger, Jürg

    2005-09-01

    Lymphatic filariasis (LF) is a debilitating disease overwhelmingly caused by Wuchereria bancrofti, which is transmitted by various mosquito species. Here, we present a systematic literature review with the following objectives: (i) to establish global and regional estimates of populations at risk of LF, with particular consideration of water resource development projects, and (ii) to assess the effects of water resource development and management on the frequency and transmission dynamics of the disease. We estimate that, globally, 2 billion people are at risk of LF. Among them are 394.5 million urban dwellers without access to improved sanitation and 213 million rural dwellers living in close proximity to irrigation. Environmental changes due to water resource development and management consistently led to a shift in vector species composition and generally to a strong proliferation of vector populations. For example, in World Health Organization (WHO) subregions 1 and 2, mosquito densities of the Anopheles gambiae complex and Anopheles funestus were up to 25-fold higher in irrigated areas than in irrigation-free sites. Although the infection prevalence of LF often increased after the implementation of a water project, there was no clear association with clinical symptoms. In conclusion, there is a need to assess and quantify changes in LF transmission parameters and clinical manifestations over the entire course of water resource developments. Where resources allow, integrated vector management should complement mass drug administration, and broad-based monitoring and surveillance of the disease should become an integral part of large-scale waste management and sanitation programs, whose basic rationale lies in a systemic approach to city-, district-, and regional-level health services and disease prevention.

  6. Estimate of the secondary cancer risk from megavoltage CT in tomotherapy

    NASA Astrophysics Data System (ADS)

    Kim, Dong Wook; Chung, Weon Kuu; Ahn, Sung Hwan; Yoon, Myonggeun

    2013-04-01

    We have assessed the radiation-induced excess cancer risk to organs from megavoltage computed tomography (MVCT). MVCT was performed in coarse, normal and fine scanning modes. Using a glass dosimeter, we measured the primary and the secondary doses inside a homemade phantom and at various distances from the imaging center. The organ-specific excess absolute risk (EAR) for cancer induction was estimated using an organ-equivalent dose (OED) based on the measured imaging doses. The average primary doses inside the phantom for the coarse, normal and fine scanning modes were 0.78, 1.15 and 2.15 cGy, respectively. The average secondary dose per scan, measured 20 to 60 cm from the imaging center, ranged from 0.044 to 0.008 cGy. The EAR for major organs indicated that when 30 MVCT scans are performed to position each patient during the course of radiation treatment, organ-specific cancers may develop in as many as 6 per 10,000 persons per year.
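    To make the dose-to-risk step concrete, the sketch below converts a per-scan dose and a number of scans into an organ-equivalent dose (OED) and an excess absolute risk (EAR). It assumes a linear dose-response (so the OED reduces to the mean organ dose) and uses a placeholder risk coefficient; neither assumption is taken from the study, whose fitted organ-specific values are not reproduced here.

```python
# Illustrative OED/EAR calculation for repeated MVCT imaging.  The per-scan
# doses echo the ranges quoted in the abstract; the EAR coefficient is a
# placeholder, not an organ-specific value from the paper.
import numpy as np

dose_per_scan_cgy = {        # average measured dose per scan (cGy), normal mode
    "in_field_organ": 1.15,  # inside the imaged volume
    "organ_20cm": 0.044,     # about 20 cm from the imaging centre
    "organ_60cm": 0.008,     # about 60 cm from the imaging centre
}
n_scans = 30                 # one MVCT per fraction over a 30-fraction course

def oed_linear(voxel_doses_gy):
    """For low imaging doses a linear response is often assumed,
    in which case the organ-equivalent dose is the mean organ dose."""
    return float(np.mean(voxel_doses_gy))

# Hypothetical EAR coefficient: excess cases per 10,000 persons per year per Gy.
ear_coeff = 10.0

for organ, d_cgy in dose_per_scan_cgy.items():
    total_dose_gy = d_cgy * n_scans / 100.0          # cGy -> Gy, summed over scans
    oed = oed_linear(np.full(1000, total_dose_gy))   # toy organ with uniform dose
    ear = ear_coeff * oed
    print(f"{organ}: cumulative dose {total_dose_gy:.3f} Gy, "
          f"EAR {ear:.2f} per 10,000 persons per year")
```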

  7. Regression models and risk estimation for mixed discrete and continuous outcomes in developmental toxicology.

    PubMed

    Regan, M M; Catalano, P J

    2000-06-01

    Multivariate dose-response models have recently been proposed for developmental toxicity data to simultaneously model malformation incidence (a binary outcome) and reductions in fetal weight (a continuous outcome). In this and other applications, the binary outcome often represents a dichotomization of another outcome or a composite of outcomes, which facilitates analysis. For example, in Segment II developmental toxicology studies, multiple malformation types (i.e., external, visceral, skeletal) are evaluated on each fetus; malformation status may also be measured ordinally (e.g., normal, signs of variation, full malformation). A model is proposed for fetal weight and multiple malformation variables measured on an ordinal scale, in which the correlations between the outcomes and between the offspring within a litter are taken into account. Fully specifying the joint distribution of outcomes within a litter is avoided by specifying only the distribution of the multivariate outcome for each fetus and using generalized estimating equation methodology to account for correlations due to litter clustering. The correlations between the outcomes are required to characterize joint risk to the fetus and are therefore a focus of inference. Dose-response models and their application to quantitative risk assessment are illustrated using data from a recent developmental toxicology experiment of ethylene oxide in mice. PMID:10949415
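    The key modelling idea is to avoid the full joint within-litter distribution and rely on generalized estimating equations with a working correlation for litter clustering. The sketch below illustrates only that clustering idea, using a single binary malformation endpoint and simulated data; the paper's actual model is multivariate (ordinal malformations plus fetal weight), which this sketch does not attempt to reproduce.

```python
# Simplified GEE illustration: one binary malformation outcome per fetus,
# litter as the clustering unit, simulated data only.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
dose_levels = [0.0, 0.5, 1.0, 2.0]
rows = []
for litter in range(40):
    dose = dose_levels[litter % len(dose_levels)]
    litter_effect = rng.normal(0, 0.7)           # shared within-litter frailty
    for _ in range(int(rng.integers(8, 14))):    # litter size of 8-13 fetuses
        p = 1 / (1 + np.exp(-(-2.0 + 1.2 * dose + litter_effect)))
        rows.append({"litter": litter, "dose": dose,
                     "malformed": rng.binomial(1, p)})
df = pd.DataFrame(rows)

# An exchangeable working correlation accounts for litter clustering without
# specifying the joint distribution of fetuses within a litter.
gee = sm.GEE.from_formula(
    "malformed ~ dose", groups="litter", data=df,
    family=sm.families.Binomial(),
    cov_struct=sm.cov_struct.Exchangeable(),
)
print(gee.fit().summary())
```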

  8. Be rich or don't be sick: estimating Vietnamese patients' risk of falling into destitution.

    PubMed

    Vuong, Quan Hoang

    2015-01-01

    This paper represents the first research attempt to estimate the probabilities of Vietnamese patients falling into destitution due to financial burdens incurred during a curative hospital stay. The study models risk against factors such as level of insurance coverage, residency status of the patient, and cost of treatment, among others. The results show that very high probabilities of destitution, approximately 70%, apply to a large group of patients who are non-residents, poor, and ineligible for significant insurance coverage. There is also a probability of 58% that seriously ill low-income patients who face higher health care costs would quit their treatment. These facts put to a serious test the Vietnamese government's ambitious plan of increasing both universal coverage (UC) to 100% of expenditure and the rate of UC beneficiaries to 100%. The study also raises issues of asymmetric information and alternative financing options for the poor, who are most exposed to the risk of destitution following market-based health care reforms. PMID:26413435
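    The risk model described (probability of destitution as a function of insurance coverage, residency status, and treatment cost) can be pictured as a logistic regression. The sketch below is a minimal stand-in with hypothetical predictors and simulated data; it is not the paper's specification or its estimates.

```python
# Hypothetical logistic-regression sketch of destitution risk; variable
# names and data are invented for illustration.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 1000
df = pd.DataFrame({
    "insured": rng.binomial(1, 0.6, n),   # eligible for significant coverage
    "resident": rng.binomial(1, 0.7, n),  # resident vs. non-resident patient
    "log_cost": rng.normal(3.0, 1.0, n),  # log of treatment cost
})
eta = 1.0 - 1.5 * df["insured"] - 1.0 * df["resident"] + 0.8 * df["log_cost"]
df["destitute"] = rng.binomial(1, 1 / (1 + np.exp(-eta)))

X = sm.add_constant(df[["insured", "resident", "log_cost"]])
fit = sm.Logit(df["destitute"], X).fit(disp=0)

# Predicted probability for an uninsured, non-resident patient facing a
# high treatment cost -- the high-risk group highlighted in the abstract.
high_risk = pd.DataFrame({"const": [1.0], "insured": [0.0],
                          "resident": [0.0], "log_cost": [4.5]})
print(fit.predict(high_risk[X.columns]))
```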

  9. Update of identification and estimation of socioeconomic impacts resulting from perceived risks and changing images: An annotated bibliography

    SciTech Connect

    Nieves, L.A.; Clark, D.E.; Wernette, D.

    1991-08-01

    This annotated bibliography reviews selected literature published through August 1991 on the identification of perceived risks and methods for estimating the economic impacts of risk perception. It updates the literature review found in Argonne National Laboratory report ANL/EAIS/TM-24 (February 1990). Included in this update are (1) a literature review of the risk perception process, of the relationship between risk perception and economic impacts, of economic methods and empirical applications, and of interregional market interactions and adjustments; (2) a working bibliography (which includes the documents abstracted in the 1990 report); (3) a topical index to the abstracts found in both reports; and (4) abstracts of selected articles found in this update.

  10. Measurement of total polycyclic aromatic hydrocarbon concentrations in sediments and toxic units used for estimating risk to benthic invertebrates at manufactured gas plant sites.

    PubMed

    Hawthorne, Steven B; Miller, David J; Kreitinger, Joseph P

    2006-01-01

    The U.S. Environmental Protection Agency's (U.S. EPA) narcosis model requires the measurement of 18 parent and 16 groups of alkyl polycyclic aromatic hydrocarbons (PAHs) (so-called 34 PAHs) in sediments to calculate the number of PAH toxic units (TU) available to benthic organisms. If data for the 34 PAHs are not available, the U.S. EPA proposes estimating the risk by multiplying the TU for 13 parent PAHs by 11.5 (95% confidence interval) based on data from 488 sediments. This estimate is overly conservative for PAHs from pyrogenic manufactured gas plant (MGP) processes based on the analysis of 45 sediments from six sites. Parent PAHs contributed approximately 40% of the total concentrations and TU for MGP sediments. In contrast, parent PAHs from diesel fuel and petroleum crude oil contributed only 2 and 1%, respectively, of the PAH concentrations and TU, compared to approximately 98 to 99% contributed by the alkyl PAHs. Statistical comparison of the TU based on the measured 34 alkyl and parent PAHs and those based on only 13 parent PAHs demonstrated that a factor of 4.2 (rather than 11.5) is sufficient to estimate total TU within a 95% confidence level for MGP sites. Similarly, measurement of parent PAHs is sufficient to accurately estimate the total 34 alkyl and parent PAH concentrations for MGP-impacted sediments. PMID:16494254
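    The toxic-unit arithmetic behind the abstract can be written down directly: each PAH contributes its concentration divided by its final chronic value, and when only the 13 parent PAHs are measured the parent-only sum is scaled up to approximate the full 34-PAH total. In the sketch below the concentrations and chronic values are placeholders (the real model works with organic-carbon-normalized values); only the scaling factors 4.2 and 11.5 come from the abstract.

```python
# Toxic-unit (TU) sketch: TU_i = C_i / FCV_i, summed over measured PAHs.
# Concentrations and final chronic values (FCVs) below are placeholders.
parent_pah_conc = {"naphthalene": 1.2, "phenanthrene": 3.4, "pyrene": 2.8}
fcv = {"naphthalene": 0.69, "phenanthrene": 0.60, "pyrene": 0.70}

tu_parent = sum(parent_pah_conc[p] / fcv[p] for p in parent_pah_conc)

# Scale the parent-only TU to estimate the 34-PAH total: 4.2 for pyrogenic
# MGP sediments (per the abstract) versus the generic factor of 11.5.
print(f"parent-only TU:                   {tu_parent:.2f}")
print(f"total TU, MGP factor (x4.2):      {4.2 * tu_parent:.2f}")
print(f"total TU, generic factor (x11.5): {11.5 * tu_parent:.2f}")
```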

  11. Estimating the pollution risk of cadmium in soil using a composite soil environmental quality standard.

    PubMed

    Qu, Mingkai; Li, Weidong; Zhang, Chuanrong; Huang, Biao; Zhao, Yongcun

    2014-01-01

    Estimating standard-exceeding probabilities of toxic metals in soil is crucial for environmental evaluation. Because soil pH and land use types have strong effects on the bioavailability of trace metals in soil, they were taken into account by some environmental protection agencies in making composite soil environmental quality standards (SEQSs) that contain multiple metal thresholds under different pH and land use conditions. This study proposed a method for estimating the standard-exceeding probability map of soil cadmium using a composite SEQS. The spatial variability and uncertainty of soil pH and site-specific land use type were incorporated through simulated realizations by sequential Gaussian simulation. A case study was conducted using a sample data set from a 150 km2 area in Wuhan City and the composite SEQS for cadmium, recently set by the State Environmental Protection Administration of China. The method may be useful for evaluating the pollution risks of trace metals in soil with composite SEQSs.
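    The final step of the method (turning simulated realizations into a standard-exceeding probability map) is easy to sketch. Generating the realizations by sequential Gaussian simulation requires a geostatistical package, so the sketch below simply uses random arrays in place of the simulated cadmium and pH fields; the thresholds are illustrative, not the values of the Chinese composite standard.

```python
# From simulated realizations to an exceedance-probability map.  The "cd"
# and "ph" arrays stand in for sequential-Gaussian-simulation output; the
# threshold rule is illustrative only.
import numpy as np

rng = np.random.default_rng(3)
n_real, nx, ny = 200, 50, 50
n_cells = nx * ny
cd = rng.lognormal(mean=-1.2, sigma=0.5, size=(n_real, n_cells))  # mg/kg
ph = rng.normal(6.5, 0.8, size=(n_real, n_cells))
land_use = rng.integers(0, 2, size=n_cells)   # 0 = farmland, 1 = other (fixed map)

def cd_threshold(ph_val, land):
    """Composite-standard style threshold: stricter on acid farmland."""
    if land == 0:
        return 0.30 if ph_val < 7.5 else 0.60
    return 1.00

# Cell-by-cell probability that Cd exceeds its pH- and land-use-specific
# threshold, estimated over the simulated realizations.
exceed = np.empty(n_cells)
for j in range(n_cells):
    thresholds = np.array([cd_threshold(ph[r, j], land_use[j])
                           for r in range(n_real)])
    exceed[j] = np.mean(cd[:, j] > thresholds)

prob_map = exceed.reshape(nx, ny)
print("mean standard-exceeding probability:", round(float(prob_map.mean()), 3))
```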

  12. Estimating the Pollution Risk of Cadmium in Soil Using a Composite Soil Environmental Quality Standard

    PubMed Central

    Huang, Biao; Zhao, Yongcun

    2014-01-01

    Estimating standard-exceeding probabilities of toxic metals in soil is crucial for environmental evaluation. Because soil pH and land use types have strong effects on the bioavailability of trace metals in soil, they were taken into account by some environmental protection agencies in making composite soil environmental quality standards (SEQSs) that contain multiple metal thresholds under different pH and land use conditions. This study proposed a method for estimating the standard-exceeding probability map of soil cadmium using a composite SEQS. The spatial variability and uncertainty of soil pH and site-specific land use type were incorporated through simulated realizations by sequential Gaussian simulation. A case study was conducted using a sample data set from a 150 km2 area in Wuhan City and the composite SEQS for cadmium, recently set by the State Environmental Protection Administration of China. The method may be useful for evaluating the pollution risks of trace metals in soil with composite SEQSs. PMID:24672364

  13. Value-at-risk estimation with wavelet-based extreme value theory: Evidence from emerging markets

    NASA Astrophysics Data System (ADS)

    Cifter, Atilla

    2011-06-01

    This paper introduces wavelet-based extreme value theory (EVT) for univariate value-at-risk estimation. Wavelets and EVT are combined for volatility forecasting to estimate a hybrid model. In the first stage, wavelets are used as a threshold in the generalized Pareto distribution, and in the second stage, EVT is applied with the wavelet-based threshold. This new model is applied to two major emerging stock markets: the Istanbul Stock Exchange (ISE) and the Budapest Stock Exchange (BUX). The relative performance of wavelet-based EVT is benchmarked against the Riskmetrics-EWMA, ARMA-GARCH, generalized Pareto distribution, and conditional generalized Pareto distribution models. The empirical results show that wavelet-based extreme value theory increases the predictive performance of financial forecasting according to the number of violations and tail-loss tests. The superior forecasting performance of the wavelet-based EVT model is also consistent with Basel II requirements, and the new model can be used by financial institutions as well.
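    A compressed sketch of the peaks-over-threshold idea with a wavelet-derived threshold is given below. The rule used to set the threshold (a multiple of the dispersion of the finest-scale detail coefficients) and the synthetic loss series are simplifying assumptions, not the paper's hybrid specification; the GPD fit and the value-at-risk formula are the standard EVT steps.

```python
# Wavelet-assisted peaks-over-threshold VaR sketch on synthetic losses.
import numpy as np
import pywt                      # PyWavelets
from scipy.stats import genpareto

rng = np.random.default_rng(4)
losses = rng.standard_t(df=4, size=2500) * 0.01   # synthetic daily losses

# Wavelet decomposition; the finest-scale detail coefficients capture the
# high-frequency, volatile part of the series used here to set the threshold.
coeffs = pywt.wavedec(losses, "db4", level=4)
u = 3.0 * np.std(coeffs[-1])                      # assumed threshold rule

# Fit a generalized Pareto distribution to exceedances above u.
exceedances = losses[losses > u] - u
xi, _, beta = genpareto.fit(exceedances, floc=0)

# Standard peaks-over-threshold value-at-risk at confidence level q:
# VaR_q = u + (beta/xi) * ((n/N_u * (1 - q))**(-xi) - 1)
q = 0.99
n, n_u = len(losses), len(exceedances)
var_q = u + (beta / xi) * ((n / n_u * (1 - q)) ** (-xi) - 1)
print(f"threshold u = {u:.4f}, {q:.0%} one-day VaR = {var_q:.4f}")
```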

  14. Estimating the Influence of Oil and Gas Emissions on Urban Ozone and Associated Health Risks

    NASA Astrophysics Data System (ADS)

    Capps, S.; Nsanzineza, R.; Turner, M. D.; Henze, D. K.; Zhao, S.; Russell, M. G.; Hakami, A.; Milford, J. B.

    2015-12-01

    Tropospheric ozone (O3) degrades air quality, impacting human health and public welfare. The National Ambient Air Quality Standard (NAAQS) is designed to limit these impacts, but certain areas in the continental U.S. exceed this standard. Mitigating O3 NAAQS exceedances by designing emissions controls can be complicated in urban areas because of the long-range transport of ozone and its gaseous precursors as well as the complex mix of local emissions sources. Recent growth of unconventional oil and gas development near urban areas in Colorado, Texas, and the northeastern corridor has exacerbated this problem. To estimate the contribution of emissions from oil and gas development to urban O3 issues, we apply the CMAQ adjoint, which efficiently elucidates the relative influence of emissions sources on select concentration-based metrics. Specifically, the adjoint is used to calculate the spatially-specific relative contributions of emissions of oxides of nitrogen (NOx) and volatile organic compounds (VOCs) throughout the continental U.S. to O3 NAAQS exceedances and to ozone-related health risks in select urban areas. By evaluating these influences for different urban areas, including one in California that has been managing air quality with adjacent oil and gas development for a longer period of time, we are able to compare and contrast the emissions control strategies that may be more effective in particular regions. Additionally, the resulting relationships between emissions and concentrations provide a way to project ozone impacts when measurements provide refined estimates of emissions from this sector.

  15. Estimating functional connectivity of wildlife habitat and its relevance to ecological risk assessment

    USGS Publications Warehouse

    Johnson, A.R.; Allen, C.R.; Simpson, K.A.N.

    2004-01-01

    Habitat fragmentation is a major threat to the viability of wildlife populations and the maintenance of biodiversity. Fragmentation relates to the sub-division of habitat into disjunct patches. Usually coincident with fragmentation per se is loss of habitat, a reduction in the size of the remnant pat