Science.gov

Sample records for accurate risk estimates

  1. Estimating risk.

    PubMed

    2016-07-01

    A free mobile phone app has been launched providing nurses and other hospital clinicians with a simple way to identify high-risk surgical patients. The app is a phone version of the Surgical Outcome Risk Tool (SORT), originally developed for online use with computers by researchers from the National Confidential Enquiry into Patient Outcome and Death and the University College London Hospital Surgical Outcomes Research Centre. SORT uses information about patients' health and planned surgical procedures to estimate the risk of death within 30 days of an operation. The percentages are only estimates, taking into account the general risks of the procedures and some information about patients, and should not be confused with patient-specific estimates in individual cases. PMID:27369709

  2. Accurate pose estimation for forensic identification

    NASA Astrophysics Data System (ADS)

    Merckx, Gert; Hermans, Jeroen; Vandermeulen, Dirk

    2010-04-01

    In forensic authentication, one aims to identify the perpetrator among a series of suspects or distractors. A fundamental problem in any recognition system that aims for identification of subjects in a natural scene is the lack of constraints on viewing and imaging conditions. In forensic applications, identification proves even more challenging, since most surveillance footage is of abysmal quality. In this context, robust methods for pose estimation are paramount. In this paper we therefore present a new pose estimation strategy for very low quality footage. Our approach uses 3D-2D registration of a textured 3D face model with the surveillance image to obtain accurate far-field pose alignment. Starting from an inaccurate initial estimate, the technique uses novel similarity measures based on the monogenic signal to guide a pose optimization process. We illustrate the descriptive strength of the introduced similarity measures by using them directly as a recognition metric. Through validation, using both real and synthetic surveillance footage, our pose estimation method is shown to be accurate, and robust to lighting changes and image degradation.

  3. Radiation risk estimation models

    SciTech Connect

    Hoel, D.G.

    1987-11-01

    Cancer risk models and their relationship to ionizing radiation are discussed. There are many model assumptions and risk factors that have a large quantitative impact on the cancer risk estimates. Other health end points such as mental retardation may be an even more serious risk than cancer for those with in utero exposures. 8 references.

  4. 31 CFR 205.24 - How are accurate estimates maintained?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 31 Money and Finance: Treasury 2 2010-07-01 2010-07-01 false How are accurate estimates maintained... Treasury-State Agreement § 205.24 How are accurate estimates maintained? (a) If a State has knowledge that an estimate does not reasonably correspond to the State's cash needs for a Federal assistance...

  5. Accurate Biomass Estimation via Bayesian Adaptive Sampling

    NASA Technical Reports Server (NTRS)

    Wheeler, Kevin R.; Knuth, Kevin H.; Castle, Joseph P.; Lvov, Nikolay

    2005-01-01

    The following concepts were introduced: (a) Bayesian adaptive sampling for solving biomass estimation; (b) characterization of MISR Rahman model parameters conditioned upon MODIS landcover; (c) a rigorous non-parametric Bayesian approach to analytic mixture model determination; and (d) a unique U.S. asset for science product validation and verification.

  6. Micromagnetometer calibration for accurate orientation estimation.

    PubMed

    Zhang, Zhi-Qiang; Yang, Guang-Zhong

    2015-02-01

    Micromagnetometers, together with inertial sensors, are widely used for attitude estimation in a wide variety of applications. However, appropriate sensor calibration, which is essential to the accuracy of attitude reconstruction, must be performed in advance. Thus far, many different magnetometer calibration methods have been proposed to compensate for errors such as scale, offset, and nonorthogonality, and to obviate magnetic errors due to soft and hard iron. However, in order to combine the magnetometer with an inertial sensor for attitude reconstruction, the alignment difference between the magnetometer and the axes of the inertial sensor must be determined as well. This paper proposes a practical means of sensor error correction by simultaneous consideration of sensor errors, magnetic errors, and alignment difference. We take the summation of the offset and hard iron error as the combined bias and then amalgamate the alignment difference and all the other errors into a transformation matrix. A two-step approach is presented to determine the combined bias and transformation matrix separately. In the first step, the combined bias is determined by finding the optimal ellipsoid that best fits the sensor readings. In the second step, the intrinsic relationships of the raw sensor readings are explored to estimate the transformation matrix as a homogeneous linear least-squares problem. Singular value decomposition is then applied to estimate both the transformation matrix and the magnetic vector. The proposed method is then applied to calibrate our sensor node. Although there is no ground truth for the combined bias and transformation matrix for our node, the consistency of calibration results among different trials and a less than 3° root-mean-square error for orientation estimation have been achieved, which illustrates the effectiveness of the proposed sensor calibration method for practical applications. PMID:25265625
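
    The first calibration step (fitting a closed surface to the raw readings to recover the combined bias) can be sketched as follows. This uses a simplified sphere fit rather than the paper's full ellipsoid fit, and `fit_sphere_bias` and the synthetic data are illustrative assumptions:

```python
import numpy as np

def fit_sphere_bias(readings):
    """Recover the combined bias as the centre of a sphere fitted to the
    magnetometer readings by linear least squares: |x - c|^2 = r^2 is
    rewritten as 2 x.c + (r^2 - |c|^2) = |x|^2."""
    X = np.asarray(readings, dtype=float)
    A = np.hstack([2.0 * X, np.ones((len(X), 1))])
    b = np.sum(X**2, axis=1)
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    center = sol[:3]
    radius = float(np.sqrt(sol[3] + center @ center))
    return center, radius

# Synthetic readings: a constant-magnitude field rotated over all
# directions, offset by a hypothetical combined bias.
rng = np.random.default_rng(0)
dirs = rng.normal(size=(500, 3))
dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
bias_true = np.array([0.2, -0.1, 0.05])
readings = 0.5 * dirs + bias_true
bias_est, field_est = fit_sphere_bias(readings)   # bias_est recovers bias_true
```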

  7. Quantifying Accurate Calorie Estimation Using the "Think Aloud" Method

    ERIC Educational Resources Information Center

    Holmstrup, Michael E.; Stearns-Bruening, Kay; Rozelle, Jeffrey

    2013-01-01

    Objective: Clients often have limited time in a nutrition education setting. An improved understanding of the strategies used to accurately estimate calories may help to identify areas of focused instruction to improve nutrition knowledge. Methods: A "Think Aloud" exercise was recorded during the estimation of calories in a standard dinner meal…

  8. Injury Risk Estimation Expertise

    PubMed Central

    Petushek, Erich J.; Ward, Paul; Cokely, Edward T.; Myer, Gregory D.

    2015-01-01

    Background: Simple observational assessment of movement is a potentially low-cost method for anterior cruciate ligament (ACL) injury screening and prevention. Although many individuals utilize some form of observational assessment of movement, there are currently no substantial data on group skill differences in observational screening of ACL injury risk. Purpose/Hypothesis: The purpose of this study was to compare various groups’ abilities to visually assess ACL injury risk as well as the associated strategies and ACL knowledge levels. The hypothesis was that sports medicine professionals would perform better than coaches and exercise science academics/students and that these subgroups would all perform better than parents and other general population members. Study Design: Cross-sectional study; Level of evidence, 3. Methods: A total of 428 individuals, including physicians, physical therapists, athletic trainers, strength and conditioning coaches, exercise science researchers/students, athletes, parents, and members of the general public participated in the study. Participants completed the ACL Injury Risk Estimation Quiz (ACL-IQ) and answered questions related to assessment strategy and ACL knowledge. Results: Strength and conditioning coaches, athletic trainers, physical therapists, and exercise science students exhibited consistently superior ACL injury risk estimation ability (+2 SD) as compared with sport coaches, parents of athletes, and members of the general public. The performance of a substantial number of individuals in the exercise sciences/sports medicines (approximately 40%) was similar to or exceeded clinical instrument-based biomechanical assessment methods (eg, ACL nomogram). Parents, sport coaches, and the general public had lower ACL-IQ, likely due to their lower ACL knowledge and to rating the importance of knee/thigh motion lower and weight and jump height higher. Conclusion: Substantial cross-professional/group differences in visual ACL

  9. Accurate parameter estimation for unbalanced three-phase system.

    PubMed

    Chen, Yuan; So, Hing Cheung

    2014-01-01

    Smart grid is an intelligent power generation and control console in modern electricity networks, where the unbalanced three-phase power system is the commonly used model. Here, parameter estimation for this system is addressed. After converting the three-phase waveforms into a pair of orthogonal signals via the αβ-transformation, the nonlinear least squares (NLS) estimator is developed for accurately finding the frequency, phase, and voltage parameters. The estimator is realized by the Newton-Raphson scheme, whose global convergence is studied in this paper. Computer simulations show that the mean square error performance of the NLS method can attain the Cramér-Rao lower bound. Moreover, our proposal provides more accurate frequency estimation when compared with the complex least mean square (CLMS) and augmented CLMS algorithms. PMID:25162056
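
    As a minimal illustration of the first processing stage (not the paper's NLS estimator itself), the αβ-transformation of a balanced three-phase set into a pair of orthogonal signals can be sketched as:

```python
import numpy as np

def clarke_transform(ia, ib, ic):
    """Amplitude-invariant Clarke (alpha-beta) transform mapping three-phase
    quantities into a pair of orthogonal signals."""
    alpha = (2.0 / 3.0) * (ia - 0.5 * ib - 0.5 * ic)
    beta = (ib - ic) / np.sqrt(3.0)
    return alpha, beta

# A balanced three-phase set maps to a rotating unit phasor (cos, sin),
# from which the instantaneous phase can be read off directly.
theta = np.linspace(0, 2 * np.pi, 100, endpoint=False)
ia = np.cos(theta)
ib = np.cos(theta - 2 * np.pi / 3)
ic = np.cos(theta + 2 * np.pi / 3)
alpha, beta = clarke_transform(ia, ib, ic)
phase = np.unwrap(np.arctan2(beta, alpha))   # recovered instantaneous phase
```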

  10. Accurate pose estimation using single marker single camera calibration system

    NASA Astrophysics Data System (ADS)

    Pati, Sarthak; Erat, Okan; Wang, Lejing; Weidert, Simon; Euler, Ekkehard; Navab, Nassir; Fallavollita, Pascal

    2013-03-01

    Visual marker based tracking is one of the most widely used tracking techniques in Augmented Reality (AR) applications. Generally, multiple square markers are needed to perform robust and accurate tracking. Various marker based methods for calibrating relative marker poses have already been proposed. However, the calibration accuracy of these methods relies on the order of the image sequence and pre-evaluation of pose-estimation errors, making the method offline. Several studies have shown that the accuracy of pose estimation for an individual square marker depends on camera distance and viewing angle. We propose a method to accurately model the error in the estimated pose and translation of a camera using a single marker via an online method based on the Scaled Unscented Transform (SUT). Thus, the pose of each marker can be estimated with highly accurate calibration results, independent of the order of the image sequence. This removes the need for having multiple markers and an offline estimation system to calculate camera pose in an AR application.
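
    The Scaled Unscented Transform at the heart of the method propagates a mean and covariance through a nonlinearity via weighted sigma points. A generic sketch (the parameter values and the linear sanity check are assumptions, not taken from the paper):

```python
import numpy as np

def scaled_unscented_transform(m, P, f, alpha=0.1, beta=2.0, kappa=0.0):
    """Propagate mean m and covariance P through a function f using the
    scaled unscented transform: 2n+1 weighted sigma points."""
    n = len(m)
    lam = alpha**2 * (n + kappa) - n
    L = np.linalg.cholesky((n + lam) * P)           # matrix square root
    sigma = np.vstack([m, m + L.T, m - L.T])        # 2n+1 sigma points
    Wm = np.full(2 * n + 1, 1.0 / (2 * (n + lam)))
    Wc = Wm.copy()
    Wm[0] = lam / (n + lam)
    Wc[0] = lam / (n + lam) + (1 - alpha**2 + beta)
    Y = np.array([f(s) for s in sigma])             # propagated points
    mean = Wm @ Y
    diff = Y - mean
    cov = (Wc[:, None] * diff).T @ diff
    return mean, cov

# Sanity check on a linear map, where the transform is exact.
A = np.array([[1.0, 0.5], [0.0, 2.0]])
m = np.array([1.0, -1.0])
P = np.array([[0.2, 0.05], [0.05, 0.1]])
mean, cov = scaled_unscented_transform(m, P, lambda x: A @ x)
```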

  11. An Accurate Link Correlation Estimator for Improving Wireless Protocol Performance

    PubMed Central

    Zhao, Zhiwei; Xu, Xianghua; Dong, Wei; Bu, Jiajun

    2015-01-01

    Wireless link correlation has shown significant impact on the performance of various sensor network protocols. Many works have been devoted to exploiting link correlation for protocol improvements. However, the effectiveness of these designs heavily relies on the accuracy of link correlation measurement. In this paper, we investigate state-of-the-art link correlation measurement and analyze the limitations of existing works. We then propose a novel lightweight and accurate link correlation estimation (LACE) approach based on the reasoning of link correlation formation. LACE combines both long-term and short-term link behaviors for link correlation estimation. We implement LACE as a stand-alone interface in TinyOS and incorporate it into both routing and flooding protocols. Simulation and testbed results show that LACE: (1) achieves more accurate and lightweight link correlation measurements than the state-of-the-art work; and (2) greatly improves the performance of protocols exploiting link correlation. PMID:25686314
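
    LACE itself blends long-term and short-term link behavior; as a minimal stand-in for a link correlation metric (an illustrative assumption, not the LACE algorithm), the correlation of two links' binary packet-reception traces can be computed as:

```python
import numpy as np

def link_correlation(trace_a, trace_b):
    """Pearson correlation of two binary packet-reception traces
    (1 = received, 0 = lost) recorded for the same broadcasts."""
    a = np.asarray(trace_a, dtype=float)
    b = np.asarray(trace_b, dtype=float)
    if a.std() == 0 or b.std() == 0:
        return 0.0          # a link that never varies carries no signal
    return float(np.corrcoef(a, b)[0, 1])

# Two links that tend to lose the same broadcasts correlate positively.
a = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]
b = [1, 1, 0, 1, 0, 1, 0, 0, 1, 1]
rho = link_correlation(a, b)
```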

  13. Fast and accurate estimation for astrophysical problems in large databases

    NASA Astrophysics Data System (ADS)

    Richards, Joseph W.

    2010-10-01

    A recent flood of astronomical data has created much demand for sophisticated statistical and machine learning tools that can rapidly draw accurate inferences from large databases of high-dimensional data. In this Ph.D. thesis, methods for statistical inference in such databases will be proposed, studied, and applied to real data. I use methods for low-dimensional parametrization of complex, high-dimensional data that are based on the notion of preserving the connectivity of data points in the context of a Markov random walk over the data set. I show how this simple parameterization of data can be exploited to: define appropriate prototypes for use in complex mixture models, determine data-driven eigenfunctions for accurate nonparametric regression, and find a set of suitable features to use in a statistical classifier. In this thesis, methods for each of these tasks are built up from simple principles, compared to existing methods in the literature, and applied to data from astronomical all-sky surveys. I examine several important problems in astrophysics, such as estimation of star formation history parameters for galaxies, prediction of redshifts of galaxies using photometric data, and classification of different types of supernovae based on their photometric light curves. Fast methods for high-dimensional data analysis are crucial in each of these problems because they all involve the analysis of complicated high-dimensional data in large, all-sky surveys. Specifically, I estimate the star formation history parameters for the nearly 800,000 galaxies in the Sloan Digital Sky Survey (SDSS) Data Release 7 spectroscopic catalog, determine redshifts for over 300,000 galaxies in the SDSS photometric catalog, and estimate the types of 20,000 supernovae as part of the Supernova Photometric Classification Challenge. Accurate predictions and classifications are imperative in each of these examples because these estimates are utilized in broader inference problems
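
    The Markov random walk parametrization described above is in the spirit of diffusion maps; a generic sketch (the kernel bandwidth and component count are illustrative choices, not the thesis's settings):

```python
import numpy as np

def diffusion_map(X, eps=1.0, n_components=2):
    """Low-dimensional parametrization of data via a Markov random walk:
    Gaussian affinities are row-normalized into a transition matrix whose
    leading non-trivial eigenvectors give the new coordinates."""
    D2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = np.exp(-D2 / eps)                     # Gaussian affinities
    P = W / W.sum(axis=1, keepdims=True)      # row-stochastic random walk
    vals, vecs = np.linalg.eig(P)
    order = np.argsort(-vals.real)
    idx = order[1:n_components + 1]           # skip trivial eigenvalue 1
    return vecs[:, idx].real * vals[idx].real

rng = np.random.default_rng(1)
X = rng.normal(size=(50, 5))
coords = diffusion_map(X)                     # 2D embedding of 50 points
```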

  14. Accurate estimation of sigma(exp 0) using AIRSAR data

    NASA Technical Reports Server (NTRS)

    Holecz, Francesco; Rignot, Eric

    1995-01-01

    During recent years signature analysis, classification, and modeling of Synthetic Aperture Radar (SAR) data as well as estimation of geophysical parameters from SAR data have received a great deal of interest. An important requirement for the quantitative use of SAR data is the accurate estimation of the backscattering coefficient sigma(exp 0). In terrain with relief variations radar signals are distorted due to the projection of the scene topography into the slant range-Doppler plane. The effect of these variations is to change the physical size of the scattering area, leading to errors in the radar backscatter values and incidence angle. For this reason the local incidence angle, derived from sensor position and Digital Elevation Model (DEM) data, must always be considered. Especially in the airborne case, the antenna gain pattern can be an additional source of radiometric error, because the radar look angle is not known precisely as a result of the aircraft motions and the local surface topography. Consequently, radiometric distortions due to the antenna gain pattern must also be corrected for each resolution cell, by taking into account aircraft displacements (position and attitude) and the position of the backscatter element, defined by the DEM data. In this paper, a method to derive an accurate estimation of the backscattering coefficient using NASA/JPL AIRSAR data is presented. The results are evaluated in terms of geometric accuracy, radiometric variations of sigma(exp 0), and precision of the estimated forest biomass.

  15. Accurate and robust estimation of camera parameters using RANSAC

    NASA Astrophysics Data System (ADS)

    Zhou, Fuqiang; Cui, Yi; Wang, Yexin; Liu, Liu; Gao, He

    2013-03-01

    Camera calibration plays an important role in the field of machine vision applications. The popularly used calibration approach based on a 2D planar target sometimes fails to give reliable and accurate results due to inaccurate or incorrect localization of feature points. To solve this problem, an accurate and robust estimation method for camera parameters based on the RANSAC algorithm is proposed to detect the unreliability and provide corresponding solutions. Through this method, most of the outliers are removed and the calibration errors that are the main factors influencing measurement accuracy are reduced. Both simulated and real experiments have been carried out to evaluate the performance of the proposed method, and the results show that the proposed method is robust under large-noise conditions and efficiently improves calibration accuracy compared with calibration without outlier removal.
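
    The RANSAC idea underlying the method (fit a model to random minimal samples, keep the hypothesis with the most inliers, then refit on those inliers) can be sketched on a toy line-fitting problem; the camera-calibration specifics are omitted:

```python
import numpy as np

def ransac_line(points, n_iter=200, tol=0.05, seed=0):
    """Generic RANSAC: fit a line to random minimal samples, keep the
    hypothesis with the most inliers, then refit on those inliers."""
    rng = np.random.default_rng(seed)
    pts = np.asarray(points, dtype=float)
    best = np.zeros(len(pts), dtype=bool)
    for _ in range(n_iter):
        i, j = rng.choice(len(pts), size=2, replace=False)
        p, q = pts[i], pts[j]
        d = q - p
        norm = np.linalg.norm(d)
        if norm == 0.0:
            continue
        dn = d / norm
        # Perpendicular distance of every point to the line through p and q.
        dist = np.abs(dn[0] * (pts[:, 1] - p[1]) - dn[1] * (pts[:, 0] - p[0]))
        inliers = dist < tol
        if inliers.sum() > best.sum():
            best = inliers
    slope, intercept = np.polyfit(pts[best, 0], pts[best, 1], 1)
    return slope, intercept, best

# 80 points near y = 2x + 1 plus 20 gross outliers.
rng = np.random.default_rng(42)
x = rng.uniform(0.0, 1.0, 80)
good = np.column_stack([x, 2.0 * x + 1.0 + rng.normal(0.0, 0.01, 80)])
bad = rng.uniform(0.0, 3.0, (20, 2))
slope, intercept, inliers = ransac_line(np.vstack([good, bad]))
```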

  16. Accurate Satellite-Derived Estimates of Tropospheric Ozone Radiative Forcing

    NASA Technical Reports Server (NTRS)

    Joiner, Joanna; Schoeberl, Mark R.; Vasilkov, Alexander P.; Oreopoulos, Lazaros; Platnick, Steven; Livesey, Nathaniel J.; Levelt, Pieternel F.

    2008-01-01

    Estimates of the radiative forcing due to anthropogenically-produced tropospheric O3 are derived primarily from models. Here, we use tropospheric ozone and cloud data from several instruments in the A-train constellation of satellites as well as information from the GEOS-5 Data Assimilation System to accurately estimate the instantaneous radiative forcing from tropospheric O3 for January and July 2005. We improve upon previous estimates of tropospheric ozone mixing ratios from a residual approach using the NASA Earth Observing System (EOS) Aura Ozone Monitoring Instrument (OMI) and Microwave Limb Sounder (MLS) by incorporating cloud pressure information from OMI. Since we cannot distinguish between natural and anthropogenic sources with the satellite data, our estimates reflect the total forcing due to tropospheric O3. We focus specifically on the magnitude and spatial structure of the cloud effect on both the short- and long-wave radiative forcing. The estimates presented here can be used to validate present-day O3 radiative forcing produced by models.

  17. Robust ODF smoothing for accurate estimation of fiber orientation.

    PubMed

    Beladi, Somaieh; Pathirana, Pubudu N; Brotchie, Peter

    2010-01-01

    Q-ball imaging was presented as a model-free, linear and multimodal diffusion-sensitive approach to reconstructing the diffusion orientation distribution function (ODF) from diffusion-weighted MRI data. ODFs are widely used to estimate fiber orientations. A smoothness constraint was proposed to achieve a balance between angular resolution and noise stability in ODF reconstruction, and different regularization methods have been proposed for this purpose. However, these methods are not robust and are quite sensitive to the global regularization parameter. Although numerical methods such as the L-curve test can define a globally appropriate regularization parameter, no single value is suitable for all regions of interest. This may result in over-smoothing and potentially in neglecting an existing fiber population. In this paper, we propose to include an interpolation step prior to the spherical harmonic decomposition. This interpolation step, based on Delaunay triangulation, provides a reliable, robust and accurate smoothing approach. The method is easy to implement and does not require other numerical methods to define the required parameters. The fiber orientations estimated using this approach are also more accurate than those obtained with other common approaches. PMID:21096202

  18. Accurate estimators of correlation functions in Fourier space

    NASA Astrophysics Data System (ADS)

    Sefusatti, E.; Crocce, M.; Scoccimarro, R.; Couchman, H. M. P.

    2016-08-01

    Efficient estimators of Fourier-space statistics for large numbers of objects rely on fast Fourier transforms (FFTs), which are affected by aliasing from unresolved small-scale modes due to the finite FFT grid. Aliasing takes the form of a sum over images, each of them corresponding to the Fourier content displaced by increasing multiples of the sampling frequency of the grid. These spurious contributions limit the accuracy in the estimation of Fourier-space statistics, and are typically ameliorated by simultaneously increasing grid size and discarding high-frequency modes. This results in inefficient estimates for, e.g., the power spectrum when the desired systematic biases are well below the per cent level. We show that using interlaced grids removes odd images, which include the dominant contribution to aliasing. In addition, we discuss the choice of interpolation kernel used to define density perturbations on the FFT grid and demonstrate that using higher-order interpolation kernels than the standard Cloud-In-Cell algorithm results in a significant reduction of the remaining images. We show that combining fourth-order interpolation with interlacing gives very accurate Fourier amplitudes and phases of density perturbations. This results in power spectrum and bispectrum estimates that have systematic biases below 0.01 per cent all the way to the Nyquist frequency of the grid, thus maximizing the use of unbiased Fourier coefficients for a given grid size and greatly reducing systematics for applications to large cosmological data sets.
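
    The interlacing idea can be sketched in one dimension: assign particles to two grids offset by half a cell, then average their Fourier modes with a compensating phase factor so that the odd aliasing images cancel. The sign and normalization conventions below are illustrative assumptions:

```python
import numpy as np

def cic_density(pos, ng, shift=0.0):
    """Cloud-In-Cell assignment of unit-mass particles (positions in [0, 1))
    onto a periodic 1D grid of ng cells, optionally shifted by half a cell."""
    rho = np.zeros(ng)
    x = (np.asarray(pos) * ng + shift) % ng
    i = np.floor(x).astype(int)
    f = x - i
    np.add.at(rho, i % ng, 1.0 - f)        # weight to the left cell
    np.add.at(rho, (i + 1) % ng, f)        # weight to the right cell
    return rho

def interlaced_modes(pos, ng):
    """Average the FFTs of the normal and half-cell-shifted grids; the
    phase factor exp(i*pi*k/ng) realigns the shifted grid, cancelling
    the odd aliasing images."""
    d0 = np.fft.rfft(cic_density(pos, ng))
    d1 = np.fft.rfft(cic_density(pos, ng, shift=0.5))
    k = np.arange(len(d0))
    return 0.5 * (d0 + d1 * np.exp(1j * np.pi * k / ng))

rng = np.random.default_rng(3)
pos = rng.uniform(0.0, 1.0, 1000)
dk = interlaced_modes(pos, 64)   # 33 non-negative frequency modes
```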

  19. Accurate Orientation Estimation Using AHRS under Conditions of Magnetic Distortion

    PubMed Central

    Yadav, Nagesh; Bleakley, Chris

    2014-01-01

    Low cost, compact attitude heading reference systems (AHRS) are now being used to track human body movements in indoor environments by estimation of the 3D orientation of body segments. In many of these systems, heading estimation is achieved by monitoring the strength of the Earth's magnetic field. However, the Earth's magnetic field can be locally distorted due to the proximity of ferrous and/or magnetic objects. Herein, we propose a novel method for accurate 3D orientation estimation using an AHRS, comprised of an accelerometer, gyroscope and magnetometer, under conditions of magnetic field distortion. The system performs online detection and compensation for magnetic disturbances, due to, for example, the presence of ferrous objects. The magnetic distortions are detected by exploiting variations in magnetic dip angle, relative to the gravity vector, and in magnetic strength. We investigate and show the advantages of using both magnetic strength and magnetic dip angle for detecting the presence of magnetic distortions. The correction method is based on a particle filter, which performs the correction using an adaptive cost function and by adapting the variance during particle resampling, so as to place more emphasis on the results of dead reckoning of the gyroscope measurements and less on the magnetometer readings. The proposed method was tested in an indoor environment in the presence of various magnetic distortions and under various accelerations (up to 3 g). In the experiments, the proposed algorithm achieves <2° static peak-to-peak error and <5° dynamic peak-to-peak error, significantly outperforming previous methods. PMID:25347584
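
    The detection stage can be sketched with simple thresholds on field magnitude and dip angle relative to the gravity vector; the thresholds and reference values below are illustrative assumptions, and the paper's particle-filter correction is omitted:

```python
import numpy as np

def magnetic_disturbance(mag, acc, ref_norm, ref_dip_deg,
                         norm_tol=0.1, dip_tol_deg=5.0):
    """Flag a magnetometer sample as distorted when the field magnitude or
    the dip angle (measured against the gravity vector from the
    accelerometer) deviates from its local reference value."""
    m = np.asarray(mag, dtype=float)
    g = np.asarray(acc, dtype=float)
    norm = np.linalg.norm(m)
    cos_ang = m @ g / (norm * np.linalg.norm(g))
    dip = 90.0 - np.degrees(np.arccos(np.clip(cos_ang, -1.0, 1.0)))
    bad_norm = abs(norm - ref_norm) > norm_tol * ref_norm
    bad_dip = abs(dip - ref_dip_deg) > dip_tol_deg
    return bool(bad_norm or bad_dip)

acc = [0.0, 0.0, 1.0]                                  # gravity along +z
mag_ok = [np.cos(np.radians(60)), 0.0, np.sin(np.radians(60))]
mag_bad = [1.5, 0.0, 0.2]                              # e.g. ferrous object nearby
```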

  20. Reinforcing flood-risk estimation.

    PubMed

    Reed, Duncan W

    2002-07-15

    Flood-frequency estimation is inherently uncertain. The practitioner applies a combination of gauged data, scientific method and hydrological judgement to derive a flood-frequency curve for a particular site. The resulting estimate can be thought fully satisfactory only if it is broadly consistent with all that is reliably known about the flood-frequency behaviour of the river. The paper takes as its main theme the search for information to strengthen a flood-risk estimate made from peak flows alone. Extra information comes in many forms, including documentary and monumental records of historical floods, and palaeological markers. Meteorological information is also useful, although rainfall rarity is difficult to assess objectively and can be a notoriously unreliable indicator of flood rarity. On highly permeable catchments, groundwater levels present additional data. Other types of information are relevant to judging hydrological similarity when the flood-frequency estimate derives from data pooled across several catchments. After highlighting information sources, the paper explores a second theme: that of consistency in flood-risk estimates. Following publication of the Flood estimation handbook, studies of flood risk are now using digital catchment data. Automated calculation methods allow estimates by standard methods to be mapped basin-wide, revealing anomalies at special sites such as river confluences. Such mapping presents collateral information of a new character. Can this be used to achieve flood-risk estimates that are coherent throughout a river basin? PMID:12804255
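
    A single-site flood-frequency estimate from peak flows alone, which the paper seeks to strengthen with extra information, is often made by fitting an extreme-value distribution to annual maxima. A method-of-moments Gumbel sketch (synthetic data, illustrative only):

```python
import numpy as np

def gumbel_flood_frequency(annual_maxima, return_period):
    """Method-of-moments Gumbel fit to annual maximum flows, returning
    the T-year flood quantile."""
    x = np.asarray(annual_maxima, dtype=float)
    beta = x.std(ddof=1) * np.sqrt(6.0) / np.pi    # scale parameter
    mu = x.mean() - 0.5772 * beta                  # location parameter
    p = 1.0 - 1.0 / return_period                  # non-exceedance probability
    return mu - beta * np.log(-np.log(p))

rng = np.random.default_rng(7)
flows = rng.gumbel(loc=100.0, scale=20.0, size=60)   # synthetic 60-year record
q100 = gumbel_flood_frequency(flows, 100)            # 100-year flood estimate
```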

  1. Accurately Determining the Risks of Rising Sea Level

    NASA Astrophysics Data System (ADS)

    Marbaix, Philippe; Nicholls, Robert J.

    2007-10-01

    With the highest density of people and the greatest concentration of economic activity located in the coastal regions, sea level rise is an important concern as the climate continues to warm. Subsequent flooding may potentially disrupt industries, populations, and livelihoods, particularly in the long term if the climate is not quickly stabilized [McGranahan et al., 2007; Tol et al., 2006]. To help policy makers understand these risks, a more accurate description of hazards posed by rising sea levels is needed at the global scale, even though the impacts in specific regions are better known.

  2. Fast and Accurate Learning When Making Discrete Numerical Estimates.

    PubMed

    Sanborn, Adam N; Beierholm, Ulrik R

    2016-04-01

    Many everyday estimation tasks have an inherently discrete nature, whether the task is counting objects (e.g., a number of paint buckets) or estimating discretized continuous variables (e.g., the number of paint buckets needed to paint a room). While Bayesian inference is often used for modeling estimates made along continuous scales, discrete numerical estimates have not received as much attention, despite their common everyday occurrence. Using two tasks, a numerosity task and an area estimation task, we invoke Bayesian decision theory to characterize how people learn discrete numerical distributions and make numerical estimates. Across three experiments with novel stimulus distributions we found that participants fell between two common decision functions for converting their uncertain representation into a response: drawing a sample from their posterior distribution and taking the maximum of their posterior distribution. While this was consistent with the decision function found in previous work using continuous estimation tasks, surprisingly the prior distributions learned by participants in our experiments were much more adaptive: When making continuous estimates, participants have required thousands of trials to learn bimodal priors, but in our tasks participants learned discrete bimodal and even discrete quadrimodal priors within a few hundred trials. This makes discrete numerical estimation tasks good testbeds for investigating how people learn and make estimates. PMID:27070155
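
    The two decision functions compared in the paper, drawing a sample from the posterior versus taking its maximum, can be sketched for a discrete bimodal prior (the prior, likelihood, and observation below are hypothetical):

```python
import numpy as np

# Hypothetical discrete bimodal prior over a count in 1..10.
counts = np.arange(1, 11)
prior = np.where((counts == 3) | (counts == 8), 0.4, 0.025)

# Gaussian-shaped likelihood around a noisy observation of the true count.
obs = 7
likelihood = np.exp(-0.5 * ((counts - obs) / 1.5) ** 2)

post = prior * likelihood
post /= post.sum()               # normalized discrete posterior

# Decision function 1: draw a sample from the posterior.
rng = np.random.default_rng(0)
sampled_estimate = int(rng.choice(counts, p=post))

# Decision function 2: take the posterior maximum (MAP).
map_estimate = int(counts[np.argmax(post)])
```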

  4. Radiations in space: risk estimates.

    PubMed

    Fry, R J M

    2002-01-01

    The complexity of radiation environments in space makes estimation of risks more difficult than for the protection of terrestrial populations. In deep space the duration of the mission, position in the solar cycle, number and size of solar particle events (SPE) and the spacecraft shielding are the major determinants of risk. In low-earth orbit missions there are the added factors of altitude and orbital inclination. Different radiation qualities, such as protons and heavy ions, and secondary radiations inside the spacecraft, such as neutrons of various energies, have to be considered. Radiation dose rates in space are low except for short periods during very large SPEs. Risk estimation for space activities is based on the human experience of exposure to gamma rays and, to a lesser extent, X rays. The doses of protons, heavy ions and neutrons are adjusted to take into account the relative biological effectiveness (RBE) of the different radiation types and thus derive equivalent doses. RBE values and factors to adjust for the effect of dose rate have to be obtained from experimental data. The influence of age and gender on the cancer risk is estimated from the data from atomic bomb survivors. Because of the large number of variables, the uncertainties in the probability of the effects are large. Information needed to improve the risk estimates includes: (1) the risk of cancer induction by protons, heavy ions and neutrons; (2) the influence of dose rate and protraction, particularly on potential tissue effects such as reduced fertility and cataracts; and (3) possible effects of heavy ions on the central nervous system. Risk cannot be eliminated and thus there must be a consensus on what level of risk is acceptable. PMID:12382925
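
    The equivalent-dose adjustment described above is a weighted sum of absorbed doses; the weighting factors and mission doses below are illustrative assumptions, not recommended values:

```python
def equivalent_dose(doses_gray, weights):
    """Equivalent dose (Sv) as the sum over radiation types of absorbed
    dose (Gy) times a radiation weighting factor, the RBE-based
    adjustment described above."""
    return sum(doses_gray[r] * weights[r] for r in doses_gray)

# Hypothetical mission doses (Gy) and illustrative weighting factors.
doses = {"protons": 0.050, "heavy_ions": 0.010, "neutrons": 0.005}
weights = {"protons": 2.0, "heavy_ions": 20.0, "neutrons": 10.0}
h = equivalent_dose(doses, weights)   # 0.05*2 + 0.01*20 + 0.005*10 = 0.35 Sv
```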

  5. Bioaccessibility tests accurately estimate bioavailability of lead to quail

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Hazards of soil-borne Pb to wild birds may be more accurately quantified if the bioavailability of that Pb is known. To better understand the bioavailability of Pb, we incorporated Pb-contaminated soils or Pb acetate into diets for Japanese quail (Coturnix japonica), fed the quail for 15 days, and ...

  6. BIOACCESSIBILITY TESTS ACCURATELY ESTIMATE BIOAVAILABILITY OF LEAD TO QUAIL

    EPA Science Inventory

    Hazards of soil-borne Pb to wild birds may be more accurately quantified if the bioavailability of that Pb is known. To better understand the bioavailability of Pb to birds, we measured blood Pb concentrations in Japanese quail (Coturnix japonica) fed diets containing Pb-contami...

  7. How Accurately Do Spectral Methods Estimate Effective Elastic Thickness?

    NASA Astrophysics Data System (ADS)

    Perez-Gussinye, M.; Lowry, A. R.; Watts, A. B.; Velicogna, I.

    2002-12-01

The effective elastic thickness, Te, is an important parameter that has the potential to provide information on the long-term thermal and mechanical properties of the lithosphere. Previous studies have estimated Te using both forward and inverse (spectral) methods. While there is generally good agreement between the results obtained using these methods, spectral methods are limited because they depend on the spectral estimator and the window size chosen for analysis. In order to address this problem, we have used a multitaper technique which yields optimal estimates of the bias and variance of the Bouguer coherence function relating topography and gravity anomaly data. The technique has been tested using realistic synthetic topography and gravity. Synthetic data were generated assuming surface and sub-surface (buried) loading of an elastic plate with fractal statistics consistent with real data sets. The cases of uniform and spatially varying Te are examined. The topography and gravity anomaly data consist of 2000x2000 km grids sampled at an 8 km interval. The bias in the Te estimate is assessed from the difference between the true Te value and the mean from analyzing 100 overlapping windows within the 2000x2000 km data grids. For the case in which Te is uniform, the bias and variance decrease with window size and increase with increasing true Te value. In the case of a spatially varying Te, however, there is a trade-off between spatial resolution and variance. With increasing window size the variance of the Te estimate decreases, but the spatial changes in Te are smeared out. We find that for a Te distribution consisting of a strong central circular region of Te=50 km (radius 600 km) and progressively smaller Te towards its edges, the 800x800 and 1000x1000 km windows gave the best compromise between spatial resolution and variance. Our studies demonstrate that assumed stationarity of the relationship between gravity and topography data yields good results even in

  8. Accurate feature detection and estimation using nonlinear and multiresolution analysis

    NASA Astrophysics Data System (ADS)

    Rudin, Leonid; Osher, Stanley

    1994-11-01

    A program for feature detection and estimation using nonlinear and multiscale analysis was completed. The state-of-the-art edge detection was combined with multiscale restoration (as suggested by the first author) and robust results in the presence of noise were obtained. Successful applications to numerous images of interest to DOD were made. Also, a new market in the criminal justice field was developed, based in part, on this work.

  9. Accurate tempo estimation based on harmonic + noise decomposition

    NASA Astrophysics Data System (ADS)

    Alonso, Miguel; Richard, Gael; David, Bertrand

    2006-12-01

We present an innovative tempo estimation system that processes acoustic audio signals and does not use any high-level musical knowledge. Our proposal relies on a harmonic + noise decomposition of the audio signal by means of a subspace analysis method. Then, a technique to measure the degree of musical accentuation as a function of time is developed and separately applied to the harmonic and noise parts of the input signal. This is followed by a periodicity estimation block that calculates the salience of musical accents for a large number of potential periods. Next, a multipath dynamic programming stage searches among all the potential periodicities for the most consistent prospects through time, and finally the most energetic candidate is selected as the tempo. Our proposal is validated using a manually annotated test database containing 961 music signals from various musical genres. In addition, the performance of the algorithm under different configurations is compared. The robustness of the algorithm when processing signals of degraded quality is also measured.
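
The full subspace-analysis and dynamic-programming pipeline is beyond a short sketch, but the core idea of scoring candidate beat periods of an accent signal can be illustrated with a plain autocorrelation search, a much-simplified stand-in for the paper's periodicity estimation block:

```python
def autocorr_tempo(onset_env, sr, bpm_range=(60, 180)):
    """Search candidate beat periods (lags) and return the tempo (BPM) whose
    autocorrelation of the mean-removed accent signal is strongest."""
    n = len(onset_env)
    mean = sum(onset_env) / n
    x = [v - mean for v in onset_env]
    lo = max(1, int(sr * 60 / bpm_range[1]))      # shortest period in samples
    hi = min(n - 1, int(sr * 60 / bpm_range[0]))  # longest period in samples
    best_lag, best_r = lo, float("-inf")
    for lag in range(lo, hi + 1):
        r = sum(x[i] * x[i - lag] for i in range(lag, n))
        if r > best_r:
            best_lag, best_r = lag, r
    return 60.0 * sr / best_lag

# Synthetic accent signal: one accent every 0.5 s at a 100 Hz frame rate (120 BPM).
env = [1.0 if i % 50 == 0 else 0.0 for i in range(500)]
print(autocorr_tempo(env, 100))  # 120.0
```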

  10. Bioaccessibility tests accurately estimate bioavailability of lead to quail

    USGS Publications Warehouse

    Beyer, W. Nelson; Basta, Nicholas T; Chaney, Rufus L.; Henry, Paula F.; Mosby, David; Rattner, Barnett A.; Scheckel, Kirk G.; Sprague, Dan; Weber, John

    2016-01-01

    Hazards of soil-borne Pb to wild birds may be more accurately quantified if the bioavailability of that Pb is known. To better understand the bioavailability of Pb to birds, we measured blood Pb concentrations in Japanese quail (Coturnix japonica) fed diets containing Pb-contaminated soils. Relative bioavailabilities were expressed by comparison with blood Pb concentrations in quail fed a Pb acetate reference diet. Diets containing soil from five Pb-contaminated Superfund sites had relative bioavailabilities from 33%-63%, with a mean of about 50%. Treatment of two of the soils with phosphorus significantly reduced the bioavailability of Pb. Bioaccessibility of Pb in the test soils was then measured in six in vitro tests and regressed on bioavailability. They were: the “Relative Bioavailability Leaching Procedure” (RBALP) at pH 1.5, the same test conducted at pH 2.5, the “Ohio State University In vitro Gastrointestinal” method (OSU IVG), the “Urban Soil Bioaccessible Lead Test”, the modified “Physiologically Based Extraction Test” and the “Waterfowl Physiologically Based Extraction Test.” All regressions had positive slopes. Based on criteria of slope and coefficient of determination, the RBALP pH 2.5 and OSU IVG tests performed very well. Speciation by X-ray absorption spectroscopy demonstrated that, on average, most of the Pb in the sampled soils was sorbed to minerals (30%), bound to organic matter (24%), or present as Pb sulfate (18%). Additional Pb was associated with P (chloropyromorphite, hydroxypyromorphite and tertiary Pb phosphate), and with Pb carbonates, leadhillite (a lead sulfate carbonate hydroxide), and Pb sulfide. The formation of chloropyromorphite reduced the bioavailability of Pb and the amendment of Pb-contaminated soils with P may be a thermodynamically favored means to sequester Pb.

  11. Bioaccessibility tests accurately estimate bioavailability of lead to quail.

    PubMed

    Beyer, W Nelson; Basta, Nicholas T; Chaney, Rufus L; Henry, Paula F P; Mosby, David E; Rattner, Barnett A; Scheckel, Kirk G; Sprague, Daniel T; Weber, John S

    2016-09-01

    Hazards of soil-borne lead (Pb) to wild birds may be more accurately quantified if the bioavailability of that Pb is known. To better understand the bioavailability of Pb to birds, the authors measured blood Pb concentrations in Japanese quail (Coturnix japonica) fed diets containing Pb-contaminated soils. Relative bioavailabilities were expressed by comparison with blood Pb concentrations in quail fed a Pb acetate reference diet. Diets containing soil from 5 Pb-contaminated Superfund sites had relative bioavailabilities from 33% to 63%, with a mean of approximately 50%. Treatment of 2 of the soils with phosphorus (P) significantly reduced the bioavailability of Pb. Bioaccessibility of Pb in the test soils was then measured in 6 in vitro tests and regressed on bioavailability: the relative bioavailability leaching procedure at pH 1.5, the same test conducted at pH 2.5, the Ohio State University in vitro gastrointestinal method, the urban soil bioaccessible lead test, the modified physiologically based extraction test, and the waterfowl physiologically based extraction test. All regressions had positive slopes. Based on criteria of slope and coefficient of determination, the relative bioavailability leaching procedure at pH 2.5 and Ohio State University in vitro gastrointestinal tests performed very well. Speciation by X-ray absorption spectroscopy demonstrated that, on average, most of the Pb in the sampled soils was sorbed to minerals (30%), bound to organic matter (24%), or present as Pb sulfate (18%). Additional Pb was associated with P (chloropyromorphite, hydroxypyromorphite, and tertiary Pb phosphate) and with Pb carbonates, leadhillite (a lead sulfate carbonate hydroxide), and Pb sulfide. The formation of chloropyromorphite reduced the bioavailability of Pb, and the amendment of Pb-contaminated soils with P may be a thermodynamically favored means to sequester Pb. Environ Toxicol Chem 2016;35:2311-2319. Published 2016 Wiley Periodicals Inc. on behalf of

  12. How accurately does the public perceive differences in transport risks? An exploratory analysis of scales representing perceived risk.

    PubMed

    Elvik, Rune; Bjørnskau, Torkel

    2005-11-01

    This paper probes the extent to which the public accurately perceives differences in transport risks. The paper is based on a survey of a random sample of the Norwegian population, conducted in September 2003. In the survey, respondents were asked: "How safe do you think it is to travel by means of (bus, train, etc.)?" Answers were given as: very safe, safe, a little unsafe, and very unsafe. A cursory examination of the answers suggested that the Norwegian public was quite well informed about differences in the risk of accident between different modes of transport, as well as between groups formed according to age and gender for each mode of transport. This paper probes the relationship between statistical estimates of risk and summary representations of perceived risk more systematically. It is found that the differences in fatality rate between different modes of transport are quite well perceived by the Norwegian public, irrespective of the way in which perceived risk is represented numerically. The relationship between statistical estimates of risk and numerical representations of perceived risk for each mode of transport is more sensitive to the choice of a numerical representation of perceived risk. A scale in which the answer "very safe" is assigned the value of 0.01 and the answer "very unsafe" is assigned the value of 10 is found to perform quite well. When the perception of risk is represented numerically according to this scale, a positive correlation between statistically estimated risk and perceived risk is found in seven of the eight comparisons that were made to determine how well variation in accident rates according to age and gender for car occupants, car drivers, cyclists and pedestrians are perceived. PMID:16054102
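
The comparison between statistical risk estimates and a numerical representation of perceived risk amounts to a correlation over transport modes; a sketch using rank correlation, with wholly invented data (the survey's actual scale values and rates are not reproduced here):

```python
def ranks(values):
    """1-based ranks (assumes no ties)."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    for pos, i in enumerate(order):
        r[i] = pos + 1.0
    return r

def spearman(a, b):
    """Spearman rank correlation for tie-free data."""
    ra, rb = ranks(a), ranks(b)
    n = len(a)
    d2 = sum((x - y) ** 2 for x, y in zip(ra, rb))
    return 1.0 - 6.0 * d2 / (n * (n * n - 1))

# Invented data: fatality rates (per billion person-km) vs mean perceived-risk scores.
statistical = [0.1, 0.3, 2.0, 30.0]   # e.g. train, bus, car, motorcycle
perceived = [0.05, 0.2, 1.1, 6.0]
print(spearman(statistical, perceived))  # 1.0 (identical ordering)
```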

  13. New Cardiovascular Risk Factors and Their Use for an Accurate Cardiovascular Risk Assessment in Hypertensive Patients

    PubMed Central

    TAUTU, Oana-Florentina; DARABONT, Roxana; ONCIUL, Sebastian; DEACONU, Alexandru; COMANESCU, Ioana; ANDREI, Radu Dan; DRAGOESCU, Bogdan; CINTEZA, Mircea; DOROBANTU, Maria

    2014-01-01

Objectives: To analyze the predictive value of new cardiovascular (CV) risk factors for CV risk assessment in the adult Romanian hypertensive (HT) population. Methods: Hypertensive adults aged between 40 and 65 years, identified in the nationally representative SEPHAR II survey, were evaluated by anthropometric, BP and arterial stiffness measurements: aortic pulse wave velocity (PWVao), aortic augmentation index (AIXao), reverse time (RT) and central systolic blood pressure (SBPao), 12-lead ECGs and laboratory workup. Values above the 4th quartile of the mean SBP's standard deviation (s.d.) defined increased BP variability. Log(TG/HDL-cholesterol) defined the atherogenic index of plasma (AIP). Serum uric acid levels above 5.70 mg/dl for women and 7.0 mg/dl for males defined hyperuricemia (HUA). CV risk was assessed based on the SCORE chart for high CV risk countries. Binary logistic regression using a stepwise likelihood ratio method (adjustments for major confounders and collinearity analysis) was used in order to validate predictors of the high and very high CV risk class. Results: The mean SBP value of the study group was 148.46±19.61 mmHg. Over forty percent of hypertensives had a high and very high CV risk. Predictors of the high/very high CV risk category validated by regression analysis were: increased visit-to-visit BP variability (OR: 2.49; 95%CI: 1.67-3.73), PWVao (OR: 1.12; 95%CI: 1.02-1.22), RT (OR: 0.95; 95% CI: 0.93-0.98), SBPao (OR: 1.01; 95%CI: 1.01-1.03) and AIP (OR: 7.08; 95%CI: 3.91-12.82). Conclusion: The results of our study suggest that new CV risk factors such as increased BP variability, arterial stiffness indices and AIP are useful tools for a more accurate identification of hypertensive patients at high and very high CV risk. PMID:25705267

  14. Radiologists’ ability to accurately estimate and compare their own interpretative mammography performance to their peers

    PubMed Central

    Cook, Andrea J.; Elmore, Joann G.; Zhu, Weiwei; Jackson, Sara L.; Carney, Patricia A.; Flowers, Chris; Onega, Tracy; Geller, Berta; Rosenberg, Robert D.; Miglioretti, Diana L.

    2013-01-01

    Objective To determine if U.S. radiologists accurately estimate their own interpretive performance of screening mammography and how they compare their performance to their peers’. Materials and Methods 174 radiologists from six Breast Cancer Surveillance Consortium (BCSC) registries completed a mailed survey between 2005 and 2006. Radiologists’ estimated and actual recall, false positive, and cancer detection rates and positive predictive value of biopsy recommendation (PPV2) for screening mammography were compared. Radiologists’ ratings of their performance as lower, similar, or higher than their peers were compared to their actual performance. Associations with radiologist characteristics were estimated using weighted generalized linear models. The study was approved by the institutional review boards of the participating sites, informed consent was obtained from radiologists, and procedures were HIPAA compliant. Results While most radiologists accurately estimated their cancer detection and recall rates (74% and 78% of radiologists), fewer accurately estimated their false positive rate and PPV2 (19% and 26%). Radiologists reported having similar (43%) or lower (31%) recall rates and similar (52%) or lower (33%) false positive rates compared to their peers, and similar (72%) or higher (23%) cancer detection rates and similar (72%) or higher (38%) PPV2. Estimation accuracy did not differ by radiologists’ characteristics except radiologists who interpret ≤1,000 mammograms annually were less accurate at estimating their recall rates. Conclusion Radiologists perceive their performance to be better than it actually is and at least as good as their peers. Radiologists have particular difficulty estimating their false positive rates and PPV2. PMID:22915414

  15. Thinking Concretely Increases the Perceived Likelihood of Risks: The Effect of Construal Level on Risk Estimation.

    PubMed

    Lermer, Eva; Streicher, Bernhard; Sachs, Rainer; Raue, Martina; Frey, Dieter

    2016-03-01

    Recent findings on construal level theory (CLT) suggest that abstract thinking leads to a lower estimated probability of an event occurring compared to concrete thinking. We applied this idea to the risk context and explored the influence of construal level (CL) on the overestimation of small and underestimation of large probabilities for risk estimates concerning a vague target person (Study 1 and Study 3) and personal risk estimates (Study 2). We were specifically interested in whether the often-found overestimation of small probabilities could be reduced with abstract thinking, and the often-found underestimation of large probabilities was reduced with concrete thinking. The results showed that CL influenced risk estimates. In particular, a concrete mindset led to higher risk estimates compared to an abstract mindset for several adverse events, including events with small and large probabilities. This suggests that CL manipulation can indeed be used for improving the accuracy of lay people's estimates of small and large probabilities. Moreover, the results suggest that professional risk managers' risk estimates of common events (thus with a relatively high probability) could be improved by adopting a concrete mindset. However, the abstract manipulation did not lead managers to estimate extremely unlikely events more accurately. Potential reasons for different CL manipulation effects on risk estimates' accuracy between lay people and risk managers are discussed. PMID:26111548

  16. Accurate Non-parametric Estimation of Recent Effective Population Size from Segments of Identity by Descent

    PubMed Central

    Browning, Sharon R.; Browning, Brian L.

    2015-01-01

    Existing methods for estimating historical effective population size from genetic data have been unable to accurately estimate effective population size during the most recent past. We present a non-parametric method for accurately estimating recent effective population size by using inferred long segments of identity by descent (IBD). We found that inferred segments of IBD contain information about effective population size from around 4 generations to around 50 generations ago for SNP array data and to over 200 generations ago for sequence data. In human populations that we examined, the estimates of effective size were approximately one-third of the census size. We estimate the effective population size of European-ancestry individuals in the UK four generations ago to be eight million and the effective population size of Finland four generations ago to be 0.7 million. Our method is implemented in the open-source IBDNe software package. PMID:26299365

  17. Simple, fast and accurate eight points amplitude estimation method of sinusoidal signals for DSP based instrumentation

    NASA Astrophysics Data System (ADS)

    Vizireanu, D. N.; Halunga, S. V.

    2012-04-01

    A simple, fast and accurate amplitude estimation algorithm of sinusoidal signals for DSP based instrumentation is proposed. It is shown that eight samples, used in two steps, are sufficient. A practical analytical formula for amplitude estimation is obtained. Numerical results are presented. Simulations have been performed when the sampled signal is affected by white Gaussian noise and when the samples are quantized on a given number of bits.
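
The paper's exact eight-sample formula is not reproduced in this abstract; the sketch below shows one standard way to estimate a sinusoid's amplitude from eight samples spanning exactly one period, via the magnitude of the first DFT bin:

```python
import cmath
import math

def amplitude_8pt(samples):
    """Amplitude of a sinusoid from N samples covering exactly one period:
    2/N times the magnitude of the first DFT bin (here N = 8)."""
    n = len(samples)
    bin1 = sum(s * cmath.exp(-2j * math.pi * k / n) for k, s in enumerate(samples))
    return 2.0 * abs(bin1) / n

# A 3.0-amplitude sinusoid with arbitrary phase, sampled 8 times per period.
samples = [3.0 * math.sin(2 * math.pi * k / 8 + 0.7) for k in range(8)]
print(round(amplitude_8pt(samples), 6))  # 3.0
```

With additive noise or quantized samples the estimate degrades gracefully, which is the behavior the paper evaluates.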

  18. Estimating the re-identification risk of clinical data sets

    PubMed Central

    2012-01-01

    Background De-identification is a common way to protect patient privacy when disclosing clinical data for secondary purposes, such as research. One type of attack that de-identification protects against is linking the disclosed patient data with public and semi-public registries. Uniqueness is a commonly used measure of re-identification risk under this attack. If uniqueness can be measured accurately then the risk from this kind of attack can be managed. In practice, it is often not possible to measure uniqueness directly, therefore it must be estimated. Methods We evaluated the accuracy of uniqueness estimators on clinically relevant data sets. Four candidate estimators were identified because they were evaluated in the past and found to have good accuracy or because they were new and not evaluated comparatively before: the Zayatz estimator, slide negative binomial estimator, Pitman’s estimator, and mu-argus. A Monte Carlo simulation was performed to evaluate the uniqueness estimators on six clinically relevant data sets. We varied the sampling fraction and the uniqueness in the population (the value being estimated). The median relative error and inter-quartile range of the uniqueness estimates was measured across 1000 runs. Results There was no single estimator that performed well across all of the conditions. We developed a decision rule which selected between the Pitman, slide negative binomial and Zayatz estimators depending on the sampling fraction and the difference between estimates. This decision rule had the best consistent median relative error across multiple conditions and data sets. Conclusion This study identified an accurate decision rule that can be used by health privacy researchers and disclosure control professionals to estimate uniqueness in clinical data sets. The decision rule provides a reliable way to measure re-identification risk. PMID:22776564
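
Why uniqueness must be estimated rather than read off a sample can be shown directly; the quasi-identifier tuples below are synthetic, and the naive sample proportion illustrates the bias that estimators such as Zayatz and Pitman's are designed to correct:

```python
import random
from collections import Counter

def uniqueness(records):
    """Proportion of records whose quasi-identifier tuple occurs exactly once."""
    counts = Counter(records)
    return sum(1 for c in counts.values() if c == 1) / len(records)

rng = random.Random(0)
# Synthetic quasi-identifier tuples: (age band, sex, region code).
population = [(rng.randint(1, 20), rng.choice("MF"), rng.randint(1, 50))
              for _ in range(10000)]
sample = rng.sample(population, 1000)  # a 10% disclosure sample

u_pop = uniqueness(population)
u_sample = uniqueness(sample)  # far higher: rare combinations look unique in a sample
```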

  19. On the accurate estimation of gap fraction during daytime with digital cover photography

    NASA Astrophysics Data System (ADS)

    Hwang, Y. R.; Ryu, Y.; Kimm, H.; Macfarlane, C.; Lang, M.; Sonnentag, O.

    2015-12-01

Digital cover photography (DCP) has emerged as an indirect method to obtain gap fraction accurately. Thus far, however, the intervention of subjectivity, such as determining the camera relative exposure value (REV) and the threshold in the histogram, has hindered computing accurate gap fraction. Here we propose a novel method that enables us to measure gap fraction accurately during daytime under various sky conditions by DCP. The novel method computes gap fraction using a single unsaturated raw DCP image, which is corrected for scattering effects by canopies, and a sky image reconstructed from the raw-format image. To test the sensitivity of the gap fraction derived by the novel method to diverse REVs, solar zenith angles and canopy structures, we took photos at one-hour intervals between sunrise and midday under dense and sparse canopies with REV 0 to -5. The novel method showed little variation in gap fraction across different REVs in both dense and sparse canopies across a diverse range of solar zenith angles. The perforated panel experiment, which was used to test the accuracy of the estimated gap fraction, confirmed that the novel method produced accurate and consistent gap fractions across different hole sizes, gap fractions and solar zenith angles. These findings highlight that the novel method opens new opportunities to estimate gap fraction accurately during daytime from sparse to dense canopies, which will be useful in monitoring LAI precisely and validating satellite remote sensing LAI products efficiently.
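
Once pixels are classified, the gap fraction itself is a simple proportion; the sketch below makes the threshold explicit, which is precisely the subjective step the proposed method aims to remove:

```python
def gap_fraction(image, threshold):
    """Gap fraction = sky pixels / total pixels, where a pixel counts as sky
    when its brightness exceeds the threshold."""
    sky = total = 0
    for row in image:
        for v in row:
            total += 1
            if v > threshold:
                sky += 1
    return sky / total

# Toy 4x4 grayscale canopy image: 255 = sky, 10 = foliage.
img = [[255, 10, 255, 10],
       [10, 10, 255, 255],
       [255, 10, 10, 10],
       [10, 255, 10, 10]]
print(gap_fraction(img, 128))  # 0.375
```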

  20. A new geometric-based model to accurately estimate arm and leg inertial estimates.

    PubMed

    Wicke, Jason; Dumas, Geneviève A

    2014-06-01

Segment estimates of mass, center of mass and moment of inertia are required input parameters to analyze the forces and moments acting across the joints. The objectives of this study were to propose a new geometric model for limb segments, to evaluate it against criterion values obtained from dual-energy X-ray absorptiometry (DXA), and to compare its performance to five other popular models. Twenty-five female and 24 male college students participated in the study. For the criterion measures, the participants underwent a whole body DXA scan, and estimates for segment mass, center of mass location, and moment of inertia (frontal plane) were directly computed from the DXA mass units. For the new model, the volume was determined from two standing frontal and sagittal photographs. Each segment was modeled as a stack of slices, the sections of which were ellipses if they were not adjoining another segment and sectioned ellipses if they were adjoining another segment (e.g. upper arm and trunk). The lengths of the axes of the ellipses were obtained from the photographs. In addition, a sex-specific, non-uniform density function was developed for each segment. A series of anthropometric measurements were also taken by directly following the definitions provided by the different body segment models tested, and the same parameters were determined for each model. Comparison of models showed that estimates from the new model were consistently closer to the DXA criterion than those from the other models, with an error of less than 5% for mass and moment of inertia and less than about 6% for center of mass location. PMID:24735506
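
The slice-stack geometry can be sketched as follows: each slice contributes the volume of a short elliptical cylinder, with one semi-axis read from the frontal photograph and the other from the sagittal photograph. The dimensions and the uniform density below are illustrative only; the study uses sex-specific, non-uniform density functions:

```python
import math

def segment_mass_kg(slice_semi_axes_m, slice_height_m, density_kg_m3):
    """Mass of a limb segment modeled as a stack of elliptical slices:
    each slice contributes pi * a * b * h * density, with semi-axis a from
    the frontal photograph and b from the sagittal photograph."""
    return sum(math.pi * a * b * slice_height_m * density_kg_m3
               for a, b in slice_semi_axes_m)

# Hypothetical forearm: ten 2 cm slices with tapering semi-axes, uniform density.
axes = [(0.040 - 0.001 * i, 0.035 - 0.001 * i) for i in range(10)]
mass = segment_mass_kg(axes, 0.02, 1100.0)
```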

  1. Accurate Estimation of the Entropy of Rotation-Translation Probability Distributions.

    PubMed

    Fogolari, Federico; Dongmo Foumthuim, Cedrix Jurgal; Fortuna, Sara; Soler, Miguel Angel; Corazza, Alessandra; Esposito, Gennaro

    2016-01-12

The estimation of rotational and translational entropies in the context of ligand binding has been the subject of long-standing investigation. The high dimensionality (six) of the problem and the limited amount of sampling often prevent the histogram method from attaining the resolution required for accurate estimates. Recently, the nearest-neighbor distance method has been applied to the problem, but the solutions provided either address rotation and translation separately, therefore lacking correlations, or use a heuristic approach. Here we address rotational-translational entropy estimation in the context of nearest-neighbor-based entropy estimation, solve the problem numerically, and provide an exact and an approximate method to estimate the full rotational-translational entropy. PMID:26605696
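
A one-dimensional illustration of nearest-neighbor entropy estimation (the Kozachenko-Leonenko estimator); the paper's contribution is extending this idea to the full six-dimensional rotation-translation space, which is not reproduced here:

```python
import math
import random

def entropy_knn_1d(samples):
    """Kozachenko-Leonenko nearest-neighbor entropy estimate (nats) for 1-D data:
    H ~= psi(N) + gamma + ln(2) + mean(ln eps_i), with psi(N) approximated by ln(N)
    and eps_i the distance from sample i to its nearest neighbor."""
    xs = sorted(samples)
    n = len(xs)
    eps = []
    for i, x in enumerate(xs):
        left = x - xs[i - 1] if i > 0 else float("inf")
        right = xs[i + 1] - x if i < n - 1 else float("inf")
        eps.append(min(left, right))
    gamma = 0.5772156649015329  # Euler-Mascheroni constant
    return math.log(n) + gamma + math.log(2) + sum(math.log(e) for e in eps) / n

r = random.Random(1)
h = entropy_knn_1d([r.random() for _ in range(2000)])  # true entropy of U(0,1) is 0
```

Unlike a histogram, the estimator needs no binning, which is what makes it attractive in high dimensions where bins would be mostly empty.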

  2. Dynamic cost risk estimation and budget misspecification

    NASA Technical Reports Server (NTRS)

    Ebbeler, D. H.; Fox, G.; Habib-Agahi, H.

    2003-01-01

    Cost risk for new technology development is estimated by explicit stochastic processes. Monte Carlo simulation is used to propagate technology development activity budget changes during the technology development cycle.

  3. Exposure Estimation and Interpretation of Occupational Risk: Enhanced Information for the Occupational Risk Manager

    PubMed Central

    Waters, Martha; McKernan, Lauralynn; Maier, Andrew; Jayjock, Michael; Schaeffer, Val; Brosseau, Lisa

    2015-01-01

The fundamental goal of this article is to describe, define, and analyze the components of the risk characterization process for occupational exposures. Current methods are described for the probabilistic characterization of exposure, including newer techniques that have increasing applications for assessing data from occupational exposure scenarios. In addition, since the probability of health effects reflects variability in the exposure estimate as well as in the dose-response curve, integrated consideration of the variability surrounding both components of the risk characterization provides greater information to the occupational hygienist. Probabilistic tools provide a more informed view of exposure as compared to the use of discrete point estimates for these inputs to the risk characterization process. Active use of such tools for exposure and risk assessment will lead to a scientifically supported worker health protection program. Understanding the bases for an occupational risk assessment and focusing on important sources of variability and uncertainty enable characterizing occupational risk in terms of a probability, rather than a binary decision of acceptable or unacceptable risk. A critical review of existing methods highlights several conclusions: (1) exposure estimates and the dose-response are impacted by both variability and uncertainty, and a well-developed risk characterization reflects and communicates this consideration; (2) occupational risk is probabilistic in nature and most accurately considered as a distribution, not a point estimate; and (3) occupational hygienists have a variety of tools available to incorporate concepts of risk characterization into occupational health practice. PMID:26302336
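
A minimal sketch of the probabilistic view: instead of comparing a single point estimate to the limit, simulate the exposure distribution and report an exceedance probability. The geometric mean, geometric standard deviation, and limit below are illustrative values only:

```python
import math
import random

def exceedance_probability(gm, gsd, oel, n_trials, rng):
    """Monte Carlo estimate of the probability that a lognormally distributed
    full-shift exposure exceeds the occupational exposure limit (OEL)."""
    mu, sigma = math.log(gm), math.log(gsd)
    over = sum(1 for _ in range(n_trials) if rng.lognormvariate(mu, sigma) > oel)
    return over / n_trials

# Illustrative inputs: GM 0.05 mg/m^3, GSD 2.5, OEL 0.10 mg/m^3.
p = exceedance_probability(0.05, 2.5, 0.10, 20000, random.Random(7))
```

A point estimate at the GM would simply note 0.05 < 0.10 and declare the exposure acceptable; the distributional view instead reports how often shifts exceed the limit.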

  4. Polynomial Fitting of DT-MRI Fiber Tracts Allows Accurate Estimation of Muscle Architectural Parameters

    PubMed Central

    Damon, Bruce M.; Heemskerk, Anneriet M.; Ding, Zhaohua

    2012-01-01

    Fiber curvature is a functionally significant muscle structural property, but its estimation from diffusion-tensor MRI fiber tracking data may be confounded by noise. The purpose of this study was to investigate the use of polynomial fitting of fiber tracts for improving the accuracy and precision of fiber curvature (κ) measurements. Simulated image datasets were created in order to provide data with known values for κ and pennation angle (θ). Simulations were designed to test the effects of increasing inherent fiber curvature (3.8, 7.9, 11.8, and 15.3 m−1), signal-to-noise ratio (50, 75, 100, and 150), and voxel geometry (13.8 and 27.0 mm3 voxel volume with isotropic resolution; 13.5 mm3 volume with an aspect ratio of 4.0) on κ and θ measurements. In the originally reconstructed tracts, θ was estimated accurately under most curvature and all imaging conditions studied; however, the estimates of κ were imprecise and inaccurate. Fitting the tracts to 2nd order polynomial functions provided accurate and precise estimates of κ for all conditions except very high curvature (κ=15.3 m−1), while preserving the accuracy of the θ estimates. Similarly, polynomial fitting of in vivo fiber tracking data reduced the κ values of fitted tracts from those of unfitted tracts and did not change the θ values. Polynomial fitting of fiber tracts allows accurate estimation of physiologically reasonable values of κ, while preserving the accuracy of θ estimation. PMID:22503094
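
The second-order polynomial fitting step can be sketched as follows, assuming a planar tract for simplicity: fit a quadratic to the tract point coordinates by least squares, then evaluate the analytic curvature of the fitted curve rather than differentiating noisy points directly:

```python
def fit_quadratic(xs, ys):
    """Least-squares fit of y = a*x**2 + b*x + c via the 3x3 normal equations."""
    n = len(xs)
    s = lambda p: sum(x ** p for x in xs)
    sy = lambda p: sum((x ** p) * y for x, y in zip(xs, ys))
    A = [[s(4), s(3), s(2)],
         [s(3), s(2), s(1)],
         [s(2), s(1), float(n)]]
    r = [sy(2), sy(1), sy(0)]
    for col in range(3):                      # Gaussian elimination with pivoting
        piv = max(range(col, 3), key=lambda i: abs(A[i][col]))
        A[col], A[piv] = A[piv], A[col]
        r[col], r[piv] = r[piv], r[col]
        for i in range(col + 1, 3):
            f = A[i][col] / A[col][col]
            for j in range(col, 3):
                A[i][j] -= f * A[col][j]
            r[i] -= f * r[col]
    coeffs = [0.0, 0.0, 0.0]
    for i in (2, 1, 0):                       # back substitution
        coeffs[i] = (r[i] - sum(A[i][j] * coeffs[j] for j in range(i + 1, 3))) / A[i][i]
    return coeffs  # a, b, c

def curvature_at(a, b, x):
    """Analytic curvature of y = a*x**2 + b*x + c at position x (units: 1/m)."""
    return abs(2 * a) / (1 + (2 * a * x + b) ** 2) ** 1.5

xs = [i * 0.01 for i in range(101)]   # 1 m tract, 1 cm point spacing
ys = [2.0 * x * x for x in xs]        # noiseless quadratic, a = 2
a, b, c = fit_quadratic(xs, ys)
```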

  5. Skin Temperature Over the Carotid Artery, an Accurate Non-invasive Estimation of Near Core Temperature

    PubMed Central

    Imani, Farsad; Karimi Rouzbahani, Hamid Reza; Goudarzi, Mehrdad; Tarrahi, Mohammad Javad; Ebrahim Soltani, Alireza

    2016-01-01

Background: During anesthesia, continuous body temperature monitoring is essential, especially in children. Anesthesia can increase the risk of loss of body temperature by three to four times. Hypothermia in children results in increased morbidity and mortality. Since the measurement points of the core body temperature are not easily accessible, near core sites, like the rectum, are used. Objectives: The purpose of this study was to measure skin temperature over the carotid artery and compare it with the rectum temperature, in order to propose a model for accurate estimation of near core body temperature. Patients and Methods: In total, 124 patients within the age range of 2 - 6 years, undergoing elective surgery, were selected. The temperatures of the rectum and of the skin over the carotid artery were measured. Then, the patients were randomly divided into two groups (each including 62 subjects), namely the modeling (MG) and validation (VG) groups. First, in the modeling group, the average temperatures of the rectum and skin over the carotid artery were measured separately. The appropriate model was determined according to the significance of the model's coefficients. The obtained model was used to predict the rectum temperature in the second group (VG group). Correlation of the predicted values with the real values (the measured rectum temperature) in the second group was investigated. Also, the difference in the average values of these two groups was examined in terms of significance. Results: In the modeling group, the average rectum and carotid temperatures were 36.47 ± 0.54°C and 35.45 ± 0.62°C, respectively. The final model was obtained as follows: Rectum temperature = 0.561 × Carotid temperature + 16.583. The predicted value was calculated based on the regression model and then compared with the measured rectum value, which showed no significant difference (P = 0.361). Conclusions: The present study was the first research in which rectum temperature was compared with that
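
Applying the reported regression model is straightforward (temperatures in degrees Celsius, coefficients as given in the abstract):

```python
def estimated_rectum_temp_c(carotid_skin_temp_c):
    """The study's fitted model: rectum temperature = 0.561 * carotid skin
    temperature + 16.583 (both in degrees Celsius)."""
    return 0.561 * carotid_skin_temp_c + 16.583

# At the modeling group's mean carotid temperature, the model returns
# the group's mean rectum temperature:
print(round(estimated_rectum_temp_c(35.45), 2))  # 36.47
```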

  6. Uranium mill tailings and risk estimation

    SciTech Connect

    Marks, S.

    1984-04-01

    Work done in estimating projected health effects for persons exposed to mill tailings at vicinity properties is described. The effect of the reassessment of exposures at Hiroshima and Nagasaki on the risk estimates for gamma radiation is discussed. A presentation of current results in the epidemiological study of Hanford workers is included. 2 references. (ACR)

  7. Accurate estimation of forest carbon stocks by 3-D remote sensing of individual trees.

    PubMed

    Omasa, Kenji; Qiu, Guo Yu; Watanuki, Kenichi; Yoshimi, Kenji; Akiyama, Yukihide

    2003-03-15

    Forests are one of the most important carbon sinks on Earth. However, owing to the complex structure, variable geography, and large area of forests, accurate estimation of forest carbon stocks is still a challenge for both site surveying and remote sensing. For these reasons, the Kyoto Protocol requires the establishment of methodologies for estimating the carbon stocks of forests (Kyoto Protocol, Article 5). A possible solution to this challenge is to remotely measure the carbon stocks of every tree in an entire forest. Here, we present a methodology for estimating carbon stocks of a Japanese cedar forest by using a high-resolution, helicopter-borne 3-dimensional (3-D) scanning lidar system that measures the 3-D canopy structure of every tree in a forest. Results show that a digital image (10-cm mesh) of woody canopy can be acquired. The treetop can be detected automatically with reasonable accuracy. Absolute errors in tree height measurement are within 42 cm. Allometric relationships of height to carbon stocks then permit estimation of total carbon storage by measurement of carbon stocks of every tree. Thus, we suggest that our methodology can be used to accurately estimate the carbon stocks of Japanese cedar forests at a stand scale. Periodic measurements will reveal changes in forest carbon stocks. PMID:12680675
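The final aggregation step (lidar-measured height to carbon via allometry, summed over every detected tree) can be sketched as follows; the power-law coefficients `a` and `b` are placeholders, not calibrated values for Japanese cedar.

```python
# Hypothetical allometry: carbon per tree as a power law of lidar-measured
# height, summed over all detected treetops. Coefficients are illustrative.
def tree_carbon_kg(height_m, a=0.5, b=2.0):
    return a * height_m ** b

def stand_carbon_kg(heights):
    """Total carbon stock of a stand from per-tree height measurements."""
    return sum(tree_carbon_kg(h) for h in heights)
```

With real coefficients, periodic lidar surveys of the same stand would turn differences in this sum into carbon-stock change estimates.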

  8. A method to accurately estimate the muscular torques of human wearing exoskeletons by torque sensors.

    PubMed

    Hwang, Beomsoo; Jeon, Doyoung

    2015-01-01

    In exoskeletal robots, quantifying the user's muscular effort is important for recognizing the user's motion intentions and evaluating motor abilities. In this paper, we estimate users' muscular efforts using joint torque sensors, whose measurements contain the dynamic effects of the human body (the inertial, Coriolis, and gravitational torques) as well as the torque produced by active muscular effort. It is therefore important to extract the dynamic effects of the user's limb accurately from the measured torque. The user's limb dynamics are formulated, and a convenient method of identifying user-specific parameters is suggested for estimating the user's muscular torque in robotic exoskeletons. Experiments were carried out on a wheelchair-integrated lower limb exoskeleton, EXOwheel, equipped with torque sensors in the hip and knee joints. The proposed methods were evaluated by 10 healthy participants during body weight-supported gait training. The experimental results show that the torque sensors can estimate the muscular torque accurately under both relaxed and activated muscle conditions. PMID:25860074
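A single-joint sketch of the idea, assuming a simple pendulum-like limb model: subtract the limb's passive dynamics from the joint-sensor reading to isolate active muscular torque. The inertia, mass, and center-of-mass values are illustrative placeholders, not identified user-specific parameters.

```python
import math

# Single-joint sketch: muscular torque = measured torque minus the limb's
# passive (inertial + gravitational) torques. Parameters are illustrative.
def muscular_torque(tau_measured, theta, theta_ddot,
                    inertia=0.12, mass=3.0, com_dist=0.25, g=9.81):
    tau_inertial = inertia * theta_ddot          # acceleration-dependent term
    tau_gravity = mass * g * com_dist * math.sin(theta)  # posture-dependent term
    return tau_measured - tau_inertial - tau_gravity
```

A full implementation would also subtract Coriolis/centrifugal terms, which vanish for this one-joint case.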

  9. Submarine tower escape decompression sickness risk estimation.

    PubMed

    Loveman, G A M; Seddon, E M; Thacker, J C; Stansfield, M R; Jurd, K M

    2014-01-01

    Actions to enhance survival in a distressed submarine (DISSUB) scenario may be guided in part by knowledge of the likely risk of decompression sickness (DCS) should the crew attempt tower escape. A mathematical model for DCS risk estimation has been calibrated against DCS outcome data from 3,738 exposures of either men or goats to raised pressure. Body mass was used to scale DCS risk. The calibration data included more than 1,000 actual or simulated submarine escape exposures and no exposures with substantial staged decompression. Cases of pulmonary barotrauma were removed from the calibration data. The calibrated model was used to estimate the likelihood of DCS occurrence following submarine escape from the United Kingdom Royal Navy tower escape system. Where internal DISSUB pressure remains at ~0.1 MPa, escape from DISSUB depths < 200 meters is estimated to have DCS risk < 6%. Saturation at raised DISSUB pressure markedly increases risk, with > 60% DCS risk predicted for a 200-meter escape from saturation at 0.21 MPa. Using the calibrated model to predict DCS for direct ascent from saturation gives similar risk estimates to other published models. PMID:25109085

  10. Easy and accurate variance estimation of the nonparametric estimator of the partial area under the ROC curve and its application.

    PubMed

    Yu, Jihnhee; Yang, Luge; Vexler, Albert; Hutson, Alan D

    2016-06-15

    The receiver operating characteristic (ROC) curve is a popular technique with applications such as assessing the accuracy of a biomarker in delineating between disease and non-disease groups. A common measure of accuracy of a given diagnostic marker is the area under the ROC curve (AUC). In contrast with the AUC, the partial area under the ROC curve (pAUC) looks only at the area over a certain range of specificities (i.e., true negative rates), and it can often be clinically more relevant than examining the entire ROC curve. The pAUC is commonly estimated based on a U-statistic with a plug-in sample quantile, making the estimator a non-traditional U-statistic. In this article, we propose an accurate and easy method to obtain the variance of the nonparametric pAUC estimator. The proposed method is easy to implement for both a single-biomarker test and the comparison of two correlated biomarkers because it simply adapts the existing variance estimator of U-statistics. We show the accuracy and other advantages of the proposed variance estimation method by comparing it broadly with previously existing methods. Further, we develop an empirical likelihood inference method based on the proposed variance estimator through a simple implementation. In an application, we demonstrate that, depending on whether inference is based on the AUC or the pAUC, we can reach different conclusions about the prognostic ability of the same set of biomarkers. Copyright © 2016 John Wiley & Sons, Ltd. PMID:26790540
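The estimator under study can be sketched as a U-statistic truncated by a plug-in quantile of the control scores. This is a simplified reading of the construction (ties ignored, naive quantile index), not the authors' exact definition.

```python
# Simplified nonparametric pAUC sketch: a U-statistic restricted to control
# scores above a plug-in sample quantile (assumption: higher score = disease).
def pauc(cases, controls, max_fpr):
    """Partial AUC over false-positive rates in [0, max_fpr]."""
    ctrl = sorted(controls)
    n = len(ctrl)
    k = min(int((1 - max_fpr) * n), n - 1)  # plug-in quantile index
    q = ctrl[k]
    m = len(cases)
    hits = sum(1 for x in cases for y in controls if x > y and y >= q)
    return hits / (m * n)
```

With `max_fpr=1.0` the quantile threshold drops to the minimum control score and the statistic reduces to the ordinary AUC.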

  11. Estimating the Effective Permittivity for Reconstructing Accurate Microwave-Radar Images.

    PubMed

    Lavoie, Benjamin R; Okoniewski, Michal; Fear, Elise C

    2016-01-01

    We present preliminary results from a method for estimating the optimal effective permittivity for reconstructing microwave-radar images. Using knowledge of how microwave-radar images are formed, we identify characteristics that are typical of good images and define a fitness function to measure relative image quality. We build a polynomial interpolant of the fitness function in order to identify the most likely permittivity values of the tissue. To make the estimation process more efficient, the polynomial interpolant is constructed using a locally and dimensionally adaptive sampling method that is a novel combination of stochastic collocation and polynomial chaos. Examples using simulated, experimental, and patient data collected with the Tissue Sensing Adaptive Radar system, under development at the University of Calgary, are presented. These examples show how, using our method, accurate images can be reconstructed starting from only a broad estimate of the permittivity range. PMID:27611785
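The interpolate-then-maximize idea can be illustrated in one dimension: sample the fitness at a few candidate permittivities, fit a polynomial (here an exact quadratic through three samples), and take its peak as the permittivity estimate. The fitness values below are synthetic, and the paper's adaptive multi-dimensional interpolant is far more sophisticated than this sketch.

```python
# Fit an exact quadratic y = a*x^2 + b*x + c through three (x, fitness)
# samples and return the x at its vertex (the fitness peak).
def quadratic_peak(pts):
    (x1, y1), (x2, y2), (x3, y3) = pts
    denom = (x1 - x2) * (x1 - x3) * (x2 - x3)
    a = (x3 * (y2 - y1) + x2 * (y1 - y3) + x1 * (y3 - y2)) / denom
    b = (x3 ** 2 * (y1 - y2) + x2 ** 2 * (y3 - y1) + x1 ** 2 * (y2 - y3)) / denom
    return -b / (2 * a)

# synthetic fitness samples at three candidate permittivities
best_permittivity = quadratic_peak([(8.0, 4.0), (9.0, 5.0), (10.0, 4.0)])
```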

  12. Accurate estimation of object location in an image sequence using helicopter flight data

    NASA Technical Reports Server (NTRS)

    Tang, Yuan-Liang; Kasturi, Rangachar

    1994-01-01

    In autonomous navigation, it is essential to obtain a three-dimensional (3D) description of the static environment in which the vehicle is traveling. For a rotorcraft conducting low-altitude flight, this description is particularly useful for obstacle detection and avoidance. In this paper, we address the problem of 3D position estimation for static objects from a monocular sequence of images captured from a low-altitude flying helicopter. Since the environment is static, it is well known that the optical flow in the image will produce a radiating pattern from the focus of expansion. We propose a motion analysis system which utilizes the epipolar constraint to accurately estimate 3D positions of scene objects in a real world image sequence taken from a low-altitude flying helicopter. Results show that this approach gives good estimates of object positions near the rotorcraft's intended flight-path.
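For purely forward translation over a static scene, the radiating-flow geometry gives a closed-form depth estimate: a point's time-to-contact is its distance from the focus of expansion (FOE) divided by its radial flow speed, and the known vehicle speed (e.g. from flight data) converts that to depth. A sketch under those simplifying assumptions; the paper's epipolar-constraint system is more general.

```python
import math

# Depth from radiating optical flow, assuming pure forward translation:
# time-to-contact = (distance from FOE) / (radial flow speed),
# depth = vehicle speed * time-to-contact. Inputs are illustrative.
def depth_from_flow(point, flow, foe, speed):
    rx, ry = point[0] - foe[0], point[1] - foe[1]
    r = math.hypot(rx, ry)
    radial_flow = (flow[0] * rx + flow[1] * ry) / r  # flow projected radially
    return speed * r / radial_flow
```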

  13. Effective Echo Detection and Accurate Orbit Estimation Algorithms for Space Debris Radar

    NASA Astrophysics Data System (ADS)

    Isoda, Kentaro; Sakamoto, Takuya; Sato, Toru

    Orbit estimation of space debris, objects of no inherent value orbiting the earth, is a task that is important for avoiding collisions with spacecraft. The Kamisaibara Spaceguard Center radar system was built in 2004 as the first radar facility in Japan devoted to the observation of space debris. In order to detect smaller debris, coherent integration is effective in improving the SNR (signal-to-noise ratio). However, it is difficult to apply coherent integration to real data because the motions of the targets are unknown. An effective algorithm, which exploits the characteristics of the evaluation function, is proposed for echo detection and orbit estimation of the faint echoes from space debris. Experiments show the proposed algorithm improves SNR by 8.32 dB and estimates orbital parameters accurately enough to allow re-tracking with a single radar.

  14. Parameter Estimation of Ion Current Formulations Requires Hybrid Optimization Approach to Be Both Accurate and Reliable

    PubMed Central

    Loewe, Axel; Wilhelms, Mathias; Schmid, Jochen; Krause, Mathias J.; Fischer, Fathima; Thomas, Dierk; Scholz, Eberhard P.; Dössel, Olaf; Seemann, Gunnar

    2016-01-01

    Computational models of cardiac electrophysiology have provided insights into arrhythmogenesis and paved the way toward tailored therapies in recent years. However, to fully leverage in silico models in future research, these models need to be adapted to reflect pathologies, genetic alterations, or pharmacological effects. A common approach is to leave the structure of established models unaltered and estimate the values of a set of parameters. Today's high-throughput patch clamp data acquisition methods require robust, unsupervised algorithms that estimate parameters both accurately and reliably. In this work, two classes of optimization approaches are evaluated: gradient-based trust-region-reflective and derivative-free particle swarm algorithms. Using synthetic input data and different ion current formulations from the Courtemanche et al. electrophysiological model of human atrial myocytes, we show that neither of the two schemes alone succeeds in meeting all requirements. Sequential combination of the two algorithms improved the performance to some extent, but not satisfactorily. Thus, we propose a novel hybrid approach coupling the two algorithms in each iteration. This hybrid approach yielded very accurate estimates with minimal dependency on the initial guess using synthetic input data for which a ground-truth parameter set exists. When applied to measured data, the hybrid approach yielded the best fit, again with minimal variation. Using the proposed algorithm, a single run is sufficient to estimate the parameters. The degree of superiority over the other investigated algorithms in terms of accuracy and robustness depended on the type of current. In contrast to the non-hybrid approaches, the proposed method proved to be optimal for data of arbitrary signal to noise ratio. The hybrid algorithm proposed in this work provides an important tool to integrate experimental data into computational models both accurately and robustly allowing to assess the often non
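The per-iteration coupling can be caricatured with a toy objective: run one particle-swarm sweep, then refine the swarm's best estimate with a gradient step inside the same loop. This illustrates only the coupling pattern, not the authors' implementation, their trust-region-reflective scheme, or an ion-current objective (the quadratic below is a stand-in).

```python
import random

# Toy hybrid: derivative-free particle swarm + a gradient refinement of the
# swarm's best point inside EACH iteration (the hybrid coupling pattern).
def hybrid_minimize(f, grad, dim, iters=60, swarm=20, seed=1):
    rng = random.Random(seed)
    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(swarm)]
    vel = [[0.0] * dim for _ in range(swarm)]
    pbest = [p[:] for p in pos]
    gbest = min(pbest, key=f)[:]
    for _ in range(iters):
        for i in range(swarm):
            for d in range(dim):  # standard PSO velocity/position update
                vel[i][d] = (0.7 * vel[i][d]
                             + 1.5 * rng.random() * (pbest[i][d] - pos[i][d])
                             + 1.5 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            if f(pos[i]) < f(pbest[i]):
                pbest[i] = pos[i][:]
                if f(pbest[i]) < f(gbest):
                    gbest = pbest[i][:]
        # gradient refinement of the swarm's best: the hybrid coupling
        cand = [x - 0.1 * g for x, g in zip(gbest, grad(gbest))]
        if f(cand) < f(gbest):
            gbest = cand
    return gbest

best = hybrid_minimize(lambda p: sum((x - 2.0) ** 2 for x in p),
                       lambda p: [2.0 * (x - 2.0) for x in p], dim=2)
```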

  15. Accurate reconstruction of viral quasispecies spectra through improved estimation of strain richness

    PubMed Central

    2015-01-01

    Background Estimating the number of different species (richness) in a mixed microbial population has been a main focus in metagenomic research. Existing methods of species richness estimation ride on the assumption that the reads in each assembled contig correspond to only one of the microbial genomes in the population. This assumption and the underlying probabilistic formulations of existing methods are not useful for quasispecies populations where the strains are highly genetically related. The lack of knowledge on the number of different strains in a quasispecies population is observed to hinder the precision of existing Viral Quasispecies Spectrum Reconstruction (QSR) methods due to the uncontrolled reconstruction of a large number of in silico false positives. In this work, we formulated a novel probabilistic method for strain richness estimation specifically targeting viral quasispecies. By using this approach we improved our recently proposed spectrum reconstruction pipeline ViQuaS to achieve higher levels of precision in reconstructed quasispecies spectra without compromising the recall rates. We also discuss how one other existing popular QSR method named ShoRAH can be improved using this new approach. Results On benchmark data sets, our estimation method provided accurate richness estimates (< 0.2 median estimation error) and improved the precision of ViQuaS by 2%-13% and F-score by 1%-9% without compromising the recall rates. We also demonstrate that our estimation method can be used to improve the precision and F-score of ShoRAH by 0%-7% and 0%-5% respectively. Conclusions The proposed probabilistic estimation method can be used to estimate the richness of viral populations with a quasispecies behavior and to improve the accuracy of the quasispecies spectra reconstructed by the existing methods ViQuaS and ShoRAH in the presence of a moderate level of technical sequencing errors. Availability http://sourceforge.net/projects/viquas/ PMID:26678073

  16. [Estimation of risk areas for hepatitis A].

    PubMed

    Braga, Ricardo Cerqueira Campos; Valencia, Luís Iván Ortiz; Medronho, Roberto de Andrade; Escosteguy, Claudia Caminha

    2008-08-01

    This study estimated hepatitis A risk areas in a region of Duque de Caxias, Rio de Janeiro State, Brazil. A cross-sectional study consisting of a hepatitis A serological survey and a household survey was conducted in 19 census tracts. Of these, 11 tracts were selected and 1,298 children from one to ten years of age were included in the study. Geostatistical techniques allowed modeling the spatial continuity of hepatitis A, non-use of filtered drinking water, time since installation of running water, and number of water taps per household, and their spatial estimation through ordinary and indicator kriging. Adjusted models for the outcome and socioeconomic variables were isotropic; risk maps were constructed; cross-validation of the four models was satisfactory. Spatial estimation using the kriging method detected areas with increased risk of hepatitis A, independently of the urban administrative area in which the census tracts were located. PMID:18709215

  17. Intraocular lens power estimation by accurate ray tracing for eyes underwent previous refractive surgeries

    NASA Astrophysics Data System (ADS)

    Yang, Que; Wang, Shanshan; Wang, Kai; Zhang, Chunyu; Zhang, Lu; Meng, Qingyu; Zhu, Qiudong

    2015-08-01

    For normal eyes without a history of any ocular surgery, traditional equations for calculating intraocular lens (IOL) power, such as SRK-T, Holladay, Haigis, and SRK-II, are all relatively accurate. However, for eyes that have undergone refractive surgery, such as LASIK, or eyes diagnosed with keratoconus, these equations may cause significant postoperative refractive error, which may lead to poor satisfaction after cataract surgery. Although some methods have been proposed to solve this problem, such as the Haigis-L equation[1], or using preoperative data (data before LASIK) to estimate the K value[2], no precise equations are available for these eyes. Here, we introduce a novel intraocular lens power estimation method by accurate ray tracing with the optical design software ZEMAX. Instead of using a traditional regression formula, we adopt the exact measured corneal elevation distribution, central corneal thickness, anterior chamber depth, axial length, and estimated effective lens plane as the input parameters. The calculated intraocular lens power for a patient with keratoconus and for another LASIK postoperative patient agreed very well with their visual outcomes after cataract surgery.

  18. Accurate and quantitative polarization-sensitive OCT by unbiased birefringence estimator with noise-stochastic correction

    NASA Astrophysics Data System (ADS)

    Kasaragod, Deepa; Sugiyama, Satoshi; Ikuno, Yasushi; Alonso-Caneiro, David; Yamanari, Masahiro; Fukuda, Shinichi; Oshika, Tetsuro; Hong, Young-Joo; Li, En; Makita, Shuichi; Miura, Masahiro; Yasuno, Yoshiaki

    2016-03-01

    Polarization sensitive optical coherence tomography (PS-OCT) is a functional extension of OCT that contrasts the polarization properties of tissues. It has been applied to ophthalmology, cardiology, and other fields. Proper quantitative imaging is required for widespread clinical utility. However, the conventional method of averaging to improve the signal to noise ratio (SNR) and the contrast of phase retardation (or birefringence) images introduces a noise bias offset from the true value. This bias reduces the effectiveness of birefringence contrast for quantitative studies. Although coherent averaging of Jones matrix tomography has been widely utilized and has improved image quality, the fundamental limitation of the nonlinear dependency of phase retardation and birefringence on SNR was not overcome, so the birefringence obtained by PS-OCT was still not accurate enough for quantitative imaging. The nonlinear effect of SNR on phase retardation and birefringence measurement was previously formulated in detail for Jones matrix OCT (JM-OCT) [1]. Based on this, we had developed a maximum a-posteriori (MAP) estimator, and quantitative birefringence imaging was demonstrated [2]. However, this first version of the estimator had a theoretical shortcoming: it did not take into account the stochastic nature of the SNR of the OCT signal. In this paper, we present an improved version of the MAP estimator which takes into account the stochastic property of SNR. This estimator uses a probability distribution function (PDF) of the true local retardation, which is proportional to birefringence, under a specific set of measurements of the birefringence and SNR. The PDF was pre-computed by a Monte-Carlo (MC) simulation based on the mathematical model of JM-OCT before the measurement. A comparison between this new MAP estimator, our previous MAP estimator [2], and the standard mean estimator is presented. The comparisons are performed both by numerical simulation and in vivo measurements of anterior and

  19. Regularization Based Iterative Point Match Weighting for Accurate Rigid Transformation Estimation.

    PubMed

    Liu, Yonghuai; De Dominicis, Luigi; Wei, Baogang; Chen, Liang; Martin, Ralph R

    2015-09-01

    Feature extraction and matching (FEM) for 3D shapes finds numerous applications in computer graphics and vision for object modeling, retrieval, morphing, and recognition. However, unavoidable incorrect matches lead to inaccurate estimation of the transformation relating different datasets. Inspired by AdaBoost, this paper proposes a novel iterative re-weighting method to tackle the challenging problem of evaluating point matches established by typical FEM methods. Weights are used to indicate the degree of belief that each point match is correct. Our method has three key steps: (i) estimation of the underlying transformation using weighted least squares, (ii) penalty parameter estimation via minimization of the weighted variance of the matching errors, and (iii) weight re-estimation taking into account both matching errors and information learnt in previous iterations. A comparative study, based on real shapes captured by two laser scanners, shows that the proposed method outperforms four other state-of-the-art methods in terms of evaluating point matches between overlapping shapes established by two typical FEM methods, resulting in more accurate estimates of the underlying transformation. This improved transformation can be used to better initialize the iterative closest point algorithm and its variants, making 3D shape registration more likely to succeed. PMID:26357287
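Step (i), the weighted least-squares transformation estimate, has a closed form in 2-D that is easy to sketch (the paper's 3-D case uses an SVD instead). The point sets and uniform weights below are illustrative; in the actual method the weights encode the belief that each match is correct and are re-estimated every iteration.

```python
import math

# Weighted least-squares rigid alignment in 2-D: weighted centroids, then a
# closed-form rotation angle from the weighted dot/cross sums, then the
# translation that maps the source centroid onto the destination centroid.
def weighted_rigid_2d(src, dst, w):
    sw = sum(w)
    cs = [sum(wi * p[k] for wi, p in zip(w, src)) / sw for k in (0, 1)]
    cd = [sum(wi * q[k] for wi, q in zip(w, dst)) / sw for k in (0, 1)]
    sdot = scross = 0.0
    for wi, p, q in zip(w, src, dst):
        px, py = p[0] - cs[0], p[1] - cs[1]
        qx, qy = q[0] - cd[0], q[1] - cd[1]
        sdot += wi * (px * qx + py * qy)
        scross += wi * (px * qy - py * qx)
    theta = math.atan2(scross, sdot)         # weighted LS rotation angle
    c, s = math.cos(theta), math.sin(theta)
    t = (cd[0] - (c * cs[0] - s * cs[1]),    # translation after rotation
         cd[1] - (s * cs[0] + c * cs[1]))
    return theta, t

# matches related by a 90-degree rotation plus translation (1, 2)
theta, t = weighted_rigid_2d([(0, 0), (1, 0), (0, 1)],
                             [(1, 2), (1, 3), (0, 2)],
                             [1.0, 1.0, 1.0])
```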

  20. Spatial ascariasis risk estimation using socioeconomic variables.

    PubMed

    Valencia, Luis Iván Ortiz; Fortes, Bruno de Paula Menezes Drumond; Medronho, Roberto de Andrade

    2005-12-01

    Frequently, disease incidence is mapped as area data, for example, census tracts, districts or states. Spatial disease incidence can be highly heterogeneous inside these areas. Ascariasis is a highly prevalent disease, which is associated with poor sanitation and hygiene. Geostatistics was applied to model the spatial distribution of Ascariasis risk and socioeconomic risk events in a poor community in Rio de Janeiro, Brazil. Data were gathered from a coproparasitologic and a domiciliary survey of 1550 children aged 1-9. Ascariasis risk and socioeconomic risk events were spatially estimated using Indicator Kriging. Cokriging models with a Linear Model of Coregionalization incorporating one socioeconomic variable were implemented. Less than four years of schooling for the housewife, non-use of a home water filter, a household density greater than one, and a household income lower than one Brazilian minimum wage all increased the risk of Ascariasis. Cokriging improved spatial estimation of Ascariasis risk areas when compared to Indicator Kriging and detected more Ascariasis very-high-risk areas than the GIS Overlay method. PMID:16506435
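The kriging machinery behind such risk maps can be sketched in one dimension: solve the ordinary-kriging system for weights under an assumed variogram, then form the weighted estimate at an unsampled location. The linear variogram and the two data points are illustrative, not the study's fitted model.

```python
# Minimal ordinary-kriging sketch (1-D, linear variogram, no nugget).
def gamma(h):
    return 0.5 * h  # assumed variogram

def solve(a, b):
    """Gauss-Jordan elimination with partial pivoting."""
    n = len(a)
    m = [row[:] + [bi] for row, bi in zip(a, b)]
    for c in range(n):
        piv = max(range(c, n), key=lambda r: abs(m[r][c]))
        m[c], m[piv] = m[piv], m[c]
        for r in range(n):
            if r != c and m[r][c]:
                f = m[r][c] / m[c][c]
                m[r] = [x - f * y for x, y in zip(m[r], m[c])]
    return [m[i][n] / m[i][i] for i in range(n)]

def ordinary_krige(xs, zs, x0):
    """Estimate z at x0 from samples (xs, zs) via the kriging system."""
    n = len(xs)
    a = [[gamma(abs(xs[i] - xs[j])) for j in range(n)] + [1.0] for i in range(n)]
    a.append([1.0] * n + [0.0])          # unbiasedness (Lagrange) row
    b = [gamma(abs(x - x0)) for x in xs] + [1.0]
    lam = solve(a, b)[:n]                # kriging weights
    return sum(l * z for l, z in zip(lam, zs))
```

Indicator kriging applies the same system to 0/1-transformed data (e.g. infected / not infected), so the estimate reads as a local risk probability.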

  1. The accurate estimation of physicochemical properties of ternary mixtures containing ionic liquids via artificial neural networks.

    PubMed

    Cancilla, John C; Díaz-Rodríguez, Pablo; Matute, Gemma; Torrecilla, José S

    2015-02-14

    The estimation of the density and refractive index of ternary mixtures comprising the ionic liquid (IL) 1-butyl-3-methylimidazolium tetrafluoroborate, 2-propanol, and water at a fixed temperature of 298.15 K has been attempted through artificial neural networks. The obtained results indicate that the selection of this mathematical approach was a well-suited option. The mean prediction errors obtained, after simulating with a dataset never involved in the training process of the model, were 0.050% and 0.227% for refractive index and density estimation, respectively. These accurate results, attained using only the composition of the dissolutions (mass fractions), imply that ternary mixtures similar to the one analyzed can most likely be evaluated easily utilizing this algorithmic tool. In addition, different chemical processes involving ILs can be monitored precisely, and the purity of the compounds in the studied mixtures can be indirectly assessed thanks to the high accuracy of the model. PMID:25583241

  2. Toward an Accurate Estimate of the Exfoliation Energy of Black Phosphorus: A Periodic Quantum Chemical Approach.

    PubMed

    Sansone, Giuseppe; Maschio, Lorenzo; Usvyat, Denis; Schütz, Martin; Karttunen, Antti

    2016-01-01

    The black phosphorus (black-P) crystal is formed of covalently bound layers of phosphorene stacked together by weak van der Waals interactions. An experimental measurement of the exfoliation energy of black-P is not available presently, making theoretical studies the most important source of information for the optimization of phosphorene production. Here, we provide an accurate estimate of the exfoliation energy of black-P on the basis of multilevel quantum chemical calculations, which include the periodic local Møller-Plesset perturbation theory of second order, augmented by higher-order corrections, which are evaluated with finite clusters mimicking the crystal. Very similar results are also obtained by density functional theory with the D3-version of Grimme's empirical dispersion correction. Our estimate of the exfoliation energy for black-P of -151 meV/atom is substantially larger than that of graphite, suggesting the need for different strategies to generate isolated layers for these two systems. PMID:26651397

  3. IMPROVED RISK ESTIMATES FOR CARBON TETRACHLORIDE

    EPA Science Inventory

    Carbon tetrachloride (CCl4) has been used extensively within the Department of Energy (DOE) nuclear weapons facilities. Costs associated with cleanup of CCl4 at DOE facilities are driven by current cancer risk estimates which assume CCl4 is a genotoxic carcinogen. However, a grow...

  4. Reconstruction of financial networks for robust estimation of systemic risk

    NASA Astrophysics Data System (ADS)

    Mastromatteo, Iacopo; Zarinelli, Elia; Marsili, Matteo

    2012-03-01

    In this paper we estimate the propagation of liquidity shocks through interbank markets when the information about the underlying credit network is incomplete. We show that techniques such as maximum entropy currently used to reconstruct credit networks severely underestimate the risk of contagion by assuming a trivial (fully connected) topology, a type of network structure which can be very different from the one empirically observed. We propose an efficient message-passing algorithm to explore the space of possible network structures and show that a correct estimation of the network degree of connectedness leads to more reliable estimations for systemic risk. Such an algorithm is also able to produce maximally fragile structures, providing a practical upper bound for the risk of contagion when the actual network structure is unknown. We test our algorithm on ensembles of synthetic data encoding some features of real financial networks (sparsity and heterogeneity), finding that more accurate estimations of risk can be achieved. Finally we find that this algorithm can be used to control the amount of information that regulators need to require from banks in order to sufficiently constrain the reconstruction of financial networks.
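The maximum-entropy baseline the authors criticize is commonly implemented as iterative proportional fitting: start from a dense matrix (zero diagonal, since banks do not lend to themselves) and alternately rescale rows and columns to match each bank's reported total lending and borrowing. A sketch with hypothetical totals; note the result is close to fully connected, which is exactly the topology assumption the paper argues underestimates contagion risk.

```python
# Iterative proportional fitting (the dense max-entropy reconstruction):
# rescale rows to match lending totals, then columns to match borrowing
# totals, and repeat until the matrix converges. Totals are hypothetical.
def ipf(row_totals, col_totals, iters=200):
    n = len(row_totals)
    x = [[0.0 if i == j else 1.0 for j in range(n)] for i in range(n)]
    for _ in range(iters):
        for i in range(n):                       # match row (lending) totals
            s = sum(x[i])
            if s:
                x[i] = [v * row_totals[i] / s for v in x[i]]
        for j in range(n):                       # match column (borrowing) totals
            s = sum(x[i][j] for i in range(n))
            if s:
                for i in range(n):
                    x[i][j] *= col_totals[j] / s
    return x

links = ipf([3.0, 2.0, 1.0], [2.0, 2.0, 2.0])
```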

  5. Secondary prevention and estimation of fracture risk.

    PubMed

    Mitchell, Paul James; Chem, C

    2013-12-01

    The key questions addressed in this chapter are: • How can individual risk of fracture be best estimated? • What is the best system to prevent a further fracture? • How to implement systems for preventing further fractures? Absolute fracture risk calculators (FRCs) provide a means to estimate an individual's future fracture risk. FRCs are widely available and provide clinicians and patients a platform to discuss the need for intervention to prevent fragility fractures. Despite availability of effective osteoporosis medicines for almost two decades, most patients presenting with new fragility fractures do not receive secondary preventive care. The Fracture Liaison Service (FLS) model has been shown in a number of countries to eliminate the care gap in a clinically and cost-effective manner. Leading international and national organisations have developed comprehensive resources and/or national strategy documents to provide guidance on implementation of FLS in local, regional and national health-care systems. PMID:24836336

  6. Estimating Terrorist Risk with Possibility Theory

    SciTech Connect

    J.L. Darby

    2004-11-30

    This report summarizes techniques that use possibility theory to estimate the risk of terrorist acts. These techniques were developed under the sponsorship of the Department of Homeland Security (DHS) as part of the National Infrastructure Simulation Analysis Center (NISAC) project. The techniques have been used to estimate the risk of various terrorist scenarios to support NISAC analyses during 2004. The techniques are based on the Logic Evolved Decision (LED) methodology developed over the past few years by Terry Bott and Steve Eisenhawer at LANL. [LED] The LED methodology involves the use of fuzzy sets, possibility theory, and approximate reasoning. LED captures the uncertainty due to vagueness and imprecision that is inherent in the fidelity of the information available for terrorist acts; probability theory cannot capture these uncertainties. This report does not address the philosophy supporting the development of nonprobabilistic approaches, and it does not discuss possibility theory in detail. The references provide a detailed discussion of these subjects. [Shafer] [Klir and Yuan] [Dubois and Prade] Suffice to say that these approaches were developed to address types of uncertainty that cannot be addressed by a probability measure. An earlier report discussed in detail the problems with using a probability measure to evaluate terrorist risk. [Darby Methodology]. Two related techniques are discussed in this report: (1) a numerical technique, and (2) a linguistic technique. The numerical technique uses traditional possibility theory applied to crisp sets, while the linguistic technique applies possibility theory to fuzzy sets. Both of these techniques as applied to terrorist risk for NISAC applications are implemented in software called PossibleRisk. The techniques implemented in PossibleRisk were developed specifically for use in estimating terrorist risk for the NISAC program. The LEDTools code can be used to perform the same linguistic evaluation as
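The core possibility-theoretic quantities are simple to state: given a possibility distribution over outcomes, the possibility and necessity of an event bracket its (unknown) probability, which is how vague and imprecise evidence is represented without committing to a single probability measure. A sketch with a hypothetical consequence scale, not drawn from PossibleRisk:

```python
# Possibility and necessity of an event under a possibility distribution.
# For any event A: necessity(A) <= P(A) <= possibility(A).
def possibility(dist, event):
    return max(dist[x] for x in event)

def necessity(dist, event):
    complement = set(dist) - set(event)
    return 1.0 - max(dist[x] for x in complement) if complement else 1.0

# hypothetical possibility distribution over consequence levels
dist = {'low': 1.0, 'moderate': 0.7, 'severe': 0.3}
```

Here the evidence only bounds P(severe) between 0.0 and 0.3, reflecting the imprecision that a single probability number would hide.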

  7. Lamb mode selection for accurate wall loss estimation via guided wave tomography

    SciTech Connect

    Huthwaite, P.; Ribichini, R.; Lowe, M. J. S.; Cawley, P.

    2014-02-18

    Guided wave tomography offers a method to accurately quantify wall thickness losses in pipes and vessels caused by corrosion. This is achieved using ultrasonic waves transmitted over distances of approximately 1-2 m, which are measured by an array of transducers and then used to reconstruct a map of wall thickness throughout the inspected region. To achieve accurate estimations of remnant wall thickness, it is vital that a suitable Lamb mode is chosen. This paper presents a detailed evaluation of the fundamental modes, S0 and A0, which are of primary interest in guided wave tomography thickness estimates since the higher order modes do not exist at all thicknesses, comparing their performance using both numerical and experimental data while considering a range of challenging phenomena. The sensitivity of A0 to thickness variations was shown to be superior to that of S0; however, the attenuation of A0 when a liquid loading was present was much higher than that of S0. A0 was also less sensitive than S0 to the presence of coatings on the surface.

  9. Removing the thermal component from heart rate provides an accurate VO2 estimation in forest work.

    PubMed

    Dubé, Philippe-Antoine; Imbeau, Daniel; Dubeau, Denise; Lebel, Luc; Kolus, Ahmet

    2016-05-01

    Heart rate (HR) was monitored continuously in 41 forest workers performing brushcutting or tree planting work. Ten-minute seated rest periods were imposed during the workday to estimate the HR thermal component (ΔHRT) per Vogt et al. (1970, 1973). VO2 was measured using a portable gas analyzer during a morning submaximal step-test conducted at the work site, during a work bout over the course of the day (range: 9-74 min), and during an ensuing 10-min rest pause taken at the worksite. The VO2 estimates from measured HR and from corrected HR (thermal component removed) were compared with the VO2 measured during work and rest. Varied levels of the HR thermal component (ΔHRTavg range: 0-38 bpm) were observed, originating from a wide range of ambient thermal conditions, thermal clothing insulation worn, and physical load exerted during work. Using raw HR significantly overestimated measured work VO2 by 30% on average (range: 1%-64%), and the HR thermal component explained 74% of the VO2 prediction error variance. VO2 estimated from corrected HR was not statistically different from measured VO2. Work VO2 can therefore be estimated accurately in the presence of thermal stress using Vogt et al.'s method, which can be implemented easily by the practitioner with inexpensive instruments. PMID:26851474
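    The correction described above reduces to a two-step calculation: fit a linear HR-to-VO2 calibration from the step-test, then subtract the thermal component from work HR before applying it. The sketch below is our own illustration of that idea, not the study's code; all numbers are hypothetical.

```python
import numpy as np

def fit_hr_vo2(hr_cal, vo2_cal):
    """Least-squares linear calibration VO2 = a*HR + b from step-test data."""
    a, b = np.polyfit(hr_cal, vo2_cal, 1)
    return a, b

def estimate_work_vo2(hr_work, delta_hr_thermal, a, b):
    """Estimate work VO2 after removing the thermal component from HR."""
    hr_corrected = np.asarray(hr_work, float) - delta_hr_thermal
    return a * hr_corrected + b

# Hypothetical step-test calibration (HR in bpm, VO2 in mL/kg/min):
a, b = fit_hr_vo2([90, 110, 130, 150], [12, 18, 24, 30])
# Work HR of 140-145 bpm with a 15 bpm thermal component removed:
vo2 = estimate_work_vo2([140, 145], delta_hr_thermal=15, a=a, b=b)
```

Applying the calibration to the raw rather than the corrected HR reproduces the overestimation the study reports.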

  10. Accurate Estimation of the Intrinsic Dimension Using Graph Distances: Unraveling the Geometric Complexity of Datasets

    PubMed Central

    Granata, Daniele; Carnevale, Vincenzo

    2016-01-01

    The collective behavior of a large number of degrees of freedom can often be described by a handful of variables. This observation justifies the use of dimensionality reduction approaches to model complex systems and motivates the search for a small set of relevant “collective” variables. Here, we analyze this issue by focusing on the optimal number of variables needed to capture the salient features of a generic dataset and develop a novel estimator for the intrinsic dimension (ID). By approximating geodesics with minimum distance paths on a graph, we analyze the distribution of pairwise distances around the maximum and exploit its dependency on the dimensionality to obtain an ID estimate. We show that the estimator does not depend on the shape of the intrinsic manifold and is highly accurate, even for exceedingly small sample sizes. We apply the method to several relevant datasets from image recognition databases and protein multiple sequence alignments and discuss possible interpretations for the estimated dimension in light of the correlations among input variables and of the information content of the dataset. PMID:27510265
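    The core step of the estimator, approximating geodesics by shortest paths on a neighborhood graph, can be sketched with standard tools. This is our own minimal illustration, not the authors' implementation; the k-nearest-neighbor graph construction and the value of k are assumptions.

```python
import numpy as np
from scipy.spatial import distance_matrix
from scipy.sparse.csgraph import shortest_path

def graph_geodesics(points, k=5):
    """Approximate geodesic distances by shortest paths on a kNN graph."""
    d = distance_matrix(points, points)
    adj = np.zeros_like(d)
    for i in range(len(points)):
        nbrs = np.argsort(d[i])[1:k + 1]   # k nearest, excluding the point itself
        adj[i, nbrs] = d[i, nbrs]
    adj = np.maximum(adj, adj.T)           # symmetrise the graph
    return shortest_path(adj, directed=False)

# Points on a 1D manifold (a semicircular curve) embedded in 2D:
t = np.linspace(0, np.pi, 50)
pts = np.c_[np.cos(t), np.sin(t)]
geo = graph_geodesics(pts, k=3)
# The graph distance between the endpoints approaches the arc length (pi),
# while the straight-line (chord) distance is only 2.
```

On such data the distribution of pairwise graph distances reflects the one-dimensional geometry of the curve rather than the two-dimensional embedding space, which is the dependency the ID estimator exploits.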

  11. Hybridization modeling of oligonucleotide SNP arrays for accurate DNA copy number estimation

    PubMed Central

    Wan, Lin; Sun, Kelian; Ding, Qi; Cui, Yuehua; Li, Ming; Wen, Yalu; Elston, Robert C.; Qian, Minping; Fu, Wenjiang J

    2009-01-01

    Affymetrix SNP arrays have been widely used for single-nucleotide polymorphism (SNP) genotype calling and DNA copy number variation inference. Although numerous methods have achieved high accuracy in these fields, most studies have paid little attention to the modeling of hybridization of probes to off-target allele sequences, which can affect the accuracy greatly. In this study, we address this issue and demonstrate that hybridization with mismatch nucleotides (HWMMN) occurs in all SNP probe-sets and has a critical effect on the estimation of allelic concentrations (ACs). We study sequence binding through binding free energy and then binding affinity, and develop a probe intensity composite representation (PICR) model. The PICR model allows the estimation of ACs at a given SNP through statistical regression. Furthermore, we demonstrate with cell-line data of known true copy numbers that the PICR model can achieve reasonable accuracy in copy number estimation at a single SNP locus, by using the ratio of the estimated AC of each sample to that of the reference sample, and can reveal subtle genotype structure of SNPs at abnormal loci. We also demonstrate with HapMap data that the PICR model yields accurate SNP genotype calls consistently across samples, laboratories and even across array platforms. PMID:19586935

  12. Accurate Estimation of the Intrinsic Dimension Using Graph Distances: Unraveling the Geometric Complexity of Datasets.

    PubMed

    Granata, Daniele; Carnevale, Vincenzo

    2016-01-01

    The collective behavior of a large number of degrees of freedom can often be described by a handful of variables. This observation justifies the use of dimensionality reduction approaches to model complex systems and motivates the search for a small set of relevant "collective" variables. Here, we analyze this issue by focusing on the optimal number of variables needed to capture the salient features of a generic dataset and develop a novel estimator for the intrinsic dimension (ID). By approximating geodesics with minimum distance paths on a graph, we analyze the distribution of pairwise distances around the maximum and exploit its dependency on the dimensionality to obtain an ID estimate. We show that the estimator does not depend on the shape of the intrinsic manifold and is highly accurate, even for exceedingly small sample sizes. We apply the method to several relevant datasets from image recognition databases and protein multiple sequence alignments and discuss possible interpretations for the estimated dimension in light of the correlations among input variables and of the information content of the dataset. PMID:27510265

  13. Methods for accurate estimation of net discharge in a tidal channel

    USGS Publications Warehouse

    Simpson, M.R.; Bland, R.

    2000-01-01

    Accurate estimates of net residual discharge in tidally affected rivers and estuaries are possible because of recently developed ultrasonic discharge measurement techniques. Previous discharge estimates using conventional mechanical current meters and methods based on stage/discharge relations or water slope measurements often yielded errors as great as or greater than the computed residual discharge. Ultrasonic measurement methods consist of: 1) the use of ultrasonic instruments for the measurement of a representative 'index' velocity used for in situ estimation of mean water velocity and 2) the use of the acoustic Doppler discharge measurement system to calibrate the index velocity measurement data. The methods used to calibrate (rate) the index velocity to the channel velocity measured using the Acoustic Doppler Current Profiler are the most critical factors affecting the accuracy of net discharge estimation. The index velocity first must be related to mean channel velocity and then used to calculate instantaneous channel discharge. Finally, discharge is low-pass filtered to remove the effects of the tides. An ultrasonic velocity meter discharge-measurement site in a tidally affected region of the Sacramento-San Joaquin Rivers was used to study the accuracy of the index velocity calibration procedure. Calibration data, consisting of ultrasonic velocity meter index velocity and concurrent acoustic Doppler discharge measurement data, were collected during three time periods: two sets of data during spring tides (monthly maximum tidal current) and one set during a neap tide (monthly minimum tidal current). The relative magnitudes of instrumental errors, acoustic Doppler discharge measurement errors, and calibration errors were evaluated. Calibration error was found to be the most significant source of error in estimating net discharge.
Using a comprehensive calibration method, net discharge estimates developed from the three
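    The rating-and-filtering chain described above (index velocity → rated mean channel velocity → instantaneous discharge → low-pass filter) can be sketched as follows. This is a toy illustration with synthetic numbers, not USGS code; a simple moving average stands in for the tidal filter used in practice (e.g., a Godin filter).

```python
import numpy as np

def rate_index_velocity(v_index_cal, v_adcp_cal):
    """Linear rating of index velocity against ADCP mean channel velocity."""
    return np.polyfit(v_index_cal, v_adcp_cal, 1)   # (slope, intercept)

def net_discharge(v_index, area, rating, window):
    """Instantaneous discharge from the rated velocity, then low-pass filtered."""
    a, b = rating
    q = (a * np.asarray(v_index) + b) * area        # Q = V_mean * A
    kernel = np.ones(window) / window               # crude moving-average filter
    return np.convolve(q, kernel, mode='valid')

# Hypothetical calibration pairs (index velocity vs ADCP velocity, m/s):
rating = rate_index_velocity([0.2, 0.5, 0.9], [0.25, 0.55, 0.95])
# Synthetic hourly record: 0.1 m/s residual flow plus an M2 tidal oscillation
t = np.arange(100)
v_index = 0.1 + 0.8 * np.sin(2 * np.pi * t / 12.42)
q_net = net_discharge(v_index, area=500.0, rating=rating, window=25)
# q_net hovers near the residual discharge once the tide is filtered out
```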

  14. MIDAS robust trend estimator for accurate GPS station velocities without step detection

    NASA Astrophysics Data System (ADS)

    Blewitt, Geoffrey; Kreemer, Corné; Hammond, William C.; Gazeaux, Julien

    2016-03-01

    Automatic estimation of velocities from GPS coordinate time series is becoming required to cope with the exponentially increasing flood of available data, but problems detectable to the human eye are often overlooked. This motivates us to find an automatic and accurate estimator of trend that is resistant to common problems such as step discontinuities, outliers, seasonality, skewness, and heteroscedasticity. Developed here, Median Interannual Difference Adjusted for Skewness (MIDAS) is a variant of the Theil-Sen median trend estimator, for which the ordinary version is the median of slopes v_ij = (x_j - x_i)/(t_j - t_i) computed between all data pairs j > i. For normally distributed data, Theil-Sen and least squares trend estimates are statistically identical, but unlike least squares, Theil-Sen is resistant to undetected data problems. To mitigate both seasonality and step discontinuities, MIDAS selects data pairs separated by 1 year. This condition is relaxed for time series with gaps so that all data are used. Slopes from data pairs spanning a step function produce one-sided outliers that can bias the median. To reduce bias, MIDAS removes outliers and recomputes the median. MIDAS also computes a robust and realistic estimate of trend uncertainty. Statistical tests using GPS data in the rigid North American plate interior show ±0.23 mm/yr root-mean-square (RMS) accuracy in horizontal velocity. In blind tests using synthetic data, MIDAS velocities have an RMS accuracy of ±0.33 mm/yr horizontal, ±1.1 mm/yr up, with a 5th percentile range smaller than all 20 automatic estimators tested. Considering its general nature, MIDAS has the potential for broader application in the geosciences.
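    The pairwise-slope idea can be sketched in a few lines. This is a toy illustration only, not the authors' algorithm: it implements the ordinary Theil-Sen median and the one-year pairing, but omits MIDAS's outlier-removal pass and its uncertainty estimate; all function names are ours.

```python
import numpy as np

def theil_sen(t, x):
    """Ordinary Theil-Sen trend: median of slopes over all data pairs j > i."""
    t, x = np.asarray(t, float), np.asarray(x, float)
    i, j = np.triu_indices(len(t), k=1)
    return np.median((x[j] - x[i]) / (t[j] - t[i]))

def midas_like(t, x, tol=0.01):
    """MIDAS-style trend: median slope over pairs separated by ~1 year
    (t in years). Sketch only -- full MIDAS additionally removes outlier
    slopes, recomputes the median and reports a robust uncertainty."""
    t, x = np.asarray(t, float), np.asarray(x, float)
    slopes = []
    for i in range(len(t)):
        sep = np.abs((t - t[i]) - 1.0)
        j = int(np.argmin(sep))
        if sep[j] < tol:
            slopes.append((x[j] - x[i]) / (t[j] - t[i]))
    return np.median(slopes)

# Weekly samples over 3 years: 2 mm/yr trend plus an annual seasonal term.
t = np.arange(0, 3, 1 / 52.0)
x = 2.0 * t + 0.5 * np.sin(2 * np.pi * t)
trend = midas_like(t, x)   # one-year pairs cancel the seasonal term exactly
```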

  15. High-Resolution Tsunami Inundation Simulations Based on Accurate Estimations of Coastal Waveforms

    NASA Astrophysics Data System (ADS)

    Oishi, Y.; Imamura, F.; Sugawara, D.; Furumura, T.

    2015-12-01

    We evaluate in detail the accuracy of high-resolution tsunami inundation simulations using the actual observational data of the 2011 Tohoku-Oki earthquake (Mw9.0) and investigate methodologies to improve the simulation accuracy. Due to the recent development of parallel computing technologies, high-resolution tsunami inundation simulations are conducted more commonly than before. To evaluate how accurately these simulations can reproduce inundation processes, we test several types of simulation configurations on a parallel computer, where we can utilize the observational data (e.g., offshore and coastal waveforms and inundation properties) recorded during the Tohoku-Oki earthquake. Before discussing the accuracy of inundation processes on land, the incident waves at coastal sites must be estimated accurately. However, for megathrust earthquakes, it is difficult to find a tsunami source that provides accurate estimations of tsunami waveforms at every coastal site, because of the complex spatiotemporal distribution of the source and the limitations of observation. To overcome this issue, we employ a site-specific source inversion approach that increases the estimation accuracy within a specific coastal site by applying appropriate weighting to the observational data in the inversion process. We applied our source inversion technique to the Tohoku tsunami and conducted inundation simulations using 5-m resolution digital elevation model (DEM) data for the coastal areas around Miyako Bay and Sendai Bay. The estimated waveforms at the coastal wave gauges of these bays agree well with the observed waveforms. However, the simulations overestimate the inundation extent, indicating the need to improve the inundation model. We find that the value of Manning's roughness coefficient should be modified from the often-used value of n = 0.025 to n = 0.033 to obtain proper results at both cities. In this presentation, the simulation results with several

  16. How to estimate your tolerance for risk

    SciTech Connect

    Mackay, J.A.

    1996-12-31

    Risk tolerance is used to calculate the Risk Adjusted Value (RAV) of a proposed investment. The RAV incorporates both the expected value and the risk attitude for a particular investment, taking into consideration your concern for catastrophic financial loss as well as the chance of success, the cost and the value if successful. Uncertainty can be incorporated into all of the above variables. Often a project is more valuable to a corporation if a partial working interest is taken rather than the entire working interest; the RAV can be used to calculate the optimum working interest and the value of that diversification. To estimate the Apparent Risk Tolerance (ART) of an individual, division or corporation, several methods can be employed: (1) ART can be calculated from the working interests selected in prior investment decisions. (2) ART can be estimated from a selection of working interests by the decision maker in a proposed portfolio of projects. (3) ART can be approximated from data released to the Securities and Exchange Commission (SEC) in the annual 10-K supplements (for both your company and possible partners). (4) ART can be assigned based on corporate size, budget, or activity. Examples are provided for the various methods to identify risk tolerance and apply it in making optimum working interest calculations for individual projects and portfolios.
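    One common closed form for the RAV is Cozzolino's exponential-utility formula; the sketch below uses it to show how an optimum partial working interest arises. The choice of formula, the variable names and the numbers are our assumptions for illustration, not taken from the abstract.

```python
import numpy as np

def rav(working_interest, p_success, value, cost, risk_tolerance):
    """Risk-adjusted value under exponential utility (Cozzolino-style formula).
    value = NPV if successful, cost = loss if dry, both at 100% working interest."""
    w, p, r = working_interest, p_success, risk_tolerance
    return -r * np.log(p * np.exp(-w * value / r) + (1 - p) * np.exp(w * cost / r))

def optimal_working_interest(p_success, value, cost, risk_tolerance):
    """Working interest in [0, 1] that maximises the RAV (grid-search sketch)."""
    w = np.linspace(0.0, 1.0, 1001)
    return w[np.argmax(rav(w, p_success, value, cost, risk_tolerance))]

# Hypothetical prospect: 30% chance of a $50M success, $10M dry-hole cost,
# apparent risk tolerance of $20M:
w_opt = optimal_working_interest(0.3, 50.0, 10.0, 20.0)
```

For this hypothetical prospect the expected monetary value is positive, yet the RAV of a 100% interest is negative, so a partial interest (roughly a quarter here) maximises the risk-adjusted value, which is exactly the diversification effect the abstract describes.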

  17. Risk Estimation Methodology for Launch Accidents.

    SciTech Connect

    Clayton, Daniel James; Lipinski, Ronald J.; Bechtel, Ryan D.

    2014-02-01

    As compact, lightweight power sources with reliable, long lives, Radioisotope Power Systems (RPSs) have made space missions to explore the solar system possible. Because hazardous material can be released during a launch accident, the potential health risk of an accident must be quantified so that appropriate launch approval decisions can be made. One part of the risk estimation involves modeling the response of the RPS to potential accident environments. Because deterministically modeling the full RPS response over the relevant dynamic variables is too complex, the evaluation is performed stochastically with a Monte Carlo simulation. The potential consequences can be determined by modeling the transport of the hazardous material in the environment and through human biological pathways. The consequence analysis results are summed and weighted by appropriate likelihood values to give a collection of probabilistic results for estimating the potential health risk. This information is used to guide RPS designs, spacecraft designs, mission architecture, or launch procedures to potentially reduce the risk, as well as to inform decision makers of the potential health risks resulting from the use of RPSs for space missions.

  18. Accurate estimation of the RMS emittance from single current amplifier data

    SciTech Connect

    Stockli, Martin P.; Welton, R.F.; Keller, R.; Letchford, A.P.; Thomae, R.W.; Thomason, J.W.G.

    2002-05-31

    This paper presents the SCUBEEx rms emittance analysis, a self-consistent, unbiased elliptical exclusion method, which combines traditional data-reduction methods with statistical methods to obtain accurate estimates for the rms emittance. Rather than considering individual data, the method tracks the average current density outside a well-selected, variable boundary to separate the measured beam halo from the background. The average outside current density is assumed to be part of a uniform background and not part of the particle beam. Therefore the average outside current is subtracted from the data before evaluating the rms emittance within the boundary. As the boundary area is increased, the average outside current and the inside rms emittance form plateaus when all data containing part of the particle beam are inside the boundary. These plateaus mark the smallest acceptable exclusion boundary and provide unbiased estimates for the average background and the rms emittance. Small, trendless variations within the plateaus allow for determining the uncertainties of the estimates caused by variations of the measured background outside the smallest acceptable exclusion boundary. The robustness of the method is established with complementary variations of the exclusion boundary. This paper presents a detailed comparison between traditional data reduction methods and SCUBEEx by analyzing two complementary sets of emittance data obtained with H{sup -} ion sources at Lawrence Berkeley National Laboratory and at ISIS.
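    The plateau idea can be sketched on synthetic data: grow an exclusion boundary, treat the mean signal outside it as a uniform background, subtract it, and watch the inside rms emittance level off. This is our own simplified illustration (circular boundaries instead of the paper's ellipses, and a synthetic Gaussian beam over a flat background), not the published algorithm.

```python
import numpy as np

def rms_emittance(x, xp, w):
    """Intensity-weighted rms emittance sqrt(<x^2><x'^2> - <x x'>^2)."""
    w = np.clip(w, 0.0, None)
    W = w.sum()
    x = x - (w * x).sum() / W
    xp = xp - (w * xp).sum() / W
    return np.sqrt((w * x**2).sum() / W * (w * xp**2).sum() / W
                   - ((w * x * xp).sum() / W) ** 2)

def scubeex_like(x, xp, w, radii):
    """SCUBEEx-style scan: for each exclusion boundary, subtract the mean
    signal outside it as uniform background, then evaluate the rms emittance
    inside. A plateau across boundaries marks the unbiased estimate."""
    r2 = x**2 + xp**2
    out = []
    for r in radii:
        inside = r2 <= r**2
        bg = w[~inside].mean()            # background from outside the boundary
        out.append(rms_emittance(x[inside], xp[inside], w[inside] - bg))
    return np.array(out)

# Synthetic phase space: unit Gaussian beam plus a flat background of 0.05
g = np.linspace(-4, 4, 81)
X, XP = np.meshgrid(g, g)
w = np.exp(-(X**2 + XP**2) / 2) + 0.05
eps = scubeex_like(X.ravel(), XP.ravel(), w.ravel(), radii=[3.0, 3.5, 4.0])
# eps stays near the true rms emittance of 1.0 once the background is removed
```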

  19. Accurate estimation of motion blur parameters in noisy remote sensing image

    NASA Astrophysics Data System (ADS)

    Shi, Xueyan; Wang, Lin; Shao, Xiaopeng; Wang, Huilin; Tao, Zhong

    2015-05-01

    The relative motion between a remote sensing satellite sensor and the objects observed is one of the most common causes of remote sensing image degradation, and it seriously weakens image interpretation and information extraction. In practice, the point spread function (PSF) must be estimated before image restoration, so identifying the motion blur direction and length accurately is crucial for building the PSF and restoring the image with precision. In general, the regular light-and-dark stripes in the image spectrum can be used to obtain these parameters via the Radon transform. However, the serious noise present in actual remote sensing images often renders the stripes indistinct, making the parameters difficult to calculate and the resulting error relatively large. In this paper, an improved motion blur parameter identification method for noisy remote sensing images is proposed to solve this problem. The spectrum characteristics of noisy remote sensing images are analyzed first. An interactive image segmentation method based on graph theory, called GrabCut, is adopted to effectively extract the edge of the light center in the spectrum. Motion blur direction is estimated by applying the Radon transform to the segmentation result. To reduce random error, a method based on whole-column statistics is used when calculating blur length. Finally, the Lucy-Richardson algorithm is applied to restore remote sensing images of the moon after estimating the blur parameters. The experimental results verify the effectiveness and robustness of our algorithm.

  20. Painfree and accurate Bayesian estimation of psychometric functions for (potentially) overdispersed data.

    PubMed

    Schütt, Heiko H; Harmeling, Stefan; Macke, Jakob H; Wichmann, Felix A

    2016-05-01

    The psychometric function describes how an experimental variable, such as stimulus strength, influences the behaviour of an observer. Estimation of psychometric functions from experimental data plays a central role in fields such as psychophysics, experimental psychology and the behavioural neurosciences. Experimental data may exhibit substantial overdispersion, which may result from non-stationarity in the behaviour of observers. Here we extend the standard binomial model, which is typically used for psychometric function estimation, to a beta-binomial model. We show that the use of the beta-binomial model makes it possible to determine accurate credible intervals even in data which exhibit substantial overdispersion. This goes beyond classical measures for overdispersion, such as goodness-of-fit, which can detect overdispersion but provide no method for correct inference on overdispersed data. We use Bayesian inference methods for estimating the posterior distribution of the parameters of the psychometric function. Unlike previous Bayesian psychometric inference methods, our software implementation, psignifit 4, performs numerical integration of the posterior within automatically determined bounds. This avoids the use of Markov chain Monte Carlo (MCMC) methods, which typically require expert knowledge. Extensive numerical tests show the validity of the approach and we discuss implications of overdispersion for experimental design. A comprehensive MATLAB toolbox implementing the method is freely available; a python implementation providing the basic capabilities is also available. PMID:27013261
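    The move from a binomial to a beta-binomial observation model can be written down directly. The parameterisation below (mean p, overdispersion eta with variance inflation 1 + (n-1)·eta) is our choice for illustration and is not necessarily the one used inside psignifit 4.

```python
import numpy as np
from scipy.special import betaln, comb

def betabinom_logpmf(k, n, p, eta):
    """Beta-binomial log-pmf with mean n*p and overdispersion eta in (0, 1);
    the variance is inflated by 1 + (n - 1)*eta, and eta -> 0 recovers the
    plain binomial model."""
    nu = 1.0 / eta - 1.0                       # concentration parameter
    a, b = p * nu, (1.0 - p) * nu
    return np.log(comb(n, k)) + betaln(k + a, n - k + b) - betaln(a, b)

# The pmf sums to one over k = 0..n:
total = sum(np.exp(betabinom_logpmf(k, 5, 0.3, 0.2)) for k in range(6))
```

Replacing the binomial likelihood of each block of trials with this log-pmf is what widens the credible intervals appropriately when the data are overdispersed.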

  1. Accurate estimation of human body orientation from RGB-D sensors.

    PubMed

    Liu, Wu; Zhang, Yongdong; Tang, Sheng; Tang, Jinhui; Hong, Richang; Li, Jintao

    2013-10-01

    Accurate estimation of human body orientation can significantly enhance the analysis of human behavior, which is a fundamental task in the field of computer vision. However, existing orientation estimation methods cannot handle the wide variety of body poses and appearances. In this paper, we propose an innovative RGB-D-based orientation estimation method to address these challenges. By utilizing RGB-D information, which can be acquired in real time by RGB-D sensors, our method is robust to cluttered environments, illumination changes and partial occlusions. Specifically, efficient static and motion cue extraction methods are proposed based on RGB-D superpixels to reduce the noise of depth data. Since it is hard to discriminate all 360° of orientation using static cues or motion cues alone, we propose a dynamic Bayesian network system (DBNS) to effectively exploit the complementary nature of both static and motion cues. To verify the proposed method, we build an RGB-D-based human body orientation dataset that covers a wide diversity of poses and appearances. Our intensive experimental evaluations on this dataset demonstrate the effectiveness and efficiency of the proposed method. PMID:23893759

  2. Quick and accurate estimation of the elastic constants using the minimum image method

    NASA Astrophysics Data System (ADS)

    Tretiakov, Konstantin V.; Wojciechowski, Krzysztof W.

    2015-04-01

    A method for determining the elastic properties using the minimum image method (MIM) is proposed and tested on a model system of particles interacting through the Lennard-Jones (LJ) potential. The elastic constants of the LJ system are determined in the thermodynamic limit, N → ∞, using the Monte Carlo (MC) method in the NVT and NPT ensembles. The simulation results show that the contribution of long-range interactions cannot be ignored when determining the elastic constants, because doing so leads to erroneous results. In addition, the simulations reveal that including the interactions of each particle with all of its minimum image neighbors, even for small systems, gives results very close to the values of the elastic constants in the thermodynamic limit. This enables a quick and accurate estimation of the elastic constants using very small samples.
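    The minimum image convention at the heart of the MIM can be sketched in a few lines; this is the standard textbook construction for a cubic periodic box, not the authors' code.

```python
import numpy as np

def minimum_image(r_ij, box):
    """Map a displacement vector onto its minimum image in a cubic periodic box."""
    return r_ij - box * np.round(r_ij / box)

# In a box of side 10, particles at x = 0.5 and x = 9.5 are separated by 1.0
# through the periodic boundary, not by 9.0:
d = minimum_image(np.array([9.0, 0.0, 0.0]), box=10.0)
sep = np.linalg.norm(d)
```

Pair energies and virials evaluated with such minimum-image displacements are what enter the fluctuation formulas for the elastic constants.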

  3. A Simple yet Accurate Method for the Estimation of the Biovolume of Planktonic Microorganisms.

    PubMed

    Saccà, Alessandro

    2016-01-01

    Determining the biomass of microbial plankton is central to the study of fluxes of energy and materials in aquatic ecosystems. This is typically accomplished by applying proper volume-to-carbon conversion factors to group-specific abundances and biovolumes. A critical step in this approach is the accurate estimation of biovolume from two-dimensional (2D) data such as those available through conventional microscopy techniques or flow-through imaging systems. This paper describes a simple yet accurate method for the assessment of the biovolume of planktonic microorganisms, which works with any image analysis system allowing for the measurement of linear distances and the estimation of the cross sectional area of an object from a 2D digital image. The proposed method is based on Archimedes' principle about the relationship between the volume of a sphere and that of a cylinder in which the sphere is inscribed, plus a coefficient of 'unellipticity' introduced here. Validation and careful evaluation of the method are provided using a variety of approaches. The new method proved to be highly precise with all convex shapes characterised by approximate rotational symmetry, and combining it with an existing method specific for highly concave or branched shapes allows covering the great majority of cases with good reliability. Thanks to its accuracy, consistency, and low resource demand, the new method can conveniently be used in substitution of any extant method designed for convex shapes, and can readily be coupled with automated cell imaging technologies, including state-of-the-art flow-through imaging devices. PMID:27195667
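    Our reading of the geometric idea: by Archimedes' relation a sphere occupies two thirds of its circumscribing cylinder, which generalises to V ≈ (2/3)·A·w for a convex solid of revolution with projected area A and width w, with the 'unellipticity' coefficient correcting non-elliptical outlines. The sketch below illustrates that relation only and is not the paper's exact formula.

```python
import numpy as np

def biovolume(area, width, unellipticity=1.0):
    """Biovolume from a 2D projection: V = (2/3) * area * width, via the
    sphere-in-cylinder relation; 'unellipticity' stands in for the paper's
    corrective coefficient (1.0 for a perfect ellipse of revolution)."""
    return (2.0 / 3.0) * area * width * unellipticity

# Sanity checks against closed forms:
r = 2.0
v_sphere = biovolume(np.pi * r**2, 2 * r)       # sphere: (4/3)*pi*r^3
L, W = 10.0, 4.0
v_spheroid = biovolume(np.pi * L * W / 4, W)    # prolate spheroid: pi*L*W^2/6
```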

  4. A Simple yet Accurate Method for the Estimation of the Biovolume of Planktonic Microorganisms

    PubMed Central

    2016-01-01

    Determining the biomass of microbial plankton is central to the study of fluxes of energy and materials in aquatic ecosystems. This is typically accomplished by applying proper volume-to-carbon conversion factors to group-specific abundances and biovolumes. A critical step in this approach is the accurate estimation of biovolume from two-dimensional (2D) data such as those available through conventional microscopy techniques or flow-through imaging systems. This paper describes a simple yet accurate method for the assessment of the biovolume of planktonic microorganisms, which works with any image analysis system allowing for the measurement of linear distances and the estimation of the cross sectional area of an object from a 2D digital image. The proposed method is based on Archimedes’ principle about the relationship between the volume of a sphere and that of a cylinder in which the sphere is inscribed, plus a coefficient of ‘unellipticity’ introduced here. Validation and careful evaluation of the method are provided using a variety of approaches. The new method proved to be highly precise with all convex shapes characterised by approximate rotational symmetry, and combining it with an existing method specific for highly concave or branched shapes allows covering the great majority of cases with good reliability. Thanks to its accuracy, consistency, and low resource demand, the new method can conveniently be used in substitution of any extant method designed for convex shapes, and can readily be coupled with automated cell imaging technologies, including state-of-the-art flow-through imaging devices. PMID:27195667

  5. Accurate Estimation of the Fine Layering Effect on the Wave Propagation in the Carbonate Rocks

    NASA Astrophysics Data System (ADS)

    Bouchaala, F.; Ali, M. Y.

    2014-12-01

    The attenuation experienced by a seismic wave during its propagation can be divided into two main parts: scattering and intrinsic attenuation. Scattering is an elastic redistribution of energy due to medium heterogeneities, whereas intrinsic attenuation is an inelastic phenomenon, mainly due to fluid-grain friction during the wave's passage. Because intrinsic attenuation is directly related to the physical characteristics of the medium, it can be used for medium characterization and fluid detection, which is beneficial to the oil and gas industry. Intrinsic attenuation is estimated by subtracting the scattering from the total attenuation, so its accuracy depends directly on the accuracy of both the total attenuation and the scattering. The total attenuation can be estimated from the recorded waves using in situ methods such as the spectral ratio and frequency shift methods. The scattering is estimated by treating the heterogeneities as a succession of stacked layers, each characterized by a single density and velocity. The accuracy of the scattering estimate depends strongly on the layer thicknesses, especially for media composed of carbonate rocks, which are known for their strong heterogeneity. Previous studies gave some assumptions for the choice of the layer thickness, but these showed limitations, especially in the case of carbonate rocks. In this study we established a relationship between the layer thicknesses and the propagation frequency through some mathematical development of the generalized O'Doherty-Anstey formula. We validated this relationship with synthetic tests and with real data from a VSP survey carried out over an onshore oilfield in the emirate of Abu Dhabi in the United Arab Emirates, primarily composed of carbonate rocks. The results showed the utility of our relationship for an accurate estimation of the scattering

  6. Accurate biopsy-needle depth estimation in limited-angle tomography using multi-view geometry

    NASA Astrophysics Data System (ADS)

    van der Sommen, Fons; Zinger, Sveta; de With, Peter H. N.

    2016-03-01

    Recently, compressed-sensing based algorithms have enabled volume reconstruction from projection images acquired over a relatively small angle (θ < 20°). These methods enable accurate depth estimation of surgical tools with respect to anatomical structures. However, they are computationally expensive and time consuming, rendering them unattractive for image-guided interventions. We propose an alternative approach for depth estimation of biopsy needles during image-guided interventions, in which we split the problem into two parts and solve them independently: needle-depth estimation and volume reconstruction. The complete proposed system consists of the previous two steps, preceded by needle extraction. First, we detect the biopsy needle in the projection images and remove it by interpolation. Next, we exploit epipolar geometry to find point-to-point correspondences in the projection images to triangulate the 3D position of the needle in the volume. Finally, we use the interpolated projection images to reconstruct the local anatomical structures and indicate the position of the needle within this volume. For validation of the algorithm, we have recorded a full CT scan of a phantom with an inserted biopsy needle. The performance of our approach ranges from a median error of 2.94 mm for a distributed viewing angle of 1° down to an error of 0.30 mm for angles larger than 10°. Based on the results of this initial phantom study, we conclude that multi-view geometry offers an attractive alternative to time-consuming iterative methods for the depth estimation of surgical tools during C-arm-based image-guided interventions.
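    The triangulation step, recovering a 3D point from corresponding observations in two projection images, can be illustrated with the standard linear (DLT) method. This is a generic sketch, not the authors' implementation; the camera matrices below are synthetic.

```python
import numpy as np

def triangulate(P1, P2, u1, u2):
    """Linear (DLT) triangulation of a 3D point from two 3x4 projection
    matrices and the corresponding image observations (x, y)."""
    A = np.vstack([u1[0] * P1[2] - P1[0],
                   u1[1] * P1[2] - P1[1],
                   u2[0] * P2[2] - P2[0],
                   u2[1] * P2[2] - P2[1]])
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]                      # null vector of A (homogeneous point)
    return X[:3] / X[3]

# Two synthetic views of the point (1, 2, 10):
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])        # reference camera
P2 = np.hstack([np.eye(3), [[-1.0], [0.0], [0.0]]])  # second camera, shifted in x
X_true = np.array([1.0, 2.0, 10.0])
Xh = np.append(X_true, 1.0)
u1 = (P1 @ Xh)[:2] / (P1 @ Xh)[2]
u2 = (P2 @ Xh)[:2] / (P2 @ Xh)[2]
X_est = triangulate(P1, P2, u1, u2)
```

With noise-free correspondences the SVD null vector recovers the point exactly; with noisy detections it gives the least-squares solution, which is why the reported error shrinks as the angular separation of the views grows.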

  7. Nonparametric estimation with recurrent competing risks data

    PubMed Central

    Peña, Edsel A.

    2014-01-01

    Nonparametric estimators of component and system life distributions are developed and presented for situations where recurrent competing risks data from series systems are available. The use of recurrences of components’ failures leads to improved efficiencies in statistical inference, thereby leading to resource-efficient experimental or study designs or improved inferences about the distributions governing the event times. Finite and asymptotic properties of the estimators are obtained through simulation studies and analytically. The detrimental impact of parametric model misspecification is also vividly demonstrated, lending credence to the virtue of adopting nonparametric or semiparametric models, especially in biomedical settings. The estimators are illustrated by applying them to a data set pertaining to car repairs for vehicles that were under warranty. PMID:24072583

  8. Can student health professionals accurately estimate alcohol content in commonly occurring drinks?

    PubMed Central

    Sinclair, Julia; Searle, Emma

    2016-01-01

    Objectives: Correct identification of alcohol as a contributor to, or comorbidity of, many psychiatric diseases requires health professionals to be competent and confident in taking an accurate alcohol history. Being able to estimate (or calculate) the alcohol content of commonly consumed drinks is a prerequisite for quantifying levels of alcohol consumption. The aim of this study was to assess this ability in medical and nursing students. Methods: A cross-sectional survey of 891 medical and nursing students across different years of training was conducted. Students were shown a slide of each of 10 different alcoholic drinks (with a picture, the volume and the percentage of alcohol by volume) for 30 s and asked to estimate its alcohol content. Results: Overall, the mean number of correctly estimated drinks (out of the 10 tested) was 2.4, increasing to just over 3 if a 10% margin of error was allowed. Wine and premium-strength beers were underestimated by over 50% of students. Those who drank alcohol themselves, or who were further on in their clinical training, did better on the task, but overall performance remained poor. Conclusions: Knowledge of, or the ability to work out, the alcohol content of commonly consumed drinks is poor, and further research is needed to understand the reasons for this and the impact it may have on the likelihood of undertaking screening or initiating treatment. PMID:27536344
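    The arithmetic the students were tested on is simple: in the UK convention one unit is 10 mL (8 g) of pure ethanol, so a drink's content follows directly from its volume and ABV. The helper names and example drinks below are ours.

```python
def uk_units(volume_ml, abv_percent):
    """UK alcohol units: one unit is 10 mL (8 g) of pure ethanol."""
    return volume_ml * abv_percent / 1000.0

def grams_ethanol(volume_ml, abv_percent):
    """Grams of pure ethanol, using a density of 0.789 g/mL."""
    return volume_ml * abv_percent / 100.0 * 0.789

# A 175 mL glass of 13% ABV wine and a 500 mL can of 5.2% premium lager:
wine_units = uk_units(175, 13)       # about 2.3 units
lager_units = uk_units(500, 5.2)     # 2.6 units
```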

  9. Ultrasound Fetal Weight Estimation: How Accurate Are We Now Under Emergency Conditions?

    PubMed

    Dimassi, Kaouther; Douik, Fatma; Ajroudi, Mariem; Triki, Amel; Gara, Mohamed Faouzi

    2015-10-01

    The primary aim of this study was to evaluate the accuracy of sonographic estimation of fetal weight when performed at due date by first-line sonographers. This was a prospective study including 500 singleton pregnancies. Ultrasound examinations were performed by residents on delivery day. Estimated fetal weights (EFWs) were calculated and compared with the corresponding birth weights. The median absolute difference between EFW and birth weight was 200 g (100-330). This difference was within ±10% in 75.2% of the cases. The median absolute percentage error was 5.53% (2.70%-10.03%). Linear regression analysis revealed a good correlation between EFW and birth weight (r = 0.79, p < 0.0001). According to Bland-Altman analysis, bias was -85.06 g (95% limits of agreement: -663.33 to 494.21). In conclusion, EFWs calculated by residents were as accurate as those calculated by experienced sonographers. Nevertheless, predictive performance remains limited, with a low sensitivity in the diagnosis of macrosomia. PMID:26164286
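The Bland-Altman analysis reported above (bias plus 95% limits of agreement) is straightforward to reproduce; the weights below are invented for illustration and are not the study data:

```python
import statistics

def bland_altman(estimates, references):
    """Bland-Altman agreement statistics: mean difference (bias) and
    95% limits of agreement (bias +/- 1.96 * SD of the differences)."""
    diffs = [e - r for e, r in zip(estimates, references)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical estimated fetal weights vs. birth weights (grams)
efw = [3100, 3400, 2900, 3650, 3200]
bw  = [3200, 3350, 3050, 3600, 3380]
bias, (lo, hi) = bland_altman(efw, bw)
```

A negative bias, as in the abstract's -85 g, means the estimates tend to run below the birth weights; the limits of agreement describe the range within which roughly 95% of individual differences fall.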

  10. Ocean Lidar Measurements of Beam Attenuation and a Roadmap to Accurate Phytoplankton Biomass Estimates

    NASA Astrophysics Data System (ADS)

    Hu, Yongxiang; Behrenfeld, Mike; Hostetler, Chris; Pelon, Jacques; Trepte, Charles; Hair, John; Slade, Wayne; Cetinic, Ivona; Vaughan, Mark; Lu, Xiaomei; Zhai, Pengwang; Weimer, Carl; Winker, David; Verhappen, Carolus C.; Butler, Carolyn; Liu, Zhaoyan; Hunt, Bill; Omar, Ali; Rodier, Sharon; Lifermann, Anne; Josset, Damien; Hou, Weilin; MacDonnell, David; Rhew, Ray

    2016-06-01

    Beam attenuation coefficient, c, provides an important optical index of plankton standing stocks, such as phytoplankton biomass and total particulate carbon concentration. Unfortunately, c has proven difficult to quantify through remote sensing. Here, we introduce an innovative approach for estimating c using lidar depolarization measurements and diffuse attenuation coefficients from ocean color products or lidar measurements of Brillouin scattering. The new approach is based on a theoretical formula established from Monte Carlo simulations that links the depolarization ratio of seawater to the ratio of diffuse attenuation Kd and beam attenuation c (i.e., a multiple scattering factor). On July 17, 2014, the CALIPSO satellite was tilted 30° off-nadir for one nighttime orbit in order to minimize ocean surface backscatter and demonstrate the lidar ocean subsurface measurement concept from space. Depolarization ratios of ocean subsurface backscatter are measured accurately. Beam attenuation coefficients computed from the depolarization ratio measurements compare well with empirical estimates from ocean color measurements. We further verify the beam attenuation coefficient retrievals using aircraft-based high spectral resolution lidar (HSRL) data that are collocated with in-water optical measurements.

  11. A rapid, economical, and accurate method to determine the physical risk of storm marine inundations using sedimentary evidence

    NASA Astrophysics Data System (ADS)

    Nott, Jonathan F.

    2015-04-01

    The majority of physical risk assessments from storm surge inundations are derived from synthetic time series generated from short climate records, which can often result in inaccuracies and are time-consuming and expensive to develop. A new method is presented here for the wet tropics region of northeast Australia. It uses lidar-generated topographic cross sections of beach ridge plains, which have been demonstrated to be deposited by marine inundations generated by tropical cyclones. Extreme value theory statistics are applied to data derived from the cross sections to generate return period plots for a given location. The results suggest that previous methods to estimate return periods using synthetic data sets have underestimated the magnitude/frequency relationship by at least an order of magnitude. The new method promises to be a more rapid, economical, and accurate assessment of the physical risk of these events.
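The return-period analysis described above can be sketched with a generalized extreme value (GEV) fit: an event with return period T years has annual exceedance probability 1/T. The inundation heights and distribution parameters below are invented for illustration, not taken from the paper:

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(0)
# Hypothetical inundation heights (m) inferred from ridge cross-sections
heights = genextreme.rvs(c=-0.1, loc=2.0, scale=0.5, size=200, random_state=rng)

# Fit a GEV distribution to the block maxima, then read off return levels:
# the T-year return level is the (1 - 1/T) quantile of the fitted GEV.
c, loc, scale = genextreme.fit(heights)
levels = {T: genextreme.ppf(1 - 1 / T, c, loc, scale) for T in (10, 100, 500)}
```

Plotting `levels` against T on a log axis gives the return-period curve used to compare magnitude/frequency relationships between data sets.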

  12. Auditory risk estimates for youth target shooting

    PubMed Central

    Meinke, Deanna K.; Murphy, William J.; Finan, Donald S.; Lankford, James E.; Flamme, Gregory A.; Stewart, Michael; Soendergaard, Jacob; Jerome, Trevor W.

    2015-01-01

    Objective To characterize the impulse noise exposure and auditory risk for youth recreational firearm users engaged in outdoor target shooting events. The youth shooting positions are typically standing or sitting at a table, which places the firearm closer to the ground or reflective surface when compared to adult shooters. Design Acoustic characteristics were examined and the auditory risk estimates were evaluated using contemporary damage-risk criteria for unprotected adult listeners and the 120-dB peak limit suggested by the World Health Organization (1999) for children. Study sample Impulses were generated by 26 firearm/ammunition configurations representing rifles, shotguns, and pistols used by youth. Measurements were obtained relative to a youth shooter’s left ear. Results All firearms generated peak levels that exceeded the 120-dB peak limit suggested by the WHO for children. In general, shooting from the seated position over a tabletop increases the peak levels and LAeq8, and reduces the unprotected maximum permissible exposures (MPEs) for both rifles and pistols. Pistols pose the greatest auditory risk when fired over a tabletop. Conclusion Youth should utilize smaller caliber weapons, preferably from the standing position, and always wear hearing protection whenever engaging in shooting activities to reduce the risk for auditory damage. PMID:24564688

  13. Discrete state model and accurate estimation of loop entropy of RNA secondary structures.

    PubMed

    Zhang, Jian; Lin, Ming; Chen, Rong; Wang, Wei; Liang, Jie

    2008-03-28

    Conformational entropy makes an important contribution to the stability and folding of RNA molecules, but it is challenging to either measure or compute conformational entropy associated with long loops. We develop optimized discrete k-state models of RNA backbone based on known RNA structures for computing entropy of loops, which are modeled as self-avoiding walks. To estimate entropy of hairpin, bulge, internal loop, and multibranch loop of long length (up to 50), we develop an efficient sampling method based on the sequential Monte Carlo principle. Our method considers excluded volume effect. It is general and can be applied to calculating entropy of loops with longer length and arbitrary complexity. For loops of short length, our results are in good agreement with a recent theoretical model and experimental measurement. For long loops, our estimated entropy of hairpin loops is in excellent agreement with the Jacobson-Stockmayer extrapolation model. However, for bulge loops and more complex secondary structures such as internal and multibranch loops, we find that the Jacobson-Stockmayer extrapolation model has large errors. Based on estimated entropy, we have developed empirical formulae for accurate calculation of entropy of long loops in different secondary structures. Our study on the effect of asymmetric size of loops suggests that loop entropy of internal loops is largely determined by the total loop length, and is only marginally affected by the asymmetric size of the two loops. Our finding suggests that the significant asymmetric effects of loop length in internal loops measured by experiments are likely to be partially enthalpic. Our method can be applied to develop improved energy parameters important for studying RNA stability and folding, and for predicting RNA secondary and tertiary structures. The discrete model and the program used to calculate loop entropy can be downloaded at http://gila.bioengr.uic.edu/resources/RNA.html. PMID:18376982
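The sequential growth idea behind this kind of sampling can be illustrated with classic Rosenbluth sequential importance sampling of self-avoiding walks on a 2-D square lattice. The paper uses optimized k-state backbone models in 3-D; this toy version only shows the principle of weighting each growth step by the number of available moves:

```python
import math, random

def rosenbluth_walk(n, rng):
    """Grow one 2-D lattice self-avoiding walk of n steps.
    Returns its Rosenbluth weight (product of available-move counts),
    or 0.0 if the walk traps itself."""
    pos, visited, weight = (0, 0), {(0, 0)}, 1.0
    for _ in range(n):
        moves = [(pos[0] + dx, pos[1] + dy)
                 for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))
                 if (pos[0] + dx, pos[1] + dy) not in visited]
        if not moves:
            return 0.0
        weight *= len(moves)          # importance-weight correction
        pos = rng.choice(moves)
        visited.add(pos)
    return weight

rng = random.Random(42)
n, samples = 10, 2000
mean_w = sum(rosenbluth_walk(n, rng) for _ in range(samples)) / samples
# mean_w estimates the number of n-step self-avoiding walks, so
# log(mean_w) approximates the conformational entropy in units of k_B
entropy = math.log(mean_w)
```

For n = 10 the exact walk count is 44,100, so the estimated entropy should land near log(44100) ≈ 10.7; the excluded-volume constraint is exactly what makes naive uniform sampling fail for long loops.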

  14. Extreme Earthquake Risk Estimation by Hybrid Modeling

    NASA Astrophysics Data System (ADS)

    Chavez, M.; Cabrera, E.; Ashworth, M.; Garcia, S.; Emerson, D.; Perea, N.; Salazar, A.; Moulinec, C.

    2012-12-01

    The estimation of the hazard and the economic consequences (i.e., the risk) associated with the occurrence of extreme-magnitude earthquakes in the neighborhood of urban or lifeline infrastructure, such as the 11 March 2011 Mw 9 Tohoku, Japan, event, represents a complex challenge, as it involves the propagation of seismic waves through large volumes of the Earth's crust, from unusually large seismic source ruptures up to the infrastructure location. The large number of casualties and huge economic losses observed for those earthquakes, some of which have a frequency of occurrence of hundreds or thousands of years, call for the development of new paradigms and methodologies in order to generate better estimates both of the seismic hazard and of its consequences, and, if possible, to estimate the probability distributions of their ground intensities and of their economic impacts (direct and indirect losses), so that technological and economic policies can be implemented to mitigate and reduce those consequences as much as possible. Here we propose a hybrid modeling approach that uses 3D seismic wave propagation (3DWP) and neural network (NN) modeling to estimate the seismic risk of extreme earthquakes. The 3DWP modeling is achieved by using a 3D finite difference code run on the ~100,000-core Blue Gene/Q supercomputer of the STFC Daresbury Laboratory in the UK, combined with empirical Green function (EGF) techniques and NN algorithms. In particular, the 3DWP is used to generate broadband samples of the 3D wave propagation of plausible extreme earthquake scenarios corresponding to synthetic seismic sources, and to enlarge those samples by using feed-forward NNs. We present the results of the validation of the proposed hybrid modeling for Mw 8 subduction events, and show examples of its application to the estimation of the hazard and the economic consequences for extreme Mw 8.5 subduction earthquake scenarios with seismic sources in the Mexican

  15. Reexamination of spent fuel shipment risk estimates

    SciTech Connect

    COOK,J.R.; SPRUNG,JEREMY L.

    2000-04-25

    The risks associated with the transport of spent nuclear fuel by truck and rail have been reexamined and compared to results published in NUREG-0170 and the Modal Study. The full reexamination considered transport of PWR and BWR spent fuel by truck and rail in four generic Type B spent fuel casks. Because they are typical, this paper presents results only for transport of PWR spent fuel in steel-lead-steel casks. Cask and spent fuel response to collision impacts and fires were evaluated by performing three-dimensional finite element and one-dimensional heat transport calculations. Accident release fractions were developed by critical review of literature data. Accident severity fractions were developed from Modal Study truck and rail accident event trees, modified to reflect the frequency of occurrence of hard and soft rock wayside route surfaces as determined by analysis of geographic data. Incident-free population doses and the population dose risks associated with the accidents that might occur during transport were calculated using the RADTRAN 5 transportation risk code. The calculated incident-free doses were compared to those published in NUREG-0170. The calculated accident dose risks were compared to dose risks calculated using NUREG-0170 and Modal Study accident source terms. The comparisons demonstrated that both of these studies made a number of very conservative assumptions about spent fuel and cask response to accident conditions, which caused their estimates of accident source terms, accident frequencies, and accident consequences to also be very conservative. The results of this study and the previous studies demonstrate that the risks associated with the shipment of spent fuel by truck or rail are very small.

  16. Relating space radiation environments to risk estimates

    NASA Technical Reports Server (NTRS)

    Curtis, Stanley B.

    1993-01-01

    A number of considerations must go into the process of determining the risk of deleterious effects of space radiation to travelers. Among them are (1) determination of the components of the radiation environment (particle species, fluxes, and energy spectra) which the travelers will encounter, (2) determination of the effects of shielding provided by the spacecraft and the bodies of the travelers which modify the incident particle spectra and mix of particles, and (3) determination of relevant biological effects of the radiation in the organs of interest. The latter can then lead to an estimation of risk from a given space scenario. Clearly, the process spans many scientific disciplines, from solar and cosmic ray physics to radiation transport theory to the multistage problem of the induction by radiation of initial lesions in living material and their evolution via physical, chemical, and biological processes at the molecular, cellular, and tissue levels to produce the end point of importance.

  17. Relating space radiation environments to risk estimates

    SciTech Connect

    Curtis, S.B. ||

    1993-12-31

    A number of considerations must go into the process of determining the risk of deleterious effects of space radiation to travelers. Among them are (1) determination of the components of the radiation environment (particle species, fluxes, and energy spectra) which the travelers will encounter, (2) determination of the effects of shielding provided by the spacecraft and the bodies of the travelers which modify the incident particle spectra and mix of particles, and (3) determination of relevant biological effects of the radiation in the organs of interest. The latter can then lead to an estimation of risk from a given space scenario. Clearly, the process spans many scientific disciplines, from solar and cosmic ray physics to radiation transport theory to the multistage problem of the induction by radiation of initial lesions in living material and their evolution via physical, chemical, and biological processes at the molecular, cellular, and tissue levels to produce the end point of importance.

  18. Climate-informed flood risk estimation

    NASA Astrophysics Data System (ADS)

    Troy, T.; Devineni, N.; Lima, C.; Lall, U.

    2013-12-01

    Currently, flood risk assessments are typically tied to a peak flow event that has an associated return period and inundation extent. This method is convenient: based on a historical record of annual maximum flows, a return period can be calculated with some assumptions about the probability distribution and stationarity. It is also problematic in its stationarity assumption, its reliance on relatively short records, and its treatment of flooding as a random event disconnected from large-scale climate processes. Recognizing these limitations, we have developed a new approach to flood risk assessment that connects climate variability, precipitation dynamics, and flood modeling to estimate the likelihood of flooding. To provide more robust, long time series of precipitation, we used stochastic weather generator models to simulate the rainfall fields. The method uses a k-nearest-neighbor resampling algorithm in conjunction with a non-parametric empirical copula-based simulation strategy to reproduce the temporal and spatial dynamics, respectively. Climate patterns inform the likelihood of heavy rainfall in the model; for example, ENSO affects the likelihood of wet or dry years in Australia, and this is incorporated in the model. The stochastic simulations are then used to drive a cascade of models to predict flood inundation. Runoff is generated by the Variable Infiltration Capacity (VIC) model, fed into a full kinematic wave routing model at high resolution, and the kinematic wave is used as a boundary condition to predict flood inundation using a coupled storage cell model. Combining the strengths of a stochastic model for rainfall and a physical model for flood prediction allows us to overcome the limitations of traditional flood risk assessment and provide robust estimates of flood risk.
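A minimal k-nearest-neighbor resampling step, the first ingredient of the stochastic weather generator described above, might look like the following. The daily-rainfall series is a toy example, and the full model adds copula-based spatial simulation that this sketch omits:

```python
import random

def knn_resample(history, n_days, k=5, seed=1):
    """Generate a synthetic rainfall series: for each simulated day, find the
    k historical days whose rainfall is closest to the current value, and
    sample the successor of one of them, weighting closer neighbours more."""
    rng = random.Random(seed)
    out = [history[0]]
    weights = [1 / (i + 1) for i in range(k)]    # classic k-NN kernel
    for _ in range(n_days - 1):
        # historical days (with a successor) ranked by distance to today
        ranked = sorted(range(len(history) - 1),
                        key=lambda i: abs(history[i] - out[-1]))
        j = rng.choices(ranked[:k], weights=weights)[0]
        out.append(history[j + 1])
    return out

hist = [0, 0, 5, 12, 3, 0, 0, 1, 8, 20, 4, 0, 0, 2]   # mm/day, hypothetical
sim = knn_resample(hist, 10)
```

Because each simulated day is the observed successor of a similar historical day, the synthetic series preserves day-to-day persistence (wet spells follow wet days) without assuming any parametric distribution.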

  19. Accurate Visual Heading Estimation at High Rotation Rate Without Oculomotor or Static-Depth Cues

    NASA Technical Reports Server (NTRS)

    Stone, Leland S.; Perrone, John A.; Null, Cynthia H. (Technical Monitor)

    1995-01-01

    It has been claimed that either oculomotor or static depth cues provide the signals about self-rotation necessary for accurate heading estimation at rotation rates above approximately 1 deg/s. We tested this hypothesis by simulating self-motion along a curved path with the eyes fixed in the head (plus or minus 16 deg/s of rotation). Curvilinear motion offers two advantages: 1) heading remains constant in retinotopic coordinates, and 2) there is no visual-oculomotor conflict (both actual and simulated eye position remain stationary). We simulated 400 ms of rotation combined with 16 m/s of translation at fixed angles with respect to gaze towards two vertical planes of random dots initially 12 and 24 m away, with a field of view of 45 degrees. Four subjects were asked to fixate a central cross and to respond whether they were translating to the left or right of straight-ahead gaze. From the psychometric curves, heading bias (mean) and precision (semi-interquartile) were derived. The mean bias over 2-5 runs was 3.0, 4.0, -2.0, -0.4 deg for the first author and three naive subjects, respectively (positive indicating towards the rotation direction). The mean precision was 2.0, 1.9, 3.1, 1.6 deg, respectively. The ability of observers to make relatively accurate and precise heading judgments, despite the large rotational flow component, refutes the view that extra-flow-field information is necessary for human visual heading estimation at high rotation rates. Our results support models that process combined translational/rotational flow to estimate heading, but should not be construed to suggest that other cues do not play an important role when they are available to the observer.
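Deriving bias and precision from a psychometric curve, as described above, can be sketched by fitting a cumulative Gaussian to the proportion of "rightward" responses: the 50% point gives the bias, and the semi-interquartile range of a Gaussian is 0.6745 sigma. The response data below are hypothetical:

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

def psychometric(x, mu, sigma):
    """Cumulative-Gaussian psychometric function."""
    return norm.cdf(x, mu, sigma)

# Hypothetical heading offsets (deg) and proportion of "rightward" responses
x = np.array([-8, -4, -2, 0, 2, 4, 8], dtype=float)
p = np.array([0.02, 0.10, 0.25, 0.45, 0.70, 0.90, 0.99])

(mu, sigma), _ = curve_fit(psychometric, x, p, p0=(0.0, 2.0))
bias = mu                       # 50% point: perceived straight-ahead offset
precision = 0.6745 * sigma      # semi-interquartile range of the fitted Gaussian
```

With real data one would fit per-run curves and average, as the abstract reports bias and precision over 2-5 runs per subject.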

  20. The Impact of Perceived Frailty on Surgeons’ Estimates of Surgical Risk

    PubMed Central

    Ferguson, Mark K.; Farnan, Jeanne; Hemmerich, Josh A.; Slawinski, Kris; Acevedo, Julissa; Small, Stephen

    2015-01-01

    Background Physicians are only moderately accurate in estimating surgical risk based on clinical vignettes. We assessed the impact of perceived frailty by measuring the influence of a short video of a standardized patient on surgical risk estimates. Methods Thoracic surgeons and cardiothoracic trainees estimated the risk of major complications for lobectomy based on clinical vignettes of varied risk categories (low, average, high). After each vignette, subjects viewed a randomly selected video of a standardized patient exhibiting either vigorous or frail behavior, then re-estimated risk. Subjects were asked to rate 5 vignettes paired with 5 different standardized patients. Results Seventy-one physicians participated. Initial risk estimates varied according to the vignette risk category: low, 15.2 ± 11.2% risk; average, 23.7 ± 16.1%; high, 37.3 ± 18.9%; p<0.001 by ANOVA. Concordant information in vignettes and videos moderately altered estimates (high risk vignette, frail video: 10.6 ± 27.5% increase in estimate, p=0.006; low risk vignette, vigorous video: 14.5 ± 45.0% decrease, p=0.009). Discordant findings influenced risk estimates more substantially (high risk vignette, vigorous video: 21.2 ± 23.5% decrease in second risk estimate, p<0.001; low risk vignette, frail video: 151.9 ± 209.8% increase, p<0.001). Conclusions Surgeons differentiated relative risk of lobectomy based on clinical vignettes. The effect of viewing videos was small when vignettes and videos were concordant; the effect was more substantial when vignettes and videos were discordant. The information will be helpful in training future surgeons in frailty recognition and risk estimation. PMID:24932570

  1. Launch Risk Acceptability Considering Uncertainty in Risk Estimates

    NASA Astrophysics Data System (ADS)

    Collins, J. D.; Carbon, S. L.

    2010-09-01

    Quantification of launch risk is difficult and uncertain due to the assumptions made in the modeling process and the difficulty in developing the supporting data. This means that estimates of the risks are uncertain and the decision maker must decide on the acceptability of the launch under uncertainty. This paper describes the process to quantify the uncertainty and, in doing so, describes the separate roles of aleatory and epistemic uncertainty in obtaining the point estimate of the casualty expectation and, ultimately, the distribution of the uncertainty in the computed casualty expectation. Tables of the significant sources and the nature of the contributing uncertainties are included. In addition, general procedures and an example are included to describe the computational procedure. The second part of the paper discusses how the quantified uncertainty should be applied to the decision-making process. This discussion describes the procedure proposed and adopted by the Risk Committee of the Range Commanders Council Range Safety Group, which will be published in RCC 321-10 [1].

  2. Estimating the risk of Amazonian forest dieback.

    PubMed

    Rammig, Anja; Jupp, Tim; Thonicke, Kirsten; Tietjen, Britta; Heinke, Jens; Ostberg, Sebastian; Lucht, Wolfgang; Cramer, Wolfgang; Cox, Peter

    2010-08-01

    Climate change will very likely affect most forests in Amazonia during the course of the 21st century, but the direction and intensity of the change are uncertain, in part because of differences in rainfall projections. In order to constrain this uncertainty, we estimate the probability of biomass change in Amazonia on the basis of rainfall projections that are weighted by climate model performance for current conditions. We estimate the risk of forest dieback by using weighted rainfall projections from 24 general circulation models (GCMs) to create probability density functions (PDFs) for future forest biomass changes simulated by a dynamic vegetation model (LPJmL). Our probabilistic assessment of biomass change suggests a likely shift towards increasing biomass compared with nonweighted results. Biomass estimates range between a gain of 6.2 and a loss of 2.7 kg carbon m^-2 for the Amazon region, depending on the strength of CO2 fertilization. The uncertainty associated with the long-term effect of CO2 is much larger than that associated with precipitation change. This underlines the importance of reducing uncertainties in the direct effects of CO2 on tropical ecosystems. PMID:20553387
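The model-performance weighting described above reduces, in sketch form, to computing weighted statistics over an ensemble of projections. The biomass changes and skill weights below are invented for illustration, not the study's GCM output:

```python
import numpy as np

# Hypothetical biomass changes (kg carbon m^-2) from 24 GCM-driven runs,
# spanning the abstract's reported range, and toy performance weights
# (higher weight = better match to observed present-day climate)
changes = np.linspace(-2.7, 6.2, 24)
weights = np.exp(-0.5 * ((changes - 3.0) / 2.5) ** 2)
weights /= weights.sum()                      # normalize to a discrete PDF

# Weighted probability of a biomass gain, and the weighted mean change
p_gain = weights[changes > 0].sum()
mean_change = float(np.dot(weights, changes))
```

Because better-performing models here happen to project gains, the weighted probability of increasing biomass exceeds the unweighted one, mirroring the abstract's "shift towards increasing biomass compared with nonweighted results".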

  3. Software risk estimation and management techniques at JPL

    NASA Technical Reports Server (NTRS)

    Hihn, J.; Lum, K.

    2002-01-01

    In this talk we discuss how uncertainty has been incorporated into the JPL software model through probabilistic estimates, and how risk is addressed: cost risk is currently being explored via a variety of approaches, from traditional risk lists, to detailed WBS-based risk estimates, to the Defect Detection and Prevention (DDP) tool.

  4. IMPROVED RISK ESTIMATES FOR CARBON TETRACHLORIDE

    SciTech Connect

    Benson, Janet M.; Springer, David L.

    1999-12-31

    Carbon tetrachloride has been used extensively within the DOE nuclear weapons facilities. Rocky Flats was formerly the largest volume consumer of CCl4 in the United States, using 5000 gallons in 1977 alone (Ripple, 1992). At the Hanford site, several hundred thousand gallons of CCl4 were discharged between 1955 and 1973 into underground cribs for storage. Levels of CCl4 in groundwater at highly contaminated sites at the Hanford facility have exceeded the drinking water standard of 5 ppb by several orders of magnitude (Illman, 1993). High levels of CCl4 at these facilities represent a potential health hazard for workers conducting cleanup operations and for surrounding communities. The level of CCl4 cleanup required at these sites and the associated costs are driven by current human health risk estimates, which assume that CCl4 is a genotoxic carcinogen. The overall purpose of these studies was to improve the scientific basis for assessing the health risk associated with human exposure to CCl4. Specific research objectives of this project were to: (1) compare the rates of CCl4 metabolism by rats, mice and hamsters in vivo and extrapolate those rates to man based on parallel studies on the metabolism of CCl4 by rat, mouse, hamster and human hepatic microsomes in vitro; (2) using hepatic microsome preparations, determine the role of specific cytochrome P450 isoforms in CCl4-mediated toxicity and the effects of repeated inhalation and ingestion of CCl4 on these isoforms; and (3) evaluate the toxicokinetics of inhaled CCl4 in rats, mice and hamsters. This information has been used to improve the physiologically based pharmacokinetic (PBPK) model for CCl4 originally developed by Paustenbach et al. (1988) and more recently revised by Thrall and Kenny (1996). Another major objective of the project was to provide scientific evidence that CCl4, like chloroform, is a hepatocarcinogen only when exposure results in cell damage, cell killing and regenerative proliferation. In

  5. Crop area estimation based on remotely-sensed data with an accurate but costly subsample

    NASA Technical Reports Server (NTRS)

    Gunst, R. F.

    1983-01-01

    Alternatives to sampling-theory stratified and regression estimators of crop production and timber biomass were examined. An alternative estimator which is viewed as especially promising is the errors-in-variable regression estimator. Investigations established the need for caution with this estimator when the ratio of two error variances is not precisely known.

  6. Uncertainty of Calculated Risk Estimates for Secondary Malignancies After Radiotherapy

    SciTech Connect

    Kry, Stephen F. . E-mail: sfkry@mdanderson.org; Followill, David; White, R. Allen; Stovall, Marilyn; Kuban, Deborah A.; Salehpour, Mohammad

    2007-07-15

    Purpose: The significance of risk estimates for fatal secondary malignancies caused by out-of-field radiation exposure remains unresolved because the uncertainty in calculated risk estimates has not been established. This work examines the uncertainty in absolute risk estimates and in the ratio of risk estimates between different treatment modalities. Methods and Materials: Clinically reasonable out-of-field doses and calculated risk estimates were taken from the literature for several prostate treatment modalities, including intensity-modulated radiotherapy (IMRT), and were recalculated using the most recent risk model. The uncertainties in this risk model and uncertainties in the linearity of the dose-response model were considered in generating 90% confidence intervals for the uncertainty in the absolute risk estimates and in the ratio of the risk estimates. Results: The absolute risk estimates of fatal secondary malignancy were associated with very large uncertainties, which precluded distinctions between the risks associated with the different treatment modalities considered. However, a much smaller confidence interval exists for the ratio of risk estimates, and this ratio between different treatment modalities may be statistically significant when there is an effective dose equivalent difference of at least 50%. Such a difference may exist between clinically reasonable treatment options, including 6-MV IMRT versus 18-MV IMRT for prostate therapy. Conclusion: The ratio of the risk between different treatment modalities may be significantly different. Consequently, risk models and associated risk estimates may be useful and meaningful for evaluating different treatment options. The calculated risk of secondary malignancy should be considered in the selection of an optimal treatment plan.
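The key point above, that shared risk-model uncertainty largely cancels in a ratio of risks but not in the absolute risks, can be illustrated with a small Monte Carlo sketch under assumed lognormal uncertainties (all numbers hypothetical):

```python
import math, random

rng = random.Random(7)
n = 50_000
ratios, absolutes = [], []
for _ in range(n):
    # Shared risk-model uncertainty (e.g., dose-response slope) multiplies
    # both modalities alike; only small modality-specific factors differ.
    shared = math.exp(rng.gauss(0, 0.8))
    risk_a = 1.0e-2 * shared * math.exp(rng.gauss(0, 0.1))
    risk_b = 1.5e-2 * shared * math.exp(rng.gauss(0, 0.1))
    absolutes.append(risk_a)
    ratios.append(risk_b / risk_a)

def rel_width(xs):
    """Width of the central 90% interval, relative to the median."""
    xs = sorted(xs)
    return (xs[int(0.95 * len(xs))] - xs[int(0.05 * len(xs))]) / xs[len(xs) // 2]
```

The shared factor cancels term-by-term in `risk_b / risk_a`, so `rel_width(ratios)` comes out far smaller than `rel_width(absolutes)`, which is the mechanism behind the paper's narrower confidence interval for the risk ratio.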

  7. An accurate modeling, simulation, and analysis tool for predicting and estimating Raman LIDAR system performance

    NASA Astrophysics Data System (ADS)

    Grasso, Robert J.; Russo, Leonard P.; Barrett, John L.; Odhner, Jefferson E.; Egbert, Paul I.

    2007-09-01

    BAE Systems presents the results of a program to model the performance of Raman LIDAR systems for the remote detection of atmospheric gases, air-polluting hydrocarbons, chemical and biological weapons, and other molecular species of interest. Our model, which integrates remote Raman spectroscopy, 2D and 3D LADAR, and USAF atmospheric propagation codes, permits accurate determination of the performance of a Raman LIDAR system. The very high predictive accuracy of our model is due to the very accurate calculation of the differential scattering cross section for the species of interest at user-selected wavelengths. We show excellent correlation of our calculated cross-section data, used in our model, with experimental data obtained from both laboratory measurements and the published literature. In addition, the use of standard USAF atmospheric models provides very accurate determination of the atmospheric extinction at both the excitation and Raman-shifted wavelengths.

  8. Bi-fluorescence imaging for estimating accurately the nuclear condition of Rhizoctonia spp.

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Aims: To simplify the determination of the nuclear condition of the pathogenic Rhizoctonia, which currently must be performed either with two fluorescent dyes, which is more costly and time-consuming, or with only one fluorescent dye, which is less accurate. Methods and Results: A red primary ...

  9. Estimation method of point spread function based on Kalman filter for accurately evaluating real optical properties of photonic crystal fibers.

    PubMed

    Shen, Yan; Lou, Shuqin; Wang, Xin

    2014-03-20

    The evaluation accuracy of real optical properties of photonic crystal fibers (PCFs) is determined by the accurate extraction of air hole edges from microscope images of cross sections of practical PCFs. A novel estimation method of point spread function (PSF) based on Kalman filter is presented to rebuild the micrograph image of the PCF cross-section and thus evaluate real optical properties for practical PCFs. Through tests on both artificially degraded images and microscope images of cross sections of practical PCFs, we prove that the proposed method can achieve more accurate PSF estimation and lower PSF variance than the traditional Bayesian estimation method, and thus also reduce the defocus effect. With this method, we rebuild the microscope images of two kinds of commercial PCFs produced by Crystal Fiber and analyze the real optical properties of these PCFs. Numerical results are in accord with the product parameters. PMID:24663461
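The filtering principle behind the PSF estimation above, recursively tightening an estimate and shrinking its variance as noisy measurements arrive, can be shown with a minimal scalar Kalman filter. This is only the generic filter, not the authors' PSF model, and the measurement values are invented:

```python
def kalman_constant(measurements, meas_var, init, init_var, process_var=1e-6):
    """Scalar Kalman filter for a (nearly) constant quantity: each noisy
    measurement pulls the estimate toward itself by the Kalman gain and
    shrinks the estimate's variance."""
    x, p = init, init_var
    for z in measurements:
        p += process_var                 # predict (random-walk process model)
        k = p / (p + meas_var)           # Kalman gain
        x += k * (z - x)                 # update with the measurement residual
        p *= (1 - k)                     # posterior variance
    return x, p

# Hypothetical repeated noisy measurements of one quantity
zs = [2.3, 1.9, 2.1, 2.2, 1.8, 2.0, 2.05]
est, var = kalman_constant(zs, meas_var=0.04, init=0.0, init_var=1.0)
```

After a handful of updates the posterior variance drops well below the single-measurement variance, which is the sense in which the paper's Kalman-based PSF estimate achieves "lower PSF variance" than a one-shot estimate.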

  10. Unilateral Prostate Cancer Cannot be Accurately Predicted in Low-Risk Patients

    SciTech Connect

    Isbarn, Hendrik; Karakiewicz, Pierre I.; Vogel, Susanne

    2010-07-01

    Purpose: Hemiablative therapy (HAT) is increasing in popularity for treatment of patients with low-risk prostate cancer (PCa). The validity of this therapeutic modality, which exclusively treats PCa within a single prostate lobe, rests on accurate staging. We tested the accuracy of unilaterally unremarkable biopsy findings in cases of low-risk PCa patients who are potential candidates for HAT. Methods and Materials: The study population consisted of 243 men with clinical stage ≤T2a, a prostate-specific antigen (PSA) concentration of <10 ng/ml, a biopsy-proven Gleason sum of ≤6, and a maximum of 2 ipsilateral positive biopsy results out of 10 or more cores. All men underwent a radical prostatectomy, and pathology stage was used as the gold standard. Univariable and multivariable logistic regression models were tested for significant predictors of unilateral, organ-confined PCa. These predictors consisted of PSA, %fPSA (defined as the quotient of free [uncomplexed] PSA divided by the total PSA), clinical stage (T2a vs. T1c), gland volume, and number of positive biopsy cores (2 vs. 1). Results: Despite unilateral stage at biopsy, bilateral or even non-organ-confined PCa was reported in 64% of all patients. In multivariable analyses, no variable could clearly and independently predict the presence of unilateral PCa. This was reflected in an overall accuracy of 58% (95% confidence interval, 50.6-65.8%). Conclusions: Two-thirds of patients with unilateral low-risk PCa, confirmed by clinical stage and biopsy findings, have bilateral or non-organ-confined PCa at radical prostatectomy. This alarming finding questions the safety and validity of HAT.

  11. Technical note: tree truthing: how accurate are substrate estimates in primate field studies?

    PubMed

    Bezanson, Michelle; Watts, Sean M; Jobin, Matthew J

    2012-04-01

    Field studies of primate positional behavior typically rely on ground-level estimates of substrate size, angle, and canopy location. These estimates potentially influence the identification of positional modes by the observer recording behaviors. In this study we aim to test ground-level estimates against direct measurements of support angles, diameters, and canopy heights in trees at La Suerte Biological Research Station in Costa Rica. After reviewing methods that have been used by past researchers, we provide data collected within trees that are compared to estimates obtained from the ground. We climbed five trees and measured 20 supports. Four observers collected measurements of each support from different locations on the ground. Diameter estimates varied from the direct tree measures by 0-28 cm (mean: 5.44 ± 4.55). Substrate angles varied by 1-55° (mean: 14.76 ± 14.02). Height in the tree was best estimated using a clinometer, as estimates made with a two-meter reference placed by the tree varied by 3-11 meters (mean: 5.31 ± 2.44). We determined that the best support size estimates were those generated relative to the size of the focal animal and divided into broader categories. Support angles were best estimated in 5° increments and then checked using a Haglöf clinometer in combination with a laser pointer. We conclude that three major factors should be addressed when estimating support features: observer error (e.g., experience and distance from the target), support deformity, and how support size and angle influence the positional mode selected by a primate individual. PMID:22371099

  12. What's the Risk? A Simple Approach for Estimating Adjusted Risk Measures from Nonlinear Models Including Logistic Regression

    PubMed Central

    Kleinman, Lawrence C; Norton, Edward C

    2009-01-01

    Objective To develop and validate a general method (called regression risk analysis) to estimate adjusted risk measures from logistic and other nonlinear multiple regression models. We show how to estimate standard errors for these estimates. These measures could supplant various approximations (e.g., adjusted odds ratio [AOR]) that may diverge, especially when outcomes are common. Study Design Regression risk analysis estimates were compared with internal standards as well as with Mantel–Haenszel estimates, Poisson and log-binomial regressions, and a widely used (but flawed) equation to calculate adjusted risk ratios (ARR) from AOR. Data Collection Data sets produced using Monte Carlo simulations. Principal Findings Regression risk analysis accurately estimates ARR and differences directly from multiple regression models, even when confounders are continuous, distributions are skewed, outcomes are common, and effect size is large. It is statistically sound and intuitive, and has properties favoring it over other methods in many cases. Conclusions Regression risk analysis should be the new standard for presenting findings from multiple regression analysis of dichotomous outcomes for cross-sectional, cohort, and population-based case–control studies, particularly when outcomes are common or effect size is large. PMID:18793213
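
    The core of regression risk analysis is marginal standardization: set the exposure to each level for every subject, average the model-predicted risks, and take the ratio. A minimal sketch is below; the coefficients and data are purely illustrative, and the published method's standard-error machinery is not shown.

    ```python
    import numpy as np

    def predicted_risk(beta, X):
        """Logistic-model predicted probability for each row of X
        (first column of X is the intercept)."""
        return 1.0 / (1.0 + np.exp(-(X @ beta)))

    def adjusted_risk_ratio(beta, X, exposure_col):
        """Marginal standardization: set the exposure to 1 for everyone,
        then 0 for everyone, average the predicted risks, and take the
        ratio (the adjusted risk ratio, ARR)."""
        X1, X0 = X.copy(), X.copy()
        X1[:, exposure_col] = 1.0
        X0[:, exposure_col] = 0.0
        return predicted_risk(beta, X1).mean() / predicted_risk(beta, X0).mean()

    # Illustrative fitted coefficients: intercept, binary exposure, one confounder.
    rng = np.random.default_rng(0)
    n = 1000
    X = np.column_stack([np.ones(n),
                         rng.integers(0, 2, n).astype(float),
                         rng.normal(size=n)])
    beta = np.array([-1.0, 0.8, 0.5])   # log-odds scale
    arr = adjusted_risk_ratio(beta, X, exposure_col=1)
    ```

    Because the outcome here is common, the ARR is noticeably smaller than the odds ratio exp(0.8) ≈ 2.2, which is exactly the divergence the authors warn about.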

  13. The over-estimation of risk in pregnancy.

    PubMed

    Robinson, Monique; Pennell, Craig E; McLean, Neil J; Oddy, Wendy H; Newnham, John P

    2011-06-01

    The concept of risk is especially salient to obstetric care. Unknown factors can still be responsible for perinatal morbidity and mortality in circumstances that appeared to present little risk, while perfectly healthy infants are born in high-risk circumstances: a contradiction that patients and providers struggle with on a daily basis. With such contradictions comes the potential for the over-estimation of risk during pregnancy in order to assure a positive outcome. Understanding and addressing the estimation of risk during pregnancy requires acknowledging the history of obstetric risk in addition to understanding risk-related psychological theory. A relationship of trust between provider and patient is vital in addressing risk over-estimation, as is encouraging the development of self-efficacy in patients. Ultimately, obstetric care is complex, and efforts to avoid prenatal risk exposure based on heightened perceptions of threat may do more harm than the perceived threat itself. PMID:21480770

  14. Accurate state estimation for a hydraulic actuator via a SDRE nonlinear filter

    NASA Astrophysics Data System (ADS)

    Strano, Salvatore; Terzo, Mario

    2016-06-01

    State estimation in hydraulic actuators is a fundamental tool for the detection of faults and a valid alternative to the installation of sensors. Due to the hard nonlinearities that characterize hydraulic actuators, the performance of linear and linearization-based state estimation techniques is strongly limited. In order to overcome these limits, this paper focuses on an alternative nonlinear estimation method based on the State-Dependent Riccati Equation (SDRE). The technique is able to fully take into account the system nonlinearities and the measurement noise. A fifth-order nonlinear model is derived and employed for the synthesis of the estimator. Simulations and experimental tests have been conducted, and comparisons with the widely used Extended Kalman Filter (EKF) are illustrated. The results show the effectiveness of the SDRE-based technique for applications characterized by non-negligible nonlinearities such as dead zones and friction.

  15. Spatio-temporal population estimates for risk management

    NASA Astrophysics Data System (ADS)

    Cockings, Samantha; Martin, David; Smith, Alan; Martin, Rebecca

    2013-04-01

    Accurate estimation of population at risk from hazards and effective emergency management of events require not just appropriate spatio-temporal modelling of hazards but also of population. While much recent effort has been focused on improving the modelling and prediction of hazards (both natural and anthropogenic), there has been little parallel advance in the measurement or modelling of population statistics. Different hazard types occur over diverse temporal cycles, are of varying duration and differ significantly in their spatial extent. Even events of the same hazard type, such as flood events, vary markedly in their spatial and temporal characteristics. Conceptually and pragmatically, then, population estimates should also be available at similarly varying spatio-temporal scales. Routine population statistics derived from traditional censuses or surveys are usually static representations in both space and time, recording people at their place of usual residence on census/survey night and presenting data for administratively defined areas. Such representations effectively fix the scale of population estimates in both space and time, which is unhelpful for meaningful risk management. Over recent years, the Pop24/7 programme of research, based at the University of Southampton (UK), has developed a framework for spatio-temporal modelling of population, based on gridded population surfaces. Built on a data model which is fully flexible in terms of space and time, the framework allows population estimates to be produced for any time slice relevant to the data contained in the model. It is based around a set of origin and destination centroids, which have capacities, spatial extents and catchment areas, all of which can vary temporally, such as by time of day, day of week or season. A background layer, containing information on features such as transport networks and land use, provides information on the likelihood of people being in certain places at specific times.

  16. Estimated radiation risks associated with endodontic radiography.

    PubMed

    Danforth, R A; Torabinejad, M

    1990-02-01

    Endodontic patients are sometimes concerned about the risks of tumors or cataracts from radiation exposure during root canal therapy. By using established dose and risk information, we calculated the extent of these risks. The chance of getting leukemia from an endodontic x-ray survey using 90 kVp was found to be 1 in 7.69 million, the same as the risk of dying from cancer from smoking 0.94 cigarettes or from an auto accident when driving 3.7 km. Risk of thyroid gland neoplasia was 1 in 667,000 (smoking 11.6 cigarettes, driving 45 km) and risk of salivary gland neoplasia 1 in 1.35 million (smoking 5.4 cigarettes, driving 21.1 km). Use of 70 kVp radiography reduced these risks only slightly. To receive the threshold dose to eyes to produce cataract changes, a patient would have to undergo 10,900 endodontic surveys. PMID:2390963

  17. Objective estimates improve risk stratification for primary graft dysfunction after lung transplantation

    PubMed Central

    Shah, Rupal J.; Diamond, Joshua M.; Cantu, Edward; Flesch, Judd; Lee, James C.; Lederer, David J.; Lama, Vibha N.; Orens, Jonathon; Weinacker, Ann; Wilkes, David S.; Roe, David; Bhorade, Sangeeta; Wille, Keith M.; Ware, Lorraine B.; Palmer, Scott M.; Crespo, Maria; Demissie, Ejigayehu; Sonnet, Joshua; Shah, Ashish; Kawut, Steven M.; Bellamy, Scarlett L.; Localio, A. Russell; Christie, Jason D.

    2016-01-01

    Primary graft dysfunction (PGD) is a major cause of early mortality after lung transplant. We aimed to define objective estimates of PGD risk based on readily available clinical variables, using a prospective study of 11 centers in the Lung Transplant Outcomes Group (LTOG). Derivation included 1255 subjects from 2002–2010, with separate validation in 382 subjects accrued from 2011–2012. We used logistic regression to identify predictors of grade 3 PGD at 48/72 hours, and decision curve methods to assess impact on clinical decisions. 211/1255 subjects in the derivation and 56/382 subjects in the validation developed PGD. We developed 3 prediction models, where low-risk recipients had a normal BMI (18.5–25 kg/m2), COPD/CF, and absent or mild PH (mPAP < 40 mmHg). All others were considered higher-risk. Low-risk recipients had a predicted PGD risk of 4–7%, and high-risk recipients a predicted PGD risk of 15–18%. Adding a lung from a donor with a smoking history to a higher-risk recipient significantly increased PGD risk, although risk did not change in low-risk recipients. Validation demonstrated that probability estimates were generally accurate and that the models worked best at baseline PGD incidences between 5–25%. We conclude that valid estimates of PGD risk can be produced using readily available clinical variables. PMID:25877792
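
    The recipient stratification described in the abstract reduces to a simple rule. A minimal sketch, with the thresholds (normal BMI, COPD or cystic fibrosis diagnosis, mPAP below 40 mmHg) and the risk percentages taken verbatim from the abstract rather than recomputed:

    ```python
    def pgd_risk_group(bmi, diagnosis, mpap):
        """Classify a lung-transplant recipient per the abstract's rule:
        low-risk requires a normal BMI (18.5-25 kg/m2), a COPD or CF
        diagnosis, and absent/mild pulmonary hypertension (mPAP < 40 mmHg);
        everyone else is higher-risk."""
        low_risk = (18.5 <= bmi <= 25.0
                    and diagnosis in {"COPD", "CF"}
                    and mpap < 40.0)
        # Predicted grade 3 PGD risk ranges as reported in the abstract.
        return ("low", "4-7%") if low_risk else ("higher", "15-18%")
    ```

    Note that a donor smoking history shifts risk only within the higher-risk group, so it is deliberately absent from the low/higher split above.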

  18. FAST TRACK COMMUNICATION Accurate estimate of α variation and isotope shift parameters in Na and Mg+

    NASA Astrophysics Data System (ADS)

    Sahoo, B. K.

    2010-12-01

    We present accurate calculations of fine-structure constant variation coefficients and isotope shifts in Na and Mg+ using the relativistic coupled-cluster method. In our approach, we are able to determine explicitly the roles of various correlation effects to all orders in these calculations. Most of the results, especially for the excited states, are reported for the first time. Using the above results, it is possible to identify suitable anchor and probe lines in the considered systems for studies of a possible variation in the fine-structure constant.

  19. Accurate State Estimation and Tracking of a Non-Cooperative Target Vehicle

    NASA Technical Reports Server (NTRS)

    Thienel, Julie K.; Sanner, Robert M.

    2006-01-01

    Autonomous space rendezvous scenarios require knowledge of the target vehicle's state so that the chaser vehicle can dock with it safely. Ideally, the target vehicle state information is derived from telemetered data, or with the use of known tracking points on the target vehicle. However, if the target vehicle is non-cooperative and does not have the ability to maintain attitude control, or transmit attitude knowledge, the docking becomes more challenging. This work presents a nonlinear approach for estimating the body rates of a non-cooperative target vehicle, and coupling this estimation to a tracking control scheme. The approach is tested with the robotic servicing mission concept for the Hubble Space Telescope (HST). Such a mission would require not only estimates of the HST attitude and rates, but also precision control to achieve the desired rate and maintain the orientation to successfully dock with HST.

  20. Precision Pointing Control to and Accurate Target Estimation of a Non-Cooperative Vehicle

    NASA Technical Reports Server (NTRS)

    VanEepoel, John; Thienel, Julie; Sanner, Robert M.

    2006-01-01

    In 2004, NASA began investigating a robotic servicing mission for the Hubble Space Telescope (HST). Such a mission would not only require estimates of the HST attitude and rates in order to achieve capture by the proposed Hubble Robotic Vehicle (HRV), but also precision control to achieve the desired rate and maintain the orientation to successfully dock with HST. To generalize the situation, HST is the target vehicle and HRV is the chaser. This work presents a nonlinear approach for estimating the body rates of a non-cooperative target vehicle, and coupling this estimation to a control scheme. Non-cooperative in this context relates to the target vehicle no longer having the ability to maintain attitude control or transmit attitude knowledge.

  1. A microbial clock provides an accurate estimate of the postmortem interval in a mouse model system

    PubMed Central

    Metcalf, Jessica L; Wegener Parfrey, Laura; Gonzalez, Antonio; Lauber, Christian L; Knights, Dan; Ackermann, Gail; Humphrey, Gregory C; Gebert, Matthew J; Van Treuren, Will; Berg-Lyons, Donna; Keepers, Kyle; Guo, Yan; Bullard, James; Fierer, Noah; Carter, David O; Knight, Rob

    2013-01-01

    Establishing the time since death is critical in every death investigation, yet existing techniques are susceptible to a range of errors and biases. For example, forensic entomology is widely used to assess the postmortem interval (PMI), but errors can range from days to months. Microbes may provide a novel method for estimating PMI that avoids many of these limitations. Here we show that postmortem microbial community changes are dramatic, measurable, and repeatable in a mouse model system, allowing PMI to be estimated within approximately 3 days over 48 days. Our results provide a detailed understanding of bacterial and microbial eukaryotic ecology within a decomposing corpse system and suggest that microbial community data can be developed into a forensic tool for estimating PMI. DOI: http://dx.doi.org/10.7554/eLife.01104.001 PMID:24137541

  2. Fast and accurate probability density estimation in large high dimensional astronomical datasets

    NASA Astrophysics Data System (ADS)

    Gupta, Pramod; Connolly, Andrew J.; Gardner, Jeffrey P.

    2015-01-01

    Astronomical surveys will generate measurements of hundreds of attributes (e.g. color, size, shape) on hundreds of millions of sources. Analyzing these large, high dimensional data sets will require efficient algorithms for data analysis. An example of this is probability density estimation, which is at the heart of many classification problems such as the separation of stars and quasars based on their colors. Popular density estimation techniques use binning or kernel density estimation. Kernel density estimation has a small memory footprint but often requires large computational resources. Binning has small computational requirements, but it is usually implemented with multi-dimensional arrays, which leads to memory requirements that scale exponentially with the number of dimensions. Hence neither technique scales well to large data sets in high dimensions. We present an alternative approach of binning implemented with hash tables (BASH tables). This approach uses the sparseness of data in the high dimensional space to ensure that the memory requirements are small. However, hashing requires some extra computation, so a priori it is not clear whether the reduction in memory requirements will lead to increased computational requirements. Through an implementation of BASH tables in C++ we show that the additional computational requirements of hashing are negligible. Hence this approach has small memory and computational requirements. We apply our density estimation technique to photometric selection of quasars using non-parametric Bayesian classification and show that the accuracy of the classification is the same as that of earlier approaches. Since the BASH table approach is one to three orders of magnitude faster than the earlier approaches, it may be useful in various other applications of density estimation in astrostatistics.
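
    The hash-table binning idea fits in a few lines. This is an illustrative Python reimplementation, not the authors' C++ code, and the bin width and data are invented; the point is that memory grows with the number of occupied bins only.

    ```python
    import numpy as np
    from collections import defaultdict

    def bash_table(points, bin_width):
        """Sparse multi-dimensional histogram: hash each point's integer
        bin coordinates into a dictionary, so memory scales with the
        number of *occupied* bins rather than n_bins ** n_dims."""
        counts = defaultdict(int)
        for p in points:
            counts[tuple(np.floor(p / bin_width).astype(int))] += 1
        return counts

    def density_at(counts, point, bin_width, n_total):
        """Histogram density estimate at `point`: bin occupancy divided
        by (sample size * bin volume)."""
        key = tuple(np.floor(point / bin_width).astype(int))
        return counts.get(key, 0) / (n_total * bin_width ** len(point))

    rng = np.random.default_rng(1)
    pts = rng.normal(size=(10000, 5))       # 10k points in 5 dimensions
    table = bash_table(pts, bin_width=0.5)
    d = density_at(table, np.zeros(5), 0.5, len(pts))
    ```

    A dense 5-D array with, say, 40 bins per axis would need 40**5 ≈ 10^8 cells; the dictionary above holds at most one entry per data point.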

  3. Spectral estimation from laser scanner data for accurate color rendering of objects

    NASA Astrophysics Data System (ADS)

    Baribeau, Rejean

    2002-06-01

    Estimation methods are studied for the recovery of the spectral reflectance across the visible range from sensing at just three discrete laser wavelengths. Methods based on principal component analysis and on spline interpolation are judged based on the CIE94 color differences for some reference data sets. These include the Macbeth color checker, the OSA-UCS color charts, some artist pigments, and a collection of miscellaneous surface colors. The optimal three sampling wavelengths are also investigated. It is found that color can be estimated with average accuracy ΔE94 = 2.3 when the optimal wavelengths 455 nm, 540 nm, and 610 nm are used.

  4. Crop area estimation based on remotely-sensed data with an accurate but costly subsample

    NASA Technical Reports Server (NTRS)

    Gunst, R. F.

    1985-01-01

    Research activities conducted under the auspices of National Aeronautics and Space Administration Cooperative Agreement NCC 9-9 are discussed. During this contract period, research efforts were concentrated in two primary areas. The first area is an investigation of the use of measurement error models as alternatives to least squares regression estimators of crop production or timber biomass. The second primary area of investigation is the estimation of the mixing proportion of two-component mixture models. This report lists publications, technical reports, submitted manuscripts, and oral presentations generated by these research efforts. Possible areas of future research are mentioned.

  5. Accurate radiocarbon age estimation using "early" measurements: a new approach to reconstructing the Paleolithic absolute chronology

    NASA Astrophysics Data System (ADS)

    Omori, Takayuki; Sano, Katsuhiro; Yoneda, Minoru

    2014-05-01

    This paper presents new correction approaches for "early" radiocarbon ages to reconstruct the Paleolithic absolute chronology. In order to discuss the time-space distribution of the replacement of archaic humans, including Neanderthals in Europe, by modern humans, a massive dataset covering a wide area is needed. Today, several radiocarbon databases focused on the Paleolithic have been published and used for chronological studies. From the viewpoint of current analytical technology, however, these databases contain unreliable results that make interpretation of radiocarbon dates difficult. Most of these unreliable ages were published in the early days of radiocarbon analysis. In recent years, new analytical methods to determine highly accurate dates have been developed. Ultrafiltration and ABOx-SC methods, as new sample pretreatments for bone and charcoal respectively, have attracted attention because they can remove imperceptible contaminants and yield reliably accurate ages. In order to evaluate the reliability of "early" data, we investigated the differences and variability of radiocarbon ages obtained under different pretreatments, and attempted to develop correction functions for assessing their reliability. The corrected ages are expected to be more reliable and applicable to chronological research together with recently measured ages. Here, we introduce the methodological framework and archaeological applications.

  6. Relating space radiation environments to risk estimates

    SciTech Connect

    Curtis, S.B.

    1991-10-01

    This lecture will provide a bridge from the physical energy or LET spectra as might be calculated in an organ to the risk of carcinogenesis, a particular concern for extended missions to the moon or beyond to Mars. Topics covered will include (1) LET spectra expected from galactic cosmic rays, (2) probabilities that individual cell nuclei in the body will be hit by heavy galactic cosmic ray particles, (3) the conventional methods of calculating risks from a mixed environment of high and low LET radiation, (4) an alternate method which provides certain advantages using fluence-related risk coefficients (risk cross sections), and (5) directions for future research and development of these ideas.

  7. A Generalized Subspace Least Mean Square Method for High-resolution Accurate Estimation of Power System Oscillation Modes

    SciTech Connect

    Zhang, Peng; Zhou, Ning; Abdollahi, Ali

    2013-09-10

    A Generalized Subspace-Least Mean Square (GSLMS) method is presented for accurate and robust estimation of oscillation modes from exponentially damped power system signals. The method is based on the orthogonality of the signal and noise eigenvectors of the signal autocorrelation matrix. Performance of the proposed method is evaluated using Monte Carlo simulation and compared with the Prony method. Test results show that the GSLMS is highly resilient to noise and significantly outperforms the Prony method in tracking power system modes under noisy environments.

  8. Accurate motion parameter estimation for colonoscopy tracking using a regression method

    NASA Astrophysics Data System (ADS)

    Liu, Jianfei; Subramanian, Kalpathi R.; Yoo, Terry S.

    2010-03-01

    Co-located optical and virtual colonoscopy images have the potential to provide important clinical information during routine colonoscopy procedures. In our earlier work, we presented an optical flow based algorithm to compute egomotion from live colonoscopy video, permitting navigation and visualization of the corresponding patient anatomy. In the original algorithm, motion parameters were estimated using the traditional least sum of squares (LS) procedure, which can be unstable in the presence of optical flow vectors with large errors. In the improved algorithm, we use the least median of squares (LMS) method, a robust regression method, for motion parameter estimation. Using the LMS method, we iteratively analyze and converge toward the main distribution of the flow vectors, while disregarding outliers. We show through three experiments the improvement in tracking results obtained using the LMS method, in comparison to the LS estimator. The first experiment demonstrates better spatial accuracy in positioning the virtual camera in the sigmoid colon. The second and third experiments demonstrate the robustness of this estimator, resulting in longer tracked sequences: from 300 to 1310 in the ascending colon, and from 410 to 1316 in the transverse colon.
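
    A least-median-of-squares fit can be sketched with random minimal subsets. This toy 2-D line-fitting version (not the authors' egomotion estimator; data and trial count are invented) shows why minimizing the *median* squared residual ignores outliers that would wreck a least-squares fit:

    ```python
    import numpy as np

    def lms_line(x, y, n_trials=500, seed=0):
        """Least Median of Squares fit of y = a*x + b: repeatedly fit a
        line to a random minimal subset (2 points) and keep the candidate
        whose squared residuals have the smallest median over all points.
        Robust to up to ~50% gross outliers, unlike least squares."""
        rng = np.random.default_rng(seed)
        best, best_med = None, np.inf
        n = len(x)
        for _ in range(n_trials):
            i, j = rng.choice(n, size=2, replace=False)
            if x[i] == x[j]:
                continue                     # degenerate pair, skip
            a = (y[j] - y[i]) / (x[j] - x[i])
            b = y[i] - a * x[i]
            med = np.median((y - (a * x + b)) ** 2)
            if med < best_med:
                best_med, best = med, (a, b)
        return best

    rng = np.random.default_rng(2)
    x = np.linspace(0, 10, 100)
    y = 2.0 * x + 1.0 + rng.normal(0, 0.1, 100)   # true line: a=2, b=1
    y[:30] += 15.0                                # 30% gross outliers
    a, b = lms_line(x, y)
    ```

    An ordinary least-squares fit on the same data would be dragged toward the contaminated points; the median criterion never rewards fitting them, since fewer than half the residuals can be made small that way.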

  9. How Accurate and Robust Are the Phylogenetic Estimates of Austronesian Language Relationships?

    PubMed Central

    Greenhill, Simon J.; Drummond, Alexei J.; Gray, Russell D.

    2010-01-01

    We recently used computational phylogenetic methods on lexical data to test between two scenarios for the peopling of the Pacific. Our analyses of lexical data supported a pulse-pause scenario of Pacific settlement in which the Austronesian speakers originated in Taiwan around 5,200 years ago and rapidly spread through the Pacific in a series of expansion pulses and settlement pauses. We claimed that there was high congruence between traditional language subgroups and those observed in the language phylogenies, and that the estimated age of the Austronesian expansion at 5,200 years ago was consistent with the archaeological evidence. However, the congruence between the language phylogenies and the evidence from historical linguistics was not quantitatively assessed using tree comparison metrics. The robustness of the divergence time estimates to different calibration points was also not investigated exhaustively. Here we address these limitations by using a systematic tree comparison metric to calculate the similarity between the Bayesian phylogenetic trees and the subgroups proposed by historical linguistics, and by re-estimating the age of the Austronesian expansion using only the most robust calibrations. The results show that the Austronesian language phylogenies are highly congruent with the traditional subgroupings, and the date estimates are robust even when calculated using a restricted set of historical calibrations. PMID:20224774

  10. Accurate Angle Estimator for High-Frame-Rate 2-D Vector Flow Imaging.

    PubMed

    Villagomez Hoyos, Carlos Armando; Stuart, Matthias Bo; Hansen, Kristoffer Lindskov; Nielsen, Michael Bachmann; Jensen, Jorgen Arendt

    2016-06-01

    This paper presents a novel approach for estimating 2-D flow angles using a high-frame-rate ultrasound method. The angle estimator features high accuracy and low standard deviation (SD) over the full 360° range. The method is validated on Field II simulations and phantom measurements using the experimental ultrasound scanner SARUS and a flow rig before being tested in vivo. An 8-MHz linear array transducer is used with defocused beam emissions. In the simulations of a spinning disk phantom, a 360° uniform behavior on the angle estimation is observed with a median angle bias of 1.01° and a median angle SD of 1.8°. Similar results are obtained on a straight vessel for both simulations and measurements, where the obtained angle biases are below 1.5° with SDs around 1°. Estimated velocity magnitudes are also kept under 10% bias and 5% relative SD in both simulations and measurements. An in vivo measurement is performed on a carotid bifurcation of a healthy individual. A 3-s acquisition during three heart cycles is captured. A consistent and repetitive vortex is observed in the carotid bulb during systoles. PMID:27093598

  11. Accurate estimation of influenza epidemics using Google search data via ARGO.

    PubMed

    Yang, Shihao; Santillana, Mauricio; Kou, S C

    2015-11-24

    Accurate real-time tracking of influenza outbreaks helps public health officials make timely and meaningful decisions that could save lives. We propose an influenza tracking model, ARGO (AutoRegression with GOogle search data), that uses publicly available online search data. In addition to having a rigorous statistical foundation, ARGO outperforms all previously available Google-search-based tracking models, including the latest version of Google Flu Trends, even though it uses only low-quality search data as input from publicly available Google Trends and Google Correlate websites. ARGO not only incorporates the seasonality in influenza epidemics but also captures changes in people's online search behavior over time. ARGO is also flexible, self-correcting, robust, and scalable, making it a potentially powerful tool that can be used for real-time tracking of other social events at multiple temporal and spatial resolutions. PMID:26553980

  12. Accurate estimation of influenza epidemics using Google search data via ARGO

    PubMed Central

    Yang, Shihao; Santillana, Mauricio; Kou, S. C.

    2015-01-01

    Accurate real-time tracking of influenza outbreaks helps public health officials make timely and meaningful decisions that could save lives. We propose an influenza tracking model, ARGO (AutoRegression with GOogle search data), that uses publicly available online search data. In addition to having a rigorous statistical foundation, ARGO outperforms all previously available Google-search–based tracking models, including the latest version of Google Flu Trends, even though it uses only low-quality search data as input from publicly available Google Trends and Google Correlate websites. ARGO not only incorporates the seasonality in influenza epidemics but also captures changes in people’s online search behavior over time. ARGO is also flexible, self-correcting, robust, and scalable, making it a potentially powerful tool that can be used for real-time tracking of other social events at multiple temporal and spatial resolutions. PMID:26553980
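
    As a rough sketch of the modeling structure (self-lags plus exogenous search predictors in one linear model), consider the following. The published ARGO uses 52 weekly lags, many query terms, L1 regularization, and a dynamically retrained rolling window, none of which is shown here; the data are synthetic.

    ```python
    import numpy as np

    def fit_lagged_exogenous(y, X, n_lags=3):
        """Least-squares fit of y[t] on an intercept, n_lags of its own
        past values, and the exogenous predictors X[t] (ARGO-style
        autoregression-with-exogenous-inputs structure)."""
        rows, targets = [], []
        for t in range(n_lags, len(y)):
            rows.append(np.concatenate(([1.0], y[t - n_lags:t], X[t])))
            targets.append(y[t])
        A, b = np.array(rows), np.array(targets)
        coef, *_ = np.linalg.lstsq(A, b, rcond=None)
        return coef, A @ coef, b     # coefficients, fitted values, targets

    # Synthetic "flu activity" with AR structure, plus two noisy
    # search-volume proxies that track it contemporaneously.
    rng = np.random.default_rng(3)
    T = 300
    y = np.zeros(T)
    for t in range(1, T):
        y[t] = 0.8 * y[t - 1] + rng.normal(0, 0.1)
    X = np.column_stack([y + rng.normal(0, 0.1, T),
                         y + rng.normal(0, 0.15, T)])
    coef, fitted, actual = fit_lagged_exogenous(y, X)
    ```

    The design choice to keep both the lags and the search proxies is the point: the lags encode seasonality and persistence, while the exogenous terms supply the real-time signal.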

  13. Raman spectroscopy for highly accurate estimation of the age of refrigerated porcine muscle

    NASA Astrophysics Data System (ADS)

    Timinis, Constantinos; Pitris, Costas

    2016-03-01

    The high water content of meat, combined with all the nutrients it contains, makes it vulnerable to spoilage at all stages of production and storage, even when refrigerated at 5 °C. A non-destructive and in situ tool for meat sample testing, which could provide an accurate indication of the storage time of meat, would be very useful for the control of meat quality as well as for consumer safety. The proposed solution is based on Raman spectroscopy, which is non-invasive and can be applied in situ. For the purposes of this project, 42 meat samples from 14 animals were obtained and three Raman spectra per sample were collected every two days for two weeks. The spectra were subsequently processed and the sample age was calculated using a set of linear differential equations. In addition, the samples were classified into categories corresponding to age in 2-day steps (i.e., 0, 2, 4, 6, 8, 10, 12 or 14 days old), using linear discriminant analysis and cross-validation. Contrary to other studies, where samples were simply grouped into two categories (higher or lower quality, suitable or unsuitable for human consumption, etc.), in this study the age was predicted with a mean error of ~1 day (20%) or classified, in 2-day steps, with 100% accuracy. Although Raman spectroscopy has been used in the past for the analysis of meat samples, the proposed methodology predicts the sample age far more accurately than any previous report in the literature.

  14. Techniques for accurate estimation of net discharge in a tidal channel

    USGS Publications Warehouse

    Simpson, Michael R.; Bland, Roger

    1999-01-01

    An ultrasonic velocity meter discharge-measurement site in a tidally affected region of the Sacramento-San Joaquin rivers was used to study the accuracy of the index velocity calibration procedure. Calibration data consisting of ultrasonic velocity meter index velocity and concurrent acoustic Doppler discharge measurement data were collected during three time periods. The relative magnitude of equipment errors, acoustic Doppler discharge measurement errors, and calibration errors were evaluated. Calibration error was the most significant source of error in estimating net discharge. Using a comprehensive calibration method, net discharge estimates developed from the three sets of calibration data differed by less than an average of 4 cubic meters per second. Typical maximum flow rates during the data-collection period averaged 750 cubic meters per second.
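
    The index-velocity calibration amounts to regressing mean channel velocity (acoustic Doppler discharge divided by channel area) on the meter's index velocity, then predicting discharge from any index reading. A minimal sketch; the channel area, rating coefficients, and noise levels below are all invented for illustration.

    ```python
    import numpy as np

    def fit_index_velocity_rating(v_index, q_measured, area):
        """Index-velocity rating: linear fit of mean velocity (Q / A) on
        the meter's index velocity; returns a function mapping an index
        reading to discharge. Signs carry through, so ebb (negative) and
        flood (positive) tidal flows are both handled."""
        a, b = np.polyfit(v_index, q_measured / area, 1)
        return lambda v: area * (a * np.asarray(v) + b)

    # Synthetic calibration: true mean velocity = 0.9 * index + 0.05 m/s,
    # channel area 1500 m^2, with noise on the ADCP discharge measurements.
    rng = np.random.default_rng(4)
    v_idx = rng.uniform(-1.0, 1.0, 40)          # m/s, tidal reversals
    q_true = 1500.0 * (0.9 * v_idx + 0.05)      # m^3/s
    q_meas = q_true + rng.normal(0, 20.0, 40)
    rating = fit_index_velocity_rating(v_idx, q_meas, 1500.0)
    ```

    The abstract's finding that calibration error dominates corresponds here to the uncertainty in the fitted slope and intercept, not to the per-sample noise on individual discharge measurements.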

  15. Lower bound on reliability for Weibull distribution when shape parameter is not estimated accurately

    NASA Technical Reports Server (NTRS)

    Huang, Zhaofeng; Porter, Albert A.

    1991-01-01

    The mathematical relationships between the shape parameter Beta and estimates of reliability and a life limit lower bound for the two parameter Weibull distribution are investigated. It is shown that under rather general conditions, both the reliability lower bound and the allowable life limit lower bound (often called a tolerance limit) have unique global minimums over a range of Beta. Hence lower bound solutions can be obtained without assuming or estimating Beta. The existence and uniqueness of these lower bounds are proven. Some real data examples are given to show how these lower bounds can be easily established and to demonstrate their practicality. The method developed here has proven to be extremely useful when using the Weibull distribution in analysis of no-failure or few-failures data. The results are applicable not only in the aerospace industry but anywhere that system reliabilities are high.

  16. Lower bound on reliability for Weibull distribution when shape parameter is not estimated accurately

    NASA Technical Reports Server (NTRS)

    Huang, Zhaofeng; Porter, Albert A.

    1990-01-01

    The mathematical relationships between the shape parameter Beta and estimates of reliability and a life limit lower bound for the two parameter Weibull distribution are investigated. It is shown that under rather general conditions, both the reliability lower bound and the allowable life limit lower bound (often called a tolerance limit) have unique global minimums over a range of Beta. Hence lower bound solutions can be obtained without assuming or estimating Beta. The existence and uniqueness of these lower bounds are proven. Some real data examples are given to show how these lower bounds can be easily established and to demonstrate their practicality. The method developed here has proven to be extremely useful when using the Weibull distribution in analysis of no-failure or few-failures data. The results are applicable not only in the aerospace industry but anywhere that system reliabilities are high.

  17. Are satellite based rainfall estimates accurate enough for crop modelling under Sahelian climate?

    NASA Astrophysics Data System (ADS)

    Ramarohetra, J.; Sultan, B.

    2012-04-01

    Agriculture is considered the most climate-dependent human activity. In West Africa, and especially in the Sudano-Sahelian zone, rain-fed agriculture - which represents 93% of cultivated areas and supports 70% of the active population - is highly vulnerable to precipitation variability. To better understand and anticipate climate impacts on agriculture, crop models - which estimate crop yield from climate information (e.g. rainfall, temperature, insolation, humidity) - have been developed. These crop models are useful (i) in ex ante analyses to quantify the impact on yields of implementing different strategies - crop management (e.g. choice of varieties, sowing date), crop insurance or medium-range weather forecasts - (ii) for early warning systems and (iii) to assess future food security. Yet the successful application of these models depends on the accuracy of their climatic drivers. In the Sudano-Sahelian zone, the quality of precipitation estimates is therefore a key factor in understanding and anticipating climate impacts on agriculture via crop modelling and yield estimation. Different kinds of precipitation estimates can be used. Ground measurements have long time series but suffer from insufficient network density, a large proportion of missing values, delays in reporting, and limited availability. An answer to these shortcomings may lie in remote sensing, which provides satellite-based precipitation estimates. However, satellite-based rainfall estimates (SRFE) are not direct measurements but rather estimations of precipitation. When used as input to crop models, their quality determines the performance of the simulated yield, hence SRFE require validation. The SARRAH crop model is used to model three varieties of pearl millet (HKP, MTDO, Souna3) in a square degree centred on 13.5°N and 2.5°E, in Niger. 
Eight satellite-based rainfall daily products (PERSIANN, CMORPH, TRMM 3b42-RT, GSMAP MKV+, GPCP, TRMM 3b42v6, RFEv2 and

  18. Plant DNA Barcodes Can Accurately Estimate Species Richness in Poorly Known Floras

    PubMed Central

    Costion, Craig; Ford, Andrew; Cross, Hugh; Crayn, Darren; Harrington, Mark; Lowe, Andrew

    2011-01-01

    Background Widespread uptake of DNA barcoding technology for vascular plants has been slow due to the relatively poor resolution of species discrimination (∼70%) and low sequencing and amplification success of one of the two official barcoding loci, matK. Studies to date have mostly focused on finding a solution to these intrinsic limitations of the markers, rather than posing questions that can maximize the utility of DNA barcodes for plants with the current technology. Methodology/Principal Findings Here we test the ability of plant DNA barcodes using the two official barcoding loci, rbcLa and matK, plus an alternative barcoding locus, trnH-psbA, to estimate the species diversity of trees in a tropical rainforest plot. Species discrimination accuracy was similar to findings from previous studies, but species richness estimation accuracy proved higher, up to 89%. All combinations which included the trnH-psbA locus performed better at both species discrimination and richness estimation than matK, which showed little enhanced species discriminatory power when concatenated with rbcLa. The utility of the trnH-psbA locus is limited, however, by intraspecific variation observed in some angiosperm families, occurring as an inversion that obscures the monophyly of species. Conclusions/Significance We demonstrate for the first time, using a case study, the potential of plant DNA barcodes for the rapid estimation of species richness in taxonomically poorly known areas or cryptic populations, revealing a powerful new tool for rapid biodiversity assessment. The combination of the rbcLa and trnH-psbA loci performed better for this purpose than any two-locus combination that included matK. We show that although DNA barcodes fail to discriminate all species of plants, new perspectives and methods on biodiversity value and quantification may overshadow some of these shortcomings by applying barcode data in new ways. PMID:22096501

  19. Accurate estimation of retinal vessel width using bagged decision trees and an extended multiresolution Hermite model.

    PubMed

    Lupaşcu, Carmen Alina; Tegolo, Domenico; Trucco, Emanuele

    2013-12-01

    We present an algorithm estimating the width of retinal vessels in fundus camera images. The algorithm uses a novel parametric surface model of the cross-sectional intensities of vessels, and ensembles of bagged decision trees to estimate the local width from the parameters of the best-fit surface. We report comparative tests with REVIEW, currently the public database of reference for retinal width estimation, containing 16 images with 193 annotated vessel segments and 5066 profile points annotated manually by three independent experts. Comparative tests are reported also with our own set of 378 vessel widths selected sparsely in 38 images from the Tayside Scotland diabetic retinopathy screening programme and annotated manually by two clinicians. We obtain considerably better accuracies compared to leading methods in REVIEW tests and in Tayside tests. An important advantage of our method is its stability (success rate, i.e., meaningful measurement returned, of 100% on all REVIEW data sets and on the Tayside data set) compared to a variety of methods from the literature. We also find that results depend crucially on testing data and conditions, and discuss criteria for selecting a training set yielding optimal accuracy. PMID:24001930

  20. Compact and accurate linear and nonlinear autoregressive moving average model parameter estimation using laguerre functions.

    PubMed

    Chon, K H; Cohen, R J; Holstein-Rathlou, N H

    1997-01-01

    A linear and nonlinear autoregressive moving average (ARMA) identification algorithm is developed for modeling time series data. The algorithm uses Laguerre expansion of kernels (LEK) to estimate Volterra-Wiener kernels. However, instead of estimating linear and nonlinear system dynamics via moving average models, as is the case for the Volterra-Wiener analysis, we propose an ARMA model-based approach. The proposed algorithm is essentially the same as LEK, but is extended to include past values of the output as well. Thus, all of the advantages associated with using the Laguerre function remain with our algorithm; but, by extending the algorithm to the linear and nonlinear ARMA model, a significant reduction in the number of Laguerre functions can be made compared with the Volterra-Wiener approach. This translates into a more compact system representation and makes the physiological interpretation of higher order kernels easier. Furthermore, simulation results show better performance of the proposed approach in estimating the system dynamics than LEK in certain cases, and it remains effective in the presence of significant additive measurement noise. PMID:9236985
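The Laguerre basis underlying LEK can be generated with a simple filter-bank recursion. This sketch shows the standard discrete Laguerre construction, not the authors' code; the pole value and array sizes are illustrative:

```python
import numpy as np

def laguerre_basis(a, order, length):
    """Impulse responses of the discrete Laguerre filter bank with pole
    a (0 < a < 1): a first-order lowpass followed by repeated all-pass
    sections. The functions are orthonormal and decay exponentially,
    which is what lets a few of them represent smooth kernels compactly."""
    basis = np.zeros((order, length))
    # stage 0: H0(z) = sqrt(1 - a^2) / (1 - a z^-1), impulse response
    n = np.arange(length)
    basis[0] = np.sqrt(1 - a * a) * a ** n
    # later stages: all-pass section A(z) = (z^-1 - a) / (1 - a z^-1)
    for j in range(1, order):
        x = basis[j - 1]
        y = np.zeros(length)
        prev_x = 0.0
        for i in range(length):
            y[i] = a * (y[i - 1] if i else 0.0) - a * x[i] + prev_x
            prev_x = x[i]
        basis[j] = y
    return basis
```

Projecting a measured kernel onto a handful of these functions gives the compact representation the abstract refers to.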

  1. Accurate estimation of sea surface temperatures using dissolution-corrected calibrations for Mg/Ca paleothermometry

    NASA Astrophysics Data System (ADS)

    Rosenthal, Yair; Lohmann, George P.

    2002-09-01

    Paired δ18O and Mg/Ca measurements on the same foraminiferal shells offer the ability to independently estimate sea surface temperature (SST) changes and assess their temporal relationship to the growth and decay of continental ice sheets. The accuracy of this method is confounded, however, by the absence of a quantitative method to correct Mg/Ca records for alteration by dissolution. Here we describe dissolution-corrected calibrations for Mg/Ca paleothermometry in which the pre-exponent constant is a function of size-normalized shell weight: (1) for G. ruber (212-300 μm), (Mg/Ca)ruber = (0.025 wt + 0.11) e^(0.095T), and (2) for G. sacculifer (355-425 μm), (Mg/Ca)sacc = (0.0032 wt + 0.181) e^(0.095T). The new calibrations improve the accuracy of SST estimates and are globally applicable. With this correction, eastern equatorial Atlantic SST during the Last Glacial Maximum is estimated to be 2.9° ± 0.4°C colder than today.
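Given those calibrations, recovering SST from a measured Mg/Ca ratio and shell weight is a one-line inversion. A sketch with the constants taken directly from the abstract (the species keys and unit conventions are our own labels):

```python
import math

# Dissolution-corrected calibrations from the abstract:
# (Mg/Ca) = (A * wt + B) * exp(0.095 * T), with species-specific A, B.
CALIB = {"ruber": (0.025, 0.11), "sacculifer": (0.0032, 0.181)}

def sst_from_mgca(mgca, shell_wt, species="ruber"):
    """Invert the calibration: SST (deg C) from Mg/Ca and the
    size-normalized shell weight that corrects for dissolution."""
    a, b = CALIB[species]
    return math.log(mgca / (a * shell_wt + b)) / 0.095
```

The same call with a lower shell weight (more dissolution) yields a higher inferred SST for the same Mg/Ca, which is the bias the correction removes.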

  2. How to Estimate Epidemic Risk from Incomplete Contact Diaries Data?

    PubMed

    Mastrandrea, Rossana; Barrat, Alain

    2016-06-01

    Social interactions shape the patterns of spreading processes in a population. Techniques such as diaries or proximity sensors make it possible to collect data about encounters and to build networks of contacts between individuals. The contact networks obtained from these different techniques are, however, quantitatively different. Here, we first show how these discrepancies affect the prediction of the epidemic risk when these data are fed to numerical models of epidemic spread: the low participation rate, under-reporting of contacts and overestimation of contact durations in contact diaries with respect to sensor data lead to important differences in the outcomes of the corresponding simulations, with for instance an enhanced sensitivity to initial conditions. Most importantly, we investigate if and how information gathered from contact diaries can be used in such simulations in order to yield an accurate description of the epidemic risk, assuming that data from sensors represent the ground truth. The contact networks built from contact sensors and diaries indeed present several structural similarities: this suggests the possibility of constructing, using only the contact diary network information, a surrogate contact network such that simulations using this surrogate network give the same estimation of the epidemic risk as simulations using the contact sensor network. We present and compare several methods to build such surrogate data, and show that it is indeed possible to obtain a good agreement between the outcomes of simulations using surrogate and sensor data, as long as the contact diary information is complemented by publicly available data describing the heterogeneity of the durations of human contacts. PMID:27341027

  3. How to Estimate Epidemic Risk from Incomplete Contact Diaries Data?

    PubMed Central

    Mastrandrea, Rossana; Barrat, Alain

    2016-01-01

    Social interactions shape the patterns of spreading processes in a population. Techniques such as diaries or proximity sensors make it possible to collect data about encounters and to build networks of contacts between individuals. The contact networks obtained from these different techniques are, however, quantitatively different. Here, we first show how these discrepancies affect the prediction of the epidemic risk when these data are fed to numerical models of epidemic spread: the low participation rate, under-reporting of contacts and overestimation of contact durations in contact diaries with respect to sensor data lead to important differences in the outcomes of the corresponding simulations, with for instance an enhanced sensitivity to initial conditions. Most importantly, we investigate if and how information gathered from contact diaries can be used in such simulations in order to yield an accurate description of the epidemic risk, assuming that data from sensors represent the ground truth. The contact networks built from contact sensors and diaries indeed present several structural similarities: this suggests the possibility of constructing, using only the contact diary network information, a surrogate contact network such that simulations using this surrogate network give the same estimation of the epidemic risk as simulations using the contact sensor network. We present and compare several methods to build such surrogate data, and show that it is indeed possible to obtain a good agreement between the outcomes of simulations using surrogate and sensor data, as long as the contact diary information is complemented by publicly available data describing the heterogeneity of the durations of human contacts. PMID:27341027

  4. Moving towards a new paradigm for global flood risk estimation

    NASA Astrophysics Data System (ADS)

    Troy, Tara J.; Devineni, Naresh; Lima, Carlos; Lall, Upmanu

    2013-04-01

    model is implemented at a finer resolution (<=1km) in order to more accurately model streamflow under flood conditions and estimate inundation. This approach allows for efficient computational simulation of the hydrology when not under potential for flooding with high-resolution flood wave modeling when there is flooding potential. We demonstrate the results of this flood risk estimation system for the Ohio River basin in the United States, a large river basin that is historically prone to flooding, with the intention of using it to do global flood risk assessment.

  5. Preoperative Evaluation: Estimation of Pulmonary Risk.

    PubMed

    Lakshminarasimhachar, Anand; Smetana, Gerald W

    2016-03-01

    Postoperative pulmonary complications (PPCs) are common after major non-thoracic surgery and associated with significant morbidity and high cost of care. A number of risk factors are strong predictors of PPCs. The overall goal of the preoperative pulmonary evaluation is to identify these potential, patient and procedure-related risks and optimize the health of the patients before surgery. A thorough clinical examination supported by appropriate laboratory tests will help guide the clinician to provide optimal perioperative care. PMID:26927740

  6. Highly Accurate Estimation of Axial and Bending Stiffnesses of Plates Clamped by Bolts

    NASA Astrophysics Data System (ADS)

    Naruse, Tomohiro; Shibutani, Yoji

    Equivalent stiffness of clamped plates must be prescribed not only to evaluate the strength of bolted joints by the "joint diagram" scheme but also to carry out structural analyses of practical structures with many bolted joints. We estimated the axial and bending stiffnesses of clamped plates using Finite Element (FE) analyses, taking into account the contact conditions on the bearing surfaces and between the plates. FE models were constructed for bolted joints tightened with M8, 10, 12 and 16 bolts and plate thicknesses of 3.2, 4.5, 6.0 and 9.0 mm, and the axial and bending compliances were precisely evaluated. These compliances of clamped plates were compared with those from the VDI 2230 (2003) code, which assumes an equivalent conical compressive stress field in the plate. The code gives axial stiffness larger by 11% and bending stiffness larger by 22%, and it cannot be applied to clamped plates of different thicknesses; it therefore yields lower bolt stress (an unsafe estimation). We modified the vertical angle tangent, tanφ, of the equivalent cone by adding a term in the logarithm of the thickness ratio t1/t2 and fitting to the analysis results. The modified tanφ estimates the axial compliance with an error from -1.5% to 6.8% and the bending compliance with an error from -6.5% to 10%. Furthermore, the modified tanφ can take the thickness difference into consideration.

  7. Accurate estimation of airborne ultrasonic time-of-flight for overlapping echoes.

    PubMed

    Sarabia, Esther G; Llata, Jose R; Robla, Sandra; Torre-Ferrero, Carlos; Oria, Juan P

    2013-01-01

    In this work, an analysis of the transmission of ultrasonic signals generated by piezoelectric sensors for air applications is presented. Based on this analysis, an ultrasonic response model is obtained for its application to the recognition of objects and structured environments for navigation by autonomous mobile robots. This model enables the analysis of the ultrasonic response that is generated using a pair of sensors in transmitter-receiver configuration using the pulse-echo technique. This is very interesting for recognizing surfaces that simultaneously generate a multiple echo response. The model takes into account the effect of the radiation pattern, the resonant frequency of the sensor, the number of cycles of the excitation pulse, the dynamics of the sensor and the attenuation with distance in the medium. The model has been developed, programmed and verified through a battery of experimental tests. Using this model, a new procedure for obtaining accurate times of flight is proposed. This new method is compared with traditional ones, such as threshold or correlation, to highlight its advantages and drawbacks. Finally, the advantages of this method are demonstrated for calculating multiple times of flight when the echo is formed by several overlapping echoes. PMID:24284774
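The correlation baseline the paper compares against can be sketched in a few lines: cross-correlate the received signal with the emitted pulse template and take the best-aligned lag. The pulse shape and sampling rate below are arbitrary illustrations:

```python
import numpy as np

def tof_by_correlation(received, template, fs):
    """Estimate time of flight by cross-correlating the received signal
    with the emitted pulse template (one of the 'traditional' methods the
    paper benchmarks). Returns the delay in seconds at sample rate fs."""
    corr = np.correlate(received, template, mode="full")
    lag = int(np.argmax(corr)) - (len(template) - 1)
    return max(lag, 0) / fs
```

With overlapping echoes the single global peak is exactly where this method breaks down, which motivates the model-based procedure of the paper.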

  8. Accurate Estimation of Airborne Ultrasonic Time-of-Flight for Overlapping Echoes

    PubMed Central

    Sarabia, Esther G.; Llata, Jose R.; Robla, Sandra; Torre-Ferrero, Carlos; Oria, Juan P.

    2013-01-01

    In this work, an analysis of the transmission of ultrasonic signals generated by piezoelectric sensors for air applications is presented. Based on this analysis, an ultrasonic response model is obtained for its application to the recognition of objects and structured environments for navigation by autonomous mobile robots. This model enables the analysis of the ultrasonic response that is generated using a pair of sensors in transmitter-receiver configuration using the pulse-echo technique. This is very interesting for recognizing surfaces that simultaneously generate a multiple echo response. The model takes into account the effect of the radiation pattern, the resonant frequency of the sensor, the number of cycles of the excitation pulse, the dynamics of the sensor and the attenuation with distance in the medium. The model has been developed, programmed and verified through a battery of experimental tests. Using this model, a new procedure for obtaining accurate times of flight is proposed. This new method is compared with traditional ones, such as threshold or correlation, to highlight its advantages and drawbacks. Finally, the advantages of this method are demonstrated for calculating multiple times of flight when the echo is formed by several overlapping echoes. PMID:24284774

  9. Voxel-based registration of simulated and real patient CBCT data for accurate dental implant pose estimation

    NASA Astrophysics Data System (ADS)

    Moreira, António H. J.; Queirós, Sandro; Morais, Pedro; Rodrigues, Nuno F.; Correia, André Ricardo; Fernandes, Valter; Pinho, A. C. M.; Fonseca, Jaime C.; Vilaça, João. L.

    2015-03-01

    The success of a dental implant-supported prosthesis is directly linked to the accuracy obtained during estimation of the implant's pose (position and orientation). Although traditional impression techniques and recent digital acquisition methods are acceptably accurate, a simultaneously fast, accurate and operator-independent methodology is still lacking. To this end, an image-based framework is proposed to estimate the patient-specific implant's pose using cone-beam computed tomography (CBCT) and prior knowledge of the implanted model. The pose estimation is accomplished in a three-step approach: (1) a region-of-interest is extracted from the CBCT data using 2 operator-defined points at the implant's main axis; (2) a simulated CBCT volume of the known implanted model is generated through Feldkamp-Davis-Kress reconstruction and coarsely aligned to the defined axis; and (3) a voxel-based rigid registration is performed to optimally align both patient and simulated CBCT data, extracting the implant's pose from the optimal transformation. Three experiments were performed to evaluate the framework: (1) an in silico study using 48 implants distributed through 12 three-dimensional synthetic mandibular models; (2) an in vitro study using an artificial mandible with 2 dental implants acquired with an i-CAT system; and (3) two clinical case studies. The results showed positional errors of 67±34 μm and 108 μm, and angular misfits of 0.15±0.08° and 1.4°, for experiments 1 and 2, respectively. Moreover, in experiment 3, visual assessment of the clinical data showed a coherent alignment of the reference implant. Overall, a novel image-based framework for implant pose estimation from CBCT data was proposed, showing accurate results in agreement with dental prosthesis modelling requirements.

  10. An Energy-Efficient Strategy for Accurate Distance Estimation in Wireless Sensor Networks

    PubMed Central

    Tarrío, Paula; Bernardos, Ana M.; Casar, José R.

    2012-01-01

    In line with recent research efforts made to conceive energy saving protocols and algorithms and power sensitive network architectures, in this paper we propose a transmission strategy to minimize the energy consumption in a sensor network when using a localization technique based on the measurement of the strength (RSS) or the time of arrival (TOA) of the received signal. In particular, we find the transmission power and the packet transmission rate that jointly minimize the total consumed energy, while ensuring at the same time a desired accuracy in the RSS or TOA measurements. We also propose some corrections to these theoretical results to take into account the effects of shadowing and packet loss in the propagation channel. The proposed strategy is shown to be effective in realistic scenarios providing energy savings with respect to other transmission strategies, and also guaranteeing a given accuracy in the distance estimations, which will serve to guarantee a desired accuracy in the localization result. PMID:23202218
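The RSS-to-distance step that such localization rests on is typically the log-distance path-loss model. A minimal sketch with illustrative parameter values (the paper's actual channel parameters are not reproduced here):

```python
import math

def distance_from_rss(rss_dbm, p_tx_dbm, pl0_db=40.0, n=2.0, d0=1.0):
    """Distance estimate from received signal strength under the
    log-distance path-loss model (default parameters are illustrative):
        RSS = Ptx - PL0 - 10 * n * log10(d / d0)
    pl0_db is the loss at reference distance d0; n is the path-loss
    exponent, which shadowing in real channels perturbs."""
    path_loss = p_tx_dbm - rss_dbm
    return d0 * 10 ** ((path_loss - pl0_db) / (10.0 * n))
```

Raising the transmission power shifts every RSS reading up by the same amount, so the energy/accuracy trade-off studied in the paper acts through the measurement variance rather than through this deterministic inversion.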

  11. Parametric Estimation in a Recurrent Competing Risks Model

    PubMed Central

    Peña, Edsel A.

    2014-01-01

    A resource-efficient approach to making inferences about the distributional properties of the failure times in a competing risks setting is presented. Efficiency is gained by observing recurrences of the competing risks over a random monitoring period. The resulting model is called the recurrent competing risks model (RCRM) and is coupled with two repair strategies whenever the system fails. Maximum likelihood estimators of the parameters of the marginal distribution functions associated with each of the competing risks and also of the system lifetime distribution function are presented. Estimators are derived under perfect and partial repair strategies. Consistency and asymptotic properties of the estimators are obtained. The estimation methods are applied to a data set of failures for cars under warranty. Simulation studies are used to ascertain the small sample properties and the efficiency gains of the resulting estimators. PMID:25346751
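As a toy special case of this setting, with exponential marginals and perfect repair the maximum likelihood estimators reduce to count-over-exposure ratios. The exponential assumption is ours, for illustration; the paper treats general parametric marginals:

```python
def exp_competing_risks_mle(failure_causes, total_time):
    """MLE of exponential hazard rates in a competing-risks setting with
    perfect repair over a monitoring period: each cause's rate is its
    failure count divided by the total observed operating time.
    `failure_causes` lists the cause label of each observed failure."""
    counts = {}
    for cause in failure_causes:
        counts[cause] = counts.get(cause, 0) + 1
    return {c: n / total_time for c, n in counts.items()}
```

Observing recurrences rather than a single failure per unit is what makes the approach resource-efficient: every repair restarts the system and contributes more exposure time to the denominator.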

  12. Estimated soil ingestion rates for use in risk assessment

    SciTech Connect

    LaGoy, P.K.

    1987-09-01

    Assessing the risks to human health posed by contaminants present in soil requires an estimate of likely soil ingestion rates. In the past, direct measurements of soil ingestion were not available and risk assessors were forced to estimate soil ingestion rates based on observations of mouthing behavior and measurements of soil on hands. Recently, empirical data on soil ingestion rates have become available from two sources. Although preliminary, these data can be used to derive better estimates of soil ingestion rates for use in risk assessments. Estimates of average soil ingestion rates derived in this paper range from 25 to 100 mg/day, depending on the age of the individual at risk. Maximum soil ingestion rates that are unlikely to underestimate exposure range from 100 to 500 mg. A value of 5000 mg/day is considered a reasonable estimate of a maximum single-day exposure for a child with habitual pica. 12 references.
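Ingestion rates such as the 25-100 mg/day range above enter a risk assessment through a simple dose calculation. A hedged sketch (the concentration and body-weight figures used in the example are invented for illustration):

```python
def average_daily_dose(conc_mg_per_kg, ingestion_mg_per_day, body_wt_kg):
    """Average daily dose (mg contaminant per kg body weight per day)
    from incidental soil ingestion: contaminant concentration in soil
    times soil ingested per day, normalized by body weight."""
    soil_kg_per_day = ingestion_mg_per_day * 1e-6  # mg soil -> kg soil
    return conc_mg_per_kg * soil_kg_per_day / body_wt_kg
```

For example, a hypothetical 15 kg child ingesting 100 mg/day of soil contaminated at 400 mg/kg receives about 2.7e-3 mg/kg-day, which would then be compared against a reference dose.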

  13. [Research on maize multispectral image accurate segmentation and chlorophyll index estimation].

    PubMed

    Wu, Qian; Sun, Hong; Li, Min-zan; Song, Yuan-yuan; Zhang, Yan-e

    2015-01-01

    In order to rapidly acquire maize growth information in the field, a non-destructive method of measuring the maize chlorophyll content index was developed based on multi-spectral imaging and image processing technology. The experiment was conducted at Yangling in Shaanxi province, China, and the crop was Zheng-dan 958 planted in an experimental field of about 1 000 m × 600 m. Firstly, a 2-CCD multi-spectral image monitoring system was used to acquire the canopy images. The system was based on a dichroic prism, allowing precise separation of the visible (blue (B), green (G), red (R): 400-700 nm) and near-infrared (NIR, 760-1 000 nm) bands. The multispectral images were output as RGB and NIR images by the system, which was fixed vertically to the ground at a distance of 2 m with an angular field of 50°. The SPAD index of each sample was measured synchronously as the chlorophyll content index. Secondly, after image smoothing using an adaptive smoothing filter, the NIR maize image was selected for segmenting the maize leaves from the background, because the gray histogram showed a large difference between plant and soil background. The NIR image segmentation followed preliminary and accurate segmentation steps: (1) The OTSU image segmentation method and a variable-threshold algorithm were compared, and the latter proved better for corn plant and weed segmentation. As a result, the variable-threshold algorithm based on local statistics was selected for the preliminary image segmentation, and dilation and erosion were used to optimize the segmented image. (2) A region-labeling algorithm was used to segment corn plants from the soil and weed background with an accuracy of 95.59%. The multi-spectral image of the maize canopy was then accurately segmented in the R, G and B bands separately. Thirdly, image parameters were extracted from the segmented visible and NIR images. The average gray
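The OTSU method named in step (1) is the textbook global-threshold baseline that the locally adaptive variable-threshold algorithm improves on. A compact sketch:

```python
import numpy as np

def otsu_threshold(gray):
    """Otsu's method: pick the gray level that maximizes the
    between-class variance of the histogram, splitting the image into
    background and foreground classes with a single global threshold."""
    hist, _ = np.histogram(gray, bins=256, range=(0, 256))
    p = hist / hist.sum()
    omega = np.cumsum(p)                   # class-0 probability
    mu = np.cumsum(p * np.arange(256))     # cumulative mean gray level
    mu_t = mu[-1]                          # global mean
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b = (mu_t * omega - mu) ** 2 / (omega * (1 - omega))
    sigma_b = np.nan_to_num(sigma_b)       # empty classes score zero
    return int(np.argmax(sigma_b))
```

A global threshold like this struggles under uneven field illumination, which is why the paper moves to a variable threshold based on local statistics.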

  14. Estimating Risk: Stereotype Amplification and the Perceived Risk of Criminal Victimization

    ERIC Educational Resources Information Center

    Quillian, Lincoln; Pager, Devah

    2010-01-01

    This paper considers the process by which individuals estimate the risk of adverse events, with particular attention to the social context in which risk estimates are formed. We compare subjective probability estimates of crime victimization to actual victimization experiences among respondents from the 1994 to 2002 waves of the Survey of Economic…

  15. The challenges of accurately estimating time of long bone injury in children.

    PubMed

    Pickett, Tracy A

    2015-07-01

    The ability to determine the time an injury occurred can be of crucial significance in forensic medicine and holds special relevance to the investigation of child abuse. However, dating paediatric long bone injury, including fractures, is nuanced by complexities specific to the paediatric population. These challenges include the ability to identify bone injury in a growing or only partially-calcified skeleton, different injury patterns seen within the spectrum of the paediatric population, the effects of bone growth on healing as a separate entity from injury, differential healing rates seen at different ages, and the relative scarcity of information regarding healing rates in children, especially the very young. The challenges posed by these factors are compounded by a lack of consistency in defining and categorizing healing parameters. This paper sets out the primary limitations of existing knowledge regarding estimating timing of paediatric bone injury. Consideration and understanding of the multitude of factors affecting bone injury and healing in children will assist those providing opinion in the medical-legal forum. PMID:26048508

  16. Accurate estimation of normal incidence absorption coefficients with confidence intervals using a scanning laser Doppler vibrometer

    NASA Astrophysics Data System (ADS)

    Vuye, Cedric; Vanlanduit, Steve; Guillaume, Patrick

    2009-06-01

    When optical measurements of the sound field inside a glass tube, near the material under test, are used to estimate the reflection and absorption coefficients, confidence intervals can be determined alongside these acoustical parameters. The sound fields are visualized using a scanning laser Doppler vibrometer (SLDV). In this paper the influence of different test signals on the quality of the results obtained with this technique is examined. The amount of data gathered during one measurement scan makes a thorough statistical analysis possible, leading to knowledge of confidence intervals. A multi-sine constructed on the resonance frequencies of the test tube proves to be a very good alternative to the traditional periodic chirp. This signal offers the ability to obtain data for multiple frequencies in one measurement, without the danger of a low signal-to-noise ratio. The variability analysis in this paper clearly shows the advantages of the proposed multi-sine compared to the periodic chirp. The measurement procedure and the statistical analysis are validated by measuring the reflection ratio at a closed end and comparing the results with the theoretical value. Results of tests on two building materials (an acoustic ceiling tile and linoleum) are presented and compared to supplier data.

  17. Accurate Estimation of Protein Folding and Unfolding Times: Beyond Markov State Models.

    PubMed

    Suárez, Ernesto; Adelman, Joshua L; Zuckerman, Daniel M

    2016-08-01

    Because standard molecular dynamics (MD) simulations are unable to access time scales of interest in complex biomolecular systems, it is common to "stitch together" information from multiple shorter trajectories using approximate Markov state model (MSM) analysis. However, MSMs may require significant tuning and can yield biased results. Here, by analyzing some of the longest protein MD data sets available (>100 μs per protein), we show that estimators constructed based on exact non-Markovian (NM) principles can yield significantly improved mean first-passage times (MFPTs) for protein folding and unfolding. In some cases, MSM bias of more than an order of magnitude can be corrected when identical trajectory data are reanalyzed by non-Markovian approaches. The NM analysis includes "history" information, higher order time correlations compared to MSMs, that is available in every MD trajectory. The NM strategy is insensitive to fine details of the states used and works well when a fine time-discretization (i.e., small "lag time") is used. PMID:27340835
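A direct, history-preserving MFPT estimate of the kind the non-Markovian analysis builds on can be computed straight from a labeled trajectory, with no Markov assumption on the state labels. A simplified sketch (two discrete labels stand in for the folded/unfolded assignments; real analyses use many states and careful state definitions):

```python
import numpy as np

def direct_mfpt(traj, source, target, dt=1.0):
    """Direct mean first-passage time from `source` to `target` in a
    discrete trajectory of state labels: average the waiting times from
    each first entry into `source` until the next visit to `target`.
    Because whole passages are measured, no Markov assumption is made."""
    times, start = [], None
    for i, s in enumerate(traj):
        if s == source and start is None:
            start = i                       # passage clock starts
        elif s == target and start is not None:
            times.append((i - start) * dt)  # passage completed
            start = None
    return float(np.mean(times)) if times else float("nan")
```

An MSM instead estimates the MFPT from short-lag transition probabilities, which is where the order-of-magnitude bias discussed above can enter.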

  18. Thermal Conductivities in Solids from First Principles: Accurate Computations and Rapid Estimates

    NASA Astrophysics Data System (ADS)

    Carbogno, Christian; Scheffler, Matthias

    In spite of significant research efforts, a first-principles determination of the thermal conductivity κ at high temperatures has remained elusive. Boltzmann transport techniques that account for anharmonicity perturbatively become inaccurate under such conditions. Ab initio molecular dynamics (MD) techniques using the Green-Kubo (GK) formalism capture the full anharmonicity, but can become prohibitively costly to converge in time and size. We developed a formalism that accelerates such GK simulations by several orders of magnitude and thus enables their application within the limited time and length scales accessible in ab initio MD. For this purpose, we determine the effective harmonic potential occurring during the MD, along with the associated temperature-dependent phonon properties and lifetimes. Interpolation in reciprocal and frequency space then allows extrapolation to the macroscopic scale. For both force-field and ab initio MD, we validate this approach by computing κ for Si and ZrO2, two materials known for their particularly harmonic and anharmonic character, respectively. Eventually, we demonstrate how these techniques facilitate reasonable estimates of κ from existing MD calculations at virtually no additional computational cost.
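
    The Green-Kubo relation underlying this approach can be sketched numerically: κ is proportional to the time integral of the heat-flux autocorrelation function. The routine below is a toy single-component version; the variable names, the truncation at half the series length, and the hand-rolled trapezoidal integration are my assumptions, not the paper's implementation.

```python
import numpy as np

def green_kubo_kappa(J, dt, volume, temperature, kB=1.380649e-23):
    """Green-Kubo thermal conductivity (one Cartesian flux component):
        kappa = V / (kB * T**2) * integral_0^inf <J(0) J(t)> dt
    The autocorrelation is averaged over time origins, truncated at half
    the series length, and integrated with the trapezoidal rule."""
    J = np.asarray(J, dtype=float)
    n = len(J)
    acf = np.array([np.mean(J[: n - lag] * J[lag:]) for lag in range(n // 2)])
    integral = dt * (acf.sum() - 0.5 * acf[0] - 0.5 * acf[-1])
    return volume / (kB * temperature**2) * integral
```

    In practice the slow convergence of this integral with simulation time and cell size is exactly what the abstract's accelerated formalism addresses.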

  19. ProViDE: A software tool for accurate estimation of viral diversity in metagenomic samples

    PubMed Central

    Ghosh, Tarini Shankar; Mohammed, Monzoorul Haque; Komanduri, Dinakar; Mande, Sharmila Shekhar

    2011-01-01

    Given the absence of universal marker genes in the viral kingdom, researchers typically use BLAST (with stringent E-values) for taxonomic classification of viral metagenomic sequences. Since the majority of metagenomic sequences originate from hitherto unknown viral groups, using stringent E-values results in most sequences remaining unclassified. Conversely, using less stringent E-values results in a high number of incorrect taxonomic assignments. The SOrt-ITEMS algorithm provides an approach to address the above issues. Based on alignment parameters, SOrt-ITEMS follows an elaborate work-flow for assigning reads originating from hitherto unknown archaeal/bacterial genomes. In SOrt-ITEMS, alignment parameter thresholds were generated by observing patterns of sequence divergence within and across various taxonomic groups belonging to the bacterial and archaeal kingdoms. However, many taxonomic groups within the viral kingdom lack a typical Linnean-like taxonomic hierarchy. In this paper, we present ProViDE (Program for Viral Diversity Estimation), an algorithm that uses a customized set of alignment parameter thresholds, specifically suited for viral metagenomic sequences. These thresholds capture the pattern of sequence divergence and the non-uniform taxonomic hierarchy observed within/across various taxonomic groups of the viral kingdom. Validation results indicate that the percentage of ‘correct’ assignments by ProViDE is around 1.7 to 3 times higher than that of the widely used similarity-based method MEGAN. The misclassification rate of ProViDE is around 3 to 19% (as compared to 5 to 42% for MEGAN), indicating significantly better assignment accuracy. The ProViDE software and a supplementary file (containing the supplementary figures and tables referred to in this article) are available for download from http://metagenomics.atc.tcs.com/binning/ProViDE/ PMID:21544173

  20. A new method based on the subpixel Gaussian model for accurate estimation of asteroid coordinates

    NASA Astrophysics Data System (ADS)

    Savanevych, V. E.; Briukhovetskyi, O. B.; Sokovikova, N. S.; Bezkrovny, M. M.; Vavilova, I. B.; Ivashchenko, Yu. M.; Elenin, L. V.; Khlamov, S. V.; Movsesian, Ia. S.; Dashkova, A. M.; Pogorelov, A. V.

    2015-08-01

    We describe a new iterative method to estimate asteroid coordinates, based on a subpixel Gaussian model of the discrete object image. The method operates with continuous parameters (asteroid coordinates) in a discrete observational space (the set of pixel potentials) of the CCD frame. In this model, the coordinate distribution of the photons hitting a pixel of the CCD frame is known a priori, while the associated parameters are determined from a real digital object image. The method, which is flexible enough to adapt to any form of object image, achieves high measurement accuracy with low computational complexity because a maximum-likelihood procedure is used to obtain the best fit, instead of a least-squares method with the Levenberg-Marquardt algorithm for minimizing the quadratic form. Since 2010, the method has been tested as the basis of our Collection Light Technology (COLITEC) software, which has been installed at several observatories across the world with the aim of the automatic discovery of asteroids and comets in sets of CCD frames. As a result, four comets (C/2010 X1 (Elenin), P/2011 NO1 (Elenin), C/2012 S1 (ISON) and P/2013 V3 (Nevski)) as well as more than 1500 small Solar system bodies (including five near-Earth objects (NEOs), 21 Trojan asteroids of Jupiter and one Centaur object) have been discovered. We discuss these results, which allowed us to compare the accuracy parameters of the new method and confirm its efficiency. In 2014, the COLITEC software was recommended to all members of the Gaia-FUN-SSO network for analysing observations as a tool to detect faint moving objects in frames.
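
    A toy version of a maximum-likelihood subpixel fit (as opposed to least squares) can be written as a brute-force search over candidate centroids under a Poisson photon model. This is only a sketch of the general idea; the fixed PSF width, known total flux, and grid search are my simplifications, not the paper's algorithm.

```python
import numpy as np

def gaussian_psf(shape, x0, y0, sigma):
    """Normalized circular Gaussian PSF sampled on a pixel grid."""
    y, x = np.indices(shape)
    g = np.exp(-((x - x0) ** 2 + (y - y0) ** 2) / (2 * sigma**2))
    return g / g.sum()

def ml_centroid(image, sigma, grid_step=0.05):
    """Maximum-likelihood subpixel centroid under a Poisson photon model:
    search (x0, y0) on a subpixel grid around the brightest pixel,
    taking the total flux as the image sum."""
    flux = image.sum()
    cy, cx = np.unravel_index(np.argmax(image), image.shape)
    best, best_ll = (float(cx), float(cy)), -np.inf
    offsets = np.arange(-1.0, 1.0 + 1e-9, grid_step)
    for dy in offsets:
        for dx in offsets:
            mu = flux * gaussian_psf(image.shape, cx + dx, cy + dy, sigma) + 1e-12
            ll = np.sum(image * np.log(mu) - mu)   # Poisson log-likelihood
            if ll > best_ll:
                best_ll, best = ll, (cx + dx, cy + dy)
    return best
```

    On a noise-free synthetic image the search recovers the true subpixel position to within the grid step.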

  1. Estimating Risk: Stereotype Amplification and the Perceived Risk of Criminal Victimization.

    PubMed

    Quillian, Lincoln; Pager, Devah

    2010-03-01

    This paper considers the process by which individuals estimate the risk of adverse events, with particular attention to the social context in which risk estimates are formed. We compare subjective probability estimates of crime victimization to actual victimization experiences among respondents from the 1994 to 2002 waves of the Survey of Economic Expectations (Dominitz and Manski 2002). Using zip code identifiers, we then match these survey data to local area characteristics from the census. The results show that: (1) the risk of criminal victimization is significantly overestimated relative to actual rates of victimization or other negative events; (2) neighborhood racial composition is strongly associated with perceived risk of victimization, whereas actual victimization risk is driven by nonracial neighborhood characteristics; and (3) white respondents appear more strongly affected by racial composition than nonwhites in forming their estimates of risk. We argue these results support a model of stereotype amplification in the formation of risk estimates. Implications for persistent racial inequality are considered. PMID:20686631

  2. Estimating Risk: Stereotype Amplification and the Perceived Risk of Criminal Victimization

    PubMed Central

    QUILLIAN, LINCOLN; PAGER, DEVAH

    2010-01-01

    This paper considers the process by which individuals estimate the risk of adverse events, with particular attention to the social context in which risk estimates are formed. We compare subjective probability estimates of crime victimization to actual victimization experiences among respondents from the 1994 to 2002 waves of the Survey of Economic Expectations (Dominitz and Manski 2002). Using zip code identifiers, we then match these survey data to local area characteristics from the census. The results show that: (1) the risk of criminal victimization is significantly overestimated relative to actual rates of victimization or other negative events; (2) neighborhood racial composition is strongly associated with perceived risk of victimization, whereas actual victimization risk is driven by nonracial neighborhood characteristics; and (3) white respondents appear more strongly affected by racial composition than nonwhites in forming their estimates of risk. We argue these results support a model of stereotype amplification in the formation of risk estimates. Implications for persistent racial inequality are considered. PMID:20686631

  3. Figure of merit of diamond power devices based on accurately estimated impact ionization processes

    NASA Astrophysics Data System (ADS)

    Hiraiwa, Atsushi; Kawarada, Hiroshi

    2013-07-01

    Although a high breakdown voltage or field is considered a major advantage of diamond, there has been a large spread in the breakdown voltages or fields of diamond devices reported in the literature. Most of these apparently contradictory results did not correctly reflect material properties because of specific device designs, such as punch-through structures and insufficient edge termination. Once these data were removed, the remaining few results, including a record-high breakdown field of 20 MV/cm, were theoretically reproduced by exactly calculating ionization integrals based on ionization coefficients obtained after compensating for possible errors in reported theoretical values. In this compensation, we newly developed a method for extracting an ionization coefficient from an arbitrary relationship between breakdown voltage and doping density in the Chynoweth framework. The breakdown field of diamond was estimated to depend on doping density more strongly than in other materials, and accordingly must be compared at the same doping density. The figure of merit (FOM) of diamond devices, obtained using these breakdown data, was comparable to the FOMs of 4H-SiC and wurtzite GaN devices at room temperature, but was projected to be larger than the latter by more than one order of magnitude at higher temperatures of about 300 °C. Considering the relatively undeveloped state of diamond technology, there is room for further enhancement of the diamond FOM by improving breakdown voltage and mobility. Through these investigations, junction breakdown was found to be initiated by electrons or holes in a p⁻-type or n⁻-type drift layer, respectively. The breakdown voltages in the two types of drift layers differed from each other in a strict sense but were practically the same. Hence, we do not need to care about the conduction type of drift layers, but should rather exactly calculate the ionization integral without approximating ionization coefficients by a power

  4. Wind effect on PV module temperature: Analysis of different techniques for an accurate estimation.

    NASA Astrophysics Data System (ADS)

    Schwingshackl, Clemens; Petitta, Marcello; Ernst Wagner, Jochen; Belluardo, Giorgio; Moser, David; Castelli, Mariapina; Zebisch, Marc; Tetzlaff, Anke

    2013-04-01

    temperature estimation using meteorological parameters. References:
    [1] Skoplaki, E. et al., 2008: A simple correlation for the operating temperature of photovoltaic modules of arbitrary mounting, Solar Energy Materials & Solar Cells 92, 1393-1402
    [2] Skoplaki, E. et al., 2008: Operating temperature of photovoltaic modules: A survey of pertinent correlations, Renewable Energy 34, 23-29
    [3] Koehl, M. et al., 2011: Modeling of the nominal operating cell temperature based on outdoor weathering, Solar Energy Materials & Solar Cells 95, 1638-1646
    [4] Mattei, M. et al., 2005: Calculation of the polycrystalline PV module temperature using a simple method of energy balance, Renewable Energy 31, 553-567
    [5] Kurtz, S. et al.: Evaluation of high-temperature exposure of rack-mounted photovoltaic modules

  5. [Epidemiological data and radiation risk estimates].

    PubMed

    Cardis, E

    2002-01-01

    The results of several major epidemiology studies on populations with particular exposure to ionizing radiation should become available during the first years of the 21st century. These studies are expected to provide answers to a number of questions concerning public health and radiation protection. Most of the populations concerned were accidentally exposed to radiation, in the former USSR or elsewhere, or were exposed in a nuclear industrial context. The results will complete and test information on risk coming from studies among survivors of the Hiroshima and Nagasaki atomic bombs, particularly concerning the effects of low-dose and prolonged low-dose exposure, of different types of radiation, and of environmental and host-related factors which could modify the risk of radiation-induced effects. These studies are thus important for assessing the currently accepted scientific evidence on radiation protection for workers and the general population. In addition, supplementary information on radiation protection could be provided by formal comparisons and analyses combining data from populations with different types of exposure. Finally, in order to provide pertinent information for public health and radiation protection, future epidemiology studies should be targeted and designed to answer specific questions concerning, for example, the risk for specific populations (children, patients, people with genetic predisposition). An integrated approach, combining epidemiology and studies on the mechanisms of radiation induction, should provide particularly pertinent information. PMID:11938114

  6. Toward an Accurate and Inexpensive Estimation of CCSD(T)/CBS Binding Energies of Large Water Clusters.

    PubMed

    Sahu, Nityananda; Singh, Gurmeet; Nandi, Apurba; Gadre, Shridhar R

    2016-07-21

    Owing to the steep scaling behavior, highly accurate CCSD(T) calculations, the contemporary gold standard of quantum chemistry, are prohibitively difficult for moderate- and large-sized water clusters even with high-end hardware. The molecular tailoring approach (MTA), a fragmentation-based technique, is found to be useful for enabling such high-level ab initio calculations. The present work reports the CCSD(T) level binding energies of many low-lying isomers of large (H2O)n (n = 16, 17, and 25) clusters employing aug-cc-pVDZ and aug-cc-pVTZ basis sets within the MTA framework. Accurate estimation of the CCSD(T) level binding energies [within 0.3 kcal/mol of the respective full calculation (FC) results] is achieved after effecting the grafting procedure, a protocol for minimizing the errors in the MTA-derived energies arising due to the approximate nature of MTA. The CCSD(T) level grafting procedure presented here hinges upon the well-known fact that the MP2 method, which scales as O(N^5), can be a suitable starting point for approximating the highly accurate CCSD(T) energies [which scale as O(N^7)]. On account of the requirement of only an MP2-level FC on the entire cluster, the current methodology ultimately leads to a cost-effective solution for CCSD(T) level accurate binding energies of large-sized water clusters, even at the complete basis set limit, utilizing off-the-shelf hardware. PMID:27351269
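
    The grafting idea, correcting the fragment-based CCSD(T) energy by the error MTA makes at the cheaper MP2 level, reduces to one line of arithmetic. The function and argument names below are mine; the formula is a sketch of the protocol as the abstract describes it, not code from the paper.

```python
def grafted_ccsdt(e_mta_ccsdt, e_mta_mp2, e_mp2_full):
    """MP2-based 'grafting' correction (sketch): correct the fragment-based
    CCSD(T) energy by the MTA error measured at the MP2 level,
        E ~ E_MTA[CCSD(T)] + (E_full[MP2] - E_MTA[MP2]).
    Only the MP2 calculation needs to be run on the entire cluster."""
    return e_mta_ccsdt + (e_mp2_full - e_mta_mp2)
```

    For example, if MTA underbinds by 0.2 hartree at the MP2 level, the same correction is assumed to apply at the CCSD(T) level.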

  7. ITER- International Toxicity Estimates for Risk, new TOXNET database.

    PubMed

    Tomasulo, Patricia

    2005-01-01

    ITER, the International Toxicity Estimates for Risk database, joined the TOXNET system in the winter of 2004. ITER features international comparisons of environmental health risk assessment information and contains over 620 chemical records. ITER includes data from the EPA, Health Canada, the National Institute of Public Health and the Environment of the Netherlands, and other organizations that provide risk values that have been peer-reviewed. PMID:15760833

  8. Sensitivity of health risk estimates to air quality adjustment procedure

    SciTech Connect

    Whitfield, R.G.

    1997-06-30

    This letter is a summary of risk results associated with exposure estimates using two-parameter Weibull and quadratic air quality adjustment procedures (AQAPs). New exposure estimates were developed for children and child-occurrences, six urban areas, and five alternative air quality scenarios. In all cases, the Weibull and quadratic results are compared to previous results, which are based on a proportional AQAP.

  9. Incentives Increase Participation in Mass Dog Rabies Vaccination Clinics and Methods of Coverage Estimation Are Assessed to Be Accurate

    PubMed Central

    Steinmetz, Melissa; Czupryna, Anna; Bigambo, Machunde; Mzimbiri, Imam; Powell, George; Gwakisa, Paul

    2015-01-01

    In this study we show that incentives (dog collars and owner wristbands) are effective at increasing owner participation in mass dog rabies vaccination clinics and we conclude that household questionnaire surveys and the mark-re-sight (transect survey) method for estimating post-vaccination coverage are accurate when all dogs, including puppies, are included. Incentives were distributed during central-point rabies vaccination clinics in northern Tanzania to quantify their effect on owner participation. In villages where incentives were handed out participation increased, with an average of 34 more dogs being vaccinated. Through economies of scale, this represents a reduction in the cost-per-dog of $0.47. This represents the price-threshold under which the cost of the incentive used must fall to be economically viable. Additionally, vaccination coverage levels were determined in ten villages through the gold-standard village-wide census technique, as well as through two cheaper and quicker methods (randomized household questionnaire and the transect survey). Cost data were also collected. Both non-gold standard methods were found to be accurate when puppies were included in the calculations, although the transect survey and the household questionnaire survey over- and under-estimated the coverage respectively. Given that additional demographic data can be collected through the household questionnaire survey, and that its estimate of coverage is more conservative, we recommend this method. Despite the use of incentives the average vaccination coverage was below the 70% threshold for eliminating rabies. We discuss the reasons and suggest solutions to improve coverage. Given recent international targets to eliminate rabies, this study provides valuable and timely data to help improve mass dog vaccination programs in Africa and elsewhere. PMID:26633821

  10. Incentives Increase Participation in Mass Dog Rabies Vaccination Clinics and Methods of Coverage Estimation Are Assessed to Be Accurate.

    PubMed

    Minyoo, Abel B; Steinmetz, Melissa; Czupryna, Anna; Bigambo, Machunde; Mzimbiri, Imam; Powell, George; Gwakisa, Paul; Lankester, Felix

    2015-12-01

    In this study we show that incentives (dog collars and owner wristbands) are effective at increasing owner participation in mass dog rabies vaccination clinics and we conclude that household questionnaire surveys and the mark-re-sight (transect survey) method for estimating post-vaccination coverage are accurate when all dogs, including puppies, are included. Incentives were distributed during central-point rabies vaccination clinics in northern Tanzania to quantify their effect on owner participation. In villages where incentives were handed out participation increased, with an average of 34 more dogs being vaccinated. Through economies of scale, this represents a reduction in the cost-per-dog of $0.47. This represents the price-threshold under which the cost of the incentive used must fall to be economically viable. Additionally, vaccination coverage levels were determined in ten villages through the gold-standard village-wide census technique, as well as through two cheaper and quicker methods (randomized household questionnaire and the transect survey). Cost data were also collected. Both non-gold standard methods were found to be accurate when puppies were included in the calculations, although the transect survey and the household questionnaire survey over- and under-estimated the coverage respectively. Given that additional demographic data can be collected through the household questionnaire survey, and that its estimate of coverage is more conservative, we recommend this method. Despite the use of incentives the average vaccination coverage was below the 70% threshold for eliminating rabies. We discuss the reasons and suggest solutions to improve coverage. Given recent international targets to eliminate rabies, this study provides valuable and timely data to help improve mass dog vaccination programs in Africa and elsewhere. PMID:26633821
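
    As a sketch of the transect (mark-re-sight) coverage estimate used in the two records above: coverage is the fraction of sighted dogs carrying the vaccination mark, with a binomial standard error. This simple estimator is my illustration, not the study's exact analysis.

```python
import math

def coverage_mark_resight(marked, total):
    """Vaccination coverage and its binomial standard error from a
    mark-re-sight transect count (marked = dogs sighted with collars)."""
    if total == 0:
        raise ValueError("no dogs sighted")
    p = marked / total
    se = math.sqrt(p * (1 - p) / total)
    return p, se
```

    With 60 collared dogs out of 100 sighted, the estimate is 0.60 with a standard error of about 0.05, which is why the studies stress including puppies: excluding them shifts both the numerator and the denominator.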

  11. Towards more accurate life cycle risk management through integration of DDP and PRA

    NASA Technical Reports Server (NTRS)

    Cornford, Steven L.; Paulos, Todd; Meshkat, Leila; Feather, Martin

    2003-01-01

    The focus of this paper is on the integration of PRA and DDP. The intent is twofold: to extend risk-based decision making through more of the lifecycle, and to lead to improved risk modeling (hence better informed decision making) wherever it is applied, most especially in the early phases as designs begin to mature.

  12. Estimation of Hypertension Risk from Lifestyle Factors and Health Profile: A Case Study

    PubMed Central

    2014-01-01

    Hypertension is a highly prevalent risk factor for cardiovascular disease and can also lead to other diseases which seriously harm human health. Screening the risks and finding a clinical model for estimating the risk of onset, maintenance, or prognosis of hypertension are of great importance to the prevention and treatment of the disease, especially if the indicator can be derived from a simple health profile. In this study, we investigate a chronic disease questionnaire data set of 6563 rural citizens in East China and identify a clinical signature that can assess the risk of hypertension easily and accurately. The signature achieves an accuracy of about 83% on the external test dataset, with an AUC of 0.91. Our study demonstrates that a combination of simple lifestyle features can sufficiently reflect the risk of hypertension onset. This finding provides potential guidance for disease prevention and control, as well as for the development of home care and home-care technologies. PMID:25019099
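
    The reported AUC of 0.91 can be read via the rank statistic: the probability that a randomly chosen case receives a higher risk score than a randomly chosen control. A minimal computation of that quantity (names mine, unrelated to the study's code):

```python
def auc(labels, scores):
    """Area under the ROC curve via the Mann-Whitney rank statistic:
    the fraction of (positive, negative) pairs ranked correctly,
    counting ties as half."""
    pos = [s for l, s in zip(labels, scores) if l == 1]
    neg = [s for l, s in zip(labels, scores) if l == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))
```

    This pairwise definition is equivalent to integrating the ROC curve and is convenient for a quick sanity check of a reported AUC.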

  13. [Application of spatial relative risk estimation in communicable disease risk evaluation].

    PubMed

    Zhang, Yewu; Guo, Qing; Wang, Xiaofeng; Yu, Meng; Su, Xuemei; Dong, Yan; Zhang, Chunxi

    2015-05-01

    This paper summarizes the application of an adaptive kernel density algorithm to the spatial relative risk estimation of communicable diseases, using the reported data on infectious diarrhea (other than cholera, dysentery, typhoid and paratyphoid) in Ludian county and the surrounding area in Yunnan province in 2013. Statistically significant fluctuations in the estimated risk function were identified through the use of asymptotic tolerance contours, and finally these data were visualized through disease mapping. The results of spatial relative risk estimation and disease mapping showed that high-risk areas were in southeastern Shaoyang, next to Ludian. Therefore, spatial relative risk estimation using an adaptive kernel density algorithm combined with disease mapping is a powerful method for identifying high-risk populations and areas. PMID:26080648
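
    The kernel-density relative-risk idea can be sketched as the log ratio of case and control density estimates over a spatial grid. Note the paper uses an adaptive bandwidth; the fixed Gaussian bandwidth below is a simplification, and all names are my own.

```python
import numpy as np

def log_relative_risk(cases, controls, grid, bandwidth):
    """Kernel-density log relative-risk surface (sketch, fixed Gaussian
    bandwidth rather than the adaptive kernel of the paper).
    cases, controls: (n, 2) arrays of point locations; grid: (m, 2)."""
    def kde(points, h):
        d2 = ((grid[:, None, :] - points[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2 * h * h)).sum(1) / (len(points) * 2 * np.pi * h * h)
    eps = 1e-12   # guard against log(0) far from all points
    return np.log((kde(cases, bandwidth) + eps) /
                  (kde(controls, bandwidth) + eps))
```

    Positive values flag locations where case density exceeds control density; tolerance contours (as in the paper) would then mark where the excess is statistically significant.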

  14. Accurate state estimation from uncertain data and models: an application of data assimilation to mathematical models of human brain tumors

    PubMed Central

    2011-01-01

    Background Data assimilation refers to methods for updating the state vector (initial condition) of a complex spatiotemporal model (such as a numerical weather model) by combining new observations with one or more prior forecasts. We consider the potential feasibility of this approach for making short-term (60-day) forecasts of the growth and spread of a malignant brain cancer (glioblastoma multiforme) in individual patient cases, where the observations are synthetic magnetic resonance images of a hypothetical tumor. Results We apply a modern state estimation algorithm (the Local Ensemble Transform Kalman Filter), previously developed for numerical weather prediction, to two different mathematical models of glioblastoma, taking into account likely errors in model parameters and measurement uncertainties in magnetic resonance imaging. The filter can accurately shadow the growth of a representative synthetic tumor for 360 days (six 60-day forecast/update cycles) in the presence of a moderate degree of systematic model error and measurement noise. Conclusions The mathematical methodology described here may prove useful for other modeling efforts in biology and oncology. An accurate forecast system for glioblastoma may prove useful in clinical settings for treatment planning and patient counseling. Reviewers This article was reviewed by Anthony Almudevar, Tomas Radivoyevitch, and Kristin Swanson (nominated by Georg Luebeck). PMID:22185645
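
    The study applies the Local Ensemble Transform Kalman Filter; as a simpler illustration of the same forecast/update cycle, here is a minimal stochastic ensemble Kalman filter analysis step. This is a related but distinct algorithm, and the dimensions and names are my assumptions.

```python
import numpy as np

def enkf_update(ensemble, H, y, R, rng):
    """One stochastic EnKF analysis step (sketch).
    ensemble: (n_ens, n_state) forecast members
    H: (n_obs, n_state) linear observation operator
    y: (n_obs,) observation; R: (n_obs, n_obs) obs-error covariance."""
    n_ens = ensemble.shape[0]
    X = ensemble - ensemble.mean(0)                 # anomalies
    P = X.T @ X / (n_ens - 1)                       # sample covariance
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)    # Kalman gain
    perturbed = y + rng.multivariate_normal(np.zeros(len(y)), R, n_ens)
    return ensemble + (perturbed - ensemble @ H.T) @ K.T
```

    After the update, the ensemble mean is pulled toward the observation in proportion to the forecast uncertainty, which is the mechanism that lets the filter "shadow" tumor growth across imaging cycles.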

  15. Reservoir evaluation of thin-bedded turbidites and hydrocarbon pore thickness estimation for an accurate quantification of resource

    NASA Astrophysics Data System (ADS)

    Omoniyi, Bayonle; Stow, Dorrik

    2016-04-01

    One of the major challenges in the assessment of and production from turbidite reservoirs is to take full account of thin- and medium-bedded turbidites (<10 cm and <30 cm, respectively). Although such thinner, low-pay sands may comprise a significant proportion of the reservoir succession, they can go unnoticed by conventional analysis and so negatively impact reserve estimation, particularly in fields producing from prolific thick-bedded turbidite reservoirs. Field development plans often take little note of such thin beds, which are therefore bypassed by mainstream production. In fact, the trapped and bypassed fluids can be vital where maximising field value and optimising production are key business drivers. We have studied in detail a succession of thin-bedded turbidites associated with thicker-bedded reservoir facies in the North Brae Field, UKCS, using a combination of conventional logs and cores to assess the significance of thin-bedded turbidites in computing hydrocarbon pore thickness (HPT). This quantity, being an indirect measure of thickness, is critical for an accurate estimation of original oil in place (OOIP). By using a combination of conventional and unconventional logging analysis techniques, we obtain three different results for the reservoir intervals studied: estimated net sand thickness, average sand thickness, and their distribution trend within a 3D structural grid. The net sand thickness varies from 205 to 380 ft, and HPT ranges from 21.53 to 39.90 ft. We observe that an integrated approach (neutron-density cross plots conditioned to cores) to HPT quantification reduces the associated uncertainties significantly, resulting in estimation of 96% of the actual HPT. Further work will focus on assessing the 3D dynamic connectivity of the low-pay sands with the surrounding thick-bedded turbidite facies.
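
    Hydrocarbon pore thickness can be sketched as a thickness-weighted sum of hydrocarbon-filled porosity over beds, which is why omitting thin beds biases the total low. The formula and inputs below are a generic petrophysical sketch under standard definitions, not the authors' workflow.

```python
def hydrocarbon_pore_thickness(beds):
    """HPT (sketch): sum over beds of thickness * porosity * (1 - Sw),
    i.e. net thickness weighted by the hydrocarbon-filled pore fraction.
    beds: iterable of (thickness_ft, porosity, water_saturation)."""
    return sum(h * phi * (1.0 - sw) for h, phi, sw in beds)
```

    A single 10 ft bed with 20% porosity and 30% water saturation contributes 1.4 ft of HPT; many thin beds with small individual contributions can add up to a material share of OOIP.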

  16. Population-based absolute risk estimation with survey data.

    PubMed

    Kovalchik, Stephanie A; Pfeiffer, Ruth M

    2014-04-01

    Absolute risk is the probability that a cause-specific event occurs in a given time interval in the presence of competing events. We present methods to estimate population-based absolute risk from a complex survey cohort that can accommodate multiple exposure-specific competing risks. The hazard function for each event type consists of an individualized relative risk multiplied by a baseline hazard function, which is modeled nonparametrically or parametrically with a piecewise exponential model. An influence method is used to derive a Taylor-linearized variance estimate for the absolute risk estimates. We introduce novel measures of the cause-specific influences that can guide modeling choices for the competing event components of the model. To illustrate our methodology, we build and validate cause-specific absolute risk models for cardiovascular and cancer deaths using data from the National Health and Nutrition Examination Survey. Our applications demonstrate the usefulness of survey-based risk prediction models for predicting health outcomes and quantifying the potential impact of disease prevention programs at the population level. PMID:23686614
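
    Under the piecewise exponential model mentioned in the abstract, absolute risk with one competing cause has a closed form on each interval. The sketch below assumes two constant cause-specific hazards per interval and omits covariates, survey weights, and variance estimation.

```python
import math

def absolute_risk(intervals):
    """Absolute risk of the cause-1 event under piecewise-constant
    cause-specific hazards with one competing cause (sketch).
    intervals: list of (length, h1, h2). Per interval, the cause-1 risk is
      S(start) * h1/(h1+h2) * (1 - exp(-(h1+h2)*length)),
    where S is overall event-free survival at the interval start."""
    risk, surv = 0.0, 1.0
    for length, h1, h2 in intervals:
        tot = h1 + h2
        if tot > 0:
            risk += surv * (h1 / tot) * (1.0 - math.exp(-tot * length))
        surv *= math.exp(-tot * length)
    return risk
```

    With no competing hazard this reduces to 1 - exp(-h1*t); with equal hazards each cause claims half of the total event probability.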

  17. On cancer risk estimation of urban air pollution.

    PubMed Central

    Törnqvist, M; Ehrenberg, L

    1994-01-01

    The usefulness of data from various sources for a cancer risk estimation of urban air pollution is discussed. Considering the irreversibility of initiations, a multiplicative model is preferred for solid tumors. As has been concluded for exposure to ionizing radiation, the multiplicative model, in comparison with the additive model, predicts a relatively larger number of cases at high ages, with enhanced underestimation of risks by short follow-up times in disease-epidemiological studies. For related reasons, the extrapolation of risk from animal tests on the basis of daily absorbed dose per kilogram body weight or per square meter surface area, without considering differences in life span, may lead to an underestimation, and agreements with epidemiologically determined values may be fortuitous. Considering these possibilities, the most likely lifetime risks of cancer death at the average exposure levels in Sweden were estimated for certain pollution fractions or indicator compounds in urban air. The risks amount to approximately 50 deaths per 100,000 for inhaled particulate organic material (POM), with a contribution from ingested POM about three times larger, while alkenes and butadiene each cause about 20 deaths per 100,000 individuals. Also, benzene and formaldehyde are expected to be associated with considerable risk increments. Comparative potency methods were applied for POM and alkenes. Due to the incompleteness of the list of compounds considered and the uncertainties of the above estimates, a total risk calculation for urban air has not been attempted here. PMID:7821292

  18. Parenchymal Texture Analysis in Digital Breast Tomosynthesis for Breast Cancer Risk Estimation: A Preliminary Study

    PubMed Central

    Kontos, Despina; Bakic, Predrag R.; Carton, Ann-Katherine; Troxel, Andrea B.; Conant, Emily F.; Maidment, Andrew D.A.

    2009-01-01

    Rationale and Objectives Studies have demonstrated a relationship between mammographic parenchymal texture and breast cancer risk. Although promising, texture analysis in mammograms is limited by tissue superimposition. Digital breast tomosynthesis (DBT) is a novel tomographic x-ray breast imaging modality that alleviates the effect of tissue superimposition, offering superior parenchymal texture visualization compared to mammography. Our study investigates the potential advantages of DBT parenchymal texture analysis for breast cancer risk estimation. Materials and Methods DBT and digital mammography (DM) images of 39 women were analyzed. Texture features, shown in studies with mammograms to correlate with cancer risk, were computed from the retroareolar breast region. We compared the relative performance of DBT and DM texture features in correlating with two measures of breast cancer risk: (i) the Gail and Claus risk estimates, and (ii) mammographic breast density. Linear regression was performed to model the association between texture features and increasing levels of risk. Results No significant correlation was detected between parenchymal texture and the Gail and Claus risk estimates. Significant correlations were observed between texture features and breast density. Overall, the DBT texture features demonstrated stronger correlations with breast percent density (PD) than DM (p ≤ 0.05). When dividing our study population in groups of increasing breast PD, the DBT texture features appeared to be more discriminative, having regression lines with overall lower p-values, steeper slopes, and higher R2 estimates. Conclusion Although preliminary, our results suggest that DBT parenchymal texture analysis could provide more accurate characterization of breast density patterns, which could ultimately improve breast cancer risk estimation. PMID:19201357

  19. Can endocranial volume be estimated accurately from external skull measurements in great-tailed grackles (Quiscalus mexicanus)?

    PubMed

    Logan, Corina J; Palmstrom, Christin R

    2015-01-01

    There is an increasing need to validate and collect data approximating brain size on individuals in the field to understand what evolutionary factors drive brain size variation within and across species. We investigated whether we could accurately estimate endocranial volume (a proxy for brain size), as measured by computerized tomography (CT) scans, using external skull measurements and/or by filling skulls with beads and pouring them out into a graduated cylinder for male and female great-tailed grackles. We found that while females had higher correlations than males, estimations of endocranial volume from external skull measurements or beads did not tightly correlate with CT volumes. We found no accuracy in the ability of external skull measures to predict CT volumes because the prediction intervals for most data points overlapped extensively. We conclude that we are unable to detect individual differences in endocranial volume using external skull measurements. These results emphasize the importance of validating and explicitly quantifying the predictive accuracy of brain size proxies for each species and each sex. PMID:26082858

  20. A plan for accurate estimation of daily area-mean rainfall during the CaPE experiment

    NASA Technical Reports Server (NTRS)

    Duchon, Claude E.

    1992-01-01

    The Convection and Precipitation/Electrification (CaPE) experiment took place in east central Florida from 8 July to 18 August, 1991. There were five research themes associated with CaPE. In broad terms they are: investigation of the evolution of the electric field in convective clouds, determination of meteorological and electrical conditions associated with lightning, development of mesoscale numerical forecasts (2-12 hr) and nowcasts (less than 2 hr) of convective initiation and remote estimation of rainfall. It is the last theme coupled with numerous raingage and streamgage measurements, satellite and aircraft remote sensing, radiosondes and other meteorological measurements in the atmospheric boundary layer that provide the basis for determining the hydrologic cycle for the CaPE experiment area. The largest component of the hydrologic cycle in this region is rainfall. An accurate determination of daily area-mean rainfall is important in correctly modeling its apportionment into runoff, infiltration and evapotranspiration. In order to achieve this goal a research plan was devised and initial analysis begun. The overall research plan is discussed with special emphasis placed on the adjustment of radar rainfall estimates to raingage rainfall.

  1. Accurately Predicting Future Reading Difficulty for Bilingual Latino Children at Risk for Language Impairment

    ERIC Educational Resources Information Center

    Petersen, Douglas B.; Gillam, Ronald B.

    2013-01-01

    Sixty-three bilingual Latino children who were at risk for language impairment were administered reading-related measures in English and Spanish (letter identification, phonological awareness, rapid automatized naming, and sentence repetition) and descriptive measures including English language proficiency (ELP), language ability (LA),…

  2. Population-Attributable Risk Estimates for Risk Factors Associated with Campylobacter Infection, Australia

    PubMed Central

    Schluter, Philip J.; Wilson, Andrew J.; Kirk, Martyn D.; Hall, Gillian; Unicomb, Leanne

    2008-01-01

    In 2001–2002, a multicenter, prospective case-control study involving 1,714 participants >5 years of age was conducted in Australia to identify risk factors for Campylobacter infection. Adjusted population-attributable risks (PARs) were derived for each independent risk factor contained within the final multivariable logistic regression model. Estimated PARs were combined with adjusted (for the >5 years of age eligibility criterion) notifiable disease surveillance data to estimate annual Australian Campylobacter case numbers attributable to each risk factor. Simulated distributions of “credible values” were then generated to model the uncertainty associated with each case number estimate. Among foodborne risk factors, an estimated 50,500 (95% credible interval 10,000–105,500) cases of Campylobacter infection in persons >5 years of age could be directly attributed each year to consumption of chicken in Australia. Our statistical technique could be applied more widely to other communicable diseases that are subject to routine surveillance. PMID:18507899
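    The attributable-case arithmetic above can be sketched in a few lines. This is a generic illustration, not the study's code: it uses Miettinen's case-based formula PAR = p_c(OR - 1)/OR and an assumed lognormal spread for the adjusted odds ratio to generate the "credible values"; all inputs are placeholders.

```python
import math, random

def par_miettinen(p_cases_exposed, odds_ratio):
    # Miettinen's case-based formula: PAR = p_c * (OR - 1) / OR,
    # where p_c is the exposed fraction among cases and the adjusted
    # odds ratio stands in for the relative risk.
    return p_cases_exposed * (odds_ratio - 1.0) / odds_ratio

def attributable_cases(total_cases, p_cases_exposed, or_mean, log_or_sd,
                       n_sim=10_000, seed=1):
    # Monte Carlo "credible values": sample the adjusted OR on the log
    # scale (an assumed distribution) and propagate to attributable
    # case counts; report the median and a central 95% interval.
    rng = random.Random(seed)
    draws = sorted(
        total_cases * par_miettinen(
            p_cases_exposed, math.exp(rng.gauss(math.log(or_mean), log_or_sd)))
        for _ in range(n_sim))
    median = draws[n_sim // 2]
    return median, (draws[int(0.025 * n_sim)], draws[int(0.975 * n_sim)])
```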

  3. How to estimate the Value at Risk under incomplete information

    NASA Astrophysics Data System (ADS)

    de Schepper, Ann; Heijnen, Bart

    2010-03-01

    A key problem in financial and actuarial research, and particularly in the field of risk management, is the choice of models so as to avoid systematic biases in the measurement of risk. An alternative consists of relaxing the assumption that the probability distribution is completely known, leading to interval estimates instead of point estimates. In the present contribution, we show how this is possible for the Value at Risk, by fixing only a small number of parameters of the underlying probability distribution. We start by deriving bounds on tail probabilities, and we show how a conversion leads to bounds for the Value at Risk. It will turn out that with a maximum of three given parameters, the best estimates are always realized in the case of a unimodal random variable for which two moments and the mode are given. It will also be shown that a lognormal model results in estimates for the Value at Risk that are much closer to the upper bound than to the lower bound.
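    The mechanism (a tail-probability inequality inverted to a VaR bound) can be illustrated with the simplest two-moment case. The sketch below uses Cantelli's inequality, not the paper's sharper three-parameter bounds, and contrasts it with a fully specified lognormal point estimate:

```python
import math
from statistics import NormalDist

def var_upper_bound(mu, sigma, alpha):
    # Cantelli's one-sided inequality uses only the mean and variance:
    # P(X >= mu + t) <= sigma^2 / (sigma^2 + t^2).  Setting the right-hand
    # side equal to the tail probability 1 - alpha and solving for t bounds
    # every alpha-quantile: VaR_alpha <= mu + sigma * sqrt(alpha/(1-alpha)).
    return mu + sigma * math.sqrt(alpha / (1.0 - alpha))

def var_lognormal(mu_log, sigma_log, alpha):
    # Point estimate under a fully specified lognormal model, for contrast
    # with the distribution-free interval estimate.
    return math.exp(mu_log + sigma_log * NormalDist().inv_cdf(alpha))
```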

  4. Studies on the extended Techa river cohort: cancer risk estimation

    SciTech Connect

    Kossenko, M M.; Preston, D L.; Krestinina, L Y.; Degteva, M O.; Startsev, N V.; Thomas, T; Vyushkova, O V.; Anspaugh, L R.; Napier, Bruce A. ); Kozheurov, V P.; Ron, E; Akleyev, A V.

    2001-12-01

    Initial population-based studies of riverside residents were begun in the late 1950s and in 1967 a systematic effort was undertaken to develop a well-defined fixed cohort of Techa river residents, to carry out ongoing mortality and (limited) clinical follow-up of this cohort, and to provide individualized dose estimates for cohort members. Over the past decade, extensive efforts have been made to refine the cohort definition and improve both the follow-up and dosimetry data. Analyses of the Techa river cohort can provide useful quantitative estimates of the effects of low dose rate, chronic external and internal exposures on cancer mortality and incidence and non-cancer mortality rates. These risk estimates complement quantitative risk estimates for acute exposures based on the atomic bomb survivors and chronic exposure risk estimates from worker studies, including Mayak workers and other groups with occupational radiation exposures. As the dosimetry and follow-up are refined it may also be possible to gain useful insights into risks associated with 90Sr exposures.

  5. TIMP2•IGFBP7 biomarker panel accurately predicts acute kidney injury in high-risk surgical patients

    PubMed Central

    Gunnerson, Kyle J.; Shaw, Andrew D.; Chawla, Lakhmir S.; Bihorac, Azra; Al-Khafaji, Ali; Kashani, Kianoush; Lissauer, Matthew; Shi, Jing; Walker, Michael G.; Kellum, John A.

    2016-01-01

    BACKGROUND Acute kidney injury (AKI) is an important complication in surgical patients. Existing biomarkers and clinical prediction models underestimate the risk for developing AKI. We recently reported data from two trials of 728 and 408 critically ill adult patients in whom urinary TIMP2•IGFBP7 (NephroCheck, Astute Medical) was used to identify patients at risk of developing AKI. Here we report a preplanned analysis of surgical patients from both trials to assess whether urinary tissue inhibitor of metalloproteinase 2 (TIMP-2) and insulin-like growth factor–binding protein 7 (IGFBP7) accurately identify surgical patients at risk of developing AKI. STUDY DESIGN We enrolled adult surgical patients at risk for AKI who were admitted to one of 39 intensive care units across Europe and North America. The primary end point was moderate-severe AKI (equivalent to KDIGO [Kidney Disease Improving Global Outcomes] stages 2–3) within 12 hours of enrollment. Biomarker performance was assessed using the area under the receiver operating characteristic curve, integrated discrimination improvement, and category-free net reclassification improvement. RESULTS A total of 375 patients were included in the final analysis of whom 35 (9%) developed moderate-severe AKI within 12 hours. The area under the receiver operating characteristic curve for [TIMP-2]•[IGFBP7] alone was 0.84 (95% confidence interval, 0.76–0.90; p < 0.0001). Biomarker performance was robust in sensitivity analysis across predefined subgroups (urgency and type of surgery). CONCLUSION For postoperative surgical intensive care unit patients, a single urinary TIMP2•IGFBP7 test accurately identified patients at risk for developing AKI within the ensuing 12 hours and its inclusion in clinical risk prediction models significantly enhances their performance. LEVEL OF EVIDENCE Prognostic study, level I. PMID:26816218

  6. SU-E-J-208: Fast and Accurate Auto-Segmentation of Abdominal Organs at Risk for Online Adaptive Radiotherapy

    SciTech Connect

    Gupta, V; Wang, Y; Romero, A; Heijmen, B; Hoogeman, M; Myronenko, A; Jordan, P

    2014-06-01

    Purpose: Various studies have demonstrated that online adaptive radiotherapy by real-time re-optimization of the treatment plan can improve organs-at-risk (OARs) sparing in the abdominal region. Its clinical implementation, however, requires fast and accurate auto-segmentation of OARs in CT scans acquired just before each treatment fraction. Auto-segmentation is particularly challenging in the abdominal region due to the frequently observed large deformations. We present a clinical validation of a new auto-segmentation method that uses fully automated non-rigid registration for propagating abdominal OAR contours from planning to daily treatment CT scans. Methods: OARs were manually contoured by an expert panel to obtain ground truth contours for repeat CT scans (3 per patient) of 10 patients. For the non-rigid alignment, we used a new non-rigid registration method that estimates the deformation field by optimizing local normalized correlation coefficient with smoothness regularization. This field was used to propagate planning contours to repeat CTs. To quantify the performance of the auto-segmentation, we compared the propagated and ground truth contours using two widely used metrics: the Dice coefficient (Dc) and the Hausdorff distance (Hd). The proposed method was benchmarked against translation- and rigid-alignment-based auto-segmentation. Results: For all organs, the auto-segmentation performed better than the baseline (translation) with an average processing time of 15 s per fraction CT. The overall improvements ranged from 2% (heart) to 32% (pancreas) in Dc, and 27% (heart) to 62% (spinal cord) in Hd. For liver, kidneys, gall bladder, stomach, spinal cord and heart, Dc above 0.85 was achieved. Duodenum and pancreas were the most challenging organs, with both showing relatively larger spreads and medians of 0.79 and 2.1 mm for Dc and Hd, respectively. Conclusion: Based on the achieved accuracy and computational time, we conclude that the investigated auto-segmentation method is a good candidate for online adaptive radiotherapy.
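    The two validation metrics named in this record have compact definitions, sketched here for small label and point sets (brute force, for illustration only; not the evaluation code used in the study):

```python
import math

def dice(a, b):
    # Dice coefficient between two voxel/label sets: 2|A∩B| / (|A| + |B|).
    a, b = set(a), set(b)
    return 2.0 * len(a & b) / (len(a) + len(b))

def hausdorff(a, b):
    # Symmetric Hausdorff distance between two point sets.  Brute force,
    # O(len(a) * len(b)): fine for small contours, slow in general.
    def directed(xs, ys):
        return max(min(math.dist(p, q) for q in ys) for p in xs)
    return max(directed(a, b), directed(b, a))
```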

  7. Sensitivity of risk estimates to wildlife bioaccumulation factors in ecological risk assessment

    SciTech Connect

    Karustis, C.G.; Brewer, R.A.

    1995-12-31

    The concept of conservatism in risk assessment is well established. However, overly conservative assumptions may result in risk estimates that incorrectly predict remediation goals. Therefore, realistic assumptions should be applied in risk assessment whenever possible. A sensitivity analysis was performed on conservative (i.e. bioaccumulation factor = 1) and scientifically-derived wildlife bioaccumulation factors (BAFs) utilized to calculate risks during a terrestrial ecological risk assessment (ERA). In the first approach, 100% bioaccumulation of contaminants was assumed to estimate the transfer of contaminants through the terrestrial food chain. In the second approach, scientifically-derived BAFs were selected from the literature. For one of the measurement species selected, total risks calculated during the first approach were higher than those calculated during the second approach by two orders of magnitude. However, potential risks due to individual contaminants were not necessarily higher using the conservative approach. Potential risk due to contaminants with low actual bioaccumulation were exaggerated while potential risks due to contaminants with greater than 100% bioaccumulation were underestimated. Therefore, the use of a default of 100% bioaccumulation (BAF = 1) for all contaminants encountered during an ERA could result in cases where contaminants are incorrectly identified as risk drivers, and the calculation of incorrect ecological risk-based cleanup goals. The authors suggest using site-specific or literature-derived BAFs whenever possible and realistic BAF estimates, based upon factors such as log Kow, when BAFs are unavailable.
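    The sensitivity being probed is easy to see in the simplest food-chain transfer step, sketched below. This is a generic screening-level formula with placeholder names, not the assessment's actual exposure model; its point is only that the dose scales linearly with the assumed BAF:

```python
def wildlife_dose(soil_conc, baf, intake_rate, body_weight):
    # Simplified food-chain exposure: prey tissue concentration is soil
    # concentration times the BAF, and dose is concentration times intake
    # normalized by body weight.  Setting baf = 1 (100% bioaccumulation)
    # can overstate or understate the result relative to a measured BAF.
    return soil_conc * baf * intake_rate / body_weight
```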

  8. Estimating wildfire risk on a Mojave Desert landscape using remote sensing and field sampling

    USGS Publications Warehouse

    Van Linn, Peter F., III; Nussear, Kenneth E.; Esque, Todd C.; DeFalco, Lesley A.; Inman, Richard D.; Abella, Scott R.

    2013-01-01

    Predicting wildfires that affect broad landscapes is important for allocating suppression resources and guiding land management. Wildfire prediction in the south-western United States is of specific concern because of the increasing prevalence and severe effects of fire on desert shrublands and the current lack of accurate fire prediction tools. We developed a fire risk model to predict fire occurrence in a north-eastern Mojave Desert landscape. First we developed a spatial model using remote sensing data to predict fuel loads based on field estimates of fuels. We then modelled fire risk (interactions of fuel characteristics and environmental conditions conducive to wildfire) using satellite imagery, our model of fuel loads, and spatial data on ignition potential (lightning strikes and distance to roads), topography (elevation and aspect) and climate (maximum and minimum temperatures). The risk model was developed during a fire year at our study landscape and validated at a nearby landscape; model performance was accurate and similar at both sites. This study demonstrates that remote sensing techniques used in combination with field surveys can accurately predict wildfire risk in the Mojave Desert and may be applicable to other arid and semiarid lands where wildfires are prevalent.

  9. Neoplastic potential of gastric irradiation. IV. Risk estimates

    SciTech Connect

    Griem, M.L.; Justman, J.; Weiss, L.

    1984-12-01

    No significant tumor increase was found in the initial analysis of patients irradiated for peptic ulcer and followed through 1962. A preliminary study was undertaken 22 years later to estimate the risk of cancer due to gastric irradiation for peptic ulcer disease. A population of 2,049 irradiated patients and 763 medically managed patients has been identified. A relative risk of 3.7 was found for stomach cancer and an initial risk estimate of 5.5 x 10^-6 excess stomach cancers per person-rad was calculated. A more complete follow-up is in progress to further elucidate this observation and decrease the ascertainment bias; however, preliminary data are in agreement with the Japanese atomic bomb reports.

  10. Development of a new, robust and accurate, spectroscopic metric for scatterer size estimation in optical coherence tomography (OCT) images

    NASA Astrophysics Data System (ADS)

    Kassinopoulos, Michalis; Pitris, Costas

    2016-03-01

    The modulations appearing on the backscattering spectrum originating from a scatterer are related to its diameter as described by Mie theory for spherical particles. Many metrics for Spectroscopic Optical Coherence Tomography (SOCT) take advantage of this observation in order to enhance the contrast of Optical Coherence Tomography (OCT) images. However, none of these metrics has achieved high accuracy when calculating the scatterer size. In this work, Mie theory was used to further investigate the relationship between the degree of modulation in the spectrum and the scatterer size. From this study, a new spectroscopic metric, the bandwidth of the Correlation of the Derivative (COD) was developed which is more robust and accurate, compared to previously reported techniques, in the estimation of scatterer size. The self-normalizing nature of the derivative and the robustness of the first minimum of the correlation as a measure of its width, offer significant advantages over other spectral analysis approaches especially for scatterer sizes above 3 μm. The feasibility of this technique was demonstrated using phantom samples containing 6, 10 and 16 μm diameter microspheres as well as images of normal and cancerous human colon. The results are very promising, suggesting that the proposed metric could be implemented in OCT spectral analysis for measuring nuclear size distribution in biological tissues. A technique providing such information would be of great clinical significance since it would allow the detection of nuclear enlargement at the earliest stages of precancerous development.
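    One reading of the COD idea, sketched in pure Python: differentiate the backscatter spectrum, autocorrelate the derivative, and take the lag of the first local minimum of the correlation as the width metric. The exact windowing and normalization in the paper may differ, so treat this strictly as an illustration of the concept:

```python
import math

def cod_bandwidth(spectrum):
    # Differentiate the spectrum; the derivative is self-normalizing with
    # respect to slowly varying spectral envelopes.
    deriv = [spectrum[i + 1] - spectrum[i] for i in range(len(spectrum) - 1)]
    n = len(deriv)
    mean = sum(deriv) / n
    c0 = sum((x - mean) ** 2 for x in deriv)

    def acf(lag):
        # Sample autocorrelation of the derivative at the given lag.
        return sum((deriv[i] - mean) * (deriv[i + lag] - mean)
                   for i in range(n - lag)) / c0

    prev = acf(0)
    for lag in range(1, n - 1):
        cur = acf(lag)
        if cur > prev:              # correlation turned upward:
            return lag - 1          # the previous lag was the first minimum
        prev = cur
    return n - 1
```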

  11. Optimization of tissue physical parameters for accurate temperature estimation from finite-element simulation of radiofrequency ablation.

    PubMed

    Subramanian, Swetha; Mast, T Douglas

    2015-10-01

    Computational finite element models are commonly used for the simulation of radiofrequency ablation (RFA) treatments. However, the accuracy of these simulations is limited by the lack of precise knowledge of tissue parameters. In this technical note, an inverse solver based on the unscented Kalman filter (UKF) is proposed to optimize values for specific heat, thermal conductivity, and electrical conductivity resulting in accurately simulated temperature elevations. A total of 15 RFA treatments were performed on ex vivo bovine liver tissue. For each RFA treatment, 15 finite-element simulations were performed using a set of deterministically chosen tissue parameters to estimate the mean and variance of the resulting tissue ablation. The UKF was implemented as an inverse solver to recover the specific heat, thermal conductivity, and electrical conductivity corresponding to the measured area of the ablated tissue region, as determined from gross tissue histology. These tissue parameters were then employed in the finite element model to simulate the position- and time-dependent tissue temperature. Results show good agreement between simulated and measured temperature. PMID:26352462

  12. Optimization of tissue physical parameters for accurate temperature estimation from finite-element simulation of radiofrequency ablation

    NASA Astrophysics Data System (ADS)

    Subramanian, Swetha; Mast, T. Douglas

    2015-09-01

    Computational finite element models are commonly used for the simulation of radiofrequency ablation (RFA) treatments. However, the accuracy of these simulations is limited by the lack of precise knowledge of tissue parameters. In this technical note, an inverse solver based on the unscented Kalman filter (UKF) is proposed to optimize values for specific heat, thermal conductivity, and electrical conductivity resulting in accurately simulated temperature elevations. A total of 15 RFA treatments were performed on ex vivo bovine liver tissue. For each RFA treatment, 15 finite-element simulations were performed using a set of deterministically chosen tissue parameters to estimate the mean and variance of the resulting tissue ablation. The UKF was implemented as an inverse solver to recover the specific heat, thermal conductivity, and electrical conductivity corresponding to the measured area of the ablated tissue region, as determined from gross tissue histology. These tissue parameters were then employed in the finite element model to simulate the position- and time-dependent tissue temperature. Results show good agreement between simulated and measured temperature.
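    A full unscented Kalman filter is beyond a short sketch, but the inverse-problem structure described in these two records (adjust a tissue parameter until a forward ablation model reproduces the measured ablated area) can be shown with a toy monotone forward model and bisection. This stands in for, and is far simpler than, the UKF of the technical note:

```python
def invert_parameter(measured_area, forward_model, lo, hi, tol=1e-6):
    # Toy inverse solver: find the parameter value at which a monotone
    # increasing forward model reproduces the measured ablated area,
    # by bisection on the bracket [lo, hi].
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if forward_model(mid) < measured_area:
            lo = mid
        else:
            hi = mid
        if hi - lo < tol:
            break
    return 0.5 * (lo + hi)
```

    The recovered parameter would then be fed back into the forward (finite-element) model, as in the note's workflow.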

  13. Estimation of myocardial volume at risk from CT angiography

    NASA Astrophysics Data System (ADS)

    Zhu, Liangjia; Gao, Yi; Mohan, Vandana; Stillman, Arthur; Faber, Tracy; Tannenbaum, Allen

    2011-03-01

    The determination of myocardial volume at risk distal to coronary stenosis provides important information for prognosis and treatment of coronary artery disease. In this paper, we present a novel computational framework for estimating the myocardial volume at risk in computed tomography angiography (CTA) imagery. Initially, epicardial and endocardial surfaces, and coronary arteries are extracted using an active contour method. Then, the extracted coronary arteries are projected onto the epicardial surface, and each point on this surface is associated with its closest coronary artery using the geodesic distance measurement. The likely myocardial region at risk on the epicardial surface caused by a stenosis is approximated by the region in which all its inner points are associated with the sub-branches distal to the stenosis on the coronary artery tree. Finally, the likely myocardial volume at risk is approximated by the volume in between the region at risk on the epicardial surface and its projection on the endocardial surface, which is expected to yield computational savings over risk volume estimation using the entire image volume. Furthermore, we expect increased accuracy since, as compared to prior work using the Euclidean distance, we employ the geodesic distance in this work. The experimental results demonstrate the effectiveness of the proposed approach on pig heart CTA datasets.
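    Associating each epicardial point with its geodesically closest artery branch is a multi-source shortest-path labelling; on a surface mesh represented as a weighted graph it can be sketched with Dijkstra's algorithm. The data structures here are illustrative, not the authors' implementation:

```python
import heapq

def assign_to_nearest_branch(adjacency, seeds):
    # Multi-source Dijkstra on a mesh graph: each vertex receives the label
    # of the branch whose seed vertices reach it first by graph (geodesic)
    # distance.  adjacency: {v: [(u, w), ...]}; seeds: {branch_id: [verts]}.
    dist, label = {}, {}
    pq = []
    for b, verts in seeds.items():
        for v in verts:
            dist[v], label[v] = 0.0, b
            heapq.heappush(pq, (0.0, v, b))
    while pq:
        d, v, b = heapq.heappop(pq)
        if d > dist.get(v, float("inf")):
            continue                      # stale queue entry
        for u, w in adjacency.get(v, []):
            nd = d + w
            if nd < dist.get(u, float("inf")):
                dist[u], label[u] = nd, b
                heapq.heappush(pq, (nd, u, b))
    return label, dist
```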

  14. Estimation of the environmental risk of regulated river flow

    NASA Astrophysics Data System (ADS)

    Latu, Kilisimasi; Malano, Hector M.; Costelloe, Justin F.; Peterson, Tim J.

    2014-09-01

    A commonly accepted paradigm in environmental flow management is that a regulated river flow regime should mimic the natural hydrological regime to sustain the key attributes of freshwater ecosystems. Estimation of the environmental risk arising from flow regulation needs to consider all aspects of the flow regime when applied to water allocation decisions. We present a holistic, dynamic and robust approach that is based on a statistical analysis of the entire flow regime and accounts for flow stress indicators to produce an environmental risk time series based on the consequence of departures from the optimum flow range of a river or reach. When applied to a catchment (Campaspe River, southern Australia), the model produced a dynamic and robust environmental risk time series that clearly showed that when the observed river flow is drawn away from the optimum range of environmental flow demand, the environmental risk increases. In addition, the risk time series showed that the Campaspe River has a reversed seasonal pattern of river flow due to water releases during summer periods, which has altered the flow regime of the river. Hence, environmental risk was higher in summer but lower in winter periods. Furthermore, we found that the vulnerability and coefficient of variation indices have the highest contributions to consequence in comparison to other indices used to calculate environmental risk.
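    The departure-from-optimum idea can be reduced to a one-line risk proxy, sketched below. The paper combines several flow-stress indices, so this zero-inside-range form with assumed bounds is a deliberate simplification of the mechanism only:

```python
def flow_risk(flow, low, high):
    # Risk proxy from departure of observed flow from an optimum
    # environmental range [low, high]: zero inside the range, growing
    # with the relative departure outside it.
    if flow < low:
        return (low - flow) / low
    if flow > high:
        return (flow - high) / high
    return 0.0
```

    Applied to a daily flow series, this yields a risk time series of the kind the abstract describes: low in months where flow tracks the optimum range, elevated where regulation pushes flow outside it.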

  15. Estimation of earthquake risk curves of physical building damage

    NASA Astrophysics Data System (ADS)

    Raschke, Mathias; Janouschkowetz, Silke; Fischer, Thomas; Simon, Christian

    2014-05-01

    In this study, a new approach to quantify seismic risks is presented. Here, the earthquake risk curves for the number of buildings with a defined physical damage state are estimated for South Africa. Therein, we define the physical damage states according to the current European macro-seismic intensity scale (EMS-98). The advantage of such kind of risk curve is that its plausibility can be checked more easily than for other types. The earthquake risk curve for physical building damage can be compared with historical damage and their corresponding empirical return periods. The number of damaged buildings from historical events is generally explored and documented in more detail than the corresponding monetary losses. The latter are also influenced by different economic conditions, such as inflation and price hikes. Further on, the monetary risk curve can be derived from the developed risk curve of physical building damage. The earthquake risk curve can also be used for the validation of underlying sub-models such as the hazard and vulnerability modules.
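    An empirical risk curve of this kind (damage level versus return period) can be read directly off a ranked event catalogue. A sketch, assuming a catalogue of per-event damaged-building counts over a known observation span (a generic plotting-position estimate, not the study's model):

```python
def risk_curve(event_damage_counts, years):
    # Empirical risk curve: rank events by number of damaged buildings
    # (largest first); the k-th largest count is reached or exceeded
    # about k times in the observation span, giving an empirical return
    # period of years / k for that damage level.
    counts = sorted(event_damage_counts, reverse=True)
    return [(c, years / rank) for rank, c in enumerate(counts, start=1)]
```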

  16. Estimating Skin Cancer Risk: Evaluating Mobile Computer-Adaptive Testing

    PubMed Central

    Djaja, Ngadiman; Janda, Monika; Olsen, Catherine M; Whiteman, David C

    2016-01-01

    Background Response burden is a major detriment to questionnaire completion rates. Computer adaptive testing may offer advantages over non-adaptive testing, including reduction of numbers of items required for precise measurement. Objective Our aim was to compare the efficiency of non-adaptive (NAT) and computer adaptive testing (CAT) facilitated by Partial Credit Model (PCM)-derived calibration to estimate skin cancer risk. Methods We used a random sample from a population-based Australian cohort study of skin cancer risk (N=43,794). All 30 items of the skin cancer risk scale were calibrated with the Rasch PCM. A total of 1000 cases generated following a normal distribution (mean [SD] 0 [1]) were simulated using three Rasch models with three fixed-item (dichotomous, rating scale, and partial credit) scenarios, respectively. We calculated the comparative efficiency and precision of CAT and NAT (shortening of questionnaire length and the count difference number ratio less than 5% using independent t tests). Results We found that use of CAT led to smaller person standard error of the estimated measure than NAT, with substantially higher efficiency but no loss of precision, reducing response burden by 48%, 66%, and 66% for dichotomous, Rating Scale Model, and PCM models, respectively. Conclusions CAT-based administrations of the skin cancer risk scale could substantially reduce participant burden without compromising measurement precision. A mobile computer adaptive test was developed to help people efficiently assess their skin cancer risk. PMID:26800642
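    The efficiency gain of CAT comes from always asking the most informative remaining item. For the dichotomous Rasch case mentioned in this abstract, item information is P(1 - P), maximized when item difficulty matches the current ability estimate. A sketch of the selection step only (ability re-estimation and the Partial Credit generalization are omitted; difficulties are placeholders):

```python
import math

def rasch_p(theta, b):
    # Dichotomous Rasch model: probability of a correct/endorsed response
    # for ability theta and item difficulty b.
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def next_item(theta, difficulties, administered):
    # CAT item selection: pick the unused item maximizing Fisher
    # information I(theta) = P * (1 - P), i.e. the item whose difficulty
    # is closest to the current ability estimate.
    best, best_info = None, -1.0
    for i, b in enumerate(difficulties):
        if i in administered:
            continue
        p = rasch_p(theta, b)
        info = p * (1.0 - p)
        if info > best_info:
            best, best_info = i, info
    return best
```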

  17. A Review of Expertise and Judgment Processes for Risk Estimation

    SciTech Connect

    R. L. Boring

    2007-06-01

    A major challenge of risk and reliability analysis for human errors or hardware failures is the need to enlist expert opinion in areas for which adequate operational data are not available. Experts enlisted in this capacity provide probabilistic estimates of reliability, typically comprised of a measure of central tendency and uncertainty bounds. While formal guidelines for expert elicitation are readily available, they largely fail to provide a theoretical basis for expertise and judgment. This paper reviews expertise and judgment in the context of risk analysis; overviews judgment biases, the role of training, and multivariate judgments; and provides guidance on the appropriate use of atomistic and holistic judgment processes.

  18. Estimated incidence and risk factors of sudden unexpected death

    PubMed Central

    Lin, Feng-Chang; Mehta, Neil; Mounsey, Louisa; Nwosu, Anthony; Pursell, Irion; Chung, Eugene H; Mounsey, J Paul; Simpson, Ross J

    2016-01-01

    Objective In this manuscript, we estimate the incidence and identify risk factors for sudden unexpected death in a socioeconomically and racially diverse population in one county in North Carolina. Estimates of the incidence and risk factors contributing to sudden death vary widely. The Sudden Unexpected Death in North Carolina (SUDDEN) project is a population-based investigation of the incidence and potential causes of sudden death. Methods From 3 March 2013 to 2 March 2014, all out-of-hospital deaths in Wake County, North Carolina, were screened to identify presumed sudden unexpected death among free-living residents between the ages of 18 and 64 years. Death certificate, public and medical records were reviewed and adjudicated to confirm sudden unexpected death cases. Results Following adjudication, 190 sudden unexpected deaths including 122 men and 68 women were identified. Estimated incidence was 32.1 per 100 000 person-years overall: 42.7 among men and 22.4 among women. The majority of victims were white, unmarried men over age 55 years, with unwitnessed deaths at home. Hypertension and dyslipidaemia were common in men and women. African-American women dying from sudden unexpected death were over-represented. Women who were under age 55 years with coronary disease accounted for over half of female participants with coronary artery disease. Conclusions The overall estimated incidence of sudden unexpected death may account for approximately 10% of all deaths classified as ‘natural’. Women have a lower estimated incidence of sudden unexpected death than men. However, we found no major differences in age or comorbidities between men and women. African-Americans and young women with coronary disease are at risk for sudden unexpected death. PMID:27042316
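    The headline figures here are crude incidence rates. A sketch of the computation, with an approximate Poisson-based 95% interval; the paper's exact interval method is not stated in the abstract, so the log-scale normal approximation below is an assumption:

```python
import math

def incidence_per_100k(cases, person_years):
    # Crude incidence per 100,000 person-years, with an approximate 95%
    # interval assuming Poisson case counts (normal approximation on the
    # log scale: se(log rate) ≈ 1/sqrt(cases)).
    rate = cases / person_years * 100_000
    se = 1.0 / math.sqrt(cases)
    return rate, rate * math.exp(-1.96 * se), rate * math.exp(1.96 * se)
```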

  19. Improved risk estimates for carbon tetrachloride. 1998 annual progress report

    SciTech Connect

    Benson, J.M.; Springer, D.L.; Thrall, K.D.

    1998-06-01

    The overall purpose of these studies is to improve the scientific basis for assessing the cancer risk associated with human exposure to carbon tetrachloride. Specifically, the toxicokinetics of inhaled carbon tetrachloride is being determined in rats, mice and hamsters. Species differences in the metabolism of carbon tetrachloride by rats, mice and hamsters are being determined in vivo and in vitro using tissues and microsomes from these rodent species and man. Dose-response relationships will be determined in all studies. The information will be used to improve the current physiologically based pharmacokinetic model for carbon tetrachloride. The authors will also determine whether carbon tetrachloride is a hepatocarcinogen only when exposure results in cell damage, cell killing, and regenerative cell proliferation. In combination, the results of these studies will provide the types of information needed to enable a refined risk estimate for carbon tetrachloride under EPA's new guidelines for cancer risk assessment.

  20. A comparison of genetic risk score with family history for estimating prostate cancer risk

    PubMed Central

    Helfand, Brian T

    2016-01-01

    Prostate cancer (PCa) testing is recommended by most authoritative groups for high-risk men, including those with a family history of the disease. However, family history information is often limited by patient knowledge and clinician intake, and thus, many men are incorrectly assigned to different risk groups. Alternate methods to assess PCa risk are required. In this review, we discuss how genetic variants, referred to as PCa-risk single-nucleotide polymorphisms (SNPs), can be used to calculate a genetic risk score (GRS). GRS assigns a relatively unique value to all men based on the number of PCa-risk SNPs that an individual carries. This GRS value can provide a more precise estimate of a man's PCa risk. This is particularly relevant in situations when an individual is unaware of his family history. In addition, GRS has utility and can provide a more precise estimate of risk even among men with a positive family history. It can even distinguish risk among relatives with the same degree of family relationship. Taken together, this review serves to provide support for the clinical utility of GRS as an independent test to provide supplemental information to family history. As such, GRS can serve as a platform to help guide shared decision-making processes regarding the timing and frequency of PCa testing and biopsies. PMID:27004541
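    A weighted GRS of the kind described is essentially an allele-count dot product with per-SNP log odds-ratio weights. A minimal sketch (placeholder genotypes and weights, not a validated PCa panel):

```python
def genetic_risk_score(genotypes, log_odds_ratios):
    # Weighted GRS: sum over risk SNPs of the risk-allele count (0, 1, or 2)
    # times the per-allele log odds ratio for that SNP.  An unweighted GRS
    # would use a weight of 1 per risk allele instead.
    return sum(g * w for g, w in zip(genotypes, log_odds_ratios))
```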

  1. Risk estimation based on chromosomal aberrations induced by radiation

    NASA Technical Reports Server (NTRS)

    Durante, M.; Bonassi, S.; George, K.; Cucinotta, F. A.

    2001-01-01

    The presence of a causal association between the frequency of chromosomal aberrations in peripheral blood lymphocytes and the risk of cancer has been substantiated recently by epidemiological studies. Cytogenetic analyses of crew members of the Mir Space Station have shown that a significant increase in the frequency of chromosomal aberrations can be detected after flight, and that such an increase is likely to be attributed to the radiation exposure. The risk of cancer can be estimated directly from the yields of chromosomal aberrations, taking into account some aspects of individual susceptibility and other factors unrelated to radiation. However, the use of an appropriate technique for the collection and analysis of chromosomes and the choice of the structural aberrations to be measured are crucial in providing sound results. Based on the fraction of aberrant lymphocytes detected before and after flight, the relative risk after a long-term Mir mission is estimated to be about 1.2-1.3. The new technique of mFISH can provide useful insights into the quantification of risk on an individual basis.

  2. Exploration of diffusion kernel density estimation in agricultural drought risk analysis: a case study in Shandong, China

    NASA Astrophysics Data System (ADS)

    Chen, W.; Shao, Z.; Tiong, L. K.

    2015-11-01

    Drought has caused the most widespread damage in China, accounting for over 50 % of the total affected area nationwide in recent decades. In this paper, a Standardized Precipitation Index-based (SPI-based) drought risk study is conducted using historical rainfall data from 19 weather stations in Shandong province, China. A kernel density-based method is adopted to carry out the risk analysis. A comparison between bivariate Gaussian kernel density estimation (GKDE) and diffusion kernel density estimation (DKDE) is carried out to analyze the effect of drought intensity and drought duration. The results show that DKDE is relatively more accurate, without boundary leakage. Combined with the GIS technique, the drought risk is presented, revealing the spatial and temporal variation of agricultural droughts for corn in Shandong. The estimation provides a different way to study the occurrence frequency and severity of drought risk from multiple perspectives.
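    As a rough illustration of the boundary-leakage issue raised above, the sketch below (Python with SciPy, hypothetical drought-severity samples) compares a plain Gaussian KDE against a reflection-corrected KDE. Reflection is a simple stand-in for the boundary-respecting behaviour of diffusion KDE, not the authors' method.

    ```python
    import numpy as np
    from scipy.stats import gaussian_kde

    rng = np.random.default_rng(0)
    # Hypothetical drought-severity samples, physically bounded below at 0
    samples = rng.gamma(shape=2.0, scale=1.5, size=500)

    # A plain Gaussian KDE leaks probability mass below the boundary x = 0
    kde = gaussian_kde(samples)
    mass_below_zero = kde.integrate_box_1d(-np.inf, 0.0)

    # Boundary reflection: augment the data with its mirror image about 0,
    # then fold the density back onto [0, inf)
    kde_reflected = gaussian_kde(np.concatenate([samples, -samples]))

    def density(x):
        # Folded estimate: f(x) = 2 * g(x) for x >= 0, where g is the KDE
        # of the reflected sample
        return 2.0 * kde_reflected(x)

    print(f"plain KDE mass leaked below 0: {mass_below_zero:.3f}")
    print(f"reflected KDE mass on [0, inf): "
          f"{2 * kde_reflected.integrate_box_1d(0.0, np.inf):.3f}")
    ```

    The reflected estimate keeps all its mass on the physical domain, which is the property the abstract credits to DKDE.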

  3. Methodology for estimating extreme winds for probabilistic risk assessments

    SciTech Connect

    Ramsdell, J.V.; Elliott, D.L.; Holladay, C.G.; Hubbe, J.M.

    1986-10-01

    The US Nuclear Regulatory Commission (NRC) assesses the risks associated with nuclear facilities using techniques that fall under the generic name of Probabilistic Risk Assessment. In these assessments, potential accident sequences are traced from initiating event to final outcome. At each step of the sequence, a probability of occurrence is assigned to each available alternative. Ultimately, the probability of occurrence of each possible outcome is determined from the probabilities assigned to the initiating events and the alternative paths. Extreme winds are considered in these sequences. As a result, it is necessary to estimate extreme wind probabilities as low as 10^-7 yr^-1. When the NRC staff is called on to provide extreme wind estimates, the staff is likely to be subjected to external time and funding constraints. These constraints dictate that the estimates be based on readily available wind data. In general, readily available data will be limited to the data provided by the facility applicant or licensee and the data archived at the National Climatic Data Center in Asheville, North Carolina. This report describes readily available data that can be used in estimating extreme wind probabilities, procedures for screening the data to eliminate erroneous values and for adjusting data to compensate for differences in data collection methods, and statistical methods for making extreme wind estimates. Supporting technical details are presented in several appendices. Estimation of extreme wind probabilities at a given location involves many subjective decisions. The procedures described do not eliminate all of the subjectivity, but they do increase the reproducibility of the analysis. They provide consistent methods for determining probabilities given a set of subjective decisions. By following these procedures, subjective decisions can be identified and documented.
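    The kind of extreme-value extrapolation described above can be sketched by fitting a Gumbel (Type I extreme-value) distribution to annual-maximum wind speeds and inverting it at a rare exceedance probability. The 40-year synthetic record and the Gumbel choice are illustrative assumptions, not the report's procedure.

    ```python
    import numpy as np
    from scipy.stats import gumbel_r

    rng = np.random.default_rng(42)
    # Hypothetical 40-year record of annual-maximum wind speeds (m/s)
    annual_maxima = gumbel_r.rvs(loc=25.0, scale=4.0, size=40, random_state=rng)

    # Fit a Gumbel distribution to the annual maxima
    loc, scale = gumbel_r.fit(annual_maxima)

    def design_wind(p):
        # Wind speed with annual exceedance probability p (return period 1/p yr)
        return gumbel_r.ppf(1.0 - p, loc=loc, scale=scale)

    for p in (1e-2, 1e-4, 1e-7):
        print(f"P(exceed) = {p:.0e}/yr -> {design_wind(p):.1f} m/s")
    ```

    Extrapolating to 10^-7 per year stretches far beyond a 40-year record, which is exactly why the report stresses data screening and documented subjective decisions.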

  4. Observing Volcanic Thermal Anomalies from Space: How Accurate is the Estimation of the Hotspot's Size and Temperature?

    NASA Astrophysics Data System (ADS)

    Zaksek, K.; Pick, L.; Lombardo, V.; Hort, M. K.

    2015-12-01

    Measuring the heat emission from active volcanic features on the basis of infrared satellite images contributes to the volcano's hazard assessment. Because these thermal anomalies occupy only a small fraction (< 1 %) of a typically resolved target pixel (e.g. from Landsat 7, MODIS), the accurate determination of the hotspot's size and temperature is problematic. Conventionally this is overcome by comparing observations in at least two separate infrared spectral wavebands (the Dual-Band method). We investigate the resolution limits of this thermal un-mixing technique by means of a uniquely designed indoor analog experiment. Therein the volcanic feature is simulated by an electrical heating alloy of 0.5 mm diameter installed on a plywood panel of high emissivity. Two thermographic cameras (VarioCam high resolution and ImageIR 8300 by Infratec) record images of the artificial heat source in wavebands comparable to those available from satellite data. These range from the short-wave infrared (1.4-3 µm) over the mid-wave infrared (3-8 µm) to the thermal infrared (8-15 µm). In the conducted experiment the pixel fraction of the hotspot was successively reduced by increasing the camera-to-target distance from 3 m to 35 m. On the basis of an individual target pixel, the expected decrease of the hotspot pixel area with distance at a relatively constant wire temperature of around 600 °C was confirmed. The deviation of the hotspot's pixel fraction yielded by the Dual-Band method from the theoretically calculated one was found to be within 20 % up to a target distance of 25 m. This means that a reliable estimation of the hotspot size is only possible if the hotspot is larger than about 3 % of the pixel area, a resolution boundary below which most remotely sensed volcanic hotspots fall. Future efforts will focus on the investigation of a resolution limit for the hotspot's temperature by varying the alloy's amperage. Moreover, the un-mixing results for more realistic multi
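    The Dual-Band idea can be sketched as solving two mixed-pixel radiance equations (hot fraction times Planck radiance of the hotspot, plus the background filling the rest of the pixel) for the two unknowns. The wavebands, background temperature and noiseless synthetic observation below are assumptions for illustration, not the experiment's values.

    ```python
    import numpy as np
    from scipy.optimize import fsolve

    # Planck spectral radiance B(lambda, T), wavelength in metres
    H, C, KB = 6.626e-34, 2.998e8, 1.381e-23
    def planck(lam, T):
        return (2 * H * C**2 / lam**5) / np.expm1(H * C / (lam * KB * T))

    # Two bands: mid-wave (4 um) and thermal (11 um); background at 300 K
    LAM_MIR, LAM_TIR = 4e-6, 11e-6
    T_BG = 300.0

    def pixel_radiance(lam, frac, t_hot):
        # Mixed-pixel model: hot fraction plus background fill the pixel
        return frac * planck(lam, t_hot) + (1.0 - frac) * planck(lam, T_BG)

    # Synthetic "observation": 2% of the pixel at 873 K (~600 degC)
    true_frac, true_temp = 0.02, 873.0
    obs = np.array([pixel_radiance(LAM_MIR, true_frac, true_temp),
                    pixel_radiance(LAM_TIR, true_frac, true_temp)])

    # Dual-Band inversion: solve the two band equations for (frac, t_hot)
    def residuals(x):
        frac, t_hot = x
        return [pixel_radiance(LAM_MIR, frac, t_hot) / obs[0] - 1.0,
                pixel_radiance(LAM_TIR, frac, t_hot) / obs[1] - 1.0]

    frac_est, temp_est = fsolve(residuals, x0=[0.05, 700.0])
    print(f"recovered fraction {frac_est:.3f}, temperature {temp_est:.0f} K")
    ```

    With noise added to the observations, the recovered fraction degrades quickly at small hot fractions, which is the resolution limit the experiment quantifies.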

  5. Risk cross sections and their application to risk estimation in the galactic cosmic-ray environment

    NASA Technical Reports Server (NTRS)

    Curtis, S. B.; Nealy, J. E.; Wilson, J. W.; Chatterjee, A. (Principal Investigator)

    1995-01-01

    Radiation risk cross sections (i.e. risks per particle fluence) are discussed in the context of estimating the risk of radiation-induced cancer on long-term space flights from the galactic cosmic radiation outside the confines of the earth's magnetic field. Such quantities are useful for handling effects not seen after low-LET radiation. Since appropriate cross-section functions for cancer induction for each particle species are not yet available, the conventional quality factor is used as an approximation to obtain numerical results for risks of excess cancer mortality. Risks are obtained for seven of the most radiosensitive organs as determined by the ICRP [stomach, colon, lung, bone marrow (BFO), bladder, esophagus and breast], beneath 10 g/cm2 aluminum shielding at solar minimum. Spectra are obtained for excess relative risk for each cancer per LET interval by calculating the average fluence-LET spectrum for the organ and converting to risk by multiplying by a factor proportional to Rγ L Q(L) before integrating over L, the unrestricted LET. Here Rγ is the risk coefficient for low-LET radiation (excess relative mortality per Sv) for the particular organ in question. The total risks of excess cancer mortality obtained are 1.3 and 1.1% to female and male crew, respectively, for a 1-year exposure at solar minimum. Uncertainties in these values are estimated to range between factors of 4 and 15 and are dominated by the biological uncertainties in the risk coefficients for low-LET radiation and in the LET (or energy) dependence of the risk cross sections (as approximated by the quality factor). The direct substitution of appropriate risk cross sections will eventually circumvent entirely the need to calculate, measure or use absorbed dose, equivalent dose and quality factor for such a high-energy charged-particle environment.

  6. Estimation of value at risk and conditional value at risk using normal mixture distributions model

    NASA Astrophysics Data System (ADS)

    Kamaruzzaman, Zetty Ain; Isa, Zaidi

    2013-04-01

    The normal mixture distributions model has been successfully applied in financial time series analysis. In this paper, we estimate the return distribution, value at risk (VaR) and conditional value at risk (CVaR) for monthly and weekly rates of return of the FTSE Bursa Malaysia Kuala Lumpur Composite Index (FBMKLCI) from July 1990 until July 2010 using the two-component univariate normal mixture distributions model. First, we present the application of the normal mixture distributions model in empirical finance, fitting it to our real data. Second, we present its application in risk analysis, using it to evaluate value at risk (VaR) and conditional value at risk (CVaR), with model validation for both risk measures. The empirical results provide evidence that the two-component normal mixture distributions model fits the data well and performs better in estimating VaR and CVaR, capturing the stylized facts of non-normality and leptokurtosis in the returns distribution.
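    A minimal sketch of VaR and CVaR under a two-component normal mixture: VaR is the alpha-quantile of the mixture (found by root-finding on the mixture CDF), and CVaR follows from the closed-form partial expectation of each normal component. The weights, means and standard deviations below are illustrative, not the fitted FBMKLCI parameters.

    ```python
    import numpy as np
    from scipy.stats import norm
    from scipy.optimize import brentq

    # Hypothetical two-component mixture of monthly returns:
    # a "calm" regime and a fat-tailed "turbulent" regime
    w  = np.array([0.8, 0.2])        # component weights
    mu = np.array([0.010, -0.020])   # component means
    sd = np.array([0.030, 0.080])    # component standard deviations

    def mix_cdf(x):
        return np.sum(w * norm.cdf((x - mu) / sd))

    def var_level(alpha):
        # Value at risk: the alpha-quantile of the return distribution
        return brentq(lambda x: mix_cdf(x) - alpha, -1.0, 1.0)

    def cvar_level(alpha):
        # Conditional VaR: E[X | X <= VaR_alpha], using the partial
        # expectation mu*Phi(z) - sd*phi(z) of each normal component
        q = var_level(alpha)
        z = (q - mu) / sd
        return np.sum(w * (mu * norm.cdf(z) - sd * norm.pdf(z))) / alpha

    alpha = 0.05
    print(f"5% VaR : {var_level(alpha):+.4f}")
    print(f"5% CVaR: {cvar_level(alpha):+.4f}")
    ```

    The heavy-tailed second component pulls CVaR well below VaR, which is the leptokurtosis effect the abstract highlights.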

  7. Seismic Risk Assessment and Loss Estimation for Tbilisi City

    NASA Astrophysics Data System (ADS)

    Tsereteli, Nino; Alania, Victor; Varazanashvili, Otar; Gugeshashvili, Tengiz; Arabidze, Vakhtang; Arevadze, Nika; Tsereteli, Emili; Gaphrindashvili, Giorgi; Gventcadze, Alexander; Goguadze, Nino; Vephkhvadze, Sophio

    2013-04-01

    The proper assessment of seismic risk is of crucial importance for the protection of society and the sustainable economic development of cities, as it is an essential part of seismic hazard reduction. Estimating seismic risk and losses is a complicated task: there is always a deficiency of knowledge about the real seismic hazard, local site effects, the inventory of elements at risk and infrastructure vulnerability, especially in developing countries. Lately, great efforts were made in the frame of the EMME (Earthquake Model for the Middle East Region) project, where work packages WP1, WP2, WP3 and WP4 addressed gaps related to seismic hazard assessment and vulnerability analysis. Finally, in the frame of work package WP5, "City Scenario", additional work in this direction and a detailed investigation of local site conditions and of the active fault (3D) beneath Tbilisi were carried out. For estimating economic losses, an algorithm was prepared taking into account the obtained inventory. The long-term usage of buildings is complex: it relates to their reliability and durability, and is described by the concept of depreciation. Depreciation of an entire building is calculated by summing the products of individual construction units' depreciation rates and the corresponding value of these units within the building. This method of calculation is based on the assumption that depreciation is proportional to the building's (construction's) useful life. We used this methodology to create a matrix, which provides a way to evaluate the depreciation rates of buildings of different types and construction periods and to determine their corresponding value. Finally, the loss resulting from shaking with 10%, 5% and 2% exceedance probability in 50 years was estimated. The loss resulting from a scenario earthquake (an earthquake with the maximum possible magnitude) was also estimated.
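    The depreciation rule described above, a value-weighted sum of unit depreciation rates, can be sketched in a few lines. The unit names, rates and value shares are hypothetical, not the EMME project's matrices.

    ```python
    # Hypothetical construction units of one building:
    # (name, depreciation rate, share of the unit's value within the building)
    units = [
        ("foundation", 0.25, 0.15),
        ("load walls", 0.30, 0.40),
        ("roof",       0.50, 0.15),
        ("floors",     0.40, 0.20),
        ("finishes",   0.70, 0.10),
    ]

    # Building depreciation = sum over units of (rate * value share)
    building_depreciation = sum(rate * share for _, rate, share in units)
    print(f"overall depreciation: {building_depreciation:.1%}")
    ```

    Applied per building type and construction period, this weighted sum is what populates the depreciation matrix the abstract describes.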

  8. Risk estimation for fast neutrons with regard to solid cancer.

    PubMed

    Kellerer, A M; Walsh, L

    2001-12-01

    In the absence of epidemiological information on the effects of neutrons, their cancer mortality risk coefficient is currently taken as the product of two low-dose extrapolations: the nominal risk coefficient for photons and the presumed maximum relative biological effectiveness of neutrons. This approach is unnecessary. Since linearity in dose is assumed for neutrons at low to moderate effect levels, the risk coefficient can be derived in terms of the excess risk from epidemiological observations at an intermediate dose of gamma rays and an assumed value, R(1), of the neutron RBE relative to this reference dose of gamma rays. Application of this procedure to the A-bomb data requires accounting for the effect of the neutron dose component, which, according to the current dosimetry system, DS86, amounts on average to 11 mGy in the two cities at a total dose of 1 Gy. With R(1) tentatively set to 20 or 50, it is concluded that the neutrons have caused 18% or 35%, respectively, of the total effect at 1 Gy. The excess relative risk (ERR) for neutrons then lies between 8 per Gy and 16 per Gy. Translating these values into risk coefficients in terms of the effective dose, E, requires accounting for the gamma-ray component produced by the neutron field in the human body, which will require a separate analysis. The risk estimate for neutrons will remain essentially unaffected by the current reassessment of the neutron doses in Hiroshima, because the doses are unlikely to change much at the reference dose of 1 Gy. PMID:11741494
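    The attribution argument can be checked with a back-of-the-envelope calculation, assuming the neutron effect scales as R(1) times the neutron dose relative to a gamma effect proportional to the gamma dose. This simplification ignores city-specific doses but approximately reproduces the quoted fractions.

    ```python
    # Mean doses at the 1 Gy reference level, per DS86 (from the abstract)
    D_TOTAL = 1.0        # Gy, total dose
    D_N = 0.011          # Gy, mean neutron component
    D_GAMMA = D_TOTAL - D_N

    def neutron_effect_fraction(r1):
        # Fraction of the total effect at 1 Gy attributable to neutrons,
        # assuming effect ~ r1 * D_n for neutrons and ~ D_gamma for gammas
        return r1 * D_N / (r1 * D_N + D_GAMMA)

    for r1 in (20, 50):
        f = neutron_effect_fraction(r1)
        print(f"R(1) = {r1}: neutrons cause ~{f:.0%} of the effect at 1 Gy")
    ```

    With R(1) = 20 the neutron share comes out near 18%, and with R(1) = 50 near the abstract's 35%, matching the stated range.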

  9. Leukemia risk associated with benzene exposure in the pliofilm cohort. II. Risk estimates.

    PubMed

    Paxton, M B; Chinchilli, V M; Brett, S M; Rodricks, J V

    1994-04-01

    The detailed work histories of the individual workers composing the Pliofilm cohort represent a unique resource for estimating the dose-response for leukemia that may follow occupational exposure to benzene. In this paper, we report the results of analyzing the updated Pliofilm cohort using the proportional hazards model, a more sophisticated technique that uses more of the available exposure data than the conditional logistic model used by Rinsky et al. The more rigorously defined exposure estimates derived by Paustenbach et al. are consistent with those of Crump and Allen in giving estimates of the slope of the leukemogenic dose-response that are not as steep as the slope resulting from the exposure estimates of Rinsky et al. We consider estimates of 0.3-0.5 additional leukemia deaths per thousand workers with 45 ppm-years of cumulative benzene exposure to be the best estimates currently available of leukemia risk from occupational exposure to benzene. These risks were estimated in the proportional hazards model when the exposure estimates of Crump and Allen or of Paustenbach et al. were used to derive a cumulative concentration-by-time metric. PMID:8008924

  10. Cancer Risk Estimates from Space Flight Estimated Using Yields of Chromosome Damage in Astronaut's Blood Lymphocytes

    NASA Technical Reports Server (NTRS)

    George, Kerry A.; Rhone, J.; Chappell, L. J.; Cucinotta, F. A.

    2011-01-01

    To date, cytogenetic damage has been assessed in blood lymphocytes from more than 30 astronauts before and after they participated in long-duration space missions of three months or more on board the International Space Station. Chromosome damage was assessed using fluorescence in situ hybridization whole chromosome analysis techniques. For all individuals, the frequency of chromosome damage measured within a month of return from space was higher than their preflight yield, and biodosimetry estimates were within the range expected from physical dosimetry. Follow up analyses have been performed on most of the astronauts at intervals ranging from around 6 months to many years after flight, and the cytogenetic effects of repeat long-duration missions have so far been assessed in four individuals. Chromosomal aberrations in peripheral blood lymphocytes have been validated as biomarkers of cancer risk and cytogenetic damage can therefore be used to characterize excess health risk incurred by individual crewmembers after their respective missions. Traditional risk assessment models are based on epidemiological data obtained on Earth in cohorts exposed predominantly to acute doses of gamma-rays, and the extrapolation to the space environment is highly problematic, involving very large uncertainties. Cytogenetic damage could play a key role in reducing uncertainty in risk estimation because it is incurred directly in the space environment, using specimens from the astronauts themselves. Relative cancer risks were estimated from the biodosimetry data using the quantitative approach derived from the European Study Group on Cytogenetic Biomarkers and Health database. Astronauts were categorized into low, medium, or high tertiles according to their yield of chromosome damage. Age adjusted tertile rankings were used to estimate cancer risk and results were compared with values obtained using traditional modeling approaches. Individual tertile rankings increased after space

  11. Estimating Worker Risk Levels Using Accident/Incident Data

    SciTech Connect

    Kenoyer, Judson L.; Stenner, Robert D.; Andrews, William B.; Scherpelz, Robert I.; Aaberg, Rosanne L.

    2000-09-26

    The purpose of the work described in this report was to identify methods that are currently being used in the Department of Energy (DOE) complex to identify and control hazards/risks in the workplace, to evaluate them in terms of their effectiveness in reducing risk to the workers, and to develop a preliminary method that could be used to predict the relative risks to workers performing proposed tasks using some of the current methodology. This report describes some of the performance indicators (i.e., safety metrics) that are currently being used to track relative levels of workplace safety in the DOE complex, how these fit into an Integrated Safety Management (ISM) system, some strengths and weaknesses of using a statistically based set of indicators, and methods to evaluate them. Also discussed are methods used to reduce risk to the workers and some of the techniques that appear to be working in the process of establishing a condition of continuous improvement. The results of these methods will be used in future work to determine modifying factors for a more complex model. The preliminary method for predicting the relative risk level to workers during an extended future time period is based on a currently used performance indicator drawing on several factors tracked in the CAIRS. The relative risks for workers in a sample (but real) facility on the Hanford site are estimated for a time period of twenty years and are based on workforce predictions. This is the first step in developing a more complex model that will incorporate other modifying factors related to the workers, the work environment and the status of the ISM system to adjust the preliminary prediction.

  12. Estimation of Hail Risk in the UK and Europe

    NASA Astrophysics Data System (ADS)

    Robinson, Eric; Parker, Melanie; Higgs, Stephanie

    2016-04-01

    Observations of hail events in Europe, and in the UK especially, are relatively limited. To determine hail risk it is therefore necessary to use information beyond the historical record alone. One such methodology is to leverage reanalysis data, in this case ERA-Interim, along with a numerical model (WRF) to recreate the past state of the atmosphere. Relevant atmospheric properties can be extracted and used in a regression model to determine the hail probability for each day contained within the reanalyses. The results presented here are based on a regression model using convective available potential energy, deep-layer shear and weather type. Combined, these parameters represent the probability of severe thunderstorm, and in turn hail, activity. Once the probability of hail occurring on each day is determined, it can be used as the basis of a stochastic catalogue for the estimation of hail risk.
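    A daily hail-probability regression of the kind described might look like the following logistic sketch. The coefficients and predictor scales are hypothetical, not the fitted model from the study.

    ```python
    import numpy as np

    def hail_probability(cape, shear, coef=(-6.0, 0.0015, 0.12)):
        # Logistic regression: P(hail) = sigmoid(b0 + b1*CAPE + b2*shear)
        # CAPE in J/kg, deep-layer shear in m/s; coefficients are illustrative
        b0, b1, b2 = coef
        z = b0 + b1 * cape + b2 * shear
        return 1.0 / (1.0 + np.exp(-z))

    # A quiescent day versus a severe-weather setup
    print(f"calm day:   {hail_probability(cape=100, shear=5):.3f}")
    print(f"severe day: {hail_probability(cape=2500, shear=20):.3f}")
    ```

    Evaluating such a model on every reanalysis day yields the daily probability series from which a stochastic event catalogue can be sampled.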

  13. Global Building Inventory for Earthquake Loss Estimation and Risk Management

    USGS Publications Warehouse

    Jaiswal, Kishor; Wald, David; Porter, Keith

    2010-01-01

    We develop a global database of building inventories using a taxonomy of global building types for use in near-real-time post-earthquake loss estimation and pre-earthquake risk analysis, for the U.S. Geological Survey’s Prompt Assessment of Global Earthquakes for Response (PAGER) program. The database is available for public use, subject to peer review, scrutiny, and open enhancement. On a country-by-country level, it contains estimates of the distribution of building types categorized by material, lateral force resisting system, and occupancy type (residential or nonresidential, urban or rural). The database draws on and harmonizes numerous sources: (1) UN statistics, (2) UN Habitat’s demographic and health survey (DHS) database, (3) national housing censuses, (4) the World Housing Encyclopedia and (5) other literature.

  14. Estimation of risks associated with paediatric cochlear implantation.

    PubMed

    Johnston, J Cyne; Smith, Andrée Durieux; Fitzpatrick, Elizabeth; O'Connor, Annette; Angus, Douglas; Benzies, Karen; Schramm, David

    2010-09-01

    The objectives of this study were to estimate the rates of complications associated with paediatric cochlear implantation: a) at one Canadian cochlear implant (CI) centre, and b) in the published literature. The study comprised a retrospective hospital-based chart review and a concurrent review of complications in the published literature. There were 224 children who underwent surgery from 1994 to June 2007. The results indicate that the rates of complications at the local Canadian paediatric CI centre are not significantly different from the rates in the literature for all examined complication types. This hospital-based retrospective chart review and review of the literature provide readers with an estimation of the risks to aid evidence-based decision-making surrounding paediatric cochlear implantation. PMID:19655302

  15. Health risks in wastewater irrigation: comparing estimates from quantitative microbial risk analyses and epidemiological studies.

    PubMed

    Mara, D D; Sleigh, P A; Blumenthal, U J; Carr, R M

    2007-03-01

    The combination of standard quantitative microbial risk analysis (QMRA) techniques and 10,000-trial Monte Carlo risk simulations was used to estimate the human health risks associated with the use of wastewater for unrestricted and restricted crop irrigation. A risk of rotavirus infection of 10^-2 per person per year (pppy) was used as the reference level of acceptable risk. Using the model scenario of involuntary soil ingestion for restricted irrigation, the risk of rotavirus infection is approximately 10^-2 pppy when the wastewater contains ≤10^6 Escherichia coli per 100 ml and when local agricultural practices are highly mechanised. For labour-intensive agriculture the risk of rotavirus infection is approximately 10^-2 pppy when the wastewater contains ≤10^5 E. coli per 100 ml; however, the wastewater quality should be ≤10^4 E. coli per 100 ml when children under 15 are exposed. With the model scenario of lettuce consumption for unrestricted irrigation, the use of wastewaters containing ≤10^4 E. coli per 100 ml results in a rotavirus infection risk of approximately 10^-2 pppy; however, again based on epidemiological evidence from Mexico, the current WHO guideline level of ≤1,000 E. coli per 100 ml should be retained for root crops eaten raw. PMID:17402278
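    A minimal Monte Carlo QMRA sketch in the spirit described: a beta-Poisson dose-response for rotavirus (commonly cited parameter values, treated here as illustrative) driven by sampled exposures. The soil-ingestion range, soil contamination levels, pathogen-to-indicator ratio and exposure frequency are all assumptions, not the paper's scenario values.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    N_TRIALS = 10_000

    # Beta-Poisson dose-response for rotavirus (commonly cited parameters)
    ALPHA, N50 = 0.253, 6.17
    def p_infection(dose):
        return 1.0 - (1.0 + (dose / N50) * (2**(1 / ALPHA) - 1.0))**(-ALPHA)

    # Hypothetical restricted-irrigation exposure: involuntary soil ingestion
    soil_mg = rng.uniform(1, 100, N_TRIALS)            # mg soil ingested/day
    ecoli_per_g = 10**rng.uniform(3, 5, N_TRIALS)      # soil contamination
    VIRUS_PER_ECOLI = 1e-7                             # assumed pathogen ratio
    EXPOSURE_DAYS = 150                                # assumed days/year

    dose_per_day = (soil_mg * 1e-3) * ecoli_per_g * VIRUS_PER_ECOLI
    annual_risk = 1.0 - (1.0 - p_infection(dose_per_day))**EXPOSURE_DAYS

    print(f"median annual infection risk: {np.median(annual_risk):.2e}")
    print(f"95th percentile:              {np.percentile(annual_risk, 95):.2e}")
    ```

    The spread of the 10,000 per-trial annual risks is what gets compared against the 10^-2 pppy acceptability benchmark.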

  16. Survivorship models for estimating the risk of decompression sickness.

    PubMed

    Kumar, K V; Powell, M R

    1994-07-01

    Several approaches have been used for modeling the incidence of decompression sickness (DCS), such as Hill's dose-response and logistic regression. Most of these methods do not include time-to-onset information in the model. Survival analysis (failure time analysis) is appropriate when the time to onset of an event is of interest. The applicability of survival analysis for modeling the risk of DCS is illustrated using data obtained from hypobaric chamber exposures simulating extravehicular activities (n = 426). Univariate analyses of incidence-free survival proportions were performed for Doppler-detectable circulating microbubbles (CMB), symptoms of DCS and test aborts. A log-linear failure time regression model with the 360-min half-time tissue ratio (TR) as covariate was constructed, and estimated probabilities for various TR values were calculated. Further regression analysis including CMB status in this model showed a significant improvement (p < 0.05) in the estimation of DCS over the previous model. Since DCS depends on the exposure pressure as well as the duration of exposure, we recommend the use of survival analysis for modeling the risk of DCS. PMID:7945136
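    Incidence-free survival proportions of the kind analysed above are typically obtained with the Kaplan-Meier product-limit estimator, sketched here on a small hypothetical dataset (not the n = 426 chamber data); subjects who complete the exposure symptom-free are treated as censored.

    ```python
    import numpy as np

    def kaplan_meier(times, events):
        # Product-limit estimate of the incidence-free survival function S(t).
        # times: time to DCS onset or censoring; events: 1 = DCS, 0 = censored
        times, events = np.asarray(times), np.asarray(events)
        n_at_risk = len(times)
        surv = 1.0
        t_out, s_out = [], []
        for t in np.unique(times):                     # ascending unique times
            d = np.sum((times == t) & (events == 1))   # DCS cases at time t
            if d > 0:
                surv *= 1.0 - d / n_at_risk
                t_out.append(t)
                s_out.append(surv)
            n_at_risk -= np.sum(times == t)            # drop cases and censored
        return np.array(t_out), np.array(s_out)

    # Hypothetical data: minutes to symptom onset during a hypobaric exposure
    times  = [45, 60, 60, 90, 120, 150, 180, 240, 240, 240]
    events = [ 1,  1,  0,  1,   1,   0,   1,   0,   0,   0]
    t, s = kaplan_meier(times, events)
    for ti, si in zip(t, s):
        print(f"t = {ti:3d} min: S(t) = {si:.3f}")
    ```

    A parametric log-linear failure time regression, as used in the paper, then relates such survival curves to covariates like the tissue ratio.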

  17. Estimating Accurate Relative Spacecraft Angular Position from DSN VLBI Phases Using X-Band Telemetry or DOR Tones

    NASA Technical Reports Server (NTRS)

    Bagri, Durgadas S.; Majid, Walid

    2009-01-01

    At present, spacecraft angular position with the Deep Space Network (DSN) is determined using group delay estimates from very long baseline interferometer (VLBI) phase measurements employing differential one-way ranging (DOR) tones. As an alternative to this approach, we propose estimating the position of a spacecraft to half-a-fringe-cycle accuracy from the time variations between measured and calculated phases on DSN VLBI baseline(s) as the Earth rotates. Combining the fringe location of the target with the phase allows high accuracy for the spacecraft angular position estimate. This can be achieved using telemetry signals with a data rate of at least 4-8 MSamples/sec, or DOR tones.

  18. Time-to-Compromise Model for Cyber Risk Reduction Estimation

    SciTech Connect

    Miles A. McQueen; Wayne F. Boyer; Mark A. Flynn; George A. Beitel

    2005-09-01

    We propose a new model for estimating the time to compromise a system component that is visible to an attacker. The model provides an estimate of the expected value of the time-to-compromise as a function of known and visible vulnerabilities, and attacker skill level. The time-to-compromise random process model is a composite of three subprocesses associated with attacker actions aimed at the exploitation of vulnerabilities. In a case study, the model was used to aid in a risk reduction estimate between a baseline Supervisory Control and Data Acquisition (SCADA) system and the baseline system enhanced through a specific set of control system security remedial actions. For our case study, the total number of system vulnerabilities was reduced by 86% but the dominant attack path was through a component where the number of vulnerabilities was reduced by only 42% and the time-to-compromise of that component was increased by only 13% to 30% depending on attacker skill level.

  19. Quaternion-Based Unscented Kalman Filter for Accurate Indoor Heading Estimation Using Wearable Multi-Sensor System

    PubMed Central

    Yuan, Xuebing; Yu, Shuai; Zhang, Shengzhi; Wang, Guoping; Liu, Sheng

    2015-01-01

    Inertial navigation based on micro-electromechanical system (MEMS) inertial measurement units (IMUs) has attracted numerous researchers due to its high reliability and independence. Heading estimation, one of the most important parts of inertial navigation, has been a research focus in this field. Heading estimation using magnetometers is perturbed by magnetic disturbances, such as indoor concrete structures and electronic equipment. The MEMS gyroscope is also used for heading estimation; however, its accuracy degrades over time due to drift. In this paper, a wearable multi-sensor system has been designed to obtain high-accuracy indoor heading estimation, based on a quaternion-based unscented Kalman filter (UKF) algorithm. The proposed multi-sensor system, comprising one three-axis accelerometer, three single-axis gyroscopes, one three-axis magnetometer and one microprocessor, minimizes size and cost. The wearable multi-sensor system was fixed on the waist of a pedestrian and on a quadrotor unmanned aerial vehicle (UAV) for heading estimation experiments in our college building. The results show that the mean heading estimation errors are less than 10° and 5° for the multi-sensor system fixed on the waist of the pedestrian and on the quadrotor UAV, respectively, compared to the reference path. PMID:25961384
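    For intuition on why fusing a drifting gyroscope with a noisy magnetometer improves heading, the sketch below uses a deliberately simple complementary filter, a stand-in for the paper's quaternion UKF, on fully synthetic signals (bias, noise levels and gains are all assumptions).

    ```python
    import numpy as np

    def fuse_heading(gyro_rate, mag_heading, dt=0.01, alpha=0.98, h0=0.0):
        # Complementary filter: integrate the gyro for short-term smoothness,
        # pull toward the magnetometer to cancel long-term gyro drift.
        h = h0
        out = []
        for w, m in zip(gyro_rate, mag_heading):
            h_gyro = h + w * dt                          # dead-reckoned heading
            # wrap the magnetometer correction into (-180, 180] degrees
            err = (m - h_gyro + 180.0) % 360.0 - 180.0
            h = h_gyro + (1.0 - alpha) * err
            out.append(h % 360.0)
        return np.array(out)

    # Synthetic scenario: true heading fixed at 90 deg, gyro with a constant
    # +2 deg/s bias, magnetometer noisy but unbiased
    rng = np.random.default_rng(3)
    n = 5000
    gyro = np.full(n, 2.0)                        # pure bias, deg/s
    mag = 90.0 + rng.normal(0.0, 8.0, n)          # noisy heading, deg
    est = fuse_heading(gyro, mag, h0=90.0)
    print(f"final heading estimate: {est[-1]:.1f} deg (truth 90.0)")
    ```

    The gyro alone would drift by 100 deg over this run, while the fused estimate stays near truth with far less jitter than the raw magnetometer; a quaternion UKF achieves the same trade-off in full 3D attitude with principled noise modelling.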

  1. Developing accurate survey methods for estimating population sizes and trends of the critically endangered Nihoa Millerbird and Nihoa Finch.

    USGS Publications Warehouse

    Gorresen, P. Marcos; Camp, Richard J.; Brinck, Kevin W.; Farmer, Chris

    2012-01-01

    Point-transect surveys indicated that millerbirds were more abundant than shown by the strip-transect method, and were estimated at 802 birds in 2010 (95%CI = 652 – 964) and 704 birds in 2011 (95%CI = 579 – 837). Point-transect surveys yielded population estimates with improved precision, which will permit trends to be detected in shorter time periods and with greater statistical power than is available from strip-transect survey methods. Mean finch population estimates and associated uncertainty were not markedly different among the three survey methods, but the performance of the models used to estimate density and population size is expected to improve as the data from additional surveys are incorporated. Using the point-transect survey, the mean finch population size was estimated at 2,917 birds in 2010 (95%CI = 2,037 – 3,965) and 2,461 birds in 2011 (95%CI = 1,682 – 3,348). Preliminary testing of the line-transect method in 2011 showed that it would not generate sufficient detections to effectively model bird density and, consequently, to produce relatively precise population size estimates. Both species were fairly evenly distributed across Nihoa and appear to occur in all or nearly all available habitat. The time expended and area traversed by observers were similar among survey methods; however, point-transect surveys do not require that observers walk a straight transect line, thereby allowing them to avoid culturally or biologically sensitive areas and minimize the adverse effects of recurrent travel to any particular area. In general, point-transect surveys detect more birds than strip-survey methods, thereby improving precision and the resulting population size and trend estimation. The method is also better suited for the steep and uneven terrain of Nihoa

  2. GGOS and the EOP - the key role of SLR for a stable estimation of highly accurate Earth orientation parameters

    NASA Astrophysics Data System (ADS)

    Bloßfeld, Mathis; Panzetta, Francesca; Müller, Horst; Gerstl, Michael

    2016-04-01

    The GGOS vision is to integrate geometric and gravimetric observation techniques to estimate consistent geodetic-geophysical parameters. In order to reach this goal, the common estimation of station coordinates, Stokes coefficients and Earth Orientation Parameters (EOP) is necessary. Satellite Laser Ranging (SLR) provides the ability to study correlations between the different parameter groups, since the observed satellite orbit dynamics are sensitive to the above-mentioned geodetic parameters. To decrease the correlations, SLR observations to multiple satellites have to be combined. In this paper, we compare the estimated EOP of (i) single-satellite SLR solutions and (ii) multi-satellite SLR solutions. To this end, we jointly estimate station coordinates, EOP, Stokes coefficients and orbit parameters using different satellite constellations. A special focus of this investigation is the de-correlation of different geodetic parameter groups through the combination of SLR observations. Besides SLR observations to spherical satellites (commonly used), we discuss the impact of SLR observations to non-spherical satellites such as the JASON-2 satellite. The goal of this study is to discuss the existing parameter interactions and to present a strategy for obtaining reliable estimates of station coordinates, EOP, orbit parameters and Stokes coefficients in one common adjustment. Thereby, the benefits of a multi-satellite SLR solution are evaluated.

  3. Assignment of Calibration Information to Deeper Phylogenetic Nodes is More Effective in Obtaining Precise and Accurate Divergence Time Estimates.

    PubMed

    Mello, Beatriz; Schrago, Carlos G

    2014-01-01

    Divergence time estimation has become an essential tool for understanding macroevolutionary events. Molecular dating aims to obtain reliable inferences, which, within a statistical framework, means jointly increasing the accuracy and precision of estimates. Bayesian dating methods exhibit the property of a linear relationship between uncertainty and estimated divergence dates. This relationship occurs even if the number of sites approaches infinity and places a limit on the maximum precision of node ages. However, how the placement of calibration information may affect the precision of divergence time estimates remains an open question. In this study, relying on simulated and empirical data, we investigated how the location of calibration within a phylogeny affects the accuracy and precision of time estimates. We found that calibration priors set at median and deep phylogenetic nodes were associated with higher precision values compared to analyses involving calibration at the shallowest node. The results were independent of tree symmetry. An empirical mammalian dataset produced results that were consistent with those generated by the simulated sequences. Assigning time information to the deeper nodes of a tree is crucial to guarantee the accuracy and precision of divergence times. This finding highlights the importance of the appropriate choice of outgroups in molecular dating. PMID:24855333

  4. 41 CFR 102-80.50 - Are Federal agencies responsible for identifying/estimating risks and for appropriate risk...

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... (Continued) FEDERAL MANAGEMENT REGULATION REAL PROPERTY 80-SAFETY AND ENVIRONMENTAL MANAGEMENT Safety and Environmental Management Risks and Risk Reduction Strategies § 102-80.50 Are Federal agencies responsible for... identify and estimate safety and environmental management risks and appropriate risk reduction...

  5. 41 CFR 102-80.50 - Are Federal agencies responsible for identifying/estimating risks and for appropriate risk...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... (Continued) FEDERAL MANAGEMENT REGULATION REAL PROPERTY 80-SAFETY AND ENVIRONMENTAL MANAGEMENT Safety and Environmental Management Risks and Risk Reduction Strategies § 102-80.50 Are Federal agencies responsible for... identify and estimate safety and environmental management risks and appropriate risk reduction...

  6. Declining bioavailability and inappropriate estimation of risk of persistent compounds

    SciTech Connect

    Kelsey, J.W.; Alexander, M.

    1997-03-01

    Earthworms (Eisenia foetida) assimilated decreasing amounts of atrazine, phenanthrene, and naphthalene that had been incubated for increasing periods of time in sterile soil. The amount of atrazine and phenanthrene removed from soil by mild extractants also decreased with time. The declines in bioavailability of the three compounds to earthworms and of naphthalene to bacteria were not reflected by analysis involving vigorous methods of solvent extraction; similar results for bioavailability of phenanthrene and 4-nitrophenol to bacteria were obtained in a previous study conducted at this laboratory. The authors suggest that regulations based on vigorous extractions for the analyses of persistent organic pollutants in soil do not appropriately estimate exposure or risk to susceptible populations.

  7. How Accurate Are German Work-Time Data? A Comparison of Time-Diary Reports and Stylized Estimates

    ERIC Educational Resources Information Center

    Otterbach, Steffen; Sousa-Poza, Alfonso

    2010-01-01

    This study compares work-time data collected by the German Time Use Survey (GTUS) using the diary method with stylized work-time estimates from the GTUS, the German Socio-Economic Panel, and the German Microcensus. Although on average the differences between the time-diary data and the interview data are not large, our results show that significant…

  8. Estimation of Tsunami Risk for the Caribbean Coast

    NASA Astrophysics Data System (ADS)

    Zahibo, N.

    2004-05-01

    The tsunami problem for the coast of the Caribbean basin is discussed. The historical data on tsunamis in the Caribbean Sea are briefly presented. Numerical simulation of potential tsunamis in the Caribbean Sea is performed in the framework of nonlinear shallow-water theory. The tsunami wave height distribution along the Caribbean coast is computed. These results are used to estimate the far-field tsunami potential of various coastal locations in the Caribbean Sea. Five zones with low tsunami risk are selected based on prognostic computations: the bay "Golfo de Batabano" and the coast of the province "Ciego de Avila" in Cuba, the Nicaraguan coast (between Bluefields and Puerto Cabezas), the border between Mexico and Belize, and the bay "Golfo de Venezuela" in Venezuela. The analysis of historical data confirms that there have been no tsunamis in the selected zones. Wave attenuation in the Caribbean Sea is also investigated; wave amplitude decreases by an order of magnitude when the tsunami source is located up to 1000 km from the coastal location. Both factors, wave attenuation and wave height distribution, should be taken into account in the planned warning system for the Caribbean Sea. In particular, the problem of tsunami risk for the Lesser Antilles, including Guadeloupe, is discussed.

  9. Gambling disorder: estimated prevalence rates and risk factors in Macao.

    PubMed

    Wu, Anise M S; Lai, Mark H C; Tong, Kwok-Kit

    2014-12-01

    An excessive, problematic gambling pattern has been regarded as a mental disorder in the Diagnostic and Statistical Manual of Mental Disorders (DSM) for more than 3 decades (American Psychiatric Association [APA], 1980). In this study, its latest prevalence in Macao (one of very few cities with legalized gambling in China and the Far East) was estimated with 2 major changes in the diagnostic criteria suggested by the 5th edition of the DSM (APA, 2013): (a) removing the "Illegal Act" criterion, and (b) lowering the threshold for diagnosis. A random, representative sample of 1,018 Macao residents was surveyed with a phone poll design in January 2013. After the 2 changes were adopted, the present study showed that the estimated prevalence rate of gambling disorder was 2.1% of the Macao adult population. Moreover, the present findings also provided empirical support for the application of these 2 recommended changes when assessing symptoms of gambling disorder among Chinese community adults. Personal risk factors for gambling disorder, namely being male, having low education, a preference for casino gambling, as well as high materialism, were identified. PMID:25134026
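A prevalence estimate from a sample of roughly 1,000 respondents carries appreciable sampling uncertainty. As an illustration only (the interval below is computed here, not reported by the study), a Wilson score interval for the quoted 2.1% rate can be sketched as:

```python
import math

def wilson_ci(p_hat, n, z=1.96):
    """Wilson score confidence interval for a proportion (z = 1.96 for 95%)."""
    denom = 1 + z * z / n
    centre = (p_hat + z * z / (2 * n)) / denom
    half = z * math.sqrt(p_hat * (1 - p_hat) / n + z * z / (4 * n * n)) / denom
    return centre - half, centre + half

# 2.1% estimated prevalence from 1,018 respondents
lo, hi = wilson_ci(0.021, 1018)  # interval roughly 1.4% to 3.2%
```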

  10. Accurate 3D rigid-body target motion and structure estimation by using GMTI/HRR with template information

    NASA Astrophysics Data System (ADS)

    Wu, Shunguang; Hong, Lang

    2008-04-01

    A framework for simultaneously estimating the motion and structure parameters of a 3D object by using high range resolution (HRR) and ground moving target indicator (GMTI) measurements with template information is given. By decoupling the motion and structure information and employing rigid-body constraints, we have developed the kinematic and measurement equations of the problem. Since the kinematic system is unobservable using only one scan of HRR and GMTI measurements, we designed an architecture to run the motion and structure filters in parallel using multi-scan measurements. Moreover, to improve the estimation accuracy in large noise and/or false alarm environments, an interacting multi-template joint tracking (IMTJT) algorithm is proposed. Simulation results have shown that the averaged root mean square errors for both motion and structure state vectors are significantly reduced by using the template information.

  11. Dense and accurate motion and strain estimation in high resolution speckle images using an image-adaptive approach

    NASA Astrophysics Data System (ADS)

    Cofaru, Corneliu; Philips, Wilfried; Van Paepegem, Wim

    2011-09-01

    Digital image processing methods represent a viable and well-acknowledged alternative to strain gauges and interferometric techniques for determining full-field displacements and strains in materials under stress. This paper presents an image-adaptive technique for dense motion and strain estimation using high-resolution speckle images that show the analyzed material in its original and deformed states. The algorithm starts by dividing the speckle image showing the original state into irregular cells, taking into consideration both the spatial and gradient image information present. Subsequently, the Newton-Raphson digital image correlation technique is applied to calculate the corresponding motion for each cell. Adaptive spatial regularization in the form of the Geman-McClure robust spatial estimator is employed to increase the spatial consistency of the motion components of a cell with respect to the components of neighbouring cells. To obtain the final strain information, local least-squares fitting using a linear displacement model is performed on the horizontal and vertical displacement fields. To evaluate the presented image partitioning and strain estimation techniques, two numerical and two real experiments are employed. The numerical experiments simulate the deformation of a specimen with constant strain across the surface as well as small rigid-body rotations, while the real experiments consist of specimens that undergo uniaxial stress. The results indicate very good accuracy of the recovered strains as well as better rotation insensitivity compared to classical techniques.
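The final step described above, local least-squares fitting of a linear displacement model to recover strain, can be sketched as follows. This is a plain NumPy illustration with synthetic data, not the paper's implementation: fitting u(x, y) = a + b·x + c·y over a local window yields the displacement gradients, whose symmetric part gives the strain.

```python
import numpy as np

def local_gradients(x, y, u):
    """Fit u(x, y) = a + b*x + c*y by least squares over a local window;
    b and c are the displacement gradients du/dx and du/dy."""
    A = np.column_stack([np.ones_like(x), x, y])
    coeffs, *_ = np.linalg.lstsq(A, u, rcond=None)
    return coeffs[1], coeffs[2]  # du/dx, du/dy

# Synthetic horizontal displacements for 1% uniaxial strain: u = 0.01 * x
x = np.array([0.0, 1.0, 2.0, 0.0, 1.0, 2.0])
y = np.array([0.0, 0.0, 0.0, 1.0, 1.0, 1.0])
u = 0.01 * x
dudx, dudy = local_gradients(x, y, u)  # dudx is the normal strain e_xx
```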

  12. Estimated life expectancy and risk of death from cancer by quartiles in the older Japanese population: 2010 vital statistics.

    PubMed

    Iwamoto, Momoko; Nakamura, Fumiaki; Higashi, Takahiro

    2014-10-01

    Data on life expectancies and risk of death from cancer are essential information when making informed decisions about cancer screening and treatment options, but have never been presented in a way that is readily usable by physicians in Japan. We provided estimates of life expectancies and predicted risk of death for the seven most common types of cancer (lung, gastric, liver, colon, prostate, breast, and cervical) by quartiles for the older Japanese population above 50 years old, using 2010 life tables and cancer mortality statistics. We found a large difference in life expectancy between older persons in the upper and lower quartiles. Risk of death from breast cancer was low. Using these data, physicians can obtain more accurate life expectancy estimates by assessing which quartile a patient is most likely to fall into, and help patients make better-informed decisions. PMID:25113939

  13. Soil-ecological risks for soil degradation estimation

    NASA Astrophysics Data System (ADS)

    Trifonova, Tatiana; Shirkin, Leonid; Kust, German; Andreeva, Olga

    2016-04-01

    Soil degradation comprises the worsening of soil properties and quality, primarily from the point of view of productivity and the quality of ecosystem services. Complete destruction of the soil cover and/or termination of the functioning of soil forms of organic life are considered extreme stages of soil degradation; for fragile ecosystems they are normally considered within the desertification, land degradation and drought (DLDD) concept. A block model of the ecotoxic effects generating soil and ecosystem degradation has been developed as a result of long-term field and laboratory research on sod-podzolic soils contaminated with waste containing heavy metals. The model highlights soil degradation mechanisms caused by direct and indirect impacts of ecotoxicants on the phytocenosis-soil system and their combination, which frequently causes a synergistic effect. The sequence of changes can be formalized as a theory of change (a succession of interrelated events). Several stages are distinguished, from the leaching (release) of heavy metals in waste and their migration down the soil profile to decreased phytoproductivity and changes in phytocenosis composition. Decreased phytoproductivity reduces the amount of cellulose introduced into the soil. The described feedback mechanism acts as a factor of sod-podzolic soil self-purification and stability. It has been shown that, using a phytomass productivity index that integrally reflects the worsening of the complex of soil properties, it is possible to construct dose-response relationships and determine critical load levels for the phytocenosis and the corresponding soil-ecological risks. Soil-ecological risk in the phytocenosis-soil system means probable negative changes and the loss of some ecosystem functions during the transformation of dead organic matter energy into new biomass. Soil-ecological risks estimation is

  14. Accurate estimate of the critical exponent nu for self-avoiding walks via a fast implementation of the pivot algorithm.

    PubMed

    Clisby, Nathan

    2010-02-01

    We introduce a fast implementation of the pivot algorithm for self-avoiding walks, which we use to obtain large samples of walks on the cubic lattice of up to 33 × 10^6 steps. Consequently the critical exponent nu for three-dimensional self-avoiding walks is determined to great accuracy; the final estimate is nu = 0.587597(7). The method can be adapted to other models of polymers with short-range interactions, on the lattice or in the continuum. PMID:20366773
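The pivot algorithm itself is simple to state: pick a site on the walk, apply a random lattice symmetry to one arm, and accept the move if the result is still self-avoiding. The naive 2D sketch below illustrates the move; it checks self-avoidance in O(N) per pivot, whereas the fast implementation in the paper uses far more sophisticated data structures.

```python
import random

# A few lattice symmetries of Z^2 (rotations and reflections)
SYMMETRIES = [
    lambda p: (p[1], -p[0]),   # rotate 90 degrees
    lambda p: (-p[0], -p[1]),  # rotate 180 degrees
    lambda p: (-p[1], p[0]),   # rotate 270 degrees
    lambda p: (p[0], -p[1]),   # reflect in the x-axis
    lambda p: (-p[0], p[1]),   # reflect in the y-axis
]

def pivot_step(walk):
    """Attempt one pivot move on a self-avoiding walk (a list of lattice
    points). Returns the new walk if the proposal is self-avoiding,
    otherwise the unchanged walk."""
    i = random.randrange(1, len(walk) - 1)   # interior pivot site
    g = random.choice(SYMMETRIES)
    px, py = walk[i]
    new_tail = []
    for x, y in walk[i + 1:]:                # transform the arm after the pivot
        dx, dy = g((x - px, y - py))
        new_tail.append((px + dx, py + dy))
    proposal = walk[:i + 1] + new_tail
    return proposal if len(set(proposal)) == len(proposal) else walk

# Start from a straight walk and apply a number of pivot moves
walk = [(k, 0) for k in range(20)]
for _ in range(200):
    walk = pivot_step(walk)
```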

  15. Childhood CT scans and cancer risk: impact of predisposing factors for cancer on the risk estimates.

    PubMed

    Journy, N; Roué, T; Cardis, E; Le Pointe, H Ducou; Brisse, H; Chateil, J-F; Laurier, D; Bernier, M-O

    2016-03-01

    To investigate the role of cancer predisposing factors (PFs) in the associations between paediatric computed tomography (CT) scan exposures and subsequent risk of central nervous system (CNS) tumours and leukaemia. A cohort of children who underwent a CT scan in 2000-2010 in 23 French radiology departments was linked with the national childhood cancer registry and the national vital status registry; information on PFs was retrieved through hospital discharge databases. In children without PFs, hazard ratios of 1.07 (95% CI 0.99-1.10) for CNS tumours (15 cases) and 1.16 (95% CI 0.77-1.27) for leukaemia (12 cases) were estimated for each 10 mGy increment in CT x-ray organ dose. These estimates were similar to those obtained in the whole cohort. In children with PFs, no positive dose-risk association was observed, possibly related to earlier non-cancer mortality in this group. Our results suggest a modifying effect of PFs on CT-related cancer risks, but need to be confirmed by longer follow-up and other studies. PMID:26878249

  16. How accurate and precise are limited sampling strategies in estimating exposure to mycophenolic acid in people with autoimmune disease?

    PubMed

    Abd Rahman, Azrin N; Tett, Susan E; Staatz, Christine E

    2014-03-01

    Mycophenolic acid (MPA) is a potent immunosuppressant agent, which is increasingly being used in the treatment of patients with various autoimmune diseases. Dosing to achieve a specific target MPA area under the concentration-time curve from 0 to 12 h post-dose (AUC12) is likely to lead to better treatment outcomes in patients with autoimmune disease than a standard fixed-dose strategy. This review summarizes the available published data on concentration monitoring strategies for MPA in patients with autoimmune disease and examines the accuracy and precision of methods reported to date that use limited concentration-time points to estimate MPA AUC12. A total of 13 studies were identified that assessed the correlation between single time points and MPA AUC12 and/or examined the predictive performance of limited sampling strategies in estimating MPA AUC12. The majority of studies investigated mycophenolate mofetil (MMF) rather than the enteric-coated mycophenolate sodium (EC-MPS) formulation of MPA. Correlations between MPA trough concentrations and MPA AUC12 estimated by full concentration-time profiling ranged from 0.13 to 0.94 across ten studies, with the highest associations (r² = 0.90-0.94) observed in lupus nephritis patients. Correlations were generally higher in autoimmune disease patients compared with renal allograft recipients, and higher after MMF compared with EC-MPS intake. Four studies investigated use of a limited sampling strategy to predict MPA AUC12 determined by full concentration-time profiling. Three studies used a limited sampling strategy consisting of a maximum combination of three sampling time points with the latest sample drawn 3-6 h after MMF intake, whereas the remaining study tested all combinations of sampling times. MPA AUC12 was best predicted when three samples were taken at pre-dose and at 1 and 3 h post-dose, with a mean bias and imprecision of 0.8 and 22.6 % for multiple linear regression analysis and of -5.5 and 23.0 % for
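A limited sampling strategy of the kind reviewed above typically regresses full-profile AUC12 on a few concentration-time points. The sketch below fits such a multiple linear regression on synthetic data; the coefficients and concentration ranges are invented for illustration and are not the published limited-sampling equations.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "full profile" AUC12 values and three concentration samples
# (pre-dose, 1 h, and 3 h post-dose); the true relationship here is
# invented, not taken from any of the reviewed studies.
n = 40
c0 = rng.uniform(1, 4, n)    # pre-dose concentration (mg/L)
c1 = rng.uniform(5, 15, n)   # 1 h post-dose
c3 = rng.uniform(3, 10, n)   # 3 h post-dose
auc12 = 5.0 + 2.0 * c0 + 1.5 * c1 + 2.5 * c3 + rng.normal(0, 1, n)

# Fit AUC12 ~ b0 + b1*C0 + b2*C1 + b3*C3 by ordinary least squares
X = np.column_stack([np.ones(n), c0, c1, c3])
beta, *_ = np.linalg.lstsq(X, auc12, rcond=None)

# Predictive performance: mean percentage error (bias) of the equation
predicted = X @ beta
bias = np.mean((predicted - auc12) / auc12) * 100
```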

  17. Potential risk of using General Estimates System: bicycle safety.

    PubMed

    Kweon, Young-Jun; Lee, Joyoung

    2010-11-01

    Beneficial effects of bicycle helmet use have been reported, mostly based on medical or survey data collected from hospitals. This study examined the validity of the United States General Estimates System (GES) database, familiar to many transportation professionals, for assessing the beneficial effect of helmet use in reducing the severity of injury to bicyclists, and found a potential risk of erroneous conclusions being drawn from narrowly focused studies when the GES database is used. Although the focus of the study was on bicycle helmet use, its findings regarding potential risk may hold for any type of traffic safety study using GES data. A partial proportional odds model reflecting the intrinsic ordering of injury severity was used. About 16,000 bicycle-involved traffic crash records from 2003 through 2008 in the United States were extracted from the GES database. Using the 2003-2008 GES data, a beneficial effect of helmet use was found in 2007, yet a detrimental effect in 2004 and no effect in 2003, 2005, 2006, and 2008, contrary to past findings from medical or hospital survey data. It was speculated that these mixed results might be attributable to a possible lack of representativeness of the GES data for bicycle-involved traffic crashes, which may be supported by findings such as the average helmet use rate at the time of the crashes varying from 12% in 2004 to 38% in 2008. This suggests that the GES data may not be a reliable source for studying narrowly focused issues such as the effect of helmet use. A considerable fluctuation over the years in basic statistical values (e.g., averages) of variables of interest (e.g., helmet use) may indicate a possible lack of representativeness of the GES data. In such cases, caution should be exercised in interpreting and generalizing analysis results. PMID:20728621

  18. Estimating urban flood risk - uncertainty in design criteria

    NASA Astrophysics Data System (ADS)

    Newby, M.; Franks, S. W.; White, C. J.

    2015-06-01

    The design of urban stormwater infrastructure is generally performed assuming that climate is static. For engineering practitioners, stormwater infrastructure is designed using a peak flow method, such as the Rational Method outlined in the Australian Rainfall and Runoff (AR&R) guidelines, together with estimates of design rainfall intensities. Changes to Australian rainfall intensity design criteria have been made through updated releases of the AR&R77, AR&R87 and the recent 2013 AR&R Intensity Frequency Distributions (IFDs). The primary focus of this study is to compare the three IFD sets from 51 locations Australia-wide. Since the release of the AR&R77 IFDs, the duration and number of locations for rainfall data have increased and techniques for data analysis have changed. Updated terminology coinciding with the 2013 IFD release has also resulted in a practical change to the design rainfall. For example, infrastructure designed for a 1:5 year ARI corresponds to an 18.13% AEP; however, for practical purposes, hydraulic guidelines have been updated with the more intuitive 20% AEP. The evaluation of design rainfall variation across Australia has indicated that the changes depend upon location, recurrence interval and rainfall duration. The changes to design rainfall IFDs are due to the application of differing data analysis techniques, the length and number of data sets, and the change in terminology from ARI to AEP. Such changes mean that existing infrastructure has been designed to a range of different design criteria, indicating the likely inadequacy of earlier developments relative to current estimates of flood risk. In many cases, the under-design of infrastructure is greater than the expected impact of increased rainfall intensity under climate change scenarios.
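The ARI-to-AEP correspondence quoted above (a 1:5 year ARI equals an 18.13% AEP) follows from the standard conversion AEP = 1 − exp(−1/ARI), which can be verified directly:

```python
import math

def ari_to_aep(ari_years):
    """Convert an Average Recurrence Interval (years) to an
    Annual Exceedance Probability: AEP = 1 - exp(-1 / ARI)."""
    return 1.0 - math.exp(-1.0 / ari_years)

aep_5yr = ari_to_aep(5)  # about 0.1813, the 18.13% AEP quoted above
```

Note the conversion is nonlinear: the 20% AEP adopted in hydraulic guidelines for practical purposes is slightly more conservative than the exact 18.13% value.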

  19. RADON EXPOSURE ASSESSMENT AND DOSIMETRY APPLIED TO EPIDEMIOLOGY AND RISK ESTIMATION

    EPA Science Inventory

    Epidemiological studies of underground miners provide the primary basis for radon risk estimates for indoor exposures as well as mine exposures. A major source of uncertainty in these risk estimates is the uncertainty in radon progeny exposure estimates for the miners. In addit...

  20. Accurate spike estimation from noisy calcium signals for ultrafast three-dimensional imaging of large neuronal populations in vivo

    PubMed Central

    Deneux, Thomas; Kaszas, Attila; Szalay, Gergely; Katona, Gergely; Lakner, Tamás; Grinvald, Amiram; Rózsa, Balázs; Vanzetta, Ivo

    2016-01-01

    Extracting neuronal spiking activity from large-scale two-photon recordings remains challenging, especially in mammals in vivo, where large noises often contaminate the signals. We propose a method, MLspike, which returns the most likely spike train underlying the measured calcium fluorescence. It relies on a physiological model including baseline fluctuations and distinct nonlinearities for synthetic and genetically encoded indicators. Model parameters can be either provided by the user or estimated from the data themselves. MLspike is computationally efficient thanks to its original discretization of probability representations; moreover, it can also return spike probabilities or samples. Benchmarked on extensive simulations and real data from seven different preparations, it outperformed state-of-the-art algorithms. Combined with the finding obtained from systematic data investigation (noise level, spiking rate and so on) that photonic noise is not necessarily the main limiting factor, our method allows spike extraction from large-scale recordings, as demonstrated on acousto-optical three-dimensional recordings of over 1,000 neurons in vivo. PMID:27432255
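MLspike's model-based inference is beyond a short snippet, but the underlying idea, inverting a calcium-dynamics model to recover spikes, can be illustrated with a toy baseline: if fluorescence roughly follows a first-order autoregressive decay, subtracting the decayed previous sample and thresholding the residual exposes candidate spikes. This is a crude illustration only, not the MLspike algorithm, and the decay and threshold values are invented.

```python
import numpy as np

def naive_spike_estimate(fluorescence, decay=0.9, threshold=0.1):
    """Crude spike extraction from a calcium trace: invert the first-order
    autoregressive model c[t] = decay * c[t-1] + s[t] and threshold the
    residual. A toy baseline, not MLspike."""
    f = np.asarray(fluorescence, dtype=float)
    residual = f[1:] - decay * f[:-1]        # undo the calcium decay
    spikes = np.where(residual > threshold, 1, 0)
    return np.concatenate([[0], spikes])     # align with the input trace

# Synthetic trace: spikes at t = 10 and t = 30, exponential decay, mild noise
rng = np.random.default_rng(1)
trace = np.zeros(50)
for t in range(1, 50):
    trace[t] = 0.9 * trace[t - 1] + (1.0 if t in (10, 30) else 0.0)
trace += rng.normal(0, 0.02, 50)

est = naive_spike_estimate(trace)
```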

  1. Accurate spike estimation from noisy calcium signals for ultrafast three-dimensional imaging of large neuronal populations in vivo.

    PubMed

    Deneux, Thomas; Kaszas, Attila; Szalay, Gergely; Katona, Gergely; Lakner, Tamás; Grinvald, Amiram; Rózsa, Balázs; Vanzetta, Ivo

    2016-01-01

    Extracting neuronal spiking activity from large-scale two-photon recordings remains challenging, especially in mammals in vivo, where large noises often contaminate the signals. We propose a method, MLspike, which returns the most likely spike train underlying the measured calcium fluorescence. It relies on a physiological model including baseline fluctuations and distinct nonlinearities for synthetic and genetically encoded indicators. Model parameters can be either provided by the user or estimated from the data themselves. MLspike is computationally efficient thanks to its original discretization of probability representations; moreover, it can also return spike probabilities or samples. Benchmarked on extensive simulations and real data from seven different preparations, it outperformed state-of-the-art algorithms. Combined with the finding obtained from systematic data investigation (noise level, spiking rate and so on) that photonic noise is not necessarily the main limiting factor, our method allows spike extraction from large-scale recordings, as demonstrated on acousto-optical three-dimensional recordings of over 1,000 neurons in vivo. PMID:27432255

  2. State Estimates of Adolescent Cigarette Use and Perceptions of Risk of Smoking: 2012 and 2013

    MedlinePlus

    ... 2015 STATE ESTIMATES OF ADOLESCENT CIGARETTE USE AND PERCEPTIONS OF RISK OF SMOKING: 2012 AND 2013 AUTHORS ... with an inverse association between use and risk perceptions (i.e., the prevalence of use is lower ...

  3. Risk Estimates From an Online Risk Calculator Are More Believable and Recalled Better When Expressed as Integers

    PubMed Central

    Zikmund-Fisher, Brian J; Waters, Erika A; Gavaruzzi, Teresa; Fagerlin, Angela

    2011-01-01

    Background Online risk calculators offer different levels of precision in their risk estimates. People interpret numbers in varying ways depending on how they are presented, and we do not know how the number of decimal places displayed might influence perceptions of risk estimates. Objective The objective of our study was to determine whether precision (ie, number of decimals) in risk estimates offered by an online risk calculator influences users’ ratings of (1) how believable the estimate is, (2) risk magnitude (ie, how large or small the risk feels to them), and (3) how well they can recall the risk estimate after a brief delay. Methods We developed two mock risk calculator websites that offered hypothetical percentage estimates of participants’ lifetime risk of kidney cancer. Participants were randomly assigned to a condition where the risk estimate value rose with increasing precision (2, 2.1, 2.13, 2.133) or the risk estimate value fell with increasing precision (2, 1.9, 1.87, 1.867). Within each group, participants were randomly assigned one of the four numbers as their first risk estimate, and later received one of the remaining three as a comparison. Results Participants who completed the experiment (N = 3422) were a demographically diverse online sample, approximately representative of the US adult population on age, gender, and race. Participants whose risk estimates had no decimal places gave the highest ratings of believability (F(3,3384) = 2.94, P = .03) and the lowest ratings of risk magnitude (F(3,3384) = 4.70, P = .003). Compared to estimates with decimal places, integer estimates were judged as highly believable by 7%–10% more participants (χ²(3) = 17.8, P < .001). When comparing two risk estimates with different levels of precision, large majorities of participants reported that the numbers seemed equivalent across all measures. Both exact and approximate recall were highest for estimates with zero decimals. Odds ratios (OR) for correct

  4. The effects of spatial population dataset choice on estimates of population at risk of disease

    PubMed Central

    2011-01-01

    Background The spatial modeling of infectious disease distributions and dynamics is increasingly being undertaken for health services planning and disease control monitoring, implementation, and evaluation. Where risks are heterogeneous in space or dependent on person-to-person transmission, spatial data on human population distributions are required to estimate infectious disease risks, burdens, and dynamics. Several different modeled human population distribution datasets are available and widely used, but the disparities among them and the implications for enumerating disease burdens and populations at risk have not been considered systematically. Here, we quantify some of these effects using global estimates of populations at risk (PAR) of P. falciparum malaria as an example. Methods The recent construction of a global map of P. falciparum malaria endemicity enabled the testing of different gridded population datasets for providing estimates of PAR by endemicity class. The estimated population numbers within each class were calculated for each country using four different global gridded human population datasets: GRUMP (~1 km spatial resolution), LandScan (~1 km), UNEP Global Population Databases (~5 km), and GPW3 (~5 km). More detailed assessments of PAR variation and accuracy were conducted for three African countries where census data were available at a higher administrative-unit level than used by any of the four gridded population datasets. Results The estimates of PAR based on the datasets varied by more than 10 million people for some countries, even accounting for the fact that estimates of population totals made by different agencies are used to correct national totals in these datasets and can vary by more than 5% for many low-income countries. In many cases, these variations in PAR estimates comprised more than 10% of the total national population. The detailed country-level assessments suggested that none of the datasets was consistently more
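The core PAR computation described above, summing gridded population within each endemicity class, reduces to a masked sum over two aligned rasters. A minimal sketch with toy arrays (the real datasets differ in resolution, ~1 km vs ~5 km, and must be resampled to a common grid first):

```python
import numpy as np

def population_at_risk(population_grid, endemicity_grid, classes):
    """Sum gridded population within each endemicity class. Assumes both
    grids share the same resolution and alignment."""
    return {c: float(population_grid[endemicity_grid == c].sum())
            for c in classes}

# Toy 4x4 grids: population counts and endemicity classes 0-2
pop = np.array([[100, 50,  0, 10],
                [200, 80, 30,  0],
                [ 40, 60, 90, 20],
                [  0, 10, 70,  5]])
endem = np.array([[0, 0, 1, 1],
                  [0, 1, 1, 2],
                  [2, 2, 1, 1],
                  [0, 0, 2, 2]])
par = population_at_risk(pop, endem, classes=[0, 1, 2])
```

Swapping in a different population grid with the same endemicity surface is exactly the comparison the study performs across GRUMP, LandScan, UNEP, and GPW3.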

  5. Shorter sampling periods and accurate estimates of milk volume and components are possible for pasture based dairy herds milked with automated milking systems.

    PubMed

    Kamphuis, Claudia; Burke, Jennie K; Taukiri, Sarah; Petch, Susan-Fay; Turner, Sally-Anne

    2016-08-01

    Dairy cows grazing pasture and milked using automated milking systems (AMS) have lower milking frequencies than indoor-fed cows milked using AMS. Therefore, milk recording intervals used for herd testing indoor-fed cows may not be suitable for cows on pasture-based farms. We hypothesised that accurate standardised 24 h estimates could be determined for AMS herds with milk recording intervals shorter than the Gold Standard (48 h), but that the optimum milk recording interval would depend on the herd average for milking frequency. The Gold Standard protocol was applied on five commercial dairy farms with AMS between December 2011 and February 2013. From 12 milk recording test periods, involving 2,211 cow-test days and 8,049 cow milkings, standardised 24 h estimates for milk volume and milk composition were calculated for the Gold Standard protocol and compared with those collected during nine alternative sampling scenarios, including six shorter sampling periods and three in which a fixed number of milk samples per cow were collected. The results suggest a 48 h milk recording protocol is unnecessarily long for collecting accurate estimates during milk recording on pasture-based AMS farms. Collection of only two milk samples per cow was optimal in terms of high concordance correlation coefficients for milk volume and components and a low proportion of missed cow-test days. Further research is required to determine the effects of diurnal variation in milk composition on standardised 24 h estimates for milk volume and components before a protocol based on a fixed number of samples could be considered. Based on the results of this study, New Zealand has adopted a split protocol for herd testing based on the average milking frequency for the herd (NZ Herd Test Standard 8100:2015). PMID:27600967
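The concordance correlation coefficient used above to compare shortened protocols against the Gold Standard is Lin's CCC, which penalises both poor correlation and systematic offset. A sketch with hypothetical yields (the numbers below are invented, not the study's data):

```python
import numpy as np

def lin_ccc(x, y):
    """Lin's concordance correlation coefficient between two sets of
    estimates, e.g. shortened-protocol vs Gold Standard 24 h yields:
    CCC = 2*s_xy / (s_x^2 + s_y^2 + (mean_x - mean_y)^2)."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    sxy = np.mean((x - x.mean()) * (y - y.mean()))
    return 2 * sxy / (x.var() + y.var() + (x.mean() - y.mean()) ** 2)

# Hypothetical standardised 24 h milk-volume estimates (litres) for six cows
gold = [18.2, 22.5, 15.0, 25.1, 19.8, 21.0]
short = [18.0, 22.9, 15.3, 24.8, 20.1, 20.7]
ccc = lin_ccc(gold, short)  # close to 1 when the protocols agree
```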

  6. Cancer risk estimation caused by radiation exposure during endovascular procedure

    NASA Astrophysics Data System (ADS)

    Kang, Y. H.; Cho, J. H.; Yun, W. S.; Park, K. H.; Kim, H. G.; Kwon, S. M.

    2014-05-01

    The objective of this study was to identify the radiation exposure dose of patients and staff caused by fluoroscopy during C-arm-assisted vascular surgery, and to estimate the carcinogenic risk due to that exposure. The study was conducted in 71 patients (53 men and 18 women) who underwent vascular surgical intervention at the division of vascular surgery of the University Hospital from November 2011 to April 2012. A mobile C-arm device was used, and the radiation exposure dose of each patient (dose-area product, DAP) was calculated. The effective dose of staff participating in the surgery was measured by attaching optically stimulated luminescence dosemeters to their radiation protectors during the vascular surgical operation. The mean DAP value of patients was 308.7 Gy cm2, with a maximum of 3085 Gy cm2. Converted to effective dose, the mean was 6.2 mSv and the maximum was 61.7 millisievert (mSv). The effective dose of the operating surgeon was 3.85 mSv, while the radiation technician received 1.04 mSv and the nurse 1.31 mSv. The operator's estimated lifetime incidence of all cancers corresponds to 2355 per 100,000 persons, i.e. roughly 1 in 42. In conclusion, vascular surgeons, as the supervisors of fluoroscopy, should keep radiation protection for the patient, the staff, and all other participants in the intervention in mind, and should understand the effects of radiation themselves in order to prevent this invisible danger during interventions and to minimize harm.
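    The dose arithmetic reported here is a linear DAP-to-effective-dose conversion. The sketch below back-calculates the conversion coefficient from the abstract's own numbers (0.02 mSv per Gy cm2 is therefore an inference from the reported figures, not a published constant):

    ```python
    def effective_dose_msv(dap_gycm2, k_msv_per_gycm2=0.02):
        """Effective dose (mSv) from dose-area product (Gy*cm^2)
        using a single linear conversion coefficient."""
        return dap_gycm2 * k_msv_per_gycm2

    print(round(effective_dose_msv(308.7), 1))   # mean DAP  -> 6.2 mSv
    print(round(effective_dose_msv(3085.0), 1))  # max DAP   -> 61.7 mSv
    ```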

  7. Estimating Risk of Alcohol Dependence Using Alcohol Screening Scores*

    PubMed Central

    Rubinsky, Anna D.; Kivlahan, Daniel R.; Volk, Robert J.; Maynard, Charles; Bradley, Katharine A.

    2010-01-01

    Brief alcohol counseling interventions can reduce alcohol consumption and related morbidity among non-dependent risky drinkers, but more intensive alcohol treatment is recommended for persons with alcohol dependence. This study evaluated whether scores on common alcohol screening tests could identify patients likely to have current alcohol dependence so that more appropriate follow-up assessment and/or intervention could be offered. This cross-sectional study used secondary data from 392 male and 927 female adult family medicine outpatients (1993–1994). Likelihood ratios were used to empirically identify and evaluate ranges of scores of the AUDIT, the AUDIT-C, two single-item questions about frequency of binge drinking, and the CAGE questionnaire for detecting DSM-IV past-year alcohol dependence. Based on the prevalence of past-year alcohol dependence in this sample (men: 12.2%; women: 5.8%), zones of the AUDIT and AUDIT-C identified wide variability in the post-screening risk of alcohol dependence in men and women, even among those who screened positive for alcohol misuse. Among men, AUDIT zones 5–10, 11–14 and 15–40 were associated with post-screening probabilities of past-year alcohol dependence ranging from 18–87%, and AUDIT-C zones 5–6, 7–9 and 10–12 were associated with probabilities ranging from 22–75%. Among women, AUDIT zones 3–4, 5–8, 9–12 and 13–40 were associated with post-screening probabilities of past-year alcohol dependence ranging from 6–94%, and AUDIT-C zones 3, 4–6, 7–9 and 10–12 were associated with probabilities ranging from 9–88%. AUDIT or AUDIT-C scores could be used to estimate the probability of past-year alcohol dependence among patients who screen positive for alcohol misuse and inform clinical decision-making. PMID:20042299
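    The likelihood-ratio machinery behind these zone-specific probabilities is the odds form of Bayes' theorem; a minimal sketch (the prevalences come from the abstract, but the likelihood-ratio values below are illustrative placeholders, not the study's published figures):

    ```python
    def post_test_probability(prevalence, likelihood_ratio):
        """Convert a pre-test probability (prevalence) and a
        likelihood ratio for a screening-score zone into a
        post-test probability of alcohol dependence."""
        pre_odds = prevalence / (1 - prevalence)
        post_odds = pre_odds * likelihood_ratio
        return post_odds / (1 + post_odds)

    # men: 12.2% prevalence; women: 5.8% (from the abstract)
    print(round(post_test_probability(0.122, 1.6), 3))   # modest LR zone
    print(round(post_test_probability(0.058, 10.0), 3))  # strong LR zone
    ```

    The wide AUDIT-zone probability ranges in the abstract (e.g. 6-94% in women) correspond to zones whose likelihood ratios span well below and well above 1.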

  8. The 2006 William Feinberg lecture: shifting the paradigm from stroke to global vascular risk estimation.

    PubMed

    Sacco, Ralph L

    2007-06-01

    By the year 2010, it is estimated that 18.1 million people worldwide will die annually because of cardiovascular diseases and stroke. "Global vascular risk" more broadly includes the multiple overlapping disease silos of stroke, myocardial infarction, peripheral arterial disease, and vascular death. Estimation of global vascular risk requires consideration of a variety of variables including demographics, environmental behaviors, and risk factors. Data from multiple studies suggest continuous linear relationships between the physiological vascular risk modulators of blood pressure, lipids, and blood glucose rather than treating these conditions as categorical risk factors. Constellations of risk factors may be more relevant than individual categorical components. Exciting work with novel risk factors may also have predictive value in estimates of global vascular risk. Advances in imaging have led to the measurement of subclinical conditions such as carotid intima-media thickness and subclinical brain conditions such as white matter hyperintensities and silent infarcts. These subclinical measurements may be intermediate stages in the transition from asymptomatic to symptomatic vascular events, appear to be associated with the fundamental vascular risk factors, and represent opportunities to more precisely quantitate disease progression. The expansion of studies in molecular epidemiology and detection of genetic markers underlying vascular risks also promises to extend our precision of global vascular risk estimation. Global vascular risk estimation will require quantitative methods that bundle these multi-dimensional data into more precise estimates of future risk. The power of genetic information coupled with data on demographics, risk-inducing behaviors, vascular risk modulators, biomarkers, and measures of subclinical conditions should provide the most realistic approximation of an individual's future global vascular risk. The ultimate public health benefit

  9. Latent-failure risk estimates for computer control

    NASA Technical Reports Server (NTRS)

    Dunn, William R.; Folsom, Rolfe A.; Green, Owen R.

    1991-01-01

    It is shown that critical computer controls employing unmonitored safety circuits are unsafe. Analysis supporting this result leads to two additional, important conclusions: (1) annual maintenance checks of safety circuit function do not, as widely believed, eliminate latent failure risk; (2) safety risk remains even if multiple, series-connected protection circuits are employed. Finally, it is shown analytically that latent failure risk is eliminated when continuous monitoring is employed.
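    The conclusions above (periodic checks shrink but never eliminate latent risk; only continuous monitoring drives it toward zero) follow from the standard constant-failure-rate unavailability formula; a sketch with a hypothetical failure rate:

    ```python
    import math

    def mean_latent_unavailability(lam, test_interval_h):
        """Average probability that an unmonitored safety circuit is
        in a latent failed state, for a constant failure rate lam
        (per hour) and functional tests every test_interval_h hours.
        Exact form: 1 - (1 - exp(-lam*T)) / (lam*T); ~ lam*T/2 for
        small lam*T."""
        lt = lam * test_interval_h
        return 1 - (1 - math.exp(-lt)) / lt

    # hypothetical failure rate of 1e-5 per hour
    annual = mean_latent_unavailability(1e-5, 8760.0)  # yearly check
    hourly = mean_latent_unavailability(1e-5, 1.0)     # near-continuous
    print(f"{annual:.3e} {hourly:.3e}")
    ```

    With a yearly check the circuit spends roughly 4% of the time silently failed; shortening the test interval drives the unavailability toward zero, which is the continuous-monitoring limit.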

  10. State Estimates of Adolescent Marijuana Use and Perceptions of Risk of Harm from Marijuana Use: 2013 and 2014

    MedlinePlus

    ... 2014 estimates to 2012–2013 estimates). However, youth perceptions of great risk of harm from monthly marijuana ... change. State Estimates of Adolescent Marijuana Use and Perceptions of Risk of Harm From Marijuana Use: 2013 ...

  11. Estimation of Ten-Year Survival of Patients with Pulmonary Tuberculosis Based on the Competing Risks Model in Iran

    PubMed Central

    Kazempour-Dizaji, Mehdi; Tabarsi, Payam; Zayeri, Farid

    2016-01-01

    Background: Tuberculosis (TB) is a chronic bacterial disease, which despite the presence of effective drug strategies, still remains a serious health problem worldwide. Estimation of survival rate is an appropriate indicator for prognosis in patients with pulmonary TB. Therefore, this research was designed with the aim of accurate estimation of the survival of patients by taking both the death event and relapse into consideration. Materials and Methods: Based on a retrospective cohort study, information of 2,299 patients with pulmonary TB that had been referred to and treated in Masih Daneshvari Hospital from 2005 to 2015 was reviewed. To estimate the survival of patients with pulmonary TB, the competing risks model, which considered death and relapse as competing events, was used. In addition, the effect of factors affecting the cumulative incidence function (CIF) of death event and relapse was also examined. Results: The effect of risk factors on the CIF of death events and relapse showed that patients’ age, marital status, contact with TB patients, adverse effect of drugs, imprisonment and HIV positivity were factors that affected the CIF of death. Meanwhile, sex, marital status, imprisonment and HIV positivity were factors affecting the CIF of relapse (P <0.05). Considering death and relapse as competing events, survival estimation in pulmonary TB patients showed that survival in this group of patients in the first, third, fifth and tenth year after treatment was 39%, 14%, 7% and 0%, respectively. Conclusion: The use of competing risks model in survival analysis of patients with pulmonary TB with consideration of competing events, enables more accurate estimation of survival. PMID:27403177
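    The competing-risks machinery used here (death and relapse as competing events) rests on the cumulative incidence function rather than 1 minus Kaplan-Meier; a hand-rolled Aalen-Johansen-style sketch on toy data (the follow-up times and event labels below are invented):

    ```python
    from collections import Counter

    def cumulative_incidence(times, events, cause, horizon):
        """Nonparametric cumulative incidence function for one cause
        in the presence of competing events.
        events: 0 = censored, 1 = cause of interest (e.g. death),
        2 = competing event (e.g. relapse)."""
        data = sorted(zip(times, events))
        n_at_risk = len(data)
        surv = 1.0  # overall event-free survival just before t
        cif = 0.0
        i = 0
        while i < len(data) and data[i][0] <= horizon:
            t = data[i][0]
            counts = Counter()
            j = i
            while j < len(data) and data[j][0] == t:
                counts[data[j][1]] += 1
                j += 1
            d_all = counts[1] + counts[2]
            cif += surv * counts[cause] / n_at_risk
            surv *= 1 - d_all / n_at_risk
            n_at_risk -= (j - i)
            i = j
        return cif

    # toy follow-up data (years after treatment)
    times  = [1, 2, 2, 3, 4, 5, 6, 7, 8, 10]
    events = [1, 2, 0, 1, 0, 2, 1, 0, 1, 0]
    print(round(cumulative_incidence(times, events, cause=1, horizon=5), 3))
    ```

    Unlike 1 - Kaplan-Meier, the CIF never overstates a cause's risk when the competing event removes patients from being at risk, which is why the abstract calls the competing-risks estimates "more accurate".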

  12. Indoor radon and lung cancer. Estimating the risks.

    PubMed Central

    Samet, J. M.

    1992-01-01

    Radon is ubiquitous in indoor environments. Epidemiologic studies of underground miners with exposure to radon and experimental evidence have established that radon causes lung cancer. The finding that this naturally occurring carcinogen is present in the air of homes and other buildings has raised concern about the lung cancer risk to the general population from radon. I review current approaches for assessing the risk of indoor radon, emphasizing the extrapolation of the risks for miners to the general population. Although uncertainties are inherent in this risk assessment, the present evidence warrants identifying homes that have unacceptably high concentrations. PMID:1734594

  13. CCSI Risk Estimation: An Application of Expert Elicitation

    SciTech Connect

    Engel, David W.; Dalton, Angela C.

    2012-10-01

    The Carbon Capture Simulation Initiative (CCSI) is a multi-laboratory simulation-driven effort to develop carbon capture technologies with the goal of accelerating commercialization and adoption in the near future. One of the key CCSI technical challenges is representing and quantifying the inherent uncertainty and risks associated with developing, testing, and deploying the technology in simulated and real operational settings. To address this challenge, the CCSI Element 7 team developed a holistic risk analysis and decision-making framework. The purpose of this report is to document the CCSI Element 7 structured systematic expert elicitation to identify additional risk factors. We review the significance of and established approaches to expert elicitation, describe the CCSI risk elicitation plan and implementation strategies, and conclude by discussing the next steps and highlighting the contribution of risk elicitation toward the achievement of the overarching CCSI objectives.

  14. NEED FOR INDIVIDUAL CANCER RISK ESTIMATES IN X-RAY AND NUCLEAR MEDICINE IMAGING.

    PubMed

    Mattsson, Sören

    2016-06-01

    To facilitate the justification of an X-ray or nuclear medicine investigation, and to inform patients, it is desirable that the individual patient's radiation dose and potential cancer risk can be prospectively assessed and documented. Current dose reporting is based on effective dose, which ignores body size and does not reflect the strong dependence of risk on age at exposure. Risk estimation would be better based on individual organ dose assessments, which require careful exposure characterisation as well as an anatomical description of the individual patient. In nuclear medicine, reference biokinetic models should also be replaced with models describing individual physiological states and biokinetics. There is a need to adjust population-based cancer risk estimates to the possible risk of leukaemia and solid tumours for the individual, depending on age and gender. The article summarises the reasons for individual cancer risk estimates and gives examples of methods and results of such estimates. PMID:26994092

  15. Time-Dependent Risk Estimation and Cost-Benefit Analysis for Mitigation Actions

    NASA Astrophysics Data System (ADS)

    van Stiphout, T.; Wiemer, S.; Marzocchi, W.

    2009-04-01

    Earthquakes cluster strongly in space and time. Consequently, the most dangerous time is right after a moderate earthquake has happened, because there is a 'high' (i.e., 2-5 percent) probability that this event will be followed by an aftershock as large as or larger than the initiating event. The seismic hazard during this period exceeds the background probability significantly, by several orders of magnitude. Scientists have developed increasingly accurate forecast models for this time-dependent hazard, and such models are currently being validated in prospective testing. However, this probabilistic hazard information is difficult to digest for decision makers, the media and the general public. Here, we introduce a possible bridge between seismology and decision makers (authorities, civil defense) by proposing a more objective way to estimate time-dependent risk. Short Term Earthquake Risk assessment (STEER) combines aftershock hazard and loss assessments. We use site-specific information on site effects and building class distribution and combine this with existing loss models to compute site-specific time-dependent risk curves (probability of exceedance for fatalities, injuries, damage, etc.). We show the effect of uncertainties in the different components using Monte Carlo simulations of the input parameters. These time-dependent risk curves can act as decision support. We extend the STEER approach by introducing a cost-benefit approach for certain mitigation actions after a medium-sized earthquake. Such cost-benefit approaches have recently been developed for volcanic risk assessment to rationalize precautionary evacuations in densely inhabited areas threatened by volcanoes. Here we extend the concept to time-dependent probabilistic seismic risk assessment. For the cost-benefit analysis of mitigation actions we calculate the ratio between the cost for the mitigation actions and the cost of the
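    The cost-benefit criterion borrowed here from volcanic risk assessment reduces to "mitigate when cost < probability gain x loss"; a minimal sketch (the 2-5 percent aftershock probability comes from the abstract, while the monetary figures are invented):

    ```python
    def mitigate(cost_mitigation, loss_if_event, p_event_with, p_event_without):
        """Decide whether a mitigation action (e.g. a precautionary
        evacuation after a moderate shock) is cost-effective: act
        when its cost is less than the expected loss it avoids."""
        avoided = (p_event_without - p_event_with) * loss_if_event
        return cost_mitigation < avoided, avoided

    # illustrative numbers: 3% probability of a damaging aftershock,
    # mitigation assumed to remove the exposure entirely
    decision, avoided = mitigate(cost_mitigation=2e6,
                                 loss_if_event=5e8,
                                 p_event_with=0.0,
                                 p_event_without=0.03)
    print(decision, f"{avoided:.1e}")
    ```

    In practice both probabilities and losses carry uncertainty, which is why the abstract propagates them with Monte Carlo simulation rather than using point values.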

  16. A systematic approach for the accurate non-invasive estimation of blood glucose utilizing a novel light-tissue interaction adaptive modelling scheme

    NASA Astrophysics Data System (ADS)

    Rybynok, V. O.; Kyriacou, P. A.

    2007-10-01

    Diabetes is one of the biggest health challenges of the 21st century. The obesity epidemic, sedentary lifestyles and an ageing population mean prevalence of the condition is currently doubling every generation. Diabetes is associated with serious chronic ill health, disability and premature mortality. Long-term complications including heart disease, stroke, blindness, kidney disease and amputations, make the greatest contribution to the costs of diabetes care. Many of these long-term effects could be avoided with earlier, more effective monitoring and treatment. Currently, blood glucose can only be monitored through the use of invasive techniques. To date there is no widely accepted and readily available non-invasive monitoring technique to measure blood glucose despite the many attempts. This paper challenges one of the most difficult non-invasive monitoring techniques, that of blood glucose, and proposes a new novel approach that will enable the accurate, and calibration free estimation of glucose concentration in blood. This approach is based on spectroscopic techniques and a new adaptive modelling scheme. The theoretical implementation and the effectiveness of the adaptive modelling scheme for this application has been described and a detailed mathematical evaluation has been employed to prove that such a scheme has the capability of extracting accurately the concentration of glucose from a complex biological media.

  17. Individualized Risk Estimation for Postoperative Complications After Surgery for Oral Cavity Cancer

    PubMed Central

    Awad, Mahmoud I.; Palmer, Frank L.; Kou, Lei; Yu, Changhong; Montero, Pablo H.; Shuman, Andrew G.; Ganly, Ian; Shah, Jatin P.; Kattan, Michael W.; Patel, Snehal G.

    2016-01-01

    , preoperative hematocrit, planned neck dissection, and planned tracheotomy. The nomogram predicted a major complication with a validated concordance index of 0.79. Inclusion of surgical operative variables in the nomogram maintained predictive accuracy (concordance index, 0.77). CONCLUSIONS AND RELEVANCE A statistical tool was developed that accurately estimates an individual patient’s risk of developing a major complication after surgery for oral cavity squamous cell carcinoma. PMID:26469394
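    The validated concordance index of 0.79 reported here is the probability that, for a random (complication, no-complication) patient pair, the model assigns the higher risk to the patient who had the complication; a from-scratch sketch with invented risks and outcomes:

    ```python
    from itertools import combinations

    def concordance_index(risks, outcomes):
        """Concordance (c) index for a binary outcome: fraction of
        discordant-outcome pairs in which the higher predicted risk
        belongs to the patient with the event (ties count 1/2)."""
        num = den = 0.0
        for (r1, y1), (r2, y2) in combinations(zip(risks, outcomes), 2):
            if y1 == y2:
                continue  # only pairs with different outcomes count
            den += 1
            event_ranked_higher = (r1 > r2 and y1 == 1) or (r2 > r1 and y2 == 1)
            num += 1.0 if event_ranked_higher else (0.5 if r1 == r2 else 0.0)
        return num / den

    # hypothetical predicted risks and observed major complications
    risks    = [0.10, 0.35, 0.20, 0.80, 0.55, 0.15]
    outcomes = [0,    1,    0,    1,    0,    0]
    print(round(concordance_index(risks, outcomes), 3))
    ```

    A value of 0.5 means the nomogram ranks no better than chance; 1.0 means perfect discrimination, so 0.79 indicates good separation of high- and low-risk patients.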

  18. Comparing an estimate of seabirds at risk to a mortality estimate from the November 2004 Terra Nova FPSO oil spill.

    PubMed

    Wilhelm, Sabina I; Robertson, Gregory J; Ryan, Pierre C; Schneider, David C

    2007-05-01

    On 21 November 2004, about 1000 barrels of crude oil were accidentally released from the Terra Nova FPSO (floating production, storage and offloading) onto the Grand Banks, approximately 340 km east-southeast of St. John's, Newfoundland. We estimated the number of vulnerable seabirds (murres (Uria spp.) and dovekies (Alle alle)) at risk from this incident by multiplying observed densities of seabirds with the total area covered by the slick, estimated at 793 km(2). A mean density of 3.46 murres/km(2) and 1.07 dovekies/km(2) on the sea surface was recorded during vessel-based surveys on 28 and 29 November 2004, with a mean density of 6.90 murres/km(2) and 13.43 dovekies/km(2) combining those on the sea and in flight. We calculated a mean of 9858 murres and dovekies were at risk of being oiled, with estimates ranging from 3593 to 16,122 depending on what portion of birds in flight were assumed to be at risk. A mortality model based on spill volume was derived independently of the risk model, and estimated that 4688 (CI 95%: 1905-12,480) birds were killed during this incident. A low mortality estimate based strictly on spill volume would be expected for this incident, which occurred in an area of relatively high seabird densities. Given that the risk and mortality estimates are statistically indistinguishable, we estimate that on the order of 10,000 birds were killed by the Terra Nova spill. PMID:17328926
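    The at-risk estimates here are simply observed density times slick area; the sketch below reproduces the abstract's lower and upper bounds (the small discrepancy against the reported 3593 comes from per-species rounding):

    ```python
    AREA_KM2 = 793.0  # estimated slick area from the abstract

    def birds_at_risk(density_per_km2):
        """Birds at risk = observed density x area covered by the slick."""
        return density_per_km2 * AREA_KM2

    # sea-surface densities only (lower bound)
    on_water = birds_at_risk(3.46) + birds_at_risk(1.07)
    # sea-surface plus birds in flight (upper bound)
    with_flight = birds_at_risk(6.90) + birds_at_risk(13.43)
    print(round(on_water), round(with_flight))  # ~3592 and 16122
    ```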

  19. Space Radiation Cancer, Circulatory Disease and CNS Risks for Near Earth Asteroid and Mars Missions: Uncertainty Estimates for Never-Smokers

    NASA Technical Reports Server (NTRS)

    Cucinotta, Francis A.; Chappell, Lori J.; Wang, Minli; Kim, Myung-Hee

    2011-01-01

    The uncertainties in estimating the health risks from galactic cosmic rays (GCR) and solar particle events (SPE) are a major limitation to the length of space missions and the evaluation of potential risk mitigation approaches. NASA limits astronaut exposures to a 3% risk of exposure induced cancer death (REID), and protects against uncertainties in risks projections using an assessment of 95% confidence intervals after propagating the error from all model factors (environment and organ exposure, risk coefficients, dose-rate modifiers, and quality factors). Because there are potentially significant late mortality risks from diseases of the circulatory system and central nervous system (CNS) which are less well defined than cancer risks, the cancer REID limit is not necessarily conservative. In this report, we discuss estimates of lifetime risks from space radiation and new estimates of model uncertainties are described. The key updates to the NASA risk projection model are: 1) Revised values for low LET risk coefficients for tissue specific cancer incidence, with incidence rates transported to an average U.S. population to estimate the probability of Risk of Exposure Induced Cancer (REIC) and REID. 2) An analysis of smoking attributable cancer risks for never-smokers that shows significantly reduced lung cancer risk as well as overall cancer risks from radiation compared to risk estimated for the average U.S. population. 3) Derivation of track structure based quality functions depends on particle fluence, charge number, Z and kinetic energy, E. 4) The assignment of a smaller maximum in quality function for leukemia than for solid cancers. 5) The use of the ICRP tissue weights is shown to over-estimate cancer risks from SPEs by a factor of 2 or more. Summing cancer risks for each tissue is recommended as a more accurate approach to estimate SPE cancer risks. 6) Additional considerations on circulatory and CNS disease risks. Our analysis shows that an individual s

  20. Biomechanical Risk Estimates for Mild Traumatic Brain Injury

    PubMed Central

    Funk, J. R.; Duma, S. M.; Manoogian, S. J.; Rowson, S.

    2007-01-01

    The objective of this study was to characterize the risk of mild traumatic brain injury (MTBI) in living humans based on a large set of head impact data taken from American football players at the collegiate level. Real-time head accelerations were recorded from helmet-mounted accelerometers designed to stay in contact with the player’s head. Over 27,000 head impacts were recorded, including four impacts resulting in MTBI. Parametric risk curves were developed by normalizing MTBI incidence data by head impact exposure data. An important finding of this research is that living humans, at least in the setting of collegiate football, sustain much more significant head impacts without apparent injury than previously thought. The following preliminary nominal injury assessment reference values associated with a 10% risk of MTBI are proposed: a peak linear head acceleration of 165 g, a HIC of 400, and a peak angular head acceleration of 9000 rad/s2. PMID:18184501
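    Parametric risk curves of the kind used to normalize MTBI incidence by exposure are typically logistic in form; a sketch in which the coefficients are chosen (hypothetically, not from the paper) so that 10% risk falls at the abstract's 165 g value:

    ```python
    import math

    def logistic_risk(x, a, b):
        """Parametric injury-risk curve: risk = 1/(1+exp(-(a + b*x)))."""
        return 1.0 / (1.0 + math.exp(-(a + b * x)))

    def x_at_risk(p, a, b):
        """Invert the curve: stimulus level giving risk p."""
        return (math.log(p / (1 - p)) - a) / b

    # hypothetical coefficients placing ~10% risk at 165 g
    a, b = -7.0, 0.0291077
    print(round(x_at_risk(0.10, a, b)))  # peak linear acceleration (g)
    ```

    Fitting a and b to the observed incidence-per-exposure data is what lets the sparse injury counts (four MTBIs in 27,000 impacts) be turned into the quoted 10%-risk reference values.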

  1. Probabilistic methodology for estimating radiation-induced cancer risk

    SciTech Connect

    Dunning, D.E. Jr.; Leggett, R.W.; Williams, L.R.

    1981-01-01

    The RICRAC computer code was developed at Oak Ridge National Laboratory to provide a versatile and convenient methodology for radiation risk assessment. The code allows as input essentially any dose pattern commonly encountered in risk assessments for either acute or chronic exposures, and it includes consideration of the age structure of the exposed population. Results produced by the analysis include the probability of one or more radiation-induced cancer deaths in a specified population, expected numbers of deaths, and expected years of life lost as a result of premature fatalities. These calculations include consideration of competing risks of death from all other causes. The program also generates a probability frequency distribution of the expected number of cancers in any specified cohort resulting from a given radiation dose. The methods may be applied to any specified population and dose scenario.
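    The headline output, the probability of one or more radiation-induced cancer deaths in a specified population, has a simple Poisson approximation (a simplification for illustration; RICRAC itself propagates dose patterns, age structure, and competing risks):

    ```python
    import math

    def prob_one_or_more(expected_deaths):
        """P(at least one radiation-induced cancer death) when the
        number of deaths is approximated as Poisson-distributed."""
        return 1.0 - math.exp(-expected_deaths)

    # e.g. a population of 10,000 with an individual risk of 1e-4
    print(round(prob_one_or_more(10_000 * 1e-4), 3))
    ```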

  2. State of the art coronary heart disease risk estimations based on the Framingham heart study.

    PubMed

    Reissigová, J; Tomecková, M

    2005-12-01

    The aim was to review the most interesting articles dealing with estimation of an individual's absolute coronary heart disease risk based on the Framingham heart study. Besides the Framingham coronary heart disease risk functions, results of validation studies of these functions are discussed. In general, the Framingham risk functions overestimated an individual's absolute risk in external (non-Framingham) populations with a lower occurrence of coronary heart disease than the Framingham population, and underestimated it in populations with a higher occurrence of coronary heart disease. Even where the calibration accuracy of the Framingham risk functions was unsatisfactory, they were still able to rank individuals by risk from low-risk to high-risk groups, with a discrimination ability of 60% and more. PMID:16419382

  3. How reliable are the risk estimates for X-ray examinations in forensic age estimations? A safety update.

    PubMed

    Ramsthaler, F; Proschek, P; Betz, W; Verhoff, M A

    2009-05-01

    Possible biological side effects of exposure to X-rays include stochastic effects such as carcinogenesis and genetic alterations. In recent years, a number of new studies have been published on the particular cancer risk that children may incur from diagnostic X-rays. Children and adolescents, who constitute many of the probands in forensic age-estimation proceedings, are considerably more sensitive to the carcinogenic risks of ionizing radiation than adults. Established doses for X-ray examinations in forensic age estimations vary from less than 0.1 microSv (left-hand X-ray) up to more than 800 microSv (computed tomography). Computed tomography in children, as a relatively high-dose procedure, is of particular interest because the doses involved are near the lower limit of the doses observed and analyzed in A-bomb survivor studies, from which direct epidemiological data exist concerning lifetime cancer risk. Since there is no medical indication for forensic age examinations, it should be stressed that only safe methods are generally acceptable. This paper reviews current knowledge on cancer risks associated with diagnostic radiation and aims to help forensic experts, dentists, and pediatricians evaluate the risk from radiation when using X-rays in age-estimation procedures. PMID:19153756

  4. CubeSat mission design software tool for risk estimating relationships

    NASA Astrophysics Data System (ADS)

    Gamble, Katharine Brumbaugh; Lightsey, E. Glenn

    2014-09-01

    In an effort to make the CubeSat risk estimation and management process more scientific, a software tool has been created that enables mission designers to estimate mission risks. CubeSat mission designers are able to input mission characteristics, such as form factor, mass, development cycle, and launch information, in order to determine the mission risk root causes which historically present the highest risk for their mission. Historical data was collected from the CubeSat community and analyzed to provide a statistical background to characterize these Risk Estimating Relationships (RERs). This paper develops and validates the mathematical model based on the same cost estimating relationship methodology used by the Unmanned Spacecraft Cost Model (USCM) and the Small Satellite Cost Model (SSCM). The RER development uses general error regression models to determine the best fit relationship between root cause consequence and likelihood values and the input factors of interest. These root causes are combined into seven overall CubeSat mission risks which are then graphed on the industry-standard 5×5 Likelihood-Consequence (L-C) chart to help mission designers quickly identify areas of concern within their mission. This paper is the first to document not only the creation of a historical database of CubeSat mission risks, but, more importantly, the scientific representation of Risk Estimating Relationships.

  5. Development and validation of risk prediction equations to estimate future risk of blindness and lower limb amputation in patients with diabetes: cohort study

    PubMed Central

    Coupland, Carol

    2015-01-01

    Study question: Is it possible to develop and externally validate risk prediction equations to estimate the 10 year risk of blindness and lower limb amputation in patients with diabetes aged 25-84 years? Methods: This was a prospective cohort study using routinely collected data from general practices in England contributing to the QResearch and Clinical Practice Research Datalink (CPRD) databases during the study period 1998-2014. The equations were developed using 763 QResearch practices (n=454 575 patients with diabetes) and validated in 254 different QResearch practices (n=142 419) and 357 CPRD practices (n=206 050). Cox proportional hazards models were used to derive separate risk equations for blindness and amputation in men and women that could be evaluated at 10 years. Measures of calibration and discrimination were calculated in the two validation cohorts. Study answer and limitations: Risk prediction equations to quantify absolute risk of blindness and amputation in men and women with diabetes have been developed and externally validated. In the QResearch derivation cohort, 4822 new cases of lower limb amputation and 8063 new cases of blindness occurred during follow-up. The risk equations were well calibrated in both validation cohorts. Discrimination was good in men in the external CPRD cohort for amputation (D statistic 1.69, Harrell’s C statistic 0.77) and blindness (D statistic 1.40, Harrell’s C statistic 0.73), with similar results in women and in the QResearch validation cohort. The algorithms are based on variables that patients are likely to know or that are routinely recorded in general practice computer systems. They can be used to identify patients at high risk for prevention or further assessment. Limitations include lack of formally adjudicated outcomes, information bias, and missing data. What this study adds: Patients with type 1 or type 2 diabetes are at increased risk of blindness and amputation but generally do not have accurate
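    Risk equations of this kind are typically published as a baseline 10-year survival plus Cox coefficients, evaluated as 1 - S0(10)^exp(linear predictor); a sketch with invented coefficients (not the study's published values):

    ```python
    import math

    def ten_year_risk(baseline_survival_10y, betas, covariates, means):
        """Absolute 10-year risk from a Cox model:
        1 - S0(10) ** exp(sum beta_i * (x_i - mean_i))."""
        lp = sum(b * (x - m) for b, x, m in zip(betas, covariates, means))
        return 1.0 - baseline_survival_10y ** math.exp(lp)

    risk = ten_year_risk(
        baseline_survival_10y=0.98,   # assumed S0(10), illustrative
        betas=[0.04, 0.60],           # e.g. age and HbA1c-band effects
        covariates=[62.0, 2.0],
        means=[55.0, 1.0],
    )
    print(round(risk, 4))
    ```

    Because every input is something patients know or that sits in the practice record, the equation can be evaluated at the point of care to flag high-risk patients.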

  6. Estimation of Newborn Risk for Child or Adolescent Obesity: Lessons from Longitudinal Birth Cohorts

    PubMed Central

    Morandi, Anita; Meyre, David; Lobbens, Stéphane; Kleinman, Ken; Kaakinen, Marika; Rifas-Shiman, Sheryl L.; Vatin, Vincent; Gaget, Stefan; Pouta, Anneli; Hartikainen, Anna-Liisa; Laitinen, Jaana; Ruokonen, Aimo; Das, Shikta; Khan, Anokhi Ali; Elliott, Paul; Maffeis, Claudio; Gillman, Matthew W.

    2012-01-01

    Objectives: Prevention of obesity should start as early as possible after birth. We aimed to build clinically useful equations estimating the risk of later obesity in newborns, as a first step towards focused early prevention against the global obesity epidemic. Methods: We analyzed the lifetime Northern Finland Birth Cohort 1986 (NFBC1986) (N = 4,032) to draw predictive equations for childhood and adolescent obesity from traditional risk factors (parental BMI, birth weight, maternal gestational weight gain, behaviour and social indicators), and a genetic score built from 39 BMI/obesity-associated polymorphisms. We performed validation analyses in a retrospective cohort of 1,503 Italian children and in a prospective cohort of 1,032 U.S. children. Results: In the NFBC1986, the cumulative accuracy of traditional risk factors predicting childhood obesity, adolescent obesity, and childhood obesity persistent into adolescence was good: AUROC = 0.78 [0.74-0.82], 0.75 [0.71-0.79] and 0.85 [0.80-0.90] respectively (all p<0.001). Adding the genetic score produced discrimination improvements ≤1%. The NFBC1986 equation for childhood obesity remained acceptably accurate when applied to the Italian and the U.S. cohort (AUROC = 0.70 [0.63-0.77] and 0.73 [0.67-0.80] respectively) and the two additional equations for childhood obesity newly drawn from the Italian and the U.S. datasets showed good accuracy in respective cohorts (AUROC = 0.74 [0.69-0.79] and 0.79 [0.73-0.84]) (all p<0.001). The three equations for childhood obesity were converted into simple Excel risk calculators for potential clinical use. Conclusion: This study provides the first example of handy tools for predicting childhood obesity in newborns by means of easily recorded information, while it shows that currently known genetic variants have very little usefulness for such prediction. PMID:23209618

  7. Assessing uncertainty in published risk estimates using hexavalent chromium and lung cancer mortality as an example

    EPA Science Inventory

    Introduction: The National Research Council recommended quantitative evaluation of uncertainty in effect estimates for risk assessment. This analysis considers uncertainty across model forms and model parameterizations with hexavalent chromium [Cr(VI)] and lung cancer mortality a...

  8. Risk Assessment Tool for Estimating Your 10-Year Risk of Having a Heart Attack

    MedlinePlus

    ... Cardiovascular Risk: Systematic Evidence Review from the Risk Assessment Work Group The Evidence Report Full Report Accessible ... MB) Printer-friendly version (2 MB) Study Quality Assessment Tools Clinical Practice Guideline: Developed Under NHLBI Partnership ...

  9. Assessing the risk of Legionnaires' disease: the inhalation exposure model and the estimated risk in residential bathrooms.

    PubMed

    Azuma, Kenichi; Uchiyama, Iwao; Okumura, Jiro

    2013-02-01

    Legionella are widely found in the built environment. Patients with Legionnaires' disease have been increasing in Japan; however, health risks from Legionella bacteria in the environment are not appropriately assessed. We performed a quantitative health risk assessment modeled on residential bathrooms in the Adachi outbreak area and estimated risk levels. The estimated risks in the Adachi outbreak approximately corresponded to the risk levels exponentially extrapolated to lower exposures on the basis of infection and mortality rates calculated from actual outbreaks, suggesting that the model of Legionnaires' disease in residential bathrooms was adequate to predict disease risk for the evaluated outbreaks. Based on this model, the infection and mortality risk levels per year at 10 CFU/100 ml (100 CFU/L), the Japanese water quality guideline value, were approximately 10⁻² and 10⁻⁵, respectively. However, acceptable risk levels of infection and mortality from Legionnaires' disease should be adjusted to approximately 10⁻⁴ and 10⁻⁷, respectively, per year. Therefore, a reference value of 0.1 CFU/100 ml (1 CFU/L) as a water quality guideline for Legionella bacteria is recommended. This value is occasionally less than the actual detection limit. Legionella levels in water systems should be maintained as low as reasonably achievable (<1 CFU/L). PMID:23195792
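The extrapolation the abstract describes can be sketched with a generic exponential dose-response model scaled to an annual risk. The functional form and the parameter value below are illustrative assumptions, not the authors' fitted model:

```python
import math

def infection_prob_per_exposure(dose_cfu, r=1e-7):
    """Exponential dose-response: p = 1 - exp(-r * dose). r is a hypothetical
    per-organism infectivity parameter, not a fitted Legionella value."""
    return 1.0 - math.exp(-r * dose_cfu)

def annual_risk(p_single, exposures_per_year=365):
    """Aggregate independent daily exposures into an annual infection risk."""
    return 1.0 - (1.0 - p_single) ** exposures_per_year

p_day = infection_prob_per_exposure(dose_cfu=100.0)  # e.g. bathing water at 100 CFU/L
print(f"annual infection risk: {annual_risk(p_day):.2e}")
```

Lowering the guideline concentration by a factor of 100 (to 1 CFU/L) lowers the per-exposure dose, and hence the annual risk, by roughly the same factor in this low-dose regime.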

  10. Smoking is a major preventable risk factor for rheumatoid arthritis: estimations of risks after various exposures to cigarette smoke

    PubMed Central

    Källberg, Henrik; Ding, Bo; Padyukov, Leonid; Bengtsson, Camilla; Rönnelid, Johan; Klareskog, Lars; Alfredsson, Lars

    2011-01-01

    Background Earlier studies have demonstrated that smoking and genetic risk factors interact in conferring an increased risk for rheumatoid arthritis (RA). Less is known about how smoking contributes to RA in the context of genetic variability, and what proportion of RA may be caused by smoking. Objectives To determine the association between amount of smoking and risk of RA in the context of different HLA-DRB1 shared epitope (SE) alleles, and to estimate the proportions of RA cases attributable to smoking. Design, Setting, and Participants Data from the Swedish Epidemiological Investigation of Rheumatoid Arthritis (EIRA) case-control study encompassing 1204 cases and 871 controls were analysed. Main Outcome Measure Estimated odds ratio of developing RA and excess fraction of cases attributable to smoking according to amount of smoking and genotype. Results Smoking was estimated to be responsible for 35% of the ACPA+ cases. For each HLA-DRB1 SE genotype, smoking was dose-dependently associated with increased risk of ACPA+ RA (p-trend<0.001). In individuals carrying two copies of the HLA-DRB1 shared epitope, 55% of ACPA-positive RA cases were attributable to smoking. Conclusions Smoking is a preventable risk factor for RA. The increased risk due to smoking depends on the amount of smoking and on genotype. PMID:21149499

  11. Bayesian Framework for Water Quality Model Uncertainty Estimation and Risk Management

    EPA Science Inventory

    A formal Bayesian methodology is presented for integrated model calibration and risk-based water quality management using Bayesian Monte Carlo simulation and maximum likelihood estimation (BMCML). The primary focus is on lucid integration of model calibration with risk-based wat...

  12. Estimated occupational risk from bioaerosols generated during land application of Class B biosolids

    Technology Transfer Automated Retrieval System (TEKTRAN)

    It has been speculated that bioaerosols generated during land application of biosolids pose a serious occupational risk, but few scientific studies have been performed to assess levels of aerosolization of microorganisms from biosolids and to estimate the occupational risks of infection. This study ...

  13. Risk estimations, risk factors, and genetic variants associated with Alzheimer's disease in selected publications from the Framingham Heart Study.

    PubMed

    Weinstein, Galit; Wolf, Philip A; Beiser, Alexa S; Au, Rhoda; Seshadri, Sudha

    2013-01-01

    The study of Alzheimer's disease (AD) in the Framingham Heart Study (FHS), a multi-generational, community-based population study, began nearly four decades ago. In this overview, we highlight findings from seven prior publications that examined lifetime risk estimates for AD, environmental risk factors for AD, circulating and imaging markers of aging-related brain injury, and explorations on the genetics underlying AD. First, we describe estimations of the lifetime risk of AD. These estimates are distinguished from other measures of disease burden and have substantial public health implications. We then describe prospective studies of environmental AD risk factors: one examined the association between plasma levels of omega-3 fatty-acid and risk of incident AD, the other explored the association of diabetes to this risk in subsamples with specific characteristics. With evidence of inflammation as an underlying mechanism, we also describe findings from a study that compared the effects of serum cytokines and spontaneous production of peripheral blood mononuclear cell cytokines on AD risk. Investigating AD related endophenotypes increases sensitivity in identifying risk factors and can be used to explore pathophysiologic pathways between a risk factor and the disease. We describe findings of an association between large volume of white matter hyperintensities and a specific pattern of cognitive deficits in non-demented participants. Finally, we summarize our findings from two genetic studies: The first used genome-wide association (GWA) and family-based association methods to explore the genetic basis of cognitive and structural brain traits. The second is a large meta-analysis GWA study of AD, in which novel loci of AD susceptibility were found. Together, these findings demonstrate the FHS multi-directional efforts in investigating dementia and AD. PMID:22796871

  14. Uncertainties in Estimates of the Risks of Late Effects from Space Radiation

    NASA Technical Reports Server (NTRS)

    Cucinotta, F. A.; Schimmerling, W.; Wilson, J. W.; Peterson, L. E.; Saganti, P.; Dicelli, J. F.

    2002-01-01

    The health risks faced by astronauts from space radiation include cancer, cataracts, hereditary effects, and non-cancer morbidity and mortality risks related to the diseases of old age. Methods used to project risks in low-Earth orbit are of questionable merit for exploration missions because of the limited radiobiology data and knowledge of galactic cosmic ray (GCR) heavy ions, which causes estimates of the risk of late effects to be highly uncertain. Risk projections involve a product of many biological and physical factors, each of which has a differential range of uncertainty due to lack of data and knowledge. Within the linear-additivity model, we use Monte-Carlo sampling from subjective uncertainty distributions in each factor to obtain a maximum likelihood estimate of the overall uncertainty in risk projections. The resulting methodology is applied to several human space exploration mission scenarios including ISS, lunar station, deep space outpost, and Mars missions of duration of 360, 660, and 1000 days. The major results are the quantification of the uncertainties in current risk estimates, the identification of factors that dominate risk projection uncertainties, and the development of a method to quantify candidate approaches to reduce uncertainties or mitigate risks. The large uncertainties in GCR risk projections lead to probability distributions of risk that mask any potential risk reduction using the "optimization" of shielding materials or configurations. In contrast, the design of shielding optimization approaches for solar particle events and trapped protons can be made at this time, and promising technologies can be shown to have merit using our approach. The methods used also make it possible to express risk management objectives in terms of quantitative objectives, e.g., the number of days in space without exceeding a given risk level within well-defined confidence limits.
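The projection described multiplies several uncertain factors, each drawn from a subjective uncertainty distribution, and propagates them by Monte Carlo sampling. A generic sketch of that scheme (the distributions and the point estimate below are placeholders, not NASA's actual factors):

```python
import math
import random
import statistics

random.seed(1)

# Each factor: (median, geometric standard deviation) of a subjective lognormal
# distribution. Values are placeholders; the real factors include dosimetry,
# radiation quality, dose-rate effectiveness, and epidemiological transfer.
factors = [(1.0, 1.3), (1.0, 1.5), (1.0, 2.0), (1.0, 1.4)]
point_risk = 0.03  # illustrative point estimate of lifetime risk

samples = []
for _ in range(20000):
    scale = 1.0
    for median, gsd in factors:
        scale *= random.lognormvariate(math.log(median), math.log(gsd))
    samples.append(point_risk * scale)

samples.sort()
lo_q = samples[int(0.025 * len(samples))]
hi_q = samples[int(0.975 * len(samples))]
print(f"risk: {statistics.median(samples):.3f} (95% interval {lo_q:.3f}-{hi_q:.3f})")
```

Because the factors multiply, the widest single distribution (here the one with geometric SD 2.0) dominates the overall uncertainty, which is exactly the kind of dominance analysis the abstract refers to.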

  15. Estimating risks to aquatic life using quantile regression

    USGS Publications Warehouse

    Schmidt, Travis S.; Clements, William H.; Cade, Brian S.

    2012-01-01

    One of the primary goals of biological assessment is to assess whether contaminants or other stressors limit the ecological potential of running waters. It is important to interpret responses to contaminants relative to other environmental factors, but necessity or convenience limit quantification of all factors that influence ecological potential. In these situations, the concept of limiting factors is useful for data interpretation. We used quantile regression to measure risks to aquatic life exposed to metals by including all regression quantiles (τ  =  0.05–0.95, by increments of 0.05), not just the upper limit of density (e.g., 90th quantile). We measured population densities (individuals/0.1 m²) of 2 mayflies (Rhithrogena spp., Drunella spp.) and a caddisfly (Arctopsyche grandis), aqueous metal mixtures (Cd, Cu, Zn), and other limiting factors (basin area, site elevation, discharge, temperature) at 125 streams in Colorado. We used a model selection procedure to test which factor was most limiting to density. Arctopsyche grandis was limited by other factors, whereas metals limited most quantiles of density for the 2 mayflies. Metals reduced mayfly densities most at sites where other factors were not limiting. Where other factors were limiting, low mayfly densities were observed despite metal concentrations. Metals affected mayfly densities most at quantiles above the mean and not just at the upper limit of density. Risk models developed from quantile regression showed that mayfly densities observed at background metal concentrations are improbable when metal mixtures are at US Environmental Protection Agency criterion continuous concentrations. We conclude that metals limit potential density, not realized average density. The most obvious effects on mayfly populations were at upper quantiles and not mean density. Therefore, we suggest that policy developed from mean-based measures of effects may not be as useful as policy based on the concept of
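Quantile regression minimizes the asymmetric "pinball" (check) loss; for an intercept-only model the minimizer is the empirical τ-quantile, which is why fitting many τ values traces out the whole conditional distribution rather than only the mean. A self-contained sketch with hypothetical density data (a full covariate model would use a quantile regression routine such as statsmodels' QuantReg):

```python
def pinball_loss(tau, residuals):
    """Asymmetric check loss minimized by the tau-th conditional quantile."""
    return sum(tau * r if r >= 0 else (tau - 1.0) * r for r in residuals)

# Hypothetical mayfly counts per 0.1 m^2, for illustration only:
densities = [2, 5, 7, 9, 12, 15, 18, 25, 31, 40]
tau = 0.90

# Grid-search a constant fit; the minimizer approximates the 90th percentile,
# i.e. the upper limit of density rather than the mean.
candidates = range(0, 51)
best = min(candidates, key=lambda c: pinball_loss(tau, [d - c for d in densities]))
print(f"tau={tau} intercept-only fit: {best}")
```

Repeating the fit for τ = 0.05 through 0.95 gives the family of quantile curves the study uses to ask where metals limit potential, not average, density.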

  16. Overview of Risk-Estimation Tools for Primary Prevention of Cardiovascular Diseases in European Populations.

    PubMed

    Gorenoi, Vitali; Hagen, Anja

    2015-06-01

    To identify persons with a high risk for cardiovascular diseases (CVD), special tools (scores, charts, graphics or computer programs) for CVD-risk assessment based on levels of certain risk factors have been constructed. The applicability of these instruments depends on the derivation cohorts, considered risk factors and endpoints, applied statistical methods as well as used formats. The review addresses the risk-estimation tools for primary prevention of CVD potentially relevant for European populations. The risk-estimation tools were identified using two previously published systematic reviews as well as conducting a literature search in MEDLINE and a manual search. Only instruments were considered which were derived from cohorts of at least 1,000 participants of one gender without pre-existing CVD, enable risk assessment for a period of at least 5 years, were designed for an age-range of at least 25 years and published after the year 2000. A number of risk-estimation tools for CVD derived from single European, several European and from non-European cohorts were identified. From a clinical perspective, instruments recently developed for the population of interest, which use easily accessible measures and show high discriminating ability, seem preferable. Instruments restricting risk-estimation to certain cardiovascular events, recalibrated high-accuracy tools, or tools derived from European populations with a similar risk-factor distribution and CVD incidence are the second choice. In younger people, calculating the relative risk or cardiovascular age equivalence measures may be of more benefit. PMID:26851417

  17. Estimating Toxicity Pathway Activating Doses for High Throughput Chemical Risk Assessments

    EPA Science Inventory

    Estimating a Toxicity Pathway Activating Dose (TPAD) from in vitro assays as an analog to a reference dose (RfD) derived from in vivo toxicity tests would facilitate high throughput risk assessments of thousands of data-poor environmental chemicals. Estimating a TPAD requires def...

  18. ASSESSMENT OF METHODS FOR ESTIMATING RISK TO BIRDS FROM INGESTION OF CONTAMINATED GRIT PARTICLES (FINAL REPORT)

    EPA Science Inventory

    The report evaluates approaches for estimating the probability of ingestion by birds of contaminated particles such as pesticide granules or lead particles (i.e. shot or bullet fragments). In addition, it presents an approach for using this information to estimate the risk of mo...

  19. REVIEW OF DRAFT REVISED BLUE BOOK ON ESTIMATING CANCER RISKS FROM EXPOSURE TO IONIZING RADIATION

    EPA Science Inventory

    In 1994, EPA published a report, referred to as the “Blue Book,” which lays out EPA’s current methodology for quantitatively estimating radiogenic cancer risks. A follow-on report made minor adjustments to the previous estimates and presented a partial analysis of the uncertainti...

  20. Quantitative Risk reduction estimation Tool For Control Systems, Suggested Approach and Research Needs

    SciTech Connect

    Miles McQueen; Wayne Boyer; Mark Flynn; Sam Alessi

    2006-03-01

    For the past year we have applied a variety of risk assessment technologies to evaluate the risk to critical infrastructure from cyber attacks on control systems. More recently, we identified the need for a stand-alone control system risk reduction estimation tool to provide owners and operators of control systems with a more usable, reliable, and credible method for managing the risks from cyber attack. Risk is defined as the probability of a successful attack times the value of the resulting loss, typically measured in lives and dollars. Qualitative and ad hoc techniques for measuring risk do not provide sufficient support for cost-benefit analyses associated with cyber security mitigation actions. To address the need for better quantitative risk reduction models we surveyed previous quantitative risk assessment research; evaluated currently available tools; developed new quantitative techniques [17] [18]; implemented a prototype analysis tool to demonstrate how such a tool might be used; used the prototype to test a variety of underlying risk calculation engines (e.g., attack tree, attack graph); and identified technical and research needs. We concluded that significant gaps still exist and difficult research problems remain for quantitatively assessing the risk to control system components and networks, but that a usable quantitative risk reduction estimation tool is not beyond reach.
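With risk defined as the probability of a successful attack times the loss, an attack-tree engine combines leaf probabilities through AND/OR gates. A minimal sketch with a hypothetical tree and made-up probabilities (independence between branches is an assumption):

```python
def or_node(probs):
    """P(at least one child attack succeeds), assuming independent branches."""
    p = 1.0
    for x in probs:
        p *= (1.0 - x)
    return 1.0 - p

def and_node(probs):
    """P(all child steps succeed), assuming independent steps."""
    p = 1.0
    for x in probs:
        p *= x
    return p

# Hypothetical control-system attack tree:
# compromise HMI = (phish operator AND escalate privileges) OR exploit exposed service
p_attack = or_node([and_node([0.3, 0.5]), 0.1])
loss = 2.0e6  # illustrative consequence value in dollars
print(f"P(success)={p_attack:.3f}, expected risk=${p_attack * loss:,.0f}")
```

A mitigation's risk reduction is then the drop in this product after lowering one leaf probability, which is the quantity a cost-benefit comparison needs.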

  1. Space Radiation Heart Disease Risk Estimates for Lunar and Mars Missions

    NASA Technical Reports Server (NTRS)

    Cucinotta, Francis A.; Chappell, Lori; Kim, Myung-Hee

    2010-01-01

    The NASA Space Radiation Program performs research on the risks of late effects from space radiation for cancer, neurological disorders, cataracts, and heart disease. For mortality risks, an aggregate over all risks should be considered as well as a projection of the life loss per radiation-induced death. We report on a triple detriment life-table approach to combine cancer and heart disease risks. Epidemiology results show extensive heterogeneity between populations for distinct components of the overall heart disease risks including hypertension, ischaemic heart disease, stroke, and cerebrovascular diseases. We report on an update to our previous heart disease estimates for heart disease (ICD9 390-429) and stroke (ICD9 430-438), and other sub-groups using recent meta-analysis results for various radiation cohorts exposed to low LET radiation. Results for multiplicative and additive risk transfer models are considered using baseline rates for US males and females. Uncertainty analysis indicated heart mortality risks as low as zero, assuming a threshold dose for deterministic effects, and projections approaching one-third of the overall cancer risk. Median life-loss per death estimates were significantly less than those of solid cancers and leukemias. Critical research questions to improve risk estimates for heart disease are distinctions in mechanisms at high doses (>2 Gy) and low to moderate doses (<2 Gy), and data and basic understanding of radiation dose-rate and quality effects, and individual sensitivity.
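The life-table approach folds excess radiation-induced hazards into baseline mortality and accumulates the risk of exposure-induced death. A minimal competing-risk sketch with made-up hazard rates (not the actual US baseline rates or radiation risk coefficients):

```python
def reid(baseline_q, excess_q):
    """Risk of exposure-induced death from a life table: sum over ages of
    P(still alive) * excess hazard, with survival tracking the total hazard."""
    surv = 1.0
    risk = 0.0
    for q0, qe in zip(baseline_q, excess_q):
        q_total = min(q0 + qe, 1.0)
        risk += surv * qe
        surv *= (1.0 - q_total)
    return risk

# Illustrative annual hazards for ages 40-99; values are placeholders.
ages = range(40, 100)
baseline = [0.001 * 1.09 ** (a - 40) for a in ages]  # Gompertz-like baseline mortality
excess_cancer = [2e-4 for _ in ages]
excess_heart = [1e-4 for _ in ages]
combined = [c + h for c, h in zip(excess_cancer, excess_heart)]
print(f"REID (cancer + heart): {reid(baseline, combined):.4f}")
```

Adding heart disease to the excess hazard raises the aggregate detriment but also slightly shortens survival, so the combined REID is less than the sum of the raw excess hazards, which is the competing-risk effect the life table captures.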

  2. [Estimation of the radiation risk of deterministic effects of human exposure in space].

    PubMed

    Petrov, V M; Vasina, Iu I; Vlasov, A G; Shurshakov, V A

    2001-01-01

    The subject of this paper is the possibility of estimating the radiation risk of deterministic consequences of exposure to solar cosmic rays as the probability of violating established dose limits. Analysis of the specifics of crew exposure to solar cosmic rays in long-term, particularly interplanetary, missions suggests that immediate introduction of this principle into radiation health policy could result in serious errors, mainly overestimation, in determining radiation risk. Approaches to radiation risk estimation that take into account the specifics of human exposure in space are proposed. PMID:11915752

  3. A simple procedure for estimating pseudo risk ratios from exposure to non-carcinogenic chemical mixtures.

    PubMed

    Scinicariello, Franco; Portier, Christopher

    2016-03-01

    Non-cancer risk assessment traditionally assumes a threshold of effect, below which there is a negligible risk of an adverse effect. The Agency for Toxic Substances and Disease Registry derives health-based guidance values known as Minimal Risk Levels (MRLs) as estimates of the toxicity threshold for non-carcinogens. Although an MRL, like the EPA reference dose values (RfD and RfC), is defined as a level corresponding to "negligible risk," these values represent daily exposure doses or concentrations, not risks. We present a new approach to calculate the risk at exposure to specific doses for chemical mixtures; the assumption in this approach is to assign de minimis risk at the MRL. The assigned risk enables the estimation of parameters in an exponential model, providing a complete dose-response curve for each compound from the chosen point of departure to zero. We estimated parameters for 27 chemicals. The value of k, which determines the shape of the dose-response curve, was moderately insensitive to the choice of the risk at the MRL. The approach presented here allows for the calculation of a risk from a single substance or the combined risk from multiple chemical exposures in a community. The methodology is applicable to point of departure data derived from quantal data, such as data from benchmark dose analyses, or from data that can be transformed into probabilities, such as a lowest-observed-adverse-effect level. The individual risks are used to calculate risk ratios that can facilitate comparison and cost-benefit analyses of environmental contamination control strategies. PMID:25667015
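Assigning a de minimis risk at the MRL plus a known risk at the point of departure pins down the two parameters of an exponential dose-response model. The functional form below is an illustrative choice, not necessarily the authors' exact parameterization:

```python
import math

def fit_exponential(mrl, risk_at_mrl, pod, risk_at_pod):
    """Solve R(d) = 1 - exp(-b * d**k) through two anchor points."""
    L1 = -math.log(1.0 - risk_at_mrl)
    L2 = -math.log(1.0 - risk_at_pod)
    k = math.log(L2 / L1) / math.log(pod / mrl)   # shape from the ratio of anchors
    b = L1 / mrl ** k                              # scale from the MRL anchor
    return b, k

def risk(d, b, k):
    return 1.0 - math.exp(-b * d ** k)

# Hypothetical anchors: de minimis risk 1e-6 at the MRL, 10% extra risk at a BMDL10.
mrl, pod = 0.003, 0.3  # mg/kg/day, illustrative values
b, k = fit_exponential(mrl, 1e-6, pod, 0.10)
print(f"k = {k:.2f}; risk at 2x MRL = {risk(2 * mrl, b, k):.2e}")
```

Because k enters only through the ratio of the two anchor risks, moving the de minimis value up or down an order of magnitude shifts k only modestly, consistent with the insensitivity the abstract reports.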

  4. Uncertainties in estimates of the risks of late effects from space radiation

    NASA Technical Reports Server (NTRS)

    Cucinotta, F. A.; Schimmerling, W.; Wilson, J. W.; Peterson, L. E.; Saganti, P. B.; Dicello, J. F.

    2004-01-01

    Methods used to project risks in low-Earth orbit are of questionable merit for exploration missions because of the limited radiobiology data and knowledge of galactic cosmic ray (GCR) heavy ions, which causes estimates of the risk of late effects to be highly uncertain. Risk projections involve a product of many biological and physical factors, each of which has a differential range of uncertainty due to lack of data and knowledge. Using the linear-additivity model for radiation risks, we use Monte-Carlo sampling from subjective uncertainty distributions in each factor to obtain an estimate of the overall uncertainty in risk projections. The resulting methodology is applied to several human space exploration mission scenarios including a deep space outpost and Mars missions of duration of 360, 660, and 1000 days. The major results are the quantification of the uncertainties in current risk estimates, the identification of factors that dominate risk projection uncertainties, and the development of a method to quantify candidate approaches to reduce uncertainties or mitigate risks. The large uncertainties in GCR risk projections lead to probability distributions of risk that mask any potential risk reduction using the "optimization" of shielding materials or configurations. In contrast, the design of shielding optimization approaches for solar particle events and trapped protons can be made at this time and promising technologies can be shown to have merit using our approach. The methods used also make it possible to express risk management objectives in terms of quantitative metrics, e.g., the number of days in space without exceeding a given risk level within well-defined confidence limits. Published by Elsevier Ltd on behalf of COSPAR.

  5. Uncertainties in estimates of the risks of late effects from space radiation.

    PubMed

    Cucinotta, F A; Schimmerling, W; Wilson, J W; Peterson, L E; Saganti, P B; Dicello, J F

    2004-01-01

    Methods used to project risks in low-Earth orbit are of questionable merit for exploration missions because of the limited radiobiology data and knowledge of galactic cosmic ray (GCR) heavy ions, which causes estimates of the risk of late effects to be highly uncertain. Risk projections involve a product of many biological and physical factors, each of which has a differential range of uncertainty due to lack of data and knowledge. Using the linear-additivity model for radiation risks, we use Monte-Carlo sampling from subjective uncertainty distributions in each factor to obtain an estimate of the overall uncertainty in risk projections. The resulting methodology is applied to several human space exploration mission scenarios including a deep space outpost and Mars missions of duration of 360, 660, and 1000 days. The major results are the quantification of the uncertainties in current risk estimates, the identification of factors that dominate risk projection uncertainties, and the development of a method to quantify candidate approaches to reduce uncertainties or mitigate risks. The large uncertainties in GCR risk projections lead to probability distributions of risk that mask any potential risk reduction using the "optimization" of shielding materials or configurations. In contrast, the design of shielding optimization approaches for solar particle events and trapped protons can be made at this time and promising technologies can be shown to have merit using our approach. The methods used also make it possible to express risk management objectives in terms of quantitative metrics, e.g., the number of days in space without exceeding a given risk level within well-defined confidence limits. PMID:15881779

  6. Breast Cancer Risk Estimation Using Parenchymal Texture Analysis in Digital Breast Tomosynthesis

    SciTech Connect

    Ikejimba, Lynda C.; Kontos, Despina; Maidment, Andrew D. A.

    2010-10-11

    Mammographic parenchymal texture has been shown to correlate with genetic markers of developing breast cancer. Digital breast tomosynthesis (DBT) is a novel x-ray imaging technique in which tomographic images of the breast are reconstructed from multiple source projections acquired at different angles of the x-ray tube. Compared to digital mammography (DM), DBT eliminates breast tissue overlap, offering superior parenchymal tissue visualization. We hypothesize that texture analysis in DBT could potentially provide a better assessment of parenchymal texture and ultimately result in more accurate assessment of breast cancer risk. As a first step towards validating this hypothesis, we investigated the association between DBT parenchymal texture and breast percent density (PD), a known breast cancer risk factor, and compared it to DM. Bilateral DBT and DM images from 71 women participating in a breast cancer screening trial were analyzed. Filtered-backprojection was used to reconstruct DBT tomographic planes in 1 mm increments with 0.22 mm in-plane resolution. Corresponding DM images were acquired at 0.1 mm pixel resolution. Retroareolar regions of interest (ROIs) equivalent to 2.5 cm³ were segmented from the DBT images and corresponding 2.5 cm² ROIs were segmented from the DM images. Breast PD was mammographically estimated using the Cumulus scale. Overall, DBT texture features demonstrated a stronger correlation than DM to PD. The Pearson correlation coefficients for DBT were r = 0.40 (p<0.001) for contrast and r = -0.52 (p<0.001) for homogeneity; the corresponding DM correlations were r = 0.26 (p = 0.002) and r = -0.33 (p<0.001). Multiple linear regression of the texture features versus breast PD also demonstrated significantly stronger associations in DBT (R² = 0.39) compared to DM (R² = 0.33). We attribute these observations to the superior parenchymal tissue visualization in DBT. 
Our study is the first to perform DBT texture analysis in a

  7. Biologically based risk estimation for radiation-induced CML. Inferences from BCR and ABL geometric distributions.

    PubMed

    Radivoyevitch, T; Kozubek, S; Sachs, R K

    2001-03-01

    Chronic myeloid leukemia (CML) invites biologically based radiation risk modeling because CML is simultaneously well-understood, homogeneous and prevalent. CML is known to be caused by a translocation involving the ABL and BCR genes, almost all CML patients have the BCR-ABL translocation, and CML is prevalent enough that its induction is unequivocally detected among Hiroshima A-bomb survivors. In a previous paper, a linear-quadratic-exponential (LQE) dose-response model was used to estimate the lifetime excess risk of CML in the limit of low doses of gamma-rays, Rγ. This estimate assumed that BCR-ABL translocation dose-response curves in stem cells for both neutrons and gamma-rays differ only by a common proportionality constant from dicentric aberration dose-response curves in lymphocytes. In the present paper we challenge this assumption by predicting the BCR-ABL dose response. The predictions are based on the biophysical theory of dual radiation action (TDRA) as it applies to recent BCR-to-ABL distance data in G0 human lymphocytes; these data show BCR and ABL geometric distributions that are not uniform and not independent, with close association of the two genes in some cells. The analysis speaks against the previous proportionality assumption. We compute 11 plausible LQE estimates of Rγ, 2 based on the proportionality assumption and 9 based on TDRA predictions. For each estimate of Rγ we also compute an associated estimate of the number of CML target cells, N; the biological basis of the LQE model allows us to form such estimates. Consistency between N and hematological considerations provides a plausibility check of the risk estimates. Within the group of estimates investigated, the most plausible lifetime excess risk estimates tend to lie near Rγ = 0.01 Gy⁻¹, substantially higher than risk estimates based on the proportionality assumption. PMID:11357705

  8. Race-specific genetic risk score is more accurate than nonrace-specific genetic risk score for predicting prostate cancer and high-grade diseases

    PubMed Central

    Na, Rong; Ye, Dingwei; Qi, Jun; Liu, Fang; Lin, Xiaoling; Helfand, Brian T; Brendler, Charles B; Conran, Carly; Gong, Jian; Wu, Yishuo; Gao, Xu; Chen, Yaqing; Zheng, S Lilly; Mo, Zengnan; Ding, Qiang; Sun, Yinghao; Xu, Jianfeng

    2016-01-01

    Genetic risk score (GRS) based on disease risk-associated single nucleotide polymorphisms (SNPs) is an informative tool that can be used to provide inherited information for specific diseases in addition to family history. However, it is still unknown whether only SNPs that are implicated in a specific racial group should be used when calculating GRSs. The objective of this study is to compare the performance of race-specific GRS and nonrace-specific GRS for predicting prostate cancer (PCa) among 1338 patients who underwent prostate biopsy in Shanghai, China. A race-specific GRS was calculated with seven PCa risk-associated SNPs implicated in East Asians (GRS7), and a nonrace-specific GRS was calculated based on 76 PCa risk-associated SNPs implicated in at least one racial group (GRS76). The means of GRS7 and GRS76 were 1.19 and 1.85, respectively, in the study population. Higher GRS7 and GRS76 were independent predictors for PCa and high-grade PCa in univariate and multivariate analyses. GRS7 had a better area under the receiver-operating curve (AUC) than GRS76 for discriminating PCa (0.602 vs 0.573) and high-grade PCa (0.603 vs 0.575) but did not reach statistical significance. GRS7 had a better (up to 13% at different cutoffs) positive predictive value (PPV) than GRS76. In conclusion, a race-specific GRS is more robust and performs better when predicting PCa in East Asian men than a GRS calculated using SNPs that are not shown to be associated with East Asians. PMID:27140652
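The AUC comparison above (0.602 vs 0.573) can be computed directly from raw scores via the Mann-Whitney statistic: AUC equals the probability that a randomly chosen case scores higher than a randomly chosen control. A sketch with hypothetical GRS values (not the study's data):

```python
def auc(case_scores, control_scores):
    """AUC = P(case > control) + 0.5 * P(tie), the Mann-Whitney estimator."""
    wins = ties = 0
    for c in case_scores:
        for nc in control_scores:
            if c > nc:
                wins += 1
            elif c == nc:
                ties += 1
    return (wins + 0.5 * ties) / (len(case_scores) * len(control_scores))

# Hypothetical GRS values for biopsy-positive cases and negative controls:
grs_cases = [1.6, 1.3, 2.1, 0.9, 1.8]
grs_controls = [1.0, 1.2, 0.8, 1.4, 0.7]
print(f"AUC = {auc(grs_cases, grs_controls):.2f}")
```

Comparing two GRSs then reduces to computing this statistic for each score on the same cases and controls, as the study does for GRS7 versus GRS76.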

  9. Race-specific genetic risk score is more accurate than nonrace-specific genetic risk score for predicting prostate cancer and high-grade diseases.

    PubMed

    Na, Rong; Ye, Dingwei; Qi, Jun; Liu, Fang; Lin, Xiaoling; Helfand, Brian T; Brendler, Charles B; Conran, Carly; Gong, Jian; Wu, Yishuo; Gao, Xu; Chen, Yaqing; Zheng, S Lilly; Mo, Zengnan; Ding, Qiang; Sun, Yinghao; Xu, Jianfeng

    2016-01-01

    Genetic risk score (GRS) based on disease risk-associated single nucleotide polymorphisms (SNPs) is an informative tool that can be used to provide inherited information for specific diseases in addition to family history. However, it is still unknown whether only SNPs that are implicated in a specific racial group should be used when calculating GRSs. The objective of this study is to compare the performance of race-specific GRS and nonrace-specific GRS for predicting prostate cancer (PCa) among 1338 patients who underwent prostate biopsy in Shanghai, China. A race-specific GRS was calculated with seven PCa risk-associated SNPs implicated in East Asians (GRS7), and a nonrace-specific GRS was calculated based on 76 PCa risk-associated SNPs implicated in at least one racial group (GRS76). The means of GRS7 and GRS76 were 1.19 and 1.85, respectively, in the study population. Higher GRS7 and GRS76 were independent predictors for PCa and high-grade PCa in univariate and multivariate analyses. GRS7 had a better area under the receiver-operating curve (AUC) than GRS76 for discriminating PCa (0.602 vs 0.573) and high-grade PCa (0.603 vs 0.575) but did not reach statistical significance. GRS7 had a better (up to 13% at different cutoffs) positive predictive value (PPV) than GRS76. In conclusion, a race-specific GRS is more robust and performs better when predicting PCa in East Asian men than a GRS calculated using SNPs that are not shown to be associated with East Asians. PMID:27140652

  10. Comparison of Paper-and-Pencil versus Web Administration of the Youth Risk Behavior Survey (YRBS): Risk Behavior Prevalence Estimates

    ERIC Educational Resources Information Center

    Eaton, Danice K.; Brener, Nancy D.; Kann, Laura; Denniston, Maxine M.; McManus, Tim; Kyle, Tonja M.; Roberts, Alice M.; Flint, Katherine H.; Ross, James G.

    2010-01-01

    The authors examined whether paper-and-pencil and Web surveys administered in the school setting yield equivalent risk behavior prevalence estimates. Data were from a methods study conducted by the Centers for Disease Control and Prevention (CDC) in spring 2008. Intact classes of 9th- or 10th-grade students were assigned randomly to complete a…

  11. Spatial Estimation of Populations at Risk from Radiological Dispersion Device Terrorism Incidents

    SciTech Connect

    Regens, J.L.; Gunter, J.T.

    2008-07-01

    Delineation of the location and size of the population potentially at risk of exposure to ionizing radiation is one of the key analytical challenges in accurately estimating the severity of the potential health effects associated with a radiological terrorism incident. Regardless of spatial scale, the geographical units for which population data commonly are collected rarely coincide with the geographical scale necessary for effective incident management and medical response. This paper identifies major government and commercial open sources of U.S. population data and presents a GIS-based approach for allocating publicly available population data, including age distributions, to geographical units appropriate for planning and implementing incident management and medical response strategies. In summary: The gravity model offers a straightforward, empirical tool for estimating population flows, especially when geographical areas are relatively well-defined in terms of accessibility and spatial separation. This is particularly important for several reasons. First, the spatial scale for the area impacted by an RDD terrorism event is unlikely to match fully the spatial scale of available population data. That is, the plume spread typically will not uniformly overlay the impacted area. Second, the number of people within the impacted area varies as a function of whether an attack occurs during the day or night. For example, the population of a central business district or industrial area typically is larger during the day, while predominantly residential areas have larger nighttime populations. As a result, interpolation techniques are needed that link population data to geographical units and allocate those data by time frame at a spatial scale relevant to enhancing preparedness and response. The gravity model's main advantage is that it efficiently allocates readily available, open source population data to geographical units appropriate for planning and implementing incident management and medical response strategies.
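
The gravity model referred to above estimates the interaction or flow between two zones as proportional to the product of their populations and inversely proportional to a power of the distance separating them. A minimal sketch with made-up zone data:

```python
def gravity_flow(pop_origin, pop_dest, distance, k=1.0, beta=2.0):
    """Estimated interaction/flow between two zones:
    T_ij = k * P_i * P_j / d_ij ** beta."""
    return k * pop_origin * pop_dest / distance ** beta

# Hypothetical zones: a residential origin and a business-district destination
flow_day = gravity_flow(50_000, 20_000, 10.0)
```

In practice `k` and `beta` are calibrated against observed commuting data; normalizing the flows into each destination zone then yields the daytime/nighttime population allocations the abstract describes.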

  12. ESTIMATING RISK TO CALIFORNIA ENERGY INFRASTRUCTURE FROM PROJECTED CLIMATE CHANGE

    SciTech Connect

    Sathaye, Jayant; Dale, Larry; Larsen, Peter; Fitts, Gary; Koy, Kevin; Lewis, Sarah; Lucena, Andre

    2011-06-22

    This report outlines the results of a study of the impact of climate change on the energy infrastructure of California and the San Francisco Bay region, including impacts on power plant generation; transmission line and substation capacity during heat spells; wildfires near transmission lines; sea level encroachment upon power plants, substations, and natural gas facilities; and peak electrical demand. Some end-of-century impacts were projected: Expected warming will decrease gas-fired generator efficiency. The maximum statewide coincident loss is projected at 10.3 gigawatts (with current power plant infrastructure and population), an increase of 6.2 percent over current temperature-induced losses. By the end of the century, electricity demand for almost all summer days is expected to exceed the current ninetieth percentile per-capita peak load. As much as 21 percent growth is expected in ninetieth percentile peak demand (per capita, exclusive of population growth). When generator losses are included in the demand, the ninetieth percentile peaks may increase up to 25 percent. As the climate warms, California's peak supply capacity will need to grow faster than the population. Substation capacity is projected to decrease an average of 2.7 percent. A 5C (9F) air temperature increase (the average increase predicted for hot days in August) will diminish the capacity of a fully loaded transmission line by an average of 7.5 percent. The potential exposure of transmission lines to wildfire is expected to increase with time. We have identified some lines whose probability of exposure to fire is expected to increase by as much as 40 percent. Up to 25 coastal power plants and 86 substations are at risk of flooding (or partial flooding) due to sea level rise.

  13. Antipsychotic effects on estimated 10 year coronary heart disease risk in the CATIE Schizophrenia Study

    PubMed Central

    Daumit, Gail L.; Goff, Donald C.; Meyer, Jonathan M.; Davis, Vicki G.; Nasrallah, Henry A.; McEvoy, Joseph P.; Rosenheck, Robert; Davis, Sonia M.; Hsiao, John K.; Stroup, T. Scott; Lieberman, Jeffrey A.

    2008-01-01

    Objective Persons with schizophrenia die earlier than the general population, in large part due to cardiovascular disease. The study objective was to examine effects of different antipsychotic treatments on estimates of 10-year coronary heart disease (CHD) risk calculated by the Framingham Heart Study formula. Method Change in 10-year risk for CHD was compared between treatment groups in 1125 patients followed for 18 months or until treatment discontinuation in the Clinical Antipsychotic Trials of Intervention Effectiveness (CATIE) Schizophrenia Trial. Results The covariate-adjusted mean change in 10-year CHD risk differed significantly between treatments. Olanzapine was associated with a 0.5% (SE 0.3) increase and quetiapine, a 0.3% (SE 0.3) increase; whereas risk decreased in patients treated with perphenazine, −0.5% (SE 0.3), risperidone, −0.6% (SE 0.3), and ziprasidone, −0.6% (SE 0.4). The difference in 10-year CHD risk between olanzapine and risperidone was statistically significant (p=0.004). Differences in estimated 10-year CHD risk between drugs were most marked in the tertile of subjects with a baseline CHD risk of at least 10%. Among individual CHD risk factors used in the Framingham formula, only total and HDL cholesterol levels differed between treatments. Conclusions These results indicate that the impact on 10-year CHD risk differs significantly between antipsychotic agents, with olanzapine producing the largest elevation in CHD risk of the agents studied in CATIE. PMID:18775645
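
Framingham-style formulas of the kind used above are Cox-type survival models: 10-year risk = 1 − S₀^exp(L − L̄), where L is a weighted sum of risk factors and S₀ is the baseline 10-year survival. The sketch below shows only the functional form; the coefficients and inputs are illustrative placeholders, not the published Framingham values:

```python
import math

def ten_year_risk(covariates, betas, mean_linear_predictor, baseline_survival):
    """Cox-model risk: 1 - S0 ** exp(L - L_mean), where L = sum(beta_i * x_i)."""
    L = sum(b * x for b, x in zip(betas, covariates))
    return 1.0 - baseline_survival ** math.exp(L - mean_linear_predictor)

# Illustrative coefficients and inputs only; real Framingham values differ
betas = [0.05, 0.02, -0.03]        # e.g. age (y), total chol, HDL (mg/dL)
covariates = [55, 210, 45]
risk = ten_year_risk(covariates, betas, mean_linear_predictor=5.5,
                     baseline_survival=0.95)
```

Because total and HDL cholesterol enter the linear predictor L, treatment-induced lipid changes translate directly into the risk differences reported in the trial.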

  14. Laypersons’ Responses to the Communication of Uncertainty Regarding Cancer Risk Estimates

    PubMed Central

    Han, Paul K. J.; Klein, William M. P.; Lehman, Thomas C.; Massett, Holly; Lee, Simon C.; Freedman, Andrew N.

    2009-01-01

    Objective To explore laypersons’ responses to the communication of uncertainty associated with individualized cancer risk estimates and to identify reasons for individual differences in these responses. Design A qualitative study was conducted using focus groups. Participants were informed about a new colorectal cancer risk prediction model, and presented with hypothetical individualized risk estimates using presentation formats varying in expressed uncertainty (range v. point estimate). Semistructured interviews explored participants’ responses to this information. Participants and Setting Eight focus groups were conducted with 48 adults aged 50 to 74 residing in 2 major US metropolitan areas, Chicago, IL and Washington, DC. Purposive sampling was used to recruit participants with a high school or greater education, some familiarity with information technology, and no personal or immediate family history of cancer. Results Participants identified several sources of uncertainty regarding cancer risk estimates, including missing data, limitations in accuracy and source credibility, and conflicting information. In comparing presentation formats, most participants reported greater worry and perceived risk with the range than with the point estimate, consistent with the phenomenon of “ambiguity aversion.” However, others reported the opposite effect or else indifference between formats. Reasons suggested by participants’ responses included individual differences in optimism and motivations to reduce feelings of vulnerability and personal lack of control. Perceptions of source credibility and risk mutability emerged as potential mediating factors. Conclusions Laypersons’ responses to the communication of uncertainty regarding cancer risk estimates differ, and include both heightened and diminished risk perceptions. These differences may be attributable to personality, cognitive, and motivational factors. PMID:19470720

  15. Assessment of the value of a genetic risk score in improving the estimation of coronary risk

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The American Heart Association has established criteria for the evaluation of novel markers of cardiovascular risk. In accordance with these criteria, we assessed the association between a multi-locus genetic risk score (GRS) and incident coronary heart disease (CHD), and evaluated whether this GRS ...

  16. Mobile Applications for Type 2 Diabetes Risk Estimation: a Systematic Review.

    PubMed

    Fijacko, Nino; Brzan, Petra Povalej; Stiglic, Gregor

    2015-10-01

    Screening for chronic diseases such as type 2 diabetes can be done using different methods and various risk tests. This study presents a review of type 2 diabetes risk estimation mobile applications, focusing on their functionality and the availability of information on the underlying risk calculators. Only 9 out of 31 reviewed mobile applications, featured in three major mobile application stores, disclosed the name of the risk calculator used for assessing the risk of type 2 diabetes. Even more concerning, none of the reviewed applications mentioned that they collect data from users to improve the performance of their risk estimation calculators, or offered users descriptive statistics of the results from users who had already used the application. For that purpose, the questionnaires used to calculate risk should be upgraded to include information on the most recent blood sugar level measurements from users. Although mobile applications represent great future potential for health applications, developers still do not put enough emphasis on informing the user of the underlying methods used to estimate the risk for a specific clinical condition. PMID:26303152

  17. Multiple automated headspace in-tube extraction for the accurate analysis of relevant wine aroma compounds and for the estimation of their relative liquid-gas transfer rates.

    PubMed

    Zapata, Julián; Lopez, Ricardo; Herrero, Paula; Ferreira, Vicente

    2012-11-30

    An automated headspace in-tube extraction (ITEX) method combined with multiple headspace extraction (MHE) has been developed to provide simultaneously information about the accurate wine content in 20 relevant aroma compounds and about their relative transfer rates to the headspace and hence about the relative strength of their interactions with the matrix. In the method, 5 μL (for alcohols, acetates and carbonyl alcohols) or 200 μL (for ethyl esters) of wine sample were introduced in a 2 mL vial, heated at 35°C and extracted with 32 (for alcohols, acetates and carbonyl alcohols) or 16 (for ethyl esters) 0.5 mL pumping strokes in four consecutive extraction and analysis cycles. The application of the classical theory of Multiple Extractions makes it possible to obtain a highly reliable estimate of the total amount of volatile compound present in the sample and a second parameter, β, which is simply the proportion of volatile not transferred to the trap in one extraction cycle, but that seems to be a reliable indicator of the actual volatility of the compound in that particular wine. A study with 20 wines of different types and 1 synthetic sample has revealed the existence of significant differences in the relative volatility of 15 out of 20 odorants. Differences are particularly intense for acetaldehyde and other carbonyls, but are also notable for alcohols and long chain fatty acid ethyl esters. It is expected that these differences, linked likely to sulphur dioxide and some unknown specific compositional aspects of the wine matrix, can be responsible for relevant sensory changes, and may even be the cause explaining why the same aroma composition can produce different aroma perceptions in two different wines. PMID:23102525

  18. Estimating risk at a Superfund site using passive sampling devices as biological surrogates in human health risk models

    PubMed Central

    Allan, Sarah E.; Sower, Gregory J.; Anderson, Kim A.

    2013-01-01

    Passive sampling devices (PSDs) sequester the freely dissolved fraction of lipophilic contaminants, mimicking passive chemical uptake and accumulation by biomembranes and lipid tissues. Public Health Assessments that inform the public about health risks from exposure to contaminants through consumption of resident fish are generally based on tissue data, which can be difficult to obtain and require destructive sampling. The purpose of this study is to apply PSD data in a Public Health Assessment to demonstrate that PSDs can be used as a biological surrogate to evaluate potential human health risks and elucidate spatio-temporal variations in risk. PSDs were used to measure polycyclic aromatic hydrocarbons (PAHs) in the Willamette River; upriver, downriver and within the Portland Harbor Superfund megasite for three years during wet and dry seasons. Based on an existing Public Health Assessment for this area, concentrations of PAHs in PSDs were substituted for fish tissue concentrations. PSD measured PAH concentrations captured the magnitude, range and variability of PAH concentrations reported for fish/shellfish from Portland Harbor. Using PSD results in place of fish data revealed an unacceptable risk level for cancer in all seasons but no unacceptable risk for non-cancer endpoints. Estimated cancer risk varied by several orders of magnitude based on season and location. Sites near coal tar contamination demonstrated the highest risk, particularly during the dry season and remediation activities. Incorporating PSD data into Public Health Assessments provides specific spatial and temporal contaminant exposure information that can assist public health professionals in evaluating human health risks. PMID:21741671
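
Substituting PSD-derived concentrations for fish tissue data, as done above, feeds into the standard chronic-daily-intake form of the excess lifetime cancer risk calculation. A sketch of that equation with placeholder exposure parameters (not the study's values):

```python
def excess_cancer_risk(conc_mg_per_kg, intake_kg_per_day, exposure_freq_days,
                       exposure_years, body_weight_kg, slope_factor,
                       averaging_days=70 * 365):
    """Excess lifetime cancer risk = slope factor * chronic daily intake
    (mg/kg-day), averaged over a nominal 70-year lifetime."""
    cdi = (conc_mg_per_kg * intake_kg_per_day * exposure_freq_days *
           exposure_years) / (body_weight_kg * averaging_days)
    return slope_factor * cdi

# Placeholder values for illustration (not the study's parameters)
risk = excess_cancer_risk(conc_mg_per_kg=0.05, intake_kg_per_day=0.017,
                          exposure_freq_days=365, exposure_years=30,
                          body_weight_kg=70, slope_factor=7.3)
```

A risk above roughly 1e-6 to 1e-4 is commonly treated as unacceptable in such assessments, which is why seasonal swings of several orders of magnitude in concentration matter.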

  19. Prophylactic radiotherapy against heterotopic ossification following internal fixation of acetabular fractures: a comparative estimate of risk

    PubMed Central

    Nasr, P; Yip, G; Scaife, J E; House, T; Thomas, S J; Harris, F; Owen, P J; Hull, P

    2014-01-01

    Objective: Radiotherapy (RT) is effective in preventing heterotopic ossification (HO) around acetabular fractures requiring surgical reconstruction. We audited outcomes and estimated risks from RT prophylaxis and the alternatives of indometacin or no prophylaxis. Methods: 34 patients underwent reconstruction of acetabular fractures through a posterior approach, followed by an 8-Gy single fraction. The mean age was 44 years. The mean time from surgery to RT was 1.1 days. The major RT risk is radiation-induced fatal cancer. The International Commission on Radiological Protection (ICRP) method was used to estimate risk, and compared with a method (Trott and Kemprad) specifically for estimating RT risk for benign disease. These were compared with risks associated with indometacin and no prophylaxis. Results: 28 patients (82%) developed no HO; 6 developed Brooker Class I; and none developed Class II–IV HO. The ICRP method suggests a risk of fatal cancer in the range of 1 in 1000 to 1 in 10,000; the Trott and Kemprad method suggests 1 in 3000. For younger patients, this may rise to 1 in 2000; and for elderly patients, it may fall to 1 in 6000. The risk of death from gastric bleeding or perforation from indometacin is 1 in 180 to 1 in 900 in older patients. Without prophylaxis, the risk of death from reoperation to remove HO is 1 in 4000 to 1 in 30,000. Conclusion: These results are encouraging, consistent with much larger series and endorse our multidisciplinary management. Risk estimates can be used in discussion with patients. Advances in knowledge: The risk from RT prophylaxis is small, it is safer than indometacin and substantially overlaps with the range for no prophylaxis. PMID:25089852
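
In its simplest form, an ICRP-style estimate scales effective dose by a nominal fatal-cancer risk coefficient on the order of 5% per sievert. A back-of-envelope sketch; the 0.01 Sv effective dose below is an assumed illustrative value, not a figure from the article:

```python
def fatal_cancer_risk(effective_dose_sv, risk_coeff_per_sv=0.05):
    """Nominal lifetime fatal-cancer risk: effective dose * risk coefficient."""
    return effective_dose_sv * risk_coeff_per_sv

# Assumed illustrative effective (whole-body) dose of 0.01 Sv; the 8 Gy
# prescription is a localized target dose, not an effective dose
risk = fatal_cancer_risk(0.01)
```

With these assumed inputs the result is 5e-4, i.e. 1 in 2000, the same order of magnitude as the range quoted in the abstract; age-dependent coefficients shift the estimate up or down as described.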

  20. Estimating the predictive ability of genetic risk models in simulated data based on published results from genome-wide association studies

    PubMed Central

    Kundu, Suman; Mihaescu, Raluca; Meijer, Catherina M. C.; Bakker, Rachel; Janssens, A. Cecile J. W.

    2014-01-01

    Background: There is increasing interest in investigating genetic risk models in empirical studies, but such studies are premature when the expected predictive ability of the risk model is low. We assessed how accurately the predictive ability of genetic risk models can be estimated in simulated data that are created based on the odds ratios (ORs) and frequencies of single-nucleotide polymorphisms (SNPs) obtained from genome-wide association studies (GWASs). Methods: We aimed to replicate published prediction studies that reported the area under the receiver operating characteristic curve (AUC) as a measure of predictive ability. We searched GWAS articles for all SNPs included in these models and extracted ORs and risk allele frequencies to construct genotypes and disease status for a hypothetical population. Using these hypothetical data, we reconstructed the published genetic risk models and compared their AUC values to those reported in the original articles. Results: The accuracy of the AUC values varied with the method used for the construction of the risk models. When logistic regression analysis was used to construct the genetic risk model, AUC values estimated by the simulation method were similar to the published values with a median absolute difference of 0.02 [range: 0.00, 0.04]. This difference was 0.03 [range: 0.01, 0.06] and 0.05 [range: 0.01, 0.08] for unweighted and weighted risk scores, respectively. Conclusions: The predictive ability of genetic risk models can be estimated using simulated data based on results from GWASs. Simulation methods can be useful to estimate the predictive ability in the absence of empirical data and to decide whether empirical investigation of genetic risk models is warranted. PMID:24982668
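
The simulation approach above can be sketched as: draw genotypes under Hardy-Weinberg equilibrium from published risk-allele frequencies, assign disease status through a logistic model whose coefficients are the log ORs, then compute the AUC of the weighted risk score. A minimal version with illustrative ORs and frequencies (not taken from a specific GWAS):

```python
import math, random

random.seed(42)

def simulate_auc(ors, freqs, n=4000, base_log_odds=-2.0):
    """Simulate genotypes/disease from ORs and allele frequencies; return
    the AUC of the weighted risk score (risk-allele count * log OR)."""
    scores, labels = [], []
    for _ in range(n):
        # Risk-allele count per SNP under Hardy-Weinberg equilibrium
        g = [(random.random() < p) + (random.random() < p) for p in freqs]
        score = sum(gi * math.log(o) for gi, o in zip(g, ors))
        logit = base_log_odds + score
        labels.append(random.random() < 1 / (1 + math.exp(-logit)))
        scores.append(score)
    # AUC via the Mann-Whitney (rank-sum) formulation
    pos = [s for s, y in zip(scores, labels) if y]
    neg = [s for s, y in zip(scores, labels) if not y]
    wins = sum((p > q) + 0.5 * (p == q) for p in pos for q in neg)
    return wins / (len(pos) * len(neg))

auc = simulate_auc(ors=[1.4, 1.2, 1.3], freqs=[0.3, 0.5, 0.2])
```

With only a few modest-effect SNPs the simulated AUC stays close to 0.5, which is exactly the kind of result the authors argue should discourage premature empirical studies.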

  1. Inhalation exposure or body burden? Better way of estimating risk--An application of PBPK model.

    PubMed

    Majumdar, Dipanjali; Dutta, Chirasree; Sen, Subha

    2016-01-01

    We aim to establish a new way of estimating risk from internal dose or body burden due to benzene exposure in human subjects using a physiologically based pharmacokinetic (PBPK) model. We also intend to verify its applicability to human subjects exposed to different levels of benzene. We estimated personal inhalation exposure to benzene for two occupational groups, petrol pump workers and car drivers, with respect to a control group that was only environmentally exposed. Benzene in personal air was pre-concentrated on charcoal followed by chemical desorption and analysis by gas chromatography equipped with a flame ionization detector (GC-FID). We selected urinary trans,trans-muconic acid (t,t-MA) as the biomarker of benzene exposure and measured its concentration using solid phase extraction followed by high performance liquid chromatography (HPLC). Our estimated inhalation exposure of benzene was 137.5, 97.9 and 38.7 μg/m(3) for petrol pump workers, car drivers and the environmentally exposed control group respectively, which resulted in urinary t,t-MA levels of 145.4±55.3, 112.6±63.5 and 60.0±34.9 μg g(-1) of creatinine for the groups in the same order. We derived an estimate of body burden from urinary metabolite concentration using the PBPK model. Estimation of the internal dose or body burden of benzene in human subjects has been made for the first time by the measurement of t,t-MA as a urinary metabolite using a physiologically based pharmacokinetic (PBPK) model as a tool. The weight-adjusted total body burden of benzene was estimated to be 17.6, 11.1 and 5.0 μg kg(-1) of body weight for petrol pump workers, drivers and the environmentally exposed control group, respectively, using this method. We computed the carcinogenic risk using both the estimated internal benzene body burden and external exposure values using the conventional method. Our study result shows that internal dose or body burden is not proportional to the level of exposure but rather have a

  2. Accounting for Ecosystem Alteration Doubles Estimates of Conservation Risk in the Conterminous United States

    PubMed Central

    Swaty, Randy; Blankenship, Kori; Hagen, Sarah; Fargione, Joseph; Smith, Jim; Patton, Jeannie

    2011-01-01

    Previous national and global conservation assessments have relied on habitat conversion data to quantify conservation risk. However, in addition to habitat conversion to crop production or urban uses, ecosystem alteration (e.g., from logging, conversion to plantations, biological invasion, or fire suppression) is a large source of conservation risk. We add data quantifying ecosystem alteration on unconverted lands to arrive at a more accurate depiction of conservation risk for the conterminous United States. We quantify ecosystem alteration using a recent national assessment based on remote sensing of current vegetation compared with modeled reference natural vegetation conditions. Highly altered (but not converted) ecosystems comprise 23% of the conterminous United States, such that the number of critically endangered ecoregions in the United States is 156% higher than when calculated using habitat conversion data alone. Increased attention to natural resource management will be essential to address widespread ecosystem alteration and reduce conservation risk. PMID:21850248

  3. Accounting for ecosystem alteration doubles estimates of conservation risk in the conterminous United States.

    PubMed

    Swaty, Randy; Blankenship, Kori; Hagen, Sarah; Fargione, Joseph; Smith, Jim; Patton, Jeannie

    2011-01-01

    Previous national and global conservation assessments have relied on habitat conversion data to quantify conservation risk. However, in addition to habitat conversion to crop production or urban uses, ecosystem alteration (e.g., from logging, conversion to plantations, biological invasion, or fire suppression) is a large source of conservation risk. We add data quantifying ecosystem alteration on unconverted lands to arrive at a more accurate depiction of conservation risk for the conterminous United States. We quantify ecosystem alteration using a recent national assessment based on remote sensing of current vegetation compared with modeled reference natural vegetation conditions. Highly altered (but not converted) ecosystems comprise 23% of the conterminous United States, such that the number of critically endangered ecoregions in the United States is 156% higher than when calculated using habitat conversion data alone. Increased attention to natural resource management will be essential to address widespread ecosystem alteration and reduce conservation risk. PMID:21850248

  4. FY 2000 Buildings Energy Savings Estimates under Uncertainty: Developing Approaches for Incorporating Risk into Buildings Program Energy Efficiency Estimates

    SciTech Connect

    Anderson, Dave M.

    2002-11-18

    This report is one of two reports that re-examine the forecasted impact of individual programs currently within the Buildings Technology Program (BT) and the Weatherization and Intergovernmental Program (WIP) that appeared in the FY2000 Presidential Budget request. This report develops potential methods for allowing inherent risk to be captured in the program benefits analysis. Note that the FY2000 budget request was originally analyzed under the former Office of Building Technology, State and Community Programs (BTS), where BT and WIP were previously combined. Throughout the document, reference will be made to the predecessor of the BT and WIP programs, BTS, as FY2000 reflected that organization. A companion report outlines the effects of re-estimating the FY 2000 budget request based on overlaying program data from subsequent years, essentially revised out-year forecasts. That report shows that year-to-year long-term projections of primary energy savings can vary widely as models improve and programs change. Those point estimates are not influenced by uncertainty or risk. This report develops potential methods for allowing inherent risk to affect the benefits analysis via Monte Carlo simulation.
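
Capturing inherent risk through Monte Carlo simulation, as proposed, amounts to sampling each uncertain program input from a distribution and reporting percentiles of the resulting savings instead of a single point estimate. A toy sketch with invented distributions (not the report's program models):

```python
import random, statistics

random.seed(1)

def simulate_savings(n=10_000):
    """Toy Monte Carlo: savings = adoption fraction * per-unit savings,
    with both inputs drawn from assumed uncertainty distributions."""
    draws = []
    for _ in range(n):
        adoption = random.triangular(0.1, 0.5, 0.3)   # fraction of stock
        per_unit = random.gauss(2.0, 0.4)             # TWh at full adoption
        draws.append(adoption * per_unit)
    return draws

draws = simulate_savings()
deciles = statistics.quantiles(draws, n=10)
p10, p90 = deciles[0], deciles[-1]   # 10th and 90th percentile savings
```

Reporting the p10-p90 band communicates how much the forecast could move as programs and models change, which a single out-year point estimate cannot.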

  5. Estimated Risk Level of Unified Stereotactic Body Radiation Therapy Dose Tolerance Limits for Spinal Cord.

    PubMed

    Grimm, Jimm; Sahgal, Arjun; Soltys, Scott G; Luxton, Gary; Patel, Ashish; Herbert, Scott; Xue, Jinyu; Ma, Lijun; Yorke, Ellen; Adler, John R; Gibbs, Iris C

    2016-04-01

    A literature review of more than 200 stereotactic body radiation therapy spine articles from the past 20 years found only a single article that provided dose-volume data and outcomes for each spinal cord of a clinical dataset: the Gibbs 2007 article (Gibbs et al, 2007(1)), which essentially contains the first 100 stereotactic body radiation therapy (SBRT) spine treatments from Stanford University Medical Center. The dataset is modeled and compared in detail to the rest of the literature review, which found 59 dose tolerance limits for the spinal cord in 1-5 fractions. We partitioned these limits into a unified format of high-risk and low-risk dose tolerance limits. To estimate the corresponding risk level of each limit we used the Gibbs 2007 clinical spinal cord dose-volume data for 102 spinal metastases in 74 patients treated by spinal radiosurgery. In all, 50 of the patients were previously irradiated to a median dose of 40 Gy in 2-3 Gy fractions and 3 patients developed treatment-related myelopathy. These dose-volume data were digitized into the dose-volume histogram (DVH) Evaluator software tool where parameters of the probit dose-response model were fitted using the maximum likelihood approach (Jackson et al, 1995(3)). Based on this limited dataset, for de novo cases the unified low-risk dose tolerance limits yielded an estimated risk of spinal cord injury of ≤1% in 1-5 fractions, and the high-risk limits yielded an estimated risk of ≤3%. The QUANTEC Dmax limits of 13 Gy in a single fraction and 20 Gy in 3 fractions had less than 1% risk estimated from this dataset, so we consider these among the low-risk limits. In the previously irradiated cohort, the estimated risk levels for 10- and 14-Gy maximum cord dose limits in 5 fractions are 0.4% and 0.6%, respectively. Longer follow-up and more patients are required to improve the risk estimates and provide more complete validation. PMID:27000514
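
Fitting a probit dose-response model by maximum likelihood, as described, can be sketched by minimizing the binomial negative log-likelihood over the model parameters (here a tolerance dose D50 and a slope). The toy dose-volume data below are invented, not the Gibbs cohort, and a crude grid search stands in for a proper optimizer:

```python
import math

def ncdf(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def neg_log_lik(d50, slope, doses, events, totals):
    """Binomial negative log-likelihood for a probit dose-response:
    P(complication | dose) = Phi(slope * (dose - d50))."""
    nll = 0.0
    for d, k, n in zip(doses, events, totals):
        p = min(max(ncdf(slope * (d - d50)), 1e-9), 1.0 - 1e-9)
        nll -= k * math.log(p) + (n - k) * math.log(1.0 - p)
    return nll

# Invented toy cohort: max cord dose (Gy), events, patients at that dose
doses = [8, 10, 12, 14, 16]
events = [0, 0, 1, 1, 1]
totals = [30, 25, 20, 15, 10]

# Crude grid search over (D50, slope) standing in for a real optimizer
best = min(((neg_log_lik(d, s, doses, events, totals), d, s)
            for d in [0.5 * i for i in range(30, 81)]      # 15..40 Gy
            for s in [0.05 * i for i in range(1, 41)]),    # 0.05..2.0
           key=lambda t: t[0])
nll_hat, d50_hat, slope_hat = best
```

Evaluating the fitted curve at a candidate dose limit then yields the kind of per-limit risk estimates (e.g. ≤1% vs ≤3%) reported above.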

  6. Estimating the long-term phosphorus accretion rate in the Everglades: A Bayesian approach with risk assessment

    NASA Astrophysics Data System (ADS)

    Qian, Song S.; Richardson, Curtis J.

    Using wetlands as a sink of nutrients, phosphorus in particular, is becoming an increasingly attractive alternative to conventional wastewater treatment technology. In this paper, we briefly review the mechanism of phosphorus retention in wetlands, as well as previous modeling efforts. A Bayesian method is then proposed for estimating the long-term phosphorus accretion rate in wetlands through a piecewise linear model of outflow phosphorus concentration and phosphorus mass loading rate. The Bayesian approach was used for its simplicity in computation and its ability to accurately represent uncertainty. Applied to an Everglades wetland, the Bayesian method not only produced the probability distribution of the long-term phosphorus accretion rate but also generated a relationship between acceptable level of "risk" and optimal phosphorus mass loading rate for the proposed constructed wetlands in south Florida. The latter is a useful representation of uncertainty which is of interest to decision makers.
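
The piecewise linear model above (a flat background outflow concentration below a threshold loading, rising linearly above it) admits a simple Bayesian treatment: evaluate the likelihood of the changepoint on a grid and normalize. The sketch below uses synthetic data and, for brevity, holds the baseline and slope at their known synthetic values:

```python
import math, random

random.seed(0)

# Synthetic data: outflow P concentration is flat below a loading threshold
# of ~2 g/m2/yr and rises linearly above it
loads = [0.5 * i for i in range(1, 13)]
conc = [10.0 + max(0.0, l - 2.0) * 15.0 + random.gauss(0, 2) for l in loads]

def log_lik(threshold, baseline=10.0, slope=15.0, sigma=2.0):
    """Gaussian log-likelihood of the piecewise linear changepoint model."""
    ll = 0.0
    for l, c in zip(loads, conc):
        mu = baseline + max(0.0, l - threshold) * slope
        ll += -0.5 * ((c - mu) / sigma) ** 2
    return ll

# Grid posterior over the changepoint under a flat prior
grid = [0.5 + 0.1 * i for i in range(41)]          # thresholds 0.5..4.5
lls = [log_lik(t) for t in grid]
m = max(lls)                                       # log-sum-exp stabilization
weights = [math.exp(ll - m) for ll in lls]
total = sum(weights)
posterior = [w / total for w in weights]
t_mean = sum(t * p for t, p in zip(grid, posterior))
```

The posterior over the changepoint is what supports the risk-versus-loading tradeoff the paper describes: the probability of exceeding an outflow standard at a given loading can be read directly from the posterior mass above that loading.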

  7. Measurement of natural radionuclides in Malaysian bottled mineral water and consequent health risk estimation

    NASA Astrophysics Data System (ADS)

    Priharti, W.; Samat, S. B.; Yasir, M. S.

    2015-09-01

    The radionuclides 226Ra, 232Th and 40K were measured in ten mineral water samples. From the radioactivity obtained, ingestion doses for infants, children and adults were calculated, and the cancer risk for adults was estimated. Results showed that the calculated ingestion doses for the three age categories are much lower than the average worldwide ingestion exposure of 0.29 mSv/y, and the estimated cancer risk is much lower than the cancer risk of 8.40 × 10⁻³ (estimated from the total natural radiation dose of 2.40 mSv/y). The present study concludes that the bottled mineral water produced in Malaysia is safe for daily human consumption.
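
The ingestion doses quoted follow the standard form: dose = activity concentration × annual intake × dose conversion factor, summed over radionuclides. A sketch for a single radionuclide; the activity concentration and intake are illustrative, while 2.8e-7 Sv/Bq is the ICRP adult ingestion dose coefficient for 226Ra:

```python
def ingestion_dose_msv(activity_bq_per_l, intake_l_per_year, dcf_sv_per_bq):
    """Annual committed effective dose (mSv) from one radionuclide in water."""
    return activity_bq_per_l * intake_l_per_year * dcf_sv_per_bq * 1e3

# Illustrative: 226Ra at 0.05 Bq/L, 730 L/y adult intake,
# ICRP adult ingestion dose coefficient for 226Ra = 2.8e-7 Sv/Bq
dose = ingestion_dose_msv(0.05, 730, 2.8e-7)
```

Summing such terms over 226Ra, 232Th and 40K (with age-specific coefficients and intakes) reproduces the per-age-group doses compared against the 0.29 mSv/y worldwide average above.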

  8. Measurement of natural radionuclides in Malaysian bottled mineral water and consequent health risk estimation

    SciTech Connect

    Priharti, W.; Samat, S. B.; Yasir, M. S.

    2015-09-25

    The radionuclides 226Ra, 232Th and 40K were measured in ten mineral water samples. From the radioactivity obtained, ingestion doses for infants, children and adults were calculated, and the cancer risk for adults was estimated. Results showed that the calculated ingestion doses for the three age categories are much lower than the average worldwide ingestion exposure of 0.29 mSv/y, and the estimated cancer risk is much lower than the cancer risk of 8.40 × 10⁻³ (estimated from the total natural radiation dose of 2.40 mSv/y). The present study concludes that the bottled mineral water produced in Malaysia is safe for daily human consumption.

  9. Estimates of radiation doses in tissue and organs and risk of excess cancer in the single-course radiotherapy patients treated for ankylosing spondylitis in England and Wales

    SciTech Connect

    Fabrikant, J.I.; Lyman, J.T.

    1982-02-01

    The estimates of absorbed doses of x rays and excess risk of cancer in bone marrow and heavily irradiated sites are extremely crude and are based on very limited data and on a number of assumptions. Some of these assumptions may later prove to be incorrect, but it is probable that they are correct to within a factor of 2. The excess cancer risk estimates calculated compare well with the most reliable epidemiological surveys thus far studied. This is particularly important for cancers of heavily irradiated sites with long latent periods. The mean followup period for the patients was 16.2 y, and an increase in cancers of heavily irradiated sites may appear in these patients in the 1970s in tissues and organs with long latent periods for the induction of cancer. The accuracy of these estimates is severely limited by the inadequacy of information on doses absorbed by the tissues at risk in the irradiated patients. The information on absorbed dose is essential for an accurate assessment of dose-cancer incidence analysis. Furthermore, in this valuable series of irradiated patients, the information on radiation dosimetry on the radiotherapy charts is central to any reliable determination of somatic risks of radiation with regard to carcinogenesis in man. The work necessary to obtain these data is under way; only when they are available can more precise estimates of risk of cancer induction by radiation in man be obtained.

  10. A methodology for estimating risks associated with landslides of contaminated soil into rivers.

    PubMed

    Göransson, Gunnel; Norrman, Jenny; Larson, Magnus; Alén, Claes; Rosén, Lars

    2014-02-15

    Urban areas adjacent to surface water are exposed to soil movements such as erosion and slope failures (landslides). A landslide is a potential mechanism for mobilisation and spreading of pollutants. This mechanism is in general not included in environmental risk assessments for contaminated sites, and the consequences associated with contamination in the soil are typically not considered in landslide risk assessments. This study suggests a methodology to estimate the environmental risks associated with landslides in contaminated sites adjacent to rivers. The methodology is probabilistic and allows for datasets with large uncertainties and the use of expert judgements, providing quantitative estimates of probabilities for defined failures. The approach is illustrated by a case study along the river Göta Älv, Sweden, where failures are defined and probabilities for those failures are estimated. Failures are defined from a pollution perspective and in terms of exceeding environmental quality standards (EQSs) and acceptable contaminant loads. Models are then suggested to estimate probabilities of these failures. A landslide analysis is carried out to assess landslide probabilities based on data from a recent landslide risk classification study along the river Göta Älv. The suggested methodology is meant to be a supplement to either landslide risk assessment (LRA) or environmental risk assessment (ERA), providing quantitative estimates of the risks associated with landslide in contaminated sites. The proposed methodology can also act as a basis for communication and discussion, thereby contributing to intersectoral management solutions. From the case study it was found that the defined failures are governed primarily by the probability of a landslide occurring. The overall probabilities for failure are low; however, if a landslide occurs the probabilities of exceeding EQS are high and the probability of having at least a 10% increase in the contamination load
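    The core arithmetic of such a probabilistic methodology, combining a landslide probability with the conditional probability of a pollution failure given a slide, can be sketched as below. The function name and all numeric values are illustrative assumptions, not figures from the Göta Älv case study:

```python
# Hypothetical sketch of a fault-tree-style failure probability:
# P(failure) = P(landslide) * P(exceed EQS | landslide).
# Both input probabilities are illustrative, not from the study.

def p_failure(p_landslide, p_exceed_given_slide):
    """Overall probability that a landslide mobilises contamination
    above an environmental quality standard (EQS)."""
    return p_landslide * p_exceed_given_slide

p_slide = 1e-3    # annual landslide probability (assumed)
p_exceed = 0.9    # P(EQS exceeded | landslide) (assumed)
overall = p_failure(p_slide, p_exceed)
print(overall)    # low overall risk despite high conditional risk
```

This mirrors the case-study finding that overall failure probabilities are low while the conditional probabilities given a landslide are high.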

  11. Estimated human health risks of disposing of nonhazardous oil field waste in salt caverns

    SciTech Connect

    Tomasko, D.; Elcock, D.; Veil, J.

    1997-09-01

    Argonne National Laboratory (ANL) has completed an evaluation of the possibility that adverse human health effects (carcinogenic and noncarcinogenic) could result from exposure to contaminants released from nonhazardous oil field wastes (NOW) disposed in domal salt caverns. In this assessment, several steps were used to evaluate potential human health risks: identifying potential contaminants of concern, determining how humans could be exposed to these contaminants, assessing the contaminants' toxicities, estimating contaminant intakes, and, finally, calculating human cancer and noncancer risks.
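    The intake-to-risk arithmetic behind steps like these is standard and can be sketched briefly. All parameter values below (concentration, slope factor, reference dose, exposure assumptions) are illustrative placeholders, not numbers from the ANL assessment:

```python
# Illustrative sketch of the usual risk-assessment calculations:
# chronic daily intake (CDI), cancer risk = CDI x slope factor,
# and noncancer hazard quotient = CDI / reference dose.
# Every numeric value here is a placeholder assumption.

def chronic_daily_intake(conc_mg_per_l, intake_l_per_day, ef_days_per_year,
                         ed_years, bw_kg, at_days):
    """CDI in mg/(kg*day): contaminant concentration times intake rate,
    scaled by exposure frequency/duration and averaged over body weight
    and averaging time."""
    return (conc_mg_per_l * intake_l_per_day * ef_days_per_year * ed_years) \
        / (bw_kg * at_days)

cdi = chronic_daily_intake(0.005, 2.0, 350, 30, 70, 70 * 365)
cancer_risk = cdi * 0.1        # slope factor in (mg/(kg*day))^-1 (assumed)
hazard_quotient = cdi / 0.004  # reference dose in mg/(kg*day) (assumed)
print(cancer_risk, hazard_quotient)
```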

  12. Science policy choices and the estimation of cancer risk associated with exposure to TCDD

    SciTech Connect

    Gough, M.

    1988-09-01

    United States regulatory agencies use no-threshold models for estimating carcinogenic risks. Other countries use no-threshold models for carcinogens that are genotoxic and threshold models for carcinogens that are not genotoxic, such as 2,3,7,8-tetrachlorodibenzo-p-dioxin (TCDD or dioxin). The U.S. Environmental Protection Agency has proposed a revision of the carcinogenic potency estimate for TCDD that is based on neither a threshold nor a no-threshold model; instead, it is a compromise between risk numbers generated by the two irreconcilably different models. This paper discusses the revision and its implications.

  13. Examining the effects of air pollution composition on within region differences in PM2.5 mortality risk estimates

    EPA Science Inventory

    Multi-city population-based epidemiological studies have observed significant heterogeneity in both the magnitude and direction of city-specific risk estimates, but tended to focus on regional differences in PM2.5 mortality risk estimates. Interpreting differences in risk estimat...

  14. Estimation of flood risk for cultural heritage in an art city

    NASA Astrophysics Data System (ADS)

    Arrighi, Chiara; Brugioni, Marcello; Franceschini, Serena; Castelli, Fabio; Mazzanti, Bernardo

    2015-04-01

    Flood risk assessment in art cities poses many challenges because of the presence of cultural heritage at risk, a damage category whose value is hard to monetize. In fact, valuing cultural assets is a complex task, usually requiring more effort than a rough estimation of restoration costs. The lack of an adequate risk evaluation of cultural assets may also lead to enormous difficulties and political problems in the implementation of structural mitigation solutions. The aim of the work is to perform a first analysis of the risk to cultural heritage while avoiding a full quantification of exposure. Here we present a case study of broad importance, the art city of Florence (Italy), affected by a devastating flood in 1966. In previous studies the estimated flood risk, neglecting damages to cultural heritage, was about 53 million €/year. Nevertheless, Florence hosts 176 buildings officially classified as cultural heritage and thousands of paintings, sculptures and ancient books. Proceeding similarly to the commonly accepted flood risk assessment method, the annual expected loss in terms of cultural heritage/artworks is estimated.
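    The annual expected loss that the abstract refers to is typically obtained by integrating damage against annual exceedance probability over a set of flood scenarios. A minimal sketch follows; the return periods and damage figures are invented for illustration, not Florence data:

```python
# Minimal sketch of an expected-annual-loss (EAL) calculation:
# trapezoidal integration of scenario damage over annual exceedance
# probability (1 / return period). All numbers are illustrative.

def expected_annual_loss(return_periods_years, damages):
    """EAL from (return period, damage) scenario pairs."""
    probs = [1.0 / t for t in return_periods_years]   # exceedance probabilities
    pairs = sorted(zip(probs, damages))               # ascending probability
    eal = 0.0
    for (p0, d0), (p1, d1) in zip(pairs, pairs[1:]):
        eal += 0.5 * (d0 + d1) * (p1 - p0)            # trapezoid rule
    return eal

eal = expected_annual_loss([30, 100, 200, 500],
                           [10e6, 120e6, 300e6, 500e6])
print(f"EAL ~ {eal/1e6:.2f} M/year")
```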

  15. Estimate of Space Radiation-Induced Cancer Risks for International Space Station Orbits

    NASA Technical Reports Server (NTRS)

    Wu, Honglu; Atwell, William; Cucinotta, Francis A.; Yang, Chui-hsu

    1996-01-01

    Excess cancer risks from exposures to space radiation are estimated for various orbits of the International Space Station (ISS). Organ exposures are computed with the transport codes, BRYNTRN and HZETRN, and the computerized anatomical male and computerized anatomical female models. Cancer risk coefficients in the National Council on Radiation Protection and Measurements report No. 98 are used to generate lifetime excess cancer incidence and cancer mortality after a one-month mission to ISS. The generated data are tabulated to serve as a quick reference for assessment of radiation risk to astronauts on ISS missions.

  16. Estimate of Space Radiation-Induced Cancer Risks for International Space Station Orbits

    SciTech Connect

    Wu, H.; Atwell, W.; Cucinotta, F.A.; Yang, C.

    1996-03-01

    Excess cancer risks from exposures to space radiation are estimated for various orbits of the International Space Station (ISS). Organ exposures are computed with the transport codes, BRYNTRN and HZETRN, and the computerized anatomical male and computerized anatomical female models. Cancer risk coefficients in the National Council on Radiation Protection and Measurements report No. 98 are used to generate lifetime excess cancer incidence and cancer mortality after a one-month mission to ISS. The generated data are tabulated to serve as a quick reference for assessment of radiation risk to astronauts on ISS missions.

  17. Estimating cancer risk from dental cone-beam CT exposures based on skin dosimetry.

    PubMed

    Pauwels, Ruben; Cockmartin, Lesley; Ivanauskaité, Deimante; Urbonienė, Ausra; Gavala, Sophia; Donta, Catherine; Tsiklakis, Kostas; Jacobs, Reinhilde; Bosmans, Hilde; Bogaerts, Ria; Horner, Keith

    2014-07-21

    The aim of this study was to measure entrance skin doses on patients undergoing cone-beam computed tomography (CBCT) examinations, to establish conversion factors between skin and organ doses, and to estimate cancer risk from CBCT exposures. 266 patients (age 8-83) were included, involving three imaging centres. CBCT scans were acquired using the SCANORA 3D (Soredex, Tuusula, Finland) and NewTom 9000 (QR, Verona, Italy). Eight thermoluminescent dosimeters were attached to the patient's skin at standardized locations. Using previously published organ dose estimations on various CBCTs with an anthropomorphic phantom, correlation factors to convert skin dose to organ doses were calculated and applied to estimate patient organ doses. The BEIR VII age- and gender-dependent dose-risk model was applied to estimate the lifetime attributable cancer risk. For the SCANORA 3D, average skin doses over the eight locations varied between 484 and 1788 µGy. For the NewTom 9000 the range was between 821 and 1686 µGy for Centre 1 and between 292 and 2325 µGy for Centre 2. Entrance skin dose measurements demonstrated the combined effect of exposure and patient factors on the dose. The lifetime attributable cancer risk, expressed as the probability to develop a radiation-induced cancer, varied between 2.7 per million (age >60) and 9.8 per million (age 8-11) with an average of 6.0 per million. On average, the risk for female patients was 40% higher. The estimated radiation risk was primarily influenced by the age at exposure and the gender, pointing out the continuing need for justification and optimization of CBCT exposures, with a specific focus on children. PMID:24957710
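    The dose pipeline described here (measured skin dose, phantom-derived conversion to organ doses, then age- and sex-dependent lifetime attributable risk) can be sketched as below. All conversion factors and risk coefficients are placeholder assumptions, not the published SEDENTEXCT or BEIR VII values:

```python
# Hedged sketch: skin dose -> organ doses via conversion factors ->
# lifetime attributable risk (LAR). Every factor below is a placeholder,
# not a value from the paper or from BEIR VII.

skin_dose_uGy = 1200.0  # measured entrance skin dose (illustrative)

# hypothetical skin-to-organ conversion factors (organ dose per skin dose)
conversion = {"thyroid": 0.8, "salivary_glands": 1.5, "brain": 0.4}

# hypothetical LAR coefficients: cancer cases per million per mGy organ dose
lar_per_mGy = {"thyroid": 1.0, "salivary_glands": 0.5, "brain": 0.2}

organ_dose_mGy = {o: skin_dose_uGy * f / 1000.0 for o, f in conversion.items()}
lar_total = sum(organ_dose_mGy[o] * lar_per_mGy[o] for o in organ_dose_mGy)
print(f"estimated LAR: {lar_total:.2f} per million")
```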

  18. Estimating cancer risk from dental cone-beam CT exposures based on skin dosimetry

    NASA Astrophysics Data System (ADS)

    Pauwels, Ruben; Cockmartin, Lesley; Ivanauskaité, Deimante; Urbonienė, Ausra; Gavala, Sophia; Donta, Catherine; Tsiklakis, Kostas; Jacobs, Reinhilde; Bosmans, Hilde; Bogaerts, Ria; Horner, Keith; SEDENTEXCT Project Consortium, The

    2014-07-01

    The aim of this study was to measure entrance skin doses on patients undergoing cone-beam computed tomography (CBCT) examinations, to establish conversion factors between skin and organ doses, and to estimate cancer risk from CBCT exposures. 266 patients (age 8-83) were included, involving three imaging centres. CBCT scans were acquired using the SCANORA 3D (Soredex, Tuusula, Finland) and NewTom 9000 (QR, Verona, Italy). Eight thermoluminescent dosimeters were attached to the patient's skin at standardized locations. Using previously published organ dose estimations on various CBCTs with an anthropomorphic phantom, correlation factors to convert skin dose to organ doses were calculated and applied to estimate patient organ doses. The BEIR VII age- and gender-dependent dose-risk model was applied to estimate the lifetime attributable cancer risk. For the SCANORA 3D, average skin doses over the eight locations varied between 484 and 1788 µGy. For the NewTom 9000 the range was between 821 and 1686 µGy for Centre 1 and between 292 and 2325 µGy for Centre 2. Entrance skin dose measurements demonstrated the combined effect of exposure and patient factors on the dose. The lifetime attributable cancer risk, expressed as the probability to develop a radiation-induced cancer, varied between 2.7 per million (age >60) and 9.8 per million (age 8-11) with an average of 6.0 per million. On average, the risk for female patients was 40% higher. The estimated radiation risk was primarily influenced by the age at exposure and the gender, pointing out the continuing need for justification and optimization of CBCT exposures, with a specific focus on children.

  19. Estimation of the Optimal Statistical Quality Control Sampling Time Intervals Using a Residual Risk Measure

    PubMed Central

    Hatjimihail, Aristides T.

    2009-01-01

    Background An open problem in clinical chemistry is the estimation of the optimal sampling time intervals for the application of statistical quality control (QC) procedures that are based on the measurement of control materials. This is a probabilistic risk assessment problem that requires reliability analysis of the analytical system, and the estimation of the risk caused by the measurement error. Methodology/Principal Findings Assuming that the states of the analytical system are the reliability state, the maintenance state, the critical-failure modes and their combinations, we can define risk functions based on the mean time of the states, their measurement error and the medically acceptable measurement error. Consequently, a residual risk measure rr can be defined for each sampling time interval. The rr depends on the state probability vectors of the analytical system, the state transition probability matrices before and after each application of the QC procedure and the state mean time matrices. As optimal sampling time intervals can be defined those minimizing a QC related cost measure while the rr is acceptable. I developed an algorithm that estimates the rr for any QC sampling time interval of a QC procedure applied to analytical systems with an arbitrary number of critical-failure modes, assuming any failure time and measurement error probability density function for each mode. Furthermore, given the acceptable rr, it can estimate the optimal QC sampling time intervals. Conclusions/Significance It is possible to rationally estimate the optimal QC sampling time intervals of an analytical system to sustain an acceptable residual risk with the minimum QC related cost. For the optimization the reliability analysis of the analytical system and the risk analysis of the measurement error are needed. PMID:19513124

  20. An Evidenced-Based Approach for Estimating Decompression Sickness Risk in Aircraft Operations

    NASA Technical Reports Server (NTRS)

    Robinson, Ronald R.; Dervay, Joseph P.; Conkin, Johnny

    1999-01-01

    Estimating the risk of decompression sickness (DCS) in aircraft operations remains a challenge, making the reduction of this risk through the development of operationally acceptable denitrogenation schedules difficult. In addition, the medical recommendations that are promulgated are often not supported by rigorous evaluation of the available data; instead they are arrived at by negotiation with the aircraft operations community, adapted from other similar aircraft operations, or based upon the opinion of the local medical community. We present a systematic approach for defining DCS risk in aircraft operations by analyzing the data available for a specific aircraft, flight profile, and aviator population. Once the risk of DCS in a particular aircraft operation is known, appropriate steps can be taken to reduce this risk to a level acceptable to the applicable aviation community. Using this technique will allow any aviation medical community to arrive at the best estimate of DCS risk for its specific mission and aviator population and will allow systematic reevaluation of decisions regarding DCS risk reduction when additional data become available.

  1. Estimated risk from exposure to radon decay products in US homes

    SciTech Connect

    Nero, A.V. Jr.

    1986-05-01

    Recent analyses now permit direct estimation of the risks of lung cancer from radon decay products in US homes. Analysis of data from indoor monitoring in single-family homes yields a tentative frequency distribution of annual-average {sup 222}Rn concentrations averaging 55 Bq m{sup -3} and having 2% of homes exceeding 300 Bq m{sup -3}. Application of the results of occupational epidemiological studies, either directly or using recent advances in lung dosimetry, to indoor exposures suggests that the average indoor concentration entails a lifetime risk of lung cancer of 0.3%, or about 10% of the total risk of lung cancer. The risk to individuals occupying the homes with 300 Bq m{sup -3} or more for their lifetimes is estimated to exceed 2%, with risks from the homes with thousands of Bq m{sup -3} correspondingly higher, even exceeding the total risk of premature death due to cigarette smoking. The potential for such average and high-level risks in ordinary homes forces development of a new perspective on environmental exposures.
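    Indoor radon concentrations are conventionally modelled as lognormal, and under that assumption a geometric mean and geometric standard deviation determine the fraction of homes above any threshold. The GM/GSD values below are illustrative choices that roughly reproduce the roughly 2% above 300 Bq/m3 quoted in the abstract, not fitted parameters from the study:

```python
# Sketch: fraction of homes above a radon threshold under an assumed
# lognormal distribution. GM and GSD values are illustrative only.
import math

def fraction_above(threshold, gm, gsd):
    """P(X > threshold) for lognormal X with geometric mean gm and
    geometric standard deviation gsd, via the standard normal tail."""
    z = (math.log(threshold) - math.log(gm)) / math.log(gsd)
    return 0.5 * math.erfc(z / math.sqrt(2.0))

frac = fraction_above(300, 41, 2.7)   # Bq/m3; GM and GSD are assumptions
print(f"fraction of homes above 300 Bq/m3 ~ {frac:.3f}")
```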

  2. Quantitative assessment of the microbial risk of leafy greens from farm to consumption: preliminary framework, data, and risk estimates.

    PubMed

    Danyluk, Michelle D; Schaffner, Donald W

    2011-05-01

    This project was undertaken to relate what is known about the behavior of Escherichia coli O157:H7 under laboratory conditions to what is known regarding the 2006 E. coli O157:H7 spinach outbreak, in the context of a quantitative microbial risk assessment. The risk model explicitly assumes that all contamination arises from exposure in the field. Extracted data, models, and user inputs were entered into an Excel spreadsheet, and the modeling software @RISK was used to perform Monte Carlo simulations. The model predicts that cut leafy greens that are temperature abused will support the growth of E. coli O157:H7, and populations of the organism may increase by as much as 1 log CFU/day under optimal temperature conditions. When the risk model used a starting level of -1 log CFU/g, with 0.1% of incoming servings contaminated, the predicted numbers of cells per serving were within the range of best available estimates of pathogen levels during the outbreak. The model predicts that levels in the field of -1 log CFU/g and 0.1% prevalence could have resulted in an outbreak approximately the size of the 2006 E. coli O157:H7 outbreak. This quantitative microbial risk assessment model represents a preliminary framework that identifies available data and provides initial risk estimates for pathogenic E. coli in leafy greens. Data gaps include retail storage times, correlations between storage time and temperature, the importance of lag time in E. coli O157:H7 growth models for leafy greens, and validation of the importance of cross-contamination during the washing process. PMID:21549039
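    The growth step the abstract describes (a -1 log CFU/g field level growing by up to 1 log CFU/day under temperature abuse) is simple log-linear arithmetic. The serving size and abuse duration below are assumptions for illustration, not the model's actual input distributions:

```python
# Minimal sketch of the log-linear growth arithmetic from the abstract.
# Starting level -1 log CFU/g and growth up to 1 log CFU/day are stated
# in the abstract; serving mass and abuse duration are assumptions.

def final_level_log_cfu_per_g(start_log, growth_log_per_day, days):
    """Log-linear growth: final contamination level in log CFU/g."""
    return start_log + growth_log_per_day * days

start = -1.0                                         # log CFU/g in the field
level = final_level_log_cfu_per_g(start, 1.0, 2.0)   # 2 days abuse (assumed)
serving_g = 85.0                                     # serving mass (assumed)
cells_per_serving = (10.0 ** level) * serving_g
print(level, cells_per_serving)
```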

  3. A review of methods to estimate cause-specific mortality in presence of competing risks

    USGS Publications Warehouse

    Heisey, Dennis M.; Patterson, Brent R.

    2006-01-01

    Estimating cause-specific mortality is often of central importance for understanding the dynamics of wildlife populations. Despite such importance, methodology for estimating and analyzing cause-specific mortality has received little attention in wildlife ecology during the past 20 years. The issue of analyzing cause-specific, mutually exclusive events in time is not unique to wildlife. In fact, this general problem has received substantial attention in human biomedical applications within the context of biostatistical survival analysis. Here, we consider cause-specific mortality from a modern biostatistical perspective. This requires carefully defining what we mean by cause-specific mortality and then providing an appropriate hazard-based representation as a competing risks problem. This leads to the general solution of cause-specific mortality as the cumulative incidence function (CIF). We describe the appropriate generalization of the fully nonparametric staggered-entry Kaplan–Meier survival estimator to cause-specific mortality via the nonparametric CIF estimator (NPCIFE), which in many situations offers an attractive alternative to the Heisey–Fuller estimator. An advantage of the NPCIFE is that it lends itself readily to risk factor analysis with standard software for the Cox proportional hazards model. The competing risks–based approach also clarifies issues regarding another intuitive but erroneous "cause-specific mortality" estimator based on the Kaplan–Meier survival estimator and commonly seen in the life sciences literature.
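    The nonparametric CIF estimator the review describes accumulates, at each event time, the overall Kaplan-Meier survival just before the event multiplied by the cause-specific hazard. A compact sketch, on toy data with distinct event times (cause 0 meaning censored), assuming simultaneous entry rather than the staggered-entry generalization:

```python
# Compact sketch of the nonparametric cumulative incidence function (CIF):
# CIF_k(t) = sum over event times of S(t-) * d_k / n_at_risk.
# Toy data: (time, cause), cause 0 = censored, 1 or 2 = cause of death.
# Assumes distinct event times and simultaneous entry.

def cif(data, cause):
    data = sorted(data)
    surv = 1.0       # overall Kaplan-Meier survival just before current time
    inc = 0.0        # cumulative incidence for the requested cause
    at_risk = len(data)
    for t, c in data:
        if c == cause:
            inc += surv * (1.0 / at_risk)   # S(t-) * cause-specific hazard
        if c != 0:
            surv *= 1.0 - 1.0 / at_risk     # any death updates overall KM
        at_risk -= 1                        # deaths and censorings leave risk set
    return inc

obs = [(3, 1), (5, 2), (7, 1), (8, 0), (10, 2)]
print(cif(obs, 1), cif(obs, 2))
```

Note that, unlike the erroneous Kaplan-Meier-based "cause-specific" estimator the review criticizes, the cause-specific CIFs here never sum to more than one.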

  4. “Any Condomless Anal Intercourse” is No Longer an Accurate Measure of HIV Sexual risk Behavior in Gay and Other Men Who have Sex with Men

    PubMed Central

    Jin, Fengyi; Prestage, Garrett P.; Mao, Limin; Poynten, I. Mary; Templeton, David J.; Grulich, Andrew E.; Zablotska, Iryna

    2015-01-01

    Background: Condomless anal intercourse (CLAI) has long been recognized as the primary mode of sexual transmission of HIV in gay and other men who have sex with men (MSM). A variety of measures of CLAI have been commonly used in behavioral surveillance for HIV risk and to forecast trends in HIV infection. However, gay and other MSM’s sexual practices have changed as understanding of the disease and treatment options has advanced. In the present paper, we argue that summary measures such as “any CLAI” do not accurately measure HIV sexual risk behavior. Methods: Participants were 1,427 HIV-negative men from the Health in Men cohort study run from 2001 to 2007 in Sydney, Australia, with six-monthly interviews. At each interview, detailed quantitative data on the number of episodes of insertive and receptive CLAI in the last 6 months were collected, separated by partner type (regular vs. casual) and partners’ HIV status (negative, positive, and HIV status unknown). Results: A total of 228,064 episodes of CLAI were reported during the study period with a mean of 44 episodes per year per participant (median: 14). The great majority of CLAI episodes were with a regular partner (92.6%), most of them with HIV-negative regular partners (84.8%). Participants were more likely to engage in insertive CLAI with casual than with regular partners (66.7 vs. 55.3% of all acts of CLAI with each partner type, p < 0.001). Men were more likely to report CLAI in the receptive position with HIV-negative and HIV status unknown partners than with HIV-positive partners (p < 0.001 for both regular and casual partners). Conclusion: Gay and other MSM engaging in CLAI demonstrate clear patterns of HIV risk reduction behavior. As HIV prevention enters the era of the antiretroviral-based biomedical approach, using all forms of CLAI indiscriminately as a measure of HIV behavioral risk is not helpful in understanding the current drivers of HIV transmission in the community. PMID:25774158

  5. Prevalence Estimates of Health Risk Behaviors of Immigrant Latino Men Who Have Sex with Men

    ERIC Educational Resources Information Center

    Rhodes, Scott D.; McCoy, Thomas P.; Hergenrather, Kenneth C.; Vissman, Aaron T.; Wolfson, Mark; Alonzo, Jorge; Bloom, Fred R.; Alegria-Ortega, Jose; Eng, Eugenia

    2012-01-01

    Purpose: Little is known about the health status of rural immigrant Latino men who have sex with men (MSM). These MSM comprise a subpopulation that tends to remain "hidden" from both researchers and practitioners. This study was designed to estimate the prevalence of tobacco, alcohol, and drug use, and sexual risk behaviors of Latino MSM living in…

  6. RISK ESTIMATES FOR DETERMINISTIC HEALTH EFFECTS OF INHALED WEAPONS GRADE PLUTONIUM

    EPA Science Inventory

    Risk estimates for deterministic effects of inhaled weapons-grade plutonium (WG Pu) are needed to evaluate potential serious harm to: (1) U. S. Department of Energy nuclear workers from accidental or other work-place releases of WG Pu; and (2) the public from terrorist actions re...

  7. EVALUATION AND ESTIMATION OF POTENTIAL CARCINOGENIC RISKS OF POLYNUCLEAR AROMATIC HYDROCARBONS (PAH)

    EPA Science Inventory

    The evaluation and estimation of the potential risk of human exposures to a hazardous substance requires the analysis of all relevant data to answer two questions (1) does the agent cause the effect; (2) what is the relationship between dose (exposure) and incidence of the effect...

  8. Silica exposure and silicosis among Ontario hardrock miners: III. Analysis and risk estimates.

    PubMed

    Muir, D C; Julian, J A; Shannon, H S; Verma, D K; Sebestyen, A; Bernholz, C D

    1989-01-01

    An epidemiological investigation was undertaken to determine the relationship between silicosis in hardrock miners in Ontario and cumulative exposure to silica (free crystalline silica--alpha quartz) dust. This report describes the analytic method and presents the risk estimates. PMID:2750748

  9. ESTIMATING THE RISK OF LUNG CANCER FROM INHALATION OF RADON DAUGHTERS INDOORS: REVIEW AND EVALUATION

    EPA Science Inventory

    A review of the dosimetric models and epidemiological studies with regard to the relation between indoor radon exposure and lung cancer indicates that the Working Level is an appropriate unit for indoor radon exposure; that the uncertainty in applying risk estimates derived from ...

  10. Normal Tissue Complication Probability Estimation by the Lyman-Kutcher-Burman Method Does Not Accurately Predict Spinal Cord Tolerance to Stereotactic Radiosurgery

    SciTech Connect

    Daly, Megan E.; Luxton, Gary; Choi, Clara Y.H.; Gibbs, Iris C.; Chang, Steven D.; Adler, John R.; Soltys, Scott G.

    2012-04-01

    Purpose: To determine whether normal tissue complication probability (NTCP) analyses of the human spinal cord by use of the Lyman-Kutcher-Burman (LKB) model, supplemented by linear-quadratic modeling to account for the effect of fractionation, predict the risk of myelopathy from stereotactic radiosurgery (SRS). Methods and Materials: From November 2001 to July 2008, 24 spinal hemangioblastomas in 17 patients were treated with SRS. Of the tumors, 17 received 1 fraction with a median dose of 20 Gy (range, 18-30 Gy) and 7 received 20 to 25 Gy in 2 or 3 sessions, with cord maximum doses of 22.7 Gy (range, 17.8-30.9 Gy) and 22.0 Gy (range, 20.2-26.6 Gy), respectively. By use of conventional values for {alpha}/{beta}, volume parameter n, 50% complication probability dose TD{sub 50}, and inverse slope parameter m, a computationally simplified implementation of the LKB model was used to calculate the biologically equivalent uniform dose and NTCP for each treatment. Exploratory calculations were performed with alternate values of {alpha}/{beta} and n. Results: In this study 1 case (4%) of myelopathy occurred. The LKB model using radiobiological parameters from Emami and the logistic model with parameters from Schultheiss overestimated complication rates, predicting 13 complications (54%) and 18 complications (75%), respectively. An increase in the volume parameter (n), to assume greater parallel organization, improved the predictive value of the models. Maximum-likelihood LKB fitting of {alpha}/{beta} and n yielded better predictions (0.7 complications), with n = 0.023 and {alpha}/{beta} = 17.8 Gy. Conclusions: The spinal cord tolerance to the dosimetry of SRS is higher than predicted by the LKB model using any set of accepted parameters. Only a high {alpha}/{beta} value in the LKB model and only a large volume effect in the logistic model with Schultheiss data could explain the low number of complications observed. This finding emphasizes that radiobiological models
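    The central NTCP calculation in the LKB model is a probit function of the equivalent uniform dose: NTCP = Phi((EUD - TD50) / (m * TD50)), with Phi the standard normal CDF. A minimal sketch follows; the parameter values are illustrative, not the Emami parameters or the fitted values from this paper:

```python
# Hedged sketch of the Lyman-Kutcher-Burman (LKB) NTCP calculation:
# NTCP = Phi(t), t = (EUD - TD50) / (m * TD50).
# TD50 and m below are illustrative, not values from the paper.
import math

def lkb_ntcp(eud_gy, td50_gy, m):
    """Normal tissue complication probability via the probit LKB form."""
    t = (eud_gy - td50_gy) / (m * td50_gy)
    return 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))

ntcp = lkb_ntcp(20.0, 66.5, 0.175)   # EUD, TD50 in Gy (illustrative)
print(f"NTCP ~ {ntcp:.2e}")
```

By construction NTCP = 0.5 when EUD equals TD50, and the slope parameter m controls how steeply the probability rises around that dose.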

  11. Challenges in Obtaining Estimates of the Risk of Tuberculosis Infection During Overseas Deployment.

    PubMed

    Mancuso, James D; Geurts, Mia

    2015-12-01

    Estimates of the risk of tuberculosis (TB) infection resulting from overseas deployment among U.S. military service members have varied widely, and have been plagued by methodological problems. The purpose of this study was to estimate the incidence of TB infection in the U.S. military resulting from deployment. Three populations were examined: 1) a unit of 2,228 soldiers redeploying from Iraq in 2008, 2) a cohort of 1,978 soldiers followed up over 5 years after basic training at Fort Jackson in 2009, and 3) 6,062 participants in the 2011-2012 National Health and Nutrition Examination Survey (NHANES). The risk of TB infection in the deployed population was low (0.6%; 95% confidence interval [CI]: 0.1-2.3%) and was similar to that in the non-deployed population. The prevalence of latent TB infection (LTBI) in the U.S. population was not significantly different among deployed and non-deployed veterans and those with no military service. The limitations of these retrospective studies highlight the challenge in obtaining valid estimates of risk using retrospective data and the need for a more definitive study. Similar to civilian long-term travelers, risks for TB infection during deployment are focal in nature, and testing should be targeted to only those at increased risk. PMID:26416114

  12. Statistical Risk Estimation for Communication System Design: Results of the HETE-2 Test Case

    NASA Astrophysics Data System (ADS)

    Babuscia, A.; Cheung, K.-M.

    2014-05-01

    The Statistical Risk Estimation (SRE) technique described in this article is a methodology to quantify the likelihood that the major design drivers of mass and power of a space system meet the spacecraft and mission requirements and constraints through the design and development lifecycle. The SRE approach addresses the long-standing challenges of small sample size and unclear evaluation path of a space system, and uses a combination of historical data and expert opinions to estimate risk. Although the methodology is applicable to the entire spacecraft, this article is focused on a specific subsystem: the communication subsystem. Using this approach, the communication system designers will be able to evaluate and to compare different communication architectures in a risk trade-off perspective. SRE was introduced in two previous papers. This article aims to present additional results of the methodology by adding a new test case from a university mission, the High-Energy Transient Experiment (HETE)-2. The results illustrate the application of SRE to estimate the risks of exceeding constraints in mass and power, hence providing crucial risk information to support a project's decision on requirements rescope and/or system redesign.
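    The kind of estimate SRE produces, the probability that a design driver such as mass exceeds its allocation, reduces to tail probabilities of a distribution built from historical data and expert opinion. A Monte Carlo sketch under a lognormal assumption; the distribution parameters and the 10 kg allocation are invented for illustration, not HETE-2 figures:

```python
# Illustrative Monte Carlo sketch: probability that subsystem mass exceeds
# its allocation, under an assumed lognormal mass distribution.
# The distribution parameters and allocation are assumptions.
import math
import random

random.seed(42)
allocation_kg = 10.0
# lognormal with median 8 kg and moderate spread (illustrative)
samples = [random.lognormvariate(math.log(8.0), 0.25) for _ in range(100_000)]
p_exceed = sum(m > allocation_kg for m in samples) / len(samples)
print(f"P(mass > allocation) ~ {p_exceed:.3f}")
```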

  13. Hip fracture risk estimation based on principal component analysis of QCT atlas: a preliminary study

    NASA Astrophysics Data System (ADS)

    Li, Wenjun; Kornak, John; Harris, Tamara; Lu, Ying; Cheng, Xiaoguang; Lang, Thomas

    2009-02-01

    We aim to capture and apply 3-dimensional bone fragility features for fracture risk estimation. Using inter-subject image registration, we constructed a hip QCT atlas comprising 37 patients with hip fractures and 38 age-matched controls. In the hip atlas space, we performed principal component analysis to identify the principal components (eigen images) that showed association with hip fracture. To develop and test a hip fracture risk model based on the principal components, we randomly divided the 75 QCT scans into two groups, one serving as the training set and the other as the test set. We applied this model to estimate a fracture risk index for each test subject, and used the fracture risk indices to discriminate the fracture patients from controls. To evaluate the fracture discrimination efficacy, we performed ROC analysis and calculated the AUC (area under the curve). When using the first group as the training group and the second as the test group, the AUC was 0.880, compared to conventional fracture risk estimation methods based on bone densitometry, which had AUC values ranging between 0.782 and 0.871. When using the second group as the training group, the AUC was 0.839, compared to densitometric methods with AUC values ranging between 0.767 and 0.807. Our results demonstrate that principal components derived from the hip QCT atlas are associated with hip fracture. Use of such features may provide new quantitative measures of interest in osteoporosis assessment.
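    The evaluation step described here, computing the ROC AUC of a scalar risk index that separates fracture cases from controls, has a simple rank-based (Mann-Whitney) formulation. A small sketch on toy scores, not the study's risk indices:

```python
# Small sketch of ROC AUC for a scalar risk index, using the
# Mann-Whitney formulation: AUC = P(case score > control score),
# counting ties as half. Scores below are toy values.

def roc_auc(case_scores, control_scores):
    wins = sum((c > k) + 0.5 * (c == k)
               for c in case_scores for k in control_scores)
    return wins / (len(case_scores) * len(control_scores))

cases = [0.9, 0.8, 0.7, 0.55]      # risk indices of fracture cases (toy)
controls = [0.6, 0.5, 0.4, 0.3]    # risk indices of controls (toy)
auc = roc_auc(cases, controls)
print(auc)
```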

  14. How accurately can students estimate their performance on an exam and how does this relate to their actual performance on the exam?

    NASA Astrophysics Data System (ADS)

    Rebello, N. Sanjay

    2012-02-01

    Research has shown that students' beliefs regarding their own abilities in math and science can influence their performance in these disciplines. I investigated the relationship between students' estimated and actual performance on five exams in a second-semester calculus-based physics class. Students were given about 72 hours after the completion of each exam to estimate their individual score and the class mean score. Students received extra credit worth 1% of the exam points for estimating their own score within 2% of the actual score, and another 1% extra credit for estimating the class mean score within 2% of the correct value. I compared students' individual and mean score estimates with the actual scores to investigate the relationship between estimation accuracy and exam performance, as well as trends over the semester.

  15. Quantitative microbial risk assessment combined with hydrodynamic modelling to estimate the public health risk associated with bathing after rainfall events.

    PubMed

    Eregno, Fasil Ejigu; Tryland, Ingun; Tjomsland, Torulv; Myrmel, Mette; Robertson, Lucy; Heistad, Arve

    2016-04-01

This study investigated the public health risk posed by exposure to infectious microorganisms at Sandvika recreational beaches, Norway, by combining hydrodynamic modelling with Quantitative Microbial Risk Assessment (QMRA) and pathogen dose-response relationships. Meteorological and hydrological data were collected to produce a calibrated hydrodynamic model using Escherichia coli as an indicator of faecal contamination. Based on average concentrations of reference pathogens (norovirus, Campylobacter, Salmonella, Giardia and Cryptosporidium) relative to E. coli in Norwegian sewage from previous studies, the hydrodynamic model was used to simulate the concentrations of pathogens at the local beaches during and after a heavy rainfall event, using three different decay rates. The simulated concentrations were used as input for QMRA, and the public health risk was estimated as the probability of infection from a single exposure of bathers during the three consecutive days after the rainfall event. The level of risk on the first day after the rainfall event was acceptable for the bacterial and parasitic reference pathogens, but high for the viral reference pathogen at all beaches, and severe at the Kalvøya-small and Kalvøya-big beaches, supporting the advice to avoid swimming in the day(s) after heavy rainfall. The study demonstrates the potential of combining discharge-based hydrodynamic modelling with QMRA in the context of bathing water as a tool to evaluate public health risk and support beach management decisions. PMID:26802355
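The QMRA step above converts a simulated pathogen concentration into a probability of infection through a dose-response model. A minimal sketch of one standard form, the exponential model P = 1 − exp(−r · dose), follows; the concentration, ingestion volume, and dose-response parameter r are illustrative assumptions, not the study's fitted values.

```python
import math

# Sketch of a QMRA dose-response calculation (illustrative parameters):
# probability of infection from a single bathing exposure.

def p_infection_exponential(concentration_per_l, ingested_l, r):
    """concentration_per_l: simulated pathogen concentration (organisms/L);
    ingested_l: water volume swallowed per bathing event (L);
    r: pathogen-specific dose-response parameter (assumed)."""
    dose = concentration_per_l * ingested_l
    return 1.0 - math.exp(-r * dose)

# Hypothetical inputs: 10 organisms/L, 50 mL swallowed, r = 0.2.
print(round(p_infection_exponential(10.0, 0.05, 0.2), 4))  # 0.0952
```

In a full assessment, r differs by reference pathogen, and the per-event probabilities would be compared against an acceptable-risk benchmark for each of the three days after rainfall.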

  16. Value at risk estimation with entropy-based wavelet analysis in exchange markets

    NASA Astrophysics Data System (ADS)

    He, Kaijian; Wang, Lijun; Zou, Yingchao; Lai, Kin Keung

    2014-08-01

In recent years, exchange markets have become increasingly integrated, and fluctuations and risks across different exchange markets exhibit co-moving and complex dynamics. In this paper we propose an entropy-based multivariate wavelet approach to analyze the multiscale characteristics of these markets in the multidimensional domain and to further improve the reliability of Value at Risk estimation. Wavelet analysis is used to construct an entropy-based multiscale portfolio Value at Risk estimation algorithm that accounts for multiscale dynamic correlation. An entropy measure, applied with an error-minimization principle, is proposed as a more effective criterion for selecting the best basis when determining the wavelet family and decomposition level to use. The empirical studies conducted in this paper provide positive evidence of the superior performance of the proposed approach on the closely related Chinese renminbi and euro exchange markets.
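The entropy-based best-basis selection described above can be sketched in a simplified, assumed form: given candidate wavelet decompositions of the same series (already computed, here just lists of coefficients), choose the basis whose normalized energy distribution has the lowest Shannon entropy, i.e. the sparsest representation.

```python
import math

# Sketch of entropy-based basis selection (assumed form, not the paper's
# exact algorithm): lower Shannon entropy of the coefficient energy
# distribution means energy is concentrated in fewer coefficients.

def shannon_entropy(coeffs):
    energies = [c * c for c in coeffs]
    total = sum(energies)
    probs = [e / total for e in energies if e > 0]
    return -sum(p * math.log(p) for p in probs)

def best_basis(candidates):
    """candidates: dict mapping basis name -> coefficient list."""
    return min(candidates, key=lambda name: shannon_entropy(candidates[name]))

# Hypothetical decompositions of one return series under two wavelet bases.
candidates = {
    "db4": [5.0, 0.1, 0.1, 0.1],   # energy concentrated: low entropy
    "haar": [2.0, 2.0, 2.0, 2.0],  # energy spread out: high entropy
}
print(best_basis(candidates))  # db4
```

In practice the candidate decompositions would come from a wavelet library, and the same criterion would also be applied across decomposition levels.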

  17. Estimating the accumulation of chemicals in an estuarine food web: A case study for evaluation of future ecological and human health risks

    SciTech Connect

    Iannuzzi, T.J.; Finley, B.L.

    1995-12-31

A model was constructed and calibrated for estimating the accumulation of sediment-associated nonionic organic chemicals, including selected PCBs and PCDD/Fs, in a simplified food web of the tidal Passaic River, New Jersey. The model was used to estimate concentrations of several chemicals in infaunal invertebrates, forage fish, blue crab, and adult finfish in the River as part of a screening-level risk assessment that was conducted during the preliminary phase of a CERCLA Remedial Investigation/Feasibility Study (RI/FS). Subsequent tissue-residue data were collected to evaluate the performance of the model, and to calibrate the model for multiple chemicals of concern in the River. A follow-up program of data collection was designed to support a more detailed risk assessment. The objectives of calibrating the model were to supplement the extant tissue-residue data that are available for risk assessment, and to evaluate future scenarios of bioaccumulation (and potential ecological and human health risk) under various conditions in the River. Results to date suggest that the model performs well for the simplified food web that exists in the Passaic River. A case study was constructed to demonstrate the application of the model for future predictions of ecological risk. These preliminary results suggest that the model is sufficiently sensitive and accurate for estimating variations in bioaccumulation under varying degrees of source control or other future conditions.

  18. Effect of recent changes in atomic bomb survivor dosimetry on cancer mortality risk estimates.

    PubMed

    Preston, Dale L; Pierce, Donald A; Shimizu, Yukiko; Cullings, Harry M; Fujita, Shoichiro; Funamoto, Sachiyo; Kodama, Kazunori

    2004-10-01

The Radiation Effects Research Foundation has recently implemented a new dosimetry system, DS02, to replace the previous system, DS86. This paper assesses the effect of the change on risk estimates for radiation-related solid cancer and leukemia mortality. The changes in dose estimates were smaller than many had anticipated, with the primary systematic change being an increase of about 10% in gamma-ray estimates for both cities. In particular, an anticipated large increase of the neutron component in Hiroshima for low-dose survivors did not materialize. However, DS02 improves on DS86 in many details, including the specifics of the radiation released by the bombs and the effects of shielding by structures and terrain. The data used here extend the last reported follow-up for solid cancers by 3 years, with a total of 10,085 deaths, and extend the follow-up for leukemia by 10 years, with a total of 296 deaths. For both solid cancer and leukemia, estimated age-time patterns and sex differences are virtually unchanged by the dosimetry revision. The estimates of solid-cancer radiation risk per sievert and the curvilinear dose response for leukemia are both decreased by about 8% by the dosimetry revision, due to the increase in the gamma-ray dose estimates. The apparent shape of the dose response is virtually unchanged by the dosimetry revision, but for solid cancers, the additional 3 years of follow-up has some effect. In particular, there is for the first time a statistically significant upward curvature for solid cancer on the restricted dose range 0-2 Sv. However, the low-dose slope of a linear-quadratic fit to that dose range should probably not be relied on for risk estimation, since that is substantially smaller than the linear slopes on ranges 0-1 Sv, 0-0.5 Sv, and 0-0.25 Sv. Although it was anticipated that the new dosimetry system might reduce some apparent dose overestimates for Nagasaki factory workers, this did not materialize, and factory workers have

  19. Estimates of Prevalence and Risk Associated with Inattention and Distraction Based Upon In Situ Naturalistic Data

    PubMed Central

    Dingus, Thomas A.

    2014-01-01

    By using in situ naturalistic driving data, estimates of prevalence and risk can be made regarding driver populations’ secondary task distractions and crash rates. Through metadata analysis, three populations of drivers (i.e., adult light vehicle, teenaged light vehicle, and adult heavy vehicle) were compared regarding frequency of secondary task behavior and the associated risk for safety-critical incidents. Relative risk estimates provide insight into the risk associated with engaging in a single task. When such risk is considered in combination with frequency of use, it sheds additional light on those secondary tasks that create the greatest overall risk to driving safety. The results show that secondary tasks involving manual typing, texting, dialing, reaching for an object, or reading are dangerous for all three populations. Additionally, novice teen drivers have difficulty in several tasks that the other two populations do not, including eating and external distractions. Truck drivers also perform a number of risky “mobile office” types of tasks, including writing, not seen in the other populations. Implications are described for policy makers and designers of in-vehicle and nomadic, portable systems. PMID:24776227
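Relative risk estimates like those described above compare event rates with and without a secondary task. A minimal rate-ratio sketch follows; the counts and exposure times are hypothetical, not the study's data, and the study's own estimator (built from epoch-sampled naturalistic data) may differ in detail.

```python
# Illustrative sketch: relative risk as the rate of safety-critical events
# while engaged in a secondary task, divided by the baseline rate while
# driving without it. All numbers below are hypothetical.

def relative_risk(events_task, time_task, events_baseline, time_baseline):
    """Event counts and exposure time (hours) with and without the task."""
    rate_task = events_task / time_task
    rate_baseline = events_baseline / time_baseline
    return rate_task / rate_baseline

# Hypothetical: 12 events in 200 h of texting vs 30 events in 4000 h baseline.
print(relative_risk(12, 200, 30, 4000))  # 8.0
```

As the abstract notes, a high relative risk matters most when combined with how often the task is actually performed: a rare task with high relative risk may contribute less overall than a frequent task with moderate risk.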

  20. Estimated GFR Associates with Cardiovascular Risk Factors Independently of Measured GFR

    PubMed Central

    Melsom, Toralf; Ingebretsen, Ole C.; Jenssen, Trond; Njølstad, Inger; Solbu, Marit D.; Toft, Ingrid; Eriksen, Bjørn O.

    2011-01-01

    Estimation of the GFR (eGFR) using creatinine- or cystatin C–based equations is imperfect, especially when the true GFR is normal or near-normal. Modest reductions in eGFR from the normal range variably predict cardiovascular morbidity. If eGFR associates not only with measured GFR (mGFR) but also with cardiovascular risk factors, the effects of these non–GFR-related factors might bias the association between eGFR and outcome. To investigate these potential non–GFR-related associations between eGFR and cardiovascular risk factors, we measured GFR by iohexol clearance in a sample from the general population (age 50 to 62 years) without known cardiovascular disease, diabetes, or kidney disease. Even after adjustment for mGFR, eGFR associated with traditional cardiovascular risk factors in multiple regression analyses. More risk factors influenced cystatin C–based eGFR than creatinine-based eGFR, adjusted for mGFR, and some of the risk factors exhibited nonlinear effects in generalized additive models (P < 0.05). These results suggest that eGFR, calculated using standard creatinine- or cystatin C–based equations, partially depends on factors other than the true GFR. Thus, estimates of cardiovascular risk associated with small changes in eGFR must be interpreted with caution. PMID:21454717
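For concreteness, one widely used creatinine-based estimating equation is the 2009 CKD-EPI formula. The sketch below reproduces it from commonly published constants (treat them as assumptions to verify against the original publication before any real use); it illustrates how eGFR is computed from serum creatinine plus non-GFR factors such as age and sex, which is exactly the dependence the study above probes.

```python
# Sketch of the 2009 CKD-EPI creatinine equation (constants as commonly
# published; illustrative only, not the equation set used in the study).

def ckd_epi_creatinine(scr_mg_dl, age, female):
    """scr_mg_dl: serum creatinine (mg/dL); returns eGFR in mL/min/1.73 m^2."""
    kappa = 0.7 if female else 0.9
    alpha = -0.329 if female else -0.411
    egfr = (141.0
            * min(scr_mg_dl / kappa, 1.0) ** alpha
            * max(scr_mg_dl / kappa, 1.0) ** -1.209
            * 0.993 ** age)
    return egfr * 1.018 if female else egfr

# Hypothetical subject: 50-year-old man, creatinine 0.9 mg/dL.
print(round(ckd_epi_creatinine(0.9, 50, female=False), 1))
```

Because age and sex enter the equation directly, two subjects with identical measured GFR can receive different eGFR values, which is one mechanism behind the non-GFR-related associations reported above.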

  1. Estimating risk during showering exposure to VOCs of workers in a metal-degreasing facility.

    PubMed

    Franco, Amaya; Costoya, Miguel Angel; Roca, Enrique

    2007-04-01

The incremental risk of workers in a metal-degreasing facility exposed to volatile organic compounds (VOCs) present in the water supply during showering was estimated. A probabilistic and worst-case approach using site-specific concentration data and a generalized multipathway exposure model was applied. Estimates of the hazard index and lifetime cancer risk were analyzed for each chemical and each route of exposure (inhalation and dermal absorption). The results showed that dermal exposure to trichloroethylene (TCE) and tetrachloroethylene (perchloroethylene, PCE) represented the main contribution to total risk. The inhalation route did not produce significant exposure and was mainly influenced by the liquid flow rate of the shower: lower flow rates during showering resulted in a significant reduction of both carcinogenic and noncarcinogenic risk, while decreasing water temperature had a minimal effect on exposure by this pathway. The results obtained in the present study indicated that significant exposures of workers may occur during showering in metal-degreasing installations where releases of VOCs to water occur. A sensitivity analysis was performed to investigate the effect of scenario parameters on exposure. Although site-specific data were employed, the exposure of workers was assessed in a model scenario and thus the quantification of risk is associated with uncertainty. Considering that occupational exposure of workers to organic solvents in metal-degreasing facilities may also be significant, risk assessment should be included in the planning of this kind of industrial installation. PMID:17365617
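The two risk metrics named above follow standard EPA-style risk-characterization formulas: the hazard index sums dose-to-reference-dose ratios across chemicals and routes, while lifetime cancer risk sums dose times cancer slope factor. The sketch below uses these general forms with hypothetical intake and toxicity values, not the study's site-specific numbers.

```python
# Sketch of the risk-characterization step (illustrative values; CDI, RfD,
# and slope factors below are hypothetical, not the study's data).

def hazard_index(cdi_rfd_pairs):
    """cdi_rfd_pairs: (chronic daily intake, reference dose), mg/kg-day."""
    return sum(cdi / rfd for cdi, rfd in cdi_rfd_pairs)

def cancer_risk(cdi_sf_pairs):
    """cdi_sf_pairs: (chronic daily intake, cancer slope factor)."""
    return sum(cdi * sf for cdi, sf in cdi_sf_pairs)

# Hypothetical TCE and PCE intakes via the dermal route.
hi = hazard_index([(1e-4, 5e-4), (2e-5, 6e-3)])
risk = cancer_risk([(1e-4, 4.6e-2), (2e-5, 2.1e-3)])
print(round(hi, 3))   # 0.203
print(f"{risk:.2e}")
```

A hazard index above 1 flags potential noncarcinogenic concern, and cumulative cancer risks are typically compared against benchmarks such as 1e-6 to 1e-4.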

  2. Estimates of auditory risk from outdoor impulse noise. II: Civilian firearms.

    PubMed

    Flamme, Gregory A; Wong, Adam; Liebe, Kevin; Lynd, James

    2009-01-01

    Firearm impulses are common noise exposures in the United States. This study records, describes and analyzes impulses produced outdoors by civilian firearms with respect to the amount of auditory risk they pose to the unprotected listener under various listening conditions. Risk estimates were obtained using three contemporary damage risk criteria (DRC) including a waveform parameter-based approach (peak SPL and B-duration), an energy-based criterion (A-weighted SEL and equivalent continuous level) and a physiological model (AHAAH). Results from these DRC were converted into a number of maximum permissible unprotected exposures to facilitate interpretation. Acoustic characteristics of firearm impulses differed substantially across guns, ammunition, and microphone location. The type of gun, ammunition and the microphone location all significantly affected estimates of auditory risk from firearms. Vast differences in maximum permissible exposures were observed; the rank order of the differences varied with the source of the impulse. Unprotected exposure to firearm noise is not recommended, but people electing to fire a gun without hearing protection should be advised to minimize auditory risk through careful selection of ammunition and shooting environment. Small-caliber guns with long barrels and guns loaded with the least powerful ammunition tend to be associated with the least auditory risk. PMID:19805933

  3. Estimating Risks of Heat Strain by Age and Sex: A Population-Level Simulation Model

    PubMed Central

    Glass, Kathryn; Tait, Peter W.; Hanna, Elizabeth G.; Dear, Keith

    2015-01-01

    Individuals living in hot climates face health risks from hyperthermia due to excessive heat. Heat strain is influenced by weather exposure and by individual characteristics such as age, sex, body size, and occupation. To explore the population-level drivers of heat strain, we developed a simulation model that scales up individual risks of heat storage (estimated using Myrup and Morgan’s man model “MANMO”) to a large population. Using Australian weather data, we identify high-risk weather conditions together with individual characteristics that increase the risk of heat stress under these conditions. The model identifies elevated risks in children and the elderly, with females aged 75 and older those most likely to experience heat strain. Risk of heat strain in males does not increase as rapidly with age, but is greatest on hot days with high solar radiation. Although cloudy days are less dangerous for the wider population, older women still have an elevated risk of heat strain on hot cloudy days or when indoors during high temperatures. Simulation models provide a valuable method for exploring population level risks of heat strain, and a tool for evaluating public health and other government policy interventions. PMID:25993102

  4. Cancer risk estimation in Digital Breast Tomosynthesis using GEANT4 Monte Carlo simulations and voxel phantoms.

    PubMed

    Ferreira, P; Baptista, M; Di Maria, S; Vaz, P

    2016-05-01

The aim of this work was to estimate the risk of radiation-induced cancer following the Portuguese breast screening recommendations for Digital Mammography (DM) when applied to Digital Breast Tomosynthesis (DBT), and to evaluate how the risk of inducing cancer could influence the energy used in breast diagnostic exams. The organ doses were calculated by Monte Carlo simulations using a female voxel phantom and considering the acquisition of 25 projection images. Single-organ cancer incidence risks were calculated in order to assess the total effective radiation-induced cancer risk. The screening strategy techniques considered were: DBT in Cranio-Caudal (CC) view and two-view DM (CC and Mediolateral Oblique (MLO)). The risk of cancer incidence following the Portuguese screening guidelines (screening every two years in the age range of 50-80 years) was calculated by assuming a single CC DBT acquisition view as a standalone screening strategy and compared with two-view DM. The difference in the total effective risk between DBT and DM is quite low. Nevertheless, in DBT an increase of risk for the lung is observed with respect to DM. The lung is also the organ that is mainly affected when a non-optimal beam energy (in terms of image quality and absorbed dose) is used instead of an optimal one. The use of non-optimal energies could increase the risk of lung cancer incidence by a factor of about 2. PMID:27133140
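The risk-summation step described above, single-organ incidence risks combined into a total radiation-induced cancer risk, can be sketched as organ dose times an organ-specific risk coefficient, summed over organs and screening rounds. All doses and coefficients below are hypothetical placeholders, not the paper's GEANT4 results.

```python
# Sketch of the risk-summation step (hypothetical organ doses and risk
# coefficients, not the study's Monte Carlo output).

def total_incidence_risk(organ_doses_mgy, risk_per_mgy):
    """organ_doses_mgy: organ -> absorbed dose (mGy);
    risk_per_mgy: organ -> cancer incidence risk per mGy (assumed)."""
    return sum(dose * risk_per_mgy[organ]
               for organ, dose in organ_doses_mgy.items())

doses = {"breast": 4.0, "lung": 0.05}        # hypothetical, per screening round
coeffs = {"breast": 1.0e-5, "lung": 2.0e-5}  # hypothetical coefficients
per_round = total_incidence_risk(doses, coeffs)
lifetime = per_round * 16   # screening every 2 years from age 50 to 80
print(f"{lifetime:.1e}")
```

With this structure, the lung finding above corresponds to a shift in the dose map: a non-optimal beam energy raises the lung entry while leaving the total nearly unchanged.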

  5. Better contralateral breast cancer risk estimation and alternative options to contralateral prophylactic mastectomy

    PubMed Central

    Davies, Kalatu R; Cantor, Scott B; Brewster, Abenaa M

    2015-01-01

    The incidence of contralateral prophylactic mastectomy (CPM) has increased among women with breast cancer, despite uncertain survival benefit and a declining incidence of contralateral breast cancer (CBC). Patient-related reasons for undergoing CPM include an overestimation of the risk of CBC, increased cancer worry, and a desire to improve survival. We summarize the existing literature on CBC risk and outcomes and the clinical benefit of CPM among women with unilateral breast cancer who have a low-to-moderate risk of developing a secondary cancer in the contralateral breast. Published studies were retrieved from the MEDLINE database with the keywords “contralateral breast cancer” and “contralateral prophylactic mastectomy”. These include observational studies, clinical trials, survival analyses, and decision models examining the risk of CBC, the clinical and psychosocial effects of CPM, and other treatment strategies to reduce CBC risk. Studies that have evaluated CBC risk estimate it to be approximately 0.5% annually on average. Patient-related factors associated with an increased risk of CBC include carriers of BRCA1/2 mutations, young age at breast cancer, and strong family history of breast cancer in the absence of a BRCA1/2 mutation. Although CPM reduces the risk of CBC by approximately 94%, it may not provide a significant gain in overall survival and there is conflicting evidence that it improves disease-free survival among women with breast cancer regardless of estrogen receptor (ER) status. Therefore, alternative strategies such as the use of tamoxifen or aromatase inhibitors, which reduce the risk of CBC by approximately 50%, should be encouraged for eligible women with ER-positive breast cancers. Future research is needed to evaluate the impact of decision and educational tools that can be used for personalized counseling of patients regarding their CBC risk, the uncertain role of CPM, and alternative CBC risk reduction strategies. PMID:25678823

  6. Better contralateral breast cancer risk estimation and alternative options to contralateral prophylactic mastectomy.

    PubMed

    Davies, Kalatu R; Cantor, Scott B; Brewster, Abenaa M

    2015-01-01

    The incidence of contralateral prophylactic mastectomy (CPM) has increased among women with breast cancer, despite uncertain survival benefit and a declining incidence of contralateral breast cancer (CBC). Patient-related reasons for undergoing CPM include an overestimation of the risk of CBC, increased cancer worry, and a desire to improve survival. We summarize the existing literature on CBC risk and outcomes and the clinical benefit of CPM among women with unilateral breast cancer who have a low-to-moderate risk of developing a secondary cancer in the contralateral breast. Published studies were retrieved from the MEDLINE database with the keywords "contralateral breast cancer" and "contralateral prophylactic mastectomy". These include observational studies, clinical trials, survival analyses, and decision models examining the risk of CBC, the clinical and psychosocial effects of CPM, and other treatment strategies to reduce CBC risk. Studies that have evaluated CBC risk estimate it to be approximately 0.5% annually on average. Patient-related factors associated with an increased risk of CBC include carriers of BRCA1/2 mutations, young age at breast cancer, and strong family history of breast cancer in the absence of a BRCA1/2 mutation. Although CPM reduces the risk of CBC by approximately 94%, it may not provide a significant gain in overall survival and there is conflicting evidence that it improves disease-free survival among women with breast cancer regardless of estrogen receptor (ER) status. Therefore, alternative strategies such as the use of tamoxifen or aromatase inhibitors, which reduce the risk of CBC by approximately 50%, should be encouraged for eligible women with ER-positive breast cancers. Future research is needed to evaluate the impact of decision and educational tools that can be used for personalized counseling of patients regarding their CBC risk, the uncertain role of CPM, and alternative CBC risk reduction strategies. PMID:25678823

  7. Vertebral Strength and Estimated Fracture Risk Across the BMI Spectrum in Women.

    PubMed

    Bachmann, Katherine N; Bruno, Alexander G; Bredella, Miriam A; Schorr, Melanie; Lawson, Elizabeth A; Gill, Corey M; Singhal, Vibha; Meenaghan, Erinne; Gerweck, Anu V; Eddy, Kamryn T; Ebrahimi, Seda; Koman, Stuart L; Greenblatt, James M; Keane, Robert J; Weigel, Thomas; Dechant, Esther; Misra, Madhusmita; Klibanski, Anne; Bouxsein, Mary L; Miller, Karen K

    2016-02-01

    Somewhat paradoxically, fracture risk, which depends on applied loads and bone strength, is elevated in both anorexia nervosa and obesity at certain skeletal sites. Factor-of-risk (Φ), the ratio of applied load to bone strength, is a biomechanically based method to estimate fracture risk; theoretically, higher Φ reflects increased fracture risk. We estimated vertebral strength (linear combination of integral volumetric bone mineral density [Int.vBMD] and cross-sectional area from quantitative computed tomography [QCT]), vertebral compressive loads, and Φ at L4 in 176 women (65 anorexia nervosa, 45 lean controls, and 66 obese). Using biomechanical models, applied loads were estimated for: 1) standing; 2) arms flexed 90°, holding 5 kg in each hand (holding); 3) 45° trunk flexion, 5 kg in each hand (lifting); 4) 20° trunk right lateral bend, 10 kg in right hand (bending). We also investigated associations of Int.vBMD and vertebral strength with lean mass (from dual-energy X-ray absorptiometry [DXA]) and visceral adipose tissue (VAT, from QCT). Women with anorexia nervosa had lower, whereas obese women had similar, Int.vBMD and estimated vertebral strength compared with controls. Vertebral loads were highest in obesity and lowest in anorexia nervosa for standing, holding, and lifting (p < 0.0001) but were highest in anorexia nervosa for bending (p < 0.02). Obese women had highest Φ for standing and lifting, whereas women with anorexia nervosa had highest Φ for bending (p < 0.0001). Obese and anorexia nervosa subjects had higher Φ for holding than controls (p < 0.03). Int.vBMD and estimated vertebral strength were associated positively with lean mass (R = 0.28 to 0.45, p ≤ 0.0001) in all groups combined and negatively with VAT (R = -[0.36 to 0.38], p < 0.003) within the obese group. Therefore, women with anorexia nervosa had higher estimated vertebral fracture risk (Φ) for holding and bending because of inferior vertebral strength. Despite similar
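The factor-of-risk defined above is a simple ratio, so the per-activity comparison reduces to dividing each biomechanically estimated load by vertebral strength. The loads and strength below are hypothetical round numbers for illustration, not the study's measurements.

```python
# Sketch of the factor-of-risk calculation: Φ = applied load / strength,
# with Φ >= 1 implying the applied load exceeds vertebral strength.
# All numbers below are hypothetical.

def factor_of_risk(applied_load_n, vertebral_strength_n):
    return applied_load_n / vertebral_strength_n

# Hypothetical compressive loads (N) for the four modelled activities,
# against a hypothetical vertebral strength of 4000 N.
activities = {"standing": 800.0, "holding": 1500.0,
              "lifting": 2600.0, "bending": 2200.0}
strength = 4000.0
for name, load in activities.items():
    print(name, round(factor_of_risk(load, strength), 3))
```

The study's group differences then follow from the two inputs moving in opposite directions: obesity raises the numerator (loads), while anorexia nervosa lowers the denominator (strength).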

  8. A Methodological Approach to Small Area Estimation for the Behavioral Risk Factor Surveillance System

    PubMed Central

    Xu, Fang; Wallace, Robyn C.; Garvin, William; Greenlund, Kurt J.; Bartoli, William; Ford, Derek; Eke, Paul; Town, G. Machell

    2016-01-01

    Public health researchers have used a class of statistical methods to calculate prevalence estimates for small geographic areas with few direct observations. Many researchers have used Behavioral Risk Factor Surveillance System (BRFSS) data as a basis for their models. The aims of this study were to 1) describe a new BRFSS small area estimation (SAE) method and 2) investigate the internal and external validity of the BRFSS SAEs it produced. The BRFSS SAE method uses 4 data sets (the BRFSS, the American Community Survey Public Use Microdata Sample, Nielsen Claritas population totals, and the Missouri Census Geographic Equivalency File) to build a single weighted data set. Our findings indicate that internal and external validity tests were successful across many estimates. The BRFSS SAE method is one of several methods that can be used to produce reliable prevalence estimates in small geographic areas. PMID:27418213
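The core reweighting idea behind building "a single weighted data set" can be sketched in a simplified, assumed form: within each demographic cell, respondents are weighted so that sample totals match external population totals, and a small-area prevalence is the weighted mean of responses. The cells and counts below are hypothetical; the actual BRFSS SAE method combines four data sources and is more elaborate.

```python
from collections import Counter

# Minimal post-stratification sketch (assumed, simplified form of the
# reweighting idea; not the BRFSS SAE method itself).

def poststratified_prevalence(records, population_totals):
    """records: list of (cell, response in {0, 1});
    population_totals: cell -> population count for that cell."""
    sample_counts = Counter(cell for cell, _ in records)
    weighted_yes = sum(population_totals[cell] / sample_counts[cell] * resp
                       for cell, resp in records)
    return weighted_yes / sum(population_totals.values())

# Hypothetical cells: sex by age band within one small area.
records = [("f_50_64", 1), ("f_50_64", 0), ("m_50_64", 1), ("m_50_64", 1)]
population = {"f_50_64": 600, "m_50_64": 400}
print(poststratified_prevalence(records, population))  # 0.7
```

The payoff is that cells with few direct observations still contribute estimates calibrated to known population totals, which is what makes small-area prevalence estimation feasible.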

  9. A Methodological Approach to Small Area Estimation for the Behavioral Risk Factor Surveillance System.

    PubMed

    Pierannunzi, Carol; Xu, Fang; Wallace, Robyn C; Garvin, William; Greenlund, Kurt J; Bartoli, William; Ford, Derek; Eke, Paul; Town, G Machell

    2016-01-01

    Public health researchers have used a class of statistical methods to calculate prevalence estimates for small geographic areas with few direct observations. Many researchers have used Behavioral Risk Factor Surveillance System (BRFSS) data as a basis for their models. The aims of this study were to 1) describe a new BRFSS small area estimation (SAE) method and 2) investigate the internal and external validity of the BRFSS SAEs it produced. The BRFSS SAE method uses 4 data sets (the BRFSS, the American Community Survey Public Use Microdata Sample, Nielsen Claritas population totals, and the Missouri Census Geographic Equivalency File) to build a single weighted data set. Our findings indicate that internal and external validity tests were successful across many estimates. The BRFSS SAE method is one of several methods that can be used to produce reliable prevalence estimates in small geographic areas. PMID:27418213

  10. Risk estimates for deterministic health effects of inhaled weapons grade plutonium.

    PubMed

    Scott, Bobby R; Peterson, Vern L

    2003-09-01

Risk estimates for deterministic effects of inhaled weapons-grade plutonium (WG Pu) are needed to evaluate potential serious harm to (1) U.S. Department of Energy nuclear workers from accidental or other work-place releases of WG Pu; and (2) the public from terrorist actions resulting in the release of WG Pu to the environment. Deterministic health effects (the most serious radiobiological consequences to humans) can arise when large amounts of WG Pu are taken into the body. Inhalation is considered the most likely route of intake during work-place accidents or during a nuclear terrorism incident releasing WG Pu to the environment. Our current knowledge about radiation-related harm is insufficient for generating precise estimates of risk for a given WG Pu exposure scenario. This relates largely to uncertainties associated with currently available risk and dosimetry models. Thus, rather than generating point estimates of risk, distributions that account for variability/uncertainty are needed to properly characterize potential harm to humans from a given WG Pu exposure scenario. In this manuscript, we generate and summarize risk distributions for deterministic radiation effects in the lungs of nuclear workers from inhaled WG Pu particles (standard isotopic mix). These distributions were developed using NUREG/CR-4214 risk models and time-dependent, dose conversion factor data based on Publication 30 of the International Commission on Radiological Protection. Dose conversion factors based on ICRP Publication 30 are more relevant to deterministic effects than are the dose conversion factors based on ICRP Publication 66, which relate to targets for stochastic effects. Risk distributions that account for NUREG/CR-4214 parameter and model uncertainties were generated using the Monte Carlo method. Risks were evaluated for both lethality (from radiation pneumonitis) and morbidity (due to radiation-induced respiratory dysfunction) and were found to depend strongly on absorbed
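The Monte Carlo approach described above can be sketched as: sample the uncertain risk-model parameters, compute a deterministic-effect risk for each draw, and report the resulting distribution rather than a point estimate. The model form and parameter distributions below are hypothetical stand-ins, not the NUREG/CR-4214 models themselves.

```python
import random
import statistics

# Sketch of Monte Carlo propagation of parameter uncertainty into a risk
# distribution (hypothetical model and distributions, for illustration).

def sampled_risk(rng):
    dose_gy = rng.lognormvariate(0.0, 0.5)  # uncertain lung absorbed dose
    d50 = rng.normalvariate(10.0, 1.0)      # dose producing 50% incidence
    shape = rng.normalvariate(5.0, 0.5)     # steepness of the dose response
    # Weibull-style hazard: risk = 1 - 2^(-(D/D50)^shape).
    return 1.0 - 2.0 ** (-((dose_gy / d50) ** shape))

rng = random.Random(42)
risks = sorted(sampled_risk(rng) for _ in range(10_000))
print("median:", round(statistics.median(risks), 6))
print("95th percentile:", round(risks[int(0.95 * len(risks))], 6))
```

Reporting percentiles of the simulated risk distribution is what allows variability and model uncertainty to be communicated alongside the central estimate.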

  11. Estimating doses and risks associated with decontamination and decommissioning activities using the CRRIS

    SciTech Connect

    Miller, C.W.; Sjoreen, A.L.; Cotter, S.J.

    1986-01-01

    The Computerized Radiological Risk Investigation System (CRRIS) is applicable to determining doses and risks from a variety of decontamination and decommissioning activities. For example, concentrations in air from resuspended radionuclides initially deposited on the ground surface and the concentrations of deposited radionuclides in various soil layers can be obtained. The CRRIS will estimate exposure to radon and its progeny in terms of working-level months, and will compute the resulting health risks. The CRRIS consists of seven integrated computer codes that stand alone or are run as a system to calculate environmental transport, doses, and risks. PRIMUS output provides other CRRIS codes the capability to handle radionuclide decay chains. ANEMOS and RETADD-II calculate atmospheric dispersion and deposition for local and regional distances, respectively. Multiple ANEMOS runs for sources within a small area are combined on a master grid by SUMIT. MLSOIL is used to estimate effective ground surface concentrations for dose computations. TERRA calculates food chain transport, and ANDROS calculates individual or population exposures, doses, and risks. Applications of the CRRIS to decontamination problems are discussed. 16 refs., 1 fig.

  12. Estimating Risk of Natural Gas Portfolios by Using GARCH-EVT-Copula Model

    PubMed Central

    Tang, Jiechen; Zhou, Chao; Yuan, Xinyu; Sriboonchitta, Songsak

    2015-01-01

This paper concentrates on estimating the risk of Title Transfer Facility (TTF) Hub natural gas portfolios by using the GARCH-EVT-copula model. We first use the univariate ARMA-GARCH model to model each natural gas return series. Second, extreme value theory (EVT) is applied to the tails of the residuals to model the marginal residual distributions. Third, multivariate Gaussian copula and Student t-copula are employed to describe the natural gas portfolio risk dependence structure. Finally, we simulate N portfolios and estimate value at risk (VaR) and conditional value at risk (CVaR). Our empirical results show that, for an equally weighted portfolio of five natural gases, the VaR and CVaR values obtained from the Student t-copula are larger than those obtained from the Gaussian copula. Moreover, when minimizing the portfolio risk, the optimal natural gas portfolio weights are found to be similar across the multivariate Gaussian copula and Student t-copula and different confidence levels. PMID:26351652
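The final estimation step, computing VaR and CVaR from simulated portfolio outcomes, reduces to empirical quantiles and tail averages of the simulated losses. The sketch below uses synthetic normal losses in place of the GARCH-EVT-copula simulation itself, purely to show the estimator.

```python
import random

# Sketch of empirical VaR/CVaR estimation from simulated losses (synthetic
# returns stand in for the paper's GARCH-EVT-copula simulation).

def var_cvar(losses, alpha=0.95):
    """losses: portfolio losses (positive = loss); alpha: confidence level.
    VaR is the empirical alpha-quantile; CVaR is the mean loss beyond it."""
    ordered = sorted(losses)
    idx = int(alpha * len(ordered))
    var = ordered[idx]
    tail = ordered[idx:]
    return var, sum(tail) / len(tail)

rng = random.Random(7)
losses = [rng.normalvariate(0.0, 1.0) for _ in range(100_000)]
var, cvar = var_cvar(losses, alpha=0.95)
print(round(var, 2))   # ~1.64 for standard normal losses
print(round(cvar, 2))  # ~2.06
```

CVaR always exceeds VaR at the same confidence level, since it averages the losses beyond the VaR threshold; heavier-tailed dependence (e.g. a Student t-copula) widens that gap, matching the paper's finding.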

  13. Estimating Risk of Natural Gas Portfolios by Using GARCH-EVT-Copula Model.

    PubMed

    Tang, Jiechen; Zhou, Chao; Yuan, Xinyu; Sriboonchitta, Songsak

    2015-01-01

This paper concentrates on estimating the risk of Title Transfer Facility (TTF) Hub natural gas portfolios by using the GARCH-EVT-copula model. We first use the univariate ARMA-GARCH model to model each natural gas return series. Second, extreme value theory (EVT) is applied to the tails of the residuals to model the marginal residual distributions. Third, multivariate Gaussian copula and Student t-copula are employed to describe the natural gas portfolio risk dependence structure. Finally, we simulate N portfolios and estimate value at risk (VaR) and conditional value at risk (CVaR). Our empirical results show that, for an equally weighted portfolio of five natural gases, the VaR and CVaR values obtained from the Student t-copula are larger than those obtained from the Gaussian copula. Moreover, when minimizing the portfolio risk, the optimal natural gas portfolio weights are found to be similar across the multivariate Gaussian copula and Student t-copula and different confidence levels. PMID:26351652

  14. Comparison of Pooled Risk Estimates for Adverse Effects from Different Observational Study Designs: Methodological Overview

    PubMed Central

    Golder, Su; Loke, Yoon K.; Bland, Martin

    2013-01-01

    Background A diverse range of study designs (e.g. case-control or cohort) are used in the evaluation of adverse effects. We aimed to ascertain whether the risk estimates from meta-analyses of case-control studies differ from those of other study designs. Methods Searches were carried out in 10 databases in addition to reference checking, contacting experts, and handsearching key journals and conference proceedings. Studies were included where a pooled relative measure of an adverse effect (odds ratio or risk ratio) from case-control studies could be directly compared with the pooled estimate for the same adverse effect arising from other types of observational studies. Results We included 82 meta-analyses. Pooled estimates of harm from the different study designs had 95% confidence intervals that overlapped in 78/82 instances (95%). Of the 23 cases of discrepant findings (significant harm identified in the meta-analysis of one type of study design, but not with the other study design), 16 (70%) stemmed from significantly elevated pooled estimates from case-control studies. There was associated evidence of funnel plot asymmetry consistent with higher risk estimates from case-control studies. On average, cohort or cross-sectional studies yielded pooled odds ratios that were 0.94 (95% CI 0.88–1.00) times those from case-control studies. Interpretation Empirical evidence from this overview indicates that meta-analyses of case-control studies tend to give slightly higher estimates of harm than meta-analyses of other observational studies. However, it is impossible to rule out potential confounding from differences in drug dose, duration, and populations when comparing between study designs. PMID:23977151

  15. Estimation of the standardized risk difference and ratio in a competing risks framework: application to injection drug use and progression to AIDS after initiation of antiretroviral therapy.

    PubMed

    Cole, Stephen R; Lau, Bryan; Eron, Joseph J; Brookhart, M Alan; Kitahata, Mari M; Martin, Jeffrey N; Mathews, William C; Mugavero, Michael J

    2015-02-15

    There are few published examples of absolute risk estimated from epidemiologic data subject to censoring and competing risks with adjustment for multiple confounders. We present an example estimating the effect of injection drug use on 6-year risk of acquired immunodeficiency syndrome (AIDS) after initiation of combination antiretroviral therapy between 1998 and 2012 in an 8-site US cohort study with death before AIDS as a competing risk. We estimate the risk standardized to the total study sample by combining inverse probability weights with the cumulative incidence function; estimates of precision are obtained by bootstrap. In 7,182 patients (83% male, 33% African American, median age of 38 years), we observed 6-year standardized AIDS risks of 16.75% among 1,143 injection drug users and 12.08% among 6,039 nonusers, yielding a standardized risk difference of 4.68 (95% confidence interval: 1.27, 8.08) and a standardized risk ratio of 1.39 (95% confidence interval: 1.12, 1.72). Results may be sensitive to the assumptions of exposure-version irrelevance, no measurement bias, and no unmeasured confounding. These limitations suggest that results be replicated with refined measurements of injection drug use. Nevertheless, estimating the standardized risk difference and ratio is straightforward, and injection drug use appears to increase the risk of AIDS. PMID:24966220
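    The standardized contrasts reported above can be sketched as below. This is a deliberate simplification for illustration: per-subject counterfactual risks are averaged over the total sample, whereas the paper combines inverse probability weights with the cumulative incidence function to handle censoring and the competing risk of death. Function and variable names are hypothetical.

    ```python
    import numpy as np

    def standardized_rd_rr(risks_if_exposed, risks_if_unexposed):
        """Standardized risk difference and ratio from per-subject risks.

        Averaging counterfactual risks over the total sample standardizes
        both contrasts to the study population.
        """
        r1 = float(np.mean(risks_if_exposed))
        r0 = float(np.mean(risks_if_unexposed))
        return r1 - r0, r1 / r0

    def bootstrap_rd_ci(risks_exposed, risks_unexposed, n_boot=2000, seed=0):
        """Percentile bootstrap CI for the standardized risk difference,
        mirroring the paper's bootstrap approach to precision."""
        rng = np.random.default_rng(seed)
        r1 = np.asarray(risks_exposed)
        r0 = np.asarray(risks_unexposed)
        n = len(r1)
        diffs = [r1[idx].mean() - r0[idx].mean()
                 for idx in (rng.integers(0, n, n) for _ in range(n_boot))]
        return np.percentile(diffs, [2.5, 97.5])
    ```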

  16. Estimating the risks of cancer mortality and genetic defects resulting from exposures to low levels of ionizing radiation

    SciTech Connect

    Buhl, T.E.; Hansen, W.R.

    1984-05-01

    Estimators for calculating the risk of cancer and genetic disorders induced by exposure to ionizing radiation have been recommended by the US National Academy of Sciences Committee on the Biological Effects of Ionizing Radiations, the UN Scientific Committee on the Effects of Atomic Radiation, and the International Committee on Radiological Protection. These groups have also considered the risks of somatic effects other than cancer. The US National Council on Radiation Protection and Measurements has discussed risk estimate procedures for radiation-induced health effects. The recommendations of these national and international advisory committees are summarized and compared in this report. Based on this review, two procedures for risk estimation are presented for use in radiological assessments performed by the US Department of Energy under the National Environmental Policy Act of 1969 (NEPA). In the first procedure, age- and sex-averaged risk estimators calculated with US average demographic statistics would be used with estimates of radiation dose to calculate the projected risk of cancer and genetic disorders that would result from the operation being reviewed under NEPA. If more site-specific risk estimators are needed, and the demographic information is available, a second procedure is described that would involve direct calculation of the risk estimators using recommended risk-rate factors. The computer program REPCAL has been written to perform this calculation and is described in this report. 25 references, 16 tables.

  17. 49 CFR Appendix G to Part 222 - Excess Risk Estimates for Public Highway-Rail Grade Crossings

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 49 Transportation 4 2010-10-01 2010-10-01 false Excess Risk Estimates for Public Highway-Rail Grade Crossings G Appendix G to Part 222 Transportation Other Regulations Relating to Transportation... HIGHWAY-RAIL GRADE CROSSINGS Pt. 222, App. G Appendix G to Part 222—Excess Risk Estimates for...

  18. COMMUNITY-RANDOMIZED INTERVENTION TRIAL WITH UV DISINFECTION FOR ESTIMATING THE RISK OF PEDIATRIC ILLNESS FROM MUNICIPAL GROUNDWATER CONSUMPTION

    EPA Science Inventory

    The goal of this study is to estimate the risk of childhood febrile and gastrointestinal illnesses associated with drinking municipal water from a groundwater source. The risk estimate will be partitioned into two separate components— illness attributable to contaminated...

  19. 49 CFR Appendix G to Part 222 - Excess Risk Estimates for Public Highway-Rail Grade Crossings

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 49 Transportation 4 2014-10-01 2014-10-01 false Excess Risk Estimates for Public Highway-Rail Grade Crossings G Appendix G to Part 222 Transportation Other Regulations Relating to Transportation... HIGHWAY-RAIL GRADE CROSSINGS Pt. 222, App. G Appendix G to Part 222—Excess Risk Estimates for...

  20. 49 CFR Appendix G to Part 222 - Excess Risk Estimates for Public Highway-Rail Grade Crossings

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 49 Transportation 4 2011-10-01 2011-10-01 false Excess Risk Estimates for Public Highway-Rail Grade Crossings G Appendix G to Part 222 Transportation Other Regulations Relating to Transportation... HIGHWAY-RAIL GRADE CROSSINGS Pt. 222, App. G Appendix G to Part 222—Excess Risk Estimates for...

  1. 49 CFR Appendix G to Part 222 - Excess Risk Estimates for Public Highway-Rail Grade Crossings

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 49 Transportation 4 2013-10-01 2013-10-01 false Excess Risk Estimates for Public Highway-Rail Grade Crossings G Appendix G to Part 222 Transportation Other Regulations Relating to Transportation... HIGHWAY-RAIL GRADE CROSSINGS Pt. 222, App. G Appendix G to Part 222—Excess Risk Estimates for...

  2. 49 CFR Appendix G to Part 222 - Excess Risk Estimates for Public Highway-Rail Grade Crossings

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 49 Transportation 4 2012-10-01 2012-10-01 false Excess Risk Estimates for Public Highway-Rail Grade Crossings G Appendix G to Part 222 Transportation Other Regulations Relating to Transportation... HIGHWAY-RAIL GRADE CROSSINGS Pt. 222, App. G Appendix G to Part 222—Excess Risk Estimates for...

  3. Parametric estimation of P(X > Y) for normal distributions in the context of probabilistic environmental risk assessment

    PubMed Central

    Bekker, Andriëtte A.; van der Voet, Hilko; ter Braak, Cajo J.F.

    2015-01-01

    Estimating the risk, P(X > Y), in probabilistic environmental risk assessment of nanoparticles is a problem when confronted by potentially small risks and small sample sizes of the exposure concentration X and/or the effect concentration Y. This is illustrated in the motivating case study of aquatic risk assessment of nano-Ag. A non-parametric estimator based on data alone is not sufficient as it is limited by sample size. In this paper, we investigate the maximum gain possible when making strong parametric assumptions as opposed to making no parametric assumptions at all. We compare maximum likelihood and Bayesian estimators with the non-parametric estimator and study the influence of sample size and risk on the (interval) estimators via simulation. We found that the parametric estimators enable us to estimate and bound the risk for smaller sample sizes and small risks. Also, the Bayesian estimator outperforms the maximum likelihood estimators in terms of coverage and interval lengths and is, therefore, preferred in our motivating case study. PMID:26312175
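    Under the strongest parametric assumption considered (X and Y independent and normally distributed), the risk has a closed form: X − Y is normal with mean μX − μY and variance σX² + σY², so P(X > Y) reduces to a single standard-normal tail probability; a plug-in maximum likelihood estimate substitutes sample means and standard deviations. A minimal sketch:

    ```python
    from math import erf, sqrt

    def prob_x_exceeds_y(mu_x, sd_x, mu_y, sd_y):
        """P(X > Y) for independent normal X (exposure) and Y (effect)."""
        z = (mu_x - mu_y) / sqrt(sd_x**2 + sd_y**2)
        return 0.5 * (1.0 + erf(z / sqrt(2.0)))  # standard normal CDF at z
    ```

    When the exposure distribution sits far below the effect distribution, as in the nano-Ag case study, this returns the small risks that a non-parametric estimator cannot resolve at small sample sizes.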

  4. Quantitative Cyber Risk Reduction Estimation Methodology for a Small Scada Control System

    SciTech Connect

    Miles A. McQueen; Wayne F. Boyer; Mark A. Flynn; George A. Beitel

    2006-01-01

    We propose a new methodology for obtaining a quick quantitative measurement of the risk reduction achieved when a control system is modified with the intent to improve cyber security defense against external attackers. The proposed methodology employs a directed graph called a compromise graph, where the nodes represent stages of a potential attack and the edges represent the expected time-to-compromise for differing attacker skill levels. Time-to-compromise is modeled as a function of known vulnerabilities and attacker skill level. The methodology was used to calculate risk reduction estimates for a specific SCADA system and for a specific set of control system security remedial actions. Despite an 86% reduction in the total number of vulnerabilities, the estimated time-to-compromise increased by only about 3% to 30%, depending on the target and attacker skill level.
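    The compromise-graph traversal can be illustrated with a shortest-path computation: the fastest expected route from an entry node to a target gives the system-level time-to-compromise at a given attacker skill level. Node names and day counts below are invented for illustration and are not the paper's data.

    ```python
    import heapq

    def min_time_to_compromise(graph, start, target):
        """Minimum expected time through a compromise graph (Dijkstra).

        graph: {node: [(next_node, expected_days), ...]} -- edge weights
        stand in for per-stage time-to-compromise estimates.
        """
        dist = {start: 0.0}
        heap = [(0.0, start)]
        done = set()
        while heap:
            d, node = heapq.heappop(heap)
            if node in done:
                continue
            done.add(node)
            if node == target:
                return d
            for nxt, w in graph.get(node, []):
                nd = d + w
                if nd < dist.get(nxt, float("inf")):
                    dist[nxt] = nd
                    heapq.heappush(heap, (nd, nxt))
        return float("inf")  # target unreachable
    ```

    Comparing this minimum before and after a set of remedial actions yields the kind of risk-reduction estimate the methodology produces.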

  5. Estimating risks of importation and local transmission of Zika virus infection

    PubMed Central

    Nah, Kyeongah; Mizumoto, Kenji; Miyamatsu, Yuichiro; Yasuda, Yohei; Kinoshita, Ryo

    2016-01-01

    Background. An international spread of Zika virus (ZIKV) infection has attracted global attention. ZIKV is conveyed by a mosquito vector, Aedes species, which also acts as the vector species of dengue and chikungunya viruses. Methods. The arrival time of ZIKV importation (i.e., the time at which the first imported case was diagnosed) in each importing country was collected from publicly available data sources. Employing a survival analysis model in which the hazard is an inverse function of the effective distance as informed by the airline transportation network data, and using dengue and chikungunya virus transmission data, risks of importation and local transmission were estimated. Results. A total of 78 countries with imported case(s) have been identified, with the arrival time ranging from 1 to 44 weeks since ZIKV was first identified in Brazil in 2015. Whereas the risk of importation was well explained by the airline transportation network data, the risk of local transmission appeared to be best captured by additionally accounting for the presence of dengue and chikungunya viruses. Discussion. The risk of importation may be high given continued global travel of mildly infected travelers but, considering that the public health concerns over ZIKV infection stem from microcephaly, it is more important to focus on the risk of local and widespread transmission that could involve pregnant women. The predicted risk of local transmission was frequently seen in tropical and subtropical countries with dengue or chikungunya epidemic experience. PMID:27069825

  6. Estimating risks of importation and local transmission of Zika virus infection.

    PubMed

    Nah, Kyeongah; Mizumoto, Kenji; Miyamatsu, Yuichiro; Yasuda, Yohei; Kinoshita, Ryo; Nishiura, Hiroshi

    2016-01-01

    Background. An international spread of Zika virus (ZIKV) infection has attracted global attention. ZIKV is conveyed by a mosquito vector, Aedes species, which also acts as the vector species of dengue and chikungunya viruses. Methods. The arrival time of ZIKV importation (i.e., the time at which the first imported case was diagnosed) in each importing country was collected from publicly available data sources. Employing a survival analysis model in which the hazard is an inverse function of the effective distance as informed by the airline transportation network data, and using dengue and chikungunya virus transmission data, risks of importation and local transmission were estimated. Results. A total of 78 countries with imported case(s) have been identified, with the arrival time ranging from 1 to 44 weeks since ZIKV was first identified in Brazil in 2015. Whereas the risk of importation was well explained by the airline transportation network data, the risk of local transmission appeared to be best captured by additionally accounting for the presence of dengue and chikungunya viruses. Discussion. The risk of importation may be high given continued global travel of mildly infected travelers but, considering that the public health concerns over ZIKV infection stem from microcephaly, it is more important to focus on the risk of local and widespread transmission that could involve pregnant women. The predicted risk of local transmission was frequently seen in tropical and subtropical countries with dengue or chikungunya epidemic experience. PMID:27069825

  7. Cancer risk estimates from radiation therapy for heterotopic ossification prophylaxis after total hip arthroplasty

    SciTech Connect

    Mazonakis, Michalis; Berris, Theoharris; Damilakis, John; Lyraraki, Efrossyni

    2013-10-15

    Purpose: Heterotopic ossification (HO) is a frequent complication following total hip arthroplasty. This study was conducted to calculate the radiation dose to organs-at-risk and estimate the probability of cancer induction from radiotherapy for HO prophylaxis. Methods: Hip irradiation for HO with a 6 MV photon beam was simulated with the aid of a Monte Carlo model. A realistic humanoid phantom representing an average adult patient was implemented in the Monte Carlo environment for dosimetric calculations. The average out-of-field radiation dose to the stomach, liver, lung, prostate, bladder, thyroid, breast, uterus, and ovary was calculated. The organ equivalent dose to the colon, which was partly included within the treatment field, was also determined. Organ dose calculations were carried out using three different field sizes. The dependence of organ doses upon the block insertion into the primary beam for shielding the colon and prosthesis was investigated. The lifetime attributable risk for cancer development was estimated using organ-, age-, and gender-specific risk coefficients. Results: For a typical target dose of 7 Gy, organ doses varied from 1.0 to 741.1 mGy by the field dimensions and organ location relative to the field edge. Blocked field irradiations resulted in a dose range of 1.4–146.3 mGy. The most probable detriment from open field treatment of male patients was colon cancer, with a high risk of 564.3 × 10⁻⁵ to 837.4 × 10⁻⁵ depending upon the organ dose magnitude and the patient's age. The corresponding colon cancer risk for female patients was (372.2–541.0) × 10⁻⁵. The probability of bladder cancer development was more than 113.7 × 10⁻⁵ and 110.3 × 10⁻⁵ for males and females, respectively. The cancer risk range to other individual organs was reduced to (0.003–68.5) × 10⁻⁵. Conclusions: The risk for cancer induction from radiation therapy for HO prophylaxis after total hip arthroplasty varies considerably by the

  8. Estimating the risk of squamous cell cancer induction in skin following nonlinear optical imaging.

    PubMed

    Thomas, Giju; Nadiarnykh, Oleg; van Voskuilen, Johan; Hoy, Christopher L; Gerritsen, Hans C; Sterenborg, Henricus J C M

    2014-07-01

    High-power femtosecond (fs) laser pulses used for in-vivo nonlinear optical (NLO) imaging can form cyclobutane pyrimidine dimers (CPD) in DNA, which may lead to carcinogenesis via subsequent mutations. Since UV radiation from routine sun exposure is the primary source of CPD lesions, we evaluated the risk of CPD-related squamous cell carcinoma (SCC) in human skin due to NLO imaging relative to that from sun exposure. We developed a unique cancer risk model extending a previously published estimation of risk from exposure to continuous-wave (CW) lasers. This new model showed that the increase in CPD-related SCC in skin from NLO imaging is negligible above that due to regular sun exposure. PMID:23401419

  9. The economic value of reducing environmental health risks: Contingent valuation estimates of the value of information

    SciTech Connect

    Krieger, D.J.; Hoehn, J.P.

    1999-05-01

    Obtaining economically consistent values for changes in low-probability health risks continues to be a challenge for contingent valuation (CV) as well as for other valuation methods. One of the cited conditions for economic consistency is that estimated values be sensitive to the scope (differences in quantity or quality) of a good described in a CV application. The alleged limitations of CV pose a particular problem for environmental managers, who must often make decisions that affect human health risks. This paper demonstrates that a well-designed CV application can elicit scope-sensitive values even for programs that provide conceptually complex goods such as risk reduction. Specifically, it finds that the amount sport anglers are willing to pay for information about chemical residues in fish varies systematically with informativeness--a relationship suggested by the theory of information value.

  10. Forest fire risk estimation from time series analysis of NOAA NDVI data

    NASA Astrophysics Data System (ADS)

    Gabban, Andrea; Liberta, Giorgio; San-Miguel-Ayanz, Jesus; Barbosa, Paulo

    2004-02-01

    The values of the Normalized Difference Vegetation Index (NDVI) obtained from the NOAA Advanced Very High Resolution Radiometer (AVHRR) have often been used for forestry applications, including the assessment of fire risk. Forest fire risk estimates have been based mainly on the decrease of NDVI values during the summer in areas subject to summer drought. However, the inter-annual variability of the vegetation response has never been extensively taken into account. The present work is based on the assumption that Mediterranean vegetation is adapted to summer drought and that one possible estimator of vegetation stress is the inter-annual variability of the vegetation status, as reflected by NDVI values. This article presents a novel methodology for the assessment of fire risk based on the comparison of the current NDVI values, for a given area, with the historical values along a time series of 13 years. The first part of the study focuses on the characterization of the minimum and maximum long-term daily images. The second part centers on the best method to compare the long-term maximum and minimum with the current NDVI. A statistical index, the Dynamic Relative Greenness (DRG), was tested as a novel fire risk indicator.
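    A common relative-greenness formulation, which a DRG-style index builds on, scores the current NDVI of a pixel against its long-term minimum and maximum; the exact DRG definition in the paper may differ, so treat this as an illustrative sketch.

    ```python
    def relative_greenness(ndvi_now, ndvi_min, ndvi_max):
        """Score current NDVI against the historical range (0-100).

        Low values (canopy far below its historical maximum) suggest drier
        vegetation and hence higher fire risk; ndvi_min/ndvi_max would come
        from the 13-year time series of long-term daily images.
        """
        span = ndvi_max - ndvi_min
        if span <= 0:
            raise ValueError("need ndvi_max > ndvi_min")
        return 100.0 * (ndvi_now - ndvi_min) / span
    ```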

  11. Estimates of Radiation Doses and Cancer Risk from Food Intake in Korea

    PubMed Central

    2016-01-01

    The aim of this study was to estimate internal radiation doses and lifetime cancer risk from food ingestion. Radiation doses from food intake were calculated using the Korea National Health and Nutrition Examination Survey and the measured radioactivity of 134Cs, 137Cs, and 131I from the Ministry of Food and Drug Safety in Korea. The total number of measured data was 8,496 (3,643 for agricultural products, 644 for livestock products, 43 for milk products, 3,193 for marine products, and 973 for processed food). Cancer risk was calculated by multiplying the estimated committed effective dose by the detriment-adjusted nominal risk coefficients recommended by the International Commission on Radiological Protection. The lifetime committed effective doses from the daily diet range from 2.957 to 3.710 mSv. Excess lifetime cancer risks are 14.4-18.1, 0.4-0.5, and 1.8-2.3 per 100,000 for all solid cancers combined, thyroid cancer, and leukemia, respectively. PMID:26770031
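    The risk calculation described (committed effective dose multiplied by a detriment-adjusted nominal risk coefficient) can be sketched as follows, using the ICRP Publication 103 whole-population cancer coefficient of 5.5 × 10⁻² per Sv as an illustrative default; the study's own age- and sex-specific coefficients may differ, so this does not reproduce its exact figures.

    ```python
    def excess_lifetime_risk_per_100k(dose_msv, risk_per_sv=5.5e-2):
        """Excess lifetime cancer risk per 100,000 people.

        dose_msv: lifetime committed effective dose in mSv.
        risk_per_sv: detriment-adjusted nominal risk coefficient
        (default: ICRP 103 whole-population cancer value, 5.5e-2 per Sv).
        """
        dose_sv = dose_msv / 1000.0          # mSv -> Sv
        return dose_sv * risk_per_sv * 100_000
    ```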

  12. Estimates of Radiation Doses and Cancer Risk from Food Intake in Korea.

    PubMed

    Moon, Eun-Kyeong; Ha, Wi-Ho; Seo, Songwon; Jin, Young Woo; Jeong, Kyu Hwan; Yoon, Hae-Jung; Kim, Hyoung-Soo; Hwang, Myung-Sil; Choi, Hoon; Lee, Won Jin

    2016-01-01

    The aim of this study was to estimate internal radiation doses and lifetime cancer risk from food ingestion. Radiation doses from food intake were calculated using the Korea National Health and Nutrition Examination Survey and the measured radioactivity of 134Cs, 137Cs, and 131I from the Ministry of Food and Drug Safety in Korea. The total number of measured data was 8,496 (3,643 for agricultural products, 644 for livestock products, 43 for milk products, 3,193 for marine products, and 973 for processed food). Cancer risk was calculated by multiplying the estimated committed effective dose by the detriment-adjusted nominal risk coefficients recommended by the International Commission on Radiological Protection. The lifetime committed effective doses from the daily diet range from 2.957 to 3.710 mSv. Excess lifetime cancer risks are 14.4-18.1, 0.4-0.5, and 1.8-2.3 per 100,000 for all solid cancers combined, thyroid cancer, and leukemia, respectively. PMID:26770031

  13. Injury Risk Estimation Expertise: Cognitive-Perceptual Mechanisms of ACL-IQ.

    PubMed

    Petushek, Erich J; Cokely, Edward T; Ward, Paul; Myer, Gregory D

    2015-06-01

    Instrument-based biomechanical movement analysis is an effective injury screening method but relies on expensive equipment and time-consuming analysis. Screening methods that rely on visual inspection and perceptual skill for prognosticating injury risk provide an alternative approach that can significantly reduce cost and time. However, substantial individual differences exist in the skill of estimating injury risk via observation. The underlying perceptual-cognitive mechanisms of injury risk identification were explored to better understand the nature of this skill and provide a foundation for improving performance. Quantitative structural and process modeling of risk estimation indicated that superior performance was largely mediated by specific strategies and skills (e.g., irrelevant information reduction), and independent of domain-general cognitive abilities (e.g., mental rotation, general decision skill). These cognitive models suggest that injury prediction expertise (i.e., ACL-IQ) is a trainable skill, and they provide a foundation for future research and applications in training, decision support, and ultimately clinical screening investigations. PMID:26265341

  14. Estimate of the risks of disposing nonhazardous oil field wastes into salt caverns

    SciTech Connect

    Tomasko, D.; Elcock, D.; Veil, J.

    1997-12-31

    Argonne National Laboratory (ANL) has completed an evaluation of the possibility that adverse human health effects (carcinogenic and noncarcinogenic) could result from exposure to contaminants released from nonhazardous oil field wastes (NOW) disposed in domal salt caverns. Potential human health risks associated with hazardous substances (arsenic, benzene, cadmium, and chromium) in NOW were assessed under four postclosure cavern release scenarios: inadvertent cavern intrusion, failure of the cavern seal, failure of the cavern through cracks or leaky interbeds, and a partial collapse of the cavern roof. To estimate potential human health risks for these scenarios, contaminant concentrations at the receptor were calculated using a one-dimensional solution to an advection/dispersion equation that included first-order degradation. Assuming a single, generic salt cavern and generic oil-field wastes, the best-estimate excess cancer risks ranged from 1.7 × 10⁻¹² to 1.1 × 10⁻⁸, and hazard indices (referring to noncancer health effects) ranged from 7 × 10⁻⁹ to 7 × 10⁻⁴. Under worst-case conditions in which the probability of cavern failure is 1.0, excess cancer risks ranged from 4.9 × 10⁻⁹ to 1.7 × 10⁻⁵, and hazard indices ranged from 7.0 × 10⁻⁴ to 0.07. Even under worst-case conditions, the risks are within the US Environmental Protection Agency (EPA) target range for acceptable exposure levels. From a human health risk perspective, salt caverns can therefore provide an acceptable disposal method for NOW.
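    The transport model here is a one-dimensional advection/dispersion equation with first-order degradation. As an illustration of how such a solution behaves, the steady-state version D·C″ − v·C′ − λ·C = 0 with C(0) = C0 and C bounded downstream has the closed form below; the study itself used a transient solution, and all parameter values in this sketch are hypothetical.

    ```python
    from math import exp, sqrt

    def steady_state_concentration(x, c0, v, D, lam):
        """Steady-state 1-D advection-dispersion with first-order decay.

        Solves D*C'' - v*C' - lam*C = 0, C(0) = c0, C bounded as x -> inf.
        x: downstream distance; v: velocity; D: dispersion; lam: decay rate.
        """
        m = (v - sqrt(v * v + 4.0 * D * lam)) / (2.0 * D)  # negative root
        return c0 * exp(m * x)
    ```

    With no decay (lam = 0) the steady-state profile is flat at c0; adding decay makes the receptor concentration, and hence the estimated risk, fall off exponentially with distance.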

  15. Estimation of sport fish harvest for risk and hazard assessment of environmental contaminants

    SciTech Connect

    Poston, T.M.; Strenge, D.L.

    1989-01-01

    Consumption of contaminated fish flesh can be a significant route of human exposure to hazardous chemicals. Estimation of exposure resulting from the consumption of fish requires knowledge of fish consumption and contaminant levels in the edible portion of fish. Realistic figures of sport fish harvest are needed to estimate consumption. Estimates of freshwater sport fish harvest were developed from a review of 72 articles and reports. Descriptive statistics based on fishing pressure were derived from harvest data for four distinct groups of freshwater sport fish in three water types: streams, lakes, and reservoirs. Regression equations were developed to relate harvest to surface area fished where data bases were sufficiently large. Other aspects of estimating human exposure to contaminants in fish flesh that are discussed include use of bioaccumulation factors for trace metals and organic compounds. Using the bioaccumulation factor and the concentration of contaminants in water as variables in the exposure equation may also lead to less precise estimates of tissue concentration. For instance, muscle levels of contaminants may not increase proportionately with increases in water concentrations, leading to overestimation of risk. In addition, estimates of water concentration may be variable or expressed in a manner that does not truly represent biological availability of the contaminant. These factors are discussed. 45 refs., 1 fig., 7 tabs.

  16. Risk estimation for future glacier lake outburst floods based on local land-use changes

    NASA Astrophysics Data System (ADS)

    Nussbaumer, S.; Schaub, Y.; Huggel, C.; Walz, A.

    2014-06-01

    Effects of climate change are particularly strong in high-mountain regions. Most visibly, glaciers are shrinking at a rapid pace, and as a consequence, glacier lakes are forming or growing. At the same time the stability of mountain slopes is reduced by glacier retreat, permafrost thaw and other factors, resulting in an increasing landslide hazard which can potentially impact lakes and therewith trigger far-reaching and devastating outburst floods. To manage risks from existing or future lakes, strategies need to be developed to plan in time for adequate risk reduction measures at a local level. However, methods to assess risks from future lake outbursts are not available and need to be developed to evaluate both future hazard and future damage potential. Here a method is presented to estimate future risks related to glacier lake outbursts for a local site in southern Switzerland (Naters, Valais). To generate two hazard scenarios, glacier shrinkage and lake formation modelling was applied, combined with simple flood modelling and field work. Furthermore, a land-use model was developed to quantify and allocate land-use changes based on local-to-regional storylines and three scenarios of land-use driving forces. Results are conceptualized in a matrix of three land-use and two hazard scenarios for the year 2045, and show the distribution of risk in the community of Naters, including high and very high risk areas. The study underlines the importance of combined risk management strategies focusing on land-use planning, on vulnerability reduction, as well as on structural measures (where necessary) to effectively reduce future risks related to lake outburst floods.

  17. Local land-use change based risk estimation for future glacier lake outburst flood

    NASA Astrophysics Data System (ADS)

    Nussbaumer, S.; Huggel, C.; Schaub, Y.; Walz, A.

    2013-08-01

    Effects of climate change are particularly strong in high-mountain regions. Most visibly, glaciers are shrinking at a rapid pace, and as a consequence, glacier lakes are forming or growing. At the same time, the stability of mountain slopes is reduced by glacier retreat, permafrost thaw and other factors, resulting in an increasing risk of landslides which can potentially impact lakes and therewith trigger far-reaching and devastating outburst floods. To manage risks from existing or future lakes, strategies need to be developed to plan in time for adequate risk reduction measures at a local level. However, methods to assess risks from future lake outbursts are not available; it remains a challenge to develop methods that evaluate both future hazard potential and future damage potential. Here we present an analysis of future risks related to glacier lake outbursts for a local site in southern Switzerland (Naters, Valais). To estimate two hazard scenarios, we used glacier shrinkage and lake formation modelling, simple flood modelling and field work. Furthermore, we developed a land-use model to quantify and allocate land-use changes based on local-to-regional storylines and three scenarios of land-use driving forces. Results are conceptualized in a matrix of three land-use and two hazard scenarios for the year 2045, and show the distribution of risk in the community of Naters, including high and very high risk areas. The study corroborates the importance of land-use planning to effectively reduce future risks related to lake outburst floods.

  18. Estimating the risks of smoking, air pollution, and passive smoke on acute respiratory conditions

    SciTech Connect

    Ostro, B.D. )

    1989-06-01

    Five years of the annual Health Interview Survey, conducted by the National Center for Health Statistics, are used to estimate the effects of air pollution, smoking, and environmental tobacco smoke on respiratory restrictions in activity for adults, and bed disability for children. After adjusting for several socioeconomic factors, the multiple regression estimates indicate that an independent and statistically significant association exists between these three forms of air pollution and respiratory morbidity. The comparative risks of these exposures are computed and the plausibility of the relative risks is examined by comparing the equivalent doses with actual measurements of exposure taken in the homes of smokers. The results indicate that: (1) smokers will have a 55-75% excess in days with respiratory conditions severe enough to cause reductions in normal activity; (2) a 1 microgram increase in fine particulate matter air pollution is associated with a 3% excess in acute respiratory disease; and (3) a pack-a-day smoker will increase respiratory restricted days for a nonsmoking spouse by 20% and increase the number of bed disability days for young children living in the household by 20%. The results also indicate that the estimates of the effects of secondhand smoking on children are improved when the mother's work status is known and incorporated into the exposure estimate.

  19. Problems and solutions in the estimation of genetic risks from radiation and chemicals

    SciTech Connect

    Russell, W. L.

    1980-01-01

    Extensive investigations with mice on the effects of various physical and biological factors, such as dose rate, sex and cell stage, on radiation-induced mutation have provided an evaluation of the genetic hazards of radiation in man. The mutational results obtained in both sexes with progressive lowering of the radiation dose rate have permitted estimation of the mutation frequency expected under the low-level radiation conditions of most human exposure. Supplementing the studies on mutation frequency are investigations on the phenotypic effects of mutations in mice, particularly anatomical disorders of the skeleton, which allow an estimation of the degree of human handicap associated with the occurrence of parallel defects in man. Estimation of the genetic risk from chemical mutagens is much more difficult, and the research is much less advanced. Results on transmitted mutations in mice indicate a poor correlation with mutation induction in non-mammalian organisms.

  20. Marginal and Conditional Distribution Estimation from Double-Sampled Semi-Competing Risks Data

    PubMed Central

    Yu, Menggang; Yiannoutsos, Constantin T

    2015-01-01

    Informative dropout is a vexing problem for any biomedical study. Most existing statistical methods attempt to correct estimation bias related to this phenomenon by specifying unverifiable assumptions about the dropout mechanism. We consider a cohort study in Africa that uses an outreach program to ascertain the vital status for dropout subjects. These data can be used to identify a number of relevant distributions. However, as only a subset of dropout subjects were followed, vital status ascertainment was incomplete. We use semi-competing risk methods as our analysis framework to address this specific case where the terminal event is incompletely ascertained and consider various procedures for estimating the marginal distribution of dropout and the marginal and conditional distributions of survival. We also consider model selection and estimation efficiency in our setting. Performance of the proposed methods is demonstrated via simulations, asymptotic study, and analysis of the study data. PMID:26924877

  1. Sensitivity Analysis of Median Lifetime on Radiation Risks Estimates for Cancer and Circulatory Disease amongst Never-Smokers

    NASA Technical Reports Server (NTRS)

    Chappell, Lori J.; Cucinotta, Francis A.

    2011-01-01

    Radiation risks are estimated in a competing-risk formalism in which age- or time-after-exposure estimates of increased risk for cancer and circulatory diseases are folded with the probability of surviving to a given age. The survival function, also called the life-table, changes with calendar year, gender, smoking status and other demographic variables. An outstanding problem in risk estimation is the method of risk transfer between an exposed population and a second population where risks are to be estimated. Approaches used to transfer risks are based on: 1) the multiplicative risk transfer model, in which excess risks are proportional to background disease rates; and 2) the additive risk transfer model, in which excess risks are independent of background rates. In addition, a mixture model is often considered, in which the multiplicative and additive transfer assumptions are given weighted contributions. We studied the influence of the survival probability on the risk of exposure-induced cancer and circulatory disease morbidity and mortality in the multiplicative transfer model and the mixture model. Risks for never-smokers (NS) compared to the average U.S. population are estimated to be reduced by between 30% and 60%, depending on model assumptions. Lung cancer is the major contributor to the reduction for NS, with additional contributions from circulatory diseases and cancers of the stomach, liver, bladder, oral cavity, esophagus, colon, a portion of the solid cancer remainder, and leukemia. Greater improvements in risk estimates for NS are possible, and would depend on improved understanding of risk transfer models and on elucidating the role of space radiation in the various stages of disease formation (e.g. initiation, promotion, and progression).
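    The multiplicative, additive, and mixture transfer assumptions, and the folding of excess rates with the survival function, can be sketched as two small functions. This is a minimal illustration of the formalism described in the abstract, not the study's implementation; the parameter names and the mixture weight v are assumptions.

    ```python
    def transferred_excess_rate(err, ear, background_rate, v):
        """Excess disease rate transferred to a target population.

        err             -- excess relative risk (multiplicative model)
        ear             -- excess absolute rate (additive model)
        background_rate -- target population's background disease rate
        v               -- mixture weight: 1.0 = purely multiplicative,
                           0.0 = purely additive
        """
        multiplicative = err * background_rate  # scales with background rates
        additive = ear                          # independent of background rates
        return v * multiplicative + (1.0 - v) * additive


    def lifetime_risk(excess_rates, survival_probs):
        """Fold age-specific excess rates with the probability of surviving
        to each age (the life-table), as in the competing-risk formalism."""
        return sum(m * s for m, s in zip(excess_rates, survival_probs))
    ```

    Under this sketch, a never-smoker life-table with lower background lung-cancer rates lowers the transferred risk in proportion to the mixture weight given to the multiplicative term.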

  2. Radiation-Induced Leukemia at Doses Relevant to Radiation Therapy: Modeling Mechanisms and Estimating Risks

    NASA Technical Reports Server (NTRS)

    Shuryak, Igor; Sachs, Rainer K.; Hlatky, Lynn; Little, Mark P.; Hahnfeldt, Philip; Brenner, David J.

    2006-01-01

    Because many cancer patients are diagnosed earlier and live longer than in the past, second cancers induced by radiation therapy have become a clinically significant issue. An earlier biologically based model that was designed to estimate risks of high-dose radiation-induced solid cancers included initiation of stem cells to a premalignant state, inactivation of stem cells at high radiation doses, and proliferation of stem cells during cellular repopulation after inactivation. This earlier model predicted the risks of solid tumors induced by radiation therapy but overestimated the corresponding leukemia risks. Methods: To extend the model to radiation-induced leukemias, we analyzed, in addition to cellular initiation, inactivation, and proliferation, a repopulation mechanism specific to the hematopoietic system: long-range migration through the blood stream of hematopoietic stem cells (HSCs) from distant locations. Parameters for the model were derived from HSC biologic data in the literature and from leukemia risks among atomic bomb survivors who were subjected to much lower radiation doses. Results: Proliferating HSCs that migrate from sites distant from the high-dose region include few preleukemic HSCs, thus decreasing the high-dose leukemia risk. The extended model for leukemia provides risk estimates that are consistent with epidemiologic data for leukemia risk associated with radiation therapy over a wide dose range. For example, when applied to an earlier case-control study of 110,000 women undergoing radiotherapy for uterine cancer, the model predicted an excess relative risk (ERR) of 1.9 for leukemia among women who received a large inhomogeneous fractionated external beam dose to the bone marrow (mean = 14.9 Gy), consistent with the measured ERR (2.0, 95% confidence interval [CI] = 0.2 to 6.4; from 3.6 cases expected and 11 cases observed).
As a corresponding example for brachytherapy, the predicted ERR of 0.80 among women who received an inhomogeneous low

  3. Estimates of radiological risk from depleted uranium weapons in war scenarios.

    PubMed

    Durante, Marco; Pugliese, Mariagabriella

    2002-01-01

    Several weapons used during the recent conflict in Yugoslavia contain depleted uranium, including missiles and armor-piercing incendiary rounds. Health concerns relate to the use of these weapons because of the heavy-metal toxicity and radioactivity of uranium. Although chemical toxicity is considered the more important source of health risk related to uranium, radiation exposure has been allegedly related to cancers among veterans of the Balkan conflict, and uranium munitions are a possible source of contamination in the environment. Actual measurements of radioactive contamination are needed to assess the risk. In this paper, a computer simulation is proposed to estimate radiological risk related to different exposure scenarios. Dose caused by inhalation of radioactive aerosols and ground contamination induced by Tomahawk missile impact are simulated using a Gaussian plume model (HOTSPOT code). Environmental contamination and committed dose to the population resident in contaminated areas are predicted by a food-web model (RESRAD code). Small values of committed effective dose equivalent appear to be associated with missile impacts (50-y CEDE < 5 mSv), or population exposure by water-independent pathways (50-y CEDE < 80 mSv). The greatest hazard is related to water contamination in conditions of effective leaching of uranium into the groundwater (50-y CEDE < 400 mSv). Even in this worst-case scenario, the chemical toxicity largely predominates over radiological risk. These computer simulations suggest that little radiological risk is associated with the use of depleted uranium weapons. PMID:11768794

  4. Quantitative risk estimation for a Legionella pneumophila infection due to whirlpool use.

    PubMed

    Bouwknegt, Martijn; Schijven, Jack F; Schalk, Johanna A C; de Roda Husman, Ana Maria

    2013-07-01

    Quantitative microbiological risk assessment was used to quantify the risk associated with the exposure to Legionella pneumophila in a whirlpool. Conceptually, air bubbles ascend to the surface, intercepting Legionella from the traversed water. At the surface the bubble bursts into dominantly noninhalable jet drops and inhalable film drops. Assuming that film drops carry half of the intercepted Legionella, a total of four (95% interval: 1-9) and 4.5×10(4) (4.4×10(4)-4.7×10(4)) cfu/min were estimated to be aerosolized for concentrations of 1 and 1,000 L. pneumophila per liter, respectively. Using a dose-response model for guinea pigs to represent humans, infection risks for active whirlpool use with 100 cfu/L water for 15 minutes were 0.29 (∼0.11-0.48) for susceptible males and 0.22 (∼0.06-0.42) for susceptible females. An L. pneumophila concentration of ≥1,000 cfu/L water was estimated to nearly always cause an infection (mean: 0.95; 95% interval: 0.9-∼1). Estimated infection risks were time-dependent, ranging from 0.02 (0-0.11) for 1-minute exposures to 0.93 (0.86-0.97) for 2-hour exposures when the L. pneumophila concentration was 100 cfu/L water. Pool water in Dutch bathing establishments should contain <100 cfu Legionella/L water. This study suggests that stricter provisions might be required to assure adequate public health protection. PMID:23078231
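    The exposure-to-infection chain described above can be sketched with a generic exponential dose-response model. The per-cfu infectivity r and the deposition fraction below are illustrative placeholders, not the values fitted from the guinea-pig data used in the study.

    ```python
    import math

    def inhaled_dose(aerosolized_cfu_per_min, minutes, deposition_fraction=0.01):
        """Dose (cfu) reaching the lung during whirlpool use.
        deposition_fraction is a hypothetical lumped inhalation/deposition term."""
        return aerosolized_cfu_per_min * minutes * deposition_fraction

    def infection_probability(dose_cfu, r=0.06):
        """Exponential dose-response: P(infection) = 1 - exp(-r * dose).
        r = 0.06 per cfu is an illustrative value, not the paper's estimate."""
        return 1.0 - math.exp(-r * dose_cfu)
    ```

    As in the abstract, risk under this model is strongly time-dependent: longer exposures accumulate dose and push the infection probability toward 1.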

  5. Estimates of the auditory risk from outdoor impulse noise. I: Firecrackers.

    PubMed

    Flamme, Gregory A; Liebe, Kevin; Wong, Adam

    2009-01-01

    Firecrackers are common impulse noise exposures in the United States. In this study, impulses produced outdoors by consumer firecrackers were recorded, described, and analyzed with respect to the amount of the auditory risk they pose to the unprotected listener under various listening conditions. Risk estimates were obtained using three contemporary damage risk criteria (DRC), including a waveform parameter-based approach (peak SPL and B duration), an energy-based criterion (A-weighted sound exposure level and equivalent continuous level), and a physiological model (the AHAAH model developed by Price and Kalb). Results from these DRC were converted into numbers of maximum permissible unprotected exposures to facilitate comparison. Acoustic characteristics of firecracker impulses varied with the distance, but only subtle differences were observed across firecrackers. Typical peak levels ranged between 171 dB SPL at 0.5 m and 142 dB SPL at 8 m. Estimates of the auditory risk did not differ significantly across firecrackers, but varied with the distance. Vast differences in maximum permissible exposures were observed, and the directions of the differences varied with the level of the impulse. Typical estimates of maximum permissible exposures ranged between 0 and 2 at 0.5 m and between 31 and 227,000 at 8 m. Unprotected exposures to firecracker impulses should be limited or avoided entirely if the firecrackers are ignited in batches within 8 m of the listener. Differences across DRC are inconsequential at 0.5 m, but have substantial implications at distances of 1 m and more. PMID:19805932

  6. Estimating health risk from exposure to 1,4-dioxane in Japan.

    PubMed

    Makino, Ryoji; Kawasaki, Hajime; Kishimoto, Atsuo; Gamo, Masashi; Nakanishi, Junko

    2006-01-01

    Exposure to 1,4-dioxane from the atmosphere around high-emission plants and from consumer products used in daily life that contain the substance may have adverse health effects; however, its emission into the atmosphere is not regulated. In this study, the health risk posed by 1,4-dioxane is assessed to investigate whether measures should be undertaken to reduce exposure to 1,4-dioxane. The notion of the margin of exposure (MOE), given by the ratio of no observed adverse effect level (NOAEL) to actual or projected exposure level, is used to assess risk. In exposure assessment, two types of exposure channel are considered: (a) the use of consumer products that contain 1,4-dioxane and (b) the inhalation of air around high-emission plants. To estimate exposure via channel (a), we measured the concentration of 1,4-dioxane in consumer products and estimated the interindividual variability of exposure by Monte Carlo simulation that reflects the measured data. To estimate exposure via channel (b), we employed a local-level atmospheric dispersion model to estimate the concentration of 1,4-dioxane immediately around high-emission plants. For hazard assessment, we derived the inhalatory and oral NOAELs for liver adenomas and carcinomas and the uncertainty factor. The results suggest that measures are not needed to reduce exposure to 1,4-dioxane from consumer products. As for inhalation exposure around high-emission plants, some residents may be exposed to health risks if certain conservative analytical conditions are assumed. Even in this case, we conclude that it is not necessary for Plant A to stop the use of 1,4-dioxane immediately and that medium- to long-term emission reduction measures should be sufficient. PMID:16685251
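    The margin-of-exposure screening described above reduces to a ratio and a comparison against a combined uncertainty factor. A minimal sketch, with illustrative numbers rather than the study's measured NOAELs and exposures:

    ```python
    def margin_of_exposure(noael, exposure):
        """MOE = NOAEL / exposure level (same units, e.g. mg/kg/day)."""
        if exposure <= 0:
            raise ValueError("exposure must be positive")
        return noael / exposure

    def exposure_of_concern(moe, uncertainty_factor=100.0):
        """Flag exposures whose MOE falls below the combined uncertainty
        factor (e.g. 10 for interspecies times 10 for interindividual
        variability; the default of 100 is illustrative)."""
        return moe < uncertainty_factor
    ```

    An MOE comfortably above the uncertainty factor supports the study's conclusion that no immediate reduction measures are needed; an MOE below it would suggest the opposite.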

  7. Estimating Loss-of-Coolant Accident Frequencies for the Standardized Plant Analysis Risk Models

    SciTech Connect

    S. A. Eide; D. M. Rasmuson; C. L. Atwood

    2008-09-01

    The U.S. Nuclear Regulatory Commission maintains a set of risk models covering the U.S. commercial nuclear power plants. These standardized plant analysis risk (SPAR) models include several loss-of-coolant accident (LOCA) initiating events such as small (SLOCA), medium (MLOCA), and large (LLOCA). All of these events involve a loss of coolant inventory from the reactor coolant system. In order to maintain a level of consistency across these models, initiating event frequencies generally are based on plant-type average performance, where the plant types are boiling water reactors and pressurized water reactors. For certain risk analyses, these plant-type initiating event frequencies may be replaced by plant-specific estimates. Frequencies for SPAR LOCA initiating events previously were based on results presented in NUREG/CR-5750, but the newest models use results documented in NUREG/CR-6928. The estimates in NUREG/CR-6928 are based on historical data from the initiating events database for pressurized water reactor SLOCA or an interpretation of results presented in the draft version of NUREG-1829. The information in NUREG-1829 can be used several ways, resulting in different estimates for the various LOCA frequencies. Various ways NUREG-1829 information can be used to estimate LOCA frequencies were investigated and this paper presents two methods for the SPAR model standard inputs, which differ from the method used in NUREG/CR-6928. In addition, results obtained from NUREG-1829 are compared with actual operating experience as contained in the initiating events database.

  8. Direct estimates of low-level radiation risks of lung cancer at two NRC-compliant nuclear installations: why are the new risk estimates 20 to 200 times the old official estimates?

    PubMed Central

    Bross, I. D.; Driscoll, D. L.

    1981-01-01

    An official report on the health hazards to nuclear submarine workers at the Portsmouth Naval Shipyard (PNS), who were exposed to low-level ionizing radiation, was based on a casual inspection of the data and not on statistical analyses of the dosage-response relationships. When these analyses are done, serious hazards from lung cancer and other causes of death are shown. As a result of the recent studies on nuclear workers, the new risk estimates have been found to be much higher than the official estimates currently used in setting NRC permissible levels. The official BEIR estimates are about one lung cancer death per year per million persons per rem. The PNS data show 189 lung cancer deaths per year per million persons per rem. PMID:7336762

  9. Polydimethylsiloxane-air partition ratios for semi-volatile organic compounds by GC-based measurement and COSMO-RS estimation: Rapid measurements and accurate modelling.

    PubMed

    Okeme, Joseph O; Parnis, J Mark; Poole, Justen; Diamond, Miriam L; Jantunen, Liisa M

    2016-08-01

    Polydimethylsiloxane (PDMS) shows promise for use as a passive air sampler (PAS) for semi-volatile organic compounds (SVOCs). To use PDMS as a PAS, knowledge of its chemical-specific partitioning behaviour and time to equilibrium is needed. Here we report on the effectiveness of two approaches for estimating the partitioning properties of polydimethylsiloxane (PDMS), values of PDMS-to-air partition ratios or coefficients (KPDMS-Air), and time to equilibrium of a range of SVOCs. Measured values of KPDMS-Air, Exp' at 25 °C obtained using the gas chromatography retention method (GC-RT) were compared with estimates from a poly-parameter linear free energy relationship (pp-LFER) and a COSMO-RS oligomer-based model. Target SVOCs included novel flame retardants (NFRs), polybrominated diphenyl ethers (PBDEs), polycyclic aromatic hydrocarbons (PAHs), organophosphate flame retardants (OPFRs), polychlorinated biphenyls (PCBs) and organochlorine pesticides (OCPs). Significant positive relationships were found between log KPDMS-Air, Exp' and estimates made using the pp-LFER model (log KPDMS-Air, pp-LFER) and the COSMOtherm program (log KPDMS-Air, COSMOtherm). The discrepancy and bias between measured and predicted values were much higher for COSMO-RS than for the pp-LFER model, confirming the anticipated better performance of the pp-LFER model relative to COSMO-RS. Calculations made using measured KPDMS-Air, Exp' values show that a PDMS PAS of 0.1 cm thickness will reach 25% of its equilibrium capacity in ∼1 day for alpha-hexachlorocyclohexane (α-HCH) to ∼500 years for tris (4-tert-butylphenyl) phosphate (TTBPP), which brackets the volatility range of all compounds tested. The results presented show the utility of the GC-RT method for rapid and precise measurements of KPDMS-Air. PMID:27179237
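    The quoted times to 25% of equilibrium capacity follow from a one-compartment, first-order uptake model. The rate-constant parameterization below is a generic sketch under assumed air-side control; the mass-transfer coefficient and film thickness are assumptions, not the paper's calibration.

    ```python
    import math

    def time_to_fraction(f, k_e):
        """First-order uptake C(t)/C_eq = 1 - exp(-k_e * t), solved for the
        time to reach fraction f of equilibrium: t = -ln(1 - f) / k_e."""
        return -math.log(1.0 - f) / k_e

    def exchange_rate_constant(k_pdms_air, film_thickness_m, mtc_m_per_day=100.0):
        """Illustrative air-side-limited rate constant: k_e = v / (K * delta).
        Larger partition ratios K or thicker films equilibrate more slowly,
        which is why high-K SVOCs can take years to centuries to equilibrate."""
        return mtc_m_per_day / (k_pdms_air * film_thickness_m)
    ```

    With these assumed parameters, raising K by four orders of magnitude stretches the time to 25% equilibrium from under a day to nearly a decade, mirroring the α-HCH-to-TTBPP spread reported above.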

  10. Impact of ground motion characterization on conservatism and variability in seismic risk estimates

    SciTech Connect

    Sewell, R.T.; Toro, G.R.; McGuire, R.K.

    1996-07-01

    This study evaluates the impact, on estimates of seismic risk and its uncertainty, of alternative methods of treating and characterizing earthquake ground motions. The objective of this study is to delineate specific procedures and characterizations that may lead to less biased and more precise seismic risk results. This report focuses on sources of conservatism and variability in risk that may be introduced through the analytical processes and ground-motion descriptions commonly implemented at the interface of seismic hazard and fragility assessments. In particular, the implications of the common practice of using a single, composite spectral shape to characterize motions of different magnitudes are investigated. Also, the impact of the parameterization of ground motion on fragility and hazard assessments is shown. Examination of these results demonstrates the following. (1) There is significant conservatism in the review spectra (usually, spectra characteristic of western U.S. earthquakes) that have been used in past seismic risk assessments and seismic margin assessments for eastern U.S. nuclear power plants. (2) There is a strong dependence of seismic fragility on earthquake magnitude when PGA is used as the ground-motion characterization. When, however, magnitude-dependent spectra are anchored to a common measure of elastic spectral acceleration averaged over the appropriate frequency range, seismic fragility shows no important or consistent dependence on either magnitude or strong-motion duration. Use of inelastic spectral acceleration (at the proper frequency) as the ground-spectrum anchor gives a very similar result. This study concludes that a single, composite-magnitude spectrum can generally be used to characterize ground motion for fragility assessment without introducing significant bias or uncertainty in seismic risk estimates.

  11. Estimated Insulin Sensitivity and Cardiovascular Disease Risk Factors in Adolescents with and without Type 1 Diabetes

    PubMed Central

    Specht, Brian J; Wadwa, R Paul; Snell-Bergeon, Janet K; Nadeau, Kristen J; Bishop, Franziska K; Maahs, David M.

    2012-01-01

    Objective To test the hypothesis that cardiovascular disease (CVD) risk factors are similar in adolescents with and without type 1 diabetes (T1D) in the most insulin-sensitive (IS) tertile, and that CVD risk factors become more atherogenic with decreasing IS in adolescents with T1D. Study design IS in adolescents with T1D (n=292; age=15.4±2.1 years; duration=8.8±3.0 years; HbA1c=8.9±1.6%) and non-diabetic (non-DM) controls (n=89; age=15.4±2.1 years) was estimated using the model: loge(IS) = 4.64725 - 0.02032(waist, cm) - 0.09779(HbA1c, %) - 0.00235(triglycerides, mg/dl). CVD risk factors (blood pressure; fasting total, LDL and HDL-cholesterol; hs-CRP; and BMI Z-score) were compared between all non-DM adolescents and those with T1D in the most IS tertile, and then examined for a linear trend by IS tertile in adolescents with T1D, adjusted for sex, race/ethnicity and Tanner stage. Results Estimated IS was significantly lower in adolescents with T1D than in those without (T1D=7.8±2.4, non-DM=11.5±2.9; p<0.0001). CVD risk factors were similar in non-DM adolescents and in adolescents with T1D in the most IS tertile, except for higher HDL-c and DBP in adolescents with T1D (p<0.05). Among adolescents with T1D, all CVD risk factors except HDL-c were more atherogenic across decreasing IS tertiles in linear regression analysis (p<0.05). Conclusion Adolescents with T1D who are the most IS have CVD risk factors similar to those of non-DM adolescents. CVD risk factors are inversely associated with IS in adolescents with T1D. IS may be an important therapeutic target for reducing CVD risk factors in adolescents with T1D. PMID:22921593
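    The IS estimation model is stated explicitly in the abstract and can be evaluated directly; the input values in the usage check below are illustrative, not study data.

    ```python
    import math

    def estimated_insulin_sensitivity(waist_cm, hba1c_pct, triglycerides_mg_dl):
        """loge(IS) = 4.64725 - 0.02032*waist - 0.09779*HbA1c - 0.00235*TG,
        as given in the abstract; returns IS on the natural scale."""
        log_is = (4.64725
                  - 0.02032 * waist_cm
                  - 0.09779 * hba1c_pct
                  - 0.00235 * triglycerides_mg_dl)
        return math.exp(log_is)
    ```

    For a hypothetical non-diabetic profile (waist 70 cm, HbA1c 5.2%, triglycerides 80 mg/dl) the model returns roughly 12.5, in line with the reported non-DM mean of 11.5±2.9; larger waist, HbA1c, or triglyceride values all lower the estimate.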

  12. Mathematical model to estimate risk of calcium-containing renal stones

    NASA Technical Reports Server (NTRS)

    Pietrzyk, R. A.; Feiveson, A. H.; Whitson, P. A.

    1999-01-01

    BACKGROUND/AIMS: Astronauts exposed to microgravity during the course of spaceflight undergo physiologic changes that alter the urinary environment so as to increase the risk of renal stone formation. This study was undertaken to identify a simple method with which to evaluate the potential risk of renal stone development during spaceflight. METHOD: We used a large database of urinary risk factors obtained from 323 astronauts before and after spaceflight to generate a mathematical model with which to predict the urinary supersaturation of calcium stone forming salts. RESULT: This model, which involves the fewest possible analytical variables (urinary calcium, citrate, oxalate, phosphorus, and total volume), reliably and accurately predicted the urinary supersaturation of the calcium stone forming salts when compared to results obtained from a group of 6 astronauts who collected urine during flight. CONCLUSIONS: The use of this model will simplify both routine medical monitoring during spaceflight as well as the evaluation of countermeasures designed to minimize renal stone development. This model also can be used for Earth-based applications in which access to analytical resources is limited.

  13. Longer genotypically-estimated leukocyte telomere length is associated with increased adult glioma risk

    PubMed Central

    Walsh, Kyle M.; Codd, Veryan; Rice, Terri; Nelson, Christopher P.; Smirnov, Ivan V.; McCoy, Lucie S.; Hansen, Helen M.; Elhauge, Edward; Ojha, Juhi; Francis, Stephen S.; Madsen, Nils R.; Bracci, Paige M.; Pico, Alexander R.; Molinaro, Annette M.; Tihan, Tarik; Berger, Mitchel S.; Chang, Susan M.; Prados, Michael D.; Jenkins, Robert B.; Wiemels, Joseph L.; Samani, Nilesh J.; Wiencke, John K.; Wrensch, Margaret R.

    2015-01-01

    Telomere maintenance has emerged as an important molecular feature with impacts on adult glioma susceptibility and prognosis. Whether longer or shorter leukocyte telomere length (LTL) is associated with glioma risk remains elusive and is often confounded by the effects of age and patient treatment. We sought to determine if genotypically-estimated LTL is associated with glioma risk and if inherited single nucleotide polymorphisms (SNPs) that are associated with LTL are glioma risk factors. Using a Mendelian randomization approach, we assessed differences in genotypically-estimated relative LTL in two independent glioma case-control datasets from the UCSF Adult Glioma Study (652 patients and 3735 controls) and The Cancer Genome Atlas (478 non-overlapping patients and 2559 controls). LTL estimates were based on a weighted linear combination of subject genotype at eight SNPs, previously associated with LTL in the ENGAGE Consortium Telomere Project. Mean estimated LTL was 31 bp (5.7%) longer in glioma patients than controls in discovery analyses (P = 7.82×10(-8)) and 27 bp (5.0%) longer in glioma patients than controls in replication analyses (P = 1.48×10(-3)). Glioma risk increased monotonically with each increasing septile of LTL (O.R.=1.12; P = 3.83×10(-12)). Four LTL-associated SNPs were significantly associated with glioma risk in pooled analyses, including those in the telomerase component genes TERC (O.R.=1.14; 95% C.I.=1.03-1.28) and TERT (O.R.=1.39; 95% C.I.=1.27-1.52), and those in the CST complex genes OBFC1 (O.R.=1.18; 95% C.I.=1.05-1.33) and CTC1 (O.R.=1.14; 95% C.I.=1.02-1.28). Future work is needed to characterize the role of the CST complex in gliomagenesis and further elucidate the complex balance between ageing, telomere length, and molecular carcinogenesis. PMID:26646793
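    The genotypic LTL estimate is a weighted linear combination of allele dosages at the eight ENGAGE SNPs. A sketch of that scoring, with hypothetical weights (the published per-allele effects are not reproduced here):

    ```python
    def genetic_ltl_score(allele_dosages, weights_bp):
        """Weighted linear combination of telomere-lengthening allele
        dosages (0, 1, or 2 per SNP). weights_bp are per-allele effects
        on LTL in base pairs; the values used below are illustrative."""
        if len(allele_dosages) != len(weights_bp):
            raise ValueError("need one weight per SNP")
        return sum(d * w for d, w in zip(allele_dosages, weights_bp))
    ```

    Because the score depends only on inherited genotype, it is unaffected by age and treatment, which is the point of the Mendelian randomization design.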

  14. Longer genotypically-estimated leukocyte telomere length is associated with increased adult glioma risk.

    PubMed

    Walsh, Kyle M; Codd, Veryan; Rice, Terri; Nelson, Christopher P; Smirnov, Ivan V; McCoy, Lucie S; Hansen, Helen M; Elhauge, Edward; Ojha, Juhi; Francis, Stephen S; Madsen, Nils R; Bracci, Paige M; Pico, Alexander R; Molinaro, Annette M; Tihan, Tarik; Berger, Mitchel S; Chang, Susan M; Prados, Michael D; Jenkins, Robert B; Wiemels, Joseph L; Samani, Nilesh J; Wiencke, John K; Wrensch, Margaret R

    2015-12-15

    Telomere maintenance has emerged as an important molecular feature with impacts on adult glioma susceptibility and prognosis. Whether longer or shorter leukocyte telomere length (LTL) is associated with glioma risk remains elusive and is often confounded by the effects of age and patient treatment. We sought to determine if genotypically-estimated LTL is associated with glioma risk and if inherited single nucleotide polymorphisms (SNPs) that are associated with LTL are glioma risk factors. Using a Mendelian randomization approach, we assessed differences in genotypically-estimated relative LTL in two independent glioma case-control datasets from the UCSF Adult Glioma Study (652 patients and 3735 controls) and The Cancer Genome Atlas (478 non-overlapping patients and 2559 controls). LTL estimates were based on a weighted linear combination of subject genotype at eight SNPs, previously associated with LTL in the ENGAGE Consortium Telomere Project. Mean estimated LTL was 31 bp (5.7%) longer in glioma patients than controls in discovery analyses (P = 7.82x10(-8)) and 27 bp (5.0%) longer in glioma patients than controls in replication analyses (P = 1.48x10(-3)). Glioma risk increased monotonically with each increasing septile of LTL (O.R.=1.12; P = 3.83x10(-12)). Four LTL-associated SNPs were significantly associated with glioma risk in pooled analyses, including those in the telomerase component genes TERC (O.R.=1.14; 95% C.I.=1.03-1.28) and TERT (O.R.=1.39; 95% C.I.=1.27-1.52), and those in the CST complex genes OBFC1 (O.R.=1.18; 95% C.I.=1.05-1.33) and CTC1 (O.R.=1.14; 95% C.I.=1.02-1.28). Future work is needed to characterize the role of the CST complex in gliomagenesis and further elucidate the complex balance between ageing, telomere length, and molecular carcinogenesis. PMID:26646793

  15. Patient-specific radiation dose and cancer risk estimation in CT: Part I. Development and validation of a Monte Carlo program

    SciTech Connect

    Li, Xiang; Samei, Ehsan; Segars, W. Paul; Sturgeon, Gregory M.; Colsher, James G.; Toncheva, Greta; Yoshizumi, Terry T.; Frush, Donald P.

    2011-01-15

    Purpose: Radiation-dose awareness and optimization in CT can greatly benefit from a dose-reporting system that provides dose and risk estimates specific to each patient and each CT examination. As the first step toward patient-specific dose and risk estimation, this article aimed to develop a method for accurately assessing radiation dose from CT examinations. Methods: A Monte Carlo program was developed to model a CT system (LightSpeed VCT, GE Healthcare). The geometry of the system, the energy spectra of the x-ray source, the three-dimensional geometry of the bowtie filters, and the trajectories of source motions during axial and helical scans were explicitly modeled. To validate the accuracy of the program, a cylindrical phantom was built to enable dose measurements at seven different radial distances from its central axis. Simulated radial dose distributions in the cylindrical phantom were validated against ion chamber measurements for single axial scans at all combinations of tube potential and bowtie filter settings. The accuracy of the program was further validated using two anthropomorphic phantoms (a pediatric one-year-old phantom and an adult female phantom). Computer models of the two phantoms were created based on their CT data and were voxelized for input into the Monte Carlo program. Simulated dose at various organ locations was compared against measurements made with thermoluminescent dosimetry chips for both single axial and helical scans. Results: For the cylindrical phantom, simulations differed from measurements by -4.8% to 2.2%. For the two anthropomorphic phantoms, the discrepancies between simulations and measurements ranged between (-8.1%, 8.1%) and (-17.2%, 13.0%) for the single axial scans and the helical scans, respectively. Conclusions: The authors developed an accurate Monte Carlo program for assessing radiation dose from CT examinations. When combined with computer models of actual patients, the program can provide accurate dose

  16. Patient-specific radiation dose and cancer risk estimation in CT: Part I. Development and validation of a Monte Carlo program

    PubMed Central

    Li, Xiang; Samei, Ehsan; Segars, W. Paul; Sturgeon, Gregory M.; Colsher, James G.; Toncheva, Greta; Yoshizumi, Terry T.; Frush, Donald P.

    2011-01-01

    Purpose: Radiation-dose awareness and optimization in CT can greatly benefit from a dose-reporting system that provides dose and risk estimates specific to each patient and each CT examination. As the first step toward patient-specific dose and risk estimation, this article aimed to develop a method for accurately assessing radiation dose from CT examinations. Methods: A Monte Carlo program was developed to model a CT system (LightSpeed VCT, GE Healthcare). The geometry of the system, the energy spectra of the x-ray source, the three-dimensional geometry of the bowtie filters, and the trajectories of source motions during axial and helical scans were explicitly modeled. To validate the accuracy of the program, a cylindrical phantom was built to enable dose measurements at seven different radial distances from its central axis. Simulated radial dose distributions in the cylindrical phantom were validated against ion chamber measurements for single axial scans at all combinations of tube potential and bowtie filter settings. The accuracy of the program was further validated using two anthropomorphic phantoms (a pediatric one-year-old phantom and an adult female phantom). Computer models of the two phantoms were created based on their CT data and were voxelized for input into the Monte Carlo program. Simulated dose at various organ locations was compared against measurements made with thermoluminescent dosimetry chips for both single axial and helical scans. Results: For the cylindrical phantom, simulations differed from measurements by −4.8% to 2.2%. For the two anthropomorphic phantoms, the discrepancies between simulations and measurements ranged between (−8.1%, 8.1%) and (−17.2%, 13.0%) for the single axial scans and the helical scans, respectively. Conclusions: The authors developed an accurate Monte Carlo program for assessing radiation dose from CT examinations. When combined with computer models of actual patients, the program can provide accurate dose

  17. Improving risk estimates of runoff producing areas: formulating variable source areas as a bivariate process.

    PubMed

    Cheng, Xiaoya; Shaw, Stephen B; Marjerison, Rebecca D; Yearick, Christopher D; DeGloria, Stephen D; Walter, M Todd

    2014-05-01

    Predicting runoff producing areas and their corresponding risks of generating storm runoff is important for developing watershed management strategies to mitigate non-point source pollution. However, few methods for making these predictions have been proposed, especially operational approaches that would be useful in areas where variable source area (VSA) hydrology dominates storm runoff. The objective of this study is to develop a simple approach to estimate spatially-distributed risks of runoff production. By considering the development of overland flow as a bivariate process, we incorporated both rainfall and antecedent soil moisture conditions into a method for predicting VSAs based on the Natural Resources Conservation Service Curve Number equation. We used base-flow immediately preceding storm events as an index of antecedent soil wetness status. Using nine sub-basins of the Upper Susquehanna River Basin, we demonstrated that our estimated runoff volumes and extent of VSAs agreed with observations. We further demonstrated a method for mapping these areas in a Geographic Information System using a Soil Topographic Index. The proposed methodology provides a new tool for watershed planners for quantifying runoff risks across watersheds, which can be used to target water quality protection strategies. PMID:24632403
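    The Curve Number relationship underlying this approach can be sketched as follows; the standard Ia = 0.2S initial-abstraction assumption is used, and the rainfall depth and CN value are hypothetical:

```python
def scs_runoff(p_mm, cn):
    """Direct storm runoff (mm) from the SCS/NRCS Curve Number equation."""
    s = 25.4 * (1000.0 / cn - 10.0)  # potential maximum retention (mm)
    ia = 0.2 * s                     # initial abstraction before runoff begins
    if p_mm <= ia:
        return 0.0
    return (p_mm - ia) ** 2 / (p_mm - ia + s)

# Hypothetical example: a 50 mm storm on a soil-cover complex with CN = 75
print(round(scs_runoff(50.0, 75), 1))
```

    Antecedent wetness enters by shifting the effective CN (equivalently, S), which is where the base-flow index described above comes in.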

  18. Health risk estimates for groundwater and soil contamination in the Slovak Republic: a convenient tool for identification and mapping of risk areas.

    PubMed

    Fajčíková, K; Cvečková, V; Stewart, A; Rapant, S

    2014-10-01

    We undertook a quantitative estimation of health risks to residents living in the Slovak Republic and exposed to contaminated groundwater (ingestion by adult population) and/or soils (ingestion by adult and child population). Potential risk areas were mapped to give a visual presentation at basic administrative units of the country (municipalities, districts, regions) for easy discussion with policy and decision-makers. The health risk estimates were calculated by US EPA methods, applying threshold values for chronic risk and non-threshold values for cancer risk. The potential health risk was evaluated for As, Ba, Cd, Cu, F, Hg, Mn, NO3 (-), Pb, Sb, Se and Zn for groundwater and As, B, Ba, Be, Cd, Cu, F, Hg, Mn, Mo, Ni, Pb, Sb, Se and Zn for soils. An increased health risk was identified mainly in historical mining areas highly contaminated by geogenic-anthropogenic sources (ore deposit occurrence, mining, metallurgy). Arsenic and antimony were the most significant elements in relation to health risks from groundwater and soil contamination in the Slovak Republic, contributing a significant part of the total chronic risk levels. Health risk estimation for soil contamination has highlighted the significance of exposure through soil ingestion in children. Increased cancer risks from groundwater and soil contamination by arsenic were noted in several municipalities and districts throughout the country in areas with significantly high arsenic levels in the environment. This approach to health risk estimations and visualization represents a fast, clear and convenient tool for delineation of risk areas at national and local levels. PMID:24729053
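    The US EPA calculations referenced here follow a standard pattern: a chronic daily intake (CDI) is divided by a reference dose to give a hazard quotient (threshold risk) and multiplied by a slope factor to give a cancer risk (non-threshold). A minimal sketch for the drinking-water pathway, with illustrative default exposure parameters and arsenic-like toxicity values (not figures from the study):

```python
def chronic_daily_intake(conc_mg_per_l, ir_l_day=2.0, ef_days=350,
                         ed_years=30, bw_kg=70.0, at_days=None):
    """CDI (mg/kg-day) for water ingestion with EPA-style default parameters."""
    if at_days is None:
        at_days = ed_years * 365  # noncancer averaging time = exposure duration
    return conc_mg_per_l * ir_l_day * ef_days * ed_years / (bw_kg * at_days)

# Hypothetical: 0.01 mg/L arsenic in groundwater used for drinking
cdi = chronic_daily_intake(0.01)
hq = cdi / 3.0e-4  # hazard quotient against an oral RfD of 3e-4 mg/kg-day
# Cancer risk averages over a 70-year lifetime and applies a slope factor.
cancer_risk = chronic_daily_intake(0.01, at_days=70 * 365) * 1.5
print(f"HQ = {hq:.2f}, cancer risk = {cancer_risk:.1e}")
```

    A hazard quotient above 1, or a cancer risk above roughly the 1e-6 to 1e-4 range, is what flags a municipality as a potential risk area in this kind of mapping.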

  19. Estimating the Size of Populations at High Risk for HIV Using Respondent-Driven Sampling Data

    PubMed Central

    Handcock, Mark S.; Gile, Krista J.; Mar, Corinne M.

    2015-01-01

    The study of hard-to-reach populations presents significant challenges. Typically, a sampling frame is not available, and population members are difficult to identify or recruit from broader sampling frames. This is especially true of populations at high risk for HIV/AIDS. Respondent-driven sampling (RDS) is often used in such settings with the primary goal of estimating the prevalence of infection. In such populations, the number of people at risk for infection and the number of people infected are of fundamental importance. This article presents a case-study of the estimation of the size of the hard-to-reach population based on data collected through RDS. We study two populations of female sex workers and men-who-have-sex-with-men in El Salvador. The approach is Bayesian and we consider different forms of prior information, including using the UNAIDS population size guidelines for this region. We show that the method is able to quantify the amount of information on population size available in RDS samples. As separate validation, we compare our results to those estimated by extrapolating from a capture–recapture study of El Salvadorian cities. The results of our case-study are largely comparable to those of the capture–recapture study when they differ from the UNAIDS guidelines. Our method is widely applicable to data from RDS studies and we provide a software package to facilitate this. PMID:25585794

  20. Uncertainty in the estimation of benzene risks: application of an uncertainty taxonomy to risk assessments based on an epidemiology study of rubber hydrochloride workers.

    PubMed Central

    Byrd, D M; Barfield, E T

    1989-01-01

    This paper reviews 14 risk assessments that use the data from descriptions by Rinsky, Young, and co-workers of benzene-associated leukemias among a group of rubber hydrochloride workers in Ohio. The leukemogenic risks of benzene estimated in these assessments differ. The assessors use different assumptions (parameters, confounding factors, or formulas), which account for the differences in risk. The purpose of the review is to determine whether the major source of uncertainty in assessments of benzene risk arises from data, method, or concept. The results show that methodological differences dominate the other two potential sources with respect to impact on risk magnitude. PMID:2792047

  1. Patient-specific radiation dose and cancer risk estimation in CT: Part II. Application to patients

    SciTech Connect

    Li Xiang; Samei, Ehsan; Segars, W. Paul; Sturgeon, Gregory M.; Colsher, James G.; Toncheva, Greta; Yoshizumi, Terry T.; Frush, Donald P.

    2011-01-15

    Purpose: Current methods for estimating and reporting radiation dose from CT examinations are largely patient-generic; the body size and hence dose variation from patient to patient is not reflected. Furthermore, the current protocol designs rely on dose as a surrogate for the risk of cancer incidence, neglecting the strong dependence of risk on age and gender. The purpose of this study was to develop a method for estimating patient-specific radiation dose and cancer risk from CT examinations. Methods: The study included two patients (a 5-week-old female patient and a 12-year-old male patient), who underwent 64-slice CT examinations (LightSpeed VCT, GE Healthcare) of the chest, abdomen, and pelvis at our institution in 2006. For each patient, a nonuniform rational B-spline (NURBS) based full-body computer model was created based on the patient's clinical CT data. Large organs and structures inside the image volume were individually segmented and modeled. Other organs were created by transforming an existing adult male or female full-body computer model (developed from visible human data) to match the framework defined by the segmented organs, referencing the organ volume and anthropometry data in ICRP Publication 89. A Monte Carlo program previously developed and validated for dose simulation on the LightSpeed VCT scanner was used to estimate patient-specific organ dose, from which effective dose and risks of cancer incidence were derived. Patient-specific organ dose and effective dose were compared with patient-generic CT dose quantities in current clinical use: the volume-weighted CT dose index (CTDIvol) and the effective dose derived from the dose-length product (DLP). Results: The effective dose for the CT examination of the newborn patient (5.7 mSv) was higher but comparable to that for the CT examination of the teenager patient (4.9 mSv) due to the size-based clinical CT protocols at our institution, which employ lower scan techniques for smaller

  2. Ecological risk assessment of multimedia hazardous air pollutants: estimating exposure and effects.

    PubMed

    Efroymson, R A; Murphy, D L

    2001-07-01

    Hazardous air pollutants, some of which have the potential for multimedia distribution, raise several hurdles for ecological risk assessment including: (1) the development of an adequate transport, fate and exposure model; and (2) the selection of exposure-response models that can accommodate multiple exposure routes for ecological receptors. To address the first issue, the EPA Office of Air Quality Planning and Standards has developed TRIM.FaTE, a mass-balance, fate, transport, and ecological exposure model that is a component of the Total Risk Integrated Methodology (TRIM) for air pollutants. In addition to abiotic transfers and transformations, TRIM.FaTE estimates the uptake of a chemical by terrestrial and aquatic organisms with time. Measures of exposure that TRIM.FaTE can provide include: (1) body burdens or tissue concentrations; (2) doses averaged over any time period; or (3) concentrations of chemicals in abiotic media. The model provides the user with the flexibility to choose the exposure-response thresholds or dose-response relationships that are best suited to data availability, routes of exposure, and the mechanism of toxicity of the chemical to an ecological receptor. One of the challenges of incorporating TRIM.FaTE into a risk assessment methodology lies in defining a streamlined model simulation scenario for initial screening-level risk assessments. These assessments may encompass multiple facilities that emit a variety of pollutants near diverse ecosystems. The information on ecological risk assessment methodology that is described is applicable to the EPA Residual Risk Program with emphasis on multimedia pollutants and the role of TRIM.FaTE. PMID:11453299

  3. Estimating drought risk across Europe from reported drought impacts, hazard indicators and vulnerability factors

    NASA Astrophysics Data System (ADS)

    Blauhut, V.; Stahl, K.; Stagge, J. H.; Tallaksen, L. M.; De Stefano, L.; Vogt, J.

    2015-12-01

    Drought is one of the most costly natural hazards in Europe. Due to its complexity, drought risk, the combination of the natural hazard and societal vulnerability, is difficult to define and challenging to detect and predict, as the impacts of drought are very diverse, covering the breadth of socioeconomic and environmental systems. Pan-European maps of drought risk could inform the elaboration of guidelines and policies to address its documented severity and impact across borders. This work (1) tests the capability of commonly applied hazard indicators and vulnerability factors to predict annual drought impact occurrence for different sectors and macro regions in Europe and (2) combines information on past drought impacts, drought hazard indicators, and vulnerability factors into estimates of drought risk at the pan-European scale. This "hybrid approach" bridges the gap between traditional vulnerability assessment and probabilistic impact forecast in a statistical modelling framework. Multivariable logistic regression was applied to predict the likelihood of impact occurrence on an annual basis for particular impact categories and European macro regions. The results indicate sector- and macro-region-specific sensitivities of hazard indicators, with the Standardised Precipitation Evapotranspiration Index for a twelve-month aggregation period (SPEI-12) as the overall best hazard predictor. Vulnerability factors have only limited ability to predict drought impacts as single predictors, with information about land use and water resources as the best vulnerability-based predictors. The application of the "hybrid approach" revealed strong regional (NUTS combo level) and sector-specific differences in drought risk across Europe. The majority of best predictor combinations rely on a combination of SPEI for shorter and longer aggregation periods, and a combination of information on land use and water resources. 
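    The multivariable logistic regression step can be sketched with a toy fit by gradient descent; the annual predictor values (an SPEI-12 hazard indicator and a vulnerability index) and impact labels below are invented, and a real analysis would use a statistics package and actual SPEI and land-use data:

```python
import math

def fit_logistic(X, y, lr=0.1, epochs=5000):
    """Fit P(impact = 1) = sigmoid(w0 + w . x) by batch gradient descent."""
    n, p = len(X), len(X[0])
    w = [0.0] * (p + 1)  # intercept followed by one coefficient per predictor
    for _ in range(epochs):
        grad = [0.0] * (p + 1)
        for xi, yi in zip(X, y):
            z = w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi))
            err = 1.0 / (1.0 + math.exp(-z)) - yi
            grad[0] += err
            for j, xj in enumerate(xi):
                grad[j + 1] += err * xj
        w = [wj - lr * g / n for wj, g in zip(w, grad)]
    return w

# Invented annual records: [SPEI-12, vulnerability index] -> impact reported?
X = [[-2.1, 0.8], [-1.5, 0.6], [-0.3, 0.7], [0.4, 0.2], [1.0, 0.3], [1.8, 0.1]]
y = [1, 1, 1, 0, 0, 0]
w = fit_logistic(X, y)
z = w[0] + w[1] * (-1.0) + w[2] * 0.5
print(f"P(impact | SPEI-12 = -1.0, vulnerability = 0.5) = "
      f"{1.0 / (1.0 + math.exp(-z)):.2f}")
```

    The fitted coefficient on SPEI-12 comes out negative, matching the intuition that more negative (drier) index values raise the likelihood of a reported impact.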
The added value of integrating regional vulnerability information

  4. Estimated phytanic acid intake and prostate cancer risk: a prospective cohort study.

    PubMed

    Wright, Margaret E; Bowen, Phyllis; Virtamo, Jarmo; Albanes, Demetrius; Gann, Peter H

    2012-09-15

    Phytanic acid is a saturated fatty acid found predominantly in red meat and dairy products and may contribute to increases in prostate cancer risk that are observed with higher intakes of these foods. We constructed a novel summary measure of phytanic acid intake and prospectively examined its association with prostate cancer risk in the Alpha-Tocopherol, Beta-Carotene Cancer Prevention Study, a cohort of Finnish male smokers aged 50-69 years. Diet was assessed at baseline in 27,111 participants using a validated 276-item dietary questionnaire. Since phytanic acid is not currently included in food composition tables, we used the published phytanic acid content of 151 major food items to estimate total daily intake. During up to 21 years of follow-up, a total of 1,929 incident prostate cancer cases (including 438 advanced cases) were identified. Higher phytanic acid intake, though unrelated to the risk of localized disease [relative risks (RR) and 95% confidence intervals (CI) for increasing quartiles of intake = 1.00 (ref), 0.83 (0.68-1.01), 0.76 (0.62-0.94) and 0.91 (0.74-1.13); p trend = 0.23], was associated with increased risks of advanced prostate cancer [RR and 95% CI = 1.00 (ref), 1.43 (1.09-1.89), 1.31 (0.99-1.75) and 1.38 (1.02-1.89); p trend = 0.06]. This association appeared to be driven predominantly by phytanic acid obtained from dairy products (particularly butter). Our study indicates that phytanic acid may contribute to previously observed associations between high-fat animal foods (particularly dairy products) and prostate cancer risk, although some caution is warranted as it may be acting as a surrogate marker of dairy fat. PMID:22120496

  5. Estimating the Risk of Renal Stone Events During Long-Duration Spaceflight

    NASA Technical Reports Server (NTRS)

    Reyes, David; Kerstman, Eric; Locke, James

    2014-01-01

    Introduction: Given the bone loss and increased urinary calcium excretion in the microgravity environment, persons participating in long-duration spaceflight may have an increased risk for renal stone formation. Renal stones are often an incidental finding of abdominal imaging studies done for other reasons. Thus, some crewmembers may have undiscovered, asymptomatic stones prior to their mission. Methods: An extensive literature search was conducted concerning the natural history of asymptomatic renal stones. For comparison, simulations were done using the Integrated Medical Model (IMM). The IMM is an evidence-based decision support tool that provides risk analysis and has the capability to optimize medical systems for missions by minimizing the occurrence of adverse mission outcomes such as evacuation and loss of crew life within specified mass and volume constraints. Results: The literature on the natural history of asymptomatic renal stones in the general medical population shows that the probability of a symptomatic event is 8% to 34% at 1 to 3 years for stones < 7 mm. Extrapolated to a 6-month mission, for stones < 5 to 7 mm, the risk for any stone event is about 4 to 6%, with a 0.7% to 4% risk for intervention, respectively. IMM simulations compare favorably with risk estimates garnered from the terrestrial literature. The IMM forecasts that symptomatic renal stones may be one of the top drivers for medical evacuation of an International Space Station (ISS) mission. Discussion: Although the likelihood of a stone event is low, the consequences could be severe due to limitations of current ISS medical capabilities. Therefore, these risks need to be quantified to aid planning, limit crew morbidity, and mitigate mission impacts. This will be especially critical for missions beyond Earth orbit, where evacuation may not be an option.
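    The extrapolation from multi-year clinical risks to a 6-month mission can be sketched by assuming a constant (exponential) hazard rate; this is a simplifying assumption for illustration, not the IMM's internal method:

```python
def rescale_risk(p_event, years, target_years=0.5):
    """Convert a cumulative event probability over `years` into the probability
    over `target_years`, assuming a constant hazard rate."""
    return 1.0 - (1.0 - p_event) ** (target_years / years)

# 8% over 3 years and 34% over 1 year, rescaled to a 6-month mission
low = rescale_risk(0.08, 3.0)
high = rescale_risk(0.34, 1.0)
print(f"6-month risk range: {low:.1%} to {high:.1%}")
```

    The rescaled figures bracket the 4 to 6% mission-risk range quoted above; where a given estimate falls depends on which stone sizes and follow-up intervals are paired.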

  6. Estimating drought risk across Europe from reported drought impacts, drought indices, and vulnerability factors

    NASA Astrophysics Data System (ADS)

    Blauhut, Veit; Stahl, Kerstin; Stagge, James Howard; Tallaksen, Lena M.; De Stefano, Lucia; Vogt, Jürgen

    2016-07-01

    Drought is one of the most costly natural hazards in Europe. Due to its complexity, drought risk, meant as the combination of the natural hazard and societal vulnerability, is difficult to define and challenging to detect and predict, as the impacts of drought are very diverse, covering the breadth of socioeconomic and environmental systems. Pan-European maps of drought risk could inform the elaboration of guidelines and policies to address its documented severity and impact across borders. This work tests the capability of commonly applied drought indices and vulnerability factors to predict annual drought impact occurrence for different sectors and macro regions in Europe and combines information on past drought impacts, drought indices, and vulnerability factors into estimates of drought risk at the pan-European scale. This hybrid approach bridges the gap between traditional vulnerability assessment and probabilistic impact prediction in a statistical modelling framework. Multivariable logistic regression was applied to predict the likelihood of impact occurrence on an annual basis for particular impact categories and European macro regions. The results indicate sector- and macro-region-specific sensitivities of drought indices, with the Standardized Precipitation Evapotranspiration Index (SPEI) for a 12-month accumulation period as the overall best hazard predictor. Vulnerability factors have only limited ability to predict drought impacts as single predictors, with information about land use and water resources being the best vulnerability-based predictors. The application of the hybrid approach revealed strong regional and sector-specific differences in drought risk across Europe. The majority of the best predictor combinations rely on a combination of SPEI for shorter and longer accumulation periods, and a combination of information on land use and water resources. 
The added value of integrating regional vulnerability information with drought risk prediction

  7. Estimated Phytanic Acid Intake and Prostate Cancer Risk: a Prospective Cohort Study

    PubMed Central

    Wright, Margaret E.; Bowen, Phyllis; Virtamo, Jarmo; Albanes, Demetrius; Gann, Peter H.

    2013-01-01

    Phytanic acid is a saturated fatty acid found predominantly in red meat and dairy products and may contribute to increases in prostate cancer risk that are observed with higher intakes of these foods. We constructed a novel summary measure of phytanic acid intake and prospectively examined its association with prostate cancer risk in the Alpha-Tocopherol, Beta-Carotene Cancer Prevention Study – a cohort of Finnish male smokers ages 50–69 years. Diet was assessed at baseline in 27,111 participants using a validated 276-item dietary questionnaire. Since phytanic acid is not currently included in food composition tables, we used the published phytanic acid content of 151 major food items to estimate total daily intake. During up to 20 years of follow-up, a total of 1,929 incident prostate cancer cases (including 438 advanced cases) were identified. Higher phytanic acid intake, though unrelated to the risk of localized disease [relative risks and 95% confidence intervals for increasing quartiles of intake = 1.00 (ref), 0.83 (0.68–1.01), 0.76 (0.62–0.94), and 0.91 (0.74–1.13); p trend = 0.23], was associated with increased risks of advanced prostate cancer [RR and 95% CI = 1.00 (ref), 1.43 (1.09–1.89), 1.31 (0.99–1.75), and 1.38 (1.02–1.89); p trend = 0.06]. This association appeared to be driven predominantly by phytanic acid obtained from dairy products (particularly butter). Our study indicates that phytanic acid may contribute to previously observed associations between high-fat animal foods (particularly dairy products) and prostate cancer risk, although some caution is warranted as it may be acting as a surrogate marker of dairy fat. PMID:22120496

  8. Propose a Wall Shear Stress Divergence to Estimate the Risks of Intracranial Aneurysm Rupture

    PubMed Central

    Zhang, Y.; Takao, H.; Murayama, Y.; Qian, Y.

    2013-01-01

    Although wall shear stress (WSS) has long been considered a critical indicator of intracranial aneurysm rupture, there is still no definite conclusion as to whether a high or a low WSS results in aneurysm rupture. The reason may be that the effect of WSS direction has not been fully considered. The objectives of this study are to investigate the magnitude of WSS (|WSS|) and its divergence on the aneurysm surface and to test the significance of both in relation to aneurysm rupture. Patient-specific computational fluid dynamics (CFD) was used to compute WSS and wall shear stress divergence (WSSD) on the aneurysm surface for nineteen patients. Our results revealed that if high |WSS| is stretching the aneurysm luminal surface, and the stretching region is concentrated, the aneurysm is at high risk of rupture. It seems that, by considering both the direction and magnitude of WSS, WSSD may be a better indicator for estimating the risk of aneurysm rupture. PMID:24191140

  9. Risk estimation of infectious diseases determines the effectiveness of the control strategy

    NASA Astrophysics Data System (ADS)

    Zhang, Haifeng; Zhang, Jie; Li, Ping; Small, Michael; Wang, Binghong

    2011-05-01

    Usually, whether or not to take vaccination is a voluntary decision, determined by many factors, from societal factors (such as religious belief and human rights) to individual preferences (including psychology and altruism). Facing outbreaks of infectious diseases, different people often have different estimates of the risk of infection, so some choose to vaccinate while others prefer to take the risk. In this paper, we establish two different risk assessment systems using the technique of dynamic programming, and then compare the effects of the two systems on the prevention of diseases on complex networks. In one, the perceived probability of being infected is the same for each individual (uniform case). In the other, the perceived probability of being infected is positively correlated with individual degree (preferential case). We show that these two risk assessment systems can yield completely different results, for example in the effectiveness of disease control and the time evolution of the number of infections.

  10. Social and economic factors of the natural risk increasing: estimation of the Russian regions

    NASA Astrophysics Data System (ADS)

    Petrova, E.

    2004-04-01

    This study attempts to quantitatively assess the social and economic factors that determine the vulnerability of Russian regions to natural risk, to trace spatial differences in these factors, and to group the regions by similarity. To indicate regional differences in social and economic development, equipment condition, accumulation of dangerous substances, and social trouble, the four most suitable parameters were estimated: per capita Gross Regional Product (GRP), capital consumption, volume of total toxic waste, and crime rate. An increase in the first parameter reduces vulnerability; increases in the last three raise it. Using multidimensional cluster analysis, five types of regions were identified for Russia according to similarity in the considered parameters. These types are characterized by a higher value of a single (rarely two) chosen parameter, which appears sufficient to increase natural risk in these regions in the near future. Only a few regions, belonging to the fifth type, proved to have a rather high GRP and relatively low values of the other parameters. A negative correlation was found between the number of natural disasters (ND) and per capita GRP in cases where some parameters reached anomalously high values. The distinctions between regions by the prevailing parameters that increase natural risk help risk managers identify where to focus their efforts.
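    The grouping step can be sketched with a plain k-means routine as a generic stand-in for the multidimensional cluster analysis used in the study; the standardized indicator values below are invented:

```python
import random

def kmeans(points, k, iters=50, seed=1):
    """Plain k-means on feature vectors (squared Euclidean distance)."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    groups = [[] for _ in range(k)]
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for pt in points:  # assign each point to its nearest center
            j = min(range(k),
                    key=lambda i: sum((a - b) ** 2
                                      for a, b in zip(pt, centers[i])))
            groups[j].append(pt)
        # Recompute centers as group means; keep old center if group is empty.
        centers = [[sum(col) / len(g) for col in zip(*g)] if g else centers[i]
                   for i, g in enumerate(groups)]
    return centers, groups

# Invented standardized indicators: [GRP per capita, toxic waste, crime rate]
regions = [[1.5, -0.5, -0.8], [1.2, -0.7, -0.6],  # prosperous, lower risk
           [-0.9, 1.4, 0.2], [-1.1, 1.1, 0.4],    # high waste accumulation
           [-0.4, 0.1, 1.6], [-0.3, -0.2, 1.8]]   # high crime rate
centers, groups = kmeans(regions, 3)
print([len(g) for g in groups])
```

    Each resulting cluster is then characterized by the indicator on which its members stand out, which is how the region types in the study are interpreted.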

  11. Contribution of molecular analyses to the estimation of the risk of congenital myotonic dystrophy.

    PubMed Central

    Cobo, A M; Poza, J J; Martorell, L; López de Munain, A; Emparanza, J I; Baiget, M

    1995-01-01

    A molecular analysis of the maternal and child CTG repeat size and intergenerational amplification was performed in order to estimate the risk of having a child with congenital myotonic dystrophy (CMD). In a study of 124 affected mother-child pairs (42 mother-CMD and 82 mother-non-CMD) the mean maternal CTG allele in CMD cases was three times higher (700 repeats) than in non-CMD cases (236 repeats). When the maternal allele was in the 50-300 repeat range, 90% of children were non-CMD. In contrast, when the maternal allele was greater than 300 repeats, 59% inherited the congenital form. Furthermore, the risk of having a CMD child is also related to the intergenerational amplification, which was significantly greater in the mother-CMD pairs than in the mother-non-CMD pairs. Although the risk of giving birth to a CMD child always exists for affected mothers, our data show that such a risk is considerably higher if the maternal allele is greater than 300 repeats. PMID:7760317

  12. Risk information in support of cost estimates for the Baseline Environmental Management Report (BEMR). Section 1

    SciTech Connect

    Gelston, G.M.; Jarvis, M.F.; Warren, B.R.; Von Berg, R.

    1995-06-01

    The Pacific Northwest Laboratory (PNL) effort on the overall Baseline Environmental Management Report (BEMR) project consists of four installation-specific work components performed in succession. These components include (1) development of source terms, (2) collection of data and preparation of environmental settings reports, (3) calculation of unit risk factors, and (4) utilization of the unit risk factors in the Automated Remedial Action Methodology (ARAM) for computation of target concentrations and cost estimates. This report documents work completed for the Nevada Test Site, Nevada, for components 2 and 3. The product of this phase of the BEMR project is the development of unit factors (i.e., unit transport factors, unit exposure factors, and unit risk factors). Thousands of these unit factors are generated and fill approximately one megabyte of computer information per installation. The final unit risk factors (URF) are transmitted electronically to BEMR-Cost task personnel as input to a computer program (ARAM). Abstracted files and exhibits of the URF information are included in this report. These visual formats are intended to provide a sample of the final task deliverable (the URF files) that can be easily read without a computer.

  13. Overall risk estimation for nonreactor nuclear facilities and implementation of safety goals

    SciTech Connect

    Kim, K.S.; Bradley, R.F.

    1993-06-01

    A typical safety analysis report (SAR) contains estimated frequencies and consequences of various design basis accidents (DBA). However, the results are organized and presented in such a way that they are not conducive to summing up with mathematical rigor to express total or overall risk. This paper describes a mathematical formalism for deriving total risk indicators. The mathematical formalism is based on the complementary cumulative distribution function (CCDF) or exceedance probability of radioactivity release fraction and individual radiation dose. A simple protocol is presented for establishing exceedance probabilities from the results of DBA analyses typically available from an SAR. The exceedance probability of release fraction can be a useful indicator for gaining insights into the capability of confinement barriers, characteristics of source terms, and scope of the SAR. Fatality risks comparable to the DOE Safety Goals can be derived from the exceedance probability of individual doses. Example case analyses are presented to illustrate the use of the proposed protocol and mathematical formalism. The methodology is finally applied to proposed risk guidelines for individual accident events to show that these guidelines would be within the DOE Safety Goals.
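    The exceedance-probability (CCDF) construction described here can be computed directly from a table of accident frequencies and consequences; the design-basis-accident values below are hypothetical placeholders for what an SAR would supply:

```python
def exceedance_curve(events):
    """CCDF: for each consequence level c, the total annual frequency of
    events whose consequence is >= c. `events` holds (freq_per_yr, dose)."""
    levels = sorted({dose for _, dose in events})
    return [(c, sum(f for f, dose in events if dose >= c)) for c in levels]

# Hypothetical design-basis accidents: (frequency per year, individual dose, rem)
dbas = [(1e-2, 0.05), (1e-3, 0.5), (1e-4, 5.0), (1e-6, 25.0)]
for dose, freq in exceedance_curve(dbas):
    print(f"frequency of dose >= {dose:>5} rem: {freq:.2e} per year")
```

    Fatality risks comparable to safety goals then follow by weighting each dose level with a dose-to-risk conversion factor and summing over the curve.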

  14. Use of binary logistic regression technique with MODIS data to estimate wild fire risk

    NASA Astrophysics Data System (ADS)

    Fan, Hong; Di, Liping; Yang, Wenli; Bonnlander, Brian; Li, Xiaoyan

    2007-11-01

    Many forest fires occur across the globe each year, destroying life and property and strongly impacting ecosystems. In recent years, wildland fires and altered fire disturbance regimes have become a significant management and science problem affecting ecosystems and the wildland/urban interface across the United States and globally. In this paper, we discuss the estimation of 504 probability models for forecasting fire risk for 14 fuel types, 12 months, and one day/week/month in advance, using 19 years of historical fire data in addition to meteorological and vegetation variables. MODIS land products are utilized as a major data source, and binary logistic regression was adopted to estimate fire probability. To better model the change of fire risk with the transition of seasons, spatial and temporal stratification strategies were applied. To explore the possibility of real-time prediction, the MATLAB Distributed Computing Toolbox was used to accelerate the prediction. Finally, this study evaluates and validates the predictions against collected ground truth. Validation results indicate that these fire risk models achieve nearly 70% prediction accuracy and that MODIS data are a viable data source for implementing near-real-time fire risk prediction.

  15. Waste management programmatic environmental impact statement methodology for estimating human health risks

    SciTech Connect

    Bergenback, B.; Blaylock, B.P.; Legg, J.L.

    1995-05-01

    The US Department of Energy (DOE) has produced large quantities of radioactive and hazardous waste during years of nuclear weapons production. As a result, a large number of sites across the DOE Complex have become chemically and/or radiologically contaminated. In 1990, the Secretary of Energy charged the DOE Office of Environmental Restoration and Waste Management (EM) with the task of preparing a Programmatic Environmental Impact Statement (PEIS). The PEIS should identify and assess the potential environmental impacts of implementing several integrated Environmental Restoration (ER) and Waste Management (WM) alternatives. The determination and integration of appropriate remediation activities and sound waste management practices are vital for ensuring the diminution of adverse human health impacts during site cleanup and waste management programs. This report documents the PEIS risk assessment methodology used to evaluate human health risks posed by WM activities. The methodology presents a programmatic cradle-to-grave risk assessment for EM program activities. A unit dose approach is used to estimate risks posed by WM activities and is the subject of this document.

  16. Cumulative Radiation Exposure and Cancer Risk Estimation in Children with Heart Disease

    PubMed Central

    Johnson, Jason N.; Hornik, Christoph P.; Li, Jennifer S.; Benjamin, Daniel K.; Yoshizumi, Terry; Reiman, Robert E.; Frush, Donald P.; Hill, Kevin D.

    2014-01-01

    Background Children with heart disease are frequently exposed to imaging examinations using ionizing radiation. Although radiation exposure is potentially carcinogenic, there are limited data on cumulative exposure and the associated cancer risk. We evaluated the cumulative effective dose (ED) of radiation from all radiation examinations to estimate the lifetime attributable risk (LAR) of cancer in children with heart disease. Methods and Results Children ≤6 years of age who had previously undergone 1 of 7 primary surgical procedures for heart disease at a single institution between 2005 and 2010 were eligible. Exposure to radiation-producing examinations was tabulated, and cumulative ED was calculated in millisievert (mSv). These data were used to estimate LAR of cancer above baseline using the approach of the Committee on Biological Effects of Ionizing Radiation VII. The cohort included 337 children exposed to 13,932 radiation examinations. Conventional radiographs represented 92% of examinations, while cardiac catheterization and computed tomography accounted for 81% of cumulative exposure. Overall median cumulative ED was 2.7 mSv (range 0.1–76.9 mSv), and the associated LAR of cancer was 0.07% (range 0.001–6.5%). Median LAR of cancer ranged widely depending on surgical complexity (0.006–1.6% for the 7 surgical cohorts) and was twice as high in females per unit exposure (0.04% versus 0.02% per 1 mSv ED for females versus males, respectively; p<0.001). Conclusions Overall radiation exposures in children with heart disease are relatively low; however, select cohorts receive significant exposure. Cancer risk estimation highlights the need for limiting radiation dose, particularly for high-exposure modalities. PMID:24914037
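
    The sex-specific rates quoted above (0.04% versus 0.02% LAR per 1 mSv of effective dose) imply a simple linear scaling, sketched below. This is an illustrative approximation only: the BEIR VII LAR also depends on age at exposure and cancer site, so these constants do not reproduce the paper's full model.

```python
# Per-mSv LAR rates quoted in the abstract (percent per mSv effective dose);
# a linear approximation only -- true LAR varies with age at exposure.
LAR_PER_MSV = {"F": 0.04, "M": 0.02}

def lar_percent(cumulative_ed_msv, sex):
    """Lifetime attributable risk (%) from cumulative effective dose (mSv)."""
    return LAR_PER_MSV[sex] * cumulative_ed_msv

# The cohort's median cumulative ED was 2.7 mSv.
risk_female = lar_percent(2.7, "F")
risk_male = lar_percent(2.7, "M")
```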

  17. Results from the HARPS-N 2014 Campaign to Estimate Accurately the Densities of Planets Smaller than 2.5 Earth Radii

    NASA Astrophysics Data System (ADS)

    Charbonneau, David; Harps-N Collaboration

    2015-01-01

    Although the NASA Kepler Mission has determined the physical sizes of hundreds of small planets, and we have in many cases characterized the star in detail, we know virtually nothing about the planetary masses: There are only 7 planets smaller than 2.5 Earth radii for which there exist published mass estimates with a precision better than 20 percent, the bare minimum value required to begin to distinguish between different models of composition. HARPS-N is an ultra-stable fiber-fed high-resolution spectrograph optimized for the measurement of very precise radial velocities. We have 80 nights of guaranteed time per year, of which half are dedicated to the study of small Kepler planets. In preparation for the 2014 season, we compared all available Kepler Objects of Interest to identify the ones for which our 40 nights could be used most profitably. We analyzed the Kepler light curves to constrain the stellar rotation periods, the lifetimes of active regions on the stellar surface, and the noise that would result in our radial velocities. We assumed various mass-radius relations to estimate the observing time required to achieve a mass measurement with a precision of 15%, giving preference to stars that had been well characterized through asteroseismology. We began by monitoring our long list of targets. Based on preliminary results we then selected our final short list, gathering typically 70 observations per target during summer 2014. The resulting mass measurements will have a significant impact on our understanding of these so-called super-Earths and small Neptunes. They would form a core dataset with which the international astronomical community can meaningfully seek to understand these objects and their formation in a quantitative fashion. HARPS-N was funded by the Swiss Space Office, the Harvard Origin of Life Initiative, the Scottish Universities Physics Alliance, the University of Geneva, the Smithsonian Astrophysical Observatory, the Italian National

  18. A simple score for estimating the long-term risk of fracture in patients with multiple sclerosis

    PubMed Central

    Bazelier, Marloes T.; van Staa, Tjeerd-Pieter; Uitdehaag, Bernard M.J.; Cooper, Cyrus; Leufkens, Hubert G.M.; Vestergaard, Peter; Bentzen, Joan

    2012-01-01

    Objective: To derive a simple score for estimating the long-term risk of osteoporotic and hip fracture in individual patients with MS. Methods: Using the UK General Practice Research Database linked to the National Hospital Registry (1997–2008), we identified patients with incident MS (n = 5,494). They were matched 1:6 by year of birth, sex, and practice with patients without MS (control subjects). Cox proportional hazards models were used to calculate the long-term risk of osteoporotic and hip fracture. We fitted the regression model with general and specific risk factors, and the final Cox model was converted into integer risk scores. Results: In comparison with the FRAX calculator, our risk score contains several new risk factors that have been linked with fracture, which include MS, use of antidepressants, use of anticonvulsants, history of falling, and history of fatigue. We estimated the 5- and 10-year risks of osteoporotic and hip fracture in relation to the risk score. The C-statistic was moderate (0.67) for the prediction of osteoporotic fracture and excellent (0.89) for the prediction of hip fracture. Conclusion: This is the first clinical risk score for fracture risk estimation involving MS as a risk factor. PMID:22895583
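
    Converting a fitted Cox model into an integer score, as described above, is commonly done Framingham-style: each coefficient (log hazard ratio) is divided by a reference coefficient and rounded. A sketch with hypothetical coefficient values, not those of the published score:

```python
def to_points(coefficients, reference):
    """Assign integer points per risk factor: round(beta / reference)."""
    return {name: round(beta / reference) for name, beta in coefficients.items()}

coefs = {                        # log hazard ratios (hypothetical values)
    "MS": 0.47,
    "antidepressant_use": 0.35,
    "anticonvulsant_use": 0.52,
    "history_of_falling": 0.70,
}
points = to_points(coefs, reference=0.35)   # 1 point per 0.35 units of log-HR
total_score = sum(points.values())
```

    A patient's total score is the sum of the points for the risk factors present, and the 5- and 10-year fracture risks are then tabulated against score bands.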

  19. A global building inventory for earthquake loss estimation and risk management

    USGS Publications Warehouse

    Jaiswal, K.; Wald, D.; Porter, K.

    2010-01-01

    We develop a global database of building inventories using a taxonomy of global building types for use in near-real-time post-earthquake loss estimation and pre-earthquake risk analysis, for the U.S. Geological Survey's Prompt Assessment of Global Earthquakes for Response (PAGER) program. The database is available for public use, subject to peer review, scrutiny, and open enhancement. On a country-by-country level, it contains estimates of the distribution of building types categorized by material, lateral force resisting system, and occupancy type (residential or nonresidential, urban or rural). The database draws on and harmonizes numerous sources: (1) UN statistics, (2) UN Habitat's demographic and health survey (DHS) database, (3) national housing censuses, (4) the World Housing Encyclopedia and (5) other literature. © 2010, Earthquake Engineering Research Institute.

  20. Spatially interpolated disease prevalence estimation using collateral indicators of morbidity and ecological risk.

    PubMed

    Congdon, Peter

    2013-10-01

    This paper considers estimation of disease prevalence for small areas (neighbourhoods) when the available observations on prevalence are for an alternative partition of a region, such as service areas. Interpolation to neighbourhoods uses a kernel method extended to take account of two types of collateral information. The first is morbidity and service use data, such as hospital admissions, observed for neighbourhoods. Variations in morbidity and service use are expected to reflect prevalence. The second type of collateral information is ecological risk factors (e.g., pollution indices) that are expected to explain variability in prevalence in service areas, but are typically observed only for neighbourhoods. An application involves estimating neighbourhood asthma prevalence in a London health region involving 562 neighbourhoods and 189 service (primary care) areas. PMID:24129116
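
    The interpolation step described above amounts to a distance-weighted average of service-area prevalences evaluated at each neighbourhood centroid. A minimal sketch using a Gaussian kernel; the coordinates, prevalences and bandwidth are illustrative, and the paper's full method additionally incorporates the morbidity and ecological collateral data:

```python
import math

def kernel_interpolate(target, sources, bandwidth=1.0):
    """Gaussian-kernel weighted average of source-area prevalences at a
    target (neighbourhood) centroid. sources: list of ((x, y), prevalence)."""
    weights, values = [], []
    for (x, y), prev in sources:
        d2 = (x - target[0]) ** 2 + (y - target[1]) ** 2
        weights.append(math.exp(-d2 / (2.0 * bandwidth ** 2)))
        values.append(prev)
    return sum(w * v for w, v in zip(weights, values)) / sum(weights)

# Hypothetical service-area centroids and asthma prevalences (proportions)
areas = [((0.0, 0.0), 0.06), ((2.0, 0.0), 0.10), ((0.0, 2.0), 0.08)]
estimate = kernel_interpolate((0.5, 0.5), areas)
```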

  1. Empirical application of normal mixture GARCH and value-at-risk estimation

    NASA Astrophysics Data System (ADS)

    Kamaruzzaman, Zetty Ain; Isa, Zaidi

    2014-06-01

    Normal mixture (NM) GARCH models can capture time variation in both conditional skewness and kurtosis. In this paper, we present the general framework of the Normal mixture GARCH(1,1) model. An empirical application is presented using Malaysian weekly stock market returns. This paper provides evidence that, for modeling stock market returns, the two-component Normal mixture GARCH(1,1) model performs better than the Normal, symmetric and skewed Student's t-GARCH models. The model can quantify the volatility corresponding to stable and crash market circumstances. We also consider Value-at-Risk (VaR) estimation for the Normal mixture GARCH model.
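
    For a fitted two-component mixture, the VaR at level α is the α-quantile of the mixture distribution, which has no closed form but is easy to find numerically since the CDF is monotone. A sketch assuming illustrative "stable" and "crash" component parameters rather than fitted NM-GARCH(1,1) output:

```python
import math

def norm_cdf(x, mu, sigma):
    """CDF of a normal distribution via the error function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def mixture_var(alpha, comps):
    """Value-at-Risk at level alpha for a normal mixture.
    comps: list of (weight, mu, sigma). Solves F(x) = alpha by bisection."""
    cdf = lambda x: sum(w * norm_cdf(x, m, s) for w, m, s in comps)
    lo, hi = -10.0, 10.0
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if cdf(mid) < alpha:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Hypothetical "stable" and "crash" regimes for weekly returns
comps = [(0.9, 0.002, 0.01), (0.1, -0.01, 0.04)]
var_1pct = mixture_var(0.01, comps)   # 1% VaR, a negative return
```

    The fat-tailed crash component pulls the 1% quantile well below what the stable component alone would give, which is exactly why the mixture captures crash-market risk.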

  2. Daily salt intake estimated by overnight urine collections indicates a high cardiovascular disease risk in Thailand.

    PubMed

    Yokokawa, Hirohide; Yuasa, Motoyuki; Nedsuwan, Supalert; Moolphate, Saiyud; Fukuda, Hiroshi; Kitajima, Tsutomu; Minematsu, Kazuo; Tanimura, Susumu; Marui, Eiji

    2016-01-01

    This cross-sectional study (February 2012 to March 2013) was conducted to estimate daily salt intake and basic characteristics among 793 community-dwelling participants at high risk of cardiovascular disease (Framingham risk score >15%) who had visited diabetes or hypertension clinics at health centres in the Muang district, Chiang Rai, Thailand. We performed descriptive analysis of baseline data and used an automated analyser to estimate average 24-hour salt intake from 3 days of overnight urine collections. Participants were divided into two groups based on median estimated daily salt intake. Mean age and the proportion of males were 65.2 years and 37.6% in the higher salt intake group (>=10.0 g/day, n=362), and 67.5 years and 42.7% in the lower salt intake group (<10.0 g/day, n=431), respectively (p=0.01, p<0.01). The higher salt intake group included more patients with a family history of hypertension and antihypertensive drug use, fewer with an ideal body mass index (18.5-24.9), higher exercise frequency (>=2 times weekly), and lower awareness of high salt intake. Among higher salt intake participants, those with lower awareness of high salt intake were younger and more often had a family history of hypertension, relative to those with more awareness. Our data indicate that families often share lifestyles involving high salt intake, and discrepancies between actual salt intake and awareness of high salt intake may represent a need for salt reduction interventions aimed at the family level. Awareness of actual salt intake should be improved for each family. PMID:26965760

  3. Impact of alternative metrics on estimates of extent of occurrence for extinction risk assessment.

    PubMed

    Joppa, Lucas N; Butchart, Stuart H M; Hoffmann, Michael; Bachman, Steve P; Akçakaya, H Resit; Moat, Justin F; Böhm, Monika; Holland, Robert A; Newton, Adrian; Polidoro, Beth; Hughes, Adrian

    2016-04-01

    In International Union for Conservation of Nature (IUCN) Red List assessments, extent of occurrence (EOO) is a key measure of extinction risk. However, the way assessors estimate EOO from maps of species' distributions is inconsistent among assessments of different species and among major taxonomic groups. Assessors often estimate EOO from the area of mapped distribution, but these maps often exclude areas that are not habitat in idiosyncratic ways and are not created at the same spatial resolutions. We assessed the impact on extinction risk categories of applying different methods (minimum convex polygon, alpha hull) for estimating EOO for 21,763 species of mammals, birds, and amphibians. Overall, the percentage of threatened species requiring downlisting to a lower category of threat (taking into account other Red List criteria under which they qualified) spanned 11-13% for all species combined (14-15% for mammals, 7-8% for birds, and 12-15% for amphibians). These downlistings resulted from larger estimates of EOO and depended on the EOO calculation method. Using birds as an example, we found that 14% of threatened and near threatened species could require downlisting based on the minimum convex polygon (MCP) approach, an approach that is now recommended by IUCN. Other metrics (such as alpha hull) had marginally smaller impacts. Our results suggest that uniformly applying the MCP approach may lead to a one-time downlisting of hundreds of species but ultimately ensure consistency across assessments and realign the calculation of EOO with the theoretical basis on which the metric was founded. PMID:26183938
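
    The minimum convex polygon (MCP) approach discussed above is a convex hull around the occurrence records, with EOO as the hull's area. A self-contained sketch (Andrew's monotone chain plus the shoelace formula); the occurrence coordinates are hypothetical, and a real EOO calculation would first project records to an equal-area coordinate system:

```python
def convex_hull(points):
    """Andrew's monotone-chain convex hull; returns hull vertices."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts
    def cross(o, a, b):
        return (a[0]-o[0])*(b[1]-o[1]) - (a[1]-o[1])*(b[0]-o[0])
    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]

def polygon_area(vertices):
    """Shoelace formula for the area of a simple polygon."""
    n = len(vertices)
    return abs(sum(vertices[i][0] * vertices[(i+1) % n][1] -
                   vertices[(i+1) % n][0] * vertices[i][1]
                   for i in range(n))) / 2.0

# Hypothetical occurrence records (projected coordinates, e.g. km)
records = [(0, 0), (4, 0), (4, 3), (0, 3), (2, 1), (1, 2)]
eoo = polygon_area(convex_hull(records))   # MCP area = 12.0
```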

  4. Accurate Evaluation of Quantum Integrals

    NASA Technical Reports Server (NTRS)

    Galant, David C.; Goorvitch, D.

    1994-01-01

    Combining an appropriate finite difference method with Richardson's extrapolation results in a simple, highly accurate numerical method for solving the Schrödinger equation. Important results are that error estimates are provided, and that one can extrapolate expectation values rather than the wavefunctions to obtain highly accurate expectation values. We discuss the eigenvalues and the error growth in repeated Richardson's extrapolation, and show that expectation values calculated on a crude mesh can be extrapolated to obtain expectation values of high accuracy.
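
    A one-step version of the scheme described above illustrates the idea: two second-order central-difference estimates at step sizes h and h/2 are combined to cancel the leading O(h²) error term, yielding O(h⁴) accuracy. Applied here to an ordinary derivative rather than to Schrödinger eigenvalues:

```python
import math

def central_diff(f, x, h):
    """Second-order central difference approximation to f'(x)."""
    return (f(x + h) - f(x - h)) / (2.0 * h)

def richardson(f, x, h):
    """One Richardson extrapolation step: combine estimates at h and h/2
    to cancel the leading O(h^2) error term, giving O(h^4) accuracy."""
    d_h = central_diff(f, x, h)
    d_h2 = central_diff(f, x, h / 2.0)
    return (4.0 * d_h2 - d_h) / 3.0

h = 0.1
crude = central_diff(math.sin, 0.3, h)     # error ~ h^2
refined = richardson(math.sin, 0.3, h)     # error ~ h^4
exact = math.cos(0.3)
```

    The difference between the h and h/2 estimates also serves as the built-in error estimate the abstract refers to.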

  5. Severity of disease estimation and risk-adjustment for comparison of outcomes in mechanically ventilated patients using electronic routine care data.

    PubMed

    van Mourik, Maaike S M; Moons, Karel G M; Murphy, Michael V; Bonten, Marc J M; Klompas, Michael

    2015-07-01

    BACKGROUND Valid comparison between hospitals for benchmarking or pay-for-performance incentives requires accurate correction for underlying disease severity (case-mix). However, existing models are either very simplistic or require extensive manual data collection. OBJECTIVE To develop a disease severity prediction model based solely on data routinely available in electronic health records for risk-adjustment in mechanically ventilated patients. DESIGN Retrospective cohort study. PARTICIPANTS Mechanically ventilated patients from a single tertiary medical center (2006-2012). METHODS Predictors were extracted from electronic data repositories (demographic characteristics, laboratory tests, medications, microbiology results, procedure codes, and comorbidities) and assessed for feasibility and generalizability of data collection. Models for in-hospital mortality of increasing complexity were built using logistic regression. Estimated disease severity from these models was linked to rates of ventilator-associated events. RESULTS A total of 20,028 patients were initiated on mechanical ventilation, of whom 3,027 died in hospital. For models of incremental complexity, area under the receiver operating characteristic curve ranged from 0.83 to 0.88. A simple model including demographic characteristics, type of intensive care unit, time to intubation, blood culture sampling, 8 common laboratory tests, and surgical status achieved an area under the receiver operating characteristic curve of 0.87 (95% CI, 0.86-0.88) with adequate calibration. The estimated disease severity was associated with occurrence of ventilator-associated events. CONCLUSIONS Accurate estimation of disease severity in ventilated patients using electronic, routine care data was feasible using simple models. These estimates may be useful for risk-adjustment in ventilated patients. Additional research is necessary to validate and refine these models. PMID:25881675

  6. Assessing uncertainty in published risk estimates using hexavalent chromium and lung cancer mortality as an example [Presentation 2015

    EPA Science Inventory

    Introduction: The National Research Council recommended quantitative evaluation of uncertainty in effect estimates for risk assessment. This analysis considers uncertainty across model forms and model parameterizations with hexavalent chromium [Cr(VI)] and lung cancer mortality a...

  7. The potential of plasma miRNAs for diagnosis and risk estimation of colorectal cancer

    PubMed Central

    Chen, Wang-Yang; Zhao, Xiao-Juan; Yu, Zhi-Fu; Hu, Fu-Lan; Liu, Yu-Peng; Cui, Bin-Bin; Dong, Xin-Shu; Zhao, Ya-Shuang

    2015-01-01

    Circulating microRNAs (miRNAs) are recognized as potential non-invasive biomarkers for colorectal cancer (CRC) detection and prediction. Meanwhile, the association of plasma miRNA expression with the risk of CRC has rarely been analyzed. Therefore, we conducted this study to evaluate the value of plasma miRNAs for CRC diagnosis and risk estimation. Fasting blood samples from 100 CRC patients and 79 cancer-free controls were collected. Plasma miR-106a, miR-20a, miR-27b, miR-92a and miR-29a levels were detected by RT-qPCR. Sensitivity and specificity were employed to evaluate the diagnostic value of the miRNAs for CRC. Univariate and multivariate logistic regression were employed to analyze the association between miRNA expression and CRC risk. As a result, miR-106a and miR-20a were elevated in patients with CRC. The sensitivity of miR-106a was 74.00% and the specificity was 44.40% at a cutoff value of 2.03. For miR-20a, the sensitivity was 46.00% and the specificity was 73.42% when 2.44 was employed as the cutoff value. High expression of plasma miR-106a increased CRC risk 1.80-fold. Plasma miR-106a and miR-20a may serve as noninvasive biomarkers for detecting CRC. High expression of miR-106a was associated with CRC risk. PMID:26261602

  8. Estimating Geographical Variation in the Risk of Zoonotic Plasmodium knowlesi Infection in Countries Eliminating Malaria

    PubMed Central

    Shearer, Freya M.; Huang, Zhi; Weiss, Daniel J.; Wiebe, Antoinette; Gibson, Harry S.; Battle, Katherine E.; Pigott, David M.; Brady, Oliver J.; Putaporntip, Chaturong; Jongwutiwes, Somchai; Lau, Yee Ling; Manske, Magnus; Amato, Roberto; Elyazar, Iqbal R. F.; Vythilingam, Indra; Bhatt, Samir; Gething, Peter W.; Singh, Balbir; Golding, Nick; Hay, Simon I.

    2016-01-01

    Background Infection by the simian malaria parasite, Plasmodium knowlesi, can lead to severe and fatal disease in humans, and is the most common cause of malaria in parts of Malaysia. Despite being a serious public health concern, the geographical distribution of P. knowlesi malaria risk is poorly understood because the parasite is often misidentified as one of the human malarias. Human cases have been confirmed in at least nine Southeast Asian countries, many of which are making progress towards eliminating the human malarias. Understanding the geographical distribution of P. knowlesi is important for identifying areas where malaria transmission will continue after the human malarias have been eliminated. Methodology/Principal Findings A total of 439 records of P. knowlesi infections in humans, macaque reservoir and vector species were collated. To predict spatial variation in disease risk, a model was fitted using records from countries where the infection data coverage is high. Predictions were then made throughout Southeast Asia, including regions where infection data are sparse. The resulting map predicts areas of high risk for P. knowlesi infection in a number of countries that are forecast to be malaria-free by 2025 (Malaysia, Cambodia, Thailand and Vietnam) as well as countries projected to be eliminating malaria (Myanmar, Laos, Indonesia and the Philippines). Conclusions/Significance We have produced the first map of P. knowlesi malaria risk, at a fine-scale resolution, to identify priority areas for surveillance based on regions with sparse data and high estimated risk. Our map provides an initial evidence base to better understand the spatial distribution of this disease and its potential wider contribution to malaria incidence. Considering malaria elimination goals, areas for prioritised surveillance are identified. PMID:27494405

  9. A framework for estimating radiation-related cancer risks in Japan from the 2011 Fukushima nuclear accident.

    PubMed

    Walsh, L; Zhang, W; Shore, R E; Auvinen, A; Laurier, D; Wakeford, R; Jacob, P; Gent, N; Anspaugh, L R; Schüz, J; Kesminiene, A; van Deventer, E; Tritscher, A; del Rosario Pérez, M

    2014-11-01

    We present here a methodology for health risk assessment adopted by the World Health Organization that provides a framework for estimating risks from the Fukushima nuclear accident after the March 11, 2011 Japanese major earthquake and tsunami. Substantial attention has been given to the possible health risks associated with human exposure to radiation from damaged reactors at the Fukushima Daiichi nuclear power station. Cumulative doses were estimated and applied for each post-accident year of life, based on a reference level of exposure during the first year after the earthquake. A lifetime cumulative dose of twice the first year dose was estimated for the primary radionuclide contaminants ((134)Cs and (137)Cs), based on Chernobyl data, relative abundances of cesium isotopes, and cleanup efforts. Risks for particularly radiosensitive cancer sites (leukemia, thyroid and breast cancer), as well as the combined risk for all solid cancers, were considered. The male and female cumulative risks of cancer incidence attributed to radiation doses from the accident, for those exposed at various ages, were estimated in terms of the lifetime attributable risk (LAR). Calculations of LAR were based on recent Japanese population statistics for cancer incidence and current radiation risk models from the Life Span Study of Japanese A-bomb survivors. Cancer risks over an initial period of 15 years after first exposure were also considered. LAR results were also given as a percentage of the lifetime baseline risk (i.e., the cancer risk in the absence of radiation exposure from the accident). The LAR results were based on either a reference first year dose (10 mGy) or a reference lifetime dose (20 mGy) so that risk assessment may be applied for relocated and non-relocated members of the public, as well as for adult male emergency workers. The results show that the major contribution to LAR from the reference lifetime dose comes from the first year dose. For a dose of 10 mGy in

  10. Semi-analytical estimation of wellbore leakage risk during CO2 sequestration in Ottawa County, Michigan

    NASA Astrophysics Data System (ADS)

    Guo, B.; Matteo, E. N.; Elliot, T. R.; Nogues, J. P.; Deng, H.; Fitts, J. P.; Pollak, M.; Bielicki, J.; Wilson, E.; Celia, M. A.; Peters, C. A.

    2011-12-01

    Using the semi-analytical ELSA model, wellbore leakage risk is estimated for CO2 injection into either the Mt. Simon or St. Peter formations, which are part of the Michigan Sedimentary Basin that lies beneath Ottawa County, MI. ELSA is a vertically integrated subsurface modeling tool that can be used to simulate both supercritical CO2 plume distribution/migration and pressure-induced brine displacement during CO2 injection. A composite 3D subsurface domain was constructed for the ELSA simulations based on estimated permeabilities for formation layers, as well as GIS databases containing subsurface stratigraphy, active and inactive wells, and potential interactions with other subsurface uses. These include potable aquifers, oil and gas reservoirs, and waste injection sites, which represent potential liabilities if encountered by brine or supercritical CO2 displaced from the injection formation. Overall, the 3D subsurface domain encompasses an area of 1500 km2 to a depth of 2 km and contains over 3,000 wells. The permeabilities for abandoned wells are derived from a ranking system based on available well data, including historical records and well logs. This distribution is then randomly sampled in Monte Carlo simulations that are used to generate a probability map for subsurface interferences or atmospheric release resulting from leakage of CO2 and/or brine from the injection formation. This method serves as the basis for comparative testing between various injection scenarios, as well as for comparing the relative risk of leakage between injection formations or storage sites.
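
    The Monte Carlo step described above, sampling abandoned-well properties from a ranking-derived distribution to estimate a leakage probability, can be sketched as follows. The distribution parameters and threshold are hypothetical, and ELSA's actual flow calculations are far more detailed than this exceedance test:

```python
import random

def leakage_probability(n_wells, n_trials, perm_dist, threshold, seed=0):
    """Monte Carlo estimate of the probability that at least one abandoned
    well in the domain has effective permeability above a leakage threshold.
    perm_dist draws one log10-permeability sample per well."""
    rng = random.Random(seed)
    leaks = 0
    for _ in range(n_trials):
        if any(perm_dist(rng) > threshold for _ in range(n_wells)):
            leaks += 1
    return leaks / n_trials

# Hypothetical ranking-based distribution of log10 permeability (m^2):
# most wells well-sealed, a small tail of degraded ones.
def sample_log10_perm(rng):
    return rng.gauss(-16.0, 1.5)

p_leak = leakage_probability(n_wells=50, n_trials=2000,
                             perm_dist=sample_log10_perm, threshold=-12.0)
```

    Repeating this per grid cell, with well counts and rankings varying spatially, yields the kind of probability map the abstract describes.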

  11. Estimating the Risk of Chronic Pain: Development and Validation of a Prognostic Model (PICKUP) for Patients with Acute Low Back Pain

    PubMed Central

    Traeger, Adrian C.; Henschke, Nicholas; Hübscher, Markus; Williams, Christopher M.; Kamper, Steven J.; Maher, Christopher G.; Moseley, G. Lorimer; McAuley, James H.

    2016-01-01

    Background Low back pain (LBP) is a major health problem. Globally it is responsible for the most years lived with disability. The most problematic type of LBP is chronic LBP (pain lasting longer than 3 mo); it has a poor prognosis and is costly, and interventions are only moderately effective. Targeting interventions according to risk profile is a promising approach to prevent the onset of chronic LBP. Developing accurate prognostic models is the first step. No validated prognostic models are available to accurately predict the onset of chronic LBP. The primary aim of this study was to develop and validate a prognostic model to estimate the risk of chronic LBP. Methods and Findings We used the PROGRESS framework to specify a priori methods, which we published in a study protocol. Data from 2,758 patients with acute LBP attending primary care in Australia between 5 November 2003 and 15 July 2005 (development sample, n = 1,230) and between 10 November 2009 and 5 February 2013 (external validation sample, n = 1,528) were used to develop and externally validate the model. The primary outcome was chronic LBP (ongoing pain at 3 mo). In all, 30% of the development sample and 19% of the external validation sample developed chronic LBP. In the external validation sample, the primary model (PICKUP) discriminated between those who did and did not develop chronic LBP with acceptable performance (area under the receiver operating characteristic curve 0.66 [95% CI 0.63 to 0.69]). Although model calibration was also acceptable in the external validation sample (intercept = −0.55, slope = 0.89), some miscalibration was observed for high-risk groups. The decision curve analysis estimated that, if decisions to recommend further intervention were based on risk scores, screening could lead to a net reduction of 40 unnecessary interventions for every 100 patients presenting to primary care compared to a “treat all” approach. Limitations of the method include the model being

  12. Biological and physical methods for risk estimation in interventional radiology: a detrimental effect approach.

    PubMed

    Ramos, M; Montoro, A; Almonacid, M; Barquinero, S Ferrer J F; Tortosa, R; Miró, R; Verdú, G; Rodríguez, P; Barrios, L L; Villaescusa, J I

    2011-01-01

    Interventional radiologists and staff members are frequently exposed to direct and scattered radiation, which can result in deterministic effects (radiodermatitis, skin aging, cataracts, telangiectasia in the nasal region, vasocellular epitheliomas, hand depilation) and/or stochastic ones (cancer incidence). A methodology has been proposed for estimating the radiation risk or detriment for a group of six exposed interventional radiologists at the Hospital Universitario La Fe (Valencia, Spain) who had developed general symptoms attributable to the deterministic effects of ionizing radiation exposure. Equivalent doses have been periodically registered using thermoluminescence dosimeters (TLDs) and wrist dosimeters, H(p)(10) and H(p)(0.07), respectively, and estimated through the observation of translocations in lymphocytes of peripheral blood (biological methods), by extrapolating the yield of translocations to the respective dose-effect curves. The software RADRISK has been applied to estimate radiation risks from these occupational radiation exposures. The minimum and maximum average excess ratios for skin cancer, using wrist physical doses, were [1.03 × 10(-3), 5.06 × 10(-2)], from which we conclude that there is no increased risk of skin cancer incidence. The minimum and maximum average excess ratios for leukemia were [7.84 × 10(-2), 3.36 × 10(-1)] using TLD physical doses and [1.40 × 10(-1), 1.51] using biological doses, considerably higher than incidence rates, showing an excess radio-induced risk of leukemia in the group under study. Finally, the maximum radiological detriment in the group, evaluated as the total number of radio-induced cancers, was 2.18 per 1000 person-years (PY) using physical dosimetry (skin and leukemia) and 9.20 per 1000 PY using biological dosimetry (leukemia). As a conclusion, this study has provided an assessment of the non-deterministic effects (rate of radio-induced cancer incidence

  13. Estimation of Wild Fire Risk Area based on Climate and Maximum Entropy in Korean Peninsular

    NASA Astrophysics Data System (ADS)

    Kim, T.; Lim, C. H.; Song, C.; Lee, W. K.

    2015-12-01

    The number of forest fires and the accompanying human injuries and physical damage have increased with frequent droughts. In this study, forest fire danger zones of Korea are estimated to predict and prepare for future forest fire hazard regions. The MaxEnt (Maximum Entropy) model, which estimates the probability distribution of occurrence, is used to estimate the forest fire hazard regions. The MaxEnt model was developed primarily for the analysis of species distributions, but its applicability to various natural disasters is gaining recognition. Detailed forest fire occurrence data collected by MODIS for the past 5 years (2010-2014) are used as occurrence data for the model, and meteorological, topographic and vegetation data are used as environmental variables. In particular, various meteorological variables are used to check the impact of climate, such as annual average temperature, annual precipitation, precipitation in the dry season, annual effective humidity, effective humidity in the dry season and aridity index. Consequently, the result was valid based on the AUC (Area Under the Curve) value (0.805), which is used to assess prediction accuracy in the MaxEnt model, and the predicted forest fire locations corresponded well with the actual forest fire distribution map. Meteorological variables such as effective humidity showed the greatest contribution, and topographic variables such as TWI (Topographic Wetness Index) and slope also contributed to forest fire occurrence. As a result, the east coast and the southern part of the Korean peninsula were predicted to have high forest fire risk; in contrast, high-altitude mountain areas and the west coast appeared to be safe from forest fire. The results of this study are similar to those of former studies, indicating high risk of forest fire in accessible areas and reflecting the climatic characteristics of the east and south in the dry season. To sum up, we estimated the forest fire hazard zone with existing forest fire locations and environmental variables and had
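
    The AUC used above to judge the MaxEnt output can be computed directly from its rank (Mann-Whitney) interpretation: the probability that a randomly chosen fire location receives a higher predicted score than a randomly chosen non-fire location. A minimal sketch with hypothetical predictions and observations:

```python
def auc(scores, labels):
    """Area under the ROC curve via the Mann-Whitney U statistic:
    P(random positive outranks random negative), ties counting half."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical model outputs: predicted fire probability vs. observed fire
scores = [0.9, 0.8, 0.7, 0.6, 0.4, 0.3, 0.2, 0.1]
labels = [1,   1,   0,   1,   0,   1,   0,   0]
fire_auc = auc(scores, labels)
```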

  14. Estimation of infectious risks in residential populations exposed to airborne pathogens during center pivot irrigation of dairy wastewaters

    Technology Transfer Automated Retrieval System (TEKTRAN)

    In the western United States where dairy wastewaters are commonly land applied, there are concerns over individuals being exposed to airborne pathogens. In response, a quantitative microbial risk assessment (QMRA) was performed to estimate infectious risks after inhalation exposure of pathogens aero...

  15. Estimation of wildfire size and risk changes due to fuels treatments

    USGS Publications Warehouse

    Cochrane, M.A.; Moran, C.J.; Wimberly, M.C.; Baer, A.D.; Finney, M.A.; Beckendorf, K.L.; Eidenshink, J.; Zhu, Z.

    2012-01-01

    Human land use practices, altered climates, and shifting forest and fire management policies have increased the frequency of large wildfires several-fold. Mitigation of potential fire behaviour and fire severity has increasingly been attempted through pre-fire alteration of wildland fuels using mechanical treatments and prescribed fires. Despite annual treatment of more than a million hectares of land, quantitative assessments of the effectiveness of existing fuel treatments at reducing the size of actual wildfires, or of how they might alter the risk of burning across landscapes, are currently lacking. Here, we present a method for estimating spatial probabilities of burning as a function of extant fuels treatments for any wildland fire-affected landscape. We examined the landscape effects of more than 72 000 ha of wildland fuel treatments involved in 14 large wildfires that burned 314 000 ha of forests in nine US states between 2002 and 2010. Fuels treatments altered the probability of fire occurrence both positively and negatively across landscapes, effectively redistributing fire risk by changing surface fire spread rates and reducing the likelihood of crowning behaviour. Trade-offs are created between the formation of large areas with low probabilities of increased burning and smaller, well-defined regions with reduced fire risk.

  16. Identification and estimation of socioeconomic impacts resulting from perceived risks and changing images; An annotated bibliography

    SciTech Connect

    Nieves, L.A.; Wernette, D.R.; Hemphill, R.C.; Mohiudden, S.; Corso, J.

    1990-02-01

    In 1982, the US Congress passed the Nuclear Waste Policy Act to initiate the process of choosing a location to permanently store high-level nuclear waste; subsequent amendments designated Yucca Mountain, Nevada, as the only location to be studied as a candidate site for such a repository. The original act and its amendments established the grant mechanism by which the state of Nevada could finance an investigation of the potential socioeconomic impacts that could result from the installation and operation of this facility. Over the past three years, the Office of Civilian Radioactive Waste Management (OCRWM or RW) in the US Department of Energy (DOE) has approved grant requests by Nevada to perform this investigation. This report is intended to update and enhance a literature review conducted by the Human Affairs Research Center (HARC) for the Basalt Waste Isolation Project that dealt with the psychological and sociological processes underlying risk perception. It provides additional information on the HARC work, covers a subsequent step in the impact-estimation process, and translates risk perception into decisions and behaviors with economic consequences. It also covers recently developed techniques for assessing the nature and magnitude of impacts caused by environmental changes, focusing on those caused by changes in perceived risks.

  17. The use of individual and societal risk criteria within the Dutch flood safety policy--nationwide estimates of societal risk and policy applications.

    PubMed

    Jonkman, Sebastiaan N; Jongejan, Ruben; Maaskant, Bob

    2011-02-01

    The Dutch government is in the process of revising its flood safety policy. The current safety standards for flood defenses in the Netherlands are largely based on the outcomes of cost-benefit analyses. Loss of life has not been considered separately in the choice of the current standards. This article presents the results of a research project that evaluated the potential roles of two risk metrics, individual and societal risk, to support decision making about new flood safety standards. These risk metrics are already used in the Dutch major hazards policy for the evaluation of risks to the public. Individual risk concerns the annual probability of death of a person. Societal risk concerns the probability of an event with many fatalities. Technical aspects of the use of individual and societal risk metrics in flood risk assessments, as well as policy implications, are discussed. Preliminary estimates of nationwide levels of societal risk are presented. Societal risk levels appear relatively high in the southwestern part of the country, where densely populated dike rings are threatened by a combination of river and coastal floods. It was found that cumulation, the simultaneous flooding of multiple dike rings during a single flood event, has a significant impact on the national level of societal risk. Options for the application of individual and societal risk in the new flood safety policy are presented and discussed. PMID:20883529
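    Societal risk of the kind discussed above is often summarised as an FN curve, the annual frequency F(n) of events causing n or more fatalities, checked against a criterion line of the form C / n^alpha. A minimal sketch of that bookkeeping, using fabricated scenario frequencies and illustrative criterion constants (not the actual Dutch policy values):

```python
# FN-curve sketch: annual frequency of events with >= n fatalities, checked
# against a criterion line C / n^alpha. All numbers are illustrative.

scenarios = [  # (annual probability, expected fatalities) -- fabricated
    (1e-4, 10),
    (5e-5, 100),
    (1e-5, 1000),
    (2e-6, 5000),
]

def fn_curve(scenarios, n):
    """F(n): annual frequency of an event causing n or more fatalities."""
    return sum(p for p, deaths in scenarios if deaths >= n)

def meets_criterion(scenarios, c=1e-3, alpha=2):
    """Check F(n) <= c / n^alpha at every scenario's fatality level."""
    return all(fn_curve(scenarios, deaths) <= c / deaths ** alpha
               for _, deaths in scenarios)
```

    Summing probabilities over all scenarios at or above each fatality level is what makes cumulation across dike rings matter: simultaneous flooding shifts probability mass toward high-n events.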

  18. Use of health effect risk estimates and uncertainty in formal regulatory proceedings: a case study involving atmospheric particulates

    SciTech Connect

    Habegger, L.J.; Oezkaynak, A.H.

    1984-01-01

    Coal combustion particulates are released to the atmosphere by power plants supplying electricity to the nuclear fuel cycle. This paper presents estimates of the public health risks associated with the release of these particulates at a rate associated with the annual nuclear fuel production requirements for a nuclear power plant. Utilization of these risk assessments as a new component in the formal evaluation of total risks from nuclear power plants is discussed. 23 references, 3 tables.

  19. Hybrid EANN-EA System for the Primary Estimation of Cardiometabolic Risk.

    PubMed

    Kupusinac, Aleksandar; Stokić, Edita; Kovaćevic, Ilija

    2016-06-01

    The most important part of the early prevention of atherosclerosis and cardiovascular diseases is the estimation of cardiometabolic risk (CMR). CMR estimation can be divided into two phases. The first phase, called primary estimation of CMR (PE-CMR), includes solely diagnostic methods that are non-invasive, easily obtained, and low-cost. Since cardiovascular diseases are among the main causes of death in the world, it would be significant for regional health strategies to develop an intelligent software system for PE-CMR that would save time and money by identifying the persons with potentially higher CMR and conducting complete tests only on them. The development of such a software system faces a few constraints: the dataset can be very large; data cannot be collected at the same time and place (e.g., data may be collected at different health institutions); and data from another region are not applicable, since every population has its own features. This paper presents a MATLAB solution for PE-CMR based on an ensemble of well-trained artificial neural networks guided by an evolutionary algorithm, or EANN-EA system for short. Our solution is suitable for research on CMR in the population of a given region, and its accuracy is above 90 %. PMID:27106582

  20. Estimated Reduction in Cancer Risk due to PAH Exposures If Source Control Measures during the 2008 Beijing Olympics Were Sustained

    PubMed Central

    Jia, Yuling; Stone, Dave; Wang, Wentao; Schrlau, Jill; Tao, Shu; Massey Simonich, Staci L.

    2011-01-01

    Background The 2008 Beijing Olympic Games provided a unique case study to investigate the effect of source control measures on the reduction in air pollution, and associated inhalation cancer risk, in a Chinese megacity. Objectives We measured 17 carcinogenic polycyclic aromatic hydrocarbons (PAHs) and estimated the lifetime excess inhalation cancer risk during different periods of the Beijing Olympic Games, to assess the effectiveness of source control measures in reducing PAH-induced inhalation cancer risks. Methods PAH concentrations were measured in samples of particulate matter ≤ 2.5 μm in aerodynamic diameter (PM2.5) collected during the Beijing Olympic Games, and the associated inhalation cancer risks were estimated using a point-estimate approach based on relative potency factors. Results We estimated the number of lifetime excess cancer cases due to exposure to the 17 carcinogenic PAHs [12 priority pollutant PAHs and five high-molecular-weight (302 Da) PAHs (MW 302 PAHs)] to range from 6.5 to 518 per million people for the source control period concentrations and from 12.2 to 964 per million people for the nonsource control period concentrations. This would correspond to a 46% reduction in estimated inhalation cancer risk due to source control measures, if these measures were sustained over time. Benzo[b]fluoranthene, dibenz[a,h]anthracene, benzo[a]pyrene, and dibenzo[a,l]pyrene were the most carcinogenic PAH species evaluated. Total excess inhalation cancer risk would be underestimated by 23% if we did not include the five MW 302 PAHs in the risk calculation. Conclusions Source control measures, such as those imposed during the 2008 Beijing Olympics, can significantly reduce the inhalation cancer risk associated with PAH exposure in Chinese megacities similar to Beijing. MW 302 PAHs are a significant contributor to the estimated overall inhalation cancer risk. PMID:21632310
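    The point-estimate relative-potency-factor approach the authors describe converts each PAH concentration into a benzo[a]pyrene equivalent and scales the sum by a unit risk. A hedged sketch, with illustrative RPF, concentration, and unit-risk values rather than those used in the study:

```python
# Point-estimate RPF calculation: each PAH is weighted by its relative potency
# vs. benzo[a]pyrene (BaP), and the BaP-equivalent sum is scaled by a unit risk.
# RPFs, concentrations, and the unit risk here are illustrative assumptions.

BAP_UNIT_RISK = 1.1e-6  # lifetime excess risk per ng/m^3 BaP-eq (assumed)

concentrations = {  # PM2.5-bound PAHs, ng/m^3 (fabricated)
    "benzo[a]pyrene": 2.0,
    "dibenz[a,h]anthracene": 0.4,
    "benzo[b]fluoranthene": 3.1,
}
rpf = {  # relative potency vs. BaP (illustrative)
    "benzo[a]pyrene": 1.0,
    "dibenz[a,h]anthracene": 10.0,
    "benzo[b]fluoranthene": 0.8,
}

def bap_equivalent(conc, rpf):
    """BaP-equivalent concentration summed over all measured PAHs."""
    return sum(c * rpf[name] for name, c in conc.items())

def excess_cancer_risk(conc, rpf, unit_risk=BAP_UNIT_RISK):
    """Point estimate of lifetime excess inhalation cancer risk."""
    return bap_equivalent(conc, rpf) * unit_risk

risk = excess_cancer_risk(concentrations, rpf)
```

    Omitting species from the dictionaries directly shrinks the BaP-equivalent sum, which is how excluding the five MW 302 PAHs would understate total risk.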

  1. Risk Prediction for Prostate Cancer Recurrence Through Regularized Estimation with Simultaneous Adjustment for Nonlinear Clinical Effects*

    PubMed Central

    Long, Qi; Chung, Matthias; Moreno, Carlos S.; Johnson, Brent A.

    2011-01-01

    In biomedical studies, it is of substantial interest to develop risk prediction scores using high-dimensional data such as gene expression data for clinical endpoints that are subject to censoring. In the presence of well-established clinical risk factors, investigators often prefer a procedure that also adjusts for these clinical variables. While accelerated failure time (AFT) models are a useful tool for the analysis of censored outcome data, they assume that covariate effects on the logarithm of time-to-event are linear, which is often unrealistic in practice. We propose to build risk prediction scores through regularized rank estimation in partly linear AFT models, where high-dimensional data such as gene expression data are modeled linearly and important clinical variables are modeled nonlinearly using penalized regression splines. We show through simulation studies that our model has better operating characteristics compared to several existing models. In particular, we show that there is a non-negligible effect on prediction as well as feature selection when nonlinear clinical effects are misspecified as linear. This work is motivated by a recent prostate cancer study, where investigators collected gene expression data along with established prognostic clinical variables and the primary endpoint is time to prostate cancer recurrence. We analyzed the prostate cancer data and evaluated prediction performance of several models based on the extended c statistic for censored data, showing that 1) the relationship between the clinical variable, prostate specific antigen, and the prostate cancer recurrence is likely nonlinear, i.e., the time to recurrence decreases as PSA increases and it starts to level off when PSA becomes greater than 11; 2) correct specification of this nonlinear effect improves performance in prediction and feature selection; and 3) addition of gene expression data does not seem to further improve the performance of the resultant risk

  2. Dementia risk estimates associated with measures of depression: a systematic review and meta-analysis

    PubMed Central

    Anstey, Kaarin J

    2015-01-01

    Objectives To perform a systematic review of reported HRs of all cause dementia, Alzheimer's disease (AD) and vascular dementia (VaD) for late-life depression and depressive symptomatology on specific screening instruments at specific thresholds. Design Meta-analysis with meta-regression. Setting and participants PubMed, PsycInfo, and Cochrane databases were searched through 28 February 2014. Articles reporting HRs for incident all-cause dementia, AD and VaD based on published clinical criteria using validated measures of clinical depression or symptomatology from prospective studies of general population of adults were selected by consensus among multiple reviewers. Studies that did not use clinical dementia diagnoses or validated instruments for the assessment of depression were excluded. Data were extracted by two reviewers and reviewed by two other independent reviewers. The most specific analyses possible using continuous symptomatology ratings and categorical measures of clinical depression focusing on single instruments with defined reported cut-offs were conducted. Primary outcome measures HRs for all-cause dementia, AD, and VaD were computed where possible for continuous depression scores, or for major depression assessed with single or comparable validated instruments. Results Searches yielded 121 301 articles, of which 36 (0.03%) were eligible. Included studies provided a combined sample size of 66 532 individuals including 6593 cases of dementia, 2797 cases of AD and 585 cases of VaD. The increased risk associated with depression did not significantly differ by type of dementia and ranged from 83% to 104% for diagnostic thresholds consistent with major depression. Risks associated with continuous depression symptomatology measures were consistent with those for clinical thresholds. Conclusions Late-life depression is consistently and similarly associated with a twofold increased risk of dementia. The precise risk estimates produced in this study for

  3. Development and Internal Validation of the Male Osteoporosis Risk Estimation Score

    PubMed Central

    Shepherd, Angela J.; Cass, Alvah R.; Carlson, Carol A.; Ray, Laura

    2007-01-01

    PURPOSE We wanted to develop and validate a clinical prediction rule to identify men at risk for osteoporosis and subsequent hip fracture who might benefit from dual-energy x-ray absorptiometry (DXA). METHODS We used risk factor data from the National Health and Nutrition Examination Survey III to develop a best fitting multivariable logistic regression model in men aged 50 years and older randomized to either the development (n = 1,497) or validation (n = 1,498) cohorts. The best fitting model was transformed into a simplified scoring algorithm, the Male Osteoporosis Risk Estimation Score (MORES). We validated the MORES, comparing sensitivity, specificity, and area under the receiver operating characteristics (ROC) curve in the 2 cohorts and assessed clinical utility with an analysis of the number needed-to-screen (NNS) to prevent 1 additional hip fracture. RESULTS The MORES included 3 variables—age, weight, and history of chronic obstructive pulmonary disease—and showed excellent predictive validity in the validation cohort. A score of 6 or greater yielded an overall sensitivity of 0.93 (95% CI, 0.85–0.97), a specificity of 0.59 (95% CI, 0.56–0.62), and an area under the ROC curve of 0.832 (95% CI, 0.807–0.858). The overall NNS to prevent 1 additional hip fracture was 279 in a cohort of men representative of the US population. CONCLUSIONS Osteoporosis is a major predictor of hip fractures. Experts believe bisphosphonate treatment in men should yield results similar to that in women and reduce hip fracture rates associated with osteoporosis. In men aged 60 years and older, the MORES is a simple approach to identify men at risk for osteoporosis and refer them for confirmatory DXA scans. PMID:18025492
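    Validating a cut-off-based screening score of this kind reduces to counting true and false positives and negatives at the chosen threshold. A toy sketch with fabricated scores and outcomes (not the NHANES III cohort), using the reported cut-off of 6:

```python
# Sensitivity/specificity of a threshold-based screening score such as the
# MORES (score >= 6 flags a patient for a confirmatory DXA scan).
# The scores and outcomes below are fabricated toy data.

def sensitivity_specificity(scores, has_osteoporosis, cutoff=6):
    """Count confusion-matrix cells at the cutoff and return (sens, spec)."""
    tp = sum(1 for s, y in zip(scores, has_osteoporosis) if s >= cutoff and y)
    fn = sum(1 for s, y in zip(scores, has_osteoporosis) if s < cutoff and y)
    tn = sum(1 for s, y in zip(scores, has_osteoporosis) if s < cutoff and not y)
    fp = sum(1 for s, y in zip(scores, has_osteoporosis) if s >= cutoff and not y)
    return tp / (tp + fn), tn / (tn + fp)

scores = [7, 9, 3, 6, 2, 8, 5, 6, 1, 4]   # fabricated MORES-style scores
truth  = [1, 1, 0, 1, 0, 1, 0, 0, 0, 0]   # fabricated osteoporosis status

sens, spec = sensitivity_specificity(scores, truth)
```

    Sweeping the cutoff over its range and plotting the resulting (1 - spec, sens) pairs is what produces the ROC curve whose area the authors report.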

  4. Fast and Accurate Construction of Confidence Intervals for Heritability.

    PubMed

    Schweiger, Regev; Kaufman, Shachar; Laaksonen, Reijo; Kleber, Marcus E; März, Winfried; Eskin, Eleazar; Rosset, Saharon; Halperin, Eran

    2016-06-01

    Estimation of heritability is fundamental in genetic studies. Recently, heritability estimation using linear mixed models (LMMs) has gained popularity because these estimates can be obtained from unrelated individuals collected in genome-wide association studies. Typically, heritability estimation under LMMs uses the restricted maximum likelihood (REML) approach. Existing methods for the construction of confidence intervals and estimators of SEs for REML rely on asymptotic properties. However, these assumptions are often violated because of the bounded parameter space, statistical dependencies, and limited sample size, leading to biased estimates and inflated or deflated confidence intervals. Here, we show that the estimation of confidence intervals by state-of-the-art methods is inaccurate, especially when the true heritability is relatively low or relatively high. We further show that these inaccuracies occur in datasets including thousands of individuals. Such biases are present, for example, in estimates of heritability of gene expression in the Genotype-Tissue Expression project and of lipid profiles in the Ludwigshafen Risk and Cardiovascular Health study. We also show that often the probability that the genetic component is estimated as 0 is high even when the true heritability is bounded away from 0, emphasizing the need for accurate confidence intervals. We propose a computationally efficient method, ALBI (accurate LMM-based heritability bootstrap confidence intervals), for estimating the distribution of the heritability estimator and for constructing accurate confidence intervals. Our method can be used as an add-on to existing methods for estimating heritability and variance components, such as GCTA, FaST-LMM, GEMMA, or EMMAX. PMID:27259052
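    The bootstrap idea behind accurate heritability confidence intervals can be illustrated on a much simpler estimator. The sketch below resamples a toy midparent-offspring regression slope rather than an LMM/REML estimate, so it conveys only the percentile-bootstrap mechanics, not the ALBI method itself:

```python
# Percentile-bootstrap CI for a toy "heritability" estimator: the slope of an
# offspring-on-midparent regression. Fabricated data with true slope 0.5.

import random

random.seed(0)

n = 200
midparent = [random.gauss(0, 1) for _ in range(n)]
offspring = [0.5 * x + random.gauss(0, 0.8) for x in midparent]

def slope(xs, ys):
    """Ordinary least-squares slope."""
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))

def bootstrap_ci(xs, ys, n_boot=1000, alpha=0.05):
    """Percentile CI: resample pairs, refit, take empirical quantiles."""
    ests = []
    for _ in range(n_boot):
        picks = [random.randrange(len(xs)) for _ in range(len(xs))]
        ests.append(slope([xs[i] for i in picks], [ys[i] for i in picks]))
    ests.sort()
    return ests[int(alpha / 2 * n_boot)], ests[int((1 - alpha / 2) * n_boot) - 1]

h2_hat = slope(midparent, offspring)
lo, hi = bootstrap_ci(midparent, offspring)
```

    The point ALBI addresses is that for REML heritability the estimator's distribution piles up at the boundary values 0 and 1, so a naive interval like this one becomes inaccurate there and the bootstrap distribution must be modeled explicitly.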

  5. Cryptosporidium and Giardia in tropical recreational marine waters contaminated with domestic sewage: estimation of bathing-associated disease risks.

    PubMed

    Betancourt, Walter Q; Duarte, Diana C; Vásquez, Rosa C; Gurian, Patrick L

    2014-08-15

    Sewage is a major contributor to pollution problems involving human pathogens in tropical coastal areas. This study investigated the occurrence of intestinal protozoan parasites (Giardia and Cryptosporidium) in tropical recreational marine waters contaminated with sewage. The potential risks of Cryptosporidium and Giardia infection from recreational water exposure were estimated from the levels of viable (oo)cysts (DIC+, DAPI+, PI-) found in near-shore swimming areas using an exponential dose response model. A Monte Carlo uncertainty analysis was performed in order to determine the probability distribution of risks. Microbial indicators of recreational water quality (enterococci, Clostridium perfringens) and genetic markers of sewage pollution (human-specific Bacteroidales marker [HF183] and Clostridium coccoides) were simultaneously evaluated in order to estimate the extent of water quality deterioration associated with human wastes. The study revealed the potential risk of parasite infections via primary contact with tropical marine waters contaminated with sewage; higher risk estimates for Giardia than for Cryptosporidium were found. Mean risks estimated by Monte Carlo were below the U.S. EPA upper bound on recreational risk of 0.036 for cryptosporidiosis and giardiasis for both children and adults. However, 95th percentile estimates for giardiasis for children exceeded the 0.036 level. Environmental surveillance of microbial pathogens is crucial in order to control and eradicate the effects that increasing anthropogenic impacts have on marine ecosystems and human health. PMID:24975093
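    The exponential dose-response model with Monte Carlo uncertainty propagation described above can be sketched as follows; the dose-response parameter, concentration distribution, and ingestion volumes are illustrative assumptions, not the study's estimates:

```python
# Exponential dose-response QMRA with Monte Carlo uncertainty propagation.
# The dose-response parameter and the concentration/ingestion distributions
# below are illustrative assumptions, not the values used in the study.

import math
import random

random.seed(42)

R_EXP = 0.02  # exponential dose-response parameter (assumed)

def p_infection(dose, r=R_EXP):
    """Exponential dose-response model: P(infection) = 1 - exp(-r * dose)."""
    return 1.0 - math.exp(-r * dose)

def monte_carlo_risk(n_iter=10_000):
    """Propagate uncertainty in (oo)cyst concentration and swallowed volume."""
    risks = []
    for _ in range(n_iter):
        conc = random.lognormvariate(-1.0, 1.0)  # viable (oo)cysts per litre
        volume = random.uniform(0.01, 0.1)       # litres swallowed per swim
        risks.append(p_infection(conc * volume))
    risks.sort()
    return sum(risks) / n_iter, risks[int(0.95 * n_iter)]

mean_risk, p95_risk = monte_carlo_risk()
```

    Reporting both the mean and the 95th percentile mirrors the abstract's finding that mean risks can sit below a benchmark while upper-percentile risks exceed it.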

  6. Strategy Guideline. Accurate Heating and Cooling Load Calculations

    SciTech Connect

    Burdick, Arlan

    2011-06-01

    This guide presents the key criteria required to create accurate heating and cooling load calculations and offers examples of the implications when inaccurate adjustments are applied to the HVAC design process. The guide shows, through realistic examples, how various defaults and arbitrary safety factors can lead to significant increases in the load estimate. Emphasis is placed on the risks incurred from inaccurate adjustments or ignoring critical inputs of the load calculation.

  8. Risk estimates for radiation-induced cancer--the epidemiological evidence.

    PubMed

    Kellerer, A M

    2000-03-01

    The risk of low-dose radiation exposures has--for a variety of reasons--been highly politicised. This has led to a frequently exaggerated perception of the potential health effects, and to lasting public controversies. A balanced view requires a critical reassessment of the epidemiological basis of current assumptions. There is reliable quantitative information available on the increase of cancer rates due to moderate and high doses. This provides a firm basis for the derivation of probabilities of causation, e.g. after high radiation exposures. For small doses or dose rates, the situation is entirely different: potential increases of cancer rates remain hidden below the statistical fluctuations of normal rates, and the molecular mechanisms of carcinogenesis are not sufficiently well known to allow numerical predictions. Risk coefficients for radiation protection must, therefore, be based on the uncertain extrapolation of observations obtained at moderate or high doses. While extrapolation is arbitrary, it is, nevertheless, used, mostly with the conservative assumption of a linear dose dependence with no threshold (LNT model). All risk estimates are based on this hypothesis. They are, thus, virtual guidelines, rather than firm numbers. The observations on the A-bomb survivors are still the major source of information on the health effects of comparatively small radiation doses. A fairly direct inspection of the data shows that the solid cancer mortality data of the A-bomb survivors are equally consistent with linearity in dose and with reduced effectiveness at low doses. In the leukemia data a reduction is strongly indicated. With one notable exception--leukemia after prenatal exposure--these observations are in line with a multitude of observations in groups of persons exposed for medical reasons. The low-dose effects of densely ionizing radiations--such as alpha-particles from radon decay products or high-energy neutrons--are a separate important issue.

  9. Improved risk estimates for carbon tetrachloride. Annual progress report, October 1, 1996--September 30, 1997

    SciTech Connect

    Benson, J.M.

    1997-10-27

    Carbon tetrachloride (CCl{sub 4}) has been used extensively within the Department of Energy (DOE) nuclear weapons facilities. Rocky Flats was formerly the largest volume user of CCl{sub 4} in the US, with 5,000 gallons used there in 1977 alone. At the Hanford site, several hundred thousand gallons of CCl{sub 4} were discharged between 1955 and 1973 into underground cribs for storage. Levels of CCl{sub 4} in groundwater at highly contaminated sites at the Hanford facility have exceeded the drinking water standard of 5 ppb by several orders of magnitude. High levels of CCl{sub 4} at these facilities represent a potential health hazard for workers conducting cleanup operations and for surrounding communities. The level of CCl{sub 4} cleanup required at these sites and the associated costs are driven by current human health risk estimates, which assume that CCl{sub 4} is a genotoxic carcinogen. The overall purpose of these studies is to improve the scientific basis for assessing the health risk associated with human exposure to CCl{sub 4}. Specifically, the authors will determine the toxicokinetics of inhaled and ingested CCl{sub 4} in F344/Crl rats, B6C3F1 mice, and Syrian hamsters. They will also evaluate species differences in the metabolism of CCl{sub 4} by rats, mice, hamsters, and man. Dose-response relationships will be determined in all these studies. This information will be used to improve the physiologically based pharmacokinetic (PBPK) model for CCl{sub 4} originally developed by Paustenbach et al. (1988) and more recently revised by Thrall and Kenny (1996). They will also provide scientific evidence that CCl{sub 4}, like chloroform, is a hepatocarcinogen only when exposure results in cell damage, cell killing, and regenerative cell proliferation.
In combination, the studies outlined in this proposal will provide the exact types of information needed to enable refined cancer risk estimates for CCl{sub 4} under the new guidelines for risk assessment proposed by the

  10. Simulation-extrapolation method to address errors in atomic bomb survivor dosimetry on solid cancer and leukaemia mortality risk estimates, 1950-2003.

    PubMed

    Allodji, Rodrigue S; Schwartz, Boris; Diallo, Ibrahima; Agbovon, Césaire; Laurier, Dominique; de Vathaire, Florent

    2015-08-01

    Analyses of the Life Span Study (LSS) of Japanese atomic bombing survivors have routinely incorporated corrections for additive classical measurement errors using regression calibration. Recently, several studies reported that the simulation-extrapolation method (SIMEX) is slightly more accurate than the simple regression calibration method (RCAL). In the present paper, the SIMEX and RCAL methods have been used to address errors in atomic bomb survivor dosimetry on solid cancer and leukaemia mortality risk estimates. For instance, it is shown that using the SIMEX method, the ERR/Gy is increased by an amount of about 29 % for all solid cancer deaths using a linear model compared to the RCAL method, and the corrected EAR 10(-4) person-years at 1 Gy (the linear term) is decreased by about 8 %, while the corrected quadratic term (EAR 10(-4) person-years/Gy(2)) is increased by about 65 % for leukaemia deaths based on a linear-quadratic model. The results with the SIMEX method are slightly higher than published values. The observed differences were probably due to the fact that with the RCAL method the dosimetric data were partially corrected, while all doses were considered with the SIMEX method. Therefore, one should be careful when comparing the estimated risks, and it may be useful to use several correction techniques in order to obtain a range of corrected estimates, rather than to rely on a single technique. This work will help improve the risk estimates derived from LSS data and support more reliable development of radiation protection standards. PMID:25894839
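    The SIMEX idea, adding simulated measurement error at increasing levels and extrapolating the naive estimate back to a hypothetical error-free level, can be sketched for a simple attenuated regression slope. This is toy data under classical additive error, not the LSS dosimetry analysis:

```python
# Minimal SIMEX sketch: re-fit the naive OLS slope with extra simulated error
# at levels lambda = 0, 1, 2, then evaluate the quadratic through those three
# points at lambda = -1. Fabricated data; known error SD sigma_u.

import random

random.seed(1)

n, beta_true, sigma_u = 2000, 1.0, 0.5

x_true = [random.gauss(0, 1) for _ in range(n)]
x_obs = [x + random.gauss(0, sigma_u) for x in x_true]  # error-prone doses
y = [beta_true * x + random.gauss(0, 0.3) for x in x_true]

def ols_slope(xs, ys):
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    return (sum((a - mx) * (b - my) for a, b in zip(xs, ys))
            / sum((a - mx) ** 2 for a in xs))

def mean_slope_at(lam, b=50):
    """Average naive slope after adding error with variance lam * sigma_u^2."""
    fits = []
    for _ in range(b):
        x_sim = [x + random.gauss(0, lam ** 0.5 * sigma_u) for x in x_obs]
        fits.append(ols_slope(x_sim, y))
    return sum(fits) / b

m0, m1, m2 = mean_slope_at(0.0), mean_slope_at(1.0), mean_slope_at(2.0)
# Lagrange quadratic through (0, m0), (1, m1), (2, m2) evaluated at lambda = -1
beta_simex = 3 * m0 - 3 * m1 + m2
naive = ols_slope(x_obs, y)
```

    The naive slope is attenuated toward zero by the measurement error; the extrapolated SIMEX estimate recovers most of the true slope, illustrating why the corrected ERR/Gy in the abstract exceeds the naive one.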

  11. Bayesian Risk Mapping and Model-Based Estimation of Schistosoma haematobium–Schistosoma mansoni Co-distribution in Côte d'Ivoire

    PubMed Central

    Chammartin, Frédérique; Houngbedji, Clarisse A.; Hürlimann, Eveline; Yapi, Richard B.; Silué, Kigbafori D.; Soro, Gotianwa; Kouamé, Ferdinand N.; N'Goran, Eliézer K.; Utzinger, Jürg; Raso, Giovanna; Vounatsou, Penelope

    2014-01-01

    Background Schistosoma haematobium and Schistosoma mansoni are blood flukes that cause urogenital and intestinal schistosomiasis, respectively. In Côte d'Ivoire, both species are endemic and control efforts are being scaled up. Accurate knowledge of the geographical distribution, including delineation of high-risk areas, is a central feature for spatial targeting of interventions. Thus far, model-based predictive risk mapping of schistosomiasis has relied on historical data of separate parasite species. Methodology We analyzed data pertaining to Schistosoma infection among school-aged children obtained from a national, cross-sectional survey conducted between November 2011 and February 2012. More than 5,000 children in 92 schools across Côte d'Ivoire participated. Bayesian geostatistical multinomial models were developed to assess infection risk, including S. haematobium–S. mansoni co-infection. The predicted risk of schistosomiasis was utilized to estimate the number of children that need preventive chemotherapy with praziquantel according to World Health Organization guidelines. Principal Findings We estimated that 8.9% of school-aged children in Côte d'Ivoire are affected by schistosomiasis; 5.3% with S. haematobium and 3.8% with S. mansoni. Approximately 2 million annualized praziquantel treatments would be required for preventive chemotherapy at health districts level. The distinct spatial patterns of S. haematobium and S. mansoni imply that co-infection is of little importance across the country. Conclusions/Significance We provide a comprehensive analysis of the spatial distribution of schistosomiasis risk among school-aged children in Côte d'Ivoire and a strong empirical basis for a rational targeting of control interventions. PMID:25522007

  12. Flood risk assessment in France: comparison of extreme flood estimation methods (EXTRAFLO project, Task 7)

    NASA Astrophysics Data System (ADS)

    Garavaglia, F.; Paquet, E.; Lang, M.; Renard, B.; Arnaud, P.; Aubert, Y.; Carre, J.

    2013-12-01

    In flood risk assessment, methods can be divided into two families: deterministic methods and probabilistic methods. In the French hydrologic community, probabilistic methods have historically been preferred to deterministic ones. Presently a French research project named EXTRAFLO (RiskNat Program of the French National Research Agency, https://extraflo.cemagref.fr) deals with design values for extreme rainfall and floods. The object of this project is to carry out a comparison of the main methods used in France for estimating extreme values of rainfall and floods, to obtain a better grasp of their respective fields of application. In this framework we present the results of Task 7 of the EXTRAFLO project. Focusing on French watersheds, we compare the main extreme flood estimation methods used in the French context: (i) standard flood frequency analysis (Gumbel and GEV distributions), (ii) regional flood frequency analysis (regional Gumbel and GEV distributions), (iii) local and regional flood frequency analysis improved by historical information (Naulet et al., 2005), (iv) simplified probabilistic methods based on rainfall information (i.e. the Gradex method (CFGB, 1994), Agregee method (Margoum, 1992) and Speed method (Cayla, 1995)), (v) flood frequency analysis by a continuous simulation approach based on rainfall information (i.e. the Schadex method (Paquet et al., 2013; Garavaglia et al., 2010) and Shyreg method (Lavabre et al., 2003)) and (vi) a multifractal approach. The main result of this comparative study is that probabilistic methods based on additional information (i.e. regional, historical and rainfall information) provide better estimations than standard flood frequency analysis. Another interesting result is that the differences between the various extreme flood quantile estimations of the compared methods increase with return period, staying relatively moderate up to 100-year return levels.
Results and discussions are here illustrated throughout with the example
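    Of the compared methods, standard flood frequency analysis with a Gumbel distribution is simple enough to sketch: fit by method of moments and read off return levels. The annual-maximum discharges below are simulated, not from any French watershed:

```python
# Standard Gumbel flood-frequency analysis: method-of-moments fit plus
# return-level formula. The annual-maximum series is simulated, not observed.

import math
import random

random.seed(7)

EULER_GAMMA = 0.5772156649

# simulate 200 annual maxima from Gumbel(loc=500, scale=120), in m^3/s
ams = [500 - 120 * math.log(-math.log(random.random())) for _ in range(200)]

def fit_gumbel_moments(sample):
    """Method-of-moments estimates of Gumbel location and scale."""
    n = len(sample)
    mean = sum(sample) / n
    var = sum((x - mean) ** 2 for x in sample) / (n - 1)
    scale = math.sqrt(6.0 * var) / math.pi
    return mean - EULER_GAMMA * scale, scale

def return_level(loc, scale, t_years):
    """Discharge exceeded on average once every t_years."""
    return loc - scale * math.log(-math.log(1.0 - 1.0 / t_years))

loc, scale = fit_gumbel_moments(ams)
q100 = return_level(loc, scale, 100)
```

    Because the return-level formula extrapolates the fitted tail, estimates from different distributions or fitting strategies diverge as the return period grows, consistent with the comparison reported above.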

  13. Patient-specific radiation dose and cancer risk estimation in pediatric chest CT: a study in 30 patients

    NASA Astrophysics Data System (ADS)

    Li, Xiang; Samei, Ehsan; Segars, W. Paul; Sturgeon, Gregory M.; Colsher, James G.; Frush, Donald P.

    2010-04-01

    Radiation-dose awareness and optimization in CT can greatly benefit from a dose-reporting system that provides radiation dose and cancer risk estimates specific to each patient and each CT examination. Recently, we reported a method for estimating patient-specific dose from pediatric chest CT. The purpose of this study is to extend that effort to patient-specific risk estimation and to a population of pediatric CT patients. Our study included thirty pediatric CT patients (16 males and 14 females; 0-16 years old), for whom full-body computer models were recently created based on the patients' clinical CT data. Using a validated Monte Carlo program, organ dose received by the thirty patients from a chest scan protocol (LightSpeed VCT, 120 kVp, 1.375 pitch, 40-mm collimation, pediatric body scan field-of-view) was simulated and used to estimate patient-specific effective dose. Risks of cancer incidence were calculated for radiosensitive organs using gender-, age-, and tissue-specific risk coefficients and were used to derive patient-specific effective risk. The thirty patients had normalized effective dose of 3.7-10.4 mSv/100 mAs and normalized effective risk of 0.5-5.8 cases/1000 exposed persons/100 mAs. Normalized lung dose and risk of lung cancer correlated strongly with average chest diameter (correlation coefficient: r = -0.98 to -0.99). Normalized effective risk also correlated strongly with average chest diameter (r = -0.97 to -0.98). These strong correlations can be used to estimate patient-specific dose and risk prior to or after an imaging study to potentially guide healthcare providers in justifying CT examinations and to guide individualized protocol design and optimization.
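    The reported correlations suggest predicting normalized dose from average chest diameter with a simple fitted relation. A sketch with fabricated (diameter, dose) pairs, using an ordinary linear fit rather than whatever functional form the authors fitted to their cohort:

```python
# Using a dose-diameter correlation for prediction: fit a linear relation of
# normalized effective dose vs. average chest diameter, then predict the dose
# for a new patient. All data pairs below are fabricated.

def linear_fit(xs, ys):
    """Ordinary least-squares intercept and slope."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b

diameters = [10, 12, 15, 18, 21, 24]     # average chest diameter, cm (fabricated)
doses = [10.0, 9.0, 7.5, 6.1, 4.9, 3.9]  # normalized dose, mSv/100 mAs (fabricated)

a, b = linear_fit(diameters, doses)
predicted = a + b * 17.0  # predicted normalized dose for a 17-cm chest
```

    The negative fitted slope matches the reported negative correlations: larger patients attenuate more of the beam, so normalized organ dose falls as chest diameter grows.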

  14. Improved Radiation Dosimetry/Risk Estimates to Facilitate Environmental Management of Plutonium-Contaminated Sites

    SciTech Connect

    Scott, Bobby R.; Tokarskaya, Zoya B.; Zhuntova, Galina V.; Osovets, Sergey V.; Syrchikov, Victor A.; Belyaeva, Zinaida D.

    2007-12-14

    This report summarizes 4 years of research achievements in this Office of Science (BER), U.S. Department of Energy (DOE) project. The research described was conducted by scientists and supporting staff at Lovelace Respiratory Research Institute (LRRI)/Lovelace Biomedical and Environmental Research Institute (LBERI) and the Southern Urals Biophysics Institute (SUBI). All project objectives and goals were achieved. A major focus was on obtaining improved cancer risk estimates for exposure via inhalation to plutonium (Pu) isotopes in the workplace (DOE radiation workers) and environment (public exposures to Pu-contaminated soil). A major finding was that low doses and dose rates of gamma rays can significantly suppress cancer induction by alpha radiation from inhaled Pu isotopes. The suppression relates to stimulation of the body's natural defenses, including immunity against cancer cells and selective apoptosis which removes precancerous and other aberrant cells.

  15. Combining Radiation Epidemiology With Molecular Biology-Changing From Health Risk Estimates to Therapeutic Intervention.

    PubMed

    Abend, Michael; Port, Matthias

    2016-08-01

    The authors herein summarize six presentations dedicated to the key session "molecular radiation epidemiology" of the ConRad meeting 2015. These presentations were chosen in order to highlight the promise of combining conventional radiation epidemiology with molecular biology. Conventional radiation epidemiology uses dose estimates for risk predictions on health. However, combined with molecular biology, dose-dependent bioindicators of effect hold the promise to improve clinical diagnostics and to provide target molecules for potential therapeutic intervention. One of the six presentations exemplified the use of radiation-induced molecular changes as biomarkers of exposure by measuring stable chromosomal translocations. The remaining five presentations focused on molecular changes used as bioindicators of effect. These bioindicators could be used for diagnostic purposes in colon cancers (genomic instability), thyroid cancer (CLIP2), or head and neck squamous cell cancers. Therapeutic implications of gene expression changes were examined in Chernobyl thyroid cancer victims and Mayak workers. PMID:27356062

  16. Schistosomiasis risk estimation in Minas Gerais State, Brazil, using environmental data and GIS techniques.

    PubMed

    Guimarães, Ricardo J P S; Freitas, Corina C; Dutra, Luciano V; Moura, Ana C M; Amaral, Ronaldo S; Drummond, Sandra C; Scholte, Ronaldo G C; Carvalho, Omar S

    2008-01-01

    The influence of climate and environmental variables on the distribution of schistosomiasis has been assessed in several previous studies. Geographical Information System (GIS) technology has also recently been tested as a tool for better understanding the spatial distribution of the disease. The objective of this paper is to further develop GIS technology for modeling and control of schistosomiasis using meteorological and social variables and introducing new potential environment-related variables, particularly those produced by recently launched orbital sensors such as the Moderate Resolution Imaging Spectroradiometer (MODIS) and the Shuttle Radar Topography Mission (SRTM). Three different scenarios were analyzed, and despite a modest coefficient of determination, the standard deviation of the risk estimates was considered adequate for public health needs. The main variables selected as important for modeling purposes were topographic elevation, summer minimum temperature, the NDVI vegetation index, and the social index HDI91. PMID:18692017

  17. Update of identification and estimation of socioeconomic impacts resulting from perceived risks and changing images: An annotated bibliography

    SciTech Connect

    Nieves, L.A.; Clark, D.E.; Wernette, D.

    1991-08-01

    This annotated bibliography reviews selected literature published through August 1991 on the identification of perceived risks and methods for estimating the economic impacts of risk perception. It updates the literature review found in Argonne National Laboratory report ANL/EAIS/TM-24 (February 1990). Included in this update are (1) a literature review of the risk perception process, of the relationship between risk perception and economic impacts, of economic methods and empirical applications, and of interregional market interactions and adjustments; (2) a working bibliography (that includes the documents abstracted in the 1990 report); (3) a topical index to the abstracts found in both reports; and (4) abstracts of selected articles found in this update.

  18. Estimation of Value-at-Risk for Energy Commodities via CAViaR Model

    NASA Astrophysics Data System (ADS)

    Xiliang, Zhao; Xi, Zhu

    This paper uses the Conditional Autoregressive Value at Risk (CAViaR) model proposed by Engle and Manganelli (2004) to evaluate the value-at-risk of daily spot prices of Brent crude oil and West Texas Intermediate (WTI) crude oil covering the period May 21, 1987 to November 18, 2008. The accuracy of the estimates from the CAViaR, Normal-GARCH, and GED-GARCH models was then compared. The results show that all the methods do a good job at the low confidence level (95%): GED-GARCH is the best for the spot WTI price, while Normal-GARCH and Adaptive-CAViaR are the best for the spot Brent price. However, at the high confidence level (99%), Normal-GARCH does a good job for spot WTI, while GED-GARCH and all four CAViaR specifications do well for the spot Brent price; Normal-GARCH does badly for the spot Brent price. The results suggest that CAViaR performs as well as GED-GARCH, since CAViaR directly models the quantile autoregression, but it does not outperform GED-GARCH, although it does outperform Normal-GARCH.
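
    CAViaR models the VaR quantile itself with an autoregression. A minimal sketch of the Symmetric Absolute Value specification from Engle and Manganelli (2004), with illustrative coefficients (in practice these are estimated by quantile regression, not assumed):

```python
# Sketch of the Symmetric Absolute Value CAViaR recursion:
#   VaR_t = b0 + b1 * VaR_{t-1} + b2 * |r_{t-1}|
# (positive VaR = loss quantile). Coefficients here are illustrative.
def caviar_sav(returns, b0=0.05, b1=0.85, b2=0.25, var0=1.0):
    """Generate the one-step-ahead VaR series for a return series."""
    var = [var0]
    for r in returns[:-1]:
        var.append(b0 + b1 * var[-1] + b2 * abs(r))
    return var

rets = [0.5, -2.0, 1.0, -0.3]
vars_ = caviar_sav(rets)
```

    The recursion makes VaR rise after large absolute returns and decay geometrically otherwise, without assuming any particular return distribution.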

  19. Regression models and risk estimation for mixed discrete and continuous outcomes in developmental toxicology.

    PubMed

    Regan, M M; Catalano, P J

    2000-06-01

    Multivariate dose-response models have recently been proposed for developmental toxicity data to simultaneously model malformation incidence (a binary outcome), and reductions in fetal weight (a continuous outcome). In this and other applications, the binary outcome often represents a dichotomization of another outcome or a composite of outcomes, which facilitates analysis. For example, in Segment II developmental toxicology studies, multiple malformation types (i.e., external, visceral, skeletal) are evaluated on each fetus; malformation status may also be ordinally measured (e.g., normal, signs of variation, full malformation). A model is proposed for fetal weight and multiple malformation variables measured on an ordinal scale, where the correlations between the outcomes and between the offspring within a litter are taken into account. Fully specifying the joint distribution of outcomes within a litter is avoided by specifying only the distribution of the multivariate outcome for each fetus and using generalized estimating equation methodology to account for correlations due to litter clustering. The correlations between the outcomes are required to characterize joint risk to the fetus, and are therefore a focus of inference. Dose-response models and their application to quantitative risk assessment are illustrated using data from a recent developmental toxicology experiment of ethylene oxide in mice. PMID:10949415

  20. Estimate of the secondary cancer risk from megavoltage CT in tomotherapy

    NASA Astrophysics Data System (ADS)

    Kim, Dong Wook; Chung, Weon Kuu; Ahn, Sung Hwan; Yoon, Myonggeun

    2013-04-01

    We have assessed the radiation-induced excess cancer risk to organs from megavoltage computed tomography (MVCT). MVCT was performed in coarse, normal and fine scanning modes. Using a glass dosimeter, we measured the primary and the secondary doses inside a homemade phantom and at various distances from the imaging center. The organ-specific excess absolute risk (EAR) for cancer induction was estimated using an organ-equivalent dose (OED) based on the measured imaging doses. The average primary doses inside the phantom for the coarse, normal and fine scanning modes were 0.78, 1.15 and 2.15 cGy, respectively. The average secondary dose per scan, measured 20 to 60 cm from the imaging center, ranged from 0.044 to 0.008 cGy. The EAR for major organs indicated that when 30 MVCT scans are performed to position each patient during the course of radiation treatment, organ-specific cancers may develop in as many as 6 per 10,000 persons per year.
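
    The EAR estimate above combines a per-scan imaging dose, the number of scans over the treatment course, and an organ-equivalent-dose-based risk coefficient. A minimal sketch, assuming a linear dose-response (for which the OED reduces to the mean organ dose) and illustrative numbers rather than the study's measured values:

```python
# Sketch: OED and excess absolute risk for a linear dose-response model.
# The per-scan dose and the EAR coefficient are illustrative only.
def oed_linear(point_doses_gy):
    """For a linear dose-response the OED is the mean organ dose."""
    return sum(point_doses_gy) / len(point_doses_gy)

def excess_cases(dose_per_scan_gy, n_scans, ear_per_gy_per_10k):
    """Excess cancers per 10,000 persons per year for the imaging course."""
    return ear_per_gy_per_10k * dose_per_scan_gy * n_scans

# 30 MVCT positioning scans at ~1.15 cGy each (normal mode, illustrative):
cases = excess_cases(dose_per_scan_gy=0.0115, n_scans=30,
                     ear_per_gy_per_10k=8.0)
```

    With these assumed numbers the course of 30 scans yields on the order of a few excess cases per 10,000 persons per year, the same order as the abstract reports.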

  1. End-to-end flood risk assessment: A coupled model cascade with uncertainty estimation

    NASA Astrophysics Data System (ADS)

    McMillan, Hilary K.; Brasington, James

    2008-03-01

    This paper presents the case for an `End-to-End' flood inundation modeling strategy: the creation of a coupled system of models to allow continuous simulation methodology to be used to predict the magnitude and simulate the effects of high return period flood events. The framework brings together the best in current thinking on reduced complexity modeling to formulate an efficient, process-based methodology which meets the needs of today's flood mitigation strategies. The model chain is subject to stochasticity and parameter uncertainty, and integral methods to allow the propagation and quantification of uncertainty are essential in order to produce robust estimates of flood risk. Results from an experimental application are considered in terms of their implications for successful floodplain management, and compared against the deterministic methodology more commonly in use for flood risk assessment applications. The provenance of predictive uncertainty is also considered in order to identify those areas where future effort in terms of data collection or model refinement might best be directed in order to narrow prediction bounds and produce a more precise forecast.

  2. Estimation of risks by chemicals produced during laser pyrolysis of tissues

    NASA Astrophysics Data System (ADS)

    Weber, Lothar W.; Spleiss, Martin

    1995-01-01

    Use of laser systems in minimally invasive surgery results in the formation of a laser aerosol containing volatile organic compounds (VOCs) of possible health risk. Based on the chemical substances identified to date, an overview of the possibly associated risks to human health is given. The class of the different identified alkylnitriles seems to be a laser-specific toxicological problem. Other groups of chemicals arise from Maillard-type reactions, fatty acid pyrolysis, or thermally activated chemolysis. The possible exposure ranges of the identified substances are discussed in relation to the available threshold limit values. A rough estimation yields an exposure range of less than 1/100 of the threshold limit value for almost all substances for which human threshold limit values exist, without regard to possible interactions. For most identified alkylnitriles, alkenes, and heterocycles, no threshold limit values have so far been established, for lack of practical need until now. Pyrolysis of organs anaesthetized with isoflurane gave no indication of additional pyrolysis products arising from fragment interactions with the resulting VOCs. Measurements of pyrolysis gases detected small amounts of NO, with additional NO2 formation at plasma status.

  3. Application of quantitative estimates of fecal hemoglobin concentration for risk prediction of colorectal neoplasia

    PubMed Central

    Liao, Chao-Sheng; Lin, Yu-Min; Chang, Hung-Chuen; Chen, Yu-Hung; Chong, Lee-Won; Chen, Chun-Hao; Lin, Yueh-Shih; Yang, Kuo-Ching; Shih, Chia-Hui

    2013-01-01

    AIM: To determine the role of the fecal immunochemical test (FIT), used to evaluate fecal hemoglobin concentration, in the prediction of histological grade and risk of colorectal tumors. METHODS: We enrolled 17881 individuals who attended the two-step colorectal cancer screening program in a single hospital between January 2010 and October 2011. Colonoscopy was recommended to the participants with an FIT of ≥ 12 ngHb/mL buffer. We classified colorectal lesions as cancer (C), advanced adenoma (AA), adenoma (A), and others (O) by their colonoscopic and histological findings. Multiple linear regression analysis adjusted for age and gender was used to determine the association between the FIT results and colorectal tumor grade. The risk of adenomatous neoplasia was estimated by calculating the positive predictive values for different FIT concentrations. RESULTS: The positive rate of the FIT was 10.9% (1948/17881). The attendance rate for colonoscopy was 63.1% (1229/1948). The number of false positive results was 23. Of these 1229 cases, the numbers of O, A, AA, and C were 759, 221, 201, and 48, respectively. Regression analysis revealed a positive association between histological grade and FIT concentration (β = 0.088, P < 0.01). A significant log-linear relationship was found between the concentration and positive predictive value of the FIT for predicting colorectal tumors (R2 > 0.95, P < 0.001). CONCLUSION: Higher FIT concentrations are associated with more advanced histological grades. Risk prediction for colorectal neoplasia based on individual FIT concentrations is significant and may help to improve the performance of screening programs. PMID:24363529
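
    The reported log-linear relationship between FIT concentration and positive predictive value can be written PPV = a + b·log10(concentration). A minimal sketch with hypothetical coefficients (the study reports only that R2 > 0.95 for such a fit, not these values):

```python
# Sketch: log-linear risk prediction from fecal hemoglobin concentration,
# PPV = a + b * log10(conc). Coefficients a, b are hypothetical.
import math

def ppv(conc_ng_per_ml, a=0.05, b=0.12):
    """Predicted probability of colorectal neoplasia at a FIT level."""
    return a + b * math.log10(conc_ng_per_ml)

# PPV rises with concentration: 100 vs 1000 ngHb/mL buffer
low, high = ppv(100.0), ppv(1000.0)
```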

  4. Radiation doses and estimated risk from angiographic projections during coronary angiography performed using novel flat detector.

    PubMed

    Varghese, Anna; Livingstone, Roshan S; Varghese, Lijo; Kumar, Parveen; Srinath, Sirish Chandra; George, Oommen K; George, Paul V

    2016-01-01

    The coronary angiography (CA) procedure uses various angiographic projections to elicit detailed information about the coronary arteries, with some steep projections involving high radiation doses to patients. This study evaluates radiation doses and estimated risk from angiographic projections during the CA procedure performed using a novel flat detector (FD) system with improved image processing and noise reduction techniques. Real-time monitoring of radiation doses using a kerma-area product (KAP) meter was performed for 140 patients using the Philips Clarity FD system. The CA procedure involved seven standard projections, of which five were extensively selected by interventionalists. Mean fluoroscopic time (FT), KAP, and reference air kerma (Ka,r) for the CA procedure were 3.24 min (0.5-10.51), 13.99 Gy cm2 (4.02-37.6), and 231.43 mGy (73.8-622.15), respectively. Effective dose calculated using Monte Carlo-based PCXMC software was found to be 4.9 mSv. The left anterior oblique (LAO) 45° projection contributed the highest share (28%) of the overall KAP. Radiation-induced risk was found to be higher in females compared to males, with an increased risk of lung cancer. An increase of 10%-15% in radiation dose was observed when one or more additional projections were adopted along with the seven standard projections. A 14% reduction of radiation dose was achieved with the novel FD system when a low-dose protocol during fluoroscopy and a medium-dose protocol during cine acquisitions were adopted, compared to a medium-dose protocol throughout. PMID:27167263

  5. Estimating functional connectivity of wildlife habitat and its relevance to ecological risk assessment

    USGS Publications Warehouse

    Johnson, A.R.; Allen, C.R.; Simpson, K.A.N.

    2004-01-01

    Habitat fragmentation is a major threat to the viability of wildlife populations and the maintenance of biodiversity. Fragmentation relates to the sub-division of habitat into disjunct patches. Usually coincident with fragmentation per se is loss of habitat, a reduction in the size of the remnant patches, and increasing distance between patches. Natural and anthropogenic processes leading to habitat fragmentation occur at many spatial scales, and their impacts on wildlife depend on the scales at which species interact with the landscape. The concept of functional connectivity captures this organism-based view of the relative ease of movement or degree of exchange between physically disjunct habitat patches. Functional connectivity of a given habitat arrangement for a given wildlife species depends on details of the organism's life history and behavioral ecology, but, for broad categories of species, quantities such as home range size and dispersal distance scale allometrically with body mass. These relationships can be incorporated into spatial analyses of functional connectivity, which can be quantified by indices or displayed graphically in maps. We review indices and GIS-based approaches to estimating functional connectivity, presenting examples from the literature and our own work on mammalian distributions. Such analyses can be readily incorporated within an ecological risk framework. Estimates of functional connectivity may be useful in a screening-level assessment of the impact of habitat fragmentation relative to other stressors, and may be crucial in detailed population modeling and viability analysis.
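
    The allometric idea above can be made concrete: dispersal distance is modeled as a power law in body mass, and two patches count as functionally connected for a species when the gap between them is within its predicted dispersal range. A minimal sketch with hypothetical scaling coefficients (not values from the review):

```python
# Sketch: allometric scaling of dispersal distance, D = a * M^b, and a
# minimal functional-connectivity check. Coefficients a, b are
# hypothetical placeholders for fitted allometric constants.
def dispersal_distance_km(body_mass_kg, a=3.31, b=0.65):
    """Predicted dispersal distance (km) from body mass (kg)."""
    return a * body_mass_kg ** b

def functionally_connected(gap_km, body_mass_kg):
    """True if the inter-patch gap is within the species' predicted
    dispersal range, i.e. the patches are functionally connected."""
    return gap_km <= dispersal_distance_km(body_mass_kg)

# The same 12 km gap connects patches for a 10 kg carnivore
# but fragments habitat for a 0.05 kg rodent.
```

    This is the organism-based view in miniature: connectivity is a property of the species-landscape pair, not of the landscape alone.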

  6. [Comparative analysis of modern approaches to risk estimation from artificially created nanoparticles and nanomaterials].

    PubMed

    Kazak, A A; Stepanov, E G; Gmoshinskiĭ, I V; Khotimchenko, S A

    2012-01-01

    The article reviews modern approaches to estimating the risks posed by nanotechnologies and nanomaterials to human health and the environment, as elaborated in the EU, the USA, and some international authorities. The data presented suggest substantial agreement with the approaches being developed and introduced in the Russian Federation under the guidance of Rospotrebnadzor. In particular, the criteria used in the Russian Federation and the EU for classifying nanotechnology and nanoindustry products are largely similar. They include: a) identification of nanomaterials in a product; b) assessment of whether the product can disintegrate with concomitant migration of free nanoparticles; c) the possibility of nanoparticle emission/migration both under normal conditions of use and under possible emergency conditions; d) the degree of proximity of the particular kind of product to its consumer, which determines whether the probability of exposure is close to zero (unmanned operation) or approximately 100% (in the case of medicines, foods, and cosmetics); e) biological assessment of the potential hazard of nanomaterials according to the current body of scientific information. For nanotechnology plants, the criteria in use are: a) nanomaterial identification; b) the possibility of personnel exposure; c) the potential toxicity of the material in aerosol nano-form; d) the characteristics of its biological action. Thus, the principles applied in Russia for assessing the safety of nanomaterials do not contradict the concepts of foreign authorities, which opens the possibility of harmonizing these approaches with internationally recognized norms. PMID:23156045

  7. Estimating the Influence of Oil and Gas Emissions on Urban Ozone and Associated Health Risks

    NASA Astrophysics Data System (ADS)

    Capps, S.; Nsanzineza, R.; Turner, M. D.; Henze, D. K.; Zhao, S.; Russell, M. G.; Hakami, A.; Milford, J. B.

    2015-12-01

    Tropospheric ozone (O3) degrades air quality, impacting human health and public welfare. The National Ambient Air Quality Standard (NAAQS) is designed to limit these impacts, but certain areas in the continental U.S. exceed this standard. Mitigating O3 NAAQS exceedances by designing emissions controls can be complicated in urban areas because of the long-range transport of ozone and its gaseous precursors as well as the complex mix of local emissions sources. Recent growth of unconventional oil and gas development near urban areas in Colorado, Texas, and the northeastern corridor has exacerbated this problem. To estimate the contribution of emissions from oil and gas development to urban O3 issues, we apply the CMAQ adjoint, which efficiently elucidates the relative influence of emissions sources on select concentration-based metrics. Specifically, the adjoint is used to calculate the spatially-specific relative contributions of emissions of oxides of nitrogen (NOx) and volatile organic compounds (VOCs) throughout the continental U.S. to O3 NAAQS exceedances and to ozone-related health risks in select urban areas. By evaluating these influences for different urban areas, including one in California that has been managing air quality with adjacent oil and gas development for a longer period of time, we are able to compare and contrast the emissions control strategies that may be more effective in particular regions. Additionally, the resulting relationships between emissions and concentrations provide a way to project ozone impacts when measurements provide refined estimates of emissions from this sector.

  8. Iterative weighted risk estimation for nonlinear image restoration with analysis priors

    NASA Astrophysics Data System (ADS)

    Ramani, Sathish; Rosen, Jeffrey; Liu, Zhihao; Fessler, Jeffrey A.

    2012-03-01

    Image acquisition systems invariably introduce blur, which necessitates the use of deblurring algorithms for image restoration. Restoration techniques involving regularization require appropriate selection of the regularization parameter that controls the quality of the restored result. We focus on the problem of automatic adjustment of this parameter for nonlinear image restoration using analysis-type regularizers such as total variation (TV). For this purpose, we use two variants of Stein's unbiased risk estimate (SURE), Predicted-SURE and Projected-SURE, that are applicable for parameter selection in inverse problems involving Gaussian noise. These estimates require the Jacobian matrix of the restoration algorithm evaluated with respect to the data. We derive analytical expressions to recursively update the desired Jacobian matrix for a fast variant of the iterative reweighted least-squares restoration algorithm that can accommodate a variety of regularization criteria. Our method can also be used to compute a nonlinear version of the generalized cross-validation (NGCV) measure for parameter tuning. We demonstrate using simulations that Predicted-SURE, Projected-SURE, and NGCV-based adjustment of the regularization parameter yields near-MSE-optimal results for image restoration using TV, an analysis-type ℓ1-regularization, and a smooth convex edge-preserving regularizer.
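
    The classical SURE expression behind such estimators is SURE = ||y - f(y)||² - Nσ² + 2σ² div f(y), where the divergence term is the hard part. A minimal sketch using a random-probe (Monte Carlo) divergence estimate, a standard alternative to the paper's recursive Jacobian computation, with a toy linear shrinkage f for which the true divergence is known:

```python
# Sketch: SURE with a Monte Carlo divergence estimate,
#   div f(y) ≈ b^T (f(y + eps*b) - f(y)) / eps,  b a Rademacher probe.
# The restoration operator here is a toy shrinkage f(y) = 0.8*y, so the
# exact divergence (0.8 * N) is known and the estimate can be checked.
import random

def sure(y, f, sigma, eps=1e-4, seed=0):
    rng = random.Random(seed)
    n = len(y)
    b = [rng.choice((-1.0, 1.0)) for _ in range(n)]   # random probe
    fy = f(y)
    fyb = f([yi + eps * bi for yi, bi in zip(y, b)])
    div = sum(bi * (f1 - f0) for bi, f1, f0 in zip(b, fyb, fy)) / eps
    resid = sum((yi - fi) ** 2 for yi, fi in zip(y, fy))
    return resid - n * sigma ** 2 + 2 * sigma ** 2 * div

shrink = lambda y: [0.8 * yi for yi in y]
risk = sure([1.0, -2.0, 0.5, 3.0], shrink, sigma=1.0)
```

    For a real deblurring algorithm, f would be the full iterative restoration and the probe trick avoids forming the Jacobian explicitly; the recursive Jacobian updates in the paper trade that randomness for exact derivatives.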

  9. Estimating the Pollution Risk of Cadmium in Soil Using a Composite Soil Environmental Quality Standard

    PubMed Central

    Huang, Biao; Zhao, Yongcun

    2014-01-01

    Estimating standard-exceeding probabilities of toxic metals in soil is crucial for environmental evaluation. Because soil pH and land use types have strong effects on the bioavailability of trace metals in soil, they were taken into account by some environmental protection agencies in making composite soil environmental quality standards (SEQSs) that contain multiple metal thresholds under different pH and land use conditions. This study proposed a method for estimating the standard-exceeding probability map of soil cadmium using a composite SEQS. The spatial variability and uncertainty of soil pH and site-specific land use type were incorporated through simulated realizations by sequential Gaussian simulation. A case study was conducted using a sample data set from a 150 km2 area in Wuhan City and the composite SEQS for cadmium, recently set by the State Environmental Protection Administration of China. The method may be useful for evaluating the pollution risks of trace metals in soil with composite SEQSs. PMID:24672364
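
    The core of the method above is evaluating each simulated (Cd, pH) realization against a pH-dependent threshold and counting exceedances. A minimal sketch with hypothetical threshold values (not the actual limits in the Chinese standard):

```python
# Sketch: standard-exceeding probability under a composite SEQS, where
# the Cd limit depends on soil pH. Threshold values are illustrative.
def cd_threshold(ph):
    """Hypothetical pH-dependent Cd limit (mg/kg)."""
    if ph < 6.5:
        return 0.30
    if ph < 7.5:
        return 0.45
    return 0.60

def exceedance_probability(realizations):
    """Fraction of (cd, ph) realizations exceeding the pH-specific limit."""
    hits = sum(1 for cd, ph in realizations if cd > cd_threshold(ph))
    return hits / len(realizations)

# In practice the realizations come from sequential Gaussian simulation:
sims = [(0.35, 6.0), (0.35, 7.0), (0.70, 8.0), (0.20, 6.0)]
p = exceedance_probability(sims)
```

    Note how the same concentration (0.35 mg/kg) exceeds the limit at pH 6.0 but not at pH 7.0, which is exactly why the pH realizations must be simulated jointly with the metal concentrations.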

  10. Value-at-risk estimation with wavelet-based extreme value theory: Evidence from emerging markets

    NASA Astrophysics Data System (ADS)

    Cifter, Atilla

    2011-06-01

    This paper introduces wavelet-based extreme value theory (EVT) for univariate value-at-risk estimation. Wavelets and EVT are combined for volatility forecasting to estimate a hybrid model. In the first stage, wavelets are used as a threshold in the generalized Pareto distribution, and in the second stage, EVT is applied with the wavelet-based threshold. This new model is applied to two major emerging stock markets: the Istanbul Stock Exchange (ISE) and the Budapest Stock Exchange (BUX). The relative performance of wavelet-based EVT is benchmarked against the Riskmetrics-EWMA, ARMA-GARCH, generalized Pareto distribution, and conditional generalized Pareto distribution models. The empirical results show that wavelet-based extreme value theory increases the predictive performance of financial forecasting according to the number-of-violations and tail-loss tests. The superior forecasting performance of the wavelet-based EVT model is also consistent with Basel II requirements, and the new model can be used by financial institutions as well.
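
    Once a threshold u is chosen (by wavelets in this paper) and the generalized Pareto distribution is fitted to the exceedances, the VaR at level q follows from the standard EVT formula. A minimal sketch with illustrative parameter values (not estimates from the ISE or BUX data):

```python
# Sketch of the standard GPD-based VaR formula used in EVT risk models:
#   VaR_q = u + (sigma/xi) * (((n/n_u) * (1 - q)) ** -xi - 1)
# u: threshold, (xi, sigma): fitted GPD shape/scale, n: sample size,
# n_u: number of threshold exceedances. All numbers are illustrative.
def gpd_var(q, u, xi, sigma, n, n_u):
    return u + (sigma / xi) * (((n / n_u) * (1.0 - q)) ** (-xi) - 1.0)

# 99% VaR with 2500 returns and 125 threshold exceedances:
v99 = gpd_var(q=0.99, u=2.0, xi=0.2, sigma=0.6, n=2500, n_u=125)
```

    The wavelet stage in the paper only changes how u is selected; the tail quantile itself is still read off this formula.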

  11. Estimation of effective dose and lifetime attributable risk from multiple head CT scans in ventriculoperitoneal shunted children

    PubMed Central

    Aw-Zoretic, J.; Seth, D.; Katzman, G.; Sammet, S.

    2015-01-01

    Purpose The purpose of this review is to determine the averaged effective dose and lifetime attributable risk factor from multiple head computed tomography (CT) dose data on children with ventriculoperitoneal shunts (VPS). Method and materials A total of 422 paediatric head CT exams were found between October 2008 and January 2011 and retrospectively reviewed. The CT dose data were weighted with the latest ICRP 103 conversion factor to obtain the effective dose per study, and the averaged effective dose was calculated. Estimates of the lifetime attributable risk were also calculated from the averaged effective dose using a conversion factor from the latest BEIR VII report. Results Our study found the highest effective doses in neonates and the lowest effective doses in the 10–18 years age group. We estimated a 0.007% potential increased risk in neonates and a 0.001% potential increased risk in teenagers over the base risk. Conclusion Multiple head CTs in children equate to a slight potential increase in lifetime attributable risk over the baseline risk for cancer, slightly higher in neonates relative to teenagers. The potential risks versus clinical benefit must be assessed. PMID:25130177
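
    The LAR calculation described above reduces to multiplying cumulative effective dose by a nominal risk coefficient. A minimal sketch using an illustrative BEIR VII-style coefficient and made-up per-exam doses (not the study's data):

```python
# Sketch: lifetime attributable risk (LAR) from cumulative effective
# dose via a single nominal risk coefficient. The coefficient and the
# per-exam doses are illustrative values, not the study's figures.
RISK_PER_SV = 0.057          # ~5.7% excess lifetime cancer risk per Sv

def lifetime_attributable_risk(effective_doses_msv):
    """Percent increase in lifetime cancer risk from a list of CT doses."""
    total_sv = sum(effective_doses_msv) / 1000.0
    return total_sv * RISK_PER_SV * 100.0   # as a percentage

# Four head CTs at ~0.3 mSv effective dose each (illustrative neonate):
lar_pct = lifetime_attributable_risk([0.3, 0.3, 0.3, 0.3])
```

    With these assumed inputs the result is on the order of 0.007%, the same order as the neonatal estimate quoted in the abstract; age-specific BEIR VII coefficients would replace the single nominal value in a real calculation.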

  12. Population-Based Estimate of Prostate Cancer Risk for Carriers of the HOXB13 Missense Mutation G84E

    PubMed Central

    Baglietto, Laura; Dowty, James G.; Jenkins, Mark A.; Southey, Melissa C.; Hopper, John L.; Giles, Graham G.

    2013-01-01

    The HOXB13 missense mutation G84E (rs138213197) is associated with increased risk of prostate cancer, but the current estimate of increased risk has a wide confidence interval (width of 95% confidence interval (CI) >200-fold) so the point estimate of 20-fold increased risk could be misleading. Population-based family studies can be more informative for estimating risks for rare variants; therefore, we screened for mutations in an Australian population-based series of early-onset prostate cancer cases (probands). We found that 19 of 1,384 (1.4%) probands carried the missense mutation, and of these, six (32%) had a family history of prostate cancer. We tested the 22 relatives of carriers diagnosed from 1998 to 2008 for whom we had a DNA sample, and found seven more carriers and one obligate carrier. The age-specific incidence for carriers was estimated to be, on average, 16.4 (95% CI 2.5–107.2) times that for the population over the time frame when the relatives were at risk prior to baseline. We then estimated the age- and birth year-specific cumulative risk of prostate cancer (penetrance) for carriers. For example, the penetrance for an unaffected male carrier born in 1950 was 19% (95% CI 5–46%) at age 60 years, 44% (95% CI 18–74%) at age 70 years and 60% (95% CI 30–85%) at age 80 years. Our study has provided a population-based estimate of the average risk of prostate cancer for HOXB13 missense mutation G84E carriers that can be used to guide clinical practice and research. This study has also shown that the majority of hereditary prostate cancers due to the HOXB13 missense mutation are ‘sporadic’ in the sense that unselected cases with the missense mutation do not typically report having a family history of prostate cancer. PMID:23457453
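
    The penetrance step above can be illustrated with the usual survival-analysis identity: cumulative risk = 1 - exp(-HR × cumulative population hazard). A minimal sketch with hypothetical age-specific population incidence rates (not the study's data) and the abstract's 16.4-fold relative risk:

```python
# Sketch: carrier penetrance from population incidence and a hazard
# ratio, via  penetrance = 1 - exp(-HR * sum_i incidence_i * interval_i).
# The incidence rates per decade are hypothetical placeholders.
import math

def penetrance(hr, incidences_per_year, interval_years=10):
    """Cumulative risk for a carrier over the covered age intervals."""
    cum_hazard = hr * sum(i * interval_years for i in incidences_per_year)
    return 1.0 - math.exp(-cum_hazard)

# hypothetical population incidence per person-year, ages 40-80 by decade:
inc = [0.0001, 0.0005, 0.002, 0.004]
p80 = penetrance(16.4, inc)   # cumulative carrier risk to age 80
```

    With these assumed rates the carrier risk to age 80 comes out in the vicinity of the ~60% penetrance at age 80 reported above; the study's actual calculation additionally conditions on birth year.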

  13. Demonstration of the Effect of Generic Anatomical Divisions versus Clinical Protocols on Computed Tomography Dose Estimates and Risk Burden

    PubMed Central

    Moorin, Rachael E.; Gibson, David A. J.; Forsyth, Rene K.; Fox, Richard

    2014-01-01

    Objective Choosing to undertake a CT scan relies on balancing risk versus benefit, however risks associated with CT scanning have generally been limited to broad anatomical locations, which do not provided adequate information to evaluate risk against benefit. Our study aimed to determine differences in radiation dose and risk e