Science.gov

Sample records for accurate risk estimates

  1. Do We Know Whether Researchers and Reviewers are Estimating Risk and Benefit Accurately?

    PubMed

    Hey, Spencer Phillips; Kimmelman, Jonathan

    2016-10-01

    Accurate estimation of risk and benefit is integral to good clinical research planning, ethical review, and study implementation. Some commentators have argued that various actors in clinical research systems are prone to biased or arbitrary risk/benefit estimation. In this commentary, we suggest the evidence supporting such claims is very limited. Most prior work has imputed risk/benefit beliefs based on past behavior or goals, rather than directly measuring them. We describe an approach - forecast analysis - that would enable direct and effective measure of the quality of risk/benefit estimation. We then consider some objections and limitations to the forecasting approach.

  2. Aggregate versus individual-level sexual behavior assessment: how much detail is needed to accurately estimate HIV/STI risk?

    PubMed

    Pinkerton, Steven D; Galletly, Carol L; McAuliffe, Timothy L; DiFranceisco, Wayne; Raymond, H Fisher; Chesson, Harrell W

    2010-02-01

    The sexual behaviors of HIV/sexually transmitted infection (STI) prevention intervention participants can be assessed on a partner-by-partner basis; in aggregate (i.e., total numbers of sex acts, collapsed across partners); or using a combination of these two methods (e.g., assessing five partners in detail and any remaining partners in aggregate). There is a natural trade-off between the level of sexual behavior detail and the precision of HIV/STI acquisition risk estimates. The results of this study indicate that relatively simple aggregate data collection techniques suffice to adequately estimate HIV risk. For highly infectious STIs, in contrast, accurate STI risk assessment requires more intensive partner-by-partner methods.
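    The trade-off the authors describe can be sketched with the standard Bernoulli per-act transmission model (the usual framework in this literature; all act counts and infectivities below are invented for illustration):

```python
acts = [10, 5, 2]                        # sex acts with each of three partners

def risk_detailed(acts, p_per_partner):
    """Partner-by-partner: per-act infectivity may differ by partner."""
    no_infection = 1.0
    for n, p in zip(acts, p_per_partner):
        no_infection *= (1.0 - p) ** n
    return 1.0 - no_infection

def risk_aggregate(acts, p_per_partner):
    """Aggregate: collapse across partners, using the act-weighted
    mean infectivity and the total act count."""
    total = sum(acts)
    p_bar = sum(n * p for n, p in zip(acts, p_per_partner)) / total
    return 1.0 - (1.0 - p_bar) ** total

p_hiv = [0.0005, 0.002, 0.001]           # low per-act infectivity (HIV-like)
p_sti = [0.1, 0.4, 0.2]                  # high per-act infectivity (STI-like)

d_hiv, a_hiv = risk_detailed(acts, p_hiv), risk_aggregate(acts, p_hiv)
d_sti, a_sti = risk_detailed(acts, p_sti), risk_aggregate(acts, p_sti)

# When infectivity is low the two computations nearly coincide ...
assert abs(d_hiv - a_hiv) < 1e-4
# ... but for a highly infectious STI they visibly diverge.
assert abs(d_sti - a_sti) > 3e-3
```

The divergence at high infectivity is a Jensen's-inequality effect: averaging infectivity before compounding over acts is only harmless when per-act risks are small.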

  3. BIOACCESSIBILITY TESTS ACCURATELY ESTIMATE BIOAVAILABILITY OF LEAD TO QUAIL

    EPA Pesticide Factsheets

    Hazards of soil-borne Pb to wild birds may be more accurately quantified if the bioavailability of that Pb is known. To better understand the bioavailability of Pb to birds, we measured blood Pb concentrations in Japanese quail (Coturnix japonica) fed diets containing Pb-contaminated soils. Relative bioavailabilities were expressed by comparison with blood Pb concentrations in quail fed a Pb acetate reference diet. Diets containing soil from five Pb-contaminated Superfund sites had relative bioavailabilities from 33%-63%, with a mean of about 50%. Treatment of two of the soils with P significantly reduced the bioavailability of Pb. The bioaccessibility of the Pb in the test soils was then measured in six in vitro tests, and the results were regressed on bioavailability. The tests were: the “Relative Bioavailability Leaching Procedure” (RBALP) at pH 1.5, the same test conducted at pH 2.5, the “Ohio State University In vitro Gastrointestinal” method (OSU IVG), the “Urban Soil Bioaccessible Lead Test”, the modified “Physiologically Based Extraction Test”, and the “Waterfowl Physiologically Based Extraction Test”. All regressions had positive slopes. Based on criteria of slope and coefficient of determination, the RBALP pH 2.5 and OSU IVG tests performed very well. Speciation by X-ray absorption spectroscopy demonstrated that, on average, most of the Pb in the sampled soils was sorbed to minerals (30%), bound to organic matter (24%), or present as Pb sulfate (18%). …

  4. Accurate pose estimation for forensic identification

    NASA Astrophysics Data System (ADS)

    Merckx, Gert; Hermans, Jeroen; Vandermeulen, Dirk

    2010-04-01

    In forensic authentication, one aims to identify the perpetrator among a series of suspects or distractors. A fundamental problem in any recognition system that aims for identification of subjects in a natural scene is the lack of constraints on viewing and imaging conditions. In forensic applications, identification proves even more challenging, since most surveillance footage is of abysmal quality. In this context, robust methods for pose estimation are paramount. In this paper we therefore present a new pose estimation strategy for very low quality footage. Our approach uses 3D-2D registration of a textured 3D face model with the surveillance image to obtain accurate far-field pose alignment. Starting from an inaccurate initial estimate, the technique uses novel similarity measures based on the monogenic signal to guide a pose optimization process. We illustrate the descriptive strength of the introduced similarity measures by using them directly as a recognition metric. Through validation, using both real and synthetic surveillance footage, our pose estimation method is shown to be accurate, and robust to lighting changes and image degradation.

  5. Accurate Biomass Estimation via Bayesian Adaptive Sampling

    NASA Astrophysics Data System (ADS)

    Wheeler, K.; Knuth, K.; Castle, P.

    2005-12-01

    and IKONOS imagery and the 3-D volume estimates. The combination of these then allows for a rapid and hopefully very accurate estimation of biomass.

  6. Accurate Biomass Estimation via Bayesian Adaptive Sampling

    NASA Technical Reports Server (NTRS)

    Wheeler, Kevin R.; Knuth, Kevin H.; Castle, Joseph P.; Lvov, Nikolay

    2005-01-01

    The following concepts were introduced: a) Bayesian adaptive sampling for solving biomass estimation; b) characterization of MISR Rahman model parameters conditioned upon MODIS landcover; c) a rigorous non-parametric Bayesian approach to analytic mixture model determination; and d) a unique U.S. asset for science product validation and verification.

  7. Preparing Rapid, Accurate Construction Cost Estimates with a Personal Computer.

    ERIC Educational Resources Information Center

    Gerstel, Sanford M.

    1986-01-01

    An inexpensive and rapid method for preparing accurate cost estimates of construction projects in a university setting, using a personal computer, purchased software, and one estimator, is described. The case against defined estimates, the rapid estimating system, and adjusting standard unit costs are discussed. (MLW)

  8. Estimating Radiogenic Cancer Risks

    EPA Pesticide Factsheets

    This document presents a revised methodology for EPA's estimation of cancer risks due to low-LET radiation exposures developed in light of information that has become available, especially new information on the Japanese atomic bomb survivors.

  9. How utilities can achieve more accurate decommissioning cost estimates

    SciTech Connect

    Knight, R.

    1999-07-01

    The number of commercial nuclear power plants that are undergoing decommissioning coupled with the economic pressure of deregulation has increased the focus on adequate funding for decommissioning. The introduction of spent-fuel storage and disposal of low-level radioactive waste into the cost analysis places even greater concern as to the accuracy of the fund calculation basis. The size and adequacy of the decommissioning fund have also played a major part in the negotiations for transfer of plant ownership. For all of these reasons, it is important that the operating plant owner reduce the margin of error in the preparation of decommissioning cost estimates. To date, all of these estimates have been prepared via the building block method. That is, numerous individual calculations defining the planning, engineering, removal, and disposal of plant systems and structures are performed. These activity costs are supplemented by the period-dependent costs reflecting the administration, control, licensing, and permitting of the program. This method will continue to be used in the foreseeable future until adequate performance data are available. The accuracy of the activity cost calculation is directly related to the accuracy of the inventory of plant system components, piping and equipment, and plant structural composition. Typically, it is left up to the cost-estimating contractor to develop this plant inventory. The data are generated by searching and analyzing property asset records, plant databases, piping and instrumentation drawings, piping system isometric drawings, and component assembly drawings. However, experience has shown that these sources may not be up to date, discrepancies may exist, there may be missing data, and the level of detail may not be sufficient. Again, typically, the time constraints associated with the development of the cost estimate preclude perfect resolution of the inventory questions. Another problem area in achieving accurate cost

  10. Quantifying Accurate Calorie Estimation Using the "Think Aloud" Method

    ERIC Educational Resources Information Center

    Holmstrup, Michael E.; Stearns-Bruening, Kay; Rozelle, Jeffrey

    2013-01-01

    Objective: Clients often have limited time in a nutrition education setting. An improved understanding of the strategies used to accurately estimate calories may help to identify areas of focused instruction to improve nutrition knowledge. Methods: A "Think Aloud" exercise was recorded during the estimation of calories in a standard dinner meal…

  11. Risk estimation using probability machines

    PubMed Central

    2014-01-01

    Background Logistic regression has been the de facto, and often the only, model used in the description and analysis of relationships between a binary outcome and observed features. It is widely used to obtain the conditional probabilities of the outcome given predictors, as well as predictor effect size estimates using conditional odds ratios. Results We show how statistical learning machines for binary outcomes, provably consistent for the nonparametric regression problem, can be used to provide both consistent conditional probability estimation and conditional effect size estimates. Effect size estimates from learning machines leverage our understanding of counterfactual arguments central to the interpretation of such estimates. We show that, if the data generating model is logistic, we can recover accurate probability predictions and effect size estimates with nearly the same efficiency as a correct logistic model, both for main effects and interactions. We also propose a method using learning machines to scan for possible interaction effects quickly and efficiently. Simulations using random forest probability machines are presented. Conclusions The models we propose make no assumptions about the data structure, and capture the patterns in the data by just specifying the predictors involved and not any particular model structure. So they do not run the same risks of model mis-specification and the resultant estimation biases as a logistic model. This methodology, which we call a “risk machine”, will share properties from the statistical machine that it is derived from. PMID:24581306
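    The counterfactual effect-size recipe described here can be sketched directly: any black-box that returns conditional probabilities can be turned into an odds-ratio estimator by toggling the exposure in its inputs. The logistic stand-in below is purely illustrative (a fitted random forest probability machine would replace it in practice), chosen so the recipe can be checked against a known truth.

```python
import math

# Known logistic data-generating model standing in for a fitted
# "probability machine" (coefficients are invented).
B0, B1, B2 = -1.0, 0.7, 0.3

def prob_machine(a, z):
    """Returns P(Y=1 | A=a, Z=z)."""
    return 1.0 / (1.0 + math.exp(-(B0 + B1 * a + B2 * z)))

def conditional_odds_ratio(machine, z):
    """Effect size for exposure A via counterfactual predictions:
    run the same covariates through the machine with A set to 1
    and to 0, then compare the implied odds."""
    p1, p0 = machine(1, z), machine(0, z)
    return (p1 / (1 - p1)) / (p0 / (1 - p0))

# For a logistic model the recipe recovers exp(B1) exactly at any
# covariate value; a learned machine would do so only approximately.
or_hat = conditional_odds_ratio(prob_machine, z=0.5)
assert abs(or_hat - math.exp(B1)) < 1e-12
```

Because the recipe only calls the machine's prediction function, it applies unchanged to random forests or any other consistent probability estimator, which is the point the paper makes.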

  12. Injury Risk Estimation Expertise

    PubMed Central

    Petushek, Erich J.; Ward, Paul; Cokely, Edward T.; Myer, Gregory D.

    2015-01-01

    Background: Simple observational assessment of movement is a potentially low-cost method for anterior cruciate ligament (ACL) injury screening and prevention. Although many individuals utilize some form of observational assessment of movement, there are currently no substantial data on group skill differences in observational screening of ACL injury risk. Purpose/Hypothesis: The purpose of this study was to compare various groups’ abilities to visually assess ACL injury risk as well as the associated strategies and ACL knowledge levels. The hypothesis was that sports medicine professionals would perform better than coaches and exercise science academics/students and that these subgroups would all perform better than parents and other general population members. Study Design: Cross-sectional study; Level of evidence, 3. Methods: A total of 428 individuals, including physicians, physical therapists, athletic trainers, strength and conditioning coaches, exercise science researchers/students, athletes, parents, and members of the general public participated in the study. Participants completed the ACL Injury Risk Estimation Quiz (ACL-IQ) and answered questions related to assessment strategy and ACL knowledge. Results: Strength and conditioning coaches, athletic trainers, physical therapists, and exercise science students exhibited consistently superior ACL injury risk estimation ability (+2 SD) as compared with sport coaches, parents of athletes, and members of the general public. The performance of a substantial number of individuals in the exercise science/sports medicine fields (approximately 40%) was similar to or exceeded clinical instrument-based biomechanical assessment methods (eg, ACL nomogram). Parents, sport coaches, and the general public had lower ACL-IQ, likely due to their lower ACL knowledge and to rating the importance of knee/thigh motion lower and weight and jump height higher. Conclusion: Substantial cross-professional/group differences in visual ACL

  13. Accurate genome relative abundance estimation based on shotgun metagenomic reads.

    PubMed

    Xia, Li C; Cram, Jacob A; Chen, Ting; Fuhrman, Jed A; Sun, Fengzhu

    2011-01-01

    Accurate estimation of microbial community composition based on metagenomic sequencing data is fundamental for subsequent metagenomics analysis. Prevalent estimation methods are mainly based on directly summarizing alignment results or their variants, and often result in biased and/or unstable estimates. We have developed a unified probabilistic framework (named GRAMMy) by explicitly modeling read assignment ambiguities, genome size biases and read distributions along the genomes. A maximum likelihood method is employed to compute the Genome Relative Abundance of microbial communities using Mixture Model theory (GRAMMy). GRAMMy has been demonstrated to give estimates that are accurate and robust across both simulated and real read benchmark datasets. We applied GRAMMy to a collection of 34 metagenomic read sets from four metagenomics projects and identified 99 frequent species (minimally 0.5% abundant in at least 50% of the datasets) in the human gut samples. Our results show substantial improvements over previous studies, such as adjusting the over-estimated abundance for Bacteroides species for human gut samples, by providing a new reference-based strategy for metagenomic sample comparisons. GRAMMy can be used flexibly with many read assignment tools (mapping, alignment or composition-based) even with low-sensitivity mapping results from huge short-read datasets. It will be increasingly useful as an accurate and robust tool for abundance estimation with the growing size of read sets and the expanding database of reference genomes.
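    A minimal sketch of the mixture/EM idea that GRAMMy formalizes (this is a toy, not the actual GRAMMy implementation; genome lengths and read likelihoods are invented). Ambiguous reads are softly assigned in the E-step with weights proportional to abundance times genome length, and the M-step divides out the length bias so longer genomes are not over-counted:

```python
lengths = [4.0e6, 2.0e6]                 # genome sizes (bp), hypothetical
reads = [                                # p(read | genome j), hypothetical
    [1.0, 0.0],                          # maps uniquely to genome 0
    [0.0, 1.0],                          # maps uniquely to genome 1
    [0.5, 0.5],                          # ambiguous reads, split by the model
    [0.5, 0.5],
]

a = [0.5, 0.5]                           # initial relative abundances
for _ in range(200):
    # E-step: responsibility of each genome for each read; a cell of a
    # longer genome emits proportionally more reads, hence the a*length weight.
    counts = [0.0, 0.0]
    for lik in reads:
        w = [a[j] * lengths[j] * lik[j] for j in range(2)]
        s = sum(w)
        for j in range(2):
            counts[j] += w[j] / s
    # M-step: divide out genome length to recover cell-level abundances.
    raw = [counts[j] / lengths[j] for j in range(2)]
    a = [r / sum(raw) for r in raw]

# The shorter genome captured the same share of reads, so its relative
# abundance converges to twice that of the longer one.
assert abs(a[0] - 1/3) < 1e-6 and abs(a[1] - 2/3) < 1e-6
```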

  14. Accurate absolute GPS positioning through satellite clock error estimation

    NASA Astrophysics Data System (ADS)

    Han, S.-C.; Kwon, J. H.; Jekeli, C.

    2001-05-01

    An algorithm for very accurate absolute positioning through Global Positioning System (GPS) satellite clock estimation has been developed. Using International GPS Service (IGS) precise orbits and measurements, GPS clock errors were estimated at 30-s intervals. Compared to values determined by the Jet Propulsion Laboratory, the agreement was at the level of about 0.1 ns (3 cm). The clock error estimates were then applied to an absolute positioning algorithm in both static and kinematic modes. For the static case, an IGS station was selected and the coordinates were estimated every 30 s. The estimated absolute position coordinates and the known values had a mean difference of up to 18 cm with standard deviation less than 2 cm. For the kinematic case, data obtained every second from a GPS buoy were tested and the result from the absolute positioning was compared to a differential GPS (DGPS) solution. The mean differences between the coordinates estimated by the two methods are less than 40 cm and the standard deviations are less than 25 cm. It was verified that this poorer standard deviation on 1-s position results is due to the clock error interpolation from 30-s estimates with Selective Availability (SA). After SA was turned off, higher-rate clock error estimates (such as 1 s) could be obtained by a simple interpolation with negligible corruption. Therefore, the proposed absolute positioning technique can be used to within a few centimeters' precision at any rate by estimating 30-s satellite clock errors and interpolating them.
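    The interpolation step described at the end of the abstract can be sketched as follows (synthetic clock values; a real implementation would read IGS clock products). With SA off, the satellite clock drifts smoothly, so 30-s estimates interpolate cleanly to 1-s epochs:

```python
t30 = list(range(0, 121, 30))                 # 30 s estimation epochs (s)
clk = [5.0e-9 + 1.0e-11 * t for t in t30]     # synthetic clock error (s)

def interp(t):
    """Linear interpolation of the 30 s clock estimates to epoch t."""
    i = min(t // 30, len(t30) - 2)
    f = (t - t30[i]) / 30.0
    return clk[i] * (1.0 - f) + clk[i + 1] * f

# A smoothly drifting (here exactly linear) clock is reproduced at
# every 1 s epoch with negligible interpolation error.
max_err = max(abs(interp(t) - (5.0e-9 + 1.0e-11 * t)) for t in range(0, 121))
assert max_err < 1e-18
```

Under SA, the clock dithering is not smooth at the 30-s scale, which is why the abstract reports degraded 1-s kinematic results before SA was turned off.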

  15. An Accurate Link Correlation Estimator for Improving Wireless Protocol Performance

    PubMed Central

    Zhao, Zhiwei; Xu, Xianghua; Dong, Wei; Bu, Jiajun

    2015-01-01

    Wireless link correlation has shown significant impact on the performance of various sensor network protocols. Many works have been devoted to exploiting link correlation for protocol improvements. However, the effectiveness of these designs heavily relies on the accuracy of link correlation measurement. In this paper, we investigate state-of-the-art link correlation measurement and analyze the limitations of existing works. We then propose a novel lightweight and accurate link correlation estimation (LACE) approach based on the reasoning of link correlation formation. LACE combines both long-term and short-term link behaviors for link correlation estimation. We implement LACE as a stand-alone interface in TinyOS and incorporate it into both routing and flooding protocols. Simulation and testbed results show that LACE: (1) achieves more accurate and lightweight link correlation measurements than the state-of-the-art work; and (2) greatly improves the performance of protocols exploiting link correlation. PMID:25686314

  16. Fast and accurate estimation for astrophysical problems in large databases

    NASA Astrophysics Data System (ADS)

    Richards, Joseph W.

    2010-10-01

    A recent flood of astronomical data has created much demand for sophisticated statistical and machine learning tools that can rapidly draw accurate inferences from large databases of high-dimensional data. In this Ph.D. thesis, methods for statistical inference in such databases will be proposed, studied, and applied to real data. I use methods for low-dimensional parametrization of complex, high-dimensional data that are based on the notion of preserving the connectivity of data points in the context of a Markov random walk over the data set. I show how this simple parameterization of data can be exploited to: define appropriate prototypes for use in complex mixture models, determine data-driven eigenfunctions for accurate nonparametric regression, and find a set of suitable features to use in a statistical classifier. In this thesis, methods for each of these tasks are built up from simple principles, compared to existing methods in the literature, and applied to data from astronomical all-sky surveys. I examine several important problems in astrophysics, such as estimation of star formation history parameters for galaxies, prediction of redshifts of galaxies using photometric data, and classification of different types of supernovae based on their photometric light curves. Fast methods for high-dimensional data analysis are crucial in each of these problems because they all involve the analysis of complicated high-dimensional data in large, all-sky surveys. Specifically, I estimate the star formation history parameters for the nearly 800,000 galaxies in the Sloan Digital Sky Survey (SDSS) Data Release 7 spectroscopic catalog, determine redshifts for over 300,000 galaxies in the SDSS photometric catalog, and estimate the types of 20,000 supernovae as part of the Supernova Photometric Classification Challenge. Accurate predictions and classifications are imperative in each of these examples because these estimates are utilized in broader inference problems

  17. Accurate estimation of sigma(exp 0) using AIRSAR data

    NASA Technical Reports Server (NTRS)

    Holecz, Francesco; Rignot, Eric

    1995-01-01

    During recent years signature analysis, classification, and modeling of Synthetic Aperture Radar (SAR) data as well as estimation of geophysical parameters from SAR data have received a great deal of interest. An important requirement for the quantitative use of SAR data is the accurate estimation of the backscattering coefficient sigma(exp 0). In terrain with relief variations radar signals are distorted due to the projection of the scene topography into the slant range-Doppler plane. The effect of these variations is to change the physical size of the scattering area, leading to errors in the radar backscatter values and incidence angle. For this reason the local incidence angle, derived from sensor position and Digital Elevation Model (DEM) data, must always be considered. Especially in the airborne case, the antenna gain pattern can be an additional source of radiometric error, because the radar look angle is not known precisely as a result of the aircraft motions and the local surface topography. Consequently, radiometric distortions due to the antenna gain pattern must also be corrected for each resolution cell, by taking into account aircraft displacements (position and attitude) and position of the backscatter element, defined by the DEM data. In this paper, a method to derive an accurate estimation of the backscattering coefficient using NASA/JPL AIRSAR data is presented. The results are evaluated in terms of geometric accuracy, radiometric variations of sigma(exp 0), and precision of the estimated forest biomass.

  18. Accurate Satellite-Derived Estimates of Tropospheric Ozone Radiative Forcing

    NASA Technical Reports Server (NTRS)

    Joiner, Joanna; Schoeberl, Mark R.; Vasilkov, Alexander P.; Oreopoulos, Lazaros; Platnick, Steven; Livesey, Nathaniel J.; Levelt, Pieternel F.

    2008-01-01

    Estimates of the radiative forcing due to anthropogenically-produced tropospheric O3 are derived primarily from models. Here, we use tropospheric ozone and cloud data from several instruments in the A-train constellation of satellites as well as information from the GEOS-5 Data Assimilation System to accurately estimate the instantaneous radiative forcing from tropospheric O3 for January and July 2005. We improve upon previous estimates of tropospheric ozone mixing ratios from a residual approach using the NASA Earth Observing System (EOS) Aura Ozone Monitoring Instrument (OMI) and Microwave Limb Sounder (MLS) by incorporating cloud pressure information from OMI. Since we cannot distinguish between natural and anthropogenic sources with the satellite data, our estimates reflect the total forcing due to tropospheric O3. We focus specifically on the magnitude and spatial structure of the cloud effect on both the shortand long-wave radiative forcing. The estimates presented here can be used to validate present day O3 radiative forcing produced by models.

  19. Towards accurate and precise estimates of lion density.

    PubMed

    Elliot, Nicholas B; Gopalaswamy, Arjun M

    2016-12-13

    Reliable estimates of animal density are fundamental to our understanding of ecological processes and population dynamics. Furthermore, their accuracy is vital to conservation biology since wildlife authorities rely on these figures to make decisions. However, it is notoriously difficult to accurately estimate density for wide-ranging species such as carnivores that occur at low densities. In recent years, significant progress has been made in density estimation of Asian carnivores, but the methods have not been widely adapted to African carnivores. African lions (Panthera leo) provide an excellent example as although abundance indices have been shown to produce poor inferences, they continue to be used to estimate lion density and inform management and policy. In this study we adapt a Bayesian spatially explicit capture-recapture model to estimate lion density in the Maasai Mara National Reserve (MMNR) and surrounding conservancies in Kenya. We utilize sightings data from a three-month survey period to produce statistically rigorous spatial density estimates. Overall posterior mean lion density was estimated to be 16.85 (posterior standard deviation = 1.30) lions over one year of age per 100 km² with a sex ratio of 2.2♀:1♂. We argue that such methods should be developed, improved and favored over less reliable methods such as track and call-up surveys. We caution against trend analyses based on surveys of differing reliability and call for a unified framework to assess lion numbers across their range in order for better informed management and policy decisions to be made.

  20. Accurate estimators of correlation functions in Fourier space

    NASA Astrophysics Data System (ADS)

    Sefusatti, E.; Crocce, M.; Scoccimarro, R.; Couchman, H. M. P.

    2016-08-01

    Efficient estimators of Fourier-space statistics for large numbers of objects rely on fast Fourier transforms (FFTs), which are affected by aliasing from unresolved small-scale modes due to the finite FFT grid. Aliasing takes the form of a sum over images, each of them corresponding to the Fourier content displaced by increasing multiples of the sampling frequency of the grid. These spurious contributions limit the accuracy in the estimation of Fourier-space statistics, and are typically ameliorated by simultaneously increasing grid size and discarding high-frequency modes. This results in inefficient estimates for e.g. the power spectrum when systematic biases are required to be well below the per cent level. We show that using interlaced grids removes odd images, which include the dominant contribution to aliasing. In addition, we discuss the choice of interpolation kernel used to define density perturbations on the FFT grid and demonstrate that using higher order interpolation kernels than the standard Cloud-In-Cell algorithm results in significant reduction of the remaining images. We show that combining fourth-order interpolation with interlacing gives very accurate Fourier amplitudes and phases of density perturbations. This results in power spectrum and bispectrum estimates that have systematic biases below 0.01 per cent all the way to the Nyquist frequency of the grid, thus maximizing the use of unbiased Fourier coefficients for a given grid size and greatly reducing systematics for applications to large cosmological data sets.
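    The interlacing idea can be sketched in one dimension (a toy illustration, not the authors' code): assign particles to a grid and to a half-cell-shifted copy with Cloud-In-Cell weights, then combine the two Fourier transforms with an aligning phase so that the odd aliasing images, which enter the shifted grid with opposite sign, cancel.

```python
import cmath

def cic_assign(positions, n, shift=0.0):
    """Cloud-In-Cell: each particle's unit mass is split linearly
    between the two nearest grid points; `shift` moves the grid by a
    fraction of a cell (0.5 for the interlaced copy)."""
    rho = [0.0] * n
    for x in positions:                   # positions in [0, 1), periodic box
        u = (x * n - shift) % n
        i = int(u)
        f = u - i
        rho[i] += 1.0 - f
        rho[(i + 1) % n] += f
    return rho

def dft(rho):
    """Naive DFT, adequate for this toy grid size."""
    n = len(rho)
    return [sum(rho[x] * cmath.exp(-2j * cmath.pi * k * x / n)
                for x in range(n)) for k in range(n)]

pts = [0.03, 0.21, 0.52, 0.77, 0.91]      # toy particle positions
n = 16
rho = cic_assign(pts, n)
rho_s = cic_assign(pts, n, shift=0.5)     # interlaced (half-cell-shifted) grid

# The shifted grid's modes carry a phase exp(-i*pi*k/n) relative to the
# plain grid; after aligning, odd images enter with opposite sign and
# average away in the combination.
dk, dk_s = dft(rho), dft(rho_s)
combined = [(dk[k] + cmath.exp(-1j * cmath.pi * k / n) * dk_s[k]) / 2.0
            for k in range(n)]

assert abs(sum(rho) - len(pts)) < 1e-12       # CIC conserves mass
assert abs(combined[0].real - len(pts)) < 1e-9  # k=0 mode is total mass
```

In production codes the same combination is applied to FFT grids, together with a deconvolution of the interpolation window; higher-order kernels then suppress the surviving even images, as the abstract describes.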

  1. Foresight begins with FMEA. Delivering accurate risk assessments.

    PubMed

    Passey, R D

    1999-03-01

    If sufficient factors are taken into account and two- or three-stage analysis is employed, failure mode and effect analysis represents an excellent technique for delivering accurate risk assessments for products and processes, and for relating them to legal liability. This article describes a format that facilitates easy interpretation.

  2. Estimating Terrorism Risk

    DTIC Science & Technology

    2005-01-01

    preparedness by addressing unique planning, equipment, training, and exercise needs of large urban areas (DHS, 2004). Although there appears to be agreement ...reasonable minimum standards for community preparedness. Until these questions are answered, allocating homeland security resources based on risk is the...and threats are correlated with population density. There are practical benefits for using simple risk indicators such as those based upon population

  3. [Medical insurance estimation of risks].

    PubMed

    Dunér, H

    1975-11-01

    The purpose of insurance medicine is to make a prognostic estimate of medical risk-factors in persons who apply for life, health, or accident insurance. Established risk-groups with a calculated average mortality and morbidity form the basis for premium rates and insurance terms. In most cases the applicant is accepted for insurance after a self-assessment of his health. Only around one per cent of the applications are refused, but there are cases in which the premium is raised, temporarily or permanently. It is often a matter of rough estimate, since the knowledge of the long-term prognosis for many diseases is incomplete. The insurance companies' rules for estimate of risk are revised at intervals of three or four years. The estimate of risk as regards life insurance has been gradually liberalised, while the medical conditions for health insurance have become stricter owing to an increase in the claims rate.

  4. Estimation of bone permeability using accurate microstructural measurements.

    PubMed

    Beno, Thoma; Yoon, Young-June; Cowin, Stephen C; Fritton, Susannah P

    2006-01-01

    While interstitial fluid flow is necessary for the viability of osteocytes, it is also believed to play a role in bone's mechanosensory system by shearing bone cell membranes or causing cytoskeleton deformation and thus activating biochemical responses that lead to the process of bone adaptation. However, the fluid flow properties that regulate bone's adaptive response are poorly understood. In this paper, we present an analytical approach to determine the degree of anisotropy of the permeability of the lacunar-canalicular porosity in bone. First, we estimate the total number of canaliculi emanating from each osteocyte lacuna based on published measurements from parallel-fibered shaft bones of several species (chick, rabbit, bovine, horse, dog, and human). Next, we determine the local three-dimensional permeability of the lacunar-canalicular porosity for these species using recent microstructural measurements and adapting a previously developed model. Results demonstrated that the number of canaliculi per osteocyte lacuna ranged from 41 for human to 115 for horse. Permeability coefficients were found to be different in three local principal directions, indicating local orthotropic symmetry of bone permeability in parallel-fibered cortical bone for all species examined. For the range of parameters investigated, the local lacunar-canalicular permeability varied more than three orders of magnitude, with the osteocyte lacunar shape and size along with the 3-D canalicular distribution determining the degree of anisotropy of the local permeability. This two-step theoretical approach to determine the degree of anisotropy of the permeability of the lacunar-canalicular porosity will be useful for accurate quantification of interstitial fluid movement in bone.

  5. Fast and Accurate Learning When Making Discrete Numerical Estimates

    PubMed Central

    Sanborn, Adam N.; Beierholm, Ulrik R.

    2016-01-01

    Many everyday estimation tasks have an inherently discrete nature, whether the task is counting objects (e.g., a number of paint buckets) or estimating discretized continuous variables (e.g., the number of paint buckets needed to paint a room). While Bayesian inference is often used for modeling estimates made along continuous scales, discrete numerical estimates have not received as much attention, despite their common everyday occurrence. Using two tasks, a numerosity task and an area estimation task, we invoke Bayesian decision theory to characterize how people learn discrete numerical distributions and make numerical estimates. Across three experiments with novel stimulus distributions we found that participants fell between two common decision functions for converting their uncertain representation into a response: drawing a sample from their posterior distribution and taking the maximum of their posterior distribution. While this was consistent with the decision function found in previous work using continuous estimation tasks, surprisingly the prior distributions learned by participants in our experiments were much more adaptive: When making continuous estimates, participants have required thousands of trials to learn bimodal priors, but in our tasks participants learned discrete bimodal and even discrete quadrimodal priors within a few hundred trials. This makes discrete numerical estimation tasks good testbeds for investigating how people learn and make estimates. PMID:27070155
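    The two decision functions the study contrasts can be sketched directly (the discrete bimodal posterior below is invented for illustration):

```python
import random

posterior = {2: 0.35, 3: 0.05, 7: 0.45, 8: 0.15}   # P(count | data), hypothetical

def respond_max(post):
    """Decision rule 1: report the posterior mode."""
    return max(post, key=post.get)

def respond_sample(post, rng):
    """Decision rule 2: report a draw from the posterior."""
    r, acc = rng.random(), 0.0
    for value, p in sorted(post.items()):
        acc += p
        if r < acc:
            return value
    return value                                     # guard for r ~ 1.0

rng = random.Random(0)
assert respond_max(posterior) == 7                   # mode rule is deterministic

samples = [respond_sample(posterior, rng) for _ in range(10000)]
frac7 = samples.count(7) / 10000
# Posterior sampling reproduces the response variability seen in data:
# the modal value appears at roughly its posterior probability.
assert abs(frac7 - 0.45) < 0.02
```

Fitting which rule (or a softened mixture of the two) best matches participants' response distributions is how such studies identify the decision function, separately from the learned prior.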

  6. Estimating risks of perinatal death.

    PubMed

    Smith, Gordon C S

    2005-01-01

    The relative and absolute risks of perinatal death that are estimated from observational studies are used frequently in counseling about obstetric intervention. The statistical basis for these estimates therefore is crucial, but many studies are seriously flawed. In this review, a number of aspects of the approach to the estimation of the risk of perinatal death are addressed. Key factors in the analysis include (1) the definition of the cause of the death, (2) differentiation between antepartum and intrapartum events, (3) the use of the appropriate denominator for the given cause of death, (4) the assessment of the cumulative risk where appropriate, (5) the use of appropriate statistical tests, (6) the stratification of analysis of delivery-related deaths by gestational age, and (7) the specific features of multiple pregnancy, which include the correct determination of the timing of antepartum stillbirth and the use of paired statistical tests when outcomes are compared in relation to the birth order of twin pairs.

  7. BIOACCESSIBILITY TESTS ACCURATELY ESTIMATE BIOAVAILABILITY OF LEAD TO QUAIL

    EPA Science Inventory

    Hazards of soil-borne Pb to wild birds may be more accurately quantified if the bioavailability of that Pb is known. To better understand the bioavailability of Pb to birds, we measured blood Pb concentrations in Japanese quail (Coturnix japonica) fed diets containing Pb-contami...

  8. Bioaccessibility tests accurately estimate bioavailability of lead to quail

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Hazards of soil-borne Pb to wild birds may be more accurately quantified if the bioavailability of that Pb is known. To better understand the bioavailability of Pb, we incorporated Pb-contaminated soils or Pb acetate into diets for Japanese quail (Coturnix japonica), fed the quail for 15 days, and ...

  9. How accurate are physical property estimation programs for organosilicon compounds?

    PubMed

    Boethling, Robert; Meylan, William

    2013-11-01

    Organosilicon compounds are important in chemistry and commerce, and nearly 10% of new chemical substances for which premanufacture notifications are processed by the US Environmental Protection Agency (USEPA) contain silicon (Si). Yet, remarkably few measured values are submitted for key physical properties, and the accuracy of estimation programs such as the Estimation Programs Interface (EPI) Suite and the SPARC Performs Automated Reasoning in Chemistry (SPARC) system is largely unknown. To address this issue, the authors developed an extensive database of measured property values for organic compounds containing Si and evaluated the performance of no-cost estimation programs for several properties of importance in environmental assessment. These included melting point (mp), boiling point (bp), vapor pressure (vp), water solubility, n-octanol/water partition coefficient (log KOW ), and Henry's law constant. For bp and the larger of 2 vp datasets, SPARC, MPBPWIN, and the USEPA's Toxicity Estimation Software Tool (TEST) had similar accuracy. For log KOW and water solubility, the authors tested 11 and 6 no-cost estimators, respectively. The best performers were Molinspiration and WSKOWWIN, respectively. The TEST's consensus mp method outperformed that of MPBPWIN by a considerable margin. Generally, the best programs estimated the listed properties of diverse organosilicon compounds with accuracy sufficient for chemical screening. The results also highlight areas where improvement is most needed.

  10. Accurate feature detection and estimation using nonlinear and multiresolution analysis

    NASA Astrophysics Data System (ADS)

    Rudin, Leonid; Osher, Stanley

    1994-11-01

    A program for feature detection and estimation using nonlinear and multiscale analysis was completed. The state-of-the-art edge detection was combined with multiscale restoration (as suggested by the first author) and robust results in the presence of noise were obtained. Successful applications to numerous images of interest to DOD were made. Also, a new market in the criminal justice field was developed, based in part, on this work.

  11. Accurate tempo estimation based on harmonic + noise decomposition

    NASA Astrophysics Data System (ADS)

    Alonso, Miguel; Richard, Gael; David, Bertrand

    2006-12-01

    We present an innovative tempo estimation system that processes acoustic audio signals and does not use any high-level musical knowledge. Our proposal relies on a harmonic + noise decomposition of the audio signal by means of a subspace analysis method. Then, a technique to measure the degree of musical accentuation as a function of time is developed and separately applied to the harmonic and noise parts of the input signal. This is followed by a periodicity estimation block that calculates the salience of musical accents for a large number of potential periods. Next, a multipath dynamic programming stage searches among all the potential periodicities for the most consistent prospects through time, and finally the most energetic candidate is selected as the tempo. Our proposal is validated using a manually annotated test database containing 961 music signals from various musical genres. In addition, the performance of the algorithm under different configurations is compared. The robustness of the algorithm when processing signals of degraded quality is also measured.
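The periodicity-estimation idea can be illustrated with a toy autocorrelation search over an onset-strength envelope. This is a minimal stand-in for the salience/periodicity block described above, not the authors' harmonic + noise system; the envelope and frame rate are made-up values:

```python
import numpy as np

def tempo_from_envelope(env, frame_rate, bpm_min=60.0, bpm_max=180.0):
    """Pick the tempo as the strongest autocorrelation lag of an
    onset-strength envelope, searched over a plausible BPM range."""
    env = env - env.mean()
    ac = np.correlate(env, env, mode="full")[len(env) - 1:]
    lag_min = int(frame_rate * 60.0 / bpm_max)
    lag_max = int(frame_rate * 60.0 / bpm_min)
    lag = lag_min + int(np.argmax(ac[lag_min:lag_max + 1]))
    return 60.0 * frame_rate / lag

# Synthetic envelope: an accent every 0.5 s at a 100 Hz frame rate (120 BPM)
frame_rate = 100
env = np.zeros(1000)
env[::50] = 1.0
print(tempo_from_envelope(env, frame_rate))  # 120.0
```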

  12. Fast and Accurate Estimates of Divergence Times from Big Data.

    PubMed

    Mello, Beatriz; Tao, Qiqing; Tamura, Koichiro; Kumar, Sudhir

    2017-01-01

    Ongoing advances in sequencing technology have led to an explosive expansion in the molecular data available for building increasingly larger and more comprehensive timetrees. However, Bayesian relaxed-clock approaches frequently used to infer these timetrees impose a large computational burden and discourage critical assessment of the robustness of inferred times to model assumptions, influence of calibrations, and selection of optimal data subsets. We analyzed eight large, recently published, empirical datasets to compare time estimates produced by RelTime (a non-Bayesian method) with those reported using Bayesian approaches. We find that RelTime estimates are very similar to those from Bayesian approaches, yet RelTime requires orders of magnitude less computational time. This means that the use of RelTime will enable greater rigor in molecular dating, because faster computational speeds encourage more extensive testing of the robustness of inferred timetrees to prior assumptions (models and calibrations) and data subsets. Thus, RelTime provides a reliable and computationally thrifty approach for dating the tree of life using large-scale molecular datasets.

  13. Bioaccessibility tests accurately estimate bioavailability of lead to quail

    USGS Publications Warehouse

    Beyer, W. Nelson; Basta, Nicholas T; Chaney, Rufus L.; Henry, Paula F.; Mosby, David; Rattner, Barnett A.; Scheckel, Kirk G.; Sprague, Dan; Weber, John

    2016-01-01

    Hazards of soil-borne Pb to wild birds may be more accurately quantified if the bioavailability of that Pb is known. To better understand the bioavailability of Pb to birds, we measured blood Pb concentrations in Japanese quail (Coturnix japonica) fed diets containing Pb-contaminated soils. Relative bioavailabilities were expressed by comparison with blood Pb concentrations in quail fed a Pb acetate reference diet. Diets containing soil from five Pb-contaminated Superfund sites had relative bioavailabilities from 33% to 63%, with a mean of about 50%. Treatment of two of the soils with phosphorus significantly reduced the bioavailability of Pb. Bioaccessibility of Pb in the test soils was then measured in six in vitro tests and regressed on bioavailability. They were: the “Relative Bioavailability Leaching Procedure” (RBALP) at pH 1.5, the same test conducted at pH 2.5, the “Ohio State University In vitro Gastrointestinal” method (OSU IVG), the “Urban Soil Bioaccessible Lead Test”, the modified “Physiologically Based Extraction Test” and the “Waterfowl Physiologically Based Extraction Test.” All regressions had positive slopes. Based on criteria of slope and coefficient of determination, the RBALP pH 2.5 and OSU IVG tests performed very well. Speciation by X-ray absorption spectroscopy demonstrated that, on average, most of the Pb in the sampled soils was sorbed to minerals (30%), bound to organic matter (24%), or present as Pb sulfate (18%). Additional Pb was associated with P (chloropyromorphite, hydroxypyromorphite and tertiary Pb phosphate), and with Pb carbonates, leadhillite (a lead sulfate carbonate hydroxide), and Pb sulfide. The formation of chloropyromorphite reduced the bioavailability of Pb and the amendment of Pb-contaminated soils with P may be a thermodynamically favored means to sequester Pb.

  14. Bioaccessibility tests accurately estimate bioavailability of lead to quail.

    PubMed

    Beyer, W Nelson; Basta, Nicholas T; Chaney, Rufus L; Henry, Paula F P; Mosby, David E; Rattner, Barnett A; Scheckel, Kirk G; Sprague, Daniel T; Weber, John S

    2016-09-01

    Hazards of soil-borne lead (Pb) to wild birds may be more accurately quantified if the bioavailability of that Pb is known. To better understand the bioavailability of Pb to birds, the authors measured blood Pb concentrations in Japanese quail (Coturnix japonica) fed diets containing Pb-contaminated soils. Relative bioavailabilities were expressed by comparison with blood Pb concentrations in quail fed a Pb acetate reference diet. Diets containing soil from 5 Pb-contaminated Superfund sites had relative bioavailabilities from 33% to 63%, with a mean of approximately 50%. Treatment of 2 of the soils with phosphorus (P) significantly reduced the bioavailability of Pb. Bioaccessibility of Pb in the test soils was then measured in 6 in vitro tests and regressed on bioavailability: the relative bioavailability leaching procedure at pH 1.5, the same test conducted at pH 2.5, the Ohio State University in vitro gastrointestinal method, the urban soil bioaccessible lead test, the modified physiologically based extraction test, and the waterfowl physiologically based extraction test. All regressions had positive slopes. Based on criteria of slope and coefficient of determination, the relative bioavailability leaching procedure at pH 2.5 and Ohio State University in vitro gastrointestinal tests performed very well. Speciation by X-ray absorption spectroscopy demonstrated that, on average, most of the Pb in the sampled soils was sorbed to minerals (30%), bound to organic matter (24%), or present as Pb sulfate (18%). Additional Pb was associated with P (chloropyromorphite, hydroxypyromorphite, and tertiary Pb phosphate) and with Pb carbonates, leadhillite (a lead sulfate carbonate hydroxide), and Pb sulfide. The formation of chloropyromorphite reduced the bioavailability of Pb, and the amendment of Pb-contaminated soils with P may be a thermodynamically favored means to sequester Pb. Environ Toxicol Chem 2016;35:2311-2319. Published 2016 Wiley Periodicals Inc. on behalf of

  15. New ventures require accurate risk analyses and adjustments.

    PubMed

    Eastaugh, S R

    2000-01-01

    For new business ventures to succeed, healthcare executives need to conduct robust risk analyses and develop new approaches to balance risk and return. Risk analysis involves examination of objective risks and harder-to-quantify subjective risks. Mathematical principles applied to investment portfolios also can be applied to a portfolio of departments or strategic business units within an organization. The ideal business investment would have a high expected return and a low standard deviation. Nonetheless, both conservative and speculative strategies should be considered in determining an organization's optimal service line and helping the organization manage risk.

  16. Thinking Concretely Increases the Perceived Likelihood of Risks: The Effect of Construal Level on Risk Estimation.

    PubMed

    Lermer, Eva; Streicher, Bernhard; Sachs, Rainer; Raue, Martina; Frey, Dieter

    2016-03-01

    Recent findings on construal level theory (CLT) suggest that abstract thinking leads to a lower estimated probability of an event occurring compared to concrete thinking. We applied this idea to the risk context and explored the influence of construal level (CL) on the overestimation of small and underestimation of large probabilities for risk estimates concerning a vague target person (Study 1 and Study 3) and personal risk estimates (Study 2). We were specifically interested in whether the often-found overestimation of small probabilities could be reduced with abstract thinking and whether the often-found underestimation of large probabilities could be reduced with concrete thinking. The results showed that CL influenced risk estimates. In particular, a concrete mindset led to higher risk estimates compared to an abstract mindset for several adverse events, including events with small and large probabilities. This suggests that CL manipulation can indeed be used for improving the accuracy of lay people's estimates of small and large probabilities. Moreover, the results suggest that professional risk managers' risk estimates of common events (thus with a relatively high probability) could be improved by adopting a concrete mindset. However, the abstract manipulation did not lead managers to estimate extremely unlikely events more accurately. Potential reasons for different CL manipulation effects on risk estimates' accuracy between lay people and risk managers are discussed.

  17. Accurate Non-parametric Estimation of Recent Effective Population Size from Segments of Identity by Descent.

    PubMed

    Browning, Sharon R; Browning, Brian L

    2015-09-03

    Existing methods for estimating historical effective population size from genetic data have been unable to accurately estimate effective population size during the most recent past. We present a non-parametric method for accurately estimating recent effective population size by using inferred long segments of identity by descent (IBD). We found that inferred segments of IBD contain information about effective population size from around 4 generations to around 50 generations ago for SNP array data and to over 200 generations ago for sequence data. In human populations that we examined, the estimates of effective size were approximately one-third of the census size. We estimate the effective population size of European-ancestry individuals in the UK four generations ago to be eight million and the effective population size of Finland four generations ago to be 0.7 million. Our method is implemented in the open-source IBDNe software package.

  18. Accurate Non-parametric Estimation of Recent Effective Population Size from Segments of Identity by Descent

    PubMed Central

    Browning, Sharon R.; Browning, Brian L.

    2015-01-01

    Existing methods for estimating historical effective population size from genetic data have been unable to accurately estimate effective population size during the most recent past. We present a non-parametric method for accurately estimating recent effective population size by using inferred long segments of identity by descent (IBD). We found that inferred segments of IBD contain information about effective population size from around 4 generations to around 50 generations ago for SNP array data and to over 200 generations ago for sequence data. In human populations that we examined, the estimates of effective size were approximately one-third of the census size. We estimate the effective population size of European-ancestry individuals in the UK four generations ago to be eight million and the effective population size of Finland four generations ago to be 0.7 million. Our method is implemented in the open-source IBDNe software package. PMID:26299365

  19. LSimpute: accurate estimation of missing values in microarray data with least squares methods.

    PubMed

    Bø, Trond Hellem; Dysvik, Bjarte; Jonassen, Inge

    2004-02-20

    Microarray experiments generate data sets with information on the expression levels of thousands of genes in a set of biological samples. Unfortunately, such experiments often produce multiple missing expression values, normally due to various experimental problems. As many algorithms for gene expression analysis require a complete data matrix as input, the missing values have to be estimated in order to analyze the available data. Alternatively, genes and arrays can be removed until no missing values remain. However, for genes or arrays with only a small number of missing values, it is desirable to impute those values. For the subsequent analysis to be as informative as possible, it is essential that the estimates for the missing gene expression values are accurate. A small amount of badly estimated missing values in the data might be enough for clustering methods, such as hierarchical clustering or K-means clustering, to produce misleading results. Thus, accurate methods for missing value estimation are needed. We present novel methods for estimation of missing values in microarray data sets that are based on the least squares principle, and that utilize correlations between both genes and arrays. For this set of methods, we use the common reference name LSimpute. We compare the estimation accuracy of our methods with the widely used KNNimpute on three complete data matrices from public data sets by randomly knocking out data (labeling as missing). From these tests, we conclude that our LSimpute methods produce estimates that consistently are more accurate than those obtained using KNNimpute. Additionally, we examine a more classic approach to missing value estimation based on expectation maximization (EM). We refer to our EM implementations as EMimpute, and the estimate errors using the EMimpute methods are compared with those our novel methods produce. The results indicate that on average, the estimates from our best performing LSimpute method are at least as
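The least-squares principle behind this family of methods can be sketched in miniature: impute a missing entry by regressing the target gene on its most correlated complete gene over the observed arrays. This is a minimal illustration, not the actual LSimpute algorithm (which combines multiple regressions over genes and arrays); the matrix values are made up:

```python
import numpy as np

def impute_ls(X, gene, array):
    """Impute the missing entry X[gene, array] (np.nan) by least-squares
    regression on the complete gene most correlated with `gene`."""
    obs = [j for j in range(X.shape[1]) if not np.isnan(X[gene, j])]
    y = X[gene, obs]
    best, best_r = None, 0.0
    for g in range(X.shape[0]):
        if g == gene or np.isnan(X[g]).any():
            continue
        r = np.corrcoef(y, X[g, obs])[0, 1]
        if abs(r) > abs(best_r):
            best, best_r = g, r
    # Fit y ~ a*x + b on the observed arrays, then predict the missing entry
    a, b = np.polyfit(X[best, obs], y, 1)
    return a * X[best, array] + b

X = np.array([[1.0, 2.0, 3.0, np.nan],
              [1.1, 2.0, 2.9, 4.0],
              [5.0, 1.0, 2.0, 2.0]])
print(impute_ls(X, gene=0, array=3))  # ~4.22
```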

  20. A fast and accurate frequency estimation algorithm for sinusoidal signal with harmonic components

    NASA Astrophysics Data System (ADS)

    Hu, Jinghua; Pan, Mengchun; Zeng, Zhidun; Hu, Jiafei; Chen, Dixiang; Tian, Wugang; Zhao, Jianqiang; Du, Qingfa

    2016-10-01

    Frequency estimation is a fundamental problem in many applications, such as traditional vibration measurement, power system supervision, and microelectromechanical system sensors control. In this paper, a fast and accurate frequency estimation algorithm is proposed to deal with the low-efficiency problem in traditional methods. The proposed algorithm consists of coarse and fine frequency estimation steps, and we demonstrate that it is more efficient than conventional searching methods to achieve coarse frequency estimation (locating the peak of the FFT amplitude spectrum) by applying a modified zero-crossing technique. Thus, the proposed estimation algorithm requires fewer hardware and software resources and can achieve even higher efficiency when the experimental data increase. Experimental results with a modulated magnetic signal show that the root mean square error of frequency estimation is below 0.032 Hz with the proposed algorithm, which has lower computational complexity and better global performance than conventional frequency estimation methods.
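The coarse-plus-fine idea can be sketched as follows: a coarse estimate from the FFT peak, refined by interpolated zero-crossing timing. This is a sketch for a clean single sinusoid, not the authors' algorithm; the sampling rate and test frequency are hypothetical:

```python
import numpy as np

def estimate_freq(x, fs):
    """Two-step frequency estimate: coarse FFT-peak search, then a
    zero-crossing refinement (consecutive crossings of a sinusoid are
    half a period apart, so f = (n - 1) / (2 * (t_last - t_first)))."""
    # Coarse step: locate the peak of the FFT amplitude spectrum
    spec = np.abs(np.fft.rfft(x))
    spec[0] = 0.0                              # ignore the DC bin
    coarse = np.argmax(spec) * fs / len(x)
    # Fine step: linearly interpolate the zero-crossing times
    s = np.sign(x)
    idx = np.where(s[:-1] * s[1:] < 0)[0]      # samples straddling a crossing
    t = (idx + x[idx] / (x[idx] - x[idx + 1])) / fs
    fine = (len(t) - 1) / (2.0 * (t[-1] - t[0]))
    return coarse, fine

fs, f0 = 1000.0, 73.2   # Hz (hypothetical test signal)
time = np.arange(4096) / fs
coarse, fine = estimate_freq(np.sin(2 * np.pi * f0 * time), fs)
print(coarse, fine)
```

The coarse estimate is limited to the FFT bin spacing fs/N, while the zero-crossing refinement recovers the frequency to well below that resolution for a clean tone.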

  1. Estimating the re-identification risk of clinical data sets

    PubMed Central

    2012-01-01

    Background De-identification is a common way to protect patient privacy when disclosing clinical data for secondary purposes, such as research. One type of attack that de-identification protects against is linking the disclosed patient data with public and semi-public registries. Uniqueness is a commonly used measure of re-identification risk under this attack. If uniqueness can be measured accurately then the risk from this kind of attack can be managed. In practice, it is often not possible to measure uniqueness directly, therefore it must be estimated. Methods We evaluated the accuracy of uniqueness estimators on clinically relevant data sets. Four candidate estimators were identified because they were evaluated in the past and found to have good accuracy or because they were new and not evaluated comparatively before: the Zayatz estimator, slide negative binomial estimator, Pitman’s estimator, and mu-argus. A Monte Carlo simulation was performed to evaluate the uniqueness estimators on six clinically relevant data sets. We varied the sampling fraction and the uniqueness in the population (the value being estimated). The median relative error and inter-quartile range of the uniqueness estimates was measured across 1000 runs. Results There was no single estimator that performed well across all of the conditions. We developed a decision rule which selected between the Pitman, slide negative binomial and Zayatz estimators depending on the sampling fraction and the difference between estimates. This decision rule had the best consistent median relative error across multiple conditions and data sets. Conclusion This study identified an accurate decision rule that can be used by health privacy researchers and disclosure control professionals to estimate uniqueness in clinical data sets. The decision rule provides a reliable way to measure re-identification risk. PMID:22776564
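Sample uniqueness, the quantity these estimators extrapolate from, can be computed directly on a disclosed data set. The sketch below counts records that are unique on a set of quasi-identifiers; the records and field names are invented, and estimating population uniqueness from this sample value is exactly where the Pitman, Zayatz, and negative binomial estimators come in:

```python
from collections import Counter

def sample_uniqueness(records, quasi_identifiers):
    """Fraction of records that are unique on the given quasi-identifiers."""
    keys = [tuple(r[q] for q in quasi_identifiers) for r in records]
    counts = Counter(keys)
    unique = sum(1 for k in keys if counts[k] == 1)
    return unique / len(records)

records = [
    {"age": 34, "zip": "53201", "sex": "F"},
    {"age": 34, "zip": "53201", "sex": "F"},
    {"age": 51, "zip": "53202", "sex": "M"},
    {"age": 29, "zip": "53211", "sex": "F"},
]
print(sample_uniqueness(records, ["age", "zip", "sex"]))  # 0.5
```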

  2. Sample Size Requirements for Accurate Estimation of Squared Semi-Partial Correlation Coefficients.

    ERIC Educational Resources Information Center

    Algina, James; Moulder, Bradley C.; Moser, Barry K.

    2002-01-01

    Studied the sample size requirements for accurate estimation of squared semi-partial correlation coefficients through simulation studies. Results show that the sample size necessary for adequate accuracy depends on: (1) the population squared multiple correlation coefficient (p squared); (2) the population increase in p squared; and (3) the…

  3. Towards an accurate estimation of the isosteric heat of adsorption - A correlation with the potential theory.

    PubMed

    Askalany, Ahmed A; Saha, Bidyut B

    2017-03-15

    Accurate estimation of the isosteric heat of adsorption is mandatory for good modeling of adsorption processes. In this paper, a thermodynamic formalism for the adsorbed phase volume, which is a function of adsorption pressure and temperature, has been proposed for the precise estimation of the isosteric heat of adsorption. The estimated isosteric heat of adsorption using the new correlation has been compared with measured values of several prudently selected adsorbent-refrigerant pairs from the open literature. Results showed that the proposed isosteric heat of adsorption correlation fits the experimentally measured values better than the Clausius-Clapeyron equation.
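For context, the baseline that the proposed correlation refines is the standard Clausius-Clapeyron form of the isosteric heat, which follows from the Clapeyron relation along an isostere once the gas is treated as ideal and the adsorbed-phase volume is neglected (a textbook relation, stated here for reference):

```latex
% Clapeyron form along an isostere (constant uptake x):
%   q_{st} = T \, (v_g - v_a) \left( \frac{\partial P}{\partial T} \right)_x
% Assuming an ideal gas (v_g = RT/P) and v_a \ll v_g gives the
% Clausius-Clapeyron approximation:
q_{st} \;=\; -R \left( \frac{\partial \ln P}{\partial (1/T)} \right)_{x}
       \;=\; R\,T^{2} \left( \frac{\partial \ln P}{\partial T} \right)_{x}
```

The abstract's contribution is to retain a pressure- and temperature-dependent adsorbed-phase volume rather than dropping the v_a term.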

  4. On the accurate estimation of gap fraction during daytime with digital cover photography

    NASA Astrophysics Data System (ADS)

    Hwang, Y. R.; Ryu, Y.; Kimm, H.; Macfarlane, C.; Lang, M.; Sonnentag, O.

    2015-12-01

    Digital cover photography (DCP) has emerged as an indirect method to obtain gap fraction accurately. Thus far, however, the intervention of subjectivity, such as determining the camera relative exposure value (REV) and the threshold in the histogram, has hindered computing accurate gap fraction. Here we propose a novel method that enables us to measure gap fraction accurately during daytime under various sky conditions by DCP. The novel method computes gap fraction using a single unsaturated DCP raw image, which is corrected for scattering effects by canopies, and a sky image reconstructed from the raw-format image. To test the sensitivity of the gap fraction derived with the novel method to diverse REVs, solar zenith angles, and canopy structures, we took photos at one-hour intervals between sunrise and midday under dense and sparse canopies with REV 0 to -5. The novel method showed little variation in gap fraction across different REVs in both dense and sparse canopies over a diverse range of solar zenith angles. The perforated panel experiment, which was used to test the accuracy of the estimated gap fraction, confirmed that the novel method produced accurate and consistent gap fractions across different hole sizes, gap fractions, and solar zenith angles. These findings highlight that the novel method opens new opportunities to estimate gap fraction accurately during daytime from sparse to dense canopies, which will be useful in monitoring LAI precisely and validating satellite remote sensing LAI products efficiently.
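At its core, gap fraction is the share of sky pixels in a canopy image. The sketch below shows only that core with a fixed threshold on a synthetic array; the abstract's method instead works on unsaturated raw images with scattering correction precisely to avoid this kind of manual thresholding:

```python
import numpy as np

def gap_fraction(image, threshold):
    """Gap fraction = share of sky (bright) pixels in a canopy photo.
    A minimal fixed-threshold sketch on a grayscale array."""
    sky = image >= threshold
    return sky.mean()

# Synthetic 100x100 image: 70% dark canopy (value 20), 30% bright sky (240)
img = np.full((100, 100), 20)
img[:30, :] = 240
print(gap_fraction(img, threshold=128))  # 0.3
```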

  5. Accurate Estimation of the Entropy of Rotation-Translation Probability Distributions.

    PubMed

    Fogolari, Federico; Dongmo Foumthuim, Cedrix Jurgal; Fortuna, Sara; Soler, Miguel Angel; Corazza, Alessandra; Esposito, Gennaro

    2016-01-12

    The estimation of rotational and translational entropies in the context of ligand binding has been the subject of long-time investigations. The high dimensionality (six) of the problem and the limited amount of sampling often prevent the required resolution to provide accurate estimates by the histogram method. Recently, the nearest-neighbor distance method has been applied to the problem, but the solutions provided either address rotation and translation separately, therefore lacking correlations, or use a heuristic approach. Here we address rotational-translational entropy estimation in the context of nearest-neighbor-based entropy estimation, solve the problem numerically, and provide an exact and an approximate method to estimate the full rotational-translational entropy.
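The nearest-neighbor approach the abstract builds on can be illustrated with the classical Kozachenko-Leonenko estimator (k = 1, differential entropy in nats). This is the generic estimator, not the authors' rotational-translational solution, and the uniform test distribution is an illustrative choice:

```python
import math
import numpy as np

def kl_entropy_1nn(samples):
    """Kozachenko-Leonenko nearest-neighbor entropy estimator (k=1, nats):
    H_hat = psi(N) - psi(1) + ln V_d + (d/N) * sum_i ln r_i,
    with psi(1) = -gamma and psi(N) approximated by ln N - 1/(2N)."""
    x = np.atleast_2d(np.asarray(samples, dtype=float))
    if x.shape[0] < x.shape[1]:
        x = x.T                                # rows = samples, cols = dims
    n, d = x.shape
    # Distance from each point to its nearest neighbor
    diff = x[:, None, :] - x[None, :, :]
    dist = np.sqrt((diff ** 2).sum(-1))
    np.fill_diagonal(dist, np.inf)
    r = dist.min(axis=1)
    gamma = 0.5772156649015329                 # Euler-Mascheroni constant
    log_vd = (d / 2) * math.log(math.pi) - math.lgamma(d / 2 + 1)
    psi_n = math.log(n) - 1 / (2 * n)
    return psi_n + gamma + log_vd + d * np.log(r).mean()

rng = np.random.default_rng(1)
h_uniform = kl_entropy_1nn(rng.uniform(0, 1, 2000))
print(h_uniform)  # close to 0, the true entropy of U(0,1) in nats
```

The high dimensionality mentioned in the abstract (six: three rotational plus three translational degrees of freedom) is exactly where such distance-based estimators outperform histograms.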

  6. [Guidelines for Accurate and Transparent Health Estimates Reporting: the GATHER Statement].

    PubMed

    Stevens, Gretchen A; Alkema, Leontine; Black, Robert E; Boerma, J Ties; Collins, Gary S; Ezzati, Majid; Grove, John T; Hogan, Daniel R; Hogan, Margaret C; Horton, Richard; Lawn, Joy E; Marušic, Ana; Mathers, Colin D; Murray, Christopher J L; Rudan, Igor; Salomon, Joshua A; Simpson, Paul J; Vos, Theo; Welch, Vivian

    2017-01-01

    Measurements of health indicators are rarely available for every population and period of interest, and available data may not be comparable. The Guidelines for Accurate and Transparent Health Estimates Reporting (GATHER) define best reporting practices for studies that calculate health estimates for multiple populations (in time or space) using multiple information sources. Health estimates that fall within the scope of GATHER include all quantitative population-level estimates (including global, regional, national, or subnational estimates) of health indicators, including indicators of health status, incidence and prevalence of diseases, injuries, and disability and functioning; and indicators of health determinants, including health behaviours and health exposures. GATHER comprises a checklist of 18 items that are essential for best reporting practice. A more detailed explanation and elaboration document, describing the interpretation and rationale of each reporting item along with examples of good reporting, is available on the GATHER website (http://gather-statement.org).

  7. Polynomial fitting of DT-MRI fiber tracts allows accurate estimation of muscle architectural parameters.

    PubMed

    Damon, Bruce M; Heemskerk, Anneriet M; Ding, Zhaohua

    2012-06-01

    Fiber curvature is a functionally significant muscle structural property, but its estimation from diffusion-tensor magnetic resonance imaging fiber tracking data may be confounded by noise. The purpose of this study was to investigate the use of polynomial fitting of fiber tracts for improving the accuracy and precision of fiber curvature (κ) measurements. Simulated image data sets were created in order to provide data with known values for κ and pennation angle (θ). Simulations were designed to test the effects of increasing inherent fiber curvature (3.8, 7.9, 11.8 and 15.3 m(-1)), signal-to-noise ratio (50, 75, 100 and 150) and voxel geometry (13.8- and 27.0-mm(3) voxel volume with isotropic resolution; 13.5-mm(3) volume with an aspect ratio of 4.0) on κ and θ measurements. In the originally reconstructed tracts, θ was estimated accurately under most curvature and all imaging conditions studied; however, the estimates of κ were imprecise and inaccurate. Fitting the tracts to second-order polynomial functions provided accurate and precise estimates of κ for all conditions except very high curvature (κ=15.3 m(-1)), while preserving the accuracy of the θ estimates. Similarly, polynomial fitting of in vivo fiber tracking data reduced the κ values of fitted tracts from those of unfitted tracts and did not change the θ values. Polynomial fitting of fiber tracts allows accurate estimation of physiologically reasonable values of κ, while preserving the accuracy of θ estimation.
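The fitting idea can be sketched in two dimensions: fit a second-order polynomial to points along a tract and evaluate the curvature of the fitted curve. DT-MRI tracts are three-dimensional, so this planar example (with a made-up circular arc of known curvature) only illustrates the principle:

```python
import numpy as np

def curvature_from_poly(x, y):
    """Fit a 2nd-order polynomial y(x) to tract points and return the
    curvature kappa = |y''| / (1 + y'^2)^(3/2) at the midpoint."""
    c2, c1, _ = np.polyfit(x, y, 2)
    x0 = x.mean()
    yp = 2 * c2 * x0 + c1      # first derivative at the midpoint
    ypp = 2 * c2               # second derivative (constant for a quadratic)
    return abs(ypp) / (1 + yp ** 2) ** 1.5

# Points on a circular arc of radius 0.2 m -> true curvature 5 m^-1
theta = np.linspace(-0.3, 0.3, 50)
r = 0.2
x = r * np.sin(theta)
y = r * (1 - np.cos(theta))
print(curvature_from_poly(x, y))  # ~5
```

Because the quadratic smooths point-to-point noise, curvature computed from the fitted coefficients is far more stable than finite differences on raw tract points, which is the effect the study reports.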

  8. Polynomial Fitting of DT-MRI Fiber Tracts Allows Accurate Estimation of Muscle Architectural Parameters

    PubMed Central

    Damon, Bruce M.; Heemskerk, Anneriet M.; Ding, Zhaohua

    2012-01-01

    Fiber curvature is a functionally significant muscle structural property, but its estimation from diffusion-tensor MRI fiber tracking data may be confounded by noise. The purpose of this study was to investigate the use of polynomial fitting of fiber tracts for improving the accuracy and precision of fiber curvature (κ) measurements. Simulated image datasets were created in order to provide data with known values for κ and pennation angle (θ). Simulations were designed to test the effects of increasing inherent fiber curvature (3.8, 7.9, 11.8, and 15.3 m−1), signal-to-noise ratio (50, 75, 100, and 150), and voxel geometry (13.8 and 27.0 mm3 voxel volume with isotropic resolution; 13.5 mm3 volume with an aspect ratio of 4.0) on κ and θ measurements. In the originally reconstructed tracts, θ was estimated accurately under most curvature and all imaging conditions studied; however, the estimates of κ were imprecise and inaccurate. Fitting the tracts to 2nd order polynomial functions provided accurate and precise estimates of κ for all conditions except very high curvature (κ=15.3 m−1), while preserving the accuracy of the θ estimates. Similarly, polynomial fitting of in vivo fiber tracking data reduced the κ values of fitted tracts from those of unfitted tracts and did not change the θ values. Polynomial fitting of fiber tracts allows accurate estimation of physiologically reasonable values of κ, while preserving the accuracy of θ estimation. PMID:22503094

  9. Exposure Estimation and Interpretation of Occupational Risk: Enhanced Information for the Occupational Risk Manager

    PubMed Central

    Waters, Martha; McKernan, Lauralynn; Maier, Andrew; Jayjock, Michael; Schaeffer, Val; Brosseau, Lisa

    2015-01-01

    The fundamental goal of this article is to describe, define, and analyze the components of the risk characterization process for occupational exposures. Current methods are described for the probabilistic characterization of exposure, including newer techniques that have increasing applications for assessing data from occupational exposure scenarios. In addition, because the probability of health effects reflects variability in the exposure estimate as well as in the dose-response curve, integrated consideration of the variability surrounding both components of the risk characterization provides greater information to the occupational hygienist. Probabilistic tools provide a more informed view of exposure as compared to use of discrete point estimates for these inputs to the risk characterization process. Active use of such tools for exposure and risk assessment will lead to a scientifically supported worker health protection program. Understanding the bases for an occupational risk assessment, and focusing on important sources of variability and uncertainty, enables characterizing occupational risk in terms of a probability rather than a binary decision of acceptable or unacceptable risk. A critical review of existing methods highlights several conclusions: (1) exposure estimates and the dose-response are impacted by both variability and uncertainty and a well-developed risk characterization reflects and communicates this consideration; (2) occupational risk is probabilistic in nature and most accurately considered as a distribution, not a point estimate; and (3) occupational hygienists have a variety of tools available to incorporate concepts of risk characterization into occupational health and practice. PMID:26302336
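The contrast between a point estimate and a distribution can be sketched with a simple Monte Carlo exposure model. The lognormal parameters and exposure limit below are hypothetical illustrations, not values from the article:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical lognormal exposure distribution (GM = 0.05 mg/m^3, GSD = 2.5)
# compared against a hypothetical occupational exposure limit of 0.1 mg/m^3
gm, gsd, oel = 0.05, 2.5, 0.1
exposures = rng.lognormal(mean=np.log(gm), sigma=np.log(gsd), size=100_000)

point_estimate = exposures.mean()        # a single discrete number
exceedance = (exposures > oel).mean()    # probability of exceeding the limit

print(f"mean exposure  : {point_estimate:.3f} mg/m^3")
print(f"P(exposure>OEL): {exceedance:.3f}")
```

The mean alone sits below the limit, yet the simulated distribution shows a substantial exceedance probability, which is precisely the extra information a probabilistic characterization gives the risk manager.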

  10. Robust and accurate fundamental frequency estimation based on dominant harmonic components.

    PubMed

    Nakatani, Tomohiro; Irino, Toshio

    2004-12-01

    This paper presents a new method for robust and accurate fundamental frequency (F0) estimation in the presence of background noise and spectral distortion. Degree of dominance and dominance spectrum are defined based on instantaneous frequencies. The degree of dominance allows one to evaluate the magnitude of individual harmonic components of the speech signals relative to background noise while reducing the influence of spectral distortion. The fundamental frequency is more accurately estimated from reliable harmonic components which are easy to select given the dominance spectra. Experiments are performed using white and babble background noise with and without spectral distortion as produced by a SRAEN filter. The results show that the present method is better than previously reported methods in terms of both gross and fine F0 errors.

  11. Development of Star Tracker System for Accurate Estimation of Spacecraft Attitude

    DTIC Science & Technology

    2009-12-01

    Thesis by Jack A. Tappe, December 2009. Thesis co-advisors: Jae Jun Kim and Brij N. Agrawal; Dr. Knox T. Millsaps, Chairman, Department of Mechanical and Astronautical Engineering.

  12. A method to accurately estimate the muscular torques of human wearing exoskeletons by torque sensors.

    PubMed

    Hwang, Beomsoo; Jeon, Doyoung

    2015-04-09

    In exoskeletal robots, quantifying the user's muscular effort is important for recognizing the user's motion intentions and evaluating motor abilities. In this paper, we attempt to estimate users' muscular efforts accurately using joint torque sensors, whose measurements contain the dynamic effects of the human body, such as the inertial, Coriolis, and gravitational torques, as well as the torque produced by active muscular effort. It is important to extract the dynamic effects of the user's limb accurately from the measured torque. The user's limb dynamics are formulated, and a convenient method of identifying user-specific parameters is suggested for estimating the user's muscular torque in robotic exoskeletons. Experiments were carried out on a wheelchair-integrated lower limb exoskeleton, EXOwheel, equipped with torque sensors in the hip and knee joints. The proposed methods were evaluated by 10 healthy participants during body weight-supported gait training. The experimental results show that the torque sensors can estimate the muscular torque accurately under both relaxed and activated muscle conditions.

  13. Accurate Attitude Estimation Using ARS under Conditions of Vehicle Movement Based on Disturbance Acceleration Adaptive Estimation and Correction

    PubMed Central

    Xing, Li; Hang, Yijun; Xiong, Zhi; Liu, Jianye; Wan, Zhong

    2016-01-01

    This paper describes a disturbance acceleration adaptive estimation and correction approach for an attitude reference system (ARS) that improves attitude estimation precision under vehicle movement conditions. The proposed approach depends on a Kalman filter in which the attitude error, the gyroscope zero offset error, and the disturbance acceleration error are estimated. By switching the filter decay coefficient of the disturbance acceleration model in different acceleration modes, the disturbance acceleration is adaptively estimated and corrected, which in turn improves attitude estimation precision. The filter was tested by digital simulation in three disturbance acceleration modes (non-acceleration, vibration-acceleration, and sustained-acceleration). The proposed approach was also tested in a kinematic vehicle experiment. The simulations and kinematic vehicle experiments show that the disturbance acceleration of each mode can be accurately estimated and corrected. Moreover, compared with a complementary filter, the experimental results explicitly demonstrate that the proposed approach further improves attitude estimation precision under vehicle movement conditions. PMID:27754469

  14. Novel serologic biomarkers provide accurate estimates of recent Plasmodium falciparum exposure for individuals and communities.

    PubMed

    Helb, Danica A; Tetteh, Kevin K A; Felgner, Philip L; Skinner, Jeff; Hubbard, Alan; Arinaitwe, Emmanuel; Mayanja-Kizza, Harriet; Ssewanyana, Isaac; Kamya, Moses R; Beeson, James G; Tappero, Jordan; Smith, David L; Crompton, Peter D; Rosenthal, Philip J; Dorsey, Grant; Drakeley, Christopher J; Greenhouse, Bryan

    2015-08-11

    Tools to reliably measure Plasmodium falciparum (Pf) exposure in individuals and communities are needed to guide and evaluate malaria control interventions. Serologic assays can potentially produce precise exposure estimates at low cost; however, current approaches based on responses to a few characterized antigens are not designed to estimate exposure in individuals. Pf-specific antibody responses differ by antigen, suggesting that selection of antigens with defined kinetic profiles will improve estimates of Pf exposure. To identify novel serologic biomarkers of malaria exposure, we evaluated responses to 856 Pf antigens by protein microarray in 186 Ugandan children, for whom detailed Pf exposure data were available. Using data-adaptive statistical methods, we identified combinations of antibody responses that maximized information on an individual's recent exposure. Responses to three novel Pf antigens accurately classified whether an individual had been infected within the last 30, 90, or 365 d (cross-validated area under the curve = 0.86-0.93), whereas responses to six antigens accurately estimated an individual's malaria incidence in the prior year. Cross-validated incidence predictions for individuals in different communities provided accurate stratification of exposure between populations and suggest that precise estimates of community exposure can be obtained from sampling a small subset of that community. In addition, serologic incidence predictions from cross-sectional samples characterized heterogeneity within a community similarly to 1 y of continuous passive surveillance. Development of simple ELISA-based assays derived from the successful selection strategy outlined here offers the potential to generate rich epidemiologic surveillance data that will be widely accessible to malaria control programs.

  15. Submarine tower escape decompression sickness risk estimation.

    PubMed

    Loveman, G A M; Seddon, E M; Thacker, J C; Stansfield, M R; Jurd, K M

    2014-01-01

    Actions to enhance survival in a distressed submarine (DISSUB) scenario may be guided in part by knowledge of the likely risk of decompression sickness (DCS) should the crew attempt tower escape. A mathematical model for DCS risk estimation has been calibrated against DCS outcome data from 3,738 exposures of either men or goats to raised pressure. Body mass was used to scale DCS risk. The calibration data included more than 1,000 actual or simulated submarine escape exposures and no exposures with substantial staged decompression. Cases of pulmonary barotrauma were removed from the calibration data. The calibrated model was used to estimate the likelihood of DCS occurrence following submarine escape from the United Kingdom Royal Navy tower escape system. Where internal DISSUB pressure remains at ~0.1 MPa, escape from DISSUB depths < 200 meters is estimated to have DCS risk < 6%. Saturation at raised DISSUB pressure markedly increases risk, with > 60% DCS risk predicted for a 200-meter escape from saturation at 0.21 MPa. Using the calibrated model to predict DCS for direct ascent from saturation gives similar risk estimates to other published models.

  16. Accurate estimation of object location in an image sequence using helicopter flight data

    NASA Technical Reports Server (NTRS)

    Tang, Yuan-Liang; Kasturi, Rangachar

    1994-01-01

    In autonomous navigation, it is essential to obtain a three-dimensional (3D) description of the static environment in which the vehicle is traveling. For a rotorcraft conducting low-altitude flight, this description is particularly useful for obstacle detection and avoidance. In this paper, we address the problem of 3D position estimation for static objects from a monocular sequence of images captured from a low-altitude flying helicopter. Since the environment is static, it is well known that the optical flow in the image will produce a radiating pattern from the focus of expansion. We propose a motion analysis system which utilizes the epipolar constraint to accurately estimate 3D positions of scene objects in a real world image sequence taken from a low-altitude flying helicopter. Results show that this approach gives good estimates of object positions near the rotorcraft's intended flight-path.

  17. Estimating the Effective Permittivity for Reconstructing Accurate Microwave-Radar Images

    PubMed Central

    Lavoie, Benjamin R.; Okoniewski, Michal; Fear, Elise C.

    2016-01-01

    We present preliminary results from a method for estimating the optimal effective permittivity for reconstructing microwave-radar images. Using knowledge of how microwave-radar images are formed, we identify characteristics that are typical of good images, and define a fitness function to measure the relative image quality. We build a polynomial interpolant of the fitness function in order to identify the most likely permittivity values of the tissue. To make the estimation process more efficient, the polynomial interpolant is constructed using a locally and dimensionally adaptive sampling method that is a novel combination of stochastic collocation and polynomial chaos. Examples are presented using a series of simulated, experimental, and patient data sets collected with the Tissue Sensing Adaptive Radar system, which is under development at the University of Calgary. These examples show how, using our method, accurate images can be reconstructed starting with only a broad estimate of the permittivity range. PMID:27611785

  18. Accurate estimates of age at maturity from the growth trajectories of fishes and other ectotherms.

    PubMed

    Honsey, Andrew E; Staples, David F; Venturelli, Paul A

    2017-01-01

    Age at maturity (AAM) is a key life history trait that provides insight into ecology, evolution, and population dynamics. However, maturity data can be costly to collect or may not be available. Life history theory suggests that growth is biphasic for many organisms, with a change-point in growth occurring at maturity. If so, then it should be possible to use a biphasic growth model to estimate AAM from growth data. To test this prediction, we used the Lester biphasic growth model in a likelihood profiling framework to estimate AAM from length at age data. We fit our model to simulated growth trajectories to determine minimum data requirements (in terms of sample size, precision in length at age, and the cost to somatic growth of maturity) for accurate AAM estimates. We then applied our method to a large walleye Sander vitreus data set and show that our AAM estimates are in close agreement with conventional estimates when our model fits well. Finally, we highlight the potential of our method by applying it to length at age data for a variety of ectotherms. Our method shows promise as a tool for estimating AAM and other life history traits from contemporary and historical samples.
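
    The change-point idea behind this approach can be sketched by profiling a simple least-squares objective over candidate break ages. This is a deliberate simplification of the Lester-model likelihood profiling described above, run on synthetic length-at-age data (all values illustrative):

```python
import random

random.seed(7)

# Synthetic length-at-age data: fast linear growth up to a true change-point
# age, slower linear growth after it (a crude stand-in for biphasic growth).
TRUE_AAM = 4
ages = [a for a in range(1, 11) for _ in range(20)]
def true_len(a):
    return 10 * a if a <= TRUE_AAM else 10 * TRUE_AAM + 4 * (a - TRUE_AAM)
lengths = [true_len(a) + random.gauss(0, 2) for a in ages]

def sse_two_phase(brk):
    """Fit separate least-squares lines before/after `brk`; return total SSE."""
    def fit(pts):
        n = len(pts)
        if n < 2:
            return 0.0
        sx = sum(a for a, _ in pts); sy = sum(l for _, l in pts)
        sxx = sum(a * a for a, _ in pts); sxy = sum(a * l for a, l in pts)
        b = (n * sxy - sx * sy) / (n * sxx - sx * sx)  # slope
        c = (sy - b * sx) / n                          # intercept
        return sum((l - (b * a + c)) ** 2 for a, l in pts)
    data = list(zip(ages, lengths))
    return (fit([p for p in data if p[0] <= brk])
            + fit([p for p in data if p[0] > brk]))

# Profile the objective over candidate change-point ages; the minimum is
# the estimated age at maturity.
est_aam = min(range(2, 9), key=sse_two_phase)
print("estimated age at maturity:", est_aam)
```

    The real method profiles a likelihood over a continuous change-point under the Lester growth model rather than minimizing an SSE over integer ages, but the profiling logic is the same.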

  19. [Estimation of risk areas for hepatitis A].

    PubMed

    Braga, Ricardo Cerqueira Campos; Valencia, Luís Iván Ortiz; Medronho, Roberto de Andrade; Escosteguy, Claudia Caminha

    2008-08-01

    This study estimated hepatitis A risk areas in a region of Duque de Caxias, Rio de Janeiro State, Brazil. A cross-sectional study consisting of a hepatitis A serological survey and a household survey were conducted in 19 census tracts. Of these, 11 tracts were selected and 1,298 children from one to ten years of age were included in the study. Geostatistical techniques allowed modeling the spatial continuity of hepatitis A, non-use of filtered drinking water, time since installation of running water, and number of water taps per household and their spatial estimation through ordinary and indicator kriging. Adjusted models for the outcome and socioeconomic variables were isotropic; risk maps were constructed; cross-validation of the four models was satisfactory. Spatial estimation using the kriging method detected areas with increased risk of hepatitis A, independently of the urban administrative area in which the census tracts were located.

  20. Intraocular lens power estimation by accurate ray tracing for eyes underwent previous refractive surgeries

    NASA Astrophysics Data System (ADS)

    Yang, Que; Wang, Shanshan; Wang, Kai; Zhang, Chunyu; Zhang, Lu; Meng, Qingyu; Zhu, Qiudong

    2015-08-01

    For normal eyes without a history of ocular surgery, traditional equations for calculating intraocular lens (IOL) power, such as SRK-T, Holladay, Haigis, and SRK-II, are all relatively accurate. However, for eyes that underwent refractive surgeries such as LASIK, or eyes diagnosed with keratoconus, these equations may produce significant postoperative refractive error, leading to poor satisfaction after cataract surgery. Although some methods have been proposed to solve this problem, such as the Haigis-L equation[1], or using preoperative data (from before LASIK) to estimate the K value[2], no precise equations are available for these eyes. Here, we introduce a novel intraocular lens power estimation method based on accurate ray tracing with the optical design software ZEMAX. Instead of using a traditional regression formula, we adopted the exact measured corneal elevation distribution, central corneal thickness, anterior chamber depth, axial length, and estimated effective lens plane as the input parameters. The calculated intraocular lens power for a patient with keratoconus and another post-LASIK patient agreed very well with their visual outcomes after cataract surgery.

  1. Accurate and quantitative polarization-sensitive OCT by unbiased birefringence estimator with noise-stochastic correction

    NASA Astrophysics Data System (ADS)

    Kasaragod, Deepa; Sugiyama, Satoshi; Ikuno, Yasushi; Alonso-Caneiro, David; Yamanari, Masahiro; Fukuda, Shinichi; Oshika, Tetsuro; Hong, Young-Joo; Li, En; Makita, Shuichi; Miura, Masahiro; Yasuno, Yoshiaki

    2016-03-01

    Polarization sensitive optical coherence tomography (PS-OCT) is a functional extension of OCT that contrasts the polarization properties of tissues. It has been applied to ophthalmology, cardiology, and other fields. Proper quantitative imaging is required for widespread clinical utility. However, the conventional method of averaging to improve the signal-to-noise ratio (SNR) and the contrast of the phase retardation (or birefringence) images introduces a noise bias offset from the true value. This bias reduces the effectiveness of birefringence contrast for quantitative studies. Although coherent averaging of Jones matrix tomography has been widely utilized and has improved image quality, the fundamental limitation of the nonlinear dependency of phase retardation and birefringence on SNR was not overcome, so the birefringence obtained by PS-OCT was still not accurate enough for quantitative imaging. The nonlinear effect of SNR on phase retardation and birefringence measurement was previously formulated in detail for Jones matrix OCT (JM-OCT) [1]. On this basis, we had developed a maximum a-posteriori (MAP) estimator, and quantitative birefringence imaging was demonstrated [2]. However, this first version of the estimator had a theoretical shortcoming: it did not take into account the stochastic nature of the SNR of the OCT signal. In this paper, we present an improved version of the MAP estimator which takes into account the stochastic property of SNR. This estimator uses a probability distribution function (PDF) of the true local retardation, which is proportional to birefringence, under a specific set of measurements of the birefringence and SNR. The PDF was pre-computed by a Monte-Carlo (MC) simulation based on the mathematical model of JM-OCT before the measurement. A comparison between this new MAP estimator, our previous MAP estimator [2], and the standard mean estimator is presented. The comparisons are performed both by numerical simulation and in vivo measurements of anterior and

  2. Accurate estimation of cardinal growth temperatures of Escherichia coli from optimal dynamic experiments.

    PubMed

    Van Derlinden, E; Bernaerts, K; Van Impe, J F

    2008-11-30

    Prediction of the microbial growth rate as a response to changing temperatures is an important aspect in the control of food safety and food spoilage. Accurate model predictions of the microbial evolution call for correct model structures and reliable parameter values with good statistical quality. Given the widely accepted validity of the Cardinal Temperature Model with Inflection (CTMI) [Rosso, L., Lobry, J. R., Bajard, S. and Flandrois, J. P., 1995. Convenient model to describe the combined effects of temperature and pH on microbial growth, Applied and Environmental Microbiology, 61: 610-616], this paper focuses on the accurate estimation of its four parameters (T(min), T(opt), T(max) and µ(opt)) by applying the technique of optimal experiment design for parameter estimation (OED/PE). This secondary model describes the influence of temperature on the microbial specific growth rate from the minimum to the maximum temperature for growth. Dynamic temperature profiles are optimized within two temperature regions ([15 °C, 43 °C] and [15 °C, 45 °C]), focusing on the minimization of the parameter estimation (co)variance (D-optimal design). The optimal temperature profiles are implemented in a computer-controlled bioreactor, and the CTMI parameters are identified from the resulting experimental data. Approximately equal CTMI parameter values were derived irrespective of the temperature region, except for T(max). The latter could only be estimated accurately from the optimal experiments within [15 °C, 45 °C]. This observation underlines the importance of selecting the upper temperature constraint for OED/PE as close as possible to the true T(max). Cardinal temperature estimates resulting from designs within [15 °C, 45 °C] correspond with values found in the literature, are characterized by a small uncertainty error, and yield a good result during validation. As compared to estimates from non-optimized dynamic
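
    For reference, the CTMI itself has a simple closed form. A minimal sketch, assuming the standard Rosso et al. formulation and using illustrative (not fitted) cardinal values:

```python
def ctmi(T, Tmin, Topt, Tmax, mu_opt):
    """Cardinal Temperature Model with Inflection (Rosso et al. formulation).

    Returns the specific growth rate at temperature T; zero outside
    the [Tmin, Tmax] growth range and exactly mu_opt at T = Topt.
    """
    if T <= Tmin or T >= Tmax:
        return 0.0
    num = (T - Tmax) * (T - Tmin) ** 2
    den = (Topt - Tmin) * ((Topt - Tmin) * (T - Topt)
                           - (Topt - Tmax) * (Topt + Tmin - 2 * T))
    return mu_opt * num / den

# Illustrative E. coli-like cardinal values (not the fitted values of the study).
params = dict(Tmin=5.0, Topt=40.0, Tmax=47.0, mu_opt=2.3)
print(ctmi(40.0, **params))  # equals mu_opt at the optimum temperature
print(ctmi(5.0, **params))   # zero at the minimum temperature for growth
```

    The OED/PE step in the article then shapes the dynamic temperature profile so that the data collected are maximally informative about these four parameters.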

  3. READSCAN: a fast and scalable pathogen discovery program with accurate genome relative abundance estimation

    PubMed Central

    Rashid, Mamoon; Pain, Arnab

    2013-01-01

    Summary: READSCAN is a highly scalable parallel program to identify non-host sequences (of potential pathogen origin) and estimate their genome relative abundance in high-throughput sequence datasets. READSCAN accurately classified human and viral sequences on a 20.1 million reads simulated dataset in <27 min using a small Beowulf compute cluster with 16 nodes (Supplementary Material). Availability: http://cbrc.kaust.edu.sa/readscan Contact: arnab.pain@kaust.edu.sa or raeece.naeem@gmail.com Supplementary information: Supplementary data are available at Bioinformatics online. PMID:23193222

  4. The estimation of tumor cell percentage for molecular testing by pathologists is not accurate.

    PubMed

    Smits, Alexander J J; Kummer, J Alain; de Bruin, Peter C; Bol, Mijke; van den Tweel, Jan G; Seldenrijk, Kees A; Willems, Stefan M; Offerhaus, G Johan A; de Weger, Roel A; van Diest, Paul J; Vink, Aryan

    2014-02-01

    Molecular pathology is becoming more and more important in present day pathology. A major challenge for any molecular test is its ability to reliably detect mutations in samples consisting of mixtures of tumor cells and normal cells, especially when the tumor content is low. The minimum percentage of tumor cells required to detect genetic abnormalities is a major variable. Information on tumor cell percentage is essential for a correct interpretation of the result. In daily practice, the percentage of tumor cells is estimated by pathologists on hematoxylin and eosin (H&E)-stained slides, the reliability of which has been questioned. This study aimed to determine the reliability of estimated tumor cell percentages in tissue samples by pathologists. On 47 H&E-stained slides of lung tumors a tumor area was marked. The percentage of tumor cells within this area was estimated independently by nine pathologists, using categories of 0-5%, 6-10%, 11-20%, 21-30%, and so on, until 91-100%. As gold standard, the percentage of tumor cells was counted manually. On average, the range between the lowest and the highest estimate per sample was 6.3 categories. In 33% of estimates, the deviation from the gold standard was at least three categories. The mean absolute deviation was 2.0 categories (range between observers 1.5-3.1 categories). There was a significant difference between the observers (P<0.001). If 20% of tumor cells were considered the lower limit to detect a mutation, samples with an insufficient tumor cell percentage (<20%) would have been estimated to contain enough tumor cells in 27/72 (38%) observations, possibly causing false negative results. In conclusion, estimates of tumor cell percentages on H&E-stained slides are not accurate, which could result in misinterpretation of test results. Reliability could possibly be improved by using a training set with feedback.

  5. Toward an Accurate Estimate of the Exfoliation Energy of Black Phosphorus: A Periodic Quantum Chemical Approach.

    PubMed

    Sansone, Giuseppe; Maschio, Lorenzo; Usvyat, Denis; Schütz, Martin; Karttunen, Antti

    2016-01-07

    The black phosphorus (black-P) crystal is formed of covalently bound layers of phosphorene stacked together by weak van der Waals interactions. An experimental measurement of the exfoliation energy of black-P is not available presently, making theoretical studies the most important source of information for the optimization of phosphorene production. Here, we provide an accurate estimate of the exfoliation energy of black-P on the basis of multilevel quantum chemical calculations, which include the periodic local Møller-Plesset perturbation theory of second order, augmented by higher-order corrections, which are evaluated with finite clusters mimicking the crystal. Very similar results are also obtained by density functional theory with the D3-version of Grimme's empirical dispersion correction. Our estimate of the exfoliation energy for black-P of -151 meV/atom is substantially larger than that of graphite, suggesting the need for different strategies to generate isolated layers for these two systems.

  6. Spatial ascariasis risk estimation using socioeconomic variables.

    PubMed

    Valencia, Luis Iván Ortiz; Fortes, Bruno de Paula Menezes Drumond; Medronho, Roberto de Andrade

    2005-12-01

    Frequently, disease incidence is mapped as area data, for example, census tracts, districts or states. Spatial disease incidence can be highly heterogeneous inside these areas. Ascariasis is a highly prevalent disease, which is associated with poor sanitation and hygiene. Geostatistics was applied to model spatial distribution of Ascariasis risk and socioeconomic risk events in a poor community in Rio de Janeiro, Brazil. Data were gathered from a coproparasitologic and a domiciliary survey in 1550 children aged 1-9. Ascariasis risk and socioeconomic risk events were spatially estimated using Indicator Kriging. Cokriging models with a Linear Model of Coregionalization incorporating one socioeconomic variable were implemented. If a housewife attended school for less than four years, the non-use of a home water filter, a household density greater than one, and a household income lower than one Brazilian minimum wage increased the risk of Ascariasis. Cokriging improved spatial estimation of Ascariasis risk areas when compared to Indicator Kriging and detected more Ascariasis very-high risk areas than the GIS Overlay method.

  7. Lamb mode selection for accurate wall loss estimation via guided wave tomography

    SciTech Connect

    Huthwaite, P.; Ribichini, R.; Lowe, M. J. S.; Cawley, P.

    2014-02-18

    Guided wave tomography offers a method to accurately quantify wall thickness losses in pipes and vessels caused by corrosion. This is achieved using ultrasonic waves transmitted over distances of approximately 1-2 m, which are measured by an array of transducers and then used to reconstruct a map of wall thickness throughout the inspected region. To achieve accurate estimations of remnant wall thickness, it is vital that a suitable Lamb mode is chosen. This paper presents a detailed evaluation of the fundamental modes, S0 and A0, which are of primary interest in guided wave tomography thickness estimates since the higher order modes do not exist at all thicknesses, comparing their performance using both numerical and experimental data while considering a range of challenging phenomena. The sensitivity of A0 to thickness variations was shown to be superior to that of S0; however, the attenuation of A0 in the presence of liquid loading was much higher than that of S0. A0 was also less sensitive than S0 to the presence of coatings on the surface.

  8. Magnetic gaps in organic tri-radicals: From a simple model to accurate estimates

    NASA Astrophysics Data System (ADS)

    Barone, Vincenzo; Cacelli, Ivo; Ferretti, Alessandro; Prampolini, Giacomo

    2017-03-01

    The calculation of the energy gap between the magnetic states of organic poly-radicals still represents a challenging playground for quantum chemistry, and high-level techniques are required to obtain accurate estimates. On these grounds, the aim of the present study is twofold. On the one hand, it shows that, thanks to recent algorithmic and technical improvements, we are able to compute reliable quantum mechanical results for systems of current fundamental and technological interest. On the other hand, proper parameterization of a simple Hubbard Hamiltonian allows for a sound rationalization of magnetic gaps in terms of basic physical effects, unraveling the role played by electron delocalization, Coulomb repulsion, and effective exchange in tuning the magnetic character of the ground state. As case studies, we have chosen three prototypical organic tri-radicals, namely, 1,3,5-trimethylenebenzene, 1,3,5-tridehydrobenzene, and 1,2,3-tridehydrobenzene, which differ in either geometric or electronic structure. After discussing the differences among the three species and their consequences for the magnetic properties in terms of the simple model mentioned above, accurate and reliable values for the energy gap between the lowest quartet and doublet states are computed by means of the so-called difference dedicated configuration interaction (DDCI) technique, and the final results are discussed and compared to both available experimental and computational estimates.

  9. Do modelled or satellite-based estimates of surface solar irradiance accurately describe its temporal variability?

    NASA Astrophysics Data System (ADS)

    Bengulescu, Marc; Blanc, Philippe; Boilley, Alexandre; Wald, Lucien

    2017-02-01

    This study investigates the characteristic time-scales of variability found in long-term time-series of daily means of estimates of surface solar irradiance (SSI). The study is performed at various levels to better understand the causes of variability in the SSI. First, the variability of the solar irradiance at the top of the atmosphere is scrutinized. Then, estimates of the SSI in cloud-free conditions as provided by the McClear model are dealt with, in order to reveal the influence of the clear atmosphere (aerosols, water vapour, etc.). Lastly, the role of clouds on variability is inferred by the analysis of in-situ measurements. A description of how the atmosphere affects SSI variability is thus obtained on a time-scale basis. The analysis is also performed with estimates of the SSI provided by the satellite-derived HelioClim-3 database and by two numerical weather re-analyses: ERA-Interim and MERRA2. It is found that HelioClim-3 estimates render an accurate picture of the variability found in ground measurements, not only globally, but also with respect to individual characteristic time-scales. On the contrary, the variability found in re-analyses correlates poorly with all scales of ground measurements variability.

  10. Removing the thermal component from heart rate provides an accurate VO2 estimation in forest work.

    PubMed

    Dubé, Philippe-Antoine; Imbeau, Daniel; Dubeau, Denise; Lebel, Luc; Kolus, Ahmet

    2016-05-01

    Heart rate (HR) was monitored continuously in 41 forest workers performing brushcutting or tree planting work. 10-min seated rest periods were imposed during the workday to estimate the HR thermal component (ΔHRT) per Vogt et al. (1970, 1973). VO2 was measured using a portable gas analyzer during a morning submaximal step-test conducted at the work site, during a work bout over the course of the day (range: 9-74 min), and during an ensuing 10-min rest pause taken at the worksite. The VO2 estimates from measured HR and from corrected HR (thermal component removed) were compared to the VO2 measured during work and rest. Varied levels of the HR thermal component (ΔHRTavg range: 0-38 bpm), originating from a wide range of ambient thermal conditions, thermal clothing insulation worn, and physical load exerted during work, were observed. Using raw HR significantly overestimated measured work VO2 by 30% on average (range: 1%-64%). 74% of the VO2 prediction error variance was explained by the HR thermal component. VO2 estimated from corrected HR was not statistically different from measured VO2. Work VO2 can be estimated accurately in the presence of thermal stress using Vogt et al.'s method, which can be implemented easily by the practitioner with inexpensive instruments.
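
    The correction can be sketched as a linear HR-to-VO2 calibration (for example, from the morning step test), applied after subtracting the thermal component from the working HR. All numbers below are hypothetical illustrations, not data from the study:

```python
def calibrate(hr_points, vo2_points):
    """Least-squares line VO2 = a + b*HR from step-test calibration points."""
    n = len(hr_points)
    sx, sy = sum(hr_points), sum(vo2_points)
    sxx = sum(h * h for h in hr_points)
    sxy = sum(h * v for h, v in zip(hr_points, vo2_points))
    b = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    a = (sy - b * sx) / n
    return a, b

def estimate_vo2(hr_work, delta_hr_thermal, a, b):
    """Remove the thermal component from working HR, then apply the HR->VO2 line."""
    return a + b * (hr_work - delta_hr_thermal)

# Hypothetical step-test calibration points (HR in bpm, VO2 in L/min).
a, b = calibrate([90, 110, 130, 150], [1.0, 1.6, 2.2, 2.8])
vo2_corrected = estimate_vo2(hr_work=140, delta_hr_thermal=20, a=a, b=b)
vo2_raw = estimate_vo2(hr_work=140, delta_hr_thermal=0, a=a, b=b)
print(f"raw-HR estimate: {vo2_raw:.2f} L/min, corrected: {vo2_corrected:.2f} L/min")
```

    With a positive ΔHRT, the raw-HR estimate is always higher than the corrected one, which mirrors the ~30% overestimation reported above.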

  11. Accurate Estimation of the Intrinsic Dimension Using Graph Distances: Unraveling the Geometric Complexity of Datasets

    PubMed Central

    Granata, Daniele; Carnevale, Vincenzo

    2016-01-01

    The collective behavior of a large number of degrees of freedom can often be described by a handful of variables. This observation justifies the use of dimensionality reduction approaches to model complex systems and motivates the search for a small set of relevant “collective” variables. Here, we analyze this issue by focusing on the optimal number of variables needed to capture the salient features of a generic dataset and develop a novel estimator for the intrinsic dimension (ID). By approximating geodesics with minimum distance paths on a graph, we analyze the distribution of pairwise distances around the maximum and exploit its dependency on the dimensionality to obtain an ID estimate. We show that the estimator does not depend on the shape of the intrinsic manifold and is highly accurate, even for exceedingly small sample sizes. We apply the method to several relevant datasets from image recognition databases and protein multiple sequence alignments and discuss possible interpretations for the estimated dimension in light of the correlations among input variables and of the information content of the dataset. PMID:27510265
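
    The core building block here, approximating manifold geodesics with shortest paths on a neighborhood graph, can be sketched on toy data. This is only the geodesic-distance step, not the full ID estimator of the paper, and all points and parameters are illustrative:

```python
import heapq
import math

def knn_graph(points, k=3):
    """Undirected k-nearest-neighbour graph with Euclidean edge weights."""
    n = len(points)
    adj = {i: {} for i in range(n)}
    for i in range(n):
        dists = sorted((math.dist(points[i], points[j]), j)
                       for j in range(n) if j != i)
        for d, j in dists[:k]:
            adj[i][j] = d
            adj[j][i] = d
    return adj

def geodesic_distances(adj, src):
    """Dijkstra shortest paths: graph distances approximate manifold geodesics."""
    dist = {src: 0.0}
    heap = [(0.0, src)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, math.inf):
            continue
        for v, w in adj[u].items():
            nd = d + w
            if nd < dist.get(v, math.inf):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

# Toy 1-D manifold embedded in 2-D: 20 points along a unit-circle arc.
pts = [(math.cos(t / 10), math.sin(t / 10)) for t in range(20)]
adj = knn_graph(pts, k=2)
d = geodesic_distances(adj, 0)
print(d[19])  # graph distance approximates the arc length (1.9) between endpoints
```

    The paper's estimator then looks at the distribution of these pairwise graph distances around its maximum, whose shape depends on the intrinsic dimension.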

  12. Accurate Estimation of the Intrinsic Dimension Using Graph Distances: Unraveling the Geometric Complexity of Datasets

    NASA Astrophysics Data System (ADS)

    Granata, Daniele; Carnevale, Vincenzo

    2016-08-01

    The collective behavior of a large number of degrees of freedom can be often described by a handful of variables. This observation justifies the use of dimensionality reduction approaches to model complex systems and motivates the search for a small set of relevant “collective” variables. Here, we analyze this issue by focusing on the optimal number of variable needed to capture the salient features of a generic dataset and develop a novel estimator for the intrinsic dimension (ID). By approximating geodesics with minimum distance paths on a graph, we analyze the distribution of pairwise distances around the maximum and exploit its dependency on the dimensionality to obtain an ID estimate. We show that the estimator does not depend on the shape of the intrinsic manifold and is highly accurate, even for exceedingly small sample sizes. We apply the method to several relevant datasets from image recognition databases and protein multiple sequence alignments and discuss possible interpretations for the estimated dimension in light of the correlations among input variables and of the information content of the dataset.

  13. MIDAS robust trend estimator for accurate GPS station velocities without step detection.

    PubMed

    Blewitt, Geoffrey; Kreemer, Corné; Hammond, William C; Gazeaux, Julien

    2016-03-01

    Automatic estimation of velocities from GPS coordinate time series is becoming required to cope with the exponentially increasing flood of available data, but problems detectable to the human eye are often overlooked. This motivates us to find an automatic and accurate estimator of trend that is resistant to common problems such as step discontinuities, outliers, seasonality, skewness, and heteroscedasticity. Developed here, Median Interannual Difference Adjusted for Skewness (MIDAS) is a variant of the Theil-Sen median trend estimator, for which the ordinary version is the median of slopes vij = (xj-xi)/(tj-ti) computed between all data pairs i > j. For normally distributed data, Theil-Sen and least squares trend estimates are statistically identical, but unlike least squares, Theil-Sen is resistant to undetected data problems. To mitigate both seasonality and step discontinuities, MIDAS selects data pairs separated by 1 year. This condition is relaxed for time series with gaps so that all data are used. Slopes from data pairs spanning a step function produce one-sided outliers that can bias the median. To reduce bias, MIDAS removes outliers and recomputes the median. MIDAS also computes a robust and realistic estimate of trend uncertainty. Statistical tests using GPS data in the rigid North American plate interior show ±0.23 mm/yr root-mean-square (RMS) accuracy in horizontal velocity. In blind tests using synthetic data, MIDAS velocities have an RMS accuracy of ±0.33 mm/yr horizontal, ±1.1 mm/yr up, with a 5th percentile range smaller than all 20 automatic estimators tested. Considering its general nature, MIDAS has the potential for broader application in the geosciences.
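The pairing-and-median idea can be sketched in a few lines. This is a rough sketch, not the published MIDAS code: the gap handling, uncertainty estimate, and exact trimming rule are simplified.

```python
import numpy as np

def midas_trend(t, x):
    """Sketch of the MIDAS idea: median of slopes over data pairs
    separated by ~1 year, with one round of outlier trimming."""
    slopes = []
    for i in range(len(t)):
        # pair each epoch with the first epoch at least ~1 year later
        j = np.searchsorted(t, t[i] + 0.999)
        if j < len(t) and abs((t[j] - t[i]) - 1.0) < 0.1:
            slopes.append((x[j] - x[i]) / (t[j] - t[i]))
    slopes = np.asarray(slopes)
    m = np.median(slopes)
    mad = 1.4826 * np.median(np.abs(slopes - m))   # robust scale
    kept = slopes[np.abs(slopes - m) <= 2 * mad] if mad > 0 else slopes
    return np.median(kept)

# Synthetic weekly series: trend 2.0/yr + annual cycle + a step of 10
t = np.arange(0.0, 5.0, 1.0 / 52.0)
x = 2.0 * t + 0.5 * np.sin(2 * np.pi * t) + 10.0 * (t > 2.5)
trend = midas_trend(t, x)
```

On this series the one-year pairing cancels the seasonal term exactly, and the step only contaminates a minority of pairs, so the median stays at the true trend of 2.0.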

  14. Methods for accurate estimation of net discharge in a tidal channel

    USGS Publications Warehouse

    Simpson, M.R.; Bland, R.

    2000-01-01

    Accurate estimates of net residual discharge in tidally affected rivers and estuaries are possible because of recently developed ultrasonic discharge measurement techniques. Previous discharge estimates using conventional mechanical current meters and methods based on stage/discharge relations or water slope measurements often yielded errors that were as great as or greater than the computed residual discharge. Ultrasonic measurement methods consist of: 1) the use of ultrasonic instruments for the measurement of a representative 'index' velocity used for in situ estimation of mean water velocity and 2) the use of the acoustic Doppler current discharge measurement system to calibrate the index velocity measurement data. Methods used to calibrate (rate) the index velocity to the channel velocity measured using the Acoustic Doppler Current Profiler are the most critical factors affecting the accuracy of net discharge estimation. The index velocity first must be related to mean channel velocity and then used to calculate instantaneous channel discharge. Finally, discharge is low-pass filtered to remove the effects of the tides. An ultrasonic velocity meter discharge-measurement site in a tidally affected region of the Sacramento-San Joaquin Rivers was used to study the accuracy of the index velocity calibration procedure. Calibration data consisting of ultrasonic velocity meter index velocity and concurrent acoustic Doppler discharge measurement data were collected during three time periods. Two sets of data were collected during a spring tide (monthly maximum tidal current) and one set during a neap tide (monthly minimum tidal current). The relative magnitude of instrumental errors, acoustic Doppler discharge measurement errors, and calibration errors were evaluated. Calibration error was found to be the most significant source of error in estimating net discharge.
Using a comprehensive calibration method, net discharge estimates developed from the three
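The rating-then-filtering chain described in this record can be sketched end to end: regress ADCP mean velocity on index velocity, convert to discharge, and low-pass the result. All numbers below are invented, the cross-section is held constant, and a plain moving average stands in for an operational tidal filter such as Godin's:

```python
import numpy as np

# Hypothetical rating data: ADCP mean channel velocity vs. index velocity
v_index = np.array([-1.2, -0.6, 0.0, 0.5, 1.1])       # m/s, ultrasonic meter
v_mean = np.array([-1.05, -0.5, 0.02, 0.46, 0.98])    # m/s, ADCP calibration

a, b = np.polyfit(v_index, v_mean, 1)   # rating: v_mean ~ a*v_index + b
area = 850.0                            # cross-sectional area, m^2 (held fixed)

# Synthetic 15-min index-velocity record: tide plus a small net flow
tt = np.arange(0, 10 * 24 * 4) / 4.0                  # hours, 10 days
vi = 1.0 * np.sin(2 * np.pi * tt / 12.42) + 0.05      # m/s
q_inst = (a * vi + b) * area                          # instantaneous discharge

# Low-pass filter: a ~25 h moving average removes the tidal oscillation
win = 25 * 4
q_net = np.convolve(q_inst, np.ones(win) / win, mode="valid")
```

The filtered series `q_net` hovers near the small net discharge while the much larger tidal signal is suppressed, which is why the calibration (rating) error dominates the net-discharge error budget.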

  15. MIDAS robust trend estimator for accurate GPS station velocities without step detection

    NASA Astrophysics Data System (ADS)

    Blewitt, Geoffrey; Kreemer, Corné; Hammond, William C.; Gazeaux, Julien

    2016-03-01

    Automatic estimation of velocities from GPS coordinate time series is becoming required to cope with the exponentially increasing flood of available data, but problems detectable to the human eye are often overlooked. This motivates us to find an automatic and accurate estimator of trend that is resistant to common problems such as step discontinuities, outliers, seasonality, skewness, and heteroscedasticity. Developed here, Median Interannual Difference Adjusted for Skewness (MIDAS) is a variant of the Theil-Sen median trend estimator, for which the ordinary version is the median of slopes vij = (xj-xi)/(tj-ti) computed between all data pairs i > j. For normally distributed data, Theil-Sen and least squares trend estimates are statistically identical, but unlike least squares, Theil-Sen is resistant to undetected data problems. To mitigate both seasonality and step discontinuities, MIDAS selects data pairs separated by 1 year. This condition is relaxed for time series with gaps so that all data are used. Slopes from data pairs spanning a step function produce one-sided outliers that can bias the median. To reduce bias, MIDAS removes outliers and recomputes the median. MIDAS also computes a robust and realistic estimate of trend uncertainty. Statistical tests using GPS data in the rigid North American plate interior show ±0.23 mm/yr root-mean-square (RMS) accuracy in horizontal velocity. In blind tests using synthetic data, MIDAS velocities have an RMS accuracy of ±0.33 mm/yr horizontal, ±1.1 mm/yr up, with a 5th percentile range smaller than all 20 automatic estimators tested. Considering its general nature, MIDAS has the potential for broader application in the geosciences.

  16. MIDAS robust trend estimator for accurate GPS station velocities without step detection

    PubMed Central

    Blewitt, Geoffrey; Kreemer, Corné; Hammond, William C.; Gazeaux, Julien

    2016-01-01

    Automatic estimation of velocities from GPS coordinate time series is becoming required to cope with the exponentially increasing flood of available data, but problems detectable to the human eye are often overlooked. This motivates us to find an automatic and accurate estimator of trend that is resistant to common problems such as step discontinuities, outliers, seasonality, skewness, and heteroscedasticity. Developed here, Median Interannual Difference Adjusted for Skewness (MIDAS) is a variant of the Theil‐Sen median trend estimator, for which the ordinary version is the median of slopes vij = (xj–xi)/(tj–ti) computed between all data pairs i > j. For normally distributed data, Theil‐Sen and least squares trend estimates are statistically identical, but unlike least squares, Theil‐Sen is resistant to undetected data problems. To mitigate both seasonality and step discontinuities, MIDAS selects data pairs separated by 1 year. This condition is relaxed for time series with gaps so that all data are used. Slopes from data pairs spanning a step function produce one‐sided outliers that can bias the median. To reduce bias, MIDAS removes outliers and recomputes the median. MIDAS also computes a robust and realistic estimate of trend uncertainty. Statistical tests using GPS data in the rigid North American plate interior show ±0.23 mm/yr root‐mean‐square (RMS) accuracy in horizontal velocity. In blind tests using synthetic data, MIDAS velocities have an RMS accuracy of ±0.33 mm/yr horizontal, ±1.1 mm/yr up, with a 5th percentile range smaller than all 20 automatic estimators tested. Considering its general nature, MIDAS has the potential for broader application in the geosciences. PMID:27668140

  17. Reconstruction of financial networks for robust estimation of systemic risk

    NASA Astrophysics Data System (ADS)

    Mastromatteo, Iacopo; Zarinelli, Elia; Marsili, Matteo

    2012-03-01

    In this paper we estimate the propagation of liquidity shocks through interbank markets when the information about the underlying credit network is incomplete. We show that techniques such as maximum entropy currently used to reconstruct credit networks severely underestimate the risk of contagion by assuming a trivial (fully connected) topology, a type of network structure which can be very different from the one empirically observed. We propose an efficient message-passing algorithm to explore the space of possible network structures and show that a correct estimation of the network degree of connectedness leads to more reliable estimations for systemic risk. Such an algorithm is also able to produce maximally fragile structures, providing a practical upper bound for the risk of contagion when the actual network structure is unknown. We test our algorithm on ensembles of synthetic data encoding some features of real financial networks (sparsity and heterogeneity), finding that more accurate estimations of risk can be achieved. Finally we find that this algorithm can be used to control the amount of information that regulators need to require from banks in order to sufficiently constrain the reconstruction of financial networks.
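For context, the maximum-entropy baseline the authors criticize can be sketched in a few lines: iterative proportional fitting (RAS) spreads each bank's totals over all counterparties, producing exactly the dense topology that understates contagion. The message-passing sampler itself is beyond a short sketch, and the balance-sheet numbers below are invented:

```python
import numpy as np

def ipf_reconstruct(assets, liabilities, iters=200):
    """Maximum-entropy-style reconstruction of an interbank exposure
    matrix from its marginals via iterative proportional fitting (RAS).
    The result is dense (fully connected), the artefact criticized above."""
    W = np.outer(assets, liabilities).astype(float)
    np.fill_diagonal(W, 0.0)                         # no self-lending
    for _ in range(iters):
        W *= (assets / W.sum(axis=1))[:, None]       # match row sums
        W *= (liabilities / W.sum(axis=0))[None, :]  # match column sums
    return W

assets = np.array([10.0, 5.0, 3.0, 2.0])       # invented balance sheets
liabilities = np.array([8.0, 6.0, 4.0, 2.0])   # totals must match (20 = 20)
W = ipf_reconstruct(assets, liabilities)
```

Every off-diagonal entry of `W` ends up strictly positive, i.e. every bank lends to every other bank, which is very different from the sparse, heterogeneous networks observed empirically.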

  18. Estimating Terrorist Risk with Possibility Theory

    SciTech Connect

    J.L. Darby

    2004-11-30

    This report summarizes techniques that use possibility theory to estimate the risk of terrorist acts. These techniques were developed under the sponsorship of the Department of Homeland Security (DHS) as part of the National Infrastructure Simulation Analysis Center (NISAC) project. The techniques have been used to estimate the risk of various terrorist scenarios to support NISAC analyses during 2004. The techniques are based on the Logic Evolved Decision (LED) methodology developed over the past few years by Terry Bott and Steve Eisenhawer at LANL. [LED] The LED methodology involves the use of fuzzy sets, possibility theory, and approximate reasoning. LED captures the uncertainty due to vagueness and imprecision that is inherent in the fidelity of the information available for terrorist acts; probability theory cannot capture these uncertainties. This report does not address the philosophy supporting the development of nonprobabilistic approaches, and it does not discuss possibility theory in detail. The references provide a detailed discussion of these subjects. [Shafer] [Klir and Yuan] [Dubois and Prade] Suffice to say that these approaches were developed to address types of uncertainty that cannot be addressed by a probability measure. An earlier report discussed in detail the problems with using a probability measure to evaluate terrorist risk. [Darby Methodology]. Two related techniques are discussed in this report: (1) a numerical technique, and (2) a linguistic technique. The numerical technique uses traditional possibility theory applied to crisp sets, while the linguistic technique applies possibility theory to fuzzy sets. Both of these techniques as applied to terrorist risk for NISAC applications are implemented in software called PossibleRisk. The techniques implemented in PossibleRisk were developed specifically for use in estimating terrorist risk for the NISAC program. The LEDTools code can be used to perform the same linguistic evaluation as

  19. Almost efficient estimation of relative risk regression

    PubMed Central

    Fitzmaurice, Garrett M.; Lipsitz, Stuart R.; Arriaga, Alex; Sinha, Debajyoti; Greenberg, Caprice; Gawande, Atul A.

    2014-01-01

    Relative risks (RRs) are often considered the preferred measures of association in prospective studies, especially when the binary outcome of interest is common. In particular, many researchers regard RRs to be more intuitively interpretable than odds ratios. Although RR regression is a special case of generalized linear models, specifically with a log link function for the binomial (or Bernoulli) outcome, the resulting log-binomial regression does not respect the natural parameter constraints. Because log-binomial regression does not ensure that predicted probabilities are mapped to the [0,1] range, maximum likelihood (ML) estimation is often subject to numerical instability that leads to convergence problems. To circumvent these problems, a number of alternative approaches for estimating RR regression parameters have been proposed. One approach that has been widely studied is the use of Poisson regression estimating equations. The estimating equations for Poisson regression yield consistent, albeit inefficient, estimators of the RR regression parameters. We consider the relative efficiency of the Poisson regression estimator and develop an alternative, almost efficient estimator for the RR regression parameters. The proposed method uses near-optimal weights based on a Maclaurin series (Taylor series expanded around zero) approximation to the true Bernoulli or binomial weight function. This yields an almost efficient estimator while avoiding convergence problems. We examine the asymptotic relative efficiency of the proposed estimator for an increase in the number of terms in the series. Using simulations, we demonstrate the potential for convergence problems with standard ML estimation of the log-binomial regression model and illustrate how this is overcome using the proposed estimator. We apply the proposed estimator to a study of predictors of pre-operative use of beta blockers among patients undergoing colorectal surgery after diagnosis of colon cancer. PMID
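The Poisson estimating-equations approach mentioned above is easy to sketch: solve the Poisson score equations for a binary outcome and exponentiate the exposure coefficient to get the RR. This is the consistent-but-inefficient baseline, not the authors' Maclaurin-weighted estimator, and the data are simulated:

```python
import numpy as np

def poisson_rr(X, y, iters=25):
    """Log-RR regression for a binary outcome via Poisson estimating
    equations (consistent, though not fully efficient)."""
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        mu = np.exp(X @ beta)
        H = X.T @ (mu[:, None] * X)                  # expected information
        beta += np.linalg.solve(H, X.T @ (y - mu))   # Newton step on the score
    return beta

rng = np.random.default_rng(1)
n = 200_000
x = rng.integers(0, 2, n)                  # binary exposure
p = 0.2 * np.exp(np.log(1.5) * x)          # baseline risk 0.2, true RR 1.5
y = (rng.uniform(size=n) < p).astype(float)

X = np.c_[np.ones(n), x]
beta = poisson_rr(X, y)
rr_hat = np.exp(beta[1])
```

Unlike log-binomial maximum likelihood, these score equations impose no [0,1] constraint on the fitted probabilities, so the iteration converges without the numerical instability the abstract describes.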

  20. Accurate relative location estimates for the North Korean nuclear tests using empirical slowness corrections

    NASA Astrophysics Data System (ADS)

    Gibbons, S. J.; Pabian, F.; Näsholm, S. P.; Kværna, T.; Mykkeltveit, S.

    2017-01-01

    velocity gradients reduce the residuals, the relative location uncertainties and the sensitivity to the combination of stations used. The traveltime gradients appear to be overestimated for the regional phases, and teleseismic relative location estimates are likely to be more accurate despite an apparent lower precision. Calibrations for regional phases are essential given that smaller magnitude events are likely not to be recorded teleseismically. We discuss the implications for the absolute event locations. Placing the 2006 event under a local maximum of overburden at 41.293°N, 129.105°E would imply a location of 41.299°N, 129.075°E for the January 2016 event, providing almost optimal overburden for the later four events.

  1. Accurate Relative Location Estimates for the North Korean Nuclear Tests Using Empirical Slowness Corrections

    NASA Astrophysics Data System (ADS)

    Gibbons, S. J.; Pabian, F.; Näsholm, S. P.; Kværna', T.; Mykkeltveit, S.

    2016-10-01

    modified velocity gradients reduce the residuals, the relative location uncertainties, and the sensitivity to the combination of stations used. The traveltime gradients appear to be overestimated for the regional phases, and teleseismic relative location estimates are likely to be more accurate despite an apparent lower precision. Calibrations for regional phases are essential given that smaller magnitude events are likely not to be recorded teleseismically. We discuss the implications for the absolute event locations. Placing the 2006 event under a local maximum of overburden at 41.293°N, 129.105°E would imply a location of 41.299°N, 129.075°E for the January 2016 event, providing almost optimal overburden for the later four events.

  2. Simplified risk score models accurately predict the risk of major in-hospital complications following percutaneous coronary intervention.

    PubMed

    Resnic, F S; Ohno-Machado, L; Selwyn, A; Simon, D I; Popma, J J

    2001-07-01

    The objectives of this analysis were to develop and validate simplified risk score models for predicting the risk of major in-hospital complications after percutaneous coronary intervention (PCI) in the era of widespread stenting and use of glycoprotein IIb/IIIa antagonists. We then sought to compare the performance of these simplified models with those of full logistic regression and neural network models. From January 1, 1997 to December 31, 1999, data were collected on 4,264 consecutive interventional procedures at a single center. Risk score models were derived from multiple logistic regression models using the first 2,804 cases and then validated on the final 1,460 cases. The area under the receiver operating characteristic (ROC) curve for the risk score model that predicted death was 0.86 compared with 0.85 for the multiple logistic model and 0.83 for the neural network model (validation set). For the combined end points of death, myocardial infarction, or bypass surgery, the corresponding areas under the ROC curves were 0.74, 0.78, and 0.81, respectively. Previously identified risk factors were confirmed in this analysis. The use of stents was associated with a decreased risk of in-hospital complications. Thus, risk score models can accurately predict the risk of major in-hospital complications after PCI. Their discriminatory power is comparable to those of logistic models and neural network models. Accurate bedside risk stratification may be achieved with these simple models.
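The discriminatory power reported above is the area under the ROC curve, which can be computed directly from scores and outcomes via the rank (Mann-Whitney) identity. A self-contained sketch with made-up risk scores:

```python
import numpy as np

def roc_auc(scores, labels):
    """Area under the ROC curve via the rank-sum (Mann-Whitney) identity."""
    scores = np.asarray(scores, float)
    labels = np.asarray(labels)
    pos, neg = scores[labels == 1], scores[labels == 0]
    gt = (pos[:, None] > neg[None, :]).sum()      # concordant pairs
    ties = (pos[:, None] == neg[None, :]).sum()   # ties count one half
    return (gt + 0.5 * ties) / (len(pos) * len(neg))

# Hypothetical integer risk-score points for 8 patients, 3 with the outcome
scores = [0, 1, 1, 2, 3, 4, 5, 7]
labels = [0, 0, 0, 0, 1, 0, 1, 1]
auc = roc_auc(scores, labels)   # 14/15 ~ 0.93
```

An AUC near 0.86, as in the record above, means a randomly chosen patient who died was assigned a higher score than a randomly chosen survivor about 86% of the time.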

  3. Painfree and accurate Bayesian estimation of psychometric functions for (potentially) overdispersed data.

    PubMed

    Schütt, Heiko H; Harmeling, Stefan; Macke, Jakob H; Wichmann, Felix A

    2016-05-01

    The psychometric function describes how an experimental variable, such as stimulus strength, influences the behaviour of an observer. Estimation of psychometric functions from experimental data plays a central role in fields such as psychophysics, experimental psychology and the behavioural neurosciences. Experimental data may exhibit substantial overdispersion, which may result from non-stationarity in the behaviour of observers. Here we extend the standard binomial model, which is typically used for psychometric function estimation, to a beta-binomial model. We show that the use of the beta-binomial model makes it possible to determine accurate credible intervals even in data which exhibit substantial overdispersion. This goes beyond classical measures for overdispersion, such as goodness-of-fit statistics, which can detect overdispersion but provide no method for correct inference on overdispersed data. We use Bayesian inference methods for estimating the posterior distribution of the parameters of the psychometric function. Unlike previous Bayesian psychometric inference methods, our software implementation, psignifit 4, performs numerical integration of the posterior within automatically determined bounds. This avoids the use of Markov chain Monte Carlo (MCMC) methods, which typically require expert knowledge. Extensive numerical tests show the validity of the approach, and we discuss implications of overdispersion for experimental design. A comprehensive MATLAB toolbox implementing the method is freely available; a python implementation providing the basic capabilities is also available.
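The core of the beta-binomial extension is its likelihood: a binomial whose success probability is itself beta-distributed, which inflates the variance and absorbs overdispersion. A minimal sketch of the log-pmf in the beta-function form, independent of the psignifit 4 implementation and with arbitrary illustrative parameters:

```python
import numpy as np
from scipy.special import betaln, gammaln

def betabinom_logpmf(k, n, a, b):
    """Log-pmf of the beta-binomial distribution."""
    log_comb = gammaln(n + 1) - gammaln(k + 1) - gammaln(n - k + 1)
    return log_comb + betaln(k + a, n - k + b) - betaln(a, b)

n, a, b = 40, 6.0, 4.0                   # illustrative values
k = np.arange(n + 1)
pmf = np.exp(betabinom_logpmf(k, n, a, b))
mean = (pmf * k).sum()                   # n*a/(a+b) = 24
var = (pmf * (k - mean) ** 2).sum()      # exceeds binomial n*p*(1-p) = 9.6
```

The variance exceeding the binomial's at the same mean is precisely the extra slack that keeps credible intervals honest when trial blocks are non-stationary.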

  4. Accurate estimation of the RMS emittance from single current amplifier data

    SciTech Connect

    Stockli, Martin P.; Welton, R.F.; Keller, R.; Letchford, A.P.; Thomae, R.W.; Thomason, J.W.G.

    2002-05-31

    This paper presents the SCUBEEx rms emittance analysis, a self-consistent, unbiased elliptical exclusion method, which combines traditional data-reduction methods with statistical methods to obtain accurate estimates for the rms emittance. Rather than considering individual data, the method tracks the average current density outside a well-selected, variable boundary to separate the measured beam halo from the background. The average outside current density is assumed to be part of a uniform background and not part of the particle beam. Therefore the average outside current is subtracted from the data before evaluating the rms emittance within the boundary. As the boundary area is increased, the average outside current and the inside rms emittance form plateaus when all data containing part of the particle beam are inside the boundary. These plateaus mark the smallest acceptable exclusion boundary and provide unbiased estimates for the average background and the rms emittance. Small, trendless variations within the plateaus allow for determining the uncertainties of the estimates caused by variations of the measured background outside the smallest acceptable exclusion boundary. The robustness of the method is established with complementary variations of the exclusion boundary. This paper presents a detailed comparison between traditional data reduction methods and SCUBEEx by analyzing two complementary sets of emittance data obtained with a Lawrence Berkeley National Laboratory and an ISIS H- ion source.

  5. Accurate estimation of human body orientation from RGB-D sensors.

    PubMed

    Liu, Wu; Zhang, Yongdong; Tang, Sheng; Tang, Jinhui; Hong, Richang; Li, Jintao

    2013-10-01

    Accurate estimation of human body orientation can significantly enhance the analysis of human behavior, which is a fundamental task in the field of computer vision. However, existing orientation estimation methods cannot handle the various body poses and appearances. In this paper, we propose an innovative RGB-D-based orientation estimation method to address these challenges. By utilizing the RGB-D information, which can be acquired in real time by RGB-D sensors, our method is robust to cluttered environments, illumination change and partial occlusions. Specifically, efficient static and motion cue extraction methods are proposed based on the RGB-D superpixels to reduce the noise of depth data. Since it is hard to discriminate all 360° orientations using static cues or motion cues independently, we propose to utilize a dynamic Bayesian network system (DBNS) to effectively exploit the complementary nature of both static and motion cues. In order to verify our proposed method, we build a RGB-D-based human body orientation dataset that covers a wide diversity of poses and appearances. Our intensive experimental evaluations on this dataset demonstrate the effectiveness and efficiency of the proposed method.

  6. Accurate estimation of motion blur parameters in noisy remote sensing image

    NASA Astrophysics Data System (ADS)

    Shi, Xueyan; Wang, Lin; Shao, Xiaopeng; Wang, Huilin; Tao, Zhong

    2015-05-01

    The relative motion between a remote sensing satellite sensor and objects is one of the most common reasons for remote sensing image degradation. It seriously weakens image data interpretation and information extraction. In practice, the point spread function (PSF) must be estimated first for image restoration, and identifying the motion blur direction and length accurately is crucial for deriving the PSF and restoring the image with precision. In general, the regular light-and-dark stripes in the spectrum can be employed to obtain these parameters using the Radon transform. However, the serious noise present in actual remote sensing images often renders the stripes indistinct, making the parameters difficult to calculate and the resulting error relatively large. In this paper, an improved motion blur parameter identification method for noisy remote sensing images is proposed to solve this problem. The spectrum characteristic of noisy remote sensing images is analyzed first. An interactive image segmentation method based on graph theory, called GrabCut, is adopted to effectively extract the edge of the light center in the spectrum. The motion blur direction is estimated by applying the Radon transform to the segmentation result. To reduce random error, a method based on whole-column statistics is used when calculating the blur length. Finally, the Lucy-Richardson algorithm is applied to restore remote sensing images of the moon after estimating the blur parameters. The experimental results verify the effectiveness and robustness of our algorithm.
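The link between blur length and the spectrum's dark stripes can be illustrated in one dimension: a length-L box blur has spectral minima near multiples of N/L, so the first minimum reveals the length. This generic illustration stands in for the paper's GrabCut-plus-Radon pipeline, which additionally recovers the blur direction from 2-D imagery:

```python
import numpy as np

# A length-L horizontal box blur has spectral minima near k = m*N/L,
# so the first minimum of |H(k)| reveals the blur length
N, L = 256, 9
psf = np.zeros(N)
psf[:L] = 1.0 / L                 # 1-D motion-blur kernel

H = np.abs(np.fft.rfft(psf))

k0 = 1                            # locate the first local minimum
while not (H[k0] < H[k0 - 1] and H[k0] < H[k0 + 1]):
    k0 += 1

L_hat = N / k0                    # 256/28, i.e. about 9.1
```

In real imagery, noise fills in these minima, which is exactly why the paper resorts to segmentation and column statistics before reading off the parameters.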

  7. Estimated Autism Risk and Older Reproductive Age

    PubMed Central

    King, Marissa D.; Fountain, Christine; Dakhlallah, Diana

    2009-01-01

    Objectives. We sought to estimate the risk for autism associated with maternal and paternal age across successive birth cohorts. Methods. We linked birth records and autism diagnostic records from the California Department of Developmental Services for children born in California between 1992 and 2000 to calculate the risk associated with maternal and paternal age for each birth cohort as well as for the pooled data. Results. The categorical risks associated with maternal age over 40 years ranged from a high of 1.84 (95% confidence interval [CI] = 1.37, 2.47) to a low of 1.27 (95% CI = 0.95, 1.69). The risk associated with paternal age ranged from 1.29 (95% CI = 1.03, 1.6) to 1.71 (95% CI = 1.41, 2.08). Conclusions. Pooling data across multiple birth cohorts inflates the risk associated with paternal age. Analyses that do not suffer from problems produced by pooling across birth cohorts demonstrated that advanced maternal age, rather than paternal age, may pose greater risk. Future research examining parental age as a risk factor must be careful to avoid the paradoxes that can arise from pooling data, particularly during periods of social demographic change. PMID:19608957

  8. Risk Estimation Methodology for Launch Accidents.

    SciTech Connect

    Clayton, Daniel James; Lipinski, Ronald J.; Bechtel, Ryan D.

    2014-02-01

    As compact and lightweight power sources with reliable, long lives, Radioisotope Power Systems (RPSs) have made space missions to explore the solar system possible. Due to the hazardous material that can be released during a launch accident, the potential health risk of an accident must be quantified, so that appropriate launch approval decisions can be made. One part of the risk estimation involves modeling the response of the RPS to potential accident environments. Due to the complexity of modeling the full RPS response deterministically on dynamic variables, the evaluation is performed in a stochastic manner with a Monte Carlo simulation. The potential consequences can be determined by modeling the transport of the hazardous material in the environment and in human biological pathways. The consequence analysis results are summed and weighted by appropriate likelihood values to give a collection of probabilistic results for the estimation of the potential health risk. This information is used to guide RPS designs, spacecraft designs, mission architecture, or launch procedures to potentially reduce the risk, as well as to inform decision makers of the potential health risks resulting from the use of RPSs for space missions.
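The likelihood-weighted Monte Carlo summation described here reduces, in skeleton form, to: sample the stochastic response per accident scenario, convert to a consequence, and weight by scenario probability. Everything below (scenario list, probabilities, release model, pathway factor) is invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Invented accident scenarios: (likelihood, mean release fraction)
scenarios = {
    "on-pad blast": (1e-3, 0.02),
    "ascent abort": (5e-4, 0.05),
    "reentry":      (1e-4, 0.10),
}

expected_consequence = 0.0
for name, (p_acc, mean_release) in scenarios.items():
    # Stochastic RPS response: lognormal release fraction around the mean
    release = rng.lognormal(np.log(mean_release), 0.5, n)
    dose = 4.0 * release          # toy transport/pathway conversion factor
    expected_consequence += p_acc * dose.mean()
```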

  9. Accurate and Robust Attitude Estimation Using MEMS Gyroscopes and a Monocular Camera

    NASA Astrophysics Data System (ADS)

    Kobori, Norimasa; Deguchi, Daisuke; Takahashi, Tomokazu; Ide, Ichiro; Murase, Hiroshi

    In order to estimate accurate rotations of mobile robots and vehicles, we propose a hybrid system which combines a low-cost monocular camera with gyro sensors. Gyro sensors have drift errors that accumulate over time. On the other hand, a camera cannot obtain the rotation continuously when feature points cannot be extracted from images, although its accuracy is better than that of gyro sensors. To solve these problems we propose a method for combining these sensors based on an Extended Kalman Filter. The errors of the gyro sensors are corrected by referring to the rotations obtained from the camera. In addition, by using a reliability judgment of the camera rotations and devising the state value of the Extended Kalman Filter, the proposed method performs well even when the rotation is not continuously observable from the camera. Experimental results showed the effectiveness of the proposed method.
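A drastically simplified version of the fusion idea: a two-state (angle, gyro-bias) Kalman filter, linear rather than extended, with an ideal camera angle arriving every 50 steps standing in for the vision measurements. All constants are invented:

```python
import numpy as np

dt, n = 0.01, 2000
true_rate = 0.5                 # rad/s, constant rotation
bias = 0.05                     # rad/s, gyro drift source

gyro = true_rate + bias         # biased (noise-free) gyro reading
theta_true = true_rate * dt * np.arange(n)

# State [angle, gyro bias]; camera supplies an angle fix every 50 steps
x = np.zeros(2)
P = np.eye(2)
F = np.array([[1.0, -dt], [0.0, 1.0]])
Q = np.diag([1e-6, 1e-8])
H = np.array([[1.0, 0.0]])
R = np.array([[1e-6]])

for k in range(1, n):
    x = np.array([x[0] + (gyro - x[1]) * dt, x[1]])   # integrate gyro
    P = F @ P @ F.T + Q
    if k % 50 == 0:                                   # camera rotation available
        z = theta_true[k]                             # ideal camera angle
        S = H @ P @ H.T + R
        K = P @ H.T / S
        x = x + (K * (z - x[0])).ravel()
        P = (np.eye(2) - K @ H) @ P

raw_err = abs(gyro * dt * (n - 1) - theta_true[-1])   # uncorrected drift
kf_err = abs(x[0] - theta_true[-1])
```

Because the bias is part of the state, the intermittent camera fixes not only reset the angle but also let the filter learn and cancel the drift between fixes.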

  10. Two-wavelength interferometry: extended range and accurate optical path difference analytical estimator.

    PubMed

    Houairi, Kamel; Cassaing, Frédéric

    2009-12-01

    Two-wavelength interferometry combines measurements at two wavelengths λ1 and λ2 in order to increase the unambiguous range (UR) for the measurement of an optical path difference. With the usual algorithm, the UR is equal to the synthetic wavelength Λ = λ1λ2/|λ1 - λ2|, and the accuracy is a fraction of Λ. We propose here a new analytical algorithm based on arithmetic properties, allowing estimation of the absolute fringe order of interference in a noniterative way. This algorithm has nice properties compared with the usual algorithm: it is at least as accurate as the most accurate measurement at one wavelength, whereas the UR is extended to several times the synthetic wavelength. The analysis presented shows how the actual UR depends on the wavelengths and different sources of error. The simulations presented are confirmed by experimental results, showing that the new algorithm has enabled us to reach an UR of 17.3 µm, much larger than the synthetic wavelength, which is only Λ = 2.2 µm. Applications to metrology and fringe tracking are discussed.
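The synthetic-wavelength relation quoted above is easy to check numerically, along with the classic coarse OPD estimate whose UR equals Λ (the paper's algorithm extends beyond this limit). The wavelengths here are made up:

```python
import math

# Synthetic wavelength for two-wavelength interferometry (hypothetical values)
l1, l2 = 1.5, 1.6                          # wavelengths, µm
synthetic = l1 * l2 / abs(l1 - l2)         # Λ = λ1·λ2/|λ1 - λ2| = 24.0 µm

# Classic coarse OPD estimate from the wrapped phase difference
opd_true = 5.3                             # µm, inside the UR of Λ
phi1 = (2 * math.pi * opd_true / l1) % (2 * math.pi)
phi2 = (2 * math.pi * opd_true / l2) % (2 * math.pi)
opd_coarse = synthetic * ((phi1 - phi2) % (2 * math.pi)) / (2 * math.pi)
```

The wrapped phase difference behaves like a single measurement at the much longer wavelength Λ, which is what buys the extended unambiguous range.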

  11. A Simple yet Accurate Method for the Estimation of the Biovolume of Planktonic Microorganisms

    PubMed Central

    2016-01-01

    Determining the biomass of microbial plankton is central to the study of fluxes of energy and materials in aquatic ecosystems. This is typically accomplished by applying proper volume-to-carbon conversion factors to group-specific abundances and biovolumes. A critical step in this approach is the accurate estimation of biovolume from two-dimensional (2D) data such as those available through conventional microscopy techniques or flow-through imaging systems. This paper describes a simple yet accurate method for the assessment of the biovolume of planktonic microorganisms, which works with any image analysis system allowing for the measurement of linear distances and the estimation of the cross sectional area of an object from a 2D digital image. The proposed method is based on Archimedes’ principle about the relationship between the volume of a sphere and that of a cylinder in which the sphere is inscribed, plus a coefficient of ‘unellipticity’ introduced here. Validation and careful evaluation of the method are provided using a variety of approaches. The new method proved to be highly precise with all convex shapes characterised by approximate rotational symmetry, and combining it with an existing method specific for highly concave or branched shapes allows covering the great majority of cases with good reliability. Thanks to its accuracy, consistency, and low resources demand, the new method can conveniently be used in substitution of any extant method designed for convex shapes, and can readily be coupled with automated cell imaging technologies, including state-of-the-art flow-through imaging devices. PMID:27195667
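The Archimedes relationship at the heart of the method can be stated compactly: for a spheroid, volume = (2/3) x silhouette area x minor axis, which reduces to the exact 2/3 sphere-to-cylinder ratio when the axes coincide. The paper's 'unellipticity' coefficient is omitted here and the axes are arbitrary:

```python
import math

def spheroid_volume(silhouette_area, minor_axis):
    """Biovolume from a 2-D silhouette assuming a spheroid: Archimedes'
    2/3 ratio between a solid of revolution and its bounding cylinder."""
    return (2.0 / 3.0) * silhouette_area * minor_axis

# Prolate spheroid, semi-axes a=5, b=2 (arbitrary units)
a, b = 5.0, 2.0
area = math.pi * a * b                       # elliptical silhouette
v = spheroid_volume(area, 2 * b)
v_exact = (4.0 / 3.0) * math.pi * a * b * b  # closed-form spheroid volume
```

For a true spheroid the 2/3 rule is exact; the unellipticity coefficient is what corrects it for the convex-but-not-elliptical outlines common in plankton imagery.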

  12. Accurate biopsy-needle depth estimation in limited-angle tomography using multi-view geometry

    NASA Astrophysics Data System (ADS)

    van der Sommen, Fons; Zinger, Sveta; de With, Peter H. N.

    2016-03-01

    Recently, compressed-sensing based algorithms have enabled volume reconstruction from projection images acquired over a relatively small angle (θ < 20°). These methods enable accurate depth estimation of surgical tools with respect to anatomical structures. However, they are computationally expensive and time consuming, rendering them unattractive for image-guided interventions. We propose an alternative approach for depth estimation of biopsy needles during image-guided interventions, in which we split the problem into two parts and solve them independently: needle-depth estimation and volume reconstruction. The complete proposed system consists of the previous two steps, preceded by needle extraction. First, we detect the biopsy needle in the projection images and remove it by interpolation. Next, we exploit epipolar geometry to find point-to-point correspondences in the projection images to triangulate the 3D position of the needle in the volume. Finally, we use the interpolated projection images to reconstruct the local anatomical structures and indicate the position of the needle within this volume. For validation of the algorithm, we have recorded a full CT scan of a phantom with an inserted biopsy needle. The performance of our approach ranges from a median error of 2.94 mm for a distributed viewing angle of 1° down to an error of 0.30 mm for an angle larger than 10°. Based on the results of this initial phantom study, we conclude that multi-view geometry offers an attractive alternative to time-consuming iterative methods for the depth estimation of surgical tools during C-arm-based image-guided interventions.
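
    The triangulation step can be illustrated with the classic midpoint method: each needle detection back-projects to a 3D ray, and the position is taken as the midpoint of the shortest segment between the two rays. This is a generic sketch, not the authors' exact epipolar formulation:

```python
def _dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def triangulate_midpoint(o1, d1, o2, d2):
    """Midpoint of the shortest segment between rays o1 + s*d1 and o2 + t*d2
    (origins and direction vectors given as 3-element sequences)."""
    w0 = [x - y for x, y in zip(o1, o2)]
    a, b, c = _dot(d1, d1), _dot(d1, d2), _dot(d2, d2)
    d, e = _dot(d1, w0), _dot(d2, w0)
    denom = a * c - b * b  # zero only when the rays are parallel
    s = (b * e - c * d) / denom
    t = (a * e - b * d) / denom
    p1 = [o + s * v for o, v in zip(o1, d1)]
    p2 = [o + t * v for o, v in zip(o2, d2)]
    return [(x + y) / 2.0 for x, y in zip(p1, p2)]

# Two rays that intersect exactly at (1, 1, 1):
print(triangulate_midpoint([0, 0, 0], [1, 1, 1], [1, 0, 0], [0, 1, 1]))
```

    With noisy detections the rays no longer intersect, and the midpoint is the natural least-squares compromise between the two views.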

  13. The potential of more accurate InSAR covariance matrix estimation for land cover mapping

    NASA Astrophysics Data System (ADS)

    Jiang, Mi; Yong, Bin; Tian, Xin; Malhotra, Rakesh; Hu, Rui; Li, Zhiwei; Yu, Zhongbo; Zhang, Xinxin

    2017-04-01

    Synthetic aperture radar (SAR) and Interferometric SAR (InSAR) provide both structural and electromagnetic information for the ground surface and therefore have been widely used for land cover classification. However, relatively few studies have developed analyses that investigate SAR datasets over richly textured areas where heterogeneous land covers exist and intermingle over short distances. One of the main difficulties is that the shapes of the structures in a SAR image cannot be represented in detail as mixed pixels are likely to occur when conventional InSAR parameter estimation methods are used. To solve this problem and further extend previous research into remote monitoring of urban environments, we address the use of accurate InSAR covariance matrix estimation to improve the accuracy of land cover mapping. The standard and updated methods were tested using the HH-polarization TerraSAR-X dataset and compared with each other using the random forest classifier. A detailed accuracy assessment compiled for six types of surfaces shows that the updated method outperforms the standard approach by around 9%, with an overall accuracy of 82.46% over areas with rich texture in Zhuhai, China. This paper demonstrates that the accuracy of land cover mapping can benefit from the enhancement of the quality of the observations in addition to classifier selection and multi-source data integration reported in previous studies.

  14. Can student health professionals accurately estimate alcohol content in commonly occurring drinks?

    PubMed Central

    Sinclair, Julia; Searle, Emma

    2016-01-01

    Objectives: Correct identification of alcohol as a contributor to, or comorbidity of, many psychiatric diseases requires health professionals to be competent and confident to take an accurate alcohol history. Being able to estimate (or calculate) the alcohol content in commonly consumed drinks is a prerequisite for quantifying levels of alcohol consumption. The aim of this study was to assess this ability in medical and nursing students. Methods: A cross-sectional survey of 891 medical and nursing students across different years of training was conducted. Students were asked the alcohol content of 10 different alcoholic drinks by seeing a slide of the drink (with picture, volume and percentage of alcohol by volume) for 30 s. Results: Overall, the mean number of correctly estimated drinks (out of the 10 tested) was 2.4, increasing to just over 3 if a 10% margin of error was used. Wine and premium strength beers were underestimated by over 50% of students. Those who drank alcohol themselves, or who were further on in their clinical training, did better on the task, but overall the levels remained low. Conclusions: Knowledge of, or the ability to work out, the alcohol content of commonly consumed drinks is poor, and further research is needed to understand the reasons for this and the impact this may have on the likelihood to undertake screening or initiate treatment. PMID:27536344
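
    For reference, the calculation the students were asked to perform is simple: one UK unit is 10 ml of pure ethanol, so units = volume (ml) × ABV (%) / 1000. A sketch (the example drinks are illustrative, not the ten used in the survey):

```python
def uk_units(volume_ml: float, abv_percent: float) -> float:
    """UK units of alcohol: one unit = 10 ml of pure ethanol."""
    return volume_ml * abv_percent / 1000.0

# 250 ml glass of 13% ABV wine:
print(uk_units(250, 13.0))   # 3.25 units
# 568 ml (pint) of 5.2% ABV premium lager:
print(uk_units(568, 5.2))    # ~2.95 units
```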

  15. Greater contrast in Martian hydrological history from more accurate estimates of paleodischarge

    NASA Astrophysics Data System (ADS)

    Jacobsen, R. E.; Burr, D. M.

    2016-09-01

    Correlative width-discharge relationships from the Missouri River Basin are commonly used to estimate fluvial paleodischarge on Mars. However, hydraulic geometry provides alternative, and causal, width-discharge relationships derived from broader samples of channels, including those in reduced-gravity (submarine) environments. Comparison of these relationships implies that causal relationships from hydraulic geometry should yield more accurate and more precise discharge estimates. Our remote analysis of a Martian-terrestrial analog channel, combined with in situ discharge data, substantiates this implication. Applied to Martian features, these results imply that paleodischarges of interior channels of Noachian-Hesperian (~3.7 Ga) valley networks have been underestimated severalfold, whereas paleodischarges for smaller fluvial deposits of the Late Hesperian-Early Amazonian (~3.0 Ga) have been overestimated. Thus, these new paleodischarges significantly magnify the contrast between early and late Martian hydrologic activity. Width-discharge relationships from hydraulic geometry represent validated tools for quantifying fluvial input near candidate landing sites of upcoming missions.
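
    Hydraulic-geometry relations take the power-law form W = a·Q^b, so a measured channel width converts to a discharge estimate by inversion. The coefficients below are placeholders for illustration, not the calibrated values from the study:

```python
def discharge_from_width(width_m: float, a: float = 4.0, b: float = 0.5) -> float:
    """Invert the hydraulic-geometry relation W = a * Q**b to estimate Q.
    a and b are placeholder coefficients; calibrated values differ by setting."""
    return (width_m / a) ** (1.0 / b)

# A 40 m wide channel with the placeholder coefficients: (40/4)**2 = 100 m^3/s
print(discharge_from_width(40.0))
```

    Because Q scales with roughly the square of width here, a modest error (or bias) in the assumed relation multiplies into a several-fold error in discharge, which is exactly the sensitivity the abstract highlights.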

  16. Ocean Lidar Measurements of Beam Attenuation and a Roadmap to Accurate Phytoplankton Biomass Estimates

    NASA Astrophysics Data System (ADS)

    Hu, Yongxiang; Behrenfeld, Mike; Hostetler, Chris; Pelon, Jacques; Trepte, Charles; Hair, John; Slade, Wayne; Cetinic, Ivona; Vaughan, Mark; Lu, Xiaomei; Zhai, Pengwang; Weimer, Carl; Winker, David; Verhappen, Carolus C.; Butler, Carolyn; Liu, Zhaoyan; Hunt, Bill; Omar, Ali; Rodier, Sharon; Lifermann, Anne; Josset, Damien; Hou, Weilin; MacDonnell, David; Rhew, Ray

    2016-06-01

    Beam attenuation coefficient, c, provides an important optical index of plankton standing stocks, such as phytoplankton biomass and total particulate carbon concentration. Unfortunately, c has proven difficult to quantify through remote sensing. Here, we introduce an innovative approach for estimating c using lidar depolarization measurements and diffuse attenuation coefficients from ocean color products or lidar measurements of Brillouin scattering. The new approach is based on a theoretical formula established from Monte Carlo simulations that links the depolarization ratio of sea water to the ratio of diffuse attenuation Kd and beam attenuation c (i.e., a multiple scattering factor). On July 17, 2014, the CALIPSO satellite was tilted 30° off-nadir for one nighttime orbit in order to minimize ocean surface backscatter and demonstrate the lidar ocean subsurface measurement concept from space. Depolarization ratios of ocean subsurface backscatter are measured accurately. Beam attenuation coefficients computed from the depolarization ratio measurements compare well with empirical estimates from ocean color measurements. We further verify the beam attenuation coefficient retrievals using aircraft-based high spectral resolution lidar (HSRL) data that are collocated with in-water optical measurements.

  17. Impact of microbial count distributions on human health risk estimates.

    PubMed

    Duarte, A S R; Nauta, M J

    2015-02-16

    Quantitative microbiological risk assessment (QMRA) is influenced by the choice of the probability distribution used to describe pathogen concentrations, as this may eventually have a large effect on the distribution of doses at exposure. When fitting a probability distribution to microbial enumeration data, several factors may have an impact on the accuracy of that fit. Analysis of the best statistical fits of different distributions alone does not provide a clear indication of the impact in terms of risk estimates. Thus, in this study we focus on the impact of fitting microbial distributions on risk estimates, at two different concentration scenarios and at a range of prevalence levels. By using five different parametric distributions, we investigate whether different characteristics of a good fit are crucial for an accurate risk estimate. Among the factors studied are the importance of accounting for the Poisson randomness in counts, the difference between treating "true" zeroes as such or as censored below a limit of quantification (LOQ) and the importance of making the correct assumption about the underlying distribution of concentrations. By running a simulation experiment with zero-inflated Poisson-lognormal distributed data and an existing QMRA model from retail to consumer level, it was possible to assess the difference between expected risk and the risk estimated using a lognormal, a zero-inflated lognormal, a Poisson-gamma, a zero-inflated Poisson-gamma and a zero-inflated Poisson-lognormal distribution. We show that the impact of the choice of different probability distributions to describe concentrations at retail on risk estimates is dependent both on concentration and prevalence levels. We also show that the use of an LOQ should be done consciously, especially when zero-inflation is not used. In general, zero-inflation does not necessarily improve the absolute risk estimation, but performance of zero-inflated distributions in QMRA tends to be
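
    The data-generating model used in the simulation experiment, a zero-inflated Poisson-lognormal, can be sketched as follows (parameter values are illustrative; the paper's QMRA model adds a full retail-to-consumer chain on top of counts like these):

```python
import math
import random

def poisson(lam: float, rng: random.Random) -> int:
    """Knuth's Poisson sampler (adequate for moderate lam)."""
    if lam <= 0:
        return 0
    threshold = math.exp(-lam)
    k, p = 0, 1.0
    while p > threshold:
        k += 1
        p *= rng.random()
    return k - 1

def sample_zipln(p_zero: float, mu: float, sigma: float, rng: random.Random) -> int:
    """Zero-inflated Poisson-lognormal count: a 'true zero' with probability
    p_zero, otherwise a Poisson count whose mean is lognormally distributed."""
    if rng.random() < p_zero:
        return 0
    concentration = math.exp(rng.gauss(mu, sigma))  # e.g. CFU per sample
    return poisson(concentration, rng)

rng = random.Random(42)
print([sample_zipln(0.3, 1.0, 0.5, rng) for _ in range(10)])
```

    Fitting a plain lognormal to such counts conflates the structural zeros with low concentrations, which is the distinction the study probes.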

  18. Discrete state model and accurate estimation of loop entropy of RNA secondary structures.

    PubMed

    Zhang, Jian; Lin, Ming; Chen, Rong; Wang, Wei; Liang, Jie

    2008-03-28

    Conformational entropy makes important contribution to the stability and folding of RNA molecule, but it is challenging to either measure or compute conformational entropy associated with long loops. We develop optimized discrete k-state models of RNA backbone based on known RNA structures for computing entropy of loops, which are modeled as self-avoiding walks. To estimate entropy of hairpin, bulge, internal loop, and multibranch loop of long length (up to 50), we develop an efficient sampling method based on the sequential Monte Carlo principle. Our method considers the excluded volume effect. It is general and can be applied to calculating entropy of loops with longer length and arbitrary complexity. For loops of short length, our results are in good agreement with a recent theoretical model and experimental measurement. For long loops, our estimated entropy of hairpin loops is in excellent agreement with the Jacobson-Stockmayer extrapolation model. However, for bulge loops and more complex secondary structures such as internal and multibranch loops, we find that the Jacobson-Stockmayer extrapolation model has large errors. Based on estimated entropy, we have developed empirical formulae for accurate calculation of entropy of long loops in different secondary structures. Our study on the effect of asymmetric size of loops suggests that loop entropy of internal loops is largely determined by the total loop length, and is only marginally affected by the asymmetric size of the two loops. Our finding suggests that the significant asymmetric effects of loop length in internal loops measured by experiments are likely to be partially enthalpic. Our method can be applied to develop improved energy parameters important for studying RNA stability and folding, and for predicting RNA secondary and tertiary structures. The discrete model and the program used to calculate loop entropy can be downloaded at http://gila.bioengr.uic.edu/resources/RNA.html.
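
    The Jacobson-Stockmayer extrapolation that the authors benchmark against has the textbook form ΔS(n) = ΔS(n_ref) − c·R·ln(n/n_ref); a sketch using the classical coefficient c = 1.5 (fitted coefficients and reference lengths vary by loop type):

```python
import math

R = 1.987e-3  # gas constant, kcal/(mol*K)

def js_loop_entropy(n: int, s_ref: float, n_ref: int = 6, coeff: float = 1.5) -> float:
    """Extrapolate loop entropy from a reference loop length n_ref to length n
    using the Jacobson-Stockmayer logarithmic form."""
    return s_ref - coeff * R * math.log(n / n_ref)

# The entropy penalty grows logarithmically with loop length:
print(js_loop_entropy(50, s_ref=-0.010))
```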

  19. Auditory risk estimates for youth target shooting

    PubMed Central

    Meinke, Deanna K.; Murphy, William J.; Finan, Donald S.; Lankford, James E.; Flamme, Gregory A.; Stewart, Michael; Soendergaard, Jacob; Jerome, Trevor W.

    2015-01-01

    Objective To characterize the impulse noise exposure and auditory risk for youth recreational firearm users engaged in outdoor target shooting events. The youth shooting positions are typically standing or sitting at a table, which places the firearm closer to the ground or reflective surface when compared to adult shooters. Design Acoustic characteristics were examined and the auditory risk estimates were evaluated using contemporary damage-risk criteria for unprotected adult listeners and the 120-dB peak limit suggested by the World Health Organization (1999) for children. Study sample Impulses were generated by 26 firearm/ammunition configurations representing rifles, shotguns, and pistols used by youth. Measurements were obtained relative to a youth shooter’s left ear. Results All firearms generated peak levels that exceeded the 120 dB peak limit suggested by the WHO for children. In general, shooting from the seated position over a tabletop increases the peak levels and LAeq8, and reduces the unprotected maximum permissible exposures (MPEs) for both rifles and pistols. Pistols pose the greatest auditory risk when fired over a tabletop. Conclusion Youth should utilize smaller caliber weapons, preferably from the standing position, and always wear hearing protection whenever engaging in shooting activities to reduce the risk for auditory damage. PMID:24564688

  20. The quantitative estimation of IT-related risk probabilities.

    PubMed

    Herrmann, Andrea

    2013-08-01

    How well can people estimate IT-related risk? Although estimating risk is a fundamental activity in software management and risk is the basis for many decisions, little is known about how well IT-related risk can be estimated at all. Therefore, we executed a risk estimation experiment with 36 participants. They estimated the probabilities of IT-related risks and we investigated the effect of the following factors on the quality of the risk estimation: the estimator's age, work experience in computing, (self-reported) safety awareness and previous experience with this risk, the absolute value of the risk's probability, and the effect of knowing the estimates of the other participants (see: Delphi method). Our main findings are: risk probabilities are difficult to estimate. Younger and inexperienced estimators were not significantly worse than older and more experienced estimators, but the older and more experienced subjects better used the knowledge gained by knowing the other estimators' results. Persons with higher safety awareness tend to overestimate risk probabilities, but can better estimate ordinal ranks of risk probabilities. Previous own experience with a risk leads to an overestimation of its probability (unlike in other fields like medicine or disasters, where experience with a disease leads to more realistic probability estimates and nonexperience to an underestimation).

  1. Relating space radiation environments to risk estimates

    NASA Technical Reports Server (NTRS)

    Curtis, Stanley B.

    1993-01-01

    A number of considerations must go into the process of determining the risk of deleterious effects of space radiation to travelers. Among them are (1) determination of the components of the radiation environment (particle species, fluxes and energy spectra) which the travelers will encounter, (2) determination of the effects of shielding provided by the spacecraft and the bodies of the travelers which modify the incident particle spectra and mix of particles, and (3) determination of relevant biological effects of the radiation in the organs of interest. The latter can then lead to an estimation of risk from a given space scenario. Clearly, the process spans many scientific disciplines from solar and cosmic ray physics to radiation transport theory to the multistage problem of the induction by radiation of initial lesions in living material and their evolution via physical, chemical, and biological processes at the molecular, cellular, and tissue levels to produce the end point of importance.

  2. Accurate optical flow field estimation using mechanical properties of soft tissues

    NASA Astrophysics Data System (ADS)

    Mehrabian, Hatef; Karimi, Hirad; Samani, Abbas

    2009-02-01

    A novel optical flow based technique is presented in this paper to measure the nodal displacements of soft tissue undergoing large deformations. In hyperelasticity imaging, soft tissues may be compressed extensively [1] and the deformation may exceed the number of pixels ordinary optical flow approaches can detect. Furthermore, in most biomedical applications there is a large amount of image information that represent the geometry of the tissue and the number of tissue types present in the organ of interest. Such information is often ignored in applications such as image registration. In this work we incorporate the information pertaining to soft tissue mechanical behavior (Neo-Hookean hyperelastic model is used here) in addition to the tissue geometry before compression into a hierarchical Horn-Schunck optical flow method to overcome this weakness in detecting large deformations. Applying the proposed method to a phantom using several compression levels proved that it yields reasonably accurate displacement fields. Estimated displacement results of this phantom study obtained for displacement fields of 85 pixels/frame and 127 pixels/frame are reported and discussed in this paper.

  3. How accurately can we estimate energetic costs in a marine top predator, the king penguin?

    PubMed

    Halsey, Lewis G; Fahlman, Andreas; Handrich, Yves; Schmidt, Alexander; Woakes, Anthony J; Butler, Patrick J

    2007-01-01

    King penguins (Aptenodytes patagonicus) are one of the greatest consumers of marine resources. However, while their influence on the marine ecosystem is likely to be significant, only an accurate knowledge of their energy demands will indicate their true food requirements. Energy consumption has been estimated for many marine species using the heart rate-rate of oxygen consumption (fH-VO2) technique, and the technique has been applied successfully to answer eco-physiological questions. However, previous studies on the energetics of king penguins, based on developing or applying this technique, have raised a number of issues about the degree of validity of the technique for this species. These include the predictive validity of the present fH-VO2 equations across different seasons and individuals and during different modes of locomotion. In many cases, these issues also apply to other species for which the fH-VO2 technique has been applied. In the present study, the accuracy of three prediction equations for king penguins was investigated based on validity studies and on estimates of VO2 from published, field fH data. The major conclusions from the present study are: (1) in contrast to that for walking, the fH-VO2 relationship for swimming king penguins is not affected by body mass; (2) prediction equation (1), log(VO2) = -0.279 + 1.24·log(fH) + 0.0237·t - 0.0157·log(fH)·t, derived in a previous study, is the most suitable equation presently available for estimating VO2 in king penguins for all locomotory and nutritional states. A number of possible problems associated with producing an fH-VO2 relationship are discussed in the present study. Finally, a statistical method to include easy-to-measure morphometric characteristics, which may improve the accuracy of fH-VO2 prediction equations, is explained.
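
    Restated in code, prediction equation (1) reads as follows; base-10 logarithms are assumed here, and the units of fH and the meaning of the covariate t follow the original paper rather than anything stated in this abstract:

```python
import math

def estimate_log_vo2(fh: float, t: float) -> float:
    """Equation (1): log(VO2) = -0.279 + 1.24*log(fH) + 0.0237*t - 0.0157*log(fH)*t.
    Assumes base-10 logs; fh is heart rate, t is the covariate from the paper."""
    log_fh = math.log10(fh)
    return -0.279 + 1.24 * log_fh + 0.0237 * t - 0.0157 * log_fh * t

# Estimated VO2 rises with heart rate (shown here at t = 0):
print(10 ** estimate_log_vo2(100.0, 0.0))
print(10 ** estimate_log_vo2(200.0, 0.0))
```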

  4. Assessing uncertainty in published risk estimates using ...

    EPA Pesticide Factsheets

    Introduction: The National Research Council recommended quantitative evaluation of uncertainty in effect estimates for risk assessment. This analysis considers uncertainty across model forms and model parameterizations with hexavalent chromium [Cr(VI)] and lung cancer mortality as an example. The objective is to characterize model uncertainty by evaluating estimates across published epidemiologic studies of the same cohort.Methods: This analysis was based on 5 studies analyzing a cohort of 2,357 workers employed from 1950-74 in a chromate production plant in Maryland. Cox and Poisson models were the only model forms considered by study authors to assess the effect of Cr(VI) on lung cancer mortality. All models adjusted for smoking and included a 5-year exposure lag; however, other latency periods and model covariates such as age and race were considered. Published effect estimates were standardized to the same units and normalized by their variances to produce a standardized metric to compare variability within and between model forms. A total of 5 similarly parameterized analyses were considered across model form, and 16 analyses with alternative parameterizations were considered within model form (10 Cox; 6 Poisson). Results: Across Cox and Poisson model forms, adjusted cumulative exposure coefficients (betas) for 5 similar analyses ranged from 2.47 to 4.33 (mean=2.97, σ2=0.63). Within the 10 Cox models, coefficients ranged from 2.53 to 4.42 (mean=3.29, σ2=0.

  5. IMPROVED RISK ESTIMATES FOR CARBON TETRACHLORIDE

    SciTech Connect

    Benson, Janet M.; Springer, David L.

    1999-12-31

    Carbon tetrachloride has been used extensively within the DOE nuclear weapons facilities. Rocky Flats was formerly the largest volume consumer of CCl4 in the United States using 5000 gallons in 1977 alone (Ripple, 1992). At the Hanford site, several hundred thousand gallons of CCl4 were discharged between 1955 and 1973 into underground cribs for storage. Levels of CCl4 in groundwater at highly contaminated sites at the Hanford facility have exceeded the drinking water standard of 5 ppb by several orders of magnitude (Illman, 1993). High levels of CCl4 at these facilities represent a potential health hazard for workers conducting cleanup operations and for surrounding communities. The level of CCl4 cleanup required at these sites and associated costs are driven by current human health risk estimates, which assume that CCl4 is a genotoxic carcinogen. The overall purpose of these studies was to improve the scientific basis for assessing the health risk associated with human exposure to CCl4. Specific research objectives of this project were to: (1) compare the rates of CCl4 metabolism by rats, mice and hamsters in vivo and extrapolate those rates to man based on parallel studies on the metabolism of CCl4 by rat, mouse, hamster and human hepatic microsomes in vitro; (2) using hepatic microsome preparations, determine the role of specific cytochrome P450 isoforms in CCl4-mediated toxicity and the effects of repeated inhalation and ingestion of CCl4 on these isoforms; and (3) evaluate the toxicokinetics of inhaled CCl4 in rats, mice and hamsters. This information has been used to improve the physiologically based pharmacokinetic (PBPK) model for CCl4 originally developed by Paustenbach et al. (1988) and more recently revised by Thrall and Kenny (1996). Another major objective of the project was to provide scientific evidence that CCl4, like chloroform, is a hepatocarcinogen only when exposure results in cell damage, cell killing and regenerative proliferation. In

  6. Crop area estimation based on remotely-sensed data with an accurate but costly subsample

    NASA Technical Reports Server (NTRS)

    Gunst, R. F.

    1983-01-01

    Alternatives to sampling-theory stratified and regression estimators of crop production and timber biomass were examined. An alternative estimator which is viewed as especially promising is the errors-in-variables regression estimator. Investigations established the need for caution with this estimator when the ratio of two error variances is not precisely known.
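
    A minimal concrete instance of an errors-in-variables estimator is Deming regression, in which `delta` is exactly the assumed ratio of the two error variances flagged above; this is a generic sketch, not the specific estimator examined in the study:

```python
import math

def deming_slope(x, y, delta=1.0):
    """Deming (errors-in-variables) regression slope.
    delta = var(errors in y) / var(errors in x); results are sensitive
    to this ratio, which is the caution noted in the abstract."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x) / (n - 1)
    syy = sum((yi - my) ** 2 for yi in y) / (n - 1)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / (n - 1)
    term = syy - delta * sxx
    return (term + math.sqrt(term ** 2 + 4 * delta * sxy ** 2)) / (2 * sxy)

# A noise-free line y = 2x is recovered exactly:
print(deming_slope([1, 2, 3, 4], [2, 4, 6, 8]))  # 2.0
```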

  7. Estimation of health risks from radiation exposures

    SciTech Connect

    Randolph, M.L.

    1983-08-01

    An informal presentation is given of the cancer and genetic risks from exposures to ionizing radiations. The risks from plausible radiation exposures are shown to be comparable to other commonly encountered risks.

  8. Insights on the role of accurate state estimation in coupled model parameter estimation by a conceptual climate model study

    NASA Astrophysics Data System (ADS)

    Yu, Xiaolin; Zhang, Shaoqing; Lin, Xiaopei; Li, Mingkui

    2017-03-01

    The uncertainties in values of coupled model parameters are an important source of model bias that causes model climate drift. The values can be calibrated by a parameter estimation procedure that projects observational information onto model parameters. The signal-to-noise ratio of error covariance between the model state and the parameter being estimated directly determines whether the parameter estimation succeeds or not. With a conceptual climate model that couples the stochastic atmosphere and slow-varying ocean, this study examines the sensitivity of the state-parameter covariance to the accuracy of estimated model states in different model components of a coupled system. Due to the interaction of multiple timescales, the fast-varying atmosphere with a chaotic nature is the major source of the inaccuracy of estimated state-parameter covariance. Thus, enhancing the estimation accuracy of atmospheric states is very important for the success of coupled model parameter estimation, especially for the parameters in the air-sea interaction processes. The impact of chaotic-to-periodic ratio in state variability on parameter estimation is also discussed. This simple model study provides a guideline when real observations are used to optimize model parameters in a coupled general circulation model for improving climate analysis and predictions.
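
    The covariance projection described above can be sketched as a one-step ensemble update: the observation-minus-state innovation is regressed onto the parameter through the sample state-parameter covariance, with the observation-error variance setting the signal-to-noise ratio. This is a generic EnKF-style illustration, not the authors' conceptual model:

```python
def update_parameter(param_ens, state_ens, obs, obs_err_var):
    """Project the innovation onto the parameter via the sample
    state-parameter covariance (ensemble Kalman style)."""
    n = len(param_ens)
    pm = sum(param_ens) / n
    sm = sum(state_ens) / n
    cov_ps = sum((p - pm) * (s - sm) for p, s in zip(param_ens, state_ens)) / (n - 1)
    var_s = sum((s - sm) ** 2 for s in state_ens) / (n - 1)
    gain = cov_ps / (var_s + obs_err_var)  # noisy states shrink this gain
    return [p + gain * (obs - s) for p, s in zip(param_ens, state_ens)]

# Toy model with state = 2 * parameter; a perfect observation of 8
# pins every ensemble member's parameter at 4:
print(update_parameter([1.0, 2.0, 3.0], [2.0, 4.0, 6.0], 8.0, 0.0))
```

    Noisy state estimates inflate `var_s` without adding real signal, shrinking the gain toward zero, which is the mechanism behind the paper's conclusion that accurate atmospheric states are prerequisite to parameter estimation.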

  9. Uncertainty of Calculated Risk Estimates for Secondary Malignancies After Radiotherapy

    SciTech Connect

    Kry, Stephen F. . E-mail: sfkry@mdanderson.org; Followill, David; White, R. Allen; Stovall, Marilyn; Kuban, Deborah A.; Salehpour, Mohammad

    2007-07-15

    Purpose: The significance of risk estimates for fatal secondary malignancies caused by out-of-field radiation exposure remains unresolved because the uncertainty in calculated risk estimates has not been established. This work examines the uncertainty in absolute risk estimates and in the ratio of risk estimates between different treatment modalities. Methods and Materials: Clinically reasonable out-of-field doses and calculated risk estimates were taken from the literature for several prostate treatment modalities, including intensity-modulated radiotherapy (IMRT), and were recalculated using the most recent risk model. The uncertainties in this risk model and uncertainties in the linearity of the dose-response model were considered in generating 90% confidence intervals for the uncertainty in the absolute risk estimates and in the ratio of the risk estimates. Results: The absolute risk estimates of fatal secondary malignancy were associated with very large uncertainties, which precluded distinctions between the risks associated with the different treatment modalities considered. However, a much smaller confidence interval exists for the ratio of risk estimates, and this ratio between different treatment modalities may be statistically significant when there is an effective dose equivalent difference of at least 50%. Such a difference may exist between clinically reasonable treatment options, including 6-MV IMRT versus 18-MV IMRT for prostate therapy. Conclusion: The ratio of the risk between different treatment modalities may be significantly different. Consequently risk models and associated risk estimates may be useful and meaningful for evaluating different treatment options. The calculated risk of secondary malignancy should be considered in the selection of an optimal treatment plan.

  10. Children Can Accurately Monitor and Control Their Number-Line Estimation Performance

    ERIC Educational Resources Information Center

    Wall, Jenna L.; Thompson, Clarissa A.; Dunlosky, John; Merriman, William E.

    2016-01-01

    Accurate monitoring and control are essential for effective self-regulated learning. These metacognitive abilities may be particularly important for developing math skills, such as when children are deciding whether a math task is difficult or whether they made a mistake on a particular item. The present experiments investigate children's ability…

  11. Bi-fluorescence imaging for estimating accurately the nuclear condition of Rhizoctonia spp.

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Aims: To simplify the determination of the nuclear condition of the pathogenic Rhizoctonia, which currently needs to be performed either using two fluorescent dyes, thus is more costly and time-consuming, or using only one fluorescent dye, and thus less accurate. Methods and Results: A red primary ...

  12. Bayesian parameter estimation of a k-ε model for accurate jet-in-crossflow simulations

    SciTech Connect

    Ray, Jaideep; Lefantzi, Sophia; Arunajatesan, Srinivasan; Dechant, Lawrence

    2016-05-31

    Reynolds-averaged Navier–Stokes models are not very accurate for high-Reynolds-number compressible jet-in-crossflow interactions. The inaccuracy arises from the use of inappropriate model parameters and model-form errors in the Reynolds-averaged Navier–Stokes model. In this study, the hypothesis is pursued that Reynolds-averaged Navier–Stokes predictions can be significantly improved by using parameters inferred from experimental measurements of a supersonic jet interacting with a transonic crossflow.

  13. Accurate state estimation for a hydraulic actuator via a SDRE nonlinear filter

    NASA Astrophysics Data System (ADS)

    Strano, Salvatore; Terzo, Mario

    2016-06-01

    The state estimation in hydraulic actuators is a fundamental tool for the detection of faults or a valid alternative to the installation of sensors. Due to the hard nonlinearities that characterize hydraulic actuators, the performance of linear/linearization-based techniques for state estimation is strongly limited. In order to overcome these limits, this paper focuses on an alternative nonlinear estimation method based on the State-Dependent-Riccati-Equation (SDRE). The technique is able to fully take into account the system nonlinearities and the measurement noise. A fifth order nonlinear model is derived and employed for the synthesis of the estimator. Simulations and experimental tests have been conducted and comparisons with the largely used Extended Kalman Filter (EKF) are illustrated. The results show the effectiveness of the SDRE based technique for applications characterized by non-negligible nonlinearities such as dead zones and friction.

  14. Accurate liability estimation improves power in ascertained case-control studies.

    PubMed

    Weissbrod, Omer; Lippert, Christoph; Geiger, Dan; Heckerman, David

    2015-04-01

    Linear mixed models (LMMs) have emerged as the method of choice for confounded genome-wide association studies. However, the performance of LMMs in nonrandomly ascertained case-control studies deteriorates with increasing sample size. We propose a framework called LEAP (liability estimator as a phenotype; https://github.com/omerwe/LEAP) that tests for association with estimated latent values corresponding to severity of phenotype, and we demonstrate that this can lead to a substantial power increase.

  15. Robust and Accurate Vision-Based Pose Estimation Algorithm Based on Four Coplanar Feature Points

    PubMed Central

    Zhang, Zimiao; Zhang, Shihai; Li, Qiu

    2016-01-01

    Vision-based pose estimation is an important application of machine vision. Currently, analytical and iterative methods are used to solve the object pose. The analytical solutions generally take less computation time. However, the analytical solutions are extremely susceptible to noise. The iterative solutions minimize the distance error between feature points based on 2D image pixel coordinates. However, the non-linear optimization needs a good initial estimate of the true solution; otherwise, it is more time consuming than the analytical solutions. Moreover, the image processing error grows rapidly as the measurement range increases, which leads to pose estimation errors. All of these factors cause accuracy to decrease. To solve this problem, a novel pose estimation method based on four coplanar points is proposed. Firstly, the coordinates of the feature points are determined according to the linear constraints formed by the four points. The initial coordinates of the feature points acquired through this linear method are then optimized through an iterative method. Finally, the coordinate system of the object motion is established and a method is introduced to solve the object pose. The growing image processing error causes pose estimation errors as the measurement range increases; through the established coordinate system, these pose estimation errors can be decreased. The proposed method is compared with two other existing methods through experiments. Experimental results demonstrate that the proposed method works efficiently and stably. PMID:27999338

  16. Accurate and efficient velocity estimation using Transmission matrix formalism based on the domain decomposition method

    NASA Astrophysics Data System (ADS)

    Wang, Benfeng; Jakobsen, Morten; Wu, Ru-Shan; Lu, Wenkai; Chen, Xiaohong

    2017-03-01

    Full waveform inversion (FWI) has been regarded as an effective tool to build the velocity model for the following pre-stack depth migration. Traditional inversion methods are built on Born approximation and are initial model dependent, while this problem can be avoided by introducing Transmission matrix (T-matrix), because the T-matrix includes all orders of scattering effects. The T-matrix can be estimated from the spatial aperture and frequency bandwidth limited seismic data using linear optimization methods. However the full T-matrix inversion method (FTIM) is always required in order to estimate velocity perturbations, which is very time consuming. The efficiency can be improved using the previously proposed inverse thin-slab propagator (ITSP) method, especially for large scale models. However, the ITSP method is currently designed for smooth media, therefore the estimation results are unsatisfactory when the velocity perturbation is relatively large. In this paper, we propose a domain decomposition method (DDM) to improve the efficiency of the velocity estimation for models with large perturbations, as well as guarantee the estimation accuracy. Numerical examples for smooth Gaussian ball models and a reservoir model with sharp boundaries are performed using the ITSP method, the proposed DDM and the FTIM. The estimated velocity distributions, the relative errors and the elapsed time all demonstrate the validity of the proposed DDM.

  17. Comparing the standards of one metabolic equivalent of task in accurately estimating physical activity energy expenditure based on acceleration.

    PubMed

    Kim, Dohyun; Lee, Jongshill; Park, Hoon Ki; Jang, Dong Pyo; Song, Soohwa; Cho, Baek Hwan; Jung, Yoo-Suk; Park, Rae-Woong; Joo, Nam-Seok; Kim, In Young

    2016-08-24

    The purpose of the study is to analyse how the standard of resting metabolic rate (RMR) affects estimation of the metabolic equivalent of task (MET) using an accelerometer. In order to investigate the effect on estimation according to intensity of activity, comparisons were conducted between 3.5 ml O2 · kg(-1) · min(-1) and individually measured resting VO2 as the standard of 1 MET. MET was estimated by linear regression equations that were derived through five-fold cross-validation using the two types of MET values and accelerations; the accuracy of estimation was analysed through cross-validation, Bland and Altman plots, and a one-way ANOVA test. There were no significant differences in the RMS error after cross-validation. However, in modified Bland and Altman plots, the estimations based on individually measured RMR showed mean differences of as much as 0.5 METs relative to those based on an RMR of 3.5 ml O2 · kg(-1) · min(-1). Finally, the results of the ANOVA test indicated that the individual RMR-based estimations showed fewer significant differences between the reference and estimated values at each intensity of activity. In conclusion, the RMR standard is a factor that affects accurate estimation of METs from acceleration; therefore, the RMR should be individually specified when it is used for estimation of METs using an accelerometer.
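
The conversion at issue is simple arithmetic: METs are activity VO2 divided by the 1-MET reference, so the choice of reference directly rescales every estimate. A minimal sketch (the VO2 and RMR numbers below are illustrative, not from the study):

```python
def mets(vo2_ml_kg_min, rmr_ml_kg_min=3.5):
    """Convert an activity oxygen uptake to METs: MET = VO2 / (1-MET reference).

    3.5 ml O2/kg/min is the conventional standard; the study argues for an
    individually measured resting VO2 instead.
    """
    return vo2_ml_kg_min / rmr_ml_kg_min

# Illustrative: the same activity VO2 under the two standards.
conventional = mets(12.25)                   # 3.5 METs
individual = mets(12.25, rmr_ml_kg_min=3.0)  # about 4.1 METs
```

A person whose measured resting VO2 is below 3.5 ml/kg/min is thus assigned more METs for the same activity, which is exactly the systematic difference the Bland-Altman comparison quantifies.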

  18. What's the Risk? A Simple Approach for Estimating Adjusted Risk Measures from Nonlinear Models Including Logistic Regression

    PubMed Central

    Kleinman, Lawrence C; Norton, Edward C

    2009-01-01

    Objective To develop and validate a general method (called regression risk analysis) to estimate adjusted risk measures from logistic and other nonlinear multiple regression models. We show how to estimate standard errors for these estimates. These measures could supplant various approximations (e.g., adjusted odds ratio [AOR]) that may diverge, especially when outcomes are common. Study Design Regression risk analysis estimates were compared with internal standards as well as with Mantel–Haenszel estimates, Poisson and log-binomial regressions, and a widely used (but flawed) equation to calculate adjusted risk ratios (ARR) from AOR. Data Collection Data sets produced using Monte Carlo simulations. Principal Findings Regression risk analysis accurately estimates ARR and differences directly from multiple regression models, even when confounders are continuous, distributions are skewed, outcomes are common, and effect size is large. It is statistically sound and intuitive, and has properties favoring it over other methods in many cases. Conclusions Regression risk analysis should be the new standard for presenting findings from multiple regression analysis of dichotomous outcomes for cross-sectional, cohort, and population-based case–control studies, particularly when outcomes are common or effect size is large. PMID:18793213
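
The core of regression risk analysis is marginal standardization: average the model-predicted risks over the sample with exposure toggled on, then off, and take the ratio or difference. A hedged Python sketch with hypothetical fitted coefficients (not the paper's simulations):

```python
import math

def predicted_risk(intercept, b_exposure, b_covar, exposure, covar):
    """Predicted probability from a fitted logistic model
    (coefficients here are hypothetical, not estimated from data)."""
    z = intercept + b_exposure * exposure + b_covar * covar
    return 1.0 / (1.0 + math.exp(-z))

def adjusted_risk_ratio(covariates, intercept, b_exposure, b_covar):
    """Marginal standardization: average predicted risk with everyone set
    exposed, then everyone set unexposed, and take the ratio."""
    n = len(covariates)
    risk1 = sum(predicted_risk(intercept, b_exposure, b_covar, 1, c)
                for c in covariates) / n
    risk0 = sum(predicted_risk(intercept, b_exposure, b_covar, 0, c)
                for c in covariates) / n
    return risk1 / risk0

covs = [0.0, 1.0, 2.0, 3.0]
arr = adjusted_risk_ratio(covs, intercept=-1.0, b_exposure=0.8, b_covar=0.3)
```

Because the outcome here is common, this adjusted risk ratio is well below the adjusted odds ratio exp(0.8), illustrating the divergence the abstract warns about.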

  19. Spatio-temporal population estimates for risk management

    NASA Astrophysics Data System (ADS)

    Cockings, Samantha; Martin, David; Smith, Alan; Martin, Rebecca

    2013-04-01

    Accurate estimation of population at risk from hazards and effective emergency management of events require not just appropriate spatio-temporal modelling of hazards but also of population. While much recent effort has been focused on improving the modelling and predictions of hazards (both natural and anthropogenic), there has been little parallel advance in the measurement or modelling of population statistics. Different hazard types occur over diverse temporal cycles, are of varying duration and differ significantly in their spatial extent. Even events of the same hazard type, such as flood events, vary markedly in their spatial and temporal characteristics. Conceptually and pragmatically then, population estimates should also be available for similarly varying spatio-temporal scales. Routine population statistics derived from traditional censuses or surveys are usually static representations in both space and time, recording people at their place of usual residence on census/survey night and presenting data for administratively defined areas. Such representations effectively fix the scale of population estimates in both space and time, which is unhelpful for meaningful risk management. Over recent years, the Pop24/7 programme of research, based at the University of Southampton (UK), has developed a framework for spatio-temporal modelling of population, based on gridded population surfaces. Based on a data model which is fully flexible in terms of space and time, the framework allows population estimates to be produced for any time slice relevant to the data contained in the model. It is based around a set of origin and destination centroids, which have capacities, spatial extents and catchment areas, all of which can vary temporally, such as by time of day, day of week, or season. A background layer, containing information on features such as transport networks and land use, provides information on the likelihood of people being in certain places at specific times.

  20. Accurate kinetic parameter estimation during progress curve analysis of systems with endogenous substrate production.

    PubMed

    Goudar, Chetan T

    2011-10-01

    We have identified an error in the published integral form of the modified Michaelis-Menten equation that accounts for endogenous substrate production. The correct solution is presented, and the error in both the substrate concentration, S, and the kinetic parameters Vm, Km, and R resulting from the incorrect solution was characterized. The incorrect integral form resulted in substrate concentration errors as high as 50%, which in turn produced 7-50% error in the kinetic parameter estimates. To better reflect experimental scenarios, noise-containing substrate depletion data were analyzed with both the incorrect and correct integral equations. While both equations resulted in identical fits to the substrate depletion data, the final estimates of Vm, Km, and R were different, and the Km and R estimates from the incorrect integral equation deviated substantially from the actual values. Another observation was that at R = 0, the incorrect integral equation reduced to the correct form of the Michaelis-Menten equation. We believe this combination of excellent fits to experimental data, albeit with incorrect kinetic parameter estimates, and the reduction to the Michaelis-Menten equation at R = 0 is primarily responsible for the error going unnoticed. However, the resulting error in kinetic parameter estimates will lead to incorrect biological interpretation, and we urge the use of the correct integral form presented in this study.
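
One way to guard against such errors in a closed-form integral is to check it against a direct numerical integration of the underlying rate equation, dS/dt = R - Vm*S/(Km + S). A minimal forward-Euler sketch (parameter values are illustrative, not from the study):

```python
def substrate_course(s0, vm, km, r, dt, steps):
    """Forward-Euler integration of dS/dt = R - Vm*S/(Km + S):
    endogenous production R minus Michaelis-Menten consumption."""
    s, course = s0, [s0]
    for _ in range(steps):
        s += dt * (r - vm * s / (km + s))
        course.append(s)
    return course

# Illustrative parameters: S decays toward the steady state where
# R = Vm*S/(Km + S), i.e. S = R*Km/(Vm - R) = 1/3 here.
course = substrate_course(s0=10.0, vm=2.0, km=1.0, r=0.5, dt=0.01, steps=2000)
```

Any proposed integral form can be evaluated at the same time points and compared against such a numerical trajectory before it is used for parameter estimation.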

  1. Alpha's standard error (ASE): an accurate and precise confidence interval estimate.

    PubMed

    Duhachek, Adam; Iacobucci, Dawn

    2004-10-01

    This research presents the inferential statistics for Cronbach's coefficient alpha on the basis of the standard statistical assumption of multivariate normality. The estimation of alpha's standard error (ASE) and confidence intervals are described, and the authors analytically and empirically investigate the effects of the components of these equations. The authors then demonstrate the superiority of this estimate compared with previous derivations of ASE in a separate Monte Carlo simulation. The authors also present a sampling error and test statistic for a test of independent sample alphas. They conclude with a recommendation that all alpha coefficients be reported in conjunction with standard error or confidence interval estimates and offer SAS and SPSS programming codes for easy implementation.

  2. Precision Pointing Control to and Accurate Target Estimation of a Non-Cooperative Vehicle

    NASA Technical Reports Server (NTRS)

    VanEepoel, John; Thienel, Julie; Sanner, Robert M.

    2006-01-01

    In 2004, NASA began investigating a robotic servicing mission for the Hubble Space Telescope (HST). Such a mission would not only require estimates of the HST attitude and rates in order to achieve capture by the proposed Hubble Robotic Vehicle (HRV), but also precision control to achieve the desired rate and maintain the orientation to successfully dock with HST. To generalize the situation, HST is the target vehicle and HRV is the chaser. This work presents a nonlinear approach for estimating the body rates of a non-cooperative target vehicle, and coupling this estimation to a control scheme. Non-cooperative in this context relates to the target vehicle no longer having the ability to maintain attitude control or transmit attitude knowledge.

  3. Accurate State Estimation and Tracking of a Non-Cooperative Target Vehicle

    NASA Technical Reports Server (NTRS)

    Thienel, Julie K.; Sanner, Robert M.

    2006-01-01

    Autonomous space rendezvous scenarios require knowledge of the target vehicle state in order to safely dock with the chaser vehicle. Ideally, the target vehicle state information is derived from telemetered data, or with the use of known tracking points on the target vehicle. However, if the target vehicle is non-cooperative and does not have the ability to maintain attitude control, or transmit attitude knowledge, the docking becomes more challenging. This work presents a nonlinear approach for estimating the body rates of a non-cooperative target vehicle, and coupling this estimation to a tracking control scheme. The approach is tested with the robotic servicing mission concept for the Hubble Space Telescope (HST). Such a mission would not only require estimates of the HST attitude and rates, but also precision control to achieve the desired rate and maintain the orientation to successfully dock with HST.

  4. A microbial clock provides an accurate estimate of the postmortem interval in a mouse model system

    PubMed Central

    Metcalf, Jessica L; Wegener Parfrey, Laura; Gonzalez, Antonio; Lauber, Christian L; Knights, Dan; Ackermann, Gail; Humphrey, Gregory C; Gebert, Matthew J; Van Treuren, Will; Berg-Lyons, Donna; Keepers, Kyle; Guo, Yan; Bullard, James; Fierer, Noah; Carter, David O; Knight, Rob

    2013-01-01

    Establishing the time since death is critical in every death investigation, yet existing techniques are susceptible to a range of errors and biases. For example, forensic entomology is widely used to assess the postmortem interval (PMI), but errors can range from days to months. Microbes may provide a novel method for estimating PMI that avoids many of these limitations. Here we show that postmortem microbial community changes are dramatic, measurable, and repeatable in a mouse model system, allowing PMI to be estimated within approximately 3 days over 48 days. Our results provide a detailed understanding of bacterial and microbial eukaryotic ecology within a decomposing corpse system and suggest that microbial community data can be developed into a forensic tool for estimating PMI. DOI: http://dx.doi.org/10.7554/eLife.01104.001 PMID:24137541

  5. Fast and accurate probability density estimation in large high dimensional astronomical datasets

    NASA Astrophysics Data System (ADS)

    Gupta, Pramod; Connolly, Andrew J.; Gardner, Jeffrey P.

    2015-01-01

    Astronomical surveys will generate measurements of hundreds of attributes (e.g. color, size, shape) on hundreds of millions of sources. Analyzing these large, high dimensional data sets will require efficient algorithms for data analysis. An example of this is probability density estimation, which is at the heart of many classification problems such as the separation of stars and quasars based on their colors. Popular density estimation techniques use binning or kernel density estimation. Kernel density estimation has a small memory footprint but often requires large computational resources. Binning has small computational requirements but is usually implemented with multi-dimensional arrays, which leads to memory requirements that scale exponentially with the number of dimensions. Hence neither technique scales well to large data sets in high dimensions. We present an alternative approach of binning implemented with hash tables (BASH tables). This approach uses the sparseness of data in the high dimensional space to ensure that the memory requirements are small. However, hashing requires some extra computation, so a priori it is not clear if the reduction in memory requirements will lead to increased computational requirements. Through an implementation of BASH tables in C++ we show that the additional computational requirements of hashing are negligible. Hence this approach has small memory and computational requirements. We apply our density estimation technique to photometric selection of quasars using non-parametric Bayesian classification and show that the accuracy of the classification is the same as that of earlier approaches. Since the BASH table approach is one to three orders of magnitude faster than the earlier approaches, it may be useful in various other applications of density estimation in astrostatistics.
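
The essential idea of a BASH table is to key bins by tuples of integer bin indices in a hash table, so memory grows with the number of occupied bins rather than exponentially with dimension. A minimal Python sketch of the concept (not the authors' C++ implementation; data are illustrative):

```python
from collections import defaultdict

def bash_density(points, bin_width):
    """Sparse multidimensional histogram: bins are keyed by tuples of integer
    bin indices in a hash table, so memory scales with occupied bins only."""
    counts = defaultdict(int)
    for p in points:
        counts[tuple(int(x // bin_width) for x in p)] += 1
    n, d = len(points), len(points[0])
    cell_volume = bin_width ** d

    def density(q):
        # Density estimate at q: count in q's bin / (n * bin volume).
        key = tuple(int(x // bin_width) for x in q)
        return counts.get(key, 0) / (n * cell_volume)

    return density

pts = [(0.1, 0.2), (0.15, 0.25), (0.9, 0.9), (0.95, 0.85)]
f = bash_density(pts, bin_width=0.5)
```

An equivalent dense array over d dimensions with m bins per axis would need m**d cells; the hash table stores only the bins that actually contain data.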

  6. Spectral estimation from laser scanner data for accurate color rendering of objects

    NASA Astrophysics Data System (ADS)

    Baribeau, Rejean

    2002-06-01

    Estimation methods are studied for the recovery of the spectral reflectance across the visible range from the sensing at just three discrete laser wavelengths. Methods based on principal component analysis and on spline interpolation are judged based on the CIE94 color differences for some reference data sets. These include the Macbeth color checker, the OSA-UCS color charts, some artist pigments, and a collection of miscellaneous surface colors. The optimal three sampling wavelengths are also investigated. It is found that color can be estimated with average accuracy ΔE94 = 2.3 when the optimal wavelengths 455 nm, 540 nm, and 610 nm are used.

  7. Crop area estimation based on remotely-sensed data with an accurate but costly subsample

    NASA Technical Reports Server (NTRS)

    Gunst, R. F.

    1985-01-01

    Research activities conducted under the auspices of National Aeronautics and Space Administration Cooperative Agreement NCC 9-9 are discussed. During this contract period, research efforts were concentrated in two primary areas. The first area is an investigation of the use of measurement error models as alternatives to least squares regression estimators of crop production or timber biomass. The second primary area of investigation is the estimation of the mixing proportion of two-component mixture models. This report lists publications, technical reports, submitted manuscripts, and oral presentations generated by these research efforts. Possible areas of future research are mentioned.

  8. Data Anonymization that Leads to the Most Accurate Estimates of Statistical Characteristics: Fuzzy-Motivated Approach

    PubMed Central

    Xiang, G.; Ferson, S.; Ginzburg, L.; Longpré, L.; Mayorga, E.; Kosheleva, O.

    2013-01-01

    To preserve privacy, the original data points (with exact values) are replaced by boxes containing each (inaccessible) data point. This privacy-motivated uncertainty leads to uncertainty in the statistical characteristics computed based on this data. In a previous paper, we described how to minimize this uncertainty under the assumption that we use the same standard statistical estimates for the desired characteristics. In this paper, we show that we can further decrease the resulting uncertainty if we allow fuzzy-motivated weighted estimates, and we explain how to optimally select the corresponding weights. PMID:25187183

  9. Accurate and unbiased estimation of power-law exponents from single-emitter blinking data.

    PubMed

    Hoogenboom, Jacob P; den Otter, Wouter K; Offerhaus, Herman L

    2006-11-28

    Single emitter blinking with a power-law distribution for the on and off times has been observed on a variety of systems including semiconductor nanocrystals, conjugated polymers, fluorescent proteins, and organic fluorophores. The origin of this behavior is still under debate. Reliable estimation of power exponents from experimental data is crucial in validating the various models under consideration. We derive a maximum likelihood estimator for power-law distributed data and analyze its accuracy as a function of data set size and power exponent both analytically and numerically. Results are compared to least-squares fitting of the double logarithmically transformed probability density. We demonstrate that least-squares fitting introduces a severe bias in the estimation result and that the maximum likelihood procedure is superior in retrieving the correct exponent and reducing the statistical error. For a data set as small as 50 data points, the error margins of the maximum likelihood estimator are already below 7%, giving the possibility to quantify blinking behavior when data set size is limited, e.g., due to photobleaching.
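
For a continuous power law p(x) ∝ x^(-alpha) with x >= x_min, the maximum likelihood estimator has the closed form alpha_hat = 1 + n / sum(ln(x_i/x_min)), avoiding the bias of least-squares fits on log-log histograms. The sketch below illustrates this general MLE approach; the paper's exact estimator for blinking on/off times may differ in detail:

```python
import math
import random

def mle_power_exponent(samples, x_min):
    """Closed-form MLE for a continuous power law p(x) ~ x**(-alpha), x >= x_min:
    alpha_hat = 1 + n / sum(ln(x_i / x_min))."""
    logs = [math.log(x / x_min) for x in samples if x >= x_min]
    return 1.0 + len(logs) / sum(logs)

# Generate power-law samples by inverse-transform sampling, then recover alpha.
random.seed(1)
alpha_true, x_min = 2.0, 1.0
samples = [x_min * (1.0 - random.random()) ** (-1.0 / (alpha_true - 1.0))
           for _ in range(5000)]
alpha_hat = mle_power_exponent(samples, x_min)  # close to 2.0
```

Unlike a straight-line fit to the double-logarithmic histogram, this estimator uses every sample directly and needs no binning choices, which is why its error margins stay small even for data sets of ~50 points.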

  10. How Accurate and Robust Are the Phylogenetic Estimates of Austronesian Language Relationships?

    PubMed Central

    Greenhill, Simon J.; Drummond, Alexei J.; Gray, Russell D.

    2010-01-01

    We recently used computational phylogenetic methods on lexical data to test between two scenarios for the peopling of the Pacific. Our analyses of lexical data supported a pulse-pause scenario of Pacific settlement in which the Austronesian speakers originated in Taiwan around 5,200 years ago and rapidly spread through the Pacific in a series of expansion pulses and settlement pauses. We claimed that there was high congruence between traditional language subgroups and those observed in the language phylogenies, and that the estimated age of the Austronesian expansion at 5,200 years ago was consistent with the archaeological evidence. However, the congruence between the language phylogenies and the evidence from historical linguistics was not quantitatively assessed using tree comparison metrics. The robustness of the divergence time estimates to different calibration points was also not investigated exhaustively. Here we address these limitations by using a systematic tree comparison metric to calculate the similarity between the Bayesian phylogenetic trees and the subgroups proposed by historical linguistics, and by re-estimating the age of the Austronesian expansion using only the most robust calibrations. The results show that the Austronesian language phylogenies are highly congruent with the traditional subgroupings, and the date estimates are robust even when calculated using a restricted set of historical calibrations. PMID:20224774

  11. Relating space radiation environments to risk estimates

    SciTech Connect

    Curtis, S.B.

    1991-10-01

    This lecture will provide a bridge from the physical energy or LET spectra as might be calculated in an organ to the risk of carcinogenesis, a particular concern for extended missions to the moon or beyond to Mars. Topics covered will include (1) LET spectra expected from galactic cosmic rays, (2) probabilities that individual cell nuclei in the body will be hit by heavy galactic cosmic ray particles, (3) the conventional methods of calculating risks from a mixed environment of high and low LET radiation, (4) an alternate method which provides certain advantages using fluence-related risk coefficients (risk cross sections), and (5) directions for future research and development of these ideas.

  12. Accurate estimation of influenza epidemics using Google search data via ARGO

    PubMed Central

    Yang, Shihao; Santillana, Mauricio; Kou, S. C.

    2015-01-01

    Accurate real-time tracking of influenza outbreaks helps public health officials make timely and meaningful decisions that could save lives. We propose an influenza tracking model, ARGO (AutoRegression with GOogle search data), that uses publicly available online search data. In addition to having a rigorous statistical foundation, ARGO outperforms all previously available Google-search–based tracking models, including the latest version of Google Flu Trends, even though it uses only low-quality search data as input from publicly available Google Trends and Google Correlate websites. ARGO not only incorporates the seasonality in influenza epidemics but also captures changes in people’s online search behavior over time. ARGO is also flexible, self-correcting, robust, and scalable, making it a potentially powerful tool that can be used for real-time tracking of other social events at multiple temporal and spatial resolutions. PMID:26553980

  13. Do hand-held calorimeters provide reliable and accurate estimates of resting metabolic rate?

    PubMed

    Van Loan, Marta D

    2007-12-01

    This paper provides an overview of a new technique for indirect calorimetry and the assessment of resting metabolic rate. Information from the research literature includes findings on the reliability and validity of a new hand-held indirect calorimeter as well as its use in clinical and field settings. Research findings to date are mixed. The MedGem instrument has provided more consistent results when compared to the Douglas bag method of measuring metabolic rate. The BodyGem instrument has been shown to be less accurate when compared to standard metabolic carts. Furthermore, when the BodyGem has been used with clinical patients or with undernourished individuals, the results have not been acceptable. Overall, there is not a large enough body of evidence to definitively support the use of these hand-held devices for assessment of metabolic rate in a wide variety of clinical or research environments.

  14. Accurate estimation of influenza epidemics using Google search data via ARGO.

    PubMed

    Yang, Shihao; Santillana, Mauricio; Kou, S C

    2015-11-24

    Accurate real-time tracking of influenza outbreaks helps public health officials make timely and meaningful decisions that could save lives. We propose an influenza tracking model, ARGO (AutoRegression with GOogle search data), that uses publicly available online search data. In addition to having a rigorous statistical foundation, ARGO outperforms all previously available Google-search-based tracking models, including the latest version of Google Flu Trends, even though it uses only low-quality search data as input from publicly available Google Trends and Google Correlate websites. ARGO not only incorporates the seasonality in influenza epidemics but also captures changes in people's online search behavior over time. ARGO is also flexible, self-correcting, robust, and scalable, making it a potentially powerful tool that can be used for real-time tracking of other social events at multiple temporal and spatial resolutions.

  15. Raman spectroscopy for highly accurate estimation of the age of refrigerated porcine muscle

    NASA Astrophysics Data System (ADS)

    Timinis, Constantinos; Pitris, Costas

    2016-03-01

    The high water content of meat, combined with all the nutrients it contains, makes it vulnerable to spoilage at all stages of production and storage, even when refrigerated at 5 °C. A non-destructive and in situ tool for meat sample testing, which could provide an accurate indication of the storage time of meat, would be very useful for the control of meat quality as well as for consumer safety. The proposed solution is based on Raman spectroscopy, which is non-invasive and can be applied in situ. For the purposes of this project, 42 meat samples from 14 animals were obtained and three Raman spectra per sample were collected every two days for two weeks. The spectra were subsequently processed and the sample age was calculated using a set of linear differential equations. In addition, the samples were classified in categories corresponding to the age in 2-day steps (i.e., 0, 2, 4, 6, 8, 10, 12 or 14 days old), using linear discriminant analysis and cross-validation. Contrary to other studies, where the samples were simply grouped into two categories (higher or lower quality, suitable or unsuitable for human consumption, etc.), in this study the age was predicted with a mean error of ~ 1 day (20%) or classified, in 2-day steps, with 100% accuracy. Although Raman spectroscopy has been used in the past for the analysis of meat samples, the proposed methodology yields a prediction of the sample age that is far more accurate than any previously reported in the literature.

  16. Multiple candidates and multiple constraints based accurate depth estimation for multi-view stereo

    NASA Astrophysics Data System (ADS)

    Zhang, Chao; Zhou, Fugen; Xue, Bindang

    2017-02-01

    In this paper, we propose a depth estimation method for multi-view image sequence. To enhance the accuracy of dense matching and reduce the inaccurate matching which is produced by inaccurate feature description, we select multiple matching points to build candidate matching sets. Then we compute an optimal depth from a candidate matching set which satisfies multiple constraints (epipolar constraint, similarity constraint and depth consistency constraint). To further increase the accuracy of depth estimation, depth consistency constraint of neighbor pixels is used to filter the inaccurate matching. On this basis, in order to get more complete depth map, depth diffusion is performed by neighbor pixels' depth consistency constraint. Through experiments on the benchmark datasets for multiple view stereo, we demonstrate the superiority of proposed method over the state-of-the-art method in terms of accuracy.

  17. Lower bound on reliability for Weibull distribution when shape parameter is not estimated accurately

    NASA Technical Reports Server (NTRS)

    Huang, Zhaofeng; Porter, Albert A.

    1990-01-01

    The mathematical relationships between the shape parameter Beta and estimates of reliability and a life limit lower bound for the two parameter Weibull distribution are investigated. It is shown that under rather general conditions, both the reliability lower bound and the allowable life limit lower bound (often called a tolerance limit) have unique global minimums over a range of Beta. Hence lower bound solutions can be obtained without assuming or estimating Beta. The existence and uniqueness of these lower bounds are proven. Some real data examples are given to show how these lower bounds can be easily established and to demonstrate their practicality. The method developed here has proven to be extremely useful when using the Weibull distribution in analysis of no-failure or few-failures data. The results are applicable not only in the aerospace industry but anywhere that system reliabilities are high.
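
The practical recipe suggested by the abstract is to evaluate the reliability lower bound across a plausible range of the shape parameter β and take its minimum, which the paper shows is a unique global minimum. The sketch below uses a standard zero-failure (Weibayes-style) bound, R_L(t) = (1 - c)^((t/T)^β / n), as an assumed stand-in for the paper's bound; all numbers are illustrative:

```python
def reliability_lower_bound(t, n_units, test_time, conf, beta):
    """Assumed zero-failure bound: with n_units each surviving test_time,
    the transform y = t**beta reduces the Weibull to an exponential model,
    giving R_L(t) = (1 - conf) ** ((t / test_time) ** beta / n_units)."""
    return (1.0 - conf) ** ((t / test_time) ** beta / n_units)

def min_over_beta(t, n_units, test_time, conf, betas):
    """Bound without assuming beta: take the minimum over a plausible shape
    range (the paper proves this minimum is unique and global)."""
    return min(reliability_lower_bound(t, n_units, test_time, conf, b)
               for b in betas)

betas = [0.5 + 0.1 * i for i in range(36)]  # beta from 0.5 to 4.0
bound = min_over_beta(t=50.0, n_units=30, test_time=100.0, conf=0.90,
                      betas=betas)
```

Because the minimum over β is taken, the resulting bound holds whatever the true shape parameter is within the scanned range, which is the property that makes the method useful for no-failure data.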

  18. Accurate dynamic power estimation for CMOS combinational logic circuits with real gate delay model.

    PubMed

    Fadl, Omnia S; Abu-Elyazeed, Mohamed F; Abdelhalim, Mohamed B; Amer, Hassanein H; Madian, Ahmed H

    2016-01-01

    Dynamic power estimation is essential in designing VLSI circuits where many parameters are involved but the only circuit parameter that is related to the circuit operation is the nodes' toggle rate. This paper discusses a deterministic and fast method to estimate the dynamic power consumption for CMOS combinational logic circuits using gate-level descriptions based on the Logic Pictures concept to obtain the circuit nodes' toggle rate. The delay model for the logic gates is the real-delay model. To validate the results, the method is applied to several circuits and compared against exhaustive, as well as Monte Carlo, simulations. The proposed technique was shown to save up to 96% processing time compared to exhaustive simulation.
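
Once node toggle rates are known, dynamic power follows from the standard CMOS switching-power relation P = Σ ½·C·Vdd²·f·α over all nodes. A minimal sketch (capacitances and activity factors below are hypothetical, not from the paper):

```python
def dynamic_power(node_caps, toggle_rates, vdd, freq_hz):
    """Switching power of a CMOS circuit: P = sum over nodes of
    0.5 * C * Vdd**2 * f * alpha, with alpha the node's toggle rate."""
    return sum(0.5 * c * vdd ** 2 * freq_hz * a
               for c, a in zip(node_caps, toggle_rates))

caps = [10e-15, 15e-15, 8e-15]  # node capacitances in farads (hypothetical)
alphas = [0.25, 0.10, 0.50]     # toggle rates, e.g. from gate-level analysis
p = dynamic_power(caps, alphas, vdd=1.2, freq_hz=1e9)
```

The toggle rates are the only circuit-operation-dependent inputs, which is why a fast deterministic method for obtaining them (rather than exhaustive or Monte Carlo simulation) dominates the overall estimation cost.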

  19. Accurate group velocity estimation for unmanned aerial vehicle-based acoustic atmospheric tomography.

    PubMed

    Rogers, Kevin J; Finn, Anthony

    2017-02-01

    Acoustic atmospheric tomography calculates temperature and wind velocity fields in a slice or volume of atmosphere based on travel time estimates between strategically located sources and receivers. The technique discussed in this paper uses the natural acoustic signature of an unmanned aerial vehicle as it overflies an array of microphones on the ground. The sound emitted by the aircraft is recorded on-board and by the ground microphones. The group velocities of the intersecting sound rays are then derived by comparing these measurements. Tomographic inversion is used to estimate the temperature and wind fields from the group velocity measurements. This paper describes a technique for deriving travel time (and hence group velocity) with an accuracy of 0.1% using these assets. This is shown to be sufficient to obtain highly plausible tomographic inversion results that correlate well with independent SODAR measurements.
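    The travel-time step described above can be sketched with a cross-correlation between the on-board and ground recordings; the signals, delay, and slant range below are synthetic stand-ins for the UAV data.

```python
import numpy as np

# Travel time via cross-correlation of the on-board (source) recording
# with a ground-microphone recording; group velocity = path length /
# travel time. Signals and geometry are synthetic stand-ins.

fs = 8000.0                       # sample rate, Hz (assumed)
rng = np.random.default_rng(0)
src = rng.standard_normal(4000)   # broadband source signature
delay = 1200                      # true propagation delay in samples
rec = np.concatenate([np.zeros(delay), src])[:4000]  # ground-mic copy

corr = np.correlate(rec, src, mode="full")
lag = int(np.argmax(corr)) - (len(src) - 1)  # delay estimate in samples
travel_time = lag / fs                        # seconds
distance = 51.0                               # slant range, m (assumed known)
group_velocity = distance / travel_time       # m/s along the ray
```

    In practice the 0.1% accuracy quoted in the paper requires subsample peak interpolation and careful clock synchronization between the two recordings.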

  20. Techniques for accurate estimation of net discharge in a tidal channel

    USGS Publications Warehouse

    Simpson, Michael R.; Bland, Roger

    1999-01-01

    An ultrasonic velocity meter discharge-measurement site in a tidally affected region of the Sacramento-San Joaquin rivers was used to study the accuracy of the index velocity calibration procedure. Calibration data consisting of ultrasonic velocity meter index velocity and concurrent acoustic Doppler discharge measurement data were collected during three time periods. The relative magnitude of equipment errors, acoustic Doppler discharge measurement errors, and calibration errors were evaluated. Calibration error was the most significant source of error in estimating net discharge. Using a comprehensive calibration method, net discharge estimates developed from the three sets of calibration data differed by less than an average of 4 cubic meters per second. Typical maximum flow rates during the data-collection period averaged 750 cubic meters per second.
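    The index velocity calibration procedure amounts to regressing mean channel velocity (from the acoustic Doppler discharge measurements) on the ultrasonic meter's index velocity, then rating discharge through the channel area. The data and area below are invented for illustration.

```python
import numpy as np

# Index-velocity calibration sketch: fit mean channel velocity (from
# ADCP discharge measurements) against the ultrasonic meter's index
# velocity, then rate discharge as Q = area * (a + b * v_index).
# All values are made up; real calibrations use many tidal-cycle pairs.

v_index = np.array([-0.8, -0.3, 0.1, 0.5, 0.9])       # m/s, signed (tidal)
v_mean  = np.array([-0.74, -0.26, 0.12, 0.49, 0.86])  # m/s, from ADCP

b, a = np.polyfit(v_index, v_mean, 1)   # linear calibration coefficients
area = 850.0                             # cross-section area, m^2 (assumed)

def discharge(vi):
    """Rated discharge (m^3/s) for an observed index velocity."""
    return area * (a + b * vi)
```

    Net discharge over a tidal cycle is then the integral of `discharge` over time, which is why calibration error dominates: a small bias in `a` or `b` accumulates over the cycle.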

  1. Lower bound on reliability for Weibull distribution when shape parameter is not estimated accurately

    NASA Technical Reports Server (NTRS)

    Huang, Zhaofeng; Porter, Albert A.

    1991-01-01

    The mathematical relationships between the shape parameter Beta and estimates of reliability and a life limit lower bound for the two parameter Weibull distribution are investigated. It is shown that under rather general conditions, both the reliability lower bound and the allowable life limit lower bound (often called a tolerance limit) have unique global minimums over a range of Beta. Hence lower bound solutions can be obtained without assuming or estimating Beta. The existence and uniqueness of these lower bounds are proven. Some real data examples are given to show how these lower bounds can be easily established and to demonstrate their practicality. The method developed here has proven to be extremely useful when using the Weibull distribution in analysis of no-failure or few-failures data. The results are applicable not only in the aerospace industry but anywhere that system reliabilities are high.
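    The "minimum over Beta" idea can be illustrated in the zero-failure (Weibayes-style) setting: n units each survive a test of duration T, the binomial bound gives R(T) ≥ (1−C)^(1/n) at confidence C, and the Weibull shape scales this to other times via R(t) = R(T)^((t/T)^β). Minimizing over a plausible β range yields a bound valid for any β in it. This is a hedged illustration of the approach, not the paper's exact derivation.

```python
# Conservative Weibull reliability lower bound without estimating beta,
# in the zero-failure setting: n units each survive a test of length T.
# At confidence C, R(T) >= (1 - C)**(1/n); along a Weibull with shape
# beta, R(t) = R(T)**((t/T)**beta). Taking the minimum over a beta range
# gives a bound that holds for every beta in that range. Illustrative
# sketch only, not the paper's proof.
import math

def reliability_lower_bound(t, T, n, conf, beta_range):
    r_T = (1.0 - conf) ** (1.0 / n)     # binomial bound on R(T)
    return min(r_T ** ((t / T) ** b) for b in beta_range)

betas = [0.5 + 0.1 * i for i in range(36)]   # beta in [0.5, 4.0]
rl = reliability_lower_bound(t=100.0, T=200.0, n=30, conf=0.90, beta_range=betas)
```

    For t < T the minimum occurs at the smallest β in the range, so the bound is easy to locate numerically, matching the paper's observation that the lower bound has a unique global minimum over β.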

  2. A Simple and Accurate Equation for Peak Capacity Estimation in Two Dimensional Liquid Chromatography

    PubMed Central

    Li, Xiaoping; Stoll, Dwight R.; Carr, Peter W.

    2009-01-01

    Two dimensional liquid chromatography (2DLC) is a very powerful way to greatly increase the resolving power and overall peak capacity of liquid chromatography. The traditional “product rule” for peak capacity usually overestimates the true resolving power due to neglect of the often quite severe under-sampling effect and thus provides poor guidance for optimizing the separation and biases comparisons to optimized one dimensional gradient liquid chromatography. Here we derive a simple yet accurate equation for the effective two dimensional peak capacity that incorporates a correction for under-sampling of the first dimension. The results show that not only is the speed of the second dimension separation important for reducing the overall analysis time, but it plays a vital role in determining the overall peak capacity when the first dimension is under-sampled. A surprising subsidiary finding is that for relatively short 2DLC separations (much less than a couple of hours), the first dimension peak capacity is far less important than is commonly believed and need not be highly optimized, for example through use of long columns or very small particles. PMID:19053226
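    A commonly cited form of this result divides the product-rule capacity n₁n₂ by an undersampling factor ⟨β⟩ = sqrt(1 + 3.35·(t_s/¹w)²), where t_s is the second-dimension cycle time and ¹w the first-dimension (4σ) peak width. Treat the constant 3.35 and the variable names below as assumptions drawn from the related literature rather than a quotation of the paper's equation.

```python
import math

# Effective 2DLC peak capacity with a first-dimension undersampling
# correction: n'_2D = n1 * n2 / sqrt(1 + 3.35 * (ts / w1)**2), where ts
# is the second-dimension cycle time and w1 the first-dimension peak
# width (same time units). Constant and names are assumptions here.

def effective_peak_capacity(n1, n2, ts, w1):
    beta = math.sqrt(1.0 + 3.35 * (ts / w1) ** 2)
    return n1 * n2 / beta

# Example: n1 = 50, n2 = 30, 20 s cycle time, 60 s first-dimension peaks
nc = effective_peak_capacity(50, 30, ts=20.0, w1=60.0)
```

    The correction makes the abstract's point concrete: shrinking ts (a faster second dimension) raises the effective capacity even when n₁ and n₂ are held fixed.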

  3. Optimal Allocation for the Estimation of Attributable Risk,

    DTIC Science & Technology

    This paper derives an expression for the optimum sampling allocation under the minimum variance criterion of the estimated attributable risk for case-control studies. Various optimal strategies are examined using alternative exposure-specific disease rates. Keywords: Odds Ratio, Relative Risk, Attributable Risk.

  4. Accurate Estimation of Expression Levels of Homologous Genes in RNA-seq Experiments

    NASA Astrophysics Data System (ADS)

    Paşaniuc, Bogdan; Zaitlen, Noah; Halperin, Eran

    Next generation high throughput sequencing (NGS) is poised to replace array based technologies as the experiment of choice for measuring RNA expression levels. Several groups have demonstrated the power of this new approach (RNA-seq), making significant and novel contributions and simultaneously proposing methodologies for the analysis of RNA-seq data. In a typical experiment, millions of short sequences (reads) are sampled from RNA extracts and mapped back to a reference genome. The number of reads mapping to each gene is used as proxy for its corresponding RNA concentration. A significant challenge in analyzing RNA expression of homologous genes is the large fraction of the reads that map to multiple locations in the reference genome. Currently, these reads are either dropped from the analysis, or a naïve algorithm is used to estimate their underlying distribution. In this work, we present a rigorous alternative for handling the reads generated in an RNA-seq experiment within a probabilistic model for RNA-seq data; we develop maximum likelihood based methods for estimating the model parameters. In contrast to previous methods, our model takes into account the fact that the DNA of the sequenced individual is not a perfect copy of the reference sequence. We show with both simulated and real RNA-seq data that our new method improves the accuracy and power of RNA-seq experiments.

  5. Accurate estimation of expression levels of homologous genes in RNA-seq experiments.

    PubMed

    Paşaniuc, Bogdan; Zaitlen, Noah; Halperin, Eran

    2011-03-01

    Next generation high-throughput sequencing (NGS) is poised to replace array-based technologies as the experiment of choice for measuring RNA expression levels. Several groups have demonstrated the power of this new approach (RNA-seq), making significant and novel contributions and simultaneously proposing methodologies for the analysis of RNA-seq data. In a typical experiment, millions of short sequences (reads) are sampled from RNA extracts and mapped back to a reference genome. The number of reads mapping to each gene is used as proxy for its corresponding RNA concentration. A significant challenge in analyzing RNA expression of homologous genes is the large fraction of the reads that map to multiple locations in the reference genome. Currently, these reads are either dropped from the analysis, or a naive algorithm is used to estimate their underlying distribution. In this work, we present a rigorous alternative for handling the reads generated in an RNA-seq experiment within a probabilistic model for RNA-seq data; we develop maximum likelihood-based methods for estimating the model parameters. In contrast to previous methods, our model takes into account the fact that the DNA of the sequenced individual is not a perfect copy of the reference sequence. We show with both simulated and real RNA-seq data that our new method improves the accuracy and power of RNA-seq experiments.
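    The standard way to allocate multi-mapped reads probabilistically is an EM iteration, sketched below: the E-step splits each read across its candidate genes in proportion to the current abundance estimates, and the M-step re-estimates abundances from the expected counts. This illustrates only the general idea; the paper's model additionally accounts for differences between the sequenced individual and the reference.

```python
# Minimal EM sketch for allocating multi-mapped RNA-seq reads among
# homologous genes. E-step: split each read across its candidate genes
# proportionally to current abundances. M-step: re-estimate abundances
# from the expected counts. Toy data; not the paper's full model.

def em_abundances(reads, n_genes, n_iter=100):
    """reads: list of candidate-gene-index lists, one list per read."""
    theta = [1.0 / n_genes] * n_genes
    for _ in range(n_iter):
        counts = [0.0] * n_genes
        for cands in reads:                      # E-step
            z = sum(theta[g] for g in cands)
            for g in cands:
                counts[g] += theta[g] / z
        total = sum(counts)
        theta = [c / total for c in counts]      # M-step
    return theta

# Two genes; 6 reads unique to gene 0, 2 unique to gene 1, 4 ambiguous
reads = [[0]] * 6 + [[1]] * 2 + [[0, 1]] * 4
theta = em_abundances(reads, 2)
```

    On this toy input the ambiguous reads are resolved 3:1 in favor of gene 0, giving abundances of 0.75 and 0.25; dropping the ambiguous reads (the naive alternative the abstract criticizes) would give the same ratio here but systematically biases genes with many paralogs in real data.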

  6. Voxel-based registration of simulated and real patient CBCT data for accurate dental implant pose estimation

    NASA Astrophysics Data System (ADS)

    Moreira, António H. J.; Queirós, Sandro; Morais, Pedro; Rodrigues, Nuno F.; Correia, André Ricardo; Fernandes, Valter; Pinho, A. C. M.; Fonseca, Jaime C.; Vilaça, João L.

    2015-03-01

    The success of dental implant-supported prosthesis is directly linked to the accuracy obtained during implant's pose estimation (position and orientation). Although traditional impression techniques and recent digital acquisition methods are acceptably accurate, a simultaneously fast, accurate and operator-independent methodology is still lacking. Hereto, an image-based framework is proposed to estimate the patient-specific implant's pose using cone-beam computed tomography (CBCT) and prior knowledge of the implanted model. The pose estimation is accomplished in a three-step approach: (1) a region-of-interest is extracted from the CBCT data using 2 operator-defined points at the implant's main axis; (2) a simulated CBCT volume of the known implanted model is generated through Feldkamp-Davis-Kress reconstruction and coarsely aligned to the defined axis; and (3) a voxel-based rigid registration is performed to optimally align both patient and simulated CBCT data, extracting the implant's pose from the optimal transformation. Three experiments were performed to evaluate the framework: (1) an in silico study using 48 implants distributed through 12 tridimensional synthetic mandibular models; (2) an in vitro study using an artificial mandible with 2 dental implants acquired with an i-CAT system; and (3) two clinical case studies. The results showed positional errors of 67+/-34μm and 108μm, and angular misfits of 0.15+/-0.08° and 1.4°, for experiments 1 and 2, respectively. Moreover, in experiment 3, visual assessment of the clinical data showed a coherent alignment of the reference implant. Overall, a novel image-based framework for implants' pose estimation from CBCT data was proposed, showing accurate results in agreement with dental prosthesis modelling requirements.

  7. [Estimation of absolute risk for fracture].

    PubMed

    Fujiwara, Saeko

    2009-03-01

    Osteoporosis treatment aims to prevent fractures and maintain the QOL of the elderly. However, persons at high risk of future fracture cannot be effectively identified on the basis of bone density (BMD) alone, although BMD is used as a diagnostic criterion. Therefore, the WHO recommended that the absolute risk for fracture (10-year probability of fracture) for each individual be evaluated and used as an index for the intervention threshold. The 10-year probability of fracture is calculated based on age, sex, BMD at the femoral neck (body mass index if BMD is not available), history of previous fractures, parental hip fracture history, smoking, steroid use, rheumatoid arthritis, secondary osteoporosis and alcohol consumption. The WHO announced the development of a calculation tool (FRAX: WHO Fracture Risk Assessment Tool) in February this year. Fractures could be prevented more effectively if, based on each country's medical circumstances, an absolute fracture-risk threshold for starting medical treatment were established and persons at high risk of fracture were identified and treated accordingly.

  8. [Research on maize multispectral image accurate segmentation and chlorophyll index estimation].

    PubMed

    Wu, Qian; Sun, Hong; Li, Min-zan; Song, Yuan-yuan; Zhang, Yan-e

    2015-01-01

    In order to rapidly acquire maize growing information in the field, a non-destructive method of maize chlorophyll content index measurement was conducted based on multi-spectral imaging technique and imaging processing technology. The experiment was conducted at Yangling in Shaanxi province of China and the crop was Zheng-dan 958 planted in an approximately 1 000 m × 600 m experiment field. Firstly, a 2-CCD multi-spectral image monitoring system was used to acquire the canopy images. The system was based on a dichroic prism, allowing precise separation of the visible (Blue (B), Green (G), Red (R): 400-700 nm) and near-infrared (NIR, 760-1 000 nm) bands. The multispectral images were output as RGB and NIR images via the system, vertically fixed to the ground with a vertical distance of 2 m and an angular field of 50°. The SPAD index of each sample was measured synchronously to indicate the chlorophyll content index. Secondly, after image smoothing using an adaptive smooth filtering algorithm, the NIR maize image was selected to segment the maize leaves from the background, because the gray histogram showed a large difference between plant and soil background. The NIR image segmentation algorithm followed two steps, preliminary and accurate segmentation: (1) The results of the OTSU image segmentation method and the variable threshold algorithm were compared, and the latter proved the better one for corn plant and weed segmentation. As a result, the variable threshold algorithm based on local statistics was selected for the preliminary image segmentation. Expansion and corrosion operations were used to optimize the segmented image. (2) The region labeling algorithm was used to segment corn plants from the soil and weed background with an accuracy of 95.59%. And then, the multi-spectral image of the maize canopy was accurately segmented in the R, G and B bands separately. Thirdly, image parameters were abstracted based on the segmented visible and NIR images. The average gray

  9. Non-parametric estimation of spatial variation in relative risk.

    PubMed

    Kelsall, J E; Diggle, P J

    We consider the problem of estimating the spatial variation in relative risks of two diseases, say, over a geographical region. Using an underlying Poisson point process model, we approach the problem as one of density ratio estimation implemented with a non-parametric kernel smoothing method. In order to assess the significance of any local peaks or troughs in the estimated risk surface, we introduce pointwise tolerance contours which can enhance a greyscale image plot of the estimate. We also propose a Monte Carlo test of the null hypothesis of constant risk over the whole region, to avoid possible over-interpretation of the estimated risk surface. We illustrate the capabilities of the methodology with two epidemiological examples.
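    The density-ratio approach can be sketched directly: smooth case and control locations separately with a kernel estimator, then take the log ratio on a grid. The synthetic coordinates below stand in for real case/control data, and the tolerance contours and Monte Carlo test of the paper are omitted.

```python
import numpy as np
from scipy.stats import gaussian_kde

# Kernel estimate of a log relative-risk surface: smooth case and
# control locations separately, then take the log density ratio on a
# grid. Synthetic data; clustered "cases" over uniform "controls".

rng = np.random.default_rng(1)
cases    = rng.normal([0.3, 0.3], 0.1, size=(200, 2)).T   # (2, n) for kde
controls = rng.uniform(0, 1, size=(400, 2)).T

f = gaussian_kde(cases)      # case intensity estimate
g = gaussian_kde(controls)   # control intensity estimate

xx, yy = np.meshgrid(np.linspace(0, 1, 25), np.linspace(0, 1, 25))
grid = np.vstack([xx.ravel(), yy.ravel()])
log_rr = np.log(f(grid)) - np.log(g(grid))   # estimated log relative risk
```

    Peaks in `log_rr` mark candidate areas of elevated risk; the paper's pointwise tolerance contours guard against over-interpreting such peaks when the null of constant risk cannot be rejected.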

  10. Parametric Estimation in a Recurrent Competing Risks Model.

    PubMed

    Taylor, Laura L; Peña, Edsel A

    2013-01-01

    A resource-efficient approach to making inferences about the distributional properties of the failure times in a competing risks setting is presented. Efficiency is gained by observing recurrences of the competing risks over a random monitoring period. The resulting model is called the recurrent competing risks model (RCRM) and is coupled with two repair strategies whenever the system fails. Maximum likelihood estimators of the parameters of the marginal distribution functions associated with each of the competing risks and also of the system lifetime distribution function are presented. Estimators are derived under perfect and partial repair strategies. Consistency and asymptotic properties of the estimators are obtained. The estimation methods are applied to a data set of failures for cars under warranty. Simulation studies are used to ascertain the small sample properties and the efficiency gains of the resulting estimators.

  11. Estimating successive cancer risks in Lynch Syndrome families using a progressive three-state model.

    PubMed

    Choi, Yun-Hee; Briollais, Laurent; Green, Jane; Parfrey, Patrick; Kopciuk, Karen

    2014-02-20

    Lynch Syndrome (LS) families harbor mutated mismatch repair genes, which predispose them to specific types of cancer. Because individuals within LS families can experience multiple cancers over their lifetime, we developed a progressive three-state model to estimate the disease risk from a healthy (state 0) to a first cancer (state 1) and then to a second cancer (state 2). Ascertainment correction of the likelihood was made to adjust for complex sampling designs, with carrier probabilities for family members with missing genotype information estimated using their family's observed genotype and phenotype information in a one-step expectation-maximization algorithm. A sandwich variance estimator was employed to overcome possible model misspecification. The main objective of this paper is to estimate the disease risk (penetrance) for age at a second cancer after someone has experienced a first cancer that is also associated with a mutated gene. Simulation study results indicate that our approach generally provides unbiased risk estimates and low root mean squared errors across different family study designs, proportions of missing genotypes, and risk heterogeneities. An application to 12 large LS families from Newfoundland demonstrates that the risk for a second cancer was substantial and that the age at a first colorectal cancer significantly impacted the age at any subsequent LS cancer. This study provides new insights for developing more effective management of mutation carriers in LS families by providing more accurate multiple cancer risk estimates.

  12. Categorizing sources of risk and the estimated magnitude of risk.

    PubMed

    Aragonés, Juan Ignacio; Moyano, Emilio; Talayero, Fernando

    2008-05-01

    The social perception of risk is considered a multidimensional task, yet little attention has been paid to the cognitive components that organize sources of risk, despite their having been discovered in various research studies. This study attempts to concretely analyze the cultural dimension involved in those processes. In the first phase, we tried to discover to what extent sources of risk are organized into the same categories by people from different countries. In order to do so, two groups of participants were formed: 60 Spanish psychology students and 60 Chilean psychology students classified 43 sources of risk into different groups according to the criteria they found appropriate. The two samples classified risk into identical groups: acts of violence, drugs, electricity and home appliances, household chemicals, chemicals in the environment, public construction projects, transportation, sports, and natural disasters. In a second study, 100 Spanish and 84 Chilean students were asked to evaluate the magnitude of the damage incurred by 17 sources of risk. In both groups, it was observed that the evaluation of damage resulting from each source of risk was affected by its category.

  13. The challenges of accurately estimating time of long bone injury in children.

    PubMed

    Pickett, Tracy A

    2015-07-01

    The ability to determine the time an injury occurred can be of crucial significance in forensic medicine and holds special relevance to the investigation of child abuse. However, dating paediatric long bone injury, including fractures, is nuanced by complexities specific to the paediatric population. These challenges include the ability to identify bone injury in a growing or only partially-calcified skeleton, different injury patterns seen within the spectrum of the paediatric population, the effects of bone growth on healing as a separate entity from injury, differential healing rates seen at different ages, and the relative scarcity of information regarding healing rates in children, especially the very young. The challenges posed by these factors are compounded by a lack of consistency in defining and categorizing healing parameters. This paper sets out the primary limitations of existing knowledge regarding estimating timing of paediatric bone injury. Consideration and understanding of the multitude of factors affecting bone injury and healing in children will assist those providing opinion in the medical-legal forum.

  14. Error Estimation And Accurate Mapping Based ALE Formulation For 3D Simulation Of Friction Stir Welding

    NASA Astrophysics Data System (ADS)

    Guerdoux, Simon; Fourment, Lionel

    2007-05-01

    An Arbitrary Lagrangian Eulerian (ALE) formulation is developed to simulate the different stages of the Friction Stir Welding (FSW) process with the FORGE3® F.E. software. A splitting method is utilized: a) the material velocity/pressure and temperature fields are calculated, b) the mesh velocity is derived from the domain boundary evolution and an adaptive refinement criterion provided by error estimation, c) P1 and P0 variables are remapped. Different velocity computation and remap techniques have been investigated, providing significant improvement with respect to more standard approaches. The proposed ALE formulation is applied to FSW simulation. Steady state welding, but also transient phases are simulated, showing good robustness and accuracy of the developed formulation. Friction parameters are identified for an Eulerian steady state simulation by comparison with experimental results. Void formation can be simulated. Simulations of the transient plunge and welding phases help to better understand the deposition process that occurs at the trailing edge of the probe. Flexibility and robustness of the model finally allows investigating the influence of new tooling designs on the deposition process.

  15. A generic computerized method for estimate of familial risks.

    PubMed Central

    Colombet, Isabelle; Xu, Yigang; Jaulent, Marie-Christine; Desages, Daniel; Degoulet, Patrice; Chatellier, Gilles

    2002-01-01

    Most guidelines developed for cancer screening and for cardiovascular risk management use rules to estimate familial risk. These rules are complex, difficult to memorize, and require collecting a complete pedigree. This paper describes a generic computerized method to estimate familial risks and its implementation in an internet-based application. The program is based on 3 generic models: a model of the family; a model of familial risk; and a display model for the pedigree. The model of the family makes it possible to represent each member of the family and to construct and display a family tree. The model of familial risk is generic and allows easy updating of the program with new diseases or new rules. It was possible to implement guidelines dealing with breast and colorectal cancer and cardiovascular disease prevention. A first evaluation with general practitioners showed that the program was usable. Its impact on the quality of familial risk estimates remains to be documented. PMID:12463810

  16. A new method based on the subpixel Gaussian model for accurate estimation of asteroid coordinates

    NASA Astrophysics Data System (ADS)

    Savanevych, V. E.; Briukhovetskyi, O. B.; Sokovikova, N. S.; Bezkrovny, M. M.; Vavilova, I. B.; Ivashchenko, Yu. M.; Elenin, L. V.; Khlamov, S. V.; Movsesian, Ia. S.; Dashkova, A. M.; Pogorelov, A. V.

    2015-08-01

    We describe a new iteration method to estimate asteroid coordinates, based on a subpixel Gaussian model of the discrete object image. The method operates by continuous parameters (asteroid coordinates) in a discrete observational space (the set of pixel potentials) of the CCD frame. In this model, the kind of coordinate distribution of the photons hitting a pixel of the CCD frame is known a priori, while the associated parameters are determined from a real digital object image. The method that is developed, which is flexible in adapting to any form of object image, has a high measurement accuracy along with a low calculating complexity, due to the maximum-likelihood procedure that is implemented to obtain the best fit instead of a least-squares method and Levenberg-Marquardt algorithm for minimization of the quadratic form. Since 2010, the method has been tested as the basis of our Collection Light Technology (COLITEC) software, which has been installed at several observatories across the world with the aim of the automatic discovery of asteroids and comets in sets of CCD frames. As a result, four comets (C/2010 X1 (Elenin), P/2011 NO1(Elenin), C/2012 S1 (ISON) and P/2013 V3 (Nevski)) as well as more than 1500 small Solar system bodies (including five near-Earth objects (NEOs), 21 Trojan asteroids of Jupiter and one Centaur object) have been discovered. We discuss these results, which allowed us to compare the accuracy parameters of the new method and confirm its efficiency. In 2014, the COLITEC software was recommended to all members of the Gaia-FUN-SSO network for analysing observations as a tool to detect faint moving objects in frames.
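    The core operation, recovering subpixel object coordinates by fitting a Gaussian image model to pixel values, can be sketched with an ordinary least-squares fit; the paper's method instead maximizes the likelihood of a subpixel-integrated Gaussian, so everything below (function, grid, parameters) is an illustrative stand-in.

```python
import numpy as np
from scipy.optimize import curve_fit

# Subpixel centroid of an object image by fitting a 2-D Gaussian plus
# background to pixel values. Synthetic noise-free stamp; a real frame
# would add photon noise and use the paper's ML subpixel model.

def gauss2d(xy, amp, x0, y0, sigma, bg):
    x, y = xy
    return bg + amp * np.exp(-((x - x0) ** 2 + (y - y0) ** 2) / (2 * sigma ** 2))

yy, xx = np.mgrid[0:15, 0:15].astype(float)     # 15x15 pixel stamp
true = (100.0, 7.3, 6.8, 1.5, 10.0)             # amp, x0, y0, sigma, background
img = gauss2d((xx, yy), *true)

p0 = (img.max() - img.min(), 7.0, 7.0, 2.0, img.min())
popt, _ = curve_fit(gauss2d, (xx.ravel(), yy.ravel()), img.ravel(), p0=p0)
x0_fit, y0_fit = popt[1], popt[2]               # subpixel coordinates
```

    The fitted (x0, y0) land between pixel centers, which is exactly the subpixel accuracy that astrometric pipelines such as CoLiTec depend on.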

  17. Assessment of Methods for Estimating Risk to Birds from ...

    EPA Pesticide Factsheets

    The U.S. EPA Ecological Risk Assessment Support Center (ERASC) announced the release of the final report entitled, Assessment of Methods for Estimating Risk to Birds from Ingestion of Contaminated Grit Particles. This report evaluates approaches for estimating the probability of ingestion by birds of contaminated particles such as pesticide granules or lead particles (i.e. shot or bullet fragments). In addition, it presents an approach for using this information to estimate the risk of mortality to birds from ingestion of lead particles. Response to ERASC Request #16

  18. Resources for global risk assessment: the International Toxicity Estimates for Risk (ITER) and Risk Information Exchange (RiskIE) databases.

    PubMed

    Wullenweber, Andrea; Kroner, Oliver; Kohrman, Melissa; Maier, Andrew; Dourson, Michael; Rak, Andrew; Wexler, Philip; Tomljanovic, Chuck

    2008-11-15

    The rate of chemical synthesis and use has outpaced the development of risk values and the resolution of risk assessment methodology questions. In addition, available risk values derived by different organizations may vary due to scientific judgments, the mission of the organization, or use of more recently published data. Further, each organization derives values for a unique chemical list, so it can be challenging to locate data on a given chemical. Two Internet resources are available to address these issues. First, the International Toxicity Estimates for Risk (ITER) database (www.tera.org/iter) provides chronic human health risk assessment data from a variety of organizations worldwide in a side-by-side format, explains differences in risk values derived by different organizations, and links directly to each organization's website for more detailed information. It is also the only database that includes risk information from independent parties whose risk values have undergone independent peer review. Second, the Risk Information Exchange (RiskIE) is a database of in-progress chemical risk assessment work, and includes non-chemical information related to human health risk assessment, such as training modules, white papers and risk documents. RiskIE is available at http://www.allianceforrisk.org/RiskIE.htm, and will join ITER on the National Library of Medicine's TOXNET (http://toxnet.nlm.nih.gov/). Together, ITER and RiskIE provide risk assessors essential tools for easily identifying and comparing available risk data, for sharing in-progress assessments, and for enhancing interaction among risk assessment groups to decrease duplication of effort and to harmonize risk assessment procedures across organizations.

  19. Markov chain Monte Carlo estimation of a multiparameter decision model: consistency of evidence and the accurate assessment of uncertainty.

    PubMed

    Ades, A E; Cliffe, S

    2002-01-01

    Decision models are usually populated 1 parameter at a time, with 1 item of information informing each parameter. Often, however, data may not be available on the parameters themselves but on several functions of parameters, and there may be more items of information than there are parameters to be estimated. The authors show how in these circumstances all the model parameters can be estimated simultaneously using Bayesian Markov chain Monte Carlo methods. Consistency of the information and/or the adequacy of the model can also be assessed within this framework. Statistical evidence synthesis using all available data should result in more precise estimates of parameters and functions of parameters, and is compatible with the emphasis currently placed on systematic use of evidence. To illustrate this, WinBUGS software is used to estimate a simple 9-parameter model of the epidemiology of HIV in women attending prenatal clinics, using information on 12 functions of parameters, and to thereby compute the expected net benefit of 2 alternative prenatal testing strategies, universal testing and targeted testing of high-risk groups. The authors demonstrate improved precision of estimates, and lower estimates of the expected value of perfect information, resulting from the use of all available data.
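    The idea of estimating parameters from data on *functions* of parameters can be reduced to a toy example: two prevalence parameters informed by three binomial datasets, the third observing their product, sampled with random-walk Metropolis under uniform priors. The paper's HIV model has 9 parameters and 12 such data items and was fitted in WinBUGS; every number and name below is invented.

```python
import math, random

# Toy synthesis of evidence on functions of parameters: p1 and p2 are
# informed directly by two binomial datasets and jointly by a third
# observing the product p1*p2. Random-walk Metropolis, uniform priors.

data = [(30, 100, lambda p: p[0]),         # 30/100 successes inform p1
        (60, 100, lambda p: p[1]),         # 60/100 inform p2
        (19, 100, lambda p: p[0] * p[1])]  # 19/100 inform the product

def log_post(p):
    if not (0 < p[0] < 1 and 0 < p[1] < 1):
        return -math.inf
    ll = 0.0
    for k, n, fn in data:
        q = fn(p)
        ll += k * math.log(q) + (n - k) * math.log(1 - q)
    return ll

random.seed(42)
p, lp = [0.5, 0.5], log_post([0.5, 0.5])
samples = []
for i in range(20000):
    prop = [p[0] + random.gauss(0, 0.05), p[1] + random.gauss(0, 0.05)]
    lp_prop = log_post(prop)
    if math.log(random.random()) < lp_prop - lp:   # Metropolis accept
        p, lp = prop, lp_prop
    if i >= 5000:                                  # discard burn-in
        samples.append(p)

mean_p1 = sum(s[0] for s in samples) / len(samples)
mean_p2 = sum(s[1] for s in samples) / len(samples)
```

    Because the third dataset constrains the product, the posterior for p1 and p2 is tighter than either direct dataset alone would give, which is the precision gain the abstract describes.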

  20. Robust dynamic myocardial perfusion CT deconvolution for accurate residue function estimation via adaptive-weighted tensor total variation regularization: a preclinical study

    NASA Astrophysics Data System (ADS)

    Zeng, Dong; Gong, Changfei; Bian, Zhaoying; Huang, Jing; Zhang, Xinyu; Zhang, Hua; Lu, Lijun; Niu, Shanzhou; Zhang, Zhang; Liang, Zhengrong; Feng, Qianjin; Chen, Wufan; Ma, Jianhua

    2016-11-01

    Dynamic myocardial perfusion computed tomography (MPCT) is a promising technique for quick diagnosis and risk stratification of coronary artery disease. However, one major drawback of dynamic MPCT imaging is the heavy radiation dose to patients due to its dynamic image acquisition protocol. In this work, to address this issue, we present a robust dynamic MPCT deconvolution algorithm via adaptive-weighted tensor total variation (AwTTV) regularization for accurate residue function estimation with low-mAs data acquisitions. For simplicity, the presented method is termed ‘MPD-AwTTV’. More specifically, the gains of the AwTTV regularization over the original tensor total variation regularization come from the anisotropic edge property of the sequential MPCT images. To minimize the associated objective function we propose an efficient iterative optimization strategy with fast convergence rate in the framework of an iterative shrinkage/thresholding algorithm. We validate and evaluate the presented algorithm using both a digital XCAT phantom and preclinical porcine data. The preliminary experimental results have demonstrated that the presented MPD-AwTTV deconvolution algorithm can achieve remarkable gains in noise-induced artifact suppression, edge detail preservation, and accurate flow-scaled residue function and MPHM estimation as compared with the other existing deconvolution algorithms in digital phantom studies, and similar gains can be obtained in the porcine data experiment.

  1. Linear-In-The-Parameters Oblique Least Squares (LOLS) Provides More Accurate Estimates of Density-Dependent Survival

    PubMed Central

    Vieira, Vasco M. N. C. S.; Engelen, Aschwin H.; Huanel, Oscar R.; Guillemin, Marie-Laure

    2016-01-01

    Survival is a fundamental demographic component and the importance of its accurate estimation goes beyond the traditional estimation of life expectancy. The evolutionary stability of isomorphic biphasic life-cycles and the occurrence of its different ploidy phases at uneven abundances are hypothesized to be driven by differences in survival rates between haploids and diploids. We monitored Gracilaria chilensis, a commercially exploited red alga with an isomorphic biphasic life-cycle, having found density-dependent survival with competition and Allee effects. While estimating the linear-in-the-parameters survival function, all model I regression methods (i.e., vertical least squares) provided biased line-fits, rendering them inappropriate for studies about ecology, evolution or population management. Hence, we developed an iterative two-step non-linear model II regression (i.e., oblique least squares), which provided improved line-fits and estimates of survival function parameters, while remaining robust to the data aspects that usually make regression methods numerically unstable. PMID:27936048
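    The difference between model I and model II fits can be illustrated with standard total least squares (orthogonal regression), which minimizes perpendicular rather than vertical residuals via the first principal axis of the centred data. This is only the textbook construction motivating the approach; the paper's LOLS method is a more general iterative scheme.

```python
import numpy as np

# Model II (orthogonal) regression via total least squares: the first
# principal axis of the centred (x, y) cloud minimizes perpendicular
# residuals, unlike model I (vertical least squares). Illustrative only.

def tls_line(x, y):
    xm, ym = x.mean(), y.mean()
    _, _, vt = np.linalg.svd(np.column_stack([x - xm, y - ym]))
    dx, dy = vt[0]                 # direction of the first principal axis
    slope = dy / dx
    return slope, ym - slope * xm  # slope, intercept

x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = 2.0 * x + 1.0                  # exact line: slope 2, intercept 1
slope, intercept = tls_line(x, y)
```

    When both axes carry measurement error, as with density and survival both estimated from field counts, vertical least squares biases the slope toward zero, while the orthogonal fit treats the two variables symmetrically.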

  2. Estimating the standardized mean difference with minimum risk: Maximizing accuracy and minimizing cost with sequential estimation.

    PubMed

    Chattopadhyay, Bhargab; Kelley, Ken

    2017-03-01

    The standardized mean difference is a widely used effect size measure. In this article, we develop a general theory for estimating the population standardized mean difference by minimizing both the mean square error of the estimator and the total sampling cost. Fixed sample size methods, when sample size is planned before the start of a study, cannot simultaneously minimize both the mean square error of the estimator and the total sampling cost. To overcome this limitation of the current state of affairs, this article develops a purely sequential sampling procedure, which provides an estimate of the sample size required to achieve a sufficiently accurate estimate with minimum expected sampling cost. Performance of the purely sequential procedure is examined via a simulation study to show that our analytic developments are highly accurate. Additionally, we provide freely available functions in R to implement the algorithm of the purely sequential procedure.
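The flavor of a purely sequential minimum-risk procedure can be sketched for the simpler problem of estimating a mean: under a loss of the form A*(estimate error)^2 + c*n, a classic Robbins-style rule keeps sampling until n exceeds sqrt(A/c) times the current standard deviation estimate, mimicking the optimal (but unknown) fixed sample size sqrt(A/c)*sigma. This illustrates only the stopping-rule idea, not the authors' procedure for the standardized mean difference; the loss constants and pilot size below are hypothetical.

```python
import math
import random
import statistics

def sequential_minimum_risk_mean(draw, a=10000.0, c=1.0, n0=10):
    """Purely sequential estimation of a mean under risk A*MSE + c*n.

    Stops when n >= sqrt(A/c) * s_n, where s_n is the running sample
    standard deviation, approximating the unknown optimal fixed n*.
    """
    xs = [draw() for _ in range(n0)]  # pilot sample
    while len(xs) < math.sqrt(a / c) * statistics.stdev(xs):
        xs.append(draw())  # sample one more observation, then re-check
    return statistics.mean(xs), len(xs)

# Hypothetical sampling model: standard normal observations.
random.seed(1)
est, n = sequential_minimum_risk_mean(lambda: random.gauss(0.0, 1.0))
```

With sigma near 1 and sqrt(A/c) = 100, the rule typically stops near n = 100, adapting automatically when the data are more or less variable.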

  3. Accurate Estimation of Fungal Diversity and Abundance through Improved Lineage-Specific Primers Optimized for Illumina Amplicon Sequencing

    PubMed Central

    Walters, William A.; Lennon, Niall J.; Bochicchio, James; Krohn, Andrew; Pennanen, Taina

    2016-01-01

    While high-throughput sequencing methods are revolutionizing fungal ecology, recovering accurate estimates of species richness and abundance has proven elusive. We sought to design internal transcribed spacer (ITS) primers and an Illumina protocol that would maximize coverage of the kingdom Fungi while minimizing nontarget eukaryotes. We inspected alignments of the 5.8S and large subunit (LSU) ribosomal genes and evaluated potential primers using PrimerProspector. We tested the resulting primers using tiered-abundance mock communities and five previously characterized soil samples. We recovered operational taxonomic units (OTUs) belonging to all 8 members in both mock communities, despite DNA abundances spanning 3 orders of magnitude. The expected and observed read counts were strongly correlated (r = 0.94 to 0.97). However, several taxa were consistently over- or underrepresented, likely due to variation in rRNA gene copy numbers. The Illumina data resulted in clustering of soil samples identical to that obtained with Sanger sequence clone library data using different primers. Furthermore, the two methods produced distance matrices with a Mantel correlation of 0.92. Nonfungal sequences comprised less than 0.5% of the soil data set, with most attributable to vascular plants. Our results suggest that high-throughput methods can produce fairly accurate estimates of fungal abundances in complex communities. Further improvements might be achieved through corrections for rRNA copy number and utilization of standardized mock communities. IMPORTANCE Fungi play numerous important roles in the environment. Improvements in sequencing methods are providing revolutionary insights into fungal biodiversity, yet accurate estimates of the number of fungal species (i.e., richness) and their relative abundances in an environmental sample (e.g., soil, roots, water, etc.) remain difficult to obtain. We present improved methods for high-throughput Illumina sequencing of the

  4. Sensitivity of health risk estimates to air quality adjustment procedure

    SciTech Connect

    Whitfield, R.G.

    1997-06-30

    This letter is a summary of risk results associated with exposure estimates using two-parameter Weibull and quadratic air quality adjustment procedures (AQAPs). New exposure estimates were developed for children and child-occurrences, six urban areas, and five alternative air quality scenarios. In all cases, the Weibull and quadratic results are compared to previous results, which are based on a proportional AQAP.

  5. Incentives Increase Participation in Mass Dog Rabies Vaccination Clinics and Methods of Coverage Estimation Are Assessed to Be Accurate

    PubMed Central

    Steinmetz, Melissa; Czupryna, Anna; Bigambo, Machunde; Mzimbiri, Imam; Powell, George; Gwakisa, Paul

    2015-01-01

    In this study we show that incentives (dog collars and owner wristbands) are effective at increasing owner participation in mass dog rabies vaccination clinics and we conclude that household questionnaire surveys and the mark-re-sight (transect survey) method for estimating post-vaccination coverage are accurate when all dogs, including puppies, are included. Incentives were distributed during central-point rabies vaccination clinics in northern Tanzania to quantify their effect on owner participation. In villages where incentives were handed out participation increased, with an average of 34 more dogs being vaccinated. Through economies of scale, this represents a reduction in the cost-per-dog of $0.47. This represents the price-threshold under which the cost of the incentive used must fall to be economically viable. Additionally, vaccination coverage levels were determined in ten villages through the gold-standard village-wide census technique, as well as through two cheaper and quicker methods (randomized household questionnaire and the transect survey). Cost data were also collected. Both non-gold standard methods were found to be accurate when puppies were included in the calculations, although the transect survey and the household questionnaire survey over- and under-estimated the coverage respectively. Given that additional demographic data can be collected through the household questionnaire survey, and that its estimate of coverage is more conservative, we recommend this method. Despite the use of incentives the average vaccination coverage was below the 70% threshold for eliminating rabies. We discuss the reasons and suggest solutions to improve coverage. Given recent international targets to eliminate rabies, this study provides valuable and timely data to help improve mass dog vaccination programs in Africa and elsewhere. PMID:26633821

  6. Incentives Increase Participation in Mass Dog Rabies Vaccination Clinics and Methods of Coverage Estimation Are Assessed to Be Accurate.

    PubMed

    Minyoo, Abel B; Steinmetz, Melissa; Czupryna, Anna; Bigambo, Machunde; Mzimbiri, Imam; Powell, George; Gwakisa, Paul; Lankester, Felix

    2015-12-01

    In this study we show that incentives (dog collars and owner wristbands) are effective at increasing owner participation in mass dog rabies vaccination clinics and we conclude that household questionnaire surveys and the mark-re-sight (transect survey) method for estimating post-vaccination coverage are accurate when all dogs, including puppies, are included. Incentives were distributed during central-point rabies vaccination clinics in northern Tanzania to quantify their effect on owner participation. In villages where incentives were handed out participation increased, with an average of 34 more dogs being vaccinated. Through economies of scale, this represents a reduction in the cost-per-dog of $0.47. This represents the price-threshold under which the cost of the incentive used must fall to be economically viable. Additionally, vaccination coverage levels were determined in ten villages through the gold-standard village-wide census technique, as well as through two cheaper and quicker methods (randomized household questionnaire and the transect survey). Cost data were also collected. Both non-gold standard methods were found to be accurate when puppies were included in the calculations, although the transect survey and the household questionnaire survey over- and under-estimated the coverage respectively. Given that additional demographic data can be collected through the household questionnaire survey, and that its estimate of coverage is more conservative, we recommend this method. Despite the use of incentives the average vaccination coverage was below the 70% threshold for eliminating rabies. We discuss the reasons and suggest solutions to improve coverage. Given recent international targets to eliminate rabies, this study provides valuable and timely data to help improve mass dog vaccination programs in Africa and elsewhere.

  7. Accurate estimation of entropy in very short physiological time series: the problem of atrial fibrillation detection in implanted ventricular devices.

    PubMed

    Lake, Douglas E; Moorman, J Randall

    2011-01-01

    Entropy estimation is useful but difficult in short time series. For example, automated detection of atrial fibrillation (AF) in very short heart beat interval time series would be useful in patients with cardiac implantable electronic devices that record only from the ventricle. Such devices require efficient algorithms, and the clinical situation demands accuracy. Toward these ends, we optimized the sample entropy measure, which reports the probability that short templates will match with others within the series. We developed general methods for the rational selection of the template length m and the matching tolerance r. The major innovation was to allow r to vary so that sufficient matches are found for confident entropy estimation, with conversion of the final probability to a density by dividing by the matching region volume, 2r^m. The optimized sample entropy estimate and the mean heart beat interval each contributed to accurate detection of AF in as few as 12 heartbeats. The final algorithm, called the coefficient of sample entropy (COSEn), was developed using the canonical MIT-BIH database and validated in a new and much larger set of consecutive Holter monitor recordings from the University of Virginia. In patients over 40 years of age, COSEn has a high degree of accuracy in distinguishing AF from normal sinus rhythm in 12-beat calculations performed hourly. The most common errors are atrial or ventricular ectopy, which increase entropy despite sinus rhythm, and atrial flutter, which can have low- or high-entropy states depending on the dynamics of atrioventricular conduction.
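The core sample-entropy computation that COSEn builds on can be sketched as follows. This is a plain SampEn implementation with a fixed tolerance, not the authors' optimized COSEn algorithm, which additionally varies r and applies the density correction and mean heart-rate term described in the abstract.

```python
import numpy as np

def sample_entropy(x, m=1, r=0.2):
    """Sample entropy: -ln(A/B), where B counts template pairs of
    length m within tolerance r (Chebyshev distance) and A counts
    pairs of length m + 1."""
    x = np.asarray(x, dtype=float)

    def match_count(length):
        # All overlapping templates of the given length.
        templates = np.array([x[i:i + length] for i in range(len(x) - length)])
        count = 0
        for i in range(len(templates) - 1):
            # Chebyshev distance of template i to all later templates.
            dist = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            count += int(np.sum(dist < r))
        return count

    b, a = match_count(m), match_count(m + 1)
    return float(-np.log(a / b)) if a > 0 and b > 0 else float("inf")
```

A strictly alternating interval series should score much lower than irregular intervals of similar magnitude, which is the intuition behind using entropy to flag atrial fibrillation.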

  8. Model-based Small Area Estimates of Cancer Risk Factors and Screening Behaviors - Small Area Estimates

    Cancer.gov

    These model-based estimates use two surveys, the Behavioral Risk Factor Surveillance System (BRFSS) and the National Health Interview Survey (NHIS). The two surveys are combined using novel statistical methodology.

  9. Towards more accurate life cycle risk management through integration of DDP and PRA

    NASA Technical Reports Server (NTRS)

    Cornford, Steven L.; Paulos, Todd; Meshkat, Leila; Feather, Martin

    2003-01-01

    The focus of this paper is on the integration of PRA and DDP. The intent is twofold: to extend risk-based decision making through more of the lifecycle, and to lead to improved risk modeling (hence better informed decision making) wherever it is applied, most especially in the early phases as designs begin to mature.

  10. Estimating cancer risks to adults undergoing body CT examinations.

    PubMed

    Huda, Walter; He, Wenjun

    2012-06-01

    The purpose of the study is to estimate cancer risks from the amount of radiation used to perform body computed tomography (CT) examinations. The ImPACT CT Patient Dosimetry Calculator was used to compute values of organ doses for adult body CT examinations. The radiation used to perform each examination was quantified by the dose-length product (DLP). Patient organ doses were converted into corresponding age- and sex-dependent cancer risks using data from BEIR VII. Results are presented for cancer risks per unit DLP and unit effective dose for 11 sensitive organs, as well as estimates of the contribution from 'other organs'. For patients who differ from a standard sized adult, correction factors based on the patient weight and antero-posterior dimension are provided to adjust organ doses and the corresponding risks. At constant incident radiation intensity, for CT examinations that include the chest, risks for females are markedly higher than those for males, whereas for examinations that include the pelvis, risks in males were slightly higher than those in females. In abdominal CT scans, risks for male and female patients are very similar. For abdominal CT scans, increasing the patient age from 20 to 80 years resulted in a reduction in patient risks of nearly a factor of 5. The average cancer risk for chest/abdomen/pelvis CT examinations was ∼26 % higher than the cancer risk caused by 'sensitive organs'. Doses and radiation risks in 80 kg adults were ∼10 % lower than those in 70 kg patients. Cancer risks in body CT can be estimated from the examination DLP by accounting for sex, age, as well as patient physical characteristics.
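The look-up logic described above amounts to multiplying the examination DLP by an age- and sex-specific risk coefficient and a patient-size correction. A sketch of that arithmetic, where the coefficient value and the linear 1%-per-kg size correction are hypothetical placeholders, not the paper's tabulated factors:

```python
def ct_cancer_risk(dlp_mgy_cm, risk_per_dlp, weight_kg, ref_weight_kg=70.0):
    """Risk estimate = DLP x (risk per unit DLP) x size correction.

    The linear ~1% reduction per extra kg above the reference weight
    is a hypothetical stand-in for the paper's weight-based correction
    factors; risk_per_dlp would come from an age/sex-specific table.
    """
    size_correction = 1.0 - 0.01 * (weight_kg - ref_weight_kg)
    return dlp_mgy_cm * risk_per_dlp * size_correction
```

Under this placeholder correction, an 80 kg patient's estimated risk at the same DLP is 10 % below that of a 70 kg patient, consistent in direction with the ∼10 % figure quoted in the abstract.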

  11. Impact of interfacial high-density water layer on accurate estimation of adsorption free energy by Jarzynski's equality

    NASA Astrophysics Data System (ADS)

    Zhang, Zhisen; Wu, Tao; Wang, Qi; Pan, Haihua; Tang, Ruikang

    2014-01-01

    The interactions between proteins/peptides and materials are crucial to research and development in many biomedical engineering fields. The energetics of such interactions are key in the evaluation of new proteins/peptides and materials. Much research has recently focused on the quality of free energy profiles obtained by Jarzynski's equality, a widely used equation in biosystems. In the present work, considerable discrepancies were observed between the results obtained by Jarzynski's equality and those derived by umbrella sampling in biomaterial-water model systems. Detailed analyses confirm that such discrepancies arise only when the target molecule moves in the high-density water layer on a material surface. A hybrid scheme was then adopted based on this observation. The agreement between the results of the hybrid scheme and umbrella sampling confirms the former observation and indicates an approach to fast and accurate estimation of adsorption free energy for large biomaterial interfacial systems.

  12. Accurate state estimation from uncertain data and models: an application of data assimilation to mathematical models of human brain tumors

    PubMed Central

    2011-01-01

    Background Data assimilation refers to methods for updating the state vector (initial condition) of a complex spatiotemporal model (such as a numerical weather model) by combining new observations with one or more prior forecasts. We consider the potential feasibility of this approach for making short-term (60-day) forecasts of the growth and spread of a malignant brain cancer (glioblastoma multiforme) in individual patient cases, where the observations are synthetic magnetic resonance images of a hypothetical tumor. Results We apply a modern state estimation algorithm (the Local Ensemble Transform Kalman Filter), previously developed for numerical weather prediction, to two different mathematical models of glioblastoma, taking into account likely errors in model parameters and measurement uncertainties in magnetic resonance imaging. The filter can accurately shadow the growth of a representative synthetic tumor for 360 days (six 60-day forecast/update cycles) in the presence of a moderate degree of systematic model error and measurement noise. Conclusions The mathematical methodology described here may prove useful for other modeling efforts in biology and oncology. An accurate forecast system for glioblastoma may prove useful in clinical settings for treatment planning and patient counseling. Reviewers This article was reviewed by Anthony Almudevar, Tomas Radivoyevitch, and Kristin Swanson (nominated by Georg Luebeck). PMID:22185645

  13. Estimating and Mapping the Population at Risk of Sleeping Sickness

    PubMed Central

    Franco, José R.; Paone, Massimo; Diarra, Abdoulaye; Ruiz-Postigo, José Antonio; Fèvre, Eric M.; Mattioli, Raffaele C.; Jannin, Jean G.

    2012-01-01

    Background Human African trypanosomiasis (HAT), also known as sleeping sickness, persists as a public health problem in several sub-Saharan countries. Evidence-based, spatially explicit estimates of population at risk are needed to inform planning and implementation of field interventions, monitor disease trends, raise awareness and support advocacy. Comprehensive, geo-referenced epidemiological records from HAT-affected countries were combined with human population layers to map five categories of risk, ranging from “very high” to “very low,” and to estimate the corresponding at-risk population. Results Approximately 70 million people distributed over a surface of 1.55 million km2 are estimated to be at different levels of risk of contracting HAT. Trypanosoma brucei gambiense accounts for 82.2% of the population at risk, the remaining 17.8% being at risk of infection from T. b. rhodesiense. Twenty-one million people live in areas classified as moderate to very high risk, where more than 1 HAT case per 10,000 inhabitants per annum is reported. Discussion Updated estimates of the population at risk of sleeping sickness were made, based on quantitative information on the reported cases and the geographic distribution of human population. Due to substantial methodological differences, it is not possible to make direct comparisons with previous figures for at-risk population. By contrast, it will be possible to explore trends in the future. The presented maps of different HAT risk levels will help to develop site-specific strategies for control and surveillance, and to monitor progress achieved by ongoing efforts aimed at the elimination of sleeping sickness. PMID:23145192

  14. Reservoir evaluation of thin-bedded turbidites and hydrocarbon pore thickness estimation for an accurate quantification of resource

    NASA Astrophysics Data System (ADS)

    Omoniyi, Bayonle; Stow, Dorrik

    2016-04-01

    One of the major challenges in the assessment of and production from turbidite reservoirs is to take full account of thin and medium-bedded turbidites (<10 cm and <30 cm, respectively). Although such thinner, low-pay sands may comprise a significant proportion of the reservoir succession, they can go unnoticed by conventional analysis and so negatively impact reserve estimation, particularly in fields producing from prolific thick-bedded turbidite reservoirs. Field development plans often take little note of such thin beds, which are therefore bypassed by mainstream production. In fact, the trapped and bypassed fluids can be vital where maximising field value and optimising production are key business drivers. We have studied in detail a succession of thin-bedded turbidites associated with thicker-bedded reservoir facies in the North Brae Field, UKCS, using a combination of conventional logs and cores to assess the significance of thin-bedded turbidites in computing hydrocarbon pore thickness (HPT). This quantity, being an indirect measure of thickness, is critical for an accurate estimation of original-oil-in-place (OOIP). By using a combination of conventional and unconventional logging analysis techniques, we obtain three different results for the reservoir intervals studied. These results include estimated net sand thickness, average sand thickness, and their distribution trend within a 3D structural grid. The net sand thickness varies from 205 to 380 ft, and HPT ranges from 21.53 to 39.90 ft. We observe that an integrated approach (neutron-density cross plots conditioned to cores) to HPT quantification reduces the associated uncertainties significantly, resulting in estimation of 96% of actual HPT. Further work will focus on assessing the 3D dynamic connectivity of the low-pay sands with the surrounding thick-bedded turbidite facies.

  15. Can endocranial volume be estimated accurately from external skull measurements in great-tailed grackles (Quiscalus mexicanus)?

    PubMed Central

    Palmstrom, Christin R.

    2015-01-01

    There is an increasing need to validate and collect data approximating brain size on individuals in the field to understand what evolutionary factors drive brain size variation within and across species. We investigated whether we could accurately estimate endocranial volume (a proxy for brain size), as measured by computerized tomography (CT) scans, using external skull measurements and/or by filling skulls with beads and pouring them out into a graduated cylinder for male and female great-tailed grackles. We found that while females had higher correlations than males, estimations of endocranial volume from external skull measurements or beads did not tightly correlate with CT volumes. We found no accuracy in the ability of external skull measures to predict CT volumes because the prediction intervals for most data points overlapped extensively. We conclude that we are unable to detect individual differences in endocranial volume using external skull measurements. These results emphasize the importance of validating and explicitly quantifying the predictive accuracy of brain size proxies for each species and each sex. PMID:26082858

  16. Estimating the state of a geophysical system with sparse observations: time delay methods to achieve accurate initial states for prediction

    NASA Astrophysics Data System (ADS)

    An, Zhe; Rey, Daniel; Ye, Jingxin; Abarbanel, Henry D. I.

    2017-01-01

    The problem of forecasting the behavior of a complex dynamical system through analysis of observational time-series data becomes difficult when the system expresses chaotic behavior and the measurements are sparse in space and/or time. Despite the fact that this situation is quite typical across many fields, including numerical weather prediction, the issue of whether the available observations are "sufficient" for generating successful forecasts is still not well understood. An analysis by Whartenby et al. (2013) found that in the context of the nonlinear shallow water equations on a β plane, standard nudging techniques require observing approximately 70 % of the full set of state variables. Here we examine the same system using a method introduced by Rey et al. (2014a), which generalizes standard nudging methods to utilize time delayed measurements. We show that in certain circumstances, it provides a sizable reduction in the number of observations required to construct accurate estimates and high-quality predictions. In particular, we find that this estimate of 70 % can be reduced to about 33 % using time delays, and even further if Lagrangian drifter locations are also used as measurements.

  17. Methods to Develop Inhalation Cancer Risk Estimates for ...

    EPA Pesticide Factsheets

    This document summarizes the approaches and rationale for the technical and scientific considerations used to derive inhalation cancer risks for emissions of chromium and nickel compounds from electric utility steam generating units. The purpose of this document is to discuss the methods used to develop inhalation cancer risk estimates associated with emissions of chromium and nickel compounds from coal- and oil-fired electric utility steam generating units (EGUs) in support of EPA's recently proposed Air Toxics Rule.

  18. Studies on the extended Techa river cohort: cancer risk estimation

    SciTech Connect

    Kossenko, M M.; Preston, D L.; Krestinina, L Y.; Degteva, M O.; Startsev, N V.; Thomas, T; Vyushkova, O V.; Anspaugh, L R.; Napier, Bruce A. ); Kozheurov, V P.; Ron, E; Akleyev, A V.

    2001-12-01

    Initial population-based studies of riverside residents were begun in the late 1950s and in 1967 a systematic effort was undertaken to develop a well-defined fixed cohort of Techa river residents, to carry out ongoing mortality and (limited) clinical follow-up of this cohort, and to provide individualized dose estimates for cohort members. Over the past decade, extensive efforts have been made to refine the cohort definition and improve both the follow-up and dosimetry data. Analyses of the Techa river cohort can provide useful quantitative estimates of the effects of low dose rate, chronic external and internal exposures on cancer mortality and incidence and non-cancer mortality rates. These risk estimates complement quantitative risk estimates for acute exposures based on the atomic bomb survivors and chronic exposure risk estimates from worker studies, including Mayak workers and other groups with occupational radiation exposures. As the dosimetry and follow-up are refined it may also be possible to gain useful insights into risks associated with 90Sr exposures.

  19. Accurately Predicting Future Reading Difficulty for Bilingual Latino Children at Risk for Language Impairment

    ERIC Educational Resources Information Center

    Petersen, Douglas B.; Gillam, Ronald B.

    2013-01-01

    Sixty-three bilingual Latino children who were at risk for language impairment were administered reading-related measures in English and Spanish (letter identification, phonological awareness, rapid automatized naming, and sentence repetition) and descriptive measures including English language proficiency (ELP), language ability (LA),…

  20. Estimating transport fatality risk from past accident data.

    PubMed

    Evans, Andrew W

    2003-07-01

    This paper examines the statistical properties of estimates of fatal accident rates, mean fatalities per accident, and fatality rates when these estimates are based on past accident data. The statistical properties are illustrated by two long-term transport fatal accident datasets from Great Britain, the principal one for railways and the other for roads, chosen to provide a statistical contrast. In both modes, the accident rates have fallen substantially over the long term. Two statistical estimates of current accident and fatality rates are presented for each dataset, one based only on recent data and the other based on estimates of long-term trends. The trend-based estimate is preferred for train accidents because this makes maximum use of the limited and variable data; the recent data are preferred for road accidents because this avoids unnecessary dependence on modelling the trends. For train accidents, the estimated fatality rate based on past accidents is compared with an estimate produced by the railway industry using a risk model. The statistical estimate is less than half the industry's estimate, and the paper concludes that the statistical estimate is to be preferred.
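The recent-data estimate discussed above is, at its simplest, an observed count divided by a known exposure, with sampling uncertainty from the count. A sketch treating fatal accidents as a Poisson count and using a normal approximation for the interval; the exposure units and numbers below are hypothetical, and this is not the paper's trend-based estimator:

```python
import math
from statistics import NormalDist

def fatal_accident_rate(events, exposure, alpha=0.05):
    """Point estimate and approximate CI for a Poisson accident rate.

    events: observed fatal accidents in the period;
    exposure: traffic volume over the same period (e.g. billion train-km).
    Uses sqrt(events) as the normal-approximation SD of the count.
    """
    rate = events / exposure
    z = NormalDist().inv_cdf(1.0 - alpha / 2.0)
    half_width = z * math.sqrt(events) / exposure
    return rate, max(rate - half_width, 0.0), rate + half_width
```

With small counts, as for train accidents, the interval is wide relative to the rate itself, which is why the paper prefers pooling information across years via a trend model for the rail data.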

  1. On the estimation of risk associated with an attenuation prediction

    NASA Technical Reports Server (NTRS)

    Crane, R. K.

    1992-01-01

    Viewgraphs from a presentation on the estimation of risk associated with an attenuation prediction are presented. Topics covered include: link failure - attenuation exceeding a specified threshold for a specified time interval or intervals; risk - the probability of one or more failures during the lifetime of the link or during a specified accounting interval; the problem - modeling the probability of attenuation by rainfall to provide a prediction of the attenuation threshold for a specified risk; and accounting for the inadequacy of a model or models.

  2. TIMP2•IGFBP7 biomarker panel accurately predicts acute kidney injury in high-risk surgical patients

    PubMed Central

    Gunnerson, Kyle J.; Shaw, Andrew D.; Chawla, Lakhmir S.; Bihorac, Azra; Al-Khafaji, Ali; Kashani, Kianoush; Lissauer, Matthew; Shi, Jing; Walker, Michael G.; Kellum, John A.

    2016-01-01

    BACKGROUND Acute kidney injury (AKI) is an important complication in surgical patients. Existing biomarkers and clinical prediction models underestimate the risk for developing AKI. We recently reported data from two trials of 728 and 408 critically ill adult patients in whom urinary TIMP2•IGFBP7 (NephroCheck, Astute Medical) was used to identify patients at risk of developing AKI. Here we report a preplanned analysis of surgical patients from both trials to assess whether urinary tissue inhibitor of metalloproteinase 2 (TIMP-2) and insulin-like growth factor–binding protein 7 (IGFBP7) accurately identify surgical patients at risk of developing AKI. STUDY DESIGN We enrolled adult surgical patients at risk for AKI who were admitted to one of 39 intensive care units across Europe and North America. The primary end point was moderate-severe AKI (equivalent to KDIGO [Kidney Disease Improving Global Outcomes] stages 2–3) within 12 hours of enrollment. Biomarker performance was assessed using the area under the receiver operating characteristic curve, integrated discrimination improvement, and category-free net reclassification improvement. RESULTS A total of 375 patients were included in the final analysis of whom 35 (9%) developed moderate-severe AKI within 12 hours. The area under the receiver operating characteristic curve for [TIMP-2]•[IGFBP7] alone was 0.84 (95% confidence interval, 0.76–0.90; p < 0.0001). Biomarker performance was robust in sensitivity analysis across predefined subgroups (urgency and type of surgery). CONCLUSION For postoperative surgical intensive care unit patients, a single urinary TIMP2•IGFBP7 test accurately identified patients at risk for developing AKI within the ensuing 12 hours and its inclusion in clinical risk prediction models significantly enhances their performance. LEVEL OF EVIDENCE Prognostic study, level I. PMID:26816218

  3. Estimating the gas transfer velocity: a prerequisite for more accurate and higher resolution GHG fluxes (lower Aare River, Switzerland)

    NASA Astrophysics Data System (ADS)

    Sollberger, S.; Perez, K.; Schubert, C. J.; Eugster, W.; Wehrli, B.; Del Sontro, T.

    2013-12-01

    Currently, carbon dioxide (CO2) and methane (CH4) emissions from lakes, reservoirs and rivers are readily investigated due to the global warming potential of those gases and the role these inland waters play in the carbon cycle. However, there is a lack of high spatiotemporally-resolved emission estimates, and how to accurately assess the gas transfer velocity (K) remains controversial. In anthropogenically-impacted systems where run-of-river reservoirs disrupt the flow of sediments by increasing the erosion and load accumulation patterns, the resulting production of carbonic greenhouse gases (GH-C) is likely to be enhanced. The GH-C flux is thus counteracting the terrestrial carbon sink in these environments that act as net carbon emitters. The aim of this project was to determine the GH-C emissions from a medium-sized river heavily impacted by several impoundments and channelization through a densely-populated region of Switzerland. Estimating gas emission from rivers is not trivial and recently several models have been put forth to do so; therefore a second goal of this project was to compare the river emission models available with direct measurements. Finally, we further validated the modeled fluxes by using a combined approach with water sampling, chamber measurements, and highly temporal GH-C monitoring using an equilibrator. We conducted monthly surveys along the 120 km of the lower Aare River where we sampled for dissolved CH4 ('manual' sampling) at a 5-km sampling resolution, and measured gas emissions directly with chambers over a 35 km section. We calculated fluxes (F) via the boundary layer equation (F=K×(Cw-Ceq)) that uses the water-air GH-C concentration (C) gradient (Cw-Ceq) and K, which is the most sensitive parameter. K was estimated using 11 different models found in the literature with varying dependencies on: river hydrology (n=7), wind (2), heat exchange (1), and river width (1). We found that chamber fluxes were always higher than boundary
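The boundary-layer relation quoted in the abstract is direct to evaluate once K and the concentration gradient are known. A minimal sketch, with all numeric values and units hypothetical:

```python
def boundary_layer_flux(k, c_water, c_equilibrium):
    """F = K * (Cw - Ceq): positive flux means outgassing to the atmosphere.

    k: gas transfer velocity (e.g. m/day);
    c_water, c_equilibrium: dissolved and air-equilibrium gas
    concentrations in consistent units (e.g. mol/m^3).
    """
    return k * (c_water - c_equilibrium)

# Hypothetical numbers: supersaturated CH4 and a hydrology-derived K.
flux = boundary_layer_flux(k=4.0, c_water=0.25e-3, c_equilibrium=0.003e-3)
```

Because F scales linearly with K, the spread across the 11 K models mentioned above propagates one-for-one into the flux estimates, which is why K is described as the most sensitive parameter.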

  4. Estimating wildfire risk on a Mojave Desert landscape using remote sensing and field sampling

    USGS Publications Warehouse

    Van Linn, Peter F.; Nussear, Kenneth E.; Esque, Todd C.; DeFalco, Lesley A.; Inman, Richard D.; Abella, Scott R.

    2013-01-01

    Predicting wildfires that affect broad landscapes is important for allocating suppression resources and guiding land management. Wildfire prediction in the south-western United States is of specific concern because of the increasing prevalence and severe effects of fire on desert shrublands and the current lack of accurate fire prediction tools. We developed a fire risk model to predict fire occurrence in a north-eastern Mojave Desert landscape. First we developed a spatial model using remote sensing data to predict fuel loads based on field estimates of fuels. We then modelled fire risk (interactions of fuel characteristics and environmental conditions conducive to wildfire) using satellite imagery, our model of fuel loads, and spatial data on ignition potential (lightning strikes and distance to roads), topography (elevation and aspect) and climate (maximum and minimum temperatures). The risk model was developed during a fire year at our study landscape and validated at a nearby landscape; model performance was accurate and similar at both sites. This study demonstrates that remote sensing techniques used in combination with field surveys can accurately predict wildfire risk in the Mojave Desert and may be applicable to other arid and semiarid lands where wildfires are prevalent.

  5. Sensitivity of risk estimates to wildlife bioaccumulation factors in ecological risk assessment

    SciTech Connect

    Karustis, C.G.; Brewer, R.A.

    1995-12-31

    The concept of conservatism in risk assessment is well established. However, overly conservative assumptions may result in risk estimates that incorrectly predict remediation goals. Therefore, realistic assumptions should be applied in risk assessment whenever possible. A sensitivity analysis was performed on conservative (i.e. bioaccumulation factor = 1) and scientifically-derived wildlife bioaccumulation factors (BAFs) utilized to calculate risks during a terrestrial ecological risk assessment (ERA). In the first approach, 100% bioaccumulation of contaminants was assumed to estimate the transfer of contaminants through the terrestrial food chain. In the second approach, scientifically-derived BAFs were selected from the literature. For one of the measurement species selected, total risks calculated during the first approach were higher than those calculated during the second approach by two orders of magnitude. However, potential risks due to individual contaminants were not necessarily higher using the conservative approach. Potential risks due to contaminants with low actual bioaccumulation were exaggerated, while potential risks due to contaminants with greater than 100% bioaccumulation were underestimated. Therefore, the use of a default of 100% bioaccumulation (BAF = 1) for all contaminants encountered during an ERA could result in cases where contaminants are incorrectly identified as risk drivers, and the calculation of incorrect ecological risk-based cleanup goals. The authors suggest using site-specific or literature-derived BAFs whenever possible and realistic BAF estimates, based upon factors such as log K_ow, when BAFs are unavailable.

  6. Underestimating the Alcohol Content of a Glass of Wine: The Implications for Estimates of Mortality Risk

    PubMed Central

    Britton, Annie; O’Neill, Darragh; Bell, Steven

    2016-01-01

    Aims Increases in glass sizes and wine strength over the last 25 years in the UK are likely to have led to an underestimation of alcohol intake in population studies. We explore whether this probable misclassification affects the association between average alcohol intake and risk of mortality from all causes, cardiovascular disease and cancer. Methods Self-reported alcohol consumption in 1997–1999 among 7010 men and women in the Whitehall II cohort of British civil servants was linked to the risk of mortality until mid-2015. A conversion factor of 8 g of alcohol per wine glass (1 unit) was compared with a conversion of 16 g per wine glass (2 units). Results When applying a higher alcohol content conversion for wine consumption, the proportion of heavy/very heavy drinkers increased from 28% to 41% for men and 15% to 28% for women. There was a significantly increased risk of very heavy drinking compared with moderate drinking for deaths from all causes and cancer before and after change in wine conversion; however, the hazard ratios were reduced when a higher wine conversion was used. Conclusions In this population-based study, assuming higher alcohol content in wine glasses changed the estimates of mortality risk. We propose that investigator-led cohorts need to revisit conversion factors based on more accurate estimates of alcohol content in wine glasses. Prospectively, researchers need to collect more detailed information on alcohol including serving sizes and strength. Short summary The alcohol content in a wine glass is likely to be underestimated in population surveys as wine strength and serving size have increased in recent years. We demonstrate that in a large cohort study, this underestimation affects estimates of mortality risk. Investigator-led cohorts need to revisit conversion factors based on more accurate estimates of alcohol content in wine glasses. PMID:27261472
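The reclassification effect described above is plain arithmetic; a minimal sketch of the unit conversion (the 14-glasses-per-week drinker is a hypothetical example, not a study participant):

```python
def weekly_units(glasses_per_week, grams_per_glass, grams_per_unit=8.0):
    """Convert self-reported glasses of wine to UK units of alcohol
    (1 unit = 8 g of ethanol)."""
    return glasses_per_week * grams_per_glass / grams_per_unit

# The same self-report doubles in units under the higher wine conversion:
units_low = weekly_units(14, grams_per_glass=8.0)    # 14 units/week
units_high = weekly_units(14, grams_per_glass=16.0)  # 28 units/week
```

Doubling the grams-per-glass assumption doubles the computed intake, which is how the study moves many respondents from the moderate category into heavy/very heavy drinking.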

  7. SU-E-J-208: Fast and Accurate Auto-Segmentation of Abdominal Organs at Risk for Online Adaptive Radiotherapy

    SciTech Connect

    Gupta, V; Wang, Y; Romero, A; Heijmen, B; Hoogeman, M; Myronenko, A; Jordan, P

    2014-06-01

    Purpose: Various studies have demonstrated that online adaptive radiotherapy by real-time re-optimization of the treatment plan can improve organs-at-risk (OARs) sparing in the abdominal region. Its clinical implementation, however, requires fast and accurate auto-segmentation of OARs in CT scans acquired just before each treatment fraction. Auto-segmentation is particularly challenging in the abdominal region due to the frequently observed large deformations. We present a clinical validation of a new auto-segmentation method that uses fully automated non-rigid registration for propagating abdominal OAR contours from planning to daily treatment CT scans. Methods: OARs were manually contoured by an expert panel to obtain ground truth contours for repeat CT scans (3 per patient) of 10 patients. For the non-rigid alignment, we used a new non-rigid registration method that estimates the deformation field by optimizing the local normalized correlation coefficient with smoothness regularization. This field was used to propagate planning contours to repeat CTs. To quantify the performance of the auto-segmentation, we compared the propagated and ground truth contours using two widely used metrics: the Dice coefficient (Dc) and the Hausdorff distance (Hd). The proposed method was benchmarked against translation and rigid alignment based auto-segmentation. Results: For all organs, the auto-segmentation performed better than the baseline (translation) with an average processing time of 15 s per fraction CT. The overall improvements ranged from 2% (heart) to 32% (pancreas) in Dc, and 27% (heart) to 62% (spinal cord) in Hd. For liver, kidneys, gall bladder, stomach, spinal cord and heart, Dc above 0.85 was achieved. Duodenum and pancreas were the most challenging organs, with both showing relatively larger spreads and medians of 0.79 and 2.1 mm for Dc and Hd, respectively. Conclusion: Based on the achieved accuracy and computational time we conclude that the investigated auto
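The two validation metrics used in this study are standard; a minimal NumPy sketch on toy masks (the brute-force Hausdorff computation is fine for small contours, not for full CT volumes):

```python
import numpy as np

def dice(a, b):
    """Dice coefficient (Dc) of two boolean masks: 2|A n B| / (|A| + |B|)."""
    a, b = np.asarray(a, bool), np.asarray(b, bool)
    return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

def hausdorff(pts_a, pts_b):
    """Symmetric Hausdorff distance (Hd) between two (n, 2) point sets."""
    d = np.linalg.norm(pts_a[:, None, :] - pts_b[None, :, :], axis=-1)
    return max(d.min(axis=1).max(), d.min(axis=0).max())

# Two 2x2 squares shifted by one column: half of their area overlaps
a = np.zeros((4, 4), bool); a[0:2, 0:2] = True
b = np.zeros((4, 4), bool); b[0:2, 1:3] = True
dc = dice(a, b)                                   # 0.5
hd = hausdorff(np.argwhere(a).astype(float),
               np.argwhere(b).astype(float))      # 1.0 (one-pixel shift)
```

Dc rewards volume overlap while Hd penalizes the single worst boundary point, which is why the paper reports both: an organ can score a high Dc yet still have a locally large contour error.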

  8. Estimates of endemic waterborne risks from community-intervention studies.

    PubMed

    Calderon, Rebecca L; Craun, Gunther F

    2006-01-01

    The nature and magnitude of endemic waterborne disease are not well characterized in the United States. Epidemiologic studies of various designs can provide an estimate of the waterborne attributable risk along with other types of information. Community drinking water systems frequently improve their operations and may change drinking water treatment and their major source of water. In the United States, many of these treatment changes are the result of regulations promulgated under the Safe Drinking Water Act. A community-intervention study design takes advantage of these "natural" experiments to assess changes in health risks. In this paper, we review the community-intervention studies that have assessed changes in waterborne gastroenteritis risks among immunocompetent populations in industrialized countries. Published results are available from two studies in Australia, one study in the United Kingdom, and one study in the United States. Preliminary results from two other US studies are also available. Although the current information is limited, the risks reported in these community-intervention studies can help inform the national estimate of endemic waterborne gastroenteritis. Information is provided about endemic waterborne risks for unfiltered surface water sources and a groundwater under the influence of surface water. Community-intervention studies with recommended study modifications should be conducted to better estimate the benefits associated with improved drinking water treatment.

  9. A method for simple and accurate estimation of fog deposition in a mountain forest using a meteorological model

    NASA Astrophysics Data System (ADS)

    Katata, Genki; Kajino, Mizuo; Hiraki, Takatoshi; Aikawa, Masahide; Kobayashi, Tomiki; Nagai, Haruyasu

    2011-10-01

    To apply a meteorological model to investigate fog occurrence, acidification and deposition in mountain forests, the meteorological model WRF was modified to calculate fog deposition accurately by the simple linear function of fog deposition onto vegetation derived from numerical experiments using the detailed multilayer atmosphere-vegetation-soil model (SOLVEG). The modified version of WRF that includes fog deposition (fog-WRF) was tested in a mountain forest on Mt. Rokko in Japan. fog-WRF provided a distinctly better prediction of liquid water content of fog (LWC) than the original version of WRF. It also successfully simulated throughfall observations due to fog deposition inside the forest during the summer season that excluded the effect of forest edges. Using the linear relationship between fog deposition and altitude given by the fog-WRF calculations and the data from throughfall observations at a given altitude, the vertical distribution of fog deposition can be roughly estimated in mountain forests. A meteorological model that includes fog deposition will be useful in mapping fog deposition in mountain cloud forests.

  10. Development of a new, robust and accurate, spectroscopic metric for scatterer size estimation in optical coherence tomography (OCT) images

    NASA Astrophysics Data System (ADS)

    Kassinopoulos, Michalis; Pitris, Costas

    2016-03-01

    The modulations appearing on the backscattering spectrum originating from a scatterer are related to its diameter as described by Mie theory for spherical particles. Many metrics for Spectroscopic Optical Coherence Tomography (SOCT) take advantage of this observation in order to enhance the contrast of Optical Coherence Tomography (OCT) images. However, none of these metrics has achieved high accuracy when calculating the scatterer size. In this work, Mie theory was used to further investigate the relationship between the degree of modulation in the spectrum and the scatterer size. From this study, a new spectroscopic metric, the bandwidth of the Correlation of the Derivative (COD) was developed which is more robust and accurate, compared to previously reported techniques, in the estimation of scatterer size. The self-normalizing nature of the derivative and the robustness of the first minimum of the correlation as a measure of its width, offer significant advantages over other spectral analysis approaches especially for scatterer sizes above 3 μm. The feasibility of this technique was demonstrated using phantom samples containing 6, 10 and 16 μm diameter microspheres as well as images of normal and cancerous human colon. The results are very promising, suggesting that the proposed metric could be implemented in OCT spectral analysis for measuring nuclear size distribution in biological tissues. A technique providing such information would be of great clinical significance since it would allow the detection of nuclear enlargement at the earliest stages of precancerous development.

  11. Neoplastic potential of gastric irradiation. IV. Risk estimates

    SciTech Connect

    Griem, M.L.; Justman, J.; Weiss, L.

    1984-12-01

    No significant tumor increase was found in the initial analysis of patients irradiated for peptic ulcer and followed through 1962. A preliminary study was undertaken 22 years later to estimate the risk of cancer due to gastric irradiation for peptic ulcer disease. A population of 2,049 irradiated patients and 763 medically managed patients has been identified. A relative risk of 3.7 was found for stomach cancer and an initial risk estimate of 5.5 × 10^-6 excess stomach cancers per person-rad was calculated. A more complete follow-up is in progress to further elucidate this observation and decrease the ascertainment bias; however, preliminary data are in agreement with the Japanese atomic bomb reports.

  12. Estimation of myocardial volume at risk from CT angiography

    NASA Astrophysics Data System (ADS)

    Zhu, Liangjia; Gao, Yi; Mohan, Vandana; Stillman, Arthur; Faber, Tracy; Tannenbaum, Allen

    2011-03-01

    The determination of myocardial volume at risk distal to coronary stenosis provides important information for prognosis and treatment of coronary artery disease. In this paper, we present a novel computational framework for estimating the myocardial volume at risk in computed tomography angiography (CTA) imagery. Initially, epicardial and endocardial surfaces, and coronary arteries are extracted using an active contour method. Then, the extracted coronary arteries are projected onto the epicardial surface, and each point on this surface is associated with its closest coronary artery using the geodesic distance measurement. The likely myocardial region at risk on the epicardial surface caused by a stenosis is approximated by the region in which all its inner points are associated with the sub-branches distal to the stenosis on the coronary artery tree. Finally, the likely myocardial volume at risk is approximated by the volume in between the region at risk on the epicardial surface and its projection on the endocardial surface, which is expected to yield computational savings over risk volume estimation using the entire image volume. Furthermore, we expect increased accuracy since, as compared to prior work using the Euclidean distance, we employ the geodesic distance in this work. The experimental results demonstrate the effectiveness of the proposed approach on pig heart CTA datasets.

  13. Estimating population health risk from low-level environmental radon

    SciTech Connect

    Fisher, D.R.

    1980-01-01

    Although incidence of respiratory cancer is directly related to inhalation of radon and radon daughters, the magnitude of the actual risk is uncertain for members of the general population exposed for long periods to low-level concentrations. Currently, any such estimate of the risk must rely on data obtained through previous studies of underground-miner populations. Several methods of risk analysis have resulted from these studies. Since the breathing atmospheres, smoking patterns, and physiology are different between miners and the general public, overestimates of lung cancer risk to the latter may have resulted. Strong evidence exists to support the theory of synergistic action between alpha radiation and other agents, and therefore a modified relative risk model was developed to predict lung cancer risks to the general public. The model considers latent period, observation period, age dependency, and inherent risks from smoking or geographical location. A test of the model showed excellent agreement with results of the study of Czechoslovakian uranium miners, for which the necessary time factors were available. The risk model was also used to predict lung cancer incidence among residents of homes on reclaimed Florida phosphate lands, and results of this analysis indicate that over the space of many years, the increased incidence of lung cancer due to elevated radon levels may be indistinguishable from that due to other causes.

  14. Estimating Non-stationary Flood Risk in a Changing Climate

    NASA Astrophysics Data System (ADS)

    Yu, X.; Cohn, T. A.; Stedinger, J. R.

    2015-12-01

    Flood risk is usually described by a probability distribution for annual maximum streamflow which is assumed not to change with time. Federal, state and local governments in the United States are demanding guidance on flood frequency estimates that account for climate change. If a trend exists in a peak flow series, ignoring it could result in large quantile estimator bias, while trying to estimate a trend will increase the flood quantile estimator's variance. Thus the issue is, what bias-variance tradeoff should we accept? This paper discusses approaches to flood frequency analysis (FFA) when flood series have trends. GCMs describe how annual runoff might vary over sub-continental scales, but this information is nearly useless for FFA in small watersheds. An LP3 Monte Carlo analysis and a re-sampling study of 100-year flood estimation (25- and 50-year projections) compares the performance of five methods: (1) FFA as prescribed in national guidelines (Bulletin 17B), which assumes the flood series is stationary and follows a log-Pearson type III (LP3) distribution; (2) fitting an LP3 distribution with time-varying parameters that include future trends in mean and perhaps variance, where slopes are assumed known; (3) fitting an LP3 distribution with time-varying parameters that capture future trends in mean and perhaps variance, where slopes are estimated from the annual peak flow series; (4) employing only the most recent 30 years of flood records to fit an LP3 distribution; (5) applying a safety factor to the 100-year flood estimator (e.g. a 25% increase). The 100-year flood estimator of method 2 has the smallest log-space mean squared error, though it is unlikely that the true trend would be known. Method 3 is only recommended over method 1 for large trends (≥ 0.5% per year). The 100-year flood estimators of methods 1, 4, and 5 often have poor accuracy. Clearly, flood risk assessment will be a challenge in an uncertain world.
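The stationary baseline in this comparison (a Bulletin 17B-style LP3 fit) can be sketched with a method-of-moments fit to log flows; the peak-flow series below is synthetic, and the regional-skew weighting used in the actual guidelines is omitted:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Synthetic stationary annual peak flows (m^3/s)
peaks = rng.lognormal(mean=6.0, sigma=0.5, size=80)
logq = np.log(peaks)

# LP3 = Pearson type III fitted to the log flows; moment estimates of
# mean, standard deviation, and station skew
g = stats.skew(logq, bias=False)
mu, sd = logq.mean(), logq.std(ddof=1)

# 100-year flood = 0.99 quantile of the fitted distribution
q100 = float(np.exp(stats.pearson3.ppf(0.99, g, loc=mu, scale=sd)))
```

Under a trend, this stationary quantile is exactly the quantity the paper shows to be biased; the time-varying variants replace the constant `mu` with a fitted or assumed function of time.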

  15. Risk assessment in diabetes management: how do general practitioners estimate risks due to diabetes?

    PubMed Central

    Häussler, Bertram; Fischer, Gisela C; Meyer, Sibylle; Sturm, Diethard

    2007-01-01

    Objectives To evaluate the ability of general practitioners (GPs) in Germany to estimate the risk of patients with diabetes developing complications. Methods An interview study using a structured questionnaire to estimate risks of four case vignettes having diabetes-specific complications within the next 10 years, risk reduction and life expectancy potential. A representative random sample of 584 GPs was drawn, of which 150 could be interviewed. We compared GPs' estimates among each other (intraclass correlation coefficient (ICC) and Cohen's multirater κ) and with risks for long-term complications generated by the multifactor disease model "Mellibase", which is a knowledge-based support system for medical decision management. Results The risk estimates by GPs varied widely (ICC 0.21, 95% CI 0.13 to 0.36). The average level of potential risk reduction was between 47% and 70%. Compared with Mellibase values, on average, the GPs overestimated the risk threefold. Mean estimates of potential prolongation of life expectancy were close to 10 years for each patient, whereas the Mellibase calculations ranged from 3 to 10 years. Conclusions Overestimation could lead to unnecessary care and waste of resources. PMID:17545348

  16. Attributable Risk Estimate of Severe Psoriasis on Major Cardiovascular Events

    PubMed Central

    Mehta, Nehal N.; Yu, YiDing; Pinnelas, Rebecca; Krishnamoorthy, Parasuram; Shin, Daniel B.; Troxel, Andrea B.; Gelfand, Joel M.

    2011-01-01

    Background Recent studies suggest that psoriasis, particularly if severe, may be a risk factor for major adverse cardiac events such as myocardial infarction, stroke, and mortality from cardiovascular disease. We compared the risk of major adverse cardiac events between patients with psoriasis and the general population and estimated the attributable risk of severe psoriasis. Methods We performed a cohort study in the General Practice Research Database. Severe psoriasis was defined as receiving a psoriasis diagnosis and systemic therapy (N=3,603). Up to 4 patients without psoriasis were selected from the same practices and start dates for each patient with psoriasis (N=14,330). Results Severe psoriasis was a risk factor for major adverse cardiac events (hazard ratio 1.53; 95% confidence interval 1.26, 1.85) after adjusting for age, gender, diabetes, hypertension, tobacco use and hyperlipidemia. After fully adjusted analysis, severe psoriasis conferred an additional 6.2% absolute risk of 10-year major adverse cardiac events. Conclusions Severe psoriasis confers an additional 6.2% absolute risk of 10-year rate of major adverse cardiac events compared to the general population. This potentially has important therapeutic implications for cardiovascular risk stratification and prevention in patients with severe psoriasis. Future prospective studies are needed to validate these findings. PMID:21787906

  17. Estimation of earthquake risk curves of physical building damage

    NASA Astrophysics Data System (ADS)

    Raschke, Mathias; Janouschkowetz, Silke; Fischer, Thomas; Simon, Christian

    2014-05-01

    In this study, a new approach to quantify seismic risks is presented. Here, the earthquake risk curves for the number of buildings with a defined physical damage state are estimated for South Africa. Therein, we define the physical damage states according to the current European macro-seismic intensity scale (EMS-98). The advantage of this kind of risk curve is that its plausibility can be checked more easily than that of other types. The earthquake risk curve for physical building damage can be compared with historical damage and the corresponding empirical return periods. The number of damaged buildings from historical events is generally explored and documented in more detail than the corresponding monetary losses. The latter are also influenced by different economic conditions, such as inflation and price hikes. Further on, the monetary risk curve can be derived from the developed risk curve of physical building damage. The earthquake risk curve can also be used for the validation of underlying sub-models such as the hazard and vulnerability modules.

  18. Estimating radiation risk induced by CT screening for Korean population

    NASA Astrophysics Data System (ADS)

    Yang, Won Seok; Yang, Hye Jeong; Min, Byung In

    2017-02-01

    The purposes of this study are to estimate the radiation risks induced by chest/abdomen computed tomography (CT) screening for healthcare and to determine the cancer risk level of the Korean population compared to other populations. We used an ImPACT CT Patient Dosimetry Calculator to compute the organ effective dose induced by CT screening (chest, low-dose chest, abdomen/pelvis, and chest/abdomen/pelvis CT). A risk model based on the principles of the BEIR VII Report was applied in order to estimate the lifetime attributable risk (LAR) using the Korean Life Table 2010. In addition, several countries, including Hong Kong, the United States (U.S.), and the United Kingdom (U.K.), were selected for comparison. Herein, the risk for each population exposed to a radiation dose of 100 mSv was classified according to country, gender and age. The total effective doses calculated by ImPACT for the four CT screenings were 6.2, 1.5, 5.2 and 11.4 mSv, respectively. The LAR of Korean females was similar to that of Hong Kong females but lower than those of U.S. and U.K. females, except for those in their twenties. The LAR of Korean males was the highest for all types of CT screening. However, the differences in risk level were negligible because the values themselves were quite low.

  19. A Review of Expertise and Judgment Processes for Risk Estimation

    SciTech Connect

    R. L. Boring

    2007-06-01

    A major challenge of risk and reliability analysis for human errors or hardware failures is the need to enlist expert opinion in areas for which adequate operational data are not available. Experts enlisted in this capacity provide probabilistic estimates of reliability, typically comprised of a measure of central tendency and uncertainty bounds. While formal guidelines for expert elicitation are readily available, they largely fail to provide a theoretical basis for expertise and judgment. This paper reviews expertise and judgment in the context of risk analysis; overviews judgment biases, the role of training, and multivariate judgments; and provides guidance on the appropriate use of atomistic and holistic judgment processes.

  20. Estimates of health risk from exposure to radioactive pollutants

    SciTech Connect

    Sullivan, R.E.; Nelson, N.S.; Ellett, W.H.; Dunning, D.E. Jr.; Leggett, R.W.; Yalcintas, M.G.; Eckerman, K.F.

    1981-11-01

    A dosimetric and health effects analysis has been performed for the Office of Radiation Programs of the Environmental Protection Agency (EPA) to assess potential hazards from radioactive pollutants. Contemporary dosimetric methods were used to obtain estimates of dose rates to reference organs from internal exposures due to either inhalation of contaminated air or ingestion of contaminated food, or from external exposures due to either immersion in contaminated air or proximity to contaminated ground surfaces. These dose rates were then used to estimate the number of premature cancer deaths arising from such exposures and the corresponding number of years of life lost in a cohort of 100,000 persons, all simultaneously liveborn and all going through life with the same risks of dying from competing causes. The risk of dying from a competing cause for a given year was taken to be the probability of dying from all causes as given in a recent actuarial life table for the total US population.

  1. Improved risk estimates for carbon tetrachloride. 1998 annual progress report

    SciTech Connect

    Benson, J.M.; Springer, D.L.; Thrall, K.D.

    1998-06-01

    The overall purpose of these studies is to improve the scientific basis for assessing the cancer risk associated with human exposure to carbon tetrachloride. Specifically, the toxicokinetics of inhaled carbon tetrachloride is being determined in rats, mice and hamsters. Species differences in the metabolism of carbon tetrachloride by rats, mice and hamsters are being determined in vivo and in vitro using tissues and microsomes from these rodent species and man. Dose-response relationships will be determined in all studies. The information will be used to improve the current physiologically based pharmacokinetic model for carbon tetrachloride. The authors will also determine whether carbon tetrachloride is a hepatocarcinogen only when exposure results in cell damage, cell killing, and regenerative cell proliferation. In combination, the results of these studies will provide the types of information needed to enable a refined risk estimate for carbon tetrachloride under EPA's new guidelines for cancer risk assessment.

  2. Risk estimation based on chromosomal aberrations induced by radiation

    NASA Technical Reports Server (NTRS)

    Durante, M.; Bonassi, S.; George, K.; Cucinotta, F. A.

    2001-01-01

    The presence of a causal association between the frequency of chromosomal aberrations in peripheral blood lymphocytes and the risk of cancer has been substantiated recently by epidemiological studies. Cytogenetic analyses of crew members of the Mir Space Station have shown that a significant increase in the frequency of chromosomal aberrations can be detected after flight, and that such an increase is likely to be attributed to the radiation exposure. The risk of cancer can be estimated directly from the yields of chromosomal aberrations, taking into account some aspects of individual susceptibility and other factors unrelated to radiation. However, the use of an appropriate technique for the collection and analysis of chromosomes and the choice of the structural aberrations to be measured are crucial in providing sound results. Based on the fraction of aberrant lymphocytes detected before and after flight, the relative risk after a long-term Mir mission is estimated to be about 1.2-1.3. The new technique of mFISH can provide useful insights into the quantification of risk on an individual basis.

  3. Estimation of tuberculosis risk on a commercial airliner.

    PubMed

    Ko, Gwangpyo; Thompson, Kimberly M; Nardell, Edward A

    2004-04-01

    This article estimates the risk of tuberculosis (TB) transmission on a typical commercial airliner using a simple one box model (OBM) and a sequential box model (SBM). We used input data derived from an actual TB exposure on an airliner, and we assumed a hypothetical scenario that a highly infectious TB source case (i.e., 108 infectious quanta per hour) travels as a passenger on an 8.7-hour flight. We estimate an average risk of TB transmission on the order of 1 chance in 1,000 for all passengers using the OBM. Applying the more realistic SBM, we show that the risk and incidence decrease sharply in a stepwise fashion in cabins downstream from the cabin containing the source case assuming some potential for airflow from more contaminated to less contaminated cabins. We further characterized spatial variability in the risk within the cabin by modeling a previously reported TB outbreak in an airplane to demonstrate that the TB cases occur most likely within close proximity of the source TB patient.
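The one box model above is in the spirit of the classic Wells-Riley equation for a single well-mixed space; the sketch below reuses the quoted quanta generation rate but otherwise assumes illustrative cabin parameters, not the study's actual inputs, so the resulting number should not be read as the paper's 1-in-1,000 estimate:

```python
import math

def wells_riley(infectors, quanta_rate, breathing_rate, hours, ventilation):
    """Wells-Riley infection probability for one well-mixed compartment:
    P = 1 - exp(-I*q*p*t/Q).

    quanta_rate: quanta/h per infector; breathing_rate and ventilation
    (outside-air supply) in m^3/h; hours: exposure duration."""
    return 1.0 - math.exp(-infectors * quanta_rate * breathing_rate * hours
                          / ventilation)

# Assumed cabin: one source case at 108 quanta/h on an 8.7 h flight,
# 0.48 m^3/h breathing rate, 5000 m^3/h outside-air ventilation
risk = wells_riley(1, 108, 0.48, 8.7, 5000)
```

The sequential box model refines this by chaining several such compartments with inter-cabin airflow, which is what produces the stepwise decline in risk away from the source cabin.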

  4. The Need for Accurate Risk Prediction Models for Road Mapping, Shared Decision Making and Care Planning for the Elderly with Advanced Chronic Kidney Disease.

    PubMed

    Stryckers, Marijke; Nagler, Evi V; Van Biesen, Wim

    2016-11-01

    As people age, chronic kidney disease becomes more common, but it rarely leads to end-stage kidney disease. When it does, the choice between dialysis and conservative care can be daunting, as much depends on life expectancy and personal expectations of medical care. Shared decision making implies adequately informing patients about their options, and facilitating deliberation of the available information, such that decisions are tailored to the individual's values and preferences. Accurate estimations of one's risk of progression to end-stage kidney disease and death with or without dialysis are essential for shared decision making to be effective. Formal risk prediction models can help, provided they are externally validated, well-calibrated and discriminative; include unambiguous and measurable variables; and come with readily applicable equations or scores. Reliable, externally validated risk prediction models for progression of chronic kidney disease to end-stage kidney disease or mortality in frail elderly with or without chronic kidney disease are scant. Within this paper, we discuss a number of promising models, highlighting both the strengths and limitations physicians should understand to use them judiciously, and emphasize the need for external validation over new development for further advancing the field.

  5. Observing Volcanic Thermal Anomalies from Space: How Accurate is the Estimation of the Hotspot's Size and Temperature?

    NASA Astrophysics Data System (ADS)

    Zaksek, K.; Pick, L.; Lombardo, V.; Hort, M. K.

    2015-12-01

    Measuring the heat emission from active volcanic features on the basis of infrared satellite images contributes to the volcano's hazard assessment. Because these thermal anomalies only occupy a small fraction (< 1 %) of a typically resolved target pixel (e.g. from Landsat 7, MODIS) the accurate determination of the hotspot's size and temperature is however problematic. Conventionally this is overcome by comparing observations in at least two separate infrared spectral wavebands (Dual-Band method). We investigate the resolution limits of this thermal un-mixing technique by means of a uniquely designed indoor analog experiment. Therein the volcanic feature is simulated by an electrical heating alloy of 0.5 mm diameter installed on a plywood panel of high emissivity. Two thermographic cameras (VarioCam high resolution and ImageIR 8300 by Infratec) record images of the artificial heat source in wavebands comparable to those available from satellite data. These range from the short-wave infrared (1.4-3 µm) over the mid-wave infrared (3-8 µm) to the thermal infrared (8-15 µm). In the conducted experiment the pixel fraction of the hotspot was successively reduced by increasing the camera-to-target distance from 3 m to 35 m. On the basis of an individual target pixel the expected decrease of the hotspot pixel area with distance at a relatively constant wire temperature of around 600 °C was confirmed. The deviation of the hotspot's pixel fraction yielded by the Dual-Band method from the theoretically calculated one was found to be within 20 % up until a target distance of 25 m. This means that a reliable estimation of the hotspot size is only possible if the hotspot is larger than about 3 % of the pixel area, a resolution boundary most remotely sensed volcanic hotspots fall below. Future efforts will focus on the investigation of a resolution limit for the hotspot's temperature by varying the alloy's amperage. Moreover, the un-mixing results for more realistic multi
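The Dual-Band method referenced above is commonly attributed to Dozier (1981): the pixel radiance in two bands is modeled as a mixture of a hot fraction and the background, and the two equations are solved for the hotspot fraction and temperature. A sketch on synthetic radiances; the wavelengths, background temperature, and bisection bracket are assumptions for illustration:

```python
import math

H, C, K = 6.626e-34, 2.998e8, 1.381e-23  # Planck, light speed, Boltzmann

def planck(wl_m, T):
    """Spectral radiance B(lambda, T), W m^-2 sr^-1 m^-1."""
    return (2 * H * C**2 / wl_m**5) / math.expm1(H * C / (wl_m * K * T))

def dual_band_unmix(L_mir, L_tir, wl_mir, wl_tir, T_bg):
    """Dozier-style two-band un-mixing: pixel radiance is
    L_i = f*B_i(T_hot) + (1-f)*B_i(T_bg). Eliminate f using the MIR
    band, then bisect the TIR residual over T_hot."""
    def f_from_mir(T_hot):
        return ((L_mir - planck(wl_mir, T_bg))
                / (planck(wl_mir, T_hot) - planck(wl_mir, T_bg)))
    def residual(T_hot):
        f = f_from_mir(T_hot)
        return f * planck(wl_tir, T_hot) + (1 - f) * planck(wl_tir, T_bg) - L_tir
    lo, hi = T_bg + 1.0, 2000.0          # assumed search bracket
    for _ in range(200):                 # bisection
        mid = 0.5 * (lo + hi)
        if residual(lo) * residual(mid) <= 0:
            hi = mid
        else:
            lo = mid
    T_hot = 0.5 * (lo + hi)
    return f_from_mir(T_hot), T_hot

# Synthetic pixel: 2 % of the pixel at 873 K (~600 °C, like the heated
# wire in the experiment) on a 293 K background, MIR vs TIR bands.
wl_mir, wl_tir, f_true, Th_true, Tb = 3.9e-6, 11.0e-6, 0.02, 873.0, 293.0
L_mir = f_true * planck(wl_mir, Th_true) + (1 - f_true) * planck(wl_mir, Tb)
L_tir = f_true * planck(wl_tir, Th_true) + (1 - f_true) * planck(wl_tir, Tb)
f_est, Th_est = dual_band_unmix(L_mir, L_tir, wl_mir, wl_tir, Tb)
```

With noise-free synthetic radiances the retrieval is essentially exact; the abstract's ~3 % pixel-fraction floor arises once sensor noise and background uncertainty enter.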

  6. Equine seroprevalence rates as an additional indicator for a more accurate risk assessment of the West Nile virus transmission.

    PubMed

    Vignjević, Goran; Vrućina, Ivana; Sestak, Ivana; Turić, Natasa; Bogojević, Mirta Sudarić; Merdić, Enrih

    2013-09-01

    The West Nile virus (WNV) is a zoonotic arbovirus that has recently caused outbreaks in many countries in southern and Central Europe. In 2012, for the first time, it caused an outbreak in eastern Croatia, with a total of 7 human clinical cases. With the aim of assisting public health personnel in improving survey protocols and vector control, the high-risk areas of WNV transmission were estimated and mapped. The study area included the cities of Osijek and Slavonski Brod and 8 municipalities in Vukovarsko-Srijemska County. Risk estimation was based on the seroprevalence of WNV infections in horses as an indicator of the virus's presence, as well as the presence of possible WNV mosquito vectors with their corresponding vector competences. Four mosquito species considered possible WNV vectors were included in this study: Aedes vexans, Culex modestus, Culex pipiens and Ochlerotatus caspius. Mosquitoes were sampled using dry-ice-baited CDC traps, twice a month, between May and October. This study suggests that two mosquito species present the main risk of WNV transmission in eastern Croatia: Culex pipiens, because of its good vector competence, and Aedes vexans, because of its very high abundance. As a result, these two species should be the focus of future mosquito surveillance and vector control management.

  7. How Many Significant Figures are Useful for Public Risk Estimates?

    NASA Astrophysics Data System (ADS)

    Wilde, Paul D.; Duffy, Jim

    2013-09-01

    This paper considers the level of uncertainty in the calculation of public risks from launch or reentry and provides guidance on the number of significant digits that can be used with confidence when reporting the analysis results to decision-makers. The focus of this paper is the uncertainty in collective risk calculations that are used for launches of new and mature ELVs. This paper examines the computational models that are used to estimate total collective risk to the public for a launch, including the model input data and the model results, and characterizes the uncertainties due to both bias and variability. There have been two recent efforts to assess the uncertainty in state-of-the-art risk analysis models used in the US and their input data. One assessment focused on launch area risk from an Atlas V at Vandenberg Air Force Base (VAFB) and the other focused on downrange risk to Eurasia from a Falcon 9 launched from Cape Canaveral Air Force Station (CCAFS). The results of these studies quantified the uncertainties related to both the probability and the consequence of the launch debris hazards. This paper summarizes the results of both of these relatively comprehensive launch risk uncertainty analyses, which addressed both aleatory and epistemic uncertainties. The epistemic uncertainties of most concern were associated with probability of failure and the debris list. Other major sources of uncertainty evaluated were: the casualty area for people in shelters that are impacted by debris, impact distribution size, yield from exploding propellant and propellant tanks, probability of injury from a blast wave for people in shelters or outside, and population density. This paper also summarizes a relatively comprehensive over-flight risk uncertainty analysis performed by the FAA for the second stage of flight for a Falcon 9 from CCAFS. This paper is applicable to baseline collective risk analyses, such as those used to make a commercial license determination, and

  8. Seismic Risk Assessment and Loss Estimation for Tbilisi City

    NASA Astrophysics Data System (ADS)

    Tsereteli, Nino; Alania, Victor; Varazanashvili, Otar; Gugeshashvili, Tengiz; Arabidze, Vakhtang; Arevadze, Nika; Tsereteli, Emili; Gaphrindashvili, Giorgi; Gventcadze, Alexander; Goguadze, Nino; Vephkhvadze, Sophio

    2013-04-01

    The proper assessment of seismic risk is of crucial importance for protecting society and for a city's sustainable economic development, as it is an essential part of seismic hazard reduction. Estimating seismic risk and losses is a complicated task. There is always a deficiency of knowledge about the real seismic hazard, local site effects, the inventory of elements at risk, and infrastructure vulnerability, especially for developing countries. Lately, great efforts were made in the frame of the EMME (Earthquake Model for the Middle East Region) project, whose work packages WP1, WP2, WP3 and WP4 addressed gaps related to seismic hazard assessment and vulnerability analysis. Finally, in the frame of work package WP5 "City Scenario", additional work in this direction and a detailed investigation of local site conditions and the active fault (3D) beneath Tbilisi were carried out. For estimating economic losses, an algorithm was prepared taking into account the obtained inventory. The long-term usage of a building is very complex: it relates to the reliability and durability of buildings and is determined by the concept of depreciation. Depreciation of an entire building is calculated by summing the products of individual construction units' depreciation rates and the corresponding value of these units within the building. This method of calculation is based on the assumption that depreciation is proportional to the building's (construction's) useful life. We used this methodology to create a matrix, which provides a way to evaluate the depreciation rates of buildings of different types and construction periods and to determine their corresponding values. Finally, losses were estimated resulting from shaking with 10%, 5% and 2% exceedance probability in 50 years. The loss resulting from a scenario earthquake (an earthquake with the maximum possible magnitude) was also estimated.
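The depreciation rule described above (summing the products of unit depreciation rates and unit values) can be sketched directly; the construction units, value shares, and rates below are hypothetical:

```python
def building_depreciation(units):
    """Depreciation of an entire building as the value-weighted sum of
    its construction units' depreciation rates. Each unit is
    (name, value, rate) with rate in [0, 1]; values may be absolute
    amounts or shares, since only their proportions matter."""
    total_value = sum(value for _, value, _ in units)
    return sum((value / total_value) * rate for _, value, rate in units)

# Hypothetical construction units: (name, value share, depreciation rate)
units = [("foundation", 25.0, 0.30),
         ("walls",      40.0, 0.45),
         ("roof",       20.0, 0.60),
         ("finishes",   15.0, 0.80)]
print(f"building depreciation: {building_depreciation(units):.3f}")  # 0.495
```

In the paper's matrix, the rates would come from the building type and construction period rather than being fixed constants.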

  9. Risk cross sections and their application to risk estimation in the galactic cosmic-ray environment

    NASA Technical Reports Server (NTRS)

    Curtis, S. B.; Nealy, J. E.; Wilson, J. W.; Chatterjee, A. (Principal Investigator)

    1995-01-01

    Radiation risk cross sections (i.e. risks per particle fluence) are discussed in the context of estimating the risk of radiation-induced cancer on long-term space flights from the galactic cosmic radiation outside the confines of the earth's magnetic field. Such quantities are useful for handling effects not seen after low-LET radiation. Since appropriate cross-section functions for cancer induction for each particle species are not yet available, the conventional quality factor is used as an approximation to obtain numerical results for risks of excess cancer mortality. Risks are obtained for seven of the most radiosensitive organs as determined by the ICRP [stomach, colon, lung, bone marrow (BFO), bladder, esophagus and breast], beneath 10 g/cm2 aluminum shielding at solar minimum. Spectra are obtained for excess relative risk for each cancer per LET interval by calculating the average fluence-LET spectrum for the organ and converting to risk by multiplying by a factor proportional to R_γ·L·Q(L) before integrating over L, the unrestricted LET. Here R_γ is the risk coefficient for low-LET radiation (excess relative mortality per Sv) for the particular organ in question. The total risks of excess cancer mortality obtained are 1.3 and 1.1% to female and male crew, respectively, for a 1-year exposure at solar minimum. Uncertainties in these values are estimated to range between factors of 4 and 15 and are dominated by the biological uncertainties in the risk coefficients for low-LET radiation and in the LET (or energy) dependence of the risk cross sections (as approximated by the quality factor). The direct substitution of appropriate risk cross sections will eventually circumvent entirely the need to calculate, measure or use absorbed dose, equivalent dose and quality factor for such a high-energy charged-particle environment.
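A sketch of the folding described above, using the ICRP 60 Q(L) as the stand-in risk cross section. The fluence spectrum and organ risk coefficient below are placeholders, and the fluence-to-dose conversion assumes unit-density tissue:

```python
import math

def icrp60_quality_factor(L):
    """ICRP Publication 60 quality factor Q(L), with L in keV/um."""
    if L < 10:
        return 1.0
    if L <= 100:
        return 0.32 * L - 2.2
    return 300.0 / math.sqrt(L)

def excess_relative_risk(fluence_spectrum, R_gamma):
    """Fold an organ-averaged fluence-LET spectrum into excess relative
    risk: dose per bin is 1.602e-9 * L * Phi (Gy, unit-density tissue,
    L in keV/um, Phi in cm^-2), weighted by Q(L) and scaled by the
    low-LET risk coefficient R_gamma (excess relative risk per Sv)."""
    GY_PER_KEV_UM_CM2 = 1.602e-9
    H = sum(GY_PER_KEV_UM_CM2 * L * phi * icrp60_quality_factor(L)
            for L, phi in fluence_spectrum)   # dose equivalent, Sv
    return R_gamma * H

# Hypothetical spectrum: (LET in keV/um, annual fluence in cm^-2) pairs.
spectrum = [(0.3, 2.0e8), (2.0, 5.0e7), (20.0, 1.0e6), (150.0, 5.0e4)]
err = excess_relative_risk(spectrum, R_gamma=0.05)
```

The paper's point is that a true risk cross section would replace the L·Q(L) weighting per bin, removing dose and quality factor from the calculation entirely.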

  10. Leukemia risk associated with benzene exposure in the pliofilm cohort. II. Risk estimates.

    PubMed

    Paxton, M B; Chinchilli, V M; Brett, S M; Rodricks, J V

    1994-04-01

    The detailed work histories of the individual workers composing the Pliofilm cohort represent a unique resource for estimating the dose-response for leukemia that may follow occupational exposure to benzene. In this paper, we report the results of analyzing the updated Pliofilm cohort using the proportional hazards model, a more sophisticated technique that uses more of the available exposure data than the conditional logistic model used by Rinsky et al. The more rigorously defined exposure estimates derived by Paustenbach et al. are consistent with those of Crump and Allen in giving estimates of the slope of the leukemogenic dose-response that are not as steep as the slope resulting from the exposure estimates of Rinsky et al. We consider estimates of 0.3-0.5 additional leukemia deaths per thousand workers with 45 ppm-years of cumulative benzene exposure to be the best estimates currently available of leukemia risk from occupational exposure to benzene. These risks were estimated in the proportional hazards model when the exposure estimates of Crump and Allen or of Paustenbach et al. were used to derive a cumulative concentration-by-time metric.

  11. Estimating twin concordance for bivariate competing risks twin data.

    PubMed

    Scheike, Thomas H; Holst, Klaus K; Hjelmborg, Jacob B

    2014-03-30

    For twin time-to-event data, we consider different concordance probabilities, such as the casewise concordance, which is routinely computed as a measure of the lifetime dependence/correlation for specific diseases. The concordance probability here is the probability that both twins have experienced the event of interest. Under the assumption that both twins are censored at the same time, we show how to estimate this probability in the presence of right censoring, and as a consequence, we can then estimate the casewise twin concordance. In addition, we can model the magnitude of within-pair dependence over time, and covariates may be further influential on the marginal risk and dependence structure. We establish the estimators' large-sample properties and suggest various tests, for example, for inferring familial influence. The method is demonstrated and motivated by specific twin data on cancer events with the competing risk death. We thus aim to quantify the degree of dependence through the casewise concordance function and show a significant genetic component.
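Setting aside the right censoring and competing risks the paper actually handles, the casewise concordance reduces to a simple ratio of pair counts; a naive complete-data sketch:

```python
def casewise_concordance(pairs):
    """Casewise concordance: P(co-twin affected | twin affected)
    = 2C / (2C + D), where C is the number of concordant-affected
    pairs and D the number of discordant pairs. Naive complete-data
    version; the paper's estimator additionally handles right
    censoring and the competing risk of death."""
    C = sum(1 for a, b in pairs if a and b)
    D = sum(1 for a, b in pairs if a != b)
    return 2 * C / (2 * C + D)

# Made-up affection-status pairs: 10 concordant, 30 discordant, 60 unaffected.
pairs = [(1, 1)] * 10 + [(1, 0)] * 30 + [(0, 0)] * 60
print(casewise_concordance(pairs))  # 20 / 50 = 0.4
```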

  12. Cancer Risk Estimates from Space Flight Estimated Using Yields of Chromosome Damage in Astronaut's Blood Lymphocytes

    NASA Technical Reports Server (NTRS)

    George, Kerry A.; Rhone, J.; Chappell, L. J.; Cucinotta, F. A.

    2011-01-01

    To date, cytogenetic damage has been assessed in blood lymphocytes from more than 30 astronauts before and after they participated in long-duration space missions of three months or more on board the International Space Station. Chromosome damage was assessed using fluorescence in situ hybridization whole chromosome analysis techniques. For all individuals, the frequency of chromosome damage measured within a month of return from space was higher than their preflight yield, and biodosimetry estimates were within the range expected from physical dosimetry. Follow up analyses have been performed on most of the astronauts at intervals ranging from around 6 months to many years after flight, and the cytogenetic effects of repeat long-duration missions have so far been assessed in four individuals. Chromosomal aberrations in peripheral blood lymphocytes have been validated as biomarkers of cancer risk and cytogenetic damage can therefore be used to characterize excess health risk incurred by individual crewmembers after their respective missions. Traditional risk assessment models are based on epidemiological data obtained on Earth in cohorts exposed predominantly to acute doses of gamma-rays, and the extrapolation to the space environment is highly problematic, involving very large uncertainties. Cytogenetic damage could play a key role in reducing uncertainty in risk estimation because it is incurred directly in the space environment, using specimens from the astronauts themselves. Relative cancer risks were estimated from the biodosimetry data using the quantitative approach derived from the European Study Group on Cytogenetic Biomarkers and Health database. Astronauts were categorized into low, medium, or high tertiles according to their yield of chromosome damage. Age adjusted tertile rankings were used to estimate cancer risk and results were compared with values obtained using traditional modeling approaches. Individual tertile rankings increased after space

  13. Data Sources for the Model-based Small Area Estimates of Cancer Risk Factors and Screening Behaviors - Small Area Estimates

    Cancer.gov

    The model-based estimates of important cancer risk factors and screening behaviors are obtained by combining the responses to the Behavioral Risk Factor Surveillance System (BRFSS) and the National Health Interview Survey (NHIS).

  14. Estimating Worker Risk Levels Using Accident/Incident Data

    SciTech Connect

    Kenoyer, Judson L.; Stenner, Robert D.; Andrews, William B.; Scherpelz, Robert I.; Aaberg, Rosanne L.

    2000-09-26

    The purpose of the work described in this report was to identify methods that are currently being used in the Department of Energy (DOE) complex to identify and control hazards/risks in the workplace, evaluate them in terms of their effectiveness in reducing risk to the workers, and to develop a preliminary method that could be used to predict the relative risks to workers performing proposed tasks using some of the current methodology. This report describes some of the performance indicators (i.e., safety metrics) that are currently being used to track relative levels of workplace safety in the DOE complex, how these fit into an Integrated Safety Management (ISM) system, some strengths and weaknesses of using a statistically based set of indicators, and methods to evaluate them. Also discussed are methods used to reduce risk to the workers and some of the techniques that appear to be working in the process of establishing a condition of continuous improvement. The results of these methods will be used in future work involved with the determination of modifying factors for a more complex model. The preliminary method to predict the relative risk level to workers during an extended future time period is based on a currently used performance indicator that uses several factors tracked in the CAIRS. The relative risks for workers in a sample (but real) facility on the Hanford site are estimated for a time period of twenty years and are based on workforce predictions. This is the first step in developing a more complex model that will incorporate other modifying factors related to the workers, work environment and status of the ISM system to adjust the preliminary prediction.

  15. Global Building Inventory for Earthquake Loss Estimation and Risk Management

    USGS Publications Warehouse

    Jaiswal, Kishor; Wald, David; Porter, Keith

    2010-01-01

    We develop a global database of building inventories using taxonomy of global building types for use in near-real-time post-earthquake loss estimation and pre-earthquake risk analysis, for the U.S. Geological Survey’s Prompt Assessment of Global Earthquakes for Response (PAGER) program. The database is available for public use, subject to peer review, scrutiny, and open enhancement. On a country-by-country level, it contains estimates of the distribution of building types categorized by material, lateral force resisting system, and occupancy type (residential or nonresidential, urban or rural). The database draws on and harmonizes numerous sources: (1) UN statistics, (2) UN Habitat’s demographic and health survey (DHS) database, (3) national housing censuses, (4) the World Housing Encyclopedia and (5) other literature.

  16. Geostatistical analysis of disease data: estimation of cancer mortality risk from empirical frequencies using Poisson kriging

    PubMed Central

    Goovaerts, Pierre

    2005-01-01

    Background Cancer mortality maps are used by public health officials to identify areas of excess and to guide surveillance and control activities. Quality of decision-making thus relies on an accurate quantification of risks from observed rates which can be very unreliable when computed from sparsely populated geographical units or recorded for minority populations. This paper presents a geostatistical methodology that accounts for spatially varying population sizes and spatial patterns in the processing of cancer mortality data. Simulation studies are conducted to compare the performances of Poisson kriging to a few simple smoothers (i.e. population-weighted estimators and empirical Bayes smoothers) under different scenarios for the disease frequency, the population size, and the spatial pattern of risk. A public-domain executable with example datasets is provided. Results The analysis of age-adjusted mortality rates for breast and cervix cancers illustrated some key features of commonly used smoothing techniques. Because of the small weight assigned to the rate observed over the entity being smoothed (kernel weight), the population-weighted average leads to risk maps that show little variability. Other techniques assign larger and similar kernel weights but they use a different piece of auxiliary information in the prediction: global or local means for global or local empirical Bayes smoothers, and spatial combination of surrounding rates for the geostatistical estimator. Simulation studies indicated that Poisson kriging outperforms other approaches for most scenarios, with a clear benefit when the risk values are spatially correlated. Global empirical Bayes smoothers provide more accurate predictions under the least frequent scenario of spatially random risk. Conclusion The approach presented in this paper enables researchers to incorporate the pattern of spatial dependence of mortality rates into the mapping of risk values and the quantification of the
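Of the smoothers compared above, the global empirical Bayes estimator is the simplest to sketch. This follows the method-of-moments form often attributed to Marshall: each area's rate is shrunk toward the global mean, more strongly for sparsely populated units. The counts below are made up:

```python
def empirical_bayes_smooth(deaths, pops):
    """Global empirical Bayes smoothing of area-level rates with
    method-of-moments weights: w_i = a / (a + m / n_i), where m is the
    global mean rate and a the estimated between-area variance."""
    n_units = len(deaths)
    total_pop = sum(pops)
    m = sum(deaths) / total_pop                      # global mean rate
    rates = [d / n for d, n in zip(deaths, pops)]
    s2 = sum(n * (r - m) ** 2 for r, n in zip(rates, pops)) / total_pop
    a = max(s2 - m / (total_pop / n_units), 0.0)     # between-area variance
    smoothed = []
    for r, n in zip(rates, pops):
        w = a / (a + m / n)                          # kernel weight
        smoothed.append(w * r + (1 - w) * m)
    return smoothed

# Hypothetical counts: a small rural unit and a large urban one.
print(empirical_bayes_smooth([2, 50], [1000, 100000]))
```

Poisson kriging replaces the single global mean with a spatial combination of surrounding rates, which is why it wins when risk is spatially correlated.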

  17. Quaternion-Based Unscented Kalman Filter for Accurate Indoor Heading Estimation Using Wearable Multi-Sensor System

    PubMed Central

    Yuan, Xuebing; Yu, Shuai; Zhang, Shengzhi; Wang, Guoping; Liu, Sheng

    2015-01-01

    Inertial navigation based on micro-electromechanical system (MEMS) inertial measurement units (IMUs) has attracted numerous researchers due to its high reliability and independence. Heading estimation, as one of the most important parts of inertial navigation, has been a research focus in this field. Heading estimation using magnetometers is perturbed by magnetic disturbances, such as indoor concrete structures and electronic equipment. The MEMS gyroscope is also used for heading estimation; however, the accuracy of the gyroscope degrades over time. In this paper, a wearable multi-sensor system has been designed to obtain high-accuracy indoor heading estimation, based on a quaternion-based unscented Kalman filter (UKF) algorithm. The proposed multi-sensor system, comprising one three-axis accelerometer, three single-axis gyroscopes, one three-axis magnetometer and one microprocessor, minimizes size and cost. The wearable multi-sensor system was fixed on the waist of a pedestrian and on a quadrotor unmanned aerial vehicle (UAV) for heading estimation experiments in our college building. The results show that the mean heading estimation errors are less than 10° and 5° for the multi-sensor system fixed on the waist of the pedestrian and on the quadrotor UAV, respectively, compared to the reference path. PMID:25961384

  18. Quaternion-based unscented Kalman filter for accurate indoor heading estimation using wearable multi-sensor system.

    PubMed

    Yuan, Xuebing; Yu, Shuai; Zhang, Shengzhi; Wang, Guoping; Liu, Sheng

    2015-05-07

    Inertial navigation based on micro-electromechanical system (MEMS) inertial measurement units (IMUs) has attracted numerous researchers due to its high reliability and independence. Heading estimation, as one of the most important parts of inertial navigation, has been a research focus in this field. Heading estimation using magnetometers is perturbed by magnetic disturbances, such as indoor concrete structures and electronic equipment. The MEMS gyroscope is also used for heading estimation; however, the accuracy of the gyroscope degrades over time. In this paper, a wearable multi-sensor system has been designed to obtain high-accuracy indoor heading estimation, based on a quaternion-based unscented Kalman filter (UKF) algorithm. The proposed multi-sensor system, comprising one three-axis accelerometer, three single-axis gyroscopes, one three-axis magnetometer and one microprocessor, minimizes size and cost. The wearable multi-sensor system was fixed on the waist of a pedestrian and on a quadrotor unmanned aerial vehicle (UAV) for heading estimation experiments in our college building. The results show that the mean heading estimation errors are less than 10° and 5° for the multi-sensor system fixed on the waist of the pedestrian and on the quadrotor UAV, respectively, compared to the reference path.
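A full quaternion UKF is too long to sketch here, but the accelerometer/magnetometer measurement model such a filter uses to correct gyroscope drift reduces to tilt-compensated heading. The axis conventions below are an assumption (NED-style body axes, accelerometer reading +1 g on z when level):

```python
import math

def tilt_compensated_heading(acc, mag):
    """Magnetometer heading with accelerometer tilt compensation: the
    measurement half of an AHRS (a quaternion UKF would fuse this
    with gyroscope propagation). acc and mag are body-frame (x, y, z)
    tuples; returns heading in radians, clockwise from magnetic north."""
    ax, ay, az = acc
    roll = math.atan2(ay, az)
    pitch = math.atan2(-ax, math.hypot(ay, az))
    mx, my, mz = mag
    # De-rotate the measured magnetic field into the horizontal plane.
    mxh = (mx * math.cos(pitch)
           + my * math.sin(pitch) * math.sin(roll)
           + mz * math.sin(pitch) * math.cos(roll))
    myh = my * math.cos(roll) - mz * math.sin(roll)
    return math.atan2(-myh, mxh)
```

Indoors, the magnetic disturbances the abstract mentions corrupt `mag`, which is exactly why the UKF blends this measurement with gyroscope predictions instead of trusting it outright.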

  19. BACKGROUND RADIATION MEASUREMENTS AND CANCER RISK ESTIMATES FOR SEBINKARAHISAR, TURKEY.

    PubMed

    Kurnaz, Asli

    2013-07-19

    This paper presents the measurement results of environmental radioactivity levels for Şebinkarahisar district (uranium-thorium area), Giresun, Turkey. The radioactivity concentrations of (238)U, (232)Th, (40)K and the fission product (137)Cs in soil samples collected from 73 regions from the surroundings of the study area were determined. In situ measurements of the gamma dose rate in air were performed in the same 73 locations where the soil samples were collected using a portable NaI detector. Also the mean radioactivity concentrations of (238)U, (232)Th and (40)K in rock samples collected from 50 regions were determined. The mean estimated cancer risk value was found. The seasonal variations of the indoor radon activity concentrations were determined in the 30 dwellings in the study area. In addition, the mean gross alpha, gross beta and radon activities in tap water samples were determined in the same 30 dwellings. The excess lifetime cancer risk was calculated using the risk factors of International Commission on Radiological Protection and Biological Effects of Ionizing Radiation. Radiological maps of the Şebinkarahisar region were composed using the results obtained from this study.
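The excess lifetime cancer risk referenced above is conventionally computed as ELCR = AEDE × DL × RF. A sketch using the usual UNSCEAR dose conversion and the ICRP 103 fatal-cancer risk factor; the dose rate below is illustrative, not a Şebinkarahisar measurement:

```python
def annual_effective_dose_mSv(dose_rate_nGy_h, occupancy=0.2):
    """Outdoor annual effective dose from the absorbed gamma dose rate
    in air: UNSCEAR conversion 0.7 Sv/Gy, 8760 h/y, outdoor
    occupancy factor 0.2."""
    return dose_rate_nGy_h * 8760 * 0.7 * occupancy * 1e-6  # mSv/y

def excess_lifetime_cancer_risk(aede_mSv, life_years=70,
                                risk_factor_per_Sv=0.057):
    """ELCR = AEDE * DL * RF, with the ICRP 103 detriment-adjusted
    risk factor 0.057 per Sv for the whole population."""
    return aede_mSv * 1e-3 * life_years * risk_factor_per_Sv

aede = annual_effective_dose_mSv(100.0)   # 100 nGy/h, illustrative
print(f"AEDE = {aede:.4f} mSv/y, ELCR = {excess_lifetime_cancer_risk(aede):.2e}")
```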

  20. Estimation and testing of the relative risk of disease in case-control studies with a set of k matched controls per case with known prevalence of disease.

    PubMed

    Moser, Barry Kurt; Halabi, Susan

    2012-01-13

    The analysis of case-control studies with matched controls per case is well documented in the medical literature. Of primary interest is the estimation of the relative risk of disease. Matched case-control studies fall into two scenarios: the probability of exposure is constant within each of the case and control groups, or the probability of exposure varies within each group. Numerous estimation procedures have been developed for both scenarios. Often these procedures are developed under the rare disease assumption, where the relative risk of disease is approximated by the odds ratio. In this paper, without making the rare disease assumption, we develop consistent estimators of the relative risk of disease for both scenarios. Exact derivations of the relative risk of disease are provided. Estimators, confidence intervals, and test statistics for the relative risk of disease are developed. We then make the following observations based on extensive simulations. First, our estimators are as close or closer to the relative risk of disease than other estimators. Second, our estimators produce mean square errors for the relative risk of disease that are as good as or better than these other estimators. Third, our confidence intervals provide accurate coverage probabilities. Therefore, these new estimators, confidence intervals, and test statistics can be used to either estimate or test the relative risk of disease in matched case-control studies.
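The distinction driving the paper, that the odds ratio only approximates the relative risk when the disease is rare, shows up directly in a worked 2×2 example (counts are made up):

```python
def relative_risk(a, b, c, d):
    """RR from a 2x2 table: exposed group (a cases, b non-cases),
    unexposed group (c cases, d non-cases)."""
    return (a / (a + b)) / (c / (c + d))

def odds_ratio(a, b, c, d):
    """OR from the same table: cross-product ratio."""
    return (a * d) / (b * c)

# Common disease (40 % vs 20 % risk): the OR overstates the RR.
a, b, c, d = 40, 60, 20, 80
print(relative_risk(a, b, c, d))  # 0.4 / 0.2 = 2.0
print(odds_ratio(a, b, c, d))     # (40*80)/(60*20) ≈ 2.67
```

As the disease becomes rare (a << b, c << d), the two quantities converge, which is the rare-disease assumption the paper avoids.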

  1. Developing accurate survey methods for estimating population sizes and trends of the critically endangered Nihoa Millerbird and Nihoa Finch.

    USGS Publications Warehouse

    Gorresen, P. Marcos; Camp, Richard J.; Brinck, Kevin W.; Farmer, Chris

    2012-01-01

    Point-transect surveys indicated that millerbirds were more abundant than shown by the strip-transect method, and were estimated at 802 birds in 2010 (95%CI = 652 – 964) and 704 birds in 2011 (95%CI = 579 – 837). Point-transect surveys yielded population estimates with improved precision which will permit trends to be detected in shorter time periods and with greater statistical power than is available from strip-transect survey methods. Mean finch population estimates and associated uncertainty were not markedly different among the three survey methods, but the performance of models used to estimate density and population size are expected to improve as the data from additional surveys are incorporated. Using the point-transect survey, the mean finch population size was estimated at 2,917 birds in 2010 (95%CI = 2,037 – 3,965) and 2,461 birds in 2011 (95%CI = 1,682 – 3,348). Preliminary testing of the line-transect method in 2011 showed that it would not generate sufficient detections to effectively model bird density and, consequently, to produce relatively precise population size estimates. Both species were fairly evenly distributed across Nihoa and appear to occur in all or nearly all available habitat. The time expended and area traversed by observers was similar among survey methods; however, point-transect surveys do not require that observers walk a straight transect line, thereby allowing them to avoid culturally or biologically sensitive areas and minimize the adverse effects of recurrent travel to any particular area. In general, point-transect surveys detect more birds than strip-survey methods, thereby improving precision and resulting population size and trend estimation. The method is also better suited for the steep and uneven terrain of Nihoa

  2. Risking Life and Limb: Estimating a Measure of Medical Care Economic Risk and Considering its Implications.

    PubMed

    Abramowitz, Joelle; O'Hara, Brett; Morris, Darcy Steeg

    2017-04-01

    This paper considers the risk of incurring future medical expenditures in light of a family's resources available to pay for those expenditures as well as their choice of health insurance. We model non-premium medical out-of-pocket expenditures and use the estimates from our model to develop a prospective measure of medical care economic risk, estimating the proportion of families who are at risk of incurring high non-premium out-of-pocket medical care expenses in relation to their resources. We further use the estimates from our model to compare the extent to which different types of insurance mitigate the risk of incurring non-premium expenditures by providing for increased utilization of medical care. We find that while 21.3% of families lack the resources to pay for the median expenditures for their insurance type, 42.4% lack the resources to pay for the 99th percentile of expenditures for their insurance type. We also find the mediating effect of insurance on non-premium expenditures to outweigh the associated premium expense for expenditures above $1804 for employer-sponsored insurance and $4337 for direct purchase insurance for those younger than age 65; and above $12,118 of expenditures for Medicare supplementary plans for those aged 65 or older. Published 2016. This article is a U.S. Government work and is in the public domain in the USA.

  3. Prospect theory based estimation of drivers' risk attitudes in route choice behaviors.

    PubMed

    Zhou, Lizhen; Zhong, Shiquan; Ma, Shoufeng; Jia, Ning

    2014-12-01

    This paper applied prospect theory (PT) to describe drivers' route choice behavior under Variable Message Signs (VMS), which present visual traffic information to assist drivers in making route choice decisions. Rich empirical data from questionnaires and field observations were used to estimate the parameters of PT. To make the parameters more realistic with respect to drivers' attitudes, drivers were classified into different types according to the significant factors influencing their behavior. Based on the travel time distributions of the alternative routes and the route choice results from the questionnaire, the parameterized value function of each category was estimated, representing drivers' risk attitudes and choice characteristics. The empirical verification showed that the estimates were acceptable and effective. The results showed that drivers' risk attitudes and route choice characteristics can be captured by PT under real-time information shown on VMS. For practical application, once drivers' route choice characteristics and parameters are identified, their route choice behavior under different road conditions can be predicted accurately, which is the basis for formulating and implementing traffic guidance measures for targeted traffic management. Moreover, the heterogeneous risk attitudes among drivers should be considered when releasing traffic information and regulating traffic flow.
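The value and probability-weighting functions that PT parameterizes can be sketched with the canonical Tversky-Kahneman forms. The parameter values below are the original 1992 population estimates, not the driver-specific values this paper fit:

```python
def value(x, alpha=0.88, beta=0.88, lam=2.25):
    """Tversky-Kahneman value function: concave for gains, convex and
    steeper for losses (loss aversion coefficient lam)."""
    return x ** alpha if x >= 0 else -lam * (-x) ** beta

def weight(p, gamma=0.61):
    """Inverse-S probability weighting: small probabilities are
    overweighted, large ones underweighted."""
    return p ** gamma / (p ** gamma + (1 - p) ** gamma) ** (1 / gamma)

def prospect_value(outcomes):
    """Prospect value of a simple prospect [(probability, outcome), ...],
    outcomes measured relative to a reference point (e.g. expected
    travel time saved or lost, in minutes)."""
    return sum(weight(p) * value(x) for p, x in outcomes)

# A risky route: save 10 min or lose 10 min, each with probability 0.5.
print(prospect_value([(0.5, 10.0), (0.5, -10.0)]))
# Negative: the mixed 50/50 gamble is unattractive under loss aversion.
```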

  4. Probability distributions of the logarithm of inter-spike intervals yield accurate entropy estimates from small datasets.

    PubMed

    Dorval, Alan D

    2008-08-15

    The maximal information that the spike train of any neuron can pass on to subsequent neurons can be quantified as the neuronal firing pattern entropy. Difficulties associated with estimating entropy from small datasets have proven an obstacle to the widespread reporting of firing pattern entropies and more generally, the use of information theory within the neuroscience community. In the most accessible class of entropy estimation techniques, spike trains are partitioned linearly in time and entropy is estimated from the probability distribution of firing patterns within a partition. Ample previous work has focused on various techniques to minimize the finite dataset bias and standard deviation of entropy estimates from under-sampled probability distributions on spike timing events partitioned linearly in time. In this manuscript we present evidence that all distribution-based techniques would benefit from inter-spike intervals being partitioned in logarithmic time. We show that with logarithmic partitioning, firing rate changes become independent of firing pattern entropy. We delineate the entire entropy estimation process with two example neuronal models, demonstrating the robust improvements in bias and standard deviation that the logarithmic time method yields over two widely used linearly partitioned time approaches.
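
    The logarithmic partitioning the paper advocates amounts to binning inter-spike intervals by fixed-width bins in log time before estimating entropy from the bin probabilities. A minimal sketch (the bin width and function name are illustrative choices, not the paper's implementation):

```python
import math

def isi_entropy(isis, bins_per_decade=10):
    """Entropy (bits) of a spike train's inter-spike intervals,
    partitioned logarithmically in time: each ISI is assigned to a
    fixed-width bin in log10(interval), and entropy is estimated
    from the resulting probability distribution. Ten bins per
    decade is an illustrative choice.
    """
    counts = {}
    for isi in isis:
        b = math.floor(math.log10(isi) * bins_per_decade)
        counts[b] = counts.get(b, 0) + 1
    n = len(isis)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# ISIs falling in one log bin give zero entropy; ISIs spread over
# several decades give higher entropy.
h_regular = isi_entropy([0.100, 0.101, 0.102])
h_spread = isi_entropy([0.001, 0.1, 10.0])
```

    Note how a uniform rescaling of all ISIs (a firing-rate change) only shifts the occupied log bins without reshaping the distribution, which is the rate-independence property the abstract describes.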

  5. Risk Estimates and Risk Factors Related to Psychiatric Inpatient Suicide—An Overview

    PubMed Central

    Madsen, Trine; Erlangsen, Annette; Nordentoft, Merete

    2017-01-01

    People with mental illness have an increased risk of suicide. The aim of this paper is to provide an overview of suicide risk estimates among psychiatric inpatients based on the body of evidence found in scientific peer-reviewed literature; primarily focusing on the relative risks, rates, time trends, and socio-demographic and clinical risk factors of suicide in psychiatric inpatients. Psychiatric inpatients have a very high risk of suicide relative to the background population, but it remains challenging for clinicians to identify those patients that are most likely to die from suicide during admission. Most studies are based on low power, thus compromising quality and generalisability. The few studies with sufficient statistical power mainly identified non-modifiable risk predictors such as male gender, diagnosis, or recent deliberate self-harm. Also, the predictive value of these predictors is low. It would be of great benefit if future studies would be based on large samples while focusing on modifiable predictors over the course of an admission, such as hopelessness, depressive symptoms, and family/social situations. This would improve our chances of developing better risk assessment tools. PMID:28257103

  6. Risk Estimates and Risk Factors Related to Psychiatric Inpatient Suicide-An Overview.

    PubMed

    Madsen, Trine; Erlangsen, Annette; Nordentoft, Merete

    2017-03-02

    People with mental illness have an increased risk of suicide. The aim of this paper is to provide an overview of suicide risk estimates among psychiatric inpatients based on the body of evidence found in scientific peer-reviewed literature; primarily focusing on the relative risks, rates, time trends, and socio-demographic and clinical risk factors of suicide in psychiatric inpatients. Psychiatric inpatients have a very high risk of suicide relative to the background population, but it remains challenging for clinicians to identify those patients that are most likely to die from suicide during admission. Most studies are based on low power, thus compromising quality and generalisability. The few studies with sufficient statistical power mainly identified non-modifiable risk predictors such as male gender, diagnosis, or recent deliberate self-harm. Also, the predictive value of these predictors is low. It would be of great benefit if future studies would be based on large samples while focusing on modifiable predictors over the course of an admission, such as hopelessness, depressive symptoms, and family/social situations. This would improve our chances of developing better risk assessment tools.

  7. How Accurate Are German Work-Time Data? A Comparison of Time-Diary Reports and Stylized Estimates

    ERIC Educational Resources Information Center

    Otterbach, Steffen; Sousa-Poza, Alfonso

    2010-01-01

    This study compares work time data collected by the German Time Use Survey (GTUS) using the diary method with stylized work time estimates from the GTUS, the German Socio-Economic Panel, and the German Microcensus. Although on average the differences between the time-diary data and the interview data are not large, our results show that significant…

  8. A simple method for accurate liver volume estimation by use of curve-fitting: a pilot study.

    PubMed

    Aoyama, Masahito; Nakayama, Yoshiharu; Awai, Kazuo; Inomata, Yukihiro; Yamashita, Yasuyuki

    2013-01-01

    In this paper, we describe the effectiveness of our curve-fitting method by comparing liver volumes estimated by our new technique to volumes obtained with the standard manual contour-tracing method. Hepatic parenchymal-phase images of 13 patients were obtained with multi-detector CT scanners after intravenous bolus administration of 120-150 mL of contrast material (300 mgI/mL). The liver contours of all sections were traced manually by an abdominal radiologist, and the liver volume was computed by summing the volumes inside the contours. The slice range between the first and last section was then divided into 100 equal parts, and each volume was re-sampled by use of linear interpolation. We generated 13 model profile curves by averaging 12 cases, leaving out one case each time, and we estimated the profile curve for each patient by fitting the volume values at 4 points using a scale and translation transform. Finally, we determined the liver volume by integrating the sampling points of the profile curve. We used Bland-Altman analysis to evaluate the agreement between the volumes estimated with our curve-fitting method and the volumes measured by the manual contour-tracing method. The correlation between the volume measured by manual tracing and that estimated with our curve-fitting method was relatively high (r = 0.98; slope 0.97; p < 0.001). The mean difference between manual tracing and our method was -22.9 cm³ (SD of the difference, 46.2 cm³). Our volume-estimating technique, which requires the tracing of only 4 images, exhibited a relatively high linear correlation with the manual tracing technique.

  9. The number of alleles at a microsatellite defines the allele frequency spectrum and facilitates fast accurate estimation of theta.

    PubMed

    Haasl, Ryan J; Payseur, Bret A

    2010-12-01

    Theoretical work focused on microsatellite variation has produced a number of important results, including the expected distribution of repeat sizes and the expected squared difference in repeat size between two randomly selected samples. However, closed-form expressions for the sampling distribution and frequency spectrum of microsatellite variation have not been identified. Here, we use coalescent simulations of the stepwise mutation model to develop gamma and exponential approximations of the microsatellite allele frequency spectrum, a distribution central to the description of microsatellite variation across the genome. For both approximations, the parameter of biological relevance is the number of alleles at a locus, which we express as a function of θ, the population-scaled mutation rate, based on simulated data. Discovered relationships between θ, the number of alleles, and the frequency spectrum support the development of three new estimators of microsatellite θ. The three estimators exhibit roughly similar mean squared errors (MSEs) and all are biased. However, across a broad range of sample sizes and θ values, the MSEs of these estimators are frequently lower than all other estimators tested. The new estimators are also reasonably robust to mutation that includes step sizes greater than one. Finally, our approximation to the microsatellite allele frequency spectrum provides a null distribution of microsatellite variation. In this context, a preliminary analysis of the effects of demographic change on the frequency spectrum is performed. We suggest that simulations of the microsatellite frequency spectrum under evolutionary scenarios of interest may guide investigators to the use of relevant and sometimes novel summary statistics.

  10. Declining bioavailability and inappropriate estimation of risk of persistent compounds

    SciTech Connect

    Kelsey, J.W.; Alexander, M.

    1997-03-01

    Earthworms (Eisenia foetida) assimilated decreasing amounts of atrazine, phenanthrene, and naphthalene that had been incubated for increasing periods of time in sterile soil. The amount of atrazine and phenanthrene removed from soil by mild extractants also decreased with time. The declines in bioavailability of the three compounds to earthworms and of naphthalene to bacteria were not reflected by analysis involving vigorous methods of solvent extraction; similar results for bioavailability of phenanthrene and 4-nitrophenol to bacteria were obtained in a previous study conducted at this laboratory. The authors suggest that regulations based on vigorous extractions for the analyses of persistent organic pollutants in soil do not appropriately estimate exposure or risk to susceptible populations.

  11. Model simulations to estimate malaria risk under climate change.

    PubMed

    Jetten, T H; Martens, W J; Takken, W

    1996-05-01

    The current geographic range of malaria is much smaller than its potential range. In many regions there exists a phenomenon characterized as "Anophelism without malaria": the vectors are present but malaria transmission does not occur. Vectorial capacity has often been used as a parameter to estimate the susceptibility of an area to malaria. Model computations with global climatological data show that a dynamic concept of vectorial capacity can be used as a comparative risk indicator to predict the current extent and distribution of malarious regions in the world. A sensitivity analysis in 3 distinct geographic areas shows that the areas of largest change in epidemic potential caused by a temperature increase are those where mosquitoes already occur but where development of the parasite is limited by temperature. Computations with the model presented here predict, under different climate scenarios, an increased malaria risk in areas bordering malaria-endemic regions and at higher altitudes within malarious regions under a temperature increase of 2-4 degrees C.

  12. Gambling disorder: estimated prevalence rates and risk factors in Macao.

    PubMed

    Wu, Anise M S; Lai, Mark H C; Tong, Kwok-Kit

    2014-12-01

    An excessive, problematic gambling pattern has been regarded as a mental disorder in the Diagnostic and Statistical Manual for Mental Disorders (DSM) for more than 3 decades (American Psychiatric Association [APA], 1980). In this study, its latest prevalence in Macao (one of very few cities with legalized gambling in China and the Far East) was estimated with 2 major changes in the diagnostic criteria, suggested by the 5th edition of DSM (APA, 2013): (a) removing the "Illegal Act" criterion, and (b) lowering the threshold for diagnosis. A random, representative sample of 1,018 Macao residents was surveyed with a phone poll design in January 2013. After the 2 changes were adopted, the present study showed that the estimated prevalence rate of gambling disorder was 2.1% of the Macao adult population. Moreover, the present findings also provided empirical support to the application of these 2 recommended changes when assessing symptoms of gambling disorder among Chinese community adults. Personal risk factors of gambling disorder, namely being male, having low education, a preference for casino gambling, as well as high materialism, were identified.

  13. Improved age modelling and high-precision age estimates of late Quaternary tephras, for accurate palaeoclimate reconstruction

    NASA Astrophysics Data System (ADS)

    Blockley, Simon P. E.; Bronk Ramsey, C.; Pyle, D. M.

    2008-10-01

    The role of tephrochronology, as a dating and stratigraphic tool, in precise palaeoclimate and environmental reconstruction, has expanded significantly in recent years. The power of tephrochronology rests on the fact that a tephra layer can stratigraphically link records at the resolution of as little as a few years, and that the most precise age for a particular tephra can be imported into any site where it is found. In order to maximise the potential of tephras for this purpose it is necessary to have the most precise and robustly tested age estimate possible available for key tephras. Given the varying number and quality of dates associated with different tephras, it is important to be able to build age models to test competing tephra dates. Recent advances in Bayesian age modelling of dates in sequence have radically extended our ability to build such stratigraphic age models. As an example of the potential here we use Bayesian methods, now widely applied, to examine the dating of some key Late Quaternary tephras from Italy. These are: the Agnano Monte Spina Tephra (AMST), the Neapolitan Yellow Tuff (NYT) and the Agnano Pomici Principali (APP), and all of them have multiple estimates of their true age. Further, we use the Bayesian approaches to generate a revised mixed radiocarbon/varve chronology for the important Lateglacial section of the Lago Grande Monticchio record, as a further illustration of what can be achieved by a Bayesian approach. With all three tephras we were able to produce viable model ages for the tephra, validate the proposed 40Ar/39Ar age ranges for these tephras, and provide relatively high precision age models. The results of the Bayesian integration of dating and stratigraphic information suggest that the current best 95% confidence calendar age estimates for the AMST are 4690-4300 cal BP, the NYT 14320-13900 cal BP, and the APP 12380-12140 cal BP.

  14. Is the SenseWear Armband accurate enough to quantify and estimate energy expenditure in healthy adults?

    PubMed Central

    Hernández-Vicente, Adrián; Pérez-Isaac, Raúl; Santín-Medeiros, Fernanda; Cristi-Montero, Carlos; Casajús, Jose Antonio; Garatachea, Nuria

    2017-01-01

    Background The SenseWear Armband (SWA) is a monitor that can be used to estimate energy expenditure (EE); however, it has not been validated in healthy adults. The objective of this paper was to study the validity of the SWA for quantifying EE levels. Methods Twenty-three healthy adults (age 40–55 years, mean: 48±3.42 years) performed different types of standardized physical activity (PA) for 10 minutes (rest, walking at 3 and 5 km·h-1, running at 7 and 9 km·h-1, and sitting/standing at a rate of 30 cycle·min-1). Participants wore the SWA on their right arm, and their EE was measured by indirect calorimetry (IC), the gold standard. Results There were significant differences between the SWA and IC, except in the group that ran at 9 km·h-1 (>9 METs). Bland-Altman analysis showed a bias of 1.56 METs (±1.83 METs) and 95% limits of agreement (LOA) of −2.03 to 5.16 METs. There were indications of heteroscedasticity (R2 = 0.03; P < 0.05). Analysis of the receiver operating characteristic (ROC) curves showed that the SWA does not seem sensitive enough to estimate EE at the highest intensities. Conclusions The SWA is not as precise in estimating EE as IC, but it could be a useful tool for determining levels of EE at low intensities. PMID:28361062
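
    The Bland-Altman statistics quoted in such validation studies (mean difference, or bias, and 95% limits of agreement at bias ± 1.96 SD of the differences) reduce to a short computation. A sketch with made-up METs readings, not the study's data:

```python
def bland_altman(method_a, method_b):
    """Bland-Altman agreement statistics between two measurement
    methods: the mean of the paired differences (bias) and the 95%
    limits of agreement, bias +/- 1.96 * SD of the differences.
    """
    diffs = [a - b for a, b in zip(method_a, method_b)]
    n = len(diffs)
    bias = sum(diffs) / n
    sd = (sum((d - bias) ** 2 for d in diffs) / (n - 1)) ** 0.5
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Illustrative paired METs readings (hypothetical, not the study's):
swa = [3.1, 5.2, 7.9, 9.4]   # armband estimates
ic  = [2.8, 4.6, 7.1, 9.6]   # indirect calorimetry
bias, (lo, hi) = bland_altman(swa, ic)
```

    A positive bias with wide limits of agreement, as reported above, indicates systematic overestimation with substantial individual scatter.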

  15. From mechanisms to risk estimation--bridging the chasm.

    PubMed

    Curtis, S B; Hazelton, W D; Luebeck, E G; Moolgavkar, S H

    2004-01-01

    We have a considerable amount of work ahead of us to determine the importance of the wealth of new information emerging in the fields of sub-cellular, cellular and tissue biology in order to improve the estimation of radiation risk at low dose and protracted dose-rate. In this paper, we suggest that there is a need to develop models of the specific health effects of interest (e.g., carcinogenesis in specific tissues), which embody as much of the mechanistic (i.e., biological) information as is deemed necessary. Although it is not realistic to expect that every radiation-induced process should or could be included, we can hope that the major factors that shape the time dependence of evolution of damage can be identified and quantified to the point where reasonable estimations of risk can be made. Regarding carcinogenesis in particular, the structure of the model itself plays a role in determining the relative importance of various processes. We use a specific form of a multi-stage carcinogenic model to illustrate this point. We show in a review of the application of this model to lung cancer incidence and mortality in two exposed populations that for both high- and low-LET radiation, there is evidence of an "inverse dose-rate" or protraction effect. This result could be of some considerable importance, because it would imply that risk from protracted exposure even to low-LET radiation might be greater than from acute exposure, an opinion not currently held in the radiation protection community. This model also allows prediction of the evolution of the risk over the lifetimes of the exposed individuals. One inference is that radiation-induced initiation (i.e., the first cellular carcinogenic event(s) occurring in normal tissue after the passage of the radiation) may not be the driving factor in the risk, but more important may be the effects of the radiation on already-initiated cells in the tissue. Although present throughout the length of the exposure, radiation

  16. Soil-ecological risks for soil degradation estimation

    NASA Astrophysics Data System (ADS)

    Trifonova, Tatiana; Shirkin, Leonid; Kust, German; Andreeva, Olga

    2016-04-01

    Soil degradation includes the processes of worsening soil properties and quality, primarily from the point of view of soil productivity and the quality of ecosystem services. Complete destruction of the soil cover and/or termination of the functioning of soil organic life are considered extreme stages of soil degradation, and for fragile ecosystems they are normally considered within the framework of the desertification, land degradation and drought (DLDD) concept. A block model of the ecotoxic effects generating soil and ecosystem degradation has been developed as a result of long-term field and laboratory research on sod-podzolic soils contaminated with waste containing heavy metals. The model highlights soil degradation mechanisms caused by the direct and indirect impacts of ecotoxicants on the "phytocenosis-soil" system and by their combination, which frequently causes a synergistic effect. The sequence of changes can be formalized as a theory of change (a succession of interrelated events). Several stages are distinguished, from the leaching (release) of heavy metals from the waste and their migration down the soil profile to decreased phytoproductivity and changes in phytocenosis composition. Decreased phytoproductivity reduces the amount of cellulose introduced into the soil. The described feedback mechanism acts as a factor of sod-podzolic soil self-purification and stability. It has been shown that, using a phytomass productivity index that integrally reflects the worsening of the soil property complex, it is possible to construct dose-response relations and determine critical load levels for the phytocenosis and the corresponding soil-ecological risks. Soil-ecological risk in the "phytocenosis-soil" system means probable negative changes and the loss of some ecosystem functions during the transformation of dead-organic-matter energy into new biomass. Soil-ecological risks estimation is

  17. Measurement of pelvic motion is a prerequisite for accurate estimation of hip joint work in maximum height squat jumping.

    PubMed

    Blache, Yoann; Bobbert, Maarten; Argaud, Sebastien; Pairot de Fontenay, Benoit; Monteil, Karine M

    2013-08-01

    In experiments investigating vertical squat jumping, the HAT segment is typically defined as a line drawn from the hip to some point proximally on the upper body (e.g., the neck, the acromion), and the hip joint as the angle between this line and the upper legs (θUL-HAT). In reality, the hip joint is the angle between the pelvis and the upper legs (θUL-pelvis). This study aimed to estimate to what extent hip joint definition affects hip joint work in maximal squat jumping. Moreover, the initial pelvic tilt was manipulated to maximize the difference in hip joint work as a function of hip joint definition. Twenty-two male athletes performed maximum-effort squat jumps in three different initial pelvic tilt conditions: backward (pelvisB), neutral (pelvisN), and forward (pelvisF). Hip joint work was calculated by integrating the hip net joint torque with respect to θUL-HAT (WUL-HAT) or with respect to θUL-pelvis (WUL-pelvis). θUL-HAT was greater than θUL-pelvis in all conditions. WUL-HAT overestimated WUL-pelvis by 33%, 39%, and 49% in conditions pelvisF, pelvisN, and pelvisB, respectively. It was concluded that θUL-pelvis should be measured when the mechanical output of the hip extensor muscles is estimated.

  18. How accurate is the estimation of anthropogenic carbon in the ocean? An evaluation of the ΔC* method

    NASA Astrophysics Data System (ADS)

    Matsumoto, Katsumi; Gruber, Nicolas

    2005-09-01

    The ΔC* method of Gruber et al. (1996) is widely used to estimate the distribution of anthropogenic carbon in the ocean; however, as yet, no thorough assessment of its accuracy has been made. Here we provide a critical re-assessment of the method and determine its accuracy by applying it to synthetic data from a global ocean biogeochemistry model, for which we know the "true" anthropogenic CO2 distribution. Our results indicate that the ΔC* method tends to overestimate anthropogenic carbon in relatively young waters but underestimate it in older waters. Main sources of these biases are (1) the time evolution of the air-sea CO2 disequilibrium, which is not properly accounted for in the ΔC* method, (2) a pCFC ventilation age bias that arises from mixing, and (3) errors in identifying the different end-member water types. We largely support the findings of Hall et al. (2004), who have also identified the first two bias sources. An extrapolation of the errors that we quantified on a number of representative isopycnals to the global ocean suggests a positive bias of about 7% in the ΔC*-derived global anthropogenic CO2 inventory. The magnitude of this bias is within the previously estimated 20% uncertainty of the method, but regional biases can be larger. Finally, we propose two improvements to the ΔC* method in order to account for the evolution of air-sea CO2 disequilibrium and the ventilation age mixing bias.

  19. Ambulatory blood pressure monitoring: importance of sampling rate and duration--48 versus 24 hours--on the accurate assessment of cardiovascular risk.

    PubMed

    Hermida, Ramón C; Ayala, Diana E; Fontao, María J; Mojón, Artemio; Fernández, José R

    2013-03-01

    Independent prospective studies have found that ambulatory blood pressure (BP) monitoring (ABPM) is more closely correlated with target organ damage and cardiovascular disease (CVD) risk than clinic BP measurement. This is based on studies in which BP was sampled every 15-30 min for ≤24 h, without taking into account that reproducibility of any estimated parameter from a time series to be potentially used for CVD risk assessment might depend more on monitoring duration than on sampling rate. Herein, we evaluated the influence of duration (48 vs. 24 h) and sampling rate of BP measurements (from every 20-30 min up to every 2 h) on the prognostic value of ABPM-derived parameters. We prospectively studied 3344 subjects (1718 men/1626 women), 52.6 ± 14.5 yrs of age, during a median follow-up of 5.6 yrs. Those with hypertension at baseline were randomized to ingest all their prescribed hypertension medications upon awakening or ≥1 of them at bedtime. At baseline, BP was measured at 20-min intervals from 07:00 to 23:00 h and at 30-min intervals at night for 48 h, and physical activity was simultaneously monitored every min by wrist actigraphy to accurately derive the awake and asleep BP means. Identical assessment was scheduled annually and more frequently (quarterly) if treatment adjustment was required. ABPM profiles were modified to generate time series of identical 48-h duration but with data sampled at 1- or 2-h intervals, or shorter (i.e., first 24 h) time series with data sampled at the original rate (daytime 20-min intervals/nighttime 30-min intervals). Bland-Altman plots indicated that the range of individual differences in the estimated awake and asleep systolic (SBP) and diastolic BP (DBP) means between the original and modified ABPM profiles was up to 3-fold smaller for data sampled every 1 h for 48 h than for data sampled every 20-30 min for the first 24 h. Reduction of ABPM duration to just 24 h resulted in error of the

  20. How many measurements are needed to estimate accurate daily and annual soil respiration fluxes? Analysis using data from a temperate rainforest

    NASA Astrophysics Data System (ADS)

    Perez-Quezada, Jorge F.; Brito, Carla E.; Cabezas, Julián; Galleguillos, Mauricio; Fuentes, Juan P.; Bown, Horacio E.; Franck, Nicolás

    2016-12-01

    Making accurate estimations of daily and annual Rs fluxes is key for understanding the carbon cycle process and projecting effects of climate change. In this study we used high-frequency sampling (24 measurements per day) of Rs in a temperate rainforest during 1 year, with the objective of answering the questions of when and how often measurements should be made to obtain accurate estimations of daily and annual Rs. We randomly selected data to simulate samplings of 1, 2, 4 or 6 measurements per day (distributed either during the whole day or only during daytime), combined with 4, 6, 12, 26 or 52 measurements per year. Based on the comparison of partial-data series with the full-data series, we estimated the performance of different partial sampling strategies based on bias, precision and accuracy. In the case of annual Rs estimation, we compared the performance of interpolation vs. using non-linear modelling based on soil temperature. The results show that, under our study conditions, sampling twice a day was enough to accurately estimate daily Rs (RMSE < 10 % of average daily flux), even if both measurements were done during daytime. The highest reduction in RMSE for the estimation of annual Rs was achieved when increasing from four to six measurements per year, but reductions were still relevant when further increasing the frequency of sampling. We found that increasing the number of field campaigns was more effective than increasing the number of measurements per day, provided a minimum of two measurements per day was used. Including night-time measurements significantly reduced the bias and was relevant in reducing the number of field campaigns when a lower level of acceptable error (RMSE < 5 %) was established. Using non-linear modelling instead of linear interpolation did improve the estimation of annual Rs, but not as expected. In conclusion, given that most of the studies of Rs use manual sampling techniques and apply only one measurement per day, we

  1. Accurate spike estimation from noisy calcium signals for ultrafast three-dimensional imaging of large neuronal populations in vivo

    PubMed Central

    Deneux, Thomas; Kaszas, Attila; Szalay, Gergely; Katona, Gergely; Lakner, Tamás; Grinvald, Amiram; Rózsa, Balázs; Vanzetta, Ivo

    2016-01-01

    Extracting neuronal spiking activity from large-scale two-photon recordings remains challenging, especially in mammals in vivo, where large noises often contaminate the signals. We propose a method, MLspike, which returns the most likely spike train underlying the measured calcium fluorescence. It relies on a physiological model including baseline fluctuations and distinct nonlinearities for synthetic and genetically encoded indicators. Model parameters can be either provided by the user or estimated from the data themselves. MLspike is computationally efficient thanks to its original discretization of probability representations; moreover, it can also return spike probabilities or samples. Benchmarked on extensive simulations and real data from seven different preparations, it outperformed state-of-the-art algorithms. Combined with the finding obtained from systematic data investigation (noise level, spiking rate and so on) that photonic noise is not necessarily the main limiting factor, our method allows spike extraction from large-scale recordings, as demonstrated on acousto-optical three-dimensional recordings of over 1,000 neurons in vivo. PMID:27432255

  2. A probabilistic method for the estimation of residual risk in donated blood.

    PubMed

    Bish, Ebru K; Ragavan, Prasanna K; Bish, Douglas R; Slonim, Anthony D; Stramer, Susan L

    2014-10-01

    The residual risk (RR) of transfusion-transmitted infections, including the human immunodeficiency virus and hepatitis B and C viruses, is typically estimated by the incidence/window-period model, which relies on the following restrictive assumptions: each screening test, with probability 1, (1) detects an infected unit outside of the test's window period; (2) fails to detect an infected unit within the window period; and (3) correctly identifies an infection-free unit. These assumptions need not hold in practice due to random or systemic errors and individual variations in the window period. We develop a probability model that accurately estimates the RR by relaxing these assumptions, and quantify their impact using a published cost-effectiveness study and also within an optimization model. These assumptions lead to inaccurate estimates in cost-effectiveness studies and to sub-optimal solutions in the optimization model. The testing solution generated by the optimization model translates into fewer expected infections without an increase in the testing cost.
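
    The contrast between the classical model and the relaxation described here can be expressed as a single probability computation in which assumptions (1)-(3) become adjustable sensitivities. This is a schematic reading of the abstract, not the paper's model; all parameter values are illustrative:

```python
def residual_risk(prevalence, p_window, sens_in_window=0.0, sens_outside=1.0):
    """Probability that a random donated unit is both infected and
    escapes screening. The unit is infected with probability
    `prevalence`; the test misses it either inside the window
    period (sensitivity sens_in_window, classically 0) or outside
    it (sensitivity sens_outside, classically 1). The classical
    defaults recover RR = prevalence * p_window.
    """
    p_escape = (p_window * (1.0 - sens_in_window)
                + (1.0 - p_window) * (1.0 - sens_outside))
    return prevalence * p_escape

# Illustrative numbers only (not estimates from the paper):
classical = residual_risk(1e-4, 0.05)
relaxed = residual_risk(1e-4, 0.05, sens_in_window=0.10, sens_outside=0.999)
```

    Even small departures from perfect detection outside the window, or partial detection inside it, shift the estimate, which is the source of the inaccuracies the abstract quantifies.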

  3. Formula and scale for body surface area estimation in high-risk infants.

    PubMed

    Ahn, Youngmee

    2010-12-01

    Advances in medical technology and the health sciences have led to a rapid increase in the prevalence and morbidity of high-risk infants with chronic or permanent sequelae, such as early preterm infants. A suitable formula is therefore needed for body surface area (BSA) estimation in high-risk infants so that therapeutic regimens can be devised more accurately in clinical practice. A cohort study involving 5014 high-risk infants was conducted to develop a suitable formula for estimating BSA, using four existing formulas from the literature. The BSA of the high-risk infants was calculated with the four BSA equations (Boyd-BSA, Dubois-BSA, Meban-BSA, Mosteller-BSA), from which a new quantity, Mean-BSA, was arithmetically derived as a reference BSA measure. Multiple regression was performed using nonlinear least-squares curve fitting to the trend line, and the new equation, Neo-BSA, was developed using Excel and SPSS 17.0. The Neo-BSA equation was constructed as follows: Neo-BSA = 5.520 × W^0.5526 × L^0.300. Assuming an approximate square-root relation between weight and length, a BSA scale using only weight was constructed specifically for clinical applications, since weight is more readily available than length in high-risk infant populations. The validity of Neo-BSA was evaluated against Meban-BSA, the best of the four equations for high-risk infants, given the similarity of the subjects in the two studies. The other formulas showed substantial variances in BSA compared with Neo-BSA. This study developed a new surface area equation, Neo-BSA, as the most suitable formula for BSA measurement of high-risk infants in modern-day societies, where a population of newborns with shortened gestational ages is becoming more prevalent as a result of advances in the health sciences and reproductive technologies. In particular, a scale for babies of 400-7000 g body weight derived from the Neo-BSA equation has the clinical advantage of
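
    The Neo-BSA formula is a one-line computation. The abstract does not state units; since the companion scale covers 400-7000 g body weight, weight in grams and length in centimetres is assumed here, and that assumption should be checked against the original paper before any clinical use:

```python
def neo_bsa(weight_g, length_cm):
    """Neo-BSA body-surface-area formula as given in the abstract:
    BSA = 5.520 * W**0.5526 * L**0.300.

    Units are an assumption (weight in grams, length in cm); the
    abstract does not specify them.
    """
    return 5.520 * weight_g ** 0.5526 * length_cm ** 0.300

# Example: a 3000 g, 50 cm infant vs. a 1000 g, 40 cm infant.
bsa_term = neo_bsa(3000, 50)
bsa_preterm = neo_bsa(1000, 40)
```

    The weight exponent (0.5526) is close to a square root, which is what lets the paper collapse the formula into a weight-only bedside scale.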

  4. R2 TRI facilities with 1999-2011 risk related estimates throughout the census blockgroup

    EPA Pesticide Factsheets

    This dataset delineates the distribution of estimated risk from TRI facilities for 1999-2011 across the census block groups of the region, using the Office of Pollution Prevention and Toxics (OPPT) Risk-Screening Environmental Indicators (RSEI) model. The model uses the reported quantities of TRI chemical releases to estimate the impacts associated with each type of air release or transfer by every TRI facility. RSEI was run to generate the estimated risk for each TRI facility in the region, and the result from the model was joined to the TRI spatial data. Estimated risk values for each census block group were calculated based on the inverse distance of all facilities within a 50 km radius of the census block group centroid; the value for each block group is thus an aggregate that accounts for the estimated potential risk of all facilities within the 50 km search radius.
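
    The inverse-distance aggregation described above can be sketched as follows. The field layout and coordinates are illustrative only, not the actual RSEI or TRI schema.

```python
import math

def blockgroup_risk(centroid, facilities, radius_km=50.0):
    """Inverse-distance-weighted sum of facility risk scores for one
    census block group: include every facility within `radius_km` of the
    centroid, weighting each score by 1/distance.

    `centroid` is (x, y) and `facilities` is a list of (x, y, risk),
    all in projected km coordinates (illustrative layout).
    """
    cx, cy = centroid
    total = 0.0
    for fx, fy, risk in facilities:
        d = math.hypot(fx - cx, fy - cy)
        if 0.0 < d <= radius_km:
            total += risk / d  # inverse-distance weighting
    return total

facilities = [(10.0, 0.0, 120.0), (30.0, 40.0, 200.0), (80.0, 0.0, 999.0)]
score = blockgroup_risk((0.0, 0.0), facilities)
# the third facility is 80 km from the centroid and therefore excluded
```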

  5. Methodology for the Model-based Small Area Estimates of Cancer Risk Factors and Screening Behaviors - Small Area Estimates

    Cancer.gov

    This model-based approach uses data from both the Behavioral Risk Factor Surveillance System (BRFSS) and the National Health Interview Survey (NHIS) to produce estimates of the prevalence rates of cancer risk factors and screening behaviors at the state, health service area, and county levels.

  6. The role of models in estimating consequences as part of the risk assessment process.

    PubMed

    Forde-Folle, K; Mitchell, D; Zepeda, C

    2011-08-01

    The degree of disease risk represented by the introduction, spread, or establishment of one or several diseases through the importation of animals and animal products is assessed by importing countries through an analysis of risk. The components of a risk analysis include hazard identification, risk assessment, risk management, and risk communication. A risk assessment starts with identification of the hazard(s) and then continues with four interrelated steps: release assessment, exposure assessment, consequence assessment, and risk estimation. Risk assessments may be either qualitative or quantitative. This paper describes how, through the integration of epidemiological and economic models, the potential adverse biological and economic consequences of exposure can be quantified.

  7. Towards accurate estimates of the spin-state energetics of spin-crossover complexes within density functional theory: a comparative case study of cobalt(II) complexes.

    PubMed

    Vargas, Alfredo; Krivokapic, Itana; Hauser, Andreas; Lawson Daku, Latévi Max

    2013-03-21

    We report a detailed DFT study of the energetic and structural properties of the spin-crossover Co(ii) complex [Co(tpy)(2)](2+) (tpy = 2,2':6',2''-terpyridine) in the low-spin (LS) and the high-spin (HS) states, using several generalized gradient approximation and hybrid functionals. In either spin-state, the results obtained with the functionals are consistent with one another and in good agreement with available experimental data. Although the different functionals correctly predict the LS state as the electronic ground state of [Co(tpy)(2)](2+), they give estimates of the HS-LS zero-point energy difference which strongly depend on the functional used. This dependency on the functional was also reported for the DFT estimates of the zero-point energy difference in the HS complex [Co(bpy)(3)](2+) (bpy = 2,2'-bipyridine) [A. Vargas, A. Hauser and L. M. Lawson Daku, J. Chem. Theory Comput., 2009, 5, 97]. The comparison of the and estimates showed that all functionals correctly predict an increase of the zero-point energy difference upon the bpy → tpy ligand substitution, which furthermore weakly depends on the functionals, amounting to . From these results and basic thermodynamic considerations, we establish that, despite their limitations, current DFT methods can be applied to the accurate determination of the spin-state energetics of complexes of a transition metal ion, or of these complexes in different environments, provided that the spin-state energetics is accurately known in one case. Thus, making use of the availability of a highly accurate ab initio estimate of the HS-LS energy difference in the complex [Co(NCH)(6)](2+) [L. M. Lawson Daku, F. Aquilante, T. W. Robinson and A. Hauser, J. Chem. Theory Comput., 2012, 8, 4216], we obtain for [Co(tpy)(2)](2+) and [Co(bpy)(3)](2+) best estimates of and , in good agreement with the known magnetic behaviour of the two complexes.

  8. Comparing self-perceived and estimated fracture risk by FRAX® of women with osteoporosis.

    PubMed

    Baji, Petra; Gulácsi, László; Horváth, Csaba; Brodszky, Valentin; Rencz, Fanni; Péntek, Márta

    2017-12-01

    In this study, we compared subjective fracture risks of Hungarian women with osteoporosis to FRAX®-based estimates. Patients with a previous fracture, parental hip fracture, low femoral T-score, higher age, and higher BMI were more likely to underestimate their risks. Patients also failed to associate risk factors with an increased risk of fractures.

  9. Risk Estimates From an Online Risk Calculator Are More Believable and Recalled Better When Expressed as Integers

    PubMed Central

    Zikmund-Fisher, Brian J; Waters, Erika A; Gavaruzzi, Teresa; Fagerlin, Angela

    2011-01-01

    Background Online risk calculators offer different levels of precision in their risk estimates. People interpret numbers in varying ways depending on how they are presented, and we do not know how the number of decimal places displayed might influence perceptions of risk estimates. Objective The objective of our study was to determine whether precision (ie, number of decimals) in risk estimates offered by an online risk calculator influences users’ ratings of (1) how believable the estimate is, (2) risk magnitude (ie, how large or small the risk feels to them), and (3) how well they can recall the risk estimate after a brief delay. Methods We developed two mock risk calculator websites that offered hypothetical percentage estimates of participants’ lifetime risk of kidney cancer. Participants were randomly assigned to a condition where the risk estimate value rose with increasing precision (2, 2.1, 2.13, 2.133) or the risk estimate value fell with increasing precision (2, 1.9, 1.87, 1.867). Within each group, participants were randomly assigned one of the four numbers as their first risk estimate, and later received one of the remaining three as a comparison. Results Participants who completed the experiment (N = 3422) were a demographically diverse online sample, approximately representative of the US adult population on age, gender, and race. Participants whose risk estimates had no decimal places gave the highest ratings of believability (F(3,3384) = 2.94, P = .03) and the lowest ratings of risk magnitude (F(3,3384) = 4.70, P = .003). Compared to estimates with decimal places, integer estimates were judged as highly believable by 7%–10% more participants (χ²(3) = 17.8, P < .001). When comparing two risk estimates with different levels of precision, large majorities of participants reported that the numbers seemed equivalent across all measures. Both exact and approximate recall were highest for estimates with zero decimals. Odds ratios (OR) for correct

  10. Cancer risk estimation caused by radiation exposure during endovascular procedure

    NASA Astrophysics Data System (ADS)

    Kang, Y. H.; Cho, J. H.; Yun, W. S.; Park, K. H.; Kim, H. G.; Kwon, S. M.

    2014-05-01

    The objective of this study was to identify the radiation exposure dose to patients and staff from fluoroscopy during C-arm-assisted vascular surgery, and to estimate the carcinogenic risk due to this exposure. The study was conducted in 71 patients (53 men and 18 women) who underwent vascular surgical intervention at the division of vascular surgery of the University Hospital from November 2011 to April 2012. A mobile C-arm device was used, and the radiation exposure dose to the patient was calculated as the dose-area product (DAP). Effective dose to staff during the vascular surgical operation was measured by attaching optically stimulated luminescence dosimeters to the radiation protectors of the staff participating in the surgery. The mean DAP value for patients was 308.7 Gy·cm², with a maximum of 3085 Gy·cm². Converted to effective dose, the mean was 6.2 mSv and the maximum was 61.7 millisievert (mSv). The effective dose was 3.85 mSv for the operating staff, 1.04 mSv for the radiation technician, and 1.31 mSv for the nurse. The operator's exposure corresponds to an all-cancer incidence of 2355 per 100,000 persons, i.e., roughly 1 in 42 persons. In conclusion, vascular surgeons, as supervisors of fluoroscopy, should keep radiation protection of the patient, staff, and all participants in the intervention in mind, and should understand the effects of radiation themselves, in order to prevent invisible danger during the intervention and to minimize the harm.

  11. TU-EF-204-01: Accurate Prediction of CT Tube Current Modulation: Estimating Tube Current Modulation Schemes for Voxelized Patient Models Used in Monte Carlo Simulations

    SciTech Connect

    McMillan, K; Bostani, M; McNitt-Gray, M; McCollough, C

    2015-06-15

    Purpose: Most patient models used in Monte Carlo-based estimates of CT dose, including computational phantoms, do not have tube current modulation (TCM) data associated with them. While not a problem for fixed tube current simulations, this is a limitation when modeling the effects of TCM. Therefore, the purpose of this work was to develop and validate methods to estimate TCM schemes for any voxelized patient model. Methods: For 10 patients who received clinically-indicated chest (n=5) and abdomen/pelvis (n=5) scans on a Siemens CT scanner, both CT localizer radiograph (“topogram”) and image data were collected. Methods were devised to estimate the complete x-y-z TCM scheme using patient attenuation data: (a) available in the Siemens CT localizer radiograph/topogram itself (“actual-topo”) and (b) from a simulated topogram (“sim-topo”) derived from a projection of the image data. For comparison, the actual TCM scheme was extracted from the projection data of each patient. For validation, Monte Carlo simulations were performed using each TCM scheme to estimate dose to the lungs (chest scans) and liver (abdomen/pelvis scans). Organ doses from simulations using the actual TCM were compared to those using each of the estimated TCM methods (“actual-topo” and “sim-topo”). Results: For chest scans, the average differences between doses estimated using actual TCM schemes and estimated TCM schemes (“actual-topo” and “sim-topo”) were 3.70% and 4.98%, respectively. For abdomen/pelvis scans, the average differences were 5.55% and 6.97%, respectively. Conclusion: Strong agreement between doses estimated using actual and estimated TCM schemes validates the methods for simulating Siemens topograms and converting attenuation data into TCM schemes. This indicates that the methods developed in this work can be used to accurately estimate TCM schemes for any patient model or computational phantom, whether a CT localizer radiograph is available or not

  12. Subcutaneous nerve activity is more accurate than the heart rate variability in estimating cardiac sympathetic tone in ambulatory dogs with myocardial infarction

    PubMed Central

    Chan, Yi-Hsin; Tsai, Wei-Chung; Shen, Changyu; Han, Seongwook; Chen, Lan S.; Lin, Shien-Fong; Chen, Peng-Sheng

    2015-01-01

    Background We recently reported that subcutaneous nerve activity (SCNA) can be used to estimate sympathetic tone. Objectives To test the hypothesis that left thoracic SCNA is more accurate than heart rate variability (HRV) in estimating cardiac sympathetic tone in ambulatory dogs with myocardial infarction (MI). Methods We used an implanted radiotransmitter to study left stellate ganglion nerve activity (SGNA), vagal nerve activity (VNA), and thoracic SCNA in 9 dogs at baseline and up to 8 weeks after MI. HRV was determined by time-domain, frequency-domain, and non-linear analyses. Results The correlation coefficients between integrated SGNA and SCNA averaged 0.74 (95% confidence interval (CI), 0.41–1.06) at baseline and 0.82 (95% CI, 0.63–1.01) after MI (P<.05 for both). The absolute values of these correlation coefficients were significantly larger than those between SGNA and the time-domain, frequency-domain, and non-linear HRV measures, both at baseline (P<.05 for all) and after MI (P<.05 for all). There was a clear increase in SGNA and SCNA at 2, 4, 6, and 8 weeks after MI, while HRV parameters showed no significant changes. Significant circadian variations were noted in SCNA, SGNA, and all HRV parameters both at baseline and after MI. Atrial tachycardia (AT) episodes were invariably preceded by increases in SCNA and SGNA, which rose progressively over the 120, 90, 60, and 30 s before AT onset; no such changes in HRV parameters were observed before AT onset. Conclusion SCNA is more accurate than HRV in estimating cardiac sympathetic tone in ambulatory dogs with MI. PMID:25778433

  13. Latent-failure risk estimates for computer control

    NASA Technical Reports Server (NTRS)

    Dunn, William R.; Folsom, Rolfe A.; Green, Owen R.

    1991-01-01

    It is shown that critical computer controls employing unmonitored safety circuits are unsafe. Analysis supporting this result leads to two additional, important conclusions: (1) annual maintenance checks of safety circuit function do not, as widely believed, eliminate latent failure risk; (2) safety risk remains even if multiple, series-connected protection circuits are employed. Finally, it is shown analytically that latent failure risk is eliminated when continuous monitoring is employed.

  14. Minimum Expected Risk Estimation for Near-neighbor Classification

    DTIC Science & Technology

    2006-04-01

    can be interpreted within an estimation framework proposed by Carnap in 1952. Although Carnap’s views were not Bayesian, he proposed a general...279]. Carnap noted that there were two extremes to the multinomial estimation problem ( Carnap and Jaynes both gave binomial examples, but their logic... Carnap refers to as a logical factor, which corresponds to an uninformed guess, such as the estimate θ̂g = 1/G. Carnap noted that experts in his time

  15. Embedded fiber-optic sensing for accurate internal monitoring of cell state in advanced battery management systems part 2: Internal cell signals and utility for state estimation

    NASA Astrophysics Data System (ADS)

    Ganguli, Anurag; Saha, Bhaskar; Raghavan, Ajay; Kiesel, Peter; Arakaki, Kyle; Schuh, Andreas; Schwartz, Julian; Hegyi, Alex; Sommer, Lars Wilko; Lochbaum, Alexander; Sahu, Saroj; Alamgir, Mohamed

    2017-02-01

    A key challenge hindering the mass adoption of Lithium-ion and other next-gen chemistries in advanced battery applications such as hybrid/electric vehicles (xEVs) has been management of their functional performance for more effective battery utilization and control over their life. Contemporary battery management systems (BMS) reliant on monitoring external parameters such as voltage and current to ensure safe battery operation with the required performance usually result in overdesign and inefficient use of capacity. More informative embedded sensors are desirable for internal cell state monitoring, which could provide accurate state-of-charge (SOC) and state-of-health (SOH) estimates and early failure indicators. Here we present a promising new embedded sensing option developed by our team for cell monitoring, fiber-optic (FO) sensors. High-performance large-format pouch cells with embedded FO sensors were fabricated. This second part of the paper focuses on the internal signals obtained from these FO sensors. The details of the method to isolate intercalation strain and temperature signals are discussed. Data collected under various xEV operational conditions are presented. An algorithm employing dynamic time warping and Kalman filtering was used to estimate state-of-charge with high accuracy from these internal FO signals. Their utility for high-accuracy, predictive state-of-health estimation is also explored.
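
    As a rough sketch of how an internal cell signal can be fused with coulomb counting for SOC estimation, a one-state Kalman filter is shown below. This is illustrative only: the paper's actual algorithm combines dynamic time warping with Kalman filtering, and the sensor model here (a hypothetical direct SOC reading mapped from fiber-optic strain) is an assumption, not the authors' method.

```python
def kf_soc(soc0, currents, meas, dt, capacity_ah, q=1e-6, r=1e-3):
    """One-state Kalman filter for state-of-charge (SOC, in [0, 1]):
    coulomb-counting prediction corrected by a hypothetical
    sensor-derived SOC measurement (e.g. mapped from FO strain).

    currents: discharge-positive current samples in amps
    meas:     SOC measurements aligned with `currents`
    dt:       sample interval in seconds
    """
    soc, p = soc0, 1e-2
    estimates = []
    for i_amp, z in zip(currents, meas):
        # predict: integrate current over dt (coulomb counting)
        soc -= i_amp * dt / (3600.0 * capacity_ah)
        p += q
        # update: blend in the measured SOC via the Kalman gain
        k = p / (p + r)
        soc += k * (z - soc)
        p *= (1.0 - k)
        estimates.append(soc)
    return estimates

# Illustrative run: 10 Ah cell, constant 5 A discharge, noiseless sensor
currents = [5.0] * 100
meas = [1.0 - 0.005 * (k + 1) for k in range(100)]
est = kf_soc(1.0, currents, meas, dt=36.0, capacity_ah=10.0)
```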

  16. Assessment of Methods for Estimating Risk to Birds from Ingestion of Contaminated Grit Particles (Final Report)

    EPA Science Inventory

    The U.S. EPA Ecological Risk Assessment Support Center (ERASC) announced the release of the final report entitled, Assessment of Methods for Estimating Risk to Birds from Ingestion of Contaminated Grit Particles. This report evaluates approaches for estimating the probabi...

  17. CCSI Risk Estimation: An Application of Expert Elicitation

    SciTech Connect

    Engel, David W.; Dalton, Angela C.

    2012-10-01

    The Carbon Capture Simulation Initiative (CCSI) is a multi-laboratory simulation-driven effort to develop carbon capture technologies with the goal of accelerating commercialization and adoption in the near future. One of the key CCSI technical challenges is representing and quantifying the inherent uncertainty and risks associated with developing, testing, and deploying the technology in simulated and real operational settings. To address this challenge, the CCSI Element 7 team developed a holistic risk analysis and decision-making framework. The purpose of this report is to document the CCSI Element 7 structured systematic expert elicitation to identify additional risk factors. We review the significance of and established approaches to expert elicitation, describe the CCSI risk elicitation plan and implementation strategies, and conclude by discussing the next steps and highlighting the contribution of risk elicitation toward the achievement of the overarching CCSI objectives.

  18. Estimation of the Disease Burden Attributable to 11 Risk Factors in Hubei Province, China: A Comparative Risk Assessment

    PubMed Central

    Cui, Fangfang; Zhang, Lan; Yu, Chuanhua; Hu, Songbo; Zhang, Yunquan

    2016-01-01

    In order to estimate the health losses caused by common risk factors in the Hubei province, China, we calculated the deaths and disability-adjusted life years (DALYs) attributable to 11 risk factors. We estimated the exposure distributions of risk factors in Hubei Province in 2013 from the monitoring system on chronic disease and related risk factors, combined with relative risk (RR) in order to calculate the population attributable fraction. Deaths and DALYs attributed to the selected risk factors were then estimated together with cause-specific deaths and DALYs. In total, 53.39% of the total deaths and 36.23% of the total DALYs in Hubei were a result of the 11 selected risk factors. The top five risk factors were high blood pressure, smoking, high body mass index, diet low in fruits and alcohol use, accounting for 14.68%, 12.57%, 6.03%, 3.90% and 3.19% of total deaths, respectively, and 9.41%, 7.22%, 4.42%, 2.51% and 2.44% of total DALYs, respectively. These risk factors, especially high blood pressure, smoking and high body mass index, significantly influenced quality of life, causing a large number of deaths and DALYs. The burden of chronic disease could be substantially reduced if these risk factors were effectively controlled, which would allow people to enjoy healthier lives. PMID:27669279
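
    The population attributable fraction underlying estimates like these is commonly computed with Levin's formula for a binary risk factor. A minimal sketch, using illustrative numbers rather than the study's data:

```python
def paf(p_exposed: float, rr: float) -> float:
    """Levin's population attributable fraction for one binary risk
    factor: PAF = p(RR - 1) / (p(RR - 1) + 1), where p is exposure
    prevalence and RR is the relative risk."""
    x = p_exposed * (rr - 1.0)
    return x / (x + 1.0)

# Illustrative only (not from the Hubei study): 25% smoking
# prevalence with RR = 2.5 for a given cause of death
frac = paf(0.25, 2.5)
attributable_deaths = frac * 100_000  # out of 100,000 cause-specific deaths
```

    Attributable DALYs follow the same pattern, multiplying the fraction by cause-specific DALYs instead of deaths.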

  19. A systematic approach for the accurate non-invasive estimation of blood glucose utilizing a novel light-tissue interaction adaptive modelling scheme

    NASA Astrophysics Data System (ADS)

    Rybynok, V. O.; Kyriacou, P. A.

    2007-10-01

    Diabetes is one of the biggest health challenges of the 21st century. The obesity epidemic, sedentary lifestyles and an ageing population mean prevalence of the condition is currently doubling every generation. Diabetes is associated with serious chronic ill health, disability and premature mortality. Long-term complications, including heart disease, stroke, blindness, kidney disease and amputations, make the greatest contribution to the costs of diabetes care. Many of these long-term effects could be avoided with earlier, more effective monitoring and treatment. Currently, blood glucose can only be monitored through the use of invasive techniques. To date there is no widely accepted and readily available non-invasive monitoring technique to measure blood glucose, despite many attempts. This paper addresses one of the most difficult non-invasive monitoring problems, that of blood glucose, and proposes a novel approach that will enable the accurate, calibration-free estimation of glucose concentration in blood. This approach is based on spectroscopic techniques and a new adaptive modelling scheme. The theoretical implementation and the effectiveness of the adaptive modelling scheme for this application are described, and a detailed mathematical evaluation is employed to prove that such a scheme has the capability of accurately extracting the concentration of glucose from a complex biological medium.

  20. Can the conventional sextant prostate biopsy accurately predict unilateral prostate cancer in low-risk, localized, prostate cancer?

    PubMed

    Mayes, Janice M; Mouraviev, Vladimir; Sun, Leon; Tsivian, Matvey; Madden, John F; Polascik, Thomas J

    2011-01-01

    We evaluated the reliability of routine sextant prostate biopsy for detecting unilateral lesions. A total of 365 men with complete records, including all clinical and pathologic variables, who underwent a preoperative sextant biopsy and subsequent radical prostatectomy (RP) for clinically localized prostate cancer at our medical center between January 1996 and December 2006 were identified. When the sextant biopsy detects unilateral disease, according to RP results, the negative predictive value (NPV) is high (91%) with a low false negative rate (9%); however, the positive predictive value (PPV) is 28%, with a high false positive rate (72%). Therefore, a routine sextant prostate biopsy cannot provide reliable, accurate information about the unilaterality of tumor lesion(s).
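
    The predictive values quoted above follow from the standard confusion-matrix definitions. The counts below are hypothetical, chosen only so the arithmetic reproduces the reported 28% PPV and 91% NPV; note the abstract's "false positive/negative rates" are the complements 1 - PPV and 1 - NPV.

```python
def predictive_values(tp: int, fp: int, tn: int, fn: int):
    """Return (PPV, NPV, 1-PPV, 1-NPV) from confusion-matrix counts.

    The last two values correspond to the "false positive rate" and
    "false negative rate" as the abstract uses those terms (i.e. the
    complements of the predictive values, not the usual FPR/FNR).
    """
    ppv = tp / (tp + fp)
    npv = tn / (tn + fn)
    return ppv, npv, 1.0 - ppv, 1.0 - npv

# Hypothetical counts that reproduce the reported 28% PPV / 91% NPV
ppv, npv, comp_ppv, comp_npv = predictive_values(tp=28, fp=72, tn=91, fn=9)
```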

  1. Applying the quarter-hour rule: can people with insomnia accurately estimate 15-min periods during the sleep-onset phase?

    PubMed

    Harrow, Lisa; Espie, Colin

    2010-03-01

    The 'quarter-hour rule' (QHR) instructs the person with insomnia to get out of bed after 15 min of wakefulness and return to bed only when sleep feels imminent. Recent research has identified that sleep can be significantly improved using this simple intervention (Malaffo and Espie, Sleep, 27(s), 2004, 280; Sleep, 29(s), 2006, 257), but successful implementation depends on estimating time without clock monitoring, and the insomnia literature indicates that poor time perception is a maintaining factor in primary insomnia (Harvey, Behav. Res. Ther., 40, 2002, 869). This study expands upon previous research with the aim of identifying whether people with insomnia can accurately perceive a 15-min interval during the sleep-onset period, and can therefore successfully implement the QHR. A mixed-models ANOVA design was applied with a between-participants factor of group (insomnia versus good sleepers) and a within-participants factor of context (night versus day). Results indicated no differences between groups or contexts on time estimation tasks, despite an increase in arousal in the night context for both groups and tentative support for the role of arousal in inducing underestimations of time. These results provide promising support for the successful application of the QHR in people with insomnia. The results are discussed in terms of whether the design employed successfully accessed the processes involved in distorting time perception in insomnia. Suggestions for future research are provided and limitations of the current study discussed.

  2. Accurate Equilibrium Structures for trans-HEXATRIENE by the Mixed Estimation Method and for the Three Isomers of Octatetraene from Theory; Structural Consequences of Electron Delocalization

    NASA Astrophysics Data System (ADS)

    Craig, Norman C.; Demaison, Jean; Groner, Peter; Rudolph, Heinz Dieter; Vogt, Natalja

    2015-06-01

    An accurate equilibrium structure of trans-hexatriene has been determined by the mixed estimation method with rotational constants from 8 deuterium and carbon isotopologues and high-level quantum chemical calculations. In the mixed estimation method bond parameters are fit concurrently to moments of inertia of various isotopologues and to theoretical bond parameters, each data set carrying appropriate uncertainties. The accuracy of this structure is 0.001 Å and 0.1°. Structures of similar accuracy have been computed for the cis,cis, trans,trans, and cis,trans isomers of octatetraene at the CCSD(T) level with a basis set of wCVQZ(ae) quality adjusted in accord with the experience gained with trans-hexatriene. The structures are compared with butadiene and with cis-hexatriene to show how increasing the length of the chain in polyenes leads to increased blurring of the difference between single and double bonds in the carbon chain. In trans-hexatriene, r("C1=C2") = 1.339 Å and r("C3=C4") = 1.346 Å, compared to 1.338 Å for the "double" bond in butadiene; r("C2-C3") = 1.449 Å, compared to 1.454 Å for the "single" bond in butadiene. "Double" bonds increase in length; "single" bonds decrease in length.

  3. Multicentre validation of the Geneva Risk Score for hospitalised medical patients at risk of venous thromboembolism. Explicit ASsessment of Thromboembolic RIsk and Prophylaxis for Medical PATients in SwitzErland (ESTIMATE).

    PubMed

    Nendaz, M; Spirk, D; Kucher, N; Aujesky, D; Hayoz, D; Beer, J H; Husmann, M; Frauchiger, B; Korte, W; Wuillemin, W A; Jäger, K; Righini, M; Bounameaux, H

    2014-03-03

    There is a need to validate risk assessment tools for hospitalised medical patients at risk of venous thromboembolism (VTE). We investigated whether a predefined cut-off of the Geneva Risk Score, as compared to the Padua Prediction Score, accurately distinguishes low-risk from high-risk patients regardless of the use of thromboprophylaxis. In the multicentre, prospective Explicit ASsessment of Thromboembolic RIsk and Prophylaxis for Medical PATients in SwitzErland (ESTIMATE) cohort study, 1,478 hospitalised medical patients were enrolled of whom 637 (43%) did not receive thromboprophylaxis. The primary endpoint was symptomatic VTE or VTE-related death at 90 days. The study is registered at ClinicalTrials.gov, number NCT01277536. According to the Geneva Risk Score, the cumulative rate of the primary endpoint was 3.2% (95% confidence interval [CI] 2.2-4.6%) in 962 high-risk vs 0.6% (95% CI 0.2-1.9%) in 516 low-risk patients (p=0.002); among patients without prophylaxis, this rate was 3.5% vs 0.8% (p=0.029), respectively. In comparison, the Padua Prediction Score yielded a cumulative rate of the primary endpoint of 3.5% (95% CI 2.3-5.3%) in 714 high-risk vs 1.1% (95% CI 0.6-2.3%) in 764 low-risk patients (p=0.002); among patients without prophylaxis, this rate was 3.2% vs 1.5% (p=0.130), respectively. Negative likelihood ratio was 0.28 (95% CI 0.10-0.83) for the Geneva Risk Score and 0.51 (95% CI 0.28-0.93) for the Padua Prediction Score. In conclusion, among hospitalised medical patients, the Geneva Risk Score predicted VTE and VTE-related mortality and compared favourably with the Padua Prediction Score, particularly for its accuracy to identify low-risk patients who do not require thromboprophylaxis.

  4. Biomechanical Risk Estimates for Mild Traumatic Brain Injury

    PubMed Central

    Funk, J. R.; Duma, S. M.; Manoogian, S. J.; Rowson, S.

    2007-01-01

    The objective of this study was to characterize the risk of mild traumatic brain injury (MTBI) in living humans based on a large set of head impact data taken from American football players at the collegiate level. Real-time head accelerations were recorded from helmet-mounted accelerometers designed to stay in contact with the player’s head. Over 27,000 head impacts were recorded, including four impacts resulting in MTBI. Parametric risk curves were developed by normalizing MTBI incidence data by head impact exposure data. An important finding of this research is that living humans, at least in the setting of collegiate football, sustain much more significant head impacts without apparent injury than previously thought. The following preliminary nominal injury assessment reference values associated with a 10% risk of MTBI are proposed: a peak linear head acceleration of 165 g, a HIC of 400, and a peak angular head acceleration of 9000 rad/s². PMID:18184501

  5. Probabilistic methodology for estimating radiation-induced cancer risk

    SciTech Connect

    Dunning, D.E. Jr.; Leggett, R.W.; Williams, L.R.

    1981-01-01

    The RICRAC computer code was developed at Oak Ridge National Laboratory to provide a versatile and convenient methodology for radiation risk assessment. The code accepts as input essentially any dose pattern commonly encountered in risk assessments for either acute or chronic exposures, and it includes consideration of the age structure of the exposed population. Results produced by the analysis include the probability of one or more radiation-induced cancer deaths in a specified population, expected numbers of deaths, and expected years of life lost as a result of premature fatalities. These calculations include consideration of competing risks of death from all other causes. The program also generates a probability frequency distribution of the expected number of cancers in any specified cohort resulting from a given radiation dose. The methods may be applied to any specified population and dose scenario.
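
    One output described above, the probability of one or more radiation-induced cancer deaths, can be approximated under a Poisson assumption on the number of deaths. This is a simplified stand-in for RICRAC's frequency-distribution calculation, not its actual method:

```python
import math

def prob_at_least_one(expected_deaths: float) -> float:
    """If the number of radiation-induced cancer deaths in a cohort is
    approximated as Poisson with the given mean, the probability of one
    or more deaths is 1 - P(0) = 1 - exp(-mean)."""
    return 1.0 - math.exp(-expected_deaths)

# E.g. an expected 0.5 excess deaths in the exposed cohort
p = prob_at_least_one(0.5)
```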

  6. Space Radiation Cancer, Circulatory Disease and CNS Risks for Near Earth Asteroid and Mars Missions: Uncertainty Estimates for Never-Smokers

    NASA Technical Reports Server (NTRS)

    Cucinotta, Francis A.; Chappell, Lori J.; Wang, Minli; Kim, Myung-Hee

    2011-01-01

    The uncertainties in estimating the health risks from galactic cosmic rays (GCR) and solar particle events (SPE) are a major limitation on the length of space missions and the evaluation of potential risk mitigation approaches. NASA limits astronaut exposures to a 3% risk of exposure-induced cancer death (REID), and protects against uncertainties in risk projections using an assessment of 95% confidence intervals after propagating the error from all model factors (environment and organ exposure, risk coefficients, dose-rate modifiers, and quality factors). Because there are potentially significant late mortality risks from diseases of the circulatory system and central nervous system (CNS), which are less well defined than cancer risks, the cancer REID limit is not necessarily conservative. In this report, we discuss estimates of lifetime risks from space radiation and describe new estimates of model uncertainties. The key updates to the NASA risk projection model are: 1) revised values for low-LET risk coefficients for tissue-specific cancer incidence, with incidence rates transported to an average U.S. population to estimate the probability of risk of exposure-induced cancer (REIC) and REID; 2) an analysis of smoking-attributable cancer risks showing that never-smokers have significantly reduced lung cancer risk, as well as reduced overall cancer risk from radiation, compared to risks estimated for the average U.S. population; 3) derivation of track-structure-based quality functions that depend on particle fluence, charge number Z, and kinetic energy E; 4) assignment of a smaller maximum in the quality function for leukemia than for solid cancers; 5) a demonstration that the ICRP tissue weights over-estimate cancer risks from SPEs by a factor of 2 or more, with summing of cancer risks over tissues recommended as a more accurate approach to estimating SPE cancer risks; and 6) additional considerations of circulatory and CNS disease risks. Our analysis shows that an individual s
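
    The error-propagation step described above can be sketched as a toy Monte Carlo over multiplicative model factors. The distributions, widths, and 1% baseline REID used here are illustrative assumptions, not NASA's actual values or methodology:

```python
import random

def reid_ci(n_draws: int = 20_000, seed: int = 1):
    """Toy Monte Carlo propagation of multiplicative model-factor
    uncertainties (organ dose, risk coefficient, dose-rate modifier,
    quality factor) into a 95% interval on REID. All parameters below
    are illustrative assumptions."""
    rng = random.Random(seed)
    draws = []
    for _ in range(n_draws):
        dose = rng.lognormvariate(0.0, 0.2)    # relative organ-dose error
        coeff = rng.lognormvariate(0.0, 0.4)   # low-LET risk coefficient
        ddref = rng.lognormvariate(0.0, 0.3)   # dose-rate modifier (divides)
        qual = rng.lognormvariate(0.0, 0.5)    # quality factor
        draws.append(0.01 * dose * coeff / ddref * qual)  # 1% baseline REID
    draws.sort()
    return draws[int(0.025 * n_draws)], draws[int(0.975 * n_draws)]

lo, hi = reid_ci()
```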

  7. Aggregate versus Individual-Level Sexual Behavior Assessment: How Much Detail Is Needed to Accurately Estimate HIV/STI Risk?

    ERIC Educational Resources Information Center

    Pinkerton, Steven D.; Galletly, Carol L.; McAuliffe, Timothy L.; DiFranceisco, Wayne; Raymond, H. Fisher; Chesson, Harrell W.

    2010-01-01

    The sexual behaviors of HIV/sexually transmitted infection (STI) prevention intervention participants can be assessed on a partner-by-partner basis: in aggregate (i.e., total numbers of sex acts, collapsed across partners) or using a combination of these two methods (e.g., assessing five partners in detail and any remaining partners in aggregate).…

  8. The use of kernel density estimators in breakthrough curve reconstruction and advantages in risk analysis

    NASA Astrophysics Data System (ADS)

    Siirila, E. R.; Fernandez-Garcia, D.; Sanchez-Vila, X.

    2014-12-01

    Particle tracking (PT) techniques, often considered favorable over Eulerian techniques due to artificial smoothing in breakthrough curves (BTCs), are evaluated in a risk-driven framework. Recent work has shown that, given relatively few particles (np), PT methods can yield well-constructed BTCs with kernel density estimators (KDEs). This work compares KDE and non-KDE BTCs simulated as a function of np (10²-10⁸) and averaged as a function of the exposure duration, ED. Results show that regardless of BTC shape complexity, un-averaged PT BTCs show a large bias over several orders of magnitude in concentration (C) when compared to the KDE results, remarkably even when np is as low as 10². With the KDE, several orders of magnitude fewer particles are required to obtain the same global error in BTC shape as the PT technique. PT and KDE BTCs are averaged as a function of the ED with standard and new methods incorporating the optimal h (ANA). The lowest-error curve is obtained through the ANA method, especially for smaller EDs. The percent error of the peak of averaged BTCs, important in a risk framework, is approximately zero for all scenarios and all methods for np ≥ 10⁵, but varies between the ANA and PT methods when np is lower. For fewer np, the ANA solution provides a lower-error fit except when C oscillations are present during a short time frame. We show that obtaining a representative average exposure concentration relies on an accurate representation of the BTC, especially when data are scarce.
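The KDE reconstruction described in this record can be sketched in a few lines: each particle arrival time contributes a Gaussian kernel, so even a sparse particle cloud yields a smooth breakthrough curve. The function names, the Silverman-style bandwidth rule, and the synthetic lognormal arrival times below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def kde_btc(arrival_times, t_grid, bandwidth):
    """Reconstruct a breakthrough curve from particle arrival times with a
    Gaussian kernel density estimator; each particle carries equal mass,
    so the curve integrates to ~1 over time."""
    t = np.asarray(arrival_times, dtype=float)[:, None]   # shape (np, 1)
    u = (t_grid[None, :] - t) / bandwidth                 # standardized distances
    kernels = np.exp(-0.5 * u**2) / np.sqrt(2 * np.pi)
    return kernels.sum(axis=0) / (len(arrival_times) * bandwidth)

# Synthetic arrival times for illustration only (lognormal plume).
rng = np.random.default_rng(0)
times = rng.lognormal(mean=2.0, sigma=0.4, size=100)      # np = 10^2 particles
grid = np.linspace(0.0, 30.0, 300)
# Silverman-style rule-of-thumb bandwidth: 1.06 * std * n^(-1/5).
bw = 1.06 * times.std() * len(times) ** -0.2
c = kde_btc(times, grid, bw)
```

Even with only 100 particles, `c` is a smooth, non-negative curve whose area is close to 1, which is the point the abstract makes about KDE-based BTCs.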

  9. Accurate estimation of global and regional cardiac function by retrospectively gated multidetector row computed tomography: comparison with cine magnetic resonance imaging.

    PubMed

    Belge, Bénédicte; Coche, Emmanuel; Pasquet, Agnès; Vanoverschelde, Jean-Louis J; Gerber, Bernhard L

    2006-07-01

    Retrospective reconstruction of ECG-gated images at different parts of the cardiac cycle allows the assessment of cardiac function by multi-detector row CT (MDCT) at the time of non-invasive coronary imaging. We compared the accuracy of such measurements by MDCT to cine magnetic resonance (MR). Forty patients underwent the assessment of global and regional cardiac function by 16-slice MDCT and cine MR. Left ventricular (LV) end-diastolic and end-systolic volumes estimated by MDCT (134+/-51 and 67+/-56 ml) were similar to those by MR (137+/-57 and 70+/-60 ml, respectively; both P=NS) and strongly correlated (r=0.92 and r=0.95, respectively; both P<0.001). Consequently, LV ejection fractions by MDCT and MR were also similar (55+/-21 vs. 56+/-21%; P=NS) and highly correlated (r=0.95; P<0.001). Regional end-diastolic and end-systolic wall thicknesses by MDCT were highly correlated (r=0.84 and r=0.92, respectively; both P<0.001), but significantly lower than by MR (8.3+/-1.8 vs. 8.8+/-1.9 mm and 12.7+/-3.4 vs. 13.3+/-3.5 mm, respectively; both P<0.001). Values of regional wall thickening by MDCT and MR were similar (54+/-30 vs. 51+/-31%; P=NS) and also correlated well (r=0.91; P<0.001). Retrospectively gated MDCT can accurately estimate LV volumes, EF and regional LV wall thickening compared to cine MR.

  10. Whole effluent risk estimation for a small recipient watercourse.

    PubMed

    Refaey, Maha; Kováts, Nóra; Kárpáti, A; Thury, P

    2009-09-01

    Whole effluent toxicity is most often considered a static parameter. However, toxicity might change as degradation processes, especially biodegradation, proceed and intermediate products appear. These intermediates can even be more toxic than the original effluent was, posing a higher risk to the ecosystem of the recipient water body. In our test series we assessed how the toxicity of a municipal wastewater sample changes during biodegradation, taking into consideration different temperature regimes (10, 20 and 30 degrees C). Results supported our hypothesis: after the high initial toxicity of the fresh effluent sample, toxicity showed a further increase. Biodegradation resulted in toxicity reduction only after an approximately 2-week period.

  11. How are flood risk estimates affected by the choice of return-periods?

    NASA Astrophysics Data System (ADS)

    Ward, P. J.; Aerts, J. C. J. H.; De Moel, H.; Poussin, J. K.

    2012-04-01

    Flood management is more and more adopting a risk based approach, whereby flood risk is the product of the probability and consequences of flooding. One of the most common approaches in flood risk assessment is to estimate the damage that would occur for floods of several exceedance probabilities (or return periods), to plot these on an exceedance probability-loss curve (risk curve) and to estimate risk as the area under the curve. However, there is little insight into how the selection of the return-periods (which ones and how many) used to calculate risk actually affects the final risk calculation. To gain such insights, we developed and validated an inundation model capable of rapidly simulating inundation extent and depth, and dynamically coupled this to an existing damage model. The method was applied to a section of the River Meuse in the southeast of the Netherlands. Firstly, we estimated risk based on a risk curve using yearly return periods from 2 to 10 000 yr (€ 34 million p.a.). We found that the overall risk is greatly affected by the number of return periods used to construct the risk curve, with over-estimations of annual risk between 33% and 100% when only three return periods are used. Also, the final risk estimate is greatly dependent on the minimum and maximum return periods (and their associated damages) used in the construction of the risk curve. In addition, binary assumptions on dike failure can have a large effect (a factor two difference) on risk estimates. The results suggest that more research is needed to develop relatively simple inundation models that can be used to produce large numbers of inundation maps, complementary to more complex 2D-3D hydrodynamic models. We then used the insights and models described above to assess the relative change in risk between current conditions and several scenarios of land use and climate change. For the case study region, we found that future land use change has a larger impact than future climate
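The risk-curve integration examined in this record reduces to a numerical area-under-curve computation over exceedance probabilities. The sketch below, with invented damage figures, shows how a three-return-period risk curve can overstate expected annual damage relative to a densely sampled one, in the direction the abstract reports.

```python
import numpy as np

def annual_risk(return_periods, damages):
    """Expected annual damage: area under the exceedance probability-loss
    (risk) curve, computed with the trapezoidal rule."""
    p = 1.0 / np.asarray(return_periods, dtype=float)   # exceedance probabilities
    d = np.asarray(damages, dtype=float)
    order = np.argsort(p)                               # integrate over increasing p
    p, d = p[order], d[order]
    return float(np.sum((d[:-1] + d[1:]) * np.diff(p)) / 2.0)

# Invented damage estimates (million EUR), for illustration only.
rp_dense = [2, 5, 10, 25, 50, 100, 250, 500, 1000, 10000]
dmg_dense = [0, 5, 20, 60, 120, 200, 320, 420, 520, 700]
rp_coarse = [10, 100, 1000]                  # only three return periods
dmg_coarse = [20, 200, 520]

risk_dense = annual_risk(rp_dense, dmg_dense)
risk_coarse = annual_risk(rp_coarse, dmg_coarse)
```

With these numbers the coarse curve yields a higher annual risk than the dense one, because the trapezoids bridge large gaps in probability where the true damage curve bends.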

  12. Assessing the risk of Legionnaires' disease: the inhalation exposure model and the estimated risk in residential bathrooms.

    PubMed

    Azuma, Kenichi; Uchiyama, Iwao; Okumura, Jiro

    2013-02-01

    Legionella are widely found in the built environment. Patients with Legionnaires' disease have been increasing in Japan; however, health risks from Legionella bacteria in the environment are not appropriately assessed. We performed a quantitative health risk assessment modeled on residential bathrooms in the Adachi outbreak area and estimated risk levels. The estimated risks in the Adachi outbreak approximately corresponded to the risk levels exponentially extrapolated to lower levels on the basis of infection and mortality rates calculated from actual outbreaks, suggesting that the model of Legionnaires' disease in residential bathrooms was adequate to predict disease risk for the evaluated outbreaks. Based on this model, at 10 CFU/100 ml (100 CFU/L), the Japanese water quality guideline value, the annual infection and mortality risk levels were approximately 10⁻² and 10⁻⁵, respectively. However, acceptable risk levels of infection and mortality from Legionnaires' disease should be adjusted to approximately 10⁻⁴ and 10⁻⁷ per year, respectively. Therefore, a reference value of 0.1 CFU/100 ml (1 CFU/L) as a water quality guideline for Legionella bacteria is recommended. This value is occasionally less than the actual detection limit. Legionella levels in water systems should be maintained as low as reasonably achievable (<1 CFU/L).
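The orders of magnitude in this record can be reproduced with the standard independent-repeated-exposure formula. The daily bathing frequency and the linear low-dose scaling assumed below are illustrative modeling choices, not the authors' stated model.

```python
def annual_risk(per_exposure_risk, n=365):
    """Annual infection risk from n independent exposures (e.g., daily baths)."""
    return 1.0 - (1.0 - per_exposure_risk) ** n

def per_exposure_from_annual(annual, n=365):
    """Invert the relation above to get the per-exposure risk."""
    return 1.0 - (1.0 - annual) ** (1.0 / n)

# Abstract: ~1e-2 annual infection risk at 100 CFU/L.
r_bath = per_exposure_from_annual(1e-2)
# Assuming risk scales linearly with concentration at low doses, a 100-fold
# lower concentration (1 CFU/L) gives ~100-fold lower per-exposure risk.
annual_at_guideline = annual_risk(r_bath / 100)   # ~1e-4
```

The result lands near the 10⁻⁴ annual infection level the abstract proposes as acceptable, which is why a 100-fold tighter reference value follows from a 100-fold risk reduction under linear scaling.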

  13. Genome-based, mechanism-driven computational modeling of risks of ionizing radiation: The next frontier in genetic risk estimation?

    PubMed

    Sankaranarayanan, K; Nikjoo, H

    2015-01-01

    Research activity in the field of estimation of genetic risks of ionizing radiation to human populations started in the late 1940s and now appears to be passing through a plateau phase. This paper provides a background to the concepts, findings and methods of risk estimation that guided the field through the period of its growth to the beginning of the 21st century. It draws attention to several key facts: (a) thus far, genetic risk estimates have been made indirectly using mutation data collected in mouse radiation studies; (b) important uncertainties and unsolved problems remain, one notable example being that we still do not know the sensitivity of human female germ cells to radiation-induced mutations; and (c) the concept that dominated the field thus far, namely, that radiation exposures to germ cells can result in single gene diseases in the descendants of those exposed has been replaced by the concept that radiation exposure can cause DNA deletions, often involving more than one gene. Genetic risk estimation now encompasses work devoted to studies on DNA deletions induced in human germ cells, their expected frequencies, and phenotypes and associated clinical consequences in the progeny. We argue that the time is ripe to embark on a human genome-based, mechanism-driven, computational modeling of genetic risks of ionizing radiation, and we present a provisional framework for catalyzing research in the field in the 21st century.

  14. Bayesian Framework for Water Quality Model Uncertainty Estimation and Risk Management

    EPA Science Inventory

    A formal Bayesian methodology is presented for integrated model calibration and risk-based water quality management using Bayesian Monte Carlo simulation and maximum likelihood estimation (BMCML). The primary focus is on lucid integration of model calibration with risk-based wat...

  15. Estimating risks to aquatic life using quantile regression

    USGS Publications Warehouse

    Schmidt, Travis S.; Clements, William H.; Cade, Brian S.

    2012-01-01

    One of the primary goals of biological assessment is to assess whether contaminants or other stressors limit the ecological potential of running waters. It is important to interpret responses to contaminants relative to other environmental factors, but necessity or convenience limit quantification of all factors that influence ecological potential. In these situations, the concept of limiting factors is useful for data interpretation. We used quantile regression to measure risks to aquatic life exposed to metals by including all regression quantiles (τ = 0.05–0.95, in increments of 0.05), not just the upper limit of density (e.g., the 90th quantile). We measured population densities (individuals/0.1 m²) of 2 mayflies (Rhithrogena spp., Drunella spp.) and a caddisfly (Arctopsyche grandis), aqueous metal mixtures (Cd, Cu, Zn), and other limiting factors (basin area, site elevation, discharge, temperature) at 125 streams in Colorado. We used a model selection procedure to test which factor was most limiting to density. Arctopsyche grandis was limited by other factors, whereas metals limited most quantiles of density for the 2 mayflies. Metals reduced mayfly densities most at sites where other factors were not limiting. Where other factors were limiting, low mayfly densities were observed regardless of metal concentration. Metals affected mayfly densities most at quantiles above the mean, not just at the upper limit of density. Risk models developed from quantile regression showed that mayfly densities observed at background metal concentrations are improbable when metal mixtures are at US Environmental Protection Agency criterion continuous concentrations. We conclude that metals limit potential density, not realized average density. The most obvious effects on mayfly populations were at upper quantiles and not mean density. Therefore, we suggest that policy developed from mean-based measures of effects may not be as useful as policy based on the concept of
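Quantile regression fits a conditional quantile by minimizing the asymmetric check (pinball) loss. The brute-force grid fit below is a stand-in for a proper linear-programming solver, run on synthetic density data; the names, grids, and data-generating model are all illustrative.

```python
import numpy as np

def pinball_loss(y, y_hat, tau):
    """Asymmetric check loss minimized by quantile regression at quantile tau."""
    r = y - y_hat
    return float(np.mean(np.where(r >= 0, tau * r, (tau - 1.0) * r)))

def fit_quantile_line(x, y, tau, a_grid, b_grid):
    """Brute-force fit of y = a + b*x at quantile tau by searching a
    coarse parameter grid (stand-in for an LP-based solver)."""
    best, best_loss = (a_grid[0], b_grid[0]), float("inf")
    for a in a_grid:
        for b in b_grid:
            loss = pinball_loss(y, a + b * x, tau)
            if loss < best_loss:
                best, best_loss = (a, b), loss
    return best

# Synthetic mayfly densities declining with metal concentration.
rng = np.random.default_rng(1)
metal = rng.uniform(0, 4, 200)
density = 10.0 - 2.0 * metal + rng.normal(0, 1, 200)

a_grid = np.linspace(5, 15, 51)
b_grid = np.linspace(-4, 0, 81)
a90, b90 = fit_quantile_line(metal, density, 0.9, a_grid, b_grid)   # upper quantile
```

Fitting many values of tau (0.05 to 0.95) rather than a single upper quantile is what lets the study distinguish effects on potential density from effects on average density.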

  16. Risk estimation based on germ-cell mutations in animals.

    PubMed

    Favor, J

    1989-01-01

    The set of mouse germ cell mutation rate results following spermatogonial exposure to high dose rate irradiation have been presented as the most relevant experimental results upon which to extrapolate the expected genetic risk of offspring of the survivors of the Hiroshima and Nagasaki atomic bombings. Results include mutation rates to recessive specific-locus, dominant cataract, protein-charge, and enzyme-activity alleles. The mutability as determined by the various genetic end points differed: the mutation rates to recessive specific-locus alleles and enzyme-activity alleles were similar and greater than the mutation rates to dominant cataract and protein-charge alleles. It is argued that the type of mutation event scored by a particular test will determine the mutability of the genetic end point screened. When the loss of functional gene product can be scored in a particular mutation test, as in the recessive specific-locus and enzyme-activity tests, a wide spectrum of DNA alterations may result in a loss of function, and a higher mutation rate is observed. When an altered gene product is scored, as in the dominant cataract and protein-charge tests, a narrower spectrum of DNA alterations is screened and a lower mutation rate is observed. The radiation doubling dose, defined as the dose that induces as many mutations as occur spontaneously per generation, was shown to be four times higher in the dominant cataract test than the specific-locus test. These results indicate that to extrapolate to genetic risks in humans using the doubling-dose method, the extrapolation must be based on experimental mutation rate results for the same genetic end point.(ABSTRACT TRUNCATED AT 250 WORDS)

  17. Another look at the (im-)precision of individual risk estimates made using actuarial risk assessment instruments.

    PubMed

    Hart, Stephen D; Cooke, David J

    2013-01-01

    We investigated the precision of individual risk estimates made using actuarial risk assessment instruments (ARAIs) by discussing some major conceptual issues and then illustrating them by analyzing new data. We used a standard multivariate statistical procedure, logistic regression, to create a new ARAI based on data from a follow-up study of 90 adult male sex offenders. We indexed predictive precision at the group level using confidence intervals for group mean probability estimates, and at the individual level using prediction intervals for individual probability estimates. Consistent with past research, ARAI scores were moderately and significantly predictive of failure in the aggregate, but group probability estimates had substantial margins of error and individual probability estimates had very large margins of error. We conclude that, without major advances in our understanding of the causes of violence, ARAIs cannot be used to estimate the specific probability or absolute likelihood of future violence with any reasonable degree of precision or certainty. The implications for conducting violence risk assessments in forensic mental health are discussed.

  18. An update on standards for radiation in the environment and associated estimates of risk

    SciTech Connect

    Kocher, D.C.

    1989-06-21

    This presentation reviews current and proposed standards, recommendations, and guidances for limiting routine radiation exposures of the public, and estimates the risk corresponding to standards, recommendations, and guidances. These estimates provide a common basis for comparing different criteria for limiting public exposures to radiation, as well as hazardous chemicals.

  19. ASSESSMENT OF METHODS FOR ESTIMATING RISK TO BIRDS FROM INGESTION OF CONTAMINATED GRIT PARTICLES (FINAL REPORT)

    EPA Science Inventory

    The report evaluates approaches for estimating the probability of ingestion by birds of contaminated particles such as pesticide granules or lead particles (i.e. shot or bullet fragments). In addition, it presents an approach for using this information to estimate the risk of mo...

  20. REVIEW OF DRAFT REVISED BLUE BOOK ON ESTIMATING CANCER RISKS FROM EXPOSURE TO IONIZING RADIATION

    EPA Science Inventory

    In 1994, EPA published a report, referred to as the “Blue Book,” which lays out EPA’s current methodology for quantitatively estimating radiogenic cancer risks. A follow-on report made minor adjustments to the previous estimates and presented a partial analysis of the uncertainti...

  1. Estimating Toxicity Pathway Activating Doses for High Throughput Chemical Risk Assessments

    EPA Science Inventory

    Estimating a Toxicity Pathway Activating Dose (TPAD) from in vitro assays as an analog to a reference dose (RfD) derived from in vivo toxicity tests would facilitate high throughput risk assessments of thousands of data-poor environmental chemicals. Estimating a TPAD requires def...

  2. Uncertainties in Estimates of the Risks of Late Effects from Space Radiation

    NASA Technical Reports Server (NTRS)

    Cucinotta, F. A.; Schimmerling, W.; Wilson, J. W.; Peterson, L. E.; Saganti, P.; Dicelli, J. F.

    2002-01-01

    The health risks faced by astronauts from space radiation include cancer, cataracts, hereditary effects, and non-cancer morbidity and mortality risks related to diseases of old age. Methods used to project risks in low-Earth orbit are of questionable merit for exploration missions because of the limited radiobiology data and knowledge of galactic cosmic ray (GCR) heavy ions, which cause estimates of the risk of late effects to be highly uncertain. Risk projections involve a product of many biological and physical factors, each of which has a differential range of uncertainty due to lack of data and knowledge. Within the linear-additivity model, we use Monte-Carlo sampling from subjective uncertainty distributions in each factor to obtain a Maximum Likelihood estimate of the overall uncertainty in risk projections. The resulting methodology is applied to several human space exploration mission scenarios including ISS, a lunar station, a deep space outpost, and Mars missions of duration 360, 660, and 1000 days. The major results are the quantification of the uncertainties in current risk estimates, the identification of factors that dominate risk projection uncertainties, and the development of a method to quantify candidate approaches to reduce uncertainties or mitigate risks. The large uncertainties in GCR risk projections lead to probability distributions of risk that mask any potential risk reduction using the "optimization" of shielding materials or configurations. In contrast, the design of shielding optimization approaches for solar particle events and trapped protons can be made at this time, and promising technologies can be shown to have merit using our approach. The methods used also make it possible to express risk management objectives in terms of quantitative objectives, i.e., the number of days in space without exceeding a given risk level within well-defined confidence limits.

  3. Overview of Risk-Estimation Tools for Primary Prevention of Cardiovascular Diseases in European Populations.

    PubMed

    Gorenoi, Vitali; Hagen, Anja

    2015-06-01

    To identify persons at high risk for cardiovascular diseases (CVD), special tools (scores, charts, graphics or computer programs) for CVD-risk assessment based on the levels of certain risk factors have been constructed. The applicability of these instruments depends on the derivation cohorts, the risk factors and endpoints considered, the statistical methods applied, and the formats used. The review addresses the risk-estimation tools for primary prevention of CVD potentially relevant for European populations. The risk-estimation tools were identified using two previously published systematic reviews as well as by conducting a literature search in MEDLINE and a manual search. Only instruments were considered which were derived from cohorts of at least 1,000 participants of one gender without pre-existing CVD, enable risk assessment for a period of at least 5 years, were designed for an age range of at least 25 years, and were published after the year 2000. A number of risk-estimation tools for CVD derived from single European, several European, and non-European cohorts were identified. From a clinical perspective, instruments developed recently for the population of interest, which use easily accessible measures and show high discriminating ability, seem preferable for estimating CVD risk. Instruments restricting risk estimation to certain cardiovascular events, recalibrated high-accuracy tools, or tools derived from European populations with a similar risk-factor distribution and CVD incidence are the second choice. In younger people, calculating the relative risk or cardiovascular-age equivalence measures may be of more benefit.

  4. Multiple primary tumours: incidence estimation in the presence of competing risks

    PubMed Central

    Rosso, Stefano; Terracini, Lea; Ricceri, Fulvio; Zanetti, Roberto

    2009-01-01

    Background: Estimating the risk of developing subsequent primary tumours in a population is difficult, since the occurrence probability is conditioned on the survival probability. Methods: We proposed applying Markov models to study the transition intensities from first to second tumour with the Aalen-Johansen (AJ) estimators, as usually done in competing-risks models. In a simulation study, we applied the proposed method in different settings with constant or varying underlying intensities and applying age standardisation. In addition, we illustrated the method with data on breast cancer from the Piedmont Cancer Registry. Results: The simulation study showed that the person-years approach led to a considerably wider bias than the AJ estimators. The largest bias was observed assuming constantly increasing incidence rates. However, this situation is rather uncommon when dealing with the incidence of subsequent tumours. In 9233 breast cancer cases occurring in women resident in Turin, Italy, between 1985 and 1998, we observed a significant increased risk of 1.91 for subsequent cancer of the corpus uteri, estimated with the age-standardised Aalen-Johansen incidence ratio (AJ-IRstand), and a significant increased risk of 1.29 for cancers possibly related to the radiotherapy of breast cancer. The peak of occurrence of those cancers was observed after 8 years of follow-up. Conclusion: The increased risk of cancer of the corpus uteri, also observed in other studies, is usually attributed to shared risk factors such as low parity, early menarche and late onset of menopause. We also grouped together those cancers possibly associated with a previous local radiotherapy: the cumulative risk at 14 years is still not significant; however, the AJ estimators showed a significant risk peak between the eighth and the ninth year. Finally, the proposed approach has been shown to be reliable and informative under several aspects. It allowed for a correct estimation of the risk, and for investigating
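In this two-state competing-risks setting, the Aalen-Johansen estimator reduces to accumulating the cause-specific hazard weighted by overall survival just before each event. A minimal numpy sketch, assuming distinct event times (event code 0 = censored, 1, 2, … = cause codes):

```python
import numpy as np

def cumulative_incidence(times, events, cause=1):
    """Aalen-Johansen cumulative incidence function for one cause under
    competing risks, with right censoring (event code 0)."""
    times = np.asarray(times, dtype=float)
    events = np.asarray(events, dtype=int)
    order = np.argsort(times)
    times, events = times[order], events[order]
    at_risk = len(times)
    surv = 1.0                    # all-cause survival just before t
    cif = 0.0
    out_t, out_cif = [0.0], [0.0]
    for t, e in zip(times, events):
        if e == cause:
            cif += surv / at_risk          # hazard increment weighted by survival
        if e != 0:
            surv *= 1.0 - 1.0 / at_risk    # all-cause Kaplan-Meier step
        at_risk -= 1
        out_t.append(t)
        out_cif.append(cif)
    return np.array(out_t), np.array(out_cif)
```

Unlike a naive person-years rate, this estimator keeps the cumulative incidences of all causes and overall survival summing to one, which is the correction the abstract argues for.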

  5. Quantitative Risk reduction estimation Tool For Control Systems, Suggested Approach and Research Needs

    SciTech Connect

    Miles McQueen; Wayne Boyer; Mark Flynn; Sam Alessi

    2006-03-01

    For the past year we have applied a variety of risk assessment technologies to evaluate the risk to critical infrastructure from cyber attacks on control systems. More recently, we identified the need for a stand-alone control system risk reduction estimation tool to provide owners and operators of control systems with a more usable, reliable, and credible method for managing the risks from cyber attack. Risk is defined as the probability of a successful attack times the value of the resulting loss, typically measured in lives and dollars. Qualitative and ad hoc techniques for measuring risk do not provide sufficient support for cost-benefit analyses associated with cyber security mitigation actions. To address the need for better quantitative risk reduction models we surveyed previous quantitative risk assessment research; evaluated currently available tools; developed new quantitative techniques [17] [18]; implemented a prototype analysis tool to demonstrate how such a tool might be used; used the prototype to test a variety of underlying risk calculational engines (e.g. attack tree, attack graph); and identified technical and research needs. We concluded that significant gaps still exist and difficult research problems remain for quantitatively assessing the risk to control system components and networks, but that a usable quantitative risk reduction estimation tool is not beyond reach.

  6. Space Radiation Heart Disease Risk Estimates for Lunar and Mars Missions

    NASA Technical Reports Server (NTRS)

    Cucinotta, Francis A.; Chappell, Lori; Kim, Myung-Hee

    2010-01-01

    The NASA Space Radiation Program performs research on the risks of late effects from space radiation for cancer, neurological disorders, cataracts, and heart disease. For mortality risks, an aggregate over all risks should be considered, as well as a projection of the life loss per radiation-induced death. We report on a triple-detriment life-table approach to combine cancer and heart disease risks. Epidemiology results show extensive heterogeneity between populations for distinct components of the overall heart disease risks, including hypertension, ischaemic heart disease, stroke, and cerebrovascular diseases. We report on an update to our previous estimates for heart disease (ICD9 390-429) and stroke (ICD9 430-438), and other sub-groups, using recent meta-analysis results for various cohorts exposed to low-LET radiation. Results for multiplicative and additive risk transfer models are considered using baseline rates for US males and females. Uncertainty analysis indicated heart mortality risks as low as zero, assuming a threshold dose for deterministic effects, and projections approaching one-third of the overall cancer risk. Median life-loss per death estimates were significantly less than those for solid cancers and leukemias. Critical research questions to improve risk estimates for heart disease are distinctions in mechanisms at high doses (>2 Gy) and low to moderate doses (<2 Gy), and data and basic understanding of radiation dose-rate and quality effects, and individual sensitivity.

  7. Uncertainties in estimates of the risks of late effects from space radiation

    NASA Technical Reports Server (NTRS)

    Cucinotta, F. A.; Schimmerling, W.; Wilson, J. W.; Peterson, L. E.; Saganti, P. B.; Dicello, J. F.

    2004-01-01

    Methods used to project risks in low-Earth orbit are of questionable merit for exploration missions because of the limited radiobiology data and knowledge of galactic cosmic ray (GCR) heavy ions, which causes estimates of the risk of late effects to be highly uncertain. Risk projections involve a product of many biological and physical factors, each of which has a differential range of uncertainty due to lack of data and knowledge. Using the linear-additivity model for radiation risks, we use Monte-Carlo sampling from subjective uncertainty distributions in each factor to obtain an estimate of the overall uncertainty in risk projections. The resulting methodology is applied to several human space exploration mission scenarios including a deep space outpost and Mars missions of duration of 360, 660, and 1000 days. The major results are the quantification of the uncertainties in current risk estimates, the identification of factors that dominate risk projection uncertainties, and the development of a method to quantify candidate approaches to reduce uncertainties or mitigate risks. The large uncertainties in GCR risk projections lead to probability distributions of risk that mask any potential risk reduction using the "optimization" of shielding materials or configurations. In contrast, the design of shielding optimization approaches for solar particle events and trapped protons can be made at this time and promising technologies can be shown to have merit using our approach. The methods used also make it possible to express risk management objectives in terms of quantitative metrics, e.g., the number of days in space without exceeding a given risk level within well-defined confidence limits. Published by Elsevier Ltd on behalf of COSPAR.
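The Monte-Carlo propagation described in this and the preceding record multiplies a point risk estimate by samples drawn from subjective uncertainty distributions for each factor, then reads confidence limits off the resulting distribution. The distributions, widths, and nominal value below are invented for illustration, not NASA's actual quantities.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Illustrative subjective uncertainty distributions for the main
# multiplicative factors in a risk projection (shapes and widths invented).
nominal_risk = 0.032                                  # nominal REID point estimate
f_physics = rng.normal(1.0, 0.15, n)                  # environment/organ dose factor
f_coeff = rng.lognormal(0.0, 0.3, n)                  # low-LET risk coefficient
f_ddref = 1.0 / rng.triangular(1.0, 2.0, 4.0, n)      # dose-rate reduction factor
f_quality = rng.lognormal(0.0, 0.5, n)                # radiation quality factor

risk = nominal_risk * f_physics * f_coeff * f_ddref * f_quality
lo, hi = np.percentile(risk, [2.5, 97.5])             # 95% confidence interval
```

The spread between `lo` and `hi` is dominated by the widest factor distribution (here the quality factor), which mirrors the paper's finding that quality-factor uncertainty dominates GCR risk projections.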

  8. A simple procedure for estimating pseudo risk ratios from exposure to non-carcinogenic chemical mixtures.

    PubMed

    Scinicariello, Franco; Portier, Christopher

    2016-03-01

    Non-cancer risk assessment traditionally assumes a threshold of effect, below which there is a negligible risk of an adverse effect. The Agency for Toxic Substances and Disease Registry derives health-based guidance values known as Minimal Risk Levels (MRLs) as estimates of the toxicity threshold for non-carcinogens. Although the definition of an MRL, as well as EPA reference dose values (RfD and RfC), is a level that corresponds to "negligible risk," they represent daily exposure doses or concentrations, not risks. We present a new approach to calculate the risk at exposure to specific doses for chemical mixtures, the assumption in this approach is to assign de minimis risk at the MRL. The assigned risk enables the estimation of parameters in an exponential model, providing a complete dose-response curve for each compound from the chosen point of departure to zero. We estimated parameters for 27 chemicals. The value of k, which determines the shape of the dose-response curve, was moderately insensitive to the choice of the risk at the MRL. The approach presented here allows for the calculation of a risk from a single substance or the combined risk from multiple chemical exposures in a community. The methodology is applicable from point of departure data derived from quantal data, such as data from benchmark dose analyses or from data that can be transformed into probabilities, such as lowest-observed-adverse-effect level. The individual risks are used to calculate risk ratios that can facilitate comparison and cost-benefit analyses of environmental contamination control strategies.
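The anchoring idea in this record can be sketched by fitting a two-parameter exponential dose-response curve through two points: a de minimis risk at the MRL and a benchmark response at the point of departure (POD). The functional form `risk(d) = 1 - exp(-b * d**k)` and the anchor values below are assumptions chosen to illustrate the approach, not the authors' exact model.

```python
import math

def fit_exponential(mrl, pod, risk_at_mrl=1e-4, risk_at_pod=0.1):
    """Solve for (k, b) in risk(d) = 1 - exp(-b * d**k) so the curve passes
    through (mrl, risk_at_mrl) and (pod, risk_at_pod)."""
    num = math.log(1.0 - risk_at_pod)
    den = math.log(1.0 - risk_at_mrl)
    k = math.log(num / den) / math.log(pod / mrl)   # shape parameter
    b = -den / mrl**k
    return k, b

def risk(d, k, b):
    """Risk at dose d under the fitted exponential model."""
    return 1.0 - math.exp(-b * d**k)

def mixture_risk(risks):
    """Combined risk from independent co-exposures: 1 - prod(1 - r_i)."""
    p = 1.0
    for r in risks:
        p *= 1.0 - r
    return 1.0 - p
```

With the curve anchored this way, per-chemical risks at measured doses can be combined into a single mixture risk, and ratios of such risks give the "pseudo risk ratios" used for comparing contamination-control strategies.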

  9. Breast Cancer Risk Estimation Using Parenchymal Texture Analysis in Digital Breast Tomosynthesis

    SciTech Connect

    Ikejimba, Lynda C.; Kontos, Despina; Maidment, Andrew D. A.

    2010-10-11

    Mammographic parenchymal texture has been shown to correlate with genetic markers of developing breast cancer. Digital breast tomosynthesis (DBT) is a novel x-ray imaging technique in which tomographic images of the breast are reconstructed from multiple source projections acquired at different angles of the x-ray tube. Compared to digital mammography (DM), DBT eliminates breast tissue overlap, offering superior parenchymal tissue visualization. We hypothesize that texture analysis in DBT could potentially provide a better assessment of parenchymal texture and ultimately result in more accurate assessment of breast cancer risk. As a first step towards validating this hypothesis, we investigated the association between DBT parenchymal texture and breast percent density (PD), a known breast cancer risk factor, and compared it to DM. Bilateral DBT and DM images from 71 women participating in a breast cancer screening trial were analyzed. Filtered-backprojection was used to reconstruct DBT tomographic planes in 1 mm increments with 0.22 mm in-plane resolution. Corresponding DM images were acquired at 0.1 mm pixel resolution. Retroareolar regions of interest (ROIs) equivalent to 2.5 cm³ were segmented from the DBT images and corresponding 2.5 cm² ROIs were segmented from the DM images. Breast PD was mammographically estimated using the Cumulus scale. Overall, DBT texture features demonstrated a stronger correlation than DM to PD. The Pearson correlation coefficients for DBT were r = 0.40 (p<0.001) for contrast and r = -0.52 (p<0.001) for homogeneity; the corresponding DM correlations were r = 0.26 (p = 0.002) and r = -0.33 (p<0.001). Multiple linear regression of the texture features versus breast PD also demonstrated significantly stronger associations in DBT (R² = 0.39) compared to DM (R² = 0.33). We attribute these observations to the superior parenchymal tissue visualization in DBT.
Our study is the first to perform DBT texture analysis in a

  10. Estimation model of life insurance claims risk for cancer patients by using Bayesian method

    NASA Astrophysics Data System (ADS)

    Sukono; Suyudi, M.; Islamiyati, F.; Supian, S.

    2017-01-01

    This paper discusses an estimation model for the risk of life insurance claims for cancer patients using the Bayesian method. To estimate the risk of a claim, the insurance participant data are grouped into two categories: the number of policies issued and the number of claims incurred. Model estimation is done using a Bayesian approach. The estimator model was then used to estimate the risk value of life insurance claims for each age group and each sex. The estimation results indicate that the risk premium for insured males aged less than 30 years is 0.85; for ages 30 to 40 years, 3.58; for ages 41 to 50 years, 1.71; for ages 51 to 60 years, 2.96; and for those aged over 60 years, 7.82. Meanwhile, for insured females aged less than 30 years it is 0.56; for ages 30 to 40 years, 3.21; for ages 41 to 50 years, 0.65; for ages 51 to 60 years, 3.12; and for those aged over 60 years, 9.99. This study is useful in determining the risk premium in homogeneous groups based on gender and age.
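The grouped estimate described above can be sketched with a conjugate Beta-Binomial model, a standard Bayesian choice for claim counts; the Beta(1, 1) prior and the group counts below are illustrative assumptions, since the abstract does not state the prior or the raw data.

```python
# Minimal sketch of a Bayesian claim-risk estimate per age/sex group:
# claims ~ Binomial(policies, theta) with a Beta(a, b) prior on theta,
# so the posterior mean of theta is (claims + a) / (policies + a + b).

def posterior_claim_risk(claims, policies, a=1.0, b=1.0):
    """Posterior mean claim probability under a Beta(a, b) prior."""
    return (claims + a) / (policies + a + b)

# Illustrative (hypothetical) group data: (claims incurred, policies issued)
groups = {
    ("male", "<30"): (3, 400),
    ("male", "30-40"): (12, 350),
    ("female", "<30"): (2, 420),
}

for group, (claims, policies) in sorted(groups.items()):
    print(group, round(posterior_claim_risk(claims, policies), 4))
```

With more data per group, the posterior mean approaches the raw claim rate; with little data, it shrinks toward the prior mean, which is the practical benefit of the Bayesian approach for small age/sex cells.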

  11. Spatial Estimation of Populations at Risk from Radiological Dispersion Device Terrorism Incidents

    SciTech Connect

    Regens, J.L.; Gunter, J.T.

    2008-07-01

    Delineation of the location and size of the population potentially at risk of exposure to ionizing radiation is one of the key analytical challenges in accurately estimating the severity of the potential health effects associated with a radiological terrorism incident. Regardless of spatial scale, the geographical units for which population data commonly are collected rarely coincide with the geographical scale necessary for effective incident management and medical response. This paper identifies major government and commercial open sources of U.S. population data and presents a GIS-based approach for allocating publicly available population data, including age distributions, to geographical units appropriate for planning and implementing incident management and medical response strategies. In summary: The gravity model offers a straightforward, empirical tool for estimating population flows, especially when geographical areas are relatively well-defined in terms of accessibility and spatial separation. This is particularly important for several reasons. First, the spatial scale of the area impacted by an RDD terrorism event is unlikely to match fully the spatial scale of available population data. That is, the plume spread typically will not uniformly overlay the impacted area. Second, the number of people within the impacted area varies as a function of whether an attack occurs during the day or night. For example, the population of a central business district or industrial area typically is larger during the day, while predominantly residential areas have larger nighttime populations. As a result, interpolation techniques are needed that link population data to geographical units and allocate those data by time frame at a spatial scale relevant to enhancing preparedness and response.
The gravity model's main advantage is that it efficiently allocates readily available, open source population data to geographical units appropriate for planning and implementing
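The gravity model referred to above can be sketched in a few lines: interaction between two zones scales with the product of their populations and decays with distance. The constant k and the distance-decay exponent beta (classically 2) below are illustrative assumptions, not values from the paper.

```python
# Gravity model for estimating the interaction (population flow) between
# two zones: flow is proportional to the product of the zone populations
# divided by distance raised to a decay exponent.

def gravity_flow(pop_i, pop_j, distance_km, k=1.0, beta=2.0):
    """Estimated interaction (e.g., daytime commuter flow) between two zones."""
    return k * pop_i * pop_j / distance_km ** beta

# e.g., relative flow between a residential tract and a central business
# district 10 km away (hypothetical populations)
print(gravity_flow(25_000, 80_000, 10.0))
```

In practice the flows would be normalized across all zone pairs so that each zone's allocated day/night population sums to a known total.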

  12. Comparison of Paper-and-Pencil versus Web Administration of the Youth Risk Behavior Survey (YRBS): Risk Behavior Prevalence Estimates

    ERIC Educational Resources Information Center

    Eaton, Danice K.; Brener, Nancy D.; Kann, Laura; Denniston, Maxine M.; McManus, Tim; Kyle, Tonja M.; Roberts, Alice M.; Flint, Katherine H.; Ross, James G.

    2010-01-01

    The authors examined whether paper-and-pencil and Web surveys administered in the school setting yield equivalent risk behavior prevalence estimates. Data were from a methods study conducted by the Centers for Disease Control and Prevention (CDC) in spring 2008. Intact classes of 9th- or 10th-grade students were assigned randomly to complete a…

  13. ESTIMATING RISK TO CALIFORNIA ENERGY INFRASTRUCTURE FROM PROJECTED CLIMATE CHANGE

    SciTech Connect

    Sathaye, Jayant; Dale, Larry; Larsen, Peter; Fitts, Gary; Koy, Kevin; Lewis, Sarah; Lucena, Andre

    2011-06-22

    This report outlines the results of a study of the impact of climate change on the energy infrastructure of California and the San Francisco Bay region, including impacts on power plant generation; transmission line and substation capacity during heat spells; wildfires near transmission lines; sea level encroachment upon power plants, substations, and natural gas facilities; and peak electrical demand. Some end-of-century impacts were projected: Expected warming will decrease gas-fired generator efficiency. The maximum statewide coincident loss is projected at 10.3 gigawatts (with current power plant infrastructure and population), an increase of 6.2 percent over current temperature-induced losses. By the end of the century, electricity demand for almost all summer days is expected to exceed the current ninetieth percentile per-capita peak load. As much as 21 percent growth is expected in ninetieth percentile peak demand (per capita, exclusive of population growth). When generator losses are included in the demand, the ninetieth percentile peaks may increase up to 25 percent. As the climate warms, California's peak supply capacity will need to grow faster than the population. Substation capacity is projected to decrease an average of 2.7 percent. A 5°C (9°F) air temperature increase (the average increase predicted for hot days in August) will diminish the capacity of a fully-loaded transmission line by an average of 7.5 percent. The potential exposure of transmission lines to wildfire is expected to increase with time. We have identified some lines whose probability of exposure to fire is expected to increase by as much as 40 percent. Up to 25 coastal power plants and 86 substations are at risk of flooding (or partial flooding) due to sea level rise.

  14. Summary Report on the Graded Prognostic Assessment: An Accurate and Facile Diagnosis-Specific Tool to Estimate Survival for Patients With Brain Metastases

    PubMed Central

    Sperduto, Paul W.; Kased, Norbert; Roberge, David; Xu, Zhiyuan; Shanley, Ryan; Luo, Xianghua; Sneed, Penny K.; Chao, Samuel T.; Weil, Robert J.; Suh, John; Bhatt, Amit; Jensen, Ashley W.; Brown, Paul D.; Shih, Helen A.; Kirkpatrick, John; Gaspar, Laurie E.; Fiveash, John B.; Chiang, Veronica; Knisely, Jonathan P.S.; Sperduto, Christina Maria; Lin, Nancy; Mehta, Minesh

    2012-01-01

    Purpose Our group has previously published the Graded Prognostic Assessment (GPA), a prognostic index for patients with brain metastases. Updates have been published with refinements to create diagnosis-specific Graded Prognostic Assessment indices. The purpose of this report is to present the updated diagnosis-specific GPA indices in a single, unified, user-friendly report to allow ease of access and use by treating physicians. Methods A multi-institutional retrospective (1985 to 2007) database of 3,940 patients with newly diagnosed brain metastases underwent univariate and multivariate analyses of prognostic factors associated with outcomes by primary site and treatment. Significant prognostic factors were used to define the diagnosis-specific GPA prognostic indices. A GPA of 4.0 correlates with the best prognosis, whereas a GPA of 0.0 corresponds with the worst prognosis. Results Significant prognostic factors varied by diagnosis. For lung cancer, prognostic factors were Karnofsky performance score, age, presence of extracranial metastases, and number of brain metastases, confirming the original Lung-GPA. For melanoma and renal cell cancer, prognostic factors were Karnofsky performance score and the number of brain metastases. For breast cancer, prognostic factors were tumor subtype, Karnofsky performance score, and age. For GI cancer, the only prognostic factor was the Karnofsky performance score. The median survival times by GPA score and diagnosis were determined. Conclusion Prognostic factors for patients with brain metastases vary by diagnosis, and for each diagnosis, a robust separation into different GPA scores was discerned, implying considerable heterogeneity in outcome, even within a single tumor type. In summary, these indices and related worksheet provide an accurate and facile diagnosis-specific tool to estimate survival, potentially select appropriate treatment, and stratify clinical trials for patients with brain metastases. PMID:22203767

  15. Estimation of cancer risks and benefits associated with a potential increased consumption of fruits and vegetables.

    PubMed

    Reiss, Richard; Johnston, Jason; Tucker, Kevin; DeSesso, John M; Keen, Carl L

    2012-12-01

    The current paper provides an analysis of the potential number of cancer cases that might be prevented if half the U.S. population increased its fruit and vegetable consumption by one serving each per day. This number is contrasted with an upper-bound estimate of concomitant cancer cases that might be theoretically attributed to the intake of pesticide residues arising from the same additional fruit and vegetable consumption. The cancer prevention estimates were derived using a published meta-analysis of nutritional epidemiology studies. The cancer risks were estimated using U.S. Environmental Protection Agency (EPA) methods, cancer potency estimates from rodent bioassays, and pesticide residue sampling data from the U.S. Department of Agriculture (USDA). The resulting estimates are that approximately 20,000 cancer cases per year could be prevented by increasing fruit and vegetable consumption, while up to 10 cancer cases per year could be caused by the added pesticide consumption. These estimates have significant uncertainties (e.g., potential residual confounding in the fruit and vegetable epidemiologic studies and reliance on rodent bioassays for cancer risk). However, the overwhelming difference between benefit and risk estimates provides confidence that consumers should not be concerned about cancer risks from consuming conventionally-grown fruits and vegetables.
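The benefit-risk comparison above is, at its core, simple arithmetic: avoided cases from a relative-risk reduction applied to baseline incidence, versus an upper-bound caused-case count from a linear cancer potency applied to added residue intake. Every numeric input below is an illustrative assumption, not a value from the paper, which derives its inputs from a nutritional meta-analysis, EPA potency factors, and USDA residue data.

```python
# Benefit: annual cancer cases avoided if a population's relative risk
# drops by `rr_reduction` against a baseline annual cancer incidence.
# Risk: upper-bound annual cases from added pesticide intake, using a
# linear potency (lifetime risk per mg/kg/day) spread over a 70-year life.

def cases_prevented(population, baseline_annual_rate, rr_reduction):
    return population * baseline_annual_rate * rr_reduction

def cases_caused_upper_bound(population, dose_mg_kg_day, potency, lifetime_y=70.0):
    return population * dose_mg_kg_day * potency / lifetime_y

pop = 150e6  # roughly half the U.S. population (illustrative)
benefit = cases_prevented(pop, baseline_annual_rate=0.004, rr_reduction=0.033)
risk = cases_caused_upper_bound(pop, dose_mg_kg_day=1e-6, potency=0.05)
print(round(benefit), round(risk, 2))
```

Even with generous upper-bound choices on the risk side, the two quantities differ by several orders of magnitude, which is the structure of the paper's conclusion.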

  16. Occupational and consumer risk estimates for nanoparticles emitted by laser printers

    NASA Astrophysics Data System (ADS)

    Hänninen, Otto; Brüske-Hohlfeld, Irene; Loh, Miranda; Stoeger, Tobias; Kreyling, Wolfgang; Schmid, Otmar; Peters, Annette

    2010-01-01

    Several studies have reported laser printers as significant sources of nanosized particles (<0.1 μm). Laser printers are used occupationally in office environments and by consumers in their homes. The current work combines existing epidemiological and toxicological evidence on particle-related health effects, measuring doses as mass, particle number and surface area, to estimate and compare the potential risks in occupational and consumer exposure scenarios related to the use of laser printers. The daily uptake of laser printer particles was estimated based on measured particle size distributions and lung deposition modelling. The obtained daily uptakes (particle mass 0.15-0.44 μg d⁻¹; particle number 1.1-3.1 × 10⁹ d⁻¹) were estimated to correspond to 4-13 (mass) or 12-34 (number) deaths per million persons exposed on the basis of epidemiological risk estimates for ambient particles. These risks are higher than the generally used definition of acceptable risk of 1 × 10⁻⁶, but substantially lower than the estimated risks due to ambient particles. Toxicological studies on ambient particles revealed consistent values for lowest observed effect levels (LOELs), which were converted into equivalent daily uptakes using allometric scaling. These LOEL uptakes were higher than the estimated uptakes from printers by a factor of about 330-1,000 (mass) and 1,000-2,500 (particle surface area). This toxicological assessment would indicate no significant health risks due to printer particles. Finally, our study suggests that particle number (not mass) and mass (not surface area) are the most conservative risk metrics for the epidemiological and toxicological risks presented here, respectively.
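The daily-uptake step described above reduces to concentration × breathing rate × exposure time × lung deposition fraction. The numeric inputs below are illustrative assumptions, not the study's measured size distributions or deposition model.

```python
# Daily inhaled particle mass uptake from an indoor source (no clearance):
# mass concentration (ug/m3) x breathing rate (m3/h) x hours exposed
# x fraction of inhaled particles deposited in the lung.

def daily_uptake_ug(conc_ug_m3, breathing_m3_h, hours, deposition_fraction):
    return conc_ug_m3 * breathing_m3_h * hours * deposition_fraction

# e.g., an 8-hour office day at light activity with 30% deposition
print(round(daily_uptake_ug(0.1, 0.54, 8.0, 0.3), 3))
```

For ultrafine particles the deposition fraction is strongly size-dependent, which is why the study uses a full lung deposition model rather than a single fraction.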

  17. Laypersons’ Responses to the Communication of Uncertainty Regarding Cancer Risk Estimates

    PubMed Central

    Han, Paul K. J.; Klein, William M. P.; Lehman, Thomas C.; Massett, Holly; Lee, Simon C.; Freedman, Andrew N.

    2009-01-01

    Objective To explore laypersons’ responses to the communication of uncertainty associated with individualized cancer risk estimates and to identify reasons for individual differences in these responses. Design A qualitative study was conducted using focus groups. Participants were informed about a new colorectal cancer risk prediction model, and presented with hypothetical individualized risk estimates using presentation formats varying in expressed uncertainty (range v. point estimate). Semistructured interviews explored participants’ responses to this information. Participants and Setting Eight focus groups were conducted with 48 adults aged 50 to 74 residing in 2 major US metropolitan areas, Chicago, IL and Washington, DC. Purposive sampling was used to recruit participants with a high school or greater education, some familiarity with information technology, and no personal or immediate family history of cancer. Results Participants identified several sources of uncertainty regarding cancer risk estimates, including missing data, limitations in accuracy and source credibility, and conflicting information. In comparing presentation formats, most participants reported greater worry and perceived risk with the range than with the point estimate, consistent with the phenomenon of “ambiguity aversion.” However, others reported the opposite effect or else indifference between formats. Reasons suggested by participants’ responses included individual differences in optimism and motivations to reduce feelings of vulnerability and personal lack of control. Perceptions of source credibility and risk mutability emerged as potential mediating factors. Conclusions Laypersons’ responses to the communication of uncertainty regarding cancer risk estimates differ, and include both heightened and diminished risk perceptions. These differences may be attributable to personality, cognitive, and motivational factors. PMID:19470720

  18. Assessment of the value of a genetic risk score in improving the estimation of coronary risk

    PubMed Central

    Lluis-Ganella, Carla; Subirana, Isaac; Lucas, Gavin; Tomás, Marta; Muñoz, Daniel; Sentí, Mariano; Salas, Eduardo; Sala, Joan; Ramos, Rafel; Ordovas, Jose M; Marrugat, Jaume; Elosua, Roberto

    2013-01-01

    Background The American Heart Association has established criteria for the evaluation of novel markers of cardiovascular risk. In accordance with these criteria, we assessed the association between a multi-locus genetic risk score (GRS) and incident coronary heart disease (CHD), and evaluated whether this GRS improves the predictive capacity of the Framingham risk function. Methods and results Using eight genetic variants associated with CHD but not with classical cardiovascular risk factors (CVRFs), we generated a multi-locus GRS, and found it to be linearly associated with CHD in two population-based cohorts: the REGICOR Study (n=2,351) and the Framingham Heart Study (n=3,537) (meta-analyzed HR [95%CI]: ~1.13 [1.01–1.27], per unit). Inclusion of the GRS in the Framingham risk function improved its discriminative capacity in the Framingham sample (c-statistic: 72.81 vs. 72.37, p=0.042) but not in the REGICOR sample. According to both the net reclassification improvement (NRI) index and the integrated discrimination index (IDI), the GRS improved re-classification among individuals with intermediate coronary risk (meta-analysis NRI [95%CI]: 17.44 [8.04; 26.83]), but not overall. Conclusions A multi-locus GRS based on genetic variants unrelated to CVRFs was associated with a linear increase in risk of CHD events in two distinct populations. This GRS improves risk reclassification, particularly in the population at intermediate coronary risk. These results indicate the potential value of the inclusion of genetic information in classical functions for risk assessment in the intermediate-risk population group. PMID:22521901

  19. Assessment of the value of a genetic risk score in improving the estimation of coronary risk

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The American Heart Association has established criteria for the evaluation of novel markers of cardiovascular risk. In accordance with these criteria, we assessed the association between a multi-locus genetic risk score (GRS) and incident coronary heart disease (CHD), and evaluated whether this GRS ...

  20. Estimating the Contribution of Selected Risk Factors in Attributable Burden to Stroke in Iran

    PubMed Central

    Karami, M; Soori, H; Monfared, A Bahadori

    2012-01-01

    Background: Knowledge of the magnitude of the burden avoidable by modifying risk factors is needed for health policy, priority setting, and stroke prevention. The aim of this study was to estimate the contribution of selected risk factors including hypertension, overweight, obesity, tobacco use, and physical inactivity to the attributable burden of stroke in Iran. Methods: The World Health Organization Comparative Risk Assessment (CRA) methodology was employed to calculate the Potential Impact Fraction (PIF) and the percentage of avoidable stroke burden attributed to its risk factors among Iranian adults in 2009. Prevalence of risk factors was obtained from the 5th STEPS survey of chronic disease risk factors, which was conducted in 2009. PIF was estimated under both theoretical-minimum and feasible-minimum risk scenarios. A simulation procedure incorporating sources of uncertainty was used to estimate the uncertainties for the attributable burden. Results: About 15.7% (95% uncertainty interval: 5.8-23.5) of the Disability Adjusted Life Years (DALYs) attributable to stroke in adult males and 15.8% (95% uncertainty interval: 5.8-23.5) in adult females are avoidable by reducing the current prevalence of hypertension (16.0% and 16.1% for males and females, respectively) to 10% in both sexes. Conclusion: This work highlighted the important role of hypertension and overweight. Accordingly, policy makers are advised to consider these risk factors when implementing interventional programs in Iran. PMID:23113182
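For a binary risk factor, the Potential Impact Fraction used in the CRA methodology above is the proportional reduction in burden when exposure prevalence falls from its observed value to a counterfactual level. The 16% → 10% hypertension prevalences below echo the abstract's scenario; the relative risk of 4.0 is an illustrative assumption.

```python
# Potential Impact Fraction (PIF) for a binary exposure:
# PIF = (observed risk - counterfactual risk) / observed risk,
# where risk = p * RR + (1 - p) for prevalence p and relative risk RR.

def potential_impact_fraction(p, p_cf, rr):
    observed = p * rr + (1.0 - p)
    counterfactual = p_cf * rr + (1.0 - p_cf)
    return (observed - counterfactual) / observed

# Hypertension prevalence dropping from 16% to 10% (RR is hypothetical)
print(round(potential_impact_fraction(0.16, 0.10, 4.0), 3))
```

When the counterfactual prevalence is zero, the PIF reduces to the familiar population attributable fraction.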

  1. Mathematical Models for Estimating the Risks of Bovine Spongiform Encephalopathy (BSE).

    PubMed

    Al-Zoughool, Mustafa; Cottrell, David; Elsaadany, Susie; Murray, Noel; Oraby, Tamer; Smith, Robert; Krewski, Daniel

    2015-01-01

    When the bovine spongiform encephalopathy (BSE) epidemic first emerged in the United Kingdom in the mid-1980s, the etiology of animal prion diseases was largely unknown. Risk management efforts to control the disease were also subject to uncertainties regarding the extent of BSE infections and the future course of the epidemic. As understanding of BSE increased, mathematical models were developed to estimate the risk of BSE infection and to predict reductions in risk in response to BSE control measures. Risk models of BSE-transmission dynamics determined disease persistence in cattle herds and the relative infectivity of cattle prior to the onset of clinical disease. These BSE models helped in understanding key epidemiological features of BSE transmission and dynamics, such as the incubation period distribution and age-dependent susceptibility to infection with the BSE agent. This review summarizes different mathematical models and methods that have been used to estimate the risk of BSE, and discusses how such risk projection models have informed risk assessment and management of BSE. This review also provides some general insights on how mathematical models of the type discussed here may be used to estimate risks of emerging zoonotic diseases when biological data on transmission of the etiological agent are limited.

  2. Mobile Applications for Type 2 Diabetes Risk Estimation: a Systematic Review.

    PubMed

    Fijacko, Nino; Brzan, Petra Povalej; Stiglic, Gregor

    2015-10-01

    Screening for chronic diseases like type 2 diabetes can be done using different methods and various risk tests. This study presents a review of type 2 diabetes risk estimation mobile applications, focusing on their functionality and the availability of information on the underlying risk calculators. Only 9 of the 31 reviewed mobile applications, featured in three major mobile application stores, disclosed the name of the risk calculator used for assessing the risk of type 2 diabetes. Even more concerning, none of the reviewed applications mentioned collecting data from users to improve the performance of their risk estimation calculators, or offered users descriptive statistics of results from previous users. For that purpose, the questionnaires used for calculating risk should be upgraded to include the most recent blood sugar level measurements from users. Although mobile applications hold great potential for health applications, developers still do not put enough emphasis on informing the user of the underlying methods used to estimate the risk for a specific clinical condition.

  3. Race-specific genetic risk score is more accurate than nonrace-specific genetic risk score for predicting prostate cancer and high-grade diseases.

    PubMed

    Na, Rong; Ye, Dingwei; Qi, Jun; Liu, Fang; Lin, Xiaoling; Helfand, Brian T; Brendler, Charles B; Conran, Carly; Gong, Jian; Wu, Yishuo; Gao, Xu; Chen, Yaqing; Zheng, S Lilly; Mo, Zengnan; Ding, Qiang; Sun, Yinghao; Xu, Jianfeng

    2016-01-01

    A genetic risk score (GRS) based on disease risk-associated single nucleotide polymorphisms (SNPs) is an informative tool that can provide inherited information for specific diseases in addition to family history. However, it is still unknown whether only SNPs implicated in a specific racial group should be used when calculating GRSs. The objective of this study is to compare the performance of a race-specific GRS and a nonrace-specific GRS for predicting prostate cancer (PCa) among 1338 patients who underwent prostate biopsy in Shanghai, China. A race-specific GRS was calculated with seven PCa risk-associated SNPs implicated in East Asians (GRS7), and a nonrace-specific GRS was calculated based on 76 PCa risk-associated SNPs implicated in at least one racial group (GRS76). The means of GRS7 and GRS76 were 1.19 and 1.85, respectively, in the study population. Higher GRS7 and GRS76 were independent predictors for PCa and high-grade PCa in univariate and multivariate analyses. GRS7 had a better area under the receiver-operating curve (AUC) than GRS76 for discriminating PCa (0.602 vs 0.573) and high-grade PCa (0.603 vs 0.575), although the differences did not reach statistical significance. GRS7 had a better (up to 13% at different cutoffs) positive predictive value (PPV) than GRS76. In conclusion, a race-specific GRS is more robust and performs better when predicting PCa in East Asian men than a GRS calculated using SNPs that have not been shown to be associated with East Asians.
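One common way such a score is built, and the reason a mean near 1.0 is meaningful, is the population-standardized multiplicative form sketched below: each SNP contributes OR^n for n risk alleles, normalized by the SNP's population-average risk under Hardy-Weinberg equilibrium. The SNP count, odds ratios, allele frequencies, and genotype here are hypothetical placeholders, not the study's seven East Asian SNPs, and this construction is one common choice rather than necessarily the exact one used in the paper.

```python
# Population-standardized multiplicative genetic risk score: an individual
# carrying the population-average genotype at every SNP scores ~1.0.

def genetic_risk_score(allele_counts, odds_ratios, risk_allele_freqs):
    score = 1.0
    for n, oratio, f in zip(allele_counts, odds_ratios, risk_allele_freqs):
        # Population mean relative risk at this SNP under Hardy-Weinberg:
        # genotype frequencies (1-f)^2, 2f(1-f), f^2 with risks 1, OR, OR^2.
        mean_risk = (1 - f) ** 2 + 2 * f * (1 - f) * oratio + f ** 2 * oratio ** 2
        score *= oratio ** n / mean_risk
    return score

# Hypothetical 3-SNP panel: risk-allele counts, per-allele ORs, frequencies
print(round(genetic_risk_score([2, 1, 0], [1.3, 1.2, 1.5], [0.4, 0.25, 0.1]), 3))
```

A score above 1.0 then reads directly as "risk above the population average", which is how mean GRS values such as 1.19 and 1.85 are interpreted.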

  4. Prophylactic radiotherapy against heterotopic ossification following internal fixation of acetabular fractures: a comparative estimate of risk

    PubMed Central

    Nasr, P; Yip, G; Scaife, J E; House, T; Thomas, S J; Harris, F; Owen, P J; Hull, P

    2014-01-01

    Objective: Radiotherapy (RT) is effective in preventing heterotopic ossification (HO) around acetabular fractures requiring surgical reconstruction. We audited outcomes and estimated risks from RT prophylaxis and from the alternatives of indometacin or no prophylaxis. Methods: 34 patients underwent reconstruction of acetabular fractures through a posterior approach, followed by an 8-Gy single fraction. The mean age was 44 years. The mean time from surgery to RT was 1.1 days. The major RT risk is radiation-induced fatal cancer. The International Commission on Radiological Protection (ICRP) method was used to estimate risk, and compared with a method (Trott and Kemprad) designed specifically for estimating RT risk in benign disease. These were compared with the risks associated with indometacin and with no prophylaxis. Results: 28 patients (82%) developed no HO; 6 developed Brooker Class I; and none developed Class II–IV HO. The ICRP method suggests a risk of fatal cancer in the range of 1 in 1000 to 1 in 10,000; the Trott and Kemprad method suggests 1 in 3000. For younger patients, this may rise to 1 in 2000; and for elderly patients, it may fall to 1 in 6000. The risk of death from gastric bleeding or perforation from indometacin is 1 in 180 to 1 in 900 in older patients. Without prophylaxis, the risk of death from reoperation to remove HO is 1 in 4000 to 1 in 30,000. Conclusion: These results are encouraging, consistent with much larger series, and endorse our multidisciplinary management. Risk estimates can be used in discussion with patients. Advances in knowledge: The risk from RT prophylaxis is small; it is safer than indometacin and substantially overlaps with the range for no prophylaxis. PMID:25089852
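The order of magnitude of such estimates can be sketched as effective dose × a nominal fatal-cancer coefficient. The tissue-weighting factor below is an illustrative assumption for a small irradiated volume; the ~5%/Sv nominal coefficient follows common ICRP practice, but this sketch is not the paper's actual calculation.

```python
# Convert a localized therapeutic dose to a whole-body effective dose via a
# weighting factor for the small irradiated volume, then apply a nominal
# fatal-cancer risk coefficient (~5% per Sv, ICRP-style).

def fatal_cancer_risk(local_dose_gy, effective_weight, risk_per_sv=0.05):
    effective_dose_sv = local_dose_gy * effective_weight
    return effective_dose_sv * risk_per_sv

risk = fatal_cancer_risk(8.0, effective_weight=0.001)
print(f"roughly 1 in {round(1.0 / risk)}")
```

With the assumed weighting, the result falls inside the 1-in-1000 to 1-in-10,000 range quoted in the abstract; the true weighting depends on which tissues receive scattered dose and the patient's age.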

  5. Estimating risk at a Superfund site using passive sampling devices as biological surrogates in human health risk models.

    PubMed

    Allan, Sarah E; Sower, Gregory J; Anderson, Kim A

    2011-10-01

    Passive sampling devices (PSDs) sequester the freely dissolved fraction of lipophilic contaminants, mimicking passive chemical uptake and accumulation by biomembranes and lipid tissues. Public Health Assessments that inform the public about health risks from exposure to contaminants through consumption of resident fish are generally based on tissue data, which can be difficult to obtain and requires destructive sampling. The purpose of this study is to apply PSD data in a Public Health Assessment to demonstrate that PSDs can be used as a biological surrogate to evaluate potential human health risks and elucidate spatio-temporal variations in risk. PSDs were used to measure polycyclic aromatic hydrocarbons (PAHs) in the Willamette River; upriver, downriver and within the Portland Harbor Superfund megasite for 3 years during wet and dry seasons. Based on an existing Public Health Assessment for this area, concentrations of PAHs in PSDs were substituted for fish tissue concentrations. PSD measured PAH concentrations captured the magnitude, range and variability of PAH concentrations reported for fish/shellfish from Portland Harbor. Using PSD results in place of fish data revealed an unacceptable risk level for cancer in all seasons but no unacceptable risk for non-cancer endpoints. Estimated cancer risk varied by several orders of magnitude based on season and location. Sites near coal tar contamination demonstrated the highest risk, particularly during the dry season and remediation activities. Incorporating PSD data into Public Health Assessments provides specific spatial and temporal contaminant exposure information that can assist public health professionals in evaluating human health risks.

  6. Variations of lung cancer risk from asbestos exposure: impact on estimation of population attributable fraction.

    PubMed

    Moon, Eun Kyeong; Son, Mia; Jin, Young-Woo; Park, Sohee; Lee, Won Jin

    2013-01-01

    The purpose of this study is to investigate the potential impact of differing lung cancer risks in study populations on estimates of the population attributable fraction (PAF) from asbestos exposure. Studies were identified via a MEDLINE search up to September 2009 and from the reference lists of publications about asbestos exposure and lung cancer risk. Relative risk estimates were extracted from 160 studies and meta-relative risks were calculated according to random-effect models. Hypothetical PAFs were calculated based on the meta-analysis results and on different exposure scenarios. The risks for lung cancer from asbestos exposure varied by region as well as by other study characteristics. The risk estimates proved higher in Asian countries (RR=3.53), in studies with 500 or fewer subjects (RR=2.26), and in papers published in the 1990s or earlier (RR=1.91) than did those for European or North American countries, studies with more than 500 subjects, and papers published in the 2000s, respectively. The differences in PAFs between Asian and North American studies were 15.5%, 30.3%, and 36.2% when the exposure prevalence was 10%, 30%, and 50%, respectively. This study suggests that it is important to apply appropriate lung cancer risk estimates to each study population when calculating the PAF from asbestos exposure.
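The PAF arithmetic underlying the comparison is the standard Levin formula for a binary exposure. The RR of 3.53 is the Asian meta-estimate from the abstract; the three prevalences are the abstract's hypothetical scenarios.

```python
# Population attributable fraction (Levin's formula) for a binary exposure:
# PAF = p * (RR - 1) / (p * (RR - 1) + 1), with exposure prevalence p.

def population_attributable_fraction(prevalence, rr):
    excess = prevalence * (rr - 1.0)
    return excess / (excess + 1.0)

# Asian meta-relative risk at the three hypothetical exposure prevalences
for p in (0.10, 0.30, 0.50):
    print(p, round(population_attributable_fraction(p, 3.53), 3))
```

Running the same prevalences with a lower North American RR and differencing the two PAFs reproduces the kind of gap (15.5% to 36.2%) the study reports.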

  7. Prospective method for estimating occupational health risks in new energy technologies

    SciTech Connect

    Moskowitz, P D; Briggs, T; Ungers, L; Hamilton, L D

    1981-09-01

    In the design, development, and acceptance of new energy technologies, concern for health and safety is increasingly important. Determining risks for emerging technologies is difficult because health statistics associated with these new alternatives are unavailable. Nevertheless, boundaries on such risks must be determined to identify potentially significant hazards and to permit technology comparisons to be made. One approach to determining occupational health costs is to disaggregate the labor requirements of an emerging industry by different worker classifications. Risks to workers can then be determined for these classifications from the occupational health statistics of related industries. By summing the risks for each worker classification, prospective estimates of individual and societal risk from an emerging technology can be developed. Although this approach identifies accident-related effects, it cannot be used to quantitate occupationally induced disease. An example of this method analyzing different photovoltaic fabrication alternatives is given. Individual vs. societal risk is considered in these analyses.
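The disaggregate-and-sum approach described above can be sketched as a weighted sum over worker classifications. The classifications, person-hours, and incidence rates below are illustrative placeholders, not values from the photovoltaic example.

```python
# Prospective societal risk estimate: split labor demand by worker
# classification, attach an incidence rate taken from a related industry's
# statistics to each classification, and sum the expected cases.

def societal_risk(labor_hours, incidence_per_hour):
    """Expected injuries: sum over classifications of hours x rate."""
    return sum(labor_hours[job] * incidence_per_hour[job] for job in labor_hours)

# Hypothetical fabrication labor mix and injury rates
hours = {"electrician": 2.0e5, "welder": 1.5e5, "laborer": 4.0e5}  # person-hours
rates = {"electrician": 2e-6, "welder": 5e-6, "laborer": 3e-6}     # injuries/hour
print(round(societal_risk(hours, rates), 2))
```

Dividing the same products by each classification's headcount instead of summing them yields the corresponding individual-risk estimates.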

  8. Measurement of natural radionuclides in Malaysian bottled mineral water and consequent health risk estimation

    NASA Astrophysics Data System (ADS)

    Priharti, W.; Samat, S. B.; Yasir, M. S.

    2015-09-01

    The radionuclides ²²⁶Ra, ²³²Th and ⁴⁰K were measured in ten mineral water samples; from the measured radioactivity, the ingestion doses for infants, children and adults were calculated, and the cancer risk for adults was estimated. Results showed that the calculated ingestion doses for the three age categories are much lower than the average worldwide ingestion exposure of 0.29 mSv/y, and the estimated cancer risk is much lower than the cancer risk of 8.40 × 10⁻³ (estimated from the total natural radiation dose of 2.40 mSv/y). The present study concludes that the bottled mineral water produced in Malaysia is safe for daily human consumption.
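The ingestion-dose arithmetic behind such results sums, over nuclides, activity concentration × annual water intake × an age-dependent dose coefficient. The concentrations and intake below are illustrative assumptions; the dose coefficients are of the order of published ICRP adult ingestion values but should be treated as assumptions here.

```python
# Annual committed ingestion dose from drinking water:
# sum over nuclides of activity concentration (Bq/L) x annual intake (L/y)
# x dose coefficient (Sv/Bq), converted from Sv to mSv.

def annual_ingestion_dose_msv(conc_bq_l, intake_l_y, coeff_sv_bq):
    dose_sv = sum(conc_bq_l[nuc] * intake_l_y * coeff_sv_bq[nuc]
                  for nuc in conc_bq_l)
    return dose_sv * 1000.0

conc = {"Ra-226": 0.05, "Th-232": 0.02, "K-40": 1.0}          # Bq/L, illustrative
coeff = {"Ra-226": 2.8e-7, "Th-232": 2.3e-7, "K-40": 6.2e-9}  # Sv/Bq, adult-order values
print(round(annual_ingestion_dose_msv(conc, 730.0, coeff), 4))
```

With these assumed inputs the annual dose comes out well below the 0.29 mSv/y worldwide average cited above, matching the shape of the study's conclusion.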


  10. Accounting for Ecosystem Alteration Doubles Estimates of Conservation Risk in the Conterminous United States

    PubMed Central

    Swaty, Randy; Blankenship, Kori; Hagen, Sarah; Fargione, Joseph; Smith, Jim; Patton, Jeannie

    2011-01-01

    Previous national and global conservation assessments have relied on habitat conversion data to quantify conservation risk. However, in addition to habitat conversion to crop production or urban uses, ecosystem alteration (e.g., from logging, conversion to plantations, biological invasion, or fire suppression) is a large source of conservation risk. We add data quantifying ecosystem alteration on unconverted lands to arrive at a more accurate depiction of conservation risk for the conterminous United States. We quantify ecosystem alteration using a recent national assessment based on remote sensing of current vegetation compared with modeled reference natural vegetation conditions. Highly altered (but not converted) ecosystems comprise 23% of the conterminous United States, such that the number of critically endangered ecoregions in the United States is 156% higher than when calculated using habitat conversion data alone. Increased attention to natural resource management will be essential to address widespread ecosystem alteration and reduce conservation risk. PMID:21850248

  11. Estimates of radiation doses in tissue and organs and risk of excess cancer in the single-course radiotherapy patients treated for ankylosing spondylitis in England and Wales

    SciTech Connect

    Fabrikant, J.I.; Lyman, J.T.

    1982-02-01

    The estimates of absorbed doses of x rays and excess risk of cancer in bone marrow and heavily irradiated sites are extremely crude and are based on very limited data and on a number of assumptions. Some of these assumptions may later prove to be incorrect, but it is probable that they are correct to within a factor of 2. The excess cancer risk estimates calculated compare well with the most reliable epidemiological surveys thus far studied. This is particularly important for cancers of heavily irradiated sites with long latent periods. The mean followup period for the patients was 16.2 y, and an increase in cancers of heavily irradiated sites may appear in these patients in the 1970s in tissues and organs with long latent periods for the induction of cancer. The accuracy of these estimates is severely limited by the inadequacy of information on doses absorbed by the tissues at risk in the irradiated patients. The information on absorbed dose is essential for an accurate assessment of dose-cancer incidence analysis. Furthermore, in this valuable series of irradiated patients, the information on radiation dosimetry on the radiotherapy charts is central to any reliable determination of somatic risks of radiation with regard to carcinogenesis in man. The work necessary to obtain these data is under way; only when they are available can more precise estimates of risk of cancer induction by radiation in man be obtained.

  12. Performance of default risk model with barrier option framework and maximum likelihood estimation: Evidence from Taiwan

    NASA Astrophysics Data System (ADS)

    Chou, Heng-Chih; Wang, David

    2007-11-01

    We investigate the performance of a default risk model based on the barrier option framework with maximum likelihood estimation. We provide empirical validation of the model by showing that implied default barriers are statistically significant for a sample of construction firms in Taiwan over the period 1994-2004. We find that our model dominates the commonly adopted models, Merton model, Z-score model and ZETA model. Moreover, we test the n-year-ahead prediction performance of the model and find evidence that the prediction accuracy of the model improves as the forecast horizon decreases. Finally, we assess the effect of estimated default risk on equity returns and find that default risk is able to explain equity returns and that default risk is a variable worth considering in asset-pricing tests, above and beyond size and book-to-market.

  13. A methodology for estimating risks associated with landslides of contaminated soil into rivers.

    PubMed

    Göransson, Gunnel; Norrman, Jenny; Larson, Magnus; Alén, Claes; Rosén, Lars

    2014-02-15

    Urban areas adjacent to surface water are exposed to soil movements such as erosion and slope failures (landslides). A landslide is a potential mechanism for mobilisation and spreading of pollutants. This mechanism is in general not included in environmental risk assessments for contaminated sites, and the consequences associated with contamination in the soil are typically not considered in landslide risk assessments. This study suggests a methodology to estimate the environmental risks associated with landslides in contaminated sites adjacent to rivers. The methodology is probabilistic and allows for datasets with large uncertainties and the use of expert judgements, providing quantitative estimates of probabilities for defined failures. The approach is illustrated by a case study along the river Göta Älv, Sweden, where failures are defined and probabilities for those failures are estimated. Failures are defined from a pollution perspective and in terms of exceeding environmental quality standards (EQSs) and acceptable contaminant loads. Models are then suggested to estimate probabilities of these failures. A landslide analysis is carried out to assess landslide probabilities based on data from a recent landslide risk classification study along the river Göta Älv. The suggested methodology is meant to be a supplement to either landslide risk assessment (LRA) or environmental risk assessment (ERA), providing quantitative estimates of the risks associated with landslides in contaminated sites. The proposed methodology can also act as a basis for communication and discussion, thereby contributing to intersectoral management solutions. From the case study it was found that the defined failures are governed primarily by the probability of a landslide occurring. The overall probabilities of failure are low; however, if a landslide occurs, the probabilities of exceeding EQS are high, as is the probability of at least a 10% increase in the contamination load

  14. Estimated human health risks of disposing of nonhazardous oil field waste in salt caverns

    SciTech Connect

    Tomasko, D.; Elcock, D.; Veil, J.

    1997-09-01

    Argonne National Laboratory (ANL) has completed an evaluation of the possibility that adverse human health effects (carcinogenic and noncarcinogenic) could result from exposure to contaminants released from nonhazardous oil field wastes (NOW) disposed in domal salt caverns. In this assessment, several steps were used to evaluate potential human health risks: identifying potential contaminants of concern, determining how humans could be exposed to these contaminants, assessing the contaminants' toxicities, estimating contaminant intakes, and, finally, calculating human cancer and noncancer risks.

  15. Examining the effects of air pollution composition on within region differences in PM2.5 mortality risk estimates

    EPA Science Inventory

    Multi-city population-based epidemiological studies have observed significant heterogeneity in both the magnitude and direction of city-specific risk estimates, but tended to focus on regional differences in PM2.5 mortality risk estimates. Interpreting differences in risk estimat...

  16. Estimating distributions out of qualitative and (semi)quantitative microbiological contamination data for use in risk assessment.

    PubMed

    Busschaert, P; Geeraerd, A H; Uyttendaele, M; Van Impe, J F

    2010-04-15

    A framework using maximum likelihood estimation (MLE) is used to fit a probability distribution to a set of qualitative (e.g., absence in 25 g), semi-quantitative (e.g., presence in 25 g and absence in 1 g) and/or quantitative test results (e.g., 10 CFU/g). Uncertainty about the parameters of the variability distribution is characterized through a non-parametric bootstrapping method. The resulting distribution function can be used as an input for a second-order Monte Carlo simulation in quantitative risk assessment. As an illustration, the method is applied to two sets of in silico generated data. It is demonstrated that correct interpretation of the data results in an accurate representation of the contamination level distribution. Subsequently, two case studies are analyzed: (i) quantitative analyses of Campylobacter spp. in food samples with nondetects, and (ii) combined quantitative, qualitative and semi-quantitative analyses and nondetects of Listeria monocytogenes in smoked fish samples. The first case study is also used to illustrate the influence of the limit of quantification, measurement error, and the number of samples included in the data set. Application of these techniques offers a way to perform meta-analysis of the many relevant yet diverse data sets available in the literature and in (inter)national reports of surveillance or baseline surveys, thereby increasing the information input of a risk assessment and, by consequence, the correctness of its outcome.
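
The core trick described here is a likelihood that mixes density terms for exact counts with distribution-function terms for qualitative results. A sketch under my own simplified assumptions (normal distribution of log10 levels, exact measurements above a detection limit, plain "absence" results below it; synthetic data, not the paper's):

```python
# Censored-data MLE sketch: fit mu, sigma of log10 contamination levels
# from a mix of exact measurements and nondetects.
import numpy as np
from scipy import optimize, stats

rng = np.random.default_rng(0)

# Synthetic "true" contamination: log10 CFU/g ~ N(-1, 1)
true_mu, true_sigma = -1.0, 1.0
levels = rng.normal(true_mu, true_sigma, size=200)

LOD = 0.0  # detection limit: 1 CFU/g, i.e. log10 level 0
quantitative = levels[levels >= LOD]      # exact measurements (pdf terms)
n_nondetect = int(np.sum(levels < LOD))   # "absence" results (cdf terms)

def neg_log_lik(params):
    mu, log_sigma = params
    sigma = np.exp(log_sigma)             # keep sigma positive
    ll = stats.norm.logpdf(quantitative, mu, sigma).sum()
    ll += n_nondetect * stats.norm.logcdf(LOD, mu, sigma)
    return -ll

res = optimize.minimize(neg_log_lik, x0=[0.0, 0.0], method="Nelder-Mead")
mu_hat, sigma_hat = res.x[0], np.exp(res.x[1])
print(f"mu_hat={mu_hat:.2f}, sigma_hat={sigma_hat:.2f}")
```

Bootstrapping the dataset and refitting would then give the parameter-uncertainty distribution the abstract mentions.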

  17. Estimate of Space Radiation-Induced Cancer Risks for International Space Station Orbits

    NASA Technical Reports Server (NTRS)

    Wu, Honglu; Atwell, William; Cucinotta, Francis A.; Yang, Chui-hsu

    1996-01-01

    Excess cancer risks from exposures to space radiation are estimated for various orbits of the International Space Station (ISS). Organ exposures are computed with the transport codes, BRYNTRN and HZETRN, and the computerized anatomical male and computerized anatomical female models. Cancer risk coefficients in the National Council on Radiation Protection and Measurements report No. 98 are used to generate lifetime excess cancer incidence and cancer mortality after a one-month mission to ISS. The generated data are tabulated to serve as a quick reference for assessment of radiation risk to astronauts on ISS missions.

  18. Studies on the Hiroshima and Nagasaki survivors, and their use in estimating radiation risks.

    PubMed

    Muirhead, C R

    2003-01-01

    Epidemiological studies of the survivors of the atomic bombings of Hiroshima and Nagasaki have been conducted over many years. These studies have examined, inter alia, mortality and cancer incidence among the survivors. This paper summarises the form of the studies undertaken, outlines the main findings and describes how these results can be used in deriving estimates of radiation risks. In doing so, some areas of uncertainty and open issues are highlighted, such as the magnitude of lifetime cancer risks and the evidence for raised risks of non-cancer diseases at low doses. Continued follow-up of the survivors will be important in shedding further light on these issues.

  19. Estimate of Space Radiation-Induced Cancer Risks for International Space Station Orbits

    SciTech Connect

    Wu, H.; Atwell, W.; Cucinotta, F.A.; Yang, C.

    1996-03-01

    Excess cancer risks from exposures to space radiation are estimated for various orbits of the International Space Station (ISS). Organ exposures are computed with the transport codes, BRYNTRN and HZETRN, and the computerized anatomical male and computerized anatomical female models. Cancer risk coefficients in the National Council on Radiation Protection and Measurements report No. 98 are used to generate lifetime excess cancer incidence and cancer mortality after a one-month mission to ISS. The generated data are tabulated to serve as a quick reference for assessment of radiation risk to astronauts on ISS missions.

  20. Estimating cancer risk from dental cone-beam CT exposures based on skin dosimetry

    NASA Astrophysics Data System (ADS)

    Pauwels, Ruben; Cockmartin, Lesley; Ivanauskaité, Deimante; Urbonienė, Ausra; Gavala, Sophia; Donta, Catherine; Tsiklakis, Kostas; Jacobs, Reinhilde; Bosmans, Hilde; Bogaerts, Ria; Horner, Keith; SEDENTEXCT Project Consortium, The

    2014-07-01

    The aim of this study was to measure entrance skin doses on patients undergoing cone-beam computed tomography (CBCT) examinations, to establish conversion factors between skin and organ doses, and to estimate cancer risk from CBCT exposures. A total of 266 patients (aged 8-83 years) from three imaging centres were included. CBCT scans were acquired using the SCANORA 3D (Soredex, Tuusula, Finland) and NewTom 9000 (QR, Verona, Italy). Eight thermoluminescent dosimeters were attached to the patient's skin at standardized locations. Using previously published organ dose estimations for various CBCTs with an anthropomorphic phantom, correlation factors to convert skin dose to organ doses were calculated and applied to estimate patient organ doses. The BEIR VII age- and gender-dependent dose-risk model was applied to estimate the lifetime attributable cancer risk. For the SCANORA 3D, average skin doses over the eight locations varied between 484 and 1788 µGy. For the NewTom 9000 the range was between 821 and 1686 µGy for Centre 1 and between 292 and 2325 µGy for Centre 2. Entrance skin dose measurements demonstrated the combined effect of exposure and patient factors on the dose. The lifetime attributable cancer risk, expressed as the probability of developing a radiation-induced cancer, varied between 2.7 per million (age >60) and 9.8 per million (age 8-11), with an average of 6.0 per million. On average, the risk for female patients was 40% higher. The estimated radiation risk was primarily influenced by age at exposure and gender, pointing out the continuing need for justification and optimization of CBCT exposures, with a specific focus on children.

  1. Estimating cancer risk from dental cone-beam CT exposures based on skin dosimetry.

    PubMed

    Pauwels, Ruben; Cockmartin, Lesley; Ivanauskaité, Deimante; Urbonienė, Ausra; Gavala, Sophia; Donta, Catherine; Tsiklakis, Kostas; Jacobs, Reinhilde; Bosmans, Hilde; Bogaerts, Ria; Horner, Keith

    2014-07-21

    The aim of this study was to measure entrance skin doses on patients undergoing cone-beam computed tomography (CBCT) examinations, to establish conversion factors between skin and organ doses, and to estimate cancer risk from CBCT exposures. A total of 266 patients (aged 8-83 years) from three imaging centres were included. CBCT scans were acquired using the SCANORA 3D (Soredex, Tuusula, Finland) and NewTom 9000 (QR, Verona, Italy). Eight thermoluminescent dosimeters were attached to the patient's skin at standardized locations. Using previously published organ dose estimations for various CBCTs with an anthropomorphic phantom, correlation factors to convert skin dose to organ doses were calculated and applied to estimate patient organ doses. The BEIR VII age- and gender-dependent dose-risk model was applied to estimate the lifetime attributable cancer risk. For the SCANORA 3D, average skin doses over the eight locations varied between 484 and 1788 µGy. For the NewTom 9000 the range was between 821 and 1686 µGy for Centre 1 and between 292 and 2325 µGy for Centre 2. Entrance skin dose measurements demonstrated the combined effect of exposure and patient factors on the dose. The lifetime attributable cancer risk, expressed as the probability of developing a radiation-induced cancer, varied between 2.7 per million (age >60) and 9.8 per million (age 8-11), with an average of 6.0 per million. On average, the risk for female patients was 40% higher. The estimated radiation risk was primarily influenced by age at exposure and gender, pointing out the continuing need for justification and optimization of CBCT exposures, with a specific focus on children.

  2. Micro-scale flood risk estimation in historic centres: a case study in Florence, Italy

    NASA Astrophysics Data System (ADS)

    Castelli, Fabio; Arrighi, Chiara; Brugioni, Marcello; Franceschini, Serena; Mazzanti, Bernardo

    2013-04-01

    The route to flood risk assessment is much more than hydraulic modelling of inundation, that is, hazard mapping. Flood risk is the product of flood hazard, vulnerability and exposure, all three to be estimated with a comparable level of accuracy. While hazard maps have already been implemented in many countries, quantitative damage and risk maps are still at a preliminary level. Currently, one of the main challenges in flood damage estimation is the scarce availability of socio-economic data characterizing the monetary value of the exposed assets. When these public open data are available, the variability of their level of detail drives the need to merge different sources and to select an appropriate scale of analysis. In this work a parsimonious quasi-2D hydraulic model is adopted, with many advantages in terms of easy set-up. A high-resolution, up-to-date Digital Surface Model (DSM) is used to represent the geometry of the study domain. The accuracy of the flood depth estimation is evaluated by comparison with marble-plate records of a historic flood in the city of Florence (Italy). In the most heavily flooded downtown area, the accuracy is characterized by a bias of a very few centimetres and a determination coefficient of 0.73. The average risk is found to be about 14 €/m²·year, corresponding to about 8.3% of residents' income. The spatial distribution of estimated risk highlights a complex interaction between the flood pattern and the building characteristics. Proceeding through the risk estimation steps, a new micro-scale potential damage assessment method is proposed. This method is based on the georeferenced census system, considered an optimal compromise between spatial detail and open availability of socio-economic data. The census section system consists of geographically contiguous polygons that usually coincide with building blocks in dense urban areas. The results of flood risk assessment at the census section scale resolve most of

  3. Estimation of potential lifetime cancer risks for trihalomethanes from consuming chlorinated drinking water in Taiwan.

    PubMed

    Hsu, C H; Jeng, W L; Chang, R M; Chien, L C; Han, B C

    2001-02-01

    Data on concentrations of trihalomethanes (THMs) in raw and chlorinated water collected from three water treatment plants in Taiwan, and estimates of the lifetime cancer risk for THMs from drinking water using age-adjusted factors and volatilization terms, are presented. Data on THM levels in drinking water were obtained from the annual reports of the Environmental Protection Administration (EPA) of Taiwan. The methodology for estimating lifetime cancer risks was taken from the USEPA. Chloroform was the major THM species, especially at the water plant in south Taiwan. Chloroform contributed the majority of the lifetime cancer risk (87.5-92.5% of total risk) in the three water supply areas. All lifetime cancer risks for CHCl3, CHBrCl2, CHBr2Cl, and CHBr3 from consuming tap water in the three water supply areas were higher than 10⁻⁶. The sum of the lifetime cancer risks for the four THM species was highest for tap water from south Taiwan (total THM risk < 1.94 × 10⁻⁴).
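
The USEPA methodology referred to reduces, per compound, to multiplying a chronic daily intake (CDI) by an oral slope factor. A hedged sketch of that arithmetic; the concentration, slope factor, and exposure assumptions below are placeholders, not the study's values:

```python
# USEPA-style excess lifetime cancer risk: risk ≈ CDI × SF,
# with CDI = (C × IR × ED) / (BW × AT). All inputs are placeholders.

def lifetime_cancer_risk(conc_mg_per_l, slope_factor,   # SF in (mg/kg-day)^-1
                         intake_l_per_day=2.0, body_weight_kg=70.0,
                         exposure_years=30.0, lifetime_years=70.0):
    # chronic daily intake, averaged over the lifetime (mg/kg-day)
    cdi = (conc_mg_per_l * intake_l_per_day * exposure_years) / (
        body_weight_kg * lifetime_years)
    return cdi * slope_factor

# Hypothetical: 30 µg/L chloroform, slope factor 6.1e-3 (mg/kg-day)^-1
risk = lifetime_cancer_risk(0.030, 6.1e-3)
print(f"excess lifetime cancer risk ≈ {risk:.2e}")
```

With these placeholder inputs the result lands above the 10⁻⁶ benchmark, the same comparison the abstract makes for the Taiwanese tap-water data.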

  4. An Evidenced-Based Approach for Estimating Decompression Sickness Risk in Aircraft Operations

    NASA Technical Reports Server (NTRS)

    Robinson, Ronald R.; Dervay, Joseph P.; Conkin, Johnny

    1999-01-01

    Estimating the risk of decompression sickness (DCS) in aircraft operations remains a challenge, making the reduction of this risk through the development of operationally acceptable denitrogenation schedules difficult. In addition, the medical recommendations that are promulgated are often not supported by rigorous evaluation of the available data, but are instead arrived at by negotiation with the aircraft operations community, adapted from other similar aircraft operations, or based upon the opinion of the local medical community. We present a systematic approach for defining DCS risk in aircraft operations by analyzing the data available for a specific aircraft, flight profile, and aviator population. Once the risk of DCS in a particular aircraft operation is known, appropriate steps can be taken to reduce this risk to a level acceptable to the applicable aviation community. Using this technique will allow any aviation medical community to arrive at the best estimate of DCS risk for its specific mission and aviator population and will allow systematic reevaluation of decisions regarding DCS risk reduction when additional data become available.

  5. Cancer risk estimation for mixtures of coal tars and benzo(a)pyrene

    SciTech Connect

    Gaylor, D.W.; Culp, S.J.; Goldstein, L.S.; Beland, F.A.

    2000-02-01

    Two-year chronic bioassays were conducted by using B6C3F1 female mice fed several concentrations of two different mixtures of coal tars from manufactured gas waste sites or benzo(a)pyrene (BaP). The purpose of the study was to obtain estimates of cancer potency of coal tar mixtures, by using conventional regulatory methods, for use in manufactured gas waste site remediation. A secondary purpose was to investigate the validity of using the concentration of a single potent carcinogen, in this case benzo(a)pyrene, to estimate the relative risk for a coal tar mixture. The study has shown that BaP dominates the cancer risk when its concentration is greater than 6,300 ppm in the coal tar mixture. In this case the most sensitive tissue site is the forestomach. Using low-dose linear extrapolation, the lifetime cancer risk for humans is estimated to be: Risk < 1.03 × 10⁻⁴ (ppm coal tar in total diet) + 240 × 10⁻⁴ (ppm BaP in total diet), based on forestomach tumors. If the BaP concentration in the coal tar mixture is less than 6,300 ppm, the more likely case, then lung tumors provide the largest estimated upper limit of risk, Risk < 2.55 × 10⁻⁴ (ppm coal tar in total diet), with no contribution of BaP to lung tumors. The upper limit of the cancer potency (slope factor) for lifetime oral exposure to benzo(a)pyrene is 1.2 × 10⁻³ per µg per kg body weight per day from this Good Laboratory Practice (GLP) study compared with the current value of 7.3 × 10⁻³ per µg per kg body weight per day listed in the US EPA Integrated Risk Information System.

  6. Cancer risk estimation for mixtures of coal tars and benzo(a)pyrene.

    PubMed

    Gaylor, D W; Culp, S J; Goldstein, L S; Beland, F A

    2000-02-01

    Two-year chronic bioassays were conducted by using B6C3F1 female mice fed several concentrations of two different mixtures of coal tars from manufactured gas waste sites or benzo(a)pyrene (BaP). The purpose of the study was to obtain estimates of cancer potency of coal tar mixtures, by using conventional regulatory methods, for use in manufactured gas waste site remediation. A secondary purpose was to investigate the validity of using the concentration of a single potent carcinogen, in this case benzo(a)pyrene, to estimate the relative risk for a coal tar mixture. The study has shown that BaP dominates the cancer risk when its concentration is greater than 6,300 ppm in the coal tar mixture. In this case the most sensitive tissue site is the forestomach. Using low-dose linear extrapolation, the lifetime cancer risk for humans is estimated to be: Risk < 1.03 x 10(-4) (ppm coal tar in total diet) + 240 x 10(-4) (ppm BaP in total diet), based on forestomach tumors. If the BaP concentration in the coal tar mixture is less than 6,300 ppm, the more likely case, then lung tumors provide the largest estimated upper limit of risk, Risk < 2.55 x 10(-4) (ppm coal tar in total diet), with no contribution of BaP to lung tumors. The upper limit of the cancer potency (slope factor) for lifetime oral exposure to benzo(a)pyrene is 1.2 x 10(-3) per microgram per kg body weight per day from this Good Laboratory Practice (GLP) study compared with the current value of 7.3 x 10(-3) per microgram per kg body weight per day listed in the U.S. EPA Integrated Risk Information System.
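
The two upper-bound expressions in the abstract can be transcribed directly. Choosing between the forestomach- and lung-tumor bounds by the BaP content of the mixture is my reading of the text, and the example inputs are hypothetical:

```python
# Upper-bound lifetime cancer risk formulas transcribed from the abstract.

BAP_DOMINANCE_PPM = 6300  # BaP concentration in the mixture above which
                          # the abstract says BaP/forestomach tumors dominate

def upper_bound_risk(coal_tar_ppm_diet, bap_ppm_diet, bap_ppm_in_mixture):
    """Upper limit on lifetime cancer risk from dietary coal tar exposure."""
    if bap_ppm_in_mixture > BAP_DOMINANCE_PPM:
        # forestomach-tumor bound
        return 1.03e-4 * coal_tar_ppm_diet + 240e-4 * bap_ppm_diet
    # lung-tumor bound, no BaP contribution
    return 2.55e-4 * coal_tar_ppm_diet

# Example: 0.1 ppm coal tar in the total diet, low-BaP mixture
print(f"{upper_bound_risk(0.1, 0.0, 1000):.2e}")  # 2.55e-05
```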

  7. Quantitative assessment of the microbial risk of leafy greens from farm to consumption: preliminary framework, data, and risk estimates.

    PubMed

    Danyluk, Michelle D; Schaffner, Donald W

    2011-05-01

    This project was undertaken to relate what is known about the behavior of Escherichia coli O157:H7 under laboratory conditions to what is known regarding the 2006 E. coli O157:H7 spinach outbreak, in the context of a quantitative microbial risk assessment. The risk model explicitly assumes that all contamination arises from exposure in the field. Extracted data, models, and user inputs were entered into an Excel spreadsheet, and the modeling software @RISK was used to perform Monte Carlo simulations. The model predicts that cut leafy greens that are temperature abused will support the growth of E. coli O157:H7, and populations of the organism may increase by as much as 1 log CFU/day under optimal temperature conditions. When the risk model used a starting level of -1 log CFU/g, with 0.1% of incoming servings contaminated, the predicted numbers of cells per serving were within the range of the best available estimates of pathogen levels during the outbreak. The model predicts that levels in the field of -1 log CFU/g and 0.1% prevalence could have resulted in an outbreak approximately the size of the 2006 E. coli O157:H7 outbreak. This quantitative microbial risk assessment model represents a preliminary framework that identifies available data and provides initial risk estimates for pathogenic E. coli in leafy greens. Data gaps include retail storage times, correlations between storage time and temperature, the importance of lag time in E. coli O157:H7 growth models for leafy greens, and validation of the importance of cross-contamination during the washing process.
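
A minimal Monte Carlo sketch of the growth step in such a model, starting from the field level the abstract quotes; all distributions and parameters below are placeholders of my own, not the paper's @RISK inputs:

```python
# Monte Carlo sketch: growth of E. coli O157:H7 on cut leafy greens during
# temperature abuse, then cells ingested per serving. Placeholder inputs.
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

start_log = -1.0                          # log CFU/g at harvest (field level)
storage_days = rng.uniform(0.0, 3.0, n)   # hypothetical retail storage time
growth_rate = rng.uniform(0.0, 1.0, n)    # log CFU/g per day, capped at 1/day
serving_g = 85.0                          # hypothetical serving size

final_log = start_log + growth_rate * storage_days
cells_per_serving = 10.0 ** final_log * serving_g

print(f"median cells/serving: {np.median(cells_per_serving):.1f}")
```

Feeding the resulting cell counts through a dose-response relation, and scaling by the 0.1% prevalence, would complete the risk estimate the abstract describes.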

  8. Accurate Estimate of Some Propagation Characteristics for the First Higher Order Mode in Graded Index Fiber with Simple Analytic Chebyshev Method

    NASA Astrophysics Data System (ADS)

    Dutta, Ivy; Chowdhury, Anirban Roy; Kumbhakar, Dharmadas

    2013-03-01

    Using a Chebyshev power series approach, accurate descriptions of the first higher-order (LP11) mode of graded-index fibers with three different profile shape functions are presented in this paper and applied to predict their propagation characteristics. These characteristics include the fractional power guided through the core, the excitation efficiency, and the Petermann I and II spot sizes, with approximate analytic formulations for each. We show that whereas two and three Chebyshev points in the LP11 mode approximation give fairly accurate results, values based on calculations involving four Chebyshev points match excellently with available exact numerical results.

  9. How accurately can students estimate their performance on an exam and how does this relate to their actual performance on the exam?

    NASA Astrophysics Data System (ADS)

    Rebello, N. Sanjay

    2012-02-01

    Research has shown that students' beliefs regarding their own abilities in math and science can influence their performance in these disciplines. I investigated the relationship between students' estimated performance and actual performance on five exams in a second-semester calculus-based physics class. Students were given about 72 hours after the completion of each exam to estimate their individual score and the class mean score. Students received extra credit worth 1% of the exam points for estimating their own score to within 2% of the actual score, and another 1% for estimating the class mean score to within 2% of the correct value. I compared students' individual and mean score estimates with the actual scores to investigate the relationship between estimation accuracy and exam performance, as well as trends over the semester.

  10. A review of methods to estimate cause-specific mortality in presence of competing risks

    USGS Publications Warehouse

    Heisey, Dennis M.; Patterson, Brent R.

    2006-01-01

    Estimating cause-specific mortality is often of central importance for understanding the dynamics of wildlife populations. Despite such importance, methodology for estimating and analyzing cause-specific mortality has received little attention in wildlife ecology during the past 20 years. The issue of analyzing cause-specific, mutually exclusive events in time is not unique to wildlife. In fact, this general problem has received substantial attention in human biomedical applications within the context of biostatistical survival analysis. Here, we consider cause-specific mortality from a modern biostatistical perspective. This requires carefully defining what we mean by cause-specific mortality and then providing an appropriate hazard-based representation as a competing risks problem. This leads to the general solution of cause-specific mortality as the cumulative incidence function (CIF). We describe the appropriate generalization of the fully nonparametric staggered-entry Kaplan–Meier survival estimator to cause-specific mortality via the nonparametric CIF estimator (NPCIFE), which in many situations offers an attractive alternative to the Heisey–Fuller estimator. An advantage of the NPCIFE is that it lends itself readily to risk factors analysis with standard software for Cox proportional hazards model. The competing risks–based approach also clarifies issues regarding another intuitive but erroneous "cause-specific mortality" estimator based on the Kaplan–Meier survival estimator and commonly seen in the life sciences literature.
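
The CIF the review recommends has a direct nonparametric form: at each failure time, the cause-specific increment is the overall Kaplan-Meier survival just before that time multiplied by the cause-specific hazard increment, CIF_k(t) = Σ_{t_i ≤ t} S(t_i⁻) · d_{k,i} / n_i. A sketch for right-censored data without the staggered entry the review also covers (synthetic toy data, distinct event times assumed):

```python
# Nonparametric cumulative incidence function (CIF) estimator for
# competing risks; causes: 0 = censored, 1, 2, ... = failure causes.
import numpy as np

def cumulative_incidence(times, causes, cause):
    """Return (event_times, CIF values) for the given failure cause."""
    order = np.argsort(times)
    times, causes = np.asarray(times)[order], np.asarray(causes)[order]
    at_risk = len(times)
    surv = 1.0      # overall Kaplan-Meier survival just before current time
    cif = 0.0
    out_t, out_cif = [], []
    for t, c in zip(times, causes):
        if c == cause:
            cif += surv * (1.0 / at_risk)   # S(t-) * d_k / n at risk
        if c != 0:
            surv *= 1.0 - 1.0 / at_risk     # KM step for failure of any cause
        at_risk -= 1
        out_t.append(t)
        out_cif.append(cif)
    return np.array(out_t), np.array(out_cif)

# Toy data: time of death from cause 1 or 2, or censoring (0)
t = [2, 3, 5, 7, 8, 10, 12, 15]
c = [1, 2, 1, 0, 2, 1, 0, 1]
_, cif1 = cumulative_incidence(t, c, cause=1)
print(f"CIF for cause 1 at last time: {cif1[-1]:.3f}")
```

A useful sanity check, unlike the erroneous KM-based "cause-specific" estimator the review warns about: the cause-specific CIFs sum to the overall probability of failure from any cause.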

  11. Normal Tissue Complication Probability Estimation by the Lyman-Kutcher-Burman Method Does Not Accurately Predict Spinal Cord Tolerance to Stereotactic Radiosurgery

    SciTech Connect

    Daly, Megan E.; Luxton, Gary; Choi, Clara Y.H.; Gibbs, Iris C.; Chang, Steven D.; Adler, John R.; Soltys, Scott G.

    2012-04-01

    Purpose: To determine whether normal tissue complication probability (NTCP) analyses of the human spinal cord by use of the Lyman-Kutcher-Burman (LKB) model, supplemented by linear-quadratic modeling to account for the effect of fractionation, predict the risk of myelopathy from stereotactic radiosurgery (SRS). Methods and Materials: From November 2001 to July 2008, 24 spinal hemangioblastomas in 17 patients were treated with SRS. Of the tumors, 17 received 1 fraction with a median dose of 20 Gy (range, 18-30 Gy) and 7 received 20 to 25 Gy in 2 or 3 sessions, with cord maximum doses of 22.7 Gy (range, 17.8-30.9 Gy) and 22.0 Gy (range, 20.2-26.6 Gy), respectively. By use of conventional values for α/β, volume parameter n, 50% complication probability dose TD50, and inverse slope parameter m, a computationally simplified implementation of the LKB model was used to calculate the biologically equivalent uniform dose and NTCP for each treatment. Exploratory calculations were performed with alternate values of α/β and n. Results: In this study 1 case (4%) of myelopathy occurred. The LKB model using radiobiological parameters from Emami and the logistic model with parameters from Schultheiss overestimated complication rates, predicting 13 complications (54%) and 18 complications (75%), respectively. An increase in the volume parameter (n), to assume greater parallel organization, improved the predictive value of the models. Maximum-likelihood LKB fitting of α/β and n yielded better predictions (0.7 complications), with n = 0.023 and α/β = 17.8 Gy. Conclusions: The spinal cord tolerance to the dosimetry of SRS is higher than predicted by the LKB model using any set of accepted parameters. Only a high α/β value in the LKB model and only a large volume effect in the logistic model with Schultheiss data could explain the low number of complications observed. This finding emphasizes that radiobiological models
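
The LKB calculation named here has a compact closed form: a generalized equivalent uniform dose is computed from the dose-volume histogram, gEUD = (Σ v_i d_i^{1/n})^n, then mapped through a probit with parameters m and TD50. A sketch with placeholder DVH and parameter values, not the study's fitted ones:

```python
# Lyman-Kutcher-Burman NTCP sketch: gEUD from a DVH, then probit.
from math import erf, sqrt

def geud(dvh, n):
    """Generalized EUD from a DVH given as (dose_Gy, fractional_volume) pairs."""
    return sum(v * d ** (1.0 / n) for d, v in dvh) ** n

def lkb_ntcp(dvh, n, m, td50):
    t = (geud(dvh, n) - td50) / (m * td50)
    return 0.5 * (1.0 + erf(t / sqrt(2.0)))  # standard normal CDF of t

# Hypothetical cord DVH: 10% of volume at 20 Gy, 90% at 5 Gy
dvh = [(20.0, 0.1), (5.0, 0.9)]
# Illustrative parameters; a small n means serial-organ (max-dose-driven)
# behavior, which is the regime the abstract's discussion of n is about
print(f"NTCP = {lkb_ntcp(dvh, n=0.05, m=0.175, td50=66.5):.2e}")
```

Note how, with small n, the gEUD is pulled toward the hottest DVH bin, so the predicted complication probability is driven almost entirely by the maximum cord dose.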

  12. Cancer risks in BRCA2 families: estimates for sites other than breast and ovary

    PubMed Central

    van Asperen, C J; Brohet, R; Meijers-Heijboer, E; Hoogerbrugge, N; Verhoef, S; Vasen, H; Ausems, M; Menko, F; Gomez, G; Klijn, J; Hogervorst, F; van Houwelingen, J C; van't Veer, L; Rookus, M; van Leeuwen, F E; on behalf of the Netherlands Collaborative Group on Hereditary Breast Cancer (HEBON)

    2005-01-01

    Background: In BRCA2 mutation carriers, increased risks have been reported for several cancer sites besides breast and ovary. As most of the families included in earlier reports were selected on the basis of multiple breast/ovarian cancer cases, it is possible that risk estimates may differ in mutation carriers with a less striking family history. Methods: In the Netherlands, 139 BRCA2 families with 66 different pathogenic mutations were included in a nationwide study. To avoid testing bias, we chose not to estimate risk in typed carriers, but rather in male and female family members with a 50% prior probability of being a carrier (n = 1811). The relative risk (RR) for each cancer site with the exception of breast and ovarian cancer was determined by comparing observed numbers with those expected, based on Dutch cancer incidence rates. Results: We observed an excess risk for four cancer sites: pancreas (RR 5.9; 95% confidence interval (CI) 3.2 to 10.0), prostate (2.5; 1.6 to 3.8), bone (14.4; 2.9 to 42.1) and pharynx (7.3; 2.0 to 18.6). A small increase was observed for cancer of the digestive tract (1.5; 1.1 to 1.9). Histological verification was available for 46% of the tumours. Nearly all increased risks reached statistical significance for men only. Cancer risks tended to be higher for people before the age of 65 years. Moreover, families with mutations outside the previously defined ovarian cancer cluster region tended to have a higher cancer risk. Conclusions: We found that BRCA2 carriers are at increased risk for cancers of the prostate and pancreas, and possibly bone and pharynx. Larger databases with extended follow up are needed to provide insight into mutation specific risks of selected carriers in BRCA2 families. PMID:16141007
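
The site-specific relative risks quoted above are observed-over-expected ratios. As a rough illustration of how such a ratio and its confidence interval behave (this is a generic Poisson approximation, not the 50%-prior-probability cohort method the authors used, and the counts below are made up):

```python
import math

def rr_with_ci(observed, expected, z=1.96):
    """Relative risk O/E with an approximate 95% confidence interval.

    Treats the observed count as Poisson and builds the interval on the
    log scale: ln(RR) +/- z / sqrt(observed).  A generic approximation,
    not the cohort-weighting method used in the paper.
    """
    rr = observed / expected
    half_width = z / math.sqrt(observed)
    return rr, rr * math.exp(-half_width), rr * math.exp(half_width)
```

With made-up counts of 10 observed cases against 1.7 expected, this returns a relative risk near 5.9 with an interval of roughly 3.2 to 10.9, the same shape as the pancreatic-cancer estimate quoted above.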

  13. AN INFORMATIC APPROACH TO ESTIMATING ECOLOGICAL RISKS POSED BY PHARMACEUTICAL USE

    EPA Science Inventory

    A new method for estimating risks of human prescription pharmaceuticals based on information found in regulatory filings as well as scientific and trade literature is described in a presentation at the Pharmaceuticals in the Environment Workshop in Las Vegas, NV, August 23-25, 20...

  14. Prevalence Estimates of Health Risk Behaviors of Immigrant Latino Men Who Have Sex with Men

    ERIC Educational Resources Information Center

    Rhodes, Scott D.; McCoy, Thomas P.; Hergenrather, Kenneth C.; Vissman, Aaron T.; Wolfson, Mark; Alonzo, Jorge; Bloom, Fred R.; Alegria-Ortega, Jose; Eng, Eugenia

    2012-01-01

    Purpose: Little is known about the health status of rural immigrant Latino men who have sex with men (MSM). These MSM comprise a subpopulation that tends to remain "hidden" from both researchers and practitioners. This study was designed to estimate the prevalence of tobacco, alcohol, and drug use, and sexual risk behaviors of Latino MSM…

  15. Challenges in Obtaining Estimates of the Risk of Tuberculosis Infection During Overseas Deployment.

    PubMed

    Mancuso, James D; Geurts, Mia

    2015-12-01

    Estimates of the risk of tuberculosis (TB) infection resulting from overseas deployment among U.S. military service members have varied widely, and have been plagued by methodological problems. The purpose of this study was to estimate the incidence of TB infection in the U.S. military resulting from deployment. Three populations were examined: 1) a unit of 2,228 soldiers redeploying from Iraq in 2008, 2) a cohort of 1,978 soldiers followed up over 5 years after basic training at Fort Jackson in 2009, and 3) 6,062 participants in the 2011-2012 National Health and Nutrition Examination Survey (NHANES). The risk of TB infection in the deployed population was low-0.6% (95% confidence interval [CI]: 0.1-2.3%)-and was similar to the non-deployed population. The prevalence of latent TB infection (LTBI) in the U.S. population was not significantly different among deployed and non-deployed veterans and those with no military service. The limitations of these retrospective studies highlight the challenge in obtaining valid estimates of risk using retrospective data and the need for a more definitive study. Similar to civilian long-term travelers, risks for TB infection during deployment are focal in nature, and testing should be targeted to only those at increased risk.
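
Intervals like the 0.6% (0.1-2.3%) above come from binomial methods that stay sensible when only a handful of conversions are observed. A sketch using the Wilson score interval (the study's exact method may differ; the event count below is hypothetical):

```python
import math

def wilson_ci(successes, n, z=1.96):
    """Wilson score interval for a binomial proportion.

    Better behaved than the plain Wald interval when the event count is
    small, as with rare TB conversions in a deployed cohort.
    """
    p = successes / n
    denom = 1 + z * z / n
    centre = (p + z * z / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n))
    return centre - half, centre + half
```

For a hypothetical 13 conversions among 2,228 soldiers this gives an interval of roughly 0.3% to 1.0%.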

  17. Hip fracture risk estimation based on principal component analysis of QCT atlas: a preliminary study

    NASA Astrophysics Data System (ADS)

    Li, Wenjun; Kornak, John; Harris, Tamara; Lu, Ying; Cheng, Xiaoguang; Lang, Thomas

    2009-02-01

    We aim to capture and apply 3-dimensional bone fragility features for fracture risk estimation. Using inter-subject image registration, we constructed a hip QCT atlas comprising 37 patients with hip fractures and 38 age-matched controls. In the hip atlas space, we performed principal component analysis to identify the principal components (eigen images) that showed association with hip fracture. To develop and test a hip fracture risk model based on the principal components, we randomly divided the 75 QCT scans into two groups, one serving as the training set and the other as the test set. We applied this model to estimate a fracture risk index for each test subject, and used the fracture risk indices to discriminate the fracture patients and controls. To evaluate the fracture discrimination efficacy, we performed ROC analysis and calculated the AUC (area under curve). When using the first group as the training group and the second as the test group, the AUC was 0.880, compared to conventional fracture risk estimation methods based on bone densitometry, which had AUC values ranging between 0.782 and 0.871. When using the second group as the training group, the AUC was 0.839, compared to densitometric methods with AUC values ranging between 0.767 and 0.807. Our results demonstrate that principal components derived from hip QCT atlas are associated with hip fracture. Use of such features may provide new quantitative measures of interest to osteoporosis.
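
The pipeline in this abstract, extracting principal components from registered images, scoring each test subject on them, then measuring discrimination by AUC, can be sketched generically. This is an illustrative reconstruction on toy vectors, not the authors' registration or atlas code:

```python
import numpy as np

def pca_fit(X, k):
    """Top-k principal components (eigen images) of row-sample matrix X."""
    mean = X.mean(axis=0)
    _, _, vt = np.linalg.svd(X - mean, full_matrices=False)
    return mean, vt[:k]

def project(X, mean, components):
    """Per-subject component scores (the inputs to a fracture risk index)."""
    return (X - mean) @ components.T

def auc(scores, labels):
    """ROC area via the rank-sum identity (ties ignored for brevity)."""
    ranks = np.empty(len(scores))
    ranks[np.argsort(scores)] = np.arange(1, len(scores) + 1)
    pos = labels == 1
    n_pos, n_neg = pos.sum(), (~pos).sum()
    return (ranks[pos].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)
```

In the study's design, `pca_fit` would run on the training half of the 75 scans and `auc` on risk indices computed from the held-out half.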

  18. The economic value of fatal and non-fatal occupational risks in Mexico City using actuarial- and perceived-risk estimates.

    PubMed

    Hammitt, James K; Ibarrarán, María Eugenia

    2006-12-01

    Compensating wage differentials are used to estimate marginal rates of substitution between income and both fatal and non-fatal occupational-injury risks in the Mexico City metropolitan area. Data are obtained by in-person survey of almost 600 workers and include workers' perceived risks of fatal and non-fatal occupational injury supplemented by actuarial-risk estimates from government statistics. Results using both actuarial- and perceived-risk estimates are reasonably consistent. Estimates of the value per statistical life are between US$235,000 and US$325,000, and estimates of the value per statistical non-fatal injury are between US$3,500 and US$11,000 (2002 US dollars). These values are much smaller than corresponding estimates for higher-income countries but are compatible with the small number of prior estimates for lower-income countries.

  19. Value at risk estimation with entropy-based wavelet analysis in exchange markets

    NASA Astrophysics Data System (ADS)

    He, Kaijian; Wang, Lijun; Zou, Yingchao; Lai, Kin Keung

    2014-08-01

    In recent years, exchange markets have become increasingly integrated. Fluctuations and risks across different exchange markets exhibit co-moving and complex dynamics. In this paper we propose entropy-based multivariate wavelet approaches to analyze the multiscale characteristics in the multidimensional domain and further improve the reliability of Value at Risk estimation. Wavelet analysis is used to construct an entropy-based multiscale portfolio Value at Risk estimation algorithm that accounts for multiscale dynamic correlation. The entropy measure, applied with the error-minimization principle, is proposed as a more effective criterion for selecting the best basis when determining the wavelet family and the decomposition level to use. The empirical studies conducted in this paper provide positive evidence of the superior performance of the proposed approach in the closely related Chinese renminbi and European euro exchange markets.
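
Setting aside the wavelet and entropy machinery, Value at Risk itself is just a loss quantile. A baseline historical-simulation estimator of the kind the paper's multiscale approach aims to improve on (function name and data are illustrative):

```python
def historical_var(returns, alpha=0.99):
    """Historical-simulation Value at Risk.

    Returns the loss threshold exceeded on only (1 - alpha) of days,
    expressed as a positive number for losses.  Plain empirical
    quantile; the paper's wavelet/entropy refinement is not
    reproduced here.
    """
    losses = sorted(-r for r in returns)          # losses, ascending
    idx = int(alpha * len(losses))
    return losses[min(idx, len(losses) - 1)]
```

For example, on 100 days of returns with five days of -2% losses, the 99% VaR is the 2% loss.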

  20. Non-parametric estimation of relative risk in survival and associated tests.

    PubMed

    Wakounig, Samo; Heinze, Georg; Schemper, Michael

    2015-12-01

    We extend the Tarone and Ware scheme of weighted log-rank tests to cover the associated weighted Mantel-Haenszel estimators of relative risk. Weighting functions previously employed are critically reviewed. The notion of an average hazard ratio is defined and its connection to the effect size measure P(Y > X) is emphasized. The connection makes estimation of P(Y > X) possible also under censoring. Two members of the extended Tarone-Ware scheme accomplish the estimation of intuitively interpretable average hazard ratios, also under censoring and time-varying relative risk which is achieved by an inverse probability of censoring weighting. The empirical properties of the members of the extended Tarone-Ware scheme are demonstrated by a Monte Carlo study. The differential role of the weighting functions considered is illustrated by a comparative analysis of four real data sets.
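
The effect-size measure P(Y > X) mentioned above has a simple empirical estimator when there is no censoring: the fraction of cross-group pairs in which Y outlives X. A sketch (the paper's contribution is handling censoring and time-varying relative risk, which this plain version does not):

```python
def prob_y_exceeds_x(x_times, y_times):
    """Empirical estimate of P(Y > X) from two uncensored samples.

    Counts concordant cross-pairs, with ties counted as half a win.
    The IPCW-weighted estimators discussed in the paper reduce to
    this when no observation is censored.
    """
    wins = sum(1.0 if y > x else 0.5 if y == x else 0.0
               for x in x_times for y in y_times)
    return wins / (len(x_times) * len(y_times))
```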

  1. Prevalence Estimates of Health Risk Behaviors of Immigrant Latino Men Who Have Sex With Men

    PubMed Central

    Rhodes, Scott D.; McCoy, Thomas P.; Hergenrather, Kenneth C.; Vissman, Aaron T.; Wolfson, Mark; Alonzo, Jorge; Bloom, Fred R.; Alegría-Ortega, Jose; Eng, Eugenia

    2011-01-01

    Purpose Little is known about the health status of rural immigrant Latino men who have sex with men (MSM). These MSM comprise a subpopulation that tends to remain “hidden” from both researchers and practitioners. This study was designed to estimate the prevalence of tobacco, alcohol, and drug use, and sexual risk behaviors of Latino MSM living in rural North Carolina. Methods A community-based participatory research (CBPR) partnership used respondent-driven sampling (RDS) to identify, recruit, and enroll Latino MSM to participate in an interviewer-administered behavioral assessment. RDS weighted prevalence of risk behaviors was estimated using the RDS Analysis Tool. Data collection occurred in 2008. Results A total of 190 Latino MSM was reached; the average age was 25.5 years old and nearly 80% reported being from Mexico. Prevalence estimates of smoking everyday and past 30-day heavy episodic drinking were 6.5% and 35.0%, respectively. Prevalence estimates of past 12-month marijuana and cocaine use were 56.0% and 27.1%, respectively. Past 3-month prevalence estimates of sex with at least one woman, multiple male partners, and inconsistent condom use were 21.2%, 88.9%, and 54.1%, respectively. Conclusions Respondents had low rates of tobacco use and club drug use, and high rates of sexual risk behaviors. Although this study represents an initial step in documenting the health risk behaviors of immigrant Latino MSM who are part of a new trend in Latino immigration to the southeastern US, a need exists for further research, including longitudinal studies to understand the trajectory of risk behavior among immigrant Latino MSM. PMID:22236317

  2. Estimating the accumulation of chemicals in an estuarine food web: A case study for evaluation of future ecological and human health risks

    SciTech Connect

    Iannuzzi, T.J.; Finley, B.L.

    1995-12-31

    A model was constructed and calibrated for estimating the accumulation of sediment-associated nonionic organic chemicals, including selected PCBs and PCDD/Fs, in a simplified food web of the tidal Passaic River, New Jersey. The model was used to estimate concentrations of several chemicals in infaunal invertebrates, forage fish, blue crab, and adult finfish in the River as part of a screening-level risk assessment that was conducted during the preliminary phase of a CERCLA Remedial Investigation/Feasibility Study (RI/FS). Subsequent tissue-residue data were collected to evaluate the performance of the model, and to calibrate the model for multiple chemicals of concern in the River. A follow-up program of data collection was designed to support a more detailed risk assessment. The objectives of calibrating the model are to supplement the extant tissue-residue data that are available for risk assessment, and to evaluate future scenarios of bioaccumulation (and potential ecological and human health risk) under various future conditions in the River. Results to date suggest that the model performs well for the simplified food web that exists in the Passaic River. A case study was constructed to demonstrate the application of the model for future predictions of ecological risk. These preliminary results suggest that the model is sufficiently sensitive and accurate for estimating variations of bioaccumulation under varying degrees of source control or other future conditions.

  3. Estimates of Prevalence and Risk Associated with Inattention and Distraction Based Upon In Situ Naturalistic Data

    PubMed Central

    Dingus, Thomas A.

    2014-01-01

    By using in situ naturalistic driving data, estimates of prevalence and risk can be made regarding driver populations’ secondary task distractions and crash rates. Through metadata analysis, three populations of drivers (i.e., adult light vehicle, teenaged light vehicle, and adult heavy vehicle) were compared regarding frequency of secondary task behavior and the associated risk for safety-critical incidents. Relative risk estimates provide insight into the risk associated with engaging in a single task. When such risk is considered in combination with frequency of use, it sheds additional light on those secondary tasks that create the greatest overall risk to driving safety. The results show that secondary tasks involving manual typing, texting, dialing, reaching for an object, or reading are dangerous for all three populations. Additionally, novice teen drivers have difficulty in several tasks that the other two populations do not, including eating and external distractions. Truck drivers also perform a number of risky “mobile office” types of tasks, including writing, not seen in the other populations. Implications are described for policy makers and designers of in-vehicle and nomadic, portable systems. PMID:24776227

  4. Estimates of auditory risk from outdoor impulse noise. II: Civilian firearms.

    PubMed

    Flamme, Gregory A; Wong, Adam; Liebe, Kevin; Lynd, James

    2009-01-01

    Firearm impulses are common noise exposures in the United States. This study records, describes and analyzes impulses produced outdoors by civilian firearms with respect to the amount of auditory risk they pose to the unprotected listener under various listening conditions. Risk estimates were obtained using three contemporary damage risk criteria (DRC) including a waveform parameter-based approach (peak SPL and B-duration), an energy-based criterion (A-weighted SEL and equivalent continuous level) and a physiological model (AHAAH). Results from these DRC were converted into a number of maximum permissible unprotected exposures to facilitate interpretation. Acoustic characteristics of firearm impulses differed substantially across guns, ammunition, and microphone location. The type of gun, ammunition and the microphone location all significantly affected estimates of auditory risk from firearms. Vast differences in maximum permissible exposures were observed; the rank order of the differences varied with the source of the impulse. Unprotected exposure to firearm noise is not recommended, but people electing to fire a gun without hearing protection should be advised to minimize auditory risk through careful selection of ammunition and shooting environment. Small-caliber guns with long barrels and guns loaded with the least powerful ammunition tend to be associated with the least auditory risk.
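
Of the three damage risk criteria, the energy-based one rests on the sound exposure level. A sketch of the (unweighted) SEL of a sampled impulse; the actual criterion applies an A-weighting filter before this step, which is omitted here:

```python
import math

def sound_exposure_level(pressures_pa, dt):
    """Sound exposure level in dB re 20 uPa and 1 s.

    pressures_pa: instantaneous sound-pressure samples in pascals;
    dt: sample interval in seconds.  SEL normalizes the total squared-
    pressure exposure to a one-second reference duration.
    """
    p_ref, t_ref = 20e-6, 1.0
    exposure = sum(p * p for p in pressures_pa) * dt
    return 10.0 * math.log10(exposure / (p_ref * p_ref * t_ref))
```

A one-second signal at the 20 uPa reference pressure comes out at 0 dB, which is a quick sanity check on the normalization.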

  5. Estimating risks of heat strain by age and sex: a population-level simulation model.

    PubMed

    Glass, Kathryn; Tait, Peter W; Hanna, Elizabeth G; Dear, Keith

    2015-05-18

    Individuals living in hot climates face health risks from hyperthermia due to excessive heat. Heat strain is influenced by weather exposure and by individual characteristics such as age, sex, body size, and occupation. To explore the population-level drivers of heat strain, we developed a simulation model that scales up individual risks of heat storage (estimated using Myrup and Morgan's man model "MANMO") to a large population. Using Australian weather data, we identify high-risk weather conditions together with individual characteristics that increase the risk of heat stress under these conditions. The model identifies elevated risks in children and the elderly, with females aged 75 and older those most likely to experience heat strain. Risk of heat strain in males does not increase as rapidly with age, but is greatest on hot days with high solar radiation. Although cloudy days are less dangerous for the wider population, older women still have an elevated risk of heat strain on hot cloudy days or when indoors during high temperatures. Simulation models provide a valuable method for exploring population level risks of heat strain, and a tool for evaluating public health and other government policy interventions.

  6. Impact of provision of cardiovascular disease risk estimates to healthcare professionals and patients: a systematic review

    PubMed Central

    Usher-Smith, Juliet A; Silarova, Barbora; Schuit, Ewoud; GM Moons, Karel; Griffin, Simon J

    2015-01-01

    Objective To systematically review whether the provision of information on cardiovascular disease (CVD) risk to healthcare professionals and patients impacts their decision-making, behaviour and ultimately patient health. Design A systematic review. Data sources An electronic literature search of MEDLINE and PubMed from 01/01/2004 to 01/06/2013 with no language restriction and manual screening of reference lists of systematic reviews on similar topics and all included papers. Eligibility criteria for selecting studies (1) Primary research published in a peer-reviewed journal; (2) inclusion of participants with no history of CVD; (3) intervention strategy consisted of provision of a CVD risk model estimate to either professionals or patients; and (4) the only difference between the intervention group and control group (or the only intervention in the case of before-after studies) was the provision of a CVD risk model estimate. Results After duplicates were removed, the initial electronic search identified 9671 papers. We screened 196 papers at title and abstract level and included 17 studies. The heterogeneity of the studies limited the analysis, but together they showed that provision of risk information to patients improved the accuracy of risk perception without decreasing quality of life or increasing anxiety, but had little effect on lifestyle. Providing risk information to physicians increased prescribing of lipid-lowering and blood pressure medication, with greatest effects in those with CVD risk >20% (relative risk for change in prescribing 2.13 (1.02 to 4.63) and 2.38 (1.11 to 5.10) respectively). Overall, there was a trend towards reductions in cholesterol and blood pressure and a statistically significant reduction in modelled CVD risk (−0.39% (−0.71 to −0.07)) after, on average, 12 months. Conclusions There seems to be evidence that providing CVD risk model estimates to professionals and patients improves perceived CVD risk and medical prescribing…

  7. Health risk estimates for groundwater and soil contamination in the Slovak Republic: a convenient tool for identification and mapping of risk areas.

    PubMed

    Fajčíková, K; Cvečková, V; Stewart, A; Rapant, S

    2014-10-01

    We undertook a quantitative estimation of health risks to residents living in the Slovak Republic and exposed to contaminated groundwater (ingestion by adult population) and/or soils (ingestion by adult and child population). Potential risk areas were mapped to give a visual presentation at basic administrative units of the country (municipalities, districts, regions) for easy discussion with policy and decision-makers. The health risk estimates were calculated by US EPA methods, applying threshold values for chronic risk and non-threshold values for cancer risk. The potential health risk was evaluated for As, Ba, Cd, Cu, F, Hg, Mn, NO3 (-), Pb, Sb, Se and Zn for groundwater and As, B, Ba, Be, Cd, Cu, F, Hg, Mn, Mo, Ni, Pb, Sb, Se and Zn for soils. An increased health risk was identified mainly in historical mining areas highly contaminated by geogenic-anthropogenic sources (ore deposit occurrence, mining, metallurgy). Arsenic and antimony were the most significant elements in relation to health risks from groundwater and soil contamination in the Slovak Republic contributing a significant part of total chronic risk levels. Health risk estimation for soil contamination has highlighted the significance of exposure through soil ingestion in children. Increased cancer risks from groundwater and soil contamination by arsenic were noted in several municipalities and districts throughout the country in areas with significantly high arsenic levels in the environment. This approach to health risk estimations and visualization represents a fast, clear and convenient tool for delineation of risk areas at national and local levels.

  8. European risk assessment of LAS in agricultural soil revisited: species sensitivity distribution and risk estimates.

    PubMed

    Jensen, John; Smith, Stephen R; Krogh, Paul Henning; Versteeg, Donald J; Temara, Ali

    2007-10-01

    Linear alkylbenzene sulphonate (LAS) is used at a rate of approximately 430,000 tons/y in Western Europe, mainly in laundry detergents. It is present in sewage sludge (70-5,600 mg/kg; 5-95th percentile) because of its high usage per capita, its sorption and precipitation in primary settlers, and its lack of degradation in anaerobic digesters. Immediately after amendment, calculated and measured concentrations are <1 to 60 mg LAS/kg soil. LAS biodegrades rapidly in soil with primary and ultimate half-lives of up to 7 and 30 days, respectively. Calculated residual concentrations after the averaging time (30 days) are 0.24-18 mg LAS/kg soil. The long-term ecotoxicity to soil microbiota is relatively low (EC10 >or=26 mg sludge-associated LAS/kg soil). An extensive review of the invertebrate and plant ecotoxicological data, combined with a probabilistic assessment approach, led to a PNEC value of 35 mg LAS/kg soil, i.e. the 5th percentile (HC5) of the species sensitivity distribution (lognormal distribution of the EC10 and NOEC values). Risk ratios were identified to fall within a range of 0.01 (median LAS concentration in sludge) to 0.1 (95th percentile) and always below 0.5 (maximum LAS concentration measured in sludge) according to various scenarios covering different factors such as local sewage influent concentration, water hardness, and sewage sludge stabilisation process. Based on the present information, it can be concluded that LAS does not represent an ecological risk in Western Europe when applied via normal sludge amendment to agricultural soil.
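
The HC5 used above is the 5th percentile of a log-normal species sensitivity distribution fitted to the EC10/NOEC values. A minimal sketch of that calculation (a point estimate only; the paper's probabilistic assessment also propagates uncertainty in the fitted percentile):

```python
import math
from statistics import NormalDist, mean, stdev

def hc5(toxicity_values, p=0.05):
    """p-th percentile (HC5 by default) of a log-normal SSD.

    toxicity_values: NOEC/EC10 values (e.g. mg LAS/kg soil).  Fits a
    log-normal distribution by moments on the log10 values and returns
    its p-th quantile; a point estimate only.
    """
    logs = [math.log10(v) for v in toxicity_values]
    z = NormalDist().inv_cdf(p)
    return 10 ** (mean(logs) + z * stdev(logs))
```

The risk ratios in the abstract then follow by dividing a predicted soil concentration by this PNEC-style threshold.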

  9. Radiobiologic risk estimation from dental radiology. Part II. Cancer incidence and fatality

    SciTech Connect

    Underhill, T.E.; Kimura, K.; Chilvarquer, I.; McDavid, W.D.; Langlais, R.P.; Preece, J.W.; Barnwell, G.

    1988-08-01

    With the use of the measured absorbed doses from part I of this article, the specific radiobiologic risk to the patient from (1) five different panoramic machines with rare-earth screens, (2) a 20-film complete-mouth survey with E-speed film, long round cone, (3) a 20-film complete-mouth survey with E-speed film, long rectangular cone, (4) a 4-film interproximal survey with E-speed film, long round cone, and (5) a 4-film interproximal survey with E-speed film, long rectangular cone, was calculated. The estimated risks are expressed in two ways: the probability of radiation-induced cancer in specific organs per million examinations and the probability of expression of a fatal cancer per million examinations. The highest risks calculated were from the complete-mouth survey with the use of round collimation. The lowest risks calculated were from panoramic radiography and four interproximal radiographs with rectangular collimation.

  10. Risk estimators for radiation-induced bone marrow syndrome lethality in humans

    SciTech Connect

    Scott, B.R.; Hahn, F.F.; McClellan, R.O.; Seiler, F.A.

    1988-09-01

    This manuscript provides risk estimators for acute lethality from radiation-induced injury to the bone marrow of humans after uniform total-body exposure to low linear energy transfer (LET) radiation. The risk estimators are needed for nuclear disaster risk assessment. The approach used is based on the dose X, in units of D50 (i.e., the dose required for 50% lethality). Using animal data, it is demonstrated that the use of dose in units of D50 eliminates most of the variability associated with mammalian species, type of low-LET radiation, and low-LET dose rate. Animal data are used to determine the shape of the dose-effect curve for marrow-syndrome lethality in man and to develop a functional relationship for the dependence of the D50 on dose rate. The functional relationship is used, along with the Weibull model, to develop acute lethality risk estimators for complex temporal patterns of continuous exposure to low-LET radiation. Animal data are used to test model predictions.
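
Working in units of D50, as the abstract does, pairs naturally with a Weibull dose-lethality curve. One common parameterization, pinned so that lethality is exactly 50% at one D50 (the shape value used below is illustrative, not a NUREG/CR-4214 parameter):

```python
import math

def weibull_lethality(dose, d50, shape):
    """Weibull dose-lethality: P = 1 - exp(-ln2 * (dose/d50)**shape).

    Pinned so P = 0.5 exactly at dose = d50, i.e. dose expressed in
    units of D50 as in the abstract.  Organ-specific shape values
    would come from fits to the animal data described above.
    """
    x = dose / d50
    return 1.0 - math.exp(-math.log(2.0) * x ** shape)
```

A steep shape makes the curve switch-like around D50, which is the qualitative behavior expected for bone marrow syndrome lethality.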

  11. A Methodological Approach to Small Area Estimation for the Behavioral Risk Factor Surveillance System

    PubMed Central

    Xu, Fang; Wallace, Robyn C.; Garvin, William; Greenlund, Kurt J.; Bartoli, William; Ford, Derek; Eke, Paul; Town, G. Machell

    2016-01-01

    Public health researchers have used a class of statistical methods to calculate prevalence estimates for small geographic areas with few direct observations. Many researchers have used Behavioral Risk Factor Surveillance System (BRFSS) data as a basis for their models. The aims of this study were to 1) describe a new BRFSS small area estimation (SAE) method and 2) investigate the internal and external validity of the BRFSS SAEs it produced. The BRFSS SAE method uses 4 data sets (the BRFSS, the American Community Survey Public Use Microdata Sample, Nielsen Claritas population totals, and the Missouri Census Geographic Equivalency File) to build a single weighted data set. Our findings indicate that internal and external validity tests were successful across many estimates. The BRFSS SAE method is one of several methods that can be used to produce reliable prevalence estimates in small geographic areas. PMID:27418213

  12. Understanding the effects of past flood events and perceived and estimated flood risks on individuals' voluntary flood insurance purchase behavior.

    PubMed

    Shao, Wanyun; Xian, Siyuan; Lin, Ning; Kunreuther, Howard; Jackson, Nida; Goidel, Kirby

    2017-01-01

    Over the past several decades, the economic damage from flooding in the coastal areas has greatly increased due to rapid coastal development coupled with possible climate change impacts. One effective way to mitigate excessive economic losses from flooding is to purchase flood insurance. Only a minority of coastal residents however have taken this preventive measure. Using original survey data for all coastal counties of the United States Gulf Coast merged with contextual data, this study examines the effects of external influences and perceptions of flood-related risks on individuals' voluntary behaviors to purchase flood insurance. It is found that the estimated flood hazard conveyed through the U.S. Federal Emergency Management Agency's (FEMA's) flood maps, the intensities and consequences of past storms and flooding events, and perceived flood-related risks significantly affect individual's voluntary purchase of flood insurance. This behavior is also influenced by home ownership, trust in local government, education, and income. These findings have several important policy implications. First, FEMA's flood maps have been effective in conveying local flood risks to coastal residents, and correspondingly influencing their decisions to voluntarily seek flood insurance in the U.S. Gulf Coast. Flood maps therefore should be updated frequently to reflect timely and accurate information about flood hazards. Second, policy makers should design strategies to increase homeowners' trust in the local government, to better communicate flood risks with residents, to address the affordability issue for the low-income, and better inform less educated homeowners through various educational programs. Future studies should examine the voluntary flood insurance behavior across countries that are vulnerable to flooding.

  13. Fatalities in high altitude mountaineering: a review of quantitative risk estimates.

    PubMed

    Weinbruch, Stephan; Nordby, Karl-Christian

    2013-12-01

    Quantitative estimates for mortality in high altitude mountaineering are reviewed. Special emphasis is placed on the heterogeneity of the risk estimates and on confounding. Crude estimates for mortality are on the order of 1/1000 to 40/1000 persons above base camp, for both expedition members and high altitude porters. High altitude porters have mostly a lower risk than expedition members (risk ratio for all Nepalese peaks requiring an expedition permit: 0.73; 95 % confidence interval 0.59-0.89). The summit bid is generally the most dangerous part of an expedition for members, whereas most high altitude porters die during route preparation. On 8000 m peaks, the mortality during descent from summit varies between 4/1000 and 134/1000 summiteers (members plus porters). The risk estimates are confounded by human and environmental factors. Information on confounding by gender and age is contradictory and requires further work. There are indications for safety segregation of men and women, with women being more risk averse than men. Citizenship appears to be a significant confounder. Prior high altitude mountaineering experience in Nepal has no protective effect. Commercial expeditions in the Nepalese Himalayas have a lower mortality than traditional expeditions, though after controlling for confounding, the difference is not statistically significant. The overall mortality is increasing with increasing peak altitude for expedition members but not for high altitude porters. In the Nepalese Himalayas and in Alaska, a significant decrease of mortality with calendar year was observed. A few suggestions for further work are made at the end of the article.

  14. Risk estimates for deterministic health effects of inhaled weapons grade plutonium.

    PubMed

    Scott, Bobby R; Peterson, Vern L

    2003-09-01

    Risk estimates for deterministic effects of inhaled weapons-grade plutonium (WG Pu) are needed to evaluate potential serious harm to (1) U.S. Department of Energy nuclear workers from accidental or other work-place releases of WG Pu; and (2) the public from terrorist actions resulting in the release of WG Pu to the environment. Deterministic health effects (the most serious radiobiological consequences to humans) can arise when large amounts of WG Pu are taken into the body. Inhalation is considered the most likely route of intake during work-place accidents or during a nuclear terrorism incident releasing WG Pu to the environment. Our current knowledge about radiation-related harm is insufficient for generating precise estimates of risk for a given WG Pu exposure scenario. This relates largely to uncertainties associated with currently available risk and dosimetry models. Thus, rather than generating point estimates of risk, distributions that account for variability/uncertainty are needed to properly characterize potential harm to humans from a given WG Pu exposure scenario. In this manuscript, we generate and summarize risk distributions for deterministic radiation effects in the lungs of nuclear workers from inhaled WG Pu particles (standard isotopic mix). These distributions were developed using NUREG/CR-4214 risk models and time-dependent, dose conversion factor data based on Publication 30 of the International Commission on Radiological Protection. Dose conversion factors based on ICRP Publication 30 are more relevant to deterministic effects than are the dose conversion factors based on ICRP Publication 66, which relate to targets for stochastic effects. Risk distributions that account for NUREG/CR-4214 parameter and model uncertainties were generated using the Monte Carlo method. Risks were evaluated for both lethality (from radiation pneumonitis) and morbidity (due to radiation-induced respiratory dysfunction) and were found to depend strongly on absorbed
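The Monte Carlo step described above can be sketched compactly. The hazard-function form below (risk = 1 - exp(-ln 2 · (D/D50)^V)) is the general Weibull shape used for deterministic lung effects; the dose and the uncertainty ranges for D50 and V are illustrative assumptions, not values taken from NUREG/CR-4214.

```python
import math
import random

random.seed(42)

def pneumonitis_risk(dose_gy, d50, shape):
    """Weibull hazard-function risk model: H = ln2 * (D/D50)^V,
    risk = 1 - exp(-H)."""
    hazard = math.log(2) * (dose_gy / d50) ** shape
    return 1.0 - math.exp(-hazard)

# Propagate parameter uncertainty by Monte Carlo. D50 (dose giving
# 50% incidence) and the shape V are drawn from assumed ranges.
dose = 8.0  # hypothetical absorbed lung dose, Gy
risks = sorted(
    pneumonitis_risk(dose, random.uniform(8.0, 12.0),
                     random.uniform(5.0, 9.0))
    for _ in range(10_000)
)
median = risks[len(risks) // 2]
p95 = risks[int(0.95 * len(risks))]
print(f"median risk {median:.3f}, 95th percentile {p95:.3f}")
```

The output is a risk distribution rather than a point estimate, which is exactly the characterization of variability/uncertainty the abstract argues for.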

  15. Using a relative health indicator (RHI) metric to estimate health risk reductions in drinking water.

    PubMed

    Alfredo, Katherine A; Seidel, Chad; Ghosh, Amlan; Roberson, J Alan

    2017-03-01

    When a new drinking water regulation is being developed, the USEPA conducts a health risk reduction and cost analysis to, in part, estimate quantifiable and non-quantifiable costs and benefits of the various regulatory alternatives. Numerous methodologies are available for cumulative risk assessment ranging from primarily qualitative to primarily quantitative. This research developed a summary metric of relative cumulative health impacts resulting from drinking water, the relative health indicator (RHI). An intermediate level of quantification and modeling was chosen, one which retains the concept of an aggregated metric of public health impact and hence allows for comparisons to be made across "cups of water," but avoids the need for development and use of complex models that are beyond the existing state of the science. Using the USEPA Six-Year Review data and available national occurrence surveys of drinking water contaminants, the metric is used to test risk reduction as it pertains to the implementation of the arsenic and uranium maximum contaminant levels and to quantify "meaningful" risk reduction. Uranium represented the threshold risk reduction against which national non-compliance risk reduction was compared for arsenic, nitrate, and radium. Arsenic non-compliance is most significant, and efforts focused on bringing those non-compliant utilities into compliance with the 10 μg/L maximum contaminant level would meet the threshold for meaningful risk reduction.

  16. What is the significance of end-stage renal disease risk estimation in living kidney donors?

    PubMed

    Gaillard, François; Baron, Stéphanie; Timsit, Marc-Olivier; Eladari, Dominique; Fournier, Catherine; Prot-Bertoye, Caroline; Bertocchio, Jean-Philippe; Lamhaut, Lionel; Friedlander, Gérard; Méjean, Arnaud; Legendre, Christophe; Courbebaisse, Marie

    2017-02-02

    Two end-stage renal disease (ESRD) risk calculators were recently developed, by Grams et al. and by Ibrahim et al., to calculate ESRD risk before donation among living kidney donors. However, those calculators have never been applied to potential donors for whom donation was refused due to medical contraindications, nor compared against a group of accepted donors. We compared the 15-year and lifetime ESRD risk of donors and of nondonors for medical cause, as estimated by those two calculators. Nondonors for medical cause (n = 27) had a significantly higher 15-year ESRD risk compared to donors (n = 288) with both calculators (0.25 vs. 0.14, P < 0.001 for the calculator of Grams et al. and 2.21 vs. 1.43, P = 0.002 for that of Ibrahim et al.). On the contrary, lifetime ESRD risk was not significantly different between the two groups. At both horizons (15 years and lifetime), we observed a significant overlap of ESRD risk between the two groups. ESRD risk calculators could complement the standard screening strategy but cannot be used alone to accept or decline donation.

  17. Estimating Risk of Natural Gas Portfolios by Using GARCH-EVT-Copula Model

    PubMed Central

    Tang, Jiechen; Zhou, Chao; Yuan, Xinyu; Sriboonchitta, Songsak

    2015-01-01

    This paper concentrates on estimating the risk of Title Transfer Facility (TTF) Hub natural gas portfolios by using the GARCH-EVT-copula model. We first use the univariate ARMA-GARCH model to model each natural gas return series. Second, the extreme value distribution (EVT) is fitted to the tails of the residuals to model marginal residual distributions. Third, multivariate Gaussian copula and Student t-copula are employed to describe the natural gas portfolio risk dependence structure. Finally, we simulate N portfolios and estimate value at risk (VaR) and conditional value at risk (CVaR). Our empirical results show that, for an equally weighted portfolio of five natural gases, the VaR and CVaR values obtained from the Student t-copula are larger than those obtained from the Gaussian copula. Moreover, when minimizing the portfolio risk, the optimal natural gas portfolio weights are found to be similar across the multivariate Gaussian copula and Student t-copula and different confidence levels. PMID:26351652

  18. Estimating Risk of Natural Gas Portfolios by Using GARCH-EVT-Copula Model.

    PubMed

    Tang, Jiechen; Zhou, Chao; Yuan, Xinyu; Sriboonchitta, Songsak

    2015-01-01

    This paper concentrates on estimating the risk of Title Transfer Facility (TTF) Hub natural gas portfolios by using the GARCH-EVT-copula model. We first use the univariate ARMA-GARCH model to model each natural gas return series. Second, the extreme value distribution (EVT) is fitted to the tails of the residuals to model marginal residual distributions. Third, multivariate Gaussian copula and Student t-copula are employed to describe the natural gas portfolio risk dependence structure. Finally, we simulate N portfolios and estimate value at risk (VaR) and conditional value at risk (CVaR). Our empirical results show that, for an equally weighted portfolio of five natural gases, the VaR and CVaR values obtained from the Student t-copula are larger than those obtained from the Gaussian copula. Moreover, when minimizing the portfolio risk, the optimal natural gas portfolio weights are found to be similar across the multivariate Gaussian copula and Student t-copula and different confidence levels.
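The copula comparison in the abstract can be illustrated in miniature. The sketch below draws an equally weighted two-asset portfolio from a Gaussian copula and from a Student t-copula and reports 95% VaR and CVaR. Plain t(4) marginals stand in for the paper's ARMA-GARCH-EVT marginals, and the correlation and return scale are invented for illustration.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

def copula_uniforms(corr, n, df=None):
    """Uniform draws from a Gaussian copula (df=None) or a Student
    t-copula with df degrees of freedom and correlation `corr`."""
    L = np.linalg.cholesky(corr)
    z = rng.standard_normal((n, corr.shape[0])) @ L.T
    if df is None:
        return stats.norm.cdf(z)
    w = rng.chisquare(df, size=(n, 1)) / df   # Z/sqrt(W/df) is t(df)
    return stats.t.cdf(z / np.sqrt(w), df)

def var_cvar(losses, alpha=0.95):
    var = np.quantile(losses, alpha)
    return var, losses[losses >= var].mean()  # CVaR = mean tail loss

corr = np.array([[1.0, 0.6], [0.6, 1.0]])     # assumed dependence
results = {}
for name, df in [("Gaussian copula", None), ("t-copula", 4)]:
    u = copula_uniforms(corr, 100_000, df)
    returns = 0.01 * stats.t.ppf(u, 4)        # stand-in t(4) marginals
    losses = -returns.mean(axis=1)            # equally weighted portfolio
    results[name] = var_cvar(losses)
    print(name, "VaR=%.4f CVaR=%.4f" % results[name])
```

Because the t-copula has tail dependence, joint extreme losses are more frequent than under the Gaussian copula, which is the mechanism behind the larger t-copula VaR/CVaR reported in the paper.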

  19. Risk-Targeted Selection of Agricultural Holdings for Post-Epidemic Surveillance: Estimation of Efficiency Gains

    PubMed Central

    Handel, Ian G.; de C. Bronsvoort, Barend M.; Forbes, John F.; Woolhouse, Mark E. J.

    2011-01-01

    Current post-epidemic sero-surveillance uses random selection of animal holdings. A better strategy may be to estimate the benefits gained by sampling each farm and use this to target selection. In this study we estimate the probability of undiscovered infection for sheep farms in Devon after the 2001 foot-and-mouth disease outbreak using the combination of a previously published model of daily infection risk and a simple model of probability of discovery of infection during the outbreak. This allows comparison of the system sensitivity (ability to detect infection in the area) of arbitrary, random sampling compared to risk-targeted selection across a full range of sampling budgets. We show that it is possible to achieve 95% system sensitivity by sampling, on average, 945 farms with random sampling and 184 farms with risk-targeted sampling. We also examine the effect of ordering samples by risk to expedite return to a disease-free status. Risk ordering the sampling process results in detection of positive farms, if present, 15.6 days sooner than with randomly ordered sampling, assuming 50 farms are tested per day. PMID:21674022
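The efficiency gain from risk targeting can be demonstrated with a toy calculation. Assuming independent farms, a perfect test, and a skewed distribution of per-farm infection probabilities (all invented here, not outputs of the Devon model), the sketch counts how many farms each strategy must sample before the combined detection probability reaches 95%.

```python
import random

random.seed(1)

# Skewed hypothetical per-farm probabilities of undiscovered infection:
# a few farms carry most of the risk.
probs = [random.betavariate(0.5, 30) for _ in range(2000)]

def farms_needed(ordered, target=0.95):
    """Smallest prefix of `ordered` whose combined detection
    probability 1 - prod(1 - p_i) reaches `target`."""
    miss = 1.0
    for k, p in enumerate(ordered, start=1):
        miss *= 1.0 - p
        if 1.0 - miss >= target:
            return k
    return len(ordered)

k_risk = farms_needed(sorted(probs, reverse=True))  # risk-targeted
k_rand = farms_needed(probs)                        # arbitrary order
print(f"risk-targeted: {k_risk} farms, random: {k_rand} farms")
```

Sampling the highest-risk farms first concentrates the detection probability in far fewer visits, mirroring the 184-vs-945 contrast reported in the study.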

  20. Performance of the modified Poisson regression approach for estimating relative risks from clustered prospective data.

    PubMed

    Yelland, Lisa N; Salter, Amy B; Ryan, Philip

    2011-10-15

    Modified Poisson regression, which combines a log Poisson regression model with robust variance estimation, is a useful alternative to log binomial regression for estimating relative risks. Previous studies have shown both analytically and by simulation that modified Poisson regression is appropriate for independent prospective data. This method is often applied to clustered prospective data, despite a lack of evidence to support its use in this setting. The purpose of this article is to evaluate the performance of the modified Poisson regression approach for estimating relative risks from clustered prospective data, by using generalized estimating equations to account for clustering. A simulation study is conducted to compare log binomial regression and modified Poisson regression for analyzing clustered data from intervention and observational studies. Both methods generally perform well in terms of bias, type I error, and coverage. Unlike log binomial regression, modified Poisson regression is not prone to convergence problems. The methods are contrasted by using example data sets from 2 large studies. The results presented in this article support the use of modified Poisson regression as an alternative to log binomial regression for analyzing clustered prospective data when clustering is taken into account by using generalized estimating equations.
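A minimal sketch of the approach on synthetic data: a log-link Poisson fit to a binary outcome via IRLS, with the model-based variance replaced by a cluster-robust sandwich estimator (the role played by generalized estimating equations in the article). This is a bare-bones illustration, not the authors' simulation code.

```python
import numpy as np

rng = np.random.default_rng(7)

def modified_poisson(X, y, clusters, iters=50):
    """Log-Poisson regression for a binary y (IRLS) with a
    cluster-robust sandwich standard error."""
    n, p = X.shape
    beta = np.zeros(p)
    for _ in range(iters):
        mu = np.exp(X @ beta)
        z = X @ beta + (y - mu) / mu          # working response
        beta = np.linalg.solve(X.T @ (mu[:, None] * X),
                               X.T @ (mu * z))
    mu = np.exp(X @ beta)
    bread = np.linalg.inv(X.T @ (mu[:, None] * X))
    meat = np.zeros((p, p))
    for g in np.unique(clusters):             # cluster score sums
        s = X[clusters == g].T @ (y - mu)[clusters == g]
        meat += np.outer(s, s)
    cov = bread @ meat @ bread                # sandwich variance
    return beta, np.sqrt(np.diag(cov))

# Synthetic clustered data with true relative risk exp(b1) = 2.
clusters = np.repeat(np.arange(200), 10)
x = rng.integers(0, 2, 2000).astype(float)
u = np.repeat(rng.normal(0.0, 0.2, 200), 10)  # shared cluster effect
y = (rng.random(2000) < 0.15 * np.exp(np.log(2.0) * x + u)).astype(float)
X = np.column_stack([np.ones(2000), x])
beta, se = modified_poisson(X, y, clusters)
print("RR =", round(float(np.exp(beta[1])), 2),
      "robust SE(log RR) =", round(float(se[1]), 3))
```

Exponentiating the slope gives the relative risk directly, which is the attraction of the log link over logistic regression's odds ratio.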

  1. Muscle Mass and Body Fat in Relation to Cardiovascular Risk Estimation and Lipid-Lowering Eligibility.

    PubMed

    Lee, Kayoung

    2016-12-06

    This cross-sectional population-based study aimed to evaluate the relationships of muscle-mass and body-fat phenotypes to 10-yr risk of cardiovascular disease (CVD) events and eligibility for lipid management. Participants were Korean adults (N = 7315; 3163 men, 4152 women) aged 40-79 yr, free from stroke and coronary heart disease, who provided complete data for estimating 10-yr CVD risk and body composition during the Fifth Korea National Health and Nutrition Examination Survey (2009-2010). Four levels of combined muscle mass and body fat were determined using sex-specific quintiles of appendicular skeletal muscle mass divided by height squared, and sex-specific quintiles of total body fat percentage. Ten-year CVD risk was calculated using Pooled Cohort Equations and Framingham risk scores. Lipid-lowering medication eligibility was determined using American College of Cardiology/American Heart Association (ACC/AHA) and Adult Treatment Panel (ATP) III guidelines. Compared with the reference group, the risk of CVD events was higher in men with low muscle mass, high body fat, or the 2 factors combined. CVD risk was lower in women with low muscle mass, higher in women with high body fat, and nonsignificant in women with the 2 factors. Participants with low muscle mass and high body fat had higher odds for medication eligibility using the ACC/AHA guidelines but not the ATP III guidelines. Higher estimated 10-yr CVD risk was associated with combined phenotypes of low muscle mass and high fat in men but not in women. Also, the relationship of these phenotypes to lipid-lowering medication eligibility was guideline-specific.

  2. Estimation of the standardized risk difference and ratio in a competing risks framework: application to injection drug use and progression to AIDS after initiation of antiretroviral therapy.

    PubMed

    Cole, Stephen R; Lau, Bryan; Eron, Joseph J; Brookhart, M Alan; Kitahata, Mari M; Martin, Jeffrey N; Mathews, William C; Mugavero, Michael J

    2015-02-15

    There are few published examples of absolute risk estimated from epidemiologic data subject to censoring and competing risks with adjustment for multiple confounders. We present an example estimating the effect of injection drug use on 6-year risk of acquired immunodeficiency syndrome (AIDS) after initiation of combination antiretroviral therapy between 1998 and 2012 in an 8-site US cohort study with death before AIDS as a competing risk. We estimate the risk standardized to the total study sample by combining inverse probability weights with the cumulative incidence function; estimates of precision are obtained by bootstrap. In 7,182 patients (83% male, 33% African American, median age of 38 years), we observed 6-year standardized AIDS risks of 16.75% among 1,143 injection drug users and 12.08% among 6,039 nonusers, yielding a standardized risk difference of 4.68 (95% confidence interval: 1.27, 8.08) and a standardized risk ratio of 1.39 (95% confidence interval: 1.12, 1.72). Results may be sensitive to the assumptions of exposure-version irrelevance, no measurement bias, and no unmeasured confounding. These limitations suggest that results be replicated with refined measurements of injection drug use. Nevertheless, estimating the standardized risk difference and ratio is straightforward, and injection drug use appears to increase the risk of AIDS.
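The cumulative incidence function at the core of this approach can be estimated non-parametrically in a few lines. The sketch below is the unweighted Aalen-Johansen estimator on simulated data (exponential event and censoring times chosen arbitrarily); the inverse probability weighting the article uses for confounder standardization is omitted here.

```python
import numpy as np

def cumulative_incidence(times, events, cause):
    """Aalen-Johansen cumulative incidence for one cause.
    events: 0 = censored, 1, 2, ... = competing event types."""
    order = np.argsort(times)
    times, events = times[order], events[order]
    at_risk, surv, cif = len(times), 1.0, 0.0
    grid, values = [], []
    for t in np.unique(times):
        here = times == t
        d_cause = int((events[here] == cause).sum())
        d_any = int((events[here] > 0).sum())
        cif += surv * d_cause / at_risk      # increment uses S(t-)
        surv *= 1.0 - d_any / at_risk        # all-cause Kaplan-Meier
        at_risk -= int(here.sum())
        grid.append(t)
        values.append(cif)
    return np.array(grid), np.array(values)

rng = np.random.default_rng(3)
n = 5000
t1 = rng.exponential(1.0, n)   # latent time to event of interest
t2 = rng.exponential(2.0, n)   # latent time to competing event
c = rng.exponential(3.0, n)    # censoring time
time = np.minimum(np.minimum(t1, t2), c)
event = np.where(c <= np.minimum(t1, t2), 0, np.where(t1 <= t2, 1, 2))
grid, cif1 = cumulative_incidence(time, event, cause=1)
print("estimated CIF of cause 1 at end of follow-up:",
      round(float(cif1[-1]), 2))
```

Unlike 1 minus Kaplan-Meier, the cumulative incidence function does not treat competing deaths as censored, so the cause-specific risks sum to no more than the all-cause risk.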

  3. 49 CFR Appendix G to Part 222 - Excess Risk Estimates for Public Highway-Rail Grade Crossings

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    Title 49 (Transportation), Part 222, Appendix G: Excess Risk Estimates for Public Highway-Rail Grade Crossings (Code of Federal Regulations, revised as of October 1, 2010). From the subchapter on use of locomotive horns at public highway-rail grade crossings.

  4. Estimating the risks of cancer mortality and genetic defects resulting from exposures to low levels of ionizing radiation

    SciTech Connect

    Buhl, T.E.; Hansen, W.R.

    1984-05-01

    Estimators for calculating the risk of cancer and genetic disorders induced by exposure to ionizing radiation have been recommended by the US National Academy of Sciences Committee on the Biological Effects of Ionizing Radiations, the UN Scientific Committee on the Effects of Atomic Radiation, and the International Committee on Radiological Protection. These groups have also considered the risks of somatic effects other than cancer. The US National Council on Radiation Protection and Measurements has discussed risk estimate procedures for radiation-induced health effects. The recommendations of these national and international advisory committees are summarized and compared in this report. Based on this review, two procedures for risk estimation are presented for use in radiological assessments performed by the US Department of Energy under the National Environmental Policy Act of 1969 (NEPA). In the first procedure, age- and sex-averaged risk estimators calculated with US average demographic statistics would be used with estimates of radiation dose to calculate the projected risk of cancer and genetic disorders that would result from the operation being reviewed under NEPA. If more site-specific risk estimators are needed, and the demographic information is available, a second procedure is described that would involve direct calculation of the risk estimators using recommended risk-rate factors. The computer program REPCAL has been written to perform this calculation and is described in this report. 25 references, 16 tables.
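The first procedure described above, applying age- and sex-averaged risk estimators built from average demographic statistics, reduces to a weighted sum. The risk-rate factors and population fractions below are illustrative placeholders, not the values recommended by BEIR, UNSCEAR, or ICRP, and the dose is hypothetical.

```python
# Hypothetical risk-rate factors (excess cases per person-Gy) and
# population fractions by age group and sex; illustrative values only.
risk_rate = {("0-19", "f"): 0.14, ("0-19", "m"): 0.12,
             ("20-64", "f"): 0.06, ("20-64", "m"): 0.05,
             ("65+", "f"): 0.02, ("65+", "m"): 0.02}
pop_frac = {("0-19", "f"): 0.13, ("0-19", "m"): 0.14,
            ("20-64", "f"): 0.30, ("20-64", "m"): 0.29,
            ("65+", "f"): 0.08, ("65+", "m"): 0.06}

# Age- and sex-averaged estimator (procedure 1), then projected risk.
avg_estimator = sum(risk_rate[k] * pop_frac[k] for k in risk_rate)
per_capita_dose_gy = 0.001   # hypothetical dose from the operation
print(f"averaged estimator: {avg_estimator:.4f} per Gy")
print(f"projected individual risk: "
      f"{avg_estimator * per_capita_dose_gy:.2e}")
```

The second, site-specific procedure would replace the national population fractions with local demographics before taking the same weighted sum.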

  5. Parametric estimation of P(X > Y) for normal distributions in the context of probabilistic environmental risk assessment

    PubMed Central

    Bekker, Andriëtte A.; van der Voet, Hilko; ter Braak, Cajo J.F.

    2015-01-01

    Estimating the risk, P(X > Y), in probabilistic environmental risk assessment of nanoparticles is a problem when confronted by potentially small risks and small sample sizes of the exposure concentration X and/or the effect concentration Y. This is illustrated in the motivating case study of aquatic risk assessment of nano-Ag. A non-parametric estimator based on data alone is not sufficient as it is limited by sample size. In this paper, we investigate the maximum gain possible when making strong parametric assumptions as opposed to making no parametric assumptions at all. We compare maximum likelihood and Bayesian estimators with the non-parametric estimator and study the influence of sample size and risk on the (interval) estimators via simulation. We found that the parametric estimators enable us to estimate and bound the risk for smaller sample sizes and small risks. Also, the Bayesian estimator outperforms the maximum likelihood estimators in terms of coverage and interval lengths and is, therefore, preferred in our motivating case study. PMID:26312175

  6. Parametric estimation of P(X > Y) for normal distributions in the context of probabilistic environmental risk assessment.

    PubMed

    Jacobs, Rianne; Bekker, Andriëtte A; van der Voet, Hilko; Ter Braak, Cajo J F

    2015-01-01

    Estimating the risk, P(X > Y), in probabilistic environmental risk assessment of nanoparticles is a problem when confronted by potentially small risks and small sample sizes of the exposure concentration X and/or the effect concentration Y. This is illustrated in the motivating case study of aquatic risk assessment of nano-Ag. A non-parametric estimator based on data alone is not sufficient as it is limited by sample size. In this paper, we investigate the maximum gain possible when making strong parametric assumptions as opposed to making no parametric assumptions at all. We compare maximum likelihood and Bayesian estimators with the non-parametric estimator and study the influence of sample size and risk on the (interval) estimators via simulation. We found that the parametric estimators enable us to estimate and bound the risk for smaller sample sizes and small risks. Also, the Bayesian estimator outperforms the maximum likelihood estimators in terms of coverage and interval lengths and is, therefore, preferred in our motivating case study.
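For independent normal X and Y, P(X > Y) has the closed form Φ((μX − μY)/√(σX² + σY²)), so the maximum likelihood estimator is a simple plug-in. The sketch below contrasts it with the non-parametric pairwise estimator on small invented samples, where the latter can collapse to zero; the Bayesian variant preferred by the authors is not reproduced here.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)

def parametric_risk(x, y):
    """MLE plug-in for P(X > Y) under independent normal X and Y."""
    delta = x.mean() - y.mean()
    s = np.sqrt(x.var() + y.var())   # MLE (ddof=0) variances
    return float(stats.norm.cdf(delta / s))

def nonparametric_risk(x, y):
    """Fraction of all (x_i, y_j) pairs with x_i > y_j."""
    return float((x[:, None] > y[None, :]).mean())

# Small invented samples: log exposure well below log effect levels.
x = rng.normal(0.0, 1.0, 15)   # log exposure concentrations
y = rng.normal(4.0, 1.0, 8)    # log effect concentrations
print("parametric:   ", parametric_risk(x, y))
print("nonparametric:", nonparametric_risk(x, y))
```

With only 15 × 8 pairs, the non-parametric estimate is bounded below by 0 and above by coarse increments of 1/120, while the parametric plug-in can still resolve a small positive risk, which is the abstract's central point.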

  7. Uncertainties in estimating health risks associated with exposure to ionising radiation.

    PubMed

    Preston, R Julian; Boice, John D; Brill, A Bertrand; Chakraborty, Ranajit; Conolly, Rory; Hoffman, F Owen; Hornung, Richard W; Kocher, David C; Land, Charles E; Shore, Roy E; Woloschak, Gayle E

    2013-09-01

    The information for the present discussion on the uncertainties associated with estimation of radiation risks and probability of disease causation was assembled for the recently published NCRP Report No. 171 on this topic. This memorandum provides a timely overview of the topic, given that quantitative uncertainty analysis is the state of the art in health risk assessment and given its potential importance to developments in radiation protection. Over the past decade the increasing volume of epidemiology data and the supporting radiobiology findings have aided in the reduction of uncertainty in the risk estimates derived. However, it is equally apparent that there remain significant uncertainties related to dose assessment, low dose and low dose-rate extrapolation approaches (e.g. the selection of an appropriate dose and dose-rate effectiveness factor), the biological effectiveness where considerations of the health effects of high-LET and lower-energy low-LET radiations are required and the transfer of risks from a population for which health effects data are available to one for which such data are not available. The impact of radiation on human health has focused in recent years on cancer, although there has been a decided increase in the data for noncancer effects together with more reliable estimates of the risk following radiation exposure, even at relatively low doses (notably for cataracts and cardiovascular disease). New approaches for the estimation of hereditary risk have been developed with the use of human data whenever feasible, although the current estimates of heritable radiation effects still are based on mouse data because of an absence of effects in human studies. Uncertainties associated with estimation of these different types of health effects are discussed in a qualitative and semi-quantitative manner as appropriate. The way forward would seem to require additional epidemiological studies, especially studies of low dose and low dose

  8. Quantitative Cyber Risk Reduction Estimation Methodology for a Small Scada Control System

    SciTech Connect

    Miles A. McQueen; Wayne F. Boyer; Mark A. Flynn; George A. Beitel

    2006-01-01

    We propose a new methodology for obtaining a quick quantitative measurement of the risk reduction achieved when a control system is modified with the intent to improve cyber security defense against external attackers. The proposed methodology employs a directed graph called a compromise graph, where the nodes represent stages of a potential attack and the edges represent the expected time-to-compromise for differing attacker skill levels. Time-to-compromise is modeled as a function of known vulnerabilities and attacker skill level. The methodology was used to calculate risk reduction estimates for a specific SCADA system and for a specific set of control system security remedial actions. Despite an 86% reduction in the total number of vulnerabilities, the estimated time-to-compromise was increased only by about 3 to 30% depending on target and attacker skill level.
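The compromise-graph idea can be sketched as a shortest-path computation: nodes are attack stages, edge weights are expected times-to-compromise, and remediation raises edge weights. The stage names and day counts below are invented, and a single scalar per edge stands in for the paper's skill-level-dependent estimates.

```python
import heapq

def fastest_compromise(graph, start, target):
    """Dijkstra over a compromise graph; returns the minimum
    expected time (days) to reach `target` from `start`."""
    dist = {start: 0.0}
    heap = [(0.0, start)]
    while heap:
        d, node = heapq.heappop(heap)
        if node == target:
            return d
        if d > dist.get(node, float("inf")):
            continue
        for nxt, w in graph.get(node, []):
            nd = d + w
            if nd < dist.get(nxt, float("inf")):
                dist[nxt] = nd
                heapq.heappush(heap, (nd, nxt))
    return float("inf")

# Invented SCADA attack stages and expected days-to-compromise.
before = {"internet": [("dmz", 2.0)], "dmz": [("control-lan", 5.0)],
          "control-lan": [("plc", 1.0)]}
after = {"internet": [("dmz", 4.0)], "dmz": [("control-lan", 9.0)],
         "control-lan": [("plc", 1.0)]}   # after remedial actions
t0 = fastest_compromise(before, "internet", "plc")
t1 = fastest_compromise(after, "internet", "plc")
print(f"{t0} -> {t1} days (+{(t1 - t0) / t0:.0%})")  # 8.0 -> 14.0 days (+75%)
```

The ratio of the two path times gives a quantitative risk-reduction figure of the kind the methodology reports, without needing to eliminate every vulnerability.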

  9. Non-parametric estimation of bivariate failure time associations in the presence of a competing risk.

    PubMed

    Bandeen-Roche, Karen; Ning, Jing

    2008-03-01

    Most research on the study of associations among paired failure times has either assumed time invariance or been based on complex measures or estimators. Little has accommodated competing risks. This paper targets the conditional cause-specific hazard ratio, henceforth called the cause-specific cross ratio, a recent modification of the conditional hazard ratio designed to accommodate competing risks data. Estimation is accomplished by an intuitive, non-parametric method that localizes Kendall's tau. Time variance is accommodated through a partitioning of space into 'bins' between which the strength of association may differ. Inferential procedures are developed, small-sample performance is evaluated and the methods are applied to the investigation of familial association in dementia onset.

  10. Estimating relative risk of a log-transformed exposure measured in pools.

    PubMed

    Mitchell, Emily M; Plowden, Torie C; Schisterman, Enrique F

    2016-12-20

    Pooling biospecimens prior to performing laboratory assays is a useful tool to reduce costs, achieve minimum volume requirements and mitigate assay measurement error. When estimating the risk of a continuous, pooled exposure on a binary outcome, specialized statistical techniques are required. Current methods include a regression calibration approach, where the expectation of the individual-level exposure is calculated by adjusting the observed pooled measurement with additional covariate data. While this method employs a linear regression calibration model, we propose an alternative model that can accommodate log-linear relationships between the exposure and predictive covariates. The proposed model permits direct estimation of the relative risk associated with a log-transformation of an exposure measured in pools. Published 2016. This article is a U.S. Government work and is in the public domain in the USA.

  11. Publication Bias Currently Makes an Accurate Estimate of the Benefits of Enrichment Programs Difficult: A Postmortem of Two Meta-Analyses Using Statistical Power Analysis

    ERIC Educational Resources Information Center

    Warne, Russell T.

    2016-01-01

    Recently Kim (2016) published a meta-analysis on the effects of enrichment programs for gifted students. She found that these programs produced substantial effects for academic achievement (g = 0.96) and socioemotional outcomes (g = 0.55). However, given current theory and empirical research these estimates of the benefits of enrichment programs…

  12. Identification of an accurate soil suspension/dispersion modeling method for use in estimating health-based soil cleanup levels of hexavalent chromium in chromite ore processing residues.

    PubMed

    Scott, P K; Finley, B L; Sung, H M; Schulze, R H; Turner, D B

    1997-07-01

    The primary health concern associated with chromite ore processing residues (COPR) at sites in Hudson County, NJ, is the inhalation of Cr(VI) suspended from surface soils. Since health-based soil standards for Cr(VI) will be derived using the inhalation pathway, soil suspension modeling will be necessary to estimate site-specific, health-based soil cleanup levels (HBSCLs). The purpose of this study was to identify the most appropriate particulate emission and air dispersion models for estimating soil suspension at these sites based on their theoretical underpinnings, scientific acceptability, and past performance. The identified modeling approach, the AP-42 particulate emission model and the fugitive dust model (FDM), was used to calculate concentrations of airborne Cr(VI) and TSP at two COPR sites. These estimated concentrations were then compared to concentrations measured at each site. The TSP concentrations calculated using the AP-42/FDM soil suspension modeling approach were all within a factor of 3 of the measured concentrations. The majority of the estimated air concentrations were greater than the measured, indicating that the AP-42/FDM approach tends to overestimate on-site concentrations. The site-specific Cr(VI) HBSCLs for these two sites calculated using this conservative soil suspension modeling approach ranged from 190 to 420 mg/kg.

  13. Estimating Risk from Spillway Gate Systems on Dams Using Condition Assessment Data

    DTIC Science & Technology

    2005-10-01

    ERDC/CERL TR-05-40, Estimating Risk from Spillway Gate Systems on Dams Using Condition Assessment Data. Approved for public release; distribution is unlimited. Recoverable cited reference: Andersen, G., Chouinard, L., and Foltz, S. (1999). Condition Rating Procedures for Earth and Rockfill Embankment Dams. Technical Report REMR-OM-25, Construction Engineering Research Laboratory, U.S. Army.

  14. Estimation of the Long-term Cardiovascular Events Using UKPDS Risk Engine in Metabolic Syndrome Patients.

    PubMed

    Shivakumar, V; Kandhare, A D; Rajmane, A R; Adil, M; Ghosh, P; Badgujar, L B; Saraf, M N; Bodhankar, S L

    2014-03-01

    Long-term cardiovascular complications in metabolic syndrome are a major cause of mortality and morbidity in India, and forecasted estimates in this domain of research are scarcely reported in the literature. The aim of the present investigation is to estimate the cardiovascular events associated with a representative Indian population of patients suffering from metabolic syndrome using the United Kingdom Prospective Diabetes Study risk engine. Patient-level data were collated from 567 patients suffering from metabolic syndrome through structured interviews and physician records regarding the input variables, which were entered into the United Kingdom Prospective Diabetes Study risk engine. The patients with metabolic syndrome were selected according to the guidelines of the National Cholesterol Education Program - Adult Treatment Panel III, modified National Cholesterol Education Program - Adult Treatment Panel III and International Diabetes Federation criteria. A projection for 10 simulated years was run on the engine and the output was determined. The data for each patient were processed using the United Kingdom Prospective Diabetes Study risk engine to calculate an estimate of the forecasted value for the cardiovascular complications after a period of 10 years. The absolute risk (95% confidence interval) for coronary heart disease, fatal coronary heart disease, stroke and fatal stroke for 10 years was 3.79 (1.5-3.2), 9.6 (6.8-10.7), 7.91 (6.5-9.9) and 3.57 (2.3-4.5), respectively. The relative risk (95% confidence interval) for coronary heart disease, fatal coronary heart disease, stroke and fatal stroke was 17.8 (12.98-19.99), 7 (6.7-7.2), 5.9 (4.0-6.6) and 4.7 (3.2-5.7), respectively. Simulated projections of metabolic syndrome patients predict serious life-threatening cardiovascular consequences in the representative cohort of patients in western India.

  15. Estimating risks of importation and local transmission of Zika virus infection.

    PubMed

    Nah, Kyeongah; Mizumoto, Kenji; Miyamatsu, Yuichiro; Yasuda, Yohei; Kinoshita, Ryo; Nishiura, Hiroshi

    2016-01-01

    Background. An international spread of Zika virus (ZIKV) infection has attracted global attention. ZIKV is conveyed by a mosquito vector, Aedes species, which also acts as the vector species of dengue and chikungunya viruses. Methods. Arrival time of ZIKV importation (i.e., the time at which the first imported case was diagnosed) in each imported country was collected from publicly available data sources. Employing a survival analysis model in which the hazard is an inverse function of the effective distance as informed by the airline transportation network data, and using dengue and chikungunya virus transmission data, risks of importation and local transmission were estimated. Results. A total of 78 countries with imported case(s) have been identified, with the arrival time ranging from 1 to 44 weeks since the first ZIKV was identified in Brazil, 2015. Whereas the risk of importation was well explained by the airline transportation network data, the risk of local transmission appeared to be best captured by additionally accounting for the presence of dengue and chikungunya viruses. Discussion. The risk of importation may be high given continued global travel of mildly infected travelers but, considering that the public health concerns over ZIKV infection stems from microcephaly, it is more important to focus on the risk of local and widespread transmission that could involve pregnant women. The predicted risk of local transmission was frequently seen in tropical and subtropical countries with dengue or chikungunya epidemic experience.

  16. Estimating risks of importation and local transmission of Zika virus infection

    PubMed Central

    Nah, Kyeongah; Mizumoto, Kenji; Miyamatsu, Yuichiro; Yasuda, Yohei; Kinoshita, Ryo

    2016-01-01

    Background. An international spread of Zika virus (ZIKV) infection has attracted global attention. ZIKV is conveyed by a mosquito vector, Aedes species, which also acts as the vector species of dengue and chikungunya viruses. Methods. Arrival time of ZIKV importation (i.e., the time at which the first imported case was diagnosed) in each imported country was collected from publicly available data sources. Employing a survival analysis model in which the hazard is an inverse function of the effective distance as informed by the airline transportation network data, and using dengue and chikungunya virus transmission data, risks of importation and local transmission were estimated. Results. A total of 78 countries with imported case(s) have been identified, with the arrival time ranging from 1 to 44 weeks since the first ZIKV was identified in Brazil, 2015. Whereas the risk of importation was well explained by the airline transportation network data, the risk of local transmission appeared to be best captured by additionally accounting for the presence of dengue and chikungunya viruses. Discussion. The risk of importation may be high given continued global travel of mildly infected travelers but, considering that the public health concerns over ZIKV infection stems from microcephaly, it is more important to focus on the risk of local and widespread transmission that could involve pregnant women. The predicted risk of local transmission was frequently seen in tropical and subtropical countries with dengue or chikungunya epidemic experience. PMID:27069825

  17. Cancer risk estimates from radiation therapy for heterotopic ossification prophylaxis after total hip arthroplasty

    SciTech Connect

    Mazonakis, Michalis; Berris, Theoharris; Damilakis, John; Lyraraki, Efrossyni

    2013-10-15

    Purpose: Heterotopic ossification (HO) is a frequent complication following total hip arthroplasty. This study was conducted to calculate the radiation dose to organs-at-risk and estimate the probability of cancer induction from radiotherapy for HO prophylaxis. Methods: Hip irradiation for HO with a 6 MV photon beam was simulated with the aid of a Monte Carlo model. A realistic humanoid phantom representing an average adult patient was implemented in the Monte Carlo environment for dosimetric calculations. The average out-of-field radiation dose to stomach, liver, lung, prostate, bladder, thyroid, breast, uterus, and ovary was calculated. The organ-equivalent dose to the colon, which was partly included within the treatment field, was also determined. Organ dose calculations were carried out using three different field sizes. The dependence of organ doses upon the block insertion into the primary beam for shielding the colon and prosthesis was investigated. The lifetime attributable risk for cancer development was estimated using organ-, age-, and gender-specific risk coefficients. Results: For a typical target dose of 7 Gy, organ doses varied from 1.0 to 741.1 mGy depending on the field dimensions and organ location relative to the field edge. Blocked-field irradiations resulted in a dose range of 1.4–146.3 mGy. The most probable detriment from open-field treatment of male patients was colon cancer, with a high risk of 564.3 × 10⁻⁵ to 837.4 × 10⁻⁵ depending upon the organ dose magnitude and the patient's age. The corresponding colon cancer risk for female patients was (372.2–541.0) × 10⁻⁵. The probability of bladder cancer development was more than 113.7 × 10⁻⁵ and 110.3 × 10⁻⁵ for males and females, respectively. The cancer risk range to other individual organs was reduced to (0.003–68.5) × 10⁻⁵. Conclusions: The risk for cancer induction from radiation therapy for HO prophylaxis after total hip arthroplasty varies considerably by the

  18. Relative risk estimation for malaria disease mapping based on stochastic SIR-SI model in Malaysia

    NASA Astrophysics Data System (ADS)

    Samat, Nor Azah; Ma'arof, Syafiqah Husna Mohd Imam

    2016-10-01

    Disease mapping is a study on the geographical distribution of a disease to represent the epidemiology data spatially. The production of maps is important to identify areas that deserve closer scrutiny or more attention. In this study, a mosquito-borne disease called malaria is the focus of our application. Malaria is caused by parasites of the genus Plasmodium and is transmitted to people through the bites of infected female Anopheles mosquitoes. Precautionary steps need to be taken to prevent the malaria parasite from spreading around the world, especially in the tropical and subtropical countries, where further spread would increase the number of malaria cases. Thus, the purpose of this paper is to discuss a stochastic model employed to estimate the relative risk of malaria disease in Malaysia. The outcomes of the analysis include a malaria risk map for all 16 states in Malaysia, revealing the high- and low-risk areas of malaria occurrences.
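    The relative risks that such disease maps display are, at their simplest, observed-over-expected ratios. The sketch below shows only that baseline notion with made-up numbers, not the stochastic SIR-SI model the paper develops:

    ```python
    def relative_risk(observed, population, overall_rate):
        """Crude relative-risk (SMR-style) estimate for one region:
        observed cases divided by the cases expected under the overall rate."""
        expected = population * overall_rate
        return observed / expected

    # Hypothetical regions with equal population but different case counts.
    rr_high = relative_risk(observed=120, population=1_000_000, overall_rate=1e-4)
    rr_low = relative_risk(observed=60, population=1_000_000, overall_rate=1e-4)
    ```

    Model-based estimators such as the SIR-SI approach then smooth these crude ratios, which are unstable in regions with small populations.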

  19. The economic value of reducing environmental health risks: Contingent valuation estimates of the value of information

    SciTech Connect

    Krieger, D.J.; Hoehn, J.P.

    1999-05-01

    Obtaining economically consistent values for changes in low-probability health risks continues to be a challenge for contingent valuation (CV) as well as for other valuation methods. One of the cited conditions for economic consistency is that estimated values be sensitive to the scope (differences in quantity or quality) of a good described in a CV application. The alleged limitations of CV pose a particular problem for environmental managers, who must often make decisions that affect human health risks. This paper demonstrates that a well-designed CV application can elicit scope-sensitive values even for programs that provide conceptually complex goods such as risk reduction. Specifically, it finds that the amount sport anglers are willing to pay for information about chemical residues in fish varies systematically with informativeness--a relationship suggested by the theory of information value.

  20. Estimation of sport fish harvest for risk and hazard assessment of environmental contaminants

    SciTech Connect

    Poston, T.M.; Strenge, D.L.

    1989-01-01

    Consumption of contaminated fish flesh can be a significant route of human exposure to hazardous chemicals. Estimation of exposure resulting from the consumption of fish requires knowledge of fish consumption and contaminant levels in the edible portion of fish. Realistic figures of sport fish harvest are needed to estimate consumption. Estimates of freshwater sport fish harvest were developed from a review of 72 articles and reports. Descriptive statistics based on fishing pressure were derived from harvest data for four distinct groups of freshwater sport fish in three water types: streams, lakes, and reservoirs. Regression equations were developed to relate harvest to surface area fished where data bases were sufficiently large. Other aspects of estimating human exposure to contaminants in fish flesh that are discussed include use of bioaccumulation factors for trace metals and organic compounds. Using the bioaccumulation factor and the concentration of contaminants in water as variables in the exposure equation may also lead to less precise estimates of tissue concentration. For instance, muscle levels of contaminants may not increase proportionately with increases in water concentrations, leading to overestimation of risk. In addition, estimates of water concentration may be variable or expressed in a manner that does not truly represent biological availability of the contaminant. These factors are discussed. 45 refs., 1 fig., 7 tabs.
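    The exposure pathway discussed above (water concentration, bioaccumulation factor, consumption rate) can be sketched as follows. The numbers and the simple proportional BAF model are illustrative only; as the abstract notes, proportionality can overestimate tissue levels at high water concentrations:

    ```python
    def tissue_concentration(water_conc_mg_per_l, baf_l_per_kg):
        """Contaminant level in edible fish tissue (mg/kg) estimated as
        water concentration times a bioaccumulation factor (BAF)."""
        return water_conc_mg_per_l * baf_l_per_kg

    def daily_intake(tissue_conc_mg_per_kg, consumption_kg_per_day):
        """Human daily contaminant intake (mg/day) from fish consumption."""
        return tissue_conc_mg_per_kg * consumption_kg_per_day

    # Illustrative values: 0.0001 mg/L in water, BAF of 500 L/kg, 30 g/day eaten.
    intake_mg_per_day = daily_intake(tissue_concentration(1e-4, 500.0), 0.03)
    ```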

  1. Estimates of Radiation Doses and Cancer Risk from Food Intake in Korea

    PubMed Central

    2016-01-01

    The aim of this study was to estimate internal radiation doses and lifetime cancer risk from food ingestion. Radiation doses from food intake were calculated using the Korea National Health and Nutrition Examination Survey and the measured radioactivity of 134Cs, 137Cs, and 131I from the Ministry of Food and Drug Safety in Korea. The total number of measurements was 8,496 (3,643 for agricultural products, 644 for livestock products, 43 for milk products, 3,193 for marine products, and 973 for processed food). Cancer risk was calculated by multiplying the estimated committed effective dose by the detriment-adjusted nominal risk coefficients recommended by the International Commission on Radiological Protection. The lifetime committed effective doses from the daily diet ranged from 2.957 to 3.710 mSv. Excess lifetime cancer risks were 14.4-18.1, 0.4-0.5, and 1.8-2.3 per 100,000 for all solid cancers combined, thyroid cancer, and leukemia, respectively. PMID:26770031
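    The risk calculation described here (committed effective dose times a detriment-adjusted nominal risk coefficient) can be sketched as below. The coefficient 5.5e-2 per Sv is the ICRP Publication 103 whole-population cancer value, used purely for illustration; the abstract's figures come from organ- and outcome-specific coefficients, so they differ slightly:

    ```python
    # ICRP 103 detriment-adjusted nominal risk coefficient for cancer,
    # whole population (~5.5e-2 per Sv); illustrative, not the study's
    # organ-specific coefficients.
    RISK_PER_SV = 5.5e-2

    def lifetime_cancer_risk(committed_dose_msv):
        """Excess lifetime cancer risk from a committed effective dose in mSv."""
        return committed_dose_msv * 1e-3 * RISK_PER_SV

    # Lifetime dietary dose range reported above: 2.957-3.710 mSv.
    risk_low = lifetime_cancer_risk(2.957)   # ~16 per 100,000
    risk_high = lifetime_cancer_risk(3.710)  # ~20 per 100,000
    ```

    With the whole-population coefficient this gives roughly 16-20 excess cancers per 100,000, the same order as the abstract's combined figures.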

  2. Radiation Dose and Cancer Risk Estimates in 16-Slice Computed Tomography Coronary Angiography

    PubMed Central

    Einstein, Andrew J.; Sanz, Javier; Dellegrottaglie, Santo; Milite, Margherita; Sirol, Marc; Henzlova, Milena; Rajagopalan, Sanjay

    2008-01-01

    Background Recent advances have led to a rapid increase in the number of computed tomography coronary angiography (CTCA) studies performed. While several studies have reported effective dose (E), no data are available on cancer risk for current CTCA protocols. Methods and Results E and organ doses were estimated, using scanner-derived parameters and Monte Carlo methods, for 50 patients having 16-slice CTCA performed for clinical indications. Lifetime attributable risks (LARs) were estimated with models developed in the National Academies’ Biological Effects of Ionizing Radiation VII report. E of a complete CTCA averaged 9.5 mSv, while that of a complete study, including calcium scoring when indicated, averaged 11.7 mSv. Calcium scoring increased E by 25%, while tube current modulation reduced it by 34% and was more effective at lower heart rates. Organ doses were highest to the lungs and female breast. LAR of cancer incidence from CTCA averaged approximately 1 in 1600 but varied widely between patients, being highest in younger women. For all patients, the greatest risk was from lung cancer. Conclusions CTCA is associated with a non-negligible risk of malignancy. Doses can be reduced by careful attention to scanning protocol. PMID:18371595

  3. Estimates of Radiation Doses and Cancer Risk from Food Intake in Korea.

    PubMed

    Moon, Eun-Kyeong; Ha, Wi-Ho; Seo, Songwon; Jin, Young Woo; Jeong, Kyu Hwan; Yoon, Hae-Jung; Kim, Hyoung-Soo; Hwang, Myung-Sil; Choi, Hoon; Lee, Won Jin

    2016-01-01

    The aim of this study was to estimate internal radiation doses and lifetime cancer risk from food ingestion. Radiation doses from food intake were calculated using the Korea National Health and Nutrition Examination Survey and the measured radioactivity of (134)Cs, (137)Cs, and (131)I from the Ministry of Food and Drug Safety in Korea. The total number of measurements was 8,496 (3,643 for agricultural products, 644 for livestock products, 43 for milk products, 3,193 for marine products, and 973 for processed food). Cancer risk was calculated by multiplying the estimated committed effective dose by the detriment-adjusted nominal risk coefficients recommended by the International Commission on Radiological Protection. The lifetime committed effective doses from the daily diet ranged from 2.957 to 3.710 mSv. Excess lifetime cancer risks were 14.4-18.1, 0.4-0.5, and 1.8-2.3 per 100,000 for all solid cancers combined, thyroid cancer, and leukemia, respectively.

  4. Methods to assess performance of models estimating risk of death in intensive care patients: a review.

    PubMed

    Cook, D A

    2006-04-01

    Models that estimate the probability of death of intensive care unit patients can be used to stratify patients according to the severity of their condition and to control for casemix and severity of illness. These models have been used for risk adjustment in quality monitoring, administration, management and research and as an aid to clinical decision making. Models such as the Mortality Prediction Model family, SAPS II, APACHE II, APACHE III and the organ system failure models provide estimates of the probability of in-hospital death of ICU patients. This review examines methods to assess the performance of these models. The key attributes of a model are discrimination (the accuracy of the ranking in order of probability of death) and calibration (the extent to which the model's prediction of probability of death reflects the true risk of death). These attributes should be assessed in existing models that predict the probability of patient mortality, and in any subsequent model that is developed for the purposes of estimating these probabilities. The literature contains a range of approaches for assessment which are reviewed and a survey of the methodologies used in studies of intensive care mortality models is presented. The systematic approach used by Standards for Reporting Diagnostic Accuracy provides a framework to incorporate these theoretical considerations of model assessment and recommendations are made for evaluation and presentation of the performance of models that estimate the probability of death of intensive care patients.
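    The two key attributes named in this review, discrimination and calibration, can each be reduced to a one-line statistic. The sketch below uses toy predictions rather than any real ICU model:

    ```python
    def auc(labels, probs):
        """Discrimination: probability that a randomly chosen death is ranked
        above a randomly chosen survivor (concordance, equal to the area
        under the ROC curve); ties count as one half."""
        pos = [p for p, y in zip(probs, labels) if y == 1]
        neg = [p for p, y in zip(probs, labels) if y == 0]
        wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
                   for p in pos for n in neg)
        return wins / (len(pos) * len(neg))

    def calibration_in_the_large(labels, probs):
        """Calibration-in-the-large: observed death rate minus mean
        predicted probability (zero when the mean prediction is right)."""
        return sum(labels) / len(labels) - sum(probs) / len(probs)

    labels = [0, 1, 1, 0, 1, 0]             # 1 = in-hospital death
    probs = [0.1, 0.8, 0.7, 0.6, 0.4, 0.3]  # model-predicted probabilities
    discrimination = auc(labels, probs)
    miscalibration = calibration_in_the_large(labels, probs)
    ```

    Fuller calibration assessments (e.g., Hosmer-Lemeshow style groupings or calibration curves) check agreement across risk strata, not just on average.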

  5. Problems and solutions in the estimation of genetic risks from radiation and chemicals

    SciTech Connect

    Russell, W. L.

    1980-01-01

    Extensive investigations with mice on the effects of various physical and biological factors, such as dose rate, sex, and cell stage, on radiation-induced mutation have provided an evaluation of the genetic hazards of radiation in man. The mutational results obtained in both sexes with progressive lowering of the radiation dose rate have permitted estimation of the mutation frequency expected under the low-level radiation conditions of most human exposure. Supplementing the studies on mutation frequency are investigations on the phenotypic effects of mutations in mice, particularly anatomical disorders of the skeleton, which allow an estimation of the degree of human handicap associated with the occurrence of parallel defects in man. Estimation of the genetic risk from chemical mutagens is much more difficult, and the research is much less advanced. Results on transmitted mutations in mice indicate a poor correlation with mutation induction in non-mammalian organisms.

  6. Estimate of the risks of disposing nonhazardous oil field wastes into salt caverns

    SciTech Connect

    Tomasko, D.; Elcock, D.; Veil, J.

    1997-12-31

    Argonne National Laboratory (ANL) has completed an evaluation of the possibility that adverse human health effects (carcinogenic and noncarcinogenic) could result from exposure to contaminants released from nonhazardous oil field wastes (NOW) disposed in domal salt caverns. Potential human health risks associated with hazardous substances (arsenic, benzene, cadmium, and chromium) in NOW were assessed under four postclosure cavern release scenarios: inadvertent cavern intrusion, failure of the cavern seal, failure of the cavern through cracks or leaky interbeds, and a partial collapse of the cavern roof. To estimate potential human health risks for these scenarios, contaminant concentrations at the receptor were calculated using a one-dimensional solution to an advection/dispersion equation that included first-order degradation. Assuming a single, generic salt cavern and generic oil-field wastes, the best-estimate excess cancer risks ranged from 1.7 × 10⁻¹² to 1.1 × 10⁻⁸ and hazard indices (referring to noncancer health effects) ranged from 7 × 10⁻⁹ to 7 × 10⁻⁴. Under worst-case conditions in which the probability of cavern failure is 1.0, excess cancer risks ranged from 4.9 × 10⁻⁹ to 1.7 × 10⁻⁵ and hazard indices ranged from 7.0 × 10⁻⁴ to 0.07. Even under worst-case conditions, the risks are within the US Environmental Protection Agency (EPA) target range for acceptable exposure levels. From a human health risk perspective, salt caverns can, therefore, provide an acceptable disposal method for NOW.
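    The transport model named in this abstract, one-dimensional advection-dispersion with first-order degradation, has a standard steady-state closed form that conveys the attenuation behaviour. This is only a generic textbook solution with invented parameters, not the report's transient calculation:

    ```python
    import math

    def steady_state_conc(c0, x, v, disp, k):
        """Steady-state concentration at distance x for 1-D advection-dispersion
        with first-order decay:
        C(x) = C0 * exp(x * (v - sqrt(v^2 + 4*k*D)) / (2*D)).
        v: seepage velocity, disp: dispersion coefficient, k: decay rate."""
        return c0 * math.exp(
            x * (v - math.sqrt(v ** 2 + 4.0 * k * disp)) / (2.0 * disp))

    # Decay and dispersion attenuate the plume with travel distance.
    c_near = steady_state_conc(c0=1.0, x=10.0, v=1.0, disp=5.0, k=0.01)
    c_far = steady_state_conc(c0=1.0, x=100.0, v=1.0, disp=5.0, k=0.01)
    ```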

  7. Potential Biases in Estimating Absolute and Relative Case-Fatality Risks during Outbreaks

    PubMed Central

    Lipsitch, Marc; Donnelly, Christl A.; Fraser, Christophe; Blake, Isobel M.; Cori, Anne; Dorigatti, Ilaria; Ferguson, Neil M.; Garske, Tini; Mills, Harriet L.; Riley, Steven; Van Kerkhove, Maria D.; Hernán, Miguel A.

    2015-01-01

    Estimating the case-fatality risk (CFR)—the probability that a person dies from an infection given that they are a case—is a high priority in epidemiologic investigation of newly emerging infectious diseases and sometimes in new outbreaks of known infectious diseases. The data available to estimate the overall CFR are often gathered for other purposes (e.g., surveillance) in challenging circumstances. We describe two forms of bias that may affect the estimation of the overall CFR—preferential ascertainment of severe cases and bias from reporting delays—and review solutions that have been proposed and implemented in past epidemics. Also of interest is the estimation of the causal impact of specific interventions (e.g., hospitalization, or hospitalization at a particular hospital) on survival, which can be estimated as a relative CFR for two or more groups. When observational data are used for this purpose, three more sources of bias may arise: confounding, survivorship bias, and selection due to preferential inclusion in surveillance datasets of those who are hospitalized and/or die. We illustrate these biases and caution against causal interpretation of differential CFR among those receiving different interventions in observational datasets. Again, we discuss ways to reduce these biases, particularly by estimating outcomes in smaller but more systematically defined cohorts ascertained before the onset of symptoms, such as those identified by forward contact tracing. Finally, we discuss the circumstances in which these biases may affect non-causal interpretation of risk factors for death among cases. PMID:26181387
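    The reporting-delay bias described above is easy to see numerically: dividing deaths by all reported cases mid-outbreak understates the eventual CFR because many open cases have not yet resolved. A minimal sketch with hypothetical tallies:

    ```python
    def naive_cfr(deaths, cases):
        """Deaths over all reported cases; biased downward during a growing
        outbreak because recently reported cases have had no time to die."""
        return deaths / cases

    def outcome_based_cfr(deaths, recovered):
        """Deaths over cases with a known outcome; a crude delay correction
        (it can overshoot if deaths are reported faster than recoveries)."""
        return deaths / (deaths + recovered)

    # Hypothetical mid-outbreak tallies: 300 cases, 10 deaths, 40 recoveries.
    cfr_naive = naive_cfr(10, 300)
    cfr_outcomes = outcome_based_cfr(10, 40)
    ```

    The gap between the two estimators narrows as the outbreak winds down and open cases resolve.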

  8. A Multibiomarker-Based Model for Estimating the Risk of Septic Acute Kidney Injury

    PubMed Central

    Wong, Hector R.; Cvijanovich, Natalie Z.; Anas, Nick; Allen, Geoffrey L.; Thomas, Neal J.; Bigham, Michael T.; Weiss, Scott L.; Fitzgerald, Julie; Checchia, Paul A.; Meyer, Keith; Shanley, Thomas P.; Quasney, Michael; Hall, Mark; Gedeit, Rainer; Freishtat, Robert J.; Nowak, Jeffrey; Raj, Shekhar S.; Gertz, Shira; Dawson, Emily; Howard, Kelli; Harmon, Kelli; Lahni, Patrick; Frank, Erin; Hart, Kimberly W.; Lindsell, Christopher J.

    2015-01-01

    Objective The development of acute kidney injury in patients with sepsis is associated with worse outcomes. Identifying those at risk for septic acute kidney injury could help to inform clinical decision making. We derived and tested a multibiomarker-based model to estimate the risk of septic acute kidney injury in children with septic shock. Design Candidate serum protein septic acute kidney injury biomarkers were identified from previous transcriptomic studies. Model derivation involved measuring these biomarkers in serum samples from 241 subjects with septic shock obtained during the first 24 hours of admission and then using a Classification and Regression Tree approach to estimate the probability of septic acute kidney injury 3 days after the onset of septic shock, defined as at least a two-fold increase from baseline serum creatinine. The model was then tested in a separate cohort of 200 subjects. Setting Multiple PICUs in the United States. Interventions None other than standard care. Measurements and Main Results The decision tree included a first-level decision node based on day 1 septic acute kidney injury status and five subsequent biomarker-based decision nodes. The area under the curve for the tree was 0.95 (95% CI, 0.91–0.99), with a sensitivity of 93% and a specificity of 88%. The tree was superior to day 1 septic acute kidney injury status alone for estimating day 3 septic acute kidney injury risk. In the test cohort, the tree had an area under the curve of 0.83 (95% CI, 0.72–0.95), with a sensitivity of 85% and a specificity of 77%, and was also superior to day 1 septic acute kidney injury status alone for estimating day 3 septic acute kidney injury risk. Conclusions We have derived and tested a model to estimate the risk of septic acute kidney injury on day 3 of septic shock using a novel panel of biomarkers. The model had very good performance in a test cohort and has test characteristics supporting clinical utility and further prospective evaluation.

  9. Sensitivity Analysis of Median Lifetime on Radiation Risks Estimates for Cancer and Circulatory Disease amongst Never-Smokers

    NASA Technical Reports Server (NTRS)

    Chappell, Lori J.; Cucinotta, Francis A.

    2011-01-01

    Radiation risks are estimated in a competing-risks formalism in which age- or time-after-exposure estimates of increased risk for cancer and circulatory diseases are folded with the probability of surviving to a given age. The survival function, also called the life table, changes with calendar year, gender, smoking status, and other demographic variables. An outstanding problem in risk estimation is the method of risk transfer between an exposed population and a second population in which risks are to be estimated. Approaches used to transfer risks are based on: 1) the multiplicative risk transfer model, in which risks are proportional to background disease rates; and 2) the additive risk transfer model, in which risks are independent of background rates. In addition, a Mixture model is often considered, in which the multiplicative and additive transfer assumptions are given weighted contributions. We studied the influence of the survival probability on the risk of exposure-induced cancer and circulatory disease morbidity and mortality in the Multiplicative transfer model and the Mixture model. Risks for never-smokers (NS) compared to the average U.S. population are estimated to be reduced between 30% and 60%, dependent on model assumptions. Lung cancer is the major contributor to the reduction for NS, with additional contributions from circulatory diseases and cancers of the stomach, liver, bladder, oral cavity, esophagus, colon, a portion of the solid cancer remainder, and leukemia. Greater improvements in risk estimates for NS are possible and would depend on improved understanding of risk transfer models and on elucidating the role of space radiation in the various stages of disease formation (e.g., initiation, promotion, and progression).
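    The transfer assumptions listed above differ only in how an excess risk is mapped onto a target population's baseline rates. A minimal sketch of the Mixture model, with all parameter values invented for illustration:

    ```python
    def mixture_transfer(err, ear, baseline_target, weight):
        """Transferred excess risk as a weighted blend of the multiplicative
        model (excess relative risk scaled by the target baseline rate) and
        the additive model (excess absolute risk, baseline-independent)."""
        multiplicative = err * baseline_target
        additive = ear
        return weight * multiplicative + (1.0 - weight) * additive

    # Never-smokers (NS) have a far lower baseline lung-cancer rate, so the
    # multiplicative component shrinks their transferred risk.
    risk_avg_us = mixture_transfer(err=0.5, ear=0.002,
                                   baseline_target=0.07, weight=0.7)
    risk_ns = mixture_transfer(err=0.5, ear=0.002,
                               baseline_target=0.01, weight=0.7)
    ```

    With a weight of 1 this reduces to pure multiplicative transfer; with a weight of 0, baseline rates drop out entirely.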

  10. Radiation-Induced Leukemia at Doses Relevant to Radiation Therapy: Modeling Mechanisms and Estimating Risks

    NASA Technical Reports Server (NTRS)

    Shuryak, Igor; Sachs, Rainer K.; Hlatky, Lynn; Little, Mark P.; Hahnfeldt, Philip; Brenner, David J.

    2006-01-01

    Because many cancer patients are diagnosed earlier and live longer than in the past, second cancers induced by radiation therapy have become a clinically significant issue. An earlier biologically based model that was designed to estimate risks of high-dose radiation-induced solid cancers included initiation of stem cells to a premalignant state, inactivation of stem cells at high radiation doses, and proliferation of stem cells during cellular repopulation after inactivation. This earlier model predicted the risks of solid tumors induced by radiation therapy but overestimated the corresponding leukemia risks. Methods: To extend the model to radiation-induced leukemias, we analyzed, in addition to cellular initiation, inactivation, and proliferation, a repopulation mechanism specific to the hematopoietic system: long-range migration through the blood stream of hematopoietic stem cells (HSCs) from distant locations. Parameters for the model were derived from HSC biologic data in the literature and from leukemia risks among atomic bomb survivors who were subjected to much lower radiation doses. Results: Proliferating HSCs that migrate from sites distant from the high-dose region include few preleukemic HSCs, thus decreasing the high-dose leukemia risk. The extended model for leukemia provides risk estimates that are consistent with epidemiologic data for leukemia risk associated with radiation therapy over a wide dose range. For example, when applied to an earlier case-control study of 110,000 women undergoing radiotherapy for uterine cancer, the model predicted an excess relative risk (ERR) of 1.9 for leukemia among women who received a large inhomogeneous fractionated external beam dose to the bone marrow (mean = 14.9 Gy), consistent with the measured ERR (2.0, 95% confidence interval [CI] = 0.2 to 6.4; from 3.6 cases expected and 11 cases observed). 
As a corresponding example for brachytherapy, the predicted ERR of 0.80 among women who received an inhomogeneous low

  11. Biokinetic and dosimetric modelling in the estimation of radiation risks from internal emitters.

    PubMed

    Harrison, John

    2009-06-01

    The International Commission on Radiological Protection (ICRP) has developed biokinetic and dosimetric models that enable the calculation of organ and tissue doses for a wide range of radionuclides. These are used to calculate equivalent and effective dose coefficients (dose in Sv Bq(-1) intake), considering occupational and environmental exposures. Dose coefficients have also been given for a range of radiopharmaceuticals used in diagnostic medicine. Using equivalent and effective dose, exposures from external sources and from different radionuclides can be summed for comparison with dose limits, constraints and reference levels that relate to risks from whole-body radiation exposure. Risk estimates are derived largely from follow-up studies of the survivors of the atomic bombings at Hiroshima and Nagasaki in 1945. New dose coefficients will be required following the publication in 2007 of new ICRP recommendations. ICRP biokinetic and dosimetric models are subject to continuing review and improvement, although it is arguable that the degree of sophistication of some of the most recent models is greater than required for the calculation of effective dose to a reference person for the purposes of regulatory control. However, the models are also used in the calculation of best estimates of doses and risks to individuals, in epidemiological studies and to determine probability of cancer causation. Models are then adjusted to best fit the characteristics of the individuals and population under consideration. For example, doses resulting from massive discharges of strontium-90 and other radionuclides to the Techa River from the Russian Mayak plutonium plant in the early years of its operation are being estimated using models adapted to take account of measurements on local residents and other population-specific data. Best estimates of doses to haemopoietic bone marrow, in utero and postnatally, are being used in epidemiological studies of radiation-induced leukaemia

  12. Eclipsing Binaries as Astrophysical Laboratories: CM Draconis - Accurate Absolute Physical Properties of Low Mass Stars and an Independent Estimate of the Primordial Helium Abundance

    NASA Astrophysics Data System (ADS)

    McCook, G. P.; Guinan, E. F.; Saumon, D.; Kang, Y. W.

    1997-05-01

    CM Draconis (Gl 630.1; Vmax = +12.93) is an important eclipsing binary consisting of two dM4.5e stars with an orbital period of 1.2684 days. This binary is a high-velocity star (s = 164 km/s) and the brighter member of a common proper motion pair with a cool, faint white dwarf companion (LP 101-16). CM Dra and its white dwarf companion were once considered by Zwicky to belong to a class of "pygmy stars", but they turned out to be ordinary old, cool white dwarfs or faint red dwarfs. Lacy (ApJ 218, 444) determined the first orbital and physical properties of CM Dra from the analysis of his light and radial velocity curves. In addition to providing directly measured masses, radii, and luminosities for low-mass stars, CM Dra was also recognized by Lacy and later by Paczynski and Sienkiewicz (ApJ 286, 332) as an important laboratory for cosmology, as a possible old Pop II object where it may be possible to determine the primordial helium abundance. Recently, Metcalfe et al. (ApJ 456, 356) obtained accurate RV measures for CM Dra and recomputed refined elements along with its helium abundance. Starting in 1995, we have been carrying out intensive RI photoelectric photometry of CM Dra to obtain well-defined, accurate light curves so that its fundamental properties can be improved and, at the same time, to search for evidence of planets around the binary from planetary transit eclipses. During 1996 and 1997 well-defined light curves were secured and these were combined with the RV measures of Metcalfe et al. (1996) to determine the orbital and physical parameters of the system, including a refined orbital period. A recent version of the Wilson-Devinney program was used to analyze the data. New radii, masses, mean densities, Teff, and luminosities were found, as well as a re-determination of the helium abundance (Y). The results of the recent analyses of the light and RV curves will be presented and modelling results discussed. 
This research is supported by NSF grants AST-9315365

  13. Comparison of the Male Osteoporosis Risk Estimation Score (MORES) With FRAX in Identifying Men at Risk for Osteoporosis

    PubMed Central

    Cass, Alvah R.; Shepherd, Angela J.; Asirot, Rechelle; Mahajan, Manju; Nizami, Maimoona

    2016-01-01

    PURPOSE We wanted to compare the male osteoporosis risk estimation score (MORES) with the fracture risk assessment tool (FRAX) in screening men for osteoporosis. METHODS This study reports analysis of data from the Third National Health and Nutrition Examination Survey (NHANES III), a nationally representative sample of the US population, comparing the operating characteristics of the FRAX and MORES to identify men at risk for osteoporosis using a subset of 1,498 men, aged 50 years and older, with a valid dual-energy x-ray absorptiometry (DXA) scan. DXA-derived bone mineral density using a T score of −2.5 or lower at either the femoral neck or total hip defined the diagnosis of osteoporosis. Outcomes included the operating characteristics, area under the receiver operating characteristic curve, and agreement of the FRAX and MORES. RESULTS Sixty-seven (4.5%) of the 1,498 men had osteoporosis of the hip. The sensitivity, specificity, and area under the curve (AUC) for the MORES were 0.96 (95% CI, 0.87–0.99), 0.61 (95% CI, 0.58–0.63), and 0.87 (95% CI, 0.84–0.91), respectively. The sensitivity, specificity, and AUC for the FRAX were 0.39 (95% CI, 0.27–0.51), 0.89 (95% CI, 0.88–0.91), and 0.79 (95% CI, 0.75–0.84), respectively. Agreement was poor. CONCLUSIONS Compared with the MORES, the FRAX underperformed as a screening strategy for osteoporosis using the threshold score suggested by the US Preventive Services Task Force (USPSTF). An integrated approach that uses the MORES to determine which men should have a DXA scan and the FRAX to guide treatment decisions, based on the risk of a future fracture, identified 82% of men who were candidates for treatment based on National Osteoporosis Foundation guidelines. PMID:27401426

  14. Polydimethylsiloxane-air partition ratios for semi-volatile organic compounds by GC-based measurement and COSMO-RS estimation: Rapid measurements and accurate modelling.

    PubMed

    Okeme, Joseph O; Parnis, J Mark; Poole, Justen; Diamond, Miriam L; Jantunen, Liisa M

    2016-08-01

    Polydimethylsiloxane (PDMS) shows promise for use as a passive air sampler (PAS) for semi-volatile organic compounds (SVOCs). To use PDMS as a PAS, knowledge of its chemical-specific partitioning behaviour and time to equilibrium is needed. Here we report on the effectiveness of two approaches for estimating the partitioning properties of polydimethylsiloxane (PDMS), values of PDMS-to-air partition ratios or coefficients (KPDMS-Air), and time to equilibrium of a range of SVOCs. Measured values of KPDMS-Air, Exp' at 25 °C obtained using the gas chromatography retention method (GC-RT) were compared with estimates from a poly-parameter linear free energy relationship (pp-LFER) and a COSMO-RS oligomer-based model. Target SVOCs included novel flame retardants (NFRs), polybrominated diphenyl ethers (PBDEs), polycyclic aromatic hydrocarbons (PAHs), organophosphate flame retardants (OPFRs), polychlorinated biphenyls (PCBs) and organochlorine pesticides (OCPs). Significant positive relationships were found between log KPDMS-Air, Exp' and estimates made using the pp-LFER model (log KPDMS-Air, pp-LFER) and the COSMOtherm program (log KPDMS-Air, COSMOtherm). The discrepancy and bias between measured and predicted values were much higher for COSMO-RS than for the pp-LFER model, indicating the anticipated better performance of the pp-LFER model over COSMO-RS. Calculations made using measured KPDMS-Air, Exp' values show that a PDMS PAS of 0.1 cm thickness will reach 25% of its equilibrium capacity in ∼1 day for alpha-hexachlorocyclohexane (α-HCH) to ∼500 years for tris(4-tert-butylphenyl) phosphate (TTBPP), which brackets the volatility range of all compounds tested. The results presented show the utility of the GC-RT method for rapid and precise measurements of KPDMS-Air.
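    Time-to-equilibrium figures of the kind quoted above follow from first-order uptake kinetics, f(t) = 1 - exp(-k*t). The rate constants below are invented to reproduce the quoted extremes; in practice k would be derived from KPDMS-Air and the sampler geometry:

    ```python
    import math

    def time_to_fraction(k_per_day, fraction):
        """Time (days) for first-order uptake f(t) = 1 - exp(-k*t)
        to reach a given fraction of equilibrium capacity."""
        return -math.log(1.0 - fraction) / k_per_day

    # Illustrative rate constants spanning the volatile/involatile extremes.
    t25_volatile_days = time_to_fraction(k_per_day=0.3, fraction=0.25)
    t25_involatile_days = time_to_fraction(k_per_day=1.6e-6, fraction=0.25)
    t25_involatile_years = t25_involatile_days / 365.25
    ```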

  15. Estimating Loss-of-Coolant Accident Frequencies for the Standardized Plant Analysis Risk Models

    SciTech Connect

    S. A. Eide; D. M. Rasmuson; C. L. Atwood

    2008-09-01

The U.S. Nuclear Regulatory Commission maintains a set of risk models covering the U.S. commercial nuclear power plants. These standardized plant analysis risk (SPAR) models include several loss-of-coolant accident (LOCA) initiating events such as small (SLOCA), medium (MLOCA), and large (LLOCA). All of these events involve a loss of coolant inventory from the reactor coolant system. In order to maintain a level of consistency across these models, initiating event frequencies generally are based on plant-type average performance, where the plant types are boiling water reactors and pressurized water reactors. For certain risk analyses, these plant-type initiating event frequencies may be replaced by plant-specific estimates. Frequencies for SPAR LOCA initiating events previously were based on results presented in NUREG/CR-5750, but the newest models use results documented in NUREG/CR-6928. The estimates in NUREG/CR-6928 are based on historical data from the initiating events database for pressurized water reactor SLOCA or an interpretation of results presented in the draft version of NUREG-1829. The information in NUREG-1829 can be used several ways, resulting in different estimates for the various LOCA frequencies. Various ways of using NUREG-1829 information to estimate LOCA frequencies were investigated, and this paper presents two methods for the SPAR model standard inputs, which differ from the method used in NUREG/CR-6928. In addition, results obtained from NUREG-1829 are compared with actual operating experience as contained in the initiating events database.

  16. Cancer risk estimation of genotoxic chemicals based on target dose and a multiplicative model

    SciTech Connect

Granath, F.N.; Vaca, C.E.; Ehrenberg, L.G.; Toernqvist, M.A.

    1999-04-01

A mechanistic model and associated procedures are proposed for cancer risk assessment of genotoxic chemicals. As previously shown for ionizing radiation, a linear multiplicative model was found to be compatible with published experimental data for ethylene oxide, acrylamide, and butadiene. Concurrent analysis led to rejection of an additive model. A reanalysis of data for radiogenic cancer in mouse, dog and man shows that the relative risk coefficient is approximately the same for tumors induced in the three species. Doses in vivo, defined as the time-integrated concentrations of ultimate mutagens and expressed in mmol × kg⁻¹ × h (mMh), are, like radiation doses given in Gy or rad, proportional to frequencies of potentially mutagenic events. The radiation dose equivalents of chemical doses are calculated by multiplying chemical doses (in mMh) with the relative genotoxic potencies determined in vitro. In this way the relative cancer incidence increments in rats and mice exposed to ethylene oxide were shown to be about 0.4% per rad-equivalent, in agreement with the data for radiogenic cancer. The analyses suggest that values of the relative risk coefficients for genotoxic chemicals are independent of species and that relative cancer risks determined in animal tests apply also to humans. If reliable animal test data are not available, cancer risks may be estimated by the relative potency. In both cases exposure dose/target dose relationships, the latter via macromolecule adducts, should be determined.
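The dose-equivalence arithmetic described above can be sketched in a few lines. The dose and potency values below are purely illustrative assumptions, not numbers from the paper; only the ~0.4% per rad-equivalent coefficient comes from the abstract.

```python
# Hedged sketch of the rad-equivalent calculation for a chemical target dose.
# The example dose (2 mMh) and potency (5 rad-eq per mMh) are hypothetical.

def rad_equivalent(chemical_dose_mMh, relative_potency):
    """Radiation dose equivalent (rad-eq) of a chemical target dose:
    chemical dose (mMh) times relative genotoxic potency (rad-eq per mMh)."""
    return chemical_dose_mMh * relative_potency

def relative_risk_increment(rad_eq, pct_per_rad_eq=0.4):
    """Relative cancer incidence increment (%) using the ~0.4% per
    rad-equivalent coefficient cited in the abstract."""
    return rad_eq * pct_per_rad_eq

dose = rad_equivalent(2.0, 5.0)        # hypothetical chemical exposure
print(dose)                            # 10.0 rad-equivalents
print(relative_risk_increment(dose))   # 4.0 (% relative increment)
```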

  17. Impact of ground motion characterization on conservatism and variability in seismic risk estimates

    SciTech Connect

    Sewell, R.T.; Toro, G.R.; McGuire, R.K.

    1996-07-01

This study evaluates the impact, on estimates of seismic risk and its uncertainty, of alternative methods of treating and characterizing earthquake ground motions. The objective of this study is to delineate specific procedures and characterizations that may lead to less biased and more precise seismic risk results. This report focuses on sources of conservatism and variability in risk that may be introduced through the analytical processes and ground-motion descriptions which are commonly implemented at the interface of seismic hazard and fragility assessments. In particular, the implications of the common practice of using a single, composite spectral shape to characterize motions of different magnitudes are investigated. Also, the impact of the parameterization of ground motion on fragility and hazard assessments is shown. Examination of these results demonstrates the following. (1) There exists significant conservatism in the review spectra (usually, spectra characteristic of western U.S. earthquakes) that have been used in conducting past seismic risk assessments and seismic margin assessments for eastern U.S. nuclear power plants. (2) There is a strong dependence of seismic fragility on earthquake magnitude when PGA is used as the ground-motion characterization. When, however, magnitude-dependent spectra are anchored to a common measure of elastic spectral acceleration averaged over the appropriate frequency range, seismic fragility shows no important or consistent dependence on either magnitude or strong-motion duration. Use of inelastic spectral acceleration (at the proper frequency) as the ground spectrum anchor demonstrates a very similar result. This study concludes that a single, composite-magnitude spectrum can generally be used to characterize ground motion for fragility assessment without introducing significant bias or uncertainty in seismic risk estimates.

  18. Mathematical model to estimate risk of calcium-containing renal stones

    NASA Technical Reports Server (NTRS)

    Pietrzyk, R. A.; Feiveson, A. H.; Whitson, P. A.

    1999-01-01

    BACKGROUND/AIMS: Astronauts exposed to microgravity during the course of spaceflight undergo physiologic changes that alter the urinary environment so as to increase the risk of renal stone formation. This study was undertaken to identify a simple method with which to evaluate the potential risk of renal stone development during spaceflight. METHOD: We used a large database of urinary risk factors obtained from 323 astronauts before and after spaceflight to generate a mathematical model with which to predict the urinary supersaturation of calcium stone forming salts. RESULT: This model, which involves the fewest possible analytical variables (urinary calcium, citrate, oxalate, phosphorus, and total volume), reliably and accurately predicted the urinary supersaturation of the calcium stone forming salts when compared to results obtained from a group of 6 astronauts who collected urine during flight. CONCLUSIONS: The use of this model will simplify both routine medical monitoring during spaceflight as well as the evaluation of countermeasures designed to minimize renal stone development. This model also can be used for Earth-based applications in which access to analytical resources is limited.
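The abstract names the model's five inputs (urinary calcium, citrate, oxalate, phosphorus, and total volume) but not its functional form or coefficients, so the sketch below assumes a log-linear predictor. The function name, coefficients, intercept, and input values are hypothetical placeholders, not the published model.

```python
# Illustrative sketch only: a log-linear supersaturation predictor with
# made-up coefficients. Signs follow stone-risk physiology: calcium and
# oxalate raise the prediction; citrate and urine volume lower it.
import math

def predict_supersaturation(calcium, citrate, oxalate, phosphorus, volume,
                            coefs=(0.8, -0.5, 0.6, 0.3, -0.9), intercept=0.2):
    """Predicted supersaturation ratio as exp of a linear combination of
    the five urinary variables named in the abstract."""
    x = (calcium, citrate, oxalate, phosphorus, volume)
    return math.exp(intercept + sum(b * v for b, v in zip(coefs, x)))

print(predict_supersaturation(1.0, 1.0, 0.5, 1.0, 2.0))
print(predict_supersaturation(2.0, 1.0, 0.5, 1.0, 2.0))  # higher calcium -> higher predicted risk
```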

  19. Longer genotypically-estimated leukocyte telomere length is associated with increased adult glioma risk

    PubMed Central

    Walsh, Kyle M.; Codd, Veryan; Rice, Terri; Nelson, Christopher P.; Smirnov, Ivan V.; McCoy, Lucie S.; Hansen, Helen M.; Elhauge, Edward; Ojha, Juhi; Francis, Stephen S.; Madsen, Nils R.; Bracci, Paige M.; Pico, Alexander R.; Molinaro, Annette M.; Tihan, Tarik; Berger, Mitchel S.; Chang, Susan M.; Prados, Michael D.; Jenkins, Robert B.; Wiemels, Joseph L.; Samani, Nilesh J.; Wiencke, John K.; Wrensch, Margaret R.

    2015-01-01

Telomere maintenance has emerged as an important molecular feature with impacts on adult glioma susceptibility and prognosis. Whether longer or shorter leukocyte telomere length (LTL) is associated with glioma risk remains elusive and is often confounded by the effects of age and patient treatment. We sought to determine if genotypically-estimated LTL is associated with glioma risk and if inherited single nucleotide polymorphisms (SNPs) that are associated with LTL are glioma risk factors. Using a Mendelian randomization approach, we assessed differences in genotypically-estimated relative LTL in two independent glioma case-control datasets from the UCSF Adult Glioma Study (652 patients and 3735 controls) and The Cancer Genome Atlas (478 non-overlapping patients and 2559 controls). LTL estimates were based on a weighted linear combination of subject genotype at eight SNPs, previously associated with LTL in the ENGAGE Consortium Telomere Project. Mean estimated LTL was 31 bp (5.7%) longer in glioma patients than controls in discovery analyses (P = 7.82×10⁻⁸) and 27 bp (5.0%) longer in glioma patients than controls in replication analyses (P = 1.48×10⁻³). Glioma risk increased monotonically with each increasing septile of LTL (O.R.=1.12; P = 3.83×10⁻¹²). Four LTL-associated SNPs were significantly associated with glioma risk in pooled analyses, including those in the telomerase component genes TERC (O.R.=1.14; 95% C.I.=1.03-1.28) and TERT (O.R.=1.39; 95% C.I.=1.27-1.52), and those in the CST complex genes OBFC1 (O.R.=1.18; 95% C.I.=1.05-1.33) and CTC1 (O.R.=1.14; 95% C.I.=1.02-1.28). Future work is needed to characterize the role of the CST complex in gliomagenesis and further elucidate the complex balance between ageing, telomere length, and molecular carcinogenesis. PMID:26646793
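The genotypic LTL estimate described above is a weighted linear combination of allele counts at telomere-associated SNPs. The sketch below shows that arithmetic only; the allele counts and per-allele weights are made-up placeholders, not the ENGAGE Consortium effect sizes.

```python
# Sketch of a genotypically-estimated relative LTL score: sum over SNPs of
# (copies of the length-associated allele) x (per-allele effect, e.g. in bp).
# The eight counts and weights below are hypothetical.

def genetic_ltl_score(allele_counts, weights):
    """allele_counts: 0/1/2 copies of the length-associated allele per SNP;
    weights: per-allele effect on telomere length."""
    if len(allele_counts) != len(weights):
        raise ValueError("one weight per SNP is required")
    return sum(g * w for g, w in zip(allele_counts, weights))

counts  = [0, 1, 2, 1, 0, 2, 1, 1]                     # hypothetical subject
weights = [12.0, 8.5, -5.0, 20.1, 7.3, 3.2, 15.0, -2.4]  # hypothetical effects
print(genetic_ltl_score(counts, weights))  # relative LTL estimate
```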

  20. Longer genotypically-estimated leukocyte telomere length is associated with increased adult glioma risk.

    PubMed

    Walsh, Kyle M; Codd, Veryan; Rice, Terri; Nelson, Christopher P; Smirnov, Ivan V; McCoy, Lucie S; Hansen, Helen M; Elhauge, Edward; Ojha, Juhi; Francis, Stephen S; Madsen, Nils R; Bracci, Paige M; Pico, Alexander R; Molinaro, Annette M; Tihan, Tarik; Berger, Mitchel S; Chang, Susan M; Prados, Michael D; Jenkins, Robert B; Wiemels, Joseph L; Samani, Nilesh J; Wiencke, John K; Wrensch, Margaret R

    2015-12-15

Telomere maintenance has emerged as an important molecular feature with impacts on adult glioma susceptibility and prognosis. Whether longer or shorter leukocyte telomere length (LTL) is associated with glioma risk remains elusive and is often confounded by the effects of age and patient treatment. We sought to determine if genotypically-estimated LTL is associated with glioma risk and if inherited single nucleotide polymorphisms (SNPs) that are associated with LTL are glioma risk factors. Using a Mendelian randomization approach, we assessed differences in genotypically-estimated relative LTL in two independent glioma case-control datasets from the UCSF Adult Glioma Study (652 patients and 3735 controls) and The Cancer Genome Atlas (478 non-overlapping patients and 2559 controls). LTL estimates were based on a weighted linear combination of subject genotype at eight SNPs, previously associated with LTL in the ENGAGE Consortium Telomere Project. Mean estimated LTL was 31 bp (5.7%) longer in glioma patients than controls in discovery analyses (P = 7.82×10⁻⁸) and 27 bp (5.0%) longer in glioma patients than controls in replication analyses (P = 1.48×10⁻³). Glioma risk increased monotonically with each increasing septile of LTL (O.R.=1.12; P = 3.83×10⁻¹²). Four LTL-associated SNPs were significantly associated with glioma risk in pooled analyses, including those in the telomerase component genes TERC (O.R.=1.14; 95% C.I.=1.03-1.28) and TERT (O.R.=1.39; 95% C.I.=1.27-1.52), and those in the CST complex genes OBFC1 (O.R.=1.18; 95% C.I.=1.05-1.33) and CTC1 (O.R.=1.14; 95% C.I.=1.02-1.28). Future work is needed to characterize the role of the CST complex in gliomagenesis and further elucidate the complex balance between ageing, telomere length, and molecular carcinogenesis.

  1. Patient-specific radiation dose and cancer risk estimation in CT: Part I. Development and validation of a Monte Carlo program

    SciTech Connect

    Li Xiang; Samei, Ehsan; Segars, W. Paul; Sturgeon, Gregory M.; Colsher, James G.; Toncheva, Greta; Yoshizumi, Terry T.; Frush, Donald P.

    2011-01-15

    Purpose: Radiation-dose awareness and optimization in CT can greatly benefit from a dose-reporting system that provides dose and risk estimates specific to each patient and each CT examination. As the first step toward patient-specific dose and risk estimation, this article aimed to develop a method for accurately assessing radiation dose from CT examinations. Methods: A Monte Carlo program was developed to model a CT system (LightSpeed VCT, GE Healthcare). The geometry of the system, the energy spectra of the x-ray source, the three-dimensional geometry of the bowtie filters, and the trajectories of source motions during axial and helical scans were explicitly modeled. To validate the accuracy of the program, a cylindrical phantom was built to enable dose measurements at seven different radial distances from its central axis. Simulated radial dose distributions in the cylindrical phantom were validated against ion chamber measurements for single axial scans at all combinations of tube potential and bowtie filter settings. The accuracy of the program was further validated using two anthropomorphic phantoms (a pediatric one-year-old phantom and an adult female phantom). Computer models of the two phantoms were created based on their CT data and were voxelized for input into the Monte Carlo program. Simulated dose at various organ locations was compared against measurements made with thermoluminescent dosimetry chips for both single axial and helical scans. Results: For the cylindrical phantom, simulations differed from measurements by -4.8% to 2.2%. For the two anthropomorphic phantoms, the discrepancies between simulations and measurements ranged between (-8.1%, 8.1%) and (-17.2%, 13.0%) for the single axial scans and the helical scans, respectively. Conclusions: The authors developed an accurate Monte Carlo program for assessing radiation dose from CT examinations. When combined with computer models of actual patients, the program can provide accurate dose

  2. Arm span and ulnar length are reliable and accurate estimates of recumbent length and height in a multiethnic population of infants and children under 6 years of age.

    PubMed

    Forman, Michele R; Zhu, Yeyi; Hernandez, Ladia M; Himes, John H; Dong, Yongquan; Danish, Robert K; James, Kyla E; Caulfield, Laura E; Kerver, Jean M; Arab, Lenore; Voss, Paula; Hale, Daniel E; Kanafani, Nadim; Hirschfeld, Steven

    2014-09-01

Surrogate measures are needed when recumbent length or height is unobtainable or unreliable. Arm span has been used as a surrogate but is not feasible in children with shoulder or arm contractures. Ulnar length is not usually impaired by joint deformities, yet its utility as a surrogate has not been adequately studied. In this cross-sectional study, we aimed to examine the accuracy and reliability of ulnar length measured by different tools as a surrogate measure of recumbent length and height. Anthropometrics [recumbent length, height, arm span, and ulnar length by caliper (ULC), ruler (ULR), and grid (ULG)] were measured in 1479 healthy infants and children aged <6 y across 8 study centers in the United States. Multivariate mixed-effects linear regression models for recumbent length and height were developed by using ulnar length and arm span as surrogate measures. The agreement between the measured length or height and the values predicted by ULC, ULR, ULG, and arm span was examined by Bland-Altman plots. All 3 measures of ulnar length and arm span were highly correlated with length and height. The degree of precision of prediction equations for length by ULC, ULR, and ULG (R² = 0.95, 0.95, and 0.92, respectively) was comparable with that by arm span (R² = 0.97) using age, sex, and ethnicity as covariates; however, height prediction by ULC (R² = 0.87), ULR (R² = 0.85), and ULG (R² = 0.88) was less comparable with arm span (R² = 0.94). Our study demonstrates that arm span and ULC, ULR, or ULG can serve as accurate and reliable surrogate measures of recumbent length and height in healthy children; however, ULC, ULR, and ULG tend to slightly overestimate length and height in young infants and children. Further testing of ulnar length as a surrogate is warranted in physically impaired or nonambulatory children.

  3. Estimated drinking water fluoride exposure and risk of hip fracture: a cohort study.

    PubMed

    Näsman, P; Ekstrand, J; Granath, F; Ekbom, A; Fored, C M

    2013-11-01

The cariostatic benefit from water fluoridation is indisputable, but the evidence on possible adverse effects on bone and fracture risk from fluoride exposure is ambiguous. The association between long-term (chronic) drinking water fluoride exposure and hip fracture (ICD-7-9: '820' and ICD-10: 'S72.0-S72.2') was assessed in Sweden using nationwide registers. All individuals born in Sweden between January 1, 1900 and December 31, 1919, alive and living in their municipality of birth at the time of start of follow-up, were eligible for this study. Information on the study population (n = 473,277) was linked among the Swedish National In-Patient Register (IPR), the Swedish Cause of Death Register, and the Register of Population and Population Changes. Estimated individual drinking water fluoride exposure was stratified into 4 categories: very low, < 0.3 mg/L; low, 0.3 to 0.69 mg/L; medium, 0.7 to 1.49 mg/L; and high, ≥ 1.5 mg/L. Overall, we found no association between chronic fluoride exposure and the occurrence of hip fracture. The risk estimates did not change in analyses restricted to only low-trauma osteoporotic hip fractures. Chronic fluoride exposure from drinking water does not seem to have any important effects on the risk of hip fracture, in the investigated exposure range.

  4. Relative risk estimation of Chikungunya disease in Malaysia: An analysis based on Poisson-gamma model

    NASA Astrophysics Data System (ADS)

    Samat, N. A.; Ma'arof, S. H. Mohd Imam

    2015-05-01

Disease mapping is a method to display the geographical distribution of disease occurrence, which generally involves the use and interpretation of a map to show the incidence of certain diseases. Relative risk (RR) estimation is one of the most important issues in disease mapping. This paper begins by providing a brief overview of Chikungunya disease. This is followed by a review of the classical model used in disease mapping, based on the standardized morbidity ratio (SMR), which we then apply to our Chikungunya data. We then fit an extension of the classical model, which we refer to as a Poisson-Gamma model, in which prior distributions for the relative risks are assumed known. Both results are displayed and compared using maps; the Poisson-Gamma model yields a smoother map with fewer extreme values of estimated relative risk. Extensions of this paper will consider other methods that are relevant to overcoming the drawbacks of the existing methods, in order to inform and direct government strategy for monitoring and controlling Chikungunya disease.
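The contrast the abstract draws, the classical SMR versus a shrinkage-producing Poisson-Gamma estimate, can be sketched in a few lines. The Gamma(a, b) prior parameters and the district counts below are illustrative assumptions, not the paper's data.

```python
# Two relative-risk estimators per district: the classical SMR, and the
# Poisson-Gamma posterior mean, which shrinks small-count districts toward
# the prior mean a/b and so produces the smoother map noted in the abstract.

def smr(observed, expected):
    """Classical standardized morbidity ratio: O_i / E_i."""
    return [o / e for o, e in zip(observed, expected)]

def poisson_gamma_rr(observed, expected, a=2.0, b=2.0):
    """Posterior mean RR under O_i ~ Poisson(RR_i * E_i), RR_i ~ Gamma(a, b):
    E[RR_i | O_i] = (O_i + a) / (E_i + b). Prior values here are illustrative."""
    return [(o + a) / (e + b) for o, e in zip(observed, expected)]

obs = [0, 5, 50]          # case counts per district (hypothetical)
exp = [1.0, 5.0, 48.0]    # expected counts from reference rates
print(smr(obs, exp))               # small-count districts give extreme SMRs
print(poisson_gamma_rr(obs, exp))  # same districts shrunk toward a/b = 1
```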

  5. Correlations between parameters in risk models: estimation and propagation of uncertainty by Markov Chain Monte Carlo.

    PubMed

    Ades, A E; Lu, G

    2003-12-01

    Monte Carlo simulation has become the accepted method for propagating parameter uncertainty through risk models. It is widely appreciated, however, that correlations between input variables must be taken into account if models are to deliver correct assessments of uncertainty in risk. Various two-stage methods have been proposed that first estimate a correlation structure and then generate Monte Carlo simulations, which incorporate this structure while leaving marginal distributions of parameters unchanged. Here we propose a one-stage alternative, in which the correlation structure is estimated from the data directly by Bayesian Markov Chain Monte Carlo methods. Samples from the posterior distribution of the outputs then correctly reflect the correlation between parameters, given the data and the model. Besides its computational simplicity, this approach utilizes the available evidence from a wide variety of structures, including incomplete data and correlated and uncorrelated repeat observations. The major advantage of a Bayesian approach is that, rather than assuming the correlation structure is fixed and known, it captures the joint uncertainty induced by the data in all parameters, including variances and covariances, and correctly propagates this through the decision or risk model. These features are illustrated with examples on emissions of dioxin congeners from solid waste incinerators.

  6. Developing a utility decision framework to evaluate predictive models in breast cancer risk estimation

    PubMed Central

    Wu, Yirong; Abbey, Craig K.; Chen, Xianqiao; Liu, Jie; Page, David C.; Alagoz, Oguzhan; Peissig, Peggy; Onitilo, Adedayo A.; Burnside, Elizabeth S.

    2015-01-01

Combining imaging and genetic information to predict disease presence and progression is being codified into an emerging discipline called “radiogenomics.” Optimal evaluation methodologies for radiogenomics have not been well established. We aim to develop a decision framework based on utility analysis to assess predictive models for breast cancer diagnosis. We garnered Gail risk factors, single nucleotide polymorphisms (SNPs), and mammographic features from a retrospective case-control study. We constructed three logistic regression models built on different sets of predictive features: (1) Gail, (2) Gail + Mammo, and (3) Gail + Mammo + SNP. Then we generated receiver operating characteristic (ROC) curves for the three models. After we assigned utility values for each category of outcomes (true negatives, false positives, false negatives, and true positives), we pursued optimal operating points on ROC curves to achieve maximum expected utility of breast cancer diagnosis. We performed McNemar’s test based on threshold levels at optimal operating points, and found that SNPs and mammographic features played a significant role in breast cancer risk estimation. Our study comprising utility analysis and McNemar’s test provides a decision framework to evaluate predictive models in breast cancer risk estimation. PMID:26835489
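The utility-maximization step described above, assigning utilities to the four outcome categories and picking the ROC operating point with maximum expected utility, can be sketched as follows. The utilities, prevalence, and ROC points are illustrative assumptions, not values from the study.

```python
# Expected utility of an ROC operating point (fpf, tpf) given disease
# prevalence and utilities for TP/FN/TN/FP outcomes; the optimal operating
# point is simply the one maximizing this quantity. All numbers below are
# hypothetical.

def expected_utility(tpf, fpf, prevalence, u_tp, u_fn, u_tn, u_fp):
    p, q = prevalence, 1.0 - prevalence
    return (p * (tpf * u_tp + (1 - tpf) * u_fn)
            + q * (fpf * u_fp + (1 - fpf) * u_tn))

def optimal_operating_point(roc_points, prevalence, u_tp, u_fn, u_tn, u_fp):
    """roc_points: list of (fpf, tpf) pairs; returns the pair maximizing EU."""
    return max(roc_points,
               key=lambda pt: expected_utility(pt[1], pt[0], prevalence,
                                               u_tp, u_fn, u_tn, u_fp))

roc = [(0.0, 0.0), (0.1, 0.6), (0.3, 0.85), (0.6, 0.95), (1.0, 1.0)]
best = optimal_operating_point(roc, prevalence=0.1,
                               u_tp=100.0, u_fn=-500.0, u_tn=10.0, u_fp=-20.0)
print(best)  # (0.3, 0.85)
```

Because missed cancers (u_fn) are costed far more heavily than false alarms (u_fp), the maximizer tolerates a fairly high false-positive fraction.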

  7. Estimating the Size of Populations at High Risk for HIV Using Respondent-Driven Sampling Data

    PubMed Central

    Handcock, Mark S.; Gile, Krista J.; Mar, Corinne M.

    2015-01-01

The study of hard-to-reach populations presents significant challenges. Typically, a sampling frame is not available, and population members are difficult to identify or recruit from broader sampling frames. This is especially true of populations at high risk for HIV/AIDS. Respondent-driven sampling (RDS) is often used in such settings with the primary goal of estimating the prevalence of infection. In such populations, the number of people at risk for infection and the number of people infected are of fundamental importance. This article presents a case-study of the estimation of the size of the hard-to-reach population based on data collected through RDS. We study two populations of female sex workers and men-who-have-sex-with-men in El Salvador. The approach is Bayesian and we consider different forms of prior information, including using the UNAIDS population size guidelines for this region. We show that the method is able to quantify the amount of information on population size available in RDS samples. As separate validation, we compare our results to those estimated by extrapolating from a capture–recapture study of El Salvadorian cities. The results of our case-study are largely comparable to those of the capture–recapture study when they differ from the UNAIDS guidelines. Our method is widely applicable to data from RDS studies and we provide a software package to facilitate this. PMID:25585794

  8. Community-based estimates of incidence and risk factors for childhood pneumonia in Western Sydney.

    PubMed Central

    MacIntyre, C. R.; McIntyre, P. B.; Cagney, M.

    2003-01-01

The aim was to estimate the community incidence and risk factors for all-cause pneumonia in children in Western Sydney, Australia. A cross-sectional randomized computer-assisted telephone interview was conducted in July 2000, in Western Sydney. Parents of 2020 children aged between 5 and 14 years were interviewed about their child's respiratory health since birth. No verification of reported diagnosis was available. Logistic regression analysis was used to determine risk factors for pneumonia. A lifetime diagnosis of pneumonia was reported in 137/2020 (6.8%) children, giving an estimated incidence in the study sample of 7.6/1000 person-years. Radiological confirmation was reported in 85% (117/137). Hospitalization was reported in 41% (56/137) and antibiotic therapy in 93% (127/137) of cases. Using logistic regression modelling, statistically significant associations with pneumonia were a reported history of either asthma, bronchitis or other lung problems and health problems affecting other systems. In most cases, the diagnosis of asthma preceded the diagnosis of pneumonia. The community incidence of all causes of pneumonia is not well enumerated, either in adults or in children. This study provides community-based incidence data. The incidence of hospitalization for pneumonia in this study is comparable to estimates from studies in comparable populations, suggesting that retrospective parental report for memorable events is likely to be valid. We found a relationship between pneumonia and childhood respiratory diseases such as asthma, which has implications for targeted vaccination strategies. PMID:14959775

  9. Yesterday's Japan: A system of flood risk estimation over Japan with remote-sensing precipitation data

    NASA Astrophysics Data System (ADS)

    Kanae, S.; Seto, S.; Yoshimura, K.; Oki, T.

    2008-12-01

A new river discharge prediction and hindcast system covering all of Japan has been developed in order to issue alerts of flood risk. It utilizes the Japan Meteorological Agency's meso-scale model outputs and remote-sensing precipitation data. A statistical approach that accounts for the bias and uncertainty of models is proposed for interpreting the simulated river discharge as a flood risk. A 29-year simulation was implemented to estimate parameters of the Gumbel distribution for the probability of extreme discharge, and the estimated discharge probability index (DPI) showed good agreement with that estimated based on observations. Even more strikingly, high DPI in the simulation corresponded to actual flood damage records. This indicates that real-time simulation of the DPI could potentially provide reasonable flood warnings. A method to overcome the lack of sufficiently long simulation data through the use of a pre-existing long-term simulation to estimate statistical parameters is also proposed. A preliminary flood risk prediction that used operational weather forecast data for 2003 and 2004 gave results similar to those of the 29-year simulation for the Typhoon T0423 event in October 2004, demonstrating the transferability of the technique to real-time prediction. In addition, the usefulness of satellite precipitation data for flood estimation was evaluated via hindcasts using several satellite precipitation datasets. The GSMaP product can detect heavy precipitation events, but floods were not well simulated in many cases because of GSMaP's underestimation. The GSMaP product adjusted using monthly, 1-degree rain gauge information can be used to detect flood events as well as hourly rain gauge observations can. Another quantitative issue is also discussed: when remote-sensing precipitation data are used as input for hindcasts, the precipitation amount is underestimated. Efforts toward improvement will be shown.
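A minimal sketch of the Gumbel-based discharge probability index (DPI) idea follows: fit a Gumbel distribution to a series of annual-maximum discharges, then express a simulated discharge as a non-exceedance probability. The method-of-moments fit and the annual-maximum series are illustrative assumptions, not the paper's procedure or data.

```python
# Fit a Gumbel distribution to annual-maximum discharges by the method of
# moments, then score a current discharge as a probability (the DPI idea).
# The eight annual maxima below are hypothetical.
import math
import statistics

def fit_gumbel(annual_maxima):
    """Method-of-moments Gumbel fit: scale b = sqrt(6)*s/pi,
    location a = mean - (Euler-Mascheroni constant) * b."""
    s = statistics.stdev(annual_maxima)
    b = math.sqrt(6.0) * s / math.pi
    a = statistics.mean(annual_maxima) - 0.5772156649 * b
    return a, b

def dpi(discharge, a, b):
    """Gumbel CDF: probability that the annual maximum does not exceed
    the given discharge; values near 1 flag rare, high-risk floods."""
    return math.exp(-math.exp(-(discharge - a) / b))

maxima = [320.0, 410.0, 290.0, 500.0, 350.0, 450.0, 380.0, 300.0]  # m^3/s, hypothetical
a, b = fit_gumbel(maxima)
print(round(dpi(800.0, a, b), 3))  # far above the record maxima -> DPI near 1
```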

  10. Patient-specific radiation dose and cancer risk estimation in CT: Part II. Application to patients

    SciTech Connect

    Li Xiang; Samei, Ehsan; Segars, W. Paul; Sturgeon, Gregory M.; Colsher, James G.; Toncheva, Greta; Yoshizumi, Terry T.; Frush, Donald P.

    2011-01-15

Purpose: Current methods for estimating and reporting radiation dose from CT examinations are largely patient-generic; the body size and hence dose variation from patient to patient is not reflected. Furthermore, the current protocol designs rely on dose as a surrogate for the risk of cancer incidence, neglecting the strong dependence of risk on age and gender. The purpose of this study was to develop a method for estimating patient-specific radiation dose and cancer risk from CT examinations. Methods: The study included two patients (a 5-week-old female patient and a 12-year-old male patient), who underwent 64-slice CT examinations (LightSpeed VCT, GE Healthcare) of the chest, abdomen, and pelvis at our institution in 2006. For each patient, a nonuniform rational B-spline (NURBS) based full-body computer model was created based on the patient's clinical CT data. Large organs and structures inside the image volume were individually segmented and modeled. Other organs were created by transforming an existing adult male or female full-body computer model (developed from visible human data) to match the framework defined by the segmented organs, referencing the organ volume and anthropometry data in ICRP Publication 89. A Monte Carlo program previously developed and validated for dose simulation on the LightSpeed VCT scanner was used to estimate patient-specific organ dose, from which effective dose and risks of cancer incidence were derived. Patient-specific organ dose and effective dose were compared with patient-generic CT dose quantities in current clinical use: the volume-weighted CT dose index (CTDI{sub vol}) and the effective dose derived from the dose-length product (DLP). Results: The effective dose for the CT examination of the newborn patient (5.7 mSv) was higher but comparable to that for the CT examination of the teenager patient (4.9 mSv) due to the size-based clinical CT protocols at our institution, which employ lower scan techniques for smaller

  11. Estimating the Risk of Renal Stone Events During Long-Duration Spaceflight

    NASA Technical Reports Server (NTRS)

    Reyes, David; Kerstman, Eric; Locke, James

    2014-01-01

Introduction: Given the bone loss and increased urinary calcium excretion in the microgravity environment, persons participating in long-duration spaceflight may have an increased risk for renal stone formation. Renal stones are often an incidental finding of abdominal imaging studies done for other reasons. Thus, some crewmembers may have undiscovered, asymptomatic stones prior to their mission. Methods: An extensive literature search was conducted concerning the natural history of asymptomatic renal stones. For comparison, simulations were done using the Integrated Medical Model (IMM). The IMM is an evidence-based decision support tool that provides risk analysis and has the capability to optimize medical systems for missions by minimizing the occurrence of adverse mission outcomes such as evacuation and loss of crew life within specified mass and volume constraints. Results: The literature on the natural history of asymptomatic renal stones in the general medical population shows that the probability of a symptomatic event is 8% to 34% at 1 to 3 years for stones < 7 mm. Extrapolated to a 6-month mission, for stones < 5 to 7 mm, the risk for any stone event is about 4 to 6%, with a 0.7% to 4% risk for intervention, respectively. IMM simulations compare favorably with risk estimates garnered from the terrestrial literature. The IMM forecasts that symptomatic renal stones may be one of the top drivers for medical evacuation of an International Space Station (ISS) mission. Discussion: Although the likelihood of a stone event is low, the consequences could be severe due to limitations of current ISS medical capabilities. Therefore, these risks need to be quantified to aid planning, limit crew morbidity, and mitigate mission impacts. This will be especially critical for missions beyond Earth orbit, where evacuation may not be an option.

  12. Estimating the Risk of Renal Stone Events during Long-Duration Spaceflight

    NASA Technical Reports Server (NTRS)

    Reyes, David; Kerstman, Eric; Gray, Gary; Locke, James

    2014-01-01

    Introduction: Given the bone loss and increased urinary calcium excretion in the microgravity environment, persons participating in long-duration spaceflight may have an increased risk for renal stone formation. Renal stones are often an incidental finding of abdominal imaging studies done for other reasons. Thus, some crewmembers may have undiscovered, asymptomatic stones prior to their mission. Methods: An extensive literature search was conducted concerning the natural history of asymptomatic renal stones. For comparison, simulations were done using the Integrated Medical Model (IMM). The IMM is an evidence-based decision support tool that provides risk analysis and has the capability to optimize medical systems for missions by minimizing the occurrence of adverse mission outcomes such as evacuation and loss of crew life within specified mass and volume constraints. Results: The literature on the natural history of asymptomatic renal stones in the general medical population shows that the probability of a symptomatic event is 8% to 34% at 1 to 3 years for stones < 7 mm. Extrapolated to a 6-month mission, for stones < 5 to 7 mm, the risk for any stone event is about 4 to 6%, with a 0.7% to 4% risk for intervention, respectively. IMM simulations compare favorably with risk estimates garnered from the terrestrial literature. The IMM forecasts that symptomatic renal stones may be one of the top drivers for medical evacuation of an International Space Station (ISS) mission. Discussion: Although the likelihood of a stone event is low, the consequences could be severe due to limitations of current ISS medical capabilities. Therefore, these risks need to be quantified to aid planning, limit crew morbidity, and mitigate mission impacts. This will be especially critical for missions beyond Earth orbit, where evacuation may not be an option.

  13. Estimating drought risk across Europe from reported drought impacts, drought indices, and vulnerability factors

    NASA Astrophysics Data System (ADS)

    Blauhut, Veit; Stahl, Kerstin; Stagge, James Howard; Tallaksen, Lena M.; De Stefano, Lucia; Vogt, Jürgen

    2016-07-01

    Drought is one of the most costly natural hazards in Europe. Due to its complexity, drought risk, meant as the combination of the natural hazard and societal vulnerability, is difficult to define and challenging to detect and predict, as the impacts of drought are very diverse, covering the breadth of socioeconomic and environmental systems. Pan-European maps of drought risk could inform the elaboration of guidelines and policies to address its documented severity and impact across borders. This work tests the capability of commonly applied drought indices and vulnerability factors to predict annual drought impact occurrence for different sectors and macro regions in Europe and combines information on past drought impacts, drought indices, and vulnerability factors into estimates of drought risk at the pan-European scale. This hybrid approach bridges the gap between traditional vulnerability assessment and probabilistic impact prediction in a statistical modelling framework. Multivariable logistic regression was applied to predict the likelihood of impact occurrence on an annual basis for particular impact categories and European macro regions. The results indicate sector- and macro-region-specific sensitivities of drought indices, with the Standardized Precipitation Evapotranspiration Index (SPEI) for a 12-month accumulation period as the overall best hazard predictor. Vulnerability factors have only limited ability to predict drought impacts as single predictors, with information about land use and water resources being the best vulnerability-based predictors. The application of the hybrid approach revealed strong regional and sector-specific differences in drought risk across Europe. The majority of the best predictor combinations rely on a combination of SPEI for shorter and longer accumulation periods, and a combination of information on land use and water resources. 
The added value of integrating regional vulnerability information with drought risk prediction

  14. Estimating drought risk across Europe from reported drought impacts, hazard indicators and vulnerability factors

    NASA Astrophysics Data System (ADS)

    Blauhut, V.; Stahl, K.; Stagge, J. H.; Tallaksen, L. M.; De Stefano, L.; Vogt, J.

    2015-12-01

    Drought is one of the most costly natural hazards in Europe. Due to its complexity, drought risk, the combination of the natural hazard and societal vulnerability, is difficult to define and challenging to detect and predict, as the impacts of drought are very diverse, covering the breadth of socioeconomic and environmental systems. Pan-European maps of drought risk could inform the elaboration of guidelines and policies to address its documented severity and impact across borders. This work (1) tests the capability of commonly applied hazard indicators and vulnerability factors to predict annual drought impact occurrence for different sectors and macro regions in Europe and (2) combines information on past drought impacts, drought hazard indicators, and vulnerability factors into estimates of drought risk at the pan-European scale. This "hybrid approach" bridges the gap between traditional vulnerability assessment and probabilistic impact forecasting in a statistical modelling framework. Multivariable logistic regression was applied to predict the likelihood of impact occurrence on an annual basis for particular impact categories and European macro regions. The results indicate sector- and macro-region-specific sensitivities of hazard indicators, with the Standardised Precipitation Evapotranspiration Index for a twelve-month aggregation period (SPEI-12) as the overall best hazard predictor. Vulnerability factors have only limited ability to predict drought impacts as single predictors, with information about land use and water resources as the best vulnerability-based predictors. (3) The application of the "hybrid approach" revealed strong regional (NUTS combo level) and sector-specific differences in drought risk across Europe. The majority of best predictor combinations rely on a combination of SPEI for shorter and longer aggregation periods, and a combination of information on land use and water resources. 
The added value of integrating regional vulnerability information
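    The core statistical step in both of these drought-risk studies, multivariable logistic regression predicting annual impact occurrence from hazard and vulnerability predictors, can be sketched in miniature. The feature names and toy data below are illustrative, not from the studies:

```python
import math

def fit_logistic(X, y, lr=0.5, epochs=2000):
    """Fit a logistic regression by batch gradient descent.
    X: list of feature vectors; y: list of 0/1 impact occurrences."""
    n, d = len(X), len(X[0])
    w, b = [0.0] * d, 0.0
    for _ in range(epochs):
        gw, gb = [0.0] * d, 0.0
        for xi, yi in zip(X, y):
            p = 1.0 / (1.0 + math.exp(-(sum(wj * xj for wj, xj in zip(w, xi)) + b)))
            err = p - yi                 # gradient of the log-loss
            for j in range(d):
                gw[j] += err * xi[j]
            gb += err
        w = [wj - lr * gwj / n for wj, gwj in zip(w, gw)]
        b -= lr * gb / n
    return w, b

def predict(w, b, x):
    """Predicted probability of impact occurrence for one year."""
    return 1.0 / (1.0 + math.exp(-(sum(wj * xj for wj, xj in zip(w, x)) + b)))

# Toy annual records: [SPEI-12, water-resource index]; y = 1 if impact reported.
X = [[-2.1, 0.2], [-1.5, 0.4], [-0.3, 0.8], [0.5, 0.9], [-1.8, 0.3], [0.9, 0.7]]
y = [1, 1, 0, 0, 1, 0]
w, b = fit_logistic(X, y)
print(predict(w, b, [-2.0, 0.3]))  # dry year with scarce water -> high likelihood
```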

  15. Estimated Phytanic Acid Intake and Prostate Cancer Risk: a Prospective Cohort Study

    PubMed Central

    Wright, Margaret E.; Bowen, Phyllis; Virtamo, Jarmo; Albanes, Demetrius; Gann, Peter H.

    2013-01-01

    Phytanic acid is a saturated fatty acid found predominantly in red meat and dairy products and may contribute to increases in prostate cancer risk that are observed with higher intakes of these foods. We constructed a novel summary measure of phytanic acid intake and prospectively examined its association with prostate cancer risk in the Alpha-Tocopherol, Beta-Carotene Cancer Prevention Study – a cohort of Finnish male smokers ages 50–69 years. Diet was assessed at baseline in 27,111 participants using a validated 276-item dietary questionnaire. Since phytanic acid is not currently included in food composition tables, we used the published phytanic acid content of 151 major food items to estimate total daily intake. During up to 20 years of follow-up, a total of 1,929 incident prostate cancer cases (including 438 advanced cases) were identified. Higher phytanic acid intake, though unrelated to the risk of localized disease [relative risks and 95% confidence intervals for increasing quartiles of intake = 1.00 (ref), 0.83 (0.68–1.01), 0.76 (0.62–0.94), and 0.91 (0.74–1.13); p trend = 0.23], was associated with increased risks of advanced prostate cancer [RR and 95% CI = 1.00 (ref), 1.43 (1.09–1.89), 1.31 (0.99–1.75), and 1.38 (1.02–1.89); p trend = 0.06]. This association appeared to be driven predominantly by phytanic acid obtained from dairy products (particularly butter). Our study indicates that phytanic acid may contribute to previously observed associations between high-fat animal foods (particularly dairy products) and prostate cancer risk, although some caution is warranted as it may be acting as a surrogate marker of dairy fat. PMID:22120496

  16. Three calibration factors, applied to a rapid sweeping method, can accurately estimate Aedes aegypti (Diptera: Culicidae) pupal numbers in large water-storage containers at all temperatures at which dengue virus transmission occurs.

    PubMed

    Romero-Vivas, C M E; Llinás, H; Falconar, A K I

    2007-11-01

    The ability of a simple sweeping method, coupled to calibration factors, to accurately estimate the total numbers of Aedes aegypti (L.) (Diptera: Culicidae) pupae in water-storage containers (20-6412-liter capacities at different water levels) throughout their main dengue virus transmission temperature range was evaluated. Using this method, one set of three calibration factors was derived that could accurately estimate the total Ae. aegypti pupae in their principal breeding sites, large water-storage containers, found throughout the world. No significant differences were obtained using the method at different altitudes (14-1630 m above sea level) that included the range of temperatures (20-30 degrees C) at which dengue virus transmission occurs in the world. In addition, no significant differences were found in the results obtained between and within the 10 different teams that applied this method; the method was therefore extremely robust. One person could estimate the Ae. aegypti pupae in each of the large water-storage containers in only 5 min by using this method, compared with two people requiring between 45 and 90 min to collect and count the total pupae population in each of them. Because the method was both rapid to perform and did not disturb the sediment layers in these domestic water-storage containers, it was more acceptable to the residents and, therefore, ideally suited for routine surveillance purposes and for assessing the efficacy of Ae. aegypti control programs in dengue virus-endemic areas throughout the world.

  17. Estimates of coextinction risk: how anuran parasites respond to the extinction of their hosts.

    PubMed

    Campião, Karla Magalhães; de Aquino Ribas, Augusto Cesar; Cornell, Stephen J; Begon, Michael; Tavares, Luiz Eduardo Roland

    2015-12-01

    Amphibians are known as the most threatened vertebrate group. One of the outcomes of a species' extinction is the coextinction of its dependents. Here, we estimate the extinction risk of helminth parasites of South American anurans. Parasite coextinction probabilities were modeled, assuming parasite specificity and host vulnerability to extinction as determinants. Parasite species associated with few hosts were the most prone to extinction, and extinction risk varied amongst helminth species of different taxonomic groups and life cycle complexity. Considering host vulnerability in the model decreased the extinction probability of most parasite species. However, parasite specificity and host vulnerability combined to increase the extinction probabilities of 44% of the helminth species reported in a single anuran species.

  18. Fall risk probability estimation based on supervised feature learning using public fall datasets.

    PubMed

    Koshmak, Gregory A; Linden, Maria; Loutfi, Amy

    2016-08-01

    Risk of falling is considered among the major threats to the elderly population and has therefore started to play an important role in modern healthcare. With the recent development of sensor technology, the number of studies dedicated to reliable fall detection systems has increased drastically. However, there is still a lack of a universal approach for evaluating the developed algorithms. In the following study we attempt to find publicly available fall datasets and analyze similarities among them using supervised learning. After performing a similarity assessment based on multidimensional scaling, we indicate the most representative feature vector corresponding to each specific dataset. This vector, obtained from real-life data, is subsequently deployed to estimate fall risk probabilities for a statistical fall detection model. Finally, we conclude with some observations regarding the similarity assessment results and provide suggestions towards an efficient approach for the evaluation of fall detection studies.

  19. Risk information in support of cost estimates for the Baseline Environmental Management Report (BEMR). Section 1

    SciTech Connect

    Gelston, G.M.; Jarvis, M.F.; Warren, B.R.; Von Berg, R.

    1995-06-01

    The Pacific Northwest Laboratory (PNL)(1) effort on the overall Baseline Environmental Management Report (BEMR) project consists of four installation-specific work components performed in succession. These components include (1) development of source terms, (2) collection of data and preparation of environmental settings reports, (3) calculation of unit risk factors, and (4) utilization of the unit risk factors in the Automated Remedial Action Methodology (ARAM) for computation of target concentrations and cost estimates. This report documents work completed for the Nevada Test Site, Nevada, for components 2 and 3. The product of this phase of the BEMR project is the development of unit factors (i.e., unit transport factors, unit exposure factors, and unit risk factors). Thousands of these unit factors are generated and fill approximately one megabyte of computer information per installation. The final unit risk factors (URF) are transmitted electronically to BEMR-Cost task personnel as input to a computer program (ARAM). Abstracted files and exhibits of the URF information are included in this report. These visual formats are intended to provide a sample of the final task deliverable (the URF files) which can be easily read without a computer.

  20. Risk estimation of infectious diseases determines the effectiveness of the control strategy

    NASA Astrophysics Data System (ADS)

    Zhang, Haifeng; Zhang, Jie; Li, Ping; Small, Michael; Wang, Binghong

    2011-05-01

    Usually, whether or not to take vaccination is a voluntary decision, which is determined by many factors, from societal factors (such as religious belief and human rights) to individual preferences (including psychology and altruism). Facing the outbreaks of infectious diseases, different people often have different estimates of the risk posed by those diseases. Thus, some people are willing to vaccinate, while others are willing to take risks. In this paper, we establish two different risk assessment systems using the technique of dynamic programming, and then compare the effects of the two systems on the prevention of diseases on complex networks. In one, the perceived probability of being infected is the same for every individual (uniform case). In the other, the perceived probability of being infected is positively correlated with individual degree (preferential case). We show that these two risk assessment systems can yield completely different results, for example in the effectiveness of controlling diseases and in the time evolution of the number of infections.

  1. Waste management programmatic environmental impact statement methodology for estimating human health risks

    SciTech Connect

    Bergenback, B.; Blaylock, B.P.; Legg, J.L.

    1995-05-01

    The US Department of Energy (DOE) has produced large quantities of radioactive and hazardous waste during years of nuclear weapons production. As a result, a large number of sites across the DOE Complex have become chemically and/or radiologically contaminated. In 1990, the Secretary of Energy charged the DOE Office of Environmental Restoration and Waste Management (EM) with the task of preparing a Programmatic Environmental Impact Statement (PEIS). The PEIS should identify and assess the potential environmental impacts of implementing several integrated Environmental Restoration (ER) and Waste Management (WM) alternatives. The determination and integration of appropriate remediation activities and sound waste management practices are vital for ensuring the diminution of adverse human health impacts during site cleanup and waste management programs. This report documents the PEIS risk assessment methodology used to evaluate human health risks posed by WM activities. The methodology presents a programmatic cradle-to-grave risk assessment for EM program activities. A unit dose approach is used to estimate risks posed by WM activities and is the subject of this document.

  2. Social and economic factors of the natural risk increasing: estimation of the Russian regions

    NASA Astrophysics Data System (ADS)

    Petrova, E.

    2004-04-01

    This study attempts to quantitatively assess the social and economic factors that determine the vulnerability of Russian regions to natural risk, to trace the spatial differences in the considered factors, and to group the regions by their similarity. To indicate the regional differences in social and economic development, equipment condition, accumulation of dangerous substances, and social trouble, the four most suitable parameters were estimated: per capita production of Gross Regional Product (GRP), capital consumption, volume of total toxic waste, and crime rate. An increase in the first parameter reduces vulnerability; increases in the last three raise it. Using multidimensional cluster analysis, five types of regions were identified for Russia according to the similarity of the considered parameters. These types are characterized by a higher value of a single (rarely two) chosen parameter, which appears sufficient to increase natural risks in these regions in the near future. Only a few regions, belonging to the fifth type, proved to have a rather high value of GRP and relatively low values of the other parameters. A negative correlation was found between the number of natural disasters (ND) and the per capita GRP in cases where some parameters reached anomalously high values. The distinctions between regions by the different prevailing parameters that increase natural risk help risk managers decide where to focus.

  3. Contribution of molecular analyses to the estimation of the risk of congenital myotonic dystrophy.

    PubMed

    Cobo, A M; Poza, J J; Martorell, L; López de Munain, A; Emparanza, J I; Baiget, M

    1995-02-01

    A molecular analysis of the maternal and child CTG repeat size and intergenerational amplification was performed in order to estimate the risk of having a child with congenital myotonic dystrophy (CMD). In a study of 124 affected mother-child pairs (42 mother-CMD and 82 mother-non-CMD) the mean maternal CTG allele in CMD cases was three times higher (700 repeats) than in non-CMD cases (236 repeats). When the maternal allele was in the 50-300 repeats range, 90% of children were non-CMD. In contrast, when the maternal allele was greater than 300 repeats, 59% inherited the congenital form. Furthermore, the risk of having a CMD child is also related to the intergenerational amplification, which was significantly greater in the mother-CMD pairs than in the mother-non-CMD pairs. Although the risk of giving birth to a CMD child always exists for affected mothers, our data show that such a risk is considerably higher if the maternal allele is greater than 300 repeats.

  4. A global building inventory for earthquake loss estimation and risk management

    USGS Publications Warehouse

    Jaiswal, K.; Wald, D.; Porter, K.

    2010-01-01

    We develop a global database of building inventories using a taxonomy of global building types for use in near-real-time post-earthquake loss estimation and pre-earthquake risk analysis, for the U.S. Geological Survey's Prompt Assessment of Global Earthquakes for Response (PAGER) program. The database is available for public use, subject to peer review, scrutiny, and open enhancement. On a country-by-country level, it contains estimates of the distribution of building types categorized by material, lateral force resisting system, and occupancy type (residential or nonresidential, urban or rural). The database draws on and harmonizes numerous sources: (1) UN statistics, (2) UN Habitat's demographic and health survey (DHS) database, (3) national housing censuses, (4) the World Housing Encyclopedia and (5) other literature. © 2010, Earthquake Engineering Research Institute.

  5. Spatially Interpolated Disease Prevalence Estimation Using Collateral Indicators of Morbidity and Ecological Risk

    PubMed Central

    Congdon, Peter

    2013-01-01

    This paper considers estimation of disease prevalence for small areas (neighbourhoods) when the available observations on prevalence are for an alternative partition of a region, such as service areas. Interpolation to neighbourhoods uses a kernel method extended to take account of two types of collateral information. The first is morbidity and service use data, such as hospital admissions, observed for neighbourhoods. Variations in morbidity and service use are expected to reflect prevalence. The second type of collateral information is ecological risk factors (e.g., pollution indices) that are expected to explain variability in prevalence in service areas, but are typically observed only for neighbourhoods. An application involves estimating neighbourhood asthma prevalence in a London health region involving 562 neighbourhoods and 189 service (primary care) areas. PMID:24129116
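    A bare-bones version of kernel interpolation from service-area observations to a neighbourhood point might look as follows. This is a generic Gaussian-kernel smoother only; the paper's method additionally incorporates the two types of collateral information (morbidity/service-use data and ecological risk factors), which this sketch omits, and the coordinates and rates below are illustrative:

```python
import math

def kernel_interpolate(neigh_xy, service_areas, bandwidth):
    """Interpolate prevalence at a neighbourhood point as a kernel-weighted
    average of service-area prevalence rates.
    service_areas: list of (x, y, prevalence) centroids."""
    wsum = psum = 0.0
    for sx, sy, prev in service_areas:
        d2 = (neigh_xy[0] - sx) ** 2 + (neigh_xy[1] - sy) ** 2
        w = math.exp(-d2 / (2.0 * bandwidth ** 2))  # Gaussian kernel weight
        wsum += w
        psum += w * prev
    return psum / wsum

# Three service-area centroids with observed asthma prevalence rates.
areas = [(0.0, 0.0, 0.10), (5.0, 0.0, 0.04), (0.0, 5.0, 0.07)]
print(kernel_interpolate((1.0, 1.0), areas, bandwidth=2.0))
```

The interpolated value is pulled toward the nearest service area's rate, with the bandwidth controlling how quickly distant areas are discounted.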

  6. Efficient Estimation of Semiparametric Transformation Models for the Cumulative Incidence of Competing Risks.

    PubMed

    Mao, Lu; Lin, D Y

    2017-03-01

    The cumulative incidence is the probability of failure from the cause of interest over a certain time period in the presence of other risks. A semiparametric regression model proposed by Fine and Gray (1999) has become the method of choice for formulating the effects of covariates on the cumulative incidence. Its estimation, however, requires modeling of the censoring distribution and is not statistically efficient. In this paper, we present a broad class of semiparametric transformation models which extends the Fine and Gray model, and we allow for unknown causes of failure. We derive the nonparametric maximum likelihood estimators (NPMLEs) and develop simple and fast numerical algorithms using the profile likelihood. We establish the consistency, asymptotic normality, and semiparametric efficiency of the NPMLEs. In addition, we construct graphical and numerical procedures to evaluate and select models. Finally, we demonstrate the advantages of the proposed methods over the existing ones through extensive simulation studies and an application to a major study on bone marrow transplantation.
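    For orientation, the cumulative incidence itself can be estimated nonparametrically with the standard Aalen-Johansen-type estimator, sketched below. This is not the paper's NPMLE for semiparametric transformation models (which additionally handles covariates and unknown causes); the toy data are illustrative:

```python
def cumulative_incidence(times, events, cause):
    """Nonparametric (Aalen-Johansen) cumulative incidence for one cause.
    times: event/censoring times; events: 0 = censored, k = failure from cause k.
    Returns (time, CIF) at each failure time of the given cause."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv = 1.0   # overall event-free survival just before the current time
    cif = 0.0
    out = []
    i = 0
    while i < len(data):
        t = data[i][0]
        d_cause = d_any = c = 0
        while i < len(data) and data[i][0] == t:  # handle ties at time t
            if data[i][1] == 0:
                c += 1
            else:
                d_any += 1
                if data[i][1] == cause:
                    d_cause += 1
            i += 1
        if d_cause > 0:
            cif += surv * d_cause / n_at_risk     # uses survival at t-
            out.append((t, cif))
        if d_any > 0:
            surv *= 1.0 - d_any / n_at_risk
        n_at_risk -= d_any + c
    return out

times  = [1, 2, 3, 4, 5, 6, 7]
events = [1, 2, 1, 0, 2, 1, 0]   # causes 1 and 2 compete; 0 = censored
print(cumulative_incidence(times, events, cause=1))
```

Unlike one minus a cause-specific Kaplan-Meier curve, this estimator correctly treats failures from the competing cause as removing subjects from risk rather than censoring them.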

  7. The influence of climate change on flood risks in France - first estimates and uncertainty analysis

    NASA Astrophysics Data System (ADS)

    Dumas, P.; Hallegatte, S.; Quintana-Seguì, P.; Martin, E.

    2013-03-01

    This paper proposes a methodology to project the possible evolution of river flood damages due to climate change, and applies it to mainland France. Its main contributions are (i) to demonstrate a methodology to investigate the full causal chain from global climate change to local economic flood losses; (ii) to show that future flood losses may change in a very significant manner over France; (iii) to show that a very large uncertainty arises from the climate downscaling technique, since two techniques with comparable skills at reproducing reference river flows give very different estimates of future flows, and thus of future local losses. The main conclusion is thus that estimating future flood losses is still out of reach, especially at local scale, but that future national-scale losses may change significantly over this century, requiring policy changes in terms of risk management and land-use planning.

  8. Metabolic syndrome risk factors and estimated glomerular filtration rate among children and adolescents.

    PubMed

    Koulouridis, Efstathios; Georgalidis, Kostantinos; Kostimpa, Ioulia; Koulouridis, Ioannis; Krokida, Angeliki; Houliara, Despina

    2010-03-01

    The aim of this study was to examine the possible relationship between estimated glomerular filtration rate (e-GFR) and anthropometric indexes, lipids, insulin sensitivity, and metabolic syndrome risk factors among healthy children and adolescents. Sufficient evidence suggests that obesity is related to a novel form of glomerulopathy named obesity-related glomerulopathy (ORG) among adults, children, and adolescents. Glomerular filtration rate was estimated from serum creatinine in 166 healthy children and adolescents [79 males, 87 females; age 10.6 +/- 3.3 (3-18) years]. Anthropometric indexes and systolic and diastolic blood pressure were measured. Fasting insulin, glucose, creatinine, uric acid, total cholesterol, high-density lipoprotein (HDL)-cholesterol, low-density lipoprotein (LDL)-cholesterol, and triglycerides were estimated. Insulin sensitivity was estimated from known formulas. The presence of certain metabolic syndrome risk factors was checked among the studied population. Boys showed higher e-GFR rates than girls (f = 8.49, p = 0.004). We found a strong positive correlation between e-GFR and body weight (r = 0.415), body mass index (BMI) (r = 0.28), waist circumference (r = 0.419), hip circumference (r = 0.364), birth weight (r = 0.164), systolic blood pressure (SBP) (r = 0.305), and mean arterial pressure (MAP) (r = 0.207). A negative correlation was found between e-GFR and fasting glucose (r = -0.19), total cholesterol (r = -0.27), and LDL-cholesterol (r = -0.26). Clustering of metabolic syndrome risk factors among certain individuals was correlated with higher e-GFR rates (f = 3.606, p = 0.007). The results of this study suggest that gender, anthropometric indexes, and SBP are strong positive determinants of e-GFR among children and adolescents. Waist circumference is the most powerful determinant of e-GFR. Fasting glucose and lipid abnormalities are negative determinants of e-GFR among the studied population. Clustering of metabolic syndrome risk
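    The abstract does not name the estimating equation it used. A common choice for pediatric e-GFR from serum creatinine is the bedside Schwartz formula, sketched here; treat the choice of formula and the coefficient k = 0.413 (the updated 2009 bedside version) as assumptions of this example rather than details of the study:

```python
def schwartz_egfr(height_cm, serum_creatinine_mg_dl, k=0.413):
    """Pediatric eGFR (mL/min/1.73 m^2) via the bedside Schwartz formula:
    eGFR = k * height / serum creatinine."""
    return k * height_cm / serum_creatinine_mg_dl

# A 140 cm child with serum creatinine 0.5 mg/dL:
print(schwartz_egfr(140, 0.5))  # ~115.6 mL/min/1.73 m^2
```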

  9. Impact of alternative metrics on estimates of extent of occurrence for extinction risk assessment.

    PubMed

    Joppa, Lucas N; Butchart, Stuart H M; Hoffmann, Michael; Bachman, Steve P; Akçakaya, H Resit; Moat, Justin F; Böhm, Monika; Holland, Robert A; Newton, Adrian; Polidoro, Beth; Hughes, Adrian

    2016-04-01

    In International Union for Conservation of Nature (IUCN) Red List assessments, extent of occurrence (EOO) is a key measure of extinction risk. However, the way assessors estimate EOO from maps of species' distributions is inconsistent among assessments of different species and among major taxonomic groups. Assessors often estimate EOO from the area of mapped distribution, but these maps often exclude areas that are not habitat in idiosyncratic ways and are not created at the same spatial resolutions. We assessed the impact on extinction risk categories of applying different methods (minimum convex polygon, alpha hull) for estimating EOO for 21,763 species of mammals, birds, and amphibians. Overall, the percentage of threatened species requiring downlisting to a lower category of threat (taking into account other Red List criteria under which they qualified) spanned 11-13% for all species combined (14-15% for mammals, 7-8% for birds, and 12-15% for amphibians). These downlistings resulted from larger estimates of EOO and depended on the EOO calculation method. Using birds as an example, we found that 14% of threatened and near threatened species could require downlisting based on the minimum convex polygon (MCP) approach, an approach that is now recommended by IUCN. Other metrics (such as alpha hull) had marginally smaller impacts. Our results suggest that uniformly applying the MCP approach may lead to a one-time downlisting of hundreds of species but ultimately ensure consistency across assessments and realign the calculation of EOO with the theoretical basis on which the metric was founded.
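    The minimum convex polygon (MCP) calculation of EOO reduces to a convex hull plus a polygon area, which can be sketched in a few lines (the occurrence coordinates below are illustrative; real assessments work on equal-area-projected occurrence records):

```python
def convex_hull(points):
    """Andrew's monotone-chain convex hull; returns vertices counter-clockwise."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts
    def cross(o, a, b):
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])
    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]

def polygon_area(verts):
    """Shoelace formula for the area of a simple polygon."""
    n = len(verts)
    s = sum(verts[i][0] * verts[(i + 1) % n][1]
            - verts[(i + 1) % n][0] * verts[i][1] for i in range(n))
    return abs(s) / 2.0

# Projected occurrence records; the interior point does not affect the MCP.
occurrences = [(0, 0), (4, 0), (4, 3), (0, 3), (2, 1)]
print(polygon_area(convex_hull(occurrences)))  # MCP "extent" = 12.0
```

The alpha-hull alternative compared in the study tightens this polygon by removing long edges, which is why it tends to yield smaller EOO estimates.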

  10. An approximate estimate of the earthquake risk in the United Arab Emirates

    NASA Astrophysics Data System (ADS)

    Al-Homoud, A.; Wyss, M.

    2003-04-01

    The UAE is not as safe from earthquake disasters as often assumed. The magnitude 5.1 earthquake of 11 March 2002 in Fujairah Masafi demonstrated that earthquakes can occur in the UAE. The threat of large earthquakes in southern Iran is well known to seismologists, but people generally do not realize that the international expert team that assessed the earthquake hazard for the entire world placed the UAE into the same class as many parts of Iran and Turkey, as well as California. There is no question that large earthquakes will occur again in southern Iran and that moderate earthquakes will happen again in the UAE. The only question is: when will they happen? From the history of earthquakes, we have an understanding, although limited to the last few decades, of what size earthquakes may be expected. For this reason, it is timely to estimate the probable consequences in the UAE of a large to great earthquake in southern Iran and of a moderate earthquake in the UAE itself. We propose to estimate the number of possible injuries, fatalities, and the financial loss in building value that might occur in the UAE in several likely future earthquakes. This estimate will be based on scenario earthquakes with positions and magnitudes determined by us, based on seismic hazard maps. Scenario earthquakes are events that are very likely to occur in the future, because similar ones have happened in the past. The time when they may happen will not be estimated in this work. The input for calculating the earthquake risk in the UAE, as we propose, will be the census figures for the population and the estimated properties of the building stock. WAPPMERR is the only research group capable of making these estimates for the UAE. The deliverables will be a scientific manuscript to be submitted to a reviewed journal, which will contain tables and figures showing the estimated numbers of (a) people killed and (b) people injured (slightly and seriously counted separately), (c) buildings

  11. The performance of different propensity-score methods for estimating differences in proportions (risk differences or absolute risk reductions) in observational studies.

    PubMed

    Austin, Peter C

    2010-09-10

    Propensity score methods are increasingly being used to estimate the effects of treatments on health outcomes using observational data. There are four methods for using the propensity score to estimate treatment effects: covariate adjustment using the propensity score, stratification on the propensity score, propensity-score matching, and inverse probability of treatment weighting (IPTW) using the propensity score. When outcomes are binary, the effect of treatment on the outcome can be described using odds ratios, relative risks, risk differences, or the number needed to treat. Several clinical commentators suggested that risk differences and numbers needed to treat are more meaningful for clinical decision making than are odds ratios or relative risks. However, there is a paucity of information about the relative performance of the different propensity-score methods for estimating risk differences. We conducted a series of Monte Carlo simulations to examine this issue. We examined bias, variance estimation, coverage of confidence intervals, mean-squared error (MSE), and type I error rates. A doubly robust version of IPTW had superior performance compared with the other propensity-score methods. It resulted in unbiased estimation of risk differences, treatment effects with the lowest standard errors, confidence intervals with the correct coverage rates, and correct type I error rates. Stratification, matching on the propensity score, and covariate adjustment using the propensity score resulted in minor to modest bias in estimating risk differences. Estimators based on IPTW had lower MSE compared with other propensity-score methods. Differences between IPTW and propensity-score matching may reflect that these two methods estimate the average treatment effect and the average treatment effect for the treated, respectively.
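    The basic IPTW risk-difference estimator compared in this study can be sketched as follows. This is a normalized (Hájek-type) IPTW sketch with the propensity scores taken as given; the doubly robust variant that the paper found superior additionally augments the weighting with an outcome model, which is omitted here, and the toy data are illustrative:

```python
def iptw_risk_difference(treated, outcome, propensity):
    """Normalized (Hajek) IPTW estimate of the risk difference
    P(Y=1 | do(treat)) - P(Y=1 | do(control)) from observational data.
    treated: 0/1 treatment indicators; outcome: 0/1 outcomes;
    propensity: each subject's estimated probability of treatment."""
    num1 = den1 = num0 = den0 = 0.0
    for z, y, e in zip(treated, outcome, propensity):
        if z == 1:
            w = 1.0 / e          # treated subjects weighted by 1/e
            num1 += w * y
            den1 += w
        else:
            w = 1.0 / (1.0 - e)  # controls weighted by 1/(1-e)
            num0 += w * y
            den0 += w
    return num1 / den1 - num0 / den0

# Toy data: (treatment, binary outcome, estimated propensity score).
treated    = [1, 1, 1, 0, 0, 0, 1, 0]
outcome    = [1, 0, 1, 0, 1, 0, 1, 0]
propensity = [0.8, 0.6, 0.7, 0.3, 0.4, 0.2, 0.5, 0.5]
print(iptw_risk_difference(treated, outcome, propensity))
```

Normalizing by the weight sums (rather than the raw sample size) stabilizes the estimate when some propensity scores are close to 0 or 1.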

  12. Accurate Evaluation of Quantum Integrals

    NASA Technical Reports Server (NTRS)

    Galant, D. C.; Goorvitch, D.; Witteborn, Fred C. (Technical Monitor)

    1995-01-01

    Combining an appropriate finite difference method with Richardson's extrapolation results in a simple, highly accurate numerical method for solving the Schrodinger equation. Important results are that error estimates are provided, and that expectation values, rather than the wavefunctions, can be extrapolated to obtain highly accurate values. We discuss the eigenvalues and the error growth in repeated Richardson's extrapolation, and show that expectation values calculated on a crude mesh can be extrapolated to high accuracy.
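    The extrapolation step described above can be shown in a few lines: combining two central-difference estimates cancels the leading error term and raises the order of accuracy from O(h^2) to O(h^4). This toy example differentiates sin(x) rather than solving the Schrodinger equation, so it is only a sketch of the extrapolation idea.

```python
import math

def central_diff(f, x, h):
    """Second-order central difference approximation to f'(x)."""
    return (f(x + h) - f(x - h)) / (2.0 * h)

def richardson(f, x, h):
    """One Richardson step: combine D(h) and D(h/2) to cancel the
    leading O(h^2) error term, giving an O(h^4) estimate."""
    return (4.0 * central_diff(f, x, h / 2) - central_diff(f, x, h)) / 3.0

x0, h = 1.0, 0.1
exact = math.cos(x0)                    # derivative of sin at x0
err_plain = abs(central_diff(math.sin, x0, h) - exact)
err_rich = abs(richardson(math.sin, x0, h) - exact)
```

Comparing the two errors also yields the kind of built-in error estimate the abstract mentions: the difference between D(h) and D(h/2) bounds the error of the cruder estimate.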

  13. Concentrations of Prioritized Pharmaceuticals in Effluents from 50 Large Wastewater Treatment Plants in the US and Implications for Risk Estimation

    EPA Pesticide Factsheets

    PDF file of Concentrations of Prioritized Pharmaceuticals in Effluents from 50 Large Wastewater Treatment Plants in the US and Implications for Risk Estimation by Mitchell Kostich, Angella Batt, and James Lazorchak

  14. Bayesian Monte Carlo and Maximum Likelihood Approach for Uncertainty Estimation and Risk Management: Application to Lake Oxygen Recovery Model

    EPA Science Inventory

    Model uncertainty estimation and risk assessment is essential to environmental management and informed decision making on pollution mitigation strategies. In this study, we apply a probabilistic methodology, which combines Bayesian Monte Carlo simulation and Maximum Likelihood e...

  15. Psychosis Prediction: Stratification of Risk Estimation With Information-Processing and Premorbid Functioning Variables

    PubMed Central

    Nieman, Dorien H.; Ruhrmann, Stephan; Dragt, Sara; Soen, Francesca; van Tricht, Mirjam J.; Koelman, Johannes H. T. M.; Bour, Lo J.; Velthorst, Eva; Becker, Hiske E.; Weiser, Mark; Linszen, Don H.; de Haan, Lieuwe

    2014-01-01

    Background: The period preceding the first psychotic episode is regarded as a promising period for intervention. We aimed to develop an optimized prediction model of a first psychosis, considering different sources of information. The outcome of this model may be used for individualized risk estimation. Methods: Sixty-one subjects clinically at high risk (CHR), participating in the Dutch Prediction of Psychosis Study, were assessed at baseline with instruments yielding data on neuropsychology, symptomatology, environmental factors, premorbid adjustment, and neurophysiology. The follow-up period was 36 months. Results: At 36 months, 18 participants (29.5%) had made a transition to psychosis. Premorbid adjustment (P = .001, hazard ratio [HR] = 2.13, 95% CI = 1.39/3.28) and parietal P300 amplitude (P = .004, HR = 1.27, 95% CI = 1.08/1.45) remained as predictors in the Cox proportional hazard model. The resulting prognostic score (PS) showed a sensitivity of 88.9% and a specificity of 82.5%. The area under the curve of the PS was 0.91 (95% CI = 0.83–0.98, cross-validation: 0.86), indicating an outstanding ability of the model to discriminate between transition and nontransition. The PS was further stratified into 3 risk classes establishing a prognostic index. In the class with the worst social-personal adjustment and lowest P300 amplitudes, 74% of the subjects made a transition to psychosis. Furthermore, transition emerged on average more than 17 months earlier than in the lowest risk class. Conclusions: Our results suggest that predicting a first psychotic episode in CHR subjects could be improved with a model including premorbid adjustment and information-processing variables in a multistep algorithm combining risk detection and stratification. PMID:24142369

  16. Estimating Geographical Variation in the Risk of Zoonotic Plasmodium knowlesi Infection in Countries Eliminating Malaria

    PubMed Central

    Shearer, Freya M.; Huang, Zhi; Weiss, Daniel J.; Wiebe, Antoinette; Gibson, Harry S.; Battle, Katherine E.; Pigott, David M.; Brady, Oliver J.; Putaporntip, Chaturong; Jongwutiwes, Somchai; Lau, Yee Ling; Manske, Magnus; Amato, Roberto; Elyazar, Iqbal R. F.; Vythilingam, Indra; Bhatt, Samir; Gething, Peter W.; Singh, Balbir; Golding, Nick; Hay, Simon I.

    2016-01-01

    Background Infection by the simian malaria parasite, Plasmodium knowlesi, can lead to severe and fatal disease in humans, and is the most common cause of malaria in parts of Malaysia. Despite being a serious public health concern, the geographical distribution of P. knowlesi malaria risk is poorly understood because the parasite is often misidentified as one of the human malarias. Human cases have been confirmed in at least nine Southeast Asian countries, many of which are making progress towards eliminating the human malarias. Understanding the geographical distribution of P. knowlesi is important for identifying areas where malaria transmission will continue after the human malarias have been eliminated. Methodology/Principal Findings A total of 439 records of P. knowlesi infections in humans, macaque reservoir and vector species were collated. To predict spatial variation in disease risk, a model was fitted using records from countries where the infection data coverage is high. Predictions were then made throughout Southeast Asia, including regions where infection data are sparse. The resulting map predicts areas of high risk for P. knowlesi infection in a number of countries that are forecast to be malaria-free by 2025 (Malaysia, Cambodia, Thailand and Vietnam) as well as countries projected to be eliminating malaria (Myanmar, Laos, Indonesia and the Philippines). Conclusions/Significance We have produced the first map of P. knowlesi malaria risk, at a fine-scale resolution, to identify priority areas for surveillance based on regions with sparse data and high estimated risk. Our map provides an initial evidence base to better understand the spatial distribution of this disease and its potential wider contribution to malaria incidence. Considering malaria elimination goals, areas for prioritised surveillance are identified. PMID:27494405

  17. Prospective validity of the Estimate of Risk of Adolescent Sexual Offense Recidivism (ERASOR).

    PubMed

    Worling, James R; Bookalam, David; Litteljohn, Ariel

    2012-06-01

    Data from the Estimate of Risk of Adolescent Sexual Offense Recidivism (ERASOR; Worling & Curwen) were collected for a sample of 191 adolescent males who had offended sexually. Adolescents were aged 12 to 19 years (M = 15.34; SD = 1.53) at the time of their participation in a comprehensive assessment. The ERASOR was completed by 1 of 22 clinicians immediately following each assessment. Forty-five adolescents were independently rated by pairs of clinicians, and significant interrater agreement was found for the ERASOR risk factors, the clinical judgment ratings (low, moderate, or high), and a total score. Recidivism data (criminal charges) were subsequently collected from three sources that spanned a follow-up period between 0.1 and 7.9 years (M = 3.66; SD = 2.08). Overall, 9.4% (18 of 191) of the adolescents were charged with a subsequent sexual offense over this time period. A shorter follow-up interval of up to 2.5 years (M = 1.4; SD = 0.71) was also examined. Recidivism data for the shorter follow-up interval were available for a subgroup of 70 adolescents, with a comparable recidivism rate of 8.6% (6 of 70). Clinical judgment ratings, the total score, and the sum of risk factors rated as present were significantly predictive of sexual reoffending for the short follow-up period. The total score and the sum of risk factors were predictive of sexual reoffending over the entire follow-up interval. These results add to the emerging research supporting the reliability and validity of structured risk assessment tools for adolescent sexual recidivism.

  18. Bomb survivor selection and consequences for estimates of population cancer risks.

    PubMed

    Little, M P; Charles, M W

    1990-12-01

    Health records of the Japanese bomb survivor population [with the 1965 (T65D) and 1986 (DS86) dosimetry systems] have been analyzed and some evidence found for the selection effect hypothesized by Stewart and Kneale. This is found to be significant in only the first of the periods examined (1950-1958), and the effect diminishes in magnitude thereafter. There are indications that the effect might be an artifact of the T65D dosimetry, in which it is observed more strongly than in the DS86 data. There is no evidence to suggest that selection on this basis might confer correspondingly reduced susceptibility to radiation-induced cancer. If, however, one makes this assumption, as suggested by Stewart and Kneale, then current estimates of population cancer risks might need to be inflated by between 5% and 35% (for excess cancer deaths, Gy(-1)) or between 8% and 40% (for years of life lost, Gy(-1)) to account for this. It is likely that these figures, even assuming them not to be simply an artifact of the T65D dosimetry, overestimate the degree of adjustment required to the risk estimates.

  19. Applying quality criteria to exposure in asbestos epidemiology increases the estimated risk.

    PubMed

    Burdorf, Alex; Heederik, Dick

    2011-07-01

    Mesothelioma deaths due to environmental exposure to asbestos in The Netherlands led to parliamentary concern that exposure guidelines were not strict enough. The Health Council of the Netherlands was asked for advice. Its report has recently been published. The question of quality of the exposure estimates was studied more systematically than in previous asbestos meta-analyses. Five criteria of quality of exposure information were applied, and cohort studies that failed to meet these were excluded. For lung cancer, this decreased the number of cohorts included from 19 to 3 and increased the risk estimate 3- to 6-fold, with the requirements for good historical data on exposure and job history having the largest effects. It also suggested that the apparent differences in lung cancer potency between amphiboles and chrysotile may be produced by lower quality studies. A similar pattern was seen for mesothelioma. As a result, the Health Council has proposed that the occupational exposure limit be reduced from 10 000 fibres m(-3) (all types) to 250 f m(-3) (amphiboles), 1300 f m(-3) (mixed fibres), and 2000 f m(-3) (chrysotile). The process illustrates the importance of evaluating quality of exposure in epidemiology since poor quality of exposure data will lead to underestimated risk.

  20. A framework for estimating radiation-related cancer risks in Japan from the 2011 Fukushima nuclear accident.

    PubMed

    Walsh, L; Zhang, W; Shore, R E; Auvinen, A; Laurier, D; Wakeford, R; Jacob, P; Gent, N; Anspaugh, L R; Schüz, J; Kesminiene, A; van Deventer, E; Tritscher, A; del Rosario Pérez, M

    2014-11-01

    We present here a methodology for health risk assessment adopted by the World Health Organization that provides a framework for estimating risks from the Fukushima nuclear accident after the March 11, 2011 Japanese major earthquake and tsunami. Substantial attention has been given to the possible health risks associated with human exposure to radiation from damaged reactors at the Fukushima Daiichi nuclear power station. Cumulative doses were estimated and applied for each post-accident year of life, based on a reference level of exposure during the first year after the earthquake. A lifetime cumulative dose of twice the first year dose was estimated for the primary radionuclide contaminants ((134)Cs and (137)Cs) and are based on Chernobyl data, relative abundances of cesium isotopes, and cleanup efforts. Risks for particularly radiosensitive cancer sites (leukemia, thyroid and breast cancer), as well as the combined risk for all solid cancers were considered. The male and female cumulative risks of cancer incidence attributed to radiation doses from the accident, for those exposed at various ages, were estimated in terms of the lifetime attributable risk (LAR). Calculations of LAR were based on recent Japanese population statistics for cancer incidence and current radiation risk models from the Life Span Study of Japanese A-bomb survivors. Cancer risks over an initial period of 15 years after first exposure were also considered. LAR results were also given as a percentage of the lifetime baseline risk (i.e., the cancer risk in the absence of radiation exposure from the accident). The LAR results were based on either a reference first year dose (10 mGy) or a reference lifetime dose (20 mGy) so that risk assessment may be applied for relocated and non-relocated members of the public, as well as for adult male emergency workers. The results show that the major contribution to LAR from the reference lifetime dose comes from the first year dose. For a dose of 10 mGy in

  1. Regional estimation of design precipitation totals by simple scaling for flood risk prediction in Slovakia

    NASA Astrophysics Data System (ADS)

    Bara, Marta; Kohnova, Silvia; Gaal, Ladislav; Szolgay, Jan; Hlavcova, Kamila

    2010-05-01

    Design values of extreme rainfall are of great importance in engineering hydrology, serving for example as input data for hydrological modeling, for the prediction of flood events, or for planning and design in water resources management. Precipitation data with the temporal resolution necessary for estimating design precipitation totals are available from only a limited number of raingauges with continuous recording. One of the advantages of the simple scaling method is that it allows design precipitation totals for the required durations and recurrence intervals to be estimated from daily data, which are available from a denser network of non-recording raingauges. In this study the possibility of using the simple scaling method for regional estimation of design short-term precipitation totals for flood risk forecasting was tested. The analysis includes precipitation data from 56 raingauge stations across the whole territory of Slovakia, distributed into three homogeneous regions based on regionalization of the daily maximum precipitation totals in the warm season (April-September). The regional dimensionless growth curve of daily precipitation maxima was derived in each region, and the local T-year quantiles were estimated by the index value method. In each region three verification stations were selected and treated as ungauged sites, under the assumption that the only information on the precipitation regime at these stations was the index value. Using the regionally averaged scaling exponent, the IDF curves were estimated by downscaling the design daily precipitation totals. The IDF curves were finally compared with those assessed locally in previous studies, and their application in engineering practice was discussed.
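    The downscaling step can be sketched in a few lines: under simple scaling, a design total for duration d follows from the daily design total via a power law in duration. The scaling exponent and the daily quantile below are hypothetical illustrations, not values from the study.

```python
def design_total(p_daily_mm, duration_min, eta, ref_duration_min=1440.0):
    """Downscale a daily design precipitation total to a shorter duration
    via simple scaling: P(d) = P(1440 min) * (d / 1440)**eta."""
    return p_daily_mm * (duration_min / ref_duration_min) ** eta

eta = 0.45        # hypothetical regionally averaged scaling exponent
p_daily = 60.0    # hypothetical T-year daily design total, mm

# A small IDF table for selected durations (minutes).
idf = {d: design_total(p_daily, d, eta) for d in (15, 60, 180, 1440)}
```

The same relation, applied across recurrence intervals via the regional growth curve, yields the full family of IDF curves described in the abstract.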

  2. Estimation of Wild Fire Risk Area based on Climate and Maximum Entropy in Korean Peninsular

    NASA Astrophysics Data System (ADS)

    Kim, T.; Lim, C. H.; Song, C.; Lee, W. K.

    2015-12-01

    The number of forest fires, and the accompanying human injuries and physical damage, has been increased by frequent drought. In this study, forest fire danger zones of Korea are estimated in order to predict and prepare for future forest fire hazard regions. The MaxEnt (Maximum Entropy) model, which estimates the probability distribution of occurrence, is used to delineate the forest fire hazard region. The MaxEnt model was developed primarily for the analysis of species distributions, but its applicability to various natural disasters is gaining recognition. Detailed forest fire occurrence data collected by MODIS for the past 5 years (2010-2014) are used as occurrence data for the model, with meteorology, topography, and vegetation data as environmental variables. In particular, various meteorological variables are used to assess the impact of climate, such as annual average temperature, annual precipitation, dry-season precipitation, annual effective humidity, dry-season effective humidity, and an aridity index. The result was valid based on the AUC (Area Under the Curve) value (0.805), which is used to assess predictive accuracy in the MaxEnt model, and the predicted forest fire locations corresponded closely with the actual forest fire distribution map. Meteorological variables such as effective humidity showed the greatest contribution, and topographic variables such as TWI (Topographic Wetness Index) and slope also contributed to forest fire occurrence. As a result, the east coast and the southern part of the Korean peninsula were predicted to have high forest fire risk, whereas high-altitude mountain areas and the west coast appeared to be safe. These results are similar to those of former studies, indicating high risk of forest fire in accessible areas and reflecting the climatic characteristics of the east and south in the dry season. To sum up, we estimated the forest fire hazard zone with existing forest fire locations and environmental variables and had

  3. Risk estimation for multifactorial diseases. A report of the International Commission on Radiological Protection.

    PubMed

    1999-01-01

    assessing the impact of radiation-induced mutations on the frequencies of multifactorial diseases in the population.The mutation component (MC) of genetic diseases quantifies the responsiveness of the genetic component of a disease to an increase in mutation rate (e.g. after radiation exposure). This report integrates the concepts of liability and threshold (from the MTM model) and of mutation-selection equilibrium (from mechanistic population genetic models) into the 'Finite Locus Threshold Model' (FLTM) for estimating MC for multifactorial diseases and the relationship between MC and h(2) of these diseases. Computer simulation studies illustrate the effects of one-time or a permanent increase in mutation rate on MC for multifactorial diseases.Finally, the report addresses the estimation of the radiation risk of multifactorial diseases. A formal revision of the estimates of risk of multifactorial diseases (and also of mendelian diseases) contained in the 1990 Recommendations of ICRP, Publication 60, must await the results of studies currently underway. While future genetic risk estimates are likely to be lower than those in current use, until the new ones become available, those provided in Publication 60 may be regarded as being adequate for use in radiological protection- they are unlikely to underestimate risk.

  4. Estimating the Risk of Chronic Pain: Development and Validation of a Prognostic Model (PICKUP) for Patients with Acute Low Back Pain

    PubMed Central

    Traeger, Adrian C.; Henschke, Nicholas; Hübscher, Markus; Williams, Christopher M.; Kamper, Steven J.; Maher, Christopher G.; Moseley, G. Lorimer; McAuley, James H.

    2016-01-01

    Background Low back pain (LBP) is a major health problem. Globally it is responsible for the most years lived with disability. The most problematic type of LBP is chronic LBP (pain lasting longer than 3 mo); it has a poor prognosis and is costly, and interventions are only moderately effective. Targeting interventions according to risk profile is a promising approach to prevent the onset of chronic LBP. Developing accurate prognostic models is the first step. No validated prognostic models are available to accurately predict the onset of chronic LBP. The primary aim of this study was to develop and validate a prognostic model to estimate the risk of chronic LBP. Methods and Findings We used the PROGRESS framework to specify a priori methods, which we published in a study protocol. Data from 2,758 patients with acute LBP attending primary care in Australia between 5 November 2003 and 15 July 2005 (development sample, n = 1,230) and between 10 November 2009 and 5 February 2013 (external validation sample, n = 1,528) were used to develop and externally validate the model. The primary outcome was chronic LBP (ongoing pain at 3 mo). In all, 30% of the development sample and 19% of the external validation sample developed chronic LBP. In the external validation sample, the primary model (PICKUP) discriminated between those who did and did not develop chronic LBP with acceptable performance (area under the receiver operating characteristic curve 0.66 [95% CI 0.63 to 0.69]). Although model calibration was also acceptable in the external validation sample (intercept = −0.55, slope = 0.89), some miscalibration was observed for high-risk groups. The decision curve analysis estimated that, if decisions to recommend further intervention were based on risk scores, screening could lead to a net reduction of 40 unnecessary interventions for every 100 patients presenting to primary care compared to a “treat all” approach. Limitations of the method include the model being

  5. Identification and estimation of socioeconomic impacts resulting from perceived risks and changing images; An annotated bibliography

    SciTech Connect

    Nieves, L.A.; Wernette, D.R.; Hemphill, R.C.; Mohiudden, S.; Corso, J.

    1990-02-01

    In 1982, the US Congress passed the Nuclear Waste Policy Act to initiate the process of choosing a location to permanently store high-level nuclear waste; its 1987 amendments designated Yucca Mountain, Nevada, as the only location to be studied as a candidate site for such a repository. The original act and its amendments established the grant mechanism by which the state of Nevada could finance an investigation of the potential socioeconomic impacts that could result from the installation and operation of this facility. Over the past three years, the Office of Civilian Radioactive Waste Management (OCRWM or RW) in the US Department of Energy (DOE) has approved grant requests by Nevada to perform this investigation. This report is intended to update and enhance a literature review conducted by the Human Affairs Research Center (HARC) for the Basalt Waste Isolation Project that dealt with the psychological and sociological processes underlying risk perception. It provides additional information on the HARC work and covers a subsequent step in the impact-estimation process, the translation of risk perception into decisions and behaviors with economic consequences. It also covers recently developed techniques for assessing the nature and magnitude of impacts caused by environmental changes, focusing on those impacts caused by changes in perceived risks.

  6. Estimation of wildfire size and risk changes due to fuels treatments

    USGS Publications Warehouse

    Cochrane, M.A.; Moran, C.J.; Wimberly, M.C.; Baer, A.D.; Finney, M.A.; Beckendorf, K.L.; Eidenshink, J.; Zhu, Z.

    2012-01-01

    Human land use practices, altered climates, and shifting forest and fire management policies have increased the frequency of large wildfires several-fold. Mitigation of potential fire behaviour and fire severity has increasingly been attempted through pre-fire alteration of wildland fuels using mechanical treatments and prescribed fires. Despite annual treatment of more than a million hectares of land, quantitative assessments of the effectiveness of existing fuel treatments at reducing the size of actual wildfires, or of how they might alter the risk of burning across landscapes, are currently lacking. Here, we present a method for estimating spatial probabilities of burning as a function of extant fuels treatments for any wildland fire-affected landscape. We examined the landscape effects of more than 72 000 ha of wildland fuel treatments involved in 14 large wildfires that burned 314 000 ha of forests in nine US states between 2002 and 2010. Fuels treatments altered the probability of fire occurrence both positively and negatively across landscapes, effectively redistributing fire risk by changing surface fire spread rates and reducing the likelihood of crowning behaviour. Trade-offs are created between the formation of large areas with low probabilities of increased burning and smaller, well-defined regions with reduced fire risk.

  7. Estimation of infectious risks in residential populations exposed to airborne pathogens during center pivot irrigation of dairy wastewaters

    Technology Transfer Automated Retrieval System (TEKTRAN)

    In the western United States where dairy wastewaters are commonly land applied, there are concerns over individuals being exposed to airborne pathogens. In response, a quantitative microbial risk assessment (QMRA) was performed to estimate infectious risks after inhalation exposure of pathogens aero...

  8. An unbiased risk estimator for image denoising in the presence of mixed poisson-gaussian noise.

    PubMed

    Le Montagner, Yoann; Angelini, Elsa D; Olivo-Marin, Jean-Christophe

    2014-03-01

    The behavior and performance of denoising algorithms are governed by one or several parameters, whose optimal settings depend on the content of the processed image and the characteristics of the noise, and are generally designed to minimize the mean squared error (MSE) between the denoised image returned by the algorithm and a virtual ground truth. In this paper, we introduce a new Poisson-Gaussian unbiased risk estimator (PG-URE) of the MSE applicable to a mixed Poisson-Gaussian noise model that unifies the widely used Gaussian and Poisson noise models in fluorescence bioimaging applications. We propose a stochastic methodology to evaluate this estimator in the case when little is known about the internal machinery of the considered denoising algorithm, and we analyze both theoretically and empirically the characteristics of the PG-URE estimator. Finally, we evaluate the PG-URE-driven parametrization for three standard denoising algorithms, with and without variance stabilizing transforms, and different characteristics of the Poisson-Gaussian noise mixture.
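    The mixed Poisson-Gaussian model underlying PG-URE can be simulated directly; the snippet below checks its mean-variance relation, Var[z] = alpha*y0 + sigma^2, which any unbiased risk estimator for this model must account for. The gain, read-noise, and intensity values are arbitrary illustrations, and the PG-URE estimator itself is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(1)

# Mixed Poisson-Gaussian measurement model common in fluorescence imaging:
# z = alpha * Poisson(y0 / alpha) + N(0, sigma^2), where y0 is the
# noise-free intensity, alpha the detector gain, sigma the read-noise level.
alpha, sigma = 2.0, 3.0
y0 = 100.0
n = 500_000

z = alpha * rng.poisson(y0 / alpha, size=n) + rng.normal(0.0, sigma, size=n)

# For this model E[z] = y0 and Var[z] = alpha * y0 + sigma^2.
emp_mean, emp_var = z.mean(), z.var()
```

Setting alpha to 0 (no gain term) or sigma to 0 recovers the pure Gaussian and pure Poisson special cases that the PG-URE framework unifies.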

  9. Safety margins estimation method considering uncertainties within the risk-informed decision-making framework

    SciTech Connect

    Martorell, S.; Nebot, Y.; Vilanueva, J. F.; Carlos, S.; Serradell, V.

    2006-07-01

    The adoption by regulators of the risk-informed decision-making philosophy has opened the debate on the roles of the deterministic and probabilistic approaches in supporting regulatory matters of concern to NPP safety (e.g. safety margins, core damage frequency, etc.). However, the typical separation of the application fields does not imply that the two methods cannot benefit from each other. On the contrary, there is growing interest in developing methods for using probabilistic safety analysis results in the requirements and assumptions of deterministic analysis, and vice versa. Combining best-estimate thermal-hydraulic codes with probabilistic techniques thus appears an interesting challenge for the technical community, one that should yield an effective and feasible technology providing a more realistic, complete, and logical measure of reactor safety. This paper proposes a new unified framework for estimating safety margins using a best-estimate thermal-hydraulic code together with data and models from a level 1 LPSA (low power and shutdown probabilistic safety assessment), considering simultaneously the uncertainty associated with both the probabilistic and the thermal-hydraulic codes. An application example is also presented that demonstrates the performance and significance of the method and the relevance of the results to the safety of nuclear power plants. (authors)

  10. External validation, repeat determination, and precision of risk estimation in misclassified exposure data in epidemiology.

    PubMed Central

    Duffy, S W; Maximovitch, D M; Day, N E

    1992-01-01

    STUDY OBJECTIVE--The aim was to quantify the difference in precision of risk estimates in epidemiology between the situations where misclassification of exposure is corrected for by external validation and where it is corrected for by internal repeat measurement. Precision was measured in terms of the expected width of the 95% confidence interval on the odds ratio. DESIGN--In a hypothetical case-control study, first with 100 cases and 100 controls, then with 100 cases and 1000 controls (the latter to approximate the cohort study situation), expected estimated odds ratios and confidence intervals were calculated based on postulated underlying true odds ratios and misclassification error rates. The sizes of the confidence intervals using the two design strategies were compared, based on the same number of subjects receiving internal repeat measurements as were used in the external validation study. MAIN RESULTS--Confidence intervals obtained using internal repeat measurement were considerably narrower than those using external validation. Both methods yielded approximately correct point estimates. CONCLUSIONS--In terms of precision, it is preferable to correct for misclassification using internal repeat measurement rather than external validation. PMID:1494080
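    A standard algebraic correction for nondifferential exposure misclassification, of the kind both validation designs in this abstract are meant to support, can be sketched as follows. The sensitivity, specificity, and 2x2 counts are hypothetical, not taken from the study.

```python
def corrected_exposed(observed_exposed, total, sens, spec):
    """Invert E[observed] = sens*true + (1 - spec)*(total - true)
    to recover the true exposed count from the observed one."""
    return (observed_exposed - (1.0 - spec) * total) / (sens + spec - 1.0)

sens, spec = 0.90, 0.95          # hypothetical validation-study estimates
a_obs, cases = 40.0, 100.0       # observed exposed cases / total cases
b_obs, ctrls = 25.0, 100.0       # observed exposed controls / total controls

a = corrected_exposed(a_obs, cases, sens, spec)
b = corrected_exposed(b_obs, ctrls, sens, spec)

or_naive = (a_obs * (ctrls - b_obs)) / (b_obs * (cases - a_obs))
or_corrected = (a * (ctrls - b)) / (b * (cases - a))
```

As expected for nondifferential misclassification, the corrected odds ratio moves away from the null relative to the naive one; the precision of that corrected estimate is exactly what the external-validation versus internal-repeat comparison in the study addresses.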

  11. Use of health effect risk estimates and uncertainty in formal regulatory proceedings: a case study involving atmospheric particulates

    SciTech Connect

    Habegger, L.J.; Oezkaynak, A.H.

    1984-01-01

    Coal combustion particulates are released to the atmosphere by power plants supplying electricity to the nuclear fuel cycle. This paper presents estimates of the public health risks associated with the release of these particulates at a rate corresponding to the annual nuclear fuel production requirements of a nuclear power plant. Utilization of these risk assessments as a new component in the formal evaluation of total risks from nuclear power plants is discussed. 23 references, 3 tables.

  12. Estimated Reduction in Cancer Risk due to PAH Exposures If Source Control Measures during the 2008 Beijing Olympics Were Sustained

    PubMed Central

    Jia, Yuling; Stone, Dave; Wang, Wentao; Schrlau, Jill; Tao, Shu; Massey Simonich, Staci L.

    2011-01-01

    Background The 2008 Beijing Olympic Games provided a unique case study to investigate the effect of source control measures on the reduction in air pollution, and associated inhalation cancer risk, in a Chinese megacity. Objectives We measured 17 carcinogenic polycyclic aromatic hydrocarbons (PAHs) and estimated the lifetime excess inhalation cancer risk during different periods of the Beijing Olympic Games, to assess the effectiveness of source control measures in reducing PAH-induced inhalation cancer risks. Methods PAH concentrations were measured in samples of particulate matter ≤ 2.5 μm in aerodynamic diameter (PM2.5) collected during the Beijing Olympic Games, and the associated inhalation cancer risks were estimated using a point-estimate approach based on relative potency factors. Results We estimated the number of lifetime excess cancer cases due to exposure to the 17 carcinogenic PAHs [12 priority pollutant PAHs and five high-molecular-weight (302 Da) PAHs (MW 302 PAHs)] to range from 6.5 to 518 per million people for the source control period concentrations and from 12.2 to 964 per million people for the nonsource control period concentrations. This would correspond to a 46% reduction in estimated inhalation cancer risk due to source control measures, if these measures were sustained over time. Benzo[b]fluoranthene, dibenz[a,h]anthracene, benzo[a]pyrene, and dibenzo[a,l]pyrene were the most carcinogenic PAH species evaluated. Total excess inhalation cancer risk would be underestimated by 23% if we did not include the five MW 302 PAHs in the risk calculation. Conclusions Source control measures, such as those imposed during the 2008 Beijing Olympics, can significantly reduce the inhalation cancer risk associated with PAH exposure in Chinese megacities similar to Beijing. MW 302 PAHs are a significant contributor to the estimated overall inhalation cancer risk. PMID:21632310
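    The relative-potency-factor arithmetic described in the Methods can be sketched in a few lines: each PAH concentration is converted to a benzo[a]pyrene equivalent, summed, and multiplied by a lifetime inhalation unit risk. The concentrations, RPFs, and unit risk below are placeholders for illustration, not values reported in the study.

```python
# Hypothetical relative potency factors (BaP = 1) and air concentrations.
rpf = {"benzo[a]pyrene": 1.0,
       "dibenz[a,h]anthracene": 10.0,
       "benzo[b]fluoranthene": 0.8}
conc_ng_m3 = {"benzo[a]pyrene": 1.2,
              "dibenz[a,h]anthracene": 0.3,
              "benzo[b]fluoranthene": 2.0}

# Benzo[a]pyrene-equivalent concentration, ng/m^3.
bap_eq = sum(conc_ng_m3[p] * rpf[p] for p in rpf)

# Hypothetical lifetime inhalation unit risk per ng/m^3 of BaP-equivalent.
unit_risk_per_ng_m3 = 8.7e-5

excess_cases_per_million = bap_eq * unit_risk_per_ng_m3 * 1e6
```

Dropping any species from the sum understates the total, which is the point the abstract makes about omitting the MW 302 PAHs.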

  13. The use of individual and societal risk criteria within the Dutch flood safety policy--nationwide estimates of societal risk and policy applications.

    PubMed

    Jonkman, Sebastiaan N; Jongejan, Ruben; Maaskant, Bob

    2011-02-01

    The Dutch government is in the process of revising its flood safety policy. The current safety standards for flood defenses in the Netherlands are largely based on the outcomes of cost-benefit analyses. Loss of life has not been considered separately in the choice for current standards. This article presents the results of a research project that evaluated the potential roles of two risk metrics, individual and societal risk, to support decision making about new flood safety standards. These risk metrics are already used in the Dutch major hazards policy for the evaluation of risks to the public. Individual risk concerns the annual probability of death of a person. Societal risk concerns the probability of an event with many fatalities. Technical aspects of the use of individual and societal risk metrics in flood risk assessments as well as policy implications are discussed. Preliminary estimates of nationwide levels of societal risk are presented. Societal risk levels appear relatively high in the southwestern part of the country where densely populated dike rings are threatened by a combination of river and coastal floods. It was found that cumulation, the simultaneous flooding of multiple dike rings during a single flood event, has significant impact on the national level of societal risk. Options for the application of the individual and societal risk in the new flood safety policy are presented and discussed.
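The societal-risk metric discussed above is commonly summarized as an FN curve: for each fatality threshold N, the annual frequency of flood events causing at least N deaths. A minimal sketch, with entirely hypothetical scenario probabilities and consequence figures (not the Dutch dike-ring data):

```python
# Minimal sketch of a societal-risk (FN-curve) calculation: for each
# fatality threshold N, sum the annual probabilities of scenarios
# causing at least N deaths. Scenario data below are invented.
scenarios = [
    {"prob": 1e-4, "fatalities": 5000},  # coastal flood, dense dike ring
    {"prob": 5e-4, "fatalities": 300},   # river flood
    {"prob": 1e-3, "fatalities": 20},    # local breach
]

def fn_curve(scenarios, thresholds):
    """Annual exceedance frequency F(N) for each fatality threshold N."""
    return {
        n: sum(s["prob"] for s in scenarios if s["fatalities"] >= n)
        for n in thresholds
    }

curve = fn_curve(scenarios, [10, 100, 1000])
print(curve)  # annual frequency of events with >= N fatalities
```

Treating the simultaneous flooding of multiple dike rings as a single high-fatality scenario, rather than as independent smaller ones, shifts probability mass toward large N, which is why cumulation raises the national societal-risk level.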

  14. Value at risk estimation using independent component analysis-generalized autoregressive conditional heteroscedasticity (ICA-GARCH) models.

    PubMed

    Wu, Edmond H C; Yu, Philip L H; Li, W K

    2006-10-01

    We suggest using independent component analysis (ICA) to decompose multivariate time series into statistically independent time series. Then, we propose to use ICA-GARCH models, which are computationally efficient, to estimate the multivariate volatilities. The experimental results show that the ICA-GARCH models are more effective than existing methods, including DCC, PCA-GARCH, and EWMA. We also apply the proposed models to compute value at risk (VaR) for risk management applications. The backtesting and the out-of-sample tests validate the performance of ICA-GARCH models for value at risk estimation.
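The VaR step of the approach above can be illustrated for a single (already ICA-separated) component series: run the GARCH(1,1) variance recursion over the returns, forecast the next-day volatility, and scale by a normal quantile. The parameters below are assumed for illustration, not fitted by maximum likelihood as a real ICA-GARCH implementation would do, and the ICA decomposition itself is omitted.

```python
import math

# Sketch of one-day parametric VaR from a GARCH(1,1) volatility
# forecast. OMEGA/ALPHA/BETA are assumed, not estimated.
OMEGA, ALPHA, BETA = 1e-6, 0.08, 0.90

def garch_var(returns, z=2.326):
    """One-step-ahead volatility via sigma2_{t+1} = omega + alpha*r_t^2
    + beta*sigma2_t, then VaR assuming conditionally normal returns
    (z = 2.326 for the 99% level)."""
    sigma2 = returns[0] ** 2  # initialize conditional variance
    for r in returns:
        sigma2 = OMEGA + ALPHA * r ** 2 + BETA * sigma2
    return z * math.sqrt(sigma2)  # one-day VaR as a return fraction

daily_returns = [0.01, -0.02, 0.005, -0.015, 0.012]
print(f"1-day 99% VaR: {garch_var(daily_returns):.4f}")
```

In the full method, each independent component gets its own univariate GARCH fit, and the component volatilities are mapped back through the ICA mixing matrix to recover portfolio-level volatility.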

  15. TNFA Haplotype Genetic Testing Improves HLA in Estimating the Risk of Celiac Disease in Children

    PubMed Central

    Zambon, Carlo-Federico; Navaglia, Filippo; Greco, Eliana; Pelloso, Michela; Artuso, Serena; Padoan, Andrea; Pescarin, Matilde; Aita, Ada; Bozzato, Dania; Moz, Stefania; Cananzi, Mara; Guariso, Graziella; Plebani, Mario

    2015-01-01

    Background TNF-α and IFN-γ play a role in the development of mucosal damage in celiac disease (CD). Polymorphisms of TNFA and IFNG genes, as well as of the TNFRSF1A gene, encoding the TNF-α receptor 1, might underlie different inter-individual disease susceptibility over a common HLA risk background. The aims of this study were to ascertain whether five SNPs in the TNFA promoter (-1031T>C,-857C>T,-376G>A,-308G>A,-238G>A), sequence variants of the TNFRSF1A gene and the IFNG +874A>T polymorphism are associated with CD in an HLA-independent manner. Methods 511 children (244 CD, 267 controls) were genotyped for HLA, TNFA and IFNG (real-time PCR). TNFRSF1A variants were studied by DHPLC and sequencing. Results Only the rare TNFA-1031C (OR=0.65, 95% CI:0.44-0.95), -857T (OR=0.42, 95% CI:0.27-0.65), -376A (OR=2.25, 95% CI:1.12-4.51) and -308A (OR=4.76, 95% CI:3.12-7.26) alleles were significantly associated with CD. One TNFRSF1A variant was identified (c.625+10A>G, rs1800693), but not associated with CD. The CD-correlated TNFA SNPs resulted in six haplotypes. Two haplotypes were control-associated (CCGG and TTGG) and three were CD-associated (CCAG, TCGA and CCGA). The seventeen inferred haplotype combinations were grouped (A to E) based on their frequencies among CD. Binary logistic regression analysis documented a strong association between CD and HLA (OR for intermediate risk haplotypes=178; 95% CI:24-1317; OR for high risk haplotypes=2752; 95% CI:287-26387), but also an HLA-independent correlation between CD and TNFA haplotype combination groups. The CD risk for patients carrying an intermediate risk HLA haplotype could be sub-stratified by TNFA haplotype combinations. Conclusion TNFA promoter haplotypes associate with CD independently from HLA. We suggest that their evaluation might enhance the accuracy in estimating the CD genetic risk. PMID:25915602
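The allele-association statistics quoted above (ORs with 95% CIs) follow from a standard 2x2 case/control calculation: the odds ratio is the cross-product ratio, and the confidence interval comes from the standard error of log(OR). The counts below are invented for illustration; they are not the study's genotype data.

```python
import math

# Sketch of an odds ratio with a Woolf-type 95% confidence interval
# from a 2x2 case/control allele table. Counts are hypothetical.
def odds_ratio_ci(a, b, c, d, z=1.96):
    """a/b = allele present/absent in cases; c/d = same in controls."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

or_, lo, hi = odds_ratio_ci(60, 184, 30, 237)  # hypothetical counts
print(f"OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

An interval excluding 1 (as for the -308A allele's OR=4.76, CI 3.12-7.26) indicates a significant association; the study's HLA-adjusted estimates additionally required multivariable logistic regression, which this sketch does not attempt.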

  16. Cryptosporidium and Giardia in tropical recreational marine waters contaminated with domestic sewage: estimation of bathing-associated disease risks.

    PubMed

    Betancourt, Walter Q; Duarte, Diana C; Vásquez, Rosa C; Gurian, Patrick L

    2014-08-15

    Sewage is a major contributor to pollution problems involving human pathogens in tropical coastal areas. This study investigated the occurrence of intestinal protozoan parasites (Giardia and Cryptosporidium) in tropical recreational marine waters contaminated with sewage. The potential risks of Cryptosporidium and Giardia infection from recreational water exposure were estimated from the levels of viable (oo)cysts (DIC+, DAPI+, PI-) found in near-shore swimming areas using an exponential dose response model. A Monte Carlo uncertainty analysis was performed in order to determine the probability distribution of risks. Microbial indicators of recreational water quality (enterococci, Clostridium perfringens) and genetic markers of sewage pollution (human-specific Bacteroidales marker [HF183] and Clostridium coccoides) were simultaneously evaluated in order to estimate the extent of water quality deterioration associated with human wastes. The study revealed the potential risk of parasite infections via primary contact with tropical marine waters contaminated with sewage; higher risk estimates for Giardia than for Cryptosporidium were found.
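The exponential dose-response model with Monte Carlo uncertainty analysis used above can be sketched as follows: P(infection) = 1 - exp(-r * dose), with dose drawn from an uncertain (oo)cyst concentration times the volume of water swallowed. The r parameter, concentration distribution, and ingestion volume below are illustrative assumptions, not the study's inputs.

```python
import math
import random

# Sketch of an exponential dose-response risk estimate with Monte
# Carlo over uncertain (oo)cyst concentrations. Parameter values are
# assumed for illustration only.
R_GIARDIA = 0.0199       # example dose-response parameter r
WATER_INGESTED_L = 0.05  # litres swallowed per swimming event (assumed)

def p_infection(dose, r):
    """Exponential dose-response model: P(inf) = 1 - exp(-r * dose)."""
    return 1.0 - math.exp(-r * dose)

random.seed(1)  # reproducible draws
risks = []
for _ in range(10_000):
    # Lognormal concentration in cysts/L (assumed distribution)
    conc = random.lognormvariate(math.log(0.5), 0.8)
    risks.append(p_infection(conc * WATER_INGESTED_L, R_GIARDIA))

risks.sort()
print(f"median risk: {risks[len(risks) // 2]:.2e}, "
      f"95th pct: {risks[int(0.95 * len(risks))]:.2e}")
```

Reporting the full distribution of simulated risks, rather than a single point estimate, is what the abstract's Monte Carlo uncertainty analysis provides; per-pathogen differences in r and concentration drive the higher Giardia estimates.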