Science.gov

Sample records for absolute risk models

  1. One idea of portfolio risk control for absolute return strategy: risk adjustments by signals from correlation behavior

    NASA Astrophysics Data System (ADS)

    Nishiyama, N.

    2001-12-01

    The absolute return strategy offered through fund of funds (FOF) investment schemes is a current focus of the Japanese financial community. FOF investment consists mainly of hedge fund investment, and it has two major characteristics: low correlation with benchmark indices and little sensitivity to external changes in the environment, given the aim of maximizing return. The historical track record of the hedge funds that have survived in this business shows stable high returns with low risk. One must keep in mind, however, that low risk is not the same as risk free. The failure of Long-Term Capital Management (LTCM) in the summer of 1998 is the symbolic case. That summer exposed a certain limitation of traditional value at risk (VaR) and the possibility that traditional VaR can be ineffective against nonlinear fluctuations in the market. In this paper, I bring self-organized criticality (SOC) into portfolio risk control. SOC is well known as a model of decay in the natural world. I analyze nonlinear market fluctuations as SOC and apply SOC to capture complicated market movements, using the SOC threshold point and risk adjustments by scenario correlation as implicit signals. The threshold becomes the control parameter of risk exposure, setting a downside floor and forecasting extreme nonlinear fluctuations at a given probability. Simulation results show a synergy effect between SOC-based portfolio risk control and the absolute return strategy.

  2. A general relativistic model for free-fall absolute gravimeters

    NASA Astrophysics Data System (ADS)

    Tan, Yu-Jie; Shao, Cheng-Gang; Li, Jia; Hu, Zhong-Kun

    2016-04-01

    Although the relativistic manifestations of gravitational fields in gravimetry were first studied 40 years ago, the relativistic effects combined with free-fall absolute gravimeters have rarely been considered. In light of this, we present a general relativistic model for free-fall absolute gravimeters in a local Fermi coordinate system, where we focus on effects related to the measuring devices: relativistic transverse Doppler effects, gravitational redshift effects and Earth’s rotation effects. Based on this model, a general relativistic expression for the measured gravity acceleration is obtained.

  3. Comparison of two methods for estimating absolute risk of prostate cancer based on SNPs and family history

    PubMed Central

    Hsu, Fang-Chi; Sun, Jielin; Zhu, Yi; Kim, Seong-Tae; Jin, Tao; Zhang, Zheng; Wiklund, Fredrik; Kader, A. Karim; Zheng, S. Lilly; Isaacs, William; Grönberg, Henrik; Xu, Jianfeng

    2010-01-01

    Disease risk-associated single nucleotide polymorphisms (SNPs) identified from genome-wide association studies have the potential to be used for disease risk prediction. An important feature of these risk-associated SNPs is their weak individual effect but stronger cumulative effect on disease risk. Several approaches are commonly used to model the combined effect in risk prediction, but their performance is unclear. We compared two methods of modeling the combined effect of 14 prostate cancer (PCa) risk-associated SNPs and family history for the estimation of absolute PCa risk in a population-based case-control study in Sweden (2,899 cases and 1,722 controls). Method 1 weights each risk allele equally, using a simple count of the number of risk alleles, while Method 2 weights each risk SNP differently based on its respective odds ratio. We found considerable differences between the two methods. Absolute risk estimates from Method 1 were generally higher than those from Method 2, especially among men at higher risk. The difference in overall discriminative performance, measured by the area under the receiver operating characteristic curve (AUC), was small between Method 1 (0.614) and Method 2 (0.618), P = 0.20. However, the performance of the two methods in identifying high-risk individuals (two-fold or three-fold higher than average risk), measured by positive predictive value (PPV), was higher for Method 2 than for Method 1. In conclusion, these results suggest that Method 2 is superior to Method 1 in estimating absolute risk if the purpose of risk prediction is to identify high-risk individuals. PMID:20332264
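
    A minimal sketch of the two weighting schemes compared above (illustrative only: the odds ratios, genotypes, and SNP count are invented, not the study's 14 PCa SNPs, and weighting by the log odds ratio is our assumption about how OR-based weighting is implemented):

      import numpy as np

      # Hypothetical per-SNP odds ratios for the risk alleles (assumed values).
      odds_ratios = np.array([1.20, 1.35, 1.10, 1.25])

      # Genotypes: number of risk alleles (0, 1, or 2) carried at each SNP.
      genotypes = np.array([1, 2, 0, 1])

      # Method 1: weight every risk allele equally -- a simple allele count.
      score_method1 = genotypes.sum()

      # Method 2: weight each SNP by the log of its odds ratio.
      score_method2 = float(genotypes @ np.log(odds_ratios))

      print(score_method1, round(score_method2, 3))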

  4. Electroweak absolute, meta-, and thermal stability in neutrino mass models

    NASA Astrophysics Data System (ADS)

    Lindner, Manfred; Patel, Hiren H.; Radovčić, Branimir

    2016-04-01

    We analyze the stability of the electroweak vacuum in neutrino mass models containing right-handed neutrinos or fermionic isotriplets. In addition to considering absolute stability, we place limits on the Yukawa couplings of new fermions based on metastability and thermal stability in the early Universe. Our results reveal that the upper limits on the neutrino Yukawa couplings can change significantly when the top quark mass is allowed to vary within the experimental range of uncertainty in its determination.

  5. Realized Volatility and Absolute Return Volatility: A Comparison Indicating Market Risk

    PubMed Central

    Takaishi, Tetsuya; Stanley, H. Eugene; Li, Baowen

    2014-01-01

    Measuring volatility in financial markets is a primary challenge in the theory and practice of risk management and is essential when developing investment strategies. Although the vast literature on the topic describes many different models, two nonparametric measurements have emerged and received wide use over the past decade: realized volatility and absolute return volatility. The former is strongly favored in the financial sector and the latter by econophysicists. We examine the memory and clustering features of these two methods and find that both enable strong predictions. We compare the two in detail and find that although realized volatility better captures short-term effects, allowing prediction of near-future market behavior, absolute return volatility is easier to calculate and, as a risk indicator, has approximately the same sensitivity as realized volatility. Our detailed empirical analysis yields valuable guidelines for both researchers and market participants because it provides a significantly clearer comparison of the strengths and weaknesses of the two methods. PMID:25054439
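
    As a rough illustration of the two nonparametric measures compared above, computed here on synthetic data (the five-minute sampling grid and return scale are assumptions, not the paper's setup):

      import numpy as np

      rng = np.random.default_rng(0)
      # Synthetic intraday log-returns: 78 five-minute returns per day, 250 days.
      intraday = 0.001 * rng.standard_normal((250, 78))

      # Realized volatility: root of the sum of squared intraday returns per day.
      realized_vol = np.sqrt((intraday ** 2).sum(axis=1))

      # Absolute return volatility: absolute value of each day's total return.
      absolute_vol = np.abs(intraday.sum(axis=1))

      print(realized_vol.mean(), absolute_vol.mean())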

  6. Biosafety Risk Assessment Model

    SciTech Connect

    Daniel Bowen, Susan Caskey

    2011-05-27

    A software tool based on a structured methodology for conducting laboratory biosafety risk assessments by biosafety experts. The software is based upon a multi-criteria decision analysis (MCDA) scheme and uses peer-reviewed criteria and weights. The software was developed upon Microsoft’s .NET framework. The methodology defines the likelihood and consequence of a laboratory exposure for thirteen unique scenarios and provides a numerical relative risk for each of the thirteen. The software produces 2-D graphs reflecting the relative risks and a sensitivity analysis that highlights the overall importance of each factor. The software works as a set of questions with absolute scales and uses a weighted additive model to calculate likelihood and consequence.
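
    The weighted additive aggregation described above can be sketched as follows (the criteria names, scores, and weights are invented placeholders, not the tool's peer-reviewed values):

      # Hypothetical likelihood criteria scored on absolute scales.
      scores = {"agent stability": 3, "aerosol potential": 4, "containment": 2}
      weights = {"agent stability": 0.2, "aerosol potential": 0.5, "containment": 0.3}

      # Weighted additive model: sum of weight * score over all criteria.
      likelihood = sum(weights[c] * scores[c] for c in scores)
      print(likelihood)  # paired with a consequence score to yield a relative risk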

  7. Decision-making using absolute cardiovascular risk reduction and incremental cost-effectiveness ratios: a case study

    PubMed Central

    Ker, J A; Oosthuizen, H; Rheeder, P

    2008-01-01

    Background: Many clinical guidelines have adopted multifactorial cardiovascular risk assessment to identify high-risk individuals for treatment. The Framingham risk chart is a widely used risk engine for calculating an individual's absolute cardiovascular risk. Cost-effectiveness analyses are typically used to evaluate therapeutic strategies, but it is harder for a clinician faced with alternative therapeutic strategies to calculate their cost-effectiveness. Aim: We used a single simulated-patient model to explore the effect of different drug treatments on the calculated absolute cardiovascular risk. Methods: The Framingham risk score was calculated for a hypothetical patient, and drug treatment was initiated. After every drug introduced, the score was recalculated. Single-exit pricing of the various drugs in South Africa was used to calculate the cost of reducing predicted cardiovascular risk. Results: The cost-effectiveness ratio of an antihypertensive treatment strategy was calculated to be R21.35 per percentage point of risk reduction; that of a statin treatment strategy was R22.93. Using a high-dose statin, the ratio was R12.81 per percentage point of risk reduction. Combining the antihypertensive and statin strategies gave a ratio of R23.84 per percentage point of risk reduction. A combination of several drugs enabled the hypothetical patient to reduce the risk to 14% at a ratio of R17.18 per percentage point of risk reduction. Conclusion: This model demonstrates a method for comparing different therapeutic strategies for reducing cardiovascular risk using their cost-effectiveness ratios. PMID:18516355
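
    The ratios above reduce to a simple quotient of cost over percentage points of predicted risk removed; a sketch with made-up numbers (the drug price and risk figures are invented, not the study's single-exit prices):

      def cost_per_percent_reduction(annual_cost_rand, risk_before_pct, risk_after_pct):
          """Rand spent per percentage point of predicted cardiovascular risk removed."""
          return annual_cost_rand / (risk_before_pct - risk_after_pct)

      # Hypothetical: a drug costing R450/year cuts the simulated patient's
      # Framingham 10-year risk from 36% to 15%.
      print(cost_per_percent_reduction(450.0, 36.0, 15.0))  # ~R21.4 per percentage point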

  8. Lunar eclipse photometry: absolute luminance measurements and modeling.

    PubMed

    Hernitschek, Nina; Schmidt, Elmar; Vollmer, Michael

    2008-12-01

    The Moon's time-dependent luminance was determined during the 9 February 1990 and 3 March 2007 total lunar eclipses by using calibrated, industry standard photometers. After the results were corrected to unit air mass and to standard distances for both Moon and Sun, an absolute calibration was accomplished by using the Sun's known luminance and a pre-eclipse lunar albedo of approximately 13.5%. The measured minimum level of brightness in the total phase of both eclipses was relatively high, namely -3.32 m(vis) and -1.7 m(vis), which hints at the absence of pronounced stratospheric aerosol. The light curves were modeled in such a way as to let the Moon move through an artificial Earth shadow composed of a multitude of disk and ring zones, containing a relative luminance data set from an atmospheric radiative transfer calculation. PMID:19037352

  9. Greater absolute risk for all subtypes of breast cancer in the US than Malaysia.

    PubMed

    Horne, Hisani N; Beena Devi, C R; Sung, Hyuna; Tang, Tieng Swee; Rosenberg, Philip S; Hewitt, Stephen M; Sherman, Mark E; Anderson, William F; Yang, Xiaohong R

    2015-01-01

    Hormone receptor (HR) negative breast cancers are relatively more common in low-risk than high-risk countries and/or populations. However, the absolute variations between these different populations are not well established given the limited number of cancer registries with incidence rate data by breast cancer subtype. We, therefore, used two unique population-based resources with molecular data to compare incidence rates for the 'intrinsic' breast cancer subtypes between a low-risk Asian population in Malaysia and high-risk non-Hispanic white population in the National Cancer Institute's surveillance, epidemiology, and end results 18 registries database (SEER 18). The intrinsic breast cancer subtypes were recapitulated with the joint expression of the HRs (estrogen receptor and progesterone receptor) and human epidermal growth factor receptor-2 (HER2). Invasive breast cancer incidence rates overall were fivefold greater in SEER 18 than in Malaysia. The majority of breast cancers were HR-positive in SEER 18 and HR-negative in Malaysia. Notwithstanding the greater relative distribution for HR-negative cancers in Malaysia, there was a greater absolute risk for all subtypes in SEER 18; incidence rates were nearly 7-fold higher for HR-positive and 2-fold higher for HR-negative cancers in SEER 18. Despite the well-established relative breast cancer differences between low-risk and high-risk countries and/or populations, there was a greater absolute risk for HR-positive and HR-negative subtypes in the US than Malaysia. Additional analytical studies are sorely needed to determine the factors responsible for the elevated risk of all subtypes of breast cancer in high-risk countries like the United States.

  10. Common genetic polymorphisms modify the effect of smoking on absolute risk of bladder cancer

    PubMed Central

    Garcia-Closas, Montserrat; Rothman, Nathaniel; Figueroa, Jonine D.; Prokunina-Olsson, Ludmila; Han, Summer S.; Baris, Dalsu; Jacobs, Eric J; Malats, Nuria; De Vivo, Immaculata; Albanes, Demetrius; Purdue, Mark P; Sharma, Sapna; Fu, Yi-Ping; Kogevinas, Manolis; Wang, Zhaoming; Tang, Wei; Tardón, Adonina; Serra, Consol; Carrato, Alfredo; García-Closas, Reina; Lloreta, Josep; Johnson, Alison; Schwenn, Molly; Karagas, Margaret R; Schned, Alan; Andriole, Gerald; Grubb, Robert; Black, Amanda; Gapstur, Susan M; Thun, Michael; Diver, W Ryan; Weinstein, Stephanie J; Virtamo, Jarmo; Hunter, David J; Caporaso, Neil; Landi, Maria Teresa; Hutchinson, Amy; Burdett, Laurie; Jacobs, Kevin B; Yeager, Meredith; Fraumeni, Joseph F; Chanock, Stephen J; Silverman, Debra T; Chatterjee, Nilanjan

    2013-01-01

    Bladder cancer results from the combined effects of environmental and genetic factors, smoking being the strongest risk factor. Evaluating absolute risks resulting from the joint effects of smoking and genetic factors is critical to evaluate the public health relevance of genetic information. Analyses included up to 3,942 cases and 5,680 controls of European background in seven studies. We tested for multiplicative and additive interactions between smoking and 12 susceptibility loci, individually and combined as a polygenic risk score (PRS). Thirty-year absolute risks and risk differences by levels of the PRS were estimated for US males aged 50 years. Six out of 12 variants showed significant additive gene-environment interactions, most notably NAT2 (P = 7 × 10⁻⁴) and UGT1A6 (P = 8 × 10⁻⁴). The 30-year absolute risk of bladder cancer in US males was 6.2% for all current smokers. This risk ranged from 2.9% for current smokers in the lowest quartile of the PRS to 9.9% for current smokers in the upper quartile. Risk difference estimates indicated that 8,200 cases would be prevented if elimination of smoking occurred in 100,000 men in the upper PRS quartile, compared to 2,000 cases prevented by a similar effort in the lowest PRS quartile (P-additive = 1 × 10⁻⁴). The impact of eliminating smoking on the number of bladder cancer cases prevented is thus larger for individuals at higher than at lower genetic risk. Our findings could have implications for targeted prevention strategies. However, other smoking-related diseases, as well as practical and ethical considerations, need to be considered before any recommendations could be made. PMID:23536561
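
    The population-impact arithmetic here is an absolute risk difference scaled to cohort size; a sketch using the risks reported above, with counterfactual nonsmoker risks assumed purely to reproduce the published differences (the 1.7% and 0.9% figures are our inventions, not the paper's estimates):

      def cases_prevented(risk_smokers, risk_nonsmokers, population=100_000):
          """Cases avoided if smoking were eliminated among `population` smokers."""
          return round((risk_smokers - risk_nonsmokers) * population)

      print(cases_prevented(0.099, 0.017))  # upper PRS quartile -> ~8,200
      print(cases_prevented(0.029, 0.009))  # lowest PRS quartile -> ~2,000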

  11. Absolute fracture risk assessment using lumbar spine and femoral neck bone density measurements: derivation and validation of a hybrid system.

    PubMed

    Leslie, William D; Lix, Lisa M

    2011-03-01

    The World Health Organization (WHO) Fracture Risk Assessment Tool (FRAX) computes 10-year probability of major osteoporotic fracture from multiple risk factors, including femoral neck (FN) T-scores. Lumbar spine (LS) measurements are not currently part of the FRAX formulation but are used widely in clinical practice, and this creates confusion when there is spine-hip discordance. Our objective was to develop a hybrid 10-year absolute fracture risk assessment system in which nonvertebral (NV) fracture risk was assessed from the FN and clinical vertebral (V) fracture risk was assessed from the LS. We identified 37,032 women age 45 years and older undergoing baseline FN and LS dual-energy X-ray absorptiometry (DXA; 1990-2005) from a population database that contains all clinical DXA results for the Province of Manitoba, Canada. Results were linked to longitudinal health service records for physician billings and hospitalizations to identify nontrauma vertebral and nonvertebral fracture codes after bone mineral density (BMD) testing. The population was randomly divided into equal-sized derivation and validation cohorts. Using the derivation cohort, three fracture risk prediction systems were created from Cox proportional hazards models (adjusted for age and multiple FRAX risk factors): FN to predict combined all fractures, FN to predict nonvertebral fractures, and LS to predict vertebral (without nonvertebral) fractures. The hybrid system was the sum of nonvertebral risk from the FN model and vertebral risk from the LS model. The FN and hybrid systems were both strongly predictive of overall fracture risk (p < .001). In the validation cohort, ROC analysis showed marginally better performance of the hybrid system versus the FN system for overall fracture prediction (p = .24) and significantly better performance for vertebral fracture prediction (p < .001). In a discordance subgroup with FN and LS T-score differences greater than 1 SD, there was a significant

  12. Age Dependent Absolute Plate and Plume Motion Modeling

    NASA Astrophysics Data System (ADS)

    Heaton, D. E.; Koppers, A. A. P.

    2015-12-01

    Current absolute plate motion (APM) models from 80-0 Ma are constrained by the location of mantle plume related hotspot seamounts, in particular those of the Hawaiian-Emperor and Louisville seamount trails. Originally the 'fixed' hotspot hypothesis was developed to explain past plate motion based on linear age-progressive intra-plate volcanism. However, now that 'moving' hotspots are accepted, it is becoming clear that APM models need to be corrected for individual plume motion vectors. For older seamount trails that were active between roughly 50 and 80 Ma, APM models that use 'fixed' hotspots overestimate the measured age progression in those trails, while APM models corrected for 'moving' hotspots underestimate those age progressions. These mismatches are due both to a lack of reliable ages in the older portions of the Hawaiian and Louisville seamount trails and to insufficient APM modeling constraints from other seamount trails in the Pacific Basin. Seamounts are difficult to sample and analyze because many are hydrothermally altered and have low potassium concentrations. New ⁴⁰Ar/³⁹Ar age results from Integrated Ocean Drilling Program (IODP) Expedition 330 Sites U1372 (n=18), U1375 (n=3), U1376 (n=15) and U1377 (n=7) aid in constraining the oldest end of the Louisville Seamount trail. A significant observation in this study is that the age range recovered in the drill cores matches the range of ages acquired on dredging cruises at the same seamounts (e.g. Koppers et al., 2011). This is important for determining the inception age of a seamount. The sections recovered from IODP Expedition 330 are in-situ volcaniclastic breccia and lava flows. Comparison with the seismic interpretations of Louisville guyots (Contreras-Reyes et al., 2010) indicates that Holes U1372, U1373 and U1374 penetrated the extrusive and volcaniclastic sections of the seamounts. The ages obtained are consistent over stratigraphic intervals >100-450 m thick, providing evidence that these seamounts

  13. Mapping return levels of absolute NDVI variations for the assessment of drought risk in Ethiopia

    NASA Astrophysics Data System (ADS)

    Tonini, Francesco; Jona Lasinio, Giovanna; Hochmair, Hartwig H.

    2012-08-01

    The analysis and forecasting of extreme climatic events has become increasingly relevant to planning effective financial and food-related interventions in third-world countries. Natural disasters and climate change, both large and small scale, have a great impact on non-industrialized populations who rely exclusively on activities such as crop production, fishing, and similar livelihood activities. It is important to identify the extent of the areas prone to severe drought conditions in order to study the possible consequences of the drought on annual crop production. In this paper, we aim to identify such areas within the South Tigray zone, Ethiopia, using a transformation of the Normalized Difference Vegetation Index (NDVI) called Absolute Difference NDVI (ADVI). Negative NDVI shifts from the historical average can generally be linked to a reduction in the vigor of local vegetation. Drought is more likely to increase in areas where negative shifts occur more frequently and with high magnitude, making it possible to spot critical situations. We propose a new methodology for the assessment of drought risk in areas where crop production represents a primary source of livelihood for its inhabitants. We estimate ADVI return levels pixel by pixel by fitting extreme value models to independent monthly minima. The study is conducted using SPOT-Vegetation (VGT) ten-day composite (S10) images from April 1998 to March 2009. In all short-term and long-term predictions, we found that the central and southern areas of the South Tigray zone are prone to a higher drought risk compared to other areas.
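
    A minimal sketch of the return-level computation described above, fitting a generalized extreme value (GEV) distribution to block minima (synthetic data standing in for one pixel's series; minima are negated so the standard maxima machinery applies):

      import numpy as np
      from scipy.stats import genextreme

      rng = np.random.default_rng(1)
      # Synthetic independent monthly ADVI minima (11 years of data).
      monthly_minima = rng.normal(loc=-0.05, scale=0.02, size=132)

      # Fit a GEV to the negated series: minima of X are maxima of -X.
      shape, loc, scale = genextreme.fit(-monthly_minima)

      # 100-month return level: the ADVI value fallen below once per 100 months.
      level_100 = -genextreme.ppf(1 - 1 / 100, shape, loc=loc, scale=scale)
      print(level_100)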

  14. Absolute and Comparative Cancer Risk Perceptions Among Smokers in Two Cities in China

    PubMed Central

    2014-01-01

    Introduction: Knowledge about health effects of smoking motivates quit attempts and sustained abstinence among smokers and also predicts greater acceptance of tobacco control efforts such as cigarette taxes and public smoking bans. We examined whether smokers in China, the world’s largest consumer of cigarettes, recognized their heightened personal risk of cancer relative to nonsmokers. Methods: A sample of Chinese people (N = 2,517; 555 current smokers) from 2 cities (Beijing and Hefei) estimated their personal risk of developing cancer, both in absolute terms (overall likelihood) and in comparative terms (relative to similarly aged people). Results: Controlling for demographics, smokers judged themselves to be at significantly lower risk of cancer than did nonsmokers on the comparative measure. No significant difference emerged between smokers and nonsmokers in absolute estimates. Conclusions: Smokers in China did not recognize their heightened personal risk of cancer, possibly reflecting ineffective warning labels on cigarette packs, a positive affective climate associated with smoking in China, and beliefs that downplay personal vulnerability among smokers (e.g., I don’t smoke enough to increase my cancer risk; I smoke high-quality cigarettes that won’t cause cancer). PMID:24668289

  15. Mathematical Model for Absolute Magnetic Measuring Systems in Industrial Applications

    NASA Astrophysics Data System (ADS)

    Fügenschuh, Armin; Fügenschuh, Marzena; Ludszuweit, Marina; Mojsic, Aleksandar; Sokół, Joanna

    2015-09-01

    Scales for measuring systems are based on either incremental or absolute measuring methods. Incremental scales need to initialize a measurement cycle at a reference point. From there, the position is computed by counting increments of a periodic graduation. Absolute methods do not need reference points, since the position can be read directly from the scale. The positions on the complete scale are encoded using two incremental tracks with different graduations. We present a new method for absolute measuring that uses only one track for position encoding, down to the micrometre range. Instead of the common perpendicular magnetic areas, we use a pattern of trapezoidal magnetic areas to store more complex information. For positioning, we use the magnetic field, in which every position is characterized by a set of values measured by a Hall sensor array. We implement a method for reconstructing absolute positions from the set of unique measured values. We compare two patterns with respect to uniqueness, accuracy, stability and robustness of positioning. We discuss how stability and robustness are influenced by different errors during measurement in real applications and how those errors can be compensated.

  16. A method for determining weights for excess relative risk and excess absolute risk when applied in the calculation of lifetime risk of cancer from radiation exposure.

    PubMed

    Walsh, Linda; Schneider, Uwe

    2013-03-01

    Radiation-related risks of cancer can be transported from one population to another population at risk for the purpose of calculating lifetime risks from radiation exposure. Transfer via excess relative risks (ERR) or excess absolute risks (EAR) or a mixture of both (i.e., from the life span study (LSS) of Japanese atomic bomb survivors) has been done in the past based on qualitative weighting. Consequently, the values of the weights applied and the method of application of the weights (i.e., as additive or geometric weighted means) have varied both between reports produced at different times by the same regulatory body and between reports produced at similar times by different regulatory bodies. Since the gender and age patterns are often markedly different between EAR and ERR models, it is useful to have an evidence-based method for determining the relative goodness of fit of such models to the data. This paper identifies a method, using Akaike model weights, which could aid expert judgment and be applied to help achieve consistency of approach and quantitative, evidence-based results in future health risk assessments. Applying this method to recent LSS cancer incidence models gives a relative EAR weighting by solid cancer site, on a scale of 0-1, of zero for breast and colon, 0.02 for all solid, 0.03 for lung, 0.08 for liver, 0.15 for thyroid, 0.18 for bladder and 0.93 for stomach. The EAR weighting for female breast cancer increases from 0 to 0.3 if a generally observed change in the trend between female age-specific breast cancer incidence rates and attained age, associated with menopause, is accounted for in the EAR model. Application of this method to preferred models from a study of multi-model inference from many models fitted to the LSS leukemia mortality data results in an EAR weighting of 0. From these results it can be seen that lifetime risk transfer is most highly weighted by EAR only for stomach cancer. However
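
    The Akaike model weights used above follow a standard formula, w_i = exp(-Δ_i/2) / Σ_j exp(-Δ_j/2), where Δ_i is each model's AIC difference from the best-fitting model; a sketch with invented AIC values (not the LSS fits):

      import numpy as np

      def akaike_weights(aic_values):
          """w_i = exp(-delta_i / 2) / sum_j exp(-delta_j / 2)."""
          delta = np.asarray(aic_values, float) - min(aic_values)
          w = np.exp(-delta / 2)
          return w / w.sum()

      # Hypothetical AICs for an ERR and an EAR model fitted to the same site.
      print(akaike_weights([1523.4, 1528.6]))  # -> relative weight of each model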

  1. Breast Cancer Risk Assessment SAS Macro (Gail Model)

    Cancer.gov

    A SAS macro (commonly referred to as the Gail Model) that projects absolute risk of invasive breast cancer according to NCI’s Breast Cancer Risk Assessment Tool (BCRAT) algorithm for specified race/ethnic groups and age intervals.

  2. Relative and Absolute Fit Evaluation in Cognitive Diagnosis Modeling

    ERIC Educational Resources Information Center

    Chen, Jinsong; de la Torre, Jimmy; Zhang, Zao

    2013-01-01

    As with any psychometric model, the validity of inferences from cognitive diagnosis models (CDMs) determines the extent to which these models can be useful. For inferences from CDMs to be valid, it is crucial that the fit of the model to the data is ascertained. Based on a simulation study, this study investigated the sensitivity of various fit…

  3. Comprehensive fluence model for absolute portal dose image prediction

    SciTech Connect

    Chytyk, K.; McCurdy, B. M. C.

    2009-04-15

    Amorphous silicon (a-Si) electronic portal imaging devices (EPIDs) continue to be investigated as treatment verification tools, with a particular focus on intensity modulated radiation therapy (IMRT). This verification could be accomplished through a comparison of measured portal images to predicted portal dose images. A general fluence determination tailored to portal dose image prediction would be a great asset in order to model the complex modulation of IMRT. A proposed physics-based parameter fluence model was commissioned by matching predicted EPID images to corresponding measured EPID images of multileaf collimator (MLC) defined fields. The two-source fluence model was composed of a focal Gaussian and an extrafocal Gaussian-like source. Specific aspects of the MLC and secondary collimators were also modeled (e.g., jaw and MLC transmission factors, MLC rounded leaf tips, tongue and groove effect, interleaf leakage, and leaf offsets). Several unique aspects of the model were developed based on the results of detailed Monte Carlo simulations of the linear accelerator including (1) use of a non-Gaussian extrafocal fluence source function, (2) separate energy spectra used for focal and extrafocal fluence, and (3) different off-axis energy spectra softening used for focal and extrafocal fluences. The predicted energy fluence was then convolved with Monte Carlo generated, EPID-specific dose kernels to convert incident fluence to dose delivered to the EPID. Measured EPID data were obtained with an a-Si EPID for various MLC-defined fields (from 1×1 to 20×20 cm²) over a range of source-to-detector distances. These measured profiles were used to determine the fluence model parameters in a process analogous to the commissioning of a treatment planning system. The resulting model was tested on 20 clinical IMRT plans, including ten prostate and ten oropharyngeal cases. The model predicted the open-field profiles within 2%, 2 mm, while a mean of 96.6% of pixels over

  4. Ridge-spotting: A new test for Pacific absolute plate motion models

    NASA Astrophysics Data System (ADS)

    Wessel, Paul; Müller, R. Dietmar

    2016-06-01

    Relative plate motions provide high-resolution descriptions of motions of plates relative to other plates. Yet geodynamically, motions of plates relative to the mantle are required since such motions can be attributed to forces (e.g., slab pull and ridge push) acting upon the plates. Various reference frames have been proposed, such as the hot spot reference frame, to link plate motions to a mantle framework. Unfortunately, both accuracy and precision of absolute plate motion models lag behind those of relative plate motion models. Consequently, it is paramount to use relative plate motions in improving our understanding of absolute plate motions. A new technique called "ridge-spotting" combines absolute and relative plate motions and examines the viability of proposed absolute plate motion models. We test the method on six published Pacific absolute plate motions models, including fixed and moving hot spot models as well as a geodynamically derived model. Ridge-spotting reconstructs the Pacific-Farallon and Pacific-Antarctica ridge systems over the last 80 Myr. All six absolute plate motion models predict large amounts of northward migration and monotonic clockwise rotation for the Pacific-Farallon ridge. A geodynamic implication of our ridge migration predictions is that the suggestion that the Pacific-Farallon ridge may have been pinned by a large mantle upwelling is not supported. Unexpected or erratic ridge behaviors may be tied to limitations in the models themselves or (for Indo-Atlantic models) discrepancies in the plate circuits used to project models into the Pacific realm. Ridge-spotting is promising and will be extended to include more plates and other ocean basins.

  5. Quantifying Cancer Absolute Risk and Cancer Mortality in the Presence of Competing Events after a Myotonic Dystrophy Diagnosis

    PubMed Central

    Gadalla, Shahinaz M.; Pfeiffer, Ruth M.; Kristinsson, Sigurdur Y.; Björkholm, Magnus; Hilbert, James E.; Moxley, Richard T.; Landgren, Ola; Greene, Mark H.

    2013-01-01

    Recent studies show that patients with myotonic dystrophy (DM) have an increased risk of specific malignancies, but estimates of absolute cancer risk accounting for competing events are lacking. Using the Swedish Patient Registry, we identified 1,081 patients with an inpatient and/or outpatient diagnosis of DM between 1987 and 2007. Date and cause of death and date of cancer diagnosis were extracted from the Swedish Cause of Death and Cancer Registries. We calculated non-parametric estimates of absolute cancer risk and cancer mortality accounting for the high non-cancer competing mortality associated with DM. Absolute cancer risk after DM diagnosis was 1.6% (95% CI=0.4-4%), 5% (95% CI=3-9%) and 9% (95% CI=6-13%) at ages 40, 50 and 60 years, respectively. Females had a higher absolute risk of all cancers combined than males (9% (95% CI=4-14) and 13% (95% CI=9-20) vs. 2% (95% CI=0.7-6) and 4% (95% CI=2-8) by ages 50 and 60 years, respectively) and developed cancer at younger ages (median age=51 years, range=22-74, vs. 57, range=43-84, respectively; p=0.02). Cancer deaths accounted for 10% of all deaths, with an absolute cancer mortality risk of 2% (95% CI=1-4.5%), 4% (95% CI=2-6%), and 6% (95% CI=4-9%) by ages 50, 60, and 70 years, respectively. No gender difference in cancer-specific mortality was observed (p=0.6). In conclusion, cancer significantly contributes to morbidity and mortality in DM patients, even after accounting for high competing DM mortality from non-neoplastic causes. It is important to apply population-appropriate, validated cancer screening strategies in DM patients. PMID:24236163
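
    The absolute risks above are cumulative incidences that treat non-cancer death as a competing event, CIF(t) = Σ_{t_i ≤ t} S(t_i-) · d_i / n_i; a bare-bones sketch on a tiny invented sample (ties and censoring are handled one subject at a time for simplicity, not the registry data or the study's exact estimator):

      import numpy as np

      def cumulative_incidence(times, events, cause, grid):
          """Nonparametric CIF for `cause`; events: 0 = censored, else cause code."""
          order = np.argsort(times)
          times, events = np.asarray(times)[order], np.asarray(events)[order]
          n, surv, cif, j, out = len(times), 1.0, 0.0, 0, []
          for t in grid:
              while j < n and times[j] <= t:
                  if events[j] != 0:
                      if events[j] == cause:
                          cif += surv / (n - j)   # S(t-) * d/n at this event
                      surv *= 1 - 1 / (n - j)
                  j += 1
              out.append(cif)
          return np.array(out)

      # Toy follow-up years and event types (1 = cancer, 2 = non-cancer death).
      t, e = [2, 3, 5, 5, 8, 9, 12, 15], [2, 1, 0, 2, 1, 2, 0, 1]
      print(cumulative_incidence(t, e, cause=1, grid=[5, 10, 15]))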

  6. Absolute stability and Hopf bifurcation in a Plasmodium falciparum malaria model incorporating discrete immune response delay.

    PubMed

    Ncube, Israel

    2013-05-01

    We consider the absolute stability of the disease-free equilibrium of an intra-host Plasmodium falciparum malarial model allowing for antigenic variation within a single species. Antigenic variation can be viewed as an adaptation of the parasite to evade host defence [2]. The model was recently developed in [3-6]. The host's immune response is compartmentalised into reactions to major and minor epitopes. The immune response mounted by the human host is delayed, where, for simplicity, the delay is assumed to be discrete. We investigate the resulting characteristic equation, with a view to establishing absolute stability criteria and computing the Hopf bifurcation of the disease-free equilibrium.

  7. Constraint on Absolute Accuracy of Metacomprehension Assessments: The Anchoring and Adjustment Model vs. the Standards Model

    ERIC Educational Resources Information Center

    Kwon, Heekyung

    2011-01-01

    The objective of this study is to provide a systematic account of three typical phenomena surrounding absolute accuracy of metacomprehension assessments: (1) the absolute accuracy of predictions is typically quite low; (2) there exist individual differences in absolute accuracy of predictions as a function of reading skill; and (3) postdictions…

  8. Easy Absolute Values? Absolutely

    ERIC Educational Resources Information Center

    Taylor, Sharon E.; Mittag, Kathleen Cage

    2015-01-01

    The authors teach a problem-solving course for preservice middle-grades education majors that includes concepts dealing with absolute-value computations, equations, and inequalities. Many of these students like mathematics and plan to teach it, so they are adept at symbolic manipulations. Getting them to think differently about a concept that they…

  9. New identification method for Hammerstein models based on approximate least absolute deviation

    NASA Astrophysics Data System (ADS)

    Xu, Bao-Chang; Zhang, Ying-Dan

    2016-07-01

    Disorder and peak noise or large disturbances can degrade the identification of Hammerstein nonlinear models when the least-squares (LS) method is used. The least absolute deviation technique can be used to resolve this problem; however, the absolute value function does not meet the differentiability required by most algorithms. To improve robustness and resolve the non-differentiability problem, an approximate least absolute deviation (ALAD) objective function is established by introducing a deterministic function that exhibits the characteristics of the absolute value under certain conditions. A new identification method for Hammerstein models based on ALAD is thus developed in this paper. The basic idea of this method is to apply stochastic approximation theory in deriving the recursive equations. After identifying the parameter matrix of the Hammerstein model via the new algorithm, the product terms in the matrix are separated by calculating their average values. Finally, algorithm convergence is proven by applying the ordinary differential equation method. The proposed algorithm has better robustness than other LS methods, particularly when abnormal points exist in the measured data. Furthermore, the proposed algorithm is easier to apply and converges faster. The simulation results demonstrate the efficacy of the proposed algorithm.
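
    The abstract does not spell out the deterministic surrogate for the absolute value; one common smooth choice with the required differentiability is f(x) = sqrt(x² + δ) for a small δ > 0, which is the assumption in this sketch (not necessarily the paper's function):

      import numpy as np

      def smooth_abs(x, delta=1e-6):
          """Differentiable surrogate for |x|; tends to |x| as delta -> 0."""
          return np.sqrt(x * x + delta)

      def alad_objective(residuals, delta=1e-6):
          """Approximate least-absolute-deviation loss over model residuals."""
          return smooth_abs(np.asarray(residuals, float), delta).sum()

      # Grows linearly, not quadratically, with the outlier at -20, so it is
      # far less dominated by it than a squared (LS) loss would be.
      print(alad_objective([0.5, -20.0, 0.3]), (np.array([0.5, -20.0, 0.3]) ** 2).sum())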

  10. A simple model explaining super-resolution in absolute optical instruments

    NASA Astrophysics Data System (ADS)

    Leonhardt, Ulf; Sahebdivan, Sahar; Kogan, Alex; Tyc, Tomáš

    2015-05-01

    We develop a simple, one-dimensional model for super-resolution in absolute optical instruments that is able to describe the interplay between sources and detectors. Our model explains the subwavelength sensitivity of a point detector to a point source reported in previous computer simulations and experiments (Miñano 2011 New J. Phys. 13 125009; Miñano 2014 New J. Phys. 16 033015).

  11. Fractional Brownian Motion with Stochastic Variance:. Modeling Absolute Returns in STOCK Markets

    NASA Astrophysics Data System (ADS)

    Roman, H. E.; Porto, M.

    We discuss a model for simulating a long-time memory in time series characterized in addition by a stochastic variance. The model is based on a combination of fractional Brownian motion (FBM) concepts, for dealing with the long-time memory, with an autoregressive scheme with conditional heteroskedasticity (ARCH), responsible for the stochastic variance of the series, and is denoted as FBMARCH. Unlike well-known fractionally integrated autoregressive models, FBMARCH admits finite second moments. The resulting probability distribution functions have power-law tails with exponents similar to ARCH models. This idea is applied to the description of long-time autocorrelations of absolute returns ubiquitously observed in stock markets.

  12. Modelling and measurement of the absolute level of power radiated by antenna integrated THz UTC photodiodes.

    PubMed

    Natrella, Michele; Liu, Chin-Pang; Graham, Chris; van Dijk, Frederic; Liu, Huiyun; Renaud, Cyril C; Seeds, Alwyn J

    2016-05-30

    We determine the output impedance of uni-travelling carrier (UTC) photodiodes at frequencies up to 400 GHz by performing, for the first time, 3D full-wave modelling of detailed UTC photodiode structures. In addition, we demonstrate the importance of the UTC impedance evaluation, by using it in the prediction of the absolute power radiated by an antenna integrated UTC, over a broad frequency range and confirming the predictions by experimental measurements up to 185 GHz. This is done by means of 3D full-wave modelling and is only possible since the source (UTC) to antenna impedance match is properly taken into account. We also show that, when the UTC-to-antenna coupling efficiency is modelled using the classical junction-capacitance/series-resistance concept, calculated and measured levels of absolute radiated power are in substantial disagreement, and the maximum radiated power is overestimated by a factor of almost 7 dB. The ability to calculate the absolute emitted power correctly enables the radiated power to be maximised through optimisation of the UTC-to-antenna impedance match. PMID:27410104

  13. Time-series modeling and prediction of global monthly absolute temperature for environmental decision making

    NASA Astrophysics Data System (ADS)

    Ye, Liming; Yang, Guixia; Van Ranst, Eric; Tang, Huajun

    2013-03-01

    A generalized, structural, time series modeling framework was developed to analyze the monthly records of absolute surface temperature, one of the most important environmental parameters, using a deterministic-stochastic combined (DSC) approach. Although the development of the framework was based on the characterization of the variation patterns of a global dataset, the methodology could be applied to any monthly absolute temperature record. Deterministic processes were used to characterize the variation patterns of the global trend and the cyclic oscillations of the temperature signal, involving polynomial functions and the Fourier method, respectively, while stochastic processes were employed to account for any remaining patterns in the temperature signal, involving seasonal autoregressive integrated moving average (SARIMA) models. A prediction of the monthly global surface temperature during the second decade of the 21st century using the DSC model shows that the global temperature will likely continue to rise at twice the average rate of the past 150 years. The evaluation of prediction accuracy shows that DSC models perform systematically well against selected models of other authors, suggesting that DSC models, when coupled with other eco-environmental models, can be used as a supplemental tool for short-term (~10-year) environmental planning and decision making.
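
    A compact sketch of the deterministic-stochastic decomposition described above, on a synthetic monthly series (the polynomial degree, single annual harmonic, and SARIMA orders are illustrative assumptions, not the paper's fitted choices):

      import numpy as np
      from statsmodels.tsa.statespace.sarimax import SARIMAX

      rng = np.random.default_rng(2)
      t = np.arange(360)  # 30 years of monthly temperatures
      y = 14 + 0.001 * t + 10 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 0.5, 360)

      def design(t):
          # Deterministic part: quadratic trend plus one annual Fourier harmonic.
          return np.column_stack([t ** 0, t, t ** 2,
                                  np.sin(2 * np.pi * t / 12), np.cos(2 * np.pi * t / 12)])

      beta, *_ = np.linalg.lstsq(design(t), y, rcond=None)
      residual = y - design(t) @ beta

      # Stochastic part: a seasonal ARIMA model on what the deterministic part misses.
      sarima = SARIMAX(residual, order=(1, 0, 1), seasonal_order=(1, 0, 0, 12)).fit(disp=False)

      # 12-month-ahead prediction: deterministic extrapolation + SARIMA forecast.
      tf = np.arange(360, 372)
      print(design(tf) @ beta + sarima.forecast(steps=12))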

  14. The Impact of Different Absolute Solar Irradiance Values on Current Climate Model Simulations

    NASA Technical Reports Server (NTRS)

    Rind, David H.; Lean, Judith L.; Jonas, Jeffrey

    2014-01-01

    Simulations of the preindustrial and doubled CO2 climates are made with the GISS Global Climate Middle Atmosphere Model 3 using two different estimates of the absolute solar irradiance value: a higher value measured by solar radiometers in the 1990s and a lower value measured recently by the Solar Radiation and Climate Experiment. Each of the model simulations is adjusted to achieve global energy balance; without this adjustment the difference in irradiance produces a global temperature change of 0.4°C, comparable to the cooling estimated for the Maunder Minimum. The results indicate that by altering cloud cover the model properly compensates for the different absolute solar irradiance values on a global level when simulating both preindustrial and doubled CO2 climates. On a regional level, the preindustrial climate simulations and the patterns of change with doubled CO2 concentrations are again remarkably similar, but there are some differences. Using a higher absolute solar irradiance value and the requisite cloud cover affects the model's depictions of high-latitude surface air temperature, sea level pressure, and stratospheric ozone, as well as tropical precipitation. In the climate change experiments it leads to an underestimation of North Atlantic warming, reduced precipitation in the tropical western Pacific, and smaller total ozone growth at high northern latitudes. Although significant, these differences are typically modest compared with the magnitude of the regional changes expected for doubled greenhouse gas concentrations. Nevertheless, the model simulations demonstrate that achieving the highest possible fidelity when simulating regional climate change requires that climate models use as input the most accurate (lower) solar irradiance value.

  15. Absolute stability and synchronization in neural field models with transmission delays

    NASA Astrophysics Data System (ADS)

    Kao, Chiu-Yen; Shih, Chih-Wen; Wu, Chang-Hong

    2016-08-01

    Neural fields model macroscopic parts of the cortex which involve several populations of neurons. We consider a class of neural field models which are represented by integro-differential equations with transmission time delays which are space-dependent. The considered domains underlying the systems can be bounded or unbounded. A new approach, called sequential contracting, instead of the conventional Lyapunov functional technique, is employed to investigate the global dynamics of such systems. Sufficient conditions for the absolute stability and synchronization of the systems are established. Several numerical examples are presented to demonstrate the theoretical results.

  16. Absolute IGS antenna phase center model igs08.atx: status and potential improvements

    NASA Astrophysics Data System (ADS)

    Schmid, R.; Dach, R.; Collilieux, X.; Jäggi, A.; Schmitz, M.; Dilssner, F.

    2016-04-01

    On 17 April 2011, all analysis centers (ACs) of the International GNSS Service (IGS) adopted the reference frame realization IGS08 and the corresponding absolute antenna phase center model igs08.atx for their routine analyses. The latter consists of an updated set of receiver and satellite antenna phase center offsets and variations (PCOs and PCVs). An update of the model was necessary due to the difference of about 1 ppb in the terrestrial scale between two consecutive realizations of the International Terrestrial Reference Frame (ITRF2008 vs. ITRF2005), as that parameter is highly correlated with the GNSS satellite antenna PCO components in the radial direction.

  17. A California statewide three-dimensional seismic velocity model from both absolute and differential times

    USGS Publications Warehouse

    Lin, G.; Thurber, C.H.; Zhang, H.; Hauksson, E.; Shearer, P.M.; Waldhauser, F.; Brocher, T.M.; Hardebeck, J.

    2010-01-01

    We obtain a seismic velocity model of the California crust and uppermost mantle using a regional-scale double-difference tomography algorithm. We begin by using absolute arrival-time picks to solve for a coarse three-dimensional (3D) P velocity (VP) model with a uniform 30 km horizontal node spacing, which we then use as the starting model for a finer-scale inversion using double-difference tomography applied to absolute and differential pick times. For computational reasons, we split the state into 5 subregions with a grid spacing of 10 to 20 km and assemble our final statewide VP model by stitching together these local models. We also solve for a statewide S-wave model using S picks from both the Southern California Seismic Network and USArray, assuming a starting model based on the VP results and a VP/VS ratio of 1.732. Our new model has improved areal coverage compared with previous models, extending 570 km in the SW-NE direction and 1320 km in the NW-SE direction. It also extends to greater depth due to the inclusion of substantial data at large epicentral distances. Our VP model generally agrees with previous separate regional models for northern and southern California, but we also observe some new features, such as high-velocity anomalies at shallow depths in the Klamath Mountains and Mount Shasta area, somewhat slow velocities in the northern Coast Ranges, and slow anomalies beneath the Sierra Nevada at midcrustal and greater depths. This model can be applied to a variety of regional-scale studies in California, such as developing a unified statewide earthquake location catalog and performing regional waveform modeling.

  18. Melanoma Risk Prediction Models

    Cancer.gov

    Developing statistical models that estimate the probability of developing melanoma over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  19. Absolute Summ

    NASA Astrophysics Data System (ADS)

    Phillips, Alfred, Jr.

    Summ means the entirety of the multiverse. It seems clear, from the inflation theories of A. Guth and others, that the creation of many universes is plausible. We argue that Absolute cosmological ideas, not unlike those of I. Newton, may be consistent with dynamic multiverse creations. As suggested in W. Heisenberg's uncertainty principle, and with the Anthropic Principle defended by S. Hawking, et al., human consciousness, buttressed by findings of neuroscience, may have to be considered in our models. Predictability, as A. Einstein realized with Invariants and General Relativity, may be required for new ideas to be part of physics. We present here a two-postulate model geared to an Absolute Summ. The seedbed of this work is part of Akhnaton's philosophy (see S. Freud, Moses and Monotheism). Most important, however, is that the structure of human consciousness, manifest in Kenya's Rift Valley 200,000 years ago as Homo sapiens, who were the culmination of the six-million-year co-creation process of Hominins and Nature in Africa, allows us to do the physics that we do.

  1. Improved Strategies and Optimization of Calibration Models for Real-time PCR Absolute Quantification

    EPA Science Inventory

    Real-time PCR absolute quantification applications rely on the use of standard curves to make estimates of DNA target concentrations in unknown samples. Traditional absolute quantification approaches dictate that a standard curve must accompany each experimental run. However, t...
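
    Standard-curve absolute quantification, as referenced above, is a linear fit of quantification cycle (Cq) against log10 target concentration; a sketch with an invented dilution series (not the EPA study's data):

      import numpy as np

      # Hypothetical calibration points: known copies/reaction and measured Cq.
      copies = np.array([1e6, 1e5, 1e4, 1e3, 1e2])
      cq = np.array([14.8, 18.2, 21.5, 24.9, 28.3])

      slope, intercept = np.polyfit(np.log10(copies), cq, 1)
      efficiency = 10 ** (-1 / slope) - 1  # 1.0 would mean perfect doubling per cycle

      def quantify(cq_unknown):
          """Invert the standard curve to estimate copies in an unknown sample."""
          return 10 ** ((cq_unknown - intercept) / slope)

      print(round(efficiency, 2), round(quantify(20.0)))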

  2. The finite element absolute nodal coordinate formulation incorporated with surface stress effect to model elastic bending nanowires in large deformation

    NASA Astrophysics Data System (ADS)

    He, Jin; Lilley, Carmen M.

    2009-08-01

    Surface stress was incorporated into the finite element absolute nodal coordinate formulation in order to model elastic bending of nanowires in large deformation. The absolute nodal coordinate formulation is a numerical method to model bending structures in large deformation. The generalized Young-Laplace equation was employed to model the surface stress effect on bending nanowires. Effects from surface stress and large deformation on static bending nanowires are presented and discussed. The results calculated with the absolute nodal coordinate formulation incorporated with surface stress show that the surface stress effect makes the bending nanowires behave like softer or stiffer materials depending on the boundary condition. The surface stress effect diminishes as the dimensions of the bending structures increase beyond the nanoscale. The developed algorithm is consistent with the classical absolute nodal coordinate formulation at the macroscale.

  3. Computations of absolute solvation free energies of small molecules using explicit and implicit solvent model.

    SciTech Connect

    Shivakumar, D.; Deng, Y.; Roux, B.; Biosciences Division; Univ. of Chicago

    2009-01-01

    Accurate determination of absolute solvation free energy plays a critical role in numerous areas of biomolecular modeling and drug discovery. A quantitative representation of ligand and receptor desolvation, in particular, is an essential component of current docking and scoring methods. Furthermore, the partitioning of a drug between aqueous and nonpolar solvents is one of the important factors considered in pharmacokinetics. In this study, the absolute hydration free energy for a set of 239 neutral ligands spanning diverse chemical functional groups commonly found in drugs and drug-like candidates is calculated using the molecular dynamics free energy perturbation method (FEP/MD) with explicit water molecules, and compared to experimental data as well as its counterparts obtained using implicit solvent models. The hydration free energies are calculated from explicit solvent simulations using a staged FEP procedure permitting a separation of the total free energy into polar and nonpolar contributions. The nonpolar component is further decomposed into attractive (dispersive) and repulsive (cavity) components using the Weeks-Chandler-Andersen (WCA) separation scheme. To increase the computational efficiency, all of the FEP/MD simulations are generated using a mixed explicit/implicit solvent scheme with a relatively small number of explicit TIP3P water molecules, in which the influence of the remaining bulk is incorporated via the spherical solvent boundary potential (SSBP). The performances of two fixed-charge force fields designed for small organic molecules, the General Amber force field (GAFF) and the all-atom CHARMm-MSI, are compared. Because of the crucial role of electrostatics in solvation free energy, the results from various commonly used charge generation models based on semiempirical (AM1-BCC) and QM calculations [charge fitting using ChelpG and RESP] are compared. In addition, the solvation free energies of the test set are also calculated using
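
    The staged decomposition described above can be summarized compactly. The LaTeX sketch below uses illustrative notation, not notation taken verbatim from the paper, for the polar/nonpolar split and the WCA separation of the nonpolar term:

```latex
% Polar/nonpolar split of the solvation free energy, with the nonpolar part
% further separated into dispersive (attractive) and cavity (repulsive)
% contributions via the WCA scheme (illustrative notation).
\Delta G_{\mathrm{solv}} = \Delta G_{\mathrm{elec}} + \Delta G_{\mathrm{np}},
\qquad
\Delta G_{\mathrm{np}} = \Delta G_{\mathrm{disp}} + \Delta G_{\mathrm{cav}}
```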

  4. Photochirogenesis: Photochemical models on the absolute asymmetric formation of amino acids in interstellar space

    NASA Astrophysics Data System (ADS)

    Meinert, Cornelia; de Marcellus, Pierre; Le Sergeant D'Hendecourt, Louis; Nahon, Laurent; Jones, Nykola C.; Hoffmann, Søren V.; Bredehöft, Jan Hendrik; Meierhenrich, Uwe J.

    2011-10-01

    Proteins of all living organisms including plants, animals, and humans are made up of amino acid monomers that show identical stereochemical L-configuration. Hypotheses for the origin of this symmetry breaking in biomolecules include the absolute asymmetric photochemistry model, by which interstellar ultraviolet (UV) circularly polarized light (CPL) induces an enantiomeric excess in chiral organic molecules in interstellar/circumstellar media. This scenario is supported by a) the detection of amino acids in the organic residues of UV-photo-processed interstellar ice analogues, b) the occurrence of L-enantiomer-enriched amino acids in carbonaceous meteorites, and c) the observation of CPL of the same helicity over large distance scales in the massive star-forming region of Orion. These topics are of high importance in current biophysical research and will be discussed in this review. Further evidence that amino acids and other molecules of prebiotic interest are asymmetrically formed in space comes from studies on the enantioselective photolysis of amino acids by UV-CPL. Experiments have also been performed on the absolute asymmetric photochemical synthesis of enantiomer-enriched amino acids from mixtures of astrophysically relevant achiral precursor molecules using UV circularly polarized photons. Both approaches are based on circular dichroic transitions of amino acids, which will be highlighted here as well. These results have strong implications for our current understanding of how life's precursor molecules were possibly built and how life selected the left-handed form of proteinogenic amino acids.

  5. Modeling absolute differences in life expectancy with a censored skew-normal regression approach

    PubMed Central

    Clough-Gorr, Kerri; Zwahlen, Marcel

    2015-01-01

    Parameter estimates from commonly used multivariable parametric survival regression models do not directly quantify differences in years of life expectancy. Gaussian linear regression models give results in terms of absolute mean differences, but are not appropriate for modeling life expectancy, because in many situations time to death has a negatively skewed distribution. A regression approach using a skew-normal distribution is an alternative to parametric survival models for modeling life expectancy, because parameter estimates can be interpreted in terms of survival time differences while allowing for skewness of the distribution. In this paper we show how to use skew-normal regression so that censored and left-truncated observations are accounted for. With this approach we model differences in life expectancy using data from the Swiss National Cohort Study and from official life expectancy estimates, and compare the results with those derived from commonly used survival regression models. We conclude that a censored skew-normal survival regression approach for left-truncated observations can be used to model differences in life expectancy across covariates of interest. PMID:26339544
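
    To make the estimation idea concrete, here is a rough sketch of the negative log-likelihood such a model maximizes, assuming scipy's skew-normal parameterization; all names are hypothetical and this is not the authors' code. Uncensored deaths contribute the density, right-censored observations the survival function, and left truncation rescales by the probability of surviving past study entry.

```python
# Censored, left-truncated skew-normal regression log-likelihood (sketch).
import numpy as np
from scipy import stats

def neg_loglik(params, t, event, entry, X):
    """t: time; event: 1=death, 0=right-censored; entry: left-truncation time;
    X: covariate matrix. params = (betas..., log_scale, shape)."""
    k = X.shape[1]
    dist = stats.skewnorm(params[k + 1],           # skewness parameter
                          loc=X @ params[:k],      # location = linear predictor
                          scale=np.exp(params[k])) # enforce positive scale
    ll = np.where(event == 1, dist.logpdf(t), dist.logsf(t))
    ll -= dist.logsf(entry)                        # condition on surviving entry
    return -np.sum(ll)
# Minimize with scipy.optimize.minimize to obtain parameter estimates.
```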

  6. Absolute nuclear material assay

    DOEpatents

    Prasad, Manoj K.; Snyderman, Neal J.; Rowland, Mark S.

    2012-05-15

    A method of absolute nuclear material assay of an unknown source comprising counting neutrons from the unknown source and providing an absolute nuclear material assay utilizing a model to optimally compare to the measured count distributions. In one embodiment, the step of providing an absolute nuclear material assay comprises utilizing a random sampling of analytically computed fission chain distributions to generate a continuous time-evolving sequence of event-counts by spreading the fission chain distribution in time.

  7. Absolute nuclear material assay

    DOEpatents

    Prasad, Manoj K.; Snyderman, Neal J.; Rowland, Mark S.

    2010-07-13

    A method of absolute nuclear material assay of an unknown source comprising counting neutrons from the unknown source and providing an absolute nuclear material assay utilizing a model to optimally compare to the measured count distributions. In one embodiment, the step of providing an absolute nuclear material assay comprises utilizing a random sampling of analytically computed fission chain distributions to generate a continuous time-evolving sequence of event-counts by spreading the fission chain distribution in time.

  8. An Integrated Model of Choices and Response Times in Absolute Identification

    ERIC Educational Resources Information Center

    Brown, Scott D.; Marley, A. A. J.; Donkin, Christopher; Heathcote, Andrew

    2008-01-01

    Recent theoretical developments in the field of absolute identification have stressed differences between relative and absolute processes, that is, whether stimulus magnitudes are judged relative to a shorter term context provided by recently presented stimuli or a longer term context provided by the entire set of stimuli. The authors developed a…

  9. Mammographic breast density and breast cancer risk: interactions of percent density, absolute dense, and non-dense areas with breast cancer risk factors.

    PubMed

    Yaghjyan, Lusine; Colditz, Graham A; Rosner, Bernard; Tamimi, Rulla M

    2015-02-01

    We investigated whether associations of breast density and breast cancer differ according to the level of other known breast cancer risk factors, including body mass index (BMI), age at menarche, parity, age at first child's birth, age at menopause, alcohol consumption, a family history of breast cancer, a history of benign breast disease, and physical activity. This study included 1,044 postmenopausal incident breast cancer cases diagnosed within the Nurses' Health Study cohort and 1,794 matched controls. Percent breast density, absolute dense, and non-dense areas were measured from digitized film images with computerized techniques. Information on breast cancer risk factors was obtained prospectively from biennial questionnaires. Percent breast density was more strongly associated with breast cancer risk in current postmenopausal hormone users (≥50 vs. <10 %: OR 5.34, 95 % CI 3.36-8.49) as compared to women with past (OR 2.69, 95 % CI 1.32-5.49) or no hormone history (OR 2.57, 95 % CI 1.18-5.60, p-interaction = 0.03). Non-dense area was inversely associated with breast cancer risk in parous women, but not in women without children (p-interaction = 0.03). Associations of density with breast cancer risk did not differ by the levels of BMI, age at menarche, parity, age at first child's birth, age at menopause, alcohol consumption, a family history of breast cancer, a history of benign breast disease, or physical activity. Women with dense breasts who currently use menopausal hormone therapy are at a particularly high risk of breast cancer. Most breast cancer risk factors do not modify the association between mammographic breast density and breast cancer risk.

  10. Forecasting the absolute and relative shortage of physicians in Japan using a system dynamics model approach

    PubMed Central

    2013-01-01

    Background In Japan, a shortage of physicians, who serve a key role in healthcare provision, has been pointed out as a major medical issue. Healthcare workforce policy planners should consider future dynamic changes in physician numbers. The purpose of this study was to propose a physician supply forecasting methodology by applying system dynamics modeling to estimate future absolute and relative numbers of physicians. Method We constructed a forecasting model using a system dynamics approach. Forecasting of physician numbers was performed for all clinical physicians and for OB/GYN specialists. Moreover, we evaluated the sufficiency of physician numbers and conducted a sensitivity analysis. Result & conclusion The number of physicians was forecast to increase during 2008–2030, with the shortage resolving in 2026 for all clinical physicians; for OB/GYN specialists, however, the shortage would not resolve within the period covered. This suggests a need for measures to reconsider the allocation system of newly entering physicians to resolve maldistribution between medical departments, in addition to increasing the overall number of clinical physicians. PMID:23981198
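
    As a flavor of the system dynamics approach, the following toy stock-and-flow loop shows how a physician stock evolves under inflow and outflow; every number in it is an invented placeholder, not a value from the study.

```python
# Toy stock-and-flow physician supply projection (system dynamics sketch).
entries_per_year = 8000        # hypothetical new physicians per year
retirement_rate = 0.025        # hypothetical fraction retiring each year
physicians = 280_000           # hypothetical initial stock (2008)
demand = 320_000               # hypothetical required number

for year in range(2008, 2031):
    shortage = max(demand - physicians, 0)
    print(year, int(physicians), int(shortage))
    physicians += entries_per_year - retirement_rate * physicians
```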

  11. Statistical modeling reveals the effect of absolute humidity on dengue in Singapore.

    PubMed

    Xu, Hai-Yan; Fu, Xiuju; Lee, Lionel Kim Hock; Ma, Stefan; Goh, Kee Tai; Wong, Jiancheng; Habibullah, Mohamed Salahuddin; Lee, Gary Kee Khoon; Lim, Tian Kuay; Tambyah, Paul Anantharajah; Lim, Chin Leong; Ng, Lee Ching

    2014-05-01

    Weather factors are widely studied as indicators of dengue incidence trends. However, these studies have been limited due to the complex epidemiology of dengue, which involves a dynamic interplay of multiple factors such as herd immunity within a population, distinct serotypes of the virus, environmental factors and intervention programs. In this study, we investigate the impact of weather factors on dengue in Singapore, considering the disease epidemiology and the profile of virus serotypes. A Poisson regression combined with a Distributed Lag Non-linear Model (DLNM) was used to evaluate and compare the impact of weekly Absolute Humidity (AH) and other weather factors (mean temperature, minimum temperature, maximum temperature, rainfall, relative humidity and wind speed) on dengue incidence from 2001 to 2009. The same analysis was also performed on three sub-periods, defined by the predominant circulating serotypes. The performance of the DLNM regression models was then evaluated using Akaike's Information Criterion. From the correlation and DLNM regression modeling analyses of the studied period, AH was found to be a better predictor for modeling dengue incidence than the other individual weather variables. Whilst mean temperature (MeanT) also showed a significant correlation with dengue incidence, the relationship between AH or MeanT and dengue incidence varied across the three sub-periods. Our results showed that AH had a more stable impact on dengue incidence than temperature when virological factors were taken into consideration. AH appeared to be the most consistent factor in modeling dengue incidence in Singapore. Considering the changes in dominant serotypes, the improvements in vector control programs and the inconsistent weather patterns observed in the sub-periods, the impact of weather on dengue is modulated by these other factors. Future studies on the impact of climate change on dengue need to take all the other contributing factors into
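
    The study's DLNM machinery is more flexible than plain lags, but the core regression idea can be sketched as a Poisson GLM on lagged AH terms. The sketch below uses synthetic data and a hypothetical 12-week lag window; it illustrates the idea rather than reproducing the authors' model.

```python
# Distributed-lag Poisson regression of weekly dengue counts on absolute
# humidity (synthetic data; a simplified stand-in for the DLNM).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
weeks, max_lag = 470, 12
ah = 25 + 2 * rng.standard_normal(weeks)            # weekly AH, g/m^3

X = np.column_stack([np.roll(ah, L) for L in range(max_lag + 1)])[max_lag:]
y = rng.poisson(np.exp(0.05 * X.sum(axis=1) - 14))  # synthetic dengue counts

res = sm.GLM(y, sm.add_constant(X), family=sm.families.Poisson()).fit()
print(res.params[1:])                               # one coefficient per lag
```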

  12. Absolute quantification of the pretreatment PML-RARA transcript defines the relapse risk in acute promyelocytic leukemia.

    PubMed

    Albano, Francesco; Zagaria, Antonella; Anelli, Luisa; Coccaro, Nicoletta; Tota, Giuseppina; Brunetti, Claudia; Minervini, Crescenzio Francesco; Impera, Luciana; Minervini, Angela; Cellamare, Angelo; Orsini, Paola; Cumbo, Cosimo; Casieri, Paola; Specchia, Giorgina

    2015-05-30

    In this study we performed absolute quantification of the PML-RARA transcript by droplet digital polymerase chain reaction (ddPCR) in 76 newly diagnosed acute promyelocytic leukemia (APL) cases to verify the prognostic impact of the initial PML-RARA molecular burden. ddPCR analysis revealed that the amount of PML-RARA transcript at diagnosis was higher in the group of patients who relapsed than in those with continuous complete remission (CCR) (272 vs 89.2 PML-RARA copies/ng, p = 0.0004). Receiver operating characteristic analysis identified the optimal PML-RARA concentration threshold as 209.6 PML-RARA/ng (AUC 0.78; p < 0.0001) for discriminating between outcomes (CCR versus relapse). Among the 67 APL cases who achieved complete remission after the induction treatment, those with >209.6 PML-RARA/ng had a worse relapse-free survival (p = 0.0006). At 5-year follow-up, patients with >209.6 PML-RARA/ng had a cumulative incidence of relapse of 50.3%, whereas 7.5% of the patients with ≤209.6 PML-RARA/ng suffered a relapse (p < 0.0001). Multivariate analysis identified the amount of PML-RARA before induction treatment as the sole independent prognostic factor for APL relapse. Our results show that the pretreatment PML-RARA molecular burden could therefore be used to improve risk stratification in order to develop more individualized treatment regimens for high-risk APL cases. PMID:25944686
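
    The threshold-finding step generalizes readily: given per-patient transcript levels and relapse outcomes, an ROC-optimal cutpoint can be located, for example, with the Youden index. The sketch below uses synthetic numbers shaped like the study's groups, not the actual data.

```python
# Locating a discriminating threshold on a ROC curve (synthetic data).
import numpy as np
from sklearn.metrics import roc_curve, auc

rng = np.random.default_rng(1)
copies = np.concatenate([rng.lognormal(4.5, 0.8, 50),    # CCR-like values
                         rng.lognormal(5.6, 0.8, 26)])   # relapse-like values
relapsed = np.concatenate([np.zeros(50), np.ones(26)])

fpr, tpr, thresholds = roc_curve(relapsed, copies)
print("AUC:", auc(fpr, tpr))
print("Youden-optimal threshold:", thresholds[np.argmax(tpr - fpr)])
```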

  13. Acute radiation risk models

    NASA Astrophysics Data System (ADS)

    Smirnova, Olga

    Biologically motivated mathematical models, which describe the dynamics of the major hematopoietic lineages (the thrombocytopoietic, lymphocytopoietic, granulocytopoietic, and erythropoietic systems) in acutely/chronically irradiated humans, are developed. These models are implemented as systems of nonlinear differential equations whose variables and constant parameters have clear biological meaning. It is shown that the developed models are capable of reproducing clinical data on the dynamics of these systems in humans exposed to acute radiation as a result of incidents and accidents, as well as in humans exposed to low-level chronic radiation. Moreover, the averaged value of the "lethal" dose rates of chronic irradiation evaluated within the models of these four major hematopoietic lineages coincides with the real minimal dose rate of lethal chronic irradiation. The demonstrated ability of the models of the human thrombocytopoietic, lymphocytopoietic, granulocytopoietic, and erythropoietic systems to predict the dynamical response of these systems to acute/chronic irradiation over wide ranges of doses and dose rates implies that these mathematical models form a universal tool for the investigation and prediction of the dynamics of the major human hematopoietic lineages for a wide variety of irradiation scenarios. In particular, these models could be applied for radiation risk assessment for the health of astronauts exposed to space radiation during long-term space missions, such as voyages to Mars or Lunar colonies, as well as for the health of people exposed to acute/chronic irradiation due to environmental radiological events.
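
    Although the full systems of equations are in the cited work, the general shape of one compartment equation in models of this kind can be indicated. The LaTeX sketch below is illustrative only; the symbols and the radiation-damage term follow the common one-target-one-hit convention and are assumptions, not copied from the models themselves:

```latex
% X: cell concentration in the compartment; B(X): feedback-regulated
% production; \gamma: normal cell loss rate; N: dose rate; D_0: the
% compartment's characteristic radiosensitivity dose (illustrative notation).
\frac{dX}{dt} = B(X) - \gamma X - \frac{N}{D_0}\,X
```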

  14. 3D geomechanical-numerical modelling of the absolute stress state for geothermal reservoir exploration

    NASA Astrophysics Data System (ADS)

    Reiter, Karsten; Heidbach, Oliver; Moeck, Inga

    2013-04-01

    For the assessment and exploration of a potential geothermal reservoir, the contemporary in-situ stress is of key importance in terms of well stability and the orientation of possible fluid pathways. However, available data, e.g. Heidbach et al. (2009) or Zang et al. (2012), deliver only pointwise information on parts of the six independent components of the stress tensor. Moreover, most measurements of stress orientation and magnitude are made for the hydrocarbon industry, typically at shallow depths. Interpolation across long distances or extrapolation into depth is unfavourable, because this would ignore structural features, inhomogeneities in the crust and other local effects such as topography. For these reasons, geomechanical-numerical modelling is the preferred method to quantify the orientations and magnitudes of the 3D stress field for a geothermal reservoir. A geomechanical-numerical model estimating the 3D absolute stress state requires the initial stress state as a model constraint. But in-situ stress measurements within or close to a potential reservoir are rare. For that reason a larger regional geomechanical-numerical model is necessary, which derives boundary conditions for the desired local reservoir model. Such a large-scale model has to be tested against in-situ stress measurements, both orientations and magnitudes. Other suitable and available data, like GPS measurements or fault slip rates, are useful to constrain kinematic boundary conditions. This stepwise approach from regional to local scale takes all stress field factors into account, from first over second up to third order. As an example we present a large-scale crustal and upper mantle 3D geomechanical-numerical model of the Alberta Basin and its surroundings, which is constructed to describe the full stress tensor continuously. In-situ stress measurements are the most valuable data, because they deliver the most direct information on the stress field and provide insights into different depths, a

  15. Mental Models of Security Risks

    NASA Astrophysics Data System (ADS)

    Asgharpour, Farzaneh; Liu, Debin; Camp, L. Jean

    In computer security, risk communication refers to informing computer users about the likelihood and magnitude of a threat. The efficacy of risk communication depends not only on the nature of the risk, but also on the alignment between the conceptual model embedded in the risk communication and the user's mental model of the risk. The gap between the mental models of security experts and non-experts could lead to ineffective risk communication. Our research shows that, for a variety of security risks, self-identified security experts and non-experts have different mental models. We propose that the design of risk communication methods should be based on the non-experts' mental models.

  16. Gender equality and women's absolute status: a test of the feminist models of rape.

    PubMed

    Martin, Kimberly; Vieraitis, Lynne M; Britto, Sarah

    2006-04-01

    Feminist theory predicts both a positive and negative relationship between gender equality and rape rates. Although liberal and radical feminist theory predicts that gender equality should ameliorate rape victimization, radical feminist theorists have argued that gender equality may increase rape in the form of male backlash. Alternatively, Marxist criminologists focus on women's absolute socioeconomic status rather than gender equality as a predictor of rape rates, whereas socialist feminists combine both radical and Marxist perspectives. This study uses factor analysis to overcome multicollinearity limitations of past studies while exploring the relationship between women's absolute and relative socioeconomic status on rape rates in major U.S. cities using 2000 census data. The findings indicate support for both the Marxist and radical feminist explanations of rape but no support for the ameliorative hypothesis. These findings support a more inclusive socialist feminist theory that takes both Marxist and radical feminist hypotheses into account. PMID:16567334

  18. Variable selection for modeling the absolute magnitude at maximum of Type Ia supernovae

    NASA Astrophysics Data System (ADS)

    Uemura, Makoto; Kawabata, Koji S.; Ikeda, Shiro; Maeda, Keiichi

    2015-06-01

    We discuss which set of explanatory variables is appropriate for predicting the absolute magnitude at maximum of Type Ia supernovae. In order to have a good prediction, the error for future data, which is called the "generalization error," should be small. We use cross-validation in order to control the generalization error and a LASSO-type estimator in order to choose the set of variables. This approach can be used even when the number of samples is smaller than the number of candidate variables. We studied the Berkeley supernova database with our approach. Candidates for the explanatory variables include normalized spectral data, variables describing lines, and previously proposed flux ratios, as well as the color and light-curve widths. As a result, we confirmed the past understanding of Type Ia supernovae: (i) The absolute magnitude at maximum depends on the color and light-curve width. (ii) The light-curve width depends on the strength of Si II. Recent studies have suggested adding more variables in order to explain the absolute magnitude. However, our analysis does not support adding any other variables in order to achieve a better generalization error.
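
    The variable-selection procedure maps naturally onto a cross-validated LASSO fit. Below is a minimal sketch with synthetic data standing in for the spectral features; neither the data nor any parameter values come from the paper.

```python
# LASSO with cross-validation when candidate variables outnumber samples.
import numpy as np
from sklearn.linear_model import LassoCV

rng = np.random.default_rng(2)
n, p = 78, 200                        # more candidate variables than SNe
X = rng.standard_normal((n, p))       # color, light-curve width, spectra...
y = 1.0 * X[:, 0] + 0.5 * X[:, 1] + 0.1 * rng.standard_normal(n)

model = LassoCV(cv=5).fit(X, y)       # penalty chosen by cross-validation
print("variables kept:", np.flatnonzero(model.coef_))  # ideally columns 0, 1
```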

  19. Determining the importance of model calibration for forecasting absolute/relative changes in streamflow from LULC and climate changes

    USGS Publications Warehouse

    Niraula, Rewati; Meixner, Thomas; Norman, Laura M.

    2015-01-01

    Land use/land cover (LULC) and climate changes are important drivers of change in streamflow. Assessing the impact of LULC and climate changes on streamflow is typically done with a calibrated and validated watershed model. However, there is a debate on the degree of calibration required. The objective of this study was to quantify the variation in estimated relative and absolute changes in streamflow associated with LULC and climate changes under different calibration approaches. The Soil and Water Assessment Tool (SWAT) was applied in an uncalibrated (UC), single-outlet calibrated (OC), and spatially calibrated (SC) mode to compare the relative and absolute changes in streamflow at 14 gaging stations within the Santa Cruz River Watershed in southern Arizona, USA. For this purpose, the effects of 3 LULC, 3 precipitation (P), and 3 temperature (T) scenarios were tested individually. For the validation period, Percent Bias (PBIAS) values were >100% with the UC model for all gages, between 0% and 100% with the OC model, and within 20% with the SC model. Changes in streamflow predicted with the UC and OC models were compared with those of the SC model; this approach implicitly assumes that the SC model is “ideal”. Results indicated that the magnitudes of both absolute and relative changes in streamflow due to LULC change predicted with the UC and OC models differed from those of the SC model. The magnitudes of absolute changes due to climate change (both P and T) predicted with the UC and SC models were also significantly different, but those of the OC and SC models were not. Results clearly indicated that relative changes due to climate change predicted with the UC and OC models were not significantly different from those predicted with the SC model. This result suggests that it is important to calibrate the model spatially to analyze the effect of LULC change but not as important for analyzing the relative change in streamflow due to climate change. This
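
    The PBIAS metric used to grade the calibrations has a one-line definition. One common sign convention (positive values indicating underestimation of observed volume) is sketched below; conventions do vary across papers.

```python
# Percent bias (PBIAS) between observed and simulated streamflow.
import numpy as np

def pbias(observed, simulated):
    observed, simulated = np.asarray(observed), np.asarray(simulated)
    return 100.0 * np.sum(observed - simulated) / np.sum(observed)

print(pbias([10.0, 12.0, 8.0], [11.0, 10.0, 9.0]))  # 0.0 -> no volume bias
```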

  20. Application of two versions of the WHO/international society of hypertension absolute cardiovascular risk assessment tools in a rural Bangladeshi population

    PubMed Central

    Fatema, Kaniz; Zwar, Nicholas Arnold; Milton, Abul Hasnat; Rahman, Bayzidur; Ali, Liaquat

    2015-01-01

    Objectives To estimate the absolute cardiovascular disease (CVD) risk burden in a remote rural Bangladeshi population using the 'With' and 'Without' Cholesterol versions of the WHO/International Society of Hypertension (WHO/ISH) CVD risk assessment chart (particularly suitable for low- and middle-income countries due to less reliance on laboratory testing) and to evaluate the agreement between the two approaches. Design Cross-sectional study using data from a large prospective cohort of the North Bengal Non-Communicable Disease Programme (NB-NCDP) of Bangladesh. Setting General rural population from Thakurgaon district of Bangladesh. Participants 563 individuals who were categorised as having 'no CVDs' on screening by a questionnaire-based survey using the 'WHO CVD-Risk Management Package' developed in 2002. Main outcome measures Absolute CVD risk burden assessed using two versions of the WHO/ISH risk assessment charts for the South-East Asian Region-D. Results 10-year risk (moderate, high and very high) positivity was present among 21.5% and 20.2% of participants using the 'With' and 'Without' Cholesterol versions of the tool, respectively. The overall concordance rate for the two versions was 89.5%, and they did not differ significantly in estimating the proportion of participants having higher levels of CVD risk. The projected drug requirement, however, showed a significant overestimation of the proportion of participants at both threshold levels (p<0.002) when using the 'Without' as compared with the 'With' Cholesterol version. Conclusions About one-fifth of the adult population in Bangladesh, even in a remote rural area, seems to be at risk of developing CVDs (25% of them at high risk and 25% at very high risk) within 10 years, with males and females being almost equally vulnerable. PMID:26463220

  1. [Absolute risk fracture prediction by risk factors validation and survey of osteoporosis in a Brussels cohort followed during 10 years (FRISBEE study)].

    PubMed

    Body, J J; Moreau, M; Bergmann, P; Paesmans, M; Dekelver, C; Lemaire, M L

    2008-09-01

    Osteoporosis is a major public health problem. For the time being, the diagnosis of osteoporosis relies on densitometry (T-score < -2.5 by DXA), although the risk of fracture also depends on factors other than bone mass. Osteoporosis diagnosis (DXA) must be distinguished from the individual assessment of fracture risk. Different risk factors complementary to bone mass have already been validated in different populations. These include old age, a history of fracture after the age of 50, a family history of hip fracture (father or mother), a low BMI (< 20), corticosteroid treatment (> 3 months), smoking and excessive alcohol consumption. A WHO task force has combined these different factors in order to integrate them into a 10-year fracture risk prediction model (FRAX™). This model should still be validated in different populations, especially in populations not included in its development, which is the case for Belgium. We are evaluating these different risk factors for fracture in a Brussels population of 5000 women (60-80 years) who will be followed annually for 10 years. We also assess the predictive value of other risk factors for fracture not included in the WHO model (tendency to fall, use of sleeping pills, early menopause without hormone substitution, sedentary lifestyle, ...). In an interim analysis of the first 452 women included, with data available at the time of writing, we found a significant (P < 0.05) relationship between a diagnosis of osteoporosis at DXA and the number of risk factors, age > 70 years, a personal history of fracture after 50 years and a BMI < 20. PMID:18949979

  2. Absolute model ages of mantled surfaces in Malea Planum and Utopia Planitia, Mars.

    NASA Astrophysics Data System (ADS)

    Willmes, M.; Hiesinger, H.; Reiss, D.; Zanetti, M.

    2009-04-01

    The surface of Mars is partially covered by a latitude-dependent ice-rich smooth mantle in the middle and high latitudes (±30-60°) [1, 2]. These deposits relate to changes in the obliquity of Mars, which have led to major shifts in the Martian climate and repeated global episodes of deposition [3]. The deposits vary in thickness and are usually independent of local geology, topography and elevation. In this study we have determined absolute model ages for the mantled surface units in Utopia Planitia (northern hemisphere) and Malea Planum (southern hemisphere) using crater statistics [4]. These regions show a specific type of mantle degradation called scalloped terrain, and modelled crater retention ages of the easily eroded mantle in these regions reveal the time since the last resurfacing. Images from the High Resolution Imaging Science Experiment (HiRISE) (25-50 cm/pixel spatial resolution) on board the Mars Reconnaissance Orbiter (MRO) were analyzed, continuous areas of smooth mantle were mapped, and small, fresh, unmodified craters were counted. Both regions show degradation features of the mantle in varying degrees. The mantle in Utopia Planitia appears heavily modified by polygonal fractures and scalloped depressions [5]. Scalloped depressions are also found in Malea Planum, but the mantle there appears much smoother and less modified by periglacial processes [5, 6]. The study areas totalled 722 km² in Utopia Planitia and 296 km² in Malea Planum. Model ages for these regions were determined using the chronology function of Hartmann and Neukum [4] and the production function of Ivanov [7]. The model ages show that the mantle unit mapped in Utopia Planitia is 0.65 (+0.35/-0.41) to 2.9 (+0.69/-0.75) Myr old and that in Malea Planum is 3.0 (+1.5/-1.7) to 4.5 (+1.3/-1.4) Myr old, and that both regions represent very recent Amazonian terrain. This is also in agreement with the observed young degradation features described by [6, 8]. We acknowledge that the
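
    The model-age computation rests on a crater chronology function. Its generic form is sketched below in LaTeX; the calibration constants are taken from the cited chronology [4] and are deliberately not reproduced here:

```latex
% Cumulative density of craters with D >= 1 km as a function of surface
% age T (in Gyr); A, \lambda, B are calibration constants from [4].
N(\geq 1\,\mathrm{km}) = A\left(e^{\lambda T} - 1\right) + B\,T
```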

  3. An absolute chronology for early Egypt using radiocarbon dating and Bayesian statistical modelling.

    PubMed

    Dee, Michael; Wengrow, David; Shortland, Andrew; Stevenson, Alice; Brock, Fiona; Girdland Flink, Linus; Bronk Ramsey, Christopher

    2013-11-01

    The Egyptian state was formed prior to the existence of verifiable historical records. Conventional dates for its formation are based on the relative ordering of artefacts. This approach is no longer considered sufficient for cogent historical analysis. Here, we produce an absolute chronology for Early Egypt by combining radiocarbon and archaeological evidence within a Bayesian paradigm. Our data cover the full trajectory of Egyptian state formation and indicate that the process occurred more rapidly than previously thought. We provide a timeline for the First Dynasty of Egypt of generational-scale resolution that concurs with prevailing archaeological analysis and produce a chronometric date for the foundation of Egypt that distinguishes between historical estimates.

  4. RISK 0301 - MOLECULAR MODELING

    EPA Science Inventory

    Risk assessment practices, in general, for a range of diseases now encourage the use of mechanistic data to enhance the ability to predict responses at low, environmental exposures. In particular, the pathway from normal biology to pathologic state can be described by a set of m...

  5. An absolute chronology for early Egypt using radiocarbon dating and Bayesian statistical modelling

    PubMed Central

    Dee, Michael; Wengrow, David; Shortland, Andrew; Stevenson, Alice; Brock, Fiona; Girdland Flink, Linus; Bronk Ramsey, Christopher

    2013-01-01

    The Egyptian state was formed prior to the existence of verifiable historical records. Conventional dates for its formation are based on the relative ordering of artefacts. This approach is no longer considered sufficient for cogent historical analysis. Here, we produce an absolute chronology for Early Egypt by combining radiocarbon and archaeological evidence within a Bayesian paradigm. Our data cover the full trajectory of Egyptian state formation and indicate that the process occurred more rapidly than previously thought. We provide a timeline for the First Dynasty of Egypt of generational-scale resolution that concurs with prevailing archaeological analysis and produce a chronometric date for the foundation of Egypt that distinguishes between historical estimates. PMID:24204188

  6. Pancreatic Cancer Risk Prediction Models

    Cancer.gov

    Developing statistical models that estimate the probability of developing pancreatic cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  7. Colorectal Cancer Risk Prediction Models

    Cancer.gov

    Developing statistical models that estimate the probability of developing colorectal cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  8. Bladder Cancer Risk Prediction Models

    Cancer.gov

    Developing statistical models that estimate the probability of developing bladder cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  9. Testicular Cancer Risk Prediction Models

    Cancer.gov

    Developing statistical models that estimate the probability of developing testicular cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  10. Lung Cancer Risk Prediction Models

    Cancer.gov

    Developing statistical models that estimate the probability of developing lung cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  11. Ovarian Cancer Risk Prediction Models

    Cancer.gov

    Developing statistical models that estimate the probability of developing ovarian cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  12. Liver Cancer Risk Prediction Models

    Cancer.gov

    Developing statistical models that estimate the probability of developing liver cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  13. Prostate Cancer Risk Prediction Models

    Cancer.gov

    Developing statistical models that estimate the probability of developing prostate cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  14. Esophageal Cancer Risk Prediction Models

    Cancer.gov

    Developing statistical models that estimate the probability of developing esophageal cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  15. Cervical Cancer Risk Prediction Models

    Cancer.gov

    Developing statistical models that estimate the probability of developing cervical cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  16. Breast Cancer Risk Prediction Models

    Cancer.gov

    Developing statistical models that estimate the probability of developing breast cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  17. College students' role models, learning style preferences, and academic achievement in collaborative teaching: absolute versus relativistic thinking.

    PubMed

    Chiou, Wen-Bin

    2008-01-01

    Based on the perspective of postformal operations, this study investigated whether college students' role models (technical teachers vs. lecturing teachers) and preferred learning styles (experience-driven mode vs. theory-driven mode) in collaborative teaching courses would be moderated by their cognitive development (absolute thinking vs. relativistic thinking), and examined whether students' academic achievement would be contingent upon their preferred learning styles. Two hundred forty-four college students who had taken technical courses with collaborative teaching participated in this study. The results showed that participants with absolute thinking perceived the modeling advantage of technical teachers to be greater than that of lecturing teachers, preferred the experience-driven mode over the theory-driven mode, and displayed differential academic achievement between technical courses and general courses. On the other hand, students with relativistic thinking revealed no difference in perceived modeling advantage of role models, learning style preferences, or academic achievement between the two categories of courses. In addition, this research indicates that college students' preferred learning styles interact with course category (technical courses vs. general courses) to produce differential academic achievement. Implications and future directions are discussed.

  18. Risk modelling in portfolio optimization

    NASA Astrophysics Data System (ADS)

    Lam, W. H.; Jaaman, Saiful Hafizah Hj.; Isa, Zaidi

    2013-09-01

    Risk management is very important in portfolio optimization. The mean-variance model has been used in portfolio optimization to minimize the investment risk. The objective of the mean-variance model is to minimize the portfolio risk while achieving a target rate of return. Variance is used as the risk measure in the mean-variance model. The purpose of this study is to compare the portfolio composition as well as the performance of the optimal portfolio of the mean-variance model and the equally weighted portfolio. An equally weighted portfolio means that the proportions invested in each asset are equal. The results show that the portfolio compositions of the mean-variance optimal portfolio and the equally weighted portfolio are different. Besides that, the mean-variance optimal portfolio gives better performance because it gives a higher performance ratio than the equally weighted portfolio.
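
    The comparison in the study can be reproduced in miniature: solve the constrained variance-minimization problem and compare against equal weights. The sketch below uses synthetic returns and imposes no short-selling constraint; it illustrates the mean-variance idea rather than the study's exact setup.

```python
# Mean-variance optimal portfolio vs. equally weighted portfolio (sketch).
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(3)
R = 0.01 * rng.standard_normal((250, 4)) + 0.0004   # synthetic daily returns
mu, cov = R.mean(axis=0), np.cov(R.T)
target = mu.mean()                                  # hypothetical target return

res = minimize(lambda w: w @ cov @ w, x0=np.full(4, 0.25),
               constraints=[{"type": "eq", "fun": lambda w: w.sum() - 1},
                            {"type": "eq", "fun": lambda w: w @ mu - target}])
for name, w in [("mean-variance", res.x), ("equal-weight", np.full(4, 0.25))]:
    print(name, w.round(3), "risk:", float(np.sqrt(w @ cov @ w)))
```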

  19. Absolute monocyte count trichotomizes chronic lymphocytic leukemia into high risk patients with immune dysregulation, disease progression and poor survival.

    PubMed

    Herishanu, Yair; Kay, Sigi; Sarid, Nadav; Kohan, Pedram; Braunstein, Rony; Rotman, Rachel; Deutsch, Varda; Ben-Ezra, Jonathan; Naparstek, Elizabeth; Perry, Chava; Katz, Ben-Zion

    2013-10-01

    Peripheral absolute monocyte count (AMC) has been reported to correlate with clinical outcome in different types of cancers. This association may relate to alterations in circulating monocytic subpopulations and tumor-infiltrating macrophages. In this study we evaluated the clinical significance of peripheral AMC in 80 treatment-naive patients with CLL. Measurement of AMC was based on direct morphological enumeration, due to our finding that complete blood count data may yield incorrect monocyte enumeration values in CLL. The median AMC in patients with CLL was within normal limits; however, the AMC range exceeded the values of healthy individuals. The AMC trichotomized patients into 3 distinct subgroups with different characteristics and outcomes. High-AMC patients were younger and had higher absolute lymphocyte counts, while patients with low AMC had prominent immune dysregulation (lower serum IgA levels, susceptibility to infections and a tendency for a positive direct anti-globulin test). The low- and high-AMC patients had a shorter time to treatment compared to the intermediate-AMC subgroup, whereas low AMC was associated with increased mortality caused by infectious complications. In conclusion, AMC quantification during the disease course classifies CLL patients into subgroups with unique clinical features and outcomes.

  20. The Relative and Absolute Risks of Disadvantaged Family Background and Low Levels of School Resources on Student Literacy

    ERIC Educational Resources Information Center

    Nonoyama-Tarumi, Yuko; Willms, J. Douglas

    2010-01-01

    There has been a long-lasting debate of whether the effects of family background are larger than those of school resources, and whether these effects are a function of national income level. In this study, we bring a new perspective to the debate by using the concepts of relative risk and population attributable risk in estimating family and…
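
    For readers unfamiliar with the two borrowed epidemiological measures, a minimal sketch follows; the input numbers are invented placeholders, not results from the study.

```python
# Relative risk (RR) and population attributable risk (PAR) in minimal form.
def relative_risk(risk_exposed, risk_unexposed):
    return risk_exposed / risk_unexposed

def population_attributable_risk(prevalence, rr):
    # Share of population-level cases attributable to the exposure.
    return prevalence * (rr - 1) / (1 + prevalence * (rr - 1))

rr = relative_risk(0.30, 0.15)                   # hypothetical risks
print(rr)                                        # -> 2.0
print(population_attributable_risk(0.25, rr))    # -> 0.2 (20% of cases)
```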

  1. A strict test of stellar evolution models: The absolute dimensions of the massive benchmark eclipsing binary V578 Mon

    SciTech Connect

    Garcia, E. V.; Stassun, Keivan G.; Pavlovski, K.; Hensberge, H.; Chew, Y. Gómez Maqueo; Claret, A.

    2014-09-01

    We determine the absolute dimensions of the eclipsing binary V578 Mon, a detached system of two early B-type stars (B0V + B1V, P = 2.40848 days) in the star-forming region NGC 2244 of the Rosette Nebula. From the light curve analysis of 40 yr of photometry and the analysis of HERMES spectra, we find radii of 5.41 ± 0.04 R☉ and 4.29 ± 0.05 R☉, and temperatures of 30,000 ± 500 K and 25,750 ± 435 K, respectively. We find that our disentangled component spectra for V578 Mon agree well with previous spectral disentangling from the literature. We also reconfirm the previous spectroscopic orbit of V578 Mon, finding that masses of 14.54 ± 0.08 M☉ and 10.29 ± 0.06 M☉ are fully compatible with the new analysis. We compare the absolute dimensions to the rotating models of the Geneva and Utrecht groups and the models of the Granada group. We find that all three sets of models marginally reproduce the absolute dimensions of both stars with a common age within the uncertainty for gravity-effective temperature isochrones. However, there are some apparent age discrepancies for the corresponding mass-radius isochrones. Models with larger convective overshoot, >0.35, worked best. Combined with our previously determined apsidal motion of 0.07089 (+0.00021/−0.00013) deg cycle⁻¹, we compute the internal structure constants (tidal Love numbers) for the Newtonian and general relativistic contributions to the apsidal motion as log k₂ = −1.975 ± 0.017 and log k₂ = −3.412 ± 0.018, respectively. We find the relativistic contribution to the apsidal motion to be small, <4%. We find that the prediction of log k₂,theo = −2.005 ± 0.025 of the Granada models fully agrees with our observed log k₂.

  2. Multifractal Value at Risk model

    NASA Astrophysics Data System (ADS)

    Lee, Hojin; Song, Jae Wook; Chang, Woojin

    2016-06-01

    In this paper a new Value at Risk (VaR) model is proposed and investigated. We consider the multifractal property of financial time series and develop a multifractal Value at Risk (MFVaR) model. The MFVaR introduced in this paper is analytically tractable and not based on simulation. An empirical study showed that MFVaR can provide more stable and accurate forecasting performance in volatile financial markets where large losses can be incurred. This implies that our multifractal VaR works well for the risk measurement of extreme credit events.
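
    As a point of reference for what MFVaR is benchmarked against, the plain historical-simulation VaR that ignores multifractal structure is essentially a quantile computation. The sketch below uses synthetic heavy-tailed returns and is a baseline comparator, not the paper's method.

```python
# Historical-simulation VaR baseline (contrast to the analytical MFVaR).
import numpy as np

rng = np.random.default_rng(4)
returns = 0.01 * rng.standard_t(df=3, size=1000)  # heavy-tailed daily returns

alpha = 0.01                                      # 99% confidence level
var_99 = -np.quantile(returns, alpha)             # loss exceeded on ~1% of days
print(f"1-day 99% VaR: {var_99:.4f}")
```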

  3. Cabin Environment Physics Risk Model

    NASA Technical Reports Server (NTRS)

    Mattenberger, Christopher J.; Mathias, Donovan Leigh

    2014-01-01

    This paper presents a Cabin Environment Physics Risk (CEPR) model that predicts the time for an initial failure of Environmental Control and Life Support System (ECLSS) functionality to propagate into a hazardous environment and trigger a loss-of-crew (LOC) event. This physics-of-failure model allows a probabilistic risk assessment of a crewed spacecraft to account for the cabin environment, which can serve as a buffer to protect the crew during an abort from orbit and ultimately enable a safe return. The results of the CEPR model replace the assumption that failure of crew-critical ECLSS functionality causes LOC instantly, and provide a more accurate representation of the spacecraft's risk posture. The instant-LOC assumption is shown to be excessively conservative and, moreover, can impact the relative risk drivers identified for the spacecraft. This, in turn, could lead the design team to allocate mass for equipment to reduce overly conservative risk estimates in a suboptimal configuration, which inherently increases the overall risk to the crew. For example, available mass could be poorly used to add redundant ECLSS components that have a negligible benefit but appear to make the vehicle safer due to poor assumptions about the propagation time of ECLSS failures.

  4. Information risk and security modeling

    NASA Astrophysics Data System (ADS)

    Zivic, Predrag

    2005-03-01

    This research paper presentation will feature current frameworks for addressing risk and security modeling and metrics. The paper will analyze technical-level risk and security metrics of Common Criteria/ISO15408, Centre for Internet Security guidelines and NSA configuration guidelines, and the metrics used at this level. The view of IT operational standards on security metrics, such as GMITS/ISO13335 and ITIL/ITMS, and of architectural guidelines such as ISO7498-2 will be explained. Business process level standards such as ISO17799, COSO and CobiT will be presented with their control approach to security metrics. At the top level, maturity standards such as SSE-CMM/ISO21827, NSA Infosec Assessment and CobiT will be explored and reviewed. For each defined level of security metrics the research presentation will explore the appropriate usage of these standards. The paper will discuss the standards' approaches to conducting risk and security metrics. The research findings will demonstrate the need for a common baseline for both risk and security metrics. This paper will show the relation between the attribute-based common baseline and corporate assets and controls for risk and security metrics. It will be shown that such an approach spans all the mentioned standards. The proposed approach's 3D visual presentation and the development of the Information Security Model will be analyzed and postulated. The presentation will clearly demonstrate the benefits of the proposed attribute-based approach and a defined risk and security space for modeling and measuring.

  5. Command Process Modeling & Risk Analysis

    NASA Technical Reports Server (NTRS)

    Meshkat, Leila

    2011-01-01

    Commanding errors may be caused by a variety of root causes. It is important to understand the relative significance of each of these causes for making institutional investment decisions. One of these causes is the lack of standardized processes and procedures for command and control. We mitigate this problem by building periodic tables and models corresponding to key functions within the command and control process. These models include simulation analysis and probabilistic risk assessment models.

  6. Lunar Landing Operational Risk Model

    NASA Technical Reports Server (NTRS)

    Mattenberger, Chris; Putney, Blake; Rust, Randy; Derkowski, Brian

    2010-01-01

    Characterizing the risk of spacecraft goes beyond simply modeling equipment reliability. Some portions of the mission require complex interactions between system elements that can lead to failure without an actual hardware fault. Landing risk is currently the least characterized aspect of the Altair lunar lander and appears to result from complex temporal interactions between pilot, sensors, surface characteristics and vehicle capabilities rather than hardware failures. The Lunar Landing Operational Risk Model (LLORM) seeks to provide rapid and flexible quantitative insight into the risks driving the landing event and to gauge sensitivities of the vehicle to changes in system configuration and mission operations. The LLORM takes a Monte Carlo based approach to estimate the operational risk of the Lunar Landing Event and calculates estimates of the risk of Loss of Mission (LOM) - Abort Required and is Successful, Loss of Crew (LOC) - Vehicle Crashes or Cannot Reach Orbit, and Success. The LLORM is meant to be used during the conceptual design phase to inform decision makers transparently of the reliability impacts of design decisions, to identify areas of the design which may require additional robustness, and to aid in the development and flow-down of requirements.
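
    The Monte Carlo structure of such a model can be illustrated with a toy outcome tally; every probability below is an invented placeholder, not an Altair number.

```python
# Toy Monte Carlo tally of landing outcomes (Success / LOM / LOC).
import numpy as np

rng = np.random.default_rng(5)
trials = 100_000
p_abort_needed = 0.02      # hypothetical: landing must be aborted
p_abort_fails = 0.10       # hypothetical: abort cannot reach orbit
p_crash_no_abort = 0.001   # hypothetical: crash with no abort attempted

abort = rng.random(trials) < p_abort_needed
loc = np.where(abort, rng.random(trials) < p_abort_fails,
               rng.random(trials) < p_crash_no_abort)
lom = abort & ~loc         # abort required but completed safely

print("P(Success):", 1 - lom.mean() - loc.mean())
print("P(LOM):", lom.mean(), " P(LOC):", loc.mean())
```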

  7. Prediction of absolute risk of fragility fracture at 10 years in a Spanish population: validation of the WHO FRAX ™ tool in Spain

    PubMed Central

    2011-01-01

    Background Age-related bone loss is asymptomatic, and the morbidity of osteoporosis is secondary to the fractures that occur. Common sites of fracture include the spine, hip, forearm and proximal humerus. Fractures at the hip incur the greatest morbidity and mortality and give rise to the highest direct costs for health services. Their incidence increases exponentially with age. Independently of changes in population demography, the age- and sex-specific incidence of osteoporotic fractures appears to be increasing in developing and developed countries. This could mean more than double the expected burden of osteoporotic fractures in the next 50 years. Methods/Design To assess the predictive power of the WHO FRAX™ tool to identify the subjects with the highest absolute risk of fragility fracture at 10 years in a Spanish population, a predictive validation study of the tool will be carried out. For this purpose, participants recruited by 1999 will be assessed. These participants were referred to the DXA scan department from primary healthcare centres and from non-hospital and hospital consultations. Study population: patients attended in the national health services, integrated into the FRIDEX cohort, with at least one dual-energy X-ray absorptiometry (DXA) measurement and one extensive questionnaire related to fracture risk factors. Measurements: at baseline, bone mineral density measurement using DXA, a clinical fracture risk factor questionnaire, dietary calcium intake assessment, history of previous fractures, and related drugs. Follow-up by telephone interview to record fragility fractures over the 10 years, with verification in electronic medical records, and also to record the number of falls in the last year. The absolute risk of fracture will be estimated using the FRAX™ tool from the official website. Discussion For more than 10 years numerous publications have recognised the importance of other risk factors for new osteoporotic fractures in addition to low BMD. The extension of a

  8. Absolute Identification by Relative Judgment

    ERIC Educational Resources Information Center

    Stewart, Neil; Brown, Gordon D. A.; Chater, Nick

    2005-01-01

    In unidimensional absolute identification tasks, participants identify stimuli that vary along a single dimension. Performance is surprisingly poor compared with discrimination of the same stimuli. Existing models assume that identification is achieved using long-term representations of absolute magnitudes. The authors propose an alternative…

  9. Revised Absolute Configuration of Sibiricumin A: Substituent Effects in Simplified Model Structures Used for Quantum Mechanical Predictions of Chiroptical Properties.

    PubMed

    Zhao, Dan; Li, Zheng-Qiu; Cao, Fei; Liang, Miao-Miao; Pittman, Charles U; Zhu, Hua-Jie; Li, Li; Yu, Shi-Shan

    2016-08-01

    This study discusses the choice of different simplified models used in computations of electronic circular dichroism (ECD) spectra and other chiroptical characteristics for determining the absolute configuration (AC) of the complex natural product sibiricumin A. Sections of a molecule that contain a chiral center near an aromatic group have large effects on the ECD spectra. Conversely, when the phenyl group is present on a substituent without a nonstereogenic center, removal of this section will have little effect on the ECD spectra. However, these nonstereogenic-center-containing sections have large effects on calculated optical rotation (OR) values, since the OR value is more sensitive to the geometries of sections in a molecule. In this study, the previously reported AC of sibiricumin A was revised to (7R,8S,1'R,7'R,8'S)-. PMID:27428019

  10. A 2015 International Geomagnetic Reference Field (IGRF) candidate model based on Swarm's experimental absolute magnetometer vector mode data

    NASA Astrophysics Data System (ADS)

    Vigneron, Pierre; Hulot, Gauthier; Olsen, Nils; Léger, Jean-Michel; Jager, Thomas; Brocco, Laura; Sirol, Olivier; Coïsson, Pierdavide; Lalanne, Xavier; Chulliat, Arnaud; Bertrand, François; Boness, Axel; Fratter, Isabelle

    2015-06-01

    Each of the three satellites of the European Space Agency Swarm mission carries an absolute scalar magnetometer (ASM) that provides the nominal 1-Hz scalar data of the mission for both science and calibration purposes. These ASM instruments, however, also deliver autonomous 1-Hz experimental vector data. Here, we report on how ASM-only scalar and vector data from the Alpha and Bravo satellites between November 29, 2013 (a week after launch) and September 25, 2014 (for on-time delivery of the model on October 1, 2014) could be used to build a very valuable candidate model for the 2015.0 International Geomagnetic Reference Field (IGRF). A parent model was first computed, describing the geomagnetic field of internal origin up to degree and order 40 in a spherical harmonic representation and including a constant secular variation up to degree and order 8. This model was next simply forwarded to epoch 2015.0 and truncated at degree and order 13. The resulting ASM-only 2015.0 IGRF candidate model is compared to analogous models derived from the mission's nominal data and to the now-published final 2015.0 IGRF model. Differences among models mainly highlight uncertainties enhanced by the limited geographical distribution of the selected data set (essentially due to a lack of availability of data at high northern latitude satisfying nighttime conditions at the end of the time period considered). These appear to be comparable to differences classically observed among IGRF candidate models. These positive results led the ASM-only 2015.0 IGRF candidate model to contribute to the construction of the final 2015.0 IGRF model.
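
    The candidate-construction step described above (forwarding a parent model to epoch 2015.0 with a constant secular variation, then truncating at degree and order 13) can be sketched numerically. In the Python sketch below, the coefficient values, the parent epoch of 2014.3 and the flat array layout are invented placeholders; only the forward-and-truncate logic follows the abstract.

        # Sketch: forwarding a spherical-harmonic parent model to epoch 2015.0
        # with a constant secular variation, then truncating to an IGRF-style
        # candidate. Coefficients and the parent epoch are hypothetical, not
        # the actual Swarm ASM-only model values.
        import numpy as np

        N_PARENT, N_SV, N_IGRF = 40, 8, 13

        def n_coeffs(nmax):
            """Number of Gauss coefficients up to degree and order nmax."""
            return nmax * (nmax + 2)

        rng = np.random.default_rng(0)
        g_parent = rng.normal(size=n_coeffs(N_PARENT))          # main field (fake)
        sv = np.zeros(n_coeffs(N_PARENT))
        sv[:n_coeffs(N_SV)] = rng.normal(size=n_coeffs(N_SV))   # SV to degree 8

        t_parent, t_target = 2014.3, 2015.0
        g_2015 = g_parent + (t_target - t_parent) * sv   # linear propagation
        candidate = g_2015[:n_coeffs(N_IGRF)]            # keep degree <= 13
        print(candidate.shape)                           # (195,) coefficients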

  11. The Dynamics of Scaling: A Memory-Based Anchor Model of Category Rating and Absolute Identification

    ERIC Educational Resources Information Center

    Petrov, Alexander A.; Anderson, John R.

    2005-01-01

    A memory-based scaling model--ANCHOR--is proposed and tested. The perceived magnitude of the target stimulus is compared with a set of anchors in memory. Anchor selection is probabilistic and sensitive to similarity, base-level strength, and recency. The winning anchor provides a reference point near the target and thereby converts the global…

  12. The Laurentide Ice Sheet at LGM: Space Geodetic and Absolute Gravity Observations Require a Multi-domed Model

    NASA Astrophysics Data System (ADS)

    Peltier, W. R.

    2002-05-01

    Although surface geomorphological evidence has continued to suggest that the LGM form of the LIS was multi-domed, both explicit ice-mechanics based reconstructions such as that produced in the CLIMAP project, and models based upon the inversion of relative sea level observations such as ICE-4G(VM2), have led to the inference of single domed structures. Three recent sets of observations related to the isostatic adjustment process require that these single domed reconstructions be abandoned. The first of these consists of the VLBI based measurement of the rate of present day vertical motion at Yellowknife in the Northwest Territories of Canada, demonstrating that the rate predicted by the ICE-4G(VM2) model is more than a factor of two less than observed (Argus et al., 1999). The second consists of absolute gravity measurements on a traverse south from Churchill on Hudson Bay across the southern margin of the former LIS into the United States (Lambert et al., 2001). Finally there is the recent demonstration that the ICE-4G reconstruction of the process of post-LGM deglaciation has too little LGM mass (Peltier, 2002). Analyses to be presented in this paper show that the additional LGM ice required by the latter analysis very precisely suffices to reconcile the misfits to the first two sets of observations when it is placed in a Keewatin Dome centred over the Yellowknife region. The resulting model of the LGM form of the LIS is then very close to that originally suggested by Dyke and Prest (1987). This modified form of the ICE-4G model is viable if and only if the depth dependence of mantle viscosity is very close to VM2. Models with higher viscosity in the lower mantle are ruled out by the data as they overpredict both the space geodetic and absolute gravity observations when ice thickness over Keewatin is significantly increased so as to satisfy far field requirements concerning the eustatic sea level depression at LGM.

  13. Model-based evaluation of microbial mass fractions: effect of absolute anaerobic reaction time on microbial mass fractions.

    PubMed

    Tunçal, Tolga

    2010-04-14

    Although enhanced biological phosphorus removal processes (EBPR) are popular methods for nutrient control, unstable treatment performances of full-scale systems are still not well understood. In this study, the interaction between electron acceptors present at the start of the anaerobic phase of an EBPR system and the amount of organic acids generated from simple substrate (rbsCOD) was investigated in a full-scale wastewater treatment plant. Quantification of microbial groups including phosphorus-accumulating microorganisms (PAOs), denitrifying PAOs (DPAOs), glycogen-accumulating microorganisms (GAOs) and ordinary heterotrophic microorganisms (OHOs) was based on a modified dynamic model. The intracellular phosphorus content of PAOs was also determined by the execution of mass balances for the biological stages of the plant. The EBPR activities observed in the plant and in batch tests (under idealized conditions) were compared with each other statistically as well. Modelling efforts indicated that the use of absolute anaerobic reaction time (η1) instead of nominal anaerobic reaction time (η), to estimate the amount of available substrate for PAOs, significantly improved model accuracy. Another interesting result of the study was the differences in EBPR characteristics observed in idealized and real conditions. PMID:20480829

  14. Decent wage is more important than absolution of debts: A smallholder socio-hydrological modelling framework

    NASA Astrophysics Data System (ADS)

    Pande, Saket; Savenije, Hubert

    2015-04-01

    We present a framework to understand the socio-hydrological system dynamics of a smallholder. Smallholders are farmers who own less than 2 ha of farmland. The framework couples the dynamics of six main variables that are most relevant at the scale of a smallholder: local storage (soil moisture and other water storage), capital, knowledge, livestock production, soil fertility and grass biomass production. The hydroclimatic variability is at sub-annual scale and influences the socio-hydrology at annual scale. The model incorporates rule-based adaptation mechanisms (for example, adjusting expenditures on food and fertilizers, selling livestock, etc.) of smallholders when they face adverse socio-hydrological conditions, such as low annual rainfall, higher intra-annual variability in rainfall or variability in agricultural prices. We apply the framework to understand the socio-hydrology of a sugarcane smallholder in Aurangabad, Maharashtra. This district has witnessed the suicides of many sugarcane farmers who could not extricate themselves from the debt trap. These farmers lack irrigation and are susceptible to fluctuating sugar prices and intra-annual hydro-climatic variability. We study the sensitivity of annual total capital averaged over 30 years, an indicator of smallholder wellbeing, to the initial capital that a smallholder starts with and the prevalent wage rates. We find that smallholder wellbeing is low (below Rs 30000 per annum, a threshold above which a smallholder can afford a basic standard of living) and is rather insensitive to initial capital at low wage rates. Initial capital perhaps matters to smallholder livelihoods at higher wage rates. Further, the smallholder system appears to be resilient at around Rs 115/manday in the sense that small perturbations in wage rates around this rate still sustain the smallholder above the basic standard of living. Our results thus indicate that government intervention to absolve the debt of farmers is not enough. It
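
    A stylized sketch of the kind of sensitivity experiment described above, annual capital dynamics over 30 years under different wage rates, is given below. The income, consumption and rainfall rules are invented assumptions; only the Rs 30000 living threshold and wage rates bracketing Rs 115/manday come from the abstract.

        # Stylized smallholder capital dynamics: farm income depends on an
        # annual rainfall index, wage labour adds income, and consumption is
        # pegged to a basic standard of living. All rules are illustrative,
        # not the authors' model equations.
        import random

        def mean_capital(wage_rate, init_capital, years=30, seed=1):
            random.seed(seed)
            capital, total = init_capital, 0.0
            for _ in range(years):
                rain = random.gauss(1.0, 0.3)                 # rainfall index
                farm_income = max(0.0, 40000 * rain - 25000)  # invented margin
                labour_income = wage_rate * 200               # ~200 mandays/yr
                capital = max(0.0, capital + farm_income
                              + labour_income - 30000)        # consumption
                total += capital
            return total / years

        for wage in (60, 115, 150):                           # Rs per manday
            print(wage, round(mean_capital(wage, init_capital=10000)))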

  15. Breast cancer risk assessment across the risk continuum: genetic and nongenetic risk factors contributing to differential model performance

    PubMed Central

    2012-01-01

    Introduction Clinicians use different breast cancer risk models for patients considered at average and above-average risk, based largely on their family histories and genetic factors. We used longitudinal cohort data from women whose breast cancer risks span the full spectrum to determine the genetic and nongenetic covariates that differentiate the performance of two commonly used models that include nongenetic factors - BCRAT, also called Gail model, generally used for patients with average risk and IBIS, also called Tyrer Cuzick model, generally used for patients with above-average risk. Methods We evaluated the performance of the BCRAT and IBIS models as currently applied in clinical settings for 10-year absolute risk of breast cancer, using prospective data from 1,857 women over a mean follow-up length of 8.1 years, of whom 83 developed cancer. This cohort spans the continuum of breast cancer risk, with some subjects at lower than average population risk. Therefore, the wide variation in individual risk makes it an interesting population to examine model performance across subgroups of women. For model calibration, we divided the cohort into quartiles of model-assigned risk and compared differences between assigned and observed risks using the Hosmer-Lemeshow (HL) chi-squared statistic. For model discrimination, we computed the area under the receiver operator curve (AUC) and the case risk percentiles (CRPs). Results The 10-year risks assigned by BCRAT and IBIS differed (range of difference 0.001 to 79.5). The mean BCRAT- and IBIS-assigned risks of 3.18% and 5.49%, respectively, were lower than the cohort's 10-year cumulative probability of developing breast cancer (6.25%; 95% confidence interval (CI) = 5.0 to 7.8%). Agreement between assigned and observed risks was better for IBIS (HL χ²₄ = 7.2, P value 0.13) than BCRAT (HL χ²₄ = 22.0, P value <0.001). The IBIS model also showed better discrimination (AUC = 69.5%, CI = 63.8% to 75.2%) than did the BCRAT model
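
    The calibration check described above, a Hosmer-Lemeshow chi-squared over quartiles of model-assigned risk, can be sketched as follows. The assigned risks and outcomes are simulated stand-ins for the cohort data, and the degrees of freedom are set to 4 to match the statistic reported in the abstract (df conventions vary).

        # Hosmer-Lemeshow statistic over quartiles of assigned 10-year risk,
        # on simulated (perfectly calibrated) data, so HL should be small.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(42)
        assigned = rng.beta(1.5, 20, size=1857)      # model-assigned risks
        events = rng.random(1857) < assigned         # observed outcomes

        cuts = np.quantile(assigned, [0.25, 0.5, 0.75])
        quartile = np.digitize(assigned, cuts)       # group labels 0..3
        hl = 0.0
        for q in range(4):
            mask = quartile == q
            n, obs = mask.sum(), events[mask].sum()
            exp = assigned[mask].sum()               # expected events in group
            hl += (obs - exp) ** 2 / (exp * (1 - exp / n))
        print(round(hl, 2), round(stats.chi2.sf(hl, df=4), 3))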

  16. Comparing Absolute Error with Squared Error for Evaluating Empirical Models of Continuous Variables: Compositions, Implications, and Consequences

    NASA Astrophysics Data System (ADS)

    Gao, J.

    2014-12-01

    Reducing modeling error is often a major concern of empirical geophysical models. However, modeling errors can be defined in different ways: When the response variable is continuous, the most commonly used metrics are squared (SQ) and absolute (ABS) errors. For most applications, ABS error is the more natural, but SQ error is mathematically more tractable, so it is often used as a substitute with little scientific justification. Existing literature has not thoroughly investigated the implications of using SQ error in place of ABS error, especially not geospatially. This study compares the two metrics through the lens of bias-variance decomposition (BVD). BVD breaks down the expected modeling error of each model evaluation point into bias (systematic error), variance (model sensitivity), and noise (observation instability). It offers a way to probe the composition of various error metrics. I analytically derived the BVD of ABS error and compared it with the well-known SQ error BVD, and found that not only do the two metrics measure the characteristics of the probability distributions of modeling errors differently, but the effects of these characteristics on the overall expected error are also different. Most notably, under SQ error all bias, variance, and noise increase expected error, while under ABS error certain parts of the error components reduce expected error. Since manipulating these subtractive terms is a legitimate way to reduce expected modeling error, SQ error can never capture the complete story embedded in ABS error. I then empirically compared the two metrics with a supervised remote sensing model for mapping surface imperviousness. Pair-wise spatially-explicit comparison for each error component showed that SQ error overstates all error components in comparison to ABS error, especially variance-related terms. Hence, substituting ABS error with SQ error makes model performance appear worse than it actually is, and the analyst would more likely accept a
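
    The squared-error decomposition that the study builds on can be checked numerically: expected SQ error splits additively into squared bias, model variance and noise variance, whereas ABS error admits no such clean split. The bias, spread and noise levels below are invented for the check.

        # Monte Carlo check of the SQ-error bias-variance-noise decomposition.
        import numpy as np

        rng = np.random.default_rng(0)
        truth, noise_sd, n = 2.0, 0.5, 200_000
        y = truth + rng.normal(0, noise_sd, n)       # noisy observations
        pred = 2.3 + rng.normal(0, 0.4, n)           # bias 0.3, spread 0.4

        sq_err = np.mean((y - pred) ** 2)
        decomp = (pred.mean() - truth) ** 2 + pred.var() + noise_sd ** 2
        print(round(sq_err, 3), round(decomp, 3))    # the two agree closely

        abs_err = np.mean(np.abs(y - pred))          # no additive analogue
        print(round(abs_err, 3))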

  17. Low absolute lymphocyte count and addition of rituximab confer high risk for interstitial pneumonia in patients with diffuse large B-cell lymphoma.

    PubMed

    Huang, Yu-Chung; Liu, Chia-Jen; Liu, Chun-Yu; Pai, Jih-Tung; Hong, Ying-Chung; Teng, Hao-Wei; Hsiao, Liang-Tsai; Chao, Ta-Chung; Gau, Jyh-Pyng; Liu, Jin-Hwang; Hsu, Hui-Chi; Chiou, Tzeon-Jye; Chen, Po-Min; Yu, Yuan-Bin; Tzeng, Cheng-Hwai

    2011-10-01

    Several small-scale studies have reported pulmonary toxicity among patients with diffuse large B-cell lymphoma (DLBCL) receiving rituximab-containing chemotherapy, though whether the use of rituximab predisposes to interstitial pneumonia (IP) remains unclear. This retrospective study was intended to identify the characteristics and risk factors of IP in patients with DLBCL. Between 2000 and 2009, 529 consecutive patients with DLBCL receiving first-line tri-weekly COP- or CHOP-based chemotherapy with or without rituximab were enrolled as subjects. IP was defined as diffuse pulmonary interstitial infiltrates found on computed tomography scans in conjunction with respiratory symptoms. IP was observed in 26 patients (4.9%), six of whom were confirmed with Pneumocystis jirovecii pneumonia. The median number of chemotherapy courses before IP was four cycles. Using multivariate analysis, absolute lymphocyte count less than 1×10(9)/l at diagnosis [odds ratio (OR) 2.75, p=0.014] and the addition of rituximab to chemotherapy (OR 4.56, p=0.003) were identified as independent risk factors for IP. In conclusion, the incidence of IP is increased in patients with DLBCL receiving rituximab-containing chemotherapy. Specific subgroups with lymphopenia at diagnosis may justify close scrutiny to detect pulmonary complications. PMID:21647583

  18. Teaching Absolute Value Meaningfully

    ERIC Educational Resources Information Center

    Wade, Angela

    2012-01-01

    What is the meaning of absolute value? And why do teachers teach students how to solve absolute value equations? Absolute value is a concept introduced in first-year algebra and then reinforced in later courses. Various authors have suggested instructional methods for teaching absolute value to high school students (Wei 2005; Stallings-Roberts…

  19. Using the common sense model to understand perceived cancer risk in individuals testing for BRCA1/2 mutations.

    PubMed

    Kelly, Kimberly; Leventhal, Howard; Andrykowski, Michael; Toppmeyer, Deborah; Much, Judy; Dermody, James; Marvin, Monica; Baran, Jill; Schwalb, Marvin

    2005-01-01

    The common sense model posits that individuals' understanding of illness is based upon somatic symptoms and life experiences and thus may differ significantly from the biomedical view of illness. The current study used the common sense model to understand cancer risk perceptions in 99 individuals testing for BRCA1/2 mutations. Specifically, we examined change from post-counseling to post-result in (1) absolute risk (risk of developing cancer in one's lifetime) and (2) comparative risk (risk relative to the general population). Results indicated that absolute risk showed a trend such that those with a personal history of cancer receiving uninformative negative results reported decreased absolute risk. Further, individuals receiving uninformative negative results reported decreased comparative risk. Those with no personal cancer history receiving informative negative results did not decrease in risk over time nor did their risk differ from those with a personal cancer history, evidencing unrealistic pessimism regarding their risk of cancer. The reasons provided for individuals' risk perceptions could be classified in terms of attributes of the common sense model and included the: (1) causes of cancer (e.g. family history, mutation status); (2) control or cure of cancer through health behaviors and/or surgery; and (3) perceived timeline for developing cancer (e.g. time left in life to develop cancer). We conclude that key to developing interventions to improve understanding of cancer risk and promoting effective cancer control mechanisms is an understanding of the specific reasons underlying individuals' perceptions of cancer risk.

  20. Comparing paired vs non-paired statistical methods of analyses when making inferences about absolute risk reductions in propensity-score matched samples.

    PubMed

    Austin, Peter C

    2011-05-20

    Propensity-score matching allows one to reduce the effects of treatment-selection bias or confounding when estimating the effects of treatments when using observational data. Some authors have suggested that methods of inference appropriate for independent samples can be used for assessing the statistical significance of treatment effects when using propensity-score matching. Indeed, many authors in the applied medical literature use methods for independent samples when making inferences about treatment effects using propensity-score matched samples. Dichotomous outcomes are common in healthcare research. In this study, we used Monte Carlo simulations to examine the effect on inferences about risk differences (or absolute risk reductions) when statistical methods for independent samples are used compared with when statistical methods for paired samples are used in propensity-score matched samples. We found that compared with using methods for independent samples, the use of methods for paired samples resulted in: (i) empirical type I error rates that were closer to the advertised rate; (ii) empirical coverage rates of 95 per cent confidence intervals that were closer to the advertised rate; (iii) narrower 95 per cent confidence intervals; and (iv) estimated standard errors that more closely reflected the sampling variability of the estimated risk difference. Differences between the empirical and advertised performance of methods for independent samples were greater when the treatment-selection process was stronger compared with when treatment-selection process was weaker. We recommend using statistical methods for paired samples when using propensity-score matched samples for making inferences on the effect of treatment on the reduction in the probability of an event occurring.
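
    A minimal sketch of the contrast being simulated, an independent-samples standard error for the risk difference versus a paired standard error built from discordant-pair proportions, is given below. The matched pairs are simulated and, for brevity, uncorrelated within pairs; in real propensity-score matched data the within-pair correlation is exactly what separates the two standard errors.

        # Risk difference in matched pairs: independent vs paired SEs.
        import numpy as np

        rng = np.random.default_rng(7)
        n = 2000
        treated = rng.random(n) < 0.20      # event indicator, treated member
        control = rng.random(n) < 0.30      # event indicator, matched control

        p1, p0 = treated.mean(), control.mean()
        rd = p1 - p0                        # estimated risk difference

        se_indep = np.sqrt(p1 * (1 - p1) / n + p0 * (1 - p0) / n)

        b = np.mean(treated & ~control)     # discordant: event in treated only
        c = np.mean(~treated & control)     # discordant: event in control only
        se_paired = np.sqrt((b + c - (b - c) ** 2) / n)

        print(f"RD={rd:.3f} SE indep={se_indep:.4f} SE paired={se_paired:.4f}")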

  1. The reaction H + C4H2 - Absolute rate constant measurement and implication for atmospheric modeling of Titan

    NASA Technical Reports Server (NTRS)

    Nava, D. F.; Mitchell, M. B.; Stief, L. J.

    1986-01-01

    The absolute rate constant for the reaction H + C4H2 has been measured over the temperature (T) interval 210-423 K, using the technique of flash photolysis-resonance fluorescence. At each of the five temperatures employed, the results were independent of variations in C4H2 concentration, total pressure of Ar or N2, and flash intensity (i.e., the initial H concentration). The rate constant was found to be k = 1.39 × 10⁻¹⁰ exp(-1184/T) cm³ s⁻¹, with a quoted uncertainty of one standard deviation. The Arrhenius parameters at the high pressure limit determined here for the H + C4H2 reaction are consistent with those for the corresponding reactions of H with C2H2 and C3H4. Implications of the kinetic carbon chemistry results, particularly those at low temperature, are considered for models of the atmospheric carbon chemistry of Titan. The rate of this reaction, relative to that of the analogous, but slower, reaction of H + C2H2, appears to make H + C4H2 a very feasible reaction pathway for effective conversion of H atoms to molecular hydrogen in the stratosphere of Titan.
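
    Since the fitted Arrhenius expression is given explicitly, it can be evaluated directly over the measured temperature range; the snippet below does only that.

        # Evaluate k(T) = 1.39e-10 * exp(-1184/T) cm^3 s^-1 for H + C4H2.
        import math

        def k_H_C4H2(T):
            """Fitted rate constant, cm^3 molecule^-1 s^-1."""
            return 1.39e-10 * math.exp(-1184.0 / T)

        for T in (210, 298, 423):           # span of the measurements, in K
            print(f"T = {T:3d} K   k = {k_H_C4H2(T):.2e}")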

  2. A Quantitative Software Risk Assessment Model

    NASA Technical Reports Server (NTRS)

    Lee, Alice

    2002-01-01

    This slide presentation reviews a risk assessment model as applied to software development. The presentation uses graphs to demonstrate basic concepts of software reliability. It also discusses the application of the risk model to the software development life cycle.

  3. Constraining the Absolute Orientation of eta Carinae's Binary Orbit: A 3-D Dynamical Model for the Broad [Fe III] Emission

    NASA Technical Reports Server (NTRS)

    Madura, T. I.; Gull, T. R.; Owocki, S. P.; Groh, J. H.; Okazaki, A. T.; Russell, C. M. P.

    2011-01-01

    We present a three-dimensional (3-D) dynamical model for the broad [Fe III] emission observed in Eta Carinae using the Hubble Space Telescope/Space Telescope Imaging Spectrograph (HST/STIS). This model is based on full 3-D Smoothed Particle Hydrodynamics (SPH) simulations of Eta Car's binary colliding winds. Radiative transfer codes are used to generate synthetic spectro-images of [Fe III] emission line structures at various observed orbital phases and STIS slit position angles (PAs). Through a parameter study that varies the orbital inclination i, the angle θ that the orbital plane projection of the line-of-sight makes with the apastron side of the semi-major axis, and the PA on the sky of the orbital axis, we are able, for the first time, to tightly constrain the absolute 3-D orientation of the binary orbit. To simultaneously reproduce the blue-shifted emission arcs observed at orbital phase 0.976, STIS slit PA = +38°, and the temporal variations in emission seen at negative slit PAs, the binary needs to have an i ≈ 130° to 145°, θ ≈ -15° to +30°, and an orbital axis projected on the sky at a PA ≈ 302° to 327° east of north. This represents a system with an orbital axis that is closely aligned with the inferred polar axis of the Homunculus nebula, in 3-D. The companion star, Eta B, thus orbits clockwise on the sky and is on the observer's side of the system at apastron. This orientation has important implications for theories for the formation of the Homunculus and helps lay the groundwork for orbital modeling to determine the stellar masses.

  4. ABSOLUTE DIMENSIONS OF THE G7+K7 ECLIPSING BINARY STAR IM VIRGINIS: DISCREPANCIES WITH STELLAR EVOLUTION MODELS

    SciTech Connect

    Morales, Juan Carlos; Marschall, Laurence A.; Brehm, William

    2009-12-10

    We report extensive spectroscopic and differential photometric BVRI observations of the active, detached, 1.309-day double-lined eclipsing binary IM Vir, composed of a G7-type primary and a K7 secondary. With these observations, we derive accurate absolute masses and radii of M₁ = 0.981 ± 0.012 M☉, M₂ = 0.6644 ± 0.0048 M☉, R₁ = 1.061 ± 0.016 R☉, and R₂ = 0.681 ± 0.013 R☉ for the primary and secondary, with relative errors under 2%. The effective temperatures are 5570 ± 100 K and 4250 ± 130 K, respectively. The significant difference in mass makes this a favorable case for comparison with stellar evolution theory. We find that both stars are larger than the models predict, by 3.7% for the primary and 7.5% for the secondary, as well as cooler than expected, by 100 K and 150 K, respectively. These discrepancies are in line with previously reported differences in low-mass stars, and are believed to be caused by chromospheric activity, which is not accounted for in current models. The effect is not confined to low-mass stars: the rapidly rotating primary of IM Vir joins the growing list of objects of near-solar mass (but still with convective envelopes) that show similar anomalies. The comparison with the models suggests an age of 2.4 Gyr for the system, and a metallicity of [Fe/H] ≈ -0.3 that is consistent with other indications, but requires confirmation.

  5. The performance of different propensity-score methods for estimating differences in proportions (risk differences or absolute risk reductions) in observational studies.

    PubMed

    Austin, Peter C

    2010-09-10

    Propensity score methods are increasingly being used to estimate the effects of treatments on health outcomes using observational data. There are four methods for using the propensity score to estimate treatment effects: covariate adjustment using the propensity score, stratification on the propensity score, propensity-score matching, and inverse probability of treatment weighting (IPTW) using the propensity score. When outcomes are binary, the effect of treatment on the outcome can be described using odds ratios, relative risks, risk differences, or the number needed to treat. Several clinical commentators suggested that risk differences and numbers needed to treat are more meaningful for clinical decision making than are odds ratios or relative risks. However, there is a paucity of information about the relative performance of the different propensity-score methods for estimating risk differences. We conducted a series of Monte Carlo simulations to examine this issue. We examined bias, variance estimation, coverage of confidence intervals, mean-squared error (MSE), and type I error rates. A doubly robust version of IPTW had superior performance compared with the other propensity-score methods. It resulted in unbiased estimation of risk differences, treatment effects with the lowest standard errors, confidence intervals with the correct coverage rates, and correct type I error rates. Stratification, matching on the propensity score, and covariate adjustment using the propensity score resulted in minor to modest bias in estimating risk differences. Estimators based on IPTW had lower MSE compared with other propensity-score methods. Differences between IPTW and propensity-score matching may reflect that these two methods estimate the average treatment effect and the average treatment effect for the treated, respectively.
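
    A minimal sketch of the IPTW estimator of a risk difference follows. The data-generating process is invented, and the true propensity score stands in for an estimated one purely for brevity; in practice the score would be estimated, for example by logistic regression.

        # IPTW risk difference on simulated confounded data.
        import numpy as np

        rng = np.random.default_rng(3)
        n = 10_000
        x = rng.normal(size=n)                       # a confounder
        p_treat = 1 / (1 + np.exp(-0.5 * x))         # true propensity score
        z = rng.random(n) < p_treat                  # treatment received
        p_event = 1 / (1 + np.exp(-(-1.5 + 0.8 * x - 0.4 * z)))
        y = rng.random(n) < p_event                  # binary outcome

        w = np.where(z, 1 / p_treat, 1 / (1 - p_treat))   # inverse weights
        p1 = np.sum(w * y * z) / np.sum(w * z)
        p0 = np.sum(w * y * ~z) / np.sum(w * ~z)
        print(f"IPTW risk difference: {p1 - p0:.3f}")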

  6. Development and application of chronic disease risk prediction models.

    PubMed

    Oh, Sun Min; Stefani, Katherine M; Kim, Hyeon Chang

    2014-07-01

    Currently, non-communicable chronic diseases are a major cause of morbidity and mortality worldwide, and a large proportion of chronic diseases are preventable through risk factor management. However, the prevention efficacy at the individual level is not yet satisfactory. Chronic disease prediction models have been developed to assist physicians and individuals in clinical decision-making. A chronic disease prediction model assesses multiple risk factors together and estimates an absolute disease risk for the individual. Accurate prediction of an individual's future risk for a certain disease enables the comparison of benefits and risks of treatment, the costs of alternative prevention strategies, and selection of the most efficient strategy for the individual. A large number of chronic disease prediction models, especially targeting cardiovascular diseases and cancers, have been suggested, and some of them have been adopted in the clinical practice guidelines and recommendations of many countries. Although few chronic disease prediction tools have been suggested in the Korean population, their clinical utility is not as high as expected. This article reviews methodologies that are commonly used for developing and evaluating a chronic disease prediction model and discusses the current status of chronic disease prediction in Korea.

  7. A methodological survey of the analysis, reporting and interpretation of Absolute Risk ReductiOn in systematic revieWs (ARROW): a study protocol

    PubMed Central

    2013-01-01

    Background Clinicians, providers and guideline panels use absolute effects to weigh the advantages and downsides of treatment alternatives. Relative measures have the potential to mislead readers. However, little is known about the reporting of absolute measures in systematic reviews. The objectives of our study are to determine the proportion of systematic reviews that report absolute measures of effect for the most important outcomes, and ascertain how they are analyzed, reported and interpreted. Methods/design We will conduct a methodological survey of systematic reviews published in 2010. We will conduct a 1:1 stratified random sampling of Cochrane vs. non-Cochrane systematic reviews. We will calculate the proportion of systematic reviews reporting at least one absolute estimate of effect for the most patient-important outcome for the comparison of interest. We will conduct multivariable logistic regression analyses with the reporting of an absolute estimate of effect as the dependent variable and pre-specified study characteristics as the independent variables. For systematic reviews reporting an absolute estimate of effect, we will document the methods used for the analysis, reporting and interpretation of the absolute estimate. Discussion Our methodological survey will inform current practices regarding reporting of absolute estimates in systematic reviews. Our findings may influence recommendations on reporting, conduct and interpretation of absolute estimates. Our results are likely to be of interest to systematic review authors, funding agencies, clinicians, guideline developers and journal editors. PMID:24330779
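
    The distinction the protocol targets can be made concrete with a small worked example (numbers assumed): the same relative risk yields very different absolute risk reductions, and numbers needed to treat, at different baseline risks.

        # Absolute effect of a treatment with relative risk 0.75 at two
        # assumed baseline risks.
        def absolute_effect(baseline_risk, relative_risk):
            arr = baseline_risk * (1 - relative_risk)   # absolute risk reduction
            nnt = 1 / arr                               # number needed to treat
            return arr, nnt

        for baseline in (0.02, 0.20):
            arr, nnt = absolute_effect(baseline, relative_risk=0.75)
            print(f"baseline {baseline:.0%}: ARR = {arr:.1%}, NNT = {nnt:.0f}")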

  8. NGNP Risk Management Database: A Model for Managing Risk

    SciTech Connect

    John Collins; John M. Beck

    2011-11-01

    The Next Generation Nuclear Plant (NGNP) Risk Management System (RMS) is a database used to maintain the project risk register. The RMS also maps risk reduction activities to specific identified risks. Further functionality of the RMS includes mapping reactor suppliers' Design Data Needs (DDNs) to risk reduction tasks and mapping Phenomena Identification and Ranking Tables (PIRTs) to associated risks. This document outlines the basic instructions on how to use the RMS. This document constitutes Revision 1 of the NGNP Risk Management Database: A Model for Managing Risk. It incorporates the latest enhancements to the RMS. The enhancements include six new custom views of risk data: Impact/Consequence, Tasks by Project Phase, Tasks by Status, Tasks by Project Phase/Status, Tasks by Impact/WBS, and Tasks by Phase/Impact/WBS.

  9. Eosinophil count - absolute

    MedlinePlus

    Eosinophils; Absolute eosinophil count ... the white blood cell count to give the absolute eosinophil count. ... than 500 cells per microliter (cells/mcL). Normal value ranges may vary slightly among different laboratories. Talk ...
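
    The calculation behind this record is a single multiplication, the white blood cell count times the eosinophil percentage; a minimal sketch with assumed values:

        def absolute_eosinophil_count(wbc_per_mcl, eosinophil_percent):
            """Absolute eosinophil count in cells per microliter."""
            return wbc_per_mcl * eosinophil_percent / 100.0

        print(absolute_eosinophil_count(8000, 4))   # 320 cells/mcL, under 500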

  10. Modeling Research Project Risks with Fuzzy Maps

    ERIC Educational Resources Information Center

    Bodea, Constanta Nicoleta; Dascalu, Mariana Iuliana

    2009-01-01

    The authors propose a risks evaluation model for research projects. The model is based on fuzzy inference. The knowledge base for the fuzzy process is built with a causal and cognitive map of risks. The map was especially developed for research projects, taking into account their typical lifecycle. The model was applied to an e-testing research…

  11. Relative risk regression models with inverse polynomials.

    PubMed

    Ning, Yang; Woodward, Mark

    2013-08-30

    The proportional hazards model assumes that the log hazard ratio is a linear function of parameters. In the current paper, we model the log relative risk as an inverse polynomial, which is particularly suitable for modeling bounded and asymmetric functions. The parameters estimated by maximizing the partial likelihood are consistent and asymptotically normal. The advantages of the inverse polynomial model over the ordinary polynomial model and the fractional polynomial model for fitting various asymmetric log relative risk functions are shown by simulation. The utility of the method is further supported by analyzing two real data sets, addressing the specific question of the location of the minimum risk threshold.
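
    As a shape illustration only, using the Nelder-style inverse-polynomial form with invented coefficients, the snippet below shows why the form suits bounded, asymmetric log relative risk functions: the curve stays bounded as the exposure grows instead of diverging like an ordinary polynomial.

        # Log relative risk modelled as x / (b0 + b1*x + b2*x**2).
        import numpy as np

        def log_rr_inverse_poly(x, b0, b1, b2):
            return x / (b0 + b1 * x + b2 * x ** 2)

        x = np.linspace(0.1, 10.0, 5)
        print(np.round(log_rr_inverse_poly(x, b0=2.0, b1=0.5, b2=0.1), 3))
        # The values rise, peak and stay bounded: an asymmetric profile that
        # a low-degree ordinary polynomial cannot mimic without diverging.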

  12. Requirements based system risk modeling

    NASA Technical Reports Server (NTRS)

    Meshkat, Leila; Cornford, Steven; Feather, Martin

    2004-01-01

    The problem that we address in this paper is assessing the expected degree of success of the system or mission based on the degree to which each requirement is satisfied and the relative weight of the requirements. We assume a complete list of the requirements, the relevant risk elements and their probability of occurrence and the quantified effect of the risk elements on the requirements. In order to assess the degree to which each requirement is satisfied, we need to determine the effect of the various risk elements on the requirement.
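
    A minimal sketch of this kind of roll-up, with invented requirements, weights, risk-element probabilities and impacts, is shown below: each risk element degrades the expected satisfaction of the requirements it affects, and the weighted sum gives the expected degree of mission success.

        # Expected mission success from weighted requirement satisfaction.
        risk_elements = [          # (probability, {requirement: impact 0..1})
            (0.10, {"nav": 0.8, "comms": 0.2}),
            (0.05, {"power": 0.9}),
        ]
        weights = {"nav": 0.5, "comms": 0.3, "power": 0.2}  # relative weights

        satisfaction = {req: 1.0 for req in weights}
        for p, impacts in risk_elements:
            for req, impact in impacts.items():
                satisfaction[req] -= p * impact    # expected degradation

        expected_success = sum(weights[r] * satisfaction[r] for r in weights)
        print(satisfaction, round(expected_success, 3))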

  13. Modeling extreme risks in ecology.

    PubMed

    Burgman, Mark; Franklin, James; Hayes, Keith R; Hosack, Geoffrey R; Peters, Gareth W; Sisson, Scott A

    2012-11-01

    Extreme risks in ecology are typified by circumstances in which data are sporadic or unavailable, understanding is poor, and decisions are urgently needed. Expert judgments are pervasive and disagreements among experts are commonplace. We outline approaches to evaluating extreme risks in ecology that rely on stochastic simulation, with a particular focus on methods to evaluate the likelihood of extinction and quasi-extinction of threatened species, and the likelihood of establishment and spread of invasive pests. We evaluate the importance of assumptions in these assessments and the potential of some new approaches to account for these uncertainties, including hierarchical estimation procedures and generalized extreme value distributions. We conclude by examining the treatment of consequences in extreme risk analysis in ecology and how expert judgment may better be harnessed to evaluate extreme risks.
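
    One approach named above, fitting a generalized extreme value distribution, can be sketched in a few lines; the annual-maxima series below is synthetic and the exceedance threshold arbitrary.

        # Fit a GEV to synthetic annual maxima and estimate a tail probability.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(11)
        annual_maxima = rng.gumbel(loc=10.0, scale=2.0, size=60)

        shape, loc, scale = stats.genextreme.fit(annual_maxima)
        p_exceed = stats.genextreme.sf(18.0, shape, loc=loc, scale=scale)
        print(f"P(annual maximum > 18) ~= {p_exceed:.4f}")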

  14. Reliability Models and Attributable Risk

    NASA Technical Reports Server (NTRS)

    Jarvinen, Richard D.

    1999-01-01

    The intention of this report is to bring a developing and extremely useful statistical methodology to greater attention within the Safety, Reliability, and Quality Assurance Office of the NASA Johnson Space Center. The statistical methods in this exposition are found under the heading of attributable risk. Recently the Safety, Reliability, and Quality Assurance Office at the Johnson Space Center has supported efforts to introduce methods of medical research statistics dealing with the survivability of people to bear on the problems of aerospace that deal with the reliability of component hardware used in the NASA space program. This report, which describes several study designs for which attributable risk is used, is in concert with the latter goals. The report identifies areas of active research in attributable risk while briefly describing much of what has been developed in the theory of attributable risk. The report, which largely is a report on a report, attempts to recast the medical setting and language commonly found in descriptions of attributable risk into the setting and language of the space program and its component hardware.
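
    The core attributable-risk quantities the report surveys can be computed directly; in the hardware framing, "exposure" might be a particular supplier or production lot. The sketch below uses Levin's population attributable fraction with assumed failure risks and exposure prevalence.

        # Excess risk and Levin's population attributable fraction.
        def attributable_risk(p_exposed, p_unexposed, prevalence):
            ar = p_exposed - p_unexposed        # risk difference (excess risk)
            rr = p_exposed / p_unexposed        # relative risk
            paf = prevalence * (rr - 1) / (1 + prevalence * (rr - 1))
            return ar, paf

        ar, paf = attributable_risk(p_exposed=0.02, p_unexposed=0.005,
                                    prevalence=0.3)
        print(f"excess risk = {ar:.3f}, population fraction = {paf:.2%}")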

  15. A methodology for modeling regional terrorism risk.

    PubMed

    Chatterjee, Samrat; Abkowitz, Mark D

    2011-07-01

    Over the past decade, terrorism risk has become a prominent consideration in protecting the well-being of individuals and organizations. More recently, there has been interest in not only quantifying terrorism risk, but also placing it in the context of an all-hazards environment in which consideration is given to accidents and natural hazards, as well as intentional acts. This article discusses the development of a regional terrorism risk assessment model designed for this purpose. The approach taken is to model terrorism risk as a dependent variable, expressed in expected annual monetary terms, as a function of attributes of population concentration and critical infrastructure. This allows for an assessment of regional terrorism risk in and of itself, as well as in relation to man-made accident and natural hazard risks, so that mitigation resources can be allocated in an effective manner. The adopted methodology incorporates elements of two terrorism risk modeling approaches (event-based models and risk indicators), producing results that can be utilized at various jurisdictional levels. The validity, strengths, and limitations of the model are discussed in the context of a case study application within the United States.

  16. Adversarial risk analysis for counterterrorism modeling.

    PubMed

    Rios, Jesus; Rios Insua, David

    2012-05-01

    Recent large-scale terrorist attacks have raised interest in models for resource allocation against terrorist threats. The unifying theme in this area is the need to develop methods for the analysis of allocation decisions when risks stem from the intentional actions of intelligent adversaries. Most approaches to these problems have a game-theoretic flavor although there are also several interesting decision-analytic-based proposals. One of them is the recently introduced framework for adversarial risk analysis, which deals with decision-making problems that involve intelligent opponents and uncertain outcomes. We explore how adversarial risk analysis addresses some standard counterterrorism models: simultaneous defend-attack models, sequential defend-attack-defend models, and sequential defend-attack models with private information. For each model, we first assess critically what would be a typical game-theoretic approach and then provide the corresponding solution proposed by the adversarial risk analysis framework, emphasizing how to coherently assess a predictive probability model of the adversary's actions, in a context in which we aim at supporting decisions of a defender versus an attacker. This illustrates the application of adversarial risk analysis to basic counterterrorism models that may be used as basic building blocks for more complex risk analysis of counterterrorism problems. PMID:22150163
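
    In the adversarial risk analysis framing, the defender's side of the simultaneous defend-attack model reduces to maximizing expected utility against a predictive distribution over the attacker's actions. The sketch below uses invented actions, utilities and probabilities to show just that computation.

        # Defender's ARA solution: max expected utility under p(attack|defence).
        defences = ["harden_site", "patrol", "do_nothing"]
        attacks = ["bomb", "cyber", "none"]

        u = {  # defender's utility u(d, a), illustrative values
            ("harden_site", "bomb"): -2, ("harden_site", "cyber"): -6,
            ("harden_site", "none"): -1, ("patrol", "bomb"): -5,
            ("patrol", "cyber"): -4, ("patrol", "none"): -1,
            ("do_nothing", "bomb"): -10, ("do_nothing", "cyber"): -8,
            ("do_nothing", "none"): 0,
        }
        p = {  # predictive distribution over attacks, given each defence
            "harden_site": {"bomb": 0.1, "cyber": 0.5, "none": 0.4},
            "patrol": {"bomb": 0.3, "cyber": 0.3, "none": 0.4},
            "do_nothing": {"bomb": 0.5, "cyber": 0.3, "none": 0.2},
        }

        best = max(defences,
                   key=lambda d: sum(p[d][a] * u[(d, a)] for a in attacks))
        print(best)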

  17. PRISM: a planned risk information seeking model.

    PubMed

    Kahlor, LeeAnn

    2010-06-01

    Recent attention on health-related information seeking has focused primarily on information seeking within specific health and health risk contexts. This study attempts to shift some of that focus to individual-level variables that may impact health risk information seeking across contexts. To locate these variables, the researcher posits an integrated model, the Planned Risk Information Seeking Model (PRISM). The model, which treats risk information seeking as a deliberate (planned) behavior, maps variables found in the Theory of Planned Behavior (TPB; Ajzen, 1991) and the Risk Information Seeking and Processing Model (RISP; Griffin, Dunwoody, & Neuwirth, 1999), and posits linkages among those variables. This effort is further informed by Kahlor's (2007) Augmented RISP, the Theory of Motivated Information Management (Afifi & Weiner, 2004), the Comprehensive Model of Information Seeking (Johnson & Meischke, 1993), the Health Information Acquisition Model (Freimuth, Stein, & Kean, 1989), and the Extended Parallel Processing Model (Witte, 1998). The resulting integrated model accounted for 59% of the variance in health risk information-seeking intent and performed better than the TPB or the RISP alone. PMID:20512716

  18. Quantitative risk modeling in aseptic manufacture.

    PubMed

    Tidswell, Edward C; McGarvey, Bernard

    2006-01-01

    Expedient risk assessment of aseptic manufacturing processes offers unique opportunities for improved and sustained assurance of product quality. Contemporary risk assessments applied to aseptic manufacturing processes, however, are commonly handicapped by assumptions and subjectivity, leading to inexactitude. Quantitative risk modeling augmented with Monte Carlo simulations represents a novel, innovative, and more efficient means of risk assessment. This technique relies upon fewer assumptions and removes subjectivity to more swiftly generate an improved, more realistic, quantitative estimate of risk. The fundamental steps and requirements for an assessment of the risk of bioburden ingress into aseptically manufactured products are described. A case study exemplifies how quantitative risk modeling and Monte Carlo simulations achieve a more rapid and improved determination of the risk of bioburden ingress during the aseptic filling of a parenteral product. Although application of quantitative risk modeling is described here purely for the purpose of process improvement, the technique has far wider relevance in the assisted disposition of batches, cleanroom management, and the utilization of real-time data from rapid microbial monitoring technologies. PMID:17089696
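
    A minimal sketch of such a Monte Carlo assessment of bioburden ingress during filling is given below; every distribution and parameter is invented for illustration and carries no validated meaning.

        # Monte Carlo sketch: probability a filled unit picks up >= 1 CFU.
        import numpy as np

        rng = np.random.default_rng(5)
        n_sim = 100_000
        airborne_cfu = rng.lognormal(-3.0, 1.0, n_sim)  # CFU rate near fill
        exposure_s = rng.uniform(2.0, 8.0, n_sim)       # open-container time
        deposition = rng.beta(1, 50, n_sim)             # fraction deposited

        expected_ingress = airborne_cfu * exposure_s * deposition
        p_contaminated = 1 - np.exp(-expected_ingress)  # Poisson, >= 1 CFU
        print(f"simulated contamination risk: {p_contaminated.mean():.2e}")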

  19. Frequency-domain analysis of absolute gravimeters

    NASA Astrophysics Data System (ADS)

    Svitlov, S.

    2012-12-01

    An absolute gravimeter is analysed as a linear time-invariant system in the frequency domain. Frequency responses of absolute gravimeters are derived analytically based on the propagation of the complex exponential signal through their linear measurement functions. Depending on the model of motion and the number of time-distance coordinates, an absolute gravimeter is considered as a second-order (three-level scheme) or third-order (multiple-level scheme) low-pass filter. It is shown that the behaviour of an atom absolute gravimeter in the frequency domain corresponds to that of the three-level corner-cube absolute gravimeter. Theoretical results are applied for evaluation of random and systematic measurement errors and optimization of an experiment. The developed theory agrees with known results of an absolute gravimeter analysis in the time and frequency domains and can be used for measurement uncertainty analyses, building of vibration-isolation systems and synthesis of digital filtering algorithms.

  20. Korean Risk Assessment Model for Breast Cancer Risk Prediction

    PubMed Central

    Park, Boyoung; Ma, Seung Hyun; Shin, Aesun; Chang, Myung-Chul; Choi, Ji-Yeob; Kim, Sungwan; Han, Wonshik; Noh, Dong-Young; Ahn, Sei-Hyun; Kang, Daehee; Yoo, Keun-Young; Park, Sue K.

    2013-01-01

    Purpose We evaluated the performance of the Gail model for a Korean population and developed a Korean breast cancer risk assessment tool (KoBCRAT) based upon equations developed for the Gail model for predicting breast cancer risk. Methods Using 3,789 sets of cases and controls, risk factors for breast cancer among Koreans were identified. Individual probabilities were projected using Gail's equations and Korean hazard data. We compared the 5-year and lifetime risk produced using the modified Gail model which applied Korean incidence and mortality data and the parameter estimators from the original Gail model with those produced using the KoBCRAT. We validated the KoBCRAT based on the expected/observed breast cancer incidence and area under the curve (AUC) using two Korean cohorts: the Korean Multicenter Cancer Cohort (KMCC) and National Cancer Center (NCC) cohort. Results The major risk factors under the age of 50 were family history, age at menarche, age at first full-term pregnancy, menopausal status, breastfeeding duration, oral contraceptive usage, and exercise, while those at and over the age of 50 were family history, age at menarche, age at menopause, pregnancy experience, body mass index, oral contraceptive usage, and exercise. The modified Gail model produced lower 5-year risk for the cases than for the controls (p = 0.017), while the KoBCRAT produced higher 5-year and lifetime risk for the cases than for the controls (p<0.001 and <0.001, respectively). The observed incidence of breast cancer in the two cohorts was similar to the expected incidence from the KoBCRAT (KMCC, p = 0.880; NCC, p = 0.878). The AUC using the KoBCRAT was 0.61 for the KMCC and 0.89 for the NCC cohort. Conclusions Our findings suggest that the KoBCRAT is a better tool for predicting the risk of breast cancer in Korean women, especially urban women. PMID:24204664
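
    The Gail-style projection that such tools adapt can be sketched in simplified form: multiply age-specific baseline hazards by an individual relative risk and accumulate over the projection window. The hazards and relative risk below are invented, and competing mortality, which the full model subtracts, is omitted.

        # Simplified 5-year absolute risk from baseline hazards and a RR.
        import math

        def five_year_risk(age, rel_risk, hazard_by_age):
            cum = sum(rel_risk * hazard_by_age.get(a, 0.0)
                      for a in range(age, age + 5))
            return 1 - math.exp(-cum)

        # Hypothetical baseline incidence hazards per year of age.
        hazard = {a: 0.0008 + 0.00002 * (a - 40) for a in range(40, 80)}
        print(f"{five_year_risk(50, rel_risk=1.8, hazard_by_age=hazard):.2%}")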

  1. Integrated Environmental Modeling: Quantitative Microbial Risk Assessment

    EPA Science Inventory

    The presentation discusses the need for microbial assessments and presents a road map associated with quantitative microbial risk assessments, through an integrated environmental modeling approach. A brief introduction and the strengths of the current knowledge are illustrated. W...

  2. Absolute biological needs.

    PubMed

    McLeod, Stephen

    2014-07-01

    Absolute needs (as against instrumental needs) are independent of the ends, goals and purposes of personal agents. Against the view that the only needs are instrumental needs, David Wiggins and Garrett Thomson have defended absolute needs on the grounds that the verb 'need' has instrumental and absolute senses. While remaining neutral about it, this article does not adopt that approach. Instead, it suggests that there are absolute biological needs. The absolute nature of these needs is defended by appeal to: their objectivity (as against mind-dependence); the universality of the phenomenon of needing across the plant and animal kingdoms; the impossibility that biological needs depend wholly upon the exercise of the abilities characteristic of personal agency; the contention that the possession of biological needs is prior to the possession of the abilities characteristic of personal agency. Finally, three philosophical usages of 'normative' are distinguished. On two of these, to describe a phenomenon or claim as 'normative' is to describe it as value-dependent. A description of a phenomenon or claim as 'normative' in the third sense does not entail such value-dependency, though it leaves open the possibility that value depends upon the phenomenon or upon the truth of the claim. It is argued that while survival needs (or claims about them) may well be normative in this third sense, they are normative in neither of the first two. Thus, the idea of absolute need is not inherently normative in either of the first two senses. PMID:23586876

  4. Absolute neutrino mass measurements

    SciTech Connect

    Wolf, Joachim

    2011-10-06

    The neutrino mass plays an important role in particle physics, astrophysics and cosmology. In recent years the detection of neutrino flavour oscillations proved that neutrinos carry mass. However, oscillation experiments are only sensitive to the mass-squared difference of the mass eigenvalues. In contrast to cosmological observations and neutrinoless double beta decay (0ν2β) searches, single β-decay experiments provide a direct, model-independent way to determine the absolute neutrino mass by measuring the energy spectrum of decay electrons at the endpoint region with high accuracy. Currently the best kinematic upper limits on the neutrino mass of 2.2 eV have been set by two experiments in Mainz and Troitsk, using tritium as beta emitter. The next-generation tritium β-experiment KATRIN is currently under construction in Karlsruhe/Germany by an international collaboration. KATRIN intends to improve the sensitivity by one order of magnitude to 0.2 eV. The investigation of a second isotope (¹⁸⁷Re) is being pursued by the international MARE collaboration using micro-calorimeters to measure the beta spectrum. The technology needed to reach 0.2 eV sensitivity is still in the R&D phase. This paper reviews the present status of neutrino-mass measurements with cosmological data, 0ν2β decay and single β-decay.

  5. Breast cancer risk prediction using a clinical risk model and polygenic risk score.

    PubMed

    Shieh, Yiwey; Hu, Donglei; Ma, Lin; Huntsman, Scott; Gard, Charlotte C; Leung, Jessica W T; Tice, Jeffrey A; Vachon, Celine M; Cummings, Steven R; Kerlikowske, Karla; Ziv, Elad

    2016-10-01

    Breast cancer risk assessment can inform the use of screening and prevention modalities. We investigated the performance of the Breast Cancer Surveillance Consortium (BCSC) risk model in combination with a polygenic risk score (PRS) comprised of 83 single nucleotide polymorphisms identified from genome-wide association studies. We conducted a nested case-control study of 486 cases and 495 matched controls within a screening cohort. The PRS was calculated using a Bayesian approach. The contributions of the PRS and variables in the BCSC model to breast cancer risk were tested using conditional logistic regression. Discriminatory accuracy of the models was compared using the area under the receiver operating characteristic curve (AUROC). Increasing quartiles of the PRS were positively associated with breast cancer risk, with OR 2.54 (95 % CI 1.69-3.82) for breast cancer in the highest versus lowest quartile. In a multivariable model, the PRS, family history, and breast density remained strong risk factors. The AUROC of the PRS was 0.60 (95 % CI 0.57-0.64), and an Asian-specific PRS had AUROC 0.64 (95 % CI 0.53-0.74). A combined model including the BCSC risk factors and PRS had better discrimination than the BCSC model (AUROC 0.65 versus 0.62, p = 0.01). The BCSC-PRS model classified 18 % of cases as high-risk (5-year risk ≥3 %), compared with 7 % using the BCSC model. The PRS improved discrimination of the BCSC risk model and classified more cases as high-risk. Further consideration of the PRS's role in decision-making around screening and prevention strategies is merited. PMID:27565998
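
    At its core, the PRS referenced here is a weighted sum of risk-allele dosages with per-SNP log odds ratios as weights (the paper's Bayesian weighting is not reproduced). The sketch below uses 83 SNPs to match the abstract; weights and genotypes are random placeholders.

        # Polygenic risk score: dosage matrix times per-SNP log odds ratios.
        import numpy as np

        rng = np.random.default_rng(9)
        n_snps, n_subjects = 83, 5
        log_or = rng.normal(0.05, 0.03, n_snps)             # per-allele log ORs
        dosages = rng.integers(0, 3, (n_subjects, n_snps))  # 0/1/2 risk alleles

        prs = dosages @ log_or                              # one score each
        print(np.round(prs, 3))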

  6. Using A New Model for Main Sequence Turnoff Absolute Magnitudes to Measure Stellar Streams in the Milky Way Halo

    NASA Astrophysics Data System (ADS)

    Weiss, Jake; Newberg, Heidi Jo; Arsenault, Matthew; Bechtel, Torrin; Desell, Travis; Newby, Matthew; Thompson, Jeffery M.

    2016-01-01

    Statistical photometric parallax is a method for using the distribution of absolute magnitudes of stellar tracers to statistically recover the underlying density distribution of these tracers. In previous work, statistical photometric parallax was used to trace the Sagittarius Dwarf tidal stream, the so-called bifurcated piece of the Sagittarius stream, and the Virgo Overdensity through the Milky Way. We use an improved knowledge of this distribution in a new algorithm that accounts for the changes in the stellar population of color-selected stars near the photometric limit of the Sloan Digital Sky Survey (SDSS). Although we select bluer main sequence turnoff (MSTO) stars as tracers, large color errors near the survey limit cause many stars to be scattered out of our selection box and many fainter, redder stars to be scattered into our selection box. We show that we are able to recover parameters for analogues of these streams in simulated data using a maximum likelihood optimization on MilkyWay@home. We also present the preliminary results of fitting the density distribution of major Milky Way tidal streams in SDSS data. This research is supported by generous gifts from the Marvin Clan, Babette Josephs, Manit Limlamai, and the MilkyWay@home volunteers.

  7. Risk Management in environmental geotechnical modelling

    NASA Astrophysics Data System (ADS)

    Tammemäe, Olavi; Torn, Hardi

    2008-01-01

    The objective of this article is to provide an overview of the basis of risk analysis, assessment and management, the accompanying problems, and the principles of risk management when drafting an environmental geotechnical model, enabling an entire territory or developed region to be analysed as a whole. The environmental impact will remain within the limits of the criteria specified by the standards and will be acceptable for human health and the environment. An essential part of the solution is an engineering-geological model based on risk analysis and on the assessment and forecasting of the mutual effects of the processes.

  8. Possibilities of modelling of local and global hydrological changes from high-resolution Global Hydrological Model in the absolute gravity observations - the case of Józefosław Observatory

    NASA Astrophysics Data System (ADS)

    Olszak, Tomasz; Barlik, Marcin; Pachuta, Andrzej; Próchniewicz, Dominik

    2014-05-01

    Geodynamical use of epoch gravimetric relative and absolute observations requires the elimination of one of the most significant effects, that related to local and global changes in hydrological conditions. This hydrological effect is associated with changes in groundwater levels and soil moisture around the gravimetric station. In Poland, quasi-permanent observations of gravity changes by the absolute method have been carried out since 2005 at the gravity station located in the Astronomical-Geodetic Observatory in Józefosław. The poster briefly describes the measurement strategy of the absolute observations and different approaches to the elimination of the local and global effects associated with changes in hydrology. This paper will discuss the results of the analysis of tidal observations relevant to the processing of absolute observations: seasonal changes in the barometric correction factor and differences in the locally determined tidal correction model. Analysis of the possibility of eliminating the global hydrological influence is based on the GLDAS model with a spatial resolution of 0.25 degrees, applied independently at local and global scales. The Józefosław Observatory is equipped with additional sensors for monitoring local hydrological conditions, which makes it possible to verify the quality of modelling of hydrological changes with global models on both local and global scales.

  10. The absolute path command

    SciTech Connect

    Moody, A.

    2012-05-11

    The ap command traverses all symlinks in a given file, directory, or executable name to identify the final absolute path. It can print just the final path, each intermediate link along with the symlink chain, and the permissions and ownership of each directory component in the final path. It has functionality similar to "which", except that it shows the final path instead of the first path. It is also similar to "pwd", but it can provide the absolute path to a relative directory from the current working directory.

  11. Long range Ising model for credit risk modeling

    NASA Astrophysics Data System (ADS)

    Molins, Jordi; Vives, Eduard

    2005-07-01

    Within the framework of the maximum entropy principle we show that the finite-size long-range Ising model is the adequate model for the description of homogeneous credit portfolios and the computation of credit risk when default correlations between the borrowers are included. The exact analysis of the model suggests that when the correlation increases, a first-order-like transition may occur, inducing a sudden risk increase.
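
    Because the long-range (infinite-range) Ising model has a uniform coupling, the portfolio loss distribution depends only on the number of defaults and can be enumerated exactly. The sketch below does this for invented parameters; raising the coupling J fattens the loss tail, which is the mechanism behind the sudden risk increase described in the abstract.

        # Exact loss distribution of a finite-size mean-field Ising portfolio.
        import math
        import numpy as np

        N, J, h, beta = 50, 0.4, -1.0, 1.0   # obligors, coupling, field, 1/T

        weights = []
        for k in range(N + 1):               # k defaulted obligors (s = +1)
            m = (2 * k - N) / N              # net magnetization
            energy = -0.5 * J * N * m * m - h * N * m
            weights.append(math.comb(N, k) * math.exp(-beta * energy))

        p_k = np.array(weights) / sum(weights)
        print(f"mean defaults: {sum(k * p for k, p in enumerate(p_k)):.1f}")
        print(f"P(> 20 defaults): {p_k[21:].sum():.3e}")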

  12. A comparison of absolute performance of different correlative and mechanistic species distribution models in an independent area.

    PubMed

    Shabani, Farzin; Kumar, Lalit; Ahmadi, Mohsen

    2016-08-01

    To investigate the comparative abilities of six different bioclimatic models in an independent area, utilizing the distribution of eight different species available at a global scale and in Australia. Global scale and Australia. We tested a variety of bioclimatic models for eight different plant species employing five discriminatory correlative species distribution models (SDMs) including Generalized Linear Model (GLM), MaxEnt, Random Forest (RF), Boosted Regression Tree (BRT), Bioclim, together with CLIMEX (CL) as a mechanistic niche model. These models were fitted using a training dataset of available global data, but with the exclusion of Australian locations. The capabilities of these techniques in projecting suitable climate, based on independent records for these species in Australia, were compared. Thus, Australia was not used to calibrate the models and therefore serves as an independent area with regard to geographic locations. To assess and compare performance, we utilized the area under the receiver operating characteristic (ROC) curve (AUC), the true skill statistic (TSS), and fractional predicted areas for all SDMs. In addition, we assessed the agreement between the outputs of the six different bioclimatic models for all eight species in Australia. The modeling method impacted the potential distribution predictions under current climate. However, the utilization of sensitivity and the fractional predicted areas showed that GLM, MaxEnt, Bioclim, and CL had the highest sensitivity for Australian climate conditions. Bioclim calculated the highest fractional predicted area of an independent area, while RF and BRT were poor. For many applications, it is difficult to decide which bioclimatic model to use. This research shows that variable results are obtained using different SDMs in an independent area. This research also shows that the SDMs produce different results for different species; for example, Bioclim may not be good for one species but works better

  13. A comparison of absolute performance of different correlative and mechanistic species distribution models in an independent area.

    PubMed

    Shabani, Farzin; Kumar, Lalit; Ahmadi, Mohsen

    2016-08-01

    To investigate the comparative abilities of six different bioclimatic models in an independent area, utilizing the distribution of eight different species available at a global scale and in Australia. Global scale and Australia. We tested a variety of bioclimatic models for eight different plant species employing five discriminatory correlative species distribution models (SDMs) including Generalized Linear Model (GLM), MaxEnt, Random Forest (RF), Boosted Regression Tree (BRT), Bioclim, together with CLIMEX (CL) as a mechanistic niche model. These models were fitted using a training dataset of available global data, but with the exclusion of Australian locations. The capabilities of these techniques in projecting suitable climate, based on independent records for these species in Australia, were compared. Thus, Australia was not used to calibrate the models and therefore serves as an independent area with regard to geographic locations. To assess and compare performance, we utilized the area under the receiver operating characteristic (ROC) curve (AUC), the true skill statistic (TSS), and fractional predicted areas for all SDMs. In addition, we assessed the agreement between the outputs of the six different bioclimatic models for all eight species in Australia. The modeling method impacted the potential distribution predictions under current climate. However, the utilization of sensitivity and the fractional predicted areas showed that GLM, MaxEnt, Bioclim, and CL had the highest sensitivity for Australian climate conditions. Bioclim calculated the highest fractional predicted area of an independent area, while RF and BRT were poor. For many applications, it is difficult to decide which bioclimatic model to use. This research shows that variable results are obtained using different SDMs in an independent area. This research also shows that the SDMs produce different results for different species; for example, Bioclim may not be good for one species but works better
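
    For reference, the two headline discrimination metrics can be computed as below; the presence/absence labels and suitability scores are hypothetical, and the 0.5 threshold used for the TSS is an arbitrary choice for the sketch.

      import numpy as np
      from sklearn.metrics import confusion_matrix, roc_auc_score

      y_true  = np.array([1, 1, 0, 1, 0, 0, 1, 0])                  # presence/absence
      y_score = np.array([0.9, 0.7, 0.4, 0.8, 0.2, 0.5, 0.6, 0.3])  # suitability

      auc = roc_auc_score(y_true, y_score)
      y_pred = (y_score >= 0.5).astype(int)
      tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
      sensitivity = tp / (tp + fn)
      specificity = tn / (tn + fp)
      tss = sensitivity + specificity - 1    # TSS ranges over [-1, 1]
      print(f"AUC = {auc:.2f}, TSS = {tss:.2f}")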

  14. Classification images predict absolute efficiency.

    PubMed

    Murray, Richard F; Bennett, Patrick J; Sekuler, Allison B

    2005-02-24

    How well do classification images characterize human observers' strategies in perceptual tasks? We show mathematically that from the classification image of a noisy linear observer, it is possible to recover the observer's absolute efficiency. If we could similarly predict human observers' performance from their classification images, this would suggest that the linear model that underlies use of the classification image method is adequate over the small range of stimuli typically encountered in a classification image experiment, and that a classification image captures most important aspects of human observers' performance over this range. In a contrast discrimination task and in a shape discrimination task, we found that observers' absolute efficiencies were generally well predicted by their classification images, although consistently slightly (approximately 13%) higher than predicted. We consider whether a number of plausible nonlinearities can account for the slight underprediction, and of these we find that only a form of phase uncertainty can account for the discrepancy.
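
    The classification-image recipe for a noisy linear observer can be simulated in a few lines: average the noise fields by response category and compare the result with the observer's template. Everything below (template, noise levels, trial count) is synthetic, for illustration only.

      import numpy as np

      rng = np.random.default_rng(0)
      npix, ntrials = 64, 20000
      template = rng.normal(size=npix)
      template /= np.linalg.norm(template)

      noise = rng.normal(size=(ntrials, npix))     # external noise fields
      internal = 0.5 * rng.normal(size=ntrials)    # internal observer noise
      resp = (noise @ template + internal) > 0     # linear observer decision

      # classification image: mean noise on "yes" minus mean noise on "no" trials
      cimg = noise[resp].mean(axis=0) - noise[~resp].mean(axis=0)
      r = np.corrcoef(cimg, template)[0, 1]
      print(f"template recovery r = {r:.2f}")      # near 1 for a linear observer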

  15. The oscillation model of hydrothermal dynamics beneath Aso volcano, southwest Japan after small eruption on May 2011: A new understanding model using repeated absolute and relative gravity measurement

    NASA Astrophysics Data System (ADS)

    Sofyan, Yayan; Nishijima, Jun; Fujimitsu, Yasuhiro; Yoshikawa, Shin; Kagiyama, Tsuneomi; Ohkura, Takahiro

    2016-01-01

    At the end of 2010, seismic activity at Aso volcano increased sharply and the water level in the Nakadake crater decreased until early 2011; this was followed by a small eruption in May 2011. After the eruption and heavy rain, the volcanic activity subsided into a calm period, the crater bottom was refilled with water, and the water level in the Nakadake crater increased. Tremor reappeared in 2014 and led to an eruption in November 2014. This eruptive pattern and the water level variation in the crater appear repeatedly at the surface, and they should be related to the hydrothermal dynamics beneath Aso volcano. We initiated gravity measurements related to the hydrothermal dynamics in the subsurface of Aso volcano using a Scintrex CG-5 (549) and a LaCoste-Romberg type G-1016 relative gravimeter at 28 benchmarks in April 2011, one month before the eruption. The repeated gravity measurements continue to monitor Aso volcano, with surveys every three to six months after the eruption. We analyze the gravity variation from 2011 to 2014, between the times of the phreatic and strombolian eruptions. The measurements covered an area of more than 60 km2 on the west side of the Aso caldera. A new gravity network was also installed in May 2010 at seven benchmarks using an A10-017 absolute gravimeter, which was re-occupied in October 2010, in June 2011, and at two benchmarks in June 2014. As a result, the gravity changes distinguish the hydrothermal dynamics in the subsurface, which correlate directly with the water level fluctuation in the crater, after the first eruption and before the second discharge. The monitoring data show large gravity changes between surveys at benchmarks around the Nakadake crater and the Kusasenri area. Simple 3D inversion models of the 4-D gravity data yield the density contrast distribution beneath Aso volcano. The inversion and mass change results suggest a typical oscillation as a new understanding model. The variation of the mass shows a

  16. Modeling food spoilage in microbial risk assessment.

    PubMed

    Koutsoumanis, Konstantinos

    2009-02-01

    In this study, I describe a systematic approach for modeling food spoilage in microbial risk assessment that is based on the incorporation of kinetic spoilage modeling in exposure assessment by combining data and models for the specific spoilage organisms (SSO: fraction of the total microflora responsible for spoilage) with those for pathogens. The structure of the approach is presented through an exposure assessment application for Escherichia coli O157:H7 in ground beef. The proposed approach allows for identifying spoiled products at the time of consumption by comparing the estimated level of SSO (pseudomonads) with the spoilage level (level of SSO at which spoilage is observed). The results of the application indicate that ignoring spoilage in risk assessment could lead to significant overestimations of risk.
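
    The idea can be caricatured with a toy Monte Carlo in which servings whose SSO level has reached the spoilage level by the time of consumption are excluded from the exposure estimate. All growth rates, initial levels, and the threshold below are invented for the sketch, not values from the paper.

      import numpy as np

      rng = np.random.default_rng(1)
      n = 100_000
      days = rng.uniform(1, 10, n)                   # storage time to consumption

      sso  = rng.normal(3.0, 0.5, n) + 0.8 * days    # log10 CFU/g pseudomonads
      path = rng.normal(-1.0, 0.5, n) + 0.1 * days   # log10 CFU/g pathogen

      spoiled = sso >= 7.0                           # spoilage level for the SSO
      print(f"spoiled fraction: {spoiled.mean():.2f}")
      print(f"mean log10 dose, all servings:      {path.mean():.2f}")
      print(f"mean log10 dose, consumed servings: {path[~spoiled].mean():.2f}")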

  17. Delayed detonation models for normal and subluminous type Ia supernovae: Absolute brightness, light curves, and molecule formation

    NASA Technical Reports Server (NTRS)

    Hoflich, P.; Khokhlov, A. M.; Wheeler, J. C.

    1995-01-01

    We compute optical and infrared light curves of the pulsating class of delayed detonation models for Type Ia supernovae (SN Ia's) using an elaborate treatment of the Local Thermodynamic Equilibrium (LTE) radiation transport, equation of state and ionization balance, expansion opacity including the cooling by CO, CO(+), and SiO, and a Monte Carlo gamma-ray deposition scheme. The models have an amount of Ni-56 in the range from approximately 0.1 solar mass up to 0.7 solar mass depending on the density at which the transition from a deflagration to a detonation occurs. Models with a large nickel production give light curves comparable to those of typical Type Ia supernovae. Subluminous supernovae can be explained by models with a low nickel production. Multiband light curves are presented in comparison with the normally bright event SN 1992bc and the subluminous events SN 1991bg and SN 1992bo to establish the principle that the delayed detonation paradigm in Chandrasekhar mass models may give a common explosion mechanism accounting for both normal and subluminous SN Ia's. Secondary IR maxima are formed in the models of normal SN Ia's as a photospheric effect if the photospheric radius continues to increase well after maximum light. Secondary maxima appear later and stronger in models with moderate expansion velocities and with radioactive material closer to the surface. Model light curves for subluminous SN Ia's tend to show only one 'late' IR maximum. In some delayed detonation models shell-like envelopes form, which consist of unburned carbon and oxygen. The formation of molecules in these envelopes is addressed. If the model retains a C/O envelope and is subluminous, strong vibration bands of CO may appear, typically several weeks past maximum light. CO should be very weak or absent in normal SN Ia's.

  18. Managing exploration risk using basin modeling

    SciTech Connect

    Wendebourg, J. )

    1996-01-01

    Economic risk analysis requires a well's dry-hole probability and a probability distribution of the type and volume of recoverable hydrocarbons. Today's world-wide exploration needs methods that can accommodate a wide variety of data quality and quantity. Monte Carlo methods are commonly used to compute volume distributions and dry-hole probability by multiplying probabilities of geologic risk factors such as source rock richness, migration loss, seal effectiveness, etc., assuming that these are independent parameters. This assumption, however, is not appropriate because they represent interdependent physical processes that should be treated by an integrated system. Basin modeling is a tool for assessing exploration risk by simulating the interdependent processes that lead to hydrocarbon accumulations. Advanced 2-D and 3-D basin modeling can treat occurrence, type, and volumes of hydrocarbons. These models need many parameters that individually may have great uncertainties, but a calibration against available data may reduce their uncertainties significantly and therefore may quantify risk. Uncertainty of thermal and source rock parameters is evaluated by applying simple and fast 1-D tools to individual wells. Calibration of pressure and temperature data as well as occurrence and type of known hydrocarbon accumulations with 2-D tools evaluates uncertainty between wells along geologic cross-sections. Individual prospect risk is finally determined by the uncertainty of local parameters within the calibrated model, for example seal effectiveness or fault permeability.

  19. Managing exploration risk using basin modeling

    SciTech Connect

    Wendebourg, J.

    1996-12-31

    Economic risk analysis requires a well's dry-hole probability and a probability distribution of the type and volume of recoverable hydrocarbons. Today's world-wide exploration needs methods that can accommodate a wide variety of data quality and quantity. Monte Carlo methods are commonly used to compute volume distributions and dry-hole probability by multiplying probabilities of geologic risk factors such as source rock richness, migration loss, seal effectiveness, etc., assuming that these are independent parameters. This assumption, however, is not appropriate because they represent interdependent physical processes that should be treated by an integrated system. Basin modeling is a tool for assessing exploration risk by simulating the interdependent processes that lead to hydrocarbon accumulations. Advanced 2-D and 3-D basin modeling can treat occurrence, type, and volumes of hydrocarbons. These models need many parameters that individually may have great uncertainties, but a calibration against available data may reduce their uncertainties significantly and therefore may quantify risk. Uncertainty of thermal and source rock parameters is evaluated by applying simple and fast 1-D tools to individual wells. Calibration of pressure and temperature data as well as occurrence and type of known hydrocarbon accumulations with 2-D tools evaluates uncertainty between wells along geologic cross-sections. Individual prospect risk is finally determined by the uncertainty of local parameters within the calibrated model, for example seal effectiveness or fault permeability.
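
    The independence point can be illustrated numerically: multiplying factor probabilities assumes no shared geologic driver, whereas sampling the same factors with a common latent "basin quality" variable yields a different probability of success. All probabilities and the correlation structure below are invented.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(4)
      p = {"source": 0.7, "migration": 0.6, "reservoir": 0.8, "seal": 0.5}

      # independent assumption: simple product of factor probabilities
      pos_independent = float(np.prod(list(p.values())))

      # correlated alternative: one shared latent "basin quality" driver
      n = 100_000
      basin = rng.normal(size=n)
      success = np.ones(n, dtype=bool)
      for name, pk in p.items():
          factor = (rng.normal(size=n) + basin) / np.sqrt(2)   # N(0,1) marginal
          success &= factor < stats.norm.ppf(pk)
      print(f"POS independent: {pos_independent:.3f}, correlated: {success.mean():.3f}")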

  20. Uncertainty in surface water flood risk modelling

    NASA Astrophysics Data System (ADS)

    Butler, J. B.; Martin, D. N.; Roberts, E.; Domuah, R.

    2009-04-01

    Two thirds of the flooding that occurred in the UK during summer 2007 was as a result of surface water (otherwise known as ‘pluvial') rather than river or coastal flooding. In response, the Environment Agency and Interim Pitt Reviews have highlighted the need for surface water risk mapping and warning tools to identify, and prepare for, flooding induced by heavy rainfall events. This need is compounded by the likely increase in rainfall intensities due to climate change. The Association of British Insurers has called for the Environment Agency to commission nationwide flood risk maps showing the relative risk of flooding from all sources. At the wider European scale, the recently-published EC Directive on the assessment and management of flood risks will require Member States to evaluate, map and model flood risk from a variety of sources. As such, there is now a clear and immediate requirement for the development of techniques for assessing and managing surface water flood risk across large areas. This paper describes an approach for integrating rainfall, drainage network and high-resolution topographic data using Flowroute™, a high-resolution flood mapping and modelling platform, to produce deterministic surface water flood risk maps. Information is provided from UK case studies to enable assessment and validation of modelled results using historical flood information and insurance claims data. Flowroute was co-developed with flood scientists at Cambridge University specifically to simulate river dynamics and floodplain inundation in complex, congested urban areas in a highly computationally efficient manner. It utilises high-resolution topographic information to route flows around individual buildings so as to enable the prediction of flood depths, extents, durations and velocities. As such, the model forms an ideal platform for the development of surface water flood risk modelling and mapping capabilities. The 2-dimensional component of Flowroute employs

  1. Suicide risk assessment and suicide risk formulation: essential components of the therapeutic risk management model.

    PubMed

    Silverman, Morton M

    2014-09-01

    Suicide and other suicidal behaviors are often associated with psychiatric disorders and dysfunctions. Therefore, psychiatrists have significant opportunities to identify at-risk individuals and offer treatment to reduce that risk. Although a suicide risk assessment is a core competency requirement, many clinical psychiatrists lack the requisite training and skills to appropriately assess for suicide risk. Moreover, the standard of care requires psychiatrists to foresee the possibility that a patient might engage in suicidal behavior, hence to conduct a suicide risk formulation sufficient to guide triage and treatment planning. Based on data collected via a suicide risk assessment, a suicide risk formulation is a process whereby the psychiatrist forms a judgment about a patient's foreseeable risk of suicidal behavior in order to inform triage decisions, safety and treatment planning, and interventions to reduce risk. This paper addresses the components of this process in the context of the model for therapeutic risk management of the suicidal patient developed at the Veterans Integrated Service Network (VISN) 19 Mental Illness Research, Education and Clinical Center by Wortzel et al.

  2. Role of Modeling When Designing for Absolute Energy Use Intensity Requirements in a Design-Build Framework: Preprint

    SciTech Connect

    Hirsch, A.; Pless, S.; Guglielmetti, R.; Torcellini, P. A.; Okada, D.; Antia, P.

    2011-03-01

    The Research Support Facility was designed to use half the energy of an equivalent minimally code-compliant building, and to produce as much renewable energy as it consumes on an annual basis. These energy goals and their substantiation through simulation were explicitly included in the project's fixed firm price design-build contract. The energy model had to be continuously updated during the design process and to match the final building as-built to the greatest degree possible. Computer modeling played a key role throughout the design process and in verifying that the contractual energy goals would be met within the specified budget. The main tool was a whole building energy simulation program. Other models were used to provide more detail or to complement the whole building simulation tool. Results from these specialized models were fed back into the main whole building simulation tool to provide the most accurate possible inputs for annual simulations. This paper will detail the models used in the design process and how they informed important program and design decisions on the path from preliminary design to the completed building.

  3. Challenges of Modeling Flood Risk at Large Scales

    NASA Astrophysics Data System (ADS)

    Guin, J.; Simic, M.; Rowe, J.

    2009-04-01

    uncertainty can be attributed to various sources among which are imperfections in the hazard modeling, inherent errors in the DTM, lack of accurate information on the properties that are being analyzed, imperfections in the vulnerability relationships, inability of the model to account for local mitigation measures that are usually undertaken when a real event is unfolding and lack of details in the claims data that are used for model calibration. Nevertheless, the model once calibrated provides a very robust framework for analyzing relative and absolute risk. The paper concludes with key economic statistics of flood risk for Great Britain as a whole including certain large loss-causing scenarios affecting the greater London region. The model estimates a total financial loss of 5.6 billion GBP to all properties at a 1% annual aggregate exceedance probability level.
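
    A statistic such as the loss at a 1% annual aggregate exceedance probability is read off a simulated year-loss table as a quantile. The sketch below uses invented lognormal annual losses purely to show the mechanics.

      import numpy as np

      rng = np.random.default_rng(6)
      annual_losses = rng.lognormal(mean=19.0, sigma=1.2, size=50_000)  # GBP per year

      def aep_loss(losses, annual_prob):
          """Loss exceeded with the given annual probability."""
          return np.quantile(losses, 1.0 - annual_prob)

      print(f"1% AEP loss: GBP {aep_loss(annual_losses, 0.01):,.0f}")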

  4. Risk management model of winter navigation operations.

    PubMed

    Valdez Banda, Osiris A; Goerlandt, Floris; Kuzmin, Vladimir; Kujala, Pentti; Montewka, Jakub

    2016-07-15

    The wintertime maritime traffic operations in the Gulf of Finland are managed through the Finnish-Swedish Winter Navigation System. This establishes the requirements and limitations for the vessels navigating when ice covers this area. During winter navigation in the Gulf of Finland, the largest risk stems from accidental ship collisions, which may also trigger oil spills. In this article, a model for managing the risk of winter navigation operations is presented. The model analyses the probability of oil spills derived from collisions involving oil tanker vessels and other vessel types. The model structure is based on the steps provided in the Formal Safety Assessment (FSA) by the International Maritime Organization (IMO) and adapted into a Bayesian Network model. The results indicate that independent ship navigation and convoys are the operations with the highest probability of oil spills. Minor spills are most probable, while major oil spills were found to be very unlikely but possible.

  5. Risk management model of winter navigation operations.

    PubMed

    Valdez Banda, Osiris A; Goerlandt, Floris; Kuzmin, Vladimir; Kujala, Pentti; Montewka, Jakub

    2016-07-15

    The wintertime maritime traffic operations in the Gulf of Finland are managed through the Finnish-Swedish Winter Navigation System. This establishes the requirements and limitations for the vessels navigating when ice covers this area. During winter navigation in the Gulf of Finland, the largest risk stems from accidental ship collisions, which may also trigger oil spills. In this article, a model for managing the risk of winter navigation operations is presented. The model analyses the probability of oil spills derived from collisions involving oil tanker vessels and other vessel types. The model structure is based on the steps provided in the Formal Safety Assessment (FSA) by the International Maritime Organization (IMO) and adapted into a Bayesian Network model. The results indicate that independent ship navigation and convoys are the operations with the highest probability of oil spills. Minor spills are most probable, while major oil spills were found to be very unlikely but possible. PMID:27207023
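
    The structure of such a model can be caricatured as a tiny discrete Bayesian-network calculation that marginalizes the spill probability over operation type and collision occurrence. Every probability below is invented; the real model conditions on many more variables (ice conditions, vessel class, traffic volumes).

      # toy two-layer network: operation type -> collision -> spill
      ops = {"independent": 0.5, "convoy": 0.3, "icebreaker_assist": 0.2}
      p_collision = {"independent": 0.012, "convoy": 0.010, "icebreaker_assist": 0.004}
      p_spill_given_collision = 0.08   # tanker involved and hull breached (invented)

      p_spill = sum(p_op * p_collision[op] * p_spill_given_collision
                    for op, p_op in ops.items())
      print(f"annual P(oil spill) ~ {p_spill:.5f}")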

  6. Clinical Model for Suicide Risk Assessment.

    ERIC Educational Resources Information Center

    Kral, Michael J.; Sakinofsky, Isaac

    1994-01-01

    Presents suicide risk assessment in a two-tiered model comprising background/contextual factors and subjectivity. The subjectivity portion is formulated around Shneidman's concepts of perturbation and lethality. Discusses decision of hospital admission versus ambulatory care. Suggests that theoretically informed approach should serve both…

  7. Risk analysis: divergent models and convergent interpretations

    NASA Technical Reports Server (NTRS)

    Carnes, B. A.; Gavrilova, N.

    2001-01-01

    Material presented at a NASA-sponsored workshop on risk models for exposure conditions relevant to prolonged space flight is described in this paper. Analyses used mortality data from experiments conducted at Argonne National Laboratory on the long-term effects of external whole-body irradiation of B6CF1 mice by 60Co gamma rays and fission neutrons delivered as a single exposure or protracted over either 24 or 60 once-weekly exposures. The maximum dose considered was restricted to 1 Gy for neutrons and 10 Gy for gamma rays. Proportional hazard models were used to investigate the shape of the dose response at these lower doses for deaths caused by solid-tissue tumors and tumors of either connective or epithelial tissue origin. For protracted exposures, a significant mortality effect was detected at a neutron dose of 14 cGy and a gamma-ray dose of 3 Gy. For single exposures, radiation-induced mortality for neutrons also occurred within the range of 10-20 cGy, but dropped to 86 cGy for gamma rays. Plots of risk relative to control estimated for each observed dose gave a visual impression of nonlinearity for both neutrons and gamma rays. At least for solid-tissue tumors, male and female mortality was nearly identical for gamma-ray exposures, but mortality risks for females were higher than for males for neutron exposures. As expected, protracting the gamma-ray dose reduced mortality risks. Although curvature consistent with that observed visually could be detected by a model parameterized to detect curvature, a relative risk term containing only a simple term for total dose was usually sufficient to describe the dose response. Although detectable mortality for the three pathology end points considered typically occurred at the same level of dose, the highest risks were almost always associated with deaths caused by tumors of epithelial tissue origin.
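
    A proportional-hazards dose-response fit of this general kind can be sketched with the lifelines package on synthetic data; the dose effect, baseline hazard, and censoring below are invented, not the Argonne data.

      import numpy as np
      import pandas as pd
      from lifelines import CoxPHFitter

      rng = np.random.default_rng(5)
      n = 500
      dose = rng.uniform(0, 1, n)                        # Gy, invented exposures
      # exponential survival times with hazard increasing in dose
      t = rng.exponential(1.0 / (0.02 * np.exp(0.8 * dose)))
      df = pd.DataFrame({"dose": dose,
                         "time": np.minimum(t, 100.0),   # administrative censoring
                         "event": (t < 100.0).astype(int)})

      cph = CoxPHFitter().fit(df, duration_col="time", event_col="event")
      print(cph.summary[["coef", "exp(coef)", "p"]])     # coef on dose should be ~0.8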

  8. Landslide risk models for decision making.

    PubMed

    Bonachea, Jaime; Remondo, Juan; de Terán, José Ramón Díaz; González-Díez, Alberto; Cendrero, Antonio

    2009-11-01

    This contribution presents a quantitative procedure for landslide risk analysis and zoning considering hazard, exposure (or value of elements at risk), and vulnerability. The method provides the means to obtain landslide risk models (expressing expected damage due to landslides on material elements and economic activities in monetary terms, according to different scenarios and periods) useful to identify areas where mitigation efforts will be most cost effective. It allows identifying priority areas for the implementation of actions to reduce vulnerability (elements) or hazard (processes). The procedure proposed can also be used as a preventive tool, through its application to strategic environmental impact analysis (SEIA) of land-use plans. The underlying hypothesis is that reliable predictions about hazard and risk can be made using models based on a detailed analysis of past landslide occurrences in connection with conditioning factors and data on past damage. The results show that the approach proposed and the hypothesis formulated are essentially correct, providing estimates of the order of magnitude of expected losses for a given time period. Uncertainties, strengths, and shortcomings of the procedure and results obtained are discussed and potential lines of research to improve the models are indicated. Finally, comments and suggestions are provided to generalize this type of analysis.
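
    The underlying risk arithmetic, expected damage as hazard probability times exposure times vulnerability summed over scenarios, reduces to a few lines; the scenario table below is invented for illustration.

      # (annual probability, value of elements at risk in EUR, vulnerability 0-1)
      scenarios = [
          (0.020,  5_000_000, 0.10),   # frequent shallow slides
          (0.005, 20_000_000, 0.35),   # larger, rarer events
          (0.001, 60_000_000, 0.60),   # extreme scenario
      ]
      expected_annual_loss = sum(p * value * vul for p, value, vul in scenarios)
      print(f"expected annual loss: EUR {expected_annual_loss:,.0f}")   # EUR 81,000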

  9. Landslide risk mapping and modeling in China

    NASA Astrophysics Data System (ADS)

    Li, W.; Hong, Y.

    2015-12-01

    Under circumstances of global climate change, tectonic stress and human effects, landslides are among the most frequent and severely widespread natural hazards on Earth, as demonstrated in the World Atlas of Natural Hazards (McGuire et al., 2004). Every year, landslide activity causes serious economic loss as well as casualties (Róbert et al., 2005). How landslides can be monitored and predicted is an urgent research topic for the international landslide research community. In particular, there is a lack of high-quality and updated landslide risk maps and guidelines that can be employed to better mitigate and prevent landslide disasters in many emerging regions, including China (Hong, 2007). Since the 1950s, landslide events have been recorded in statistical yearbooks, newspapers, and monographs in China. As disasters have become of increasing concern to the government and the public, information about landslide events is becoming available from online news reports (Liu et al., 2012). This study presents multi-scale landslide risk mapping and modeling in China. At the national scale, based on historical data and practical experience, we carry out landslide susceptibility and risk mapping by adopting a statistical approach and pattern recognition methods to construct empirical models. Over the identified landslide hot-spot areas, we further evaluate the slope stability of each individual site (Sidle and Hirotaka, 2006), with the ultimate goal of setting up a space-time multi-scale coupling system of landslide risk mapping and modeling for landslide hazard monitoring and early warning.

  10. Risk management model in road transport systems

    NASA Astrophysics Data System (ADS)

    Sakhapov, R. L.; Nikolaeva, R. V.; Gatiyatullin, M. H.; Makhmutov, M. M.

    2016-08-01

    The article presents the results of a study of road safety indicators that influence the development and operation of the transport system. Road safety is considered as a continuous process of risk management. The authors constructed a model that relates social risk to a major road safety indicator: the level of motorization. The model gives a fairly accurate assessment of the level of social risk for any given level of motorization. The authors also calculated the dependence of the level of socio-economic costs on accidents and the people injured in them. The applicability of the concept of socio-economic damage follows from the presence of a linear relationship between the natural and economic indicators of damage from accidents. The optimization of social risk is reduced to finding the extremum of an objective function that characterizes the economic effect of implementing measures to improve safety. The calculations make it possible to maximize the net present value as a function of the costs of improving road safety, taking into account the socio-economic damage caused by accidents. The proposed econometric models make it possible to quantify the efficiency of the transportation system and to simulate changes in road safety indicators.

  11. Quantitative risk modelling for new pharmaceutical compounds.

    PubMed

    Tang, Zhengru; Taylor, Mark J; Lisboa, Paulo; Dyas, Mark

    2005-11-15

    The process of discovering and developing new drugs is long, costly and risk-laden. Faced with a wealth of newly discovered compounds, industrial scientists need to target resources carefully to discern the key attributes of a drug candidate and to make informed decisions. Here, we describe a quantitative approach to modelling the risk associated with drug development as a tool for scenario analysis concerning the probability of success of a compound as a potential pharmaceutical agent. We bring together the three strands of manufacture, clinical effectiveness and financial returns. This approach involves the application of a Bayesian Network. A simulation model is demonstrated with an implementation in MS Excel using the modelling engine Crystal Ball. PMID:16257374

  12. Malignancy Risk Models for Oral Lesions

    PubMed Central

    Zarate, Ana M.; Brezzo, María M.; Secchi, Dante G.; Barra, José L.

    2013-01-01

    Objectives: The aim of this work was to assess risk habits, clinical and cellular phenotypes and TP53 DNA changes in oral mucosa samples from patients with Oral Potentially Malignant Disorders (OPMD), in order to create models that enable genotypic and phenotypic patterns to be obtained that determine the risk of lesions becoming malignant. Study Design: Clinical phenotypes, family history of cancer and risk habits were collected in clinical histories. TP53 gene mutation and morphometric-morphological features were studied, and multivariate models were applied. Three groups were established: a) oral cancer (OC) group (n=10), b) OPMD group (n=10), and c) control group (n=8). Results: An average of 50% of patients with malignancy were found to have smoking and drinking habits. A high percentage of TP53 mutations were observed in OC (30%) and OPMD (average 20%) lesions (p=0.000). The majority of these mutations were GC → TA transversion mutations (60%). However, patients with OC presented mutations in all the exons and introns studied. The highest diagnostic accuracy (p=0.0001) was observed when incorporating the alcohol and tobacco habit variables with TP53 mutations. Conclusions: Our results prove to be statistically reliable, with parameter estimates that are nearly unbiased even for small sample sizes. Models 2 and 3 were the most accurate for assessing the risk of an OPMD becoming cancerous. However, in a public health context, model 3 is the most recommended because the characteristics considered are easier and less costly to evaluate. Key words: TP53, oral potentially malignant disorders, risk factors, genotype, phenotype. PMID:23722122

  13. Absolute properties of the eclipsing binary system AQ Serpentis: A stringent test of convective core overshooting in stellar evolution models

    SciTech Connect

    Torres, Guillermo; Vaz, Luiz Paulo R.; Sandberg Lacy, Claud H.; Claret, Antonio

    2014-02-01

    We report differential photometric observations and radial-velocity measurements of the detached, 1.69 day period, double-lined eclipsing binary AQ Ser. Accurate masses and radii for the components are determined to better than 1.8% and 1.1%, respectively, and are M1 = 1.417 ± 0.021 M☉, M2 = 1.346 ± 0.024 M☉, R1 = 2.451 ± 0.027 R☉, and R2 = 2.281 ± 0.014 R☉. The temperatures are 6340 ± 100 K (spectral type F6) and 6430 ± 100 K (F5), respectively. Both stars are considerably evolved, such that predictions from stellar evolution theory are particularly sensitive to the degree of extra mixing above the convective core (overshoot). The component masses are different enough to exclude a location in the H-R diagram past the point of central hydrogen exhaustion, which implies the need for extra mixing. Moreover, we find that current main-sequence models are unable to match the observed properties at a single age even when allowing the unknown metallicity, mixing length parameter, and convective overshooting parameter to vary freely and independently for the two components. The age of the more massive star appears systematically younger. AQ Ser and other similarly evolved eclipsing binaries showing the same discrepancy highlight an outstanding and largely overlooked problem with the description of overshooting in current stellar theory.

  14. MODELING MULTI-WAVELENGTH STELLAR ASTROMETRY. III. DETERMINATION OF THE ABSOLUTE MASSES OF EXOPLANETS AND THEIR HOST STARS

    SciTech Connect

    Coughlin, J. L.; Lopez-Morales, Mercedes

    2012-05-10

    Astrometric measurements of stellar systems are becoming significantly more precise and common, with many ground- and space-based instruments and missions approaching 1 μas precision. We examine the multi-wavelength astrometric orbits of exoplanetary systems via both analytical formulae and numerical modeling. Exoplanets have a combination of reflected and thermally emitted light that causes the photocenter of the system to shift increasingly farther away from the host star with increasing wavelength. We find that, if observed at long enough wavelengths, the planet can dominate the astrometric motion of the system, and thus it is possible to directly measure the orbits of both the planet and star, and thus directly determine the physical masses of the star and planet, using multi-wavelength astrometry. In general, this technique works best for, though is certainly not limited to, systems that have large, high-mass stars and large, low-mass planets, which is a unique parameter space not covered by other exoplanet characterization techniques. Exoplanets that happen to transit their host star present unique cases where the physical radii of the planet and star can be directly determined via astrometry alone. Planetary albedos and day-night contrast ratios may also be probed via this technique due to the unique signature they impart on the observed astrometric orbits. We develop a tool to examine the prospects for near-term detection of this effect, and give examples of some exoplanets that appear to be good targets for detection in the K to N infrared observing bands, if the required precision can be achieved.

  15. A statistical model for collective risk assessment

    NASA Astrophysics Data System (ADS)

    Keef, Caroline; Tawn, Jonathan A.; Lamb, Rob

    2010-05-01

    In this paper we present the theoretical basis of a statistical method that can be used as the basis of a collective risk assessment for country (or continent)-wide events. Our method is based on the conditional dependence model of Heffernan and Tawn (2004), which has been extended to handle missing data and temporal dependence by Keef et al. (2009). This model describes the full joint distribution function of a set of variables and incorporates separate models for the marginal and dependence characteristics of the set using a copula approach. The advantages of this model include: the flexibility in terms of types of dependence modelled; the ability to handle situations where the dependence in the tails of the data is not the same as that in the main body of the data; the ability to handle both temporal and spatial dependence; and the ability to model a large number of variables. In this paper we present further extensions to the statistical model which allow us to simulate country-wide extreme events with the correct spatial and temporal structure and show an application to river flood events. Heffernan, J. E. and Tawn, J. A. (2004) A conditional approach for multivariate extreme values (with discussion). J. R. Statist. Soc. B, 66, 497-546. Keef, C., Tawn, J., and Svensson, C. (2009) Spatial risk assessment for extreme river flows. Applied Statistics, 58(5), 601-618.
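
    For a flavor of the two-stage marginal-plus-dependence construction, the sketch below uses a much simpler stand-in, a Gaussian copula with GEV margins, rather than the Heffernan and Tawn conditional model itself; the correlation matrix and GEV parameters are invented.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(2)
      corr = np.array([[1.0, 0.7, 0.4],
                       [0.7, 1.0, 0.6],
                       [0.4, 0.6, 1.0]])    # dependence between three river gauges
      L = np.linalg.cholesky(corr)

      z = rng.standard_normal((10_000, 3)) @ L.T
      u = stats.norm.cdf(z)                                        # uniform margins
      flows = stats.genextreme.ppf(u, c=-0.1, loc=100, scale=30)   # GEV margins

      # collective risk: P(two or more sites exceed their 1-in-100 level together)
      q100 = stats.genextreme.ppf(0.99, c=-0.1, loc=100, scale=30)
      p_joint = ((flows > q100).sum(axis=1) >= 2).mean()
      print(f"P(>= 2 sites exceed the 100-year level in one event) ~ {p_joint:.4f}")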

  16. Simulation Assisted Risk Assessment: Blast Overpressure Modeling

    NASA Technical Reports Server (NTRS)

    Lawrence, Scott L.; Gee, Ken; Mathias, Donovan; Olsen, Michael

    2006-01-01

    A probabilistic risk assessment (PRA) approach has been developed and applied to the risk analysis of capsule abort during ascent. The PRA is used to assist in the identification of modeling and simulation applications that can significantly impact the understanding of crew risk during this potentially dangerous maneuver. The PRA approach is also being used to identify the appropriate level of fidelity for the modeling of those critical failure modes. The Apollo launch escape system (LES) was chosen as a test problem for application of this approach. Failure modes that have been modeled and/or simulated to date include explosive overpressure-based failure, explosive fragment-based failure, land landing failures (range limits exceeded either near launch or Mode III trajectories ending on the African continent), capsule-booster re-contact during separation, and failure due to plume-induced instability. These failure modes have been investigated using analysis tools in a variety of technical disciplines at various levels of fidelity. The current paper focuses on the development and application of a blast overpressure model for the prediction of structural failure due to overpressure, including the application of high-fidelity analysis to predict near-field and headwinds effects.

  17. Risk modelling for vaccination: a risk assessment perspective.

    PubMed

    Wooldridge, M

    2007-01-01

    Any risk assessment involves a number of steps. First, the risk manager, in close liaison with the risk assessor, should identify the question of interest. Then, the hazards associated with each risk question should be identified. Only then can the risks themselves be assessed. Several questions may reasonably be asked about the risk associated with avian influenza vaccines and their use. Some apply to any vaccine, while others are specific to avian influenza. Risks may occur during manufacture and during use. Some concern the vaccines themselves, while others address the effect of failure on disease control. The hazards associated with each risk question are then identified. These may be technical errors in design, development or production, such as contamination or failure to inactivate appropriately. They may relate to the biological properties of the pathogens themselves displayed during manufacture or use, for example, reversion to virulence, shedding or not being the right strain for the subsequent challenge. Following a consideration of risks and hazards, the information needed and an outline of the steps necessary to assess the risk are summarized for an illustrative risk question, using as an example the risks associated with the use of vaccines in the field. A brief consideration of the differences between qualitative and quantitative risk assessments is also included, and the potential effects of uncertainty and variability on the results are discussed.

  18. Structural equation modeling in environmental risk assessment.

    PubMed

    Buncher, C R; Succop, P A; Dietrich, K N

    1991-01-01

    Environmental epidemiology requires effective models that take individual observations of environmental factors and connect them into meaningful patterns. Single-factor relationships have given way to multivariable analyses; simple additive models have been augmented by multiplicative (logistic) models. Each of these steps has produced greater enlightenment and understanding. Models that allow for factors causing outputs that can affect later outputs with putative causation working at several different time points (e.g., linkage) are not commonly used in the environmental literature. Structural equation models are a class of covariance structure models that have been used extensively in economics/business and social science but are still little used in the realm of biostatistics. Path analysis in genetic studies is one simplified form of this class of models. We have been using these models in a study of the health and development of infants who have been exposed to lead in utero and in the postnatal home environment. These models require as input the directionality of the relationship and then produce fitted models for multiple inputs causing each factor and the opportunity to have outputs serve as input variables into the next phase of the simultaneously fitted model. Some examples of these models from our research are presented to increase familiarity with this class of models. Use of these models can provide insight into the effect of changing an environmental factor when assessing risk. The usual cautions concerning believing a model, believing causation has been proven, and the assumptions that are required for each model are operative. PMID:2050063

  19. Human Plague Risk: Spatial-Temporal Models

    NASA Technical Reports Server (NTRS)

    Pinzon, Jorge E.

    2010-01-01

    This chapter reviews the use of spatial-temporal models in identifying potential risks of plague outbreaks into the human population. Using Earth observations by satellite remote sensing, there has been a systematic analysis and mapping of the close coupling between the vectors of the disease and climate variability. The overall result is that the incidence of plague is correlated with the positive phase of the El Niño/Southern Oscillation (ENSO).

  20. Assessing calibration of multinomial risk prediction models.

    PubMed

    Van Hoorde, Kirsten; Vergouwe, Yvonne; Timmerman, Dirk; Van Huffel, Sabine; Steyerberg, Ewout W; Van Calster, Ben

    2014-07-10

    Calibration, that is, whether observed outcomes agree with predicted risks, is important when evaluating risk prediction models. For dichotomous outcomes, several tools exist to assess different aspects of model calibration, such as calibration-in-the-large, logistic recalibration, and (non-)parametric calibration plots. We aim to extend these tools to prediction models for polytomous outcomes. We focus on models developed using multinomial logistic regression (MLR): outcome Y with k categories is predicted using k - 1 equations comparing each category i (i = 2, … ,k) with reference category 1 using a set of predictors, resulting in k - 1 linear predictors. We propose a multinomial logistic recalibration framework that involves an MLR fit where Y is predicted using the k - 1 linear predictors from the prediction model. A non-parametric alternative may use vector splines for the effects of the linear predictors. The parametric and non-parametric frameworks can be used to generate multinomial calibration plots. Further, the parametric framework can be used for the estimation and statistical testing of calibration intercepts and slopes. Two illustrative case studies are presented, one on the diagnosis of malignancy of ovarian tumors and one on residual mass diagnosis in testicular cancer patients treated with cisplatin-based chemotherapy. The risk prediction models were developed on data from 2037 and 544 patients and externally validated on 1107 and 550 patients, respectively. We conclude that calibration tools can be extended to polytomous outcomes. The polytomous calibration plots are particularly informative through the visual summary of the calibration performance.
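
    The parametric recalibration step can be sketched as follows: in the validation data, refit a multinomial logistic model using only the k - 1 linear predictors of the original model as covariates. The data below are synthetic, and note that scikit-learn's softmax parameterization differs from the reference-category form described above, so reference-category slopes are recovered as row differences.

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(3)
      n, k = 2000, 3
      lp = rng.normal(size=(n, k - 1))   # the k-1 linear predictors, category i vs 1
      # simulate outcomes whose true effects are weaker than the model claims
      logits = np.column_stack([np.zeros(n), 0.8 * lp[:, 0], 0.8 * lp[:, 1]])
      probs = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)
      y = np.array([rng.choice(k, p=pi) for pi in probs])

      recal = LogisticRegression(max_iter=1000).fit(lp, y)   # multinomial by default
      slopes = recal.coef_[1:] - recal.coef_[0]              # reference-category form
      print("calibration slopes (1.0 = well calibrated):")
      print(slopes.round(2))   # ~0.8 here: the original predictions are too extreme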

  1. Animal Models of Ischemic Stroke. Part One: Modeling Risk Factors

    PubMed Central

    Bacigaluppi, Marco; Comi, Giancarlo; Hermann, Dirk M.

    2010-01-01

    Ischemic stroke is one of the leading causes of long-term disability and death in developed and developing countries. As an emerging disease burden, stroke-related mortality and morbidity are expected to rise in the coming decades. This is due both to the poor identification of risk factors and the persistence of unhealthy habits, and to the aging of the population. To counteract the estimated increase in stroke incidence, it is of primary importance to identify risk factors and study their effects, to promote primary and secondary prevention, and to extend the therapeutic repertoire that is currently limited to the very first hours after stroke. While epidemiologic studies in the human population are essential to identify emerging risk factors, adequate animal models represent a fundamental tool to dissect stroke risk factors down to their molecular mechanisms and to find efficacious therapeutic strategies for this complex multifactorial disorder. The present review is organized into two parts: the first part deals with the animal models that have been developed to study stroke and its related risk factors, and the second part analyzes the specific stroke models. These models represent an indispensable tool to investigate the mechanisms of cerebral injury and to develop novel therapies. PMID:20802809

  2. ABSOLUTE POLARIMETRY AT RHIC.

    SciTech Connect

    OKADA; BRAVAR, A.; BUNCE, G.; GILL, R.; HUANG, H.; MAKDISI, Y.; NASS, A.; WOOD, J.; ZELENSKI, Z.; ET AL.

    2007-09-10

    Precise and absolute beam polarization measurements are critical for the RHIC spin physics program. Because all experimental spin-dependent results are normalized by beam polarization, the normalization uncertainty contributes directly to final physics uncertainties. We aimed to perform the beam polarization measurement to an accuracy of ΔP_beam/P_beam < 5%. The absolute polarimeter consists of the Polarized Atomic Hydrogen Gas Jet Target and left-right pairs of silicon strip detectors and was installed in the RHIC ring in 2004. This system features proton-proton elastic scattering in the Coulomb-nuclear interference (CNI) region. Precise measurements of the analyzing power A_N of this process have allowed us to achieve ΔP_beam/P_beam = 4.2% in 2005 for the first long spin-physics run. In this report, we describe the entire setup and performance of the system. The procedure of beam polarization measurement and the analysis results from 2004-2005 are described. Physics topics of A_N in the CNI region (four-momentum transfer squared 0.001 < -t < 0.032 (GeV/c)^2) are also discussed. We point out the current issues and the expected optimum accuracy in 2006 and the future.

  3. Absolute magnitudes of trans-neptunian objects

    NASA Astrophysics Data System (ADS)

    Duffard, R.; Alvarez-candal, A.; Pinilla-Alonso, N.; Ortiz, J. L.; Morales, N.; Santos-Sanz, P.; Thirouin, A.

    2015-10-01

    Accurate measurements of the diameters of trans-Neptunian objects are extremely complicated to obtain. Radiometric techniques applied to thermal measurements can provide good results, but precise absolute magnitudes are needed to constrain diameters and albedos. Our objective is to measure accurate absolute magnitudes for a sample of trans-Neptunian objects, many of which have been observed, and modelled, by the "TNOs are cool" team, one of the Herschel Space Observatory key projects, granted ~400 hours of observing time. We observed 56 objects in the V and R filters, where possible. These data, along with data available in the literature, were used to obtain phase curves and to measure absolute magnitudes by assuming a linear trend of the phase curves and considering magnitude variability due to the rotational light curve. In total we obtained 234 new magnitudes for the 56 objects, 6 of them with no previously reported measurements. Including the data from the literature we report a total of 109 absolute magnitudes.
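
    The linear phase-curve reduction described amounts to a short least-squares fit: reduce each apparent magnitude with the heliocentric (r) and geocentric (Delta) distances, then regress on phase angle. The observation table below is invented for illustration.

      import numpy as np

      # columns: apparent V mag, r (au), Delta (au), phase angle alpha (deg)
      obs = np.array([
          [23.10, 43.2, 42.5, 0.4],
          [23.18, 43.2, 42.9, 0.9],
          [23.25, 43.3, 43.3, 1.2],
          [23.31, 43.3, 43.6, 1.4],
      ])
      m, r, delta, alpha = obs.T
      m_red = m - 5.0 * np.log10(r * delta)    # reduced magnitude
      beta, H = np.polyfit(alpha, m_red, 1)    # slope (mag/deg) and intercept
      print(f"H_V = {H:.2f}, phase slope = {beta:.3f} mag/deg")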

  4. NGNP Risk Management Database: A Model for Managing Risk

    SciTech Connect

    John Collins

    2009-09-01

    To facilitate the implementation of the Risk Management Plan, the Next Generation Nuclear Plant (NGNP) Project has developed and employed an analytical software tool called the NGNP Risk Management System (RMS). A relational database developed in Microsoft® Access, the RMS provides conventional database utility including data maintenance, archiving, configuration control, and query ability. Additionally, the tool’s design provides a number of unique capabilities specifically designed to facilitate the development and execution of activities outlined in the Risk Management Plan. Specifically, the RMS provides the capability to establish the risk baseline, document and analyze the risk reduction plan, track the current risk reduction status, organize risks by reference configuration system, subsystem, and component (SSC) and Area, and increase the level of NGNP decision making.
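
    As a rough sketch of the kind of relational structure this description implies (a hypothetical schema, not the actual RMS design), using SQLite for portability:

      import sqlite3

      con = sqlite3.connect(":memory:")
      con.executescript("""
      CREATE TABLE risk (
          id INTEGER PRIMARY KEY,
          title TEXT NOT NULL,
          ssc TEXT,               -- system / subsystem / component
          area TEXT,
          baseline_level TEXT,    -- the risk baseline
          current_level TEXT      -- current risk reduction status
      );
      CREATE TABLE mitigation (
          id INTEGER PRIMARY KEY,
          risk_id INTEGER REFERENCES risk(id),
          action TEXT,
          status TEXT
      );
      """)
      con.execute("INSERT INTO risk (title, ssc, area, baseline_level, current_level)"
                  " VALUES ('Example risk item', 'Example SSC', 'Example area',"
                  " 'High', 'Medium')")
      for row in con.execute("SELECT title, baseline_level, current_level FROM risk"):
          print(row)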

  5. Functional Risk Modeling for Lunar Surface Systems

    NASA Technical Reports Server (NTRS)

    Thomson, Fraser; Mathias, Donovan; Go, Susie; Nejad, Hamed

    2010-01-01

    We introduce an approach to risk modeling that we call functional modeling, which we have developed to estimate the capabilities of a lunar base. The functional model tracks the availability of functions provided by systems, in addition to the operational state of those systems' constituent strings. By tracking functions, we are able to identify cases where identical functions are provided by elements (rovers, habitats, etc.) that are connected together on the lunar surface. We credit functional diversity in those cases, and in doing so compute more realistic estimates of operational mode availabilities. The functional modeling approach yields more realistic estimates of the availability of the various operational modes provided to astronauts by the ensemble of surface elements included in a lunar base architecture. By tracking functional availability, the effects of diverse backup, which often exists when two or more independent elements are connected together, are properly accounted for.
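
    The functional-diversity credit reduces to elementary redundancy arithmetic: a function is unavailable only if every element providing it is down. A minimal sketch with invented availabilities:

      def function_availability(element_availabilities):
          """1 minus the product of element unavailabilities (independent elements)."""
          p_all_down = 1.0
          for a in element_availabilities:
              p_all_down *= (1.0 - a)
          return 1.0 - p_all_down

      rover, habitat = 0.95, 0.98   # hypothetical life-support availabilities
      print(function_availability([rover]))            # 0.95, rover alone
      print(function_availability([rover, habitat]))   # 0.999, credit for diversity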

  6. Electricity market pricing, risk hedging and modeling

    NASA Astrophysics Data System (ADS)

    Cheng, Xu

    In this dissertation, we investigate the pricing, price risk hedging/arbitrage, and simplified system modeling for a centralized LMP-based electricity market. In an LMP-based market model, the full AC power flow model and the DC power flow model are the most widely used representations of the transmission system. We investigate the differences in dispatch results, congestion patterns, and LMPs between the two power flow models. An appropriate LMP decomposition scheme to quantify the marginal costs of congestion and real power losses is critical for the implementation of financial risk hedging markets. However, the traditional LMP decomposition depends heavily on the slack bus selection. In this dissertation we propose a slack-independent scheme to break the LMP down into energy, congestion, and marginal loss components by analyzing the actual marginal cost of each bus at the optimal solution point. The physical and economic meaning of the marginal effect at each bus provides accurate price information for both congestion and losses, and thus the slack dependency of the traditional scheme is eliminated. With electricity priced at the margin instead of at the average value, the market operator typically collects more revenue from power sellers than it pays to power buyers. According to the LMP decomposition results, this revenue surplus is then divided into two parts: the congestion charge surplus and the marginal loss revenue surplus. We apply the LMP decomposition results to financial tools, such as the financial transmission right (FTR) and the loss hedging right (LHR), which have been introduced to hedge against the price risks associated with congestion and losses, to construct a full price-risk hedging portfolio. The two-settlement market structure and the introduction of financial tools inevitably create market manipulation opportunities. We investigate several possible market manipulation behaviors by virtual bidding and propose a market monitoring approach to identify and quantify such

  7. A full-dimensional model of ozone forming reaction: the absolute value of the recombination rate coefficient, its pressure and temperature dependencies.

    PubMed

    Teplukhin, Alexander; Babikov, Dmitri

    2016-07-28

    Rigorous calculations of scattering resonances in ozone are carried out for a broad range of rotational excitations. The accurate potential energy surface of Dawes is adopted, and a new efficient method for calculations of ro-vibrational energies, wave functions and resonance lifetimes is employed (which uses hyper-spherical coordinates, the sequential diagonalization/truncation approach, grid optimization and complex absorbing potential). A detailed analysis is carried out to characterize distributions of resonance energies and lifetimes, their rotational/vibrational content and their positions with respect to the centrifugal barrier. Emphasis is on the contribution of these resonances to the recombination process that forms ozone. It is found that major contributions come from localized resonances at energies near the top of the barrier. Delocalized resonances at higher energies should also be taken into account, while very narrow resonances at low energies (trapped far behind the centrifugal barrier) should be treated as bound states. The absolute value of the recombination rate coefficient, its pressure and temperature dependencies are obtained using the energy-transfer model developed in the earlier work. Good agreement with experimental data is obtained if one follows the suggestion of Troe, who argued that the energy transfer mechanism of recombination is responsible only for 55% of the recombination rate (with the remaining 45% coming from the competing chaperon mechanism). PMID:27364351

  9. MODELING MULTI-WAVELENGTH STELLAR ASTROMETRY. II. DETERMINING ABSOLUTE INCLINATIONS, GRAVITY-DARKENING COEFFICIENTS, AND SPOT PARAMETERS OF SINGLE STARS WITH SIM LITE

    SciTech Connect

    Coughlin, Jeffrey L.; Harrison, Thomas E.; Gelino, Dawn M.

    2010-11-10

    We present a novel technique to determine the absolute inclination of single stars using multi-wavelength submilliarcsecond astrometry. The technique exploits the effect of gravity darkening, which causes a wavelength-dependent astrometric displacement parallel to a star's projected rotation axis. We find that this effect is clearly detectable using SIM Lite for various giant stars and rapid rotators, and present detailed models for multiple systems using the REFLUX code. We also explore the multi-wavelength astrometric reflex motion induced by spots on single stars. We find that it should be possible to determine spot size, relative temperature, and some positional information for both giant and nearby main-sequence stars utilizing multi-wavelength SIM Lite data. These data will be extremely useful in stellar and exoplanet astrophysics, as well as supporting the primary SIM Lite mission through proper multi-wavelength calibration of the giant star astrometric reference frame, and reduction of noise introduced by starspots when searching for extrasolar planets.

  10. Absolute Equilibrium Entropy

    NASA Technical Reports Server (NTRS)

    Shebalin, John V.

    1997-01-01

    The entropy associated with absolute equilibrium ensemble theories of ideal, homogeneous, fluid and magneto-fluid turbulence is discussed and the three-dimensional fluid case is examined in detail. A sigma-function is defined, whose minimum value with respect to global parameters is the entropy. A comparison is made between the use of global functions sigma and phase functions H (associated with the development of various H-theorems of ideal turbulence). It is shown that the two approaches are complementary though conceptually different: H-theorems show that an isolated system tends to equilibrium while sigma-functions allow the demonstration that entropy never decreases when two previously isolated systems are combined. This provides a more complete picture of entropy in the statistical mechanics of ideal fluids.

  11. Reversible intramolecular hydrogen transfer between cysteine thiyl radicals and glycine and alanine in model peptides: absolute rate constants derived from pulse radiolysis and laser flash photolysis

    PubMed Central

    Nauser, Thomas; Casi, Giulio; Koppenol, Willem H.; Schöneich, Christian

    2008-01-01

    The intramolecular reaction of cysteine thiyl radicals with peptide and protein αC-H bonds represents a potential mechanism for irreversible protein oxidation. Here, we have measured absolute rate constants for these reversible hydrogen transfer reactions by means of pulse radiolysis and laser flash photolysis of model peptides. For N-Ac-CysGly6 and N-Ac-CysGly2AspGly3, Cys thiyl radicals abstract hydrogen atoms from Gly with kf = (1.0-1.1) × 10^5 s^-1, generating carbon-centered radicals, while the reverse reaction proceeds with kr = (8.0-8.9) × 10^5 s^-1. The forward reaction shows a normal kinetic isotope effect of kH/kD = 6.9, while the reverse reaction shows a significantly higher normal kinetic isotope effect of 17.6, suggesting a contribution of tunneling. For N-Ac-CysAla2AspAla3, cysteine thiyl radicals abstract hydrogen atoms from Ala with kf = (0.9-1.0) × 10^4 s^-1, while the reverse reaction proceeds with kr = 1.0 × 10^5 s^-1. The order of reactivity, Gly > Ala, is in accord with previous studies on intermolecular reactions of thiyl radicals with these amino acids. The fact that kf < kr suggests some secondary structure of the model peptides, which prevents the adoption of extended conformations, for which calculations of homolytic bond dissociation energies would have predicted kf > kr. Despite kf < kr, model calculations show that intramolecular hydrogen abstraction by Cys thiyl radicals can lead to significant oxidation of other amino acids in the presence of physiologic oxygen concentrations. PMID:18973367
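
    Since both forward and reverse rate constants are reported, the implied intramolecular equilibrium constants follow by simple division; a quick check using midpoints of the quoted ranges:

        \[
        K_{\mathrm{Gly}} = \frac{k_f}{k_r} \approx \frac{1.05 \times 10^{5}\,\mathrm{s^{-1}}}{8.45 \times 10^{5}\,\mathrm{s^{-1}}} \approx 0.12, \qquad
        K_{\mathrm{Ala}} = \frac{k_f}{k_r} \approx \frac{0.95 \times 10^{4}\,\mathrm{s^{-1}}}{1.0 \times 10^{5}\,\mathrm{s^{-1}}} \approx 0.10,
        \]

    consistent with the observation that kf < kr: in both peptide series the equilibrium favors the thiyl radical over the carbon-centered radical.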

  12. Risk assessment model for invasive breast cancer in Hong Kong women

    PubMed Central

    Wang, Feng; Dai, Juncheng; Li, Mengjie; Chan, Wing-cheong; Kwok, Carol Chi-hei; Leung, Siu-lan; Wu, Cherry; Li, Wentao; Yu, Wai-cho; Tsang, Koon-ho; Law, Sze-hong; Lee, Priscilla Ming-yi; Wong, Carmen Ka-man; Shen, Hongbing; Wong, Samuel Yeung-shan; Yang, Xiaohong R.; Tse, Lap Ah

    2016-01-01

    No risk assessment tool is available for identifying high-risk populations for breast cancer (BCa) in Hong Kong. A case-control study including 918 BCa cases and 923 controls was used to develop the risk assessment model among Hong Kong Chinese women. Each participant received an in-depth interview to obtain their lifestyle and environmental risk factors. A least absolute shrinkage and selection operator (LASSO) selection model was used to select the optimal risk factors (LASSO-model). A risk score system was constructed to evaluate the cumulative effects of the selected factors. Bootstrap simulation was used to test the internal validity of the model. Model performance was evaluated by receiver-operating characteristic curves and the area under the curve (AUC). Age, parity, number of BCa cases in first-degree relatives, exposure to light at night, and sleep quality were the common risk factors for all women. Alcohol drinking was included for premenopausal women; body mass index, age at menarche, age at first birth, breast feeding, oral contraceptive use, hormone replacement therapy, and history of benign breast disease were included for postmenopausal women. The AUCs were 0.640 (95% CI, 0.598–0.681) and 0.655 (95% CI, 0.621–0.653) for pre- and postmenopausal women, respectively. Further subgroup evaluation revealed that the model performance was better for women aged 50 to 70 years or ER-positive. This BCa risk assessment tool for Hong Kong Chinese women based on LASSO selection is promising, showing slightly higher discriminative accuracy than tools developed in other populations. PMID:27512870
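
    As a concrete illustration of this type of LASSO-plus-bootstrap pipeline, here is a minimal sketch in Python on simulated stand-in data (all numbers and variable choices below are invented, not taken from the study):

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import roc_auc_score

        # Toy stand-in for the case-control data: rows are women, columns are
        # candidate risk factors (age, parity, light at night, ...), y is case status.
        rng = np.random.default_rng(0)
        X = rng.normal(size=(1841, 12))        # 918 cases + 923 controls, 12 factors
        y = rng.integers(0, 2, size=1841)

        # LASSO selection: L1-penalized logistic regression shrinks weak
        # factors to exactly zero, leaving the selected subset.
        lasso = LogisticRegression(penalty="l1", solver="liblinear", C=0.1)
        lasso.fit(X, y)
        selected = np.flatnonzero(lasso.coef_[0])

        # Simplified bootstrap internal validation of the AUC.
        aucs = []
        for _ in range(200):
            idx = rng.integers(0, len(y), len(y))
            if len(np.unique(y[idx])) < 2:
                continue
            boot = LogisticRegression(penalty="l1", solver="liblinear", C=0.1)
            boot.fit(X[idx], y[idx])
            aucs.append(roc_auc_score(y, boot.predict_proba(X)[:, 1]))

        print("selected factors:", selected)
        print("bootstrap AUC 95% interval:", np.percentile(aucs, [2.5, 97.5]))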

  14. Semileptonic decays Λc+ → Λ ℓ+ νℓ (ℓ = e, μ) in the covariant quark model and comparison with the new absolute branching fraction measurements of Belle and BESIII

    NASA Astrophysics Data System (ADS)

    Gutsche, Thomas; Ivanov, Mikhail A.; Körner, Jürgen G.; Lyubovitskij, Valery E.; Santorelli, Pietro

    2016-02-01

    We present precise theoretical predictions for the absolute branching fractions of Λc+ → Λ ℓ+ νℓ (ℓ = e, μ) decays in the covariant confined quark model. This study is motivated by two recent and accurate measurements of the absolute branching fractions of Λc+ → p K− π+ and Λc+ → Λ e+ νe by the Belle Collaboration at KEKB and by the BESIII Collaboration at BEPCII. Our predictions for the branching fractions are consistent with both experimental results. We also provide detailed numerical results for differential decay distributions and polarization observables.

  15. A hybrid likelihood algorithm for risk modelling.

    PubMed

    Kellerer, A M; Kreisheimer, M; Chmelevsky, D; Barclay, D

    1995-03-01

    The risk of radiation-induced cancer is assessed through the follow-up of large cohorts, such as atomic bomb survivors or underground miners who have been occupationally exposed to radon and its decay products. The models relate to the dose, age and time dependence of the excess tumour rates, and they contain parameters that are estimated in terms of maximum likelihood computations. The computations are performed with the software package EPI-CURE, which contains the two main options of person-by-person regression or of Poisson regression with grouped data. The Poisson regression is most frequently employed, but there are certain models that require an excessive number of cells when grouped data are used. One example involves computations that account explicitly for the temporal distribution of continuous exposures, as they occur with underground miners. In past work such models had to be approximated, but it is shown here that they can be treated explicitly in a suitably reformulated person-by-person computation of the likelihood. The algorithm uses the familiar partitioning of the log-likelihood into two terms, L1 and L0. The first term, L1, represents the contribution of the 'events' (tumours). It needs to be evaluated in the usual way, but constitutes no computational problem. The second term, L0, represents the event-free periods of observation. It is, in its usual form, unmanageable for large cohorts. However, it can be reduced to a simple form, in which the number of computational steps is independent of cohort size. The method requires less computing time and computer memory, but more importantly it leads to more stable numerical results by obviating the need for grouping the data. The algorithm may be most relevant to radiation risk modelling, but it can facilitate the modelling of failure-time data in general. PMID:7604154
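
    In generic survival-likelihood notation, the partition used by the algorithm is (a sketch, not EPI-CURE's exact formulation):

        \[
        \log L = L_1 + L_0, \qquad
        L_1 = \sum_{j \in \text{events}} \log \lambda(t_j \mid z_j), \qquad
        L_0 = -\sum_{i=1}^{N} \int_{a_i}^{b_i} \lambda(t \mid z_i)\, dt,
        \]

    where \([a_i, b_i]\) is subject \(i\)'s observation period and \(z_i\) the covariate (exposure) history. When \(\lambda\) depends on the data only through a modest number of strata and cumulative-exposure terms, the sum over the \(N\) subjects in \(L_0\) collapses to a weighted person-time tally whose cost no longer grows with cohort size, which is the reduction the abstract describes.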

  16. Using Multivariate Regression Model with Least Absolute Shrinkage and Selection Operator (LASSO) to Predict the Incidence of Xerostomia after Intensity-Modulated Radiotherapy for Head and Neck Cancer

    PubMed Central

    Ting, Hui-Min; Chang, Liyun; Huang, Yu-Jie; Wu, Jia-Ming; Wang, Hung-Yu; Horng, Mong-Fong; Chang, Chun-Ming; Lan, Jen-Hong; Huang, Ya-Yu; Fang, Fu-Min; Leung, Stephen Wan

    2014-01-01

    Purpose The aim of this study was to develop a multivariate logistic regression model with least absolute shrinkage and selection operator (LASSO) to make valid predictions about the incidence of moderate-to-severe patient-rated xerostomia among head and neck cancer (HNC) patients treated with IMRT. Methods and Materials Quality of life questionnaire datasets from 206 patients with HNC were analyzed. The European Organization for Research and Treatment of Cancer QLQ-H&N35 and QLQ-C30 questionnaires were used as the endpoint evaluation. The primary endpoint (grade 3+ xerostomia) was defined as moderate-to-severe xerostomia at 3 (XER3m) and 12 months (XER12m) after the completion of IMRT. Normal tissue complication probability (NTCP) models were developed. The optimal and suboptimal numbers of prognostic factors for a multivariate logistic regression model were determined using the LASSO with bootstrapping technique. Statistical analysis was performed using the scaled Brier score, Nagelkerke R2, chi-squared test, Omnibus, Hosmer-Lemeshow test, and the AUC. Results Eight prognostic factors were selected by LASSO for the 3-month time point: Dmean-c, Dmean-i, age, financial status, T stage, AJCC stage, smoking, and education. Nine prognostic factors were selected for the 12-month time point: Dmean-i, education, Dmean-c, smoking, T stage, baseline xerostomia, alcohol abuse, family history, and node classification. In the selection of the suboptimal number of prognostic factors by LASSO, three suboptimal prognostic factors were fine-tuned by Hosmer-Lemeshow test and AUC, i.e., Dmean-c, Dmean-i, and age for the 3-month time point. Five suboptimal prognostic factors were also selected for the 12-month time point, i.e., Dmean-i, education, Dmean-c, smoking, and T stage. The overall performance for both time points of the NTCP model in terms of scaled Brier score, Omnibus, and Nagelkerke R2 was satisfactory and corresponded well with the expected values. Conclusions
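
    In generic form, a multivariate logistic NTCP model of the kind fitted here is

        \[
        \mathrm{NTCP} = \frac{1}{1 + e^{-s}}, \qquad s = \beta_0 + \sum_k \beta_k x_k,
        \]

    where the \(x_k\) are the selected prognostic factors (e.g., Dmean-c, Dmean-i, and age for the 3-month model) and the LASSO constraint \(\sum_k |\beta_k| \le t\) governs how many factors survive selection; this is the standard form of such models, not necessarily the authors' exact parameterization.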

  17. The developmental 'risk factor' model of schizophrenia.

    PubMed

    Murray, R M; Fearon, P

    1999-01-01

    There is no single cause for schizophrenia. We believe that, as with other common chronic diseases such as diabetes and coronary artery disease, the appropriate aetiological model is one involving multiple genes and environmental risk factors; the latter can be divided into (a) predisposing and (b) precipitating. Our model is that genetic and/or early environmental factors cause the development of anomalous neural networks. We postulate that these interact in the growing child with inherited schizotypal traits to establish a trajectory towards an increasingly solitary and deviant lifestyle. This ultimately projects the individual across the threshold for expression of schizophrenia, sometimes by causing the drug abuse and social adversity that appear to precipitate the psychosis. PMID:10628525

  18. Automating risk analysis of software design models.

    PubMed

    Frydman, Maxime; Ruiz, Guifré; Heymann, Elisa; César, Eduardo; Miller, Barton P

    2014-01-01

    The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance. PMID:25136688

  20. Extreme Earthquake Risk Estimation by Hybrid Modeling

    NASA Astrophysics Data System (ADS)

    Chavez, M.; Cabrera, E.; Ashworth, M.; Garcia, S.; Emerson, D.; Perea, N.; Salazar, A.; Moulinec, C.

    2012-12-01

    The estimation of the hazard and of the economic consequences, i.e., the risk, associated with the occurrence of extreme-magnitude earthquakes in the neighborhood of urban or lifeline infrastructure, such as the 11 March 2011 Mw 9 Tohoku, Japan, event, represents a complex challenge, as it involves the propagation of seismic waves in large volumes of the Earth's crust, from unusually large seismic source ruptures up to the infrastructure location. The large number of casualties and the huge economic losses observed for those earthquakes, some of which have a frequency of occurrence of hundreds or thousands of years, call for the development of new paradigms and methodologies to generate better estimates both of the seismic hazard and of its consequences, and if possible, to estimate the probability distributions of their ground intensities and of their economic impacts (direct and indirect losses), in order to implement technological and economic policies to mitigate and reduce, as much as possible, those consequences. Here we propose a hybrid modeling approach that uses 3D seismic wave propagation (3DWP) and neural network (NN) modeling to estimate the seismic risk of extreme earthquakes. The 3DWP modeling is achieved by using a 3D finite-difference code run on the ~100,000-core Blue Gene/Q supercomputer of the STFC Daresbury Laboratory in the UK, combined with empirical Green's function (EGF) techniques and NN algorithms. In particular, the 3DWP is used to generate broadband samples of the 3D wave propagation of extreme (plausible) earthquake scenarios corresponding to synthetic seismic sources, and to enlarge those samples by using feed-forward NNs. We present the results of the validation of the proposed hybrid modeling for Mw 8 subduction events, and show examples of its application to the estimation of the hazard and the economic consequences for extreme Mw 8.5 subduction earthquake scenarios with seismic sources in the Mexican

  1. Integrating Professional and Folk Models of HIV Risk: YMSM's Perceptions of High-Risk Sex

    ERIC Educational Resources Information Center

    Kubicek, Katrina; Carpineto, Julie; McDavitt, Bryce; Weiss, George; Iverson, Ellen F.; Au, Chi-Wai; Kerrone, Dustin; Martinez, Miguel; Kipke, Michele D.

    2008-01-01

    Risks associated with HIV are well documented in research literature. Although a great deal has been written about high-risk sex, little research has been conducted to examine how young men who have sex with men (YMSM) perceive and define high-risk sexual behavior. In this study, we compare the "professional" and "folk" models of HIV risk based on…

  2. The globalization of risk and risk perception: why we need a new model of risk communication for vaccines.

    PubMed

    Larson, Heidi; Brocard Paterson, Pauline; Erondu, Ngozi

    2012-11-01

    Risk communication about vaccines is complex, and the nature of risk perception is changing, with perceptions converging, evolving and having impacts well beyond specific geographic localities and points in time, especially when amplified through the Internet and other modes of global communication. This article examines the globalization of risk perceptions and their impacts, including the example of measles and the globalization of measles, mumps and rubella (MMR) vaccine risk perceptions, and calls for a new, more holistic model of risk assessment, risk communication and risk mitigation, embedded in an ongoing process of risk management for vaccines and immunization programmes. It envisions risk communication as an ongoing process that includes trust-building strategies hand-in-hand with the operational and policy strategies needed to mitigate and manage vaccine-related risks, as well as perceptions of risk.

  3. Stimulus probability effects in absolute identification.

    PubMed

    Kent, Christopher; Lamberts, Koen

    2016-05-01

    This study investigated the effect of stimulus presentation probability on accuracy and response times in an absolute identification task. Three schedules of presentation were used to investigate the interaction between presentation probability and stimulus position within the set. Data from individual participants indicated strong effects of presentation probability on both proportion correct and response times. The effects were moderated by the ubiquitous stimulus position effect. The accuracy and response time data were predicted by an exemplar-based model of perceptual cognition (Kent & Lamberts, 2005). The bow in discriminability was also attenuated when presentation probability for middle items was relatively high, an effect that will constrain future model development. The study provides evidence for item-specific learning in absolute identification. Implications for other theories of absolute identification are discussed.

  4. Assessing patients' risk of febrile neutropenia: is there a correlation between physician-assessed risk and model-predicted risk?

    PubMed

    Lyman, Gary H; Dale, David C; Legg, Jason C; Abella, Esteban; Morrow, Phuong Khanh; Whittaker, Sadie; Crawford, Jeffrey

    2015-08-01

    This study evaluated the correlation between the risk of febrile neutropenia (FN) estimated by physicians and the risk of severe neutropenia or FN predicted by a validated multivariate model in patients with nonmyeloid malignancies receiving chemotherapy. Before patient enrollment, physician and site characteristics were recorded, and physicians self-reported the FN risk at which they would typically consider granulocyte colony-stimulating factor (G-CSF) primary prophylaxis (FN risk intervention threshold). For each patient, physicians electronically recorded their estimated FN risk, orders for G-CSF primary prophylaxis (yes/no), and patient characteristics for model predictions. Correlations between physician-assessed FN risk and model-predicted risk (primary endpoints) and between physician-assessed FN risk and G-CSF orders were calculated. Overall, 124 community-based oncologists registered; 944 patients initiating chemotherapy with intermediate FN risk enrolled. Median physician-assessed FN risk over all chemotherapy cycles was 20.0%, and median model-predicted risk was 17.9%; the correlation was 0.249 (95% CI, 0.179-0.316). The correlation between physician-assessed FN risk and subsequent orders for G-CSF primary prophylaxis (n = 634) was 0.313 (95% CI, 0.135-0.472). Among patients with a physician-assessed FN risk ≥ 20%, 14% did not receive G-CSF orders. G-CSF was not ordered for 16% of patients at or above their physician's self-reported FN risk intervention threshold (median, 20.0%) and was ordered for 21% below the threshold. Physician-assessed FN risk and model-predicted risk correlated weakly; however, there was moderate correlation between physician-assessed FN risk and orders for G-CSF primary prophylaxis. Further research and education on FN risk factors and appropriate G-CSF use are needed.

  5. Flexible dose-response models for Japanese atomic bomb survivor data: Bayesian estimation and prediction of cancer risk.

    PubMed

    Bennett, James; Little, Mark P; Richardson, Sylvia

    2004-12-01

    Generalised absolute risk models were fitted to the latest Japanese atomic bomb survivor cancer incidence data using Bayesian Markov Chain Monte Carlo methods, taking account of random errors in the DS86 dose estimates. The resulting uncertainty distributions in the relative risk model parameters were used to derive uncertainties in population cancer risks for a current UK population. Because of evidence for irregularities in the low-dose dose response, flexible dose-response models were used, consisting of a linear-quadratic-exponential model, used to model the high-dose part of the dose response, together with piecewise-linear adjustments for the two lowest dose groups. Following an assumed administered dose of 0.001 Sv, lifetime leukaemia radiation-induced incidence risks were estimated to be 1.11 × 10^-2 Sv^-1 (95% Bayesian CI -0.61, 2.38) using this model. Following an assumed administered dose of 0.001 Sv, lifetime solid cancer radiation-induced incidence risks were calculated to be 7.28 × 10^-2 Sv^-1 (95% Bayesian CI -10.63, 22.10) using this model. Overall, cancer incidence risks predicted by Bayesian Markov Chain Monte Carlo methods are similar to those derived by classical likelihood-based methods, which form the basis of established estimates of radiation-induced cancer risk.
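
    A sketch of this kind of flexible dose response, in generic notation (the paper's exact parameterization may differ):

        \[
        \mathrm{ERR}(D) = (\alpha D + \beta D^{2})\, e^{-\gamma D},
        \]

    for the high-dose part, with additional piecewise-linear terms that let the two lowest dose groups deviate from this curve, so that irregularities near zero dose do not distort the fit at higher doses.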

  6. Radiation carcinogenesis in man: influence of dose-response models and risk projection models in the estimation of risk coefficients following exposure to low-level radiation

    SciTech Connect

    Fabrikant, J.I.

    1982-02-01

    The somatic effects of concern in human populations exposed to low doses and low dose rates of ionizing radiations are those that may be induced by mutation in individual cells, singly or in small numbers. The most important of these is considered to be cancer induction. Current knowledge of the carcinogenic effect of radiation in man has been reviewed in two recent reports: the 1977 UNSCEAR Report; and the 1980 BEIR-III Report. Both reports emphasize that cancers of the breast, thyroid, hematopoietic tissues, lung, and bone can be induced by radiation. Other cancers, including the stomach, pancreas, pharynx, lymphatic, and perhaps all tissues of the body, may also be induced by radiation. Both reports calculate risk estimates in absolute and relative terms for low-dose, low-LET whole-body exposure, and for leukemia, breast cancer, thyroid cancer, lung cancer, and other cancers. These estimates derive from exposure and cancer incidence data at high doses and at high dose rates. There are no compelling scientific reasons to apply these values of risk to the very low doses and low dose rates of concern in human radiation protection. In the absence of reliable human data for calculating risk estimates, dose-response models have been constructed from extrapolations of animal data and high-dose-rate human data for projection of estimated risks at low doses and low dose rates. (ERB)

  7. Absolute near-infrared oximetry for urology: a quantitative study of the tissue hemoglobin saturation before and after testicular torsion in a rabbit model

    NASA Astrophysics Data System (ADS)

    Hallacoglu, Bertan; Matulewicz, Richard S.; Paltiel, Harriet J.; Padua, Horacio; Gargollo, Patricio; Cannon, Glenn; Alomari, Ahmad; Sassaroli, Angelo; Fantini, Sergio

    2009-02-01

    We present an experimental study on four rabbits to demonstrate the feasibility of near-infrared spectroscopy in the noninvasive assessment of testicular torsion. We used a multi-distance frequency-domain method, based on a fixed detector position and a 9-mm linear scan of the illumination optical fibers, to measure absolute values of pre- and post-operative testicular oxygen saturation. Unilateral testicular torsions (by 0°, 540° or 720°) on experimental testes and contralateral sham surgeries (no torsion) on control testes were performed and studied. Our results showed (a) a consistent baseline absolute tissue oxygen saturation value of 78% +/- 5%; (b) a comparable absolute saturation of 77% +/- 6% on the control side (testes after sham surgery); and (c) a significantly lower tissue oxygen saturation of 36% +/- 2% on the experimental side (testes after 540° or 720° torsion surgery). These results demonstrate the capability of frequency-domain near-infrared spectroscopy in the assessment of absolute testicular hemoglobin desaturation caused by torsion, and show promise as a potential method to serve as a complement to conventional color and spectral Doppler ultrasonography.

  8. Risk assessment compatible fire models (RACFMs)

    SciTech Connect

    Lopez, A.R.; Gritzo, L.A.; Sherman, M.P.

    1998-07-01

    A suite of Probabilistic Risk Assessment Compatible Fire Models (RACFMs) has been developed to represent the hazard posed by a pool fire to weapon systems transported on the B-52H aircraft. These models represent both stand-off (i.e., the weapon system is outside of the flame zone but exposed to the radiant heat load from the fire) and fully-engulfing scenarios (i.e., the object is fully covered by flames). The approach taken in developing the RACFMs for both scenarios was to consolidate, reconcile, and apply data and knowledge from all available resources, including data and correlations from the literature, data from an extensive full-scale fire test program at the Naval Air Warfare Center (NAWC) at China Lake, and results from a fire field model (VULCAN). In the past, a single effective temperature, T_f, was used to represent the fire. The heat flux to an object exposed to a fire was estimated using the relationship for black-body radiation, σT_f^4. Significant improvements have been made by employing the present approach, which accounts for the presence of temperature distributions in fully-engulfing fires, and uses the best available correlations to estimate heat fluxes in stand-off scenarios.
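
    The single-temperature approximation mentioned above is just the Stefan-Boltzmann law; a minimal sketch (the 1255 K pool-fire temperature is an assumed illustrative value, not a figure from the report):

        SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

        def engulfing_flux(t_fire_k: float) -> float:
            """Black-body heat flux implied by a single effective fire temperature."""
            return SIGMA * t_fire_k ** 4

        # The one-parameter model the RACFMs improve upon:
        print(f"{engulfing_flux(1255.0) / 1000.0:.0f} kW/m^2")  # roughly 141 kW/m^2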

  9. A Risk Prediction Model for Smoking Experimentation in Mexican American Youth

    PubMed Central

    Talluri, Rajesh; Wilkinson, Anna V.; Spitz, Margaret R.; Shete, Sanjay

    2014-01-01

    Background Smoking experimentation in Mexican American youth is problematic. In light of the research showing that preventing smoking experimentation is a valid strategy for smoking prevention, there is a need to identify Mexican American youth at high risk for experimentation. Methods A prospective population-based cohort of 1179 adolescents of Mexican descent was followed for 5 years starting in 2005–06. Participants completed a baseline interview at a home visit followed by three telephone interviews at intervals of approximately 6 months and additional interviews at two home visits in 2008–09 and 2010–11. The primary end point of interest in this study was smoking experimentation. Information regarding social, cultural, and behavioral factors (e.g., acculturation, susceptibility to experimentation, home characteristics, household influences) was collected at baseline using validated questionnaires. Results Age, sex, cognitive susceptibility, household smoking behavior, peer influence, neighborhood influence, acculturation, work characteristics, positive outcome expectations, family cohesion, degree of tension, ability to concentrate, and school discipline were found to be associated with smoking experimentation. In a validation dataset, the proposed risk prediction model had an AUC of 0.719 (95% confidence interval, 0.637 to 0.801) for predicting absolute risk for smoking experimentation within 1 year. Conclusions The proposed risk prediction model is able to quantify the risk of smoking experimentation in Mexican American adolescents. PMID:25063521

  10. Galactic Cosmic Ray Event-Based Risk Model (GERM) Code

    NASA Technical Reports Server (NTRS)

    Cucinotta, Francis A.; Plante, Ianik; Ponomarev, Artem L.; Kim, Myung-Hee Y.

    2013-01-01

    This software describes the transport and energy deposition of galactic cosmic rays passing through astronaut tissues during space travel, or of heavy ion beams in patients in cancer therapy. Space radiation risk is a probability distribution, and time-dependent biological events must be accounted for in the physical description of space radiation transport in tissues and cells. A stochastic model can calculate the probability density directly, without unverified assumptions about the shape of the probability density function. Prior transport codes calculate the average flux and dose of particles behind spacecraft and tissue shielding. Because of the signaling times for activation and relaxation in the cell and tissue, a transport code must describe the temporal and microspatial densities needed to correlate DNA and oxidative damage with non-targeted effects (signaling, bystander effects, etc.). These are ignored by, or impossible in, the prior art. The GERM code provides scientists with data interpretation of experiments; modeling of the beam line, shielding of target samples, and sample holders; and estimation of the basic physical and biological outputs of their experiments. For mono-energetic ion beams, basic physical and biological properties are calculated for a selected ion type, such as kinetic energy, mass, charge number, absorbed dose, or fluence. Evaluated quantities are linear energy transfer (LET), range (R), absorption and fragmentation cross-sections, and the probability of nuclear interactions after 1 or 5 cm of water-equivalent material. In addition, a set of biophysical properties is evaluated, such as the Poisson distribution of hits for a specified cellular area, cell survival curves, and DNA damage yields per cell. Also, the GERM code calculates the radiation transport of the beam line for either a fixed number of user-specified depths or at multiple positions along the Bragg curve of the particle in a selected material. The GERM code makes the numerical estimates of basic
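
    The Poisson hit statistics mentioned above follow from the product of fluence and cellular area; a small sketch with assumed illustrative numbers (this is not GERM's actual interface):

        import math

        def hit_probability(fluence_per_um2: float, area_um2: float, k: int) -> float:
            """Poisson probability of exactly k particle traversals of a cell area."""
            lam = fluence_per_um2 * area_um2  # mean number of hits
            return math.exp(-lam) * lam ** k / math.factorial(k)

        # Example: fluence of 0.01 particles/um^2 over a 100 um^2 nucleus (lam = 1).
        for k in range(4):
            print(k, round(hit_probability(0.01, 100.0, k), 4))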

  11. Be Resolute about Absolute Value

    ERIC Educational Resources Information Center

    Kidd, Margaret L.

    2007-01-01

    This article explores how conceptualization of absolute value can start long before it is introduced. The manner in which absolute value is introduced to students in middle school has far-reaching consequences for their future mathematical understanding. It begins to lay the foundation for students' understanding of algebra, which can change…

  12. Measuring and modelling pollution for risk analysis.

    PubMed

    Zidek, J V; Le, N D

    1999-01-01

    The great scale and complexity of environmental risk analysis offer major methodological challenges to those engaged in policymaking. In this paper we describe some of those challenges from the perspective gained through our work at the University of British Columbia (UBC). We describe some of our experiences with respect to the difficult problems of formulating environmental standards and developing abatement strategies. A failed but instructive attempt to find support for experiments on a promising method of reducing acid rain will be described. Then we describe an approach to scenario analysis under hypothetical new standards. Even with measurements of ambient environmental conditions in hand, the problem of inferring actual human exposures remains. For example, in very hot weather people will tend to stay inside, and population levels of exposure to, e.g., ozone could be well below those predicted by the ambient measurements. Setting air quality criteria should ideally recognize the discrepancies likely to arise. Computer models that incorporate spatial random pollution fields and predict actual exposures from ambient levels will be described. From there we turn to the statistical issues of measurement and modelling and some of the contributions in these areas by the UBC group and its partners elsewhere. In particular we discuss the problem of measurement error when non-linear regression models are used. We sketch our approach to imputing unmeasured predictors needed in such models, deferring details to references cited below.

  13. Modeling HIV Risk in Highly Vulnerable Youth

    ERIC Educational Resources Information Center

    Huba, G. J.; Panter, A. T.; Melchior, Lisa A.; Trevithick, Lee; Woods, Elizabeth R.; Wright, Eric; Feudo, Rudy; Tierney, Steven; Schneir, Arlene; Tenner, Adam; Remafedi, Gary; Greenberg, Brian; Sturdevant, Marsha; Goodman, Elizabeth; Hodgins, Antigone; Wallace, Michael; Brady, Russell E.; Singer, Barney; Marconi, Katherine

    2003-01-01

    This article examines the structure of several HIV risk behaviors in an ethnically and geographically diverse sample of 8,251 clients from 10 innovative demonstration projects intended for adolescents living with, or at risk for, HIV. Exploratory and confirmatory factor analyses identified 2 risk factors for men (sexual intercourse with men and a…

  14. Absolute dimensions of solar-type eclipsing binaries. III. EW Orionis: stellar evolutionary models tested by a G0 V system

    NASA Astrophysics Data System (ADS)

    Clausen, J. V.; Bruntt, H.; Olsen, E. H.; Helt, B. E.; Claret, A.

    2010-02-01

    Context. Recent studies of inactive and active solar-type binaries suggest that chromospheric activity, and its effect on envelope convection, is likely to cause significant radius and temperature discrepancies. Accurate mass, radius, and abundance determinations from additional solar-type binaries exhibiting various levels of activity are needed for a better insight into the structure and evolution of these stars. Aims: We aim to determine absolute dimensions and abundances for the G0 V detached eclipsing binary EW Ori, and to perform a detailed comparison with results from recent stellar evolutionary models. Methods: uvby light curves and uvbyβ standard photometry were obtained with the Strömgren Automatic Telescope, published radial velocity observations from the CORAVEL spectrometer were reanalysed, and high-resolution spectra were observed at the FEROS spectrograph; all are/were ESO, La Silla facilities. State-of-the-art methods were applied for the photometric and spectroscopic analyses. Results: Masses and radii that are precise to 0.9% and 0.5%, respectively, have been established for both components of EW Ori. The 1.12 M⊙ secondary component reveals weak Ca ii H and K emission and is probably mildly active; no signs of activity are seen for the 1.17 M⊙ primary. We derive an [Fe/H] abundance of +0.05 ± 0.09 and similar abundances for Si, Ca, Sc, Ti, Cr, and Ni. Yonsei-Yale and Granada solar-scaled evolutionary models for the observed metal abundance reproduce the components fairly well at an age of ≈2 Gyr. Perfect agreement is, however, obtained at an age of 2.3 Gyr for a combination of a) a slight downwards adjustment of the envelope mixing length parameter for the secondary, as seen for other active solar-type stars; and b) a slightly lower helium content than prescribed by the Y-Z relations adopted for the standard model grids. The orbit is eccentric (e = 0.0758 ± 0.0020), and apsidal motion with a 62% relativistic contribution has been

  15. Bankruptcy risk model and empirical tests

    PubMed Central

    Podobnik, Boris; Horvatic, Davor; Petersen, Alexander M.; Urošević, Branko; Stanley, H. Eugene

    2010-01-01

    We analyze the size dependence and temporal stability of firm bankruptcy risk in the US economy by applying Zipf scaling techniques. We focus on a single risk factor—the debt-to-asset ratio R—in order to study the stability of the Zipf distribution of R over time. We find that the Zipf exponent increases during market crashes, implying that firms go bankrupt with larger values of R. Based on the Zipf analysis, we employ Bayes’s theorem and relate the conditional probability that a bankrupt firm has a ratio R with the conditional probability of bankruptcy for a firm with a given R value. For 2,737 bankrupt firms, we demonstrate size dependence in assets change during the bankruptcy proceedings. Prepetition firm assets and petition firm assets follow Zipf distributions but with different exponents, meaning that firms with smaller assets adjust their assets more than firms with larger assets during the bankruptcy process. We compare bankrupt firms with nonbankrupt firms by analyzing the assets and liabilities of two large subsets of the US economy: 2,545 Nasdaq members and 1,680 New York Stock Exchange (NYSE) members. We find that both assets and liabilities follow a Pareto distribution. The finding is not a trivial consequence of the Zipf scaling relationship of firm size quantified by employees—although the market capitalization of Nasdaq stocks follows a Pareto distribution, the same distribution does not describe NYSE stocks. We propose a coupled Simon model that simultaneously evolves both assets and debt with the possibility of bankruptcy, and we also consider the possibility of firm mergers. PMID:20937903
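
    The Bayes step described above, written out in generic notation:

        \[
        P(\text{bankrupt} \mid R) = \frac{P(R \mid \text{bankrupt})\, P(\text{bankrupt})}{P(R)},
        \]

    where the Zipf analysis supplies the conditional distribution \(P(R \mid \text{bankrupt})\), \(P(R)\) comes from the full population of firms, and the marginal bankruptcy probability rescales one conditional into the other.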

  18. Absolute nuclear material assay using count distribution (LAMBDA) space

    SciTech Connect

    Prasad, Mano K.; Snyderman, Neal J.; Rowland, Mark S.

    2015-12-01

    A method of absolute nuclear material assay of an unknown source comprising counting neutrons from the unknown source and providing an absolute nuclear material assay utilizing a model to optimally compare to the measured count distributions. In one embodiment, the step of providing an absolute nuclear material assay comprises utilizing a random sampling of analytically computed fission chain distributions to generate a continuous time-evolving sequence of event-counts by spreading the fission chain distribution in time.

  19. Absolute nuclear material assay using count distribution (LAMBDA) space

    DOEpatents

    Prasad, Manoj K.; Snyderman, Neal J.; Rowland, Mark S.

    2012-06-05

    A method of absolute nuclear material assay of an unknown source comprising counting neutrons from the unknown source and providing an absolute nuclear material assay utilizing a model to optimally compare to the measured count distributions. In one embodiment, the step of providing an absolute nuclear material assay comprises utilizing a random sampling of analytically computed fission chain distributions to generate a continuous time-evolving sequence of event-counts by spreading the fission chain distribution in time.
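
    A toy sketch of the idea of spreading sampled fission-chain multiplicities into a time-tagged count stream; the multiplicity distribution, rates, efficiency, and die-away time below are invented for illustration, and this is not the patented assay method itself:

        import random

        def simulate_counts(rate_hz, chain_pmf, eff, t_end, tau):
            """Toy event-count stream: fission chains arrive as a Poisson process;
            each chain's neutron multiplicity is drawn from chain_pmf and detected
            with efficiency eff, spread in time with an exponential die-away tau."""
            events = []
            t = random.expovariate(rate_hz)
            while t < t_end:
                mult = random.choices(range(len(chain_pmf)), weights=chain_pmf)[0]
                for _ in range(mult):
                    if random.random() < eff:
                        events.append(t + random.expovariate(1.0 / tau))
                t += random.expovariate(rate_hz)
            return sorted(events)

        counts = simulate_counts(rate_hz=50.0, chain_pmf=[0.2, 0.4, 0.25, 0.1, 0.05],
                                 eff=0.3, t_end=10.0, tau=1e-4)
        print(len(counts), "detected neutrons in 10 s")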

  20. Radiation risk modeling of thyroid cancer with special emphasis on the Chernobyl epidemiological data.

    PubMed

    Walsh, L; Jacob, P; Kaiser, J C

    2009-10-01

    Two recent studies analyzed thyroid cancer incidence in Belarus and Ukraine during the period from 1990 to 2001, for the birth cohort 1968 to 1985, and the related ¹³¹I exposure associated with the Chernobyl accident in 1986. Contradictory age-at-exposure and time-since-exposure effect modifications of the excess relative risk (ERR) were reported. The present study identifies the choice of baseline modeling method as the reason for the conflicting results. Various quality-of-fit criteria favor a parametric baseline model over various categorical baseline models. The model with a parametric baseline results in a decrease of the ERR by a factor of about 0.2 from an age at exposure of 5 years to an age at exposure of 15 years (for a time since exposure of 12 years) and a decrease of the ERR from a time since exposure of 4 years to a time since exposure of 14 years of about 0.25 (for an age at exposure of 10 years). Central ERR estimates (of about 20 at 1 Gy for an age at exposure of 10 years and an attained age of 20 years) and their ratios for females compared to males (about 0.3) turn out to be relatively independent of the modeling. Excess absolute risk estimates are also predicted to be very similar from the different models. Risk models with parametric and categorical baselines were also applied to thyroid cancer incidence among the atomic bomb survivors. For young ages at exposure, the ERR values in the model with a parametric baseline are larger. Both data sets cover the period of 12 to 15 years since exposure. For this period, higher ERR values and a stronger age-at-exposure modification are found for the Chernobyl data set. Based on the results of the study, it is recommended to test parametric and categorical baseline models in risk analyses. PMID:19772472
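
    Both baseline choices fit within the same generic multiplicative structure; a sketch in common radiation-epidemiology notation (assumed, not necessarily the authors' exact form):

        \[
        \lambda(a, s, e, t, D) = \lambda_0(a, s) \bigl[ 1 + \mathrm{ERR}(D)\, f(e)\, g(t) \bigr],
        \]

    where \(\lambda_0\) is the baseline thyroid cancer rate (parametric or categorical in attained age \(a\) and sex \(s\)), \(\mathrm{ERR}(D)\) the dose response, and \(f(e)\), \(g(t)\) the age-at-exposure and time-since-exposure effect modifiers whose estimated modifications are sensitive to the baseline choice.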

  1. 42 CFR 425.600 - Selection of risk model.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    Title 42, Public Health (2012 edition), § 425.600 Selection of risk model. (a) For its initial agreement period, an ACO may elect to operate under one of the following tracks: (1) Track 1. Under Track 1, the ACO operates under the one-sided model (as described...

  2. 42 CFR 425.600 - Selection of risk model.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    Title 42, Public Health (2013 edition), § 425.600 Selection of risk model. (a) For its initial agreement period, an ACO may elect to operate under one of the following tracks: (1) Track 1. Under Track 1, the ACO operates under the one-sided model (as described...

  3. 42 CFR 425.600 - Selection of risk model.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    Title 42, Public Health (2014 edition), § 425.600 Selection of risk model. (a) For its initial agreement period, an ACO may elect to operate under one of the following tracks: (1) Track 1. Under Track 1, the ACO operates under the one-sided model (as described...

  4. Research on R&D Project Risk Management Model

    NASA Astrophysics Data System (ADS)

    Gu, Xiaoyan; Cai, Chen; Song, Hao; Song, Juan

    An R&D project is an exploratory, high-risk investment activity with potential management flexibility. In the R&D project risk management process, risk is hard to quantify when very little past information is available. This paper introduces quality function deployment and real options into the traditional project risk management process. Through a waterfall decomposition mode, the R&D project risk management process is constructed step by step; through real options, the managerial flexibility inherent in an R&D project can be modeled. First, a risk priority list is obtained from the relation matrix between R&D project success factors and risk indexes. Then, the risk features of the various stages are analyzed. Finally, real options are embedded into the various stages of the R&D project according to these risk features. To manage R&D risk effectively in a dynamic cycle, these steps should be carried out repeatedly.

  5. Proliferation Risk Characterization Model Prototype Model - User and Programmer Guidelines

    SciTech Connect

    Dukelow, J.S.; Whitford, D.

    1998-12-01

    A model for the estimation of the risk of diversion of weapons-capable materials was developed. It represents both the threat of diversion and site vulnerability as a product of a small number of variables (two to eight), each of which can take on a small number (two to four) of qualitatively defined (but quantitatively implemented) values. The values of the overall threat and vulnerability variables are then converted to threat and vulnerability categories. The threat and vulnerability categories are used to define the likelihood of diversion, also defined categorically. The evaluator supplies an estimate of the consequences of a diversion, defined categorically, but with the categories based on the IAEA Attractiveness levels. Likelihood and Consequences categories are used to define the Risk, also defined categorically. The threat, vulnerability, and consequences input provided by the evaluator contains a representation of his/her uncertainty in each variable assignment which is propagated all the way through to the calculation of the Risk categories. [Appendix G available on diskette only.]
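
    The categorical combination described above amounts to two lookup tables; a toy sketch in which all category labels and table entries are invented for illustration:

        # Toy categorical risk matrix in the spirit of the prototype described
        # above; the labels and table entries are invented, not taken from it.
        LIKELIHOOD = {
            ("high", "high"): "likely",
            ("high", "low"): "possible",
            ("low", "high"): "possible",
            ("low", "low"): "unlikely",
        }
        RISK = {
            ("likely", "major"): "high",
            ("likely", "minor"): "moderate",
            ("possible", "major"): "moderate",
            ("possible", "minor"): "low",
            ("unlikely", "major"): "low",
            ("unlikely", "minor"): "negligible",
        }

        def assess(threat: str, vulnerability: str, consequence: str) -> str:
            """Map threat and vulnerability to likelihood, then combine with consequence."""
            likelihood = LIKELIHOOD[(threat, vulnerability)]
            return RISK[(likelihood, consequence)]

        print(assess("high", "low", "major"))  # -> "moderate"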

  6. Impact of Winko on absolute discharges.

    PubMed

    Balachandra, Krishna; Swaminath, Sam; Litman, Larry C

    2004-01-01

    In Canada, case law has had a significant impact on the way mentally ill offenders are managed, both in the criminal justice system and in the forensic mental health system. The Supreme Court of Canada's decision in Winko set a major precedent for applying the test of significant risk to the safety of the public when the Ontario Review Board makes dispositions and grants absolute discharges to mentally ill offenders in the forensic health system. Our study examines the impact of the Supreme Court of Canada's decision before and after Winko. The results show a statistically significant increase in the number of absolute discharges post-Winko, although other factors could be influencing this increase.

  7. PENALIZED VARIABLE SELECTION PROCEDURE FOR COX MODELS WITH SEMIPARAMETRIC RELATIVE RISK

    PubMed Central

    Ma, Shuangge; Liang, Hua

    2010-01-01

    We study the Cox models with semiparametric relative risk, which can be partially linear with one nonparametric component, or multiple additive or nonadditive nonparametric components. A penalized partial likelihood procedure is proposed to simultaneously estimate the parameters and select variables for both the parametric and the nonparametric parts. Two penalties are applied sequentially. The first penalty, governing the smoothness of the multivariate nonlinear covariate effect function, provides a smoothing spline ANOVA framework that is exploited to derive an empirical model selection tool for the nonparametric part. The second penalty, either the smoothly-clipped-absolute-deviation (SCAD) penalty or the adaptive LASSO penalty, achieves variable selection in the parametric part. We show that the resulting estimator of the parametric part possesses the oracle property, and that the estimator of the nonparametric part achieves the optimal rate of convergence. The proposed procedures are shown to work well in simulation experiments, and then applied to a real data example on sexually transmitted diseases. PMID:20802853
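
    For reference, the two penalties named above have standard forms (Fan and Li's SCAD is usually specified through its derivative; \(a > 2\) and \(\gamma > 0\) are tuning constants, and \(\tilde{\beta}_j\) an initial consistent estimate):

        \[
        p_\lambda'(\theta) = \lambda \left\{ I(\theta \le \lambda) + \frac{(a\lambda - \theta)_+}{(a - 1)\lambda}\, I(\theta > \lambda) \right\} \quad (\theta > 0),
        \qquad
        p_\lambda^{\text{aLASSO}}(\beta) = \lambda \sum_j \frac{|\beta_j|}{|\tilde{\beta}_j|^{\gamma}} .
        \]

    Both shrink small coefficients exactly to zero while leaving large ones nearly unpenalized, which is what underlies the oracle property claimed for the parametric part.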

  8. Modeling biotic habitat high risk areas

    USGS Publications Warehouse

    Despain, D.G.; Beier, P.; Tate, C.; Durtsche, B.M.; Stephens, T.

    2000-01-01

    Fire, especially stand-replacing fire, poses a threat to many threatened and endangered species as well as their habitat. On the other hand, fire is important in maintaining a variety of successional stages that can be important for many species. We describe an approach to risk assessment to assist in prioritizing areas for the allocation of fire mitigation funds. One example looks at assessing risk to the species and biotic communities of concern tracked by the Colorado Natural Heritage Program. A second looks at the risk to Mexican spotted owls. A third looks at the risk to cutthroat trout, and a fourth considers the general effects of fire on elk.

  9. SMALL POPULATIONS REQUIRE SPECIFIC MODELING APPROACHES FOR ASSESSING RISK

    EPA Science Inventory

    All populations face non-zero risks of extinction. However, the risks for small populations, and therefore the modeling approaches necessary to predict them, are different from those of large populations. These differences are currently hindering assessment of risk to small pop...

  10. Issues in Absolute Spectral Radiometric Calibration: Intercomparison of Eight Sources

    NASA Technical Reports Server (NTRS)

    Goetz, Alexander F. H.; Kindel, Bruce; Pilewskie, Peter

    1998-01-01

    The application of atmospheric models to AVIRIS and other spectral imaging data to derive surface reflectance requires that the sensor output be calibrated to absolute radiance. Uncertainties in absolute calibration are to be expected, and claims of 92% accuracy have been published. Measurements of accurate surface albedos and cloud absorption to be used in radiative balance calculations depend critically on knowing the absolute spectral-radiometric response of the sensor. The Earth Observing System project is implementing a rigorous program of absolute radiometric calibration for all optical sensors. Since a number of imaging instruments that provide output in terms of absolute radiance are calibrated at different sites, it is important to determine the errors that can be expected among calibration sites. Another question exists about the errors in the absolute knowledge of the exoatmospheric spectral solar irradiance.

  11. A new explained-variance based genetic risk score for predictive modeling of disease risk.

    PubMed

    Che, Ronglin; Motsinger-Reif, Alison A

    2012-09-25

    The goal of association mapping is to identify genetic variants that predict disease, and as the field of human genetics matures, the number of successful association studies is increasing. Many such studies have shown that for many diseases, risk is explained by a reasonably large number of variants, each of which explains a very small amount of disease risk. This is prompting the use of genetic risk scores in building predictive models, where information across several variants is combined for predictive modeling. In the current study, we compare the performance of four previously proposed genetic risk score methods and present a new method for constructing a genetic risk score that incorporates explained-variance information. The methods compared include: a simple count Genetic Risk Score, an odds ratio weighted Genetic Risk Score, a direct logistic regression Genetic Risk Score, a polygenic Genetic Risk Score, and the new explained variance weighted Genetic Risk Score. We compare the methods using a wide range of simulations in two steps, with a range of the number of deleterious single nucleotide polymorphisms (SNPs) explaining disease risk, genetic modes, baseline penetrances, sample sizes, relative risks (RR) and minor allele frequencies (MAF). Several measures of model performance were compared, including overall power, C-statistic and Akaike's Information Criterion. Our results show that the relative performance of the methods differs significantly, with the new explained variance weighted GRS (EV-GRS) generally performing favorably compared to the other methods.
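
    The weighting schemes are easy to contrast in code. In the sketch below, the genotypes, odds ratios and allele frequencies are simulated, and the explained-variance weight (2·MAF·(1−MAF)·log(OR)²) is one plausible reading of the paper's idea rather than its exact definition.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n, m = 1000, 14                          # subjects, risk SNPs
    geno = rng.integers(0, 3, size=(n, m))   # risk-allele counts (0, 1 or 2)
    or_snp = rng.uniform(1.05, 1.3, m)       # hypothetical per-allele odds ratios
    maf = rng.uniform(0.1, 0.4, m)           # hypothetical risk-allele frequencies

    # 1) Simple count GRS: unweighted sum of risk alleles.
    grs_count = geno.sum(axis=1)

    # 2) Odds-ratio weighted GRS: each allele weighted by log(OR).
    grs_or = geno @ np.log(or_snp)

    # 3) Explained-variance weighted GRS: weights ~ 2*MAF*(1-MAF)*log(OR)^2,
    #    one plausible reading of "explained variance" (an assumption here).
    w_ev = 2 * maf * (1 - maf) * np.log(or_snp) ** 2
    grs_ev = geno @ (w_ev / w_ev.sum())
    ```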

  12. Singular perturbation of absolute stability.

    NASA Technical Reports Server (NTRS)

    Siljak, D. D.

    1972-01-01

    It was previously shown (author, 1969) that the regions of absolute stability in the parameter space can be determined when the parameters appear on the right-hand side of the system equations, i.e., the regular case. Here, the effect on absolute stability of a small parameter attached to higher derivatives in the equations (the singular case) is studied. The Lur'e-Postnikov class of nonlinear systems is considered.

  13. The modified model of radiation risk at radon exposure.

    PubMed

    Zhukovsky, Michael; Demin, Vladimir; Yarmoshenko, Ilia

    2014-07-01

    A combined, modified model of risk assessment for indoor radon exposure is proposed. A multiplicative dependence for fatal lung cancer risk is used. The model has been developed on the basis of modern health risk theory and the results of epidemiological studies, with special attention to the results of the European combined study and the WISMUT miners cohort study. The model is presented as an age-specific relative risk coefficient for a single (short-term) exposure. The risk coefficient for an extended exposure can be obtained from this coefficient in accordance with risk theory. Smoothed dependences of the risk coefficients on time since exposure, attained age, and radon progeny concentration are suggested.

  14. Uses and Abuses of Models in Radiation Risk Management

    SciTech Connect

    Strom, Daniel J.

    1998-12-10

    This paper is a high-level overview of managing risks to workers, public, and the environment. It discusses the difference between a model and a hypothesis. The need for models in risk assessment is justified, and then it is shown that radiation risk models that are useable in risk management are highly simplistic. The weight of evidence is considered for and against the linear non-threshold (LNT) model for carcinogenesis and heritable ill-health that is currently the basis for radiation risk management. Finally, uses and misuses of this model are considered. It is concluded that the LNT model continues to be suitable for use as the basis for radiation protection.

  15. Risk management modeling and its application in maritime safety

    NASA Astrophysics Data System (ADS)

    Qin, Ting-Rong; Chen, Wei-Jiong; Zeng, Xiang-Kun

    2008-12-01

    Quantified risk assessment (QRA) requires the mathematization of risk theory. However, attention has been paid almost exclusively to applications of assessment methods, which has led to neglect of research into fundamental theories, such as the relationships among risk, safety, and danger. To address this problem, fundamental theoretical relationships of risk and risk management were first analyzed mathematically and illustrated with charts. Second, man-machine-environment-management (MMEM) theory was introduced into risk theory to analyze some properties of risk. On this basis, a three-dimensional model of risk management was established that includes a goal dimension, a management dimension, and an operation dimension. This goal-management-operation (GMO) model was explained, with emphasis on the risk flowchart (operation dimension), which lays the groundwork for further study of risk management and qualitative and quantitative assessment. Next, the relationship between Formal Safety Assessment (FSA) and risk management was examined, revealing that the FSA method, which the International Maritime Organization (IMO) is actively promoting, derives from risk management theory. Finally, conclusions were drawn about how to apply this risk management method to concrete fields efficiently and conveniently, as well as areas where further research is required.

  16. A risk analysis model for radioactive wastes.

    PubMed

    Külahcı, Fatih

    2011-07-15

    Hazardous wastes affect natural environmental systems to a significant extent, and it is therefore necessary to control their harm through risk analysis. Herein, an effective risk methodology is proposed by considering their uncertain behaviors on stochastic, statistical and probabilistic bases. The basic element is the attachment of a convenient probability distribution function (pdf) to a given waste quality measurement sequence. In this paper, (40)K contaminant measurements are adapted for a risk assessment application after derivation of the necessary fundamental formulations. The spatial contaminant distribution of (40)K is presented in the form of maps and three-dimensional surfaces.
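
    The core step, attaching a pdf to a measurement sequence and reading risk off its tail, can be sketched as follows; the lognormal choice, the synthetic activities and the 400 Bq/kg threshold are all assumptions for illustration.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(3)
    activity = rng.lognormal(mean=5.5, sigma=0.4, size=200)   # synthetic 40K data, Bq/kg

    # Attach a convenient pdf (lognormal assumed) to the measurement sequence.
    shape, loc, scale = stats.lognorm.fit(activity, floc=0.0)

    # Risk of exceeding an (assumed) threshold = upper-tail probability of the pdf.
    threshold = 400.0
    print(f"P(activity > {threshold} Bq/kg) = "
          f"{stats.lognorm.sf(threshold, shape, loc=loc, scale=scale):.3f}")
    ```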

  17. Building risk-on-a-chip models to improve breast cancer risk assessment and prevention

    PubMed Central

    Vidi, Pierre-Alexandre; Leary, James; Lelièvre, Sophie A.

    2013-01-01

    Preventive actions for chronic diseases hold the promise of improving lives and reducing healthcare costs. For several diseases, including breast cancer, multiple risk and protective factors have been identified by epidemiologists. The impact of most of these factors has yet to be fully understood at the organism, tissue, cellular and molecular levels. Importantly, combinations of external and internal risk and protective factors involve cooperativity, thus synergizing or antagonizing disease onset. Models are needed to mechanistically decipher cancer risks under defined cellular and microenvironmental conditions. Here, we briefly review breast cancer risk models based on 3D cell culture and propose to improve risk modeling with lab-on-a-chip approaches. We suggest epithelial tissue polarity, DNA repair and epigenetic profiles as endpoints in risk assessment models and discuss the development of 'risks-on-chips' integrating biosensors of these endpoints and of general tissue homeostasis. Risks-on-chips will help identify biomarkers of risk, serve as screening platforms for cancer preventive agents, and provide a better understanding of risk mechanisms, resulting in novel developments in disease prevention. PMID:23681255

  18. Estimation of value at risk and conditional value at risk using normal mixture distributions model

    NASA Astrophysics Data System (ADS)

    Kamaruzzaman, Zetty Ain; Isa, Zaidi

    2013-04-01

    The normal mixture distributions model has been successfully applied in financial time series analysis. In this paper, we estimate the return distribution, value at risk (VaR) and conditional value at risk (CVaR) for monthly and weekly rates of return of the FTSE Bursa Malaysia Kuala Lumpur Composite Index (FBMKLCI) from July 1990 until July 2010 using a two-component univariate normal mixture distributions model. First, we present the application of the normal mixture distributions model in empirical finance, where we fit our real data. Second, we present its application in risk analysis, where we apply the model to evaluate VaR and CVaR, with model validation for both risk measures. The empirical results provide evidence that the two-component normal mixture distributions model fits the data well and performs better in estimating VaR and CVaR, capturing the stylized facts of non-normality and leptokurtosis in the returns distribution.
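
    A minimal version of the workflow, fitting a two-component normal mixture to returns and estimating VaR and CVaR from it, might look like the sketch below; synthetic returns replace the FBMKLCI series and the 99% level is arbitrary.

    ```python
    import numpy as np
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(1)
    # Synthetic stand-in for the FBMKLCI return series used in the paper.
    returns = np.concatenate([rng.normal(0.01, 0.03, 800),
                              rng.normal(-0.02, 0.08, 200)])

    gm = GaussianMixture(n_components=2, random_state=0).fit(returns.reshape(-1, 1))

    # Monte Carlo VaR/CVaR at the 99% level from the fitted mixture.
    sims = gm.sample(100_000)[0].ravel()
    q = np.quantile(sims, 0.01)             # 1% quantile of simulated returns
    var_99 = -q                             # VaR: loss not exceeded with 99% prob.
    cvar_99 = -sims[sims <= q].mean()       # CVaR: expected loss beyond VaR
    print(f"VaR(99%) = {var_99:.4f}  CVaR(99%) = {cvar_99:.4f}")
    ```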

  19. Source-to-Outcome Microbial Exposure and Risk Modeling Framework

    EPA Science Inventory

    A Quantitative Microbial Risk Assessment (QMRA) is a computer-based data-delivery and modeling approach that integrates interdisciplinary fate/transport, exposure, and impact models and databases to characterize potential health impacts/risks due to pathogens. As such, a QMRA ex...

  20. A Process Model for Assessing Adolescent Risk for Suicide.

    ERIC Educational Resources Information Center

    Stoelb, Matt; Chiriboga, Jennifer

    1998-01-01

    This comprehensive assessment process model includes primary, secondary, and situational risk factors and their combined implications and significance in determining an adolescent's level of risk for suicide. Empirical data and clinical intuition are integrated to form a working client model that guides the professional in continuously reassessing…

  1. A Team Mental Model Perspective of Pre-Quantitative Risk

    NASA Technical Reports Server (NTRS)

    Cooper, Lynne P.

    2011-01-01

    This study was conducted to better understand how teams conceptualize risk before it can be quantified, and the processes by which a team forms a shared mental model of this pre-quantitative risk. Using an extreme case, this study analyzes seven months of team meeting transcripts, covering the entire lifetime of the team. Through an analysis of team discussions, a rich and varied structural model of risk emerges that goes significantly beyond classical representations of risk as the product of a negative consequence and a probability. In addition to those two fundamental components, the team conceptualization includes the ability to influence outcomes and probabilities, networks of goals, interaction effects, and qualitative judgments about the acceptability of risk, all affected by associated uncertainties. In moving from individual to team mental models, team members employ a number of strategies to gain group recognition of risks and to resolve or accept differences.

  2. Applying Four Different Risk Models in Local Ore Selection

    SciTech Connect

    Richmond, Andrew

    2002-12-15

    Given the uncertainty in grade at a mine location, a financially risk-averse decision-maker may prefer to incorporate this uncertainty into the ore selection process. A FORTRAN program, risksel, is presented to calculate local risk-adjusted optimal ore selections using a negative exponential utility function and three dominance models: mean-variance, mean-downside risk, and stochastic dominance. All four methods are demonstrated in a grade control environment. In the case study, optimal selections vary with the magnitude of financial risk that a decision-maker is prepared to accept. Except for the stochastic dominance method, the risk models reassign material from higher cost to lower cost processing options as the aversion to financial risk increases. The stochastic dominance model was usually unable to determine the optimal local selection.
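
    The negative exponential utility criterion is simple to reproduce: for each processing option, average the utility of simulated profits and pick the option with the highest expected utility. The profit models and risk-aversion value below are invented for illustration.

    ```python
    import numpy as np

    # Toy risk-adjusted ore selection over conditionally simulated grades.
    rng = np.random.default_rng(5)
    grade_sims = rng.normal(1.2, 0.4, 1_000)            # g/t, simulated grades
    options = {"mill": 5.0 * grade_sims - 4.0,           # invented profit models
               "leach": 3.0 * grade_sims - 1.5,
               "waste": np.zeros_like(grade_sims)}

    risk_aversion = 0.8  # larger = more risk-averse
    for name, profit in options.items():
        eu = -np.exp(-risk_aversion * profit).mean()     # expected utility (higher = preferred)
        print(f"{name}: E[U] = {eu:.3f}")
    ```

    With these numbers the lower-variance "leach" option wins even though "mill" has a slightly lower mean profit margin per unit of variance, which is exactly the behavior a risk-averse utility function is meant to produce.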

  3. Quantified Risk Ranking Model for Condition-Based Risk and Reliability Centered Maintenance

    NASA Astrophysics Data System (ADS)

    Chattopadhyaya, Pradip Kumar; Basu, Sushil Kumar; Majumdar, Manik Chandra

    2016-03-01

    In the recent past, the risk and reliability centered maintenance (RRCM) framework was introduced with a shift in methodological focus from reliability and probabilities (expected values) to reliability, uncertainty and risk. In this paper the authors explain a novel methodology for quantifying risk and ranking critical items in order to prioritize maintenance actions on the basis of condition-based risk and reliability centered maintenance (CBRRCM). The critical items are identified through criticality analysis of the RPN values of the items of a system, and the maintenance significant precipitating factors (MSPF) of the items are evaluated. The criticality of risk is assessed using three risk coefficients. The likelihood risk coefficient treats the probability as a fuzzy number. The abstract risk coefficient deduces risk influenced by uncertainty and sensitivity, besides other factors. The third, the hazardous risk coefficient, reflects anticipated hazards that may occur in the future; its risk is deduced from criteria of consequences on safety, environment, maintenance and economics, with corresponding costs for the consequences. Characteristic values of all three risk coefficients are obtained with a particular test. With a few more tests on the system, the values may change significantly within the controlling range of each coefficient; hence random number simulation is used to obtain one distinctive value for each coefficient. The risk coefficients are statistically added to obtain the final risk coefficient of each critical item, and the final rankings of the critical items are then estimated. The prioritized ranking of critical items using the developed mathematical model for risk assessment is useful for optimizing financial losses and the timing of maintenance actions.
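
    A toy version of the final aggregation step might look like the following, where each coefficient's "distinctive value" is obtained by random-number simulation over an assumed controlling range and the three values are then summed; the item names, ranges and additive rule are placeholders rather than the paper's exact formulation.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    # Hypothetical (lo, hi) controlling ranges per item for the three
    # coefficients; the paper derives these from tests and fuzzy numbers.
    items = {
        "bearing":  [(0.2, 0.5), (0.1, 0.3), (0.3, 0.6)],   # likelihood, abstract, hazardous
        "gearbox":  [(0.4, 0.7), (0.2, 0.4), (0.2, 0.5)],
        "coupling": [(0.1, 0.3), (0.1, 0.2), (0.1, 0.4)],
    }

    def distinctive_value(lo, hi, n=10_000):
        """One representative value per coefficient via random-number simulation."""
        return rng.uniform(lo, hi, n).mean()

    # "Statistically added" is read here as a plain sum (an assumption).
    final = {name: sum(distinctive_value(lo, hi) for lo, hi in ranges)
             for name, ranges in items.items()}

    for name, score in sorted(final.items(), key=lambda kv: -kv[1]):
        print(f"{name}: final risk coefficient = {score:.3f}")
    ```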

  4. A comprehensive Network Security Risk Model for process control networks.

    PubMed

    Henry, Matthew H; Haimes, Yacov Y

    2009-02-01

    The risk of cyber attacks on process control networks (PCN) is receiving significant attention due to the potentially catastrophic extent to which PCN failures can damage the infrastructures and commodity flows that they support. Risk management addresses the coupled problems of (1) reducing the likelihood that cyber attacks would succeed in disrupting PCN operation and (2) reducing the severity of consequences in the event of PCN failure or manipulation. The Network Security Risk Model (NSRM) developed in this article provides a means of evaluating the efficacy of candidate risk management policies by modeling the baseline risk and assessing expectations of risk after the implementation of candidate measures. Where existing risk models fall short of providing adequate insight into the efficacy of candidate risk management policies due to shortcomings in their structure or formulation, the NSRM provides model structure and an associated modeling methodology that captures the relevant dynamics of cyber attacks on PCN for risk analysis. This article develops the NSRM in detail in the context of an illustrative example. PMID:19000078

  6. Absolute flux scale for radioastronomy

    SciTech Connect

    Ivanov, V.P.; Stankevich, K.S.

    1986-07-01

    The authors propose and provide support for a new absolute flux scale for radio astronomy which is not encumbered with the inadequacies of the previous scales. In constructing it, the method of relative spectra (a powerful tool for choosing reference spectra) was used. A review is given of previous flux scales, and the authors compare the AIS scale with the scale they propose. Both scales are based on absolute measurements by the "artificial moon" method, and they are practically coincident in the range from 0.96 to 6 GHz. At frequencies above 6 GHz and below 0.96 GHz, the AIS scale is overestimated because of incorrect extrapolation of the spectra of the primary and secondary standards. The major results which have emerged from this review of absolute scales in radio astronomy are summarized.

  7. Absolute Humidity and the Seasonality of Influenza (Invited)

    NASA Astrophysics Data System (ADS)

    Shaman, J. L.; Pitzer, V.; Viboud, C.; Grenfell, B.; Goldstein, E.; Lipsitch, M.

    2010-12-01

    Much of the observed wintertime increase of mortality in temperate regions is attributed to seasonal influenza. A recent re-analysis of laboratory experiments indicates that absolute humidity strongly modulates the airborne survival and transmission of the influenza virus. Here we show that the onset of increased wintertime influenza-related mortality in the United States is associated with anomalously low absolute humidity levels during the prior weeks. We then use an epidemiological model, in which observed absolute humidity conditions temper influenza transmission rates, to successfully simulate the seasonal cycle of observed influenza-related mortality. The model results indicate that direct modulation of influenza transmissibility by absolute humidity alone is sufficient to produce this observed seasonality. These findings provide epidemiological support for the hypothesis that absolute humidity drives seasonal variations of influenza transmission in temperate regions. In addition, we show that variations of the basic and effective reproductive numbers for influenza, caused by seasonal changes in absolute humidity, are consistent with the general timing of pandemic influenza outbreaks observed for 2009 A/H1N1 in temperate regions. Indeed, absolute humidity conditions correctly identify the region of the United States vulnerable to a third, wintertime wave of pandemic influenza. These findings suggest that the timing of pandemic influenza outbreaks is controlled by a combination of absolute humidity conditions, levels of susceptibility and changes in population mixing and contact rates.
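
    A compact way to see the mechanism is an SIRS model whose transmission rate is driven by specific humidity. In the sketch below, the exponential R0(q) form follows the shape described above, but the constants, the toy humidity cycle and the population parameters are assumptions.

    ```python
    import numpy as np
    from scipy.integrate import odeint

    # Humidity-forced SIRS sketch; all constants are illustrative.
    N, L, D = 500_000, 4 * 365.0, 4.0        # population, immunity (d), infection (d)
    R0_max, R0_min, a = 2.2, 1.2, -180.0

    def q(t):                                # toy specific humidity (kg/kg), low in winter
        return 0.010 - 0.006 * np.cos(2 * np.pi * t / 365.0)

    def beta(t):                             # transmission rate from humidity-driven R0
        return (np.exp(a * q(t) + np.log(R0_max - R0_min)) + R0_min) / D

    def sirs(y, t):
        S, I = y
        new_inf = beta(t) * S * I / N
        return [(N - S - I) / L - new_inf, new_inf - I / D]

    t = np.arange(0.0, 5 * 365.0)
    S, I = odeint(sirs, [0.9 * N, 100.0], t).T   # infections peak when q(t) is low
    ```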

  8. The Integrated Medical Model: Statistical Forecasting of Risks to Crew Health and Mission Success

    NASA Technical Reports Server (NTRS)

    Fitts, M. A.; Kerstman, E.; Butler, D. J.; Walton, M. E.; Minard, C. G.; Saile, L. G.; Toy, S.; Myers, J.

    2008-01-01

    The Integrated Medical Model (IMM) helps capture and use organizational knowledge across the space medicine, training, operations, engineering, and research domains. The IMM uses this domain knowledge in the context of a mission and crew profile to forecast crew health and mission success risks. The IMM is most helpful in comparing the risk of two or more mission profiles, not as a tool for predicting absolute risk. The process of building the IMM adheres to Probabilistic Risk Assessment (PRA) techniques described in NASA Procedural Requirement (NPR) 8705.5, and uses current evidence-based information to establish a defensible position for making decisions that help ensure crew health and mission success. The IMM quantitatively describes the following input parameters: 1) medical conditions and likelihood, 2) mission duration, 3) vehicle environment, 4) crew attributes (e.g., age, sex), 5) crew activities (e.g., EVAs, lunar excursions), 6) diagnosis and treatment protocols (e.g., medical equipment, consumables, pharmaceuticals), and 7) Crew Medical Officer (CMO) training effectiveness. It is worth reiterating that the IMM uses the data sets above as inputs. Many other risk management efforts stop at determining only likelihood. The IMM is unique in that it models not only likelihood, but risk mitigations, as well as subsequent clinical outcomes based on those mitigations. Once the mathematical relationships among the above parameters are established, the IMM uses a Monte Carlo simulation technique (a random sampling of the inputs as described by their statistical distributions) to determine the probable outcomes. Because the IMM is a stochastic model (i.e., the input parameters are represented by various statistical distributions depending on the data type), when the mission is simulated 10-50,000 times with a given set of medical capabilities (risk mitigations), a prediction of the most probable outcomes can be generated. For each mission, the IMM tracks which conditions
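
    In the same spirit, a heavily simplified Monte Carlo mission simulation is sketched below; the two conditions, their incidence rates and mitigation probabilities are invented, standing in for the IMM's evidence-based inputs.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    n_missions = 50_000          # simulated missions (the IMM runs 10-50,000)
    mission_days = 180.0

    # Hypothetical medical conditions: daily incidence and probability that
    # onboard capability mitigates a case (both invented for this sketch).
    conditions = {
        "kidney stone":   {"rate_per_day": 1e-5, "p_mitigated": 0.7},
        "dental abscess": {"rate_per_day": 3e-5, "p_mitigated": 0.9},
    }

    evac = np.zeros(n_missions, dtype=bool)
    for c in conditions.values():
        n_cases = rng.poisson(c["rate_per_day"] * mission_days, n_missions)
        unmitigated = rng.binomial(n_cases, 1.0 - c["p_mitigated"])
        evac |= unmitigated > 0   # any unmitigated case threatens the mission

    print(f"P(medical evacuation) ~ {evac.mean():.4%}")
    ```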

  9. Ecological risk assessment of water environment for Luanhe River Basin based on relative risk model.

    PubMed

    Liu, Jingling; Chen, Qiuying; Li, Yongli

    2010-11-01

    The relative risk model (RRM) has been applied successfully in regional ecological risk assessments. In this study, the RRM was developed further by expanding the risk source data and introducing the source-stressor-habitat exposure filter (SSH), the endpoint-habitat exposure filter (EH) and the stressor-endpoint effect filter (SE) to make the meaning of exposure and effect more explicit. The water environment, comprising water quality, water quantity and aquatic ecosystems, was selected to define the ecological risk assessment endpoints. The Luanhe River Basin, located in North China, was selected as the model case. The results showed that there were three low risk regions, one medium risk region and two high risk regions in the Luanhe River Basin. The results also indicated that habitat destruction was the largest stressor for the Luanhe water environment, with a risk score as high as 11.87; the second was oxygen-consuming organic pollutants (9.28) and the third was nutrients (7.78). These three stressors were thus the main factors in the ecological pressure on the study area. Furthermore, animal husbandry was the biggest source, with a risk score as high as 20.38; the second was domestic sewage (14.00), and the third was polluting industry (9.96). Among habitats, waters and farmland endure the greatest pressure and should receive considerable attention. Water quality deterioration and damage to ecological service values face the biggest risk pressure, followed by decreased biodiversity and landscape fragmentation. PMID:20683654
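
    The arithmetic core of an RRM-style calculation, rank scores multiplied through exposure/effect filters and summed per region, can be sketched as follows; the ranks and filter values are illustrative, not those of the Luanhe study.

    ```python
    # Toy relative risk model for one region: source and habitat rank scores
    # are multiplied through a binary exposure filter and summed.
    source_rank = {"animal husbandry": 6, "domestic sewage": 4}
    habitat_rank = {"waters": 6, "farmland": 4}

    # SSH-style filter: does the source deliver a stressor to the habitat? (0/1)
    ssh = {("animal husbandry", "waters"): 1, ("animal husbandry", "farmland"): 1,
           ("domestic sewage", "waters"): 1, ("domestic sewage", "farmland"): 0}

    score = sum(source_rank[s] * habitat_rank[h] * f for (s, h), f in ssh.items())
    print(f"relative risk score for the region: {score}")
    ```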

  10. Assessing Academic Risk of Student-Athletes: Applicability of the NCAA Graduation Risk Overview Model to GPA

    ERIC Educational Resources Information Center

    Johnson, James

    2013-01-01

    In an effort to standardize academic risk assessment, the NCAA developed the graduation risk overview (GRO) model. Although this model was designed to assess graduation risk, its ability to predict grade-point average (GPA) remained unknown. Therefore, 134 individual risk assessments were made to determine GRO model effectiveness in the…

  11. Modeling extinction risk of endemic birds of mainland China.

    PubMed

    Chen, Youhua

    2013-01-01

    The extinction risk of endemic birds of mainland China was modeled over evolutionary time. Results showed that the extinction risk of endemic birds in mainland China always tended to be similar within subclades over the evolutionary time of species divergence, and the overall evolution of extinction risk presented a conservatism pattern, as evidenced by the disparity-through-time plot. A constant-rate evolutionary model was the best model for quantifying the evolution of extinction risk of the endemic birds of mainland China; thus, there was no rate-shifting pattern in the evolution of extinction risk of Chinese endemic birds over time. In summary, the extinction risk of endemic birds of mainland China is systematically quantified under an evolutionary framework in the present work.

  12. Risk models and scores for type 2 diabetes: systematic review

    PubMed Central

    Mathur, Rohini; Dent, Tom; Meads, Catherine; Greenhalgh, Trisha

    2011-01-01

    Objective To evaluate current risk models and scores for type 2 diabetes and inform selection and implementation of these in practice. Design Systematic review using standard (quantitative) and realist (mainly qualitative) methodology. Inclusion criteria Papers in any language describing the development or external validation, or both, of models and scores to predict the risk of an adult developing type 2 diabetes. Data sources Medline, PreMedline, Embase, and Cochrane databases were searched. Included studies were citation tracked in Google Scholar to identify follow-on studies of usability or impact. Data extraction Data were extracted on statistical properties of models, details of internal or external validation, and use of risk scores beyond the studies that developed them. Quantitative data were tabulated to compare model components and statistical properties. Qualitative data were analysed thematically to identify mechanisms by which use of the risk model or score might improve patient outcomes. Results 8864 titles were scanned, 115 full text papers considered, and 43 papers included in the final sample. These described the prospective development or validation, or both, of 145 risk prediction models and scores, 94 of which were studied in detail here. They had been tested on 6.88 million participants followed for up to 28 years. Heterogeneity of primary studies precluded meta-analysis. Some but not all risk models or scores had robust statistical properties (for example, good discrimination and calibration) and had been externally validated on a different population. Genetic markers added nothing to models over clinical and sociodemographic factors. Most authors described their score as “simple” or “easily implemented,” although few were specific about the intended users and under what circumstances. Ten mechanisms were identified by which measuring diabetes risk might improve outcomes. Follow-on studies that applied a risk score as part of an

  13. Latent Model Analysis of Substance Use and HIV Risk Behaviors among High-Risk Minority Adults

    ERIC Educational Resources Information Center

    Wang, Min Qi; Matthew, Resa F.; Chiu, Yu-Wen; Yan, Fang; Bellamy, Nikki D.

    2007-01-01

    Objectives: This study evaluated substance use and HIV risk profile using a latent model analysis based on ecological theory, inclusive of a risk and protective factor framework, in sexually active minority adults (N=1,056) who participated in a federally funded substance abuse and HIV prevention health initiative from 2002 to 2006. Methods: Data…

  14. A Measurement Model of Women’s Behavioral Risk Taking

    PubMed Central

    VanZile-Tamsen, Carol; Testa, Maria; Livingston, Jennifer A.; Harlow, Lisa L.

    2009-01-01

    The current study was designed to gain a better understanding of the nature of the relationship between substance use and sexual risk taking within a community sample of women (N = 1,004). Using confirmatory factor analysis, the authors examined the factor structure of sexual risk behaviors and substance use to determine whether they are best conceptualized as domains underlying a single, higher order, risk-taking propensity. A 2 higher order factor model (sexual risk behavior and substance use) provided the best fit to the data, suggesting that these 2 general risk domains are correlated but independent factors. Sensation seeking had large general direct effects on the 2 risk domains and large indirect effects on the 4 first-order factors and the individual indicators. Negative affect had smaller, yet still significant, effects. Impulsivity and anxiety were unrelated to sexual health risk domains. PMID:16569118

  15. National Veterans Health Administration inpatient risk stratification models for hospital-acquired acute kidney injury

    PubMed Central

    Cronin, Robert M; VanHouten, Jacob P; Siew, Edward D; Eden, Svetlana K; Fihn, Stephan D; Nielson, Christopher D; Peterson, Josh F; Baker, Clifton R; Ikizler, T Alp; Speroff, Theodore

    2015-01-01

    Objective Hospital-acquired acute kidney injury (HA-AKI) is a potentially preventable cause of morbidity and mortality. Identifying high-risk patients prior to the onset of kidney injury is a key step towards AKI prevention. Materials and Methods A national retrospective cohort of 1,620,898 patient hospitalizations from 116 Veterans Affairs hospitals was assembled from electronic health record (EHR) data collected from 2003 to 2012. HA-AKI was defined at stage 1+, stage 2+, and dialysis. EHR-based predictors were identified through logistic regression, least absolute shrinkage and selection operator (lasso) regression, and random forests, and pair-wise comparisons between each were made. Calibration and discrimination metrics were calculated using 50 bootstrap iterations. In the final models, we report odds ratios, 95% confidence intervals, and importance rankings for predictor variables to evaluate their significance. Results The area under the receiver operating characteristic curve (AUC) for the different model outcomes ranged from 0.746 to 0.758 in stage 1+, 0.714 to 0.720 in stage 2+, and 0.823 to 0.825 in dialysis. Logistic regression had the best AUC in stage 1+ and dialysis. Random forests had the best AUC in stage 2+ but the least favorable calibration plots. Multiple risk factors were significant in our models, including some nonsteroidal anti-inflammatory drugs, blood pressure medications, antibiotics, and intravenous fluids given during the first 48 h of admission. Conclusions This study demonstrated that, although all the models tested had good discrimination, performance characteristics varied between methods, and the random forests models did not calibrate as well as the lasso or logistic regression models. In addition, novel modifiable risk factors were explored and found to be significant. PMID:26104740
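
    The model comparison itself is straightforward to reproduce on synthetic data: fit an L1-penalized (lasso-style) logistic regression and a random forest, then compare AUCs on held-out data. The simulated cohort below merely stands in for the EHR data.

    ```python
    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import train_test_split

    # Stand-in for the EHR cohort: rare binary outcome, many candidate predictors.
    X, y = make_classification(n_samples=20_000, n_features=50, n_informative=10,
                               weights=[0.9], random_state=0)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

    models = {
        "lasso logistic": LogisticRegression(penalty="l1", solver="liblinear", C=0.1),
        "random forest": RandomForestClassifier(n_estimators=200, random_state=0),
    }
    for name, m in models.items():
        m.fit(X_tr, y_tr)
        auc = roc_auc_score(y_te, m.predict_proba(X_te)[:, 1])
        print(f"{name}: AUC = {auc:.3f}")
    ```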

  16. Radiation risk models for all solid cancers other than those types of cancer requiring individual assessments after a nuclear accident.

    PubMed

    Walsh, Linda; Zhang, Wei

    2016-03-01

    In the assessment of health risks after nuclear accidents, some health consequences require special attention. For example, in their 2013 report on health risk assessment after the Fukushima nuclear accident, the World Health Organisation (WHO) panel of experts considered risks of breast cancer, thyroid cancer and leukaemia. For these specific cancer types, use was made of already published excess relative risk (ERR) and excess absolute risk (EAR) models for radiation-related cancer incidence fitted to the epidemiological data from the Japanese A-bomb Life Span Study (LSS). However, it was also considered important to assess all other types of solid cancer together and the WHO, in their above-mentioned report, stated "No model to calculate the risk for all other solid cancer excluding breast and thyroid cancer risks is available from the LSS data". Applying the LSS models for all solid cancers along with the models for the specific sites means that some cancers have an overlap in the risk evaluations. Thus, calculating the total solid cancer risk plus the breast cancer risk plus the thyroid cancer risk can overestimate the total risk by several per cent. Therefore, the purpose of this paper was to publish the required models for all other solid cancers, i.e. all solid cancers other than those types of cancer requiring special attention after a nuclear accident. The new models presented here have been fitted to the same LSS data set from which the risks provided by the WHO were derived. Although it is known already that the EAR and ERR effect modifications by sex are statistically significant for the outcome "all solid cancer", it is shown here that sex modification is not statistically significant for the outcome "all solid cancer other than thyroid and breast cancer". It is also shown here that the sex-averaged solid cancer risks with and without the sex modification are very similar once breast and thyroid cancers are factored out. Some other notable model

  18. Global flood risk modelling and its applications for disaster risk reduction

    NASA Astrophysics Data System (ADS)

    Jongman, Brenden; Winsemius, Hessel; Bierkens, Marc; Bouwman, Arno; van Beek, Rens; Ligtvoet, Willem; Ward, Philip

    2014-05-01

    Flooding of river systems is the most costly natural hazard affecting societies around the world, with an average of US$55 billion in direct losses and 4,500 fatalities each year between 1990 and 2012. The accurate and consistent assessment of flood risk on a global scale is essential for international development organizations and the reinsurance industry, and for enhancing our understanding of climate change impacts. This need is especially felt in developing countries, where local data and models are largely unavailable, and where flood risk is increasing rapidly under strong population growth and economic development. Here we present ongoing applications of high-resolution flood risk modelling at a global scale. The work is based on GLOFRIS, a modelling chain that produces flood risk maps at a 1 km spatial resolution for the entire globe, under a range of climate and socioeconomic scenarios and various past and future time periods. This modelling chain combines a hydrological inundation model with socioeconomic datasets to assess past, current and future population exposure, economic damages, and agricultural risk. These tools are currently applied scientifically to gain insights into geographical patterns in current risk, and to assess the effects of possible future scenarios under climate change and climate variability. In this presentation we show recent applications from the global scale to national scales. The global-scale applications include global risk profiling for the reinsurance industry and novel estimation of global flood mortality risk. In addition, we demonstrate how the global flood modelling approach was successfully applied to assess disaster risk reduction priorities on a national scale in Africa. Finally, we indicate how these global modelling tools can be used to quantify the costs and benefits of adaptation, and explore pathways for development under a changing environment.

  19. Relativistic Absolutism in Moral Education.

    ERIC Educational Resources Information Center

    Vogt, W. Paul

    1982-01-01

    Discusses Emile Durkheim's "Moral Education: A Study in the Theory and Application of the Sociology of Education," which holds that morally healthy societies may vary in culture and organization but must possess absolute rules of moral behavior. Compares this moral theory with current theory and practice of American educators. (MJL)

  20. Absolute transition probabilities of phosphorus.

    NASA Technical Reports Server (NTRS)

    Miller, M. H.; Roig, R. A.; Bengtson, R. D.

    1971-01-01

    Use of a gas-driven shock tube to measure the absolute strengths of 21 P I lines and 126 P II lines (from 3300 to 6900 A). Accuracy for prominent, isolated neutral and ionic lines is estimated to be 28 to 40% and 18 to 30%, respectively. The data and the corresponding theoretical predictions are examined for conformity with the sum rules.

  1. The OPTIONS model of sexual risk assessment for adolescents.

    PubMed

    Lusczakoski, Kathryn D; Rue, Lisa A

    2012-03-01

    Clinical evaluations of adolescents' sexual risk are typically based on inquiries about past sexual activity, which is limited in that it does not capture an adolescent's cognitive decision making regarding past sexual decisions. This study describes the novel OPTIONS framework for assessing adolescent sexual risk, which includes three general categories of risk (primary, secondary, and tertiary) and is designed to overcome the limitations of action-based assessment of risk and improve practitioners' ability to assess levels of sexual risk. A convenience sample of 201 older adolescents (18-19 years of age) completed an online version of the Relationship Options Survey (ROS), designed to measure the OPTIONS sexual risk assessment. Bivariate correlations among the subscales functioned in the hypothesized manner, with all correlations being statistically significant. Using the OPTIONS model, 22.4% of participants were classified as high risk primary, 7.0% as high risk secondary, and 27.4% as high risk tertiary. The study provides preliminary evidence for the OPTIONS model of sexual risk assessment, which offers a more tailored evaluation by including cognitive decisions regarding an adolescent's sexual actions.

  2. Student Choices: Using a Competing Risks Model of Survival Analysis.

    ERIC Educational Resources Information Center

    Denson, Katy; Schumacker, Randall E.

    By using a competing risks model, survival analysis methods can be extended to predict which of several mutually exclusive outcomes students will choose based on predictor variables, thereby ascertaining if the profile of risk differs across groups. The paper begins with a brief introduction to logistic regression and some of the basic concepts of…

  3. Risk Prediction Models for Other Cancers or Multiple Sites

    Cancer.gov

    Developing statistical models that estimate the probability of developing other cancers, or cancer at multiple sites, over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling on behavioral changes to decrease risk.

  4. Back-end Science Model Integration for Ecological Risk Assessment

    EPA Science Inventory

    The U.S. Environmental Protection Agency (USEPA) relies on a number of ecological risk assessment models that have been developed over 30-plus years of regulating pesticide exposure and risks under Federal Insecticide, Fungicide, and Rodenticide Act (FIFRA) and the Endangered Spe...

  6. Risk prediction models for hepatocellular carcinoma in different populations

    PubMed Central

    Ma, Xiao; Yang, Yang; Tu, Hong; Gao, Jing; Tan, Yu-Ting; Zheng, Jia-Li; Bray, Freddie; Xiang, Yong-Bing

    2016-01-01

    Hepatocellular carcinoma (HCC) is a malignant disease with limited therapeutic options due to its aggressive progression, and treating HCC patients places a heavy burden on most low- and middle-income countries. Accurate HCC risk prediction can now help in making decisions on the need for HCC surveillance and antiviral therapy. HCC risk prediction models based on the major risk factors of HCC are useful in providing adequate surveillance strategies to individuals with different risk levels. Several risk prediction models for estimating HCC incidence in cohorts of different populations have been presented recently, using simple, efficient, and ready-to-use parameters. Moreover, using predictive scoring systems to assess HCC development can inform clinical and public health approaches, making them more cost- and effort-effective, and support personalized surveillance programs according to risk stratification. In this review, the features of HCC risk prediction models across different populations are summarized, and the perspectives of HCC risk prediction models are discussed. PMID:27199512

  7. Developing a predictive risk model for first-line antiretroviral therapy failure in South Africa

    PubMed Central

    Rohr, Julia K; Ive, Prudence; Horsburgh, C Robert; Berhanu, Rebecca; Shearer, Kate; Maskew, Mhairi; Long, Lawrence; Sanne, Ian; Bassett, Jean; Ebrahim, Osman; Fox, Matthew P

    2016-01-01

    Introduction A substantial number of patients with HIV in South Africa have failed first-line antiretroviral therapy (ART). Although individual predictors of first-line ART failure have been identified, few studies in resource-limited settings have been large enough for predictive modelling. Understanding the absolute risk of first-line failure is useful for patient monitoring and for effectively targeting limited resources for second-line ART. We developed a predictive model to identify patients at the greatest risk of virologic failure on first-line ART, and to estimate the proportion of patients needing second-line ART over five years on treatment. Methods A cohort of patients aged ≥18 years from nine South African HIV clinics on first-line ART for at least six months were included. Viral load measurements and baseline predictors were obtained from medical records. We used stepwise selection of predictors in accelerated failure-time models to predict virologic failure on first-line ART (two consecutive viral load levels >1000 copies/mL). Multiple imputations were used to assign missing baseline variables. The final model was selected using internal-external cross-validation maximizing model calibration at five years on ART, and model discrimination, measured using Harrell's C-statistic. Model covariates were used to create a predictive score for risk group of ART failure. Results A total of 72,181 patients were included in the analysis, with an average of 21.5 months (IQR: 8.8–41.5) of follow-up time on first-line ART. The final predictive model had a Weibull distribution and the final predictors of virologic failure were men of all ages, young women, nevirapine use in first-line regimen, low baseline CD4 count, high mean corpuscular volume, low haemoglobin, history of TB and missed visits during the first six months on ART. About 24.4% of patients in the highest quintile and 9.4% of patients in the lowest quintile of risk were predicted to experience
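
    As a minimal sketch of the modelling step, the snippet below fits a Weibull AFT model with lifelines and bins patients into risk quintiles by predicted failure probability; lifelines' rossi dataset and the 52-week horizon stand in for the ART cohort and five-year horizon, and the stepwise selection and cross-validation stages are omitted.

    ```python
    import pandas as pd
    from lifelines import WeibullAFTFitter
    from lifelines.datasets import load_rossi

    df = load_rossi()  # stand-in cohort: 'week' = follow-up time, 'arrest' = failure

    aft = WeibullAFTFitter()
    aft.fit(df, duration_col="week", event_col="arrest")

    # Predicted failure probability by t = 52 for each patient, then risk quintiles.
    surv = aft.predict_survival_function(df, times=[52.0])
    p_fail = 1.0 - surv.loc[52.0]
    quintile = pd.qcut(p_fail, 5, labels=False)
    print(p_fail.groupby(quintile).mean())   # mean predicted risk per quintile
    ```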

  9. Value-at-Risk forecasts by a spatiotemporal model in Chinese stock market

    NASA Astrophysics Data System (ADS)

    Gong, Pu; Weng, Yingliang

    2016-01-01

    This paper generalizes a recently proposed spatial autoregressive model and introduces a spatiotemporal model for forecasting stock returns. We support the view that stock returns are affected not only by the absolute values of factors such as firm size, book-to-market ratio and momentum, but also by the relative values of factors such as trading volume ranking and market capitalization ranking in each period. The paper also studies a new method, called the quartile method, for constructing stocks' reference groups. Applying the method empirically to the Shanghai Stock Exchange 50 Index, we compare the daily volatility forecasting performance and the out-of-sample forecasting performance of Value-at-Risk (VaR) estimates from different models. The empirical results show that the spatiotemporal model performs surprisingly well in capturing spatial dependences among individual stocks, and it produces more accurate VaR forecasts than the other three models introduced in the previous literature. Moreover, the findings indicate that both allowing for serial correlation in the disturbances and using time-varying spatial weight matrices can greatly improve the predictive accuracy of a spatial autoregressive model.

  10. Hazard Ranking System and toxicological risk assessment models yield different results

    SciTech Connect

    Dehghani, T.; Sells, G. (CERCLA Site Assessment Div.)

    1993-09-01

    A major goal of the Superfund Site Assessment program is identifying hazardous waste sites that pose unacceptable risks to human health and the environment. To accomplish this, EPA developed the Hazard Ranking System (HRS), a mathematical model used to assess the relative risks associated with actual or potential releases of hazardous wastes from a site. HRS is a scoring system based on factors grouped into three categories--likelihood of release, waste characteristics and targets. Values for the factor categories are multiplied, then normalized to 100 points to obtain a pathway score. Four pathways--groundwater, surface water, air migration and soil exposure--are evaluated and scored. The final HRS score is obtained by combining pathway scores using a root-mean-square method. HRS is intended to be a screening tool for measuring relative, rather than absolute, risk. The Superfund site assessment program usually requires at least two studies of a potential hazardous waste site before it is proposed for listing on the NPL. The initial study, or preliminary assessment (PA), is a limited-scope evaluation based on available historical information and data that can be gathered readily during a site reconnaissance.
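
    The pathway-combination step is a one-liner: a root-mean-square of the four pathway scores. The scores below are hypothetical; the comparison against the 28.50 cutoff reflects the commonly cited HRS threshold for NPL (National Priorities List) listing.

    ```python
    import math

    def hrs_site_score(pathway_scores):
        """Combine HRS pathway scores (each 0-100) by root-mean-square."""
        return math.sqrt(sum(s * s for s in pathway_scores) / len(pathway_scores))

    # Hypothetical scores: groundwater, surface water, air, soil exposure.
    score = hrs_site_score([80.0, 40.0, 10.0, 25.0])
    print(f"HRS = {score:.1f}", "(eligible for NPL)" if score >= 28.50 else "")
    ```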

  11. Submission Form for Peer-Reviewed Cancer Risk Prediction Models

    Cancer.gov

    If you have information about a peer-reviewed cancer risk prediction model that you would like to be considered for inclusion on this list, submit as much information as possible through the form on this page.

  12. Lymphatic filariasis transmission risk map of India, based on a geo-environmental risk model.

    PubMed

    Sabesan, Shanmugavelu; Raju, Konuganti Hari Kishan; Subramanian, Swaminathan; Srivastava, Pradeep Kumar; Jambulingam, Purushothaman

    2013-09-01

    The strategy adopted by the global program to interrupt transmission of lymphatic filariasis (LF) is mass drug administration (MDA) using chemotherapy. India also followed this strategy, introducing MDA in the historically known endemic areas. All other areas, which remained unsurveyed, were presumed to be nonendemic and left without any intervention. Identification of LF transmission risk areas across the entire country has therefore become essential so that they can be targeted for intervention. A geo-environmental risk model (GERM) developed earlier was used to create a filariasis transmission risk map for India. In this model, a Standardized Filariasis Transmission Risk Index (SFTRI, based on geo-environmental risk variables) was used as a predictor of transmission risk. The relationship between SFTRI and the (historically known) endemicity of an area was quantified by logistic regression analysis. The quantified relationship was validated by assessing the filarial antigenemia status of children living in the unsurveyed areas through a ground truth study. A significant positive relationship was observed between SFTRI and the endemicity of an area. Overall, the model prediction of filarial endemic status of districts was correct for 92.8% of the total observations. Thus, among the 190 districts hitherto unsurveyed, as many as 113 districts were predicted to be at risk, and the remaining at no risk. The GERM, developed on a geographic information system (GIS) platform, is useful for LF spatial delimitation on a macrogeographic/regional scale. Furthermore, the risk map developed will be useful to the national LF elimination program by identifying areas at risk for intervention and for undertaking surveillance in no-risk areas.

  13. Constraining the Properties of the Eta Carinae System via 3-D SPH Models of Space-Based Observations: The Absolute Orientation of the Binary Orbit

    NASA Technical Reports Server (NTRS)

    Madura, Thomas I.; Gull, Theodore R.; Owocki, Stanley P.; Okazaki, Atsuo T.; Russell, Christopher M. P.

    2010-01-01

    The extremely massive (> 90 solar masses) and luminous (~ 5 x 10^6 solar luminosities) star Eta Carinae, with its spectacular bipolar "Homunculus" nebula, comprises one of the most remarkable and intensely observed stellar systems in the galaxy. However, many of its underlying physical parameters remain a mystery. Multiwavelength variations observed to occur every 5.54 years are interpreted as being due to the collision of a massive wind from the primary star with the fast, less dense wind of a hot companion star in a highly elliptical (e approx. 0.9) orbit. Using three-dimensional (3-D) Smoothed Particle Hydrodynamics (SPH) simulations of the binary wind-wind collision in Eta Car, together with radiative transfer codes, we compute synthetic spectral images of [Fe III] emission line structures and compare them to existing Hubble Space Telescope/Space Telescope Imaging Spectrograph (HST/STIS) observations. We are thus able, for the first time, to constrain the absolute orientation of the binary orbit on the sky. An orbit with an inclination of i approx. 40 deg, an argument of periapsis omega approx. 255 deg, and a projected orbital axis with a position angle of approx. 312 deg east of north provides the best fit to the observations, implying that the orbital axis is closely aligned in 3-D space with the Homunculus symmetry axis, and that the companion star orbits clockwise on the sky relative to the primary.

  14. Constraining the Properties of the Eta Carinae System via 3-D SPH Models of Space-Based Observations: The Absolute Orientation of the Binary Orbit

    NASA Technical Reports Server (NTRS)

    Madura, Thomas I.; Gull, Theodore R.; Owocki, Stanley P.; Okazaki, Atsuo T.; Russell, Christopher M. P.

    2011-01-01

    The extremely massive (> 90 solar masses) and luminous (~ 5 x 10^6 solar luminosities) star Eta Carinae, with its spectacular bipolar "Homunculus" nebula, comprises one of the most remarkable and intensely observed stellar systems in the Galaxy. However, many of its underlying physical parameters remain unknown. Multiwavelength variations observed to occur every 5.54 years are interpreted as being due to the collision of a massive wind from the primary star with the fast, less dense wind of a hot companion star in a highly elliptical (e approx. 0.9) orbit. Using three-dimensional (3-D) Smoothed Particle Hydrodynamics (SPH) simulations of the binary wind-wind collision, together with radiative transfer codes, we compute synthetic spectral images of [Fe III] emission line structures and compare them to existing Hubble Space Telescope/Space Telescope Imaging Spectrograph (HST/STIS) observations. We are thus able, for the first time, to tightly constrain the absolute orientation of the binary orbit on the sky. An orbit with an inclination of approx. 40 deg, an argument of periapsis omega approx. 255 deg, and a projected orbital axis with a position angle of approx. 312 deg east of north provides the best fit to the observations, implying that the orbital axis is closely aligned in 3-D space with the Homunculus symmetry axis, and that the companion star orbits clockwise on the sky relative to the primary.

  15. A Hybrid Tsunami Risk Model for Japan

    NASA Astrophysics Data System (ADS)

    Haseemkunju, A. V.; Smith, D. F.; Khater, M.; Khemici, O.; Betov, B.; Scott, J.

    2014-12-01

    Around the margins of the Pacific Ocean, denser oceanic plates slipping under continental plates cause subduction earthquakes generating large tsunami waves. The subducting Pacific and Philippine Sea plates create damaging interplate earthquakes followed by huge tsunami waves. It was a rupture of the Japan Trench subduction zone (JTSZ) and the resultant M9.0 Tohoku-Oki earthquake that caused the unprecedented tsunami along the Pacific coast of Japan on March 11, 2011. EQECAT's Japan Earthquake model is a fully probabilistic model which includes a seismo-tectonic model describing the geometries, magnitudes, and frequencies of all potential earthquake events; a ground motion model; and a tsunami model. Within the much larger set of all modeled earthquake events, fault rupture parameters for about 24000 stochastic and 25 historical tsunamigenic earthquake events are defined to simulate tsunami footprints using the numerical tsunami model COMCOT. A hybrid approach using COMCOT simulated tsunami waves is used to generate inundation footprints, including the impact of tides and flood defenses. Modeled tsunami waves of major historical events are validated against observed data. Modeled tsunami flood depths on 30 m grids together with tsunami vulnerability and financial models are then used to estimate insured loss in Japan from the 2011 tsunami. The primary direct report of damage from the 2011 tsunami is in terms of the number of buildings damaged by municipality in the tsunami affected area. Modeled loss in Japan from the 2011 tsunami is proportional to the number of buildings damaged. A 1000-year return period map of tsunami waves shows high hazard along the west coast of southern Honshu, on the Pacific coast of Shikoku, and on the east coast of Kyushu, primarily associated with major earthquake events on the Nankai Trough subduction zone (NTSZ). The highest tsunami hazard of more than 20m is seen on the Sanriku coast in northern Honshu, associated with the JTSZ.

  16. Modeling Exposure to Persistent Chemicals in Hazard and Risk Assessment

    SciTech Connect

    Cowan-Ellsberry, Christina E.; McLachlan, Michael S.; Arnot, Jon A.; MacLeod, Matthew; McKone, Thomas E.; Wania, Frank

    2008-11-01

    Fate and exposure modeling has not thus far been explicitly used in the risk profile documents prepared to evaluate the significant adverse effects of candidate chemicals for either the Stockholm Convention or the Convention on Long-Range Transboundary Air Pollution. However, we believe models have considerable potential to improve the risk profiles. Fate and exposure models are already used routinely in other similar regulatory applications to inform decisions, and they have been instrumental in building our current understanding of the fate of POP and PBT chemicals in the environment. The goal of this paper is to motivate the use of fate and exposure models in preparing risk profiles in the POP assessment procedure by providing strategies for incorporating and using models. The ways that fate and exposure models can be used to improve and inform the development of risk profiles include: (1) benchmarking the ratio of exposure and emissions of candidate chemicals to the same ratio for known POPs, thereby opening the possibility of combining this ratio with the relative emissions and relative toxicity to arrive at a measure of relative risk; (2) directly estimating the exposure of the environment, biota and humans, to provide information to complement measurements, or where measurements are not available or are limited; (3) identifying the key processes and chemical and/or environmental parameters that determine the exposure, thereby allowing the effective prioritization of research or measurements to improve the risk profile; and (4) predicting future time trends, including how quickly exposure levels in remote areas would respond to reductions in emissions. Currently there is no standardized consensus model for use in the risk profile context. Therefore, to choose the appropriate model the risk profile developer must evaluate how appropriate an existing model is for a specific setting and whether the assumptions and input data are relevant in the context of the application.
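
    A minimal sketch of strategy (1) with invented numbers: benchmark a candidate chemical's modeled exposure-to-emission ratio against that of a known POP, then combine it with relative emissions and relative toxicity into a crude relative-risk index. The field names and values are assumptions, not data from the paper.

```python
# Strategy (1) sketch: benchmarking exposure/emission ratios.
# All numbers are invented for illustration.
reference_pop = {"exposure": 2.0e-6, "emission": 1.0e3, "potency": 1.0}
candidate     = {"exposure": 8.0e-6, "emission": 5.0e2, "potency": 0.4}

def exposure_emission_ratio(chem):
    # Modeled exposure (e.g. concentration in biota) per unit emission.
    return chem["exposure"] / chem["emission"]

relative_ratio    = exposure_emission_ratio(candidate) / exposure_emission_ratio(reference_pop)
relative_emission = candidate["emission"] / reference_pop["emission"]
relative_potency  = candidate["potency"] / reference_pop["potency"]

# Crude relative-risk index: exposure efficiency x emissions x potency.
relative_risk = relative_ratio * relative_emission * relative_potency
print(f"Relative risk vs reference POP: {relative_risk:.2f}")
```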

  17. An integrated risk management model for source water protection areas.

    PubMed

    Chiueh, Pei-Te; Shang, Wei-Ting; Lo, Shang-Lien

    2012-10-17

    Watersheds are recognized as the most effective management unit for the protection of water resources. For surface water supplies that use water from upstream watersheds, evaluating threats to water quality and implementing a watershed management plan are crucial for maintaining drinking water that is safe for humans. The aim of this article is to establish a risk assessment model that provides basic information for identifying critical pollutants and areas at high risk for degraded water quality. In this study, a quantitative risk model that uses hazard quotients for each water quality parameter was combined with a qualitative risk model that uses the relative risk level of potential pollution events in order to characterize the current condition and potential risk of watersheds providing drinking water. In a case study of the Taipei Source Water Area in northern Taiwan, total coliforms and total phosphorus were the top two pollutants of concern. Intensive tea-growing and recreational activities around the riparian zone may contribute the greatest pollution to the watershed. Our risk assessment tool may be enhanced by developing, recording, and updating information on pollution sources in the water supply watersheds. Moreover, management authorities could use the resultant information to create watershed risk management plans.
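
    The quantitative half of such a model reduces to a hazard quotient per parameter, HQ = measured concentration / benchmark; the sketch below uses illustrative benchmarks and concentrations, not the Taipei monitoring data.

```python
# Hazard quotients per water quality parameter (illustrative values).
measured  = {"total_coliforms_CFU_per_100mL": 240.0, "total_phosphorus_mg_per_L": 0.08}
benchmark = {"total_coliforms_CFU_per_100mL": 100.0, "total_phosphorus_mg_per_L": 0.05}

hq = {p: measured[p] / benchmark[p] for p in measured}
for param, q in sorted(hq.items(), key=lambda kv: -kv[1]):
    flag = "exceeds benchmark" if q > 1 else "within benchmark"
    print(f"{param}: HQ = {q:.2f} ({flag})")
```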

  18. Physical vulnerability modelling in natural hazard risk assessment

    NASA Astrophysics Data System (ADS)

    Douglas, J.

    2007-04-01

    An evaluation of the risk to an exposed element from a hazardous event requires a consideration of the element's vulnerability, which expresses its propensity to suffer damage. This concept allows the assessed level of hazard to be translated to an estimated level of risk and is often used to evaluate the risk from earthquakes and cyclones. However, for other natural perils, such as mass movements, coastal erosion and volcanoes, the incorporation of vulnerability within risk assessment is not well established and consequently quantitative risk estimations are not often made. This impedes the study of the relative contributions from different hazards to the overall risk at a site. Physical vulnerability is poorly modelled for many reasons: the fact that human casualties may be caused by the event itself rather than by building damage; the lack of observational data on the hazard, the elements at risk and the induced damage; the complexity of the structural damage mechanisms; the temporal and geographical scales; and the ability to modify the hazard level. Many of these causes are related to the nature of the peril; therefore for some hazards, such as coastal erosion, the benefits of considering an element's physical vulnerability may be limited. However, for hazards such as volcanoes and mass movements the modelling of vulnerability should be improved by, for example, following the efforts made in earthquake risk assessment. In particular, additional observational data on induced building damage and the hazardous event should be routinely collected and correlated, and numerical modelling of building behaviour during a damaging event should be attempted.

  19. Modeling the Risks of Geothermal Development

    SciTech Connect

    Golabi, K.; Nair, K.; Rothstein, S.; Sioshansi, F.

    1980-12-16

    Geothermal energy has emerged as a promising energy source in recent years and has received serious attention from developers and potential users. Despite the advantages of this resource, such as potential cost competitiveness, reliability, public acceptance, etc., the commercial development and use of geothermal energy has been slow. Impediments to the development of this resource include technical, financial, environmental and regulatory uncertainties. Since geothermal power is unique in that the generation facility is tied to a single fuel at a single site, these uncertainties are of particular concern to utility companies. The areas of uncertainty and potential risks are well known. This paper presents a method for quantifying the relevant uncertainties and a framework for aggregating the risks through the use of submodels. The objective submodels can be combined with subjective probabilities (when sufficient data is not available) to yield a probability distribution over a single criterion (levelized busbar cost) that can be used to compare the desirability of geothermal power development with respect to other alternatives.
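
    A hedged sketch of the aggregation idea: Monte Carlo draws from assumed submodel distributions, including a subjective drilling-success probability, are combined into a distribution over levelized busbar cost. The cost expression and every distribution below are toy assumptions, not the paper's calibrated submodels.

```python
# Toy aggregation of uncertain submodels into a busbar-cost distribution.
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

capital_cost = rng.lognormal(mean=np.log(2000), sigma=0.25, size=n)  # $/kW
drilling_success = rng.beta(8, 2, size=n)            # subjective probability
capacity_factor = rng.triangular(0.6, 0.85, 0.95, size=n)
om_cost = rng.normal(15, 3, size=n)                  # O&M, $/MWh

# Toy levelized-cost expression: annualized capital spread over delivered
# energy (derated by drilling success), plus O&M.
fixed_charge_rate = 0.12
busbar = (capital_cost * fixed_charge_rate * 1000) / (
    drilling_success * capacity_factor * 8760) + om_cost   # $/MWh

print(f"Median busbar cost: {np.median(busbar):.1f} $/MWh")
print(f"90% interval: {np.percentile(busbar, 5):.1f} to "
      f"{np.percentile(busbar, 95):.1f} $/MWh")
```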

  20. Usefulness and limitations of global flood risk models

    NASA Astrophysics Data System (ADS)

    Ward, Philip; Jongman, Brenden; Salamon, Peter; Simpson, Alanna; Bates, Paul; De Groeve, Tom; Muis, Sanne; Coughlan de Perez, Erin; Rudari, Roberto; Trigg, Mark; Winsemius, Hessel

    2016-04-01

    Global flood risk models are now a reality. Initially, their development was driven by a demand from users for first-order global assessments to identify risk hotspots. Relentless upward trends in flood damage over the last decade have enhanced interest in such assessments. The adoption of the Sendai Framework for Disaster Risk Reduction and the Warsaw International Mechanism for Loss and Damage Associated with Climate Change Impacts have made these efforts even more essential. As a result, global flood risk models are being used more and more in practice, by an increasingly large number of practitioners and decision-makers. However, they clearly have their limits compared to local models. To address these issues, a team of scientists and practitioners recently came together at the Global Flood Partnership meeting to critically assess the question 'What can('t) we do with global flood risk models?'. The results of this dialogue (Ward et al., 2015) will be presented, opening a discussion on similar broader initiatives at the science-policy interface in other natural hazards. In this contribution, examples are provided of successful applications of global flood risk models in practice (for example together with the World Bank, Red Cross, and UNISDR), and limitations and gaps between user 'wish-lists' and model capabilities are discussed. Finally, a research agenda is presented for addressing these limitations and reducing the gaps. Ward et al., 2015. Nature Climate Change, doi:10.1038/nclimate2742

  1. QSAR in predictive models for ecological risk assessment

    SciTech Connect

    Passino-Reader, D.R.; Hickey, J.P.

    1994-12-31

    The end use of toxicity and exposure data is risk assessment to determine the probability that receptors experience harmful effects from exposure to environmental contaminants at a site. Determination of processes and development of predictive models precede the collection of data for risk assessment. The presence of hundreds of contaminants at a site and absence of data for many contaminants lead to the use of QSAR to implement the models. Examples of the use of linear solvation energy relationships (LSER) to provide estimates of aquatic toxicity and exposure endpoints will be provided. Integration of QSAR estimates and measured data must be addressed in the uncertainty analysis accompanying ecological risk assessment.

  2. A Corrosion Risk Assessment Model for Underground Piping

    NASA Technical Reports Server (NTRS)

    Datta, Koushik; Fraser, Douglas R.

    2009-01-01

    The Pressure Systems Manager at NASA Ames Research Center (ARC) has embarked on a project to collect data and develop risk assessment models to support risk-informed decision making regarding future inspections of underground pipes at ARC. This paper shows progress in one area of this project - a corrosion risk assessment model for the underground high-pressure air distribution piping system at ARC. It consists of a Corrosion Model for pipe segments, a Pipe Wrap Protection Model, and a Pipe Stress Model for a pipe segment. A Monte Carlo simulation of the combined models provides a distribution of the failure probabilities. Sensitivity study results show that the model uncertainty, or lack of knowledge, is the dominant contributor to the calculated unreliability of the underground piping system. As a result, the Pressure Systems Manager may consider investing resources specifically focused on reducing these uncertainties. Future work includes completing the data collection effort for the existing ground based pressure systems and applying the risk models to risk-based inspection strategies of the underground pipes at ARC.
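
    A minimal Monte Carlo sketch in the spirit of the combined models, with invented parameters rather than ARC data: a sampled corrosion rate, thinned when a wrap-protection flag holds, is compared against a minimum-wall criterion standing in for the stress model.

```python
# Toy Monte Carlo over combined corrosion / wrap / stress submodels.
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

wall0, age = 6.0, 40.0                        # initial wall (mm), years in service
wrap_intact = rng.random(n) < 0.7             # wrap-protection flag (assumed rate)
corr_rate = rng.lognormal(np.log(0.05), 0.5, size=n)   # mm/yr if unprotected
corr_rate = np.where(wrap_intact, 0.2 * corr_rate, corr_rate)

wall = wall0 - corr_rate * age                # remaining wall after corrosion
min_wall = rng.normal(1.5, 0.3, size=n)       # required wall from stress model
failure_prob = np.mean(wall < min_wall)
print(f"Estimated segment failure probability: {failure_prob:.4f}")
```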

  3. A Model for Risk Assessment in Health Care.

    PubMed

    Prijatelj, Vesna; Rajkovič, Vladislav; Šušteršič, Olga

    2016-01-01

    The purpose of our research is to reduce risks and hence prevent errors in the health care process. The aim is to design an organizational information model using error prevention methods for risk assessment in a clinical setting. The model is based on selected indicators of quality nursing care, resulting from well-known theoretical and practical models combined with experience in Slovenian health care. The proposed organizational information model and software solution have a significant impact on professional attention, communication and information, critical thinking, experience and knowledge. PMID:27332383

  4. Optomechanics for absolute rotation detection

    NASA Astrophysics Data System (ADS)

    Davuluri, Sankar

    2016-07-01

    In this article, we present an application of an optomechanical cavity to absolute rotation detection. The optomechanical cavity is arranged in a Michelson interferometer in such a way that the classical centrifugal force due to rotation changes the length of the optomechanical cavity. The change in the cavity length induces a shift in the frequency of the cavity mode. The phase shift corresponding to the frequency shift in the cavity mode is measured at the interferometer output to estimate the angular velocity of the absolute rotation. We derive an analytic expression for the minimum detectable rotation rate in our scheme for a given optomechanical cavity. The temperature dependence of the rotation detection sensitivity is also studied.

  5. Moral absolutism and ectopic pregnancy.

    PubMed

    Kaczor, C

    2001-02-01

    If one accepts a version of absolutism that excludes the intentional killing of any innocent human person from conception to natural death, ectopic pregnancy poses vexing difficulties. Given that the embryonic life almost certainly will die anyway, how can one retain one's moral principle and yet adequately respond to a situation that gravely threatens the life of the mother and her future fertility? The four options of treatment most often discussed in the literature are non-intervention, salpingectomy (removal of tube with embryo), salpingostomy (removal of embryo alone), and use of methotrexate (MXT). In this essay, I review these four options and introduce a fifth (the milking technique). In order to assess these options in terms of the absolutism mentioned, it will also be necessary to discuss various accounts of the intention/foresight distinction. I conclude that salpingectomy, salpingostomy, and the milking technique are compatible with absolutist presuppositions, but not the use of methotrexate.

  7. The Absolute Spectrum Polarimeter (ASP)

    NASA Technical Reports Server (NTRS)

    Kogut, A. J.

    2010-01-01

    The Absolute Spectrum Polarimeter (ASP) is an Explorer-class mission to map the absolute intensity and linear polarization of the cosmic microwave background and diffuse astrophysical foregrounds over the full sky from 30 GHz to 5 THz. The principal science goal is the detection and characterization of linear polarization from an inflationary epoch in the early universe, with tensor-to-scalar ratio r much greater than 10^(-3) and Compton distortion y < 10^(-6). We describe the ASP instrument and mission architecture needed to detect the signature of an inflationary epoch in the early universe using only 4 semiconductor bolometers.

  8. Absolute calibration of optical flats

    DOEpatents

    Sommargren, Gary E.

    2005-04-05

    The invention uses the phase shifting diffraction interferometer (PSDI) to provide a true point-by-point measurement of absolute flatness over the surface of optical flats. Beams exiting the fiber optics in a PSDI have perfect spherical wavefronts. The measurement beam is reflected from the optical flat and passed through an auxiliary optic to then be combined with the reference beam on a CCD. The combined beams include phase errors due to both the optic under test and the auxiliary optic. Standard phase extraction algorithms are used to calculate this combined phase error. The optical flat is then removed from the system and the measurement fiber is moved to recombine the two beams. The newly combined beams include only the phase errors due to the auxiliary optic. When the second phase measurement is subtracted from the first phase measurement, the absolute phase error of the optical flat is obtained.

  9. A triple risk model for unexplained late stillbirth

    PubMed Central

    2014-01-01

    Background The triple risk model for sudden infant death syndrome (SIDS) has been useful in understanding its pathogenesis. Risk factors for late stillbirth are well established, especially relating to maternal and fetal wellbeing. Discussion We propose a similar triple risk model for unexplained late stillbirth. The model proposed by us results from the interplay of three groups of factors: (1) maternal factors (such as maternal age, obesity, smoking), (2) fetal and placental factors (such as intrauterine growth retardation, placental insufficiency), and (3) a stressor (such as venocaval compression from maternal supine sleep position, sleep disordered breathing). We argue that the risk factors within each group in themselves may be insufficient to cause the death, but when they interrelate may produce a lethal combination. Summary Unexplained late stillbirth occurs when a fetus who is somehow vulnerable dies as a result of encountering a stressor and/or maternal condition in a combination which is lethal for them. PMID:24731396

  10. Modeling Manufacturing Processes to Mitigate Technological Risk

    SciTech Connect

    Allgood, G.O.; Manges, W.W.

    1999-10-24

    An economic model is a tool for determining the justifiable cost of new sensors and subsystems with respect to value and operation. This process balances the R and D costs against the expense of maintaining current operations and allows for a method to calculate economic indices of performance that can be used as control points in deciding whether to continue development or suspend actions. The model can also be used as an integral part of an overall control loop utilizing real-time process data from the sensor groups to make production decisions (stop production and repair machine, continue and warn of anticipated problems, queue for repairs, etc.). This model has been successfully used and deployed in the CAFE Project. The economic model was one of seven (see Fig. 1) elements critical in developing an investment strategy. It has been successfully used in guiding the R and D activities on the CAFE Project, suspending activities on three new sensor technologies, and continuing development of two others. The model has also been used to justify the development of a new prognostic approach for diagnosing machine health using COTS equipment and a new algorithmic approach.

  11. The Global Earthquake Model and Disaster Risk Reduction

    NASA Astrophysics Data System (ADS)

    Smolka, A. J.

    2015-12-01

    Advanced, reliable and transparent tools and data to assess earthquake risk are inaccessible to most, especially in less developed regions of the world, while few, if any, globally accepted standards currently allow a meaningful comparison of risk between places. The Global Earthquake Model (GEM) is a collaborative effort that aims to provide models, datasets and state-of-the-art tools for transparent assessment of earthquake hazard and risk. As part of this goal, GEM and its global network of collaborators have developed the OpenQuake engine (an open-source software for hazard and risk calculations), the OpenQuake platform (a web-based portal making GEM's resources and datasets freely available to all potential users), and a suite of tools to support modelers and other experts in the development of hazard, exposure and vulnerability models. These resources are being used extensively across the world in hazard and risk assessment, from individual practitioners to local and national institutions, and in regional projects to inform disaster risk reduction. Practical examples of how GEM is bridging the gap between science and disaster risk reduction are: - Several countries including Switzerland, Turkey, Italy, Ecuador, Papua New Guinea and Taiwan (with more to follow) are computing national seismic hazard using the OpenQuake engine. In some cases these results are used for the definition of actions in building codes. - Technical support, tools and data for the development of hazard, exposure, vulnerability and risk models for regional projects in South America and Sub-Saharan Africa. - Going beyond physical risk, GEM's scorecard approach evaluates local resilience by bringing together neighborhood/community leaders and the risk reduction community as a basis for designing risk reduction programs at various levels of geography. Actual case studies are Lalitpur in the Kathmandu Valley in Nepal and Quito/Ecuador. In agreement with GEM's collaborative approach, all

  12. Application of the Beck model to stock markets: Value-at-Risk and portfolio risk assessment

    NASA Astrophysics Data System (ADS)

    Kozaki, M.; Sato, A.-H.

    2008-02-01

    We apply the Beck model, developed for turbulent systems that exhibit scaling properties, to stock markets. Our study reveals that the Beck model elucidates the properties of stock market returns and is applicable to practical use such as Value-at-Risk estimation and portfolio analysis. We perform empirical analysis with daily/intraday data of the S&P500 index return and find that the volatility fluctuation of real markets is well consistent with the assumptions of the Beck model: the volatility fluctuates at a much larger time scale than the return itself, and the inverse of variance, or “inverse temperature”, β obeys a Γ-distribution. As predicted by the Beck model, the distribution of returns is well fitted by the q-Gaussian distribution of Tsallis statistics. The evaluation method of Value-at-Risk (VaR), one of the most significant indicators in risk management, is studied for the q-Gaussian distribution. Our proposed method enables VaR evaluation in consideration of tail risk, which is underestimated by the variance-covariance method. A framework of portfolio risk assessment under the existence of tail risk is considered. We propose a multi-asset model with a single volatility fluctuation shared by all assets, named the single β model, and empirically examine the agreement between the model and an imaginary portfolio with Dow Jones indices. It turns out that the single β model gives a good approximation to portfolios composed of assets with non-Gaussian and correlated returns.
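
    The practical point, that a variance-covariance (Gaussian) VaR understates tail risk relative to a q-Gaussian fit, can be reproduced in a few lines: a q-Gaussian with q > 1 is equivalent to a Student's t with nu = (3 - q)/(q - 1) degrees of freedom. Synthetic heavy-tailed returns stand in for the S&P500 data used in the paper.

```python
# Gaussian vs q-Gaussian (Student-t) 99% Value-at-Risk on synthetic returns.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
returns = stats.t.rvs(df=4, scale=0.01, size=5000, random_state=rng)

alpha = 0.01                                        # 99% VaR
mu, sigma = returns.mean(), returns.std(ddof=1)
var_gauss = -(mu + sigma * stats.norm.ppf(alpha))   # variance-covariance method

nu, loc, scale = stats.t.fit(returns)               # heavy-tailed fit
var_t = -stats.t.ppf(alpha, nu, loc=loc, scale=scale)

print(f"99% VaR, Gaussian assumption : {var_gauss:.4f}")
print(f"99% VaR, q-Gaussian/Student-t: {var_t:.4f} (fitted nu ~ {nu:.1f})")
```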

  13. Improving default risk prediction using Bayesian model uncertainty techniques.

    PubMed

    Kazemi, Reza; Mosleh, Ali

    2012-11-01

    Credit risk is the potential exposure of a creditor to an obligor's failure or refusal to repay the debt in principal or interest. The potential of exposure is measured in terms of probability of default. Many models have been developed to estimate credit risk, with rating agencies dating back to the 19th century. They provide their assessment of probability of default and transition probabilities of various firms in their annual reports. Regulatory capital requirements for credit risk outlined by the Basel Committee on Banking Supervision have made it essential for banks and financial institutions to develop sophisticated models in an attempt to measure credit risk with higher accuracy. The Bayesian framework proposed in this article uses the techniques developed in physical sciences and engineering for dealing with model uncertainty and expert accuracy to obtain improved estimates of credit risk and associated uncertainties. The approach uses estimates from one or more rating agencies and incorporates their historical accuracy (past performance data) in estimating future default risk and transition probabilities. Several examples demonstrate that the proposed methodology can assess default probability with accuracy exceeding the estimations of all the individual models. Moreover, the methodology accounts for potentially significant departures from "nominal predictions" due to "upsetting events" such as the 2008 global banking crisis. PMID:23163724

  15. Absolute Antenna Calibration at the US National Geodetic Survey

    NASA Astrophysics Data System (ADS)

    Mader, G. L.; Bilich, A. L.

    2012-12-01

    Geodetic GNSS applications routinely demand millimeter precision and extremely high levels of accuracy. To achieve these accuracies, measurement and instrument biases at the centimeter to millimeter level must be understood. One of these biases is the antenna phase center, the apparent point of signal reception for a GNSS antenna. It has been well established that phase center patterns differ between antenna models and manufacturers; additional research suggests that the addition of a radome or the choice of antenna mount can significantly alter those a priori phase center patterns. For the more demanding GNSS positioning applications and especially in cases of mixed-antenna networks, it is all the more important to know antenna phase center variations as a function of both elevation and azimuth in the antenna reference frame and incorporate these models into analysis software. Determination of antenna phase center behavior is known as "antenna calibration". Since 1994, NGS has computed relative antenna calibrations for more than 350 antennas. In recent years, the geodetic community has moved to absolute calibrations - the IGS adopted absolute antenna phase center calibrations in 2006 for use in their orbit and clock products, and NGS's CORS group began using absolute antenna calibration upon the release of the new CORS coordinates in IGS08 epoch 2005.00 and NAD 83(2011,MA11,PA11) epoch 2010.00. Although NGS relative calibrations can be and have been converted to absolute, it is considered best practice to independently measure phase center characteristics in an absolute sense. Consequently, NGS has developed and operates an absolute calibration system. These absolute antenna calibrations accommodate the demand for greater accuracy and for 2-dimensional (elevation and azimuth) parameterization. NGS will continue to provide calibration values via the NGS web site www.ngs.noaa.gov/ANTCAL, and will publish calibrations in the ANTEX format as well as the legacy ANTINFO

  16. MODELING APPROACHES TO POPULATION-LEVEL RISK ASSESSMENT

    EPA Science Inventory

    A SETAC Pellston Workshop on Population-Level Risk Assessment was held in Roskilde, Denmark on 23-27 August 2003. One aspect of this workshop focused on modeling approaches for characterizing population-level effects of chemical exposure. The modeling work group identified th...

  17. Quantitative risk assessment modeling for nonhomogeneous urban road tunnels.

    PubMed

    Meng, Qiang; Qu, Xiaobo; Wang, Xinchang; Yuanita, Vivi; Wong, Siew Chee

    2011-03-01

    Urban road tunnels provide an increasingly cost-effective engineering solution, especially in compact cities like Singapore. For some urban road tunnels, tunnel characteristics such as tunnel configurations, geometries, provisions of tunnel electrical and mechanical systems, traffic volumes, etc. may vary from one section to another. Urban road tunnels that have such nonuniform parameters are referred to as nonhomogeneous urban road tunnels. In this study, a novel quantitative risk assessment (QRA) model is proposed for nonhomogeneous urban road tunnels because the existing QRA models for road tunnels are inapplicable to assess the risks in these road tunnels. This model uses a tunnel segmentation principle whereby a nonhomogeneous urban road tunnel is divided into various homogeneous sections. Individual risks for road tunnel sections as well as integrated risk indices for the entire road tunnel are defined. The article then proceeds to develop a new QRA model for each of the homogeneous sections. Compared to the existing QRA models for road tunnels, this section-based model incorporates one additional top event (toxic gases due to traffic congestion) and employs the Poisson regression method to estimate the vehicle accident frequencies of tunnel sections. This article further illustrates an aggregated QRA model for nonhomogeneous urban tunnels by integrating the section-based QRA models. Finally, a case study in Singapore is carried out.
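
    A toy version of the segmentation principle, with invented frequencies and consequences rather than the Singapore case study: each homogeneous section gets its own accident frequency, as if fitted by Poisson regression, and section risks are aggregated into a tunnel-level index.

```python
# Section-based risk aggregation for a nonhomogeneous tunnel (toy numbers).
sections = [
    # (name, length_km, accidents_per_km_yr, p_fatal_given_accident, fatalities_per_event)
    ("covered approach", 0.8, 0.30, 0.002, 2.0),
    ("main bore",        2.5, 0.12, 0.004, 5.0),
    ("congested exit",   0.6, 0.45, 0.006, 3.0),  # extra top event: toxic gases
]

total = 0.0
for name, length, acc_rate, p_fatal, fatalities in sections:
    section_risk = length * acc_rate * p_fatal * fatalities  # expected fatalities/yr
    total += section_risk
    print(f"{name:16s} expected fatalities/yr = {section_risk:.5f}")
print(f"{'whole tunnel':16s} expected fatalities/yr = {total:.5f}")
```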

  18. Empirical Analysis of Farm Credit Risk under the Structure Model

    ERIC Educational Resources Information Center

    Yan, Yan

    2009-01-01

    The study measures farm credit risk by using farm records collected by Farm Business Farm Management (FBFM) during the period 1995-2004. The study addresses the following questions: (1) whether farm's financial position is fully described by the structure model, (2) what are the determinants of farm capital structure under the structure model, (3)…

  19. Dental caries: an updated medical model of risk assessment.

    PubMed

    Kutsch, V Kim

    2014-04-01

    Dental caries is a transmissible, complex biofilm disease that creates prolonged periods of low pH in the mouth, resulting in a net mineral loss from the teeth. Historically, the disease model for dental caries consisted of mutans streptococci and Lactobacillus species, and the dental profession focused on restoring the lesions/damage from the disease by using a surgical model. The current recommendation is to implement a risk-assessment-based medical model called CAMBRA (caries management by risk assessment) to diagnose and treat dental caries. Unfortunately, many of the suggestions of CAMBRA have been overly complicated and confusing for clinicians. The risk of caries, however, is usually related to just a few common factors, and these factors result in common patterns of disease. This article examines the biofilm model of dental caries, identifies the common disease patterns, and discusses their targeted therapeutic strategies to make CAMBRA more easily adaptable for the privately practicing professional.

  20. Model for Solar Proton Risk Assessment

    NASA Technical Reports Server (NTRS)

    Xapos, M. A.; Stauffer, C.; Gee, G. B.; Barth, J. L.; Stassinopoulos, E. G.; McGuire, R. E.

    2004-01-01

    A statistical model for cumulative solar proton event fluences during space missions is presented that covers both the solar minimum and solar maximum phases of the solar cycle. It is based on data from the IMP and GOES series of satellites, integrated together so that the best features of each data set can be taken advantage of. This allows fluence-energy spectra to be extended out to energies of 327 MeV.

  1. Modeling of Flood Risk for the Continental United States

    NASA Astrophysics Data System (ADS)

    Lohmann, D.; Li, S.; Katz, B.; Goteti, G.; Kaheil, Y. H.; Vojjala, R.

    2011-12-01

    The science of catastrophic risk modeling helps people to understand the physical and financial implications of natural catastrophes (hurricanes, flood, earthquakes, etc.), terrorism, and the risks associated with changes in life expectancy. As such it depends on simulation techniques that integrate multiple disciplines such as meteorology, hydrology, structural engineering, statistics, computer science, financial engineering, actuarial science, and more in virtually every field of technology. In this talk we will explain the techniques and underlying assumptions of building the RMS US flood risk model. We especially will pay attention to correlation (spatial and temporal), simulation and uncertainty in each of the various components in the development process. Recent extreme floods (e.g. US Midwest flood 2008, US Northeast flood, 2010) have increased the concern of flood risk. Consequently, there are growing needs to adequately assess the flood risk. The RMS flood hazard model is mainly comprised of three major components. (1) Stochastic precipitation simulation module based on a Monte-Carlo analogue technique, which is capable of producing correlated rainfall events for the continental US. (2) Rainfall-runoff and routing module. A semi-distributed rainfall-runoff model was developed to properly assess the antecedent conditions, determine the saturation area and runoff. The runoff is further routed downstream along the rivers by a routing model. Combined with the precipitation model, it allows us to correlate the streamflow and hence flooding from different rivers, as well as low and high return-periods across the continental US. (3) Flood inundation module. It transforms the discharge (output from the flow routing) into water level, which is further combined with a two-dimensional off-floodplain inundation model to produce comprehensive flood hazard map. The performance of the model is demonstrated by comparing to the observation and published data. Output from
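
    Downstream of such a hazard chain, the standard catastrophe-model output is an exceedance-probability (return-period) curve built from a catalogue of simulated annual losses; the sketch below uses a synthetic loss catalogue in place of actual model output.

```python
# Exceedance-probability / return-period curve from simulated annual losses.
import numpy as np

rng = np.random.default_rng(7)
n_years = 50_000
annual_loss = rng.lognormal(mean=2.0, sigma=1.2, size=n_years)   # synthetic

losses = np.sort(annual_loss)[::-1]                      # descending
exceed_prob = np.arange(1, n_years + 1) / (n_years + 1)  # plotting position
return_period = 1.0 / exceed_prob

for rp in (100, 250, 1000):
    idx = np.argmin(np.abs(return_period - rp))
    print(f"{rp:>5d}-year loss: {losses[idx]:.1f}")
print(f"Average annual loss: {annual_loss.mean():.2f}")
```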

  2. Parametric Estimation in a Recurrent Competing Risks Model

    PubMed Central

    Peña, Edsel A.

    2014-01-01

    A resource-efficient approach to making inferences about the distributional properties of the failure times in a competing risks setting is presented. Efficiency is gained by observing recurrences of the competing risks over a random monitoring period. The resulting model is called the recurrent competing risks model (RCRM) and is coupled with two repair strategies whenever the system fails. Maximum likelihood estimators of the parameters of the marginal distribution functions associated with each of the competing risks and also of the system lifetime distribution function are presented. Estimators are derived under perfect and partial repair strategies. Consistency and asymptotic properties of the estimators are obtained. The estimation methods are applied to a data set of failures for cars under warranty. Simulation studies are used to ascertain the small sample properties and the efficiency gains of the resulting estimators. PMID:25346751
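
    For the special case of exponential marginals under perfect repair, the MLE has a closed form: each risk's estimated rate is its failure count divided by the total observed exposure time. The simulation below checks that on synthetic recurrence data; it illustrates the idea only, not the paper's general estimators.

```python
# Closed-form MLE for exponential competing risks observed over a window.
import numpy as np

rng = np.random.default_rng(3)
true_rates = np.array([0.5, 1.5])             # two competing risks
n_systems, horizon = 200, 10.0

counts, exposure = np.zeros(len(true_rates)), 0.0
for _ in range(n_systems):
    t = 0.0
    while True:                               # recurrences under perfect repair
        gaps = rng.exponential(1.0 / true_rates)
        cause = int(np.argmin(gaps))          # first risk to fire wins
        if t + gaps[cause] > horizon:
            exposure += horizon - t           # censored tail of the window
            break
        counts[cause] += 1
        exposure += gaps[cause]
        t += gaps[cause]

print("true rates:", true_rates, "MLE:", np.round(counts / exposure, 3))
```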

  3. Absolute Free Energies for Biomolecules in Implicit or Explicit Solvent

    NASA Astrophysics Data System (ADS)

    Berryman, Joshua T.; Schilling, Tanja

    Methods for absolute free energy calculation by alchemical transformation of a quantitative model to an analytically tractable one are discussed. These absolute free energy methods are placed in the context of other methods, and an attempt is made to describe the best practice for such calculations given the current state of the art. Calculations of the equilibria between the four free energy basins of the dialanine molecule and the two right- and left-twisted basins of DNA are discussed as examples.

  4. Model based climate information on drought risk in Africa

    NASA Astrophysics Data System (ADS)

    Calmanti, S.; Syroka, J.; Jones, C.; Carfagna, F.; Dell'Aquila, A.; Hoefsloot, P.; Kaffaf, S.; Nikulin, G.

    2012-04-01

    The United Nations World Food Programme (WFP) has embarked upon the endeavor of creating a sustainable Africa-wide natural disaster risk management system. A fundamental building block of this initiative is the setup of a drought impact modeling platform called Africa RiskView that aims to quantify and monitor weather-related food security risk in Africa. The modeling approach is based on the Water Requirement Satisfaction Index (WRSI), as the fundamental indicator of the performance of agriculture, and uses historical records of food assistance operations to project future potential needs for livelihood protection. By using climate change scenarios as an input to Africa RiskView it is possible, in principle, to evaluate the future impact of climate variability on critical issues such as food security and the overall performance of the envisaged risk management system. A necessary preliminary step to this challenging task is the exploration of the sources of uncertainty affecting the assessment based on modeled climate change scenarios. For this purpose, a limited set of climate models has been selected in order to verify the relevance of using climate model output data with Africa RiskView and to explore a minimal range of possible sources of uncertainty. This first evaluation exercise started before the setup of the CORDEX framework and has relied on model output available at the time. In particular, only one regional downscaling was available for the entire African continent, from the ENSEMBLES project. The analysis shows that current coarse-resolution global climate models cannot directly feed into the Africa RiskView risk-analysis tool. However, regional downscaling may help correct the inherent biases observed in the datasets. Further analysis is performed by using the first data available under the CORDEX framework. In particular, we consider a set of simulations driven with boundary conditions from the reanalysis ERA-Interim to evaluate the skill drought
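
    In its simplest seasonal form the WRSI is just the ratio of actual evapotranspiration to the crop water requirement; the sketch below uses invented dekadal values, whereas operational implementations add soil-water accounting.

```python
# Back-of-envelope seasonal Water Requirement Satisfaction Index.
# Values are invented; 100 means the crop's water needs were fully met.
water_requirement = [30, 35, 40, 45, 40, 35]   # mm per dekad
actual_et         = [30, 32, 25, 20, 30, 33]   # mm per dekad, supply-limited

wrsi = 100.0 * sum(actual_et) / sum(water_requirement)
print(f"Seasonal WRSI = {wrsi:.0f}")
```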

  5. Risk Classification with an Adaptive Naive Bayes Kernel Machine Model

    PubMed Central

    Minnier, Jessica; Yuan, Ming; Liu, Jun S.; Cai, Tianxi

    2014-01-01

    Genetic studies of complex traits have uncovered only a small number of risk markers explaining a small fraction of heritability and adding little improvement to disease risk prediction. Standard single marker methods may lack power in selecting informative markers or estimating effects. Most existing methods also typically do not account for non-linearity. Identifying markers with weak signals and estimating their joint effects among many non-informative markers remains challenging. One potential approach is to group markers based on biological knowledge such as gene structure. If markers in a group tend to have similar effects, proper usage of the group structure could improve power and efficiency in estimation. We propose a two-stage method relating markers to disease risk by taking advantage of known gene-set structures. Imposing a naive Bayes kernel machine (KM) model, we estimate gene-set specific risk models that relate each gene-set to the outcome in stage I. The KM framework efficiently models potentially non-linear effects of predictors without requiring explicit specification of functional forms. In stage II, we aggregate information across gene-sets via a regularization procedure. Estimation and computational efficiency are further improved with kernel principal component analysis. Asymptotic results for model estimation and gene-set selection are derived and numerical studies suggest that the proposed procedure could outperform existing procedures for constructing genetic risk models. PMID:26236061

  6. Beliefs and stochastic modelling of interest rate scenario risk

    NASA Astrophysics Data System (ADS)

    Galic, E.; Molgedey, L.

    2001-04-01

    We present a framework that allows for a systematic assessment of risk given a specific model and belief on the market. Within this framework the time evolution of risk is modeled in a twofold way. On the one hand, risk is modeled by the time-discrete and nonlinear GARCH(1,1) process, which allows for a (time-)local understanding of its level, together with a short-term forecast. On the other hand, via a diffusion approximation, the time evolution of the probability density of risk is modeled by a Fokker-Planck equation. Then, as a final step, using Bayes' theorem, beliefs are conditioned on the stationary probability density function as obtained from the Fokker-Planck equation. We believe this to be a highly rigorous framework to integrate subjective judgments of future market behavior and underlying models. In order to demonstrate the approach, we apply it to risk assessment of empirical interest rate scenario methodologies, i.e. the application of Principal Component Analysis to the dynamics of bonds.
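
    A self-contained sketch of the time-discrete half of the framework: simulate a GARCH(1,1) process and form the one-step-ahead variance forecast that supplies the (time-)local risk level. Parameters are illustrative, not fitted to market data.

```python
# GARCH(1,1) simulation and one-step-ahead volatility forecast.
import numpy as np

rng = np.random.default_rng(11)
omega, alpha, beta = 1e-6, 0.08, 0.90        # persistence alpha + beta < 1

n = 2000
r, sigma2 = np.empty(n), np.empty(n)
sigma2[0] = omega / (1 - alpha - beta)       # unconditional variance
r[0] = np.sqrt(sigma2[0]) * rng.standard_normal()

for t in range(1, n):
    sigma2[t] = omega + alpha * r[t - 1] ** 2 + beta * sigma2[t - 1]
    r[t] = np.sqrt(sigma2[t]) * rng.standard_normal()

sigma2_next = omega + alpha * r[-1] ** 2 + beta * sigma2[-1]
print(f"Forecast next-step volatility: {np.sqrt(sigma2_next):.5f}")
```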

  7. Credibility analysis of risk classes by generalized linear model

    NASA Astrophysics Data System (ADS)

    Erdemir, Ovgucan Karadag; Sucu, Meral

    2016-06-01

    In this paper, the generalized linear model (GLM) and credibility theory, which are frequently used in nonlife insurance pricing, are combined for reliability analysis. Using the full credibility standard, the GLM is associated with the limited fluctuation credibility approach. Comparison criteria such as asymptotic variance and credibility probability are used to analyze the credibility of risk classes. An application is performed using one-year claim frequency data from a Turkish insurance company, and the results for credible risk classes are interpreted.
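
    Under the classical limited-fluctuation standard for claim frequency, full credibility requires about (z/k)^2 expected claims and partial credibility is Z = sqrt(n / n_full); the sketch below uses the textbook p = 0.90, k = 0.05 choice, which is an assumption rather than the paper's setting.

```python
# Limited-fluctuation (full/partial) credibility for claim frequency.
from scipy.stats import norm

p, k = 0.90, 0.05                      # coverage probability and tolerance
z = norm.ppf((1 + p) / 2)
n_full = (z / k) ** 2                  # expected claims needed for Z = 1
print(f"Full-credibility standard: {n_full:.0f} expected claims")

for n in (250, 500, 1082, 2000):
    Z = min(1.0, (n / n_full) ** 0.5)
    print(f"n = {n:4d}: credibility Z = {Z:.2f}")
```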

  8. Model-based benefit-risk assessment: can Archimedes help?

    PubMed

    Krishna, R

    2009-03-01

    In December 2008, the US Food and Drug Administration issued a new draft Guidance for Industry on Diabetes Mellitus--evaluating cardiovascular risk in new antidiabetic therapies to treat Type 2 diabetes. This guidance comes at a time when recent discussions have focused on delineation of cardiovascular risk reduction for new antidiabetic drugs. Computational tools that can enable early prediction of cardiovascular risk are reviewed with specific reference to Archimedes (Kaiser Permanente), with an aim of proposing a model-based solution and enabling decisions to be made as early as possible in the drug development value chain.

  9. Risk Management Model in Surface Exploitation of Mineral Deposits

    NASA Astrophysics Data System (ADS)

    Stojanović, Cvjetko

    2016-06-01

    Risk management is an integral part of all types of project management. One of the main tasks of pre-investment studies and other project documentation is the tendency to protect investment projects as much as possible against investment risks. Therefore, the provision and regulation of risk information ensure the identification of the probability of the emergence of adverse events, their forms, causes and consequences, and provide timely measures of protection against risks. This means that risk management involves a set of management methods and techniques used to reduce the possibility of realizing adverse events and consequences and thus increase the possibility of achieving the planned results with minimal losses. Investments in mining projects are of capital importance because they are very complex and therefore very risky, owing to the influence of internal and external factors and limitations arising from the socio-economic environment. Due to the lack of a risk management system, numerous organizations worldwide have suffered significant financial losses. Therefore, it is necessary for any organization to establish a risk management system as a structural element of its management system as a whole. This paper presents an approach to a risk management model for the project of opening a surface coal mine, developed on the basis of extensive scientific literature and the personal experience of the author, which, with certain modifications, may find use in any investment project, in the mining industry as well as in other areas.

  10. Risk assessment and remedial policy evaluation using predictive modeling

    SciTech Connect

    Linkov, L.; Schell, W.R.

    1996-06-01

    As a result of nuclear industry operation and accidents, large areas of natural ecosystems have been contaminated by radionuclides and toxic metals. Extensive societal pressure has been exerted to decrease the radiation dose to the population and to the environment. Thus, in making abatement and remediation policy decisions, not only economic costs but also human and environmental risk assessments are desired. This paper introduces a general framework for risk assessment and remedial policy evaluation using predictive modeling. Ecological risk assessment requires evaluation of the radionuclide distribution in ecosystems. The FORESTPATH model is used for predicting the radionuclide fate in forest compartments after deposition as well as for evaluating the efficiency of remedial policies. The time of intervention and the radionuclide deposition profile were predicted to be crucial for remediation efficiency. Risk assessment conducted for a critical group of forest users in Belarus shows that consumption of forest products (berries and mushrooms) leads to about 0.004% risk of a fatal cancer annually. Cost-benefit analysis for forest cleanup suggests that complete removal of the organic layer is too expensive for application in Belarus and a better methodology is required. In conclusion, the FORESTPATH modeling framework could have wide applications in the environmental remediation of radionuclides and toxic metals as well as in dose reconstruction and risk assessment.

  11. Modelling microbial health risk of wastewater reuse: A systems perspective.

    PubMed

    Beaudequin, Denise; Harden, Fiona; Roiko, Anne; Stratton, Helen; Lemckert, Charles; Mengersen, Kerrie

    2015-11-01

    There is a widespread need for the use of quantitative microbial risk assessment (QMRA) to determine reclaimed water quality for specific uses, however neither faecal indicator levels nor pathogen concentrations alone are adequate for assessing exposure health risk. The aim of this study was to build a conceptual model representing factors contributing to the microbiological health risks of reusing water treated in maturation ponds. This paper describes the development of an unparameterised model that provides a visual representation of theoretical constructs and variables of interest. Information was collected from the peer-reviewed literature and through consultation with experts from regulatory authorities and academic disciplines. In this paper we explore how, considering microbial risk as a modular system, following the QMRA framework enables incorporation of the many factors influencing human exposure and dose response, to better characterise likely human health impacts. By using and expanding upon the QMRA framework we deliver new insights into this important field of environmental exposures. We present a conceptual model of health risk of microbial exposure which can be used for maturation ponds and, more importantly, as a generic tool to assess health risk in diverse wastewater reuse scenarios. PMID:26277638
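
    Once parameterised, the quantitative core behind such a conceptual model is short: an exposure dose pushed through a dose-response relation, then annualised over repeated exposures. The exponential dose-response form and every value below are placeholders, not maturation-pond estimates.

```python
# Minimal QMRA chain: exposure -> dose -> per-event risk -> annual risk.
from math import exp

conc_per_L = 0.5          # pathogen concentration in reused water (org/L), assumed
ingested_L = 0.01         # accidental ingestion per event (L), assumed
r = 0.02                  # exponential dose-response parameter, assumed
events_per_year = 20

dose = conc_per_L * ingested_L
p_event = 1.0 - exp(-r * dose)                      # per-event infection risk
p_annual = 1.0 - (1.0 - p_event) ** events_per_year
print(f"Per-event risk: {p_event:.2e}, annual risk: {p_annual:.2e}")
```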

  12. Assessing risk factors for dental caries: a statistical modeling approach.

    PubMed

    Trottini, Mario; Bossù, Maurizio; Corridore, Denise; Ierardo, Gaetano; Luzzi, Valeria; Saccucci, Matteo; Polimeni, Antonella

    2015-01-01

    The problem of identifying potential determinants and predictors of dental caries is of key importance in caries research and it has received considerable attention in the scientific literature. From the methodological side, a broad range of statistical models is currently available to analyze dental caries indices (DMFT, dmfs, etc.). These models have been applied in several studies to investigate the impact of different risk factors on the cumulative severity of dental caries experience. However, in most of the cases (i) these studies focus on a very specific subset of risk factors; and (ii) in the statistical modeling only a few candidate models are considered and model selection is at best only marginally addressed. As a result, our understanding of the robustness of the statistical inferences with respect to the choice of the model is very limited; the richness of the set of statistical models available for analysis is only marginally exploited; and inferences could be biased due to the omission of potentially important confounding variables in the model's specification. In this paper we argue that these limitations can be overcome by considering a general class of candidate models and carefully exploring the model space using standard model selection criteria and measures of global fit and predictive performance of the candidate models. Strengths and limitations of the proposed approach are illustrated with a real data set. In our illustration the model space contains more than 2.6 million models, which require inferences to be adjusted for 'optimism'.
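
    A miniature version of such a model-space exploration: fit every subset of candidate predictors and rank by an information criterion (AIC here; the paper's criteria and its 2.6-million-model space are far richer). Synthetic data and made-up factor names stand in for the clinical covariates.

```python
# Exhaustive subset selection by AIC on synthetic caries-risk data.
import itertools
import numpy as np

rng = np.random.default_rng(5)
n, names = 200, ["sugar", "brushing", "fluoride", "age", "plaque"]
X = rng.normal(size=(n, len(names)))
y = 1.5 * X[:, 0] - 1.0 * X[:, 2] + rng.normal(size=n)   # known truth

def aic_ols(Xs, y):
    Xd = np.column_stack([np.ones(len(y)), Xs])          # add intercept
    beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)
    rss = np.sum((y - Xd @ beta) ** 2)
    k = Xd.shape[1] + 1                                  # coefficients + variance
    return len(y) * np.log(rss / len(y)) + 2 * k

results = []
for size in range(1, len(names) + 1):
    for subset in itertools.combinations(range(len(names)), size):
        results.append((aic_ols(X[:, subset], y), [names[i] for i in subset]))

for aic, model in sorted(results)[:3]:                   # top three models
    print(f"AIC {aic:8.1f}: {model}")
```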

  13. The AFGL absolute gravity program

    NASA Technical Reports Server (NTRS)

    Hammond, J. A.; Iliff, R. L.

    1978-01-01

    A brief discussion of the AFGL's (Air Force Geophysics Laboratory) program in absolute gravity is presented. Support of outside work and in-house studies relating to gravity instrumentation are discussed. A description of the current transportable system is included and the latest results are presented. These results show good agreement with measurements at the AFGL site by an Italian system. The accuracy obtained by the transportable apparatus is better than 0.1 micron/s^2 (10 microgal) and agreement with previous measurements is within the combined uncertainties of the measurements.

  14. Familial Aggregation of Absolute Pitch

    PubMed Central

    Baharloo, Siamak; Service, Susan K.; Risch, Neil; Gitschier, Jane; Freimer, Nelson B.

    2000-01-01

    Absolute pitch (AP) is a behavioral trait that is defined as the ability to identify the pitch of tones in the absence of a reference pitch. AP is an ideal phenotype for investigation of gene and environment interactions in the development of complex human behaviors. Individuals who score exceptionally well on formalized auditory tests of pitch perception are designated as “AP-1.” As described in this report, auditory testing of siblings of AP-1 probands and of a control sample indicates that AP-1 aggregates in families. The implications of this finding for the mapping of loci for AP-1 predisposition are discussed. PMID:10924408

  15. NASA Space Radiation Program Integrative Risk Model Toolkit

    NASA Technical Reports Server (NTRS)

    Kim, Myung-Hee Y.; Hu, Shaowen; Plante, Ianik; Ponomarev, Artem L.; Sandridge, Chris

    2015-01-01

    NASA Space Radiation Program Element scientists have been actively involved in the development of an integrative risk model toolkit that includes models for acute radiation risk and organ dose projection (ARRBOD), NASA space radiation cancer risk projection (NSCR), hemocyte dose estimation (HemoDose), the GCR event-based risk model code (GERMcode), relativistic ion tracks (RITRACKS), NASA radiation track image (NASARTI), and the On-Line Tool for the Assessment of Radiation in Space (OLTARIS). This session will introduce the components of the risk toolkit with an opportunity for hands-on demonstrations. Brief descriptions of each tool are: ARRBOD for organ dose projection and acute radiation risk calculation from exposure to a solar particle event; NSCR for projection of cancer risk from exposure to space radiation; HemoDose for retrospective dose estimation using multi-type blood cell counts; GERMcode for basic physical and biophysical properties of an ion beam, and biophysical and radiobiological properties of beam transport to the target in the NASA Space Radiation Laboratory beam line; RITRACKS for simulation of heavy ion and delta-ray track structure, radiation chemistry, DNA structure and DNA damage at the molecular scale; NASARTI for modeling the effects of space radiation on human cells and tissue by incorporating a physical model of tracks, cell nucleus, and DNA damage foci with image segmentation for automated counting; and OLTARIS, an integrated tool set utilizing HZETRN (High Charge and Energy Transport) intended to help scientists and engineers study the effects of space radiation on shielding materials, electronics, and biological systems.

  16. A simulation model for risk assessment of turbine wheels

    NASA Technical Reports Server (NTRS)

    Safie, Fayssal M.; Hage, Richard T.

    1991-01-01

    A simulation model has been successfully developed to evaluate the risk of the Space Shuttle auxiliary power unit (APU) turbine wheels for a specific inspection policy. Besides being an effective tool for risk/reliability evaluation, the simulation model also allows the analyst to study the trade-offs between wheel reliability, wheel life, inspection interval, and rejection crack size. For example, in the APU application, sensitivity analysis results showed that the wheel life limit has the least effect on wheel reliability when compared to the effect of the inspection interval and the rejection crack size. In summary, the simulation model developed represents a flexible tool to predict turbine wheel reliability and study the risk under different inspection policies.

  17. The use of ecosystem models in risk assessment

    SciTech Connect

    Starodub, M.E.; Miller, P.A.; Willes, R.F.

    1994-12-31

    Ecosystem models, when used in conjunction with available environmental effects monitoring data, enable informed decisions regarding actions that should be taken to manage ecological risks from areas of localized chemical loadings and accumulation. These models provide quantitative estimates of chemical concentrations in various environmental media. The reliable application of these models as predictive tools for environmental assessment requires a thorough understanding of the theory and mathematical relationships described by the models and demands rigorous validation of input data and model results with field and laboratory data. Food chain model selection should be based on the ability to best simulate the interactions of the food web and the processes governing the transfer of chemicals from the dissolved and particulate phase to various trophic levels for the site in question. This requires that the user be familiar with the theories on which these models are based, and be aware of the merits and shortcomings of each prior to attempting to model food chain accumulation. Questions to be asked include: are all potential exposure pathways addressed? are omitted pathways critical to the risk assessment process? is the model flexible? To answer these questions one must consider the chemical(s) of concern, site-specific ecosystem characteristics, risk assessment receptor (aquatic, wildlife, human) dietary habits, and the influence of effluent characteristics on food chain dynamics.

  18. A new model for polluted soil risk assessment

    NASA Astrophysics Data System (ADS)

    Andretta, M.; Serra, R.; Villani, M.

    2006-08-01

    In this paper, we discuss the most important theoretical aspects of polluted soil risk assessment methodologies, which have been developed in order to evaluate the risk, for the exposed people, connected with the residual contaminant concentration in polluted soil, and we give a short presentation of the main kinds of risk assessment methodologies. We also underline the relevant role played, in this kind of analysis, by pollutant transport models. We then describe a new and innovative model, based on the general framework of the so-called Cellular Automata (CA), initially developed in the UE-Esprit Project COLOMBO for the simulation of bioremediation processes. These kinds of models, owing to their intrinsic "finite and discrete" characteristics, seem very well suited for a detailed analysis of the shape of the pollutant sources, the contaminant fates, and the evaluation of targets in the risk assessment. In particular, we describe the future research activities we are going to develop in the area of a tight integration between pollutant fate and transport models and risk analysis methodologies.
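
    A toy one-dimensional cellular automaton in this spirit (all rates invented; the COLOMBO model is far richer): pollutant mass redistributes to neighbouring cells each step and decays by a first-order degradation term, so the footprint of a localized source can be tracked cell by cell.

```python
# 1-D cellular automaton: local transport plus first-order degradation.
import numpy as np

cells = np.zeros(50)
cells[10] = 100.0                 # localized pollutant source
diffusion, decay = 0.2, 0.01      # per-step fractions (assumed)

for _ in range(200):
    outflow = diffusion * cells   # fraction sent to each neighbour
    new = cells - 2 * outflow
    new[1:] += outflow[:-1]       # mass moving right
    new[:-1] += outflow[1:]       # mass moving left (edges lose mass)
    cells = new * (1.0 - decay)   # degradation / bioremediation

print(f"Remaining mass: {cells.sum():.1f}, peak at cell {cells.argmax()}")
```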

  19. Physics-based Entry, Descent and Landing Risk Model

    NASA Technical Reports Server (NTRS)

    Gee, Ken; Huynh, Loc C.; Manning, Ted

    2014-01-01

    A physics-based risk model was developed to assess the risk associated with thermal protection system (TPS) failures during the entry, descent and landing phase of a manned spacecraft mission. In the model, entry trajectories were computed using a three-degree-of-freedom trajectory tool, the aerothermodynamic heating environment was computed using an engineering-level computational tool and the thermal response of the TPS material was modeled using a one-dimensional thermal response tool. The model was capable of modeling the effect of micrometeoroid and orbital debris (MMOD) impact damage on the TPS thermal response. A Monte Carlo analysis was used to determine the effects of uncertainties in the vehicle state at Entry Interface, aerothermodynamic heating and material properties on the performance of the TPS design. The failure criterion was set as a temperature limit at the bondline between the TPS and the underlying structure. Both direct computation and response surface approaches were used to compute the risk. The model was applied to a generic manned space capsule design. The effects of material property uncertainty and MMOD damage on the risk of failure were analyzed. A comparison of the direct computation and response surface approaches was undertaken.
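
    A compressed Monte Carlo sketch of the uncertainty propagation described above, with the bondline temperature computed from a hypothetical linear surrogate rather than the trajectory/aerothermal/thermal-response toolchain; all distributions and coefficients are illustrative:

        import numpy as np

        rng = np.random.default_rng(0)
        n = 200_000

        heat_load = rng.normal(1.0, 0.08, n)     # normalized aerothermal load
        conductivity = rng.normal(1.0, 0.05, n)  # normalized TPS conductivity
        thickness = rng.normal(1.0, 0.03, n)     # normalized TPS thickness

        t_bond = 500.0 * heat_load * conductivity / thickness  # surrogate, deg F
        limit = 560.0                                          # bondline limit

        print(f"P(bondline over-temperature) ~ {np.mean(t_bond > limit):.4f}")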

  20. Evaluating biomarkers to model cancer risk post cosmic ray exposure

    NASA Astrophysics Data System (ADS)

    Sridharan, Deepa M.; Asaithamby, Aroumougame; Blattnig, Steve R.; Costes, Sylvain V.; Doetsch, Paul W.; Dynan, William S.; Hahnfeldt, Philip; Hlatky, Lynn; Kidane, Yared; Kronenberg, Amy; Naidu, Mamta D.; Peterson, Leif E.; Plante, Ianik; Ponomarev, Artem L.; Saha, Janapriya; Snijders, Antoine M.; Srinivasan, Kalayarasan; Tang, Jonathan; Werner, Erica; Pluth, Janice M.

    2016-06-01

    Robust predictive models are essential to manage the risk of radiation-induced carcinogenesis. Chronic exposure to cosmic rays in the context of the complex deep space environment may place astronauts at high cancer risk. To estimate this risk, it is critical to understand how radiation-induced cellular stress impacts cell fate decisions and how this in turn alters the risk of carcinogenesis. Exposure to the heavy ion component of cosmic rays triggers a multitude of cellular changes, depending on the rate of exposure, the type of damage incurred and individual susceptibility. Heterogeneity in dose, dose rate, radiation quality, energy and particle flux contribute to the complexity of risk assessment. To unravel the impact of each of these factors, it is critical to identify sensitive biomarkers that can serve as inputs for robust modeling of individual risk of cancer or other long-term health consequences of exposure. Limitations in sensitivity of biomarkers to dose and dose rate, and the complexity of longitudinal monitoring, are some of the factors that increase uncertainties in the output from risk prediction models. Here, we critically evaluate candidate early and late biomarkers of radiation exposure and discuss their usefulness in predicting cell fate decisions. Some of the biomarkers we have reviewed include complex clustered DNA damage, persistent DNA repair foci, reactive oxygen species, chromosome aberrations and inflammation. Other biomarkers discussed, often assayed for at longer points post exposure, include mutations, chromosome aberrations, reactive oxygen species and telomere length changes. We discuss the relationship of biomarkers to different potential cell fates, including proliferation, apoptosis, senescence, and loss of stemness, which can propagate genomic instability and alter tissue composition and the underlying mRNA signatures that contribute to cell fate decisions. Our goal is to highlight factors that are important in choosing

  3. Are Masking-Based Models of Risk Useful?

    PubMed

    Gisiner, Robert C

    2016-01-01

    As our understanding of directly observable effects from anthropogenic sound exposure has improved, concern about "unobservable" effects such as stress and masking have received greater attention. Equal energy models of masking such as power spectrum models have the appeal of simplicity, but do they offer biologically realistic assessments of the risk of masking? Data relevant to masking such as critical ratios, critical bandwidths, temporal resolution, and directional resolution along with what is known about general mammalian antimasking mechanisms all argue for a much more complicated view of masking when making decisions about the risk of masking inherent in a given anthropogenic sound exposure scenario. PMID:26610979
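
    For reference, the power spectrum model the abstract questions reduces to a one-line criterion: a tone is just detectable when its level exceeds the noise spectrum level at that frequency by the critical ratio. A sketch with illustrative numbers:

        noise_spectrum_level = 70.0  # dB re 1 uPa^2/Hz at the signal frequency
        critical_ratio = 20.0        # dB; species- and frequency-specific
        signal_level = 95.0          # dB re 1 uPa, received tone level

        detection_threshold = noise_spectrum_level + critical_ratio
        print("masked" if signal_level < detection_threshold else "audible")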

  4. A dynamical systems model for nuclear power plant risk

    NASA Astrophysics Data System (ADS)

    Hess, Stephen Michael

    The recent transition to an open access generation marketplace has forced nuclear plant operators to become much more cost conscious and focused on plant performance. Coincidentally, the regulatory perspective also is in a state of transition from a command and control framework to one that is risk-informed and performance-based. Due to these structural changes in the economics and regulatory system associated with commercial nuclear power plant operation, there is an increased need for plant management to explicitly manage nuclear safety risk. Application of probabilistic risk assessment techniques to model plant hardware has provided a significant contribution to understanding the potential initiating events and equipment failures that can lead to core damage accidents. Application of the lessons learned from these analyses has supported improved plant operation and safety over the previous decade. However, this analytical approach has not been nearly as successful in addressing the impact of plant processes and management effectiveness on the risks of plant operation. Thus, the research described in this dissertation presents a different approach to address this issue. Here we propose a dynamical model that describes the interaction of important plant processes among themselves and their overall impact on nuclear safety risk. We first provide a review of the techniques that are applied in a conventional probabilistic risk assessment of commercially operating nuclear power plants and summarize the typical results obtained. The limitations of the conventional approach and the status of research previously performed to address these limitations also are presented. Next, we present the case for the application of an alternative approach using dynamical systems theory. This includes a discussion of previous applications of dynamical models to study other important socio-economic issues. Next, we review the analytical techniques that are applicable to analysis of

  5. Analysis of Radiation Pneumonitis Risk Using a Generalized Lyman Model

    SciTech Connect

    Tucker, Susan L.; Liu, H. Helen; Liao, Zhongxing; Wei, Xiong; Wang, Shulian; Jin, Hekun; Komaki, Ritsuko; Martel, Mary K.; Mohan, Radhe

    2008-10-01

    Purpose: To introduce a version of the Lyman normal-tissue complication probability (NTCP) model adapted to incorporate censored time-to-toxicity data and clinical risk factors and to apply the generalized model to analysis of radiation pneumonitis (RP) risk. Methods and Materials: Medical records and radiation treatment plans were reviewed retrospectively for 576 patients with non-small cell lung cancer treated with radiotherapy. The time to severe (Grade ≥3) RP was computed, with event times censored at last follow-up for patients not experiencing this endpoint. The censored time-to-toxicity data were analyzed using the standard and generalized Lyman models with patient smoking status taken into account. Results: The generalized Lyman model with patient smoking status taken into account produced NTCP estimates up to 27 percentage points different from the model based on dose-volume factors alone. The generalized model also predicted that 8% of the expected cases of severe RP were unobserved because of censoring. The estimated volume parameter for lung was not significantly different from n = 1, corresponding to mean lung dose. Conclusions: NTCP models historically have been based solely on dose-volume effects and binary (yes/no) toxicity data. Our results demonstrate that inclusion of nondosimetric risk factors and censored time-to-event data can markedly affect outcome predictions made using NTCP models.
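
    A sketch of the standard Lyman NTCP computation that the generalized model extends; with volume parameter n = 1 the generalized equivalent uniform dose reduces to the mean lung dose, as found above. The TD50 and m values are illustrative, not the study's fitted parameters:

        from math import erf, sqrt

        def lyman_ntcp(dvh, n=1.0, td50=30.8, m=0.37):
            """NTCP from a DVH given as [(dose_Gy, fractional_volume), ...]."""
            geud = sum(v * d ** (1.0 / n) for d, v in dvh) ** n
            t = (geud - td50) / (m * td50)
            return 0.5 * (1.0 + erf(t / sqrt(2.0)))   # standard normal CDF

        dvh = [(5.0, 0.4), (15.0, 0.3), (25.0, 0.2), (40.0, 0.1)]
        print(f"NTCP = {lyman_ntcp(dvh):.3f}")        # n = 1: mean lung dose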

  6. A Model for Risk Analysis of Oil Tankers

    NASA Astrophysics Data System (ADS)

    Montewka, Jakub; Krata, Przemysław; Goerlandt, Floris; Kujala, Pentti

    2010-01-01

    The paper presents a model for risk analysis of marine traffic, with emphasis on the two most common types of marine accidents: collision and grounding. The focus is on oil tankers, as these pose the highest environmental risk. A case study in selected areas of the Gulf of Finland in ice-free conditions is presented. The model utilizes a well-founded formula for risk calculation, which combines the probability of an unwanted event with its consequences. The model is thus a block-type model, consisting of blocks estimating the probability of collision and grounding, respectively, as well as blocks modelling the consequences of an accident. The probability of a collision is assessed by means of a Minimum Distance To Collision (MDTC) based model. The model defines the collision zone in a novel way, using a mathematical ship motion model, and recognizes traffic flow as a non-homogeneous process. The presented calculations address the waterway crossing between Helsinki and Tallinn, where dense cross traffic is observed during certain hours. For assessment of grounding probability, a new approach is proposed, which utilizes a newly developed model where spatial interactions between objects in different locations are recognized. A ship on a seaway and navigational obstructions may be perceived as interacting objects, and their repulsion may be modelled by a deterministic formulation. Risk due to tankers running aground is addressed for an approach fairway to an oil terminal in Sköldvik, near Helsinki. The consequences of an accident are expressed in monetary terms and concern the costs of an oil spill, based on statistics of compensation claimed from the International Oil Pollution Compensation Funds (IOPC Funds) by the parties involved.
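
    The "well-founded formula" is the classical expectation risk = probability x consequence, summed over accident types. A block-type sketch with invented frequencies and spill costs, standing in for the MDTC and grounding blocks:

        scenarios = {
            #             P(event per year)   expected cost if it occurs (EUR)
            "collision":  (2.0e-3,            40e6),
            "grounding":  (1.2e-3,            25e6),
        }

        for name, (p, c) in scenarios.items():
            print(f"{name:10s}: {p * c:12,.0f} EUR/year")
        total = sum(p * c for p, c in scenarios.values())
        print(f"{'total':10s}: {total:12,.0f} EUR/year")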

  7. Gambler Risk Perception: A Mental Model and Grounded Theory Analysis.

    PubMed

    Spurrier, Michael; Blaszczynski, Alexander; Rhodes, Paul

    2015-09-01

    Few studies have investigated how gamblers perceive risk or the role of risk perception in disordered gambling. The purpose of the current study therefore was to obtain data on lay gamblers' beliefs on these variables and their effects on decision-making, behaviour, and disordered gambling aetiology. Fifteen regular lay gamblers (non-problem/low risk, moderate risk and problem gamblers) completed a semi-structured interview following mental models and grounded theory methodologies. Gambler interview data was compared to an expert 'map' of risk-perception, to identify comparative gaps or differences associated with harmful or safe gambling. Systematic overlapping processes of data gathering and analysis were used to iteratively extend, saturate, test for exception, and verify concepts and themes emerging from the data. The preliminary findings suggested that gambler accounts supported the presence of expert conceptual constructs, and to some degree the role of risk perception in protecting against or increasing vulnerability to harm and disordered gambling. Gambler accounts of causality, meaning, motivation, and strategy were highly idiosyncratic, and often contained content inconsistent with measures of disordered gambling. Disordered gambling appears heavily influenced by relative underestimation of risk and overvaluation of gambling, based on explicit and implicit analysis, and deliberate, innate, contextual, and learned processing evaluations and biases. PMID:24402720

  8. Assessing Landslide Risk Areas Using Statistical Models and Land Cover

    NASA Astrophysics Data System (ADS)

    Kim, H. G.; Lee, D. K.; Park, C.; Ahn, Y.; Sung, S.; Park, J. H.

    2015-12-01

    Recently, damage due to landslides has increased in the Republic of Korea. Extreme weather events related to climate change, such as typhoons and heavy rainfall, are the main drivers of the damage. Inje-gun, Gangwon-do, in particular, suffered severe landslide damage in 2006 and 2007. Forest covers 91% of Inje-gun, so many land covers related to human activities are adjacent to forest land, and the establishment of adaptation plans for landslides was urgently needed. Landslide risk assessment can provide useful information to policy makers. The objective of this study was to assess landslide risk areas to support the establishment of adaptation plans that reduce landslide damage. Statistical distribution models (SDMs) were used to evaluate the probability of landslide occurrence, and several SDMs were combined to make landslide probability maps that reflect the uncertainty among models. Land-cover types were classified into 5 grades according to their vulnerability to landslides. The landslide probability maps were overlaid with the land cover map to calculate landslide risk. The overlay analysis yielded landslide risk areas; agricultural and transportation areas in particular showed high risk over large areas of the risk map. In conclusion, policy makers in Inje-gun should consider the landslide risk map when establishing adaptation plans.
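
    A raster sketch of the overlay step: multiply an SDM occurrence-probability surface by a 1-5 land-cover vulnerability grade and flag the top cells. Random rasters stand in for the SDM and land cover maps:

        import numpy as np

        rng = np.random.default_rng(3)
        prob = rng.random((100, 100))                # SDM landslide probability
        grade = rng.integers(1, 6, size=(100, 100))  # vulnerability grade 1-5

        risk = prob * grade
        high_risk = risk > np.quantile(risk, 0.9)    # top decile as "high risk"
        print(f"high-risk cells: {high_risk.sum()} of {risk.size}")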

  9. Risk assessment models for cancer-associated venous thromboembolism.

    PubMed

    Dutia, Mrinal; White, Richard H; Wun, Ted

    2012-07-15

    Venous thromboembolism (VTE) is common in cancer patients, and is associated with significant morbidity and mortality. Several factors, including procoagulant agents secreted by tumor cells, immobilization, surgery, indwelling catheters, and systemic treatment (including chemotherapy), contribute to an increased risk of VTE in cancer patients. There is growing interest in instituting primary prophylaxis in high-risk patients to prevent incident (first-time) VTE events. The identification of patients at sufficiently high risk of VTE to warrant primary thromboprophylaxis is essential, as anticoagulation may be associated with a higher risk of bleeding. Current guidelines recommend the use of pharmacological thromboprophylaxis in postoperative and hospitalized cancer patients, as well as ambulatory cancer patients receiving thalidomide or lenalidomide in combination with high-dose dexamethasone or chemotherapy, in the absence of contraindications to anticoagulation. However, the majority of cancer patients are ambulatory, and currently primary thromboprophylaxis is not recommended for these patients, even those considered at very high risk. In this concise review, the authors discuss risk stratification models that have been specifically developed to identify cancer patients at high risk for VTE, and thus might be useful in future studies designed to determine the potential benefit of primary thromboprophylaxis.

  10. A critical evaluation of secondary cancer risk models applied to Monte Carlo dose distributions of 2-dimensional, 3-dimensional conformal and hybrid intensity-modulated radiation therapy for breast cancer

    NASA Astrophysics Data System (ADS)

    Joosten, A.; Bochud, F.; Moeckli, R.

    2014-08-01

    The comparison of radiotherapy techniques regarding secondary cancer risk has yielded contradictory results, possibly stemming from the many different approaches used to estimate risk. The purpose of this study was to make a comprehensive evaluation of different available risk models applied to detailed whole-body dose distributions computed by Monte Carlo for various breast radiotherapy techniques, including conventional open tangents, 3D conformal wedged tangents and hybrid intensity-modulated radiation therapy (IMRT). First, organ-specific linear risk models developed by the International Commission on Radiological Protection (ICRP) and the Biological Effects of Ionizing Radiation (BEIR) VII committee were applied to mean doses for remote organs only and for all solid organs. Then, different general non-linear risk models were applied to the whole-body dose distribution. Finally, organ-specific non-linear risk models for the lung and breast were used to assess the secondary cancer risk for these two specific organs. A total of 32 different calculated absolute risks resulted in a broad range of values (between 0.1% and 48.5%), underlining the large uncertainties in absolute risk calculation. The ratio of risk between two techniques has often been proposed as a more robust assessment of risk than the absolute risk. We found that the ratio of risk between two techniques could also vary substantially with the different approaches to risk estimation. Sometimes the ratio of risk between two techniques would range between values smaller and larger than one, which translates into inconsistent results on the potential higher risk of one technique compared to another. We found, however, that the hybrid IMRT technique resulted in a systematic reduction of risk compared to the other techniques investigated, even though the magnitude of this reduction varied substantially with the different approaches investigated. Based on the epidemiological data available, a reasonable
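
    In miniature, two of the approaches compared above: an organ-specific linear model (mean organ dose times a risk coefficient) and an organ equivalent dose (OED) with a linear-exponential dose-response. All doses, coefficients and the response parameter are illustrative placeholders, not ICRP/BEIR VII values:

        import numpy as np

        # Linear model: risk = sum over organs of mean dose x coefficient
        mean_dose = {"lung": 2.0, "contralateral_breast": 1.2}   # Gy
        coeff = {"lung": 0.005, "contralateral_breast": 0.004}   # per Gy
        linear_risk = sum(mean_dose[o] * coeff[o] for o in mean_dose)

        # OED: volume-weighted d * exp(-alpha * d) over the organ's DVH
        def oed(dose, volume, alpha=0.1):
            d, v = np.asarray(dose), np.asarray(volume)
            return np.sum(v * d * np.exp(-alpha * d)) / np.sum(v)

        lung_oed = oed([0.5, 1.0, 2.0, 5.0], [0.4, 0.3, 0.2, 0.1])
        print(f"linear-model risk = {linear_risk:.4f}, lung OED = {lung_oed:.2f} Gy")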

  12. Forewarning model for water pollution risk based on Bayes theory.

    PubMed

    Zhao, Jun; Jin, Juliang; Guo, Qizhong; Chen, Yaqian; Lu, Mengxiong; Tinoco, Luis

    2014-02-01

    In order to reduce the losses caused by water pollution, a forewarning model for water pollution risk based on Bayes theory was studied. The model is built upon risk indexes in complex systems, proceeding from the whole structure and its components. In this study, principal components analysis is used to screen the index system. A hydrological model is employed to simulate index values according to the prediction principle. Bayes theory is adopted to obtain the posterior distribution from the prior distribution and sample information, so that the sample's features better reflect and represent the population. The forewarning level is judged by the maximum probability rule, and management strategies suited to local conditions are then proposed, with the aim of reducing heavy warnings to a lesser degree. This study takes the Taihu Basin as an example. After application and verification of the forewarning model for water pollution risk from 2000 to 2009 against actual and simulated data, the forewarning level in 2010 is given as a severe warning, which coincides well with the logistic curve. It is shown that the model is rigorous in theory and flexible in method, reasonable in result and simple in structure, and it has strong logical superiority and regional adaptability, providing a new way to warn of water pollution risk.
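
    The core update is ordinary discrete Bayes: posterior proportional to prior times likelihood over the warning levels, with the level chosen by the maximum probability rule. A sketch with invented numbers:

        import numpy as np

        levels = ["none", "light", "moderate", "heavy", "severe"]
        prior = np.array([0.30, 0.25, 0.20, 0.15, 0.10])
        # Likelihood of the simulated index value under each warning level
        likelihood = np.array([0.02, 0.05, 0.15, 0.35, 0.43])

        posterior = prior * likelihood
        posterior /= posterior.sum()
        print(levels[int(np.argmax(posterior))], posterior.round(3))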

  13. Launch Vehicle Debris Models and Crew Vehicle Ascent Abort Risk

    NASA Technical Reports Server (NTRS)

    Gee, Ken; Lawrence, Scott

    2013-01-01

    For manned space launch systems, a reliable abort system is required to reduce the risks associated with a launch vehicle failure during ascent. Understanding the risks associated with failure environments can be achieved through the use of physics-based models of these environments. Debris fields due to destruction of the launch vehicle is one such environment. To better analyze the risk posed by debris, a physics-based model for generating launch vehicle debris catalogs has been developed. The model predicts the mass distribution of the debris field based on formulae developed from analysis of explosions. Imparted velocity distributions are computed using a shock-physics code to model the explosions within the launch vehicle. A comparison of the debris catalog with an existing catalog for the Shuttle external tank show good comparison in the debris characteristics and the predicted debris strike probability. The model is used to analyze the effects of number of debris pieces and velocity distributions on the strike probability and risk.
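
    A toy version of how such a catalog feeds a strike-probability estimate: sample fragment masses and imparted velocities from assumed distributions and count fragments that are both hazardous and on a crossing path. The lognormal/Rayleigh choices and the 2% geometry factor are invented for illustration:

        import numpy as np

        rng = np.random.default_rng(7)
        n = 100_000

        mass = rng.lognormal(mean=0.0, sigma=1.0, size=n)   # fragment mass, kg
        velocity = rng.rayleigh(scale=60.0, size=n)         # imparted speed, m/s
        crosses = rng.random(n) < 0.02                      # crosses abort path

        hazardous = (mass > 0.5) & (velocity > 50.0) & crosses
        print(f"strike probability per failure ~ {hazardous.mean():.5f}")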

  14. Achieving Climate Change Absolute Accuracy in Orbit

    NASA Technical Reports Server (NTRS)

    Wielicki, Bruce A.; Young, D. F.; Mlynczak, M. G.; Thome, K. J; Leroy, S.; Corliss, J.; Anderson, J. G.; Ao, C. O.; Bantges, R.; Best, F.; Bowman, K.; Brindley, H.; Butler, J. J.; Collins, W.; Dykema, J. A.; Doelling, D. R.; Feldman, D. R.; Fox, N.; Huang, X.; Holz, R.; Huang, Y.; Jennings, D.; Jin, Z.; Johnson, D. G.; Jucks, K.; Kato, S.; Kratz, D. P.; Liu, X.; Lukashin, C.; Mannucci, A. J.; Phojanamongkolkij, N.; Roithmayr, C. M.; Sandford, S.; Taylor, P. C.; Xiong, X.

    2013-01-01

    The Climate Absolute Radiance and Refractivity Observatory (CLARREO) mission will provide a calibration laboratory in orbit for the purpose of accurately measuring and attributing climate change. CLARREO measurements establish new climate change benchmarks with high absolute radiometric accuracy and high statistical confidence across a wide range of essential climate variables. CLARREO's inherently high absolute accuracy will be verified and traceable on orbit to Système Internationale (SI) units. The benchmarks established by CLARREO will be critical for assessing changes in the Earth system and climate model predictive capabilities for decades into the future as society works to meet the challenge of optimizing strategies for mitigating and adapting to climate change. The CLARREO benchmarks are derived from measurements of the Earth's thermal infrared spectrum (5-50 micron), the spectrum of solar radiation reflected by the Earth and its atmosphere (320-2300 nm), and radio occultation refractivity from which accurate temperature profiles are derived. The mission has the ability to provide new spectral fingerprints of climate change, as well as to provide the first orbiting radiometer with accuracy sufficient to serve as the reference transfer standard for other space sensors, in essence serving as a "NIST [National Institute of Standards and Technology] in orbit." CLARREO will greatly improve the accuracy and relevance of a wide range of space-borne instruments for decadal climate change. Finally, CLARREO has developed new metrics and methods for determining the accuracy requirements of climate observations for a wide range of climate variables and uncertainty sources. These methods should be useful for improving our understanding of observing requirements for most climate change observations.

  15. Modeling secondary cancer risk following paediatric radiotherapy: a comparison of intensity modulated proton therapy and photon therapy

    NASA Astrophysics Data System (ADS)

    Shin, Naomi

    Proton radiotherapy is known to reduce the radiation dose delivered to normal healthy tissue compared to photon techniques. The increase in normal tissue sparing could result in fewer acute and late effects from radiation therapy. In this work, proton therapy plans were created for patients previously treated using photon therapy. Intensity-modulated proton therapy (IMPT) plans were produced by inverse planning in Varian's Eclipse treatment planning system with a scanning proton beam model, to the same relative biological effectiveness (RBE)-weighted prescription dose as the photon plan. Proton and photon plans were compared for target dose conformity and homogeneity, body volumes receiving 2 Gy and 5 Gy, integral dose, dose to normal tissues and second cancer risk. Secondary cancer risk was determined using two methods. The relative risk of secondary cancer was found using the method described by Nguyen et al., which applies a linear relationship between integral dose and relative risk of secondary cancer. The second approach used Schneider et al.'s organ equivalent dose concept to describe the dose in the body and then calculate the excess absolute risk and cumulative risk for solid cancers in the body. IMPT and photon plans had similar target conformity and homogeneity. However, IMPT plans had reduced integral dose and reduced volumes of the body receiving low dose. Overall, the risk of radiation-induced secondary cancer was lower for IMPT plans compared to the corresponding photon plans, with a reduction of ~36% using the integral dose model and ~50% using the organ equivalent dose model.

  16. Development of a GCR Event-based Risk Model

    NASA Technical Reports Server (NTRS)

    Cucinotta, Francis A.; Ponomarev, Artem L.; Plante, Ianik; Carra, Claudio; Kim, Myung-Hee

    2009-01-01

    A goal at NASA is to develop event-based systems biology models of space radiation risks that will replace the current dose-based empirical models. Complex and varied biochemical signaling processes transmit the initial DNA and oxidative damage from space radiation into cellular and tissue responses. Mis-repaired damage or aberrant signals can lead to genomic instability, persistent oxidative stress or inflammation, which are causative of cancer and CNS risks. Protective signaling through adaptive responses or cell repopulation is also possible. We are developing a computational simulation approach to galactic cosmic ray (GCR) effects that is based on biological events rather than average quantities such as dose, fluence, or dose equivalent. The goal of the GCR Event-based Risk Model (GERMcode) is to provide a simulation tool to describe and integrate physical and biological events into stochastic models of space radiation risks. We used the quantum multiple scattering model of heavy ion fragmentation (QMSFRG) and well known energy loss processes to develop a stochastic Monte-Carlo based model of GCR transport in spacecraft shielding and tissue. We validated the accuracy of the model by comparing to physical data from the NASA Space Radiation Laboratory (NSRL). Our simulation approach allows us to time-tag each GCR proton or heavy ion interaction in tissue including correlated secondary ions often of high multiplicity. Conventional space radiation risk assessment employs average quantities, and assumes linearity and additivity of responses over the complete range of GCR charge and energies. To investigate possible deviations from these assumptions, we studied several biological response pathway models of varying induction and relaxation times including the ATM, TGF-Smad, and WNT signaling pathways. We then considered small volumes of interacting cells and the time-dependent biophysical events that the GCR would produce within these tissue volumes to estimate how

  17. Flexible regression models for rate differences, risk differences and relative risks.

    PubMed

    Donoghoe, Mark W; Marschner, Ian C

    2015-05-01

    Generalized additive models (GAMs) based on the binomial and Poisson distributions can be used to provide flexible semi-parametric modelling of binary and count outcomes. When used with the canonical link function, these GAMs provide semi-parametrically adjusted odds ratios and rate ratios. For adjustment of other effect measures, including rate differences, risk differences and relative risks, non-canonical link functions must be used together with a constrained parameter space. However, the algorithms used to fit these models typically rely on a form of the iteratively reweighted least squares algorithm, which can be numerically unstable when a constrained non-canonical model is used. We describe an application of a combinatorial EM algorithm to fit identity link Poisson, identity link binomial and log link binomial GAMs in order to estimate semi-parametrically adjusted rate differences, risk differences and relative risks. Using smooth regression functions based on B-splines, the method provides stable convergence to the maximum likelihood estimates, and it ensures that the estimates always remain within the parameter space. It is also straightforward to apply a monotonicity constraint to the smooth regression functions. We illustrate the method using data from a clinical trial in heart attack patients. PMID:25781711
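
    The paper's combinatorial EM algorithm with B-spline smoothers is beyond a few lines, but the estimand is easy to show: an identity-link binomial model whose coefficients are adjusted risk differences, fitted here by constrained maximum likelihood on simulated data as a simplified stand-in:

        import numpy as np
        from scipy.optimize import minimize

        rng = np.random.default_rng(5)
        X = np.column_stack([np.ones(500),
                             rng.integers(0, 2, 500),    # treatment indicator
                             rng.integers(0, 2, 500)])   # binary covariate
        beta_true = np.array([0.10, 0.15, 0.05])  # baseline risk + differences
        y = rng.random(500) < X @ beta_true

        def negloglik(beta):
            p = np.clip(X @ beta, 1e-9, 1 - 1e-9)  # keep risks inside (0, 1)
            return -np.sum(y * np.log(p) + (~y) * np.log1p(-p))

        fit = minimize(negloglik, x0=[0.2, 0.0, 0.0], method="Nelder-Mead")
        print("estimated risk differences:", fit.x.round(3))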

  18. There's risk, and then there's risk: The latest clinical prognostic risk stratification models in myelodysplastic syndromes.

    PubMed

    Zeidan, Amer M; Komrokji, Rami S

    2013-12-01

    Myelodysplastic syndromes (MDS) include a diverse group of clonal hematopoietic disorders characterized by progressive cytopenias and propensity for leukemic progression. The biologic heterogeneity that underlies MDS translates clinically in wide variations of clinical outcomes. Several prognostic schemes were developed to predict the natural course of MDS, counsel patients, and allow evidence-based, risk-adaptive implementation of therapeutic strategies. The prognostic schemes divide patients into subgroups with similar prognosis, but the extent to which the prognostic prediction applies to any individual patient is more variable. None of these instruments was designed to predict the clinical benefit in relation to any specific MDS therapy. The prognostic impact of molecular mutations is being more recognized and attempts at incorporating it into the current prognostic schemes are ongoing.

  19. A risk management model for securing virtual healthcare communities.

    PubMed

    Chryssanthou, Anargyros; Varlamis, Iraklis; Latsiou, Charikleia

    2011-01-01

    Virtual healthcare communities aim to bring together healthcare professionals and patients, improve the quality of healthcare services and assist healthcare professionals and researchers in their everyday activities. In a secure and reliable environment, patients share their medical data with doctors, expect confidentiality and demand reliable medical consultation. Apart from a concrete policy framework, several ethical, legal and technical issues must be considered in order to build a trustful community. This research emphasises security issues, which can arise inside a virtual healthcare community and relate to the communication and storage of data. It capitalises on a standardised risk management methodology and a prototype architecture for healthcare community portals, and justifies a security model that allows the identification, estimation and evaluation of potential security risks for the community. A hypothetical virtual healthcare community is employed in order to portray security risks and the solutions that the security model provides.

  20. Source apportionment of ambient non-methane hydrocarbons in Hong Kong: application of a principal component analysis/absolute principal component scores (PCA/APCS) receptor model.

    PubMed

    Guo, H; Wang, T; Louie, P K K

    2004-06-01

    Receptor-oriented source apportionment models are often used to identify sources of ambient air pollutants and to estimate source contributions to air pollutant concentrations. In this study, a PCA/APCS model was applied to data on non-methane hydrocarbons (NMHCs) measured from January to December 2001 at two sampling sites: the Tsuen Wan (TW) and Central & Western (CW) Toxic Air Pollutants Monitoring Stations in Hong Kong. This multivariate method enables the identification of major air pollution sources along with the quantitative apportionment of each source to pollutant species. The PCA analysis identified four major pollution sources at the TW site and five major sources at the CW site. The extracted pollution sources included vehicular internal engine combustion with unburned fuel emissions, use of solvents (particularly paints), liquefied petroleum gas (LPG) or natural gas leakage, and industrial, commercial and domestic sources such as solvents, decoration, fuel combustion, chemical factories and power plants. The results of the APCS receptor model indicated that 39% and 48% of the total NMHC mass concentrations measured at CW and TW, respectively, originated from vehicle emissions; 32% and 36.4% of the total NMHCs were emitted from solvent use, and 11% and 19.4% were apportioned to LPG or natural gas leakage, respectively. 5.2% and 9% of the total NMHC mass concentrations were attributed to other industrial, commercial and domestic sources, respectively. It was also found that vehicle emissions and LPG or natural gas leakage were the main sources of C3-C5 alkanes and C3-C5 alkenes, while aromatics were predominantly released from paints. Comparison of source contributions to ambient NMHCs at the two sites indicated that the contribution of LPG or natural gas at the CW site was almost twice that at the TW site. High correlation coefficients (R² > 0.8) between the measured and predicted values suggested that the PCA/APCS model was applicable for estimation
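
    A compact sketch of the PCA/APCS recipe on synthetic data: PCA on standardized species concentrations, conversion to absolute principal component scores via an artificial zero-concentration sample, then regression of total NMHC on those scores to apportion source contributions:

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.linear_model import LinearRegression

        rng = np.random.default_rng(2)
        X = rng.lognormal(size=(300, 12))        # samples x NMHC species
        total = X.sum(axis=1)

        mu, sd = X.mean(0), X.std(0)
        pca = PCA(n_components=4).fit((X - mu) / sd)
        scores = pca.transform((X - mu) / sd)

        zero = pca.transform(((np.zeros(12) - mu) / sd)[None, :])
        apcs = scores - zero                     # absolute PC scores

        reg = LinearRegression().fit(apcs, total)
        share = reg.coef_ * apcs.mean(0) / total.mean()
        print("mean source shares:", share.round(3))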

  2. Modeling intelligent adversaries for terrorism risk assessment: some necessary conditions for adversary models.

    PubMed

    Guikema, Seth

    2012-07-01

    Intelligent adversary modeling has become increasingly important for risk analysis, and a number of different approaches have been proposed for incorporating intelligent adversaries in risk analysis models. However, these approaches are based on a range of often-implicit assumptions about the desirable properties of intelligent adversary models. This "Perspective" paper aims to further risk analysis for situations involving intelligent adversaries by fostering a discussion of the desirable properties for these models. A set of four basic necessary conditions for intelligent adversary models is proposed and discussed. These are: (1) behavioral accuracy to the degree possible, (2) computational tractability to support decision making, (3) explicit consideration of uncertainty, and (4) ability to gain confidence in the model. It is hoped that these suggested necessary conditions foster discussion about the goals and assumptions underlying intelligent adversary modeling in risk analysis. PMID:22150359

  3. Modeling risk of occupational zoonotic influenza infection in swine workers.

    PubMed

    Paccha, Blanca; Jones, Rachael M; Gibbs, Shawn; Kane, Michael J; Torremorell, Montserrat; Neira-Ramirez, Victor; Rabinowitz, Peter M

    2016-08-01

    Zoonotic transmission of influenza A virus (IAV) between swine and workers in swine production facilities may play a role in the emergence of novel influenza strains with pandemic potential. Guidelines to prevent transmission of influenza to swine workers have been developed but there is a need for evidence-based decision-making about protective measures such as respiratory protection. A mathematical model was applied to estimate the risk of occupational IAV exposure to swine workers by contact and airborne transmission, and to evaluate the use of respirators to reduce transmission.  The Markov model was used to simulate the transport and exposure of workers to IAV in a swine facility. A dose-response function was used to estimate the risk of infection. This approach is similar to methods previously used to estimate the risk of infection in human health care settings. This study uses concentration of virus in air from field measurements collected during outbreaks of influenza in commercial swine facilities, and analyzed by polymerase chain reaction.  It was found that spending 25 min working in a barn during an influenza outbreak in a swine herd could be sufficient to cause zoonotic infection in a worker. However, this risk estimate was sensitive to estimates of viral infectivity to humans. Wearing an excellent fitting N95 respirator reduced this risk, but with high aerosol levels the predicted risk of infection remained high under certain assumptions.  The results of this analysis indicate that under the conditions studied, swine workers are at risk of zoonotic influenza infection. The use of an N95 respirator could reduce such risk. These findings have implications for risk assessment and preventive programs targeting swine workers. The exact level of risk remains uncertain, since our model may have overestimated the viability or infectivity of IAV. Additionally, the potential for partial immunity in swine workers associated with repeated low
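
    The dose-response step can be sketched directly: inhaled dose is concentration x breathing rate x time, infection probability follows an exponential model, and the respirator enters as a dose-reduction factor. Every value below is an assumption for illustration, not a field measurement from the study:

        import numpy as np

        c_air = 1e4            # virus copies per m^3 of barn air
        breathing = 1.5 / 60   # m^3 per minute at a moderate work rate
        minutes = 25.0
        r = 5e-5               # per-copy infectivity to humans (very uncertain)

        dose = c_air * breathing * minutes
        for label, factor in [("no respirator", 1.0), ("well-fit N95", 0.05)]:
            p = 1.0 - np.exp(-r * dose * factor)
            print(f"{label:13s}: P(infection) = {p:.3f}")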

  4. Cosmology with negative absolute temperatures

    NASA Astrophysics Data System (ADS)

    Vieira, J. P. P.; Byrnes, Christian T.; Lewis, Antony

    2016-08-01

    Negative absolute temperatures (NAT) are an exotic thermodynamical consequence of quantum physics which has been known since the 1950s (having been achieved in the lab on a number of occasions). Recently, the work of Braun et al. [1] has rekindled interest in negative temperatures and hinted at a possibility of using NAT systems in the lab as dark energy analogues. This paper goes one step further, looking into the cosmological consequences of the existence of a NAT component in the Universe. NAT-dominated expanding Universes experience a borderline phantom expansion (w < -1) with no Big Rip, and their contracting counterparts are forced to bounce after the energy density becomes sufficiently large. Both scenarios might be used to solve the horizon and flatness problems analogously to standard inflation and bouncing cosmologies. We discuss the difficulties in obtaining and ending a NAT-dominated epoch, and possible ways of obtaining density perturbations with an acceptable spectrum.

  5. Apparatus for absolute pressure measurement

    NASA Technical Reports Server (NTRS)

    Hecht, R. (Inventor)

    1969-01-01

    An absolute pressure sensor (e.g., the diaphragm of a capacitance manometer) was subjected to a superimposed potential to effectively reduce the mechanical stiffness of the sensor. This substantially increases the sensitivity of the sensor and is particularly useful in vacuum gauges. An oscillating component of the superimposed potential induced vibrations of the sensor. The phase of these vibrations with respect to that of the oscillating component was monitored, and served to initiate an automatic adjustment of the static component of the superimposed potential, so as to bring the sensor into resonance at the frequency of the oscillating component. This establishes a selected sensitivity for the sensor, since a definite relationship exists between resonant frequency and sensitivity.

  7. A Dual System Model of Preferences under Risk

    ERIC Educational Resources Information Center

    Mukherjee, Kanchan

    2010-01-01

    This article presents a dual system model (DSM) of decision making under risk and uncertainty according to which the value of a gamble is a combination of the values assigned to it independently by the affective and deliberative systems. On the basis of research on dual process theories and empirical research in Hsee and Rottenstreich (2004) and…

  8. Field Evaluation of an Avian Risk Assessment Model

    EPA Science Inventory

    We conducted two laboratory subacute dietary toxicity tests and one outdoor subacute dietary toxicity test to determine the effectiveness of the U.S. Environmental Protection Agency's deterministic risk assessment model for evaluating the potential of adverse effects to birds in ...

  9. Surface Water Contamination Risk Assessment Modeled by Fuzzy-WRASTIC.

    PubMed

    Alavipoor, Fatemeh Sadat; Ghorbaninia, Zahra; Karimi, Saeed; Jafari, Hamidreza

    2016-07-01

    This research provides a new Fuzzy-WRASTIC model for water resource contamination risk assessment in a GIS (Geographic Information System) environment. First, the method, set in a multi-criteria evaluation (MCE) framework, reviewed and mapped the sub-criteria of each of the above-mentioned criteria. Then, the related sub-layers were fuzzified in observance of GIS environment standards. In the next step, the sub-layers were combined, and the pollution risk status was modeled using a fuzzy overlay method, applying the OR, AND, SUM, PRODUCT and GAMMA operators together with the WLC (Weighted Linear Combination) method and the weights provided in the WRASTIC model. The results provide the best combination for modeling; the risk categories low, medium, high and very high cover 1.8%, 14.07%, 51.43% and 32.7% of the area, respectively. Large areas are at severe risk owing to the unbalanced arrangement and compactness of land uses around the surface water resources. PMID:27329055
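
    The five overlay operators named above have standard fuzzy definitions, shown here on two random fuzzified layers:

        import numpy as np

        rng = np.random.default_rng(4)
        a = rng.random((50, 50))   # fuzzified sub-layer 1, membership in [0, 1]
        b = rng.random((50, 50))   # fuzzified sub-layer 2

        overlays = {
            "OR": np.maximum(a, b),
            "AND": np.minimum(a, b),
            "SUM": 1 - (1 - a) * (1 - b),
            "PRODUCT": a * b,
        }
        gamma = 0.9
        overlays["GAMMA"] = overlays["SUM"] ** gamma * overlays["PRODUCT"] ** (1 - gamma)

        for name, layer in overlays.items():
            print(f"{name:8s} mean membership = {layer.mean():.3f}")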

  10. Model of risk assessment under ballistic statistical tests

    NASA Astrophysics Data System (ADS)

    Gabrovski, Ivan; Karakaneva, Juliana

    The material presents the application of a mathematical method for risk assessment under statistical determination of the ballistic limits of protection equipment. The authors have implemented a mathematical model based on Pierson's criteria. The software implementation of the model allows evaluation of the V50 indicator and assessment of the reliability of the statistical hypotheses. The results supply specialists with interval estimates of the probability determined during the testing process.
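
    A simplified stand-in for the V50 estimation (the record's Pierson-criteria model is not reproduced here): fit a logistic penetration-probability curve to invented go/no-go test data and read off the velocity giving a 50% penetration probability:

        import numpy as np
        from scipy.optimize import curve_fit

        v = np.array([380, 400, 410, 420, 430, 440, 455, 470.0])  # m/s
        pen = np.array([0, 0, 1, 0, 1, 1, 1, 1.0])                # 1 = penetration

        def logistic(v, v50, k):
            return 1.0 / (1.0 + np.exp(-k * (v - v50)))

        (v50, k), _ = curve_fit(logistic, v, pen, p0=[425.0, 0.1])
        print(f"estimated V50 = {v50:.1f} m/s")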

  11. Modeling risk of pneumonia epizootics in bighorn sheep

    USGS Publications Warehouse

    Sells, Sarah N.; Mitchell, Michael S.; Nowak, J. Joshua; Lukacs, Paul M.; Anderson, Neil J.; Ramsey, Jennifer M.; Gude, Justin A.; Krausman, Paul R.

    2015-01-01

    Pneumonia epizootics are a major challenge for management of bighorn sheep (Ovis canadensis), affecting persistence of herds, satisfaction of stakeholders, and allocations of resources by management agencies. Risk factors associated with the disease are poorly understood, making pneumonia epizootics hard to predict; such epizootics are thus managed reactively rather than proactively. We developed a model for herds in Montana that identifies risk factors and addresses biological questions about risk. Using Bayesian logistic regression with repeated measures, we found that private land, weed control using domestic sheep or goats, pneumonia history, and herd density were positively associated with risk of pneumonia epizootics in 43 herds that experienced 22 epizootics out of 637 herd-years from 1979–2013. We defined an area of high risk for pathogen exposure as the area of each herd distribution plus a 14.5-km buffer from that boundary. Within this area, the odds of a pneumonia epizootic increased by >1.5 times per additional unit of private land (the unit is the standardized % of private land, where the global mean = 25.58% and SD = 14.53%). Odds were >3.3 times greater if domestic sheep or goats were used for weed control in a herd's area of high risk. If a herd or its neighbors within the area of high risk had a history of a pneumonia epizootic, odds of a subsequent pneumonia epizootic were >10 times greater. Risk greatly increased when herds were at high density, with nearly 15 times greater odds of a pneumonia epizootic compared to when herds were at low density. Odds of a pneumonia epizootic also appeared to decrease following increased spring precipitation (odds = 0.41 per unit increase, where the global mean = 100.18% and SD = 26.97%). Risk was not associated with number of federal sheep and goat allotments, proximity to nearest herds of bighorn sheep, ratio of rams to ewes, percentage of average winter precipitation, or whether herds were of native versus mixed

  12. USING COPULAS TO MODEL DEPENDENCE IN SIMULATION RISK ASSESSMENT

    SciTech Connect

    Dana L. Kelly

    2007-11-01

    Typical engineering systems in applications with high failure consequences such as nuclear reactor plants often employ redundancy and diversity of equipment in an effort to lower the probability of failure and therefore risk. However, it has long been recognized that dependencies exist in these redundant and diverse systems. Some dependencies, such as common sources of electrical power, are typically captured in the logic structure of the risk model. Others, usually referred to as intercomponent dependencies, are treated implicitly by introducing one or more statistical parameters into the model. Such common-cause failure models have limitations in a simulation environment. In addition, substantial subjectivity is associated with parameter estimation for these models. This paper describes an approach in which system performance is simulated by drawing samples from the joint distributions of dependent variables. The approach relies on the notion of a copula distribution, a notion which has been employed by the actuarial community for ten years or more, but which has seen only limited application in technological risk assessment. The paper also illustrates how equipment failure data can be used in a Bayesian framework to estimate the parameter values in the copula model. This approach avoids much of the subjectivity required to estimate parameters in traditional common-cause failure models. Simulation examples are presented for failures in time. The open-source software package R is used to perform the simulations. The open-source software package WinBUGS is used to perform the Bayesian inference via Markov chain Monte Carlo sampling.
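
    The copula idea in miniature (the paper works in R and WinBUGS; this Python sketch only illustrates the sampling step): draw correlated normals, map them to uniforms, and push the uniforms through the marginal failure-time distributions of two redundant components. The correlation and Weibull marginals are illustrative:

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(6)
        rho = 0.7
        z = rng.multivariate_normal([0.0, 0.0],
                                    [[1.0, rho], [rho, 1.0]], size=100_000)
        u = stats.norm.cdf(z)                    # correlated U(0, 1) pairs

        # Dependent failure times for two redundant pumps (years)
        t1 = stats.weibull_min.ppf(u[:, 0], c=1.5, scale=8.0)
        t2 = stats.weibull_min.ppf(u[:, 1], c=1.5, scale=8.0)

        mission = 2.0
        both = np.mean((t1 < mission) & (t2 < mission))
        print(f"P(both pumps fail within {mission} yr) = {both:.4f}")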

  13. Absolute Proper Motions of Southern Globular Clusters

    NASA Astrophysics Data System (ADS)

    Dinescu, D. I.; Girard, T. M.; van Altena, W. F.

    1996-05-01

    Our program involves the determination of absolute proper motions with respect to galaxies for a sample of globular clusters situated in the southern sky. The plates cover a 6° x 6° area and are taken with the 51-cm double astrograph at Cesco Observatory in El Leoncito, Argentina. We have developed special methods to deal with the modelling error of the plate transformation, and we correct for magnitude equation using the cluster stars. This careful astrometric treatment leads to accuracies of 0.5 to 1.0 mas/yr for the absolute proper motion of each cluster, depending primarily on the number of measurable cluster stars, which in turn is related to the cluster's distance. Space velocities are then derived which, in association with metallicities, provide key information for the formation scenario of the Galaxy, i.e. accretion and/or dissipational collapse. Here we present results for NGC 1851, NGC 6752, NGC 6584, NGC 6362 and NGC 288.

  14. Modeling of Radiation Risks for Human Space Missions

    NASA Technical Reports Server (NTRS)

    Fletcher, Graham

    2004-01-01

    Prior to any human space flight, calculations of radiation risks are used to determine the acceptable scope of astronaut activity. Using the supercomputing facilities at NASA Ames Research Center, Ames researchers have determined the damage probabilities of DNA functional groups by space radiation. The data supersede those used in the current Monte Carlo model for risk assessment. One example is the reaction of DNA with the hydroxyl radical produced by the interaction of highly energetic particles from space radiation with water molecules in the human body. This reaction is considered an important cause of DNA mutations, although its mechanism is not well understood.

  15. Reducing uncertainty in risk modeling for methylmercury exposure

    SciTech Connect

    Ponce, R.; Egeland, G.; Middaugh, J.; Lee, R.

    1995-12-31

    The biomagnification and bioaccumulation of methylmercury in marine species represents a challenge for risk assessment related to the consumption of subsistence foods in Alaska. Because of the profound impact that food consumption advisories have on indigenous peoples seeking to preserve a way of life, there is a need to reduce uncertainty in risk assessment. Thus, research was initiated to reduce the uncertainty in assessing the health risks associated with the consumption of subsistence foods. Because marine subsistence foods typically contain elevated levels of methylmercury, preliminary research efforts have focused on methylmercury as the principal chemical of concern. Of particular interest are the antagonistic effects of selenium on methylmercury toxicity. Because of this antagonism, methylmercury exposure through the consumption of marine mammal meat (with high selenium) may not be as toxic as comparable exposures through other sources of dietary intake, such as in the contaminated bread episode of Iraq (containing relatively low selenium). This hypothesis is supported by animal experiments showing reduced toxicity of methylmercury associated with marine mammal meat, by the antagonistic influence of selenium on methylmercury toxicity, and by negative clinical findings in adult populations exposed to methylmercury through a marine diet not subject to industrial contamination. Exploratory model development is underway to identify potential improvements and applications of current deterministic and probabilistic models, particularly by incorporating selenium as an antagonist in risk modeling methods.

  16. The acquired preparedness model of risk for bulimic symptom development.

    PubMed

    Combs, Jessica L; Smith, Gregory T; Flory, Kate; Simmons, Jean R; Hill, Kelly K

    2010-09-01

    The authors applied person-environment transaction theory to test the acquired preparedness model of eating disorder risk. The model holds that (a) middle-school girls high in the trait of ineffectiveness are differentially prepared to acquire high-risk expectancies for reinforcement from dieting or thinness; (b) those expectancies predict subsequent binge eating and purging; and (c) the influence of the disposition of ineffectiveness on binge eating and purging is mediated by dieting or thinness expectancies. In a three-wave longitudinal study of 394 middle-school girls, the authors found support for the model. Seventh-grade girls' scores on ineffectiveness predicted their subsequent endorsement of high-risk dieting or thinness expectancies, which in turn predicted subsequent increases in binge eating and purging. Statistical tests of mediation supported the hypothesis that the prospective relation between ineffectiveness and binge eating was mediated by dieting or thinness expectancies, as was the prospective relation between ineffectiveness and purging. This application of a basic science theory to eating disorder risk appears fruitful, and the findings suggest the importance of early interventions that address both disposition and learning.

  17. Small scale water recycling systems--risk assessment and modelling.

    PubMed

    Diaper, C; Dixon, A; Bulier, D; Fewkes, A; Parsons, S A; Strathern, M; Stephenson, T; Strutt, J

    2001-01-01

    This paper aims to use quantitative risk analysis, risk modelling and simulation modelling tools to assess the performance of a proprietary single house grey water recycling system. A preliminary Hazard and Operability study (HAZOP) identified the main hazards, both health related and economic, associated with installing the recycling system in a domestic environment. The health related consequences of system failure were associated with the presence of increased concentrations of micro-organisms at the point of use, due to failure of the disinfection system and/or the pump. The risk model was used to assess the increase in the probability of infection for a particular genus of micro-organism, Salmonella spp, during disinfection failure. The increase in the number of cases of infection above a base rate rose from 0.001% during normal operation, to 4% for a recycling system with no disinfection. The simulation model was used to examine the possible effects of pump failure. The model indicated that the anaerobic COD release rate in the system storage tank increases over time and dissolved oxygen decreases during this failure mode. These conditions are likely to result in odour problems.
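
    The infection-probability figures above come from quantitative microbial risk assessment; a minimal sketch of that kind of calculation, using an exponential dose-response model with purely illustrative parameter values (not the paper's), follows:

        # Python sketch: exponential dose-response model for infection
        # risk at the point of use, before and after disinfection failure.
        import math

        r = 0.00752        # illustrative dose-response parameter
        volume_ml = 1.0    # assumed accidental ingestion per use, mL

        def p_infection(conc_per_100ml):
            dose = conc_per_100ml * volume_ml / 100.0
            return 1.0 - math.exp(-r * dose)

        normal = 1.0     # organisms per 100 mL with working disinfection
        failed = 500.0   # organisms per 100 mL with no disinfection
        print(f"normal:  P(infection per use) = {p_infection(normal):.2e}")
        print(f"failure: P(infection per use) = {p_infection(failed):.2e}")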

  18. Modeling insurer-homeowner interactions in managing natural disaster risk.

    PubMed

    Kesete, Yohannes; Peng, Jiazhen; Gao, Yang; Shan, Xiaojun; Davidson, Rachel A; Nozick, Linda K; Kruse, Jamie

    2014-06-01

    The current system for managing natural disaster risk in the United States is problematic for both homeowners and insurers. Homeowners are often uninsured or underinsured against natural disaster losses, and typically do not invest in retrofits that can reduce losses. Insurers often do not want to insure against these losses, which are some of their biggest exposures and can cause an undesirably high chance of insolvency. There is a need to design an improved system that acknowledges the different perspectives of the stakeholders. In this article, we introduce a new modeling framework to help understand and manage the insurer's role in catastrophe risk management. The framework includes a new game-theoretic optimization model of insurer decisions that interacts with a utility-based homeowner decision model and is integrated with a regional catastrophe loss estimation model. Reinsurer and government roles are represented as bounds on the insurer-insured interactions. We demonstrate the model for a full-scale case study for hurricane risk to residential buildings in eastern North Carolina; present the results from the perspectives of all stakeholders: primary insurers, homeowners (insured and uninsured), and reinsurers; and examine the effect of key parameters on the results.

  19. Modelling of fire count data: fire disaster risk in Ghana.

    PubMed

    Boadi, Caleb; Harvey, Simon K; Gyeke-Dako, Agyapomaa

    2015-01-01

    Stochastic dynamics involved in ecological count data require distribution fitting procedures to model and make informed judgments. The study provides empirical research, focused on the provision of an early warning system and a spatial graph that can detect societal fire risks. It offers an opportunity for communities, organizations, risk managers, actuaries and governments to be aware of, and understand fire risks, so that they will increase the direct tackling of the threats posed by fire. The statistical distribution fitting method that best identifies the stochastic dynamics of fire count data is used. The aim is to provide a fire-prediction model and fire spatial graph for observed fire count data. An empirical probability distribution model is fitted to the fire count data and compared to the theoretical probability distribution of the stochastic process of fire count data. The distribution fitted to the fire frequency count data helps identify the class of models exhibited by the fires and provides lead time for decisions. The research suggests that fire frequency and loss (fire fatalities) count data in Ghana are best modelled with a Negative Binomial Distribution. The spatial map of observed fire frequency and fatality measured over 5 years (2007-2011) offers in this study a first regional assessment of fire frequency and fire fatality in Ghana. PMID:26702383
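
    A short sketch of the distribution-fitting step described above, fitting a Negative Binomial to annual fire counts by maximum likelihood; the counts and starting values are hypothetical, chosen only to show the mechanics:

        # Python sketch: Negative Binomial fit to fire count data via MLE,
        # with method-of-moments starting values.
        import numpy as np
        from scipy import stats, optimize

        counts = np.array([12, 18, 9, 22, 15, 11, 19, 25, 14, 17])

        def neg_loglik(params):
            r, p = params
            if r <= 0 or not (0 < p < 1):
                return np.inf
            return -np.sum(stats.nbinom.logpmf(counts, r, p))

        # Method-of-moments start: mean = r(1-p)/p, var = mean/p
        m, v = counts.mean(), counts.var(ddof=1)
        p0 = m / v if v > m else 0.5
        r0 = m * p0 / (1 - p0)

        res = optimize.minimize(neg_loglik, x0=[r0, p0],
                                method="Nelder-Mead")
        r_hat, p_hat = res.x
        print(f"fitted NB: r = {r_hat:.2f}, p = {p_hat:.3f}")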

  20. Absolute irradiance of the Moon for on-orbit calibration

    USGS Publications Warehouse

    Stone, T.C.; Kieffer, H.H.; ,

    2002-01-01

    The recognized need for on-orbit calibration of remote sensing imaging instruments drives the ROLO project effort to characterize the Moon for use as an absolute radiance source. For over 5 years the ground-based ROLO telescopes have acquired spatially-resolved lunar images in 23 VNIR (Moon diameter ~500 pixels) and 9 SWIR (~250 pixels) passbands at phase angles within ±90 degrees. A numerical model for lunar irradiance has been developed which fits hundreds of ROLO images in each band, corrected for atmospheric extinction and calibrated to absolute radiance, then integrated to irradiance. The band-coupled extinction algorithm uses absorption spectra of several gases and aerosols derived from MODTRAN to fit time-dependent component abundances to nightly observations of standard stars. The absolute radiance scale is based upon independent telescopic measurements of the star Vega. The fitting process yields uncertainties in lunar relative irradiance over small ranges of phase angle and the full range of lunar libration well under 0.5%. A larger source of uncertainty enters in the absolute solar spectral irradiance, especially in the SWIR, where solar models disagree by up to 6%. Results of ROLO model direct comparisons to spacecraft observations demonstrate the ability of the technique to track sensor responsivity drifts to sub-percent precision. Intercomparisons among instruments provide key insights into both calibration issues and the absolute scale for lunar irradiance.

  1. Brownfields and health risks--air dispersion modeling and health risk assessment at landfill redevelopment sites.

    PubMed

    Ofungwu, Joseph; Eget, Steven

    2006-07-01

    Redevelopment of landfill sites in the New Jersey-New York metropolitan area for recreational (golf courses), commercial, and even residential purposes seems to be gaining acceptance among municipal planners and developers. Landfill gas generation, which includes methane and potentially toxic nonmethane compounds, usually continues long after closure of the landfill's active phase. It is therefore prudent to evaluate potential health risks associated with exposure to gas emissions before redevelopment of the landfill sites as recreational, commercial, and, especially, residential properties. Unacceptably high health risks would call for risk management measures such as limiting the development to commercial/recreational rather than residential uses, stringent gas control mechanisms, interior air filtration, etc. A methodology is presented for applying existing models to estimate residual landfill hazardous compound emissions and to quantify associated health risks. Besides the toxic gas constituents of landfill emissions, other risk-related issues concerning buried waste, landfill leachate, and explosive gases were qualitatively evaluated. Five contiguously located landfill sites in New Jersey intended for residential and recreational redevelopment were used to exemplify the approach.

  2. Implications of pharmacokinetic modeling in risk assessment analysis.

    PubMed Central

    Lutz, R J; Dedrick, R L

    1987-01-01

    Physiologic pharmacokinetic models are a useful interface between exposure models and risk assessment models by providing a means to estimate tissue concentrations of reactive chemical species at the site of action. The models utilize numerous parameters that can be characterized as anatomical, such as body size or tissue volume; physiological, such as tissue blood perfusion rates, clearances, and metabolism; thermodynamic, such as partition coefficients; and transport, such as membrane permeabilities. The models provide a format to investigate how these parameters can influence the disposition of chemicals throughout the body, which is an important consideration in interpreting toxicity studies. Physiologic models can take into account nonlinear effects related to clearance, metabolism, or transport. They allow for extrapolation of tissue concentration from high dose to low dose experiments and from species to species and can account for temporal variations in dose. PMID:3447907
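
    As a rough illustration of the model class, a toy flow-limited PBPK model with three compartments can be written as a small ODE system; every parameter value below is a generic placeholder, not taken from any published model:

        # Python sketch: blood + liver + rest-of-body PBPK ODEs after an
        # i.v. bolus dose, solved with scipy.
        from scipy.integrate import solve_ivp

        Q_liv, Q_rest = 90.0, 210.0            # blood flows, L/h
        V_bld, V_liv, V_rest = 5.0, 1.8, 60.0  # volumes, L
        P_liv, P_rest = 4.0, 1.5               # tissue:blood partitions
        CL_int = 30.0                          # hepatic clearance, L/h

        def rhs(t, y):
            c_bld, c_liv, c_rest = y
            dbld = (Q_liv * (c_liv / P_liv - c_bld)
                    + Q_rest * (c_rest / P_rest - c_bld)) / V_bld
            dliv = (Q_liv * (c_bld - c_liv / P_liv)
                    - CL_int * c_liv / P_liv) / V_liv
            drest = Q_rest * (c_bld - c_rest / P_rest) / V_rest
            return [dbld, dliv, drest]

        y0 = [100.0 / V_bld, 0.0, 0.0]   # 100 mg bolus into blood
        sol = solve_ivp(rhs, [0, 24], y0, dense_output=True)
        for t in (1, 6, 24):
            c = sol.sol(t)
            print(f"t = {t:2d} h: blood {c[0]:.3f}, liver {c[1]:.3f} mg/L")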

  3. Agents, Bayes, and Climatic Risks - a modular modelling approach

    NASA Astrophysics Data System (ADS)

    Haas, A.; Jaeger, C.

    2005-08-01

    When insurance firms, energy companies, governments, NGOs, and other agents strive to manage climatic risks, it is by no means clear what the aggregate outcome should and will be. As a framework for investigating this subject, we present the LAGOM model family. It is based on modules depicting learning social agents. For managing climate risks, our agents use second order probabilities and update them by means of a Bayesian mechanism while differing in priors and risk aversion. The interactions between these modules and the aggregate outcomes of their actions are implemented using further modules. The software system is implemented as a series of parallel processes using the CIAMn approach. It is possible to couple modules irrespective of the language they are written in, the operating system under which they are run, and the physical location of the machine.
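
    A minimal sketch of the Bayesian updating mechanism, using a Beta-Bernoulli agent as a simple stand-in for the paper's second-order probabilities; the priors are illustrative:

        # Python sketch: agents with different Beta priors over the
        # probability of a loss event, updated as events are observed.
        from dataclasses import dataclass

        @dataclass
        class Agent:
            alpha: float  # pseudo-counts of "event occurred"
            beta: float   # pseudo-counts of "event did not occur"

            def update(self, event_occurred: bool) -> None:
                if event_occurred:
                    self.alpha += 1
                else:
                    self.beta += 1

            @property
            def belief(self) -> float:
                # posterior mean probability of the loss event
                return self.alpha / (self.alpha + self.beta)

        optimist, pessimist = Agent(1, 9), Agent(5, 5)
        for outcome in [True, False, False, True, True]:
            optimist.update(outcome)
            pessimist.update(outcome)
        print(f"optimist:  {optimist.belief:.3f}")
        print(f"pessimist: {pessimist.belief:.3f}")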

  4. A neural network model for credit risk evaluation.

    PubMed

    Khashman, Adnan

    2009-08-01

    Credit scoring is one of the key analytical techniques in credit risk evaluation, which has been an active research area in financial risk management. This paper presents a credit risk evaluation system that uses a neural network model based on the back propagation learning algorithm. We train and implement the neural network to decide whether to approve or reject a credit application, using seven learning schemes and real world credit applications from the Australian credit approval datasets. A comparison of the system performance under the different learning schemes is provided; furthermore, we compare the performance of two neural networks, with one and two hidden layers, following the ideal learning scheme. Experimental results suggest that neural networks can be effectively used in automatic processing of credit applications.
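
    A compact sketch of this setup using scikit-learn's MLPClassifier on synthetic data; the cited study used the real Australian credit approval dataset and its own learning schemes, which are not reproduced here:

        # Python sketch: one- vs two-hidden-layer back-propagation
        # networks for approve/reject credit decisions.
        import numpy as np
        from sklearn.neural_network import MLPClassifier
        from sklearn.model_selection import train_test_split
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(0)
        X = rng.normal(size=(690, 14))   # 14 features, synthetic stand-in
        y = (X[:, 0] + 0.5 * X[:, 1]
             + rng.normal(scale=0.5, size=690) > 0).astype(int)

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                                  random_state=0)
        scaler = StandardScaler().fit(X_tr)

        for hidden in [(20,), (20, 10)]:  # one vs two hidden layers
            clf = MLPClassifier(hidden_layer_sizes=hidden, max_iter=2000,
                                random_state=0)
            clf.fit(scaler.transform(X_tr), y_tr)
            acc = clf.score(scaler.transform(X_te), y_te)
            print(f"hidden layers {hidden}: accuracy = {acc:.3f}")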

  5. Development of Standardized Probabilistic Risk Assessment Models for Shutdown Operations Integrated in SPAR Level 1 Model

    SciTech Connect

    S. T. Khericha; J. Mitman

    2008-05-01

    Nuclear plant operating experience and several studies show that the risk from shutdown operation during Modes 4, 5, and 6 at pressurized water reactors and Modes 4 and 5 at boiling water reactors can be significant. This paper describes using the U.S. Nuclear Regulatory Commission’s full-power Standardized Plant Analysis Risk (SPAR) model as the starting point for development of risk evaluation models for commercial nuclear power plants. The shutdown models are integrated with their respective internal event at-power SPAR model. This is accomplished by combining the modified system fault trees from the SPAR full-power model with shutdown event tree logic. Preliminary human reliability analysis results indicate that risk is dominated by the operator’s ability to correctly diagnose events and initiate systems.

  6. Low Dose Radiation Cancer Risks: Epidemiological and Toxicological Models

    SciTech Connect

    David G. Hoel, PhD

    2012-04-19

    The basic purpose of this one year research grant was to extend the two stage clonal expansion model (TSCE) of carcinogenesis to exposures other than the usual single acute exposure. The two-stage clonal expansion model of carcinogenesis incorporates the biological process of carcinogenesis, which involves two mutations and the clonal proliferation of the intermediate cells, in a stochastic, mathematical way. The current TSCE model serves a general purpose of acute exposure models but requires numerical computation of both the survival and hazard functions. The primary objective of this research project was to develop the analytical expressions for the survival function and the hazard function of the occurrence of the first cancer cell for acute, continuous and multiple exposure cases within the framework of the piece-wise constant parameter two-stage clonal expansion model of carcinogenesis. For acute exposure and multiple exposures of acute series, either only the first mutation rate is allowed to vary with the dose, or all the parameters are allowed to be dose dependent; for multiple continuous exposures, all the parameters are allowed to vary with the dose. With these analytical functions, it becomes easy to evaluate cancer risks and to deal with the various exposure patterns in cancer risk assessment. A second objective was to apply the TSCE model with varying continuous exposures from the cancer studies of inhaled plutonium in beagle dogs. Using step functions to estimate the retention functions of the pulmonary exposure to plutonium, the multiple exposure version of the TSCE model was to be used to estimate the beagle dog lung cancer risks. The mathematical equations of the multiple exposure versions of the TSCE model were developed. A draft manuscript which is attached provides the results of this mathematical work. The application work using the beagle dog data from plutonium exposure has not been completed due to the fact
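
    For orientation only: under constant parameters and a small malignant-conversion rate, the TSCE hazard is often approximated as h(t) ≈ νXμ(e^((α−β)t) − 1)/(α − β). A numerical sketch of that approximation, with purely illustrative parameter values and not the grant's analytical solutions, is:

        # Python sketch: small-mu approximation to the TSCE hazard and
        # the implied survival, constant parameters (illustrative values).
        import numpy as np

        nu_X = 0.01              # initiations per year (nu times cells X)
        mu = 1e-6                # malignant conversion rate per cell-year
        alpha, beta = 1.0, 0.9   # intermediate-cell birth/death rates
        g = alpha - beta         # net clonal expansion rate

        def hazard(t):
            return nu_X * mu * (np.exp(g * t) - 1.0) / g

        def survival(t):
            # S(t) = exp(-Lambda(t)), cumulative hazard in closed form
            cum = nu_X * mu * ((np.exp(g * t) - 1.0) / g - t) / g
            return np.exp(-cum)

        for age in (40, 60, 80):
            print(f"age {age}: hazard {hazard(age):.3e}/yr, "
                  f"survival {survival(age):.5f}")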

  7. Assessing discriminative ability of risk models in clustered data

    PubMed Central

    2014-01-01

    Background The discriminative ability of a risk model is often measured by Harrell’s concordance-index (c-index). The c-index estimates for two randomly chosen subjects the probability that the model predicts a higher risk for the subject with poorer outcome (concordance probability). When data are clustered, as in multicenter data, two types of concordance are distinguished: concordance in subjects from the same cluster (within-cluster concordance probability) and concordance in subjects from different clusters (between-cluster concordance probability). We argue that the within-cluster concordance probability is most relevant when a risk model supports decisions within clusters (e.g. who should be treated in a particular center). We aimed to explore different approaches to estimate the within-cluster concordance probability in clustered data. Methods We used data of the CRASH trial (2,081 patients clustered in 35 centers) to develop a risk model for mortality after traumatic brain injury. To assess the discriminative ability of the risk model within centers we first calculated cluster-specific c-indexes. We then pooled the cluster-specific c-indexes into a summary estimate with different meta-analytical techniques. We considered fixed effect meta-analysis with different weights (equal; inverse variance; number of subjects, events or pairs) and random effects meta-analysis. We reflected on pooling the estimates on the log-odds scale rather than the probability scale. Results The cluster-specific c-index varied substantially across centers (IQR = 0.70-0.81; I² = 0.76 with 95% confidence interval 0.66 to 0.82). Summary estimates resulting from fixed effect meta-analysis ranged from 0.75 (equal weights) to 0.84 (inverse variance weights). With random effects meta-analysis – accounting for the observed heterogeneity in c-indexes across clusters – we estimated a mean of 0.77, a between-cluster variance of 0.0072 and a 95% prediction interval of 0.60 to 0.95. The
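
    A sketch of the pooling step with hypothetical center-level estimates (not the CRASH-trial values), contrasting fixed-effect inverse-variance pooling with a DerSimonian-Laird random-effects summary, both on the log-odds scale as the abstract suggests:

        # Python sketch: pooling cluster-specific c-indexes.
        import numpy as np

        c = np.array([0.72, 0.81, 0.68, 0.79, 0.75])   # per-center c-index
        se = np.array([0.04, 0.03, 0.05, 0.04, 0.03])  # standard errors

        # Transform to log-odds; delta-method SE on that scale.
        logit = np.log(c / (1 - c))
        se_logit = se / (c * (1 - c))

        w = 1 / se_logit**2                     # fixed-effect weights
        fixed = np.sum(w * logit) / np.sum(w)

        # DerSimonian-Laird between-cluster variance tau^2
        Q = np.sum(w * (logit - fixed) ** 2)
        denom = np.sum(w) - np.sum(w**2) / np.sum(w)
        tau2 = max(0.0, (Q - (len(c) - 1)) / denom)
        w_re = 1 / (se_logit**2 + tau2)
        random_mean = np.sum(w_re * logit) / np.sum(w_re)

        back = lambda x: 1 / (1 + np.exp(-x))   # back to probability scale
        print(f"fixed-effect pooled c-index:   {back(fixed):.3f}")
        print(f"random-effects pooled c-index: {back(random_mean):.3f}")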

  8. 42 CFR 81.10 - Use of cancer risk assessment models in NIOSH IREP.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 42 Public Health 1 2011-10-01 2011-10-01 false Use of cancer risk assessment models in NIOSH IREP... Risk Models Used To Estimate Probability of Causation § 81.10 Use of cancer risk assessment models in... tables were developed from analyses of cancer mortality risk among the Japanese atomic bomb...

  9. 42 CFR 81.10 - Use of cancer risk assessment models in NIOSH IREP.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 42 Public Health 1 2012-10-01 2012-10-01 false Use of cancer risk assessment models in NIOSH IREP... Risk Models Used To Estimate Probability of Causation § 81.10 Use of cancer risk assessment models in... tables were developed from analyses of cancer mortality risk among the Japanese atomic bomb...

  10. 42 CFR 81.10 - Use of cancer risk assessment models in NIOSH IREP.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 1 2010-10-01 2010-10-01 false Use of cancer risk assessment models in NIOSH IREP... Risk Models Used To Estimate Probability of Causation § 81.10 Use of cancer risk assessment models in... tables were developed from analyses of cancer mortality risk among the Japanese atomic bomb...

  11. 42 CFR 81.10 - Use of cancer risk assessment models in NIOSH IREP.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 42 Public Health 1 2013-10-01 2013-10-01 false Use of cancer risk assessment models in NIOSH IREP... Risk Models Used To Estimate Probability of Causation § 81.10 Use of cancer risk assessment models in... tables were developed from analyses of cancer mortality risk among the Japanese atomic bomb...

  12. 42 CFR 81.10 - Use of cancer risk assessment models in NIOSH IREP.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 42 Public Health 1 2014-10-01 2014-10-01 false Use of cancer risk assessment models in NIOSH IREP... Risk Models Used To Estimate Probability of Causation § 81.10 Use of cancer risk assessment models in... tables were developed from analyses of cancer mortality risk among the Japanese atomic bomb...

  13. Survivorship models for estimating the risk of decompression sickness.

    PubMed

    Kumar, K V; Powell, M R

    1994-07-01

    Several approaches have been used for modeling the incidence of decompression sickness (DCS), such as Hill's dose-response and logistic regression. Most of these methods do not include the time-to-onset information in the model. Survival analysis (failure time analysis) is appropriate when the time to onset of an event is of interest. The applicability of survival analysis for modeling the risk of DCS is illustrated by using data obtained from hypobaric chamber exposures simulating extravehicular activities (n = 426). Univariate estimates of incidence-free survival proportions were obtained for Doppler-detectable circulating microbubbles (CMB), symptoms of DCS, and test aborts. A log-linear failure time regression model with the 360-min half-time tissue ratio (TR) as covariate was constructed, and estimated probabilities for various TR values were calculated. Further regression analysis including CMB status in this model showed significant improvement (p < 0.05) in the estimation of DCS over the previous model. Since DCS depends on the exposure pressure as well as the duration of exposure, we recommend the use of survival analysis for modeling the risk of DCS. PMID:7945136
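
    A sketch of the log-linear (accelerated failure time) idea described above, with log time-to-onset linear in the tissue ratio; the coefficients are hypothetical, chosen only to show how onset probabilities within an exposure fall out of such a model:

        # Python sketch: lognormal AFT model for time to DCS onset,
        # log T = b0 + b1*TR + sigma*eps, eps ~ standard normal.
        import numpy as np
        from scipy import stats

        b0, b1, sigma = 9.0, -2.0, 1.0   # hypothetical parameters

        def p_dcs_by(t_minutes, tr):
            # P(T <= t) under the lognormal AFT model
            mu = b0 + b1 * tr
            return stats.norm.cdf((np.log(t_minutes) - mu) / sigma)

        for tr in (1.2, 1.6, 2.0):
            p = p_dcs_by(360, tr)
            print(f"TR = {tr}: P(DCS within 360 min) = {p:.3f}")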

  14. An animal model of differential genetic risk for methamphetamine intake.

    PubMed

    Phillips, Tamara J; Shabani, Shkelzen

    2015-01-01

    The question of whether genetic factors contribute to risk for methamphetamine (MA) use and dependence has not been intensively investigated. Compared to human populations, genetic animal models offer the advantages of control over genetic family history and drug exposure. Using selective breeding, we created lines of mice that differ in genetic risk for voluntary MA intake and identified the chromosomal addresses of contributory genes. A quantitative trait locus was identified on chromosome 10 that accounts for more than 50% of the genetic variance in MA intake in the selected mouse lines. In addition, behavioral and physiological screening identified differences corresponding with risk for MA intake that have generated hypotheses that are testable in humans. Heightened sensitivity to aversive and certain physiological effects of MA, such as MA-induced reduction in body temperature, are hallmarks of mice bred for low MA intake. Furthermore, unlike MA-avoiding mice, MA-preferring mice are sensitive to rewarding and reinforcing MA effects, and to MA-induced increases in brain extracellular dopamine levels. Gene expression analyses implicate the importance of a network enriched in transcription factor genes, some of which regulate the mu opioid receptor gene, Oprm1, in risk for MA use. Neuroimmune factors appear to play a role in differential response to MA between the mice bred for high and low intake. In addition, chromosome 10 candidate gene studies provide strong support for a trace amine-associated receptor 1 gene, Taar1, polymorphism in risk for MA intake. MA is a trace amine-associated receptor 1 (TAAR1) agonist, and a non-functional Taar1 allele segregates with high MA consumption. Thus, reduced TAAR1 function has the potential to increase risk for MA use. Overall, existing findings support the MA drinking lines as a powerful model for identifying genetic factors involved in determining risk for harmful MA use. Future directions include the development of a

  15. An animal model of differential genetic risk for methamphetamine intake

    PubMed Central

    Phillips, Tamara J.; Shabani, Shkelzen

    2015-01-01

    The question of whether genetic factors contribute to risk for methamphetamine (MA) use and dependence has not been intensively investigated. Compared to human populations, genetic animal models offer the advantages of control over genetic family history and drug exposure. Using selective breeding, we created lines of mice that differ in genetic risk for voluntary MA intake and identified the chromosomal addresses of contributory genes. A quantitative trait locus was identified on chromosome 10 that accounts for more than 50% of the genetic variance in MA intake in the selected mouse lines. In addition, behavioral and physiological screening identified differences corresponding with risk for MA intake that have generated hypotheses that are testable in humans. Heightened sensitivity to aversive and certain physiological effects of MA, such as MA-induced reduction in body temperature, are hallmarks of mice bred for low MA intake. Furthermore, unlike MA-avoiding mice, MA-preferring mice are sensitive to rewarding and reinforcing MA effects, and to MA-induced increases in brain extracellular dopamine levels. Gene expression analyses implicate the importance of a network enriched in transcription factor genes, some of which regulate the mu opioid receptor gene, Oprm1, in risk for MA use. Neuroimmune factors appear to play a role in differential response to MA between the mice bred for high and low intake. In addition, chromosome 10 candidate gene studies provide strong support for a trace amine-associated receptor 1 gene, Taar1, polymorphism in risk for MA intake. MA is a trace amine-associated receptor 1 (TAAR1) agonist, and a non-functional Taar1 allele segregates with high MA consumption. Thus, reduced TAAR1 function has the potential to increase risk for MA use. Overall, existing findings support the MA drinking lines as a powerful model for identifying genetic factors involved in determining risk for harmful MA use. Future directions include the development of a

  16. Absolute configuration of isovouacapenol C

    PubMed Central

    Fun, Hoong-Kun; Yodsaoue, Orapun; Karalai, Chatchanok; Chantrapromma, Suchada

    2010-01-01

    The title compound, C27H34O5 {systematic name: (4aR,5R,6R,6aS,7R,11aS,11bR)-4a,6-dihydroxy-4,4,7,11b-tetramethyl-1,2,3,4,4a,5,6,6a,7,11,11a,11b-dodecahydrophenanthro[3,2-b]furan-5-yl benzoate}, is a cassane furanoditerpene, which was isolated from the roots of Caesalpinia pulcherrima. The three cyclohexane rings are trans fused: two of these are in chair conformations with the third in a twisted half-chair conformation, whereas the furan ring is almost planar (r.m.s. deviation = 0.003 Å). An intramolecular C—H⋯O interaction generates an S(6) ring. The absolute configurations of the stereogenic centres at positions 4a, 5, 6, 6a, 7, 11a and 11b are R, R, R, S, R, S and R, respectively. In the crystal, molecules are linked into infinite chains along [010] by O—H⋯O hydrogen bonds. C⋯O [3.306 (2)–3.347 (2) Å] short contacts and C—H⋯π interactions also occur. PMID:21588364

  17. Field evaluation of an avian risk assessment model

    USGS Publications Warehouse

    Vyas, N.B.; Spann, J.W.; Hulse, C.S.; Borges, S.L.; Bennett, R.S.; Torrez, M.; Williams, B.I.; Leffel, R.

    2006-01-01

    We conducted two laboratory subacute dietary toxicity tests and one outdoor subacute dietary toxicity test to determine the effectiveness of the U.S. Environmental Protection Agency's deterministic risk assessment model for evaluating the potential of adverse effects to birds in the field. We tested technical-grade diazinon and its DZN 50W (50% diazinon active ingredient wettable powder) formulation on Canada goose (Branta canadensis) goslings. Brain acetylcholinesterase activity was measured, and the feathers and skin, feet, and gastrointestinal contents were analyzed for diazinon residues. The dose-response curves showed that diazinon was significantly more toxic to goslings in the outdoor test than in the laboratory tests. The deterministic risk assessment method identified the potential for risk to birds in general, but the factors associated with extrapolating from the laboratory to the field, and from the laboratory test species to other species, resulted in the underestimation of risk to the goslings. The present study indicates that laboratory-based risk quotients should be interpreted with caution.

  18. Making Risk Models Operational for Situational Awareness and Decision Support

    SciTech Connect

    Paulson, Patrick R.; Coles, Garill A.; Shoemaker, Steven V.

    2012-06-12

    Modernization of nuclear power operations control systems, in particular the move to digital control systems, creates an opportunity to modernize existing legacy infrastructure and extend plant life. We describe here decision support tools that allow the assessment of different facets of risk and support the optimization of available resources to reduce risk as plants are upgraded and maintained. This methodology could become an integrated part of the design review process and a part of the operations management systems. The methodology can be applied to the design of new reactors such as small nuclear reactors (SMR), and be helpful in assessing the risks of different configurations of the reactors. Our tool provides a low cost evaluation of alternative configurations and provides an expanded safety analysis by considering scenarios while early in the implementation cycle where cost impacts can be minimized. The effects of failures can be modeled and thoroughly vetted to understand their potential impact on risk. The process and tools presented here allow for an integrated assessment of risk by supporting traditional defense in depth approaches while taking into consideration the insertion of new digital instrument and control systems.

  19. A Longitudinal Transactional Risk Model for Early Eating Disorder Onset

    PubMed Central

    Pearson, Carolyn M.; Combs, Jessica L.; Zapolski, Tamika C. B.; Smith, Gregory T.

    2014-01-01

    The presence of binge eating behavior in early middle school predicts future diagnoses and health difficulties. The authors showed that this early binge eating behavior can, itself, be predicted by risk factors assessed in elementary school. We tested the acquired preparedness model of risk, which involves transactions among personality, psychosocial learning, and binge eating. In a sample of 1,906 children assessed in the spring of fifth grade (the last year of elementary school), the fall of sixth grade, and the spring of sixth grade, we found that fifth grade negative urgency (the personality tendency to act rashly when distressed) predicted subsequent increases in the expectancy that eating helps alleviate negative affect, which in turn predicted subsequent increases in binge eating behavior. This transactional risk process appeared to continue to occur at later time points. Negative urgency in the fall of sixth grade was predicted by fifth grade pubertal onset, binge eating behavior, and expectancies. It, in turn, predicted increases in high-risk eating expectancies by the spring of sixth grade, and thus heightened risk. PMID:22428790

  20. Social models of HIV risk among young adults in Lesotho.

    PubMed

    Bulled, Nicola L

    2015-01-01

    Extensive research over the past 30 years has revealed that individual and social determinants impact HIV risk. Even so, prevention efforts focus primarily on individual behaviour change, with little recognition of the dynamic interplay of individual and social environment factors that further exacerbate risk engagement. Drawing on long-term research with young adults in Lesotho, I examine how social environment factors contribute to HIV risk. During preliminary ethnographic analysis, I developed novel scales to measure social control, adoption of modernity, and HIV knowledge. In survey research, I examined the effects of individual characteristics (i.e., socioeconomic status, HIV knowledge, adoption of modernity) and social environment (i.e., social control) on HIV risk behaviours. In addition, I measured the impact of altered environments by taking advantage of an existing situation whereby young adults attending a national college are assigned to either a main campus in a metropolitan setting or a satellite campus in a remote setting, irrespective of the environment in which they were socialised as youth. This arbitrary assignment process generates four distinct groups of young adults with altered or constant environments. Regression models show that lower levels of perceived social control and greater adoption of modernity are associated with HIV risk, controlling for other factors. The impact of social control and modernity varies with environment dynamics. PMID:26284999

  2. Absolute radiometric calibration of the CCRS SAR

    NASA Astrophysics Data System (ADS)

    Ulander, Lars M. H.; Hawkins, Robert K.; Livingstone, Charles E.; Lukowski, Tom I.

    1991-11-01

    Determining the radar scattering coefficients from SAR (synthetic aperture radar) image data requires absolute radiometric calibration of the SAR system. The authors describe an internal calibration methodology for the airborne Canada Centre for Remote Sensing (CCRS) SAR system, based on radar theory, a detailed model of the radar system, and measurements of system parameters. The methodology is verified by analyzing external calibration data acquired over a 6-month period in 1988 by the C-band radar using HH polarization. The results indicate that the overall error is +/- 0.8 dB (1-sigma) for incidence angles +/- 20 deg from antenna boresight. The dominant error contributions are due to the antenna radome and uncertainties in the elevation angle relative to the antenna boresight.

  3. External model validation of binary clinical risk prediction models in cardiovascular and thoracic surgery.

    PubMed

    Hickey, Graeme L; Blackstone, Eugene H

    2016-08-01

    Clinical risk-prediction models serve an important role in healthcare. They are used for clinical decision-making and measuring the performance of healthcare providers. To establish confidence in a model, external model validation is imperative. When designing such an external model validation study, thought must be given to patient selection, risk factor and outcome definitions, missing data, and the transparent reporting of the analysis. In addition, there are a number of statistical methods available for external model validation. Execution of a rigorous external validation study rests in proper study design, application of suitable statistical methods, and transparent reporting.

  4. Methodology for risk analysis based on atmospheric dispersion modelling from nuclear risk sites

    NASA Astrophysics Data System (ADS)

    Baklanov, A.; Mahura, A.; Sørensen, J. H.; Rigina, O.

    2003-04-01

    The main purpose of this multidisciplinary study is to develop a methodology for complex nuclear risk and vulnerability assessment, and to test it on the example of estimating the nuclear risk to the population in the Nordic countries in the case of a severe accident at a nuclear risk site (NRS). The main focus of the paper is the methodology for the evaluation of the atmospheric transport and deposition of radioactive pollutants from NRSs. The method developed for this evaluation is derived from a probabilistic point of view. The main question we are trying to answer is: What is the probability of radionuclide atmospheric transport and impact on different neighbouring regions and countries in the case of an accident at an NPP? To answer this question we applied a number of different tools: (i) Trajectory Modelling - to calculate multiyear forward trajectories originating over the locations of selected risk sites; (ii) Dispersion Modelling - for long-term simulation and case studies of radionuclide transport from hypothetical accidental releases at NRSs; (iii) Cluster Analysis - to identify atmospheric transport pathways from NRSs; (iv) Probability Fields Analysis - to construct annual, monthly, and seasonal NRS impact indicators to identify the most impacted geographical regions; (v) Specific Case Studies - to estimate consequences for the environment and the populations after a hypothetical accident; (vi) Vulnerability Evaluation to Radioactive Deposition - to describe its persistence in the ecosystems with a focus on the transfer of certain radionuclides into the food chains of key importance for the intake and exposure of the whole population and of certain population groups; (vii) Risk Evaluation and Mapping - to analyse socio-economic consequences for different geographical areas and various population groups, taking into account socio-geophysical factors and probabilities, and using demographic databases based on GIS analysis.

  5. The Application of Optimisation Methods to Constrain Absolute Plate Motions

    NASA Astrophysics Data System (ADS)

    Tetley, M. G.; Williams, S.; Hardy, S.; Müller, D.

    2015-12-01

    Plate tectonic reconstructions are an excellent tool for understanding the configuration and behaviour of continents through time on both global and regional scales, and are relatively well understood back to ~200 Ma. However, many of these models represent only relative motions between continents, providing little information of absolute tectonic motions and their relationship with the deep Earth. Significant issues exist in solving this problem, including how to combine constraints from multiple, diverse data into a unified model of absolute plate motions; and how to address uncertainties both in the available data, and in the assumptions involved in this process (e.g. hotspot motion, true polar wander). In deep time (pre-Pangea breakup), plate reconstructions rely more heavily on paleomagnetism, but these data often imply plate velocities much larger than those observed since the breakup of the supercontinent Pangea where plate velocities are constrained by the seafloor spreading record. Here we present two complementary techniques to address these issues, applying parallelized numerical methods to quantitatively investigate absolute plate motions through time. Firstly, we develop a data-fit optimized global absolute reference frame constrained by kinematic reconstruction data, hotspot-trail observations, and trench migration statistics. Secondly we calculate optimized paleomagnetic data-derived apparent polar wander paths (APWPs) for both the Phanerozoic and Precambrian. Paths are generated from raw pole data with optimal spatial and temporal pole configurations calculated using all known uncertainties and quality criteria to produce velocity-optimized absolute motion paths through deep time.

  6. Prediction models for early risk detection of cardiovascular event.

    PubMed

    Purwanto; Eswaran, Chikkannan; Logeswaran, Rajasvaran; Abdul Rahman, Abdul Rashid

    2012-04-01

    Cardiovascular disease (CVD) is the major cause of death globally. More people die of CVDs each year than from any other disease. Over 80% of CVD deaths occur in low and middle income countries and occur almost equally in males and females. In this paper, different computational models based on Bayesian Networks, Multilayer Perceptron, Radial Basis Function and Logistic Regression methods are presented to predict early risk detection of the cardiovascular event. A total of 929 (626 male and 303 female) heart attack records are used to construct the models. The models are tested using combined as well as separate male and female data. Among the models used, it is found that the Multilayer Perceptron model yields the best accuracy result.

  7. FIRESTORM: Modelling the water quality risk of wildfire.

    NASA Astrophysics Data System (ADS)

    Mason, C. I.; Sheridan, G. J.; Smith, H. G.; Jones, O.; Chong, D.; Tolhurst, K.

    2012-04-01

    Following wildfire, loss of vegetation and changes to soil properties may result in decreases in infiltration rates, less rainfall interception, and higher overland flow velocities. Rainfall events affecting burn areas before vegetation recovers can cause high magnitude erosion events that impact on downstream water quality. For cities and towns that rely upon fire-prone forest catchments for water supply, wildfire impacts on water quality represent a credible risk to water supply security. Quantifying the risk associated with the occurrence of wildfires and the magnitude of water quality impacts has important implications for managing water supplies. At present, no suitable integrative model exists that considers the probabilistic nature of system inputs as well as the range of processes and scales involved in this problem. We present FIRESTORM, a new model currently in development that aims to determine the range of sediment and associated contaminant loads that may be delivered to water supply reservoirs from the combination of wildfire and subsequent rainfall events. This Monte Carlo model incorporates the probabilistic nature of fire ignition, fire weather and rainfall, and includes deterministic models for fire behaviour and locally dominant erosion processes. FIRESTORM calculates the magnitude and associated annual risk of catchment-scale sediment loads associated with the occurrence of wildfire and rainfall generated by two rain event types. The two event types are localised, high intensity, short-duration convective storms, and widespread, longer duration synoptic-scale rainfall events. Initial application and testing of the model will focus on the two main reservoirs supplying water to Melbourne, Australia, both of which are situated in forest catchments vulnerable to wildfire. Probabilistic fire ignition and weather scenarios have been combined using 40 years of fire records and weather observations. These are used to select from a dataset of over 80

  8. Fire risk in San Diego County, California: A weighted Bayesian model approach

    USGS Publications Warehouse

    Kolden, Crystal A.; Weigel, Timothy J.

    2007-01-01

    Fire risk models are widely utilized to mitigate wildfire hazards, but models are often based on expert opinions of less understood fire-ignition and spread processes. In this study, we used an empirically derived weights-of-evidence model to assess what factors produce fire ignitions east of San Diego, California. We created and validated a dynamic model of fire-ignition risk based on land characteristics and existing fire-ignition history data, and predicted ignition risk for a future urbanization scenario. We then combined our empirical ignition-risk model with a fuzzy fire behavior-risk model developed by wildfire experts to create a hybrid model of overall fire risk. We found that roads influence fire ignitions and that future growth will increase risk in new rural development areas. We conclude that empirically derived risk models and hybrid models offer an alternative method to assess current and future fire risk based on management actions.
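
    The weights-of-evidence step reduces to log likelihood ratios computed from overlay counts; a minimal sketch with hypothetical counts (not the San Diego data):

        # Python sketch: weights of evidence for one binary evidence
        # layer (e.g., "within 100 m of a road") against ignition cells.
        import math

        n_fire_near = 120     # ignition cells on the evidence layer
        n_fire_far = 80       # ignition cells off the layer
        n_nofire_near = 3000  # non-ignition cells on the layer
        n_nofire_far = 9000   # non-ignition cells off the layer

        p_b_given_d = n_fire_near / (n_fire_near + n_fire_far)
        p_b_given_nd = n_nofire_near / (n_nofire_near + n_nofire_far)

        w_plus = math.log(p_b_given_d / p_b_given_nd)    # on the layer
        w_minus = math.log((1 - p_b_given_d) / (1 - p_b_given_nd))
        contrast = w_plus - w_minus  # overall spatial association

        print(f"W+ = {w_plus:.3f}, W- = {w_minus:.3f}, "
              f"contrast = {contrast:.3f}")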

  9. Engineering models for catastrophe risk and their application to insurance

    NASA Astrophysics Data System (ADS)

    Dong, Weimin

    2002-06-01

    Internationally, earthquake insurance, like all other insurance (fire, auto), has in the past adopted an actuarial approach, that is, determining insurance rates from historical loss experience. Because an earthquake is a rare event with severe consequences, irrational premium rates and a poor grasp of the scale of potential losses left many insurance companies insolvent after the Northridge earthquake in 1994. Along with recent advances in earth science, computer science and engineering, computerized loss estimation methodologies based on first principles have been developed to the point that losses from destructive earthquakes can be quantified with reasonable accuracy using scientific modeling techniques. This paper intends to introduce how engineering models can assist in quantifying earthquake risk and how the insurance industry can use this information to manage its risk in the United States and abroad.

  10. Network Dependence in Risk Trading Games: A Banking Regulation Model

    NASA Astrophysics Data System (ADS)

    Thurner, Stefan

    2003-04-01

    In the quest to quantitatively understand the risk-regulatory behavior of financial agents, we propose a physical model of interacting agents where interactions are defined by trades of financial derivatives. Consequences arising from various types of interaction-network topologies are shown for system safety and efficiency. We demonstrate that the model yields characteristic features of actually observed wealth time series. Further, we study the dependence of global system safety as a function of a risk-control parameter (Basle multiplier). We find a phase transition-like phenomenon, where the Basle parameter plays the role of temperature and safety serves as the order parameter. This work is done together with R. Hanel and S. Pichler.

  11. Application of physiologically based pharmacokinetic models in chemical risk assessment.

    PubMed

    Mumtaz, Moiz; Fisher, Jeffrey; Blount, Benjamin; Ruiz, Patricia

    2012-01-01

    Post-exposure risk assessment of chemical and environmental stressors is a public health challenge. Linking exposure to health outcomes is a 4-step process: exposure assessment, hazard identification, dose response assessment, and risk characterization. This process is increasingly adopting "in silico" tools such as physiologically based pharmacokinetic (PBPK) models to fine-tune exposure assessments and determine internal doses in target organs/tissues. Many excellent PBPK models have been developed. But most, because of their scientific sophistication, have found limited field application-health assessors rarely use them. Over the years, government agencies, stakeholders/partners, and the scientific community have attempted to use these models or their underlying principles in combination with other practical procedures. During the past two decades, through cooperative agreements and contracts at several research and higher education institutions, ATSDR funded translational research has encouraged the use of various types of models. Such collaborative efforts have led to the development and use of transparent and user-friendly models. The "human PBPK model toolkit" is one such project. While not necessarily state of the art, this toolkit is sufficiently accurate for screening purposes. Highlighted in this paper are some selected examples of environmental and occupational exposure assessments of chemicals and their mixtures. PMID:22523493

  12. Application of Physiologically Based Pharmacokinetic Models in Chemical Risk Assessment

    PubMed Central

    Mumtaz, Moiz; Fisher, Jeffrey; Blount, Benjamin; Ruiz, Patricia

    2012-01-01

    Post-exposure risk assessment of chemical and environmental stressors is a public health challenge. Linking exposure to health outcomes is a 4-step process: exposure assessment, hazard identification, dose response assessment, and risk characterization. This process is increasingly adopting “in silico” tools such as physiologically based pharmacokinetic (PBPK) models to fine-tune exposure assessments and determine internal doses in target organs/tissues. Many excellent PBPK models have been developed. But most, because of their scientific sophistication, have found limited field application—health assessors rarely use them. Over the years, government agencies, stakeholders/partners, and the scientific community have attempted to use these models or their underlying principles in combination with other practical procedures. During the past two decades, through cooperative agreements and contracts at several research and higher education institutions, ATSDR funded translational research has encouraged the use of various types of models. Such collaborative efforts have led to the development and use of transparent and user-friendly models. The “human PBPK model toolkit” is one such project. While not necessarily state of the art, this toolkit is sufficiently accurate for screening purposes. Highlighted in this paper are some selected examples of environmental and occupational exposure assessments of chemicals and their mixtures. PMID:22523493

  13. Use Of Absolute Function And Its Associates In Formation And 'Redevelopment' Of Mathematical Models In Some Plant-Related Quantitative Physiology: Salinity Effects On Leaf Development Of Schefflera arboricola And Harvest Index In Rice

    NASA Astrophysics Data System (ADS)

    Selamat, Ahmad; Awang, Yahya; Mohamed, Mahmud T. M.; Wahab, Zakaria; Osman, Mohammad

    2008-01-01

    The roles of quantitative physiology have become more apparent and crucial in the era of ICT. Because they are based on rate-related variables, most mathematical models take the form of 'non-linear' functions describing responses, or observed within-plant process outcomes, versus time. Even if some responses change drastically at a certain point within a biological unit or space of a plant system, the response curve 'should' depend on a continuous independent-variable range over a specified period of determination, and biologically 'should not' be a function of an independent-variable range containing 'IF' statement(s). Subjected to a nutrient concentration of high salinity (6.0 mS cm^-1), the leaf turgidity (measured as leaf surface area) of S. arboricola, initially described by one form of the logistic growth function [y = 1/(a + b·e^(-cx))], was abruptly reduced, as explained by a model having Absolute-function (ABS) terms containing tan^-1(x) and a parameter for leaf life expectancy as affected by the high-salinity growing medium at a certain point in days after planting. This yielded the overall function y = 1/(a + b·e^(-cx)) - A[tan^-1{(x-B)/D} + ABS(tan^-1{(x-B)/D})]·E, where a, b, c, A, B, D, and E are constants, most of which can be 'biologically' interpreted. The constant B plays the role of the 'IF statement' normally used in other mathematical functions. Plants subjected to lower salinity (<3.0 mS cm^-1) followed only the function y = 1/(a + b·e^(-cx)). In a harvest index, or HI (economic yield/above-ground biomass), study of 20 rice varieties grown over two planting seasons, the long flattened tails at both sides of a central peak in the function y = R + B(T + ABS(B-x))·e^(-k(T + ABS(B-x))) indicated that varieties maturing at 123 to 133 days after transplanting had high HI values. In our observation, the Absolute (ABS) function, coupled with appropriate terms, could be used in the formation of such mathematical functions.
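
    A direct numerical reading of the leaf-area function above; the constants here are illustrative, chosen only to make the switch at x = B visible:

        # Python sketch: logistic growth minus an ABS/arctan decline term
        # that switches on near day B without any IF statement.
        import numpy as np

        a, b, c = 0.02, 2.0, 0.08           # logistic growth constants
        A, B, D, E = 10.0, 60.0, 5.0, 1.0   # decline amplitude/onset/spread

        def leaf_area(x):
            growth = 1.0 / (a + b * np.exp(-c * x))
            phase = np.arctan((x - B) / D)
            # phase + |phase| is ~0 before day B and ~2*phase after it,
            # so the decline turns on smoothly at x = B
            decline = A * (phase + np.abs(phase)) * E
            return growth - decline

        for day in (30, 55, 60, 65, 90):
            print(f"day {day}: relative leaf area = {leaf_area(day):.2f}")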

  14. A Family-Centered Model for Sharing Genetic Risk.

    PubMed

    Daly, Mary B

    2015-01-01

    The successes of the Human Genome Project have ushered in a new era of genomic science. To effectively translate these discoveries, it will be critical to improve the communication of genetic risk within families. This will require a systematic approach that accounts for the nature of family relationships and sociocultural beliefs. This paper proposes the application of the Family Systems Illness Model, used in the setting of cancer care, to the evolving field of genomics. PMID:26479564

  15. Perceived risk for cancer in an urban sexual minority

    PubMed Central

    Hay, Jennifer L.; Coups, Elliot; Warren, Barbara; Li, Yuelin; Ostroff, Jamie S.

    2013-01-01

    Lesbians, gay men, and bisexuals are a sexual minority experiencing elevated cancer risk factors and health disparities, e.g., elevated tobacco use, disproportionate rates of infection with human immunodeficiency virus. Little attention has been paid to cancer prevention, education, and control in sexual minorities. This study describes cancer risk perceptions and their correlates so as to generate testable hypotheses and provide a foundation for targeting cancer prevention and risk reduction efforts in this high risk population. A cross-sectional survey of affiliates of a large urban community center serving sexual minority persons yielded a study sample of 247 anonymous persons. The survey assessed demographics, absolute perceived cancer risk, cancer risk behaviors, desired lifestyle changes to reduce cancer risk, and psychosocial variables including stress, depression, and stigma. Univariate and multivariate nonparametric statistics were used for analyses. The sample was primarily white non-Hispanic, middle-aged, and > 80% had at least a high school education. Mean values for absolute perceived cancer risk (range 0–100% risk), were 43.0 (SD = 25.4) for females, and for males, 49.3 (SD = 24.3). For females, although the multivariate regression model for absolute perceived cancer risk was statistically significant (P < .05), no single model variable was significant. For men, the multivariate regression model was significant (P < .001), with endorsement of “don't smoke/quit smoking” to reduce personal cancer risk (P < .001), and greater number of sexual partners (P = .054), positively associated with absolute perceived risk for cancer. This study provides novel data on cancer risk perceptions in sexual minorities, identifying correlates of absolute perceived cancer risk for each gender and several potential foci for cancer prevention interventions with this at-risk group. PMID:20872174

  16. Quasi-likelihood estimation for relative risk regression models.

    PubMed

    Carter, Rickey E; Lipsitz, Stuart R; Tilley, Barbara C

    2005-01-01

    For a prospective randomized clinical trial with two groups, the relative risk can be used as a measure of treatment effect and is directly interpretable as the ratio of success probabilities in the new treatment group versus the placebo group. For a prospective study with many covariates and a binary outcome (success or failure), relative risk regression may be of interest. If we model the log of the success probability as a linear function of covariates, the regression coefficients are log-relative risks. However, using such a log-linear model with a Bernoulli likelihood can lead to convergence problems in the Newton-Raphson algorithm. This is likely to occur when the success probabilities are close to one. A constrained likelihood method proposed by Wacholder (1986, American Journal of Epidemiology 123, 174-184) also has convergence problems. We propose a quasi-likelihood method of moments technique in which we naively assume the Bernoulli outcome is Poisson, with the mean (success probability) following a log-linear model. We use the Poisson maximum likelihood equations to estimate the regression coefficients without constraints. Using method of moments ideas, one can show that the estimates using the Poisson likelihood will be consistent and asymptotically normal. We apply these methods to a double-blinded randomized trial in primary biliary cirrhosis of the liver (Markus et al., 1989, New England Journal of Medicine 320, 1709-1713). PMID:15618526
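
    A minimal sketch of this estimator on simulated data, using the statsmodels package: the binary outcome is fitted with a Poisson working likelihood (log link), and robust sandwich standard errors are requested because the Poisson variance is deliberately misspecified for Bernoulli data. The covariates and coefficients are hypothetical.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(42)
n = 500
x1 = rng.normal(size=n)
x2 = rng.binomial(1, 0.4, size=n)
p = np.exp(-0.7 + 0.3 * x1 + 0.5 * x2)      # log-linear success probability
y = rng.binomial(1, np.clip(p, 0, 1))        # Bernoulli outcome

X = sm.add_constant(np.column_stack([x1, x2]))
# Poisson working likelihood, no constraints; HC0 gives sandwich errors
fit = sm.GLM(y, X, family=sm.families.Poisson()).fit(cov_type="HC0")
print(np.exp(fit.params))                    # estimated relative risks
```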

  17. Quantitative Risk Modeling of Fire on the International Space Station

    NASA Technical Reports Server (NTRS)

    Castillo, Theresa; Haught, Megan

    2014-01-01

    The International Space Station (ISS) Program has worked to prevent fire events and to mitigate their impacts should they occur. Hardware is designed to reduce sources of ignition, oxygen systems are designed to control leaking, flammable materials are prevented from flying to ISS whenever possible, the crew is trained in fire response, and fire response equipment improvements are sought out and funded. Fire prevention and mitigation are a top ISS Program priority - however, programmatic resources are limited; thus, risk trades are made to ensure an adequate level of safety is maintained onboard the ISS. In support of these risk trades, the ISS Probabilistic Risk Assessment (PRA) team has modeled the likelihood of fire occurring in the ISS pressurized cabin, a phenomenological event that has never before been probabilistically modeled in a microgravity environment. This paper will discuss the genesis of the ISS PRA fire model, its enhancement in collaboration with fire experts, and the results which have informed ISS programmatic decisions and will continue to be used throughout the life of the program.

  18. Regime switching model for financial data: Empirical risk analysis

    NASA Astrophysics Data System (ADS)

    Salhi, Khaled; Deaconu, Madalina; Lejay, Antoine; Champagnat, Nicolas; Navet, Nicolas

    2016-11-01

    This paper constructs a regime switching model for univariate Value-at-Risk estimation. Extreme value theory (EVT) and hidden Markov models (HMM) are combined to estimate a hybrid model that takes volatility clustering into account. In the first stage, the HMM is used to classify data into crisis and steady periods, while in the second stage, EVT is applied to the previously classified data to remove the delay between regime switches and their detection. This new model is applied to prices of numerous stocks exchanged on NYSE Euronext Paris over the period 2001-2011. We focus on daily returns, for which calibration has to be done on a small dataset. The relative performance of the regime switching model is benchmarked against other well-known modeling techniques, such as stable, power-law and GARCH models. The empirical results show that the regime switching model increases the predictive performance of financial forecasting according to the number of violations and tail-loss tests. This suggests that the regime switching model is a robust forecasting variant of the power-law model while remaining practical to implement for VaR measurement.
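
    A compressed sketch of the two-stage procedure on a return series, assuming the hmmlearn and scipy packages are available; the regime labelling, threshold choice, and peaks-over-threshold VaR formula below follow the generic EVT recipe, not the paper's exact calibration.

```python
import numpy as np
from scipy.stats import genpareto
from hmmlearn.hmm import GaussianHMM   # assumed installed; not the paper's code

def regime_switching_var(returns, alpha=0.99, threshold_q=0.90):
    """Stage 1: HMM labels each day as crisis/steady.
    Stage 2: a generalized Pareto tail is fitted to crisis-regime losses."""
    r = np.asarray(returns).reshape(-1, 1)
    hmm = GaussianHMM(n_components=2, covariance_type="diag", n_iter=200)
    states = hmm.fit(r).predict(r)
    crisis = states == np.argmax(hmm.covars_.ravel()[:2])  # higher-variance state
    losses = -r[crisis].ravel()
    u = np.quantile(losses, threshold_q)                   # EVT threshold
    xi, _, beta = genpareto.fit(losses[losses > u] - u, floc=0)
    p_exceed = np.mean(losses > u)
    # Peaks-over-threshold VaR (assumes xi != 0)
    return u + beta / xi * (((1 - alpha) / p_exceed) ** (-xi) - 1)
```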

  19. Evaluation of data for Sinkhole-development risk models

    NASA Astrophysics Data System (ADS)

    Upchurch, Sam B.; Littlefield, James R.

    1988-10-01

    Before risk assessments for sinkhole damage and indemnification are developed, a database must be created to predict the occurrence and distribution of sinkholes. This database must be evaluated in terms of the following questions: (1) are available records of modern sinkhole development adequate, (2) can the distribution of ancient sinks be used for predictive purposes, and (3) at what areal scale must sinkhole occurrences be evaluated for predictive and risk analysis purposes? Twelve 7.5' quadrangles with varying karst development in Hillsborough County, Florida provide insight into these questions. The area includes 179 modern sinks that developed between 1964 and 1985 and 2,303 ancient sinks. The sinks occur in urban, suburban, agricultural, and major forest wetland areas. The density of ancient sinks ranges from 0.1 to 3.2/km2 and averages 1.1/km2 for the entire area. The quadrangle area occupied by ancient sinks ranges from 0.3 to 10.2 percent. The distribution of ancient sinkholes within a quadrangle ranges from 0 to over 25 percent of the land surface. In bare karst areas, the sinks are localized along major lineaments, especially at lineament intersections. Where there is covered karst, ancient sinks may be obscured. Modern sinkholes did not develop uniformly through time; annual counts ranged from 0 to 29/yr. The regional occurrence rate is 7.6/yr. Most were reported in urban or suburban areas and their locations coincide with the lineament-controlled areas of ancient karst. Moving-average analysis indicates that the distribution of modern sinks is highly localized and ranges from 0 to 1.9/km2. Chi-square tests show that the distribution of ancient sinks in bare karst areas significantly predicts the locations of modern sinks. In areas of covered karst, the locations of ancient sinkholes do not predict modern sinks. It appears that risk-assessment models for sinkhole development can use the distribution of ancient sinks where bare karst is present. In covered karst areas

  20. A new model for polluted soil risk assessment

    NASA Astrophysics Data System (ADS)

    Andretta, M.; Villani, M.; Serra, R.

    2003-04-01

    In recent years, the problem of evaluating the risk related to soil pollution has become more and more important all over the world. The increasing number of polluted soils in all the industrialised countries has required the formalisation of well defined methodologies for defining the technical and economical limits of soil remediation. Mainly, these limits are defined in terms of general threshold values that, in some cases, cannot be reached even with the so-called Best Available Technology (B.A.T.), due for example to the characteristics of the pollutants or of the affected soil, or to the extremely high cost or duration of the remedial intervention. For these reasons, both in the North American countries and in the European ones, many alternative methodologies based on systematic and scientifically well founded approaches have been developed, in order to determine the real effects of the pollution on the receptor targets. Typically, these methodologies are organised into different levels of detail, the so-called "TIERS". Tier 1 is based on a conservative estimation of the risk for the targets, derived from very general, "worst case" situations. Tier 2 is based on a more detailed and site-specific estimation of the hazard, evaluated by the use of semi-empirical, analytical formulas for the source characterisation, the transport of the pollutant, and the evaluation of target exposure. Tier 3 is the most detailed and site-specific level of application of the risk assessment methodologies and requires the use of numerical methods with much detailed information on the site and on the receptors (e.g. chemical/physical parameters of the pollutants, hydro-geological data, exposure data, etc.). In this paper, we describe the most important theoretical aspects of the polluted soil risk assessment methodologies and the relevant role played, in this kind of analysis, by the pollutant transport models. In particular, we describe a new and innovative

  1. Reduced Risk of Importing Ebola Virus Disease because of Travel Restrictions in 2014: A Retrospective Epidemiological Modeling Study

    PubMed Central

    Otsuki, Shiori

    2016-01-01

    Background An epidemic of Ebola virus disease (EVD) from 2013–16 posed a serious risk of global spread during its early growth phase. A post-epidemic evaluation of the effectiveness of travel restrictions has yet to be conducted. The present study aimed to estimate the effectiveness of travel restrictions in reducing the risk of importation from mid-August to September, 2014, using a simple hazard-based statistical model. Methodology/Principal Findings The hazard rate was modeled as an inverse function of the effective distance, an excellent predictor of disease spread, which was calculated from the airline transportation network. By analyzing datasets of the date of EVD case importation from the 15th of July to the 15th of September 2014, and assuming that the network structure changed from the 8th of August 2014 because of travel restrictions, parameters that characterized the hazard rate were estimated. The absolute risk reduction and relative risk reductions due to travel restrictions were estimated to be less than 1% and about 20%, respectively, for all models tested. Effectiveness estimates among African countries were greater than those for other countries outside Africa. Conclusions The travel restrictions were not effective enough to expect the prevention of global spread of Ebola virus disease. It is more efficient to control the spread of disease locally during an early phase of an epidemic than to attempt to control the epidemic at international borders. Capacity building for local containment and coordinated and expedited international cooperation are essential to reduce the risk of global transmission. PMID:27657544
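
    A toy version of such a hazard-based estimation, assuming a hazard rate inversely proportional to effective distance and exponentially distributed importation times; the distances, dates, and one-parameter form below are hypothetical, not the study's fitted model.

```python
import numpy as np
from scipy.optimize import minimize

def neg_log_likelihood(theta, d_eff, t_import, t_max):
    """lambda_i = exp(beta0) / d_eff_i; countries with no importation
    by t_max contribute a censored survival term."""
    lam = np.exp(theta[0]) / d_eff
    ll = 0.0
    for lam_i, t_i in zip(lam, t_import):
        if np.isnan(t_i):                    # censored: survived to t_max
            ll += -lam_i * t_max
        else:                                # density of importation at t_i
            ll += np.log(lam_i) - lam_i * t_i
    return -ll

d_eff = np.array([3.1, 4.5, 6.2, 8.0])          # hypothetical effective distances
t_imp = np.array([12.0, 30.0, np.nan, np.nan])  # importation day or NaN
fit = minimize(neg_log_likelihood, x0=[0.0], args=(d_eff, t_imp, 60.0))
print(np.exp(fit.x[0]))                         # fitted hazard scale
```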

  2. Modeling the operational risk in Iranian commercial banks: case study of a private bank

    NASA Astrophysics Data System (ADS)

    Momen, Omid; Kimiagari, Alimohammad; Noorbakhsh, Eaman

    2012-08-01

    The Basel Committee on Banking Supervision from the Bank for International Settlements classifies banking risks into three main categories: credit risk, market risk, and operational risk. The focus of this study is on operational risk measurement in Iranian banks. Therefore, issues arising when trying to implement operational risk models in Iran are discussed, and then some solutions are recommended. Moreover, all steps of operational risk measurement based on the Loss Distribution Approach, with Iran-specific modifications, are presented. We employed the approach of this study to model the operational risk of an Iranian private bank. The results are quite reasonable when compared with the scale of the bank and with other risk categories.
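
    The Loss Distribution Approach itself is easy to sketch: simulate an annual number of loss events from a frequency distribution, draw a severity for each event, and read a high quantile off the aggregate-loss distribution. The Poisson/lognormal choice and all parameter values below are illustrative; the bank's calibrated distributions are not public.

```python
import numpy as np

def lda_operational_var(lam, mu, sigma, alpha=0.999, n_sims=100_000, seed=0):
    """LDA Monte Carlo: Poisson frequency, lognormal severity.
    Returns the alpha-quantile of simulated annual aggregate losses."""
    rng = np.random.default_rng(seed)
    counts = rng.poisson(lam, size=n_sims)               # events per year
    totals = np.array([rng.lognormal(mu, sigma, size=k).sum() for k in counts])
    return np.quantile(totals, alpha)

# Hypothetical calibration: ~25 loss events/year, heavy-tailed severities
print(lda_operational_var(lam=25, mu=8.0, sigma=1.2))
```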

  3. Time-based collision risk modeling for air traffic management

    NASA Astrophysics Data System (ADS)

    Bell, Alan E.

    Since the emergence of commercial aviation in the early part of last century, economic forces have driven a steadily increasing demand for air transportation. Increasing density of aircraft operating in a finite volume of airspace is accompanied by a corresponding increase in the risk of collision, and in response to a growing number of incidents and accidents involving collisions between aircraft, governments worldwide have developed air traffic control systems and procedures to mitigate this risk. The objective of any collision risk management system is to project conflicts and provide operators with sufficient opportunity to recognize potential collisions and take necessary actions to avoid them. It is therefore the assertion of this research that the currency of collision risk management is time. Future Air Traffic Management Systems are being designed around the foundational principle of four dimensional trajectory based operations, a method that replaces legacy first-come, first-served sequencing priorities with time-based reservations throughout the airspace system. This research will demonstrate that if aircraft are to be sequenced in four dimensions, they must also be separated in four dimensions. In order to separate aircraft in four dimensions, time must emerge as the primary tool by which air traffic is managed. A functional relationship exists between the time-based performance of aircraft, the interval between aircraft scheduled to cross some three dimensional point in space, and the risk of collision. This research models that relationship and presents two key findings. First, a method is developed by which the ability of an aircraft to meet a required time of arrival may be expressed as a robust standard for both industry and operations. Second, a method by which airspace system capacity may be increased while maintaining an acceptable level of collision risk is presented and demonstrated for the purpose of formulating recommendations for procedures

  4. A Spatio-temporal Model of African Animal Trypanosomosis Risk

    PubMed Central

    Dicko, Ahmadou H.; Percoma, Lassane; Sow, Adama; Adam, Yahaya; Mahama, Charles; Sidibé, Issa; Dayo, Guiguigbaza-Kossigan; Thévenon, Sophie; Fonta, William; Sanfo, Safietou; Djiteye, Aligui; Salou, Ernest; Djohan, Vincent; Cecchi, Giuliano; Bouyer, Jérémy

    2015-01-01

    Background African animal trypanosomosis (AAT) is a major constraint to sustainable development of cattle farming in sub-Saharan Africa. The habitat of the tsetse fly vector is increasingly fragmented owing to demographic pressure and shifts in climate, which leads to heterogeneous risk of cyclical transmission both in space and time. In Burkina Faso and Ghana, the most important vectors are riverine species, namely Glossina palpalis gambiensis and G. tachinoides, which are more resilient to human-induced changes than the savannah and forest species. Although many authors studied the distribution of AAT risk both in space and time, spatio-temporal models allowing predictions of it are lacking. Methodology/Principal Findings We used datasets generated by various projects, including two baseline surveys conducted in Burkina Faso and Ghana within PATTEC (Pan African Tsetse and Trypanosomosis Eradication Campaign) national initiatives. We computed the entomological inoculation rate (EIR) or tsetse challenge using a range of environmental data. The tsetse apparent density and their infection rate were separately estimated and subsequently combined to derive the EIR using a “one layer-one model” approach. The estimated EIR was then projected into suitable habitat. This risk index was finally validated against data on bovine trypanosomosis. It allowed a good prediction of the parasitological status (r2 = 67%), showed a positive correlation but less predictive power with serological status (r2 = 22%) aggregated at the village level but was not related to the illness status (r2 = 2%). Conclusions/Significance The presented spatio-temporal model provides a fine-scale picture of the dynamics of AAT risk in sub-humid areas of West Africa. The estimated EIR was high in the proximity of rivers during the dry season and more widespread during the rainy season. The present analysis is a first step in a broader framework for an efficient risk management of climate

  5. Peer Review of NRC Standardized Plant Analysis Risk Models

    SciTech Connect

    Anthony Koonce; James Knudsen; Robert Buell

    2011-03-01

    The Nuclear Regulatory Commission (NRC) Standardized Plant Analysis Risk (SPAR) models underwent a peer review using the ASME PRA standard (Addendum C) as endorsed by NRC in Regulatory Guide (RG) 1.200. The review was performed by a mix of industry probabilistic risk analysis (PRA) experts and NRC PRA experts. Representative SPAR models, one PWR and one BWR, were reviewed against Capability Category I of the ASME PRA standard. Capability Category I was selected as the basis for review due to the specific uses/applications of the SPAR models. The BWR SPAR model was reviewed against 331 ASME PRA standard supporting requirements; however, based on the Capability Category I level of review and the absence of internal flooding and containment performance (LERF) logic, only 216 requirements were determined to be applicable. Based on the review, the BWR SPAR model met 139 of the 216 supporting requirements. The review also generated 200 findings or suggestions; of these, 142 were findings and 58 were suggestions. The PWR SPAR model was also evaluated against the same 331 ASME PRA standard supporting requirements. Of these requirements, only 215 were deemed appropriate for the review (for the same reason as noted for the BWR). The PWR review determined that 125 of the 215 supporting requirements met Capability Category I or greater. The review identified 101 findings or suggestions (76 findings and 25 suggestions). These findings or suggestions were developed to identify areas where SPAR models could be enhanced. A process to prioritize the findings/suggestions and incorporate them into the SPAR models' supporting requirements is being developed. The prioritization process focuses on those findings that will enhance the accuracy, completeness, and usability of the SPAR models.

  6. A Novel Risk Score to the Prediction of 10-year Risk for Coronary Artery Disease Among the Elderly in Beijing Based on Competing Risk Model

    PubMed Central

    Liu, Long; Tang, Zhe; Li, Xia; Luo, Yanxia; Guo, Jin; Li, Haibin; Liu, Xiangtong; Tao, Lixin; Yan, Aoshuang; Guo, Xiuhua

    2016-01-01

    Abstract The study aimed to construct a risk prediction model for coronary artery disease (CAD) based on a competing risk model among the elderly in Beijing and to develop a user-friendly CAD risk score tool. We used a competing risk model to evaluate the risk of developing a first CAD event. On the basis of the risk factors that were included in the competing risk model, we constructed the CAD risk prediction model with a Cox proportional hazard model. Time-dependent receiver operating characteristic (ROC) curves and time-dependent area under the ROC curve (AUC) were used to evaluate the discrimination ability of both methods. Calibration plots were applied to assess the calibration ability and adjusted for the competing risk of non-CAD death. Net reclassification index (NRI) and integrated discrimination improvement (IDI) were applied to quantify the improvement contributed by the new risk factors. Internal validation of predictive accuracy was performed using 1000 bootstrap re-samples. Of the 1775 participants without CAD at baseline, 473 incident cases of CAD were documented over a 20-year follow-up. Time-dependent AUCs for men and women at t = 10 years were 0.841 [95% confidence interval (95% CI): 0.806–0.877] and 0.804 (95% CI: 0.768–0.839) in the Fine and Gray model, and 0.784 (95% CI: 0.738–0.830) and 0.733 (95% CI: 0.692–0.775) in the Cox proportional hazard model. The competing risk model was significantly superior to the Cox proportional hazard model on discrimination and calibration. The cut-off values of the risk score that marked the difference between low-risk and high-risk patients were 34 points for men and 30 points for women, which have good sensitivity and specificity. A sex-specific multivariable risk factor algorithm-based competing risk model has been developed on the basis of an elderly Chinese cohort, which could be applied to predict an individual's risk and provide a useful guide to identify the groups at a high risk for CAD among the Chinese

  7. A Novel Risk Score to the Prediction of 10-year Risk for Coronary Artery Disease Among the Elderly in Beijing Based on Competing Risk Model.

    PubMed

    Liu, Long; Tang, Zhe; Li, Xia; Luo, Yanxia; Guo, Jin; Li, Haibin; Liu, Xiangtong; Tao, Lixin; Yan, Aoshuang; Guo, Xiuhua

    2016-03-01

    The study aimed to construct a risk prediction model for coronary artery disease (CAD) based on a competing risk model among the elderly in Beijing and to develop a user-friendly CAD risk score tool. We used a competing risk model to evaluate the risk of developing a first CAD event. On the basis of the risk factors that were included in the competing risk model, we constructed the CAD risk prediction model with a Cox proportional hazard model. Time-dependent receiver operating characteristic (ROC) curves and time-dependent area under the ROC curve (AUC) were used to evaluate the discrimination ability of both methods. Calibration plots were applied to assess the calibration ability and adjusted for the competing risk of non-CAD death. Net reclassification index (NRI) and integrated discrimination improvement (IDI) were applied to quantify the improvement contributed by the new risk factors. Internal validation of predictive accuracy was performed using 1000 bootstrap re-samples. Of the 1775 participants without CAD at baseline, 473 incident cases of CAD were documented over a 20-year follow-up. Time-dependent AUCs for men and women at t = 10 years were 0.841 [95% confidence interval (95% CI): 0.806-0.877] and 0.804 (95% CI: 0.768-0.839) in the Fine and Gray model, and 0.784 (95% CI: 0.738-0.830) and 0.733 (95% CI: 0.692-0.775) in the Cox proportional hazard model. The competing risk model was significantly superior to the Cox proportional hazard model on discrimination and calibration. The cut-off values of the risk score that marked the difference between low-risk and high-risk patients were 34 points for men and 30 points for women, which have good sensitivity and specificity. A sex-specific multivariable risk factor algorithm-based competing risk model has been developed on the basis of an elderly Chinese cohort, which could be applied to predict an individual's risk and provide a useful guide to identify the groups at a high risk for CAD among the Chinese adults over 55
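
    For the competing-risk ingredient, a hedged sketch using the nonparametric Aalen-Johansen estimator from the lifelines package (assumed installed) on simulated follow-up data; the paper's Fine and Gray regression needs dedicated software, so this only illustrates the cumulative incidence quantity that such a model targets. The attribute name follows recent lifelines releases.

```python
import numpy as np
from lifelines import AalenJohansenFitter   # assumed available

rng = np.random.default_rng(1)
n = 1775
durations = rng.exponential(15, size=n).clip(max=20)     # years of follow-up
# 0 = censored, 1 = CAD event, 2 = competing non-CAD death (hypothetical mix)
events = rng.choice([0, 1, 2], size=n, p=[0.60, 0.27, 0.13])

ajf = AalenJohansenFitter()
ajf.fit(durations, events, event_of_interest=1)
print(ajf.cumulative_density_.tail())       # cumulative incidence of CAD
```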

  8. Integrated Assessment Modeling for Carbon Storage Risk and Uncertainty Quantification

    NASA Astrophysics Data System (ADS)

    Bromhal, G. S.; Dilmore, R.; Pawar, R.; Stauffer, P. H.; Gastelum, J.; Oldenburg, C. M.; Zhang, Y.; Chu, S.

    2013-12-01

    The National Risk Assessment Partnership (NRAP) has developed tools to perform quantitative risk assessment at site-specific locations for long-term carbon storage. The approach that is being used is to divide the storage and containment system into components (e.g., reservoirs, seals, wells, groundwater aquifers), to develop detailed models for each component, to generate reduced order models (ROMs) based on the detailed models, and to reconnect the reduced order models within an integrated assessment model (IAM). CO2-PENS, developed at Los Alamos National Lab, is being used as the IAM for the simulations in this study. The benefit of this approach is that simulations of the complete system can be generated on a relatively rapid time scale so that Monte Carlo simulation can be performed. In this study, hundreds of thousands of runs of the IAMs have been generated to estimate likelihoods of the quantity of CO2 released to the atmosphere, size of aquifer impacted by pH, size of aquifer impacted by TDS, and size of aquifer with different metals concentrations. Correlations of the output variables with different reservoir, seal, wellbore, and aquifer parameters have been generated. Importance measures have been identified, and inputs have been ranked in the order of their impact on the output quantities. This presentation describes the approach used, representative results, and implications for how the Monte Carlo analysis is implemented for uncertainty quantification.
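
    The workflow reduces to a few lines once the reduced order models exist: sample uncertain inputs, push every sample through a cheap surrogate, and rank inputs by rank correlation with the output. The surrogate formula and input distributions below are hypothetical placeholders, not NRAP's ROMs.

```python
import numpy as np

def leakage_rom(perm, seal_thickness, well_density):
    """Stand-in surrogate for a detailed component model (hypothetical)."""
    return perm * well_density / seal_thickness

rng = np.random.default_rng(7)
n = 100_000
perm  = rng.lognormal(-2.0, 0.5, n)      # sampled input distributions
seal  = rng.uniform(50, 150, n)
wells = rng.poisson(3, n) + 1

released = leakage_rom(perm, seal, wells)

def rank(x):                             # ranks for a Spearman-style measure
    return np.argsort(np.argsort(x))

for name, x in [("perm", perm), ("seal", seal), ("wells", wells)]:
    print(name, round(np.corrcoef(rank(x), rank(released))[0, 1], 3))
```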

  9. Approximate Uncertainty Modeling in Risk Analysis with Vine Copulas

    PubMed Central

    Bedford, Tim; Daneshkhah, Alireza

    2015-01-01

    Many applications of risk analysis require us to jointly model multiple uncertain quantities. Bayesian networks and copulas are two common approaches to modeling joint uncertainties with probability distributions. This article focuses on new methodologies for copulas by developing work of Cooke, Bedford, Kurowicka, and others on vines as a way of constructing higher dimensional distributions that do not suffer from some of the restrictions of alternatives such as the multivariate Gaussian copula. The article provides a fundamental approximation result, demonstrating that we can approximate any density as closely as we like using vines. It further operationalizes this result by showing how minimum information copulas can be used to provide parametric classes of copulas that have such good levels of approximation. We extend previous approaches using vines by considering nonconstant conditional dependencies, which are particularly relevant in financial risk modeling. We discuss how such models may be quantified, in terms of expert judgment or by fitting data, and illustrate the approach by modeling two financial data sets. PMID:26332240
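
    A small sketch of the vine machinery: sampling a three-variable C-vine built from Gaussian pair copulas via the usual h-function recursion. The correlations are arbitrary and Gaussian pairs are used only for brevity; the minimum-information copulas discussed in the article would take their place.

```python
import numpy as np
from scipy.stats import norm

def h(u, v, rho):
    """Conditional CDF h(u|v) of the Gaussian pair copula."""
    return norm.cdf((norm.ppf(u) - rho * norm.ppf(v)) / np.sqrt(1 - rho**2))

def h_inv(p, v, rho):
    """Inverse of h in its first argument."""
    return norm.cdf(norm.ppf(p) * np.sqrt(1 - rho**2) + rho * norm.ppf(v))

rng = np.random.default_rng(0)
n = 10_000
rho12, rho13, rho23_1 = 0.6, 0.4, 0.3     # pair copulas C12, C13, C23|1

w = rng.uniform(size=(n, 3))
u1 = w[:, 0]
u2 = h_inv(w[:, 1], u1, rho12)
t  = h_inv(w[:, 2], h(u2, u1, rho12), rho23_1)
u3 = h_inv(t, u1, rho13)                  # (u1,u2,u3) carries the vine dependence
```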

  10. Future bloom and blossom frost risk for Malus domestica considering climate model and impact model uncertainties.

    PubMed

    Hoffmann, Holger; Rath, Thomas

    2013-01-01

    The future bloom and risk of blossom frosts for Malus domestica were projected using regional climate realizations and phenological (= impact) models. As climate impact projections are susceptible to uncertainties of climate and impact models and of model concatenation, the significant horizon of the climate impact signal was analyzed by applying 7 impact models, including two new developments, to 13 climate realizations of the IPCC emission scenario A1B. Advancement of phenophases and a decrease in blossom frost risk for Lower Saxony (Germany) for early and late ripeners were projected by six out of seven phenological models. Single model/single grid point time series of bloom showed significant trends by 2021-2050 compared to 1971-2000, whereas the joint signal of all climate and impact models did not stabilize until 2043. Regarding blossom frost risk, joint projection variability exceeded the projected signal. Thus, blossom frost risk cannot be stated to be lower by the end of the 21st century despite a negative trend; as a consequence, however, it is unlikely to increase. Uncertainty of temperature, blooming date and blossom frost risk projections reached a minimum at 2078-2087. The projected phenophases advanced by 5.5 d K⁻¹, showing partial compensation of delayed fulfillment of the winter chill requirement and faster completion of the following forcing phase in spring. Finally, phenological model performance was improved by considering the length of day.

  11. Petri net modeling of fault analysis for probabilistic risk assessment

    NASA Astrophysics Data System (ADS)

    Lee, Andrew

    Fault trees and event trees have been widely accepted as the modeling strategy for performing Probabilistic Risk Assessment (PRA). However, there are several limitations associated with fault tree/event tree modeling: (1) it considers only binary events; (2) it assumes independence among basic events; and (3) it does not consider the timing sequence of basic events. This thesis investigates Petri net modeling as a potential alternative for PRA modeling. Petri nets have mainly been used as a simulation tool for queuing and network systems. However, it has been suggested that they could also model failure scenarios, and thus could be a potential modeling strategy for PRA. In this thesis, the transformations required to model logic gates in a fault tree by Petri nets are explored. The gap between fault tree analysis and Petri net analysis is bridged through gate equivalency analysis. Methods for qualitative and quantitative analysis of Petri nets are presented. Techniques are developed and implemented to revise and tailor traditional Petri net modeling for system failure analysis. The airlock system and the maintenance cooling system of a CANada Deuterium Uranium (CANDU) reactor are used as case studies to demonstrate Petri nets' ability to model system failure and provide a structured approach for qualitative and quantitative analysis. The minimal cutsets and the probability of the airlock system failing to maintain the pressure boundary are obtained. Furthermore, the case study is extended to non-coherent system analysis due to system maintenance.
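
    A toy implementation of the gate equivalency idea: a fault-tree AND gate becomes a transition that fires only when every input place is marked, and an OR gate becomes two transitions sharing one output place. Place names are hypothetical.

```python
class PetriNet:
    """Minimal Petri net: places hold tokens; a transition fires when
    all of its input places are non-empty, moving tokens downstream."""

    def __init__(self, marking):
        self.marking = dict(marking)          # place -> token count
        self.transitions = []                 # (input places, output places)

    def add_transition(self, inputs, outputs):
        self.transitions.append((inputs, outputs))

    def run(self):
        fired = True
        while fired:
            fired = False
            for inputs, outputs in self.transitions:
                if all(self.marking.get(p, 0) > 0 for p in inputs):
                    for p in inputs:
                        self.marking[p] -= 1
                    for p in outputs:
                        self.marking[p] = self.marking.get(p, 0) + 1
                    fired = True

net = PetriNet({"pump_fails": 1, "valve_fails": 1})
net.add_transition(["pump_fails", "valve_fails"], ["cooling_lost"])  # AND gate
net.run()
print(net.marking.get("cooling_lost", 0))    # 1 -> top event reached
```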

  12. Guide for developing conceptual models for ecological risk assessments

    SciTech Connect

    Suter, G.W., II

    1996-05-01

    Ecological conceptual models are the result of the problem formulation phase of an ecological risk assessment, which is an important component of the Remedial Investigation process. They present hypotheses of how the site contaminants might affect the site ecology. The contaminant sources, media, routes of exposure, and endpoint receptors are presented in the form of a flow chart. This guide is for preparing the conceptual models; use of this guide will standardize the models so that they will be of high quality, useful to the assessment process, and sufficiently consistent so that connections between sources of exposure and receptors can be extended across operable units (OU). Generic conceptual models are presented for source, aquatic integrator, groundwater integrator, and terrestrial OUs.

  13. Modeling and managing risk early in software development

    NASA Technical Reports Server (NTRS)

    Briand, Lionel C.; Thomas, William M.; Hetmanski, Christopher J.

    1993-01-01

    In order to improve the quality of the software development process, we need to be able to build empirical multivariate models based on data collectable early in the software process. These models need to be both useful for prediction and easy to interpret, so that remedial actions may be taken in order to control and optimize the development process. We present an automated modeling technique which can be used as an alternative to regression techniques. We show how it can be used to facilitate the identification and aid the interpretation of the significant trends which characterize 'high risk' components in several Ada systems. Finally, we evaluate the effectiveness of our technique based on a comparison with logistic regression based models.

  14. Is Implantation of a Left Ventricular Assist Device in Patients With Critical or Impending Cardiogenic Shock an Absolute Contraindication? Looking Back at Our Past Experience Trying to Identify Contraindicative Risk Factors.

    PubMed

    Dell'Aquila, Angelo Maria; Schneider, Stefan R B; Risso, Paolo; Welp, Henryk; Glockner, David G; Alles, Sebastian; Sindermann, Jürgen R; Scherer, Mirela

    2015-12-01

    Poor survival has been demonstrated after ventricular assist device (VAD) implantation for Interagency Registry for Mechanically Assisted Circulatory Support (INTERMACS) profile 1 and 2 patients compared with more stable levels. However, risk factors within this high-risk cohort have not been determined so far. The aim of the present study was to identify risk factors associated with this very high mortality rate. Between February 1993 and January 2013, 298 patients underwent VAD implantation in our institution. One hundred nine patients were in INTERMACS level 1 and 49 patients were in INTERMACS level 2 and were therefore defined as hemodynamically critical (overall 158 patients). Assist devices implanted were: HVAD HeartWare n = 18; Incor n = 11; VentrAssist n = 2; DeBakey n = 22; and pulsatile systems n = 105. After cumulative support duration of 815.35 months, Kaplan-Meier analysis revealed a survival of 63.9, 48.8, and 40.3% at 1, 6, and 12 months, respectively. Cox regression analyses identified age > 50 (P = 0.001, odds ratio [OR] 2.48), white blood cell count > 13.000/μL (P = 0.01, OR 2.06), preoperative renal replacement therapy (P = 0.001, OR 2.63), and postcardiotomy failure (P < 0.001, OR 2.79) as independent predictors of mortality. Of note, last generation VADs were not associated with significantly better 6-month survival (P = 0.59). Patients without the aforementioned risk factors could yield a survival of 79.2% at 6 months. This single-center experience shows that VAD implantation in hemodynamically unstable patients generally results in poor early outcome, even in third-generation pumps. However, avoiding the aforementioned risk factors could result in improved outcome.

  15. Predictive model of avian electrocution risk on overhead power lines.

    PubMed

    Dwyer, J F; Harness, R E; Donohue, K

    2014-02-01

    Electrocution on overhead power structures negatively affects avian populations in diverse ecosystems worldwide, contributes to the endangerment of raptor populations in Europe and Africa, and is a major driver of legal action against electric utilities in North America. We investigated factors associated with avian electrocutions so poles that are likely to electrocute a bird can be identified and retrofitted prior to causing avian mortality. We used historical data from southern California to identify patterns of avian electrocution by voltage, month, and year to identify species most often killed by electrocution in our study area and to develop a predictive model that compared poles where an avian electrocution was known to have occurred (electrocution poles) with poles where no known electrocution occurred (comparison poles). We chose variables that could be quantified by personnel with little training in ornithology or electric systems. Electrocutions were more common at distribution voltages (≤ 33 kV) and during breeding seasons and were more commonly reported after a retrofitting program began. Red-tailed Hawks (Buteo jamaicensis) (n = 265) and American Crows (Corvus brachyrhynchos) (n = 258) were the most commonly electrocuted species. In the predictive model, 4 of 14 candidate variables were required to distinguish electrocution poles from comparison poles: number of jumpers (short wires connecting energized equipment), number of primary conductors, presence of grounding, and presence of unforested unpaved areas as the dominant nearby land cover. When tested against a sample of poles not used to build the model, our model distributed poles relatively normally across electrocution-risk values and identified the average risk as higher for electrocution poles relative to comparison poles. Our model can be used to reduce avian electrocutions through proactive identification and targeting of high-risk poles for retrofitting. PMID:24033371

  16. Predictive model of avian electrocution risk on overhead power lines.

    PubMed

    Dwyer, J F; Harness, R E; Donohue, K

    2014-02-01

    Electrocution on overhead power structures negatively affects avian populations in diverse ecosystems worldwide, contributes to the endangerment of raptor populations in Europe and Africa, and is a major driver of legal action against electric utilities in North America. We investigated factors associated with avian electrocutions so poles that are likely to electrocute a bird can be identified and retrofitted prior to causing avian mortality. We used historical data from southern California to identify patterns of avian electrocution by voltage, month, and year to identify species most often killed by electrocution in our study area and to develop a predictive model that compared poles where an avian electrocution was known to have occurred (electrocution poles) with poles where no known electrocution occurred (comparison poles). We chose variables that could be quantified by personnel with little training in ornithology or electric systems. Electrocutions were more common at distribution voltages (≤ 33 kV) and during breeding seasons and were more commonly reported after a retrofitting program began. Red-tailed Hawks (Buteo jamaicensis) (n = 265) and American Crows (Corvus brachyrhynchos) (n = 258) were the most commonly electrocuted species. In the predictive model, 4 of 14 candidate variables were required to distinguish electrocution poles from comparison poles: number of jumpers (short wires connecting energized equipment), number of primary conductors, presence of grounding, and presence of unforested unpaved areas as the dominant nearby land cover. When tested against a sample of poles not used to build the model, our model distributed poles relatively normally across electrocution-risk values and identified the average risk as higher for electrocution poles relative to comparison poles. Our model can be used to reduce avian electrocutions through proactive identification and targeting of high-risk poles for retrofitting.
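
    In essence the predictive model is a pole-level classifier on four variables, which can be sketched as a logistic regression on simulated pole data; every frequency and coefficient below is invented for illustration.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(3)
n = 1000
X = np.column_stack([
    rng.poisson(2, n),            # number of jumpers
    rng.integers(1, 4, n),        # number of primary conductors
    rng.binomial(1, 0.5, n),      # grounding present
    rng.binomial(1, 0.4, n),      # unforested unpaved dominant land cover
])
logit = -3 + 0.6 * X[:, 0] + 0.4 * X[:, 1] + 0.8 * X[:, 2] + 0.7 * X[:, 3]
y = rng.binomial(1, 1 / (1 + np.exp(-logit)))    # electrocution pole?

model = LogisticRegression().fit(X, y)
risk = model.predict_proba(X)[:, 1]              # per-pole risk score
print("poles flagged for retrofitting:", (risk > 0.5).sum())
```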

  17. Predicting the risk of rheumatoid arthritis and its age of onset through modelling genetic risk variants with smoking.

    PubMed

    Scott, Ian C; Seegobin, Seth D; Steer, Sophia; Tan, Rachael; Forabosco, Paola; Hinks, Anne; Eyre, Stephen; Morgan, Ann W; Wilson, Anthony G; Hocking, Lynne J; Wordsworth, Paul; Barton, Anne; Worthington, Jane; Cope, Andrew P; Lewis, Cathryn M

    2013-01-01

    The improved characterisation of risk factors for rheumatoid arthritis (RA) suggests they could be combined to identify individuals at increased disease risk in whom preventive strategies may be evaluated. We aimed to develop an RA prediction model capable of generating clinically relevant predictive data and to determine if it better predicted younger onset RA (YORA). Our novel modelling approach combined odds ratios for 15 four-digit/10 two-digit HLA-DRB1 alleles, 31 single nucleotide polymorphisms (SNPs) and ever-smoking status in males to determine risk using computer simulation and confidence interval based risk categorisation. Only males were evaluated in our models incorporating smoking, as ever-smoking is a significant risk factor for RA in men but not women. We developed multiple models to evaluate each risk factor's impact on prediction. Each model's ability to discriminate anti-citrullinated protein antibody (ACPA)-positive RA from controls was evaluated in two cohorts: Wellcome Trust Case Control Consortium (WTCCC: 1,516 cases; 1,647 controls); UK RA Genetics Group Consortium (UKRAGG: 2,623 cases; 1,500 controls). HLA and smoking provided the strongest prediction, with good discrimination evidenced by an HLA-smoking model area under the curve (AUC) value of 0.813 in both WTCCC and UKRAGG. SNPs provided minimal prediction (AUC 0.660 WTCCC/0.617 UKRAGG). Whilst high individual risks were identified, with some cases having estimated lifetime risks of 86%, only a minority overall had substantially increased odds for RA. High risks from the HLA model were associated with YORA (P<0.0001); ever-smoking was associated with older onset disease. This latter finding suggests smoking's impact on RA risk manifests later in life. Our modelling demonstrates that combining risk factors provides clinically informative RA prediction; additionally, HLA and smoking status can be used to predict the risk of younger and older onset RA, respectively. PMID:24068971

  18. Predicting the risk of rheumatoid arthritis and its age of onset through modelling genetic risk variants with smoking.

    PubMed

    Scott, Ian C; Seegobin, Seth D; Steer, Sophia; Tan, Rachael; Forabosco, Paola; Hinks, Anne; Eyre, Stephen; Morgan, Ann W; Wilson, Anthony G; Hocking, Lynne J; Wordsworth, Paul; Barton, Anne; Worthington, Jane; Cope, Andrew P; Lewis, Cathryn M

    2013-01-01

    The improved characterisation of risk factors for rheumatoid arthritis (RA) suggests they could be combined to identify individuals at increased disease risk in whom preventive strategies may be evaluated. We aimed to develop an RA prediction model capable of generating clinically relevant predictive data and to determine if it better predicted younger onset RA (YORA). Our novel modelling approach combined odds ratios for 15 four-digit/10 two-digit HLA-DRB1 alleles, 31 single nucleotide polymorphisms (SNPs) and ever-smoking status in males to determine risk using computer simulation and confidence interval based risk categorisation. Only males were evaluated in our models incorporating smoking, as ever-smoking is a significant risk factor for RA in men but not women. We developed multiple models to evaluate each risk factor's impact on prediction. Each model's ability to discriminate anti-citrullinated protein antibody (ACPA)-positive RA from controls was evaluated in two cohorts: Wellcome Trust Case Control Consortium (WTCCC: 1,516 cases; 1,647 controls); UK RA Genetics Group Consortium (UKRAGG: 2,623 cases; 1,500 controls). HLA and smoking provided the strongest prediction, with good discrimination evidenced by an HLA-smoking model area under the curve (AUC) value of 0.813 in both WTCCC and UKRAGG. SNPs provided minimal prediction (AUC 0.660 WTCCC/0.617 UKRAGG). Whilst high individual risks were identified, with some cases having estimated lifetime risks of 86%, only a minority overall had substantially increased odds for RA. High risks from the HLA model were associated with YORA (P<0.0001); ever-smoking was associated with older onset disease. This latter finding suggests smoking's impact on RA risk manifests later in life. Our modelling demonstrates that combining risk factors provides clinically informative RA prediction; additionally, HLA and smoking status can be used to predict the risk of younger and older onset RA, respectively.

  19. Predicting the Risk of Rheumatoid Arthritis and Its Age of Onset through Modelling Genetic Risk Variants with Smoking

    PubMed Central

    Scott, Ian C.; Seegobin, Seth D.; Steer, Sophia; Tan, Rachael; Forabosco, Paola; Hinks, Anne; Eyre, Stephen; Morgan, Ann W.; Wilson, Anthony G.; Hocking, Lynne J.; Wordsworth, Paul; Barton, Anne; Worthington, Jane; Cope, Andrew P.; Lewis, Cathryn M.

    2013-01-01

    The improved characterisation of risk factors for rheumatoid arthritis (RA) suggests they could be combined to identify individuals at increased disease risk in whom preventive strategies may be evaluated. We aimed to develop an RA prediction model capable of generating clinically relevant predictive data and to determine if it better predicted younger onset RA (YORA). Our novel modelling approach combined odds ratios for 15 four-digit/10 two-digit HLA-DRB1 alleles, 31 single nucleotide polymorphisms (SNPs) and ever-smoking status in males to determine risk using computer simulation and confidence interval based risk categorisation. Only males were evaluated in our models incorporating smoking, as ever-smoking is a significant risk factor for RA in men but not women. We developed multiple models to evaluate each risk factor's impact on prediction. Each model's ability to discriminate anti-citrullinated protein antibody (ACPA)-positive RA from controls was evaluated in two cohorts: Wellcome Trust Case Control Consortium (WTCCC: 1,516 cases; 1,647 controls); UK RA Genetics Group Consortium (UKRAGG: 2,623 cases; 1,500 controls). HLA and smoking provided the strongest prediction, with good discrimination evidenced by an HLA-smoking model area under the curve (AUC) value of 0.813 in both WTCCC and UKRAGG. SNPs provided minimal prediction (AUC 0.660 WTCCC/0.617 UKRAGG). Whilst high individual risks were identified, with some cases having estimated lifetime risks of 86%, only a minority overall had substantially increased odds for RA. High risks from the HLA model were associated with YORA (P<0.0001); ever-smoking was associated with older onset disease. This latter finding suggests smoking's impact on RA risk manifests later in life. Our modelling demonstrates that combining risk factors provides clinically informative RA prediction; additionally, HLA and smoking status can be used to predict the risk of younger and older onset RA, respectively. PMID:24068971
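
    The core arithmetic of such models, summing log odds ratios over independently drawn risk factors and converting the total to a probability, can be sketched as follows; the odds ratios, carrier frequencies, and baseline risk are illustrative stand-ins, not the paper's estimates.

```python
import numpy as np

def simulate_risk(or_values, freqs, baseline_prob, n_sims=100_000, seed=9):
    """Monte Carlo sketch: draw risk factors independently, add their
    log odds ratios to a baseline logit, and convert to probability.
    All numbers passed in below are illustrative only."""
    rng = np.random.default_rng(seed)
    log_or = np.log(np.asarray(or_values))
    base_logit = np.log(baseline_prob / (1 - baseline_prob))
    carriers = (rng.random((n_sims, len(log_or))) < freqs).astype(float)
    logit = base_logit + carriers @ log_or
    return 1 / (1 + np.exp(-logit))

# e.g. a strong HLA allele (OR 3.5), ever-smoking (OR 1.9), two modest SNPs
risks = simulate_risk([3.5, 1.9, 1.2, 1.1], np.array([0.25, 0.5, 0.3, 0.4]),
                      baseline_prob=0.008)
print(np.percentile(risks, [50, 95, 99.9]))   # median and upper-tail risks
```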

  20. Quantifying Systemic Risk by Solutions of the Mean-Variance Risk Model

    PubMed Central

    Morgenstern, Ingo

    2016-01-01

    The world is still recovering from the financial crisis peaking in September 2008. The triggering event was the bankruptcy of Lehman Brothers. To detect such turmoils, one can investigate the time-dependent behaviour of correlations between assets or indices. These cross-correlations have been connected to the systemic risks within markets by several studies in the aftermath of this crisis. We study 37 different US indices which cover almost all aspects of the US economy and show that monitoring an average investor’s behaviour can be used to quantify times of increased risk. In this paper the overall investing strategy is approximated by the ground-states of the mean-variance model along the efficient frontier bound to real world constraints. Changes in the behaviour of the average investor are utilized as an early warning sign. PMID:27351482

  1. Quantifying Systemic Risk by Solutions of the Mean-Variance Risk Model.

    PubMed

    Jurczyk, Jan; Eckrot, Alexander; Morgenstern, Ingo

    2016-01-01

    The world is still recovering from the financial crisis peaking in September 2008. The triggering event was the bankruptcy of Lehman Brothers. To detect such turmoils, one can investigate the time-dependent behaviour of correlations between assets or indices. These cross-correlations have been connected to the systemic risks within markets by several studies in the aftermath of this crisis. We study 37 different US indices which cover almost all aspects of the US economy and show that monitoring an average investor's behaviour can be used to quantify times of increased risk. In this paper the overall investing strategy is approximated by the ground-states of the mean-variance model along the efficient frontier bound to real world constraints. Changes in the behaviour of the average investor are utilized as an early warning sign. PMID:27351482
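
    A hedged sketch of one "ground state" computation: the minimum-variance portfolio at a target return under full-investment and no-short-selling constraints, solved with scipy. The index statistics are simulated, not the 37 US indices studied; tracking how the weights shift over rolling windows is the average-investor signal used as an early warning.

```python
import numpy as np
from scipy.optimize import minimize

def min_variance_weights(cov, mu, target_return):
    """One point on the efficient frontier: minimize w'Cw subject to
    full investment, a target mean return, and no short selling."""
    n = len(mu)
    cons = [{"type": "eq", "fun": lambda w: w.sum() - 1},
            {"type": "eq", "fun": lambda w: w @ mu - target_return}]
    res = minimize(lambda w: w @ cov @ w, x0=np.ones(n) / n,
                   bounds=[(0, 1)] * n, constraints=cons)
    return res.x

# Hypothetical daily returns for six indices
rng = np.random.default_rng(5)
rets = rng.normal(0.0003, 0.01, size=(500, 6))
w = min_variance_weights(np.cov(rets.T), rets.mean(axis=0), 0.0003)
print(np.round(w, 3))   # the 'average investor' allocation for this window
```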

  2. Risk-Adjusted Models for Adverse Obstetric Outcomes and Variation in Risk Adjusted Outcomes Across Hospitals

    PubMed Central

    Bailit, Jennifer L.; Grobman, William A.; Rice, Madeline Murguia; Spong, Catherine Y.; Wapner, Ronald J.; Varner, Michael W.; Thorp, John M.; Leveno, Kenneth J.; Caritis, Steve N.; Shubert, Phillip J.; Tita, Alan T. N.; Saade, George; Sorokin, Yoram; Rouse, Dwight J.; Blackwell, Sean C.; Tolosa, Jorge E.; Van Dorsten, J. Peter

    2014-01-01

    Objective Regulatory bodies and insurers evaluate hospital quality using obstetrical outcomes; however, meaningful comparisons should take pre-existing patient characteristics into account. Furthermore, if risk-adjusted outcomes are consistent within a hospital, fewer measures and resources would be needed to assess obstetrical quality. Our objective was to establish risk-adjusted models for five obstetric outcomes and assess hospital performance across these outcomes. Study Design A cohort study of 115,502 women and their neonates born in 25 hospitals in the United States between March 2008 and February 2011. Hospitals were ranked according to their unadjusted and risk-adjusted frequency of venous thromboembolism, postpartum hemorrhage, peripartum infection, severe perineal laceration, and a composite neonatal adverse outcome. Correlations between hospital risk-adjusted outcome frequencies were assessed. Results Venous thromboembolism occurred too infrequently (0.03%, 95% CI 0.02%–0.04%) for meaningful assessment. Other outcomes occurred frequently enough for assessment (postpartum hemorrhage 2.29% (95% CI 2.20–2.38), peripartum infection 5.06% (95% CI 4.93–5.19), severe perineal laceration at spontaneous vaginal delivery 2.16% (95% CI 2.06–2.27), neonatal composite 2.73% (95% CI 2.63–2.84)). Although there was high concordance between unadjusted and adjusted hospital rankings, several individual hospitals had an adjusted rank that was substantially different (as much as 12 rank tiers) from their unadjusted rank. None of the correlations between hospital adjusted outcome frequencies was significant. For example, the hospital with the lowest adjusted frequency of peripartum infection had the highest adjusted frequency of severe perineal laceration. Conclusions Evaluations based on a single risk-adjusted outcome cannot be generalized to overall hospital obstetric performance. PMID:23891630

  3. Absolute Income, Relative Income, and Happiness

    ERIC Educational Resources Information Center

    Ball, Richard; Chernova, Kateryna

    2008-01-01

    This paper uses data from the World Values Survey to investigate how an individual's self-reported happiness is related to (i) the level of her income in absolute terms, and (ii) the level of her income relative to other people in her country. The main findings are that (i) both absolute and relative income are positively and significantly…

  4. Investigating Absolute Value: A Real World Application

    ERIC Educational Resources Information Center

    Kidd, Margaret; Pagni, David

    2009-01-01

    Making connections between various representations is important in mathematics. In this article, the authors discuss the numeric, algebraic, and graphical representations of sums of absolute values of linear functions. The initial explanations are accessible to all students who have experience graphing and who understand that absolute value simply…

  5. Preschoolers' Success at Coding Absolute Size Values.

    ERIC Educational Resources Information Center

    Russell, James

    1980-01-01

    Forty-five 2-year-old and forty-five 3-year-old children coded relative and absolute sizes using 1.5-inch, 6-inch, and 18-inch cardboard squares. Results indicate that absolute coding is possible for children of this age. (Author/RH)

  6. Introducing the Mean Absolute Deviation "Effect" Size

    ERIC Educational Resources Information Center

    Gorard, Stephen

    2015-01-01

    This paper revisits the use of effect sizes in the analysis of experimental and similar results, and reminds readers of the relative advantages of the mean absolute deviation as a measure of variation, as opposed to the more complex standard deviation. The mean absolute deviation is easier to use and understand, and more tolerant of extreme…

  7. Monolithically integrated absolute frequency comb laser system

    DOEpatents

    Wanke, Michael C.

    2016-07-12

    Rather than down-convert optical frequencies, a QCL laser system directly generates a THz frequency comb in a compact monolithically integrated chip that can be locked to an absolute frequency without the need of a frequency-comb synthesizer. The monolithic, absolute frequency comb can provide a THz frequency reference and tool for high-resolution broad band spectroscopy.

  8. Estimating the absolute wealth of households

    PubMed Central

    Gerkey, Drew; Hadley, Craig

    2015-01-01

    Abstract Objective To estimate the absolute wealth of households using data from demographic and health surveys. Methods We developed a new metric, the absolute wealth estimate, based on the rank of each surveyed household according to its material assets and the assumed shape of the distribution of wealth among surveyed households. Using data from 156 demographic and health surveys in 66 countries, we calculated absolute wealth estimates for households. We validated the method by comparing the proportion of households defined as poor using our estimates with published World Bank poverty headcounts. We also compared the accuracy of absolute versus relative wealth estimates for the prediction of anthropometric measures. Findings The median absolute wealth estimate of 1 403 186 households was 2056 international dollars per capita (interquartile range: 723–6103). The proportion of poor households based on absolute wealth estimates was strongly correlated with World Bank estimates of populations living on less than 2.00 United States dollars per capita per day (R2 = 0.84). Absolute wealth estimates were better predictors of anthropometric measures than relative wealth indexes. Conclusion Absolute wealth estimates provide new opportunities for comparative research to assess the effects of economic resources on health and human capital, as well as the long-term health consequences of economic change and inequality. PMID:26170506
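
    The rank-to-wealth mapping can be made concrete under one distributional assumption, a lognormal calibrated from a national mean and a Gini coefficient; the published method's exact distributional choice may differ, so treat this as illustrative.

```python
import numpy as np
from scipy.stats import norm

def absolute_wealth_estimates(n_households, mean_pc, gini):
    """Map household asset ranks to quantiles of an assumed lognormal
    wealth distribution. For a lognormal, G = 2*Phi(sigma/sqrt(2)) - 1,
    which lets sigma be recovered from the Gini coefficient."""
    sigma = np.sqrt(2) * norm.ppf((gini + 1) / 2)
    mu = np.log(mean_pc) - sigma**2 / 2          # match the target mean
    q = (np.arange(n_households) + 0.5) / n_households
    return np.exp(mu + sigma * norm.ppf(q))      # wealth by ascending rank

awe = absolute_wealth_estimates(1000, mean_pc=4000, gini=0.45)
print(np.percentile(awe, [25, 50, 75]))          # int'l dollars per capita
```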

  9. Absolute optical metrology : nanometers to kilometers

    NASA Technical Reports Server (NTRS)

    Dubovitsky, Serge; Lay, O. P.; Peters, R. D.; Liebe, C. C.

    2005-01-01

    We provide an overview of developments in the field of high-accuracy absolute optical metrology, with emphasis on space-based applications. Specific work on the Modulation Sideband Technology for Absolute Ranging (MSTAR) sensor is described along with novel applications of the sensor.

  10. The Risk GP Model: the standard model of prediction in medicine.

    PubMed

    Fuller, Jonathan; Flores, Luis J

    2015-12-01

    With the ascent of modern epidemiology in the Twentieth Century came a new standard model of prediction in public health and clinical medicine. In this article, we describe the structure of the model. The standard model uses epidemiological measures-most commonly, risk measures-to predict outcomes (prognosis) and effect sizes (treatment) in a patient population that can then be transformed into probabilities for individual patients. In the first step, a risk measure in a study population is generalized or extrapolated to a target population. In the second step, the risk measure is particularized or transformed to yield probabilistic information relevant to a patient from the target population. Hence, we call the approach the Risk Generalization-Particularization (Risk GP) Model. There are serious problems at both stages, especially with the extent to which the required assumptions will hold and the extent to which we have evidence for the assumptions. Given that there are other models of prediction that use different assumptions, we should not inflexibly commit ourselves to one standard model. Instead, model pluralism should be standard in medical prediction.

  11. Europe's Other Poverty Measures: Absolute Thresholds Underlying Social Assistance

    ERIC Educational Resources Information Center

    Bavier, Richard

    2009-01-01

    The first thing many learn about international poverty measurement is that European nations apply a "relative" poverty threshold and that they also do a better job of reducing poverty. Unlike the European model, the "absolute" U.S. poverty threshold does not increase in real value when the nation's standard of living rises, even though it is…

  12. Interpreting incremental value of markers added to risk prediction models.

    PubMed

    Pencina, Michael J; D'Agostino, Ralph B; Pencina, Karol M; Janssens, A Cecile J W; Greenland, Philip

    2012-09-15

    The discrimination of a risk prediction model measures that model's ability to distinguish between subjects with and without events. The area under the receiver operating characteristic curve (AUC) is a popular measure of discrimination. However, the AUC has recently been criticized for its insensitivity in model comparisons in which the baseline model has performed well. Thus, 2 other measures have been proposed to capture improvement in discrimination for nested models: the integrated discrimination improvement and the continuous net reclassification improvement. In the present study, the authors use mathematical relations and numerical simulations to quantify the improvement in discrimination offered by candidate markers of different strengths as measured by their effect sizes. They demonstrate that the increase in the AUC depends on the strength of the baseline model, which is true to a lesser degree for the integrated discrimination improvement. On the other hand, the continuous net reclassification improvement depends only on the effect size of the candidate variable and its correlation with other predictors. These measures are illustrated using the Framingham model for incident atrial fibrillation. The authors conclude that the increase in the AUC, integrated discrimination improvement, and net reclassification improvement offer complementary information and thus recommend reporting all 3 alongside measures characterizing the performance of the final model.
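
    The three measures can be computed side by side in a few lines; the IDI and continuous NRI formulas below follow their standard definitions, applied to simulated predicted risks from a baseline and an extended model.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

def idi_and_cnri(y, p_old, p_new):
    """IDI: change in mean predicted risk, events minus non-events.
    Continuous NRI: net proportion moving in the right direction."""
    ev, ne = y == 1, y == 0
    idi = (p_new[ev].mean() - p_old[ev].mean()) - \
          (p_new[ne].mean() - p_old[ne].mean())
    cnri = np.mean(np.sign(p_new[ev] - p_old[ev])) - \
           np.mean(np.sign(p_new[ne] - p_old[ne]))
    return idi, cnri

rng = np.random.default_rng(11)
y = rng.binomial(1, 0.2, 2000)
p_old = np.clip(0.20 + 0.10 * y + rng.normal(0, 0.10, 2000), 0.01, 0.99)
p_new = np.clip(p_old + 0.05 * y + rng.normal(0, 0.05, 2000), 0.01, 0.99)

print("delta AUC:", roc_auc_score(y, p_new) - roc_auc_score(y, p_old))
print("IDI, cNRI:", idi_and_cnri(y, p_old, p_new))
```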

  13. Architecture for Integrated Medical Model Dynamic Probabilistic Risk Assessment

    NASA Technical Reports Server (NTRS)

    Jaworske, D. A.; Myers, J. G.; Goodenow, D.; Young, M.; Arellano, J. D.

    2016-01-01

    Probabilistic Risk Assessment (PRA) is a modeling tool used to predict potential outcomes of a complex system based on a statistical understanding of many initiating events. Utilizing a Monte Carlo method, thousands of instances of the model are considered and outcomes are collected. PRA is considered static, utilizing probabilities alone to calculate outcomes. Dynamic Probabilistic Risk Assessment (dPRA) is an advanced concept where modeling predicts the outcomes of a complex system based not only on the probabilities of many initiating events, but also on a progression of dependencies brought about by progressing down a time line. Events are placed in a single time line, adding each event to a queue, as managed by a planner. Progression down the time line is guided by rules, as managed by a scheduler. The recently developed Integrated Medical Model (IMM) summarizes astronaut health as governed by the probabilities of medical events and mitigation strategies. Managing the software architecture process provides a systematic means of creating, documenting, and communicating a software design early in the development process. The software architecture process begins with establishing requirements and the design is then derived from the requirements.
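
    The static-versus-dynamic distinction can be illustrated with a toy Monte Carlo in which an event's consequence depends on what has already happened on the time line. This sketch is illustrative only; it is not the IMM architecture, its planner, or its scheduler, and all probabilities are invented.

        import random

        MISSION_DAYS = 180
        P_ILLNESS_PER_DAY = 0.002   # assumed daily probability of a medical event
        P_EVAC_TREATED = 0.05       # assumed evacuation risk when treated
        P_EVAC_UNTREATED = 0.30     # assumed evacuation risk when supplies are gone
        SUPPLY_UNITS = 3            # medication stock: the source of time-line dependency

        def one_mission(rng):
            supplies = SUPPLY_UNITS
            for _day in range(MISSION_DAYS):
                if rng.random() < P_ILLNESS_PER_DAY:       # initiating event
                    if supplies > 0:
                        supplies -= 1                      # mitigation consumes stock
                        p_evac = P_EVAC_TREATED
                    else:
                        p_evac = P_EVAC_UNTREATED          # outcome depends on history
                    if rng.random() < p_evac:
                        return True                        # mission ends in evacuation
            return False

        rng = random.Random(42)
        runs = 20_000
        evacs = sum(one_mission(rng) for _ in range(runs))
        print(f"estimated evacuation probability: {evacs / runs:.4f}")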

  14. Absolute instability of the Gaussian wake profile

    NASA Technical Reports Server (NTRS)

    Hultgren, Lennart S.; Aggarwal, Arun K.

    1987-01-01

    Linear parallel-flow stability theory has been used to investigate the effect of viscosity on the local absolute instability of a family of wake profiles with a Gaussian velocity distribution. The type of local instability, i.e., convective or absolute, is determined by the location of a branch-point singularity with zero group velocity of the complex dispersion relation for the instability waves. The effects of viscosity were found to be weak for values of the wake Reynolds number, based on the center-line velocity defect and the wake half-width, larger than about 400. Absolute instability occurs only for sufficiently large values of the center-line wake defect. The critical value of this parameter increases with decreasing wake Reynolds number, thereby indicating a shrinking region of absolute instability with decreasing wake Reynolds number. If backflow is not allowed, absolute instability does not occur for wake Reynolds numbers smaller than about 38.
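
    The branch-point criterion can be sketched with a simpler dispersion relation than the Gaussian wake's. The sketch below uses the linearized complex Ginzburg-Landau relation ω(k) = Uk + ck² + i(μ − k²), chosen here because its zero-group-velocity point is available in closed form; it is a stand-in, not the wake dispersion relation studied above.

        # Branch point k0 satisfies d omega/dk = 0; absolute instability requires
        # Im omega(k0) > 0 (the pinching requirement holds for this toy example).

        def classify(U, mu, c=0.0):
            k0 = U / (2 * (1j - c))                  # zero-group-velocity wavenumber
            omega0 = U * k0 + c * k0**2 + 1j * (mu - k0**2)
            if omega0.imag > 0:
                return "absolutely unstable"
            return "convectively unstable" if mu > 0 else "stable"

        for U in (0.5, 1.0, 2.0):                    # advection strength, cf. wake defect
            print(f"U={U}: {classify(U, mu=0.3)}")   # threshold at mu = U**2/4 when c=0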

  15. Climate-based risk models for Fasciola hepatica in Colombia.

    PubMed

    Valencia-López, Natalia; Malone, John B; Carmona, Catalina Gómez; Velásquez, Luz E

    2012-09-01

    A predictive Fasciola hepatica model, based on the growing degree day-water budget (GDD-WB) concept and the known biological requirements of the parasite, was developed within a geographical information system (GIS) in Colombia. Climate-based forecast index (CFI) values were calculated and represented in a national-scale, climate grid (18 x 18 km) using ArcGIS 9.3. A mask overlay was used to exclude unsuitable areas where mean annual temperature exceeded 25 °C, the upper threshold for development and propagation of the F. hepatica life cycle. The model was then validated and further developed by studies limited to one department in northwest Colombia. F. hepatica prevalence data was obtained from a 2008-2010 survey in 10 municipalities of 6,016 dairy cattle at 673 herd study sites, for which global positioning system coordinates were recorded. The CFI map results were compared to F. hepatica environmental risk models for the survey data points that had over 5% prevalence (231 of the 673 sites) at the 1 km2 scale using two independent approaches: (i) a GIS map query based on satellite data parameters including elevation, enhanced vegetation index and land surface temperature day-night difference; and (ii) an ecological niche model (MaxEnt), for which geographic point coordinates of F. hepatica survey farms were used with BioClim data as environmental variables to develop a probability map. The predicted risk pattern of both approaches was similar to that seen in the forecast index grid. The temporal risk, evaluated by the monthly CFIs and a daily GDD-WB forecast software for 2007 and 2008, revealed a major July-August to January transmission period with considerable inter-annual differences.
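
    A schematic version of the climate-based forecast index with the temperature mask applied as above; the thresholds and the water-budget factor below are placeholders, not the calibrated GDD-WB values used in the Colombian model.

        import numpy as np

        rng = np.random.default_rng(1)
        t_mean = rng.uniform(10, 30, size=(5, 5))     # mean annual temperature (deg C)
        rainfall = rng.uniform(0, 200, size=(5, 5))   # monthly rainfall (mm)
        pet = 80.0                                    # assumed potential evapotranspiration (mm)

        T_BASE = 10.0    # assumed development threshold for the parasite life cycle
        T_UPPER = 25.0   # upper threshold; warmer cells are masked out

        gdd = np.clip(t_mean - T_BASE, 0, None) * 30         # degree-days per month
        water_ok = np.clip(rainfall - pet, 0, None) / pet     # crude water-budget factor
        cfi = gdd * np.minimum(water_ok, 1.0)                 # combined forecast index

        cfi[t_mean > T_UPPER] = np.nan                        # mask unsuitable cells
        print(np.round(cfi, 1))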

  16. PSNs: a new model for Medicare risk contracting.

    PubMed

    Prescott, F M

    1997-03-01

    The 1995 Republican House Medicare reform proposal introduced the provider services network (PSN) concept as a new healthcare delivery model for accepting and administering Medicare risk contracts. A PSN operates much like an HMO, but is not subject to the reserve requirements established for HMOs. Providers that want to enter the Medicare risk contracting arena and exercise more control over the delivery of healthcare services may consider forming a PSN. To form a PSN, providers must be sufficiently capitalized to compete with HMOs, create a formal legal organization, and develop a financial plan. To ensure that its goals are met, the PSN must develop a sales promotion plan, enroll members, control and monitor financial resources and clinical outcomes, and implement a management information system. Other crucial capabilities that a PSN must develop include establishing mechanisms for utilization review, membership information maintenance, claims adjudication, physician credentialing, quality assurance, and member grievance procedures. PMID:10165441

  17. GERMcode: A Stochastic Model for Space Radiation Risk Assessment

    NASA Technical Reports Server (NTRS)

    Kim, Myung-Hee Y.; Ponomarev, Artem L.; Cucinotta, Francis A.

    2012-01-01

    A new computer model, the GCR Event-based Risk Model code (GERMcode), was developed to describe biophysical events from high-energy protons and high charge and energy (HZE) particles that have been studied at the NASA Space Radiation Laboratory (NSRL) for the purpose of simulating space radiation biological effects. In the GERMcode, the biophysical description of the passage of HZE particles in tissue and shielding materials is made with a stochastic approach that includes both particle track structure and nuclear interactions. The GERMcode accounts for the major nuclear interaction processes of importance for describing heavy ion beams, including nuclear fragmentation, elastic scattering, and knockout-cascade processes by using the quantum multiple scattering fragmentation (QMSFRG) model. The QMSFRG model has been shown to be in excellent agreement with available experimental data for nuclear fragmentation cross sections. For NSRL applications, the GERMcode evaluates a set of biophysical properties, such as the Poisson distribution of particles or delta-ray hits for a given cellular area and particle dose, the radial dose on tissue, and the frequency distribution of energy deposition in a DNA volume. By utilizing the ProE/Fishbowl ray-tracing analysis, the GERMcode will be used as a bi-directional radiation transport model for future spacecraft shielding analysis in support of Mars mission risk assessments. Recent radiobiological experiments suggest the need for new approaches to risk assessment that include time-dependent biological events due to the signaling times for activation and relaxation of biological processes in cells and tissue. Thus, the tracking of the temporal and spatial distribution of events in tissue is a major goal of the GERMcode in support of the simulation of biological processes important in GCR risk assessments. In order to validate our approach, basic radiobiological responses such as cell survival curves, mutation, chromosomal
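
    One of the listed biophysical quantities, the Poisson distribution of particle hits to a cell nucleus at a given dose, can be sketched from dose, LET, and nuclear area using the standard conversion D[Gy] = 0.16 · LET[keV/µm] · Φ[µm⁻²]. The parameter values below are illustrative, not GERMcode inputs.

        import math

        dose_gy = 1.0        # absorbed dose
        let_kev_um = 100.0   # LET of an HZE ion, keV/um
        area_um2 = 100.0     # assumed cross-sectional area of the cell nucleus

        fluence = dose_gy / (0.16 * let_kev_um)   # particles per um^2
        mean_hits = fluence * area_um2            # Poisson mean for one nucleus

        def poisson_pmf(k, lam):
            return lam**k * math.exp(-lam) / math.factorial(k)

        print(f"mean hits per nucleus: {mean_hits:.2f}")
        for k in range(4):
            print(f"P({k} hits) = {poisson_pmf(k, mean_hits):.3f}")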

  18. The role of models in estimating consequences as part of the risk assessment process.

    PubMed

    Forde-Folle, K; Mitchell, D; Zepeda, C

    2011-08-01

    The degree of disease risk represented by the introduction, spread, or establishment of one or several diseases through the importation of animals and animal products is assessed by importing countries through an analysis of risk. The components of a risk analysis include hazard identification, risk assessment, risk management, and risk communication. A risk assessment starts with identification of the hazard(s) and then continues with four interrelated steps: release assessment, exposure assessment, consequence assessment, and risk estimation. Risk assessments may be either qualitative or quantitative. This paper describes how, through the integration of epidemiological and economic models, the potential adverse biological and economic consequences of exposure can be quantified.

  19. The Leakage Risk Monetization Model for Geologic CO2 Storage.

    PubMed

    Bielicki, Jeffrey M; Pollak, Melisa F; Deng, Hang; Wilson, Elizabeth J; Fitts, Jeffrey P; Peters, Catherine A

    2016-05-17

    We developed the Leakage Risk Monetization Model (LRiMM) which integrates simulation of CO2 leakage from geologic CO2 storage reservoirs with estimation of monetized leakage risk (MLR). Using geospatial data, LRiMM quantifies financial responsibility if leaked CO2 or brine interferes with subsurface resources, and estimates the MLR reduction achievable by remediating leaks. We demonstrate LRiMM with simulations of 30 years of injection into the Mt. Simon sandstone at two locations that differ primarily in their proximity to existing wells that could be leakage pathways. The peak MLR for the site nearest the leakage pathways ($7.5/tCO2) was 190x larger than for the farther injection site, illustrating how careful siting would minimize MLR in heavily used sedimentary basins. Our MLR projections are at least an order of magnitude below overall CO2 storage costs at well-sited locations, but some stakeholders may incur substantial costs. Reliable methods to detect and remediate leaks could further minimize MLR. For both sites, the risk of CO2 migrating to potable aquifers or reaching the atmosphere was negligible due to secondary trapping, whereby multiple impervious sedimentary layers trap CO2 that has leaked through the primary seal of the storage formation. PMID:27052112

  1. Documentation of the Ecological Risk Assessment Computer Model ECORSK.5

    SciTech Connect

    Anthony F. Gallegos; Gilbert J. Gonzales

    1999-06-01

    The FORTRAN 77 ecological risk computer model, ECORSK.5, has been used to estimate the potential toxicity of surficial deposits of radioactive and non-radioactive contaminants to several threatened and endangered (T and E) species at the Los Alamos National Laboratory (LANL). These analyses to date include preliminary toxicity estimates for the Mexican spotted owl, the American peregrine falcon, the bald eagle, and the southwestern willow flycatcher. This work has been performed as required for the Record of Decision for the construction of the Dual Axis Radiographic Hydrodynamic Test (DARHT) Facility at LANL as part of the Environmental Impact Statement. The model is dependent on the use of the geographic information system and associated software, ARC/INFO, and has been used in conjunction with LANL's Facility for Information Management and Display (FIMAD) contaminant database. The integration of FIMAD data and ARC/INFO using ECORSK.5 allows the generation of spatial information from a gridded area of potential exposure called an Ecological Exposure Unit. ECORSK.5 was used to simulate exposures using a modified Environmental Protection Agency Quotient Method. The model can handle a large number of contaminants within the home range of T and E species. This integration results in the production of hazard indices which, when compared to risk evaluation criteria, estimate the potential for impact from consumption of contaminants in food and ingestion of soil. The assessment is considered a Tier-2 type of analysis. This report summarizes and documents the ECORSK.5 code, the mathematical models used in the development of ECORSK.5, and the input and other requirements for its operation. Other auxiliary FORTRAN 77 codes used for processing and graphing output from ECORSK.5 are also discussed. The reader may refer to reports cited in the introduction to obtain greater detail on past applications of ECORSK.5 and assumptions used in deriving model parameters.
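
    A minimal hazard-quotient and hazard-index calculation in the spirit of the modified EPA Quotient Method named above. The exposure parameters and toxicity reference values are invented; ECORSK.5 itself additionally handles gridded exposure units, home ranges, and many more contaminants.

        contaminants = {
            # name: (food conc mg/kg, soil conc mg/kg, toxicity reference value mg/kg-day)
            "Cd": (0.8, 12.0, 1.0),
            "Pb": (2.0, 50.0, 11.3),
        }
        FOOD_INTAKE = 0.05   # kg/day, assumed for a small receptor
        SOIL_INTAKE = 0.005  # kg/day, incidental soil ingestion
        BODY_MASS = 1.2      # kg

        hazard_index = 0.0
        for name, (c_food, c_soil, trv) in contaminants.items():
            dose = (c_food * FOOD_INTAKE + c_soil * SOIL_INTAKE) / BODY_MASS  # mg/kg-day
            hq = dose / trv                       # hazard quotient per contaminant
            hazard_index += hq                    # summed into the hazard index
            print(f"{name}: HQ = {hq:.3f}")
        verdict = "potential impact" if hazard_index > 1 else "below benchmark"
        print(f"Hazard index = {hazard_index:.3f} ({verdict})")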

  2. A Model to Predict the Risk of Keratinocyte Carcinomas.

    PubMed

    Whiteman, David C; Thompson, Bridie S; Thrift, Aaron P; Hughes, Maria-Celia; Muranushi, Chiho; Neale, Rachel E; Green, Adele C; Olsen, Catherine M

    2016-06-01

    Basal cell and squamous cell carcinomas of the skin are the commonest cancers in humans, yet no validated tools exist to estimate future risks of developing keratinocyte carcinomas. To develop a prediction tool, we used baseline data from a prospective cohort study (n = 38,726) in Queensland, Australia, and used data linkage to capture all surgically excised keratinocyte carcinomas arising within the cohort. Predictive factors were identified through stepwise logistic regression models. In secondary analyses, we derived separate models within strata of prior skin cancer history, age, and sex. The primary model included terms for 10 items. Factors with the strongest effects were >20 prior skin cancers excised (odds ratio 8.57, 95% confidence interval [95% CI] 6.73-10.91), >50 skin lesions destroyed (odds ratio 3.37, 95% CI 2.85-3.99), age ≥ 70 years (odds ratio 3.47, 95% CI 2.53-4.77), and fair skin color (odds ratio 1.75, 95% CI 1.42-2.15). Discrimination in the validation dataset was high (area under the receiver operator characteristic curve 0.80, 95% CI 0.79-0.81) and the model appeared well calibrated. Among those reporting no prior history of skin cancer, a similar model with 10 factors predicted keratinocyte carcinoma events with reasonable discrimination (area under the receiver operator characteristic curve 0.72, 95% CI 0.70-0.75). Algorithms using self-reported patient data have high accuracy for predicting risks of keratinocyte carcinomas.
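
    A sketch of how a logistic model of this kind turns risk factors into an individual probability, using the odds ratios reported above. The intercept (baseline odds) is invented here, so the output is illustrative only.

        import math

        log_odds = math.log(0.05 / 0.95)   # assumed baseline odds, not from the paper
        factors = {
            # factor: (present for this patient?, reported odds ratio)
            ">20 prior skin cancers excised": (True, 8.57),
            ">50 skin lesions destroyed":     (False, 3.37),
            "age >= 70 years":                (True, 3.47),
            "fair skin color":                (True, 1.75),
        }
        for name, (present, odds_ratio) in factors.items():
            if present:
                log_odds += math.log(odds_ratio)   # add each factor's log odds ratio

        probability = 1 / (1 + math.exp(-log_odds))
        print(f"Predicted probability: {probability:.2f}")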

  3. Bayesian spatial modeling of disease risk in relation to multivariate environmental risk fields.

    PubMed

    Kim, Ji-in; Lawson, Andrew B; McDermott, Suzanne; Aelion, C Marjorie

    2010-01-15

    The relationship between exposure to environmental chemicals during pregnancy and early childhood development is an important issue that has a spatial risk component. In this context, we have examined mental retardation and developmental delay (MRDD) outcome measures for children in a Medicaid population in South Carolina and sampled measures of soil chemistry (e.g. As, Hg, etc.) on a network of sites that are misaligned to the outcome residential addresses during pregnancy. The true chemical concentration at the residential addresses is not observed directly and must be interpolated from soil samples. In this study, we have developed a Bayesian joint model that interpolates soil chemical fields and estimates the associated MRDD risk simultaneously. Having multiple spatial fields to interpolate, we have considered a low-rank Kriging method for the interpolation that requires less computation than the Bayesian Kriging. We performed a sensitivity analysis for a bivariate smoothing, changing the number of knots and the smoothing parameter. These analyses show that a low-rank Kriging method can be used as an alternative to a full-rank Kriging, reducing the computational burden. However, the number of knots for the low-rank Kriging model needs to be selected with caution as a bivariate surface estimation can be sensitive to the choice of the number of knots.

  4. Nankai-Tokai subduction hazard for catastrophe risk modeling

    NASA Astrophysics Data System (ADS)

    Spurr, D. D.

    2010-12-01

    The historical record of Nankai subduction zone earthquakes includes nine event sequences over the last 1300 years. Typical characteristic behaviour is evident, with segments rupturing either co-seismically or as two large earthquakes less than 3 yrs apart (active phase), followed by periods of low seismicity lasting 90-150 yrs or more. Despite the long historical record, the recurrence behaviour and consequent seismic hazard remain uncertain and controversial. In 2005 the Headquarters for Earthquake Research Promotion (HERP) published models for hundreds of faults as part of an official Japanese seismic hazard map. The HERP models have been widely adopted in part or full both within Japan and by the main international catastrophe risk model companies. The time-dependent recurrence modelling we adopt for the Nankai faults departs considerably from HERP in three main areas:

    ■ A "Linked System" (LS) source model is used to simulate the strong correlation between segment ruptures evident in the historical record, whereas the HERP recurrence estimates assume the Nankai, Tonankai and Tokai segments rupture independently. The LS component models all historical events with a common rupture recurrence cycle for the three segments. System rupture probabilities are calculated assuming BPT behaviour, with parameter uncertainties assessed from the full 1300 yr historical record (see the sketch after this list).

    ■ An independent, "Tokai Only" (TO) rupture source is used specifically to model potential "Tokai only" earthquakes. There are widely diverging views on the possibility of this segment rupturing independently. Although all historical Tokai ruptures appear to have been composite Tonankai-Tokai earthquakes, the available data do not preclude the possibility of future "Tokai only" events. The HERP model also includes "Tokai only" earthquakes but the recurrence parameters are based on historical composite Tonankai-Tokai ruptures and do not appear to recognise the complex tectonic
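
    The BPT conditional probability calculation mentioned in the first area above can be sketched directly. The mean recurrence interval, aperiodicity, and elapsed time below are placeholders, not the values adopted for the Nankai segments.

        import math

        def norm_cdf(x):
            return 0.5 * (1 + math.erf(x / math.sqrt(2)))

        def bpt_cdf(t, mean, alpha):
            # CDF of the Brownian passage time (inverse Gaussian) model with
            # mean recurrence `mean` and aperiodicity `alpha`
            if t <= 0:
                return 0.0
            u = math.sqrt(mean / t) / alpha
            return (norm_cdf(u * (t / mean - 1))
                    + math.exp(2 / alpha**2) * norm_cdf(-u * (t / mean + 1)))

        def conditional_prob(elapsed, window, mean, alpha):
            # P(rupture within `window` years | no rupture in the last `elapsed` years)
            survive = 1 - bpt_cdf(elapsed, mean, alpha)
            return (bpt_cdf(elapsed + window, mean, alpha)
                    - bpt_cdf(elapsed, mean, alpha)) / survive

        # e.g. 70 years elapsed, 30-year forecast window (illustrative parameters)
        print(f"P(rupture in next 30 yr) = {conditional_prob(70, 30, mean=110, alpha=0.25):.2f}")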

  5. Applications of physiologic pharmacokinetic modeling in carcinogenic risk assessment.

    PubMed Central

    Krewski, D; Withey, J R; Ku, L F; Andersen, M E

    1994-01-01

    The use of physiologically based pharmacokinetic (PBPK) models has been proposed as a means of estimating the dose of the reactive metabolites of carcinogenic xenobiotics reaching target tissues, thereby affording an opportunity to base estimates of potential cancer risk on tissue dose rather than external levels of exposure. In this article, we demonstrate how a PBPK model can be constructed by specifying mass-balance equations for each physiological compartment included in the model. In general, this leads to a system of nonlinear ordinary differential equations with which to characterize the compartment system. These equations then can be solved numerically to determine the concentration of metabolites in each compartment as functions of time. In the special case of a linear pharmacokinetic system, we present simple closed-form expressions for the area under the concentration-time curves (AUC) in individual tissue compartments. A general relationship between the AUC in blood and other tissue compartments is also established. These results are of use in identifying those parameters in the models that characterize the integrated tissue dose, and which should therefore be the primary focus of sensitivity analyses. Applications of PBPK modeling for purposes of tissue dosimetry are reviewed, including models developed for methylene chloride, ethylene oxide, 1,4-dioxane, 1-nitropyrene, as well as polychlorinated biphenyls, dioxins, and furans. Special considerations in PBPK modeling related to aging, topical absorption, pregnancy, and mixed exposures are discussed. The linkage between pharmacokinetic models used for tissue dosimetry and pharmacodynamic models for neoplastic transformation of stem cells in the target tissue is explored. PMID:7737040
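
    A minimal flow-limited, two-compartment PBPK sketch showing the mass-balance ODEs and a numerically integrated tissue AUC. All parameters are invented for illustration rather than taken from the chemicals reviewed above.

        import numpy as np
        from scipy.integrate import solve_ivp, trapezoid

        Q_LIVER = 90.0   # blood flow to liver (L/h), assumed
        V_BLOOD = 5.0    # blood volume (L)
        V_LIVER = 1.8    # liver volume (L)
        P_LIVER = 4.0    # tissue:blood partition coefficient
        CL_MET = 20.0    # hepatic metabolic clearance (L/h), linear case

        def rhs(t, y):
            # mass balance for each compartment, flow-limited uptake
            c_blood, c_liver = y
            c_venous = c_liver / P_LIVER
            dc_blood = Q_LIVER * (c_venous - c_blood) / V_BLOOD
            dc_liver = (Q_LIVER * (c_blood - c_venous) - CL_MET * c_venous) / V_LIVER
            return [dc_blood, dc_liver]

        sol = solve_ivp(rhs, (0, 24), y0=[10.0, 0.0], dense_output=True, max_step=0.1)
        t = np.linspace(0, 24, 2000)
        c = sol.sol(t)
        auc_blood, auc_liver = trapezoid(c, t, axis=1)   # integrated tissue dose
        print(f"AUC blood = {auc_blood:.1f}, AUC liver = {auc_liver:.1f} (mg/L x h)")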

  6. Development of good modelling practice for physiologically based pharmacokinetic models for use in risk assessment: The first steps

    EPA Science Inventory

    The increasing use of tissue dosimetry estimated using pharmacokinetic models in chemical risk assessments in multiple countries necessitates the development of internationally recognized good modelling practices. These practices would facilitate sharing of models and model eva...

  7. Applications of the International Space Station Probabilistic Risk Assessment Model

    NASA Technical Reports Server (NTRS)

    Grant, Warren; Lutomski, Michael G.

    2011-01-01

    Recently the International Space Station (ISS) has incorporated more Probabilistic Risk Assessments (PRAs) in the decision-making process for significant issues. Future PRAs will have a major impact on ISS and future spacecraft development and operations. These PRAs will have their foundation in the current complete ISS PRA model and the current PRA trade studies being analyzed as requested by ISS Program stakeholders. ISS PRAs have recently helped in determining reliability requirements for future NASA and commercial spacecraft, making crew rescue decisions, and setting operational requirements for ISS orbital orientation, planning extravehicular activities (EVAs), and robotic operations. This paper will describe some applications of the ISS PRA model and how they impacted the final decision, and will discuss future analysis topics such as life extension and the requirements of new commercial vehicles visiting ISS.

  8. Modeling financial disaster risk management in developing countries

    NASA Astrophysics Data System (ADS)

    Mechler, R.; Hochrainer, S.; Pflug, G.; Linnerooth-Bayer, J.

    2005-12-01

    The public sector plays a major role in reducing the long-term economic repercussions of disasters by repairing damaged infrastructure and providing financial assistance to households and businesses. If critical infrastructure is not repaired in a timely manner, there can be serious effects on the economy and the livelihoods of the population. The repair of public infrastructure, however, can be a significant drain on public budgets especially in developing and transition countries. Developing country governments frequently lack the liquidity, even including international aid and loans, to fully repair damaged critical public infrastructure or provide sufficient support to households and businesses for their recovery. The earthquake in Gujarat, and other recent cases of government post-disaster liquidity crises, have sounded an alarm, prompting financial development organizations, such as the World Bank, among others, to call for greater attention to reducing financial vulnerability and increasing the resilience of the public sector. This talk reports on a model designed to illustrate the tradeoffs and choices a developing country must make in financially managing the economic risks due to natural disasters. Budgetary resources allocated to pre-disaster risk management strategies, such as loss mitigation measures, a catastrophe reserve fund, insurance and contingent credit arrangements for public assets, reduce the probability of financing gaps - the inability of governments to meet their full obligations in providing relief to private victims and restoring public infrastructure - or prevent the deterioration of the ability to undertake additional borrowing without incurring a debt crisis. The model, which is equipped with a graphical interface, can be a helpful tool for building capacity of policy makers for developing and assessing public financing strategies for disaster risk by indicating the respective costs and consequences of financing alternatives.
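
    A stylized version of the financing-gap calculation: Monte Carlo disaster losses are compared against the government's post-disaster resources from a reserve fund, insurance, and contingent credit. All quantities and the loss distribution are invented; the actual model's instruments are richer.

        import random

        YEARS = 10_000
        RESERVE_FUND = 50.0        # available budget reserves (monetary units)
        INSURANCE_LIMIT = 100.0    # maximum insurance payout above a deductible
        DEDUCTIBLE = 30.0
        CONTINGENT_CREDIT = 80.0

        rng = random.Random(7)
        gaps = 0
        for _ in range(YEARS):
            # heavy-tailed annual loss to public infrastructure (Pareto-like)
            loss = 20.0 * ((1.0 - rng.random()) ** -0.7 - 1.0)
            payout = min(max(loss - DEDUCTIBLE, 0.0), INSURANCE_LIMIT)
            resources = RESERVE_FUND + payout + CONTINGENT_CREDIT
            if loss > resources:
                gaps += 1      # government cannot meet its full obligations
        print(f"Probability of a financing gap: {gaps / YEARS:.3%}")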

  9. Modeling human risk: Cell & molecular biology in context

    SciTech Connect

    1997-06-01

    It is anticipated that early in the next century manned missions into outer space will occur, with a mission to Mars scheduled between 2015 and 2020. However, before such missions can be undertaken, a realistic estimation of the potential risks to the flight crews is required. One of the uncertainties remaining in this risk estimation is that posed by the effects of exposure to the radiation environment of outer space. Although the composition of this environment is fairly well understood, the biological effects arising from exposure to it are not. The reasons for this are three-fold: (1) A small but highly significant component of the radiation spectrum in outer space consists of highly charged, high energy (HZE) particles which are not routinely experienced on earth, and for which there are insufficient data on biological effects; (2) Most studies on the biological effects of radiation to date have been high-dose, high dose-rate, whereas in space, with the exception of solar particle events, radiation exposures will be low-dose, low dose-rate; (3) Although it has been established that the virtual absence of gravity in space has a profound effect on human physiology, it is not clear whether these effects will act synergistically with those of radiation exposure. A select panel will evaluate the utility of experiments and models for accurately predicting the risks associated with exposure to HZE particles. Topics of research include cellular and tissue response, health effects associated with radiation damage, model animal systems, and critical markers of radiation response.

  10. Modified social ecological model: a tool to guide the assessment of the risks and risk contexts of HIV epidemics

    PubMed Central

    2013-01-01

    Background: Social and structural factors are now well accepted as determinants of HIV vulnerabilities. These factors are representative of social, economic, organizational and political inequities. Associated with an improved understanding of multiple levels of HIV risk has been the recognition of the need to implement multi-level HIV prevention strategies. Prevention sciences research and programming aiming to decrease HIV incidence requires epidemiologic studies to collect data on multiple levels of risk to inform combination HIV prevention packages.

    Discussion: Proximal individual-level risks, such as sharing injection devices and unprotected penile-vaginal or penile-anal sex, are necessary in mediating HIV acquisition and transmission. However, higher order social and structural-level risks can facilitate or reduce HIV transmission on population levels. Data characterizing these risks is often far more actionable than characterizing individual-level risks. We propose a modified social ecological model (MSEM) to help visualize multi-level domains of HIV infection risks and guide the development of epidemiologic HIV studies. Such a model may inform research in epidemiology and prevention sciences, particularly for key populations including men who have sex with men (MSM), people who inject drugs (PID), and sex workers. The MSEM builds on existing frameworks by examining multi-level risk contexts for HIV infection and situating individual HIV infection risks within wider network, community, and public policy contexts as well as epidemic stage. The utility of the MSEM is demonstrated with case studies of HIV risk among PID and MSM.

    Summary: The MSEM is a flexible model for guiding epidemiologic studies among key populations at risk for HIV in diverse sociocultural contexts. Successful HIV prevention strategies for key populations require effective integration of evidence-based biomedical, behavioral, and structural interventions. While the focus of epidemiologic

  11. Bayesian approach for flexible modeling of semicompeting risks data.

    PubMed

    Han, Baoguang; Yu, Menggang; Dignam, James J; Rathouz, Paul J

    2014-12-20

    Semicompeting risks data arise when two types of events, non-terminal and terminal, are observed. When the terminal event occurs first, it censors the non-terminal event, but not vice versa. To account for possible dependent censoring of the non-terminal event by the terminal event and to improve prediction of the terminal event using the non-terminal event information, it is crucial to model their association properly. Motivated by a breast cancer clinical trial data analysis, we extend the well-known illness-death models to allow flexible random effects to capture heterogeneous association structures in the data. Our extension also represents a generalization of the popular shared frailty models that usually assume that the non-terminal event does not affect the hazards of the terminal event beyond a frailty term. We propose a unified Bayesian modeling approach that can utilize existing software packages for both model fitting and individual-specific event prediction. The approach is demonstrated via both simulation studies and a breast cancer data set analysis. PMID:25274445
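
    The data structure is easy to simulate under a shared gamma frailty, the baseline case the paper generalizes: the terminal event censors the non-terminal one but not vice versa. The rates and frailty variance below are illustrative choices, not estimates from the breast cancer trial.

        import numpy as np

        rng = np.random.default_rng(3)
        n = 8
        theta = 0.5                           # frailty variance
        z = rng.gamma(1 / theta, theta, n)    # shared gamma frailty, mean 1
        t_nonterminal = rng.exponential(1 / (0.10 * z))   # e.g. disease progression
        t_terminal = rng.exponential(1 / (0.05 * z))      # e.g. death
        t_censor = rng.uniform(5, 15, n)                  # administrative censoring

        for i in range(n):
            t_end = min(t_terminal[i], t_censor[i])
            status = "death" if t_terminal[i] <= t_censor[i] else "censored"
            # progression observed only if it precedes death/censoring
            prog = f"{t_nonterminal[i]:.1f}" if t_nonterminal[i] < t_end else "none"
            print(f"subject {i}: progression={prog}, follow-up end={t_end:.1f} ({status})")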

  12. Model-based risk analysis of coupled process steps.

    PubMed

    Westerberg, Karin; Broberg-Hansen, Ernst; Sejergaard, Lars; Nilsson, Bernt

    2013-09-01

    A section of a biopharmaceutical manufacturing process involving the enzymatic coupling of a polymer to a therapeutic protein was characterized with regard to process parameter sensitivity and design space. To minimize the formation of unwanted by-products in the enzymatic reaction, the substrate was added in small amounts and unreacted protein was separated using size-exclusion chromatography (SEC) and recycled to the reactor. The quality of the final recovered product was thus a result of the conditions in both the reactor and the SEC, and a design space had to be established for both processes together. This was achieved by developing mechanistic models of the reaction and SEC steps, establishing the causal links between process conditions and product quality. Model analysis was used to complement the qualitative risk assessment, and design space and critical process parameters were identified. The simulation results gave an experimental plan focusing on the "worst-case regions" in terms of product quality and yield. In this way, the experiments could be used to verify both the suggested process and the model results. This work demonstrates the necessary steps of model-assisted process analysis, from model development through experimental verification.

  13. Modeling urban flood risk territories for Riga city

    NASA Astrophysics Data System (ADS)

    Piliksere, A.; Sennikovs, J.; Virbulis, J.; Bethers, U.; Bethers, P.; Valainis, A.

    2012-04-01

    Riga, the capital of Latvia, is located on the River Daugava at the Gulf of Riga. The main flooding risks for Riga city are: (1) storm-caused water setup in the southern part of the Gulf of Riga (storm event), (2) water level increases caused by Daugava River discharge maxima (spring snow melting event), and (3) strong rainfall or rapid snow melting in densely populated urban areas. The first two flooding factors were discussed previously (Piliksere et al, 2011). The aims of the study were (1) the identification of flood risk situations in densely populated areas, (2) the quantification of flooding scenarios caused by rain and snow melting events of different return periods nowadays, in the near future (2021-2050) and the far future (2071-2100), taking into account the projections of climate change, (3) the estimation of groundwater levels for Riga city, (4) the building and calibration of a hydrological mathematical model based on SWMM (EPA, 2004) for the domain potentially vulnerable to rain and snow melt flooding events, (5) the calculation of rain and snow melting flood events with different return periods, and (6) the mapping of potentially flooded areas on a fine grid. The time series of short-term precipitation events during the warm period of the year (i.e., rain events) was analyzed for a 35-year period. Annual maxima of precipitation intensity for events of different duration (5 min; 15 min; 1 h; 3 h; 6 h; 12 h; 1 day; 2 days; 4 days; 10 days) were calculated. The time series of long-term simultaneous precipitation data and observations of the reduction in snow cover thickness were analyzed for a 27-year period. Snow thawing periods were detected, and maxima of snow melting intensity for events of different duration (1 day; 2 days; 4 days; 7 days; 10 days) were calculated. According to the occurrence probability, six scenarios for each event for nowadays, the near and the far future, with return periods of once in 5, 10, 20, 50, 100 and 200 years, were constructed based on
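
    The return-period scenarios rest on fitting annual maxima. A minimal sketch with a method-of-moments Gumbel fit on synthetic data follows; the study itself used 35 years of observed precipitation and its own scenario construction.

        import numpy as np

        rng = np.random.default_rng(11)
        annual_max = 20 + 8 * rng.gumbel(size=35)   # synthetic 1 h rainfall maxima (mm)

        # Method-of-moments Gumbel fit: scale from the std, location from the mean
        beta = np.sqrt(6) * annual_max.std(ddof=1) / np.pi
        mu = annual_max.mean() - 0.5772 * beta      # 0.5772: Euler-Mascheroni constant

        for T in (5, 10, 20, 50, 100, 200):
            level = mu - beta * np.log(-np.log(1 - 1 / T))   # Gumbel return level
            print(f"1-in-{T}-year event: {level:.1f} mm")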

  14. Foraging and predation risk for larval cisco (Coregonus artedi) in Lake Superior: a modelling synthesis of empirical survey data

    USGS Publications Warehouse

    Myers, Jared T.; Yule, Daniel L.; Jones, Michael L.; Quinlan, Henry R.; Berglund, Eric K.

    2014-01-01

    The relative importance of predation and food availability as contributors to larval cisco (Coregonus artedi) mortality in Lake Superior was investigated using a visual foraging model to evaluate potential predation pressure by rainbow smelt (Osmerus mordax) and a bioenergetic model to evaluate potential starvation risk. The models were informed by observations of rainbow smelt, larval cisco, and zooplankton abundance at three Lake Superior locations during the period of spring larval cisco emergence and surface-oriented foraging. Predation risk was highest at Black Bay, ON, where average rainbow smelt densities in the uppermost 10 m of the water column were >1000 ha⁻¹. Turbid conditions at the Twin Ports, WI-MN, affected larval cisco predation risk because rainbow smelt remained suspended in the upper water column during daylight, placing them alongside larval cisco during both day and night hours. Predation risk was low at Cornucopia, WI, owing to low smelt densities (<400 ha⁻¹) and deep light penetration, which kept rainbow smelt near the lakebed and far from larvae during daylight. In situ zooplankton density estimates were low compared to the values used to develop the larval coregonid bioenergetics model, leading to predictions of negative growth rates for 10 mm larvae at all three locations. The model predicted that 15 mm larvae were capable of attaining positive growth at Cornucopia and the Twin Ports where low water temperatures (2–6 °C) decreased their metabolic costs. Larval prey resources were highest at Black Bay but warmer water temperatures there offset the benefit of increased prey availability. A sensitivity analysis performed on the rainbow smelt visual foraging model showed that it was relatively insensitive, while the coregonid bioenergetics model showed that the absolute growth rate predictions were highly sensitive to input parameters (i.e., 20% parameter perturbation led to order of magnitude differences in model estimates). Our

  15. Evaluating Geographically Weighted Regression Models for Environmental Chemical Risk Analysis

    PubMed Central

    Czarnota, Jenna; Wheeler, David C; Gennings, Chris

    2015-01-01

    In the evaluation of cancer risk related to environmental chemical exposures, the effect of many correlated chemicals on disease is often of interest. The relationship between correlated environmental chemicals and health effects is not always constant across a study area, as exposure levels may change spatially due to various environmental factors. Geographically weighted regression (GWR) has been proposed to model spatially varying effects. However, concerns about collinearity effects, including regression coefficient sign reversal (i.e., the reversal paradox), may limit the applicability of GWR for environmental chemical risk analysis. A penalized version of GWR, the geographically weighted lasso (GWL), has been proposed to remediate the collinearity effects in GWR models. Our focus in this study was on assessing through a simulation study the ability of GWR and GWL to correctly identify spatially varying chemical effects for a mixture of correlated chemicals within a study area. Our results showed that GWR suffered from the reversal paradox, while GWL overpenalized the effects for the chemical most strongly related to the outcome. PMID:25983546
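
    The core of GWR is a weighted least-squares fit at each target location with distance-decay kernel weights. The sketch below uses synthetic data with a west-to-east trend in the true coefficient; the Gaussian kernel and bandwidth are arbitrary illustrative choices.

        import numpy as np

        rng = np.random.default_rng(5)
        n = 200
        coords = rng.uniform(0, 10, (n, 2))            # sample locations
        x = rng.normal(size=n)                         # exposure variable
        beta_true = 0.5 + 0.1 * coords[:, 0]           # effect varies west to east
        y = beta_true * x + rng.normal(scale=0.5, size=n)

        def gwr_coef(target, bandwidth=2.0):
            d = np.linalg.norm(coords - target, axis=1)
            w = np.exp(-(d / bandwidth) ** 2)          # Gaussian kernel weights
            X = np.column_stack([np.ones(n), x])
            XtW = X.T * w                              # weighted design matrix
            return np.linalg.solve(XtW @ X, XtW @ y)   # [intercept, local slope]

        for easting in (1.0, 5.0, 9.0):
            b = gwr_coef(np.array([easting, 5.0]))
            print(f"easting={easting}: local slope = {b[1]:.2f}"
                  f" (true ~ {0.5 + 0.1 * easting:.2f})")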

  16. Evaluation of the Absolute Regional Temperature Potential

    NASA Technical Reports Server (NTRS)

    Shindell, D. T.

    2012-01-01

    The Absolute Regional Temperature Potential (ARTP) is one of the few climate metrics that provides estimates of impacts at a sub-global scale. The ARTP presented here gives the time-dependent temperature response in four latitude bands (90-28°S, 28°S-28°N, 28-60°N and 60-90°N) as a function of emissions based on the forcing in those bands caused by the emissions. It is based on a large set of simulations performed with a single atmosphere-ocean climate model to derive regional forcing/response relationships. Here I evaluate the robustness of those relationships using the forcing/response portion of the ARTP to estimate regional temperature responses to the historic aerosol forcing in three independent climate models. These ARTP results are in good accord with the actual responses in those models. Nearly all ARTP estimates fall within ±20% of the actual responses, though there are some exceptions for 90-28°S and the Arctic, and in the latter the ARTP may vary with forcing agent. However, for the tropics and the Northern Hemisphere mid-latitudes in particular, the ±20% range appears to be roughly consistent with the 95% confidence interval. Land areas within these two bands respond 39-45% and 9-39% more than the latitude band as a whole. The ARTP, presented here in a slightly revised form, thus appears to provide a relatively robust estimate for the responses of large-scale latitude bands and land areas within those bands to inhomogeneous radiative forcing and thus potentially to emissions as well. Hence this metric could allow rapid evaluation of the effects of emissions policies at a finer scale than global metrics without requiring use of a full climate model.
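
    Functionally, the forcing/response portion of the ARTP reduces to a band-by-band forcing-to-response matrix. The sketch below shows that arithmetic with a made-up response matrix and forcing vector, not Shindell's published coefficients.

        import numpy as np

        bands = ["90-28S", "28S-28N", "28-60N", "60-90N"]
        forcing = np.array([0.1, 0.3, 0.4, 0.2])    # hypothetical band forcings (W/m2)

        # response[i, j]: K of warming in band i per W/m2 of forcing in band j
        # (placeholder values; note the amplified high-latitude response)
        response = np.array([
            [0.5, 0.2, 0.1, 0.0],
            [0.2, 0.6, 0.2, 0.1],
            [0.1, 0.3, 0.7, 0.3],
            [0.1, 0.3, 0.6, 0.9],
        ])

        delta_t = response @ forcing
        for band, dt in zip(bands, delta_t):
            print(f"{band}: {dt:+.2f} K")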

  17. GNSS Absolute Antenna Calibration at the National Geodetic Survey

    NASA Astrophysics Data System (ADS)

    Mader, G. L.; Bilich, A. L.; Geoghegan, C.

    2011-12-01

    Geodetic GNSS applications routinely demand millimeter precision and extremely high levels of accuracy. To achieve these accuracies, measurement and instrument biases at the centimeter to millimeter level must be understood. One of these biases is the antenna phase center, the apparent point of signal reception for a GNSS antenna. It has been well established that phase center patterns differ between antenna models and manufacturers; additional research suggests that the addition of a radome or the choice of antenna mount can significantly alter those a priori phase center patterns. For the more demanding GNSS positioning applications and especially in cases of mixed-antenna networks, it is all the more important to know antenna phase center variations as a function of both elevation and azimuth in the antenna reference frame and incorporate these models into analysis software. To help meet the needs of the high-precision GNSS community, the National Geodetic Survey (NGS) now operates an absolute antenna calibration facility. Located in Corbin, Virginia, this facility uses field measurements and actual GNSS satellite signals to quantitatively determine the carrier phase advance/delay introduced by the antenna element. The NGS facility was built to serve traditional NGS constituents such as the surveying and geodesy communities, however calibration services are open and available to all GNSS users as the calibration schedule permits. All phase center patterns computed by this facility will be publicly available and disseminated in both the ANTEX and NGS formats. We describe the NGS calibration facility, and discuss the observation models and strategy currently used to generate NGS absolute calibrations. We demonstrate that NGS absolute phase center variation (PCV) patterns are consistent with published values determined by other absolute antenna calibration facilities, and compare absolute calibrations to the traditional NGS relative calibrations.

  18. A New Gimmick for Assigning Absolute Configuration.

    ERIC Educational Resources Information Center

    Ayorinde, F. O.

    1983-01-01

    A five-step procedure is provided to help students in making the assignment of absolute configuration less bothersome. Examples for both single (2-butanol) and multi-chiral carbon (3-chloro-2-butanol) molecules are included. (JN)

  19. The Simplicity Argument and Absolute Morality

    ERIC Educational Resources Information Center

    Mijuskovic, Ben

    1975-01-01

    In this paper the author has maintained that there is a similarity of thought to be found in the writings of Cudworth, Emerson, and Husserl in his investigation of an absolute system of morality. (Author/RK)

  20. Biological-Based Modeling of Low Dose Radiation Risks

    SciTech Connect

    Scott, Bobby R., Ph.D.

    2006-11-08

    The objective of this project was to refine a biological-based model (called NEOTRANS2) for low-dose, radiation-induced stochastic effects taking into consideration newly available data, including data on bystander effects (deleterious and protective). The initial refinement led to our NEOTRANS3 model which has undergone further refinement (e.g., to allow for differential DNA repair/apoptosis over different dose regions). The model has been successfully used to explain nonlinear dose-response curves for low-linear-energy-transfer (LET) radiation-induced mutations (in vivo) and neoplastic transformation (in vitro). Relative risk dose-response functions developed for neoplastic transformation have been adapted for application to cancer relative risk evaluation for irradiated humans. Our low-dose research along with that conducted by others collectively demonstrate the following regarding induced protection associated with exposure to low doses of low-LET radiation: (1) protects against cell killing by high-LET alpha particles; (2) protects against spontaneous chromosomal damage; (3) protects against spontaneous mutations and neoplastic transformations; (4) suppresses mutations induced by a large radiation dose even when the low dose is given after the large dose; (5) suppresses spontaneous and alpha-radiation-induced cancers; (6) suppresses metastasis of existing cancer; (7) extends tumor latent period; (8) protects against diseases other than cancer; and (9) extends life expectancy. These forms of radiation-induced protection are called adapted protection as they relate to induced adaptive response. Thus, low doses and dose rates of low-LET radiation generally protect rather than harm us. These findings invalidate the linear no-threshold (LNT) hypothesis, which is based on the premise that any amount of radiation is harmful irrespective of its type. The hypothesis also implies a linear dose-response curve for cancer induction that has a positive slope and no

  1. Phase two of Site 300's ecological risk assessment: Model verification and risk management

    SciTech Connect

    Carlson, T.M.; Gregory, S.D.

    1995-12-31

    The authors completed the baseline ecological risk assessment (ERA) for Lawrence Livermore National Laboratory's Site 300 in 1993. Using data collection and modeling techniques adapted from the human health risk assessment (HRA), they evaluated the potential hazard of contaminants in environmental media to ecological receptors. They identified potential hazards to (1) aquatic invertebrates from heavy metal contaminants in surface water, (2) burrowing vertebrates from contaminants volatilizing from subsurface soil into burrow air, and (3) grazing deer and burrowing vertebrates from cadmium contamination in surface soil. They recently began collecting data to refine the estimates of potential hazard to these ecological receptors. Bioassay results from the surface water failed to verify a hazard to aquatic invertebrates. Soil vapor surveys of subsurface burrows did verify the presence of high concentrations of volatile organic compounds (VOCs). However, they have not yet verified a true impact on the burrowing populations. The authors also completed an extensive surface soil sampling program, which identified local hot spots of cadmium contamination. In addition, they have been collecting data on the land use patterns of the deer population. Their data indicate that deer do not typically use those areas with cadmium surface soil contamination. Information from this phase of the ERA, along with the results of the HRA, will direct the selection of remedial alternatives for the site. For the ecological receptors, remedial alternatives include developing a risk management program which includes ensuring that (1) sensitive burrowing species (such as rare or endangered species) do not use areas of surface or subsurface contamination, and (2) deer populations do not use areas of surface soil contamination.

  2. Development of Relative Risk Model for Regional Groundwater Risk Assessment: A Case Study in the Lower Liaohe River Plain, China

    PubMed Central

    Li, Xianbo; Zuo, Rui; Teng, Yanguo; Wang, Jinsheng; Wang, Bin

    2015-01-01

    Increasing pressure on water supply worldwide, especially in arid areas, has resulted in groundwater overexploitation and contamination, and subsequent deterioration of the groundwater quality and threats to public health. Environmental risk assessment of regional groundwater is an important tool for groundwater protection. This study presents a new approach for the environmental risk assessment of regional groundwater. It was carried out with a relative risk model (RRM) coupled with a series of indices, such as a groundwater vulnerability index, and comprises receptor analysis, risk source analysis, risk exposure and hazard analysis, risk characterization, and management of groundwater. The risk map is a product of the probability of environmental contamination and impact. The reliability of the RRM was verified using Monte Carlo analysis. This approach was applied to the lower Liaohe River Plain (LLRP), northeastern China, which covers 23604 km2. A spatial analysis tool within GIS was used to interpolate and manipulate the data and to develop environmental risk maps of regional groundwater, dividing the level of risk from high to low into five ranks (V, IV, III, II, I). The results indicate that areas of relative risk rank (RRR) V cover 2324 km2, comprising 9.8% of the area; RRR IV covers 3986 km2, accounting for 16.9% of the area. It is a new and appropriate method for regional groundwater resource management and land use planning, and is a rapid and effective tool for improving strategic decision making to protect groundwater and reduce environmental risk. PMID:26020518
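
    The RRM's core arithmetic on a toy grid: risk as the product of contamination probability and impact, classified into the five ranks. All values are synthetic, not LLRP data.

        import numpy as np

        rng = np.random.default_rng(9)
        probability = rng.random((4, 4))    # e.g. from vulnerability/source indices
        impact = rng.random((4, 4))         # e.g. receptor sensitivity
        risk = probability * impact         # relative risk surface

        # rank I (lowest) .. V (highest) by quintile of the risk surface
        edges = np.quantile(risk, [0.2, 0.4, 0.6, 0.8])
        ranks = np.digitize(risk, edges) + 1
        print(ranks)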

  3. Modelling surface water flood risk using coupled numerical and physical modelling techniques

    NASA Astrophysics Data System (ADS)

    Green, D. L.; Pattison, I.; Yu, D.

    2015-12-01

    Surface water (pluvial) flooding occurs due to intense precipitation events where rainfall cannot infiltrate into the sub-surface or drain via storm water systems. The perceived risk appears to have increased in recent years with pluvial flood events seeming more severe and frequent within the UK. Surface water flood risk currently accounts for one third of all UK flood risk, with approximately two million people living in urban areas being at risk of a 1 in 200 year flood event. Surface water flooding research often focuses upon using 1D, 2D or 1D-2D coupled numerical modelling techniques to understand the extent, depth and severity of actual or hypothetical flood scenarios. Although much research has been conducted using numerical modelling, field data available for model calibration and validation is limited due to the complexities associated with data collection in surface water flood conditions. Ultimately, the data which numerical models are based upon is often erroneous and inconclusive. Physical models offer an alternative and innovative environment to collect data within. A controlled, closed system allows independent variables to be altered individually to investigate cause and effect relationships. Despite this, physical modelling approaches are seldom used in surface water flooding research. Scaled laboratory experiments using a 9m2, two-tiered physical model consisting of: (i) a mist nozzle type rainfall simulator able to simulate a range of rainfall intensities similar to those observed within the United Kingdom, and; (ii) a fully interchangeable, scaled plot surface have been conducted to investigate and quantify the influence of factors such as slope, impermeability, building density/configuration and storm dynamics on overland flow and rainfall-runoff patterns within a range of terrestrial surface conditions. Results obtained within the physical modelling environment will be compared with numerical modelling results using FloodMap (Yu & Lane, 2006

  4. Time-to-Compromise Model for Cyber Risk Reduction Estimation

    SciTech Connect

    Miles A. McQueen; Wayne F. Boyer; Mark A. Flynn; George A. Beitel

    2005-09-01

    We propose a new model for estimating the time to compromise a system component that is visible to an attacker. The model provides an estimate of the expected value of the time-to-compromise as a function of known and visible vulnerabilities, and attacker skill level. The time-to-compromise random process model is a composite of three subprocesses associated with attacker actions aimed at the exploitation of vulnerabilities. In a case study, the model was used to aid in a risk reduction estimate between a baseline Supervisory Control and Data Acquisition (SCADA) system and the baseline system enhanced through a specific set of control system security remedial actions. For our case study, the total number of system vulnerabilities was reduced by 86% but the dominant attack path was through a component where the number of vulnerabilities was reduced by only 42% and the time-to-compromise of that component was increased by only 13% to 30% depending on attacker skill level.
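
    A simplified sketch in the spirit of the three-subprocess structure: an exploit may already be at hand, may have to be developed for a known vulnerability, or the attacker must wait for a new vulnerability to appear. The probabilities and mean times below are invented, not the published model's estimates.

        import random

        def time_to_compromise(n_vulns, skill, rng):
            # skill in (0, 1]; q: assumed chance a given visible vulnerability
            # already has an exploit usable by this attacker
            q = 0.05 * skill
            p_ready = 1 - (1 - q) ** n_vulns
            if rng.random() < p_ready:
                return rng.expovariate(1 / 1.0)        # subprocess 1: exploit at hand
            if n_vulns > 0 and rng.random() < 0.5 + 0.4 * skill:
                return rng.expovariate(skill / 10.0)   # subprocess 2: develop an exploit
            return rng.expovariate(skill / 30.0)       # subprocess 3: wait for new vuln

        rng = random.Random(13)
        for n_vulns in (14, 8):                        # e.g. before/after remediation
            for skill in (0.3, 0.9):
                mean_ttc = sum(time_to_compromise(n_vulns, skill, rng)
                               for _ in range(20_000)) / 20_000
                print(f"vulns={n_vulns:2d} skill={skill}: mean TTC ~ {mean_ttc:5.1f} days")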

  5. Dose-volume modeling of the risk of postoperative pulmonary complications among esophageal cancer patients treated with concurrent chemoradiotherapy followed by surgery

    SciTech Connect

    Tucker, Susan L. E-mail: sltucker@mdanderson.org; Liu, H. Helen; Wang, Shulian; Wei Xiong; Liao Zhongxing; Komaki, Ritsuko; Cox, James D.; Mohan, Radhe

    2006-11-01

    Purpose: The aim of this study was to investigate the effect of radiation dose distribution in the lung on the risk of postoperative pulmonary complications among esophageal cancer patients. Methods and Materials: We analyzed data from 110 patients with esophageal cancer treated with concurrent chemoradiotherapy followed by surgery at our institution from 1998 to 2003. The endpoint for analysis was postsurgical pneumonia or acute respiratory distress syndrome. Dose-volume histograms (DVHs) and dose-mass histograms (DMHs) for the whole lung were used to fit normal-tissue complication probability (NTCP) models, and the quality of the fits was compared using bootstrap analysis. Results: Normal-tissue complication probability modeling identified that the risk of postoperative pulmonary complications was most significantly associated with small absolute volumes of lung spared from doses ≥5 Gy (VS5), that is, exposed to doses <5 Gy. However, bootstrap analysis found no significant difference between the quality of this model and fits based on other dosimetric parameters, including mean lung dose, effective dose, and relative volume of lung receiving ≥5 Gy, probably because of correlations among these factors. The choice of DVH vs. DMH or the use of fractionation correction did not significantly affect the results of the NTCP modeling. The parameter values estimated for the Lyman NTCP model were as follows (with 95% confidence intervals in parentheses): n = 1.85 (0.04, ∞), m = 0.55 (0.22, 1.02), and D50 = 17.5 Gy (9.4 Gy, 102 Gy). Conclusions: In this cohort of esophageal cancer patients, several dosimetric parameters including mean lung dose, effective dose, and absolute volume of lung receiving <5 Gy provided similar descriptions of the risk of postoperative pulmonary complications as a function of radiation dose distribution in the lung.
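
    With those parameters the Lyman NTCP calculation is short: reduce the DVH to an effective dose via the Kutcher-Burman scheme, then evaluate the probit response. The DVH below is invented, and treating the reconstructed D50 = 17.5 Gy as the whole-lung tolerance dose is an assumption of this sketch.

        import math

        n, m, d50 = 1.85, 0.55, 17.5   # Lyman parameters reported above

        def ntcp(dvh):
            # dvh: list of (fractional volume, dose in Gy) bins
            d_eff = sum(v * d ** (1 / n) for v, d in dvh) ** n   # Kutcher-Burman reduction
            t = (d_eff - d50) / (m * d50)
            return d_eff, 0.5 * (1 + math.erf(t / math.sqrt(2)))  # probit dose response

        dvh = [(0.40, 2.0), (0.30, 8.0), (0.20, 25.0), (0.10, 50.0)]  # invented DVH
        d_eff, p = ntcp(dvh)
        print(f"effective dose = {d_eff:.1f} Gy, NTCP = {p:.2f}")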

  8. STakeholder-Objective Risk Model (STORM): Determining the aggregated risk of multiple contaminant hazards in groundwater well catchments

    NASA Astrophysics Data System (ADS)

    Enzenhoefer, R.; Binning, P. J.; Nowak, W.

    2015-09-01

    Risk is often defined as the product of probability, vulnerability and value. Drinking water supply from groundwater abstraction is often at risk due to multiple hazardous land use activities in the well catchment. Each hazard might or might not introduce contaminants into the subsurface at any point in time, which then affects the pumped water quality upon transport through the aquifer. In such situations, estimating the overall risk is not trivial, and three key questions emerge: (1) How to aggregate the impacts from different contaminants and spill locations into an overall, cumulative impact on the value at risk? (2) How to properly account for the stochastic nature of spill events when converting the aggregated impact to a risk estimate? (3) How will the overall risk and subsequent decision making depend on stakeholder objectives, where stakeholder objectives refer to the values at risk, risk attitudes and risk metrics that can vary between stakeholders? In this study, we provide a STakeholder-Objective Risk Model (STORM) for assessing the total aggregated risk. Our concept is a quantitative, probabilistic and modular framework for simulation-based risk estimation. It rests on the source-pathway-receptor concept and mass-discharge-based aggregation of stochastically occurring spill events, accounts for uncertainties in the involved flow and transport models through Monte Carlo simulation, and can address different stakeholder objectives. We illustrate the application of STORM in a numerical test case inspired by a German drinking water catchment. As one may expect, the results depend strongly on the chosen stakeholder objectives, but they are equally sensitive to different approaches for risk aggregation across different hazards, contaminant types, and over time.
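
    As a minimal sketch of questions (1) and (2), the snippet below aggregates mass discharge across hazards with Poisson spill occurrence and lognormal per-spill impacts, then converts the result into an exceedance probability by Monte Carlo. All hazard names, rates, impacts and the threshold are invented for illustration; they are not values from the study.

```python
import numpy as np

rng = np.random.default_rng(0)

hazards = {
    # name: (spill rate per year, mean contaminant mass discharge at the well)
    "farm":    (0.8, 0.4),
    "road":    (0.3, 1.0),
    "factory": (0.1, 2.5),
}
threshold = 1.5      # assumed acceptable aggregated discharge (mass/time)
n_mc = 100_000       # Monte Carlo realisations of one year

total = np.zeros(n_mc)
for rate, mean_discharge in hazards.values():
    spills = rng.poisson(rate, n_mc)                 # stochastic spill events
    # lognormal impact per spill stands in for flow/transport uncertainty
    impact = rng.lognormal(np.log(mean_discharge), 0.5, n_mc)
    total += spills * impact                         # aggregate over hazards

print("P(aggregated discharge exceeds threshold):", (total > threshold).mean())
```

    Different stakeholder objectives would then be expressed by swapping the threshold, the value at risk, or the risk metric applied to `total`.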

  9. Application of wildfire simulation models for risk analysis

    NASA Astrophysics Data System (ADS)

    Ager, A.; Finney, M.

    2009-04-01

    Wildfire simulation models are being widely used by fire and fuels specialists in the U.S. to support tactical and strategic decisions related to the mitigation of wildfire risk. Much of this application has resulted from the development of a minimum travel time (MTT) fire spread algorithm (M. Finney) that makes it computationally feasible to simulate thousands of fires and generate burn probability and intensity maps over large areas (10,000 - 2,000,000 ha). The MTT algorithm is parallelized for multi-threaded processing and is embedded in a number of research and applied fire modeling applications. High performance computers (e.g., 32-way 64 bit SMP) are typically used for MTT simulations, although the algorithm is also implemented in the 32 bit desktop FlamMap3 program (www.fire.org). Extensive testing has shown that this algorithm can replicate large fire boundaries in the heterogeneous landscapes that typify much of the wildlands in the western U.S. In this paper, we describe the application of the MTT algorithm to understand spatial patterns of burn probability (BP), and to analyze wildfire risk to key human and ecological values. The work is focused on a federally managed 2,000,000 ha landscape in the central interior region of the state of Oregon, USA. The fire-prone study area encompasses a wide array of topography and fuel types and a number of highly valued resources that are susceptible to fire. We quantitatively defined risk as the product of the probability of a fire and the resulting consequence. Burn probabilities at specific intensity classes were estimated for each 100 x 100 m pixel by simulating 100,000 wildfires under burn conditions that replicated recent severe wildfire events that occurred under conditions where fire suppression was generally ineffective (97th percentile, August weather). We repeated the simulation under milder weather (70th percentile, August weather) to replicate a "wildland fire use scenario" where suppression is minimized to
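
    As a minimal illustration of risk defined as probability times consequence, the sketch below combines per-pixel burn probabilities by intensity class with a resource response function; all numbers are invented and are not outputs of the Oregon simulations.

```python
import numpy as np

# Burn probability per year for three flame-length classes at one pixel
# (assumed values), and the percent change in resource value if burned
# at that intensity (an assumed "response function").
bp = np.array([0.004, 0.002, 0.001])
response = np.array([-10.0, -40.0, -90.0])

expected_nvc = (bp * response).sum()   # expected net value change, %/yr
print(f"expected annual net value change: {expected_nvc:.2f}%")
```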

  10. Low-probability flood risk modeling for New York City.

    PubMed

    Aerts, Jeroen C J H; Lin, Ning; Botzen, Wouter; Emanuel, Kerry; de Moel, Hans

    2013-05-01

    The devastating impact of Hurricane Sandy (2012) showed again that New York City (NYC) is one of the most vulnerable cities to coastal flooding around the globe. The low-lying areas in NYC can be flooded by nor'easter storms and North Atlantic hurricanes. The few studies that have estimated potential flood damage for NYC base their damage estimates on only a single, or a few, possible flood events. The objective of this study is to assess the full distribution of hurricane flood risk in NYC. This is done by calculating potential flood damage with a flood damage model that uses many possible storms and surge heights as input. These storms are representative of the low-probability/high-impact flood hazard faced by the city. Exceedance probability-loss curves are constructed under different assumptions about the severity of flood damage. The estimated flood damage to buildings for NYC is between US$59 and US$129 million/year. The damage caused by a 1/100-year storm surge is in the range of US$2 bn-5 bn, while it is between US$5 bn and US$11 bn for a 1/500-year storm surge. An analysis of flood risk in each of the five boroughs of NYC finds that Brooklyn and Queens are the most vulnerable to flooding. This study examines several uncertainties in the various steps of the risk analysis, which resulted in variations in flood damage estimates. These uncertainties include: the interpolation of flood depths; the use of different flood damage curves; and the influence of the spectra of characteristics of the simulated hurricanes.
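
    The exceedance probability-loss construction can be sketched as below with a synthetic storm set; the storm rate and the loss distribution are invented, not the study's hurricane catalogue.

```python
import numpy as np

rng = np.random.default_rng(1)
annual_rate = 3.0                                         # assumed storms/yr
losses = rng.lognormal(mean=2.0, sigma=1.5, size=10_000)  # US$ millions/event

sorted_losses = np.sort(losses)[::-1]                     # largest first
exceed_freq = annual_rate * np.arange(1, losses.size + 1) / losses.size

i100 = np.searchsorted(exceed_freq, 1 / 100)              # ~1/100-yr event
print(f"1/100-yr loss: {sorted_losses[i100]:.1f} M$")
print(f"expected annual damage: {annual_rate * losses.mean():.1f} M$")
```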

  11. Ecological risk model of childhood obesity in Chinese immigrant children.

    PubMed

    Zhou, Nan; Cheah, Charissa S L

    2015-07-01

    Chinese Americans are the largest and fastest growing Asian American subgroup, having increased by about one-third during the 2000s. Despite the slender Asian stereotype, nearly one-third of 6- to 11-year-old Chinese American children were found to be overweight (above the 85th percentile in BMI). Importantly, unique and severe health risks are associated with being overweight/obese among Chinese individuals. Unfortunately, Chinese immigrant children have been neglected in the literature on obesity. This review aimed to identify factors at various levels of the ecological model that may place Chinese immigrant children at risk for being overweight/obese in the U.S. Key contextual factors at the micro-, meso-, exo-, macro- and chronosystem levels were identified, guided by Bronfenbrenner's ecological systems theory. The corresponding mediating and moderating processes among the factors were also reviewed and proposed. By presenting a conceptual framework and relevant research, this review can provide a basic framework for directing future interdisciplinary research in seeking solutions to childhood obesity within this understudied population.

  12. Modeling for regulatory purposes (risk and safety assessment).

    PubMed

    El-Masri, Hisham

    2013-01-01

    Chemicals provide many key building blocks that are converted into end-use products or used in industrial processes to make products that benefit society. Ensuring the safety of chemicals and their associated products is a key regulatory mission. Current processes and procedures for evaluating and assessing the impact of chemicals on human health, wildlife, and the environment were, in general, designed decades ago. These procedures depend on generation of relevant scientific knowledge in the laboratory and interpretation of this knowledge to refine our understanding of the related potential health risks. In practice, this often means that estimates of dose-response and time-course behaviors for apical toxic effects are needed as a function of relevant levels of exposure. In many situations, these experimentally determined functions are constructed using relatively high doses in experimental animals. In the absence of experimental data, the application of computational modeling is necessary to extrapolate risk or safety guidance values for human exposures at low but environmentally relevant levels.

  13. Ecological Risk Model of Childhood Obesity in Chinese Immigrant Children

    PubMed Central

    Zhou, Nan; Cheah, Charissa S. L.

    2015-01-01

    Chinese Americans are the largest and fastest growing Asian American subgroup, having increased by about one-third during the 2000s. Despite the slender Asian stereotype, nearly one-third of 6- to 11-year-old Chinese American children were found to be overweight (above the 85th percentile in BMI). Importantly, unique and severe health risks are associated with being overweight/obese among Chinese individuals. Unfortunately, Chinese immigrant children have been neglected in the literature on obesity. This review aimed to identify factors at various levels of the ecological model that may place Chinese immigrant children at risk for being overweight/obese in the U.S. Key contextual factors at the micro-, meso-, exo-, macro- and chronosystem levels were identified, guided by Bronfenbrenner's ecological systems theory. The corresponding mediating and moderating processes among the factors were also reviewed and proposed. By presenting a conceptual framework and relevant research, this review can provide a basic framework for directing future interdisciplinary research in seeking solutions to childhood obesity within this understudied population. PMID:25728887

  14. The Terrestrial Investigation Model: A probabilistic risk assessment model for birds exposed to pesticides

    EPA Science Inventory

    One of the major recommendations of the National Academy of Science to the USEPA, NMFS and USFWS was to utilize probabilistic methods when assessing the risks of pesticides to federally listed endangered and threatened species. The Terrestrial Investigation Model (TIM, version 3....

  15. Simple Model of Mating Preference and Extinction Risk

    NASA Astrophysics Data System (ADS)

    Pękalski, Andrzej

    We present a simple model of a population of individuals characterized by their genetic structure in the form of a double string of bits and the phenotype following from it. The population lives in an unchanging habitat preferring a certain type of phenotype (optimum). Individuals are unisex; however, a pair is necessary for breeding. An individual rejects a mate if the latter's phenotype contains too many bad genes, i.e. genes different from the optimum, in the same places as the individual's own. We show that such a strategy, analogous to disassortative mating based on the major histocompatibility complex, which avoids inbreeding and incest, can be beneficial for the population and can considerably reduce the extinction risk, especially in small populations.
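
    A minimal sketch of the rejection rule, assuming an all-zeros optimum, a 32-bit genome, and a tolerance of three shared bad loci (all illustrative choices; the abstract does not give the paper's parameters):

```python
import numpy as np

rng = np.random.default_rng(2)
GENOME_LEN, MAX_SHARED_BAD = 32, 3

def bad_loci(genome):
    """Loci whose value differs from the habitat optimum (all zeros)."""
    return genome != 0

def accepts(a, b):
    """Mate accepted unless too many bad loci coincide in both genomes."""
    shared = np.logical_and(bad_loci(a), bad_loci(b)).sum()
    return shared <= MAX_SHARED_BAD

population = rng.integers(0, 2, size=(100, GENOME_LEN))
i, j = rng.choice(100, size=2, replace=False)
print("pair accepted:", accepts(population[i], population[j]))
```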

  16. Optimal dividends in the Brownian motion risk model with interest

    NASA Astrophysics Data System (ADS)

    Fang, Ying; Wu, Rong

    2009-07-01

    In this paper, we consider a Brownian motion risk model in which, in addition, the surplus earns investment income at a constant force of interest. The objective is to find a dividend policy that maximizes the expected discounted value of dividend payments. It is well known that optimality is achieved by a barrier strategy when the dividend rate is unrestricted. However, ultimate ruin of the company is certain if a barrier strategy is applied; in many circumstances this is not desirable. This consideration leads us to impose a restriction on the dividend stream. We assume that dividends are paid to the shareholders according to admissible strategies whose dividend rate is bounded by a constant. Under this additional constraint, we show that the optimal dividend strategy is a threshold strategy.
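
    In symbols, a sketch of the standard formulation behind this result (notation assumed here, not copied from the paper): the controlled surplus and the value function are

```latex
dX_t = \bigl( \mu + r X_t - l_t \bigr)\, dt + \sigma\, dW_t , \qquad 0 \le l_t \le M ,
\qquad
V(x) = \sup_{\{l_t\}} \mathbb{E}_x \Bigl[ \int_0^{\tau} e^{-\delta t}\, l_t\, dt \Bigr] ,
\quad \tau = \inf\{ t \ge 0 : X_t \le 0 \}
```

    and the paper's conclusion is that the supremum is attained by a threshold strategy, paying at the maximal rate M only while the surplus exceeds some level b.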

  17. An absolute scale for measuring the utility of money

    NASA Astrophysics Data System (ADS)

    Thomas, P. J.

    2010-07-01

    Measurement of the utility of money is essential in the insurance industry, for prioritising public spending schemes and for the evaluation of decisions on protection systems in high-hazard industries. Up to this time, however, there has been no universally agreed measure for the utility of money, with many utility functions being in common use. In this paper, we shall derive a single family of utility functions, which have risk-aversion as the only free parameter. The fact that they return a utility of zero at their low, reference datum, either the utility of no money or of one unit of money, irrespective of the value of risk-aversion used, qualifies them to be regarded as absolute scales for the utility of money. Evidence of validation for the concept will be offered based on inferential measurements of risk-aversion, using diverse measurement data.
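
    The abstract does not reproduce the functional form, but one standard one-parameter family consistent with its description (risk-aversion ε as the only free parameter, and zero utility at one unit of money) is the isoelastic family, offered here as an assumed illustration rather than the paper's derivation:

```latex
u(x) = \frac{x^{1-\varepsilon} - 1}{1-\varepsilon} \quad (\varepsilon \ne 1),
\qquad
u(x) = \ln x \quad (\varepsilon = 1)
```

    which satisfies u(1) = 0 for every value of ε, matching the reference-datum property described above.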

  18. Internal modelling under Risk-Based Capital (RBC) framework

    NASA Astrophysics Data System (ADS)

    Ling, Ang Siew; Hin, Pooi Ah

    2015-12-01

    Very often, the methods for internal modelling under the Risk-Based Capital framework make use of data in the form of a run-off triangle. The present research instead extracts, from a group of n customers, the historical data for the sum insured si of the i-th customer together with the amount paid yij and the amount aij reported but not yet paid in the j-th development year, for j = 1, 2, 3, 4, 5, 6. We model the future value (yij+1, aij+1) as dependent on the present year value (yij, aij) and the sum insured si via a conditional distribution derived from a multivariate power-normal mixture distribution. For a group of given customers with different original purchase dates, the distribution of the aggregate claims liabilities may be obtained from the proposed model. The prediction interval based on this distribution is found to cover the observed aggregate claim liabilities well.

  19. Validation of a Multimarker Model for Assessing Risk of Type 2 Diabetes from a Five-Year Prospective Study of 6784 Danish People (Inter99)

    PubMed Central

    Urdea, Mickey; Kolberg, Janice; Wilber, Judith; Gerwien, Robert; Moler, Edward; Rowe, Michael; Jorgensen, Paul; Hansen, Torben; Pedersen, Oluf; Jørgensen, Torben; Borch-Johnsen, Knut

    2009-01-01

    Background Improved identification of subjects at high risk for development of type 2 diabetes would allow preventive interventions to be targeted toward individuals most likely to benefit. In previous research, predictive biomarkers were identified and used to develop multivariate models to assess an individual's risk of developing diabetes. Here we describe the training and validation of the PreDx™ Diabetes Risk Score (DRS) model in a clinical laboratory setting using baseline serum samples from subjects in the Inter99 cohort, a population-based primary prevention study of cardiovascular disease. Methods Among 6784 subjects free of diabetes at baseline, 215 subjects progressed to diabetes (converters) during five years of follow-up. A nested case-control study was performed using serum samples from 202 converters and 597 randomly selected nonconverters. Samples were randomly assigned to equally sized training and validation sets. Seven biomarkers were measured using assays developed for use in a clinical reference laboratory. Results The PreDx DRS model performed better on the training set (area under the curve [AUC] = 0.837) than fasting plasma glucose alone (AUC = 0.779). When applied to the sequestered validation set, the PreDx DRS showed the same performance (AUC = 0.838), thus validating the model. This model had a better AUC than any other single measure from a fasting sample. Moreover, the model provided further risk stratification among high-risk subpopulations with impaired fasting glucose or metabolic syndrome. Conclusions The PreDx DRS provides the absolute risk of diabetes conversion in five years for subjects identified to be “at risk” using the clinical factors. PMID:20144324

  20. Jasminum flexile flower absolute from India--a detailed comparison with three other jasmine absolutes.

    PubMed

    Braun, Norbert A; Kohlenberg, Birgit; Sim, Sherina; Meier, Manfred; Hammerschmidt, Franz-Josef

    2009-09-01

    Jasminum flexile flower absolute from the south of India and the corresponding vacuum headspace (VHS) sample of the absolute were analyzed using GC and GC-MS. Three other commercially available Indian jasmine absolutes, from the species J. sambac, J. officinale subsp. grandiflorum, and J. auriculatum, and the respective VHS samples were used for comparison purposes. One hundred and twenty-one compounds were characterized in J. flexile flower absolute, with methyl linolate, benzyl salicylate, benzyl benzoate, (2E,6E)-farnesol, and benzyl acetate as the main constituents. A detailed olfactory evaluation was also performed.

  1. Models and mosaics: investigating cross-cultural differences in risk perception and risk preference.

    PubMed

    Weber, E U; Hsee, C K

    1999-12-01

    In this article, we describe a multistudy project designed to explain observed cross-national differences in risk taking between respondents from the People's Republic of China and the United States. Using this example, we develop the following recommendations for cross-cultural investigations. First, like all psychological research, cross-cultural studies should be model based. Investigators should commit themselves to a model of the behavior under study that explicitly specifies possible causal constructs or variables hypothesized to influence the behavior, as well as the relationship between those variables, and allows for individual, group, or cultural differences in the value of these variables or in the relationship between them. This moves the focus from a simple demonstration of cross-national differences toward a prediction of the behavior, including its cross-national variation. Ideally, the causal construct hypothesized and shown to differ between cultures should be demonstrated to serve as a moderator or a mediator between culture and observed behavioral differences. Second, investigators should look for converging evidence for hypothesized cultural effects on behavior by looking at multiple dependent variables and using multiple methodological approaches. Thus, the data collection that will allow for the establishment of conclusive causal connections between a cultural variable and some target behavior can be compared with the creation of a mosaic.

  2. SY 04-1 CVD RISK PREDICTION IN HIGH-RISK VERSUS LOW-RISK POPULATIONS.

    PubMed

    Kim, Hyeon Chang

    2016-09-01

    Disease risk prediction models have been developed to assess the impact of multiple risk factors and to estimate an individual's absolute disease risk. Accurate disease prediction is essential for personalized prevention, because the benefits, risks, and costs of alternative strategies must be weighed to choose the best preventive strategy for individual patients. Cardiovascular disease (CVD) prediction is the earliest example of individual risk prediction. Since the Framingham study reported a CVD risk prediction method in 1976, an increasing number of risk assessment tools have been developed to assess CVD risk in various settings. The Framingham study results are fundamental evidence for the prediction of CVD risk. However, the clinical utility of a disease prediction model can be population-specific, because the baseline disease risk, subtype distribution of the disease, and level of exposure to risk factors differ by region and ethnicity. It has been shown that CVD prediction models developed in high-risk populations, such as the Framingham Risk Score, overestimate an individual's disease risk when applied to a low-risk population without re-calibration. Thus, countries of relatively low CVD risk are trying to re-calibrate existing CVD prediction models or to develop new prediction models from their own population data. However, even re-calibrated or newly developed CVD prediction models are often of little clinical value in a low-risk population. A good example is CVD prediction in the Korean population. Compared to Western populations, the Korean population has a much lower incidence of coronary heart disease. Therefore, the vast majority of individuals fall into the low-risk group when their disease risk is assessed with a prediction model. Even a well-validated prediction model may not identify high-risk individuals who merit aggressive preventive treatment. A few alternative approaches have been suggested for CVD risk prediction in a low-risk
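
    The re-calibration referred to here usually follows the standard absolute-risk form of a Cox/Framingham-type model, sketched below as the general method rather than this talk's specific equations:

```latex
\hat{p}(t \mid x) = 1 - S_0(t)^{\exp\left( \beta^{\top} x - \beta^{\top} \bar{x} \right)}
```

    where the coefficient vector β is retained from the original cohort, while the baseline survival S_0(t) and the mean risk-factor vector x̄ are replaced with values estimated in the local, lower-risk population.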

  3. Risk evaluation of uranium mining: A geochemical inverse modelling approach

    NASA Astrophysics Data System (ADS)

    Rillard, J.; Zuddas, P.; Scislewski, A.

    2011-12-01

    It is well known that uranium extraction operations can increase risks linked to radiation exposure. The toxicity of uranium and associated heavy metals is the main environmental concern regarding exploitation and processing of U-ore. In areas where U mining is planned, a careful assessment of toxic and radioactive element concentrations is recommended before the start of mining activities. A background evaluation of harmful elements is important in order to prevent and/or quantify future water contamination resulting from possible migration of toxic metals coming from ore and waste water interaction. Controlled leaching experiments were carried out to investigate processes of ore and waste (leached ore) degradation, using samples from the uranium exploitation site located in Caetité, Bahia, Brazil. In experiments in which the reaction of waste with water was tested, we found that the water had low pH and high levels of sulphates and aluminium. On the other hand, in experiments in which ore was tested, the water had a chemical composition comparable to natural water found in the region of Caetité. On the basis of our experiments, we suggest that waste resulting from sulphuric acid treatment can induce acidification and salinization of surface and ground water. For this reason, proper storage of waste is imperative. As a tool to evaluate the risks, a geochemical inverse modelling approach was developed to estimate the water-mineral interaction involving the presence of toxic elements. We used a method described earlier by Scislewski and Zuddas (2010; Geochim. Cosmochim. Acta 74, 6996-7007), in which the reactive surface area of mineral dissolution can be estimated. We found that the reactive surface area of parent minerals in the rock is not constant over time but varies by several orders of magnitude in only two months of interaction. We propose that parent mineral heterogeneity and, particularly, neogenic phase formation may explain the observed variation of the

  4. Universal Cosmic Absolute and Modern Science

    NASA Astrophysics Data System (ADS)

    Kostro, Ludwik

    The official sciences, especially all natural sciences, respect in their research the principle of methodological naturalism, i.e. they consider all phenomena as entirely natural and therefore never adduce or cite supernatural entities and forces in their scientific explanations. The purpose of this paper is to show that modern science has its own self-existent, self-acting, and self-sufficient Natural All-in Being or Omni-Being, i.e. the entire Nature as a Whole, that justifies scientific methodological naturalism. Since this Natural All-in Being is one and only, It should be considered the scientifically justified Natural Absolute of science and should be called, in my opinion, the Universal Cosmic Absolute of Modern Science. It will also be shown that the Universal Cosmic Absolute is ontologically enormously stratified and is, in its ultimate, i.e. most fundamental, stratum, trans-reistic and trans-personal. This means that in its basic stratum It is neither a Thing nor a Person, although It contains in Itself all things and persons, as well as all other sentient and conscious individuals. At the turn of the 20th century, science began to look for a theory of everything, a final theory, a master theory. In my opinion, the natural Universal Cosmic Absolute will constitute, in such a theory, the radical all-penetrating Ultimate Basic Reality and will step by step supplant the traditional supernatural personal Absolute.

  5. Absolute Humidity and the Seasonal Onset of Influenza in the Continental United States

    PubMed Central

    Shaman, Jeffrey; Pitzer, Virginia E.; Viboud, Cécile; Grenfell, Bryan T.; Lipsitch, Marc

    2010-01-01

    Much of the observed wintertime increase of mortality in temperate regions is attributed to seasonal influenza. A recent reanalysis of laboratory experiments indicates that absolute humidity strongly modulates the airborne survival and transmission of the influenza virus. Here, we extend these findings to the human population level, showing that the onset of increased wintertime influenza-related mortality in the United States is associated with anomalously low absolute humidity levels during the prior weeks. We then use an epidemiological model, in which observed absolute humidity conditions temper influenza transmission rates, to successfully simulate the seasonal cycle of observed influenza-related mortality. The model results indicate that direct modulation of influenza transmissibility by absolute humidity alone is sufficient to produce this observed seasonality. These findings provide epidemiological support for the hypothesis that absolute humidity drives seasonal variations of influenza transmission in temperate regions. PMID:20186267
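
    A compact way to write the humidity forcing used in models of this kind is given below; the exact parameterization in the paper may differ, and the constants a, β_min and β_max are assumptions standing in for fitted values:

```latex
\beta(t) = \beta_{\min} + \bigl( \beta_{\max} - \beta_{\min} \bigr)\, e^{-a\, q(t)}
```

    where q(t) is the observed specific (absolute) humidity; β(t) then enters an SIRS-type epidemic model as the time-varying transmission rate.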

  6. Development of a relative risk model for evaluating ecological risk of water environment in the Haihe River Basin estuary area.

    PubMed

    Chen, Qiuying; Liu, Jingling; Ho, Kin Chung; Yang, Zhifeng

    2012-03-15

    Ecological risk assessment of the water environment is important for basin water resource management. Effective environmental management and system restoration in basins such as the Haihe River Basin require a holistic understanding of the relative importance of various stressor-related impacts throughout the basin. The relative risk model (RRM), an effective technical tool for evaluating ecological risk, has been applied successfully at the regional scale. In this study, risk transfer from the upstream basin was considered, and the RRM was developed by introducing the source-stressor-habitat exposure filter (SSH), the endpoint-habitat exposure filter (EH) and the stressor-endpoint effect filter (SE) to make the meaning of exposure and effect more explicit. The water environment, comprising water quality, water quantity and aquatic ecosystems, was selected as the assessment endpoint. We created a conceptual model depicting potential exposure and effect pathways from source to stressor to habitat to endpoint. The Haihe River Basin estuary (HRBE) was selected as the model case. The results showed two low-risk regions, one medium-risk region and two high-risk regions in the HRBE. They also indicated that urbanization was the largest source, followed by shipping and industry, with risk scores of 5.65, 4.71 and 3.68, respectively. Furthermore, habitat destruction was the largest stressor (risk score 2.66), followed by oxygen-consuming organic pollutants (1.75) and pathogens (1.75); these three stressors were the main drivers of ecological pressure in the study area. Among habitats, open waters (9.59) and intertidal mudflats endured the greatest pressure and deserve particular attention. Among endpoints, damage to ecological service values (30.54) and decreased biodiversity faced the greatest risk pressure.
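
    A deliberately simplified sketch of an RRM-style score is shown below: every source-stressor-habitat combination is treated as a pathway, ranks are combined multiplicatively, and each filter is collapsed to a single weight. The real model uses pathway-specific filters, and all numbers here are invented rather than the Haihe estuary scores.

```python
# Illustrative ranks on the usual 0/2/4/6 RRM scale (assumed values).
sources = {"urbanization": 6, "shipping": 4, "industry": 4}
stressors = {"habitat destruction": 6, "organic pollutants": 4}
habitats = {"open waters": 6, "intertidal mudflat": 4}

SSH, EH, SE = 0.5, 1.0, 0.75   # assumed exposure/effect filter weights

risk = sum(s * st * h * SSH * EH * SE
           for s in sources.values()
           for st in stressors.values()
           for h in habitats.values())
print("aggregated relative risk score:", risk)
```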

  7. Probabilistic modelling for estimating gas kinetics and decompression sickness risk in pigs during H2 biochemical decompression.

    PubMed

    Fahlman, Andreas; Kayar, Susan R

    2003-07-01

    We modelled the kinetics of H2 flux during gas uptake and elimination in conscious pigs exposed to hyperbaric H2. The model used a physiological description of gas flux fitted to the observed decompression sickness (DCS) incidence in two groups of pigs: untreated controls, and animals that had received intestinal injections of H2-metabolizing microbes that biochemically eliminated some of the H2 stored in the pigs' tissues. To analyse H2 flux during gas uptake, animals were compressed in a dry chamber to 24 atm (ca 88% H2, 9% He, 2% O2, 1% N2) for 30-1440 min and decompressed at 0.9 atm min^-1 (n = 70). To analyse H2 flux during gas elimination, animals were compressed to 24 atm for 3 h and decompressed at 0.45-1.8 atm min^-1 (n = 58). Animals were closely monitored for 1 h post-decompression for signs of DCS. Probabilistic modelling was used to estimate that the exponential time constants during H2 uptake (τ_in) and H2 elimination (τ_out) were 79 ± 25 min and 0.76 ± 0.14 min, respectively. Thus, the gas kinetics affecting DCS risk appeared to be substantially faster for elimination than uptake, which is contrary to customary assumptions of gas uptake and elimination kinetic symmetry. We discuss the possible reasons for this asymmetry, and why absolute values of H2 kinetics cannot be obtained with this approach.
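
    The "exponential time constant" language corresponds to the usual single-compartment washin/washout form, written here generically (the paper's probabilistic model layers a DCS risk function on top of this kinetic core):

```latex
P_{\mathrm{tis}}(t) = P_{\mathrm{amb}} + \bigl( P_{\mathrm{tis}}(0) - P_{\mathrm{amb}} \bigr)\, e^{-t/\tau}
```

    with τ = τ_in ≈ 79 min during uptake and τ = τ_out ≈ 0.76 min during elimination, per the estimates above.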

  8. On the Absolute Continuity of the Blackwell Measure

    NASA Astrophysics Data System (ADS)

    Bárány, Balázs; Kolossváry, István

    2015-04-01

    In 1957, Blackwell expressed the entropy of hidden Markov chains using a measure which can be characterised as an invariant measure for an iterated function system with place-dependent weights. This measure, called the Blackwell measure, plays a central role in understanding the entropy rate and other important characteristics of fundamental models in information theory. We show that for a suitable set of parameter values the Blackwell measure is absolutely continuous for almost every parameter in the case of binary symmetric channels.

  9. Correlates of suicide and violence risk: III. A two-stage model of countervailing forces.

    PubMed

    Plutchik, R; van Praag, H M; Conte, H R

    1989-05-01

    Questionnaires and self-report scales were administered to 100 psychiatric inpatients. The scales measured such variables as depression, hopelessness, impulsivity, mental and life problems, family violence, personality characteristics, and dyscontrol tendencies. These were correlated with indices of suicide risk and violence risk. Most variables were found to correlate significantly with both suicide and violence risk. Partial correlation analyses revealed that 10 variables correlated significantly with suicide risk but not violence risk, while four variables correlated significantly with violence risk but not suicide risk. A two-stage model of countervailing forces, incorporating concepts from both psychoanalysis and ethology, is presented as a way of interpreting the results. PMID:2748772

  10. Long‐Term Post‐CABG Survival: Performance of Clinical Risk Models Versus Actuarial Predictions

    PubMed Central

    Carr, Brendan M.; Romeiser, Jamie; Ruan, Joyce; Gupta, Sandeep; Seifert, Frank C.; Zhu, Wei

    2015-01-01

    Abstract Background/aim Clinical risk models are commonly used to predict short‐term coronary artery bypass grafting (CABG) mortality but are less commonly used to predict long‐term mortality. The added value of long‐term mortality clinical risk models over traditional actuarial models has not been evaluated. To address this, the predictive performance of a long‐term clinical risk model was compared with that of an actuarial model to identify the clinical variable(s) most responsible for any differences observed. Methods Long‐term mortality for 1028 CABG patients was estimated using the Hannan New York State clinical risk model and an actuarial model (based on age, gender, and race/ethnicity). Vital status was assessed using the Social Security Death Index. Observed/expected (O/E) ratios were calculated, and the models' predictive performances were compared using a nested c‐index approach. Linear regression analyses identified the subgroup of risk factors driving the differences observed. Results Mortality rates were 3%, 9%, and 17% at one‐, three‐, and five years, respectively (median follow‐up: five years). The clinical risk model provided more accurate predictions. Greater divergence between model estimates occurred with increasing long‐term mortality risk, with baseline renal dysfunction identified as a particularly important driver of these differences. Conclusions Long‐term mortality clinical risk models provide enhanced predictive power compared to actuarial models. Using the Hannan risk model, a patient's long‐term mortality risk can be accurately assessed and subgroups of higher‐risk patients can be identified for enhanced follow‐up care. More research appears warranted to refine long‐term CABG clinical risk models. doi: 10.1111/jocs.12665 (J Card Surg 2016;31:23–30) PMID:26543019

  11. Quantum theory allows for absolute maximal contextuality

    NASA Astrophysics Data System (ADS)

    Amaral, Barbara; Cunha, Marcelo Terra; Cabello, Adán

    2015-12-01

    Contextuality is a fundamental feature of quantum theory and a necessary resource for quantum computation and communication. It is therefore important to investigate how large contextuality can be in quantum theory. Linear contextuality witnesses can be expressed as a sum S of n probabilities, and the independence number α and the Tsirelson-like number ϑ of the corresponding exclusivity graph are, respectively, the maximum of S for noncontextual theories and for the theory under consideration. A theory allows for absolute maximal contextuality if it has scenarios in which ϑ /α approaches n . Here we show that quantum theory allows for absolute maximal contextuality despite what is suggested by the examination of the quantum violations of Bell and noncontextuality inequalities considered in the past. Our proof is not constructive and does not single out explicit scenarios. Nevertheless, we identify scenarios in which quantum theory allows for almost-absolute-maximal contextuality.

  12. Absolute calibration of in vivo measurement systems

    SciTech Connect

    Kruchten, D.A.; Hickman, D.P.

    1991-02-01

    Lawrence Livermore National Laboratory (LLNL) is currently investigating a new method for obtaining absolute calibration factors for radiation measurement systems used to measure internally deposited radionuclides in vivo. Absolute calibration of in vivo measurement systems will eliminate the need to generate a series of human surrogate structures (i.e., phantoms) for calibrating in vivo measurement systems. The absolute calibration of in vivo measurement systems utilizes magnetic resonance imaging (MRI) to define physiological structure, size, and composition. The MRI image provides a digitized representation of the physiological structure, which allows for any mathematical distribution of radionuclides within the body. Using Monte Carlo transport codes, the emission spectrum from the body is predicted. The in vivo measurement equipment is calibrated using the Monte Carlo code and adjusting for the intrinsic properties of the detection system. The calibration factors are verified using measurements of existing phantoms and previously obtained measurements of human volunteers. 8 refs.

  13. Quantitative standards for absolute linguistic universals.

    PubMed

    Piantadosi, Steven T; Gibson, Edward

    2014-01-01

    Absolute linguistic universals are often justified by cross-linguistic analysis: If all observed languages exhibit a property, the property is taken to be a likely universal, perhaps specified in the cognitive or linguistic systems of language learners and users. In many cases, these patterns are then taken to motivate linguistic theory. Here, we show that cross-linguistic analysis will very rarely be able to statistically justify absolute, inviolable patterns in language. We formalize two statistical methods--frequentist and Bayesian--and show that in both it is possible to find strict linguistic universals, but that the number of independent languages necessary to do so is generally unachievable. This suggests that methods other than typological statistics are necessary to establish absolute properties of human language, and thus that many of the purported universals in linguistics have not received sufficient empirical justification.

  14. Absolute photoacoustic thermometry in deep tissue.

    PubMed

    Yao, Junjie; Ke, Haixin; Tai, Stephen; Zhou, Yong; Wang, Lihong V

    2013-12-15

    Photoacoustic thermometry is a promising tool for temperature measurement in deep tissue. Here we propose an absolute temperature measurement method based on the dual temperature dependences of the Grüneisen parameter and the speed of sound in tissue. By taking ratiometric measurements at two adjacent temperatures, we can eliminate the factors that are temperature irrelevant but difficult to correct for in deep tissue. To validate our method, absolute temperatures of blood-filled tubes embedded ~9 mm deep in chicken tissue were measured in a biologically relevant range from 28°C to 46°C. The temperature measurement accuracy was ~0.6°C. The results suggest that our method can be potentially used for absolute temperature monitoring in deep tissue during thermotherapy.

  15. Molecular iodine absolute frequencies. Final report

    SciTech Connect

    Sansonetti, C.J.

    1990-06-25

    Fifty specified lines of ¹²⁷I₂ were studied by Doppler-free frequency modulation spectroscopy. For each line the classification of the molecular transition was determined, hyperfine components were identified, and one well-resolved component was selected for precise determination of its absolute frequency. In 3 cases, a nearby alternate line was selected for measurement because no well-resolved component was found for the specified line. Absolute frequency determinations were made with an estimated uncertainty of 1.1 MHz by locking a dye laser to the selected hyperfine component and measuring its wave number with a high-precision Fabry-Perot wavemeter. For each line results of the absolute measurement, the line classification, and a Doppler-free spectrum are given.

  16. Climate and weather risk in natural resource models

    NASA Astrophysics Data System (ADS)

    Merrill, Nathaniel Henry

    This work, consisting of three manuscripts, addresses natural resource management under risk due to variation in climate and weather. In three distinct but theoretically related applications, I quantify the role of natural resources in stabilizing economic outcomes. In Manuscript 1, we address policy designed to affect the risk of cyanobacteria blooms in a drinking water reservoir through watershed-wide policy. Combining a hydrologic and an economic model for a watershed in Rhode Island, we solve for the efficient allocation of best management practices (BMPs) on livestock pastures to meet a monthly risk-based as well as a mean-based water quality objective. In order to solve for the efficient allocations of nutrient control effort, we optimize a probabilistically constrained integer-programming problem representing the choices made on each farm and the resultant conditions that support cyanobacteria blooms. In doing so, we employ a genetic algorithm (GA). We hypothesize that management based on controlling the upper tail of the probability distribution of phosphorus loading implies different efficient management actions than controlling mean loading. We find a shift to more intense effort on fewer acres when a probabilistic objective is specified, with cost savings of up to 25% over mean-loading-based policies when meeting risk targets. Additionally, we illustrate the relative cost-effectiveness of various policies designed to meet this risk-based objective. Rainfall and the subsequent overland runoff transport nutrients to a receiving water body, with larger amounts of phosphorus moving in more intense rainfall events. We highlight the importance of this transport mechanism by comparing policies under climate change scenarios, in which the intensity of rainfall is projected to increase and the time series process of rainfall to change. In Manuscript 2, we introduce a new economic groundwater model that incorporates the gradual shift

  17. A Social Ecological Model of Syndemic Risk affecting Women with and At-Risk for HIV in Impoverished Urban Communities.

    PubMed

    Batchelder, A W; Gonzalez, J S; Palma, A; Schoenbaum, E; Lounsbury, D W

    2015-12-01

    Syndemic risk is an ecological construct, defined by co-occurring interdependent socio-environmental, interpersonal and intrapersonal determinants. We posited syndemic risk to be a function of violence, substance use, perceived financial hardship, emotional distress and self-worth among women with and at-risk for HIV in an impoverished urban community. In order to better understand these interrelationships, we developed and validated a system dynamics (SD) model based upon peer-reviewed literature; secondary data analyses of a cohort dataset including women living with and at-risk of HIV in Bronx, NY (N = 620); and input from a Bronx-based community advisory board. Simulated model output revealed divergent levels and patterns of syndemic risk over time across different sample profiles. Outputs generated new insights about how to effectively explore multicomponent multi-level programs in order to strategically develop more effective services for this population. Specifically, the model indicated that effective multi-level interventions might bolster women's resilience by increasing self-worth, which may result in decreased perceived financial hardship and risk of violence. Overall, our stakeholder-informed model depicts how self-worth may be a major driver of vulnerability and a meaningful addition to syndemic theory affecting this population. PMID:26370203

  18. Modeling Commercial Turbofan Engine Icing Risk With Ice Crystal Ingestion

    NASA Technical Reports Server (NTRS)

    Jorgenson, Philip C. E.; Veres, Joseph P.

    2013-01-01

    The occurrence of ice accretion within commercial high bypass aircraft turbine engines has been reported under certain atmospheric conditions. Engine anomalies have taken place at high altitudes that have been attributed to ice crystal ingestion, partial melting, and ice accretion on the compression system components. The result was degraded engine performance, and one or more of the following: loss of thrust control (roll back), compressor surge or stall, and flameout of the combustor. As ice crystals are ingested into the fan and low pressure compression system, the increase in air temperature causes a portion of the ice crystals to melt. It is hypothesized that this allows the ice-water mixture to cover the metal surfaces of the compressor stationary components, which leads to ice accretion through evaporative cooling. Ice accretion causes a blockage which subsequently results in the deterioration in performance of the compressor and engine. The focus of this research is to apply an engine icing computational tool to simulate the flow through a turbofan engine and assess the risk of ice accretion. The tool is comprised of an engine system thermodynamic cycle code, a compressor flow analysis code, and an ice particle melt code that has the capability of determining the rate of sublimation, melting, and evaporation through the compressor flow path, without modeling the actual ice accretion. A commercial turbofan engine which has previously experienced icing events during operation in a high altitude ice crystal environment has been tested in the Propulsion Systems Laboratory (PSL) altitude test facility at NASA Glenn Research Center. The PSL has the capability to produce a continuous ice cloud which is ingested by the engine during operation over a range of altitude conditions. The PSL test results confirmed that there was ice accretion in the engine due to ice crystal ingestion, at the same simulated altitude operating conditions as experienced previously in

  19. Orion Absolute Navigation System Progress and Challenge

    NASA Technical Reports Server (NTRS)

    Holt, Greg N.; D'Souza, Christopher

    2012-01-01

    The absolute navigation design of NASA's Orion vehicle is described. It has undergone several iterations and modifications since its inception, and continues as a work in progress. This paper seeks to benchmark the current state of the design and some of the rationale and analysis behind it. There are specific challenges to address when preparing a timely and effective design for the Exploration Flight Test (EFT-1), while still looking ahead and providing software extensibility for future exploration missions. The primary onboard measurements in a Near-Earth or Mid-Earth environment consist of GPS pseudo-range and delta-range, but for future exploration missions the use of star-tracker and optical navigation sources needs to be considered. Discussions are presented for state size and composition, processing techniques, and consider states. A presentation is given for the processing technique using the computationally stable and robust UDU formulation with an Agee-Turner Rank-One update. This allows for computational savings when dealing with many parameters which are modeled as slowly varying Gauss-Markov processes. Preliminary analysis shows up to a 50% reduction in computation versus a more traditional formulation. Several state elements are discussed and evaluated, including position, velocity, attitude, clock bias/drift, and GPS measurement biases, in addition to bias, scale factor, misalignment, and non-orthogonalities of the accelerometers and gyroscopes. Another consideration is the initialization of the EKF in various scenarios. Scenarios such as single-event upset, ground command, and cold start are discussed, as are strategies for whole and partial state updates as well as covariance considerations. Strategies are given for dealing with latent measurements and high-rate propagation using a multi-rate architecture. The details of the rate groups and the data flow between the elements are discussed and evaluated.
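
    For context, the UDU formulation mentioned here factors the state covariance as shown below, and the Agee-Turner update absorbs a rank-one modification while preserving the factors. This is the generic textbook form, not code from the Orion flight software:

```latex
P = U D U^{\top}, \qquad
P' = P + c\, a a^{\top} = U' D' U'^{\top}
```

    with U unit upper triangular and D diagonal, so the filter never forms, or risks de-symmetrizing, the full covariance matrix; this is the property exploited for the computational savings quoted above.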

  20. Absolute determination of local tropospheric OH concentrations

    NASA Technical Reports Server (NTRS)

    Armerding, Wolfgang; Comes, Franz-Josef

    1994-01-01

    Long path absorption (LPA) according to the Lambert-Beer law is a method to determine absolute concentrations of trace gases such as tropospheric OH. We have developed a LPA instrument based on rapid tuning of the light source, a frequency-doubled dye laser. The laser is tuned across two or three OH absorption features around 308 nm with a scanning speed of 0.07 cm^-1/microsecond and a repetition rate of 1.3 kHz. This high scanning speed greatly reduces the fluctuation of the light intensity caused by the atmosphere. To obtain the required high sensitivity, the laser output power is additionally made constant and stabilized by an electro-optical modulator. The present sensitivity is of the order of a few times 10^5 OH per cm^3 for an acquisition time of a minute and an absorption path length of only 1200 meters, so that a folding of the optical path in a multireflection cell was possible, leading to a lateral dimension of the cell of a few meters. This allows local measurements to be made. Tropospheric measurements were carried out in 1991, resulting in the determination of OH diurnal variation on specific days in late summer. Comparisons with model calculations have been made. Interferences are mainly due to SO2 absorption. The problem of OH self-generation in the multireflection cell is of minor extent, as could be shown using different experimental methods. The minimum-maximum signal-to-noise ratio is about 8 x 10^-4 for a single scan. Due to the small size of the absorption cell, the realization of an open-air laboratory is possible, in which, by use of an additional UV light source or additional fluxes of trace gases, the chemistry can be changed under controlled conditions, allowing kinetic studies of tropospheric photochemistry to be made in open air.
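
    The underlying relation is the standard Lambert-Beer inversion, written generically below; the instrument-specific corrections described in the abstract sit on top of it:

```latex
[\mathrm{OH}] = \frac{1}{\sigma(\lambda)\, L}\, \ln \frac{I_0(\lambda)}{I(\lambda)}
```

    where σ(λ) is the OH absorption cross section near 308 nm, L the absorption path length (1200 m here after folding), and I_0, I the intensities without and with absorption.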

  1. Measuring the coupled risks: A copula-based CVaR model

    NASA Astrophysics Data System (ADS)

    He, Xubiao; Gong, Pu

    2009-01-01

    Integrated risk management for financial institutions requires an approach for aggregating risk types (such as market and credit) whose distributional shapes vary considerably. Financial institutions often ignore the coupling influence of risks and thus underestimate financial risk. We constructed a copula-based Conditional Value-at-Risk (CVaR) model for market and credit risks. This technique allows us to incorporate realistic marginal distributions that capture essential empirical features of these risks, such as skewness and fat tails, while allowing for a rich dependence structure. Finally, a numerical simulation method is used to implement the model. Our results indicate that the coupled risks for a listed company's stock may be underestimated if credit risk is ignored, especially for companies with poor credit quality.
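
    A minimal sketch of the aggregation idea: a Gaussian copula couples a fat-tailed market-loss marginal with a skewed credit-loss marginal, and CVaR is read off the simulated total. The copula family, marginals, correlation and confidence level are illustrative assumptions; the paper's choices may differ.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
rho, alpha, n = 0.4, 0.99, 200_000

# Gaussian copula: correlated normals mapped to uniform marginals.
z = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=n)
u = stats.norm.cdf(z)

market = stats.t(df=4).ppf(u[:, 0])          # fat-tailed market losses
credit = stats.lognorm(s=0.8).ppf(u[:, 1])   # skewed credit losses
total = market + credit

var = np.quantile(total, alpha)
cvar = total[total >= var].mean()            # expected loss beyond VaR
print(f"VaR({alpha:.0%}) = {var:.2f}, CVaR({alpha:.0%}) = {cvar:.2f}")
```

    Simulating the marginals independently instead (ignoring the dependence) thins the tail of `total`, which is the underestimation effect the abstract describes.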

  2. Modeling number of bacteria per food unit in comparison to bacterial concentration in quantitative risk assessment: impact on risk estimates.

    PubMed

    Pouillot, Régis; Chen, Yuhuan; Hoelzer, Karin

    2015-02-01

    When developing quantitative risk assessment models, a fundamental consideration for risk assessors is to decide whether to evaluate changes in bacterial levels in terms of concentrations or in terms of bacterial numbers. Although modeling bacteria in terms of integer numbers may be regarded as a more intuitive and rigorous choice, modeling bacterial concentrations is more popular as it is generally less mathematically complex. We tested three different modeling approaches in a simulation study. The first approach considered bacterial concentrations; the second considered the number of bacteria in contaminated units, and the third considered the expected number of bacteria in contaminated units. Simulation results indicate that modeling concentrations tends to overestimate risk compared to modeling the number of bacteria. A sensitivity analysis using a regression tree suggests that processes which include drastic scenarios consisting of combinations of large bacterial inactivation followed by large bacterial growth frequently lead to a >10-fold overestimation of the average risk when modeling concentrations as opposed to bacterial numbers. Alternatively, the approach of modeling the expected number of bacteria in positive units generates results similar to the second method and is easier to use, thus potentially representing a promising compromise.
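
    The contrast between the first two approaches can be sketched as follows for a drastic inactivation-then-growth scenario; all parameters, including the dose-response constant, are invented. The expected doses of the two approaches agree, but the concentration approach spreads fractional survivors over all units and so overstates the risk.

```python
import numpy as np

rng = np.random.default_rng(4)
n_units, volume = 100_000, 25.0                      # servings, g
c0 = 10 ** rng.normal(0.0, 0.5, n_units) / volume    # initial CFU/g
log_reduction, log_growth, r = 4.0, 5.0, 0.1         # assumed scenario

# (1) concentration-based: deterministic scaling of concentrations
dose_conc = c0 * volume * 10 ** (log_growth - log_reduction)

# (2) number-based: integer survivors; most units end up sterile
n0 = rng.poisson(c0 * volume)
survivors = rng.binomial(n0, 10 ** (-log_reduction))
dose_numb = survivors * 10 ** log_growth             # growth needs survivors

for name, dose in [("concentration", dose_conc), ("number", dose_numb)]:
    p_ill = (1 - np.exp(-r * dose)).mean()           # exponential dose-response
    print(f"{name:>13}: mean dose {dose.mean():8.2f}, P(illness) {p_ill:.2e}")
```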

  3. Absolute Stability And Hyperstability In Hilbert Space

    NASA Technical Reports Server (NTRS)

    Wen, John Ting-Yung

    1989-01-01

    Theorems on the stability of feedback control systems are proved. This paper presents recent developments regarding theorems of absolute stability and hyperstability for a feedforward-and-feedback control system. The theorems are applied in the analysis of nonlinear, adaptive, and robust control. They are extended to provide sufficient conditions for stability in a system comprising a nonlinear feedback subsystem and a linear time-invariant (LTI) feedforward subsystem whose state space is a Hilbert space and whose input and output spaces are finite-dimensional. (In the case of absolute stability, the feedback subsystem is memoryless and possibly time-varying; for hyperstability, the feedback subsystem is a dynamical system.)

  4. Geographical modeling of exposure risk to cyanobacteria for epidemiological purposes.

    PubMed

    Serrano, Tania; Dupas, Rémi; Upegui, Erika; Buscail, Camille; Grimaldi, Catherine; Viel, Jean François

    2015-08-01

    The cyanobacteria-derived neurotoxin β-methylamino-L-alanine (BMAA) represents a plausible environmental trigger for amyotrophic lateral sclerosis (ALS), a debilitating and fatal neuromuscular disease. With the eutrophication of water bodies, cyanobacterial blooms and their toxins are becoming increasingly prevalent in France, especially in the Brittany region. Cyanobacteria are monitored at only a few recreational sites, preventing an estimation of exposure of the human population. By contrast, phosphorus, a limiting nutrient for cyanobacterial growth and thus considered a good proxy for cyanobacteria exposure, is monitored in many but not all surface water bodies. Our goal was to develop a geographic exposure indicator that could be used in epidemiological research. We considered the total phosphorus (TP) concentration (mg/L) of samples collected between October 2007 and September 2012 at 179 monitoring stations distributed throughout the Brittany region. Using readily available spatial data, we computed environmental descriptors at the watershed level with a Geographic Information System. Then, these descriptors were introduced into a backward stepwise linear regression model to predict the median TP concentration in unmonitored surface water bodies. TP concentrations in surface water follow an increasing gradient from West to East and inland to coast. The empirical concentration model included five predictor variables with a fair coefficient of determination (R^2 = 0.51). The specific total runoff and the watershed slope correlated negatively with the TP concentrations (p = 0.01 and p < 10^-9, respectively), whereas positive associations were found for the proportion of built-up area, the upstream presence of sewage treatment plants, and the algae volume as indicated by the Landsat red/green reflectance ratio (p < 0.01, p < 10^-6 and p < 0.01, respectively). Complementing the monitoring networks, this geographical modeling can help estimate TP concentrations

  5. Risk-Based Causal Modeling of Airborne Loss of Separation

    NASA Technical Reports Server (NTRS)

    Geuther, Steven C.; Shih, Ann T.

    2015-01-01

    Maintaining safe separation between aircraft remains one of the key aviation challenges as the Next Generation Air Transportation System (NextGen) emerges. The goals of the NextGen are to increase capacity and reduce flight delays to meet the aviation demand growth through the 2025 time frame while maintaining safety and efficiency. The envisioned NextGen is expected to enable high air traffic density, diverse fleet operations in the airspace, and a decrease in separation distance. All of these factors contribute to the potential for Loss of Separation (LOS) between aircraft. LOS is a precursor to a potential mid-air collision (MAC). The NASA Airspace Operations and Safety Program (AOSP) is committed to developing aircraft separation assurance concepts and technologies to mitigate LOS instances, thereby preventing MAC. This paper focuses on the analysis of causal and contributing factors of LOS accidents and incidents leading to MAC occurrences. Mid-air collisions among large commercial aircraft have been rare over the past decade; therefore, the LOS instances in this study are for general aviation using visual flight rules in the years 2000-2010. The study includes the investigation of causal paths leading to LOS, and the development of the Airborne Loss of Separation Analysis Model (ALOSAM) using Bayesian Belief Networks (BBN) to capture the multi-dependent relations of causal factors. The ALOSAM is currently a qualitative model, although further development could lead to a quantitative model. ALOSAM could then be used to perform impact analysis of concepts and technologies in the AOSP portfolio on the reduction of LOS risk.

  6. Personal exposure meets risk assessment: a comparison of measured and modeled exposures and risks in an urban community.

    PubMed Central

    Payne-Sturges, Devon C; Burke, Thomas A; Breysse, Patrick; Diener-West, Marie; Buckley, Timothy J

    2004-01-01

    Human exposure research has consistently shown that, for most volatile organic compounds (VOCs), personal exposures are vastly different from outdoor air concentrations. Therefore, risk estimates based on ambient measurements may over- or underestimate risk, leading to ineffective or inefficient management strategies. In the present study we examine the extent of exposure misclassification, and its impact on risk, for exposure estimated by the U.S. Environmental Protection Agency (U.S. EPA) Assessment System for Population Exposure Nationwide (ASPEN) model relative to monitoring results from a community-based exposure assessment conducted in Baltimore, Maryland (USA). This study is the first direct comparison of the ASPEN model (as used by the U.S. EPA for the Cumulative Exposure Project and subsequently the National-Scale Air Toxics Assessment) and human exposure data to estimate health risks. A random sampling strategy was used to recruit 33 nonsmoking adult community residents. Passive air sampling badges were used to assess 3-day time-weighted-average personal exposure as well as outdoor and indoor residential concentrations of VOCs for each study participant. In general, personal exposures were greater than indoor VOC concentrations, which were greater than outdoor VOC concentrations. Public health risks due to actual personal exposures were estimated. In comparing measured personal exposures and indoor and outdoor VOC concentrations with ASPEN model estimates for ambient concentrations, our data suggest that ASPEN was reasonably accurate as a surrogate for personal exposures (measured exposures of community residents) for VOCs emitted primarily from mobile sources or VOCs that occur as global "background" pollutants with no indoor source contributions. Otherwise, the ASPEN model estimates were generally lower than measured personal exposures and the estimated health risks. ASPEN's lower exposures resulted in proportional underestimation of cumulative

  7. Absolute GNSS Antenna Calibration at the National Geodetic Survey

    NASA Astrophysics Data System (ADS)

    Mader, G.; Bilich, A.; Geoghegan, C.

    2012-04-01

    Geodetic GNSS applications routinely demand millimeter precision and extremely high levels of accuracy. To achieve these accuracies, measurement and instrument biases at the centimeter to millimeter level must be understood. One of these biases is the antenna phase center, the apparent point of signal reception for a GNSS antenna. It has been well established that phase center patterns differ between antenna models and manufacturers; additional research suggests that the addition of a radome or the choice of antenna mount can significantly alter those a priori phase center patterns. For the more demanding GNSS positioning applications and especially in cases of mixed-antenna networks, it is all the more important to know antenna phase center variations as a function of both elevation and azimuth in the antenna reference frame and incorporate these models into analysis software. To help meet the needs of the high-precision GNSS community, the National Geodetic Survey (NGS) now operates an absolute antenna calibration facility. Located in Corbin, Virginia, this facility uses field measurements and actual GNSS satellite signals to quantitatively determine the carrier phase advance/delay introduced by the antenna element. The NGS facility was built to serve traditional NGS constituents such as the surveying and geodesy communities, however calibration services are open and available to all GNSS users as the calibration schedule permits. All phase center patterns computed by this facility will be publicly available and disseminated in both the ANTEX and NGS formats. We describe the NGS calibration facility, and discuss the observation models and strategy currently used to generate NGS absolute calibrations. We demonstrate that NGS absolute phase center variation (PCV) patterns are consistent with published values determined by other absolute antenna calibration facilities, and outline future planned refinements to the system.

  8. Comparative application of different risk assessment models and implications on resulting remediation options.

    PubMed

    Capodaglio, Andrea; Callegari, Arianna; Torretta, Vincenzo

    2014-01-01

    The issue of contaminated soils and their productive recovery is a controversial environmental and economic problem with important consequences for its social, public health and sustainability aspects. The sheer number and variety of polluted sites are so large, and the definition of priorities for their remediation so site-dependent, that proper characterization and final environmental quality goals assume strategic importance. Possible approaches to site-specific assessment and to site priority ranking are, respectively, absolute and comparative analysis procedures. An important open issue is the need to consider not only the potential risk to public health, but also the best possible financial return on the investments for remediation, especially when they are made with public money. In this paper, different risk assessment approaches for contaminated sites are considered and compared, and their applicability in support of sustainable policies is discussed using a case study.

  9. Developing Risk Prediction