Science.gov

Sample records for absolute risk models

  1. Sharing risk management: an implementation model for cardiovascular absolute risk assessment and management in Australian general practice

    PubMed Central

    Wan, Q; Harris, M F; Zwar, N; Vagholkar, S

    2008-01-01

    Purpose Despite considerable work in developing and validating cardiovascular absolute risk (CVAR) algorithms, there has been less work on models for their implementation in assessment and management. The aim of our study was to develop a model for a joint approach to its implementation based on an exploration of the views of patients, general practitioners (GPs) and key informants (KIs). Methods We conducted six focus groups (three with GPs and three with patients) and nine KI interviews in Sydney. Thematic analysis, with comparison across groups, was used to highlight similarities and differences in participants' perspectives. Results Conducting CVAR assessment was seen as more acceptable for regular patients than for new patients, whose interest GPs had to attract and with whom they had to build rapport before assessing risk at a subsequent visit. GPs' interest and patients' positive attitude towards managing risk were important in implementing CVAR. Long consultations, good communication skills and a trusting relationship helped overcome barriers during the process. All the participants supported engaging patients to self-assess their risk before the consultation and sharing decision making with GPs during the consultation. Involving practice staff to help with patient self-assessment, follow-up and referral would assist in implementing CVAR assessment and management, but GPs, patients and practices may need more support for this to occur. Conclusions Multiple strategies are required to promote better use of CVAR in the extremely busy working environment of Australian general practice. An implementation model has been developed based on our findings and the Chronic Care Model. Further research needs to investigate the effectiveness of the proposed model. PMID:18479283

  2. [Estimation of absolute risk for fracture].

    PubMed

    Fujiwara, Saeko

    2009-03-01

    Osteoporosis treatment aims to prevent fractures and maintain the QOL of the elderly. However, persons at high risk of future fracture cannot be effectively identified on the basis of bone mineral density (BMD) alone, although BMD is used as a diagnostic criterion. Therefore, the WHO recommended that the absolute risk of fracture (10-year probability of fracture) be evaluated for each individual and used as an index for the intervention threshold. The 10-year probability of fracture is calculated from age, sex, BMD at the femoral neck (body mass index if BMD is not available), history of previous fractures, parental hip fracture history, smoking, steroid use, rheumatoid arthritis, secondary osteoporosis and alcohol consumption. The WHO announced the release of a calculation tool (FRAX: WHO Fracture Risk Assessment Tool) in February of this year. Fractures could be prevented more effectively if each country, based on its own medical circumstances, established an absolute fracture risk threshold for starting medical treatment, so that persons at high risk of fracture can be identified and treated accordingly.

  3. Assessing absolute changes in breast cancer risk due to modifiable risk factors.

    PubMed

    Quante, Anne S; Herz, Julia; Whittemore, Alice S; Fischer, Christine; Strauch, Konstantin; Terry, Mary Beth

    2015-07-01

    Clinical risk assessment involves absolute risk measures, but information on modifying risk and preventing cancer is often communicated in relative terms. To illustrate the potential impact of risk factor modification in model-based risk assessment, we evaluated the performance of the IBIS Breast Cancer Risk Evaluation Tool, with and without current body mass index (BMI), for predicting future breast cancer occurrence in a prospective cohort of 665 postmenopausal women. Overall, IBIS's accuracy (overall agreement between observed and assigned risks) and discrimination (AUC; concordance between assigned risks and outcomes) were similar with and without the BMI information. However, in women with BMI > 25 kg/m², adding BMI information improved discrimination (AUC = 63.9% and 61.4% with and without BMI, P < 0.001). The model-assigned 10-year risk difference for a woman with high (27 kg/m²) versus low (21 kg/m²) BMI was only 0.3% for a woman with neither affected first-degree relatives nor a BRCA1 mutation, compared to 4.5% for a mutation carrier with three such relatives. This contrast illustrates the value of using information on modifiable risk factors in risk assessment and of sharing with patients their absolute risks with and without modifiable risk factors.

  4. Methodology to predict long-term cancer survival from short-term data using Tobacco Cancer Risk and Absolute Cancer Cure models

    NASA Astrophysics Data System (ADS)

    Mould, R. F.; Lederman, M.; Tai, P.; Wong, J. K. M.

    2002-11-01

    Three parametric statistical models have been fully validated for cancer of the larynx for the prediction of long-term 15, 20 and 25 year cancer-specific survival fractions when short-term follow-up data were available for just 1-2 years after the end of treatment of the last patient. In all groups of cases the treatment period was only 5 years. Three disease stage groups were studied: T1N0, T2N0 and T3N0. The models are the Standard Lognormal (SLN), first proposed by Boag (1949 J. R. Stat. Soc. Series B 11 15-53) but only ever fully validated for cancer of the cervix, Mould and Boag (1975 Br. J. Cancer 32 529-50), and two new models which have been termed Tobacco Cancer Risk (TCR) and Absolute Cancer Cure (ACC). In each, the frequency distribution of survival times of a defined group of cancer deaths is lognormally distributed: larynx only (SLN), larynx and lung (TCR) and all cancers (ACC). Each model has three unknown parameters, but it was possible to estimate a value for the lognormal parameter S a priori. By reduction to two unknown parameters the model stability has been improved. The material used to validate the methodology consisted of case histories of 965 patients, all treated during the period 1944-1968 by Dr Manuel Lederman of the Royal Marsden Hospital, London, with follow-up to 1988. This provided a follow-up range of 20-44 years and enabled predicted long-term survival fractions to be compared with the actual survival fractions, calculated by the Kaplan and Meier (1958 J. Am. Stat. Assoc. 53 457-82) method. The TCR and ACC models are better than the SLN model and, for a maximum short-term follow-up of 6 years, the 20 and 25 year survival fractions could be predicted; the numbers of follow-up years saved are therefore 14 years and 19 years, respectively. Clinical trial results using the TCR and ACC models can thus be analysed much earlier than currently possible. Absolute cure from cancer was also studied, using not only the prediction models which
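
    The abstract describes, but does not reproduce, the model equations. As a rough sketch of the general shape of a Boag-type lognormal cure model (a cured fraction plus lognormally distributed survival times for the uncured, with the lognormal parameter S fixed a priori, as the abstract notes), the snippet below evaluates a predicted long-term survival fraction; the function name and all parameter values are hypothetical and are not taken from the paper.

    ```python
    import math
    from scipy.stats import norm

    def lognormal_cure_survival(t_years, cured_fraction, mu, sigma):
        """Cancer-specific survival fraction at t_years under a Boag-type
        lognormal cure model: a fraction 'cured_fraction' never dies of the
        cancer, while survival times of the uncured are lognormal(mu, sigma)."""
        z = (math.log(t_years) - mu) / sigma
        return cured_fraction + (1.0 - cured_fraction) * norm.sf(z)

    # Hypothetical parameter values, for illustration only (not from the paper);
    # sigma plays the role of the lognormal parameter S fixed a priori.
    cured_fraction, mu, sigma = 0.55, 1.1, 0.35
    for years in (5, 10, 15, 20, 25):
        frac = lognormal_cure_survival(years, cured_fraction, mu, sigma)
        print(f"{years:>2}-year predicted survival fraction: {frac:.3f}")
    ```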

  5. Measured and modelled absolute gravity in Greenland

    NASA Astrophysics Data System (ADS)

    Nielsen, E.; Forsberg, R.; Strykowski, G.

    2012-12-01

    Present-day changes in the ice volume in glaciated areas like Greenland will change the load on the Earth, and the lithosphere will respond elastically to this change. The Earth also responds to changes in the ice volume over a millennial time scale. This response is due to the viscous properties of the mantle and is known as Glacial Isostatic Adjustment (GIA). Both signals are present in GPS and absolute gravity (AG) measurements, and they introduce an uncertainty in mass balance estimates calculated from these data types. It is possible to separate the two signals if both gravity and Global Positioning System (GPS) time series are available. DTU Space acquired an A10 absolute gravimeter in 2008. One purpose of this instrument is to establish AG time series in Greenland, and the first measurements were conducted in 2009. Since then, 18 different Greenland GPS Network (GNET) stations have been visited, six of them more than once. The gravity signal consists of three components: the elastic signal, the viscous signal and the direct attraction from the ice masses. All of these signals can be modelled using various techniques. The viscous signal is modelled by solving the Sea Level Equation with an appropriate ice history and Earth model; the free code SELEN is used for this. The elastic signal is modelled as a convolution of the elastic Green's function for gravity with a model of present-day ice mass changes. The direct attraction is the Newtonian attraction of the ice masses and is calculated as such. Here we will present the preliminary results of the AG measurements in Greenland. We will also present modelled estimates of the direct attraction and of the elastic and viscous signals.

  6. The importance of calculating absolute rather than relative fracture risk.

    PubMed

    Tucker, Graeme; Metcalfe, Andrew; Pearce, Charles; Need, Allan G; Dick, Ian M; Prince, Richard L; Nordin, B E Christopher

    2007-12-01

    The relation between fracture risk and bone mineral density (BMD) is commonly expressed as a multiplicative factor which is said to represent the increase in risk for each standard deviation fall in BMD. This practice assumes that risk increases multiplicatively with each unit fall in bone density, which is not correct. Although odds increase multiplicatively, absolute risk, which lies between 0 and 1, cannot do so, though it can be derived from odds by the term Odds/(1+Odds). This concept is illustrated in a prospective study of 1098 women over age 69 followed for 6 years in a calcium trial in which hip BMD was measured in the second year. 304 women (27.6%) had prevalent fractures and 198 (18.1%) incident fractures, with a significant association between them (P < 0.005). Age-adjusted hip BMD and T-score were significantly lower in those with prevalent fractures than in those without (P < 0.003) and significantly lower in those with incident fractures than in those without (P < 0.001). When the data were analysed by univariate logistic regression, the fracture odds at zero T-score were 0.130 and the rise in odds for each unit fall in hip T-score was 1.55. When these odds were converted to risks, there was a progressive divergence between odds and risk at T-scores below zero. Multiple logistic regression yielded significant odds ratios of 1.47 for each 5-year increase in age, 1.47 for prevalent fracture and 1.49 for each unit fall in hip T-score. Calcium therapy was not significant. Poisson regression, logistic regression and Cox's proportional hazards yielded very similar outcomes when converted into absolute risks. A nomogram was constructed to enable clinicians to estimate the approximate 6-year fracture risk from hip T-score, age and prevalent fracture, which can probably be applied (with appropriate correction) to men as well as to women. We conclude that multiplicative factors can be applied to odds but not to risk and that multipliers of risk tend to overstate the
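
    As a worked illustration of the odds-to-risk point made above, the sketch below applies the reported univariate estimates (baseline 6-year fracture odds of 0.130 at a hip T-score of zero, odds multiplied by 1.55 per unit fall in T-score) and converts odds to absolute risk via Odds/(1+Odds). It omits the age and prevalent-fracture terms and is illustrative only; it is not the authors' nomogram.

    ```python
    def fracture_odds(t_score, baseline_odds=0.130, odds_per_sd=1.55):
        """Six-year fracture odds from hip T-score, using the univariate
        logistic estimates reported in the abstract (illustrative only)."""
        return baseline_odds * odds_per_sd ** (-t_score)  # odds rise as T-score falls

    def odds_to_risk(odds):
        """Absolute risk lies between 0 and 1: risk = odds / (1 + odds)."""
        return odds / (1.0 + odds)

    for t in (0, -1, -2, -3, -4):
        o = fracture_odds(t)
        print(f"T-score {t:>2}: odds = {o:.3f}, absolute risk = {odds_to_risk(o):.3f}")
    ```

    Printing both columns shows the progressive divergence between odds and risk as the T-score falls, which is the abstract's central caution about multiplying risks.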

  7. Realized volatility and absolute return volatility: a comparison indicating market risk.

    PubMed

    Zheng, Zeyu; Qiao, Zhi; Takaishi, Tetsuya; Stanley, H Eugene; Li, Baowen

    2014-01-01

    Measuring volatility in financial markets is a primary challenge in the theory and practice of risk management and is essential when developing investment strategies. Although the vast literature on the topic describes many different models, two nonparametric measurements have emerged and received wide use over the past decade: realized volatility and absolute return volatility. The former is strongly favored in the financial sector and the latter by econophysicists. We examine the memory and clustering features of these two methods and find that both enable strong predictions. We compare the two in detail and find that although realized volatility has a better short-term effect that allows predictions of near-future market behavior, absolute return volatility is easier to calculate and, as a risk indicator, has approximately the same sensitivity as realized volatility. Our detailed empirical analysis yields valuable guidelines for both researchers and market participants because it provides a significantly clearer comparison of the strengths and weaknesses of the two methods.
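
    For readers unfamiliar with the two estimators, the snippet below computes both from a synthetic return series under the usual textbook definitions (realized volatility as the square root of the summed squared intraday returns for each day; absolute return volatility as the absolute daily return). These are standard definitions and may differ in detail from the exact estimators used in the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic example: 250 trading days of 5-minute log returns (78 per day).
    intraday = rng.normal(0.0, 0.001, size=(250, 78))

    # Realized volatility: sqrt of the sum of squared intraday returns each day.
    realized_vol = np.sqrt((intraday ** 2).sum(axis=1))

    # Absolute return volatility: absolute value of the daily (close-to-close) return.
    daily_return = intraday.sum(axis=1)
    abs_return_vol = np.abs(daily_return)

    print("mean realized volatility:       ", realized_vol.mean())
    print("mean absolute return volatility:", abs_return_vol.mean())
    ```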

  8. One idea of portfolio risk control for absolute return strategy risk adjustments by signals from correlation behavior

    NASA Astrophysics Data System (ADS)

    Nishiyama, N.

    2001-12-01

    Absolute return strategies provided by fund of funds (FOFs) investment schemes are a focus of the Japanese financial community. FOFs investment mainly consists of hedge fund investment, and it has two major characteristics: low correlation with benchmark indices and little impact from external changes in the environment, while maximizing return. According to the historical track record of surviving hedge funds, they maintain a stable high return and low risk. However, one must keep in mind that low risk is not equal to risk free. The failure of Long-Term Capital Management (LTCM) in the summer of 1998 was a symbolic example. The summer of 1998 exposed a certain limitation of traditional value at risk (VaR) and the possibility that traditional VaR could be ineffective against nonlinear types of fluctuation in the market. In this paper, I try to bring self-organized criticality (SOC) into portfolio risk control. SOC is well known as a model of decay in the natural world. I analyzed nonlinear fluctuations in the market as SOC and applied SOC to capture complicated market movements, using the threshold point of SOC and risk adjustments by scenario correlation as implicit signals. The threshold becomes the control parameter of risk exposure, used to set a downside floor and to forecast extreme nonlinear fluctuations under a certain probability. Simulation results show a synergy effect between SOC and absolute return strategy in portfolio risk control.

  9. Biosafety Risk Assessment Model

    SciTech Connect

    Daniel Bowen, Susan Caskey

    2011-05-27

    Software tool based on a structured methodology for conducting laboratory biosafety risk assessments by biosafety experts. The software is based upon an MCDA (multi-criteria decision analysis) scheme and uses peer-reviewed criteria and weights. It was developed on Microsoft's .NET Framework. The methodology defines the likelihood and consequence of a laboratory exposure for thirteen unique scenarios and provides numerical relative risks for each of the thirteen scenarios. The software produces 2D graphs reflecting the relative risk and a sensitivity analysis which highlights the overall importance of each factor. The software works as a set of questions with absolute scales and uses a weighted additive model to calculate the likelihood and consequence.
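
    The record describes a weighted additive model over questions scored on absolute scales. The sketch below shows the generic form of such a calculation; the factor names, scores and weights are entirely hypothetical and do not reproduce the tool's peer-reviewed criteria or weights.

    ```python
    # Hypothetical factors and weights for illustration only; the actual criteria
    # and peer-reviewed weights of the tool are not reproduced here.
    likelihood_weights = {"agent_stability": 0.4,
                          "procedure_aerosol_potential": 0.35,
                          "containment_practices": 0.25}
    consequence_weights = {"infectious_dose": 0.5,
                           "availability_of_treatment": 0.3,
                           "community_transmissibility": 0.2}

    def weighted_additive_score(scores, weights):
        """Weighted additive model: each factor is scored on an absolute 0-1
        scale and the scores are combined as a weighted sum."""
        return sum(weights[k] * scores[k] for k in weights)

    likelihood = weighted_additive_score(
        {"agent_stability": 0.7, "procedure_aerosol_potential": 0.9,
         "containment_practices": 0.3}, likelihood_weights)
    consequence = weighted_additive_score(
        {"infectious_dose": 0.8, "availability_of_treatment": 0.4,
         "community_transmissibility": 0.6}, consequence_weights)

    print(f"relative likelihood  = {likelihood:.2f}")
    print(f"relative consequence = {consequence:.2f}")
    ```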

  10. Realized Volatility and Absolute Return Volatility: A Comparison Indicating Market Risk

    PubMed Central

    Takaishi, Tetsuya; Stanley, H. Eugene; Li, Baowen

    2014-01-01

    Measuring volatility in financial markets is a primary challenge in the theory and practice of risk management and is essential when developing investment strategies. Although the vast literature on the topic describes many different models, two nonparametric measurements have emerged and received wide use over the past decade: realized volatility and absolute return volatility. The former is strongly favored in the financial sector and the latter by econophysicists. We examine the memory and clustering features of these two methods and find that both enable strong predictions. We compare the two in detail and find that although realized volatility has a better short-term effect that allows predictions of near-future market behavior, absolute return volatility is easier to calculate and, as a risk indicator, has approximately the same sensitivity as realized volatility. Our detailed empirical analysis yields valuable guidelines for both researchers and market participants because it provides a significantly clearer comparison of the strengths and weaknesses of the two methods. PMID:25054439

  11. Lunar eclipse photometry: absolute luminance measurements and modeling.

    PubMed

    Hernitschek, Nina; Schmidt, Elmar; Vollmer, Michael

    2008-12-01

    The Moon's time-dependent luminance was determined during the 9 February 1990 and 3 March 2007 total lunar eclipses by using calibrated, industry standard photometers. After the results were corrected to unit air mass and to standard distances for both Moon and Sun, an absolute calibration was accomplished by using the Sun's known luminance and a pre-eclipse lunar albedo of approximately 13.5%. The measured minimum level of brightness in the total phase of both eclipses was relatively high, namely -3.32 m(vis) and -1.7 m(vis), which hints at the absence of pronounced stratospheric aerosol. The light curves were modeled in such a way as to let the Moon move through an artificial Earth shadow composed of a multitude of disk and ring zones, containing a relative luminance data set from an atmospheric radiative transfer calculation.

  12. Greater absolute risk for all subtypes of breast cancer in the US than Malaysia.

    PubMed

    Horne, Hisani N; Beena Devi, C R; Sung, Hyuna; Tang, Tieng Swee; Rosenberg, Philip S; Hewitt, Stephen M; Sherman, Mark E; Anderson, William F; Yang, Xiaohong R

    2015-01-01

    Hormone receptor (HR) negative breast cancers are relatively more common in low-risk than high-risk countries and/or populations. However, the absolute variations between these different populations are not well established given the limited number of cancer registries with incidence rate data by breast cancer subtype. We, therefore, used two unique population-based resources with molecular data to compare incidence rates for the 'intrinsic' breast cancer subtypes between a low-risk Asian population in Malaysia and a high-risk non-Hispanic white population in the National Cancer Institute's Surveillance, Epidemiology, and End Results 18 registries database (SEER 18). The intrinsic breast cancer subtypes were recapitulated with the joint expression of the HRs (estrogen receptor and progesterone receptor) and human epidermal growth factor receptor-2 (HER2). Invasive breast cancer incidence rates overall were fivefold greater in SEER 18 than in Malaysia. The majority of breast cancers were HR-positive in SEER 18 and HR-negative in Malaysia. Notwithstanding the greater relative distribution of HR-negative cancers in Malaysia, there was a greater absolute risk for all subtypes in SEER 18; incidence rates were nearly 7-fold higher for HR-positive and 2-fold higher for HR-negative cancers in SEER 18. Despite the well-established relative breast cancer differences between low-risk and high-risk countries and/or populations, there was a greater absolute risk for HR-positive and HR-negative subtypes in the US than in Malaysia. Additional analytical studies are sorely needed to determine the factors responsible for the elevated risk of all subtypes of breast cancer in high-risk countries like the United States.

  13. Male Breast Cancer Incidence and Mortality Risk in the Japanese Atomic Bomb Survivors – Differences in Excess Relative and Absolute Risk from Female Breast Cancer

    PubMed Central

    Little, Mark P.; McElvenny, Damien M.

    2016-01-01

    Background: There are well-known associations of ionizing radiation with female breast cancer, and emerging evidence also for male breast cancer. In the United Kingdom, female breast cancer following occupational radiation exposure is among that set of cancers eligible for state compensation and consideration is currently being given to an extension to include male breast cancer. Objectives: We compare radiation-associated excess relative and absolute risks of male and female breast cancers. Methods: Breast cancer incidence and mortality data in the Japanese atomic-bomb survivors were analyzed using relative and absolute risk models via Poisson regression. Results: We observed significant (p ≤ 0.01) dose-related excess risk for male breast cancer incidence and mortality. For incidence and mortality data, there are elevations by factors of approximately 15 and 5, respectively, of relative risk for male compared with female breast cancer incidence, the former borderline significant (p = 0.050). In contrast, for incidence and mortality data, there are elevations by factors of approximately 20 and 10, respectively, of female absolute risk compared with male, both statistically significant (p < 0.001). There are no indications of differences between the sexes in age/time-since-exposure/age-at-exposure modifications to the relative or absolute excess risk. The probability of causation of male breast cancer following radiation exposure exceeds by at least a factor of 5 that of many other malignancies. Conclusions: There is evidence of much higher radiation-associated relative risk for male than for female breast cancer, although absolute excess risks for males are much less than for females. However, the small number of male cases and deaths suggests a degree of caution in interpretation of this finding. Citation: Little MP, McElvenny DM. 2017. Male breast cancer incidence and mortality risk in the Japanese atomic bomb survivors – differences in excess relative and

  14. The Fremantle Primary Prevention Study: a multicentre randomised trial of absolute cardiovascular risk reduction

    PubMed Central

    Brett, Tom; Arnold-Reed, Diane; Phan, Cam; Cadden, Frances; Walker, William; Manea-Walley, Wendy; Mora, Noelene; Young, Julie; Bulsara, Max

    2011-01-01

    Background Cardiovascular disease (CVD) is the leading cause of global mortality. Risk factor management in clinical practice often relies on relative risk modification rather than the more appropriate absolute risk assessment. Aim To determine whether patients receiving more frequently designated GP visits had increased benefit in terms of their absolute CVD risk assessment, as compared with patients in receipt of their usual GP care. Design and setting Prospective, open, pragmatic, block-randomised study with a 1:1 group allocation ratio in three Western Australian general practices. Method A convenience sample (n = 1200) of patients aged 40–80 years was randomised to 3-monthly GP visits (five in total for the intensive group) or usual GP care (two in total for the opportunistic group), with 12 months' follow-up. The main outcome was the absolute CVD risk score based on the New Zealand Cardiovascular Risk Calculator. Other outcome measures were weight, height, waist circumference, blood pressure, and fasting blood lipids and glucose. Results There were 600 patients per group at baseline. At the 12-month analysis there were 543 in the intensive group and 569 in the opportunistic group. Mean (standard deviation [SD]) absolute CVD risk reduced significantly between baseline and 12 months in the intensive group (6.28% [5.11] to 6.10% [4.94]) but not in the opportunistic group (6.27% [5.10] to 6.24% [5.38]). In the intensive group there was a significant reduction between baseline and 12 months in mean (SD) total cholesterol (5.28 mmol/l [0.94] to 5.08 mmol/l [0.96]); low-density lipoprotein cholesterol (3.08 mmol/l [0.87] to 2.95 mmol/l [0.89]); triglycerides (1.45 mmol/l [0.86] to 1.36 mmol/l [0.84]); and mean (SD) waist circumference in men (98.74 cm [10.70] to 97.13 cm [10.20]) and women (90.64 cm [14.62] to 88.96 cm [14.00]). Conclusion A targeted approach using absolute risk calculators can be used in primary care to modify global CVD risk assessment. PMID:22520669

  15. Predicting Absolute Risk of Type 2 Diabetes Using Age and Waist Circumference Values in an Aboriginal Australian Community

    PubMed Central

    2015-01-01

    Objectives To predict, in an Australian Aboriginal community, the 10-year absolute risk of type 2 diabetes associated with waist circumference and age at baseline examination. Method A sample of 803 diabetes-free adults (82.3% of the age-eligible population), drawn from baseline data collected from 1992 to 1998, was followed up for up to 20 years, until 2012. The Cox proportional hazards model was used to estimate the effects of waist circumference and other risk factors, including age, smoking and alcohol consumption status, on the prediction of type 2 diabetes in males and females, identified through subsequent hospitalisation data during the follow-up period. The Weibull regression model was used to calculate absolute risk estimates of type 2 diabetes with waist circumference and age as predictors. Results Of the 803 participants, 110 were recorded as having developed type 2 diabetes in subsequent hospitalisations over a follow-up of 12,633.4 person-years. Waist circumference was strongly associated with subsequent diagnosis of type 2 diabetes (P<0.0001 for both genders) and remained statistically significant after adjusting for confounding factors. Hazard ratios of type 2 diabetes associated with a 1 standard deviation increase in waist circumference were 1.7 (95% CI 1.3 to 2.2) for males and 2.1 (95% CI 1.7 to 2.6) for females. At 45 years of age with a baseline waist circumference of 100 cm, a male had an absolute diabetes risk of 10.9%, while a female had a 14.3% risk of the disease. Conclusions The constructed model predicts the 10-year absolute diabetes risk in an Aboriginal Australian community. It is simple and easily understood and will help identify individuals at risk of diabetes in relation to waist circumference values. Our findings on the relationship between waist circumference and diabetes by gender will be useful for clinical consultation, public health education and establishing WC cut-off points for Aboriginal Australians. PMID:25876058
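
    As a hedged illustration of how a fitted Weibull model turns covariate values into a 10-year absolute risk (1 minus the predicted 10-year survival), the sketch below uses a Weibull proportional-hazards form with made-up coefficients; the study's actual fitted parameters are not reported in this abstract.

    ```python
    import math

    def ten_year_absolute_risk(age, waist_cm, *, shape, baseline_scale, b_age, b_waist):
        """10-year absolute risk 1 - S(10) under a Weibull proportional-hazards
        model with cumulative hazard H(t) = baseline_scale * t**shape * exp(lp)."""
        lp = b_age * age + b_waist * waist_cm            # linear predictor
        cum_hazard = baseline_scale * 10.0 ** shape * math.exp(lp)
        return 1.0 - math.exp(-cum_hazard)

    # Hypothetical coefficients chosen only for illustration; they are not the
    # fitted values from the study.
    risk = ten_year_absolute_risk(45, 100, shape=1.5, baseline_scale=8e-5,
                                  b_age=0.02, b_waist=0.03)
    print(f"10-year absolute diabetes risk: {risk:.1%}")
    ```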

  16. Bridging the etiologic and prognostic outlooks in individualized assessment of absolute risk of an illness: application in lung cancer.

    PubMed

    Karp, Igor; Sylvestre, Marie-Pierre; Abrahamowicz, Michal; Leffondré, Karen; Siemiatycki, Jack

    2016-11-01

    Assessment of individual risk of illness is an important activity in preventive medicine. Development of risk-assessment models has heretofore relied predominantly on studies involving follow-up of cohort-type populations, while case-control studies have generally been considered unfit for this purpose. Our objective is to present a method for individualized assessment of absolute risk of an illness (as illustrated by lung cancer) based on data from a 'non-nested' case-control study. We used data from a case-control study conducted in Montreal, Canada in 1996-2001. Individuals diagnosed with lung cancer (n = 920) and age- and sex-matched lung-cancer-free subjects (n = 1288) completed questionnaires documenting lifetime cigarette-smoking history and occupational, medical, and family history. Unweighted and weighted logistic models were fitted. Model overfitting was assessed using bootstrap-based cross-validation and 'shrinkage.' Discriminating ability was assessed by the c-statistic, and risk-stratifying performance was assessed by examining the variability in risk estimates over hypothetical risk profiles. In the logistic models, the logarithm of the incidence density of lung cancer was expressed as a function of age, sex, cigarette-smoking history, history of respiratory conditions and exposure to occupational carcinogens, and family history of lung cancer. The models entailed a minimal degree of overfitting ('shrinkage' factor: 0.97 for both the unweighted and weighted models) and moderately high discriminating ability (c-statistic: 0.82 for the unweighted model and 0.66 for the weighted model). The method's risk-stratifying performance was quite high. The presented method allows for individualized assessment of the risk of lung cancer and can be used for the development of risk-assessment models for other illnesses.

  17. A New Absolute Plate Motion Model for Africa

    NASA Astrophysics Data System (ADS)

    Maher, S. M.; Wessel, P.; Müller, D.; Harada, Y.

    2013-12-01

    The India-Eurasia collision, a change in relative plate motion between Australia and Antarctica, and the coeval ages of the Hawaiian-Emperor Bend (HEB) and Louisville Bend of ~Chron 22-21 all provide convincing evidence of a global tectonic plate reorganization at ~50 Ma. Yet if it were a truly global event, then there should be a contemporaneous change in Africa absolute plate motion (APM) reflected by physical evidence somewhere on the Africa plate. This evidence might be visible in the Reunion-Mascarene bend, which exhibits many HEB-like features such as a large angular change close to ~50 Ma. Recently, the Reunion hotspot trail has been interpreted as a continental feature with incidental hotspot volcanism. Here we propose the alternative hypothesis that the northern portion of the chain between Saya de Malha and the Seychelles (Mascarene Plateau) formed as the Reunion hotspot was situated on the Carlsberg Ridge, contemporaneously forming the Chagos-Laccadive Ridge on the India plate. We have created a 4-stage model that explores how a simple APM model fitting the Mascarene Plateau can also satisfy the age progressions and geometry of other hotspot trails on the Africa plate. This type of model could explain the apparent bifurcation of the Tristan hotspot chain, the age reversals seen along the Walvis Ridge and the diffuse nature of the St. Helena chain. To test this hypothesis, we have made a new African APM model that goes back to ~80 Ma using a modified version of the Hybrid Polygonal Finite Rotation Method. This method uses seamount chains and their associated hotspots as geometric constraints for the model, and seamount age dates to determine its motion through time. The positions of the hotspots can be moved to obtain the best fit for the model and to explore the possibility that the ~50 Ma bend in the Reunion-Mascarene chain reflects Africa plate motion. We will examine how well this model can predict the key features reflecting Africa plate motion and

  18. Age Dependent Absolute Plate and Plume Motion Modeling

    NASA Astrophysics Data System (ADS)

    Heaton, D. E.; Koppers, A. A. P.

    2015-12-01

    Current absolute plate motion (APM) models from 80-0 Ma are constrained by the location of mantle plume-related hotspot seamounts, in particular those of the Hawaiian-Emperor and Louisville seamount trails. Originally the 'fixed' hotspot hypothesis was developed to explain past plate motion based on linear age-progressive intra-plate volcanism. However, now that 'moving' hotspots are accepted, it is becoming clear that APM models need to be corrected for individual plume motion vectors. For older seamount trails that were active between roughly 50 and 80 Ma, the APM models that use 'fixed' hotspots overestimate the measured age progression in those trails, while APM models corrected for 'moving' hotspots underestimate those age progressions. These mismatches are due to both a lack of reliable ages in the older portions of both the Hawaii and Louisville seamount trails and insufficient APM modeling constraints from other seamount trails in the Pacific Basin. Seamounts are difficult to sample and analyze because many are hydrothermally altered and have low potassium concentrations. New 40Ar/39Ar age results from Integrated Ocean Drilling Program (IODP) Expedition 330 Sites U1372 (n=18), U1375 (n=3), U1376 (n=15) and U1377 (n=7) aid in constraining the oldest end of the Louisville Seamount trail. A significant observation in this study is that the age range recovered in the drill cores matches the range of ages acquired on dredging cruises at the same seamounts (e.g. Koppers et al., 2011). This is important for determining the inception age of a seamount. The sections recovered from IODP Expedition 330 are in-situ volcanoclastic breccia and lava flows. Compared with the seismic interpretations of Louisville guyots (Contreras-Reyes et al., 2010), Holes U1372, U1373 and U1374 penetrated the extrusive and volcanoclastic sections of the seamount. The ages obtained are consistent over stratigraphic intervals >100-450 m thick, providing evidence that these seamounts

  19. Absolute lymphocyte count and risk of short-term infection in patients with immune thrombocytopenia.

    PubMed

    Hu, Ming-Hung; Yu, Yuan-Bin; Huang, Yu-Chung; Gau, Jyh-Pyng; Hsiao, Liang-Tsai; Liu, Jin-Hwang; Chen, Ming-Huang; Chiou, Tzeon-Jye; Chen, Po-Min; Tzeng, Cheng-Hwai; Liu, Chun-Yu

    2014-06-01

    Patients with immune thrombocytopenia (ITP) may be at increased risk of infection because of the steroids and other immunosuppressive agents used in its treatment. This study aimed to identify factors associated with infection within 6 months of diagnosis and the impact that infection has on survival. We retrospectively evaluated 239 patients (107 men, 132 women; median age 61 years) diagnosed between January 1997 and August 2011. Every patient received steroid treatment according to the platelet count and the extent of bleeding. Logistic regression analysis was used to identify risk factors associated with the development of infection within 6 months of ITP being diagnosed. Sixty-two patients (25.9%) developed an infection within 6 months of diagnosis. Multivariate analysis revealed that a lower absolute lymphocyte count (ALC) at diagnosis (<1 × 10(9)/l) was an independent risk factor for infection (P = 0.039; 95% confidence interval, 1.033-3.599; odds ratio, 1.928). The time to an infection event was significantly shorter in patients with a low ALC than in those with a higher ALC (P = 0.032). Furthermore, the 1-year mortality rate after ITP diagnosis was significantly higher in those patients who developed an infection (P = 0.001). ITP patients with a low absolute lymphocyte count at diagnosis have an increased risk of infection, and those who develop infections have lower 1-year survival.

  20. A method for determining weights for excess relative risk and excess absolute risk when applied in the calculation of lifetime risk of cancer from radiation exposure.

    PubMed

    Walsh, Linda; Schneider, Uwe

    2013-03-01

    Radiation-related risks of cancer can be transported from one population to another population at risk, for the purpose of calculating lifetime risks from radiation exposure. Transfer via excess relative risks (ERR) or excess absolute risks (EAR) or a mixture of both (i.e., from the life span study (LSS) of Japanese atomic bomb survivors) has been done in the past based on qualitative weighting. Consequently, the values of the weights applied and the method of application of the weights (i.e., as additive or geometric weighted means) have varied both between reports produced at different times by the same regulatory body and between reports produced at similar times by different regulatory bodies. Since the gender and age patterns are often markedly different between EAR and ERR models, it is useful to have an evidence-based method for determining the relative goodness of fit of such models to the data. This paper identifies a method, using Akaike model weights, which could aid expert judgment and be applied to help achieve consistency of approach and quantitative evidence-based results in future health risk assessments. The results of applying this method to recent LSS cancer incidence models are that the relative EAR weighting by solid cancer site, on a scale of 0-1, is zero for breast and colon, 0.02 for all solid cancers, 0.03 for lung, 0.08 for liver, 0.15 for thyroid, 0.18 for bladder and 0.93 for stomach. The EAR weighting for female breast cancer increases from 0 to 0.3 if a generally observed change in the trend between female age-specific breast cancer incidence rates and attained age, associated with menopause, is accounted for in the EAR model. Application of this method to the preferred models from a study of multi-model inference from many models fitted to the LSS leukemia mortality data results in an EAR weighting of 0. From these results it can be seen that lifetime risk transfer is most highly weighted by EAR only for stomach cancer. However
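
    The Akaike model weights referred to above follow the standard formula w_i = exp(-(AIC_i - AIC_min)/2), normalised over the candidate models. A minimal sketch, with hypothetical AIC values standing in for fitted EAR and ERR models:

    ```python
    import math

    def akaike_weights(aic_values):
        """Akaike model weights: exp(-(AIC_i - AIC_min)/2), normalised to sum to 1."""
        aic_min = min(aic_values)
        raw = [math.exp(-(a - aic_min) / 2.0) for a in aic_values]
        total = sum(raw)
        return [r / total for r in raw]

    # Hypothetical AIC values for an EAR and an ERR model fitted to the same data;
    # the resulting EAR weight plays the role of the site-specific weights quoted above.
    aic_ear, aic_err = 1234.6, 1229.3
    w_ear, w_err = akaike_weights([aic_ear, aic_err])
    print(f"EAR weight = {w_ear:.2f}, ERR weight = {w_err:.2f}")
    ```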

  1. Mathematical Model for Absolute Magnetic Measuring Systems in Industrial Applications

    NASA Astrophysics Data System (ADS)

    Fügenschuh, Armin; Fügenschuh, Marzena; Ludszuweit, Marina; Mojsic, Aleksandar; Sokół, Joanna

    2015-09-01

    Scales for measuring systems are based on either incremental or absolute measuring methods. Incremental scales need to initialize a measurement cycle at a reference point. From there, the position is computed by counting increments of a periodic graduation. Absolute methods do not need reference points, since the position can be read directly from the scale. The positions on the complete scale are encoded using two incremental tracks with different graduations. We present a new method for absolute measuring using only one track for position encoding, to the micrometre range. Instead of the common perpendicular magnetic areas, we use a pattern of trapezoidal magnetic areas to store more complex information. For positioning, we use the magnetic field, where every position is characterized by a set of values measured by a Hall sensor array. We implement a method for the reconstruction of absolute positions from the set of unique measured values. We compare two patterns with respect to uniqueness, accuracy, stability and robustness of positioning. We discuss how stability and robustness are influenced by different errors during the measurement in real applications and how those errors can be compensated.

  2. Comparative assessment of absolute cardiovascular disease risk characterization from non-laboratory-based risk assessment in South African populations

    PubMed Central

    2013-01-01

    Background All rigorous primary cardiovascular disease (CVD) prevention guidelines recommend absolute CVD risk scores to identify high- and low-risk patients, but laboratory testing can be impractical in low- and middle-income countries. The purpose of this study was to compare the ranking performance of a simple, non-laboratory-based risk score to laboratory-based scores in various South African populations. Methods We calculated and compared 10-year CVD (or coronary heart disease (CHD)) risk for 14,772 adults from thirteen cross-sectional South African populations (data collected from 1987 to 2009). Risk characterization performance for the non-laboratory-based score was assessed by comparing rankings of risk with six laboratory-based scores (three versions of Framingham risk, SCORE for high- and low-risk countries, and CUORE) using Spearman rank correlation and percent of population equivalently characterized as ‘high’ or ‘low’ risk. Total 10-year non-laboratory-based risk of CVD death was also calculated for a representative cross-section from the 1998 South African Demographic Health Survey (DHS, n = 9,379) to estimate the national burden of CVD mortality risk. Results Spearman correlation coefficients for the non-laboratory-based score with the laboratory-based scores ranged from 0.88 to 0.986. Using conventional thresholds for CVD risk (10% to 20% 10-year CVD risk), 90% to 92% of men and 94% to 97% of women were equivalently characterized as ‘high’ or ‘low’ risk using the non-laboratory-based and Framingham (2008) CVD risk score. These results were robust across the six risk scores evaluated and the thirteen cross-sectional datasets, with few exceptions (lower agreement between the non-laboratory-based and Framingham (1991) CHD risk scores). Approximately 18% of adults in the DHS population were characterized as ‘high CVD risk’ (10-year CVD death risk >20%) using the non-laboratory-based score. Conclusions We found a high level of
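
    A minimal sketch of the two comparison metrics described above (Spearman rank correlation between scores, and the percentage of people placed in the same risk band by both scores), using synthetic risk scores and assumed 10% and 20% thresholds; it is illustrative only and does not reproduce the study's datasets or scoring tools.

    ```python
    import numpy as np
    from scipy.stats import spearmanr

    rng = np.random.default_rng(1)

    # Synthetic 10-year risk scores (%) for the same people under two tools.
    lab_score = rng.gamma(shape=2.0, scale=4.0, size=5000)
    nonlab_score = lab_score * rng.normal(1.0, 0.15, size=5000)  # correlated surrogate

    rho, _ = spearmanr(nonlab_score, lab_score)

    def risk_band(score, low=10.0, high=20.0):
        """Classify 10-year CVD risk (%) into low / intermediate / high bands."""
        return np.where(score < low, "low",
                        np.where(score >= high, "high", "intermediate"))

    agreement = np.mean(risk_band(nonlab_score) == risk_band(lab_score))
    print(f"Spearman rho = {rho:.3f}; equivalently characterized = {agreement:.1%}")
    ```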

  3. Breast Cancer Risk Assessment SAS Macro (Gail Model)

    Cancer.gov

    A SAS macro (commonly referred to as the Gail Model) that projects absolute risk of invasive breast cancer according to NCI’s Breast Cancer Risk Assessment Tool (BCRAT) algorithm for specified race/ethnic groups and age intervals.

  4. Relative and Absolute Fit Evaluation in Cognitive Diagnosis Modeling

    ERIC Educational Resources Information Center

    Chen, Jinsong; de la Torre, Jimmy; Zhang, Zao

    2013-01-01

    As with any psychometric model, the validity of inferences from cognitive diagnosis models (CDMs) determines the extent to which these models can be useful. For inferences from CDMs to be valid, it is crucial that the fit of the model to the data is ascertained. Based on a simulation study, this study investigated the sensitivity of various fit…

  5. Comprehensive fluence model for absolute portal dose image prediction.

    PubMed

    Chytyk, K; McCurdy, B M C

    2009-04-01

    Amorphous silicon (a-Si) electronic portal imaging devices (EPIDs) continue to be investigated as treatment verification tools, with a particular focus on intensity modulated radiation therapy (IMRT). This verification could be accomplished through a comparison of measured portal images to predicted portal dose images. A general fluence determination tailored to portal dose image prediction would be a great asset in order to model the complex modulation of IMRT. A proposed physics-based parameter fluence model was commissioned by matching predicted EPID images to corresponding measured EPID images of multileaf collimator (MLC) defined fields. The two-source fluence model was composed of a focal Gaussian and an extrafocal Gaussian-like source. Specific aspects of the MLC and secondary collimators were also modeled (e.g., jaw and MLC transmission factors, MLC rounded leaf tips, tongue and groove effect, interleaf leakage, and leaf offsets). Several unique aspects of the model were developed based on the results of detailed Monte Carlo simulations of the linear accelerator including (1) use of a non-Gaussian extrafocal fluence source function, (2) separate energy spectra used for focal and extrafocal fluence, and (3) different off-axis energy spectra softening used for focal and extrafocal fluences. The predicted energy fluence was then convolved with Monte Carlo generated, EPID-specific dose kernels to convert incident fluence to dose delivered to the EPID. Measured EPID data were obtained with an a-Si EPID for various MLC-defined fields (from 1 x 1 to 20 x 20 cm2) over a range of source-to-detector distances. These measured profiles were used to determine the fluence model parameters in a process analogous to the commissioning of a treatment planning system. The resulting model was tested on 20 clinical IMRT plans, including ten prostate and ten oropharyngeal cases. The model predicted the open-field profiles within 2%, 2 mm, while a mean of 96.6% of pixels over all

  6. Quantifying Cancer Absolute Risk and Cancer Mortality in the Presence of Competing Events after a Myotonic Dystrophy Diagnosis

    PubMed Central

    Gadalla, Shahinaz M.; Pfeiffer, Ruth M.; Kristinsson, Sigurdur Y.; Björkholm, Magnus; Hilbert, James E.; Moxley, Richard T.; Landgren, Ola; Greene, Mark H.

    2013-01-01

    Recent studies show that patients with myotonic dystrophy (DM) have an increased risk of specific malignancies, but estimates of absolute cancer risk accounting for competing events are lacking. Using the Swedish Patient Registry, we identified 1,081 patients with an inpatient and/or outpatient diagnosis of DM between 1987 and 2007. Date and cause of death and date of cancer diagnosis were extracted from the Swedish Cause of Death and Cancer Registries. We calculated non-parametric estimates of absolute cancer risk and cancer mortality accounting for the high non-cancer competing mortality associated with DM. Absolute cancer risk after DM diagnosis was 1.6% (95% CI=0.4-4%), 5% (95% CI=3-9%) and 9% (95% CI=6-13%) at ages 40, 50 and 60 years, respectively. Females had a higher absolute risk of all cancers combined than males (9% [95% CI=4-14] and 13% [95% CI=9-20] vs. 2% [95% CI=0.7-6] and 4% [95% CI=2-8] by ages 50 and 60 years, respectively) and developed cancer at younger ages (median age=51 years, range=22-74 vs. 57, range=43-84, respectively; p=0.02). Cancer deaths accounted for 10% of all deaths, with an absolute cancer mortality risk of 2% (95% CI=1-4.5%), 4% (95% CI=2-6%), and 6% (95% CI=4-9%) by ages 50, 60, and 70 years, respectively. No gender difference in cancer-specific mortality was observed (p=0.6). In conclusion, cancer significantly contributes to morbidity and mortality in DM patients, even after accounting for the high competing DM mortality from non-neoplastic causes. It is important to apply population-appropriate, validated cancer screening strategies in DM patients. PMID:24236163

  7. Ridge-spotting: A new test for Pacific absolute plate motion models

    NASA Astrophysics Data System (ADS)

    Wessel, Paul; Müller, R. Dietmar

    2016-06-01

    Relative plate motions provide high-resolution descriptions of motions of plates relative to other plates. Yet geodynamically, motions of plates relative to the mantle are required since such motions can be attributed to forces (e.g., slab pull and ridge push) acting upon the plates. Various reference frames have been proposed, such as the hot spot reference frame, to link plate motions to a mantle framework. Unfortunately, both accuracy and precision of absolute plate motion models lag behind those of relative plate motion models. Consequently, it is paramount to use relative plate motions in improving our understanding of absolute plate motions. A new technique called "ridge-spotting" combines absolute and relative plate motions and examines the viability of proposed absolute plate motion models. We test the method on six published Pacific absolute plate motions models, including fixed and moving hot spot models as well as a geodynamically derived model. Ridge-spotting reconstructs the Pacific-Farallon and Pacific-Antarctica ridge systems over the last 80 Myr. All six absolute plate motion models predict large amounts of northward migration and monotonic clockwise rotation for the Pacific-Farallon ridge. A geodynamic implication of our ridge migration predictions is that the suggestion that the Pacific-Farallon ridge may have been pinned by a large mantle upwelling is not supported. Unexpected or erratic ridge behaviors may be tied to limitations in the models themselves or (for Indo-Atlantic models) discrepancies in the plate circuits used to project models into the Pacific realm. Ridge-spotting is promising and will be extended to include more plates and other ocean basins.

  8. Absolute stability and Hopf bifurcation in a Plasmodium falciparum malaria model incorporating discrete immune response delay.

    PubMed

    Ncube, Israel

    2013-05-01

    We consider the absolute stability of the disease-free equilibrium of an intra-host Plasmodium falciparum malarial model allowing for antigenic variation within a single species. Antigenic variation can be viewed as an adaptation of the parasite to evade host defence [2]. The model was recently developed in [3-6]. The host's immune response is compartmentalised into reactions to major and minor epitopes. The immune response mounted by the human host is delayed, where, for simplicity, the delay is assumed to be discrete. We investigate the resulting characteristic equation, with a view to establishing absolute stability criteria and computing the Hopf bifurcation of the disease-free equilibrium.

  9. An integrated model of choices and response times in absolute identification.

    PubMed

    Brown, Scott D; Marley, A A J; Donkin, Christopher; Heathcote, Andrew

    2008-04-01

    Recent theoretical developments in the field of absolute identification have stressed differences between relative and absolute processes, that is, whether stimulus magnitudes are judged relative to a shorter term context provided by recently presented stimuli or a longer term context provided by the entire set of stimuli. The authors developed a model (SAMBA: selective attention, mapping, and ballistic accumulation) that integrates shorter and longer term memory processes and accounts for both the choices made and the associated response time distributions, including sequential effects in each. The model's predictions arise as a consequence of its architecture and require estimation of only a few parameters with values that are consistent across numerous data sets. The authors show that SAMBA provides a quantitative account of benchmark choice phenomena in classical absolute identification experiments and in contemporary data involving both choice and response time.

  10. Constraint on Absolute Accuracy of Metacomprehension Assessments: The Anchoring and Adjustment Model vs. the Standards Model

    ERIC Educational Resources Information Center

    Kwon, Heekyung

    2011-01-01

    The objective of this study is to provide a systematic account of three typical phenomena surrounding absolute accuracy of metacomprehension assessments: (1) the absolute accuracy of predictions is typically quite low; (2) there exist individual differences in absolute accuracy of predictions as a function of reading skill; and (3) postdictions…

  11. Easy Absolute Values? Absolutely

    ERIC Educational Resources Information Center

    Taylor, Sharon E.; Mittag, Kathleen Cage

    2015-01-01

    The authors teach a problem-solving course for preservice middle-grades education majors that includes concepts dealing with absolute-value computations, equations, and inequalities. Many of these students like mathematics and plan to teach it, so they are adept at symbolic manipulations. Getting them to think differently about a concept that they…

  12. Fractional Brownian Motion with Stochastic Variance:. Modeling Absolute Returns in STOCK Markets

    NASA Astrophysics Data System (ADS)

    Roman, H. E.; Porto, M.

    We discuss a model for simulating a long-time memory in time series characterized in addition by a stochastic variance. The model is based on a combination of fractional Brownian motion (FBM) concepts, for dealing with the long-time memory, with an autoregressive scheme with conditional heteroskedasticity (ARCH), responsible for the stochastic variance of the series, and is denoted as FBMARCH. Unlike well-known fractionally integrated autoregressive models, FBMARCH admits finite second moments. The resulting probability distribution functions have power-law tails with exponents similar to ARCH models. This idea is applied to the description of long-time autocorrelations of absolute returns ubiquitously observed in stock markets.

  13. The Impact of Different Absolute Solar Irradiance Values on Current Climate Model Simulations

    NASA Technical Reports Server (NTRS)

    Rind, David H.; Lean, Judith L.; Jonas, Jeffrey

    2014-01-01

    Simulations of the preindustrial and doubled CO2 climates are made with the GISS Global Climate Middle Atmosphere Model 3 using two different estimates of the absolute solar irradiance value: a higher value measured by solar radiometers in the 1990s and a lower value measured recently by the Solar Radiation and Climate Experiment. Each of the model simulations is adjusted to achieve global energy balance; without this adjustment the difference in irradiance produces a global temperature change of 0.4°C, comparable to the cooling estimated for the Maunder Minimum. The results indicate that by altering cloud cover the model properly compensates for the different absolute solar irradiance values on a global level when simulating both preindustrial and doubled CO2 climates. On a regional level, the preindustrial climate simulations and the patterns of change with doubled CO2 concentrations are again remarkably similar, but there are some differences. Using a higher absolute solar irradiance value and the requisite cloud cover affects the model's depictions of high-latitude surface air temperature, sea level pressure, and stratospheric ozone, as well as tropical precipitation. In the climate change experiments it leads to an underestimation of North Atlantic warming, reduced precipitation in the tropical western Pacific, and smaller total ozone growth at high northern latitudes. Although significant, these differences are typically modest compared with the magnitude of the regional changes expected for doubled greenhouse gas concentrations. Nevertheless, the model simulations demonstrate that achieving the highest possible fidelity when simulating regional climate change requires that climate models use as input the most accurate (lower) solar irradiance value.

  14. Reduced dose measurement of absolute myocardial blood flow using dynamic SPECT imaging in a porcine model

    SciTech Connect

    Timmins, Rachel; Klein, Ran; Petryk, Julia; Marvin, Brian; Kemp, Robert A. de; Ruddy, Terrence D.; Wells, R. Glenn; Wei, Lihui

    2015-09-15

    Purpose: Absolute myocardial blood flow (MBF) and myocardial flow reserve (MFR) measurements provide important additional information over traditional relative perfusion imaging. Recent advances in camera technology have made this possible with single-photon emission tomography (SPECT). Low dose protocols are desirable to reduce the patient radiation risk; however, increased noise may reduce the accuracy of MBF measurements. The authors studied the effect of reducing dose on the accuracy of dynamic SPECT MBF measurements. Methods: Nineteen 30–40 kg pigs were injected with 370 + 1110 MBq of Tc-99m sestamibi or tetrofosmin or 37 + 111 MBq of Tl-201 at rest + stress. Microspheres were injected simultaneously to measure MBF. The pigs were imaged in list-mode for 11 min starting at the time of injection using a Discovery NM 530c camera (GE Healthcare). Each list file was modified so that 3/4, 1/2, 1/4, 1/8, 1/16, and 1/32 of the original counts were included in the projections. Modified projections were reconstructed with CT-based attenuation correction and an energy window-based scatter correction and analyzed with FlowQuant kinetic modeling software using a 1-compartment model. A modified Renkin-Crone extraction function was used to convert the tracer uptake rate K1 to MBF values. The SPECT results were compared to those from microspheres. Results: Correlation between SPECT and microsphere MBF values for the full injected activity was r ≥ 0.75 for all 3 tracers and did not significantly degrade over all count levels. The mean MBF and MFR and the standard errors in the estimates were not significantly worse than the full-count data at 1/4-counts (Tc99m-tracers) and 1/2-counts (Tl-201). Conclusions: Dynamic SPECT measurement of MBF and MFR in pigs can be performed with 1/4 (Tc99m-tracers) or 1/2 (Tl-201) of the standard injected activity without significantly reducing accuracy and precision.

  15. Absolute stability and synchronization in neural field models with transmission delays

    NASA Astrophysics Data System (ADS)

    Kao, Chiu-Yen; Shih, Chih-Wen; Wu, Chang-Hong

    2016-08-01

    Neural fields model macroscopic parts of the cortex which involve several populations of neurons. We consider a class of neural field models which are represented by integro-differential equations with transmission time delays which are space-dependent. The considered domains underlying the systems can be bounded or unbounded. A new approach, called sequential contracting, instead of the conventional Lyapunov functional technique, is employed to investigate the global dynamics of such systems. Sufficient conditions for the absolute stability and synchronization of the systems are established. Several numerical examples are presented to demonstrate the theoretical results.

  16. Absolute IGS antenna phase center model igs08.atx: status and potential improvements

    NASA Astrophysics Data System (ADS)

    Schmid, R.; Dach, R.; Collilieux, X.; Jäggi, A.; Schmitz, M.; Dilssner, F.

    2016-04-01

    On 17 April 2011, all analysis centers (ACs) of the International GNSS Service (IGS) adopted the reference frame realization IGS08 and the corresponding absolute antenna phase center model igs08.atx for their routine analyses. The latter consists of an updated set of receiver and satellite antenna phase center offsets and variations (PCOs and PCVs). An update of the model was necessary due to the difference of about 1 ppb in the terrestrial scale between two consecutive realizations of the International Terrestrial Reference Frame (ITRF2008 vs. ITRF2005), as that parameter is highly correlated with the GNSS satellite antenna PCO components in the radial direction.

  17. Melanoma Risk Prediction Models

    Cancer.gov

    Developing statistical models that estimate the probability of developing melanoma cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  18. A California statewide three-dimensional seismic velocity model from both absolute and differential times

    USGS Publications Warehouse

    Lin, G.; Thurber, C.H.; Zhang, H.; Hauksson, E.; Shearer, P.M.; Waldhauser, F.; Brocher, T.M.; Hardebeck, J.

    2010-01-01

    We obtain a seismic velocity model of the California crust and uppermost mantle using a regional-scale double-difference tomography algorithm. We begin by using absolute arrival-time picks to solve for a coarse three-dimensional (3D) P velocity (VP) model with a uniform 30 km horizontal node spacing, which we then use as the starting model for a finer-scale inversion using double-difference tomography applied to absolute and differential pick times. For computational reasons, we split the state into 5 subregions with a grid spacing of 10 to 20 km and assemble our final statewide VP model by stitching together these local models. We also solve for a statewide S-wave model using S picks from both the Southern California Seismic Network and USArray, assuming a starting model based on the VP results and a VP/VS ratio of 1.732. Our new model has improved areal coverage compared with previous models, extending 570 km in the SW-NE direction and 1320 km in the NW-SE direction. It also extends to greater depth due to the inclusion of substantial data at large epicentral distances. Our VP model generally agrees with previous separate regional models for northern and southern California, but we also observe some new features, such as high-velocity anomalies at shallow depths in the Klamath Mountains and Mount Shasta area, somewhat slow velocities in the northern Coast Ranges, and slow anomalies beneath the Sierra Nevada at midcrustal and greater depths. This model can be applied to a variety of regional-scale studies in California, such as developing a unified statewide earthquake location catalog and performing regional waveform modeling.

  19. Primary care use of FRAX: absolute fracture risk assessment in postmenopausal women and older men.

    PubMed

    Siris, Ethel S; Baim, Sanford; Nattiv, Aurelia

    2010-01-01

    Osteoporosis-related fractures (low-trauma or fragility fractures) cause substantial disability, health care costs, and mortality among postmenopausal women and older men. Epidemiologic studies indicate that at least half the population burden of osteoporosis-related fractures affects persons with osteopenia (low bone density), who comprise a larger segment of the population than those with osteoporosis. The public health burden of fractures will fail to decrease unless the subset of patients with low bone density who are at increased risk for fracture are identified and treated. Risk stratification for medically appropriate and cost-effective treatment is facilitated by the World Health Organization (WHO) FRAX algorithm, which uses clinical risk factors, bone mineral density, and country-specific fracture and mortality data to quantify a patient's 10-year probability of a hip or major osteoporotic fracture. Included risk factors comprise femoral neck bone mineral density, prior fractures, parental hip fracture history, age, gender, body mass index, ethnicity, smoking, alcohol use, glucocorticoid use, rheumatoid arthritis, and secondary osteoporosis. FRAX was developed by the WHO to be applicable to both postmenopausal women and men aged 40 to 90 years; the National Osteoporosis Foundation Clinician's Guide focuses on its utility in postmenopausal women and men aged >50 years. It is validated to be used in untreated patients only. The current National Osteoporosis Foundation Guide recommends treating patients with FRAX 10-year risk scores of ≥3% for hip fracture or ≥20% for major osteoporotic fracture, to reduce their fracture risk. Additional risk factors such as frequent falls, not represented in FRAX, warrant individual clinical judgment. FRAX has the potential to demystify fracture risk assessment in primary care for patients with low bone density, directing clinical fracture prevention strategies to those who can benefit most.
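
    A minimal sketch of how the stated National Osteoporosis Foundation intervention thresholds could be applied to FRAX output; the 10-year probabilities themselves would come from the FRAX calculator, and the example inputs are hypothetical.

```python
def nof_treatment_indicated(hip_fracture_prob, major_osteoporotic_prob):
    """Apply the National Osteoporosis Foundation thresholds to FRAX
    10-year fracture probabilities (given as percentages)."""
    return hip_fracture_prob >= 3.0 or major_osteoporotic_prob >= 20.0

# Hypothetical FRAX output for an untreated patient with osteopenia
print(nof_treatment_indicated(hip_fracture_prob=3.4,
                              major_osteoporotic_prob=14.0))   # -> True
```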

  20. Absolute Summ

    NASA Astrophysics Data System (ADS)

    Phillips, Alfred, Jr.

    Summ means the entirety of the multiverse. It seems clear, from the inflation theories of A. Guth and others, that the creation of many universes is plausible. We argue that Absolute cosmological ideas, not unlike those of I. Newton, may be consistent with dynamic multiverse creations. As suggested in W. Heisenberg's uncertainty principle, and with the Anthropic Principle defended by S. Hawking, et al., human consciousness, buttressed by findings of neuroscience, may have to be considered in our models. Predictability, as A. Einstein realized with Invariants and General Relativity, may be required for new ideas to be part of physics. We present here a two postulate model geared to an Absolute Summ. The seedbed of this work is part of Akhnaton's philosophy (see S. Freud, Moses and Monotheism). Most important, however, is that the structure of human consciousness, manifest in Kenya's Rift Valley 200,000 years ago as Homo sapiens, who were the culmination of the six million year co-creation process of Hominins and Nature in Africa, allows us to do the physics that we do.

  1. Improved Strategies and Optimization of Calibration Models for Real-time PCR Absolute Quantification

    EPA Science Inventory

    Real-time PCR absolute quantification applications rely on the use of standard curves to make estimates of DNA target concentrations in unknown samples. Traditional absolute quantification approaches dictate that a standard curve must accompany each experimental run. However, t...
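
    As a brief illustration of the standard-curve approach this record refers to, the sketch below fits Cq values from a serial dilution of a standard and inverts the fit for an unknown sample; all numbers are invented for illustration.

```python
import numpy as np

# Quantification cycles (Cq) for a serial dilution of a DNA standard
standard_copies = np.array([1e2, 1e3, 1e4, 1e5, 1e6])   # copies per reaction
standard_cq     = np.array([33.1, 29.7, 26.4, 23.0, 19.6])

# Fit Cq = slope * log10(copies) + intercept
slope, intercept = np.polyfit(np.log10(standard_copies), standard_cq, 1)
efficiency = 10 ** (-1.0 / slope) - 1.0   # amplification efficiency implied by the slope

def copies_from_cq(cq):
    """Estimate target copies in an unknown sample from its Cq value."""
    return 10 ** ((cq - intercept) / slope)

print(f"efficiency = {efficiency:.2%}, unknown ~ {copies_from_cq(25.2):.0f} copies")
```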

  2. Nanotechnologies: Risk assessment model

    NASA Astrophysics Data System (ADS)

    Giacobbe, F.; Monica, L.; Geraci, D.

    2009-05-01

    The development and use of nanomaterials has grown widely in recent years. Hence, it is necessary to carry out a careful and targeted risk assessment for the safety of the workers. The objective of this research is a specific assessment model for workplaces where personnel manipulate nanoparticles. The model mainly takes into account the number of exposed workers, the dimensions of the particles, the information found in the safety data sheets and the uncertainties about the level of danger arising from exposure to nanomaterials. The evaluation algorithm considers normal work conditions, abnormal conditions (e.g. breakdown of an air filter) and emergency situations (e.g. package cracking). It was necessary to define several risk conditions in order to quantify the risk in increasing levels ("low", "middle" and "high"). Each level is associated with appropriate behavioural procedures. In particular, at the high level the user is advised to carry out urgent interventions aimed at reducing the risk level (e.g. the use of a vacuum box for manipulation, high-efficiency protective PPE, etc.). The model has been implemented in a research laboratory where titanium dioxide and carbon nanotubes are used. The outcome of this specific evaluation was a risk level of middle.
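
    The scoring sketch below is only a hypothetical illustration of how factors like those listed (number of exposed workers, particle size, data-sheet uncertainty, operating condition) might be combined into discrete "low"/"middle"/"high" levels; the weights and cut-offs are invented and are not those of the cited model.

```python
def nano_risk_level(num_exposed_workers, particle_diameter_nm,
                    hazard_uncertainty, condition="normal"):
    """Toy scoring of a nanomaterial handling task into low/middle/high risk.
    Weights and cut-offs are illustrative only, not those of the cited model."""
    score = 0
    score += 2 if num_exposed_workers > 10 else 1
    score += 2 if particle_diameter_nm < 100 else 1          # nanoscale particles weigh more
    score += {"low": 1, "medium": 2, "high": 3}[hazard_uncertainty]
    score += {"normal": 0, "abnormal": 1, "emergency": 2}[condition]
    if score <= 3:
        return "low"
    elif score <= 5:
        return "middle"
    return "high"

# e.g. a titanium dioxide handling task under normal conditions
print(nano_risk_level(4, 25, "medium"))   # -> "middle"
```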

  3. Modeling absolute differences in life expectancy with a censored skew-normal regression approach.

    PubMed

    Moser, André; Clough-Gorr, Kerri; Zwahlen, Marcel

    2015-01-01

    Parameter estimates from commonly used multivariable parametric survival regression models do not directly quantify differences in years of life expectancy. Gaussian linear regression models give results in terms of absolute mean differences, but are not appropriate for modeling life expectancy, because in many situations time to death has a negatively skewed distribution. A regression approach using a skew-normal distribution would be an alternative to parametric survival models in the modeling of life expectancy, because parameter estimates can be interpreted in terms of survival time differences while allowing for skewness of the distribution. In this paper we show how to use the skew-normal regression so that censored and left-truncated observations are accounted for. With this we model differences in life expectancy using data from the Swiss National Cohort Study and from official life expectancy estimates and compare the results with those derived from commonly used survival regression models. We conclude that a censored skew-normal survival regression approach for left-truncated observations can be used to model differences in life expectancy across covariates of interest.
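
    A minimal sketch of the idea, assuming administrative right-censoring only (left truncation, handled in the paper, would additionally condition each likelihood contribution on survival to the entry age): censored records contribute the skew-normal survival function rather than the density, and the covariate effect is read directly as a difference in survival time.

```python
import numpy as np
from scipy.stats import skewnorm
from scipy.optimize import minimize

def neg_loglik(params, age_at_death, censored, x):
    """Skew-normal regression of age at death on a covariate x.
    Right-censored records contribute the survival function instead of the density."""
    b0, b1, log_scale, shape = params
    loc = b0 + b1 * x
    scale = np.exp(log_scale)
    ll_obs  = skewnorm.logpdf(age_at_death[~censored], shape,
                              loc=loc[~censored], scale=scale)
    ll_cens = skewnorm.logsf(age_at_death[censored], shape,
                             loc=loc[censored], scale=scale)
    return -(ll_obs.sum() + ll_cens.sum())

# Simulated cohort: binary covariate x, negatively skewed ages at death
rng = np.random.default_rng(1)
x = rng.integers(0, 2, 500)
age = skewnorm.rvs(-4, loc=88 - 4 * x, scale=10, size=500, random_state=rng)
censored = age > 95                      # administrative censoring at age 95
age = np.minimum(age, 95.0)

fit = minimize(neg_loglik, x0=[85.0, 0.0, np.log(10.0), -1.0],
               args=(age, censored, x), method="Nelder-Mead")
b0, b1 = fit.x[:2]
# With shared scale and shape, the location difference equals the mean difference
print(f"estimated difference in life expectancy (years): {b1:.1f}")
```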

  4. Potential Biases in Estimating Absolute and Relative Case-Fatality Risks during Outbreaks

    PubMed Central

    Lipsitch, Marc; Donnelly, Christl A.; Fraser, Christophe; Blake, Isobel M.; Cori, Anne; Dorigatti, Ilaria; Ferguson, Neil M.; Garske, Tini; Mills, Harriet L.; Riley, Steven; Van Kerkhove, Maria D.; Hernán, Miguel A.

    2015-01-01

    Estimating the case-fatality risk (CFR)—the probability that a person dies from an infection given that they are a case—is a high priority in epidemiologic investigation of newly emerging infectious diseases and sometimes in new outbreaks of known infectious diseases. The data available to estimate the overall CFR are often gathered for other purposes (e.g., surveillance) in challenging circumstances. We describe two forms of bias that may affect the estimation of the overall CFR—preferential ascertainment of severe cases and bias from reporting delays—and review solutions that have been proposed and implemented in past epidemics. Also of interest is the estimation of the causal impact of specific interventions (e.g., hospitalization, or hospitalization at a particular hospital) on survival, which can be estimated as a relative CFR for two or more groups. When observational data are used for this purpose, three more sources of bias may arise: confounding, survivorship bias, and selection due to preferential inclusion in surveillance datasets of those who are hospitalized and/or die. We illustrate these biases and caution against causal interpretation of differential CFR among those receiving different interventions in observational datasets. Again, we discuss ways to reduce these biases, particularly by estimating outcomes in smaller but more systematically defined cohorts ascertained before the onset of symptoms, such as those identified by forward contact tracing. Finally, we discuss the circumstances in which these biases may affect non-causal interpretation of risk factors for death among cases. PMID:26181387
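
    A toy simulation (not from the paper) illustrating the reporting-delay bias and one simple correction: mid-outbreak, the naive deaths/cases ratio underestimates the CFR, while dividing deaths by the expected number of cases whose outcome is already observable recovers roughly the true value.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated outbreak: exponentially growing case counts, true CFR = 10%,
# deaths occur an exponentially distributed mean of 14 days after report.
true_cfr, mean_delay, horizon = 0.10, 14.0, 60
daily_cases = (5 * np.exp(0.07 * np.arange(horizon))).astype(int)
onset_days = np.repeat(np.arange(horizon), daily_cases)
dies = rng.random(onset_days.size) < true_cfr
death_days = onset_days + rng.exponential(mean_delay, onset_days.size)

cases_so_far  = onset_days.size
deaths_so_far = np.sum(dies & (death_days <= horizon))

naive_cfr = deaths_so_far / cases_so_far

# Delay-adjusted denominator: expected fraction of each case's outcome
# already observable by the analysis date, given the delay distribution.
p_known = 1.0 - np.exp(-(horizon - onset_days) / mean_delay)
adjusted_cfr = deaths_so_far / p_known.sum()

print(f"naive {naive_cfr:.3f}  adjusted {adjusted_cfr:.3f}  true {true_cfr:.2f}")
```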

  5. Pharmacokinetics and absolute bioavailability of phenobarbital in neonates and young infants, a population pharmacokinetic modelling approach.

    PubMed

    Marsot, Amélie; Brevaut-Malaty, Véronique; Vialet, Renaud; Boulamery, Audrey; Bruguerolle, Bernard; Simon, Nicolas

    2014-08-01

    Phenobarbital is widely used for treatment of neonatal seizures. Its optimal use in neonates and young infants requires information regarding its pharmacokinetics. The objective of this study is to characterize the absolute bioavailability of phenobarbital in neonates and young infants, a pharmacokinetic parameter which has not yet been investigated. Routine clinical pharmacokinetic data were retrospectively collected from 48 neonates and infants (weight: 0.7-10 kg; postnatal age: 0-206 days; GA: 27-42 weeks) treated with phenobarbital, administered intravenously or orally as a suspension, and hospitalized in a paediatric intensive care unit. A total mean dose of 4.6 mg/kg (3.1-10.6 mg/kg) per day was administered by 30-min infusion or by the oral route. Pharmacokinetic analysis was performed using nonlinear mixed-effects population modelling software. Data were modelled with an allometric pharmacokinetic model, using a three-fourths scaling exponent for clearance (CL). The population typical mean [per cent relative standard error (%RSE)] values for CL, apparent volume of distribution (Vd) and bioavailability (F) were 0.0054 L/h/kg (7%), 0.64 L/kg (15%) and 48.9% (22%), respectively. The interindividual variability of CL, Vd and F (%RSE) and the residual variability (%RSE) were 17% (31%), 50% (27%), 39% (27%) and 7.2 mg/L (29%), respectively. The absolute bioavailability of phenobarbital in neonates and infants was estimated. The dose should be increased when switching from intravenous to oral administration.
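
    A hedged sketch of how the reported typical values might be used for dose reasoning; the per-kilogram clearance, the reference-weight normalisation of the allometric term, and the example weight are illustrative assumptions, not dosing guidance.

```python
def phenobarbital_cl_L_per_h(weight_kg, cl_per_kg=0.0054, ref_weight_kg=1.0):
    """Typical clearance under 3/4 allometric scaling.
    The reference-weight normalisation is an assumption made for illustration."""
    return cl_per_kg * ref_weight_kg * (weight_kg / ref_weight_kg) ** 0.75

def oral_dose_for_iv_equivalent(iv_dose_mg, bioavailability=0.489):
    """Oral dose giving the same exposure as an intravenous dose,
    using the reported absolute bioavailability of ~49%."""
    return iv_dose_mg / bioavailability

weight = 3.2                           # kg, hypothetical infant
iv_maintenance = 4.6 * weight          # mg/day, mean dose from the study
print(f"CL ~ {phenobarbital_cl_L_per_h(weight):.3f} L/h")
print(f"oral dose for same exposure ~ {oral_dose_for_iv_equivalent(iv_maintenance):.1f} mg/day")
```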

  6. Absolute nuclear material assay

    DOEpatents

    Prasad, Manoj K [Pleasanton, CA; Snyderman, Neal J [Berkeley, CA; Rowland, Mark S [Alamo, CA

    2012-05-15

    A method of absolute nuclear material assay of an unknown source comprising counting neutrons from the unknown source and providing an absolute nuclear material assay utilizing a model to optimally compare to the measured count distributions. In one embodiment, the step of providing an absolute nuclear material assay comprises utilizing a random sampling of analytically computed fission chain distributions to generate a continuous time-evolving sequence of event-counts by spreading the fission chain distribution in time.

  7. Absolute nuclear material assay

    DOEpatents

    Prasad, Manoj K.; Snyderman, Neal J.; Rowland, Mark S.

    2010-07-13

    A method of absolute nuclear material assay of an unknown source comprising counting neutrons from the unknown source and providing an absolute nuclear material assay utilizing a model to optimally compare to the measured count distributions. In one embodiment, the step of providing an absolute nuclear material assay comprises utilizing a random sampling of analytically computed fission chain distributions to generate a continuous time-evolving sequence of event-counts by spreading the fission chain distribution in time.

  8. Nilpotent and absolutely anticommuting symmetries in the Freedman-Townsend model: Augmented superfield formalism

    NASA Astrophysics Data System (ADS)

    Shukla, A.; Krishna, S.; Malik, R. P.

    2014-12-01

    We derive the off-shell nilpotent and absolutely anticommuting Becchi-Rouet-Stora-Tyutin (BRST) and anti-BRST symmetry transformations, corresponding to the (1-form) Yang-Mills (YM) and (2-form) tensorial gauge symmetries of the four (3+1)-dimensional (4D) Freedman-Townsend (FT) model, by exploiting the augmented version of Bonora-Tonin's (BT) superfield approach to BRST formalism where the 4D flat Minkowskian theory is generalized onto the (4, 2)-dimensional supermanifold. One of the novel observations is the fact that we are theoretically compelled to go beyond the horizontality condition (HC) to invoke an additional set of gauge-invariant restrictions (GIRs) for the derivation of the full set of proper (anti-)BRST symmetries. To obtain the (anti-)BRST symmetry transformations, corresponding to the tensorial (2-form) gauge symmetries within the framework of augmented version of BT-superfield approach, we are logically forced to modify the FT-model to incorporate an auxiliary 1-form field and the kinetic term for the antisymmetric (2-form) gauge field. This is also a new observation in our present investigation. We point out some of the key differences between the modified FT-model and Lahiri-model (LM) of the dynamical non-Abelian 2-form gauge theories. We also briefly mention a few similarities.

  9. Toward a self-consistent, high-resolution absolute plate motion model for the Pacific

    NASA Astrophysics Data System (ADS)

    Wessel, Paul; Harada, Yasushi; Kroenke, Loren W.

    2006-03-01

    The hot spot hypothesis postulates that linear volcanic trails form as lithospheric plates move relative to stationary or slowly moving plumes. Given geometry and ages from several trails, one can reconstruct absolute plate motions (APM) that provide valuable information about past and present tectonism, paleogeography, and volcanism. Most APM models have been designed by fitting small circles to coeval volcanic chain segments and determining stage rotation poles, opening angles, and time intervals. Unlike relative plate motion (RPM) models, such APM models suffer from oversimplicity, self-inconsistencies, inadequate fits to data, and lack of rigorous uncertainty estimates; in addition, they work only for fixed hot spots. Newer methods are now available that overcome many of these limitations. We present a technique that provides high-resolution APM models derived from stationary or moving hot spots (given prescribed paths). The simplest model assumes stationary hot spots, and an example of such a model is presented. Observations of geometry and chronology on the Pacific plate appear well explained by this type of model. Because it is a one-plate model, it does not discriminate between hot spot drift or true polar wander as explanations for inferred paleolatitudes from the Emperor chain. Whether there was significant relative motion within the hot spots under the Pacific plate during the last ˜70 m.y. is difficult to quantify, given the paucity and geological uncertainty of age determinations. Evidence in support of plume drift appears limited to the period before the 47 Ma Hawaii-Emperor Bend and, apart from the direct paleolatitude determinations, may have been somewhat exaggerated.

  10. An Integrated Model of Choices and Response Times in Absolute Identification

    ERIC Educational Resources Information Center

    Brown, Scott D.; Marley, A. A. J.; Donkin, Christopher; Heathcote, Andrew

    2008-01-01

    Recent theoretical developments in the field of absolute identification have stressed differences between relative and absolute processes, that is, whether stimulus magnitudes are judged relative to a shorter term context provided by recently presented stimuli or a longer term context provided by the entire set of stimuli. The authors developed a…

  11. Using "Ridge-Spotting" as a Test for Pacific Absolute Plate Motion Models

    NASA Astrophysics Data System (ADS)

    Wessel, P.; Müller, D.; Williams, S.

    2015-12-01

    In the mid-1990s the "hotspotting" technique was developed to assess the internal consistency of Pacific absolute plate motions (APM) models derived from hotspot trails, with the assumption that mantle plumes were fixed. Being a variant of the Hough transform, hotspotting maps a dated location (1-D geometry) on the seafloor to a flow line (2-D geometry). The accumulation of intersections of these flow lines reveals the optimal location of a fixed hotspot, assuming that the plate motion model is correct. It is the optimal exploratory technique for a planet with moving rigid plates over a set of fixed hotspots. However, it seems increasingly unlikely that we live on such a planet. Avoiding hotspots altogether we introduce "ridge-spotting", another promising technique for a planet with moving rigid plates and fixed ridges. Alas, we may not be living on that planet either. Yet, ridges are expected to undergo slow changes (ridge jumps notwithstanding), but that does not necessarily imply that an optimal APM model should minimize the ridge migration speed. In particular, ridges between stationary continental plates and fast-moving oceanic plates will move relatively fast, and an APM model should be expected to reflect this motion. In contrast, ridges that have been "pinned" by large mantle upwellings for considerable periods of time might be expected to favor APM models that minimize ridge migration. Given the long-lived super-plume mantle upwelling in the Equatorial Pacific it seems possible that the East-Pacific Rise may be a candidate for the second scenario, while the Pacific-Antarctic ridge, pushing the Pacific away from a near-stationary Antarctic continent, may be a candidate for the former. We present the ridge-spotting method and test published Pacific APM models using seafloor formed at the two ridges. Preliminary results indicate that ridge-spotting identifies problematic APM models because they imply unreasonable ridge migration. Fixed hotspot APM models, but

  12. Acute radiation risk models

    NASA Astrophysics Data System (ADS)

    Smirnova, Olga

    Biologically motivated mathematical models describing the dynamics of the major hematopoietic lineages (the thrombocytopoietic, lymphocytopoietic, granulocytopoietic, and erythropoietic systems) in acutely or chronically irradiated humans are developed. These models are implemented as systems of nonlinear differential equations whose variables and constant parameters have clear biological meaning. It is shown that the developed models are capable of reproducing clinical data on the dynamics of these systems in humans exposed to acute radiation as a result of incidents and accidents, as well as in humans exposed to low-level chronic radiation. Moreover, the averaged value of the "lethal" dose rates of chronic irradiation evaluated within the models of these four major hematopoietic lineages coincides with the real minimal dose rate of lethal chronic irradiation. The demonstrated ability of the models of the human thrombocytopoietic, lymphocytopoietic, granulocytopoietic, and erythropoietic systems to predict the dynamical response of these systems to acute/chronic irradiation over wide ranges of doses and dose rates implies that these mathematical models form a universal tool for the investigation and prediction of the dynamics of the major human hematopoietic lineages for a vast pattern of irradiation scenarios. In particular, these models could be applied to radiation risk assessment for the health of astronauts exposed to space radiation during long-term space missions, such as voyages to Mars or Lunar colonies, as well as for the health of people exposed to acute/chronic irradiation due to environmental radiological events.

  13. Model based period analysis of absolute and relative survival with R: data preparation, model fitting and derivation of survival estimates.

    PubMed

    Holleczek, Bernd; Brenner, Hermann

    2013-05-01

    Period analysis is increasingly employed in analyses of long-term survival of patients with chronic diseases such as cancer, as it derives more up-to-date survival estimates than traditional cohort-based approaches. It has recently been extended with regression modelling using generalized linear models, which increases the precision of the survival estimates and enables assessment of, and adjustment for, the effects of additional covariates. This paper provides a detailed presentation of how model-based period analysis may be used to derive population-based absolute and relative survival estimates using the freely available R language and statistical environment and already available R programs for period analysis. After an introduction to the underlying regression model and a description of the software tools, we provide a step-by-step implementation of two regression models in R and illustrate how estimates and a test for trend over time in relative survival may be derived using data from a population-based cancer registry.

  14. Absolute masses and radii determination in multiplanetary systems without stellar models

    NASA Astrophysics Data System (ADS)

    Almenara, J. M.; Díaz, R. F.; Mardling, R.; Barros, S. C. C.; Damiani, C.; Bruno, G.; Bonfils, X.; Deleuil, M.

    2015-11-01

    The masses and radii of extrasolar planets are key observables for understanding their interior, formation and evolution. While transit photometry and Doppler spectroscopy are used to measure the radii and masses respectively of planets relative to those of their host star, estimates for the true values of these quantities rely on theoretical models of the host star which are known to suffer from systematic differences with observations. When a system is composed of more than two bodies, extra information is contained in the transit photometry and radial velocity data. Velocity information (finite speed-of-light, Doppler) is needed to break the Newtonian MR^-3 degeneracy. We performed a photodynamical modelling of the two-planet transiting system Kepler-117 using all photometric and spectroscopic data available. We demonstrate how absolute masses and radii of single-star planetary systems can be obtained without resorting to stellar models. Limited by the precision of available radial velocities (38 m s^-1), we achieve accuracies of 20 per cent in the radii and 70 per cent in the masses, while simulated 1 m s^-1 precision radial velocities lower these to 1 per cent for the radii and 2 per cent for the masses. Since transiting multiplanet systems are common, this technique can be used to measure precisely the mass and radius of a large sample of stars and planets. We anticipate these measurements will become common when the TESS and PLATO missions provide high-precision light curves of a large sample of bright stars. These determinations will improve our knowledge about stars and planets, and provide strong constraints on theoretical models.

  15. 3D geomechanical-numerical modelling of the absolute stress state for geothermal reservoir exploration

    NASA Astrophysics Data System (ADS)

    Reiter, Karsten; Heidbach, Oliver; Moeck, Inga

    2013-04-01

    For the assessment and exploration of a potential geothermal reservoir, the contemporary in-situ stress is of key importance for well stability and the orientation of possible fluid pathways. However, available data, e.g. Heidbach et al. (2009) or Zang et al. (2012), deliver only pointwise information on parts of the six independent components of the stress tensor. Moreover, most measurements of stress orientation and magnitude are made for the hydrocarbon industry, typically at shallow depth. Interpolation across long distances or extrapolation to depth is unfavourable, because this would ignore structural features, inhomogeneities in the crust, or other local effects such as topography. For these reasons, geomechanical-numerical modelling is the preferred method to quantify the orientations and magnitudes of the 3D stress field for a geothermal reservoir. A geomechanical-numerical model estimating the 3D absolute stress state requires the initial stress state as a model constraint, but in-situ stress measurements within or close to a potential reservoir are rare. For that reason a larger regional geomechanical-numerical model is necessary, which provides boundary conditions for the desired local reservoir model. Such a large-scale model has to be tested against in-situ stress measurements, orientations and magnitudes. Other suitable and available data, such as GPS measurements or fault slip rates, are useful to constrain kinematic boundary conditions. This stepwise approach from regional to local scale takes all stress field factors into account, from first over second up to third order. As an example we present a large-scale crustal and upper mantle 3D geomechanical-numerical model of the Alberta Basin and its surroundings, which is constructed to describe the full stress tensor continuously. In-situ stress measurements are the preferred data, because they deliver the most direct information on the stress field and provide insights at different depths, a

  16. Absolute Entropy and Energy of Carbon Dioxide Using the Two-Phase Thermodynamic Model.

    PubMed

    Huang, Shao-Nung; Pascal, Tod A; Goddard, William A; Maiti, Prabal K; Lin, Shiang-Tai

    2011-06-14

    The two-phase thermodynamic (2PT) model is used to determine the absolute entropy and energy of carbon dioxide over a wide range of conditions from molecular dynamics trajectories. The 2PT method determines the thermodynamic properties by applying the proper statistical mechanical partition function to the normal modes of a fluid. The vibrational density of states (DoS), obtained from the Fourier transform of the velocity autocorrelation function, converges quickly, allowing the free energy, entropy, and other thermodynamic properties to be determined from short 20-ps MD trajectories. The anharmonic effects in the vibrations are accounted for by the broadening of the normal modes into bands from sampling the velocities over the trajectory. The low frequency diffusive modes, which lead to finite DoS at zero frequency, are accounted for by considering the DoS as a superposition of gas-phase and solid-phase components (two phases). The analytical decomposition of the DoS allows for an evaluation of properties contributed by different types of molecular motions. We show that this 2PT analysis leads to accurate predictions of entropy and energy of CO2 over a wide range of conditions (from the triple point to the critical point of both the vapor and the liquid phases along the saturation line). This allows the equation of state of CO2 to be determined, which is limited only by the accuracy of the force field. We also validated that the 2PT entropy agrees with that determined from thermodynamic integration, but 2PT requires only a fraction of the time. A complication for CO2 is that its equilibrium configuration is linear, which would have only two rotational modes, but during the dynamics it is never exactly linear, so that there is a third mode from rotation about the molecular axis. In this work, we show how to treat such linear molecules in the 2PT framework.
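
    A minimal sketch (up to discretisation and normalisation details) of the first step described above: the vibrational density of states obtained from the Fourier transform of the mass-weighted velocity autocorrelation function. The decomposition into gas-like and solid-like components that defines 2PT is omitted.

```python
import numpy as np

def density_of_states(velocities, masses, dt_fs, temperature_K):
    """Vibrational density of states (DoS) from the Fourier transform of the
    mass-weighted velocity autocorrelation function -- the first step of 2PT.

    velocities : array (n_frames, n_atoms, 3), in Angstrom/fs
    masses     : array (n_atoms,), in amu
    """
    kB = 8.314462618e-7            # Boltzmann constant in amu*A^2/fs^2/K
    n_frames = velocities.shape[0]
    # Mass-weighted VACF C(t), by direct (slow but transparent) evaluation
    vacf = np.empty(n_frames)
    for lag in range(n_frames):
        prod = np.sum(velocities[: n_frames - lag] * velocities[lag:], axis=(0, 2))
        vacf[lag] = np.sum(masses * prod) / (n_frames - lag)
    # S(nu) = (2/kB T) * integral of C(t) over the full time axis; the VACF is
    # even in t, so twice the one-sided cosine transform approximates it.
    one_sided = np.fft.rfft(vacf).real * dt_fs
    dos = (2.0 / (kB * temperature_K)) * 2.0 * one_sided
    freqs = np.fft.rfftfreq(n_frames, d=dt_fs)    # frequencies in 1/fs
    return freqs, dos
```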

  17. Absolute Zero

    NASA Astrophysics Data System (ADS)

    Donnelly, Russell J.; Sheibley, D.; Belloni, M.; Stamper-Kurn, D.; Vinen, W. F.

    2006-12-01

    Absolute Zero is a two hour PBS special attempting to bring to the general public some of the advances made in 400 years of thermodynamics. It is based on the book “Absolute Zero and the Conquest of Cold” by Tom Shachtman. Absolute Zero will call long-overdue attention to the remarkable strides that have been made in low-temperature physics, a field that has produced 27 Nobel Prizes. It will explore the ongoing interplay between science and technology through historical examples including refrigerators, ice machines, frozen foods, liquid oxygen and nitrogen as well as much colder fluids such as liquid hydrogen and liquid helium. A website has been established to promote the series: www.absolutezerocampaign.org. It contains information on the series, aimed primarily at students at the middle school level. There is a wealth of material here and we hope interested teachers will draw their students’ attention to this website and its substantial contents, which have been carefully vetted for accuracy.

  18. Gender equality and women's absolute status: a test of the feminist models of rape.

    PubMed

    Martin, Kimberly; Vieraitis, Lynne M; Britto, Sarah

    2006-04-01

    Feminist theory predicts both a positive and negative relationship between gender equality and rape rates. Although liberal and radical feminist theory predicts that gender equality should ameliorate rape victimization, radical feminist theorists have argued that gender equality may increase rape in the form of male backlash. Alternatively, Marxist criminologists focus on women's absolute socioeconomic status rather than gender equality as a predictor of rape rates, whereas socialist feminists combine both radical and Marxist perspectives. This study uses factor analysis to overcome multicollinearity limitations of past studies while exploring the relationship between women's absolute and relative socioeconomic status on rape rates in major U.S. cities using 2000 census data. The findings indicate support for both the Marxist and radical feminist explanations of rape but no support for the ameliorative hypothesis. These findings support a more inclusive socialist feminist theory that takes both Marxist and radical feminist hypotheses into account.

  19. Assessing the goodness of fit of personal risk models.

    PubMed

    Gong, Gail; Quante, Anne S; Terry, Mary Beth; Whittemore, Alice S

    2014-08-15

    We describe a flexible family of tests for evaluating the goodness of fit (calibration) of a pre-specified personal risk model to the outcomes observed in a longitudinal cohort. Such evaluation involves using the risk model to assign each subject an absolute risk of developing the outcome within a given time from cohort entry and comparing subjects' assigned risks with their observed outcomes. This comparison involves several issues. For example, subjects followed only for part of the risk period have unknown outcomes. Moreover, existing tests do not reveal the reasons for poor model fit when it occurs, which can reflect misspecification of the model's hazards for the competing risks of outcome development and death. To address these issues, we extend the model-specified hazards for outcome and death, and use score statistics to test the null hypothesis that the extensions are unnecessary. Simulated cohort data applied to risk models whose outcome and mortality hazards agreed and disagreed with those generating the data show that the tests are sensitive to poor model fit, provide insight into the reasons for poor fit, and accommodate a wide range of model misspecification. We illustrate the methods by examining the calibration of two breast cancer risk models as applied to a cohort of participants in the Breast Cancer Family Registry. The methods can be implemented using the Risk Model Assessment Program, an R package freely available at http://stanford.edu/~ggong/rmap/.
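
    The sketch below is not the rmap score-test machinery; it only illustrates the underlying calibration comparison of observed outcomes with assigned absolute risks, grouped by risk decile (a Hosmer-Lemeshow-style summary), ignoring the censoring and competing mortality handled by the full methodology.

```python
import numpy as np
from scipy.stats import chi2

def calibration_chi2(assigned_risk, observed_outcome, n_bins=10):
    """Compare observed event counts with expected counts in deciles of
    assigned absolute risk (ignores censoring and competing risks)."""
    order = np.argsort(assigned_risk)
    stat = 0.0
    for idx in np.array_split(order, n_bins):
        expected = assigned_risk[idx].sum()
        observed = observed_outcome[idx].sum()
        variance = np.sum(assigned_risk[idx] * (1 - assigned_risk[idx]))
        stat += (observed - expected) ** 2 / variance
    return stat, chi2.sf(stat, df=n_bins)

# Simulated cohort whose outcomes follow the assigned risks (well calibrated)
rng = np.random.default_rng(7)
risk = rng.beta(2, 18, size=2000)                 # assigned 10-year risks
outcome = (rng.random(2000) < risk).astype(float) # observed outcomes
print(calibration_chi2(risk, outcome))
```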

  20. Upscaling of absolute permeability for a super element model of petroleum reservoir

    NASA Astrophysics Data System (ADS)

    Mazo, A. B.; Potashev, K. A.

    2016-11-01

    This paper presents a new method of local upscaling of absolute permeability for super element simulation of an oil reservoir. Upscaling was performed for each block of a super element unstructured grid. For this purpose, a set of problems of a one-phase steady-state flow was solved on a fine computational grid with the initial scalar field of absolute permeability with various boundary conditions. These conditions reflect the specific variants of filtrational flow through the super element and take into account the presence or absence of boreholes in the coarse block. The resulting components of the effective permeability tensor in each super element were found from the solution of the problem of minimizing the deviations of the normal flows through the super element faces, averaged on a detailed computational grid, from those approximated on a coarse super element grid. The results of using the method are demonstrated for reservoirs with river-type absolute permeability. The method is compared with the traditional methods of local upscaling.

  1. Determining the importance of model calibration for forecasting absolute/relative changes in streamflow from LULC and climate changes

    USGS Publications Warehouse

    Niraula, Rewati; Meixner, Thomas; Norman, Laura M.

    2015-01-01

    Land use/land cover (LULC) and climate changes are important drivers of change in streamflow. Assessing the impact of LULC and climate changes on streamflow is typically done with a calibrated and validated watershed model. However, there is a debate on the degree of calibration required. The objective of this study was to quantify the variation in estimated relative and absolute changes in streamflow associated with LULC and climate changes with different calibration approaches. The Soil and Water Assessment Tool (SWAT) was applied in an uncalibrated (UC), single outlet calibrated (OC), and spatially-calibrated (SC) mode to compare the relative and absolute changes in streamflow at 14 gaging stations within the Santa Cruz River Watershed in southern Arizona, USA. For this purpose, the effects of 3 LULC, 3 precipitation (P), and 3 temperature (T) scenarios were tested individually. For the validation period, Percent Bias (PBIAS) values were >100% with the UC model for all gages, the values were between 0% and 100% with the OC model and within 20% with the SC model. Changes in streamflow predicted with the UC and OC models were compared with those of the SC model. This approach implicitly assumes that the SC model is “ideal”. Results indicated that the magnitude of both absolute and relative changes in streamflow due to LULC predicted with the UC and OC results were different from those of the SC model. The magnitude of absolute changes predicted with the UC and SC models due to climate change (both P and T) were also significantly different, but were not different for OC and SC models. Results clearly indicated that relative changes due to climate change predicted with the UC and OC were not significantly different from those predicted with the SC models. This result suggests that it is important to calibrate the model spatially to analyze the effect of LULC change but not as important for analyzing the relative change in streamflow due to climate change. This
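
    For reference, the percent bias statistic used above can be computed as in the sketch below (using the common convention PBIAS = 100·Σ(obs − sim)/Σ(obs)); the streamflow numbers are invented.

```python
import numpy as np

def pbias(observed, simulated):
    """Percent bias: positive values indicate underestimation of streamflow,
    negative values indicate overestimation (PBIAS = 100 * sum(obs - sim) / sum(obs))."""
    observed, simulated = np.asarray(observed, float), np.asarray(simulated, float)
    return 100.0 * np.sum(observed - simulated) / np.sum(observed)

# Hypothetical monthly streamflow (m^3/s) at one gage
obs = [12.1, 9.4, 20.3, 35.0, 18.2, 7.6]
sim_uncalibrated = [30.5, 25.0, 41.2, 60.1, 39.8, 22.4]
print(pbias(obs, sim_uncalibrated))   # large negative value -> strong overestimation
```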

  2. Absolute model ages of mantled surfaces in Malea Planum and Utopia Planitia, Mars.

    NASA Astrophysics Data System (ADS)

    Willmes, M.; Hiesinger, H.; Reiss, D.; Zanetti, M.

    2009-04-01

    The surface of Mars is partially covered by a latitude-dependent ice-rich smooth mantle in the middle and high latitudes (±30-60°) [1, 2]. These deposits relate to changes in the obliquity of Mars which have led to major shifts in the Martian climate and repeated global episodes of deposition [3]. The deposits vary in thickness and are usually independent of local geology, topography and elevation. In this study we have determined absolute model ages for the mantled surface units in Utopia Planitia (northern hemisphere) and Malea Planum (southern hemisphere) using crater statistics [4]. These regions show a specific type of mantle degradation called scalloped terrain, and modelled crater retention ages of the easily eroded mantle in these regions reveal the time since the last resurfacing. Images from the High Resolution Imaging Science Experiment (HiRISE) (25-50 cm/pixel spatial resolution) on board the Mars Reconnaissance Orbiter (MRO) were analyzed, continuous areas of smooth mantle were mapped, and small, fresh, unmodified craters were counted. Both regions show degradation features of the mantle in varying degrees. The mantle in Utopia Planitia appears heavily modified by polygonal fractures and scalloped depressions [5]. Scalloped depressions are also found in Malea Planum, but the mantle appears much smoother and less modified by periglacial processes [5, 6]. The study areas totalled 722 km² in Utopia Planitia, and 296 km² in Malea Planum. Model ages for these regions were determined using the chronology function of Hartmann and Neukum [4] and the production function of Ivanov [7]. The model ages show that the mantle unit for the area mapped in Utopia Planitia is 0.65 (+0.35/-0.41) to 2.9 (+0.69/-0.75) Myr old and that in Malea Planum is 3.0 (+1.5/-1.7) to 4.5 (+1.3/-1.4) Myr old, and that both regions represent very recent Amazonian terrain. This is also in agreement with the observed young degradation features described by [6, 8]. We acknowledge that the
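
    A hedged sketch of turning a measured crater density into a model age by inverting a chronology function of the Hartmann-Neukum type; the coefficients below are those commonly quoted for Mars and should be treated as illustrative rather than the exact production and chronology functions used in this study.

```python
import numpy as np
from scipy.optimize import brentq

def mars_chronology_N1(age_gyr):
    """Cumulative density of craters >= 1 km (per km^2) as a function of age.
    Coefficients as commonly quoted for the Mars chronology function
    (Hartmann & Neukum / Ivanov); treat them as illustrative here."""
    return 2.68e-14 * (np.exp(6.93 * age_gyr) - 1.0) + 4.13e-4 * age_gyr

def model_age_gyr(n1_measured):
    """Invert the chronology function for a measured N(1 km) crater density."""
    return brentq(lambda t: mars_chronology_N1(t) - n1_measured, 1e-6, 4.6)

# A young mantled surface: for Amazonian ages the function is nearly linear,
# so counts scaled to an N(1 km) of ~1e-6 per km^2 give Myr-scale model ages.
print(model_age_gyr(1.2e-6) * 1000.0, "Myr")
```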

  3. An absolute chronology for early Egypt using radiocarbon dating and Bayesian statistical modelling.

    PubMed

    Dee, Michael; Wengrow, David; Shortland, Andrew; Stevenson, Alice; Brock, Fiona; Girdland Flink, Linus; Bronk Ramsey, Christopher

    2013-11-08

    The Egyptian state was formed prior to the existence of verifiable historical records. Conventional dates for its formation are based on the relative ordering of artefacts. This approach is no longer considered sufficient for cogent historical analysis. Here, we produce an absolute chronology for Early Egypt by combining radiocarbon and archaeological evidence within a Bayesian paradigm. Our data cover the full trajectory of Egyptian state formation and indicate that the process occurred more rapidly than previously thought. We provide a timeline for the First Dynasty of Egypt of generational-scale resolution that concurs with prevailing archaeological analysis and produce a chronometric date for the foundation of Egypt that distinguishes between historical estimates.

  4. Absolute Photometry

    NASA Astrophysics Data System (ADS)

    Hartig, George

    1990-12-01

    The absolute sensitivity of the FOS will be determined in SV by observing 2 stars at 3 epochs, first in 3 apertures (1.0", 0.5", and 0.3" circular) and then in 1 aperture (1.0" circular). In cycle 1, one star, BD+28D4211 will be observed in the 1.0" aperture to establish the stability of the sensitivity and flat field characteristics and improve the accuracy obtained in SV. This star will also be observed through the paired apertures since these are not calibrated in SV. The stars will be observed in most detector/grating combinations. The data will be averaged to form the inverse sensitivity functions required by RSDP.

  5. RISK 0301 - MOLECULAR MODELING

    EPA Science Inventory

    Risk assessment practices, in general, for a range of diseases now encourage the use of mechanistic data to enhance the ability to predict responses at low, environmental exposures. In particular, the pathway from normal biology to pathologic state can be described by a set of m...

  6. Absolute neutrino mass scale

    NASA Astrophysics Data System (ADS)

    Capelli, Silvia; Di Bari, Pasquale

    2013-04-01

    Neutrino oscillation experiments firmly established non-vanishing neutrino masses, a result that can be regarded as a strong motivation to extend the Standard Model. In spite of being the lightest massive particles, neutrinos likely represent an important bridge to new physics at very high energies and offer new opportunities to address some of the current cosmological puzzles, such as the matter-antimatter asymmetry of the Universe and Dark Matter. In this context, the determination of the absolute neutrino mass scale is a key issue within modern High Energy Physics. The talks in this parallel session describe the current exciting experimental activity aimed at determining the absolute neutrino mass scale and offer an overview of a few models beyond the Standard Model that have been proposed to explain the neutrino masses, giving a prediction for the absolute neutrino mass scale and solving the cosmological puzzles.

  7. Breast Cancer Risk Prediction Models

    Cancer.gov

    Developing statistical models that estimate the probability of developing breast cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  8. Esophageal Cancer Risk Prediction Models

    Cancer.gov

    Developing statistical models that estimate the probability of developing esophageal cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  9. Colorectal Cancer Risk Prediction Models

    Cancer.gov

    Developing statistical models that estimate the probability of developing colorectal cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  10. Prostate Cancer Risk Prediction Models

    Cancer.gov

    Developing statistical models that estimate the probability of developing prostate cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  11. Pancreatic Cancer Risk Prediction Models

    Cancer.gov

    Developing statistical models that estimate the probability of developing pancreatic cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  12. Lung Cancer Risk Prediction Models

    Cancer.gov

    Developing statistical models that estimate the probability of developing lung cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  13. Testicular Cancer Risk Prediction Models

    Cancer.gov

    Developing statistical models that estimate the probability of developing testicular cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  14. Ovarian Cancer Risk Prediction Models

    Cancer.gov

    Developing statistical models that estimate the probability of developing ovarian cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  15. Cervical Cancer Risk Prediction Models

    Cancer.gov

    Developing statistical models that estimate the probability of developing cervical cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  16. Bladder Cancer Risk Prediction Models

    Cancer.gov

    Developing statistical models that estimate the probability of developing bladder cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  17. Liver Cancer Risk Prediction Models

    Cancer.gov

    Developing statistical models that estimate the probability of developing liver cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  18. Risk modelling in portfolio optimization

    NASA Astrophysics Data System (ADS)

    Lam, W. H.; Jaaman, Saiful Hafizah Hj.; Isa, Zaidi

    2013-09-01

    Risk management is very important in portfolio optimization. The mean-variance model has been used in portfolio optimization to minimize the investment risk. The objective of the mean-variance model is to minimize the portfolio risk while achieving the target rate of return. Variance is used as the risk measure in the mean-variance model. The purpose of this study is to compare the portfolio composition as well as the performance of the optimal portfolio of the mean-variance model and an equally weighted portfolio. In an equally weighted portfolio, the proportions invested in each asset are equal. The results show that the compositions of the mean-variance optimal portfolio and the equally weighted portfolio are different. Besides that, the mean-variance optimal portfolio gives better performance because it gives a higher performance ratio than the equally weighted portfolio.
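
    A minimal sketch of the mean-variance formulation described above: minimize portfolio variance subject to a target return and full investment, then compare with equal weights; the return and covariance inputs are invented.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical annualised mean returns and covariance for four assets
mu = np.array([0.08, 0.12, 0.10, 0.05])
cov = np.array([[0.040, 0.006, 0.010, 0.002],
                [0.006, 0.090, 0.012, 0.003],
                [0.010, 0.012, 0.060, 0.004],
                [0.002, 0.003, 0.004, 0.010]])
target_return = 0.09

def variance(w):
    return w @ cov @ w

constraints = [{"type": "eq", "fun": lambda w: w.sum() - 1.0},          # fully invested
               {"type": "eq", "fun": lambda w: w @ mu - target_return}] # target return
bounds = [(0.0, 1.0)] * len(mu)                                         # long-only

res = minimize(variance, x0=np.full(4, 0.25), bounds=bounds, constraints=constraints)
w_mv = res.x
w_eq = np.full(4, 0.25)

for name, w in [("mean-variance", w_mv), ("equal weight", w_eq)]:
    ret, vol = w @ mu, np.sqrt(variance(w))
    print(f"{name:14s} weights={np.round(w, 3)} return={ret:.3f} "
          f"performance ratio={ret / vol:.2f}")
```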

  19. Models for Pesticide Risk Assessment

    EPA Pesticide Factsheets

    EPA considers the toxicity of the pesticide as well as the amount of pesticide to which a person or the environment may be exposed in risk assessment. Scientists use mathematical models to predict pesticide concentrations in exposure assessment.

  20. Stochastic and empirical models of the absolute asymmetric synthesis by the Soai-autocatalysis.

    PubMed

    Barabás, Béla; Zucchi, Claudia; Maioli, Marco; Micskei, Károly; Pályi, Gyula

    2015-02-01

    Absolute asymmetric synthesis (AAS) is the preparation of pure (or excess of one) enantiomer of a chiral compound from achiral precursor(s) by a chemical reaction, without enantiopure chiral additive and/or without applied asymmetric physical field. Only one well-characterized example of AAS is known today: the Soai-autocatalysis. In an attempt at clarification of the mechanism of this particular reaction we have undertaken empirical and stochastic analysis of several parallel AAS experiments. Our results show that the initial steps of the reaction might be controlled by simple normal distribution ("coin tossing") formalism. Advanced stages of the reaction, however, appear to be of a more complicated nature. Symmetric beta distribution formalism could not be brought into correspondence with the experimental observations. A bimodal beta distribution algorithm provided suitable agreement with the experimental data. The parameters of this bimodal beta function were determined by a Pólya-urn experiment (simulated by computer). Interestingly, parameters of the resulting bimodal beta function give a golden section ratio. These results show, that in this highly interesting autocatalysis two or even perhaps three catalytic cycles are cooperating. An attempt at constructing a "designed" Soai-type reaction system has also been made.
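
    A small sketch of the computer Pólya-urn experiment mentioned above: starting from one ball of each "colour" (enantiomer), each drawn ball is returned together with an extra ball of the same colour, mimicking autocatalytic amplification. With a symmetric 1:1 start the limiting fraction follows a uniform (Beta(1,1)) distribution; it is the biased or multi-cycle variants that produce the bimodal beta shapes discussed in the paper.

```python
import numpy as np

def polya_urn_final_fraction(n_steps=10_000, rng=None):
    """One Pólya-urn run: start with one 'R' and one 'S' ball; each draw
    returns the ball plus one more of the same colour (autocatalytic feedback)."""
    if rng is None:
        rng = np.random.default_rng()
    r, s = 1, 1
    for _ in range(n_steps):
        if rng.random() < r / (r + s):
            r += 1
        else:
            s += 1
    return r / (r + s)

rng = np.random.default_rng(42)
fractions = np.array([polya_urn_final_fraction(rng=rng) for _ in range(2000)])
# Mean ~0.5 with a large spread: single runs often end strongly enriched
# in one enantiomer even though the ensemble is symmetric.
print(fractions.mean(), fractions.std())
```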

  1. Multifractal Value at Risk model

    NASA Astrophysics Data System (ADS)

    Lee, Hojin; Song, Jae Wook; Chang, Woojin

    2016-06-01

    In this paper a new Value at Risk (VaR) model is proposed and investigated. We consider the multifractal property of financial time series and develop a multifractal Value at Risk (MFVaR) model. The MFVaR introduced in this paper is analytically tractable and not based on simulation. An empirical study showed that MFVaR provides more stable and accurate forecasting performance in volatile financial markets where large losses can be incurred. This implies that our multifractal VaR works well for the risk measurement of extreme credit events.

  2. Cabin Environment Physics Risk Model

    NASA Technical Reports Server (NTRS)

    Mattenberger, Christopher J.; Mathias, Donovan Leigh

    2014-01-01

    This paper presents a Cabin Environment Physics Risk (CEPR) model that predicts the time for an initial failure of Environmental Control and Life Support System (ECLSS) functionality to propagate into a hazardous environment and trigger a loss-of-crew (LOC) event. This physics-of-failure model allows a probabilistic risk assessment of a crewed spacecraft to account for the cabin environment, which can serve as a buffer to protect the crew during an abort from orbit and ultimately enable a safe return. The results of the CEPR model replace the assumption that failure of crew-critical ECLSS functionality causes LOC instantly, and provide a more accurate representation of the spacecraft's risk posture. The instant-LOC assumption is shown to be excessively conservative and, moreover, can impact the relative risk drivers identified for the spacecraft. This, in turn, could lead the design team to allocate mass for equipment to reduce overly conservative risk estimates in a suboptimal configuration, which inherently increases the overall risk to the crew. For example, available mass could be poorly used to add redundant ECLSS components that have a negligible benefit but appear to make the vehicle safer due to poor assumptions about the propagation time of ECLSS failures.

  3. Information risk and security modeling

    NASA Astrophysics Data System (ADS)

    Zivic, Predrag

    2005-03-01

    This research paper presentation will feature current frameworks for addressing risk and security modeling and metrics. The paper will analyze technical-level risk and security metrics of Common Criteria/ISO15408, Centre for Internet Security guidelines and NSA configuration guidelines, and the metrics used at this level. The view of IT operational standards on security metrics, such as GMITS/ISO13335 and ITIL/ITMS, and architectural guidelines such as ISO7498-2, will be explained. Business process level standards such as ISO17799, COSO and CobiT will be presented with their control approach to security metrics. At the top level, maturity standards such as SSE-CMM/ISO21827, NSA Infosec Assessment and CobiT will be explored and reviewed. For each defined level of security metrics the presentation will explore the appropriate usage of these standards. The paper will discuss the standards' approaches to conducting risk and security metrics. The research findings will demonstrate the need for a common baseline for both risk and security metrics. This paper will show the relation between the attribute-based common baseline and corporate assets and controls for risk and security metrics. It will be shown that such an approach spans all the mentioned standards. A 3D visual presentation of the proposed approach and the development of the Information Security Model will be analyzed and postulated. The presentation will clearly demonstrate the benefits of the proposed attribute-based approach and the defined risk and security space for modeling and measuring.

  4. Command Process Modeling & Risk Analysis

    NASA Technical Reports Server (NTRS)

    Meshkat, Leila

    2011-01-01

    Commanding Errors may be caused by a variety of root causes. It's important to understand the relative significance of each of these causes for making institutional investment decisions. One of these causes is the lack of standardized processes and procedures for command and control. We mitigate this problem by building periodic tables and models corresponding to key functions within it. These models include simulation analysis and probabilistic risk assessment models.

  5. A Strict Test of Stellar Evolution Models: The Absolute Dimensions of the Massive Benchmark Eclipsing Binary V578 Mon

    NASA Astrophysics Data System (ADS)

    Garcia, E. V.; Stassun, Keivan G.; Pavlovski, K.; Hensberge, H.; Gómez Maqueo Chew, Y.; Claret, A.

    2014-09-01

    We determine the absolute dimensions of the eclipsing binary V578 Mon, a detached system of two early B-type stars (B0V + B1V, P = 2.40848 days) in the star-forming region NGC 2244 of the Rosette Nebula. From the light curve analysis of 40 yr of photometry and the analysis of HERMES spectra, we find radii of 5.41 ± 0.04 R⊙ and 4.29 ± 0.05 R⊙, and temperatures of 30,000 ± 500 K and 25,750 ± 435 K, respectively. We find that our disentangled component spectra for V578 Mon agree well with previous spectral disentangling from the literature. We also reconfirm the previous spectroscopic orbit of V578 Mon finding that masses of 14.54 ± 0.08 M⊙ and 10.29 ± 0.06 M⊙ are fully compatible with the new analysis. We compare the absolute dimensions to the rotating models of the Geneva and Utrecht groups and the models of the Granada group. We find that all three sets of models marginally reproduce the absolute dimensions of both stars with a common age within the uncertainty for gravity-effective temperature isochrones. However, there are some apparent age discrepancies for the corresponding mass-radius isochrones. Models with larger convective overshoot, >0.35, worked best. Combined with our previously determined apsidal motion of 0.07089^{+0.00021}_{-0.00013} deg cycle^-1, we compute the internal structure constants (tidal Love number) for the Newtonian and general relativistic contribution to the apsidal motion as log k_2 = -1.975 ± 0.017 and log k_2 = -3.412 ± 0.018, respectively. We find the relativistic contribution to the apsidal motion to be small, <4%. We find that the prediction of log k_2,theo = -2.005 ± 0.025 of the Granada models fully agrees with our observed log k_2.

  6. A strict test of stellar evolution models: The absolute dimensions of the massive benchmark eclipsing binary V578 Mon

    SciTech Connect

    Garcia, E. V.; Stassun, Keivan G.; Pavlovski, K.; Hensberge, H.; Chew, Y. Gómez Maqueo; Claret, A.

    2014-09-01

    We determine the absolute dimensions of the eclipsing binary V578 Mon, a detached system of two early B-type stars (B0V + B1V, P = 2.40848 days) in the star-forming region NGC 2244 of the Rosette Nebula. From the light curve analysis of 40 yr of photometry and the analysis of HERMES spectra, we find radii of 5.41 ± 0.04 R⊙ and 4.29 ± 0.05 R⊙, and temperatures of 30,000 ± 500 K and 25,750 ± 435 K, respectively. We find that our disentangled component spectra for V578 Mon agree well with previous spectral disentangling from the literature. We also reconfirm the previous spectroscopic orbit of V578 Mon finding that masses of 14.54 ± 0.08 M⊙ and 10.29 ± 0.06 M⊙ are fully compatible with the new analysis. We compare the absolute dimensions to the rotating models of the Geneva and Utrecht groups and the models of the Granada group. We find that all three sets of models marginally reproduce the absolute dimensions of both stars with a common age within the uncertainty for gravity-effective temperature isochrones. However, there are some apparent age discrepancies for the corresponding mass-radius isochrones. Models with larger convective overshoot, >0.35, worked best. Combined with our previously determined apsidal motion of 0.07089^{+0.00021}_{-0.00013} deg cycle^-1, we compute the internal structure constants (tidal Love number) for the Newtonian and general relativistic contribution to the apsidal motion as log k_2 = -1.975 ± 0.017 and log k_2 = -3.412 ± 0.018, respectively. We find the relativistic contribution to the apsidal motion to be small, <4%. We find that the prediction of log k_2,theo = -2.005 ± 0.025 of the Granada models fully agrees with our observed log k_2.

  7. Lunar Landing Operational Risk Model

    NASA Technical Reports Server (NTRS)

    Mattenberger, Chris; Putney, Blake; Rust, Randy; Derkowski, Brian

    2010-01-01

    Characterizing the risk of spacecraft goes beyond simply modeling equipment reliability. Some portions of the mission require complex interactions between system elements that can lead to failure without an actual hardware fault. Landing risk is currently the least characterized aspect of the Altair lunar lander and appears to result from complex temporal interactions between pilot, sensors, surface characteristics and vehicle capabilities rather than hardware failures. The Lunar Landing Operational Risk Model (LLORM) seeks to provide rapid and flexible quantitative insight into the risks driving the landing event and to gauge sensitivities of the vehicle to changes in system configuration and mission operations. The LLORM takes a Monte Carlo based approach to estimate the operational risk of the Lunar Landing Event and calculates estimates of the risk of Loss of Mission (LOM) - Abort Required and is Successful, Loss of Crew (LOC) - Vehicle Crashes or Cannot Reach Orbit, and Success. The LLORM is meant to be used during the conceptual design phase to inform decision makers transparently of the reliability impacts of design decisions, to identify areas of the design which may require additional robustness, and to aid in the development and flow-down of requirements.
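
    The abstract does not give the model internals, so the following is only a minimal Monte Carlo sketch in the spirit described: each simulated landing draws a hazard and an abort decision, and the outcome is binned into Success, LOM (abort required and successful), or LOC. All probabilities and the branching rule are notional placeholders, not Altair or LLORM values.

      import random

      def simulate_landing(rng, p_hazard=0.10, p_abort_given_hazard=0.7, p_abort_success=0.95):
          """One notional landing: hazard -> abort decision -> abort outcome (placeholder logic)."""
          if rng.random() >= p_hazard:
              return "Success"                      # nominal touchdown
          if rng.random() < p_abort_given_hazard:   # hazard detected, abort commanded
              return "LOM" if rng.random() < p_abort_success else "LOC"
          return "LOC"                              # hazard undetected: vehicle crashes

      rng = random.Random(1)
      n = 100_000
      counts = {"Success": 0, "LOM": 0, "LOC": 0}
      for _ in range(n):
          counts[simulate_landing(rng)] += 1
      for outcome, c in counts.items():
          print(f"{outcome}: {c / n:.4f}")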

  8. Prediction of absolute risk of fragility fracture at 10 years in a Spanish population: validation of the WHO FRAX™ tool in Spain

    PubMed Central

    2011-01-01

    Background: Age-related bone loss is asymptomatic, and the morbidity of osteoporosis is secondary to the fractures that occur. Common sites of fracture include the spine, hip, forearm and proximal humerus. Fractures at the hip incur the greatest morbidity and mortality and give rise to the highest direct costs for health services. Their incidence increases exponentially with age. Independently of changes in population demography, the age- and sex-specific incidence of osteoporotic fractures appears to be increasing in developing and developed countries. This could more than double the expected burden of osteoporotic fractures in the next 50 years. Methods/Design: To assess the predictive power of the WHO FRAX™ tool to identify the subjects with the highest absolute risk of fragility fracture at 10 years in a Spanish population, a predictive validation study of the tool will be carried out. For this purpose, the participants recruited by 1999 will be assessed. These were referred to the DXA scan department from primary healthcare centres and from non-hospital and hospital consultations. Study population: patients attended in the national health services, integrated into the FRIDEX cohort, with at least one dual-energy X-ray absorptiometry (DXA) measurement and one extensive questionnaire on fracture risk factors. Measurements: at baseline, bone mineral density measurement using DXA, a clinical fracture risk factor questionnaire, dietary calcium intake assessment, history of previous fractures, and related drugs; follow-up by telephone interview to ascertain fragility fractures over the 10 years, with verification in electronic medical records, and to record the number of falls in the last year. The absolute risk of fracture will be estimated using the FRAX™ tool from the official web site. Discussion: For more than 10 years, numerous publications have recognised the importance of risk factors other than low BMD for new osteoporotic fractures. The extension of a

  9. Risk perception in epidemic modeling

    NASA Astrophysics Data System (ADS)

    Bagnoli, Franco; Liò, Pietro; Sguanci, Luca

    2007-12-01

    We investigate the effects of risk perception in a simple model of epidemic spreading. We assume that the perception of the risk of being infected depends on the fraction of neighbors that are ill. The effect of this factor is to decrease the infectivity, which therefore becomes a dynamical component of the model. We study the problem in the mean-field approximation and by numerical simulations for regular, random, and scale-free networks. We show that for homogeneous and random networks, there is always a value of perception that stops the epidemic. In the “worst-case” scenario of a scale-free network with diverging input connectivity, a linear perception cannot stop the epidemic; however, we show that a nonlinear increase of the perceived risk may lead to the extinction of the disease. This transition is discontinuous and is not predicted by the mean-field analysis.
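
    The exact functional form of the perception term is in the paper, not the abstract; below is a minimal mean-field sketch in Python in which infectivity is reduced exponentially with the infected fraction, which is one plausible reading of "perception decreases infectivity". The parameter values and the exponential form are assumptions for illustration only.

      import math

      def mean_field_epidemic(tau0=0.3, recovery=0.1, J=5.0, i0=0.01, steps=500):
          """SIS-like mean-field iteration with perception-reduced infectivity (illustrative only)."""
          i = i0
          for _ in range(steps):
              tau = tau0 * math.exp(-J * i)              # assumed: infectivity drops with infected fraction
              i = i + tau * i * (1.0 - i) - recovery * i
              i = min(max(i, 0.0), 1.0)
          return i

      for J in (0.0, 5.0, 20.0):
          print(f"perception strength J={J:>4}: endemic fraction ~ {mean_field_epidemic(J=J):.3f}")

    With these placeholder parameters the endemic fraction shrinks as the perception strength J grows, mirroring the qualitative effect described in the abstract.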

  10. Individualized Risk Prediction Model for Lung Cancer in Korean Men

    PubMed Central

    Park, Sohee; Nam, Byung-Ho; Yang, Hye-Ryung; Lee, Ji An; Lim, Hyunsun; Han, Jun Tae; Park, Il Su; Shin, Hai-Rim; Lee, Jin Soo

    2013-01-01

    Purpose Lung cancer is the leading cause of cancer deaths in Korea. The objective of the present study was to develop an individualized risk prediction model for lung cancer in Korean men using population-based cohort data. Methods From a population-based cohort study of 1,324,804 Korean men free of cancer at baseline, the individualized absolute risk of developing lung cancer was estimated using the Cox proportional hazards model. We checked the validity of the model using C statistics and the Hosmer–Lemeshow chi-square test on an external validation dataset. Results The risk prediction model for lung cancer in Korean men included smoking exposure, age at smoking initiation, body mass index, physical activity, and fasting glucose levels. The model showed excellent performance (C statistic = 0.871, 95% CI = 0.867–0.876). Smoking was significantly associated with the risk of lung cancer in Korean men, with a four-fold increased risk in current smokers consuming more than one pack a day relative to non-smokers. Age at smoking initiation was also a significant predictor for developing lung cancer; a younger age at initiation was associated with a higher risk of developing lung cancer. Conclusion This is the first study to provide an individualized risk prediction model for lung cancer in an Asian population with very good model performance. In addition to current smoking status, earlier exposure to smoking was a very important factor for developing lung cancer. Since most of the risk factors are modifiable, this model can be used to identify those who are at a higher risk and who can subsequently modify their lifestyle choices to lower their risk of lung cancer. PMID:23408946

  11. Quantile uncertainty and value-at-risk model risk.

    PubMed

    Alexander, Carol; Sarabia, José María

    2012-08-01

    This article develops a methodology for quantifying model risk in quantile risk estimates. The application of quantile estimates to risk assessment has become common practice in many disciplines, including hydrology, climate change, statistical process control, insurance and actuarial science, and the uncertainty surrounding these estimates has long been recognized. Our work is particularly important in finance, where quantile estimates (called Value-at-Risk) have been the cornerstone of banking risk management since the mid 1980s. A recent amendment to the Basel II Accord recommends additional market risk capital to cover all sources of "model risk" in the estimation of these quantiles. We provide a novel and elegant framework whereby quantile estimates are adjusted for model risk, relative to a benchmark which represents the state of knowledge of the authority that is responsible for model risk. A simulation experiment in which the degree of model risk is controlled illustrates how to quantify Value-at-Risk model risk and compute the required regulatory capital add-on for banks. An empirical example based on real data shows how the methodology can be put into practice, using only two time series (daily Value-at-Risk and daily profit and loss) from a large bank. We conclude with a discussion of potential applications to nonfinancial risks.
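
    The article's actual adjustment framework is more elaborate; the Python sketch below only illustrates the raw ingredients it works with: a daily P&L series, a model's 99% quantile (VaR) estimate, a benchmark quantile, and an add-on taken here as the shortfall of the model VaR relative to the benchmark. The data, the 10% understatement, and the choice of benchmark are placeholders, not the authors' methodology.

      import random

      rng = random.Random(42)
      pnl = [rng.gauss(0.0, 1.0e6) for _ in range(500)]   # placeholder daily P&L, currency units

      def empirical_var(pnl_series, alpha=0.99):
          """Empirical value-at-risk: the alpha-quantile of the loss (= -P&L) distribution."""
          losses = sorted(-x for x in pnl_series)
          k = int(alpha * len(losses))
          return losses[min(k, len(losses) - 1)]

      benchmark_var = empirical_var(pnl)       # benchmark: empirical quantile of observed P&L
      model_var = 0.9 * benchmark_var          # pretend the bank's model understates risk by 10%
      addon = max(0.0, benchmark_var - model_var)
      print(f"model VaR {model_var:,.0f}, benchmark VaR {benchmark_var:,.0f}, add-on {addon:,.0f}")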

  12. Laminated Composite Shell Element Using Absolute Nodal Coordinate Formulation and Its Application to ANCF Tire Model

    DTIC Science & Technology

    2015-04-24

    Approved for public release; distribution is unlimited. Only fragments of the report text survive in this record; they concern accurate modeling of the complex tire geometry and the definition of the fiber coordinate system o-12 with respect to the material frame o-xy.

  13. Geomorphological Dating Using an Improved Scarp Degradation Model: Is This a Reliable Approach Compared With Common Absolute Dating Methods?

    NASA Astrophysics Data System (ADS)

    Oemisch, M.; Hergarten, S.; Neugebauer, H. J.

    2002-12-01

    Geomorphological dating of a certain landform or geomorphological structure is based on the evolution of the landscape itself. In this context it is difficult to use common absolute dating techniques such as luminescence and radiocarbon dating, because they require datable material which is often not available. Additionally, these methods do not always date the time since the formation of these structures. For these reasons, geomorphological dating seems to be a reliable approach for dating certain geomorphological features. The aim of our work is to relate present-day shapes of fault scarps and terrace risers to their ages. The time span since scarp formation ceased is reflected by the stage of degradation as well as the rounding of the profile edges due to erosive processes. It is assumed that the average rate of downslope soil movement depends on the local slope angle and can be described in terms of a diffusion equation. On the basis of these assumptions we present a model to simulate the temporal development of scarp degradation by erosion. A diffusivity reflecting the effects of soil erosion, surface runoff and detachability of particles, as well as present-day shapes of scarps, are included in the model. As observations of present-day scarps suggest a higher diffusivity at the toe than at the head of a slope, we suggest a linear approach with diffusivities increasing in the downslope direction. First results show a better match between simulated and observed profiles of the Upper Rhine Graben in comparison to models using a constant diffusivity. To date the scarps, the model has to be calibrated. For this purpose we estimate diffusivities by fitting modelled profiles to observed ones of known age. Field data have been collected in the area around Bonn, Germany and in the Alps, Switzerland. It is a matter of current research to assess the quality of this dating technique and to compare the results and the applicability with some of the absolute dating
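
    The abstract specifies only that degradation follows a diffusion equation with a diffusivity increasing linearly downslope; the explicit finite-difference Python sketch below implements that idea for a synthetic scarp profile. The grid, time step, diffusivity values and initial profile are arbitrary choices for illustration, not the authors' calibration.

      def degrade_scarp(height=2.0, width=50.0, dx=0.5, kappa0=0.5e-2, kappa_grad=2.0e-4,
                        years=10_000, dt=1.0):
          """Explicit solution of dz/dt = d/dx( kappa(x) dz/dx ) on a 1-D scarp profile."""
          n = int(width / dx) + 1
          x = [i * dx for i in range(n)]
          z = [height if xi < width / 2 else 0.0 for xi in x]   # initial vertical scarp
          kappa = [kappa0 + kappa_grad * xi for xi in x]        # diffusivity grows downslope
          for _ in range(int(years / dt)):
              # downslope soil flux at cell faces, flux = -kappa * dz/dx
              flux = [-0.5 * (kappa[i] + kappa[i + 1]) * (z[i + 1] - z[i]) / dx
                      for i in range(n - 1)]
              for i in range(1, n - 1):
                  z[i] -= dt * (flux[i] - flux[i - 1]) / dx     # mass-conserving update
          return x, z

      x, z = degrade_scarp()
      mid = len(z) // 2
      print("profile heights around the scarp midpoint:", [round(v, 2) for v in z[mid - 3:mid + 4]])

    Fitting the diffusivity parameters so that a modelled profile matches a dated reference profile is, in essence, the calibration step the abstract refers to.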

  14. Decent wage is more important than absolution of debts: A smallholder socio-hydrological modelling framework

    NASA Astrophysics Data System (ADS)

    Pande, Saket; Savenije, Hubert

    2015-04-01

    We present a framework to understand the socio-hydrological system dynamics of a smallholder. Smallholders are farmers who own less than 2 ha of farmland. It couples the dynamics of 6 main variables that are most relevant at the scale of a smallholder: local storage (soil moisture and other water storage), capital, knowledge, livestock production, soil fertility and grass biomass production. The hydroclimatic variability is at sub-annual scale and influences the socio-hydrology at annual scale. The model incorporates rule-based adaptation mechanisms (for example: adjusting expenditures on food and fertilizers, selling livestock, etc.) of smallholders when they face adverse socio-hydrological conditions, such as low annual rainfall, higher intra-annual variability in rainfall or variability in agricultural prices. We apply the framework to understand the socio-hydrology of a sugarcane smallholder in Aurangabad, Maharashtra. This district has witnessed suicides of many sugarcane farmers who could not extricate themselves from the debt trap. These farmers lack irrigation and are susceptible to fluctuating sugar prices and intra-annual hydro-climatic variability. We study the sensitivity of annual total capital averaged over 30 years, an indicator of smallholder wellbeing, to the initial capital that a smallholder starts with and the prevalent wage rates. We find that smallholder wellbeing is low (below Rs 30,000 per annum, a threshold above which a smallholder can afford a basic standard of living) and rather insensitive to initial capital at low wage rates. Initial capital perhaps matters to smallholder livelihoods at higher wage rates. Further, the smallholder system appears to be resilient at around Rs 115/manday, in the sense that small perturbations in wage rates around this rate still sustain the smallholder above the basic standard of living. Our results thus indicate that government intervention to absolve the debt of farmers is not enough. It

  15. Intelligent adversary risk analysis: a bioterrorism risk management model.

    PubMed

    Parnell, Gregory S; Smith, Christopher M; Moxley, Frederick I

    2010-01-01

    The tragic events of 9/11 and the concerns about the potential for a terrorist or hostile state attack with weapons of mass destruction have led to an increased emphasis on risk analysis for homeland security. Uncertain hazards (natural and engineering) have been successfully analyzed using probabilistic risk analysis (PRA). Unlike uncertain hazards, terrorists and hostile states are intelligent adversaries who can observe our vulnerabilities and dynamically adapt their plans and actions to achieve their objectives. This article compares uncertain hazard risk analysis with intelligent adversary risk analysis, describes the intelligent adversary risk analysis challenges, and presents a probabilistic defender-attacker-defender model to evaluate the baseline risk and the potential risk reduction provided by defender investments. The model includes defender decisions prior to an attack; attacker decisions during the attack; defender actions after an attack; and the uncertainties of attack implementation, detection, and consequences. The risk management model is demonstrated with an illustrative bioterrorism problem with notional data.

  16. Teaching Absolute Value Meaningfully

    ERIC Educational Resources Information Center

    Wade, Angela

    2012-01-01

    What is the meaning of absolute value? And why do teachers teach students how to solve absolute value equations? Absolute value is a concept introduced in first-year algebra and then reinforced in later courses. Various authors have suggested instructional methods for teaching absolute value to high school students (Wei 2005; Stallings-Roberts…

  17. A Quantitative Software Risk Assessment Model

    NASA Technical Reports Server (NTRS)

    Lee, Alice

    2002-01-01

    This slide presentation reviews a risk assessment model as applied to software development. The presentation uses graphs to demonstrate basic concepts of software reliability. It also discusses the application of the risk model to the software development life cycle.

  18. Comparing paired vs non-paired statistical methods of analyses when making inferences about absolute risk reductions in propensity-score matched samples.

    PubMed

    Austin, Peter C

    2011-05-20

    Propensity-score matching allows one to reduce the effects of treatment-selection bias or confounding when estimating the effects of treatments when using observational data. Some authors have suggested that methods of inference appropriate for independent samples can be used for assessing the statistical significance of treatment effects when using propensity-score matching. Indeed, many authors in the applied medical literature use methods for independent samples when making inferences about treatment effects using propensity-score matched samples. Dichotomous outcomes are common in healthcare research. In this study, we used Monte Carlo simulations to examine the effect on inferences about risk differences (or absolute risk reductions) when statistical methods for independent samples are used compared with when statistical methods for paired samples are used in propensity-score matched samples. We found that compared with using methods for independent samples, the use of methods for paired samples resulted in: (i) empirical type I error rates that were closer to the advertised rate; (ii) empirical coverage rates of 95 per cent confidence intervals that were closer to the advertised rate; (iii) narrower 95 per cent confidence intervals; and (iv) estimated standard errors that more closely reflected the sampling variability of the estimated risk difference. Differences between the empirical and advertised performance of methods for independent samples were greater when the treatment-selection process was stronger compared with when treatment-selection process was weaker. We recommend using statistical methods for paired samples when using propensity-score matched samples for making inferences on the effect of treatment on the reduction in the probability of an event occurring.
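
    For intuition, the Python sketch below contrasts the two variance formulas on a toy set of matched pairs with binary outcomes: the independent-samples variance ignores the within-pair pairing, while the paired variance is driven by the discordant-pair counts. The counts are made up for illustration and are not from the simulation study.

      import math

      # Toy 2x2 table of matched pairs (treated outcome, control outcome):
      # a: event in both, b: treated only, c: control only, d: neither
      a, b, c, d = 30, 25, 40, 405
      n = a + b + c + d                      # number of matched pairs

      p_t = (a + b) / n                      # event probability, treated
      p_c = (a + c) / n                      # event probability, control
      rd = p_t - p_c                         # risk difference (absolute risk reduction)

      se_indep = math.sqrt(p_t * (1 - p_t) / n + p_c * (1 - p_c) / n)
      se_paired = math.sqrt((b + c) / n**2 - (b - c) ** 2 / n**3)

      print(f"risk difference = {rd:+.3f}")
      print(f"SE (independent-samples formula) = {se_indep:.4f}")
      print(f"SE (paired formula)              = {se_paired:.4f}")

    With positively correlated outcomes within pairs, the paired standard error is the smaller of the two, which is consistent with the narrower confidence intervals reported in the abstract.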

  19. The performance of different propensity-score methods for estimating differences in proportions (risk differences or absolute risk reductions) in observational studies.

    PubMed

    Austin, Peter C

    2010-09-10

    Propensity score methods are increasingly being used to estimate the effects of treatments on health outcomes using observational data. There are four methods for using the propensity score to estimate treatment effects: covariate adjustment using the propensity score, stratification on the propensity score, propensity-score matching, and inverse probability of treatment weighting (IPTW) using the propensity score. When outcomes are binary, the effect of treatment on the outcome can be described using odds ratios, relative risks, risk differences, or the number needed to treat. Several clinical commentators suggested that risk differences and numbers needed to treat are more meaningful for clinical decision making than are odds ratios or relative risks. However, there is a paucity of information about the relative performance of the different propensity-score methods for estimating risk differences. We conducted a series of Monte Carlo simulations to examine this issue. We examined bias, variance estimation, coverage of confidence intervals, mean-squared error (MSE), and type I error rates. A doubly robust version of IPTW had superior performance compared with the other propensity-score methods. It resulted in unbiased estimation of risk differences, treatment effects with the lowest standard errors, confidence intervals with the correct coverage rates, and correct type I error rates. Stratification, matching on the propensity score, and covariate adjustment using the propensity score resulted in minor to modest bias in estimating risk differences. Estimators based on IPTW had lower MSE compared with other propensity-score methods. Differences between IPTW and propensity-score matching may reflect that these two methods estimate the average treatment effect and the average treatment effect for the treated, respectively.

  20. NGNP Risk Management Database: A Model for Managing Risk

    SciTech Connect

    John Collins; John M. Beck

    2011-11-01

    The Next Generation Nuclear Plant (NGNP) Risk Management System (RMS) is a database used to maintain the project risk register. The RMS also maps risk reduction activities to specific identified risks. Further functionality of the RMS includes mapping reactor suppliers' Design Data Needs (DDNs) to risk reduction tasks and mapping Phenomena Identification Ranking Tables (PIRTs) to associated risks. This document outlines the basic instructions on how to use the RMS. This document constitutes Revision 1 of the NGNP Risk Management Database: A Model for Managing Risk. It incorporates the latest enhancements to the RMS. The enhancements include six new custom views of risk data - Impact/Consequence, Tasks by Project Phase, Tasks by Status, Tasks by Project Phase/Status, Tasks by Impact/WBS, and Tasks by Phase/Impact/WBS.

  1. The reaction H + C4H2 - Absolute rate constant measurement and implication for atmospheric modeling of Titan

    NASA Technical Reports Server (NTRS)

    Nava, D. F.; Mitchell, M. B.; Stief, L. J.

    1986-01-01

    The absolute rate constant for the reaction H + C4H2 has been measured over the temperature (T) interval 210-423 K, using the technique of flash photolysis-resonance fluorescence. At each of the five temperatures employed, the results were independent of variations in C4H2 concentration, total pressure of Ar or N2, and flash intensity (i.e., the initial H concentration). The rate constant, k, was found to be equal to 1.39 × 10⁻¹⁰ exp(−1184/T) cm³ s⁻¹, with an error of one standard deviation. The Arrhenius parameters at the high pressure limit determined here for the H + C4H2 reaction are consistent with those for the corresponding reactions of H with C2H2 and C3H4. Implications of the kinetic carbon chemistry results, particularly those at low temperature, are considered for models of the atmospheric carbon chemistry of Titan. The rate of this reaction, relative to that of the analogous, but slower, reaction of H + C2H2, appears to make H + C4H2 a very feasible reaction pathway for effective conversion of H atoms to molecular hydrogen in the stratosphere of Titan.

  2. The reaction H + C4H2 - Absolute rate constant measurement and implication for atmospheric modeling of Titan

    NASA Astrophysics Data System (ADS)

    Nava, D. F.; Mitchell, M. B.; Stief, L. J.

    1986-04-01

    The absolute rate constant for the reaction H + C4H2 has been measured over the temperature (T) interval 210-423 K, using the technique of flash photolysis-resonance fluorescence. At each of the five temperatures employed, the results were independent of variations in C4H2 concentration, total pressure of Ar or N2, and flash intensity (i.e., the initial H concentration). The rate constant, k, was found to be equal to 1.39 × 10⁻¹⁰ exp(−1184/T) cm³ s⁻¹, with an error of one standard deviation. The Arrhenius parameters at the high pressure limit determined here for the H + C4H2 reaction are consistent with those for the corresponding reactions of H with C2H2 and C3H4. Implications of the kinetic carbon chemistry results, particularly those at low temperature, are considered for models of the atmospheric carbon chemistry of Titan. The rate of this reaction, relative to that of the analogous, but slower, reaction of H + C2H2, appears to make H + C4H2 a very feasible reaction pathway for effective conversion of H atoms to molecular hydrogen in the stratosphere of Titan.
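
    As a simple check of magnitude, the Python sketch below evaluates the reported Arrhenius expression k(T) = 1.39 × 10⁻¹⁰ exp(−1184/T) cm³ s⁻¹ at a few temperatures; the temperatures chosen in the loop are illustrative and simply span the measured 210-423 K interval.

      import math

      A = 1.39e-10        # cm^3 s^-1, pre-exponential factor from the abstract
      Ea_over_R = 1184.0  # K, activation temperature from the abstract

      def k(T):
          """Arrhenius rate constant for H + C4H2 at temperature T (K)."""
          return A * math.exp(-Ea_over_R / T)

      for T in (210.0, 298.0, 423.0):
          print(f"T = {T:5.0f} K  ->  k = {k(T):.2e} cm^3 s^-1")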

  3. Constraining the Absolute Orientation of eta Carinae's Binary Orbit: A 3-D Dynamical Model for the Broad [Fe III] Emission

    NASA Technical Reports Server (NTRS)

    Madura, T. I.; Gull, T. R.; Owocki, S. P.; Groh, J. H.; Okazaki, A. T.; Russell, C. M. P.

    2011-01-01

    We present a three-dimensional (3-D) dynamical model for the broad [Fe III] emission observed in Eta Carinae using the Hubble Space Telescope/Space Telescope Imaging Spectrograph (HST/STIS). This model is based on full 3-D Smoothed Particle Hydrodynamics (SPH) simulations of Eta Car's binary colliding winds. Radiative transfer codes are used to generate synthetic spectro-images of [Fe III] emission line structures at various observed orbital phases and STIS slit position angles (PAs). Through a parameter study that varies the orbital inclination i, the PA(theta) that the orbital plane projection of the line-of-sight makes with the apastron side of the semi-major axis, and the PA on the sky of the orbital axis, we are able, for the first time, to tightly constrain the absolute 3-D orientation of the binary orbit. To simultaneously reproduce the blue-shifted emission arcs observed at orbital phase 0.976, STIS slit PA = +38deg, and the temporal variations in emission seen at negative slit PAs, the binary needs to have an i approx. = 130deg to 145deg, Theta approx. = -15deg to +30deg, and an orbital axis projected on the sky at a P A approx. = 302deg to 327deg east of north. This represents a system with an orbital axis that is closely aligned with the inferred polar axis of the Homunculus nebula, in 3-D. The companion star, Eta(sub B), thus orbits clockwise on the sky and is on the observer's side of the system at apastron. This orientation has important implications for theories for the formation of the Homunculus and helps lay the groundwork for orbital modeling to determine the stellar masses.

  4. Absolutely classical spin states

    NASA Astrophysics Data System (ADS)

    Bohnet-Waldraff, F.; Giraud, O.; Braun, D.

    2017-01-01

    We introduce the concept of "absolutely classical" spin states, in analogy to absolutely separable states of bipartite quantum systems. Absolutely classical states are states that remain classical (i.e., a convex sum of projectors on coherent states of a spin j ) under any unitary transformation applied to them. We investigate the maximal size of the ball of absolutely classical states centered on the maximally mixed state and derive a lower bound for its radius as a function of the total spin quantum number. We also obtain a numerical estimate of this maximal radius and compare it to the case of absolutely separable states.

  5. Absolute airborne gravimetry

    NASA Astrophysics Data System (ADS)

    Baumann, Henri

    This work consists of a feasibility study of a first-stage prototype airborne absolute gravimeter system. In contrast to relative systems, which use spring gravimeters, the measurements acquired by absolute systems are uncorrelated, and the instrument does not suffer from problems such as instrumental drift, the frequency response of the spring and possible variation of the calibration factor. The major problem we had to resolve was reducing the influence of the non-gravitational accelerations included in the measurements. We studied two different approaches to resolve it: direct mechanical filtering, and post-processing digital compensation. The first part of the work describes in detail the different passive mechanical vibration filters, which were studied and tested in the laboratory and later in a small truck in motion. For these tests, as well as for the airborne measurements, an FG5-L absolute gravimeter from Micro-G Ltd was used together with a Litton-200 inertial navigation system, an EpiSensor vertical accelerometer, and GPS receivers for positioning. These tests showed that only the use of an optical table gives acceptable results. However, it is unable to compensate for the effects of the accelerations of the drag-free chamber. The second part describes the strategy of the data processing. It is based on modeling the perturbing accelerations by means of GPS, EpiSensor and INS data. In the third part the airborne experiment is described in detail, from the mounting in the aircraft and the data processing to the different problems encountered during the evaluation of the quality and accuracy of the results. In the data processing part, the different steps leading from the raw apparent gravity data and the trajectories to the estimation of the true gravity are explained. A comparison between the estimated airborne data and those obtained by ground upward continuation at flight altitude allows us to state that airborne absolute gravimetry is feasible and

  6. Absolute dimensions of solar-type eclipsing binaries. EF Aquarii: a G0 test for stellar evolution models

    NASA Astrophysics Data System (ADS)

    Vos, J.; Clausen, J. V.; Jørgensen, U. G.; Østensen, R. H.; Claret, A.; Hillen, M.; Exter, K.

    2012-04-01

    Context. Recent studies have shown that stellar chromospheric activity, and its effect on convective energy transport in the envelope, is most likely the cause of significant radius and temperature discrepancies between theoretical evolution models and observations. Accurate mass, radius, and abundance determinations from solar-type binaries exhibiting various levels of activity are needed for a better insight into the structure and evolution of these stars. Aims: We aim to determine absolute dimensions and abundances for the solar-type detached eclipsing binary EF Aqr, and to perform a detailed comparison with results from recent stellar evolutionary models. Methods: uvby light curves and uvbyβ standard photometry were obtained with the Strömgren Automatic Telescope. The broadening function formalism was applied on spectra observed with HERMES at the Mercator telescope in La Palma, to obtain radial velocity curves. State-of-the-art methods were applied for the photometric and spectroscopic analyses. Results: Masses and radii with a precision of 0.6% and 1.0% respectively have been established for both components of EF Aqr. The active 0.956 M⊙ secondary shows star spots and strong Ca II H and K emission lines. The 1.224 M⊙ primary shows signs of activity as well, but at a lower level. An [Fe/H] abundance of 0.00 ± 0.10 is derived with similar abundances for Si, Ca, Sc, Ti, V, Cr, Co, and Ni. Solar calibrated evolutionary models such as Yonsei-Yale, Victoria-Regina and BaSTI isochrones and evolutionary tracks are unable to reproduce EF Aqr, especially for the secondary, which is 9% larger and 400 K cooler than predicted. Models adopting significantly lower mixing length parameters l/Hp remove these discrepancies, as seen in other solar type binaries. For the observed metallicity, Granada models with a mixing length of l/Hp = 1.30 (primary) and 1.05 (secondary) reproduce both components at a common age of 1.5 ± 0.6 Gyr. Conclusions: Observations of EF Aqr

  7. Modeling Research Project Risks with Fuzzy Maps

    ERIC Educational Resources Information Center

    Bodea, Constanta Nicoleta; Dascalu, Mariana Iuliana

    2009-01-01

    The authors propose a risks evaluation model for research projects. The model is based on fuzzy inference. The knowledge base for fuzzy process is built with a causal and cognitive map of risks. The map was especially developed for research projects, taken into account their typical lifecycle. The model was applied to an e-testing research…

  8. Comparison between a Terramechanics Model and a Continuum Soil Model Implemented within the Absolute Nodal Coordinate Formulation

    DTIC Science & Technology

    2012-08-01

    Only fragments of the report text survive in this record. They contain standard disclaimers (the views expressed are the authors' and do not necessarily reflect those of the United States Government or the Department of the Army) and note that pressure is higher at the rear end of the track and that Reece (1965) improved Bekker's model by making its parameters dimensionless.

  9. Relative risk regression models with inverse polynomials.

    PubMed

    Ning, Yang; Woodward, Mark

    2013-08-30

    The proportional hazards model assumes that the log hazard ratio is a linear function of parameters. In the current paper, we model the log relative risk as an inverse polynomial, which is particularly suitable for modeling bounded and asymmetric functions. The parameters estimated by maximizing the partial likelihood are consistent and asymptotically normal. The advantages of the inverse polynomial model over the ordinary polynomial model and the fractional polynomial model for fitting various asymmetric log relative risk functions are shown by simulation. The utility of the method is further supported by analyzing two real data sets, addressing the specific question of the location of the minimum risk threshold.

  10. Development and validation of a cardiovascular risk prediction model for Japanese: the Hisayama study.

    PubMed

    Arima, Hisatomi; Yonemoto, Koji; Doi, Yasufumi; Ninomiya, Toshiharu; Hata, Jun; Tanizaki, Yumihiro; Fukuhara, Masayo; Matsumura, Kiyoshi; Iida, Mitsuo; Kiyohara, Yutaka

    2009-12-01

    The objective of this paper is to develop a new risk prediction model of cardiovascular disease and to validate its performance in a general population of Japanese. The Hisayama study is a population-based prospective cohort study. A total of 2634 participants aged 40 years or older were followed up for 14 years for incident cardiovascular disease (stroke and coronary heart disease (myocardial infarction, coronary revascularization and sudden cardiac death)). We used data among a random two-thirds (the derivation cohort, n=1756) to develop a new risk prediction model that was then tested to compare observed and predicted outcomes in the remaining one-third (the validation cohort, n=878). A multivariable cardiovascular risk prediction model was developed that incorporated age, sex, systolic blood pressure, diabetes, low-density lipoprotein cholesterol, high-density lipoprotein cholesterol and smoking. We assessed the performance of the model for predicting individual cardiovascular event among the validation cohort. The risk prediction model demonstrated good discrimination (c-statistic=0.81; 95% confidence interval, 0.77 to 0.86) and calibration (Hosmer-Lemeshow chi(2)-statistic=6.46; P=0.60). A simple risk score sheet based on the cardiovascular risk prediction model was also presented. We developed and validated a new cardiovascular risk prediction model in a general population of Japanese. The risk prediction model would provide a useful guide to estimate absolute risk of cardiovascular disease and to treat individual risk factors.

  11. Requirements based system risk modeling

    NASA Technical Reports Server (NTRS)

    Meshkat, Leila; Cornford, Steven; Feather, Martin

    2004-01-01

    The problem that we address in this paper is assessing the expected degree of success of the system or mission based on the degree to which each requirement is satisfied and the relative weight of the requirements. We assume a complete list of the requirements, the relevant risk elements and their probability of occurrence and the quantified effect of the risk elements on the requirements. In order to assess the degree to which each requirement is satisfied, we need to determine the effect of the various risk elements on the requirement.

  12. Database application for absolute spectrophotometry

    NASA Astrophysics Data System (ADS)

    Bochkov, Valery V.; Shumko, Sergiy

    2002-12-01

    A 32-bit database application with a multidocument interface for Windows has been developed to calculate absolute energy distributions of observed spectra. The original database contains wavelength-calibrated observed spectra which have already passed through apparatus reductions such as flatfielding, background and apparatus noise subtraction. Absolute energy distributions of observed spectra are defined on a unique scale by registering them simultaneously with an artificial intensity standard. Observations of a sequence of spectrophotometric standards are used to define the absolute energy of the artificial standard. Observations of spectrophotometric standards are also used to define the optical extinction at selected moments. An FFT algorithm implemented in the application allows performing convolution (deconvolution) of spectra with a user-defined PSF. The object-oriented interface has been created using facilities of C++ libraries. A client/server model with Windows Socket functionality based on the TCP/IP protocol is used to develop the application. It supports Dynamic Data Exchange conversation in server mode and uses Microsoft Exchange communication facilities.

  13. Root Caries Risk Indicators: A Systematic Review of Risk Models

    PubMed Central

    Ritter, André V.; Shugars, Daniel A.; Bader, James D.

    2010-01-01

    Objective To identify risk indicators that are associated with root caries incidence in published predictive risk models. Methods Abstracts (n=472) identified from a MEDLINE, EMBASE, and Cochrane registry search were screened independently by two investigators to exclude articles not in English (n=39), published prior to 1970 (none), or containing no information on either root caries incidence, risk indicators, or risk models (n=209). A full-article duplicate review of the remaining articles (n=224) selected those reporting predictive risk models based on original/primary longitudinal root caries incidence studies. The quality of the included articles was assessed based both on selected criteria of methodological standards for observational studies and on the statistical quality of the modeling strategy. Data from these included studies were extracted and compiled into evidence tables, which included information about the cohort location, incidence period, sample size, age of the study participants, risk indicators included in the model, root caries incidence, modeling strategy, significant risk indicators/predictors, and parameter estimates and statistical findings. Results Thirteen articles were selected for data extraction. The overall quality of the included articles was poor to moderate. Root caries incidence ranged from 12%–77% (mean±SD=45%±17%); follow-up time of the published studies was ≤10 years (range=9; median=3); sample size ranged from 23–723 (mean±SD=264±203; median=261); person-years ranged from 23–1540 (mean±SD=760±556; median=746). Variables most frequently tested and significantly associated with root caries incidence were (times tested; % significant; directionality): baseline root caries (12; 58%; positive); number of teeth (7; 71%; 3 times positive, twice negative), and plaque index (4; 100%; positive). Ninety-two other clinical and non-clinical variables were tested: 27 were tested 3 times or more and were significant between 9

  14. A methodology for modeling regional terrorism risk.

    PubMed

    Chatterjee, Samrat; Abkowitz, Mark D

    2011-07-01

    Over the past decade, terrorism risk has become a prominent consideration in protecting the well-being of individuals and organizations. More recently, there has been interest in not only quantifying terrorism risk, but also placing it in the context of an all-hazards environment in which consideration is given to accidents and natural hazards, as well as intentional acts. This article discusses the development of a regional terrorism risk assessment model designed for this purpose. The approach taken is to model terrorism risk as a dependent variable, expressed in expected annual monetary terms, as a function of attributes of population concentration and critical infrastructure. This allows for an assessment of regional terrorism risk in and of itself, as well as in relation to man-made accident and natural hazard risks, so that mitigation resources can be allocated in an effective manner. The adopted methodology incorporates elements of two terrorism risk modeling approaches (event-based models and risk indicators), producing results that can be utilized at various jurisdictional levels. The validity, strengths, and limitations of the model are discussed in the context of a case study application within the United States.

  15. Adversarial risk analysis for counterterrorism modeling.

    PubMed

    Rios, Jesus; Rios Insua, David

    2012-05-01

    Recent large-scale terrorist attacks have raised interest in models for resource allocation against terrorist threats. The unifying theme in this area is the need to develop methods for the analysis of allocation decisions when risks stem from the intentional actions of intelligent adversaries. Most approaches to these problems have a game-theoretic flavor although there are also several interesting decision-analytic-based proposals. One of them is the recently introduced framework for adversarial risk analysis, which deals with decision-making problems that involve intelligent opponents and uncertain outcomes. We explore how adversarial risk analysis addresses some standard counterterrorism models: simultaneous defend-attack models, sequential defend-attack-defend models, and sequential defend-attack models with private information. For each model, we first assess critically what would be a typical game-theoretic approach and then provide the corresponding solution proposed by the adversarial risk analysis framework, emphasizing how to coherently assess a predictive probability model of the adversary's actions, in a context in which we aim at supporting decisions of a defender versus an attacker. This illustrates the application of adversarial risk analysis to basic counterterrorism models that may be used as basic building blocks for more complex risk analysis of counterterrorism problems.

  16. Quantifying fatigue risk in model-based fatigue risk management.

    PubMed

    Rangan, Suresh; Van Dongen, Hans P A

    2013-02-01

    The question of what is a maximally acceptable level of fatigue risk is hotly debated in model-based fatigue risk management in commercial aviation and other transportation modes. A quantitative approach to addressing this issue, referred to by the Federal Aviation Administration with regard to its final rule for commercial aviation "Flightcrew Member Duty and Rest Requirements," is to compare predictions from a mathematical fatigue model against a fatigue threshold. While this accounts for duty time spent at elevated fatigue risk, it does not account for the degree of fatigue risk and may, therefore, result in misleading schedule assessments. We propose an alternative approach based on the first-order approximation that fatigue risk is proportional to both the duty time spent below the fatigue threshold and the distance of the fatigue predictions to the threshold--that is, the area under the curve (AUC). The AUC approach is straightforward to implement for schedule assessments in commercial aviation and also provides a useful fatigue metric for evaluating thousands of scheduling options in industrial schedule optimization tools.
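
    The abstract defines the proposed metric verbally; the Python sketch below is one literal reading of it: integrate, over duty time, how far the predicted effectiveness falls below the fatigue threshold (the area under the threshold), using the trapezoid rule. The prediction series and the threshold value are fabricated for illustration and are not from any specific fatigue model.

      def fatigue_auc(times_h, effectiveness, threshold=77.0):
          """Area (in %-hours) by which predicted effectiveness falls below the fatigue threshold."""
          auc = 0.0
          for k in range(len(times_h) - 1):
              d0 = max(0.0, threshold - effectiveness[k])       # deficit at the start of the interval
              d1 = max(0.0, threshold - effectiveness[k + 1])   # deficit at the end of the interval
              auc += 0.5 * (d0 + d1) * (times_h[k + 1] - times_h[k])   # trapezoid rule
          return auc

      # Fabricated hourly effectiveness predictions across a 10-hour duty period
      times = list(range(11))
      eff = [95, 93, 90, 86, 82, 78, 75, 72, 70, 69, 68]
      print(f"area under the threshold: {fatigue_auc(times, eff):.1f} %*h")

    Unlike a simple count of hours spent below the threshold, this quantity also grows when predictions dip far below it, which is the point the abstract makes.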

  17. Measures of Residual Risk with Connections to Regression, Risk Tracking, Surrogate Models, and Ambiguity

    DTIC Science & Technology

    2015-01-07

    Measures of Residual Risk with Connections to Regression, Risk Tracking, Surrogate Models, and Ambiguity1 R. Tyrrell Rockafellar Johannes O. Royset...insights and a new class of distributionally robust optimization models. Keywords: risk measures, residual risk, generalized regression, surrogate ...Measures of Residual Risk with Connections to Regression, Risk Tracking, Surrogate Models, and Ambiguity 5a. CONTRACT NUMBER 5b. GRANT NUMBER 5c

  18. Modelling suicide risk in later life.

    PubMed

    Lo, C F; Kwok, Cordelia M Y

    2006-08-01

    Affective disorder is generally regarded as the prominent risk factor for suicide in the old-age population. Despite the large number of empirical studies available in the literature, no attempt has yet been made to model the dynamics of an individual's level of suicide risk theoretically. In particular, a dynamic model which can simulate the time evolution of an individual's level of risk for suicide and provide quantitative estimates of the probability of suicide risk is still lacking. In the present study we apply the contingent claims analysis of credit risk modelling in the field of quantitative finance to derive a theoretical stochastic model for estimation of the probability of suicide risk in later life in terms of a signalling index of affective disorder. Our model is based upon the hypothesis that the current state of affective disorder of a patient can be represented by a signalling index that exhibits stochastic movement, and that a threshold of affective disorder, which signifies the occurrence of suicide, exists. According to the numerical results, the implications of our model are consistent with the clinical findings. Hence, we believe that such a dynamic model will be essential to the design of effective suicide prevention strategies in the target population of older adults, especially in the primary care setting.

  19. Integrated Environmental Modeling: Quantitative Microbial Risk Assessment

    EPA Science Inventory

    The presentation discusses the need for microbial assessments and presents a road map associated with quantitative microbial risk assessments, through an integrated environmental modeling approach. A brief introduction and the strengths of the current knowledge are illustrated. W...

  20. Two criteria for evaluating risk prediction models.

    PubMed

    Pfeiffer, R M; Gail, M H

    2011-09-01

    We propose and study two criteria to assess the usefulness of models that predict risk of disease incidence for screening and prevention, or the usefulness of prognostic models for management following disease diagnosis. The first criterion, the proportion of cases followed PCF (q), is the proportion of individuals who will develop disease who are included in the proportion q of individuals in the population at highest risk. The second criterion is the proportion needed to follow-up, PNF (p), namely the proportion of the general population at highest risk that one needs to follow in order that a proportion p of those destined to become cases will be followed. PCF (q) assesses the effectiveness of a program that follows 100q% of the population at highest risk. PNF (p) assesses the feasibility of covering 100p% of cases by indicating how much of the population at highest risk must be followed. We show the relationship of those two criteria to the Lorenz curve and its inverse, and present distribution theory for estimates of PCF and PNF. We develop new methods, based on influence functions, for inference for a single risk model, and also for comparing the PCFs and PNFs of two risk models, both of which were evaluated in the same validation data.
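
    To make the two criteria concrete, the Python sketch below computes empirical PCF(q) and PNF(p) from a list of model-assigned risks and observed case status by ranking the population from highest to lowest assigned risk. The data are synthetic, generated so that higher assigned risk means a higher chance of being a case.

      import random

      rng = random.Random(0)
      # Synthetic population: assigned risk, and case status drawn with that probability
      risks = [rng.betavariate(1.5, 20) for _ in range(20_000)]
      cases = [1 if rng.random() < r else 0 for r in risks]

      order = sorted(range(len(risks)), key=lambda i: risks[i], reverse=True)
      total_cases = sum(cases)

      def pcf(q):
          """Proportion of cases found within the top-q fraction of the population by risk."""
          top = order[: int(q * len(order))]
          return sum(cases[i] for i in top) / total_cases

      def pnf(p):
          """Smallest population fraction (highest risk first) needed to cover a proportion p of cases."""
          covered = 0
          for rank, i in enumerate(order, start=1):
              covered += cases[i]
              if covered >= p * total_cases:
                  return rank / len(order)
          return 1.0

      print(f"PCF(0.20) = {pcf(0.20):.2f}   PNF(0.80) = {pnf(0.80):.2f}")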

  1. Monte Carlo Modeling-Based Digital Loop-Mediated Isothermal Amplification on a Spiral Chip for Absolute Quantification of Nucleic Acids.

    PubMed

    Xia, Yun; Yan, Shuangqian; Zhang, Xian; Ma, Peng; Du, Wei; Feng, Xiaojun; Liu, Bi-Feng

    2017-03-21

    Digital loop-mediated isothermal amplification (dLAMP) is an attractive approach for absolute quantification of nucleic acids with high sensitivity and selectivity. Theoretical and numerical analysis of dLAMP provides necessary guidance for the design and analysis of dLAMP devices. In this work, a mathematical model was proposed on the basis of the Monte Carlo method and the theories of Poisson statistics and chemometrics. To examine the established model, we fabricated a spiral chip with 1200 uniform and discrete reaction chambers (9.6 nL) for absolute quantification of pathogenic DNA samples by dLAMP. Under the optimized conditions, dLAMP analysis on the spiral chip realized quantification of nucleic acids spanning over 4 orders of magnitude in concentration with sensitivity as low as 8.7 × 10⁻² copies/μL in 40 min. The experimental results were consistent with the proposed mathematical model, which could provide a useful guideline for future development of dLAMP devices.
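
    The basic digital-quantification arithmetic the abstract refers to (Poisson statistics on partitioned reactions) is illustrated below in Python: from the fraction of positive chambers and the 9.6 nL chamber volume quoted in the abstract, the most likely template concentration follows from λ = −ln(1 − fraction positive). The positive-chamber count used here is invented for the example.

      import math

      n_chambers = 1200          # chambers on the spiral chip (from the abstract)
      v_chamber_nl = 9.6         # chamber volume in nanolitres (from the abstract)
      n_positive = 180           # invented count of amplification-positive chambers

      frac_pos = n_positive / n_chambers
      lam = -math.log(1.0 - frac_pos)               # mean templates per chamber (Poisson)
      copies_per_ul = lam / (v_chamber_nl * 1e-3)   # 1 uL = 1000 nL
      print(f"lambda = {lam:.3f} copies/chamber  ->  {copies_per_ul:.1f} copies/uL")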

  2. Using A New Model for Main Sequence Turnoff Absolute Magnitudes to Measure Stellar Streams in the Milky Way Halo

    NASA Astrophysics Data System (ADS)

    Weiss, Jake; Newberg, Heidi Jo; Arsenault, Matthew; Bechtel, Torrin; Desell, Travis; Newby, Matthew; Thompson, Jeffery M.

    2016-01-01

    Statistical photometric parallax is a method for using the distribution of absolute magnitudes of stellar tracers to statistically recover the underlying density distribution of these tracers. In previous work, statistical photometric parallax was used to trace the Sagittarius Dwarf tidal stream, the so-called bifurcated piece of the Sagittarius stream, and the Virgo Overdensity through the Milky Way. We use an improved knowledge of this distribution in a new algorithm that accounts for the changes in the stellar population of color-selected stars near the photometric limit of the Sloan Digital Sky Survey (SDSS). Although we select bluer main sequence turnoff stars (MSTO) as tracers, large color errors near the survey limit cause many stars to be scattered out of our selection box and many fainter, redder stars to be scattered into our selection box. We show that we are able to recover parameters for analogues of these streams in simulated data using a maximum likelihood optimization on MilkyWay@home. We also present the preliminary results of fitting the density distribution of major Milky Way tidal streams in SDSS data. This research is supported by generous gifts from the Marvin Clan, Babette Josephs, Manit Limlamai, and the MilkyWay@home volunteers.

  3. Calibrated predictions for multivariate competing risks models.

    PubMed

    Gorfine, Malka; Hsu, Li; Zucker, David M; Parmigiani, Giovanni

    2014-04-01

    Prediction models for time-to-event data play a prominent role in assessing the individual risk of a disease, such as cancer. Accurate disease prediction models provide an efficient tool for identifying individuals at high risk, and provide the groundwork for estimating the population burden and cost of disease and for developing patient care guidelines. We focus on risk prediction of a disease in which family history is an important risk factor that reflects inherited genetic susceptibility, shared environment, and common behavior patterns. In this work family history is accommodated using frailty models, with the main novel feature being allowing for competing risks, such as other diseases or mortality. We show through a simulation study that naively treating competing risks as independent right censoring events results in non-calibrated predictions, with the expected number of events overestimated. Discrimination performance is not affected by ignoring competing risks. Our proposed prediction methodologies correctly account for competing events, are very well calibrated, and easy to implement.

  4. Risk terrain modeling predicts child maltreatment.

    PubMed

    Daley, Dyann; Bachmann, Michael; Bachmann, Brittany A; Pedigo, Christian; Bui, Minh-Thuy; Coffman, Jamye

    2016-12-01

    As indicated by research on the long-term effects of adverse childhood experiences (ACEs), maltreatment has far-reaching consequences for affected children. Effective prevention measures have been elusive, partly due to difficulty in identifying vulnerable children before they are harmed. This study employs Risk Terrain Modeling (RTM), an analysis of the cumulative effect of environmental factors thought to be conducive for child maltreatment, to create a highly accurate prediction model for future substantiated child maltreatment cases in the City of Fort Worth, Texas. The model is superior to commonly used hotspot predictions and more beneficial in aiding prevention efforts in a number of ways: 1) it identifies the highest risk areas for future instances of child maltreatment with improved precision and accuracy; 2) it aids the prioritization of risk-mitigating efforts by informing about the relative importance of the most significant contributing risk factors; 3) since predictions are modeled as a function of easily obtainable data, practitioners do not have to undergo the difficult process of obtaining official child maltreatment data to apply it; 4) the inclusion of a multitude of environmental risk factors creates a more robust model with higher predictive validity; and, 5) the model does not rely on a retrospective examination of past instances of child maltreatment, but adapts predictions to changing environmental conditions. The present study introduces and examines the predictive power of this new tool to aid prevention efforts seeking to improve the safety, health, and wellbeing of vulnerable children.

  5. Absolute and relative blindsight.

    PubMed

    Balsdon, Tarryn; Azzopardi, Paul

    2015-03-01

    The concept of relative blindsight, referring to a difference in conscious awareness between conditions otherwise matched for performance, was introduced by Lau and Passingham (2006) as a way of identifying the neural correlates of consciousness (NCC) in fMRI experiments. By analogy, absolute blindsight refers to a difference between performance and awareness regardless of whether it is possible to match performance across conditions. Here, we address the question of whether relative and absolute blindsight in normal observers can be accounted for by response bias. In our replication of Lau and Passingham's experiment, the relative blindsight effect was abolished when performance was assessed by means of a bias-free 2AFC task or when the criterion for awareness was varied. Furthermore, there was no evidence of either relative or absolute blindsight when both performance and awareness were assessed with bias-free measures derived from confidence ratings using signal detection theory. This suggests that both relative and absolute blindsight in normal observers amount to no more than variations in response bias in the assessment of performance and awareness. Consideration of the properties of psychometric functions reveals a number of ways in which relative and absolute blindsight could arise trivially and elucidates a basis for the distinction between Type 1 and Type 2 blindsight.

  6. Earthquake Risk Modelling - Opening the black box

    NASA Astrophysics Data System (ADS)

    Alarcon, John E.; Simic, Milan; Franco, Guillermo; Shen-Tu, Bingming

    2010-05-01

    Assessing the risk from natural catastrophes such as earthquakes involves the detailed study of the seismic sources and site conditions that contribute to the earthquake hazard in the region of interest, the distribution and particular characteristics of the exposures through the study of building stock and its vulnerabilities, and the application of specific financial terms for particular portfolios. The catastrophe modelling framework encompasses these relatively complex considerations while also including a measure of uncertainty. This paper describes succinctly the structure and modules included in a probabilistic catastrophe risk model and presents several examples of risk modelling for realistic scenarios such as the expected earthquakes in the Marmara Sea region of Turkey and the results from modelling the 2009 L'Aquila (Abruzzo) earthquake.

  7. A Risk Model for Lung Cancer Incidence

    PubMed Central

    Hoggart, Clive; Brennan, Paul; Tjonneland, Anne; Vogel, Ulla; Overvad, Kim; Østergaard, Jane Nautrup; Kaaks, Rudolf; Canzian, Federico; Boeing, Heiner; Steffen, Annika; Trichopoulou, Antonia; Bamia, Christina; Trichopoulos, Dimitrios; Johansson, Mattias; Palli, Domenico; Krogh, Vittorio; Tumino, Rosario; Sacerdote, Carlotta; Panico, Salvatore; Boshuizen, Hendriek; Bueno-de-Mesquita, H. Bas; Peeters, Petra H.M.; Lund, Eiliv; Gram, Inger Torhild; Braaten, Tonje; Rodríguez, Laudina; Agudo, Antonio; Sanchez-Cantalejo, Emilio; Arriola, Larraitz; Chirlaque, Maria-Dolores; Barricarte, Aurelio; Rasmuson, Torgny; Khaw, Kay-Tee; Wareham, Nicholas; Allen, Naomi E.; Riboli, Elio; Vineis, Paolo

    2015-01-01

    Risk models for lung cancer incidence would be useful for prioritizing individuals for screening and participation in clinical trials of chemoprevention. We present a risk model for lung cancer built using prospective cohort data from a general population which predicts individual incidence in a given time period. We build separate risk models for current and former smokers using 169,035 ever smokers from the multicenter European Prospective Investigation into Cancer and Nutrition (EPIC) and considered a model for never smokers. The data set was split into independent training and test sets. Lung cancer incidence was modeled using survival analysis, stratifying by age started smoking, and for former smokers, also smoking duration. Other risk factors considered were smoking intensity, 10 occupational/environmental exposures previously implicated with lung cancer, and single-nucleotide polymorphisms at two loci identified by genome-wide association studies of lung cancer. Individual risk in the test set was measured by the predicted probability of lung cancer incidence in the year preceding last follow-up time, predictive accuracy was measured by the area under the receiver operator characteristic curve (AUC). Using smoking information alone gave good predictive accuracy: the AUC and 95% confidence interval in ever smokers was 0.843 (0.810–0.875), the Bach model applied to the same data gave an AUC of 0.775 (0.737–0.813). Other risk factors had negligible effect on the AUC, including never smokers for whom prediction was poor. Our model is generalizable and straightforward to implement. Its accuracy can be attributed to its modeling of lifetime exposure to smoking. PMID:22496387

  8. Incorporation of a genetic factor into an epidemiologic model for prediction of individual risk of lung cancer: the Liverpool Lung Project.

    PubMed

    Raji, Olaide Y; Agbaje, Olorunsola F; Duffy, Stephen W; Cassidy, Adrian; Field, John K

    2010-05-01

    The Liverpool Lung Project (LLP) has previously developed a risk model for prediction of 5-year absolute risk of lung cancer based on five epidemiologic risk factors. SEZ6L, a Met430Ile polymorphic variant located in the 22q12.2 region, has previously been linked with an increased risk of lung cancer in a case-control population. In this article, we quantify the improvement in risk prediction with the addition of SEZ6L to the LLP risk model. Data from 388 LLP subjects genotyped for the SEZ6L single-nucleotide polymorphism (SNP) were combined with epidemiologic risk factors. Multivariable conditional logistic regression was used to predict 5-year absolute risk of lung cancer with and without this SNP. The improvement in the model associated with the SEZ6L SNP was assessed through pairwise comparison of the area under the receiver operating characteristic curve and the net reclassification improvement (NRI). The extended model showed better calibration compared with the baseline model. There was a statistically significant, modest increase in the area under the receiver operating characteristic curve when SEZ6L was added to the baseline model. The NRI also revealed a statistically significant improvement of around 12% for the extended model; this improvement was better for subjects classified into the two intermediate-risk categories by the baseline model (NRI, 27%). Our results suggest that the addition of SEZ6L improved the performance of the LLP risk model, particularly for subjects whose initial absolute risks could not be clearly classified into the "low-risk" or "high-risk" groups. This work shows an approach to incorporating genetic biomarkers into risk models for predicting an individual's lung cancer risk.
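
    The category-based net reclassification improvement compares the risk categories assigned by the baseline and extended models among events and non-events. A minimal sketch of that calculation is shown below; the categories and example subjects are hypothetical and do not correspond to the LLP data.

    ```python
    import numpy as np

    def category_nri(base_cat, ext_cat, event):
        """Category-based net reclassification improvement.

        base_cat, ext_cat : integer risk categories (higher = higher risk)
        event             : 1 if the subject developed lung cancer, else 0
        """
        base_cat, ext_cat, event = map(np.asarray, (base_cat, ext_cat, event))
        up, down = ext_cat > base_cat, ext_cat < base_cat
        ev, non = event == 1, event == 0
        nri_events = (up & ev).sum() / ev.sum() - (down & ev).sum() / ev.sum()
        nri_nonevents = (down & non).sum() / non.sum() - (up & non).sum() / non.sum()
        return nri_events + nri_nonevents

    # Hypothetical example: 0=low, 1=intermediate-low, 2=intermediate-high, 3=high risk.
    base = [0, 1, 1, 2, 2, 3, 1, 2]
    ext  = [0, 2, 1, 3, 1, 3, 2, 1]
    evt  = [0, 1, 0, 1, 0, 1, 1, 0]
    print(f"NRI = {category_nri(base, ext, evt):.3f}")
    ```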

  9. The absolute path command

    SciTech Connect

    Moody, A.

    2012-05-11

    The ap command traverses all symlinks in a given file, directory, or executable name to identify the final absolute path. It can print just the final path, each intermediate link along the symlink chain, and the permissions and ownership of each directory component in the final path. It has functionality similar to "which", except that it shows the final path instead of the first path. It is also similar to "pwd", but it can provide the absolute path to a relative directory from the current working directory.
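
    The core behaviour described, resolving a name (searched on PATH if necessary) to its final absolute path with every symlink followed, can be approximated with the Python standard library. This is an illustration of the concept only, not the ap tool itself.

    ```python
    import os
    import sys

    def final_absolute_path(name):
        """Resolve a file, directory, or executable name to its final absolute path,
        following every symlink along the way (conceptually similar to ap)."""
        # If the name is bare (no path separator), search PATH like `which` would.
        if os.sep not in name:
            for d in os.environ.get("PATH", "").split(os.pathsep):
                candidate = os.path.join(d, name)
                if os.path.isfile(candidate) and os.access(candidate, os.X_OK):
                    name = candidate
                    break
        # realpath expands relative components and traverses all symlinks.
        return os.path.realpath(os.path.abspath(name))

    if __name__ == "__main__":
        for arg in sys.argv[1:]:
            print(final_absolute_path(arg))
    ```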

  10. Modeling food spoilage in microbial risk assessment.

    PubMed

    Koutsoumanis, Konstantinos

    2009-02-01

    In this study, I describe a systematic approach for modeling food spoilage in microbial risk assessment that is based on the incorporation of kinetic spoilage modeling in exposure assessment by combining data and models for the specific spoilage organisms (SSO: fraction of the total microflora responsible for spoilage) with those for pathogens. The structure of the approach is presented through an exposure assessment application for Escherichia coli O157:H7 in ground beef. The proposed approach allows for identifying spoiled products at the time of consumption by comparing the estimated level of SSO (pseudomonads) with the spoilage level (level of SSO at which spoilage is observed). The results of the application indicate that ignoring spoilage in risk assessment could lead to significant overestimations of risk.
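
    The central idea, comparing the simulated level of the specific spoilage organisms with the spoilage threshold at the time of consumption so that spoiled servings are excluded from exposure, can be sketched as below. All initial levels, growth rates, thresholds, and the pathogen "risk" proxy are invented toy values, not the kinetic models or parameters of the study.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n = 100_000  # simulated servings

    # Hypothetical initial contamination (log10 CFU/g) and storage conditions.
    sso0 = rng.normal(3.0, 0.5, n)          # specific spoilage organisms (pseudomonads)
    path0 = rng.normal(-1.0, 0.8, n)        # pathogen (e.g., E. coli O157:H7)
    storage_days = rng.uniform(1, 10, n)
    temp_c = rng.normal(7, 2, n)

    # Toy kinetic growth models (log10 increase over storage).
    sso_at_consumption = sso0 + 0.25 * np.clip(temp_c, 0, None) * storage_days
    path_at_consumption = path0 + 0.05 * np.clip(temp_c - 5, 0, None) * storage_days

    spoilage_level = 7.0            # log10 CFU/g at which spoilage is observed
    consumed = sso_at_consumption < spoilage_level   # spoiled servings are rejected

    risk_all = np.mean(path_at_consumption > 2.0)              # ignoring spoilage
    risk_consumed = np.mean(path_at_consumption[consumed] > 2.0)
    print(f"Fraction rejected as spoiled: {1 - consumed.mean():.2%}")
    print(f"'Risk' proxy ignoring spoilage: {risk_all:.4f}; accounting for spoilage: {risk_consumed:.4f}")
    ```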

  11. Development and Validation of a Lung Cancer Risk Prediction Model for African-Americans

    PubMed Central

    Etzel, Carol J.; Kachroo, Sumesh; Liu, Mei; D'Amelio, Anthony; Dong, Qiong; Cote, Michele L.; Wenzlaff, Angela S.; Hong, Waun Ki; Greisinger, Anthony J.; Schwartz, Ann G.; Spitz, Margaret R.

    2009-01-01

    Because existing risk prediction models for lung cancer were developed in white populations, they may not be appropriate for predicting risk among African-Americans. Therefore, a need exists to construct and validate a risk prediction model for lung cancer that is specific to African-Americans. We analyzed data from 491 African-Americans with lung cancer and 497 matched African-American controls to identify specific risks and incorporate them into a multivariable risk model for lung cancer and estimate the 5-year absolute risk of lung cancer. We performed internal and external validations of the risk model using data on additional cases and controls from the same ongoing multiracial/ethnic lung cancer case-control study from which the model-building data were obtained, as well as data from two different lung cancer studies in metropolitan Detroit, respectively. We also compared our African-American model with our previously developed risk prediction model for whites. The final risk model included smoking-related variables [smoking status, pack-years smoked, age at smoking cessation (former smokers), and number of years since smoking cessation (former smokers)], self-reported physician diagnoses of chronic obstructive pulmonary disease or hay fever, and exposures to asbestos or wood dusts. Our risk prediction model for African-Americans exhibited good discrimination [0.75 (95% confidence interval, 0.67-0.82)] for our internal data and moderate discrimination [0.63 (95% confidence interval, 0.57-0.69)] for the external data group, which is an improvement over the Spitz model for white subjects. Existing lung cancer prediction models may not be appropriate for predicting risk for African-Americans because (a) they were developed using white populations, (b) the level of risk differs for risk factors that African-Americans share with whites, and (c) unique group-specific risk factors exist for African-Americans. This study developed and validated a risk prediction

  12. Suicide risk assessment and suicide risk formulation: essential components of the therapeutic risk management model.

    PubMed

    Silverman, Morton M

    2014-09-01

    Suicide and other suicidal behaviors are often associated with psychiatric disorders and dysfunctions. Therefore, psychiatrists have significant opportunities to identify at-risk individuals and offer treatment to reduce that risk. Although a suicide risk assessment is a core competency requirement, many clinical psychiatrists lack the requisite training and skills to appropriately assess for suicide risk. Moreover, the standard of care requires psychiatrists to foresee the possibility that a patient might engage in suicidal behavior, hence to conduct a suicide risk formulation sufficient to guide triage and treatment planning. Based on data collected via a suicide risk assessment, a suicide risk formulation is a process whereby the psychiatrist forms a judgment about a patient's foreseeable risk of suicidal behavior in order to inform triage decisions, safety and treatment planning, and interventions to reduce risk. This paper addresses the components of this process in the context of the model for therapeutic risk management of the suicidal patient developed at the Veterans Integrated Service Network (VISN) 19 Mental Illness Research, Education and Clinical Center by Wortzel et al.

  13. Delayed detonation models for normal and subluminous type Ia supernovae: Absolute brightness, light curves, and molecule formation

    NASA Technical Reports Server (NTRS)

    Hoflich, P.; Khokhlov, A. M.; Wheeler, J. C.

    1995-01-01

    We compute optical and infrared light curves of the pulsating class of delayed detonation models for Type Ia supernovae (SN Ia's) using an elaborate treatment of the Local Thermodynamic Equilibrium (LTE) radiation transport, equation of state and ionization balance, expansion opacity including the cooling by CO, CO+, and SiO, and a Monte Carlo gamma-ray deposition scheme. The models have an amount of Ni-56 in the range from approximately 0.1 solar mass up to 0.7 solar mass depending on the density at which the transition from a deflagration to a detonation occurs. Models with a large nickel production give light curves comparable to those of typical Type Ia supernovae. Subluminous supernovae can be explained by models with a low nickel production. Multiband light curves are presented in comparison with the normally bright event SN 1992bc and the subluminous events SN 1991bg and SN 1992bo to establish the principle that the delayed detonation paradigm in Chandrasekhar mass models may give a common explosion mechanism accounting for both normal and subluminous SN Ia's. Secondary IR maxima are formed in the models of normal SN Ia's as a photospheric effect if the photospheric radius continues to increase well after maximum light. Secondary maxima appear later and stronger in models with moderate expansion velocities and with radioactive material closer to the surface. Model light curves for subluminous SN Ia's tend to show only one 'late' IR maximum. In some delayed detonation models shell-like envelopes form, which consist of unburned carbon and oxygen. The formation of molecules in these envelopes is addressed. If the model retains a C/O envelope and is subluminous, strong vibration bands of CO may appear, typically several weeks past maximum light. CO should be very weak or absent in normal SN Ia's.

  14. Modeling and Managing Risk in Billing Infrastructures

    NASA Astrophysics Data System (ADS)

    Baiardi, Fabrizio; Telmon, Claudio; Sgandurra, Daniele

    This paper discusses risk modeling and risk management in information and communications technology (ICT) systems for which the attack impact distribution is heavy tailed (e.g., power law distribution) and the average risk is unbounded. Systems with these properties include billing infrastructures used to charge customers for services they access. Attacks against billing infrastructures can be classified as peripheral attacks and backbone attacks. The goal of a peripheral attack is to tamper with user bills; a backbone attack seeks to seize control of the billing infrastructure. The probability distribution of the overall impact of an attack on a billing infrastructure also has a heavy-tailed curve. This implies that the probability of a massive impact cannot be ignored and that the average impact may be unbounded - thus, even the most expensive countermeasures would be cost effective. Consequently, the only strategy for managing risk is to increase the resilience of the infrastructure by employing redundant components.
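
    The claim that a heavy-tailed (power-law) impact distribution can have an unbounded average is easy to demonstrate numerically: for a Pareto distribution with shape alpha <= 1 the sample mean never stabilizes as the sample grows. The sketch below uses generic parameter values chosen for illustration, not figures from the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    def pareto_samples(alpha, size, x_min=1.0):
        # Pareto tail: P(X > x) = (x_min / x) ** alpha; the mean is infinite when alpha <= 1.
        return x_min / rng.random(size) ** (1.0 / alpha)

    for alpha in (0.9, 1.5):
        for n in (10**3, 10**5, 10**7):
            mean = pareto_samples(alpha, n).mean()
            print(f"alpha={alpha}, n={n:>9}: sample mean = {mean:,.1f}")
    # For alpha=0.9 the running mean keeps growing with n (unbounded average impact);
    # for alpha=1.5 it settles near the finite theoretical mean alpha/(alpha-1) = 3.
    ```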

  15. Challenges of Modeling Flood Risk at Large Scales

    NASA Astrophysics Data System (ADS)

    Guin, J.; Simic, M.; Rowe, J.

    2009-04-01

    uncertainty can be attributed to various sources among which are imperfections in the hazard modeling, inherent errors in the DTM, lack of accurate information on the properties that are being analyzed, imperfections in the vulnerability relationships, inability of the model to account for local mitigation measures that are usually undertaken when a real event is unfolding and lack of details in the claims data that are used for model calibration. Nevertheless, the model once calibrated provides a very robust framework for analyzing relative and absolute risk. The paper concludes with key economic statistics of flood risk for Great Britain as a whole including certain large loss-causing scenarios affecting the greater London region. The model estimates a total financial loss of 5.6 billion GBP to all properties at a 1% annual aggregate exceedance probability level.
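
    The quoted figure, a loss at the 1% annual aggregate exceedance probability level, is read off an aggregate exceedance probability (AEP) curve built from simulated annual losses. The sketch below shows that step on synthetic losses; the loss distribution is a placeholder and unrelated to the Great Britain flood model.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    # Synthetic annual aggregate flood losses (GBP) from a hypothetical simulation.
    years = 50_000
    n_events = rng.poisson(0.8, years)                          # flood events per year
    annual_loss = np.array([rng.lognormal(17, 1.5, k).sum() for k in n_events])

    def aep_loss(losses, probability):
        # Loss exceeded with the given annual probability (empirical quantile).
        return np.quantile(losses, 1.0 - probability)

    print(f"1% AEP loss:   {aep_loss(annual_loss, 0.01):,.0f} GBP")
    print(f"0.5% AEP loss: {aep_loss(annual_loss, 0.005):,.0f} GBP")
    ```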

  16. Risk management model of winter navigation operations.

    PubMed

    Valdez Banda, Osiris A; Goerlandt, Floris; Kuzmin, Vladimir; Kujala, Pentti; Montewka, Jakub

    2016-07-15

    The wintertime maritime traffic operations in the Gulf of Finland are managed through the Finnish-Swedish Winter Navigation System. This establishes the requirements and limitations for vessels navigating when ice covers this area. During winter navigation in the Gulf of Finland, the largest risk stems from accidental ship collisions, which may also trigger oil spills. In this article, a model for managing the risk of winter navigation operations is presented. The model analyses the probability of oil spills derived from collisions involving oil tanker vessels and other vessel types. The model structure is based on the steps provided in the Formal Safety Assessment (FSA) by the International Maritime Organization (IMO) and adapted into a Bayesian Network model. The results indicate that independent ship navigation and convoys are the operations with the highest probability of oil spills. Minor spills are the most probable, while major oil spills are found to be very unlikely but possible.

  17. Role of Modeling When Designing for Absolute Energy Use Intensity Requirements in a Design-Build Framework: Preprint

    SciTech Connect

    Hirsch, A.; Pless, S.; Guglielmetti, R.; Torcellini, P. A.; Okada, D.; Antia, P.

    2011-03-01

    The Research Support Facility was designed to use half the energy of an equivalent minimally code-compliant building, and to produce as much renewable energy as it consumes on an annual basis. These energy goals and their substantiation through simulation were explicitly included in the project's fixed firm price design-build contract. The energy model had to be continuously updated during the design process and to match the final building as-built to the greatest degree possible. Computer modeling played a key role throughout the design process and in verifying that the contractual energy goals would be met within the specified budget. The main tool was a whole building energy simulation program. Other models were used to provide more detail or to complement the whole building simulation tool. Results from these specialized models were fed back into the main whole building simulation tool to provide the most accurate possible inputs for annual simulations. This paper will detail the models used in the design process and how they informed important program and design decisions on the path from preliminary design to the completed building.

  18. A Probabilistic Asteroid Impact Risk Model

    NASA Technical Reports Server (NTRS)

    Mathias, Donovan L.; Wheeler, Lorien F.; Dotson, Jessie L.

    2016-01-01

    Asteroid threat assessment requires the quantification of both the impact likelihood and resulting consequence across the range of possible events. This paper presents a probabilistic asteroid impact risk (PAIR) assessment model developed for this purpose. The model incorporates published impact frequency rates with state-of-the-art consequence assessment tools, applied within a Monte Carlo framework that generates sets of impact scenarios from uncertain parameter distributions. Explicit treatment of atmospheric entry is included to produce energy deposition rates that account for the effects of thermal ablation and object fragmentation. These energy deposition rates are used to model the resulting ground damage, and affected populations are computed for the sampled impact locations. The results for each scenario are aggregated into a distribution of potential outcomes that reflect the range of uncertain impact parameters, population densities, and strike probabilities. As an illustration of the utility of the PAIR model, the results are used to address the question of what minimum size asteroid constitutes a threat to the population. To answer this question, complete distributions of results are combined with a hypothetical risk tolerance posture to provide the minimum size, given sets of initial assumptions. Model outputs demonstrate how such questions can be answered and provide a means for interpreting the effect that input assumptions and uncertainty can have on final risk-based decisions. Model results can be used to prioritize investments to gain knowledge in critical areas or, conversely, to identify areas where additional data has little effect on the metrics of interest.
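
    The Monte Carlo structure described, sampling uncertain impactor properties, converting them to impact energy and damage area, and aggregating affected population, can be sketched very simply. The priors, damage-radius scaling, and population model below are crude placeholders; the actual PAIR model treats atmospheric entry, ablation, and fragmentation explicitly, which this sketch does not.

    ```python
    import numpy as np

    rng = np.random.default_rng(11)
    n = 200_000  # sampled impact scenarios

    diameter_m = 10 ** rng.uniform(1, 2.5, n)       # ~10 m to ~300 m, hypothetical prior
    density = rng.normal(2600, 500, n)              # kg/m^3
    velocity = rng.normal(20_000, 5000, n)          # m/s

    mass = density * (np.pi / 6) * diameter_m ** 3
    energy_mt = 0.5 * mass * velocity ** 2 / 4.184e15     # impact energy in megatons TNT

    # Toy damage model: damage radius grows with the cube root of energy.
    damage_radius_km = 2.0 * np.cbrt(energy_mt)
    pop_density = rng.lognormal(1.0, 2.0, n)              # people per km^2 at impact site
    affected = np.pi * damage_radius_km ** 2 * pop_density

    for d in (20, 50, 100):
        mask = diameter_m > d
        print(f"> {d} m: mean affected population {affected[mask].mean():,.0f}")
    ```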

  19. Risk analysis: divergent models and convergent interpretations

    NASA Technical Reports Server (NTRS)

    Carnes, B. A.; Gavrilova, N.

    2001-01-01

    Material presented at a NASA-sponsored workshop on risk models for exposure conditions relevant to prolonged space flight is described in this paper. Analyses used mortality data from experiments conducted at Argonne National Laboratory on the long-term effects of external whole-body irradiation on B6CF1 mice by 60Co gamma rays and fission neutrons delivered as a single exposure or protracted over either 24 or 60 once-weekly exposures. The maximum dose considered was restricted to 1 Gy for neutrons and 10 Gy for gamma rays. Proportional hazard models were used to investigate the shape of the dose response at these lower doses for deaths caused by solid-tissue tumors and tumors of either connective or epithelial tissue origin. For protracted exposures, a significant mortality effect was detected at a neutron dose of 14 cGy and a gamma-ray dose of 3 Gy. For single exposures, radiation-induced mortality for neutrons also occurred within the range of 10-20 cGy, but dropped to 86 cGy for gamma rays. Plots of risk relative to control estimated for each observed dose gave a visual impression of nonlinearity for both neutrons and gamma rays. At least for solid-tissue tumors, male and female mortality was nearly identical for gamma-ray exposures, but mortality risks for females were higher than for males for neutron exposures. As expected, protracting the gamma-ray dose reduced mortality risks. Although curvature consistent with that observed visually could be detected by a model parameterized to detect curvature, a relative risk term containing only a simple term for total dose was usually sufficient to describe the dose response. Although detectable mortality for the three pathology end points considered typically occurred at the same level of dose, the highest risks were almost always associated with deaths caused by tumors of epithelial tissue origin.

  20. Eddy current modeling by finite element method for evaluation of mechanical properties of the structure cracked in absolute probe

    NASA Astrophysics Data System (ADS)

    Harzallah, Salaheddine; Chabaat, Mohamed; Belgacem, Fethi Bin Muhammad

    2014-12-01

    In this paper, nondestructive evaluation with an eddy-current sensor is used as a tool to control cracks and micro-cracks in materials. A numerical simulation based on the finite element method is employed to detect cracks in materials and, eventually, to study their propagation using a crucial parameter, the Stress Intensity Factor (SIF). This method has emerged as one of the most efficient techniques for prospecting cracks in materials, evaluating SIFs and analyzing crack growth in the context of linear elastic fracture mechanics (LEFM). The technique uses displacement extrapolation, with results compared against those obtained by the interaction integral. Crack growth is modeled by combining the maximum circumferential stress criterion with the critical plane approach to predict the direction of crack growth. Moreover, a constant crack growth increment is determined using the modified Paris model. Furthermore, the stress intensity factors needed for these models are calculated using the domain form of the J-integral interaction.
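
    The crack-growth step mentioned can be illustrated with the basic Paris law, da/dN = C (ΔK)^m, with ΔK = Y Δσ sqrt(πa). The material constants, geometry factor, stress range, and crack lengths below are generic placeholders, not values from the paper, and the integration is a simple block-cycle approximation rather than the modified Paris model the authors use.

    ```python
    import numpy as np

    # Paris law: da/dN = C * (delta_K)^m, with delta_K = Y * delta_sigma * sqrt(pi * a).
    C, m = 1e-11, 3.0          # generic constants; da/dN in m/cycle, delta_K in MPa*sqrt(m)
    Y = 1.12                   # geometry factor for an edge crack (assumed)
    delta_sigma = 100.0        # stress range in MPa (assumed)

    a, a_crit = 1e-3, 20e-3    # initial and assumed critical crack lengths, m
    block = 1000               # integrate in blocks of 1000 cycles
    cycles = 0
    while a < a_crit:
        delta_K = Y * delta_sigma * np.sqrt(np.pi * a)
        a += C * delta_K ** m * block    # crack growth over one block of cycles
        cycles += block

    print(f"Approximate cycles to grow from 1 mm to 20 mm: {cycles:,}")
    ```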

  1. Mathematical modelling of risk reduction in reinsurance

    NASA Astrophysics Data System (ADS)

    Balashov, R. B.; Kryanev, A. V.; Sliva, D. E.

    2017-01-01

    The paper presents a mathematical model of efficient portfolio formation in the reinsurance markets. The presented approach provides an optimal trade-off between the expected return and the risk of the yield falling below a certain level. The uncertainty in the return values stems from the use of expert evaluations and preliminary calculations, which yield expected return values and the corresponding risk levels. The proposed method allows for the implementation of computationally simple schemes and algorithms for numerical calculation of the structure of efficient portfolios of reinsurance contracts for a given insurance company.

  2. Landslide risk mapping and modeling in China

    NASA Astrophysics Data System (ADS)

    Li, W.; Hong, Y.

    2015-12-01

    Under the circumstances of global climate change, tectonic stress and human activity, landslides are among the most frequent and severely widespread natural hazards on Earth, as demonstrated in the World Atlas of Natural Hazards (McGuire et al., 2004). Every year, landslide activity causes serious economic losses as well as casualties (Róbert et al., 2005). How landslides can be monitored and predicted is an urgent research topic for the international landslide research community. In particular, there is a lack of high-quality and updated landslide risk maps and guidelines that can be employed to better mitigate and prevent landslide disasters in many emerging regions, including China (Hong, 2007). Since the 1950s, landslide events have been recorded in statistical yearbooks, newspapers, and monographs in China. As disasters have become of increasing concern to the government and the public, information about landslide events is becoming available from online news reports (Liu et al., 2012). This study presents multi-scale landslide risk mapping and modeling in China. At the national scale, based on historical data and practical experience, we carry out landslide susceptibility and risk mapping by adopting a statistical approach and pattern recognition methods to construct empirical models. Over the identified landslide hot-spot areas, we further evaluate the slope stability of each individual site (Sidle and Hirotaka, 2006), with the ultimate goal of setting up a space-time, multi-scale coupled system of landslide risk mapping and modeling for landslide hazard monitoring and early warning.

  3. Risk management model in road transport systems

    NASA Astrophysics Data System (ADS)

    Sakhapov, R. L.; Nikolaeva, R. V.; Gatiyatullin, M. H.; Makhmutov, M. M.

    2016-08-01

    The article presents the results of a study of road safety indicators that influence the development and operation of the transport system. Road safety is considered as a continuous process of risk management. The authors constructed a model that relates social risk to a major road safety indicator - the level of motorization. The model gives a fairly accurate assessment of the level of social risk for any given level of motorization. The authors also calculated the dependence of socio-economic costs on the number of accidents and the people injured in them. The applicability of the concept of socio-economic damage follows from the linear relationship between the natural and economic indicators of damage from accidents. The optimization of social risk reduces to finding the extremum of an objective function that characterizes the economic effect of implementing measures to improve safety. The calculations make it possible to maximize the net present value as a function of the costs of improving road safety, taking into account the socio-economic damage caused by accidents. The proposed econometric models make it possible to quantify the efficiency of the transportation system and allow the simulation of changes in road safety indicators.

  4. Landslide risk models for decision making.

    PubMed

    Bonachea, Jaime; Remondo, Juan; de Terán, José Ramón Díaz; González-Díez, Alberto; Cendrero, Antonio

    2009-11-01

    This contribution presents a quantitative procedure for landslide risk analysis and zoning considering hazard, exposure (or value of elements at risk), and vulnerability. The method provides the means to obtain landslide risk models (expressing expected damage due to landslides on material elements and economic activities in monetary terms, according to different scenarios and periods) useful to identify areas where mitigation efforts will be most cost effective. It allows identifying priority areas for the implementation of actions to reduce vulnerability (elements) or hazard (processes). The procedure proposed can also be used as a preventive tool, through its application to strategic environmental impact analysis (SEIA) of land-use plans. The underlying hypothesis is that reliable predictions about hazard and risk can be made using models based on a detailed analysis of past landslide occurrences in connection with conditioning factors and data on past damage. The results show that the approach proposed and the hypothesis formulated are essentially correct, providing estimates of the order of magnitude of expected losses for a given time period. Uncertainties, strengths, and shortcomings of the procedure and results obtained are discussed and potential lines of research to improve the models are indicated. Finally, comments and suggestions are provided to generalize this type of analysis.
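
    The risk definition used here, combining hazard, exposure (value of elements at risk) and vulnerability into expected monetary damage per scenario and period, reduces to a simple product per mapping unit. The sketch below shows that calculation for hypothetical zones; the names and numbers are invented for illustration and are not results of the study.

    ```python
    # Expected annual loss per zone = P(landslide per year) * exposed value * vulnerability.
    zones = [
        {"name": "Zone A", "hazard": 0.002, "exposure_eur": 5_000_000, "vulnerability": 0.30},
        {"name": "Zone B", "hazard": 0.010, "exposure_eur": 1_200_000, "vulnerability": 0.15},
        {"name": "Zone C", "hazard": 0.0005, "exposure_eur": 20_000_000, "vulnerability": 0.60},
    ]

    for z in zones:
        eal = z["hazard"] * z["exposure_eur"] * z["vulnerability"]
        print(f"{z['name']}: expected annual loss = {eal:,.0f} EUR")

    # Ranking zones by expected annual loss identifies where mitigation is most cost effective.
    ```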

  5. Quantitative risk modelling for new pharmaceutical compounds.

    PubMed

    Tang, Zhengru; Taylor, Mark J; Lisboa, Paulo; Dyas, Mark

    2005-11-15

    The process of discovering and developing new drugs is long, costly and risk-laden. Faced with a wealth of newly discovered compounds, industrial scientists need to target resources carefully to discern the key attributes of a drug candidate and to make informed decisions. Here, we describe a quantitative approach to modelling the risk associated with drug development as a tool for scenario analysis concerning the probability of success of a compound as a potential pharmaceutical agent. We bring together the three strands of manufacture, clinical effectiveness and financial returns. This approach involves the application of a Bayesian Network. A simulation model is demonstrated with an implementation in MS Excel using the modelling engine Crystal Ball.

  6. Malignancy Risk Models for Oral Lesions

    PubMed Central

    Zarate, Ana M.; Brezzo, María M.; Secchi, Dante G.; Barra, José L.

    2013-01-01

    Objectives: The aim of this work was to assess risk habits, clinical and cellular phenotypes and TP53 DNA changes in oral mucosa samples from patients with Oral Potentially Malignant Disorders (OPMD), in order to create models that enable genotypic and phenotypic patterns to be obtained that determine the risk of lesions becoming malignant. Study Design: Clinical phenotypes, family history of cancer and risk habits were collected in clinical histories. TP53 gene mutations and morphometric-morphological features were studied, and multivariate models were applied. Three groups were established: a) oral cancer (OC) group (n=10), b) OPMD group (n=10), and c) control group (n=8). Results: An average of 50% of patients with malignancy were found to have smoking and drinking habits. A high percentage of TP53 mutations were observed in OC (30%) and OPMD (average 20%) lesions (p=0.000). The majority of these mutations were GC → TA transversions (60%). However, patients with OC presented mutations in all the exons and introns studied. The highest diagnostic accuracy (p=0.0001) was observed when the alcohol and tobacco habit variables were incorporated together with TP53 mutations. Conclusions: Our results prove to be statistically reliable, with parameter estimates that are nearly unbiased even for small sample sizes. Models 2 and 3 were the most accurate for assessing the risk of an OPMD becoming cancerous. However, in a public health context, model 3 is the most recommended because the characteristics considered are easier and less costly to evaluate. Key words: TP53, oral potentially malignant disorders, risk factors, genotype, phenotype. PMID:23722122

  7. Simulation Assisted Risk Assessment: Blast Overpressure Modeling

    NASA Technical Reports Server (NTRS)

    Lawrence, Scott L.; Gee, Ken; Mathias, Donovan; Olsen, Michael

    2006-01-01

    A probabilistic risk assessment (PRA) approach has been developed and applied to the risk analysis of capsule abort during ascent. The PRA is used to assist in the identification of modeling and simulation applications that can significantly impact the understanding of crew risk during this potentially dangerous maneuver. The PRA approach is also being used to identify the appropriate level of fidelity for the modeling of those critical failure modes. The Apollo launch escape system (LES) was chosen as a test problem for application of this approach. Failure modes that have been modeled and/or simulated to date include explosive overpressure-based failure, explosive fragment-based failure, land landing failures (range limits exceeded either near launch or Mode III trajectories ending on the African continent), capsule-booster re-contact during separation, and failure due to plume-induced instability. These failure modes have been investigated using analysis tools in a variety of technical disciplines at various levels of fidelity. The current paper focuses on the development and application of a blast overpressure model for the prediction of structural failure due to overpressure, including the application of high-fidelity analysis to predict near-field and headwinds effects.

  8. Absolute quantitation of protein posttranslational modification isoform.

    PubMed

    Yang, Zhu; Li, Ning

    2015-01-01

    Mass spectrometry has been widely applied in the characterization and quantification of proteins from complex biological samples. Because absolute protein amounts are needed to construct mathematical models of the molecular systems underlying various biological phenotypes and phenomena, a number of quantitative proteomic methods have been adopted to measure absolute quantities of proteins using mass spectrometry. Liquid chromatography-tandem mass spectrometry (LC-MS/MS) coupled with internal peptide standards, i.e., stable isotope-coded peptide dilution series, which originated in the field of analytical chemistry, has become a widely applied method in absolute quantitative proteomics research. This approach provides an increasing number of high-confidence absolute protein quantitation results. Because the quantitative study of posttranslational modification (PTM), which modulates the biological activity of proteins, is crucial for biological science, and because each isoform may contribute a unique biological function, degradation, and/or subcellular location, the absolute quantitation of protein PTM isoforms has become more relevant to their biological significance. In order to accurately obtain the absolute cellular amount of a PTM isoform of a protein, the impacts of protein fractionation, protein enrichment, and proteolytic digestion yield should be taken into consideration, and those effects occurring before differentially stable isotope-coded PTM peptide standards are spiked into the sample peptides have to be corrected. Assisted by stable isotope-labeled peptide standards, the absolute quantitation of isoforms of posttranslationally modified protein (AQUIP) method takes all these factors into account and determines the absolute amount of a protein PTM isoform from the absolute amount of the protein of interest and the PTM occupancy at the site of the protein. The absolute amount of the protein of interest is inferred by quantifying both the absolute amounts of a few PTM

  9. The 0.4-Msun eclipsing binary CU Cancri. Absolute dimensions, comparison with evolutionary models and possible evidence for a circumstellar dust disk

    NASA Astrophysics Data System (ADS)

    Ribas, I.

    2003-01-01

    Photometric observations in the R and I bands of the detached M-type double-lined eclipsing binary CU Cnc have been acquired and analysed. The photometric elements obtained from the analysis of the light curves have been combined with an existing spectroscopic solution to yield high-precision (errors ≲ 2%) absolute dimensions: MA=0.4333+/-0.0017 Msun, MB=0.3980+/-0.0014 Msun, RA=0.4317+/-0.0052 Rsun, and RB=0.3908+/-0.0094 Rsun. The mean effective temperature of the system has been estimated to be Teff=3140+/-150 K by comparing multi-band photometry (optical and infrared) with synthetic colors computed from state-of-the-art model atmospheres. Additionally, we have been able to obtain an estimate for the age (~320 Myr) and chemical composition ([Fe/H]~0.0) of the binary system through its membership of the Castor moving group. With all these observational constraints, we have carried out a critical test of recent stellar models for low-mass stars. The comparison reveals that most evolutionary models underestimate the radius of the stars by as much as 10%, thus confirming the trend observed by Torres & Ribas (2002) for YY Gem and V818 Tau. In the mass-absolute magnitude diagram, CU Cnc is observed to be dimmer than other stars of the same mass, and this makes the comparison with stellar models less compelling. After ruling out a number of different scenarios, the apparent faintness of CU Cnc can be explained if its components are some 10% cooler than similar-mass stars or if there is some source of circumstellar dust absorption. The latter could be tantalizing indirect evidence for a coplanar (Vega-like) dusty disk around this relatively young M-type binary. Tables 1 and 2 are only available in electronic form at the CDS via anonymous ftp to cdsarc.u-strasbg.fr (130.79.125.5) or via http://cdsweb.u-strasbg.fr/cgi-bin/qcat?J/A+A/398/239

  10. Risk modelling for vaccination: a risk assessment perspective.

    PubMed

    Wooldridge, M

    2007-01-01

    Any risk assessment involves a number of steps. First, the risk manager, in close liaison with the risk assessor, should identify the question of interest. Then, the hazards associated with each risk question should be identified. Only then can the risks themselves be assessed. Several questions may reasonably be asked about the risks associated with avian influenza vaccines and their use. Some apply to any vaccine, while others are specific to avian influenza. Risks may occur during manufacture and during use. Some concern the vaccines themselves, while others address the effect of failure on disease control. The hazards associated with each risk question are then identified. These may be technical errors in design, development or production, such as contamination or failure to inactivate appropriately. They may relate to the biological properties of the pathogens themselves displayed during manufacture or use, for example, reversion to virulence, shedding, or not being the right strain for the subsequent challenge. Following a consideration of risks and hazards, the information needed and an outline of the steps necessary to assess the risk are summarized for an illustrative risk question, using as an example the risks associated with the use of vaccines in the field. A brief consideration of the differences between qualitative and quantitative risk assessments is also included, and the potential effects of uncertainty and variability on the results are discussed.

  11. Applying Least Absolute Shrinkage Selection Operator and Akaike Information Criterion Analysis to Find the Best Multiple Linear Regression Models between Climate Indices and Components of Cow's Milk.

    PubMed

    Marami Milani, Mohammad Reza; Hense, Andreas; Rahmani, Elham; Ploeger, Angelika

    2016-07-23

    This study focuses on multiple linear regression models relating six climate indices (temperature humidity THI, environmental stress ESI, equivalent temperature index ETI, heat load HLI, modified HLI (HLI new), and respiratory rate predictor RRP) with three main components of cow's milk (yield, fat, and protein) for cows in Iran. The least absolute shrinkage selection operator (LASSO) and the Akaike information criterion (AIC) techniques are applied to select the best model for milk predictands with the smallest number of climate predictors. Uncertainty estimation is employed by applying bootstrapping through resampling. Cross validation is used to avoid over-fitting. Climatic parameters are calculated from the NASA-MERRA global atmospheric reanalysis. Milk data for the months from April to September, 2002 to 2010 are used. The best linear regression models are found in spring between milk yield as the predictand and THI, ESI, ETI, HLI, and RRP as predictors with p-value < 0.001 and R² (0.50, 0.49) respectively. In summer, milk yield with independent variables of THI, ETI, and ESI show the highest relation (p-value < 0.001) with R² (0.69). For fat and protein the results are only marginal. This method is suggested for the impact studies of climate variability/change on agriculture and food science fields when short-time series or data with large uncertainty are available.
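
    The variable-selection step described, LASSO combined with an information criterion to pick a parsimonious set of climate predictors, can be sketched with scikit-learn. The data below are synthetic stand-ins (not the Iranian milk records), and the "true" coefficients are an assumption used only to generate the example.

    ```python
    import numpy as np
    from sklearn.linear_model import LassoCV, LassoLarsIC
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(5)
    n = 300
    # Synthetic climate indices standing in for THI, ESI, ETI, HLI, HLI_new, RRP.
    X = rng.normal(size=(n, 6))
    # Hypothetical ground truth: milk yield depends on only two indices plus noise.
    y = 2.0 * X[:, 0] - 1.5 * X[:, 2] + rng.normal(scale=1.0, size=n)

    Xs = StandardScaler().fit_transform(X)

    lasso = LassoCV(cv=5, random_state=0).fit(Xs, y)     # cross-validated LASSO
    aic = LassoLarsIC(criterion="aic").fit(Xs, y)        # AIC-based selection

    names = ["THI", "ESI", "ETI", "HLI", "HLI_new", "RRP"]
    print("LassoCV coefficients:", dict(zip(names, lasso.coef_.round(2))))
    print("AIC-selected coefficients:", dict(zip(names, aic.coef_.round(2))))
    ```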

  12. MODELING MULTI-WAVELENGTH STELLAR ASTROMETRY. III. DETERMINATION OF THE ABSOLUTE MASSES OF EXOPLANETS AND THEIR HOST STARS

    SciTech Connect

    Coughlin, J. L.; Lopez-Morales, Mercedes

    2012-05-10

    Astrometric measurements of stellar systems are becoming significantly more precise and common, with many ground- and space-based instruments and missions approaching 1 μas precision. We examine the multi-wavelength astrometric orbits of exoplanetary systems via both analytical formulae and numerical modeling. Exoplanets have a combination of reflected and thermally emitted light that causes the photocenter of the system to shift increasingly farther away from the host star with increasing wavelength. We find that, if observed at long enough wavelengths, the planet can dominate the astrometric motion of the system, and thus it is possible to directly measure the orbits of both the planet and star, and thus directly determine the physical masses of the star and planet, using multi-wavelength astrometry. In general, this technique works best for, though is certainly not limited to, systems that have large, high-mass stars and large, low-mass planets, which is a unique parameter space not covered by other exoplanet characterization techniques. Exoplanets that happen to transit their host star present unique cases where the physical radii of the planet and star can be directly determined via astrometry alone. Planetary albedos and day-night contrast ratios may also be probed via this technique due to the unique signature they impart on the observed astrometric orbits. We develop a tool to examine the prospects for near-term detection of this effect, and give examples of some exoplanets that appear to be good targets for detection in the K to N infrared observing bands, if the required precision can be achieved.

  13. Probabilistic Seismic Risk Model for Western Balkans

    NASA Astrophysics Data System (ADS)

    Stejskal, Vladimir; Lorenzo, Francisco; Pousse, Guillaume; Radovanovic, Slavica; Pekevski, Lazo; Dojcinovski, Dragi; Lokin, Petar; Petronijevic, Mira; Sipka, Vesna

    2010-05-01

    A probabilistic seismic risk model for insurance and reinsurance purposes is presented for the Western Balkans, covering the former Yugoslavia and Albania. This territory experienced many severe earthquakes during past centuries, producing significant damage to many population centres in the region. The highest hazard is related to the external Dinarides, namely the collision zone of the Adriatic plate. The model is based on a unified catalogue for the region and a seismic source model consisting of more than 30 zones covering all three main structural units - the Southern Alps, the Dinarides and the south-western margin of the Pannonian Basin. A probabilistic methodology using Monte Carlo simulation was applied to generate the hazard component of the model. A unique set of damage functions based on both loss experience and engineering assessments is used to convert the modelled ground motion severity into monetary loss.

  14. Structural equation modeling in environmental risk assessment.

    PubMed Central

    Buncher, C R; Succop, P A; Dietrich, K N

    1991-01-01

    Environmental epidemiology requires effective models that take individual observations of environmental factors and connect them into meaningful patterns. Single-factor relationships have given way to multivariable analyses; simple additive models have been augmented by multiplicative (logistic) models. Each of these steps has produced greater enlightenment and understanding. Models that allow for factors causing outputs that can affect later outputs with putative causation working at several different time points (e.g., linkage) are not commonly used in the environmental literature. Structural equation models are a class of covariance structure models that have been used extensively in economics/business and social science but are still little used in the realm of biostatistics. Path analysis in genetic studies is one simplified form of this class of models. We have been using these models in a study of the health and development of infants who have been exposed to lead in utero and in the postnatal home environment. These models require as input the directionality of the relationship and then produce fitted models for multiple inputs causing each factor and the opportunity to have outputs serve as input variables into the next phase of the simultaneously fitted model. Some examples of these models from our research are presented to increase familiarity with this class of models. Use of these models can provide insight into the effect of changing an environmental factor when assessing risk. The usual cautions concerning believing a model, believing causation has been proven, and the assumptions that are required for each model are operative. PMID:2050063

  15. Human Plague Risk: Spatial-Temporal Models

    NASA Technical Reports Server (NTRS)

    Pinzon, Jorge E.

    2010-01-01

    This chapter reviews the use of spatial-temporal models in identifying potential risks of plague outbreaks in the human population. Using Earth observations from satellite remote sensing, there has been a systematic analysis and mapping of the close coupling between the vectors of the disease and climate variability. The overall result is that the incidence of plague is correlated with the positive phase of the El Niño/Southern Oscillation (ENSO).

  16. Risk-driven security testing using risk analysis with threat modeling approach.

    PubMed

    Palanivel, Maragathavalli; Selvadurai, Kanmani

    2014-01-01

    Security testing is a process of determining risks present in the system states and protecting them from vulnerabilities. However, security testing often does not give due importance to threat modeling and risk analysis simultaneously, which affects the confidentiality and integrity of the system. Risk analysis includes the identification, evaluation and assessment of risks. Threat modeling identifies the threats associated with the system. Risk-driven security testing uses risk analysis results in test case identification, selection and assessment to prioritize and optimize the testing process. The threat modeling approach STRIDE is generally used to identify both technical and non-technical threats present in the system. Thus, a security testing mechanism based on risk analysis results using the STRIDE approach has been proposed for identifying high-risk states. The risk metrics considered for testing include risk impact, risk possibility and risk threshold. The risk threshold value is directly proportional to risk impact and risk possibility. Risk-driven security testing results in a reduced test suite, which in turn reduces test case selection time. Risk analysis optimizes the test case selection and execution process. For experimentation, the system models LMS, ATM, OBS, OSS and MTRS are considered. The performance of the proposed system is analyzed using the Test Suite Reduction Rate (TSRR) and FSM coverage. TSRR varies from 13.16 to 21.43%, whereas FSM coverage is achieved up to 91.49%. The results show that the proposed method, combining risk analysis with threat modeling, identifies states with high risks to improve testing efficiency.
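
    The prioritization rule implied here, scoring each state by risk impact and risk possibility and testing only states above a risk threshold, can be sketched as follows. The state names, scores, and threshold are hypothetical illustrations of the approach, not values from the paper.

    ```python
    # Risk-driven test case selection: score = impact * possibility; keep states above threshold.
    states = [
        {"state": "login_bypass",    "impact": 0.9, "possibility": 0.6},
        {"state": "session_replay",  "impact": 0.7, "possibility": 0.3},
        {"state": "info_disclosure", "impact": 0.5, "possibility": 0.8},
        {"state": "ui_glitch",       "impact": 0.2, "possibility": 0.9},
    ]

    threshold = 0.30  # assumed risk threshold

    for s in states:
        s["risk"] = s["impact"] * s["possibility"]

    selected = sorted((s for s in states if s["risk"] >= threshold),
                      key=lambda s: s["risk"], reverse=True)

    for s in selected:
        print(f"test {s['state']}: risk score {s['risk']:.2f}")
    print(f"Test suite reduced from {len(states)} to {len(selected)} states.")
    ```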

  17. Validation of the absolute renal risk of dialysis/death in adults with IgA nephropathy secondary to Henoch-Schönlein purpura: a monocentric cohort study

    PubMed Central

    2013-01-01

    Background We established earlier the absolute renal risk (ARR) of dialysis/death (D/D) in primary IgA nephropathy (IgAN), which permitted accurate prospective prediction of final prognosis. This ARR was based on the potential presence at initial diagnosis of three major, independent, and equipotent risk factors: hypertension, quantitative proteinuria ≥ 1 g per day, and severe pathological lesions appreciated by our local classification score ≥ 8 (range 0–20). We studied the validity of this ARR concept in secondary IgAN to predict future outcome, focusing on Henoch-Schönlein purpura (HSP) nephritis. Methods Our cohort of adult IgAN comprised 1064 patients, of whom 101 had secondary IgAN, and was focused on 74 HSP patients (59 men) with a mean age of 38.6 at initial diagnosis and a mean follow-up of 11.8 years. Three major risk factors - hypertension, proteinuria ≥1 g/d, and severe pathological lesions appreciated by our global optical score ≥8 (GOS, which integrated all elementary histological lesions) - were studied at biopsy-proven diagnosis, and their presence defined the ARR score: 0 for none present, 3 for all present, and 1 or 2 for the presence of any 1 or 2 risk factors. The primary end-point was composite: the occurrence of dialysis or death before dialysis (D/D). We used classical statistics and both time-dependent Cox regression and Kaplan-Meier survival curve methods. Results The cumulative rate of D/D at 10 and 20 years post-onset was respectively 0 and 14% for ARR = 0 (23 patients); 10 and 23% for ARR = 1 (N = 19); 27 and 33% for ARR = 2 (N = 24); and 81 and 100% (before 20 y) in the 8 patients with ARR = 3 (P = 0.0007). Prediction at the time of diagnosis (time zero) of the 10-year cumulative rate of the D/D event was 0% for ARR = 0, 10% for ARR = 1, 33% for ARR = 2, and 100% by 8.5 y for ARR = 3 (P = 0.0003) in this adequately treated cohort. Conclusion This study clearly validates the Absolute Renal Risk of Dialysis
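
    The ARR score itself is simply a count of how many of the three risk factors are present at diagnosis. A minimal sketch is shown below; the example patients are hypothetical and not drawn from the cohort.

    ```python
    def absolute_renal_risk(hypertension, proteinuria_g_per_day, global_optical_score):
        """ARR score (0-3): one point each for hypertension, proteinuria >= 1 g/day,
        and severe pathological lesions (global optical score >= 8)."""
        return (int(bool(hypertension))
                + int(proteinuria_g_per_day >= 1.0)
                + int(global_optical_score >= 8))

    # Hypothetical patients (not from the study).
    patients = [
        {"id": 1, "htn": False, "prot": 0.4, "gos": 5},
        {"id": 2, "htn": True,  "prot": 1.6, "gos": 7},
        {"id": 3, "htn": True,  "prot": 2.1, "gos": 11},
    ]
    for p in patients:
        print(f"patient {p['id']}: ARR = {absolute_renal_risk(p['htn'], p['prot'], p['gos'])}")
    ```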

  18. Electronic Absolute Cartesian Autocollimator

    NASA Technical Reports Server (NTRS)

    Leviton, Douglas B.

    2006-01-01

    An electronic absolute Cartesian autocollimator performs the same basic optical function as does a conventional all-optical or a conventional electronic autocollimator but differs in the nature of its optical target and the manner in which the position of the image of the target is measured. The term absolute in the name of this apparatus reflects the nature of the position measurement, which, unlike in a conventional electronic autocollimator, is based absolutely on the position of the image rather than on an assumed proportionality between the position and the levels of processed analog electronic signals. The term Cartesian in the name of this apparatus reflects the nature of its optical target. Figure 1 depicts the electronic functional blocks of an electronic absolute Cartesian autocollimator along with its basic optical layout, which is the same as that of a conventional autocollimator. Referring first to the optical layout and functions only, this or any autocollimator is used to measure the compound angular deviation of a flat datum mirror with respect to the optical axis of the autocollimator itself. The optical components include an illuminated target, a beam splitter, an objective or collimating lens, and a viewer or detector (described in more detail below) at a viewing plane. The target and the viewing planes are focal planes of the lens. Target light reflected by the datum mirror is imaged on the viewing plane at unit magnification by the collimating lens. If the normal to the datum mirror is parallel to the optical axis of the autocollimator, then the target image is centered on the viewing plane. Any angular deviation of the normal from the optical axis manifests itself as a lateral displacement of the target image from the center. The magnitude of the displacement is proportional to the focal length and to the magnitude (assumed to be small) of the angular deviation. The direction of the displacement is perpendicular to the axis about which the

  19. Graphical models and Bayesian domains in risk modelling: application in microbiological risk assessment.

    PubMed

    Greiner, Matthias; Smid, Joost; Havelaar, Arie H; Müller-Graf, Christine

    2013-05-15

    Quantitative microbiological risk assessment (QMRA) models are used to reflect knowledge about complex real-world scenarios for the propagation of microbiological hazards along the feed and food chain. The aim is to provide insight into interdependencies among model parameters, typically with an interest in characterising the effect of risk mitigation measures. A particular requirement is to achieve clarity about the reliability of conclusions from the model in the presence of uncertainty. To this end, Monte Carlo (MC) simulation modelling has become a standard in so-called probabilistic risk assessment. In this paper, we elaborate on the application of Bayesian computational statistics in the context of QMRA. It is useful to explore the analogy between MC modelling and Bayesian inference (BI). This pertains in particular to the procedures for deriving prior distributions for model parameters. We illustrate using a simple example that the inability to cope with feedback among model parameters is a major limitation of MC modelling. However, BI models can be easily integrated into MC modelling to overcome this limitation. We refer to a BI submodel integrated into an MC model as a "Bayes domain". We also demonstrate that an entire QMRA model can be formulated as a Bayesian graphical model (BGM) and discuss the advantages of this approach. Finally, we show example graphs of MC, BI and BGM models, highlighting the similarities among the three approaches.

  20. NGNP Risk Management Database: A Model for Managing Risk

    SciTech Connect

    John Collins

    2009-09-01

    To facilitate the implementation of the Risk Management Plan, the Next Generation Nuclear Plant (NGNP) Project has developed and employed an analytical software tool called the NGNP Risk Management System (RMS). A relational database developed in Microsoft® Access, the RMS provides conventional database utility including data maintenance, archiving, configuration control, and query ability. Additionally, the tool’s design provides a number of unique capabilities specifically designed to facilitate the development and execution of activities outlined in the Risk Management Plan. Specifically, the RMS provides the capability to establish the risk baseline, document and analyze the risk reduction plan, track the current risk reduction status, organize risks by reference configuration system, subsystem, and component (SSC) and Area, and increase the level of NGNP decision making.

  1. Modeling Opponents in Adversarial Risk Analysis.

    PubMed

    Rios Insua, David; Banks, David; Rios, Jesus

    2016-04-01

    Adversarial risk analysis has been introduced as a framework to deal with risks derived from intentional actions of adversaries. The analysis supports one of the decisionmakers, who must forecast the actions of the other agents. Typically, this forecast must take account of random consequences resulting from the set of selected actions. The solution requires one to model the behavior of the opponents, which entails strategic thinking. The supported agent may face different kinds of opponents, who may use different rationality paradigms, for example, the opponent may behave randomly, or seek a Nash equilibrium, or perform level-k thinking, or use mirroring, or employ prospect theory, among many other possibilities. We describe the appropriate analysis for these situations, and also show how to model the uncertainty about the rationality paradigm used by the opponent through a Bayesian model averaging approach, enabling a fully decision-theoretic solution. We also show how as we observe an opponent's decision behavior, this approach allows learning about the validity of each of the rationality models used to predict his decision by computing the models' (posterior) probabilities, which can be understood as a measure of their validity. We focus on simultaneous decision making by two agents.
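
    The model-averaging step described, updating the posterior probability of each rationality model as the opponent's decisions are observed and using those probabilities as a measure of validity, can be sketched with a simple discrete Bayesian update. The candidate models, their action distributions, and the observed actions below are hypothetical placeholders.

    ```python
    import numpy as np

    # Hypothetical rationality models: each gives a distribution over three opponent actions.
    models = {
        "random":  np.array([1/3, 1/3, 1/3]),
        "nash":    np.array([0.6, 0.3, 0.1]),
        "level_k": np.array([0.1, 0.2, 0.7]),
    }
    posterior = {name: 1/3 for name in models}      # uniform prior over rationality models

    observed_actions = [2, 2, 0, 2, 1, 2]           # indices of observed opponent decisions

    for a in observed_actions:
        for name, probs in models.items():
            posterior[name] *= probs[a]             # multiply in the likelihood of the action
        total = sum(posterior.values())
        posterior = {k: v / total for k, v in posterior.items()}   # renormalize

    print("Posterior model probabilities:", {k: round(v, 3) for k, v in posterior.items()})
    # Model-averaged prediction of the opponent's next action:
    avg = sum(posterior[k] * models[k] for k in models)
    print("Model-averaged action distribution:", avg.round(3))
    ```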

  2. Absolute solvation free energy of Li+ and Na+ ions in dimethyl sulfoxide solution: A theoretical ab initio and cluster-continuum model study

    SciTech Connect

    Westphal, Eduard; Pliego, Josefredo R. Jr.

    2005-08-15

    The solvation of the lithium and sodium ions in dimethyl sulfoxide solution was theoretically investigated using ab initio calculations coupled with the hybrid cluster-continuum model, a quasichemical theory of solvation. We have investigated clusters of ions with up to five dimethyl sulfoxide (DMSO) molecules, and the bulk solvent was described by a dielectric continuum model. Our results show that the lithium and sodium ions have four and five DMSO molecules in the first coordination shell, and the calculated solvation free energies are -135.5 and -108.6 kcal/mol, respectively. These data suggest a solvation free energy value of -273.2 kcal/mol for the proton in dimethyl sulfoxide solution, a value that is more negative than the present uncertain experimental value. This and previous studies on the solvation of ions in water solution indicate that the tetraphenylarsonium tetraphenylborate assumption is flawed and that the absolute value of the free energy of transfer of ions from water to DMSO solution is higher than the present experimental values.

  3. Functional Risk Modeling for Lunar Surface Systems

    NASA Technical Reports Server (NTRS)

    Thomson, Fraser; Mathias, Donovan; Go, Susie; Nejad, Hamed

    2010-01-01

    We introduce an approach to risk modeling that we call functional modeling, which we have developed to estimate the capabilities of a lunar base. The functional model tracks the availability of functions provided by systems, in addition to the operational state of those systems' constituent strings. By tracking functions, we are able to identify cases where identical functions are provided by elements (rovers, habitats, etc.) that are connected together on the lunar surface. We credit functional diversity in those cases, and in doing so compute more realistic estimates of operational mode availabilities. The functional modeling approach yields more realistic estimates of the availability of the various operational modes provided to astronauts by the ensemble of surface elements included in a lunar base architecture. By tracking functional availability, the effects of diverse backup, which often exists when two or more independent elements are connected together, are properly accounted for.
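
    Crediting functional diversity means a function is available if any of the elements providing it is up, so its availability is one minus the product of the providers' failure probabilities. The sketch below illustrates this; the elements, functions, and availability values are hypothetical and not taken from the NASA model.

    ```python
    import numpy as np

    # Hypothetical surface elements with per-element availabilities.
    element_availability = {"habitat": 0.98, "rover_1": 0.90, "rover_2": 0.90}

    # Hypothetical mapping of functions to the elements that can provide them.
    function_providers = {
        "life_support": ["habitat"],
        "mobility": ["rover_1", "rover_2"],
        "power": ["habitat", "rover_1", "rover_2"],
    }

    for function, providers in function_providers.items():
        p_all_down = np.prod([1 - element_availability[e] for e in providers])
        print(f"{function}: availability with functional diversity = {1 - p_all_down:.4f}")
    # Tracking functions rather than individual elements credits the redundant providers
    # of "mobility" and "power", yielding higher (more realistic) availability estimates.
    ```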

  4. Absolute-structure reports.

    PubMed

    Flack, Howard D

    2013-08-01

    All the 139 noncentrosymmetric crystal structures published in Acta Crystallographica Section C between January 2011 and November 2012 inclusive have been used as the basis of a detailed study of the reporting of absolute structure. These structure determinations cover a wide range of space groups, chemical composition and resonant-scattering contribution. Defining A and D as the average and difference of the intensities of Friedel opposites, their level of fit has been examined using 2AD and selected-D plots. It was found, regardless of the expected resonant-scattering contribution to Friedel opposites, that the Friedel-difference intensities are often dominated by random uncertainty and systematic error. An analysis of data collection strategy is provided. It is found that crystal-structure determinations resulting in a Flack parameter close to 0.5 may not necessarily be from crystals twinned by inversion. Friedifstat is shown to be a robust estimator of the resonant-scattering contribution to Friedel opposites, very little affected by the particular space group of a structure nor by the occupation of special positions. There is considerable confusion in the text of papers presenting achiral noncentrosymmetric crystal structures. Recommendations are provided for the optimal way of treating noncentrosymmetric crystal structures for which the experimenter has no interest in determining the absolute structure.

  5. Development and validation of risk prediction model for venous thromboembolism in postpartum women: multinational cohort study

    PubMed Central

    Sultan, Alyshah Abdul; West, Joe; Grainge, Matthew J; Riley, Richard D; Tata, Laila J; Stephansson, Olof; Fleming, Kate M; Nelson-Piercy, Catherine; Ludvigsson, Jonas F

    2016-01-01

    Objective To develop and validate a risk prediction model for venous thromboembolism in the first six weeks after delivery (early postpartum). Design Cohort study. Setting Records from the England-based Clinical Practice Research Datalink (CPRD) linked to Hospital Episode Statistics (HES), and data from a Sweden-based registry. Participants All pregnant women registered in the CPRD-HES linked data between 1997 and 2014 and in the Swedish medical birth registry between 2005 and 2011 with postpartum follow-up. Main outcome measure Multivariable logistic regression analysis was used to develop a risk prediction model for postpartum venous thromboembolism based on the English data, which was externally validated in the Swedish data. Results 433 353 deliveries were identified in the English cohort and 662 387 in the Swedish cohort. The absolute rate of venous thromboembolism was 7.2 per 10 000 deliveries in the English cohort and 7.9 per 10 000 in the Swedish cohort. Emergency caesarean delivery, stillbirth, varicose veins, pre-eclampsia/eclampsia, postpartum infection, and comorbidities were the strongest predictors of venous thromboembolism in the final multivariable model. Discrimination of the model was similar in both cohorts, with a C statistic above 0.70 and excellent calibration of observed and predicted risks. The model identified more venous thromboembolism events than the existing national English (sensitivity 68% v 63%) and Swedish guidelines (30% v 21%) at similar thresholds. Conclusion A new prediction model that quantifies absolute risk of postpartum venous thromboembolism has been developed and externally validated. It is based on clinical variables that are available in many developed countries at the point of delivery and could serve as the basis for real-time decisions on obstetric thromboprophylaxis. PMID:27919934

  6. Modeling Situation Awareness and Crash Risk

    PubMed Central

    Fisher, Donald L.; Strayer, David L.

    2014-01-01

    In this article we develop a model of the relationship between crash risk and a driver’s situation awareness. We consider a driver’s situation awareness to reflect the dynamic mental model of the driving environment and to be dependent upon several psychological processes including Scanning the driving environment, Predicting and anticipating hazards, Identifying potential hazards in the driving scene as they occur, Deciding on an action, and Executing an appropriate Response (SPIDER). Together, the SPIDER processes are important for establishing and maintaining good situation awareness of the driving environment, and good situation awareness is important for coordinating and scheduling the SPIDER-relevant processes necessary for safe driving. An Order-of-Processing (OP) model makes explicit the SPIDER-relevant processes and how they predict the likelihood of a crash when the driver is or is not distracted by a secondary task. For example, the OP model shows how a small decrease in the likelihood of any particular SPIDER activity being completed successfully (because of concurrent secondary-task performance) would lead to a large increase in the relative risk of a crash. PMID:24776225
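
    As a back-of-the-envelope illustration of that last point (not the authors' OP model itself), the sketch below treats a crash as avoided only when all five SPIDER processes succeed; the per-process success probabilities are invented.

```python
def crash_probability(p_success_per_stage: float, n_stages: int = 5) -> float:
    """Probability that at least one of n independent SPIDER stages fails."""
    return 1.0 - p_success_per_stage ** n_stages

baseline = crash_probability(0.99)    # undistracted driver
distracted = crash_probability(0.97)  # small per-stage decrement under distraction
print(f"relative crash risk: {distracted / baseline:.2f}")  # roughly 2.9
```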

  7. Risk assessment model for invasive breast cancer in Hong Kong women

    PubMed Central

    Wang, Feng; Dai, Juncheng; Li, Mengjie; Chan, Wing-cheong; Kwok, Carol Chi-hei; Leung, Siu-lan; Wu, Cherry; Li, Wentao; Yu, Wai-cho; Tsang, Koon-ho; Law, Sze-hong; Lee, Priscilla Ming-yi; Wong, Carmen Ka-man; Shen, Hongbing; Wong, Samuel Yeung-shan; Yang, Xiaohong R.; Tse, Lap Ah

    2016-01-01

    No risk assessment tool is available for identifying the high-risk population for breast cancer (BCa) in Hong Kong. A case–control study including 918 BCa cases and 923 controls was used to develop the risk assessment model among Hong Kong Chinese women. Each participant received an in-depth interview to obtain her lifestyle and environmental risk factors. Least absolute shrinkage and selection operator (LASSO) selection was used to select the optimal risk factors (LASSO model). A risk score system was constructed to evaluate the cumulative effects of the selected factors. Bootstrap simulation was used to test the internal validity of the model. Model performance was evaluated by receiver operating characteristic curves and the area under the curve (AUC). Age, parity, number of BCa cases in first-degree relatives, exposure to light at night, and sleep quality were the common risk factors for all women. Alcohol drinking was included for premenopausal women; body mass index, age at menarche, age at first birth, breast feeding, use of oral contraceptives, hormone replacement therapy, and history of benign breast disease were included for postmenopausal women. The AUCs were 0.640 (95% CI, 0.598–0.681) and 0.655 (95% CI, 0.621–0.653) for pre- and postmenopausal women, respectively. Further subgroup evaluation revealed that model performance was better for women aged 50 to 70 years or with ER-positive tumors. This BCa risk assessment tool for Hong Kong Chinese women based on LASSO selection is promising, showing slightly higher discriminative accuracy than tools developed in other populations. PMID:27512870
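
    For readers unfamiliar with LASSO-based selection in a case–control setting, the sketch below shows the general idea on synthetic data using an L1-penalised logistic regression; the feature matrix, outcome, and penalty strength are placeholders, not the study's data or settings.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 12))                                # candidate risk factors
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=1000) > 0).astype(int)

# L1-penalised logistic regression performs LASSO-style variable selection.
model = LogisticRegression(penalty="l1", solver="liblinear", C=0.1).fit(X, y)
selected = np.flatnonzero(model.coef_[0])
print("selected factors:", selected)
print("in-sample AUC:", round(roc_auc_score(y, model.predict_proba(X)[:, 1]), 3))
```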

  8. Absolute Equilibrium Entropy

    NASA Technical Reports Server (NTRS)

    Shebalin, John V.

    1997-01-01

    The entropy associated with absolute equilibrium ensemble theories of ideal, homogeneous, fluid and magneto-fluid turbulence is discussed and the three-dimensional fluid case is examined in detail. A sigma-function is defined, whose minimum value with respect to global parameters is the entropy. A comparison is made between the use of global functions sigma and phase functions H (associated with the development of various H-theorems of ideal turbulence). It is shown that the two approaches are complementary though conceptually different: H-theorems show that an isolated system tends to equilibrium while sigma-functions allow the demonstration that entropy never decreases when two previously isolated systems are combined. This provides a more complete picture of entropy in the statistical mechanics of ideal fluids.

  9. Absolute multilateration between spheres

    NASA Astrophysics Data System (ADS)

    Muelaner, Jody; Wadsworth, William; Azini, Maria; Mullineux, Glen; Hughes, Ben; Reichold, Armin

    2017-04-01

    Environmental effects typically limit the accuracy of large scale coordinate measurements in applications such as aircraft production and particle accelerator alignment. This paper presents an initial design for a novel measurement technique with analysis and simulation showing that it could overcome the environmental limitations to provide a step change in large scale coordinate measurement accuracy. Referred to as absolute multilateration between spheres (AMS), it involves using absolute distance interferometry to directly measure the distances between pairs of plain steel spheres. A large portion of each sphere remains accessible as a reference datum, while the laser path can be shielded from environmental disturbances. As a single scale bar this can provide accurate scale information to be used for instrument verification or network measurement scaling. Since spheres can be simultaneously measured from multiple directions, it also allows highly accurate multilateration-based coordinate measurements to act as a large scale datum structure for localized measurements, or to be integrated within assembly tooling, coordinate measurement machines or robotic machinery. Analysis and simulation show that AMS can be self-aligned to achieve a theoretical combined standard uncertainty for the independent uncertainties of an individual 1 m scale bar of approximately 0.49 µm. It is also shown that combined with a 1 µm m⁻¹ standard uncertainty in the central reference system this could result in coordinate standard uncertainty magnitudes of 42 µm over a slender 1 m by 20 m network. This would be a sufficient step change in accuracy to enable next generation aerospace structures with natural laminar flow and part-to-part interchangeability.

  10. Absolutely relative or relatively absolute: violations of value invariance in human decision making.

    PubMed

    Teodorescu, Andrei R; Moran, Rani; Usher, Marius

    2016-02-01

    Making decisions based on relative rather than absolute information processing is tied to choice optimality via the accumulation of evidence differences and to canonical neural processing via accumulation of evidence ratios. These theoretical frameworks predict invariance of decision latencies to absolute intensities that maintain differences and ratios, respectively. While information about the absolute values of the choice alternatives is not necessary for choosing the best alternative, it may nevertheless hold valuable information about the context of the decision. To test the sensitivity of human decision making to absolute values, we manipulated the intensities of brightness stimuli pairs while preserving either their differences or their ratios. Although asked to choose the brighter alternative relative to the other, participants responded faster to higher absolute values. Thus, our results provide empirical evidence for human sensitivity to task irrelevant absolute values indicating a hard-wired mechanism that precedes executive control. Computational investigations of several modelling architectures reveal two alternative accounts for this phenomenon, which combine absolute and relative processing. One account involves accumulation of differences with activation dependent processing noise and the other emerges from accumulation of absolute values subject to the temporal dynamics of lateral inhibition. The potential adaptive role of such choice mechanisms is discussed.

  11. MODELING MULTI-WAVELENGTH STELLAR ASTROMETRY. II. DETERMINING ABSOLUTE INCLINATIONS, GRAVITY-DARKENING COEFFICIENTS, AND SPOT PARAMETERS OF SINGLE STARS WITH SIM LITE

    SciTech Connect

    Coughlin, Jeffrey L.; Harrison, Thomas E.; Gelino, Dawn M.

    2010-11-10

    We present a novel technique to determine the absolute inclination of single stars using multi-wavelength submilliarcsecond astrometry. The technique exploits the effect of gravity darkening, which causes a wavelength-dependent astrometric displacement parallel to a star's projected rotation axis. We find that this effect is clearly detectable using SIM Lite for various giant stars and rapid rotators, and present detailed models for multiple systems using the REFLUX code. We also explore the multi-wavelength astrometric reflex motion induced by spots on single stars. We find that it should be possible to determine spot size, relative temperature, and some positional information for both giant and nearby main-sequence stars utilizing multi-wavelength SIM Lite data. These data will be extremely useful in stellar and exoplanet astrophysics, as well as supporting the primary SIM Lite mission through proper multi-wavelength calibration of the giant star astrometric reference frame, and reduction of noise introduced by starspots when searching for extrasolar planets.

  12. Absolute quantification of myosin heavy chain isoforms by selected reaction monitoring can underscore skeletal muscle changes in a mouse model of amyotrophic lateral sclerosis.

    PubMed

    Peggion, Caterina; Massimino, Maria Lina; Biancotto, Giancarlo; Angeletti, Roberto; Reggiani, Carlo; Sorgato, Maria Catia; Bertoli, Alessandro; Stella, Roberto

    2017-03-01

    Skeletal muscle fibers contain different isoforms of myosin heavy chain (MyHC) that define distinctive contractile properties. In light of the muscle capacity to adapt MyHC expression to pathophysiological conditions, a rapid and quantitative assessment of MyHC isoforms in small muscle tissue quantities would represent a valuable diagnostic tool for (neuro)muscular diseases. As past protocols did not meet these requirements, in the present study we applied a targeted proteomic approach based on selected reaction monitoring that allowed the absolute quantification of slow and fast MyHC isoforms in different mouse skeletal muscles with high reproducibility. This mass-spectrometry-based method was validated also in a pathological specimen, by comparison of the MyHC expression profiles in different muscles from healthy mice and a genetic mouse model of amyotrophic lateral sclerosis (ALS) expressing the SOD1(G93A) mutant. This analysis showed that terminally ill ALS mice have a fast-to-slow shift in the fiber type composition of the tibialis anterior and gastrocnemius muscles, as previously reported. These results will likely open the way to accurate and rapid diagnoses of human (neuro)muscular diseases by the proposed method. Graphical Abstract Methods for myosin heavy chain (MyHC) quantification: a comparison of classical methods and selected reaction monitoring (SRM)-based mass spectrometry approaches.

  13. A Risk Assessment Model for Type 2 Diabetes in Chinese

    PubMed Central

    Luo, Senlin; Han, Longfei; Zeng, Ping; Chen, Feng; Pan, Limin; Wang, Shu; Zhang, Tiemei

    2014-01-01

    Aims To develop a risk assessment model for persons at risk of type 2 diabetes among Chinese. Materials and Methods The model was generated from cross-sectional data on 16,246 persons aged 20 years and over. The C4.5 algorithm and multivariate logistic regression were used for variable selection. Relative risk values combined with expert judgment were used to construct a comprehensive risk assessment for evaluating an individual's risk category. The validity of the model was tested by cross-validation and by a survey performed six years later with some of the participants. Results Nine variables were selected as risk variables. A mathematical model was established to calculate the average probability of diabetes in each cluster group divided by sex and age. A set of criteria combining the relative risk value (2.2) and the levels of the risk variables stratified individuals into four risk groups (no, low, medium and high risk). The overall accuracy reached 90.99% as evaluated by cross-validation inside the model population. The incidence of diabetes for each risk group increased from 1.5 (non-risk group) to 28.2 (high-risk group) per thousand persons per year over six years of follow-up. Discussion The model can determine an individual's risk for type 2 diabetes across four risk degrees. It could be used as a technical tool not only to support screening of persons at different levels of risk, but also to evaluate the results of interventions. PMID:25101994

  14. A Risk Management Model for the Federal Acquisition Process.

    DTIC Science & Technology

    1999-06-01

    risk management in the acquisition process. This research explains the Federal Acquisition Process and each of the 78 tasks to be completed by the CO...and examines the concepts of risk and risk management. This research culminates in the development of a model that identifies prevalent risks in the...contracting professionals is used to gather opinions, ideas, and practical applications of risk management in the acquisition process, and refine the model

  15. Integrating Professional and Folk Models of HIV Risk: YMSM's Perceptions of High-Risk Sex

    ERIC Educational Resources Information Center

    Kubicek, Katrina; Carpineto, Julie; McDavitt, Bryce; Weiss, George; Iverson, Ellen F.; Au, Chi-Wai; Kerrone, Dustin; Martinez, Miguel; Kipke, Michele D.

    2008-01-01

    Risks associated with HIV are well documented in research literature. Although a great deal has been written about high-risk sex, little research has been conducted to examine how young men who have sex with men (YMSM) perceive and define high-risk sexual behavior. In this study, we compare the "professional" and "folk" models of HIV risk based on…

  16. Automating risk analysis of software design models.

    PubMed

    Frydman, Maxime; Ruiz, Guifré; Heymann, Elisa; César, Eduardo; Miller, Barton P

    2014-01-01

    The growth of the internet and networked systems has exposed software to an increasing number of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling, two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and to advise on mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance.
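
    The following sketch illustrates what an identification-tree-like structure might look like; the node conditions and threat labels are invented for illustration and are not drawn from AutSEC.

```python
from __future__ import annotations
from dataclasses import dataclass, field

@dataclass
class IdentificationNode:
    condition: str                       # property of a data-flow-diagram element to test
    threat: str | None = None            # threat reported when the condition holds
    children: list[IdentificationNode] = field(default_factory=list)

    def match(self, element: dict) -> list[str]:
        """Return the threats identified for one data-flow-diagram element."""
        threats: list[str] = []
        if element.get(self.condition, False):
            if self.threat:
                threats.append(self.threat)
            for child in self.children:
                threats.extend(child.match(element))
        return threats

tree = IdentificationNode(
    "crosses_trust_boundary",
    children=[IdentificationNode("unencrypted", threat="information disclosure")],
)
print(tree.match({"crosses_trust_boundary": True, "unencrypted": True}))
```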

  17. The globalization of risk and risk perception: why we need a new model of risk communication for vaccines.

    PubMed

    Larson, Heidi; Brocard Paterson, Pauline; Erondu, Ngozi

    2012-11-01

    Risk communication about vaccines is complex, and the nature of risk perception is changing, with perceptions converging, evolving and having impacts well beyond specific geographic localities and points in time, especially when amplified through the Internet and other modes of global communication. This article examines the globalization of risk perceptions and their impacts, including the example of measles and the globalization of measles, mumps and rubella (MMR) vaccine risk perceptions, and calls for a new, more holistic model of risk assessment, risk communication and risk mitigation, embedded in an ongoing process of risk management for vaccines and immunization programmes. It envisions risk communication as an ongoing process that includes trust-building strategies hand-in-hand with operational and policy strategies needed to mitigate and manage vaccine-related risks, as well as perceptions of risk.

  18. Personalized Predictive Modeling and Risk Factor Identification using Patient Similarity.

    PubMed

    Ng, Kenney; Sun, Jimeng; Hu, Jianying; Wang, Fei

    2015-01-01

    Personalized predictive models are customized for an individual patient and trained using information from similar patients. Compared to global models trained on all patients, they have the potential to produce more accurate risk scores and capture more relevant risk factors for individual patients. This paper presents an approach for building personalized predictive models and generating personalized risk factor profiles. A locally supervised metric learning (LSML) similarity measure is trained for diabetes onset and used to find clinically similar patients. Personalized risk profiles are created by analyzing the parameters of the trained personalized logistic regression models. A 15,000 patient data set, derived from electronic health records, is used to evaluate the approach. The predictive results show that the personalized models can outperform the global model. Cluster analysis of the risk profiles shows groups of patients with similar risk factors, differences in the top risk factors for different groups of patients, and differences between the individual and global risk factors.
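
    A greatly simplified sketch of the personalization idea is shown below, substituting a plain Euclidean nearest-neighbour search for the paper's LSML metric and using synthetic data; the cohort size, neighbourhood size, and labels are assumptions.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(1)
X = rng.normal(size=(15000, 20))                              # synthetic patient features
y = (X[:, 0] + X[:, 1] + rng.normal(size=15000) > 0).astype(int)   # synthetic onset label

index_patient = X[:1]                       # the patient we want a personalized score for
nn = NearestNeighbors(n_neighbors=500).fit(X)
_, idx = nn.kneighbors(index_patient)
neighbors = idx[0]                          # the 500 most similar patients

# Fit a local (personalized) model on the similar patients only.
personal_model = LogisticRegression(max_iter=1000).fit(X[neighbors], y[neighbors])
print("personalized risk:", round(personal_model.predict_proba(index_patient)[0, 1], 3))
print("top local risk factors:", np.argsort(-np.abs(personal_model.coef_[0]))[:5])
```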

  19. Assessing patients' risk of febrile neutropenia: is there a correlation between physician-assessed risk and model-predicted risk?

    PubMed

    Lyman, Gary H; Dale, David C; Legg, Jason C; Abella, Esteban; Morrow, Phuong Khanh; Whittaker, Sadie; Crawford, Jeffrey

    2015-08-01

    This study evaluated the correlation between the risk of febrile neutropenia (FN) estimated by physicians and the risk of severe neutropenia or FN predicted by a validated multivariate model in patients with nonmyeloid malignancies receiving chemotherapy. Before patient enrollment, physician and site characteristics were recorded, and physicians self-reported the FN risk at which they would typically consider granulocyte colony-stimulating factor (G-CSF) primary prophylaxis (FN risk intervention threshold). For each patient, physicians electronically recorded their estimated FN risk, orders for G-CSF primary prophylaxis (yes/no), and patient characteristics for model predictions. Correlations between physician-assessed FN risk and model-predicted risk (primary endpoints) and between physician-assessed FN risk and G-CSF orders were calculated. Overall, 124 community-based oncologists registered; 944 patients initiating chemotherapy with intermediate FN risk enrolled. Median physician-assessed FN risk over all chemotherapy cycles was 20.0%, and median model-predicted risk was 17.9%; the correlation was 0.249 (95% CI, 0.179-0.316). The correlation between physician-assessed FN risk and subsequent orders for G-CSF primary prophylaxis (n = 634) was 0.313 (95% CI, 0.135-0.472). Among patients with a physician-assessed FN risk ≥ 20%, 14% did not receive G-CSF orders. G-CSF was not ordered for 16% of patients at or above their physician's self-reported FN risk intervention threshold (median, 20.0%) and was ordered for 21% below the threshold. Physician-assessed FN risk and model-predicted risk correlated weakly; however, there was moderate correlation between physician-assessed FN risk and orders for G-CSF primary prophylaxis. Further research and education on FN risk factors and appropriate G-CSF use are needed.

  20. Urban Drainage Modeling and Flood Risk Management

    NASA Astrophysics Data System (ADS)

    Schmitt, Theo G.; Thomas, Martin

    The European research project RisUrSim (Σ!2255), conducted in the EUREKA framework, was carried out by a project consortium including industrial mathematics and water engineering research institutes, municipal drainage works, and an insurance company. The overall objective was the development of a simulation tool to allow flood risk analysis and cost-effective management of urban drainage systems. In view of the regulatory background of European Standard EN 752, the phenomenon of urban flooding caused by surcharged sewer systems is analyzed, leading to the necessity of dual drainage modeling. A detailed dual drainage simulation model is described, based upon hydraulic flow routing procedures for surface flow and pipe flow. Special consideration is given to the interaction between surface and sewer flow in order to compute water levels above ground as accurately as possible, as a basis for further assessment of possible damage costs. The model application is presented for a small case study in terms of data needs, model verification, and first simulation results.

  1. Influence of physical and chemical properties of HTSXT-FTIR samples on the quality of prediction models developed to determine absolute concentrations of total proteins, carbohydrates and triglycerides: a preliminary study on the determination of their absolute concentrations in fresh microalgal biomass.

    PubMed

    Serrano León, Esteban; Coat, Rémy; Moutel, Benjamin; Pruvost, Jérémy; Legrand, Jack; Gonçalves, Olivier

    2014-11-01

    Absolute concentrations of total macromolecules (triglycerides, proteins and carbohydrates) in microorganisms can be rapidly measured by FTIR spectroscopy, but caution is needed to avoid non-specific experimental bias. Here, we assess the limits within which this approach can be used on model solutions of macromolecules of interest. We used the Bruker HTSXT-FTIR system. Our results show that the solid deposits obtained after the sampling procedure present physical and chemical properties that influence the quality of the absolute concentration prediction models (univariate and multivariate). The accuracy of the models was degraded by a factor of 2 or 3 outside the recommended concentration interval of 0.5-35 µg spot⁻¹. Changes occurred notably in the sample's hydrogen bond network, which could, however, be controlled using an internal probe (a pseudohalide anion). We also demonstrate that for aqueous solutions, accurate prediction of total carbohydrate quantities (in glucose equivalent) could not be made unless a constant amount of protein (BSA) was added to the model solution. The results of the prediction model for more complex solutions, here with two components (glucose and BSA), were very encouraging, suggesting that this FTIR approach could be used as a rapid quantification method for mixtures of molecules of interest, provided the limits of use of the HTSXT-FTIR method are precisely known and respected. This last finding opens the way to direct quantification of total molecules of interest in more complex matrices.

  2. Measures of Residual Risk with Connections to Regression, Risk Tracking, Surrogate Models, and Ambiguity

    DTIC Science & Technology

    2015-05-20

    Measures of Residual Risk with Connections to Regression, Risk Tracking, Surrogate Models, and Ambiguity. R. Tyrrell Rockafellar, Johannes O. Royset... Measures of residual risk are developed as an extension of measures of risk. They view a random variable of interest in concert with an auxiliary random...forecasting and generalized regression. We establish the fundamental properties in this framework and show that measures of residual risk along with

  3. Estimating Absolute Site Effects

    SciTech Connect

    Malagnini, L; Mayeda, K M; Akinci, A; Bragato, P L

    2004-07-15

    The authors use previously determined direct-wave attenuation functions as well as stable, coda-derived source excitation spectra to isolate the absolute S-wave site effect for the horizontal and vertical components of weak ground motion. They used selected stations in the seismic network of the eastern Alps, and find the following: (1) all "hard rock" sites exhibited deamplification phenomena due to absorption at frequencies ranging between 0.5 and 12 Hz (the available bandwidth), on both the horizontal and vertical components; (2) "hard rock" site transfer functions showed large variability at high frequency; (3) vertical-motion site transfer functions show strong frequency dependence; (4) H/V spectral ratios do not reproduce the characteristics of the true horizontal site transfer functions; and (5) traditional, relative site terms obtained by using reference "rock sites" can be misleading in inferring the behavior of true site transfer functions, since most rock sites have non-flat responses due to shallow heterogeneities resulting from varying degrees of weathering. They also use their stable source spectra to estimate total radiated seismic energy and compare against previous results. They find that the earthquakes in this region exhibit non-constant dynamic stress drop scaling, which gives further support for a fundamental difference in rupture dynamics between small and large earthquakes. To correct the vertical and horizontal S-wave spectra for attenuation, they used detailed regional attenuation functions derived by Malagnini et al. (2002), who determined frequency-dependent geometrical spreading and Q for the region. These corrections account for the gross path effects (i.e., all distance-dependent effects), although the source and site effects are still present in the distance-corrected spectra. The main goal of this study is to isolate the absolute site effect (as a function of frequency) by removing the source spectrum (moment-rate spectrum) from

  4. Risk assessment compatible fire models (RACFMs)

    SciTech Connect

    Lopez, A.R.; Gritzo, L.A.; Sherman, M.P.

    1998-07-01

    A suite of Probabilistic Risk Assessment Compatible Fire Models (RACFMs) has been developed to represent the hazard posed by a pool fire to weapon systems transported on the B52-H aircraft. These models represent both stand-off (i.e., the weapon system is outside of the flame zone but exposed to the radiant heat load from fire) and fully-engulfing scenarios (i.e., the object is fully covered by flames). The approach taken in developing the RACFMs for both scenarios was to consolidate, reconcile, and apply data and knowledge from all available resources including: data and correlations from the literature, data from an extensive full-scale fire test program at the Naval Air Warfare Center (NAWC) at China Lake, and results from a fire field model (VULCAN). In the past, a single, effective temperature, T_f, was used to represent the fire. The heat flux to an object exposed to a fire was estimated using the relationship for black body radiation, σT_f^4. Significant improvements have been made by employing the present approach which accounts for the presence of temperature distributions in fully-engulfing fires, and uses best available correlations to estimate heat fluxes in stand-off scenarios.
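
    For orientation, the single-temperature representation mentioned above amounts to the black-body relation q = σT_f^4; the short sketch below evaluates it for an assumed effective fire temperature of 1000 K (an illustrative value, not one from the report).

```python
SIGMA = 5.670e-8          # Stefan-Boltzmann constant, W m^-2 K^-4

def blackbody_flux(t_fire_kelvin: float) -> float:
    """Radiant heat flux sigma * T_f**4 in W/m^2 for an effective fire temperature."""
    return SIGMA * t_fire_kelvin ** 4

print(f"{blackbody_flux(1000.0) / 1000.0:.1f} kW/m^2")   # about 56.7 kW/m^2
```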

  5. Galactic Cosmic Ray Event-Based Risk Model (GERM) Code

    NASA Technical Reports Server (NTRS)

    Cucinotta, Francis A.; Plante, Ianik; Ponomarev, Artem L.; Kim, Myung-Hee Y.

    2013-01-01

    This software describes the transport and energy deposition of galactic cosmic rays passing through astronaut tissues during space travel, or of heavy ion beams in patients in cancer therapy. Space radiation risk is a probability distribution, and time-dependent biological events must be accounted for in the physical description of space radiation transport in tissues and cells. A stochastic model can calculate the probability density directly without unverified assumptions about the shape of the probability density function. The prior art of transport codes calculates the average flux and dose of particles behind spacecraft and tissue shielding. Because of the signaling times for activation and relaxation in the cell and tissue, a transport code must describe the temporal and microspatial density of functions to correlate DNA and oxidative damage with non-targeted effects such as signaling and bystander effects; these are ignored or impossible in the prior art. The GERM code provides scientists with data interpretation of experiments; modeling of the beam line, shielding of target samples, and sample holders; and estimation of basic physical and biological outputs of their experiments. For mono-energetic ion beams, basic physical and biological properties are calculated for a selected ion type, such as kinetic energy, mass, charge number, absorbed dose, or fluence. Evaluated quantities are linear energy transfer (LET), range (R), absorption and fragmentation cross-sections, and the probability of nuclear interactions after 1 or 5 cm of water-equivalent material. In addition, a set of biophysical properties is evaluated, such as the Poisson distribution for a specified cellular area, cell survival curves, and DNA damage yields per cell. Also, the GERM code calculates the radiation transport of the beam line for either a fixed number of user-specified depths or at multiple positions along the Bragg curve of the particle in a selected material. The GERM code makes the numerical estimates of basic

  6. Absolute near-infrared oximetry for urology: a quantitative study of the tissue hemoglobin saturation before and after testicular torsion in a rabbit model

    NASA Astrophysics Data System (ADS)

    Hallacoglu, Bertan; Matulewicz, Richard S.; Paltiel, Harriet J.; Padua, Horacio; Gargollo, Patricio; Cannon, Glenn; Alomari, Ahmad; Sassaroli, Angelo; Fantini, Sergio

    2009-02-01

    We present an experimental study on four rabbits to demonstrate the feasibility of near-infrared spectroscopy in the noninvasive assessment of testicular torsion. We used a multi-distance frequency-domain method, based on a fixed detector position and a 9-mm linear scan of the illumination optical fibers, to measure absolute values of pre- and post-operative testicular oxygen saturation. Unilateral testicular torsions (by 0°, 540° or 720°) on experimental testes and contralateral sham surgeries (no torsion) on control testes were performed and studied. Our results showed (a) a consistent baseline absolute tissue oxygen saturation value of 78% +/- 5%; (b) a comparable absolute saturation of 77% +/- 6% on the control side (testes after sham surgery); and (c) a significantly lower tissue oxygen saturation of 36% +/- 2% on the experimental side (testes after 540° or 720° torsion surgery). These results demonstrate the capability of frequency-domain near-infrared spectroscopy in the assessment of absolute testicular hemoglobin desaturation caused by torsion, and show promise as a potential method to serve as a complement to conventional color and spectral Doppler ultrasonography.

  7. Radiation risk modeling of thyroid cancer with special emphasis on the Chernobyl epidemiological data.

    PubMed

    Walsh, L; Jacob, P; Kaiser, J C

    2009-10-01

    Two recent studies analyzed thyroid cancer incidence in Belarus and Ukraine during the period from 1990 to 2001, for the birth cohort 1968 to 1985, and the related (131)I exposure associated with the Chernobyl accident in 1986. Contradictory age-at-exposure and time-since-exposure effect modifications of the excess relative risk (ERR) were reported. The present study identifies the choice of baseline modeling method as the reason for the conflicting results. Various quality-of-fit criteria favor a parametric baseline model to various categorical baseline models. The model with a parametric baseline results in a decrease of the ERR by a factor of about 0.2 from an age at exposure of 5 years to an age at exposure of 15 years (for a time since exposure of 12 years) and a decrease of the ERR from a time since exposure of 4 years to a time since exposure of 14 years of about 0.25 (for an age at exposure of 10 years). Central ERR estimates (of about 20 at 1 Gy for an age at exposure of 10 years and an attained age of 20 years) and their ratios for females compared to males (about 0.3) turn out to be relatively independent of the modeling. Excess absolute risk estimates are also predicted to be very similar from the different models. Risk models with parametric and categorical baselines were also applied to thyroid cancer incidence among the atomic bomb survivors. For young ages at exposure, the ERR values in the model with a parametric baseline are larger. Both data sets cover the period of 12 to 15 years since exposure. For this period, higher ERR values and a stronger age-at-exposure modification are found for the Chernobyl data set. Based on the results of the study, it is recommended to test parametric and categorical baseline models in risk analyses.

  8. Absolute nuclear material assay using count distribution (LAMBDA) space

    DOEpatents

    Prasad, Mano K.; Snyderman, Neal J.; Rowland, Mark S.

    2015-12-01

    A method of absolute nuclear material assay of an unknown source comprising counting neutrons from the unknown source and providing an absolute nuclear material assay utilizing a model to optimally compare to the measured count distributions. In one embodiment, the step of providing an absolute nuclear material assay comprises utilizing a random sampling of analytically computed fission chain distributions to generate a continuous time-evolving sequence of event-counts by spreading the fission chain distribution in time.

  9. Absolute nuclear material assay using count distribution (LAMBDA) space

    DOEpatents

    Prasad, Manoj K [Pleasanton, CA; Snyderman, Neal J [Berkeley, CA; Rowland, Mark S [Alamo, CA

    2012-06-05

    A method of absolute nuclear material assay of an unknown source comprising counting neutrons from the unknown source and providing an absolute nuclear material assay utilizing a model to optimally compare to the measured count distributions. In one embodiment, the step of providing an absolute nuclear material assay comprises utilizing a random sampling of analytically computed fission chain distributions to generate a continuous time-evolving sequence of event-counts by spreading the fission chain distribution in time.

  10. Modeling biotic habitat high risk areas

    USGS Publications Warehouse

    Despain, D.G.; Beier, P.; Tate, C.; Durtsche, B.M.; Stephens, T.

    2000-01-01

    Fire, especially stand-replacing fire, poses a threat to many threatened and endangered species as well as their habitat. On the other hand, fire is important in maintaining a variety of successional stages that can be important for these species. A risk assessment approach can assist in prioritizing areas for the allocation of fire mitigation funds. One example looks at assessing risk to the species and biotic communities of concern followed by the Colorado Natural Heritage Program. Another looks at the risk to Mexican spotted owls, another at the risk to cutthroat trout, and a fourth considers the general effects of fire on elk.

  11. SMALL POPULATIONS REQUIRE SPECIFIC MODELING APPROACHES FOR ASSESSING RISK

    EPA Science Inventory

    All populations face non-zero risks of extinction. However, the risks for small populations, and therefore the modeling approaches necessary to predict them, are different from those of large populations. These differences are currently hindering assessment of risk to small pop...

  12. A new explained-variance based genetic risk score for predictive modeling of disease risk.

    PubMed

    Che, Ronglin; Motsinger-Reif, Alison A

    2012-09-25

    The goal of association mapping is to identify genetic variants that predict disease, and as the field of human genetics matures, the number of successful association studies is increasing. Many such studies have shown that for many diseases, risk is explained by a reasonably large number of variants that each explains a very small amount of disease risk. This is prompting the use of genetic risk scores in building predictive models, where information across several variants is combined for predictive modeling. In the current study, we compare the performance of four previously proposed genetic risk score methods and present a new method for constructing a genetic risk score that incorporates explained-variance information. The methods compared include: a simple count genetic risk score, an odds-ratio weighted genetic risk score, a direct logistic regression genetic risk score, a polygenic genetic risk score, and the new explained-variance weighted genetic risk score. We compare the methods using a wide range of simulations in two steps, with a range of numbers of deleterious single nucleotide polymorphisms (SNPs) explaining disease risk, genetic modes, baseline penetrances, sample sizes, relative risks (RR), and minor allele frequencies (MAF). Several measures of model performance were compared including overall power, C-statistic and Akaike's Information Criterion. Our results show that the relative performance of the methods differs significantly, with the new explained-variance weighted GRS (EV-GRS) generally performing favorably compared with the other methods.
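
    To make the distinction between the score types concrete, the sketch below contrasts a simple count GRS with a weighted GRS; the genotypes and per-allele weights are placeholders (an explained-variance weighted score would substitute per-SNP explained-variance estimates for these weights).

```python
import numpy as np

genotypes = np.array([0, 1, 2, 1, 0])               # risk-allele counts at 5 SNPs
weights = np.array([0.10, 0.25, 0.05, 0.40, 0.15])  # e.g. per-allele log odds ratios

count_grs = int(genotypes.sum())            # simple count GRS: every allele weighted equally
weighted_grs = float(genotypes @ weights)   # weighted GRS: alleles weighted by effect size

print(count_grs, round(weighted_grs, 3))    # 4 0.75
```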

  13. Cryogenic, Absolute, High Pressure Sensor

    NASA Technical Reports Server (NTRS)

    Chapman, John J. (Inventor); Shams, Qamar A. (Inventor); Powers, William T. (Inventor)

    2001-01-01

    A pressure sensor is provided for cryogenic, high pressure applications. A highly doped silicon piezoresistive pressure sensor is bonded to a silicon substrate in an absolute pressure sensing configuration. The absolute pressure sensor is bonded to an aluminum nitride substrate. Aluminum nitride has an appropriate coefficient of thermal expansion for use with highly doped silicon at cryogenic temperatures. A group of sensors, either two sensors on two substrates or four sensors on a single substrate, is packaged in a pressure vessel.

  14. Risk management modeling and its application in maritime safety

    NASA Astrophysics Data System (ADS)

    Qin, Ting-Rong; Chen, Wei-Jiong; Zeng, Xiang-Kun

    2008-12-01

    Quantified risk assessment (QRA) requires the mathematicization of risk theory. However, attention has been paid almost exclusively to applications of assessment methods, which has led to neglect of research into fundamental theories, such as the relationships among risk, safety, danger, and so on. To address this problem, as a first step, fundamental theoretical relationships between risk and risk management were analyzed in this paper in the light of mathematics, and then illustrated with some charts. Second, man-machine-environment-management (MMEM) theory was introduced into risk theory to analyze some properties of risk. On this basis, a three-dimensional model of risk management was established that includes a goal dimension, a management dimension, and an operation dimension. This goal-management-operation (GMO) model was explained, and emphasis was laid on the discussion of the risk flowchart (operation dimension), which lays the groundwork for further study of risk management and qualitative and quantitative assessment. Next, the relationship between Formal Safety Assessment (FSA) and risk management was examined. This revealed that the FSA method, which the International Maritime Organization (IMO) is actively promoting, derives from risk management theory. Finally, conclusions were drawn about how to apply this risk management method to concrete fields efficiently and conveniently, as well as areas where further research is required.

  15. Concentration of Risk Model (CORM) Verification and Analysis

    DTIC Science & Technology

    2014-06-15

    Mental Health and using data from a repository at the University of Michigan, had attempted to identify soldiers at higher-than-average risk of suicide ...

  16. A risk analysis model for radioactive wastes.

    PubMed

    Külahcı, Fatih

    2011-07-15

    Hazardous wastes affect natural environmental systems to a significant extent, and therefore it is necessary to control their harm through risk analysis. Herein, an effective risk methodology is proposed by considering their uncertain behaviors on stochastic, statistical and probabilistic bases. The basic element is the attachment of a convenient probability distribution function (pdf) to a given waste quality measurement sequence. In this paper, ⁴⁰K contaminant measurements are adapted for a risk assessment application after derivation of the necessary fundamental formulations. The spatial contaminant distribution of ⁴⁰K is presented in the form of maps and three-dimensional surfaces.

  17. [Model for early childhood caries risks].

    PubMed

    Dimitrova, M; Kukleva, M

    2008-01-01

    Risk factors for early childhood caries were studied in 406 children aged 12-47 months. The results showed that pathological pregnancy, sleeping with a bottle of blend or sweet liquid, consumption of candy and caramel on sticks, and consumption of sour-sweet fruit juices were significant factors leading to early childhood caries. When all these risk factors acted simultaneously, the use of sour-sweet fruit juices was dominant. The probability of caries occurrence under the simultaneous action of all these risk factors was 62%.

  18. Source-to-Outcome Microbial Exposure and Risk Modeling Framework

    EPA Science Inventory

    A Quantitative Microbial Risk Assessment (QMRA) is a computer-based data-delivery and modeling approach that integrates interdisciplinary fate/transport, exposure, and impact models and databases to characterize potential health impacts/risks due to pathogens. As such, a QMRA ex...

  19. Integrating Household Risk Mitigation Behavior in Flood Risk Analysis: An Agent-Based Model Approach.

    PubMed

    Haer, Toon; Botzen, W J Wouter; de Moel, Hans; Aerts, Jeroen C J H

    2016-11-28

    Recent studies showed that climate change and socioeconomic trends are expected to increase flood risks in many regions. However, in these studies, human behavior is commonly assumed to be constant, which neglects interaction and feedback loops between human and environmental systems. This neglect of human adaptation leads to a misrepresentation of flood risk. This article presents an agent-based model that incorporates human decision making in flood risk analysis. In particular, household investments in loss-reducing measures are examined under three economic decision models: (1) expected utility theory, which is the traditional economic model of rational agents; (2) prospect theory, which takes account of bounded rationality; and (3) a prospect theory model, which accounts for changing risk perceptions and social interactions through a process of Bayesian updating. We show that neglecting human behavior in flood risk assessment studies can result in a considerable misestimation of future flood risk, which in our case study is an overestimation by a factor of two. Furthermore, we show how behavior models can support flood risk analysis under different behavioral assumptions, illustrating the need to include the dynamic adaptive human behavior of, for instance, households, insurers, and governments. The method presented here provides a solid basis for exploring human behavior and the resulting flood risk with respect to low-probability/high-impact risks.
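
    As a minimal sketch of the first decision rule (expected utility theory), the snippet below compares expected log-utility with and without an investment in a loss-reducing measure for a single household; all monetary values, probabilities, and the damage-reduction factor are invented.

```python
import math

def expected_utility(wealth: float, flood_prob: float, damage: float, invest: bool,
                     cost: float = 5_000.0, damage_reduction: float = 0.4) -> float:
    """Expected log-utility of end-of-year wealth, with or without the mitigation measure."""
    d = damage * (1 - damage_reduction) if invest else damage
    w = wealth - (cost if invest else 0.0)
    return (1 - flood_prob) * math.log(w) + flood_prob * math.log(w - d)

wealth, p_flood, damage = 200_000.0, 0.01, 50_000.0
invest = expected_utility(wealth, p_flood, damage, True) > expected_utility(wealth, p_flood, damage, False)
print("household invests in mitigation:", invest)   # False at this low flood probability
```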

  20. Extended risk-analysis model for activities of the project.

    PubMed

    Kušar, Janez; Rihar, Lidija; Zargi, Urban; Starbek, Marko

    2013-12-01

    Project management of product/service orders has become a mode of operation in many companies. Although these are mostly cyclically recurring projects, risk management is very important for them. An extended risk-analysis model for new product/service projects is presented in this paper. The emphasis is on a solution developed at the Faculty of Mechanical Engineering in Ljubljana, Slovenia. The usual risk analysis of project activities is based on evaluation of the probability that risk events occur and on evaluation of their consequences. A third parameter has been added in our model: an estimate of the incidence of risk events. On the basis of the calculated activity risk level, a project team prepares preventive and corrective measures that should be taken according to the status indicators. An important advantage of the proposed solution is that the project manager and the team members are warned of risk events in a timely manner and can thus activate the envisaged preventive and corrective measures as necessary.
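
    An illustrative reading of the three-parameter idea is sketched below, multiplying probability, consequence, and incidence ratings into an activity risk level; the 1-5 scales and the threshold are assumptions, not the authors' calibration.

```python
def activity_risk_level(probability: int, consequence: int, incidence: int) -> int:
    """Risk level as the product of three 1-5 ratings (maximum 125)."""
    return probability * consequence * incidence

level = activity_risk_level(probability=3, consequence=4, incidence=2)
print(level, "-> prepare corrective measures" if level >= 20 else "-> monitor")
```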

  1. Quantified Risk Ranking Model for Condition-Based Risk and Reliability Centered Maintenance

    NASA Astrophysics Data System (ADS)

    Chattopadhyaya, Pradip Kumar; Basu, Sushil Kumar; Majumdar, Manik Chandra

    2016-03-01

    In the recent past, a risk and reliability centered maintenance (RRCM) framework was introduced with a shift in methodological focus from reliability and probabilities (expected values) to reliability, uncertainty and risk. In this paper the authors explain a novel methodology for quantifying risk and ranking critical items in order to prioritize maintenance actions on the basis of condition-based risk and reliability centered maintenance (CBRRCM). The critical items are identified through criticality analysis of the RPN values of the items of a system, and the maintenance significant precipitating factors (MSPF) of the items are evaluated. The criticality of risk is assessed using three risk coefficients. The likelihood risk coefficient treats the probability as a fuzzy number. The abstract risk coefficient deduces risk influenced by uncertainty and sensitivity, among other factors. The third risk coefficient, called the hazardous risk coefficient, reflects anticipated hazards that may occur in the future; this risk is deduced from criteria of consequences for safety, the environment, maintenance and economics, with corresponding costs for the consequences. The characteristic values of all three risk coefficients are obtained with a particular test. With a few more tests on the system, the values may change significantly within the controlling range of each coefficient; hence, random-number simulation is used to obtain one distinctive value for each coefficient. The risk coefficients are statistically added to obtain the final risk coefficient of each critical item, and the final rankings of the critical items are then estimated. The prioritized ranking of critical items using the developed mathematical model for risk assessment should be useful in minimizing financial losses and optimizing the timing of maintenance actions.

  2. Evaluation of cluster recovery for small area relative risk models.

    PubMed

    Rotejanaprasert, Chawarat

    2014-12-01

    The analysis of disease risk is often considered via relative risk. The comparison of relative risk estimation methods with "true risk" scenarios has been considered on various occasions. However, there has been little examination of how well competing methods perform when the focus is clustering of risk. In this paper, a simulated evaluation of a range of potential spatial risk models and a range of measures that can be used for (a) cluster goodness of fit, (b) cluster diagnostics are considered. Results suggest that exceedence probability is a poor measure of hot spot clustering because of model dependence, whereas residual-based methods are less model dependent and perform better. Local deviance information criteria measures perform well, but conditional predictive ordinate measures yield a high false positive rate.

  3. A Team Mental Model Perspective of Pre-Quantitative Risk

    NASA Technical Reports Server (NTRS)

    Cooper, Lynne P.

    2011-01-01

    This study was conducted to better understand how teams conceptualize risk before it can be quantified, and the processes by which a team forms a shared mental model of this pre-quantitative risk. Using an extreme case, this study analyzes seven months of team meeting transcripts, covering the entire lifetime of the team. Through an analysis of team discussions, a rich and varied structural model of risk emerges that goes significantly beyond classical representations of risk as the product of a negative consequence and a probability. In addition to those two fundamental components, the team conceptualization includes the ability to influence outcomes and probabilities, networks of goals, interaction effects, and qualitative judgments about the acceptability of risk, all affected by associated uncertainties. In moving from individual to team mental models, team members employ a number of strategies to gain group recognition of risks and to resolve or accept differences.

  4. A comprehensive Network Security Risk Model for process control networks.

    PubMed

    Henry, Matthew H; Haimes, Yacov Y

    2009-02-01

    The risk of cyber attacks on process control networks (PCN) is receiving significant attention due to the potentially catastrophic extent to which PCN failures can damage the infrastructures and commodity flows that they support. Risk management addresses the coupled problems of (1) reducing the likelihood that cyber attacks would succeed in disrupting PCN operation and (2) reducing the severity of consequences in the event of PCN failure or manipulation. The Network Security Risk Model (NSRM) developed in this article provides a means of evaluating the efficacy of candidate risk management policies by modeling the baseline risk and assessing expectations of risk after the implementation of candidate measures. Where existing risk models fall short of providing adequate insight into the efficacy of candidate risk management policies due to shortcomings in their structure or formulation, the NSRM provides model structure and an associated modeling methodology that captures the relevant dynamics of cyber attacks on PCN for risk analysis. This article develops the NSRM in detail in the context of an illustrative example.

  5. Intervention models for mothers and children at risk for injuries.

    PubMed

    Gulotta, C S; Finney, J W

    2000-03-01

    We review risk factors commonly associated with childhood unintentional injuries and highlight adolescent mothers and their young children as a high-risk group. Several intervention models of injury, including the epidemiological model, Peterson and Brown's "working model," and the socioecological model, have been proposed to explain the events that lead to injuries. A discussion of these models is provided, and a synthesis of the adolescent parenting model and the socioecological model of injury is suggested as a way to address the complex variables that lead to an injury-causing event for adolescent mothers and their young children. Finally, we suggest areas for future investigation and their implications for prevention and treatment.

  6. The effects of vehicle model and driver behavior on risk.

    PubMed

    Wenzel, Thomas P; Ross, Marc

    2005-05-01

    We study the dependence of risk on vehicle type and especially on vehicle model. Here, risk is measured by the number of driver fatalities per year per million vehicles registered. We analyze both the risk to the drivers of each vehicle model and the risk the vehicle model imposes on drivers of other vehicles with which it crashes. The "combined risk" associated with each vehicle model is simply the sum of the risk-to-drivers in all kinds of crashes and the risk-to-drivers-of-other-vehicles in two-vehicle crashes. We find that most car models are as safe to their drivers as most sport utility vehicles (SUVs); the increased risk of a rollover in an SUV roughly balances the higher risk for cars that collide with SUVs and pickup trucks. We find that SUVs, and to a greater extent pickup trucks, impose much greater risks than cars on drivers of other vehicles, and these risks increase with increasing pickup size. The higher aggressivity of SUVs and pickups makes their combined risk higher than that of almost all cars. Effects of light truck design on their risk are revealed by the analysis of specific models: new unibody (or "crossover") SUVs appear, in preliminary analysis, to have much lower risks than the most popular truck-based SUVs. Much has been made in the past about the high risk of low-mass cars in certain kinds of collisions. We find there are other plausible explanations for this pattern of risk, which suggests that mass may not be fundamental to safety. While not conclusive, this is potentially important because improvement in fuel economy is a major goal for designers of new vehicles. We find that accounting for the most risky drivers, young males and the elderly, does not change our general results. Similarly, we find with California data that the high risk of rural driving and the high level of rural driving by pickups do not increase the risk-to-drivers of pickups relative to that for cars. However, other more subtle differences in drivers and the

  7. The Integrated Medical Model: Statistical Forecasting of Risks to Crew Health and Mission Success

    NASA Technical Reports Server (NTRS)

    Fitts, M. A.; Kerstman, E.; Butler, D. J.; Walton, M. E.; Minard, C. G.; Saile, L. G.; Toy, S.; Myers, J.

    2008-01-01

    The Integrated Medical Model (IMM) helps capture and use organizational knowledge across the space medicine, training, operations, engineering, and research domains. The IMM uses this domain knowledge in the context of a mission and crew profile to forecast crew health and mission success risks. The IMM is most helpful in comparing the risk of two or more mission profiles, not as a tool for predicting absolute risk. The process of building the IMM adheres to Probabilistic Risk Assessment (PRA) techniques described in NASA Procedural Requirement (NPR) 8705.5, and uses current evidence-based information to establish a defensible position for making decisions that help ensure crew health and mission success. The IMM quantitatively describes the following input parameters: 1) medical conditions and likelihood, 2) mission duration, 3) vehicle environment, 4) crew attributes (e.g., age, sex), 5) crew activities (e.g., EVAs, lunar excursions), 6) diagnosis and treatment protocols (e.g., medical equipment, consumables, pharmaceuticals), and 7) Crew Medical Officer (CMO) training effectiveness. It is worth reiterating that the IMM uses the data sets above as inputs. Many other risk management efforts stop at determining only likelihood. The IMM is unique in that it models not only likelihood, but risk mitigations, as well as subsequent clinical outcomes based on those mitigations. Once the mathematical relationships among the above parameters are established, the IMM uses a Monte Carlo simulation technique (a random sampling of the inputs as described by their statistical distribution) to determine the probable outcomes. Because the IMM is a stochastic model (i.e., the input parameters are represented by various statistical distributions depending on the data type), when the mission is simulated 10-50,000 times with a given set of medical capabilities (risk mitigations), a prediction of the most probable outcomes can be generated. For each mission, the IMM tracks which conditions
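
    A heavily simplified Monte Carlo sketch of this simulation idea is given below: medical conditions occur at assumed Poisson rates and are resolved onboard with assumed probabilities, and a simulated mission succeeds if no unresolved condition occurs. The condition names, rates, and probabilities are invented and are not IMM inputs.

```python
import numpy as np

rng = np.random.default_rng(42)

CONDITIONS = {                 # (expected events per mission, P(resolved onboard))
    "minor injury":   (0.50, 0.99),
    "kidney stone":   (0.02, 0.60),
    "dental problem": (0.10, 0.85),
}

def mission_success() -> bool:
    """True if no condition occurs that the onboard capability fails to resolve."""
    for rate, p_resolved in CONDITIONS.values():
        n_events = rng.poisson(rate)
        if n_events and (rng.random(n_events) > p_resolved).any():
            return False
    return True

n_runs = 20_000
successes = sum(mission_success() for _ in range(n_runs))
print(f"estimated probability of mission success: {successes / n_runs:.3f}")
```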

  8. Assessing Academic Risk of Student-Athletes: Applicability of the NCAA Graduation Risk Overview Model to GPA

    ERIC Educational Resources Information Center

    Johnson, James

    2013-01-01

    In an effort to standardize academic risk assessment, the NCAA developed the graduation risk overview (GRO) model. Although this model was designed to assess graduation risk, its ability to predict grade-point average (GPA) remained unknown. Therefore, 134 individual risk assessments were made to determine GRO model effectiveness in the…

  9. Absolute Humidity and the Seasonality of Influenza (Invited)

    NASA Astrophysics Data System (ADS)

    Shaman, J. L.; Pitzer, V.; Viboud, C.; Grenfell, B.; Goldstein, E.; Lipsitch, M.

    2010-12-01

    Much of the observed wintertime increase of mortality in temperate regions is attributed to seasonal influenza. A recent re-analysis of laboratory experiments indicates that absolute humidity strongly modulates the airborne survival and transmission of the influenza virus. Here we show that the onset of increased wintertime influenza-related mortality in the United States is associated with anomalously low absolute humidity levels during the prior weeks. We then use an epidemiological model, in which observed absolute humidity conditions temper influenza transmission rates, to successfully simulate the seasonal cycle of observed influenza-related mortality. The model results indicate that direct modulation of influenza transmissibility by absolute humidity alone is sufficient to produce this observed seasonality. These findings provide epidemiological support for the hypothesis that absolute humidity drives seasonal variations of influenza transmission in temperate regions. In addition, we show that variations of the basic and effective reproductive numbers for influenza, caused by seasonal changes in absolute humidity, are consistent with the general timing of pandemic influenza outbreaks observed for 2009 A/H1N1 in temperate regions. Indeed, absolute humidity conditions correctly identify the region of the United States vulnerable to a third, wintertime wave of pandemic influenza. These findings suggest that the timing of pandemic influenza outbreaks is controlled by a combination of absolute humidity conditions, levels of susceptibility and changes in population mixing and contact rates.
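
    A compact way to see the mechanism described above is an SIRS model whose basic reproductive number declines exponentially with specific humidity. The sketch below uses that functional form with illustrative parameter values (R0 range, humidity sensitivity, immunity duration); they are assumptions for demonstration, not the fitted values from the study.

```python
# Sketch of an SIRS model in which absolute (specific) humidity modulates the
# transmission rate. Parameter values are illustrative assumptions.
import math

def r0_of_humidity(q, r0_min=1.2, r0_max=2.2, a=180.0):
    """Basic reproductive number as a declining exponential in specific humidity q (kg/kg)."""
    return r0_min + (r0_max - r0_min) * math.exp(-a * q)

def run_sirs(q_series, gamma=1 / 4.0, immunity_days=4 * 365, i0=1e-4, dt=1.0):
    """Simple Euler integration of an SIRS model driven by a humidity series."""
    s, i = 1.0 - i0, i0
    incidence = []
    for q in q_series:
        beta = r0_of_humidity(q) * gamma
        new_inf = beta * s * i * dt
        new_rec = gamma * i * dt
        waning = (1.0 - s - i) / immunity_days * dt
        s += waning - new_inf
        i += new_inf - new_rec
        incidence.append(new_inf)
    return incidence

# Crude seasonal humidity cycle: dry (low q) in winter, humid in summer.
q_series = [0.004 + 0.006 * (1 - math.cos(2 * math.pi * t / 365)) / 2 for t in range(3 * 365)]
inc = run_sirs(q_series)
print(f"peak daily incidence ~= {max(inc):.4f} (fraction of population)")
```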

  10. Risk Models to Predict Hypertension: A Systematic Review

    PubMed Central

    Echouffo-Tcheugui, Justin B.; Batty, G. David; Kivimäki, Mika; Kengne, Andre P.

    2013-01-01

    Background As well as being a risk factor for cardiovascular disease, hypertension is also a health condition in its own right. Risk prediction models may be of value in identifying those individuals at risk of developing hypertension who are likely to benefit most from interventions. Methods and Findings To synthesize existing evidence on the performance of these models, we searched MEDLINE and EMBASE; examined bibliographies of retrieved articles; contacted experts in the field; and searched our own files. Dual review of identified studies was conducted. Included studies had to report on the development, validation, or impact analysis of a hypertension risk prediction model. For each publication, information was extracted on study design and characteristics, predictors, model discrimination, calibration and reclassification ability, validation and impact analysis. Eleven studies reporting on 15 different hypertension risk prediction models were identified. Age, sex, body mass index, diabetes status, and blood pressure variables were the most common predictor variables included in models. Most risk models had acceptable-to-good discriminatory ability (C-statistic>0.70) in the derivation sample. Calibration was less commonly assessed but was acceptable overall. Two hypertension risk models, the Framingham and Hopkins, have been externally validated, displaying acceptable-to-good discrimination, with C-statistics ranging from 0.71 to 0.81. Lack of individual-level data precluded analyses of the risk models in subgroups. Conclusions The discrimination ability of existing hypertension risk prediction tools is acceptable, but the impact of using these tools on prescriptions and outcomes of hypertension prevention is unclear. PMID:23861760

  11. Risk models and scores for type 2 diabetes: systematic review

    PubMed Central

    Mathur, Rohini; Dent, Tom; Meads, Catherine; Greenhalgh, Trisha

    2011-01-01

    Objective To evaluate current risk models and scores for type 2 diabetes and inform selection and implementation of these in practice. Design Systematic review using standard (quantitative) and realist (mainly qualitative) methodology. Inclusion criteria Papers in any language describing the development or external validation, or both, of models and scores to predict the risk of an adult developing type 2 diabetes. Data sources Medline, PreMedline, Embase, and Cochrane databases were searched. Included studies were citation tracked in Google Scholar to identify follow-on studies of usability or impact. Data extraction Data were extracted on statistical properties of models, details of internal or external validation, and use of risk scores beyond the studies that developed them. Quantitative data were tabulated to compare model components and statistical properties. Qualitative data were analysed thematically to identify mechanisms by which use of the risk model or score might improve patient outcomes. Results 8864 titles were scanned, 115 full text papers considered, and 43 papers included in the final sample. These described the prospective development or validation, or both, of 145 risk prediction models and scores, 94 of which were studied in detail here. They had been tested on 6.88 million participants followed for up to 28 years. Heterogeneity of primary studies precluded meta-analysis. Some but not all risk models or scores had robust statistical properties (for example, good discrimination and calibration) and had been externally validated on a different population. Genetic markers added nothing to models over clinical and sociodemographic factors. Most authors described their score as “simple” or “easily implemented,” although few were specific about the intended users and under what circumstances. Ten mechanisms were identified by which measuring diabetes risk might improve outcomes. Follow-on studies that applied a risk score as part of an

  12. Absolute classification with unsupervised clustering

    NASA Technical Reports Server (NTRS)

    Jeon, Byeungwoo; Landgrebe, D. A.

    1992-01-01

    An absolute classification algorithm is proposed in which the class definition through training samples or otherwise is required only for a particular class of interest. The absolute classification is considered as a problem of unsupervised clustering when one cluster is known initially. The definitions and statistics of the other classes are automatically developed through the weighted unsupervised clustering procedure, which is developed to keep the cluster corresponding to the class of interest from losing its identity as the class of interest. Once all the classes are developed, a conventional relative classifier such as the maximum-likelihood classifier is used in the classification.

  13. Global flood risk modelling and its applications for disaster risk reduction

    NASA Astrophysics Data System (ADS)

    Jongman, Brenden; Winsemius, Hessel; Bierkens, Marc; Bouwman, Arno; van Beek, Rens; Ligtvoet, Willem; Ward, Philip

    2014-05-01

    Flooding of river systems is the most costly natural hazard affecting societies around the world, with an average of US$55 billion in direct losses and 4,500 fatalities each year between 1990 and 2012. The accurate and consistent assessment of flood risk on a global scale is essential for international development organizations and the reinsurance industry, and for enhancing our understanding of climate change impacts. This need is especially felt in developing countries, where local data and models are largely unavailable, and where flood risk is increasing rapidly under strong population growth and economic development. Here we present ongoing applications of high-resolution flood risk modelling at a global scale. The work is based on GLOFRIS, a modelling chain that produces flood risk maps at a 1 km spatial resolution for the entire globe, under a range of climate and socioeconomic scenarios and various past and future time periods. This modelling chain combines a hydrological inundation model with socioeconomic datasets to assess past, current and future population exposure; economic damages; and agricultural risk. These tools are currently applied scientifically to gain insights into geographical patterns in current risk, and to assess the effects of possible future scenarios under climate change and climate variability. In this presentation we show recent applications from global scale to national scales. The global scale applications include global risk profiling for the reinsurance industry; and novel estimation of global flood mortality risk. In addition it will be demonstrated how the global flood modelling approach was successfully applied to assess disaster risk reduction priorities on a national scale in Africa. Finally, we indicate how these global modelling tools can be used to quantify the costs and benefits of adaptation, and explore pathways for development under a changing environment.

  14. Radiation risk models for all solid cancers other than those types of cancer requiring individual assessments after a nuclear accident.

    PubMed

    Walsh, Linda; Zhang, Wei

    2016-03-01

    In the assessment of health risks after nuclear accidents, some health consequences require special attention. For example, in their 2013 report on health risk assessment after the Fukushima nuclear accident, the World Health Organisation (WHO) panel of experts considered risks of breast cancer, thyroid cancer and leukaemia. For these specific cancer types, use was made of already published excess relative risk (ERR) and excess absolute risk (EAR) models for radiation-related cancer incidence fitted to the epidemiological data from the Japanese A-bomb Life Span Study (LSS). However, it was also considered important to assess all other types of solid cancer together and the WHO, in their above-mentioned report, stated "No model to calculate the risk for all other solid cancer excluding breast and thyroid cancer risks is available from the LSS data". Applying the LSS models for all solid cancers along with the models for the specific sites means that some cancers have an overlap in the risk evaluations. Thus, calculating the total solid cancer risk plus the breast cancer risk plus the thyroid cancer risk can overestimate the total risk by several per cent. Therefore, the purpose of this paper was to publish the required models for all other solid cancers, i.e. all solid cancers other than those types of cancer requiring special attention after a nuclear accident. The new models presented here have been fitted to the same LSS data set from which the risks provided by the WHO were derived. Although it is known already that the EAR and ERR effect modifications by sex are statistically significant for the outcome "all solid cancer", it is shown here that sex modification is not statistically significant for the outcome "all solid cancer other than thyroid and breast cancer". It is also shown here that the sex-averaged solid cancer risks with and without the sex modification are very similar once breast and thyroid cancers are factored out. Some other notable model

  15. National Veterans Health Administration inpatient risk stratification models for hospital-acquired acute kidney injury

    PubMed Central

    Cronin, Robert M; VanHouten, Jacob P; Siew, Edward D; Eden, Svetlana K; Fihn, Stephan D; Nielson, Christopher D; Peterson, Josh F; Baker, Clifton R; Ikizler, T Alp; Speroff, Theodore

    2015-01-01

    Objective Hospital-acquired acute kidney injury (HA-AKI) is a potentially preventable cause of morbidity and mortality. Identifying high-risk patients prior to the onset of kidney injury is a key step towards AKI prevention. Materials and Methods A national retrospective cohort of 1,620,898 patient hospitalizations from 116 Veterans Affairs hospitals was assembled from electronic health record (EHR) data collected from 2003 to 2012. HA-AKI was defined at stage 1+, stage 2+, and dialysis. EHR-based predictors were identified through logistic regression, least absolute shrinkage and selection operator (lasso) regression, and random forests, and pair-wise comparisons between each were made. Calibration and discrimination metrics were calculated using 50 bootstrap iterations. In the final models, we report odds ratios, 95% confidence intervals, and importance rankings for predictor variables to evaluate their significance. Results The area under the receiver operating characteristic curve (AUC) for the different model outcomes ranged from 0.746 to 0.758 in stage 1+, 0.714 to 0.720 in stage 2+, and 0.823 to 0.825 in dialysis. Logistic regression had the best AUC in stage 1+ and dialysis. Random forests had the best AUC in stage 2+ but the least favorable calibration plots. Multiple risk factors were significant in our models, including some nonsteroidal anti-inflammatory drugs, blood pressure medications, antibiotics, and intravenous fluids given during the first 48 h of admission. Conclusions This study demonstrated that, although all the models tested had good discrimination, performance characteristics varied between methods, and the random forests models did not calibrate as well as the lasso or logistic regression models. In addition, novel modifiable risk factors were explored and found to be significant. PMID:26104740
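
    The model comparison described above (logistic regression, lasso, and random forests ranked by AUC) can be sketched on synthetic data as follows; the simulated features and outcome stand in for the EHR predictors and are not the VA cohort data.

```python
# Sketch of an AUC comparison between logistic regression, lasso-penalized
# logistic regression, and a random forest on simulated data.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Simulated cohort with a rare (~5%) outcome standing in for HA-AKI.
X, y = make_classification(n_samples=20_000, n_features=30, n_informative=10,
                           weights=[0.95], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

models = {
    "logistic": LogisticRegression(max_iter=1000),
    "lasso": LogisticRegression(penalty="l1", solver="liblinear", C=0.1),
    "random_forest": RandomForestClassifier(n_estimators=200, random_state=0),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
    print(f"{name}: AUC = {auc:.3f}")
```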

  16. The OPTIONS model of sexual risk assessment for adolescents.

    PubMed

    Lusczakoski, Kathryn D; Rue, Lisa A

    2012-03-01

    Typically, clinical evaluations of adolescents' sexual risk are based on inquiring about past sexual activity, which is limited by not including an adolescent's cognitive decision making regarding their past sexual decisions. This study describes the novel OPTIONS framework for assessing adolescent sexual risk, including three general categories of risk (i.e., primary, secondary, and tertiary risk), which is designed to overcome the limitation of action-based assessment of risk and improve practitioners' ability to assess the levels of sexual risk. A convenience sample of 201 older adolescents (18-19 years of age) completed an online version of the Relationship Options Survey (ROS), designed to measure the OPTIONS sexual risk assessment. Bivariate correlations among the subscales functioned in the hypothesized manner, with all correlations being statistically significant. Using the OPTIONS model, 22.4% of participants were classified as high risk primary, 7.0% of participants were classified as high risk secondary, and 27.4% of participants were classified as high risk tertiary. The study provided preliminary evidence for the OPTIONS model of sexual assessment, which provides a more tailored evaluation by including cognitive decisions regarding an adolescent's sexual actions.

  17. Back-end Science Model Integration for Ecological Risk Assessment

    EPA Science Inventory

    The U.S. Environmental Protection Agency (USEPA) relies on a number of ecological risk assessment models that have been developed over 30-plus years of regulating pesticide exposure and risks under Federal Insecticide, Fungicide, and Rodenticide Act (FIFRA) and the Endangered Spe...

  19. Risk Prediction Models for Other Cancers or Multiple Sites

    Cancer.gov

    Developing statistical models that estimate the probability of developing other multiple cancers over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  20. Absolute measurement of length with nanometric resolution

    NASA Astrophysics Data System (ADS)

    Apostol, D.; Garoi, F.; Timcu, A.; Damian, V.; Logofatu, P. C.; Nascov, V.

    2005-08-01

    Laser interferometer displacement measuring transducers have a well-defined traceability route to the definition of the meter. The laser interferometer is the de facto length scale for applications in micro and nano technologies. However, their physical unit, half lambda, is too large for nanometric resolution. Fringe interpolation, the usual technique for improving resolution, lacks reproducibility; this can be avoided by using the principles of absolute distance measurement. Absolute distance refers to the use of interferometric techniques for determining the position of an object without the necessity of measuring continuous displacements between points. The interference pattern produced by the interference of two point-like coherent sources is fitted to a geometric model so as to determine the longitudinal location of the target by minimizing least-square errors. The longitudinal coordinate of the target was measured with an accuracy better than 1 nm, for a target position range of 0.4 μm.
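
    The fitting step described above can be sketched generically: simulate a fringe pattern whose shape depends on an unknown longitudinal distance, then recover that distance by nonlinear least squares. The fringe model and parameter values below are simplified stand-ins, not the paper's exact two-point-source geometry.

```python
# Generic sketch of recovering a longitudinal position from an interference
# pattern by nonlinear least squares. The paraxial fringe model below is a
# simplified stand-in for the actual two-point-source geometry.
import numpy as np
from scipy.optimize import curve_fit

WAVELENGTH = 633e-9   # m (assumed He-Ne source)
SOURCE_SEP = 2e-3     # m, assumed longitudinal separation of the two point sources

def fringe_model(x, z, amplitude, offset):
    """Intensity vs transverse position x for target distance z (paraxial approximation)."""
    k = 2 * np.pi / WAVELENGTH
    opd = SOURCE_SEP * x**2 / (2 * z**2)   # x-dependent part of the optical path difference
    return offset + amplitude * np.cos(k * opd)

x = np.linspace(-5e-3, 5e-3, 2000)
true_z = 0.100                             # m
rng = np.random.default_rng(1)
data = fringe_model(x, true_z, 1.0, 1.0) + rng.normal(0, 0.02, x.size)

popt, _ = curve_fit(fringe_model, x, data, p0=[0.098, 0.9, 1.0])
print(f"recovered z = {popt[0]:.6f} m (true {true_z:.6f} m)")
```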

  1. Absolute transition probabilities of phosphorus.

    NASA Technical Reports Server (NTRS)

    Miller, M. H.; Roig, R. A.; Bengtson, R. D.

    1971-01-01

    Use of a gas-driven shock tube to measure the absolute strengths of 21 P I lines and 126 P II lines (from 3300 to 6900 Å). Accuracy for prominent, isolated neutral and ionic lines is estimated to be 28 to 40% and 18 to 30%, respectively. The data and the corresponding theoretical predictions are examined for conformity with the sum rules.

  2. Relativistic Absolutism in Moral Education.

    ERIC Educational Resources Information Center

    Vogt, W. Paul

    1982-01-01

    Discusses Emile Durkheim's "Moral Education: A Study in the Theory and Application of the Sociology of Education," which holds that morally healthy societies may vary in culture and organization but must possess absolute rules of moral behavior. Compares this moral theory with current theory and practice of American educators. (MJL)

  3. Absolute Standards for Climate Measurements

    NASA Astrophysics Data System (ADS)

    Leckey, J.

    2016-10-01

    In a world of changing climate, political uncertainty, and ever-changing budgets, the benefit of measurements traceable to SI standards increases by the day. To truly resolve climate change trends on a decadal time scale, on-orbit measurements need to be referenced to something that is both absolute and unchanging. One such mission is the Climate Absolute Radiance and Refractivity Observatory (CLARREO), which will measure a variety of climate variables with an unprecedented accuracy to definitively quantify climate change. In the CLARREO mission, we will utilize phase change cells in which a material is melted to calibrate the temperature of a blackbody that can then be observed by a spectrometer. A material's melting point is an unchanging physical constant that, through a series of transfers, can ultimately calibrate a spectrometer on an absolute scale. CLARREO consists of two primary instruments: an infrared (IR) spectrometer and a reflected solar (RS) spectrometer. The mission will contain orbiting radiometers with sufficient accuracy to calibrate other space-based instrumentation and thus transfer the absolute traceability. The status of various mission options will be presented.

  4. Lymphatic Filariasis Transmission Risk Map of India, Based on a Geo-Environmental Risk Model

    PubMed Central

    Sabesan, Shanmugavelu; Raju, Konuganti Hari Kishan; Srivastava, Pradeep Kumar; Jambulingam, Purushothaman

    2013-01-01

    Abstract The strategy adopted by a global program to interrupt transmission of lymphatic filariasis (LF) is mass drug administration (MDA) using chemotherapy. India also followed this strategy by introducing MDA in the historically known endemic areas. All other areas, which remained unsurveyed, were presumed to be nonendemic and left without any intervention. Therefore, identification of LF transmission risk areas in the entire country has become essential so that they can be targeted for intervention. A geo-environmental risk model (GERM) developed earlier was used to create a filariasis transmission risk map for India. In this model, a Standardized Filariasis Transmission Risk Index (SFTRI, based on geo-environmental risk variables) was used as a predictor of transmission risk. The relationship between SFTRI and endemicity (historically known) of an area was quantified by logistic regression analysis. The quantified relationship was validated by assessing the filarial antigenemia status of children living in the unsurveyed areas through a ground truth study. A significant positive relationship was observed between SFTRI and the endemicity of an area. Overall, the model prediction of filarial endemic status of districts was found to be correct in 92.8% of the total observations. Thus, among the 190 districts hitherto unsurveyed, as many as 113 districts were predicted to be at risk, and the remaining at no risk. The GERM developed on geographic information system (GIS) platform is useful for LF spatial delimitation on a macrogeographic/regional scale. Furthermore, the risk map developed will be useful for the national LF elimination program by identifying areas at risk for intervention and for undertaking surveillance in no-risk areas. PMID:23808973

  5. Applying risk assessment models in non-surgical patients: effective risk stratification.

    PubMed

    Eldor, A

    1999-08-01

    Pulmonary embolism and deep vein thrombosis are serious complications of non-surgical patients, but scarcity of data documenting prophylaxis means antithrombotic therapy is rarely used. Prediction of risk is complicated by the variation in the medical conditions associated with venous thromboembolism (VTE), and lack of data defining risk in different groups. Accurate risk assessment is further confounded by inherited or acquired factors for VTE, additional risk due to medical interventions, and interactions between risk factors. Acquired and inherited risk factors may underlie thromboembolic complications in a range of conditions, including pregnancy, ischaemic stroke, myocardial infarction and cancer. Risk stratification may be feasible in non-surgical patients by considering individual risk factors and their cumulative effects. Current risk assessment models require expansion and modification to reflect emerging evidence in the non-surgical field. A large on-going study of prophylaxis with low-molecular-weight heparin in non-surgical patients will clarify our understanding of the components of risk, and assist in developing therapy recommendations.

  6. Developing a predictive risk model for first-line antiretroviral therapy failure in South Africa

    PubMed Central

    Rohr, Julia K; Ive, Prudence; Horsburgh, C Robert; Berhanu, Rebecca; Shearer, Kate; Maskew, Mhairi; Long, Lawrence; Sanne, Ian; Bassett, Jean; Ebrahim, Osman; Fox, Matthew P

    2016-01-01

    Introduction A substantial number of patients with HIV in South Africa have failed first-line antiretroviral therapy (ART). Although individual predictors of first-line ART failure have been identified, few studies in resource-limited settings have been large enough for predictive modelling. Understanding the absolute risk of first-line failure is useful for patient monitoring and for effectively targeting limited resources for second-line ART. We developed a predictive model to identify patients at the greatest risk of virologic failure on first-line ART, and to estimate the proportion of patients needing second-line ART over five years on treatment. Methods A cohort of patients aged ≥18 years from nine South African HIV clinics on first-line ART for at least six months was included. Viral load measurements and baseline predictors were obtained from medical records. We used stepwise selection of predictors in accelerated failure-time models to predict virologic failure on first-line ART (two consecutive viral load levels >1000 copies/mL). Multiple imputation was used to fill in missing baseline variables. The final model was selected using internal-external cross-validation maximizing model calibration at five years on ART, and model discrimination, measured using Harrell's C-statistic. Model covariates were used to create a predictive score defining risk groups for ART failure. Results A total of 72,181 patients were included in the analysis, with an average of 21.5 months (IQR: 8.8–41.5) of follow-up time on first-line ART. The final predictive model used a Weibull distribution, and the final predictors of virologic failure were male sex (at all ages), younger age among women, nevirapine use in the first-line regimen, low baseline CD4 count, high mean corpuscular volume, low haemoglobin, history of TB, and missed visits during the first six months on ART. About 24.4% of patients in the highest quintile and 9.4% of patients in the lowest quintile of risk were predicted to experience
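
    A hand-rolled sketch of a Weibull accelerated failure-time model with right censoring, of the kind described above, is given below; the covariates and event times are simulated, so this is an illustration of the method rather than the study's fitted model.

```python
# Weibull accelerated failure-time (AFT) model with right censoring, fitted by
# maximum likelihood on simulated data. Covariates are generic stand-ins.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n = 5000
X = rng.normal(size=(n, 2))                       # two simulated baseline covariates
beta_true, shape_true = np.array([0.5, -0.4]), 1.3
scale = np.exp(1.0 + X @ beta_true)               # AFT: covariates rescale the time axis
t_event = scale * rng.weibull(shape_true, size=n)
t_cens = rng.uniform(0.5, 6.0, size=n)            # administrative censoring times
time = np.minimum(t_event, t_cens)
event = (t_event <= t_cens).astype(float)

def neg_log_lik(params):
    b0, b1, b2, log_k = params
    k = np.exp(log_k)
    lam = np.exp(b0 + X @ np.array([b1, b2]))
    z = (time / lam) ** k
    log_f = np.log(k) - np.log(lam) + (k - 1) * (np.log(time) - np.log(lam)) - z
    log_s = -z
    # events contribute the density, censored observations the survival function
    return -np.sum(event * log_f + (1 - event) * log_s)

res = minimize(neg_log_lik, x0=np.zeros(4), method="Nelder-Mead",
               options={"maxiter": 5000, "xatol": 1e-8, "fatol": 1e-8})
b0, b1, b2, log_k = res.x
print(f"acceleration factors: exp(b1)={np.exp(b1):.2f}, exp(b2)={np.exp(b2):.2f}, "
      f"shape={np.exp(log_k):.2f}")
```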

  7. Value-at-Risk forecasts by a spatiotemporal model in Chinese stock market

    NASA Astrophysics Data System (ADS)

    Gong, Pu; Weng, Yingliang

    2016-01-01

    This paper generalizes a recently proposed spatial autoregressive model and introduces a spatiotemporal model for forecasting stock returns. We support the view that stock returns are affected not only by the absolute values of factors such as firm size, book-to-market ratio and momentum but also by the relative values of factors like trading volume ranking and market capitalization ranking in each period. This article studies a new method for constructing stocks' reference groups; the method is called quartile method. Applying the method empirically to the Shanghai Stock Exchange 50 Index, we compare the daily volatility forecasting performance and the out-of-sample forecasting performance of Value-at-Risk (VaR) estimated by different models. The empirical results show that the spatiotemporal model performs surprisingly well in terms of capturing spatial dependences among individual stocks, and it produces more accurate VaR forecasts than the other three models introduced in the previous literature. Moreover, the findings indicate that both allowing for serial correlation in the disturbances and using time-varying spatial weight matrices can greatly improve the predictive accuracy of a spatial autoregressive model.

  8. Submission Form for Peer-Reviewed Cancer Risk Prediction Models

    Cancer.gov

    If you have information about a peer-reviewed cancer risk prediction model that you would like to be considered for inclusion on this list, submit as much information as possible through the form on this page.

  9. Modeling Exposure to Persistent Chemicals in Hazard and Risk Assessment

    SciTech Connect

    Cowan-Ellsberry, Christina E.; McLachlan, Michael S.; Arnot, Jon A.; MacLeod, Matthew; McKone, Thomas E.; Wania, Frank

    2008-11-01

    Fate and exposure modeling has not thus far been explicitly used in the risk profile documents prepared to evaluate significant adverse effect of candidate chemicals for either the Stockholm Convention or the Convention on Long-Range Transboundary Air Pollution. However, we believe models have considerable potential to improve the risk profiles. Fate and exposure models are already used routinely in other similar regulatory applications to inform decisions, and they have been instrumental in building our current understanding of the fate of POP and PBT chemicals in the environment. The goal of this paper is to motivate the use of fate and exposure models in preparing risk profiles in the POP assessment procedure by providing strategies for incorporating and using models. The ways that fate and exposure models can be used to improve and inform the development of risk profiles include: (1) Benchmarking the ratio of exposure and emissions of candidate chemicals to the same ratio for known POPs, thereby opening the possibility of combining this ratio with the relative emissions and relative toxicity to arrive at a measure of relative risk. (2) Directly estimating the exposure of the environment, biota and humans to provide information to complement measurements, or where measurements are not available or are limited. (3) To identify the key processes and chemical and/or environmental parameters that determine the exposure; thereby allowing the effective prioritization of research or measurements to improve the risk profile. (4) Predicting future time trends including how quickly exposure levels in remote areas would respond to reductions in emissions. Currently there is no standardized consensus model for use in the risk profile context. Therefore, to choose the appropriate model the risk profile developer must evaluate how appropriate an existing model is for a specific setting and whether the assumptions and input data are relevant in the context of the application

  10. A Hybrid Tsunami Risk Model for Japan

    NASA Astrophysics Data System (ADS)

    Haseemkunju, A. V.; Smith, D. F.; Khater, M.; Khemici, O.; Betov, B.; Scott, J.

    2014-12-01

    Around the margins of the Pacific Ocean, denser oceanic plates slipping under continental plates cause subduction earthquakes generating large tsunami waves. The subducting Pacific and Philippine Sea plates create damaging interplate earthquakes followed by huge tsunami waves. It was a rupture of the Japan Trench subduction zone (JTSZ) and the resultant M9.0 Tohoku-Oki earthquake that caused the unprecedented tsunami along the Pacific coast of Japan on March 11, 2011. EQECAT's Japan Earthquake model is a fully probabilistic model which includes a seismo-tectonic model describing the geometries, magnitudes, and frequencies of all potential earthquake events; a ground motion model; and a tsunami model. Within the much larger set of all modeled earthquake events, fault rupture parameters for about 24000 stochastic and 25 historical tsunamigenic earthquake events are defined to simulate tsunami footprints using the numerical tsunami model COMCOT. A hybrid approach using COMCOT simulated tsunami waves is used to generate inundation footprints, including the impact of tides and flood defenses. Modeled tsunami waves of major historical events are validated against observed data. Modeled tsunami flood depths on 30 m grids together with tsunami vulnerability and financial models are then used to estimate insured loss in Japan from the 2011 tsunami. The primary direct report of damage from the 2011 tsunami is in terms of the number of buildings damaged by municipality in the tsunami affected area. Modeled loss in Japan from the 2011 tsunami is proportional to the number of buildings damaged. A 1000-year return period map of tsunami waves shows high hazard along the west coast of southern Honshu, on the Pacific coast of Shikoku, and on the east coast of Kyushu, primarily associated with major earthquake events on the Nankai Trough subduction zone (NTSZ). The highest tsunami hazard of more than 20m is seen on the Sanriku coast in northern Honshu, associated with the JTSZ.

  11. Usefulness and limitations of global flood risk models

    NASA Astrophysics Data System (ADS)

    Ward, Philip; Jongman, Brenden; Salamon, Peter; Simpson, Alanna; Bates, Paul; De Groeve, Tom; Muis, Sanne; Coughlan de Perez, Erin; Rudari, Roberto; Trigg, Mark; Winsemius, Hessel

    2016-04-01

    Global flood risk models are now a reality. Initially, their development was driven by a demand from users for first-order global assessments to identify risk hotspots. Relentless upward trends in flood damage over the last decade have enhanced interest in such assessments. The adoption of the Sendai Framework for Disaster Risk Reduction and the Warsaw International Mechanism for Loss and Damage Associated with Climate Change Impacts have made these efforts even more essential. As a result, global flood risk models are being used more and more in practice, by an increasingly large number of practitioners and decision-makers. However, they clearly have their limits compared to local models. To address these issues, a team of scientists and practitioners recently came together at the Global Flood Partnership meeting to critically assess the question 'What can('t) we do with global flood risk models?'. The results of this dialogue (Ward et al., 2015) will be presented, opening a discussion on similar broader initiatives at the science-policy interface in other natural hazards. In this contribution, examples are provided of successful applications of global flood risk models in practice (for example together with the World Bank, Red Cross, and UNISDR), and limitations and gaps between user 'wish-lists' and model capabilities are discussed. Finally, a research agenda is presented for addressing these limitations and reducing the gaps. Ward et al., 2015. Nature Climate Change, doi:10.1038/nclimate2742

  12. A Corrosion Risk Assessment Model for Underground Piping

    NASA Technical Reports Server (NTRS)

    Datta, Koushik; Fraser, Douglas R.

    2009-01-01

    The Pressure Systems Manager at NASA Ames Research Center (ARC) has embarked on a project to collect data and develop risk assessment models to support risk-informed decision making regarding future inspections of underground pipes at ARC. This paper shows progress in one area of this project - a corrosion risk assessment model for the underground high-pressure air distribution piping system at ARC. It consists of a Corrosion Model of pipe segments, a Pipe Wrap Protection Model, and a Pipe Stress Model for a pipe segment. A Monte Carlo simulation of the combined models provides a distribution of the failure probabilities. Sensitivity study results show that the model uncertainty, or lack of knowledge, is the dominant contributor to the calculated unreliability of the underground piping system. As a result, the Pressure Systems Manager may consider investing resources specifically focused on reducing these uncertainties. Future work includes completing the data collection effort for the existing ground based pressure systems and applying the risk models to risk-based inspection strategies of the underground pipes at ARC.

  13. Sensitivity and uncertainty analysis of a regulatory risk model

    SciTech Connect

    Kumar, A.; Manocha, A.; Shenoy, T.

    1999-07-01

    Health Risk Assessments (H.R.A.s) are increasingly being used in the environmental decision making process, from problem identification through the final clean-up activities. A key issue concerning the results of these risk assessments is the uncertainty associated with them. This uncertainty has been associated with highly conservative estimates of risk assessment parameters in past studies. The primary purpose of this study was to investigate error propagation through a risk model. A hypothetical glass plant situated in the state of California was studied. Air emissions from this plant were modeled using the ISCST2 model and the risk was calculated using the ACE2588 model. The downwash was also considered during the concentration calculations. A sensitivity analysis on the risk computations identified five parameters--mixing depth for human consumption, deposition velocity, weathering constant, interception factors for vine crop and the average leaf vegetable consumption--which had the greatest impact on the calculated risk. A Monte Carlo analysis using these five parameters resulted in a risk distribution whose percentage standard deviation was smaller than that of the input parameters.
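
    The Monte Carlo propagation described above can be sketched with a simplified multiplicative risk equation: sample the uncertain inputs, push them through the model, and rank their influence on the output. The parameters, distributions, and the risk equation itself are illustrative assumptions, not those of the ISCST2/ACE2588 study.

```python
# Sketch of Monte Carlo uncertainty propagation through a toy multiplicative
# risk equation, with a crude sensitivity ranking by rank correlation.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(7)
n = 100_000

# Illustrative lognormal inputs (median, log-space standard deviation).
deposition_velocity = rng.lognormal(np.log(0.02), 0.5, n)    # m/s
weathering_constant = rng.lognormal(np.log(0.05), 0.3, n)    # 1/day
leaf_veg_consumption = rng.lognormal(np.log(0.1), 0.4, n)    # kg/day

# Simplified dose/risk chain: risk scales with deposition and intake and
# inversely with the weathering removal rate.
risk = 1e-6 * deposition_velocity * leaf_veg_consumption / weathering_constant

print(f"median risk: {np.median(risk):.2e}")
print(f"95th percentile: {np.percentile(risk, 95):.2e}")
for name, values in [("deposition_velocity", deposition_velocity),
                     ("weathering_constant", weathering_constant),
                     ("leaf_veg_consumption", leaf_veg_consumption)]:
    rho, _ = spearmanr(values, risk)
    print(f"rank correlation with risk, {name}: {rho:+.2f}")
```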

  14. Constraining the Properties of the Eta Carinae System via 3-D SPH Models of Space-Based Observations: The Absolute Orientation of the Binary Orbit

    NASA Technical Reports Server (NTRS)

    Madura, Thomas I.; Gull, Theodore R.; Owocki, Stanley P.; Okazaki, Atsuo T.; Russell, Christopher M. P.

    2010-01-01

    The extremely massive (> 90 Solar Mass) and luminous (= 5 x 10^6 Solar Luminosity) star Eta Carinae, with its spectacular bipolar "Homunculus" nebula, comprises one of the most remarkable and intensely observed stellar systems in the galaxy. However, many of its underlying physical parameters remain a mystery. Multiwavelength variations observed to occur every 5.54 years are interpreted as being due to the collision of a massive wind from the primary star with the fast, less dense wind of a hot companion star in a highly elliptical (e approx. 0.9) orbit. Using three-dimensional (3-D) Smoothed Particle Hydrodynamics (SPH) simulations of the binary wind-wind collision in Eta Car, together with radiative transfer codes, we compute synthetic spectral images of [Fe III] emission line structures and compare them to existing Hubble Space Telescope/Space Telescope Imaging Spectrograph (HST/STIS) observations. We are thus able, for the first time, to constrain the absolute orientation of the binary orbit on the sky. An orbit with an inclination of i approx. 40deg, an argument of periapsis omega approx. 255deg, and a projected orbital axis with a position angle of approx. 312deg east of north provides the best fit to the observations, implying that the orbital axis is closely aligned in 3-D space with the Homunculus symmetry axis, and that the companion star orbits clockwise on the sky relative to the primary.

  15. Constraining the Properties of the Eta Carinae System via 3-D SPH Models of Space-Based Observations: The Absolute Orientation of the Binary Orbit

    NASA Technical Reports Server (NTRS)

    Madura, Thomas I.; Gull, Theodore R.; Owocki, Stanley P.; Okazaki, Atsuo T.; Russell, Christopher M. P.

    2011-01-01

    The extremely massive (> 90 Solar Mass) and luminous (= 5 x 10^6 Solar Luminosity) star Eta Carinae, with its spectacular bipolar "Homunculus" nebula, comprises one of the most remarkable and intensely observed stellar systems in the Galaxy. However, many of its underlying physical parameters remain unknown. Multiwavelength variations observed to occur every 5.54 years are interpreted as being due to the collision of a massive wind from the primary star with the fast, less dense wind of a hot companion star in a highly elliptical (e approx. 0.9) orbit. Using three-dimensional (3-D) Smoothed Particle Hydrodynamics (SPH) simulations of the binary wind-wind collision, together with radiative transfer codes, we compute synthetic spectral images of [Fe III] emission line structures and compare them to existing Hubble Space Telescope/Space Telescope Imaging Spectrograph (HST/STIS) observations. We are thus able, for the first time, to tightly constrain the absolute orientation of the binary orbit on the sky. An orbit with an inclination of approx. 40deg, an argument of periapsis omega approx. 255deg, and a projected orbital axis with a position angle of approx. 312deg east of north provides the best fit to the observations, implying that the orbital axis is closely aligned in 3-D space with the Homunculus symmetry axis, and that the companion star orbits clockwise on the sky relative to the primary.

  16. Dietary Information Improves Model Performance and Predictive Ability of a Noninvasive Type 2 Diabetes Risk Model

    PubMed Central

    Han, Tianshu; Tian, Shuang; Wang, Li; Liang, Xi; Cui, Hongli; Du, Shanshan; Na, Guanqiong; Na, Lixin; Sun, Changhao

    2016-01-01

    There is no diabetes risk model that includes dietary predictors in Asia. We sought to develop a diet-containing noninvasive diabetes risk model in Northern China and to evaluate whether dietary predictors can improve model performance and predictive ability. Cross-sectional data for 9,734 adults aged 20–74 years were used as the derivation data, and results obtained for a cohort of 4,515 adults with 4.2 years of follow-up were used as the validation data. We used a logistic regression model to develop a diet-containing noninvasive risk model. Akaike’s information criterion (AIC), area under curve (AUC), integrated discrimination improvement (IDI), net reclassification improvement (NRI) and calibration statistics were calculated to explicitly assess the effect of dietary predictors on a diabetes risk model. A diet-containing type 2 diabetes risk model was developed. The significant dietary predictors, including the consumption of staple foods, livestock, eggs, potato, dairy products, and fresh fruit and vegetables, were included in the risk model. Dietary predictors improved the noninvasive diabetes risk model with a significant increase in the AUC (delta AUC = 0.03, P<0.001), an increase in relative IDI (24.6%, P-value for IDI <0.001), an increase in NRI (category-free NRI = 0.155, P<0.001), an increase in sensitivity of the model of 7.3%, and a decrease in AIC (delta AIC = 199.5). The results of the validation data were similar to the derivation data. The calibration of the diet-containing diabetes risk model was better than that of the risk model without dietary predictors in the validation data. Dietary information improves model performance and predictive ability of a noninvasive type 2 diabetes risk model based on classic risk factors. Dietary information may be useful for developing a noninvasive diabetes risk model. PMID:27851788
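
    For readers unfamiliar with the reclassification metrics named above, the sketch below computes the IDI and the category-free NRI for two hypothetical risk models on simulated outcomes; the data are invented and do not come from the study.

```python
# Integrated discrimination improvement (IDI) and category-free net
# reclassification improvement (NRI) for two hypothetical risk models.
import numpy as np

rng = np.random.default_rng(3)
n = 5000
y = rng.binomial(1, 0.1, n)                        # simulated diabetes outcomes
p_old = np.clip(0.10 + 0.05 * rng.normal(size=n) + 0.05 * y, 0.01, 0.99)
p_new = np.clip(p_old + 0.03 * y - 0.01 + 0.02 * rng.normal(size=n), 0.01, 0.99)

events, nonevents = y == 1, y == 0
# IDI: change in the mean-probability gap between events and non-events.
idi = (p_new[events].mean() - p_new[nonevents].mean()) \
    - (p_old[events].mean() - p_old[nonevents].mean())
# Category-free NRI: net proportion of events moved up plus non-events moved down.
up, down = p_new > p_old, p_new < p_old
nri = (up[events].mean() - down[events].mean()) \
    + (down[nonevents].mean() - up[nonevents].mean())
print(f"IDI = {idi:.3f}, category-free NRI = {nri:.3f}")
```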

  17. Building a Better Model: A Comprehensive Breast Cancer Risk Model Incorporating Breast Density to Stratify Risk and Apply Resources

    DTIC Science & Technology

    2012-10-01

    methods (CumulusV, Volpara), developed an automated area based ... [subject terms: breast cancer; risk model; mammography; breast density] ... recommendations based on an individual’s risk beginning with personalized mammography screening decisions. This will be done by increasing the ability... mammography machine vendor. Once the model is complete, tested nationally, and proven accurate, it will be available for widespread use within five to six

  18. Absolute calibration of optical flats

    DOEpatents

    Sommargren, Gary E.

    2005-04-05

    The invention uses the phase shifting diffraction interferometer (PSDI) to provide a true point-by-point measurement of absolute flatness over the surface of optical flats. Beams exiting the fiber optics in a PSDI have perfect spherical wavefronts. The measurement beam is reflected from the optical flat and passed through an auxiliary optic to then be combined with the reference beam on a CCD. The combined beams include phase errors due to both the optic under test and the auxiliary optic. Standard phase extraction algorithms are used to calculate this combined phase error. The optical flat is then removed from the system and the measurement fiber is moved to recombine the two beams. The newly combined beams include only the phase errors due to the auxiliary optic. When the second phase measurement is subtracted from the first phase measurement, the absolute phase error of the optical flat is obtained.

  19. The Absolute Spectrum Polarimeter (ASP)

    NASA Technical Reports Server (NTRS)

    Kogut, A. J.

    2010-01-01

    The Absolute Spectrum Polarimeter (ASP) is an Explorer-class mission to map the absolute intensity and linear polarization of the cosmic microwave background and diffuse astrophysical foregrounds over the full sky from 30 GHz to 5 THz. The principal science goal is the detection and characterization of linear polarization from an inflationary epoch in the early universe, with tensor-to-scalar ratio r much greater than 10^(-3) and Compton distortion y < 10^(-6). We describe the ASP instrument and mission architecture needed to detect the signature of an inflationary epoch in the early universe using only 4 semiconductor bolometers.

  20. Physics of negative absolute temperatures

    NASA Astrophysics Data System (ADS)

    Abraham, Eitan; Penrose, Oliver

    2017-01-01

    Negative absolute temperatures were introduced into experimental physics by Purcell and Pound, who successfully applied this concept to nuclear spins; nevertheless, the concept has proved controversial: a recent article aroused considerable interest by its claim, based on a classical entropy formula (the "volume entropy") due to Gibbs, that negative temperatures violated basic principles of statistical thermodynamics. Here we give a thermodynamic analysis that confirms the negative-temperature interpretation of the Purcell-Pound experiments. We also examine the principal arguments that have been advanced against the negative temperature concept; we find that these arguments are not logically compelling, and moreover that the underlying "volume" entropy formula leads to predictions inconsistent with existing experimental results on nuclear spins. We conclude that, despite the counterarguments, negative absolute temperatures make good theoretical sense and did occur in the experiments designed to produce them.

  1. Optomechanics for absolute rotation detection

    NASA Astrophysics Data System (ADS)

    Davuluri, Sankar

    2016-07-01

    In this article, we present an application of an optomechanical cavity for absolute rotation detection. The optomechanical cavity is arranged in a Michelson interferometer in such a way that the classical centrifugal force due to rotation changes the length of the optomechanical cavity. The change in the cavity length induces a shift in the frequency of the cavity mode. The phase shift corresponding to the frequency shift in the cavity mode is measured at the interferometer output to estimate the angular velocity of absolute rotation. We derive an analytic expression to estimate the minimum detectable rotation rate in our scheme for a given optomechanical cavity. Temperature dependence of the rotation detection sensitivity is studied.

  2. The Global Earthquake Model and Disaster Risk Reduction

    NASA Astrophysics Data System (ADS)

    Smolka, A. J.

    2015-12-01

    Advanced, reliable and transparent tools and data to assess earthquake risk are inaccessible to most, especially in less developed regions of the world while few, if any, globally accepted standards currently allow a meaningful comparison of risk between places. The Global Earthquake Model (GEM) is a collaborative effort that aims to provide models, datasets and state-of-the-art tools for transparent assessment of earthquake hazard and risk. As part of this goal, GEM and its global network of collaborators have developed the OpenQuake engine (an open-source software for hazard and risk calculations), the OpenQuake platform (a web-based portal making GEM's resources and datasets freely available to all potential users), and a suite of tools to support modelers and other experts in the development of hazard, exposure and vulnerability models. These resources are being used extensively across the world in hazard and risk assessment, from individual practitioners to local and national institutions, and in regional projects to inform disaster risk reduction. Practical examples for how GEM is bridging the gap between science and disaster risk reduction are: - Several countries including Switzerland, Turkey, Italy, Ecuador, Papua-New Guinea and Taiwan (with more to follow) are computing national seismic hazard using the OpenQuake-engine. In some cases these results are used for the definition of actions in building codes. - Technical support, tools and data for the development of hazard, exposure, vulnerability and risk models for regional projects in South America and Sub-Saharan Africa. - Going beyond physical risk, GEM's scorecard approach evaluates local resilience by bringing together neighborhood/community leaders and the risk reduction community as a basis for designing risk reduction programs at various levels of geography. Actual case studies are Lalitpur in the Kathmandu Valley in Nepal and Quito/Ecuador. In agreement with GEM's collaborative approach, all

  3. Modeling Manufacturing Processes to Mitigate Technological Risk

    SciTech Connect

    Allgood, G.O.; Manges, W.W.

    1999-10-24

    An economic model is a tool for determining the justifiable cost of new sensors and subsystems with respect to value and operation. This process balances the R and D costs against the expense of maintaining current operations and allows for a method to calculate economic indices of performance that can be used as control points in deciding whether to continue development or suspend actions. The model can also be used as an integral part of an overall control loop utilizing real-time process data from the sensor groups to make production decisions (stop production and repair machine, continue and warn of anticipated problems, queue for repairs, etc.). This model has been successfully used and deployed in the CAFE Project. The economic model was one of seven (see Fig. 1) elements critical in developing an investment strategy. It has been successfully used in guiding the R and D activities on the CAFE Project, suspending activities on three new sensor technologies, and continuing development of two others. The model has also been used to justify the development of a new prognostic approach for diagnosing machine health using COTS equipment and a new algorithmic approach.

  4. Prevalence Incidence Mixture Models

    Cancer.gov

    The R package and webtool fit Prevalence Incidence Mixture models to the left-censored and irregularly interval-censored time-to-event data commonly found in screening cohorts assembled from electronic health records. Absolute and relative risk can be estimated for simple random sampling, stratified sampling, and two-phase stratified sampling. Non-parametric (absolute risks only), semi-parametric, weakly-parametric (using B-splines), and some fully parametric (such as the logistic-Weibull) models are supported.
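
    A minimal sketch of the logistic-Weibull form mentioned above: a logistic term gives the probability of prevalent (already present) disease at baseline, a Weibull term gives incidence over follow-up, and the absolute risk by time t is their mixture. The parameter values are illustrative, and this sketch is not the R package's API.

```python
# Logistic-Weibull prevalence-incidence mixture: absolute risk by time t is the
# prevalence probability plus incident risk among those not prevalent.
import math

def absolute_risk(t, x, alpha=(-3.0, 0.8), weib_scale=15.0, weib_shape=1.5):
    """P(disease detected by time t | covariate x) under a logistic-Weibull mixture."""
    prevalence = 1.0 / (1.0 + math.exp(-(alpha[0] + alpha[1] * x)))   # logistic prevalence term
    incidence = 1.0 - math.exp(-((t / weib_scale) ** weib_shape))     # Weibull incidence CDF
    return prevalence + (1.0 - prevalence) * incidence

for t in (1, 5, 10):
    print(f"{t}-year absolute risk (x = 1): {absolute_risk(t, x=1):.3f}")
```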

  5. Application of the Beck model to stock markets: Value-at-Risk and portfolio risk assessment

    NASA Astrophysics Data System (ADS)

    Kozaki, M.; Sato, A.-H.

    2008-02-01

    We apply the Beck model, developed for turbulent systems that exhibit scaling properties, to stock markets. Our study reveals that the Beck model elucidates the properties of stock market returns and is applicable to practical uses such as Value-at-Risk estimation and portfolio analysis. We perform empirical analysis with daily/intraday data of the S&P500 index return and find that the volatility fluctuation of real markets is consistent with the assumptions of the Beck model: the volatility fluctuates on a much longer time scale than the return itself, and the inverse of variance, or “inverse temperature”, β obeys a Γ-distribution. As predicted by the Beck model, the distribution of returns is well fitted by the q-Gaussian distribution of Tsallis statistics. The evaluation method of Value-at-Risk (VaR), one of the most significant indicators in risk management, is studied for the q-Gaussian distribution. Our proposed method enables VaR evaluation that takes account of tail risk, which is underestimated by the variance-covariance method. A framework of portfolio risk assessment under the existence of tail risk is considered. We propose a multi-asset model with a single volatility fluctuation shared by all assets, named the single β model, and empirically examine the agreement between the model and an imaginary portfolio with Dow Jones indices. It turns out that the single β model gives a good approximation for portfolios composed of assets with non-Gaussian and correlated returns.
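
    Because a q-Gaussian is equivalent to a Student-t distribution (with nu = (3 - q)/(q - 1)), the tail-aware VaR idea above can be sketched by fitting a Student-t to returns and comparing its quantile with the variance-covariance (normal) VaR. The simulated returns below stand in for market data; this is an illustration of the general idea, not the authors' exact method.

```python
# Compare Gaussian (variance-covariance) VaR with VaR from a fitted Student-t,
# the distribution equivalent to a q-Gaussian, on simulated heavy-tailed returns.
from scipy import stats

returns = stats.t.rvs(df=4, scale=0.01, size=5000, random_state=5)  # stand-in daily returns

alpha = 0.01  # 99% one-day VaR
# Variance-covariance (Gaussian) VaR: tends to underestimate heavy tails.
var_normal = -(returns.mean() + returns.std() * stats.norm.ppf(alpha))
# Heavy-tailed VaR from a fitted Student-t.
df_fit, loc_fit, scale_fit = stats.t.fit(returns)
var_t = -stats.t.ppf(alpha, df_fit, loc=loc_fit, scale=scale_fit)
print(f"99% VaR: normal = {var_normal:.4f}, Student-t = {var_t:.4f}")
```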

  6. Improving default risk prediction using Bayesian model uncertainty techniques.

    PubMed

    Kazemi, Reza; Mosleh, Ali

    2012-11-01

    Credit risk is the potential exposure of a creditor to an obligor's failure or refusal to repay the debt in principal or interest. The potential of exposure is measured in terms of probability of default. Many models have been developed to estimate credit risk, with rating agencies dating back to the 19th century. They provide their assessment of probability of default and transition probabilities of various firms in their annual reports. Regulatory capital requirements for credit risk outlined by the Basel Committee on Banking Supervision have made it essential for banks and financial institutions to develop sophisticated models in an attempt to measure credit risk with higher accuracy. The Bayesian framework proposed in this article uses the techniques developed in physical sciences and engineering for dealing with model uncertainty and expert accuracy to obtain improved estimates of credit risk and associated uncertainties. The approach uses estimates from one or more rating agencies and incorporates their historical accuracy (past performance data) in estimating future default risk and transition probabilities. Several examples demonstrate that the proposed methodology can assess default probability with accuracy exceeding the estimations of all the individual models. Moreover, the methodology accounts for potentially significant departures from "nominal predictions" due to "upsetting events" such as the 2008 global banking crisis.
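
    One simple way to combine an agency's stated default probability with its historical performance, in the spirit of the approach described above, is a Beta-Binomial update; the counts and prior strength below are invented for illustration, and the sketch is not the authors' full framework.

```python
# Beta-Binomial update of a default probability: an agency's stated PD forms the
# prior, and its historical default counts for that rating grade are the evidence.
from scipy import stats

stated_pd = 0.02                     # agency's stated 1-year default probability (assumed)
prior_strength = 50                  # pseudo-observations expressing trust in the agency
alpha0, beta0 = stated_pd * prior_strength, (1 - stated_pd) * prior_strength

defaults, firms = 9, 300             # hypothetical historical performance for this grade
alpha_post, beta_post = alpha0 + defaults, beta0 + firms - defaults

posterior = stats.beta(alpha_post, beta_post)
print(f"posterior mean PD = {posterior.mean():.3f}, "
      f"95% interval = ({posterior.ppf(0.025):.3f}, {posterior.ppf(0.975):.3f})")
```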

  7. Crisis and emergency risk communication as an integrative model.

    PubMed

    Reynolds, Barbara; W Seeger, Matthew

    2005-01-01

    This article describes a model of communication known as crisis and emergency risk communication (CERC). The model is outlined as a merger of many traditional notions of health and risk communication with work in crisis and disaster communication. The specific kinds of communication activities that should be called for at various stages of disaster or crisis development are outlined. Although crises are by definition uncertain, equivocal, and often chaotic situations, the CERC model is presented as a tool health communicators can use to help manage these complex events.

  8. Empirical Analysis of Farm Credit Risk under the Structure Model

    ERIC Educational Resources Information Center

    Yan, Yan

    2009-01-01

    The study measures farm credit risk by using farm records collected by Farm Business Farm Management (FBFM) during the period 1995-2004. The study addresses the following questions: (1) whether farm's financial position is fully described by the structure model, (2) what are the determinants of farm capital structure under the structure model, (3)…

  9. Dental caries: an updated medical model of risk assessment.

    PubMed

    Kutsch, V Kim

    2014-04-01

    Dental caries is a transmissible, complex biofilm disease that creates prolonged periods of low pH in the mouth, resulting in a net mineral loss from the teeth. Historically, the disease model for dental caries consisted of mutans streptococci and Lactobacillus species, and the dental profession focused on restoring the lesions/damage from the disease by using a surgical model. The current recommendation is to implement a risk-assessment-based medical model called CAMBRA (caries management by risk assessment) to diagnose and treat dental caries. Unfortunately, many of the suggestions of CAMBRA have been overly complicated and confusing for clinicians. The risk of caries, however, is usually related to just a few common factors, and these factors result in common patterns of disease. This article examines the biofilm model of dental caries, identifies the common disease patterns, and discusses their targeted therapeutic strategies to make CAMBRA more easily adaptable for the privately practicing professional.

  10. Modeling of Flood Risk for the Continental United States

    NASA Astrophysics Data System (ADS)

    Lohmann, D.; Li, S.; Katz, B.; Goteti, G.; Kaheil, Y. H.; Vojjala, R.

    2011-12-01

    The science of catastrophic risk modeling helps people to understand the physical and financial implications of natural catastrophes (hurricanes, floods, earthquakes, etc.), terrorism, and the risks associated with changes in life expectancy. As such it depends on simulation techniques that integrate multiple disciplines such as meteorology, hydrology, structural engineering, statistics, computer science, financial engineering, actuarial science, and more in virtually every field of technology. In this talk we will explain the techniques and underlying assumptions of building the RMS US flood risk model. We pay particular attention to correlation (spatial and temporal), simulation and uncertainty in each of the various components in the development process. Recent extreme floods (e.g. the US Midwest flood of 2008 and the US Northeast flood of 2010) have increased concern about flood risk. Consequently, there is a growing need to adequately assess the flood risk. The RMS flood hazard model comprises three major components. (1) A stochastic precipitation simulation module based on a Monte-Carlo analogue technique, which is capable of producing correlated rainfall events for the continental US. (2) A rainfall-runoff and routing module. A semi-distributed rainfall-runoff model was developed to properly assess the antecedent conditions and determine the saturation area and runoff. The runoff is further routed downstream along the rivers by a routing model. Combined with the precipitation model, it allows us to correlate the streamflow, and hence flooding, from different rivers, as well as low and high return periods across the continental US. (3) A flood inundation module. It transforms the discharge (output from the flow routing) into water level, which is further combined with a two-dimensional off-floodplain inundation model to produce a comprehensive flood hazard map. The performance of the model is demonstrated by comparison to observations and published data. Output from

  11. Model for Solar Proton Risk Assessment

    NASA Technical Reports Server (NTRS)

    Xapos, M. A.; Stauffer, C.; Gee, G. B.; Barth, J. L.; Stassinopoulos, E. G.; McGuire, R. E.

    2004-01-01

    A statistical model for cumulative solar proton event fluences during space missions is presented that covers both the solar minimum and solar maximum phases of the solar cycle. It is based on data from the IMP and GOES series of satellites that are integrated together to take advantage of the best features of each data set. This allows fluence-energy spectra to be extended out to energies of 327 MeV.

  12. Parametric Estimation in a Recurrent Competing Risks Model.

    PubMed

    Taylor, Laura L; Peña, Edsel A

    2013-01-01

    A resource-efficient approach to making inferences about the distributional properties of the failure times in a competing risks setting is presented. Efficiency is gained by observing recurrences of the competing risks over a random monitoring period. The resulting model is called the recurrent competing risks model (RCRM) and is coupled with two repair strategies whenever the system fails. Maximum likelihood estimators of the parameters of the marginal distribution functions associated with each of the competing risks and also of the system lifetime distribution function are presented. Estimators are derived under perfect and partial repair strategies. Consistency and asymptotic properties of the estimators are obtained. The estimation methods are applied to a data set of failures for cars under warranty. Simulation studies are used to ascertain the small sample properties and the efficiency gains of the resulting estimators.
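
    As a simple illustration of estimating cause-specific rates from recurrent failures, the sketch below computes maximum likelihood hazard rates under the common assumption of exponentially distributed competing risks and perfect repair; the function, its inputs and the example data are hypothetical and are not the estimators derived in the paper.

        # Illustrative sketch (not the paper's estimators): MLE of cause-specific
        # hazard rates for exponentially distributed competing risks observed
        # repeatedly under a perfect-repair strategy.
        def competing_risk_rates(event_counts, total_time):
            """event_counts: mapping of risk label -> number of failures of that type
            total_time  : total observed operating time across all monitored systems"""
            rates = {cause: n / total_time for cause, n in event_counts.items()}
            system_rate = sum(rates.values())   # system lifetime is exponential with the summed rate
            return rates, system_rate

        # Hypothetical warranty-style data: 7 engine and 3 electrical failures over 520 car-years.
        print(competing_risk_rates({"engine": 7, "electrical": 3}, total_time=520.0))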

  13. Absolute Antenna Calibration at the US National Geodetic Survey

    NASA Astrophysics Data System (ADS)

    Mader, G. L.; Bilich, A. L.

    2012-12-01

    Geodetic GNSS applications routinely demand millimeter precision and extremely high levels of accuracy. To achieve these accuracies, measurement and instrument biases at the centimeter to millimeter level must be understood. One of these biases is the antenna phase center, the apparent point of signal reception for a GNSS antenna. It has been well established that phase center patterns differ between antenna models and manufacturers; additional research suggests that the addition of a radome or the choice of antenna mount can significantly alter those a priori phase center patterns. For the more demanding GNSS positioning applications and especially in cases of mixed-antenna networks, it is all the more important to know antenna phase center variations as a function of both elevation and azimuth in the antenna reference frame and incorporate these models into analysis software. Determination of antenna phase center behavior is known as "antenna calibration". Since 1994, NGS has computed relative antenna calibrations for more than 350 antennas. In recent years, the geodetic community has moved to absolute calibrations - the IGS adopted absolute antenna phase center calibrations in 2006 for use in their orbit and clock products, and NGS's CORS group began using absolute antenna calibration upon the release of the new CORS coordinates in IGS08 epoch 2005.00 and NAD 83(2011,MA11,PA11) epoch 2010.00. Although NGS relative calibrations can be and have been converted to absolute, it is considered best practice to independently measure phase center characteristics in an absolute sense. Consequently, NGS has developed and operates an absolute calibration system. These absolute antenna calibrations accommodate the demand for greater accuracy and for 2-dimensional (elevation and azimuth) parameterization. NGS will continue to provide calibration values via the NGS web site www.ngs.noaa.gov/ANTCAL, and will publish calibrations in the ANTEX format as well as the legacy ANTINFO

  14. Model based climate information on drought risk in Africa

    NASA Astrophysics Data System (ADS)

    Calmanti, S.; Syroka, J.; Jones, C.; Carfagna, F.; Dell'Aquila, A.; Hoefsloot, P.; Kaffaf, S.; Nikulin, G.

    2012-04-01

    The United Nations World Food Programme (WFP) has embarked upon the endeavor of creating a sustainable Africa-wide natural disaster risk management system. A fundamental building block of this initiative is the setup of a drought impact modeling platform called Africa Risk-View that aims to quantify and monitor weather-related food security risk in Africa. The modeling approach is based the Water Requirement Satisfaction Index (WRSI), as the fundamental indicator of the performances of agriculture and uses historical records of food assistance operation to project future potential needs for livelihood protection. By using climate change scenarios as an input to Africa Risk-View it is possible, in principles, to evaluate the future impact of climate variability on critical issues such as food security and the overall performance of the envisaged risk management system. A necessary preliminary step to this challenging task is the exploration of the sources of uncertainties affecting the assessment based on modeled climate change scenarios. For this purpose, a limited set of climate models have been selected in order verify the relevance of using climate model output data with Africa Risk-View and to explore a minimal range of possible sources of uncertainty. This first evaluation exercise started before the setup of the CORDEX framework and has relied on model output available at the time. In particular only one regional downscaling was available for the entire African continent from the ENSEMBLES project. The analysis shows that current coarse resolution global climate models can not directly feed into the Africa RiskView risk-analysis tool. However, regional downscaling may help correcting the inherent biases observed in the datasets. Further analysis is performed by using the first data available under the CORDEX framework. In particular, we consider a set of simulation driven with boundary conditions from the reanalysis ERA-Interim to evaluate the skill drought

  15. Credibility analysis of risk classes by generalized linear model

    NASA Astrophysics Data System (ADS)

    Erdemir, Ovgucan Karadag; Sucu, Meral

    2016-06-01

    In this paper the generalized linear model (GLM) and credibility theory, which are frequently used in nonlife insurance pricing, are combined for reliability analysis. Using the full credibility standard, the GLM is associated with the limited fluctuation credibility approach. Comparison criteria such as asymptotic variance and credibility probability are used to analyze the credibility of risk classes. An application is performed using one-year claim frequency data from a Turkish insurance company, and the results for credible risk classes are interpreted.
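
    The limited fluctuation (square-root) credibility rule mentioned above can be written out briefly; the sketch below blends a risk class's observed claim frequency with a GLM estimate, using an assumed full-credibility standard (90% probability of being within 5%) and invented inputs.

        # Minimal sketch of limited-fluctuation (square-root) credibility blended
        # with a GLM estimate; the standard (90% probability within 5%) and all
        # inputs below are illustrative assumptions.
        from statistics import NormalDist

        def credibility_weighted_frequency(observed_claims, exposures, glm_estimate,
                                           p=0.90, k=0.05):
            z = NormalDist().inv_cdf((1 + p) / 2)
            n_full = (z / k) ** 2                               # full-credibility standard (expected claims)
            Z = min(1.0, (observed_claims / n_full) ** 0.5)     # partial credibility weight
            class_mean = observed_claims / exposures
            return Z * class_mean + (1 - Z) * glm_estimate

        print(credibility_weighted_frequency(observed_claims=180, exposures=2400,
                                             glm_estimate=0.065))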

  16. Model-based benefit-risk assessment: can Archimedes help?

    PubMed

    Krishna, R

    2009-03-01

    In December 2008, the US Food and Drug Administration issued a new draft Guidance for Industry on Diabetes Mellitus--evaluating cardiovascular risk in new antidiabetic therapies to treat Type 2 diabetes. This guidance comes at a time when recent discussions have focused on delineation of cardiovascular risk reduction for new antidiabetic drugs. Computational tools that can enable early prediction of cardiovascular risk are reviewed with specific reference to Archimedes (Kaiser Permanente), with an aim of proposing a model-based solution and enabling decisions to be made as early as possible in the drug development value chain.

  17. Risk Prediction Models for Lung Cancer: A Systematic Review.

    PubMed

    Gray, Eoin P; Teare, M Dawn; Stevens, John; Archer, Rachel

    2016-03-01

    Many lung cancer risk prediction models have been published but there has been no systematic review or comprehensive assessment of these models to assess how they could be used in screening. We performed a systematic review of lung cancer prediction models and identified 31 articles that related to 25 distinct models, of which 11 considered epidemiological factors only and did not require a clinical input. Another 11 articles focused on models that required a clinical assessment such as a blood test or scan, and 8 articles considered the 2-stage clonal expansion model. More of the epidemiological models had been externally validated than the more recent clinical assessment models. Discrimination, the ability of a model to distinguish between cases and controls, varied, with areas under the curve between 0.57 and 0.879, as did calibration, the model's ability to assign an accurate probability to an individual. In our review we found that further validation studies need to be considered, especially for the Prostate, Lung, Colorectal, and Ovarian (PLCO) Cancer Screening Trial 2012 Model Version (PLCOM2012) and Hoggart models, which recorded the best overall performance. Future studies will need to focus on prediction rules, such as optimal risk thresholds, for models for selective screening trials. Only 3 validation studies considered prediction rules when validating the models, and overall the models were validated using varied tests in distinct populations, which made direct comparisons difficult. To improve this, multiple models need to be tested on the same data set with considerations for sensitivity, specificity, model accuracy, and positive predictive values at the optimal risk thresholds.

  18. Sensitivity Analysis of Launch Vehicle Debris Risk Model

    NASA Technical Reports Server (NTRS)

    Gee, Ken; Lawrence, Scott L.

    2010-01-01

    As part of an analysis of the loss of crew risk associated with an ascent abort system for a manned launch vehicle, a model was developed to predict the impact risk of the debris resulting from an explosion of the launch vehicle on the crew module. The model consisted of a debris catalog describing the number, size and imparted velocity of each piece of debris, a method to compute the trajectories of the debris and a method to calculate the impact risk given the abort trajectory of the crew module. The model provided a point estimate of the strike probability as a function of the debris catalog, the time of abort and the delay time between the abort and destruction of the launch vehicle. A study was conducted to determine the sensitivity of the strike probability to the various model input parameters and to develop a response surface model for use in the sensitivity analysis of the overall ascent abort risk model. The results of the sensitivity analysis and the response surface model are presented in this paper.
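
    A toy Monte Carlo version of the strike-probability idea is sketched below to show how such a point estimate depends on inputs like debris count, imparted-velocity spread and abort delay; the geometry, parameters and numbers are invented for illustration and are not NASA's model.

        # Toy Monte Carlo sketch of a debris strike probability as a function of a
        # few inputs (debris count, imparted-velocity spread, abort delay); the
        # geometry and numbers are invented and are not NASA's model.
        import random

        def strike_probability(n_debris, vel_sigma, delay, corridor=50.0, trials=20000):
            hits = 0
            for _ in range(trials):
                for _ in range(n_debris):
                    v = random.gauss(0.0, vel_sigma)      # lateral velocity imparted by the explosion
                    if abs(v) * delay < corridor:         # debris has not cleared the crew-module corridor
                        hits += 1
                        break
            return hits / trials

        for delay in (1.0, 3.0, 5.0):                     # longer delay lets debris disperse
            print(delay, strike_probability(n_debris=30, vel_sigma=40.0, delay=delay))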

  19. Risk Management Model in Surface Exploitation of Mineral Deposits

    NASA Astrophysics Data System (ADS)

    Stojanović, Cvjetko

    2016-06-01

    Risk management is an integral part of all types of project management. One of the main tasks of pre-investment studies and other project documentation is to protect investment projects as much as possible against investment risks. Therefore, the provision and regulation of risk information ensure the identification of the probability of the emergence of adverse events, their forms, causes and consequences, and provide timely measures of protection against risks. This means that risk management involves a set of management methods and techniques used to reduce the possibility of adverse events and consequences occurring and thus increase the possibility of achieving the planned results with minimal losses. Investments in mining projects are of capital importance because they are very complex and therefore very risky, owing to the influence of internal and external factors and limitations arising from the socio-economic environment. Due to the lack of a risk management system, numerous organizations worldwide have suffered significant financial losses. It is therefore necessary for any organization to establish a risk management system as a structural element of the management system as a whole. This paper presents an approach to a risk management model for the project of opening a surface coal mine, developed based on studies of extensive scientific literature and the personal experience of the author, and which, with certain modifications, may find use in any investment project, both in the mining industry and in investment projects in other areas.

  20. Assessing risk factors for dental caries: a statistical modeling approach.

    PubMed

    Trottini, Mario; Bossù, Maurizio; Corridore, Denise; Ierardo, Gaetano; Luzzi, Valeria; Saccucci, Matteo; Polimeni, Antonella

    2015-01-01

    The problem of identifying potential determinants and predictors of dental caries is of key importance in caries research and it has received considerable attention in the scientific literature. From the methodological side, a broad range of statistical models is currently available to analyze dental caries indices (DMFT, dmfs, etc.). These models have been applied in several studies to investigate the impact of different risk factors on the cumulative severity of dental caries experience. However, in most cases (i) these studies focus on a very specific subset of risk factors; and (ii) in the statistical modeling only a few candidate models are considered and model selection is at best only marginally addressed. As a result, our understanding of the robustness of the statistical inferences with respect to the choice of the model is very limited; the richness of the set of statistical models available for analysis is only marginally exploited; and inferences could be biased due to the omission of potentially important confounding variables in the model's specification. In this paper we argue that these limitations can be overcome by considering a general class of candidate models and carefully exploring the model space using standard model selection criteria and measures of global fit and predictive performance of the candidate models. Strengths and limitations of the proposed approach are illustrated with a real data set. In our illustration the model space contains more than 2.6 million models, which requires inferences to be adjusted for 'optimism'.
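
    A minimal sketch of the "explore the model space with standard selection criteria" idea follows, ranking every candidate predictor subset of a Poisson GLM by AIC; the statsmodels-based code, the synthetic data and the variable names are assumptions for illustration, not the authors' analysis.

        # Sketch of exhaustive model-space exploration ranked by AIC for a Poisson
        # GLM of a caries count; statsmodels usage, synthetic data and variable
        # names are assumptions for illustration only.
        from itertools import combinations
        import numpy as np
        import pandas as pd
        import statsmodels.api as sm

        def best_models_by_aic(df, outcome, predictors, top=5):
            results = []
            for k in range(1, len(predictors) + 1):
                for subset in combinations(predictors, k):
                    X = sm.add_constant(df[list(subset)])
                    fit = sm.GLM(df[outcome], X, family=sm.families.Poisson()).fit()
                    results.append((round(fit.aic, 2), subset))
            return sorted(results)[:top]

        rng = np.random.default_rng(0)
        df = pd.DataFrame({"brushing": rng.integers(0, 3, 200),
                           "sugar_intake": rng.integers(0, 5, 200),
                           "fluoride": rng.integers(0, 2, 200)})
        df["dmft"] = rng.poisson(1 + 0.5 * df["sugar_intake"])      # synthetic caries index
        print(best_models_by_aic(df, "dmft", ["brushing", "sugar_intake", "fluoride"]))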

  1. NASA Space Radiation Program Integrative Risk Model Toolkit

    NASA Technical Reports Server (NTRS)

    Kim, Myung-Hee Y.; Hu, Shaowen; Plante, Ianik; Ponomarev, Artem L.; Sandridge, Chris

    2015-01-01

    NASA Space Radiation Program Element scientists have been actively involved in development of an integrative risk model toolkit that includes models for acute radiation risk and organ dose projection (ARRBOD), NASA space radiation cancer risk projection (NSCR), hemocyte dose estimation (HemoDose), the GCR event-based risk model code (GERMcode), relativistic ion tracks (RITRACKS), NASA radiation track image (NASARTI), and the On-Line Tool for the Assessment of Radiation in Space (OLTARIS). This session will introduce the components of the risk toolkit with an opportunity for hands-on demonstrations. Brief descriptions of each tool are: ARRBOD for organ dose projection and acute radiation risk calculation from exposure to a solar particle event; NSCR for projection of cancer risk from exposure to space radiation; HemoDose for retrospective dose estimation using multi-type blood cell counts; GERMcode for basic physical and biophysical properties of an ion beam, and biophysical and radiobiological properties of beam transport to the target in the NASA Space Radiation Laboratory beam line; RITRACKS for simulation of heavy ion and delta-ray track structure, radiation chemistry, DNA structure and DNA damage at the molecular scale; NASARTI for modeling the effects of space radiation on human cells and tissue by incorporating a physical model of tracks, cell nucleus, and DNA damage foci with image segmentation for automated counting; and OLTARIS, an integrated tool set utilizing HZETRN (High Charge and Energy Transport) intended to help scientists and engineers study the effects of space radiation on shielding materials, electronics, and biological systems.

  2. Tutorial: Parallel Computing of Simulation Models for Risk Analysis.

    PubMed

    Reilly, Allison C; Staid, Andrea; Gao, Michael; Guikema, Seth D

    2016-10-01

    Simulation models are widely used in risk analysis to study the effects of uncertainties on outcomes of interest in complex problems. Often, these models are computationally complex and time consuming to run. This latter point may be at odds with time-sensitive evaluations or may limit the number of parameters that are considered. In this article, we give an introductory tutorial focused on parallelizing simulation code to better leverage modern computing hardware, enabling risk analysts to better utilize simulation-based methods for quantifying uncertainty in practice. This article is aimed primarily at risk analysts who use simulation methods but do not yet utilize parallelization to decrease the computational burden of these models. The discussion is focused on conceptual aspects of embarrassingly parallel computer code and software considerations. Two complementary examples are shown using the languages MATLAB and R. A brief discussion of hardware considerations is located in the Appendix.
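
    A minimal example of the embarrassingly parallel pattern the tutorial describes is sketched below in Python (the article's own examples use MATLAB and R); the placeholder loss simulation and all numbers are invented.

        # Minimal embarrassingly parallel Monte Carlo in Python (the article's own
        # examples use MATLAB and R); the placeholder loss simulation is invented.
        import random
        from multiprocessing import Pool

        def one_replication(seed):
            rng = random.Random(seed)
            n_events = rng.randint(0, 5)                          # number of loss events this year
            return sum(rng.expovariate(1 / 100.0) for _ in range(n_events))

        if __name__ == "__main__":
            with Pool() as pool:                                  # one worker per CPU core by default
                losses = pool.map(one_replication, range(10000))  # independent replications run in parallel
            losses = sorted(losses)
            print("mean annual loss:", sum(losses) / len(losses))
            print("95th percentile:", losses[int(0.95 * len(losses))])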

  3. The use of ecosystem models in risk assessment

    SciTech Connect

    Starodub, M.E.; Miller, P.A.; Willes, R.F.

    1994-12-31

    Ecosystem models, when used in conjunction with available environmental effects monitoring data, enable informed decisions regarding actions that should be taken to manage ecological risks from areas of localized chemical loadings and accumulation. These models provide quantitative estimates of chemical concentrations in various environmental media. The reliable application of these models as predictive tools for environmental assessment requires a thorough understanding of the theory and mathematical relationships described by the models and demands rigorous validation of input data and model results with field and laboratory data. Food chain model selection should be based on the ability to best simulate the interactions of the food web and the processes governing the transfer of chemicals from the dissolved and particulate phases to various trophic levels for the site in question. This requires that the user be familiar with the theories on which these models are based, and be aware of the merits and shortcomings of each prior to attempting to model food chain accumulation. Questions to be asked include: are all potential exposure pathways addressed? Are omitted pathways critical to the risk assessment process? Is the model flexible? To answer these questions one must consider the chemical(s) of concern, site-specific ecosystem characteristics, the dietary habits of the risk assessment receptor (aquatic, wildlife, human), and the influence of effluent characteristics on food chain dynamics.

  4. Absolute calibration of optical tweezers

    SciTech Connect

    Viana, N.B.; Mazolli, A.; Maia Neto, P.A.; Nussenzveig, H.M.; Rocha, M.S.; Mesquita, O.N.

    2006-03-27

    As a step toward absolute calibration of optical tweezers, a first-principles theory of trapping forces with no adjustable parameters, corrected for spherical aberration, is experimentally tested. Employing two very different setups, we find generally very good agreement for the transverse trap stiffness as a function of microsphere radius for a broad range of radii, including the values employed in practice, and at different sample chamber depths. The domain of validity of the WKB ('geometrical optics') approximation to the theory is verified. Theoretical predictions for the trapping threshold, peak position, depth variation, multiple equilibria, and 'jump' effects are also confirmed.

  5. Physics-based Entry, Descent and Landing Risk Model

    NASA Technical Reports Server (NTRS)

    Gee, Ken; Huynh, Loc C.; Manning, Ted

    2014-01-01

    A physics-based risk model was developed to assess the risk associated with thermal protection system failures during the entry, descent and landing phase of a manned spacecraft mission. In the model, entry trajectories were computed using a three-degree-of-freedom trajectory tool, the aerothermodynamic heating environment was computed using an engineering-level computational tool and the thermal response of the TPS material was modeled using a one-dimensional thermal response tool. The model was capable of modeling the effect of micrometeoroid and orbital debris impact damage on the TPS thermal response. A Monte Carlo analysis was used to determine the effects of uncertainties in the vehicle state at Entry Interface, aerothermodynamic heating and material properties on the performance of the TPS design. The failure criterion was set as a temperature limit at the bondline between the TPS and the underlying structure. Both direct computation and response surface approaches were used to compute the risk. The model was applied to a generic manned space capsule design. The effects of material property uncertainty and MMOD damage on the risk of failure were analyzed. A comparison of the direct computation and response surface approaches was undertaken.

  6. Evaluating biomarkers to model cancer risk post cosmic ray exposure.

    PubMed

    Sridharan, Deepa M; Asaithamby, Aroumougame; Blattnig, Steve R; Costes, Sylvain V; Doetsch, Paul W; Dynan, William S; Hahnfeldt, Philip; Hlatky, Lynn; Kidane, Yared; Kronenberg, Amy; Naidu, Mamta D; Peterson, Leif E; Plante, Ianik; Ponomarev, Artem L; Saha, Janapriya; Snijders, Antoine M; Srinivasan, Kalayarasan; Tang, Jonathan; Werner, Erica; Pluth, Janice M

    2016-06-01

    Robust predictive models are essential to manage the risk of radiation-induced carcinogenesis. Chronic exposure to cosmic rays in the context of the complex deep space environment may place astronauts at high cancer risk. To estimate this risk, it is critical to understand how radiation-induced cellular stress impacts cell fate decisions and how this in turn alters the risk of carcinogenesis. Exposure to the heavy ion component of cosmic rays triggers a multitude of cellular changes, depending on the rate of exposure, the type of damage incurred and individual susceptibility. Heterogeneity in dose, dose rate, radiation quality, energy and particle flux contribute to the complexity of risk assessment. To unravel the impact of each of these factors, it is critical to identify sensitive biomarkers that can serve as inputs for robust modeling of individual risk of cancer or other long-term health consequences of exposure. Limitations in sensitivity of biomarkers to dose and dose rate, and the complexity of longitudinal monitoring, are some of the factors that increase uncertainties in the output from risk prediction models. Here, we critically evaluate candidate early and late biomarkers of radiation exposure and discuss their usefulness in predicting cell fate decisions. Some of the biomarkers we have reviewed include complex clustered DNA damage, persistent DNA repair foci, reactive oxygen species, chromosome aberrations and inflammation. Other biomarkers discussed, often assayed for at longer points post exposure, include mutations, chromosome aberrations, reactive oxygen species and telomere length changes. We discuss the relationship of biomarkers to different potential cell fates, including proliferation, apoptosis, senescence, and loss of stemness, which can propagate genomic instability and alter tissue composition and the underlying mRNA signatures that contribute to cell fate decisions. Our goal is to highlight factors that are important in choosing

  8. Cryptosporidium Infection Risk: Results of New Dose-Response Modeling.

    PubMed

    Messner, Michael J; Berger, Philip

    2016-10-01

    Cryptosporidium human dose-response data from seven species/isolates are used to investigate six models of varying complexity that estimate infection probability as a function of dose. Previous models attempt to explicitly account for virulence differences among C. parvum isolates, using three or six species/isolates. Four (two new) models assume species/isolate differences are insignificant and three of these (all but the exponential) allow for variable human susceptibility. These three human-focused models (fractional Poisson, exponential with immunity and beta-Poisson) are relatively simple yet fit the data significantly better than the more complex isolate-focused models. Among these three, the one-parameter fractional Poisson model is the simplest but assumes that all Cryptosporidium oocysts used in the studies were capable of initiating infection. The exponential with immunity model does not require such an assumption and includes the fractional Poisson as a special case. The fractional Poisson model is an upper bound of the exponential with immunity model and applies when all oocysts are capable of initiating infection. The beta-Poisson model does not allow an immune human subpopulation; thus infection probability approaches 100% as dose becomes huge. All three of these models predict significantly (>10x) greater risk at the low doses that consumers might receive if exposed through drinking water or other environmental exposure (e.g., 72% vs. 4% infection probability for a one-oocyst dose) than previously predicted. This new insight into Cryptosporidium risk suggests additional inactivation and removal via treatment may be needed to meet any specified risk target, such as a suggested 10^-4 annual risk of Cryptosporidium infection.
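
    For reference, the three human-focused dose-response forms named above are written out below as commonly parameterized, together with the approximate beta-Poisson form; the parameter values are invented placeholders rather than the article's fitted estimates.

        # The three human-focused dose-response forms, as commonly parameterized;
        # the parameter values below are invented placeholders, not fitted estimates.
        import math

        def exponential(dose, r):
            return 1.0 - math.exp(-r * dose)

        def fractional_poisson(dose, p):
            return p * (1.0 - math.exp(-dose))        # p = susceptible fraction; every oocyst assumed infectious

        def exponential_with_immunity(dose, p, r):
            return p * (1.0 - math.exp(-r * dose))    # reduces to the fractional Poisson form when r = 1

        def beta_poisson(dose, alpha, beta):
            return 1.0 - (1.0 + dose / beta) ** (-alpha)   # approximate form; approaches 1 at huge doses

        for d in (1, 10, 100):
            print(d, round(exponential(d, r=0.04), 3), round(fractional_poisson(d, p=0.7), 3))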

  9. Building a Better Model: A Comprehensive Breast Cancer Risk Model Incorporating Breast Density to Stratify Risk and Apply Resources

    DTIC Science & Technology

    2014-10-01

    assessment model that includes automated measurement of breast density. Scope: Assemble a cohort of women with known breast cancer risk factors and...digital mammogram files for women diagnosed with breast cancer using existing data sources and match them to controls (Harvey/Knaus). Validate and...density will translate to changes in breast cancer risk. Therefore, noise in measurement should be minimal. Thirty women were recruited under this

  10. A dynamical systems model for nuclear power plant risk

    NASA Astrophysics Data System (ADS)

    Hess, Stephen Michael

    The recent transition to an open access generation marketplace has forced nuclear plant operators to become much more cost conscious and focused on plant performance. Coincidentally, the regulatory perspective also is in a state of transition from a command and control framework to one that is risk-informed and performance-based. Due to these structural changes in the economics and regulatory system associated with commercial nuclear power plant operation, there is an increased need for plant management to explicitly manage nuclear safety risk. Application of probabilistic risk assessment techniques to model plant hardware has provided a significant contribution to understanding the potential initiating events and equipment failures that can lead to core damage accidents. Application of the lessons learned from these analyses has supported improved plant operation and safety over the previous decade. However, this analytical approach has not been nearly as successful in addressing the impact of plant processes and management effectiveness on the risks of plant operation. Thus, the research described in this dissertation presents a different approach to address this issue. Here we propose a dynamical model that describes the interaction of important plant processes among themselves and their overall impact on nuclear safety risk. We first provide a review of the techniques that are applied in a conventional probabilistic risk assessment of commercially operating nuclear power plants and summarize the typical results obtained. The limitations of the conventional approach and the status of research previously performed to address these limitations also are presented. Next, we present the case for the application of an alternative approach using dynamical systems theory. This includes a discussion of previous applications of dynamical models to study other important socio-economic issues. Next, we review the analytical techniques that are applicable to analysis of

  11. A quality risk management model approach for cell therapy manufacturing.

    PubMed

    Lopez, Fabio; Di Bartolo, Chiara; Piazza, Tommaso; Passannanti, Antonino; Gerlach, Jörg C; Gridelli, Bruno; Triolo, Fabio

    2010-12-01

    International regulatory authorities view risk management as an essential production need for the development of innovative, somatic cell-based therapies in regenerative medicine. The available risk management guidelines, however, provide little guidance on specific risk analysis approaches and procedures applicable in clinical cell therapy manufacturing. This raises a number of problems. Cell manufacturing is a poorly automated process, prone to operator-introduced variations, and affected by heterogeneity of the processed organs/tissues and lot-dependent variability of reagent (e.g., collagenase) efficiency. In this study, the principal challenges faced in a cell-based product manufacturing context (i.e., high dependence on human intervention and absence of reference standards for acceptable risk levels) are identified and addressed, and a risk management model approach applicable to manufacturing of cells for clinical use is described for the first time. The use of the heuristic and pseudo-quantitative failure mode and effect analysis/failure mode and critical effect analysis risk analysis technique associated with direct estimation of severity, occurrence, and detection is, in this specific context, as effective as, but more efficient than, the analytic hierarchy process. Moreover, a severity/occurrence matrix and Pareto analysis can be successfully adopted to identify priority failure modes on which to act to mitigate risks. The application of this approach to clinical cell therapy manufacturing in regenerative medicine is also discussed.
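
    The severity/occurrence/detection scoring and Pareto ranking described above can be sketched briefly; the failure modes and scores below are invented examples, not the study's actual analysis.

        # Invented illustration of severity/occurrence/detection scoring with a
        # Pareto ranking of risk priority numbers (RPN = S x O x D).
        failure_modes = [
            # (failure mode, severity 1-10, occurrence 1-10, detection 1-10)
            ("collagenase lot under-digests tissue", 7, 5, 4),
            ("operator mislabels cell lot",          9, 2, 3),
            ("incubator temperature drift",          6, 3, 2),
            ("microbial contamination",             10, 2, 5),
        ]

        scored = sorted(((name, s * o * d) for name, s, o, d in failure_modes),
                        key=lambda x: x[1], reverse=True)
        total = sum(rpn for _, rpn in scored)
        cumulative = 0
        for name, rpn in scored:                     # act first on the top contributors
            cumulative += rpn
            print(f"{name:40s} RPN={rpn:4d}  cumulative share={cumulative / total:5.1%}")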

  12. Gambler Risk Perception: A Mental Model and Grounded Theory Analysis.

    PubMed

    Spurrier, Michael; Blaszczynski, Alexander; Rhodes, Paul

    2015-09-01

    Few studies have investigated how gamblers perceive risk or the role of risk perception in disordered gambling. The purpose of the current study therefore was to obtain data on lay gamblers' beliefs on these variables and their effects on decision-making, behaviour, and disordered gambling aetiology. Fifteen regular lay gamblers (non-problem/low risk, moderate risk and problem gamblers) completed a semi-structured interview following mental models and grounded theory methodologies. Gambler interview data was compared to an expert 'map' of risk-perception, to identify comparative gaps or differences associated with harmful or safe gambling. Systematic overlapping processes of data gathering and analysis were used to iteratively extend, saturate, test for exception, and verify concepts and themes emerging from the data. The preliminary findings suggested that gambler accounts supported the presence of expert conceptual constructs, and to some degree the role of risk perception in protecting against or increasing vulnerability to harm and disordered gambling. Gambler accounts of causality, meaning, motivation, and strategy were highly idiosyncratic, and often contained content inconsistent with measures of disordered gambling. Disordered gambling appears heavily influenced by relative underestimation of risk and overvaluation of gambling, based on explicit and implicit analysis, and deliberate, innate, contextual, and learned processing evaluations and biases.

  13. Assessing Landslide Risk Areas Using Statistical Models and Land Cover

    NASA Astrophysics Data System (ADS)

    Kim, H. G.; Lee, D. K.; Park, C.; Ahn, Y.; Sung, S.; Park, J. H.

    2015-12-01

    Recently, damage due to landslides has increased in the Republic of Korea. Extreme weather events related to climate change, such as typhoons and heavy rainfall, are the main factor in this damage. Inje-gun, Gangwon-do, in particular suffered severe landslide damage in 2006 and 2007. In Inje-gun, 91% of the area is forest; therefore, many land covers related to human activities are adjacent to forest land. Thus, the establishment of adaptation plans for landslides was urgently needed. Landslide risk assessment can provide good information to policy makers. The objective of this study was to assess landslide risk areas to support the establishment of adaptation plans to reduce landslide damage. Statistical distribution models (SDMs) were used to evaluate the probability of landslide occurrence. Various SDMs were used to make landslide probability maps considering the uncertainty of the SDMs. The types of land cover were classified into 5 grades considering their vulnerability to landslides. The landslide probability maps were overlaid with the land cover map to calculate landslide risk. As a result of the overlay analysis, landslide risk areas were derived. Agricultural areas and transportation areas in particular showed high risk and occupied large areas in the risk map. In conclusion, policy makers in Inje-gun must consider the landslide risk map to establish adaptation plans effectively.

  14. A Model for Risk Analysis of Oil Tankers

    NASA Astrophysics Data System (ADS)

    Montewka, Jakub; Krata, Przemysław; Goerland, Floris; Kujala, Pentti

    2010-01-01

    The paper presents a model for risk analysis of marine traffic, with emphasis on the two most common types of marine accidents: collision and grounding. The focus is on oil tankers as these pose the highest environmental risk. A case study in selected areas of the Gulf of Finland in ice-free conditions is presented. The model utilizes a well-founded formula for risk calculation, which combines the probability of an unwanted event with its consequences. Thus the model is regarded as a block-type model, consisting of blocks for estimating the probability of collision and grounding, respectively, as well as blocks for modelling the consequences of an accident. The probability of a vessel colliding is assessed by means of a Minimum Distance To Collision (MDTC) based model. The model defines the collision zone in a novel way, using a mathematical ship motion model, and recognizes traffic flow as a non-homogeneous process. The presented calculations address the waterway crossing between Helsinki and Tallinn, where dense cross traffic is observed during certain hours. For the assessment of grounding probability, a new approach is proposed, which utilizes a newly developed model where spatial interactions between objects in different locations are recognized. A ship on a seaway and navigational obstructions may be perceived as interacting objects and their repulsion may be modelled by a sort of deterministic formulation. Risk due to tankers running aground addresses an approach fairway to an oil terminal in Sköldvik, near Helsinki. The consequences of an accident are expressed in monetary terms, and concern the costs of an oil spill, based on statistics of compensations claimed from the International Oil Pollution Compensation Funds (IOPC Funds) by parties involved.
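
    The block-type risk formula (probability of an unwanted event combined with its consequences) can be written compactly; the sketch below uses invented probabilities and spill costs, not the paper's Gulf of Finland results.

        # Compact block-type risk calculation: accident probability times monetary
        # consequence; all figures are invented placeholders.
        def expected_annual_loss(p_collision, p_grounding, cost_collision, cost_grounding):
            return p_collision * cost_collision + p_grounding * cost_grounding

        loss = expected_annual_loss(p_collision=2e-3, p_grounding=1e-3,
                                    cost_collision=40e6, cost_grounding=25e6)
        print(f"expected annual loss: {loss / 1e6:.2f} million EUR")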

  15. A critical evaluation of secondary cancer risk models applied to Monte Carlo dose distributions of 2-dimensional, 3-dimensional conformal and hybrid intensity-modulated radiation therapy for breast cancer

    NASA Astrophysics Data System (ADS)

    Joosten, A.; Bochud, F.; Moeckli, R.

    2014-08-01

    The comparison of radiotherapy techniques regarding secondary cancer risk has yielded contradictory results, possibly stemming from the many different approaches used to estimate risk. The purpose of this study was to make a comprehensive evaluation of different available risk models applied to detailed whole-body dose distributions computed by Monte Carlo for various breast radiotherapy techniques including conventional open tangents, 3D conformal wedged tangents and hybrid intensity modulated radiation therapy (IMRT). First, organ-specific linear risk models developed by the International Commission on Radiological Protection (ICRP) and the Biological Effects of Ionizing Radiation (BEIR) VII committee were applied to mean doses for remote organs only and for all solid organs. Then, different general non-linear risk models were applied to the whole-body dose distribution. Finally, organ-specific non-linear risk models for the lung and breast were used to assess the secondary cancer risk for these two specific organs. A total of 32 different calculated absolute risks resulted in a broad range of values (between 0.1% and 48.5%), underlining the large uncertainties in absolute risk calculation. The ratio of risk between two techniques has often been proposed as a more robust assessment of risk than the absolute risk. We found that the ratio of risk between two techniques could also vary substantially depending on the approach to risk estimation. Sometimes the ratio of risk between two techniques would range between values smaller and larger than one, which translates into inconsistent results on whether one technique carries a higher risk than another. We found, however, that the hybrid IMRT technique resulted in a systematic reduction of risk compared to the other techniques investigated, even though the magnitude of this reduction varied substantially with the different approaches investigated. Based on the epidemiological data available, a reasonable

  16. Launch Vehicle Debris Models and Crew Vehicle Ascent Abort Risk

    NASA Technical Reports Server (NTRS)

    Gee, Ken; Lawrence, Scott

    2013-01-01

    For manned space launch systems, a reliable abort system is required to reduce the risks associated with a launch vehicle failure during ascent. Understanding the risks associated with failure environments can be achieved through the use of physics-based models of these environments. The debris field due to destruction of the launch vehicle is one such environment. To better analyze the risk posed by debris, a physics-based model for generating launch vehicle debris catalogs has been developed. The model predicts the mass distribution of the debris field based on formulae developed from analysis of explosions. Imparted velocity distributions are computed using a shock-physics code to model the explosions within the launch vehicle. A comparison of the debris catalog with an existing catalog for the Shuttle external tank shows good agreement in the debris characteristics and the predicted debris strike probability. The model is used to analyze the effects of the number of debris pieces and velocity distributions on the strike probability and risk.

  17. Development of a GCR Event-based Risk Model

    NASA Technical Reports Server (NTRS)

    Cucinotta, Francis A.; Ponomarev, Artem L.; Plante, Ianik; Carra, Claudio; Kim, Myung-Hee

    2009-01-01

    A goal at NASA is to develop event-based systems biology models of space radiation risks that will replace the current dose-based empirical models. Complex and varied biochemical signaling processes transmit the initial DNA and oxidative damage from space radiation into cellular and tissue responses. Mis-repaired damage or aberrant signals can lead to genomic instability, persistent oxidative stress or inflammation, which are causative of cancer and CNS risks. Protective signaling through adaptive responses or cell repopulation is also possible. We are developing a computational simulation approach to galactic cosmic ray (GCR) effects that is based on biological events rather than average quantities such as dose, fluence, or dose equivalent. The goal of the GCR Event-based Risk Model (GERMcode) is to provide a simulation tool to describe and integrate physical and biological events into stochastic models of space radiation risks. We used the quantum multiple scattering model of heavy ion fragmentation (QMSFRG) and well-known energy loss processes to develop a stochastic Monte-Carlo based model of GCR transport in spacecraft shielding and tissue. We validated the accuracy of the model by comparing to physical data from the NASA Space Radiation Laboratory (NSRL). Our simulation approach allows us to time-tag each GCR proton or heavy ion interaction in tissue, including correlated secondary ions often of high multiplicity. Conventional space radiation risk assessment employs average quantities, and assumes linearity and additivity of responses over the complete range of GCR charge and energies. To investigate possible deviations from these assumptions, we studied several biological response pathway models of varying induction and relaxation times including the ATM, TGF-β-Smad, and WNT signaling pathways. We then considered small volumes of interacting cells and the time-dependent biophysical events that the GCR would produce within these tissue volumes to estimate how

  18. Evaluation of Periodontal Risk in Adult Patients using Two Different Risk Assessment Models – A Pilot Study

    PubMed Central

    Bade, Shruthi; Bollepalli, Appaiah Chowdary; Katuri, Kishore Kumar; Devulapalli, Narasimha Swamy; Swarna, Chakrapani

    2015-01-01

    Objective: The aim of the present study was to evaluate the periodontal risk of individuals using the periodontal risk assessment (PRA) model and the modified PRA (mPRA) model. Materials and Methods: A total of 50 patients with chronic periodontitis, aged 30-60 years, were selected randomly, charting of the periodontal status was performed, and those who met the inclusion criteria were enrolled in the study. Parameters recorded were: percentage of sites with bleeding on probing (BOP), number of sites with pocket depths (PD) ≥ 5 mm, number of teeth lost, bone loss (BL)/age ratio, clinical attachment loss (CAL)/age ratio, diabetic and smoking status, dental status, and systemic factors such as diabetes. All the risk factors were plotted on radar charts in the PRA and mPRA models using Microsoft Excel, and periodontal risk was categorized as low, moderate or high. Results: Among the 50 patients, 31 were at low risk, 9 at moderate risk, and 10 at high risk as identified by the mPRA model, whereas 28 patients were at low risk, 13 at moderate risk and 9 at high risk as identified by the PRA model. Statistical analysis demonstrated that there was no significant difference between the risk scores (χ2 = 0.932, degrees of freedom = 2, P = 0.627). Conclusion: Both periodontal risk models are effective in evaluating the risk factors and can be useful tools for predicting proper diagnosis, disease progression and therapeutic strategies during supportive periodontal therapy. PMID:25859520

  19. Risk of second primary cancer following prostate cancer radiotherapy: DVH analysis using the competitive risk model

    NASA Astrophysics Data System (ADS)

    Takam, R.; Bezak, E.; Yeoh, E. E.

    2009-02-01

    This study aimed to estimate the risk of developing second primary cancer (SPC) corresponding to various radiation treatment techniques for prostate cancer. Estimation of SPC risk was done by analysing differential dose-volume histograms (DDVH) of normal tissues such as the rectum, bladder and urethra with the competitive risk model. Differential DVHs were obtained from treatment planning systems for external beam radiotherapy (EBRT), low-dose-rate (LDR) and high-dose-rate (HDR) brachytherapy techniques. The average risk of developing SPC was no greater than 0.6% for all treatment techniques but was lower with either LDR or HDR brachytherapy alone compared with any EBRT technique. For LDR and HDR brachytherapy alone, the risk of SPC for the rectum was 2.0 × 10^-4 % and 8.3 × 10^-5 % respectively, compared with 0.2% for EBRT using five-field 3D-CRT to a total dose of 74 Gy. Overall, the risk of developing SPC for the urethra following all radiation treatment techniques was very low compared with the rectum and bladder. Treatment plans which deliver equivalent doses of around 3-5 Gy to normal tissues were associated with higher risks of development of SPC.

  20. Achieving Climate Change Absolute Accuracy in Orbit

    NASA Technical Reports Server (NTRS)

    Wielicki, Bruce A.; Young, D. F.; Mlynczak, M. G.; Thome, K. J; Leroy, S.; Corliss, J.; Anderson, J. G.; Ao, C. O.; Bantges, R.; Best, F.; Bowman, K.; Brindley, H.; Butler, J. J.; Collins, W.; Dykema, J. A.; Doelling, D. R.; Feldman, D. R.; Fox, N.; Huang, X.; Holz, R.; Huang, Y.; Jennings, D.; Jin, Z.; Johnson, D. G.; Jucks, K.; Kato, S.; Kratz, D. P.; Liu, X.; Lukashin, C.; Mannucci, A. J.; Phojanamongkolkij, N.; Roithmayr, C. M.; Sandford, S.; Taylor, P. C.; Xiong, X.

    2013-01-01

    The Climate Absolute Radiance and Refractivity Observatory (CLARREO) mission will provide a calibration laboratory in orbit for the purpose of accurately measuring and attributing climate change. CLARREO measurements establish new climate change benchmarks with high absolute radiometric accuracy and high statistical confidence across a wide range of essential climate variables. CLARREO's inherently high absolute accuracy will be verified and traceable on orbit to Système Internationale (SI) units. The benchmarks established by CLARREO will be critical for assessing changes in the Earth system and climate model predictive capabilities for decades into the future as society works to meet the challenge of optimizing strategies for mitigating and adapting to climate change. The CLARREO benchmarks are derived from measurements of the Earth's thermal infrared spectrum (5-50 micron), the spectrum of solar radiation reflected by the Earth and its atmosphere (320-2300 nm), and radio occultation refractivity from which accurate temperature profiles are derived. The mission has the ability to provide new spectral fingerprints of climate change, as well as to provide the first orbiting radiometer with accuracy sufficient to serve as the reference transfer standard for other space sensors, in essence serving as a "NIST [National Institute of Standards and Technology] in orbit." CLARREO will greatly improve the accuracy and relevance of a wide range of space-borne instruments for decadal climate change. Finally, CLARREO has developed new metrics and methods for determining the accuracy requirements of climate observations for a wide range of climate variables and uncertainty sources. These methods should be useful for improving our understanding of observing requirements for most climate change observations.

  1. A risk management model for securing virtual healthcare communities.

    PubMed

    Chryssanthou, Anargyros; Varlamis, Iraklis; Latsiou, Charikleia

    2011-01-01

    Virtual healthcare communities aim to bring together healthcare professionals and patients, improve the quality of healthcare services and assist healthcare professionals and researchers in their everyday activities. In a secure and reliable environment, patients share their medical data with doctors, expect confidentiality and demand reliable medical consultation. Apart from a concrete policy framework, several ethical, legal and technical issues must be considered in order to build a trustful community. This research focuses on security issues which can arise inside a virtual healthcare community and relate to the communication and storage of data. It capitalises on a standardised risk management methodology and a prototype architecture for healthcare community portals, and justifies a security model that allows the identification, estimation and evaluation of potential security risks for the community. A hypothetical virtual healthcare community is employed in order to portray security risks and the solutions that the security model provides.

  2. The visual communication of risk.

    PubMed

    Lipkus, I M; Hollands, J G

    1999-01-01

    This paper 1) provides reasons why graphics should be effective aids to communicate risk; 2) reviews the use of visuals, especially graphical displays, to communicate risk; 3) discusses issues to consider when designing graphs to communicate risk; and 4) provides suggestions for future research. Key articles and materials were obtained from MEDLINE(R) and PsychInfo(R) databases, from reference article citations, and from discussion with experts in risk communication. Research has been devoted primarily to communicating risk magnitudes. Among the various graphical displays, the risk ladder appears to be a promising tool for communicating absolute and relative risks. Preliminary evidence suggests that people understand risk information presented in histograms and pie charts. Areas that need further attention include 1) applying theoretical models to the visual communication of risk, 2) testing which graphical displays can be applied best to different risk communication tasks (e.g., which graphs best convey absolute or relative risks), 3) communicating risk uncertainty, and 4) testing whether the lay public's perceptions and understanding of risk varies by graphical format and whether the addition of graphical displays improves comprehension substantially beyond numerical or narrative translations of risk and, if so, by how much. There is a need to ascertain the extent to which graphics and other visuals enhance the public's understanding of disease risk to facilitate decision-making and behavioral change processes. Nine suggestions are provided to help achieve these ends.

  3. A Dual System Model of Preferences under Risk

    ERIC Educational Resources Information Center

    Mukherjee, Kanchan

    2010-01-01

    This article presents a dual system model (DSM) of decision making under risk and uncertainty according to which the value of a gamble is a combination of the values assigned to it independently by the affective and deliberative systems. On the basis of research on dual process theories and empirical research in Hsee and Rottenstreich (2004) and…

  4. Field Evaluation of an Avian Risk Assessment Model

    EPA Science Inventory

    We conducted two laboratory subacute dietary toxicity tests and one outdoor subacute dietary toxicity test to determine the effectiveness of the U.S. Environmental Protection Agency's deterministic risk assessment model for evaluating the potential of adverse effects to birds in ...

  5. Effective Genetic-Risk Prediction Using Mixed Models

    PubMed Central

    Golan, David; Rosset, Saharon

    2014-01-01

    For predicting genetic risk, we propose a statistical approach that is specifically adapted to dealing with the challenges imposed by disease phenotypes and case-control sampling. Our approach (termed Genetic Risk Scores Inference [GeRSI]), combines the power of fixed-effects models (which estimate and aggregate the effects of single SNPs) and random-effects models (which rely primarily on whole-genome similarities between individuals) within the framework of the widely used liability-threshold model. We demonstrate in extensive simulation that GeRSI produces predictions that are consistently superior to current state-of-the-art approaches. When applying GeRSI to seven phenotypes from the Wellcome Trust Case Control Consortium (WTCCC) study, we confirm that the use of random effects is most beneficial for diseases that are known to be highly polygenic: hypertension (HT) and bipolar disorder (BD). For HT, there are no significant associations in the WTCCC data. The fixed-effects model yields an area under the ROC curve (AUC) of 54%, whereas GeRSI improves it to 59%. For BD, using GeRSI improves the AUC from 55% to 62%. For individuals ranked at the top 10% of BD risk predictions, using GeRSI substantially increases the BD relative risk from 1.4 to 2.5. PMID:25279982

  6. Application of Catastrophe Risk Modelling to Evacuation Public Policy

    NASA Astrophysics Data System (ADS)

    Woo, G.

    2009-04-01

    The decision by civic authorities to evacuate an area threatened by a natural hazard is especially fraught when the population in harm's way is extremely large, and where there is considerable uncertainty in the spatial footprint, scale, and strike time of a hazard event. Traditionally, this has been viewed as a hazard forecasting issue, and civic authorities turn to scientists for advice on a potentially imminent dangerous event. However, the level of scientific confidence varies enormously from one peril and crisis situation to another. With superior observational data, meteorological and hydrological hazards are generally better forecast than geological hazards. But even with Atlantic hurricanes, the track and intensity of a hurricane can change significantly within a few hours. This complicated and delayed the decision to call an evacuation of New Orleans when threatened by Hurricane Katrina, and would present a severe dilemma if a major hurricane appeared to be heading for New York. Evacuation needs to be perceived as a risk issue, requiring the expertise of catastrophe risk modellers as well as geoscientists. Faced with evidence of a great earthquake in the Indian Ocean in December 2004, seismologists were reluctant to give a tsunami warning without more direct sea observations. Yet, from a risk perspective, the risk to coastal populations would have warranted attempts at tsunami warning, even though there was significant uncertainty in the hazard forecast and a chance of a false alarm. A systematic, coherent risk-based framework for evacuation decision-making exists, which weighs the advantages of an evacuation call against the disadvantages. Implicitly and qualitatively, such a cost-benefit analysis is undertaken by civic authorities whenever an evacuation is considered. With the progress in catastrophe risk modelling, such an analysis can be made explicit and quantitative, providing a transparent audit trail for the decision process. A stochastic event set, the core of a
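
    A quantitative version of this cost-benefit framing is sketched below: call the evacuation when the expected losses avoided exceed the cost of evacuating; the decision rule and every input value are illustrative assumptions, not taken from the abstract.

        # Illustrative evacuation decision rule: evacuate when expected avoided
        # losses exceed the evacuation cost; every input here is an assumption.
        def should_evacuate(p_strike, people_at_risk, loss_per_person, evacuation_cost,
                            protection_factor=0.9):
            expected_benefit = p_strike * people_at_risk * loss_per_person * protection_factor
            return expected_benefit > evacuation_cost, expected_benefit

        decide, benefit = should_evacuate(p_strike=0.15, people_at_risk=5000,
                                          loss_per_person=7e6, evacuation_cost=2e9)
        print(decide, f"expected benefit {benefit / 1e9:.2f} billion vs 2.00 billion cost")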

  7. [Integrated Management Area of Vascular Risk: A new organisational model for global control of risk factors].

    PubMed

    Armario, P; Jericó, C; Vila, L; Freixa, R; Martin-Castillejos, C; Rotllan, M

    2016-11-17

    Cardiovascular disease (CVD) is a major cause of morbidity and mortality and increases the cost of care. Control of the main cardiovascular risk factors is currently poor, despite the availability of a good therapeutic arsenal. Improving this situation requires good coordination and multidisciplinary participation. The development of new organizational models such as the Integrated Management Area of Vascular Risk can facilitate therapeutic harmonization and unification of the health messages offered by different levels of care, based on clinical practice guidelines, in order to provide patient-centred integrated care.

  8. Simplified risk score models accurately predict the risk of major in-hospital complications following percutaneous coronary intervention.

    PubMed

    Resnic, F S; Ohno-Machado, L; Selwyn, A; Simon, D I; Popma, J J

    2001-07-01

    The objectives of this analysis were to develop and validate simplified risk score models for predicting the risk of major in-hospital complications after percutaneous coronary intervention (PCI) in the era of widespread stenting and use of glycoprotein IIb/IIIa antagonists. We then sought to compare the performance of these simplified models with those of full logistic regression and neural network models. From January 1, 1997 to December 31, 1999, data were collected on 4,264 consecutive interventional procedures at a single center. Risk score models were derived from multiple logistic regression models using the first 2,804 cases and then validated on the final 1,460 cases. The area under the receiver operating characteristic (ROC) curve for the risk score model that predicted death was 0.86 compared with 0.85 for the multiple logistic model and 0.83 for the neural network model (validation set). For the combined end points of death, myocardial infarction, or bypass surgery, the corresponding areas under the ROC curves were 0.74, 0.78, and 0.81, respectively. Previously identified risk factors were confirmed in this analysis. The use of stents was associated with a decreased risk of in-hospital complications. Thus, risk score models can accurately predict the risk of major in-hospital complications after PCI. Their discriminatory power is comparable to those of logistic models and neural network models. Accurate bedside risk stratification may be achieved with these simple models.
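
    As a rough illustration of the modelling strategy described above (derive a logistic model, collapse it to an integer risk score, and compare discrimination on a held-out validation set), here is a hedged Python sketch using simulated stand-in data; the covariates, event rate, and point scale are assumptions, not the study's actual variables or coefficients.

      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.metrics import roc_auc_score

      rng = np.random.default_rng(1)

      # simulated stand-ins for pre-procedural risk factors (age, shock, ...)
      n = 4264
      X = rng.normal(size=(n, 5))
      true_beta = np.array([0.8, 0.6, 0.4, 0.0, 0.0])        # assumed effects
      p = 1.0 / (1.0 + np.exp(-(-4.5 + X @ true_beta)))
      y = rng.random(n) < p                                   # in-hospital complication

      # temporal split mimicking derivation (first 2,804) / validation (last 1,460)
      X_dev, y_dev, X_val, y_val = X[:2804], y[:2804], X[2804:], y[2804:]

      model = LogisticRegression().fit(X_dev, y_dev)

      # simplified integer risk score: one point per 0.2 units of log-odds
      points = np.round(model.coef_[0] / 0.2)
      score_val = X_val @ points

      print("logistic AUC  :", round(roc_auc_score(y_val, model.predict_proba(X_val)[:, 1]), 3))
      print("risk-score AUC:", round(roc_auc_score(y_val, score_val), 3))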

  9. Modeling risk of pneumonia epizootics in bighorn sheep

    USGS Publications Warehouse

    Sells, Sarah N.; Mitchell, Michael S.; Nowak, J. Joshua; Lukacs, Paul M.; Anderson, Neil J.; Ramsey, Jennifer M.; Gude, Justin A.; Krausman, Paul R.

    2015-01-01

    Pneumonia epizootics are a major challenge for management of bighorn sheep (Ovis canadensis) affecting persistence of herds, satisfaction of stakeholders, and allocations of resources by management agencies. Risk factors associated with the disease are poorly understood, making pneumonia epizootics hard to predict; such epizootics are thus managed reactively rather than proactively. We developed a model for herds in Montana that identifies risk factors and addresses biological questions about risk. Using Bayesian logistic regression with repeated measures, we found that private land, weed control using domestic sheep or goats, pneumonia history, and herd density were positively associated with risk of pneumonia epizootics in 43 herds that experienced 22 epizootics out of 637 herd-years from 1979–2013. We defined an area of high risk for pathogen exposure as the area of each herd distribution plus a 14.5-km buffer from that boundary. Within this area, the odds of a pneumonia epizootic increased by >1.5 times per additional unit of private land (unit is the standardized % of private land, where the global mean = 25.58% and SD = 14.53%). Odds were >3.3 times greater if domestic sheep or goats were used for weed control in a herd's area of high risk. If a herd or its neighbors within the area of high risk had a history of a pneumonia epizootic, odds of a subsequent pneumonia epizootic were >10 times greater. Risk greatly increased when herds were at high density, with nearly 15 times greater odds of a pneumonia epizootic compared to when herds were at low density. Odds of a pneumonia epizootic also appeared to decrease following increased spring precipitation (odds = 0.41 per unit increase, global mean = 100.18% and SD = 26.97%). Risk was not associated with number of federal sheep and goat allotments, proximity to nearest herds of bighorn sheep, ratio of rams to ewes, percentage of average winter precipitation, or whether herds were of native versus mixed
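
    The odds ratios reported above come from logistic-regression coefficients on standardized covariates. The sketch below shows the conversion from a coefficient to a multiplicative change in odds per one-standard-deviation increase; the example coefficients are illustrative values chosen to be consistent with the magnitudes quoted in the abstract, not the paper's posterior estimates.

      import math

      def odds_multiplier(beta, delta_sd=1.0):
          """Multiplicative change in odds for a `delta_sd` standard-deviation
          increase in a standardized covariate with logistic coefficient `beta`."""
          return math.exp(beta * delta_sd)

      # illustrative coefficients only (not the paper's posterior estimates)
      print(round(odds_multiplier(0.45), 2))   # ~1.6x odds per +1 SD of private land
      print(round(odds_multiplier(-0.89), 2))  # ~0.4x odds per +1 SD of spring precipitation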

  10. Modeling logistic performance in quantitative microbial risk assessment.

    PubMed

    Rijgersberg, Hajo; Tromp, Seth; Jacxsens, Liesbeth; Uyttendaele, Mieke

    2010-01-01

    In quantitative microbial risk assessment (QMRA), food safety in the food chain is modeled and simulated. In general, prevalences, concentrations, and numbers of microorganisms in media are investigated in the different steps from farm to fork. The underlying rates and conditions (such as storage times, temperatures, gas conditions, and their distributions) are determined. However, the logistic chain with its queues (storages, shelves) and mechanisms for ordering products is usually not taken into account. As a consequence, storage times, which are mutually dependent in successive steps in the chain, cannot be described adequately. This may have a great impact on the tails of risk distributions. Because food safety risks are generally very small, it is crucial to model the tails of (underlying) distributions as accurately as possible. Logistic performance can be modeled by describing the underlying planning and scheduling mechanisms in discrete-event modeling. This is common practice in operations research, specifically in supply chain management. In this article, we present the application of discrete-event modeling in the context of a QMRA for Listeria monocytogenes in fresh-cut iceberg lettuce. We show the potential value of discrete-event modeling in QMRA by calculating logistic interventions (modifications in the logistic chain) and determining their significance with respect to food safety.
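
    A toy discrete-event sketch of the idea (a FIFO retail shelf with an order-up-to replenishment rule) is given below; it is not the authors' model, and the stock level and demand rate are arbitrary assumptions. The point it illustrates is that the ordering and queueing mechanism, rather than an assumed distribution, generates the storage-time distribution and its tail.

      import random
      from collections import deque

      def simulate_shelf(days=3650, order_up_to=30, mean_demand=8, seed=42):
          """Minimal FIFO retail-shelf sketch: daily replenishment to a fixed
          order-up-to level, binomial daily demand with the given mean, oldest
          packs sold first. Returns the shelf-residence time of every pack sold."""
          rng = random.Random(seed)
          shelf = deque()                       # each entry = arrival day of one pack
          residence_times = []
          for day in range(days):
              # replenish up to the order-up-to level
              for _ in range(order_up_to - len(shelf)):
                  shelf.append(day)
              # serve today's demand, oldest packs first (FIFO)
              demand = sum(rng.random() < mean_demand / order_up_to
                           for _ in range(order_up_to))
              for _ in range(min(demand, len(shelf))):
                  residence_times.append(day - shelf.popleft())
          return sorted(residence_times)

      times = simulate_shelf()
      print("median storage time   :", times[len(times) // 2], "days")
      print("95th percentile (tail):", times[int(0.95 * len(times))], "days")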

  11. Source apportionment of ambient non-methane hydrocarbons in Hong Kong: application of a principal component analysis/absolute principal component scores (PCA/APCS) receptor model.

    PubMed

    Guo, H; Wang, T; Louie, P K K

    2004-06-01

    Receptor-oriented source apportionment models are often used to identify sources of ambient air pollutants and to estimate source contributions to air pollutant concentrations. In this study, a PCA/APCS model was applied to the data on non-methane hydrocarbons (NMHCs) measured from January to December 2001 at two sampling sites: Tsuen Wan (TW) and Central & Western (CW) Toxic Air Pollutants Monitoring Stations in Hong Kong. This multivariate method enables the identification of major air pollution sources along with the quantitative apportionment of each source to pollutant species. The PCA analysis identified four major pollution sources at TW site and five major sources at CW site. The extracted pollution sources included vehicular internal engine combustion with unburned fuel emissions, use of solvent particularly paints, liquefied petroleum gas (LPG) or natural gas leakage, and industrial, commercial and domestic sources such as solvents, decoration, fuel combustion, chemical factories and power plants. The results of the APCS receptor model indicated that 39% and 48% of the total NMHCs mass concentrations measured at CW and TW originated from vehicle emissions, respectively. 32% and 36.4% of the total NMHCs were emitted from the use of solvent and 11% and 19.4% were apportioned to the LPG or natural gas leakage, respectively. 5.2% and 9% of the total NMHCs mass concentrations were attributed to other industrial, commercial and domestic sources, respectively. It was also found that vehicle emissions and LPG or natural gas leakage were the main sources of C(3)-C(5) alkanes and C(3)-C(5) alkenes while aromatics were predominantly released from paints. Comparison of source contributions to ambient NMHCs at the two sites indicated that the contribution of LPG or natural gas at CW site was almost twice that at TW site. High correlation coefficients (R(2) > 0.8) between the measured and predicted values suggested that the PCA/APCS model was applicable for estimation
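
    The PCA/APCS procedure itself is compact: extract principal components from standardized species concentrations, convert the scores to absolute scores by subtracting the score of an artificial zero-concentration sample, and regress total mass on those absolute scores to apportion it among sources. A hedged Python sketch on synthetic data follows; the number of sources, species, and all concentrations are invented for illustration and are not the Hong Kong measurements.

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.linear_model import LinearRegression

      rng = np.random.default_rng(3)

      # synthetic NMHC species concentrations (rows = samples, cols = species)
      n_samples, n_species, n_sources = 500, 12, 4
      profiles = rng.random((n_sources, n_species))          # stand-in source profiles
      activity = rng.lognormal(size=(n_samples, n_sources))  # stand-in source strengths
      X = activity @ profiles + rng.normal(0.05, 0.01, (n_samples, n_species))

      # 1) PCA on standardized concentrations
      mean, std = X.mean(axis=0), X.std(axis=0)
      Z = (X - mean) / std
      pca = PCA(n_components=n_sources).fit(Z)
      scores = pca.transform(Z)

      # 2) absolute principal component scores: subtract the score of a
      #    hypothetical sample with zero concentration for every species
      z0 = (np.zeros(n_species) - mean) / std
      apcs = scores - pca.transform(z0.reshape(1, -1))

      # 3) regress total NMHC mass on the APCS to apportion it among sources
      total = X.sum(axis=1)
      reg = LinearRegression().fit(apcs, total)
      contrib = reg.coef_ * apcs.mean(axis=0)                # mean contribution per source
      share = 100 * contrib / (contrib.sum() + reg.intercept_)
      print("approx. % of total NMHC per extracted source:", np.round(share, 1))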

  12. Cosmology with negative absolute temperatures

    NASA Astrophysics Data System (ADS)

    Vieira, J. P. P.; Byrnes, Christian T.; Lewis, Antony

    2016-08-01

    Negative absolute temperatures (NAT) are an exotic thermodynamical consequence of quantum physics which has been known since the 1950's (having been achieved in the lab on a number of occasions). Recently, the work of Braun et al. [1] has rekindled interest in negative temperatures and hinted at a possibility of using NAT systems in the lab as dark energy analogues. This paper goes one step further, looking into the cosmological consequences of the existence of a NAT component in the Universe. NAT-dominated expanding Universes experience a borderline phantom expansion (w < -1) with no Big Rip, and their contracting counterparts are forced to bounce after the energy density becomes sufficiently large. Both scenarios might be used to solve horizon and flatness problems analogously to standard inflation and bouncing cosmologies. We discuss the difficulties in obtaining and ending a NAT-dominated epoch, and possible ways of obtaining density perturbations with an acceptable spectrum.

  13. Cumulative Incidence Association Models for Bivariate Competing Risks Data.

    PubMed

    Cheng, Yu; Fine, Jason P

    2012-03-01

    Association models, like frailty and copula models, are frequently used to analyze clustered survival data and evaluate within-cluster associations. The assumption of noninformative censoring is commonly applied to these models, though it may not be true in many situations. In this paper, we consider bivariate competing risk data and focus on association models specified for the bivariate cumulative incidence function (CIF), a nonparametrically identifiable quantity. Copula models are proposed which relate the bivariate CIF to its corresponding univariate CIFs, similarly to independently right censored data, and accommodate frailty models for the bivariate CIF. Two estimating equations are developed to estimate the association parameter, permitting the univariate CIFs to be estimated either parametrically or nonparametrically. Goodness-of-fit tests are presented for formally evaluating the parametric models. Both estimators perform well with moderate sample sizes in simulation studies. The practical use of the methodology is illustrated in an analysis of dementia associations.
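
    As a small illustration of the copula specification described above, the sketch below links two illustrative univariate cause-specific CIFs through a Clayton copula to obtain a bivariate CIF; the copula family, association parameter, and CIF forms are assumptions for demonstration, not the authors' fitted model.

      import numpy as np

      def clayton_copula(u, v, theta):
          """Clayton copula C_theta(u, v); theta > 0 gives positive association."""
          return (u ** (-theta) + v ** (-theta) - 1.0) ** (-1.0 / theta)

      def bivariate_cif(t1, t2, F1, F2, theta):
          """Bivariate cumulative incidence for cause-1 events in both members of a
          pair, modeled as a copula of the univariate CIFs F1 and F2 (a sketch of
          the copula specification described in the abstract)."""
          return clayton_copula(F1(t1), F2(t2), theta)

      # illustrative univariate CIFs: cause 1 accounts for 60% of events long-run
      F1 = lambda t: 0.6 * (1 - np.exp(-0.05 * t))
      F2 = lambda t: 0.6 * (1 - np.exp(-0.04 * t))

      print(round(bivariate_cif(10.0, 10.0, F1, F2, theta=2.0), 4))   # associated pair
      print(round(F1(10.0) * F2(10.0), 4))                            # independence benchmark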

  14. Advanced uncertainty modelling for container port risk analysis.

    PubMed

    Alyami, Hani; Yang, Zaili; Riahi, Ramin; Bonsall, Stephen; Wang, Jin

    2016-08-13

    Globalization has led to a rapid increase of container movements in seaports. Risks in seaports need to be appropriately addressed to ensure economic wealth, operational efficiency, and personnel safety. As a result, the safety performance of a Container Terminal Operational System (CTOS) plays a growing role in improving the efficiency of international trade. This paper proposes a novel method to facilitate the application of Failure Mode and Effects Analysis (FMEA) in assessing the safety performance of CTOS. The new approach is developed through incorporating a Fuzzy Rule-Based Bayesian Network (FRBN) with Evidential Reasoning (ER) in a complementary manner. The former provides a realistic and flexible method to describe input failure information for risk estimates of individual hazardous events (HEs) at the bottom level of a risk analysis hierarchy. The latter is used to aggregate HEs safety estimates collectively, allowing dynamic risk-based decision support in CTOS from a systematic perspective. The novel feature of the proposed method, compared to those used in traditional port risk analysis, lies in a dynamic model capable of dealing with continually changing operational conditions in ports. More importantly, a new sensitivity analysis method is developed and carried out to rank the HEs by taking into account their specific risk estimations (locally) and their Risk Influence (RI) to a port's safety system (globally). Due to its generality, the new approach can be tailored for a wide range of applications in different safety and reliability engineering and management systems, particularly when real time risk ranking is required to measure, predict, and improve the associated system safety performance.

  15. Linear ultrasonic motor for absolute gravimeter.

    PubMed

    Jian, Yue; Yao, Zhiyuan; Silberschmidt, Vadim V

    2017-02-01

    Thanks to their compactness and suitability for vacuum applications, linear ultrasonic motors are considered as substitutes for classical electromagnetic motors as driving elements in absolute gravimeters. Still, their application is prevented by relatively low power output. To overcome this limitation and provide better stability, a V-type linear ultrasonic motor with a new clamping method is proposed for a gravimeter. In this paper, a mechanical model of stators with flexible clamping components is suggested, according to a design criterion for clamps of linear ultrasonic motors. After that, an effect of tangential and normal rigidity of the clamping components on mechanical output is studied. It is followed by discussion of a new clamping method with sufficient tangential rigidity and a capability to facilitate pre-load. Additionally, a prototype of the motor with the proposed clamping method was fabricated and the performance tests in vertical direction were implemented. Experimental results show that the suggested motor has structural stability and high dynamic performance, such as no-load speed of 1.4 m/s and maximal thrust of 43 N, meeting the requirements for absolute gravimeters.

  16. The absolute threshold of cone vision

    PubMed Central

    Koeing, Darran; Hofer, Heidi

    2013-01-01

    We report measurements of the absolute threshold of cone vision, which has been previously underestimated due to sub-optimal conditions or overly strict subjective response criteria. We avoided these limitations by using optimized stimuli and experimental conditions while having subjects respond within a rating scale framework. Small (1′ fwhm), brief (34 msec), monochromatic (550 nm) stimuli were foveally presented at multiple intensities in dark-adapted retina for 5 subjects. For comparison, 4 subjects underwent similar testing with rod-optimized stimuli. Cone absolute threshold, that is, the minimum light energy for which subjects were just able to detect a visual stimulus with any response criterion, was 203 ± 38 photons at the cornea, ∼0.47 log units lower than previously reported. Two-alternative forced-choice measurements in a subset of subjects yielded consistent results. Cone thresholds were less responsive to criterion changes than rod thresholds, suggesting a limit to the stimulus information recoverable from the cone mosaic in addition to the limit imposed by Poisson noise. Results were consistent with expectations for detection in the face of stimulus uncertainty. We discuss implications of these findings for modeling the first stages of human cone vision and interpreting psychophysical data acquired with adaptive optics at the spatial scale of the receptor mosaic. PMID:21270115

  17. Modeling of Radiation Risks for Human Space Missions

    NASA Technical Reports Server (NTRS)

    Fletcher, Graham

    2004-01-01

    Prior to any human space flight, calculations of radiation risks are used to determine the acceptable scope of astronaut activity. Using the supercomputing facilities at NASA Ames Research Center, Ames researchers have determined the damage probabilities of DNA functional groups by space radiation. The data supersede those used in the current Monte Carlo model for risk assessment. One example is the reaction of DNA with hydroxyl radical produced by the interaction of highly energetic particles from space radiation with water molecules in the human body. This reaction is considered an important cause of DNA mutations, although its mechanism is not well understood.

  18. Models for the risk of secondary cancers from radiation therapy.

    PubMed

    Dasu, Alexandru; Toma-Dasu, Iuliana

    2017-02-24

    The interest in the induction of secondary tumours following radiotherapy has greatly increased as developments in detecting and treating the primary tumours have improved the life expectancy of cancer patients. However, most of the knowledge on the current levels of risk comes from patients treated many decades ago. As developments of irradiation techniques take place at a much faster pace than the progression of the carcinogenesis process, the earlier results could not be easily extrapolated to modern treatments. Indeed, the patterns of irradiation from historically-used orthovoltage radiotherapy and from contemporary techniques like conformal radiotherapy with megavoltage radiation, intensity modulated radiation therapy with photons or with particles are quite different. Furthermore, the increased interest in individualised treatment options raises the question of evaluating and ranking the different treatment plan options from the point of view of the risk for cancer induction, in parallel with the quantification of other long-term effects. It is therefore inevitable that models for risk assessment will have to be used to complement the knowledge from epidemiological studies and to make predictions for newer forms of treatment for which clinical evidence is not yet available. This work reviews the mathematical models that could be used to predict the risk of secondary cancers from radiotherapy-relevant dose levels, as well as the approaches and factors that have to be taken into account when including these models in the clinical evaluation process. These include the effects of heterogeneous irradiation, secondary particles production, imaging techniques, interpatient variability and other confounding factors.

  19. Reducing uncertainty in risk modeling for methylmercury exposure

    SciTech Connect

    Ponce, R.; Egeland, G.; Middaugh, J.; Lee, R.

    1995-12-31

    The biomagnification and bioaccumulation of methylmercury in marine species represents a challenge for risk assessment related to the consumption of subsistence foods in Alaska. Because of the profound impact that food consumption advisories have on indigenous peoples seeking to preserve a way of life, there is a need to reduce uncertainty in risk assessment. Thus, research was initiated to reduce the uncertainty in assessing the health risks associated with the consumption of subsistence foods. Because marine subsistence foods typically contain elevated levels of methylmercury, preliminary research efforts have focused on methylmercury as the principal chemical of concern. Of particular interest are the antagonistic effects of selenium on methylmercury toxicity. Because of this antagonism, methylmercury exposure through the consumption of marine mammal meat (with high selenium) may not be as toxic as comparable exposures through other sources of dietary intake, such as in the contaminated bread episode of Iraq (containing relatively low selenium). This hypothesis is supported by animal experiments showing reduced toxicity of methylmercury associated with marine mammal meat, by the antagonistic influence of selenium on methylmercury toxicity, and by negative clinical findings in adult populations exposed to methylmercury through a marine diet not subject to industrial contamination. Exploratory model development is underway to identify potential improvements and applications of current deterministic and probabilistic models, particularly by incorporating selenium as an antagonist in risk modeling methods.

  20. A quantitative risk assessment model for Salmonella and whole chickens.

    PubMed

    Oscar, Thomas P

    2004-06-01

    Existing data and predictive models were used to define the input settings of a previously developed but modified quantitative risk assessment model (QRAM) for Salmonella and whole chickens. The QRAM was constructed in an Excel spreadsheet and was simulated using @Risk. The retail-to-table pathway was modeled as a series of unit operations and associated pathogen events that included initial contamination at retail, growth during consumer transport, thermal inactivation during cooking, cross-contamination during serving, and dose response after consumption. Published data as well as predictive models for growth and thermal inactivation of Salmonella were used to establish input settings. Noncontaminated chickens were simulated so that the QRAM could predict changes in the incidence of Salmonella contamination. The incidence of Salmonella contamination changed from 30% at retail to 0.16% after cooking to 4% at consumption. Salmonella growth on chickens during consumer transport was the only pathogen event that did not impact the risk of salmonellosis. For the scenario simulated, the QRAM predicted 0.44 cases of salmonellosis per 100,000 consumers, which was consistent with recent epidemiological data that indicate a rate of 0.66-0.88 cases of salmonellosis per 100,000 consumers of chicken. Although the QRAM was in agreement with the epidemiological data, surrogate data and models were used, assumptions were made, and potentially important unit operations and pathogen events were not included because of data gaps and thus, further refinement of the QRAM is needed.
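
    The retail-to-table pathway described above translates naturally into a Monte Carlo simulation. The sketch below mirrors the sequence of unit operations (retail contamination, transport growth, cooking, cross-contamination, dose response); apart from the 30% retail prevalence quoted in the abstract, every parameter value is an illustrative assumption, so the output is not the paper's risk estimate.

      import numpy as np

      rng = np.random.default_rng(7)
      n = 100_000                                     # simulated servings

      # retail: ~30% of chickens contaminated (figure quoted in the abstract);
      # the contamination levels and every parameter below are illustrative only
      contaminated = rng.random(n) < 0.30
      log_retail = np.where(contaminated, rng.normal(1.0, 1.0, n), -np.inf)   # log10 CFU

      log_transport = log_retail + rng.uniform(0.0, 0.3, n)     # growth during transport
      log_cooked = log_transport - rng.normal(6.0, 1.5, n)      # thermal inactivation

      dose_cooked = 10.0 ** log_cooked
      # cross-contamination during serving: assumed small transfer of retail-level cells
      cross = rng.random(n) < 0.05
      dose_cross = np.where(cross, 1e-3 * 10.0 ** log_retail, 0.0)
      dose = dose_cooked + dose_cross

      p_ill = 1.0 - np.exp(-2e-3 * dose)              # exponential dose-response, assumed r
      print(round(p_ill.mean() * 1e5, 2), "predicted cases per 100,000 servings")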

  1. Modeling insurer-homeowner interactions in managing natural disaster risk.

    PubMed

    Kesete, Yohannes; Peng, Jiazhen; Gao, Yang; Shan, Xiaojun; Davidson, Rachel A; Nozick, Linda K; Kruse, Jamie

    2014-06-01

    The current system for managing natural disaster risk in the United States is problematic for both homeowners and insurers. Homeowners are often uninsured or underinsured against natural disaster losses, and typically do not invest in retrofits that can reduce losses. Insurers often do not want to insure against these losses, which are some of their biggest exposures and can cause an undesirably high chance of insolvency. There is a need to design an improved system that acknowledges the different perspectives of the stakeholders. In this article, we introduce a new modeling framework to help understand and manage the insurer's role in catastrophe risk management. The framework includes a new game-theoretic optimization model of insurer decisions that interacts with a utility-based homeowner decision model and is integrated with a regional catastrophe loss estimation model. Reinsurer and government roles are represented as bounds on the insurer-insured interactions. We demonstrate the model for a full-scale case study for hurricane risk to residential buildings in eastern North Carolina; present the results from the perspectives of all stakeholders, namely primary insurers, homeowners (insured and uninsured), and reinsurers; and examine the effect of key parameters on the results.

  2. Modelling of fire count data: fire disaster risk in Ghana.

    PubMed

    Boadi, Caleb; Harvey, Simon K; Gyeke-Dako, Agyapomaa

    2015-01-01

    Stochastic dynamics involved in ecological count data require distribution fitting procedures to model and make informed judgments. The study provides empirical research focused on the provision of an early warning system and a spatial graph that can detect societal fire risks. It offers an opportunity for communities, organizations, risk managers, actuaries and governments to be aware of, and understand, fire risks, so that they can tackle the threats posed by fire more directly. A statistical distribution fitting method that best helps identify the stochastic dynamics of fire count data is used. The aim is to provide a fire-prediction model and fire spatial graph for observed fire count data. An empirical probability distribution model is fitted to the fire count data and compared to the theoretical probability distribution of the stochastic process of fire count data. The distribution fitted to the fire frequency count data helps identify the class of models that are exhibited by the fire and provides time leading decisions. The research suggests that fire frequency and loss (fire fatalities) count data in Ghana are best modelled with a Negative Binomial Distribution. The spatial map of observed fire frequency and fatality measured over 5 years (2007-2011) offers in this study a first regional assessment of fire frequency and fire fatality in Ghana.
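
    Fitting and comparing candidate count distributions, as done in the study, can be sketched in a few lines. The example below fits a negative binomial to an invented overdispersed monthly count series by maximum likelihood and compares it with a Poisson fit via AIC; the counts are not the Ghanaian data.

      import numpy as np
      from scipy import stats
      from scipy.optimize import minimize

      # illustrative monthly fire counts (overdispersed: variance well above the mean)
      counts = np.array([3, 7, 0, 12, 5, 9, 1, 22, 4, 6, 15, 2, 8, 0, 11, 5, 3, 18, 7, 9])

      def nb_negloglik(params):
          r, p = params
          if r <= 0 or not 0 < p < 1:
              return np.inf
          return -stats.nbinom.logpmf(counts, r, p).sum()

      # method-of-moments starting values, then numerical maximum likelihood
      m, v = counts.mean(), counts.var(ddof=1)
      p0 = m / v
      r0 = m * p0 / (1 - p0)
      res = minimize(nb_negloglik, x0=[r0, p0], method="Nelder-Mead")

      poisson_ll = stats.poisson.logpmf(counts, m).sum()
      nb_ll = -res.fun
      print("AIC Poisson          :", round(2 * 1 - 2 * poisson_ll, 1))
      print("AIC negative binomial:", round(2 * 2 - 2 * nb_ll, 1))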

  3. A flexible count data regression model for risk analysis.

    PubMed

    Guikema, Seth D; Coffelt, Jeremy P; Goffelt, Jeremy P

    2008-02-01

    In many cases, risk and reliability analyses involve estimating the probabilities of discrete events such as hardware failures and occurrences of disease or death. There is often additional information in the form of explanatory variables that can be used to help estimate the likelihood of different numbers of events in the future through the use of an appropriate regression model, such as a generalized linear model. However, existing generalized linear models (GLM) are limited in their ability to handle the types of variance structures often encountered in using count data in risk and reliability analysis. In particular, standard models cannot handle both underdispersed data (variance less than the mean) and overdispersed data (variance greater than the mean) in a single coherent modeling framework. This article presents a new GLM based on a reformulation of the Conway-Maxwell Poisson (COM) distribution that is useful for both underdispersed and overdispersed count data and demonstrates this model by applying it to the assessment of electric power system reliability. The results show that the proposed COM GLM can provide fits to data as good as those of the commonly used existing models for overdispersed data sets while outperforming these commonly used models for underdispersed data sets.
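
    The key ingredient of the proposed GLM is the Conway-Maxwell Poisson distribution, whose extra dispersion parameter nu lets a single family cover both overdispersed (nu < 1) and underdispersed (nu > 1) counts. A minimal sketch of the pmf and its mean-variance behaviour follows; it evaluates the truncated normalizing constant directly and is not the article's regression implementation.

      import math

      def com_poisson_pmf(y, lam, nu, max_k=150):
          """Conway-Maxwell Poisson pmf with rate lam and dispersion nu
          (nu = 1 recovers the Poisson). Terms are evaluated in log space and the
          normalizing constant Z is approximated by truncating its series at max_k."""
          log_terms = [k * math.log(lam) - nu * math.lgamma(k + 1) for k in range(max_k)]
          m = max(log_terms)
          log_z = m + math.log(sum(math.exp(t - m) for t in log_terms))
          return math.exp(y * math.log(lam) - nu * math.lgamma(y + 1) - log_z)

      def mean_var(lam, nu, max_k=150):
          probs = [com_poisson_pmf(y, lam, nu, max_k) for y in range(max_k)]
          mean = sum(y * p for y, p in enumerate(probs))
          var = sum((y - mean) ** 2 * p for y, p in enumerate(probs))
          return mean, var

      # nu < 1: variance > mean (overdispersed); nu > 1: variance < mean (underdispersed)
      for nu in (0.5, 1.0, 2.0):
          m, v = mean_var(5.0, nu)
          print(f"nu={nu}: mean={m:.2f} variance={v:.2f}")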

  4. Framework for Risk Analysis in Multimedia Environmental Systems: Modeling Individual Steps of a Risk Assessment Process

    SciTech Connect

    Shah, Anuj; Castleton, Karl J.; Hoopes, Bonnie L.

    2004-06-01

    The study of the release and effects of chemicals in the environment and their associated risks to humans is central to public and private decision making. FRAMES 1.X, Framework for Risk Analysis in Multimedia Environmental Systems, is a systems modeling software platform, developed by PNNL, Pacific Northwest National Laboratory, that helps scientists study the release and effects of chemicals on a source-to-outcome basis and create environmental models for similar risk assessment and management problems. The unique aspect of FRAMES is to dynamically introduce software modules representing individual components of a risk assessment (e.g., source release of contaminants, fate and transport in various environmental media, exposure, etc.) within a software framework, manipulate their attributes and run simulations to obtain results. This paper outlines the fundamental constituents of FRAMES 2.X, an enhanced version of FRAMES 1.X, that greatly improve the ability of the module developers to “plug” their self-developed software modules into the system. The basic design, the underlying principles and a discussion of the guidelines for module developers are presented.

  5. A neural network model for credit risk evaluation.

    PubMed

    Khashman, Adnan

    2009-08-01

    Credit scoring is one of the key analytical techniques in credit risk evaluation which has been an active research area in financial risk management. This paper presents a credit risk evaluation system that uses a neural network model based on the back propagation learning algorithm. We train and implement the neural network to decide whether to approve or reject a credit application, using seven learning schemes and real world credit applications from the Australian credit approval datasets. A comparison of the system performance under the different learning schemes is provided; furthermore, we compare the performance of two neural networks, with one and two hidden layers, following the ideal learning scheme. Experimental results suggest that neural networks can be effectively used in automatic processing of credit applications.
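
    A minimal sketch of the general approach (a small feed-forward network trained with a gradient-based back-propagation solver to classify credit applications) is shown below using scikit-learn; the data are simulated stand-ins rather than the Australian credit approval dataset, and the architecture and training settings are assumptions, not the paper's seven learning schemes.

      import numpy as np
      from sklearn.model_selection import train_test_split
      from sklearn.neural_network import MLPClassifier
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler

      rng = np.random.default_rng(5)

      # synthetic stand-in for applicant features (income, debts, history, ...)
      n = 690                                   # size of the Australian credit dataset
      X = rng.normal(size=(n, 14))
      w = rng.normal(size=14)
      y = (X @ w + rng.normal(scale=0.5, size=n)) > 0    # approve / reject label

      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

      # one hidden layer trained by back-propagation of the gradient
      clf = make_pipeline(StandardScaler(),
                          MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000, random_state=0))
      clf.fit(X_tr, y_tr)
      print("hold-out accuracy:", round(clf.score(X_te, y_te), 3))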

  6. Absolute irradiance of the Moon for on-orbit calibration

    USGS Publications Warehouse

    Stone, T.C.; Kieffer, H.H.; ,

    2002-01-01

    The recognized need for on-orbit calibration of remote sensing imaging instruments drives the ROLO project effort to characterize the Moon for use as an absolute radiance source. For over 5 years the ground-based ROLO telescopes have acquired spatially-resolved lunar images in 23 VNIR (Moon diameter ~500 pixels) and 9 SWIR (~250 pixels) passbands at phase angles within ±90 degrees. A numerical model for lunar irradiance has been developed which fits hundreds of ROLO images in each band, corrected for atmospheric extinction and calibrated to absolute radiance, then integrated to irradiance. The band-coupled extinction algorithm uses absorption spectra of several gases and aerosols derived from MODTRAN to fit time-dependent component abundances to nightly observations of standard stars. The absolute radiance scale is based upon independent telescopic measurements of the star Vega. The fitting process yields uncertainties in lunar relative irradiance over small ranges of phase angle and the full range of lunar libration well under 0.5%. A larger source of uncertainty enters in the absolute solar spectral irradiance, especially in the SWIR, where solar models disagree by up to 6%. Results of ROLO model direct comparisons to spacecraft observations demonstrate the ability of the technique to track sensor responsivity drifts to sub-percent precision. Intercomparisons among instruments provide key insights into both calibration issues and the absolute scale for lunar irradiance.

  7. 42 CFR 81.10 - Use of cancer risk assessment models in NIOSH IREP.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 42 Public Health 1 2011-10-01 2011-10-01 false Use of cancer risk assessment models in NIOSH IREP... Risk Models Used To Estimate Probability of Causation § 81.10 Use of cancer risk assessment models in... tables were developed from analyses of cancer mortality risk among the Japanese atomic bomb...

  8. 42 CFR 81.10 - Use of cancer risk assessment models in NIOSH IREP.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 42 Public Health 1 2014-10-01 2014-10-01 false Use of cancer risk assessment models in NIOSH IREP... Risk Models Used To Estimate Probability of Causation § 81.10 Use of cancer risk assessment models in... tables were developed from analyses of cancer mortality risk among the Japanese atomic bomb...

  9. 42 CFR 81.10 - Use of cancer risk assessment models in NIOSH IREP.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 42 Public Health 1 2012-10-01 2012-10-01 false Use of cancer risk assessment models in NIOSH IREP... Risk Models Used To Estimate Probability of Causation § 81.10 Use of cancer risk assessment models in... tables were developed from analyses of cancer mortality risk among the Japanese atomic bomb...

  10. 42 CFR 81.10 - Use of cancer risk assessment models in NIOSH IREP.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 1 2010-10-01 2010-10-01 false Use of cancer risk assessment models in NIOSH IREP... Risk Models Used To Estimate Probability of Causation § 81.10 Use of cancer risk assessment models in... tables were developed from analyses of cancer mortality risk among the Japanese atomic bomb...

  11. 42 CFR 81.10 - Use of cancer risk assessment models in NIOSH IREP.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 42 Public Health 1 2013-10-01 2013-10-01 false Use of cancer risk assessment models in NIOSH IREP... Risk Models Used To Estimate Probability of Causation § 81.10 Use of cancer risk assessment models in... tables were developed from analyses of cancer mortality risk among the Japanese atomic bomb...

  12. Low Dose Radiation Cancer Risks: Epidemiological and Toxicological Models

    SciTech Connect

    David G. Hoel, PhD

    2012-04-19

    The basic purpose of this one year research grant was to extend the two stage clonal expansion model (TSCE) of carcinogenesis to exposures other than the usual single acute exposure. The two-stage clonal expansion model of carcinogenesis incorporates the biological process of carcinogenesis, which involves two mutations and the clonal proliferation of the intermediate cells, in a stochastic, mathematical way. The current TSCE model serves a general purpose of acute exposure models but requires numerical computation of both the survival and hazard functions. The primary objective of this research project was to develop the analytical expressions for the survival function and the hazard function of the occurrence of the first cancer cell for acute, continuous and multiple exposure cases within the framework of the piece-wise constant parameter two-stage clonal expansion model of carcinogenesis. For acute exposure and multiple exposures of acute series, it is either only allowed to have the first mutation rate vary with the dose, or to have all the parameters be dose dependent; for multiple exposures of continuous exposures, all the parameters are allowed to vary with the dose. With these analytical functions, it becomes easy to evaluate the risks of cancer and allows one to deal with the various exposure patterns in cancer risk assessment. A second objective was to apply the TSCE model with varying continuous exposures from the cancer studies of inhaled plutonium in beagle dogs. Using step functions to estimate the retention functions of the pulmonary exposure to plutonium, the multiple exposure versions of the TSCE model were to be used to estimate the beagle dog lung cancer risks. The mathematical equations of the multiple exposure versions of the TSCE model were developed. A draft manuscript which is attached provides the results of this mathematical work. The application work using the beagle dog data from plutonium exposure has not been completed due to the fact
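
    For readers unfamiliar with the TSCE model, the sketch below runs a simplified constant-parameter stochastic simulation of it (initiation of intermediate cells, their birth-death clonal expansion, and conversion to a first malignant cell). It is a Gillespie-style toy, not the analytic survival and hazard functions derived in the grant, and all rate parameters are illustrative assumptions.

      import random

      def time_to_first_cancer_cell(mu1=1e-7, N=1e7, alpha=0.1, beta=0.09,
                                    mu2=1e-4, t_max=100.0, seed=None):
          """Simplified constant-parameter two-stage clonal expansion simulation:
          normal cells (N) are initiated at rate mu1*N; intermediate cells divide
          at rate alpha, die at rate beta, and convert to a first malignant cell
          at rate mu2 (all per cell per year; illustrative values only).
          Returns the time of the first malignant cell, or None if none by t_max."""
          rng = random.Random(seed)
          t, I = 0.0, 0                          # time and number of intermediate cells
          while t < t_max:
              rate_init = mu1 * N
              rate_div, rate_death, rate_conv = alpha * I, beta * I, mu2 * I
              total = rate_init + rate_div + rate_death + rate_conv
              t += rng.expovariate(total)
              if t >= t_max:
                  return None
              u = rng.random() * total
              if u < rate_init:
                  I += 1                          # initiation: new intermediate cell
              elif u < rate_init + rate_div:
                  I += 1                          # division of an intermediate cell
              elif u < rate_init + rate_div + rate_death:
                  I -= 1                          # death or differentiation
              else:
                  return t                        # first malignant cell appears
          return None

      times = [time_to_first_cancer_cell(seed=s) for s in range(200)]
      observed = [t for t in times if t is not None]
      print(f"{len(observed)} of 200 simulated subjects develop a malignant cell by age 100")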

  13. Applying Least Absolute Shrinkage Selection Operator and Akaike Information Criterion Analysis to Find the Best Multiple Linear Regression Models between Climate Indices and Components of Cow’s Milk

    PubMed Central

    Marami Milani, Mohammad Reza; Hense, Andreas; Rahmani, Elham; Ploeger, Angelika

    2016-01-01

    This study focuses on multiple linear regression models relating six climate indices (temperature humidity THI, environmental stress ESI, equivalent temperature index ETI, heat load HLI, modified HLI (HLI new), and respiratory rate predictor RRP) with three main components of cow’s milk (yield, fat, and protein) for cows in Iran. The least absolute shrinkage selection operator (LASSO) and the Akaike information criterion (AIC) techniques are applied to select the best model for milk predictands with the smallest number of climate predictors. Uncertainty estimation is employed by applying bootstrapping through resampling. Cross validation is used to avoid over-fitting. Climatic parameters are calculated from the NASA-MERRA global atmospheric reanalysis. Milk data for the months from April to September, 2002 to 2010 are used. The best linear regression models are found in spring between milk yield as the predictand and THI, ESI, ETI, HLI, and RRP as predictors with p-value < 0.001 and R2 (0.50, 0.49) respectively. In summer, milk yield with independent variables of THI, ETI, and ESI show the highest relation (p-value < 0.001) with R2 (0.69). For fat and protein the results are only marginal. This method is suggested for the impact studies of climate variability/change on agriculture and food science fields when short-time series or data with large uncertainty are available. PMID:28231147
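
    The variable-selection step described above can be sketched compactly: trace a LASSO path over a grid of penalties and keep the model that minimizes AIC. The Python example below does this on simulated stand-ins for the six climate indices and milk yield; the data, penalty grid, and coefficients are assumptions, not the study's NASA-MERRA-derived values.

      import numpy as np
      from sklearn.linear_model import Lasso
      from sklearn.preprocessing import StandardScaler

      rng = np.random.default_rng(11)

      # synthetic stand-ins for the six climate indices and milk yield
      n = 180
      climate = rng.normal(size=(n, 6))                      # THI, ESI, ETI, HLI, HLInew, RRP
      yield_milk = 20 + climate[:, :3] @ np.array([-1.2, -0.8, -0.5]) + rng.normal(0, 1.5, n)

      X = StandardScaler().fit_transform(climate)

      def aic(y, y_hat, n_params):
          rss = np.sum((y - y_hat) ** 2)
          return len(y) * np.log(rss / len(y)) + 2 * n_params

      # trace the LASSO path and keep the penalty whose selected model minimizes AIC
      best = None
      for alpha in np.logspace(-2, 1, 30):
          model = Lasso(alpha=alpha).fit(X, yield_milk)
          k = np.count_nonzero(model.coef_) + 1              # +1 for the intercept
          score = aic(yield_milk, model.predict(X), k)
          if best is None or score < best[0]:
              best = (score, alpha, model.coef_)

      print("best alpha:", round(best[1], 3))
      print("selected climate predictors (nonzero coefficients):", np.nonzero(best[2])[0])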

  14. Individual-based model for radiation risk assessment

    NASA Astrophysics Data System (ADS)

    Smirnova, O.

    A mathematical model is developed which enables one to predict the life span probability for mammals exposed to radiation. It relates statistical biometric functions with statistical and dynamic characteristics of an organism's critical system. To calculate the dynamics of the latter, the respective mathematical model is used too. This approach is applied to describe the effects of low level chronic irradiation on mice when the hematopoietic system (namely, thrombocytopoiesis) is the critical one. For identification of the joint model, experimental data on hematopoiesis in nonirradiated and irradiated mice, as well as on mortality dynamics of those in the absence of radiation are utilized. The life span probability and life span shortening predicted by the model agree with corresponding experimental data. Modeling results show the significance of accounting for the variability of the individual radiosensitivity of critical system cells when estimating the radiation risk. These findings are corroborated by clinical data on persons involved in the elimination of the Chernobyl catastrophe aftereffects. All this makes it feasible to use the model for radiation risk assessments for cosmonauts and astronauts on long-term missions such as a voyage to Mars or a lunar colony. In this case the model coefficients have to be determined by making use of the available data for humans. Scenarios for the dynamics of dose accumulation during space flights should also be taken into account.

  15. An animal model of differential genetic risk for methamphetamine intake

    PubMed Central

    Phillips, Tamara J.; Shabani, Shkelzen

    2015-01-01

    The question of whether genetic factors contribute to risk for methamphetamine (MA) use and dependence has not been intensively investigated. Compared to human populations, genetic animal models offer the advantages of control over genetic family history and drug exposure. Using selective breeding, we created lines of mice that differ in genetic risk for voluntary MA intake and identified the chromosomal addresses of contributory genes. A quantitative trait locus was identified on chromosome 10 that accounts for more than 50% of the genetic variance in MA intake in the selected mouse lines. In addition, behavioral and physiological screening identified differences corresponding with risk for MA intake that have generated hypotheses that are testable in humans. Heightened sensitivity to aversive and certain physiological effects of MA, such as MA-induced reduction in body temperature, are hallmarks of mice bred for low MA intake. Furthermore, unlike MA-avoiding mice, MA-preferring mice are sensitive to rewarding and reinforcing MA effects, and to MA-induced increases in brain extracellular dopamine levels. Gene expression analyses implicate the importance of a network enriched in transcription factor genes, some of which regulate the mu opioid receptor gene, Oprm1, in risk for MA use. Neuroimmune factors appear to play a role in differential response to MA between the mice bred for high and low intake. In addition, chromosome 10 candidate gene studies provide strong support for a trace amine-associated receptor 1 gene, Taar1, polymorphism in risk for MA intake. MA is a trace amine-associated receptor 1 (TAAR1) agonist, and a non-functional Taar1 allele segregates with high MA consumption. Thus, reduced TAAR1 function has the potential to increase risk for MA use. Overall, existing findings support the MA drinking lines as a powerful model for identifying genetic factors involved in determining risk for harmful MA use. Future directions include the development of a

  16. Field evaluation of an avian risk assessment model

    USGS Publications Warehouse

    Vyas, N.B.; Spann, J.W.; Hulse, C.S.; Borges, S.L.; Bennett, R.S.; Torrez, M.; Williams, B.I.; Leffel, R.

    2006-01-01

    We conducted two laboratory subacute dietary toxicity tests and one outdoor subacute dietary toxicity test to determine the effectiveness of the U.S. Environmental Protection Agency's deterministic risk assessment model for evaluating the potential of adverse effects to birds in the field. We tested technical-grade diazinon and its DZN 50W (50% diazinon active ingredient wettable powder) formulation on Canada goose (Branta canadensis) goslings. Brain acetylcholinesterase activity was measured, and the feathers and skin, feet, and gastrointestinal contents were analyzed for diazinon residues. The dose-response curves showed that diazinon was significantly more toxic to goslings in the outdoor test than in the laboratory tests. The deterministic risk assessment method identified the potential for risk to birds in general, but the factors associated with extrapolating from the laboratory to the field, and from the laboratory test species to other species, resulted in the underestimation of risk to the goslings. The present study indicates that laboratory-based risk quotients should be interpreted with caution.

  17. Social models of HIV risk among young adults in Lesotho.

    PubMed

    Bulled, Nicola L

    2015-01-01

    Extensive research over the past 30 years has revealed that individual and social determinants impact HIV risk. Even so, prevention efforts focus primarily on individual behaviour change, with little recognition of the dynamic interplay of individual and social environment factors that further exacerbate risk engagement. Drawing on long-term research with young adults in Lesotho, I examine how social environment factors contribute to HIV risk. During preliminary ethnographic analysis, I developed novel scales to measure social control, adoption of modernity, and HIV knowledge. In survey research, I examined the effects of individual characteristics (i.e., socioeconomic status, HIV knowledge, adoption of modernity) and social environment (i.e., social control) on HIV risk behaviours. In addition, I measured the impact of altered environments by taking advantage of an existing situation whereby young adults attending a national college are assigned to either a main campus in a metropolitan setting or a satellite campus in a remote setting, irrespective of the environment in which they were socialised as youth. This arbitrary assignment process generates four distinct groups of young adults with altered or constant environments. Regression models show that lower levels of perceived social control and greater adoption of modernity are associated with HIV risk, controlling for other factors. The impact of social control and modernity varies with environment dynamics.

  18. Making Risk Models Operational for Situational Awareness and Decision Support

    SciTech Connect

    Paulson, Patrick R.; Coles, Garill A.; Shoemaker, Steven V.

    2012-06-12

    Modernization of nuclear power operations control systems, in particular the move to digital control systems, creates an opportunity to modernize existing legacy infrastructure and extend plant life. We describe here decision support tools that allow the assessment of different facets of risk and support the optimization of available resources to reduce risk as plants are upgraded and maintained. This methodology could become an integrated part of the design review process and a part of the operations management systems. The methodology can be applied to the design of new reactors such as small nuclear reactors (SMR), and be helpful in assessing the risks of different configurations of the reactors. Our tool provides a low cost evaluation of alternative configurations and provides an expanded safety analysis by considering scenarios while early in the implementation cycle where cost impacts can be minimized. The effects of failures can be modeled and thoroughly vetted to understand their potential impact on risk. The process and tools presented here allow for an integrated assessment of risk by supporting traditional defense in depth approaches while taking into consideration the insertion of new digital instrument and control systems.

  19. The impact of consumer phase models in microbial risk analysis.

    PubMed

    Nauta, Maarten; Christensen, Bjarke

    2011-02-01

    In quantitative microbiological risk assessment (QMRA), the consumer phase model (CPM) describes the part of the food chain between purchase of the food product at retail and exposure. Construction of a CPM is complicated by the large variation in consumer food handling practices and a limited availability of data. Therefore, several subjective (simplifying) assumptions have to be made when a CPM is constructed, but with a single CPM their impact on the QMRA results is unclear. We therefore compared the performance of eight published CPMs for Campylobacter in broiler meat in an example of a QMRA, where all the CPMs were analyzed using one single input distribution of concentrations at retail, and the same dose-response relationship. It was found that, between CPMs, there may be a considerable difference in the estimated probability of illness per serving. However, the estimated relative risk reductions are less different for scenarios modeling the implementation of control measures. For control measures affecting the Campylobacter prevalence, the relative risk is proportional irrespective of the CPM used. However, for control measures affecting the concentration the CPMs show some difference in the estimated relative risk. This difference is largest for scenarios where the aim is to remove the highly contaminated portion from human exposure. Given these results, we conclude that for many purposes it is not necessary to develop a new detailed CPM for each new QMRA. However, more observational data on consumer food handling practices and their impact on microbial transfer and survival are needed to generalize this conclusion.

  20. Risk prediction models for contrast induced nephropathy: systematic review

    PubMed Central

    Silver, Samuel A; Shah, Prakesh M; Chertow, Glenn M; Wald, Ron

    2015-01-01

    Objectives To look at the available literature on validated prediction models for contrast induced nephropathy and describe their characteristics. Design Systematic review. Data sources Medline, Embase, and CINAHL (cumulative index to nursing and allied health literature) databases. Review methods Databases searched from inception to 2015, and the retrieved reference lists hand searched. Dual reviews were conducted to identify studies published in the English language of prediction models tested with patients that included derivation and validation cohorts. Data were extracted on baseline patient characteristics, procedural characteristics, modelling methods, metrics of model performance, risk of bias, and clinical usefulness. Eligible studies evaluated characteristics of predictive models that identified patients at risk of contrast induced nephropathy among adults undergoing a diagnostic or interventional procedure using conventional radiocontrast media (media used for computed tomography or angiography, and not gadolinium based contrast). Results 16 studies were identified, describing 12 prediction models. Substantial interstudy heterogeneity was identified, as a result of different clinical settings, cointerventions, and the timing of creatinine measurement to define contrast induced nephropathy. Ten models were validated internally and six were validated externally. Discrimination varied in studies that were validated internally (C statistic 0.61-0.95) and externally (0.57-0.86). Only one study presented reclassification indices. The majority of higher performing models included measures of pre-existing chronic kidney disease, age, diabetes, heart failure or impaired ejection fraction, and hypotension or shock. No prediction model evaluated its effect on clinical decision making or patient outcomes. Conclusions Most predictive models for contrast induced nephropathy in clinical use have modest ability, and are only relevant to patients receiving contrast for

  1. Methodology for risk analysis based on atmospheric dispersion modelling from nuclear risk sites

    NASA Astrophysics Data System (ADS)

    Baklanov, A.; Mahura, A.; Sørensen, J. H.; Rigina, O.

    2003-04-01

    The main purpose of this multidisciplinary study is to develop a methodology for complex nuclear risk and vulnerability assessment, and to test it on the example of estimating the nuclear risk to the population in the Nordic countries in case of a severe accident at a nuclear risk site (NRS). The main focus of the paper is the methodology for the evaluation of the atmospheric transport and deposition of radioactive pollutants from NRSs. The method developed for this evaluation is derived from a probabilistic point of view. The main question we are trying to answer is: What is the probability for radionuclide atmospheric transport and impact to different neighbouring regions and countries in case of an accident at an NPP? To answer this question we applied a number of different tools: (i) Trajectory Modelling - to calculate multiyear forward trajectories originating over the locations of selected risk sites; (ii) Dispersion Modelling - for long-term simulation and case studies of radionuclide transport from hypothetical accidental releases at NRSs; (iii) Cluster Analysis - to identify atmospheric transport pathways from NRSs; (iv) Probability Fields Analysis - to construct annual, monthly, and seasonal NRS impact indicators to identify the most impacted geographical regions; (v) Specific Case Studies - to estimate consequences for the environment and the populations after a hypothetical accident; (vi) Vulnerability Evaluation to Radioactive Deposition - to describe its persistence in the ecosystems with a focus on the transfer of certain radionuclides into the food chains of key importance for the intake and exposure for a whole population and for certain population groups; (vii) Risk Evaluation and Mapping - to analyse socio-economical consequences for different geographical areas and various population groups taking into account social-geophysical factors and probabilities, and using demographic databases based on GIS analysis.

  2. Age-specific absolute and relative organ weight distributions for Fischer 344 rats.

    PubMed

    Marino, Dale J

    2012-01-01

    The Fischer 344 (F344) rat has been the standard rat strain used in toxicology studies conducted by the National Cancer Institute (NCI) and the National Toxicology Program (NTP). However, the numerous reports published to date on growth, survival, and tumor incidence have not included an overall compilation of organ weight data. Notably, dose-related organ weight effects are endpoints used by regulatory agencies to develop toxicity reference values (TRVs) for use in human health risk assessments. In addition, physiologically-based pharmacokinetic (PBPK) models, which utilize relative organ weights, are increasingly being used to develop TRVs. Because a compilation of organ weights for F344 rats could prove beneficial for TRV development and PBPK modeling, all available absolute and relative organ weight data for untreated control F344 rats were collected from NCI/NTP feed, drinking-water, and inhalation studies in order to develop age-specific distributions. Results showed that organ weights were collected more frequently at 2-wk (59 studies), 3-mo (148 studies), and 15-mo (38 studies) intervals than at other intervals and more frequently from feeding and inhalation than from drinking-water studies. Liver, right kidney, lung, heart, thymus, and brain weights were most frequently collected. From the collected data, the mean and standard deviation for absolute and relative organ weights were calculated. Findings showed age-related increases in absolute weights and decreases in relative weights for brain, liver, right kidney, lung, heart, thyroid, and right testis. The results suggest a general variability trend in absolute organ weights of brain < right testis < heart < right kidney < liver < lung < thymus < thyroid.

  3. Fire risk in San Diego County, California: A weighted Bayesian model approach

    USGS Publications Warehouse

    Kolden, Crystal A.; Weigel, Timothy J.

    2007-01-01

    Fire risk models are widely utilized to mitigate wildfire hazards, but models are often based on expert opinions of less understood fire-ignition and spread processes. In this study, we used an empirically derived weights-of-evidence model to assess what factors produce fire ignitions east of San Diego, California. We created and validated a dynamic model of fire-ignition risk based on land characteristics and existing fire-ignition history data, and predicted ignition risk for a future urbanization scenario. We then combined our empirical ignition-risk model with a fuzzy fire behavior-risk model developed by wildfire experts to create a hybrid model of overall fire risk. We found that roads influence fire ignitions and that future growth will increase risk in new rural development areas. We conclude that empirically derived risk models and hybrid models offer an alternative method to assess current and future fire risk based on management actions.
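
    The weights-of-evidence calculation at the core of the ignition model is simple to state: for a binary evidence layer, compare how often the evidence is present at ignition locations versus elsewhere. A hedged sketch follows; the road-proximity layer and ignition probabilities are invented for illustration, not the San Diego data.

      import numpy as np

      def weights_of_evidence(evidence_present, ignition):
          """Bayesian weights of evidence for a binary predictive layer
          (e.g. 'within 500 m of a road') against observed fire ignitions.
          W+ applies where the evidence is present, W- where it is absent;
          the contrast C = W+ - W- summarizes the layer's predictive power."""
          evidence_present = np.asarray(evidence_present, dtype=bool)
          ignition = np.asarray(ignition, dtype=bool)
          p_b_given_d = evidence_present[ignition].mean()       # P(B | ignition)
          p_b_given_nd = evidence_present[~ignition].mean()     # P(B | no ignition)
          w_plus = np.log(p_b_given_d / p_b_given_nd)
          w_minus = np.log((1 - p_b_given_d) / (1 - p_b_given_nd))
          return w_plus, w_minus, w_plus - w_minus

      # toy grid: cells near roads (the evidence) ignite more often in this fake data
      rng = np.random.default_rng(2)
      near_road = rng.random(10_000) < 0.3
      ignited = rng.random(10_000) < np.where(near_road, 0.05, 0.01)
      print([round(v, 2) for v in weights_of_evidence(near_road, ignited)])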

  4. Network Dependence in Risk Trading Games: A Banking Regulation Model

    NASA Astrophysics Data System (ADS)

    Thurner, Stefan

    2003-04-01

    In the quest to quantitatively understand the risk-regulatory behavior of financial agents, we propose a physical model of interacting agents where interactions are defined by trades of financial derivatives. Consequences arising from various types of interaction-network topologies are shown for system safety and efficiency. We demonstrate that the model yields characteristic features of actually observed wealth time series. Further, we study the dependence of global system safety on a risk-control parameter (Basle multiplier). We find a phase transition-like phenomenon, where the Basle parameter plays the role of temperature and safety serves as the order parameter. This work is done together with R. Hanel and S. Pichler.

  5. Engineering models for catastrophe risk and their application to insurance

    NASA Astrophysics Data System (ADS)

    Dong, Weimin

    2002-06-01

    Internationally, earthquake insurance, like all other insurance (fire, auto), historically adopted an actuarial approach, that is, insurance rates were determined from historical loss experience. Because an earthquake is a rare event with severe consequences, irrational premium rates and a poor understanding of the scale of potential loss left many insurance companies insolvent after the Northridge earthquake in 1994. Along with recent advances in earth science, computer science and engineering, computerized loss estimation methodologies based on first principles have been developed to the point that losses from destructive earthquakes can be quantified with reasonable accuracy using scientific modeling techniques. This paper intends to introduce how engineering models can assist in quantifying earthquake risk and how the insurance industry can use this information to manage its risk in the United States and abroad.

  6. FIRESTORM: Modelling the water quality risk of wildfire.

    NASA Astrophysics Data System (ADS)

    Mason, C. I.; Sheridan, G. J.; Smith, H. G.; Jones, O.; Chong, D.; Tolhurst, K.

    2012-04-01

    Following wildfire, loss of vegetation and changes to soil properties may result in decreases in infiltration rates, less rainfall interception, and higher overland flow velocities. Rainfall events affecting burn areas before vegetation recovers can cause high magnitude erosion events that impact on downstream water quality. For cities and towns that rely upon fire-prone forest catchments for water supply, wildfire impacts on water quality represent a credible risk to water supply security. Quantifying the risk associated with the occurrence of wildfires and the magnitude of water quality impacts has important implications for managing water supplies. At present, no suitable integrative model exists that considers the probabilistic nature of system inputs as well as the range of processes and scales involved in this problem. We present FIRESTORM, a new model currently in development that aims to determine the range of sediment and associated contaminant loads that may be delivered to water supply reservoirs from the combination of wildfire and subsequent rainfall events. This Monte Carlo model incorporates the probabilistic nature of fire ignition, fire weather and rainfall, and includes deterministic models for fire behaviour and locally dominant erosion processes. FIRESTORM calculates the magnitude and associated annual risk of catchment-scale sediment loads associated with the occurrence of wildfire and rainfall generated by two rain event types. The two event types are localised, high intensity, short-duration convective storms, and widespread, longer duration synoptic-scale rainfall events. Initial application and testing of the model will focus on the two main reservoirs supplying water to Melbourne, Australia, both of which are situated in forest catchments vulnerable to wildfire. Probabilistic fire ignition and weather scenarios have been combined using 40 years of fire records and weather observations. These are used to select from a dataset of over 80

  7. Application of Physiologically Based Pharmacokinetic Models in Chemical Risk Assessment

    PubMed Central

    Mumtaz, Moiz; Fisher, Jeffrey; Blount, Benjamin; Ruiz, Patricia

    2012-01-01

    Post-exposure risk assessment of chemical and environmental stressors is a public health challenge. Linking exposure to health outcomes is a 4-step process: exposure assessment, hazard identification, dose response assessment, and risk characterization. This process is increasingly adopting “in silico” tools such as physiologically based pharmacokinetic (PBPK) models to fine-tune exposure assessments and determine internal doses in target organs/tissues. Many excellent PBPK models have been developed. But most, because of their scientific sophistication, have found limited field application—health assessors rarely use them. Over the years, government agencies, stakeholders/partners, and the scientific community have attempted to use these models or their underlying principles in combination with other practical procedures. During the past two decades, through cooperative agreements and contracts at several research and higher education institutions, ATSDR funded translational research has encouraged the use of various types of models. Such collaborative efforts have led to the development and use of transparent and user-friendly models. The “human PBPK model toolkit” is one such project. While not necessarily state of the art, this toolkit is sufficiently accurate for screening purposes. Highlighted in this paper are some selected examples of environmental and occupational exposure assessments of chemicals and their mixtures. PMID:22523493

  8. Perceived risk for cancer in an urban sexual minority

    PubMed Central

    Hay, Jennifer L.; Coups, Elliot; Warren, Barbara; Li, Yuelin; Ostroff, Jamie S.

    2013-01-01

    Lesbians, gay men, and bisexuals are a sexual minority experiencing elevated cancer risk factors and health disaparites, e.g., elevated tobacco use, disproportionate rates of infection with human immunodeficiency virus. Little attention has been paid to cancer prevention, education, and control in sexual minorities. This study describes cancer risk perceptions and their correlates so as to generate testable hypotheses and provide a foundation for targeting cancer prevention and risk reduction efforts in this high risk population. A cross-sectional survey of affiliates of a large urban community center serving sexual minority persons yielded a study sample of 247 anonymous persons. The survey assessed demographics, absolute perceived cancer risk, cancer risk behaviors, desired lifestyle changes to reduce cancer risk, and psychosocial variables including stress, depression, and stigma. Univariate and multivariate nonparametric statistics were used for analyses. The sample was primarily white non-Hispanic, middle-aged, and > 80% had at least a high school education. Mean values for absolute perceived cancer risk (range 0–100% risk), were 43.0 (SD = 25.4) for females, and for males, 49.3 (SD = 24.3). For females, although the multivariate regression model for absolute perceived cancer risk was statistically significant (P < .05), no single model variable was significant. For men, the multivariate regression model was significant (P < .001), with endorsement of “don't smoke/quit smoking” to reduce personal cancer risk (P < .001), and greater number of sexual partners (P = .054), positively associated with absolute perceived risk for cancer. This study provides novel data on cancer risk perceptions in sexual minorities, identifying correlates of absolute perceived cancer risk for each gender and several potential foci for cancer prevention interventions with this at-risk group. PMID:20872174

  9. Modeling the Risk of Secondary Malignancies after Radiotherapy

    PubMed Central

    Schneider, Uwe

    2011-01-01

    In developed countries, more than half of all cancer patients receive radiotherapy at some stage in the management of their disease. However, a radiation-induced secondary malignancy can be the price of success if the primary cancer is cured or at least controlled. Therefore, there is increasing concern regarding radiation-related second cancer risks in long-term radiotherapy survivors and a corresponding need to be able to predict cancer risks at high radiation doses. Of particular interest are second cancer risk estimates for new radiation treatment modalities such as intensity modulated radiotherapy, intensity modulated arc-therapy, proton and heavy ion radiotherapy. The long term risks from such modern radiotherapy treatment techniques have not yet been determined and are unlikely to become apparent for many years, due to the long latency time for solid tumor induction. Most information on the dose-response of radiation-induced cancer is derived from data on the A-bomb survivors who were exposed to γ-rays and neutrons. Since, for radiation protection purposes, the dose span of main interest is between zero and one Gy, the analysis of the A-bomb survivors is usually focused on this range. With increasing cure rates, estimates of cancer risk for doses larger than one Gy are becoming more important for radiotherapy patients. Therefore in this review, emphasis was placed on doses relevant for radiotherapy with respect to radiation induced solid cancer. Simple radiation protection models should be used only with extreme care for risk estimates in radiotherapy, since they are developed exclusively for low dose. When applied to scatter radiation, such models can predict only a fraction of observed second malignancies. Better semi-empirical models include the effect of dose fractionation and represent the dose-response relationships more accurately. The involved uncertainties are still huge for most of the organs and tissues. A major reason for this is that the

  10. A poultry-processing model for quantitative microbiological risk assessment.

    PubMed

    Nauta, Maarten; van der Fels-Klerx, Ine; Havelaar, Arie

    2005-02-01

    A poultry-processing model for a quantitative microbiological risk assessment (QMRA) of campylobacter is presented, which can also be applied to other QMRAs involving poultry processing. The same basic model is applied in each consecutive stage of industrial processing. It describes the effects of inactivation and removal of the bacteria, and the dynamics of cross-contamination in terms of the transfer of campylobacter from the intestines to the carcass surface and the environment, from the carcasses to the environment, and from the environment to the carcasses. From the model it can be derived that, in general, the effect of inactivation and removal is dominant for those carcasses with high initial bacterial loads, and cross-contamination is dominant for those with low initial levels. In other QMRA poultry-processing models, the input-output relationship between the numbers of bacteria on the carcasses is usually assumed to be linear on a logarithmic scale. By including some basic mechanistics, it is shown that this may not be realistic. As nonlinear behavior may affect the predicted effects of risk mitigations, this finding is relevant for risk management. Good knowledge of the variability of bacterial loads on poultry entering the process is important. The common practice in microbiology of presenting only geometric means of bacterial counts is insufficient: arithmetic means are more suitable, in particular, to describe the effect of cross-contamination. The effects of logistic slaughter (scheduled processing) as a risk mitigation strategy are predicted to be small. Some additional complications in applying microbiological data obtained in processing plants are discussed.
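
    As a rough illustration of the stage-wise bookkeeping the abstract describes (inactivation/removal plus transfer between carcass and environment), the following sketch propagates bacterial counts through one generic processing stage. The transfer and removal fractions are invented for illustration and are not the published model parameters.

      def process_stage(n_carcass, n_env, p_remove=0.6, p_to_env=0.1, p_from_env=0.05):
          """One generic processing stage (assumed parameters).

          n_carcass : bacteria on the carcass entering the stage
          n_env     : bacteria in the stage environment (machinery, water)
          """
          removed = p_remove * n_carcass          # inactivation / removal
          shed = p_to_env * n_carcass             # carcass -> environment
          picked_up = p_from_env * n_env          # environment -> carcass
          n_carcass_out = n_carcass - removed - shed + picked_up
          n_env_out = n_env + shed - picked_up
          return max(n_carcass_out, 0.0), max(n_env_out, 0.0)

      # Two carcasses, one heavily and one lightly contaminated, share the same environment.
      env = 1e4
      for label, load in [("high", 1e7), ("low", 1e2)]:
          out, env = process_stage(load, env)
          print(label, "input", load, "-> output", round(out))

    Because the environment carries over between carcasses, the lightly contaminated carcass can leave the stage with more bacteria than it carried in, which is exactly the non-log-linear input-output behaviour the abstract highlights.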

  11. Risk factors correlated with risk of insulin resistance using homeostasis model assessment in adolescents in Taiwan.

    PubMed

    Lin, Shiyng-Yu; Su, Chien-Tien; Hsieh, Yi-Chen; Li, Yu-Ling; Chen, Yih-Ru; Cheng, Shu-Yun; Hu, Chien-Ming; Chen, Yi-Hua; Hsieh, Fang-I; Chiou, Hung-Yi

    2015-03-01

    The study aims to discover risk factors significantly correlated with insulin resistance among adolescents in Taiwan. A total of 339 study subjects were recruited in this cross-sectional study. A self-administered questionnaire and physical examinations including anthropometrics and biochemistry profiles were collected. Insulin resistance was assessed using the homeostasis model assessment for insulin resistance (HOMA-IR). Study subjects with abnormal levels of body mass index (odds ratio [OR] = 3.54; 95% confidence interval [CI] = 1.81-6.91), body fat (OR = 2.71; 95% CI = 1.25-5.88), and waist circumference (OR = 25.04; 95% CI = 2.93-214.14) had a significantly increased risk of IR compared with those with normal values. Furthermore, a significant joint effect of 10.86-fold risk for HOMA-IR abnormality among body fat, body mass index, and systolic blood pressure was observed. The identification of risk factors significantly correlated with IR will be important for preventing metabolic syndrome-related diseases and complications for adolescents later in life.
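
    For reference, HOMA-IR itself is computed from fasting insulin and fasting glucose using the standard formula (insulin in µU/mL times glucose in mmol/L, divided by 22.5). The snippet below applies that formula; the example values and the abnormality cut-off are illustrative only, since the paper's own threshold is not stated in the abstract.

      def homa_ir(fasting_insulin_uU_mL, fasting_glucose_mmol_L):
          # Homeostasis model assessment for insulin resistance (HOMA-IR).
          return fasting_insulin_uU_mL * fasting_glucose_mmol_L / 22.5

      # Example: 12 uU/mL fasting insulin, 5.2 mmol/L fasting glucose.
      score = homa_ir(12.0, 5.2)
      print(round(score, 2), "flagged" if score > 2.5 else "normal")  # 2.5 is an assumed cut-off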

  12. Quantitative Risk Modeling of Fire on the International Space Station

    NASA Technical Reports Server (NTRS)

    Castillo, Theresa; Haught, Megan

    2014-01-01

    The International Space Station (ISS) Program has worked to prevent fire events and to mitigate their impacts should they occur. Hardware is designed to reduce sources of ignition, oxygen systems are designed to control leaking, flammable materials are prevented from flying to ISS whenever possible, the crew is trained in fire response, and fire response equipment improvements are sought out and funded. Fire prevention and mitigation are a top ISS Program priority - however, programmatic resources are limited; thus, risk trades are made to ensure an adequate level of safety is maintained onboard the ISS. In support of these risk trades, the ISS Probabilistic Risk Assessment (PRA) team has modeled the likelihood of fire occurring in the ISS pressurized cabin, a phenomenological event that has never before been probabilistically modeled in a microgravity environment. This paper will discuss the genesis of the ISS PRA fire model, its enhancement in collaboration with fire experts, and the results which have informed ISS programmatic decisions and will continue to be used throughout the life of the program.

  13. Regime switching model for financial data: Empirical risk analysis

    NASA Astrophysics Data System (ADS)

    Salhi, Khaled; Deaconu, Madalina; Lejay, Antoine; Champagnat, Nicolas; Navet, Nicolas

    2016-11-01

    This paper constructs a regime switching model for univariate Value-at-Risk estimation. Extreme value theory (EVT) and hidden Markov models (HMM) are combined to estimate a hybrid model that takes volatility clustering into account. In the first stage, HMM is used to classify data into crisis and steady periods, while in the second stage, EVT is applied to the previously classified data to remove the delay between regime switches and their detection. This new model is applied to prices of numerous stocks exchanged on NYSE Euronext Paris over the period 2001-2011. We focus on daily returns, for which calibration has to be done on a small dataset. The relative performance of the regime switching model is benchmarked against other well-known modeling techniques, such as stable distributions, power laws and GARCH models. The empirical results show that the regime switching model increases the predictive performance of financial forecasting according to the number of violations and tail-loss tests. This suggests that the regime switching model is a robust forecasting variant of the power-law model while remaining practical to implement for VaR measurement.
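
    A minimal sketch of the two-stage idea, first a hidden Markov model classifies returns into calm and crisis regimes, then a generalized Pareto tail fitted to crisis-regime losses gives the VaR, is shown below. It assumes the hmmlearn and scipy packages and synthetic returns; it is not the authors' calibration procedure, and the threshold choice and confidence level are assumptions.

      import numpy as np
      from hmmlearn.hmm import GaussianHMM
      from scipy.stats import genpareto

      rng = np.random.default_rng(1)
      # Synthetic daily returns: a calm segment followed by a volatile one.
      returns = np.concatenate([rng.normal(0, 0.01, 800), rng.normal(0, 0.03, 200)])

      # Stage 1: classify each day into one of two regimes.
      hmm = GaussianHMM(n_components=2, covariance_type="full", n_iter=200)
      hmm.fit(returns.reshape(-1, 1))
      states = hmm.predict(returns.reshape(-1, 1))
      crisis = np.argmax(hmm.covars_.ravel())        # regime with the larger variance

      # Stage 2: fit a GPD to crisis-regime losses above a threshold and read off VaR.
      losses = -returns[states == crisis]
      u = np.quantile(losses, 0.9)                   # assumed threshold choice
      excess = losses[losses > u] - u
      xi, loc, sigma = genpareto.fit(excess, floc=0.0)
      p_exceed = np.mean(losses > u)
      alpha = 0.99
      var_99 = u + (sigma / xi) * (((1 - alpha) / p_exceed) ** (-xi) - 1)
      print("1-day 99% VaR in the crisis regime:", round(var_99, 4))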

  14. Model risk for European-style stock index options.

    PubMed

    Gençay, Ramazan; Gibson, Rajna

    2007-01-01

    In empirical modeling, there have been two strands for pricing in the options literature, namely the parametric and nonparametric models. Often, the support for the nonparametric methods is based on a benchmark such as the Black-Scholes (BS) model with constant volatility. In this paper, we study the stochastic volatility (SV) and stochastic volatility random jump (SVJ) models as parametric benchmarks against feedforward neural network (FNN) models, a class of neural network models. Our choice for FNN models is due to their well-studied universal approximation properties of an unknown function and its partial derivatives. Since the partial derivatives of an option pricing formula are risk pricing tools, an accurate estimation of the unknown option pricing function is essential for pricing and hedging. Our findings indicate that FNN models offer themselves as robust option pricing tools, over their sophisticated parametric counterparts in predictive settings. There are two routes to explain the superiority of FNN models over the parametric models in forecast settings. These are nonnormality of return distributions and adaptive learning.

  15. Modeling the operational risk in Iranian commercial banks: case study of a private bank

    NASA Astrophysics Data System (ADS)

    Momen, Omid; Kimiagari, Alimohammad; Noorbakhsh, Eaman

    2012-08-01

    The Basel Committee on Banking Supervision from the Bank for International Settlement classifies banking risks into three main categories including credit risk, market risk, and operational risk. The focus of this study is on the operational risk measurement in Iranian banks. Therefore, issues arising when trying to implement operational risk models in Iran are discussed, and then, some solutions are recommended. Moreover, all steps of operational risk measurement based on Loss Distribution Approach with Iran's specific modifications are presented. We employed the approach of this study to model the operational risk of an Iranian private bank. The results are quite reasonable, comparing the scale of bank and other risk categories.

  16. A Novel Risk Score to the Prediction of 10-year Risk for Coronary Artery Disease Among the Elderly in Beijing Based on Competing Risk Model

    PubMed Central

    Liu, Long; Tang, Zhe; Li, Xia; Luo, Yanxia; Guo, Jin; Li, Haibin; Liu, Xiangtong; Tao, Lixin; Yan, Aoshuang; Guo, Xiuhua

    2016-01-01

    Abstract The study aimed to construct a risk prediction model for coronary artery disease (CAD) based on a competing risk model among the elderly in Beijing and to develop a user-friendly CAD risk score tool. We used a competing risk model to evaluate the risk of developing a first CAD event. On the basis of the risk factors that were included in the competing risk model, we constructed the CAD risk prediction model with a Cox proportional hazard model. Time-dependent receiver operating characteristic (ROC) curves and the time-dependent area under the ROC curve (AUC) were used to evaluate the discrimination ability of both methods. Calibration plots were applied to assess the calibration ability, adjusted for the competing risk of non-CAD death. Net reclassification index (NRI) and integrated discrimination improvement (IDI) were applied to quantify the improvement contributed by the new risk factors. Internal validation of predictive accuracy was performed using 1000 bootstrap re-samples. Of the 1775 participants without CAD at baseline, 473 incident cases of CAD were documented over a 20-year follow-up. Time-dependent AUCs for men and women at t = 10 years were 0.841 [95% confidence interval (95% CI): 0.806–0.877] and 0.804 (95% CI: 0.768–0.839) in the Fine and Gray model, and 0.784 (95% CI: 0.738–0.830) and 0.733 (95% CI: 0.692–0.775) in the Cox proportional hazard model. The competing risk model was significantly superior to the Cox proportional hazard model on discrimination and calibration. The cut-off values of the risk score that marked the difference between low-risk and high-risk patients were 34 points for men and 30 points for women, which have good sensitivity and specificity. A sex-specific multivariable risk factor algorithm-based competing risk model has been developed on the basis of an elderly Chinese cohort, which could be applied to predict an individual's risk and provide a useful guide to identify the groups at a high risk for CAD among the Chinese

  17. A Novel Risk Score to the Prediction of 10-year Risk for Coronary Artery Disease Among the Elderly in Beijing Based on Competing Risk Model.

    PubMed

    Liu, Long; Tang, Zhe; Li, Xia; Luo, Yanxia; Guo, Jin; Li, Haibin; Liu, Xiangtong; Tao, Lixin; Yan, Aoshuang; Guo, Xiuhua

    2016-03-01

    The study aimed to construct a risk prediction model for coronary artery disease (CAD) based on a competing risk model among the elderly in Beijing and to develop a user-friendly CAD risk score tool. We used a competing risk model to evaluate the risk of developing a first CAD event. On the basis of the risk factors that were included in the competing risk model, we constructed the CAD risk prediction model with a Cox proportional hazard model. Time-dependent receiver operating characteristic (ROC) curves and the time-dependent area under the ROC curve (AUC) were used to evaluate the discrimination ability of both methods. Calibration plots were applied to assess the calibration ability, adjusted for the competing risk of non-CAD death. Net reclassification index (NRI) and integrated discrimination improvement (IDI) were applied to quantify the improvement contributed by the new risk factors. Internal validation of predictive accuracy was performed using 1000 bootstrap re-samples. Of the 1775 participants without CAD at baseline, 473 incident cases of CAD were documented over a 20-year follow-up. Time-dependent AUCs for men and women at t = 10 years were 0.841 [95% confidence interval (95% CI): 0.806-0.877] and 0.804 (95% CI: 0.768-0.839) in the Fine and Gray model, and 0.784 (95% CI: 0.738-0.830) and 0.733 (95% CI: 0.692-0.775) in the Cox proportional hazard model. The competing risk model was significantly superior to the Cox proportional hazard model on discrimination and calibration. The cut-off values of the risk score that marked the difference between low-risk and high-risk patients were 34 points for men and 30 points for women, which have good sensitivity and specificity. A sex-specific multivariable risk factor algorithm-based competing risk model has been developed on the basis of an elderly Chinese cohort, which could be applied to predict an individual's risk and provide a useful guide to identify the groups at a high risk for CAD among the Chinese adults over 55
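
    The contrast between the Fine and Gray approach and a standard Cox analysis comes down to how the competing event (non-CAD death) is handled. As a small, self-contained illustration of why this matters, the sketch below computes a nonparametric cumulative incidence function in the presence of a competing event and compares it with the naive 1 minus Kaplan-Meier estimate that treats competing deaths as censoring. The data and event codes are synthetic; this is not the paper's model.

      import numpy as np

      rng = np.random.default_rng(2)
      n = 2000
      t_cad = rng.exponential(25, n)        # time to CAD (years), synthetic
      t_death = rng.exponential(15, n)      # time to competing death, synthetic
      t_cens = rng.uniform(5, 20, n)        # administrative censoring, synthetic

      time = np.minimum.reduce([t_cad, t_death, t_cens])
      event = np.where(t_cad == time, 1, np.where(t_death == time, 2, 0))  # 1=CAD, 2=death, 0=censored

      def cumulative_incidence(time, event, cause, horizon):
          order = np.argsort(time)
          time, event = time[order], event[order]
          at_risk, surv, cif = len(time), 1.0, 0.0
          for t, e in zip(time, event):
              if t > horizon:
                  break
              if e == cause:
                  cif += surv * (1.0 / at_risk)   # cause-specific hazard times overall survival
              if e != 0:
                  surv *= 1.0 - 1.0 / at_risk     # overall survival drops for any event
              at_risk -= 1
          return cif

      def one_minus_km(time, event, cause, horizon):
          # Naive estimate: competing events treated as censoring.
          order = np.argsort(time)
          time, event = time[order], event[order]
          at_risk, surv = len(time), 1.0
          for t, e in zip(time, event):
              if t > horizon:
                  break
              if e == cause:
                  surv *= 1.0 - 1.0 / at_risk
              at_risk -= 1
          return 1.0 - surv

      print("10-year CAD risk, competing-risk CIF:", round(cumulative_incidence(time, event, 1, 10), 3))
      print("10-year CAD risk, naive 1-KM:", round(one_minus_km(time, event, 1, 10), 3))

    The naive estimate is systematically larger because subjects who die of other causes can no longer develop CAD, which is the bias the competing risk model removes.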

  18. Reduced Risk of Importing Ebola Virus Disease because of Travel Restrictions in 2014: A Retrospective Epidemiological Modeling Study

    PubMed Central

    Otsuki, Shiori

    2016-01-01

    Background An epidemic of Ebola virus disease (EVD) from 2013–16 posed a serious risk of global spread during its early growth phase. A post-epidemic evaluation of the effectiveness of travel restrictions has yet to be conducted. The present study aimed to estimate the effectiveness of travel restrictions in reducing the risk of importation from mid-August to September 2014, using a simple hazard-based statistical model. Methodology/Principal Findings The hazard rate was modeled as an inverse function of the effective distance, an excellent predictor of disease spread, which was calculated from the airline transportation network. By analyzing datasets of the dates of EVD case importation from the 15th of July to the 15th of September 2014, and assuming that the network structure changed from the 8th of August 2014 because of travel restrictions, parameters that characterized the hazard rate were estimated. The absolute and relative risk reductions due to travel restrictions were estimated to be less than 1% and about 20%, respectively, for all models tested. Effectiveness estimates among African countries were greater than those for countries outside Africa. Conclusions The travel restrictions were not effective enough to prevent the global spread of Ebola virus disease. It is more efficient to control the spread of disease locally during an early phase of an epidemic than to attempt to control the epidemic at international borders. Capacity building for local containment and coordinated and expedited international cooperation are essential to reduce the risk of global transmission. PMID:27657544
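
    To make the "hazard as an inverse function of effective distance" idea concrete, the sketch below builds a toy airline network, computes effective distances as shortest paths with link lengths 1 - ln(p) (the commonly cited effective-distance definition, which may or may not match the paper's exact formulation), and converts an assumed hazard into an importation probability. The airports, passenger fractions, scale constant and time window are all invented, and the networkx package is assumed to be available.

      import math
      import networkx as nx

      # Toy airline network: P[m][n] = fraction of passengers leaving m that fly to n (assumed numbers).
      P = {
          "GIN": {"SEN": 0.4, "FRA": 0.3, "CDG": 0.3},
          "SEN": {"CDG": 0.5, "JFK": 0.2, "GIN": 0.3},
          "FRA": {"JFK": 0.6, "CDG": 0.4},
          "CDG": {"JFK": 0.7, "FRA": 0.3},
      }

      G = nx.DiGraph()
      for m, dests in P.items():
          for n, p in dests.items():
              G.add_edge(m, n, weight=1.0 - math.log(p))   # effective length of one link

      source = "GIN"                                       # assumed outbreak origin
      d_eff = nx.shortest_path_length(G, source=source, weight="weight")

      # Hazard assumed inversely proportional to effective distance; the probability of at
      # least one importation over a window of length T days follows from that hazard.
      c, T = 0.05, 30.0                                    # hypothetical scale constant and window
      for airport, d in sorted(d_eff.items(), key=lambda kv: kv[1]):
          if airport == source:
              continue
          hazard = c / d
          print(airport, "d_eff=%.2f" % d, "P(import within %d d)=%.2f" % (T, 1 - math.exp(-hazard * T)))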

  19. A Spatio-temporal Model of African Animal Trypanosomosis Risk

    PubMed Central

    Dicko, Ahmadou H.; Percoma, Lassane; Sow, Adama; Adam, Yahaya; Mahama, Charles; Sidibé, Issa; Dayo, Guiguigbaza-Kossigan; Thévenon, Sophie; Fonta, William; Sanfo, Safietou; Djiteye, Aligui; Salou, Ernest; Djohan, Vincent; Cecchi, Giuliano; Bouyer, Jérémy

    2015-01-01

    Background African animal trypanosomosis (AAT) is a major constraint to sustainable development of cattle farming in sub-Saharan Africa. The habitat of the tsetse fly vector is increasingly fragmented owing to demographic pressure and shifts in climate, which leads to heterogeneous risk of cyclical transmission both in space and time. In Burkina Faso and Ghana, the most important vectors are riverine species, namely Glossina palpalis gambiensis and G. tachinoides, which are more resilient to human-induced changes than the savannah and forest species. Although many authors studied the distribution of AAT risk both in space and time, spatio-temporal models allowing predictions of it are lacking. Methodology/Principal Findings We used datasets generated by various projects, including two baseline surveys conducted in Burkina Faso and Ghana within PATTEC (Pan African Tsetse and Trypanosomosis Eradication Campaign) national initiatives. We computed the entomological inoculation rate (EIR) or tsetse challenge using a range of environmental data. The tsetse apparent density and their infection rate were separately estimated and subsequently combined to derive the EIR using a “one layer-one model” approach. The estimated EIR was then projected into suitable habitat. This risk index was finally validated against data on bovine trypanosomosis. It allowed a good prediction of the parasitological status (r2 = 67%), showed a positive correlation but less predictive power with serological status (r2 = 22%) aggregated at the village level but was not related to the illness status (r2 = 2%). Conclusions/Significance The presented spatio-temporal model provides a fine-scale picture of the dynamics of AAT risk in sub-humid areas of West Africa. The estimated EIR was high in the proximity of rivers during the dry season and more widespread during the rainy season. The present analysis is a first step in a broader framework for an efficient risk management of climate

  20. Peer Review of NRC Standardized Plant Analysis Risk Models

    SciTech Connect

    Anthony Koonce; James Knudsen; Robert Buell

    2011-03-01

    The Nuclear Regulatory Commission (NRC) Standardized Plant Analysis Risk (SPAR) Models underwent a Peer Review using ASME PRA standard (Addendum C) as endorsed by NRC in Regulatory Guide (RG) 1.200. The review was performed by a mix of industry probabilistic risk analysis (PRA) experts and NRC PRA experts. Representative SPAR models, one PWR and one BWR, were reviewed against Capability Category I of the ASME PRA standard. Capability Category I was selected as the basis for review due to the specific uses/applications of the SPAR models. The BWR SPAR model was reviewed against 331 ASME PRA Standard Supporting Requirements; however, based on the Capability Category I level of review and the absence of internal flooding and containment performance (LERF) logic only 216 requirements were determined to be applicable. Based on the review, the BWR SPAR model met 139 of the 216 supporting requirements. The review also generated 200 findings or suggestions. Of these 200 findings and suggestions 142 were findings and 58 were suggestions. The PWR SPAR model was also evaluated against the same 331 ASME PRA Standard Supporting Requirements. Of these requirements only 215 were deemed appropriate for the review (for the same reason as noted for the BWR). The PWR review determined that 125 of the 215 supporting requirements met Capability Category I or greater. The review identified 101 findings or suggestions (76 findings and 25 suggestions). These findings or suggestions were developed to identify areas where SPAR models could be enhanced. A process to prioritize and incorporate the findings/suggestions supporting requirements into the SPAR models is being developed. The prioritization process focuses on those findings that will enhance the accuracy, completeness and usability of the SPAR models.

  1. Use Of Absolute Function And Its Associates In Formation And `Redevelopment' Of Mathematical Models In Some Plant-Related Quantitative Physiology: Salinity Effects On Leaf Development Of Schefflera arboricola And Harvest Index In Rice

    NASA Astrophysics Data System (ADS)

    Selamat, Ahmad; Awang, Yahya; Mohamed, Mahmud T. M.; Wahab, Zakaria; Osman, Mohammad

    2008-01-01

    The roles of quantitative physiology have become more apparent and crucial in the ICT era. Because they are based on rate-related variables, most mathematical models take the form of non-linear functions describing responses, or observed within-plant process outcomes, versus time. Even if some responses change drastically at a certain point within a biological unit or space of a plant system, the response curve should depend on a continuous independent-variable range over the period of determination, and should not, biologically, be a function of an independent-variable range that requires `IF' statement(s). Subjected to a nutrient solution of high salinity (6.0 mS cm^-1), the leaf turgidity (measured as leaf surface area) of S. arboricola, initially described by one form of the logistic growth function [y = 1/(a + b e^(-cx))], was abruptly reduced, as explained by a model containing Absolute function (ABS) terms with tan^-1(x) and a parameter for leaf life expectancy as affected by the high-salinity growing medium at a certain number of days after planting. This yielded an overall function y = 1/(a + b e^(-cx)) - A[tan^-1{(x-B)/D} + ABS(tan^-1{(x-B)/D})]E, where a, b, c, A, B, D, and E are constants, most of which can be interpreted biologically. The constant B plays the role of the `IF statement' normally used in other mathematical functions. Plants subjected to lower salinity (<3.0 mS cm^-1) followed only the function y = 1/(a + b e^(-cx)). In the harvest index, or HI (economic yield/above-ground biomass), study of 20 rice varieties grown over two planting seasons, the long flattened tails on both sides of a central peak of the function y = R + B(T + ABS(B-x)) e^(-k(T+ABS(B-x))) indicated that varieties maturing at 123 to 133 days after transplanting had high HI values. In our observation, the Absolute (ABS) function coupled with some terms could be used in the formation of some mathematical functions
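
    The role of the ABS term is easiest to see numerically: for x < B the arctangent is negative and cancels against its absolute value, so the correction vanishes and the curve is pure logistic; for x > B the term switches on, which is the `IF-statement-like' behaviour the authors describe. The sketch below simply evaluates the combined leaf-area function with made-up parameter values (the published estimates are not given in the abstract), and the trailing constant E is treated here as a simple multiplier, which is one possible reading of the formula.

      import numpy as np

      def leaf_area(x, a, b, c, A, B, D, E):
          """Logistic growth with an absolute-function correction switching on at x = B."""
          logistic = 1.0 / (a + b * np.exp(-c * x))
          t = np.arctan((x - B) / D)
          return logistic - A * (t + np.abs(t)) * E   # correction term is zero for x < B

      x = np.arange(0, 121)                 # days after planting
      a, b, c = 0.02, 1.0, 0.08             # assumed logistic constants
      A, B, D, E = 2.0, 60.0, 5.0, 1.0      # assumed salinity-response constants
      y = leaf_area(x, a, b, c, A, B, D, E)
      print("area at day 50:", round(y[50], 2), " area at day 90:", round(y[90], 2))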

  2. Approximate Uncertainty Modeling in Risk Analysis with Vine Copulas

    PubMed Central

    Bedford, Tim; Daneshkhah, Alireza

    2015-01-01

    Many applications of risk analysis require us to jointly model multiple uncertain quantities. Bayesian networks and copulas are two common approaches to modeling joint uncertainties with probability distributions. This article focuses on new methodologies for copulas by developing work of Cooke, Bedford, Kurowica, and others on vines as a way of constructing higher dimensional distributions that do not suffer from some of the restrictions of alternatives such as the multivariate Gaussian copula. The article provides a fundamental approximation result, demonstrating that we can approximate any density as closely as we like using vines. It further operationalizes this result by showing how minimum information copulas can be used to provide parametric classes of copulas that have such good levels of approximation. We extend previous approaches using vines by considering nonconstant conditional dependencies, which are particularly relevant in financial risk modeling. We discuss how such models may be quantified, in terms of expert judgment or by fitting data, and illustrate the approach by modeling two financial data sets. PMID:26332240

  3. Approximate Uncertainty Modeling in Risk Analysis with Vine Copulas.

    PubMed

    Bedford, Tim; Daneshkhah, Alireza; Wilson, Kevin J

    2016-04-01

    Many applications of risk analysis require us to jointly model multiple uncertain quantities. Bayesian networks and copulas are two common approaches to modeling joint uncertainties with probability distributions. This article focuses on new methodologies for copulas by developing work of Cooke, Bedford, Kurowica, and others on vines as a way of constructing higher dimensional distributions that do not suffer from some of the restrictions of alternatives such as the multivariate Gaussian copula. The article provides a fundamental approximation result, demonstrating that we can approximate any density as closely as we like using vines. It further operationalizes this result by showing how minimum information copulas can be used to provide parametric classes of copulas that have such good levels of approximation. We extend previous approaches using vines by considering nonconstant conditional dependencies, which are particularly relevant in financial risk modeling. We discuss how such models may be quantified, in terms of expert judgment or by fitting data, and illustrate the approach by modeling two financial data sets.

  4. Future bloom and blossom frost risk for Malus domestica considering climate model and impact model uncertainties.

    PubMed

    Hoffmann, Holger; Rath, Thomas

    2013-01-01

    The future bloom and risk of blossom frosts for Malus domestica were projected using regional climate realizations and phenological (= impact) models. As climate impact projections are susceptible to uncertainties of climate and impact models and model concatenation, the significant horizon of the climate impact signal was analyzed by applying 7 impact models, including two new developments, on 13 climate realizations of the IPCC emission scenario A1B. Advancement of phenophases and a decrease in blossom frost risk for Lower Saxony (Germany) for early and late ripeners was determined by six out of seven phenological models. Single model/single grid point time series of bloom showed significant trends by 2021-2050 compared to 1971-2000, whereas the joint signal of all climate and impact models did not stabilize until 2043. Regarding blossom frost risk, joint projection variability exceeded the projected signal. Thus, blossom frost risk cannot be stated to be lower by the end of the 21st century despite a negative trend. As a consequence it is however unlikely to increase. Uncertainty of temperature, blooming date and blossom frost risk projection reached a minimum at 2078-2087. The projected phenophases advanced by 5.5 d K^-1, showing partial compensation of delayed fulfillment of the winter chill requirement and faster completion of the following forcing phase in spring. Finally, phenological model performance was improved by considering the length of day.

  5. Quantifying Systemic Risk by Solutions of the Mean-Variance Risk Model

    PubMed Central

    Morgenstern, Ingo

    2016-01-01

    The world is still recovering from the financial crisis that peaked in September 2008. The triggering event was the bankruptcy of Lehman Brothers. To detect such turmoils, one can investigate the time-dependent behaviour of correlations between assets or indices. These cross-correlations have been connected to the systemic risks within markets by several studies in the aftermath of this crisis. We study 37 different US indices which cover almost all aspects of the US economy and show that monitoring an average investor’s behaviour can be used to quantify times of increased risk. In this paper the overall investing strategy is approximated by the ground-states of the mean-variance model along the efficient frontier bound to real-world constraints. Changes in the behaviour of the average investor are utilized as an early warning sign. PMID:27351482
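
    A "ground state" of the mean-variance model at a given target return is the minimum-variance portfolio subject to full investment and that return; tracking how its composition shifts over time is the kind of average-investor monitoring the abstract describes. The closed-form sketch below solves the equality-constrained problem for a handful of synthetic assets, without the short-sale or other real-world constraints the study imposes.

      import numpy as np

      def min_variance_weights(mu, cov, target_return):
          """Closed-form mean-variance optimum with constraints w'1 = 1 and w'mu = target."""
          n = len(mu)
          ones = np.ones(n)
          inv = np.linalg.inv(cov)
          # Lagrange-multiplier solution: weights are a combination of inv@1 and inv@mu.
          a = ones @ inv @ ones
          b = ones @ inv @ mu
          c = mu @ inv @ mu
          det = a * c - b * b
          lam = (c - b * target_return) / det
          gam = (a * target_return - b) / det
          return inv @ (lam * ones + gam * mu)

      rng = np.random.default_rng(3)
      mu = np.array([0.04, 0.06, 0.08, 0.10])              # assumed expected returns
      A = rng.normal(size=(4, 4))
      cov = A @ A.T / 10 + np.eye(4) * 0.01                # synthetic covariance matrix
      w = min_variance_weights(mu, cov, target_return=0.07)
      print("weights:", np.round(w, 3), "sum:", round(w.sum(), 3))
      print("portfolio variance:", round(float(w @ cov @ w), 5))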

  6. The characteristics of lightning risk and zoning in Beijing simulated by a risk assessment model

    NASA Astrophysics Data System (ADS)

    Hu, H.; Wang, J.; Pan, J.

    2014-08-01

    In this study, the cloud-to-ground (CG) lightning flash/stroke density was derived from the lightning location finder (LLF) data recorded between 2007 and 2011. The vulnerability of land surfaces was then assessed from the classification of the study areas into buildings, outdoor areas under the building canopy and open-field areas, which makes it convenient to deduce the location factor and confirm the protective capability. Subsequently, the potential number of dangerous lightning events at a location could be estimated from the product of the CG stroke density and the location's vulnerability. Although the human beings and all their material properties are identically exposed to lightning, the lightning casualty risk and property loss risk was assessed respectively due to their vulnerability discrepancy. Our analysis of the CG flash density in Beijing revealed that the valley of JuMaHe to the southwest, the ChangPing-ShunYi zone downwind of the Beijing metropolis, and the mountainous PingGu-MiYun zone near the coast are the most active lightning areas, with densities greater than 1.5 flashes km^-2 year^-1. Moreover, the mountainous northeastern, northern, and northwestern rural areas are relatively more vulnerable to lightning because the high-elevation terrain attracts lightning and there is little protection. In contrast, lightning incidents by induced lightning are most likely to occur in densely populated urban areas, and the property damage caused by lightning here is more extensive than that in suburban and rural areas. However, casualty incidents caused by direct lightning strokes seldom occur in urban areas. On the other hand, the simulation based on the lightning risk assessment model (LRAM) demonstrates that the casualty risk is higher in rural areas, whereas the property loss risk is higher in urban areas, and this conclusion is also supported by the historical casualty and damage reports.

  7. The characteristics of lightning risk and zoning in Beijing simulated by a risk assessment model

    NASA Astrophysics Data System (ADS)

    Hu, H.; Wang, J.; Pan, J.

    2013-08-01

    In this study, the Cloud-to-Ground (CG) lightning flash/stroke density has been derived from the Lightning Location Finder (LLF) data recorded in recent years. Meanwhile, the vulnerability of land surfaces has been assessed by classification into buildings, outdoor areas under the building canopy and open-field areas, which makes it convenient to deduce the location factor and confirm the protective capability. Then, the number of dangerous lightning events can be estimated as the product of the CG stroke density and the vulnerability. Although human beings and all their material properties are identically exposed to lightning, the lightning casualty risk and property loss risk have been assessed separately due to their vulnerability discrepancy. The analysis of the CG flash density in Beijing revealed that the JuMaHe river valley in the southwestern region, the ChangPing-ShunYi zone downwind of the Beijing metropolis, and the mountainous PingGu-MiYun zone near the seashore are the most active lightning areas, with densities greater than 1.5 flashes km^-2 yr^-1. Moreover, the mountainous northeastern, northern, and northwestern rural areas are relatively vulnerable to lightning due to the ability of high-elevation terrain to attract lightning and the lack of protection measures. In contrast, lightning incidents from indirect lightning are most likely to occur in urban areas with high population density and aggregated properties, and the property damage caused by lightning there is more extensive than that in suburban and rural areas. However, casualty incidents caused by direct lightning strokes seldom occur in urban areas. On the other hand, the simulation based on the Lightning Risk Assessment Model (LRAM) demonstrates that the casualty risk is higher in rural areas, whereas the property loss risk is higher in urban areas, and this conclusion is also supported by the historical casualty and damage reports.

  8. Guide for developing conceptual models for ecological risk assessments

    SciTech Connect

    Suter, G.W., II

    1996-05-01

    Ecological conceptual models are the result of the problem formulation phase of an ecological risk assessment, which is an important component of the Remedial Investigation process. They present hypotheses of how the site contaminants might affect the site ecology. The contaminant sources, routes, media, and endpoint receptors are presented in the form of a flow chart. This guide is for preparing the conceptual models; use of this guide will standardize the models so that they will be of high quality, useful to the assessment process, and sufficiently consistent so that connections between sources of exposure and receptors can be extended across operable units (OU). Generic conceptual models are presented for source, aquatic integrator, groundwater integrator, and terrestrial OUs.

  9. Modeling and managing risk early in software development

    NASA Technical Reports Server (NTRS)

    Briand, Lionel C.; Thomas, William M.; Hetmanski, Christopher J.

    1993-01-01

    In order to improve the quality of the software development process, we need to be able to build empirical multivariate models based on data collectable early in the software process. These models need to be both useful for prediction and easy to interpret, so that remedial actions may be taken in order to control and optimize the development process. We present an automated modeling technique which can be used as an alternative to regression techniques. We show how it can be used to facilitate the identification and aid the interpretation of the significant trends which characterize 'high risk' components in several Ada systems. Finally, we evaluate the effectiveness of our technique based on a comparison with logistic regression based models.

  10. Biomarkers of leukemia risk: benzene as a model.

    PubMed Central

    Smith, M T; Zhang, L

    1998-01-01

    Although relatively rare, leukemias place a considerable financial burden on society and cause psychologic trauma to many families. Leukemia is the most common cancer in children. The causes of leukemia in adults and children are largely unknown, but occupational and environmental factors are strongly suspected. Genetic predisposition may also play a major role. Our aim is to use molecular epidemiology and toxicology to find the cause of leukemia and develop biomarkers of leukemia risk. We have studied benzene as a model chemical leukemogen, and we have identified risk factors for susceptibility to benzene toxicity. Numerous studies have associated exposure to benzene with increased levels of chromosome aberrations in circulating lymphocytes of exposed workers. Increased levels of chromosome aberrations have, in turn, been correlated with a heightened risk of cancer, especially for hematologic malignancy, in two recent cohort studies in Europe. Conventional chromosome analysis is laborious, however, and requires highly trained personnel. Further, it lacks statistical power, as only a small number of cells can be examined. The recently developed fluorescence in situ hybridization (FISH) and polymerase chain reaction (PCR)-based technologies have allowed the detection of specific chromosome aberrations. These techniques are far less time consuming and are more sensitive than classical chromosomal analysis. Because leukemias commonly show a variety of specific chromosome aberrations, detection of these aberrations by FISH and PCR in peripheral blood may provide improved biomarkers of leukemia risk. PMID:9703476

  11. Predictive model of avian electrocution risk on overhead power lines.

    PubMed

    Dwyer, J F; Harness, R E; Donohue, K

    2014-02-01

    Electrocution on overhead power structures negatively affects avian populations in diverse ecosystems worldwide, contributes to the endangerment of raptor populations in Europe and Africa, and is a major driver of legal action against electric utilities in North America. We investigated factors associated with avian electrocutions so poles that are likely to electrocute a bird can be identified and retrofitted prior to causing avian mortality. We used historical data from southern California to identify patterns of avian electrocution by voltage, month, and year to identify species most often killed by electrocution in our study area and to develop a predictive model that compared poles where an avian electrocution was known to have occurred (electrocution poles) with poles where no known electrocution occurred (comparison poles). We chose variables that could be quantified by personnel with little training in ornithology or electric systems. Electrocutions were more common at distribution voltages (≤ 33 kV) and during breeding seasons and were more commonly reported after a retrofitting program began. Red-tailed Hawks (Buteo jamaicensis) (n = 265) and American Crows (Corvus brachyrhynchos) (n = 258) were the most commonly electrocuted species. In the predictive model, 4 of 14 candidate variables were required to distinguish electrocution poles from comparison poles: number of jumpers (short wires connecting energized equipment), number of primary conductors, presence of grounding, and presence of unforested unpaved areas as the dominant nearby land cover. When tested against a sample of poles not used to build the model, our model distributed poles relatively normally across electrocution-risk values and identified the average risk as higher for electrocution poles relative to comparison poles. Our model can be used to reduce avian electrocutions through proactive identification and targeting of high-risk poles for retrofitting.

  12. The Temporal Version of the Pediatric Sepsis Biomarker Risk Model

    PubMed Central

    Wong, Hector R.; Weiss, Scott L.; Giuliano, John S.; Wainwright, Mark S.; Cvijanovich, Natalie Z.; Thomas, Neal J.; Allen, Geoffrey L.; Anas, Nick; Bigham, Michael T.; Hall, Mark; Freishtat, Robert J.; Sen, Anita; Meyer, Keith; Checchia, Paul A.; Shanley, Thomas P.; Nowak, Jeffrey; Quasney, Michael; Chopra, Arun; Fitzgerald, Julie C.; Gedeit, Rainer; Banschbach, Sharon; Beckman, Eileen; Harmon, Kelli; Lahni, Patrick; Lindsell, Christopher J.

    2014-01-01

    Background PERSEVERE is a risk model for estimating mortality probability in pediatric septic shock, using five biomarkers measured within 24 hours of clinical presentation. Objective Here, we derive and test a temporal version of PERSEVERE (tPERSEVERE) that considers biomarker values at the first and third day following presentation to estimate the probability of a “complicated course”, defined as persistence of ≥2 organ failures at seven days after meeting criteria for septic shock, or death within 28 days. Methods Biomarkers were measured in the derivation cohort (n = 225) using serum samples obtained during days 1 and 3 of septic shock. Classification and Regression Tree (CART) analysis was used to derive a model to estimate the risk of a complicated course. The derived model was validated in the test cohort (n = 74), and subsequently updated using the combined derivation and test cohorts. Results A complicated course occurred in 23% of the derivation cohort subjects. The derived model had a sensitivity for a complicated course of 90% (95% CI 78–96), specificity was 70% (62–77), positive predictive value was 47% (37–58), and negative predictive value was 96% (91–99). The area under the receiver operating characteristic curve was 0.85 (0.79–0.90). Similar test characteristics were observed in the test cohort. The updated model had a sensitivity of 91% (81–96), a specificity of 70% (64–76), a positive predictive value of 47% (39–56), and a negative predictive value of 96% (92–99). Conclusions tPERSEVERE reasonably estimates the probability of a complicated course in children with septic shock. tPERSEVERE could potentially serve as an adjunct to physiological assessments for monitoring how risk for poor outcomes changes during early interventions in pediatric septic shock. PMID:24626215

  13. New full velocity difference model considering the driver’s heterogeneity of the disturbance risk preference for car-following theory

    NASA Astrophysics Data System (ADS)

    Zeng, You-Zhi; Zhang, Ning

    2016-12-01

    This paper proposes a new full velocity difference model for car-following theory that considers the driver’s heterogeneity of the disturbance risk preference, in order to investigate its effects on traffic flow instability when the driver reacts to the relative velocity. We obtain the traffic flow instability condition and the calculation method for the unstable-region headway range and the probability of traffic congestion caused by a small disturbance. The analysis shows that the driver’s heterogeneity of the disturbance risk preference has important effects on traffic flow instability: (1) traffic flow instability is independent of the absolute size of the driver’s disturbance risk preference coefficient and depends on the ratio of the preceding vehicle driver’s disturbance risk preference coefficient to the following vehicle driver’s disturbance risk preference coefficient; (2) the smaller this ratio, the lower the traffic flow instability, and vice versa. This provides some viable ideas for suppressing traffic congestion.

  14. Risk Factors and Prediction Models for Retinopathy of Prematurity

    PubMed Central

    Senthil, Mallika Prem; Salowi, Mohamad Aziz; Bujang, Mohamad Adam; Kueh, Adeline; Siew, Chong Min; Sumugam, Kala; Gaik, Chan Lee; Kah, Tan Aik

    2015-01-01

    Objectives To develop a simple prediction model for the pre-screening of Retinopathy of Prematurity (ROP) among preterm babies. Methods This was a prospective study. The test dataset (January 2007 until December 2010) was used to construct risk prediction models, and the validation dataset (January 2011 until March 2012) was used to validate the models developed from the test dataset. Two prediction models were produced using the test dataset based on logistic regression equations in which the development of ROP was used as the outcome. Results The sensitivity and specificity for model 1 [gestational age (GA), birth weight (BW), intraventricular haemorrhage (IVH) and respiratory distress syndrome (RDS)] was 82 % and 81.7%, respectively; for model 2, (GA and BW) the sensitivity and specificity were 80.5% and 80.3%, respectively. Conclusion Model 2 was preferable, as it only required two predictors (GA and BW). Our prediction model can be used for early detection of ROP to avoid poor outcomes. PMID:28239269
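
    Model 2 in the abstract is a two-predictor logistic regression on gestational age and birth weight. The sketch below shows the shape of such a model on synthetic data; the coefficients, case mix and example infant are invented and should not be read as the published equation.

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(4)
      n = 500
      ga = rng.normal(31, 3, n)                   # gestational age in weeks, synthetic
      bw = rng.normal(1600, 400, n)               # birth weight in grams, synthetic
      # Synthetic outcome: lower GA and BW increase ROP risk.
      logit = 12 - 0.35 * ga - 0.002 * bw
      rop = rng.random(n) < 1 / (1 + np.exp(-logit))

      X = np.column_stack([ga, bw])
      model = LogisticRegression().fit(X, rop)

      # Risk for a hypothetical infant: 28 weeks, 1100 g.
      p = model.predict_proba([[28.0, 1100.0]])[0, 1]
      print("predicted ROP probability:", round(p, 2))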

  15. Interpreting incremental value of markers added to risk prediction models.

    PubMed

    Pencina, Michael J; D'Agostino, Ralph B; Pencina, Karol M; Janssens, A Cecile J W; Greenland, Philip

    2012-09-15

    The discrimination of a risk prediction model measures that model's ability to distinguish between subjects with and without events. The area under the receiver operating characteristic curve (AUC) is a popular measure of discrimination. However, the AUC has recently been criticized for its insensitivity in model comparisons in which the baseline model has performed well. Thus, 2 other measures have been proposed to capture improvement in discrimination for nested models: the integrated discrimination improvement and the continuous net reclassification improvement. In the present study, the authors use mathematical relations and numerical simulations to quantify the improvement in discrimination offered by candidate markers of different strengths as measured by their effect sizes. They demonstrate that the increase in the AUC depends on the strength of the baseline model, which is true to a lesser degree for the integrated discrimination improvement. On the other hand, the continuous net reclassification improvement depends only on the effect size of the candidate variable and its correlation with other predictors. These measures are illustrated using the Framingham model for incident atrial fibrillation. The authors conclude that the increase in the AUC, integrated discrimination improvement, and net reclassification improvement offer complementary information and thus recommend reporting all 3 alongside measures characterizing the performance of the final model.
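
    For concreteness, both event-based measures discussed here can be computed directly from the predicted risks of the baseline and extended models. The snippet below implements the usual definitions of the continuous NRI and the IDI on synthetic predictions; it does not use the Framingham atrial fibrillation data.

      import numpy as np

      def continuous_nri(p_old, p_new, y):
          up, down = p_new > p_old, p_new < p_old
          ev, ne = y == 1, y == 0
          nri_events = up[ev].mean() - down[ev].mean()
          nri_nonevents = down[ne].mean() - up[ne].mean()
          return nri_events + nri_nonevents

      def idi(p_old, p_new, y):
          ev, ne = y == 1, y == 0
          return (p_new[ev].mean() - p_old[ev].mean()) - (p_new[ne].mean() - p_old[ne].mean())

      rng = np.random.default_rng(5)
      y = (rng.random(1000) < 0.2).astype(int)
      p_old = np.clip(0.2 + 0.1 * rng.normal(size=1000) + 0.1 * y, 0.01, 0.99)   # synthetic baseline risks
      p_new = np.clip(p_old + 0.05 * y - 0.02 * (1 - y) + 0.02 * rng.normal(size=1000), 0.01, 0.99)
      print("continuous NRI:", round(continuous_nri(p_old, p_new, y), 3))
      print("IDI:", round(idi(p_old, p_new, y), 3))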

  16. Architecture for Integrated Medical Model Dynamic Probabilistic Risk Assessment

    NASA Technical Reports Server (NTRS)

    Jaworske, D. A.; Myers, J. G.; Goodenow, D.; Young, M.; Arellano, J. D.

    2016-01-01

    Probabilistic Risk Assessment (PRA) is a modeling tool used to predict potential outcomes of a complex system based on a statistical understanding of many initiating events. Utilizing a Monte Carlo method, thousands of instances of the model are considered and outcomes are collected. PRA is considered static, utilizing probabilities alone to calculate outcomes. Dynamic Probabilistic Risk Assessment (dPRA) is an advanced concept where modeling predicts the outcomes of a complex system based not only on the probabilities of many initiating events, but also on a progression of dependencies brought about by progressing down a time line. Events are placed in a single time line, adding each event to a queue, as managed by a planner. Progression down the time line is guided by rules, as managed by a scheduler. The recently developed Integrated Medical Model (IMM) summarizes astronaut health as governed by the probabilities of medical events and mitigation strategies. Managing the software architecture process provides a systematic means of creating, documenting, and communicating a software design early in the development process. The software architecture process begins with establishing requirements and the design is then derived from the requirements.

  17. Climate-based risk models for Fasciola hepatica in Colombia.

    PubMed

    Valencia-López, Natalia; Malone, John B; Carmona, Catalina Gómez; Velásquez, Luz E

    2012-09-01

    A predictive Fasciola hepatica model, based on the growing degree day-water budget (GDD-WB) concept and the known biological requirements of the parasite, was developed within a geographical information system (GIS) in Colombia. Climate-based forecast index (CFI) values were calculated and represented in a national-scale climate grid (18 x 18 km) using ArcGIS 9.3. A mask overlay was used to exclude unsuitable areas where mean annual temperature exceeded 25 °C, the upper threshold for development and propagation of the F. hepatica life cycle. The model was then validated and further developed by studies limited to one department in northwest Colombia. F. hepatica prevalence data were obtained from a 2008-2010 survey of 6,016 dairy cattle at 673 herd study sites in 10 municipalities, for which global positioning system coordinates were recorded. The CFI map results were compared to F. hepatica environmental risk models for the survey data points that had over 5% prevalence (231 of the 673 sites) at the 1 km^2 scale using two independent approaches: (i) a GIS map query based on satellite data parameters including elevation, enhanced vegetation index and land surface temperature day-night difference; and (ii) an ecological niche model (MaxEnt), for which geographic point coordinates of F. hepatica survey farms were used with BioClim data as environmental variables to develop a probability map. The predicted risk pattern of both approaches was similar to that seen in the forecast index grid. The temporal risk, evaluated by the monthly CFIs and a daily GDD-WB forecast software for 2007 and 2008, revealed a major July-August to January transmission period with considerable inter-annual differences.
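
    The growing degree day component of a GDD-WB forecast index accumulates daily thermal units above a base temperature, here capped at the 25 °C upper threshold mentioned in the abstract. The sketch below computes only that accumulation; the water-budget side and the full CFI weighting are not reproduced, and the base temperature and the daily temperatures are assumptions.

      def daily_gdd(t_min, t_max, t_base=10.0, t_upper=25.0):
          """Degree days for one day, clipped to the developmental window (t_base assumed)."""
          t_mean = (t_min + t_max) / 2.0
          t_mean = min(max(t_mean, t_base), t_upper)
          return t_mean - t_base

      # One hypothetical month of daily minima/maxima (degrees C).
      temps = [(12, 24), (14, 27), (9, 18), (16, 29), (11, 22)] * 6
      gdd_total = sum(daily_gdd(tmin, tmax) for tmin, tmax in temps)
      print("accumulated growing degree days:", round(gdd_total, 1))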

  18. Absolute optical metrology : nanometers to kilometers

    NASA Technical Reports Server (NTRS)

    Dubovitsky, Serge; Lay, O. P.; Peters, R. D.; Liebe, C. C.

    2005-01-01

    We provide an overview of developments in the field of high-accuracy absolute optical metrology with emphasis on space-based applications. Specific work on the Modulation Sideband Technology for Absolute Ranging (MSTAR) sensor is described along with novel applications of the sensor.

  19. ON A SUFFICIENT CONDITION FOR ABSOLUTE CONTINUITY.

    DTIC Science & Technology

    The formulation of a condition which yields absolute continuity when combined with continuity and bounded variation is the problem considered in the...Briefly, the formulation is achieved through a discussion which develops a proof by contradiction of a sufficiency theorem for absolute continuity which uses in its hypothesis the condition of continuity and bounded variation.

  20. Introducing the Mean Absolute Deviation "Effect" Size

    ERIC Educational Resources Information Center

    Gorard, Stephen

    2015-01-01

    This paper revisits the use of effect sizes in the analysis of experimental and similar results, and reminds readers of the relative advantages of the mean absolute deviation as a measure of variation, as opposed to the more complex standard deviation. The mean absolute deviation is easier to use and understand, and more tolerant of extreme…
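
    Gorard's proposal amounts to dividing the difference in group means by a mean absolute deviation rather than by a pooled standard deviation. A minimal computation is shown below; the groups are invented numbers, and scaling by the comparison group's MAD is one common variant rather than necessarily the paper's exact formulation.

      import numpy as np

      def mean_abs_dev(x):
          x = np.asarray(x, dtype=float)
          return np.mean(np.abs(x - x.mean()))

      def mad_effect_size(treatment, control):
          # Difference in means scaled by the comparison group's mean absolute deviation.
          return (np.mean(treatment) - np.mean(control)) / mean_abs_dev(control)

      treatment = [14, 17, 15, 19, 16, 18]
      control = [12, 15, 13, 14, 11, 16]
      print("MAD effect size:", round(mad_effect_size(treatment, control), 2))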

  1. Monolithically integrated absolute frequency comb laser system

    SciTech Connect

    Wanke, Michael C.

    2016-07-12

    Rather than down-convert optical frequencies, a QCL laser system directly generates a THz frequency comb in a compact monolithically integrated chip that can be locked to an absolute frequency without the need of a frequency-comb synthesizer. The monolithic, absolute frequency comb can provide a THz frequency reference and tool for high-resolution broad band spectroscopy.

  2. Age-specific absolute and relative organ weight distributions for B6C3F1 mice.

    PubMed

    Marino, Dale J

    2012-01-01

    The B6C3F1 mouse is the standard mouse strain used in toxicology studies conducted by the National Cancer Institute (NCI) and the National Toxicology Program (NTP). While numerous reports have been published on growth, survival, and tumor incidence, no overall compilation of organ weight data is available. Importantly, organ weight change is an endpoint used by regulatory agencies to develop toxicity reference values (TRVs) for use in human health risk assessments. Furthermore, physiologically based pharmacokinetic (PBPK) models, which utilize relative organ weights, are increasingly being used to develop TRVs. Therefore, all available absolute and relative organ weight data for untreated control B6C3F1 mice were collected from NCI/NTP studies in order to develop age-specific distributions. Results show that organ weights were collected more frequently in NCI/NTP studies at 2-wk (60 studies), 3-mo (147 studies), and 15-mo (40 studies) intervals than at other intervals, and more frequently from feeding and inhalation than drinking water studies. Liver, right kidney, lung, heart, thymus, and brain weights were most frequently collected. From the collected data, the mean and standard deviation for absolute and relative organ weights were calculated. Results show age-related increases in absolute liver, right kidney, lung, and heart weights and relatively stable brain and right testis weights. The results suggest a general variability trend in absolute organ weights of brain < right testis < right kidney < heart < liver < lung < spleen < thymus. This report describes the results of this effort.

  3. Absolute instability of the Gaussian wake profile

    NASA Technical Reports Server (NTRS)

    Hultgren, Lennart S.; Aggarwal, Arun K.

    1987-01-01

    Linear parallel-flow stability theory has been used to investigate the effect of viscosity on the local absolute instability of a family of wake profiles with a Gaussian velocity distribution. The type of local instability, i.e., convective or absolute, is determined by the location of a branch-point singularity with zero group velocity of the complex dispersion relation for the instability waves. The effects of viscosity were found to be weak for values of the wake Reynolds number, based on the center-line velocity defect and the wake half-width, larger than about 400. Absolute instability occurs only for sufficiently large values of the center-line wake defect. The critical value of this parameter increases with decreasing wake Reynolds number, thereby indicating a shrinking region of absolute instability with decreasing wake Reynolds number. If backflow is not allowed, absolute instability does not occur for wake Reynolds numbers smaller than about 38.

  4. GERMcode: A Stochastic Model for Space Radiation Risk Assessment

    NASA Technical Reports Server (NTRS)

    Kim, Myung-Hee Y.; Ponomarev, Artem L.; Cucinotta, Francis A.

    2012-01-01

    A new computer model, the GCR Event-based Risk Model code (GERMcode), was developed to describe biophysical events from high-energy protons and high charge and energy (HZE) particles that have been studied at the NASA Space Radiation Laboratory (NSRL) for the purpose of simulating space radiation biological effects. In the GERMcode, the biophysical description of the passage of HZE particles in tissue and shielding materials is made with a stochastic approach that includes both particle track structure and nuclear interactions. The GERMcode accounts for the major nuclear interaction processes of importance for describing heavy ion beams, including nuclear fragmentation, elastic scattering, and knockout-cascade processes by using the quantum multiple scattering fragmentation (QMSFRG) model. The QMSFRG model has been shown to be in excellent agreement with available experimental data for nuclear fragmentation cross sections. For NSRL applications, the GERMcode evaluates a set of biophysical properties, such as the Poisson distribution of particles or delta-ray hits for a given cellular area and particle dose, the radial dose on tissue, and the frequency distribution of energy deposition in a DNA volume. By utilizing the ProE/Fishbowl ray-tracing analysis, the GERMcode will be used as a bi-directional radiation transport model for future spacecraft shielding analysis in support of Mars mission risk assessments. Recent radiobiological experiments suggest the need for new approaches to risk assessment that include time-dependent biological events due to the signaling times for activation and relaxation of biological processes in cells and tissue. Thus, the tracking of the temporal and spatial distribution of events in tissue is a major goal of the GERMcode in support of the simulation of biological processes important in GCR risk assessments. In order to validate our approach, basic radiobiological responses such as cell survival curves, mutation, chromosomal

  5. Value-at-risk prediction using context modeling

    NASA Astrophysics Data System (ADS)

    Denecker, K.; van Assche, S.; Crombez, J.; Vander Vennet, R.; Lemahieu, I.

    2001-04-01

    In financial market risk measurement, Value-at-Risk (VaR) techniques have proven to be a very useful and popular tool. Unfortunately, most VaR estimation models suffer from major drawbacks: the lognormal (Gaussian) modeling of the returns does not take into account the observed fat tail distribution and the non-stationarity of the financial instruments severely limits the efficiency of the VaR predictions. In this paper, we present a new approach to VaR estimation which is based on ideas from the field of information theory and lossless data compression. More specifically, the technique of context modeling is applied to estimate the VaR by conditioning the probability density function on the present context. Tree-structured vector quantization is applied to partition the multi-dimensional state space of both macroeconomic and microeconomic priors into an increasing but limited number of context classes. Each class can be interpreted as a state of aggregation with its own statistical and dynamic behavior, or as a random walk with its own drift and step size. Results on the US S&P500 index, obtained using several evaluation methods, show the strong potential of this approach and prove that it can be applied successfully for, amongst other useful applications, VaR and volatility prediction. The October 1997 crash is indicated in time.
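
    As a point of reference for the record above, the following is a minimal sketch of a plain historical-simulation VaR, i.e., the empirical lower quantile of past returns; it is not the context-modeling estimator the abstract describes, and the return series and confidence level are hypothetical.

```python
import numpy as np

def historical_var(returns, confidence=0.99):
    """One-day Value-at-Risk as the empirical lower quantile of past returns.

    Plain historical simulation: no context classes, no regime conditioning.
    VaR is reported as a positive loss figure.
    """
    return -np.quantile(np.asarray(returns), 1.0 - confidence)

# Hypothetical fat-tailed daily return series standing in for an index.
rng = np.random.default_rng(0)
returns = rng.standard_t(df=4, size=2500) * 0.01
print(f"99% one-day VaR: {historical_var(returns):.4f}")
```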

  6. Risk-based Multiobjective Optimization Model for Bridge Maintenance Planning

    SciTech Connect

    Yang, I-T.; Hsu, Y.-S.

    2010-05-21

    Determining the optimal maintenance plan is essential for successful bridge management. The optimization objectives are defined as minimizing life-cycle cost and maximizing performance indicators. Previous bridge maintenance models assumed that the process of bridge deterioration and the estimate of maintenance cost are deterministic, i.e., known with certainty. This assumption, however, is invalid, especially for estimates over the long time horizon of a bridge's life. In this study, we consider the risks associated with bridge deterioration and maintenance cost in determining the optimal maintenance plan. The decision variables include the strategic choice of essential maintenance (such as silane treatment and cathodic protection) and the intervals between periodic maintenance. An epsilon-constrained Particle Swarm Optimization algorithm is used to approximate the tradeoff between life-cycle cost and performance indicators. During the stochastic search for optimal solutions, Monte Carlo simulation is used to evaluate the impact of risks on the objective values at an acceptance level of reliability. The proposed model helps decision makers select a compromise maintenance plan from a group of alternative choices, each of which leads to a different level of performance and life-cycle cost. A numerical example is used to illustrate the proposed model.
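
    The sketch below illustrates only the Monte Carlo evaluation step mentioned above: each candidate maintenance interval is scored by a life-cycle-cost quantile at an assumed acceptance level of reliability. The cost and deterioration distributions are hypothetical, and the epsilon-constrained particle swarm search itself is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(42)

def life_cycle_cost_q90(interval_yrs, horizon_yrs=75, n_sims=10_000):
    """Monte Carlo life-cycle cost for one candidate plan, summarized as the
    cost not exceeded with 90% reliability (hypothetical distributions)."""
    n_actions = horizon_yrs // interval_yrs
    unit_cost = rng.lognormal(mean=np.log(50_000), sigma=0.3, size=n_sims)
    # Longer intervals allow more deterioration, inflating each intervention.
    deterioration = rng.gamma(shape=2.0, scale=0.5 * interval_yrs, size=n_sims)
    total_cost = n_actions * unit_cost * (1.0 + 0.05 * deterioration)
    return np.percentile(total_cost, 90)

for interval in (5, 10, 15):
    print(f"maintenance every {interval:>2} yr -> 90th-percentile cost {life_cycle_cost_q90(interval):>12,.0f}")
```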

  7. The role of models in estimating consequences as part of the risk assessment process.

    PubMed

    Forde-Folle, K; Mitchell, D; Zepeda, C

    2011-08-01

    The degree of disease risk represented by the introduction, spread, or establishment of one or several diseases through the importation of animals and animal products is assessed by importing countries through an analysis of risk. The components of a risk analysis include hazard identification, risk assessment, risk management, and risk communication. A risk assessment starts with identification of the hazard(s) and then continues with four interrelated steps: release assessment, exposure assessment, consequence assessment, and risk estimation. Risk assessments may be either qualitative or quantitative. This paper describes how, through the integration of epidemiological and economic models, the potential adverse biological and economic consequences of exposure can be quantified.

  8. Toward The Absolute Age of M92 With MIST

    NASA Astrophysics Data System (ADS)

    Choi, Jieun; Conroy, Charlie; Dotter, Aaron; Weisz, Daniel; Rosenfield, Philip; Dolphin, Andrew

    2016-08-01

    Globular clusters provide a fundamental link between stars and galaxies. For example, it has been suggested that ultra faint dwarf galaxies formed all of their stars prior to the epoch of reionization, but this conclusion hinges entirely on the striking similarity of their stellar populations to the ancient, metal-poor globular cluster M92. The accurate measurement of absolute ages of ancient globular clusters therefore has direct implications for the formation histories of the smallest galaxies in the Universe. However, a reliable determination of the absolute ages of globular clusters has proven to be a challenge due to uncertainties in stellar physics and complications in how the models are compared to observations. I will present preliminary results from a comprehensive study to measure the absolute age of M92 using high-quality HST archival imaging data. We pair our new MESA Isochrones and Stellar Tracks (MIST) models with a full CMD fitting framework to jointly fit multi-color CMDs, taking into account the uncertainties in abundances, distance, and stellar physics. The goal of this project is two-fold. First, we aim to provide the most secure absolute age of M92 to date with robustly estimated uncertainties. Second, we explore and quantify the degeneracies between uncertain physical quantities and model variables, such as the distance, mixing-length-alpha parameter, and helium abundance, with the ultimate goal of better constraining these unknowns with data from ongoing and future surveys such as K2, Gaia, TESS, JWST, and WFIRST.

  9. A mathematical biologist's guide to absolute and convective instability.

    PubMed

    Sherratt, Jonathan A; Dagbovie, Ayawoa S; Hilker, Frank M

    2014-01-01

    Mathematical models have been highly successful at reproducing the complex spatiotemporal phenomena seen in many biological systems. However, the ability to numerically simulate such phenomena currently far outstrips detailed mathematical understanding. This paper reviews the theory of absolute and convective instability, which has the potential to redress this imbalance in some cases. In spatiotemporal systems, unstable steady states subdivide into two categories. Those that are absolutely unstable are not relevant in applications except as generators of spatial or spatiotemporal patterns, but convectively unstable steady states can occur as persistent features of solutions. The authors explain the concepts of absolute and convective instability, and also the related concepts of remnant and transient instability. They give examples of their use in explaining qualitative transitions in solution behaviour. They then describe how to distinguish different types of instability, focussing on the relatively new approach of the absolute spectrum. They also discuss the use of the theory for making quantitative predictions on how spatiotemporal solutions change with model parameters. The discussion is illustrated throughout by numerical simulations of a model for river-based predator-prey systems.

  10. Nonstationary decision model for flood risk decision scaling

    NASA Astrophysics Data System (ADS)

    Spence, Caitlin M.; Brown, Casey M.

    2016-11-01

    Hydroclimatic stationarity is increasingly questioned as a default assumption in flood risk management (FRM), but successor methods are not yet established. Some potential successors depend on estimates of future flood quantiles, but methods for estimating future design storms are subject to high levels of uncertainty. Here we apply a Nonstationary Decision Model (NDM) to flood risk planning within the decision scaling framework. The NDM combines a nonstationary probability distribution of annual peak flow with optimal selection of flood management alternatives using robustness measures. The NDM incorporates structural and nonstructural FRM interventions and valuation of flows supporting ecosystem services to calculate expected cost of a given FRM strategy. A search for the minimum-cost strategy under incrementally varied representative scenarios extending across the plausible range of flood trend and value of the natural flow regime discovers candidate FRM strategies that are evaluated and compared through a decision scaling analysis (DSA). The DSA selects a management strategy that is optimal or close to optimal across the broadest range of scenarios or across the set of scenarios deemed most likely to occur according to estimates of future flood hazard. We illustrate the decision framework using a stylized example flood management decision based on the Iowa City flood management system, which has experienced recent unprecedented high flow episodes. The DSA indicates a preference for combining infrastructural and nonstructural adaptation measures to manage flood risk and makes clear that options-based approaches cannot be assumed to be "no" or "low regret."

  11. Documentation of the Ecological Risk Assessment Computer Model ECORSK.5

    SciTech Connect

    Anthony F. Gallegos; Gilbert J. Gonzales

    1999-06-01

    The FORTRAN77 ecological risk computer model--ECORSK.5--has been used to estimate the potential toxicity of surficial deposits of radioactive and non-radioactive contaminants to several threatened and endangered (T and E) species at the Los Alamos National Laboratory (LANL). These analyses to date include preliminary toxicity estimates for the Mexican spotted owl, the American peregrine falcon, the bald eagle, and the southwestern willow flycatcher. This work has been performed as required for the Record of Decision for the construction of the Dual Axis Radiographic Hydrodynamic Test (DARHT) Facility at LANL as part of the Environmental Impact Statement. The model is dependent on the use of the geographic information system and associated software--ARC/INFO--and has been used in conjunction with LANL's Facility for Information Management and Display (FIMAD) contaminant database. The integration of FIMAD data and ARC/INFO using ECORSK.5 allows the generation of spatial information from a gridded area of potential exposure called an Ecological Exposure Unit. ECORSK.5 was used to simulate exposures using a modified Environmental Protection Agency Quotient Method. The model can handle a large number of contaminants within the home range of T and E species. This integration results in the production of hazard indices which, when compared to risk evaluation criteria, estimate the potential for impact from consumption of contaminants in food and ingestion of soil. The assessment is considered a Tier-2 type of analysis. This report summarizes and documents the ECORSK.5 code, the mathematical models used in the development of ECORSK.5, and the input and other requirements for its operation. Other auxiliary FORTRAN 77 codes used for processing and graphing output from ECORSK.5 are also discussed. The reader may refer to reports cited in the introduction to obtain greater detail on past applications of ECORSK.5 and assumptions used in deriving model parameters.
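
    A minimal sketch of the quotient-method aggregation that such an assessment rests on: a hazard index is the sum of exposure-dose to toxicity-reference-value ratios over contaminants. The intakes and TRVs below are illustrative placeholders, not values from the ECORSK.5 analyses.

```python
def hazard_index(intakes_mg_kg_day, trvs_mg_kg_day):
    """EPA-style quotient method: sum of (exposure dose / toxicity reference value).
    A hazard index above 1 flags potential for adverse effects."""
    return sum(dose / trvs_mg_kg_day[c] for c, dose in intakes_mg_kg_day.items())

# Purely illustrative intakes (food plus incidental soil ingestion, summed)
# and toxicity reference values for one receptor.
intakes = {"cadmium": 0.012, "mercury": 0.0004}
trvs = {"cadmium": 0.010, "mercury": 0.0010}
print(f"Hazard index: {hazard_index(intakes, trvs):.2f}")
```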

  12. Building a Better Model: A Personalized Breast Cancer Risk Model Incorporating Breast Density to Stratify Risk and Improve Application of Resources

    DTIC Science & Technology

    2012-10-01

    Subject terms: breast cancer; risk model; mammography; breast density. Only fragments of this DTIC report abstract survive: the project aims to improve prediction of an individual woman's breast cancer risk, beginning with personalized mammography screening decisions, through work on breast density measurement (correction, comparison of several software methods, precision measurement, and evaluation of variation by mammography machine vendor) evaluated against a gold-standard manual method.

  13. Applications of physiologic pharmacokinetic modeling in carcinogenic risk assessment.

    PubMed Central

    Krewski, D; Withey, J R; Ku, L F; Andersen, M E

    1994-01-01

    The use of physiologically based pharmacokinetic (PBPK) models has been proposed as a means of estimating the dose of the reactive metabolites of carcinogenic xenobiotics reaching target tissues, thereby affording an opportunity to base estimates of potential cancer risk on tissue dose rather than external levels of exposure. In this article, we demonstrate how a PBPK model can be constructed by specifying mass-balance equations for each physiological compartment included in the model. In general, this leads to a system of nonlinear partial differential equations with which to characterize the compartment system. These equations then can be solved numerically to determine the concentration of metabolites in each compartment as functions of time. In the special case of a linear pharmacokinetic system, we present simple closed-form expressions for the area under the concentration-time curves (AUC) in individual tissue compartments. A general relationship between the AUC in blood and other tissue compartments is also established. These results are of use in identifying those parameters in the models that characterize the integrated tissue dose, and which should therefore be the primary focus of sensitivity analyses. Applications of PBPK modeling for purposes of tissue dosimetry are reviewed, including models developed for methylene chloride, ethylene oxide, 1,4-dioxane, 1-nitropyrene, as well as polychlorinated biphenyls, dioxins, and furans. Special considerations in PBPK modeling related to aging, topical absorption, pregnancy, and mixed exposures are discussed. The linkage between pharmacokinetic models used for tissue dosimetry and pharmacodynamic models for neoplastic transformation of stem cells in the target tissue is explored. PMID:7737040
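
    To make the mass-balance idea concrete, the sketch below integrates a linear two-compartment toy model (blood and liver) and reports the area under each concentration-time curve (AUC). All parameter values are hypothetical, and a real PBPK model would include more compartments and, in general, nonlinear metabolism.

```python
import numpy as np

# Minimal linear two-compartment sketch (blood + liver), not a full PBPK model.
# All parameters are hypothetical: blood flow Q (L/h), volumes V (L),
# tissue:blood partition coefficient P, first-order metabolism k_met (1/h).
Q, V_b, V_l, P, k_met = 90.0, 5.0, 1.8, 3.0, 2.0
dose_rate, t_infusion = 10.0, 1.0      # mg/h intravenous infusion for 1 h
dt, t_end = 0.001, 12.0                # h

t = np.arange(0.0, t_end, dt)
C_b = np.zeros_like(t)                 # blood concentration (mg/L)
C_l = np.zeros_like(t)                 # liver concentration (mg/L)
for i in range(1, t.size):
    infusion = dose_rate if t[i] < t_infusion else 0.0
    # Mass balance: arterial delivery at C_b, venous return from liver at C_l/P.
    dCb = (infusion + Q * (C_l[i - 1] / P - C_b[i - 1])) / V_b
    dCl = Q * (C_b[i - 1] - C_l[i - 1] / P) / V_l - k_met * C_l[i - 1]
    C_b[i] = C_b[i - 1] + dt * dCb
    C_l[i] = C_l[i - 1] + dt * dCl

# Integrated tissue dose: area under each concentration-time curve (mg*h/L).
print("AUC blood:", round(float(C_b.sum() * dt), 3))
print("AUC liver:", round(float(C_l.sum() * dt), 3))
```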

  14. Assessing patient safety risk before the injury occurs: an introduction to sociotechnical probabilistic risk modelling in health care

    PubMed Central

    Marx, D; Slonim, A

    2003-01-01

    Since 1 July 2001 the Joint Commission on Accreditation of Healthcare Organizations (JCAHO) has required each accredited hospital to conduct at least one proactive risk assessment annually. Failure modes and effects analysis (FMEA) was recommended as one tool for conducting this task. This paper examines the limitations of FMEA and introduces a second tool used by the aviation and nuclear industries to examine low frequency, high impact events in complex systems. The adapted tool, known as sociotechnical probabilistic risk assessment (ST-PRA), provides an alternative for proactively identifying, prioritizing, and mitigating patient safety risk. The uniqueness of ST-PRA is its ability to model combinations of equipment failures, human error, at risk behavioral norms, and recovery opportunities through the use of fault trees. While ST-PRA is a complex, high end risk modelling tool, it provides an opportunity to visualize system risk in a manner that is not possible through FMEA. PMID:14645893

  15. Assessing patient safety risk before the injury occurs: an introduction to sociotechnical probabilistic risk modelling in health care.

    PubMed

    Marx, D A; Slonim, A D

    2003-12-01

    Since 1 July 2001 the Joint Commission on Accreditation of Healthcare Organizations (JCAHO) has required each accredited hospital to conduct at least one proactive risk assessment annually. Failure modes and effects analysis (FMEA) was recommended as one tool for conducting this task. This paper examines the limitations of FMEA and introduces a second tool used by the aviation and nuclear industries to examine low frequency, high impact events in complex systems. The adapted tool, known as sociotechnical probabilistic risk assessment (ST-PRA), provides an alternative for proactively identifying, prioritizing, and mitigating patient safety risk. The uniqueness of ST-PRA is its ability to model combinations of equipment failures, human error, at risk behavioral norms, and recovery opportunities through the use of fault trees. While ST-PRA is a complex, high end risk modelling tool, it provides an opportunity to visualize system risk in a manner that is not possible through FMEA.

  16. Absolute realization of low BRDF value

    NASA Astrophysics Data System (ADS)

    Liu, Zilong; Liao, Ningfang; Li, Ping; Wang, Yu

    2010-10-01

    Low BRDF values are widely used in critical domains such as space and military affairs; these values are below 0.1 sr^-1, so their absolute realization is the central issue in absolute BRDF measurement. To develop an absolute-value realization theory for BRDF, we define arithmetic operators for BRDF and derive an absolute measurement equation for BRDF based on radiance. This is a new theoretical method for solving the realization problem of low BRDF values. The method is implemented on a self-designed common double-orientation structure in space. By designing an additional structure to extend the range of the measurement system, together with control and processing software, absolute realization of low BRDF values is achieved. A material with low BRDF was measured in this system, and its spectral BRDF values at different angles throughout the space are all below 0.4 sr^-1. This process is a representative procedure for the measurement of low BRDF values. A corresponding uncertainty analysis of the measurement data is given, based on the new theory of absolute realization and the performance of the measurement system. The relative expanded uncertainty of the measurement data is 0.078. This uncertainty analysis is applicable to all measurements that use the new theory of absolute realization and the corresponding measurement system.

  17. Applications of the International Space Station Probabilistic Risk Assessment Model

    NASA Technical Reports Server (NTRS)

    Grant, Warren; Lutomski, Michael G.

    2011-01-01

    Recently the International Space Station (ISS) has incorporated more Probabilistic Risk Assessments (PRAs) in the decision making process for significant issues. Future PRAs will have major impact to ISS and future spacecraft development and operations. These PRAs will have their foundation in the current complete ISS PRA model and the current PRA trade studies that are being analyzed as requested by ISS Program stakeholders. ISS PRAs have recently helped in the decision making process for determining reliability requirements for future NASA spacecraft and commercial spacecraft, making crew rescue decisions, as well as making operational requirements for ISS orbital orientation, planning Extravehicular activities (EVAs) and robotic operations. This paper will describe some applications of the ISS PRA model and how they impacted the final decision. This paper will discuss future analysis topics such as life extension, requirements of new commercial vehicles visiting ISS.

  18. Modeling financial disaster risk management in developing countries

    NASA Astrophysics Data System (ADS)

    Mechler, R.; Hochrainer, S.; Pflug, G.; Linnerooth-Bayer, J.

    2005-12-01

    The public sector plays a major role in reducing the long-term economic repercussions of disasters by repairing damaged infrastructure and providing financial assistance to households and businesses. If critical infrastructure is not repaired in a timely manner, there can be serious effects on the economy and the livelihoods of the population. The repair of public infrastructure, however, can be a significant drain on public budgets especially in developing and transition countries. Developing country governments frequently lack the liquidity, even including international aid and loans, to fully repair damaged critical public infrastructure or provide sufficient support to households and businesses for their recovery. The earthquake in Gujarat, and other recent cases of government post-disaster liquidity crises, have sounded an alarm, prompting financial development organizations, such as the World Bank, among others, to call for greater attention to reducing financial vulnerability and increasing the resilience of the public sector. This talk reports on a model designed to illustrate the tradeoffs and choices a developing country must make in financially managing the economic risks due to natural disasters. Budgetary resources allocated to pre-disaster risk management strategies, such as loss mitigation measures, a catastrophe reserve fund, insurance and contingent credit arrangements for public assets, reduce the probability of financing gaps - the inability of governments to meet their full obligations in providing relief to private victims and restoring public infrastructure - or prevent the deterioration of the ability to undertake additional borrowing without incurring a debt crisis. The model -which is equipped with a graphical interface - can be a helpful tool for building capacity of policy makers for developing and assessing public financing strategies for disaster risk by indicating the respective costs and consequences of financing alternatives.

  19. Modeling human risk: Cell & molecular biology in context

    SciTech Connect

    1997-06-01

    It is anticipated that early in the next century manned missions into outer space will occur, with a mission to Mars scheduled between 2015 and 2020. However, before such missions can be undertaken, a realistic estimation of the potential risks to the flight crews is required. One of the uncertainties remaining in this risk estimation is that posed by the effects of exposure to the radiation environment of outer space. Although the composition of this environment is fairly well understood, the biological effects arising from exposure to it are not. The reasons for this are three-fold: (1) A small but highly significant component of the radiation spectrum in outer space consists of highly charged, high energy (HZE) particles which are not routinely experienced on earth, and for which there are insufficient data on biological effects; (2) Most studies on the biological effects of radiation to date have been high-dose, high dose-rate, whereas in space, with the exception of solar particle events, radiation exposures will be low-dose, low dose-rate; (3) Although it has been established that the virtual absence of gravity in space has a profound effect on human physiology, it is not clear whether these effects will act synergistically with those of radiation exposure. A select panel will evaluate the use of experiments and models to accurately predict the risks associated with exposure to HZE particles. Topics of research include cellular and tissue response, health effects associated with radiation damage, model animal systems, and critical markers of radiation response.

  20. Uses of risk importance measures

    NASA Astrophysics Data System (ADS)

    Mankamo, Tuomas; Porn, Kurt; Holmberg, Jan

    1991-05-01

    The definitions, relationships, and interpretations of the three most basic risk importance measures are outlined: risk increase factor, risk decrease factor, and fractional contribution. Risk importance measures provide an understandable and practical way of presenting probabilistic safety analysis results, which too often tend to remain abstract numbers without real insight into the content. The fundamental aspect of importance measures is that they express some specific influence of a basic event on the total risk. The basic failure or error events are the elements from which the reliability and risk models are constituted. The importance measures are relative, which is an advantage compared to absolute risk numbers, due to insensitivity with respect to quantification uncertainties. Therefore, they are particularly suited to give first-hand guidance on where to focus the main interest from the system's risk and reliability point of view, and from where to continue the analysis with more sophisticated methods requiring more effort.
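
    Once a risk model R(q) over basic-event probabilities is written down, the three measures follow directly. The sketch below uses a toy two-term risk expression with hypothetical event probabilities.

```python
def risk(q, force=None):
    """Toy total-risk model R = q1*q2 + q3 over basic-event probabilities q.
    `force` optionally pins one basic event to 0 or 1 for importance calculations."""
    p = dict(q)
    if force is not None:
        event, value = force
        p[event] = value
    return p["q1"] * p["q2"] + p["q3"]

q = {"q1": 1e-2, "q2": 3e-2, "q3": 1e-4}     # hypothetical basic-event probabilities
R0 = risk(q)
for event in q:
    rif = risk(q, (event, 1.0)) / R0          # risk increase factor (event certain)
    rdf = R0 / risk(q, (event, 0.0))          # risk decrease factor (event eliminated)
    fc = (R0 - risk(q, (event, 0.0))) / R0    # fractional contribution to total risk
    print(f"{event}: RIF = {rif:6.1f}   RDF = {rdf:5.2f}   FC = {fc:6.1%}")
```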

  1. Modeling urban flood risk territories for Riga city

    NASA Astrophysics Data System (ADS)

    Piliksere, A.; Sennikovs, J.; Virbulis, J.; Bethers, U.; Bethers, P.; Valainis, A.

    2012-04-01

    Riga, the capital of Latvia, is located on the River Daugava at the Gulf of Riga. The main flooding risks for Riga city are: (1) storm-caused water setup in the southern part of the Gulf of Riga (storm event), (2) water level increase caused by Daugava River discharge maxima (spring snow melting event), and (3) strong rainfall or rapid snow melting in densely populated urban areas. The first two flooding factors were discussed previously (Piliksere et al., 2011). The aims of the study were (1) the identification of flood risk situations in densely populated areas, (2) the quantification of flooding scenarios caused by rain and snow melting events of different return periods nowadays, in the near future (2021-2050), and in the far future (2071-2100), taking into account projections of climate change, (3) the estimation of groundwater levels for Riga city, (4) the building and calibration of a hydrological mathematical model based on SWMM (EPA, 2004) for the domain potentially vulnerable to rain and snow melt flooding events, (5) the calculation of rain and snow melting flood events with different return periods, and (6) the mapping of potentially flooded areas on a fine grid. The time series of short-term precipitation events during the warm period of the year (i.e., rain events) were analyzed over a 35-year period. Annual maxima of precipitation intensity for events of different duration (5 min; 15 min; 1 h; 3 h; 6 h; 12 h; 1 day; 2 days; 4 days; 10 days) were calculated. The time series of long-term simultaneous precipitation data and observations of the reduction in snow cover thickness were analyzed over a 27-year period. Snow thawing periods were detected, and maxima of snow melting intensity for events of different duration (1 day; 2 days; 4 days; 7 days; 10 days) were calculated. According to the occurrence probability, six scenarios for each event for nowadays, the near future, and the far future, with return periods of once in 5, 10, 20, 50, 100 and 200 years, were constructed based on

  2. Model-based risk analysis of coupled process steps.

    PubMed

    Westerberg, Karin; Broberg-Hansen, Ernst; Sejergaard, Lars; Nilsson, Bernt

    2013-09-01

    A section of a biopharmaceutical manufacturing process involving the enzymatic coupling of a polymer to a therapeutic protein was characterized with regards to the process parameter sensitivity and design space. To minimize the formation of unwanted by-products in the enzymatic reaction, the substrate was added in small amounts and unreacted protein was separated using size-exclusion chromatography (SEC) and recycled to the reactor. The quality of the final recovered product was thus a result of the conditions in both the reactor and the SEC, and a design space had to be established for both processes together. This was achieved by developing mechanistic models of the reaction and SEC steps, establishing the causal links between process conditions and product quality. Model analysis was used to complement the qualitative risk assessment, and design space and critical process parameters were identified. The simulation results gave an experimental plan focusing on the "worst-case regions" in terms of product quality and yield. In this way, the experiments could be used to verify both the suggested process and the model results. This work demonstrates the necessary steps of model-assisted process analysis, from model development through experimental verification.

  3. Model simulations to estimate malaria risk under climate change.

    PubMed

    Jetten, T H; Martens, W J; Takken, W

    1996-05-01

    The current geographic range of malaria is much smaller than its potential range. In many regions there exists a phenomenon characterized as "Anophelism without malaria." The vectors are present but malaria transmission does not occur. Vectorial capacity has often been used as a parameter to estimate the susceptibility of an area to malaria. Model computations with global climatological data show that a dynamic concept of vectorial capacity can be used as a comparative risk indicator to predict the current extent and distribution of malarious regions in the world. A sensitivity analysis done in three distinct geographic areas shows that the areas of largest change in epidemic potential caused by a temperature increase are those where mosquitoes already occur but where development of the parasite is limited by temperature. Computations with the model presented here predict, under different climate scenarios, an increased malaria risk in areas bordering malaria-endemic regions and at higher altitudes within malarious regions under a temperature increase of 2-4 degrees C.
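
    As a worked illustration of the temperature dependence described above, the sketch below evaluates the classical (MacDonald-type) vectorial capacity formula with a degree-day model for the extrinsic incubation period; the entomological values and degree-day constants are illustrative and are not taken from the paper's dynamic model.

```python
import math

def vectorial_capacity(m, a, p, n):
    """Classical (MacDonald-type) vectorial capacity: C = m * a**2 * p**n / (-ln p).
    m: mosquitoes per person, a: daily human-biting rate,
    p: daily survival probability, n: extrinsic incubation period (days)."""
    return m * a ** 2 * p ** n / (-math.log(p))

def eip_days(temp_c, degree_days=111.0, t_min=16.0):
    """Illustrative degree-day model: the extrinsic incubation period shortens
    as temperature rises above the development threshold t_min."""
    return degree_days / max(temp_c - t_min, 1e-6)

for temp in (20.0, 22.0, 24.0):
    n = eip_days(temp)
    c = vectorial_capacity(m=10.0, a=0.3, p=0.9, n=n)
    print(f"{temp:.0f} degC: EIP = {n:5.1f} d, vectorial capacity = {c:.2f}")
```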

  4. Foraging and predation risk for larval cisco (Coregonus artedi) in Lake Superior: a modelling synthesis of empirical survey data

    USGS Publications Warehouse

    Myers, Jared T.; Yule, Daniel L.; Jones, Michael L.; Quinlan, Henry R.; Berglund, Eric K.

    2014-01-01

    The relative importance of predation and food availability as contributors to larval cisco (Coregonus artedi) mortality in Lake Superior were investigated using a visual foraging model to evaluate potential predation pressure by rainbow smelt (Osmerus mordax) and a bioenergetic model to evaluate potential starvation risk. The models were informed by observations of rainbow smelt, larval cisco, and zooplankton abundance at three Lake Superior locations during the period of spring larval cisco emergence and surface-oriented foraging. Predation risk was highest at Black Bay, ON, where average rainbow smelt densities in the uppermost 10 m of the water column were >1000 ha−1. Turbid conditions at the Twin Ports, WI-MN, affected larval cisco predation risk because rainbow smelt remained suspended in the upper water column during daylight, placing them alongside larval cisco during both day and night hours. Predation risk was low at Cornucopia, WI, owing to low smelt densities (<400 ha−1) and deep light penetration, which kept rainbow smelt near the lakebed and far from larvae during daylight. In situ zooplankton density estimates were low compared to the values used to develop the larval coregonid bioenergetics model, leading to predictions of negative growth rates for 10 mm larvae at all three locations. The model predicted that 15 mm larvae were capable of attaining positive growth at Cornucopia and the Twin Ports where low water temperatures (2–6 °C) decreased their metabolic costs. Larval prey resources were highest at Black Bay but warmer water temperatures there offset the benefit of increased prey availability. A sensitivity analysis performed on the rainbow smelt visual foraging model showed that it was relatively insensitive, while the coregonid bioenergetics model showed that the absolute growth rate predictions were highly sensitive to input parameters (i.e., 20% parameter perturbation led to order of magnitude differences in model estimates). Our

  5. Family Maltreatment, Substance Problems, and Suicidality: Prevention Surveillance and Ecological Risk/ Protective Factors Models

    DTIC Science & Technology

    2009-04-01

    Only fragments of this DTIC report abstract survive: the project tasks include testing all hypothesized risk/protective effects and developing and validating regression- and structural-equation-modeling-based models for the dependent measures; models using structural equation modeling have been developed, tested, and cross-validated, and the results are in the process of being written up for dissemination.

  6. Biological-Based Modeling of Low Dose Radiation Risks

    SciTech Connect

    Scott, Bobby R., Ph.D.

    2006-11-08

    The objective of this project was to refine a biological-based model (called NEOTRANS2) for low-dose, radiation-induced stochastic effects, taking into consideration newly available data, including data on bystander effects (deleterious and protective). The initial refinement led to our NEOTRANS3 model, which has undergone further refinement (e.g., to allow for differential DNA repair/apoptosis over different dose regions). The model has been successfully used to explain nonlinear dose-response curves for low-linear-energy-transfer (LET) radiation-induced mutations (in vivo) and neoplastic transformation (in vitro). Relative risk dose-response functions developed for neoplastic transformation have been adapted for application to cancer relative risk evaluation for irradiated humans. Our low-dose research, along with that conducted by others, collectively demonstrates the following regarding induced protection associated with exposure to low doses of low-LET radiation: such exposure (1) protects against cell killing by high-LET alpha particles; (2) protects against spontaneous chromosomal damage; (3) protects against spontaneous mutations and neoplastic transformations; (4) suppresses mutations induced by a large radiation dose even when the low dose is given after the large dose; (5) suppresses spontaneous and alpha-radiation-induced cancers; (6) suppresses metastasis of existing cancer; (7) extends tumor latent period; (8) protects against diseases other than cancer; and (9) extends life expectancy. These forms of radiation-induced protection are called adapted protection as they relate to induced adaptive response. Thus, low doses and dose rates of low-LET radiation generally protect rather than harm us. These findings invalidate the linear no-threshold (LNT) hypothesis, which is based on the premise that any amount of radiation is harmful irrespective of its type. The hypothesis also implicates a linear dose-response curve for cancer induction that has a positive slope and no

  7. Phase two of Site 300's ecological risk assessment: Model verification and risk management

    SciTech Connect

    Carlson, T.M.; Gregory, S.D.

    1995-12-31

    The authors completed the baseline ecological risk assessment (ERA) for Lawrence Livermore National Laboratory's Site 300 in 1993. Using data collection and modeling techniques adapted from the human health risk assessment (HRA), they evaluated the potential hazard of contaminants in environmental media to ecological receptors. They identified potential hazards to (1) aquatic invertebrates from heavy metal contaminants in surface water, (2) burrowing vertebrates from contaminants volatilizing from subsurface soil into burrow air, and (3) grazing deer and burrowing vertebrates from cadmium contamination in surface soil. They recently began collecting data to refine the estimates of potential hazard to these ecological receptors. Bioassay results from the surface water failed to verify a hazard to aquatic invertebrates. Soil vapor surveys of subsurface burrows did verify the presence of high concentrations of volatile organic compounds (VOCs). However, they have not yet verified a true impact on the burrowing populations. The authors also completed an extensive surface soil sampling program, which identified local hot spots of cadmium contamination. In addition, they have been collecting data on the land use patterns of the deer population. Their data indicate that deer do not typically use those areas with cadmium surface soil contamination. Information from this phase of the ERA, along with the results of the HRA, will direct the selection of remedial alternatives for the site. For the ecological receptors, remedial alternatives include developing a risk management program that ensures that (1) sensitive burrowing species (such as rare or endangered species) do not use areas of surface or subsurface contamination, and (2) deer populations do not use areas of surface soil contamination.

  8. Comparison of Risk Predicted by Multiple Norovirus Dose-Response Models and Implications for Quantitative Microbial Risk Assessment.

    PubMed

    Van Abel, Nicole; Schoen, Mary E; Kissel, John C; Meschke, J Scott

    2016-06-10

    The application of quantitative microbial risk assessments (QMRAs) to understand and mitigate risks associated with norovirus is increasingly common as there is a high frequency of outbreaks worldwide. A key component of QMRA is the dose-response analysis, which is the mathematical characterization of the association between dose and outcome. For Norovirus, multiple dose-response models are available that assume either a disaggregated or an aggregated intake dose. This work reviewed the dose-response models currently used in QMRA, and compared predicted risks from waterborne exposures (recreational and drinking) using all available dose-response models. The results found that the majority of published QMRAs of norovirus use the 1 F1 hypergeometric dose-response model with α = 0.04, β = 0.055. This dose-response model predicted relatively high risk estimates compared to other dose-response models for doses in the range of 1-1,000 genomic equivalent copies. The difference in predicted risk among dose-response models was largest for small doses, which has implications for drinking water QMRAs where the concentration of norovirus is low. Based on the review, a set of best practices was proposed to encourage the careful consideration and reporting of important assumptions in the selection and use of dose-response models in QMRA of norovirus. Finally, in the absence of one best norovirus dose-response model, multiple models should be used to provide a range of predicted outcomes for probability of infection.
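
    The exact beta-Poisson ("1F1 hypergeometric") model named above can be evaluated directly; the sketch below assumes scipy is available and uses the alpha = 0.04, beta = 0.055 parameterization cited in the abstract.

```python
from scipy.special import hyp1f1

def p_infection(dose_gec, alpha=0.04, beta=0.055):
    """Exact beta-Poisson ('1F1 hypergeometric') dose-response:
    P(infection) = 1 - 1F1(alpha, alpha + beta, -dose),
    with dose in genomic equivalent copies and a disaggregated intake assumed."""
    return 1.0 - hyp1f1(alpha, alpha + beta, -dose_gec)

for dose in (1, 10, 100, 1000):
    print(f"dose {dose:>5} GEC -> P(infection) = {p_infection(dose):.3f}")
```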

  9. Development of Relative Risk Model for Regional Groundwater Risk Assessment: A Case Study in the Lower Liaohe River Plain, China

    PubMed Central

    Li, Xianbo; Zuo, Rui; Teng, Yanguo; Wang, Jinsheng; Wang, Bin

    2015-01-01

    Increasing pressure on water supply worldwide, especially in arid areas, has resulted in groundwater overexploitation and contamination, and subsequent deterioration of the groundwater quality and threats to public health. Environmental risk assessment of regional groundwater is an important tool for groundwater protection. This study presents a new approach for assessing the environmental risk assessment of regional groundwater. It was carried out with a relative risk model (RRM) coupled with a series of indices, such as a groundwater vulnerability index, which includes receptor analysis, risk source analysis, risk exposure and hazard analysis, risk characterization, and management of groundwater. The risk map is a product of the probability of environmental contamination and impact. The reliability of the RRM was verified using Monte Carlo analysis. This approach was applied to the lower Liaohe River Plain (LLRP), northeastern China, which covers 23604 km2. A spatial analysis tool within GIS which was used to interpolate and manipulate the data to develop environmental risk maps of regional groundwater, divided the level of risk from high to low into five ranks (V, IV, III, II, I). The results indicate that areas of relative risk rank (RRR) V cover 2324 km2, covering 9.8% of the area; RRR IV covers 3986 km2, accounting for 16.9% of the area. It is a new and appropriate method for regional groundwater resource management and land use planning, and is a rapid and effective tool for improving strategic decision making to protect groundwater and reduce environmental risk. PMID:26020518

  10. Development of relative risk model for regional groundwater risk assessment: a case study in the lower Liaohe River Plain, China.

    PubMed

    Li, Xianbo; Zuo, Rui; Teng, Yanguo; Wang, Jinsheng; Wang, Bin

    2015-01-01

    Increasing pressure on water supply worldwide, especially in arid areas, has resulted in groundwater overexploitation and contamination, and subsequent deterioration of the groundwater quality and threats to public health. Environmental risk assessment of regional groundwater is an important tool for groundwater protection. This study presents a new approach for assessing the environmental risk assessment of regional groundwater. It was carried out with a relative risk model (RRM) coupled with a series of indices, such as a groundwater vulnerability index, which includes receptor analysis, risk source analysis, risk exposure and hazard analysis, risk characterization, and management of groundwater. The risk map is a product of the probability of environmental contamination and impact. The reliability of the RRM was verified using Monte Carlo analysis. This approach was applied to the lower Liaohe River Plain (LLRP), northeastern China, which covers 23604 km2. A spatial analysis tool within GIS which was used to interpolate and manipulate the data to develop environmental risk maps of regional groundwater, divided the level of risk from high to low into five ranks (V, IV, III, II, I). The results indicate that areas of relative risk rank (RRR) V cover 2324 km2, covering 9.8% of the area; RRR IV covers 3986 km2, accounting for 16.9% of the area. It is a new and appropriate method for regional groundwater resource management and land use planning, and is a rapid and effective tool for improving strategic decision making to protect groundwater and reduce environmental risk.

  11. Evaluation of the Absolute Regional Temperature Potential

    NASA Technical Reports Server (NTRS)

    Shindell, D. T.

    2012-01-01

    The Absolute Regional Temperature Potential (ARTP) is one of the few climate metrics that provides estimates of impacts at a sub-global scale. The ARTP presented here gives the time-dependent temperature response in four latitude bands (90-28°S, 28°S-28°N, 28-60°N and 60-90°N) as a function of emissions based on the forcing in those bands caused by the emissions. It is based on a large set of simulations performed with a single atmosphere-ocean climate model to derive regional forcing/response relationships. Here I evaluate the robustness of those relationships using the forcing/response portion of the ARTP to estimate regional temperature responses to the historic aerosol forcing in three independent climate models. These ARTP results are in good accord with the actual responses in those models. Nearly all ARTP estimates fall within ±20% of the actual responses, though there are some exceptions for 90-28°S and the Arctic, and in the latter the ARTP may vary with forcing agent. However, for the tropics and the Northern Hemisphere mid-latitudes in particular, the ±20% range appears to be roughly consistent with the 95% confidence interval. Land areas within these two bands respond 39-45% and 9-39% more than the latitude band as a whole. The ARTP, presented here in a slightly revised form, thus appears to provide a relatively robust estimate for the responses of large-scale latitude bands and land areas within those bands to inhomogeneous radiative forcing and thus potentially to emissions as well. Hence this metric could allow rapid evaluation of the effects of emissions policies at a finer scale than global metrics without requiring use of a full climate model.

  12. STakeholder-Objective Risk Model (STORM): Determining the aggregated risk of multiple contaminant hazards in groundwater well catchments

    NASA Astrophysics Data System (ADS)

    Enzenhoefer, R.; Binning, P. J.; Nowak, W.

    2015-09-01

    Risk is often defined as the product of probability, vulnerability and value. Drinking water supply from groundwater abstraction is often at risk due to multiple hazardous land use activities in the well catchment. Each hazard might or might not introduce contaminants into the subsurface at any point in time, which then affects the pumped water quality upon transport through the aquifer. In such situations, estimating the overall risk is not trivial, and three key questions emerge: (1) How to aggregate the impacts from different contaminants and spill locations to an overall, cumulative impact on the value at risk? (2) How to properly account for the stochastic nature of spill events when converting the aggregated impact to a risk estimate? (3) How will the overall risk and subsequent decision making depend on stakeholder objectives, where stakeholder objectives refer to the values at risk, risk attitudes and risk metrics that can vary between stakeholders? In this study, we provide a STakeholder-Objective Risk Model (STORM) for assessing the total aggregated risk. Our concept is a quantitative, probabilistic and modular framework for simulation-based risk estimation. It rests on the source-pathway-receptor concept, mass-discharge-based aggregation of stochastically occurring spill events, accounts for uncertainties in the involved flow and transport models through Monte Carlo simulation, and can address different stakeholder objectives. We illustrate the application of STORM in a numerical test case inspired by a German drinking water catchment. As one may expect, the results depend strongly on the chosen stakeholder objectives, but they are equally sensitive to different approaches for risk aggregation across different hazards, contaminant types, and over time.

  13. A New Gimmick for Assigning Absolute Configuration.

    ERIC Educational Resources Information Center

    Ayorinde, F. O.

    1983-01-01

    A five-step procedure is provided to help students make the assignment of absolute configuration less bothersome. Examples for both single (2-butanol) and multi-chiral carbon (3-chloro-2-butanol) molecules are included. (JN)

  14. Magnifying absolute instruments for optically homogeneous regions

    SciTech Connect

    Tyc, Tomas

    2011-09-15

    We propose a class of magnifying absolute optical instruments with a positive isotropic refractive index. They create magnified stigmatic images, either virtual or real, of optically homogeneous three-dimensional spatial regions within geometrical optics.

  15. The Simplicity Argument and Absolute Morality

    ERIC Educational Resources Information Center

    Mijuskovic, Ben

    1975-01-01

    In this paper the author has maintained that there is a similarity of thought to be found in the writings of Cudworth, Emerson, and Husserl in his investigation of an absolute system of morality. (Author/RK)

  16. Modelling surface water flood risk using coupled numerical and physical modelling techniques

    NASA Astrophysics Data System (ADS)

    Green, D. L.; Pattison, I.; Yu, D.

    2015-12-01

    Surface water (pluvial) flooding occurs due to intense precipitation events where rainfall cannot infiltrate into the sub-surface or drain via storm water systems. The perceived risk appears to have increased in recent years with pluvial flood events seeming more severe and frequent within the UK. Surface water flood risk currently accounts for one third of all UK flood risk, with approximately two million people living in urban areas being at risk of a 1 in 200 year flood event. Surface water flooding research often focuses upon using 1D, 2D or 1D-2D coupled numerical modelling techniques to understand the extent, depth and severity of actual or hypothetical flood scenarios. Although much research has been conducted using numerical modelling, field data available for model calibration and validation is limited due to the complexities associated with data collection in surface water flood conditions. Ultimately, the data which numerical models are based upon is often erroneous and inconclusive. Physical models offer an alternative and innovative environment to collect data within. A controlled, closed system allows independent variables to be altered individually to investigate cause and effect relationships. Despite this, physical modelling approaches are seldom used in surface water flooding research. Scaled laboratory experiments using a 9m2, two-tiered physical model consisting of: (i) a mist nozzle type rainfall simulator able to simulate a range of rainfall intensities similar to those observed within the United Kingdom, and; (ii) a fully interchangeable, scaled plot surface have been conducted to investigate and quantify the influence of factors such as slope, impermeability, building density/configuration and storm dynamics on overland flow and rainfall-runoff patterns within a range of terrestrial surface conditions. Results obtained within the physical modelling environment will be compared with numerical modelling results using FloodMap (Yu & Lane, 2006

  17. Dose-volume modeling of the risk of postoperative pulmonary complications among esophageal cancer patients treated with concurrent chemoradiotherapy followed by surgery

    SciTech Connect

    Tucker, Susan L. E-mail: sltucker@mdanderson.org; Liu, H. Helen; Wang, Shulian; Wei Xiong; Liao Zhongxing; Komaki, Ritsuko; Cox, James D.; Mohan, Radhe

    2006-11-01

    Purpose: The aim of this study was to investigate the effect of radiation dose distribution in the lung on the risk of postoperative pulmonary complications among esophageal cancer patients. Methods and Materials: We analyzed data from 110 patients with esophageal cancer treated with concurrent chemoradiotherapy followed by surgery at our institution from 1998 to 2003. The endpoint for analysis was postsurgical pneumonia or acute respiratory distress syndrome. Dose-volume histograms (DVHs) and dose-mass histograms (DMHs) for the whole lung were used to fit normal-tissue complication probability (NTCP) models, and the quality of the fits was compared using bootstrap analysis. Results: Normal-tissue complication probability modeling identified that the risk of postoperative pulmonary complications was most significantly associated with small absolute volumes of lung spared from doses ≥5 Gy (VS5), that is, exposed to doses <5 Gy. However, bootstrap analysis found no significant difference between the quality of this model and fits based on other dosimetric parameters, including mean lung dose, effective dose, and relative volume of lung receiving ≥5 Gy, probably because of correlations among these factors. The choice of DVH vs. DMH or the use of fractionation correction did not significantly affect the results of the NTCP modeling. The parameter values estimated for the Lyman NTCP model were as follows (with 95% confidence intervals in parentheses): n = 1.85 (0.04, ∞), m = 0.55 (0.22, 1.02), and D_5 = 17.5 Gy (9.4 Gy, 102 Gy). Conclusions: In this cohort of esophageal cancer patients, several dosimetric parameters including mean lung dose, effective dose, and absolute volume of lung receiving <5 Gy provided similar descriptions of the risk of postoperative pulmonary complications as a function of radiation dose distribution in the lung.
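
    For orientation, the sketch below evaluates a generic Lyman-Kutcher-Burman NTCP curve from a coarse dose-volume histogram using the n and m values quoted above; the TD50 and the DVH are placeholders rather than the fitted values reported in the abstract.

```python
import math

def lkb_ntcp(dvh_dose_gy, dvh_volume_frac, n=1.85, m=0.55, td50_gy=20.0):
    """Lyman-Kutcher-Burman NTCP from a (dose, fractional volume) DVH.

    D_eff = (sum_i v_i * D_i**(1/n))**n collapses the DVH to one effective dose;
    NTCP  = Phi((D_eff - TD50) / (m * TD50)), Phi = standard normal CDF.
    n and m follow the abstract; td50_gy and the DVH below are placeholders.
    """
    d_eff = sum(v * d ** (1.0 / n) for d, v in zip(dvh_dose_gy, dvh_volume_frac)) ** n
    t = (d_eff - td50_gy) / (m * td50_gy)
    return 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))

# Hypothetical coarse whole-lung DVH: 60% of volume at 2 Gy, 30% at 10 Gy, 10% at 40 Gy.
print(f"NTCP = {lkb_ntcp([2.0, 10.0, 40.0], [0.6, 0.3, 0.1]):.2f}")
```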

  18. Absolute cross sections of compound nucleus reactions

    NASA Astrophysics Data System (ADS)

    Capurro, O. A.

    1993-11-01

    The program SEEF is a Fortran IV computer code for the extraction of absolute cross sections of compound nucleus reactions. When the evaporation residue is fed by its parents, only cumulative cross sections will be obtained from off-line gamma ray measurements. But, if one has the parent excitation function (experimental or calculated), this code will make it possible to determine absolute cross sections of any exit channel.

  19. Kelvin and the absolute temperature scale

    NASA Astrophysics Data System (ADS)

    Erlichson, Herman

    2001-07-01

    This paper describes the absolute temperature scale of Kelvin (William Thomson). Kelvin found that Carnot's axiom about heat being a conserved quantity had to be abandoned. Nevertheless, he found that Carnot's fundamental work on heat engines was correct. Using the concept of a Carnot engine Kelvin found that Q1/Q2 = T1/T2. Thermometers are not used to obtain absolute temperatures since they are calculated temperatures.
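
    A short worked example of the relation Q1/Q2 = T1/T2 for a reversible engine, with illustrative reservoir temperatures:

```python
# Kelvin's relation for a reversible (Carnot) engine: Q1/Q2 = T1/T2, so absolute
# temperatures can be defined from heat ratios alone. Illustrative numbers only.
T_hot, T_cold = 500.0, 300.0      # reservoir temperatures (K)
Q_hot = 1000.0                    # heat absorbed from the hot reservoir (J)
Q_cold = Q_hot * T_cold / T_hot   # heat rejected, from Q1/Q2 = T1/T2
work = Q_hot - Q_cold
print(f"Q_cold = {Q_cold:.0f} J, W = {work:.0f} J, efficiency = {work / Q_hot:.0%}")
```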

  20. Using toxicokinetic-toxicodynamic modeling as an acute risk assessment refinement approach in vertebrate ecological risk assessment.

    PubMed

    Ducrot, Virginie; Ashauer, Roman; Bednarska, Agnieszka J; Hinarejos, Silvia; Thorbek, Pernille; Weyman, Gabriel

    2016-01-01

    Recent guidance identified toxicokinetic-toxicodynamic (TK-TD) modeling as a relevant approach for risk assessment refinement. Yet, its added value compared to other refinement options is not detailed, and how to conduct the modeling appropriately is not explained. This case study addresses these issues through 2 examples of individual-level risk assessment for 2 hypothetical plant protection products: 1) evaluating the risk for small granivorous birds and small omnivorous mammals of a single application, as a seed treatment in winter cereals, and 2) evaluating the risk for fish after a pulsed treatment in the edge-of-field zone. Using acute test data, we conducted the first tier risk assessment as defined in the European Food Safety Authority (EFSA) guidance. When first tier risk assessment highlighted a concern, refinement options were discussed. Cases where the use of models should be preferred over other existing refinement approaches were highlighted. We then practically conducted the risk assessment refinement by using 2 different models as examples. In example 1, a TK model accounting for toxicokinetics and relevant feeding patterns in the skylark and in the wood mouse was used to predict internal doses of the hypothetical active ingredient in individuals, based on relevant feeding patterns in an in-crop situation, and identify the residue levels leading to mortality. In example 2, a TK-TD model accounting for toxicokinetics, toxicodynamics, and relevant exposure patterns in the fathead minnow was used to predict the time-course of fish survival for relevant FOCUS SW exposure scenarios and identify which scenarios might lead to mortality. Models were calibrated using available standard data and implemented to simulate the time-course of internal dose of active ingredient or survival for different exposure scenarios. Simulation results were discussed and used to derive the risk assessment refinement endpoints used for decision. Finally, we compared the
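
    As a minimal illustration of the TK building block discussed above, the sketch below integrates a one-compartment uptake-elimination model under a pulsed external exposure; the rate constants and pulse pattern are hypothetical, and the toxicodynamic (survival) step is omitted.

```python
import numpy as np

# One-compartment toxicokinetics under a pulsed exposure; a minimal sketch of the
# TK building block only. Rate constants and the pulse pattern are hypothetical.
k_in, k_out = 0.8, 0.3                                   # uptake and elimination rates
dt, t_end = 0.01, 20.0                                   # days
t = np.arange(0.0, t_end, dt)
c_water = np.where((t >= 2.0) & (t < 4.0), 5.0, 0.0)     # 2-day pulse at 5 ug/L

c_int = np.zeros_like(t)                                 # internal concentration
for i in range(1, t.size):
    dc = k_in * c_water[i - 1] - k_out * c_int[i - 1]
    c_int[i] = c_int[i - 1] + dt * dc

print(f"peak internal dose {c_int.max():.2f} (arb. units) at t = {t[c_int.argmax()]:.2f} d")
```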

  1. Application of wildfire simulation models for risk analysis

    NASA Astrophysics Data System (ADS)

    Ager, A.; Finney, M.

    2009-04-01

    Wildfire simulation models are being widely used by fire and fuels specialists in the U.S. to support tactical and strategic decisions related to the mitigation of wildfire risk. Much of this application has resulted from the development of a minimum travel time (MTT) fire spread algorithm (M. Finney) that makes it computationally feasible to simulate thousands of fires and generate burn probability and intensity maps over large areas (10,000 - 2,000,000 ha). The MTT algorithm is parallelized for multi-threaded processing and is imbedded in a number of research and applied fire modeling applications. High performance computers (e.g., 32-way 64 bit SMP) are typically used for MTT simulations, although the algorithm is also implemented in the 32 bit desktop FlamMap3 program (www.fire.org). Extensive testing has shown that this algorithm can replicate large fire boundaries in the heterogeneous landscapes that typify much of the wildlands in the western U.S. In this paper, we describe the application of the MTT algorithm to understand spatial patterns of burn probability (BP), and to analyze wildfire risk to key human and ecological values. The work is focused on a federally-managed 2,000,000 ha landscape in the central interior region of Oregon State, USA. The fire-prone study area encompasses a wide array of topography and fuel types and a number of highly valued resources that are susceptible to fire. We quantitatively defined risk as the product of the probability of a fire and the resulting consequence. Burn probabilities at specific intensity classes were estimated for each 100 x 100 m pixel by simulating 100,000 wildfires under burn conditions that replicated recent severe wildfire events that occurred under conditions where fire suppression was generally ineffective (97th percentile, August weather). We repeated the simulation under milder weather (70th percentile, August weather) to replicate a "wildland fire use scenario" where suppression is minimized to
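
    The risk definition used above (probability of a fire times the resulting consequence) reduces, per pixel, to an expected-value sum over intensity classes, as in the following sketch with hypothetical burn probabilities and response values.

```python
# Per-pixel expected net value change: sum over flame-length classes of
# P(burn in class i) * consequence(i). Probabilities and response values are
# hypothetical placeholders, not output of the MTT simulations described above.
burn_prob = {"low": 0.008, "moderate": 0.004, "high": 0.002}     # per season
consequence = {"low": -5.0, "moderate": -40.0, "high": -90.0}    # % change in value
expected_change = sum(burn_prob[c] * consequence[c] for c in burn_prob)
print(f"expected net value change: {expected_change:.3f}% per season")
```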

  2. Elevation correction factor for absolute pressure measurements

    NASA Technical Reports Server (NTRS)

    Panek, Joseph W.; Sorrells, Mark R.

    1996-01-01

    With the arrival of highly accurate multi-port pressure measurement systems, conditions that previously did not affect overall system accuracy must now be scrutinized closely. Errors caused by elevation differences between pressure sensing elements and model pressure taps can be quantified and corrected. With multi-port pressure measurement systems, the sensing elements are connected to pressure taps that may be many feet away. The measurement system may be at a different elevation than the pressure taps due to laboratory space or test article constraints. This difference produces a pressure gradient that is inversely proportional to height within the interface tube. The pressure at the bottom of the tube will be higher than the pressure at the top due to the weight of the tube's column of air. Tubes with higher pressures will exhibit larger absolute errors due to the higher air density. The above effect is well documented but has generally been taken into account with large elevations only. With error analysis techniques, the loss in accuracy from elevation can be easily quantified. Correction factors can be applied to maintain the high accuracies of new pressure measurement systems.
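
    The correction described above follows directly from the hydrostatic relation delta_p = rho * g * delta_h. The sketch below illustrates the arithmetic under stated assumptions (ideal-gas air density; the tube pressure, temperature, and elevation offset are hypothetical values, not figures from the paper).

        # Minimal sketch of the hydrostatic elevation correction described above:
        # delta_p = rho * g * delta_h, with air density from the ideal gas law.
        # The tube pressure, temperature and elevation offset are hypothetical values.

        G = 9.80665              # m/s^2
        R_AIR = 287.05           # specific gas constant of dry air, J/(kg*K)

        def air_density(pressure_pa, temperature_k):
            """Ideal-gas density of the air column inside the interface tube."""
            return pressure_pa / (R_AIR * temperature_k)

        def elevation_correction(pressure_pa, temperature_k, delta_h_m):
            """Pressure offset (Pa) between sensing element and pressure tap.
            Positive delta_h means the tap sits above the sensing element."""
            return air_density(pressure_pa, temperature_k) * G * delta_h_m

        # Example: 101325 Pa absolute, 293 K, tap 3 m above the transducer.
        print(f"correction = {elevation_correction(101325.0, 293.15, 3.0):.1f} Pa")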

  3. Low-probability flood risk modeling for New York City.

    PubMed

    Aerts, Jeroen C J H; Lin, Ning; Botzen, Wouter; Emanuel, Kerry; de Moel, Hans

    2013-05-01

    The devastating impact of Hurricane Sandy (2012) again showed that New York City (NYC) is one of the most vulnerable cities to coastal flooding around the globe. The low-lying areas in NYC can be flooded by nor'easter storms and North Atlantic hurricanes. The few studies that have estimated potential flood damage for NYC base their damage estimates on only a single, or a few, possible flood events. The objective of this study is to assess the full distribution of hurricane flood risk in NYC. This is done by calculating potential flood damage with a flood damage model that uses many possible storms and surge heights as input. These storms are representative of the low-probability/high-impact flood hazard faced by the city. Exceedance probability-loss curves are constructed under different assumptions about the severity of flood damage. The estimated flood damage to buildings for NYC is between US$59 and 129 million per year. The damage caused by a 1/100-year storm surge is within a range of US$2 bn-5 bn, while this is between US$5 bn and 11 bn for a 1/500-year storm surge. An analysis of flood risk in each of the five boroughs of NYC finds that Brooklyn and Queens are the most vulnerable to flooding. This study examines several uncertainties in the various steps of the risk analysis, which result in variations in the flood damage estimates. These uncertainties include: the interpolation of flood depths; the use of different flood damage curves; and the influence of the spectra of characteristics of the simulated hurricanes.
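
    The exceedance probability-loss curves described above can be summarized as an expected annual damage by integrating loss over annual exceedance probability. The sketch below shows that step; the (probability, loss) points are hypothetical illustration values, loosely anchored on the 1/100- and 1/500-year figures quoted in the abstract.

        # Minimal sketch: expected annual damage as the area under an exceedance
        # probability-loss curve (trapezoidal integration). The (probability, loss)
        # pairs are hypothetical illustration values.

        import numpy as np

        prob = np.array([0.1, 0.02, 0.01, 0.002, 0.001])   # annual exceedance probability
        loss = np.array([0.1, 1.0, 3.5, 8.0, 12.0])        # loss in billion US$

        order = np.argsort(prob)                  # integrate from rare to frequent
        p, l = prob[order], loss[order]
        ead = np.sum(0.5 * (l[1:] + l[:-1]) * np.diff(p))
        print(f"expected annual damage ~ {ead * 1000:.0f} million US$")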

  4. Assessing the risk of Legionnaires' disease: the inhalation exposure model and the estimated risk in residential bathrooms.

    PubMed

    Azuma, Kenichi; Uchiyama, Iwao; Okumura, Jiro

    2013-02-01

    Legionella are widely found in the built environment. The number of patients with Legionnaires' disease has been increasing in Japan; however, health risks from Legionella bacteria in the environment are not appropriately assessed. We performed a quantitative health risk assessment modeled on residential bathrooms in the Adachi outbreak area and estimated risk levels. The estimated risks in the Adachi outbreak approximately corresponded to the risk levels exponentially extrapolated into lower levels on the basis of infection and mortality rates calculated from actual outbreaks, suggesting that the model of Legionnaires' disease in residential bathrooms was adequate to predict disease risk for the evaluated outbreaks. Based on this model, the infection and mortality risk levels per year at the Japanese water quality guideline value of 10 CFU/100 ml (100 CFU/L) were approximately 10^-2 and 10^-5, respectively. However, acceptable risk levels of infection and mortality from Legionnaires' disease should be adjusted to approximately 10^-4 and 10^-7, respectively, per year. Therefore, a reference value of 0.1 CFU/100 ml (1 CFU/L) as a water quality guideline for Legionella bacteria is recommended. This value is occasionally less than the actual detection limit. Legionella levels in water systems should be maintained as low as reasonably achievable (<1 CFU/L).
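
    The exponential extrapolation of risk to lower exposure levels can be illustrated with a generic exponential dose-response form, P = 1 - exp(-r * d). In the sketch below the dose-response parameter r, the inhaled-dose conversion, and the number of exposure events per year are all hypothetical; the snippet only shows how a per-event risk scales to an annual figure, not the authors' calibrated model.

        # Minimal sketch of an exponential dose-response extrapolation of the kind
        # described above. The r parameter, dose conversion and exposure frequency
        # are hypothetical; this is not the authors' calibrated model.

        import math

        def infection_risk_per_event(dose_cfu, r=1e-6):
            """Exponential dose-response: P = 1 - exp(-r * dose)."""
            return 1.0 - math.exp(-r * dose_cfu)

        def annual_risk(per_event_risk, events_per_year=365):
            """Risk of at least one infection over repeated independent exposures."""
            return 1.0 - (1.0 - per_event_risk) ** events_per_year

        for conc_cfu_per_l in (100.0, 10.0, 1.0):     # guideline vs proposed values
            dose = conc_cfu_per_l * 0.05              # hypothetical inhaled-equivalent dose
            p_year = annual_risk(infection_risk_per_event(dose))
            print(f"{conc_cfu_per_l:6.1f} CFU/L -> annual infection risk ~ {p_year:.1e}")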

  5. Gender attitudes, sexual power, HIV risk: a model for understanding HIV risk behavior of South African men.

    PubMed

    Kaufman, Michelle R; Shefer, Tamara; Crawford, Mary; Simbayi, Leickness C; Kalichman, Seth C

    2008-04-01

    The Gender Attitudes-Power-Risk (GAPR) model of HIV risk behavior was tested using survey data collected from among 309 men who were attending STI services in a primary health care clinic in Cape Town, South Africa. Results showed that negative attitudes towards women were significantly positively associated with a high level of HIV risk behavior, and that endorsement of traditional male roles was negatively associated with HIV risk behavior. Endorsement of traditional male gender roles was also inversely related to relationship control but positively to a high degree of decision-making dominance in one's relationship. Sexual relationship power did not significantly mediate the relationships between gender attitudes and HIV risk behavior. A better understanding of gender roles and ideologies in combination with one's power in sexual relationships as they relate to HIV risk behavior among men could better inform future HIV prevention interventions.

  6. Recent Enhancements to the Genetic Risk Prediction Model BRCAPRO

    PubMed Central

    Mazzola, Emanuele; Blackford, Amanda; Parmigiani, Giovanni; Biswas, Swati

    2015-01-01

    BRCAPRO is a widely used model for genetic risk prediction of breast cancer. It is a function within the R package BayesMendel and is used to calculate the probabilities of being a carrier of a deleterious mutation in one or both of the BRCA genes, as well as the probability of being affected with breast and ovarian cancer within a defined time window. Both predictions are based on information contained in the counselee’s family history of cancer. During the last decade, BRCAPRO has undergone several rounds of successive refinements: the current version is part of release 2.1 of BayesMendel. In this review, we showcase some of the most notable features of the software resulting from these recent changes. We provide examples highlighting each feature, using artificial pedigrees motivated by complex clinical examples. We illustrate how BRCAPRO is a comprehensive software for genetic risk prediction with many useful features that allow users the flexibility to incorporate varying amounts of available information. PMID:25983549

  7. Ecological Risk Model of Childhood Obesity in Chinese Immigrant Children

    PubMed Central

    Zhou, Nan; Cheah, Charissa S. L.

    2015-01-01

    Chinese Americans are the largest and fastest growing Asian American subgroup, having increased by about one-third during the 2000s. Despite the slender Asian stereotype, nearly one-third of 6- to 11-year-old Chinese American children were found to be overweight (above the 85th percentile in BMI). Importantly, unique and severe health risks are associated with being overweight/obese in Chinese populations. Unfortunately, Chinese immigrant children have been neglected in the literature on obesity. This review aimed to identify factors at various levels of the ecological model that may place Chinese immigrant children at risk for being overweight/obese in the U.S. Key contextual factors at the micro-, meso-, exo-, macro- and chronosystem levels were identified, guided by Bronfenbrenner’s ecological systems theory. The corresponding mediating and moderating processes among the factors were also reviewed and proposed. By presenting a conceptual framework and relevant research, this review can provide a basic framework for directing future interdisciplinary research in seeking solutions to childhood obesity within this understudied population.

  8. Method of Breast Reconstruction Determines Venous Thromboembolism Risk Better Than Current Prediction Models

    PubMed Central

    Patel, Niyant V.; Wagner, Douglas S.

    2015-01-01

    Background: Venous thromboembolism (VTE) risk models including the Davison risk score and the 2005 Caprini risk assessment model have been validated in plastic surgery patients. However, their utility and predictive value in breast reconstruction has not been well described. We sought to determine the utility of current VTE risk models in this population and the VTE rate observed in various methods of breast reconstruction. Methods: A retrospective review of breast reconstructions by a single surgeon was performed. One hundred consecutive transverse rectus abdominis myocutaneous (TRAM) patients, 100 consecutive implant patients, and 100 consecutive latissimus dorsi patients were identified over a 10-year period. Patient demographics and presence of symptomatic VTE were collected. 2005 Caprini risk scores and Davison risk scores were calculated for each patient. Results: The TRAM reconstruction group was found to have a higher VTE rate (6%) than the implant (0%) and latissimus (0%) reconstruction groups (P < 0.01). Mean Davison risk scores and 2005 Caprini scores were similar across all reconstruction groups (P > 0.1). The vast majority of patients were stratified as high risk (87.3%) by the VTE risk models. However, only TRAM reconstruction patients demonstrated significant VTE risk. Conclusions: TRAM reconstruction appears to have a significantly higher risk of VTE than both implant and latissimus reconstruction. Current risk models do not effectively stratify breast reconstruction patients at risk for VTE. The method of breast reconstruction appears to have a significant role in patients’ VTE risk. PMID:26090287

  9. Simple Model of Mating Preference and Extinction Risk

    NASA Astrophysics Data System (ADS)

    Pękalski, Andrzej

    We present a simple model of a population of individuals characterized by their genetic structure in the form of a double string of bits and the phenotype following from it. The population lives in an unchanging habitat that prefers a certain type of phenotype (optimum). Individuals are unisex; however, a pair is necessary for breeding. An individual rejects a mate if the latter's phenotype contains too many bad genes (i.e., genes differing from the optimum) in the same places as the individual's own. We show that such a strategy, analogous to disassortative mating based on the major histocompatibility complex and avoiding inbreeding and incest, could be beneficial for the population and could considerably reduce the extinction risk, especially in small populations.

  10. Internal modelling under Risk-Based Capital (RBC) framework

    NASA Astrophysics Data System (ADS)

    Ling, Ang Siew; Hin, Pooi Ah

    2015-12-01

    Very often the methods for internal modelling under the Risk-Based Capital framework make use of data in the form of a run-off triangle. The present research will instead extract, from a group of n customers, the historical data for the sum insured s_i of the i-th customer together with the amount paid y_{ij} and the amount a_{ij} reported but not yet paid in the j-th development year, for j = 1, 2, 3, 4, 5, 6. We model the future value (y_{i,j+1}, a_{i,j+1}) as dependent on the present year's value (y_{ij}, a_{ij}) and the sum insured s_i via a conditional distribution derived from a multivariate power-normal mixture distribution. For a group of given customers with different original purchase dates, the distribution of the aggregate claims liabilities may be obtained from the proposed model. The prediction interval based on this distribution is found to cover the observed aggregate claim liabilities well.

  11. Eutrophication risk assessment in coastal embayments using simple statistical models.

    PubMed

    Arhonditsis, G; Eleftheriadou, M; Karydis, M; Tsirtsis, G

    2003-09-01

    A statistical methodology is proposed for assessing the risk of eutrophication in marine coastal embayments. The procedure followed was the development of regression models relating the levels of chlorophyll a (Chl) with the concentration of the limiting nutrient--usually nitrogen--and the renewal rate of the systems. The method was applied in the Gulf of Gera, Island of Lesvos, Aegean Sea and a surrogate for renewal rate was created using the Canberra metric as a measure of the resemblance between the Gulf and the oligotrophic waters of the open sea in terms of their physical, chemical and biological properties. The Chl-total dissolved nitrogen-renewal rate regression model was the most significant, accounting for 60% of the variation observed in Chl. Predicted distributions of Chl for various combinations of the independent variables, based on Bayesian analysis of the models, enabled comparison of the outcomes of specific scenarios of interest as well as further analysis of the system dynamics. The present statistical approach can be used as a methodological tool for testing the resilience of coastal ecosystems under alternative managerial schemes and levels of exogenous nutrient loading.
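
    A regression of the general form used here (chlorophyll a against the limiting nutrient and a renewal-rate surrogate) can be fitted by ordinary least squares. The sketch below does this on synthetic data; the variable ranges, coefficients, and log transform are assumptions for illustration, not the authors' fitted model.

        # Minimal sketch of a Chl ~ nutrient + renewal-rate regression of the type
        # described above, fitted by ordinary least squares on synthetic data.
        # Variable ranges, coefficients and the log transform are assumptions.

        import numpy as np

        rng = np.random.default_rng(0)
        n = 50
        tdn = rng.uniform(0.1, 1.5, n)            # total dissolved nitrogen (mg/L)
        renewal = rng.uniform(0.05, 0.5, n)       # renewal-rate surrogate
        log_chl = 0.8 + 1.2 * tdn - 2.0 * renewal + rng.normal(0, 0.3, n)

        X = np.column_stack([np.ones(n), tdn, renewal])
        coef, *_ = np.linalg.lstsq(X, log_chl, rcond=None)
        pred = X @ coef
        r2 = 1 - np.sum((log_chl - pred) ** 2) / np.sum((log_chl - log_chl.mean()) ** 2)
        print("intercept, TDN and renewal coefficients:", np.round(coef, 2))
        print(f"R^2 = {r2:.2f}")

        # Scenario comparison: predicted log(Chl) under doubled mean nutrient loading.
        scenario = np.array([1.0, 2 * tdn.mean(), renewal.mean()])
        print(f"predicted log(Chl) under doubled mean TDN: {scenario @ coef:.2f}")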

  12. Linking GIS and storm water modeling for emergency risk assessment

    SciTech Connect

    Newkirk, R.T.

    1995-12-31

    Many emergencies involve the deposition of chemical contaminants on land, either as a direct event or as a secondary byproduct. GIS can be useful in estimating the initial deposition area. Chemical product attribute databases can be accessed to determine the degree to which the contaminants might be transported in a water medium. An important issue is to estimate the potential impact of the deposition on surface and subsurface water flows. This is particularly important since millions of people rely on subsurface ground water as their main source of potable water. Thus, a modeling system is needed by planners and emergency managers to assess the potential for short- and long-term risks to communities due to storm water transport of deposited contaminants. GIS itself cannot provide the complete analysis. A prototype system to assist in estimating the flows of contaminants related to an emergency has been developed by linking an Arc/Info database, a digital terrain model, and SWMM, the storm water management model. This system also has important planning applications in assessing alternative land development plans for their impact on ground water recharge and management of storm water.

  13. The Terrestrial Investigation Model: A probabilistic risk assessment model for birds exposed to pesticides

    EPA Science Inventory

    One of the major recommendations of the National Academy of Science to the USEPA, NMFS and USFWS was to utilize probabilistic methods when assessing the risks of pesticides to federally listed endangered and threatened species. The Terrestrial Investigation Model (TIM, version 3....

  14. A "CLIPER" Risk Model for Insured Losses From US Hurricane Landfalls and the Need for an Open-Source Risk Model

    NASA Astrophysics Data System (ADS)

    Murnane, R. J.

    2003-12-01

    To plan for the consequences of hurricanes, earthquakes, and other natural hazards the public and private sectors use a variety of risk models. Model output is tailored for specific users and includes a range of parameters including: damage to structures, insured losses, and estimates of shelter requirements to care for people displaced by the catastrophe. Extensive efforts are made to tune risk models to past events. However, model "forecasts" of losses are rarely verified through a comparison with new events. Instead, new events generally are used to further tune a new version of the model. In addition, there has been no public attempt to determine which model has the most predictive skill, in part because there is no agreed upon reference forecast, and in part because most risk models are proprietary. Here I describe a simple risk model that can be used to provide deterministic and probabilistic exceedance probabilities for insured losses caused by hurricanes striking the US coastline. I propose that loss estimates based on the approach used in this simple model can be used as a reference forecast for assessing the skill of more complex commercial models. I also suggest that an effort be initiated to promote the development of an open-source risk model. The simple risk model combines wind speed exceedance probabilities estimated using the historical record of maximum sustained winds for hurricanes at landfall, and a set of normalized insured losses produced by landfalling hurricanes. The approach is analogous to weather, or climate, forecasts based on a combination of CLImatology and PERsistence (CLIPER). The climatological component accounts for low frequency variability in weather due to factors such as seasonality. The analog to climatology in the simple risk model is the historical record of hurricane wind speeds and insured losses. The insured losses have been corrected for the effects of inflation, population increases, and wealth, and other factors. The

  15. An absolute scale for measuring the utility of money

    NASA Astrophysics Data System (ADS)

    Thomas, P. J.

    2010-07-01

    Measurement of the utility of money is essential in the insurance industry, for prioritising public spending schemes and for the evaluation of decisions on protection systems in high-hazard industries. Up to this time, however, there has been no universally agreed measure for the utility of money, with many utility functions being in common use. In this paper, we shall derive a single family of utility functions, which have risk-aversion as the only free parameter. The fact that they return a utility of zero at their low, reference datum, either the utility of no money or of one unit of money, irrespective of the value of risk-aversion used, qualifies them to be regarded as absolute scales for the utility of money. Evidence of validation for the concept will be offered based on inferential measurements of risk-aversion, using diverse measurement data.
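
    As an illustration of a single-parameter utility family of the kind described (utility zero at the reference datum of one unit of money, risk-aversion as the only free parameter), the sketch below evaluates the isoelastic (Atkinson) form; whether this is exactly the family derived in the paper is not stated in the abstract.

        # Hedged illustration: the isoelastic (Atkinson) utility family has the
        # properties described above (u(1) = 0, risk-aversion epsilon as the only
        # free parameter); whether it is the family derived in the paper is not stated.

        import math

        def utility(m, epsilon):
            """u(m) = (m^(1-eps) - 1)/(1-eps) for eps != 1, ln(m) for eps = 1."""
            if m <= 0:
                raise ValueError("money amount must be positive")
            if abs(epsilon - 1.0) < 1e-12:
                return math.log(m)
            return (m ** (1.0 - epsilon) - 1.0) / (1.0 - epsilon)

        for eps in (0.0, 0.5, 1.0, 2.0):          # increasing risk-aversion
            print(f"eps={eps:3.1f}: u(2) = {utility(2.0, eps):+.3f}, "
                  f"u(0.5) = {utility(0.5, eps):+.3f}")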

  16. Risk evaluation of uranium mining: A geochemical inverse modelling approach

    NASA Astrophysics Data System (ADS)

    Rillard, J.; Zuddas, P.; Scislewski, A.

    2011-12-01

    It is well known that uranium extraction operations can increase risks linked to radiation exposure. The toxicity of uranium and associated heavy metals is the main environmental concern regarding exploitation and processing of U-ore. In areas where U mining is planned, a careful assessment of toxic and radioactive element concentrations is recommended before the start of mining activities. A background evaluation of harmful elements is important in order to prevent and/or quantify future water contamination resulting from possible migration of toxic metals coming from ore and waste water interaction. Controlled leaching experiments were carried out to investigate processes of ore and waste (leached ore) degradation, using samples from the uranium exploitation site located in Caetité-Bahia, Brazil. In experiments in which the reaction of waste with water was tested, we found that the water had low pH and high levels of sulphates and aluminium. On the other hand, in experiments in which ore was tested, the water had a chemical composition comparable to natural water found in the region of Caetité. On the basis of our experiments, we suggest that waste resulting from sulphuric acid treatment can induce acidification and salinization of surface and ground water. For this reason proper storage of waste is imperative. As a tool to evaluate the risks, a geochemical inverse modelling approach was developed to estimate the water-mineral interaction involving the presence of toxic elements. We used a method earlier described by Scislewski and Zuddas 2010 (Geochim. Cosmochim. Acta 74, 6996-7007) in which the reactive surface area of mineral dissolution can be estimated. We found that the reactive surface area of rock parent minerals is not constant during time but varies according to several orders of magnitude in only two months of interaction. We propose that parent mineral heterogeneity and particularly, neogenic phase formation may explain the observed variation of the

  17. Jasminum flexile flower absolute from India--a detailed comparison with three other jasmine absolutes.

    PubMed

    Braun, Norbert A; Kohlenberg, Birgit; Sim, Sherina; Meier, Manfred; Hammerschmidt, Franz-Josef

    2009-09-01

    Jasminum flexile flower absolute from the south of India and the corresponding vacuum headspace (VHS) sample of the absolute were analyzed using GC and GC-MS. Three other commercially available Indian jasmine absolutes from the species: J. sambac, J. officinale subsp. grandiflorum, and J. auriculatum and the respective VHS samples were used for comparison purposes. One hundred and twenty-one compounds were characterized in J. flexile flower absolute, with methyl linolate, benzyl salicylate, benzyl benzoate, (2E,6E)-farnesol, and benzyl acetate as the main constituents. A detailed olfactory evaluation was also performed.

  18. Development of a relative risk model for evaluating ecological risk of water environment in the Haihe River Basin estuary area.

    PubMed

    Chen, Qiuying; Liu, Jingling; Ho, Kin Chung; Yang, Zhifeng

    2012-03-15

    Ecological risk assessment of the water environment is important for basin-scale water resource management. Effective environmental management and restoration of systems such as the Haihe River Basin require a holistic understanding of the relative importance of the various stressor-related impacts throughout the basin. The relative risk model (RRM) has been applied successfully at the regional scale as a technical tool for evaluating ecological risk. In this study, risk transferred from the upstream basin was considered, and the RRM was extended by introducing a source-stressor-habitat exposure filter (SSH), an endpoint-habitat exposure filter (EH) and a stressor-endpoint effect filter (SE) to make the meaning of exposure and effect more explicit. The water environment, comprising water quality, water quantity and aquatic ecosystems, was selected as the set of assessment endpoints. We created a conceptual model depicting potential exposure and effect pathways from source to stressor to habitat to endpoint. The Haihe River Basin estuary (HRBE) was selected as the model case. The results showed two low-risk regions, one medium-risk region and two high-risk regions in the HRBE. They also indicated that urbanization was the largest source, followed by shipping and industry, with risk scores of 5.65, 4.71 and 3.68, respectively. Furthermore, habitat destruction was the largest stressor (risk score 2.66), followed by oxygen-consuming organic pollutants (1.75) and pathogens (1.75); these three stressors were the main drivers of ecological pressure in the study area. Among habitats, open waters (9.59) and intertidal mudflat endured the greatest pressure and deserve particular attention. Damaged ecological service values (30.54) and decreased biodiversity faced the greatest risk pressure.
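
    The relative-risk-model calculation itself reduces to combining ranks with exposure/effect filters and summing over pathways. The sketch below shows that mechanic for a simplified source-to-habitat pathway; all ranks and filter values are hypothetical, and the stressor and endpoint layers (and the SSH/EH/SE filters introduced in the study) are collapsed for brevity.

        # Minimal sketch of a relative-risk-model style calculation: risk score =
        # source rank x habitat rank x exposure filter, summed over plausible
        # pathways. All ranks and filter values are hypothetical, and the stressor
        # and endpoint layers are collapsed for brevity.

        sources = {"urbanization": 6, "shipping": 4, "industry": 4}
        habitats = {"open waters": 6, "intertidal mudflat": 4}

        # Exposure filter: 1 if the source can plausibly reach the habitat, else 0.
        exposure = {("urbanization", "open waters"): 1, ("urbanization", "intertidal mudflat"): 1,
                    ("shipping", "open waters"): 1, ("shipping", "intertidal mudflat"): 0,
                    ("industry", "open waters"): 1, ("industry", "intertidal mudflat"): 1}

        def habitat_risk(habitat):
            """Sum of source-rank x habitat-rank products over plausible pathways."""
            return sum(rank * habitats[habitat] * exposure[(src, habitat)]
                       for src, rank in sources.items())

        for hab in habitats:
            print(f"{hab:>20}: relative risk score = {habitat_risk(hab)}")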

  19. A model of consumers' risk perceptions toward recombinant bovine growth hormone (rbGH): the impact of risk characteristics.

    PubMed

    Grobe, D; Douthitt, R; Zepeda, L

    1999-08-01

    This study estimates the effect risk characteristics, described as outrage factors by Hadden, have on consumers' risk perceptions toward the food-related biotechnology, recombinant bovine growth hormone (rbGH). The outrage factors applicable to milk from rbGH treated herds are involuntary risk exposure, unfamiliarity with the product's production process, unnatural product characteristics, lack of trust in regulator's ability to protect consumers in the marketplace, and consumers' inability to distinguish milk from rbGH treated herds compared to milk from untreated herds. An empirical analysis of data from a national survey of household food shoppers reveals that outrage factors mediate risk perceptions. The results support the inclusion of outrage factors into the risk perception model for the rbGH product, as they add significantly to the explanatory power of the model and therefore reduce bias compared to a simpler model of attitudinal and demographic factors. The study indicates that outrage factors which have a significant impact on risk perceptions are the lack of trust in the FDA as a food-related information source, and perceiving no consumer benefits from farmers' use of rbGH. Communication strategies to reduce consumer risk perceptions therefore could utilize agencies perceived as more trustworthy and emphasize the benefits of rbGH use to consumers.

  20. Test of a Clinical Model of Drinking and Suicidal Risk

    PubMed Central

    Conner, Kenneth R.; Gunzler, Douglas; Tang, Wan; Tu, Xin M.; Maisto, Stephen A.

    2010-01-01

    Background There are few data on the role of drinking patterns in suicidal thoughts or behavior among alcohol dependent individuals (ADIs), and meager data on variables that may influence the role of drinking in suicidal thoughts and behavior. This study tested a heuristic model that predicts that drinking promotes suicidal thoughts and behavior, the association is mediated (accounted for) by depressive symptoms, and that anger moderates (increases) the risk associated with intense drinking. Methods Data from Project MATCH, a multi-site alcohol use disorders treatment trial, were analyzed using structural equation modeling. There were 1,726 participants including 24% women and a mean age of 40.2 ± 11.0 years. Subjects were assessed at baseline and at 3-, 9-, and 15-month follow-up. Two categorical measures (presence/absence) of suicidal ideation (SI) were used that were analyzed in separate models. Predictors of interest were continuous assessments of average drinking intensity (i.e., drinks per drinking day or DDD), drinking frequency (i.e., percent days abstinent or PDA), depression, and anger. Results Both DDD and PDA were associated with SI at a statistically significant level, with PDA showing an inverse association. Depression scores served as a partial mediator or a full mediator of the drinking – SI relationship depending on the measure of SI used in the analysis. The models testing anger scores as a moderator fit the data poorly and did not support that anger serves as a moderator of the drinking – SI association. Conclusions Greater drinking intensity and drinking frequency predict SI among ADIs and depression serves as a mediator of these associations but anger does not appear to serve as a moderator. Further research is required to clarify if depression serves as a partial- or full mediator and to see if the results herein extend to suicidal behavior (i.e., suicide attempt, suicide). PMID:20958331

  1. Universal Cosmic Absolute and Modern Science

    NASA Astrophysics Data System (ADS)

    Kostro, Ludwik

    The official sciences, especially the natural sciences, respect in their research the principle of methodic naturalism, i.e. they consider all phenomena as entirely natural and therefore never adduce or cite supernatural entities and forces in their scientific explanations. The purpose of this paper is to show that modern science has its own self-existent, self-acting, and self-sufficient Natural All-in Being or Omni-Being, i.e. the entire Nature as a Whole, that justifies this scientific methodic naturalism. Since this Natural All-in Being is one and only, It should be considered the scientifically justified Natural Absolute of science and should be called, in my opinion, the Universal Cosmic Absolute of Modern Science. It will also be shown that the Universal Cosmic Absolute is ontologically enormously stratified and is, in its ultimate, i.e. most fundamental, stratum trans-reistic and trans-personal. This means that in its basic stratum It is neither a Thing nor a Person, although It contains in Itself all things and persons, as well as all other sentient and conscious individuals. At the turn of the 20th century science began to look for a theory of everything, a final theory, a master theory. In my opinion, the natural Universal Cosmic Absolute will constitute in such a theory the radical, all-penetrating Ultimate Basic Reality and will step by step substitute for the traditional supernatural personal Absolute.

  2. Modelling the risk of airborne infectious disease using exhaled air.

    PubMed

    Issarow, Chacha M; Mulder, Nicola; Wood, Robin

    2015-05-07

    In this paper we develop and demonstrate a flexible mathematical model that predicts the risk of airborne infectious diseases such as tuberculosis, under steady-state and non-steady-state conditions, by monitoring the air exhaled by infectors in a confined space. In the development of this model, we used the rebreathed air accumulation rate concept to directly determine the average volume fraction of exhaled air in a given space. From a biological point of view, air exhaled by infectors contains airborne infectious particles that cause airborne infectious diseases such as tuberculosis in confined spaces. Since not all infectious particles can reach the target infection site, we took into account that the infectious particles that commence the infection are determined by the respiratory deposition fraction, which is the probability of each infectious particle reaching the target infection site of the respiratory tract and causing infection. Furthermore, we compute the quantity of carbon dioxide, as a marker of exhaled air, that can be inhaled in the room with a high likelihood of causing airborne infectious disease given the presence of infectors. We demonstrate mathematically and schematically the correlation between TB transmission probability and airborne infectious particle generation rate, ventilation rate, average volume fraction of exhaled air, TB prevalence and duration of exposure to infectors in a confined space.
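
    The structure of such a model (rebreathed-air fraction estimated from excess CO2, combined with an infectious-particle generation rate, exposure time and a deposition fraction) is close in spirit to Wells-Riley-type formulations. The sketch below is a generic calculation of that kind with hypothetical parameter values; it is not the authors' exact equations.

        # Minimal sketch of a rebreathed-air-fraction infection risk calculation in
        # the spirit of the model above (a Wells-Riley-like form). All parameter
        # values are hypothetical; this is not the paper's equation set.

        import math

        def rebreathed_fraction(co2_room_ppm, co2_outdoor_ppm=400.0, co2_exhaled_ppm=40000.0):
            """Average volume fraction of exhaled air, estimated from excess CO2."""
            return (co2_room_ppm - co2_outdoor_ppm) / co2_exhaled_ppm

        def infection_probability(f_rebreathed, n_infectors, n_occupants,
                                  quanta_per_hour, hours, deposition_fraction=0.5):
            """P = 1 - exp(-deposition * f * (I/n) * q * t)."""
            dose = f_rebreathed * (n_infectors / n_occupants) * quanta_per_hour * hours
            return 1.0 - math.exp(-deposition_fraction * dose)

        f = rebreathed_fraction(co2_room_ppm=1200.0)     # poorly ventilated room
        p = infection_probability(f, n_infectors=1, n_occupants=10,
                                  quanta_per_hour=1.5, hours=8.0)
        print(f"rebreathed fraction = {f:.3f}, infection probability ~ {p:.3f}")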

  3. Risk prediction for myocardial infarction via generalized functional regression models.

    PubMed

    Ieva, Francesca; Paganoni, Anna M

    2016-08-01

    In this paper, we propose a generalized functional linear regression model for a binary outcome indicating the presence/absence of a cardiac disease, with multivariate functional data among the relevant predictors. In particular, the motivating aim is the analysis of electrocardiographic traces of patients whose pre-hospital electrocardiogram (ECG) has been sent to the 118 Dispatch Center of Milan (118 is the Italian toll-free emergency number) by life support personnel of the basic rescue units. The statistical analysis starts with a preprocessing of the ECGs, treated as multivariate functional data. The signals are reconstructed from noisy observations. The biological variability is then removed by a nonlinear registration procedure based on landmarks. Thus, in order to perform a data-driven dimensional reduction, a multivariate functional principal component analysis is carried out on the variance-covariance matrix of the reconstructed and registered ECGs and their first derivatives. We use the scores of the principal component decomposition as covariates in a generalized linear model to predict the presence of the disease in a new patient. Hence, a new semi-automatic diagnostic procedure is proposed to estimate the risk of infarction (in the case of interest, the probability of being affected by Left Bundle Branch Block). The performance of this classification method is evaluated and compared with other methods proposed in the literature. Finally, the robustness of the procedure is checked via leave-j-out techniques.
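
    The score-based pipeline described (functional principal component scores used as covariates in a generalized linear model) can be approximated with ordinary PCA and logistic regression on sampled, already-registered curves. The sketch below uses synthetic signals and scikit-learn and is only a schematic analogue of the authors' procedure.

        # Minimal sketch of the score-based classification pipeline described above:
        # PCA on sampled (already registered) curves, then logistic regression on the
        # leading scores. Signals are synthetic; this is only a schematic analogue.

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(1)
        n, t = 200, 300                        # subjects, samples per curve
        time = np.linspace(0, 1, t)
        labels = rng.integers(0, 2, n)         # 1 = disease present (synthetic)

        # Synthetic curves: "diseased" subjects get a slightly widened bump.
        width = np.where(labels == 1, 0.08, 0.05)[:, None]
        curves = np.exp(-((time - 0.4) ** 2) / (2 * width ** 2)) + rng.normal(0, 0.05, (n, t))

        scores = PCA(n_components=5).fit_transform(curves)    # functional-PCA analogue
        acc = cross_val_score(LogisticRegression(max_iter=1000), scores, labels, cv=5).mean()
        print(f"cross-validated accuracy on synthetic data: {acc:.2f}")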

  4. Long‐Term Post‐CABG Survival: Performance of Clinical Risk Models Versus Actuarial Predictions

    PubMed Central

    Carr, Brendan M.; Romeiser, Jamie; Ruan, Joyce; Gupta, Sandeep; Seifert, Frank C.; Zhu, Wei

    2015-01-01

    Abstract Background/aim Clinical risk models are commonly used to predict short‐term coronary artery bypass grafting (CABG) mortality but are less commonly used to predict long‐term mortality. The added value of long‐term mortality clinical risk models over traditional actuarial models has not been evaluated. To address this, the predictive performance of a long‐term clinical risk model was compared with that of an actuarial model to identify the clinical variable(s) most responsible for any differences observed. Methods Long‐term mortality for 1028 CABG patients was estimated using the Hannan New York State clinical risk model and an actuarial model (based on age, gender, and race/ethnicity). Vital status was assessed using the Social Security Death Index. Observed/expected (O/E) ratios were calculated, and the models' predictive performances were compared using a nested c‐index approach. Linear regression analyses identified the subgroup of risk factors driving the differences observed. Results Mortality rates were 3%, 9%, and 17% at one‐, three‐, and five years, respectively (median follow‐up: five years). The clinical risk model provided more accurate predictions. Greater divergence between model estimates occurred with increasing long‐term mortality risk, with baseline renal dysfunction identified as a particularly important driver of these differences. Conclusions Long‐term mortality clinical risk models provide enhanced predictive power compared to actuarial models. Using the Hannan risk model, a patient's long‐term mortality risk can be accurately assessed and subgroups of higher‐risk patients can be identified for enhanced follow‐up care. More research appears warranted to refine long‐term CABG clinical risk models. doi: 10.1111/jocs.12665 (J Card Surg 2016;31:23–30) PMID:26543019

  5. Identification of the high risk emergency surgical patient: Which risk prediction model should be used?

    PubMed Central

    Stonelake, Stephen; Thomson, Peter; Suggett, Nigel

    2015-01-01

    Introduction National guidance states that all patients having emergency surgery should have a mortality risk assessment calculated on admission so that the ‘high risk’ patient can receive the appropriate seniority and level of care. We aimed to assess whether peri-operative risk scoring tools could accurately calculate mortality and morbidity risk. Methods Mortality risk scores for 86 consecutive emergency laparotomies were calculated using pre-operative (ASA, Lee index) and post-operative (POSSUM, P-POSSUM and CR-POSSUM) risk calculation tools. Morbidity risk scores were calculated using the POSSUM predicted morbidity and compared against actual morbidity according to the Clavien–Dindo classification. Results The actual mortality was 10.5%. The average predicted risk scores for all laparotomies were: ASA 26.5%, Lee Index 2.5%, POSSUM 29.5%, P-POSSUM 18.5%, CR-POSSUM 10.5%. Complications occurred following 67 laparotomies (78%). The majority (51%) of complications were classified as Clavien–Dindo grade 2–3 (non-life-threatening). Patients with a POSSUM morbidity risk of greater than 50% developed significantly more life-threatening complications (CD 4–5) than those with a predicted morbidity risk of 50% or less (P = 0.01). Discussion Pre-operative risk stratification remains a challenge because the Lee Index under-predicts and ASA over-predicts mortality risk. Post-operative risk scoring using the CR-POSSUM is more accurate, and we suggest it can be used to identify patients who require intensive care post-operatively. Conclusions In the absence of accurate risk scoring tools that can be used on admission to hospital, it is not possible to reliably audit the achievement of national standards of care for the ‘high-risk’ patient.

  6. Climate and weather risk in natural resource models

    NASA Astrophysics Data System (ADS)

    Merrill, Nathaniel Henry

    This work, consisting of three manuscripts, addresses natural resource management under risk due to variation in climate and weather. In three distinct but theoretically related applications, I quantify the role of natural resources in stabilizing economic outcomes. In Manuscript 1, we address policy designed to effect the risk of cyanobacteria blooms in a drinking water reservoir through watershed wide policy. Combining a hydrologic and economic model for a watershed in Rhode Island, we solve for the efficient allocation of best management practices (BMPs) on livestock pastures to meet a monthly risk-based as well as mean-based water quality objective. In order to solve for the efficient allocations of nutrient control effort, we optimize a probabilistically constrained integer-programming problem representing the choices made on each farm and the resultant conditions that support cyanobacteria blooms. In doing so, we employ a genetic algorithm (GA). We hypothesize that management based on controlling the upper tail of the probability distribution of phosphorus loading implies different efficient management actions as compared to controlling mean loading. We find a shift to more intense effort on fewer acres when a probabilistic objective is specified with cost savings of meeting risk levels of up to 25% over mean loading based policies. Additionally, we illustrate the relative cost effectiveness of various policies designed to meet this risk-based objective. Rainfall and the subsequent overland runoff is the source of transportation of nutrients to a receiving water body, with larger amounts of phosphorus moving in more intense rainfall events. We highlight the importance of this transportation mechanism by comparing policies under climate change scenarios, where the intensity of rainfall is projected to increase and the time series process of rainfall to change. In Manuscript 2, we introduce a new economic groundwater model that incorporates the gradual shift

  7. Modeling Commercial Turbofan Engine Icing Risk With Ice Crystal Ingestion

    NASA Technical Reports Server (NTRS)

    Jorgenson, Philip C. E.; Veres, Joseph P.

    2013-01-01

    The occurrence of ice accretion within commercial high bypass aircraft turbine engines has been reported under certain atmospheric conditions. Engine anomalies have taken place at high altitudes that have been attributed to ice crystal ingestion, partially melting, and ice accretion on the compression system components. The result was degraded engine performance, and one or more of the following: loss of thrust control (roll back), compressor surge or stall, and flameout of the combustor. As ice crystals are ingested into the fan and low pressure compression system, the increase in air temperature causes a portion of the ice crystals to melt. It is hypothesized that this allows the ice-water mixture to cover the metal surfaces of the compressor stationary components which leads to ice accretion through evaporative cooling. Ice accretion causes a blockage which subsequently results in the deterioration in performance of the compressor and engine. The focus of this research is to apply an engine icing computational tool to simulate the flow through a turbofan engine and assess the risk of ice accretion. The tool is comprised of an engine system thermodynamic cycle code, a compressor flow analysis code, and an ice particle melt code that has the capability of determining the rate of sublimation, melting, and evaporation through the compressor flow path, without modeling the actual ice accretion. A commercial turbofan engine which has previously experienced icing events during operation in a high altitude ice crystal environment has been tested in the Propulsion Systems Laboratory (PSL) altitude test facility at NASA Glenn Research Center. The PSL has the capability to produce a continuous ice cloud which is ingested by the engine during operation over a range of altitude conditions. The PSL test results confirmed that there was ice accretion in the engine due to ice crystal ingestion, at the same simulated altitude operating conditions as experienced previously in

  8. Modeling number of bacteria per food unit in comparison to bacterial concentration in quantitative risk assessment: impact on risk estimates.

    PubMed

    Pouillot, Régis; Chen, Yuhuan; Hoelzer, Karin

    2015-02-01

    When developing quantitative risk assessment models, a fundamental consideration for risk assessors is to decide whether to evaluate changes in bacterial levels in terms of concentrations or in terms of bacterial numbers. Although modeling bacteria in terms of integer numbers may be regarded as a more intuitive and rigorous choice, modeling bacterial concentrations is more popular as it is generally less mathematically complex. We tested three different modeling approaches in a simulation study. The first approach considered bacterial concentrations; the second considered the number of bacteria in contaminated units, and the third considered the expected number of bacteria in contaminated units. Simulation results indicate that modeling concentrations tends to overestimate risk compared to modeling the number of bacteria. A sensitivity analysis using a regression tree suggests that processes which include drastic scenarios consisting of combinations of large bacterial inactivation followed by large bacterial growth frequently lead to a >10-fold overestimation of the average risk when modeling concentrations as opposed to bacterial numbers. Alternatively, the approach of modeling the expected number of bacteria in positive units generates results similar to the second method and is easier to use, thus potentially representing a promising compromise.
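
    The difference between the two modeling choices can be reproduced with a small Monte Carlo experiment that pushes the same inactivation-then-growth scenario through either a continuous concentration or integer counts per contaminated unit. The scenario values and the dose-response parameter in the sketch below are hypothetical.

        # Minimal sketch contrasting the two modeling choices discussed above: the
        # same inactivation-then-growth scenario propagated either as a continuous
        # concentration or as integer bacterial numbers per contaminated unit.
        # Scenario values and the dose-response parameter are hypothetical.

        import numpy as np

        rng = np.random.default_rng(42)
        n_units, unit_mass_g, initial_conc = 100_000, 25.0, 0.4   # CFU/g in contaminated units
        log10_inactivation, log10_growth = 4.0, 3.0
        r = 0.002                                                  # exponential dose-response

        def mean_risk(dose):
            return float(np.mean(1.0 - np.exp(-r * dose)))

        # (a) Concentration-based: concentrations never hit zero, only get small.
        conc = initial_conc * 10 ** (-log10_inactivation + log10_growth)
        dose_conc = np.full(n_units, conc * unit_mass_g)

        # (b) Number-based: integer counts, inactivation as binomial survival.
        n0 = rng.poisson(initial_conc * unit_mass_g, n_units)
        survivors = rng.binomial(n0, 10 ** -log10_inactivation)
        dose_count = survivors * 10 ** log10_growth

        print(f"mean risk, concentration-based: {mean_risk(dose_conc):.2e}")
        print(f"mean risk, number-based:        {mean_risk(dose_count):.2e}")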

  9. Quantitative standards for absolute linguistic universals.

    PubMed

    Piantadosi, Steven T; Gibson, Edward

    2014-01-01

    Absolute linguistic universals are often justified by cross-linguistic analysis: If all observed languages exhibit a property, the property is taken to be a likely universal, perhaps specified in the cognitive or linguistic systems of language learners and users. In many cases, these patterns are then taken to motivate linguistic theory. Here, we show that cross-linguistic analysis will very rarely be able to statistically justify absolute, inviolable patterns in language. We formalize two statistical methods--frequentist and Bayesian--and show that in both it is possible to find strict linguistic universals, but that the number of independent languages necessary to do so is generally unachievable. This suggests that methods other than typological statistics are necessary to establish absolute properties of human language, and thus that many of the purported universals in linguistics have not received sufficient empirical justification.
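
    The core sample-size argument has a simple frequentist version: if a fraction p of possible languages violated a candidate universal, the probability that n independent languages all conform is (1 - p)^n, so rejection at level alpha requires n >= log(alpha) / log(1 - p). The sketch below tabulates that requirement, treating languages as independent (which, as the authors argue, they generally are not).

        # Minimal sketch of the frequentist sample-size argument sketched above:
        # after observing only conforming languages, rejecting (at level alpha) the
        # hypothesis that a fraction p of possible languages violate the universal
        # requires n >= log(alpha) / log(1 - p) independent languages.

        import math

        def languages_needed(p_violating, alpha=0.05):
            return math.ceil(math.log(alpha) / math.log(1.0 - p_violating))

        for p in (0.25, 0.10, 0.05, 0.01):
            print(f"p = {p:4.2f}: need >= {languages_needed(p):4d} independent languages")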

  10. Geographical modeling of exposure risk to cyanobacteria for epidemiological purposes.

    PubMed

    Serrano, Tania; Dupas, Rémi; Upegui, Erika; Buscail, Camille; Grimaldi, Catherine; Viel, Jean François

    2015-08-01

    The cyanobacteria-derived neurotoxin β-methylamino-L-alanine (BMAA) represents a plausible environmental trigger for amyotrophic lateral sclerosis (ALS), a debilitating and fatal neuromuscular disease. With the eutrophication of water bodies, cyanobacterial blooms and their toxins are becoming increasingly prevalent in France, especially in the Brittany region. Cyanobacteria are monitored at only a few recreational sites, preventing an estimation of exposure of the human population. By contrast, phosphorus, a limiting nutrient for cyanobacterial growth and thus considered a good proxy for cyanobacteria exposure, is monitored in many but not all surface water bodies. Our goal was to develop a geographic exposure indicator that could be used in epidemiological research. We considered the total phosphorus (TP) concentration (mg/L) of samples collected between October 2007 and September 2012 at 179 monitoring stations distributed throughout the Brittany region. Using readily available spatial data, we computed environmental descriptors at the watershed level with a Geographic Information System. Then, these descriptors were introduced into a backward stepwise linear regression model to predict the median TP concentration in unmonitored surface water bodies. TP concentrations in surface water follow an increasing gradient from West to East and inland to coast. The empirical concentration model included five predictor variables with a fair coefficient of determination (R^2 = 0.51). The specific total runoff and the watershed slope correlated negatively with the TP concentrations (p = 0.01 and p < 10^-9, respectively), whereas positive associations were found for the proportion of built-up area, the upstream presence of sewage treatment plants, and the algae volume as indicated by the Landsat red/green reflectance ratio (p < 0.01, p < 10^-6 and p < 0.01, respectively). Complementing the monitoring networks, this geographical modeling can help estimate TP concentrations

  11. Risk-Based Causal Modeling of Airborne Loss of Separation

    NASA Technical Reports Server (NTRS)

    Geuther, Steven C.; Shih, Ann T.

    2015-01-01

    Maintaining safe separation between aircraft remains one of the key aviation challenges as the Next Generation Air Transportation System (NextGen) emerges. The goals of the NextGen are to increase capacity and reduce flight delays to meet the aviation demand growth through the 2025 time frame while maintaining safety and efficiency. The envisioned NextGen is expected to enable high air traffic density, diverse fleet operations in the airspace, and a decrease in separation distance. All of these factors contribute to the potential for Loss of Separation (LOS) between aircraft. LOS is a precursor to a potential mid-air collision (MAC). The NASA Airspace Operations and Safety Program (AOSP) is committed to developing aircraft separation assurance concepts and technologies to mitigate LOS instances, thereby preventing MAC. This paper focuses on the analysis of causal and contributing factors of LOS accidents and incidents leading to MAC occurrences. Mid-air collisions among large commercial aircraft have been rare in the past decade; therefore, the LOS instances in this study are for general aviation using visual flight rules in the years 2000-2010. The study includes the investigation of causal paths leading to LOS, and the development of the Airborne Loss of Separation Analysis Model (ALOSAM) using Bayesian Belief Networks (BBN) to capture the multi-dependent relations of causal factors. The ALOSAM is currently a qualitative model, although further development could lead to a quantitative model. ALOSAM could then be used to perform impact analysis of concepts and technologies in the AOSP portfolio on the reduction of LOS risk.
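
    A Bayesian Belief Network of the kind used in ALOSAM encodes causal factors as nodes with conditional probability tables and propagates them into a probability of LOS. The three-node sketch below (a scanning-failure factor and a traffic-density factor as parents of LOS) is hypothetical in both structure and numbers and is meant only to show the mechanics, not the ALOSAM model itself.

        # Minimal sketch of Bayesian-Belief-Network mechanics of the kind used in
        # ALOSAM: two binary causal factors with a conditional probability table for
        # loss of separation (LOS). Structure and numbers are entirely hypothetical.

        import itertools

        p_scan_failure = 0.2       # P(inadequate visual scanning)
        p_high_density = 0.3       # P(high traffic density)

        # Hypothetical CPT: P(LOS = 1 | scanning failure, high density).
        p_los = {(0, 0): 0.001, (0, 1): 0.005, (1, 0): 0.01, (1, 1): 0.08}

        def marginal_los():
            """Marginal P(LOS) obtained by summing over the parent states."""
            total = 0.0
            for scan, dense in itertools.product((0, 1), repeat=2):
                p_parents = ((p_scan_failure if scan else 1 - p_scan_failure)
                             * (p_high_density if dense else 1 - p_high_density))
                total += p_parents * p_los[(scan, dense)]
            return total

        print(f"P(LOS) = {marginal_los():.4f}")
        p_scan_failure = 0.1       # what-if: mitigation halves scanning failures
        print(f"P(LOS) after mitigation = {marginal_los():.4f}")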

  12. A causal model of chronic obstructive pulmonary disease (COPD) risk.

    PubMed

    Cox, Louis Anthony Tony

    2011-01-01

    Research on the etiology of chronic obstructive pulmonary disease (COPD), an irreversible degenerative lung disease affecting 15% to 20% of smokers, has blossomed over the past half-century. Profound new insights have emerged from a combination of in vitro and -omics studies on affected lung cell populations (including cytotoxic CD8+ T lymphocytes, regulatory CD4+ helper T cells, dendritic cells, alveolar macrophages and neutrophils, alveolar and bronchiolar epithelial cells, goblet cells, and fibroblasts) and extracellular matrix components (especially elastin and collagen fibers); in vivo studies on wild-type and genetically engineered mice and other rodents; clinical investigation of cell- and molecular-level changes in asymptomatic smokers and COPD patients; genetic studies of susceptible and rapidly-progressing phenotypes (both human and animal); biomarker studies of enzyme and protein degradation products in induced sputum, bronchiolar lavage, urine, and blood; and epidemiological and clinical investigations of the time course of disease progression. To this rich mix of data, we add a relatively simple in silico computational model that incorporates recent insights into COPD disease causation and progression. Our model explains irreversible degeneration of lung tissue as resulting from a cascade of positive feedback loops: a macrophage inflammation loop, a neutrophil inflammation loop, and an alveolar epithelial cell apoptosis loop. Unrepaired damage results in clinical symptoms. The resulting model illustrates how to simplify and make more understandable the main aspects of the very complex dynamics of COPD initiation and progression, as well as how to predict the effects on risk of interventions that affect specific biological responses.

  13. Developing Risk Prediction Models for Postoperative Pancreatic Fistula: a Systematic Review of Methodology and Reporting Quality.

    PubMed

    Wen, Zhang; Guo, Ya; Xu, Banghao; Xiao, Kaiyin; Peng, Tao; Peng, Minhao

    2016-04-01

    Postoperative pancreatic fistula is still a major complication after pancreatic surgery, despite improvements in surgical technique and perioperative management. We sought to systematically review and critically assess the conduct and reporting of methods used to develop risk prediction models for predicting postoperative pancreatic fistula. We conducted a systematic search of the PubMed and EMBASE databases to identify articles published before January 1, 2015, which described the development of models to predict the risk of postoperative pancreatic fistula. We extracted information on the development of each prediction model, including study design, sample size and number of events, definition of postoperative pancreatic fistula, risk predictor selection, missing data, model-building strategies, and model performance. Seven studies describing the development of seven risk prediction models were included. In three studies (42 %), the number of events per variable was less than 10. The number of candidate risk predictors ranged from 9 to 32. Five studies (71 %) reported using univariate screening, which is not recommended when building a multivariate model, to reduce the number of risk predictors. Six risk prediction models (86 %) were developed by categorizing all continuous risk predictors. The treatment and handling of missing data were not mentioned in any of the studies. We found use of inappropriate methods that could endanger the development of the models, including univariate pre-screening of variables, categorization of continuous risk predictors, and inadequate model validation. The use of inappropriate methods affects the reliability and the accuracy of the probability estimates of predicting postoperative pancreatic fistula.

  14. Orion Absolute Navigation System Progress and Challenge

    NASA Technical Reports Server (NTRS)

    Holt, Greg N.; D'Souza, Christopher

    2012-01-01

    The absolute navigation design of NASA's Orion vehicle is described. It has undergone several iterations and modifications since its inception, and continues as a work-in-progress. This paper seeks to benchmark the current state of the design and some of the rationale and analysis behind it. There are specific challenges to address when preparing a timely and effective design for the Exploration Flight Test (EFT-1), while still looking ahead and providing software extensibility for future exploration missions. The primary onboard measurements in a Near-Earth or Mid-Earth environment consist of GPS pseudo-range and delta-range, but for future exploration missions the use of star-tracker and optical navigation sources needs to be considered. Discussions are presented for state size and composition, processing techniques, and consider states. A presentation is given for the processing technique using the computationally stable and robust UDU formulation with an Agee-Turner Rank-One update. This allows for computational savings when dealing with many parameters which are modeled as slowly varying Gauss-Markov processes. Preliminary analysis shows up to a 50% reduction in computation versus a more traditional formulation. Several state elements are discussed and evaluated, including position, velocity, attitude, clock bias/drift, and GPS measurement biases in addition to bias, scale factor, misalignment, and non-orthogonalities of the accelerometers and gyroscopes. Another consideration is the initialization of the EKF in various scenarios. Scenarios such as single-event upset, ground command, and cold start are discussed as are strategies for whole and partial state updates as well as covariance considerations. Strategies are given for dealing with latent measurements and high-rate propagation using multi-rate architecture. The details of the rate groups and the data flow between the elements are discussed and evaluated.
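
    One element mentioned above, modeling slowly varying sensor errors as first-order Gauss-Markov processes, has a compact discrete-time form that is easy to illustrate. The time constant and steady-state sigma in the sketch below are hypothetical, and the snippet does not reproduce the UDU/Agee-Turner filter mechanization itself, only the state model it operates on.

        # Minimal sketch of a first-order Gauss-Markov bias state of the kind used
        # for slowly varying sensor errors: x_{k+1} = exp(-dt/tau) * x_k + w_k.
        # Time constant and noise strength are hypothetical; this is not the Orion
        # UDU/Agee-Turner mechanization, only the state model it operates on.

        import math
        import random

        def propagate_gauss_markov(x, dt, tau, sigma_ss):
            """One discrete step; sigma_ss is the steady-state standard deviation."""
            phi = math.exp(-dt / tau)
            q_std = sigma_ss * math.sqrt(1.0 - phi * phi)   # keeps the variance stationary
            return phi * x + random.gauss(0.0, q_std)

        random.seed(0)
        bias, dt, tau, sigma = 0.0, 1.0, 3600.0, 1e-4       # e.g. an accelerometer bias
        for _ in range(7200):                                # two hours at 1 Hz
            bias = propagate_gauss_markov(bias, dt, tau, sigma)
        print(f"bias after 2 h: {bias:.2e} (steady-state sigma {sigma:.0e})")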

  15. Absolute determination of local tropospheric OH concentrations

    NASA Technical Reports Server (NTRS)

    Armerding, Wolfgang; Comes, Franz-Josef

    1994-01-01

    Long path absorption (LPA) according to Lambert Beer's law is a method to determine absolute concentrations of trace gases such as tropospheric OH. We have developed a LPA instrument which is based on a rapid tuning of the light source which is a frequency doubled dye laser. The laser is tuned across two or three OH absorption features around 308 nm with a scanning speed of 0.07 cm^-1/microsecond and a repetition rate of 1.3 kHz. This high scanning speed greatly reduces the fluctuation of the light intensity caused by the atmosphere. To obtain the required high sensitivity the laser output power is additionally made constant and stabilized by an electro-optical modulator. The present sensitivity is of the order of a few times 10^5 OH per cm^3 for an acquisition time of a minute and an absorption path length of only 1200 meters so that a folding of the optical path in a multireflection cell was possible leading to a lateral dimension of the cell of a few meters. This allows local measurements to be made. Tropospheric measurements have been carried out in 1991 resulting in the determination of OH diurnal variation at specific days in late summer. Comparison with model calculations have been made. Interferences are mainly due to SO2 absorption. The problem of OH self generation in the multireflection cell is of minor extent. This could be shown by using different experimental methods. The minimum-maximum signal to noise ratio is about 8 x 10^-4 for a single scan. Due to the small size of the absorption cell the realization of an open air laboratory is possible in which by use of an additional UV light source or by additional fluxes of trace gases the chemistry can be changed under controlled conditions allowing kinetic studies of tropospheric photochemistry to be made in open air.
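
    The concentration retrieval behind long-path absorption follows the Lambert-Beer law, N = ln(I0/I) / (sigma * L). In the sketch below the effective OH cross-section and the measured intensity ratio are hypothetical illustration values chosen to land in the sensitivity range quoted above.

        # Minimal sketch of the Lambert-Beer retrieval underlying long-path
        # absorption: N = ln(I0 / I) / (sigma * L). The cross-section and the
        # measured intensity ratio are hypothetical illustration values.

        import math

        def oh_number_density(i0, i, sigma_cm2, path_cm):
            """OH concentration (molecules per cm^3) from measured transmission."""
            return math.log(i0 / i) / (sigma_cm2 * path_cm)

        sigma = 1.5e-16            # cm^2, assumed effective OH cross-section near 308 nm
        path = 1200.0 * 100.0      # 1200 m folded path, in cm
        i0, i = 1.000000, 0.999982 # ~1.8e-5 fractional absorption (hypothetical)

        print(f"[OH] ~ {oh_number_density(i0, i, sigma, path):.1e} cm^-3")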

  16. Comparative application of different risk assessment models and implications on resulting remediation options.

    PubMed

    Capodaglio, Andrea; Callegari, Arianna; Torretta, Vincenzo

    2014-01-01

    The issue of contaminated soils and their productive recovery is a controversial environmental and economic problem with important social, public health and sustainability consequences. The sheer number and characteristics of the polluted sites are so large and varied, and the definition of priorities for their remediation so site-dependent, that proper characterization and final environmental quality goals are of strategic importance. One possible approach to site-specific assessment and to site priority ranking is to carry out absolute and comparative risk analysis procedures, respectively. An important issue is the need to consider not only the potential risk to public health, but also the best possible financial return from the investments in remediation, especially when they are carried out with public money. In this paper, different risk assessment approaches for contaminated sites are considered and compared, and their applicability in support of sustainable policies is discussed using a case study.

  17. European Seismic Risk Model Covering Italy, Switzerland, Austria, Germany and Belgium

    NASA Astrophysics Data System (ADS)

    Williams, C.; Nyst, M.; Onur, T.; Seneviratna, P.; Baca, A.; Sorby, A.

    2006-12-01

    A seismic risk model for Europe has been developed to assist insurers and reinsurers in assessing the financial risk posed by earthquakes. The model covers Italy and several countries in central Europe, including Switzerland, Austria, Germany and Belgium. This presentation summarizes the methodology and data within the model and includes a discussion of the key results from the hazard and risk perspectives. The earthquake risk-model framework has four components. First, the stochastic event set is determined, along with its associated event probabilities. A ground-motion model including geotechnical data is added to complete the seismic hazard model. To determine risk, regional building vulnerability curves and a financial model are incorporated. An insurer property exposure database was developed to determine the insured seismic risk in these countries. Using this model, examination of the resulting hazard maps (200-, 475-, 1000- and 2500-year return periods) and of city-level hazard curves gives insight into the key drivers of risk across the region. Hazard de-aggregation allows for examination of the key drivers of risk in terms of seismic sources, event magnitudes and event types. Examination of loss costs for residential and commercial (short and mid-rise) structures gives insight into the risk perspective for these various lines of business. Finally, incorporation of the insurer property exposure allows for an examination of the insured risk across the region and between exposure concentrations including Rome, Zurich, Munich, Vienna and Brussels.
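
    The four-component framework described above (event set with rates, ground motion, vulnerability, exposure/financial model) reduces, for a single location, to an expected-annual-loss sum over events. The sketch below shows that reduction with toy attenuation and vulnerability functions and invented numbers; it is not the published model.

```python
import numpy as np

# Event set: annual rates, magnitudes and distances (illustrative values only).
events = [
    {"rate": 0.01,  "magnitude": 6.5, "distance_km": 20.0},
    {"rate": 0.002, "magnitude": 7.0, "distance_km": 35.0},
]

def ground_motion(magnitude, distance_km):
    """Toy attenuation relation returning PGA in g (placeholder, not a published GMPE)."""
    return np.exp(0.8 * magnitude - 1.3 * np.log(distance_km + 10.0) - 3.5)

def vulnerability(pga):
    """Toy mean damage ratio curve for a mid-rise building class."""
    return 1.0 - np.exp(-3.0 * pga)

exposure_value = 5e6  # insured value at one location (illustrative)

expected_annual_loss = sum(
    ev["rate"] * vulnerability(ground_motion(ev["magnitude"], ev["distance_km"])) * exposure_value
    for ev in events
)
print(f"expected annual loss ~ {expected_annual_loss:,.0f}")
```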

  18. Absolute Distance Measurement with the MSTAR Sensor

    NASA Technical Reports Server (NTRS)

    Lay, Oliver P.; Dubovitsky, Serge; Peters, Robert; Burger, Johan; Ahn, Seh-Won; Steier, William H.; Fetterman, Harrold R.; Chang, Yian

    2003-01-01

    The MSTAR sensor (Modulation Sideband Technology for Absolute Ranging) is a new system for measuring absolute distance, capable of resolving the integer cycle ambiguity of standard interferometers and making it possible to measure distance with sub-nanometer accuracy. The sensor uses a single laser in conjunction with fast phase modulators and low-frequency detectors. We describe the design of the system - the principle of operation, the metrology source, beam-launching optics, and signal processing - and show results for target distances up to 1 meter. We then demonstrate how the system can be scaled to kilometer-scale distances.
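
    The integer-cycle ambiguity resolution mentioned above can be sketched generically: a coarse range derived from the modulation sidebands only has to be good to a fraction of the optical wavelength, after which the fine interferometric phase pins down the distance. The wavelength, noise level, and one-way geometry below are illustrative assumptions, not MSTAR design values.

```python
import numpy as np

lam = 1.55e-6          # optical wavelength in metres (illustrative, not an MSTAR value)
true_distance = 0.734  # metres (one-way, for simplicity)

# fine measurement: interferometric phase, known only modulo 2*pi
phi = (2 * np.pi * true_distance / lam) % (2 * np.pi)

# coarse measurement: range from the modulation-sideband beat; it only has to be
# accurate to well within half an optical wavelength for the rounding below to work
coarse = true_distance + np.random.default_rng(1).normal(0.0, lam / 10)

# resolve the integer cycle count N so that (N + phi/2pi)*lam matches the coarse range
N = int(np.round(coarse / lam - phi / (2 * np.pi)))
resolved = (N + phi / (2 * np.pi)) * lam
print(f"residual error = {abs(resolved - true_distance):.2e} m")
```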

  19. Integrating Professional and Folk Models of HIV Risk: YMSM’s Perceptions of High-Risk Sex

    PubMed Central

    Kubicek, Katrina; Carpineto, Julie; McDavitt, Bryce; Weiss, George; Iverson, Ellen F.; Au, Chi-Wai; Kerrone, Dustin; Martinez, Miguel; Kipke, Michele D.

    2009-01-01

    Risks associated with HIV are well documented in research literature. While a great deal has been written about high-risk sex, little research has been conducted to examine how young men who have sex with men (YMSM) perceive and define high-risk sexual behavior. In this study, we compare the “professional” and “folk” models of HIV risk based on YMSM’s understanding of high-risk sex and where and how they gathered their understanding of HIV-risk behaviors. The findings reported here emerged from the quantitative and qualitative interviews from the Healthy Young Men’s Study (HYM), a longitudinal study examining risk and protective factors for substance use and sexual risk among an ethnically diverse sample of YMSM. Findings are discussed in relation to framing how service providers and others can increase YMSM’s knowledge of sexual behavior and help them build solid foundations of sexual health education to protect them from STI and HIV infection. PMID:18558819

  20. Training Systems Modelers through the Development of a Multi-scale Chagas Disease Risk Model

    NASA Astrophysics Data System (ADS)

    Hanley, J.; Stevens-Goodnight, S.; Kulkarni, S.; Bustamante, D.; Fytilis, N.; Goff, P.; Monroy, C.; Morrissey, L. A.; Orantes, L.; Stevens, L.; Dorn, P.; Lucero, D.; Rios, J.; Rizzo, D. M.

    2012-12-01

    The goal of our NSF-sponsored Division of Behavioral and Cognitive Sciences grant is to create a multidisciplinary approach to develop spatially explicit models of vector-borne disease risk using Chagas disease as our model. Chagas disease is a parasitic disease endemic to Latin America that afflicts an estimated 10 million people. The causative agent (Trypanosoma cruzi) is most commonly transmitted to humans by blood feeding triatomine insect vectors. Our objectives are: (1) advance knowledge on the multiple interacting factors affecting the transmission of Chagas disease, and (2) provide next generation genomic and spatial analysis tools applicable to the study of other vector-borne diseases worldwide. This funding is a collaborative effort between the RSENR (UVM), the School of Engineering (UVM), the Department of Biology (UVM), the Department of Biological Sciences (Loyola (New Orleans)) and the Laboratory of Applied Entomology and Parasitology (Universidad de San Carlos). Throughout this five-year study, multi-educational groups (i.e., high school, undergraduate, graduate, and postdoctoral) will be trained in systems modeling. This systems approach challenges students to incorporate environmental, social, and economic as well as technical aspects and enables modelers to simulate and visualize topics that would either be too expensive, complex or difficult to study directly (Yasar and Landau 2003). We launch this research by developing a set of multi-scale, epidemiological models of Chagas disease risk using STELLA® software v.9.1.3 (isee systems, inc., Lebanon, NH). We use this particular system dynamics software as a starting point because of its simple graphical user interface (e.g., behavior-over-time graphs, stock/flow diagrams, and causal loops). To date, high school and undergraduate students have created a set of multi-scale (i.e., homestead, village, and regional) disease models. Modeling the system at multiple spatial scales forces recognition that

  1. Adolescent mental health and academic functioning: empirical support for contrasting models of risk and vulnerability.

    PubMed

    Lucier-Greer, Mallory; O'Neal, Catherine W; Arnold, A Laura; Mancini, Jay A; Wickrama, Kandauda K A S

    2014-11-01

    Adolescents in military families contend with normative stressors that are universal and exist across social contexts (minority status, family disruptions, and social isolation) as well as stressors reflective of their military life context (e.g., parental deployment, school transitions, and living outside the United States). This study utilizes a social ecological perspective and a stress process lens to examine the relationship between multiple risk factors and relevant indicators of youth well-being, namely depressive symptoms and academic performance, as well as the mediating role of self-efficacy (N = 1,036). Three risk models were tested: an additive effects model (each risk factor uniquely influences outcomes), a full cumulative effects model (the collection of risk factors influences outcomes), and a comparative model (a cumulative effects model exploring the differential effects of normative and military-related risks). This design allowed for the simultaneous examination of multiple risk factors and a comparison of alternative perspectives on measuring risk. Each model was predictive of depressive symptoms and academic performance through persistence; however, each model provides unique findings about the relationship between risk factors and youth outcomes. Discussion pertinent to service providers and researchers is provided on how risk is conceptualized, along with suggestions for identifying at-risk youth.
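
    The three specifications described above can be contrasted concretely: an additive model enters each risk indicator separately, a cumulative model enters only their count, and a comparative model splits that count into normative and military-related sums. A minimal sketch with simulated (invented) data and coefficients:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 1_000

# four binary risk indicators: two "normative", two "military-related" (illustrative)
risks = rng.binomial(1, [0.3, 0.2, 0.25, 0.15], size=(n, 4))
depressive = 5 + risks @ np.array([1.0, 0.8, 1.2, 0.5]) + rng.normal(0, 2, n)

# additive model: each risk factor enters separately
additive = sm.OLS(depressive, sm.add_constant(risks)).fit()

# cumulative model: only the total count of risks enters
cumulative = sm.OLS(depressive, sm.add_constant(risks.sum(axis=1))).fit()

# comparative model: separate counts for normative vs military-related risks
counts = np.column_stack([risks[:, :2].sum(axis=1), risks[:, 2:].sum(axis=1)])
comparative = sm.OLS(depressive, sm.add_constant(counts)).fit()

for name, m in [("additive", additive), ("cumulative", cumulative), ("comparative", comparative)]:
    print(f"{name:12s} adj. R^2 = {m.rsquared_adj:.3f}")
```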

  2. Space Weather Influence on Power Systems: Prediction, Risk Analysis, and Modeling

    NASA Astrophysics Data System (ADS)

    Yatsenko, Vitaliy

    2016-04-01

    This report concentrates on dynamic probabilistic risk analysis of optical elements for complex characterization of damage, using a physical model of solid-state lasers and the predictable level of ionizing radiation and space weather. The following main subjects will be covered by our report: (a) a solid-state laser model; (b) mathematical models for dynamic probabilistic risk assessment; and (c) software for modeling and prediction of ionizing radiation. A probabilistic risk assessment method for solid-state lasers is presented with consideration of deterministic and stochastic factors. Probabilistic risk assessment is a comprehensive, structured, and logical analysis method aimed at identifying and assessing risks in solid-state lasers for the purpose of cost-effectively improving their safety and performance. This method is based on the Conditional Value-at-Risk measure (CVaR) and the expected loss exceeding the Value-at-Risk (VaR). We propose a new dynamical-information approach for assessing the risk of radiation damage to laser elements by cosmic radiation. Our approach includes the following steps: laser modeling, modeling of ionizing radiation influences on laser elements, probabilistic risk assessment methods, and risk minimization. For computer simulation of damage processes at microscopic and macroscopic levels the following methods are used: (a) statistical; (b) dynamical; (c) optimization; (d) acceleration modeling; and (e) mathematical modeling of laser functioning. Mathematical models of the influence of space ionizing radiation on laser elements were developed for risk assessment in laser safety analysis. These are so-called 'black box' or 'input-output' models, which seek only to reproduce the behaviour of the system's output in response to changes in its inputs. The model inputs are radiation influences on the laser system and the output parameters are dynamical characteristics of the solid-state laser. Algorithms and software for optimal structure and parameters of
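
    The CVaR and VaR measures named above are straightforward to compute from Monte Carlo loss samples: VaR is a loss quantile and CVaR is the mean loss beyond it. A minimal sketch with an illustrative loss distribution (not the report's laser-damage model):

```python
import numpy as np

def var_cvar(losses, alpha=0.95):
    """Empirical Value-at-Risk and Conditional Value-at-Risk (expected loss
    beyond VaR) at confidence level alpha."""
    losses = np.asarray(losses)
    var = np.quantile(losses, alpha)
    cvar = losses[losses >= var].mean()
    return var, cvar

# illustrative Monte Carlo losses (e.g. simulated degradation of an optical element)
rng = np.random.default_rng(42)
losses = rng.lognormal(mean=0.0, sigma=1.0, size=100_000)
var, cvar = var_cvar(losses, alpha=0.95)
print(f"VaR(95%) = {var:.2f}, CVaR(95%) = {cvar:.2f}")
```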

  3. Systemic risk: the dynamics of model banking systems

    PubMed Central

    May, Robert M.; Arinaminpathy, Nimalan

    2010-01-01

    The recent banking crises have made it clear that increasingly complex strategies for managing risk in individual banks have not been matched by corresponding attention to overall systemic risks. We explore some simple mathematical caricatures for ‘banking ecosystems’, with emphasis on the interplay between the characteristics of individual banks (capital reserves in relation to total assets, etc.) and the overall dynamical behaviour of the system. The results are discussed in relation to potential regulations aimed at reducing systemic risk. PMID:19864264

  4. Systemic risk: the dynamics of model banking systems.

    PubMed

    May, Robert M; Arinaminpathy, Nimalan

    2010-05-06

    The recent banking crises have made it clear that increasingly complex strategies for managing risk in individual banks have not been matched by corresponding attention to overall systemic risks. We explore some simple mathematical caricatures for 'banking ecosystems', with emphasis on the interplay between the characteristics of individual banks (capital reserves in relation to total assets, etc.) and the overall dynamical behaviour of the system. The results are discussed in relation to potential regulations aimed at reducing systemic risk.

  5. SVAT modelling in support to flood risk assessment in Bulgaria

    NASA Astrophysics Data System (ADS)

    Stoyanova, Julia S.; Georgiev, Christo G.

    2013-04-01

    This study explores the benefit that can be drawn from incorporating the diagnosis of initial soil moisture of the top vegetation/soil layer and its anomalies as parameters in support of operational weather forecasting. For that purpose, a 1D vertical numerical land surface scheme, referred to as the Soil Vegetation Transfer Model (‘SVAT_bg’), has been developed to simulate soil-vegetation-atmosphere mass and energy transfer, accounting for local soil/climate features. The model is run daily to estimate soil moisture content, and on this basis a biogeophysical index, the Soil Moisture Availability Index (SMAI) for vegetated land cover, is derived. SMAI is introduced as a measure of the proportion between the energy and water balances and their anomalies under different weather/climate conditions through a 6-level threshold scheme of land surface moistening. To facilitate the use of SMAI as a diagnostic tool for operational forecasting purposes, it is generated on a daily basis and visualised by colour-coded maps covering the main administrative regions of Bulgaria, in combination with a numerical part which indicates the required flood-producing rainfall quantities (specific for each region). In case of overmoistening conditions, the numerical part denotes the rainfall excess above the soil saturation moisture content. The utility of this approach is illustrated in two case studies of severe weather produced by deep convection and a rapid cyclogenesis developed at initial ‘dry’/‘wet’ soil moisture anomalies, respectively. The thermodynamic conditions and space-time structure of the rainfall are analysed by NWP output fields and satellite information. The study contributes to a better definition of the role of vegetation-soil moistening in flood risk forecasting within strong synoptic scale forcing regimes. The utility of the results comes also from the recognition of soil moisture as a meteorological forcing factor, which may affect both severity
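
    A minimal sketch of how a thresholded availability index of this kind could be computed from modelled soil moisture; the wilting-point/field-capacity scaling and the level boundaries below are hypothetical placeholders, not the operational SVAT_bg thresholds.

```python
def smai_level(soil_moisture, wilting_point=0.12, field_capacity=0.32):
    """Map volumetric soil moisture to a six-level availability index (0 = very dry,
    5 = saturated/overmoistened). Boundaries are hypothetical, not SVAT_bg values."""
    frac = (soil_moisture - wilting_point) / (field_capacity - wilting_point)
    bounds = (0.0, 0.2, 0.4, 0.6, 0.8)
    return sum(frac > b for b in bounds)

print(smai_level(0.27))   # -> 4, close to saturation for these placeholder soil constants
```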

  6. Lung Cancer Risk Models for Screening (R package: lcrisks)

    Cancer.gov

    In both the absence and presence of screening, the R package lcrisks calculates individual risks of lung cancer and lung cancer death based on covariates: age, education, sex, race, smoking intensity/duration/quit-years, body mass index, family history of lung cancer, and self-reported emphysema. In the presence of CT screening akin to the NLST (3 yearly screens, 5 years of follow-up), it uses the covariates to estimate the risk of a false-positive CT screen as well as the reduction in risk of lung cancer death and the increase in risk of lung cancer diagnosis in the presence of screening.

  7. Uncertainty of Calculated Risk Estimates for Secondary Malignancies After Radiotherapy

    SciTech Connect

    Kry, Stephen F. . E-mail: sfkry@mdanderson.org; Followill, David; White, R. Allen; Stovall, Marilyn; Kuban, Deborah A.; Salehpour, Mohammad

    2007-07-15

    Purpose: The significance of risk estimates for fatal secondary malignancies caused by out-of-field radiation exposure remains unresolved because the uncertainty in calculated risk estimates has not been established. This work examines the uncertainty in absolute risk estimates and in the ratio of risk estimates between different treatment modalities. Methods and Materials: Clinically reasonable out-of-field doses and calculated risk estimates were taken from the literature for several prostate treatment modalities, including intensity-modulated radiotherapy (IMRT), and were recalculated using the most recent risk model. The uncertainties in this risk model and uncertainties in the linearity of the dose-response model were considered in generating 90% confidence intervals for the uncertainty in the absolute risk estimates and in the ratio of the risk estimates. Results: The absolute risk estimates of fatal secondary malignancy were associated with very large uncertainties, which precluded distinctions between the risks associated with the different treatment modalities considered. However, a much smaller confidence interval exists for the ratio of risk estimates, and this ratio between different treatment modalities may be statistically significant when there is an effective dose equivalent difference of at least 50%. Such a difference may exist between clinically reasonable treatment options, including 6-MV IMRT versus 18-MV IMRT for prostate therapy. Conclusion: The ratio of the risk between different treatment modalities may be significantly different. Consequently risk models and associated risk estimates may be useful and meaningful for evaluating different treatment options. The calculated risk of secondary malignancy should be considered in the selection of an optimal treatment plan.
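
    The paper's central point, that the ratio of risks between modalities is far better constrained than either absolute risk, can be illustrated with a toy Monte Carlo in which the large, shared risk-model uncertainty cancels in the ratio while the small, independent dosimetric uncertainties do not. All numbers below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 100_000

# shared risk-model uncertainty: a risk coefficient (fatal malignancies per Sv),
# lognormal with a wide spread (values are illustrative only)
risk_coeff = rng.lognormal(mean=np.log(0.05), sigma=0.6, size=n)

# out-of-field effective dose equivalents for two modalities, with modest
# independent dosimetric uncertainty and a 50% difference in central values
dose_a = rng.normal(0.10, 0.010, size=n)   # Sv, hypothetical modality A
dose_b = rng.normal(0.15, 0.015, size=n)   # Sv, hypothetical modality B

risk_a, risk_b = risk_coeff * dose_a, risk_coeff * dose_b
ratio = risk_b / risk_a

def ci90(x):
    return np.percentile(x, [5, 95])

print("90% CI absolute risk A:", ci90(risk_a))
print("90% CI absolute risk B:", ci90(risk_b))
print("90% CI risk ratio B/A :", ci90(ratio))   # much narrower: shared model uncertainty cancels
```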

  8. Students' Mental Models with Respect to Flood Risk in the Netherlands

    ERIC Educational Resources Information Center

    Bosschaart, Adwin; Kuiper, Wilmad; van der Schee, Joop

    2015-01-01

    Until now various quantitative studies have shown that adults and students in the Netherlands have low flood risk perceptions. In this study we interviewed fifty 15-year-old students in two different flood prone areas. In order to find out how they think and reason about the risk of flooding, the mental model approach was used. Flood risk turned…

  9. An Ecological Risk Model for Early Childhood Anxiety: The Importance of Early Child Symptoms and Temperament

    ERIC Educational Resources Information Center

    Mian, Nicholas D.; Wainwright, Laurel; Briggs-Gowan, Margaret J.; Carter, Alice S.

    2011-01-01

    Childhood anxiety is impairing and associated with later emotional disorders. Studying risk factors for child anxiety may allow earlier identification of at-risk children for prevention efforts. This study applied an ecological risk model to address how early childhood anxiety symptoms, child temperament, maternal anxiety and depression symptoms,…

  10. An absolute photometric system at 10 and 20 microns

    NASA Technical Reports Server (NTRS)

    Rieke, G. H.; Lebofsky, M. J.; Low, F. J.

    1985-01-01

    Two new direct calibrations at 10 and 20 microns are presented in which terrestrial flux standards are referred to infrared standard stars. These measurements give both good agreement and higher accuracy when compared with previous direct calibrations. As a result, the absolute calibrations at 10 and 20 microns have now been determined with accuracies of 3 and 8 percent, respectively. A variety of absolute calibrations based on extrapolation of stellar spectra from the visible to 10 microns are reviewed. Current atmospheric models of A-type stars underestimate their fluxes by about 10 percent at 10 microns, whereas models of solar-type stars agree well with the direct calibrations. The calibration at 20 microns can probably be determined to about 5 percent by extrapolation from the more accurate result at 10 microns. The photometric system at 10 and 20 microns is updated to reflect the new absolute calibration, to base its zero point directly on the colors of A0 stars, and to improve the accuracy in the comparison of the standard stars.

  11. The importance and attainment of accurate absolute radiometric calibration

    NASA Technical Reports Server (NTRS)

    Slater, P. N.

    1984-01-01

    The importance of accurate absolute radiometric calibration is discussed by reference to the needs of those wishing to validate or use models describing the interaction of electromagnetic radiation with the atmosphere and earth surface features. The in-flight calibration methods used for the Landsat Thematic Mapper (TM) and the Systeme Probatoire d'Observation de la Terre, Haute Resolution visible (SPOT/HRV) systems are described and their limitations discussed. The questionable stability of in-flight absolute calibration methods suggests the use of a radiative transfer program to predict the apparent radiance, at the entrance pupil of the sensor, of a ground site of measured reflectance imaged through a well characterized atmosphere. The uncertainties of such a method are discussed.

  12. Absolute magnitude calibration using trigonometric parallax - Incomplete, spectroscopic samples

    NASA Technical Reports Server (NTRS)

    Ratnatunga, Kavan U.; Casertano, Stefano

    1991-01-01

    A new numerical algorithm is used to calibrate the absolute magnitude of spectroscopically selected stars from their observed trigonometric parallax. This procedure, based on maximum-likelihood estimation, can retrieve unbiased estimates of the intrinsic absolute magnitude and its dispersion even from incomplete samples suffering from selection biases in apparent magnitude and color. It can also make full use of low accuracy and negative parallaxes and incorporate censorship on reported parallax values. Accurate error estimates are derived for each of the fitted parameters. The algorithm allows an a posteriori check of whether the fitted model gives a good representation of the observations. The procedure is described in general and applied to both real and simulated data.

  13. Remote ultrasound palpation for robotic interventions using absolute elastography.

    PubMed

    Schneider, Caitlin; Baghani, Ali; Rohling, Robert; Salcudean, Septimiu

    2012-01-01

    Although robotic surgery has addressed many of the challenges presented by minimally invasive surgery, the lack of haptic feedback and of knowledge of tissue stiffness remains an unsolved problem. This paper presents a system for finding the absolute elastic properties of tissue using a freehand ultrasound scanning technique, which utilizes the da Vinci Surgical robot and a custom 2D ultrasound transducer for intraoperative use. An external exciter creates shear waves in the tissue, and a local frequency estimation method computes the shear modulus. Results are reported for both phantom and in vivo models. This system can be extended to any 6 degree-of-freedom tracking method and any 2D transducer to provide real-time absolute elastic properties of tissue.
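
    The quantitative step behind local frequency estimation is the relation mu = rho * c_s^2 with c_s = f * lambda: the exciter frequency is known, the local shear wavelength is estimated from the imaged wave pattern, and the shear (and, for nearly incompressible tissue, Young's) modulus follows. The values below are illustrative, not from the paper.

```python
rho = 1000.0         # tissue density, kg/m^3 (typical soft-tissue value)
f_excitation = 75.0  # external exciter frequency, Hz (illustrative)
wavelength = 0.02    # locally estimated shear wavelength, m (illustrative)

shear_speed = f_excitation * wavelength   # m/s
shear_modulus = rho * shear_speed ** 2    # Pa
youngs_modulus = 3.0 * shear_modulus      # Pa, for nearly incompressible tissue
print(f"G = {shear_modulus / 1e3:.2f} kPa, E ~ {youngs_modulus / 1e3:.2f} kPa")
```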

  14. Absolute configuration of labdane diterpenoids from Physalis nicandroides.

    PubMed

    Maldonado, Emma; Pérez-Castorena, Ana L; Romero, Yunuen; Martínez, Mahinda

    2015-02-27

    A mixture of the new epimeric labdenetriols 1 and 2 was isolated from the aerial parts of Physalis nicandroides. The structures of 1 and 2, including their absolute configurations, were established by analyses of their spectroscopic data, together with the X-ray diffraction analysis of acetonide 3 and chemical correlation with (-)-(13E)-labd-13-ene-8α,15-diol (6), whose absolute configuration was also confirmed by X-ray analysis of its dibromo derivative 7. The epimeric labdenediols 8 and 9, the known labdanes 6 and 11, and the acylsucroses 12 and 13 were also isolated. Labdanes 6 and 11 showed moderate anti-inflammatory activities in the induced ear edema model.

  15. Oblique-incidence sounder measurements with absolute propagation delay timing

    SciTech Connect

    Daehler, M.

    1990-05-03

    Timing from the Global Positioning System (GPS) has been applied to HF oblique-incidence sounder measurements to produce ionograms whose propagation delay time scale is absolutely calibrated. Such a calibration is useful for interpreting ionograms in terms of the electron density true-height profile of the ionosphere responsible for the propagation. Use of the time variations in the shape of the electron density profile, in conjunction with an HF propagation model, is expected to provide better near-term (1-24 hour) HF propagation forecasts than are available from current updating systems, which use only the MUF. Such a capability may provide the basis for HF frequency management techniques which are more efficient than current methods. Absolute timing and other techniques applicable to automatic extraction of the electron-density profile from an ionogram will be discussed.

  16. Incorporating risk attitude into Markov-process decision models: importance for individual decision making.

    PubMed

    Cher, D J; Miyamoto, J; Lenert, L A

    1997-01-01

    Most decision models published in the medical literature take a risk-neutral perspective. Under risk neutrality, the utility of a gamble is equivalent to its expected value and the marginal utility of living a given unit of time is the same regardless of when it occurs. Most patients, however, are not risk-neutral. Not only does risk aversion affect decision analyses when tradeoffs between short- and long-term survival are involved, it also affects the interpretation of time-tradeoff measures of health-state utility. The proportional time tradeoff under- or overestimates the disutility of an inferior health state, depending on whether the patient is risk-seeking or risk-averse (it is unbiased if the patient is risk-neutral). The authors review how risk attitude with respect to gambles for survival duration can be incorporated into decision models using the framework of risk-adjusted quality-adjusted life years (RA-QALYs). They present a simple extension of this framework that allows RA-QALYs to be calculated for Markov-process decision models. Using a previously published Markov-process model of surgical vs expectant treatment for benign prostatic hypertrophy (BPH), they show how attitude towards risk affects the expected number of QALYs calculated by the model. In this model, under risk neutrality, surgery was the preferred option. Under mild risk aversion, expectant treatment was the preferred option. Risk attitude is an important aspect of preferences that should be incorporated into decision models where one treatment option has upfront risks of morbidity or mortality.
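
    One generic way to fold risk attitude into a Markov cohort model, consistent with (but not identical to) the RA-QALY framework discussed above, is to simulate the distribution of lifetime QALYs and report the certainty equivalent under a constant-absolute-risk-aversion (exponential) utility rather than the plain mean. The transition matrix, utilities, and risk-aversion coefficient below are invented; this is not the published BPH model.

```python
import numpy as np

rng = np.random.default_rng(3)

# toy 3-state Markov model: 0 = well, 1 = sick, 2 = dead (annual cycle)
P = np.array([[0.90, 0.07, 0.03],
              [0.00, 0.85, 0.15],
              [0.00, 0.00, 1.00]])
state_utility = np.array([1.0, 0.6, 0.0])   # QALY weight accrued per year in each state

def simulate_qalys(n_patients=50_000, horizon=40):
    qalys = np.zeros(n_patients)
    state = np.zeros(n_patients, dtype=int)
    for _ in range(horizon):
        qalys += state_utility[state]
        # sample each patient's next state from its row of the transition matrix
        u = rng.random(n_patients)
        cum = np.cumsum(P[state], axis=1)
        state = (u[:, None] > cum).sum(axis=1)
    return qalys

qalys = simulate_qalys()
gamma = 0.05                                  # coefficient of absolute risk aversion (illustrative)
expected_qalys = qalys.mean()                 # risk-neutral value
certainty_equiv = -np.log(np.mean(np.exp(-gamma * qalys))) / gamma  # risk-adjusted value
print(f"E[QALY] = {expected_qalys:.2f}, risk-adjusted (CE) = {certainty_equiv:.2f}")
```

    Under risk aversion the certainty equivalent falls below the expected QALYs, which is exactly the mechanism by which a treatment with upfront mortality or morbidity risk can lose its advantage over expectant management.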

  17. Comparative vs. Absolute Judgments of Trait Desirability

    ERIC Educational Resources Information Center

    Hofstee, Willem K. B.

    1970-01-01

    Reversals of trait desirability are studied. Terms indicating conservative behavior appeared to be judged relatively desirable in comparative judgement, while traits indicating dynamic and expansive behavior benefited from absolute judgement. The reversal effect was shown to be a general one, i.e. reversals were not dependent upon the specific…

  18. New Techniques for Absolute Gravity Measurements.

    DTIC Science & Technology

    1983-01-07

    Hammond, J.A. (1978) Bollettino Di Geofisica Teorica ed Applicata, Vol. XX; Hammond, J.A., and Iliff, R.L. (1979) The AFGL absolute gravity system ... International Gravimetric Bureau, No. L:I-43.

  19. An Absolute Electrometer for the Physics Laboratory

    ERIC Educational Resources Information Center

    Straulino, S.; Cartacci, A.

    2009-01-01

    A low-cost, easy-to-use absolute electrometer is presented: two thin metallic plates and an electronic balance, usually available in a laboratory, are used. We report on the very good performance of the device that allows precise measurements of the force acting between two charged plates. (Contains 5 footnotes, 2 tables, and 6 figures.)

  20. Stimulus Probability Effects in Absolute Identification

    ERIC Educational Resources Information Center

    Kent, Christopher; Lamberts, Koen

    2016-01-01

    This study investigated the effect of stimulus presentation probability on accuracy and response times in an absolute identification task. Three schedules of presentation were used to investigate the interaction between presentation probability and stimulus position within the set. Data from individual participants indicated strong effects of…

  1. Absolute Positioning Using the Global Positioning System

    DTIC Science & Technology

    1994-04-01

    The Global Positioning System (GPS) has become a useful tool in providing relative survey ... Includes the development of a low-cost navigator for wheeled vehicles. The technique of absolute or point positioning involves the use of a single Global Positioning System (GPS) receiver to determine the three-dimensional

  2. The development of posterior probability models in risk-based integrity modeling.

    PubMed

    Thodi, Premkumar N; Khan, Faisal I; Haddara, Mahmoud R

    2010-03-01

    There is a need for accurate modeling of the mechanisms causing material degradation of equipment in process installations, to ensure the safety and reliability of the equipment. Degradation mechanisms are stochastic processes. They can be best described using risk-based approaches. Risk-based integrity assessment quantifies the level of risk to which the individual components are subjected and provides means to mitigate them in a safe and cost-effective manner. The uncertainty and variability in structural degradation can be best modeled by probability distributions. Prior probability models provide an initial description of the degradation mechanisms. As more inspection data become available, these prior probability models can be revised to obtain posterior probability models, which represent the current system and can be used to predict future failures. In this article, a rejection sampling-based Metropolis-Hastings (M-H) algorithm is used to develop posterior distributions. The M-H algorithm is a Markov chain Monte Carlo algorithm used to generate a sequence of posterior samples without actually knowing the normalizing constant. Ignoring the transient samples in the generated Markov chain, the steady-state samples are rejected or accepted based on an acceptance criterion. To validate the estimated parameters of the posterior models, the analytical Laplace approximation method is used to compute the integrals involved in the posterior function. Results of the M-H algorithm and Laplace approximations are compared with conjugate-pair estimations of known prior and likelihood combinations. The M-H algorithm provides better results and hence it is used for posterior development of the selected priors for corrosion and cracking.
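
    A minimal random-walk Metropolis-Hastings sketch of the prior-to-posterior updating described above, applied to a corrosion (wall-loss) rate with invented inspection data, an assumed lognormal prior, and a Gaussian measurement-error likelihood. It illustrates the mechanics (no normalizing constant, burn-in discarded, accept/reject on the posterior ratio) rather than the article's specific priors.

```python
import numpy as np

rng = np.random.default_rng(0)

# inspection data: observed wall loss (mm) after known exposure times (years), illustrative
times = np.array([2.0, 4.0, 6.0, 8.0])
wall_loss = np.array([0.21, 0.39, 0.64, 0.80])
sigma_meas = 0.05     # assumed measurement noise (mm)

def log_prior(rate):
    # lognormal prior on the corrosion rate (mm/year), illustrative hyperparameters
    if rate <= 0:
        return -np.inf
    return -0.5 * ((np.log(rate) - np.log(0.08)) / 0.5) ** 2 - np.log(rate)

def log_likelihood(rate):
    return -0.5 * np.sum(((wall_loss - rate * times) / sigma_meas) ** 2)

def log_posterior(rate):
    return log_prior(rate) + log_likelihood(rate)

# random-walk Metropolis-Hastings: the normalizing constant is never needed
n_iter, step = 20_000, 0.01
samples = np.empty(n_iter)
rate = 0.08
lp = log_posterior(rate)
for i in range(n_iter):
    proposal = rate + step * rng.normal()
    lp_prop = log_posterior(proposal)
    if np.log(rng.random()) < lp_prop - lp:    # accept/reject on the posterior ratio
        rate, lp = proposal, lp_prop
    samples[i] = rate

posterior = samples[5_000:]                    # discard transient (burn-in) samples
print(f"posterior corrosion rate: {posterior.mean():.3f} +/- {posterior.std():.3f} mm/yr")
```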

  3. Absolute Gravity Measurements with the FG5#215 in Czech Republic, Slovakia and Hungary

    NASA Astrophysics Data System (ADS)

    Pálinkás, V.; Kostelecký, J.; Lederer, M.

    2009-04-01

    Since 2001, the absolute gravimeter FG5#215 has been used for modernization of the national gravity networks in the Czech Republic, Slovakia and Hungary. Altogether 37 absolute sites were measured at least once. For 29 of these sites, absolute gravity had been determined prior to the FG5#215 measurements by other accurate absolute gravimeters (FG5 or JILA-g). Differences between the gravity results, which reach up to 25 microgal, are caused by random and systematic measurement errors, variations of environmental effects (mainly hydrological effects) and by geodynamics. The set of achieved differences is analyzed for potential hydrological effects based on global hydrology models and for systematic errors of instrumental origin. Systematic instrumental errors are evaluated in the context of international comparison measurements of absolute gravimeters in Sèvres and Walferdange, organized by the Bureau International des Poids et Mesures and the European Center for Geodynamics and Seismology, respectively.

  4. Performance of default risk model with barrier option framework and maximum likelihood estimation: Evidence from Taiwan

    NASA Astrophysics Data System (ADS)

    Chou, Heng-Chih; Wang, David

    2007-11-01

    We investigate the performance of a default risk model based on the barrier option framework with maximum likelihood estimation. We provide empirical validation of the model by showing that implied default barriers are statistically significant for a sample of construction firms in Taiwan over the period 1994-2004. We find that our model dominates the commonly adopted alternatives: the Merton model, the Z-score model and the ZETA model. Moreover, we test the n-year-ahead prediction performance of the model and find evidence that the prediction accuracy of the model improves as the forecast horizon decreases. Finally, we assess the effect of estimated default risk on equity returns and find that default risk is able to explain equity returns and that default risk is a variable worth considering in asset-pricing tests, above and beyond size and book-to-market.
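
    For orientation, the Merton model cited above as a benchmark maps asset value, debt, drift, and asset volatility to a distance to default and a default probability; the barrier-option framework extends this by letting default occur whenever assets hit an estimated barrier before maturity. A sketch of the simpler Merton benchmark with placeholder inputs (not the authors' barrier-option likelihood):

```python
import numpy as np
from scipy.stats import norm

def merton_default_probability(V, D, mu, sigma, T=1.0):
    """Merton-style physical default probability over horizon T:
    default if the asset value falls below the debt face value at T."""
    dd = (np.log(V / D) + (mu - 0.5 * sigma ** 2) * T) / (sigma * np.sqrt(T))
    return norm.cdf(-dd), dd

# placeholder firm inputs: asset value, debt face value, asset drift and volatility
pd_1y, dd = merton_default_probability(V=120.0, D=100.0, mu=0.06, sigma=0.25, T=1.0)
print(f"distance to default = {dd:.2f}, 1-year default probability = {pd_1y:.3%}")
```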

  5. EVALUATING RISK IN OLDER ADULTS USING PHYSIOLOGICALLY BASED PHARMACOKINETIC MODELS

    EPA Science Inventory

    The rapid growth in the number of older Americans has many implications for public health, including the need to better understand the risks posed by environmental exposures to older adults. An important element for evaluating risk is the understanding of the doses of environment...

  6. PIER: An Inclusive Model for At-Risk Students.

    ERIC Educational Resources Information Center

    Saint-Laurent, Lise

    This study compared the inclusive PIER program (French acronym for a program of intervention with at-risk students), implemented in 13 third-grade classes in Quebec (Canada), with 13 third-grade classes using a traditional resource-room pull-out program over the course of a full school year. A total of 165 at-risk and 441 non…

  7. Risk Management in Australian Science Education: A Model for Practice.

    ERIC Educational Resources Information Center

    Forlin, Peter

    1995-01-01

    Provides a framework that incorporates the diverse elements of risk management in science education into a systematic process and is adaptable to changing circumstances. Appendix contains risk management checklist for management, laboratory and storage, extreme biological and chemical hazards, protective equipment, waste disposal, electrical…

  8. Absolute Radiation Thermometry in the NIR

    NASA Astrophysics Data System (ADS)

    Bünger, L.; Taubert, R. D.; Gutschwager, B.; Anhalt, K.; Briaudeau, S.; Sadli, M.

    2017-04-01

    A near infrared (NIR) radiation thermometer (RT) for temperature measurements in the range from 773 K up to 1235 K was characterized and calibrated in terms of the "Mise en Pratique for the definition of the Kelvin" (MeP-K) by measuring its absolute spectral radiance responsivity. Using Planck's law of thermal radiation allows the direct measurement of the thermodynamic temperature independently of any ITS-90 fixed-point. To determine the absolute spectral radiance responsivity of the radiation thermometer in the NIR spectral region, an existing PTB monochromator-based calibration setup was upgraded with a supercontinuum laser system (0.45 μm to 2.4 μm) resulting in a significantly improved signal-to-noise ratio. The RT was characterized with respect to its nonlinearity, size-of-source effect, distance effect, and the consistency of its individual temperature measuring ranges. To further improve the calibration setup, a new tool for the aperture alignment and distance measurement was developed. Furthermore, the diffraction correction as well as the impedance correction of the current-to-voltage converter is considered. The calibration scheme and the corresponding uncertainty budget of the absolute spectral responsivity are presented. A relative standard uncertainty of 0.1 % (k=1) for the absolute spectral radiance responsivity was achieved. The absolute radiometric calibration was validated at four temperature values with respect to the ITS-90 via a variable temperature heatpipe blackbody (773 K to 1235 K) and at a gold fixed-point blackbody radiator (1337.33 K).
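
    The MeP-K route to thermodynamic temperature rests on inverting Planck's law for the measured spectral radiance once the detector's absolute spectral responsivity is known. A monochromatic sketch (ignoring instrument bandwidth, emissivity, and geometric factors, with an illustrative wavelength):

```python
import numpy as np

h = 6.62607015e-34   # Planck constant, J s
c = 2.99792458e8     # speed of light, m/s
k = 1.380649e-23     # Boltzmann constant, J/K

def planck_radiance(wavelength, T):
    """Spectral radiance L(lambda, T) in W m^-3 sr^-1 (per metre of wavelength)."""
    return (2 * h * c**2 / wavelength**5) / np.expm1(h * c / (wavelength * k * T))

def temperature_from_radiance(wavelength, L):
    """Invert Planck's law for thermodynamic temperature (monochromatic approximation)."""
    return (h * c / (wavelength * k)) / np.log1p(2 * h * c**2 / (wavelength**5 * L))

lam = 1.0e-6                      # 1 um, within the NIR band discussed above (illustrative)
T_true = 1000.0                   # K
L = planck_radiance(lam, T_true)  # what an absolutely calibrated RT would measure
print(f"recovered T = {temperature_from_radiance(lam, L):.3f} K")
```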

  9. What's the Risk? A Simple Approach for Estimating Adjusted Risk Measures from Nonlinear Models Including Logistic Regression

    PubMed Central

    Kleinman, Lawrence C; Norton, Edward C

    2009-01-01

    Objective To develop and validate a general method (called regression risk analysis) to estimate adjusted risk measures from logistic and other nonlinear multiple regression models. We show how to estimate standard errors for these estimates. These measures could supplant various approximations (e.g., adjusted odds ratio [AOR]) that may diverge, especially when outcomes are common. Study Design Regression risk analysis estimates were compared with internal standards as well as with Mantel–Haenszel estimates, Poisson and log-binomial regressions, and a widely used (but flawed) equation to calculate adjusted risk ratios (ARR) from AOR. Data Collection Data sets produced using Monte Carlo simulations. Principal Findings Regression risk analysis accurately estimates ARR and differences directly from multiple regression models, even when confounders are continuous, distributions are skewed, outcomes are common, and effect size is large. It is statistically sound and intuitive, and has properties favoring it over other methods in many cases. Conclusions Regression risk analysis should be the new standard for presenting findings from multiple regression analysis of dichotomous outcomes for cross-sectional, cohort, and population-based case–control stud
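
    The core of regression risk analysis, predicting each subject's risk with the exposure set to each level and then averaging, can be sketched as marginal standardization after a logistic fit. The simulated cohort and variable names below are invented, and standard errors (which the paper derives directly) could instead be approximated here by bootstrapping the whole procedure.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 20_000

# simulated cohort: one binary exposure, one continuous confounder, common outcome
confounder = rng.normal(size=n)
exposure = rng.binomial(1, 1 / (1 + np.exp(-0.5 * confounder)))
logit_p = -0.5 + 0.7 * exposure + 0.6 * confounder
outcome = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

X = np.column_stack([exposure, confounder])
model = LogisticRegression(C=1e6).fit(X, outcome)   # large C: effectively unpenalized

# marginal standardization: predict everyone as exposed, then as unexposed, and average
X_exposed = np.column_stack([np.ones(n), confounder])
X_unexposed = np.column_stack([np.zeros(n), confounder])
risk_exposed = model.predict_proba(X_exposed)[:, 1].mean()
risk_unexposed = model.predict_proba(X_unexposed)[:, 1].mean()

arr = risk_exposed / risk_unexposed     # adjusted risk ratio
ard = risk_exposed - risk_unexposed     # adjusted risk difference
aor = np.exp(model.coef_[0][0])         # adjusted odds ratio, shown for contrast
print(f"ARR = {arr:.2f}, ARD = {ard:.3f}, AOR = {aor:.2f}")
```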