Bonner, Carissa; Jansen, Jesse; McKinn, Shannon; Irwig, Les; Doust, Jenny; Glasziou, Paul; McCaffery, Kirsten
2014-05-29
Cardiovascular disease (CVD) prevention guidelines encourage assessment of absolute CVD risk - the probability of a CVD event within a fixed time period, based on the most predictive risk factors. However, few General Practitioners (GPs) use absolute CVD risk consistently, and communication difficulties have been identified as a barrier to changing practice. This study aimed to explore GPs' descriptions of their CVD risk communication strategies, including the role of absolute risk. Semi-structured interviews were conducted with a purposive sample of 25 GPs in New South Wales, Australia. Transcribed audio-recordings were thematically coded, using the Framework Analysis method to ensure rigour. GPs used absolute CVD risk within three different communication strategies: 'positive', 'scare tactic', and 'indirect'. A 'positive' strategy, which aimed to reassure and motivate, was used for patients with low risk, determination to change lifestyle, and some concern about CVD risk. Absolute risk was used to show how they could reduce risk. A 'scare tactic' strategy was used for patients with high risk, lack of motivation, and a dismissive attitude. Absolute risk was used to 'scare' them into taking action. An 'indirect' strategy, where CVD risk was not the main focus, was used for patients with low risk but some lifestyle risk factors, high anxiety, high resistance to change, or difficulty understanding probabilities. Non-quantitative absolute risk formats were found to be helpful in these situations. This study demonstrated how GPs use three different communication strategies to address the issue of CVD risk, depending on their perception of patient risk, motivation and anxiety. Absolute risk played a different role within each strategy. Providing GPs with alternative ways of explaining absolute risk, in order to achieve different communication aims, may improve their use of absolute CVD risk assessment in practice.
2014-01-01
Background Cardiovascular disease (CVD) prevention guidelines encourage assessment of absolute CVD risk - the probability of a CVD event within a fixed time period, based on the most predictive risk factors. However, few General Practitioners (GPs) use absolute CVD risk consistently, and communication difficulties have been identified as a barrier to changing practice. This study aimed to explore GPs’ descriptions of their CVD risk communication strategies, including the role of absolute risk. Methods Semi-structured interviews were conducted with a purposive sample of 25 GPs in New South Wales, Australia. Transcribed audio-recordings were thematically coded, using the Framework Analysis method to ensure rigour. Results GPs used absolute CVD risk within three different communication strategies: ‘positive’, ‘scare tactic’, and ‘indirect’. A ‘positive’ strategy, which aimed to reassure and motivate, was used for patients with low risk, determination to change lifestyle, and some concern about CVD risk. Absolute risk was used to show how they could reduce risk. A ‘scare tactic’ strategy was used for patients with high risk, lack of motivation, and a dismissive attitude. Absolute risk was used to ‘scare’ them into taking action. An ‘indirect’ strategy, where CVD risk was not the main focus, was used for patients with low risk but some lifestyle risk factors, high anxiety, high resistance to change, or difficulty understanding probabilities. Non-quantitative absolute risk formats were found to be helpful in these situations. Conclusions This study demonstrated how GPs use three different communication strategies to address the issue of CVD risk, depending on their perception of patient risk, motivation and anxiety. Absolute risk played a different role within each strategy. Providing GPs with alternative ways of explaining absolute risk, in order to achieve different communication aims, may improve their use of absolute CVD risk assessment in practice. PMID:24885409
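For readers less familiar with the absolute-risk framing these guidelines use, here is a minimal Python sketch (all numbers hypothetical, not taken from the study) of how a fixed-period absolute risk, a relative risk reduction, the absolute risk reduction, and the number needed to treat relate to one another:

```python
# Illustrative sketch only (not from the study): expressing a 5-year absolute
# CVD risk and the effect of a treatment described by a relative risk
# reduction. All numbers below are hypothetical.

def treated_absolute_risk(baseline_risk: float, relative_risk_reduction: float) -> float:
    """Absolute risk after applying a relative risk reduction to a baseline risk."""
    return baseline_risk * (1.0 - relative_risk_reduction)

baseline = 0.15          # hypothetical 5-year absolute CVD risk (15%)
rrr = 0.25               # hypothetical relative risk reduction from treatment

treated = treated_absolute_risk(baseline, rrr)
arr = baseline - treated                 # absolute risk reduction
nnt = 1.0 / arr                          # number needed to treat

print(f"Baseline absolute risk:  {baseline:.1%}")
print(f"Treated absolute risk:   {treated:.2%}")
print(f"Absolute risk reduction: {arr:.2%}, NNT ≈ {nnt:.0f}")
```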
Little, Mark P; McElvenny, Damien M
2017-02-01
There are well-known associations of ionizing radiation with female breast cancer, and emerging evidence also for male breast cancer. In the United Kingdom, female breast cancer following occupational radiation exposure is among that set of cancers eligible for state compensation, and consideration is currently being given to an extension to include male breast cancer. We compare radiation-associated excess relative and absolute risks of male and female breast cancers. Breast cancer incidence and mortality data in the Japanese atomic-bomb survivors were analyzed using relative and absolute risk models via Poisson regression. We observed significant (p ≤ 0.01) dose-related excess risk for male breast cancer incidence and mortality. For incidence and mortality data, there are elevations by factors of approximately 15 and 5, respectively, of relative risk for male compared with female breast cancer incidence, the former borderline significant (p = 0.050). In contrast, for incidence and mortality data, there are elevations by factors of approximately 20 and 10, respectively, of female absolute risk compared with male, both statistically significant (p < 0.001). There are no indications of differences between the sexes in age/time-since-exposure/age-at-exposure modifications to the relative or absolute excess risk. The probability of causation of male breast cancer following radiation exposure exceeds by at least a factor of 5 that of many other malignancies. There is evidence of much higher radiation-associated relative risk for male than for female breast cancer, although absolute excess risks for males are much less than for females. However, the small number of male cases and deaths suggests a degree of caution in interpretation of this finding. Citation: Little MP, McElvenny DM. 2017. Male breast cancer incidence and mortality risk in the Japanese atomic bomb survivors - differences in excess relative and absolute risk from female breast cancer. Environ Health Perspect 125:223-229; http://dx.doi.org/10.1289/EHP151.
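The contrast the authors draw between excess relative risk and excess absolute risk can be made concrete with a toy Python sketch; the baseline rates and per-Gy coefficients below are hypothetical, chosen only to show how a rare baseline (male breast cancer) yields a large relative but small absolute excess:

```python
# Toy illustration (hypothetical numbers) of the two model forms contrasted
# above: an excess relative risk (ERR) model multiplies the baseline rate,
# while an excess absolute risk (EAR) model adds to it.

def rate_err(baseline_rate: float, dose_gy: float, err_per_gy: float) -> float:
    """Excess relative risk model: rate = baseline * (1 + ERR_per_Gy * dose)."""
    return baseline_rate * (1.0 + err_per_gy * dose_gy)

def rate_ear(baseline_rate: float, dose_gy: float, ear_per_gy: float) -> float:
    """Excess absolute risk model: rate = baseline + EAR_per_Gy * dose."""
    return baseline_rate + ear_per_gy * dose_gy

# A rare baseline (male breast cancer) with a large ERR/Gy still gives a small
# absolute excess; a common baseline (female) with a smaller ERR/Gy gives a
# larger absolute excess -- the pattern described in the abstract.
male_baseline, female_baseline = 1e-5, 1e-3      # cases per person-year, illustrative
male_err, female_err = 10.0, 1.0                 # illustrative ERR per Gy
dose = 1.0                                       # Gy

male_excess = rate_err(male_baseline, dose, male_err) - male_baseline
female_excess = rate_err(female_baseline, dose, female_err) - female_baseline
print(f"male excess:   {male_excess:.1e} per person-year "
      f"(same rate via EAR form: {rate_ear(male_baseline, dose, male_excess / dose):.1e})")
print(f"female excess: {female_excess:.1e} per person-year")
```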
Comparison of evidence on harms of medical interventions in randomized and nonrandomized studies
Papanikolaou, Panagiotis N.; Christidi, Georgia D.; Ioannidis, John P.A.
2006-01-01
Background Information on major harms of medical interventions comes primarily from epidemiologic studies performed after licensing and marketing. Comparison with data from large-scale randomized trials is occasionally feasible. We compared evidence from randomized trials with that from epidemiologic studies to determine whether they give different estimates of risk for important harms of medical interventions. Methods We targeted well-defined, specific harms of various medical interventions for which data were already available from large-scale randomized trials (> 4000 subjects). Nonrandomized studies involving at least 4000 subjects addressing these same harms were retrieved through a search of MEDLINE. We compared the relative risks and absolute risk differences for specific harms in the randomized and nonrandomized studies. Results Eligible nonrandomized studies were found for 15 harms for which data were available from randomized trials addressing the same harms. Comparisons of relative risks between the study types were feasible for 13 of the 15 topics, and of absolute risk differences for 8 topics. The estimated increase in relative risk differed more than 2-fold between the randomized and nonrandomized studies for 7 (54%) of the 13 topics; the estimated increase in absolute risk differed more than 2-fold for 5 (62%) of the 8 topics. There was no clear predilection for randomized or nonrandomized studies to estimate greater relative risks, but usually (75% [6/8]) the randomized trials estimated larger absolute excess risks of harm than the nonrandomized studies did. Interpretation Nonrandomized studies are often conservative in estimating absolute risks of harms. It would be useful to compare and scrutinize the evidence on harms obtained from both randomized and nonrandomized studies. PMID:16505459
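Here is a hedged Python sketch of the two comparison measures used above, relative risk and absolute risk difference, computed from hypothetical 2x2 counts for a randomized and a nonrandomized study, including the "differs more than 2-fold" check:

```python
# Hypothetical counts only, to illustrate the comparison described above.

def risk(events: int, total: int) -> float:
    return events / total

def relative_risk(e1, n1, e0, n0) -> float:
    return risk(e1, n1) / risk(e0, n0)

def absolute_risk_difference(e1, n1, e0, n0) -> float:
    return risk(e1, n1) - risk(e0, n0)

# Hypothetical counts (exposed vs control) from a randomized and a nonrandomized study
rct = dict(e1=60, n1=5000, e0=30, n0=5000)
obs = dict(e1=45, n1=20000, e0=30, n0=20000)

rr_rct, rr_obs = relative_risk(**rct), relative_risk(**obs)
ard_rct, ard_obs = absolute_risk_difference(**rct), absolute_risk_difference(**obs)

print(f"RR:  RCT {rr_rct:.2f} vs observational {rr_obs:.2f}")
print(f"ARD: RCT {ard_rct:.4f} vs observational {ard_obs:.4f}")
print("ARD estimates differ >2-fold:", max(ard_rct, ard_obs) / min(ard_rct, ard_obs) > 2)
```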
Little, Mark P.; McElvenny, Damien M.
2016-01-01
Background: There are well-known associations of ionizing radiation with female breast cancer, and emerging evidence also for male breast cancer. In the United Kingdom, female breast cancer following occupational radiation exposure is among that set of cancers eligible for state compensation and consideration is currently being given to an extension to include male breast cancer. Objectives: We compare radiation-associated excess relative and absolute risks of male and female breast cancers. Methods: Breast cancer incidence and mortality data in the Japanese atomic-bomb survivors were analyzed using relative and absolute risk models via Poisson regression. Results: We observed significant (p ≤ 0.01) dose-related excess risk for male breast cancer incidence and mortality. For incidence and mortality data, there are elevations by factors of approximately 15 and 5, respectively, of relative risk for male compared with female breast cancer incidence, the former borderline significant (p = 0.050). In contrast, for incidence and mortality data, there are elevations by factors of approximately 20 and 10, respectively, of female absolute risk compared with male, both statistically significant (p < 0.001). There are no indications of differences between the sexes in age/time-since-exposure/age-at-exposure modifications to the relative or absolute excess risk. The probability of causation of male breast cancer following radiation exposure exceeds by at least a factor of 5 that of many other malignancies. Conclusions: There is evidence of much higher radiation-associated relative risk for male than for female breast cancer, although absolute excess risks for males are much less than for females. However, the small number of male cases and deaths suggests a degree of caution in interpretation of this finding. Citation: Little MP, McElvenny DM. 2017. Male breast cancer incidence and mortality risk in the Japanese atomic bomb survivors – differences in excess relative and absolute risk from female breast cancer. Environ Health Perspect 125:223–229; http://dx.doi.org/10.1289/EHP151 PMID:27286002
Little, Mark P.; McElvenny, Damien M.
2016-06-10
There are well-known associations of ionizing radiation with female breast cancer, and emerging evidence also for male breast cancer. In the UK, female breast cancer following occupational radiation exposure is among that set of cancers eligible for state compensation and consideration is currently being given to an extension to include male breast cancer. The objectives here, compare radiation-associated excess relative and absolute risks of male and female breast cancers. Breast cancer incidence and mortality data in the Japanese atomic-bomb survivors were analyzed using relative and absolute risk models via Poisson regression. As a result, we observed significant ( p≤ 0.01)more » dose-related excess risk for male breast cancer incidence and mortality. For incidence and mortality data, there are approximate 15-fold and 5- fold elevations, respectively, of relative risk for male compared with female breast cancer incidence, the former borderline significant (p = 0.050). In contrast, for incidence and mortality data there are approximate 20-fold and 10-fold elevations, respectively, of female absolute risk compared with male, both statistically significant (p < 0.001). There are no indications of differences between the sexes in age/time-since-exposure/age-at-exposure modifications to the relative or absolute excess risk. The probability of causation of male breast cancer following radiation exposure exceeds by at least 5-fold that of many other malignancies. In conclusion, there is evidence of much higher radiation-associated relative risk for male than for female breast cancer, although absolute excess risks for males are much less than for females. However, the small number of male cases and deaths suggests a degree of caution in interpretation of this finding.« less
Staerk, L; Gerds, T A; Lip, G Y H; Ozenne, B; Bonde, A N; Lamberts, M; Fosbøl, E L; Torp-Pedersen, C; Gislason, G H; Olesen, J B
2018-01-01
Comparative data of non-vitamin K antagonist oral anticoagulants (NOAC) are lacking in patients with atrial fibrillation (AF). We compared effectiveness and safety of standard and reduced dose NOAC in AF patients. Using Danish nationwide registries, we included all oral anticoagulant-naïve AF patients who initiated NOAC treatment (2012-2016). Outcome-specific and mortality-specific multiple Cox regressions were combined to compute average treatment effects as 1-year standardized differences in stroke and bleeding risks (g-formula). Amongst 31 522 AF patients, the distribution of NOAC/dose was as follows: dabigatran standard dose (22.4%), dabigatran reduced dose (14.0%), rivaroxaban standard dose (21.8%), rivaroxaban reduced dose (6.7%), apixaban standard dose (22.9%), and apixaban reduced dose (12.2%). The 1-year standardized absolute risks of stroke/thromboembolism were 1.73-1.98% and 2.51-2.78% with standard and reduced NOAC dose, respectively, without statistically significant differences between NOACs for a given dose level. Comparing standard doses, the 1-year standardized absolute risk (95% CI) for major bleeding was for rivaroxaban 2.78% (2.42-3.17%); corresponding absolute risk differences (95% CI) were for dabigatran -0.93% (-1.45% to -0.38%) and apixaban, -0.54% (-0.99% to -0.05%). The results for major bleeding were similar for reduced NOAC dose. The 1-year standardized absolute risk (95% CI) for intracranial bleeding was for standard dose dabigatran 0.19% (0.22-0.50%); corresponding absolute risk differences (95% CI) were for rivaroxaban 0.23% (0.06-0.41%) and apixaban, 0.18% (0.01-0.34%). Standard and reduced dose NOACs, respectively, showed no significant risk difference for associated stroke/thromboembolism. Rivaroxaban was associated with higher bleeding risk compared with dabigatran and apixaban and dabigatran was associated with lower intracranial bleeding risk compared with rivaroxaban and apixaban. © 2017 The Association for the Publication of the Journal of Internal Medicine.
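The g-formula standardization described above can be illustrated with a toy Python sketch: predict each patient's 1-year risk under each treatment from some risk model, average over the cohort, and difference the standardized risks. The risk function and cohort below are invented stand-ins, not the authors' Cox models:

```python
# Sketch of g-formula standardization: the risk model here is a toy stand-in.
import random

random.seed(1)
cohort = [{"age": random.randint(55, 90), "prior_bleed": random.random() < 0.1}
          for _ in range(1000)]

def one_year_bleeding_risk(patient, treatment: str) -> float:
    """Hypothetical 1-year major-bleeding risk; treatment effects are made up."""
    base = 0.015 + 0.0004 * (patient["age"] - 55) + (0.02 if patient["prior_bleed"] else 0.0)
    multiplier = {"rivaroxaban": 1.4, "dabigatran": 1.0, "apixaban": 1.1}[treatment]
    return min(base * multiplier, 1.0)

def standardized_risk(treatment: str) -> float:
    """Average predicted risk if everyone in the cohort received this treatment."""
    return sum(one_year_bleeding_risk(p, treatment) for p in cohort) / len(cohort)

riva, dabi = standardized_risk("rivaroxaban"), standardized_risk("dabigatran")
print(f"standardized 1-year risk difference (dabigatran - rivaroxaban): {dabi - riva:+.4f}")
```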
Murray, Louise; Mason, Joshua; Henry, Ann M; Hoskin, Peter; Siebert, Frank-Andre; Venselaar, Jack; Bownes, Peter
2016-08-01
To estimate the risks of radiation-induced rectal and bladder cancers following low dose rate (LDR) and high dose rate (HDR) brachytherapy as monotherapy for localised prostate cancer and compare to external beam radiotherapy techniques. LDR and HDR brachytherapy monotherapy plans were generated for three prostate CT datasets. Second cancer risks were assessed using Schneider's concept of organ equivalent dose. LDR risks were assessed according to a mechanistic model and a bell-shaped model. HDR risks were assessed according to a bell-shaped model. Relative risks and excess absolute risks were estimated and compared to external beam techniques. Excess absolute risks of second rectal or bladder cancer were low for both LDR (irrespective of the model used for calculation) and HDR techniques. Average excess absolute risks of second rectal and bladder cancer for LDR brachytherapy were 0.71 and 0.84 per 10,000 person-years (PY), respectively, according to the mechanistic model, and 0.47 and 0.78 per 10,000 PY, respectively, according to the bell-shaped model. For HDR, the average excess absolute risks for second rectal and bladder cancers were 0.74 and 1.62 per 10,000 PY respectively. The absolute differences between techniques were very low and clinically irrelevant. Compared to external beam prostate radiotherapy techniques, LDR and HDR brachytherapy resulted in the lowest risks of second rectal and bladder cancer. This study shows both LDR and HDR brachytherapy monotherapy result in low estimated risks of radiation-induced rectal and bladder cancer. LDR resulted in lower bladder cancer risks than HDR, and lower or similar risks of rectal cancer. In absolute terms these differences between techniques were very small. Compared to external beam techniques, second rectal and bladder cancer risks were lowest for brachytherapy. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
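A simplified Python sketch of the organ-equivalent-dose (OED) idea underlying these estimates: average a dose-response function over a dose-volume histogram. The bell-shaped response is written here in the reduced form D*exp(-alpha*D), and the DVH bins and alpha value are hypothetical; the published Schneider models include fractionation terms omitted for brevity:

```python
# Simplified OED sketch: volume-weighted average of a dose-response function
# over hypothetical dose-volume histogram (DVH) bins.
import math

def oed(dose_volume_histogram, response) -> float:
    """OED = volume-weighted mean of response(dose) over the DVH bins."""
    total_volume = sum(v for _, v in dose_volume_histogram)
    return sum(v * response(d) for d, v in dose_volume_histogram) / total_volume

def linear_red(d: float) -> float:
    """Linear response, used for low doses / out-of-field organs."""
    return d

def bell_red(d: float, alpha: float = 0.06) -> float:
    """Bell-shaped response: turns over at high dose (cell kill); alpha is illustrative."""
    return d * math.exp(-alpha * d)

# Hypothetical rectal DVH bins: (dose in Gy, fractional volume)
dvh = [(5, 0.40), (20, 0.30), (50, 0.20), (70, 0.10)]

print("OED (linear):     ", round(oed(dvh, linear_red), 2), "Gy")
print("OED (bell-shaped):", round(oed(dvh, bell_red), 2), "Gy")
# Excess absolute risk estimates then scale with OED times an organ-specific
# risk coefficient (per 10,000 person-years per Gy), modified by age and sex.
```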
Dobrijević, Lj J; Ljubić, Aleksandar; Sovilj, Mirjana; Ribarić-Jankes, Ksenija; Miković, Zeljko; Cerović, Nikola
2009-10-01
To examine fetal auditory perception in low- and high-risk pregnancies in the period from 27 to 31 weeks gestational age, with the aim of establishing diagnostic parameters for prenatal detection of the degree of hearing development in a fetus. The method of prenatal hearing screening (PHS) was applied to 80 women divided into two groups: a control group (C=22) of pregnant women with low-risk pregnancies, and an experimental group (E=58) of pregnant women with high-risk pregnancies (pregnancies with a diagnosis of preterm delivery, hypertension and/or intrauterine growth restriction (IUGR), or diabetes). PHS was applied in the period from 27 to 31 weeks gestational age. Brain circulation changes in the fetal middle cerebral artery (MCA) caused by a defined sound stimulus, as the indicator of fetal auditory reactions, were registered on a Doppler ultrasound apparatus. After visualization of the MCA, a sound stimulus was delivered. The stimulus consisted of one defined, digitally produced sound with an intensity of 90 dB, a frequency range of 1500-4500 Hz, and a duration of 0.2 s (click), and it was presented only once. Measurements in the observed artery were taken before (baseline) and after the defined sound stimulation. Results showed that the absolute and relative differences in Pulsatility index (Pi; baseline vs after sound stimulation) were greater for the high-risk group compared to the low-risk group (absolute difference: mean=0.36 vs mean=0.36; relative difference: mean ≈ 18% vs mean ≈ 12%). When the low-risk group and the three high-risk group mean pairs were compared using multiple t-tests, the diabetic group differed from the low-risk and the two other high-risk groups; the low-risk and the two other high-risk groups did not differ from each other. Fetuses from pregnancies with a diagnosis of diabetes demonstrated the most pronounced reactivity and significantly higher absolute and relative changes in Pi values (absolute difference: mean=0.54; relative difference: mean=25.49%). The values of Pulsatility index (Pi) registered by PHS in low- and high-risk pregnancies may be used as differential and diagnostic parameters in the examination of fetal auditory perception. Fetuses from pregnancies with a diagnosis of diabetes demonstrated significantly higher absolute and relative changes in Pi values compared to the other groups of examined fetuses.
Ranganathan, Priya; Pramesh, C. S.; Aggarwal, Rakesh
2016-01-01
In the previous article in this series on common pitfalls in statistical analysis, we looked at the difference between risk and odds. Risk, which refers to the probability of occurrence of an event or outcome, can be defined in absolute or relative terms. Understanding what these measures represent is essential for the accurate interpretation of study results. PMID:26952180
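A small worked Python example of the quantities the article distinguishes, risk, odds, relative risk, odds ratio, and absolute risk difference, using hypothetical counts:

```python
# Hypothetical counts only, to make the definitions concrete.

def risk(events: int, total: int) -> float:
    return events / total

def odds(events: int, total: int) -> float:
    return events / (total - events)

exposed = (20, 100)      # 20 events in 100 exposed
unexposed = (10, 100)    # 10 events in 100 unexposed

rr = risk(*exposed) / risk(*unexposed)      # relative risk
or_ = odds(*exposed) / odds(*unexposed)     # odds ratio
ard = risk(*exposed) - risk(*unexposed)     # absolute risk difference

print(f"risk exposed {risk(*exposed):.2f}, unexposed {risk(*unexposed):.2f}")
print(f"relative risk {rr:.2f}, odds ratio {or_:.2f}, absolute difference {ard:.2f}")
# Note how the odds ratio (2.25) overstates the relative risk (2.00) when the
# outcome is common -- one of the pitfalls such articles warn about.
```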
Janssen, Eva; Verduyn, Philippe; Waters, Erika A
2018-05-01
Many people report uncertainty when appraising their risk of cancer and other diseases, but prior research about the topic has focused solely on cognitive risk perceptions. We investigated uncertainty related to cognitive and affective risk questions. We also explored whether any differences in uncertainty between cognitive and affective questions varied in magnitude by item-specific or socio-demographic characteristics. Secondary analysis of data collected for a 2 × 2 × 3 full-factorial risk communication experiment (N = 835) that was embedded within an online survey. We investigated the frequency of 'don't know' responses (DKR) to eight perceived risk items that varied according to whether they assessed (1) cognitive versus affective perceived risk, (2) absolute versus comparative risk, and (3) colon cancer versus 'any exercise-related diseases'. Socio-demographics were as follows: sex, age, education, family history, and numeracy. We analysed the data using multilevel logistic regression. The odds of DKR were lower for affective than cognitive perceived risk (OR = 0.64, p < .001). This difference occurred for absolute but not comparative risk perceptions (interaction effect, p = .004), but no interactions for disease type or demographic characteristics were found (ps > .05). Lower uncertainty for affective (vs. cognitive) absolute perceived risk items is consistent with research stating: (1) Risk perceptions are grounded in people's feelings about a hazard, and (2) feelings are easier for people to access than facts. Including affective perceived risk items in health behaviour surveys may reduce missing data and improve data quality. Statement of contribution What is already known on this subject? Many people report that they don't know their risk (i.e., risk uncertainty). Evidence is growing for the importance of feelings of risk in explaining health behaviour. Feelings are easier for people to access than facts. What does this study add? Don't know responding is higher for absolute cognitive than absolute affective risk questions. This difference does not vary in magnitude by demographic characteristics. Affective perceived risk questions in surveys may reduce missing data and improve data quality. © 2018 The British Psychological Society.
Variance computations for functionals of absolute risk estimates.
Pfeiffer, R M; Petracci, E
2011-07-01
We present a simple influence function based approach to compute the variances of estimates of absolute risk and functions of absolute risk. We apply this approach to criteria that assess the impact of changes in the risk factor distribution on absolute risk for an individual and at the population level. As an illustration we use an absolute risk prediction model for breast cancer that includes modifiable risk factors in addition to standard breast cancer risk factors. Influence function based variance estimates for absolute risk and the criteria are compared to bootstrap variance estimates.
Variance computations for functionals of absolute risk estimates
Pfeiffer, R.M.; Petracci, E.
2011-01-01
We present a simple influence function based approach to compute the variances of estimates of absolute risk and functions of absolute risk. We apply this approach to criteria that assess the impact of changes in the risk factor distribution on absolute risk for an individual and at the population level. As an illustration we use an absolute risk prediction model for breast cancer that includes modifiable risk factors in addition to standard breast cancer risk factors. Influence function based variance estimates for absolute risk and the criteria are compared to bootstrap variance estimates. PMID:21643476
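The influence-function approach can be illustrated on a much simpler estimator than the paper's model-based absolute risk: for a crude event proportion, the influence function is each observation minus the estimated proportion, and its empirical variance can be compared with a nonparametric bootstrap. A toy Python sketch with simulated data (not the paper's estimator):

```python
# Toy comparison of an influence-function variance and a bootstrap variance
# for a crude absolute risk (event proportion); data are simulated.
import random

random.seed(0)
events = [1 if random.random() < 0.15 else 0 for _ in range(500)]   # hypothetical outcomes
n = len(events)
p_hat = sum(events) / n                        # estimated absolute risk

# Influence function of the mean: IF_i = x_i - p_hat; Var ≈ sum(IF_i^2) / n^2
if_var = sum((x - p_hat) ** 2 for x in events) / n ** 2

# Nonparametric bootstrap variance of the same estimator
boot = [sum(random.choices(events, k=n)) / n for _ in range(2000)]
boot_mean = sum(boot) / len(boot)
boot_var = sum((b - boot_mean) ** 2 for b in boot) / (len(boot) - 1)

print(f"absolute risk estimate:      {p_hat:.3f}")
print(f"influence-function variance: {if_var:.6f}")
print(f"bootstrap variance:          {boot_var:.6f}")
```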
Vandvik, Per Olav; Santesso, Nancy; Akl, Elie A; You, John; Mulla, Sohail; Spencer, Frederick A; Johnston, Bradley C; Brozek, Jan; Kreis, Julia; Brandt, Linn; Zhou, Qi; Schünemann, Holger J; Guyatt, Gordon
2012-07-01
To determine the effects of formatting alternatives in Grading of Recommendations Assessment, Development, and Evaluation (GRADE) evidence profiles on guideline panelists' preferences, comprehension, and accessibility. We randomized 116 antithrombotic therapy guideline panelists to review one of two table formats with four formatting alternatives. After answering relevant questions, panelists reviewed the other format and reported their preferences for specific formatting alternatives. Panelists (88 of 116 invited [76%]) preferred presentation of study event rates over no study event rates (median 1 [interquartile range (IQR) 1] on 1-7 scale), absolute risk differences over absolute risks (median 2 [IQR 3]), and additional information in table cells over footnotes (median 1 [IQR 2]). Panelists presented with time frame information in the tables, and not only in footnotes, were more likely to correctly answer questions regarding time frame (58% vs. 11%, P<0.0001), and those presented with risk differences and not absolute risks were more likely to correctly interpret confidence intervals for absolute effects (95% vs. 54%, P<0.0001). Information was considered easy to find, easy to comprehend, and helpful in making recommendations regardless of table format (median 6, IQR 0-1). Panelists found information in GRADE evidence profiles accessible. Correct comprehension of some key information was improved by providing additional information in table and presenting risk differences. Copyright © 2012 Elsevier Inc. All rights reserved.
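One common way such evidence profiles derive the absolute effects panelists preferred is to apply the relative risk and its confidence limits to an assumed baseline risk; a hedged Python sketch with hypothetical numbers:

```python
# Hypothetical numbers only: deriving an absolute risk difference (per 1000)
# from a relative risk with a 95% CI and an assumed baseline (control) risk.

def fewer_per_1000(baseline_risk: float, rr: float) -> float:
    """Events prevented per 1000 patients implied by a relative risk below 1."""
    return (baseline_risk - baseline_risk * rr) * 1000

baseline = 0.08                              # assumed control-group risk (8%)
rr, rr_low, rr_high = 0.75, 0.60, 0.94       # hypothetical RR with 95% CI

point = fewer_per_1000(baseline, rr)
best, worst = fewer_per_1000(baseline, rr_low), fewer_per_1000(baseline, rr_high)

print(f"{point:.0f} fewer per 1000 (95% CI {worst:.0f} to {best:.0f} fewer)")
# Presenting this risk difference directly, rather than two separate absolute
# risks, is the format panelists interpreted more accurately in the study above.
```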
Hitaka, Yuka; Miura, Shin-ichiro; Koyoshi, Rie; Shiga, Yuhei; Miyase, Yuiko; Norimatsu, Kenji; Nakamura, Ayumi; Adachi, Sen; Kuwano, Takashi; Sugihara, Makoto; Ike, Amane; Nishikawa, Hiroaki; Saku, Keijiro
2015-01-01
Background We investigated the relationship between the severity and presence of coronary artery disease (CAD) and a difference in systolic and diastolic blood pressure (SBP and DBP) between arms or between lower limbs. Methods We enrolled 277 patients who underwent coronary angiography. We calculated the absolute (|right BP (rt. BP) - left BP (lt. BP)|) and relative (rt. BP - lt. BP) differences in SBP or DBP between arms or between lower limbs, and assessed the severity of CAD in terms of the Gensini score. Results The absolute difference in DBP between arms in the CAD group was significantly lower than that in the non-CAD group, whereas the absolute difference in DBP between lower limbs in the CAD group was significantly higher. There were no differences in the absolute or relative difference in SBP between arms or lower limbs between the groups. The absolute difference in DBP between arms decreased as the Gensini score increased. In a logistic regression analysis, the presence of CAD was independently associated with the absolute difference in DBP between arms, in addition to male, family history, dyslipidemia, diabetes mellitus and hypertension. Conclusion The absolute difference in DBP between arms in addition to traditional factors may be a critical risk factor for the presence of CAD. PMID:26491500
Hitaka, Yuka; Miura, Shin-Ichiro; Koyoshi, Rie; Shiga, Yuhei; Miyase, Yuiko; Norimatsu, Kenji; Nakamura, Ayumi; Adachi, Sen; Kuwano, Takashi; Sugihara, Makoto; Ike, Amane; Nishikawa, Hiroaki; Saku, Keijiro
2015-11-01
We investigated the relationship between the severity and presence of coronary artery disease (CAD) and a difference in systolic and diastolic blood pressure (SBP and DBP) between arms or between lower limbs. We enrolled 277 patients who underwent coronary angiography. We calculated the absolute (|right BP (rt. BP) - left BP (lt. BP)|) and relative (rt. BP - lt. BP) differences in SBP or DBP between arms or between lower limbs, and assessed the severity of CAD in terms of the Gensini score. The absolute difference in DBP between arms in the CAD group was significantly lower than that in the non-CAD group, whereas the absolute difference in DBP between lower limbs in the CAD group was significantly higher. There were no differences in the absolute or relative difference in SBP between arms or lower limbs between the groups. The absolute difference in DBP between arms decreased as the Gensini score increased. In a logistic regression analysis, the presence of CAD was independently associated with the absolute difference in DBP between arms, in addition to male, family history, dyslipidemia, diabetes mellitus and hypertension. The absolute difference in DBP between arms in addition to traditional factors may be a critical risk factor for the presence of CAD.
Darby, S; McGale, P; Correa, C; Taylor, C; Arriagada, R; Clarke, M; Cutter, D; Davies, C; Ewertz, M; Godwin, J; Gray, R; Pierce, L; Whelan, T; Wang, Y; Peto, R
2011-11-12
After breast-conserving surgery, radiotherapy reduces recurrence and breast cancer death, but it may do so more for some groups of women than for others. We describe the absolute magnitude of these reductions according to various prognostic and other patient characteristics, and relate the absolute reduction in 15-year risk of breast cancer death to the absolute reduction in 10-year recurrence risk. We undertook a meta-analysis of individual patient data for 10,801 women in 17 randomised trials of radiotherapy versus no radiotherapy after breast-conserving surgery, 8337 of whom had pathologically confirmed node-negative (pN0) or node-positive (pN+) disease. Overall, radiotherapy reduced the 10-year risk of any (ie, locoregional or distant) first recurrence from 35·0% to 19·3% (absolute reduction 15·7%, 95% CI 13·7-17·7, 2p<0·00001) and reduced the 15-year risk of breast cancer death from 25·2% to 21·4% (absolute reduction 3·8%, 1·6-6·0, 2p=0·00005). In women with pN0 disease (n=7287), radiotherapy reduced these risks from 31·0% to 15·6% (absolute recurrence reduction 15·4%, 13·2-17·6, 2p<0·00001) and from 20·5% to 17·2% (absolute mortality reduction 3·3%, 0·8-5·8, 2p=0·005), respectively. In these women with pN0 disease, the absolute recurrence reduction varied according to age, grade, oestrogen-receptor status, tamoxifen use, and extent of surgery, and these characteristics were used to predict large (≥20%), intermediate (10-19%), or lower (<10%) absolute reductions in the 10-year recurrence risk. Absolute reductions in 15-year risk of breast cancer death in these three prediction categories were 7·8% (95% CI 3·1-12·5), 1·1% (-2·0 to 4·2), and 0·1% (-7·5 to 7·7) respectively (trend in absolute mortality reduction 2p=0·03). In the few women with pN+ disease (n=1050), radiotherapy reduced the 10-year recurrence risk from 63·7% to 42·5% (absolute reduction 21·2%, 95% CI 14·5-27·9, 2p<0·00001) and the 15-year risk of breast cancer death from 51·3% to 42·8% (absolute reduction 8·5%, 1·8-15·2, 2p=0·01). Overall, about one breast cancer death was avoided by year 15 for every four recurrences avoided by year 10, and the mortality reduction did not differ significantly from this overall relationship in any of the three prediction categories for pN0 disease or for pN+ disease. After breast-conserving surgery, radiotherapy to the conserved breast halves the rate at which the disease recurs and reduces the breast cancer death rate by about a sixth. These proportional benefits vary little between different groups of women. By contrast, the absolute benefits from radiotherapy vary substantially according to the characteristics of the patient and they can be predicted at the time when treatment decisions need to be made. Cancer Research UK, British Heart Foundation, and UK Medical Research Council. Copyright © 2011 Elsevier Ltd. All rights reserved.
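A quick arithmetic check, in Python, of the abstract's headline ratio, using the overall figures reported above:

```python
# Overall figures taken from the abstract above (percentage points, pp).
recurrence_reduction_10y = 35.0 - 19.3   # absolute reduction in 10-year recurrence risk
mortality_reduction_15y = 25.2 - 21.4    # absolute reduction in 15-year breast cancer mortality

ratio = mortality_reduction_15y / recurrence_reduction_10y
print(f"absolute recurrence reduction: {recurrence_reduction_10y:.1f} pp")
print(f"absolute mortality reduction:  {mortality_reduction_15y:.1f} pp")
print(f"≈ one breast cancer death avoided per {1 / ratio:.1f} recurrences avoided")
```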
Kimura, Atsushi; Hashimoto, Junichiro; Watabe, Daisuke; Takahashi, Hisaki; Ohkubo, Takayoshi; Kikuya, Masahiro; Imai, Yutaka
2004-12-01
To assess whether there is a natural difference in blood pressure (BP) measurements between the right and left arms, and to identify what factors are associated with this difference in a general population. The study subjects were 1090 individuals who participated in a medical check-up in Ohasama, Japan. The BP was measured simultaneously in both arms, using an automated device. The inter-arm BP difference was expressed as the relative difference [right-arm BP (R) minus left-arm BP (L): R - L] and the absolute difference (|R - L|). The relationship between inter-arm difference and various factors was analyzed using univariate analysis. The characteristics of subjects in whom the absolute systolic BP (SBP) difference was greater than 10 mmHg were analyzed using multivariate logistic analysis. The relative differences in SBP and diastolic BP (DBP) were -0.6 +/- 6.6 (mean +/- SD) and 1.1 +/- 4.7 mmHg, while the absolute differences were 4.9 +/- 4.4 and 3.7 +/- 3.0 mmHg. The absolute SBP difference was found to correlate significantly with age, body mass index, ankle-brachial index (ABI), and hypertension. Subjects with hypertension, hypercholesterolemia, obesity, elevated hemoglobin A1c (HbA1c) and low ABI had a significant and independent increase in the risk of an absolute SBP difference greater than 10 mmHg. The results suggest that there is considerable difference in the measured BP in the right and left arms and that large differences in the absolute SBP are associated with risk factors for arteriosclerosis such as hypertension, hypercholesterolemia, obesity, metabolic abnormalities and low ABI.
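A minimal Python sketch of the inter-arm difference definitions used in this and the preceding blood-pressure study, with hypothetical readings:

```python
# Hypothetical readings: relative difference (right minus left) keeps the sign,
# absolute difference discards it; readings above 10 mmHg are flagged.

def interarm_differences(right_sbp: int, left_sbp: int):
    relative = right_sbp - left_sbp          # R - L
    absolute = abs(relative)                 # |R - L|
    return relative, absolute

readings = [(128, 122), (118, 131), (142, 141)]   # (right, left) SBP in mmHg
for right, left in readings:
    rel, abs_diff = interarm_differences(right, left)
    flag = "  <-- exceeds 10 mmHg" if abs_diff > 10 else ""
    print(f"R {right}  L {left}  relative {rel:+d}  absolute {abs_diff}{flag}")
```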
Mind and body therapy for fibromyalgia.
Theadom, Alice; Cropley, Mark; Smith, Helen E; Feigin, Valery L; McPherson, Kathryn
2015-04-09
Mind-body interventions are based on the holistic principle that mind, body and behaviour are all interconnected. Mind-body interventions incorporate strategies that are thought to improve psychological and physical well-being, aim to allow patients to take an active role in their treatment, and promote people's ability to cope. Mind-body interventions are widely used by people with fibromyalgia to help manage their symptoms and improve well-being. Examples of mind-body therapies include psychological therapies, biofeedback, mindfulness, movement therapies and relaxation strategies. To review the benefits and harms of mind-body therapies in comparison to standard care and attention placebo control groups for adults with fibromyalgia, post-intervention and at three and six month follow-up. Electronic searches of the Cochrane Central Register of Controlled Trials (CENTRAL), MEDLINE (Ovid), EMBASE (Ovid), PsycINFO (Ovid), AMED (EBSCO) and CINAHL (Ovid) were conducted up to 30 October 2013. Searches of reference lists were conducted and authors in the field were contacted to identify additional relevant articles. All relevant randomised controlled trials (RCTs) of mind-body interventions for adults with fibromyalgia were included. Two authors independently selected studies, extracted the data and assessed trials for low, unclear or high risk of bias. Any discrepancy was resolved through discussion and consensus. Continuous outcomes were analysed using mean difference (MD) where the same outcome measure and scoring method was used and standardised mean difference (SMD) where different outcome measures were used. For binary data standard estimation of the risk ratio (RR) and its 95% confidence interval (CI) was used. Seventy-four papers describing 61 trials were identified, with 4234 predominantly female participants. The nature of fibromyalgia varied from mild to severe across the study populations. Twenty-six studies were classified as having a low risk of bias for all domains assessed. The findings of mind-body therapies compared with usual care were prioritised. There is low quality evidence that in comparison to usual care controls psychological therapies have favourable effects on physical functioning (SMD -0.4, 95% CI -0.6 to -0.3, -7.5% absolute change, 2 point shift on a 0 to 100 scale), pain (SMD -0.3, 95% CI -0.5 to -0.2, -3.5% absolute change, 2 point shift on a 0 to 100 scale) and mood (SMD -0.5, 95% CI -0.6 to -0.3, -4.8% absolute change, 3 point shift on a 20 to 80 scale). There is very low quality evidence of more withdrawals in the psychological therapy group in comparison to usual care controls (RR 1.38, 95% CI 1.12 to 1.69, 6% absolute risk difference). There is a lack of evidence of a difference between the number of adverse events in the psychological therapy and control groups (RR 0.38, 95% CI 0.06 to 2.50, 4% absolute risk difference). There was very low quality evidence that biofeedback in comparison to usual care controls had an effect on physical functioning (SMD -0.1, 95% CI -0.4 to 0.3, -1.2% absolute change, 1 point shift on a 0 to 100 scale), pain (SMD -2.6, 95% CI -91.3 to 86.1, -2.6% absolute change) and mood (SMD 0.1, 95% CI -0.3 to 0.5, 1.9% absolute change, less than 1 point shift on a 0 to 90 scale) post-intervention. In view of the quality of evidence we cannot be certain that biofeedback has little or no effect on these outcomes. There was very low quality evidence that biofeedback led to more withdrawals from the study (RR 4.08, 95% CI 1.43 to 11.62, 20% absolute risk difference). No adverse events were reported. There was no advantage observed for mindfulness in comparison to usual care for physical functioning (SMD -0.3, 95% CI -0.6 to 0.1, -4.8% absolute change, 4 point shift on a scale 0 to 100), pain (SMD -0.1, CI -0.4 to 0.3, -1.3% absolute change, less than 1 point shift on a 0 to 10 scale), mood (SMD -0.2, 95% CI -0.5 to 0.0, -3.7% absolute change, 2 point shift on a 20 to 80 scale) or withdrawals (RR 1.07, 95% CI 0.67 to 1.72, 2% absolute risk difference) between the two groups post-intervention. However, the quality of the evidence was very low for pain and moderate for mood and number of withdrawals. No studies reported any adverse events. Very low quality evidence revealed that movement therapies in comparison to usual care controls improved pain (MD -2.3, CI -4.2 to -0.4, -23% absolute change) and mood (MD -9.8, 95% CI -18.5 to -1.2, -16.4% absolute change) post-intervention. There was no advantage for physical functioning (SMD -0.2, 95% CI -0.5 to 0.2, -3.4% absolute change, 2 point shift on a 0 to 100 scale), participant withdrawals (RR 1.95, 95% CI 1.13 to 3.38, 11% absolute difference) or adverse events (RR 4.62, 95% CI 0.23 to 93.92, 4% absolute risk difference) between the two groups; however, rare adverse events may include worsening of pain. Low quality evidence revealed that relaxation based therapies in comparison to usual care controls showed an advantage for physical functioning (MD -8.3, 95% CI -10.1 to -6.5, -10.4% absolute change) and pain (SMD -1.0, 95% CI -1.6 to -0.5, -3.5% absolute change, 2 point shift on a 0 to 78 scale) but not for mood (SMD -4.4, CI -14.5 to 5.6, -7.4% absolute change) post-intervention. There was no difference between the groups for number of withdrawals (RR 4.40, 95% CI 0.59 to 33.07, 31% absolute risk difference) and no adverse events were reported. Psychological therapies may be effective in improving physical functioning, pain and low mood for adults with fibromyalgia in comparison to usual care controls but the quality of the evidence is low. Further research on the outcomes of therapies is needed to determine if positive effects identified post-intervention are sustained. The effectiveness of biofeedback, mindfulness, movement therapies and relaxation based therapies remains unclear as the quality of the evidence was very low or low. The small number of trials and inconsistency in the use of outcome measures across the trials restricted the analysis.
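For readers unfamiliar with the standardised mean difference (SMD) reported throughout this review, here is a short illustrative Python sketch with made-up scores, including the kind of back-conversion to scale points that requires an assumed representative SD:

```python
# Made-up scores only: the SMD expresses a mean difference in SD units so that
# trials using different scales can be pooled.
import statistics

control = [62, 70, 55, 68, 75, 60, 66, 72]   # hypothetical 0-100 pain scores, usual care
therapy = [55, 63, 50, 61, 69, 52, 60, 64]   # hypothetical 0-100 pain scores, therapy

mean_diff = statistics.mean(therapy) - statistics.mean(control)
sd_c, sd_t = statistics.stdev(control), statistics.stdev(therapy)
pooled_sd = ((sd_c ** 2 + sd_t ** 2) / 2) ** 0.5   # simple pooled SD (equal group sizes)
smd = mean_diff / pooled_sd

print(f"mean difference {mean_diff:.1f} points, SMD {smd:.2f}")
# In a meta-analysis, a pooled SMD is often translated back to scale points by
# multiplying by a representative control-group SD (an assumption):
print(f"≈ {abs(smd * sd_c):.1f}-point improvement on the 0-100 scale")
```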
Murray, Louise J.; Thompson, Christopher M.; Lilley, John; Cosgrove, Vivian; Franks, Kevin; Sebag-Montefiore, David; Henry, Ann M.
2015-02-01
Risks of radiation-induced second primary cancer following prostate radiotherapy using 3D-conformal radiotherapy (3D-CRT), intensity-modulated radiotherapy (IMRT), volumetric modulated arc therapy (VMAT), flattening filter free (FFF) and stereotactic ablative radiotherapy (SABR) were evaluated. Prostate plans were created using 10 MV 3D-CRT (78 Gy in 39 fractions) and 6 MV 5-field IMRT (78 Gy in 39 fractions), VMAT (78 Gy in 39 fractions, with standard flattened and energy-matched FFF beams) and SABR (42.7 Gy in 7 fractions with standard flattened and energy-matched FFF beams). Dose-volume histograms from pelvic planning CT scans of three prostate patients, each planned using all 6 techniques, were used to calculate organ equivalent doses (OED) and excess absolute risks (EAR) of second rectal and bladder cancers, and pelvic bone and soft tissue sarcomas, using mechanistic, bell-shaped and plateau models. For organs distant to the treatment field, chamber measurements recorded in an anthropomorphic phantom were used to calculate OEDs and EARs using a linear model. Ratios of OED give relative radiation-induced second cancer risks. SABR resulted in lower second cancer risks at all sites relative to 3D-CRT. FFF resulted in lower second cancer risks in out-of-field tissues relative to equivalent flattened techniques, with increasing impact in organs at greater distances from the field. For example, FFF reduced second cancer risk by up to 20% in the stomach and up to 56% in the brain, relative to the equivalent flattened technique. Relative to 10 MV 3D-CRT, 6 MV IMRT or VMAT with flattening filter increased second cancer risks in several out-of-field organs, by up to 26% and 55%, respectively. For all techniques, EARs were consistently low. The observed large relative differences between techniques, in absolute terms, were very low, highlighting the importance of considering absolute risks alongside the corresponding relative risks, since when absolute risks are very low, large relative risks become less meaningful. A calculated relative radiation-induced second cancer risk benefit from SABR and FFF techniques was theoretically predicted, although absolute radiation-induced second cancer risks were low for all techniques, and absolute differences between techniques were small.
A randomized trial of decision-making in asymptomatic carotid stenosis.
Silver, B; Zaman, I F; Ashraf, K; Majed, Y; Norwood, E M; Schuh, L A; Smith, B J; Smith, R E; Schultz, L R
2012-01-31
We sought to evaluate whether different presentation formats, presenter characteristics, and patient characteristics affect decision-making in asymptomatic carotid stenosis. Subjects included individuals presenting to a neurology clinic. Participants included those over age 18 without known carotid stenosis. Subjects were randomized to a 30-second video with 1 of 5 presentation formats (absolute risk, absolute event-free survival, annualized absolute risk, relative risk, and a qualitative description) delivered by 1 of 4 presenter physicians (black woman, white woman, black man, white man). Subjects then completed a one-page form regarding background demographics and their decision regarding treatment choice. A total of 409 subjects watched the video and completed the survey. Overall, 48.4% of subjects chose surgery. Presentation format strongly predicted choice of surgery (qualitative [64%], relative risk [63%], absolute risk [43%], absolute event-free survival [37%], and annualized absolute risk [35%], p < 0.001). There was a trend for younger age (mean age 52 vs 55, p = 0.054), male gender (53% vs 45%, p = 0.08), and advanced education (42% for high school education or less vs 52% for more than high school education, p = 0.052) to predict surgery choice. Gender and race of presenter, and race of subject, had no influence on the choice of treatment. Presentation format (information framing) strongly determines patient decision-making in asymptomatic carotid stenosis. Subject age, gender, and education level may also influence the decision. Clinicians should consider the influence of these variables when counseling patients.
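To make the five formats concrete, here is a short Python sketch rendering one hypothetical treatment effect each way; the 5-year risks below are illustrative only and are not the figures used in the trial's videos:

```python
# The same hypothetical effect in the five presentation formats compared above.
risk_medical_5y = 0.11      # hypothetical 5-year stroke risk without surgery
risk_surgery_5y = 0.06      # hypothetical 5-year stroke risk with surgery
years = 5

formats = {
    "absolute risk": f"Stroke risk over {years} years: {risk_medical_5y:.0%} without surgery, "
                     f"{risk_surgery_5y:.0%} with surgery.",
    "event-free survival": f"Chance of remaining stroke-free: {1 - risk_medical_5y:.0%} without "
                           f"surgery, {1 - risk_surgery_5y:.0%} with surgery.",
    "annualized absolute risk": f"About {risk_medical_5y / years:.1%} vs "
                                f"{risk_surgery_5y / years:.1%} per year.",
    "relative risk": f"Surgery lowers stroke risk by "
                     f"{1 - risk_surgery_5y / risk_medical_5y:.0%}.",
    "qualitative": "Surgery reduces the risk of stroke.",
}
for name, text in formats.items():
    print(f"{name:>26}: {text}")
```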
Greater absolute risk for all subtypes of breast cancer in the US than Malaysia.
Horne, Hisani N; Beena Devi, C R; Sung, Hyuna; Tang, Tieng Swee; Rosenberg, Philip S; Hewitt, Stephen M; Sherman, Mark E; Anderson, William F; Yang, Xiaohong R
2015-01-01
Hormone receptor (HR) negative breast cancers are relatively more common in low-risk than high-risk countries and/or populations. However, the absolute variations between these different populations are not well established given the limited number of cancer registries with incidence rate data by breast cancer subtype. We, therefore, used two unique population-based resources with molecular data to compare incidence rates for the 'intrinsic' breast cancer subtypes between a low-risk Asian population in Malaysia and a high-risk non-Hispanic white population in the National Cancer Institute's Surveillance, Epidemiology, and End Results 18 registries database (SEER 18). The intrinsic breast cancer subtypes were recapitulated with the joint expression of the HRs (estrogen receptor and progesterone receptor) and human epidermal growth factor receptor-2 (HER2). Invasive breast cancer incidence rates overall were fivefold greater in SEER 18 than in Malaysia. The majority of breast cancers were HR-positive in SEER 18 and HR-negative in Malaysia. Notwithstanding the greater relative distribution for HR-negative cancers in Malaysia, there was a greater absolute risk for all subtypes in SEER 18; incidence rates were nearly 7-fold higher for HR-positive and 2-fold higher for HR-negative cancers in SEER 18. Despite the well-established relative breast cancer differences between low-risk and high-risk countries and/or populations, there was a greater absolute risk for HR-positive and HR-negative subtypes in the US than Malaysia. Additional analytical studies are sorely needed to determine the factors responsible for the elevated risk of all subtypes of breast cancer in high-risk countries like the United States.
How are lung cancer risk perceptions and cigarette smoking related?-testing an accuracy hypothesis.
Chen, Lei-Shih; Kaphingst, Kimberly A; Tseng, Tung-Sung; Zhao, Shixi
2016-10-01
Subjective risk perception is an important theoretical construct in the field of cancer prevention and control. Although the relationship between subjective risk perception and health behaviors has been widely studied in many health contexts, the causal relationships and associations between the risk perception of developing lung cancer and cigarette smoking have been inconsistently reported among studies. Such inconsistency may stem from differences between study designs (cross-sectional versus longitudinal designs) and the three hypotheses (i.e., the behavior motivation hypothesis, the risk reappraisals hypothesis, and the accuracy hypothesis) testing different underlying associations between risk perception and cigarette-smoking behaviors. To clarify this issue, as an initial step, we examined the association between absolute and relative risk perceptions of developing lung cancer and cigarette-smoking behaviors among a large, nationally representative sample of 1,680 U.S. adults by testing an accuracy hypothesis (i.e., people who smoke accurately perceive a higher risk of developing lung cancer). Data from the U.S. Health Information National Trends Survey (HINTS) were analyzed using logistic regression and multivariate linear regression to examine the associations between risk perception and cigarette-smoking behaviors among 1,680 U.S. adults. Findings from this cross-sectional survey suggest that absolute and relative risk perceptions were positively and significantly correlated with having smoked >100 cigarettes during one's lifetime and the frequency of cigarette smoking. Only absolute risk perception was significantly associated with the number of cigarettes smoked per day among current smokers. Because both absolute and relative risk perceptions are positively related to most cigarette-smoking behaviors, this study supports the accuracy hypothesis. Moreover, absolute risk perception might be a more sensitive measurement than relative risk perception for perceived lung cancer risk. Longitudinal research is needed in the future to investigate other types of risk perception-risk behavior hypotheses (the behavior motivation and the risk reappraisals hypotheses) among nationally representative samples to further examine the causal relationships between risk perception of developing lung cancer and smoking behaviors.
Chapman, Robert S; Silverman, Debra T; He, Xinghzhou; Hu, Wei; Vermeulen, Roel; Ning, Bofu; Fraumeni, Joseph F; Rothman, Nathaniel; Lan, Qing
2012-01-01
Objective To estimate the risk of lung cancer associated with the use of different types of coal for household cooking and heating. Setting Xuanwei County, Yunnan Province, China. Design Retrospective cohort study (follow-up 1976-96) comparing mortality from lung cancer between lifelong users of “smoky coal” (bituminous) and “smokeless coal” (anthracite). Participants 27 310 individuals using smoky coal and 9962 individuals using smokeless coal during their entire life. Main outcome measures Primary outcomes were absolute and relative risk of death from lung cancer among users of different types of coal. Unadjusted survival analysis was used to estimate the absolute risk of lung cancer, while Cox regression models compared mortality hazards for lung cancer between smoky and smokeless coal users. Results Lung cancer mortality was substantially higher among users of smoky coal than users of smokeless coal. The absolute risks of lung cancer death before 70 years of age for men and women using smoky coal were 18% and 20%, respectively, compared with less than 0.5% among smokeless coal users of both sexes. Lung cancer alone accounted for about 40% of all deaths before age 60 among individuals using smoky coal. Compared with smokeless coal, use of smoky coal was associated with an increased risk of lung cancer death (for men, hazard ratio 36 (95% confidence interval 20 to 65); for women, 99 (37 to 266)). Conclusions In Xuanwei, the domestic use of smoky coal is associated with a substantial increase in the absolute lifetime risk of developing lung cancer and is likely to represent one of the strongest effects of environmental pollution reported for cancer risk. Use of less carcinogenic types of coal could translate to a substantial reduction of lung cancer risk. PMID:22936785
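A hedged Python sketch of how an absolute (cumulative) risk of lung cancer death before age 70 can be built from age-specific mortality rates; the rates are invented to echo the smoky- versus smokeless-coal contrast and the calculation ignores competing risks:

```python
# Hypothetical age-specific rates only, for illustrating cumulative risk.
import math

def cumulative_risk(rates_per_1000_py, years_per_band: int = 10) -> float:
    """1 - exp(-summed hazard) over the age bands, ignoring competing risks."""
    total_hazard = sum(r / 1000 * years_per_band for r in rates_per_1000_py)
    return 1 - math.exp(-total_hazard)

smoky_rates = [0.5, 2.0, 6.0, 12.0]       # hypothetical lung cancer deaths /1000 PY, ages 30-69
smokeless_rates = [0.01, 0.05, 0.1, 0.3]  # hypothetical

print(f"absolute risk before 70, smoky coal:     {cumulative_risk(smoky_rates):.1%}")
print(f"absolute risk before 70, smokeless coal: {cumulative_risk(smokeless_rates):.1%}")
```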
Ohsawa, Masaki; Okamura, Tomonori; Ogasawara, Kuniaki; Ogawa, Akira; Fujioka, Tomoaki; Tanno, Kozo; Yonekura, Yuki; Omama, Shinichi; Turin, Tanvir Chowdhury; Itai, Kazuyoshi; Ishibashi, Yasuhiro; Morino, Yoshihiro; Itoh, Tomonori; Miyamatsu, Naomi; Onoda, Toshiyuki; Kuribayashi, Toru; Makita, Shinji; Yoshida, Yuki; Nakamura, Motoyuki; Tanaka, Fumitaka; Ohta, Mutsuko; Sakata, Kiyomi; Okayama, Akira
2015-04-01
The relative and absolute risks of outcomes other than all-cause death (ACD) attributable to atrial fibrillation (AF) stratified by age have not been sufficiently investigated. A prospective study of 23,634 community dwellers aged 40 years or older without organic cardiovascular disease (AF=335, non-AF=23,299) was conducted. Multivariate-adjusted rates, rate ratios (RRs) and excess deaths (EDs) for ACD, cardiovascular death (CVD) and non-cardiovascular death (non-CVD), and sex- and age-adjusted RRs and EDs in middle-aged (40 to 69 years) and elderly (70 years or older) individuals for ACD, CVD, non-CVD, sudden cardiac death (SCD), stroke-related death (Str-D), neoplasm-related death (NPD), and infection-related death (IFD) attributable to AF were estimated using Poisson regression. Multivariate-adjusted analysis revealed that AF significantly increased the risk of ACD (RR [95% confidence interval]: 1.70 [1.23-2.95]) and CVD (3.86 [2.38-6.27]), but not non-CVD. Age-stratified analysis revealed that AF increased the risk of Str-D in middle-aged (14.5 [4.77-44.3]) and elderly individuals (4.92 [1.91-12.7]), SCD in elderly individuals (3.21 [1.37-7.51]), and might increase the risk of IFD in elderly individuals (2.02 [0.80-4.65], p=0.098). The RR of CVD was higher in middle-aged versus elderly individuals (RRs, 6.19 vs. 3.57) but the absolute risk difference was larger in elderly individuals (EDs: 7.6 vs. 3.0 per 1000 person-years). Larger absolute risk differences for ACD and CVD attributable to AF among elderly people indicate that the absolute burden of AF is higher in elderly versus middle-aged people despite the relatively small RR. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
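The paper's central point, that a smaller rate ratio can coexist with a larger absolute excess, can be shown with a short Python sketch; the baseline rates below are hypothetical, chosen so the resulting figures roughly echo the abstract:

```python
# Hypothetical baseline rates only, to illustrate relative vs absolute burden.

def excess_deaths(baseline_rate: float, rate_ratio: float) -> float:
    """Excess deaths per 1000 person-years attributable to the exposure."""
    return baseline_rate * (rate_ratio - 1)

strata = {
    "middle-aged (40-69 y)": dict(baseline=0.6, rr=6.2),   # per 1000 PY, hypothetical
    "elderly (70+ y)":       dict(baseline=3.0, rr=3.6),
}
for name, s in strata.items():
    ed = excess_deaths(s["baseline"], s["rr"])
    print(f"{name}: RR {s['rr']:.1f}, excess deaths {ed:.1f} per 1000 PY")
```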
Miura, Kyoko; Hughes, Maria Celia B; Ungerer, Jacobus P J; Smith, David D; Green, Adèle C
2018-03-01
In a well-characterised community-based prospective study, we aimed to systematically assess the differences in associations of plasma omega-3 and omega-6 fatty acid (FA) status with all-cause mortality when plasma FA status is expressed in absolute concentrations versus relative levels. In a community sample of 564 women aged 25-75 years in Queensland, Australia, baseline plasma phospholipid FA levels were measured using gas chromatography. Specific FAs analysed were eicosapentaenoic acid, docosapentaenoic acid, docosahexaenoic acid, total long-chain omega-3 FAs, linoleic acid, arachidonic acid, and total omega-6 FAs. Levels of each FA were expressed in absolute amounts (µg/mL) and relative levels (% of total FAs) and divided into thirds. Deaths were monitored for 17 years and hazard ratios and 95% confidence intervals calculated to assess risk of death according to absolute versus relative plasma FA levels. In total 81 (14%) women died during follow-up. Agreement between absolute and relative measures of plasma FAs was higher in omega-3 than omega-6 FAs. The results of multivariate analyses for risk of all-cause mortality were generally similar, with risk tending toward inverse associations with plasma phospholipid omega-3 FAs and no association with omega-6 FAs. Sensitivity analyses examining effects of age and presence of serious medical conditions on risk of mortality did not alter the findings. The directions and magnitudes of associations with mortality of absolute versus relative FA levels were comparable. However, plasma FAs expressed as absolute concentrations may be preferred for ease of comparison and since relative units can be deduced from absolute units.
Gout and subsequent erectile dysfunction: a population-based cohort study from England.
Abdul Sultan, Alyshah; Mallen, Christian; Hayward, Richard; Muller, Sara; Whittle, Rebecca; Hotston, Matthew; Roddy, Edward
2017-06-06
An association has been suggested between gout and erectile dysfunction (ED); however, studies quantifying the risk of ED amongst gout patients are lacking. We aimed to precisely determine the population-level absolute and relative rate of ED reporting among men with gout over a decade in England. We utilised the UK-based Clinical Practice Research Datalink to identify 9653 men with incident gout, age- and practice-matched to 38,218 controls. Absolute and relative rates of incident ED were calculated using Cox regression models. Absolute rates within specific time periods before and after gout diagnosis were compared to controls using a Poisson regression model. Overall, the absolute rate of ED post-gout diagnosis was 193 (95% confidence interval (CI): 184-202) per 10,000 person-years. This corresponded to a 31% (hazard ratio (HR): 1.31, 95% CI: 1.24-1.40) increased relative risk and a 0.6% excess absolute risk compared to those without gout. We did not observe statistically significant differences in the risk of ED among those prescribed urate-lowering therapy (ULT) within 1 and 3 years after gout diagnosis. Compared to those unexposed, the risk of ED was also high in the year before gout diagnosis (relative rate = 1.63, 95% CI 1.27-2.08). Similar findings were also observed for severe ED warranting pharmacological intervention. We have shown a statistically significant increased risk of ED among men with gout. Our findings will have important implications in planning a multidisciplinary approach to managing patients with gout.
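A back-of-the-envelope Python sketch relating the reported absolute rate and hazard ratio to an approximate excess rate, treating the HR as a rate ratio and ignoring censoring; this is only illustrative and the paper reports its own adjusted excess risk:

```python
# Figures taken from the abstract; the derivation below is a rough illustration
# of how relative and absolute measures connect, not the paper's calculation.
rate_gout = 193 / 10_000        # ED events per person-year among men with gout
hr = 1.31                       # reported hazard ratio vs matched controls

rate_controls = rate_gout / hr          # approximate comparator rate
excess_rate = rate_gout - rate_controls

print(f"approx. rate without gout: {rate_controls * 10_000:.0f} per 10,000 PY")
print(f"approx. excess rate:       {excess_rate * 10_000:.0f} per 10,000 PY "
      f"(≈ {excess_rate:.2%} per year)")
```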
Treatment decision-making and the form of risk communication: results of a factorial survey.
Hembroff, Larry A; Holmes-Rovner, Margaret; Wills, Celia E
2004-11-16
Prospective users of preventive therapies often must evaluate complex information about therapeutic risks and benefits. The purpose of this study was to evaluate the effect of relative and absolute risk information on patient decision-making in scenarios typical of health information for patients. Factorial experiments within a telephone survey of the Michigan adult, non-institutionalized, English-speaking population. The average interview lasted 23 minutes. Subjects and sample design: 952 randomly selected adults within a random-digit dial sample of Michigan households. The completion rate was 54.3%. When presented with hypothetical information regarding additional risks of breast cancer from a medication to prevent a bone disease, respondents reduced their willingness to recommend a female friend take the medication compared to the baseline rate (66.8% = yes). The decrease was significantly greater with relative risk information. Additional benefit information regarding preventing heart disease from the medication increased willingness to recommend the medication to a female friend relative to the baseline scenario, but did not differ between absolute and relative risk formats. When information about both increased risk of breast cancer and reduced risk of heart disease was provided, typical respondents appeared to make rational decisions consistent with Expected Utility Theory, but the information presentation format affected choices. The 11%-33% of respondents making decisions contrary to the medical indications were more likely to be Hispanic, older, more educated, and smokers, and to have children in the home. In scenarios typical of health risk information, relative risk information led respondents to make non-normative decisions that were "corrected" when the frame used absolute risk information. This population sample made generally rational decisions when presented with absolute risk information, even in the context of a telephone interview requiring them to remember the rates given. The lack of effect of gender and race suggests that a standard strategy of presenting absolute risk information may improve patient decision-making.
Wolff, Katharina; Larsen, Svein
2016-12-01
The present investigation is a cross-sectional, multi-national, quantitative, and quasi-experimental comparison of tourists' risk perceptions regarding different destinations throughout the past decade. Over 10,000 tourists to Norway from 89 different countries filled in a questionnaire rating the perceived risk for various destinations. Data were collected during 2004, 2010, 2011, 2012, 2013 and 2015 and allow for a comparison of perceived risk across time, place and nationality. Results show that while absolute risk judgments for different destinations fluctuate somewhat over the years, relative risk judgments remain constant. Findings also reveal a "home-is-safer-than-abroad" bias, with tourists consistently perceiving their home country as among the safest destinations. The current investigation is rare because it looks at more than one destination at a time. Insights gained from the present findings diverge from what would have been concluded from employing case studies, that is, looking at one destination at a time. © 2016 The Authors. Scandinavian Journal of Psychology published by Scandinavian Psychological Associations and John Wiley & Sons Ltd.
Denton, Brian T.; Hayward, Rodney A.
2017-01-01
Background Intensive blood pressure (BP) treatment can avert cardiovascular disease (CVD) events but can cause some serious adverse events. We sought to develop and validate risk models for predicting absolute risk difference (increased risk or decreased risk) for CVD events and serious adverse events from intensive BP therapy. A secondary aim was to test if the statistical method of elastic net regularization would improve the estimation of risk models for predicting absolute risk difference, as compared to a traditional backwards variable selection approach. Methods and findings Cox models were derived from SPRINT trial data and validated on ACCORD-BP trial data to estimate risk of CVD events and serious adverse events; the models included terms for intensive BP treatment and heterogeneous response to intensive treatment. The Cox models were then used to estimate the absolute reduction in probability of CVD events (benefit) and absolute increase in probability of serious adverse events (harm) for each individual from intensive treatment. We compared the method of elastic net regularization, which uses repeated internal cross-validation to select variables and estimate coefficients in the presence of collinearity, to a traditional backwards variable selection approach. Data from 9,069 SPRINT participants with complete data on covariates were utilized for model development, and data from 4,498 ACCORD-BP participants with complete data were utilized for model validation. Participants were exposed to intensive (goal systolic pressure < 120 mm Hg) versus standard (<140 mm Hg) treatment. Two composite primary outcome measures were evaluated: (i) CVD events/deaths (myocardial infarction, acute coronary syndrome, stroke, congestive heart failure, or CVD death), and (ii) serious adverse events (hypotension, syncope, electrolyte abnormalities, bradycardia, or acute kidney injury/failure). The model for CVD chosen through elastic net regularization included interaction terms suggesting that older age, black race, higher diastolic BP, and higher lipids were associated with greater CVD risk reduction benefits from intensive treatment, while current smoking was associated with fewer benefits. The model for serious adverse events chosen through elastic net regularization suggested that male sex, current smoking, statin use, elevated creatinine, and higher lipids were associated with greater risk of serious adverse events from intensive treatment. SPRINT participants in the highest predicted benefit subgroup had a number needed to treat (NNT) of 24 to prevent 1 CVD event/death over 5 years (absolute risk reduction [ARR] = 0.042, 95% CI: 0.018, 0.066; P = 0.001), those in the middle predicted benefit subgroup had a NNT of 76 (ARR = 0.013, 95% CI: −0.0001, 0.026; P = 0.053), and those in the lowest subgroup had no significant risk reduction (ARR = 0.006, 95% CI: −0.007, 0.018; P = 0.71). Those in the highest predicted harm subgroup had a number needed to harm (NNH) of 27 to induce 1 serious adverse event (absolute risk increase [ARI] = 0.038, 95% CI: 0.014, 0.061; P = 0.002), those in the middle predicted harm subgroup had a NNH of 41 (ARI = 0.025, 95% CI: 0.012, 0.038; P < 0.001), and those in the lowest subgroup had no significant risk increase (ARI = −0.007, 95% CI: −0.043, 0.030; P = 0.72). 
In ACCORD-BP, participants in the highest subgroup of predicted benefit had a significant absolute CVD risk reduction, but the overall ACCORD-BP participant sample was skewed towards participants with less predicted benefit and more predicted risk than in SPRINT. The models chosen through traditional backwards selection had similar ability to the elastic net models to identify the absolute risk difference for CVD, but poorer ability to correctly identify the absolute risk difference for serious adverse events. A key limitation of the analysis is the limited sample size of the ACCORD-BP trial, which widened the confidence intervals for ARI among persons with type 2 diabetes. Additionally, it is not possible to mechanistically explain the physiological relationships underlying the heterogeneous treatment effects captured by the models, since the study was an observational secondary data analysis. Conclusions We found that predictive models could help identify subgroups of participants in both SPRINT and ACCORD-BP who had lower versus higher ARRs in CVD events/deaths with intensive BP treatment, and participants who had lower versus higher ARIs in serious adverse events. PMID:29040268
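The NNT and NNH figures quoted above are simply the reciprocals of the absolute risk reduction (ARR) and absolute risk increase (ARI). A minimal sketch, using the point estimates reported for the highest-benefit and highest-harm subgroups, reproduces the reported NNT of 24 and NNH of 27:

```python
import math

def number_needed(absolute_risk_difference):
    """NNT (for a risk reduction) or NNH (for a risk increase): 1 / |ARD|, rounded up."""
    return math.ceil(1.0 / abs(absolute_risk_difference))

arr_highest_benefit = 0.042  # absolute risk reduction over 5 years (highest-benefit subgroup)
ari_highest_harm = 0.038     # absolute risk increase over 5 years (highest-harm subgroup)

print("NNT:", number_needed(arr_highest_benefit))  # -> 24
print("NNH:", number_needed(ari_highest_harm))     # -> 27
```

Rounding up is the usual convention for number-needed measures; the interval estimates would be obtained from the ARR/ARI confidence limits in the same way.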
Risk aversion and compliance in markets for pollution control.
Stranlund, John K
2008-07-01
This paper examines the effects of risk aversion on compliance choices in markets for pollution control. A firm's decision to be compliant or not is independent of its manager's risk preference. However, non-compliant firms with risk-averse managers will have lower violations than otherwise identical firms with risk-neutral managers. The violations of non-compliant firms with risk-averse managers are independent of differences in their profit functions and their initial allocations of permits if and only if their managers' utility functions exhibit constant absolute risk aversion. However, firm-level characteristics do impact violation choices when managers have coefficients of absolute risk aversion that are increasing or decreasing in profit levels. Finally, in the equilibrium of a market for emissions rights with widespread non-compliance, risk aversion is associated with higher permit prices, better environmental quality, and lower aggregate violations.
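The constant absolute risk aversion (CARA) condition on which the independence result turns has a compact standard form (a textbook definition, not drawn from the paper itself):

```latex
A(\pi) \;=\; -\frac{u''(\pi)}{u'(\pi)}, \qquad
\text{CARA:}\quad A(\pi) \equiv a \;\Longleftrightarrow\; u(\pi) = -e^{-a\pi}
\ \text{(up to a positive affine transformation)}.
```

Under CARA the violation choice of a non-compliant firm does not shift with its profit function or initial permit allocation, which is the if-and-only-if result stated above; with increasing or decreasing absolute risk aversion, firm-level characteristics re-enter the violation decision.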
Schwartz, Lisa M; Dvorin, Evan L; Welch, H Gilbert
2006-01-01
Objective To examine the accessibility of absolute risk in articles reporting ratio measures in leading medical journals. Design Structured review of abstracts presenting ratio measures. Setting Articles published between 1 June 2003 and 1 May 2004 in Annals of Internal Medicine, BMJ, Journal of the American Medical Association, Journal of the National Cancer Institute, Lancet, and New England Journal of Medicine. Participants 222 articles based on study designs in which absolute risks were directly calculable (61 randomised trials, 161 cohort studies). Main outcome measure Accessibility of the absolute risks underlying the first ratio measure in the abstract. Results 68% of articles (150/222) failed to report the underlying absolute risks for the first ratio measure in the abstract (range 55−81% across the journals). Among these articles, about half did report the underlying absolute risks elsewhere in the article (text, table, or figure) but half did not report them anywhere. Absolute risks were more likely to be reported in the abstract for randomised trials compared with cohort studies (62% v 21%; relative risk 3.0, 95% confidence interval 2.1 to 4.2) and for studies reporting crude compared with adjusted ratio measures (62% v 21%; relative risk 3.0, 2.1 to 4.3). Conclusion Absolute risks are often not easily accessible in articles reporting ratio measures and sometimes are missing altogether—this lack of accessibility can easily exaggerate readers' perceptions of benefit or harm. PMID:17060338
Communicating data about the benefits and harms of treatment: a randomized trial.
Woloshin, Steven; Schwartz, Lisa M
2011-07-19
Despite limited evidence, it is often asserted that natural frequencies (for example, 2 in 1000) are the best way to communicate absolute risks. To compare comprehension of treatment benefit and harm when absolute risks are presented as natural frequencies, percents, or both. Parallel-group randomized trial with central allocation and masking of investigators to group assignment, conducted through an Internet survey in September 2009 (ClinicalTrials.gov registration number: NCT00950014). National sample of U.S. adults randomly selected from a professional survey firm's research panel of about 30,000 households. 2944 adults aged 18 years or older (all with complete follow-up). Tables presenting absolute risks in 1 of 5 numeric formats: natural frequency (x in 1000), variable frequency (x in 100, x in 1000, or x in 10,000, as needed to keep the numerator >1), percent, percent plus natural frequency, or percent plus variable frequency. Comprehension as assessed by 18 questions (primary outcome) and judgment of treatment benefit and harm. The average number of comprehension questions answered correctly was lowest in the variable frequency group and highest in the percent group (13.1 vs. 13.8; difference, 0.7 [95% CI, 0.3 to 1.1]). The proportion of participants who "passed" the comprehension test (≥13 correct answers) was lowest in the natural and variable frequency groups and highest in the percent group (68% vs. 73%; difference, 5 percentage points [CI, 0 to 10 percentage points]). The largest format effect was seen for the 2 questions about absolute differences: the proportion correct in the natural frequency versus percent groups was 43% versus 72% (P < 0.001) and 73% versus 87% (P < 0.001). Even when data were presented in the percent format, one third of participants failed the comprehension test. Natural frequencies are not the best format for communicating the absolute benefits and harms of treatment. The more succinct percent format resulted in better comprehension: Comprehension was slightly better overall and notably better for absolute differences. Attorney General Consumer and Prescriber Education grant program, the Robert Wood Johnson Pioneer Program, and the National Cancer Institute.
Auvinen, Anssi; Moss, Sue M; Tammela, Teuvo L J; Taari, Kimmo; Roobol, Monique J; Schröder, Fritz H; Bangma, Chris H; Carlsson, Sigrid; Aus, Gunnar; Zappa, Marco; Puliti, Donella; Denis, Louis J; Nelen, Vera; Kwiatkowski, Maciej; Randazzo, Marco; Paez, Alvaro; Lujan, Marcos; Hugosson, Jonas
2016-01-01
Purpose The balance of benefits and harms in prostate cancer screening has not been sufficiently characterized. We related indicators of mortality reduction and overdetection by center within the European Randomized Study of Screening for Prostate Cancer (ERSPC). Experimental Design We analyzed the absolute mortality reduction expressed as number needed to invite (NNI=1/absolute risk reduction; indicating how many men had to be randomized to the screening arm to avert a prostate cancer death) for screening and the absolute excess of prostate cancer detection as number needed for overdetection (NNO=1/absolute excess incidence; indicating the number of men invited per additional prostate cancer case), and compared their relationship across the seven ERSPC centers. Results Both absolute mortality reduction (NNI) and absolute overdetection (NNO) varied widely between the centers: NNI 200-7000 and NNO 16-69. The extent of overdiagnosis and the mortality reduction were closely associated (correlation coefficient r=0.76, weighted linear regression coefficient β=33, 95% CI 5-62, R²=0.72). For an averted prostate cancer death at 13 years of follow-up, 12-36 excess cases had to be detected in various centers. Conclusions The differences between the ERSPC centers likely reflect variations in prostate cancer incidence and mortality, as well as in screening protocol and performance. The strong interrelation between the benefits and harms suggests that efforts to maximize the mortality effect are bound to increase overdiagnosis, and might be improved by focusing on high-risk populations. The optimal balance between screening intensity and risk of overdiagnosis remains unclear. PMID:26289069
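NNI and NNO are reciprocals of the absolute benefit and absolute harm, and their ratio gives the number of overdetected cases per death averted. A small sketch with illustrative numbers that fall within the ranges reported above (not the actual per-centre ERSPC estimates):

```python
# Illustrative benefit-harm trade-off for a screening programme.
# The inputs are made up, chosen only to lie within the ranges quoted above.

arr = 1 / 1000            # absolute risk reduction in prostate cancer mortality
excess_incidence = 1 / 30 # absolute excess incidence from overdetection

nni = 1 / arr                  # number needed to invite to avert one death
nno = 1 / excess_incidence     # number needed to invite for one overdetected case

excess_cases_per_death_averted = nni / nno
print(f"NNI={nni:.0f}, NNO={nno:.0f}, "
      f"excess cases per averted death={excess_cases_per_death_averted:.0f}")
```

With these placeholder values the programme overdetects roughly 33 cases for every death averted, which is the kind of benefit-harm ratio the abstract summarises per centre.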
The gender- and age-specific 10-year and lifetime absolute fracture risk in Tromsø, Norway.
Ahmed, Luai A; Schirmer, Henrik; Bjørnerem, Ashild; Emaus, Nina; Jørgensen, Lone; Størmer, Jan; Joakimsen, Ragnar M
2009-01-01
The aim of this study was to estimate the gender- and age-specific 10-year and lifetime absolute risks of non-vertebral and osteoporotic (including hip, distal forearm and proximal humerus) fractures in a large cohort of men and women. This is a population-based 10-year follow-up study of 26,891 subjects aged 25 years and older in Tromsø, Norway. All non-vertebral fractures were registered from 1995 throughout 2004 by computerized search in radiographic archives. Absolute risks were estimated by the life-table method, taking into account the competing risk of death. The absolute fracture risk at each year of age was estimated for the next 10 years (10-year risk) or up to the age of 90 years (lifetime risk). The estimated 10-year absolute risk of all non-vertebral fractures was higher in men than in women before, but not after, the age of 45 years. The 10-year absolute risk for non-vertebral and osteoporotic fractures exceeded 10% in men over 65 and 70 years of age, respectively, and in women over 45 and 50 years of age. The 10-year absolute risks of hip fractures at the age of 65 and 80 years were 4.2% and 18.6% in men, and 9.0% and 24.0% in women, respectively. The risk estimates for distal forearm and proximal humerus fractures were under 5% in men and 13% in women. The estimated lifetime risks for all fracture locations were higher in women than in men at all ages. At the age of 50 years, the risks were 38.1% and 24.8% in men and 67.4% and 55.0% in women for all non-vertebral and osteoporotic fractures, respectively. The estimated gender- and age-specific 10-year and lifetime absolute fracture risks were higher in Tromsø than in other populations. The high lifetime fracture risk reflects the increased burden of fractures in this cohort.
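The life-table estimate that "takes into account the competing risk of death" can be sketched as a discrete cumulative-incidence calculation: at each year of age, the conditional fracture probability is weighted by the probability of having reached that age alive and fracture-free. The annual probabilities below are placeholders, not the Tromsø estimates:

```python
def cumulative_fracture_risk(fracture_prob, death_prob):
    """Cumulative absolute fracture risk from annual conditional probabilities,
    accounting for the competing risk of death (discrete life-table method)."""
    at_risk = 1.0            # alive and fracture-free at the start of the horizon
    cumulative_risk = 0.0
    for q_frac, q_death in zip(fracture_prob, death_prob):
        cumulative_risk += at_risk * q_frac
        at_risk *= (1 - q_frac - q_death)  # must avoid both events to remain at risk
    return cumulative_risk

# Placeholder annual probabilities over a 10-year horizon (not study estimates)
fracture_prob = [0.005 + 0.001 * i for i in range(10)]
death_prob = [0.010 + 0.002 * i for i in range(10)]
print(f"10-year absolute fracture risk: {cumulative_fracture_risk(fracture_prob, death_prob):.1%}")
```

Ignoring the competing death probabilities would overstate the absolute risk, which is why life-table or cumulative-incidence methods are preferred at older ages.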
Occupational factors and reproductive outcomes among a cohort of female veterinarians.
Wilkins, J R; Steele, L L
1998-07-01
To estimate absolute and relative risks of preterm delivery (PTD) and small-for-gestational-age (SGA) births among a cohort of female veterinarians in relation to selected occupational factors, including clinical practice type (CPT). Retrospective cohort survey. 2,997 female graduates from US veterinary colleges between 1970 and 1980. Relevant health and occupational data were collected through a self-administered mail questionnaire with telephone follow-up of nonrespondents. Absolute and relative risks of PTD and SGA births were estimated in relation to maternal CPT at the time of conception and exposure to 13 occupational factors. Attempts were made to control confounding by use of multiple logistic regression analyses. Absolute and relative risks of PTD were highest for veterinarians employed in exclusively equine clinical practice. Although several increased, none of the CPT-specific relative risk estimates were significantly different from the null value of 1. Exposure-specific analyses indicated that occupational involvement with solvents among exclusively small animal practitioners was associated with the highest relative risk of PTD. A small number of SGA births limited information that could be obtained from these analyses. Overall absolute risks of PTD and SGA births among cohort members were much lower in comparison with the general female population. Given the large number of women currently practicing and entering the profession of veterinary medicine, clinical tasks associated with potential reproductive hazards should be approached with heightened awareness and increased caution, especially activities that may involve exposure to solvents.
Population-based absolute risk estimation with survey data
Kovalchik, Stephanie A.; Pfeiffer, Ruth M.
2013-01-01
Absolute risk is the probability that a cause-specific event occurs in a given time interval in the presence of competing events. We present methods to estimate population-based absolute risk from a complex survey cohort that can accommodate multiple exposure-specific competing risks. The hazard function for each event type consists of an individualized relative risk multiplied by a baseline hazard function, which is modeled nonparametrically or parametrically with a piecewise exponential model. An influence method is used to derive a Taylor-linearized variance estimate for the absolute risk estimates. We introduce novel measures of the cause-specific influences that can guide modeling choices for the competing event components of the model. To illustrate our methodology, we build and validate cause-specific absolute risk models for cardiovascular and cancer deaths using data from the National Health and Nutrition Examination Survey. Our applications demonstrate the usefulness of survey-based risk prediction models for predicting health outcomes and quantifying the potential impact of disease prevention programs at the population level. PMID:23686614
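Under the piecewise exponential version of the model, the cause-specific absolute risk integrates the cause-of-interest hazard against the probability of having escaped all competing events, with each cause's baseline hazard scaled by an individualized relative risk. A rough numerical sketch under assumed piecewise-constant hazards (not the NHANES-based models from the paper):

```python
import numpy as np

def absolute_risk(interval_lengths, baseline_hazards, rel_risks):
    """Absolute risk of cause 1 over the full horizon, in the presence of competing causes.

    interval_lengths : widths (years) of the piecewise-constant hazard intervals
    baseline_hazards : array of shape (n_causes, n_intervals) of baseline hazard rates
    rel_risks        : individualized relative risk for each cause, length n_causes
    """
    hazards = baseline_hazards * np.asarray(rel_risks)[:, None]  # individualized hazards
    total = hazards.sum(axis=0)                                  # all-cause hazard per interval
    cum_before = np.concatenate([[0.0], np.cumsum(total * interval_lengths)[:-1]])
    surv_start = np.exp(-cum_before)                             # survival to interval start
    # Probability that cause 1 occurs first within each interval, summed over intervals
    p_interval = surv_start * (hazards[0] / total) * (1 - np.exp(-total * interval_lengths))
    return p_interval.sum()

intervals = np.full(10, 1.0)                       # ten 1-year intervals
baseline = np.array([[0.004] * 10,                 # cause 1, e.g. cardiovascular death
                     [0.010] * 10])                # cause 2, all competing causes combined
print(f"10-year absolute risk of cause 1: {absolute_risk(intervals, baseline, [1.8, 1.0]):.2%}")
```

The survey-design aspects of the paper (weights and the influence-function variance) are omitted here; this only illustrates the competing-risk integral itself.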
Adegbija, Odewumi; Hoy, Wendy E; Wang, Zhiqiang
2015-11-13
There have been suggestions that currently recommended waist circumference (WC) cut-off points for Australians of European origin may not be applicable to Aboriginal people, who have different body habitus profiles. We aimed to generate equivalent WC values that correspond to body mass index (BMI) points for identifying absolute cardiovascular disease (CVD) risks. Prospective cohort study. An Aboriginal community in Australia's Northern Territory. From 1992 to 1998, 920 adults without CVD, with age, WC and BMI measurements, were followed up for up to 20 years. Incident CVD, coronary artery disease (CAD) and heart failure (HF) events during the follow-up period were ascertained from hospitalisation data. We generated WC values with 10-year absolute risks for the development of CVD equivalent to those of BMI values (20-34 kg/m²), using the Weibull accelerated failure-time model. There were 211 incident cases of CVD over 13,669 person-years of follow-up. At the average age of 35 years, WC values with absolute CVD, CAD and HF risks equivalent to a BMI of 25 kg/m² were 91.5, 91.8 and 91.7 cm, respectively, for males, and corresponding WC values were 92.5, 92.7 and 93 cm for females. WC values with absolute CVD, CAD and HF risks equal to those of a BMI of 30 kg/m² were 101.7, 103.1 and 102.6 cm, respectively, for males, and corresponding values were 99.2, 101.6 and 101.5 cm for females. The association between WC and CVD did not depend on gender (p=0.54). WC ranging from 91 to 93 cm was equivalent to a BMI of 25 kg/m² for overweight, and 99 to 103 cm was equivalent to a BMI of 30 kg/m² for obesity, in terms of predicting 10-year absolute CVD risk. Replicating the absolute risk method in other Aboriginal communities will further validate the WC values generated for future development of WC cut-off points for Aboriginal people. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/
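The "equivalent WC" idea can be made concrete: compute the 10-year absolute risk implied by a given BMI under a fitted accelerated failure-time model, then solve for the WC whose own model gives the same risk. A schematic sketch with made-up Weibull parameters (the coefficients and shape below are placeholders, not the study's fitted values):

```python
import numpy as np
from scipy.optimize import brentq

def ten_year_risk(x, beta0, beta1, shape, t=10.0):
    """10-year event probability under a Weibull accelerated failure-time model:
    S(t|x) = exp(-(t/scale)^shape), with log(scale) = beta0 + beta1 * x."""
    scale = np.exp(beta0 + beta1 * x)
    return 1.0 - np.exp(-(t / scale) ** shape)

# Placeholder parameters for illustration only (higher BMI/WC -> shorter time to CVD)
bmi_params = dict(beta0=5.0, beta1=-0.06, shape=1.5)
wc_params = dict(beta0=6.0, beta1=-0.025, shape=1.5)

target = ten_year_risk(25.0, **bmi_params)  # 10-year absolute risk at BMI 25 kg/m^2
equivalent_wc = brentq(lambda wc: ten_year_risk(wc, **wc_params) - target, 60.0, 140.0)
print(f"10-year risk at BMI 25: {target:.1%}; risk-equivalent WC ~ {equivalent_wc:.1f} cm")
```

With these toy parameters the equivalent WC comes out near 100 cm; the study's values (91-93 cm for BMI 25, 99-103 cm for BMI 30) come from models fitted to the cohort data.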
Stewart, Simon; Carrington, Melinda J; Swemmer, Carla H; Anderson, Craig; Kurstjens, Nicol P; Amerena, John; Brown, Alex; Burrell, Louise M; de Looze, Ferdinandus J; Harris, Mark; Hung, Joseph; Krum, Henry; Nelson, Mark; Schlaich, Markus; Stocks, Nigel P; Jennings, Garry L
2012-11-20
To determine the effectiveness of intensive structured care to optimise blood pressure control based on individual absolute risk targets in primary care. Pragmatic multicentre randomised controlled trial. General practices throughout Australia, except Northern Territory, 2009-11. Of 2185 patients from 119 general practices who were eligible for drug treatment for hypertension according to national guidelines 416 (19.0%) achieved their individual blood pressure target during a 28 day run-in period of monotherapy. After exclusions, 1562 participants not at target blood pressure (systolic 150 (SD 17) mm Hg, diastolic 88 (SD 11) mm Hg) were randomised (1:2 ratio) to usual care (n=524) or the intervention (n=1038). Computer assisted clinical profiling and risk target setting (all participants) with intensified follow-up and stepwise drug titration (initial angiotensin receptor blocker monotherapy or two forms of combination therapy using angiotensin receptor blockers) for those randomised to the intervention. The control group received usual care. The primary outcome was individual blood pressure target achieved at 26 weeks. Secondary outcomes were change in mean sitting systolic and diastolic blood pressure, absolute risk for cardiovascular disease within five years based on the Framingham risk score, and proportion and rate of adverse events. On an intention to treat basis, there was an 8.8% absolute difference in individual blood pressure target achieved at 26 weeks in favour of the intervention group compared with usual care group (358/988 (36.2%) v 138/504 (27.4%)): adjusted relative risk 1.28 (95% confidence interval 1.10 to 1.49, P=0.0013). There was also a 9.5% absolute difference in favour of the intervention group for achieving the classic blood pressure target of ≤ 140/90 mm Hg (627/988 (63.5%) v 272/504 (54.0%)): adjusted relative risk 1.18 (1.07 to 1.29, P<0.001). The intervention group achieved a mean adjusted reduction in systolic blood pressure of 13.2 mm Hg (95% confidence interval -12.3 to -14.2 mm Hg) and diastolic blood pressure of 7.7 mm Hg (-7.1 to -8.3 mm Hg) v 10.1 mm Hg (-8.8 to 11.3 mm Hg) and 5.5 mm Hg (-4.7 to -6.2 mm Hg) in the usual care group (P<0.001). Among 1141 participants in whom five year absolute cardiovascular risk scores were calculated from baseline to the 26 week follow-up, the reduction in risk scores was greater in the intervention group than usual care group (14.7% (SD 9.3%) to 10.9% (SD 8.0%); difference -3.7% (SD 4.5%) and 15.0% (SD 10.1%) to 12.4% (SD 9.4%); -2.6% (SD 4.5%): adjusted mean difference -1.13% (95% confidence interval -0.69% to -1.63%; P<0.001). Owing to adverse events 82 (7.9%) participants in the intervention group and 10 (1.9%) in the usual care group had their drug treatment modified. In a primary care setting intensive structured care resulted in higher levels of blood pressure control, with clinically lower blood pressure and absolute risk of future cardiovascular events overall and with more people achieving their target blood pressure. An important gap in treatment remains though and applying intensive management and achieving currently advocated risk based blood pressure targets is challenging.
Alendronate for fracture prevention in postmenopause.
Holder, Kathryn K; Kerley, Sara Shelton
2008-09-01
Osteoporosis is an abnormal reduction in bone mass and bone deterioration leading to increased fracture risk. Alendronate (Fosamax) belongs to the bisphosphonate class of drugs, which act to inhibit bone resorption by interfering with the activity of osteoclasts. To assess the effectiveness of alendronate in the primary and secondary prevention of osteoporotic fractures in postmenopausal women. The authors searched Central, Medline, and EMBASE for relevant randomized controlled trials published from 1966 to 2007. The authors undertook study selection and data abstraction in duplicate. The authors performed meta-analysis of fracture outcomes using relative risks, and a relative change greater than 15 percent was considered clinically important. The authors assessed study quality through reporting of allocation concealment, blinding, and withdrawals. Eleven trials representing 12,068 women were included in the review. Relative and absolute risk reductions for the 10-mg dose were as follows. For vertebral fractures, a 45 percent relative risk reduction was found (relative risk [RR] = 0.55; 95% confidence interval [CI], 0.45 to 0.67). This was significant for primary prevention, with a 45 percent relative risk reduction (RR = 0.55; 95% CI, 0.38 to 0.80) and 2 percent absolute risk reduction; and for secondary prevention, with 45 percent relative risk reduction (RR = 0.55; 95% CI, 0.43 to 0.69) and 6 percent absolute risk reduction. For nonvertebral fractures, a 16 percent relative risk reduction was found (RR = 0.84; 95% CI, 0.74 to 0.94). This was significant for secondary prevention, with a 23 percent relative risk reduction (RR = 0.77; 95% CI, 0.64 to 0.92) and a 2 percent absolute risk reduction, but not for primary prevention (RR = 0.89; 95% CI, 0.76 to 1.04). There was a 40 percent relative risk reduction in hip fractures (RR = 0.60; 95% CI, 0.40 to 0.92), but only secondary prevention was significant, with a 53 percent relative risk reduction (RR = 0.47; 95% CI, 0.26 to 0.85) and a 1 percent absolute risk reduction. The only significance found for wrist fractures was in secondary prevention, with a 50 percent relative risk reduction (RR = 0.50; 95% CI, 0.34 to 0.73) and a 2 percent absolute risk reduction. For adverse events, the authors found no statistically significant difference in any included study. However, observational data raise concerns about potential risk for upper gastrointestinal injury and, less commonly, osteonecrosis of the jaw. At 10 mg of alendronate per day, clinically important and statistically significant reductions in vertebral, nonvertebral, hip, and wrist fractures were observed for secondary prevention. The authors found no statistically significant results for primary prevention, with the exception of vertebral fractures, for which the reduction was clinically important.
Risk communication methods in hip fracture prevention: a randomised trial in primary care.
Hudson, Ben; Toop, Les; Mangin, Dee; Pearson, John
2011-08-01
Treatment acceptance by patients is influenced by the way treatment effects are presented. Presentation of benefits using relative risk increases treatment acceptance compared to the use of absolute risk. It is not known whether this effect is modified by prior presentation of a patient's individualised risk estimate or how presentation of treatment harms by relative or absolute risk affects acceptance. To compare acceptance of a hypothetical treatment to prevent hip fracture after presentation of the treatment's benefit in relative or absolute terms in the context of a personal fracture risk estimate, and to reassess acceptance following subsequent presentation of harm in relative or absolute terms. Randomised controlled trial of patients recruited from 10 GPs' lists in Christchurch, New Zealand. Women aged ≥ 50 years were invited to participate. Participants were given a personal 10-year hip fracture risk estimate and randomised to receive information on a hypothetical treatment's benefit and harm in relative or absolute terms. Of the 1140 women invited to participate, 393 (34%) took part. Treatment acceptance was greater following presentation of benefit using absolute terms than relative terms after adjustment for age, education, previous osteoporosis diagnosis, and self-reported risk (OR 1.73, 95% confidence interval [CI] = 1.10 to 2.73, P = 0.018). Presentation of the treatment's harmful effect in relative terms led to a greater proportion of participants declining treatment than did presentation in absolute terms (OR 4.89, 95% CI = 2.3 to 11.0, P<0.001). Presentation of treatment benefit and harm using absolute risk estimates led to greater treatment acceptance than presentation of the same information in relative terms.
Debiasing comparative optimism and increasing worry for health outcomes.
Rose, Jason P
2012-11-01
Comparative optimism - feeling at less personal risk for negative outcomes than one's peers - has been linked to reduced prevention efforts. This study examined a novel debiasing technique aimed at simultaneously reducing both indirectly and directly measured comparative optimism. Before providing direct comparative estimates, participants provided absolute self and peer estimates in a joint format (same computer screen) or a separate format (different computer screens). Relative to the separate format condition, participants in the joint format condition showed (1) lower comparative optimism in absolute/indirect measures, (2) lower direct comparative optimism, and (3) heightened worry. Implications for risk perception screening are discussed.
Realized volatility and absolute return volatility: a comparison indicating market risk.
Zheng, Zeyu; Qiao, Zhi; Takaishi, Tetsuya; Stanley, H Eugene; Li, Baowen
2014-01-01
Measuring volatility in financial markets is a primary challenge in the theory and practice of risk management and is essential when developing investment strategies. Although the vast literature on the topic describes many different models, two nonparametric measurements have emerged and received wide use over the past decade: realized volatility and absolute return volatility. The former is strongly favored in the financial sector and the latter by econophysicists. We examine the memory and clustering features of these two methods and find that both enable strong predictions. We compare the two in detail and find that although realized volatility has a better short-term effect that allows predictions of near-future market behavior, absolute return volatility is easier to calculate and, as a risk indicator, has approximately the same sensitivity as realized volatility. Our detailed empirical analysis yields valuable guidelines for both researchers and market participants because it provides a significantly clearer comparison of the strengths and weaknesses of the two methods.
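The two measures being compared are straightforward to compute from price data: realized volatility aggregates squared intraday returns within a day, while absolute return volatility is simply the absolute daily return. A minimal sketch on synthetic prices (not the data analysed in the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

def realized_volatility(intraday_prices):
    """Daily realized volatility: sqrt of the sum of squared intraday log returns."""
    r = np.diff(np.log(intraday_prices))
    return np.sqrt(np.sum(r ** 2))

def absolute_return_volatility(open_price, close_price):
    """Daily absolute return volatility: |log return| over the whole day."""
    return abs(np.log(close_price / open_price))

# Synthetic trading day of 390 one-minute prices following a random walk
prices = 100 * np.exp(np.cumsum(rng.normal(0, 0.0005, size=390)))
print(f"realized volatility:        {realized_volatility(prices):.4f}")
print(f"absolute return volatility: {absolute_return_volatility(prices[0], prices[-1]):.4f}")
```

Realized volatility needs intraday data, which is part of why the simpler absolute return measure is attractive as a day-level risk indicator.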
Realized Volatility and Absolute Return Volatility: A Comparison Indicating Market Risk
Takaishi, Tetsuya; Stanley, H. Eugene; Li, Baowen
2014-01-01
Measuring volatility in financial markets is a primary challenge in the theory and practice of risk management and is essential when developing investment strategies. Although the vast literature on the topic describes many different models, two nonparametric measurements have emerged and received wide use over the past decade: realized volatility and absolute return volatility. The former is strongly favored in the financial sector and the latter by econophysicists. We examine the memory and clustering features of these two methods and find that both enable strong predictions. We compare the two in detail and find that although realized volatility has a better short-term effect that allows predictions of near-future market behavior, absolute return volatility is easier to calculate and, as a risk indicator, has approximately the same sensitivity as realized volatility. Our detailed empirical analysis yields valuable guidelines for both researchers and market participants because it provides a significantly clearer comparison of the strengths and weaknesses of the two methods. PMID:25054439
Tawadrous, Davy; Dixon, Stephanie; Shariff, Salimah Z; Fleet, Jamie; Gandhi, Sonja; Jain, Arsh K; Weir, Matthew A; Gomes, Tara; Garg, Amit X
2014-10-01
Standard doses of histamine-2 receptor antagonists (H2RAs) may induce altered mental status in older adults, especially in those with chronic kidney disease (CKD). A population-based cohort study of older adults who started a new H2RA between 2002 and 2011 was conducted. Ninety percent received the current standard H2RA dose in routine care. There was no significant difference in 27 baseline patient characteristics. The primary outcome was hospitalization with an urgent head computed tomography (CT) scan (a proxy for altered mental status), and the secondary outcome was all-cause mortality, also within 30 days of a new H2RA prescription. Standard vs. low H2RA dose was associated with a higher risk of hospitalization with an urgent head CT scan (0.98% vs. 0.74%, absolute risk difference 0.24% [95% CI 0.11% to 0.36%], relative risk 1.33 [95% CI 1.12 to 1.58]). This risk was not modified by the presence of CKD (interaction P value=0.71). Standard vs. low H2RA dose was associated with a higher risk of mortality (1.07% vs. 0.74%; absolute risk difference 0.34% [95% CI 0.20% to 0.46%], relative risk 1.46 [95% CI 1.23 to 1.73]). Compared to a lower dose, initiation of the current standard dose of H2RA in older adults is associated with a small absolute increase in the 30-day risk of altered mental status (using neuroimaging as a proxy), even in the absence of CKD. This risk may be avoided by initiating older adults on low doses of H2RAs for gastroesophageal reflux disease, and increasing the dose as necessary for symptom control. Copyright © 2014 European Federation of Internal Medicine. Published by Elsevier B.V. All rights reserved.
An Evaluation of Performance-Based Tests Designed to Improve Naval Aviation Selection.
1991-08-01
[Fragmentary record: only scattered tabular values survive. The fragment names the performance-based selection measures evaluated — the Aptitude Qualification Test/Flight Aptitude Rating (AQT/FAR), a Complex Visual information-processing test (CVT), a RISK test, an Absolute Difference Horizontal test (ADHT), and MB and PMT/DLT measures — together with their validity coefficients, mentions that the observed effects were not attributable to differences in age, sex, accession source, college major, prior flight hours, or intelligence, and refers to Table 2 results for the CVT, ADHT, and RISK measures.]
Huang, Jia-Jia; Li, Ya-Jun; Xia, Yi; Wang, Yu; Wei, Wen-Xiao; Zhu, Ying-Jie; Lin, Tong-Yu; Huang, Hui-Qiang; Jiang, Wen-Qi; Li, Zhi-Ming
2013-05-03
Extranodal natural killer/T-cell lymphoma (ENKL) has heterogeneous clinical manifestations and prognosis. This study aims to evaluate the prognostic impact of the absolute monocyte count (AMC) in ENKL, and to provide some immunologically relevant information for better risk stratification in patients with ENKL. Retrospective data from 163 patients newly diagnosed with ENKL were analyzed. The absolute monocyte count (AMC) at diagnosis was analyzed as a continuous and as a dichotomized variable. Independent prognostic factors of survival were determined by Cox regression analysis. The AMC at diagnosis was related to overall survival (OS) and progression-free survival (PFS) in patients with ENKL. Multivariate analysis identified AMC as an independent prognostic factor of survival, independent of the International Prognostic Index (IPI) and the Korean prognostic index (KPI). The prognostic index incorporating AMC and absolute lymphocyte count (ALC), another surrogate factor of immune status, could be used to stratify all 163 patients with ENKL into different prognostic groups. For patients who received chemotherapy followed by radiotherapy (102 cases), the three AMC/ALC index categories identified patients with significantly different survivals. When superimposed on IPI or KPI categories, the AMC/ALC index was better able to identify high-risk patients in the low-risk IPI or KPI category. The baseline peripheral monocyte count is shown to be an effective prognostic indicator of survival in ENKL patients. The prognostic index related to the tumor microenvironment might be helpful to identify high-risk patients with ENKL.
Graphs to estimate an individualized risk of breast cancer.
Benichou, J; Gail, M H; Mulvihill, J J
1996-01-01
Clinicians who counsel women about their risk for developing breast cancer need a rapid method to estimate individualized risk (absolute risk), as well as the confidence limits around that point estimate. The Breast Cancer Detection Demonstration Project (BCDDP) model (sometimes called the Gail model) assumes no genetic model and simultaneously incorporates five risk factors, but involves cumbersome calculations and interpolations. This report provides graphs to estimate the absolute risk of breast cancer from the BCDDP model. The BCDDP recruited 280,000 women from 1973 to 1980 who were monitored for 5 years. From this cohort, 2,852 white women developed breast cancer and 3,146 controls were selected, all with complete risk-factor information. The BCDDP model, previously developed from these data, was used to prepare graphs that relate a specific summary relative-risk estimate to the absolute risk of developing breast cancer over intervals of 10, 20, and 30 years. Once a summary relative risk is calculated, the appropriate graph is chosen that shows the 10-, 20-, or 30-year absolute risk of developing breast cancer. A separate graph gives the 95% confidence limits around the point estimate of absolute risk. Once a clinician rules out a single gene trait that predisposes to breast cancer and elicits information on age and four risk factors, the tables and figures permit an estimation of a woman's absolute risk of developing breast cancer in the next three decades. These results are intended to be applied to women who undergo regular screening. They should be used only in a formal counseling program to maximize a woman's understanding of the estimates and the proper use of them.
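The conversion the graphs perform, from a summary relative risk to an absolute risk over a counselling horizon, follows the usual cause-specific form. Written generically (the BCDDP model additionally uses age-specific baseline hazards and an attributable-risk adjustment, which this sketch omits):

```latex
AR(a,\,a+\tau) \;=\; \int_{a}^{a+\tau} r\,h_{1}(t)\,
\exp\!\left(-\int_{a}^{t}\bigl[\,r\,h_{1}(u)+h_{2}(u)\,\bigr]\,du\right) dt,
```

where r is the woman's summary relative risk, h1 the baseline breast cancer hazard, and h2 the competing mortality hazard; the graphs tabulate this mapping, and its confidence limits, for 10-, 20-, and 30-year horizons.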
Murray, L; Sethugavalar, B; Robertshaw, H; Bayman, E; Thomas, E; Gilson, D; Prestwich, R J D
2015-07-01
Recent radiotherapy guidelines for lymphoma have included involved site radiotherapy (ISRT), involved node radiotherapy (INRT) and irradiation of residual volume after full-course chemotherapy. In the absence of late toxicity data, we aim to compare organ at risk (OAR) dose-metrics and calculated second malignancy risks. Fifteen consecutive patients who had received mediastinal radiotherapy were included. Four radiotherapy plans were generated for each patient using a parallel pair photon technique: (i) involved field radiotherapy (IFRT), (ii) ISRT, (iii) INRT, (iv) residual post-chemotherapy volume. The radiotherapy dose was 30 Gy in 15 fractions. The OARs evaluated were: breasts, lungs, thyroid, heart, oesophagus. Relative and absolute second malignancy rates were estimated using the concept of organ equivalent dose. Significance was defined as P < 0.005. Compared with ISRT, IFRT significantly increased doses to lung, thyroid, heart and oesophagus, whereas INRT and residual volume techniques significantly reduced doses to all OARs. The relative risks of second cancers were significantly higher with IFRT compared with ISRT for lung, breast and thyroid; INRT and residual volume resulted in significantly lower relative risks compared with ISRT for lung, breast and thyroid. The median excess absolute risks of second cancers were consistently lowest for the residual technique and highest for IFRT in terms of thyroid, lung and breast cancers. The risk of oesophageal cancer was similar for all four techniques. Overall, the absolute risk of second cancers was very similar for ISRT and INRT. Decreasing treatment volumes from IFRT to ISRT, INRT or residual volume reduces radiation exposure to OARs. Second malignancy modelling suggests that this reduction in treatment volumes will lead to a reduction in absolute excess second malignancy. Little difference was observed in second malignancy risks between ISRT and INRT, supporting the use of ISRT in the absence of a pre-chemotherapy positron emission tomography scan in the radiotherapy treatment position. Copyright © 2015 The Royal College of Radiologists. Published by Elsevier Ltd. All rights reserved.
Corradini, Stefanie; Ballhausen, Hendrik; Weingandt, Helmut; Freislederer, Philipp; Schönecker, Stephan; Niyazi, Maximilian; Simonetto, Cristoforo; Eidemüller, Markus; Ganswindt, Ute; Belka, Claus
2018-03-01
Modern breast cancer radiotherapy techniques, such as respiratory-gated radiotherapy in deep-inspiration breath-hold (DIBH) or volumetric-modulated arc radiotherapy (VMAT) have been shown to reduce the high dose exposure of the heart in left-sided breast cancer. The aim of the present study was to comparatively estimate the excess relative and absolute risks of radiation-induced secondary lung cancer and ischemic heart disease for different modern radiotherapy techniques. Four different treatment plans were generated for ten computed tomography data sets of patients with left-sided breast cancer, using either three-dimensional conformal radiotherapy (3D-CRT) or VMAT, in free-breathing (FB) or DIBH. Dose-volume histograms were used for organ equivalent dose (OED) calculations using linear, linear-exponential, and plateau models for the lung. A linear model was applied to estimate the long-term risk of ischemic heart disease as motivated by epidemiologic data. Excess relative risk (ERR) and 10-year excess absolute risk (EAR) for radiation-induced secondary lung cancer and ischemic heart disease were estimated for different representative baseline risks. The DIBH maneuver resulted in a significant reduction of the ERR and estimated 10-year excess absolute risk for major coronary events compared to FB in 3D-CRT plans (p = 0.04). In VMAT plans, the mean predicted risk reduction through DIBH was less pronounced and not statistically significant (p = 0.44). The risk of radiation-induced secondary lung cancer was mainly influenced by the radiotherapy technique, with no beneficial effect through DIBH. VMAT plans correlated with an increase in 10-year EAR for radiation-induced lung cancer as compared to 3D-CRT plans (DIBH p = 0.007; FB p = 0.005, respectively). However, the EARs were affected more strongly by nonradiation-associated risk factors, such as smoking, as compared to the choice of treatment technique. The results indicate that 3D-CRT plans in DIBH pose the lowest risk for both major coronary events and secondary lung cancer.
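Organ equivalent dose (OED) condenses a dose-volume histogram into a single dose that, under a chosen dose-response model, carries the same second-cancer risk as the inhomogeneous distribution; excess relative and absolute risks then scale with the OED. A compact sketch of the three response models named above, applied to a toy differential DVH (the model parameters and DVH are illustrative, not the values used in either study):

```python
import numpy as np

def oed(dose_gy, vol_frac, model="linear", alpha=0.05, delta=0.5):
    """Organ equivalent dose from a differential DVH (volume fractions sum to 1)."""
    dose_gy, vol_frac = np.asarray(dose_gy, float), np.asarray(vol_frac, float)
    if model == "linear":
        response = dose_gy
    elif model == "linear-exponential":
        response = dose_gy * np.exp(-alpha * dose_gy)      # cell kill attenuates risk at high dose
    elif model == "plateau":
        response = (1 - np.exp(-delta * dose_gy)) / delta  # risk saturates at high dose
    else:
        raise ValueError(model)
    return float(np.sum(vol_frac * response))

# Toy differential DVH: 60% of the organ at 2 Gy, 30% at 10 Gy, 10% at 25 Gy
dose, vol = [2, 10, 25], [0.6, 0.3, 0.1]
for m in ("linear", "linear-exponential", "plateau"):
    print(f"{m:>18}: OED = {oed(dose, vol, m):.2f} Gy")
# Excess absolute risk is then proportional to OED for a given organ and baseline risk.
```

Because the excess absolute risk also multiplies a baseline incidence, nonradiation factors such as smoking can dominate the estimate, which is the point made in the last sentence of the abstract.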
Energy expenditure and intake during puberty in healthy nonobese adolescents: a systematic review.
Cheng, Hoi Lun; Amatoury, Mazen; Steinbeck, Katharine
2016-10-01
Puberty is a time of rapid growth and changing energy requirements and is a risk period for obesity. There is little high-quality evidence on the pubertal alterations of energy expenditure and intake, and this has limited our understanding of energy balance during this important life stage. The purpose of this study was to summarize existing evidence on pubertal energy expenditure and intake in healthy nonobese adolescents. Studies were identified through CINAHL, the Cochrane Library, Embase, MEDLINE, and Web of Science databases up to August 2015. Articles presenting objectively measured data for basal or resting metabolic rate (BMR/RMR), total daily energy expenditure (TDEE), and/or energy intake (EI) for ≥2 categories of puberty were included. Relevant data adjusted for fat-free mass (FFM) also were extracted. Data were dichotomized into prepubertal and pubertal groups and compared through the use of standardized mean differences (SMDs). Heterogeneous study methodologies precluded meta-analysis. The search netted 6770 articles, with 12 included for review. From these, 6 of 9 studies supported significantly higher absolute BMR/RMR during puberty (SMD: 1.10-5.93), and all of the studies favored significantly higher absolute TDEE during puberty (SMD: 0.46-9.55). These corresponded to a 12% difference and an 18% difference in absolute BMR/RMR and TDEE, respectively. Results adjusted for FFM were equivocal, with 3 studies favoring higher (1 significantly) and 3 favoring significantly lower adjusted BMR/RMR during puberty. Only 1 study reported EI, showing 41% and 25% greater absolute intakes in pubertal males and females, respectively. These differences were not significant after adjustment for FFM. Reasonably consistent evidence exists to support higher absolute BMR/RMR and TDEE in pubertal than in prepubertal adolescents. Differences are largely accounted for by FFM, among other potential factors such as growth- and puberty-related hormones. This review argues for further research into hormonal influences on pubertal energy balance and subsequent effects on obesity risk. © 2016 American Society for Nutrition.
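The standardized mean difference (SMD) used to compare prepubertal and pubertal groups divides the raw difference in means by a pooled standard deviation, so energy-expenditure results reported in different units or populations land on a common scale. A minimal sketch with made-up group summaries (not values from the reviewed studies):

```python
import math

def standardized_mean_difference(m1, sd1, n1, m2, sd2, n2):
    """Cohen's d: (mean2 - mean1) divided by the pooled standard deviation."""
    pooled_sd = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    return (m2 - m1) / pooled_sd

# Made-up absolute TDEE summaries (kcal/d) for a prepubertal vs a pubertal group
print(f"SMD = {standardized_mean_difference(1900, 250, 30, 2350, 300, 28):.2f}")
```

Small-sample corrections (Hedges' g) and FFM adjustment would modify the estimate but not the basic construction.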
Gierach, Gretchen L.; Geller, Berta M.; Shepherd, John A.; Patel, Deesha A.; Vacek, Pamela M.; Weaver, Donald L.; Chicoine, Rachael E.; Pfeiffer, Ruth M.; Fan, Bo; Mahmoudzadeh, Amir Pasha; Wang, Jeff; Johnson, Jason M.; Herschorn, Sally D.; Brinton, Louise A.; Sherman, Mark E.
2014-01-01
Background Mammographic density (MD), the area of non-fatty appearing tissue divided by total breast area, is a strong breast cancer risk factor. Most MD analyses have employed visual categorizations or computer-assisted quantification, which ignore breast thickness. We explored MD volume and area, using a volumetric approach previously validated as predictive of breast cancer risk, in relation to risk factors among women undergoing breast biopsy. Methods Among 413 primarily white women, ages 40–65, undergoing diagnostic breast biopsies between 2007–2010 at an academic facility in Vermont, MD volume (cm3) was quantified in cranio-caudal views of the breast contralateral to the biopsy target using a density phantom, while MD area (cm2) was measured on the same digital mammograms using thresholding software. Risk factor associations with continuous MD measurements were evaluated using linear regression. Results Percent MD volume and area were correlated (r=0.81) and strongly and inversely associated with age, body mass index (BMI), and menopause. Both measures were inversely associated with smoking and positively associated with breast biopsy history. Absolute MD measures were correlated (r=0.46) and inversely related to age and menopause. Whereas absolute dense area was inversely associated with BMI, absolute dense volume was positively associated. Conclusions Volume and area MD measures exhibit some overlap in risk factor associations, but divergence as well, particularly for BMI. Impact Findings suggest that volume and area density measures differ in subsets of women; notably, among obese women, absolute density was higher with volumetric methods, suggesting that breast cancer risk assessments may vary for these techniques. PMID:25139935
Kent, David M; Nelson, Jason; Dahabreh, Issa J; Rothwell, Peter M; Altman, Douglas G; Hayward, Rodney A
2016-01-01
Background: Risk of the outcome is a mathematical determinant of the absolute treatment benefit of an intervention, yet this can vary substantially within a trial population, complicating the interpretation of trial results. Methods: We developed risk models using Cox or logistic regression on a set of large publicly available randomized controlled trials (RCTs). We evaluated risk heterogeneity using the extreme quartile risk ratio (EQRR, the ratio of outcome rates in the highest risk quartile to that in the lowest) and skewness using the median to mean risk ratio (MMRR, the ratio of risk in the median risk patient to the average). We also examined heterogeneity of treatment effects (HTE) across risk strata. Results: We describe 39 analyses using data from 32 large trials, with event rates across studies ranging from 3% to 63% (median = 15%, 25th–75th percentile = 9–29%). C-statistics of risk models ranged from 0.59 to 0.89 (median = 0.70, 25th–75th percentile = 0.65–0.71). The EQRR ranged from 1.8 to 50.7 (median = 4.3, 25th–75th percentile = 3.0–6.1). The MMRR ranged from 0.4 to 1.0 (median = 0.86, 25th–75th percentile = 0.80–0.92). EQRRs were predictably higher and MMRRs predictably lower as the c-statistic increased or the overall outcome incidence decreased. Among 18 comparisons with a significant overall treatment effect, there was a significant interaction between treatment and baseline risk on the proportional scale in only one. The difference in the absolute risk reduction between extreme risk quartiles ranged from −3.2 to 28.3% (median = 5.1%; 25th–75th percentile = 0.3–10.9). Conclusions: There is typically substantial variation in outcome risk in clinical trials, commonly leading to clinically significant differences in absolute treatment effects. Most patients have outcome risks lower than the trial average reflected in the summary result. Risk-stratified trial analyses are feasible and may be clinically informative, particularly when the outcome is predictable and uncommon. PMID:27375287
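Both heterogeneity summaries come straight from the distribution of model-predicted risks in a trial population. A minimal sketch on simulated predicted risks (the paper computes EQRR from observed outcome rates within predicted-risk quartiles; the quartile means of predicted risk stand in for those here):

```python
import numpy as np

rng = np.random.default_rng(1)

def eqrr(pred_risk):
    """Extreme quartile risk ratio: mean risk in the top quartile / bottom quartile."""
    q1, q3 = np.quantile(pred_risk, [0.25, 0.75])
    return pred_risk[pred_risk >= q3].mean() / pred_risk[pred_risk <= q1].mean()

def mmrr(pred_risk):
    """Median-to-mean risk ratio: risk of the median patient / average risk."""
    return np.median(pred_risk) / pred_risk.mean()

# Right-skewed predicted risks, as is typical when the outcome is uncommon
risk = rng.lognormal(mean=np.log(0.10), sigma=0.8, size=5000).clip(0, 1)
print(f"EQRR = {eqrr(risk):.1f}, MMRR = {mmrr(risk):.2f}")
```

A right-skewed risk distribution gives an MMRR below 1, which is exactly the situation in which most patients face a lower risk, and hence a smaller absolute benefit, than the trial-average result suggests.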
NASA Astrophysics Data System (ADS)
Svitlov, S. M.
2010-06-01
A recent paper (Baumann et al 2009 Metrologia 46 178-86) presents a method to evaluate the free-fall acceleration at a desired point in space, as required for the watt balance experiment. The claimed uncertainty of their absolute gravity measurements is supported by two bilateral comparisons using two absolute gravimeters of the same type. This comment discusses the case where absolute gravity measurements are traceable to a key comparison reference value. Such an approach produces a more complete uncertainty budget and reduces the risk of the results of different watt balance experiments not being compatible.
Tramarin, Roberto; Pistuddi, Valeria; Maresca, Luigi; Pavesi, Marco; Castelvecchio, Serenella; Menicanti, Lorenzo; de Vincentiis, Carlo; Ranucci, Marco
2017-05-01
Background Anaemia and iron deficiency are frequent following major surgery. The present study aims to identify the iron deficiency patterns in cardiac surgery patients at their admission to a cardiac rehabilitation programme, and to determine which perioperative risk factor(s) may be associated with functional and absolute iron deficiency. Design This was a retrospective study on prospectively collected data. Methods The patient population included 339 patients. Functional iron deficiency was defined in the presence of transferrin saturation <20% and serum ferritin ≥100 µg/l. Absolute iron deficiency was defined in the presence of serum ferritin values <100 µg/l. Results Functional iron deficiency was found in 62.9% of patients and absolute iron deficiency in 10% of the patients. In a multivariable analysis, absolute iron deficiency was significantly (p = 0.001) associated with mechanical prosthesis mitral valve replacement (odds ratio 5.4, 95% confidence interval 1.9-15) and tissue valve aortic valve replacement (odds ratio 4.5, 95% confidence interval 1.9-11). In mitral valve surgery, mitral repair carried a significantly (p = 0.013) lower risk of absolute iron deficiency (4.4%) than mitral valve replacement with tissue valves (8.3%) or mechanical prostheses (22.5%). Postoperative outcome did not differ between patients with functional iron deficiency and patients without iron deficiency; patients with absolute iron deficiency had a significantly (p = 0.017) longer postoperative hospital stay (median 11 days) than patients without iron deficiency (median 9 days) or with functional iron deficiency (median 8 days). Conclusions Absolute iron deficiency following cardiac surgery is more frequent in heart valve surgery and is associated with a prolonged hospital stay. Routine screening for iron deficiency at admission in the cardiac rehabilitation unit is suggested.
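The two deficiency categories are defined by simple threshold rules on ferritin and transferrin saturation, which makes the screening logic easy to encode. A small sketch of the classification exactly as defined above (thresholds from the abstract; the function name is mine):

```python
def iron_deficiency_category(tsat_percent, ferritin_ug_per_l):
    """Classify iron status using the study's definitions:
    absolute deficiency: ferritin < 100 ug/L;
    functional deficiency: ferritin >= 100 ug/L and transferrin saturation < 20%."""
    if ferritin_ug_per_l < 100:
        return "absolute iron deficiency"
    if tsat_percent < 20:
        return "functional iron deficiency"
    return "no iron deficiency"

print(iron_deficiency_category(tsat_percent=15, ferritin_ug_per_l=250))  # functional
print(iron_deficiency_category(tsat_percent=25, ferritin_ug_per_l=60))   # absolute
```

The rule set mirrors the routine admission screening the authors recommend for the cardiac rehabilitation setting.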
Febuxostat for treating chronic gout
Tayar, Jean H; Lopez-Olivo, Maria Angeles; Suarez-Almazor, Maria E
2014-01-01
Background Gout is the most common inflammatory arthritis in men over 40 years and has an increasing prevalence among postmenopausal women. Lowering serum uric acid levels remains one of the primary goals in the treatment of chronic gout. In clinical trials, febuxostat has been shown to be effective in lowering serum uric acid levels to < 6.0 mg/dL. Objectives To evaluate the benefits and harms of febuxostat for chronic gout. Search methods We searched The Cochrane Library, MEDLINE, EMBASE, and International Pharmaceutical Abstracts from inception to July 2011. The ClinicalTrials.gov website was searched for references to trials of febuxostat. Our search did not include any restrictions. Selection criteria Two authors independently reviewed the search results and disagreements were resolved by discussion. We included any controlled clinical trial or open label trial (OLT) using febuxostat at any dose. Data collection and analysis Data and risk of bias were independently extracted by two authors and summarised in a meta-analysis. Continuous data were expressed as mean difference and dichotomous data as risk ratio (RR). Main results Four randomised trials and two OLTs with 3978 patients were included. Risk of bias differed by outcome, ranging from low to high risk of bias. Included studies failed to report on five to six of the nine outcome measures recommended by OMERACT. Patients taking febuxostat 120 mg and 240 mg reported more frequent gout flares than in the placebo group at 4 to 28 weeks (RR 1.7; 95% CI 1.3 to 2.3, and RR 2.6; 95% CI 1.8 to 3.7 respectively). No statistically significant differences were observed at 40 mg and 80 mg. Compared to placebo, patients on febuxostat 40 mg were 40.1 times more likely to achieve serum uric acid levels < 6.0 mg/dL at 4 weeks (95% CI 2.5 to 639), with an absolute treatment benefit of 56% (95% CI 37% to 71%). For febuxostat 80 mg and 120 mg, patients were 68.9 and 80.7 times more likely to achieve serum uric acid levels < 6.0 mg/dL at their final visit compared to placebo (95% CI 13.8 to 343.9, 95% CI 16.0 to 405.5), respectively; with an absolute treatment benefit of 75% and 87% (95% CI 68 to 80% and 81 to 91%), respectively. Total discontinuation rates were significantly higher in the febuxostat 80 mg group compared to placebo (RR 1.4; 95% CI 1.0 to 2.0, absolute risk increase 11%; 95% CI 3 to 19%). No other differences were observed. When comparing allopurinol to febuxostat at 24 to 52 weeks, the number of gout flares was not significantly different between the two groups, except for febuxostat 240 mg (RR 2.3; 95% CI 1.7 to 3.0). Patients on febuxostat 40 mg showed no statistically significant differences in benefits or harms. Patients on febuxostat 80 mg and 120 mg were 1.8 and 2.2 times more likely to achieve serum uric acid levels < 6.0 mg/dL at their final visit (95% CI 1.6 to 2.2, 95% CI 1.9 to 2.5) with an absolute treatment benefit of 29% and 44% (95% CI 25% to 33%, 95% CI 38% to 50%), respectively, at 24 to 52 weeks. Total discontinuation rates were higher for febuxostat 80 mg and 120 mg compared to allopurinol (RR 1.5; 95% CI 1.2 to 1.8, absolute risk increase 11%; 95% CI 6% to 16%; and RR 2.6; 95% CI 2.0 to 3.3, absolute risk increase 20%; 95% CI 3% to 14%, respectively). Discontinuations due to adverse events were similar across groups. 
Total adverse events were lower for febuxostat 80 mg and 120 mg compared with allopurinol (RR 0.93; 95% CI 0.87 to 0.99, absolute risk increase 6%; 95% CI 0.7% to 11%; and RR 0.90; 95% CI 0.84 to 0.96, absolute risk increase 8%; 95% CI 3% to 13%, respectively). No other relevant differences were noted. After 3 years of follow-up there were no statistically significant differences regarding effectiveness and harms between febuxostat 80 mg or 120 mg and allopurinol groups (adverse event rate per 100 patient-years 227, 216, and 246, respectively). Authors’ conclusions Although the incidence of gout flares requiring treatment may be increased in patients taking febuxostat compared to placebo or allopurinol during early treatment, no such increase in gout flares was observed in the long-term follow-up study when compared to allopurinol. Febuxostat at any dose was shown to be beneficial in achieving serum uric acid levels < 6.0 mg/dL and reducing serum uric acid levels in the period from baseline to final visit when compared to placebo and to allopurinol. However, the grade of evidence ranged from low to high, which indicates that further research is needed. PMID:23152264
Post-neonatal mortality in Norway 1969-95: a cause-specific analysis.
Arntzen, Annett; Samuelsen, Sven Ove; Daltveit, Anne Kjersti; Stoltenberg, Camilla
2006-08-01
We recently reported increased social inequality for post-neonatal death. The aim of the present study was to investigate the association between socioeconomic status and cause-specific post-neonatal death. All 1,483,857 live births recorded in the Medical Birth Registry of Norway from 1969-95 with information on parents' education were included. During the post-neonatal period (from 28 to 364 days of life) 4,464 infants died. Differences between education groups were estimated as risk differences, relative risks, population attributable fractions, and relative index of inequality. The major causes of death were congenital conditions, sudden infant death syndrome (SIDS), and infections. Post-neonatal mortality declined from 3.2/1,000 in the 1970s to 1.9/1,000 in the 1990s, mainly due to reduced mortality from congenital conditions. The absolute risk for SIDS increased by 0.51/1,000 in the same period among infants whose mothers had low education, while it decreased by 0.56/1,000 for those whose mothers had high education. The relative risk for SIDS among infants whose mothers had low education increased from 1.02 in the 1970s to 2.39 in the 1980s and 5.63 in the 1990s. Among infants whose fathers were not recorded in the Birth Registry, the absolute risk of SIDS increased by 0.79/1,000 from the 1970s to the 1990s. Increased social inequality for post-neonatal death was primarily due to increases in the absolute and relative risks of SIDS among infants whose mothers had low education. Social inequality widened during the study period for SIDS and deaths caused by infections.
2013-01-01
Background Extranodal natural killer/T-cell lymphoma (ENKL) has heterogeneous clinical manifestations and prognosis. This study aims to evaluate the prognostic impact of absolute monocyte count (AMC) in ENKL, and provide some immunologically relevant information for better risk stratification in patients with ENKL. Methods Retrospective data from 163 patients newly diagnosed with ENKL were analyzed. The absolute monocyte count (AMC) at diagnosis was analyzed both as a continuous and as a dichotomized variable. Independent prognostic factors of survival were determined by Cox regression analysis. Results The AMC at diagnosis was related to overall survival (OS) and progression-free survival (PFS) in patients with ENKL. Multivariate analysis identified AMC as an independent prognostic factor of survival, independent of the International Prognostic Index (IPI) and the Korean Prognostic Index (KPI). The prognostic index incorporating AMC and absolute lymphocyte count (ALC), another surrogate factor of immune status, could be used to stratify all 163 patients with ENKL into different prognostic groups. For patients who received chemotherapy followed by radiotherapy (102 cases), the three AMC/ALC index categories identified patients with significantly different survival. When superimposed on IPI or KPI categories, the AMC/ALC index was better able to identify high-risk patients in the low-risk IPI or KPI category. Conclusion The baseline peripheral monocyte count is shown to be an effective prognostic indicator of survival in ENKL patients. The prognostic index related to tumor microenvironment might help identify high-risk patients with ENKL. PMID:23638998
Mammographic Density Phenotypes and Risk of Breast Cancer: A Meta-analysis
Graff, Rebecca E.; Ursin, Giske; dos Santos Silva, Isabel; McCormack, Valerie; Baglietto, Laura; Vachon, Celine; Bakker, Marije F.; Giles, Graham G.; Chia, Kee Seng; Czene, Kamila; Eriksson, Louise; Hall, Per; Hartman, Mikael; Warren, Ruth M. L.; Hislop, Greg; Chiarelli, Anna M.; Hopper, John L.; Krishnan, Kavitha; Li, Jingmei; Li, Qing; Pagano, Ian; Rosner, Bernard A.; Wong, Chia Siong; Scott, Christopher; Stone, Jennifer; Maskarinec, Gertraud; Boyd, Norman F.; van Gils, Carla H.
2014-01-01
Background Fibroglandular breast tissue appears dense on mammogram, whereas fat appears nondense. It is unclear whether absolute or percentage dense area more strongly predicts breast cancer risk and whether absolute nondense area is independently associated with risk. Methods We conducted a meta-analysis of 13 case–control studies providing results from logistic regressions for associations between one standard deviation (SD) increments in mammographic density phenotypes and breast cancer risk. We used random-effects models to calculate pooled odds ratios and 95% confidence intervals (CIs). All tests were two-sided with P less than .05 considered to be statistically significant. Results Among premenopausal women (n = 1776 case patients; n = 2834 control subjects), summary odds ratios were 1.37 (95% CI = 1.29 to 1.47) for absolute dense area, 0.78 (95% CI = 0.71 to 0.86) for absolute nondense area, and 1.52 (95% CI = 1.39 to 1.66) for percentage dense area when pooling estimates adjusted for age, body mass index, and parity. Corresponding odds ratios among postmenopausal women (n = 6643 case patients; n = 11187 control subjects) were 1.38 (95% CI = 1.31 to 1.44), 0.79 (95% CI = 0.73 to 0.85), and 1.53 (95% CI = 1.44 to 1.64). After additional adjustment for absolute dense area, associations between absolute nondense area and breast cancer became attenuated or null in several studies and summary odds ratios became 0.82 (95% CI = 0.71 to 0.94; P heterogeneity = .02) for premenopausal and 0.85 (95% CI = 0.75 to 0.96; P heterogeneity < .01) for postmenopausal women. Conclusions The results suggest that percentage dense area is a stronger breast cancer risk factor than absolute dense area. Absolute nondense area was inversely associated with breast cancer risk, but it is unclear whether the association is independent of absolute dense area. PMID:24816206
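The pooling described in this meta-analysis follows the standard random-effects recipe: each study contributes a log odds ratio per one-SD increment together with its variance, a between-study variance is estimated, and the inverse-variance-weighted average is back-transformed to the OR scale. The sketch below illustrates that recipe with the DerSimonian-Laird estimator; the per-study numbers are invented for illustration and are not the estimates pooled in the article.

```python
import numpy as np

def dersimonian_laird(or_estimates, ci_lower, ci_upper):
    """Pool per-study odds ratios (per 1-SD increment) with a
    DerSimonian-Laird random-effects model on the log-OR scale."""
    y = np.log(or_estimates)
    # Recover each study's standard error from its 95% CI half-width.
    se = (np.log(ci_upper) - np.log(ci_lower)) / (2 * 1.96)
    v = se ** 2
    w = 1 / v                                    # fixed-effect weights
    y_fe = np.sum(w * y) / np.sum(w)
    q = np.sum(w * (y - y_fe) ** 2)              # Cochran's Q
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c)      # between-study variance
    w_re = 1 / (v + tau2)                        # random-effects weights
    pooled = np.sum(w_re * y) / np.sum(w_re)
    se_pooled = np.sqrt(1 / np.sum(w_re))
    ci = np.exp([pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled])
    return np.exp(pooled), ci, tau2

# Hypothetical per-study ORs for a 1-SD increase in percent dense area
# (illustrative numbers only, not the studies pooled in the article).
ors = np.array([1.45, 1.60, 1.38, 1.72, 1.50])
lo  = np.array([1.20, 1.31, 1.10, 1.35, 1.22])
hi  = np.array([1.75, 1.95, 1.73, 2.19, 1.84])
pooled_or, ci95, tau2 = dersimonian_laird(ors, lo, hi)
print(f"pooled OR {pooled_or:.2f} (95% CI {ci95[0]:.2f} to {ci95[1]:.2f}), tau^2 = {tau2:.3f}")
```

With real data, each study's standard error would normally be taken from its reported model output rather than reconstructed from the confidence interval, but the pooling step itself is unchanged.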
Ezenwaka, C E; Nwagbara, E; Seales, D; Okali, F; Hussaini, S; Raja, Bn; Jones-LeCointe, A; Sell, H; Avci, H; Eckel, J
2009-03-06
Primary prevention of Coronary Heart Disease (CHD) in diabetic patients should be based on absolute CHD risk calculation. This study aimed to determine the levels of 10-year CHD risk in Caribbean type 2 diabetic patients using the diabetes-specific United Kingdom Prospective Diabetes Study (UKPDS) risk engine calculator. Three hundred and twenty-five (106 males, 219 females) type 2 diabetic patients resident in the two Caribbean islands of Tobago and Trinidad met the UKPDS risk engine inclusion criteria. Records of their sex, age, ethnicity, smoking habit, diabetes duration, systolic blood pressure, total cholesterol, HDL-cholesterol and glycated haemoglobin were entered into the UKPDS risk engine calculator programme and the absolute 10-year CHD and stroke risk levels were computed. The 10-year CHD and stroke risks were statistically stratified into <15%, 15-30% and >30% CHD risk levels and differences between patients of African and Asian-Indian origin were compared. In comparison with patients in Tobago, type 2 diabetic patients in Trinidad, irrespective of gender, had a higher proportion of 10-year CHD risk (10.4 vs. 23.6%, P<0.001) whereas the overall 10-year stroke risk prediction was higher in patients resident in Tobago (16.9 vs. 11.4%, P<0.001). Ethnicity-based analysis revealed that irrespective of gender, a higher proportion of patients of Indian origin scored >30% of absolute 10-year CHD risk compared with patients of African descent (3.2 vs. 28.2%, P<0.001). The results of the study identified diabetic patients resident in Trinidad and patients of Indian origin as the most vulnerable groups for CHD. These groups of diabetic patients should have priority in primary or secondary prevention of coronary heart disease.
Whiteley, William N; Emberson, Jonathan; Lees, Kennedy R; Blackwell, Lisa; Albers, Gregory; Bluhmki, Erich; Brott, Thomas; Cohen, Geoff; Davis, Stephen; Donnan, Geoffrey; Grotta, James; Howard, George; Kaste, Markku; Koga, Masatoshi; von Kummer, Rüdiger; Lansberg, Maarten G; Lindley, Richard I; Lyden, Patrick; Olivot, Jean Marc; Parsons, Mark; Toni, Danilo; Toyoda, Kazunori; Wahlgren, Nils; Wardlaw, Joanna; Del Zoppo, Gregory J; Sandercock, Peter; Hacke, Werner; Baigent, Colin
2016-08-01
Randomised trials have shown that alteplase improves the odds of a good outcome when delivered within 4·5 h of acute ischaemic stroke. However, alteplase also increases the risk of intracerebral haemorrhage; we aimed to determine the proportional and absolute effects of alteplase on the risks of intracerebral haemorrhage, mortality, and functional impairment in different types of patients. We used individual patient data from the Stroke Thrombolysis Trialists' (STT) meta-analysis of randomised trials of alteplase versus placebo (or untreated control) in patients with acute ischaemic stroke. We prespecified assessment of three classifications of intracerebral haemorrhage: type 2 parenchymal haemorrhage within 7 days; Safe Implementation of Thrombolysis in Stroke Monitoring Study's (SITS-MOST) haemorrhage within 24-36 h (type 2 parenchymal haemorrhage with a deterioration of at least 4 points on National Institutes of Health Stroke Scale [NIHSS]); and fatal intracerebral haemorrhage within 7 days. We used logistic regression, stratified by trial, to model the log odds of intracerebral haemorrhage on allocation to alteplase, treatment delay, age, and stroke severity. We did exploratory analyses to assess mortality after intracerebral haemorrhage and examine the absolute risks of intracerebral haemorrhage in the context of functional outcome at 90-180 days. Data were available from 6756 participants in the nine trials of intravenous alteplase versus control. Alteplase increased the odds of type 2 parenchymal haemorrhage (occurring in 231 [6·8%] of 3391 patients allocated alteplase vs 44 [1·3%] of 3365 patients allocated control; odds ratio [OR] 5·55 [95% CI 4·01-7·70]; absolute excess 5·5% [4·6-6·4]); of SITS-MOST haemorrhage (124 [3·7%] of 3391 vs 19 [0·6%] of 3365; OR 6·67 [4·11-10·84]; absolute excess 3·1% [2·4-3·8]); and of fatal intracerebral haemorrhage (91 [2·7%] of 3391 vs 13 [0·4%] of 3365; OR 7·14 [3·98-12·79]; absolute excess 2·3% [1·7-2·9]). However defined, the proportional increase in intracerebral haemorrhage was similar irrespective of treatment delay, age, or baseline stroke severity, but the absolute excess risk of intracerebral haemorrhage increased with increasing stroke severity: for SITS-MOST intracerebral haemorrhage the absolute excess risk ranged from 1·5% (0·8-2·6%) for strokes with NIHSS 0-4 to 3·7% (2·1-6·3%) for NIHSS 22 or more (p=0·0101). For patients treated within 4·5 h, the absolute increase in the proportion (6·8% [4·0% to 9·5%]) achieving a modified Rankin Scale of 0 or 1 (excellent outcome) exceeded the absolute increase in risk of fatal intracerebral haemorrhage (2·2% [1·5% to 3·0%]) and the increased risk of any death within 90 days (0·9% [-1·4% to 3·2%]). Among patients given alteplase, the net outcome is predicted both by time to treatment (with faster time increasing the proportion achieving an excellent outcome) and stroke severity (with a more severe stroke increasing the absolute risk of intracerebral haemorrhage). Although, within 4·5 h of stroke, the probability of achieving an excellent outcome with alteplase treatment exceeds the risk of death, early treatment is especially important for patients with severe stroke. UK Medical Research Council, British Heart Foundation, University of Glasgow, University of Edinburgh. Copyright © 2016 Elsevier Ltd. All rights reserved.
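The absolute excess quoted for each haemorrhage outcome is simply the difference in event proportions between the alteplase and control arms, while the odds ratios in the article come from logistic regression stratified by trial. The short sketch below redoes the crude arithmetic for the type 2 parenchymal haemorrhage counts quoted above (231/3391 vs 44/3365); the crude OR and its Woolf confidence interval land close to, but not exactly on, the trial-stratified estimates reported.

```python
import math

# Type 2 parenchymal haemorrhage within 7 days, counts as quoted above.
a, n1 = 231, 3391   # events / patients, alteplase arm
b, n0 = 44, 3365    # events / patients, control arm

p1, p0 = a / n1, b / n0
absolute_excess = p1 - p0                       # risk difference
crude_or = (a * (n0 - b)) / ((n1 - a) * b)      # crude (unstratified) odds ratio

# Woolf (log-OR) 95% confidence interval for the crude odds ratio.
se_log_or = math.sqrt(1 / a + 1 / (n1 - a) + 1 / b + 1 / (n0 - b))
ci = [math.exp(math.log(crude_or) + z * 1.96 * se_log_or) for z in (-1, 1)]

print(f"risks: {p1:.1%} vs {p0:.1%}; absolute excess {absolute_excess:.1%}")
print(f"crude OR {crude_or:.2f} (95% CI {ci[0]:.2f} to {ci[1]:.2f})")
# risks: 6.8% vs 1.3%; absolute excess 5.5%
# crude OR 5.52 (95% CI roughly 3.98 to 7.64)
```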
The Distinct Role of Comparative Risk Perceptions in a Breast Cancer Prevention Program
Dillard, Amanda J.; Ubel, Peter A.; Smith, Dylan M.; Zikmund-Fisher, Brian J.; Nair, Vijay; Derry, Holly A.; Zhang, Aijun; Pitsch, Rosemarie K.; Alford, Sharon Hensley; McClure, Jennifer B.; Fagerlin, Angela
2013-01-01
Background Comparative risk perceptions may rival other types of information in terms of effects on health behavior decisions. Purpose We examined associations between comparative risk perceptions, affect, and behavior while controlling for absolute risk perceptions and actual risk. Methods Women at an increased risk of breast cancer participated in a program to learn about tamoxifen which can reduce the risk of breast cancer. Women reported comparative risk perceptions of breast cancer and completed measures of anxiety, knowledge, and tamoxifen-related behavior intentions. Three months later, women reported their behavior. Results Comparative risk perceptions were positively correlated with anxiety, knowledge, intentions, and behavior three months later. After controlling for participants’ actual risk of breast cancer and absolute risk perceptions, comparative risk perceptions predicted anxiety and knowledge, but not intentions or behavior. Conclusions Comparative risk perceptions can affect patient outcomes like anxiety and knowledge independently of absolute risk perceptions and actual risk information. PMID:21698518
NT-proBNP and Heart Failure Risk Among Individuals With and Without Obesity: The ARIC Study
Ndumele, Chiadi E.; Matsushita, Kunihiro; Sang, Yingying; Lazo, Mariana; Agarwal, Sunil K.; Nambi, Vijay; Deswal, Anita; Blumenthal, Roger S.; Ballantyne, Christie M.; Coresh, Josef; Selvin, Elizabeth
2016-01-01
Background Obesity is a risk factor for heart failure (HF), but is associated with lower N-terminal of pro-Brain Natriuretic Peptide (NT-proBNP) levels. It is unclear whether the prognostic value and implications of NT-proBNP levels for HF risk differ across body mass index (BMI) categories. Methods and Results We followed 12,230 ARIC participants free of prior HF at baseline (visit 2, 1990–1992) with BMI ≥18.5 kg/m2. We quantified and compared the relative and absolute risk associations of NT-proBNP with incident HF across BMI categories. There were 1,861 HF events during a median 20.6 years of follow-up. Despite increased HF risk in obesity, a weak inverse association was seen between baseline BMI and NT-proBNP levels (r = −0.10). Nevertheless, higher baseline NT-proBNP was associated with increased HF risk in all BMI categories. NT-proBNP improved HF risk prediction overall and even among those with severe obesity (BMI ≥35 kg/m2; improvement in c-statistic +0.032 [95% CI 0.011–0.053]). However, given higher HF rates among those with obesity, at each NT-proBNP level, higher BMI was associated with greater absolute HF risk. Indeed, among those with NT-proBNP 100 to < 200 pg/ml, the average 10-year HF risk was <5% among normal weight individuals but >10% if severely obese. Conclusions Despite its inverse relationship with BMI, NT-proBNP provides significant prognostic information regarding the risk of developing HF even among individuals with obesity. Given the higher baseline HF risk among persons with obesity, even slight elevations in NT-proBNP may have implications for increased absolute HF risk in this population. PMID:26746175
van Geel, Tineke A C M; Eisman, John A; Geusens, Piet P; van den Bergh, Joop P W; Center, Jacqueline R; Dinant, Geert-Jan
2014-02-01
There are two commonly used fracture risk prediction tools: FRAX® and the Garvan Fracture Risk Calculator (GARVAN-FRC). The objective of this study was to investigate the utility of these tools in daily practice. A prospective population-based 5-year follow-up study was conducted in ten general practice centres in the Netherlands. For the analyses, the FRAX® and GARVAN-FRC 10-year absolute risks (FRAX® does not have 5-year risk prediction) for all fractures were used. Among 506 postmenopausal women aged ≥60 years (mean age: 67.8±5.8 years), 48 (9.5%) sustained a fracture during follow-up. Both tools, using BMD values, distinguished between women who did and did not fracture (10.2% vs. 6.8%, respectively, for FRAX® and 32.4% vs. 39.1%, respectively, for GARVAN-FRC, p<0.0001) at group level. However, only 8.9% of those who sustained a fracture had an estimated fracture risk ≥20% using FRAX® compared with 53.3% using GARVAN-FRC. Although both underestimated the observed fracture risk, the GARVAN-FRC performed significantly better for women who sustained a fracture (higher sensitivity) and FRAX® for women who did not sustain a fracture (higher specificity). Similar results were obtained using age-related cut-off points. The discriminant value of both models is at least as good as that of models used in other medical conditions; hence they can be used to communicate the fracture risk to patients. However, given differences in the estimated risks between FRAX® and GARVAN-FRC, the significance of the absolute risk must be related to country-specific recommended intervention thresholds to inform the patient. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
Overview of the randomized trials of radiotherapy in ductal carcinoma in situ of the breast.
Correa, C; McGale, P; Taylor, C; Wang, Y; Clarke, M; Davies, C; Peto, R; Bijker, N; Solin, L; Darby, S
2010-01-01
Individual patient data were available for all four of the randomized trials that began before 1995, and that compared adjuvant radiotherapy vs no radiotherapy following breast-conserving surgery for ductal carcinoma in situ (DCIS). A total of 3729 women were eligible for analysis. Radiotherapy reduced the absolute 10-year risk of any ipsilateral breast event (ie, either recurrent DCIS or invasive cancer) by 15.2% (SE 1.6%; 12.9% vs 28.1%; 2P < .00001), and it was effective regardless of the age at diagnosis, extent of breast-conserving surgery, use of tamoxifen, method of DCIS detection, margin status, focality, grade, comedonecrosis, architecture, or tumor size. The proportional reduction in ipsilateral breast events was greater in older than in younger women (2P < .0004 for difference between proportional reductions; 10-year absolute risks: 18.5% vs 29.1% at ages <50 years, 10.8% vs 27.8% at ages ≥ 50 years) but did not differ significantly according to any other available factor. Even for women with negative margins and small low-grade tumors, the absolute reduction in the 10-year risk of ipsilateral breast events was 18.0% (SE 5.5; 12.1% vs 30.1%; 2P = .002). After 10 years of follow-up, there was, however, no significant effect on breast cancer mortality, mortality from causes other than breast cancer, or all-cause mortality.
Gender differences in socioeconomic inequality in mortality.
Mustard, C A; Etches, J
2003-12-01
There is uncertainty about whether position in a socioeconomic hierarchy confers different mortality risks on men and women. The objective of this study was to conduct a systematic review of gender differences in socioeconomic inequality in risk of death. This research systematically reviewed observational cohort studies describing all cause or cause specific mortality for populations aged 25-64 in developed countries. For inclusion in the review, mortality had to be reported stratified by gender and by one or more measures of socioeconomic status. For all eligible studies, five absolute and six relative measures of the socioeconomic inequality in mortality were computed for male and female populations separately. A total of 136 published papers were reviewed for eligibility, with 58 studies deemed eligible for inclusion. Of these eligible studies, 20 papers published data that permitted the computation of both absolute and relative measures of inequality. Absolute measures of socioeconomic mortality inequality for men and women generally agreed, with about 90% of studies indicating that male mortality was more unequal than female mortality across socioeconomic groups. In contrast, the pattern of relative inequality results across the 20 studies suggested that male and female socioeconomic inequality in mortality was equivalent. Inferences about gender differences in socioeconomic inequality in mortality are sensitive to the choice of inequality measure. Wider understanding of this methodological issue would improve the clarity of the reporting and synthesis of evidence on the magnitude of health inequalities in populations.
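The review's central message, that conclusions about the gender gap depend on whether inequality is measured on the absolute or the relative scale, can be reproduced with two lines of arithmetic. The mortality rates below are invented for illustration and are not taken from any of the reviewed studies.

```python
# Hypothetical mortality rates per 100,000 person-years by socioeconomic group.
rates = {
    "men":   {"low_ses": 400.0, "high_ses": 200.0},
    "women": {"low_ses": 200.0, "high_ses": 100.0},
}

for sex, r in rates.items():
    rate_difference = r["low_ses"] - r["high_ses"]   # absolute inequality measure
    rate_ratio = r["low_ses"] / r["high_ses"]        # relative inequality measure
    print(f"{sex}: rate difference {rate_difference:.0f} per 100,000, rate ratio {rate_ratio:.1f}")

# men:   rate difference 200 per 100,000, rate ratio 2.0
# women: rate difference 100 per 100,000, rate ratio 2.0
# On the absolute scale, inequality looks twice as large in men;
# on the relative scale, the two sexes look identical.
```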
Chondroitin for osteoarthritis
Singh, Jasvinder A.; Noorbaloochi, Shahrzad; MacDonald, Roderick; Maxwell, Lara J.
2016-01-01
Background Osteoarthritis, a common joint disorder, is one of the leading causes of disability. Chondroitin has emerged as a new treatment. Previous meta-analyses have shown contradictory results on the efficacy of chondroitin. This, in addition to the publication of more trials, necessitates a systematic review. Objectives To evaluate the benefit and harm of oral chondroitin for treating osteoarthritis compared with placebo or a comparator oral medication including, but not limited to, nonsteroidal anti-inflammatory drugs (NSAIDs), analgesics, opioids, and glucosamine or other “herbal” medications. Search methods We searched seven databases up to November 2013, including the Cochrane Central Register of Controlled Trials (CENTRAL), Ovid MEDLINE, CINAHL, EMBASE, Science Citation Index (Web of Science) and Current Controlled Trials. We searched the US Food and Drug Administration (FDA) and European Medicines Agency (EMEA) websites for adverse effects. Trial registers were not searched. Selection criteria All randomized or quasi-randomized clinical trials lasting longer than two weeks, studying adults with osteoarthritis in any joint, and comparing chondroitin with placebo, an active control such as NSAIDs, or other “herbal” supplements such as glucosamine. Data collection and analysis Two review authors independently performed all title assessments, data extractions, and risk of bias assessments. Main results Forty-three randomized controlled trials including 4,962 participants treated with chondroitin and 4,148 participants given placebo or another control were included. The majority of trials were in knee OA, with few in hip and hand OA. Trial duration varied from 1 month to 3 years. Participants treated with chondroitin achieved statistically significantly and clinically meaningful better pain scores (0–100) in studies less than 6 months than those given placebo with an absolute risk difference of 10% lower (95% confidence interval (CI), 15% to 6% lower; number needed to treat (NNT) = 5 (95% CI, 3 to 8; n = 8 trials) (level of evidence, low; risk of bias, high); but there was high heterogeneity between the trials (T2 = 0.07; I2 = 70%, which was not easily explained by differences in risk of bias or study sample size). In studies longer than 6 months, the absolute risk difference for pain was 9% lower (95% CI 18% lower to 0%); n = 6 trials; T2 = 0.18; I2 = 83% ), again with low level of evidence. For the Western Ontario and McMaster Universities Osteoarthritis Index Minimal Clinically Important Improvement (WOMAC MCII Pain subscale) outcome, a reduction in knee pain by 20% was achieved by 53/100 in the chondroitin group versus 47/100 in the placebo group, an absolute risk difference of 6% (95% CI 1% to 11%), (RR 1.12, 95% CI 1.01 to 1.24; T2 = 0.00; I2 = 0%) (n = 2 trials, 1253 participants; level of evidence, high; risk of bias, low). Differences in Lequesne’s index (composite of pain, function and disability) statistically significantly favoured chondroitin as compared with placebo in studies under six months, with an absolute risk difference of 8% lower (95% CI 12% to 5% lower; T2= 0.78; n = 7 trials) (level of evidence, moderate; risk of bias, unclear), also clinically meaningful. Loss of minimum joint space width in the chondroitin group was statistically significantly less than in the placebo group, with a relative risk difference of 4.7% less (95% CI 1.6% to 7.8% less; n = 2 trials) (level of evidence, high; risk of bias, low). 
Chondroitin was associated with statistically significantly lower odds of serious adverse events compared with placebo with Peto odds ratio of 0.40 (95% CI 0.19 to 0.82; n = 6 trials) (level of evidence, moderate). Chondroitin did not result in statistically significant numbers of adverse events or withdrawals due to adverse events compared with placebo or another drug. Adverse events were reported in a limited fashion, with some studies providing data and others not. Comparisons of chondroitin taken alone or in combination with glucosamine or another supplement showed a statistically significant reduction in pain (0–100) when compared with placebo or an active control, with an absolute risk difference of 10% lower (95% CI 14% to 5% lower); NNT = 4 (95% CI 3 to 6); T2 = 0.33; I2 = 91%; n = 17 trials) (level of evidence, low). For physical function, chondroitin in combination with glucosamine or another supplement showed no statistically significant difference from placebo or an active control, with an absolute risk difference of 1% lower (95% CI 6% lower to 3% higher with T2 = 0.04; n = 5 trials) (level of evidence, moderate). Differences in Lequesne’s index statistically significantly favoured chondroitin as compared with placebo, with an absolute risk difference of 8% lower (95% CI, 12% to 4% lower; T2 = 0.12; n = 10 trials) (level of evidence, moderate). Chondroitin in combination with glucosamine did not result in statistically significant differences in the numbers of adverse events, withdrawals due to adverse events, or in the numbers of serious adverse events compared with placebo or with an active control. The beneficial effects of chondroitin in pain and Lequesne’s index persisted when evidence was limited to studies with adequate blinding or studies that used appropriate intention to treat (ITT) analyses. These beneficial effects were uncertain when we limited data to studies with appropriate allocation concealment or a large study sample (> 200) or to studies without pharmaceutical funding. Authors’ conclusions A review of randomized trials of mostly low quality reveals that chondroitin (alone or in combination with glucosamine) was better than placebo in improving pain in participants with osteoarthritis in short-term studies. The benefit was small to moderate with an 8 point greater improvement in pain (range 0 to 100) and a 2 point greater improvement in Lequesne’s index (range 0 to 24), both likely clinically meaningful. These differences persisted in some sensitivity analyses and not others. Chondroitin had a lower risk of serious adverse events compared with control. More high-quality studies are needed to explore the role of chondroitin in the treatment of osteoarthritis. The combination of some efficacy and low risk associated with chondroitin may explain its popularity among patients as an over-the-counter supplement. PMID:25629804
Cancer risk communication in mainstream and ethnic newspapers.
Stryker, Jo Ellen; Fishman, Jessica; Emmons, Karen M; Viswanath, Kasisomayajula
2009-01-01
We wanted to understand how cancer risks are communicated in mainstream and ethnic newspapers, to determine whether the 2 kinds of newspapers differ and to examine features of news stories and sources that might predict optimal risk communication. Optimal risk communication was defined as presenting the combination of absolute risk, relative risk, and prevention response efficacy information. We collected data by conducting a content analysis of cancer news coverage from 2003 (5,327 stories in major newspapers, 565 stories in ethnic newspapers). Comparisons of mainstream and ethnic newspapers were conducted by using cross-tabulations and Pearson χ² tests for significance. Logistic regression equations were computed to calculate odds ratios and 95% confidence intervals for optimal risk communication. In both kinds of newspapers, cancer risks were rarely communicated numerically. When numeric presentations of cancer risks were used, only 26.2% of mainstream and 29.5% of ethnic newspaper stories provided estimates of both absolute and relative risk. For both kinds of papers, only 19% of news stories presented risk communication optimally. Cancer risks were more likely to be communicated optimally if they focused on prostate cancer, were reports of new research, or discussed medical or demographic risks. Research is needed to understand how these nonnumeric and decontextualized presentations of risk might contribute to inaccurate risk perceptions among news consumers.
Arntzen, Annett; Mortensen, Laust; Schnor, Ole; Cnattingius, Sven; Gissler, Mika; Andersen, Anne-Marie Nybo
2008-06-01
This study examined changes in the educational gradients in neonatal and postneonatal mortality over a 20-year period in the four largest Nordic countries. The study populations were all live-born singleton infants with gestational age of at least 22 weeks from 1981 to 2000 (Finland 1987-2000). Information on births and infant deaths from the Medical Birth Registries was linked to information from census statistics. Numbers of eligible live-births were: Denmark 1 179 831, Finland 834 299 (1987-2000), Norway 1 017 168 and Sweden 1 971 645. Differences in mortality between education groups were estimated as risk differences (RD), relative risks (RR) and the relative index of inequality (RII). Overall infant mortality rates were 5.9 per 1000 live-births in Denmark, 4.2 in Finland (1987-2000), 5.3 in Norway and 4.7 in Sweden. Mortality decreased in all educational groups over the study period, while educational levels increased. The time-trends differed between neonatal and postneonatal death. For neonatal death, both absolute and relative educational differences decreased in Finland and Sweden and increased in Denmark, whereas Norway showed a decrease in absolute differences and a slight increase in relative differences. For postneonatal death, the relative educational differences increased in all countries, whereas the absolute differences decreased. All educational groups experienced a decline in infant mortality during the period under study. Still, the inverse association between maternal education and RR of postneonatal death has become more pronounced in all Nordic countries.
Kadota, Aya; Miura, Katsuyuki; Okamura, Tomonori; Fujiyoshi, Akira; Ohkubo, Takayoshi; Kadowaki, Takashi; Takashima, Naoyuki; Hisamatsu, Takashi; Nakamura, Yasuyuki; Kasagi, Fumiyoshi; Maegawa, Hiroshi; Kashiwagi, Atsunori; Ueshima, Hirotsugu
2013-01-01
To examine whether subclinical atherosclerosis of the carotid arteries is concordant with the categories in the 2012 atherosclerosis prevention guidelines proposed by the Japan Atherosclerosis Society (JAS guidelines 2012), which adopted the estimated 10-year absolute risk of coronary artery disease (CAD) death in the NIPPON DATA80 Risk Assessment Chart. Between 2006 and 2008, 868 Japanese men 40 to 74 years of age without a history of cardiovascular disease were randomly selected from Kusatsu City, Japan. The intima media thickness (IMT) and plaque number from the common to internal carotid arteries were investigated using ultrasonography. The absolute risk of CAD death was estimated based on the individual risk factor data, and the mean IMT and plaque number in Categories Ⅰ, Ⅱ and Ⅲ of the guidelines were examined. The estimated 10-year absolute risk of CAD was directly related to the IMT (mean IMT (mean ± SD) (mm) for a 10-year absolute risk of ≥ 2.0% and ≥ 5.0%: 0.88 ± 0.18 and 0.95 ± 0.19, respectively) and the plaque number. These results are compatible with the categories described by the guidelines (mean IMT (mean ± SD) (mm) for Categories Ⅰ, Ⅱ, and Ⅲ: 0.70 ± 0.11, 0.81 ± 0.16 and 0.88 ± 0.18, respectively; mean plaque number: 0.9, 2.1 and 3, respectively). These findings were similar for Category Ⅲ participants with or without DM and CKD. Subclinical atherosclerosis of the carotid arteries is concordant with the 10-year absolute risk of CAD and the categories in the JAS guidelines 2012.
Knapp, Peter; Gardner, Peter H; Raynor, David K; Woolf, Elizabeth; McMillan, Brian
2010-05-01
To investigate the effectiveness of presenting medicine side effect risk information in different forms, including that proposed by UK guidelines [[1] Medicines and Healthcare products Regulatory Agency. Always read the leaflet-Getting the best information with every medicine. (Report of the Committee on Safety of Medicines Working Group on Patient Information). London: The Stationery Office, 2005.]. 134 Cancer Research UK (CRUK) website users were recruited via a 'pop-up'. Using a 2x2 factorial design, participants were randomly allocated to one of four conditions and asked to: imagine they had to take tamoxifen, estimate the risks of 4 side effects, and indicate a presentation mode preference. Those presented with absolute frequencies demonstrated greater accuracy in estimating 2 of 4 side effects, and of any side effect occurring, than those presented with frequency bands. Those presented with combined descriptors were more accurate at estimating the risk of pulmonary embolism than those presented with numeric descriptors only. Absolute frequencies outperform frequency bands when presenting side effect risk information. However, presenting such exact frequencies for every side effect may be much less digestible than all side effects listed under 5 frequency bands. Combined numerical and verbal descriptors may be better than numeric only descriptors when describing infrequent side effects. Information about side effects should be presented in ways that patients prefer, and which result in most accurate risk estimates. Copyright (c) 2009 Elsevier Ireland Ltd. All rights reserved.
Rolison, Jonathan J; Hanoch, Yaniv; Miron-Shatz, Talya
2012-07-01
Genetic testing for gene mutations associated with specific cancers provides an opportunity for early detection, surveillance, and intervention (Smith, Cokkinides, & Brawley, 2008). Lifetime risk estimates provided by genetic testing refer to the risk of developing a specific disease within one's lifetime, and evidence suggests that this is important for the medical choices people make, as well as their future family and financial plans. The present studies tested whether adult men understand the lifetime risks of prostate cancer informed by genetic testing. In 2 experiments, adult men were asked to interpret the lifetime risk information provided in statements about risks of prostate cancer. Statement format was manipulated such that the most appropriate interpretation of risk statements referred to an absolute risk of cancer in experiment 1 and a relative risk in experiment 2. Experiment 1 revealed that few men correctly interpreted the lifetime risks of cancer when these refer to an absolute risk of cancer, and numeracy levels positively predicted correct responding. The proportion of correct responses was greatly improved in experiment 2 when the most appropriate interpretation of risk statements referred instead to a relative rather than an absolute risk, and numeracy levels were less involved. Understanding of lifetime risk information is often poor because individuals incorrectly believe that these refer to relative rather than absolute risks of cancer.
van Dyk, Nicol; Bahr, Roald; Whiteley, Rodney; Tol, Johannes L; Kumar, Bhavesh D; Hamilton, Bruce; Farooq, Abdulaziz; Witvrouw, Erik
2016-07-01
A hamstring strain injury (HSI) has become the most common noncontact injury in soccer. Isokinetic muscle strength deficits are considered a risk factor for HSIs. However, underpowered studies with small sample sizes unable to determine small associations have led to inconclusive results regarding the role of isokinetic strength and strength testing in HSIs. To examine whether differences in isokinetic strength measures of knee flexion and extension represent risk factors for hamstring injuries in a large cohort of professional soccer players in an adequately powered study design. Cohort study; Level of evidence, 2. A total of 614 professional soccer players from 14 teams underwent isokinetic strength testing during preseason screening. Testing consisted of concentric knee flexion and extension at 60 deg/s and 300 deg/s and eccentric knee extension at 60 deg/s. A clustered multiple logistic regression analysis was used to identify variables associated with the risk of HSIs. Receiver operating characteristic (ROC) curves were calculated to determine sensitivity and specificity. Of the 614 players, 190 suffered an HSI during the 4 seasons. Quadriceps concentric strength at 60 deg/s (odds ratio [OR], 1.41; 95% CI, 1.03-1.92; P = .03) and hamstring eccentric strength at 60 deg/s (OR, 1.37; 95% CI, 1.01-1.85; P = .04) adjusted for bodyweight were independently associated with the risk of injuries. The absolute differences between the injured and uninjured players were 6.9 N·m and 9.1 N·m, with small effect sizes (d < 0.2). The ROC analyses showed an area under the curve of 0.54 and 0.56 for quadriceps concentric strength and hamstring eccentric strength, respectively, indicating a failed combined sensitivity and specificity of the 2 strength variables identified in the logistic regression models. This study identified small absolute strength differences and a wide overlap of the absolute strength measurements at the group level. The small associations between lower hamstring eccentric strength and lower quadriceps concentric strength with HSIs can only be considered as weak risk factors. The identification of these risk factors still does not allow the identification of individual players at risk. The use of isokinetic testing to determine the association between strength differences and HSIs is not supported. © 2016 The Author(s).
Eckermann, Simon; Coory, Michael; Willan, Andrew R
2011-02-01
Economic analysis and assessment of net clinical benefit often requires estimation of absolute risk difference (ARD) for binary outcomes (e.g. survival, response, disease progression) given baseline epidemiological risk in a jurisdiction of interest and trial evidence of treatment effects. Typically, the assumption is made that relative treatment effects are constant across baseline risk, in which case relative risk (RR) or odds ratios (OR) could be applied to estimate ARD. The objective of this article is to establish whether such use of RR or OR allows consistent estimates of ARD. ARD is calculated from alternative framing of effects (e.g. mortality vs survival) applying standard methods for translating evidence with RR and OR. For RR, the RR is applied to baseline risk in the jurisdiction to estimate treatment risk; for OR, the baseline risk is converted to odds, the OR applied and the resulting treatment odds converted back to risk. ARD is shown to be consistently estimated with OR but changes with framing of effects using RR wherever there is a treatment effect and epidemiological risk differs from trial risk. Additionally, in indirect comparisons, ARD is shown to be consistently estimated with OR, while calculation with RR allows inconsistency, with alternative framing of effects in the direction, let alone the extent, of ARD. OR ensures consistent calculation of ARD in translating evidence from trial settings and across trials in direct and indirect comparisons, avoiding inconsistencies from RR with alternative outcome framing and associated biases. These findings are critical for consistently translating evidence to inform economic analysis and assessment of net clinical benefit, as translation of evidence is proposed precisely where the advantages of OR over RR arise.
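The inconsistency described above is easy to reproduce numerically. In the sketch below, a hypothetical trial has 20% mortality on control and 10% on treatment, and the evidence is translated to a jurisdiction with 40% baseline mortality; the numbers are invented for illustration and are not taken from the article. Applying the relative risk yields a different absolute risk difference depending on whether the outcome is framed as death or as survival, whereas applying the odds ratio gives the same answer either way.

```python
def ard_via_rr(baseline_risk, rr):
    """Absolute risk difference when a relative risk is applied to baseline risk."""
    return baseline_risk - baseline_risk * rr

def ard_via_or(baseline_risk, odds_ratio):
    """Absolute risk difference when an odds ratio is applied to baseline odds."""
    odds = baseline_risk / (1 - baseline_risk)
    treated_odds = odds * odds_ratio
    treated_risk = treated_odds / (1 + treated_odds)
    return baseline_risk - treated_risk

# Hypothetical trial: mortality 20% (control) vs 10% (treatment).
p0, p1 = 0.20, 0.10
rr_death = p1 / p0                                # 0.50
rr_surv = (1 - p1) / (1 - p0)                     # 1.125
or_death = (p1 / (1 - p1)) / (p0 / (1 - p0))      # 0.444...
or_surv = 1 / or_death                            # 2.25 (the OR simply inverts under reframing)

baseline = 0.40  # jurisdiction baseline mortality, different from the trial's 20%

print(f"{ard_via_rr(baseline, rr_death):.3f}")    # 0.200  (RR, death framing)
print(f"{-ard_via_rr(1 - baseline, rr_surv):.3f}")# 0.075  (RR, survival framing) -> inconsistent
print(f"{ard_via_or(baseline, or_death):.3f}")    # 0.171  (OR, death framing)
print(f"{-ard_via_or(1 - baseline, or_surv):.3f}")# 0.171  (OR, survival framing) -> consistent
```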
Peace, Frederick; Howard, Virginia J.
2014-01-01
Introduction Differences in risk for death from diseases and other causes among racial/ethnic groups likely contributed to the limited improvement in the state of health in the United States in the last few decades. The objective of this study was to identify causes of death that are the largest contributors to health disparities among racial/ethnic groups. Methods Using data from WONDER system, we measured the relative (age-adjusted mortality ratio [AAMR]) and absolute (difference in years of life lost [dYLL]) differences in mortality risk between the non-Hispanic white population and the black, Hispanic, American Indian/Alaska Native, and Asian/Pacific Islander populations for the 25 leading causes of death. Results Many causes contributed to disparities between non-Hispanic whites and blacks, led by assault (AAMR, 7.56; dYLL, 4.5 million). Malignant neoplasms were the second largest absolute contributor (dYLL, 3.8 million) to black–white disparities; we also found substantial relative and absolute differences for several cardiovascular diseases. Only assault, diabetes, and diseases of the liver contributed substantially to disparities between non-Hispanic whites and Hispanics (AAMR ≥ 1.65; dYLL ≥ 325,000). Many causes of death, led by assault (AAMR, 3.25; dYLL, 98,000), contributed to disparities between non-Hispanic whites and American Indians/Alaska Natives; Asian/Pacific Islanders did not have a higher risk than non-Hispanic whites for death from any disease. Conclusion Assault was a substantial contributor to disparities in mortality among non-Asian racial/ethnic minority populations. Research and intervention resources need to target diseases (such as diabetes and diseases of the liver) that affect certain racial/ethnic populations. PMID:25078566
Keller, Brad M; Nathan, Diane L; Gavenonis, Sara C; Chen, Jinbo; Conant, Emily F; Kontos, Despina
2013-05-01
Mammographic breast density, a strong risk factor for breast cancer, may be measured as either a relative percentage of dense (ie, radiopaque) breast tissue or as an absolute area from either raw (ie, "for processing") or vendor postprocessed (ie, "for presentation") digital mammograms. Given the increasing interest in the incorporation of mammographic density in breast cancer risk assessment, the purpose of this study is to determine the inherent reader variability in breast density assessment from raw and vendor-processed digital mammograms, because inconsistent estimates could lead to misclassification of an individual woman's risk for breast cancer. Bilateral, mediolateral-oblique view, raw, and processed digital mammograms of 81 women were retrospectively collected for this study (N = 324 images). Mammographic percent density and absolute dense tissue area estimates for each image were obtained from two radiologists using a validated, interactive software tool. The variability of interreader agreement was not found to be affected by the image presentation style (ie, raw or processed, F-test: P > .5). Interreader estimates of relative and absolute breast density are strongly correlated (Pearson r > 0.84, P < .001) but systematically different (t-test, P < .001) between the two readers. Our results show that mammographic density may be assessed with equal reliability from either raw or vendor postprocessed images. Furthermore, our results suggest that the primary source of density variability comes from the subjectivity of the individual reader in assessing the absolute amount of dense tissue present in the breast, indicating the need to use standardized tools to mitigate this effect. Copyright © 2013 AUR. Published by Elsevier Inc. All rights reserved.
The metabolism of plant sterols is disturbed in postmenopausal women with coronary artery disease.
Gylling, Helena; Hallikainen, Maarit; Rajaratnam, Radhakrishnan A; Simonen, Piia; Pihlajamäki, Jussi; Laakso, Markku; Miettinen, Tatu A
2009-03-01
In postmenopausal coronary artery disease (CAD) women, serum plant sterols are elevated. Thus, we investigated further whether serum plant sterols reflect absolute cholesterol metabolism in CAD as in other populations and whether the ABCG5 and ABCG8 genes, associated with plant sterol metabolism, were related to the risk of CAD. In free-living postmenopausal women with (n = 47) and without (n = 62) CAD, serum noncholesterol sterols including plant sterols were analyzed with gas-liquid chromatography, cholesterol absorption with peroral isotopes, absolute cholesterol synthesis with sterol balance technique, and bile acid synthesis with quantitating fecal bile acids. In CAD women, serum plant sterol ratios to cholesterol were 21% to 26% (P < .05) higher than in controls despite similar cholesterol absorption efficiency. Absolute cholesterol and bile acid synthesis were reduced. Only in controls were serum plant sterols related to cholesterol absorption (eg, sitosterol; in controls: r = 0.533, P < .001; in CAD: r = 0.296, P = not significant). However, even in CAD women, serum lathosterol (relative synthesis marker) and lathosterol-cholestanol (relative synthesis-absorption marker) were related to absolute synthesis and absorption percentage (P range from .05 to <.001) similarly to controls. Frequencies of the common polymorphisms of ABCG5 and ABCG8 genes did not differ between coronary and control women. In conclusion, plant sterol metabolism is disturbed in CAD women; so serum plant sterols only tended to reflect absolute cholesterol absorption. Other relative markers of cholesterol metabolism were related to the absolute ones in both groups. ABCG5 and ABCG8 genes were not associated with the risk of CAD.
Keller, Brad M; McCarthy, Anne Marie; Chen, Jinbo; Armstrong, Katrina; Conant, Emily F; Domchek, Susan M; Kontos, Despina
2015-03-18
Breast density and single-nucleotide polymorphisms (SNPs) have both been associated with breast cancer risk. To determine the extent to which these two breast cancer risk factors are associated, we investigate the association between a panel of validated SNPs related to breast cancer and quantitative measures of mammographic density in a cohort of Caucasian and African-American women. In this IRB-approved, HIPAA-compliant study, we analyzed a screening population of 639 women (250 African American and 389 Caucasian) who were tested with a validated panel assay of 12 SNPs previously associated to breast cancer risk. Each woman underwent digital mammography as part of routine screening and all were interpreted as negative. Both absolute and percent estimates of area and volumetric density were quantified on a per-woman basis using validated software. Associations between the number of risk alleles in each SNP and the density measures were assessed through a race-stratified linear regression analysis, adjusted for age, BMI, and Gail lifetime risk. The majority of SNPs were not found to be associated with any measure of breast density. SNP rs3817198 (in LSP1) was significantly associated with both absolute area (p = 0.004) and volumetric (p = 0.019) breast density in Caucasian women. In African-American women, SNPs rs3803662 (in TNRC9/TOX3) and rs4973768 (in NEK10) were significantly associated with absolute (p = 0.042) and percent (p = 0.028) volume density respectively. The majority of SNPs investigated in our study were not found to be significantly associated with breast density, even when accounting for age, BMI, and Gail risk, suggesting that these two different risk factors contain potentially independent information regarding a woman's risk to develop breast cancer. Additionally, the few statistically significant associations between breast density and SNPs were different for Caucasian versus African American women. Larger prospective studies are warranted to validate our findings and determine potential implications for breast cancer risk assessment.
YÜKSEL, Serpil; ALTUN UĞRAŞ, Gülay; ÇAVDAR, İkbal; BOZDOĞAN, Atilla; ÖZKAN GÜRDAL, Sibel; AKYOLCU, Neriman; ESENCAN, Ecem; VAROL SARAÇOĞLU, Gamze; ÖZMEN, Vahit
2017-01-01
Background: The increase in breast cancer incidence has heightened attention to breast cancer risk. The aim of this study was to determine the risk of breast cancer and risk perception of women, factors that affect risk perception, and to determine differences between absolute risk and the perception of risk. Methods: This cross-sectional study was carried out among 346 women whose score in the Gail Risk Model (GRM) was ≥ 1.67% and/or had a 1st degree relative with breast cancer in Bahçeşehir town in Istanbul, Turkey between Jul 2012 and Dec 2012. Data were collected through face-to-face interviews. The level of risk for breast cancer was calculated using the GRM and the Breast Cancer Risk Assessment Form (BCRAF). Breast cancer risk perception (BCRP) was evaluated using a 100-cm visual analogue scale. Results: Even though 39.6% of the women considered themselves as high-risk carriers, according to the GRM and the BCRAF, only 11.6% and 9.8% of women were in the “high risk” category, respectively. There was a positive significant correlation between the GRM and the BCRAF scores (P<0.001), and the BCRAF and BCRP scores (P<0.001). Factors related to high-risk perception were age (40–59 yr), post-menopausal phase, high-very high economic income level, existence of breast cancer in the family, having regular breast self-examination and clinical breast examination (P<0.05). Conclusion: In women at high risk of breast cancer, there is a significant difference between the women’s risk perception and their absolute risk level. PMID:28435816
A new Web-based medical tool for assessment and prevention of comprehensive cardiovascular risk
Franchi, Daniele; Cini, Davide; Iervasi, Giorgio
2011-01-01
Background: Multifactorial cardiovascular disease is the leading cause of death; besides well-known cardiovascular risk factors, several emerging factors such as mental stress, diet type, and physical inactivity have been associated with cardiovascular disease. To date, preventive strategies are based on the concept of absolute risk calculated by different algorithms and scoring systems. However, in general practice the collection of patient data represents a critical issue. Design: A new multipurpose computer-based program has been developed in order to: 1) easily calculate and compare the absolute cardiovascular risk by the Framingham, Procam, and Progetto Cuore algorithms; 2) design a web-based computerized tool for prospective collection of structured data; 3) support the doctor in the decision-making process for patients at risk according to recent international guidelines. Methods: During a medical consultation the doctor utilizes a common computer connected by Internet to a medical server where all the patient’s data and software reside. The program evaluates absolute and relative cardiovascular risk factors, personalized patient’s goals, and multiparametric trends, monitors critical parameter values, and generates an automated medical report. Results: In a pilot study on 294 patients (47% males; mean age 60 ± 12 years [±SD]) the global time to collect data at first consultation was 13 ± 11 minutes, which declined to 8 ± 7 minutes at the subsequent consultation. In 48.2% of cases the program revealed 2 or more primary risk factor parameters outside guideline indications and gave specific clinical suggestions to return altered parameters to target values. Conclusion: The web-based system proposed here may represent a feasible and flexible tool for clinical management of patients at risk of cardiovascular disease and for epidemiological research. PMID:21445280
Axelsson Fisk, Sten; Merlo, Juan
2017-05-04
While psychosocial theory claims that socioeconomic status (SES), acting through social comparisons, has an important influence on susceptibility to disease, materialistic theory says that socioeconomic position (SEP) and related access to material resources matter more. However, the relative role of SEP versus SES in chronic obstructive pulmonary disease (COPD) risk has still not been examined. We investigated the association between SES/SEP and COPD risk among 667 094 older adults, aged 55 to 60, residing in Sweden between 2006 and 2011. Absolute income in five groups by population quintiles depicted SEP and relative income expressed as quintile groups within each absolute income group represented SES. We performed sex-stratified logistic regression models to estimate odds ratios and the area under the receiver operator curve (AUC) to compare the discriminatory accuracy of SES and SEP in relation to COPD. Even though both absolute (SEP) and relative income (SES) were associated with COPD risk, only absolute income (SEP) presented a clear gradient, so the poorest had a three-fold higher COPD risk than the richest individuals. While the AUC for a model including only age was 0.54 and 0.55 when including relative income (SES), it increased to 0.65 when accounting for absolute income (SEP). SEP rather than SES demonstrated a consistent association with COPD. Our study supports the materialistic theory. Access to material resources seems more relevant to COPD risk than the consequences of low relative income.
The visual communication of risk.
Lipkus, I M; Hollands, J G
1999-01-01
This paper 1) provides reasons why graphics should be effective aids to communicate risk; 2) reviews the use of visuals, especially graphical displays, to communicate risk; 3) discusses issues to consider when designing graphs to communicate risk; and 4) provides suggestions for future research. Key articles and materials were obtained from the MEDLINE® and PsycINFO® databases, from reference article citations, and from discussion with experts in risk communication. Research has been devoted primarily to communicating risk magnitudes. Among the various graphical displays, the risk ladder appears to be a promising tool for communicating absolute and relative risks. Preliminary evidence suggests that people understand risk information presented in histograms and pie charts. Areas that need further attention include 1) applying theoretical models to the visual communication of risk, 2) testing which graphical displays can be applied best to different risk communication tasks (e.g., which graphs best convey absolute or relative risks), 3) communicating risk uncertainty, and 4) testing whether the lay public's perceptions and understanding of risk vary by graphical format and whether the addition of graphical displays improves comprehension substantially beyond numerical or narrative translations of risk and, if so, by how much. There is a need to ascertain the extent to which graphics and other visuals enhance the public's understanding of disease risk to facilitate decision-making and behavioral change processes. Nine suggestions are provided to help achieve these ends.
Vascular Disease, ESRD, and Death: Interpreting Competing Risk Analyses
Coresh, Josef; Segev, Dorry L.; Kucirka, Lauren M.; Tighiouart, Hocine; Sarnak, Mark J.
2012-01-01
Summary Background and objectives Vascular disease, a common condition in CKD, is a risk factor for mortality and ESRD. Optimal patient care requires accurate estimation and ordering of these competing risks. Design, setting, participants, & measurements This is a prospective cohort study of screened (n=885) and randomized participants (n=837) in the Modification of Diet in Renal Disease study (original study enrollment, 1989–1992), evaluating the association of vascular disease with ESRD and pre-ESRD mortality using standard survival analysis and competing risk regression. Results The method of analysis resulted in markedly different estimates. Cumulative incidence by standard analysis (censoring at the competing event) implied that, with vascular disease, the 15-year incidence was 66% and 51% for ESRD and pre-ESRD death, respectively. A more accurate representation of absolute risk was estimated with competing risk regression: 15-year incidence was 54% and 29% for ESRD and pre-ESRD death, respectively. For the association of vascular disease with pre-ESRD death, estimates of relative risk by the two methods were similar (standard survival analysis adjusted hazard ratio, 1.63; 95% confidence interval, 1.20–2.20; competing risk regression adjusted subhazard ratio, 1.57; 95% confidence interval, 1.15–2.14). In contrast, the hazard and subhazard ratios differed substantially for other associations, such as GFR and pre-ESRD mortality. Conclusions When competing events exist, absolute risk is better estimated using competing risk regression, but etiologic associations by this method must be carefully interpreted. The presence of vascular disease in CKD decreases the likelihood of survival to ESRD, independent of age and other risk factors. PMID:22859747
Vascular disease, ESRD, and death: interpreting competing risk analyses.
Grams, Morgan E; Coresh, Josef; Segev, Dorry L; Kucirka, Lauren M; Tighiouart, Hocine; Sarnak, Mark J
2012-10-01
Vascular disease, a common condition in CKD, is a risk factor for mortality and ESRD. Optimal patient care requires accurate estimation and ordering of these competing risks. This is a prospective cohort study of screened (n=885) and randomized participants (n=837) in the Modification of Diet in Renal Disease study (original study enrollment, 1989-1992), evaluating the association of vascular disease with ESRD and pre-ESRD mortality using standard survival analysis and competing risk regression. The method of analysis resulted in markedly different estimates. Cumulative incidence by standard analysis (censoring at the competing event) implied that, with vascular disease, the 15-year incidence was 66% and 51% for ESRD and pre-ESRD death, respectively. A more accurate representation of absolute risk was estimated with competing risk regression: 15-year incidence was 54% and 29% for ESRD and pre-ESRD death, respectively. For the association of vascular disease with pre-ESRD death, estimates of relative risk by the two methods were similar (standard survival analysis adjusted hazard ratio, 1.63; 95% confidence interval, 1.20-2.20; competing risk regression adjusted subhazard ratio, 1.57; 95% confidence interval, 1.15-2.14). In contrast, the hazard and subhazard ratios differed substantially for other associations, such as GFR and pre-ESRD mortality. When competing events exist, absolute risk is better estimated using competing risk regression, but etiologic associations by this method must be carefully interpreted. The presence of vascular disease in CKD decreases the likelihood of survival to ESRD, independent of age and other risk factors.
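The contrast reported in the two abstracts above (66%/51% versus 54%/29% at 15 years) comes from how the competing event is handled. Below is a minimal sketch, assuming synthetic exponential event times rather than the MDRD data, of the naive one-minus-Kaplan-Meier estimate (competing events censored) versus the cumulative incidence function.

```python
# Sketch of why the two analyses disagree: treating the competing event as
# censoring (1 - Kaplan-Meier) overstates absolute risk, while the cumulative
# incidence function (Aalen-Johansen) accounts for subjects removed by the
# competing event. Synthetic data only.
import numpy as np

def risk_estimates(time, event):
    """event: 0 = censored, 1 = event of interest, 2 = competing event."""
    order = np.argsort(time)
    event = event[order]
    at_risk = len(event)
    surv_all = 1.0    # survival free of any event (used by the Aalen-Johansen estimator)
    surv_cens = 1.0   # Kaplan-Meier "survival" when competing events are censored
    cif = 0.0         # cumulative incidence of the event of interest
    for e in event:
        if e == 1:
            cif += surv_all / at_risk            # Aalen-Johansen increment
            surv_cens *= 1 - 1 / at_risk
            surv_all *= 1 - 1 / at_risk
        elif e == 2:
            surv_all *= 1 - 1 / at_risk          # competing event ends follow-up
        at_risk -= 1
    return 1 - surv_cens, cif

rng = np.random.default_rng(1)
n = 2000
t_event = rng.exponential(10.0, n)       # latent time to event of interest
t_compete = rng.exponential(12.0, n)     # latent time to competing event
t_censor = rng.uniform(0.0, 15.0, n)     # administrative censoring
time = np.minimum.reduce([t_event, t_compete, t_censor])
event = np.select(
    [t_event <= np.minimum(t_compete, t_censor), t_compete <= np.minimum(t_event, t_censor)],
    [1, 2], default=0)
naive, cif = risk_estimates(time, event)
print(f"naive risk (competing event censored): {naive:.2f}")
print(f"cumulative incidence (competing risks): {cif:.2f}")
```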
12 CFR Appendix B to Part 3 - Risk-Based Capital Guidelines; Market Risk Adjustment
Code of Federal Regulations, 2012 CFR
2012-01-01
[Truncated regulatory excerpt; the recoverable text concerns specific risk capital charges for covered debt and equity positions, computed by multiplying the absolute value of the current market value of each net long or short position.]
Koopman, Carla; van Oeffelen, Aloysia A M; Bots, Michiel L; Engelfriet, Peter M; Verschuren, W M Monique; van Rossem, Lenie; van Dis, Ineke; Capewell, Simon; Vaartjes, Ilonca
2012-08-07
Socioeconomic status has a profound effect on the risk of having a first acute myocardial infarction (AMI). Information on socioeconomic inequalities in AMI incidence across age-gender-groups is lacking. Our objective was to examine socioeconomic inequalities in the incidence of AMI considering both relative and absolute measures of risk differences, with a particular focus on age and gender. We identified all patients with a first AMI from 1997 to 2007 through linked hospital discharge and death records covering the Dutch population. Relative risks (RR) of AMI incidence were estimated by mean equivalent household income at neighbourhood-level for strata of age and gender using Poisson regression models. Socioeconomic inequalities were also shown within the stratified age-gender groups by calculating the total number of events attributable to socioeconomic disadvantage. Between 1997 and 2007, 317,564 people had a first AMI. When comparing the most deprived socioeconomic quintile with the most affluent quintile, the overall RR for AMI was 1.34 (95 % confidence interval (CI): 1.32-1.36) in men and 1.44 (95 % CI: 1.42-1.47) in women. The socioeconomic gradient decreased with age. Relative socioeconomic inequalities were most apparent in men under 35 years and in women under 65 years. The largest number of events attributable to socioeconomic inequalities was found in men aged 45-74 years and in women aged 65-84 years. The total proportion of AMIs that was attributable to socioeconomic inequalities in the Dutch population of 1997 to 2007 was 14 % in men and 18 % in women. Neighbourhood socioeconomic inequalities were observed in AMI incidence in the Netherlands, but the magnitude across age-gender groups depended on whether inequality was expressed in relative or absolute terms. Relative socioeconomic inequalities were high in young persons and women, where the absolute burden of AMI was low. Absolute socioeconomic inequalities in AMI were highest in the age-gender groups of middle-aged men and elderly women, where the number of cases was largest.
2012-01-01
Background Socioeconomic status has a profound effect on the risk of having a first acute myocardial infarction (AMI). Information on socioeconomic inequalities in AMI incidence across age- gender-groups is lacking. Our objective was to examine socioeconomic inequalities in the incidence of AMI considering both relative and absolute measures of risk differences, with a particular focus on age and gender. Methods We identified all patients with a first AMI from 1997 to 2007 through linked hospital discharge and death records covering the Dutch population. Relative risks (RR) of AMI incidence were estimated by mean equivalent household income at neighbourhood-level for strata of age and gender using Poisson regression models. Socioeconomic inequalities were also shown within the stratified age-gender groups by calculating the total number of events attributable to socioeconomic disadvantage. Results Between 1997 and 2007, 317,564 people had a first AMI. When comparing the most deprived socioeconomic quintile with the most affluent quintile, the overall RR for AMI was 1.34 (95 % confidence interval (CI): 1.32 – 1.36) in men and 1.44 (95 % CI: 1.42 – 1.47) in women. The socioeconomic gradient decreased with age. Relative socioeconomic inequalities were most apparent in men under 35 years and in women under 65 years. The largest number of events attributable to socioeconomic inequalities was found in men aged 45–74 years and in women aged 65–84 years. The total proportion of AMIs that was attributable to socioeconomic inequalities in the Dutch population of 1997 to 2007 was 14 % in men and 18 % in women. Conclusions Neighbourhood socioeconomic inequalities were observed in AMI incidence in the Netherlands, but the magnitude across age-gender groups depended on whether inequality was expressed in relative or absolute terms. Relative socioeconomic inequalities were high in young persons and women, where the absolute burden of AMI was low. Absolute socioeconomic inequalities in AMI were highest in the age-gender groups of middle-aged men and elderly women, where the number of cases was largest. PMID:22870916
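A hedged sketch of the relative-versus-absolute contrast described above: the same set of quintile rates yields both a rate ratio (relative inequality) and a count of events attributable to socioeconomic disadvantage (absolute inequality). The quintile rates and person-years below are invented for illustration, not the Dutch registry figures.

```python
# Illustrative arithmetic only: relative inequality as a rate ratio, absolute
# inequality as excess events versus the most affluent quintile's rate.
rates_per_100k = {
    "Q1 (most affluent)": 120, "Q2": 135, "Q3": 150, "Q4": 165, "Q5 (most deprived)": 170,
}
person_years = {q: 2_000_000 for q in rates_per_100k}   # assumed equal follow-up per quintile

reference_rate = rates_per_100k["Q1 (most affluent)"]
rate_ratio = rates_per_100k["Q5 (most deprived)"] / reference_rate

observed = sum(rate * person_years[q] / 100_000 for q, rate in rates_per_100k.items())
expected = sum(reference_rate * person_years[q] / 100_000 for q in rates_per_100k)
attributable = observed - expected                       # events in excess of the reference rate

print(f"rate ratio, most deprived vs most affluent: {rate_ratio:.2f}")
print(f"events attributable to disadvantage: {attributable:.0f} "
      f"({attributable / observed:.0%} of all events)")
```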
Patti, Giuseppe; Lucerna, Markus; Pecen, Ladislav; Siller-Matula, Jolanta M; Cavallari, Ilaria; Kirchhof, Paulus; De Caterina, Raffaele
2017-07-23
Increasing age predisposes to both thromboembolic and bleeding events in patients with atrial fibrillation; therefore, balancing risks and benefits of antithrombotic strategies in older populations is crucial. We investigated 1-year outcome with different antithrombotic approaches in very elderly atrial fibrillation patients (age ≥85 years) compared with younger patients. We accessed individual patients' data from the prospective PREFER in AF (PREvention oF thromboembolic events-European Registry in Atrial Fibrillation), compared outcomes with and without oral anticoagulation (OAC), and estimated weighed net clinical benefit in different age groups. A total of 6412 patients, 505 of whom were aged ≥85 years, were analyzed. In patients aged <85 years, the incidence of thromboembolic events was 2.8%/year without OAC versus 2.3%/year with OAC (0.5% absolute reduction); in patients aged ≥85 years, it was 6.3%/year versus 4.3%/year (2% absolute reduction). In very elderly patients, the risk of major bleeding was higher than in younger patients, but similar in patients on OAC and in those on antiplatelet therapy or without antithrombotic treatment (4.0%/year versus 4.2%/year; P =0.77). OAC was overall associated with weighted net clinical benefit, assigning weights to nonfatal events according to their prognostic implication for subsequent death (-2.19%; CI, -4.23%, -0.15%; P =0.036). We found a significant gradient of this benefit as a function of age, with the oldest patients deriving the highest benefit. Because the risk of stroke increases with age more than the risk of bleeding, the absolute benefit of OAC is highest in very elderly patients, where it, by far, outweighs the risk of bleeding, with the greatest net clinical benefit in such patients. © 2017 The Authors and Daiichi Sankyo Europe GmbH. Published on behalf of the American Heart Association, Inc., by Wiley.
Liberal or restrictive transfusion in high-risk patients after hip surgery.
Carson, Jeffrey L; Terrin, Michael L; Noveck, Helaine; Sanders, David W; Chaitman, Bernard R; Rhoads, George G; Nemo, George; Dragert, Karen; Beaupre, Lauren; Hildebrand, Kevin; Macaulay, William; Lewis, Courtland; Cook, Donald Richard; Dobbin, Gwendolyn; Zakriya, Khwaja J; Apple, Fred S; Horney, Rebecca A; Magaziner, Jay
2011-12-29
The hemoglobin threshold at which postoperative red-cell transfusion is warranted is controversial. We conducted a randomized trial to determine whether a higher threshold for blood transfusion would improve recovery in patients who had undergone surgery for hip fracture. We enrolled 2016 patients who were 50 years of age or older, who had either a history of or risk factors for cardiovascular disease, and whose hemoglobin level was below 10 g per deciliter after hip-fracture surgery. We randomly assigned patients to a liberal transfusion strategy (a hemoglobin threshold of 10 g per deciliter) or a restrictive transfusion strategy (symptoms of anemia or at physician discretion for a hemoglobin level of <8 g per deciliter). The primary outcome was death or an inability to walk across a room without human assistance on 60-day follow-up. A median of 2 units of red cells were transfused in the liberal-strategy group and none in the restrictive-strategy group. The rates of the primary outcome were 35.2% in the liberal-strategy group and 34.7% in the restrictive-strategy group (odds ratio in the liberal-strategy group, 1.01; 95% confidence interval [CI], 0.84 to 1.22), for an absolute risk difference of 0.5 percentage points (95% CI, -3.7 to 4.7). The rates of in-hospital acute coronary syndrome or death were 4.3% and 5.2%, respectively (absolute risk difference, -0.9%; 99% CI, -3.3 to 1.6), and rates of death on 60-day follow-up were 7.6% and 6.6%, respectively (absolute risk difference, 1.0%; 99% CI, -1.9 to 4.0). The rates of other complications were similar in the two groups. A liberal transfusion strategy, as compared with a restrictive strategy, did not reduce rates of death or inability to walk independently on 60-day follow-up or reduce in-hospital morbidity in elderly patients at high cardiovascular risk. (Funded by the National Heart, Lung, and Blood Institute; FOCUS ClinicalTrials.gov number, NCT00071032.).
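The absolute risk difference and its confidence interval quoted above can be reproduced approximately with a Wald interval; the event counts below are reconstructed from the published proportions (roughly 1008 patients per arm) and are therefore approximations, not the trial's exact counts.

```python
# Risk difference between two proportions with a Wald 95% confidence interval.
from math import sqrt

def risk_difference(events1, n1, events0, n0, z=1.96):
    p1, p0 = events1 / n1, events0 / n0
    rd = p1 - p0
    se = sqrt(p1 * (1 - p1) / n1 + p0 * (1 - p0) / n0)
    return rd, rd - z * se, rd + z * se

# Liberal vs restrictive strategy, primary outcome (~35.2% vs ~34.7%)
rd, lo, hi = risk_difference(355, 1008, 350, 1008)
print(f"risk difference {rd:.1%} (95% CI {lo:.1%} to {hi:.1%})")  # about 0.5% (-3.7% to 4.7%)
```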
Income and Social Rank Influence UK Children's Behavioral Problems: A Longitudinal Analysis.
Garratt, Elisabeth A; Chandola, Tarani; Purdam, Kingsley; Wood, Alex M
2017-07-01
Children living in low-income households face elevated risks of behavioral problems, but the separate contributions of absolute and relative income to this risk remain unexplored. Using the U.K. Millennium Cohort Study data, longitudinal associations between Strengths and Difficulties Questionnaire scores and absolute household income, distance from the regional median and mean income, and regional income rank were examined in 3- to 12-year-olds (n = 16,532). Higher absolute household incomes were associated with lower behavioral problems, while higher income rank was associated with lower behavioral problems only at the highest absolute incomes. Higher absolute household incomes were associated with lower behavioral problems among children in working households, indicating compounding effects of income and socioeconomic advantages. Both absolute and relative incomes therefore appear to influence behavioral problems. © 2016 The Authors. Child Development © 2016 Society for Research in Child Development, Inc.
Tu, Yu-Kang; Gilthorpe, Mark S; Griffiths, Gareth S; Maddick, Ian H; Eaton, Kenneth A; Johnson, Newell W
2004-01-01
Statistical analyses of periodontal data that average site measurements to subject mean values are unable to explore the site-specific nature of periodontal diseases. Multilevel modeling (MLM) overcomes this, taking hierarchical structure into account. MLM was used to investigate longitudinal relationships between the outcomes of lifetime cumulative attachment loss (LCAL) and probing depth (PD) in relation to potential risk factors for periodontal disease progression. One hundred males (mean age 17 years) received a comprehensive periodontal examination at baseline and at 12 and 30 months. The resulting data were analyzed in two stages. In stage one (reported here), the absolute levels of disease were analyzed in relation to potential risk factors; in stage two (reported in a second paper), changes in disease patterns over time were analyzed in relation to the same risk factors. Each approach yielded substantially different insights. For absolute levels of disease, subject-level risk factors (covariates) had limited prediction for LCAL/PD throughout the 30-month observation period. Tooth position demonstrated a near linear relationship for both outcomes, with disease increasing from anterior to posterior teeth. Sites with subgingival calculus and bleeding on probing demonstrated more LCAL and PD, and supragingival calculus had an apparently protective effect. Covariates had more "explanatory power" for the variation in PD than for the variation in LCAL, suggesting that LCAL and PD might be generally associated with a different profile of covariates. This study provides, for a relatively young cohort, considerable insights into the factors associated with early-life periodontal disease and its progression at all levels of the natural hierarchy of sites within teeth within subjects.
Jung-Choi, K; Khang, Y H
2011-02-01
To determine the contribution of different causes of death to absolute socioeconomic inequalities in mortality for the whole population of children of South Korea aged 1-4 years and 5-9 years. A cohort study based on the national birth and death registers of Korea was performed for 3,724,347 children born in 1995-2000 and 657,209 children born in 1995 to analyse mortality among children aged 1-4 and 5-9 years old, respectively. Adjusted mortality, risk difference (RD), slope index of inequality (SII), RR and relative index of inequality were calculated. The contributions of different causes of death to absolute mortality inequalities were calculated as percentages based on RD and SII. Injuries other than from transport accidents contributed the most to total SIIs for male deaths at ages 1-4 (30.0% for father's education). The second largest contribution was from transport accident injuries (19.6% for father's education). For male deaths at ages 5-9, transport accident injuries and other injuries also accounted for most of the educational and occupational differentials in absolute mortality (63.5-90.5%). Patterns in cause-specific contribution to total inequalities in mortality among girls were generally similar to those among boys. The major contributing causes to absolute socioeconomic inequality in all-cause mortality for children aged 1-9 were external. To reduce the absolute magnitude of socioeconomic inequalities in childhood mortality, policy efforts should be directed towards injury prevention and treatment in South Korea.
Absolute and relative educational inequalities in depression in Europe.
Dudal, Pieter; Bracke, Piet
2016-09-01
To investigate (1) the size of absolute and relative educational inequalities in depression, (2) their variation between European countries, and (3) their relationship with underlying prevalence rates. Analyses are based on the European Social Survey, rounds three and six (N = 57,419). Depression is measured using the shortened Centre of Epidemiologic Studies Depression Scale. Education is coded by use of the International Standard Classification of Education. Country-specific logistic regressions are applied. Results point to an elevated risk of depressive symptoms among the lower educated. The cross-national patterns differ between absolute and relative measurements. For men, large relative inequalities are found for countries including Denmark and Sweden, but are accompanied by small absolute inequalities. For women, large relative and absolute inequalities are found in Belgium, Bulgaria, and Hungary. Results point to an empirical association between inequalities and the underlying prevalence rates. However, the strength of the association is only moderate. This research stresses the importance of including both measurements for comparative research and suggests the inclusion of the level of population health in research into inequalities in health.
Pneumococcal Conjugate Vaccine and Pneumonia Prevention in Children with Congenital Heart Disease.
Solórzano-Santos, Fortino; Espinoza-García, Lilia; Aguilar-Martínez, Glorinella; Beirana-Palencia, Luisa; Echániz-Avilés, Gabriela; Miranda-Novales, Guadalupe
2017-01-01
A successful strategy to prevent Streptococcus pneumoniae infections is the administration of pneumococcal conjugate vaccines (PCVs). To analyze the effectiveness of the 7- and 13-valent PCV for the prevention of all-cause pneumonia. A retrospective cohort of children younger than 5 years of age, with congenital heart disease (CHD) and different vaccination schedules, was analyzed. History of vaccination was confirmed with verifiable records. The outcome measure was all-cause pneumonia or bronchopneumonia. The protocol was approved by the Institutional Review Board. For comparisons, we used inferential statistics with Chi-square and Fisher's exact test; a p ≤ 0.05 was considered statistically significant. Relative and absolute risk reductions and the number needed to treat were also calculated. A total of 348 patients were included: 196 with two or more doses of PCV (considered the vaccinated group), and 152 in the unvaccinated group. There was a statistically significant difference for pneumonia events (p < 0.001) between the vaccinated (26/196) and unvaccinated (51/152) groups. The relative risk reduction was 60.5%, and the absolute risk reduction, 20.3%. There were no differences between patients who received two, three or four doses. The number needed to vaccinate to prevent one event of pneumonia was 5 children. At least two doses of PCV in children with CHD reduced the risk of all-cause pneumonia.
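A short worked example of the effect measures quoted above, computed directly from the reported event counts (26/196 vaccinated versus 51/152 unvaccinated):

```python
# Absolute risk reduction, relative risk reduction, and number needed to
# vaccinate, from raw event counts.
def effect_measures(events_exposed, n_exposed, events_control, n_control):
    risk_exposed = events_exposed / n_exposed
    risk_control = events_control / n_control
    arr = risk_control - risk_exposed        # absolute risk reduction
    rrr = arr / risk_control                 # relative risk reduction
    nnt = 1 / arr                            # number needed to vaccinate
    return arr, rrr, nnt

arr, rrr, nnt = effect_measures(26, 196, 51, 152)
print(f"ARR {arr:.1%}  RRR {rrr:.1%}  NNT {nnt:.1f}")
# ARR ~20.3%, RRR ~60.5%, NNT ~5 -- matching the figures reported in the abstract.
```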
Stone, Jennifer; Thompson, Deborah J.; dos-Santos-Silva, Isabel; Scott, Christopher; Tamimi, Rulla M.; Lindstrom, Sara; Kraft, Peter; Hazra, Aditi; Li, Jingmei; Eriksson, Louise; Czene, Kamila; Hall, Per; Jensen, Matt; Cunningham, Julie; Olson, Janet E.; Purrington, Kristen; Couch, Fergus J.; Brown, Judith; Leyland, Jean; Warren, Ruth M. L.; Luben, Robert N.; Khaw, Kay-Tee; Smith, Paula; Wareham, Nicholas J.; Jud, Sebastian M.; Heusinger, Katharina; Beckmann, Matthias W.; Douglas, Julie A.; Shah, Kaanan P.; Chan, Heang-Ping; Helvie, Mark A.; Le Marchand, Loic; Kolonel, Laurence N.; Woolcott, Christy; Maskarinec, Gertraud; Haiman, Christopher; Giles, Graham G.; Baglietto, Laura; Krishnan, Kavitha; Southey, Melissa C.; Apicella, Carmel; Andrulis, Irene L.; Knight, Julia A.; Ursin, Giske; Grenaker Alnaes, Grethe I.; Kristensen, Vessela N.; Borresen-Dale, Anne-Lise; Gram, Inger Torhild; Bolla, Manjeet K.; Wang, Qin; Michailidou, Kyriaki; Dennis, Joe; Simard, Jacques; Paroah, Paul; Dunning, Alison M.; Easton, Douglas F.; Fasching, Peter A.; Pankratz, V. Shane; Hopper, John; Vachon, Celine M.
2015-01-01
Mammographic density measures adjusted for age and body mass index (BMI) are heritable predictors of breast cancer risk but few mammographic density-associated genetic variants have been identified. Using data for 10,727 women from two international consortia, we estimated associations between 77 common breast cancer susceptibility variants and absolute dense area, percent dense area and absolute non-dense area adjusted for study, age and BMI using mixed linear modeling. We found strong support for established associations between rs10995190 (in the region of ZNF365), rs2046210 (ESR1) and rs3817198 (LSP1) and adjusted absolute and percent dense areas (all p <10−5). Of 41 recently discovered breast cancer susceptibility variants, associations were found between rs1432679 (EBF1), rs17817449 (MIR1972-2: FTO), rs12710696 (2p24.1), and rs3757318 (ESR1) and adjusted absolute and percent dense areas, respectively. There were associations between rs6001930 (MKL1) and both adjusted absolute dense and non-dense areas, and between rs17356907 (NTN4) and adjusted absolute non-dense area. Trends in all but two associations were consistent with those for breast cancer risk. Results suggested that 18% of breast cancer susceptibility variants were associated with at least one mammographic density measure. Genetic variants at multiple loci were associated with both breast cancer risk and the mammographic density measures. Further understanding of the underlying mechanisms at these loci could help identify etiological pathways implicated in how mammographic density predicts breast cancer risk. PMID:25862352
Fox, Ashley M
2012-07-01
Although health is generally believed to improve with higher wealth, research on HIV in sub-Saharan Africa has shown otherwise. Whereas researchers and advocates have frequently advanced poverty as a social determinant that can help to explain sub-Saharan Africa's disproportionate burden of HIV infection, recent evidence from population surveys suggests that HIV infection is higher among wealthier individuals. Furthermore, wealthier countries in Africa have experienced the fastest growing epidemics. Some researchers have theorized that inequality in wealth may be more important than absolute wealth in explaining why some countries have higher rates of infection and rapidly increasing epidemics. Studies taking a longitudinal approach have further suggested a dynamic process whereby wealth initially increases risk for HIV acquisition and later becomes protective. Prior studies, conducted exclusively at either the individual or the country level, have neither attempted to disentangle the effects of absolute and relative wealth on HIV infection nor to look simultaneously at different levels of analysis within countries at different stages in their epidemics. The current study used micro-, meso- and macro-level data from Demographic and Health Surveys (DHS) across 170 regions within sixteen countries in sub-Saharan Africa to test the hypothesis that socioeconomic inequality, adjusted for absolute wealth, is associated with greater risk of HIV infection. These analyses reveal that inequality trumps wealth: living in a region with greater inequality in wealth was significantly associated with increased individual risk of HIV infection, net of absolute wealth. The findings also reveal a paradox that supports a dynamic interpretation of epidemic trends: in wealthier regions/countries, individuals with less wealth were more likely to be infected with HIV, whereas in poorer regions/countries, individuals with more wealth were more likely to be infected with HIV. These findings add additional nuance to existing literature on the relationship between HIV and socioeconomic status.
Leslie, William D; Lix, Lisa M
2011-03-01
The World Health Organization (WHO) Fracture Risk Assessment Tool (FRAX) computes 10-year probability of major osteoporotic fracture from multiple risk factors, including femoral neck (FN) T-scores. Lumbar spine (LS) measurements are not currently part of the FRAX formulation but are used widely in clinical practice, and this creates confusion when there is spine-hip discordance. Our objective was to develop a hybrid 10-year absolute fracture risk assessment system in which nonvertebral (NV) fracture risk was assessed from the FN and clinical vertebral (V) fracture risk was assessed from the LS. We identified 37,032 women age 45 years and older undergoing baseline FN and LS dual-energy X-ray absorptiometry (DXA; 1990-2005) from a population database that contains all clinical DXA results for the Province of Manitoba, Canada. Results were linked to longitudinal health service records for physician billings and hospitalizations to identify nontrauma vertebral and nonvertebral fracture codes after bone mineral density (BMD) testing. The population was randomly divided into equal-sized derivation and validation cohorts. Using the derivation cohort, three fracture risk prediction systems were created from Cox proportional hazards models (adjusted for age and multiple FRAX risk factors): FN to predict combined all fractures, FN to predict nonvertebral fractures, and LS to predict vertebral (without nonvertebral) fractures. The hybrid system was the sum of nonvertebral risk from the FN model and vertebral risk from the LS model. The FN and hybrid systems were both strongly predictive of overall fracture risk (p < .001). In the validation cohort, ROC analysis showed marginally better performance of the hybrid system versus the FN system for overall fracture prediction (p = .24) and significantly better performance for vertebral fracture prediction (p < .001). In a discordance subgroup with FN and LS T-score differences greater than 1 SD, there was a significant improvement in overall fracture prediction with the hybrid method (p = .025). Risk reclassification under the hybrid system showed better alignment with observed fracture risk, with 6.4% of the women reclassified to a different risk category. In conclusion, a hybrid 10-year absolute fracture risk assessment system based on combining FN and LS information is feasible. The improvement in fracture risk prediction is small but supports clinical interest in a system that integrates LS in fracture risk assessment. Copyright © 2011 American Society for Bone and Mineral Research.
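A minimal sketch of the hybrid idea described above, assuming invented baseline survivals and hazard ratios rather than the Manitoba estimates: a 10-year nonvertebral risk from a femoral-neck Cox model and a 10-year vertebral risk from a lumbar-spine Cox model are computed separately and summed.

```python
# Hypothetical inputs only; this is not the published calibration.
from math import exp

def ten_year_risk(baseline_survival_10y, linear_predictor):
    # Cox-model absolute risk over 10 years: 1 - S0(10) ** exp(linear predictor)
    return 1 - baseline_survival_10y ** exp(linear_predictor)

# Invented log hazard ratios per SD of T-score decrease, for a patient
# 1.9 SD below reference at the femoral neck and 2.4 SD at the lumbar spine.
lp_nonvertebral = 0.55 * 1.9      # femoral neck model -> nonvertebral fractures
lp_vertebral = 0.70 * 2.4         # lumbar spine model -> clinical vertebral fractures

nonvertebral_risk = ten_year_risk(0.94, lp_nonvertebral)
vertebral_risk = ten_year_risk(0.985, lp_vertebral)
print(f"hybrid 10-year fracture risk: {nonvertebral_risk + vertebral_risk:.1%}")
```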
Effects of baseline risk information on social and individual choices.
Gyrd-Hansen, Dorte; Kristiansen, Ivar Sønbø; Nexøe, Jørgen; Nielsen, Jesper Bo
2002-01-01
This article analyzes preferences for risk reductions in the context of individual and societal decision making. The effect of information on baseline risk is analyzed in both contexts. The results indicate that if individuals are to imagine that they suffer from 1 low-risk and 1 high-risk ailment, and are offered a specified identical absolute risk reduction, a majority will ceteris paribus opt for treatment of the low-risk ailment. A different preference structure is elicited when priority questions are framed as social choices. Here, a majority will prefer to treat the high-risk group of patients. The preference reversal demonstrates the extent to which baseline risk information can influence preferences in different choice settings. It is argued that presentation of baseline risk information may induce framing effects that lead to nonoptimal resource allocations. A solution to this problem may be to not present group-specific baseline risk information when eliciting preferences.
Maas, Paige; Barrdahl, Myrto; Joshi, Amit D.; Auer, Paul L.; Gaudet, Mia M.; Milne, Roger L.; Schumacher, Fredrick R.; Anderson, William F.; Check, David; Chattopadhyay, Subham; Baglietto, Laura; Berg, Christine D.; Chanock, Stephen J.; Cox, David G.; Figueroa, Jonine D.; Gail, Mitchell H.; Graubard, Barry I.; Haiman, Christopher A.; Hankinson, Susan E.; Hoover, Robert N.; Isaacs, Claudine; Kolonel, Laurence N.; Le Marchand, Loic; Lee, I-Min; Lindström, Sara; Overvad, Kim; Romieu, Isabelle; Sanchez, Maria-Jose; Southey, Melissa C.; Stram, Daniel O.; Tumino, Rosario; VanderWeele, Tyler J.; Willett, Walter C.; Zhang, Shumin; Buring, Julie E.; Canzian, Federico; Gapstur, Susan M.; Henderson, Brian E.; Hunter, David J.; Giles, Graham G; Prentice, Ross L.; Ziegler, Regina G.; Kraft, Peter; Garcia-Closas, Montse; Chatterjee, Nilanjan
2017-01-01
IMPORTANCE An improved model for risk stratification can be useful for guiding public health strategies of breast cancer prevention. OBJECTIVE To evaluate combined risk stratification utility of common low penetrant single nucleotide polymorphisms (SNPs) and epidemiologic risk factors. DESIGN, SETTING, AND PARTICIPANTS Using a total of 17 171 cases and 19 862 controls sampled from the Breast and Prostate Cancer Cohort Consortium (BPC3) and 5879 women participating in the 2010 National Health Interview Survey, a model for predicting absolute risk of breast cancer was developed combining information on individual level data on epidemiologic risk factors and 24 genotyped SNPs from prospective cohort studies, published estimate of odds ratios for 68 additional SNPs, population incidence rate from the National Cancer Institute-Surveillance, Epidemiology, and End Results Program cancer registry and data on risk factor distribution from nationally representative health survey. The model is used to project the distribution of absolute risk for the population of white women in the United States after adjustment for competing cause of mortality. EXPOSURES Single nucleotide polymorphisms, family history, anthropometric factors, menstrual and/or reproductive factors, and lifestyle factors. MAIN OUTCOMES AND MEASURES Degree of stratification of absolute risk owing to nonmodifiable (SNPs, family history, height, and some components of menstrual and/or reproductive history) and modifiable factors (body mass index [BMI; calculated as weight in kilograms divided by height in meters squared], menopausal hormone therapy [MHT], alcohol, and smoking). RESULTS The average absolute risk for a 30-year-old white woman in the United States developing invasive breast cancer by age 80 years is 11.3%. A model that includes all risk factors provided a range of average absolute risk from 4.4% to 23.5% for women in the bottom and top deciles of the risk distribution, respectively. For women who were at the lowest and highest deciles of nonmodifiable risks, the 5th and 95th percentile range of the risk distribution associated with 4 modifiable factors was 2.9% to 5.0% and 15.5% to 25.0%, respectively. For women in the highest decile of risk owing to nonmodifiable factors, those who had low BMI, did not drink or smoke, and did not use MHT had risks comparable to an average woman in the general population. CONCLUSIONS AND RELEVANCE This model for absolute risk of breast cancer including SNPs can provide stratification for the population of white women in the United States. The model can also identify subsets of the population at an elevated risk that would benefit most from risk-reduction strategies based on altering modifiable factors. The effectiveness of this model for individual risk communication needs further investigation. PMID:27228256
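A hedged sketch of how an absolute-risk projection of this general kind can be assembled: age-specific incidence is scaled by an individual relative risk and accumulated while discounting for competing mortality. The rates and relative risks below are placeholders, not the BPC3/SEER inputs used in the study.

```python
# Discrete-time absolute risk projection with competing mortality.
def absolute_risk(ages, incidence, other_mortality, rel_risk):
    survival, cumulative_risk = 1.0, 0.0
    for a in ages:
        hazard = incidence[a] * rel_risk
        cumulative_risk += survival * hazard                  # newly affected this year
        survival *= (1 - hazard) * (1 - other_mortality[a])   # still alive and unaffected
    return cumulative_risk

ages = range(30, 80)
incidence = {a: 0.0005 + 0.00004 * (a - 30) for a in ages}       # placeholder yearly incidence
other_mortality = {a: 0.0008 * 1.09 ** (a - 30) for a in ages}   # placeholder competing mortality

print(f"risk to age 80 at RR=1: {absolute_risk(ages, incidence, other_mortality, 1.0):.1%}")
print(f"risk to age 80 at RR=2: {absolute_risk(ages, incidence, other_mortality, 2.0):.1%}")
```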
12 CFR Appendix C to Part 325 - Risk-Based Capital for State Non-Member Banks: Market Risk
Code of Federal Regulations, 2012 CFR
2012-01-01
[Truncated regulatory excerpt; the recoverable text concerns covered debt instruments subject to a non-zero specific risk capital charge, again computed from the absolute value of the current market value of each net long or short position.]
"Don't know" responses to risk perception measures: implications for underserved populations.
Waters, Erika A; Hay, Jennifer L; Orom, Heather; Kiviniemi, Marc T; Drake, Bettina F
2013-02-01
Risk perceptions are legitimate targets for behavioral interventions because they can motivate medical decisions and health behaviors. However, some survey respondents may not know (or may not indicate) their risk perceptions. The scope of "don't know" (DK) responding is unknown. Examine the prevalence and correlates of responding DK to items assessing perceived risk of colorectal cancer. Two nationally representative, population-based, cross-sectional surveys (2005 National Health Interview Survey [NHIS]; 2005 Health Information National Trends Survey [HINTS]), and one primary care clinic-based survey comprised of individuals from low-income communities. Analyses included 31,202 (NHIS), 1,937 (HINTS), and 769 (clinic) individuals. Five items assessed perceived risk of colorectal cancer. Four of the items differed in format and/or response scale: comparative risk (NHIS, HINTS); absolute risk (HINTS, clinic), and "likelihood" and "chance" response scales (clinic). Only the clinic-based survey included an explicit DK response option. "Don't know" responding was 6.9% (NHIS), 7.5% (HINTS-comparative), and 8.7% (HINTS-absolute). "Don't know" responding was 49.1% and 69.3% for the "chance" and "likely" response options (clinic). Correlates of DK responding were characteristics generally associated with disparities (e.g., low education), but the pattern of results varied among samples, question formats, and response scales. The surveys were developed independently and employed different methodologies and items. Consequently, the results were not directly comparable. There may be multiple explanations for differences in the magnitude and characteristics of DK responding. "Don't know" responding is more prevalent in populations affected by health disparities. Either not assessing or not analyzing DK responses could further disenfranchise these populations and negatively affect the validity of research and the efficacy of interventions seeking to eliminate health disparities.
Factoring socioeconomic status into cardiac performance profiling for hospitals: does it matter?
Alter, David A; Austin, Peter C; Naylor, C David; Tu, Jack V
2002-01-01
Critics of "scorecard medicine" often highlight the incompleteness of risk-adjustment methods used when accounting for baseline patient differences. Although socioeconomic status is a highly important determinant of adverse outcome for patients admitted to the hospital with acute myocardial infarction, it has not been used in most risk-adjustment models for cardiovascular report cards. To determine the incremental impact of socioeconomic status adjustments on age, sex, and illness severity for hospital-specific 30-day mortality rates after acute myocardial infarction. The authors compared the absolute and relative hospital-specific 30-day acute myocardial infarction mortality rates in 169 hospitals throughout Ontario between April 1, 1994 and March 31, 1997. Patient socioeconomic status was characterized by median neighborhood income using postal codes and 1996 Canadian census data. They examined two risk-adjustment models: the first adjusted for age, sex, and illness severity (standard), whereas the second adjusted for age, sex, illness severity, and median neighborhood income level (socioeconomic status). There was an extremely strong correlation between 'standard' and 'socioeconomic status' risk-adjusted mortality rates (r = 0.99). Absolute differences in 30-day risk-adjusted mortality rates between the socioeconomic status and standard risk-adjustment models were small (median, 0.1%; 25th-75th percentile, 0.1-0.2). The agreement in the quintile rankings of hospitals between the socioeconomic status and standard risk-adjustment models was high (weighted kappa = 0.93). Despite its importance as a determinant of patient outcomes, the effect of socioeconomic status on hospital-specific mortality rates over and above standard risk-adjustment methods for acute myocardial infarction hospital profiling in Ontario was negligible.
Gaziano, Thomas A; Opie, Lionel H; Weinstein, Milton C
2008-01-01
Summary Background Cardiovascular disease is the leading cause of death, with 80% of cases occurring in developing countries. We therefore aimed to establish whether use of evidence-based multidrug regimens for patients at high risk for cardiovascular disease would be cost-effective in low-income and middle-income countries. Methods We used a Markov model to do a cost-effectiveness analysis with two combination regimens. For primary prevention, we used aspirin, a calcium-channel blocker, an angiotensin-converting-enzyme inhibitor, and a statin, and assessed them in four groups with different thresholds of absolute risks for cardiovascular disease. For secondary prevention, we assessed the same combination of drugs in one group, but substituted a β blocker for the calcium-channel blocker. To compare strategies, we report incremental cost-effectiveness ratios (ICER), in US$ per quality-adjusted life-year (QALY). Findings We recorded that preventive strategies could result in a 2-year gain in life expectancy. Across six developing World Bank regions, primary prevention yielded ICERs of US$746–890/QALY gained for patients with a 10-year absolute risk of cardiovascular disease greater than 25%, and $1039–1221/QALY gained for those with an absolute risk greater than 5%. ICERs for secondary prevention ranged from $306/QALY to $388/QALY gained. Interpretation Regimens of aspirin, two blood-pressure drugs, and a statin could halve the risk of death from cardiovascular disease in high-risk patients. This approach is cost-effective according to WHO recommendations, and is robust across several estimates of drug efficacy and of treatment cost. Developing countries should encourage the use of these inexpensive drugs that are currently available for both primary and secondary prevention. PMID:16920473
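A minimal worked example of the incremental cost-effectiveness ratio (ICER) metric used above; the costs and QALYs are invented placeholders, not outputs of the study's Markov model.

```python
# ICER: incremental cost divided by incremental QALYs versus the comparator.
def icer(cost_new, qaly_new, cost_comparator, qaly_comparator):
    return (cost_new - cost_comparator) / (qaly_new - qaly_comparator)

print(f"ICER: ${icer(1450.0, 11.9, 700.0, 11.0):,.0f} per QALY gained")
```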
van Rein, N; Biedermann, J S; van der Meer, F J M; Cannegieter, S C; Wiersma, N; Vermaas, H W; Reitsma, P H; Kruip, M J H A; Lijfering, W M
2017-07-01
Essentials Low-molecular-weight-heparins (LMWH) kinetics differ which may result in different bleeding risks. A cohort of 12 934 venous thrombosis patients on LMWH was followed until major bleeding. The absolute major bleeding risk was low among patients registered at the anticoagulation clinic. Once-daily dosing was associated with a lower bleeding risk as compared with twice-daily. Background Low-molecular-weight heparins (LMWHs) are considered members of a class of drugs with similar anticoagulant properties. However, pharmacodynamics and pharmacokinetics between LMWHs differ, which may result in different bleeding risks. As these agents are used by many patients, small differences may lead to a large effect on numbers of major bleeding events. Objectives To determine major bleeding risks for different LMWH agents and dosing schedules. Methods A cohort of acute venous thrombosis patients from four anticoagulation clinics who used an LMWH and a vitamin K antagonist were followed until they ceased LMWH treatment or until major bleeding. Exposures were classified according to different types of LMWHs and for b.i.d. and o.d. use. Cumulative incidences for major bleeding per 1000 patients and risk ratios were calculated and adjusted for study center. Results The study comprised 12 934 patients with a mean age of 59 years; 6218 (48%) were men. The cumulative incidence of major bleeding was 2.5 per 1000 patients (95% confidence interval [CI], 1.7-3.5). Enoxaparin b.i.d. or o.d. was associated with a relative bleeding risk of 1.7 (95% CI, 0.2-17.5) compared with nadroparin o.d. In addition, a nadroparin b.i.d. dosing schedule was associated with a 2.0-fold increased major bleeding risk (95% CI, 0.8-5.1) as compared with a nadroparin o.d. dosing schedule. Conclusions Absolute major bleeding rates were low for all LMWH agents and dosing schedules in a large unselected cohort. Nevertheless, twice-daily dosing with nadroparin appeared to be associated with an increased major bleeding risk as compared with once-daily dosing, as also suggested in a meta-analysis of controlled clinical trials. © 2017 International Society on Thrombosis and Haemostasis.
Kim, Hye Kyung; Lwin, May O
2017-10-01
Although culture is acknowledged as an important factor that influences health, little is known about cultural differences pertaining to cancer-related beliefs and prevention behaviors. This study examines two culturally influenced beliefs-fatalistic beliefs about cancer prevention, and optimistic beliefs about cancer risk-to identify reasons for cultural disparity in the engagement of cancer prevention behaviors. We utilized data from national surveys of European Americans in the United States (Health Information National Trends Survey 4, Cycle3; N = 1,139) and Asians in Singapore (N = 1,200) to make cultural comparisons. The odds of an Asian adhering to prevention recommendations were less than half the odds of a European American, with the exception of smoking avoidance. Compared to European Americans, Asians were more optimistic about their cancer risk both in an absolute and a comparative sense, and held stronger fatalistic beliefs about cancer prevention. Mediation analyses revealed that fatalistic beliefs and absolute risk optimism among Asians partially explain their lower engagement in prevention behaviors, whereas comparative risk optimism increases their likelihood of adhering to prevention behaviors. Our findings underscore the need for developing culturally targeted interventions in communicating cancer causes and prevention.
Redaniel, Maria Theresa M; Laudico, Adriano; Mirasol-Lumague, Maria Rica; Gondos, Adam; Uy, Gemma Leonora; Toral, Jean Ann; Benavides, Doris; Brenner, Hermann
2009-09-24
In contrast to most other forms of cancer, data from some developing and developed countries show surprisingly similar survival rates for ovarian cancer. We aimed to compare ovarian cancer survival in Philippine residents, Filipino-Americans and Caucasians living in the US, using a high resolution approach, taking potential differences in prognostic factors into account. Using databases from the SEER 13 and from the Manila and Rizal Cancer Registries, age-adjusted five-year absolute and relative survival estimates were computed using the period analysis method and compared between Filipino-American ovarian cancer patients with cancer patients from the Philippines and Caucasians in the US. Cox proportional hazards modelling was used to determine factors affecting survival differences. Despite more favorable distribution of age and cancer morphology and similar stage distribution, 5-year absolute and relative survival were lower in Philippine residents (Absolute survival, AS, 44%, Standard Error, SE, 2.9 and Relative survival, RS, 49.7%, SE, 3.7) than in Filipino-Americans (AS, 51.3%, SE, 3.1 and RS, 54.1%, SE, 3.4). After adjustment for these and additional covariates, strong excess risk of death for Philippine residents was found (Relative Risk, RR, 2.45, 95% confidence interval, 95% CI, 1.99-3.01). In contrast, no significant differences were found between Filipino-Americans and Caucasians living in the US. Multivariate analyses disclosed strong survival disadvantages of Philippine residents compared to Filipino-American patients, for which differences in access to health care might have played an important role. Survival is no worse among Filipino-Americans than among Caucasians living in the US.
Presentation of Evidence in Continuing Medical Education Programs: A Mixed Methods Study
ERIC Educational Resources Information Center
Allen, Michael; MacLeod, Tanya; Handfield-Jones, Richard; Sinclair, Douglas; Fleming, Michael
2010-01-01
Introduction: Clinical trial data can be presented in ways that exaggerate treatment effectiveness. Physicians consider therapy more effective, and may be more likely to make inappropriate practice changes, when data are presented in relative terms such as relative risk reduction rather than in absolute terms such as absolute risk reduction and…
12 CFR 324.210 - Standardized measurement method for specific risk.
Code of Federal Regulations, 2014 CFR
2014-01-01
[Truncated regulatory excerpt; the recoverable text states that purchased credit protection is capped at the current fair value of the transaction plus an absolute-value term, that a fully hedged debt or securitization position has a specific risk add-on of zero, and that the FDIC-supervised institution must multiply the absolute value of the current market value of each net long or short debt or securitization position.]
Levofloxacin to prevent bacterial infection in patients with cancer and neutropenia.
Bucaneve, Giampaolo; Micozzi, Alessandra; Menichetti, Francesco; Martino, Pietro; Dionisi, M Stella; Martinelli, Giovanni; Allione, Bernardino; D'Antonio, Domenico; Buelli, Maurizio; Nosari, A Maria; Cilloni, Daniela; Zuffa, Eliana; Cantaffa, Renato; Specchia, Giorgina; Amadori, Sergio; Fabbiano, Francesco; Deliliers, Giorgio Lambertenghi; Lauria, Francesco; Foà, Robin; Del Favero, Albano
2005-09-08
The prophylactic use of fluoroquinolones in patients with cancer and neutropenia is controversial and is not a recommended intervention. We randomly assigned 760 consecutive adult patients with cancer in whom chemotherapy-induced neutropenia (<1000 neutrophils per cubic millimeter) was expected to occur for more than seven days to receive either oral levofloxacin (500 mg daily) or placebo from the start of chemotherapy until the resolution of neutropenia. Patients were stratified according to their underlying disease (acute leukemia vs. solid tumor or lymphoma). An intention-to-treat analysis showed that fever was present for the duration of neutropenia in 65 percent of patients who received levofloxacin prophylaxis, as compared with 85 percent of those receiving placebo (243 of 375 vs. 308 of 363; relative risk, 0.76; absolute difference in risk, -20 percent; 95 percent confidence interval, -26 to -14 percent; P=0.001). The levofloxacin group had a lower rate of microbiologically documented infections (absolute difference in risk, -17 percent; 95 percent confidence interval, -24 to -10 percent; P<0.001), bacteremias (difference in risk, -16 percent; 95 percent confidence interval, -22 to -9 percent; P<0.001), and single-agent gram-negative bacteremias (difference in risk, -7 percent; 95 percent confidence interval, -10 to -2 percent; P<0.01) than did the placebo group. Mortality and tolerability were similar in the two groups. The effects of prophylaxis were also similar between patients with acute leukemia and those with solid tumors or lymphoma. Prophylactic treatment with levofloxacin is an effective and well-tolerated way of preventing febrile episodes and other relevant infection-related outcomes in patients with cancer and profound and protracted neutropenia. The long-term effect of this intervention on microbial resistance in the community is not known. Copyright 2005 Massachusetts Medical Society.
Donin, A S; Nightingale, C M; Owen, C G; Rudnicka, A R; McNamara, M C; Prynne, C J; Stephen, A M; Cook, D G; Whincup, P H
2010-07-01
In the UK, South Asian adults have increased risks of CHD, type 2 diabetes and central obesity. Black African-Caribbeans, in contrast, have increased risks of type 2 diabetes and general obesity but lower CHD risk. There is growing evidence that these risk differences emerge in early life and that nutritional factors may be important. We have therefore examined the variations in nutritional composition of the diets of South Asian, black African-Caribbean and white European children, using 24 h recalls of dietary intake collected during a cross-sectional survey of cardiovascular health in eighty-five primary schools in London, Birmingham and Leicester. In all, 2209 children aged 9-10 years took part, including 558 of South Asian, 560 of black African-Caribbean and 543 of white European ethnicity. Compared with white Europeans, South Asian children reported higher mean total energy intake; their intakes of total fat, polyunsaturated fat and protein (both absolute and as proportions of total energy intake) were higher and their intakes of carbohydrate as a proportion of energy (particularly sugars), vitamin C and D, Ca and haem Fe were lower. These differences were especially marked for Bangladeshi children. Black African-Caribbean children had lower intakes of total and saturated fat (both absolute and as proportions of energy intake), NSP, vitamin D and Ca. The lower total and saturated fat intakes were particularly marked among black African children. Appreciable ethnic differences exist in the nutritional composition of children's diets, which may contribute to future differences in chronic disease risk.
Chen, Lei; Peeters, Anna; Magliano, Dianna J; Shaw, Jonathan E; Welborn, Timothy A; Wolfe, Rory; Zimmet, Paul Z; Tonkin, Andrew M
2007-12-01
Framingham risk functions are widely used for prediction of future cardiovascular disease events. They do not, however, include anthropometric measures of overweight or obesity, now considered a major cardiovascular disease risk factor. We aimed to establish the most appropriate anthropometric index and its optimal cutoff point for use as an ancillary measure in clinical practice when identifying people with increased absolute cardiovascular risk estimates. Analysis of a population-based, cross-sectional survey was carried out. The 1991 Framingham prediction equations were used to compute 5 and 10-year risks of cardiovascular or coronary heart disease in 7191 participants from the Australian Diabetes, Obesity and Lifestyle Study (1999-2000). Receiver operating characteristic curve analysis was used to compare measures of body mass index (BMI), waist circumference, and waist-to-hip ratio in identifying participants estimated to be at 'high', or at 'intermediate or high' absolute risk. After adjustment for BMI and age, waist-to-hip ratio showed stronger correlation with absolute risk estimates than waist circumference. The areas under the receiver operating characteristic curve for waist-to-hip ratio (0.67-0.70 in men, 0.64-0.74 in women) were greater than those for waist circumference (0.60-0.65, 0.59-0.71) or BMI (0.52-0.59, 0.53-0.66). The optimal cutoff points of BMI, waist circumference and waist-to-hip ratio to predict people at 'high', or at 'intermediate or high' absolute risk estimates were 26 kg/m2, 95 cm and 0.90 in men, and 25-26 kg/m2, 80-85 cm and 0.80 in women, respectively. Measurement of waist-to-hip ratio is more useful than BMI or waist circumference in the identification of individuals estimated to be at increased risk for future primary cardiovascular events.
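One common way to derive an "optimal cutoff point" from an ROC analysis, as in the comparison above, is to maximize the Youden index (sensitivity + specificity − 1). The sketch below uses synthetic waist-to-hip-ratio data; the distributions and the resulting cutoff are illustrative assumptions, not values from the Australian study.

```python
# Optimal cutoff via Youden's J on synthetic data.
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

rng = np.random.default_rng(2)
high_risk = rng.binomial(1, 0.2, 5000)                 # synthetic "high absolute risk" flag
whr = rng.normal(0.85 + 0.06 * high_risk, 0.07)        # synthetic waist-to-hip ratios

fpr, tpr, thresholds = roc_curve(high_risk, whr)
best = np.argmax(tpr - fpr)                            # index maximizing Youden's J
print(f"AUC {roc_auc_score(high_risk, whr):.2f}, optimal WHR cutoff ~{thresholds[best]:.2f}")
```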
van Vlijmen, E F W; Veeger, N J G M; Middeldorp, S; Hamulyák, K; Prins, M H; Kluin-Nelemans, H C; Meijer, K
2016-09-01
Essentials It is unknown if a male or female thrombotic family history influences risk in female relatives. We assessed thrombotic risk in female relatives of male and female patients with thrombosis. A hormonally related female thrombotic family history further increases risk in female relatives. This information could be important in counseling women on contraceptive options. Click to hear Prof. Rosendaal's perspective on venous thrombosis: etiology, pathogenesis, and prognosis Background Women from thrombophilic families have increased risk of venous thromboembolism (VTE), which increases further during oral contraceptive (COC) use and pregnancy-postpartum. Whether this additional risk differs between relatives of male and female patients, or is different when that female patient had a hormonally related VTE (during COC use/pregnancy), is unknown. Methods One thousand five female relatives of consecutive patients with VTE from a family-based cohort were retrospectively followed for incident VTE from ages 15 to 50, first VTE, or study inclusion. Absolute and relative VTE risks adjusted for factors of patients (sex, age) and relatives (thrombophilia, COC use, pregnancy) were estimated in relatives of female and male patients and in relatives of female patients with and without hormonally related VTE. Results Absolute risk in relatives of female (0.32 [95% confidence interval [CI] 0.23-0.43]) vs. male patients (0.39 [95% CI 0.28-0.53]) was comparable. However, the heterogeneity analysis of risk estimates suggested that in relatives of female vs. male patients, the contribution of pregnancy-postpartum (hazard ratio [HR] 11.6 [95% CI 6.3-21.3] vs. HR6.6 [95% CI 2.8-15.2]) and, to a lesser extent, COC use (HR3.6 [95% CI 1.8-7.1] vs. HR2.7 [95% CI 1.5-5.0]) to the VTE risk differs. Absolute risk was significantly higher in relatives of female patients with hormonally related VTE (0.43 [95% CI 0.3-0.6]) vs. relatives of female patients without hormonally related VTE (0.13 [95% CI 0.05-0.27]), HR3.28 [95% CI 1.5-7.9]). The higher contribution of pregnancy-postpartum and COC use to the VTE risk was mainly observed in relatives of patients with hormonally related VTE. Conclusions These findings suggest that a family history from a female patient, especially when VTE was hormonally related, may further increase VTE risk in her female relatives. This information could be important in counseling women on contraceptive options. © 2016 The Authors. Journal of Thrombosis and Haemostasis published by Wiley Periodicals, Inc. on behalf of International Society on Thrombosis and Haemostasis.
Welk, Blayne; McArthur, Eric; Fraser, Lisa-Ann; Hayward, Jade; Dixon, Stephanie; Hwang, Y Joseph; Ordon, Michael
2015-10-26
Do men starting treatment with prostate-specific α antagonists have increased risk of fall and fracture? Administrative datasets from the province of Ontario, Canada, that contain patient level data were used to generate a cohort of 147,084 men aged ≥ 66 years who filled their first outpatient prescription for prostate-specific α antagonists tamsulosin, alfuzosin, or silodosin between June 2003 and December 2013 (exposed men) plus an equal sized cohort matched 1:1 (using a propensity score model) who did not initiate α antagonist therapy. The primary outcome was a hospital emergency room visit or inpatient admission for a fall or fracture in the 90 days after exposure. The men exposed to prostate-specific α antagonist had significantly increased risks of falling (odds ratio 1.14 (95% CI 1.07 to 1.21), absolute risk increase 0.17% (0.08 to 0.25%)) and of sustaining a fracture (odds ratio 1.16 (1.04 to 1.29), absolute risk increase 0.06% (0.02 to 0.11%)) compared with the unexposed cohort. This increased risk was not observed in the period before α antagonist use. Secondary outcomes of hypotension and head trauma were also significantly increased in the exposed cohort (odds ratios 1.80 (1.59 to 2.03) and 1.15 (1.04 to 1.27) respectively). The two cohorts were similar across 98 different covariates including demographics, comorbid conditions, medication use, healthcare use, and prior medical investigation. Potential unmeasured confounders, such as physical deconditioning, mobility impairment, and situational risk factors, may exist. The data used to identify the primary outcomes had limited sensitivity, so the absolute risks of the outcomes are probably underestimates. The study only included men ≥ 66 years old, and 84% of exposed men were prescribed tamsulosin, so results may not be generalizable to younger men, and there may not be statistical power to show small differences in outcomes between the drugs. Prostate-specific α antagonists are associated with a small but significant increased risk of fall, fracture, and head trauma, probably as a result of induced hypotension. This project was conducted at the Institute for Clinical Evaluative Sciences (ICES) Western Site through the Kidney, Dialysis, and Transplantation (KDT) research program. BW has received a research grant from Astellas, and L-AF does consultancy for Amgen. © Welk et al 2015.
Dixon, Stephanie N.; Kuwornu, Paul John; Dev, Varun K.; Montero-Odasso, Manuel; Burneo, Jorge; Garg, Amit X.
2018-01-01
Gabapentin is an effective treatment for chronic neuropathic pain but may cause dizziness, drowsiness, and confusion in some older adults. The goal of this study was to assess the association between gabapentin dosing and adverse outcomes by obtaining estimates of the 30-day risk of hospitalization with altered mental status and mortality in older adults (mean age 76 years) in Ontario, Canada, initiated on high dose (>600 mg/day; n = 34,159) compared to low dose (≤600 mg/day; n = 76,025) oral gabapentin in routine outpatient care. A population-based, retrospective cohort study assessing new gabapentin use between 2002 and 2014 was conducted. The primary outcome was 30-day hospitalization with an urgent head computed tomography (CT) scan in the absence of evidence of stroke (a proxy for altered mental status). The secondary outcome was 30-day all-cause mortality. The baseline characteristics measured in the two dose groups were similar. Initiation of a high versus low dose of gabapentin was associated with a higher risk of hospitalization with head CT scan (1.27% vs. 1.06%, absolute risk difference 0.21%, adjusted relative risk 1.29 [95% CI 1.14 to 1.46], number needed to treat 477) but not a statistically significant higher risk of mortality (1.25% vs. 1.16%, absolute risk difference of 0.09%, adjusted relative risk of 1.01 [95% CI 0.89 to 1.14]). Overall, the risk of being hospitalized with altered mental status after initiating gabapentin remains low, but may be reduced through the judicious use of gabapentin, use of the lowest dose to control pain, and vigilance for early signs of altered mental status. PMID:29538407
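The arithmetic behind the reported absolute risk difference and "number needed" figure is simple; the sketch below (Python, with the percentages taken from the abstract and rounding of the published figures assumed) shows how the two quantities relate.

```python
# Minimal sketch: absolute risk difference (ARD) and its reciprocal "number needed"
# metric, using the high- vs. low-dose hospitalization risks quoted in the abstract.
# Rounding in the published figures is assumed to explain small discrepancies.

def absolute_risk_difference(risk_exposed: float, risk_unexposed: float) -> float:
    """Difference in absolute risk between two groups (as a proportion)."""
    return risk_exposed - risk_unexposed

def number_needed(ard: float) -> float:
    """Reciprocal of the ARD: patients exposed per one additional event."""
    return 1.0 / ard

if __name__ == "__main__":
    high_dose, low_dose = 0.0127, 0.0106                 # 1.27% vs. 1.06%
    ard = absolute_risk_difference(high_dose, low_dose)
    print(f"ARD = {ard:.4%}")                            # ~0.21%
    print(f"Number needed = {number_needed(ard):.0f}")   # ~476, close to the reported 477
```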
Lundberg, Frida E; Johansson, Anna L V; Rodriguez-Wallberg, Kenny; Brand, Judith S; Czene, Kamila; Hall, Per; Iliadou, Anastasia N
2016-04-13
Ovarian stimulation drugs, in particular hormonal agents used for controlled ovarian stimulation (COS) required to perform in vitro fertilization, increase estrogen and progesterone levels and have therefore been suspected to influence breast cancer risk. This study aims to investigate whether infertility and hormonal fertility treatment influences mammographic density, a strong hormone-responsive risk factor for breast cancer. Cross-sectional study including 43,313 women recruited to the Karolinska Mammography Project between 2010 and 2013. Among women who reported having had infertility, 1576 had gone through COS, 1429 had had hormonal stimulation without COS and 5958 had not received any hormonal fertility treatment. Percent and absolute mammographic densities were obtained using the volumetric method Volpara™. Associations with mammographic density were assessed using multivariable generalized linear models, estimating mean differences (MD) with 95 % confidence intervals (CI). After multivariable adjustment, women with a history of infertility had 1.53 cm(3) higher absolute dense volume compared to non-infertile women (95 % CI: 0.70 to 2.35). Among infertile women, only those who had gone through COS treatment had a higher absolute dense volume than those who had not received any hormone treatment (adjusted MD 3.22, 95 % CI: 1.10 to 5.33). No clear associations were observed between infertility, fertility treatment and percent volumetric density. Overall, women reporting infertility had more dense tissue in the breast. The higher absolute dense volume in women treated with COS may indicate a treatment effect, although part of the association might also be due to the underlying infertility. Continued monitoring of cancer risk in infertile women, especially those who undergo COS, is warranted.
Office blood pressure or ambulatory blood pressure for the prediction of cardiovascular events.
Mortensen, Rikke Nørmark; Gerds, Thomas Alexander; Jeppesen, Jørgen Lykke; Torp-Pedersen, Christian
2017-11-21
To determine the added value of (i) 24-h ambulatory blood pressure relative to office blood pressure and (ii) night-time ambulatory blood pressure relative to daytime ambulatory blood pressure for 10-year person-specific absolute risks of fatal and non-fatal cardiovascular events. A total of 7927 participants were included from the International Database on Ambulatory blood pressure monitoring in relation to Cardiovascular Outcomes. We used cause-specific Cox regression to predict 10-year person-specific absolute risks of fatal and non-fatal cardiovascular events. Discrimination of 10-year outcomes was assessed by time-dependent area under the receiver operating characteristic curve (AUC). No differences in predicted risks were observed when comparing office blood pressure and ambulatory blood pressure. The median difference in 10-year risks (1st; 3rd quartile) was -0.01% (-0.3%; 0.1%) for cardiovascular mortality and -0.1% (-1.1%; 0.5%) for cardiovascular events. The difference in AUC (95% confidence interval) was 0.65% (0.22-1.08%) for cardiovascular mortality and 1.33% (0.83-1.84%) for cardiovascular events. Comparing daytime and night-time blood pressure, the median difference in 10-year risks was 0.002% (-0.1%; 0.1%) for cardiovascular mortality and -0.01% (-0.5%; 0.2%) for cardiovascular events. The difference in AUC was 0.10% (-0.08 to 0.29%) for cardiovascular mortality and 0.15% (-0.06 to 0.35%) for cardiovascular events. Ten-year predictions obtained from ambulatory blood pressure are similar to predictions from office blood pressure. Night-time blood pressure does not improve 10-year predictions obtained from daytime measurements. For an otherwise healthy population sufficient prognostic accuracy of cardiovascular risks can be achieved with office blood pressure. Published on behalf of the European Society of Cardiology. All rights reserved. © The Author 2017. For permissions, please email: journals.permissions@oup.com.
Absolute Risk Aversion and the Returns to Education.
ERIC Educational Resources Information Center
Brunello, Giorgio
2002-01-01
Uses 1995 Italian household income and wealth survey to measure individual absolute risk aversion of 1,583 married Italian male household heads. Uses this measure as an instrument for attained education in a standard-log earnings equation. Finds that the IV estimate of the marginal return to schooling is much higher than the ordinary least squares…
12 CFR 217.210 - Standardized measurement method for specific risk
Code of Federal Regulations, 2014 CFR
2014-01-01
... current fair value of the transaction plus the absolute value of the present value of all remaining... a securitization position and its credit derivative hedge has a specific risk add-on of zero if: (i... institution must multiply the absolute value of the current fair value of each net long or net short debt or...
12 CFR 3.210 - Standardized measurement method for specific risk
Code of Federal Regulations, 2014 CFR
2014-01-01
... purchased credit protection is capped at the current fair value of the transaction plus the absolute value... specific risk add-on of zero if: (i) The debt or securitization position is fully hedged by a total return... absolute value of the current fair value of each net long or net short debt or securitization position in...
Taber, Jennifer M; Klein, William M P; Ferrer, Rebecca A; Lewis, Katie L; Biesecker, Leslie G; Biesecker, Barbara B
2015-07-01
Dispositional optimism and risk perceptions are each associated with health-related behaviors and decisions and other outcomes, but little research has examined how these constructs interact, particularly in consequential health contexts. The predictive validity of risk perceptions for health-related information seeking and intentions may be improved by examining dispositional optimism as a moderator, and by testing alternate types of risk perceptions, such as comparative and experiential risk. Participants (n = 496) had their genomes sequenced as part of a National Institutes of Health pilot cohort study (ClinSeq®). Participants completed a cross-sectional baseline survey of various types of risk perceptions and intentions to learn genome sequencing results for differing disease risks (e.g., medically actionable, nonmedically actionable, carrier status) and to use this information to change their lifestyle/health behaviors. Risk perceptions (absolute, comparative, and experiential) were largely unassociated with intentions to learn sequencing results. Dispositional optimism and comparative risk perceptions interacted, however, such that individuals higher in optimism reported greater intentions to learn all 3 types of sequencing results when comparative risk was perceived to be higher than when it was perceived to be lower. This interaction was inconsistent for experiential risk and absent for absolute risk. Independent of perceived risk, participants high in dispositional optimism reported greater interest in learning risks for nonmedically actionable disease and carrier status, and greater intentions to use genome information to change their lifestyle/health behaviors. The relationship between risk perceptions and intentions may depend on how risk perceptions are assessed and on degree of optimism. (c) 2015 APA, all rights reserved.
Taber, Jennifer M.; Klein, William M. P.; Ferrer, Rebecca A.; Lewis, Katie L.; Biesecker, Leslie G.; Biesecker, Barbara B.
2015-01-01
Objective Dispositional optimism and risk perceptions are each associated with health-related behaviors and decisions and other outcomes, but little research has examined how these constructs interact, particularly in consequential health contexts. The predictive validity of risk perceptions for health-related information seeking and intentions may be improved by examining dispositional optimism as a moderator, and by testing alternate types of risk perceptions, such as comparative and experiential risk. Method Participants (n = 496) had their genomes sequenced as part of a National Institutes of Health pilot cohort study (ClinSeq®). Participants completed a cross-sectional baseline survey of various types of risk perceptions and intentions to learn genome sequencing results for differing disease risks (e.g., medically actionable, nonmedically actionable, carrier status) and to use this information to change their lifestyle/health behaviors. Results Risk perceptions (absolute, comparative, and experiential) were largely unassociated with intentions to learn sequencing results. Dispositional optimism and comparative risk perceptions interacted, however, such that individuals higher in optimism reported greater intentions to learn all 3 types of sequencing results when comparative risk was perceived to be higher than when it was perceived to be lower. This interaction was inconsistent for experiential risk and absent for absolute risk. Independent of perceived risk, participants high in dispositional optimism reported greater interest in learning risks for nonmedically actionable disease and carrier status, and greater intentions to use genome information to change their lifestyle/health behaviors. Conclusions The relationship between risk perceptions and intentions may depend on how risk perceptions are assessed and on degree of optimism. PMID:25313897
Van Hemelrijck, Mieke; Garmo, Hans; Holmberg, Lars; Ingelsson, Erik; Bratt, Ola; Bill-Axelson, Anna; Lambe, Mats; Stattin, Pär; Adolfsson, Jan
2010-07-20
Cardiovascular disease (CVD) is a potential adverse effect of endocrine treatment (ET) for prostate cancer (PC). We investigated absolute and relative CVD risk in 76,600 patients with PC undergoing ET, curative treatment, or surveillance. PCBaSe Sweden is based on the National Prostate Cancer Register, which covers more than 96% of PC cases. Standardized incidence ratios (SIRs) and standardized mortality ratios (SMRs) of ischemic heart disease (IHD), acute myocardial infarction (MI), arrhythmia, heart failure, and stroke were calculated to compare observed and expected (using total Swedish population) numbers of CVD, taking into account age, calendar time, and previous CVD. Between 1997 and 2007, 30,642 patients with PC received primary ET, 26,432 curative treatment, and 19,527 surveillance. SIRs for CVD were elevated in all men with the highest for those undergoing ET, independent of circulatory disease history (SIR MI for men without circulatory disease history: 1.40 [95% CI, 1.31 to 1.49], 1.15 [95% CI, 1.01 to 1.31], and 1.20 [95% CI, 1.11 to 1.30] for men undergoing ET, curative treatment, and surveillance, respectively). Absolute risk differences (ARD) showed that two (arrhythmia) to eight (IHD) extra cases of CVD would occur per 1,000 person-years. SMRs showed similar patterns, with ARD of zero (arrhythmia) to three (IHD) per 1,000 person-years. Increased relative risks of nonfatal and fatal CVD were found among all men with PC, especially those treated with ET. Because ET is currently the only effective treatment for metastatic disease and the ARDs were rather small, our findings indicate that CVD risk should be considered when prescribing ET but should not constitute a contraindication when the expected gain is tangible.
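As a rough illustration of the summary measures used above, the sketch below computes a standardized incidence ratio (observed over expected events) and an excess-event rate per 1,000 person-years; the counts and person-years are hypothetical placeholders, not values from PCBaSe.

```python
# Sketch of the two summary measures in the abstract: a standardized incidence ratio
# (observed / expected events) and an absolute risk difference expressed per 1,000
# person-years. The counts and person-years below are illustrative placeholders.

def standardized_incidence_ratio(observed: int, expected: float) -> float:
    return observed / expected

def excess_per_1000_person_years(observed: int, expected: float, person_years: float) -> float:
    return (observed - expected) / person_years * 1000.0

if __name__ == "__main__":
    observed, expected, person_years = 1400, 1000.0, 50_000.0  # hypothetical inputs
    print(f"SIR = {standardized_incidence_ratio(observed, expected):.2f}")
    print(f"Extra events per 1,000 person-years = "
          f"{excess_per_1000_person_years(observed, expected, person_years):.1f}")
```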
Loguercio, A D; Servat, F; Stanislawczuk, R; Mena-Serrano, A; Rezende, M; Prieto, M V; Cereño, V; Rojas, M F; Ortega, K; Fernandez, E; Reis, A
2017-12-01
The study aimed to compare the tooth sensitivity (TS) and bleaching efficacy of two hydrogen peroxide gels with different pHs (acid pH [Pola Office, SDI] and neutral pH [Pola Office+, SDI]) used for in-office bleaching. Fifty-four patients from Brazil and Chile, with a right superior incisor darker than A2, were selected for this double-blind, split-mouth randomized trial. Teeth were bleached in two sessions, with a 1-week interval. Each session had three applications of 8 min each, according to the manufacturer's instructions. Color changes were evaluated by subjective (Vita Classical and Vita Bleachedguide) and objective (Easy shade spectrophotometer) methods. Participants recorded TS on a 0-10 visual analog scale. Color change in shade guide units (SGU) and ΔE was analyzed by Student's t test (α = 0.05). The absolute risk and intensity of TS were evaluated by McNemar's test and the Wilcoxon paired test, respectively (α = 0.05). All groups achieved the same level of whitening after 30 days of clinical evaluation. The use of a neutral in-office bleaching gel significantly decreased the absolute risk of TS (28%, 95% CI 18-41) and the intensity of TS when compared to the acid bleaching gel (absolute risk of 50%, 95% CI 37-63). The neutral in-office bleaching gel produced the same degree of whitening as the acid bleaching gel but with reduced risk and intensity of tooth sensitivity. Clinicians should opt for in-office bleaching with a neutral gel rather than an acid product, because the former causes a significantly lower risk and intensity of tooth sensitivity.
[Four numbers and a bit more basic knowledge of mathematics].
Günther, Judith; Briel, Matthias; Suter, Katja
2015-02-01
In addition to the relative risk, relative risk reduction and absolute risk reduction, another effect size for binary endpoints circulates in the scientific medical literature: the odds ratio. Relative risk and odds ratio are alternative ways of expressing study results. Both the relative risk (RR) and the odds ratio (OR) can easily be calculated from the "2 x 2 table". Advantage of the OR: odds ratios can be calculated in every type of controlled study design, including retrospective studies. Furthermore, odds ratios, about which biostatisticians are enthusiastic, offer convenient mathematical properties and are therefore often used in meta-analysis as an effect size for calculating a pooled estimate of the results of different studies addressing the same clinical question. Disadvantage of the OR: in clinical studies, presenting the results as odds ratios may result in an overestimation of the intervention effect. This article shows the difference between "chance" (odds) and "risk" and how the odds ratio and relative risk are related.
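A minimal sketch of the calculation the article describes: relative risk and odds ratio from the same 2 x 2 table, with hypothetical counts chosen so that the OR visibly exceeds the RR when the event is common.

```python
# Sketch: relative risk and odds ratio from a 2 x 2 table of counts.
# Cell labels: a = events in the exposed group, b = non-events in the exposed group,
# c = events in the control group, d = non-events in the control group.

def relative_risk(a: int, b: int, c: int, d: int) -> float:
    risk_exposed = a / (a + b)
    risk_control = c / (c + d)
    return risk_exposed / risk_control

def odds_ratio(a: int, b: int, c: int, d: int) -> float:
    return (a / b) / (c / d)

if __name__ == "__main__":
    a, b, c, d = 30, 70, 15, 85   # hypothetical trial counts
    print(f"RR = {relative_risk(a, b, c, d):.2f}")   # 0.30 / 0.15 = 2.00
    print(f"OR = {odds_ratio(a, b, c, d):.2f}")      # (30/70)/(15/85) ≈ 2.43, larger than the RR
```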
Miovský, Michal; Vonkova, Hana; Čablová, Lenka; Gabrhelík, Roman
2015-11-01
To study the effect of a universal prevention intervention targeting cannabis use in individual children with different risk profiles. A school-based randomized controlled prevention trial was conducted over a period of 33 months (n=1874 sixth-graders, baseline mean age 11.82). We used a two-level random intercept logistic model for panel data to predict the probabilities of cannabis use for each child. Specifically, we used eight risk/protective factors to characterize each child and then predicted two probabilities of cannabis use for each child if the child had the intervention or not. Using the two probabilities, we calculated the absolute and relative effect of the intervention for each child. According to the two probabilities, we also divided the sample into a low-risk group (the quarter of the children with the lowest probabilities), a moderate-risk group, and a high-risk group (the quarter of the children with the highest probabilities) and showed the average effect of the intervention on these groups. The differences between the intervention group and the control group were statistically significant in each risk group. The average predicted probabilities of cannabis use for a child from the low-risk group were 4.3% if the child had the intervention and 6.53% if no intervention was provided. The corresponding probabilities for a child from the moderate-risk group were 10.91% and 15.34% and for a child from the high-risk group 25.51% and 32.61%. School grades, thoughts of hurting oneself, and breaking the rules were the three most important factors distinguishing high-risk and low-risk children. We predicted the effect of the intervention on individual children, characterized by their risk/protective factors. The predicted absolute effect and relative effect of any intervention for any selected risk/protective profile of a given child may be utilized in both prevention practice and research. Copyright © 2015 Elsevier Ltd. All rights reserved.
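The final step described above, turning two per-child predicted probabilities into absolute and relative effects and risk groups, can be sketched as follows; the fitted two-level logistic model is not reproduced, and the probabilities are simulated placeholders.

```python
# Sketch: from per-child predicted probabilities of cannabis use with and without the
# intervention, derive the absolute effect (risk difference) and relative effect, then
# assign children to low/moderate/high risk groups by quartiles of the no-intervention
# probability. The probabilities below are simulated, not the study's model output.
import numpy as np

rng = np.random.default_rng(0)
p_no_intervention = rng.uniform(0.02, 0.40, size=1000)   # hypothetical predictions
p_intervention = p_no_intervention * 0.75                 # hypothetical treatment effect

absolute_effect = p_no_intervention - p_intervention       # risk difference per child
relative_effect = absolute_effect / p_no_intervention      # proportional reduction per child

q25, q75 = np.quantile(p_no_intervention, [0.25, 0.75])
risk_group = np.where(p_no_intervention <= q25, "low",
             np.where(p_no_intervention >= q75, "high", "moderate"))

for group in ("low", "moderate", "high"):
    mask = risk_group == group
    print(f"{group:>8}: mean p(no intervention) = {p_no_intervention[mask].mean():.3f}, "
          f"mean absolute effect = {absolute_effect[mask].mean():.3f}")
```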
Quantifying prognosis with risk predictions.
Pace, Nathan L; Eberhart, Leopold H J; Kranke, Peter R
2012-01-01
Prognosis is a forecast, based on present observations in a patient, of their probable outcome from disease, surgery and so on. Research methods for the development of risk probabilities may not be familiar to some anaesthesiologists. We briefly describe methods for identifying risk factors and risk scores. A probability prediction rule assigns a risk probability to a patient for the occurrence of a specific event. Probability reflects the continuum between absolute certainty (Pi = 1) and certified impossibility (Pi = 0). Biomarkers and clinical covariates that modify risk are known as risk factors. The Pi as modified by risk factors can be estimated by identifying the risk factors and their weighting; these are usually obtained by stepwise logistic regression. The accuracy of probabilistic predictors can be separated into the concepts of 'overall performance', 'discrimination' and 'calibration'. Overall performance is the mathematical distance between predictions and outcomes. Discrimination is the ability of the predictor to rank order observations with different outcomes. Calibration is the correctness of prediction probabilities on an absolute scale. Statistical methods include the Brier score, coefficient of determination (Nagelkerke R2), C-statistic and regression calibration. External validation is the comparison of the actual outcomes to the predicted outcomes in a new and independent patient sample. External validation uses the statistical methods of overall performance, discrimination and calibration and is uniformly recommended before acceptance of the prediction model. Evidence from randomised controlled clinical trials should be obtained to show the effectiveness of risk scores for altering patient management and patient outcomes.
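Two of the performance measures named here can be computed directly from predicted probabilities and observed outcomes, as in the hedged sketch below: the Brier score for overall performance and the C-statistic for discrimination, the latter implemented as the proportion of concordant event/non-event pairs with ties counted as half.

```python
# Sketch: Brier score (overall performance) and C-statistic (discrimination) from
# predicted probabilities p and observed binary outcomes y. Data are illustrative.
import numpy as np

def brier_score(p: np.ndarray, y: np.ndarray) -> float:
    return float(np.mean((p - y) ** 2))

def c_statistic(p: np.ndarray, y: np.ndarray) -> float:
    """Probability that a randomly chosen event has a higher prediction than a non-event."""
    p_events, p_nonevents = p[y == 1], p[y == 0]
    diffs = p_events[:, None] - p_nonevents[None, :]
    return float((np.sum(diffs > 0) + 0.5 * np.sum(diffs == 0)) / diffs.size)

if __name__ == "__main__":
    y = np.array([0, 0, 1, 0, 1, 1, 0, 1])
    p = np.array([0.1, 0.3, 0.7, 0.2, 0.6, 0.4, 0.5, 0.9])
    print(f"Brier score = {brier_score(p, y):.3f}")
    print(f"C-statistic = {c_statistic(p, y):.3f}")
```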
Using electronic patient records to inform strategic decision making in primary care.
Mitchell, Elizabeth; Sullivan, Frank; Watt, Graham; Grimshaw, Jeremy M; Donnan, Peter T
2004-01-01
Although absolute risk of death associated with raised blood pressure increases with age, the benefits of treatment are greater in elderly patients. Despite this, the 'rule of halves' particularly applies to this group. We conducted a randomised controlled trial to evaluate different levels of feedback designed to improve identification, treatment and control of elderly hypertensives. Fifty-two general practices were randomly allocated to either: Control (n=19), Audit only feedback (n=16) or Audit plus Strategic feedback, prioritising patients by absolute risk (n=17). Feedback was based on electronic data, annually extracted from practice computer systems. Data were collected for 265,572 patients, 30,345 aged 65-79. The proportion of known hypertensives in each group with BP recorded increased over the study period and the numbers of untreated and uncontrolled patients reduced. There was a significant difference in mean systolic pressure between the Audit plus Strategic and Audit only groups and significantly greater control in the Audit plus Strategic group. Providing patient-specific practice feedback can impact on identification and management of hypertension in the elderly and produce a significant increase in control.
O’Connor, David; Enshaei, Amir; Bartram, Jack; Hancock, Jeremy; Harrison, Christine J.; Hough, Rachael; Samarasinghe, Sujith; Schwab, Claire; Vora, Ajay; Wade, Rachel; Moppett, John; Moorman, Anthony V.; Goulden, Nick
2018-01-01
Purpose Minimal residual disease (MRD) and genetic abnormalities are important risk factors for outcome in acute lymphoblastic leukemia. Current risk algorithms dichotomize MRD data and do not assimilate genetics when assigning MRD risk, which reduces predictive accuracy. The aim of our study was to exploit the full power of MRD by examining it as a continuous variable and to integrate it with genetics. Patients and Methods We used a population-based cohort of 3,113 patients who were treated in UKALL2003, with a median follow-up of 7 years. MRD was evaluated by polymerase chain reaction analysis of Ig/TCR gene rearrangements, and patients were assigned to a genetic subtype on the basis of immunophenotype, cytogenetics, and fluorescence in situ hybridization. To examine response kinetics at the end of induction, we log-transformed the absolute MRD value and examined its distribution across subgroups. Results MRD was log normally distributed at the end of induction. MRD distributions of patients with distinct genetic subtypes were different (P < .001). Patients with good-risk cytogenetics demonstrated the fastest disease clearance, whereas patients with high-risk genetics and T-cell acute lymphoblastic leukemia responded more slowly. The risk of relapse was correlated with MRD kinetics, and each log reduction in disease level reduced the risk by 20% (hazard ratio, 0.80; 95% CI, 0.77 to 0.83; P < .001). Although the risk of relapse was directly proportional to the MRD level within each genetic risk group, absolute relapse rate that was associated with a specific MRD value or category varied significantly by genetic subtype. Integration of genetic subtype–specific MRD values allowed more refined risk group stratification. Conclusion A single threshold for assigning patients to an MRD risk group does not reflect the response kinetics of the different genetic subtypes. Future risk algorithms should integrate genetics with MRD to accurately identify patients with the lowest and highest risk of relapse. PMID:29131699
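Treating MRD as a continuous variable, the reported hazard ratio of 0.80 per log reduction implies a simple multiplicative relationship; the sketch below converts a given MRD reduction into a relative hazard, assuming the log-linear relationship described in the abstract holds over the range of interest.

```python
# Sketch: translating the per-log-reduction hazard ratio (0.80, from the abstract) into
# the relative hazard implied by a given reduction in end-of-induction MRD, under the
# log-linear assumption.
import math

HR_PER_LOG = 0.80  # hazard ratio per one log10 reduction in MRD (from the abstract)

def relative_hazard(mrd_reference: float, mrd_observed: float) -> float:
    """Hazard of relapse at mrd_observed relative to mrd_reference."""
    log_reduction = math.log10(mrd_reference / mrd_observed)
    return HR_PER_LOG ** log_reduction

if __name__ == "__main__":
    # A patient clearing disease from 1e-2 to 1e-4 (two logs lower than the reference):
    print(f"Relative hazard = {relative_hazard(1e-2, 1e-4):.2f}")  # 0.80**2 = 0.64
```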
Difference in health inequity between two population groups due to a social determinant of health.
Moonesinghe, Ramal; Bouye, Karen; Penman-Aguilar, Ana
2014-12-01
The World Health Organization defines social determinants of health as "complex, integrated, and overlapping social structures and economic systems" that are responsible for most health inequities. Similar to individual-level risk factors, such as behavioral and biological risk factors that influence disease, we consider social determinants of health such as the distribution of income, wealth, influence and power as risk factors for disease. We operationally define health inequity in a disease within a population due to a risk factor that is unfair and avoidable as the difference between the disease outcome with and without the risk factor in the population. We derive expressions for the difference in health inequity between two populations due to a risk factor that is unfair and avoidable for a given disease. The difference in health inequity between two population groups due to a risk factor increases with increasing differences in the relative risks and in the prevalence of the risk factor in the two populations. The difference in health inequity could be larger than the difference in health outcomes between the two populations in some situations. Compared to health disparities, which are typically measured and monitored using absolute or relative disparities of health outcomes, the methods presented in this manuscript provide a different, yet complementary, picture because they parse out the contributions of unfair and avoidable risk factors.
Difference in Health Inequity between Two Population Groups due to a Social Determinant of Health
Moonesinghe, Ramal; Bouye, Karen; Penman-Aguilar, Ana
2014-01-01
The World Health Organization defines social determinants of health as “complex, integrated, and overlapping social structures and economic systems” that are responsible for most health inequities. Similar to individual-level risk factors, such as behavioral and biological risk factors that influence disease, we consider social determinants of health such as the distribution of income, wealth, influence and power as risk factors for disease. We operationally define health inequity in a disease within a population due to a risk factor that is unfair and avoidable as the difference between the disease outcome with and without the risk factor in the population. We derive expressions for the difference in health inequity between two populations due to a risk factor that is unfair and avoidable for a given disease. The difference in health inequity between two population groups due to a risk factor increases with increasing differences in the relative risks and in the prevalence of the risk factor in the two populations. The difference in health inequity could be larger than the difference in health outcomes between the two populations in some situations. Compared to health disparities, which are typically measured and monitored using absolute or relative disparities of health outcomes, the methods presented in this manuscript provide a different, yet complementary, picture because they parse out the contributions of unfair and avoidable risk factors. PMID:25522048
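One plausible way to formalize the quantity described in these two records is sketched below: the inequity in each population is taken as the difference between the disease risk under the observed prevalence of the unfair, avoidable risk factor and the risk with that factor removed, and the between-population difference follows by subtraction. The paper's exact expressions are not reproduced; this is standard attributable-risk algebra with hypothetical inputs.

```python
# Hedged sketch of one formalization consistent with the abstract: health inequity as the
# excess disease risk attributable to an unfair, avoidable risk factor, and the difference
# in that quantity between two populations. Inputs are hypothetical.

def health_inequity(baseline_risk: float, prevalence: float, relative_risk: float) -> float:
    risk_with_factor = baseline_risk * (1 - prevalence) + baseline_risk * relative_risk * prevalence
    risk_without_factor = baseline_risk
    return risk_with_factor - risk_without_factor   # = baseline_risk * prevalence * (RR - 1)

def inequity_difference(pop_a: tuple, pop_b: tuple) -> float:
    """Difference in health inequity between two populations, each given as
    (baseline_risk, prevalence_of_risk_factor, relative_risk)."""
    return health_inequity(*pop_a) - health_inequity(*pop_b)

if __name__ == "__main__":
    pop_a = (0.05, 0.40, 2.0)   # hypothetical population A
    pop_b = (0.05, 0.20, 1.5)   # hypothetical population B
    print(f"Inequity difference = {inequity_difference(pop_a, pop_b):.4f}")  # 0.020 - 0.005 = 0.015
```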
Radical Prostatectomy versus Observation for Localized Prostate Cancer
Wilt, Timothy J.; Brawer, Michael K.; Jones, Karen M.; Barry, Michael J.; Aronson, William J.; Fox, Steven; Gingrich, Jeffrey R.; Wei, John T.; Gilhooly, Patricia; Grob, B. Mayer; Nsouli, Imad; Iyer, Padmini; Cartagena, Ruben; Snider, Glenn; Roehrborn, Claus; Sharifi, Roohollah; Blank, William; Pandya, Parikshit; Andriole, Gerald L.; Culkin, Daniel; Wheeler, Thomas
2012-01-01
BACKGROUND The effectiveness of surgery versus observation for men with localized prostate cancer detected by means of prostate-specific antigen (PSA) testing is not known. METHODS From November 1994 through January 2002, we randomly assigned 731 men with localized prostate cancer (mean age, 67 years; median PSA value, 7.8 ng per milliliter) to radical prostatectomy or observation and followed them through January 2010. The primary outcome was all-cause mortality; the secondary outcome was prostate-cancer mortality. RESULTS During the median follow-up of 10.0 years, 171 of 364 men (47.0%) assigned to radical prostatectomy died, as compared with 183 of 367 (49.9%) assigned to observation (hazard ratio, 0.88; 95% confidence interval [CI], 0.71 to 1.08; P = 0.22; absolute risk reduction, 2.9 percentage points). Among men assigned to radical prostatectomy, 21 (5.8%) died from prostate cancer or treatment, as compared with 31 men (8.4%) assigned to observation (hazard ratio, 0.63; 95% CI, 0.36 to 1.09; P = 0.09; absolute risk reduction, 2.6 percentage points). The effect of treatment on all-cause and prostate-cancer mortality did not differ according to age, race, coexisting conditions, self-reported performance status, or histologic features of the tumor. Radical prostatectomy was associated with reduced all-cause mortality among men with a PSA value greater than 10 ng per milliliter (P = 0.04 for interaction) and possibly among those with intermediate-risk or high-risk tumors (P = 0.07 for interaction). Adverse events within 30 days after surgery occurred in 21.4% of men, including one death. CONCLUSIONS Among men with localized prostate cancer detected during the early era of PSA testing, radical prostatectomy did not significantly reduce all-cause or prostate-cancer mortality, as compared with observation, through at least 12 years of follow-up. Absolute differences were less than 3 percentage points. (Funded by the Department of Veterans Affairs Cooperative Studies Program and others; PIVOT ClinicalTrials.gov number, NCT00007644.) PMID:22808955
Code of Federal Regulations, 2012 CFR
2012-01-01
... covered debt instrument that is subject to a non-zero specific risk capital charge. (A) For covered debt... indices. (iii) An organization must multiply the absolute value of the current market value of each net... multiply the absolute value of the current market value of each net long or short covered equity position...
Patient stratification for preventive care in dentistry.
Giannobile, W V; Braun, T M; Caplis, A K; Doucette-Stamm, L; Duff, G W; Kornman, K S
2013-08-01
Prevention reduces tooth loss, but little evidence supports biannual preventive care for all adults. We used risk-based approaches to test tooth loss association with 1 vs. 2 annual preventive visits in high-risk (HiR) and low-risk (LoR) patients. Insurance claims for 16 years for 5,117 adults were evaluated retrospectively for tooth extraction events. Patients were classified as HiR for progressive periodontitis if they had ≥ 1 of the risk factors (RFs) smoking, diabetes, interleukin-1 genotype; or as LoR if no RFs. LoR event rates were 13.8% and 16.4% for 2 or 1 annual preventive visits (absolute risk reduction, 2.6%; 95%CI, 0.5% to 5.8%; p = .092). HiR event rates were 16.9% and 22.1% for 2 and 1 preventive visits (absolute risk reduction, 5.2%; 95%CI, 1.8% to 8.4%; p = .002). Increasing RFs increased events (p < .001). Oral health care costs were not increased by any single RF, regardless of prevention frequency (p > .41), but multiple RFs increased costs vs. no (p < .001) or 1 RF (p = .001). For LoR individuals, the association between preventive dental visits and tooth loss was not significantly different whether the frequency was once or twice annually. A personalized medicine approach combining gene biomarkers with conventional risk factors to stratify populations may be useful in resource allocation for preventive dentistry (ClinicalTrials.gov, NCT01584479).
Turnbull, Fiona; Arima, Hisatomi; Heeley, Emma; Cass, Alan; Chalmers, John; Morgan, Claire; Patel, Anushka; Peiris, David; Weekes, Andrew; Anderson, Craig
2011-06-01
Studies indicate ongoing gender-based differences in the prevention, detection and management of cardiovascular disease. The aims of this study were to determine whether there are differences in general practitioners' (GPs') perceptions of a patient's cardiovascular risk compared with the patient's estimated risk and in the patient's subsequent medical management according to patient sex. The Australian Hypertension and Absolute Risk Study (AusHEART) was a nationally representative, cluster-stratified, cross-sectional survey among 322 GPs. Each GP was asked to collect data on cardiovascular disease risk factors and their management in 15-20 consecutive patients (age ≥55 years) who presented between April and June, 2008. They were also asked to estimate each patient's absolute risk of a cardiovascular event in the next five years. The main outcomes were the Adjusted Framingham risk, GP estimated risk and proportion of patients receiving blood pressure-lowering, statin and antiplatelet therapy. A total of 5293 patients were recruited to the study, of whom 2968 (56%) were women. Among patients without established cardiovascular disease, the level of agreement between the GP estimated risk and the Adjusted Framingham risk was poor (<50%) and was similarly so for men (kappa coefficient 0.18; 95% confidence interval (CI) 0.14-0.21) and women (0.19; 95% CI 0.16-0.22; P homogeneity = 0.57). For patients with established cardiovascular disease, however, women were more likely to be assigned by the GP to a lower risk category (66% vs. 54%, P < 0.001) and less likely to be prescribed combination (blood pressure-lowering, statin and antiplatelet) (44% vs. 56%, P < 0.001) therapy compared with men, even after adjusting for patient age. Cardiovascular risk is underrecognized and undertreated in Australian primary care patients, with women apparently disproportionately affected. These findings underscore the importance of initiatives to raise awareness of cardiovascular disease in women.
Gamp, Martina; Renner, Britta
2016-11-01
Personalised health-risk assessment is one of the most common components of health promotion programs. Previous research on responses to health risk feedback has commonly focused on the reception of bad news (high-risk feedback). The reception of low-risk feedback has been comparably neglected since it is assumed that good news is reassuring and readily received. However, field studies suggest mixed responses to low-risk health feedback. Accordingly, we examine whether pre-feedback risk expectancies can mitigate the reassuring effects of good news. In two studies (N = 187, N = 565), after assessing pre-feedback risk expectancies, participants received low-risk personalised feedback about their own risk of developing (the fictitious) Tucson Chronic Fatigue Syndrome (TCFS). Study 2 also included peer TCFS risk status feedback. Afterwards, self- and peer-related risk perception for TCFS was assessed. In both studies, participants who expected to be at high risk but received good news (unexpected low-risk feedback) showed absolute lack of reassurance. Specifically, they felt at significantly greater TCFS risk than participants who received expected good news. Moreover, the unexpected low-risk group even believed that their risk was as high as (Study 1) or higher (Study 2) than that of their peers (comparative lack of reassurance). Results support the notion that high pre-feedback risk expectancies can mitigate absolute and comparative reassuring effects of good news. © 2016 The International Association of Applied Psychology.
The trading time risks of stock investment in stock price drop
NASA Astrophysics Data System (ADS)
Li, Jiang-Cheng; Tang, Nian-Sheng; Mei, Dong-Cheng; Li, Yun-Xian; Zhang, Wan
2016-11-01
This article investigates the trading time risk (TTR) of stock investment during stock price drops in the Dow Jones Industrial Average (^DJI) and Hushen300 (CSI300) data, respectively. The escape time of the stock price from the maximum to the minimum within a data window length (DWL) is employed to measure the absolute TTR, and the ratio of the escape time to the data window length is defined as the relative TTR. Empirical probability density functions (PDFs) of the absolute and relative TTRs for the ^DJI and CSI300 data show that (i) as the DWL increases, the absolute TTR increases while the relative TTR decreases; (ii) the stability of the absolute TTR varies monotonically, whereas that of the relative TTR does not; (iii) the PDF of the ratio shows a single peak for shorter trading periods and two peaks for longer trading periods; and (iv) the number of trading days plays opposite roles for the absolute (or relative) TTR and its stability in the ^DJI versus the CSI300 data.
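A minimal sketch of the two quantities defined above, under the assumption that the "escape time" is the number of trading days from the window's maximum price to the subsequent minimum (the abstract does not spell out the exact convention); the price path is simulated.

```python
# Sketch: absolute and relative trading time risk for one data window. The escape time is
# taken as the number of trading days from the window's maximum price to the minimum that
# follows it, and the relative TTR divides that by the window length. Prices are simulated.
import numpy as np

def trading_time_risk(prices: np.ndarray) -> tuple[int, float]:
    """Return (absolute TTR, relative TTR) for one window of closing prices."""
    i_max = int(np.argmax(prices))
    i_min = i_max + int(np.argmin(prices[i_max:]))   # minimum occurring after the maximum
    escape_time = i_min - i_max
    return escape_time, escape_time / len(prices)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    window = 100 + np.cumsum(rng.normal(0, 1, size=250))   # hypothetical price path
    abs_ttr, rel_ttr = trading_time_risk(window)
    print(f"absolute TTR = {abs_ttr} days, relative TTR = {rel_ttr:.2f}")
```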
Tint, Mya Thway; Fortier, Marielle V; Godfrey, Keith M; Shuter, Borys; Kapur, Jeevesh; Rajadurai, Victor S; Agarwal, Pratibha; Chinnadurai, Amutha; Niduvaje, Krishnamoorthy; Chan, Yiong-Huak; Aris, Izzuddin Bin Mohd; Soh, Shu-E; Yap, Fabian; Saw, Seang-Mei; Kramer, Michael S; Gluckman, Peter D; Chong, Yap-Seng; Lee, Yung-Seng
2016-05-01
A susceptibility to metabolic diseases is associated with abdominal adipose tissue distribution and varies between ethnic groups. The distribution of abdominal adipose tissue at birth may give insights into whether ethnicity-associated variations in metabolic risk originate partly in utero. We assessed the influence of ethnicity on abdominal adipose tissue compartments in Asian neonates in the Growing Up in Singapore Toward Healthy Outcomes mother-offspring cohort. MRI was performed at ≤2 wk after birth in 333 neonates born at ≥34 wk of gestation and with birth weights ≥2000 g. Abdominal superficial subcutaneous tissue (sSAT), deep subcutaneous tissue (dSAT), and internal adipose tissue (IAT) compartment volumes (absolute and as a percentage of the total abdominal volume) were quantified. In multivariate analyses that were controlled for sex, age, and parity, the absolute and percentage of dSAT and the percentage of sSAT (but not absolute sSAT) were greater, whereas absolute IAT (but not the percentage of IAT) was lower, in Indian neonates than in Chinese neonates. Compared with Chinese neonates, Malay neonates had greater percentages of sSAT and dSAT but similar percentages of IAT. Marginal structural model analyses largely confirmed the results on the basis of volume percentages with controlled direct effects of ethnicity on abdominal adipose tissue; dSAT was significantly greater (1.45 mL; 95% CI: 0.49, 2.41 mL, P = 0.003) in non-Chinese (Indian or Malay) neonates than in Chinese neonates. However, ethnic differences in sSAT and IAT were NS [3.06 mL (95% CI:-0.27, 6.39 mL; P = 0.0712) for sSAT and -1.30 mL (95% CI: -2.64, 0.04 mL; P = 0.057) for IAT in non-Chinese compared with Chinese neonates, respectively]. Indian and Malay neonates have a greater dSAT volume than do Chinese neonates. This finding supports the notion that in utero influences may contribute to higher cardiometabolic risk observed in Indian and Malay persons in our population. If such differences persist in the longitudinal tracking of adipose tissue growth, these differences may contribute to the ethnic disparities in risks of cardiometabolic diseases. This trial was registered at clinicaltrials.gov as NCT01174875. © 2016 American Society for Nutrition.
Thomopoulos, Costas; Skalis, George; Michalopoulou, Helena; Tsioufis, Costas; Makris, Thomas
2015-12-01
This analysis investigated the extent of different outcome reductions from low-density lipoprotein cholesterol (LDL-C) lowering following ezetimibe/simvastatin treatment and the proportionality of outcome to LDL-C reductions. The authors searched PubMed between 1997 and mid-June 2015 (any language) and the Cochrane Library to identify all randomized controlled trials comparing ezetimibe/simvastatin with placebo or less intensive LDL-C lowering. Risk ratios (RR) and 95% confidence intervals (CIs), standardized to 20 mg/dL LDL-C reduction, were calculated for 5 primary outcomes (fatal and nonfatal) and 4 secondary outcomes (non-cardiovascular [CV] death, cancer, myopathy, and hepatopathy). Five ezetimibe/simvastatin RCTs (30 051 individuals) were eligible, 2 comparing ezetimibe/simvastatin vs placebo and 3 vs less intensive treatment. Outcomes reduced almost to the same extent were stroke (RR: -13%, 95% CI: -21% to -3%), coronary heart disease (CHD; RR: -12%, 95% CI: -19% to -5%), and composite of stroke and CHD (RR: -14%, 95% CI: -20% to -8%). Absolute risk reductions: 5 strokes, 10 CHD events, and 16 stroke and CHD events prevented for every 1000 patients treated for 5 years. Residual risk was almost 7× higher than absolute risk reduction for all the above outcomes. All death outcomes were not reduced, and secondary outcomes did not differ between groups. Logarithmic risk ratios were not associated with LDL-C lowering. Our meta-analysis provides evidence that, in patients with different CV disease burden, major CV events are safely reduced by LDL-C lowering with ezetimibe/simvastatin, while raising the hypothesis that the extent of LDL-C lowering might not be accompanied by incremental clinical-event reduction. © 2015 Wiley Periodicals, Inc.
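Standardizing a trial's risk ratio to a fixed 20 mg/dL LDL-C reduction is conventionally done by assuming a log-linear relation between LDL-C lowering and the log risk ratio; the sketch below follows that convention, which may differ in detail from the authors' procedure, and uses hypothetical trial numbers.

```python
# Hedged sketch: rescaling a trial risk ratio to a standard 20 mg/dL LDL-C reduction under
# a log-linear assumption (log RR proportional to LDL-C lowering). Inputs are hypothetical.
import math

def standardize_rr(rr: float, ldl_reduction_mgdl: float, target_mgdl: float = 20.0) -> float:
    return math.exp(math.log(rr) * target_mgdl / ldl_reduction_mgdl)

if __name__ == "__main__":
    # A hypothetical trial with RR 0.78 achieved with a 35 mg/dL LDL-C reduction:
    print(f"RR per 20 mg/dL = {standardize_rr(0.78, 35.0):.2f}")  # ~0.87
```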
Nielsen, Rasmus Østergaard; Malisoux, Laurent; Møller, Merete; Theisen, Daniel; Parner, Erik Thorlund
2016-04-01
The etiological mechanism underpinning any sports-related injury is complex and multifactorial. Frequently, athletes perceive "excessive training" as the principal factor in their injury, an observation that is biologically plausible yet somewhat ambiguous. If the applied training load is suddenly increased, this may increase the risk for sports injury development, irrespective of the absolute amount of training. Indeed, little to no rigorous scientific evidence exists to support the hypothesis that fluctuations in training load, compared to absolute training load, are more important in explaining sports injury development. One reason for this could be that prospective data from scientific studies should be analyzed in a different manner. Time-to-event analysis is a useful statistical tool in which to analyze the influence of changing exposures on injury risk. However, the potential of time-to-event analysis remains insufficiently exploited in sports injury research. Therefore, the purpose of the present article was to present and discuss measures of association used in time-to-event analyses and to present the advanced concept of time-varying exposures and outcomes. In the paper, different measures of association, such as cumulative relative risk, cumulative risk difference, and the classical hazard rate ratio, are presented in a nontechnical manner, and suggestions for interpretation of study results are provided. To summarize, time-to-event analysis complements the statistical arsenal of sports injury prevention researchers, because it enables them to analyze the complex and highly dynamic reality of injury etiology, injury recurrence, and time to recovery across a range of sporting contexts.
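The cumulative measures of association discussed here can be obtained from Kaplan-Meier estimates in two exposure groups at a chosen follow-up time; the sketch below includes a minimal Kaplan-Meier estimator (no competing risks, no time-varying exposures) so it is self-contained, and the data are simulated.

```python
# Sketch: cumulative relative risk and cumulative risk difference at a fixed follow-up
# time, computed from a minimal Kaplan-Meier estimator applied to two exposure groups.
# Data below are simulated placeholders.
import numpy as np

def km_cumulative_incidence(time: np.ndarray, event: np.ndarray, at_time: float) -> float:
    """1 - S(t) from a simple Kaplan-Meier estimator (no competing risks)."""
    surv = 1.0
    for t in np.sort(np.unique(time[event == 1])):
        if t > at_time:
            break
        at_risk = np.sum(time >= t)
        deaths = np.sum((time == t) & (event == 1))
        surv *= 1.0 - deaths / at_risk
    return 1.0 - surv

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    t_a, e_a = rng.exponential(40, 200), rng.integers(0, 2, 200)   # hypothetical group A
    t_b, e_b = rng.exponential(60, 200), rng.integers(0, 2, 200)   # hypothetical group B
    risk_a = km_cumulative_incidence(t_a, e_a, at_time=24.0)
    risk_b = km_cumulative_incidence(t_b, e_b, at_time=24.0)
    print(f"cumulative relative risk = {risk_a / risk_b:.2f}")
    print(f"cumulative risk difference = {risk_a - risk_b:.3f}")
```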
Beddhu, Srinivasan; Greene, Tom; Boucher, Robert; Cushman, William C; Wei, Guo; Stoddard, Gregory; Ix, Joachim H; Chonchol, Michel; Kramer, Holly; Cheung, Alfred K; Kimmel, Paul L; Whelton, Paul K; Chertow, Glenn M
2018-07-01
Guidelines, including the 2017 American College of Cardiology and American Heart Association blood pressure guideline, recommend tighter control of systolic blood pressure in people with type 2 diabetes. However, it is unclear whether intensive lowering of systolic blood pressure increases the incidence of chronic kidney disease in this population. We aimed to compare the effects of intensive systolic blood pressure control on incident chronic kidney disease in people with and without type 2 diabetes. The Systolic Blood Pressure Intervention Trial (SPRINT) tested the effects of a systolic blood pressure goal of less than 120 mm Hg (intensive intervention) versus a goal of less than 140 mm Hg (standard intervention) in people without diabetes. The Action to Control Cardiovascular Risk in Diabetes (ACCORD) blood pressure trial tested a similar systolic blood pressure intervention in people with type 2 diabetes. Our study is a secondary analysis of limited access datasets from SPRINT and the ACCORD trial obtained from the National Institutes of Health. In participants without chronic kidney disease at baseline (n=4311 in the ACCORD trial; n=6715 in SPRINT), we related systolic blood pressure interventions (intensive vs standard) to incident chronic kidney disease (defined as >30% decrease in estimated glomerular filtration rate [eGFR] to <60 mL/min per 1·73 m 2 ). These trials are registered with ClinicalTrials.gov, numbers NCT01206062 (SPRINT) and NCT00000620 (ACCORD trial). The average difference in systolic blood pressure between intensive and standard interventions was 13·9 mm Hg (95% CI 13·4-14·4) in the ACCORD trial and 15·2 mm Hg (14·8-15·6) in SPRINT. At 3 years, the cumulative incidence of chronic kidney disease in the ACCORD trial was 10·0% (95% CI 8·8-11·4) with the intensive intervention and 4·1% (3·3-5·1) with the standard intervention (absolute risk difference 5·9%, 95% CI 4·3-7·5). Corresponding values in SPRINT were 3·5% (95% CI 2·9-4·2) and 1·0% (0·7-1·4; absolute risk difference 2·5%, 95% CI 1·8-3·2). The absolute risk difference was significantly higher in the ACCORD trial than in SPRINT (p=0·0001 for interaction). Intensive lowering of systolic blood pressure increased the risk of incident chronic kidney disease in people with and without type 2 diabetes. However, the absolute risk of incident chronic kidney disease was higher in people with type 2 diabetes. Our findings suggest the need for vigilance in monitoring kidney function during intensive antihypertensive drug treatment, particularly in adults with diabetes. Long-term studies are needed to understand the clinical implications of antihypertensive treatment-related reductions in eGFR. National Institutes of Health. Copyright © 2018 Elsevier Ltd. All rights reserved.
Wilson, Nick; Selak, Vanessa; Blakely, Tony; Leung, William; Clarke, Philip; Jackson, Rod; Knight, Josh; Nghiem, Nhung
2016-03-11
Based on new systematic reviews of the evidence, the US Preventive Services Task Force has drafted updated guidelines on the use of low-dose aspirin for the primary prevention of both cardiovascular disease (CVD) and cancer. The Task Force generally recommends consideration of aspirin in adults aged 50-69 years with a 10-year CVD risk of at least 10%, in whom the absolute health gain (reduction of CVD and cancer) is estimated to exceed the absolute health loss (increase in bleeds). With the ongoing decline in CVD, current risk calculators for New Zealand are probably outdated, so it is difficult to be precise about what proportion of the population is in this risk category (roughly equivalent to a 5-year CVD risk ≥5%). Nevertheless, we suspect that most smokers aged 50-69 years, and some non-smokers, would probably meet the new threshold for taking low-dose aspirin. The country therefore needs updated guidelines and risk calculators that are ideally informed by estimates of absolute net health gain (in quality-adjusted life-years (QALYs) per person) and cost-effectiveness. Other improvements to risk calculators include: epidemiological rigour (eg, by addressing competing mortality); providing enhanced graphical display of risk to enhance risk communication; and possibly capturing the issues of medication disutility and comparison with lifestyle changes.
Baker, Simon; Priest, Patricia; Jackson, Rod
2000-01-01
Objective To estimate the impact of using thresholds based on absolute risk of cardiovascular disease to target drug treatment to lower blood pressure in the community. Design Modelling of three thresholds of treatment for hypertension based on the absolute risk of cardiovascular disease. 5 year risk of disease was estimated for each participant using an equation to predict risk. Net predicted impact of the thresholds on the number of people treated and the number of disease events averted over 5 years was calculated assuming a relative treatment benefit of one quarter. Setting Auckland, New Zealand. Participants 2158 men and women aged 35-79 years randomly sampled from the general electoral rolls. Main outcome measures Predicted 5 year risk of cardiovascular disease event, estimated number of people for whom treatment would be recommended, and disease events averted over 5 years at different treatment thresholds. Results 46 374 (12%) Auckland residents aged 35-79 receive drug treatment to lower their blood pressure, averting an estimated 1689 disease events over 5 years. Restricting treatment to individuals with blood pressure ⩾170/100 mm Hg and those with blood pressure between 150/90-169/99 mm Hg who have a predicted 5 year risk of disease ⩾10% would increase the net number for whom treatment would be recommended by 19 401. This 42% relative increase is predicted to avert 1139/1689 (68%) additional disease events overall over 5 years compared with current treatment. If the threshold for 5 year risk of disease is set at 15% the number recommended for treatment increases by <10% but about 620/1689 (37%) additional events can be averted. A 20% threshold decreases the net number of patients recommended for treatment by about 10% but averts 204/1689 (12%) more disease events than current treatment. Conclusions Implementing treatment guidelines that use treatment thresholds based on absolute risk could significantly improve the efficiency of drug treatment to lower blood pressure in primary care. PMID:10710577
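A hedged sketch of the modelling logic: apply a treatment threshold to each person's blood pressure and predicted 5-year risk, then estimate events averted over 5 years assuming the one-quarter relative treatment benefit used in the paper. The selection rule below uses systolic pressure only and simulated data, so it is a simplification of the study's thresholds, not a reproduction of them.

```python
# Sketch of threshold-based treatment modelling: select who would be treated, then sum
# predicted 5-year risks times an assumed 25% relative benefit to estimate events averted.
# Risks, blood pressures, and the selection rule are simplified placeholders.
import numpy as np

RELATIVE_BENEFIT = 0.25   # assumed relative reduction in 5-year CVD risk with treatment

def events_averted(five_year_risk: np.ndarray, treat: np.ndarray) -> float:
    """Expected number of events averted over 5 years among those treated."""
    return float(np.sum(five_year_risk[treat] * RELATIVE_BENEFIT))

if __name__ == "__main__":
    rng = np.random.default_rng(3)
    risk = rng.beta(2, 18, size=10_000)            # hypothetical predicted 5-year risks
    sbp = rng.normal(150, 20, size=10_000)         # hypothetical systolic blood pressures
    # Example threshold: treat everyone with SBP >= 170 mm Hg, plus those with
    # SBP 150-169 mm Hg whose predicted 5-year risk is at least 10%.
    treat = (sbp >= 170) | ((sbp >= 150) & (risk >= 0.10))
    print(f"treated: {treat.sum()}, events averted over 5 years: {events_averted(risk, treat):.0f}")
```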
Hayward, Irene; Malcoe, Lorraine Halinka; Cleathero, Lesley A; Janssen, Patricia A; Lanphear, Bruce P; Hayes, Michael V; Mattman, Andre; Pampalon, Robert; Venners, Scott A
2012-06-13
The major aim of this study was to investigate whether maternal risk factors associated with socioeconomic status and small for gestational age (SGA) might be viable targets of interventions to reduce differential risk of SGA by socioeconomic status (socioeconomic SGA inequality) in the metropolitan area of Vancouver, Canada. This study included 59,039 live, singleton births in the Vancouver Census Metropolitan Area (Vancouver) from January 1, 2006 to September 17, 2009. To identify an indicator of socioeconomic SGA inequality, we used hierarchical logistic regression to model SGA by area-level variables from the Canadian census. We then modelled SGA by area-level average income plus established maternal risk factors for SGA and calculated population attributable SGA risk percentages (PAR%) for each variable. Associations of maternal risk factors for SGA with average income were investigated to identify those that might contribute to SGA inequality. Finally, we estimated crude reductions in the percentage and absolute differences in SGA risks between highest and lowest average income quintiles that would result if interventions on maternal risk factors successfully equalized them across income levels or eliminated them altogether. Average income produced the most linear and statistically significant indicator of socioeconomic SGA inequality with 8.9% prevalence of SGA in the lowest income quintile compared to 5.6% in the highest. The adjusted PAR% of SGA for variables were: bottom four quintiles of height (51%), first birth (32%), bottom four quintiles of average income (14%), oligohydramnios (7%), underweight or hypertension, (6% each), smoking (3%) and placental disorder (1%). Shorter height, underweight and smoking during pregnancy had higher prevalence in lower income groups. Crude models assuming equalization of risk factors across income levels or elimination altogether indicated little potential change in relative socioeconomic SGA inequality and reduction in absolute SGA inequality for shorter height only. Our findings regarding maternal height may indicate trans-generational aetiology for socioeconomic SGA inequalities and/or that adult height influences social mobility. Conditions affecting foetal and childhood growth might be viable targets to reduce absolute socioeconomic SGA inequality in future generations, but more research is needed to determine whether such an approach is appropriate.
Petrella, Robert J; Tremblay, Guy; De Backer, Guy; Gill, Dawn P
2014-01-01
Purpose/introduction The Canadian Hypertension Education Program (CHEP) has identified blood pressure (BP) control as a key target for an overall reduction in cardiovascular disease risk. The POWER survey (Physicians’ Observational Work on Patient Education According to their Vascular Risk) used Framingham methodology to investigate the impact of an angiotensin-receptor-blocker-based regimen on arterial BP and total coronary heart disease (CHD) risk in a subset of patients recruited in Canada. Methods 309 Canadian practices screened for patients with either newly diagnosed or uncontrolled mild/moderate hypertension (sitting systolic blood pressure [SBP] >140 mmHg with diastolic blood pressure [DBP] <110 mmHg). Treatment comprised eprosartan 600 mg/day with add-on antihypertensive therapy after 1 month if required. The primary efficacy variable was change in SBP at 6 months; the secondary variable was the absolute change in the Framingham 10-year CHD risk score. Results 1,385 patients were identified, of whom 1,114 were included in the intention-to-treat (ITT) cohort. Thirty-eight point four percent of ITT patients were managed with monotherapy at 6 months, versus 35.2% and 13.7% with two-drug or multiple-drug therapy, respectively. SBP in the ITT cohort declined 22.4 (standard deviation [SD] 14.8) mmHg and DBP declined 10.5 (SD 10.3) mmHg during that time. The absolute mean Framingham score declined 2.1 (SD 3.1) points with significant age and sex variation (P<0.001) and differences between the various Framingham methods used. Discussion/conclusion Primary care physicians were able to use a strategy of BP lowering and CHD risk assessment to achieve significant reductions in BP and Framingham-assessed CHD risk. The effect size estimate of the different Framingham methods varied noticeably; reasons for those differences warrant further investigation. PMID:24493928
Holme, Øyvind; Løberg, Magnus; Kalager, Mette; Bretthauer, Michael; Hernán, Miguel A; Aas, Eline; Eide, Tor J; Skovlund, Eva; Lekven, Jon; Schneede, Jörn; Tveit, Kjell Magne; Vatn, Morten; Ursin, Giske; Hoff, Geir
2018-06-05
The long-term effects of sigmoidoscopy screening on colorectal cancer (CRC) incidence and mortality in women and men are unclear. To determine the effectiveness of flexible sigmoidoscopy screening after 15 years of follow-up in women and men. Randomized controlled trial. (ClinicalTrials.gov: NCT00119912). Oslo and Telemark County, Norway. Adults aged 50 to 64 years at baseline without prior CRC. Screening (between 1999 and 2001) with flexible sigmoidoscopy with and without additional fecal blood testing versus no screening. Participants with positive screening results were offered colonoscopy. Age-adjusted CRC incidence and mortality stratified by sex. Of 98 678 persons, 20 552 were randomly assigned to screening and 78 126 to no screening. Adherence rates were 64.7% in women and 61.4% in men. Median follow-up was 14.8 years. The absolute risks for CRC in women were 1.86% in the screening group and 2.05% in the control group (risk difference, -0.19 percentage point [95% CI, -0.49 to 0.11 percentage point]; HR, 0.92 [CI, 0.79 to 1.07]). In men, the corresponding risks were 1.72% and 2.50%, respectively (risk difference, -0.78 percentage point [CI, -1.08 to -0.48 percentage points]; hazard ratio [HR], 0.66 [CI, 0.57 to 0.78]) (P for heterogeneity = 0.004). The absolute risks for death from CRC in women were 0.60% in the screening group and 0.59% in the control group (risk difference, 0.01 percentage point [CI, -0.16 to 0.18 percentage point]; HR, 1.01 [CI, 0.77 to 1.33]). The corresponding risks for death from CRC in men were 0.49% and 0.81%, respectively (risk difference, -0.33 percentage point [CI, -0.49 to -0.16 percentage point]; HR, 0.63 [CI, 0.47 to 0.83]) (P for heterogeneity = 0.014). Follow-up through national registries. Offering sigmoidoscopy screening in Norway reduced CRC incidence and mortality in men but had little or no effect in women. Norwegian government and Norwegian Cancer Society.
Zhu, Liling; Su, Fengxi; Jia, Weijuan; Deng, Xiaogeng
2014-01-01
Background Predictive models for febrile neutropenia (FN) would be informative for physicians in clinical decision making. This study aims to validate a predictive model (Jenkin's model) that comprises pretreatment hematological parameters in early-stage breast cancer patients. Patients and Methods A total of 428 breast cancer patients who received neoadjuvant/adjuvant chemotherapy without any prophylactic use of colony-stimulating factor were included. Pretreatment absolute neutrophil counts (ANC) and absolute lymphocyte counts (ALC) were used by Jenkin's model to assess the risk of FN. In addition, we modified the thresholds of Jenkin's model and generated Model-A and Model-B. We also developed Model-C by incorporating the absolute monocyte count (AMC) as a predictor into Model-A. The rates of FN in the 1st chemotherapy cycle were calculated. A valid model should be able to identify a high-risk subgroup of patients with an FN rate significantly greater than 20%. Results Jenkin's model (predicted as high risk when ANC≦3.1*10^9/L; ALC≦1.5*10^9/L) did not identify any subgroups with significantly high risk (>20%) of FN in our population, even when we used different thresholds in Model-A (ANC≦4.4*10^9/L; ALC≦2.1*10^9/L) or Model-B (ANC≦3.8*10^9/L; ALC≦1.8*10^9/L). However, with AMC added as an additional predictor, Model-C (ANC≦4.4*10^9/L; ALC≦2.1*10^9/L; AMC≦0.28*10^9/L) identified a subgroup of patients with a significantly high risk of FN (23.1%). Conclusions In our population, Jenkin's model cannot accurately identify patients with a significant risk of FN. The thresholds should be changed and the AMC should be incorporated as a predictor to achieve excellent predictive ability. PMID:24945817
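The Model-C rule can be expressed as a simple threshold check; the abstract does not state how the three cut-offs are combined, so the sketch below assumes a patient is flagged high risk only when all three pretreatment counts fall at or below their cut-offs.

```python
# Sketch of the threshold classification described for Model-C: flag a patient as high risk
# for febrile neutropenia when ANC, ALC and AMC are all at or below the stated cut-offs.
# The all-three combination rule is an assumption; units are 10^9 cells/L as in the abstract.

def high_risk_model_c(anc: float, alc: float, amc: float) -> bool:
    """Model-C rule (assumed): ANC <= 4.4, ALC <= 2.1 and AMC <= 0.28 (all in 10^9/L)."""
    return anc <= 4.4 and alc <= 2.1 and amc <= 0.28

if __name__ == "__main__":
    # A hypothetical patient with ANC 4.0, ALC 1.9 and AMC 0.25 (all 10^9/L):
    print(high_risk_model_c(4.0, 1.9, 0.25))   # True -> flagged as high risk
```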
Amin Al Olama, Ali; Benlloch, Sara; Antoniou, Antonis C; Giles, Graham G; Severi, Gianluca; Neal, David E; Hamdy, Freddie C; Donovan, Jenny L; Muir, Kenneth; Schleutker, Johanna; Henderson, Brian E; Haiman, Christopher A; Schumacher, Fredrick R; Pashayan, Nora; Pharoah, Paul D P; Ostrander, Elaine A; Stanford, Janet L; Batra, Jyotsna; Clements, Judith A; Chambers, Suzanne K; Weischer, Maren; Nordestgaard, Børge G; Ingles, Sue A; Sorensen, Karina D; Orntoft, Torben F; Park, Jong Y; Cybulski, Cezary; Maier, Christiane; Doerk, Thilo; Dickinson, Joanne L; Cannon-Albright, Lisa; Brenner, Hermann; Rebbeck, Timothy R; Zeigler-Johnson, Charnita; Habuchi, Tomonori; Thibodeau, Stephen N; Cooney, Kathleen A; Chappuis, Pierre O; Hutter, Pierre; Kaneva, Radka P; Foulkes, William D; Zeegers, Maurice P; Lu, Yong-Jie; Zhang, Hong-Wei; Stephenson, Robert; Cox, Angela; Southey, Melissa C; Spurdle, Amanda B; FitzGerald, Liesel; Leongamornlert, Daniel; Saunders, Edward; Tymrakiewicz, Malgorzata; Guy, Michelle; Dadaev, Tokhir; Little, Sarah J; Govindasami, Koveela; Sawyer, Emma; Wilkinson, Rosemary; Herkommer, Kathleen; Hopper, John L; Lophatonanon, Aritaya; Rinckleb, Antje E; Kote-Jarai, Zsofia; Eeles, Rosalind A; Easton, Douglas F
2015-07-01
Genome-wide association studies have identified multiple genetic variants associated with prostate cancer risk which explain a substantial proportion of familial relative risk. These variants can be used to stratify individuals by their risk of prostate cancer. We genotyped 25 prostate cancer susceptibility loci in 40,414 individuals and derived a polygenic risk score (PRS). We estimated empirical odds ratios (OR) for prostate cancer associated with different risk strata defined by PRS and derived age-specific absolute risks of developing prostate cancer by PRS stratum and family history. The prostate cancer risk for men in the top 1% of the PRS distribution was 30.6 (95% CI, 16.4-57.3) fold compared with men in the bottom 1%, and 4.2 (95% CI, 3.2-5.5) fold compared with the median risk. The absolute risk of prostate cancer by age of 85 years was 65.8% for a man with family history in the top 1% of the PRS distribution, compared with 3.7% for a man in the bottom 1%. The PRS was only weakly correlated with serum PSA level (correlation = 0.09). Risk profiling can identify men at substantially increased or reduced risk of prostate cancer. The effect size, measured by OR per unit PRS, was higher in men at younger ages and in men with family history of prostate cancer. Incorporating additional newly identified loci into a PRS should improve the predictive value of risk profiles. We demonstrate that the risk profiling based on SNPs can identify men at substantially increased or reduced risk that could have useful implications for targeted prevention and screening programs. ©2015 American Association for Cancer Research.
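The abstract does not spell out how the 25-SNP score was built; a common construction (assumed here, not taken from the paper) sums risk-allele counts weighted by per-allele log odds ratios, and risk strata are then defined by percentiles of the resulting score distribution. A minimal sketch under that assumption, with invented weights and genotypes:

    import math

    # Hypothetical per-SNP odds ratios (weights) and one person's risk-allele
    # counts (0, 1 or 2 per locus); values are illustrative, not from the study.
    weights = [math.log(or_) for or_ in (1.12, 1.08, 1.25, 1.10)]
    genotype = [1, 0, 2, 1]

    # Polygenic risk score: weighted sum of risk-allele counts.
    prs = sum(w * g for w, g in zip(weights, genotype))

    # Individuals are then ranked and grouped by PRS percentile (e.g. top 1%,
    # bottom 1%, around the median) to estimate stratum-specific odds ratios.
    print(round(prs, 3))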
Michos, Erin D; Blaha, Michael J; Blumenthal, Roger S
2017-12-01
Clinical guidelines for instituting pharmacotherapy for the primary prevention of atherosclerotic cardiovascular disease (ASCVD), specifically lipid management and aspirin, have long been based on absolute risk. However, lipid management in the current era remains challenging to both patients and clinicians in the setting of somewhat discordant recommendations from various organizations. All guidelines endorse the use of statins for primary prevention for those at sufficient absolute risk, and treatment recommendations are generally "risk-based" rather than exclusively targeting specific low-density lipoprotein cholesterol levels. Nonetheless, guidelines differ in relation to the risk threshold for initiation and the intensity of statin treatment. The key concept of the clinician-patient risk discussion introduced in the 2013 American College of Cardiology/American Heart Association cholesterol guidelines is a process that addresses the potential for ASCVD risk reduction with statin treatment, potential for adverse treatment effects, patient preferences, encouragement of heart-healthy lifestyle, and management of other risk factors. However, operationalizing the clinician-patient risk discussion requires effective communication of the most accurate and personalized risk information. In this article, we review our treatment approach for the appropriate use of coronary artery calcium testing in the intermediate-risk patient to guide shared decision making. The decision to initiate or intensify statin therapy may be uncertain across a broad range of estimated 10-year ASCVD risk of 5% to 20%, and coronary artery calcium testing can reclassify risk upward or downward in approximately 50% of this group to inform the risk discussion. We conclude with 2 case-based examples of uncertain risk and uncertain statin therapeutic benefit to illustrate execution of the clinician-patient risk discussion. Copyright © 2017 Mayo Foundation for Medical Education and Research. Published by Elsevier Inc. All rights reserved.
Amin Al Olama, Ali; Benlloch, Sara; Antoniou, Antonis C.; Zeigler-Johnson, Charnita; Stephenson, Robert; Cox, Angela; Southey, Melissa C.; Spurdle, Amanda B.; FitzGerald, Liesel; Leongamornlert, Daniel; Saunders, Edward; Tymrakiewicz, Malgorzata; Guy, Michelle; Dadaev, Tokhir; Little, Sarah J.; Govindasami, Koveela; Sawyer, Emma; Wilkinson, Rosemary; Herkommer, Kathleen; Hopper, John L.; Lophatonanon, Aritaya; Rinckleb, Antje E.; Kote-Jarai, Zsofia; Eeles, Rosalind A.; Easton, Douglas F.
2015-01-01
Background Genome-wide association studies have identified multiple genetic variants associated with prostate cancer (PrCa) risk which explain a substantial proportion of familial relative risk. These variants can be used to stratify individuals by their risk of PrCa. Methods We genotyped 25 PrCa susceptibility loci in 40,414 individuals and derived a polygenic risk score (PRS). We estimated empirical Odds Ratios for PrCa associated with different risk strata defined by PRS and derived age-specific absolute risks of developing PrCa by PRS stratum and family history. Results The PrCa risk for men in the top 1% of the PRS distribution was 30.6 (95% CI 16.4-57.3) fold compared with men in the bottom 1%, and 4.2 (95% CI 3.2-5.5) fold compared with the median risk. The absolute risk of PrCa by age 85 was 65.8% for a man with family history in the top 1% of the PRS distribution, compared with 3.7% for a man in the bottom 1%. The PRS was only weakly correlated with serum PSA level (correlation=0.09). Conclusions Risk profiling can identify men at substantially increased or reduced risk of PrCa. The effect size, measured by OR per unit PRS, was higher in men at younger ages and in men with family history of PrCa. Incorporating additional newly identified loci into a PRS should improve the predictive value of risk profiles. Impact We demonstrate that the risk profiling based on SNPs can identify men at substantially increased or reduced risk that could have useful implications for targeted prevention and screening programs. PMID:25837820
Current recommendations: what is the clinician to do?
Manson, Joann E
2014-04-01
Menopausal hormone therapy (HT) has complex biologic effects but continues to have an important clinical role in the management of vasomotor and other menopausal symptoms. The rational use of menopausal HT requires balancing the potential benefits and risks of treatment. Findings from the Women's Health Initiative (WHI) and other randomized clinical trials have helped to clarify the benefits and risks of HT and have provided insights to improve decision making. Several clinical characteristics have utility in identifying women for whom benefits of HT are likely to outweigh the risks. Age and time since menopause are strong predictors of health outcomes and absolute risks associated with HT, and differences by age have been particularly apparent for estrogen alone. In the WHI trial of conjugated equine estrogens (CEE) alone, younger women (50-59 years) had more favorable results for all-cause mortality, myocardial infarction, and the global index, but not for stroke and venous thrombosis. Age trends were less clear for CEE + medroxyprogesterone acetate, owing to increased risks of breast cancer, stroke, and venous thrombosis in all age groups. Absolute risks of adverse events were lower in younger than in older women in both trials, however. Other predictors of lower vascular risk from HT include favorable lipid status and absence of the metabolic syndrome. Transdermal administration may be associated with lower risks of venous thrombosis and stroke, but additional research is needed. The use of risk stratification and personalized risk assessment offers promise for improved benefit-risk profile and safety of HT. One approach to decision making is presented. Key elements include: assessment of whether the patient has moderate to severe menopausal symptoms, the primary indication for initiating systemic HT (vaginal estrogen may be used to treat genitourinary symptoms in the absence of vasomotor symptoms); understanding the patient's own preference regarding therapy; evaluating the patient for the presence of any contraindications to HT, as well as the time since menopause onset and baseline risks of cardiovascular disease and breast cancer; reviewing carefully the benefits and risks of treatment with the patient, giving more emphasis to absolute than to relative measures of effect; and, if HT is initiated, regularly reviewing the patient's need for continued treatment. Copyright © 2014 American Society for Reproductive Medicine. Published by Elsevier Inc. All rights reserved.
Herman, William H; Pan, Qing; Edelstein, Sharon L; Mather, Kieren J; Perreault, Leigh; Barrett-Connor, Elizabeth; Dabelea, Dana M; Horton, Edward; Kahn, Steven E; Knowler, William C; Lorenzo, Carlos; Pi-Sunyer, Xavier; Venditti, Elizabeth; Ye, Wen
2017-12-01
Both lifestyle and metformin interventions can delay or prevent progression to type 2 diabetes mellitus (DM) in people with impaired glucose regulation, but there is considerable interindividual variation in the likelihood of receiving benefit. Understanding an individual's 3-year risk of progressing to DM and regressing to normal glucose regulation (NGR) might facilitate benefit-based tailored treatment. We used the values of 19 clinical variables measured at the Diabetes Prevention Program (DPP) baseline evaluation and Cox proportional hazards models to assess the 3-year risk of progression to DM and regression to NGR separately for DPP lifestyle, metformin, and placebo participants who were adherent to the interventions. Lifestyle participants who lost ≥5% of their initial body weight at 6 months and metformin and placebo participants who reported taking ≥80% of their prescribed medication at the 6-month follow-up were defined as adherent. Eleven of 19 clinical variables measured at baseline predicted progression to DM, and 6 of 19 predicted regression to NGR. Compared with adherent placebo participants at lowest risk of developing diabetes, participants at lowest risk of developing diabetes who adhered to a lifestyle intervention had an 8% absolute risk reduction (ARR) of developing diabetes and a 35% greater absolute likelihood of reverting to NGR. Participants at lowest risk of developing diabetes who adhered to a metformin intervention had no reduction in their risk of developing diabetes and a 17% greater absolute likelihood of reverting to NGR. Participants at highest risk of developing DM who adhered to a lifestyle intervention had a 39% ARR of developing diabetes and a 24% greater absolute likelihood of reverting to NGR, whereas those who adhered to the metformin intervention had a 25% ARR of developing diabetes and an 11% greater absolute likelihood of reverting to NGR. Unlike our previous analyses that sought to explain population risk, these analyses evaluate individual risk. The models can be used by overweight and obese adults with fasting hyperglycemia and impaired glucose tolerance to facilitate personalized decision-making by allowing them to explicitly weigh the benefits and feasibility of the lifestyle and metformin interventions. © 2017 by the American Diabetes Association.
Søgaard, Kirstine Kobberøe; Farkas, Dóra Körmendiné; Sørensen, Henrik Toft
2017-12-29
There is an ongoing debate on the possible association between infections in early childhood and subsequent cancer risk, but it remains unclear if a hospital admission for infection is associated with risk of childhood cancer diagnosis. We examined if a hospital-based diagnosis of pneumonia was a clinical marker of the three most common childhood cancers. Population-based cohort study. Denmark, hospital diagnoses, 1994-2013. Using national health registries, we compared the observed incidence of leukaemia, lymphoma and brain cancer among 83 935 children with a hospital-based pneumonia diagnosis with that expected among children in the general population. We calculated absolute cancer risks and standardised incidence ratios (SIRs) as a measure of relative risk. The cancer SIRs were substantially increased during the first 6 months of follow-up; lymphoid leukaemia: 6.2 (95% CI 3.5 to 10.3); myeloid leukaemia: 14.8 (95% CI 6.0 to 30.6); Hodgkin's lymphoma: 60.8 (95% CI 26.2 to 120), non-Hodgkin's lymphoma: 15.9 (95% CI 5.2 to 37.2) and brain cancer: 4.4 (95% CI 1.9 to 8.7). The 6-month absolute risks of leukaemia, lymphoma and brain cancer were all low, reaching 0.05% when combined. An increased risk persisted beyond 5 years for non-Hodgkin's lymphoma and brain cancer. However, the 5-year absolute cancer risk was 0.14%. The short-term incidence of leukaemia, lymphoma and brain cancer was higher than expected and persisted beyond 5 years for non-Hodgkin's lymphoma and brain cancer. However, the absolute cancer risk was low. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
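For readers unfamiliar with the measures, the standardised incidence ratio used here is observed cases divided by the cases expected from general-population rates, and the excess absolute risk is the observed-minus-expected count per unit of person-time. A minimal sketch with made-up numbers (not the study's data):

    # Illustrative only: observed and expected counts are invented.
    observed = 12            # cancers observed in the exposed cohort
    expected = 2.5           # cancers expected from general-population rates
    person_years = 40000.0

    sir = observed / expected                                        # standardised incidence ratio
    excess_per_10k = (observed - expected) / person_years * 10000.0  # excess absolute risk

    print(round(sir, 2), round(excess_per_10k, 2))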
McArthur, Eric; Fraser, Lisa-Ann; Hayward, Jade; Dixon, Stephanie; Hwang, Y Joseph; Ordon, Michael
2015-01-01
Study question Do men starting treatment with prostate-specific α antagonists have increased risk of fall and fracture? Methods Administrative datasets from the province of Ontario, Canada, that contain patient level data were used to generate a cohort of 147 084 men aged ≥66 years who filled their first outpatient prescription for prostate-specific α antagonists tamsulosin, alfuzosin, or silodosin between June 2003 and December 2013 (exposed men) plus an equal sized cohort matched 1:1 (using a propensity score model) who did not initiate α antagonist therapy. The primary outcome was a hospital emergency room visit or inpatient admission for a fall or fracture in the 90 days after exposure. Study answer and limitations The men exposed to prostate-specific α antagonist had significantly increased risks of falling (odds ratio 1.14 (95% CI 1.07 to 1.21), absolute risk increase 0.17% (0.08 to 0.25%)) and of sustaining a fracture (odds ratio 1.16 (1.04 to 1.29), absolute risk increase 0.06% (0.02 to 0.11%)) compared with the unexposed cohort. This increased risk was not observed in the period before α antagonist use. Secondary outcomes of hypotension and head trauma were also significantly increased in the exposed cohort (odds ratios 1.80 (1.59 to 2.03) and 1.15 (1.04 to 1.27) respectively). The two cohorts were similar across 98 different covariates including demographics, comorbid conditions, medication use, healthcare use, and prior medical investigation. Potential unmeasured confounders, such as physical deconditioning, mobility impairment, and situational risk factors, may exist. The data used to identify the primary outcomes had limited sensitivity, so the absolute risks of the outcomes are probably underestimates. The study only included men ≥66 years old, and 84% of exposed men were prescribed tamsulosin, so results may not be generalizable to younger men, and there may not be statistical power to show small differences in outcomes between the drugs. What this study adds Prostate-specific α antagonists are associated with a small but significant increased risk of fall, fracture, and head trauma, probably as a result of induced hypotension. Funding, competing interests, data sharing This project was conducted at the Institute for Clinical Evaluative Sciences (ICES) Western Site through the Kidney, Dialysis, and Transplantation (KDT) research program. BW has received a research grant from Astellas, and L-AF does consultancy for Amgen. PMID:26502947
Projecting Individualized Absolute Invasive Breast Cancer Risk in US Hispanic Women
John, Esther M.; Slattery, Martha L.; Gomez, Scarlett Lin; Yu, Mandi; LaCroix, Andrea Z.; Pee, David; Chlebowski, Rowan T.; Hines, Lisa M.; Thompson, Cynthia A.; Gail, Mitchell H.
2017-01-01
Background: There is no model to estimate absolute invasive breast cancer risk for Hispanic women. Methods: The San Francisco Bay Area Breast Cancer Study (SFBCS) provided data on Hispanic breast cancer case patients (533 US-born, 553 foreign-born) and control participants (464 US-born, 947 foreign-born). These data yielded estimates of relative risk (RR) and attributable risk (AR) separately for US-born and foreign-born women. Nativity-specific absolute risks were estimated by combining RR and AR information with nativity-specific invasive breast cancer incidence and competing mortality rates from the California Cancer Registry and Surveillance, Epidemiology, and End Results program to develop the Hispanic risk model (HRM). In independent data, we assessed model calibration through observed/expected (O/E) ratios, and we estimated discriminatory accuracy with the area under the receiver operating characteristic curve (AUC) statistic. Results: The US-born HRM included age at first full-term pregnancy, biopsy for benign breast disease, and family history of breast cancer; the foreign-born HRM also included age at menarche. The HRM estimated lower risks than the National Cancer Institute’s Breast Cancer Risk Assessment Tool (BCRAT) for US-born Hispanic women, but higher risks in foreign-born women. In independent data from the Women’s Health Initiative, the HRM was well calibrated for US-born women (observed/expected [O/E] ratio = 1.07, 95% confidence interval [CI] = 0.81 to 1.40), but seemed to overestimate risk in foreign-born women (O/E ratio = 0.66, 95% CI = 0.41 to 1.07). The AUC was 0.564 (95% CI = 0.485 to 0.644) for US-born and 0.625 (95% CI = 0.487 to 0.764) for foreign-born women. Conclusions: The HRM is the first absolute risk model that is based entirely on data specific to Hispanic women by nativity. Further studies in Hispanic women are warranted to evaluate its validity. PMID:28003316
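The abstract describes the general recipe (relative risks, attributable risks, registry incidence and competing mortality) without giving the projection formula; the sketch below shows one standard Gail-type discretisation of that recipe and is only an assumption about the approach, with all rates invented.

    # Hedged sketch of a Gail-type absolute risk projection; not the authors' code.

    def absolute_risk(rel_risk, ages, incidence, attrib_risk, mortality):
        """Discretised projection of absolute risk over one-year age steps.

        incidence, mortality: age-specific annual rates (per woman per year)
        attrib_risk: proportion of incidence attributable to the model's risk
                     factors, used to scale composite incidence to a baseline hazard
        """
        surv, risk = 1.0, 0.0
        for age in ages:
            h1 = incidence[age] * (1.0 - attrib_risk) * rel_risk  # cause-specific hazard
            h2 = mortality[age]                                   # competing mortality
            risk += surv * h1                                     # newly projected cases
            surv *= (1.0 - h1 - h2)                               # alive and cancer-free
        return risk

    # Toy 10-year projection from age 50 with flat illustrative rates
    inc = {a: 0.0025 for a in range(50, 60)}
    mort = {a: 0.004 for a in range(50, 60)}
    print(round(absolute_risk(rel_risk=1.8, ages=range(50, 60),
                              incidence=inc, attrib_risk=0.4, mortality=mort), 4))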
Utility of the Seattle Heart Failure Model in patients with advanced heart failure.
Kalogeropoulos, Andreas P; Georgiopoulou, Vasiliki V; Giamouzis, Grigorios; Smith, Andrew L; Agha, Syed A; Waheed, Sana; Laskar, Sonjoy; Puskas, John; Dunbar, Sandra; Vega, David; Levy, Wayne C; Butler, Javed
2009-01-27
The aim of this study was to validate the Seattle Heart Failure Model (SHFM) in patients with advanced heart failure (HF). The SHFM was developed primarily from clinical trial databases and extrapolated the benefit of interventions from published data. We evaluated the discrimination and calibration of SHFM in 445 advanced HF patients (age 52 +/- 12 years, 68.5% male, 52.4% white, ejection fraction 18 +/- 8%) referred for cardiac transplantation. The primary end point was death (n = 92), urgent transplantation (n = 14), or left ventricular assist device (LVAD) implantation (n = 3); a secondary analysis was performed on mortality alone. Patients were receiving optimal therapy (angiotensin-II modulation 92.8%, beta-blockers 91.5%, aldosterone antagonists 46.3%), and 71.0% had an implantable device (defibrillator 30.4%, biventricular pacemaker 3.4%, combined 37.3%). During a median follow-up of 21 months, 109 patients (24.5%) had an event. Although discrimination was adequate (c-statistic >0.7), the SHFM overall underestimated absolute risk (observed vs. predicted event rate: 11.0% vs. 9.2%, 21.0% vs. 16.6%, and 27.9% vs. 22.8% at 1, 2, and 3 years, respectively). Risk underprediction was more prominent in patients with an implantable device. The SHFM had different calibration properties in white versus black patients, leading to net underestimation of absolute risk in blacks. Race-specific recalibration improved the accuracy of predictions. When analysis was restricted to mortality, the SHFM exhibited better performance. In patients with advanced HF, the SHFM offers adequate discrimination, but absolute risk is underestimated, especially in blacks and in patients with devices. This is more prominent when including transplantation and LVAD implantation as an end point.
Patient Stratification for Preventive Care in Dentistry
Giannobile, W.V.; Braun, T.M.; Caplis, A.K.; Doucette-Stamm, L.; Duff, G.W.; Kornman, K.S.
2013-01-01
Prevention reduces tooth loss, but little evidence supports biannual preventive care for all adults. We used risk-based approaches to test tooth loss association with 1 vs. 2 annual preventive visits in high-risk (HiR) and low-risk (LoR) patients. Insurance claims for 16 years for 5,117 adults were evaluated retrospectively for tooth extraction events. Patients were classified as HiR for progressive periodontitis if they had ≥ 1 of the risk factors (RFs) smoking, diabetes, interleukin-1 genotype; or as LoR if no RFs. LoR event rates were 13.8% and 16.4% for 2 or 1 annual preventive visits (absolute risk reduction, 2.6%; 95% CI, 0.5% to 5.8%; p = .092). HiR event rates were 16.9% and 22.1% for 2 and 1 preventive visits (absolute risk reduction, 5.2%; 95% CI, 1.8% to 8.4%; p = .002). Increasing RFs increased events (p < .001). Oral health care costs were not increased by any single RF, regardless of prevention frequency (p > .41), but multiple RFs increased costs vs. no RFs (p < .001) or 1 RF (p = .001). For LoR individuals, the association between preventive dental visits and tooth loss was not significantly different whether the frequency was once or twice annually. A personalized medicine approach combining gene biomarkers with conventional risk factors to stratify populations may be useful in resource allocation for preventive dentistry (ClinicalTrials.gov, NCT01584479). PMID:23752171
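The risk differences reported here are plain absolute risk reductions (event rate with one annual visit minus event rate with two); a quick check of the high-risk figures, with a normal-approximation confidence interval assumed purely for illustration since the abstract does not report group sizes:

    import math

    # HiR event rates from the abstract; n1 and n2 are invented so the CI is runnable.
    p_one_visit, p_two_visits = 0.221, 0.169
    n1, n2 = 1000, 1000  # hypothetical group sizes

    arr = p_one_visit - p_two_visits   # absolute risk reduction = 0.052 (5.2 points)
    se = math.sqrt(p_one_visit * (1 - p_one_visit) / n1 +
                   p_two_visits * (1 - p_two_visits) / n2)
    ci = (arr - 1.96 * se, arr + 1.96 * se)
    print(round(arr, 3), [round(x, 3) for x in ci])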
Jenum, Synne; Grewal, Harleen M S; Hokey, David A; Kenneth, John; Vaz, Mario; Doherty, Timothy Mark; Jahnsen, Frode Lars
2014-01-01
QuantiFERON-TB Gold In-Tube (QFT) is an IFNγ-release assay used in the diagnosis of Mycobacterium tuberculosis (MTB) infection. The risk of TB progression increases with the magnitude of the MTB-specific IFNγ-response. QFT reversion, also associated with low Tuberculin Skin Test responses, may therefore represent a transient immune response with control of M. tuberculosis infection. However, studies at the single cell level have suggested that the quality (polyfunctionality) of the T-cell response is more important than the quantity of cytokines produced. To explore the quality and/or magnitude of mycobacteria-specific T-cell responses associated with QFT reversion and persistent QFT-positivity. Multi-color flowcytometry on prospectively collected peripheral blood mononuclear cells was applied to assess mycobacteria-specific T-cell responses in 42 QFT positive Indian adolescents of whom 21 became QFT negative (reverters) within one year. Ten QFT consistent negatives were also included as controls. There was no difference in the qualitative PPD-specific CD4+ T-cell response between QFT consistent positives and reverters. However, compared with QFT consistent positives, reverters displayed lower absolute frequencies of polyfunctional (IFNγ+IL2+TNFα+) CD4+ T-cells at baseline, which were further reduced to the point where they were not different to QFT negative controls one year later. Moreover, absolute frequencies of these cells correlated well with the magnitude of the QFT-response. Whereas specific polyfunctional CD4+ T-cells have been suggested to protect against TB progression, our data do not support that higher relative or absolute frequencies of PPD-specific polyfunctional CD4+ T-cells in peripheral blood can explain the reduced risk of TB progression observed in QFT reverters. On the contrary, absolute frequencies of these cells correlated with the QFT-response, suggesting that this readout reflects antigenic load.
2013-01-01
Background The high burden and rising incidence of cardiovascular disease (CVD) in resource constrained countries necessitates implementation of robust and pragmatic primary and secondary prevention strategies. Many current CVD management guidelines recommend absolute cardiovascular (CV) risk assessment as a clinically sound guide to preventive and treatment strategies. Development of non-laboratory based cardiovascular risk assessment algorithms enable absolute risk assessment in resource constrained countries. The objective of this review is to evaluate the performance of existing non-laboratory based CV risk assessment algorithms using the benchmarks for clinically useful CV risk assessment algorithms outlined by Cooney and colleagues. Methods A literature search to identify non-laboratory based risk prediction algorithms was performed in MEDLINE, CINAHL, Ovid Premier Nursing Journals Plus, and PubMed databases. The identified algorithms were evaluated using the benchmarks for clinically useful cardiovascular risk assessment algorithms outlined by Cooney and colleagues. Results Five non-laboratory based CV risk assessment algorithms were identified. The Gaziano and Framingham algorithms met the criteria for appropriateness of statistical methods used to derive the algorithms and endpoints. The Swedish Consultation, Framingham and Gaziano algorithms demonstrated good discrimination in derivation datasets. Only the Gaziano algorithm was externally validated where it had optimal discrimination. The Gaziano and WHO algorithms had chart formats which made them simple and user friendly for clinical application. Conclusion Both the Gaziano and Framingham non-laboratory based algorithms met most of the criteria outlined by Cooney and colleagues. External validation of the algorithms in diverse samples is needed to ascertain their performance and applicability to different populations and to enhance clinicians’ confidence in them. PMID:24373202
2012-01-01
Background There has been an overall decrease in incident ischaemic heart disease (IHD), but the reduction in IHD risk factors has been greater among those with higher social position. The increase in social inequalities in IHD mortality in Scandinavian countries is often referred to as the Scandinavian “public health puzzle”. The objective of this study was to examine trends in absolute and relative educational inequalities in four modifiable ischaemic heart disease risk factors (smoking, diabetes, hypertension and high total cholesterol) over the last three decades among Norwegian middle-aged women and men. Methods Population-based, cross-sectional data from The Nord-Trøndelag Health Study (HUNT): HUNT 1 (1984–1986), HUNT 2 (1995–1997) and HUNT 3 (2006–2008), women and men 40–59 years old. Educational inequalities were assessed using the Slope Index of Inequality (SII) and the Relative Index of Inequality (RII). Results Smoking prevalence increased for all education groups among women and decreased in men. Relative and absolute educational inequalities in smoking widened in both genders, with significantly higher absolute inequalities among women than men in the last two surveys. Diabetes prevalence increased in all groups. Relative inequalities in diabetes were stable, while absolute inequalities increased both among women (p = 0.05) and among men (p = 0.01). Hypertension prevalence decreased in all groups. Relative inequalities in hypertension widened over time in both genders. However, absolute inequalities in hypertension decreased among women (p = 0.05) and were stable among men (p = 0.33). For high total cholesterol, relative and absolute inequalities remained stable in both genders. Conclusion The widening of absolute educational inequalities in smoking and diabetes over the last three decades gives rise to concern. The mechanisms behind these results are less clear, and future studies are needed to assess whether educational inequalities in secondary prevention of IHD are larger than those in primary prevention of IHD. Continued monitoring of IHD risk factors at the population level is therefore warranted. The results emphasise the need for public health efforts to prevent future burdens of lifestyle-related diseases and to avoid further widening of socioeconomic inequalities in IHD mortality in Norway, especially among women. PMID:22471945
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, Grace L.; Jiang, Jing; Buchholz, Thomas A.
Purpose: Brachytherapy after lumpectomy is an increasingly popular breast cancer treatment, but data concerning its effectiveness are conflicting. Recently proposed “suitability” criteria guiding patient selection for brachytherapy have never been empirically validated. Methods: Using the Surveillance, Epidemiology, and End Results–Medicare linked database, we compared women aged 66 years or older with invasive breast cancer (n=28,718) or ductal carcinoma in situ (n=7229) diagnosed from 2002 to 2007, treated with lumpectomy alone, brachytherapy, or external beam radiation therapy (EBRT). The likelihood of breast preservation, measured by subsequent mastectomy risk, was compared by use of multivariate proportional hazards, further stratified by American Society for Radiation Oncology (ASTRO) brachytherapy suitability groups. We compared 1-year postoperative complications using the χ² test and 5-year local toxicities using the log-rank test. Results: For patients with invasive cancer, the 5-year subsequent mastectomy risk was 4.7% after lumpectomy alone (95% confidence interval [CI], 4.1%-5.4%), 2.8% after brachytherapy (95% CI, 1.8%-4.3%), and 1.3% after EBRT (95% CI, 1.1%-1.5%) (P<.001). Compared with lumpectomy alone, brachytherapy achieved a more modest reduction in adjusted risk (hazard ratio [HR], 0.61; 95% CI, 0.40-0.94) than achieved with EBRT (HR, 0.22; 95% CI, 0.18-0.28). Relative risks did not differ when stratified by ASTRO suitability group (P=.84 for interaction), although ASTRO “suitable” patients did show a low absolute subsequent mastectomy risk, with a minimal absolute difference in risk after brachytherapy (1.6%; 95% CI, 0.7%-3.5%) versus EBRT (0.8%; 95% CI, 0.6%-1.1%). For patients with ductal carcinoma in situ, EBRT maintained a reduced risk of subsequent mastectomy (HR, 0.40; 95% CI, 0.28-0.55; P<.001), whereas the small number of patients treated with brachytherapy (n=179) precluded definitive comparison with lumpectomy alone. In all patients, brachytherapy showed a higher postoperative infection risk (16.5% vs 9.9% after lumpectomy alone vs 11.4% after EBRT, P<.001); higher incidence of breast pain (22.9% vs 11.2% vs 16.7%, P<.001); and higher incidence of fat necrosis (15.3% vs 5.3% vs 7.7%, P<.001). Conclusions: In this study era, brachytherapy showed lesser breast preservation benefit compared with EBRT. Suitability criteria predicted differential absolute, but not relative, benefit in patients with invasive cancer.
Targeting Preschool Children to Promote Cardiovascular Health: Cluster Randomized Trial
Céspedes, Jaime; Briceño, German; Farkouh, Michael E.; Vedanthan, Rajesh; Baxter, Jorge; Leal, Martha; Boffetta, Paolo; Woodward, Mark; Hunn, Marilyn; Dennis, Rodolfo; Fuster, Valentin
2015-01-01
BACKGROUND School programs can be effective in modifying knowledge, attitudes, and habits relevant to long-term risk of chronic diseases associated with sedentary lifestyles. As part of a long-term research strategy, we conducted an educational intervention in preschool facilities to assess changes in preschoolers’ knowledge, attitudes, and habits toward healthy eating and living an active lifestyle. METHODS Using a cluster design, we randomly assigned 14 preschool facilities in Bogotá, Colombia to a 5-month educational and playful intervention (7 preschool facilities) or to usual curriculum (7 preschool facilities). A total of 1216 children aged 3–5 years, 928 parents, and 120 teachers participated. A structured survey was used at baseline, at the end of the study, and 12 months later to evaluate changes in knowledge, attitudes, and habits. RESULTS Children in the intervention group showed a 10.9% increase in weighted score, compared with 5.3% in controls. The absolute adjusted difference was 3.90 units (95% confidence interval [CI], 1.64–6.16; P <.001). Among parents, the equivalent statistics were 8.9% and 3.1%, respectively (absolute difference 4.08 units; 95% CI, 2.03 to 6.12; P <.001), and among teachers, 9.4% and 2.5%, respectively (absolute difference 5.36 units; 95% CI, −0.29–11.01; P = .06). In the intervened cohort 1 year after the intervention, children still showed a significant increase in weighted score (absolute difference of 6.38 units; P <.001). CONCLUSIONS A preschool-based intervention aimed at improving knowledge, attitudes, and habits related to healthy diet and active lifestyle is feasible, efficacious, and sustainable in very young children. PMID:23062403
Cancer incidence and mortality risks in a large US Barrett's oesophagus cohort.
Cook, Michael B; Coburn, Sally B; Lam, Jameson R; Taylor, Philip R; Schneider, Jennifer L; Corley, Douglas A
2018-03-01
Barrett's oesophagus (BE) increases the risk of oesophageal adenocarcinoma by 10-55 times that of the general population, but no community-based cancer-specific incidence and cause-specific mortality risk estimates exist for large cohorts in the USA. Within Kaiser Permanente Northern California (KPNC), we identified patients with BE diagnosed during 1995-2012. KPNC cancer registry and mortality files were used to estimate standardised incidence ratios (SIR), standardised mortality ratios (SMR) and excess absolute risks. There were 8929 patients with BE providing 50 147 person-years of follow-up. Compared with the greater KPNC population, patients with BE had increased risks of any cancer (SIR=1.40, 95% CI 1.31 to 1.49), which slightly decreased after excluding oesophageal cancer. Oesophageal adenocarcinoma risk was increased 24 times, which translated into an excess absolute risk of 24 cases per 10 000 person-years. Although oesophageal adenocarcinoma risk decreased with time since BE diagnosis, oesophageal cancer mortality did not, indicating that the true risk is stable and persistent with time. Relative risks of cardia and stomach cancers were increased, but excess absolute risks were modest. Risks of colorectal, lung and prostate cancers were unaltered. All-cause mortality was slightly increased after excluding oesophageal cancer (SMR=1.24, 95% CI 1.18 to 1.31), but time-stratified analyses indicated that this was likely attributable to diagnostic bias. Cause-specific SMRs were elevated for ischaemic heart disease (SMR=1.39, 95% CI 1.18 to 1.63), respiratory system diseases (SMR=1.51, 95% CI 1.29 to 1.75) and digestive system diseases (SMR=2.20 95% CI 1.75 to 2.75). Patients with BE had a persistent excess risk of oesophageal adenocarcinoma over time, although their absolute excess risks for this cancer, any cancer and overall mortality were modest. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.
Risk aversion affects economic values of blue fox breeding scheme.
Peura, J; Kempe, R; Strandén, I; Rydhmer, L
2016-12-01
The profit and production of an average Finnish blue fox farm were simulated using a deterministic bio-economic farm model. Risk was included using the Arrow-Pratt absolute risk aversion coefficient and profit variance. Risk-rated economic values were calculated for pregnancy rate, litter loss, litter size, pelt size, pelt quality, pelt colour clarity, feed efficiency and eye infection. With high absolute risk aversion, economic values were lower than with low absolute risk aversion. Economic values were highest for litter loss (18.16 and 26.42 EUR), litter size (13.27 and 19.40 EUR), pregnancy (11.99 and 18.39 EUR) and eye infection (12.39 and 13.81 EUR). Sensitivity analysis showed that selection pressure for improved eye health depended strongly on the proportion of culled animals among infected animals and much less on the proportion of infected animals. The economic value of feed efficiency was lower than expected (6.06 and 8.03 EUR). However, it was of almost the same magnitude as pelt quality (7.30 and 7.30 EUR) and higher than the economic value of pelt size (3.37 and 5.26 EUR). Risk factors should be considered in the blue fox breeding scheme because they change the relative importance of traits. © 2016 Blackwell Verlag GmbH.
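The abstract does not give the formula it uses to fold risk into the economic values; one standard way to do so (assumed here, not taken from the paper) is a mean-variance certainty equivalent, with the Arrow-Pratt coefficient scaling the variance penalty, and the economic value of a trait taken as the change in that certainty equivalent per unit change in the trait mean. A minimal sketch under that assumption, with invented profit and variance functions:

    # Hedged mean-variance sketch; numbers are illustrative, not the farm model's.

    def certainty_equivalent(expected_profit, profit_variance, risk_aversion):
        # Arrow-Pratt style adjustment: penalise expected profit by half the
        # risk-aversion coefficient times the profit variance.
        return expected_profit - 0.5 * risk_aversion * profit_variance

    def economic_value(trait_mean, delta, profit_fn, variance_fn, risk_aversion):
        # Finite-difference derivative of the certainty equivalent w.r.t. the trait.
        base = certainty_equivalent(profit_fn(trait_mean),
                                    variance_fn(trait_mean), risk_aversion)
        bumped = certainty_equivalent(profit_fn(trait_mean + delta),
                                      variance_fn(trait_mean + delta), risk_aversion)
        return (bumped - base) / delta

    # Toy example: profit and profit variance both rise with litter size, so a
    # higher risk aversion coefficient lowers the resulting economic value.
    profit = lambda litter: 40.0 * litter
    variance = lambda litter: 900.0 + 120.0 * litter
    print(round(economic_value(5.0, 0.1, profit, variance, risk_aversion=0.01), 2))
    print(round(economic_value(5.0, 0.1, profit, variance, risk_aversion=0.05), 2))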
Li, Chaoyang; Balluz, Lina S; Ford, Earl S; Okoro, Catherine A; Zhao, Guixiang; Pierannunzi, Carol
2012-06-01
To compare the prevalence estimates of selected health indicators and chronic diseases or conditions among three national health surveys in the United States. Data from adults aged 18 years or older who participated in the Behavioral Risk Factor Surveillance System (BRFSS) in 2007 and 2008 (n=807,524), the National Health Interview Survey (NHIS) in 2007 and 2008 (n=44,262), and the National Health and Nutrition Examination Survey (NHANES) during 2007 and 2008 (n=5871) were analyzed. The prevalence estimates of current smoking, obesity, hypertension, and no health insurance were similar across the three surveys, with absolute differences ranging from 0.7% to 3.9% (relative differences: 2.3% to 20.2%). The prevalence estimate of poor or fair health from BRFSS was similar to that from NHANES, but higher than that from NHIS. The prevalence estimates of diabetes, coronary heart disease, and stroke were similar across the three surveys, with absolute differences ranging from 0.0% to 0.8% (relative differences: 0.2% to 17.1%). While the BRFSS continues to provide invaluable health information at state and local level, it is reassuring to observe consistency in the prevalence estimates of key health indicators of similar caliber between BRFSS and other national surveys. Published by Elsevier Inc.
Banerjee, Smita C.; Greene, Kathryn; Li, Yuelin; Ostroff, Jamie S.
2016-01-01
Objectives This study examined the effects of comparative-framing [C-F; ads highlighting differences between the advertised product and conventional cigarettes and/or smokeless tobacco products] versus similarity-framing (S-F; ads highlighting congruence with conventional cigarettes and/or smokeless tobacco products) in e-cigarette and snus ads on young adult smokers’ and non-smokers’ ad- and product-related perceptions. Methods One thousand fifty one (1,051) young adults (18–24 years; 76% women; 50% smokers) from existing consumer panels were recruited in a within-subjects quasi-experiment. Each participant viewed 4 online advertisements, varied by tobacco product type (e-cigarette or snus) and ad framing (C-F or S-F). The dependent measures for this study were ad-related (ad perceptions, ad credibility) and product-related perceptions (absolute and comparative risk perceptions, product appeal, and product use intentions). Results Former and current smokers rated C-F ads as more persuasive than S-F ads, as evidenced by favorable ad perceptions and high product use intentions. Former and current smokers also rated e-cigarette ads with more favorable ad perceptions, low absolute and comparative risk perceptions, high product appeal, and high product use intentions as compared to snus ads. However, the effect sizes of the significant differences are less than 0.2, indicating a small magnitude of difference between the study variables. Conclusions Unless the FDA regulates e-cig and snus advertising, there is a potential for decreasing risk perceptions and increasing use of e-cigs among young adults. Further research on implicit/explicit comparative claims in e-cigarettes and snus advertisements that encourage risk misperceptions is recommended. PMID:28042597
Banerjee, Smita C; Greene, Kathryn; Li, Yuelin; Ostroff, Jamie S
2016-07-01
This study examined the effects of comparative-framing [C-F; ads highlighting differences between the advertised product and conventional cigarettes and/or smokeless tobacco products] versus similarity-framing (S-F; ads highlighting congruence with conventional cigarettes and/or smokeless tobacco products) in e-cigarette and snus ads on young adult smokers' and non-smokers' ad- and product-related perceptions. One thousand fifty one (1,051) young adults (18-24 years; 76% women; 50% smokers) from existing consumer panels were recruited in a within-subjects quasi-experiment. Each participant viewed 4 online advertisements, varied by tobacco product type (e-cigarette or snus) and ad framing (C-F or S-F). The dependent measures for this study were ad-related (ad perceptions, ad credibility) and product-related perceptions (absolute and comparative risk perceptions, product appeal, and product use intentions). Former and current smokers rated C-F ads as more persuasive than S-F ads, as evidenced by favorable ad perceptions and high product use intentions. Former and current smokers also rated e-cigarette ads with more favorable ad perceptions, low absolute and comparative risk perceptions, high product appeal, and high product use intentions as compared to snus ads. However, the effect sizes of the significant differences are less than 0.2, indicating a small magnitude of difference between the study variables. Unless the FDA regulates e-cig and snus advertising, there is a potential for decreasing risk perceptions and increasing use of e-cigs among young adults. Further research on implicit/explicit comparative claims in e-cigarettes and snus advertisements that encourage risk misperceptions is recommended.
17 CFR 402.2a - Appendix A-Calculation of market risk haircut for purposes of § 402.2(g)(2).
Code of Federal Regulations, 2014 CFR
2014-04-01
[Truncated excerpt] The netting step described in the appendix assigns the netted amount to the category of the larger (in absolute value) of the two residual position interim haircuts being netted, and a zero to the category of the other interim haircut.
Metabolic syndrome, diet and exercise.
De Sousa, Sunita M C; Norman, Robert J
2016-11-01
Polycystic ovary syndrome (PCOS) is associated with a range of metabolic complications including insulin resistance (IR), obesity, dyslipidaemia, hypertension, obstructive sleep apnoea (OSA) and non-alcoholic fatty liver disease. These compound risks result in a high prevalence of metabolic syndrome and possibly increased cardiovascular (CV) disease. As the cardiometabolic risk of PCOS is shared amongst the different diagnostic systems, all women with PCOS should undergo metabolic surveillance though the precise approach differs between guidelines. Lifestyle interventions consisting of increased physical activity and caloric restriction have been shown to improve both metabolic and reproductive outcomes. Pharmacotherapy and bariatric surgery may be considered in resistant metabolic disease. Issues requiring further research include the natural history of PCOS-associated metabolic disease, absolute CV risk and comparative efficacy of lifestyle interventions. Copyright © 2016 Elsevier Ltd. All rights reserved.
The absolute disparity anomaly and the mechanism of relative disparities.
Chopin, Adrien; Levi, Dennis; Knill, David; Bavelier, Daphne
2016-06-01
There has been a long-standing debate about the mechanisms underlying the perception of stereoscopic depth and the computation of the relative disparities that it relies on. Relative disparities between visual objects could be computed in two ways: (a) using the difference in the object's absolute disparities (Hypothesis 1) or (b) using relative disparities based on the differences in the monocular separations between objects (Hypothesis 2). To differentiate between these hypotheses, we measured stereoscopic discrimination thresholds for lines with different absolute and relative disparities. Participants were asked to judge the depth of two lines presented at the same distance from the fixation plane (absolute disparity) or the depth between two lines presented at different distances (relative disparity). We used a single stimulus method involving a unique memory component for both conditions, and no extraneous references were available. We also measured vergence noise using Nonius lines. Stereo thresholds were substantially worse for absolute disparities than for relative disparities, and the difference could not be explained by vergence noise. We attribute this difference to an absence of conscious readout of absolute disparities, termed the absolute disparity anomaly. We further show that the pattern of correlations between vergence noise and absolute and relative disparity acuities can be explained jointly by the existence of the absolute disparity anomaly and by the assumption that relative disparity information is computed from absolute disparities (Hypothesis 1).
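The algebraic equivalence that makes the two hypotheses hard to separate behaviourally can be written out explicitly (notation mine, not the authors'): with x_L and x_R the horizontal image positions of an object in the left and right eyes,

\[
d_{\text{rel}}(A,B) \;=\; \bigl(x_L^{A}-x_R^{A}\bigr)-\bigl(x_L^{B}-x_R^{B}\bigr) \;=\; \bigl(x_L^{A}-x_L^{B}\bigr)-\bigl(x_R^{A}-x_R^{B}\bigr),
\]

where the first form is the difference of the two objects' absolute disparities (Hypothesis 1) and the second the difference of their monocular separations (Hypothesis 2). Because the two expressions are identical, the hypotheses differ only in which intermediate quantities the visual system is assumed to encode, which is why the authors turn to threshold and vergence-noise measurements to separate them.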
The absolute disparity anomaly and the mechanism of relative disparities
Chopin, Adrien; Levi, Dennis; Knill, David; Bavelier, Daphne
2016-01-01
There has been a long-standing debate about the mechanisms underlying the perception of stereoscopic depth and the computation of the relative disparities that it relies on. Relative disparities between visual objects could be computed in two ways: (a) using the difference in the object's absolute disparities (Hypothesis 1) or (b) using relative disparities based on the differences in the monocular separations between objects (Hypothesis 2). To differentiate between these hypotheses, we measured stereoscopic discrimination thresholds for lines with different absolute and relative disparities. Participants were asked to judge the depth of two lines presented at the same distance from the fixation plane (absolute disparity) or the depth between two lines presented at different distances (relative disparity). We used a single stimulus method involving a unique memory component for both conditions, and no extraneous references were available. We also measured vergence noise using Nonius lines. Stereo thresholds were substantially worse for absolute disparities than for relative disparities, and the difference could not be explained by vergence noise. We attribute this difference to an absence of conscious readout of absolute disparities, termed the absolute disparity anomaly. We further show that the pattern of correlations between vergence noise and absolute and relative disparity acuities can be explained jointly by the existence of the absolute disparity anomaly and by the assumption that relative disparity information is computed from absolute disparities (Hypothesis 1). PMID:27248566
Quantifying Treatment Benefit in Molecular Subgroups to Assess a Predictive Biomarker
Iasonos, Alexia; Chapman, Paul B.; Satagopan, Jaya M.
2016-01-01
There is an increased interest in finding predictive biomarkers that can guide treatment options for both mutation carriers and non-carriers. The statistical assessment of variation in treatment benefit (TB) according to the biomarker carrier status plays an important role in evaluating predictive biomarkers. For time to event endpoints, the hazard ratio (HR) for interaction between treatment and a biomarker from a Proportional Hazards regression model is commonly used as a measure of variation in treatment benefit. While this can be easily obtained using available statistical software packages, the interpretation of HR is not straightforward. In this article, we propose different summary measures of variation in TB on the scale of survival probabilities for evaluating a predictive biomarker. The proposed summary measures can be easily interpreted as quantifying differential in TB in terms of relative risk or excess absolute risk due to treatment in carriers versus non-carriers. We illustrate the use and interpretation of the proposed measures using data from completed clinical trials. We encourage clinical practitioners to interpret variation in TB in terms of measures based on survival probabilities, particularly in terms of excess absolute risk, as opposed to HR. PMID:27141007
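As a rough numerical illustration of the kind of summary the authors advocate (the survival probabilities below are invented, and the paper's exact estimators may differ), treatment benefit in each biomarker group can be expressed on the probability scale and then contrasted between carriers and non-carriers:

    # Hypothetical 2-year event-free survival probabilities by arm and carrier status.
    surv = {("carrier", "treated"): 0.80, ("carrier", "control"): 0.60,
            ("noncarrier", "treated"): 0.72, ("noncarrier", "control"): 0.68}

    def treatment_benefit(group):
        # Absolute treatment benefit: event risk under control minus under treatment.
        risk_control = 1 - surv[(group, "control")]
        risk_treated = 1 - surv[(group, "treated")]
        return risk_control - risk_treated

    tb_carrier = treatment_benefit("carrier")        # about 0.20
    tb_noncarrier = treatment_benefit("noncarrier")  # about 0.04

    # Variation in benefit, as an excess absolute risk reduction and as a ratio.
    print(round(tb_carrier - tb_noncarrier, 2))  # ~0.16 excess absolute benefit in carriers
    print(round(tb_carrier / tb_noncarrier, 1))  # ~5-fold relative benefit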
The effect of household poverty on tuberculosis.
Siroka, A; Law, I; Macinko, J; Floyd, K; Banda, R P; Hoa, N B; Tsolmon, B; Chanda-Kapata, P; Gasana, M; Lwinn, T; Senkoro, M; Tupasi, T; Ponce, N A
2016-12-01
SETTING: Households in Malawi, Mongolia, Myanmar, the Philippines, Rwanda, Tanzania, Viet Nam and Zambia. OBJECTIVE: To assess the relationship between household socio-economic level, both relative and absolute, and individual tuberculosis (TB) disease. We analysed national TB prevalence surveys from eight countries individually and in pooled multicountry models. Socio-economic level (SEL) was measured in terms of both relative household position and absolute wealth. The outcome of interest was whether or not an individual had TB disease. Logistic regression models were used to control for putative risk factors for TB disease such as age, sex and previous treatment history. Overall, a strong and consistent association between household SEL and individual TB disease was not found. Significant results were found in four individual country models, with the lowest socio-economic quintile being associated with higher TB risk in Mongolia, Myanmar, Tanzania and Viet Nam. TB prevalence surveys are designed to assess prevalence of disease and, due to the small numbers of cases usually detected, may not be the most efficient means of investigating TB risk factors. Different designs are needed, including measuring the SEL of individuals in nested case-control studies within TB prevalence surveys or among TB patients seeking treatment in health care facilities.
Weinberg, Ido; Gona, Philimon; O'Donnell, Christopher J; Jaff, Michael R; Murabito, Joanne M
2014-03-01
An increased interarm systolic blood pressure difference is an easily determined physical examination finding. The relationship between interarm systolic blood pressure difference and risk of future cardiovascular disease is uncertain. We described the prevalence and risk factor correlates of interarm systolic blood pressure difference in the Framingham Heart Study (FHS) original and offspring cohorts and examined the association between interarm systolic blood pressure difference and incident cardiovascular disease and all-cause mortality. An increased interarm systolic blood pressure difference was defined as ≥ 10 mm Hg using the average of initial and repeat blood pressure measurements obtained in both arms. Participants were followed through 2010 for incident cardiovascular disease events. Multivariable Cox proportional hazards regression analyses were performed to investigate the effect of interarm systolic blood pressure difference on incident cardiovascular disease. We examined 3390 (56.3% female) participants aged 40 years and older, free of cardiovascular disease at baseline, mean age of 61.1 years, who attended a FHS examination between 1991 and 1994 (original cohort) and from 1995 to 1998 (offspring cohort). The mean absolute interarm systolic blood pressure difference was 4.6 mm Hg (range 0-78). Increased interarm systolic blood pressure difference was present in 317 (9.4%) participants. The median follow-up time was 13.3 years, during which time 598 participants (17.6%) experienced a first cardiovascular event, including 83 (26.2%) participants with interarm systolic blood pressure difference ≥ 10 mm Hg. Compared with those with normal interarm systolic blood pressure difference, participants with an elevated interarm systolic blood pressure difference were older (63.0 years vs 60.9 years), had a greater prevalence of diabetes mellitus (13.3% vs 7.5%,), higher systolic blood pressure (136.3 mm Hg vs 129.3 mm Hg), and a higher total cholesterol level (212.1 mg/dL vs 206.5 mg/dL). Interarm systolic blood pressure difference was associated with a significantly increased hazard of incident cardiovascular events in the multivariable adjusted model (hazard ratio 1.38; 95% CI, 1.09-1.75). For each 1-SD-unit increase in absolute interarm systolic blood pressure difference, the hazard ratio for incident cardiovascular events was 1.07 (95% CI, 1.00-1.14) in the fully adjusted model. There was no such association with mortality (hazard ratio 1.02; 95% CI 0.76-1.38). In this community-based cohort, an interarm systolic blood pressure difference is common and associated with a significant increased risk for future cardiovascular events, even when the absolute difference in arm systolic blood pressure is modest. These findings support research to expand clinical use of this simple measurement. Copyright © 2014 Elsevier Inc. All rights reserved.
Kim, Kyoung Bog; Oh, Mi Kyeong; Kim, Haa Gyoung; Ki, Ji Hoon; Lee, Soo Hee; Kim, Su Min
2013-03-01
It has traditionally been known that there is normally a difference in blood pressure (BP) between the two arms; there is at least 20 mm Hg difference in the systolic blood pressure (SBP) and 10 mm Hg difference in the diastolic blood pressure (DBP). However, recent epidemiologic studies have shown that there are between-arm differences of < 5 mm Hg in simultaneous BP measurements. The purposes of this study are to examine whether there are between-arm differences in simultaneous BP measurements obtained from ambulatory patients without cardiovascular diseases and to identify the factors associated with these differences. We examined 464 patients who visited the outpatient clinic of Gangneung Asan Hospital clinical department. For the current analysis, we excluded patients with ischemic heart disease, stroke, arrhythmia, congestive heart failure, or hyperthyroidism. Simultaneous BP measurements were obtained using the Omron MX3 BP monitor in both arms. The inter-arm difference (IAD) in BP was expressed as the relative difference (right-arm BP [R] minus left-arm BP [L]: R - L) and the absolute difference (|R - L|). The mean absolute IADs in SBP and DBP were 3.19 ± 2.38 and 2.41 ± 1.59 mm Hg, respectively, in men and 2.61 ± 2.18 and 2.25 ± 2.01 mm Hg, respectively, in women. In men, 83.8% of patients had an IAD in SBP of ≤ 6 mm Hg, 98.1% an IAD in SBP of ≤ 10 mm Hg, 96.5% an IAD in DBP of ≤ 6 mm Hg, and 0% an IAD in DBP of > 10 mm Hg. In women, 89.6% of patients had an IAD in SBP of ≤ 6 mm Hg, 92.1% an IAD in DBP of ≤ 6 mm Hg, and 0% an IAD in SBP of > 10 mm Hg or an IAD in DBP of > 10 mm Hg. This Gangneung Asan Hospital clinical series showed that the absolute IAD in SBP had a significant correlation with cardiovascular risk factors such as the 10-year Framingham cardiac risk scores and higher BP in men, and with higher BP in women. However, the absolute IADs in SBP and DBP had no significant correlation with age, obesity, smoking, drinking, hyperlipidemia, diabetes, metabolic syndrome, or renal function. Our results showed that there were no significant between-arm differences in simultaneous BP measurements. It was also shown that most of the ambulatory patients without cardiovascular diseases had an IAD in SBP of < 10 mm Hg and an IAD in DBP of < 6 mm Hg.
Kim, Kyoung Bog; Kim, Haa Gyoung; Ki, Ji Hoon; Lee, Soo Hee; Kim, Su Min
2013-01-01
Background It has traditionally been known that there is normally a difference in blood pressure (BP) between the two arms, with differences of at least 20 mm Hg in the systolic blood pressure (SBP) and 10 mm Hg in the diastolic blood pressure (DBP). However, recent epidemiologic studies have shown that there are between-arm differences of < 5 mm Hg in simultaneous BP measurements. The purposes of this study are to examine whether there are between-arm differences in simultaneous BP measurements obtained from ambulatory patients without cardiovascular diseases and to identify the factors associated with these differences. Methods We examined 464 patients who visited the outpatient clinic of Gangneung Asan Hospital clinical department. For the current analysis, we excluded patients with ischemic heart disease, stroke, arrhythmia, congestive heart failure, or hyperthyroidism. Simultaneous BP measurements were obtained using the Omron MX3 BP monitor in both arms. The inter-arm difference (IAD) in BP was expressed as the relative difference (right-arm BP [R] minus left-arm BP [L]: R - L) and the absolute difference (|R - L|). Results The mean absolute IADs in SBP and DBP were 3.19 ± 2.38 and 2.41 ± 1.59 mm Hg, respectively, in men and 2.61 ± 2.18 and 2.25 ± 2.01 mm Hg, respectively, in women. In men, 83.8% of patients had an IAD in SBP of ≤ 6 mm Hg, 98.1% had an IAD in SBP of ≤ 10 mm Hg, 96.5% had an IAD in DBP of ≤ 6 mm Hg, and 0% had an IAD in DBP of > 10 mm Hg. In women, 89.6% of patients had an IAD in SBP of ≤ 6 mm Hg, 92.1% had an IAD in DBP of ≤ 6 mm Hg, and 0% had an IAD in SBP of > 10 mm Hg or an IAD in DBP of > 10 mm Hg. In this Gangneung Asan Hospital clinical series, the absolute IAD in SBP had a significant correlation with cardiovascular risk factors such as the 10-year Framingham cardiac risk score and higher BP in men, and with higher BP in women. However, the absolute IAD in SBP and DBP had no significant correlation with age, obesity, smoking, drinking, hyperlipidemia, diabetes, metabolic syndrome, and renal function. Conclusion Our results showed that there were no significant between-arm differences in simultaneous BP measurements. It was also shown that most of the ambulatory patients without cardiovascular diseases had an IAD in SBP of < 10 mm Hg and an IAD in DBP of < 6 mm Hg. PMID:23560208
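The inter-arm difference arithmetic used above is simple to reproduce; the sketch below computes the relative and absolute IAD from paired simultaneous readings and the share of patients under the common cut-offs, with purely illustrative values rather than the study data.

```python
# Sketch: relative (R - L) and absolute (|R - L|) inter-arm differences in SBP,
# plus the proportion of patients below common thresholds. Values are illustrative.
import numpy as np

sbp_right = np.array([128.0, 141.0, 119.0, 135.0])
sbp_left  = np.array([125.0, 138.0, 123.0, 133.0])

relative_iad = sbp_right - sbp_left      # sign retained (right minus left)
absolute_iad = np.abs(relative_iad)      # |R - L|

print("mean absolute IAD in SBP: %.2f mm Hg" % absolute_iad.mean())
print("share with IAD <= 6 mm Hg: %.1f%%" % (100 * (absolute_iad <= 6).mean()))
print("share with IAD <= 10 mm Hg: %.1f%%" % (100 * (absolute_iad <= 10).mean()))
```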
NASA Astrophysics Data System (ADS)
Nishiyama, N.
2001-12-01
The absolute return strategy offered through fund of funds (FOFs) investment schemes is a current focus of the Japanese financial community. FOFs investment consists mainly of hedge fund investment and has two major characteristics: low correlation with benchmark indices and little sensitivity to external changes in the environment while maximizing return. The historical track record of surviving hedge funds shows that they maintain stable, high returns with low risk. However, one must keep in mind that low risk is not the same as risk free. The failure of Long-Term Capital Management (LTCM) in the summer of 1998 was a symbolic example: it exposed a limitation of traditional value at risk (VaR) and the possibility that traditional VaR can be ineffective against nonlinear fluctuations in the market. In this paper, I try to bring self-organized criticality (SOC), well known as a model of decay in the natural world, into portfolio risk control. I analyze nonlinear market fluctuations as SOC and apply SOC to capture complicated market movements, using the SOC threshold point and risk adjustments by scenario correlation as implicit signals. The threshold becomes the control parameter of risk exposure, setting a downside floor and forecasting extreme nonlinear fluctuations under a given probability. Simulation results show a synergy effect between SOC-based portfolio risk control and the absolute return strategy.
Mega, J L; Stitziel, N O; Smith, J G; Chasman, D I; Caulfield, M; Devlin, J J; Nordio, F; Hyde, C; Cannon, C P; Sacks, F; Poulter, N; Sever, P; Ridker, P M; Braunwald, E; Melander, O; Kathiresan, S; Sabatine, M S
2015-06-06
Genetic variants have been associated with the risk of coronary heart disease. In this study, we tested whether or not a composite of these variants could ascertain the risk of both incident and recurrent coronary heart disease events and identify those individuals who derive greater clinical benefit from statin therapy. A community-based cohort study (the Malmo Diet and Cancer Study) and four randomised controlled trials of both primary prevention (JUPITER and ASCOT) and secondary prevention (CARE and PROVE IT-TIMI 22) with statin therapy, comprising a total of 48,421 individuals and 3477 events, were included in these analyses. We studied the association of a genetic risk score based on 27 genetic variants with incident or recurrent coronary heart disease, adjusting for traditional clinical risk factors. We then investigated the relative and absolute risk reductions in coronary heart disease events with statin therapy stratified by genetic risk. We combined data from the different studies using a meta-analysis. When individuals were divided into low (quintile 1), intermediate (quintiles 2-4), and high (quintile 5) genetic risk categories, a significant gradient in risk for incident or recurrent coronary heart disease was shown. Compared with the low genetic risk category, the multivariable-adjusted hazard ratio for coronary heart disease for the intermediate genetic risk category was 1·34 (95% CI 1·22-1·47, p<0·0001) and that for the high genetic risk category was 1·72 (1·55-1·92, p<0·0001). In terms of the benefit of statin therapy in the four randomised trials, we noted a significant gradient (p=0·0277) of increasing relative risk reductions across the low (13%), intermediate (29%), and high (48%) genetic risk categories. Similarly, we noted greater absolute risk reductions in those individuals in higher genetic risk categories (p=0·0101), resulting in a roughly threefold decrease in the number needed to treat to prevent one coronary heart disease event in the primary prevention trials. Specifically, in the primary prevention trials, the number needed to treat to prevent one such event in 10 years was 66 in people at low genetic risk, 42 in those at intermediate genetic risk, and 25 in those at high genetic risk in JUPITER, and 57, 47, and 20, respectively, in ASCOT. A genetic risk score identified individuals at increased risk for both incident and recurrent coronary heart disease events. People with the highest burden of genetic risk derived the largest relative and absolute clinical benefit from statin therapy. National Institutes of Health. Copyright © 2015 Elsevier Ltd. All rights reserved.
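The number-needed-to-treat arithmetic behind the trial results above follows directly from combining a baseline absolute risk with a relative risk reduction; in the sketch below the baseline 10-year risks are hypothetical, and the relative reductions simply reuse the figures quoted in the abstract for illustration.

```python
# Sketch: NNT = 1 / ARR, where ARR = baseline absolute risk x relative risk reduction.
# Baseline 10-year risks are hypothetical; relative reductions echo the abstract.
strata = {
    "low genetic risk":          {"baseline_10yr_risk": 0.05, "rrr": 0.13},
    "intermediate genetic risk": {"baseline_10yr_risk": 0.08, "rrr": 0.29},
    "high genetic risk":         {"baseline_10yr_risk": 0.12, "rrr": 0.48},
}
for name, s in strata.items():
    arr = s["baseline_10yr_risk"] * s["rrr"]   # absolute risk reduction with statin
    nnt = 1 / arr                              # number needed to treat over 10 years
    print(f"{name}: ARR = {arr:.3f}, NNT = {nnt:.0f}")
```

Higher baseline risk and a larger relative reduction compound, which is why the NNT falls several-fold from the low to the high genetic risk stratum.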
den Ruijter, H M; Peters, S A E; Groenewegen, K A; Anderson, T J; Britton, A R; Dekker, J M; Engström, G; Eijkemans, M J; Evans, G W; de Graaf, J; Grobbee, D E; Hedblad, B; Hofman, A; Holewijn, S; Ikeda, A; Kavousi, M; Kitagawa, K; Kitamura, A; Koffijberg, H; Ikram, M A; Lonn, E M; Lorenz, M W; Mathiesen, E B; Nijpels, G; Okazaki, S; O'Leary, D H; Polak, J F; Price, J F; Robertson, C; Rembold, C M; Rosvall, M; Rundek, T; Salonen, J T; Sitzer, M; Stehouwer, C D A; Witteman, J C; Moons, K G; Bots, M L
2013-07-01
The aim of this work was to investigate whether measurement of the mean common carotid intima-media thickness (CIMT) improves cardiovascular risk prediction in individuals with diabetes. We performed a subanalysis among 4,220 individuals with diabetes in a large ongoing individual participant data meta-analysis involving 56,194 subjects from 17 population-based cohorts worldwide. We first refitted the risk factors of the Framingham heart risk score on the individuals without previous cardiovascular disease (baseline model) and then expanded this model with the mean common CIMT (CIMT model). The absolute 10 year risk for developing a myocardial infarction or stroke was estimated from both models. In individuals with diabetes we compared discrimination and calibration of the two models. Reclassification of individuals with diabetes was based on allocation to another cardiovascular risk category when mean common CIMT was added. During a median follow-up of 8.7 years, 684 first-time cardiovascular events occurred among the population with diabetes. The C statistic was 0.67 for the Framingham model and 0.68 for the CIMT model. The absolute 10 year risk for developing a myocardial infarction or stroke was 16% in both models. There was no net reclassification improvement with the addition of mean common CIMT (1.7%; 95% CI -1.8, 3.8). There were no differences in the results between men and women. There is no improvement in risk prediction in individuals with diabetes when measurement of the mean common CIMT is added to the Framingham risk score. Therefore, this measurement is not recommended for improving individual cardiovascular risk stratification in individuals with diabetes.
Assi, Valentina; Massat, Nathalie J; Thomas, Susan; MacKay, James; Warwick, Jane; Kataoka, Masako; Warsi, Iqbal; Brentnall, Adam; Warren, Ruth; Duffy, Stephen W
2015-05-15
Mammographic density is a strong risk factor for breast cancer, but its potential application in risk management is not clear, partly due to uncertainties about its interaction with other breast cancer risk factors. We aimed to quantify the impact of mammographic density on breast cancer risk in women aged 40-49 at intermediate familial risk of breast cancer (average lifetime risk of 23%), in particular in premenopausal women, and to investigate its relationship with other breast cancer risk factors in this population. We present the results from a case-control study nested within the FH01 cohort study of 6,710 women mostly aged 40-49 at intermediate familial risk of breast cancer. One hundred and three cases of breast cancer were age-matched to one or two controls. Density was measured by semiautomated interactive thresholding. Absolute density, but not percent density, was a significant risk factor for breast cancer in this population after adjusting for area of nondense tissue (OR per 10 cm(2) = 1.07, 95% CI 1.00-1.15, p = 0.04). The effect was stronger in premenopausal women, who made up the majority of the study population. Absolute density remained a significant predictor of breast cancer risk after adjusting for age at menarche, age at first live birth, parity, past or present hormone replacement therapy, and the Tyrer-Cuzick 10-year relative risk estimate of breast cancer. Absolute density can improve breast cancer risk stratification and delineation of high-risk groups alongside the Tyrer-Cuzick 10-year relative risk estimate. © 2014 UICC.
Lies, Damned Lies, and Health Inequality Measurements
Gerdtham, Ulf-G; Petrie, Dennis
2015-01-01
Measuring and monitoring socioeconomic health inequalities are critical for understanding the impact of policy decisions. However, the measurement of health inequality is far from value neutral, and one can easily present the measure that best supports one’s chosen conclusion or selectively exclude measures. Improving people’s understanding of the often implicit value judgments is therefore important to reduce the risk that researchers mislead or policymakers are misled. While the choice between relative and absolute inequality is already value laden, further complexities arise when, as is often the case, health variables have both a lower and upper bound, and thus can be expressed in terms of either attainments or shortfalls, such as for mortality/survival. We bring together the recent parallel discussions from epidemiology and health economics regarding health inequality measurement and provide a deeper understanding of the different value judgments within absolute and relative measures expressed both in attainments and shortfalls, by graphically illustrating both hypothetical and real examples. We show that relative measures in terms of attainments and shortfalls have distinct value judgments, highlighting that for health variables with two bounds the choice is no longer only between an absolute and a relative measure but between an absolute, an attainment-relative and a shortfall-relative one. We illustrate how these three value judgments can be combined onto a single graph which shows the rankings according to all three measures, and illustrates how the three measures provide ethical benchmarks against which to judge the difference in inequality between populations. PMID:26133019
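A small numeric sketch of the three value judgments discussed above, for a bounded outcome such as survival; the group-level attainments are hypothetical and are chosen so that the two populations rank differently on the attainment-relative and shortfall-relative measures.

```python
# Sketch: absolute, attainment-relative and shortfall-relative inequality between a
# worse-off and a better-off group for a bounded outcome (survival proportions).
def inequality_measures(p_low, p_high):
    absolute_gap        = p_high - p_low               # absolute inequality
    attainment_relative = p_high / p_low               # relative, in attainments
    shortfall_relative  = (1 - p_low) / (1 - p_high)   # relative, in shortfalls
    return absolute_gap, attainment_relative, shortfall_relative

# Hypothetical populations with the same absolute gap but opposite relative rankings
for label, (p_low, p_high) in {"population A": (0.60, 0.70),
                               "population B": (0.85, 0.95)}.items():
    a, ar, sr = inequality_measures(p_low, p_high)
    print(f"{label}: absolute {a:.2f}, "
          f"attainment-relative {ar:.2f}, shortfall-relative {sr:.2f}")
```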
Colon Cancer Risk Assessment - Gauss Program
An executable file (in GAUSS) that projects absolute colon cancer risk (with confidence intervals) according to NCI’s Colorectal Cancer Risk Assessment Tool (CCRAT) algorithm. GAUSS is not needed to run the program.
Sex as a Biological Variable: Who, What, When, Why, and How.
Bale, Tracy L; Epperson, C Neill
2017-01-01
The inclusion of sex as a biological variable in research is absolutely essential for improving our understanding of disease mechanisms contributing to risk and resilience. Studies focusing on examining sex differences have demonstrated across many levels of analyses and stages of brain development and maturation that males and females can differ significantly. This review will discuss examples of animal models and clinical studies to provide guidance and reference for the inclusion of sex as an important biological variable relevant to a Neuropsychopharmacology audience.
Projecting Individualized Absolute Invasive Breast Cancer Risk in US Hispanic Women.
Banegas, Matthew P; John, Esther M; Slattery, Martha L; Gomez, Scarlett Lin; Yu, Mandi; LaCroix, Andrea Z; Pee, David; Chlebowski, Rowan T; Hines, Lisa M; Thompson, Cynthia A; Gail, Mitchell H
2017-02-01
There is no model to estimate absolute invasive breast cancer risk for Hispanic women. The San Francisco Bay Area Breast Cancer Study (SFBCS) provided data on Hispanic breast cancer case patients (533 US-born, 553 foreign-born) and control participants (464 US-born, 947 foreign-born). These data yielded estimates of relative risk (RR) and attributable risk (AR) separately for US-born and foreign-born women. Nativity-specific absolute risks were estimated by combining RR and AR information with nativity-specific invasive breast cancer incidence and competing mortality rates from the California Cancer Registry and Surveillance, Epidemiology, and End Results program to develop the Hispanic risk model (HRM). In independent data, we assessed model calibration through observed/expected (O/E) ratios, and we estimated discriminatory accuracy with the area under the receiver operating characteristic curve (AUC) statistic. The US-born HRM included age at first full-term pregnancy, biopsy for benign breast disease, and family history of breast cancer; the foreign-born HRM also included age at menarche. The HRM estimated lower risks than the National Cancer Institute's Breast Cancer Risk Assessment Tool (BCRAT) for US-born Hispanic women, but higher risks in foreign-born women. In independent data from the Women's Health Initiative, the HRM was well calibrated for US-born women (observed/expected [O/E] ratio = 1.07, 95% confidence interval [CI] = 0.81 to 1.40), but seemed to overestimate risk in foreign-born women (O/E ratio = 0.66, 95% CI = 0.41 to 1.07). The AUC was 0.564 (95% CI = 0.485 to 0.644) for US-born and 0.625 (95% CI = 0.487 to 0.764) for foreign-born women. The HRM is the first absolute risk model that is based entirely on data specific to Hispanic women by nativity. Further studies in Hispanic women are warranted to evaluate its validity. Published by Oxford University Press 2016. This work is written by US Government employees and is in the public domain in the US.
Prevalence Incidence Mixture Models
The R package and webtool fit Prevalence Incidence Mixture models to left-censored and irregularly interval-censored time-to-event data of the kind commonly found in screening cohorts assembled from electronic health records. Absolute and relative risk can be estimated under simple random sampling and stratified sampling (both superpopulation and finite-population target populations are supported). Non-parametric (absolute risks only), semi-parametric, weakly parametric (using B-splines), and some fully parametric (such as the logistic-Weibull) models are supported.
Statin therapy and the risk for diabetes among adult women: do the benefits outweigh the risk?
Ma, Yunsheng; Culver, Annie; Rossouw, Jacques; Olendzki, Barbara; Merriam, Philip; Lian, Bill; Ockene, Ira
2013-02-01
The purpose of this selective review was to examine statin therapy and the risk for diabetes among adult women. The literature contains reports of new-onset diabetes associated with statin use. While many studies do not report sex-specific results, there is evidence indicating the risk-to-benefit ratio may vary by gender. However, the absolute effects are not clear because women have historically been under-represented in clinical trials. A review of the literature indicates that the cardiovascular benefits of statins appear to outweigh the risk for statin-related diabetes. However, the effect may depend upon baseline diabetes risk, dose, and statin potency. Rigorous, long-term studies focused on the risks and benefits of statins in women, which could sort out gender-specific differences, are unavailable. Until this changes, individualized attention to risk assessment and strong prevention with lifestyle changes must prevail.
Thromboprophylaxis after Knee Arthroscopy and Lower-Leg Casting.
van Adrichem, Raymond A; Nemeth, Banne; Algra, Ale; le Cessie, Saskia; Rosendaal, Frits R; Schipper, Inger B; Nelissen, Rob G H H; Cannegieter, Suzanne C
2017-02-09
The use of thromboprophylaxis to prevent clinically apparent venous thromboembolism after knee arthroscopy or casting of the lower leg is disputed. We compared the incidence of symptomatic venous thromboembolism after these procedures between patients who received anticoagulant therapy and those who received no anticoagulant therapy. We conducted two parallel, pragmatic, multicenter, randomized, controlled, open-label trials with blinded outcome evaluation: the POT-KAST trial, which included patients undergoing knee arthroscopy, and the POT-CAST trial, which included patients treated with casting of the lower leg. Patients were assigned to receive either a prophylactic dose of low-molecular-weight heparin (for the 8 days after arthroscopy in the POT-KAST trial or during the full period of immobilization due to casting in the POT-CAST trial) or no anticoagulant therapy. The primary outcomes were the cumulative incidences of symptomatic venous thromboembolism and major bleeding within 3 months after the procedure. In the POT-KAST trial, 1543 patients underwent randomization, of whom 1451 were included in the intention-to-treat population. Venous thromboembolism occurred in 5 of the 731 patients (0.7%) in the treatment group and in 3 of the 720 patients (0.4%) in the control group (relative risk, 1.6; 95% confidence interval [CI], 0.4 to 6.8; absolute difference in risk, 0.3 percentage points; 95% CI, -0.6 to 1.2). Major bleeding occurred in 1 patient (0.1%) in the treatment group and in 1 (0.1%) in the control group (absolute difference in risk, 0 percentage points; 95% CI, -0.6 to 0.7). In the POT-CAST trial, 1519 patients underwent randomization, of whom 1435 were included in the intention-to-treat population. Venous thromboembolism occurred in 10 of the 719 patients (1.4%) in the treatment group and in 13 of the 716 patients (1.8%) in the control group (relative risk, 0.8; 95% CI, 0.3 to 1.7; absolute difference in risk, -0.4 percentage points; 95% CI, -1.8 to 1.0). No major bleeding events occurred. In both trials, the most common adverse event was infection. The results of our trials showed that prophylaxis with low-molecular-weight heparin for the 8 days after knee arthroscopy or during the full period of immobilization due to casting was not effective for the prevention of symptomatic venous thromboembolism. (Funded by the Netherlands Organization for Health Research and Development; POT-KAST and POT-CAST ClinicalTrials.gov numbers, NCT01542723 and NCT01542762 , respectively.).
Budoff, Matthew J; Nasir, Khurram; McClelland, Robyn L; Detrano, Robert; Wong, Nathan; Blumenthal, Roger S; Kondos, George; Kronmal, Richard A
2009-01-27
In this study, we aimed to establish whether age-sex-specific percentiles of coronary artery calcium (CAC) predict cardiovascular outcomes better than the actual (absolute) CAC score. The presence and extent of CAC correlates with the overall magnitude of coronary atherosclerotic plaque burden and with the development of subsequent coronary events. MESA (Multi-Ethnic Study of Atherosclerosis) is a prospective cohort study of 6,814 asymptomatic participants followed for coronary heart disease (CHD) events including myocardial infarction, angina, resuscitated cardiac arrest, or CHD death. Time to incident CHD was modeled with Cox regression, and we compared models with percentiles based on age, sex, and/or race/ethnicity to categories commonly used (0, 1 to 100, 101 to 400, 400+ Agatston units). There were 163 (2.4%) incident CHD events (median follow-up 3.75 years). Expressing CAC in terms of age- and sex-specific percentiles had significantly lower area under the receiver-operating characteristic curve (AUC) than when using absolute scores (women: AUC 0.73 versus 0.76, p = 0.044; men: AUC 0.73 versus 0.77, p < 0.001). Akaike's information criterion indicated better model fit with the overall score. Both methods robustly predicted events (>90th percentile associated with a hazard ratio [HR] of 16.4, 95% confidence interval [CI]: 9.30 to 28.9, and score >400 associated with HR of 20.6, 95% CI: 11.8 to 36.0). Within groups based on age-, sex-, and race/ethnicity-specific percentiles there remains a clear trend of increasing risk across levels of the absolute CAC groups. In contrast, once absolute CAC category is fixed, there is no increasing trend across levels of age-, sex-, and race/ethnicity-specific categories. Patients with low absolute scores are low-risk, regardless of age-, sex-, and race/ethnicity-specific percentile rank. Persons with an absolute CAC score of >400 are high risk, regardless of percentile rank. Using absolute CAC in standard groups performed better than age-, sex-, and race/ethnicity-specific percentiles in terms of model fit and discrimination. We recommend using cut points based on the absolute CAC amount, and the common CAC cut points of 100 and 400 seem to perform well.
Laurent, Olivier; Ancelet, Sophie; Richardson, David B; Hémon, Denis; Ielsch, Géraldine; Demoury, Claire; Clavel, Jacqueline; Laurier, Dominique
2013-05-01
Previous epidemiological studies and quantitative risk assessments (QRA) have suggested that natural background radiation may be a cause of childhood leukemia. The present work uses a QRA approach to predict the excess risk of childhood leukemia in France related to three components of natural radiation: radon, cosmic rays and terrestrial gamma rays, using excess relative and absolute risk models proposed by the United Nations Scientific Committee on the Effects of Atomic Radiation (UNSCEAR). Both models were developed from the Life Span Study (LSS) of Japanese A-bomb survivors. Previous risk assessments were extended by considering uncertainties in radiation-related leukemia risk model parameters as part of this process, within a Bayesian framework. Estimated red bone marrow doses cumulated during childhood by the average French child due to radon, terrestrial gamma and cosmic rays are 4.4, 7.5 and 4.3 mSv, respectively. The excess fractions of cases (expressed as percentages) associated with these sources of natural radiation are 20 % [95 % credible interval (CI) 0-68 %] and 4 % (95 % CI 0-11 %) under the excess relative and excess absolute risk models, respectively. The large CIs, as well as the different point estimates obtained under these two models, highlight the uncertainties in predictions of radiation-related childhood leukemia risks. These results are only valid provided that models developed from the LSS can be transferred to the population of French children and to chronic natural radiation exposures, and must be considered in view of the currently limited knowledge concerning other potential risk factors for childhood leukemia. Last, they emphasize the need for further epidemiological investigations of the effects of natural radiation on childhood leukemia to reduce uncertainties and help refine radiation protection standards.
Thyroid neoplasia following low-dose radiation in childhood
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ron, E.; Modan, B.; Preston, D.
1989-12-01
The thyroid gland is highly sensitive to the carcinogenic effects of ionizing radiation. Previously, we reported a significant increase of thyroid cancer and adenomas among 10,834 persons in Israel who received radiotherapy to the scalp for ringworm. These findings have now been extended with further follow-up and revised dosimetry. Overall, 98 thyroid tumors were identified among the exposed and 57 among 10,834 nonexposed matched population and 5392 sibling comparison subjects. An estimated thyroid dose of 9 cGy was linked to a fourfold (95% CI = 2.3-7.9) increase of malignant tumors and a twofold (95% CI = 1.3-3.0) increase of benign tumors. The dose-response relationship was consistent with linearity. Age was an important modifier of risk with those exposed under 5 years being significantly more prone to develop thyroid tumors than older children. The pattern of radiation risk over time could be described on the basis of a constant multiplication of the background rate, and an absolute risk model was not compatible with the observed data. Overall, the excess relative risk per cGy for thyroid cancer development after childhood exposure is estimated as 0.3, and the absolute excess risk as 13 per 10(6) PY-cGy. For benign tumors the estimated excess relative risk was 0.1 per cGy and the absolute risk was 15 per 10(6) PY-cGy.
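The two risk coefficients reported above can be turned into expected excess case counts; the sketch below does this arithmetic with the abstract's coefficients but with a hypothetical background rate and follow-up size, so the outputs are illustrative only.

```python
# Sketch: expected excess thyroid tumours under a relative-risk and an absolute-risk
# formulation. ERR per cGy (0.3) and EAR (13 per 1e6 PY-cGy) are from the abstract;
# the background rate, mean dose and person-years are hypothetical.
err_per_cgy = 0.3                 # excess relative risk per cGy
ear_per_1e6_py_cgy = 13.0         # excess absolute risk per 10^6 person-year-cGy
mean_dose_cgy = 9.0               # estimated thyroid dose from the abstract
background_rate = 2e-5            # hypothetical baseline thyroid cancer rate per PY
person_years = 250_000            # hypothetical cohort follow-up

excess_relative = background_rate * err_per_cgy * mean_dose_cgy * person_years
excess_absolute = ear_per_1e6_py_cgy * mean_dose_cgy * person_years / 1e6

print(f"excess cases, relative-risk model: {excess_relative:.1f}")
print(f"excess cases, absolute-risk model: {excess_absolute:.1f}")
```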
Citrome, Leslie
2009-09-01
Pharmaceutical product labeling as approved by regulatory agencies includes statements of adverse event risk. Product labels include descriptive statements such as whether events are uncommon or rare, as well as percentage occurrence for more common events. In addition, tables are provided with the frequencies of the latter events for both product and placebo as observed in clinical trials. Competing products are not mentioned in a specific drug's product labeling but indirect comparisons can be made using the corresponding label information for the alternate product. Two types of tools are easily used for this purpose: absolute measures such as number needed to harm (NNH), and relative measures such as relative risk increase (RRI). The calculations for both of these types of quantitative measures are presented using as examples the oral first-line second-generation antipsychotic medications. Among three sample outcomes selected a priori, akathisia, weight gain, and discontinuation from a clinical trial because of an adverse reaction, there appear to be differences among the different antipsychotics versus placebo. Aripiprazole was associated with the highest risk for akathisia, particularly when used as adjunctive treatment of major depressive disorder (NNH 5, 95% CI 4-7; RRI 525%, 95% CI 267%-964%). Although insufficient information was available in product labeling to calculate the CI, olanzapine was associated with the highest risk for weight gain of at least 7% from baseline (NNH 6, RRI 640% for adults; NNH 4, RRI 314% for adolescents), and quetiapine for the indication of bipolar depression was associated with the highest risk of discontinuation from a clinical trial because of an adverse reaction (NNH 8, RRI 265% for 600 mg/d; NNH 15, RRI 137% for 300 mg/d). In conclusion, with certain limitations, it is possible for the clinician to extract information from medication product labeling regarding the frequency with which certain adverse reactions can be expected. This supplements, but does not replace, information reported directly in clinical trial reports.
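The two label-based measures named above reduce to two lines of arithmetic; the sketch below uses hypothetical adverse-event frequencies chosen only to illustrate the calculation, not figures taken from any product label.

```python
# Sketch: number needed to harm (NNH) and relative risk increase (RRI) from
# adverse-event frequencies for drug vs placebo. The rates below are hypothetical.
def nnh_and_rri(p_drug, p_placebo):
    ari = p_drug - p_placebo                   # absolute risk increase
    nnh = 1 / ari if ari > 0 else float("inf") # patients exposed per extra event
    rri = (p_drug / p_placebo - 1) * 100       # relative risk increase, in percent
    return nnh, rri

nnh, rri = nnh_and_rri(p_drug=0.25, p_placebo=0.04)
print(f"NNH = {nnh:.0f}, RRI = {rri:.0f}%")    # ~5 and ~525% for these example rates
```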
Hetherington, Erin; McDonald, Sheila; Williamson, Tyler; Patten, Scott B; Tough, Suzanne C
2018-06-19
Low social support is consistently associated with postpartum depression. Previous studies do not always control for previous mental health and do not consider what type of support (tangible, emotional, informational or positive social interaction) is most important. The objectives are: to examine if low social support contributes to subsequent risk of depressive or anxiety symptoms and to determine which type of support is most important. Data from the All Our Families longitudinal pregnancy cohort were used (n=3057). Outcomes were depressive or anxiety symptoms at 4 months and 1 year postpartum. Exposures were social support during pregnancy and at 4 months postpartum. Log binomial models were used to calculate risk ratios (RRs) and absolute risk differences, controlling for past mental health. Low total social support during pregnancy was associated with an increased risk of depressive symptoms (RR 1.50, 95% CI 1.24 to 1.82) and anxiety symptoms (RR 1.63, 95% CI 1.38 to 1.93) at 4 months postpartum. Low total social support at 4 months was associated with an increased risk of anxiety symptoms (RR 1.65, 95% CI 1.31 to 2.09) at 1 year. Absolute risk differences were largest among women with previous mental health challenges, resulting in a number needed to treat of 5 for some outcomes. Emotional/informational support was the most important type of support for postpartum anxiety. Group prenatal care, prenatal education and peer support programmes have the potential to improve social support. Prenatal intervention studies are needed to confirm these findings in higher risk groups. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
Absolutely relative or relatively absolute: violations of value invariance in human decision making.
Teodorescu, Andrei R; Moran, Rani; Usher, Marius
2016-02-01
Making decisions based on relative rather than absolute information processing is tied to choice optimality via the accumulation of evidence differences and to canonical neural processing via accumulation of evidence ratios. These theoretical frameworks predict invariance of decision latencies to absolute intensities that maintain differences and ratios, respectively. While information about the absolute values of the choice alternatives is not necessary for choosing the best alternative, it may nevertheless hold valuable information about the context of the decision. To test the sensitivity of human decision making to absolute values, we manipulated the intensities of brightness stimuli pairs while preserving either their differences or their ratios. Although asked to choose the brighter alternative relative to the other, participants responded faster to higher absolute values. Thus, our results provide empirical evidence for human sensitivity to task irrelevant absolute values indicating a hard-wired mechanism that precedes executive control. Computational investigations of several modelling architectures reveal two alternative accounts for this phenomenon, which combine absolute and relative processing. One account involves accumulation of differences with activation dependent processing noise and the other emerges from accumulation of absolute values subject to the temporal dynamics of lateral inhibition. The potential adaptive role of such choice mechanisms is discussed.
Thyroid Cancer Risk Assessment Tool
The R package thyroid implements a risk prediction model developed by NCI researchers to calculate the absolute risk of developing a second primary thyroid cancer (SPTC) in individuals who were diagnosed with a cancer during their childhood.
[Prognostic value of absolute monocyte count in chronic lymphocytic leukaemia].
Szerafin, László; Jakó, János; Riskó, Ferenc
2015-04-01
A low peripheral absolute lymphocyte count and a high monocyte count have been reported to correlate with poor clinical outcome in various lymphomas and other cancers. However, few data are available on the prognostic value of the absolute monocyte count in chronic lymphocytic leukaemia. The aim of the authors was to investigate the impact of the absolute monocyte count measured at the time of diagnosis in patients with chronic lymphocytic leukaemia on the time to treatment and overall survival. Between January 1, 2005 and December 31, 2012, 223 patients with newly-diagnosed chronic lymphocytic leukaemia were included. The rate of patients needing treatment, time to treatment, overall survival and causes of mortality based on Rai stages, CD38, ZAP-70 positivity and absolute monocyte count were analyzed. Therapy was necessary in 21.1%, 57.4%, 88.9%, 88.9% and 100% of patients in Rai stage 0, I, II, III and IV, respectively; in 61.9% and 60.8% of patients exhibiting CD38 and ZAP-70 positivity, respectively; and in 76.9%, 21.2% and 66.2% of patients if the absolute monocyte count was <0.25 G/l, between 0.25-0.75 G/l and >0.75 G/l, respectively. The median time to treatment and the median overall survival were 19.5, 65, and 35.5 months; and 41.5, 65, and 49.5 months according to the three groups of monocyte counts. The relative risk of beginning therapy was 1.62 (p<0.01) in patients with absolute monocyte count <0.25 G/l or >0.75 G/l, as compared to those with 0.25-0.75 G/l, and the relative risk for overall survival was 2.41 (p<0.01) in patients with absolute monocyte count lower than 0.25 G/l as compared to those with higher than 0.25 G/l. The relative risks remained significant in Rai 0 patients, too. The leading causes of mortality were infections (41.7%) and chronic lymphocytic leukaemia (58.3%) in patients with a low monocyte count, while tumours (25.9-35.3%) and other events (48.1 and 11.8%) occurred in patients with medium or high monocyte counts. Patients with low and high monocyte counts had a shorter time to treatment compared to patients who belonged to the intermediate monocyte count group. A low absolute monocyte count was associated with increased mortality caused by infectious complications and chronic lymphocytic leukaemia. The absolute monocyte count may give additional prognostic information in Rai stage 0, too.
Feagan, Brian G; Rubin, David T; Danese, Silvio; Vermeire, Severine; Abhyankar, Brihad; Sankoh, Serap; James, Alexandra; Smyth, Michael
2017-02-01
The efficacy and safety of vedolizumab, a humanized immunoglobulin G1 monoclonal antibody against the integrin α4β7, were demonstrated in multicenter, phase 3, randomized, placebo-controlled trials in patients with moderately to severely active ulcerative colitis (UC) or Crohn's disease. We analyzed data from 1 of these trials to determine the effects of vedolizumab therapy in patients with UC, based on past exposure to anti-tumor necrosis factor-α (TNF) antagonists. We performed a post hoc analysis of data from the GEMINI 1 study, collected from 464 patients who received vedolizumab or placebo but had not received a previous TNF antagonist (naive to TNF antagonists) and 367 patients with an inadequate response, loss of response, or intolerance to TNF antagonists (failure of TNF antagonists). Predefined outcomes of GEMINI 1 were evaluated in these subpopulations. At Week 6, there were greater absolute differences in efficacy between vedolizumab and placebo in patients naive to TNF antagonists than patients with failure of TNF antagonists, although the risk ratios (RRs) for efficacy were similar for each group. Week 6 rates of response to vedolizumab and placebo were 53.1% and 26.3%, respectively, among patients naive to TNF antagonists (absolute difference, 26.4%; 95% confidence interval [CI], 12.4-40.4; RR, 2.0; 95% CI, 1.3-3.0); these rates were 39.0% and 20.6%, respectively, in patients with failure of TNF antagonists (absolute difference, 18.1%; 95% CI, 2.8-33.5; RR, 1.9; 95% CI, 1.1-3.2). During maintenance therapy, the absolute differences were similar but the RR for efficacy was higher for patients with failure of TNF antagonists than for patients naive to TNF antagonists, for most outcomes. Week 52 rates of remission with vedolizumab and placebo were 46.9% and 19.0%, respectively, in patients naive to TNF antagonists (absolute difference, 28.0%; 95% CI, 14.9-41.1; RR, 2.5; 95% CI, 1.5-4.0) and 36.1% and 5.3%, respectively, in patients with failure of TNF antagonists (absolute difference, 29.5%; 95% CI, 12.8-46.1; RR, 6.6; 95% CI, 1.7-26.5). No differences in adverse events were observed among groups. Vedolizumab demonstrated significantly greater efficacy as induction and maintenance therapy for UC than placebo in patients naive to TNF antagonists and patients with TNF antagonist failure. There were numerically greater treatment differences at Week 6 among patients receiving vedolizumab who were naive to TNF antagonists than patients with TNF antagonist failure. ClinicalTrials.gov no: NCT00783718. Copyright © 2017 AGA Institute. Published by Elsevier Inc. All rights reserved.
Singh, Jasvinder A; Hossain, Alomgir; Tanjong Ghogomu, Elizabeth; Mudano, Amy S; Maxwell, Lara J; Buchbinder, Rachelle; Lopez-Olivo, Maria Angeles; Suarez-Almazor, Maria E; Tugwell, Peter; Wells, George A
2017-03-10
Biologic disease-modifying anti-rheumatic drugs (DMARDs: referred to as biologics) are effective in treating rheumatoid arthritis (RA), however there are few head-to-head comparison studies. Our systematic review, standard meta-analysis and network meta-analysis (NMA) updates the 2009 Cochrane overview, 'Biologics for rheumatoid arthritis (RA)' and adds new data. This review is focused on biologic or tofacitinib therapy in people with RA who had previously been treated unsuccessfully with biologics. To compare the benefits and harms of biologics (abatacept, adalimumab, anakinra, certolizumab pegol, etanercept, golimumab, infliximab, rituximab, tocilizumab) and small molecule tofacitinib versus comparator (placebo or methotrexate (MTX)/other DMARDs) in people with RA, previously unsuccessfully treated with biologics. On 22 June 2015 we searched for randomized controlled trials (RCTs) in CENTRAL, MEDLINE, and Embase; and trials registries (WHO trials register, Clinicaltrials.gov). We carried out article selection, data extraction, and risk of bias and GRADE assessments in duplicate. We calculated direct estimates with 95% confidence intervals (CI) using standard meta-analysis. We used a Bayesian mixed treatment comparison (MTC) approach for NMA estimates with 95% credible intervals (CrI). We converted odds ratios (OR) to risk ratios (RR) for ease of understanding. We have also presented results in absolute measures as risk difference (RD) and number needed to treat for an additional beneficial outcome (NNTB). Outcomes measured included four benefits (ACR50, function measured by Health Assessment Questionnaire (HAQ) score, remission defined as DAS < 1.6 or DAS28 < 2.6, slowing of radiographic progression) and three harms (withdrawals due to adverse events, serious adverse events, and cancer). This update includes nine new RCTs for a total of 12 RCTs that included 3364 participants. The comparator was placebo only in three RCTs (548 participants), MTX or other traditional DMARD in six RCTs (2468 participants), and another biologic in three RCTs (348 participants). Data were available for four tumor necrosis factor (TNF)-biologics: (certolizumab pegol (1 study; 37 participants), etanercept (3 studies; 348 participants), golimumab (1 study; 461 participants), infliximab (1 study; 27 participants)), three non-TNF biologics (abatacept (3 studies; 632 participants), rituximab (2 studies; 1019 participants), and tocilizumab (2 studies; 589 participants)); there was only one study for tofacitinib (399 participants). The majority of the trials (10/12) lasted less than 12 months. We judged 33% of the studies at low risk of bias for allocation sequence generation, allocation concealment and blinding, 25% had low risk of bias for attrition, 92% were at unclear risk for selective reporting; and 92% had low risk of bias for major baseline imbalance. We downgraded the quality of the evidence for most outcomes to moderate or low due to study limitations, heterogeneity, or rarity of direct comparator trials. Biologic monotherapy versus placebo: Compared to placebo, biologics were associated with clinically meaningful and statistically significant improvement in RA as demonstrated by higher ACR50 and RA remission rates. RR was 4.10 for ACR50 (95% CI 1.97 to 8.55; moderate-quality evidence); absolute benefit RD 14% (95% CI 6% to 21%); and NNTB = 8 (95% CI 4 to 23).
RR for RA remission was 13.51 (95% CI 1.85 to 98.45, one study available; moderate-quality evidence); absolute benefit RD 9% (95% CI 5% to 13%); and NNTB = 11 (95% CI 3 to 136). Results for withdrawals due to adverse events and serious adverse events did not show any statistically significant or clinically meaningful differences. There were no studies available for analysis for function measured by HAQ, radiographic progression, or cancer outcomes. There were not enough data for any of the outcomes to look at subgroups. Biologic + MTX versus active comparator (MTX/other traditional DMARDs): Compared to MTX/other traditional DMARDs, biologic + MTX was associated with a clinically meaningful and statistically significant improvement in ACR50, function measured by HAQ, and RA remission rates in direct comparisons. RR for ACR50 was 4.07 (95% CI 2.76 to 5.99; high-quality evidence); absolute benefit RD 16% (10% to 21%); NNTB = 7 (95% CI 5 to 11). HAQ scores showed an improvement with a mean difference (MD) of 0.29 (95% CI 0.21 to 0.36; high-quality evidence); absolute benefit RD 9.7% improvement (95% CI 7% to 12%); and NNTB = 5 (95% CI 4 to 7). Remission rates showed an improved RR of 20.73 (95% CI 4.13 to 104.16; moderate-quality evidence); absolute benefit RD 10% (95% CI 8% to 13%); and NNTB = 17 (95% CI 4 to 96), among the biologic + MTX group compared to MTX/other DMARDs. There were no studies for radiographic progression. Results were not clinically meaningful or statistically significantly different for withdrawals due to adverse events or serious adverse events, and were inconclusive for cancer. Tofacitinib monotherapy versus placebo: There were no published data. Tofacitinib + MTX versus active comparator (MTX): In one study, compared to MTX, tofacitinib + MTX was associated with a clinically meaningful and statistically significant improvement in ACR50 (RR 3.24; 95% CI 1.78 to 5.89; absolute benefit RD 19% (95% CI 12% to 26%); NNTB = 6 (95% CI 3 to 14); moderate-quality evidence), and function measured by HAQ (MD 0.27 improvement (95% CI 0.14 to 0.39); absolute benefit RD 9% (95% CI 4.7% to 13%); NNTB = 5 (95% CI 4 to 10); high-quality evidence). RA remission rates were not statistically significantly different but the observed difference may be clinically meaningful (RR 15.44 (95% CI 0.93 to 256.1; high-quality evidence); absolute benefit RD 6% (95% CI 3% to 9%); NNTB could not be calculated). There were no studies for radiographic progression. There were no statistically significant or clinically meaningful differences for withdrawals due to adverse events and serious adverse events, and results were inconclusive for cancer. Biologic (with or without MTX) or tofacitinib (with MTX) use was associated with clinically meaningful and statistically significant benefits (ACR50, HAQ, remission) compared to placebo or an active comparator (MTX/other traditional DMARDs) among people with RA previously unsuccessfully treated with biologics. No studies examined radiographic progression. Results were not clinically meaningful or statistically significant for withdrawals due to adverse events and serious adverse events, and were inconclusive for cancer.
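The review's conversion from pooled odds ratios to risk ratios, and then to absolute risk differences and NNTB, can be sketched as below; the odds ratio and assumed comparator-group risk are illustrative stand-ins, not the review's pooled estimates.

```python
# Sketch: OR -> RR for an assumed comparator (control) risk, then RD and NNTB.
# The odds ratio and control-group risk below are illustrative only.
def or_to_rr(odds_ratio, p_control):
    return odds_ratio / (1 - p_control + p_control * odds_ratio)

odds_ratio = 4.5        # hypothetical pooled OR for ACR50
p_control = 0.04        # assumed ACR50 rate on MTX/other DMARDs

rr = or_to_rr(odds_ratio, p_control)
p_treated = rr * p_control
rd = p_treated - p_control          # absolute benefit (risk difference)
nntb = 1 / rd                       # number needed to treat for one extra responder
print(f"RR = {rr:.2f}, RD = {rd:.1%}, NNTB = {nntb:.0f}")
```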
Modeling returns volatility: Realized GARCH incorporating realized risk measure
NASA Astrophysics Data System (ADS)
Jiang, Wei; Ruan, Qingsong; Li, Jianfeng; Li, Ye
2018-06-01
This study applies realized GARCH models by introducing several risk measures of intraday returns into the measurement equation, to model the daily volatility of E-mini S&P 500 index futures returns. Besides using the conventional realized measures, realized volatility and realized kernel as our benchmarks, we also use generalized realized risk measures, realized absolute deviation, and two realized tail risk measures, realized value-at-risk and realized expected shortfall. The empirical results show that realized GARCH models using the generalized realized risk measures provide better volatility estimation for the in-sample and substantial improvement in volatility forecasting for the out-of-sample. In particular, the realized expected shortfall performs best for all of the alternative realized measures. Our empirical results reveal that future volatility may be more attributable to present losses (risk measures). The results are robust to different sample estimation windows.
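A minimal simulation of a log-linear realized GARCH(1,1), illustrating how a realized risk measure enters both the measurement equation and the volatility recursion; the parameter values are illustrative and are not estimates from the paper.

```python
# Sketch: log-linear realized GARCH(1,1). The realized measure x_t (e.g. realized
# volatility or a realized tail-risk measure) feeds the next day's variance.
# Parameter values below are illustrative, not fitted to E-mini S&P 500 data.
import numpy as np

rng = np.random.default_rng(0)
T = 1000
omega, beta, gamma = -0.30, 0.55, 0.40                         # volatility equation
xi, phi, tau1, tau2, sigma_u = -0.40, 1.0, -0.07, 0.07, 0.38   # measurement equation

log_h = np.empty(T); ret = np.empty(T); log_x = np.empty(T)
log_h[0] = (omega + gamma * xi) / (1 - beta - gamma * phi)     # unconditional level

for t in range(T):
    z = rng.standard_normal()
    ret[t] = np.exp(0.5 * log_h[t]) * z                        # daily return
    log_x[t] = (xi + phi * log_h[t] + tau1 * z                 # realized measure
                + tau2 * (z**2 - 1) + sigma_u * rng.standard_normal())
    if t + 1 < T:
        log_h[t + 1] = omega + beta * log_h[t] + gamma * log_x[t]

print("mean annualized volatility: %.1f%%" %
      (100 * np.sqrt(252) * np.exp(0.5 * log_h).mean()))
```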
Prenatal Valproate Exposure and Risk of Autism Spectrum Disorders and Childhood Autism
Christensen, Jakob; Grønborg, Therese Koops; Sørensen, Merete Juul; Schendel, Diana; Parner, Erik Thorlund; Pedersen, Lars Henning; Vestergaard, Mogens
2015-01-01
Importance Valproate is used for the treatment of epilepsy and other neuropsychological disorders and may be the only treatment option for women of childbearing potential. However, prenatal exposure to valproate may increase the risk of autism. Objective To determine whether prenatal exposure to valproate is associated with an increased risk of autism in offspring. Design, Setting, and Participants Population-based study of all children born alive in Denmark from 1996 to 2006. National registers were used to identify children exposed to valproate during pregnancy and diagnosed with autism spectrum disorders (childhood autism [autistic disorder], Asperger syndrome, atypical autism, and other or unspecified pervasive developmental disorders). We analyzed the risks associated with all autism spectrum disorders as well as childhood autism. Data were analyzed by Cox regression adjusting for potential confounders (maternal age at conception, paternal age at conception, parental psychiatric history, gestational age, birth weight, sex, congenital malformations, and parity). Children were followed up from birth until the day of autism spectrum disorder diagnosis, death, emigration, or December 31, 2010, whichever came first. Main Outcomes and Measures Absolute risk (cumulative incidence) and the hazard ratio (HR) of autism spectrum disorder and childhood autism in children after exposure to valproate in pregnancy. Results Of 655 615 children born from 1996 through 2006, 5437 were identified with autism spectrum disorder, including 2067 with childhood autism. The mean age of the children at end of follow-up was 8.84 years (range, 4-14; median, 8.85). The estimated absolute risk after 14 years of follow-up was 1.53% (95% CI, 1.47%- 1.58%) for autism spectrum disorder and 0.48% (95% CI, 0.46%-0.51%) for childhood autism. Overall, the 508 children exposed to valproate had an absolute risk of 4.42% (95% CI, 2.59%-7.46%) for autism spectrum disorder (adjusted HR, 2.9 [95% CI, 1.7-4.9]) and an absolute risk of 2.50% (95% CI, 1.30%-4.81%) for childhood autism (adjusted HR, 5.2 [95% CI, 2.7-10.0]). When restricting the cohort to the 6584 children born to women with epilepsy, the absolute risk of autism spectrum disorder among 432 children exposed to valproate was 4.15% (95% CI, 2.20%-7.81%) (adjusted HR, 1.7 [95% CI, 0.9-3.2]), and the absolute risk of childhood autism was 2.95% (95% CI, 1.42%-6.11%) (adjusted HR, 2.9 [95% CI, 1.4-6.0]) vs 2.44% (95% CI, 1.88%-3.16%) for autism spectrum disorder and 1.02% (95% CI, 0.70%-1.49%) for childhood autism among 6152 children not exposed to valproate. Conclusions and Relevance Maternal use of valproate during pregnancy was associated with a significantly increased risk of autism spectrum disorder and childhood autism in the offspring, even after adjusting for maternal epilepsy. For women of childbearing potential who use antiepileptic medications, these findings must be balanced against the treatment benefits for women who require valproate for epilepsy control. PMID:23613074
Dolz, Jose; Laprie, Anne; Ken, Soléakhéna; Leroy, Henri-Arthur; Reyns, Nicolas; Massoptier, Laurent; Vermandel, Maximilien
2016-01-01
To constrain the risk of severe toxicity in radiotherapy and radiosurgery, precise volume delineation of organs at risk is required. This task is still manually performed, which is time-consuming and prone to observer variability. To address these issues, and as an alternative to atlas-based segmentation methods, machine learning techniques, such as support vector machines (SVM), have been recently presented to segment subcortical structures on magnetic resonance images (MRI). SVM is proposed to segment the brainstem on MRI in a multicenter brain cancer context. A dataset composed of 14 adult brain MRI scans is used to evaluate its performance. In addition to spatial and probabilistic information, five different image intensity values (IIVs) configurations are evaluated as features to train the SVM classifier. Segmentation accuracy is evaluated by computing the Dice similarity coefficient (DSC), absolute volumes difference (AVD) and percentage volume difference between automatic and manual contours. Mean DSC for all proposed IIVs configurations ranged from 0.89 to 0.90. Mean AVD values were below 1.5 cm(3), where the value for the best performing IIVs configuration was 0.85 cm(3), representing an absolute mean difference of 3.99% with respect to the manually segmented volumes. Results suggest consistent volume estimation and high spatial similarity with respect to expert delineations. The proposed approach outperformed presented methods to segment the brainstem, not only in volume similarity metrics, but also in segmentation time. Preliminary results showed that the approach might be promising for adoption in clinical use.
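The two headline accuracy metrics above are easy to compute from binary masks; the sketch below uses tiny synthetic arrays and a hypothetical voxel volume in place of real MRI segmentations.

```python
# Sketch: Dice similarity coefficient (DSC), absolute volume difference (AVD) and
# percentage volume difference between automatic and manual binary masks.
import numpy as np

def dice(auto_mask, manual_mask):
    intersection = np.logical_and(auto_mask, manual_mask).sum()
    return 2.0 * intersection / (auto_mask.sum() + manual_mask.sum())

def volume_metrics(auto_mask, manual_mask, voxel_volume_cm3):
    v_auto = auto_mask.sum() * voxel_volume_cm3
    v_manual = manual_mask.sum() * voxel_volume_cm3
    avd = abs(v_auto - v_manual)            # absolute volume difference, cm^3
    pvd = 100.0 * avd / v_manual            # percentage volume difference
    return avd, pvd

# Tiny synthetic masks; real inputs would be the SVM and expert contours.
auto = np.zeros((10, 10, 10), dtype=bool); auto[2:7, 2:7, 2:7] = True
manual = np.zeros_like(auto); manual[3:8, 2:7, 2:7] = True
print("DSC = %.2f" % dice(auto, manual))
avd, pvd = volume_metrics(auto, manual, voxel_volume_cm3=0.001)
print("AVD = %.3f cm^3, PVD = %.1f%%" % (avd, pvd))
```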
Lacy, C R; Barone, J A; Suh, D C; Malini, P L; Bueno, M; Moylan, D M; Kostis, J B
2001-01-15
This study was conducted to evaluate willingness to prescribe medication based on identical data presented in different outcome terms to health professionals of varied discipline, geographic location, and level of training. Cross-sectional survey using a self-administered questionnaire was performed in 400 health professionals (physicians, pharmacists, physicians-in-training, and pharmacy students) in the United States and Europe. Data reflecting a clinical trial were presented in 6 outcome terms: 3 terms describing identical mortality (relative risk reduction, absolute risk reduction, and number of patients needed to be treated to prevent 1 death); and 3 distractors (increased life expectancy, decreased hospitalization rate, and decreased cost). Willingness to prescribe and rank order of medication preference assuming willingness to prescribe were measured. The results of the study showed that willingness to prescribe and first choice preference were significantly greater when study results were presented as relative risk reduction than when identical mortality data were presented as absolute risk reduction or number of patients needed to be treated to avoid 1 death (p <0.001). Increase in life expectancy was the most influential distractor. In conclusion, this study, performed in the era of "evidence-based medicine," demonstrates that the method of reporting research trial results has significant influence on health professionals' willingness to prescribe. The high numerical value of relative risk reduction and the concrete and tangible quality of increased life expectancy exert greater influence on health professionals than other standard outcome terms.
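The three equivalent framings of the same mortality result can be reproduced with a few lines; the control and treatment rates below are hypothetical and are not the figures shown to the survey respondents.

```python
# Sketch: the same trial result as relative risk reduction (RRR), absolute risk
# reduction (ARR) and number needed to treat (NNT). Rates are hypothetical.
p_control, p_treatment = 0.12, 0.09    # mortality in control vs treated group

arr = p_control - p_treatment          # absolute risk reduction
rrr = arr / p_control                  # relative risk reduction
nnt = 1 / arr                          # patients treated to prevent one death

print(f"RRR = {rrr:.0%}, ARR = {arr:.1%}, NNT = {nnt:.0f}")
# The same data read as "25% fewer deaths", "3 percentage points", or "treat 33".
```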
Reduced COPD Exacerbation Risk Correlates With Improved FEV1: A Meta-Regression Analysis.
Zider, Alexander D; Wang, Xiaoyan; Buhr, Russell G; Sirichana, Worawan; Barjaktarevic, Igor Z; Cooper, Christopher B
2017-09-01
The mechanism by which various classes of medication reduce COPD exacerbation risk remains unknown. We hypothesized a correlation between reduced exacerbation risk and improvement in airway patency as measured according to FEV1. By systematic review, COPD trials were identified that reported therapeutic changes in predose FEV1 (dFEV1) and occurrence of moderate to severe exacerbations. Using meta-regression analysis, a model was generated with dFEV1 as the moderator variable and the absolute difference in exacerbation rate (RD), ratio of exacerbation rates (RRs), or hazard ratio (HR) as dependent variables. The analysis of RD and RR included 119,227 patients, and the HR analysis included 73,475 patients. For every 100-mL change in predose FEV1, the HR decreased by 21% (95% CI, 17-26; P < .001; R² = 0.85) and the absolute exacerbation rate decreased by 0.06 per patient per year (95% CI, 0.02-0.11; P = .009; R² = 0.05), which corresponded to an RR of 0.86 (95% CI, 0.81-0.91; P < .001; R² = 0.20). The relationship with exacerbation risk remained statistically significant across multiple subgroup analyses. A significant correlation between increased FEV1 and lower COPD exacerbation risk suggests that airway patency is an important mechanism responsible for this effect. Copyright © 2017 American College of Chest Physicians. Published by Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Holland, Katharina; van Gils, Carla H.; Wanders, Johanna OP; Mann, Ritse M.; Karssemeijer, Nico
2016-03-01
The sensitivity of mammograms is low for women with dense breasts, since cancers may be masked by dense tissue. In this study, we investigated methods to identify women with density patterns associated with a high masking risk. Risk measures are derived from volumetric breast density maps. We used the last negative screening mammograms of 93 women who subsequently presented with an interval cancer (IC), and, as controls, 930 randomly selected normal screening exams from women without cancer. Volumetric breast density maps were computed from the mammograms, which provide the dense tissue thickness at each location. These were used to compute absolute and percentage glandular tissue volume. We modeled the masking risk for each pixel location using the absolute and percentage dense tissue thickness and we investigated the effect of taking the cancer location probability distribution (CLPD) into account. For each method, we selected cases with the highest masking measure (by thresholding) and computed the fraction of ICs as a function of the fraction of controls selected. The latter can be interpreted as the negative supplemental screening rate (NSSR). Between the models, when incorporating CLPD, no significant differences were found. In general, the methods performed better when CLPD was included. At higher NSSRs some of the investigated masking measures had a significantly higher performance than volumetric breast density. These measures may therefore serve as an alternative to identify women with a high risk for a masked cancer.
Gray, Kurt; Schein, Chelsea
2016-05-01
Moral absolutism is the idea that people's moral judgments are insensitive to considerations of harm. Scott, Inbar, and Rozin (2016, this issue) claim that most moral opponents to genetically modified organisms are absolutely opposed-motivated by disgust and not harm. Yet there is no evidence for moral absolutism in their data. Perceived risk/harm is the most significant predictor of moral judgments for "absolutists," accounting for 30 times more variance than disgust. Reanalyses suggest that disgust is not even a significant predictor of the moral judgments of absolutists once accounting for perceived harm and anger. Instead of revealing actual moral absolutism, Scott et al. find only empty absolutism: hypothetical, forecasted, self-reported moral absolutism. Strikingly, the moral judgments of so-called absolutists are somewhat more sensitive to consequentialist concerns than those of nonabsolutists. Mediation reanalyses reveal that moral judgments are most proximally predicted by harm and not disgust, consistent with dyadic morality. © The Author(s) 2016.
Breast Cancer Risk Assessment SAS Macro (Gail Model)
A SAS macro (commonly referred to as the Gail Model) that projects absolute risk of invasive breast cancer according to NCI’s Breast Cancer Risk Assessment Tool (BCRAT) algorithm for specified race/ethnic groups and age intervals.
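The macro itself implements the published BCRAT coefficients. As a hedged illustration of the generic calculation such tools perform, the sketch below combines a piecewise-constant cancer hazard with a competing mortality hazard into a crude absolute risk; the hazard values are invented and are not BCRAT inputs.

```python
import numpy as np

def crude_absolute_risk(ages, h_event, h_death, start_age, horizon):
    """Crude absolute risk of the event over [start_age, start_age + horizon),
    from piecewise-constant annual event and competing-mortality hazards:
    each year contributes S(t) * P(event that year), where S(t) is the
    probability of still being alive and event-free at the start of the year."""
    risk, surv = 0.0, 1.0
    for age, h1, h2 in zip(ages, h_event, h_death):
        if start_age <= age < start_age + horizon:
            p_any = 1.0 - np.exp(-(h1 + h2))
            risk += surv * p_any * h1 / (h1 + h2)
            surv *= np.exp(-(h1 + h2))
    return risk

# Hypothetical hazards per year of age (illustrative only, not BCRAT inputs).
ages = np.arange(50, 60)
h_cancer = np.full(ages.shape, 0.004)  # relative-risk-adjusted breast cancer hazard
h_death = np.full(ages.shape, 0.010)   # competing mortality hazard
print(f"5-year absolute risk: {crude_absolute_risk(ages, h_cancer, h_death, 50, 5):.2%}")
```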
Hawken, Steven; Kwong, Jeffrey C; Deeks, Shelley L; Crowcroft, Natasha S; McGeer, Allison J; Ducharme, Robin; Campitelli, Michael A; Coyle, Doug; Wilson, Kumanan
2015-02-01
It is unclear whether seasonal influenza vaccination results in a net increase or decrease in the risk for Guillain-Barré syndrome (GBS). To assess the effect of seasonal influenza vaccination on the absolute risk of acquiring GBS, we used simulation models and published estimates of age- and sex-specific risks for GBS, influenza incidence, and vaccine effectiveness. For a hypothetical 45-year-old woman and 75-year-old man, excess GBS risk for influenza vaccination versus no vaccination was -0.36/1 million vaccinations (95% credible interval -1.22 to 0.28) and -0.42/1 million vaccinations (95% credible interval, -3.68 to 2.44), respectively. These numbers represent a small absolute reduction in GBS risk with vaccination. Under typical conditions (e.g. influenza incidence rates >5% and vaccine effectiveness >60%), vaccination reduced GBS risk. These findings should strengthen confidence in the safety of influenza vaccine and allow health professionals to better put GBS risk in context when discussing influenza vaccination with patients.
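A back-of-envelope version of the trade-off being simulated above is sketched below; all inputs are invented placeholders (the study used full simulation models with age- and sex-specific estimates and uncertainty), so the output only illustrates the arithmetic, not the published result.

```python
# Back-of-envelope comparison with invented inputs; the study used full
# simulation models with age- and sex-specific estimates and uncertainty.
influenza_attack_rate = 0.05             # seasonal influenza incidence
gbs_per_million_influenza_cases = 170.0  # GBS risk among influenza cases
vaccine_effectiveness = 0.60
vaccine_attrib_gbs_per_million = 1.0     # GBS attributable to the vaccine itself

risk_unvaccinated = influenza_attack_rate * gbs_per_million_influenza_cases
risk_vaccinated = (vaccine_attrib_gbs_per_million
                   + (1.0 - vaccine_effectiveness) * risk_unvaccinated)
excess_risk = risk_vaccinated - risk_unvaccinated
print(f"Excess GBS risk with vaccination: {excess_risk:+.2f} per million vaccinations")
```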
Cole, Stephen R.; Lau, Bryan; Eron, Joseph J.; Brookhart, M. Alan; Kitahata, Mari M.; Martin, Jeffrey N.; Mathews, William C.; Mugavero, Michael J.
2015-01-01
There are few published examples of absolute risk estimated from epidemiologic data subject to censoring and competing risks with adjustment for multiple confounders. We present an example estimating the effect of injection drug use on 6-year risk of acquired immunodeficiency syndrome (AIDS) after initiation of combination antiretroviral therapy between 1998 and 2012 in an 8-site US cohort study with death before AIDS as a competing risk. We estimate the risk standardized to the total study sample by combining inverse probability weights with the cumulative incidence function; estimates of precision are obtained by bootstrap. In 7,182 patients (83% male, 33% African American, median age of 38 years), we observed 6-year standardized AIDS risks of 16.75% among 1,143 injection drug users and 12.08% among 6,039 nonusers, yielding a standardized risk difference of 4.68 (95% confidence interval: 1.27, 8.08) and a standardized risk ratio of 1.39 (95% confidence interval: 1.12, 1.72). Results may be sensitive to the assumptions of exposure-version irrelevance, no measurement bias, and no unmeasured confounding. These limitations suggest that results be replicated with refined measurements of injection drug use. Nevertheless, estimating the standardized risk difference and ratio is straightforward, and injection drug use appears to increase the risk of AIDS. PMID:24966220
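For readers who want to reproduce this kind of estimate, the sketch below gives a weighted Aalen-Johansen cumulative incidence function; passing inverse-probability-of-exposure weights standardizes the estimate to the full sample, and bootstrapping subjects would give confidence intervals. This is a generic sketch under those assumptions, not the authors' code, and the example data are synthetic.

```python
import numpy as np

def weighted_cif(time, event, weights, cause=1, horizon=6.0):
    """Weighted Aalen-Johansen estimate of the cumulative incidence of `cause`
    by `horizon`; other event codes are competing events and 0 is censoring.
    Passing inverse-probability-of-exposure weights standardizes the estimate."""
    time, event, w = map(np.asarray, (time, event, weights))
    order = np.argsort(time)
    time, event, w = time[order], event[order], w[order]

    at_risk = w.sum()
    surv, cif = 1.0, 0.0
    for t in np.unique(time[time <= horizon]):
        here = time == t
        d_all = w[here & (event > 0)].sum()         # any event at t
        d_cause = w[here & (event == cause)].sum()  # events of interest at t
        if at_risk > 0:
            cif += surv * d_cause / at_risk
            surv *= 1.0 - d_all / at_risk
        at_risk -= w[here].sum()                    # events and censorings leave
    return cif

# Tiny synthetic example: 1 = AIDS, 2 = death before AIDS, 0 = censored.
t = [1.2, 3.4, 5.0, 2.2, 6.0, 0.8]
e = [1, 2, 0, 1, 0, 2]
w = [1.0, 1.2, 0.9, 1.1, 1.0, 1.3]
print(weighted_cif(t, e, w, cause=1, horizon=6.0))
```

The standardized risk difference and ratio would then follow from calling this function separately for the exposed and unexposed groups with the same weights.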
Chen, Wen-jun; Zhang, Xi; Wu, Cong-cong; Zhang, Chao-ying; Sun, Shuang-shuang; Wu, Jian
2017-01-01
Background There is no consistent agreement on whether radiotherapy after breast-conserving surgery (BCS) provides a local control and survival benefit for older patients with early breast cancer or breast ductal carcinoma in situ (DCIS). The present study aimed to evaluate the efficacy of radiotherapy after BCS in older patients with early breast cancer or DCIS. Results Radiotherapy could reduce the risk of local relapse in older patients with early breast cancer. The 5-year AR of local relapse was 2.2% and 6.2% for the radiotherapy and non-radiotherapy groups, respectively, with a low 5-year ARD of 4.0% and a high NNT of 25. The 10-year AR of local relapse was 5.3% and 10.5% for the radiotherapy and non-radiotherapy groups, respectively, with a 10-year ARD of 5.2% and an NNT of 20. However, radiotherapy did not improve survival outcomes, including overall survival, cancer-specific survival, breast-cancer-specific survival, and distant relapse. Moreover, radiotherapy could reduce the risk of ipsilateral breast events in older patients with DCIS. Materials and Methods The PubMed and Embase databases were searched for relevant studies. Hazard ratios (HRs), risk ratios (RRs), absolute risk (AR), absolute risk difference (ARD), and number needed to treat (NNT) were used as effect measures to evaluate the efficacy of radiotherapy in older patients. Conclusions Our study indicates that radiotherapy could slightly reduce the risk of local relapse in older patients with favorable early breast cancer. However, this does not translate into significant survival benefits. PMID:28415667
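The ARD and NNT reported above follow from simple arithmetic on the absolute risks; a worked example for the 10-year local relapse outcome:

```python
# Worked example for the 10-year local relapse outcome reported above.
ar_no_rt, ar_rt = 0.105, 0.053   # absolute risks without / with radiotherapy
ard = ar_no_rt - ar_rt           # absolute risk difference = 0.052
nnt = 1.0 / ard                  # about 19.2, reported rounded to 20
print(f"ARD = {ard:.1%}, NNT = {nnt:.1f}")
```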
Mother's education and offspring asthma risk in 10 European cohort studies.
Lewis, Kate Marie; Ruiz, Milagros; Goldblatt, Peter; Morrison, Joana; Porta, Daniela; Forastiere, Francesco; Hryhorczuk, Daniel; Zvinchuk, Oleksandr; Saurel-Cubizolles, Marie-Josephe; Lioret, Sandrine; Annesi-Maesano, Isabella; Vrijheid, Martine; Torrent, Maties; Iniguez, Carmen; Larranaga, Isabel; Harskamp-van Ginkel, Margreet W; Vrijkotte, Tanja G M; Klanova, Jana; Svancara, Jan; Barross, Henrique; Correia, Sofia; Jarvelin, Marjo-Riitta; Taanila, Anja; Ludvigsson, Johnny; Faresjo, Tomas; Marmot, Michael; Pikhart, Hynek
2017-09-01
Highly prevalent and typically beginning in childhood, asthma is a burdensome disease, yet the risk factors for this condition remain unclear. To enhance understanding, this study assessed the cohort-specific and pooled association of maternal education with asthma in children aged 3-8 across 10 European countries. Data on 47,099 children were obtained from prospective birth cohort studies across 10 European countries. We calculated cohort-specific prevalence differences in asthma outcomes using the relative index of inequality (RII) and slope index of inequality (SII). Results from all countries were pooled using random-effects meta-analysis procedures to obtain mean RII and SII scores at the European level. Final models were adjusted for child sex, smoking during pregnancy, parity, mother's age and ethnicity. The higher the score, the greater the magnitude of relative (RII, reference 1) and absolute (SII, reference 0) inequity. The pooled RII estimate for asthma risk across all cohorts was 1.46 (95% CI 1.26, 1.71) and the pooled SII estimate was 1.90 (95% CI 0.26, 3.54). Of the countries examined, France, the United Kingdom and the Netherlands had the highest prevalences of childhood asthma and the largest inequity in asthma risk. Smaller inverse associations were noted for all other countries except Italy, which presented contradictory scores, but with small effect sizes. Tests for heterogeneity yielded significant results for SII scores. Overall, offspring of mothers with a low level of education had an increased relative and absolute risk of asthma compared to offspring of highly educated mothers.
Hannibal, Charlotte Gerd; Vang, Russell; Junge, Jette; Frederiksen, Kirsten; Kurman, Robert J; Kjaer, Susanne K
2017-01-01
The absolute risk of and risk factors for recurrence and ovarian serous carcinoma following ovarian serous borderline tumors (SBTs) are not well established. We included all women with SBTs in Denmark, 1978-2002. Diagnoses were confirmed by centralized pathology review and classified as atypical proliferative serous tumor (APST) or noninvasive low-grade serous carcinoma (LGSC). Implants were classified as noninvasive or invasive. Medical records were collected and reviewed, and follow-up was obtained. Subsequent diagnoses were also confirmed by centralized pathology review. We examined absolute risk and risk factors for recurrent APST and serous carcinoma using Cox regression. The absolute serous carcinoma risk after 5 and 20 years, respectively, was 5.0% and 13.9% for noninvasive LGSC, and 0.9% and 3.7% for APST. Serous carcinoma risk was significantly higher following noninvasive LGSC compared with APST among stage I patients/patients without implants (HR=5.3; 95% CI: 1.7-16.3), whereas no significant association with tumor type was found in advanced stage patients/patients with implants. Advanced stage (notably invasive implants), bilaterality, surface involvement, and residual disease increased serous carcinoma risk. However, women with stage I APST also had a higher risk than the general population. This largest population-based cohort of verified SBTs revealed that women with noninvasive LGSC are significantly more likely to develop serous carcinoma than women with APST, which could not entirely be explained by invasive implants. Although invasive implants were a strong risk factor for serous carcinoma, even women with stage I APST were at increased risk compared with the general population. Copyright © 2016 Elsevier Inc. All rights reserved.
Jeon, Mi Young; Lee, Hye Won; Kim, Seung Up; Kim, Beom Kyung; Park, Jun Yong; Kim, Do Young; Han, Kwang-Hyub; Ahn, Sang Hoon
2018-04-01
Several risk prediction models for hepatocellular carcinoma (HCC) development are available. We explored whether the use of risk prediction models can dynamically predict HCC development at different time points in chronic hepatitis B (CHB) patients. Between 2006 and 2014, 1397 CHB patients were recruited. All patients underwent serial transient elastography at intervals of >6 months. The median age of this study population (931 males and 466 females) was 49.0 years. The median CU-HCC, REACH-B, LSM-HCC and mREACH-B scores at enrolment were 4.0, 9.0, 10.0 and 8.0, respectively. During the follow-up period (median, 68.0 months), 87 (6.2%) patients developed HCC. All risk prediction models were successful in predicting HCC development at both the first liver stiffness (LS) measurement (hazard ratio [HR] = 1.067-1.467 in the subgroup without antiviral therapy [AVT] and 1.096-1.458 in the subgroup with AVT) and second LS measurement (HR = 1.125-1.448 in the subgroup without AVT and 1.087-1.249 in the subgroup with AVT). In contrast, neither the absolute nor percentage change in the scores from the risk prediction models predicted HCC development (all P > .05). The mREACH-B score performed similarly to, or significantly better than, the other scores (AUROCs at 5 years, 0.694-0.862 vs 0.537-0.875). Dynamic prediction of HCC development at different time points was achieved using four risk prediction models, but not using the changes in the absolute and percentage values between two time points. The mREACH-B score was the most appropriate prediction model of HCC development among the four prediction models. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
NASA Astrophysics Data System (ADS)
Zeng, You-Zhi; Zhang, Ning
2016-12-01
This paper proposes a new full velocity difference model for car-following theory that incorporates the driver’s heterogeneity of disturbance risk preference, in order to investigate its effects on traffic flow instability when the driver reacts to the relative velocity. We obtain the traffic flow instability condition and the calculation method for the unstable-region headway range and the probability of traffic congestion caused by a small disturbance. The analysis shows that the driver’s heterogeneity of disturbance risk preference has important effects on traffic flow instability: (1) traffic flow instability is independent of the absolute size of the driver’s disturbance risk preference coefficient and depends on the ratio of the preceding vehicle driver’s disturbance risk preference coefficient to the following vehicle driver’s disturbance risk preference coefficient; (2) the smaller this ratio, the lower the traffic flow instability, and vice versa. This provides some viable ideas for suppressing traffic congestion.
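For reference, the standard full velocity difference car-following model that such extensions typically start from is usually written as below; how the driver-specific disturbance risk preference coefficients enter is specific to this paper and is not reproduced here.

```latex
\frac{\mathrm{d}v_n(t)}{\mathrm{d}t}
  = a\left[ V\!\big(\Delta x_n(t)\big) - v_n(t) \right]
  + \lambda\,\Delta v_n(t)
```

Here $v_n$ is the velocity of vehicle $n$, $\Delta x_n$ and $\Delta v_n$ are the headway and relative velocity to the preceding vehicle, $V(\cdot)$ is the optimal velocity function, $a$ is the sensitivity, and $\lambda$ weights the velocity difference term.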
Quantifying Treatment Benefit in Molecular Subgroups to Assess a Predictive Biomarker.
Iasonos, Alexia; Chapman, Paul B; Satagopan, Jaya M
2016-05-01
There is increasing interest in finding predictive biomarkers that can guide treatment options for both mutation carriers and noncarriers. The statistical assessment of variation in treatment benefit (TB) according to the biomarker carrier status plays an important role in evaluating predictive biomarkers. For time-to-event endpoints, the hazard ratio (HR) for interaction between treatment and a biomarker from a proportional hazards regression model is commonly used as a measure of variation in TB. Although this can be easily obtained using available statistical software packages, the interpretation of the HR is not straightforward. In this article, we propose different summary measures of variation in TB on the scale of survival probabilities for evaluating a predictive biomarker. The proposed summary measures can be easily interpreted as quantifying the differential in TB in terms of relative risk or excess absolute risk due to treatment in carriers versus noncarriers. We illustrate the use and interpretation of the proposed measures with data from completed clinical trials. We encourage clinical practitioners to interpret variation in TB in terms of measures based on survival probabilities, particularly in terms of excess absolute risk, as opposed to the HR. Clin Cancer Res; 22(9); 2114-20. ©2016 American Association for Cancer Research.
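A hypothetical numerical illustration of survival-probability-scale measures of this kind (the survival values, subgroup labels, and fixed time point are invented assumptions, not the article's data):

```python
# Hypothetical survival probabilities at a fixed time point.
s_treat_carrier, s_ctrl_carrier = 0.80, 0.65
s_treat_noncarrier, s_ctrl_noncarrier = 0.72, 0.68

# Excess absolute risk due to treatment (risk = 1 - survival) in each subgroup.
tb_carrier = (1 - s_ctrl_carrier) - (1 - s_treat_carrier)           # 0.15
tb_noncarrier = (1 - s_ctrl_noncarrier) - (1 - s_treat_noncarrier)  # 0.04

# Relative risk of the event under treatment versus control in each subgroup.
rr_carrier = (1 - s_treat_carrier) / (1 - s_ctrl_carrier)
rr_noncarrier = (1 - s_treat_noncarrier) / (1 - s_ctrl_noncarrier)

print(f"Difference in absolute treatment benefit: {tb_carrier - tb_noncarrier:+.2f}")
print(f"Ratio of relative risks (carriers vs noncarriers): {rr_carrier / rr_noncarrier:.2f}")
```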
9 CFR 439.20 - Criteria for maintaining accreditation.
Code of Federal Regulations, 2011 CFR
2011-01-01
... deviation measure equal to zero when the absolute value of the result's standardized difference, (d), is...) Variability: The absolute value of the standardized difference between the accredited laboratory's result and... constant, is used in place of the absolute value of the standardized difference to determine the CUSUM-V...
Weinberg, Ido; Gona, Philimon; O’Donnell, Christopher J.; Jaff, Michael R.; Murabito, Joanne M.
2014-01-01
Background An increased inter-arm systolic blood pressure difference is an easily determined physical examination finding. The relationship between inter-arm systolic blood pressure difference and risk of future cardiovascular disease is uncertain. We described the prevalence and risk factor correlates of inter-arm systolic blood pressure difference in the Framingham Heart Study (FHS) original and offspring cohorts and examined the association between inter-arm systolic blood pressure difference and incident cardiovascular disease and all-cause mortality. Methods An increased inter-arm systolic blood pressure difference was defined as ≥10 mmHg using the average of initial and repeat blood pressure measurements obtained in both arms. Participants were followed through 2010 for incident cardiovascular disease events. Multivariable Cox proportional hazards regression analyses were performed to investigate the effect of inter-arm systolic blood pressure difference on incident cardiovascular disease. Results We examined 3,390 (56.3% female) participants aged 40 years and older, free of cardiovascular disease at baseline, mean age of 61.1 years, who attended an FHS examination between 1991 and 1994 (original cohort) and from 1995 to 1998 (offspring cohort). The mean absolute inter-arm systolic blood pressure difference was 4.6 mmHg (range 0 to 78). Increased inter-arm systolic blood pressure difference was present in 317 (9.4%) participants. The median follow-up time was 13.3 years, during which time 598 participants (17.6%) experienced a first cardiovascular event including 83 (26.2%) participants with inter-arm systolic blood pressure difference ≥10 mmHg. Compared to those with normal inter-arm systolic blood pressure difference, participants with an elevated inter-arm systolic blood pressure difference were older (63.0 years vs. 60.9 years), had a greater prevalence of diabetes mellitus (13.3% vs. 7.5%), higher systolic blood pressure (136.3 mmHg vs. 129.3 mmHg), and a higher total cholesterol level (212.1 mg/dL vs. 206.5 mg/dL). Inter-arm systolic blood pressure difference was associated with a significantly increased hazard of incident cardiovascular events in the multivariable adjusted model (hazard ratio 1.38, 95% CI, 1.09 to 1.75). For each 1-standard deviation unit increase in absolute inter-arm systolic blood pressure difference, the hazard ratio for incident cardiovascular events was 1.07 (CI, 1.00 to 1.14) in the fully-adjusted model. There was no such association with mortality (hazard ratio 1.02, 95% CI 0.76 to 1.38). Conclusions In this community-based cohort, an inter-arm systolic blood pressure difference is common and associated with a significantly increased risk for future cardiovascular events, even when the absolute difference in arm systolic blood pressure is modest. These findings support research to expand clinical use of this simple measurement. PMID:24287007
Rezende, Márcia; Chemin, Kaprice; Vaez, Savil Costa; Peixoto, Aline Carvalho; Rabelo, Jéssica de Freitas; Braga, Stella Sueli Lourenço; Faria-E-Silva, André Luis; Silva, Gisele Rodrigues da; Soares, Carlos José; Loguercio, Alessandro D; Reis, Alessandra
2018-05-01
Tooth sensitivity commonly occurs during and immediately after dental bleaching. The authors conducted a trial to compare tooth sensitivity after in-office bleaching after the use of either a topical dipyrone or placebo gel. A split-mouth, triple-blind, randomized, multicenter clinical trial was conducted among 120 healthy adults having teeth that were shade A2 or darker. The facial tooth surfaces of the right or left sides of the maxillary arch of each patient were randomly assigned to receive either topical dipyrone or placebo gel before 2 in-office bleaching sessions (35% hydrogen peroxide) separated by 2 weeks. Visual analog and numerical rating scales were used to record tooth sensitivity during and up to 48 hours after bleaching. Tooth color change from baseline to 1 month after bleaching was measured with shade guide and spectrophotometer measures. The primary outcome variable was absolute risk of tooth sensitivity. An intention-to-treat analysis was used to analyze data from all patients who were randomly assigned to receive the dipyrone and placebo gels. No statistically significant difference was found in the absolute risk of tooth sensitivity between the dipyrone and placebo gels (83% and 90%, respectively, P = .09; relative risk, 0.92; 95% confidence interval, 0.8 to 1.0). A whitening effect was observed in both groups with no statistically significant difference (P > .05) between them. No adverse effects were observed. Topical use of dipyrone gel before tooth bleaching, at the levels used in this study, did not reduce the risk or intensity of bleaching-induced tooth sensitivity. Topical application of dipyrone gel does not reduce bleaching-induced tooth sensitivity. Copyright © 2018 American Dental Association. Published by Elsevier Inc. All rights reserved.
HIN7/440: Evidence-based Consumer Health Information - The need for unbiased risk communication
Hoeldke, B; Muehlhauser, I
1999-01-01
Online consumer health information is rapidly growing. At the same time, patients and consumers are increasingly expected to take an active part in decision making about preventive or therapeutic interventions. The basis for informed consumer choice is the communication of evidence-based scientific data in a format that is clearly understood by most lay persons. The way study results are presented influences decisions by health care providers and patients or consumers alike. The impact of framing of outcome data as either relative or absolute differences is well recognized. Outcome data should be reported as absolute numbers, absolute risk reductions or numbers needed to treat or to screen rather than as relative risk reductions. Beyond the question of whether relative or absolute differences are used, outcome data can be framed by either emphasising achievable benefits or the lack of such benefits. Presentation of data as the proportion of patients who remain free of a target outcome rather than the proportion of patients who benefit from a certain intervention could substantially influence decision making. So far, studies evaluating the communication of treatment results to patients have focussed on the benefits of the respective interventions. Such an approach is incompatible with unbiased informed decision making by the patient, client or consumer. In order to communicate outcome data in an objective manner, the whole possible spectrum of data presentation should be considered. Both the proportion of persons who are likely to benefit and the proportion of persons who are unlikely to benefit or likely to be harmed should be presented with equal emphasis. Instruments to judge the quality of printed or online consumer health information do not include rating the framing of outcome data (e.g. http://www.discern.org.uk). In order to establish an online system of evidence-based consumer health information that provides unbiased evidence-based communication of outcome data, mammography screening was used as a model. After screening the literature according to evidence-based medicine criteria, the information on benefits and risks of mammography screening was compiled. Results are communicated as simple, self-explanatory illustrations as well as original numbers, equally emphasising the various aspects of the outcome. In addition, unbiased information is provided on the test efficacy of mammography screening (false positive, false negative results), on other potential side effects or other beneficial effects of mammography screening such as the number of diagnostic surgical interventions following mammography or the psychological sequelae thereof, data on total mortality and precision or lack of precision of results. The described mammography screening consumer information system is being evaluated with experts and the target consumer population with the final goal of an online evidence-based consumer health information system.
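A worked example of the framing arithmetic discussed above, with illustrative round numbers rather than mammography screening data:

```python
# Illustrative numbers only (not mammography screening data).
risk_without = 0.004   # risk of the outcome without the intervention
risk_with = 0.003      # risk of the outcome with the intervention

rrr = (risk_without - risk_with) / risk_without  # relative risk reduction: 25%
arr = risk_without - risk_with                   # absolute risk reduction: 0.1 percentage points
nns = 1.0 / arr                                  # number needed to screen: 1000

print(f"RRR = {rrr:.0%}, ARR = {arr:.2%}, NNS = {nns:.0f}")
print(f"Positive framing: {1 - risk_with:.1%} vs {1 - risk_without:.1%} remain event-free")
```

The same effect can thus be framed as a 25% relative reduction, a 0.1 percentage point absolute reduction, 1000 people needed to screen, or a comparison of the proportions who remain event-free.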
12 CFR 324.152 - Simple risk weight approach (SRWA).
Code of Federal Regulations, 2014 CFR
2014-01-01
... (that is, between zero and -1), then E equals the absolute value of RVC. If RVC is negative and less... the lowest applicable risk weight in this section. (1) Zero percent risk weight equity exposures. An....131(d)(2) is assigned a zero percent risk weight. (2) 20 percent risk weight equity exposures. An...
Sex as a Biological Variable: Who, What, When, Why, and How
Bale, Tracy L; Epperson, C Neill
2017-01-01
The inclusion of sex as a biological variable in research is absolutely essential for improving our understanding of disease mechanisms contributing to risk and resilience. Studies focusing on examining sex differences have demonstrated across many levels of analyses and stages of brain development and maturation that males and females can differ significantly. This review will discuss examples of animal models and clinical studies to provide guidance and reference for the inclusion of sex as an important biological variable relevant to a Neuropsychopharmacology audience. PMID:27658485
Past Decline Versus Current eGFR and Subsequent ESRD Risk.
Kovesdy, Csaba P; Coresh, Josef; Ballew, Shoshana H; Woodward, Mark; Levin, Adeera; Naimark, David M J; Nally, Joseph; Rothenbacher, Dietrich; Stengel, Benedicte; Iseki, Kunitoshi; Matsushita, Kunihiro; Levey, Andrew S
2016-08-01
eGFR is a robust predictor of ESRD risk. However, the prognostic information gained from the past trajectory (slope) beyond that of the current eGFR is unclear. We examined 22 cohorts to determine the association of past slopes and current eGFR level with subsequent ESRD. We modeled hazard ratios as a spline function of slopes, adjusting for demographic variables, eGFR, and comorbidities. We used random effects meta-analyses to combine results across studies stratified by cohort type. We calculated the absolute risk of ESRD at 5 years after the last eGFR using the weighted average baseline risk. Overall, 1,080,223 participants experienced 5163 ESRD events during a mean follow-up of 2.0 years. In CKD cohorts, a slope of -6 versus 0 ml/min per 1.73 m² per year over the previous 3 years (a decline of 18 ml/min per 1.73 m² versus no decline) associated with an adjusted hazard ratio of ESRD of 2.28 (95% confidence interval, 1.88 to 2.76). In contrast, a current eGFR of 30 versus 50 ml/min per 1.73 m² (a difference of 20 ml/min per 1.73 m²) associated with an adjusted hazard ratio of 19.9 (95% confidence interval, 13.6 to 29.1). Past decline contributed more to the absolute risk of ESRD at lower than higher levels of current eGFR. In conclusion, during a follow-up of 2 years, current eGFR associates more strongly with future ESRD risk than the magnitude of past eGFR decline, but both contribute substantially to the risk of ESRD, especially at eGFR <30 ml/min per 1.73 m². Copyright © 2016 by the American Society of Nephrology.
BCRA is an R package that projects absolute risk of invasive breast cancer according to NCI’s Breast Cancer Risk Assessment Tool (BCRAT) algorithm for specified race/ethnic groups and age intervals.
Bright, Chloe J; Hawkins, Mike M; Guha, Joyeeta; Henson, Katherine E; Winter, David L; Kelly, Julie S; Feltbower, Richard G; Hall, Marlous; Cutter, David J; Edgar, Angela B; Frobisher, Clare; Reulen, Raoul C
2017-03-28
Survivors of teenage and young adult cancer are at risk of cerebrovascular events, but the magnitude of and extent to which this risk varies by cancer type, decade of diagnosis, age at diagnosis, and attained age remains uncertain. This is the largest-ever cohort study to evaluate the risks of hospitalization for a cerebrovascular event among long-term survivors of teenage and young adult cancer. The population-based TYACSS (Teenage and Young Adult Cancer Survivor Study) (N=178,962) was linked to Hospital Episode Statistics data for England to investigate the risks of hospitalization for a cerebrovascular event among 5-year survivors of cancer diagnosed when 15 to 39 years of age. Observed numbers of first hospitalizations for cerebrovascular events were compared with that expected from the general population using standardized hospitalization ratios (SHRs) and absolute excess risks per 10,000 person-years. Cumulative incidence was calculated with death considered a competing risk. Overall, 2782 cancer survivors were hospitalized for a cerebrovascular event, 40% higher than expected (SHR=1.4, 95% confidence interval, 1.3-1.4). Survivors of central nervous system (CNS) tumors (SHR=4.6, 95% confidence interval, 4.3-5.0), head and neck tumors (SHR=2.6, 95% confidence interval, 2.2-3.1), and leukemia (SHR=2.5, 95% confidence interval, 1.9-3.1) were at greatest risk. Males had significantly higher absolute excess risks than females (absolute excess risks = 7 versus 3), especially among head and neck tumor survivors (absolute excess risks = 30 versus 11). By 60 years of age, 9%, 6%, and 5% of CNS tumor, head and neck tumor, and leukemia survivors, respectively, had been hospitalized for a cerebrovascular event. Beyond 60 years of age, every year, 0.4% of CNS tumor survivors were hospitalized for a cerebral infarction (versus 0.1% expected), whereas at any age, every year, 0.2% of head and neck tumor survivors were hospitalized for a cerebral infarction (versus 0.06% expected). Survivors of a CNS tumor, head and neck tumor, and leukemia are particularly at risk of hospitalization for a cerebrovascular event. The excess risk of cerebral infarction among CNS tumor survivors increases with attained age. For head and neck tumor survivors, this excess risk remains high across all ages. These groups of survivors, particularly males, should be considered for surveillance of cerebrovascular risk factors and potential pharmacological interventions for cerebral infarction prevention. © 2017 American Heart Association, Inc.
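The two summary measures used above reduce to observed-versus-expected arithmetic. The sketch below uses the observed count from the abstract but an invented expected count and person-time, so the printed values are illustrative only.

```python
# Observed-versus-expected arithmetic behind the SHR and the absolute excess
# risk. The observed count is from the abstract; the expected count and
# person-years are invented placeholders.
observed, expected, person_years = 2782, 1990, 2_500_000

shr = observed / expected                            # standardized hospitalization ratio
aer = (observed - expected) / person_years * 10_000  # absolute excess risk
print(f"SHR = {shr:.1f}, AER = {aer:.1f} per 10,000 person-years")
```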
The Discipline of Asset Allocation.
ERIC Educational Resources Information Center
Petzel, Todd E.
2000-01-01
Discussion of asset allocation for college/university endowment funds focuses on three levels of risk: (1) the absolute risk of the portfolio (usually leading to asset diversification); (2) the benchmark risk (usually comparison with peer institutions); and (3) personal career risk (which may incline managers toward maximizing short-term returns,…
Teepen, Jop C; van Leeuwen, Flora E; Tissing, Wim J; van Dulmen-den Broeder, Eline; van den Heuvel-Eibrink, Marry M; van der Pal, Helena J; Loonen, Jacqueline J; Bresters, Dorine; Versluys, Birgitta; Neggers, Sebastian J C M M; Jaspers, Monique W M; Hauptmann, Michael; van der Heiden-van der Loo, Margriet; Visser, Otto; Kremer, Leontien C M; Ronckers, Cécile M
2017-07-10
Purpose Childhood cancer survivors (CCSs) are at increased risk for subsequent malignant neoplasms (SMNs). We evaluated the long-term risk of SMNs in a well-characterized cohort of 5-year CCSs, with a particular focus on individual chemotherapeutic agents and solid cancer risk. Methods The Dutch Childhood Cancer Oncology Group-Long-Term Effects After Childhood Cancer cohort includes 6,165 5-year CCSs diagnosed between 1963 and 2001 in the Netherlands. SMNs were identified by linkages with the Netherlands Cancer Registry, the Dutch Pathology Registry, and medical chart review. We calculated standardized incidence ratios, excess absolute risks, and cumulative incidences. Multivariable Cox proportional hazard regression analyses were used to evaluate treatment-associated risks for breast cancer, sarcoma, and all solid cancers. Results After a median follow-up of 20.7 years (range, 5.0 to 49.8 years) since first diagnosis, 291 SMNs were ascertained in 261 CCSs (standardized incidence ratio, 5.2; 95% CI, 4.6 to 5.8; excess absolute risk, 20.3/10,000 person-years). Cumulative SMN incidence at 25 years after first diagnosis was 3.9% (95% CI, 3.4% to 4.6%) and did not change noticeably among CCSs treated in the 1990s compared with those treated earlier. We found dose-dependent doxorubicin-related increased risks of all solid cancers (P trend < .001) and breast cancer (P trend < .001). The doxorubicin-breast cancer dose response was stronger in survivors of Li-Fraumeni syndrome-associated childhood cancers (leukemia, CNS, and non-Ewing sarcoma) versus survivors of other cancers (P difference = .008). In addition, cyclophosphamide was found to increase sarcoma risk in a dose-dependent manner (P trend = .01). Conclusion The results strongly suggest that doxorubicin exposure in CCSs increases the risk of subsequent solid cancers and breast cancer, whereas cyclophosphamide exposure increases the risk of subsequent sarcomas. These results may inform future childhood cancer treatment protocols and SMN surveillance guidelines for CCSs.
Risk factors for breast cancer in postmenopausal Caucasian and Chinese-Canadian women.
Tam, Carolyn Y; Martin, Lisa J; Hislop, Gregory; Hanley, Anthony J; Minkin, Salomon; Boyd, Norman F
2010-01-01
Striking differences exist between countries in the incidence of breast cancer. The causes of these differences are unknown, but because incidence rates change in migrants, they are thought to be due to lifestyle rather than genetic differences. The goal of this cross-sectional study was to examine breast cancer risk factors in populations with different risks for breast cancer. We compared breast cancer risk factors among three groups of postmenopausal Canadian women at substantially different risk of developing breast cancer - Caucasians (N = 413), Chinese women born in the West or who migrated to the West before age 21 (N = 216), and recent Chinese migrants (N = 421). Information on risk factors and dietary acculturation were collected by telephone interviews using questionnaires, and anthropometric measurements were taken at a home visit. Compared to Caucasians, recent Chinese migrants weighed on average 14 kg less, were 6 cm shorter, had menarche a year later, were more often parous, less often had a family history of breast cancer or a benign breast biopsy, a higher Chinese dietary score, and a lower Western dietary score. For most of these variables, Western born Chinese and early Chinese migrants had values intermediate between those of Caucasians and recent Chinese migrants. We estimated five-year absolute risks for breast cancer using the Gail Model and found that risk estimates in Caucasians would be reduced by only 11% if they had the risk factor profile of recent Chinese migrants for the risk factors in the Gail Model. Our results suggest that in addition to the risk factors in the Gail Model, there likely are other factors that also contribute to the large difference in breast cancer risk between Canada and China.
Cumulative incidence of cancer after solid organ transplantation.
Hall, Erin C; Pfeiffer, Ruth M; Segev, Dorry L; Engels, Eric A
2013-06-15
Solid organ transplantation recipients have elevated cancer incidence. Estimates of absolute cancer risk after transplantation can inform prevention and screening. The Transplant Cancer Match Study links the US transplantation registry with 14 state/regional cancer registries. The authors used nonparametric competing risk methods to estimate the cumulative incidence of cancer after transplantation for 2 periods (1987-1999 and 2000-2008). For recipients from 2000 to 2008, the 5-year cumulative incidence, stratified by organ, sex, and age at transplantation, was estimated for 6 preventable or screen-detectable cancers. For comparison, the 5-year cumulative incidence was calculated for the same cancers in the general population at representative ages using Surveillance, Epidemiology, and End Results data. Among 164,156 recipients, 8520 incident cancers were identified. The absolute cancer risk was slightly higher for recipients during the period from 2000 to 2008 than during the period from 1987 to 1999 (5-year cumulative incidence: 4.4% vs. 4.2%; P = .006); this difference arose from the decreasing risk of competing events (5-year cumulative incidence of death, graft failure, or retransplantation: 26.6% vs. 31.9%; P < .001). From 2000 to 2008, the 5-year cumulative incidence of non-Hodgkin lymphoma was highest at extremes of age, especially in thoracic organ recipients (ages 0-34 years: range, 1.74%-3.28%; aged >50 years: range, 0.36%-2.22%). For recipients aged >50 years, the 5-year cumulative incidence was higher for colorectal cancer (range, 0.33%-1.94%) than for the general population at the recommended screening age (aged 50 years: range, 0.25%-0.33%). For recipients aged >50 years, the 5-year cumulative incidence was high for lung cancer among thoracic organ recipients (range, 1.16%-3.87%) and for kidney cancer among kidney recipients (range, 0.53%-0.84%). The 5-year cumulative incidence for prostate cancer and breast cancer was similar or lower in transplantation recipients than at the recommended ages of screening in the general population. Subgroups of transplantation recipients have a high absolute risk of some cancers and may benefit from targeted prevention or screening. Copyright © 2013 American Cancer Society.
Walsh, Linda; Schneider, Uwe
2013-03-01
Radiation-related risks of cancer can be transported from one population to another population at risk, for the purpose of calculating lifetime risks from radiation exposure. Transfer via excess relative risks (ERR) or excess absolute risks (EAR) or a mixture of both (i.e., from the life span study (LSS) of Japanese atomic bomb survivors) has been done in the past based on qualitative weighting. Consequently, the values of the weights applied and the method of application of the weights (i.e., as additive or geometric weighted means) have varied both between reports produced at different times by the same regulatory body and also between reports produced at similar times by different regulatory bodies. Since the gender and age patterns are often markedly different between EAR and ERR models, it is useful to have an evidence-based method for determining the relative goodness of fit of such models to the data. This paper identifies a method, using Akaike model weights, which could aid expert judgment and be applied to help to achieve consistency of approach and quantitative evidence-based results in future health risk assessments. The results of applying this method to recent LSS cancer incidence models are that the relative EAR weighting by solid cancer site, on a scale of 0-1, is zero for breast and colon, 0.02 for all solid, 0.03 for lung, 0.08 for liver, 0.15 for thyroid, 0.18 for bladder and 0.93 for stomach. The EAR weighting for female breast cancer increases from 0 to 0.3, if a generally observed change in the trend between female age-specific breast cancer incidence rates and attained age, associated with menopause, is accounted for in the EAR model. Application of this method to preferred models from a study of multi-model inference from many models fitted to the LSS leukemia mortality data, results in an EAR weighting of 0. From these results it can be seen that lifetime risk transfer is most highly weighted by EAR only for stomach cancer. However, the generalization and interpretation of radiation effect estimates based on the LSS cancer data, when projected to other populations, are particularly uncertain if considerable differences exist between site-specific baseline rates in the LSS and the other populations of interest. Definitive conclusions, regarding the appropriate method for transporting cancer risks, are limited by a lack of knowledge in several areas including unknown factors and uncertainties in biological mechanisms and genetic and environmental risk factors for carcinogenesis; uncertainties in radiation dosimetry; and insufficient statistical power and/or incomplete follow-up in data from radio-epidemiological studies.
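The Akaike model weights referred to above have a standard closed form; a minimal sketch (the AIC values are invented for illustration):

```python
import numpy as np

def akaike_weights(aic):
    """Akaike model weights: w_i = exp(-delta_i / 2) / sum_j exp(-delta_j / 2),
    where delta_i = AIC_i - min(AIC)."""
    aic = np.asarray(aic, dtype=float)
    delta = aic - aic.min()
    w = np.exp(-0.5 * delta)
    return w / w.sum()

# Hypothetical AIC values for an EAR and an ERR model fitted to the same data;
# the first weight plays the role of the site-specific EAR weighting above.
print(akaike_weights([1012.3, 1008.9]))  # [w_EAR, w_ERR]
```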
Socioeconomic inequalities in cause specific mortality among older people in France.
Menvielle, Gwenn; Leclerc, Annette; Chastang, Jean-François; Luce, Danièle
2010-05-19
European comparative studies documented a clear North-South divide in socioeconomic inequalities with cancer being the most important contributor to inequalities in total mortality among middle aged men in Latin Europe (France, Spain, Portugal, Italy). The aim of this paper is to investigate educational inequalities in mortality by gender, age and causes of death in France, with a special emphasis on people aged 75 years and more. We used data from a longitudinal population sample that includes 1% of the French population. Risk of death (total and cause specific) in the period 1990-1999 according to education was analysed using Cox regression models by age group (45-59, 60-74, and 75+). Inequalities were quantified using both relative (ratio) and absolute (difference) measures. Relative inequalities decreased with age but were still observed in the oldest age group. Absolute inequalities increased with age. This increase was particularly pronounced for cardiovascular diseases. The contribution of different causes of death to absolute inequalities in total mortality differed between age groups. In particular, the contribution of cancer deaths decreased substantially between the age groups 60-74 years and 75 years and more, both in men and in women. This study suggests that the large contribution of cancer deaths to the excess mortality among low educated people that was observed among middle aged men in Latin Europe is not observed among French people aged 75 years and more. This should be confirmed among other Latin Europe countries.
Desai, Rishi J; Huybrechts, Krista F; Hernandez-Diaz, Sonia; Mogun, Helen; Patorno, Elisabetta; Kaltenbach, Karol; Kerzner, Leslie S; Bateman, Brian T
2015-05-14
To provide absolute and relative risk estimates of neonatal abstinence syndrome (NAS) based on duration and timing of prescription opioid use during pregnancy in the presence or absence of additional NAS risk factors of history of opioid misuse or dependence, misuse of other substances, non-opioid psychotropic drug use, and smoking. Observational cohort study. Medicaid data from 46 US states. Pregnant women filling at least one prescription for an opioid analgesic at any time during pregnancy for whom opioid exposure characteristics including duration of therapy: short term (<30 days) or long term (≥ 30 days); timing of use: early use (only in the first two trimesters) or late use (extending into the third trimester); and cumulative dose (in morphine equivalent milligrams) were assessed. Diagnosis of NAS in liveborn infants. 1705 cases of NAS were identified among 290,605 pregnant women filling opioid prescriptions, corresponding to an absolute risk of 5.9 per 1000 deliveries (95% confidence interval 5.6 to 6.2). Long term opioid use during pregnancy resulted in higher absolute risk of NAS per 1000 deliveries in the presence of additional risk factors of known opioid misuse (220.2 (200.8 to 241.0)), alcohol or other drug misuse (30.8 (26.1 to 36.0)), exposure to other psychotropic medications (13.1 (10.6 to 16.1)), and smoking (6.6 (4.3 to 9.6)) than in the absence of any of these risk factors (4.2 (3.3 to 5.4)). The corresponding risk estimates for short term use were 192.0 (175.8 to 209.3), 7.0 (6.0 to 8.2), 2.0 (1.5 to 2.6), 1.5 (1.0 to 2.0), and 0.7 (0.6 to 0.8) per 1000 deliveries, respectively. In propensity score matched analyses, long term prescription opioid use compared with short term use and late use compared with early use in pregnancy demonstrated greater risk of NAS (risk ratios 2.05 (95% confidence interval 1.81 to 2.33) and 1.24 (1.12 to 1.38), respectively). Use of prescription opioids during pregnancy is associated with a low absolute risk of NAS in the absence of additional risk factors. Long term use compared with short term use and late use compared with early use of prescription opioids are associated with increased NAS risk independent of additional risk factors. © Desai et al 2015.
Rivadeneira, Fernando; Zillikens, M Carola; De Laet, Chris Edh; Hofman, Albert; Uitterlinden, André G; Beck, Thomas J; Pols, Huibert Ap
2007-11-01
We studied HSA measurements in relation to hip fracture risk in 4,806 individuals (2,740 women). Hip fractures (n = 147) occurred at the same absolute levels of bone instability in both sexes. Cortical instability (propensity of thinner cortices in wide diameters to buckle) explains why hip fracture risk at different BMD levels is the same across sexes. Despite the sexual dimorphism of bone, hip fracture risk is very similar in men and women at the same absolute BMD. We aimed to elucidate the main structural properties of bone that underlie the measured BMD and that ultimately determine the risk of hip fracture in elderly men and women. This study is part of the Rotterdam Study (a large prospective population-based cohort) and included 147 incident hip fracture cases in 4,806 participants with DXA-derived hip structural analysis (mean follow-up, 8.6 yr). Indices compared in relation to fracture included neck width, cortical thickness, section modulus (an index of bending strength), and buckling ratio (an index of cortical bone instability). We used a mathematical model to calculate the hip fracture distribution by femoral neck BMD, BMC, bone area, and hip structure analysis (HSA) parameters (cortical thickness, section modulus, narrow neck width, and buckling ratio) and compared it with prospective data from the Rotterdam Study. In the prospective data, hip fracture cases in both sexes had lower BMD, thinner cortices, greater bone width, lower strength, and higher instability at baseline. In fractured individuals, men had an average BMD that was 0.09 g/cm² higher than women (p < 0.00001), whereas no significant difference in buckling ratios was seen. Modeled fracture distributions by BMD and buckling ratio levels were in concordance with the prospective data and showed that hip fractures seem to occur at the same absolute levels of bone instability (buckling ratio) in both men and women. No significant differences were observed between the areas under the ROC curves of BMD (0.8146 in women and 0.8048 in men) and the buckling ratio (0.8161 in women and 0.7759 in men). The buckling ratio (an index of bone instability) portrays in both sexes the critical balance between cortical thickness and bone width. Our findings suggest that extreme thinning of cortices in expanded bones plays a key role in local susceptibility to fracture. Even though the buckling ratio does not offer additional predictive value, these findings improve our understanding of why low BMD is a good predictor of fragility fractures.
Effects of extended-release niacin with laropiprant in high-risk patients.
Landray, Martin J; Haynes, Richard; Hopewell, Jemma C; Parish, Sarah; Aung, Theingi; Tomson, Joseph; Wallendszus, Karl; Craig, Martin; Jiang, Lixin; Collins, Rory; Armitage, Jane
2014-07-17
Patients with evidence of vascular disease are at increased risk for subsequent vascular events despite effective use of statins to lower the low-density lipoprotein (LDL) cholesterol level. Niacin lowers the LDL cholesterol level and raises the high-density lipoprotein (HDL) cholesterol level, but its clinical efficacy and safety are uncertain. After a prerandomization run-in phase to standardize the background statin-based LDL cholesterol-lowering therapy and to establish participants' ability to take extended-release niacin without clinically significant adverse effects, we randomly assigned 25,673 adults with vascular disease to receive 2 g of extended-release niacin and 40 mg of laropiprant or a matching placebo daily. The primary outcome was the first major vascular event (nonfatal myocardial infarction, death from coronary causes, stroke, or arterial revascularization). During a median follow-up period of 3.9 years, participants who were assigned to extended-release niacin-laropiprant had an LDL cholesterol level that was an average of 10 mg per deciliter (0.25 mmol per liter as measured in the central laboratory) lower and an HDL cholesterol level that was an average of 6 mg per deciliter (0.16 mmol per liter) higher than the levels in those assigned to placebo. Assignment to niacin-laropiprant, as compared with assignment to placebo, had no significant effect on the incidence of major vascular events (13.2% and 13.7% of participants with an event, respectively; rate ratio, 0.96; 95% confidence interval [CI], 0.90 to 1.03; P=0.29). Niacin-laropiprant was associated with an increased incidence of disturbances in diabetes control that were considered to be serious (absolute excess as compared with placebo, 3.7 percentage points; P<0.001) and with an increased incidence of diabetes diagnoses (absolute excess, 1.3 percentage points; P<0.001), as well as increases in serious adverse events associated with the gastrointestinal system (absolute excess, 1.0 percentage point; P<0.001), musculoskeletal system (absolute excess, 0.7 percentage points; P<0.001), skin (absolute excess, 0.3 percentage points; P=0.003), and unexpectedly, infection (absolute excess, 1.4 percentage points; P<0.001) and bleeding (absolute excess, 0.7 percentage points; P<0.001). Among participants with atherosclerotic vascular disease, the addition of extended-release niacin-laropiprant to statin-based LDL cholesterol-lowering therapy did not significantly reduce the risk of major vascular events but did increase the risk of serious adverse events. (Funded by Merck and others; HPS2-THRIVE ClinicalTrials.gov number, NCT00461630.).
Risk-taking behavior in the presence of nonconvex asset dynamics.
Lybbert, Travis J; Barrett, Christopher B
2011-01-01
The growing literature on poverty traps emphasizes the links between multiple equilibria and risk avoidance. However, multiple equilibria may also foster risk-taking behavior by some poor people. We illustrate this idea with a simple analytical model in which people with different wealth and ability endowments make investment and risky activity choices in the presence of known nonconvex asset dynamics. This model underscores a crucial distinction between familiar static concepts of risk aversion and forward-looking dynamic risk responses to nonconvex asset dynamics. Even when unobservable preferences exhibit decreasing absolute risk aversion, observed behavior may suggest that risk aversion actually increases with wealth near perceived dynamic asset thresholds. Although high ability individuals are not immune from poverty traps, they can leverage their capital endowments more effectively than lower ability types and are therefore less likely to take seemingly excessive risks. In general, linkages between behavioral responses and wealth dynamics often seem to run in both directions. Both theoretical and empirical poverty trap research could benefit from making this two-way linkage more explicit.
Mok, Pearl L H; Antonsen, Sussie; Pedersen, Carsten Bøcker; Appleby, Louis; Shaw, Jenny; Webb, Roger T
2015-09-19
Psychiatric illness, substance misuse, suicidality, criminality and premature death represent major public health challenges that afflict a sizeable proportion of young people. However, studies of multiple adverse outcomes in the same cohort at risk are rare. In a national Danish cohort we estimated sex- and age-specific incidence rates and absolute risks of these outcomes between adolescence and early middle age. Using interlinked registers, persons born in Denmark 1966-1996 were followed from their 15th until 40th birthday or December 2011 (N = 2,070,904). We estimated sex- and age-specific incidence rates of nine adverse outcomes, in three main categories: Premature mortality (all-causes, suicide, accident); Psychiatric morbidity (any mental illness diagnosis, suicide attempt, alcohol or drug misuse disorder); Criminality (violent offending, receiving custodial sentence, driving under influence of alcohol or drugs). Cumulative incidences were also calculated using competing risk survival analyses. For cohort members alive on their 15th birthday, the absolute risks of dying by age 40 were 1.99% for males [95% confidence interval (CI) 1.95-2.03%] and 0.85% for females (95% CI 0.83-0.88%). The risks of substance misuse and criminality were also much higher for males, especially younger males, than for females. Specifically, the risk of a first conviction for a violent offence was highest amongst males aged below 20. Females, however, were more likely than males to have a hospital-treated psychiatric disorder. By age 40, 13.25% of females (95% CI 13.16-13.33%) and 9.98% of males (95% CI 9.91-10.06%) had been treated. Women aged below 25 were also more likely than men to first attempt suicide, but this pattern was reversed beyond this age. The greatest gender differentials in incidence rates were in criminality outcomes. This is the first comprehensive assessment of the incidence rates and absolute risks of these multiple adverse outcomes. Approximately 1 in 50 males and 1 in 120 females who are alive on their 15th birthday will die by age 40. By examining the same cohort at risk, we compared risks for multiple outcomes without differential inter-cohort biases. These epidemiological profiles will inform further research into the pathways leading to these adverse events and future preventive strategies.
Perceptions of health risks of cigarette smoking: A new measure reveals widespread misunderstanding
Krosnick, Jon A.; Malhotra, Neil; Bruera, Eduardo F.; Chang, LinChiat; Pasek, Josh; Thomas, Randall K.
2017-01-01
Most Americans recognize that smoking causes serious diseases, yet many Americans continue to smoke. One possible explanation for this paradox is that perhaps Americans do not accurately perceive the extent to which smoking increases the probability of adverse health outcomes. This paper examines the accuracy of Americans’ perceptions of the absolute risk, attributable risk, and relative risk of lung cancer, and assesses which of these beliefs drive Americans’ smoking behavior. Using data from three national surveys, statistical analyses were performed by comparing means, medians, and distributions, and by employing Generalized Additive Models. Perceptions of relative risk were associated as expected with smoking onset and smoking cessation, whereas perceptions of absolute risk and attributable risk were not. Additionally, the relation of relative risk with smoking status was stronger among people who held their risk perceptions with more certainty. Most current smokers, former smokers, and never-smokers considerably underestimated the relative risk of smoking. If, as this paper suggests, people naturally think about the health consequences of smoking in terms of relative risk, smoking rates might be reduced if public understanding of the relative risks of smoking were more accurate and people held those beliefs with more confidence. PMID:28806420
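The three risk concepts compared above differ only in how a conditional probability is reported; a small illustration with round lifetime lung cancer figures (not the survey's estimates):

```python
# Round illustrative lifetime lung cancer figures (not the survey's estimates).
risk_smokers = 0.15
risk_never_smokers = 0.01

absolute_risk = risk_smokers                           # P(lung cancer | smoker)
attributable_risk = risk_smokers - risk_never_smokers  # excess risk due to smoking
relative_risk = risk_smokers / risk_never_smokers      # how many times higher

print(f"{absolute_risk:.2f} {attributable_risk:.2f} {relative_risk:.1f}")  # 0.15 0.14 15.0
```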
The iCARE R Package allows researchers to quickly build models for absolute risk, and apply them to estimate an individual's risk of developing disease during a specified time interval, based on a set of user-defined input parameters.
Lorazepam vs diazepam for pediatric status epilepticus: a randomized clinical trial.
Chamberlain, James M; Okada, Pamela; Holsti, Maija; Mahajan, Prashant; Brown, Kathleen M; Vance, Cheryl; Gonzalez, Victor; Lichenstein, Richard; Stanley, Rachel; Brousseau, David C; Grubenhoff, Joseph; Zemek, Roger; Johnson, David W; Clemons, Traci E; Baren, Jill
Benzodiazepines are considered first-line therapy for pediatric status epilepticus. Some studies suggest that lorazepam may be more effective or safer than diazepam, but lorazepam is not Food and Drug Administration approved for this indication. To test the hypothesis that lorazepam has better efficacy and safety than diazepam for treating pediatric status epilepticus. This double-blind, randomized clinical trial was conducted from March 1, 2008, to March 14, 2012. Patients aged 3 months to younger than 18 years with convulsive status epilepticus presenting to 1 of 11 US academic pediatric emergency departments were eligible. There were 273 patients; 140 randomized to diazepam and 133 to lorazepam. Patients received either 0.2 mg/kg of diazepam or 0.1 mg/kg of lorazepam intravenously, with half this dose repeated at 5 minutes if necessary. If status epilepticus continued at 12 minutes, fosphenytoin was administered. The primary efficacy outcome was cessation of status epilepticus by 10 minutes without recurrence within 30 minutes. The primary safety outcome was the performance of assisted ventilation. Secondary outcomes included rates of seizure recurrence and sedation and times to cessation of status epilepticus and return to baseline mental status. Outcomes were measured 4 hours after study medication administration. Cessation of status epilepticus for 10 minutes without recurrence within 30 minutes occurred in 101 of 140 (72.1%) in the diazepam group and 97 of 133 (72.9%) in the lorazepam group, with an absolute efficacy difference of 0.8% (95% CI, -11.4% to 9.8%). Twenty-six patients in each group required assisted ventilation (16.0% given diazepam and 17.6% given lorazepam; absolute risk difference, 1.6%; 95% CI, -9.9% to 6.8%). There were no statistically significant differences in secondary outcomes except that lorazepam patients were more likely to be sedated (66.9% vs 50%, respectively; absolute risk difference, 16.9%; 95% CI, 6.1% to 27.7%). Among pediatric patients with convulsive status epilepticus, treatment with lorazepam did not result in improved efficacy or safety compared with diazepam. These findings do not support the preferential use of lorazepam for this condition. clinicaltrials.gov Identifier: NCT00621478.
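The absolute risk differences quoted above can be recomputed from the raw counts; below is a simple Wald-interval sketch (the trial's own interval method may differ, so the numbers will not match exactly).

```python
import math

def risk_difference_ci(events_a, n_a, events_b, n_b, z=1.96):
    """Absolute risk difference (arm A minus arm B) with a Wald 95% CI;
    a simple approximation, so it will not exactly match the trial's CI."""
    p_a, p_b = events_a / n_a, events_b / n_b
    rd = p_a - p_b
    se = math.sqrt(p_a * (1 - p_a) / n_a + p_b * (1 - p_b) / n_b)
    return rd, rd - z * se, rd + z * se

# Primary efficacy outcome reported above: 97/133 lorazepam vs 101/140 diazepam.
print(risk_difference_ci(97, 133, 101, 140))
```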
Evaluation of mean-monthly streamflow-regression equations for Colorado, 2014
Kohn, Michael S.; Stevens, Michael R.; Bock, Andrew R.; Char, Stephen J.
2015-01-01
The median absolute differences between the observed and computed mean-monthly streamflow for Mountain, Northwest, and Southwest hydrologic regions are fairly uniform throughout the year, with the exception of late summer and early fall (July, August, and September), when each hydrologic region exhibits a substantial increase in median absolute percent difference. The greatest difference occurs in the Northwest hydrologic region, and the smallest difference occurs in the Mountain hydrologic region. The Rio Grande hydrologic region shows seasonal variation in median absolute percent difference with March, April, August, and September having a median absolute difference near or below 40 percent, and the remaining months of the year having a median absolute difference near or above 50 percent. In the Mountain, Northwest, and Southwest hydrologic regions, the mean-monthly streamflow equations perform the best during spring (March, April, and May). However, in the Rio Grande hydrologic region, the mean-monthly streamflow equations perform the best during late summer and early fall (August and September).
Clearing margin system in the futures markets—Applying the value-at-risk model to Taiwanese data
NASA Astrophysics Data System (ADS)
Chiu, Chien-Liang; Chiang, Shu-Mei; Hung, Jui-Cheng; Chen, Yu-Lung
2006-07-01
This article investigates whether the TAIFEX has an adequate clearing margin adjustment system, using the unconditional coverage test, the conditional coverage test, and the mean relative scaled bias to assess the performance of three value-at-risk (VaR) models (the TAIFEX, RiskMetrics and GARCH-t). For the same model, original and absolute returns are compared to explore which more accurately captures the true risk. For the same return, daily and tiered adjustment methods are examined to evaluate which corresponds best to risk. The results indicate that the clearing margin adjustment of the TAIFEX cannot reflect true risks. The adjustment rules, including the use of absolute returns and tiered adjustment of the clearing margin, have distorted VaR-based margin requirements. In addition, the results suggest that the TAIFEX should use original returns to compute VaR and a daily adjustment system to set the clearing margin. This approach would improve the efficiency of funds operation and the liquidity of the futures markets.
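For readers unfamiliar with the backtests named above, the sketch below implements the unconditional coverage (Kupiec) likelihood-ratio test; the violation counts are illustrative and are not taken from the TAIFEX data analysed in the paper:

```python
import numpy as np
from scipy.stats import chi2

def kupiec_uc_test(violations, n, p):
    """Unconditional coverage LR test for a VaR backtest (assumes 0 < violations < n).

    violations: days on which the loss exceeded the reported VaR
    n: number of backtest days
    p: nominal violation probability (e.g. 0.01 for a 99% VaR)
    """
    x, pi_hat = violations, violations / n
    ll_null = (n - x) * np.log(1 - p) + x * np.log(p)            # nominal coverage
    ll_alt = (n - x) * np.log(1 - pi_hat) + x * np.log(pi_hat)   # observed coverage
    lr = -2.0 * (ll_null - ll_alt)
    return lr, chi2.sf(lr, df=1)

# Illustrative backtest: 16 violations in 1000 days against a 99% VaR.
lr, pval = kupiec_uc_test(16, 1000, 0.01)
print(f"LR = {lr:.2f}, p-value = {pval:.3f}")   # p < 0.05 would reject adequate coverage
```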
Hawken, Steven; Kwong, Jeffrey C.; Deeks, Shelley L.; Crowcroft, Natasha S.; McGeer, Allison J.; Ducharme, Robin; Campitelli, Michael A.; Coyle, Doug
2015-01-01
It is unclear whether seasonal influenza vaccination results in a net increase or decrease in the risk for Guillain-Barré syndrome (GBS). To assess the effect of seasonal influenza vaccination on the absolute risk of acquiring GBS, we used simulation models and published estimates of age- and sex-specific risks for GBS, influenza incidence, and vaccine effectiveness. For a hypothetical 45-year-old woman and 75-year-old man, excess GBS risk for influenza vaccination versus no vaccination was −0.36/1 million vaccinations (95% credible interval −1.22 to 0.28) and −0.42/1 million vaccinations (95% credible interval −3.68 to 2.44), respectively. These numbers represent a small absolute reduction in GBS risk with vaccination. Under typical conditions (e.g., influenza incidence rates >5% and vaccine effectiveness >60%), vaccination reduced GBS risk. These findings should strengthen confidence in the safety of influenza vaccine and allow health professionals to better put GBS risk in context when discussing influenza vaccination with patients. PMID:25625590
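A minimal Monte Carlo sketch of the trade-off being simulated, with entirely hypothetical parameter distributions rather than the paper's published estimates: the excess risk is the vaccine-attributable GBS risk minus the influenza-attributable GBS risk averted by vaccination.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000  # simulation draws

# Hypothetical per-million risks and rates, drawn with some uncertainty.
gbs_per_million_vaccinations = rng.normal(1.0, 0.3, n)      # vaccine-attributable GBS
gbs_per_million_influenza_cases = rng.normal(17.0, 5.0, n)  # influenza-attributable GBS
influenza_attack_rate = rng.uniform(0.05, 0.10, n)          # seasonal incidence
vaccine_effectiveness = rng.uniform(0.40, 0.70, n)

averted = influenza_attack_rate * vaccine_effectiveness * gbs_per_million_influenza_cases
excess = gbs_per_million_vaccinations - averted             # per million vaccinations

lo, med, hi = np.percentile(excess, [2.5, 50, 97.5])
print(f"excess GBS risk per million vaccinations: {med:.2f} ({lo:.2f} to {hi:.2f})")
```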
Sprague, Debra; Russo, Joan E.; LaVallie, Donna L.; Buchwald, Dedra S.
2012-01-01
We evaluated methods for presenting risk information by administering 6 versions of an anonymous survey to 489 American Indian tribal college students. All surveys presented identical numeric information, but framing varied. Half expressed prevention benefits as relative risk reduction, half as absolute risk reduction. One-third of surveys used text to describe prevention benefits; one-third used text plus a bar graph; one-third used text plus a modified bar graph incorporating a culturally tailored image. The odds ratio (OR) for correct risk interpretation for absolute risk framing vs. relative risk framing was 1.40 (95% CI=1.01, 1.93). The OR for correct interpretation of text plus bar graph vs. text only was 2.16 (95% CI=1.46, 3.19); the OR for text plus culturally tailored bar graph vs. text only was 1.72 (95% CI=1.14, 2.60). Risk information including a bar graph was better understood than text-only information; a culturally tailored graph was no more effective than a standard graph. PMID:22544538
A DISCUSSION ON DIFFERENT APPROACHES FOR ASSESSING LIFETIME RISKS OF RADON-INDUCED LUNG CANCER.
Chen, Jing; Murith, Christophe; Palacios, Martha; Wang, Chunhong; Liu, Senlin
2017-11-01
Lifetime risks of radon induced lung cancer were assessed based on epidemiological approaches for Canadian, Swiss and Chinese populations, using the most recent vital statistic data and radon distribution characteristics available for each country. In the risk calculation, the North America residential radon risk model was used for the Canadian population, the European residential radon risk model for the Swiss population, the Chinese residential radon risk model for the Chinese population, and the EPA/BEIR-VI radon risk model for all three populations. The results were compared with the risk calculated from the International Commission on Radiological Protection (ICRP)'s exposure-to-risk conversion coefficients. In view of the fact that the ICRP coefficients were recommended for radiation protection of all populations, it was concluded that, generally speaking, lifetime absolute risks calculated with ICRP-recommended coefficients agree reasonably well with the range of radon induced lung cancer risk predicted by risk models derived from epidemiological pooling analyses. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Patients prefer pictures to numbers to express cardiovascular benefit from treatment.
Goodyear-Smith, Felicity; Arroll, Bruce; Chan, Lydia; Jackson, Rod; Wells, Sue; Kenealy, Timothy
2008-01-01
This study aimed to determine which methods of expressing a preventive medication's benefit encourage patients with known cardiovascular disease to decide to take the medication and which methods patients prefer. We identified patients in Auckland, New Zealand, family practices located in areas of differing socioeconomic status who had preexisting heart disease (myocardial infarction, angina, or both) and were taking statins. The patients were interviewed about their preference for methods of expressing the benefit of a hypothetical medication. Benefits were expressed numerically (relative risk, absolute risk, number needed to treat, odds ratio, natural frequency) and graphically. Statistical testing was adjusted for practice. We interviewed 100 eligible patients, representing a 53% response rate. No matter how the risk was expressed, the majority of patients indicated they would be encouraged to take the medication. Two-thirds (68) of the patients preferred 1 method of expressing benefit over others. Of this group, 57% preferred the information presented graphically. This value was significantly greater (P <.001) than the 19% who chose the next most preferred option, relative risk. Few patients preferred absolute risk (13%) or natural frequencies (9%). Only a single patient (1%) preferred the odds ratio. None preferred number needed to treat. Ninety percent of patients responding to a question about framing preferred positive framing (description of the benefit of treatment) over negative framing (description of the harm of not being treated). Although number needed to treat is a useful tool for communicating risk and benefit to clinicians, this format was the least likely to encourage patients to take medication. As graphical representation of benefit was the method patients preferred most, consideration should be given to developing visual aids to support shared clinical decision making.
Kirkbride, James B; Jones, Peter B; Ullrich, Simone; Coid, Jeremy W
2014-01-01
Although urban birth, upbringing, and living are associated with increased risk of nonaffective psychotic disorders, few studies have used appropriate multilevel techniques accounting for spatial dependency in risk to investigate social, economic, or physical determinants of psychosis incidence. We adopted Bayesian hierarchical modeling to investigate the sociospatial distribution of psychosis risk in East London for DSM-IV nonaffective and affective psychotic disorders, ascertained over a 2-year period in the East London first-episode psychosis study. We included individual and environmental data on 427 subjects experiencing first-episode psychosis to estimate the incidence of disorder across 56 neighborhoods, having standardized for age, sex, ethnicity, and socioeconomic status. A Bayesian model that included spatially structured neighborhood-level random effects identified substantial unexplained variation in nonaffective psychosis risk after controlling for individual-level factors. This variation was independently associated with greater levels of neighborhood income inequality (SD increase in inequality: Bayesian relative risks [RR]: 1.25; 95% CI: 1.04-1.49), absolute deprivation (RR: 1.28; 95% CI: 1.08-1.51) and population density (RR: 1.18; 95% CI: 1.00-1.41). Neighborhood ethnic composition effects were associated with incidence of nonaffective psychosis for people of black Caribbean and black African origin. No variation in the spatial distribution of the affective psychoses was identified, consistent with the possibility of differing etiological origins of affective and nonaffective psychoses. Our data suggest that both absolute and relative measures of neighborhood social composition are associated with the incidence of nonaffective psychosis. We suggest these associations are consistent with a role for social stressors in psychosis risk, particularly when people live in more unequal communities.
Viallon, Vivian; Latouche, Aurélien
2011-03-01
Identifying biomarkers and building risk scores to predict the occurrence of survival outcomes is a major concern of clinical epidemiology, and so is the evaluation of prognostic models. In this paper, we are concerned with the estimation of the time-dependent AUC, the area under the receiver operating characteristic curve, which naturally extends the standard AUC to the setting of survival outcomes and enables evaluation of the discriminative power of prognostic models. We establish a simple and useful relation between the predictiveness curve and the time-dependent AUC, AUC(t). This relation confirms that the predictiveness curve is the key concept for evaluating calibration and discrimination of prognostic models. It also highlights that accurate estimates of the conditional absolute risk function should yield accurate estimates for AUC(t). From this observation, we derive several estimators for AUC(t) relying on distinct estimators of the conditional absolute risk function. An empirical study was conducted to compare our estimators with existing ones and to assess the effect of model misspecification, when estimating the conditional absolute risk function, on the AUC(t) estimation. We further illustrate the methodology on the Mayo PBC and the VA lung cancer data sets. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
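To connect AUC(t) to something computable, the sketch below implements a naive cumulative/dynamic estimator: at horizon t, cases are subjects with an observed event by t, controls are subjects still event-free beyond t, and AUC(t) is the probability that a case's risk score exceeds a control's. It ignores the censoring corrections and the conditional-absolute-risk-based estimators developed in the paper, so it is only an illustration of the quantity being estimated.

```python
import numpy as np

def naive_cumulative_dynamic_auc(time, event, score, t):
    """Naive AUC(t): P(score_case > score_control) at horizon t.

    time: observed follow-up times
    event: 1 if the event was observed, 0 if censored
    score: risk scores (higher = higher predicted risk)
    Censoring before t is ignored, unlike the IPCW-corrected estimators in the paper.
    """
    time, event, score = map(np.asarray, (time, event, score))
    cases = score[(time <= t) & (event == 1)]
    controls = score[time > t]
    if len(cases) == 0 or len(controls) == 0:
        return np.nan
    # Pairwise comparisons, with ties counted as 1/2.
    greater = (cases[:, None] > controls[None, :]).sum()
    ties = (cases[:, None] == controls[None, :]).sum()
    return (greater + 0.5 * ties) / (len(cases) * len(controls))

# Toy data: higher scores should predict earlier events.
time = [2, 5, 7, 8, 10, 12, 15]
event = [1, 1, 0, 1, 0, 1, 0]
score = [0.9, 0.8, 0.3, 0.7, 0.2, 0.4, 0.1]
print(f"AUC(t=8) = {naive_cumulative_dynamic_auc(time, event, score, 8):.2f}")
```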
Geurts, Marjolein; van der Worp, H Bart; Kappelle, L Jaap; Amelink, G Johan; Algra, Ale; Hofmeijer, Jeannette
2013-09-01
We assessed whether the effects of surgical decompression for space-occupying hemispheric infarction, observed at 1 year, are sustained at 3 years. Patients with space-occupying hemispheric infarction, who were enrolled in the Hemicraniectomy After Middle cerebral artery infarction with Life-threatening Edema Trial within 4 days after stroke onset, were followed up at 3 years. Outcome measures included functional outcome (modified Rankin Scale), death, quality of life, and place of residence. Poor functional outcome was defined as modified Rankin Scale >3. Of 64 included patients, 32 were randomized to decompressive surgery and 32 to best medical treatment. Just as at 1 year, surgery had no effect on the risk of poor functional outcome at 3 years (absolute risk reduction, 1%; 95% confidence interval, -21 to 22), but it reduced case fatality (absolute risk reduction, 37%; 95% confidence interval, 14-60). Sixteen surgically treated patients and 8 controls lived at home (absolute risk reduction, 27%; 95% confidence interval, 4-50). Quality of life improved between 1 and 3 years in patients treated with surgery. In patients with space-occupying hemispheric infarction, the effects of decompressive surgery on case fatality and functional outcome observed at 1 year are sustained at 3 years. http://www.controlled-trials.com. Unique identifier: ISRCTN94237756.
Balaban, Richard B; Galbraith, Alison A; Burns, Marguerite E; Vialle-Valentin, Catherine E; Larochelle, Marc R; Ross-Degnan, Dennis
2015-07-01
Evidence-based interventions to reduce hospital readmissions may not generalize to resource-constrained safety-net hospitals. To determine if an intervention by patient navigators (PNs), hospital-based Community Health Workers, reduces readmissions among high risk, low socioeconomic status patients. Randomized controlled trial. General medicine inpatients having at least one of the following readmission risk factors: (1) age ≥60 years, (2) any in-network inpatient admission within the past 6 months, (3) length of stay ≥3 days, (4) admission diagnosis of heart failure, or (5) chronic obstructive pulmonary disease. The analytic sample included 585 intervention patients and 925 controls. PNs provided coaching and assistance in navigating the transition from hospital to home through hospital visits and weekly telephone outreach, supporting patients for 30 days post-discharge with discharge preparation, medication management, scheduling of follow-up appointments, communication with primary care, and symptom management. The primary outcome was in-network 30-day hospital readmissions. Secondary outcomes included rates of outpatient follow-up. We evaluated outcomes for the entire cohort and stratified by patient age >60 years (425 intervention/584 controls) and ≤60 years (160 intervention/341 controls). Overall, 30-day readmission rates did not differ between intervention and control patients. However, the two age groups demonstrated marked differences. Intervention patients >60 years showed a statistically significant adjusted absolute 4.1% decrease [95% CI: -8.0%, -0.2%] in readmission with an increase in 30-day outpatient follow-up. Intervention patients ≤60 years showed a statistically significant adjusted absolute 11.8% increase [95% CI: 4.4%, 19.0%] in readmission with no change in 30-day outpatient follow-up. A patient navigator intervention among high risk, safety-net patients decreased readmission among older patients while increasing readmissions among younger patients. Care transition strategies should be evaluated among diverse populations, and younger high risk patients may require novel strategies.
Ziller, M; Ziller, V; Haas, G; Rex, J; Kostev, K
2014-02-01
Recent studies showed differences in the risk of venous thrombosis between different combined hormonal contraceptives. Database studies comprising large cohorts can add relevant aspects from daily clinical practice. The purpose of this study was to evaluate the effect of different progestogens in combination with ethinylestradiol on the risk of venous thrombosis in Germany. Computerized data from 68,168 contraceptive users in gynecological practices throughout Germany (Disease Analyzer Database) were analyzed. The adjusted odds ratios for risk of thrombosis were estimated in users of different oral contraceptive (OC) formulations relative to users of levonorgestrel-containing preparations. In total, 38 (0.06%) of the 68,168 contraceptive users had a recorded diagnosis of thrombosis within 365 days after the initial prescription. The adjusted odds ratio was 1.95 for desogestrel (95% CI 0.52-7.29), 2.97 for dienogest (95% CI 0.96-9.24), 1.57 for drospirenone (95% CI 0.46-5.38), 2.54 for chlormadinone (95% CI 0.72-9.04), and 3.24 for norgestimate (95% CI 0.59-17.75) compared to levonorgestrel. None of these findings reached statistical significance. The maximum absolute increase versus levonorgestrel was 6 cases per 10,000 women (not significant). The study shows the low incidence rate of thrombosis in OC users. Since no difference reached statistical significance, this study does not confirm an increased risk, but shows only a tendency toward higher risk for third- and fourth-generation OCs versus levonorgestrel-containing products.
The two radii of a charged particle.
Michov, B M
1989-01-01
The existence of two radii for each charged particle, a geometric radius and an electrokinetic radius, is postulated. The mathematical relationship between them is analyzed in the four possible combinations of an ion and its counterion: (i) equal geometric radii and, in absolute values, equal valencies; (ii) equal geometric radii and, in absolute values, different valencies; (iii) different geometric radii and, in absolute values, equal valencies; (iv) different geometric radii and, in absolute values, different valencies. One of the equations worked out can be used to define the relationship between the geometric and electrokinetic radii of a polyion. All the equations are used in working out precise calculations.
Hill, Sophie; Spink, Janet; Cadilhac, Dominique; Edwards, Adrian; Kaufman, Caroline; Rogers, Sophie; Ryan, Rebecca; Tonkin, Andrew
2010-03-04
Communicating risk is part of primary prevention of coronary heart disease and stroke, collectively referred to as cardiovascular disease (CVD). In Australia, health organisations have promoted an absolute risk approach, thereby raising the question of suitable standardised formats for risk communication. Sixteen formats of risk representation were prepared, including statements, icons, and graphical formats, alone or in combination, and with variable use of colours. All presented the same risk, i.e., the absolute risk for a 55-year-old woman, a 16% risk of CVD in five years. Preferences for a five- or ten-year timeframe were explored. Australian GPs and consumers were recruited for participation in focus groups, with the data analysed thematically and preferred formats tallied. Three focus groups with health consumers and three with GPs were held, involving 19 consumers and 18 GPs. Consumers and GPs had similar views on which formats were more easily comprehended and which conveyed 16% risk as a high risk. A simple summation of preferences resulted in three graphical formats (thermometers, vertical bar chart) and one statement format as the top choices. The use of colour to distinguish risk (red, yellow, green) and comparative information (age, sex, smoking status) were important ingredients. Consumers found formats which combined information helpful, such as colour, the effect of changing behaviour on risk, or comparison with a healthy older person. GPs preferred formats that helped them relate the information about risk of CVD to their patients, and could be used to motivate patients to change behaviour. Several formats were reported as confusing, such as a percentage risk with no contextual information, line graphs, and icons, particularly those with larger numbers. Whilst consumers and GPs shared preferences, the use of one format for all situations was not recommended. Overall, people across groups felt that risk expressed over five years was preferable to a ten-year risk, the latter being too remote. Consumers and GPs shared preferences for risk representation formats. Both groups liked the option to combine formats and tailor the risk information to reflect a specific individual's risk, to maximise understanding and provide a good basis for discussion.
Treatment of post-intubation laryngeal granulomas: systematic review and proportional meta-analysis.
Rimoli, Caroline Fernandes; Martins, Regina Helena Garcia; Catâneo, Daniele Cristina; Imamura, Rui; Catâneo, Antonio José Maria
2018-04-14
Post-intubation laryngeal granulomas are benign but recurrent lesions, and there is no consensus on their treatment. To describe the effectiveness of different treatment modalities for primary or recurrent laryngeal granulomas resulting from endotracheal intubation. Systematic review and proportional meta-analysis. Eligibility criteria: experimental or observational studies with at least five subjects. Outcomes studied: granuloma resolution, recurrence, and time to resolution. Databases used: Pubmed, Embase, Lilacs, and Cochrane. The StatsDirect 3.0.121 program was used. Six studies were selected, with 85 patients. The treatments registered were antireflux therapy, speech therapy, anti-inflammatory drugs, steroids, antibiotics, zinc sulfate and surgery. The 85 patients from the six studies had primary treatment: surgery ± associated treatments (41 patients), resolution chance 75% (95% CI: 0.3-100%, I² = 90%), absolute relapse risk 25% (95% CI: 0.2-71%); medical treatment (44 patients), resolution chance 86% (95% CI: 67-97%), absolute relapse risk 14% (95% CI: 3-33%). There was no significant difference between groups. Three studies, encompassing 19 patients, analyzed secondary treatment (failure or recurrence after primary treatment); three subjects presented new recurrence. The time needed to resolve the lesions varied from immediate (after surgery) to 23 months (for inhaled steroid). There is no high-quality evidence proving the efficacy of any treatment for laryngeal granulomas resulting from endotracheal intubation. Published by Elsevier Editora Ltda.
Risk profile of breast cancer following atypical hyperplasia detected through organized screening.
Buckley, Elizabeth; Sullivan, Tom; Farshid, Gelareh; Hiller, Janet; Roder, David
2015-06-01
Few population-based data are available indicating the breast cancer risk following detection of atypia within a breast screening program. Prospectively collected data from the South Australian screening program were linked with the state cancer registry. Absolute and relative breast cancer risk estimates were calculated separately for atypical ductal hyperplasia (ADH) and atypical lobular hyperplasia (ALH), and by age at diagnosis and time since diagnosis. Post-hoc analysis was undertaken of the effect of family history on breast cancer risk. Women with ADH and ALH had an increased relative risk of malignancy (ADH: HR 2.81 [95% CI 1.72, 4.59]; ALH: HR 4.14 [95% CI 1.97, 8.69]). Differences in risk profile according to time since diagnosis and age at diagnosis were not statistically significant. Estimates of the relative risk of breast cancer are necessary to inform decisions regarding clinical management and/or treatment of women with ADH and ALH. Copyright © 2015 Elsevier Ltd. All rights reserved.
Lafeber, Melvin; Webster, Ruth; Visseren, Frank Lj; Bots, Michiel L; Grobbee, Diederick E; Spiering, W; Rodgers, Anthony
2016-08-01
Recent data indicate that fixed-dose combination (FDC) pills, polypills, can produce sizeable risk factor reductions. There are very few published data on the consistency of the effects of a polypill in different patient populations. It is unclear for example whether the effects of the polypill are mainly driven by the individuals with high individual risk factor levels. The aim of the present study is to examine whether baseline risk factor levels modify the effect of polypill treatment on low-density lipoprotein (LDL)-cholesterol, blood pressure (BP), calculated cardiovascular relative risk reduction and adverse events. This paper describes a post-hoc analysis of a randomised, placebo-controlled trial of a polypill (containing aspirin 75 mg, simvastatin 20 mg, lisinopril 10 mg and hydrochlorothiazide 12.5 mg) in 378 individuals without an indication for any component of the polypill, but who had an estimated five-year risk for cardiovascular disease ≥7.5%. The outcomes considered were effect modification by baseline risk factor levels on change in LDL-cholesterol, systolic BP, calculated cardiovascular relative risk reduction and adverse events. The mean LDL-cholesterol in the polypill group was 0.9 mmol/l (95% confidence interval (CI): 0.8-1.0) lower compared with the placebo group during follow-up. Those with a baseline LDL-cholesterol >3.6 mmol/l achieved a greater absolute LDL-cholesterol reduction with the polypill compared with placebo, than patients with an LDL-cholesterol ≤3.6 mmol/l (-1.1 versus -0.6 mmol/l, respectively). The mean systolic BP was 10 mm Hg (95% CI: 8-12) lower in the polypill group. In participants with a baseline systolic BP >135 mm Hg the polypill resulted in a greater absolute systolic BP reduction with the polypill compared with placebo, than participants with a systolic BP ≤ 135 mm Hg (-12 versus -7 mm Hg, respectively). Calculated from individual risk factor reductions, the mean cardiovascular relative risk reduction was 48% (95% CI: 43-52) in the polypill group. Both baseline LDL-cholesterol and estimated cardiovascular risk were significant modifiers of the estimated cardiovascular relative risk reduction caused by the polypill. Adverse events did not appear to be related to baseline risk factor levels or the estimated cardiovascular risk. This study demonstrated that the effect of a cardiovascular polypill on risk factor levels is modified by the level of these risk factors. Groups defined by baseline LDL-cholesterol or systolic BP had large differences in risk factor reductions but only moderate differences in estimated cardiovascular relative risk reduction, suggesting also that patients with mildly increased risk factor levels but an overall raised cardiovascular risk benefit from being treated with a polypill. © The European Society of Cardiology 2016.
Vehicle mass and injury risk in two-car crashes: A novel methodology.
Tolouei, Reza; Maher, Mike; Titheridge, Helena
2013-01-01
This paper introduces a novel methodology based on disaggregate analysis of two-car crash data to estimate the partial effects of mass, through the velocity change, on absolute driver injury risk in each of the vehicles involved in the crash. Here, absolute injury risk is defined as the probability of injury given that the vehicle is involved in a two-car crash. The novel aspect of the methodology is that it provides a solution to the lack of data on vehicle speeds prior to the crash, which are required to calculate the velocity change, as well as to the lack of information on non-injury two-car crashes in national accident data. These issues have often led to a focus on relative measures of injury risk that are not independent of the risk in the colliding cars. Furthermore, the methodology is used to investigate whether there is any effect of vehicle size above and beyond that of mass ratio, and whether there are any effects associated with the gender and age of the drivers. The methodology was applied to two-car crashes to investigate the partial effects of vehicle mass and size on absolute driver injury risk. The results confirmed that in a two-car collision, vehicle mass has a protective effect on its own driver's injury risk and an aggressive effect on the injury risk of the colliding vehicle's driver. The results also confirmed a protective effect of vehicle size above and beyond that of vehicle mass for frontal and front-to-side collisions. Copyright © 2012 Elsevier Ltd. All rights reserved.
Drewes, Yvonne M; Poortvliet, Rosalinde K E; Blom, Jeanet W; de Ruijter, Wouter; Westendorp, Rudi G J; Stott, David J; Blom, Henk J; Ford, Ian; Sattar, Naveed; Wouter Jukema, J; Assendelft, Willem J J; de Craen, Anton J M; Gussekloo, Jacobijn
2014-02-01
To assess the effect of preventive pravastatin treatment on coronary heart disease (CHD) morbidity and mortality in older persons at risk for cardiovascular disease (CVD), stratified according to plasma levels of homocysteine. A post hoc subanalysis in the PROspective Study of Pravastatin in the Elderly at Risk (PROSPER), started in 1997, which is a double-blind, randomized, placebo-controlled trial with a mean follow-up of 3.2 years. Primary care setting in two of the three PROSPER study sites (Netherlands and Scotland). Individuals (n = 3,522, aged 70-82, 1,765 male) with a history of or risk factors for CVD were ranked in three groups depending on baseline homocysteine level, sex, and study site. Pravastatin (40 mg) versus placebo. Fatal and nonfatal CHD and mortality. In the placebo group, participants with a high homocysteine level (n = 588) had a 1.8 higher risk (95% confidence interval (CI) = 1.2-2.5, P = .001) of fatal and nonfatal CHD than those with a low homocysteine level (n = 597). The absolute risk reduction in fatal and nonfatal CHD with pravastatin treatment was 1.6% (95% CI = -1.6 to 4.7%) in the low homocysteine group and 6.7% (95% CI = 2.7-10.7%) in the high homocysteine group (difference 5.2%, 95% CI = 0.11-10.3, P = .046). Therefore, the number needed to treat (NNT) with pravastatin for 3.2 years for benefit related to fatal and nonfatal CHD events was 14.8 (95% CI = 9.3-36.6) for high homocysteine and 64.5 (95% CI = 21.4-∞) for low homocysteine. In older persons at risk of CVD, those with high homocysteine are at highest risk for fatal and nonfatal CHD. With pravastatin treatment, this group has the highest absolute risk reduction and the lowest NNT to prevent fatal and nonfatal CHD. © 2014, Copyright the Authors. Journal compilation © 2014, The American Geriatrics Society.
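The NNT figures above follow directly from the absolute risk reductions, since NNT = 1/ARR and the CI limits are obtained by inverting the ARR CI limits; the sketch below approximately reproduces them (small discrepancies arise because the abstract reports rounded ARRs):

```python
def nnt_from_arr(arr, arr_lo, arr_hi):
    """Number needed to treat with CI limits obtained by inverting the ARR CI.

    If the ARR CI crosses zero, the upper NNT limit extends to infinity
    (the interval then spans NNT-to-benefit and NNT-to-harm).
    """
    nnt = 1.0 / arr
    upper = 1.0 / arr_lo if arr_lo > 0 else float("inf")
    lower = 1.0 / arr_hi
    return nnt, lower, upper

# High-homocysteine group: ARR 6.7% (95% CI 2.7% to 10.7%).
print(nnt_from_arr(0.067, 0.027, 0.107))   # ~(14.9, 9.3, 37.0) vs reported 14.8 (9.3-36.6)
# Low-homocysteine group: ARR 1.6% (95% CI -1.6% to 4.7%).
print(nnt_from_arr(0.016, -0.016, 0.047))  # ~(62.5, 21.3, inf) vs reported 64.5 (21.4-inf)
```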
Breast Implants and the Risk of Anaplastic Large-Cell Lymphoma in the Breast.
de Boer, Mintsje; van Leeuwen, Flora E; Hauptmann, Michael; Overbeek, Lucy I H; de Boer, Jan Paul; Hijmering, Nathalie J; Sernee, Arthur; Klazen, Caroline A H; Lobbes, Marc B I; van der Hulst, René R W J; Rakhorst, Hinne A; de Jong, Daphne
2018-03-01
Breast implants are among the most commonly used medical devices. Since 2008, the number of women with breast implants diagnosed with anaplastic large-cell lymphoma in the breast (breast-ALCL) has increased, and several reports have suggested an association between breast implants and risk of breast-ALCL. However, relative and absolute risks of breast-ALCL in women with implants are still unknown, precluding evidence-based counseling about implants. To determine relative and absolute risks of breast-ALCL in women with breast implants. Through the population-based nationwide Dutch pathology registry we identified all patients diagnosed with primary non-Hodgkin lymphoma in the breast between 1990 and 2016 and retrieved clinical data, including breast implant status, from the treating physicians. We estimated the odds ratio (OR) of ALCL associated with breast implants in a case-control design, comparing implant prevalence between women with breast-ALCL and women with other types of breast lymphoma. Cumulative risk of breast-ALCL was derived from the age-specific prevalence of breast implants in Dutch women, estimated from an examination of 3000 chest x-rays and time trends from implant sales. Relative and absolute risks of breast-ALCL in women with breast implants. Among 43 patients with breast-ALCL (median age, 59 years), 32 had ipsilateral breast implants, compared with 1 among 146 women with other primary breast lymphomas (OR, 421.8; 95% CI, 52.6-3385.2). Implants among breast-ALCL cases were more often macrotextured (23 macrotextured of 28 total implants of known type, 82%) than expected (49 193 sold macrotextured implants of total sold 109 449 between 2010 and 2015, 45%) based on sales data (P < .001). The estimated prevalence of breast implants in women aged 20 to 70 years was 3.3%. Cumulative risks of breast-ALCL in women with implants were 29 per million at 50 years and 82 per million at 70 years. The number of women with implants needed to cause 1 breast-ALCL case before age 75 years was 6920. Breast implants are associated with increased risk of breast-ALCL, but the absolute risk remains small. Our results emphasize the need for increased awareness among the public, medical professionals, and regulatory bodies, promotion of alternative cosmetic procedures, and alertness to signs and symptoms of breast-ALCL in women with implants.
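The headline odds ratio can be recovered from the counts given above (32 of 43 breast-ALCL cases versus 1 of 146 controls with implants); the sketch below uses a Wald interval on the log-odds scale and approximately reproduces the published 421.8 (95% CI, 52.6-3385.2):

```python
import math

def odds_ratio(a, b, c, d, z=1.96):
    """Odds ratio for a 2x2 table with a Wald CI on the log-odds scale.

    a, b: exposed / unexposed among cases
    c, d: exposed / unexposed among controls
    """
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Breast-ALCL cases: 32 with implants, 11 without; controls: 1 with, 145 without.
print(odds_ratio(32, 11, 1, 145))   # approximately (421.8, 52.6, 3385), close to the published CI
```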
Keller, Brad M; Chen, Jinbo; Daye, Dania; Conant, Emily F; Kontos, Despina
2015-08-25
Breast density, commonly quantified as the percentage of mammographically dense tissue area, is a strong breast cancer risk factor. We investigated associations between breast cancer and fully automated measures of breast density made by a new publicly available software tool, the Laboratory for Individualized Breast Radiodensity Assessment (LIBRA). Digital mammograms from 106 invasive breast cancer cases and 318 age-matched controls were retrospectively analyzed. Density estimates acquired by LIBRA were compared with commercially available software and standard Breast Imaging-Reporting and Data System (BI-RADS) density estimates. Associations between the different density measures and breast cancer were evaluated by using logistic regression after adjustment for Gail risk factors and body mass index (BMI). Area under the curve (AUC) of the receiver operating characteristic (ROC) was used to assess discriminatory capacity, and odds ratios (ORs) for each density measure are provided. All automated density measures had a significant association with breast cancer (OR = 1.47-2.23, AUC = 0.59-0.71, P < 0.01), which was strengthened after adjustment for Gail risk factors and BMI (OR = 1.96-2.64, AUC = 0.82-0.85, P < 0.001). In multivariable analysis, absolute dense area (OR = 1.84, P < 0.001) and absolute dense volume (OR = 1.67, P = 0.003) were jointly associated with breast cancer (AUC = 0.77, P < 0.01), having a larger discriminatory capacity than models considering the Gail risk factors alone (AUC = 0.64, P < 0.001) or the Gail risk factors plus standard area percent density (AUC = 0.68, P = 0.01). After BMI was further adjusted for, absolute dense area retained significance (OR = 2.18, P < 0.001) and volume percent density approached significance (OR = 1.47, P = 0.06). This combined area-volume density model also had a significantly (P < 0.001) improved discriminatory capacity (AUC = 0.86) relative to a model considering the Gail risk factors plus BMI (AUC = 0.80). Our study suggests that new automated density measures may ultimately augment the current standard breast cancer risk factors. In addition, the ability to fully automate density estimation with digital mammography, particularly through the use of publicly available breast density estimation software, could accelerate the translation of density reporting into routine breast cancer screening and surveillance protocols and facilitate broader research into the use of breast density as a risk factor for breast cancer.
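The analysis pattern described above, logistic regression on density measures plus covariates with discrimination summarised by the ROC AUC, can be sketched with standard tools; the feature names and data below are placeholders, and this is not the study's actual code or data:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import cross_val_predict

# Hypothetical design matrix: density measures plus Gail-type covariates and BMI.
rng = np.random.default_rng(0)
n = 424                                     # 106 cases + 318 controls, as in the study
X = rng.normal(size=(n, 4))                 # [dense_area, dense_volume, gail_score, bmi]
y = rng.integers(0, 2, size=n)              # 1 = breast cancer case, 0 = control

model = LogisticRegression(max_iter=1000)
# Cross-validated predicted probabilities avoid an optimistic in-sample AUC.
pred = cross_val_predict(model, X, y, cv=5, method="predict_proba")[:, 1]
print(f"AUC = {roc_auc_score(y, pred):.2f}")   # ~0.5 on this random stand-in data

# Odds ratios per one-unit increase come from the fitted coefficients:
print(np.exp(model.fit(X, y).coef_))
```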
12 CFR 217.52 - Simple risk-weight approach (SRWA).
Code of Federal Regulations, 2014 CFR
2014-01-01
... greater than or equal to −1 (that is, between zero and −1), then E equals the absolute value of RVC. If... this section) by the lowest applicable risk weight in this paragraph (b). (1) Zero percent risk weight... credit exposures receive a zero percent risk weight under § 217.32 may be assigned a zero percent risk...
Reporting of Numerical and Statistical Differences in Abstracts
Dryver, Eric; Hux, Janet E
2002-01-01
OBJECTIVE The reporting of relative risk reductions (RRRs) or absolute risk reductions (ARRs) to quantify binary outcomes in trials engenders differing perceptions of therapeutic efficacy, and the merits of P values versus confidence intervals (CIs) are also controversial. We describe the manner in which numerical and statistical difference in treatment outcomes is presented in published abstracts. DESIGN A descriptive study of abstracts published in 1986 and 1996 in 8 general medical and specialty journals. Inclusion criteria: controlled, intervention trials with a binary primary or secondary outcome. Seven items were recorded: raw data (outcomes for each treatment arm), measure of relative difference (e.g., RRR), ARR, number needed to treat, P value, CI, and verbal statement of statistical significance. The prevalence of these items was compared between journals and across time. RESULTS Of 5,293 abstracts, 300 met the inclusion criteria. In 1986, 60% of abstracts did not provide both the raw data and a corresponding P value or CI, while 28% failed to do so in 1996 (P < .001; RRR of 53%; ARR of 32%; CI for ARR 21% to 43%). The variability between journals was highly significant (P < .001). In 1986, 100% of abstracts lacked a measure of absolute difference, while 88% of 1996 abstracts did so (P < .001). In 1986, 98% of abstracts lacked a CI, while 65% of 1996 abstracts did so (P < .001). CONCLUSIONS The provision of quantitative outcome and statistical quantitative information has significantly increased between 1986 and 1996. However, further progress can be made to make abstracts more informative. PMID:11929506
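The abstract's own figures show why the two summary measures read so differently: moving from 60% of abstracts with incomplete reporting in 1986 to 28% in 1996 is an ARR of 32 percentage points but an RRR of 53%. A minimal sketch of the arithmetic:

```python
def arr_rrr(p_control, p_treatment):
    """Absolute and relative risk reduction for two event proportions."""
    arr = p_control - p_treatment
    rrr = arr / p_control
    return arr, rrr

arr, rrr = arr_rrr(0.60, 0.28)   # incomplete reporting in 1986 vs 1996
print(f"ARR = {arr:.0%}, RRR = {rrr:.0%}")   # ARR = 32%, RRR = 53%
```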
Trinh, Thang; Eriksson, Mikael; Darabi, Hatef; Bonn, Stephanie E; Brand, Judith S; Cuzick, Jack; Czene, Kamila; Sjölander, Arvid; Bälter, Katarina; Hall, Per
2015-04-02
High physical activity has been shown to decrease the risk of breast cancer, potentially by a mechanism that also reduces mammographic density. We tested the hypothesis that the risk of developing breast cancer in the next 10 years according to the Tyrer-Cuzick prediction model influences the association between physical activity and mammographic density. We conducted a population-based cross-sectional study of 38,913 Swedish women aged 40-74 years. Physical activity was assessed using the validated web-questionnaire Active-Q and mammographic density was measured by the fully automated volumetric Volpara method. The 10-year risk of breast cancer was estimated using the Tyrer-Cuzick (TC) prediction model. Linear regression analyses were performed to assess the association between physical activity and volumetric mammographic density and the potential interaction with the TC breast cancer risk. Overall, high physical activity was associated with lower absolute dense volume. As compared to women with the lowest total activity level (<40 metabolic equivalent hours [MET-h] per day), women with the highest total activity level (≥50 MET-h/day) had an estimated 3.4 cm(3) (95% confidence interval, 2.3-4.7) lower absolute dense volume. The inverse association was seen for any type of physical activity among women with <3.0% TC 10-year risk, but only for total and vigorous activities among women with 3.0-4.9% TC risk, and only for vigorous activity among women with ≥5.0% TC risk. The association between total activity and absolute dense volume was modified by the TC breast cancer risk (P interaction = 0.05). As anticipated, high physical activity was also associated with lower non-dense volume. No consistent association was found between physical activity and percent dense volume. Our results suggest that physical activity may decrease breast cancer risk through reducing mammographic density, and that the physical activity needed to reduce mammographic density may depend on background risk of breast cancer.
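A sketch of the effect-modification test described above, fitting a linear model with an activity-by-risk interaction term via a statsmodels formula; the column names and simulated data are hypothetical and the covariate set is simplified relative to the study's adjusted models:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical data frame standing in for the cohort.
rng = np.random.default_rng(1)
n = 5000
df = pd.DataFrame({
    "dense_volume": rng.normal(60, 20, n),   # absolute dense volume, cm^3
    "met_hours": rng.uniform(30, 60, n),     # total physical activity, MET-h/day
    "tc_risk": rng.uniform(1, 8, n),         # Tyrer-Cuzick 10-year risk, %
    "age": rng.uniform(40, 74, n),
    "bmi": rng.normal(26, 4, n),
})

# The met_hours:tc_risk term carries the effect-modification (interaction) test.
fit = smf.ols("dense_volume ~ met_hours * tc_risk + age + bmi", data=df).fit()
print(fit.pvalues["met_hours:tc_risk"])   # analogous to the reported P interaction
```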
Mortensen, Erik Lykke; Hvidtjørn, Dorte; Kesmodel, Ulrik Schiøler
2013-01-01
Objective To assess the mental health of children born after fertility treatment by comparing their risk of mental disorders with that of spontaneously conceived children. Design Prospective register based cohort study. Setting Nationwide register based information from Danish National Health Registers cross linked by a unique personal identification number assigned to all citizens in Denmark. Participants All children born in Denmark in 1995-2003 with follow-up in 2012 when the children were aged 8-17; 33 139 children were conceived after fertility treatment and 555 828 children were born after spontaneous conception. Main outcome measures Absolute risks and hazard ratios for overall and specific mental disorders estimated with adjustment for potential confounding variables. Estimated association between the risk of mental disorders and subtypes of procedures, hormone treatments, gamete types, and cause of infertility. Results The risk of mental disorders in children born after in vitro fertilisation or intracytoplasmic sperm injection was low, and was no higher than in spontaneously conceived children, except for a borderline significant increased risk of tic disorders (hazard ratio 1.40, 95% confidence interval 1.01 to 1.95; absolute risk 0.3%). In contrast, children born after ovulation induction with or without insemination had low but significantly increased risks of any mental disorder (1.20, 1.11 to 1.31; absolute risk 4.1%), autism spectrum disorders (1.20, 1.05 to 1.37; 1.5%), hyperkinetic disorders (1.23, 1.08 to 1.40; 1.7%), conduct, emotional, or social disorder (1.21, 1.02 to 1.45; 0.8%), and tic disorders (1.51, 1.16 to 1.96; 0.4%). There was no risk systematically related to any specific type of hormone drug treatment. Conclusions There was a small increase in the incidence of mental disorders in children born after ovulation induction/intrauterine insemination. Children born after in vitro fertilisation/intracytoplasmic sperm injection were found to have overall risk comparable with children conceived spontaneously. PMID:23833075
Dillard, Amanda J; Ferrer, Rebecca A; Ubel, Peter A; Fagerlin, Angela
2012-01-01
Risk perception is important for motivating health behavior (e.g., Janz & Becker, 1984), but different measures of the construct may change how important that relationship appears. In two studies, we examined associations between four measures of risk perception, health behavior intentions and possible behavioral determinants. Participants in these studies, who were due for colorectal cancer screening, read an online message about the importance of screening to reduce the chance of cancer. We examined bivariate and multivariate associations between risk perception measures, including absolute, comparative, and feelings-of-risk, and behavioral intentions to screen, general worry, and knowledge and attitudes related to screening. Results across the two studies were consistent, with all risk perception measures being correlated with intentions and attitudes. Multivariate analyses revealed that feelings-of-risk was most predictive of all variables, with the exception of general worry, for which comparative measures were the most predictive. Researchers interested in risk perception should assess feelings-of-risk along with more traditional measures. Those interested in influencing health behavior specifically should attempt to increase feelings of vulnerability rather than numerical risk.
Laterality, spatial abilities, and accident proneness.
Voyer, Susan D; Voyer, Daniel
2015-01-01
Although handedness as a measure of cerebral specialization has been linked to accident proneness, more direct measures of laterality are rarely considered. The present study aimed to fill that gap in the existing research. In addition, individual difference factors in accident proneness were further examined with the inclusion of mental rotation and navigation abilities measures. One hundred and forty participants were asked to complete the Mental Rotations Test, the Santa Barbara Sense of Direction scale, the Greyscales task, the Fused Dichotic Word Test, the Waterloo Handedness Questionnaire, and a grip strength task before answering questions related to number of accidents in five areas. Results indicated that handedness scores, absolute visual laterality score, absolute response time on the auditory laterality index, and navigation ability were significant predictors of the total number of accidents. Results are discussed with respect to cerebral hemispheric specialization and risk-taking attitudes and behavior.
Sehgal, Mandi; Wood, Sarah K; Ouslander, Joseph G; Hennekens, Charles H
2017-11-01
In the treatment or secondary prevention of cardiovascular disease (CVD), there is general consensus that the absolute benefits of aspirin far outweigh the absolute risks. Despite evidence from randomized trials and their meta-analyses, older adults, defined as aged 65 years or older, are less likely to be prescribed aspirin than their middle-aged counterparts. In primary prevention, the optimal utilization of aspirin is widely debated. There is insufficient randomized evidence among apparently healthy participants at moderate to high risk of a first CVD event, so general guidelines seem premature. Among older adults, randomized data are even more sparse but trials are ongoing. Further, older adults commonly take multiple medications due to comorbidities, which may increase deleterious interactions and side effects. Older adults have higher risks of occlusive events as well as bleeding. All these considerations support the need for individual clinical judgments in prescribing aspirin in the context of therapeutic lifestyle changes and other adjunctive drug therapies. These include statins for lipids and usually multiple drugs to achieve control of high blood pressure. As regards aspirin, the clinician should weigh the absolute benefit on occlusion against the absolute risk of bleeding. These issues should be considered with each patient to facilitate an informed and person-centered individual clinical judgment. The use of aspirin in primary prevention is particularly attractive because the drug is generally over the counter and, for developing countries where CVD is becoming the leading cause of death, is extremely inexpensive. The more widespread use of aspirin in older adults with prior CVD will confer net benefits to risks and even larger net benefits to costs in the United States as well as other developed and developing countries. In primary prevention among older adults, individual clinical judgments should be made by the health-care professional and each of his or her patients.
2012-01-01
Background Our aims were to determine the pace of change in cardiovascular risk factors by age, gender and socioeconomic groups from 1994 to 2008, and quantify the magnitude, direction and change in absolute and relative inequalities. Methods Time trend analysis was used to measure change in absolute and relative inequalities in risk factors by gender and age (16-54, ≥ 55 years), using repeated cross-sectional data from the Health Survey for England 1994-2008. Seven risk factors were examined: smoking, obesity, diabetes, high blood pressure, raised cholesterol, consumption of five or more daily portions of fruit and vegetables, and physical activity. Socioeconomic group was measured using the Index of Multiple Deprivation 2007. Results Between 1994 and 2008, the prevalence of smoking, high blood pressure and raised cholesterol decreased in most deprivation quintiles. However, obesity and diabetes increased. Increasing absolute inequalities were found in obesity in older men and women (p = 0.044 and p = 0.027 respectively), diabetes in young men and older women (p = 0.036 and p = 0.019 respectively), and physical activity in older women (p = 0.025). Relative inequality increased in high blood pressure in young women (p = 0.005). The prevalence of raised cholesterol showed widening absolute and relative inverse gradients from 1998 onwards in older men (p = 0.004 and p ≤ 0.001 respectively) and women (p ≤ 0.001 and p ≤ 0.001). Conclusions Favourable trends in smoking, blood pressure and cholesterol are consistent with falling coronary heart disease death rates. However, adverse trends in obesity and diabetes are likely to counteract some of these gains. Furthermore, little progress over the last 15 years has been made towards reducing inequalities. Implementation of known effective population based approaches in combination with interventions targeted at individuals/subgroups with poorer cardiovascular risk profiles are therefore recommended to reduce social inequalities. PMID:22333887
Augmenting the Deliberative Method for Ranking Risks.
Susel, Irving; Lasley, Trace; Montezemolo, Mark; Piper, Joel
2016-01-01
The Department of Homeland Security (DHS) characterized and prioritized the physical cross-border threats and hazards to the nation stemming from terrorism, market-driven illicit flows of people and goods (illegal immigration, narcotics, funds, counterfeits, and weaponry), and other nonmarket concerns (movement of diseases, pests, and invasive species). These threats and hazards pose a wide diversity of consequences with very different combinations of magnitudes and likelihoods, making it very challenging to prioritize them. This article presents the approach that was used at DHS to arrive at a consensus regarding the threats and hazards that stand out from the rest based on the overall risk they pose. Due to time constraints for the decision analysis, it was not feasible to apply multiattribute methodologies like multiattribute utility theory or the analytic hierarchy process. Using a holistic approach was considered, such as the deliberative method for ranking risks first published in this journal. However, an ordinal ranking alone does not indicate relative or absolute magnitude differences among the risks. Therefore, the use of the deliberative method for ranking risks is not sufficient for deciding whether there is a material difference between the top-ranked and bottom-ranked risks, let alone deciding what the stand-out risks are. To address this limitation of ordinal rankings, the deliberative method for ranking risks was augmented by adding an additional step to transform the ordinal ranking into a ratio scale ranking. This additional step enabled the selection of stand-out risks to help prioritize further analysis. © 2015 Society for Risk Analysis.
The Ethics of Information: Absolute Risk Reduction and Patient Understanding of Screening
Meslin, Eric M.
2008-01-01
Some experts have argued that patients should routinely be told the specific magnitude and absolute probability of potential risks and benefits of screening tests. This position is motivated by the idea that framing risk information in ways that are less precise violates the ethical principle of respect for autonomy and its application in informed consent or shared decision-making. In this Perspective, we consider a number of problems with this view that have not been adequately addressed. The most important challenges stem from the danger that patients will misunderstand the information or have irrational responses to it. Any initiative in this area should take such factors into account and should consider carefully how to apply the ethical principles of respect for autonomy and beneficence. PMID:18421509
Jasminum sambac flower absolutes from India and China--geographic variations.
Braun, Norbert A; Sim, Sherina
2012-05-01
Seven Jasminum sambac flower absolutes from different locations in the southern Indian state of Tamil Nadu were analyzed using GC and GC-MS. Focus was placed on 41 key ingredients to investigate geographic variations in this species. These seven absolutes were compared with an Indian bud absolute and commercially available J. sambac flower absolutes from India and China. All absolutes showed broad variations for the 10 main ingredients between 8% and 96%. In addition, the odor of Indian and Chinese J. sambac flower absolutes were assessed.
Keto, Jaana; Ventola, Hanna; Jokelainen, Jari; Linden, Kari; Keinänen-Kiukaanniemi, Sirkka; Timonen, Markku; Ylisaukko-oja, Tero; Auvinen, Juha
2016-01-01
Objective To investigate how individual risk factors for cardiovascular disease (CVD) (blood pressure, lipid levels, body mass index, waist and hip circumference, use of antihypertensive or hypolipidemic medication, and diagnosed diabetes) differ in people aged 46 years with different smoking behaviour and history. Methods This population-based cohort study is based on longitudinal data from the Northern Finland Birth Cohort 1966 project. Data were collected at the 31-year and 46-year follow-ups, when a total of 5038 and 5974 individuals participated in clinical examinations and questionnaires. Data from both follow-ups were available for 3548 participants. In addition to individual CVD risk factors, Framingham and Systematic Coronary Risk Evaluation (SCORE) algorithms were used to assess the absolute risk of a CVD event within the next decade. Results The differences in individual risk factors for CVD reached statistical significance for some groups, but the differences were not consistent or clinically significant. There were no clinically significant differences in CVD risk as measured by Framingham or SCORE algorithms between never smokers, recent quitters and former smokers (7.5%, 7.4%, 8.1% for men; 3.3%, 3.0%, 3.2% for women; p<0.001). Conclusions The effect of past or present smoking on individual CVD risk parameters such as blood pressure and cholesterol seems to be of clinically minor significance in people aged 46 years. In other words, smoking seems to be above all an independent risk factor for CVD in the working-age population. Quitting smoking in working age may thus reduce calculated CVD risk nearly to the same level with people who have never smoked. PMID:27493759
Large-Scale Simulation of Multi-Asset Ising Financial Markets
NASA Astrophysics Data System (ADS)
Takaishi, Tetsuya
2017-03-01
We perform a large-scale simulation of an Ising-based financial market model that includes 300 asset time series. The financial system simulated by the model shows a fat-tailed return distribution and volatility clustering and exhibits unstable periods indicated by the volatility index measured as the average of absolute-returns. Moreover, we determine that the cumulative risk fraction, which measures the system risk, changes at high volatility periods. We also calculate the inverse participation ratio (IPR) and its higher-power version, IPR6, from the absolute-return cross-correlation matrix. Finally, we show that the IPR and IPR6 also change at high volatility periods.
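The IPR calculation mentioned above is compact: form the cross-correlation matrix of absolute returns, take its eigenvectors, and sum the fourth (or sixth) powers of each eigenvector's components. A sketch on random stand-in data (the simulated 300-asset Ising series are not reproduced here):

```python
import numpy as np

def participation_ratios(returns):
    """IPR and IPR6 for each eigenvector of the absolute-return correlation matrix.

    returns: array of shape (n_assets, n_timesteps)
    """
    corr = np.corrcoef(np.abs(returns))      # cross-correlation of absolute returns
    _, eigvecs = np.linalg.eigh(corr)        # columns are eigenvectors (ascending eigenvalues)
    ipr = np.sum(eigvecs ** 4, axis=0)
    ipr6 = np.sum(eigvecs ** 6, axis=0)
    return ipr, ipr6

rng = np.random.default_rng(42)
toy_returns = rng.standard_normal((300, 1000))   # stand-in for the 300 asset series
ipr, ipr6 = participation_ratios(toy_returns)
# IPR near 1/N indicates a delocalized eigenvector; values near 1 indicate localization.
print(ipr[-1], ipr6[-1])   # largest-eigenvalue mode
```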
Chen, Lingjing; Eloranta, Sandra; Martling, Anna; Glimelius, Ingrid; Neovius, Martin; Glimelius, Bengt; Smedby, Karin E
2018-03-01
A population-based cohort and four randomized trials enriched with long-term register data were used to clarify if radiotherapy in combination with rectal cancer surgery is associated with increased risks of cardiovascular disease (CVD). We identified 14,901 rectal cancer patients diagnosed 1995-2009 in Swedish nationwide registers, of whom 9227 were treated with preoperative radiotherapy. Also, we investigated 2675 patients with rectal cancer previously randomized to preoperative radiotherapy or not followed by surgery in trials conducted 1980-1999. Risks of CVD overall and subtypes were estimated based on prospectively recorded hospital visits during relapse-free follow-up using multivariable Cox regression. Maximum follow-up was 18 and 33 years in the register and trials, respectively. We found no association between preoperative radiotherapy and overall CVD risk in the register (Incidence Rate Ratio, IRR = 0.99, 95% confidence interval (CI) 0.92-1.06) or in the pooled trials (IRR = 1.07, 95% CI 0.93-1.24). We noted an increased risk of venous thromboembolism among irradiated patients in both cohorts (IRR register = 1.41, 95% CI 1.15-2.72; IRR trials = 1.41, 95% CI 0.97-2.04), that remained during the first 6 months following surgery among patients treated 2006-2009, after the introduction of antithrombotic treatment (IRR 6 months = 2.30, 95% CI 1.01-5.21). However, the absolute rate difference of venous thromboembolism attributed to RT was low (10 cases per 1000 patients and year). Preoperative radiotherapy did not affect rectal cancer patients' risk of CVD overall. Although an excess risk of short-term venous thromboembolism was noted, the small increase in absolute numbers does not call for general changes in routine prophylactic treatment, but might do so for patients already at high risk of venous thromboembolism. Copyright © 2017 Elsevier B.V. All rights reserved.
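A sketch of the Cox regression setup described above, using the lifelines package (assumed available) with hypothetical column names and simulated data; the actual analysis also restricted follow-up to relapse-free time and adjusted for additional covariates:

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical patient-level data standing in for the register cohort.
rng = np.random.default_rng(7)
n = 2000
df = pd.DataFrame({
    "followup_years": rng.exponential(8, n).clip(0.1, 18),  # relapse-free follow-up
    "cvd_event": rng.integers(0, 2, n),                     # 1 = CVD hospital visit
    "preop_rt": rng.integers(0, 2, n),                      # 1 = preoperative radiotherapy
    "age": rng.uniform(40, 85, n),
    "male": rng.integers(0, 2, n),
})

cph = CoxPHFitter()
cph.fit(df, duration_col="followup_years", event_col="cvd_event")
# The exp(coef) for preop_rt is the adjusted hazard ratio; a value near 1.0
# would mirror the null overall CVD finding reported above.
cph.print_summary()
```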
Armstrong, Miranda E.G.; Cairns, Benjamin J.; Banks, Emily; Green, Jane; Reeves, Gillian K.; Beral, Valerie
2012-01-01
While increasing age, decreasing body mass index (BMI), and physical inactivity are known to increase hip fracture risk, whether these factors have similar effects on other common fractures is not well established. We used prospectively-collected data from a large cohort to examine the role of these factors on the risk of incident ankle, wrist and hip fractures in postmenopausal women. 1,155,304 postmenopausal participants in the Million Women Study with a mean age of 56.0 (SD 4.8) years, provided information about lifestyle, anthropometric, and reproductive factors at recruitment in 1996–2001. All participants were linked to National Health Service cause-specific hospital records for day-case or overnight admissions. During follow-up for an average of 8.3 years per woman, 6807 women had an incident ankle fracture, 9733 an incident wrist fracture, and 5267 an incident hip fracture. Adjusted absolute and relative risks (RRs) for incident ankle, wrist, and hip fractures were calculated using Cox regression models. Age-specific rates for wrist and hip fractures increased sharply with age, whereas rates for ankle fracture did not. Cumulative absolute risks from ages 50 to 84 years per 100 women were 2.5 (95%CI 2.2–2.8) for ankle fracture, 5.0 (95%CI 4.4–5.5) for wrist fracture, and 6.2 (95%CI 5.5–7.0) for hip fracture. Compared with lean women (BMI < 20 kg/m2), obese women (BMI ≥ 30 kg/m2) had a three-fold increased risk of ankle fracture (RR = 3.07; 95%CI 2.53–3.74), but a substantially reduced risk of wrist fracture and especially of hip fracture (RR = 0.57; 0.51–0.64 and 0.23; 0.21–0.27, respectively). Physical activity was associated with a reduced risk of hip fracture but was not associated with ankle or wrist fracture risk. Ankle, wrist and hip fractures are extremely common in postmenopausal women, but the associations with age, adiposity, and physical activity differ substantially between the three fracture sites. PMID:22465850
Laursen, T M; Munk-Olsen, T; Mortensen, P B; Abel, K M; Appleby, L; Webb, R T
2011-05-01
Although rare in absolute terms, risk of homicide is markedly elevated among children of parents with mental disorders. Our aims were to examine risk of child homicide if 1 or both parents had a psychiatric history, to compare effects by parental sex and diagnostic group, and to assess likelihood of child homicide being perpetrated by parents according to their psychiatric history. A prospective, register-based cohort study using the entire Danish population born between January 1, 1973, and January 1, 2007, was conducted. Follow-up of the cohort members began on their date of birth and ended on January 1, 2007; their 18th birthday; their date of death; or their date of emigration, whichever came first. We used the Danish national registers from 1973 to 2007 to study homicide risk between children whose parents were previously admitted to a psychiatric hospital, including diagnosis-specific analyses, versus their unexposed counterparts. In addition, we used police records during 2000 to 2005 to examine whether or not 1 of the parents was the perpetrator. Rates of homicide were analyzed using survival analysis. Children of parents previously admitted to a psychiatric hospital had an overall higher risk of being homicide victims (MRR = 8.94; 95% CI, 6.56-12.18). The risk differed according to parental sex and psychiatric diagnosis (ICD-8 and ICD-10 criteria). The absolute risk of homicide was 0.009% if neither parent had been admitted before the birth of their child and 0.051% if 1 of the parents had previously been admitted. During 2000 to 2005, 88% of the child homicide cases were filicide victims. This percentage was not significantly different for parents with a previous psychiatric admission versus those without such a history. In the large majority of Danish child-homicide cases, a parent was the perpetrator, regardless of whether there had been parental admission to a psychiatric hospital. Children of parents previously admitted had a higher risk of being homicide victims, and risks were especially high in young children whose mothers were hospitalized with affective disorders or schizophrenia. However, the relative risks presented in the current study are based on extremely rare events, and the overwhelming majority of children whose parents have a psychiatric history do not become homicide victims. © Copyright 2011 Physicians Postgraduate Press, Inc.
Grey, Corina; Wells, Sue; Riddell, Tania; Pylypchuk, Romana; Marshall, Roger; Drury, Paul; Elley, Raina; Ameratunga, Shanthi; Gentles, Dudley; Erick-Peletiy, Stephanie; Bell, Fionna; Kerr, Andrew; Jackson, Rod
2010-11-05
Data on the cardiovascular disease risk profiles of Pacific peoples in New Zealand are usually aggregated and treated as a single entity. Little is known about the comparability or otherwise of cardiovascular disease (CVD) risk between different Pacific groups. To compare CVD risk profiles for the main Pacific ethnic groups assessed in New Zealand primary care practice to determine if it is reasonable to aggregate these data, or if significant differences exist. A web-based clinical decision support system for CVD risk assessment and management (PREDICT) has been implemented in primary care practices in nine PHOs throughout Auckland and Northland since 2002, covering approximately 65% of the population of these regions. Between 2002 and January 2009, baseline CVD risk assessments were carried out on 11,642 patients aged 35-74 years identifying with one or more Pacific ethnic groups (4933 Samoans, 1724 Tongans, 1366 Cook Island Maori, 880 Niueans, 1341 Fijians and 1398 people identified as Other Pacific or Pacific Not Further Defined). Fijians were subsequently excluded from the analyses because of a probable misclassification error that appears to combine Fijian Indians with ethnic Fijians. Prevalences of smoking, diabetes and prior history of CVD, as well as mean total cholesterol/HDL ratio, systolic and diastolic blood pressures, and Framingham 5-year CVD risk were calculated for each Pacific group. Age-adjusted risk ratios and mean differences stratified by gender were calculated using Samoans as the reference group. Cook Island women were almost 60% more likely to smoke than Samoan women. While Tongan men had the highest proportion of smoking (29%) among Pacific men, Tongan women had the lowest smoking proportion (10%) among Pacific women. Tongan women and Niuean men and women had a higher burden of diabetes than other Pacific ethnic groups, with prevalences 20-30% higher than those of their Samoan counterparts. Niuean men and women had lower blood pressure levels than all other Pacific groups while Tongan men and women had the highest total cholesterol to HDL ratios. Tongan men and women had higher absolute 5-year CVD risk scores, as estimated by the Framingham equation, than their Samoan counterparts (age-adjusted mean differences 0.71% [95% CI 0.36% to 1.06%] for Tongan men and 0.52% [95% CI 0.17% to 0.86%] for Tongan women) although these risk differences were only about 10% higher in relative terms. The validity of the analyses depends on the assumption that the selection of participants for CVD risk assessment in primary care is similar between Pacific groups. The ethnic-specific CVD risk profiles presented do not represent estimates of population prevalence. Almost all previous Pacific data have been aggregated, with Pacific peoples treated as a single entity, because of small sample sizes. We have analysed data from the largest study to date measuring CVD risk factors in Pacific peoples living in New Zealand. Our findings suggest that aggregating Pacific population data appears to be reasonable in terms of assessing absolute CVD risk; however, there are differences for specific CVD risk factors between Pacific ethnic groups that may be important for targeting community-level interventions.
Ntaios, George; Papavasileiou, Vasileios; Diener, Hans-Chris; Makaritsis, Konstantinos; Michel, Patrik
2017-08-01
Background In a previous systematic review and meta-analysis, we assessed the efficacy and safety of nonvitamin-K antagonist oral anticoagulants versus warfarin in patients with atrial fibrillation and stroke or transient ischemic attack. Since then, new information has become available. Aim The aim of the present work was to update the results of the previous systematic review and meta-analysis. Methods We searched PubMed until 24 August 2016 for randomized controlled trials using the following search items: "atrial fibrillation" and "anticoagulation" and "warfarin" and "previous stroke or transient ischemic attack." Eligible studies had to be phase III trials in patients with atrial fibrillation comparing warfarin with nonvitamin-K antagonist oral anticoagulants currently on the market or with the intention to be brought to the market in North America or Europe. The outcomes assessed in the efficacy analysis included stroke or systemic embolism, stroke, ischemic or unknown stroke, disabling or fatal stroke, hemorrhagic stroke, cardiovascular death, death from any cause, and myocardial infarction. The outcomes assessed in the safety analysis included major bleeding, intracranial bleeding, and major gastrointestinal bleeding. We performed fixed-effects analyses on an intention-to-treat basis. Results Among 183 potentially eligible articles, four were included in the meta-analysis. In 20,500 patients, compared to warfarin, nonvitamin-K antagonist oral anticoagulants were associated with a significant reduction of stroke/systemic embolism (relative risk reduction: 13.7%, absolute risk reduction: 0.78%, number needed to treat to prevent one event: 127), hemorrhagic stroke (relative risk reduction: 50.0%, absolute risk reduction: 0.63%, number needed to treat: 157), any stroke (relative risk reduction: 13.1%, absolute risk reduction: 0.7%, number needed to treat: 142), and intracranial hemorrhage (relative risk reduction: 46.1%, absolute risk reduction: 0.88%, number needed to treat: 113) over 1.8-2.8 years. Conclusions This updated meta-analysis in 20,500 atrial fibrillation patients with previous stroke or transient ischemic attack shows that, compared to warfarin, nonvitamin-K antagonist oral anticoagulants are associated with a significant reduction of stroke, stroke or systemic embolism, hemorrhagic stroke, and intracranial bleeding.
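The numbers needed to treat quoted above follow from the absolute risk reductions via the standard relation NNT ≈ 1/ARR. A minimal check using the ARRs as printed; the small discrepancies from the published NNTs (e.g. 128 vs. 127) presumably reflect rounding of the underlying event rates.

```python
# NNT from absolute risk reduction (ARR): NNT ≈ 1 / ARR.
# ARRs below are the values quoted in the abstract; minor differences from the
# published NNTs likely reflect rounding of the underlying event rates.
outcomes = {
    "stroke or systemic embolism": 0.0078,  # published NNT 127
    "hemorrhagic stroke": 0.0063,           # published NNT 157
    "any stroke": 0.0070,                   # published NNT 142
    "intracranial hemorrhage": 0.0088,      # published NNT 113
}
for name, arr in outcomes.items():
    print(f"{name:<28} ARR={arr:.2%}  NNT≈{1 / arr:.0f}")
```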
Kang, Dong Oh; Seo, Hong Seog; Choi, Byung Geol; Lee, Eunmi; Kim, Ji Park; Lee, Sun Ki; Im, Sung Il; Na, Jin Oh; Choi, Cheol Ung; Lim, Hong Euy; Kim, Jin Won; Kim, Eung Ju; Rha, Seung-Woon; Park, Chang Gyu; Oh, Dong Joo
2015-01-20
Major adverse cardiovascular events (MACEs) in patients with or without cardiovascular disease (CVD) are greatly affected by various factors associated with metabolism and inflammation. To determine which clinical parameters at treatment are associated with the development of 2-year and 5-year MACEs in high-risk patients with CVD who have undergone drug-eluting stent (DES) implantation. The present study involved a total of 432 patients who underwent percutaneous coronary intervention (PCI) with DES. Variables representing the average and absolute amount of change in clinical parameters over the 12-month follow-up were assessed for association with 2-year and 5-year development of MACE. The study population was divided into quartiles for the variable showing the highest correlation to MACE development. Estimated incidence of 2-year and 5-year MACEs for each of the quartiles was determined by survival curve analysis, and subgroup analysis was performed for patients with diabetes and statin users. Absolute change in fasting plasma glucose (FPG) over 12 months showed the highest correlation with 2-year and 5-year MACE development. The estimated incidence of MACE increased with increasing quartiles for absolute change in FPG. The association between absolute change in FPG and MACE development exhibited a stronger relationship for the specific subgroups of patients with diabetes and statin users. Increases and decreases in FPG had a comparable contribution to MACE development. A greater absolute change in FPG over 12 months post-PCI is an independent risk factor for 2-year and 5-year MACE development in DES-implanted patients, especially in patients with diabetes and in statin users. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
Quantification of Treatment Effect Modification on Both an Additive and Multiplicative Scale
Girerd, Nicolas; Rabilloud, Muriel; Pibarot, Philippe; Mathieu, Patrick; Roy, Pascal
2016-01-01
Background In both observational and randomized studies, associations with overall survival are by and large assessed on a multiplicative scale using the Cox model. However, clinicians and clinical researchers have an ardent interest in assessing absolute benefit associated with treatments. In older patients, some studies have reported lower relative treatment effect, which might translate into similar or even greater absolute treatment effect given their high baseline hazard for clinical events. Methods The effect of treatment and the effect modification of treatment were respectively assessed using a multiplicative and an additive hazard model in an analysis adjusted for propensity score in the context of coronary surgery. Results The multiplicative model yielded a lower relative hazard reduction with bilateral internal thoracic artery grafting in older patients (Hazard ratio for interaction/year = 1.03, 95%CI: 1.00 to 1.06, p = 0.05) whereas the additive model reported a similar absolute hazard reduction with increasing age (Delta for interaction/year = 0.10, 95%CI: -0.27 to 0.46, p = 0.61). The number needed to treat derived from the propensity score-adjusted multiplicative model was remarkably similar at the end of the follow-up in patients aged ≤60 and in patients aged >70. Conclusions The present example demonstrates that a lower treatment effect in older patients on a relative scale can conversely translate into a similar treatment effect on an additive scale due to large baseline hazard differences. Importantly, absolute risk reduction, either crude or adjusted, can be calculated from multiplicative survival models. We advocate for a wider use of the absolute scale, especially using additive hazard models, to assess treatment effect and treatment effect modification. PMID:27045168
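The point that a weaker relative effect in older patients can coincide with a similar or larger absolute effect is easy to see numerically. The sketch below uses made-up baseline risks and hazard ratios (not the study's data): under proportional hazards, the treated event-free survival is the control survival raised to the power of the hazard ratio, from which the absolute risk reduction and number needed to treat follow.

```python
# Illustrative only: how a weaker hazard ratio can still give a larger
# absolute risk reduction when the baseline risk is higher.
# Under proportional hazards, S_treated(t) = S_control(t) ** HR.
def abs_risk_reduction(baseline_risk, hazard_ratio):
    s_control = 1.0 - baseline_risk
    s_treated = s_control ** hazard_ratio
    return s_treated - s_control  # reduction in absolute event risk

for label, baseline_risk, hr in [("younger, low baseline risk", 0.10, 0.70),
                                 ("older, high baseline risk", 0.30, 0.85)]:
    arr = abs_risk_reduction(baseline_risk, hr)
    print(f"{label}: ARR = {arr:.1%}, NNT ≈ {1 / arr:.0f}")
```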
Santerre, Cyrille; Vallet, Nadine; Touboul, David
2018-06-02
Supercritical fluid chromatography hyphenated with high resolution mass spectrometry (SFC-HRMS) was developed for fingerprint analysis of different flower absolutes commonly used in the cosmetics field, especially in perfumes. A supercritical fluid chromatography-atmospheric pressure photoionization-high resolution mass spectrometry (SFC-APPI-HRMS) technique was employed to identify the components of the fingerprint. The samples were separated with a porous graphitic carbon (PGC) Hypercarb™ column (100 mm × 2.1 mm, 3 μm) by gradient elution using supercritical CO2 and ethanol (0.0-20.0 min (2-30% B), 20.0-25.0 min (30% B), 25.0-26.0 min (30-2% B) and 26.0-30.0 min (2% B)) as the mobile phase at a flow rate of 1.5 mL/min. In order to compare the SFC fingerprints between five different flower absolutes: Jasminum grandiflorum absolutes, Jasminum sambac absolutes, Narcissus jonquilla absolutes, Narcissus poeticus absolutes, Lavandula angustifolia absolutes from different suppliers and batches, the chemometric procedure including principal component analysis (PCA) was applied to classify the samples according to their genus and their species. Consistent results were obtained to show that samples could be successfully discriminated. Copyright © 2018 Elsevier B.V. All rights reserved.
Poljak, Mario; Oštrbenk, Anja
2013-01-01
Human papillomavirus (HPV) testing has become an essential part of current clinical practice in the management of cervical cancer and precancerous lesions. We reviewed the most important validation studies of a next-generation real-time polymerase chain reaction (PCR)-based assay, the RealTime High Risk HPV test (RealTime) (Abbott Molecular, Des Plaines, IL, USA), for triage in referral population settings and for use in primary cervical cancer screening in women 30 years and older, published in peer-reviewed journals from 2009 to 2013. RealTime is designed to detect 14 high-risk HPV genotypes with concurrent distinction of HPV-16 and HPV-18 from 12 other HPV genotypes. The test was launched on the European market in January 2009 and is currently used in many laboratories worldwide for routine detection of HPV. Eight validation studies of RealTime in referral settings showed its consistently high absolute clinical sensitivity for both CIN2+ (range 88.3-100%) and CIN3+ (range 93.0-100%), as well as comparative clinical sensitivity relative to the currently most widely used HPV test: the Qiagen/Digene Hybrid Capture 2 HPV DNA Test (HC2). Due to the significantly different composition of the referral populations, RealTime absolute clinical specificity for CIN2+ and CIN3+ varied greatly across studies, but was comparable relative to HC2. Four validation studies of RealTime performance in cervical cancer screening settings showed its consistently high absolute clinical sensitivity for both CIN2+ and CIN3+, as well as comparative clinical sensitivity and specificity relative to HC2 and GP5+/6+ PCR. RealTime has been extensively evaluated in the last 4 years. RealTime can be considered clinically validated for triage in referral population settings and for use in primary cervical cancer screening in women 30 years and older.
2014-08-16
We aimed to investigate whether the benefits of blood pressure-lowering drugs are proportional to baseline cardiovascular risk, to establish whether absolute risk could be used to inform treatment decisions for blood pressure-lowering therapy, as is recommended for lipid-lowering therapy. This meta-analysis included individual participant data from trials that randomly assigned patients to either blood pressure-lowering drugs or placebo, or to more intensive or less intensive blood pressure-lowering regimens. The primary outcome was total major cardiovascular events, consisting of stroke, heart attack, heart failure, or cardiovascular death. Participants were separated into four categories of baseline 5-year major cardiovascular risk using a risk prediction equation developed from the placebo groups of the included trials (<11%, 11-15%, 15-21%, >21%). 11 trials and 26 randomised groups met the inclusion criteria, and included 67,475 individuals, of whom 51,917 had available data for the calculation of the risk equations. 4167 (8%) had a cardiovascular event during a median of 4·0 years (IQR 3·4-4·4) of follow-up. The mean estimated baseline levels of 5-year cardiovascular risk for each of the four risk groups were 6·0% (SD 2·0), 12·1% (1·5), 17·7% (1·7), and 26·8% (5·4). In each consecutive higher risk group, blood pressure-lowering treatment reduced the risk of cardiovascular events relatively by 18% (95% CI 7-27), 15% (4-25), 13% (2-22), and 15% (5-24), respectively (p=0·30 for trend). However, in absolute terms, treating 1000 patients in each group with blood pressure-lowering treatment for 5 years would prevent 14 (95% CI 8-21), 20 (8-31), 24 (8-40), and 38 (16-61) cardiovascular events, respectively (p=0·04 for trend). Lowering blood pressure provides similar relative protection at all levels of baseline cardiovascular risk, but progressively greater absolute risk reductions as baseline risk increases. These results support the use of predicted baseline cardiovascular disease risk equations to inform blood pressure-lowering treatment decisions. None. Copyright © 2014 Elsevier Ltd. All rights reserved.
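The pattern reported above, roughly constant relative reductions but steeply rising absolute benefit, follows from the approximation ARR ≈ baseline risk × relative risk reduction. The sketch below uses the group-level figures quoted in the abstract; it will not reproduce the published per-1000 estimates exactly, since those come from the trial-level analysis, but it shows the scaling.

```python
# Approximate events prevented per 1000 treated over 5 years:
# ARR ≈ baseline 5-year risk × relative risk reduction.
# Figures will not match the published estimates exactly (those account for
# trial-specific follow-up and modelling); this only illustrates the scaling.
groups = [  # (mean baseline 5-year risk, relative risk reduction)
    (0.060, 0.18),
    (0.121, 0.15),
    (0.177, 0.13),
    (0.268, 0.15),
]
for baseline, rrr in groups:
    prevented_per_1000 = 1000 * baseline * rrr
    print(f"baseline {baseline:.1%}: ≈{prevented_per_1000:.0f} events prevented per 1000 treated")
```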
Cole, Stephen R; Lau, Bryan; Eron, Joseph J; Brookhart, M Alan; Kitahata, Mari M; Martin, Jeffrey N; Mathews, William C; Mugavero, Michael J
2015-02-15
There are few published examples of absolute risk estimated from epidemiologic data subject to censoring and competing risks with adjustment for multiple confounders. We present an example estimating the effect of injection drug use on 6-year risk of acquired immunodeficiency syndrome (AIDS) after initiation of combination antiretroviral therapy between 1998 and 2012 in an 8-site US cohort study with death before AIDS as a competing risk. We estimate the risk standardized to the total study sample by combining inverse probability weights with the cumulative incidence function; estimates of precision are obtained by bootstrap. In 7,182 patients (83% male, 33% African American, median age of 38 years), we observed 6-year standardized AIDS risks of 16.75% among 1,143 injection drug users and 12.08% among 6,039 nonusers, yielding a standardized risk difference of 4.68 (95% confidence interval: 1.27, 8.08) and a standardized risk ratio of 1.39 (95% confidence interval: 1.12, 1.72). Results may be sensitive to the assumptions of exposure-version irrelevance, no measurement bias, and no unmeasured confounding. These limitations suggest that results be replicated with refined measurements of injection drug use. Nevertheless, estimating the standardized risk difference and ratio is straightforward, and injection drug use appears to increase the risk of AIDS. © The Author 2014. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
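The standardized risk difference and risk ratio quoted above follow directly from the two standardized risks; the check below reproduces only the point estimates (the paper's confidence intervals require bootstrapping the full inverse-probability-weighted cumulative incidence procedure, which is not attempted here, and the published 4.68 reflects unrounded inputs).

```python
# Standardized 6-year AIDS risks reported in the abstract.
risk_idu, risk_nonuser = 0.1675, 0.1208
rd = risk_idu - risk_nonuser  # risk difference
rr = risk_idu / risk_nonuser  # risk ratio
print(f"risk difference: {100 * rd:.2f} per 100")  # ≈4.67 (published 4.68 from unrounded risks)
print(f"risk ratio:      {rr:.2f}")                # ≈1.39
```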
Age of Red Cells for Transfusion and Outcomes in Critically Ill Adults.
Cooper, D James; McQuilten, Zoe K; Nichol, Alistair; Ady, Bridget; Aubron, Cécile; Bailey, Michael; Bellomo, Rinaldo; Gantner, Dashiell; Irving, David O; Kaukonen, Kirsi-Maija; McArthur, Colin; Murray, Lynne; Pettilä, Ville; French, Craig
2017-11-09
It is uncertain whether the duration of red-cell storage affects mortality after transfusion among critically ill adults. In an international, multicenter, randomized, double-blind trial, we assigned critically ill adults to receive either the freshest available, compatible, allogeneic red cells (short-term storage group) or standard-issue (oldest available), compatible, allogeneic red cells (long-term storage group). The primary outcome was 90-day mortality. From November 2012 through December 2016, at 59 centers in five countries, 4994 patients underwent randomization and 4919 (98.5%) were included in the primary analysis. Among the 2457 patients in the short-term storage group, the mean storage duration was 11.8 days. Among the 2462 patients in the long-term storage group, the mean storage duration was 22.4 days. At 90 days, there were 610 deaths (24.8%) in the short-term storage group and 594 (24.1%) in the long-term storage group (absolute risk difference, 0.7 percentage points; 95% confidence interval [CI], -1.7 to 3.1; P=0.57). At 180 days, the absolute risk difference was 0.4 percentage points (95% CI, -2.1 to 3.0; P=0.75). Most of the prespecified secondary measures showed no significant between-group differences in outcome. The age of transfused red cells did not affect 90-day mortality among critically ill adults. (Funded by the Australian National Health and Medical Research Council and others; TRANSFUSE Australian and New Zealand Clinical Trials Registry number, ACTRN12612000453886 ; ClinicalTrials.gov number, NCT01638416 .).
Electrotherapy modalities for adhesive capsulitis (frozen shoulder).
Page, Matthew J; Green, Sally; Kramer, Sharon; Johnston, Renea V; McBain, Brodwen; Buchbinder, Rachelle
2014-10-01
Adhesive capsulitis (also termed frozen shoulder) is a common condition characterised by spontaneous onset of pain, progressive restriction of movement of the shoulder and disability that restricts activities of daily living, work and leisure. Electrotherapy modalities, which aim to reduce pain and improve function via an increase in energy (electrical, sound, light, thermal) into the body, are often delivered as components of a physical therapy intervention. This review is one in a series of reviews which form an update of the Cochrane review 'Physiotherapy interventions for shoulder pain'. To synthesise the available evidence regarding the benefits and harms of electrotherapy modalities, delivered alone or in combination with other interventions, for the treatment of adhesive capsulitis. We searched CENTRAL, MEDLINE, EMBASE, CINAHL Plus and the ClinicalTrials.gov and World Health Organization (WHO) International Clinical Trials Registry Platform (ICTRP) clinical trials registries up to May 2014, unrestricted by language, and reviewed the reference lists of review articles and retrieved trials to identify any other potentially relevant trials. We included randomised controlled trials (RCTs) and controlled clinical trials using a quasi-randomised method of allocation that included adults with adhesive capsulitis and compared any electrotherapy modality to placebo, no treatment, a different electrotherapy modality, or any other intervention. The two main questions of the review focused on whether electrotherapy modalities are effective compared to placebo or no treatment, or if they are an effective adjunct to manual therapy or exercise (or both). The main outcomes of interest were participant-reported pain relief of 30% or greater, overall pain, function, global assessment of treatment success, active shoulder abduction, quality of life, and the number of participants experiencing any adverse event. Two review authors independently selected trials for inclusion, extracted the data, performed a risk of bias assessment, and assessed the quality of the body of evidence for the main outcomes using the GRADE approach. Nineteen trials (1249 participants) were included in the review. Four trials reported using an adequate method of allocation concealment and six trials blinded participants and personnel. Only two electrotherapy modalities (low-level laser therapy (LLLT) and pulsed electromagnetic field therapy (PEMF)) have been compared to placebo. No trial has compared an electrotherapy modality plus manual therapy and exercise to manual therapy and exercise alone. The two main questions of the review were investigated in nine trials. Low quality evidence from one trial (40 participants) indicated that LLLT for six days may result in improvement at six days. Eighty per cent (16/20) of participants reported treatment success with LLLT compared with 10% (2/20) of participants receiving placebo (risk ratio (RR) 8.00, 95% confidence interval (CI) 2.11 to 30.34; absolute risk difference 70%, 95% CI 48% to 92%). No participants in either group reported adverse events. We were uncertain whether PEMF for two weeks improved pain or function more than placebo at two weeks because of the very low quality evidence from one trial (32 participants). Seventy-five per cent (15/20) of participants reported pain relief of 30% or more with PEMF compared with 0% (0/12) of participants receiving placebo (RR 19.19, 95% CI 1.25 to 294.21; absolute risk difference 75%, 95% CI 53% to 97%).
Fifty-five per cent (11/20) of participants reported total recovery of joint function with PEMF compared with 0% (0/12) of participants receiving placebo (RR 14.24, 95% CI 0.91 to 221.75; absolute risk difference 55%, 95% CI 31 to 79). Moderate quality evidence from one trial (63 participants) indicated that LLLT plus exercise for eight weeks probably results in greater improvement when measured at the fourth week of treatment, but a similar number of adverse events, compared with placebo plus exercise. The mean pain score at four weeks was 51 points with placebo plus exercise, while with LLLT plus exercise the mean pain score was 32 points on a 100 point scale (mean difference (MD) 19 points, 95% CI 15 to 23; absolute risk difference 19%, 95% CI 15% to 23%). The mean function impairment score was 48 points with placebo plus exercise, while with LLLT plus exercise the mean function impairment score was 36 points on a 100 point scale (MD 12 points, 95% CI 6 to 18; absolute risk difference 12%, 95% CI 6 to 18). Mean active abduction was 70 degrees with placebo plus exercise, while with LLLT plus exercise mean active abduction was 79 degrees (MD 9 degrees, 95% CI 2 to 16; absolute risk difference 5%, 95% CI 1% to 9%). No participants in either group reported adverse events. LLLT's benefits on function were maintained at four months. Based on very low quality evidence from six trials, we were uncertain whether therapeutic ultrasound, PEMF, continuous short wave diathermy, Iodex phonophoresis, a combination of Iodex iontophoresis with continuous short wave diathermy, or a combination of therapeutic ultrasound with transcutaneous electrical nerve stimulation (TENS) were effective adjuncts to exercise. Based on low or very low quality evidence from 12 trials, we were uncertain whether a diverse range of electrotherapy modalities (delivered alone or in combination with manual therapy, exercise, or other active interventions) were more or less effective than other active interventions (for example glucocorticoid injection). Based upon low quality evidence from one trial, LLLT for six days may be more effective than placebo in terms of global treatment success at six days. Based upon moderate quality evidence from one trial, LLLT plus exercise for eight weeks may be more effective than exercise alone in terms of pain up to four weeks, and function up to four months. It is unclear whether PEMF is more or less effective than placebo, or whether other electrotherapy modalities are an effective adjunct to exercise. Further high quality randomised controlled trials are needed to establish the benefits and harms of physical therapy interventions (that comprise electrotherapy modalities, manual therapy and exercise, and are reflective of clinical practice) compared to interventions with evidence of benefit (for example glucocorticoid injection or arthrographic joint distension).
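The risk ratios and absolute risk differences in this review are simple functions of the 2x2 counts; for example, the LLLT treatment-success result quoted above can be checked as follows.

```python
# Risk ratio (RR) and absolute risk difference (ARD) from event counts,
# here the LLLT vs placebo treatment-success counts quoted above.
def rr_and_ard(events_tx, n_tx, events_ctrl, n_ctrl):
    risk_tx, risk_ctrl = events_tx / n_tx, events_ctrl / n_ctrl
    return risk_tx / risk_ctrl, risk_tx - risk_ctrl

rr, ard = rr_and_ard(16, 20, 2, 20)
print(f"RR = {rr:.2f}, ARD = {ard:.0%}")  # RR 8.00, ARD 70%, matching the abstract
```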
Rønn, Pernille Falberg; Lucas, Michel; Laouan Sidi, Elhadji A; Tvermosegaard, Maria; Andersen, Gregers Stig; Lauritzen, Torsten; Toft, Ulla; Carstensen, Bendix; Christensen, Dirk Lund; Jørgensen, Marit Eika
2017-10-01
Inuit populations have lower levels of cardiometabolic risk factors for the same level of body mass index (BMI) or waist circumference (WC) compared to Europeans in cross-sectional studies. We aimed to compare the longitudinal associations of anthropometric measures with cardiovascular disease (CVD) and all-cause mortality in Inuit and Europeans. Using pooled data from three population-based studies in Canada, Greenland and Denmark, we conducted a cohort study of 10,033 adult participants (765 Nunavik Inuit, 2960 Greenlandic Inuit and 6308 Europeans). Anthropometric measures collected at baseline included: BMI, WC, waist-to-hip-ratio (WHR), waist-to-height-ratio (WHtR) and a body shape index (ABSI). Information on CVD and death was retrieved from national registers or medical files. Poisson regression analyses were used to calculate incidence rates for CVD and all-cause mortality. During a median follow-up of 10.5 years, there were 642 CVD events and 594 deaths. Slightly higher absolute incidence rates of CVD for a given anthropometric measure were found in Nunavik Inuit compared with Greenlandic Inuit and the Europeans; however, no cohort interactions were observed. For all-cause mortality, all anthropometric measures were positively associated in the Europeans, but only ABSI in the two Inuit populations. In contrast, BMI and WC were inversely associated with mortality in the two Inuit populations. Inuit and Europeans have different absolute incidences of CVD and all-cause mortality, but the trends in the associations with the anthropometric measures only differ for all-cause mortality. Previous findings of a lower obesity-associated cardiometabolic risk among Inuit were not confirmed. Copyright © 2017 Elsevier B.V. All rights reserved.
Dunn, William G; Walters, Matthew R
2014-11-01
The deep-fried Mars bar has been cited as 'all that is wrong with the high-fat, high-sugar Scottish diet'. We investigated the effect of ingestion of a deep-fried Mars bar or porridge on cerebrovascular reactivity. We hypothesised that deep-fried Mars bar ingestion would impair cerebrovascular reactivity, which is associated with increased risk of ischaemic stroke. Twenty-four fasted volunteers were randomised to receive a deep-fried Mars bar and then porridge (control), or vice-versa. We used transcranial Doppler ultrasound to calculate Breath Holding Index as a surrogate measure of cerebrovascular reactivity. Change in Breath Holding Index post-ingestion was the primary outcome measure. Twenty-four healthy adults (mean (SD) age 21.5 (1.7) years, 14 males) completed the protocol. Deep-fried Mars bar ingestion caused a non-significant reduction in cerebrovascular reactivity relative to control (mean difference in absolute Breath Holding Index after deep-fried Mars bar versus porridge -0.11, p = 0.40). Comparison of the difference between the absolute change in Breath Holding Index between genders demonstrated a significant impairment of cerebrovascular reactivity in males (mean difference women minus men of 0.65, 95% CI 0.30 to 1.00, p = 0.0003). Ingestion of a bolus of sugar and fat caused no overall difference in cerebrovascular reactivity, but there was a modest decrease in males. Impaired cerebrovascular reactivity is associated with increased stroke risk, and therefore deep-fried Mars bar ingestion may acutely contribute to cerebral hypoperfusion in men. © The Author(s) 2014 Reprints and permissions: sagepub.co.uk/journalsPermissions.nav.
Katki, Hormuzd A; Schiffman, Mark
2018-05-01
Our work involves assessing whether new biomarkers might be useful for cervical-cancer screening across populations with different disease prevalences and biomarker distributions. When comparing across populations, we show that standard diagnostic accuracy statistics (predictive values, risk-differences, Youden's index and Area Under the Curve (AUC)) can easily be misinterpreted. We introduce an intuitively simple statistic for a 2 × 2 table, Mean Risk Stratification (MRS): the average change in risk (pre-test vs. post-test) revealed for tested individuals. High MRS implies better risk separation achieved by testing. MRS has 3 key advantages for comparing test performance across populations with different disease prevalences and biomarker distributions. First, MRS demonstrates that conventional predictive values and the risk-difference do not measure risk-stratification because they do not account for test-positivity rates. Second, Youden's index and AUC measure only multiplicative relative gains in risk-stratification: AUC = 0.6 achieves only 20% of maximum risk-stratification (AUC = 0.9 achieves 80%). Third, large relative gains in risk-stratification might not imply large absolute gains if disease is rare, demonstrating a "high-bar" to justify population-based screening for rare diseases such as cancer. We illustrate MRS by our experience comparing the performance of cervical-cancer screening tests in China vs. the USA. The test with the worst AUC = 0.72 in China (visual inspection with acetic acid) provides twice the risk-stratification (i.e. MRS) of the test with best AUC = 0.83 in the USA (human papillomavirus and Pap cotesting) because China has three times more cervical precancer/cancer. MRS could be routinely calculated to better understand the clinical/public-health implications of standard diagnostic accuracy statistics. Published by Elsevier Inc.
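Taking the verbal definition above literally (the average absolute change between pre-test risk and post-test risk among tested individuals), MRS for a 2 × 2 table can be sketched as below. This is our reading of the stated definition, for illustration only, not the authors' published estimator.

```python
def mean_risk_stratification(tp, fp, fn, tn):
    """Mean Risk Stratification for a 2x2 table, read literally from the
    abstract's definition: the average absolute change between pre-test risk
    (prevalence) and post-test risk (PPV for test-positives, 1-NPV for
    test-negatives). Illustrative reading, not the authors' code."""
    n = tp + fp + fn + tn
    prevalence = (tp + fn) / n
    p_pos = (tp + fp) / n
    p_neg = (fn + tn) / n
    ppv = tp / (tp + fp)    # post-test risk if positive
    cnpv = fn / (fn + tn)   # post-test risk if negative (1 - NPV)
    return p_pos * abs(ppv - prevalence) + p_neg * abs(prevalence - cnpv)

# toy example: a test applied in a population with 2% disease prevalence
print(mean_risk_stratification(tp=15, fp=85, fn=5, tn=895))
```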
Dore, David D.; Swaminathan, Shailender; Gutman, Roee; Trivedi, Amal N.; Mor, Vincent
2013-01-01
Objective To compare the assumptions and estimands across three approaches to estimating the effect of erythropoietin-stimulating agents (ESAs) on mortality. Study Design and Setting Using data from the Renal Management Information System, we conducted two analyses utilizing a change to bundled payment that we hypothesized mimicked random assignment to ESA (pre-post, difference-in-difference, and instrumental variable analyses). A third analysis was based on multiply imputing potential outcomes using propensity scores. Results There were 311,087 recipients of ESAs and 13,095 non-recipients. In the pre-post comparison, we identified no clear relationship between bundled payment (measured by calendar time) and the incidence of death within six months (risk difference -1.5%; 95% CI -7.0% to 4.0%). In the instrumental variable analysis, the risk of mortality was similar among ESA recipients (risk difference -0.9%; 95% CI -2.1 to 0.3). In the multiple imputation analysis, we observed a 4.2% (95% CI 3.4% to 4.9%) absolute reduction in mortality risk with use of ESAs, but closer to the null for patients with baseline hematocrit >36%. Conclusion Methods emanating from different disciplines often rely on different assumptions, but can be informative about a similar causal contrast. The implications of these distinct approaches are discussed. PMID:23849152
2011-01-01
Background Previous research has documented heterogeneity in the effects of maternal education on adverse birth outcomes by nativity and Hispanic subgroup in the United States. In this article, we considered the risk of preterm birth (PTB) using 9 years of vital statistics birth data from New York City. We employed finer categorizations of exposure than used previously and estimated the risk dose-response across the range of education by nativity and ethnicity. Methods Using Bayesian random effects logistic regression models with restricted quadratic spline terms for years of completed maternal education, we calculated and plotted the estimated posterior probabilities of PTB (gestational age < 37 weeks) for each year of education by ethnic and nativity subgroups adjusted for only maternal age, as well as with more extensive covariate adjustments. We then estimated the posterior risk difference between native and foreign born mothers by ethnicity over the continuous range of education exposures. Results The risk of PTB varied substantially by education, nativity and ethnicity. Native born groups showed higher absolute risk of PTB and declining risk associated with higher levels of education beyond about 10 years, as did foreign-born Puerto Ricans. For most other foreign born groups, however, risk of PTB was flatter across the education range. For Mexicans, Central Americans, Dominicans, South Americans and "Others", the protective effect of foreign birth diminished progressively across the educational range. Only for Puerto Ricans was there no nativity advantage for the foreign born, although small numbers of foreign born Cubans limited precision of estimates for that group. Conclusions Using flexible Bayesian regression models with random effects allowed us to estimate absolute risks without strong modeling assumptions. Risk comparisons for any sub-groups at any exposure level were simple to calculate. Shrinkage of posterior estimates through the use of random effects allowed for finer categorization of exposures without restricting joint effects to follow a fixed parametric scale. Although foreign born Hispanic women with the least education appeared to generally have low risk, this seems likely to be a marker for unmeasured environmental and behavioral factors, rather than a causally protective effect of low education itself. PMID:21504612
Kaufman, Jay S; MacLehose, Richard F; Torrone, Elizabeth A; Savitz, David A
2011-04-19
Previous research has documented heterogeneity in the effects of maternal education on adverse birth outcomes by nativity and Hispanic subgroup in the United States. In this article, we considered the risk of preterm birth (PTB) using 9 years of vital statistics birth data from New York City. We employed finer categorizations of exposure than used previously and estimated the risk dose-response across the range of education by nativity and ethnicity. Using Bayesian random effects logistic regression models with restricted quadratic spline terms for years of completed maternal education, we calculated and plotted the estimated posterior probabilities of PTB (gestational age < 37 weeks) for each year of education by ethnic and nativity subgroups adjusted for only maternal age, as well as with more extensive covariate adjustments. We then estimated the posterior risk difference between native and foreign born mothers by ethnicity over the continuous range of education exposures. The risk of PTB varied substantially by education, nativity and ethnicity. Native born groups showed higher absolute risk of PTB and declining risk associated with higher levels of education beyond about 10 years, as did foreign-born Puerto Ricans. For most other foreign born groups, however, risk of PTB was flatter across the education range. For Mexicans, Central Americans, Dominicans, South Americans and "Others", the protective effect of foreign birth diminished progressively across the educational range. Only for Puerto Ricans was there no nativity advantage for the foreign born, although small numbers of foreign born Cubans limited precision of estimates for that group. Using flexible Bayesian regression models with random effects allowed us to estimate absolute risks without strong modeling assumptions. Risk comparisons for any sub-groups at any exposure level were simple to calculate. Shrinkage of posterior estimates through the use of random effects allowed for finer categorization of exposures without restricting joint effects to follow a fixed parametric scale. Although foreign born Hispanic women with the least education appeared to generally have low risk, this seems likely to be a marker for unmeasured environmental and behavioral factors, rather than a causally protective effect of low education itself.
Kuipers, Saskia; Cannegieter, Suzanne C; Middeldorp, Saskia; Robyn, Luc; Büller, Harry R; Rosendaal, Frits R
2007-01-01
Background The risk of venous thrombosis is approximately 2- to 4-fold increased after air travel, but the absolute risk is unknown. The objective of this study was to assess the absolute risk of venous thrombosis after air travel. Methods and Findings We conducted a cohort study among employees of large international companies and organisations, who were followed between 1 January 2000 and 31 December 2005. The occurrence of symptomatic venous thrombosis was linked to exposure to air travel, as assessed by travel records provided by the companies and organisations. A long-haul flight was defined as a flight of at least 4 h and participants were considered exposed for a postflight period of 8 wk. A total of 8,755 employees were followed during a total follow-up time of 38,910 person-years (PY). The total time employees were exposed to a long-haul flight was 6,872 PY. In the follow-up period, 53 thromboses occurred, 22 of which within 8 wk of a long-haul flight, yielding an incidence rate of 3.2/1,000 PY, as compared to 1.0/1,000 PY in individuals not exposed to air travel (incidence rate ratio 3.2, 95% confidence interval 1.8–5.6). This rate was equivalent to a risk of one event per 4,656 long-haul flights. The risk increased with exposure to more flights within a short time frame and with increasing duration of flights. The incidence was highest in the first 2 wk after travel and gradually decreased to baseline after 8 wk. The risk was particularly high in employees under age 30 y, women who used oral contraceptives, and individuals who were particularly short, tall, or overweight. Conclusions The risk of symptomatic venous thrombosis after air travel is moderately increased on average, and rises with increasing exposure and in high-risk groups. PMID:17896862
Kuipers, Saskia; Cannegieter, Suzanne C; Middeldorp, Saskia; Robyn, Luc; Büller, Harry R; Rosendaal, Frits R
2007-09-01
The risk of venous thrombosis is approximately 2- to 4-fold increased after air travel, but the absolute risk is unknown. The objective of this study was to assess the absolute risk of venous thrombosis after air travel. We conducted a cohort study among employees of large international companies and organisations, who were followed between 1 January 2000 and 31 December 2005. The occurrence of symptomatic venous thrombosis was linked to exposure to air travel, as assessed by travel records provided by the companies and organisations. A long-haul flight was defined as a flight of at least 4 h and participants were considered exposed for a postflight period of 8 wk. A total of 8,755 employees were followed during a total follow-up time of 38,910 person-years (PY). The total time employees were exposed to a long-haul flight was 6,872 PY. In the follow-up period, 53 thromboses occurred, 22 of which within 8 wk of a long-haul flight, yielding an incidence rate of 3.2/1,000 PY, as compared to 1.0/1,000 PY in individuals not exposed to air travel (incidence rate ratio 3.2, 95% confidence interval 1.8-5.6). This rate was equivalent to a risk of one event per 4,656 long-haul flights. The risk increased with exposure to more flights within a short time frame and with increasing duration of flights. The incidence was highest in the first 2 wk after travel and gradually decreased to baseline after 8 wk. The risk was particularly high in employees under age 30 y, women who used oral contraceptives, and individuals who were particularly short, tall, or overweight. The risk of symptomatic venous thrombosis after air travel is moderately increased on average, and rises with increasing exposure and in high-risk groups.
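The incidence rates and the rate ratio quoted above can be checked crudely from the counts and person-years given; small differences from the published figures reflect rounding and any adjustment in the original analysis.

```python
# Crude back-of-envelope check of the rates quoted in the abstract.
events_total, events_exposed = 53, 22
py_total, py_exposed = 38910, 6872

rate_exposed = 1000 * events_exposed / py_exposed
rate_unexposed = 1000 * (events_total - events_exposed) / (py_total - py_exposed)
print(f"exposed:   {rate_exposed:.1f} per 1000 PY")   # ≈3.2
print(f"unexposed: {rate_unexposed:.1f} per 1000 PY") # ≈1.0
print(f"crude IRR: {rate_exposed / rate_unexposed:.1f}")  # ≈3.3 (published 3.2)
```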
Integration of second cancer risk calculations in a radiotherapy treatment planning system
NASA Astrophysics Data System (ADS)
Hartmann, M.; Schneider, U.
2014-03-01
Second cancer risk is an important side effect in patients, in particular children, treated with radiotherapy. It should be minimized by selecting an appropriate treatment plan for the patient. The objectives of this study were to integrate a risk model for radiation-induced cancer into a treatment planning system, which allows different treatment plans to be judged with regard to second cancer induction, and to quantify the potential reduction in predicted risk. A model for radiation-induced cancer that includes fractionation effects and is valid for doses in the radiotherapy range was integrated into a treatment planning system. From the three-dimensional (3D) dose distribution, the 3D risk equivalent dose (RED) was calculated on an organ-specific basis. In addition to RED, further risk coefficients such as OED (organ equivalent dose), EAR (excess absolute risk) and LAR (lifetime attributable risk) are computed. A risk model for radiation-induced cancer was successfully integrated in a treatment planning system. Several risk coefficients can be viewed and used to identify critical situations where a plan can be optimised. Risk-volume histograms and organ-specific risks were calculated for different treatment plans and were used in combination with normal tissue complication probability (NTCP) estimates for plan evaluation. It is concluded that the integration of second cancer risk estimates in a commercial treatment planning system is feasible. It can be used in addition to NTCP modelling for optimising treatment plans which result in the lowest possible second cancer risk for a patient.
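As a rough illustration of the organ-specific calculation described above, an organ equivalent dose (OED) can be computed from a differential dose-volume histogram by volume-weighting a risk-equivalent-dose model. The linear-exponential form and the alpha value below are assumptions chosen for illustration, not the parameters implemented in the planning system described in the abstract.

```python
import numpy as np

def oed_linear_exponential(doses_gy, volume_fractions, alpha=0.05):
    """Organ equivalent dose from a differential DVH, using a common
    linear-exponential dose-response RED(d) = d * exp(-alpha * d).
    The functional form and alpha are illustrative assumptions only."""
    doses = np.asarray(doses_gy, dtype=float)
    vols = np.asarray(volume_fractions, dtype=float)
    vols = vols / vols.sum()               # normalise volume fractions
    red = doses * np.exp(-alpha * doses)   # risk equivalent dose per dose bin
    return float(np.sum(vols * red))       # volume-weighted average

# toy DVH: dose bins (Gy) and the fraction of the organ receiving each dose
dose_bins = [2, 10, 20, 40]
volumes = [0.50, 0.30, 0.15, 0.05]
print(f"OED ≈ {oed_linear_exponential(dose_bins, volumes):.2f} Gy")
```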
Evidence for Absolute Moral Opposition to Genetically Modified Food in the United States.
Scott, Sydney E; Inbar, Yoel; Rozin, Paul
2016-05-01
Public opposition to genetic modification (GM) technology in the food domain is widespread (Frewer et al., 2013). In a survey of U.S. residents representative of the population on gender, age, and income, 64% opposed GM, and 71% of GM opponents (45% of the entire sample) were "absolutely" opposed-that is, they agreed that GM should be prohibited no matter the risks and benefits. "Absolutist" opponents were more disgust sensitive in general and more disgusted by the consumption of genetically modified food than were non-absolutist opponents or supporters. Furthermore, disgust predicted support for legal restrictions on genetically modified foods, even after controlling for explicit risk-benefit assessments. This research suggests that many opponents are evidence insensitive and will not be influenced by arguments about risks and benefits. © The Author(s) 2016.
2017-01-01
Purpose/Background Shoulder proprioception is essential in the activities of daily living as well as in sports. Acute muscle fatigue is believed to cause a deterioration of proprioception, increasing the risk of injury. The purpose of this study was to evaluate if fatigue of the shoulder external rotators during eccentric versus concentric activity affects shoulder joint proprioception as determined by active reproduction of position. Study design Quasi-experimental trial. Methods Twenty-two healthy subjects with no recent history of shoulder pathology were randomly allocated to either a concentric or an eccentric exercise group for fatiguing the shoulder external rotators. Proprioception was assessed before and after the fatiguing protocol using an isokinetic dynamometer, by measuring active reproduction of position at 30 ° of shoulder external rotation, reported as absolute angular error. The fatiguing protocol consisted of sets of fifteen consecutive external rotator muscle contractions in either the concentric or eccentric action. The subjects were exercised until there was a 30% decline from the peak torque of the subjects’ maximal voluntary contraction over three consecutive muscle contractions. Results A one-way analysis of variance test revealed no statistical difference in absolute angular error (p > 0.05) between concentric and eccentric groups. Moreover, no statistical difference (p > 0.05) was found in absolute angular error between pre- and post-fatigue in either group. Conclusions Eccentric exercise does not seem to acutely affect shoulder proprioception to a larger extent than concentric exercise. Level of evidence 2b PMID:28515976
Neural Sensitivity to Absolute and Relative Anticipated Reward in Adolescents
Vaidya, Jatin G.; Knutson, Brian; O'Leary, Daniel S.; Block, Robert I.; Magnotta, Vincent
2013-01-01
Adolescence is associated with a dramatic increase in risky and impulsive behaviors that have been attributed to developmental differences in neural processing of rewards. In the present study, we sought to identify age differences in anticipation of absolute and relative rewards. To do so, we modified a commonly used monetary incentive delay (MID) task in order to examine brain activity to relative anticipated reward value (neural sensitivity to the value of a reward as a function of other available rewards). This design also made it possible to examine developmental differences in brain activation to absolute anticipated reward magnitude (the degree to which neural activity increases with increasing reward magnitude). While undergoing fMRI, 18 adolescents and 18 adult participants were presented with cues associated with different reward magnitudes. After the cue, participants responded to a target to win money on that trial. Presentation of cues was blocked such that two reward cues associated with $.20, $1.00, or $5.00 were in play on a given block. Thus, the relative value of the $1.00 reward varied depending on whether it was paired with a smaller or larger reward. Reflecting age differences in neural responses to relative anticipated reward (i.e., reference dependent processing), adults, but not adolescents, demonstrated greater activity to a $1 reward when it was the larger of the two available rewards. Adults also demonstrated a more linear increase in ventral striatal activity as a function of increasing absolute reward magnitude compared to adolescents. Additionally, reduced ventral striatal sensitivity to absolute anticipated reward (i.e., the difference in activity to medium versus small rewards) correlated with higher levels of trait Impulsivity. Thus, ventral striatal activity in anticipation of absolute and relative rewards develops with age. Absolute reward processing is also linked to individual differences in Impulsivity. PMID:23544046
Hull, Russell D; Schellong, Sebastian M; Tapson, Victor F; Monreal, Manuel; Samama, Meyer-Michel; Nicol, Philippe; Vicaut, Eric; Turpie, Alexander G G; Yusen, Roger D
2010-07-06
Extended-duration low-molecular-weight heparin has been shown to prevent venous thromboembolism (VTE) in high-risk surgical patients. To evaluate the efficacy and safety of extended-duration enoxaparin thromboprophylaxis in acutely ill medical patients. Randomized, parallel, placebo-controlled trial. Randomization was computer-generated. Allocation was centralized. Patients, caregivers, and outcome assessors were blinded to group assignment. (ClinicalTrials.gov registration number: NCT00077753) SETTING: 370 sites in 20 countries across North and South America, Europe, and Asia. Acutely ill medical patients 40 years or older with recently reduced mobility (bed rest or sedentary without [level 1] or with [level 2] bathroom privileges). Eligibility criteria for patients with level 2 immobility were amended to include only those who had additional VTE risk factors (age >75 years, history of VTE, or active or previous cancer) after interim analyses suggested lower-than-expected VTE rates. Enoxaparin, 40 mg/d subcutaneously (2975 patients), or placebo (2988 patients), for 28 +/- 4 days after receiving open-label enoxaparin for an initial 10 +/- 4 days. Incidence of VTE up to day 28 and of major bleeding events up to 48 hours after the last study treatment dose. Extended-duration enoxaparin reduced VTE incidence compared with placebo (2.5% vs. 4%; absolute risk difference favoring enoxaparin, -1.53% [95.8% CI, -2.54% to -0.52%]). Enoxaparin increased major bleeding events (0.8% vs. 0.3%; absolute risk difference favoring placebo, 0.51% [95% CI, 0.12% to 0.89%]). The benefits of extended-duration enoxaparin seemed to be restricted to women, patients older than 75 years, and those with level 1 immobility. Estimates of efficacy and safety for the overall trial population are difficult to interpret because of the change in eligibility criteria during the trial. Use of extended-duration enoxaparin reduces VTE more than it increases major bleeding events in acutely ill medical patients with level 1 immobility, those older than 75 years, and women. Sanofi-aventis.
Wegwarth, O; Kurzenhäuser-Carstens, S; Gigerenzer, G
2014-03-10
Informed decision making requires transparent and evidence-based (=balanced) information on the potential benefits and harms of medical preventions. An analysis of German HPV vaccination leaflets revealed, however, that none met the standards of balanced risk communication. We surveyed a sample of 225 girl-parent pairs in a before-after design on the effects of balanced and unbalanced risk communication on participants' knowledge about cervical cancer and the HPV vaccination, their perceived risk, their intention to have the vaccine, and their actual vaccination decision. The balanced leaflet increased the number of participants who were correctly informed about cervical cancer and the HPV vaccine by 33 to 66 absolute percentage points. In contrast, the unbalanced leaflet decreased the number of participants who were correctly informed about these facts by 0 to 18 absolute percentage points. Whereas the actual uptake of the HPV vaccination 14 months after the initial study did not differ between the two groups (22% balanced leaflet vs. 23% unbalanced leaflet; p=.93, r=.01), the originally stated intention to have the vaccine reliably predicted the actual vaccination decision for the balanced leaflet group only (concordance between intention and actual uptake: 97% in the balanced leaflet group, rs=.92, p=.00; 60% in the unbalanced leaflet group, rs=.37, p=.08). In contrast to an unbalanced leaflet, a balanced leaflet increased people's knowledge of the HPV vaccination, improved perceived risk judgments, and led to actual vaccination uptake that, first, was robustly predicted by people's intention and, second, did not differ from the uptake in the unbalanced leaflet group. These findings suggest that balanced reporting about HPV vaccination increases informed decisions about whether to be vaccinated and does not undermine actual uptake. Copyright © 2014 The Authors. Published by Elsevier Ltd. All rights reserved.
Moving closer to understanding the risks of living kidney donation.
Steiner, Robert W
2016-01-01
Recent studies from the United States and Norway have suggested an unexpected 8- to 11-fold relative risk of ESRD after kidney donation, but a low long-term absolute risk. Abundant renal epidemiologic data predict that these studies have underestimated long-term risk. The 1% lifetime post-donation risk in the US study requires medical screening to predict ESRD in 96 of 100 candidates. This is particularly unlikely in the 30-35% of candidates under age 35, half of whose lifetime ESRD will occur after age 64. Many experts have attributed the increased relative risks in these studies to loss of GFR at donation, which ultimately means that high-normal pre-donation GFRs will reduce absolute post-donation risks. The 8- to 11-fold relative risks predict implausible risks of uninephrectomy in the general population, but lower estimates still result in very high risks for black donors. Young vs. older age, low vs. high-normal pre-donation GFRs, black race, and an increased relative risk of donation all predict highly variable individual risks, not a single "low" or "1%" risk as these studies suggest. A uniform, ethically defensible donor selection protocol would accept older donors with many minor medical abnormalities but protect from donation many currently acceptable younger, black, and/or low GFR candidates. © 2015 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
McKinn, Shannon; Bonner, Carissa; Jansen, Jesse; Teixeira-Pinto, Armando; So, Matthew; Irwig, Les; Doust, Jenny; Glasziou, Paul; McCaffery, Kirsten
2016-08-05
Guidelines on cardiovascular disease (CVD) risk reassessment intervals are unclear, potentially leading to detrimental practice variation: too frequent can result in overtreatment and greater strain on the healthcare system; too infrequent could result in the neglect of high risk patients who require medication. This study aimed to understand the different factors that general practitioners (GPs) consider when deciding on the reassessment interval for patients previously assessed for primary CVD risk. This paper combines quantitative and qualitative data regarding reassessment intervals from two separate studies of CVD risk management. Experimental study: 144 Australian GPs viewed a random selection of hypothetical cases via a paper-based questionnaire, in which blood pressure, cholesterol and 5-year absolute risk (AR) were systematically varied to appear lower or higher. GPs were asked how they would manage each case, including an open-ended response for when they would reassess the patient. Interview study: Semi-structured interviews were conducted with a purposive sample of 25 Australian GPs, recruited separately from the GPs in the experimental study. Transcribed audio-recordings were thematically coded, using the Framework Analysis method. GPs stated that they would reassess the majority of patients across all absolute risk categories in 6 months or less (low AR = 52 % [CI95% = 47-57 %], moderate AR = 82 % [CI95% = 76-86 %], high AR = 87 % [CI95% = 82-90 %], total = 71 % [CI95% = 67-75 %]), with 48 % (CI95% = 43-53 %) of patients reassessed in under 3 months. The majority (75 % [CI95% = 70-79 %]) of patients with low-moderate AR (≤15 %) and an elevated risk factor would be reassessed in under 6 months. Interviews: GPs identified different functions for reassessment and risk factor monitoring, which affected recommended intervals. These included perceived psychosocial benefits to patients, preparing the patient for medication, and identifying barriers to lifestyle change and medication adherence. Reassessment and monitoring intervals were driven by patient motivation to change lifestyle, patient demand, individual risk factors, and GP attitudes. There is substantial variation in reassessment intervals for patients with the same risk profile. This suggests that GPs are not following reassessment recommendations in the Australian guidelines. The use of shorter intervals for low-moderate AR contradicts research on optimal monitoring intervals, and may result in unnecessary costs and over-treatment.
Taylor, Carolyn; Correa, Candace; Duane, Frances K; Aznar, Marianne C; Anderson, Stewart J; Bergh, Jonas; Dodwell, David; Ewertz, Marianne; Gray, Richard; Jagsi, Reshma; Pierce, Lori; Pritchard, Kathleen I; Swain, Sandra; Wang, Zhe; Wang, Yaochen; Whelan, Tim; Peto, Richard; McGale, Paul
2017-05-20
Purpose Radiotherapy reduces the absolute risk of breast cancer mortality by a few percentage points in suitable women but can cause a second cancer or heart disease decades later. We estimated the absolute long-term risks of modern breast cancer radiotherapy. Methods First, a systematic literature review was performed of lung and heart doses in breast cancer regimens published during 2010 to 2015. Second, individual patient data meta-analyses of 40,781 women randomly assigned to breast cancer radiotherapy versus no radiotherapy in 75 trials yielded rate ratios (RRs) for second primary cancers and cause-specific mortality and excess RRs (ERRs) per Gy for incident lung cancer and cardiac mortality. Smoking status was unavailable. Third, the lung or heart ERRs per Gy in the trials and the 2010 to 2015 doses were combined and applied to current smoker and nonsmoker lung cancer and cardiac mortality rates in population-based data. Results Average doses from 647 regimens published during 2010 to 2015 were 5.7 Gy for whole lung and 4.4 Gy for whole heart. The median year of irradiation was 2010 (interquartile range [IQR], 2008 to 2011). Meta-analyses yielded lung cancer incidence ≥ 10 years after radiotherapy RR of 2.10 (95% CI, 1.48 to 2.98; P < .001) on the basis of 134 cancers, indicating 0.11 (95% CI, 0.05 to 0.20) ERR per Gy whole-lung dose. For cardiac mortality, RR was 1.30 (95% CI, 1.15 to 1.46; P < .001) on the basis of 1,253 cardiac deaths. Detailed analyses indicated 0.04 (95% CI, 0.02 to 0.06) ERR per Gy whole-heart dose. Estimated absolute risks from modern radiotherapy were as follows: lung cancer, approximately 4% for long-term continuing smokers and 0.3% for nonsmokers; and cardiac mortality, approximately 1% for smokers and 0.3% for nonsmokers. Conclusion For long-term smokers, the absolute risks of modern radiotherapy may outweigh the benefits, yet for most nonsmokers (and ex-smokers), the benefits of radiotherapy far outweigh the risks. Hence, smoking can determine the net effect of radiotherapy on mortality, but smoking cessation substantially reduces radiotherapy risk.
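The abstract above combines a trial-derived excess rate ratio (ERR) per Gy with typical organ doses and population-based baseline rates. A minimal sketch of one simplified reading of that arithmetic follows; the baseline lifetime lung cancer risks used here are illustrative assumptions, not figures from the paper.

```python
# Excess absolute risk as baseline risk * (ERR per Gy * mean organ dose).
# Baseline risks below are assumed placeholders for illustration only.
def excess_absolute_risk(err_per_gy, mean_dose_gy, baseline_risk):
    excess_rr = err_per_gy * mean_dose_gy          # e.g. 0.11/Gy * 5.7 Gy ~= 0.63
    return baseline_risk * excess_rr               # absolute excess = baseline * excess RR

ERR_LUNG_PER_GY, MEAN_LUNG_DOSE = 0.11, 5.7
for label, baseline in [("long-term smoker (assumed 6% baseline)", 0.06),
                        ("nonsmoker (assumed 0.5% baseline)", 0.005)]:
    risk = excess_absolute_risk(ERR_LUNG_PER_GY, MEAN_LUNG_DOSE, baseline)
    print(f"{label} -> excess absolute risk ~ {risk:.1%}")
# ~3.8% and ~0.3%, the same order of magnitude as the approximately 4% and 0.3% reported
```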
Jenkins, P; Scaife, J; Freeman, S
2012-07-01
We have previously developed a predictive model that identifies patients at increased risk of febrile neutropaenia (FN) following chemotherapy, based on pretreatment haematological indices. This study was designed to validate our earlier findings in a separate cohort of patients undergoing more myelosuppressive chemotherapy supported by growth factors. We conducted a retrospective analysis of 263 patients who had been treated with adjuvant docetaxel, adriamycin and cyclophosphamide (TAC) chemotherapy for breast cancer. All patients received prophylactic pegfilgrastim and the majority also received prophylactic antibiotics. Thirty-one patients (12%) developed FN. Using our previous model, patients in the highest risk group (pretreatment absolute neutrophil count ≤ 3.1 × 10⁹/L and absolute lymphocyte count ≤ 1.5 × 10⁹/L) comprised 8% of the total population and had a 33% risk of developing FN. Compared with the rest of the cohort, this group had a 3.4-fold increased risk of developing FN (P=0.001) and a 5.2-fold increased risk of cycle 1 FN (P<0.001). A simple model based on pretreatment differential white blood cell count can be applied to pegfilgrastim-supported patients to identify those who are at higher risk of FN.
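The pretreatment rule described above reduces to a simple two-threshold check; a minimal sketch, with thresholds taken from the abstract:

```python
# Highest-risk group for febrile neutropenia: ANC <= 3.1 x 10^9/L AND ALC <= 1.5 x 10^9/L.
def highest_fn_risk_group(anc_1e9_per_l: float, alc_1e9_per_l: float) -> bool:
    return anc_1e9_per_l <= 3.1 and alc_1e9_per_l <= 1.5

print(highest_fn_risk_group(2.8, 1.2))  # True  -> ~33% FN risk reported in this cohort
print(highest_fn_risk_group(4.5, 1.2))  # False -> lower-risk remainder of the cohort
```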
Hindricks, Gerhard; Varma, Niraj; Kacet, Salem; Lewalter, Thorsten; Søgaard, Peter; Guédon-Moreau, Laurence; Proff, Jochen; Gerds, Thomas A; Anker, Stefan D; Torp-Pedersen, Christian
2017-06-07
Remote monitoring of implantable cardioverter-defibrillators may improve clinical outcome. A recent meta-analysis of three randomized controlled trials (TRUST, ECOST, IN-TIME) using a specific remote monitoring system with daily transmissions [Biotronik Home Monitoring (HM)] demonstrated improved survival. We performed a patient-level analysis to verify this result with appropriate time-to-event statistics and to investigate further clinical endpoints. Individual data of the TRUST, ECOST, and IN-TIME patients were pooled to calculate absolute risks of endpoints at 1-year follow-up for HM vs. conventional follow-up. All-cause mortality analysis involved all three trials (2405 patients). Other endpoints involved two trials, ECOST and IN-TIME (1078 patients), in which an independent blinded endpoint committee adjudicated the underlying causes of hospitalizations and deaths. The absolute risk of death at 1 year was reduced by 1.9% in the HM group (95% CI: 0.1-3.8%; P = 0.037), equivalent to a risk ratio of 0.62. Also the combined endpoint of all-cause mortality or hospitalization for worsening heart failure (WHF) was significantly reduced (by 5.6%; P = 0.007; risk ratio 0.64). The composite endpoint of all-cause mortality or cardiovascular (CV) hospitalization tended to be reduced by a similar degree (4.1%; P = 0.13; risk ratio 0.85) but without statistical significance. In a pooled analysis of the three trials, HM reduced all-cause mortality and the composite endpoint of all-cause mortality or WHF hospitalization. The similar magnitudes of absolute risk reductions for WHF and CV endpoints suggest that the benefit of HM is driven by the prevention of heart failure exacerbation.
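The 1.9% absolute risk reduction and the 0.62 risk ratio quoted above jointly pin down the implied 1-year mortality in each arm, up to rounding in the abstract; a short worked check:

```python
# If p_control - p_hm = ARR and p_hm = RR * p_control, then p_control = ARR / (1 - RR).
arr, rr = 0.019, 0.62
p_control = arr / (1 - rr)
p_hm = rr * p_control
print(f"conventional follow-up ~ {p_control:.1%}, Home Monitoring ~ {p_hm:.1%}")
# -> ~5.0% vs ~3.1% 1-year all-cause mortality
```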
ERIC Educational Resources Information Center
Kwon, Heekyung
2011-01-01
The objective of this study is to provide a systematic account of three typical phenomena surrounding absolute accuracy of metacomprehension assessments: (1) the absolute accuracy of predictions is typically quite low; (2) there exist individual differences in absolute accuracy of predictions as a function of reading skill; and (3) postdictions…
12 CFR 324.52 - Simple risk-weight approach (SRWA).
Code of Federal Regulations, 2014 CFR
2014-01-01
... greater than or equal to −1 (that is, between zero and −1), then E equals the absolute value of RVC. If...) Zero percent risk weight equity exposures. An equity exposure to a sovereign, the Bank for..., an MDB, and any other entity whose credit exposures receive a zero percent risk weight under § 324.32...
Genetic Risk, Coronary Heart Disease Events, and the Clinical Benefit of Statin Therapy
Smith, JG; Chasman, DI; Caulfield, M; Devlin, JJ; Nordio, F; Hyde, C; Cannon, CP; Sacks, F; Poulter, N; Sever, P; Ridker, PM; Braunwald, E; Melander, O
2015-01-01
Background Genetic variants have been associated with the risk of coronary heart disease (CHD). We tested whether a composite of these variants could identify the risk of both incident as well as recurrent CHD events and distinguish individuals who derived greater clinical benefit from statin therapy. Methods A community-based cohort and four randomized controlled trials of both primary (JUPITER and ASCOT) and secondary (CARE and PROVE IT-TIMI 22) prevention with statin therapy totaling 48,421 individuals and 3,477 events were included in these analyses. We examined the association of a genetic risk score based on 27 genetic variants with incident or recurrent CHD, adjusting for established clinical predictors. We then investigated the relative and absolute risk reductions in CHD events with statin therapy stratified by genetic risk. Data from studies were combined using meta-analysis. Findings When individuals were divided into low (quintile 1), intermediate (quintiles 2-4), and high (quintile 5) genetic risk categories, a significant gradient of risk for incident or recurrent CHD was demonstrated with the multivariable-adjusted HRs (95% CI) for CHD for the intermediate and high genetic risk categories vs. low genetic risk category being 1.32 (1.20-1.46, P<0.0001) and 1.71 (1.54-1.91, P<0.0001), respectively. In terms of the benefit of statin therapy in the four randomized trials, there was a significant gradient of increasing relative risk reduction across the low, intermediate, and high genetic risk categories (13%, 29%, and 48%, P=0.0277). Similarly, greater absolute risk reductions were seen in those individuals in higher genetic risk categories (P=0.0101), resulting in an approximate three-fold gradient in the number needed to treat (NNT) in the primary prevention trials. Specifically, in the primary prevention trials, the NNT to prevent one MACE over 10 years for the low, intermediate, and high GRS individuals was 66, 42, and 25 in JUPITER and 57, 47, and 20 in ASCOT. Interpretation A genetic risk score identified individuals at increased risk for both incident and recurrent CHD events. Individuals with the highest burden of genetic risk derived the largest relative and absolute clinical benefit with statin therapy. PMID:25748612
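The number needed to treat (NNT) figures above follow directly from absolute risk reductions (ARR); a minimal sketch, in which the 10-year ARRs are hypothetical values chosen only to reproduce NNTs of roughly 66, 42 and 25:

```python
# NNT over a horizon is the reciprocal of the absolute risk reduction over that horizon.
def nnt(absolute_risk_reduction: float) -> float:
    return 1.0 / absolute_risk_reduction

for group, arr_10yr in [("low genetic risk", 0.015), ("intermediate", 0.024), ("high", 0.040)]:
    print(f"{group}: ARR {arr_10yr:.1%} over 10 years -> NNT ~ {nnt(arr_10yr):.0f}")
```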
Gender differences in cardiovascular disease and comorbid depression.
Möller-Leimkühler, Anne Maria
2007-01-01
Although gender is increasingly perceived as a key determinant in health and illness, systematic gender studies in medicine are still lacking. For a long time, cardiovascular disease (CVD) has been seen as a “male” disease, due to men's higher absolute risk compared with women, but the relative risk in women of CVD morbidity and mortality is actually higher. Current knowledge points to important gender differences in age of onset, symptom presentation, management, and outcome, as well as traditional and psychosocial risk factors. Compared with men, CVD risk in women is increased to a greater extent by some traditional factors (eg, diabetes, hypertension, hypercholesterolemia, obesity), and socioeconomic and psychosocial factors also seem to have a higher impact on CVD in women. With respect to differences in CVD management, a gender bias in favor of men has to be taken into account, in spite of greater age and higher comorbidity in women, possibly contributing to a poorer outcome. Depression has been shown to be an independent risk factor and consequence of CVD; however, concerning gender differences, the results have been inconsistent. Current evidence suggests that depression causes a greater increase in CVD incidence in women, and that female CVD patients experience higher levels of depression than men. Gender aspects should be more intensively considered, both in further research on gender differences in comorbid depression, and in cardiac treatment and rehabilitation, with the goal of making secondary prevention more effective. PMID:17506227
Schroder, Kerstin E. E.; Carey, Michael P.; Vanable, Peter A.
2008-01-01
Investigation of sexual behavior involves many challenges, including how to assess sexual behavior and how to analyze the resulting data. Sexual behavior can be assessed using absolute frequency measures (also known as “counts”) or with relative frequency measures (e.g., rating scales ranging from “never” to “always”). We discuss these two assessment approaches in the context of research on HIV risk behavior. We conclude that these two approaches yield non-redundant information and, more importantly, that only data yielding information about the absolute frequency of risk behavior have the potential to serve as valid indicators of HIV contraction risk. However, analyses of count data may be challenging due to non-normal distributions with many outliers. Therefore, we identify new and powerful data analytical solutions that have been developed recently to analyze count data, and discuss limitations of a commonly applied method (viz., ANCOVA using baseline scores as covariates). PMID:14534027
Volpe, Massimo; Battistoni, Allegra; Gallo, Giovanna; Coluccia, Roberta; De Caterina, Raffaele
2017-09-01
While the use of aspirin in the secondary prevention of cardiovascular disease (CVD) is well established, aspirin in primary prevention is not systematically recommended because the absolute CV event reduction is similar to the absolute excess in major bleedings. Recently, emerging evidence suggests that aspirin intake may also be effective in the prevention of cancer. Adding this potential reduction in cancer incidence and mortality to the CV prevention benefits could tip the balance between the risks and benefits of aspirin therapy in primary prevention in favour of the latter, and broaden the indication for treatment to populations at average risk. While prospective randomized studies are currently investigating the effect of aspirin in the prevention of both cancer and CVD, clinical efforts at the individual level to promote the use of aspirin in global (or total) primary prevention could already be based on a balanced evaluation of the benefit/risk ratio.
Performance of 2014 NICE defibrillator implantation guidelines in heart failure risk stratification.
Cubbon, Richard M; Witte, Klaus K; Kearney, Lorraine C; Gierula, John; Byrom, Rowenna; Paton, Maria; Sengupta, Anshuman; Patel, Peysh A; Mn Walker, Andrew; Cairns, David A; Rajwani, Adil; Hall, Alistair S; Sapsford, Robert J; Kearney, Mark T
2016-05-15
Define the real-world performance of recently updated National Institute for Health and Care Excellence guidelines (TA314) on implantable cardioverter-defibrillator (ICD) use in people with chronic heart failure. Multicentre prospective cohort study of 1026 patients with stable chronic heart failure, associated with left ventricular ejection fraction (LVEF) ≤45% recruited in cardiology outpatient departments of four UK hospitals. We assessed the capacity of TA314 to identify patients at increased risk of sudden cardiac death (SCD) or appropriate ICD shock. The overall risk of SCD or appropriate ICD shock was 2.1 events per 100 patient-years (95% CI 1.7 to 2.6). Patients meeting TA314 ICD criteria (31.1%) were 2.5-fold (95% CI 1.6 to 3.9) more likely to suffer SCD or appropriate ICD shock; they were also 1.5-fold (95% CI 1.1 to 2.2) more likely to die from non-cardiovascular causes and 1.6-fold (95% CI 1.1 to 2.3) more likely to die from progressive heart failure. Patients with diabetes not meeting TA314 criteria experienced comparable absolute risk of SCD or appropriate ICD shock to patients without diabetes who met TA314 criteria. Patients with ischaemic cardiomyopathy not meeting TA314 criteria experienced comparable absolute risk of SCD or appropriate ICD shock to patients with non-ischaemic cardiomyopathy who met TA314 criteria. TA314 can identify patients with reduced LVEF who are at increased relative risk of sudden death. Clinicians should also consider clinical context and the absolute risk of SCD when advising patients about the potential risks and benefits of ICD therapy. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/
Optimizing Preseason Training Loads in Australian Football.
Carey, David L; Crow, Justin; Ong, Kok-Leong; Blanch, Peter; Morris, Meg E; Dascombe, Ben J; Crossley, Kay M
2018-02-01
To investigate whether preseason training plans for Australian football can be computer generated using current training-load guidelines to optimize injury-risk reduction and performance improvement. A constrained optimization problem was defined for daily total and sprint distance, using the preseason schedule of an elite Australian football team as a template. Maximizing total training volume and maximizing Banister-model-projected performance were both considered optimization objectives. Cumulative workload and acute:chronic workload-ratio constraints were placed on training programs to reflect current guidelines on relative and absolute training loads for injury-risk reduction. Optimization software was then used to generate preseason training plans. The optimization framework was able to generate training plans that satisfied relative and absolute workload constraints. Increasing the off-season chronic training loads enabled the optimization algorithm to prescribe higher amounts of "safe" training and attain higher projected performance levels. Simulations showed that using a Banister-model objective led to plans that included a taper in training load prior to competition to minimize fatigue and maximize projected performance. In contrast, when the objective was to maximize total training volume, more frequent training was prescribed to accumulate as much load as possible. Feasible training plans that maximize projected performance and satisfy injury-risk constraints can be automatically generated by an optimization problem for Australian football. The optimization methods allow for individualized training-plan design and the ability to adapt to changing training objectives and different training-load metrics.
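The constrained-optimization framing described above can be illustrated as a small linear program; a minimal sketch, assuming a fixed off-season load history and only an acute:chronic ratio ceiling (the Banister-model objective and the study's other constraints are omitted), with all numbers as illustrative placeholders rather than the paper's values:

```python
import numpy as np
from scipy.optimize import linprog

DAYS, DAILY_MAX = 42, 800.0          # 6-week pre-season plan, cap on daily load (arbitrary units)
OFFSEASON_LOAD = 300.0               # assumed chronic load carried into pre-season
AC_MAX = 1.3                         # acute:chronic workload ratio ceiling

history = [OFFSEASON_LOAD] * 28      # fixed 28-day history preceding the plan
A_ub, b_ub = [], []
for t in range(DAYS):
    row, const = np.zeros(DAYS), 0.0
    for k in range(t - 6, t + 1):            # acute window: 7 days ending on plan day t
        if k >= 0:
            row[k] += 1 / 7
        else:
            const += history[k] / 7          # history[-1] is the day before the plan starts
    for k in range(t - 27, t + 1):           # chronic window: 28 days ending on plan day t
        if k >= 0:
            row[k] -= AC_MAX / 28
        else:
            const -= AC_MAX * history[k] / 28
    # constraint: acute_mean - AC_MAX * chronic_mean <= 0, history folded into the bound
    A_ub.append(row)
    b_ub.append(-const)

res = linprog(c=-np.ones(DAYS),              # maximize total training volume
              A_ub=np.array(A_ub), b_ub=np.array(b_ub),
              bounds=[(0, DAILY_MAX)] * DAYS, method="highs")
assert res.success
print(f"total planned load: {res.x.sum():.0f} AU over {DAYS} days")
```

Raising `OFFSEASON_LOAD` lets the solver prescribe more "safe" training early in the plan, mirroring the effect the abstract describes for higher off-season chronic loads.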
Cancer in Machado-Joseph disease patients-low frequency as a cause of death.
Souza, Gabriele Nunes; Kersting, Nathália; Gonçalves, Thomaz Abramsson; Pacheco, Daphne Louise Oliveira; Saraiva-Pereira, Maria-Luiza; Camey, Suzi Alves; Saute, Jonas Alex Morales; Jardim, Laura Bannach
2017-04-01
Since polyglutamine diseases have been related to a reduced risk of cancer, we aimed to study the 15-year cumulative incidence of cancer (CIC) (arm 1) and the proportion of cancer as a cause of death (arm 2) in symptomatic carriers of spinocerebellar ataxia type 3/Machado-Joseph disease (SCA3/MJD). SCA3/MJD and control individuals from our state were invited to participate. A structured interview was performed. CIC published by the Brazilian National Institute of Cancer was used as a population control. Causes of death were obtained from the Public Information System on Mortality. We interviewed 154 SCA3/MJD patients and 80 unrelated controls: CIC was 7/154 (4.5%) and 5/80 (6.3%), respectively. The interim analysis for futility showed that the number of individuals required to detect a significant difference between groups (1938) would be three times larger than the existing local SCA3/MJD population (625), for an absolute risk reduction of 1.8%. This study arm was then discontinued due to lack of power. In the same period, cancer was a cause of death in 9/101 (8.9%) SCA3/MJD patients and in 52/202 (26.2%) controls, with an absolute risk reduction of 17.3% (OR 0.27, 95%CI 0.13 to 0.58, p = 0.01). A significant reduction of cancer as cause of death was observed in SCA3/MJD, suggesting an effect common to all polyglutamine diseases. Copyright © 2017. Published by Elsevier Inc.
Rücker, Viktoria; Keil, Ulrich; Fitzgerald, Anthony P; Malzahn, Uwe; Prugger, Christof; Ertl, Georg; Heuschmann, Peter U; Neuhauser, Hannelore
2016-01-01
Estimation of absolute risk of cardiovascular disease (CVD), preferably with population-specific risk charts, has become a cornerstone of CVD primary prevention. Regular recalibration of risk charts may be necessary due to decreasing CVD rates and CVD risk factor levels. The SCORE risk charts for fatal CVD risk assessment were first calibrated for Germany with 1998 risk factor level data and 1999 mortality statistics. We present an update of these risk charts based on the SCORE methodology including estimates of relative risks from SCORE, risk factor levels from the German Health Interview and Examination Survey for Adults 2008–11 (DEGS1) and official mortality statistics from 2012. Competing risks methods were applied and estimates were independently validated. Updated risk charts were calculated based on cholesterol, smoking, systolic blood pressure risk factor levels, sex and 5-year age-groups. The absolute 10-year risk estimates of fatal CVD were lower according to the updated risk charts compared to the first calibration for Germany. In a nationwide sample of 3062 adults aged 40–65 years free of major CVD from DEGS1, the mean 10-year risk of fatal CVD estimated by the updated charts was lower by 29% and the estimated proportion of high risk people (10-year risk ≥ 5%) by 50% compared to the older risk charts. This recalibration shows a need for regular updates of risk charts according to changes in mortality and risk factor levels in order to sustain the identification of people with a high CVD risk. PMID:27612145
Absolute and Relative Socioeconomic Health Inequalities across Age Groups
van Zon, Sander K. R.; Bültmann, Ute; Mendes de Leon, Carlos F.; Reijneveld, Sijmen A.
2015-01-01
Background The magnitude of socioeconomic health inequalities differs across age groups. It is less clear whether socioeconomic health inequalities differ across age groups by other factors that are known to affect the relation between socioeconomic position and health, like the indicator of socioeconomic position, the health outcome, gender, and as to whether socioeconomic health inequalities are measured in absolute or in relative terms. The aim is to investigate whether absolute and relative socioeconomic health inequalities differ across age groups by indicator of socioeconomic position, health outcome and gender. Methods The study sample was derived from the baseline measurement of the LifeLines Cohort Study and consisted of 95,432 participants. Socioeconomic position was measured as educational level and household income. Physical and mental health were measured with the RAND-36. Age concerned eleven 5-years age groups. Absolute inequalities were examined by comparing means. Relative inequalities were examined by comparing Gini-coefficients. Analyses were performed for both health outcomes by both educational level and household income. Analyses were performed for all age groups, and stratified by gender. Results Absolute and relative socioeconomic health inequalities differed across age groups by indicator of socioeconomic position, health outcome, and gender. Absolute inequalities were most pronounced for mental health by household income. They were larger in younger than older age groups. Relative inequalities were most pronounced for physical health by educational level. Gini-coefficients were largest in young age groups and smallest in older age groups. Conclusions Absolute and relative socioeconomic health inequalities differed cross-sectionally across age groups by indicator of socioeconomic position, health outcome and gender. Researchers should critically consider the implications of choosing a specific age group, in addition to the indicator of socioeconomic position and health outcome, as findings on socioeconomic health inequalities may differ between them. PMID:26717482
A Special Application of Absolute Value Techniques in Authentic Problem Solving
ERIC Educational Resources Information Center
Stupel, Moshe
2013-01-01
There are at least five different equivalent definitions of the absolute value concept. In instances where the task is an equation or inequality with only one or two absolute value expressions, it is a worthy educational experience for learners to solve the task using each one of the definitions. On the other hand, if more than two absolute value…
Outpatient management of febrile neutropenia: time to revise the present treatment strategy.
Carstensen, Mads; Sørensen, Jens Benn
2008-01-01
We reviewed medical literature on the efficacy and safety of outpatient versus hospital-based therapy of low-risk febrile neutropenia in adult cancer patients. A PubMed search for all studies evaluating the outpatient treatment of adults diagnosed with solid tumors who suffered from low-risk febrile neutropenia was completed; reference lists from identified articles also were used. In all, 10 trials were included in the analysis, which showed no significant difference in clinical failure rates and mortality for ambulatory regimens and standard hospital-based therapy. Subgroup analysis according to the type of fever episode showed no significant differences in clinical failure rates for fever of unknown origin and fever due to documented infections. Subgroup analyses in two independent trials identified an absolute neutrophil count < 100 cells/ mm3 as being predictive of outpatient treatment failure (P < 0.04). These findings need to be confirmed by further trials. Thus, outpatient management of adult cancer patients with low-risk febrile neutropenia is safe, effective, and comparable to standard hospital-based therapy. Patients at low risk are outpatients and are hemodynamically stable; they have no organ failure, they are able to take oral medications, and they do not suffer from acute leukemia. Low-risk prediction also may be based on the Multinational Association for Supportive Care in Cancer risk index.
Factors influencing histologic confirmation of high-grade squamous intraepithelial lesion cytology.
Castle, Philip E; Cox, J Thomas; Schiffman, Mark; Wheeler, Cosette M; Solomon, Diane
2008-09-01
To examine the predictors of histologic confirmation of high-grade squamous intraepithelial lesion (HSIL) cytology occurring in follow-up of young women originally referred into a trial because of less severe cytology. We used enrollment HSIL cytology (N=411) as read by clinical center pathologists for women participating in the ASCUS-LSIL Triage Study (ALTS). The primary outcome was histologic cervical intraepithelial neoplasia (CIN) grade 3 and early cancer (n=195; 191 CIN 3 and four cancers) as diagnosed by the Pathology Quality Control Group during the 2-year duration of ALTS. The 2-year absolute risk of CIN 3 or worse after an HSIL cytology was 47.4% (95% confidence interval 42.5-52.4%). The 2-year absolute risk of CIN 3 or worse was lowest (14.3%) for women who were human papillomavirus (HPV)-16-negative, had colposcopic impression of less than low-grade, and whose HSIL cytology as called by the clinical center was not also called HSIL or equivocal HSIL cytology by the Pathology Quality Control Group. The 2-year absolute risk of CIN 3 or worse was highest (82.4%) for women who were HPV16-positive, had colposcopic impression of low-grade or worse, and whose HSIL cytology also was called HSIL or equivocal HSIL cytology by the Pathology Quality Control Group. Histologic confirmation of precancer among young women with HSIL cytology was more likely when other risk factors (eg, HPV16) for cervical precancer were present.
An absolute scale for measuring the utility of money
NASA Astrophysics Data System (ADS)
Thomas, P. J.
2010-07-01
Measurement of the utility of money is essential in the insurance industry, for prioritising public spending schemes and for the evaluation of decisions on protection systems in high-hazard industries. Up to this time, however, there has been no universally agreed measure for the utility of money, with many utility functions being in common use. In this paper, we shall derive a single family of utility functions, which have risk-aversion as the only free parameter. The fact that they return a utility of zero at their low, reference datum, either the utility of no money or of one unit of money, irrespective of the value of risk-aversion used, qualifies them to be regarded as absolute scales for the utility of money. Evidence of validation for the concept will be offered based on inferential measurements of risk-aversion, using diverse measurement data.
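The normalization property described above (zero utility at the reference datum of one unit of money, for every value of risk-aversion) is also satisfied by the standard isoelastic (CRRA) utility; a minimal sketch follows, which is not the paper's exact family but illustrates the property:

```python
# Isoelastic utility u(x) = (x**(1-r) - 1) / (1-r) for r != 1, and ln(x) for r = 1.
# u(1) = 0 for every risk-aversion value r, the anchoring the abstract describes.
from math import log

def utility(x: float, risk_aversion: float) -> float:
    if abs(risk_aversion - 1.0) < 1e-12:
        return log(x)
    return (x ** (1.0 - risk_aversion) - 1.0) / (1.0 - risk_aversion)

for r in (0.0, 0.5, 1.0, 2.0):
    print(f"r={r}: u(1)={utility(1.0, r):.3f}, u(10)={utility(10.0, r):.3f}")
# u(1) is 0.000 for every r; risk-aversion only changes the curvature away from the datum
```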
An Engaging Illustration of the Physical Differences among Menthol Stereoisomers
ERIC Educational Resources Information Center
Treadwell, Edward M.; Black, T. Howard
2005-01-01
An experiment illustrating stereochemical principles, like different physical properties in achiral environments, assignment of absolute stereochemistry, and the stereoisomeric relationships to differences in absolute stereochemistry is devised. A demonstration of how enantiomers have the same physical properties until placed in chiral…
Cameron, Linda D; Sherman, Kerry A; Marteau, Theresa M; Brown, Paul M
2009-05-01
Genetic tests vary in their prediction of disease occurrence, with some mutations conferring relatively low risk and others indicating near certainty. The authors assessed how increments in absolute risk of disease influence risk perceptions, interest, and expected consequences of genetic tests for diseases of varying severity. Adults (N = 752), recruited from New Zealand, Australia, and the United Kingdom for an online analogue study, were randomly assigned to receive information about a test of genetic risk for diabetes, heart disease, colon cancer, or lung cancer. The lifetime risk varied across conditions by 10% increments, from 20% to 100%. Participants completed measures of perceived likelihood of disease for individuals with mutations, risk-related affect, interest, and testing consequences. Analyses revealed two increment clusters yielding differences in likelihood perceptions: A "moderate-risk" cluster (20%-70%), and a "high-risk" cluster (80%-100%). Risk increment influenced anticipated worry, feelings of risk, testing-induced distress, and family obligations, with nonlinear patterns including disproportionately high responses for the 50% increment. Risk increment did not alter testing interest or perceived benefits. These patterns of effects held across the four diseases. Magnitude of risk from genetic testing has a nonlinear influence on risk-related appraisals and affect but is unrelated to test interest.
Understanding and Communicating Medical Risks for Living Kidney Donors: A Matter of Perspective
Lentine, Krista L.
2017-01-01
Communicating the current knowledge of medical outcomes after live kidney donation necessary to support donor candidates in well informed decision-making requires grounding in perspectives of comparison. Baseline risk (without donating), risk attributable to donation, and absolute risk (after donating) need to be considered. Severe perioperative complications and death are rare, but vary by demographic, clinical, and procedure factors. Innovative capture of “healthy” controls designed to simulate donor selection processes has identified higher risk of ESRD attributable to donation in two studies; importantly, however, the absolute 15-year ESRD incidence in donors remains very low (0.3%). In the first decade after donation, the risk of all-cause mortality and cardiovascular events is no higher than in healthy nondonors. Pregnancies in donors may incur attributable risk of gestational hypertension or preeclampsia (11% versus 5% incidence in one study). A modest rise in uric acid levels beginning early after donation, and a small (1.4%) increase in the 8-year incidence of gout, have also been reported in comparisons to healthy nondonors. As in the general population, postdonation outcomes vary by race, sex, and age. Efforts to improve the counseling and selection of living donors should focus on developing tools for tailored risk prediction according to donor characteristics, and ideally, compared with similar healthy nondonors. PMID:27591246
Dietary carbohydrates and cardiovascular disease risk factors in the Framingham Offspring Cohort
USDA-ARS?s Scientific Manuscript database
Evidence from observational studies has suggested that carbohydrate quality rather than absolute intake is associated with greater risk of chronic diseases. The aim of this study was to examine the relationship between carbohydrate intake and dietary glycemic index and several cardiovascular disease...
Malik, Salim S; Lythgoe, Mark P; McPhail, Mark; Monahan, Kevin J
2017-11-30
Around 5% of colorectal cancers are due to mutations within DNA mismatch repair genes, resulting in Lynch syndrome (LS). These mutations have a high penetrance with early onset of colorectal cancer at a mean age of 45 years. The mainstay of surgical management is either a segmental or extensive colectomy. Currently there is no unified agreement as to which management strategy is superior due to limited conclusive empirical evidence available. We conducted a systematic review and meta-analysis to evaluate the risk of metachronous colorectal cancer (MCC) and mortality in LS following segmental and extensive colectomy. A systematic review of the PubMed database was conducted. Studies were included/excluded based on pre-specified criteria. To assess the risk of MCC and mortality attributed to segmental or extensive colectomies, relative risks (RR) and corresponding 95% confidence intervals (CI) were calculated. Publication bias was investigated using funnel plots. Data about mortality, as well as patient ascertainment [Amsterdam criteria (AC), germline mutation (GM)] were also extracted. Statistical analysis was conducted using the R program (version 3.2.3). The literature search identified 85 studies. After further analysis ten studies were eligible for inclusion in data synthesis. Pooled data identified 1389 patients followed up for a mean of 100.7 months, with a mean age of onset of 45.5 years. A total of 1119 patients underwent segmental colectomies with an absolute risk of MCC in this group of 22.4% at the end of follow-up. The 270 patients who had extensive colectomies had an MCC absolute risk of 4.7% (0% in those with a panproctocolectomy). Segmental colectomy was significantly associated with an increased relative risk of MCC (RR = 5.12; 95% CI 2.88-9.11; Fig. 1), although no significant association with mortality was identified (RR = 1.65; 95% CI 0.90-3.02). There was no statistically significant difference in the risk of MCC between AC and GM cohorts (p = 0.5, Chi-squared test). In LS, segmental colectomy results in a significantly increased risk of developing MCC. Despite the choice of segmental or extensive colectomies having no statistically significant impact on mortality, the choice of initial surgical management can impact a patient's requirement for further surgery. An extensive colectomy can result in a decreased need for further surgery, reduced hospital stays, and associated costs. The significant difference in the risk of MCC following segmental or extensive colectomies should be discussed with patients when deciding appropriate management. An individualised approach should be utilised, taking into account the patient's age, co-morbidities and genotype. In order to determine likely germline-specific effects, or a difference in survival, larger and more comprehensive studies are required.
Shultz, R; Birmingham, T B; Jenkyn, T R
2011-12-01
This study examined the absolute differences in neutral positions of the joints of the foot with different footwear. This addresses the question of whether separate static trials should be collected for each footwear condition to establish neutral positions. A multi-segment kinematic foot model and optical motion analysis system measured four inter-segmental joints of the foot: (1) hindfoot-to-midfoot in the frontal plane, (2) forefoot-to-midfoot in the frontal plane, (3) hallux-to-forefoot in the sagittal plane, and (4) the height-to-length ratio of the medial longitudinal arch. Barefoot was compared to three shoe conditions using Nike Free trainers of varying longitudinal torsional stiffness in ten male volunteers. There was high variability both within subjects and shoe conditions. Shoes in general tended to raise the medial longitudinal arch and dorsiflex the hallux compared to the barefoot condition. For the hallux, a minimum important difference of 5° or more was found between shoe conditions and the barefoot condition for the majority of the subjects in all three shoe conditions (90% for control, 60% for least stiff, 50% for most stiff). This was less for the frontal plane inter-segmental joints of the foot, where 50% of the subjects experienced a change above 5° for at least one of the conditions. The choice of using condition-specific neutral trials versus a single common neutral trial should be considered carefully. A single common trial allows for differences in absolute joint angles to be compared between footwear conditions. This can be important clinically to determine whether a joint is approaching its end-of-range and therefore at risk of injury. Several condition-specific neutral trials allow for subtleties in kinematic waveforms to be better compared between conditions, since absolute shifts in joint angles due to changing neutral position are removed and the waveforms are better aligned. Copyright © 2011. Published by Elsevier Ltd.
Respiratory Viruses and Treatment Failure in Children With Asthma Exacerbation.
Merckx, Joanna; Ducharme, Francine M; Martineau, Christine; Zemek, Roger; Gravel, Jocelyn; Chalut, Dominic; Poonai, Naveen; Quach, Caroline
2018-06-04
Respiratory pathogens commonly trigger pediatric asthma exacerbations, but their impact on severity and treatment response remains unclear. We performed a secondary analysis of the Determinants of Oral Corticosteroid Responsiveness in Wheezing Asthmatic Youth (DOORWAY) study, a prospective cohort study of children (aged 1-17 years) presenting to the emergency department with moderate or severe exacerbations. Nasopharyngeal specimens were analyzed by RT-PCR for 27 respiratory pathogens. We investigated the association between pathogens and both exacerbation severity (assessed with the Pediatric Respiratory Assessment Measure) and treatment failure (hospital admission, emergency department stay >8 hours, or relapse) of a standardized severity-specific treatment. Logistic multivariate regressions were used to estimate average marginal effects (absolute risks and risk differences [RD]). Of 958 participants, 61.7% were positive for ≥1 pathogen (rhinovirus was the most prevalent [29.4%]) and 16.9% experienced treatment failure. The presence of any pathogen was not associated with higher baseline severity but with a higher risk of treatment failure (20.7% vs 12.5%; RD = 8.2% [95% confidence interval: 3.3% to 13.1%]) compared to the absence of a pathogen. Nonrhinovirus pathogens were associated with an increased absolute risk (RD) of treatment failure by 13.1% (95% confidence interval: 6.4% to 19.8%), specifically, by 8.8% for respiratory syncytial virus, 24.9% for influenza, and 34.1% for parainfluenza. Although respiratory pathogens were not associated with higher severity on presentation, they were associated with increased treatment failure risk, particularly in the presence of respiratory syncytial virus, influenza, and parainfluenza. This supports influenza prevention in asthmatic children, consideration of pathogen identification on presentation, and exploration of treatment intensification for infected patients at higher risk of treatment failure. Copyright © 2018 by the American Academy of Pediatrics.
Robotics and automation in Mars exploration
NASA Technical Reports Server (NTRS)
Bourke, Roger D.; Sturms, Francis M., Jr.; Golombek, Matthew P.; Gamber, R. T.
1992-01-01
A new approach to the exploration of Mars is examined which relies on the use of smaller and simpler vehicles. The new strategy involves the following principles: limiting science objectives to retrieval of rock samples from several different but geologically homogeneous areas; making use of emerging microspacecraft technologies to significantly reduce the mass of hardware elements; simplifying missions to the absolutely essential elements; and managing risk through the employment of many identical independent pieces some of which may fail. The emerging technologies and their applications to robotic Mars missions are discussed.
McMahon, Jordan D; Lashley, Marcus A; Brooks, Christopher P; Barton, Brandon T
2018-04-26
Giving-up density (GUD) experiments have been a foundational method to evaluate perceived predation risk, but rely on the assumption that food preferences are absolute, so that areas with higher GUDs can be interpreted as having higher risk. However, nutritional preferences are context dependent and can change with risk. We used spiders and grasshoppers to test the hypothesis that covariance in nutritional preferences and risk may confound the interpretation of GUD experiments. We presented grasshoppers with carbohydrate-rich and protein-rich diets, in the presence and absence of spider predators. Predators reduced grasshopper preference for the protein-rich food, but increased their preference for the carbohydrate-rich food. We then measured GUDs with both food types under different levels of risk (spider density, 0 - 5). As expected, GUDs increased with spider density indicating increasing risk, but only when using protein-rich food. With carbohydrate-rich food, GUD was independent of predation risk. Our results demonstrate that predation risk and nutritional preferences covary and can confound interpretation of GUD experiments. This article is protected by copyright. All rights reserved.
Effectiveness of personalized and interactive health risk calculators: a randomized trial.
Harle, Christopher A; Downs, Julie S; Padman, Rema
2012-01-01
Risk calculators are popular websites that provide individualized disease risk assessments to the public. Little is known about their effect on risk perceptions and health behavior. This study sought to test whether risk calculator features-namely, personalized estimates of one's disease risk and feedback about the effects of risk-mitigating behaviors-improve risk perceptions and motivate healthy behavior. A web-based experimental study using simple randomization was conducted to compare the effects of 3 prediabetes risk communication websites. Setting The study was conducted in the context of ongoing health promotion activities sponsored by a university's human resources office. Patients Participants were adult university employees. Intervention The control website presented nonindividualized risk information. The personalized noninteractive website presented individualized risk calculations. The personalized interactive website presented individualized risk calculations and feedback about the effects of hypothetical risk-mitigating behaviors. Measurements Pre- and postintervention risk perceptions were measured in absolute and relative terms. Health behavior was measured by assessing participant interest in follow-up preventive health services. On average, risk perceptions decreased by 2%. There was no general effect of personalization or interactivity in aligning subjective risk perceptions with objective risk calculations or in increasing healthy behaviors. However, participants who previously overestimated their risk reduced their perceptions by 16%. This was a significantly larger change than the 2% increase by participants who underestimated their risk. Limitations Results may not generalize to different populations, different diseases, or longer-term outcomes. Compared to nonpersonalized information, individualized risk calculators had little positive effect on prediabetes risk perception accuracy or health behavior. Risk perception accuracy was improved in people who receive relatively "good news" about risk rather than "bad news."
Predicting mortality over different time horizons: which data elements are needed?
Goldstein, Benjamin A; Pencina, Michael J; Montez-Rath, Maria E; Winkelmayer, Wolfgang C
2017-01-01
Electronic health records (EHRs) are a resource for "big data" analytics, containing a variety of data elements. We investigate how different categories of information contribute to prediction of mortality over different time horizons among patients undergoing hemodialysis treatment. We derived prediction models for mortality over 7 time horizons using EHR data on older patients from a national chain of dialysis clinics linked with administrative data using LASSO (least absolute shrinkage and selection operator) regression. We assessed how different categories of information relate to risk assessment and compared discrete models to time-to-event models. The best predictors used all the available data (c-statistic ranged from 0.72-0.76), with stronger models in the near term. While different variable groups showed different utility, exclusion of any particular group did not lead to a meaningfully different risk assessment. Discrete time models performed better than time-to-event models. Different variable groups were predictive over different time horizons, with vital signs most predictive for near-term mortality and demographic and comorbidities more important in long-term mortality. © The Author 2016. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
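The modelling approach named above (L1-penalized, LASSO-style selection for a discrete mortality horizon) can be sketched in a few lines; the features and data below are synthetic placeholders, not the study's EHR variables.

```python
# L1-penalized logistic regression for a single discrete mortality horizon,
# fit on synthetic data standing in for EHR-derived predictors.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n, p = 2000, 30
X = rng.normal(size=(n, p))
true_beta = np.zeros(p)
true_beta[:5] = [0.8, -0.6, 0.5, 0.4, -0.3]                   # only a few truly informative predictors
y = rng.binomial(1, 1 / (1 + np.exp(-(X @ true_beta - 2))))   # 1 = died within the horizon

model = make_pipeline(StandardScaler(),
                      LogisticRegression(penalty="l1", solver="liblinear", C=0.1))
model.fit(X, y)
coefs = model.named_steps["logisticregression"].coef_.ravel()
print(f"{np.sum(coefs != 0)} of {p} predictors retained by the L1 penalty")
```

Repeating the fit with a different outcome definition per horizon mirrors the study's separate discrete-time models.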
Workload and non-contact injury incidence in elite football players competing in European leagues.
Delecroix, Barthelemy; McCall, Alan; Dawson, Brian; Berthoin, Serge; Dupont, Gregory
2018-06-02
The aim of this study was to analyse the relationship between absolute and acute:chronic workload ratios and non-contact injury incidence in professional football players and to assess their predictive ability. Elite football players (n = 130) from five teams competing in European domestic and confederation level competitions were followed during one full competitive season. Non-contact injuries were recorded and using session rate of perceived exertion (s-RPE) internal absolute workload and acute:chronic (A:C) workload ratios (4-weeks, 3-weeks, 2-weeks and week-to-week) were calculated using a rolling days method. The relative risk (RR) of non-contact injury was increased (RR = 1.59, CI95%: 1.18-2.15) when a cumulative 4-week absolute workload was greater than 10629 arbitrary units (AU) in comparison with a workload between 3745 and 10628 AU. When the 3-week absolute workload was more than 8319 AU versus between 2822 and 8318 AU injury risk was also increased (RR= 1.46, CI95% 1.08-1.98). Injury incidence was higher when the 4-week A:C ratio was <0.85 versus >0.85 (RR = 1.31, CI95%: 1.02-1.70) and with a 3-week A:C ratio >1.30 versus <1.30 (RR = 1.37, CI95%: 1.05-1.77). Importantly, none of the A:C workload combinations showed high sensitivity or specificity. In elite European footballers, using internal workload (sRPE) revealed that cumulative workloads over 3 and 4 weeks were associated with injury incidence. Additionally, A:C workloads, using combinations of 2, 3 and 4 weeks as the C workloads were also associated with increased injury risk. No A:C workload combination was appropriate to predict injury.
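The rolling-days acute:chronic ratio and cumulative workload described above are straightforward rolling-window computations; a minimal pandas sketch follows, using synthetic daily s-RPE loads and, for simplicity, a single 28-day chronic window (the study also examined 3- and 2-week windows).

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
daily_load = pd.Series(rng.integers(200, 900, size=120).astype(float))  # ~4 months of daily s-RPE load (AU)

acute = daily_load.rolling(7, min_periods=7).mean()        # most recent 7 days
chronic = daily_load.rolling(28, min_periods=28).mean()    # most recent 28 days (includes the acute week)
ac_ratio = acute / chronic
cumulative_4wk = daily_load.rolling(28, min_periods=28).sum()

# thresholds reported above: 4-week cumulative load > 10629 AU, A:C > 1.30 or < 0.85
flagged = (cumulative_4wk > 10629) | (ac_ratio > 1.30) | (ac_ratio < 0.85)
print(f"days flagged: {int(flagged.sum())} of {len(daily_load)}")
```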
Skaftun, Eirin K; Verguet, Stéphane; Norheim, Ole F; Johansson, Kjell A
2018-05-24
This study aims at quantifying the level and changes over time of inequality in age-specific mortality and life expectancy between the 19 Norwegian counties from 1980 to 2014. Data on population and mortality by county was obtained from Statistics Norway for 1980-2014. Life expectancy and age-specific mortality rates (0-4, 5-49 and 50-69 age groups) were estimated by year and county. Geographic inequality was described by the absolute Gini index annually. Life expectancy in Norway has increased from 75.6 to 82.0 years, and the risk of death before the age of 70 has decreased from 26 to 14% from 1980 to 2014. The absolute Gini index decreased over the period 1980 to 2014 from 0.43 to 0.32 for life expectancy, from 0.012 to 0.0057 for the age group 50-69 years, from 0.0038 to 0.0022 for the age group 5-49 years, and from 0.0009 to 0.0006 for the age group 0-4 years. It will take between 2 and 32 years (national average 7 years) until the counties catch up with the life expectancy in the best performing county if their annual rates of increase remain unchanged. Using the absolute Gini index as a metric for monitoring changes in geographic inequality over time may be a valuable tool for informing public health policies. The absolute inequality in mortality and life expectancy between Norwegian counties has decreased from 1980 to 2014.
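The absolute Gini index used above is half the mean absolute difference between all pairs of counties; a minimal sketch follows, with hypothetical county values and without the population weighting the study may apply.

```python
import numpy as np

def absolute_gini(values: np.ndarray) -> float:
    # Half the mean absolute pairwise difference (unweighted).
    diffs = np.abs(values[:, None] - values[None, :])
    return diffs.mean() / 2.0

life_expectancy = np.array([81.2, 81.9, 82.4, 81.5, 82.9, 80.8, 82.1])  # hypothetical counties, years
print(f"absolute Gini index: {absolute_gini(life_expectancy):.3f} years")
```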
Chemin, K; Rezende, M; Loguercio, A D; Reis, A; Kossatz, S
To evaluate the risk for and intensity of tooth sensitivity and color change of at-home dental bleaching with 4% and 10% hydrogen peroxide (HP). For this study, 78 patients were selected according to the inclusion and exclusion criteria and randomized into two groups: HP 4 (White Class 4%, FGM) and HP 10 (White Class 10%, FGM). In both groups, the at-home bleaching was performed for a period of 30 minutes twice a day for two weeks. The color was assessed by Vita Classical, Vita Bleachedguide 3D-MASTER and spectrophotometer Vita Easyshade (Vita Zahnfabrik) at baseline, during bleaching (first and second weeks) and after bleaching (one month). Patients recorded their tooth sensitivity using a numeric rating scale (0-4) and visual analog scale (0-10). Color change data (ΔE) were submitted to two-way analysis of variance. The color change data in ΔSGU from the two shade guide units were compared with the Mann-Whitney test. The risk of tooth sensitivity was evaluated by the χ² test and the intensity of tooth sensitivity from both scales was evaluated by a Mann-Whitney test (α=0.05). The absolute risk and intensity of tooth sensitivity were higher in the group that used HP 10 than the one that used HP 4. Data from change in the number of shade guide units and color variation after one month of bleaching for both groups showed significant whitening, with no difference between groups. At-home bleaching is effective with 4% and 10% HP concentrations, but 10% HP increased the absolute risk and intensity of tooth sensitivity during at-home bleaching.
Koblitz, Amber R.; Persoskie, Alexander; Ferrer, Rebecca A.; Klein, William M. P.; Dwyer, Laura A.; Park, Elyse R.
2016-01-01
Introduction: Absolute and comparative risk perceptions, worry, perceived severity, perceived benefits, and self-efficacy are important theoretical determinants of tobacco use, but no measures have been validated to ensure the discriminant validity as well as test-retest reliability of these measures in the tobacco context. The purpose of the current study is to examine the reliability and factor structure of a measure assessing smoking-related health cognitions and emotions in a national sample of current and former heavy smokers in the National Lung Screening Trial. Methods: A sub-study of the National Lung Screening Trial assessed current and former smokers’ (age 55–74; N = 4379) self-reported health cognitions and emotions at trial enrollment and at 12-month follow-up. Items were derived from the Health Belief Model and Self-Regulation Model. Results: An exploratory factor analysis of baseline responses revealed a five-factor structure for former smokers (risk perceptions, worry, perceived severity, perceived benefits, and self-efficacy) and a six-factor structure for current smokers, such that absolute risk and comparative risk perceptions emerged as separate factors. A confirmatory factor analysis of 12-month follow-up responses revealed a good fit for the five latent constructs for former smokers and six latent constructs for current smokers. Longitudinal stability of these constructs was also demonstrated. Conclusions: This is the first study to examine tobacco-related health cognition and emotional constructs over time in current and former heavy smokers undergoing lung screening. This study found that the theoretical constructs were stable across time and that the factor structure differed based on smoking status (current vs. former). PMID:25964503
Safety of benzathine penicillin for preventing congenital syphilis: a systematic review.
Galvao, Tais F; Silva, Marcus T; Serruya, Suzanne J; Newman, Lori M; Klausner, Jeffrey D; Pereira, Mauricio G; Fescina, Ricardo
2013-01-01
To estimate the risk of serious adverse reactions to benzathine penicillin in pregnant women for preventing congenital syphilis. We searched for clinical trials or cohorts that assessed the incidence of serious adverse reactions to benzathine penicillin in pregnant women and the general population (indirect evidence). MEDLINE, EMBASE, Scopus and other databases were searched up to December 2012. The GRADE approach was used to assess quality of evidence. Absolute risks of each study were calculated along with their 95% confidence intervals (95% CI). We employed the DerSimonian and Laird random effects model in the meta-analyses. From 2,765 retrieved studies we included 13, representing 3,466,780 patients. The studies that included pregnant women were conducted to demonstrate the effectiveness of benzathine penicillin: no serious adverse reactions were reported among the 1,244 pregnant women included. In the general population, among 2,028,982 patients treated, 4 died from an adverse reaction. The pooled risk of death was virtually zero. Fifty-four cases of anaphylaxis were reported (pooled absolute risk = 0.002%; 95% CI: 0%-0.003%; I² = 12%). From that estimate, penicillin treatment would be expected to result in an incidence of 0 to 3 cases of anaphylaxis per 100,000 treated. Any adverse reactions were reported in 6,377 patients among 3,465,322 treated with penicillin (pooled absolute risk = 0.169%; 95% CI: 0.073%-0.265%; I² = 97%). The quality of evidence was very low. Studies that assessed the risk of serious adverse events due to benzathine penicillin treatment in pregnant women were scarce, but no reports of adverse reactions were found. The incidence of severe adverse outcomes was very low in the general population. The risk of treating pregnant women with benzathine penicillin to prevent congenital syphilis appears very low and does not outweigh its benefits. Further research is needed to improve the quality of evidence.
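A hedged sketch of DerSimonian-Laird random-effects pooling of per-study absolute risks (proportions), as named in the abstract. The study counts are hypothetical, and the continuity correction used for the variance of zero-event studies is an assumption rather than the authors' method.

```python
import numpy as np

def dersimonian_laird(effects, variances):
    """DerSimonian-Laird random-effects pooled estimate with a 95% CI."""
    e, v = np.asarray(effects, float), np.asarray(variances, float)
    w = 1.0 / v                                # fixed-effect (inverse-variance) weights
    fixed = np.sum(w * e) / np.sum(w)
    q = np.sum(w * (e - fixed) ** 2)           # Cochran's Q
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (len(e) - 1)) / c)    # between-study variance
    w_re = 1.0 / (v + tau2)
    pooled = np.sum(w_re * e) / np.sum(w_re)
    se = np.sqrt(1.0 / np.sum(w_re))
    return pooled, pooled - 1.96 * se, pooled + 1.96 * se

# Hypothetical anaphylaxis counts and sample sizes from four studies
events = np.array([3, 10, 0, 5])
n = np.array([150_000, 400_000, 80_000, 250_000])
risk = events / n
risk_cc = (events + 0.5) / (n + 1)             # continuity-corrected, for the variance only
var = risk_cc * (1 - risk_cc) / n
pooled, lo, hi = dersimonian_laird(risk, var)
print(f"pooled absolute risk = {pooled:.4%} (95% CI {lo:.4%} to {hi:.4%})")
```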
Recurrent stroke risk and cerebral microbleed burden in ischemic stroke and TIA
Wilson, Duncan; Charidimou, Andreas; Ambler, Gareth; Fox, Zoe V.; Gregoire, Simone; Rayson, Phillip; Imaizumi, Toshio; Fluri, Felix; Naka, Hiromitsu; Horstmann, Solveig; Veltkamp, Roland; Rothwell, Peter M.; Kwa, Vincent I.H.; Thijs, Vincent; Lee, Yong-Seok; Kim, Young Dae; Huang, Yining; Wong, Ka Sing; Jäger, Hans Rolf
2016-01-01
Objective: To determine associations of cerebral microbleed (CMB) burden with recurrent ischemic stroke (IS) and intracerebral hemorrhage (ICH) risk after IS or TIA. Methods: We identified prospective studies of patients with IS or TIA that investigated CMBs and stroke (ICH and IS) risk during ≥3 months follow-up. Authors provided aggregate summary-level data on stroke outcomes, with CMBs categorized according to burden (single, 2–4, and ≥5 CMBs) and distribution. We calculated absolute event rates and pooled risk ratios (RR) using random-effects meta-analysis. Results: We included 5,068 patients from 15 studies. There were 115/1,284 (9.6%) recurrent IS events in patients with CMBs vs 212/3,781 (5.6%) in patients without CMBs (pooled RR 1.8 for CMBs vs no CMBs; 95% confidence interval [CI] 1.4–2.5). There were 49/1,142 (4.3%) ICH events in those with CMBs vs 17/2,912 (0.58%) in those without CMBs (pooled RR 6.3 for CMBs vs no CMBs; 95% CI 3.5–11.4). Increasing CMB burden increased the risk of IS (pooled RR [95% CI] 1.8 [1.0–3.1], 2.4 [1.3–4.4], and 2.7 [1.5–4.9] for 1 CMB, 2–4 CMBs, and ≥5 CMBs, respectively) and ICH (pooled RR [95% CI] 4.6 [1.9–10.7], 5.6 [2.4–13.3], and 14.1 [6.9–29.0] for 1 CMB, 2–4 CMBs, and ≥5 CMBs, respectively). Conclusions: CMBs are associated with increased stroke risk after IS or TIA. With increasing CMB burden (compared to no CMBs), the risk of ICH increases more steeply than that of IS. However, IS absolute event rates remain higher than ICH absolute event rates in all CMB burden categories. PMID:27590288
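As a quick numerical illustration of the absolute event rates reported above, the helper below computes a crude risk ratio with a Wald 95% CI from the aggregate counts. Because the published estimate is a random-effects RR pooled across 15 studies, the crude figure will not match it exactly.

```python
import math

def risk_ratio(a, n1, b, n2):
    """Crude risk ratio with a 95% CI from two event counts and sample sizes."""
    r1, r2 = a / n1, b / n2
    rr = r1 / r2
    se_log = math.sqrt(1 / a - 1 / n1 + 1 / b - 1 / n2)
    lo = math.exp(math.log(rr) - 1.96 * se_log)
    hi = math.exp(math.log(rr) + 1.96 * se_log)
    return r1, r2, rr, lo, hi

# Recurrent ischaemic stroke: 115/1284 with CMBs vs 212/3781 without (aggregate counts)
r1, r2, rr, lo, hi = risk_ratio(115, 1284, 212, 3781)
print(f"rates {r1:.1%} vs {r2:.1%}; crude RR {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```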
Modeling and predicting historical volatility in exchange rate markets
NASA Astrophysics Data System (ADS)
Lahmiri, Salim
2017-04-01
Volatility modeling and forecasting of currency exchange rates is an important task in several areas of business risk management, including treasury risk management, derivatives pricing, and portfolio risk evaluation. The purpose of this study is to present a simple and effective approach for predicting the historical volatility of currency exchange rates. The approach is based on a limited set of technical indicators as inputs to artificial neural networks (ANN). To show the effectiveness of the proposed approach, it was applied to forecast US/Canada and US/Euro exchange rate volatilities. The forecasting results show that our simple approach outperformed the conventional GARCH and EGARCH with different distribution assumptions, and also the hybrid GARCH and EGARCH with ANN, in terms of mean absolute error, mean squared error, and Theil's inequality coefficient. Because of its simplicity and effectiveness, the approach is promising for US currency volatility prediction tasks.
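A small sketch of the three comparison metrics named above (MAE, MSE and Theil's inequality coefficient); the ANN and GARCH models themselves are not reproduced, the series below are hypothetical, and Theil's U is taken in its U1 form, which is an assumption since the abstract does not spell out the variant.

```python
import numpy as np

def forecast_errors(actual, predicted):
    """MAE, MSE and Theil's inequality coefficient (U1) for a volatility forecast."""
    a, p = np.asarray(actual, float), np.asarray(predicted, float)
    mae = np.mean(np.abs(a - p))
    mse = np.mean((a - p) ** 2)
    # Theil's U1: RMSE normalised by the root-mean-squares of the two series
    theil_u = np.sqrt(mse) / (np.sqrt(np.mean(a ** 2)) + np.sqrt(np.mean(p ** 2)))
    return mae, mse, theil_u

# Hypothetical daily realised volatility vs model forecast
actual    = [0.012, 0.015, 0.011, 0.018, 0.014]
predicted = [0.013, 0.014, 0.012, 0.016, 0.015]
print(forecast_errors(actual, predicted))
```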
Pixel-by-pixel absolute phase retrieval using three phase-shifted fringe patterns without markers
NASA Astrophysics Data System (ADS)
Jiang, Chufan; Li, Beiwen; Zhang, Song
2017-04-01
This paper presents a method that can recover absolute phase pixel by pixel without embedding markers in the three phase-shifted fringe patterns, acquiring additional images, or introducing additional hardware component(s). The proposed three-dimensional (3D) absolute shape measurement technique includes the following major steps: (1) segment the measured object into different regions using rough a priori knowledge of surface geometry; (2) artificially create phase maps at different z planes using geometric constraints of the structured light system; (3) unwrap the phase pixel by pixel for each region by properly referring to the artificially created phase map; and (4) merge the unwrapped phases from all regions into a complete absolute phase map for 3D reconstruction. We demonstrate that conventional three-step phase-shifted fringe patterns can be used to create an absolute phase map pixel by pixel even for objects with a large depth range. We have successfully implemented our proposed computational framework to achieve absolute 3D shape measurement at 40 Hz.
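The wrapped-phase computation that a three-step phase-shifting method builds on can be sketched as follows; the segmentation, geometric-constraint phase maps and region-wise unwrapping described in the abstract are not reproduced here, and the synthetic fringes are purely illustrative.

```python
import numpy as np

def wrapped_phase(i1, i2, i3):
    """Wrapped phase from three fringe images with 2*pi/3 phase shifts.

    Standard three-step formula: phi = atan2(sqrt(3)*(I1 - I3), 2*I2 - I1 - I3),
    giving values in (-pi, pi] that still require unwrapping.
    """
    i1, i2, i3 = (np.asarray(i, dtype=float) for i in (i1, i2, i3))
    return np.arctan2(np.sqrt(3.0) * (i1 - i3), 2.0 * i2 - i1 - i3)

# Synthetic fringes on a 4x4 patch with a hypothetical true phase ramp
true_phase = np.linspace(0, np.pi, 16).reshape(4, 4)
imgs = [128 + 100 * np.cos(true_phase + k * 2 * np.pi / 3) for k in (-1, 0, 1)]
print(wrapped_phase(*imgs))   # recovers the ramp up to wrapping
```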
Grover, S. A.; Lowensteyn, I.; Esrey, K. L.; Steinert, Y.; Joseph, L.; Abrahamowicz, M.
1995-01-01
OBJECTIVE--To evaluate the ability of doctors in primary care to assess patients' risk of coronary heart disease. DESIGN--Questionnaire survey. SETTING--Continuing medical education meetings, Ontario and Quebec, Canada. SUBJECTS--Community based doctors who agreed to enroll in the coronary health assessment study. MAIN OUTCOME MEASURE--Ratings of coronary risk factors and estimates by doctors of relative and absolute coronary risk of two hypothetical patients and the "average" 40 year old Canadian man and 70 year old Canadian woman. RESULTS--253 doctors answered the questionnaire. For 30 year olds the doctors rated cigarette smoking as the most important risk factor and raised serum triglyceride concentrations as the least important; for 70 year old patients they rated diabetes as the most important risk factor and raised serum triglyceride concentrations as the least important. They rated each individual risk factor as significantly less important for 70 year olds than for 30 year olds (all risk factors, P < 0.001). They showed a strong understanding of the relative importance of specific risk factors, and most were confident in their ability to estimate coronary risk. While doctors accurately estimated the relative risk of a specific patient (compared with the average adult) they systematically overestimated the absolute baseline risk of developing coronary disease and the risk reductions associated with specific interventions. CONCLUSIONS--Despite guidelines on targeting patients at high risk of coronary disease, accurate assessment of coronary risk remains difficult for many doctors. Additional strategies must be developed to help doctors better assess their patients' coronary risk. PMID:7728035
Waldron, Sarah; Horsburgh, Margaret
2009-09-01
Evidence has shown the effectiveness of risk factor management in reducing mortality and morbidity from cardiovascular disease (CVD). We report an audit of a nurse CVD risk assessment programme undertaken between November 2005 and December 2008 in a Northland general practice. This was a retrospective audit of CVD risk assessment, with data for the first entry of 621 patients collected exclusively from PREDICT-CVD™, along with subsequent data collected from 320 of these patients who had a subsequent assessment recorded at an interval ranging from six months to three years (18-month average). Of the eligible population (71%) with an initial CVD risk assessment, 430 (69.2%) had a five-year absolute risk less than 15%, with 84 (13.5%) having a risk greater than 15% without having had a cardiovascular event. Of the patients with a follow-up CVD risk assessment, 34 showed improvement. Medication prescribing for patients with absolute CVD risk greater than 15% increased over the audit period from 71% to 86% for anti-platelet medication and from 65% to 72% for lipid-lowering medication. The recently available 'heart health' trajectory tool will help patients become more aware of risks that are modifiable, together with community support to engage more patients in the nurse CVD prevention programme. Further medication audits are recommended to monitor prescribing trends. Patients who showed an improvement in CVD risk had an improvement in one or more modifiable risk factors and became actively involved in making changes to their health.
Stringhini, Silvia; Spencer, Brenda; Marques-Vidal, Pedro; Waeber, Gerard; Vollenweider, Peter; Paccaud, Fred; Bovet, Pascal
2012-01-01
Objectives We examined the social distribution of a comprehensive range of cardiovascular risk factors (CVRF) in a Swiss population and assessed whether socioeconomic differences varied by age and gender. Methods Participants were 2960 men and 3343 women aged 35–75 years from a population-based survey conducted in Lausanne, Switzerland (CoLaus study). Educational level was the indicator of socioeconomic status used in this study. Analyses were stratified by gender and age group (35–54 years; 55–75 years). Results There were large educational differences in the prevalence of CVRF such as current smoking (Δ = absolute difference in prevalence between highest and lowest educational group: 15.1%/12.6% in men/women aged 35–54 years), physical inactivity (Δ = 25.3%/22.7% in men/women aged 35–54 years), overweight and obesity (Δ = 14.6%/14.8% in men/women aged 55–75 years for obesity), hypertension (Δ = 16.7%/11.4% in men/women aged 55–75 years), dyslipidemia (Δ = 2.8%/6.2% in men/women aged 35–54 years for high LDL-cholesterol) and diabetes (Δ = 6.0%/2.6% in men/women aged 55–75 years). Educational inequalities in the distribution of CVRF were larger in women than in men for alcohol consumption, obesity, hypertension and dyslipidemia (p<0.05). Relative educational inequalities in CVRF tended to be greater among the younger (35–54 years) than among the older age group (55–75 years), particularly for behavioral CVRF and abdominal obesity among men and for physiological CVRF among women (p<0.05). Conclusion Large absolute differences in the prevalence of CVRF according to education categories were observed in this Swiss population. The socioeconomic gradient in CVRF tended to be larger in women and in younger persons. PMID:23152909
Kim, Su-A; Kim, Jang Young; Park, Jeong Bae
2016-06-01
There has been a rising interest in interarm blood pressure difference (IAD), due to its relationship with peripheral arterial disease and its possible relationship with cardiovascular disease. This study aimed to characterize hypertensive patients with a significant IAD in relation to cardiovascular risk. A total of 3699 patients (mean age, 61 ± 11 years) were prospectively enrolled in the study. Blood pressure (BP) was measured simultaneously in both arms 3 times using an automated cuff-oscillometric device. IAD was defined as the absolute difference in averaged BPs between the left and right arm, and an IAD ≥ 10 mm Hg was considered to be significant. The Framingham risk score was used to calculate the 10-year cardiovascular risk. The mean systolic IAD (sIAD) was 4.3 ± 4.1 mm Hg, and 285 (7.7%) patients showed significant sIAD. Patients with significant sIAD showed larger body mass index (P < 0.001), greater systolic BP (P = 0.050), more coronary artery disease (relative risk = 1.356, P = 0.034), and more cerebrovascular disease (relative risk = 1.521, P = 0.072). The mean 10-year cardiovascular risk was 9.3 ± 7.7%. By multiple regression, sIAD was significantly but weakly correlated with the 10-year cardiovascular risk (β = 0.135, P = 0.008). Patients with significant sIAD showed a higher prevalence of coronary artery disease, as well as an increase in 10-year cardiovascular risk. Therefore, accurate measurements of sIAD may serve as a simple and cost-effective tool for predicting cardiovascular risk in clinical settings.
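A minimal sketch of the interarm difference calculation described above: the systolic IAD is the absolute difference between the left- and right-arm averages of the triplicate simultaneous readings, flagged when it reaches 10 mm Hg. The readings below are hypothetical.

```python
def systolic_iad(left_readings, right_readings, threshold=10.0):
    """Absolute difference between average left- and right-arm systolic BP (mm Hg)."""
    left = sum(left_readings) / len(left_readings)
    right = sum(right_readings) / len(right_readings)
    iad = abs(left - right)
    return iad, iad >= threshold

# Hypothetical triplicate simultaneous readings (mm Hg)
iad, significant = systolic_iad([138, 142, 140], [127, 131, 129])
print(f"sIAD = {iad:.1f} mm Hg, significant: {significant}")
```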
Significant interarm blood pressure difference predicts cardiovascular risk in hypertensive patients
Kim, Su-A; Kim, Jang Young; Park, Jeong Bae
2016-01-01
Abstract There has been a rising interest in interarm blood pressure difference (IAD), due to its relationship with peripheral arterial disease and its possible relationship with cardiovascular disease. This study aimed to characterize hypertensive patients with a significant IAD in relation to cardiovascular risk. A total of 3699 patients (mean age, 61 ± 11 years) were prospectively enrolled in the study. Blood pressure (BP) was measured simultaneously in both arms 3 times using an automated cuff-oscillometric device. IAD was defined as the absolute difference in averaged BPs between the left and right arm, and an IAD ≥ 10 mm Hg was considered to be significant. The Framingham risk score was used to calculate the 10-year cardiovascular risk. The mean systolic IAD (sIAD) was 4.3 ± 4.1 mm Hg, and 285 (7.7%) patients showed significant sIAD. Patients with significant sIAD showed larger body mass index (P < 0.001), greater systolic BP (P = 0.050), more coronary artery disease (relative risk = 1.356, P = 0.034), and more cerebrovascular disease (relative risk = 1.521, P = 0.072). The mean 10-year cardiovascular risk was 9.3 ± 7.7%. By multiple regression, sIAD was significantly but weakly correlated with the 10-year cardiovascular risk (β = 0.135, P = 0.008). Patients with significant sIAD showed a higher prevalence of coronary artery disease, as well as an increase in 10-year cardiovascular risk. Therefore, accurate measurements of sIAD may serve as a simple and cost-effective tool for predicting cardiovascular risk in clinical settings. PMID:27310982
Orozco-Beltran, Domingo; Gil-Guillen, Vicente F.; Redon, Josep; Martin-Moreno, Jose M.; Pallares-Carratala, Vicente; Navarro-Perez, Jorge; Valls-Roca, Francisco; Sanchis-Domenech, Carlos; Fernandez-Gimenez, Antonio; Perez-Navarro, Ana; Bertomeu-Martinez, Vicente; Bertomeu-Gonzalez, Vicente; Cordero, Alberto; Pascual de la Torre, Manuel; Trillo, Jose L.; Carratala-Munuera, Concepcion; Pita-Fernandez, Salvador; Uso, Ruth; Durazo-Arvizu, Ramon; Cooper, Richard; Sanz, Gines; Castellano, Jose M.; Ascaso, Juan F.; Carmena, Rafael; Tellez-Plaza, Maria
2017-01-01
Introduction The potential impact of targeting different components of an adverse lipid profile in populations with multiple cardiovascular risk factors is not completely clear. This study aims to assess the association of different components of the standard lipid profile with all-cause mortality and hospitalization due to cardiovascular events in a high-risk population. Methods This prospective registry included high-risk adults over 30 years old free of cardiovascular disease (2008–2012). A diagnosis of hypertension, dyslipidemia or diabetes mellitus was an inclusion criterion. Lipid biomarkers were evaluated. Primary endpoints were all-cause mortality and hospital admission due to coronary heart disease or stroke. We estimated adjusted rate ratios (aRR), absolute risk differences and population attributable risk associated with adverse lipid profiles. Results 51,462 subjects were included with a mean age of 62.6 years (47.6% men). During an average follow-up of 3.2 years, 919 deaths, 1666 hospitalizations for coronary heart disease and 1510 hospitalizations for stroke were recorded. The parameters that showed an increased rate for total mortality, coronary heart disease and stroke hospitalization were, respectively, low HDL-Cholesterol: aRR 1.25, 1.29 and 1.23; high Total/HDL-Cholesterol: aRR 1.22, 1.38 and 1.25; and high Triglycerides/HDL-Cholesterol: aRR 1.21, 1.30, 1.09. The parameters that showed the highest population attributable risk (%) were, respectively, low HDL-Cholesterol: 7.70, 11.42, 8.40; high Total/HDL-Cholesterol: 6.55, 12.47, 8.73; and high Triglycerides/HDL-Cholesterol: 8.94, 15.09, 6.92. Conclusions In a population with cardiovascular risk factors, HDL-cholesterol, Total/HDL-cholesterol and triglycerides/HDL-cholesterol ratios were associated with a higher population attributable risk for cardiovascular disease compared to other common biomarkers. PMID:29045483
Orozco-Beltran, Domingo; Gil-Guillen, Vicente F; Redon, Josep; Martin-Moreno, Jose M; Pallares-Carratala, Vicente; Navarro-Perez, Jorge; Valls-Roca, Francisco; Sanchis-Domenech, Carlos; Fernandez-Gimenez, Antonio; Perez-Navarro, Ana; Bertomeu-Martinez, Vicente; Bertomeu-Gonzalez, Vicente; Cordero, Alberto; Pascual de la Torre, Manuel; Trillo, Jose L; Carratala-Munuera, Concepcion; Pita-Fernandez, Salvador; Uso, Ruth; Durazo-Arvizu, Ramon; Cooper, Richard; Sanz, Gines; Castellano, Jose M; Ascaso, Juan F; Carmena, Rafael; Tellez-Plaza, Maria
2017-01-01
The potential impact of targeting different components of an adverse lipid profile in populations with multiple cardiovascular risk factors is not completely clear. This study aims to assess the association of different components of the standard lipid profile with all-cause mortality and hospitalization due to cardiovascular events in a high-risk population. This prospective registry included high-risk adults over 30 years old free of cardiovascular disease (2008-2012). A diagnosis of hypertension, dyslipidemia or diabetes mellitus was an inclusion criterion. Lipid biomarkers were evaluated. Primary endpoints were all-cause mortality and hospital admission due to coronary heart disease or stroke. We estimated adjusted rate ratios (aRR), absolute risk differences and population attributable risk associated with adverse lipid profiles. 51,462 subjects were included with a mean age of 62.6 years (47.6% men). During an average follow-up of 3.2 years, 919 deaths, 1666 hospitalizations for coronary heart disease and 1510 hospitalizations for stroke were recorded. The parameters that showed an increased rate for total mortality, coronary heart disease and stroke hospitalization were, respectively, low HDL-Cholesterol: aRR 1.25, 1.29 and 1.23; high Total/HDL-Cholesterol: aRR 1.22, 1.38 and 1.25; and high Triglycerides/HDL-Cholesterol: aRR 1.21, 1.30, 1.09. The parameters that showed the highest population attributable risk (%) were, respectively, low HDL-Cholesterol: 7.70, 11.42, 8.40; high Total/HDL-Cholesterol: 6.55, 12.47, 8.73; and high Triglycerides/HDL-Cholesterol: 8.94, 15.09, 6.92. In a population with cardiovascular risk factors, HDL-cholesterol, Total/HDL-cholesterol and triglycerides/HDL-cholesterol ratios were associated with a higher population attributable risk for cardiovascular disease compared to other common biomarkers.
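Population attributable risk, the key quantity in this registry analysis, is commonly computed with Levin's formula PAR% = p(RR − 1) / (1 + p(RR − 1)). The sketch below applies it to the reported aRR for coronary heart disease with low HDL-cholesterol under a hypothetical exposure prevalence, since the actual prevalence is not given in the abstract.

```python
def par_percent(prevalence, rate_ratio):
    """Levin's population attributable risk percent."""
    excess = prevalence * (rate_ratio - 1.0)
    return 100.0 * excess / (1.0 + excess)

# Illustrative only: aRR 1.29 for CHD hospitalisation with low HDL-C,
# assuming (hypothetically) that ~40% of the cohort had low HDL-C
print(f"PAR = {par_percent(0.40, 1.29):.1f}%")
```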
Lumme, Sonja; Manderbacka, Kristiina; Keskimäki, Ilmo
2017-02-20
Resources for coronary revascularisations have increased substantially since the early 1990s in Finland. At the same time, ischaemic heart disease (IHD) mortality has decreased markedly. This study aims to examine how these changes have influenced trends in absolute and relative differences between socioeconomic groups in revascularisations, and age group differences in them, using IHD mortality as a proxy for need. Hospital Discharge Register data on revascularisations among Finns aged 45-84 in 1995-2010 were individually linked to population registers to obtain socio-demographic data. We measured absolute and relative income group differences in revascularisation and IHD mortality with the slope index of inequality (SII) and the concentration index (C), and relative equity taking need for care into account with the horizontal inequity index (HII). The supply of procedures doubled over the study period. The socioeconomic distribution of revascularisations was equal in absolute and relative terms in 1995 (Men: SII = -12, C = -0.00; Women: SII = -30, C = -0.03), but differences favouring low-income groups emerged by 2010 (M: SII = -340, C = -0.08; W: SII = -195, C = -0.14). IHD mortality decreased markedly, but absolute and relative differences favouring the better-off existed throughout the study years. Absolute differences decreased somewhat (M: SII = -760 in 1995, SII = -681 in 2010; W: SII = -318 in 1995, SII = -211 in 2010), but relative differences increased significantly (M: C = -0.14 in 1995, C = -0.26 in 2010; W: C = -0.15 in 1995, C = -0.25 in 2010). HII was greater than zero in each year, indicating inequity favouring the better-off. HII increased from 0.15 to 0.18 among men and from 0.10 to 0.12 among women. We found significant and increasing age group differences in HII. Despite a large increase in the supply of revascularisations and a decrease in IHD mortality, there is still marked socioeconomic inequity in revascularisations in Finland. However, since changes in the absolute distributions of both supply and need for coronary care have favoured low-income groups, absolute inequity can be claimed to have decreased, although it cannot be quantified numerically.
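A hedged sketch of the two inequality measures used in this study, the slope index of inequality (SII) and the concentration index (C), computed over income groups ordered from poorest to richest; the grouping, weights and rates below are hypothetical, and the register-based estimation itself is not reproduced.

```python
import numpy as np

def sii_and_ci(rates, pop_shares):
    """Slope index of inequality and concentration index across ordered groups.

    Groups must be ordered from lowest to highest income. SII is the weighted
    least-squares slope of the rate on the cumulative-population-share midpoint
    (ridit score); C uses the covariance formula 2*cov(rate, rank)/mean(rate).
    """
    rates = np.asarray(rates, float)
    w = np.asarray(pop_shares, float)
    w = w / w.sum()
    midpoints = np.cumsum(w) - w / 2.0                 # ridit scores
    xm, ym = np.sum(w * midpoints), np.sum(w * rates)  # weighted means
    sii = np.sum(w * (midpoints - xm) * (rates - ym)) / np.sum(w * (midpoints - xm) ** 2)
    ci = 2.0 * np.sum(w * rates * midpoints) / ym - 1.0
    return sii, ci

# Hypothetical revascularisation rates per 100,000 by income quintile (low -> high)
print(sii_and_ci([620, 580, 540, 500, 470], [0.2] * 5))
```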
Aerobic exercise training for adults with fibromyalgia.
Bidonde, Julia; Busch, Angela J; Schachter, Candice L; Overend, Tom J; Kim, Soo Y; Góes, Suelen M; Boden, Catherine; Foulds, Heather Ja
2017-06-21
Exercise training is commonly recommended for individuals with fibromyalgia. This review is one of a series of reviews about exercise training for people with fibromyalgia that will replace the "Exercise for treating fibromyalgia syndrome" review first published in 2002. • To evaluate the benefits and harms of aerobic exercise training for adults with fibromyalgia • To assess the following specific comparisons: ◦ Aerobic versus control conditions (eg, treatment as usual, wait list control, physical activity as usual) ◦ Aerobic versus aerobic interventions (eg, running vs brisk walking) ◦ Aerobic versus non-exercise interventions (eg, medications, education) We did not assess specific comparisons involving aerobic exercise versus other exercise interventions (eg, resistance exercise, aquatic exercise, flexibility exercise, mixed exercise). Other systematic reviews have examined or will examine these comparisons (Bidonde 2014; Busch 2013). We searched the Cochrane Library, MEDLINE, Embase, the Cumulative Index to Nursing and Allied Health Literature (CINAHL), the Physiotherapy Evidence Database (PEDro), Thesis and Dissertation Abstracts, the Allied and Complementary Medicine Database (AMED), the World Health Organization International Clinical Trials Registry Platform (WHO ICTRP), and the ClinicalTrials.gov registry up to June 2016, unrestricted by language, and we reviewed the reference lists of retrieved trials to identify potentially relevant trials. We included randomized controlled trials (RCTs) in adults with a diagnosis of fibromyalgia that compared aerobic training interventions (dynamic physical activity that increases breathing and heart rate to submaximal levels for a prolonged period) versus no exercise or another intervention. Major outcomes were health-related quality of life (HRQL), pain intensity, stiffness, fatigue, physical function, withdrawals, and adverse events. Two review authors independently selected trials for inclusion, extracted data, performed a risk of bias assessment, and assessed the quality of the body of evidence for major outcomes using the GRADE approach. We used a 15% threshold for calculation of clinically relevant differences between groups. We included 13 RCTs (839 people). Studies were at risk of selection, performance, and detection bias (owing to lack of blinding for self-reported outcomes) and had low risk of attrition and reporting bias. We prioritized the findings when aerobic exercise was compared with no exercise control and present them fully here. Eight trials (with 456 participants) provided low-quality evidence for pain intensity, fatigue, stiffness, and physical function; and moderate-quality evidence for withdrawals and HRQL at completion of the intervention (6 to 24 weeks). With the exception of withdrawals and adverse events, major outcome measures were self-reported and were expressed on a 0 to 100 scale (lower values are best, negative mean differences (MDs)/standardized mean differences (SMDs) indicate improvement).
Effects for aerobic exercise versus control were as follows: HRQL: mean 56.08; five studies; N = 372; MD -7.89, 95% CI -13.23 to -2.55; absolute improvement of 8% (3% to 13%) and relative improvement of 15% (5% to 24%); pain intensity: mean 65.31; six studies; N = 351; MD -11.06, 95% CI -18.34 to -3.77; absolute improvement of 11% (95% CI 4% to 18%) and relative improvement of 18% (7% to 30%); stiffness: mean 69; one study; N = 143; MD -7.96, 95% CI -14.95 to -0.97; absolute difference in improvement of 8% (1% to 15%) and relative change in improvement of 11.4% (21.4% to 1.4%); physical function: mean 38.32; three studies; N = 246; MD -10.16, 95% CI -15.39 to -4.94; absolute change in improvement of 10% (15% to 5%) and relative change in improvement of 21.9% (33% to 11%); and fatigue: mean 68; three studies; N = 286; MD -6.48, 95% CI -14.33 to 1.38; absolute change in improvement of 6% (12% improvement to 0.3% worse) and relative change in improvement of 8% (16% improvement to 0.4% worse). Pooled analysis resulted in a risk ratio (RR) of moderate quality for withdrawals (17 per 100 and 20 per 100 in control and intervention groups, respectively; eight studies; N = 456; RR 1.25, 95% CI 0.89 to 1.77; absolute change of 5% more withdrawals with exercise (3% fewer to 12% more)). Three trials provided low-quality evidence on long-term effects (24 to 208 weeks post intervention) and reported that benefits for pain and function persisted but did not for HRQL or fatigue. Withdrawals were similar, and investigators did not assess stiffness and adverse events. We are uncertain about the effects of one aerobic intervention versus another, as the evidence was of low to very low quality and was derived from single trials only, precluding meta-analyses. Similarly, we are uncertain of the effects of aerobic exercise over active controls (ie, education, three studies; stress management training, one study; medication, one study) owing to evidence of low to very low quality provided by single trials. Most studies did not measure adverse events; thus we are uncertain about the risk of adverse events associated with aerobic exercise. When compared with control, moderate-quality evidence indicates that aerobic exercise probably improves HRQL and all-cause withdrawal, and low-quality evidence suggests that aerobic exercise may slightly decrease pain intensity, may slightly improve physical function, and may lead to little difference in fatigue and stiffness. Three of the reported outcomes reached clinical significance (HRQL, physical function, and pain). Long-term effects of aerobic exercise may include little or no difference in pain, physical function, and all-cause withdrawal, and we are uncertain about long-term effects on remaining outcomes. We downgraded the evidence owing to the small number of included trials and participants across trials, and because of issues related to unclear and high risks of bias (performance, selection, and detection biases). Aerobic exercise appears to be well tolerated (similar withdrawal rates across groups), although evidence on adverse events is scarce, so we are uncertain about its safety.
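The absolute and relative percentages quoted above appear to be derived from the mean difference on the 0-to-100 scale and the control-group mean. The small sketch below is an assumption about that arithmetic, not a quotation of the review's methods; it roughly reproduces the HRQL figures of 8% absolute and 15% relative improvement.

```python
def improvement_from_md(md, control_mean):
    """Absolute (points on a 0-100 scale) and relative (% of control mean) improvement.

    Negative mean differences indicate improvement on these scales, so the sign
    is flipped to express the result as an improvement.
    """
    absolute = -md
    relative = -md / control_mean * 100.0
    return absolute, relative

# HRQL: control-group mean 56.08, MD -7.89 -> roughly 8% absolute, 14-15% relative
print(improvement_from_md(-7.89, 56.08))
```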
Easy Absolute Values? Absolutely
ERIC Educational Resources Information Center
Taylor, Sharon E.; Mittag, Kathleen Cage
2015-01-01
The authors teach a problem-solving course for preservice middle-grades education majors that includes concepts dealing with absolute-value computations, equations, and inequalities. Many of these students like mathematics and plan to teach it, so they are adept at symbolic manipulations. Getting them to think differently about a concept that they…
Bhaskaran, Krishnan; Douglas, Ian; Evans, Stephen; van Staa, Tjeerd; Smeeth, Liam
2012-04-24
To investigate whether there is an association between use of angiotensin receptor blockers and risk of cancer. Cohort study of risk of cancer in people treated with angiotensin receptor blockers compared with angiotensin converting enzyme (ACE) inhibitors. Effects were explored with time updated covariates in Cox models adjusted for age, sex, body mass index (BMI), diabetes and metformin/insulin use, hypertension, heart failure, statin use, socioeconomic status, alcohol, smoking, and calendar year. Absolute changes in risk were predicted from a Poisson model incorporating the strongest determinants of risk from the main analysis. UK primary care practices contributing to the General Practice Research Database. 377,649 new users of angiotensin receptor blockers or ACE inhibitors with at least one year of initial treatment. Adjusted hazard ratios for all cancer and major site specific cancers (breast, lung, colon, prostate) by exposure to angiotensin receptor blockers and by cumulative duration of use. Follow-up ended a median of 4.6 years after the start of treatment; 20,203 cancers were observed. There was no evidence of any increase in overall risk of cancer among those ever exposed to angiotensin receptor blockers (adjusted hazard ratio 1.03, 95% confidence interval 0.99 to 1.06, P = 0.10). For specific cancers, there was some evidence of an increased risk of breast and prostate cancer (1.11, 1.01 to 1.21, P = 0.02; and 1.10, 1.00 to 1.20, P = 0.04; respectively), which in absolute terms corresponded to an estimated 0.5 and 1.1 extra cases, respectively, per 1000 person years of follow-up among those with the highest baseline risk. Longer duration of treatment did not seem to be associated with higher risk (P>0.15 in each case). There was a decreased risk of lung cancer (0.84, 0.75 to 0.94), but no effect on colon cancer (1.02, 0.91 to 1.16). Use of angiotensin receptor blockers was not associated with an increased risk of cancer overall. Observed increased risks for breast and prostate cancer were small in absolute terms, and the lack of association with duration of treatment meant that non-causal explanations could not be excluded.
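The conversion of a hazard ratio into an absolute excess, as in the "0.5 and 1.1 extra cases per 1000 person years" quoted above, can be sketched as baseline rate × (HR − 1); the baseline incidence used below is hypothetical, since the abstract does not report it.

```python
def extra_cases_per_1000_py(baseline_rate_per_1000, hazard_ratio):
    """Absolute excess incidence implied by a hazard ratio at a given baseline rate."""
    return baseline_rate_per_1000 * (hazard_ratio - 1.0)

# Illustrative only: HR 1.11 for breast cancer, assuming a (hypothetical)
# baseline incidence of 5 per 1000 person-years in the highest-risk group
print(f"{extra_cases_per_1000_py(5.0, 1.11):.2f} extra cases per 1000 person-years")
```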
Pasina, Luca; Cortesi, Laura; Tiraboschi, Mara; Nobili, Alessandro; Lanzo, Giovanna; Tettamanti, Mauro; Franchi, Carlotta; Mannucci, Pier Mannuccio; Ghidoni, Silvia; Assolari, Andrea; Brucato, Antonio
2018-01-01
Short-term prognosis, e.g. mortality at three months, has many important implications in planning the overall management of patients, particularly non-oncologic patients, in order to avoid futile practices. The aims of this study were: i) to investigate the risk of three-month mortality after discharge from internal medicine and geriatric wards of non-oncologic patients with at least one of the following conditions: permanent bedridden status during the hospital stay; severely reduced kidney function; hypoalbuminemia; hospital admissions in the previous six months; severe dementia; ii) to establish the absolute risk difference of three-month mortality of bedridden compared to non-bedridden patients. This prospective cohort study was run in 102 Italian internal medicine and geriatric hospital wards. The sample included all patients with three-month follow-up data. Bedridden condition was defined as the inability to walk or stand upright during the whole hospital stay. The following parameters were also recorded: estimated GFR ≤29 mL/min/1.73 m²; severe dementia; albuminemia <2.5 g/dL; hospital admissions in the six months before the index admission. Of 3915 patients eligible for the analysis, three-month follow-up data were available for 2058, who were included in the study. Bedridden patients were 112 and the absolute risk difference of mortality at three months was 0.13 (CI 95% 0.08-0.19, p<0.0001). Logistic regression analysis, also adjusted for age, sex, number of drugs and comorbidity index, found that bedridden condition (OR 2.10, CI 95% 1.12-3.94), severely reduced kidney function (OR 2.27, CI 95% 1.22-4.21), hospital admission in the previous six months (OR 1.96, CI 95% 1.22-3.14), severe dementia (with total or severe physical dependence) (OR 4.16, CI 95% 2.39-7.25) and hypoalbuminemia (OR 2.47, CI 95% 1.12-5.44) were significantly associated with higher risk of three-month mortality. Bedridden status, severely reduced kidney function, recent hospital admissions, severe dementia and hypoalbuminemia were associated with a higher risk of three-month mortality in non-oncologic patients after discharge from internal medicine and geriatric hospital wards. Copyright © 2017 Elsevier B.V. All rights reserved.
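A minimal sketch of an absolute risk difference with a Wald 95% CI; the three-month death counts below are hypothetical (only the group sizes come from the abstract), so the output approximates rather than reproduces the reported 0.13 (95% CI 0.08-0.19).

```python
import math

def risk_difference(e1, n1, e2, n2):
    """Absolute risk difference between two groups with a Wald 95% confidence interval."""
    p1, p2 = e1 / n1, e2 / n2
    rd = p1 - p2
    se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    return rd, rd - 1.96 * se, rd + 1.96 * se

# Hypothetical three-month death counts: bedridden 25/112 vs non-bedridden 175/1946
print(risk_difference(25, 112, 175, 1946))
```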
2013-01-01
The absolute flatness of three silicon plane mirrors has been measured by a three-intersection method based on the three-flat method, using a near-infrared interferometer. The interferometer was constructed around a near-infrared laser diode emitting light at a wavelength of 1,310 nm, at which the silicon plane mirrors are transparent. The height differences at the coordinate values between the absolute line profiles obtained by the three-intersection method were evaluated. The height differences of the three flats were 4.5 nm or less. The three-intersection method using the near-infrared interferometer was useful for measuring the absolute flatness of the silicon plane mirrors. PMID:23758916
Sarmiento, E; del Pozo, N; Gallego, A; Fernández-Yañez, J; Palomo, J; Villa, A; Ruiz, M; Muñoz, P; Rodríguez, C; Rodríguez-Molina, J; Navarro, J; Kotsch, K; Fernandez-Cruz, E; Carbone, J
2012-10-01
Infection remains a source of mortality in heart recipients. We previously reported that post-transplant immunoglobulin G (IgG) quantification can help identify the risk for infection. We assessed whether other standardized parameters of humoral and cellular immunity could prove useful when identifying patients at risk of infection. We prospectively studied 133 heart recipients over a 12-month period. Forty-eight patients had at least one episode of severe infection. An event was defined as an infection requiring intravenous antimicrobial therapy. Cox regression analysis revealed an association between the risk of developing infection and the following: lower IgG2 subclass levels (day 7: relative hazard [RH] 1.71; day 30: RH 1.76), lower IgA levels (day 7: RH 1.61; day 30: RH 1.91), lower complement C3 values (day 7: RH 1.25), lower CD3 absolute counts (day 30: RH 1.10), lower absolute natural killer [NK] cell count (day 7: RH 1.24), and lower IgG concentrations (day 7: RH 1.31; day 30: RH 1.36). Cox regression bivariate analysis revealed that lower day 7 C3 levels, IgG2 concentration, and absolute NK cell count remained significant after adjustment for total IgG levels. Data suggest that early immune monitoring including C3, IgG2, and NK cell testing in addition to IgG concentrations is useful when attempting to identify the risk of infection in heart transplant recipients. © 2012 John Wiley & Sons A/S.
Roland, Michelle E; Neilands, Torsten B; Krone, Melissa R; Coates, Thomas J; Franses, Karena; Chesney, Margaret A; Kahn, James S; Martin, Jeffrey N
2011-07-01
The National HIV/AIDS Strategy proposes to scale-up post-exposure prophylaxis (PEP). Intensive risk reduction and adherence counseling appear to be effective but are resource intensive. Identifying simpler interventions that maximize the HIV prevention potential of PEP is critical. A randomized noninferiority study comparing 2 (standard) or 5 (enhanced) risk reduction counseling sessions was performed. Adherence counseling was provided in the enhanced arm. We measured changes in unprotected sexual intercourse acts at 12 months, compared with baseline; HIV acquisition; and PEP adherence. Outcomes were stratified by degree of baseline risk. We enrolled 457 individuals reporting unprotected intercourse within 72 h with an HIV-infected or at-risk partner. Participants were 96% male and 71% white. There were 1.8 and 2.3 fewer unprotected sex acts in the standard and enhanced groups. The maximum potential risk difference, reflected by the upper bound of the 95% confidence interval, was 3.9 acts. The difference in the riskier subset may have been as many as 19.6 acts. The incidence of HIV seroconversion was 2.9% and 2.6% among persons randomized to standard and enhanced counseling, respectively, with a maximum potential difference of 3.4%. The absolute and maximal HIV seroconversion incidence was 9.9% and 20.4% greater in the riskier group randomized to standard, compared with enhanced, counseling. Adherence outcomes were similar, with noninferiority in the lower risk group and concerning differences among the higher-risk group. Risk assessment is critical at PEP initiation. Standard counseling is only noninferior for individuals with lower baseline risk; thus, enhanced counseling should be targeted to individuals at higher risk. © The Author 2011. Published by Oxford University Press on behalf of the Infectious Diseases Society of America. All rights reserved.
Napper, Lucy E.; Grimaldi, Elizabeth M.; LaBrie, Joseph W.
2017-01-01
The current study aims to examine discrepancies in parents’ and college students’ perceptions of alcohol risk and the role of perceived risk in predicting parents’ intentions to discuss alcohol with their child. In total, 246 college student-parent dyads (56.1% female students, 77.2% mothers) were recruited from a mid-size university. Participants completed measures of absolute likelihood, comparative likelihood, and severity of alcohol consequences. In comparison to students, parents perceived the risks of alcohol poisoning (p < .001), academic impairment (p < .05), and problems with others (p < .05) to be more likely. In addition, parents rated the majority of alcohol consequences (e.g., passing out, regrettable sexual situation, throwing up) as more severe than students did (all ps < .001). However, parents tended to be more optimistic than their child about the comparative likelihood of alcohol consequences. After controlling for demographics and past alcohol communication, greater absolute likelihood (β = .20, p = .016) and less confidence in knowledge of student behavior (β = .20, p = .013) predicted greater intentions to discuss alcohol. Providing parents of college students with information about college drinking norms and the likelihood of alcohol consequences may help prompt alcohol-related communication. PMID:25437267
A new paradigm for primary prevention strategy in people with elevated risk of stroke.
Feigin, Valery L; Norrving, Bo
2014-07-01
Existing methods of primary stroke prevention are not sufficiently effective. Based on the recently developed Stroke Riskometer app, a new 'mass-elevated risk stroke/cardiovascular disease prevention' approach as an addition to the currently adopted absolute risk stroke/cardiovascular disease prevention approach is being advocated. We believe this approach is far more appealing to the individuals concerned and could be as efficient as the conventional population-based approach because it allows identification and engagement in prevention of all individuals who are at an increased (even slightly increased) risk of stroke and cardiovascular disease. The key novelty of this approach is twofold. First, it utilizes modern far-reaching mobile technologies, allowing individuals to calculate their absolute risk of stroke within the next 5 to 10 years and to compare their risk with those of the same age and gender without risk factors. Second, it employs self-management strategies to engage the person concerned in stroke/cardiovascular disease prevention, which is tailored to the person's individual risk profile. Preventative strategies similar to the Stroke Riskometer could be developed for other non-communicable disorders for which reliable predictive models and preventative recommendations exist. This would help reduce the burden of non-communicable disorders worldwide. © 2014 The Authors. International Journal of Stroke published by John Wiley & Sons Ltd on behalf of World Stroke Organization.
Ruiz, Milagros; Goldblatt, Peter; Morrison, Joana; Kukla, Lubomír; Švancara, Jan; Riitta-Järvelin, Marjo; Taanila, Anja; Saurel-Cubizolles, Marie-Josèphe; Lioret, Sandrine; Bakoula, Chryssa; Veltsista, Alexandra; Porta, Daniela; Forastiere, Francesco; van Eijsden, Manon; Vrijkotte, Tanja G M; Eggesbø, Merete; White, Richard A; Barros, Henrique; Correia, Sofia; Vrijheid, Martine; Torrent, Maties; Rebagliato, Marisa; Larrañaga, Isabel; Ludvigsson, Johnny; Olsen Faresjö, Åshild; Hryhorczuk, Daniel; Antipkin, Youriy; Marmot, Michael; Pikhart, Hynek
2015-09-01
A healthy start to life is a major priority in efforts to reduce health inequalities across Europe, with important implications for the health of future generations. There is limited combined evidence on inequalities in health among newborns across a range of European countries. Prospective cohort data of 75 296 newborns from 12 European countries were used. Maternal education, preterm and small for gestational age births were determined at baseline along with covariate data. Regression models were estimated within each cohort and meta-analyses were conducted to compare and measure heterogeneity between cohorts. Mother's education was linked to an appreciable risk of preterm and small for gestational age (SGA) births across 12 European countries. The excess risk of preterm births associated with low maternal education was 1.48 (1.29 to 1.69) and 1.84 (0.99 to 2.69) in relative and absolute terms (Relative/Slope Index of Inequality, RII/SII) for all cohorts combined. Similar effects were found for SGA births, but absolute inequalities were greater, with an SII score of 3.64 (1.74 to 5.54). Inequalities at birth were strong in the Netherlands, the UK, Sweden and Spain and marginal in other countries studied. This study highlights the value of comparative cohort analysis to better understand the relationship between maternal education and markers of fetal growth in different settings across Europe. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
Marmot, Michael G.; Demakakos, Panayotes; Vaz de Melo Mambrini, Juliana; Peixoto, Sérgio Viana; Lima-Costa, Maria Fernanda
2016-01-01
Background: The main aim of this study was to quantify and compare 6-year mortality risk attributable to smoking, hypertension and diabetes among English and Brazilian older adults. This study represents a rare opportunity to approach the subject in two different social and economic contexts. Methods: Data from the English Longitudinal Study of Ageing (ELSA) and the Bambuí Cohort Study of Ageing (Brazil) were used. Deaths in both cohorts were identified through mortality registers. Risk factors considered in this study were baseline smoking, hypertension and diabetes mellitus. Both age–sex adjusted hazard ratios and population attributable risks (PAR) of all-cause mortality and their 95% confidence intervals for the association between risk factors and mortality were estimated using Cox proportional hazards models. Results: Participants were 3205 English and 1382 Brazilians aged 60 years and over. First, Brazilians showed a much higher absolute risk of mortality than the English, and this finding was consistent across all ages, independent of sex. Second, as a rule, hazard ratios for mortality associated with smoking, hypertension and diabetes showed more similarities than differences between these two populations. Third, there was a strong difference between the English and Brazilians in deaths attributable to hypertension. Conclusions: The findings indicate that, despite being in a more recent transition, deaths attributable to one or more risk factors were twofold higher among Brazilians relative to the English. These findings call attention to the challenge imposed on health systems to prevent and treat non-communicable diseases, particularly in populations with low socioeconomic level. PMID:26666869
Dvir, Danny; Kitabata, Hironori; Barbash, Israel M; Minha, Sa'ar; Badr, Salem; Loh, Joshua P; Chen, Fang; Torguson, Rebecca; Waksman, Ron
2014-09-01
To evaluate the axial integrity of different coronary stents using intravascular ultrasound (IVUS). Longitudinal stent deformation was recently reported. Consecutive patients who underwent IVUS analysis after drug-eluting stent (DES) implantation for de novo coronary lesions were evaluated. Stent length was compared with label length for calculation of absolute change and relative difference (absolute change divided by label length). A total of 233 DES utilizing five different platforms were included. The median absolute change in stent length was 0.90 mm (interquartile range [IQR] 0.48-1.39) and the relative difference was 5.24% (IQR 2.55-8.29). There was no significant difference among the groups in median absolute or relative change: Cypher 0.89 mm/3.89%, Taxus 0.88 mm/5.39%, Endeavor 1.16 mm/6.77%, Xience V 0.86 mm/5.80%, and PROMUS Element 0.79 mm/5.34% (P = 0.085, P = 0.072, respectively). Multivariate logistic regression revealed that the Cypher stent was independently correlated with a lower change in length, whereas stent label length and deployment pressure were correlated with higher absolute change. The axial integrity of DES platforms examined in vivo was high, with only mild changes in stent length after implantation. While there are differences between first- and second-generation DES, axial integrity among second-generation DES was similar. © 2013 Wiley Periodicals, Inc.
Crawford, Natalie D; Ford, Chandra; Rudolph, Abby; Kim, BoRin; Lewis, Crystal M
2017-09-01
Experiences of discrimination, or social marginalization and ostracism, may lead to the formation of social networks characterized by inequality. For example, those who experience discrimination may be more likely to develop drug use and sexual partnerships with others who are at increased risk for HIV compared to those without experiences of discrimination. This is critical as engaging in risk behaviors with others who are more likely to be HIV positive can increase one's risk of HIV. We used log-binomial regression models to examine the relationship between drug use, racial and incarceration discrimination with changes in the composition of one's risk network among 502 persons who use drugs. We examined both absolute and proportional changes with respect to sex partners, drug use partners, and injecting partners, after accounting for individual risk behaviors. At baseline, participants were predominately male (70%), black or Latino (91%), un-married (85%), and used crack (64%). Among those followed-up (67%), having experienced discrimination due to drug use was significantly related to increases in the absolute number of sex networks and drug networks over time. No types of discrimination were related to changes in the proportion of high-risk network members. Discrimination may increase one's risk of HIV acquisition by leading them to preferentially form risk relationships with higher-risk individuals, thereby perpetuating racial and ethnic inequities in HIV. Future social network studies and behavioral interventions should consider whether social discrimination plays a role in HIV transmission.
Green, Brady; Bourne, Matthew N; Pizzari, Tania
2018-03-01
To examine the value of isokinetic strength assessment for predicting risk of hamstring strain injury, and to direct future research into hamstring strain injuries. Systematic review. Database searches for Medline, CINAHL, Embase, AMED, AUSPORT, SPORTDiscus, PEDro and Cochrane Library from inception to April 2017. Manual reference checks, ahead-of-press and citation tracking. Prospective studies evaluating isokinetic hamstrings, quadriceps and hip extensor strength testing as a risk factor for occurrence of hamstring muscle strain. Independent search result screening. Risk of bias assessment by independent reviewers using Quality in Prognosis Studies tool. Best evidence synthesis and meta-analyses of standardised mean difference (SMD). Twelve studies were included, capturing 508 hamstring strain injuries in 2912 athletes. Isokinetic knee flexor, knee extensor and hip extensor outputs were examined at angular velocities ranging 30-300°/s, concentric or eccentric, and relative (Nm/kg) or absolute (Nm) measures. Strength ratios ranged between 30°/s and 300°/s. Meta-analyses revealed a small, significant predictive effect for absolute (SMD=-0.16, P=0.04, 95% CI -0.31 to -0.01) and relative (SMD=-0.17, P=0.03, 95% CI -0.33 to -0.014) eccentric knee flexor strength (60°/s). No other testing speed or strength ratio showed statistical association. Best evidence synthesis found over half of all variables had moderate or strong evidence for no association with future hamstring injury. Despite an isolated finding for eccentric knee flexor strength at slow speeds, the role and application of isokinetic assessment for predicting hamstring strain risk should be reconsidered, particularly given costs and specialised training required. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
Tani, Shigemasa; Nagao, Ken; Kawauchi, Kenji; Yagi, Tsukasa; Atsumi, Wataru; Matsuo, Rei; Hirayama, Atsushi
2017-10-01
We investigated the relationship between the eicosapentaenoic acid (EPA)/arachidonic acid (AA) ratio and non-high-density lipoprotein cholesterol (non-HDL-C) level, a major residual risk of coronary artery disease (CAD), in statin-treated CAD patients following EPA therapy. We conducted a 6-month, prospective, randomized clinical trial to investigate the effect of the additional administration of EPA on the EPA/AA ratio and the serum non-HDL-C level in stable CAD patients receiving statin treatment. We assigned CAD patients already receiving statin therapy to an EPA group (1800 mg/day; n = 50) or a control group (n = 50). A significant reduction in the serum non-HDL-C level was observed in the EPA group, compared with the control group (-9.7 vs. -1.2%, p = 0.01). A multiple-regression analysis with adjustments for coronary risk factors revealed that achieved EPA/AA ratio was more reliable as an independent and significant predictor of a reduction in the non-HDL-C level at a 6-month follow-up examination (β = -0.324, p = 0.033) than the absolute change in the EPA/AA ratio. Interestingly, significant negative correlations were found between the baseline levels and the absolute change values of both non-HDL-C and triglyceride-rich lipoproteins, both markers of residual risk of CAD, indicating that patients with a higher baseline residual risk achieved a greater reduction. The present results suggest that the achieved EPA/AA ratio, but not the absolute change in EPA/AA ratio, following EPA therapy might be a useful marker for the risk stratification of CAD among statin-treated patients with a high non-HDL-C level. UMIN ( http://www.umin.ac.jp/ ) Study ID: UMIN000010452.
Olver, Mark E; Mundt, James C; Thornton, David; Beggs Christofferson, Sarah M; Kingston, Drew A; Sowden, Justina N; Nicholaichuk, Terry P; Gordon, Audrey; Wong, Stephen C P
2018-04-30
The present study sought to develop updated risk categories and recidivism estimates for the Violence Risk Scale-Sexual Offense version (VRS-SO; Wong, Olver, Nicholaichuk, & Gordon, 2003-2017), a sexual offender risk assessment and treatment planning tool. The overarching purpose was to increase the clarity and accuracy of communicating risk assessment information that includes a systematic incorporation of new information (i.e., change) to modify risk estimates. Four treated samples of sexual offenders with VRS-SO pretreatment, posttreatment, and Static-99R ratings were combined with a minimum follow-up period of 10-years postrelease (N = 913). Logistic regression was used to model 5- and 10-year sexual and violent (including sexual) recidivism estimates across 6 different regression models employing specific risk and change score information from the VRS-SO and/or Static-99R. A rationale is presented for clinical applications of select models and the necessity of controlling for baseline risk when utilizing change information across repeated assessments. Information concerning relative risk (percentiles) and absolute risk (recidivism estimates) is integrated with common risk assessment language guidelines to generate new risk categories for the VRS-SO. Guidelines for model selection and forensic clinical application of the risk estimates are discussed. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
[Antidepressants-SSRIs in pregnancy and risk of major malformations: treat or not to treat].
Bellantuono, Cesario; Santone, Giovanni
2014-06-01
Considering teratogenic risk, recent data suggest that selective serotonin reuptake inhibitors (SSRIs) can be prescribed during pregnancy, even though some SSRIs should be considered a second choice. In any case, antidepressant treatment during pregnancy must be carefully tailored to the pregnant woman, considering the absolute risk/benefit ratio of SSRIs, the availability of other effective treatments, and the woman's preferences.
Selective Serotonin Reuptake Inhibitors and Violent Crime: A Cohort Study.
Molero, Yasmina; Lichtenstein, Paul; Zetterqvist, Johan; Gumpert, Clara Hellner; Fazel, Seena
2015-09-01
Although selective serotonin reuptake inhibitors (SSRIs) are widely prescribed, associations with violence are uncertain. From Swedish national registers we extracted information on 856,493 individuals who were prescribed SSRIs, and subsequent violent crimes during 2006 through 2009. We used stratified Cox regression analyses to compare the rate of violent crime while individuals were prescribed these medications with the rate in the same individuals while not receiving medication. Adjustments were made for other psychotropic medications. Information on all medications was extracted from the Swedish Prescribed Drug Register, with complete national data on all dispensed medications. Information on violent crime convictions was extracted from the Swedish national crime register. Using within-individual models, there was an overall association between SSRIs and violent crime convictions (hazard ratio [HR] = 1.19, 95% CI 1.08-1.32, p < 0.001, absolute risk = 1.0%). With age stratification, there was a significant association between SSRIs and violent crime convictions for individuals aged 15 to 24 y (HR = 1.43, 95% CI 1.19-1.73, p < 0.001, absolute risk = 3.0%). However, there were no significant associations in those aged 25-34 y (HR = 1.20, 95% CI 0.95-1.52, p = 0.125, absolute risk = 1.6%), in those aged 35-44 y (HR = 1.06, 95% CI 0.83-1.35, p = 0.666, absolute risk = 1.2%), or in those aged 45 y or older (HR = 1.07, 95% CI 0.84-1.35, p = 0.594, absolute risk = 0.3%). Associations in those aged 15 to 24 y were also found for violent crime arrests with preliminary investigations (HR = 1.28, 95% CI 1.16-1.41, p < 0.001), non-violent crime convictions (HR = 1.22, 95% CI 1.10-1.34, p < 0.001), non-violent crime arrests (HR = 1.13, 95% CI 1.07-1.20, p < 0.001), non-fatal injuries from accidents (HR = 1.29, 95% CI 1.22-1.36, p < 0.001), and emergency inpatient or outpatient treatment for alcohol intoxication or misuse (HR = 1.98, 95% CI 1.76-2.21, p < 0.001). With age and sex stratification, there was a significant association between SSRIs and violent crime convictions for males aged 15 to 24 y (HR = 1.40, 95% CI 1.13-1.73, p = 0.002) and females aged 15 to 24 y (HR = 1.75, 95% CI 1.08-2.84, p = 0.023). However, there were no significant associations in those aged 25 y or older. One important limitation is that we were unable to fully account for time-varying factors. The association between SSRIs and violent crime convictions and violent crime arrests varied by age group. The increased risk we found in young people needs validation in other studies.
Selective Serotonin Reuptake Inhibitors and Violent Crime: A Cohort Study
Molero, Yasmina; Lichtenstein, Paul; Zetterqvist, Johan; Gumpert, Clara Hellner; Fazel, Seena
2015-01-01
Background Although selective serotonin reuptake inhibitors (SSRIs) are widely prescribed, associations with violence are uncertain. Methods and Findings From Swedish national registers we extracted information on 856,493 individuals who were prescribed SSRIs, and subsequent violent crimes during 2006 through 2009. We used stratified Cox regression analyses to compare the rate of violent crime while individuals were prescribed these medications with the rate in the same individuals while not receiving medication. Adjustments were made for other psychotropic medications. Information on all medications was extracted from the Swedish Prescribed Drug Register, with complete national data on all dispensed medications. Information on violent crime convictions was extracted from the Swedish national crime register. Using within-individual models, there was an overall association between SSRIs and violent crime convictions (hazard ratio [HR] = 1.19, 95% CI 1.08–1.32, p < 0.001, absolute risk = 1.0%). With age stratification, there was a significant association between SSRIs and violent crime convictions for individuals aged 15 to 24 y (HR = 1.43, 95% CI 1.19–1.73, p < 0.001, absolute risk = 3.0%). However, there were no significant associations in those aged 25–34 y (HR = 1.20, 95% CI 0.95–1.52, p = 0.125, absolute risk = 1.6%), in those aged 35–44 y (HR = 1.06, 95% CI 0.83–1.35, p = 0.666, absolute risk = 1.2%), or in those aged 45 y or older (HR = 1.07, 95% CI 0.84–1.35, p = 0.594, absolute risk = 0.3%). Associations in those aged 15 to 24 y were also found for violent crime arrests with preliminary investigations (HR = 1.28, 95% CI 1.16–1.41, p < 0.001), non-violent crime convictions (HR = 1.22, 95% CI 1.10–1.34, p < 0.001), non-violent crime arrests (HR = 1.13, 95% CI 1.07–1.20, p < 0.001), non-fatal injuries from accidents (HR = 1.29, 95% CI 1.22–1.36, p < 0.001), and emergency inpatient or outpatient treatment for alcohol intoxication or misuse (HR = 1.98, 95% CI 1.76–2.21, p < 0.001). With age and sex stratification, there was a significant association between SSRIs and violent crime convictions for males aged 15 to 24 y (HR = 1.40, 95% CI 1.13–1.73, p = 0.002) and females aged 15 to 24 y (HR = 1.75, 95% CI 1.08–2.84, p = 0.023). However, there were no significant associations in those aged 25 y or older. One important limitation is that we were unable to fully account for time-varying factors. Conclusions The association between SSRIs and violent crime convictions and violent crime arrests varied by age group. The increased risk we found in young people needs validation in other studies. PMID:26372359
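As an illustration of the within-individual design described in the two records above, here is a rough sketch of a stratified Cox model in which each person forms their own stratum; the data layout (one row per on- or off-medication period) and all column names are hypothetical, and this is not the study's code.

```python
# Sketch only: each person_id is a stratum, so on- vs off-medication periods
# are compared within the same individual, as in a stratified Cox analysis.
import pandas as pd
from lifelines import CoxPHFitter

periods = pd.read_csv("treatment_periods.csv")  # hypothetical file
# expected columns: person_id, duration, violent_crime (0/1),
#                   on_ssri (0/1), other_psychotropic (0/1)

cph = CoxPHFitter()
cph.fit(periods,
        duration_col="duration",
        event_col="violent_crime",
        strata=["person_id"],                     # within-individual comparison
        formula="on_ssri + other_psychotropic")   # adjust for other medication
cph.print_summary()  # exp(coef) for on_ssri is the within-individual hazard ratio
```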
An Integrated Model of Choices and Response Times in Absolute Identification
ERIC Educational Resources Information Center
Brown, Scott D.; Marley, A. A. J.; Donkin, Christopher; Heathcote, Andrew
2008-01-01
Recent theoretical developments in the field of absolute identification have stressed differences between relative and absolute processes, that is, whether stimulus magnitudes are judged relative to a shorter term context provided by recently presented stimuli or a longer term context provided by the entire set of stimuli. The authors developed a…
Ban, Lu; West, Joe; Gibson, Jack E.; Fiaschi, Linda; Sokal, Rachel; Doyle, Pat; Hubbard, Richard; Smeeth, Liam; Tata, Laila J.
2014-01-01
Background Despite their widespread use the effects of taking benzodiazepines and non-benzodiazepine hypnotics during pregnancy on the risk of major congenital anomaly (MCA) are uncertain. The objectives were to estimate absolute and relative risks of MCAs in children exposed to specific anxiolytic and hypnotic drugs taken in the first trimester of pregnancy, compared with children of mothers with depression and/or anxiety but not treated with medication and children of mothers without diagnosed mental illness during pregnancy. Methods We identified singleton children born to women aged 15–45 years between 1990 and 2010 from a large United Kingdom primary care database. We calculated absolute risks of MCAs for children with first trimester exposures of different anxiolytic and hypnotic drugs and used logistic regression with a generalised estimating equation to compare risks adjusted for year of childbirth, maternal age, smoking, body mass index, and socioeconomic status. Results Overall MCA prevalence was 2.7% in 1,159 children of mothers prescribed diazepam, 2.9% in 379 children with temazepam, 2.5% in 406 children with zopiclone, and 2.7% in 19,193 children whose mothers had diagnosed depression and/or anxiety but no first trimester drug exposures. When compared with 2.7% in 351,785 children with no diagnosed depression/anxiety nor medication use, the adjusted odds ratios were 1.02 (99% confidence interval 0.63–1.64) for diazepam, 1.07 (0.49–2.37) for temazepam, 0.96 (0.42–2.20) for zopiclone and 1.27 (0.43–3.75) for other anxiolytic/hypnotic drugs and 1.01 (0.90–1.14) for un-medicated depression/anxiety. Risks of system-specific MCAs were generally similar in children exposed and not exposed to such medications. Conclusions We found no evidence for an increase in MCAs in children exposed to benzodiazepines and non-benzodiazepine hypnotics in the first trimester of pregnancy. These findings suggest that prescription of these drugs during early pregnancy may be safe in terms of MCA risk, but findings from other studies are required before safety can be confirmed. PMID:24963627
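A hedged sketch of the analysis family described above (logistic regression fitted with generalised estimating equations to allow for correlation between children of the same mother); the file and variable names are invented for illustration and this is not the study's code.

```python
# Sketch assuming hypothetical columns: mca (0/1 outcome), exposure_group,
# birth_year, maternal_age, smoking, bmi, ses and a mother_id cluster variable.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

kids = pd.read_csv("first_trimester_cohort.csv")  # hypothetical file

model = smf.gee(
    "mca ~ C(exposure_group) + birth_year + maternal_age + smoking + bmi + ses",
    groups="mother_id",                     # children of the same mother are clustered
    data=kids,
    family=sm.families.Binomial(),
    cov_struct=sm.cov_struct.Exchangeable())
result = model.fit()
print(result.summary())  # exponentiate coefficients to read adjusted odds ratios
```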
Fried, Terri R; Tinetti, Mary E; Towle, Virginia; O'Leary, John R; Iannone, Lynne
2011-05-23
Quality-assurance initiatives encourage adherence to evidenced-based guidelines based on a consideration of treatment benefit. We examined older persons' willingness to take medication for primary cardiovascular disease prevention according to benefits and harms. In-person interviews were performed with 356 community-living older persons. Participants were asked about their willingness to take medication for primary prevention of myocardial infarction (MI) with varying benefits in terms of absolute 5-year risk reduction and varying harms in terms of type and severity of adverse effects. Most (88%) would take medication, providing an absolute benefit of 6 fewer persons with MI out of 100, approximating the average risk reduction of currently available medications. Of participants who would not take it, 17% changed their preference if the absolute benefit was increased to 10 fewer persons with MI, and, of participants who would take it, 82% remained willing if the absolute benefit was decreased to 3 fewer persons with MI. In contrast, large proportions (48%-69%) were unwilling or uncertain about taking medication with average benefit causing mild fatigue, nausea, or fuzzy thinking, and only 3% would take medication with adverse effects severe enough to affect functioning. Older persons' willingness to take medication for primary cardiovascular disease prevention is relatively insensitive to its benefit but highly sensitive to its adverse effects. These results suggest that clinical guidelines and decisions about prescribing these medications to older persons need to place emphasis on both benefits and harms.
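The absolute benefit figures quoted above translate directly into a number needed to treat; a small illustrative calculation (not from the study) follows.

```python
# Illustrative arithmetic only: absolute risk reduction (ARR) and the
# corresponding number needed to treat (NNT) for the benefit levels quoted.
def nnt(absolute_risk_reduction):
    return 1.0 / absolute_risk_reduction

for fewer_per_100 in (3, 6, 10):
    arr = fewer_per_100 / 100.0
    print(f"{fewer_per_100} fewer MIs per 100 over 5 years -> ARR = {arr:.2f}, NNT ~ {nnt(arr):.0f}")
# 6 fewer per 100 corresponds to an NNT of about 17 over 5 years.
```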
Radiation induction of drug resistance in RIF-1 tumors and tumor cells
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hopwood, L.E.; Moulder, J.E.
1989-11-01
The RIF-1 tumor cell line contains a small number of cells (1-20 per 10(6) cells) that are resistant to various single antineoplastic drugs, including 5-fluorouracil (5FU), methotrexate (MTX), and adriamycin (ADR). For 5FU the frequency of drug resistance is lower for tumor-derived cells than for cells from cell culture; for MTX the reverse is true, and for ADR there is no difference. In vitro irradiation at 5 Gy significantly increased the frequency of drug-resistant cells for 5FU, MTX, and ADR. In vivo irradiation at 3 Gy significantly increased the frequency of drug-resistant cells for 5FU and MTX, but not for ADR. The absolute risk for in vitro induction of MTX, 5FU, and ADR resistance, and for in vivo induction of 5FU resistance, was 1-3 per 10(6) cells per Gy; but the absolute risk for in vivo induction of MTX resistance was 54 per 10(6) cells per Gy. The frequency of drug-resistant cells among individual untreated tumors was highly variable; among individual irradiated tumors the frequency of drug-resistant cells was significantly less variable. These studies provide supporting data for models of the development of tumor drug resistance, and imply that some of the drug resistance seen when chemotherapy follows radiotherapy may be due to radiation-induced drug resistance.
Zhang, Yanxin; Ma, Ye; Liu, Guangyu
2016-01-01
The objective of the study was to evaluate two types of cricket bowling techniques by comparing the lumbar spinal loading using a musculoskeletal modelling approach. Three-dimensional kinematic data were recorded by a Vicon motion capture system under two cricket bowling conditions: (1) participants bowled at their absolute maximal speeds (max condition), and (2) participants bowled at their absolute maximal speeds while simultaneously forcing their navel down towards their thighs starting just prior to ball release (max-trunk condition). A three-dimensional musculoskeletal model comprising the pelvis, sacrum, lumbar vertebrae and torso segments, which enabled the motion of the individual lumbar vertebrae in the sagittal, frontal and coronal planes to be actuated by 210 muscle-tendon units, was used to simulate spinal loading based on the recorded kinematic data. The maximal lumbar spine compressive force was 4.89 ± 0.88 BW for the max condition and 4.58 ± 0.54 BW for the max-trunk condition. Results showed that there was no significant difference between the two techniques in trunk moments and lumbar spine forces. This indicates that the max-trunk technique may not increase lower back injury risks. The method proposed in this study could serve as a tool to evaluate lower back injury risks for cricket bowling as well as other throwing activities.
Coupland, Carol
2015-01-01
Study question Is it possible to develop and externally validate risk prediction equations to estimate the 10 year risk of blindness and lower limb amputation in patients with diabetes aged 25-84 years? Methods This was a prospective cohort study using routinely collected data from general practices in England contributing to the QResearch and Clinical Practice Research Datalink (CPRD) databases during the study period 1998-2014. The equations were developed using 763 QResearch practices (n=454 575 patients with diabetes) and validated in 254 different QResearch practices (n=142 419) and 357 CPRD practices (n=206 050). Cox proportional hazards models were used to derive separate risk equations for blindness and amputation in men and women that could be evaluated at 10 years. Measures of calibration and discrimination were calculated in the two validation cohorts. Study answer and limitations Risk prediction equations to quantify absolute risk of blindness and amputation in men and women with diabetes have been developed and externally validated. In the QResearch derivation cohort, 4822 new cases of lower limb amputation and 8063 new cases of blindness occurred during follow-up. The risk equations were well calibrated in both validation cohorts. Discrimination was good in men in the external CPRD cohort for amputation (D statistic 1.69, Harrell’s C statistic 0.77) and blindness (D statistic 1.40, Harrell’s C statistic 0.73), with similar results in women and in the QResearch validation cohort. The algorithms are based on variables that patients are likely to know or that are routinely recorded in general practice computer systems. They can be used to identify patients at high risk for prevention or further assessment. Limitations include lack of formally adjudicated outcomes, information bias, and missing data. What this study adds Patients with type 1 or type 2 diabetes are at increased risk of blindness and amputation but generally do not have accurate assessments of the magnitude of their individual risks. The new algorithms calculate the absolute risk of developing these complications over a 10 year period in patients with diabetes, taking account of their individual risk factors. Funding, competing interests, data sharing JH-C is co-director of QResearch, a not for profit organisation which is a joint partnership between the University of Nottingham and Egton Medical Information Systems, and is also a paid director of ClinRisk Ltd. CC is a paid consultant statistician for ClinRisk Ltd. PMID:26560308
Hippisley-Cox, Julia; Coupland, Carol
2015-11-11
Is it possible to develop and externally validate risk prediction equations to estimate the 10 year risk of blindness and lower limb amputation in patients with diabetes aged 25-84 years? This was a prospective cohort study using routinely collected data from general practices in England contributing to the QResearch and Clinical Practice Research Datalink (CPRD) databases during the study period 1998-2014. The equations were developed using 763 QResearch practices (n=454,575 patients with diabetes) and validated in 254 different QResearch practices (n=142,419) and 357 CPRD practices (n=206,050). Cox proportional hazards models were used to derive separate risk equations for blindness and amputation in men and women that could be evaluated at 10 years. Measures of calibration and discrimination were calculated in the two validation cohorts. Risk prediction equations to quantify absolute risk of blindness and amputation in men and women with diabetes have been developed and externally validated. In the QResearch derivation cohort, 4822 new cases of lower limb amputation and 8063 new cases of blindness occurred during follow-up. The risk equations were well calibrated in both validation cohorts. Discrimination was good in men in the external CPRD cohort for amputation (D statistic 1.69, Harrell's C statistic 0.77) and blindness (D statistic 1.40, Harrell's C statistic 0.73), with similar results in women and in the QResearch validation cohort. The algorithms are based on variables that patients are likely to know or that are routinely recorded in general practice computer systems. They can be used to identify patients at high risk for prevention or further assessment. Limitations include lack of formally adjudicated outcomes, information bias, and missing data. Patients with type 1 or type 2 diabetes are at increased risk of blindness and amputation but generally do not have accurate assessments of the magnitude of their individual risks. The new algorithms calculate the absolute risk of developing these complications over a 10 year period in patients with diabetes, taking account of their individual risk factors. JH-C is co-director of QResearch, a not for profit organisation which is a joint partnership between the University of Nottingham and Egton Medical Information Systems, and is also a paid director of ClinRisk Ltd. CC is a paid consultant statistician for ClinRisk Ltd. © Hippisley-Cox et al 2015.
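For readers unfamiliar with the validation metrics quoted above, the sketch below shows how Harrell's C could be computed for an external cohort with a pre-computed 10-year risk prediction; the risk equations themselves are not reproduced and the column names are hypothetical.

```python
# Sketch: discrimination (Harrell's C) of a pre-computed 10-year risk estimate
# in a hypothetical validation data set; not the QResearch/CPRD code.
import pandas as pd
from lifelines.utils import concordance_index

val = pd.read_csv("validation_cohort.csv")  # hypothetical file
# expected columns: time_to_amputation, amputation (0/1), predicted_10yr_risk

c = concordance_index(val["time_to_amputation"],
                      -val["predicted_10yr_risk"],  # higher risk should mean shorter time
                      val["amputation"])
print(f"Harrell's C = {c:.2f}")
```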
Day-to-day variation of urinary NGAL and rationale for creatinine correction.
Helmersson-Karlqvist, Johanna; Arnlöv, Johan; Larsson, Anders
2013-01-01
The number of clinical studies evaluating the new tubular biomarker urinary neutrophil gelatinase-associated lipocalin (U-NGAL) is increasing. There is no consensus on whether absolute U-NGAL concentrations or urinary NGAL/creatinine (U-NGAL/Cr) ratios should be used when chronic tubular dysfunction is studied. The aim was to study the biological variation of U-NGAL in healthy subjects and the rationale for urinary creatinine (U-Cr) correction in two different study samples. To study the biological variation of U-NGAL and the U-NGAL/Cr ratio and the association between U-NGAL and U-Cr in healthy subjects, 13 young males and females (median age 29 years) collected morning urine on 10 consecutive days. Additionally, a random subsample of 400 males from a population-based cohort (aged 78 years) who collected 24-hour urine during 1 day was studied. The calculated biological variation for absolute U-NGAL was 27% and for the U-NGAL/Cr ratio, 101%. Absolute U-NGAL increased linearly with U-Cr concentration (the theoretical basis for creatinine adjustment) in the older males (R=0.19, P<0.001) and with borderline significance in the young adults (R=0.16, P=0.08). The U-NGAL/Cr ratio was, however, negatively associated with creatinine in the older males (R=-0.14, P<0.01) and in the young adults (R=-0.16, P=0.07), indicating a slight "overadjustment." The study provides some support for the use of the U-NGAL/Cr ratio, but the rather large biological variation and the risk of possible overadjustment need to be considered. Both absolute U-NGAL and U-NGAL/Cr ratios should be reported for the estimation of chronic tubular dysfunction. Copyright © 2012 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
Wetherbee, G.A.; Latysh, N.E.; Gordon, J.D.
2005-01-01
Data from the U.S. Geological Survey (USGS) collocated-sampler program for the National Atmospheric Deposition Program/National Trends Network (NADP/NTN) are used to estimate the overall error of NADP/NTN measurements. Absolute errors are estimated by comparison of paired measurements from collocated instruments. Spatial and temporal differences in absolute error were identified and are consistent with longitudinal distributions of NADP/NTN measurements and spatial differences in precipitation characteristics. The magnitude of error for calcium, magnesium, ammonium, nitrate, and sulfate concentrations, specific conductance, and sample volume is of minor environmental significance to data users. Data collected after a 1994 sample-handling protocol change are prone to less absolute error than data collected prior to 1994. Absolute errors are smaller during non-winter months than during winter months for selected constituents at sites where frozen precipitation is common. Minimum resolvable differences are estimated for different regions of the USA to aid spatial and temporal watershed analyses.
Ruthig, Joelle C; Gamblin, Bradlee W; Jones, Kelly; Vanderzanden, Karen; Kehn, Andre
2017-02-01
Researchers have spent considerable effort examining unrealistic absolute optimism and unrealistic comparative optimism, yet there is a lack of research exploring them concurrently. This longitudinal study repeatedly assessed unrealistic absolute and comparative optimism within a performance context over several months to identify the degree to which they shift as a function of proximity to performance and performance feedback, their associations with global individual difference and event-specific factors, and their link to subsequent behavioural outcomes. Results showed similar shifts in unrealistic absolute and comparative optimism based on proximity to performance and performance feedback. Moreover, increases in both types of unrealistic optimism were associated with better subsequent performance beyond the effect of prior performance. However, several differences were found between the two forms of unrealistic optimism in their associations with global individual difference factors and event-specific factors, highlighting the distinctiveness of the two constructs. © 2016 The British Psychological Society.
Gobardhan, Sanjay N; Dimitriu-Leen, Aukelien C; van Rosendael, Alexander R; van Zwet, Erik W; Roos, Cornelis J; Oemrawsingh, Pranobe V; Kharagjitsingh, Aan V; Jukema, J Wouter; Delgado, Victoria; Schalij, Martin J; Bax, Jeroen J; Scholte, Arthur J H A
2017-03-01
The aim of this study was to explore the association between various cardiovascular (CV) risk scores and coronary atherosclerotic burden on coronary computed tomography angiography (CTA) in South Asians with type 2 diabetes mellitus and matched whites. Asymptomatic type 2 diabetic South Asians and whites were matched for age, gender, body mass index, hypertension, and hypercholesterolemia. Ten-year CV risk was estimated using different risk scores (United Kingdom Prospective Diabetes Study [UKPDS], Framingham Risk Score [FRS], AtheroSclerotic CardioVascular Disease [ASCVD], and Joint British Societies for the prevention of CVD [JBS3]) and categorized into low- and high-risk groups. The presence of coronary artery calcium (CAC) and obstructive coronary artery disease (CAD; ≥50% stenosis) was assessed using coronary CTA. Finally, the relation between coronary atherosclerosis on CTA and the low- and high-risk groups was compared. UKPDS, FRS, and ASCVD showed no differences in estimated CV risk between 159 South Asians and 159 matched whites. JBS3 showed a significantly greater absolute CV risk in South Asians (18.4% vs 14.2%, p <0.01). A higher prevalence of a CAC score >0 (69% vs 55%, p <0.05) and of obstructive CAD (39% vs 27%, p <0.05) was observed in South Asians. South Asians categorized as high risk using UKPDS, FRS, and ASCVD showed more CAC and CAD than whites; JBS3 showed no differences. In conclusion, asymptomatic South Asians with type 2 diabetes mellitus more frequently showed CAC and obstructive CAD than matched whites in the population categorized as high-risk patients using UKPDS, FRS, and ASCVD as risk estimators. However, JBS3 seems to correlate best with CAC and CAD in both ethnic groups compared with the other risk scores. Copyright © 2016 Elsevier Inc. All rights reserved.
Prediction of breast cancer risk by genetic risk factors, overall and by hormone receptor status.
Hüsing, Anika; Canzian, Federico; Beckmann, Lars; Garcia-Closas, Montserrat; Diver, W Ryan; Thun, Michael J; Berg, Christine D; Hoover, Robert N; Ziegler, Regina G; Figueroa, Jonine D; Isaacs, Claudine; Olsen, Anja; Viallon, Vivian; Boeing, Heiner; Masala, Giovanna; Trichopoulos, Dimitrios; Peeters, Petra H M; Lund, Eiliv; Ardanaz, Eva; Khaw, Kay-Tee; Lenner, Per; Kolonel, Laurence N; Stram, Daniel O; Le Marchand, Loïc; McCarty, Catherine A; Buring, Julie E; Lee, I-Min; Zhang, Shumin; Lindström, Sara; Hankinson, Susan E; Riboli, Elio; Hunter, David J; Henderson, Brian E; Chanock, Stephen J; Haiman, Christopher A; Kraft, Peter; Kaaks, Rudolf
2012-09-01
There is increasing interest in adding common genetic variants identified through genome wide association studies (GWAS) to breast cancer risk prediction models. First results from such models showed modest benefits in terms of risk discrimination. Heterogeneity of breast cancer as defined by hormone-receptor status has not been considered in this context. In this study we investigated the predictive capacity of 32 GWAS-detected common variants for breast cancer risk, alone and in combination with classical risk factors, and for tumours with different hormone receptor status. Within the Breast and Prostate Cancer Cohort Consortium, we analysed 6009 invasive breast cancer cases and 7827 matched controls of European ancestry, with data on classical breast cancer risk factors and 32 common gene variants identified through GWAS. Discriminatory ability with respect to breast cancer of specific hormone receptor-status was assessed with the age adjusted and cohort-adjusted concordance statistic (AUROC(a)). Absolute risk scores were calculated with external reference data. Integrated discrimination improvement was used to measure improvements in risk prediction. We found a small but steady increase in discriminatory ability with increasing numbers of genetic variants included in the model (difference in AUROC(a) going from 2.7% to 4%). Discriminatory ability for all models varied strongly by hormone receptor status. Adding information on common polymorphisms provides small but statistically significant improvements in the quality of breast cancer risk prediction models. We consistently observed better performance for receptor-positive cases, but the gain in discriminatory quality is not sufficient for clinical application.
Leone, Aurelio
2003-01-01
Among the major Coronary Risk Factors (CRF), cigarette smoking has shown undoubtedly harmful effects on the heart and blood vessels, either as active smoking (smoking a cigarette) or passive smoking (exposure to environmental tobacco smoke - ETS). The strong relationship between cigarette smoking and cardiovascular disease has been seen independent of the other CRF in a number of well-designed epidemiologic studies. However, a strong increase in the excess of cardiovascular risk has been defined along with the interaction of cigarette smoking and other major CRF. Thousands of pharmacologically active substances are present in tobacco smoke, and a large number of direct and indirect effects have been demonstrated. Different responses are also related to the type of exposure: active or passive. The cardiovascular risk increases with increasing levels of blood pressure and/or serum cholesterol and diabetes mellitus, and at each level of these three risk factors, distributed with different rates according to age and gender, the risk in active or passive smokers is greater than the risk in nonsmokers. Further analytical and methodological observations are needed for a better understanding of the chemical and biological synergism. Nevertheless, the evidence is clear that cigarette smoking greatly increases the risk of cardiovascular disease in individuals already at increased risk because of other CRF. Preventive measures must absolutely be taken to prevent the interaction of CRF: changes in lifestyle (i.e., giving up smoking and engaging in physical activity), drug administration, and diet supplementation, especially with substances that have antioxidant effects.
Mortensen, Lotte Maxild; Lundbye-Christensen, Søren; Schmidt, Erik Berg; Calder, Philip C; Schierup, Mikkel Heide; Tjønneland, Anne; Parner, Erik T; Overvad, Kim
2017-01-01
Studies of the relation between polyunsaturated fatty acids and risk of atrial fibrillation have been inconclusive. The risk of atrial fibrillation may depend on the interaction between n-3 and n-6 polyunsaturated fatty acids as both types of fatty acids are involved in the regulation of systemic inflammation. We investigated the association between dietary intake of long chain polyunsaturated fatty acids (individually and in combination) and the risk of atrial fibrillation with focus on potential interaction between the two types of polyunsaturated fatty acids. The risk of atrial fibrillation in the Diet, Cancer and Health Cohort was analyzed using the pseudo-observation method to explore cumulative risks on an additive scale providing risk differences. Dietary intake of long chain polyunsaturated fatty acids was assessed by food frequency questionnaires. The main analyses were adjusted for the dietary intake of n-3 α-linolenic acid and n-6 linoleic acid to account for endogenous synthesis of long chain polyunsaturated fatty acids. Interaction was assessed as deviation from additivity of absolute association measures (risk differences). Cumulative risks in 15-year age periods were estimated in three strata of the cohort (N = 54,737). No associations between intake of n-3 or n-6 long chain polyunsaturated fatty acids and atrial fibrillation were found, neither when analyzed separately as primary exposures nor when interaction between n-3 and n-6 long chain polyunsaturated fatty acids was explored. This study suggests no association between intake of long chain polyunsaturated fatty acids and risk of atrial fibrillation.
McConeghy, Kevin W; Wing, Coady
2016-06-24
A series of state-level statute changes have allowed pharmacists to provide influenza vaccinations in community pharmacies. The study aim was to estimate the effects of pharmacy-based immunization statutes changes on per capita influenza vaccine prescriptions, adult vaccination rates, and the utilization of other preventive health services. A quasi-experimental study that compares vaccination outcomes over time before and after states allowed pharmacy-based immunization. Measures of per capita pharmacy prescriptions for influenza vaccines in each state came from a proprietary pharmacy prescription database. Data on adult vaccination rates and preventive health utilization were studied using multiple waves of the Behavioral Risk Factor Surveillance System (BRFSS). The primary outcomes were changes in per capita influenza vaccine pharmacy prescriptions, adult vaccination rates, and preventive health interventions following changes. Between 2007 and 2013, the number of influenza vaccinations dispensed in community pharmacies increased from 3.2 to 20.9 million. After one year, adopting pharmacist immunization statutes increased per capita influenza vaccine prescriptions by an absolute difference (AD) of 2.6% (95% CI: 1.1-4.2). Adopting statutes did not lead to a significant absolute increase in adult vaccination rates (AD 0.9%, 95% CI: -0.3, 2.2). There also was no observed difference in adult vaccination rates among adults at high-risk of influenza complications (AD 0.8%, 95% CI: -0.2, 1.8) or among standard demographic subgroups. There also was no observed difference in the receipt of preventive health services, including routine physician office visits (AD -1.9%, 95% CI: -4.9, 1.1). Pharmacists are providing millions of influenza vaccines as a consequence of immunization statutes, but we do not observe significant differences in adult influenza vaccination rates. The main gains from pharmacy-based immunization may be in providing a more convenient way to obtain an important health service. Published by Elsevier Ltd.
Relative and Absolute Error Control in a Finite-Difference Method Solution of Poisson's Equation
ERIC Educational Resources Information Center
Prentice, J. S. C.
2012-01-01
An algorithm for error control (absolute and relative) in the five-point finite-difference method applied to Poisson's equation is described. The algorithm is based on discretization of the domain of the problem by means of three rectilinear grids, each of different resolution. We discuss some hardware limitations associated with the algorithm,…
Methodologic ramifications of paying attention to sex and gender differences in clinical research.
Prins, Martin H; Smits, Kim M; Smits, Luc J
2007-01-01
Methodologic standards for studies on sex and gender differences should be developed to improve reporting of studies and facilitate their inclusion in systematic reviews. The essence of these studies lies within the concept of effect modification. This article reviews important methodologic issues in the design and reporting of pharmacogenetic studies. Differences in effect based on sex or gender should preferably be expressed in absolute terms (risk differences) to facilitate clinical decisions on treatment. Information on the distribution of potential effect modifiers or prognostic factors should be available to prevent a biased comparison of differences in effect between genotypes. Other considerations included the possibility of selective nonavailability of biomaterial and the choice of a statistical model to study effect modification. To ensure high study quality, additional methodologic issues should be taken into account when designing and reporting studies on sex and gender differences.
Tatsi, Christina; Boden, Rebecca; Sinaii, Ninet; Keil, Meg; Lyssikatos, Charalampos; Belyavskaya, Elena; Rosenzweig, Sergio D; Stratakis, Constantine A; Lodish, Maya B
2018-02-01
Background Hypercortisolemia results in changes of the immune system and elevated infection risk, but data on WBC changes in pediatric Cushing syndrome (CS) are not known. We describe the changes of the WBC lineages in pediatric endogenous hypercortisolemia, their associations with markers of disease severity, and the presence of infections. Methods We identified 197 children with endogenous CS. Clinical and biochemical data were recorded. Sixty-six normocortisolemic children of similar age and gender served as controls. Results The absolute lymphocyte count of CS patients was significantly lower than that of controls, while the total WBC and absolute neutrophil counts were significantly higher. These changes correlated with several markers of CS severity and improved after resolution of hypercortisolemia. Infections were identified in 35 patients (17.8%), and their presence correlated with elevated serum morning cortisol, midnight cortisol, and urinary free cortisol levels, as well as with the decrease in absolute lymphocyte count. Conclusions Children with endogenous CS have abnormal WBC counts, which correlate with the severity of CS and normalize after cure. Infections are common in this population; clinicians should be aware of this complication of CS and have a low threshold for diagnosing and treating infections in CS.
The 10-year Absolute Risk of Cardiovascular (CV) Events in Northern Iran: a Population Based Study
Motamed, Nima; Mardanshahi, Alireza; Saravi, Benyamin Mohseni; Siamian, Hasan; Maadi, Mansooreh; Zamani, Farhad
2015-01-01
Background: The present study was conducted to estimate 10-year cardiovascular disease (CVD) event risk using three instruments in northern Iran. Material and methods: Baseline data of 3201 participants aged 40-79 from a population-based cohort conducted in northern Iran were analyzed. The Framingham risk score (FRS), World Health Organization (WHO) risk prediction charts and the American College of Cardiology/American Heart Association (ACC/AHA) tool were applied to assess 10-year CVD event risk. Agreement between the risk assessment instruments was determined using the kappa statistic. Results: Our study estimated that 53.5% of the male population aged 40-79 had a 10-year risk of CVD events ≥10% based on the ACC/AHA approach, 48.9% based on the FRS and 11.8% based on the WHO risk charts. A 10-year risk ≥10% was estimated among 20.1% of women using the ACC/AHA approach, 11.9% using the FRS and 5.7% using the WHO tool. The ACC/AHA and Framingham tools had the closest agreement in the estimation of a 10-year risk ≥10% in men (κ=0.7757), whereas the ACC/AHA and WHO approaches displayed the highest agreement (κ=0.6123) in women. Conclusion: Different estimates of 10-year CVD event risk were provided by the ACC/AHA, FRS and WHO approaches. PMID:26236160
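The agreement statistics reported above can be reproduced in principle with Cohen's kappa on the binary high-risk classifications; the toy example below uses made-up data, not the study's.

```python
# Toy illustration of kappa agreement between two instruments' classification
# of 10-year CVD risk as >=10% (1) or <10% (0); values are invented.
from sklearn.metrics import cohen_kappa_score

acc_aha_high = [1, 1, 0, 1, 0, 0, 1, 1, 0, 1]
frs_high     = [1, 0, 0, 1, 0, 0, 1, 1, 0, 1]

print(cohen_kappa_score(acc_aha_high, frs_high))  # analogous to the κ values above
```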
Method For Detecting The Presence Of A Ferromagnetic Object
Roybal, Lyle G.
2000-11-21
A method for detecting a presence or an absence of a ferromagnetic object within a sensing area may comprise the steps of sensing, during a sample time, a magnetic field adjacent the sensing area; producing surveillance data representative of the sensed magnetic field; determining an absolute value difference between a maximum datum and a minimum datum comprising the surveillance data; and determining whether the absolute value difference has a positive or negative sign. The absolute value difference and the corresponding positive or negative sign thereof forms a representative surveillance datum that is indicative of the presence or absence in the sensing area of the ferromagnetic material.
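A toy version of the detection logic in the patent abstract above, for illustration only; the sign convention chosen here (taken from whether the maximum or the minimum occurs later in the sample) is an assumption, since the abstract does not specify it.

```python
# Toy sketch: the surveillance datum is the max-min spread of the sampled
# magnetic field together with a sign (sign convention assumed for illustration).
def surveillance_datum(samples):
    spread = max(samples) - min(samples)  # absolute value difference
    sign = 1 if samples.index(max(samples)) > samples.index(min(samples)) else -1
    return sign * spread

print(surveillance_datum([0.1, 0.4, 2.3, 0.2]))  # +2.2: field rose, then fell
print(surveillance_datum([2.3, 0.4, 0.1, 0.2]))  # -2.2: field fell, then rose
```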
Yamamoto, Tomohiko; Miura, Shin-Ichiro; Suematsu, Yasunori; Kuwano, Takashi; Sugihara, Makoto; Ike, Amane; Iwata, Atsushi; Nishikawa, Hiroaki; Saku, Keijiro
2016-06-01
The relationships between an inter-arm difference in systolic blood pressure (SBP) or diastolic BP (DBP) measured synchronously and the presence of coronary artery disease (CAD), and between an inter-arm BP difference and the severity of coronary atherosclerosis, are not known. We enrolled 425 consecutive patients (M/F = 286/139, 67 ± 13 years) who were admitted to our University Hospital and in whom we could measure the absolute (|rt. BP - lt. BP|) and relative (rt. BP - lt. BP) differences in SBP and DBP using a nico PS-501(®) (Parama-Tech). We divided all patients into those who did and did not have CAD. The relative differences in SBP between arms in patients with CAD were significantly lower than those in patients without CAD. However, the relative difference in SBP between arms was not a predictor of the presence of CAD. We also divided 267 patients who underwent coronary angiography into tertiles according to the Gensini score (low, middle, and high score groups). Interestingly, the middle + high score groups showed significantly lower relative differences in SBP between arms than the low score group. The mean Korotkoff sound graph in the middle + high Gensini score group was significantly higher than that in the low Gensini score group. Among conventional cardiovascular risk factors and nico parameters, the relative difference in SBP between arms in addition to the risk factors (age, gender, body mass index, hypertension, dyslipidemia, and diabetes mellitus) was associated with the score by a logistic regression analysis. In conclusion, the relative difference in SBP between arms as well as conventional risk factors may be associated with the severity of coronary arteriosclerosis.
Her, Ae-Young; Cho, Kyoung-Im; Garg, Scot; Kim, Yong Hoon
2017-01-01
Purpose There are no sufficient data on the correlation between inter-arm blood pressure (BP) difference and coronary atherosclerosis found using coronary artery calcium score (CACS). We aimed to investigate if the increased difference in inter-arm BP is independently associated with severity of CACS. Materials and Methods Patients who had ≥3 cardiovascular risk factors or an intermediate Framingham Risk Score (FRS; ≥10) were enrolled. Inter-arm BP difference was defined as the absolute difference in BP in both arms. Quantitative CACS was measured by using coronary computed tomography angiography with the scoring system. Results A total of 261 patients were included in this study. Age (r=0.256, p<0.001), serum creatinine (r=0.139, p=0.030), mean of right arm systolic BP (SBP; r=0.172, p=0.005), mean of left arm SBP (r=0.190, p=0.002), inter-arm SBP difference (r=0.152, p=0.014), and the FRS (r=0.278, p<0.001) showed significant correlation with CACS. The increased inter-arm SBP difference (≥6 mm Hg) was significantly associated with CACS ≥300 [odds ratio (OR) 2.17, 95% confidence interval (CI) 1.12–4.22; p=0.022]. In multivariable analysis, the inter-arm SBP difference ≥6 mm Hg was also significantly associated with CACS ≥300 after adjusting for clinical risk factors (OR 2.34, 95 % CI 1.06–5.19; p=0.036). Conclusion An increased inter-arm SBP difference (≥6 mm Hg) is associated with coronary atherosclerotic disease burden using CACS, and provides additional information for predicting severe coronary calcification, compared to models based on traditional risk factors. PMID:28792138
Her, Ae Young; Cho, Kyoung Im; Garg, Scot; Kim, Yong Hoon; Shin, Eun Seok
2017-09-01
There are no sufficient data on the correlation between inter-arm blood pressure (BP) difference and coronary atherosclerosis found using coronary artery calcium score (CACS). We aimed to investigate if the increased difference in inter-arm BP is independently associated with severity of CACS. Patients who had ≥3 cardiovascular risk factors or an intermediate Framingham Risk Score (FRS; ≥10) were enrolled. Inter-arm BP difference was defined as the absolute difference in BP in both arms. Quantitative CACS was measured by using coronary computed tomography angiography with the scoring system. A total of 261 patients were included in this study. Age (r=0.256, p<0.001), serum creatinine (r=0.139, p=0.030), mean of right arm systolic BP (SBP; r=0.172, p=0.005), mean of left arm SBP (r=0.190, p=0.002), inter-arm SBP difference (r=0.152, p=0.014), and the FRS (r=0.278, p<0.001) showed significant correlation with CACS. The increased inter-arm SBP difference (≥6 mm Hg) was significantly associated with CACS ≥300 [odds ratio (OR) 2.17, 95% confidence interval (CI) 1.12-4.22; p=0.022]. In multivariable analysis, the inter-arm SBP difference ≥6 mm Hg was also significantly associated with CACS ≥300 after adjusting for clinical risk factors (OR 2.34, 95 % CI 1.06-5.19; p=0.036). An increased inter-arm SBP difference (≥6 mm Hg) is associated with coronary atherosclerotic disease burden using CACS, and provides additional information for predicting severe coronary calcification, compared to models based on traditional risk factors. © Copyright: Yonsei University College of Medicine 2017
Grant, Richard W; Meigs, James B; Florez, Jose C; Park, Elyse R; Green, Robert C; Waxler, Jessica L; Delahanty, Linda M; O'Brien, Kelsey E
2011-10-01
The efficacy of diabetes genetic risk testing to motivate behavior change for diabetes prevention is currently unknown. This paper presents key issues in the design and implementation of one of the first randomized trials (The Genetic Counseling/Lifestyle Change (GC/LC) Study for Diabetes Prevention) to test whether knowledge of diabetes genetic risk can motivate patients to adopt healthier behaviors. Because individuals may react differently to receiving 'higher' vs 'lower' genetic risk results, we designed a 3-arm parallel group study to separately test the hypotheses that: (1) patients receiving 'higher' diabetes genetic risk results will increase healthy behaviors compared to untested controls, and (2) patients receiving 'lower' diabetes genetic risk results will decrease healthy behaviors compared to untested controls. In this paper we describe several challenges to implementing this study, including: (1) the application of a novel diabetes risk score derived from genetic epidemiology studies to a clinical population, (2) the use of the principle of Mendelian randomization to efficiently exclude 'average' diabetes genetic risk patients from the intervention, and (3) the development of a diabetes genetic risk counseling intervention that maintained the ethical need to motivate behavior change in both 'higher' and 'lower' diabetes genetic risk result recipients. Diabetes genetic risk scores were developed by aggregating the results of 36 diabetes-associated single nucleotide polymorphisms. Relative risk for type 2 diabetes was calculated using Framingham Offspring Study outcomes, grouped by quartiles into 'higher', 'average' (middle two quartiles) and 'lower' genetic risk. From these relative risks, revised absolute risks were estimated using the overall absolute risk for the study group. For study efficiency, we excluded all patients receiving 'average' diabetes risk results from the subsequent intervention. This post-randomization allocation strategy was justified because genotype represents a random allocation of parental alleles ('Mendelian randomization'). Finally, because it would be unethical to discourage participants from participating in diabetes prevention behaviors, we designed our two diabetes genetic risk counseling interventions (for 'higher' and 'lower' result recipients) so that both groups would be motivated despite receiving opposing results. For this initial assessment of the clinical implementation of genetic risk testing we assessed intermediate outcomes of attendance at a 12-week diabetes prevention course and changes in self-reported motivation. If effective, longer term studies with larger sample sizes will be needed to assess whether knowledge of diabetes genetic risk can help patients prevent diabetes. We designed a randomized clinical trial to explore the motivational impact of disclosing both higher than average and lower than average genetic risk for type 2 diabetes. This design allowed exploration of both increased risk and false reassurance, and has implications for future studies in translational genomics.
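A hypothetical sketch of the scoring logic described above: summing risk alleles across 36 SNPs, cutting the total into quartiles, and converting a group relative risk into a revised absolute risk; every number and column name here is illustrative, not taken from the study.

```python
# Illustrative only: snp1..snp36 hold 0/1/2 risk-allele counts; the relative
# risks and the overall absolute risk below are made-up placeholder values.
import numpy as np
import pandas as pd

geno = pd.read_csv("diabetes_snps.csv")  # hypothetical file
score = geno[[f"snp{i}" for i in range(1, 37)]].sum(axis=1)

quartile = pd.qcut(score, 4, labels=False)  # quartile codes 0..3
group = np.where(quartile == 0, "lower",
                 np.where(quartile == 3, "higher", "average"))

overall_absolute_risk = 0.20                                   # placeholder
relative_risk = {"lower": 0.6, "average": 1.0, "higher": 1.5}  # placeholder
revised_absolute_risk = pd.Series(group).map(relative_risk) * overall_absolute_risk
print(revised_absolute_risk.head())
```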
Visualization of risk of radiogenic second cancer in the organs and tissues of the human body.
Zhang, Rui; Mirkovic, Dragan; Newhauser, Wayne D
2015-04-28
Radiogenic second cancer is a common late effect in long term cancer survivors. Currently there are few methods or tools available to visually evaluate the spatial distribution of risks of radiogenic late effects in the human body. We developed a risk visualization method and demonstrated it for radiogenic second cancers in tissues and organs of one patient treated with photon volumetric modulated arc therapy and one patient treated with proton craniospinal irradiation. Treatment plans were generated using radiotherapy treatment planning systems (TPS) and dose information was obtained from the TPS. Linear non-threshold risk coefficients for organs at risk of second cancer incidence were taken from the Biological Effects of Ionizing Radiation VII report. Alternative risk models, including the linear-exponential model and the linear-plateau model, were also examined. The predicted absolute lifetime risk distributions were visualized together with images of the patient anatomy. The risk distributions of second cancer for the two patients were visually presented. The risk distributions varied with tissue, dose, and the dose-risk model used, and the risk distribution could be similar to or very different from the dose distribution. Our method provides a convenient way to directly visualize and evaluate the risks of radiogenic second cancer in organs and tissues of the human body. In the future, visual assessment of risk distribution could be an influential determinant for treatment plan scoring.
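The three dose-response shapes named above are often written in the organ-equivalent-dose (OED) forms of Schneider and colleagues; the sketch below uses those commonly cited forms, but whether they match the study's exact implementation is an assumption, and the parameter values are illustrative.

```python
# Sketch of linear, linear-exponential and plateau OED models applied to a
# per-voxel organ dose array (Gy); formulas follow commonly cited OED forms
# and are an assumption here, with illustrative alpha/delta parameters.
import numpy as np

def oed_linear(doses):
    return doses.mean()

def oed_linear_exponential(doses, alpha=0.044):        # 1/Gy, illustrative
    return np.mean(doses * np.exp(-alpha * doses))

def oed_plateau(doses, delta=0.139):                   # 1/Gy, illustrative
    return np.mean((1.0 - np.exp(-delta * doses)) / delta)

doses = np.random.default_rng(0).uniform(0, 30, size=10_000)  # fake voxel doses
for f in (oed_linear, oed_linear_exponential, oed_plateau):
    print(f.__name__, round(float(f(doses)), 2))
# An absolute second-cancer risk estimate would then scale the OED by an
# organ-specific risk coefficient (e.g. from BEIR VII), per the text above.
```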
Deguen, Séverine; Lalloue, Benoît; Bard, Denis; Havard, Sabrina; Arveiler, Dominique; Zmirou-Navier, Denis
2010-07-01
Socioeconomic inequalities in the risk of coronary heart disease (CHD) are well documented for men and women. CHD incidence is greater for men but its association with socioeconomic status is usually found to be stronger among women. We explored the sex-specific association between neighborhood deprivation level and the risk of myocardial infarction (MI) at a small-area scale. We studied 1193 myocardial infarction events in people aged 35-74 years in the Strasbourg metropolitan area, France (2000-2003). We used a deprivation index to assess the neighborhood deprivation level. To take into account spatial dependence and the variability of MI rates due to the small number of events, we used a hierarchical Bayesian modeling approach. We fitted hierarchical Bayesian models to estimate sex-specific relative and absolute MI risks across deprivation categories. We tested departure from additive joint effects of deprivation and sex. The risk of MI increased with the deprivation level for both sexes, but was higher for men for all deprivation classes. Relative rates increased along the deprivation scale more steadily for women and followed a different pattern: linear for men and nonlinear for women. Our data provide evidence of effect modification, with departure from an additive joint effect of deprivation and sex. We document sex differences in the socioeconomic gradient of MI risk in Strasbourg. Women appear more susceptible at levels of extreme deprivation; this result is not a chance finding, given the large difference in event rates between men and women.
Liu, Y; Polo, A; Zequera, M; Harba, R; Canals, R; Vilcahuaman, L; Bello, Y
2016-08-01
Prevention of serious diabetic foot complications such as ulceration or infection is an important issue. With the development of thermographic technologies, foot temperature-guided avoidance therapy has been recommended. Doctors from Hospital National Dos de Mayo are studying the risk of the diabetic foot progressing from Grade 0 to Grade 1 on the Wagner Scale. This risk of developing ulcers is related to the temperature difference between corresponding areas of the left and right foot. Generally speaking, a diabetic foot with a greater mean temperature difference has more potential to develop ulcers; in particular, an area whose temperature difference exceeds 2.2°C is where doctors and patients must pay close attention to potential problems such as ulceration or infection. A system in Visual Studio was developed that takes the thermal images as input and produces images with the absolute mean temperature difference of 7 different regions or four plantar angiosomes as output. The processing pipeline addressed essential medical image processing issues such as segmentation, location and regionalization, for which adapted algorithms were implemented. From a database of 85 patients, only 60 were used because of acquisition quality.
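A toy numpy version of the output described above, computing the mean absolute temperature difference between corresponding left and right foot regions and flagging those above the 2.2°C threshold; region names and temperatures are invented for illustration.

```python
# Toy sketch: per-region mean |dT| between feet, flagged against 2.2 °C.
import numpy as np

regions = ["region_1", "region_2", "region_3", "region_4",
           "region_5", "region_6", "region_7"]               # 7 regions (names assumed)
left_mean  = np.array([30.1, 31.2, 30.8, 29.5, 30.0, 29.9, 30.4])  # °C, fake values
right_mean = np.array([30.3, 33.6, 31.0, 29.7, 30.1, 30.2, 33.0])

delta = np.abs(left_mean - right_mean)
for name, d in zip(regions, delta):
    flag = "ATTENTION" if d > 2.2 else "ok"
    print(f"{name:10s} |dT| = {d:.1f} °C  {flag}")
```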
Costa, Rubens Barros; Costa, Ricardo L B; Talamantes, Sarah M; Kaplan, Jason B; Bhave, Manali A; Rademaker, Alfred; Miller, Corinne; Carneiro, Benedito A; Mahalingam, Devalingam; Chae, Young Kwang
2018-04-24
Anaplastic lymphoma kinase (ALK) inhibitors are the mainstay treatment for patients with non-small cell lung carcinoma (NSCLC) harboring a rearrangement of the ALK gene or the ROS1 oncogenes. With the recent publication of pivotal trials leading to the approval of these compounds in different indications, their toxicity profile warrants an update. A systematic literature search was performed in July 2017. Studies evaluating US FDA approved doses of one of the following ALK inhibitors: Crizotinib, Ceritinib, Alectinib or Brigatinib as monotherapy were included. Data were analyzed using random effects meta-analysis for absolute risks (AR), study heterogeneity, publication bias and differences among treatments. Fifteen trials with a total of 2,005 patients with evaluable toxicity data were included in this report. There was significant heterogeneity amongst different studies. The pooled ARs of death and severe adverse events were 0.5% and 34.5%, respectively. Grade 3/4 nausea, vomiting, diarrhea, and constipation were uncommon: 2.6%, 2.5%, 2.7%, 1.2%, respectively. ALK inhibitors have an acceptable safety profile with a low risk of treatment-related deaths. Important differences in toxicity profile were detected amongst the different drugs.
ERIC Educational Resources Information Center
Tamrouti-Makkink, Ilse D.; Dubas, Judith Semon; Gerris, Jan R. M.; van Aken, Marcel A. G.
2004-01-01
Background: The present study extends existing studies on the role of differential parental treatment in explaining individual differences in adolescent problem behaviors above the absolute level of parenting and clarifies the function of gender of the child, birth rank and gender constellation of the sibling dyads. Method: The absolute level of…
April-Sanders, Ayana; Oskar, Sabine; Shelton, Rachel C.; Schmitt, Karen; Desperito, Elise; Protacio, Angeline; Tehranifar, Parisa
2017-01-01
Objective Worry about developing breast cancer (BC worry) has been associated with participation in screening and genetic testing and with follow-up of abnormal screening results. Little is known about the scope and predictors of BC worry in Hispanic and immigrant populations. Methods We collected in-person interview data from 250 self-identified Hispanic women recruited from an urban mammography facility (average age 50.4 years; 82% foreign-born). Women reported whether they worried about developing breast cancer rarely/never (low worry), sometimes (moderate worry) or often/all the time (high worry). We examined whether sociocultural and psychological factors (e.g., acculturation, education, perceived risk), and risk factors and objective risk for breast cancer (e.g., family history, Gail model 5-year risk estimates, parity) predicted BC worry using multinomial and binary logistic regression. Results In multivariable models, women who perceived higher absolute breast cancer risk (OR=1.66, 95% CI: 1.28, 2.14 for one unit increase in perceived lifetime risk) and comparative breast cancer risk (e.g., OR=2.37, 95% CI: 1.23, 6.06) were more likely to report high BC worry than moderate or low BC worry. There were no associations between BC worry and indicators of objective risk or acculturation. Conclusions In Hispanic women undergoing screening mammography, higher perceptions of breast cancer risk, on both absolute and comparative terms, were independently associated with high BC worry, and were stronger predictors of BC worry than indicators of objective breast cancer risk, including family history, mammographic density and personal breast cancer risk estimates. PMID:27863982
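A rough sketch of the multinomial model described above (worry level regressed on perceived-risk measures); it is not the study's code and the file and variable names are hypothetical.

```python
# Sketch: multinomial logistic regression of worry (low/moderate/high) on
# perceived risk; hypothetical file and column names, illustrative only.
import pandas as pd
import statsmodels.formula.api as smf

d = pd.read_csv("bc_worry_interviews.csv")  # hypothetical file
d["worry_code"] = pd.Categorical(d["worry"],
                                 categories=["low", "moderate", "high"]).codes  # 0/1/2

model = smf.mnlogit(
    "worry_code ~ perceived_lifetime_risk + comparative_risk + gail_5yr_risk",
    data=d).fit()
print(model.summary())  # coefficients are log-odds of moderate/high vs low worry
```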
Gong, Yi; Chen, Rui; Zhang, Xi; Zou, Zhong Min; Chen, Xing Hua
2017-07-01
To investigate the risk stratification of aggressive B cell lymphoma using the immune microenvironment and clinical factors. A total of 127 patients with aggressive B cell lymphoma between 2014 and 2015 were enrolled in this study. CD4, Foxp3, CD8, CD68, CD163, PD-1, and PD-L1 expression levels were evaluated in paraffin-embedded lymphoma tissues to identify their roles in the risk stratification. Eleven factors were identified for further evaluation using analysis of variance, chi-square, and multinomial logistic regression analysis. Significant differences in 11 factors (age, Ann Arbor stage, B symptom, ECOG performance status, infiltrating CD8+ T cells, PD-L1 expression, absolute blood monocyte count, serum lactate dehydrogenase, serum iron, serum albumin, and serum β2-microglobulin) were observed among patient groups stratified by at least two risk stratification methods [International Prognostic Index (IPI), revised IPI, and NCCN-IPI models] (P < 0.05). Concordance rates were high (81.4%-100.0%) when these factors were used for the risk stratification. No difference in the risk stratification results was observed with or without the Ann Arbor stage data. We developed a convenient and inexpensive tool for use in risk stratification of aggressive B cell lymphomas, although further studies on the role of immune microenvironmental factors are needed. Copyright © 2017 The Editorial Board of Biomedical and Environmental Sciences. Published by China CDC. All rights reserved.
Kaiser, Christoph; Galatius, Soeren; Jeger, Raban; Gilgen, Nicole; Skov Jensen, Jan; Naber, Christoph; Alber, Hannes; Wanitschek, Maria; Eberli, Franz; Kurz, David J; Pedrazzini, Giovanni; Moccetti, Tiziano; Rickli, Hans; Weilenmann, Daniel; Vuillomenet, André; Steiner, Martin; Von Felten, Stefanie; Vogt, Deborah R; Wadt Hansen, Kim; Rickenbacher, Peter; Conen, David; Müller, Christian; Buser, Peter; Hoffmann, Andreas; Pfisterer, Matthias
2015-01-06
Biodegradable-polymer drug-eluting stents (BP-DES) were developed to be as effective as second-generation durable-polymer drug-eluting stents (DP-DES) and as safe >1 year as bare-metal stents (BMS). Thus, very late stent thrombosis (VLST) attributable to durable polymers should no longer appear. To address these early and late aspects, 2291 patients presenting with acute or stable coronary disease needing stents ≥3.0 mm in diameter between April 2010 and May 2012 were randomly assigned to biolimus-A9-eluting BP-DES, second-generation everolimus-eluting DP-DES, or thin-strut silicon-carbide-coated BMS in 8 European centers. All patients were treated with aspirin and risk-adjusted doses of prasugrel. The primary end point was combined cardiac death, myocardial infarction, and clinically indicated target-vessel revascularization within 2 years. The combined secondary safety end point was a composite of VLST, myocardial infarction, and cardiac death. The cumulative incidence of the primary end point was 7.6% with BP-DES, 6.8% with DP-DES, and 12.7% with BMS. By intention-to-treat BP-DES were noninferior (predefined margin, 3.80%) compared with DP-DES (absolute risk difference, 0.78%; -1.93% to 3.50%; P for noninferiority 0.042; per protocol P=0.09) and superior to BMS (absolute risk difference, -5.16; -8.32 to -2.01; P=0.0011). The 3 stent groups did not differ in the combined safety end point, with no decrease in events >1 year, particularly VLST with BP-DES. In large vessel stenting, BP-DES appeared barely noninferior compared with DP-DES and more effective than thin-strut BMS, but without evidence for better safety nor lower VLST rates >1 year. Findings challenge the concept that durable polymers are key in VLST formation. http://www.clinicaltrials.gov. Unique identifier: NCT01166685. © 2014 American Heart Association, Inc.
Balosso, Jacques
2017-01-01
Background During the past decades, in radiotherapy, dose distributions were calculated using density correction methods with a pencil beam (type ‘a’) algorithm. The objective of this study was to assess the impact of the dose distribution shift on the predicted secondary cancer risk (SCR) when using modern advanced dose calculation algorithms (point kernel, type ‘b’), which consider changes in lateral electron transport. Methods Clinical examples of pediatric cranio-spinal irradiation patients were evaluated. For each case, two radiotherapy treatment plans were generated using the same prescribed dose to the target, resulting in different numbers of monitor units (MUs) per field. The dose distributions were calculated, respectively, using both algorithm types. A gamma index (γ) analysis was used to compare dose distribution in the lung. The organ equivalent dose (OED) was calculated with three different models: the linear, the linear-exponential and the plateau dose response curves. The excess absolute risk ratio (EAR) was also evaluated as (EAR = OED type ‘b’ / OED type ‘a’). Results The γ analysis results indicated an acceptable dose distribution agreement of 95% with 3%/3 mm. However, the γ-maps displayed dose displacements >1 mm around the healthy lungs. Compared to type ‘a’, the OED values from type ‘b’ dose distributions were about 8% to 16% higher, leading to an EAR ratio >1, ranging from 1.08 to 1.13 depending on the SCR model. Conclusions The shift of dose calculation in radiotherapy, according to the algorithm, can significantly influence the SCR prediction and the plan optimization, since OEDs are calculated from the DVH for a specific treatment. The agreement between dose distribution and SCR prediction depends on dose response models and epidemiological data. In addition, a γ passing rate of 3%/3 mm does not reflect the difference, up to 15%, in the predictions of SCR resulting from alternative algorithms. Considering that modern algorithms are more accurate, showing the dose distributions more precisely, but that the prediction of absolute SCR is still very imprecise, only the EAR ratio could be used to rank radiotherapy plans. PMID:28811995
Using the Lorenz Curve to Characterize Risk Predictiveness and Etiologic Heterogeneity
Mauguen, Audrey; Begg, Colin B.
2017-01-01
The Lorenz curve is a graphical tool that is used widely in econometrics. It represents the spread of a probability distribution, and its traditional use has been to characterize population distributions of wealth or income, or more specifically, inequalities in wealth or income. However, its utility in public health research has not been broadly established. The purpose of this article is to explain its special usefulness for characterizing the population distribution of disease risks, and in particular for identifying the precise disease burden that can be predicted to occur in segments of the population that are known to have especially high (or low) risks, a feature that is important for evaluating the yield of screening or other disease prevention initiatives. We demonstrate that, although the Lorenz curve represents the distribution of predicted risks in a population at risk for the disease, in fact it can be estimated from a case–control study conducted in the population without the need for information on absolute risks. We explore two different estimation strategies and compare their statistical properties using simulations. The Lorenz curve is a statistical tool that deserves wider use in public health research. PMID:27096256
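As an illustration of the construction described above, the following minimal sketch builds a Lorenz curve from a vector of predicted risks: individuals are ordered from lowest to highest risk, and the cumulative population share is paired with the cumulative share of total predicted risk (i.e., of expected cases). The risk distribution is hypothetical.

```python
import numpy as np

def lorenz_curve(predicted_risks):
    """Return (population share, risk share) points of the Lorenz curve."""
    r = np.sort(np.asarray(predicted_risks, float))        # lowest risk first
    cum_pop = np.arange(1, r.size + 1) / r.size            # cumulative population share
    cum_risk = np.cumsum(r) / r.sum()                      # cumulative share of total risk
    return np.insert(cum_pop, 0, 0.0), np.insert(cum_risk, 0, 0.0)

rng = np.random.default_rng(0)
risks = rng.lognormal(mean=-4.0, sigma=1.0, size=10_000)   # hypothetical 10-year risks

pop, risk = lorenz_curve(risks)
# e.g. what fraction of the predicted disease burden falls in the top 20% of risk?
top20_burden = 1.0 - np.interp(0.8, pop, risk)
print(f"Share of predicted cases in the highest-risk 20%: {top20_burden:.1%}")
```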
NASA Astrophysics Data System (ADS)
Wziontek, H.; Palinkas, V.; Falk, R.; Vaľko, M.
2016-12-01
For decades, absolute gravimeters have been compared on a regular basis at the international level, starting at the International Bureau for Weights and Measures (BIPM) in 1981. Usually, these comparisons are based on constant reference values deduced from all accepted measurements acquired during the comparison period. Temporal changes between comparison epochs are usually not considered. Resolution No. 2, adopted by the IAG during the IUGG General Assembly in Prague in 2015, initiates the establishment of a Global Absolute Gravity Reference System based on key comparisons of absolute gravimeters (AG) under the International Committee for Weights and Measures (CIPM), in order to establish a common level in the microGal range. A stable and unique reference frame can only be achieved if different AGs take part in different kinds of comparisons. Systematic deviations between the respective comparison reference values can be detected if the AGs can be considered stable over time. The continuous operation of superconducting gravimeters (SG) at selected stations further supports the temporal link of comparison reference values by establishing a reference function over time. By homogeneously reprocessing different comparison epochs and including AG and SG time series at selected stations, links between several comparisons will be established and temporal comparison reference functions will be derived. In this way, comparisons at the regional level can be traced back to the level of key comparisons, providing a reference for other absolute gravimeters. We discuss and demonstrate how such a concept can be used to support the future absolute gravity reference system.
Castelli, Joël; Depeursinge, Adrien; de Bari, Berardino; Devillers, Anne; de Crevoisier, Renaud; Bourhis, Jean; Prior, John O
2017-06-01
In the context of oropharyngeal cancer treated with definitive radiotherapy, the aim of this retrospective study was to identify the best threshold value to compute metabolic tumor volume (MTV) and/or total lesion glycolysis to predict local-regional control (LRC) and disease-free survival. One hundred twenty patients with a locally advanced oropharyngeal cancer from 2 different institutions treated with definitive radiotherapy underwent FDG PET/CT before treatment. Various MTVs and total lesion glycolysis were defined based on 2 segmentation methods: (i) an absolute threshold of SUV (0-20 g/mL) or (ii) a relative threshold of SUVmax (0%-100%). The parameters' predictive capabilities for disease-free survival and LRC were assessed using the Harrell C-index and Cox regression model. Relative thresholds between 40% and 68% and absolute thresholds between 5.5 and 7 had a similar predictive value for LRC (C-index = 0.65 and 0.64, respectively). Metabolic tumor volume had a higher predictive value than gross tumor volume (C-index = 0.61) and SUVmax (C-index = 0.54). Metabolic tumor volume computed with a relative threshold of 51% of SUVmax was the best predictor of disease-free survival (hazard ratio, 1.23 [per 10 mL], P = 0.009) and LRC (hazard ratio, 1.22 [per 10 mL], P = 0.02). The use of different thresholds within a reasonable range (between 5.5 and 7 for an absolute threshold and between 40% and 68% for a relative threshold) seems to have no major impact on the predictive value of MTV. This parameter may be used to identify patients with a high risk of recurrence who may benefit from treatment intensification.
Joensen, Albert Marni; Joergensen, Torben; Lundbye-Christensen, Søren; Johansen, Martin Berg; Guzman-Castillo, Maria; Bandosz, Piotr; Hallas, Jesper; Prescott, Eva Irene Bossano; Capewell, Simon; O'Flaherty, Martin
2018-01-01
To quantify the contributions of changes in population risk factor levels and treatment uptake to the decline in CHD mortality in Denmark from 1991 to 2007 across socioeconomic groups. We used IMPACTSEC, a previously validated policy model using data from different population registries. All adults aged 25-84 years living in Denmark in 1991 and 2007. Deaths prevented or postponed (DPP). There were approximately 11,000 fewer CHD deaths in Denmark in 2007 than would be expected if the 1991 mortality rates had persisted. Higher mortality rates were observed in the lowest socioeconomic quintile. The highest absolute reduction in CHD mortality was seen in this group, but the highest relative reduction was in the most affluent socioeconomic quintile. Overall, the IMPACTSEC model explained nearly two thirds of the decline in CHD mortality. Improved treatments accounted for approximately 25%, with the smallest relative mortality reduction in the most deprived quintile. Risk factor improvements accounted for approximately 40% of the mortality decrease, with similar gains across all socio-economic groups. The 36% gap in explaining all DPPs may reflect inaccurate data or risk factors not quantified in the current model. According to the IMPACTSEC model, the largest contribution to the CHD mortality decline in Denmark from 1991 to 2007 was from improvements in risk factors, with similar gains across all socio-economic groups. However, we found a clear socioeconomic trend for the treatment contribution favouring the most affluent groups.
A clustering approach to segmenting users of internet-based risk calculators.
Harle, C A; Downs, J S; Padman, R
2011-01-01
Risk calculators are widely available Internet applications that deliver quantitative health risk estimates to consumers. Although these tools are known to have varying effects on risk perceptions, little is known about who will be more likely to accept objective risk estimates. To identify clusters of online health consumers that help explain variation in individual improvement in risk perceptions from web-based quantitative disease risk information. A secondary analysis was performed on data collected in a field experiment that measured people's pre-diabetes risk perceptions before and after visiting a realistic health promotion website that provided quantitative risk information. K-means clustering was performed on numerous candidate variable sets, and the different segmentations were evaluated based on between-cluster variation in risk perception improvement. Variation in responses to risk information was best explained by clustering on pre-intervention absolute pre-diabetes risk perceptions and an objective estimate of personal risk. Members of a high-risk overestimator cluster showed large improvements in their risk perceptions, but clusters of both moderate-risk and high-risk underestimators were much more muted in improving their optimistically biased perceptions. Cluster analysis provided a unique approach for segmenting health consumers and predicting their acceptance of quantitative disease risk information. These clusters suggest that health consumers were very responsive to good news, but tended not to incorporate bad news into their self-perceptions. These findings help to quantify variation among online health consumers and may inform the targeted marketing of and improvements to risk communication tools on the Internet.
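A minimal sketch of the segmentation idea, assuming hypothetical data: pre-intervention risk perceptions and objective risk estimates are clustered with k-means, and clusters are then compared on improvement in risk perception. Variable names and values are illustrative, not the study's.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
n = 300
objective_risk = rng.uniform(5, 60, n)                          # hypothetical % risk estimates
perceived_risk = np.clip(objective_risk + rng.normal(0, 20, n), 0, 100)
# Hypothetical post-intervention improvement: some fraction of the initial miscalibration
improvement = np.abs(perceived_risk - objective_risk) * rng.uniform(0.2, 0.8, n)

X = np.column_stack([perceived_risk, objective_risk])
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)

for k in range(3):
    m = labels == k
    print(f"cluster {k}: n={m.sum():3d}  "
          f"mean perceived={perceived_risk[m].mean():5.1f}  "
          f"mean objective={objective_risk[m].mean():5.1f}  "
          f"mean improvement={improvement[m].mean():5.1f}")
```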
Zhu, Pengyu; Fu, Wei; Wang, Chenguang; Du, Zhixin; Huang, Kunlun; Zhu, Shuifang; Xu, Wentao
2016-04-15
The possibility of the absolute quantitation of GMO events by digital PCR was recently reported. However, most absolute quantitation methods based on digital PCR require pretreatment steps. Meanwhile, singleplex detection cannot meet the demands of absolute quantitation of GMO events, which is based on the ratio of foreign fragments to reference genes. Thus, to promote the absolute quantitative detection of different GMO events by digital PCR, we developed a quantitative detection method based on duplex digital PCR without pretreatment. Moreover, we tested 7 GMO events in our study to evaluate the fitness of our method. The optimized combination of foreign and reference primers, limit of quantitation (LOQ), limit of detection (LOD) and specificity were validated. The results showed that the LOQ of our method for different GMO events was 0.5%, while the LOD was 0.1%. Additionally, we found that duplex digital PCR achieved detection results with lower RSD than singleplex digital PCR. In summary, the duplex digital PCR detection system is a simple and stable way to achieve the absolute quantitation of different GMO events. Moreover, the LOQ and LOD indicated that this method is suitable for the daily detection and quantitation of GMO events. Copyright © 2016 Elsevier B.V. All rights reserved.
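A minimal sketch of the ratio-based quantitation that a duplex digital PCR readout supports: positive-partition counts for the event-specific and reference assays are converted to copy concentrations with the usual Poisson correction, and GMO content is reported as the copy ratio. The partition counts and partition volume below are hypothetical, not values from the study.

```python
import math

def dpcr_copies(positive, total, partition_volume_ul=0.00085):
    """Copies per microlitre from digital PCR partition counts (Poisson correction)."""
    lam = -math.log(1.0 - positive / total)       # mean copies per partition
    return lam / partition_volume_ul

# Hypothetical duplex reaction: event-specific target and taxon-specific reference gene
event_copies = dpcr_copies(positive=950, total=20_000)
reference_copies = dpcr_copies(positive=18_500, total=20_000)

gmo_content = 100.0 * event_copies / reference_copies
print(f"event: {event_copies:.0f} copies/µL, reference: {reference_copies:.0f} copies/µL")
print(f"GMO content ≈ {gmo_content:.2f} % (copy-number ratio)")
```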
Conditioning procedure and color discrimination in the honeybee Apis mellifera
NASA Astrophysics Data System (ADS)
Giurfa, Martin
We studied the influence of the conditioning procedure on color discrimination by free-flying honeybees. We asked whether absolute and differential conditioning result in different discrimination capabilities for the same pairs of colored targets. In absolute conditioning, bees were rewarded on a single color; in differential conditioning, bees were rewarded on the same color but an alternative, non-rewarding, similar color was also visible. In both conditioning procedures, bees learned their respective task and could also discriminate the training stimulus from a novel stimulus that was perceptually different from the trained one. Discrimination between perceptually closer stimuli was possible after differential conditioning but not after absolute conditioning. Differences in attention inculcated by these training procedures may underlie the different discrimination performances of the bees.
Influence of graphic format on comprehension of risk information among American Indians.
Sprague, Debra; LaVallie, Donna L; Wolf, Fredric M; Jacobsen, Clemma; Sayson, Kirsten; Buchwald, Dedra
2011-01-01
Presentation of risk information influences patients' ability to interpret health care options. Little is known about this relationship between risk presentation and interpretation among American Indians. Three hundred American Indian employees on a western American Indian reservation were invited to complete an anonymous written survey. All surveys included a vignette presenting baseline risk information about a hypothetical cancer and possible benefits of 2 prevention plans. Risk interpretation was assessed by correct answers to 3 questions evaluating the risk reduction associated with the plans. Numeric information was the same in all surveys, but framing varied; half expressed prevention benefits in terms of relative risk reduction and half in terms of absolute risk reduction. All surveys used text to describe the benefits of the 2 plans, but half included a graphic image. Surveys were distributed randomly. Responses were analyzed using binary logistic regression with the robust variance estimator to account for clustering of outcomes within participant. Use of a graphic image was associated with higher odds of correctly answering 3 risk interpretation questions (odds ratio = 2.5, 95% confidence interval = 1.5-4.0, P < 0.001) compared to the text-only format. These findings were similar to those of previous studies carried out in the general population. Neither framing information as relative compared to absolute risk nor the interaction between graphic image and relative risk presentation was associated with risk interpretation. One type of graphic image was associated with increased understanding of risk in a small sample of American Indian adults. The authors recommend further investigation of the effectiveness of other types of graphic displays for conveying health risk information to this population.
Hu, Tian; Yang, Hai-Long; Tang, Qing; Zhang, Hui; Nie, Lei; Li, Lian; Wang, Jin-Feng; Liu, Dong-Ming; Jiang, Wei; Wang, Fei; Zang, Heng-Chang
2014-10-01
As a very precious traditional Chinese medicine (TCM), Huoshan Dendrobium not only commands a high price but also has significant pharmaceutical efficacy. However, different species of Huoshan Dendrobium exhibit considerable differences in pharmaceutical efficacy, so rapid and absolutely non-destructive discrimination of Huoshan Dendrobium nobile according to species is crucial for quality control and pharmaceutical effect. In this study, a MicroNIR 1700 miniature near-infrared (NIR) spectrometer was used for absolutely nondestructive acquisition of NIR spectra of 90 batches of Dendrobium from five species of different commodity grades. The samples were intact and not smashed. Soft independent modeling of class analogy (SIMCA) pattern recognition based on principal component analysis (PCA) was used to classify and recognize the different species of Dendrobium samples. The results indicated that the SIMCA qualitative models established with the standard normal variate (SNV) pretreatment method, in the spectral range selected by the Qs method, had 100% recognition rates and 100% rejection rates. This study demonstrated that a rapid and absolutely non-destructive analytical technique based on the MicroNIR 1700 spectrometer can successfully discriminate five different species of Huoshan Dendrobium with acceptable accuracy.
Gierach, Gretchen L.; Patel, Deesha A.; Pfeiffer, Ruth M.; Figueroa, Jonine D.; Linville, Laura; Papathomas, Daphne; Johnson, Jason M.; Chicoine, Rachael E.; Herschorn, Sally D.; Shepherd, John A.; Wang, Jeff; Malkov, Serghei; Vacek, Pamela M.; Weaver, Donald L.; Fan, Bo; Mahmoudzadeh, Amir Pasha; Palakal, Maya; Xiang, Jackie; Oh, Hannah; Horne, Hisani N.; Sprague, Brian L.; Hewitt, Stephen M.; Brinton, Louise A.; Sherman, Mark E.
2016-01-01
Elevated mammographic density (MD) is an established breast cancer risk factor. Reduced involution of terminal duct lobular units (TDLUs), the histologic source of most breast cancers, has been associated with higher MD and breast cancer risk. We investigated relationships of TDLU involution with area and volumetric MD, measured throughout the breast and surrounding biopsy targets (peri-lesional). Three measures inversely related to TDLU involution (TDLU count/mm2, median TDLU span, median acini count/TDLU) assessed in benign diagnostic biopsies from 348 women, ages 40–65, were related to MD area (quantified with thresholding software) and volume (assessed with a density phantom) by analysis of covariance, stratified by menopausal status and adjusted for confounders. Among premenopausal women, TDLU count was directly associated with percent peri-lesional MD (P-trend=0.03), but not with absolute dense area/volume. Greater TDLU span was associated with elevated percent dense area/volume (P-trend<0.05) and absolute peri-lesional MD (P=0.003). Acini count was directly associated with absolute peri-lesional MD (P=0.02). Greater TDLU involution (all metrics) was associated with increased nondense area/volume (P-trend≤0.04). Among postmenopausal women, TDLU measures were not significantly associated with MD. Among premenopausal women, reduced TDLU involution was associated with higher area and volumetric MD, particularly in peri-lesional parenchyma. Data indicating that TDLU involution and MD are correlated markers of breast cancer risk suggest that associations of MD with breast cancer may partly reflect amounts of at-risk epithelium. If confirmed, these results could suggest a prevention paradigm based on enhancing TDLU involution and monitoring efficacy by assessing MD reduction. PMID:26645278
Gentry, Quinn M; Elifson, Kirk; Sterk, Claire
2005-06-01
The purpose of this study was to examine how various living conditions impact the context within which low-income African American women engage in a diverse range of high-risk behavior that increases their risk for HIV infection. The study, based on 2 years of ethnographic fieldwork, analyzed the living conditions of 45 African American women at risk for HIV infection in a high-risk neighborhood in Atlanta, Georgia. A black feminist perspective guided the study's analytical framework as a way to extend knowledge about the social conditions, the social interactions, and the meaning of high-risk behavior in the lives of African American women. Using black feminist theory and the constant comparison method, two groups emerged: "street" women and "house" women. Street women were defined as the absolute homeless, the rooming housed, and the hustling homeless. House women were defined as the family housed, the heads of household, and the steady-partner housed. Results reveal that various types of living arrangements place women at risk in different ways and suggest that low-income African American women at high risk for HIV infection-a group often considered homogeneous-have unique "within group" needs that must be addressed in HIV prevention intervention research.
Garratt, Elisabeth A; Chandola, Tarani; Purdam, Kingsley; Wood, Alex M
2016-10-01
Parents face an increased risk of psychological distress compared with adults without children, and families with children also have lower average household incomes. Past research suggests that absolute income (material position) and income status (psychosocial position) influence psychological distress, but their combined effects on changes in psychological distress have not been examined. Whether absolute income interacts with income status to influence psychological distress is also a key question. We used fixed-effects panel models to examine longitudinal associations between psychological distress (measured on the Kessler scale) and absolute income, distance from the regional mean income, and regional income rank (a proxy for status) using data from 29,107 parents included in the UK Millennium Cohort Study (2003-2012). Psychological distress was determined by an interaction between absolute income and income rank: higher absolute income was associated with lower psychological distress across the income spectrum, while the benefits of higher income rank were evident only in the highest-income parents. Parents' psychological distress was, therefore, determined by a combination of income-related material and psychosocial factors. Both material and psychosocial factors contribute to well-being. Higher absolute incomes were associated with lower psychological distress across the income spectrum, demonstrating the importance of material factors. Conversely, income status was associated with psychological distress only at higher absolute incomes, suggesting that psychosocial factors are more relevant to distress in more advantaged, higher-income parents. Clinical interventions could, therefore, consider both the material and psychosocial impacts of income on psychological distress.
Reijnierse, Esmee M.; Trappenburg, Marijke C.; Leter, Morena J.; Blauw, Gerard Jan; de van der Schueren, Marian A. E.; Meskers, Carel G. M.; Maier, Andrea B.
2015-01-01
Objectives Diagnostic criteria for sarcopenia include measures of muscle mass, muscle strength and physical performance. Consensus on the definition of sarcopenia has not been reached yet. To improve insight into the most clinically valid definition of sarcopenia, this study aimed to compare the association between parameters of malnutrition, as a risk factor in sarcopenia, and diagnostic measures of sarcopenia in geriatric outpatients. Material and Methods This study is based on data from a cross-sectional study conducted in a geriatric outpatient clinic including 185 geriatric outpatients (mean age 82 years). Parameters of malnutrition included risk of malnutrition (assessed by the Short Nutritional Assessment Questionnaire), loss of appetite, unintentional weight loss and underweight (body mass index <22 kg/m2). Diagnostic measures of sarcopenia included relative muscle mass (lean mass and appendicular lean mass [ALM] as percentages), absolute muscle mass (total lean mass and ALM/height2), handgrip strength and walking speed. All diagnostic measures of sarcopenia were standardized. Associations between parameters of malnutrition (independent variables) and diagnostic measures of sarcopenia (dependent variables) were analysed using multivariate linear regression models adjusted for age, body mass, fat mass and height in separate models. Results None of the parameters of malnutrition was consistently associated with diagnostic measures of sarcopenia. The strongest associations were found for both relative and absolute muscle mass; weaker associations were found for muscle strength and physical performance. Underweight (p < 0.001) and unintentional weight loss (p = 0.031) were most strongly associated with higher lean mass percentage after adjusting for age. Loss of appetite (p = 0.003) and underweight (p = 0.021) were most strongly associated with lower total lean mass after adjusting for age and fat mass. Conclusion Parameters of malnutrition relate differently to diagnostic measures of sarcopenia in geriatric outpatients. The association between parameters of malnutrition and diagnostic measures of sarcopenia was strongest for both relative and absolute muscle mass, while weaker associations were found with muscle strength and physical performance. PMID:26284368
2011-01-01
Background Age-related bone loss is asymptomatic, and the morbidity of osteoporosis is secondary to the fractures that occur. Common sites of fracture include the spine, hip, forearm and proximal humerus. Fractures at the hip incur the greatest morbidity and mortality and give rise to the highest direct costs for health services. Their incidence increases exponentially with age. Independently of changes in population demography, the age- and sex-specific incidence of osteoporotic fractures appears to be increasing in developing and developed countries. This could mean more than a doubling of the expected burden of osteoporotic fractures in the next 50 years. Methods/Design To assess the predictive power of the WHO FRAX™ tool to identify the subjects with the highest absolute risk of fragility fracture at 10 years in a Spanish population, a predictive validation study of the tool will be carried out. For this purpose, the participants recruited by 1999 will be assessed. These were referred to the DXA-scan department from primary healthcare centres and from non-hospital and hospital consultations. Study population: Patients attended in the national health services, integrated into the FRIDEX cohort, with at least one dual-energy X-ray absorptiometry (DXA) measurement and one extensive questionnaire related to fracture risk factors. Measurements: At baseline, bone mineral density measurement using DXA, clinical fracture risk factor questionnaire, dietary calcium intake assessment, history of previous fractures, and related drugs. Follow-up by telephone interview to ascertain fragility fractures over the 10 years, with verification in electronic medical records, and also to record the number of falls in the last year. The absolute risk of fracture will be estimated using the FRAX™ tool from the official web site. Discussion For more than 10 years, numerous publications have recognised the importance of risk factors for new osteoporotic fractures other than low BMD. Wider use of a method for calculating the risk (probability) of fracture with the FRAX™ tool is foreseeable in Spain, and this would justify a study such as this one to allow the necessary calibration adjustments of the parameters included in the FRAX™ formula. PMID:21272372
Werner, S.C.; Tanaka, K.L.
2011-01-01
For the boundaries of each chronostratigraphic epoch on Mars, we present systematically derived crater-size frequencies based on crater counts of geologic referent surfaces and three proposed "standard" crater size-frequency production distributions as defined by (a) a simple -2 power law, (b) Neukum and Ivanov, (c) Hartmann. In turn, these crater count values are converted to model-absolute ages based on the inferred cratering rate histories. We present a new boundary definition for the Late Hesperian-Early Amazonian transition. Our fitting of crater size-frequency distributions to the chronostratigraphic record of Mars permits the assignment of cumulative counts of craters down to 100 m, 1 km, 2 km, 5 km, and 16 km diameters to martian epochs. Due to differences in the "standard" crater size-frequency production distributions, a generalized crater-density-based definition of the chronostratigraphic system cannot be provided. For the diameter range used for the boundary definitions, the resulting model absolute age fits vary within 1.5% for a given set of production function and chronology model ages. Crater distributions translated to absolute ages utilizing different curve descriptions can result in absolute age differences exceeding 10%. © 2011 Elsevier Inc.
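As a small illustration of the simplest of the three production functions mentioned, the sketch below rescales a cumulative crater density measured at one reference diameter to other diameters under a -2 power law; the count and surface area are hypothetical, and the further step of converting N(>D) to a model absolute age via a chronology function is omitted.

```python
def rescale_cumulative_density(n_ref_per_km2, d_ref_km, d_target_km, slope=-2.0):
    """Rescale a cumulative crater density N(>D_ref) to N(>D_target)
    assuming a single power-law production function N(>D) ∝ D**slope."""
    return n_ref_per_km2 * (d_target_km / d_ref_km) ** slope

# Hypothetical count: 40 craters larger than 2 km on a 1.0e5 km^2 referent surface
n_gt_2km = 40 / 1.0e5
for d in (1.0, 2.0, 5.0, 16.0):
    print(f"N(>{d:>4.1f} km) ≈ {rescale_cumulative_density(n_gt_2km, 2.0, d):.2e} per km^2")
```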
Soon, Aun Woon; Toney, Amanda Greene; Stidham, Timothy; Kendall, John; Roosevelt, Genie
2018-04-24
To assess whether Web-based teaching is at least as effective as traditional classroom didactic in improving the proficiency of pediatric novice learners in the image acquisition and interpretation of pneumothorax and pleural effusion using point-of-care ultrasound (POCUS). We conducted a randomized controlled noninferiority study comparing the effectiveness of Web-based teaching to traditional classroom didactic. The participants were randomized to either group A (live classroom lecture) or group B (Web-based lecture) and completed a survey and knowledge test. They also received hands-on training and completed an objective structured clinical examination. The participants were invited to return 2 months later to test for retention of knowledge and skills. There were no significant differences in the mean written test scores between the classroom group and Web group for the precourse test (absolute difference, -2.5; 95% confidence interval [CI], -12 to 6.9), postcourse test (absolute difference, 2.0; 95% CI, -1.4, 5.3), and postcourse 2-month retention test (absolute difference, -0.8; 95% CI, -9.6 to 8.1). Similarly, no significant differences were noted in the mean objective structured clinical examination scores for both intervention groups in postcourse (absolute difference, 1.9; 95% CI, -4.7 to 8.5) and 2-month retention (absolute difference, -0.6; 95% CI, -10.7 to 9.5). Web-based teaching is at least as effective as traditional classroom didactic in improving the proficiency of novice learners in POCUS. The usage of Web-based tutorials allows a more efficient use of time and a wider dissemination of knowledge.
Prioritization of influenza pandemic vaccination to minimize years of life lost.
Miller, Mark A; Viboud, Cecile; Olson, Donald R; Grais, Rebecca F; Rabaa, Maia A; Simonsen, Lone
2008-08-01
How to allocate limited vaccine supplies in the event of an influenza pandemic is currently under debate. Conventional vaccination strategies focus on those at highest risk for severe outcomes, including seniors, but do not consider (1) the signature pandemic pattern in which mortality risk is shifted to younger ages, (2) likely reduced vaccine response in seniors, and (3) differences in remaining years of life with age. We integrated these factors to project the age-specific years of life lost (YLL) and saved in a future pandemic, on the basis of mortality patterns from 3 historical pandemics, age-specific vaccine efficacy, and the 2000 US population structure. For a 1918-like scenario, the absolute mortality risk is highest in people <45 years old; in contrast, seniors (those >or=65 years old) have the highest mortality risk in the 1957 and 1968 scenarios. The greatest YLL savings would be achieved by targeting different age groups in each scenario; people <45 years old in the 1918 scenario, people 45-64 years old in the 1968 scenario, and people >45 years old in the 1957 scenario. Our findings shift the focus of pandemic vaccination strategies onto younger populations and illustrate the need for real-time surveillance of mortality patterns in a future pandemic. Flexible setting of vaccination priority is essential to minimize mortality.
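A minimal sketch of the years-of-life-lost bookkeeping behind such a prioritization: deaths averted in each age group are the product of population, pandemic mortality risk, vaccine coverage and age-specific efficacy, and YLL saved weights them by remaining life expectancy. All numbers below are hypothetical and are not the study's inputs.

```python
# age group: (population, pandemic mortality risk, vaccine efficacy, remaining life expectancy)
age_groups = {
    "<45":   (170e6, 0.0030, 0.80, 45.0),
    "45-64": ( 80e6, 0.0015, 0.70, 25.0),
    "65+":   ( 35e6, 0.0020, 0.45, 12.0),
}
coverage = 0.30          # fraction of each group that can be vaccinated (hypothetical)

for name, (pop, risk, efficacy, life_exp) in age_groups.items():
    deaths_averted = pop * risk * coverage * efficacy
    yll_saved = deaths_averted * life_exp
    print(f"{name:>5}: deaths averted ≈ {deaths_averted:,.0f}, YLL saved ≈ {yll_saved:,.0f}")
```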
Comparison of elicitation potential of chloroatranol and atranol--2 allergens in oak moss absolute.
Johansen, Jeanne D; Bernard, Guillaume; Giménez-Arnau, Elena; Lepoittevin, Jean-Pierre; Bruze, Magnus; Andersen, Klaus E
2006-04-01
Chloroatranol and atranol are degradation products of chloroatranorin and atranorin, respectively, and have recently been identified as important contact allergens in the natural fragrance extract, oak moss absolute. Oak moss absolute is widely used in perfumery and is the cause of many cases of fragrance allergic contact dermatitis. Chloroatranol elicits reactions at very low levels of exposure. In oak moss absolute, chloroatranol and atranol are present together and both may contribute to the allergenicity and eliciting capacity of the natural extract. In this study, 10 eczema patients with known sensitization to chloroatranol and oak moss absolute were tested simultaneously to a serial dilution of chloroatranol and atranol in ethanol, in equimolar concentrations (0.0034-1072 microM). Dose-response curves were estimated and analysed by logistic regression. The estimated difference in elicitation potency of chloroatranol relative to atranol based on testing with equimolar concentrations was 217% (95% confidence interval 116-409%). Both substances elicited reactions at very low levels of exposure. It is concluded that the differences in elicitation capacity between the 2 substances are counterbalanced by exposure being greater to atranol than to chloroatranol and that both substances contribute to the clinical problems seen in oak moss absolute-sensitized individuals.
Mosquito repellents: An insight into the chronological perspectives and novel discoveries.
Islam, Johirul; Zaman, Kamaruz; Duarah, Sanjukta; Raju, Pakalapati Srinivas; Chattopadhyay, Pronobesh
2017-03-01
The mosquito, being the major medically important arthropod vector, requires utmost attention in order to reduce the suffering and economic consequences of those living in endemic regions. This is only possible by minimising human-mosquito contact through an absolute preventive measure. However, unfortunately, such absolute measures are yet to be developed despite enormous efforts and huge investments worldwide. In the absence of vaccines for a number of mosquito-borne diseases, repellents could be an attractive option for both military personnel and civilians to minimise the risk of contracting these diseases. However, to achieve this goal, detailed knowledge of a particular repellent is a must, including its mode of repellency and other relevant information. Here, in the present article, an effort has been made to convey the best and latest information on repellents in order to enhance the knowledge of the scientific community. The review offers an overview of mosquito repellents, novel discoveries, and areas in need of attention such as novel repellent formulations and their future prospects. Copyright © 2016 Elsevier B.V. All rights reserved.
Blomkvist, A W; Andersen, S; de Bruin, E; Jorgensen, M G
2017-10-01
Lower limb weakness is an important risk factor for fall accidents and a predictor for all-cause mortality among older adults. Unilateral whole-lower limb strength may be a better measure of fall risk than the bilateral measure. In addition, a number of clinical conditions affect only one leg, and thus this type of assessment is relevant in clinical settings. To explore the intra-rater reproducibility of the Nintendo Wii Balance Board (WBB) to measure unilateral whole-lower limb strength and to compare the method with stationary isometric muscle apparatus (SID). Intra-rater test-retest design with 1 week between sessions. Thirty community-dwelling older adults (69 ± 4.2 years) were enrolled and examined for maximum lower limb strength in their dominant and non-dominant leg. Intraclass correlation coefficient (ICC) was calculated to describe relative reproducibility, while standard error of measurement (SEM), limits of agreement (LOA) and smallest real difference (SRD) were calculated to describe absolute reproducibility between test sessions. Concurrent validity with the SID was explored using the Pearson product-moment correlation coefficient (PCC). No systematic difference was observed between test sessions. ICC was 0.919-0.950 and SEM, LOA and SRD was 2.9-4.1 kg, 24.1-28.3 kg and 7.6-11.3 kg, respectively. Further, the PCC was 0.755 and 0.730 for the dominant limb and the non-dominant limb, respectively. A high relative and an acceptable absolute reproducibility was seen when using the Nintendo Wii Balance Board for testing unilateral lower limb strength in community-dwelling older adults. The WBB correlated strongly with the SID.
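A minimal sketch of the absolute-reproducibility statistics reported above (SEM, limits of agreement, smallest real difference), computed from paired test-retest measurements; the data are hypothetical, and the SEM is taken as the standard deviation of the differences divided by √2, one of several common definitions.

```python
import numpy as np

rng = np.random.default_rng(2)
session1 = rng.normal(35, 8, 30)                     # hypothetical strength (kg), test
session2 = session1 + rng.normal(0, 2.9, 30)         # retest one week later

diff = session2 - session1
sd_diff = diff.std(ddof=1)

sem = sd_diff / np.sqrt(2)                           # standard error of measurement
loa = (diff.mean() - 1.96 * sd_diff,                 # Bland-Altman limits of agreement
       diff.mean() + 1.96 * sd_diff)
srd = 1.96 * np.sqrt(2) * sem                        # smallest real difference

print(f"SEM ≈ {sem:.1f} kg, LOA ≈ [{loa[0]:.1f}, {loa[1]:.1f}] kg, SRD ≈ {srd:.1f} kg")
```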
NASA Astrophysics Data System (ADS)
Ngo, Son Tung; Nguyen, Minh Tung; Nguyen, Minh Tho
2017-05-01
The absolute binding free energy of an inhibitor to HIV-1 Protease (PR) was determined through evaluation of the non-bonded interaction energy difference between the bound and unbound states of the inhibitor and surrounding molecules, by the fast pulling of ligand (FPL) process using non-equilibrium molecular dynamics (NEMD) simulations. The calculated free energy difference terms help clarify the nature of the binding. Theoretical binding affinities are in good correlation with experimental data, with R = 0.89. The paradigm used is able to rank two inhibitors having a maximum difference of ∼1.5 kcal/mol in absolute binding free energies.
Duan, Yunbo; Beck, Thomas J; Wang, Xiao-Fang; Seeman, Ego
2003-10-01
The structural basis for sex differences in femoral neck (FN) fragility was studied in 1196 subjects and 307 patients with hip fracture. The absolute and relative patterns of modeling and remodeling on the periosteal and endocortical envelopes during growth and aging produce changes in FN geometry and structure that result in FN fragility in both sexes and sexual dimorphism in hip fracture risk in old age. Femoral neck (FN) fragility in old age is usually attributed to age-related bone loss, while the sex differences in hip fracture rate are attributed to less bone loss in men than in women. The purpose of this study was to define the structural and biomechanical basis underlying the increase in FN fragility in elderly men and women and the structural basis of sex differences in hip fracture incidence in old age. We measured FN dimensions and areal bone mineral density in 1196 healthy subjects (801 females) 18-92 years of age and 307 patients (180 females) with hip fracture using DXA. We then used the DXA-derived FN areal bone mineral density (BMD) and measured periosteal diameter to estimate endocortical diameter, cortical thickness, section modulus (a measure of bending strength), and buckling ratio (indices for structural stability). Neither FN cortical thickness nor volumetric density differed in young adult women and men after height and weight adjustment. The sex differences in geometry were confined to the further displacement of the cortex from the FN neutral axis in young men, which produced 13.4% greater bending strength than in young women. Aging amplified this geometric difference; widening of the periosteal and endocortical diameters continued in both sexes but was greater in men, shifting the cortex even further from the neutral axis and maintaining bending strength in men but not in women. In both sexes, less age-related periosteal than endocortical widening produced cortical thinning, increasing the risk of structural failure by local buckling of the enlarged, thin-walled FN. Relative to age-matched controls, women and men with hip fractures had reduced cortical thickness, but FN periosteal diameter was increased in women and reduced in men, differences that likely originated during growth. The absolute and relative patterns of modeling and remodeling on the periosteal and endocortical envelopes during growth and aging produce changes in FN diameters, cortical thickness, and geometry that result in FN fragility in both sexes and sexual dimorphism in hip fracture risk in old age.
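A minimal sketch of the two geometric indices mentioned, assuming the femoral neck cross-section is idealized as a circular annulus; the section-modulus and buckling-ratio expressions below are the standard annulus formulas, not necessarily the exact hip-structural-analysis implementation used in the study, and the dimensions are hypothetical.

```python
import math

def annulus_indices(periosteal_diameter_mm, cortical_thickness_mm):
    """Section modulus (mm^3) and buckling ratio for an annular cross-section."""
    d_o = periosteal_diameter_mm
    d_i = d_o - 2.0 * cortical_thickness_mm          # endocortical diameter
    section_modulus = math.pi * (d_o**4 - d_i**4) / (32.0 * d_o)
    buckling_ratio = (d_o / 2.0) / cortical_thickness_mm
    return section_modulus, buckling_ratio

# Hypothetical young vs elderly femoral necks: wider but thinner-walled with age
for label, (d_o, t) in {"young": (32.0, 2.4), "elderly": (35.0, 1.6)}.items():
    z, br = annulus_indices(d_o, t)
    print(f"{label:>7}: section modulus ≈ {z:,.0f} mm^3, buckling ratio ≈ {br:.1f}")
```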
Napper, Lucy E; Grimaldi, Elizabeth M; LaBrie, Joseph W
2015-03-01
The current study aims to examine discrepancies in parents' and college students' perceptions of alcohol risk and the role of perceived risk in predicting parents' intentions to discuss alcohol with their child. In total, 246 college student-parent dyads (56.1% female students, 77.2% mothers) were recruited from a mid-size university. Participants completed measures of absolute likelihood, comparative likelihood, and severity of alcohol consequences. In comparison to students, parents perceived the risks of alcohol poisoning (p<.001), academic impairment (p<.05), and problems with others (p<.05) to be more likely. In addition, parents rated the majority of alcohol consequences (e.g., passing out, regrettable sexual situation, throwing up) as more severe than students (all ps<.001). However, parents tended to be more optimistic than their child about the comparative likelihood of alcohol consequences. After controlling for demographics and past alcohol communication, greater absolute likelihood (β=.20, p=.016) and less confidence in knowledge of student behavior (β=.20, p=.013) predicted greater intentions to discuss alcohol. Providing parents of college students with information about college drinking norms and the likelihood of alcohol consequences may help prompt alcohol-related communication. Copyright © 2014 Elsevier Ltd. All rights reserved.
Whole body vibration exercise training for fibromyalgia.
Bidonde, Julia; Busch, Angela J; van der Spuy, Ina; Tupper, Susan; Kim, Soo Y; Boden, Catherine
2017-09-26
Exercise training is commonly recommended for adults with fibromyalgia. We defined whole body vibration (WBV) exercise as use of a vertical or rotary oscillating platform as an exercise stimulus while the individual engages in sustained static positioning or dynamic movements. The individual stands on the platform, and oscillations result in vibrations transmitted to the subject through the legs. This review is one of a series of reviews that replaces the first review published in 2002. To evaluate benefits and harms of WBV exercise training in adults with fibromyalgia. We searched the Cochrane Library, MEDLINE, Embase, CINAHL, PEDro, Thesis and Dissertation Abstracts, AMED, WHO ICTRP, and ClinicalTrials.gov up to December 2016, unrestricted by language, to identify potentially relevant trials. We included randomized controlled trials (RCTs) in adults with the diagnosis of fibromyalgia based on published criteria including a WBV intervention versus control or another intervention. Major outcomes were health-related quality of life (HRQL), pain intensity, stiffness, fatigue, physical function, withdrawals, and adverse events. Two review authors independently selected trials for inclusion, extracted data, performed risk of bias assessments, and assessed the quality of evidence for major outcomes using the GRADE approach. We used a 15% threshold for calculation of clinically relevant differences. We included four studies involving 150 middle-aged female participants from one country. Two studies had two treatment arms (71 participants) that compared WBV plus mixed exercise plus relaxation versus mixed exercise plus relaxation and placebo WBV versus control, and WBV plus mixed exercise versus mixed exercise and control; two studies had three treatment arms (79 participants) that compared WBV plus mixed exercise versus control and mixed relaxation placebo WBV. We judged the overall risk of bias as low for selection (random sequence generation), detection (objectively measured outcomes), attrition, and other biases; as unclear for selection bias (allocation concealment); and as high for performance, detection (self-report outcomes), and selective reporting biases.The WBV versus control comparison reported on three major outcomes assessed at 12 weeks post intervention based on the Fibromyalgia Impact Questionnaire (FIQ) (0 to 100 scale, lower score is better). Results for HRQL in the control group at end of treatment (59.13) showed a mean difference (MD) of -3.73 (95% confidence interval [CI] -10.81 to 3.35) for absolute HRQL, or improvement of 4% (11% better to 3% worse) and relative improvement of 6.7% (19.6% better to 6.1% worse). Results for withdrawals indicate that 14 per 100 and 10 per 100 in the intervention and control groups, respectively, withdrew from the intervention (RR 1.43, 95% CI 0.27 to 7.67; absolute change 4%, 95% CI 16% fewer to 24% more; relative change 43% more, 95% CI 73% fewer to 667% more). The only adverse event reported was acute pain in the legs, for which one participant dropped out of the program. We judged the quality of evidence for all outcomes as very low. This study did not measure pain intensity, fatigue, stiffness, or physical function. No outcomes in this comparison met the 15% threshold for clinical relevance.The WBV plus mixed exercise (aerobic, strength, flexibility, and relaxation) versus control study (N = 21) evaluated symptoms at six weeks post intervention using the FIQ. 
Results for HRQL at end of treatment (59.64) showed an MD of -16.02 (95% CI -31.57 to -0.47) for absolute HRQL, with improvement of 16% (0.5% to 32%) and relative change in HRQL of 24% (0.7% to 47%). Data showed a pain intensity MD of -28.22 (95% CI -43.26 to -13.18) for an absolute difference of 28% (13% to 43%) and a relative change of 39% improvement (18% to 60%); as well as a fatigue MD of -33 (95% CI -49 to -16) for an absolute difference of 33% (16% to 49%) and relative difference of 47% (95% CI 23% to 60%); and a stiffness MD of -26.27 (95% CI -42.96 to -9.58) for an absolute difference of 26% (10% to 43%) and a relative difference of 36.5% (23% to 60%). All-cause withdrawals occurred in 8 per 100 and 33 per 100 participants in the intervention and control groups, respectively (two studies, N = 46; RR 0.25, 95% CI 0.06 to 1.12) for an absolute risk difference of 24% (3% to 51%). One participant exhibited a mild anxiety attack at the first session of WBV. No studies in this comparison reported on physical function. Several outcomes (based on the findings of one study) in this comparison met the 15% threshold for clinical relevance: HRQL, pain intensity, fatigue, and stiffness, which improved by 16%, 39%, 46%, and 36%, respectively. We found evidence of very low quality for all outcomes. The WBV plus mixed exercise versus other exercise comparison provided very low quality evidence for all outcomes. Investigators evaluated outcomes on a 0 to 100 scale (lower score is better) for pain intensity (one study, N = 23; MD -16.36, 95% CI -29.49 to -3.23), HRQL (two studies, N = 49; MD -6.67, 95% CI -14.65 to 1.31), fatigue (one study, N = 23; MD -14.41, 95% CI -29.47 to 0.65), stiffness (one study, N = 23; MD -12.72, 95% CI -26.90 to 1.46), and all-cause withdrawal (three studies, N = 77; RR 0.72, 95% CI -0.17 to 3.11). Adverse events reported for the three studies included one anxiety attack at the first session of WBV and one dropout from the comparison group ("other exercise group") due to an injury that was not related to the program. No studies reported on physical function. Whether WBV or WBV in addition to mixed exercise is superior to control or another intervention for women with fibromyalgia remains uncertain. The quality of evidence is very low owing to imprecision (few study participants and wide confidence intervals) and issues related to risk of bias. These trials did not measure major outcomes such as pain intensity, stiffness, fatigue, and physical function. Overall, studies were few and were very small, which prevented meaningful estimates of harms and definitive conclusions about WBV safety.
Liew, Aaron Y L; Piran, Siavash; Eikelboom, John W; Douketis, James D
2017-04-01
Extended-duration pharmacological thromboprophylaxis, for at least 28 days, is effective for the prevention of symptomatic venous thromboembolism (VTE) in high-risk surgical patients but is of uncertain benefit in hospitalized medical patients. We aimed to evaluate the efficacy and safety of extended-duration thromboprophylaxis in hospitalized medical patients. We conducted a systematic PubMed, Medline and EMBASE literature search until June 2016 and a meta-analysis of randomized controlled trials which compared extended-duration with short-duration thromboprophylaxis in hospitalized medical patients. Four randomized controlled trials comparing extended-duration prophylaxis (24-47 days) with short-duration prophylaxis (6-14 days) in a total of 34,068 acutely ill hospitalized medical patients were included. When compared with short-duration prophylaxis, extended-duration prophylaxis was associated with a decrease in symptomatic proximal or distal deep vein thrombosis (DVT) [relative risk (RR) = 0.52; 95% confidence interval (CI): 0.35-0.77; p = 0.001; absolute risk reduction (ARR) = 0.32%, number needed to treat (NNT) = 313], and symptomatic non-fatal pulmonary embolism (RR = 0.61; 95% CI 0.38-0.99; p = 0.04; ARR = 0.16%; NNT = 625), an increase in major bleeding (RR = 2.08; 95% CI 1.50-2.90; p < 0.0001, absolute risk increase = 0.41%, number needed to harm = 244), and no significant reduction in VTE-related mortality (RR = 0.69; 95% CI 0.45-1.06; p = 0.09) or all-cause mortality (RR = 1.00; 95% CI 0.89-1.12; p = 0.95). There was heterogeneity for major bleeding due to results from the APEX trial (no difference between betrixaban and enoxaparin). Compared with short-duration thromboprophylaxis, extended-duration treatment reduces the risk for symptomatic DVT and non-fatal pulmonary embolism. Extended treatment with apixaban, enoxaparin and rivaroxaban but not betrixaban increases the risk for major bleeding.
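The absolute risk reduction and number needed to treat quoted above follow directly from the relative risk and the control-group event rate. Below is a minimal sketch using the reported DVT result (RR 0.52, ARR 0.32%, NNT ≈ 313); the control event rate is back-calculated from those figures and is approximate.

```python
def arr_nnt(control_event_rate, relative_risk):
    """Absolute risk reduction and number needed to treat from a relative risk."""
    arr = control_event_rate * (1.0 - relative_risk)
    return arr, 1.0 / arr

# DVT outcome from the meta-analysis: RR 0.52 with a reported ARR of 0.32%
control_rate = 0.0032 / (1.0 - 0.52)     # implied control event rate ≈ 0.67%
arr, nnt = arr_nnt(control_rate, 0.52)
print(f"implied control rate ≈ {control_rate:.2%}, ARR ≈ {arr:.2%}, NNT ≈ {nnt:.0f}")
```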
De Ridder, Karin A A; Pape, Kristine; Cuypers, Koenraad; Johnsen, Roar; Holmen, Turid Lingaas; Westin, Steinar; Bjørngaard, Johan Håkon
2013-10-09
High school dropout and long-term sickness absence/disability pension in young adulthood are strongly associated. We investigated whether common risk factors in adolescence may confound this association. Data from 6612 school-attending adolescents (13-20 years old) participating in the Norwegian Young-HUNT1 Survey (1995-1997) were linked to long-term sickness absence or disability pension from age 24-29 years old, recorded in the Norwegian Labour and Welfare Organisation registers (1998-2008). We used logistic regression to estimate risk differences of sickness or disability for school dropouts versus completers, adjusting for health, health-related behaviours, psychosocial factors, school problems, and parental socioeconomic position. In addition, we stratified the regression models of sickness and disability following dropout across the quintiles of the propensity score for high school dropout. The crude absolute risk difference for long-term sickness or disability for a school dropout compared to a completer was 0.21, or 21 percentage points (95% confidence interval (CI), 17 to 24). The adjusted risk difference was reduced to 15 percentage points (95% CI, 12 to 19). Overall, high school dropout increased the risk for sickness or disability regardless of the level of risk factors for dropout. High school dropouts have a strongly increased risk for sickness and disability in young adulthood across all quintiles of the propensity score for dropout, i.e. independent of own health, family and socioeconomic factors in adolescence. These findings reveal the importance of early prevention of dropout where possible, combined with increased attention to labour market integration and targeted support for those who fail to complete school.
Nilsson, Lars B; Skansen, Patrik
2012-06-30
The investigations in this article were triggered by two observations in the laboratory; for some liquid chromatography/tandem mass spectrometry (LC/MS/MS) systems it was possible to obtain linear calibration curves for extreme concentration ranges and for some systems seemingly linear calibration curves gave good accuracy at low concentrations only when using a quadratic regression function. The absolute and relative responses were tested for three different LC/MS/MS systems by injecting solutions of a model compound and a stable isotope labeled internal standard. The analyte concentration range for the solutions was 0.00391 to 500 μM (128,000×), giving overload of the chromatographic column at the highest concentrations. The stable isotope labeled internal standard concentration was 0.667 μM in all samples. The absolute response per concentration unit decreased rapidly as higher concentrations were injected. The relative response, the ratio for the analyte peak area to the internal standard peak area, per concentration unit was calculated. For system 1, the ionization process was found to limit the response and the relative response per concentration unit was constant. For systems 2 and 3, the ion detection process was the limiting factor resulting in decreasing relative response at increasing concentrations. For systems behaving like system 1, simple linear regression can be used for any concentration range while, for systems behaving like systems 2 and 3, non-linear regression is recommended for all concentration ranges. Another consequence is that the ionization capacity limited systems will be insensitive to matrix ion suppression when an ideal internal standard is used while the detection capacity limited systems are at risk of giving erroneous results at high concentrations if the matrix ion suppression varies for different samples in a run. Copyright © 2012 John Wiley & Sons, Ltd.
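A minimal sketch of the regression comparison implied above, assuming a hypothetical detector whose response per concentration unit falls off at high concentration: unweighted linear and quadratic calibration functions are fitted over a very wide range and their accuracy at a low concentration is compared. The response model and all numbers are illustrative, not the article's data.

```python
import numpy as np

# Hypothetical calibration points spanning a very wide concentration range (µM)
conc = np.array([0.004, 0.02, 0.1, 0.5, 2.5, 12.5, 62.5, 312.5, 500.0])
# Hypothetical detector: response per concentration unit drops slightly at high load
ratio = 0.95 * conc * (1.0 - 1.5e-4 * conc)

lin = np.poly1d(np.polyfit(conc, ratio, 1))    # unweighted linear calibration
quad = np.poly1d(np.polyfit(conc, ratio, 2))   # quadratic calibration

low = 0.02                                      # low-concentration QC level (µM)
true_resp = 0.95 * low * (1.0 - 1.5e-4 * low)
for name, fit in (("linear", lin), ("quadratic", quad)):
    rel_err = (fit(low) - true_resp) / true_resp
    print(f"{name:9s} fit: relative error at {low} µM ≈ {rel_err:+.0%}")
```

The point of the sketch is only that an unweighted straight line that looks acceptable over the whole range can be badly biased at the low end when the true response bends, whereas a quadratic (or an appropriately weighted) fit recovers the low-concentration behaviour.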
12 CFR 217.152 - Simple risk weight approach (SRWA).
Code of Federal Regulations, 2014 CFR
2014-01-01
... than or equal to -1 (that is, between zero and -1), then E equals the absolute value of RVC. If RVC is... this section. (1) Zero percent risk weight equity exposures. An equity exposure to an entity whose credit exposures are exempt from the 0.03 percent PD floor in § 217.131(d)(2) is assigned a zero percent...
Cassagne, E; Caillaud, P D; Besancenot, J P; Thibaudon, M
2007-10-01
Pollen of Poaceae is, together with birch pollen, among the most allergenic pollen in Europe. It is therefore useful to develop models to help pollen allergy sufferers. The objective of this study was to construct forecast models that could predict the first day characterized by a certain level of allergic risk, called here the Starting Date of the Allergic Risk (SDAR). The models were built from four forecast methods used in the literature (three summing methods and one multiple regression analysis). They were applied to data from Nancy and Strasbourg from 1988 to 2005 and were tested on 2006. Mean absolute error and an actual-forecast-ability test were the criteria used to select the best models and to assess and compare their accuracy. Overall, all the models showed good and broadly equivalent forecast accuracy. They were all reliable and were used to forecast the SDAR in 2006, with contrasting forecasting precision.
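A minimal sketch of the mean-absolute-error criterion used to rank such forecast methods, computed on hypothetical observed and predicted SDAR dates expressed as day of year; the method labels and values are illustrative only.

```python
import numpy as np

observed_sdar = np.array([128, 121, 135, 130, 126])       # observed SDAR, day of year (hypothetical)
predicted_sdar = {
    "summing method":    np.array([131, 119, 138, 127, 129]),
    "regression method": np.array([126, 123, 133, 132, 125]),
}

for name, pred in predicted_sdar.items():
    mae = np.mean(np.abs(pred - observed_sdar))            # mean absolute error in days
    print(f"{name:18s} MAE = {mae:.1f} days")
```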
Carnell, S; Pryor, K; Mais, L A; Warkentin, S; Benson, L; Cheng, R
2016-08-01
Children's appetitive characteristics measured by parent-report questionnaires are reliably associated with body weight, as well as behavioral tests of appetite, but relatively little is known about relationships with food choice. As part of a larger preloading study, we served 4-5year olds from primary school classes five school lunches at which they were presented with the same standardized multi-item meal. Parents completed Child Eating Behavior Questionnaire (CEBQ) sub-scales assessing satiety responsiveness (CEBQ-SR), food responsiveness (CEBQ-FR) and enjoyment of food (CEBQ-EF), and children were weighed and measured. Despite differing preload conditions, children showed remarkable consistency of intake patterns across all five meals with day-to-day intra-class correlations in absolute and percentage intake of each food category ranging from 0.78 to 0.91. Higher CEBQ-SR was associated with lower mean intake of all food categories across all five meals, with the weakest association apparent for snack foods. Higher CEBQ-FR was associated with higher intake of white bread and fruits and vegetables, and higher CEBQ-EF was associated with greater intake of all categories, with the strongest association apparent for white bread. Analyses of intake of each food group as a percentage of total intake, treated here as an index of the child's choice to consume relatively more or relatively less of each different food category when composing their total lunch-time meal, further suggested that children who were higher in CEBQ-SR ate relatively more snack foods and relatively less fruits and vegetables, while children with higher CEBQ-EF ate relatively less snack foods and relatively more white bread. Higher absolute intakes of white bread and snack foods were associated with higher BMI z score. CEBQ sub-scale associations with food intake variables were largely unchanged by controlling for daily metabolic needs. However, descriptive comparisons of lunch intakes with expected amounts based on metabolic needs suggested that overweight/obese boys were at particularly high risk of overeating. Parents' reports of children's appetitive characteristics on the CEBQ are associated with differential patterns of food choice as indexed by absolute and relative intake of various food categories assessed on multiple occasions in a naturalistic, school-based setting, without parents present. Copyright © 2016 Elsevier Inc. All rights reserved.
Carnell, S; Pryor, K; Mais, LA; Warkentin, S; Benson, L; Cheng, R
2016-01-01
Children’s appetitive characteristics measured by parent-report questionnaires are reliably associated with body weight, as well as behavioral tests of appetite, but relatively little is known about relationships with food choice. As part of a larger preloading study, we served 4-5y olds from primary school classes five school lunches at which they were presented with the same standardized multi-item meal. Parents completed Child Eating Behavior Questionnaire (CEBQ) sub-scales assessing satiety responsiveness (CEBQ-SR), food responsiveness (CEBQ-FR) and enjoyment of food (CEBQ-EF), and children were weighed and measured. Despite differing preload conditions, children showed remarkable consistency of intake patterns across all five meals with day-to-day intra-class correlations in absolute and percentage intake of each food category ranging from .78 to .91. Higher CEBQ-SR was associated with lower mean intake of all food categories across all five meals, with the weakest association apparent for snack foods. Higher CEBQ-FR was associated with higher intake of white bread and fruits and vegetables, and higher CEBQ-EF was associated with greater intake of all categories, with the strongest association apparent for white bread. Analyses of intake of each food group as a percentage of total intake, treated here as an index of the child’s choice to consume relatively more or relatively less of each different food category when composing their total lunch-time meal, further suggested that children who were higher in CEBQ-SR ate relatively more snack foods and relatively less fruits and vegetables, while children with higher CEBQ-EF ate relatively less snack foods and relatively more white bread. Higher absolute intakes of white bread and snack foods were associated with higher BMI z score. CEBQ sub-scale associations with food intake variables were largely unchanged by controlling for daily metabolic needs. However, descriptive comparisons of lunch intakes with expected amounts based on metabolic needs suggested that overweight/obese boys were at particularly high risk of overeating. Parents’ reports of children’s appetitive characteristics on the CEBQ are associated with differential patterns of food choice as indexed by absolute and relative intake of various food categories assessed on multiple occasions in a naturalistic, school-based setting, without parents present. PMID:27039281
9 CFR 439.10 - Criteria for obtaining accreditation.
Code of Federal Regulations, 2014 CFR
2014-01-01
... absolute value of the average standardized difference must not exceed the following: (i) For food chemistry... samples must be less than 5.0. A result will have a large deviation measure equal to zero when the absolute value of the result's standardized difference, (d), is less than 2.5 and otherwise a measure equal...
9 CFR 439.10 - Criteria for obtaining accreditation.
Code of Federal Regulations, 2012 CFR
2012-01-01
... absolute value of the average standardized difference must not exceed the following: (i) For food chemistry... samples must be less than 5.0. A result will have a large deviation measure equal to zero when the absolute value of the result's standardized difference, (d), is less than 2.5 and otherwise a measure equal...
9 CFR 439.10 - Criteria for obtaining accreditation.
Code of Federal Regulations, 2013 CFR
2013-01-01
... absolute value of the average standardized difference must not exceed the following: (i) For food chemistry... samples must be less than 5.0. A result will have a large deviation measure equal to zero when the absolute value of the result's standardized difference, (d), is less than 2.5 and otherwise a measure equal...
9 CFR 439.10 - Criteria for obtaining accreditation.
Code of Federal Regulations, 2011 CFR
2011-01-01
... absolute value of the average standardized difference must not exceed the following: (i) For food chemistry... samples must be less than 5.0. A result will have a large deviation measure equal to zero when the absolute value of the result's standardized difference, (d), is less than 2.5 and otherwise a measure equal...
9 CFR 439.10 - Criteria for obtaining accreditation.
Code of Federal Regulations, 2010 CFR
2010-01-01
... absolute value of the average standardized difference must not exceed the following: (i) For food chemistry... samples must be less than 5.0. A result will have a large deviation measure equal to zero when the absolute value of the result's standardized difference, (d), is less than 2.5 and otherwise a measure equal...
Hamilton, S J
2017-05-22
Electrical impedance tomography (EIT) is an emerging imaging modality that uses harmless electrical measurements taken on electrodes at a body's surface to recover information about the internal electrical conductivity and/or permittivity. Image reconstruction in EIT is a highly nonlinear inverse problem that is sensitive to noise and modeling errors, which makes the reconstruction task challenging. D-bar methods solve the nonlinear problem directly, bypassing the need for detailed and time-intensive forward models, to provide absolute (static) as well as time-difference EIT images. Coupling the D-bar methodology with the inclusion of high confidence a priori data results in a noise-robust regularized image reconstruction method. In this work, the a priori D-bar method for complex admittivities is demonstrated effective on experimental tank data for absolute imaging for the first time. Additionally, the method is adjusted for, and tested on, time-difference imaging scenarios. The ability of the method to be used for conductivity, permittivity, absolute as well as time-difference imaging provides the user with great flexibility without a high computational cost.
Sheridan, Stacey L; Pignone, Michael P; Lewis, Carmen L
2003-11-01
Commentators have suggested that patients may understand quantitative information about treatment benefits better when they are presented as numbers needed to treat (NNT) rather than as absolute or relative risk reductions. To determine whether NNT helps patients interpret treatment benefits better than absolute risk reduction (ARR), relative risk reduction (RRR), or a combination of all three of these risk reduction presentations (COMBO). Randomized cross-sectional survey. University internal medicine clinic. Three hundred fifty-seven men and women, ages 50 to 80, who presented for health care. Subjects were given written information about the baseline risk of a hypothetical "disease Y" and were asked (1) to compare the benefits of two drug treatments for disease Y, stating which provided more benefit; and (2) to calculate the effect of one of those drug treatments on a given baseline risk of disease. Risk information was presented to each subject in one of four randomly allocated risk formats: NNT, ARR, RRR, or COMBO. When asked to state which of two treatments provided more benefit, subjects who received the RRR format responded correctly most often (60% correct vs 43% for COMBO, 42% for ARR, and 30% for NNT, P =.001). Most subjects were unable to calculate the effect of drug treatment on the given baseline risk of disease, although subjects receiving the RRR and ARR formats responded correctly more often (21% and 17% compared to 7% for COMBO and 6% for NNT, P =.004). Patients are best able to interpret the benefits of treatment when they are presented in an RRR format with a given baseline risk of disease. ARR also is easily interpreted. NNT is often misinterpreted by patients and should not be used alone to communicate risk to patients.
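The three formats compared in this study are related by simple arithmetic. As a worked illustration with hypothetical baseline and treated risks (not the study's "disease Y" figures):

```python
# Hypothetical worked example relating ARR, RRR and NNT.
baseline_risk = 0.20   # assumed risk of the outcome without treatment
treated_risk = 0.15    # assumed risk of the outcome with treatment

arr = baseline_risk - treated_risk   # absolute risk reduction: 0.05 (5 percentage points)
rrr = arr / baseline_risk            # relative risk reduction: 0.25 (25%)
nnt = 1 / arr                        # number needed to treat: 20 patients

print(f"ARR = {arr:.2f}, RRR = {rrr:.0%}, NNT = {nnt:.0f}")
```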
Individual risk of cutaneous melanoma in New Zealand: developing a clinical prediction aid.
Sneyd, Mary Jane; Cameron, Claire; Cox, Brian
2014-05-22
New Zealand and Australia have the highest melanoma incidence rates worldwide. In New Zealand, both the incidence and thickness have been increasing. Clinical decisions require accurate risk prediction but a simple list of genetic, phenotypic and behavioural risk factors is inadequate to estimate individual risk as the risk factors for melanoma have complex interactions. In order to offer tailored clinical management strategies, we developed a New Zealand prediction model to estimate individual 5-year absolute risk of melanoma. A population-based case-control study (368 cases and 270 controls) of melanoma risk factors provided estimates of relative risks for fair-skinned New Zealanders aged 20-79 years. Model selection techniques and multivariate logistic regression were used to determine the important predictors. The relative risks for predictors were combined with baseline melanoma incidence rates and non-melanoma mortality rates to calculate individual probabilities of developing melanoma within 5 years. For women, the best model included skin colour, number of moles ≥5 mm on the right arm, having a 1st degree relative with large moles, and a personal history of non-melanoma skin cancer (NMSC). The model correctly classified 68% of participants; the C-statistic was 0.74. For men, the best model included age, place of occupation up to age 18 years, number of moles ≥5 mm on the right arm, birthplace, and a history of NMSC. The model correctly classified 67% of cases; the C-statistic was 0.71. We have developed the first New Zealand risk prediction model that calculates individual absolute 5-year risk of melanoma. This model will aid physicians to identify individuals at high risk, allowing them to individually target surveillance and other management strategies, and thereby reduce the high melanoma burden in New Zealand.
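The step from model-based relative risks to a 5-year absolute risk can be sketched as a discrete-time calculation in which the individual's relative risk scales the baseline incidence and competing (non-melanoma) mortality discounts future years. The sketch below is schematic, and the relative risk and annual rates are placeholders, not values from the New Zealand model:

```python
import math

def five_year_absolute_risk(rr, baseline_hazard, competing_hazard):
    """Schematic, discrete-time (annual) absolute risk calculation.

    rr               -- individual's combined relative risk from the logistic model (assumed)
    baseline_hazard  -- five annual baseline melanoma incidence rates (placeholders)
    competing_hazard -- five annual non-melanoma mortality rates (placeholders)
    """
    alive_and_free = 1.0   # probability of being alive and melanoma-free at the start of each year
    risk = 0.0
    for h_mel, h_death in zip(baseline_hazard, competing_hazard):
        h1 = h_mel * rr
        risk += alive_and_free * h1                  # chance of a first melanoma during this year
        alive_and_free *= math.exp(-(h1 + h_death))  # survive the year free of melanoma and death
    return risk

# placeholder inputs purely for illustration, not New Zealand rates
print(five_year_absolute_risk(rr=2.0,
                              baseline_hazard=[0.001] * 5,
                              competing_hazard=[0.01] * 5))
```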
ERIC Educational Resources Information Center
Boye, Katarina
2009-01-01
Absolute as well as relative hours of paid and unpaid work may influence well-being. This study investigates whether absolute hours spent on paid work and housework account for the lower well-being among women as compared to men in Europe, and whether the associations between well-being and hours of paid work and housework differ by gender…
Delusion proneness and 'jumping to conclusions': relative and absolute effects.
van der Leer, L; Hartig, B; Goldmanis, M; McKay, R
2015-04-01
That delusional and delusion-prone individuals 'jump to conclusions' is one of the most robust and important findings in the literature on delusions. However, although the notion of 'jumping to conclusions' (JTC) implies gathering insufficient evidence and reaching premature decisions, previous studies have not investigated whether the evidence gathering of delusion-prone individuals is, in fact, suboptimal. The standard JTC effect is a relative effect but using relative comparisons to substantiate absolute claims is problematic. In this study we investigated whether delusion-prone participants jump to conclusions in both a relative and an absolute sense. Healthy participants (n = 112) completed an incentivized probabilistic reasoning task in which correct decisions were rewarded and additional information could be requested for a small price. This combination of rewards and costs generated optimal decision points. Participants also completed measures of delusion proneness, intelligence and risk aversion. Replicating the standard relative finding, we found that delusion proneness significantly predicted task decisions, such that the more delusion prone the participants were, the earlier they decided. This finding was robust when accounting for the effects of risk aversion and intelligence. Importantly, high-delusion-prone participants also decided in advance of an objective rational optimum, gathering fewer data than would have maximized their expected payoff. Surprisingly, we found that even low-delusion-prone participants jumped to conclusions in this absolute sense. Our findings support and clarify the claim that delusion formation is associated with a tendency to 'jump to conclusions'. In short, most people jump to conclusions, but more delusion-prone individuals 'jump further'.
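The "objective rational optimum" in such a task is defined by the trade-off between the reward for a correct decision and the price of each extra piece of information. The sketch below shows a simplified, one-step (myopic) value-of-information rule for a two-urn beads-style task; the urn proportion, reward and cost are assumptions for illustration, and this is not the study's actual payoff structure or its full optimal-stopping solution:

```python
def should_draw_again(n_a, n_b, reward, cost, p=0.6):
    """One-step (myopic) value-of-information rule for a two-urn beads-style task.

    n_a, n_b -- beads of colour A and colour B seen so far
    p        -- majority-colour proportion in each urn (assumed 60/40 split)
    reward   -- payoff for a correct final decision (assumed)
    cost     -- price of viewing one extra bead (assumed)
    Returns True if one more paid draw has higher expected value than deciding now.
    """
    def best_value(a, b):
        # posterior that urn A (majority colour A) is the source, given counts a, b
        like_a = p ** a * (1 - p) ** b
        like_b = p ** b * (1 - p) ** a
        post_a = like_a / (like_a + like_b)
        return reward * max(post_a, 1 - post_a), post_a

    value_now, post_a = best_value(n_a, n_b)
    p_next_a = post_a * p + (1 - post_a) * (1 - p)   # predictive probability the next bead is colour A
    value_draw = (p_next_a * best_value(n_a + 1, n_b)[0]
                  + (1 - p_next_a) * best_value(n_a, n_b + 1)[0]) - cost
    return value_draw > value_now

# after three A beads and one B bead, is a further 0.05-unit draw worth it for a 1-unit reward?
print(should_draw_again(n_a=3, n_b=1, reward=1.0, cost=0.05))
```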
Sideri, M.; Gulmini, C.; Igidbashian, S.; Tricca, A.; Casadio, C.; Carinelli, S.; Boveri, S.; Ejegod, D.; Bonde, J.; Sandri, M. T.
2015-01-01
Analytical and clinical performance validation is essential before introduction of a new human papillomavirus (HPV) assay into clinical practice. This study compares the new BD Onclarity HPV assay, which detects E6/E7 DNA from 14 high-risk HPV types, to the Hybrid Capture II (HC2) HPV DNA test, to concurrent cytology and histology results, in order to evaluate its performance in detecting high-grade cervical lesions. A population of 567 women, including 325 with ≥ASCUS (where ASCUS stands for atypical cells of undetermined significance) and any HC2 result and 242 with both negative cytology and negative HC2 results, were prospectively enrolled for the study. The overall agreement between Onclarity and HC2 was 94.6% (95% confidence intervals [CI], 92.3% to 96.2%). In this population with a high prevalence of disease, the relative sensitivities (versus adjudicated cervical intraepithelial neoplasia grades 2 and 3 [CIN2+] histology endpoints) of the Onclarity and HC2 tests were 95.2% (95% CI, 90.7% to 97.5%) and 96.9% (95% CI, 92.9% to 98.7%), respectively, and the relative specificities were 50.3% (95% CI, 43.2% to 57.4%) for BD and 40.8% (95% CI, 33.9%, 48.1%) for HC2. These results indicate that the BD Onclarity HPV assay has sensitivity comparable to that of the HC2 assay, with a trend to an increased specificity. Moreover, as Onclarity gives the chance to discriminate between the different genotypes, we calculated the genotype prevalence and the absolute risk of CIN2+: HPV 16 was the most prevalent genotype (19.8%) with an absolute risk of CIN2+ of 77.1%. PMID:25903574
Hartvigsen, Jan; Davidsen, Michael; Søgaard, Karen; Roos, Ewa M; Hestbaek, Lise
2014-11-01
Musculoskeletal pain and disability is a modern epidemic and a major reason for seeking health care. The aim of this study is to determine absolute and relative rates of care seeking over 20 years for adults reporting musculoskeletal complaints. Interview data on musculoskeletal pain reported during the past two weeks from the Danish National Cohort Study were merged with data from the Danish National Health Insurance Registry and the National Patient Registry containing information on consultations in the Danish primary and secondary care sector. Absolute and relative rates for all seeking of care with general practitioners, physiotherapists, chiropractors, outpatient hospital contacts and hospital admissions are reported for persons reporting no musculoskeletal pain and for persons reporting pain in the neck, shoulder, wrist/hands, mid back, low back, hips, knees and ankles/feet. Regardless of site, persons experiencing a musculoskeletal complaint had a statistically increased risk of consulting a general practitioner when compared with persons reporting no musculoskeletal complaint. For physiotherapists and chiropractors, only persons complaining of neck pain and back pain had an increased risk of seeking care. Regardless of pain site, except for shoulder pain, persons reporting musculoskeletal pain had a statistically significant increased risk of outpatient hospital consultations and hospital admissions. Few differences were found between pain sites in relation to any of the outcomes. Self-report of musculoskeletal pain reported within the past two weeks predicts a statistically significant long-term increase in general use of health care services in both the primary and the secondary health care sector. © 2014 the Nordic Societies of Public Health.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tucker, Susan L.; Liu, H. Helen; Wang, Shulian
Purpose: The aim of this study was to investigate the effect of radiation dose distribution in the lung on the risk of postoperative pulmonary complications among esophageal cancer patients. Methods and Materials: We analyzed data from 110 patients with esophageal cancer treated with concurrent chemoradiotherapy followed by surgery at our institution from 1998 to 2003. The endpoint for analysis was postsurgical pneumonia or acute respiratory distress syndrome. Dose-volume histograms (DVHs) and dose-mass histograms (DMHs) for the whole lung were used to fit normal-tissue complication probability (NTCP) models, and the quality of the fits was compared using bootstrap analysis. Results: Normal-tissue complication probability modeling identified that the risk of postoperative pulmonary complications was most significantly associated with small absolute volumes of lung spared from doses ≥5 Gy (VS5), that is, exposed to doses <5 Gy. However, bootstrap analysis found no significant difference between the quality of this model and fits based on other dosimetric parameters, including mean lung dose, effective dose, and relative volume of lung receiving ≥5 Gy, probably because of correlations among these factors. The choice of DVH vs. DMH or the use of fractionation correction did not significantly affect the results of the NTCP modeling. The parameter values estimated for the Lyman NTCP model were as follows (with 95% confidence intervals in parentheses): n = 1.85 (0.04, ∞), m = 0.55 (0.22, 1.02), and D5 = 17.5 Gy (9.4 Gy, 102 Gy). Conclusions: In this cohort of esophageal cancer patients, several dosimetric parameters including mean lung dose, effective dose, and absolute volume of lung receiving <5 Gy provided similar descriptions of the risk of postoperative pulmonary complications as a function of radiation dose distribution in the lung.
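For reference, the Lyman NTCP model whose fitted parameters (n, m and a position parameter) are reported above has the standard Lyman-Kutcher-Burman form given below; these are textbook equations (with the position parameter written TD50), not expressions quoted from the paper:

\[
\mathrm{NTCP} = \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{t} e^{-x^{2}/2}\,dx,
\qquad
t = \frac{D_{\mathrm{eff}} - TD_{50}}{m\,TD_{50}},
\qquad
D_{\mathrm{eff}} = \Bigl(\sum_{i} v_{i}\,D_{i}^{1/n}\Bigr)^{n},
\]

where \(v_i\) is the fractional volume (or mass, for a DMH) of lung receiving dose \(D_i\), \(n\) controls the strength of the volume effect, and \(m\) sets the steepness of the dose-response curve.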
[Patent foramen ovale and decompression illness in divers].
Sivertsen, Wiebke; Risberg, Jan; Norgård, Gunnar
2010-04-22
About 25% of the population has a patent foramen ovale, and the condition has been assumed to be a causal factor in decompression illness. Transcatheter closure is possible and is associated with a relatively low risk, but it has not been clarified whether there is an indication for assessment and treatment of the condition in divers. The present study explored a possible relationship between a patent foramen ovale and the risk for decompression illness in divers, whether there are categories of divers that should be screened for the condition, and what advice should be given to divers with this condition. The review is based on literature identified through a search in Pubmed and the authors' long clinical experience in the field. The risk of decompression illness for divers with a patent foramen ovale is about five times higher than that in divers without this condition, but the absolute risk for decompression illness is only about 2.5 per 10,000 dives. A causal association has not been shown between patent foramen ovale and decompression illness. Even if closure of patent foramen ovale may be done with relatively small risk, the usefulness of the procedure has not been documented in divers. We do not recommend screening for patent foramen ovale in divers because the absolute risk of decompression illness is small and transcatheter closure is only indicated after decompression illness in some occupational divers.
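Putting the quoted relative and absolute figures together (illustrative arithmetic only, assuming the 2.5 per 10,000 figure applies to divers with the defect and the fivefold relative risk is exact):

\[
\underbrace{\frac{2.5}{10\,000}}_{\text{risk with PFO}} \;-\; \underbrace{\frac{2.5/5}{10\,000}}_{\text{risk without PFO}} \;=\; \frac{2}{10\,000}\ \text{per dive},
\]

i.e. roughly one additional case of decompression illness per 5,000 dives is attributable to the patent foramen ovale, which is why screening is not recommended despite the fivefold relative risk.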
A change in paradigm: lowering blood pressure in everyone over a certain age.
Law, Malcolm
2012-06-01
Dividing people into 'hypertensives' and 'normotensives' is commonplace but problematic. The relationship between blood pressure and cardiovascular disease is continuous. The Prospective Studies Collaboration analysis shows a continuous straight line dose-response relationship across the entire population down to blood pressure levels of 115 mmHg systolic and 75 mmHg diastolic, the confidence limits on the individual data points being sufficiently narrow to exclude even a minor deviation from a linear relationship. Meta-analysis of randomized controlled trials shows that blood pressure-lowering drugs produce similar proportional reductions in risk of coronary heart disease (CHD) and stroke irrespective of pre-treatment blood pressure, down to levels of 110 mmHg systolic and 70 mmHg diastolic. There are also now sufficient trial data to show a statistically significant risk reduction in 'normotensive' people without known vascular disease on entry. The straight line (log-linear) relationship means that the benefit derived from lowering blood pressure is proportional to existing risk, so the decision on whom to treat with blood pressure-lowering drugs should depend on a person's overall absolute risk irrespective of blood pressure. In primary prevention, basing treatment on age alone rather than overall absolute risk entails little loss of efficacy and may be preferred on the basis of simplicity and avoidance of anxiety in telling people they are at elevated risk.
Hernández, Gonzalo; Vaquero, Concepción; Colinas, Laura; Cuena, Rafael; González, Paloma; Canabal, Alfonso; Sanchez, Susana; Rodriguez, Maria Luisa; Villasclaras, Ana; Fernández, Rafael
2016-10-18
High-flow conditioned oxygen therapy delivered through nasal cannulae and noninvasive mechanical ventilation (NIV) may reduce the need for reintubation. Among the advantages of high-flow oxygen therapy are comfort, availability, lower costs, and additional physiopathological mechanisms. To test if high-flow conditioned oxygen therapy is noninferior to NIV for preventing postextubation respiratory failure and reintubation in patients at high risk of reintubation. Multicenter randomized clinical trial in 3 intensive care units in Spain (September 2012-October 2014) including critically ill patients ready for planned extubation with at least 1 of the following high-risk factors for reintubation: older than 65 years; Acute Physiology and Chronic Health Evaluation II score higher than 12 points on extubation day; body mass index higher than 30; inadequate secretions management; difficult or prolonged weaning; more than 1 comorbidity; heart failure as primary indication for mechanical ventilation; moderate to severe chronic obstructive pulmonary disease; airway patency problems; or prolonged mechanical ventilation. Patients were randomized to undergo either high-flow conditioned oxygen therapy or NIV for 24 hours after extubation. Primary outcomes were reintubation and postextubation respiratory failure within 72 hours. Noninferiority margin was 10 percentage points. Secondary outcomes included respiratory infection, sepsis, and multiple organ failure, length of stay and mortality; adverse events; and time to reintubation. Of 604 patients (mean age, 65 [SD, 16] years; 388 [64%] men), 314 received NIV and 290 high-flow oxygen. Sixty-six patients (22.8%) in the high-flow group vs 60 (19.1%) in the NIV group were reintubated (absolute difference, -3.7%; 95% CI, -9.1% to ∞); 78 patients (26.9%) in the high-flow group vs 125 (39.8%) in the NIV group experienced postextubation respiratory failure (risk difference, 12.9%; 95% CI, 6.6% to ∞) [corrected]. Median time to reintubation did not significantly differ: 26.5 hours (IQR, 14-39 hours) in the high-flow group vs 21.5 hours (IQR, 10-47 hours) in the NIV group (absolute difference, -5 hours; 95% CI, -34 to 24 hours). Median postrandomization ICU length of stay was lower in the high-flow group, 3 days (IQR, 2-7) vs 4 days (IQR, 2-9; P=.048). Other secondary outcomes were similar in the 2 groups. Adverse effects requiring withdrawal of the therapy were observed in none of the patients in the high-flow group vs 42.9% of patients in the NIV group (P < .001). Among high-risk adults who have undergone extubation, high-flow conditioned oxygen therapy was not inferior to NIV for preventing reintubation and postextubation respiratory failure. High-flow conditioned oxygen therapy may offer advantages for these patients. clinicaltrials.gov Identifier: NCT01191489.
Braz, José R C; Braz, Mariana G; Hayashi, Yoko; Martins, Regina H G; Betini, Marluci; Braz, Leandro G; El Dib, Regina
2017-08-01
The minimum inhaled gas absolute humidity level is 20 mgH2O l-1 for short-duration use in general anaesthesia and 30 mgH2O l-1 for long-duration use in intensive care to avoid respiratory tract dehydration. The aim is to compare the effects of different fresh gas flows (FGFs) through a circle rebreathing system with or without a heat and moisture exchanger (HME) on inhaled gas absolute humidity in adults undergoing general anaesthesia. Systematic review and meta-analyses of randomised controlled trials. We defined FGF (l min-1) as minimal (0.25 to 0.5), low (0.6 to 1.0) or high (≥2). We extracted the inhaled gas absolute humidity data at 60 and 120 min after connection of the patient to the breathing circuit. The effect size is expressed as the mean differences and corresponding 95% confidence intervals (CI). PubMed, EMBASE, SciELO, LILACS and CENTRAL until January 2017. We included 10 studies. The inhaled gas absolute humidity was higher with minimal flow compared with low flow at 120 min [mean differences 2.51 (95%CI: 0.32 to 4.70); P = 0.02] but not at 60 min [mean differences 2.95 (95%CI: -0.95 to 6.84); P = 0.14], and higher with low flow compared with high flow at 120 min [mean differences 7.19 (95%CI: 4.53 to 9.86); P < 0.001]. An inhaled gas absolute humidity minimum of 20 mgH2O l-1 was attained with minimal flow at all times but not with low or high flows. An HME increased the inhaled gas absolute humidity: with minimal flow at 120 min [mean differences 8.49 (95%CI: 1.15 to 15.84); P = 0.02]; with low flow at 60 min [mean differences 9.87 (95%CI: 3.18 to 16.57); P = 0.04] and 120 min [mean differences 7.19 (95%CI: 3.29 to 11.10); P = 0.003]; and with high flow of 2 l min-1 at 60 min [mean differences 6.46 (95%CI: 4.05 to 8.86); P < 0.001] and of 3 l min-1 at 120 min [mean differences 12.18 (95%CI: 6.89 to 17.47); P < 0.001]. The inhaled gas absolute humidity data attained or were near 30 mgH2O l-1 when an HME was used at all FGFs and times. All intubated patients should receive an HME with low or high flows. With minimal flow, an HME adds cost and is not needed to achieve an appropriate inhaled gas absolute humidity.
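The pooled mean differences and confidence intervals quoted above come from standard random-effects meta-analysis. A minimal sketch of inverse-variance pooling with a DerSimonian-Laird between-study variance is shown below; the study-level effects and standard errors are placeholders, not values extracted from the included trials:

```python
import math

def dersimonian_laird(effects, ses):
    """Random-effects pooled mean difference (DerSimonian-Laird) with a 95% CI."""
    w = [1 / se**2 for se in ses]                        # fixed-effect (inverse-variance) weights
    fixed = sum(wi * ei for wi, ei in zip(w, effects)) / sum(w)
    q = sum(wi * (ei - fixed)**2 for wi, ei in zip(w, effects))
    df = len(effects) - 1
    c = sum(w) - sum(wi**2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                        # between-study variance estimate
    w_re = [1 / (se**2 + tau2) for se in ses]            # random-effects weights
    pooled = sum(wi * ei for wi, ei in zip(w_re, effects)) / sum(w_re)
    se_pooled = math.sqrt(1 / sum(w_re))
    return pooled, (pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled)

# placeholder study-level mean differences (mgH2O l-1) and standard errors
print(dersimonian_laird(effects=[2.1, 3.0, 2.6], ses=[0.9, 1.2, 1.0]))
```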
Familial Risk and Heritability of Cancer Among Twins in Nordic Countries
Mucci, Lorelei A.; Hjelmborg, Jacob B.; Harris, Jennifer R.; Czene, Kamila; Havelick, David J.; Scheike, Thomas; Graff, Rebecca E.; Holst, Klaus; Möller, Sören; Unger, Robert H.; McIntosh, Christina; Nuttall, Elizabeth; Brandt, Ingunn; Penney, Kathryn L.; Hartman, Mikael; Kraft, Peter; Parmigiani, Giovanni; Christensen, Kaare; Koskenvuo, Markku; Holm, Niels V.; Heikkilä, Kauko; Pukkala, Eero; Skytthe, Axel; Adami, Hans-Olov; Kaprio, Jaakko
2017-01-01
Importance Estimates of familial cancer risk from population-based studies are essential components of cancer risk prediction. Objective To estimate familial risk and heritability of cancer types in a large twin cohort. Design, Setting, and Participants Prospective study of 80 309 monozygotic and 123 382 same-sex dizygotic twin individuals (N = 203 691) within the population-based registers of Denmark, Finland, Norway, and Sweden. Twins were followed up a median of 32 years between 1943 and 2010. There were 50 990 individuals who died of any cause, and 3804 who emigrated and were lost to follow-up. Exposures Shared environmental and heritable risk factors among pairs of twins. Main Outcomes and Measures The main outcome was incident cancer. Time-to-event analyses were used to estimate familial risk (risk of cancer in an individual given a twin's development of cancer) and heritability (proportion of variance in cancer risk due to interindividual genetic differences) with follow-up via cancer registries. Statistical models adjusted for age and follow-up time, and accounted for censoring and competing risk of death. Results A total of 27 156 incident cancers were diagnosed in 23 980 individuals, translating to a cumulative incidence of 32%. Cancer was diagnosed in both twins among 1383 monozygotic (2766 individuals) and 1933 dizygotic (2866 individuals) pairs. Of these, 38% of monozygotic and 26% of dizygotic pairs were diagnosed with the same cancer type. There was an excess cancer risk in twins whose co-twin was diagnosed with cancer, with estimated cumulative risks that were an absolute 5% (95% CI, 4%-6%) higher in dizygotic (37%; 95% CI, 36%-38%) and an absolute 14% (95% CI, 12%-16%) higher in monozygotic twins (46%; 95% CI, 44%-48%) whose twin also developed cancer compared with the cumulative risk in the overall cohort (32%). For most cancer types, there were significant familial risks and the cumulative risks were higher in monozygotic than dizygotic twins. Heritability of cancer overall was 33% (95% CI, 30%-37%). Significant heritability was observed for the cancer types of skin melanoma (58%; 95% CI, 43%-73%), prostate (57%; 95% CI, 51%-63%), nonmelanoma skin (43%; 95% CI, 26%-59%), ovary (39%; 95% CI, 23%-55%), kidney (38%; 95% CI, 21%-55%), breast (31%; 95% CI, 11%-51%), and corpus uteri (27%; 95% CI, 11%-43%). Conclusions and Relevance In this long-term follow-up study among Nordic twins, there was significant excess familial risk for cancer overall and for specific types of cancer, including prostate, melanoma, breast, ovary, and uterus. This information about hereditary risks of cancers may be helpful in patient education and cancer risk counseling. PMID:26746459
A system for automatic aorta sections measurements on chest CT
NASA Astrophysics Data System (ADS)
Pfeffer, Yitzchak; Mayer, Arnaldo; Zholkover, Adi; Konen, Eli
2016-03-01
A new method is proposed for caliber measurement of the ascending aorta (AA) and descending aorta (DA). A key component of the method is the automatic detection of the carina, as an anatomical landmark around which an axial volume of interest (VOI) can be defined to observe the aortic caliber. For each slice in the VOI, a linear profile line connecting the AA with the DA is found by pattern matching on the underlying intensity profile. Next, the aortic center position is found using Hough transform on the best linear segment candidate. Finally, region growing around the center provides an accurate segmentation and caliber measurement. We evaluated the algorithm on 113 sequential chest CT scans, slice thickness of 0.75-3.75 mm, 90 with contrast agent injected. The algorithm success rates were computed as the percentage of scans in which the center of the AA was found. Automated measurements of AA caliber were compared with independent measurements of two experienced chest radiologists, comparing the absolute difference between the two radiologists with the absolute difference between the algorithm and each of the radiologists. The measurement stability was demonstrated by computing the STD of the absolute difference between the radiologists, and between the algorithm and the radiologists. Results: Success rates of 93% and 74% were achieved, for contrast injected cases and non-contrast cases, respectively. These results indicate that the algorithm can be robust in large variability of image quality, such as the cases in a real-world clinical setting. The average absolute difference between the algorithm and the radiologists was 1.85 mm, lower than the average absolute difference between the radiologists, which was 2.1 mm. The STD of the absolute difference between the algorithm and the radiologists was 1.5 mm vs 1.6 mm between the two radiologists. These results demonstrate the clinical relevance of the algorithm measurements.
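The agreement analysis described above reduces to summaries of per-case absolute differences. A minimal sketch with made-up caliber values illustrates the comparison of algorithm-radiologist and radiologist-radiologist differences:

```python
import statistics

def agreement(readings_a, readings_b):
    """Mean and SD of absolute per-case differences between two sets of caliber measurements (mm)."""
    diffs = [abs(a - b) for a, b in zip(readings_a, readings_b)]
    return statistics.mean(diffs), statistics.stdev(diffs)

# made-up ascending-aorta calibers (mm), purely for illustration
radiologist_1 = [34.0, 38.5, 41.2, 29.8]
radiologist_2 = [35.5, 37.0, 43.0, 31.0]
algorithm     = [34.8, 38.0, 42.1, 30.2]

print("rad1 vs rad2:", agreement(radiologist_1, radiologist_2))
print("algo vs rad1:", agreement(algorithm, radiologist_1))
print("algo vs rad2:", agreement(algorithm, radiologist_2))
```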
Nielsen, Lars Hougaard; Skovlund, Charlotte Wessel; Skjeldestad, Finn Egil; Løkkegaard, Ellen
2011-01-01
Objective To assess the risk of venous thromboembolism from use of combined oral contraceptives according to progestogen type and oestrogen dose. Design National historical registry based cohort study. Setting Four registries in Denmark. Participants Non-pregnant Danish women aged 15-49 with no history of thrombotic disease and followed from January 2001 to December 2009. Main outcome measures Relative and absolute risks of first time venous thromboembolism. Results Within 8 010 290 women years of observation, 4307 first ever venous thromboembolic events were recorded and 4246 included, among which 2847 (67%) events were confirmed as certain. Compared with non-users of hormonal contraception, the relative risk of confirmed venous thromboembolism in users of oral contraceptives containing 30-40 µg ethinylestradiol with levonorgestrel was 2.9 (95% confidence interval 2.2 to 3.8), with desogestrel was 6.6 (5.6 to 7.8), with gestodene was 6.2 (5.6 to 7.0), and with drospirenone was 6.4 (5.4 to 7.5). With users of oral contraceptives with levonorgestrel as reference and after adjusting for length of use, the rate ratio of confirmed venous thromboembolism for users of oral contraceptives with desogestrel was 2.2 (1.7 to 3.0), with gestodene was 2.1 (1.6 to 2.8), and with drospirenone was 2.1 (1.6 to 2.8). The risk of confirmed venous thromboembolism was not increased with use of progestogen only pills or hormone releasing intrauterine devices. If oral contraceptives with desogestrel, gestodene, or drospirenone are anticipated to increase the risk of venous thromboembolism sixfold and those with levonorgestrel threefold, and the absolute risk of venous thromboembolism in current users of the former group is on average 10 per 10 000 women years, then 2000 women would need to shift from using oral contraceptives with desogestrel, gestodene, or drospirenone to those with levonorgestrel to prevent one event of venous thromboembolism in one year. Conclusion After adjustment for length of use, users of oral contraceptives with desogestrel, gestodene, or drospirenone were at least at twice the risk of venous thromboembolism compared with users of oral contraceptives with levonorgestrel. PMID:22027398
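The "2000 women" figure in the abstract follows directly from its stated assumptions (a sixfold versus threefold relative risk, and 10 events per 10,000 woman-years among users of the newer progestogens):

\[
\mathrm{ARR} = \frac{10}{10\,000} - \frac{10 \times 3/6}{10\,000} = \frac{5}{10\,000}\ \text{per woman-year},
\qquad
\frac{1}{\mathrm{ARR}} = 2\,000\ \text{woman-years},
\]

so about 2,000 women would need to use a levonorgestrel pill instead of a desogestrel, gestodene, or drospirenone pill for one year to prevent one venous thromboembolism.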
Evidence-based risk communication: a systematic review.
Zipkin, Daniella A; Umscheid, Craig A; Keating, Nancy L; Allen, Elizabeth; Aung, KoKo; Beyth, Rebecca; Kaatz, Scott; Mann, Devin M; Sussman, Jeremy B; Korenstein, Deborah; Schardt, Connie; Nagi, Avishek; Sloane, Richard; Feldstein, David A
2014-08-19
Effective communication of risks and benefits to patients is critical for shared decision making. To review the comparative effectiveness of methods of communicating probabilistic information to patients that maximize their cognitive and behavioral outcomes. PubMed (1966 to March 2014) and CINAHL, EMBASE, and the Cochrane Central Register of Controlled Trials (1966 to December 2011) using several keywords and structured terms. Prospective or cross-sectional studies that recruited patients or healthy volunteers and compared any method of communicating probabilistic information with another method. Two independent reviewers extracted study characteristics and assessed risk of bias. Eighty-four articles, representing 91 unique studies, evaluated various methods of numerical and visual risk display across several risk scenarios and with diverse outcome measures. Studies showed that visual aids (icon arrays and bar graphs) improved patients' understanding and satisfaction. Presentations including absolute risk reductions were better than those including relative risk reductions for maximizing accuracy and seemed less likely than presentations with relative risk reductions to influence decisions to accept therapy. The presentation of numbers needed to treat reduced understanding. Comparative effects of presentations of frequencies (such as 1 in 5) versus event rates (percentages, such as 20%) were inconclusive. Most studies were small and highly variable in terms of setting, context, and methods of administering interventions. Visual aids and absolute risk formats can improve patients' understanding of probabilistic information, whereas numbers needed to treat can lessen their understanding. Due to study heterogeneity, the superiority of any single method for conveying probabilistic information is not established, but there are several good options to help clinicians communicate with patients. None.
SU-E-T-208: Incidence Cancer Risk From the Radiation Treatment for Acoustic Neuroma Patient
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kim, D; Chung, W; Shin, D
2014-06-01
Purpose: The present study aimed to compare the incidence risk of a secondary cancer from therapeutic doses in patients receiving intensity-modulated radiotherapy (IMRT), volumetric modulated arc therapy (VMAT), and stereotactic radiosurgery (SRS). Methods: Four acoustic neuroma patients were treated with IMRT, VMAT, or SRS. Their incidence excess relative risk (ERR), excess absolute risk (EAR), and lifetime attributable risk (LAR) were estimated using the corresponding therapeutic doses measured at various organs by radio-photoluminescence glass dosimeters (RPLGD) placed inside a humanoid phantom. Results: When a prescription dose was delivered in the planning target volume of the 4 patients, the average organ equivalent doses (OED) at the thyroid, lung, normal liver, colon, bladder, prostate (or ovary), and rectum were measured. The OED decreased as the distance from the primary beam increased. The thyroid received the highest OED compared to other organs. The LAR estimates indicated that more than 0.03% of AN patients would develop radiation-induced cancer. Conclusion: The thyroid carried the highest radiation-induced cancer risk after radiation treatment for AN. We found that LAR can be increased by the transmitted dose from the primary beam. No modality-specific difference in radiation-induced cancer risk was observed in our study.
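For orientation, the relative and absolute excess-risk quantities estimated here are conventionally defined as follows (generic radiation-epidemiology definitions, not the specific risk models used in the study):

\[
\mathrm{ERR} = \frac{\lambda_{E} - \lambda_{0}}{\lambda_{0}},
\qquad
\mathrm{EAR} = \lambda_{E} - \lambda_{0},
\]

where \(\lambda_{E}\) and \(\lambda_{0}\) are the cancer incidence rates with and without the radiation exposure; the lifetime attributable risk (LAR) accumulates the dose-dependent excess rate over the remaining lifetime, weighted by the probability of surviving to each attained age.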
Fuzzy Random λ-Mean SAD Portfolio Selection Problem: An Ant Colony Optimization Approach
NASA Astrophysics Data System (ADS)
Thakur, Gour Sundar Mitra; Bhattacharyya, Rupak; Mitra, Swapan Kumar
2010-10-01
To reach the investment goal, one has to select a combination of securities among different portfolios containing a large number of securities. The past records of each security alone do not guarantee future returns. As there are many uncertain factors which directly or indirectly influence the stock market, and there are also some newer stock markets which do not have enough historical data, experts' expectation and experience must be combined with the past records to generate an effective portfolio selection model. In this paper the return of a security is assumed to be a Fuzzy Random Variable Set (FRVS), where returns are a set of random numbers that are in turn fuzzy numbers. A new λ-Mean Semi Absolute Deviation (λ-MSAD) portfolio selection model is developed. The subjective opinions of the investors about the rates of return of each security are taken into consideration by introducing a pessimistic-optimistic parameter vector λ. The λ-MSAD model is preferred as it uses the absolute deviation of the portfolio's rate of return below the mean, rather than the variance, as the measure of risk. As this model can be reduced to a Linear Programming Problem (LPP), it can be solved much faster than quadratic programming formulations. Ant Colony Optimization (ACO) is used for solving the portfolio selection problem. ACO is a paradigm for designing meta-heuristic algorithms for combinatorial optimization problems. Data from the BSE are used for illustration.
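A crisp (non-fuzzy) version of the mean semi-absolute deviation criterion underlying this model can be written as the standard scenario-based MSAD formulation below, with a scalar risk-aversion weight standing in for the paper's per-security pessimistic-optimistic vector; this is the generic Speranza-type model, not the paper's full fuzzy-random construction:

\[
\min_{x}\ \lambda\,\frac{1}{T}\sum_{t=1}^{T}\max\!\Bigl(0,\ \sum_{j=1}^{n}\bigl(\bar r_{j}-r_{jt}\bigr)x_{j}\Bigr)\;-\;(1-\lambda)\sum_{j=1}^{n}\bar r_{j}x_{j}
\quad\text{s.t.}\quad \sum_{j=1}^{n}x_{j}=1,\ \ x_{j}\ge 0,
\]

where \(x_j\) is the weight of security \(j\), \(r_{jt}\) its return in scenario \(t\), \(\bar r_j\) its expected return, and \(\lambda\in[0,1]\) trades off downside risk against expected return. Replacing each \(\max(0,\cdot)\) term with an auxiliary variable \(y_t \ge \sum_j(\bar r_j - r_{jt})x_j,\ y_t \ge 0\) linearises the objective, which is why the problem reduces to an LP.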
Isiordia-Espinoza, M A; Aragon-Martinez, O H; Martínez-Morales, J F; Zapata-Morales, J R
2015-11-01
The aim of this systematic review and meta-analysis was to assess the risk of surgical wound infection and the adverse effects of amoxicillin in healthy patients who required excision of third molars. We identified eligible reports from searches of PubMed, Medline®, the Cochrane Library, Imbiomed, LILACS, and Google Scholar. Studies that met our minimum requirements were evaluated using inclusion and exclusion criteria and the Oxford Quality Scale. Those with a score of 3 or more on this Scale were included and their data were extracted and analysed. For evaluation of the risk of infection the absolute risk reduction, number needed to treat, and 95% CI were calculated. For evaluation of the risk of an adverse effect the absolute risk increase, number needed to harm, and 95% CI were calculated using the Risk Reduction Calculator. Each meta-analysis was made with the help of the Mantel-Haenszel random effects model, and estimates of risk (OR) and 95% CI were calculated using the Review Manager 5.3, from the Cochrane Library. A significant risk was assumed when the lower limit of the 95% CI was greater than 1. Probabilities of less than 0.05 were accepted as significant. The results showed that there was no reduction in the risk of infection when amoxicillin was given before or after operation compared with an untreated group or placebo. In conclusion, this study suggests that amoxicillin given prophylactically or postoperatively does not reduce the risk of infection in healthy patients having their third molars extracted. Copyright © 2015 The British Association of Oral and Maxillofacial Surgeons. Published by Elsevier Ltd. All rights reserved.
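The pooled odds ratios described above are built from trial-level 2x2 tables. A minimal sketch of the (fixed-effect) Mantel-Haenszel pooled odds ratio is shown below with made-up trial counts; the random-effects variant reported in the review additionally models between-trial heterogeneity:

```python
def mantel_haenszel_or(tables):
    """Pooled odds ratio from a list of (a, b, c, d) 2x2 tables, where
    a = infections with amoxicillin, b = no infection with amoxicillin,
    c = infections with control/placebo, d = no infection with control."""
    num = sum(a * d / (a + b + c + d) for a, b, c, d in tables)
    den = sum(b * c / (a + b + c + d) for a, b, c, d in tables)
    return num / den

# placeholder trial counts, purely for illustration
trials = [(2, 48, 4, 46), (1, 59, 2, 58), (3, 97, 3, 97)]
print(mantel_haenszel_or(trials))
```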
April-Sanders, Ayana; Oskar, Sabine; Shelton, Rachel C; Schmitt, Karen M; Desperito, Elise; Protacio, Angeline; Tehranifar, Parisa
Worry about developing breast cancer (BC) has been associated with participation in screening and genetic testing and with follow-up of abnormal screening results. Little is known about the scope and predictors of BC worry in Hispanic and immigrant populations. We collected in-person interview data from 250 self-identified Hispanic women recruited from an urban mammography facility (average age 50.4 years; 82% foreign-born). Women reported whether they worried about developing breast cancer rarely/never (low worry), sometimes (moderate worry), or often/all the time (high worry). We examined whether sociocultural and psychological factors (e.g., acculturation, education, perceived risk), and risk factors and objective risk for BC (e.g., family history, Gail model 5-year risk estimates, parity) predicted BC worry using multinomial and logistic regression. In multivariable models, women who perceived higher absolute BC risk (odds ratio, 1.66 [95% confidence interval, 1.28-2.14] for a one-unit increase in perceived lifetime risk) and comparative BC risk (e.g., odds ratio, 2.73, 95% confidence interval, 1.23-6.06) were more likely to report high BC worry than moderate or low BC worry. There were no associations between BC worry and indicators of objective risk or acculturation. In Hispanic women undergoing screening mammography, higher perceptions of BC risk, in both absolute and comparative terms, were associated independently with high BC worry, and were stronger predictors of BC worry than indicators of objective BC risk, including family history, mammographic density, and personal BC risk estimates. Copyright © 2016 Jacobs Institute of Women's Health. Published by Elsevier Inc. All rights reserved.
Express yourself: bold individuals induce enhanced morphological defences
Hulthén, Kaj; Chapman, Ben B.; Nilsson, P. Anders; Hollander, Johan; Brönmark, Christer
2014-01-01
Organisms display an impressive array of defence strategies in nature. Inducible defences (changes in morphology and/or behaviour within a prey's lifetime) allow prey to decrease vulnerability to predators and avoid unnecessary costs of expression. Many studies report considerable interindividual variation in the degree to which inducible defences are expressed, yet what underlies this variation is poorly understood. Here, we show that individuals differing in a key personality trait also differ in the magnitude of morphological defence expression. Crucian carp showing risky behaviours (bold individuals) expressed a significantly greater morphological defence response when exposed to a natural enemy when compared with shy individuals. Furthermore, we show that fish of different personality types differ in their behavioural plasticity, with shy fish exhibiting greater absolute plasticity than bold fish. Our data suggest that individuals with bold personalities may be able to compensate for their risk-prone behavioural type by expressing enhanced morphological defences. PMID:24335987
Clyde, Merlise A.; Palmieri Weber, Rachel; Iversen, Edwin S.; Poole, Elizabeth M.; Doherty, Jennifer A.; Goodman, Marc T.; Ness, Roberta B.; Risch, Harvey A.; Rossing, Mary Anne; Terry, Kathryn L.; Wentzensen, Nicolas; Whittemore, Alice S.; Anton-Culver, Hoda; Bandera, Elisa V.; Berchuck, Andrew; Carney, Michael E.; Cramer, Daniel W.; Cunningham, Julie M.; Cushing-Haugen, Kara L.; Edwards, Robert P.; Fridley, Brooke L.; Goode, Ellen L.; Lurie, Galina; McGuire, Valerie; Modugno, Francesmary; Moysich, Kirsten B.; Olson, Sara H.; Pearce, Celeste Leigh; Pike, Malcolm C.; Rothstein, Joseph H.; Sellers, Thomas A.; Sieh, Weiva; Stram, Daniel; Thompson, Pamela J.; Vierkant, Robert A.; Wicklund, Kristine G.; Wu, Anna H.; Ziogas, Argyrios; Tworoger, Shelley S.; Schildkraut, Joellen M.
2016-01-01
Previously developed models for predicting absolute risk of invasive epithelial ovarian cancer have included a limited number of risk factors and have had low discriminatory power (area under the receiver operating characteristic curve (AUC) < 0.60). Because of this, we developed and internally validated a relative risk prediction model that incorporates 17 established epidemiologic risk factors and 17 genome-wide significant single nucleotide polymorphisms (SNPs) using data from 11 case-control studies in the United States (5,793 cases; 9,512 controls) from the Ovarian Cancer Association Consortium (data accrued from 1992 to 2010). We developed a hierarchical logistic regression model for predicting case-control status that included imputation of missing data. We randomly divided the data into an 80% training sample and used the remaining 20% for model evaluation. The AUC for the full model was 0.664. A reduced model without SNPs performed similarly (AUC = 0.649). Both models performed better than a baseline model that included age and study site only (AUC = 0.563). The best predictive power was obtained in the full model among women younger than 50 years of age (AUC = 0.714); however, the addition of SNPs increased the AUC the most for women older than 50 years of age (AUC = 0.638 vs. 0.616). Adapting this improved model to estimate absolute risk and evaluating it in prospective data sets is warranted. PMID:27698005
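As a minimal sketch of the 80/20 split-sample evaluation described above, using an ordinary (non-hierarchical) logistic model on synthetic data with scikit-learn; this does not reproduce the study's hierarchical model, imputation, or covariates:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 17))                                  # placeholder: 17 risk factors
y = rng.binomial(1, 1 / (1 + np.exp(-(0.5 * X[:, 0] - 0.5))))    # synthetic case-control status

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("validation AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))
```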
Is there a specific fracture ‘cascade’?
Melton, L Joseph; Amin, Shreyasee
2013-01-01
Different kinds of epidemiologic data provide varying views of the relationships among the main osteoporotic fractures. Descriptive incidence data indicate that distal forearm fractures typically occur earlier than vertebral fractures that, in turn, precede hip fractures late in life. In addition, relative risk estimates document the fact that one osteoporotic fracture increases the risk of subsequent ones. These two observations support the notion of a ‘fracture cascade' and justify the recent emphasis on secondary prevention, that is, more aggressive treatment of patients presenting with a fracture in order to prevent recurrences. However, the absolute risk of a subsequent fracture given an initial one is modest, and the degree to which the second fracture can be attributed to the first one is unclear. Moreover, the osteoporotic fractures encountered in the majority of patients are the first one experienced, and even these initial fractures lead to substantial morbidity and cost. These latter points reemphasize the importance of primary prevention, that is, the management of bone loss and other risk factors to prevent the first fracture. Continued efforts are needed to refine risk assessment algorithms so that candidates for such fracture prophylaxis can be identified more accurately and efficiently. PMID:24575296
Upper body fat predicts metabolic syndrome similarly in men and women.
Grundy, Scott M; Williams, Corbin; Vega, Gloria L
2018-04-23
The metabolic syndrome is a constellation of risk factors including dyslipidemia, dysglycemia, hypertension, a pro-inflammatory state, and a prothrombotic state. All of these factors are accentuated by obesity. However, obesity can be defined by body mass index (BMI), percent body fat, or by body fat distribution. The latter consists of upper body fat (subcutaneous and visceral fat) and lower body fat (gluteofemoral fat). Waist circumference is a common surrogate marker for upper body fat. Data from the National Health and Nutrition Examination Survey (NHANES) for the years 1999-2006 were examined for associations of metabolic risk factors with percent body fat, waist circumference, and BMI. Associations between absolute measures of waist circumference and risk factors were similar for men and women. The similarities of associations between waist circumference and risk factors suggest that greater visceral fat in men does not accentuate the influence of upper body fat on risk factors. Different waist circumference values should not be used to define abdominal obesity in men and women. © 2018 The Authors. European Journal of Clinical Investigation published by John Wiley & Sons Ltd on behalf of Stichting European Society for Clinical Investigation Journal Foundation.
1994-06-30
Recherche Scientifique, France The Absolute Galois Group from a Geometric Viewpoint LESLIE C SHAW Fellow (Anthropology and Archaeology) University of Massachusetts at Boston The Emergence of Inequality in the Maya Lowlands PATRICIA L. SIPE Fellow (Mathematics) Smith College DES and Risk
Pietiläinen, Olli; Ferrie, Jane; Kivimäki, Mika; Lahti, Jouni; Marmot, Michael; Rahkonen, Ossi; Sekine, Michikazu; Shipley, Martin; Tatsuse, Takashi; Lallukka, Tea
2016-01-01
Introduction: Socioeconomic differences in smoking over time and across national contexts are poorly understood. We assessed the magnitude of relative and absolute social class differences in smoking in cohorts from Britain, Finland, and Japan over 5–7 years. Methods: The British Whitehall II study (n = 4350), Finnish Helsinki Health Study (n = 6328), and Japanese Civil Servants Study (n = 1993) all included employed men and women aged 35–68 at baseline in 1997–2002. Follow-up was in 2003–2007 (mean follow-up 5.1, 6.5, and 3.6 years, respectively). Occupational social class (managers, professionals and clerical employees) was measured at baseline. Current smoking and covariates (age, marital status, body mass index, and self-rated health) were measured at baseline and follow-up. We assessed relative social class differences using the Relative Index of Inequality and absolute differences using the Slope Index of Inequality. Results: Social class differences in smoking were found in Britain and Finland, but not in Japan. Age-adjusted relative differences at baseline ranged from Relative Index of Inequality 3.08 (95% confidence interval 1.99–4.78) among Finnish men to 2.32 (1.24–4.32) among British women, with differences at follow-up greater by 8%–58%. Absolute differences remained stable and varied from Slope Index of Inequality 0.27 (0.15–0.40) among Finnish men to 0.10 (0.03–0.16) among British women. Further adjustment for covariates had modest effects on inequality indices. Conclusions: Large social class differences in smoking persisted among British and Finnish men and women, with widening tendencies in relative differences over time. No differences could be confirmed among Japanese men or women. Implications: Changes over time in social class differences in smoking are poorly understood across countries. Our study focused on employees from Britain, Finland and Japan, and found relative and absolute class differences among British and Finnish men and women. Key covariates had modest effects on the differences. Relative differences tended to widen over the 4- to 7-year follow-up, whereas absolute differences remained stable. In contrast, class differences in smoking among Japanese men or women were not found. Britain and Finland are at the late stage of the smoking epidemic model, whereas Japan may not follow the same model. PMID:26764256
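The two inequality indices used here are commonly computed by regressing the outcome on each class's midpoint of the cumulative population rank (its ridit score). The sketch below is one such grouped-data operationalisation with placeholder class sizes and smoking prevalences; it is not necessarily the exact regression model used in the paper, where RII and SII are usually estimated from individual-level data:

```python
def inequality_indices(class_sizes, smoking_prev):
    """Slope (SII) and Relative (RII) Index of Inequality from grouped data.

    class_sizes  -- population share of each occupational class, lowest status first
    smoking_prev -- smoking prevalence in each class
    A weighted linear regression of prevalence on the ridit score is used; RII is taken
    as the ratio of fitted prevalences at the two extremes of the rank scale (one common
    convention among several).
    """
    total = sum(class_sizes)
    ridits, cum = [], 0.0
    for n in class_sizes:
        ridits.append((cum + n / 2) / total)   # midpoint of the cumulative rank, in [0, 1]
        cum += n
    w = [n / total for n in class_sizes]
    xbar = sum(wi * x for wi, x in zip(w, ridits))
    ybar = sum(wi * y for wi, y in zip(w, smoking_prev))
    num = sum(wi * (x - xbar) * (y - ybar) for wi, x, y in zip(w, ridits, smoking_prev))
    den = sum(wi * (x - xbar) ** 2 for wi, x in zip(w, ridits))
    slope = num / den
    intercept = ybar - slope * xbar
    sii = slope                               # absolute prevalence gap across the hierarchy (signed)
    rii = intercept / (intercept + slope)     # fitted prevalence at the bottom vs the top of the scale
    return sii, rii

# placeholder data: clerical employees (lowest status), professionals, managers
print(inequality_indices(class_sizes=[0.4, 0.35, 0.25], smoking_prev=[0.30, 0.22, 0.15]))
```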
Hospital Readmission Risk: Isolating Hospital from Patient Effects
Krumholz, Harlan M.; Wang, Kun; Lin, Zhenqiu; Dharmarajan, Kumar; Horwitz, Leora I.; Ross, Joseph S.; Drye, Elizabeth E.; Bernheim, Susannah M.; Normand, Sharon-Lise T.
2017-01-01
Background To isolate hospital effects on hospitals’ risk-standardized readmission rates, we examined readmission outcomes among patients with multiple admissions for a similar diagnosis at >1 hospital within a given year. Methods We divided the Centers for Medicare & Medicaid Services hospital-wide readmission measure cohort from July 2014–June 2015 into 2 random samples. We used the first sample to calculate each hospital’s risk-standardized readmission rate and classified hospitals into performance quartiles. In the second sample, we identified patients with 2 admissions for similar diagnoses at different hospitals that occurred more than a month and less than a year apart, and compared observed readmission rates for those admitted to hospitals in different performance quartiles. Results In the sample used to characterize hospital performance, the median risk-standardized readmission rate was 15.5% (IQR 15.3%–15.8%). The other sample included 37,508 patients with 2 admissions for similar diagnoses at 4,272 different hospitals. The observed readmission rate was consistently higher when patients were admitted to hospitals in the worse performing quartile, but the only statistically significant difference was observed when the same patients were admitted to hospitals in the best and worst performing quartiles, in which the absolute readmission rate difference was 1.95 percentage points (95% CI, 0.39%–3.50%). Conclusions When the same patients were admitted with similar diagnoses to hospitals in the best performing quartile compared with the worst performing quartile for hospital readmission performance, there is a significant difference in rates of readmission within 30 days. The findings suggest that hospital quality contributes in part to readmission rates independent of patient factors. PMID:28902587
Different conceptions of mental illness: consequences for the association with patients.
Helmchen, Hanfried
2013-01-01
Whenever partial knowledge is considered absolute and turned into ideological and dogmatic conceptions, the risk increases that the conditions for the people involved might become dangerous. This will be illustrated by casuistic examples of consequences of one-sided psychiatric conceptions such as social, biological, and psychological ideas about the treatment and care of the mentally ill. Present perspectives of an integrative model, i.e., an advanced bio-psycho-social conception about evidence-based characteristics on the social, psychological, and molecular-genetic level, require that all of these dimensions should be considered in order to personalize and thereby improve the care and treatment of the mentally ill.
Borrego, L M; Arroz, M J; Videira, P; Martins, C; Guimarães, H; Nunes, G; Papoila, A L; Trindade, H
2009-08-01
Several risk factors for asthma have been identified in infants and young children with recurrent wheeze. However, published literature has reported contradictory findings regarding the underlying immunological mechanisms. This study was designed to assess and compare the immunological status during the first 2 years in steroid-naive young children with three or more episodes of physician-confirmed wheeze (n=50), with and without clinical risk factors for developing subsequent asthma (i.e. parental asthma or a personal history of eczema and/or two of the following: wheezing without colds, a personal history of allergic rhinitis and peripheral blood eosinophilia >4%), with age-matched healthy controls (n=30). Peripheral blood CD4(+)CD25(+) and CD4(+)CD25(high) T cells and their cytotoxic T-lymphocyte-associated antigen-4 (CTLA-4), GITR and Foxp3 expression were analysed by flow cytometry. Cytokine (IFN-gamma, TGF-beta and IL-10), CTLA-4 and Foxp3 mRNA expression were evaluated (real-time PCR) after peripheral blood mononuclear cell stimulation with phorbol 12-myristate 13-acetate (PMA) (24 h) and house dust mite (HDM) extracts (7th day). Flow cytometry results showed a significant reduction in the absolute number of CD4(+)CD25(high) and the absolute and percentage numbers of CD4(+)CD25(+)CTLA-4(+) in wheezy children compared with healthy controls. Wheezy children at a high risk of developing asthma had a significantly lower absolute number of CD4(+)CD25(+) (P=0.01) and CD4(+)CD25(high) (P=0.04), compared with those at a low risk. After PMA stimulation, CTLA-4 (P=0.03) and Foxp3 (P=0.02) expression was diminished in wheezy children compared with the healthy children. After HDM stimulation, CTLA-4 (P=0.03) and IFN-gamma (P=0.04) expression was diminished in wheezy children compared with healthy children. High-risk children had lower expression of IFN-gamma (P=0.03) compared with low-risk and healthy children and lower expression of CTLA-4 (P=0.01) compared with healthy children. Although our findings suggest that some immunological parameters are impaired in children with recurrent wheeze, particularly with a high risk for asthma, further studies are needed in order to assess their potential as surrogate predictor factors for asthma in early life.
Perichart-Perera, Otilia; Espino, Salvador; Avila-Vergara, Marco Antonio; Ibarra, Isabel; Ahued, Roberto; Godines, Myrna; Parry, Samuel; Macones, George; Strauss, Jerome F
2011-01-01
Objective To test the hypothesis that a relative deficiency in L-arginine, the substrate for synthesis of the vasodilatory gas nitric oxide, may be associated with the development of pre-eclampsia in a population at high risk. Design Randomised, blinded, placebo controlled clinical trial. Setting Tertiary public hospital in Mexico City. Participants Pregnant women with a history of a previous pregnancy complicated by pre-eclampsia, or pre-eclampsia in a first degree relative, and deemed to be at increased risk of recurrence of the disease were studied from week 14-32 of gestation and followed until delivery. Interventions Supplementation with a medical food—bars containing L-arginine plus antioxidant vitamins, antioxidant vitamins alone, or placebo—during pregnancy. Main outcome measure Development of pre-eclampsia/eclampsia. Results 222 women were allocated to the placebo group, 228 received L-arginine plus antioxidant vitamins, and 222 received antioxidant vitamins alone. Women had 4-8 prenatal visits while receiving the bars. The incidence of pre-eclampsia was reduced significantly (χ2=19.41; P<0.001) in women randomised to L-arginine plus antioxidant vitamins compared with placebo (absolute risk reduction 0.17 (95% confidence interval 0.12 to 0.21). Antioxidant vitamins alone showed an observed benefit, but this effect was not statistically significant compared with placebo (χ2=3.76; P=0.052; absolute risk reduction 0.07, 0.005 to 0.15). L-arginine plus antioxidant vitamins compared with antioxidant vitamins alone resulted in a significant effect (P=0.004; absolute risk reduction 0.09, 0.05 to 0.14). Conclusions Supplementation during pregnancy with a medical food containing L-arginine and antioxidant vitamins reduced the incidence of pre-eclampsia in a population at high risk of the condition. Antioxidant vitamins alone did not have a protective effect for prevention of pre-eclampsia. Supplementation with L-arginine plus antioxidant vitamins needs to be evaluated in a low risk population to determine the generalisability of the protective effect, and the relative contributions of L-arginine and antioxidant vitamins to the observed effects of the combined treatment need to be determined. Trial registration Clinical trials NCT00469846. PMID:21596735
Parra-Medina, Deborah; Liang, Yuanyuan; Yin, Zenong; Esparza, Laura; Lopez, Louis
2015-12-10
US Latinos have disproportionately higher rates of obesity and physical inactivity than the general US population, putting them at greater risk for chronic disease. This evaluation aimed to examine the impact of the Y Living Program (Y Living), a 12-week family-focused healthy lifestyle program, on the weight status of adult and child (aged ≥7 years) participants. In this pretest-posttest evaluation, participants attended twice-weekly group education sessions and engaged in physical activity at least 3 times per week. Primary outcome measures were body mass index ([BMI], zBMI and BMI percentile for children), weight, waist circumference, and percentage body fat. Wilcoxon signed-rank tests and mixed effects models were used to evaluate pretest-posttest differences (ie, absolute change and relative change) for adults and children separately. BMI, weight, waist circumference, and percentage body fat improved significantly (both absolutely and relatively) among adults who completed the program (n = 180; all P ≤ .001). Conversely, child participants that completed the program (n = 72) showed no improvements. Intervention effects varied across subgroups. Among adults, women and participants who were obese at baseline had larger improvements than did children who were obese at baseline or who were in families that had an annual household income of $15,000 or more. Significant improvements in weight were observed among adult participants but not children. This family-focused intervention has potential to prevent excess weight gain among high-risk Latino families.
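The pretest-posttest "absolute change" and "relative change" outcomes used in this evaluation reduce to a simple difference and a ratio; the sketch below shows that arithmetic with hypothetical BMI values, not program data.

```python
# Hedged sketch of the two pretest-posttest change metrics mentioned above;
# the BMI values are hypothetical, not Y Living program data.
def absolute_change(pre, post):
    return post - pre

def relative_change(pre, post):
    return (post - pre) / pre  # expressed as a fraction of the baseline value

pre_bmi, post_bmi = 32.0, 30.8
print(absolute_change(pre_bmi, post_bmi))        # -1.2 BMI units
print(relative_change(pre_bmi, post_bmi) * 100)  # about -3.75 %
```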
Minor planets and related objects. XXI - Photometry of eight asteroids
NASA Technical Reports Server (NTRS)
Taylor, R. C.; Gehrels, T.; Capen, R. C.
1976-01-01
Light curves, UBV colors, rotational periods, phase coefficients, and absolute magnitudes are presented. The asteroids studied are (1) Ceres, (4) Vesta, (16) Psyche, (78) Diana, (281) Lucretia, (451) Patientia, (1212) Francette, and (1362) Griqua. The rotation axis of Lucretia is nearly perpendicular to the plane of the ecliptic. Ceres appears to be nearly spherical with an absolute magnitude of B(1,0)=4.42, which is 0.3 mag fainter than previously reported. The determination of the absolute magnitude for an asteroid depends on its aspect, and for each opposition there is, therefore, a different absolute magnitude.
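For readers unfamiliar with the photometric convention used here, the absolute magnitude B(1,0) is conventionally obtained by reducing the observed B magnitude to unit heliocentric and geocentric distances and zero phase angle with a linear phase coefficient. The sketch below assumes that pre-H,G convention; all numerical inputs are hypothetical, not values from the paper.

```python
# Hedged sketch: reducing an observed B magnitude to the absolute magnitude
# B(1,0) with a linear phase law, B(1,0) = B - 5*log10(r*Delta) - beta*alpha.
# All inputs are hypothetical.
from math import log10

def absolute_magnitude_b10(b_obs, r_au, delta_au, phase_deg, beta_mag_per_deg):
    """r: heliocentric distance (AU); Delta: geocentric distance (AU); alpha: phase angle (deg)."""
    return b_obs - 5.0 * log10(r_au * delta_au) - beta_mag_per_deg * phase_deg

# e.g. B = 7.9, r = 2.77 AU, Delta = 1.97 AU, alpha = 10 deg, beta = 0.04 mag/deg:
print(round(absolute_magnitude_b10(7.9, 2.77, 1.97, 10.0, 0.04), 2))
```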
Deutsch, Diana; Henthorn, Trevor; Marvin, Elizabeth; Xu, HongShuai
2006-02-01
Absolute pitch is extremely rare in the U.S. and Europe; this rarity has so far been unexplained. This paper reports a substantial difference in the prevalence of absolute pitch in two normal populations, in a large-scale study employing an on-site test, without self-selection from within the target populations. Music conservatory students in the U.S. and China were tested. The Chinese subjects spoke the tone language Mandarin, in which pitch is involved in conveying the meaning of words. The American subjects were nontone language speakers. The earlier the age of onset of musical training, the greater the prevalence of absolute pitch; however, its prevalence was far greater among the Chinese than the U.S. students for each level of age of onset of musical training. The findings suggest that the potential for acquiring absolute pitch may be universal, and may be realized by enabling infants to associate pitches with verbal labels during the critical period for acquisition of features of their native language.
Karp, Igor; Sylvestre, Marie-Pierre; Abrahamowicz, Michal; Leffondré, Karen; Siemiatycki, Jack
2016-11-01
Assessment of individual risk of illness is an important activity in preventive medicine. Development of risk-assessment models has heretofore relied predominantly on studies involving follow-up of cohort-type populations, while case-control studies have generally been considered unfit for this purpose. To present a method for individualized assessment of absolute risk of an illness (as illustrated by lung cancer) based on data from a 'non-nested' case-control study. We used data from a case-control study conducted in Montreal, Canada in 1996-2001. Individuals diagnosed with lung cancer (n = 920) and age- and sex-matched lung-cancer-free subjects (n = 1288) completed questionnaires documenting life-time cigarette-smoking history and occupational, medical, and family history. Unweighted and weighted logistic models were fitted. Model overfitting was assessed using bootstrap-based cross-validation and 'shrinkage.' The discriminating ability was assessed by the c-statistic, and the risk-stratifying performance was assessed by examination of the variability in risk estimates over hypothetical risk-profiles. In the logistic models, the logarithm of incidence-density of lung cancer was expressed as a function of age, sex, cigarette-smoking history, history of respiratory conditions and exposure to occupational carcinogens, and family history of lung cancer. The models entailed a minimal degree of overfitting ('shrinkage' factor: 0.97 for both unweighted and weighted models) and moderately high discriminating ability (c-statistic: 0.82 for the unweighted model and 0.66 for the weighted model). The method's risk-stratifying performance was quite high. The presented method allows for individualized assessment of risk of lung cancer and can be used for development of risk-assessment models for other illnesses.
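A minimal sketch of the general modelling approach described above, a logistic model for individualized risk with the c-statistic as the measure of discrimination, using synthetic data and placeholder predictors rather than the study's variables or weighting scheme.

```python
# Hedged sketch: logistic risk model plus apparent c-statistic (AUC).
# Synthetic data; predictors are placeholders, not the study's covariates.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 2000
X = np.column_stack([
    rng.normal(60, 10, n),    # age (years)
    rng.binomial(1, 0.5, n),  # sex indicator
    rng.gamma(2.0, 15.0, n),  # smoking exposure, e.g. pack-years (illustrative)
])
logit = -6.0 + 0.03 * X[:, 0] + 0.3 * X[:, 1] + 0.04 * X[:, 2]
y = rng.binomial(1, 1 / (1 + np.exp(-logit)))

model = LogisticRegression(max_iter=1000).fit(X, y)
c_statistic = roc_auc_score(y, model.predict_proba(X)[:, 1])
print(round(c_statistic, 2))  # apparent discrimination, not cross-validated
```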
Walker, Alex J.; West, Joe; Card, Tim R.; Crooks, Colin; Kirwan, Cliona C.
2016-01-01
Patients with breast cancer are at increased risk of venous thromboembolism (VTE), particularly in the peridiagnosis period. However, no previous epidemiologic studies have investigated the relative impact of breast cancer treatments in a time-dependent manner. We aimed to determine the impact of breast cancer stage, biology, and treatment on the absolute and relative risks of VTE by using several recently linked data sources from England. Our cohort comprised 13 202 patients with breast cancer from the Clinical Practice Research Datalink (linked to Hospital Episode Statistics and Cancer Registry data) diagnosed between 1997 and 2006 with follow-up continuing to the end of 2010. Cox regression analysis was performed to determine which demographic, treatment-related, and biological factors independently affected VTE risk. Women had an annual VTE incidence of 6% while receiving chemotherapy which was 10.8-fold higher (95% confidence interval [CI], 8.2-14.4; absolute rate [AR], 59.6 per 1000 person-years) than that in women who did not receive chemotherapy. After surgery, the risk was significantly increased in the first month (hazard ratio [HR], 2.2; 95% CI, 1.4-3.4; AR, 23.5; reference group, no surgery), but the risk was not increased after the first month. Risk of VTE was noticeably higher in the 3 months after initiation of tamoxifen compared with the risk before therapy (HR, 5.5; 95% CI, 2.3-12.7; AR, 24.1); however, initiating therapy with aromatase inhibitors was not associated with VTE (HR, 0.8; 95% CI, 0.5-1.4; AR, 28.3). In conclusion, women receiving chemotherapy for breast cancer have a clinically important risk of VTE, whereas an increased risk of VTE immediately after endocrine therapy is restricted to tamoxifen. PMID:26574606
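The absolute rates (AR) quoted above are incidence rates per 1,000 person-years; a minimal sketch of that calculation and of a crude rate ratio follows, with illustrative counts rather than the cohort's data.

```python
# Hedged sketch: incidence rate per 1,000 person-years and a crude rate ratio.
# Event counts and person-time are illustrative, not the cohort's data.
def rate_per_1000_py(events, person_years):
    return 1000.0 * events / person_years

on_chemo = rate_per_1000_py(events=30, person_years=500.0)
off_chemo = rate_per_1000_py(events=180, person_years=36000.0)
print(round(on_chemo, 1), round(off_chemo, 1), round(on_chemo / off_chemo, 1))
```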
Noumegni, Steve Raoul; Bigna, Jean Joel; Ama Moor Epse Nkegoum, Vicky Jocelyne; Nansseu, Jobert Richie; Assah, Felix K; Jingi, Ahmadou Musa; Guewo-Fokeng, Magellan; Leumi, Steve; Katte, Jean-Claude; Dehayem, Mesmin Y; Mfeukeu Kuate, Liliane; Kengne, Andre Pascal; Sobngwi, Eugene
2017-08-11
Cardiovascular disease (CVD) and metabolic diseases are growing concerns among patients with HIV infection as a consequence of the improving survival of this population. We aimed to assess the relationship between CVD risk and insulin resistance in a group of black African individuals with HIV infection. This cross-sectional study involved patients with HIV infection aged 30-74 years and followed up at the Yaoundé Central Hospital, Cameroon. Absolute CVD risk was calculated using the Framingham and the DAD CVD risk equations, while the HOMA-IR index was used to assess insulin resistance (index ≥2.1). A total of 452 patients (361 women; 80%) were screened. The mean age was 44.4 years and most of the respondents were on antiretroviral therapy (88.5%). The median 5-year cardiovascular risk was 0.7% (25th-75th percentiles: 0.2-2.0) and 0.6% (0.3-1.3) according to the Framingham and DAD equations respectively. Of all participants, 47.3% were insulin resistant. Absolute CVD risk derived from the Framingham equation was significantly associated with insulin resistance, whereas no linear association was found using the DAD equation. The relationship between cardiovascular risk and insulin resistance in black African patients with HIV infection seems to depend on the cardiovascular risk equation used.
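A minimal sketch of the HOMA-IR index and the ≥2.1 cut-off applied in this study; the fasting glucose and insulin values are hypothetical.

```python
# Hedged sketch of the HOMA-IR index (fasting glucose in mmol/L, insulin in uU/mL)
# and the >= 2.1 cut-off used above. Input values are hypothetical.
def homa_ir(fasting_glucose_mmol_l, fasting_insulin_uu_ml):
    return (fasting_glucose_mmol_l * fasting_insulin_uu_ml) / 22.5

def is_insulin_resistant(glucose_mmol_l, insulin_uu_ml, cutoff=2.1):
    return homa_ir(glucose_mmol_l, insulin_uu_ml) >= cutoff

print(round(homa_ir(5.2, 10.0), 2))    # 2.31
print(is_insulin_resistant(5.2, 10.0)) # True
```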
Fotis, Dimitrios; Doukas, Michael; Wijnhoven, Bas PL; Didden, Paul; Biermann, Katharina; Bruno, Marco J
2015-01-01
Background Due to the high mortality and morbidity rates of esophagectomy, endoscopic mucosal resection (EMR) is increasingly used for the curative treatment of early low risk Barrett’s adenocarcinoma. Objective This retrospective cohort study aimed to assess the prevalence of lymph node metastases (LNM) in submucosal (T1b) esophageal adenocarcinomas (EAC) in relation to the absolute depth of submucosal tumor invasion and demonstrate the efficacy of EMR for low risk (well and moderately differentiated without lymphovascular invasion) EAC with sm1 invasion (submucosal invasion ≤500 µm) according to the Paris classification. Methods The pathology reports of patients undergoing endoscopic resection and surgery from January 1994 until December 2013 at one center were reviewed and 54 patients with submucosal invasion were included. LNM were evaluated in surgical specimens and by follow up examinations in case of EMR. Results No LNM were observed in 10 patients with sm1 adenocarcinomas that underwent endoscopic resection. Three of them underwent supplementary endoscopic eradication therapy with a median follow up of 27 months for patients with sm1 tumors. In the surgical series two patients (29%) with sm1 invasion according to the pragmatic classification (subdivision of the submucosa into three equal thirds), staged as sm2-3 in the Paris classification, had LNM. The rate of LNM for surgical patients with low risk sm1 tumors was 10% according to the pragmatic classification and 0% according to Paris classification. Conclusion Different classifications of the tumor invasion depth lead to different LNM risks and treatment strategies for sm1 adenocarcinomas. Patients with low risk sm1 adenocarcinomas appear to be suitable candidates for EMR. PMID:26668743
Hamilton-Craig, Christian R; Chow, Clara K; Younger, John F; Jelinek, V M; Chan, Jonathan; Liew, Gary Yh
2017-10-16
Introduction This article summarises the Cardiac Society of Australia and New Zealand position statement on coronary artery calcium (CAC) scoring. CAC scoring is a non-invasive method for quantifying coronary artery calcification using computed tomography. It is a marker of atherosclerotic plaque burden and the strongest independent predictor of future myocardial infarction and mortality. CAC scoring provides incremental risk information beyond traditional risk calculators such as the Framingham Risk Score. Its use for risk stratification is confined to primary prevention of cardiovascular events, and can be considered as individualised coronary risk scoring for intermediate risk patients, allowing reclassification to low or high risk based on the score. Medical practitioners should carefully counsel patients before CAC testing, which should only be undertaken if an alteration in therapy, including embarking on pharmacotherapy, is being considered based on the test result. Main recommendations CAC scoring should primarily be performed on individuals without coronary disease aged 45-75 years (absolute 5-year cardiovascular risk of 10-15%) who are asymptomatic. CAC scoring is also reasonable in lower risk groups (absolute 5-year cardiovascular risk, < 10%) where risk scores traditionally underestimate risk (eg, family history of premature CVD) and in patients with diabetes aged 40-60 years. We recommend aspirin and a high efficacy statin in high risk patients, defined as those with a CAC score ≥ 400, or a CAC score of 100-399 and above the 75th percentile for age and sex. It is reasonable to treat patients with CAC scores ≥ 100 with aspirin and a statin. It is reasonable not to treat asymptomatic patients with a CAC score of zero. Changes in management as a result of this statement Cardiovascular risk is reclassified according to CAC score. High risk patients are treated with a high efficacy statin and aspirin. Very low risk patients (ie, CAC score of zero) do not benefit from treatment.
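The treatment thresholds summarised above can be expressed as a simple decision rule; the sketch below is an illustrative simplification of the position statement's recommendations, not clinical software, and the percentile flag refers to the age- and sex-specific CAC percentile.

```python
# Hedged sketch: simplified decision rule reflecting the CAC thresholds in the
# position statement above (illustration only, not a clinical tool).
def cac_treatment_suggestion(cac_score, above_75th_percentile=False):
    if cac_score >= 400 or (100 <= cac_score <= 399 and above_75th_percentile):
        return "high risk: aspirin + high-efficacy statin recommended"
    if cac_score >= 100:
        return "aspirin + statin reasonable"
    if cac_score == 0:
        return "treatment generally not indicated"
    return "individualise (CAC 1-99)"

print(cac_treatment_suggestion(450))
print(cac_treatment_suggestion(150, above_75th_percentile=True))
print(cac_treatment_suggestion(0))
```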
Joergensen, Torben; Bandosz, Piotr; Hallas, Jesper; Prescott, Eva Irene Bossano; Capewell, Simon
2018-01-01
Aim To quantify the contribution of changes in population levels of different risk factors and in treatment uptake to the decline in CHD mortality in Denmark from 1991 to 2007 in different socioeconomic groups. Design We used IMPACTSEC, a previously validated policy model using data from different population registries. Participants All adults aged 25–84 years living in Denmark in 1991 and 2007. Main outcome measure Deaths prevented or postponed (DPP). Results There were approximately 11,000 fewer CHD deaths in Denmark in 2007 than would be expected if the 1991 mortality rates had persisted. Higher mortality rates were observed in the lowest socioeconomic quintile. The highest absolute reduction in CHD mortality was seen in this group, but the highest relative reduction was in the most affluent socioeconomic quintile. Overall, the IMPACTSEC model explained nearly two thirds of the decline in CHD mortality. Improved treatments accounted for approximately 25%, with the least relative mortality reduction in the most deprived quintile. Risk factor improvements accounted for approximately 40% of the mortality decrease, with similar gains across all socioeconomic groups. The 36% gap in explaining all DPPs may reflect inaccurate data or risk factors not quantified in the current model. Conclusions According to the IMPACTSEC model, the largest contribution to the CHD mortality decline in Denmark from 1991 to 2007 was from improvements in risk factors, with similar gains across all socioeconomic groups. However, we found a clear socioeconomic trend for the treatment contribution favouring the most affluent groups. PMID:29672537
Do Weather Phenomena Have Any Influence on the Occurrence of Spontaneous Pneumothorax?
Vodička, Josef; Vejvodová, Šárka; Šmíd, David; Fichtl, Jakub; Špidlen, Vladimír; Kormunda, Stanislav; Hostýnek, Jiří; Moláček, Jiří
2016-05-01
The objective of this study was to assess the impact of weather phenomena on the occurrence of spontaneous pneumothorax (SP) in the Plzeň region (Czech Republic). We retrospectively analysed 450 cases of SP in 394 patients between 1991 and 2013. We observed changes in average daily values of atmospheric pressure, air temperature and daily maximum wind gust for each day of that period and their effect on the development of SP. The risk of developing SP is 1.41 times higher (P=.0017) with air pressure changes of more than ±6.1 hPa. When the absolute value of the air temperature changes by more than ±0.9°C, the risk of developing SP is 1.55 times higher (P=.0002). When the wind speed difference over the 5 days prior to onset of SP is less than 13 m/s, the risk of SP is 2.16 times higher (P=.0004). If the pressure difference is greater than ±6.1 hPa and the temperature difference is greater than ±0.9°C, or the wind speed difference during the 5 days prior to onset of SP is less than 10.7 m/s, the risk of SP is 2.04 times higher (P≤.0001). Changes in atmospheric pressure, air temperature and wind speed are undoubtedly involved in the development of SP, but they do not seem to be the only factors causing rupture of blebs or emphysematous bullae.
Triggers for Violent Criminality in Patients With Psychotic Disorders.
Sariaslan, Amir; Lichtenstein, Paul; Larsson, Henrik; Fazel, Seena
2016-08-01
Absolute and relative risks of violence are increased in patients with psychotic disorders, but the contribution of triggers for violent acts to these risks is uncertain. To examine whether a range of triggers for violent acts are associated with risks of violence in patients diagnosed with psychotic disorders and in individuals without a psychiatric diagnosis. Using a sample of all individuals born in Sweden between 1958 and 1988 (N = 3 123 724), we identified patients in the National Patient Register who were diagnosed with schizophrenia spectrum disorders (n = 34 903) and bipolar disorder (n = 29 692), as well as unaffected controls (n = 2 763 012). We then identified, within each subsample, persons who had experienced any of the following triggers for violent acts between January 1, 2001, and December 15, 2013: exposure to violence, parental bereavement, self-harm, traumatic brain injury, unintentional injuries, and substance intoxication. By using within-individual models, we conducted conditional logistic regression to compare the risk of the individual engaging in violent acts in the week following the exposure to a trigger with the risk during earlier periods of equivalent length. All time-invariant confounders (eg, genetic and early environmental influences) were controlled for by this research design and we further adjusted for time-varying sociodemographic factors. Adjusted odds ratios (aORs) of violent crime occurring in the week following the exposure to a trigger event compared with earlier periods. Among the sample of 2 827 607 individuals (1 492 186 male and 1 335 421 female), all of the examined trigger events were associated with increased risk of violent crime in the week following exposure. The largest 1-week absolute risk of violent crime was observed following exposure to violence (70-177 violent crimes per 10 000 persons). For most triggers, the relative risks did not vary significantly by diagnosis, including unintentional injuries (aOR range, 3.5-4.8), self-harm (aOR range, 3.9-4.2), and substance intoxication (aOR range, 3.0-4.0). Differences by diagnosis included parental bereavement, which was significantly higher in patients with schizophrenia spectrum disorders (aOR, 5.0; 95% CI, 3.0-8.1) compared with controls (aOR, 1.7; 95% CI, 1.3-2.2). In addition to identifying risk factors for violence, clarifying the timing of the triggers may provide opportunities to improve risk assessment and management in individuals with psychotic disorders.
Noto, Nobutaka; Kato, Masataka; Abe, Yuriko; Kamiyama, Hiroshi; Karasawa, Kensuke; Ayusawa, Mamoru; Takahashi, Shori
2015-01-01
Previous studies that used carotid ultrasound have been largely conflicting in regards to whether or not patients after Kawasaki disease (KD) have a greater carotid intima-media thickness (CIMT) than controls. To test the hypothesis that there are significant differences between the values of CIMT expressed as absolute values and standard deviation scores (SDS) in children and adolescents after KD and controls, we reviewed 12 published articles regarding CIMT on KD patients and controls. The mean ± SD of absolute CIMT (mm) in the KD patients and controls obtained from each article was transformed to SDS (CIMT-SDS) using age-specific reference values established by Jourdan et al. (J: n = 247) and our own data (N: n = 175), and the results among these 12 articles were compared between the two groups and the references for comparison of racial disparities. There were no significant differences in mean absolute CIMT and mean CIMT-SDS for J between KD patients and controls (0.46 ± 0.06 mm vs. 0.44 ± 0.04 mm, p = 0.133, and 1.80 ± 0.84 vs. 1.25 ± 0.12, p = 0.159, respectively). However, there were significant differences in mean CIMT-SDS for N between KD patients and controls (0.60 ± 0.71 vs. 0.01 ± 0.65, p = 0.042). When we assessed the nine articles on Asian subjects, the difference of CIMT-SDS between the two groups was invariably significant only for N (p = 0.015). Compared with the reference values, CIMT-SDS of controls was within the normal range at a rate of 41.6 % for J and 91.6 % for N. These results indicate that age- and race-specific reference values for CIMT are mandatory for performing accurate assessment of the vascular status in healthy children and adolescents, particularly in those after KD considered at increased long-term cardiovascular risk.
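The transformation from an absolute CIMT to a standard deviation score (SDS) is an ordinary z-score against age-specific reference values; a minimal sketch follows, with placeholder reference values rather than the Jourdan or the authors' reference data.

```python
# Hedged sketch: converting an absolute CIMT (mm) into a standard deviation
# score (z-score) against age-specific reference values. The reference mean
# and SD below are placeholders, not published reference data.
def cimt_sds(cimt_mm, ref_mean_mm, ref_sd_mm):
    return (cimt_mm - ref_mean_mm) / ref_sd_mm

print(round(cimt_sds(0.46, ref_mean_mm=0.42, ref_sd_mm=0.05), 2))  # 0.8
```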
Filippi, Andrea Riccardo; Ragona, Riccardo; Piva, Cristina; Scafa, Davide; Fiandra, Christian; Fusella, Marco; Giglioli, Francesca Romana; Lohr, Frank; Ricardi, Umberto
2015-05-01
The purpose of this study was to evaluate the risks of second cancers and cardiovascular diseases associated with an optimized volumetric modulated arc therapy (VMAT) planning solution in a selected cohort of stage I/II Hodgkin lymphoma (HL) patients treated with either involved-node or involved-site radiation therapy in comparison with 3-dimensional conformal radiation therapy (3D-CRT). Thirty-eight patients (13 males and 25 females) were included. Disease extent was mediastinum alone (n=8, 21.1%); mediastinum plus unilateral neck (n=19, 50%); mediastinum plus bilateral neck (n=11, 29.9%). Prescription dose was 30 Gy in 2-Gy fractions. Only 5 patients had mediastinal bulky disease at diagnosis (13.1%). Anteroposterior 3D-CRT was compared with a multiarc optimized VMAT solution. Lung, breast, and thyroid cancer risks were estimated by calculating a lifetime attributable risk (LAR), with a LAR ratio (LAR(VMAT)-to-LAR(3D-CRT)) as a comparative measure. Cardiac toxicity risks were estimated by calculating absolute excess risk (AER). The LAR ratio favored 3D-CRT for lung cancer induction risk in mediastinal alone (P=.004) and mediastinal plus unilateral neck (P=.02) presentations. LAR ratio for breast cancer was lower for VMAT in mediastinal plus bilateral neck presentations (P=.02), without differences for other sites. For thyroid cancer, no significant differences were observed, regardless of anatomical presentation. A significantly lower AER of cardiac (P=.038) and valvular diseases (P<.0001) was observed for VMAT regardless of disease extent. In a cohort of patients with favorable characteristics in terms of disease extent at diagnosis (large prevalence of nonbulky presentations without axillary involvement), optimized VMAT reduced heart disease risk with comparable risks of thyroid and breast cancer, with an increase in lung cancer induction probability. The results are however strongly influenced by the different anatomical presentations, supporting an individualized approach. Copyright © 2015 Elsevier Inc. All rights reserved.
Resonant imaging of carotenoid pigments in the human retina
NASA Astrophysics Data System (ADS)
Gellermann, Werner; Emakov, Igor V.; McClane, Robert W.
2002-06-01
We have generated high spatial resolution images showing the distribution of carotenoid macular pigments in the human retina using Raman spectroscopy. A low level of macular pigments is associated with an increased risk of developing age-related macular degeneration, a leading cause of irreversible blindness. Using excised human eyecups and resonant excitation of the pigment molecules with narrow bandwidth blue light from a mercury arc lamp, we record Raman images originating from the carbon-carbon double bond stretch vibrations of lutein and zeaxanthin, the carotenoids comprising human macular pigments. Our Raman images reveal significant differences among subjects, both in regard to absolute levels as well as spatial distribution within the macula. Since the light levels used to obtain these images are well below established safety limits, this technique holds promise for developing a rapid screening diagnostic in large populations at risk for vision loss from age-related macular degeneration.
Perroni, Fabrizio; Guidetti, Laura; Cignitti, Lamberto; Baldari, Carlo
2015-01-01
During fire emergencies, firefighters wear protective clothing (PC) and a self-contained breathing apparatus (S.C.B.A.) to protect themselves from injuries. The purpose of this study was to investigate differences in aerobic capacity in 197 firefighters (age: 34±7 yr; BMI: 24.4±2.3 kg·m-2), evaluated by the Queen's College Step Test (QCST), a field test performed with and without fire protective garments, and to analyze differences among age groups (<25 yr, 26-30 yr, 31-35 yr, 36-40 yr and >40 yr). Analysis of variance was applied to assess differences (p < 0.05) between tests and age groups in absolute and weight-related values, and the correlation between QCST performed with and without PC+S.C.B.A. was examined. The results showed that 13% of firefighters failed to complete the test with PC+S.C.B.A., and that there were significant differences between QCST performed with and without PC+S.C.B.A. in absolute (F(1,169) = 42.6, p < 0.0001) and weight-related (F(1,169) = 339.9, p < 0.0001) terms. A better correlation was found in L·min-1 (r=0.67) than in ml·kg-1·min-1 (r=0.54). Moreover, we found significant differences among age groups in both absolute and weight-related values. The assessment of firefighters' maximum oxygen uptake in absolute terms can be a useful tool to evaluate their cardiovascular strain. PMID:25764201
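The QCST is usually scored with published prediction equations that estimate VO2max from the post-exercise recovery heart rate; the sketch below assumes the commonly cited equations (not necessarily the exact ones used in this study), with hypothetical inputs, and shows the conversion from weight-related to absolute values.

```python
# Hedged sketch: commonly cited Queen's College Step Test prediction equations
# for VO2max (mL/kg/min) from recovery heart rate; heart rate and body mass
# below are hypothetical.
def qcst_vo2max(recovery_hr_bpm, male=True):
    if male:
        return 111.33 - 0.42 * recovery_hr_bpm
    return 65.81 - 0.1847 * recovery_hr_bpm

relative = qcst_vo2max(120, male=True)  # mL/kg/min (weight-related value)
absolute = relative * 80.0 / 1000.0     # L/min for an 80 kg firefighter (absolute value)
print(round(relative, 1), round(absolute, 2))
```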
Is Dose Deformation–Invariance Hypothesis Verified in Prostate IGRT?
DOE Office of Scientific and Technical Information (OSTI.GOV)
Simon, Antoine, E-mail: antoine.simon@univ-rennes1.fr; Laboratoire Traitement du Signal et de l'Image, Université de Rennes 1, 35000 Rennes; Le Maitre, Amandine
Purpose: To assess dose uncertainties resulting from the dose deformation–invariance hypothesis in prostate cone beam computed tomography (CT)–based image guided radiation therapy (IGRT), namely to evaluate whether rigidly propagated planned dose distribution enables good estimation of fraction dose distributions. Methods and Materials: Twenty patients underwent a CT scan for planning intensity modulated radiation therapy–IGRT delivering 80 Gy to the prostate, followed by weekly CT scans. Two methods were used to obtain the dose distributions on the weekly CT scans: (1) recalculating the dose using the original treatment plan; and (2) rigidly propagating the planned dose distribution. The cumulative doses were then estimated in the organs at risk for each dose distribution by deformable image registration. The differences between recalculated and propagated doses were finally calculated for the fraction and the cumulative dose distributions, by use of per-voxel and dose-volume histogram (DVH) metrics. Results: For the fraction dose, the mean per-voxel absolute dose difference was <1 Gy for 98% and 95% of the fractions for the rectum and bladder, respectively. The maximum dose difference within 1 voxel reached, however, 7.4 Gy in the bladder and 8.0 Gy in the rectum. The mean dose differences were correlated with gas volume for the rectum and patient external contour variations for the bladder. The mean absolute differences for the considered volume receiving greater than or equal to dose x (V_x) of the DVH were between 0.37% and 0.70% for the rectum and between 0.53% and 1.22% for the bladder. For the cumulative dose, the mean differences in the DVH were between 0.23% and 1.11% for the rectum and between 0.55% and 1.66% for the bladder. The largest dose difference was 6.86%, for bladder V_80Gy. The mean dose differences were <1.1 Gy for the rectum and <1 Gy for the bladder. Conclusions: The deformation–invariance hypothesis was corroborated for the organs at risk in prostate IGRT except in cases of a large disappearance or appearance of rectal gas for the rectum and large external contour variations for the bladder.
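A minimal sketch of the two comparison metrics used above, the mean per-voxel absolute dose difference and a DVH V_x value, computed on synthetic dose arrays rather than treatment-planning data.

```python
# Hedged sketch: per-voxel absolute dose difference and a DVH V_x metric
# (fraction of the organ volume receiving >= x Gy). Arrays are synthetic.
import numpy as np

rng = np.random.default_rng(1)
dose_recalculated = rng.uniform(0, 80, size=10000)                    # Gy, per voxel
dose_propagated = dose_recalculated + rng.normal(0, 0.8, 10000)       # rigidly propagated plan dose

mean_abs_diff = np.mean(np.abs(dose_recalculated - dose_propagated))  # Gy

def v_x(dose_gy, x_gy):
    """V_x: proportion of voxels (i.e. of the volume) receiving at least x Gy."""
    return np.mean(dose_gy >= x_gy)

delta_v70 = abs(v_x(dose_recalculated, 70) - v_x(dose_propagated, 70))
print(round(mean_abs_diff, 2), round(100 * delta_v70, 2))
```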
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nguyen, J.; Moteabbed, M.; Paganetti, H., E-mail: hpaganetti@mgh.harvard.edu
2015-01-15
Purpose: Theoretical dose–response models offer the possibility to assess second cancer induction risks after external beam therapy. The parameters used in these models are determined with limited data from epidemiological studies. Risk estimations are thus associated with considerable uncertainties. This study aims at illustrating uncertainties when predicting the risk for organ-specific second cancers in the primary radiation field illustrated by choosing selected treatment plans for brain cancer patients. Methods: A widely used risk model was considered in this study. The uncertainties of the model parameters were estimated with reported data of second cancer incidences for various organs. Standard error propagation was then subsequently applied to assess the uncertainty in the risk model. Next, second cancer risks of five pediatric patients treated for cancer in the head and neck regions were calculated. For each case, treatment plans for proton and photon therapy were designed to estimate the uncertainties (a) in the lifetime attributable risk (LAR) for a given treatment modality and (b) when comparing risks of two different treatment modalities. Results: Uncertainties in excess of 100% of the risk were found for almost all organs considered. When applied to treatment plans, the calculated LAR values have uncertainties of the same magnitude. A comparison between cancer risks of different treatment modalities, however, does allow statistically significant conclusions. In the studied cases, the patient averaged LAR ratio of proton and photon treatments was 0.35, 0.56, and 0.59 for brain carcinoma, brain sarcoma, and bone sarcoma, respectively. Their corresponding uncertainties were estimated to be potentially below 5%, depending on uncertainties in dosimetry. Conclusions: The uncertainty in the dose–response curve in cancer risk models makes it currently impractical to predict the risk for an individual external beam treatment. On the other hand, the ratio of absolute risks between two modalities is less sensitive to the uncertainties in the risk model and can provide statistically significant estimates.
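The standard error propagation invoked above, applied to a ratio of two independent risk estimates, reduces to combining relative uncertainties in quadrature; a minimal sketch with illustrative numbers, not values from the study.

```python
# Hedged sketch: first-order error propagation for a ratio R = a / b of two
# independent risk estimates; sigma_R/R = sqrt((sigma_a/a)^2 + (sigma_b/b)^2).
# Numbers are illustrative.
from math import sqrt

def ratio_with_uncertainty(a, sigma_a, b, sigma_b):
    r = a / b
    sigma_r = r * sqrt((sigma_a / a) ** 2 + (sigma_b / b) ** 2)
    return r, sigma_r

print(ratio_with_uncertainty(0.35, 0.05, 1.0, 0.10))
```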
40 CFR 53.56 - Test for effect of variations in ambient pressure.
Code of Federal Regulations, 2012 CFR
2012-07-01
... the tests and shall be checked at zero and at least one flow rate within ±3 percent of 16.7 L/min... this test, the absolute difference in values calculated in Equation 21 of this paragraph (g)(4) must... absolute difference between the mean ambient air pressure indicated by the test sampler and the ambient...
Costa, Rubens Barros; Costa, Ricardo L.B.; Talamantes, Sarah M.; Kaplan, Jason B.; Bhave, Manali A.; Rademaker, Alfred; Miller, Corinne; Carneiro, Benedito A.; Mahalingam, Devalingam; Chae, Young Kwang
2018-01-01
Introduction Anaplastic lymphoma kinase (ALK) inhibitors are the mainstay treatment for patients with non-small cell lung carcinoma (NSCLC) harboring a rearrangement of the ALK gene or the ROS1 oncogenes. With the recent publication of pivotal trials leading to the approval of these compounds in different indications, their toxicity profile warrants an update. Materials and Methods A systematic literature search was performed in July 2017. Studies evaluating US FDA approved doses of one of the following ALK inhibitors: Crizotinib, Ceritinib, Alectinib or Brigatinib as monotherapy were included. Data were analyzed using random effects meta-analysis for absolute risks (AR), study heterogeneity, publication bias and differences among treatments. Results Fifteen trials with a total of 2,005 patients with evaluable toxicity data were included in this report. There was significant heterogeneity amongst different studies. The pooled AR of death and severe adverse events were 0.5% and 34.5%, respectively. Grade 3/4 nausea, vomiting, diarrhea, and constipation were uncommon: 2.6%, 2.5%, 2.7%, 1.2%, respectively. Conclusions ALK inhibitors have an acceptable safety profile with a low risk of treatment-related deaths. Important differences in toxicity profile were detected amongst the different drugs. PMID:29774128
Peña, R; Wall, S; Persson, L A
2000-01-01
OBJECTIVES: This study assessed the effect of poverty and social inequity on infant mortality risks in Nicaragua from 1988 to 1993 and the preventive role of maternal education. METHODS: A cohort analysis of infant survival, based on reproductive histories of a representative sample of 10,867 women aged 15 to 49 years in León, Nicaragua, was conducted. A total of 7073 infants were studied; 342 deaths occurred during 6394 infant-years of follow-up. Outcome measures were infant mortality rate (IMR) and relative mortality risks for different groups. RESULTS: IMR was 50 per 1000 live births. Poverty, expressed as unsatisfied basic needs (UBN) of the household, increased the risk of infant death (adjusted relative risk [RR] = 1.49; 95% confidence interval [CI] = 1.15, 1.92). Social inequity, expressed as the contrast between the household UBN and the predominant UBN of the neighborhood, further increased the risk (adjusted RR = 1.74; 95% CI = 1.12, 2.71). A protective effect of the mother's educational level was seen only in poor households. CONCLUSIONS: Apart from absolute level of poverty, social inequity may be an independent risk factor for infant mortality in a low-income country. In poor households, female education may contribute to preventing infant mortality. PMID:10630139
Vegetables, but not pickled vegetables, are negatively associated with the risk of breast cancer.
Yu, Hyejin; Hwang, Ji-Yun; Ro, Jungsil; Kim, Jeongseon; Chang, Namsoo
2010-01-01
This study investigated the association between pickled vegetable consumption and the risk of breast cancer using a validated food frequency questionnaire. A total of 358 patients with breast cancer who were matched to 360 healthy controls by age (using a 5-yr age distribution) were recruited from the National Cancer Center in South Korea. After adjusting for nondietary risk factors, total vegetable intake was inversely associated with risk of breast cancer. However, unlike nonpickled vegetables, pickled vegetable intake and its proportion relative to total vegetables were positively associated with the risk of breast cancer, and this association was more profound and consistent when pickled vegetable intake was considered as a proportion relative to total vegetables (odds ratio [OR] = 6.24, 95% confidence interval [CI] = 3.55-10.97; P for trend <0.001 for highest vs. lowest quartiles of intake) than as the absolute consumed amount (OR = 2.47, 95% CI = 1.45-4.21; P for trend = 0.015 for highest vs. lowest quartiles of intake). These results suggest that not only the amount of total vegetable intake but also the amounts of different types of vegetable (i.e., pickled or nonpickled) and their proportions relative to total vegetables are significantly associated with the risk of breast cancer.
Atrial Fibrillation, Type 2 Diabetes, and Non-Vitamin K Antagonist Oral Anticoagulants: A Review.
Plitt, Anna; McGuire, Darren K; Giugliano, Robert P
2017-04-01
Atrial fibrillation (AF) is the most common cardiac arrhythmia and is associated with a 5-fold increase in the risk for stroke. Type 2 diabetes is an independent risk factor for both stroke and atrial fibrillation, and in the setting of AF, type 2 diabetes is independently associated with a 2% to 3.5% increase in absolute stroke rate per year. The overlap in the pathophysiologies of AF and type 2 diabetes is not well understood, and current practice guidelines provide few recommendations regarding patients with both conditions. In this article, we review the epidemiology and pathophysiology of the nexus of AF and type 2 diabetes. Furthermore, we analyze the subgroup of patients with type 2 diabetes enrolled in phase 3 clinical trials of non-vitamin K antagonist oral anticoagulants in prevention of arterial thromboembolism in AF, highlighting the greater absolute benefit of non-vitamin K oral anticoagulants in patients with type 2 diabetes. Finally, we offer recommendations on risk stratification and therapy for patients with concomitant AF and type 2 diabetes. We highlight the increased thromboembolic risk with coexisting AF and type 2 diabetes. We recommend that further studies be done to evaluate the potential benefits of anticoagulation for all patients who have both conditions and the potential for non-vitamin K oral anticoagulants to have greater benefits than risks over vitamin K antagonists.
Padula, Francesco; Laganà, Antonio Simone; Vitale, Salvatore Giovanni; D'Emidio, Laura; Coco, Claudio; Giannarelli, Diana; Cariola, Maria; Favilli, Alessandro; Giorlandino, Claudio
2017-05-01
Maternal age is a crucial factor in fetal aneuploidy screening, resulting in an increased rate of false-positive cases in older women and false-negative cases in younger women. The absolute risk (AR) is the simplest way to eliminate the background maternal age risk, as it represents how much the combined risk improves on the maternal background risk. The aim of this work is to assess the performance of the AR in the combined first-trimester screening for aneuploidies. We performed a retrospective validation of the AR in combined first-trimester screening for fetal aneuploidies in an unselected population at the Altamedica Fetal-Maternal Medical Center in Rome between March 2007 and December 2008. Of 3845 women included in the study, complete follow-up was available for 2984. An AR < 3 would have identified 22 of 23 cases of aneuploidy, with a detection rate of 95.7% (95% CI 87.3-100), a false-positive rate of 8.7% (95% CI 7.7-9.7) and a false-negative rate of 4.3% (95% CI 0-12.7). In our study, the AR improved the detection rate for aneuploidy. Further research and a prospective study on a larger population would help us to improve the AR in detecting most cases of aneuploidy.
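A minimal sketch of the screening-performance arithmetic behind the figures above; the number of screen-positive unaffected pregnancies is an approximation chosen only to be consistent with the reported 8.7% false-positive rate, not a figure taken from the paper.

```python
# Hedged sketch: detection rate, false-negative rate and false-positive rate
# from screening counts. The flagged-unaffected count is illustrative.
true_pos, total_affected = 22, 23
flagged_unaffected = 258                  # illustrative, ~8.7% of unaffected
total_unaffected = 2984 - total_affected

detection_rate = true_pos / total_affected          # ~0.957
false_negative_rate = 1 - detection_rate            # ~0.043
false_positive_rate = flagged_unaffected / total_unaffected
print(round(detection_rate, 3), round(false_negative_rate, 3), round(false_positive_rate, 3))
```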
Malaria Chemoprophylaxis: Strategies for Risk Groups
Schlagenhauf, Patricia; Petersen, Eskild
2008-01-01
The risk of malaria for travelers varies from region to region and depends on the intensity of transmission, the duration of the stay in the area of endemicity, the style of travel, and the efficacy of preventive measures. The decision to recommend chemoprophylaxis to travelers to areas with a low risk of malarial infection is especially difficult because the risk of infection must be balanced with the risk of experiencing side effects. If the risk of side effects by far exceeds the risk of infection, the traveler needs information on measures against mosquito bites and advice on prompt diagnosis and self-treatment. The risk is difficult to quantify, and the absolute risk for travelers to most areas is not known, especially because the populations at risk are unknown. We propose here that the best approximation of the risk to the traveler to a specific area is to use the risk to the indigenous population as a guideline for the risk to the traveler, and we provide examples on how risk in the indigenous population can be used for the estimation of risk of malarial infection for travelers. Special groups are long-term visitors and residents, who often perceive risk differently, cease using chemoprophylaxis, and rely on self-diagnosis and treatment. For long-term visitors, the problem of fake drugs needs to be discussed. Strategies for chemoprophylaxis and self-treatment of pregnant women and small children are discussed. So far, malaria prophylaxis is recommended to prevent Plasmodium falciparum infections, and primaquine prophylaxis against persistent Plasmodium vivax and Plasmodium ovale infections in travelers is not recommended. PMID:18625682
Acupuncture for treating fibromyalgia
Deare, John C; Zheng, Zhen; Xue, Charlie CL; Liu, Jian Ping; Shang, Jingsheng; Scott, Sean W; Littlejohn, Geoff
2014-01-01
Background One in five fibromyalgia sufferers use acupuncture treatment within two years of diagnosis. Objectives To examine the benefits and safety of acupuncture treatment for fibromyalgia. Search methods We searched CENTRAL, PubMed, EMBASE, CINAHL, National Research Register, HSR Project and Current Contents, as well as the Chinese databases VIP and Wangfang to January 2012 with no language restrictions. Selection criteria Randomised and quasi-randomised studies evaluating any type of invasive acupuncture for fibromyalgia diagnosed according to the American College of Rheumatology (ACR) criteria, and reporting any main outcome: pain, physical function, fatigue, sleep, total well-being, stiffness and adverse events. Data collection and analysis Two author pairs selected trials, extracted data and assessed risk of bias. Treatment effects were reported as standardised mean differences (SMD) and 95% confidence intervals (CI) for continuous outcomes using different measurement tools (pain, physical function, fatigue, sleep, total well-being and stiffness) and risk ratio (RR) and 95% CI for dichotomous outcomes (adverse events). We pooled data using the random-effects model. Main results Nine trials (395 participants) were included. All studies except one were at low risk of selection bias; five were at risk of selective reporting bias (favouring either treatment group); two were subject to attrition bias (favouring acupuncture); three were subject to performance bias (favouring acupuncture) and one to detection bias (favouring acupuncture). Three studies utilised electro-acupuncture (EA) with the remainder using manual acupuncture (MA) without electrical stimulation. All studies used ’formula acupuncture’ except for one, which used trigger points. Low quality evidence from one study (13 participants) showed EA improved symptoms with no adverse events at one month following treatment. Mean pain in the non-treatment control group was 70 points on a 100 point scale; EA reduced pain by a mean of 22 points (95% confidence interval (CI) 4 to 41), or 22% absolute improvement. Control group global well-being was 66.5 points on a 100 point scale; EA improved well-being by a mean of 15 points (95% CI 5 to 26 points). Control group stiffness was 4.8 points on a 0 to 10 point scale; EA reduced stiffness by a mean of 0.9 points (95% CI 0.1 to 2 points; absolute reduction 9%, 95% CI 4% to 16%). Fatigue was 4.5 points (10 point scale) without treatment; EA reduced fatigue by a mean of 1 point (95% CI 0.22 to 2 points), absolute reduction 11% (2% to 20%). There was no difference in sleep quality (MD 0.4 points, 95% CI −1 to 0.21 points, 10 point scale), and physical function was not reported. Moderate quality evidence from six studies (286 participants) indicated that acupuncture (EA or MA) was no better than sham acupuncture, except for less stiffness at one month. Subgroup analysis of two studies (104 participants) indicated benefits of EA. Mean pain was 70 points on a 0 to 100 point scale with sham treatment; EA reduced pain by 13% (5% to 22%) (SMD −0.63, 95% CI −1.02 to −0.23). Global well-being was 5.2 points on a 10 point scale with sham treatment; EA improved well-being: SMD 0.65, 95% CI 0.26 to 1.05; absolute improvement 11% (4% to 17%). EA improved sleep, from 3 points on a 0 to 10 point scale in the sham group: SMD 0.40 (95% CI 0.01 to 0.79); absolute improvement 8% (0.2% to 16%).
Low-quality evidence from one study suggested that the MA group resulted in poorer physical function: mean function in the sham group was 28 points (100 point scale); treatment worsened function by a mean of 6 points (95% CI −10.9 to −0.7). Low-quality evidence from three trials (289 participants) suggested no difference in adverse events between real (9%) and sham acupuncture (35%); RR 0.44 (95% CI 0.12 to 1.63). Moderate quality evidence from one study (58 participants) found that compared with standard therapy alone (antidepressants and exercise), adjunct acupuncture therapy reduced pain at one month after treatment: mean pain was 8 points on a 0 to 10 point scale in the standard therapy group; treatment reduced pain by 3 points (95% CI −3.9 to −2.1), an absolute reduction of 30% (21% to 39%). Two people treated with acupuncture reported adverse events; there were none in the control group (RR 3.57; 95% CI 0.18 to 71.21). Global well-being, sleep, fatigue and stiffness were not reported. Physical function data were not usable. Low quality evidence from one study (38 participants) showed a short-term benefit of acupuncture over antidepressants in pain relief: mean pain was 29 points (0 to 100 point scale) in the antidepressant group; acupuncture reduced pain by 17 points (95% CI −24.1 to −10.5). Other outcomes or adverse events were not reported. Moderate-quality evidence from one study (41 participants) indicated that deep needling with or without deqi did not differ in pain, fatigue, function or adverse events. Other outcomes were not reported. Four studies reported no differences between acupuncture and control or other treatments described at six to seven months follow-up. No serious adverse events were reported, but there were insufficient adverse events to be certain of the risks. Authors’ conclusions There is low to moderate-level evidence that compared with no treatment and standard therapy, acupuncture improves pain and stiffness in people with fibromyalgia. There is moderate-level evidence that the effect of acupuncture does not differ from sham acupuncture in reducing pain or fatigue, or improving sleep or global well-being. EA is probably better than MA for pain and stiffness reduction and improvement of global well-being, sleep and fatigue. The effect lasts up to one month, but is not maintained at six months follow-up. MA probably does not improve pain or physical functioning. Acupuncture appears safe. People with fibromyalgia may consider using EA alone or with exercise and medication. The small sample size, scarcity of studies for each comparison, and lack of an ideal sham acupuncture weaken the level of evidence and its clinical implications. Larger studies are warranted. PMID:23728665
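A minimal sketch of two summary measures used throughout this review: the conversion of a mean difference on a bounded scale into an "absolute improvement" percentage, and the standardised mean difference. The pooled standard deviation used below is illustrative, not a value extracted from the included trials.

```python
# Hedged sketch: expressing a mean difference on a bounded scale as an
# "absolute improvement" percentage, and forming a standardised mean
# difference (SMD). Values are illustrative.
def absolute_improvement_pct(mean_difference, scale_min=0, scale_max=100):
    return 100.0 * mean_difference / (scale_max - scale_min)

def standardised_mean_difference(mean_diff, pooled_sd):
    return mean_diff / pooled_sd

print(absolute_improvement_pct(22))                          # 22.0 (% of a 0-100 pain scale)
print(round(standardised_mean_difference(-13.0, 20.6), 2))   # about -0.63
```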