Analysis of the Structure of Surgical Activity for a Suturing and Knot-Tying Task
Vedula, S. Swaroop; Malpani, Anand O.; Tao, Lingling; Chen, George; Gao, Yixin; Poddar, Piyush; Ahmidi, Narges; Paxton, Christopher; Vidal, Rene; Khudanpur, Sanjeev; Hager, Gregory D.; Chen, Chi Chiung Grace
2016-01-01
Background Surgical tasks are performed in a sequence of steps, and technical skill evaluation includes assessing task flow efficiency. Our objective was to describe differences in task flow for expert and novice surgeons for a basic surgical task. Methods We used a hierarchical semantic vocabulary to decompose and annotate maneuvers and gestures for 135 instances of a surgeon’s knot performed by 18 surgeons. We compared counts of maneuvers and gestures, and analyzed task flow by skill level. Results Experts used fewer gestures to perform the task (26.29; 95% CI = 25.21 to 27.38 for experts vs. 31.30; 95% CI = 29.05 to 33.55 for novices) and made fewer errors in gestures than novices (1.00; 95% CI = 0.61 to 1.39 vs. 2.84; 95% CI = 2.3 to 3.37). Transitions among maneuvers, and among gestures within each maneuver for expert trials were more predictable than novice trials. Conclusions Activity segments and state flow transitions within a basic surgical task differ by surgical skill level, and can be used to provide targeted feedback to surgical trainees. PMID:26950551
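Illustrative note (not the study's published method): one way to quantify how predictable transitions among annotated gestures are is to build a first-order Markov transition matrix from each gesture sequence and compute its conditional entropy, where lower entropy means a more predictable task flow. The gesture labels below are hypothetical stand-ins for the study's annotation vocabulary.

from collections import Counter
from math import log2

def transition_entropy(seq):
    """Conditional entropy (bits) of the next gesture given the current gesture."""
    pair_counts = Counter(zip(seq[:-1], seq[1:]))
    state_counts = Counter(seq[:-1])
    h = 0.0
    for (a, b), n_ab in pair_counts.items():
        p_ab = n_ab / (len(seq) - 1)          # joint probability P(a, b)
        p_b_given_a = n_ab / state_counts[a]  # conditional probability P(b | a)
        h -= p_ab * log2(p_b_given_a)
    return h

# Hypothetical gesture sequences; lower entropy indicates more predictable transitions.
expert = ["reach", "grasp", "pull", "reach", "grasp", "pull"]
novice = ["reach", "pull", "grasp", "reach", "grasp", "reach"]
print(transition_entropy(expert), transition_entropy(novice))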
2011-01-01
Background Time nurses spend with patients is associated with improved patient outcomes, reduced errors, and patient and nurse satisfaction. Few studies have measured how nurses distribute their time across tasks. We aimed to quantify how nurses distribute their time across tasks, with patients, in individual tasks, and engagement with other health care providers; and how work patterns changed over a two year period. Methods Prospective observational study of 57 nurses for 191.3 hours (109.8 hours in 2005/2006 and 81.5 in 2008), on two wards in a teaching hospital in Australia. The validated Work Observation Method by Activity Timing (WOMBAT) method was applied. Proportions of time in 10 categories of work, average time per task, time with patients and others, information tools used, and rates of interruptions and multi-tasking were calculated. Results Nurses spent 37.0% [95% CI: 34.5, 39.3] of their time with patients, which did not change in year 3 [35.7%; 95% CI: 33.3, 38.0]. Direct care, indirect care, medication tasks and professional communication together consumed 76.4% of nurses' time in year 1 and 81.0% in year 3. Time on direct and indirect care increased significantly (respectively 20.4% to 24.8%, P < 0.01; 13.0% to 16.1%, P < 0.01). Proportion of time on medication tasks (19.0%) did not change. Time in professional communication declined (24.0% to 19.2%, P < 0.05). Nurses completed an average of 72.3 tasks per hour, with a mean task length of 55 seconds. Interruptions arose at an average rate of two per hour, but medication tasks incurred 27% of all interruptions. In 25% of medication tasks nurses multi-tasked. Between years 1 and 3 nurses spent more time alone, from 27.5% [95% CI 24.5, 30.6] to 39.4% [95% CI 34.9, 43.9]. Time with health professionals other than nurses was low and did not change. Conclusions Nurses spent around 37% of their time with patients, which did not change. Work patterns were increasingly fragmented with rapid changes between tasks of short length. Interruptions were modest but their substantial over-representation among medication tasks raises potential safety concerns. There was no evidence of an increase in team-based, multi-disciplinary care. Over time nurses spent significantly less time talking with colleagues and more time alone. PMID:22111656
2014-01-01
Background To identify the relationship between perceived environmental barriers and disability in community-dwelling elderly. Methods Cross-sectional study in two community service centers in Tainan. We enrolled 200 community-dwelling residents, aged above 65 years, who had resided in the same community for at least 12 months. Basic activity of daily living (BADL) and instrumental activity of daily living (IADL) were assessed using the Hierarchy of Care Required (HCR). There were 59 participants with BADL disability and 109 with IADL disability. Perceived environmental barriers were assessed using the Craig Hospital Inventory of Environmental Factors (CHIEF). We used multinomial logistic regression to examine the relationship between perceived environmental barriers and disability. Results The presence of perceived environmental barriers was related to BADL disability (OR = 4.39, 95% CI = 1.01-19.11) and IADL disability (IADL with difficulty in 1–2 tasks: OR = 9.93, 95% CI = 3.22-30.56; IADL with difficulty in more than 2 tasks: OR = 8.40, 95% CI = 1.83-38.51). The presence of physically/structurally perceived environmental barriers was related to BADL disability (OR = 4.90, 95% CI = 1.01-23.86) and IADL disability (IADL with difficulty in 1–2 tasks: OR = 4.61, 95% CI = 1.27-16.76; IADL with difficulty in more than 2 tasks: OR = 17.05, 95% CI = 2.82-103.30). Conclusions Perceived environmental barriers are related to disability in community-dwelling elderly. PMID:24885956
Results of Hg speciation testing on tank 39 and 1Q16 tank 50 samples
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bannochie, C. J.
2016-03-07
The Savannah River National Laboratory (SRNL) was tasked with preparing and shipping samples for Hg speciation by Eurofins Frontier Global Sciences, Inc. in Seattle, WA, on behalf of the Savannah River Remediation (SRR) Mercury Task Team. The seventeenth shipment of samples was designated to include two Tank 39 samples and the 1Q16 Tank 50 Quarterly WAC sample. The surface Tank 39 sample was pulled at 262.1” from the tank bottom, and the depth Tank 39 sample was pulled at 95” from the tank bottom. The 1Q16 Tank 50 WAC sample was drawn from the 1-L variable depth sample received by SRNL.
El-Helaly, Mohamed; Balkhy, Hanan H.; Vallenius, Laura
2017-01-01
Objectives: Work-related carpal tunnel syndrome (CTS) has been reported in different occupations, including laboratory technicians, so this study was carried out to determine the prevalence and the associated personal and ergonomic factors for CTS among laboratory technicians. Methods: A cross-sectional study was conducted among 279 laboratory technicians at King Fahd Hospital, Saudi Arabia, who filled in a self-administered questionnaire, including questions regarding their demographic criteria, occupational history, job tasks, workplace tools, ergonomic factors at work, and symptoms suggestive of CTS. Physical examinations and electrodiagnostic studies were carried out for those who had symptoms suggestive of CTS to confirm the diagnosis. Univariate and multivariate analyses were performed for both personal and physical factors in association with confirmed CTS among laboratory technicians. Results: The prevalence of CTS among the laboratory technicians was 9.7% (27/279). The following were the statistically significant risk factors for CTS among them: gender (all cases of CTS were female, P=0.00), arm/hand exertion (OR: 7.96; 95% CI: 1.84-34.33), pipetting (OR: 7.27; 95% CI: 3.15-16.78), repetitive tasks (OR: 4.60; 95% CI: 1.39-15.70), using unadjustable chairs or desks (OR: 3.35; 95% CI: 1.23-9.15), and working with a biosafety cabinet (OR: 2.49; 95% CI: 1.11-5.59). CTS cases had significantly longer work duration (17.9 ± 5.6 years) than CTS non-cases (11.5 ± 7.4 years), with a low OR (1.108). Conclusion: This study demonstrates some personal and ergonomic factors associated with CTS among the laboratory technicians, including female gender, arm/hand exertion, pipetting, repetitive tasks, working with a biosafety cabinet, and an unadjusted workstation. PMID:28855446
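As a hedged illustration of the kind of analysis reported above (not the authors' code), adjusted odds ratios with 95% CIs for a binary CTS outcome can be obtained by exponentiating logistic-regression coefficients. The data frame and column names below are hypothetical.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical data: one row per technician, binary CTS outcome and binary exposures.
rng = np.random.default_rng(42)
df = pd.DataFrame({
    "cts":        rng.binomial(1, 0.1, 300),
    "female":     rng.binomial(1, 0.7, 300),
    "pipetting":  rng.binomial(1, 0.5, 300),
    "repetitive": rng.binomial(1, 0.6, 300),
})

fit = smf.logit("cts ~ female + pipetting + repetitive", data=df).fit(disp=False)
odds_ratios = np.exp(fit.params)   # adjusted odds ratios
conf_int = np.exp(fit.conf_int())  # 95% CIs on the odds-ratio scale
print(pd.concat([odds_ratios, conf_int], axis=1))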
Hogervorst, Eva; Bandelow, Stephan; Hart, John; Henderson, Victor W
2004-09-01
Parallel versions of memory tasks are useful in clinical and research settings to reduce practice effects engendered by multiple administrations. We aimed to investigate the usefulness of three parallel versions of ten-item word list recall tasks administered by telephone. A population-based telephone survey of middle-aged and elderly residents of Bradley County, Arkansas was carried out as part of the Rural Aging and Memory Study (RAMS). Participants in the study were 1845 persons aged 40 to 95 years. Word lists included that used in the telephone interview of cognitive status (TICS) as a criterion standard and two newly developed lists. The mean age of participants was 61.05 (SD 12.44) years; 39.5% were over age 65. 78% of the participants had completed high school, 66% were women and 21% were African-American. There was no difference in demographic characteristics between groups receiving different word list versions, and performances on the three versions were equivalent for both immediate (mean 4.22, SD 1.53) and delayed (mean 2.35, SD 1.75) recall trials. The total memory score (immediate + delayed recall) was negatively associated with older age (beta = -0.41, 95% CI = -0.11 to -0.04), lower education (beta = 0.24, 95% CI = 0.36 to 0.51), male gender (beta = -0.18, 95% CI = -1.39 to -0.90) and African-American race (beta = -0.15, 95% CI = -1.41 to -0.82). The two RAMS word recall lists and the TICS word recall list can be used interchangeably in telephone assessment of memory of middle-aged and elderly persons. This finding is important for future studies where parallel versions of a word-list memory task are needed.
Self-regulated learning processes of medical students during an academic learning task.
Gandomkar, Roghayeh; Mirzazadeh, Azim; Jalili, Mohammad; Yazdani, Kamran; Fata, Ladan; Sandars, John
2016-10-01
This study was designed to identify the self-regulated learning (SRL) processes of medical students during a biomedical science learning task and to examine the associations of the SRL processes with previous performance in biomedical science examinations and subsequent performance on a learning task. A sample of 76 Year 1 medical students were recruited based on their performance in biomedical science examinations and stratified into previous high and low performers. Participants were asked to complete a biomedical science learning task. Participants' SRL processes were assessed before (self-efficacy, goal setting and strategic planning), during (metacognitive monitoring) and after (causal attributions and adaptive inferences) their completion of the task using an SRL microanalytic interview. Descriptive statistics were used to analyse the means and frequencies of SRL processes. Univariate and multiple logistic regression analyses were conducted to examine the associations of SRL processes with previous examination performance and the learning task performance. Most participants (from 88.2% to 43.4%) reported task-specific processes for SRL measures. Students who exhibited higher self-efficacy (odds ratio [OR] 1.44, 95% confidence interval [CI] 1.09-1.90) and reported task-specific processes for metacognitive monitoring (OR 6.61, 95% CI 1.68-25.93) and causal attributions (OR 6.75, 95% CI 2.05-22.25) measures were more likely to be high previous performers. Multiple analysis revealed that similar SRL measures were associated with previous performance. The use of task-specific processes for causal attributions (OR 23.00, 95% CI 4.57-115.76) and adaptive inferences (OR 27.00, 95% CI 3.39-214.95) measures were associated with being a high learning task performer. In multiple analysis, only the causal attributions measure was associated with high learning task performance. Self-efficacy, metacognitive monitoring and causal attributions measures were associated positively with previous performance. Causal attributions and adaptive inferences measures were associated positively with learning task performance. These findings may inform remediation interventions in the early years of medical school training. © 2016 John Wiley & Sons Ltd and The Association for the Study of Medical Education.
Associations between environmental quality and mortality in ...
BACKGROUND: Assessing cumulative effects of the multiple environmental factors influencing mortality remains a challenging task. OBJECTIVES: This study aimed to examine the associations between cumulative environmental quality and all-cause and leading cause-specific (heart disease, cancer, and stroke) mortality rates. METHODS: We used the overall Environmental Quality Index (EQI) and its five domain indices (air, water, land, built, and sociodemographic) to represent environmental exposure. Associations between the EQI and mortality rates (CDC WONDER) for counties in the contiguous United States (n = 3,109) were investigated using multiple linear regression models and random intercept and random slope hierarchical models. Urbanicity, climate, and a combination of the two were used to explore the spatial patterns in the associations. RESULTS: We found that a 1 standard deviation increase in the overall EQI (worse environment) was associated with a mean 3.22% (95% CI: 2.80%, 3.64%) increase in all-cause mortality, a 0.54% (95% CI: -0.17%, 1.25%) increase in heart disease mortality, a 2.71% (95% CI: 2.21%, 3.22%) increase in cancer mortality, and a 2.25% (95% CI: 1.11%, 3.39%) increase in stroke mortality. Among the environmental domains, the associations ranged from -1.27% (95% CI: -1.70%, -0.84%) to 3.37% (95% CI: 2.90%, 3.84%) for all-cause mortality, -2.62% (95% CI: -3.52%, -1.73%) to 4.50% (95% CI: 3.73%, 5.27%) for heart disease mortality, -0.88% (95% CI: -2.12%, 0.36%)
Term Elective Induction of Labor and Perinatal Outcomes in Obese Women: Retrospective Cohort Study
Lee, Vanessa R.; Darney, Blair G.; Snowden, Jonathan M.; Main, Elliott K.; Gilbert, William; Chung, Judith; Caughey, Aaron B.
2015-01-01
Objective To compare perinatal outcomes between elective induction of labor (eIOL) and expectant management in obese women. Design Retrospective cohort study. Setting Deliveries in California in 2007. Population Term, singleton, vertex, nonanomalous deliveries among obese women (n=74,725). Methods Women who underwent eIOL at 37 weeks were compared with women who were expectantly managed at that gestational age. Similar comparisons were made at 38, 39, and 40 weeks. Results were stratified by parity. Chi-square tests and multivariable logistic regression were used for statistical comparison. Main Outcome Measures Method of delivery, severe perineal lacerations, postpartum hemorrhage, chorioamnionitis, macrosomia, shoulder dystocia, brachial plexus injury, respiratory distress syndrome. Results The odds of cesarean delivery were lower among nulliparous women with eIOL at 37 weeks (odds ratio [OR] 0.55, 95% confidence interval [CI] 0.34–0.90) and 39 weeks (OR 0.77, 95% CI 0.63–0.95) compared to expectant management. Among multiparous women with a prior vaginal delivery, eIOL at 37 (OR 0.39, 95% CI 0.24–0.64), 38 (OR 0.65, 95% CI 0.51–0.82), and 39 weeks (OR 0.67, 95% CI 0.56–0.81) was associated with lower odds of cesarean. Additionally, eIOL at 38, 39, and 40 weeks was associated with lower odds of macrosomia. There were no differences in the odds of operative vaginal delivery, lacerations, brachial plexus injury, or respiratory distress syndrome. Conclusions In obese women, term eIOL may decrease the risk of cesarean delivery, particularly in multiparas, without increasing the risks of other adverse outcomes when compared with expectant management. Tweetable Abstract Elective induction of labor in obese women does not increase risk of cesarean or other perinatal morbidities. PMID:26840780
Galdos, Mariana; Simons, Claudia J P; Wichers, Marieke; Fernandez-Rivas, Aranzazu; Martinez-Azumendi, Oscar; Lataster, Tineke; Amer, Guillermo; Myin-Germeys, Inez; Gonzalez-Torres, Miguel Angel; van Os, Jim
2011-10-01
Neurocognitive impairments observed in psychotic disorder may impact on emotion recognition and theory of mind, resulting in altered understanding of the social world. Early intervention efforts would be served by further elucidation of this mechanism. Patients with a psychotic disorder (n=30) and a reference control group (n=310) were asked to offer emotional appraisals of images of social situations (EASS task). The degree to which case-control differences in appraisals were mediated by neurocognitive alterations was analyzed. The EASS task displayed convergent and discriminant validity. Compared to controls, patients displayed blunted emotional appraisal of social situations (B=0.52, 95% CI: 0.30, 0.74, P<0.001; adjusted for age, sex and number of years of education: B=0.44, 95% CI: 0.20, 0.68, P<0.001), a difference of 0.88 (adjusted: 0.75) standard deviation. After adjustment for neurocognitive variables, the case-control difference was reduced by nearly 75% and was non-significant (B=0.12, 95% CI: -0.14, 0.39, P=0.37). Neurocognitive impairments observed in patients with psychotic disorder may underlie misrepresentation of the social world, mediated by altered emotion recognition. A task assessing the social impact of cognitive alterations in clinical practice may be useful in detecting key alterations very early in the course of psychotic illness.
Multi-Agent Task Negotiation Among UAVs to Defend Against Swarm Attacks
2012-03-01
Excerpt from the report: auction and market-based methods of task coordination are based on economic models [39] and attempt to deal with agents operating in noisy, dynamic environments. Referenced works include [34] M. Alighanbari, "Robust and decentralized task assignment algorithms for UAVs," Ph.D. dissertation, Massachusetts Institute of Technology, August 2006. The table of contents lists sections on decentralized algorithms, including implicit coordination and a market-based approach.
A Submodularity Framework for Data Subset Selection
2013-09-01
Excerpt from the report's list of tables: language modeling corpora for the Arabic-to-English NIST task; baseline BLEU (%) and PER scores on the Transtac task (Arabic-to-English and English-to-Arabic); and comparisons of BLEU (%) and PER scores on the Transtac task in both directions.
Biomechanical risk factors for carpal tunnel syndrome: a pooled study of 2474 workers
Harris-Adamson, Carisa; Eisen, Ellen A; Kapellusch, Jay; Garg, Arun; Hegmann, Kurt T; Thiese, Matthew S; Dale, Ann Marie; Evanoff, Bradley; Burt, Susan; Bao, Stephen; Silverstein, Barbara; Merlino, Linda; Gerr, Fred; Rempel, David
2015-01-01
Background Between 2001 and 2010, five research groups conducted coordinated prospective studies of carpal tunnel syndrome (CTS) incidence among US workers from various industries and collected detailed subject-level exposure information with follow-up of symptoms, electrophysiological measures and job changes. Objective This analysis examined the associations between workplace biomechanical factors and incidence of dominant-hand CTS, adjusting for personal risk factors. Methods 2474 participants, without CTS or possible polyneuropathy at enrolment, were followed up to 6.5 years (5102 person-years). Individual workplace exposure measures of the dominant hand were collected for each task and included force, repetition, duty cycle and posture. Task exposures were combined across the workweek using time-weighted averaging to estimate job-level exposures. CTS case-criteria were based on symptoms and results of electrophysiological testing. HRs were estimated using Cox proportional hazard models. Results After adjustment for covariates, analyst (HR=2.17; 95% CI 1.38 to 3.43) and worker (HR=2.08; 95% CI 1.31 to 3.39) estimated peak hand force, forceful repetition rate (HR=1.84; 95% CI 1.19 to 2.86) and per cent time spent (eg, duty cycle) in forceful hand exertions (HR=2.05; 95% CI 1.34 to 3.15) were associated with increased risk of incident CTS. Associations were not observed between total hand repetition rate, per cent duration of all hand exertions, or wrist posture and incident CTS. Conclusions In this prospective multicentre study of production and service workers, measures of exposure to forceful hand exertion were associated with incident CTS after controlling for important covariates. These findings may influence the design of workplace safety programmes for preventing work-related CTS. PMID:25324489
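A minimal sketch of the time-weighted-averaging step described above, in which task-level exposures are combined over the workweek into a job-level exposure; the task hours and force values are hypothetical, and the subsequent Cox modelling is only indicated in a comment.

# Each task: (hours per week, hand-force exposure on some measurement scale)
tasks = [(10, 3.0), (25, 1.5), (5, 4.5)]   # hypothetical task-level exposures

total_hours = sum(h for h, _ in tasks)
twa_force = sum(h * x for h, x in tasks) / total_hours   # time-weighted average
print(round(twa_force, 2))   # job-level exposure estimate, here 2.25

# The study then relates job-level exposures such as this to incident CTS using
# Cox proportional hazards models (HRs), adjusting for personal covariates.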
Pavone, Venere Leda Mara; Lisi, Catiuscia; Cinti, Danilo; Cervino, Daniela; Costantini, Adele Seniori; Forastiere, Francesco
2007-01-01
To study determinants of occupational injuries in tunnel construction using data from the surveillance system implemented to monitor accidents during the construction of the high-speed train tracks in the Italian regions Emilia-Romagna and Tuscany. Retrospective cohort study. 16 sites for the construction of 14 tunnels of the high-speed railway tract Bologna-Firenze, in Italy. 1,602 workers (of 3,000 employed in the underground tunnelling), aged 18-67 years, operating during excavation with the traditional method in 1999-2002. A total of 549 injuries occurred among 385 workers. The number of worked hours was used as time at risk. Incidence rate ratios (IRR) and 95% confidence intervals for all injuries, serious injuries and first injuries were estimated in separate multiple regression analyses (Poisson). Residence, task and working phase were taken into consideration. An increased risk was found for younger workers, for carpenters (IRR "all events" = 2.33; 95% CI 1.85-2.94; IRR "first events" = 2.12; 95% CI 1.62-2.77) and miners (IRR "all events" = 1.76; 95% CI 1.39-2.24; IRR "first events" = 1.71; 95% CI 1.30-2.24) vs. machinery operators. Construction of the inverted arch had an incidence rate ratio about three times that of digging out (IRR "all events" = 2.79; 95% CI 2.27-3.43; IRR "first events" = 2.98; 95% CI 2.33-3.81). The probability of "serious" injuries (>30 days) was higher for miners (IRR = 2.45; 95% CI 1.65-3.64) and for carpenters (IRR = 2.31; 95% CI 1.53-3.49). This study identified some determinants (age, task and work phase) of injuries in tunnelling about which little had been published previously. These results are useful for addressing preventive measures and control and prevention activities, and point to the need to explore the effect of experience and to study, through a case-crossover design, transient working and individual risk factors for traumatic injury within these working sites.
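A hedged sketch of how incidence rate ratios of this kind can be estimated with Poisson regression, using worked hours as the time at risk via a log offset; the data frame, counts, and column names below are hypothetical, not the study's data.

import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical worker-level data: injury counts, worked hours, and main task.
df = pd.DataFrame({
    "injuries": [0, 2, 1, 0, 3, 1],
    "hours":    [1800, 2100, 1500, 1900, 2200, 1700],
    "task":     ["operator", "miner", "carpenter", "operator", "carpenter", "miner"],
})

# Poisson regression with log(worked hours) as offset; machinery operators as reference.
fit = smf.glm("injuries ~ C(task, Treatment(reference='operator'))",
              data=df,
              family=sm.families.Poisson(),
              offset=np.log(df["hours"])).fit()
print(np.exp(fit.params))      # incidence rate ratios vs. machinery operators
print(np.exp(fit.conf_int()))  # 95% confidence intervals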
Bolton, Kenyon C.; Mace, John L.; Herschorn, Sally D.; James, Ted A.; Vacek, Pamela M.; Weaver, Donald L.; Geller, Berta M.
2014-01-01
Purpose To determine whether the 2009 U.S. Preventive Services Task Force (USPSTF) guidelines for breast cancer mammography screening were followed by changes in screening utilization in the state of Vermont. Materials and Methods This retrospective study was HIPAA compliant and approved by the institutional review board, with waiver of informed consent. Trends in screening mammography utilization during 1997–2011 were examined among approximately 150 000 women aged 40 years and older in the state of Vermont using statewide mammography registry data. Results The percentage of Vermont women aged 40 years and older screened in the past year declined from 45.3% in 2009 to 41.6% in 2011 (an absolute decrease of −3.7 percentage points; 95% confidence interval [CI]: −3.3, −4.1). The largest decline in utilization was among women aged 40–49 years (−4.8 percentage points; 95% CI: −4.1, −5.4), although substantial declines were also observed among women aged 50–74 years (−3.0 percentage points; 95% CI: −2.6, −3.5) and women aged 75 years and older (−3.1 percentage points; 95% CI: −2.3, −4.0). The percentage of women aged 50–74 years screened within the past 2 years declined by −3.4 percentage points (95% CI: −3.0, −3.9) from 65.4% in 2009 to 61.9% in 2011. Conclusion After years of increasing screening mammography utilization in Vermont, there was a decline in screening, which coincided with the release of the 2009 USPSTF recommendations. The age-specific patterns in utilization were generally consistent with the USPSTF recommendations, although there was also evidence that the percentage of women aged 50–74 years screened in the past 2 years declined since 2009. © RSNA, 2013 PMID:24072778
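For orientation (not the registry's actual computation), a percentage-point change with a 95% CI can be approximated from two independent proportions using a normal-approximation standard error. The denominators below are hypothetical round numbers consistent with "approximately 150 000 women"; the result is close to the interval reported above.

from math import sqrt

def diff_in_proportions(p1, n1, p2, n2, z=1.96):
    """Difference p2 - p1 with a Wald 95% CI, returned in percentage points."""
    d = p2 - p1
    se = sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    return 100 * d, 100 * (d - z * se), 100 * (d + z * se)

# Hypothetical: 45.3% of 150,000 screened in 2009 vs. 41.6% of 150,000 in 2011.
print(diff_in_proportions(0.453, 150_000, 0.416, 150_000))
# roughly -3.7 percentage points (95% CI about -4.0 to -3.4)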
Aymerich, Marta; Guillamón, Imma; Jovell, Albert J
2009-01-01
Objectives: To measure the health-related quality of life (HRQoL) of multiple sclerosis (MS) patients and their caregivers, and to assess which factors can best describe HRQoL. Methods: A cross-sectional multicenter study of nine hospitals enrolled MS patients and their caregivers who attended outpatient clinics consecutively. The instruments used were the SF-36 for patients and the SF-12 and GHQ-12 for caregivers. Classification and regression tree analysis was used to analyze the explanatory factors of HRQoL. Results: A total of 705 patients (mean age 40.4 years, median Expanded Disability Status Scale 2.5, 77.8% with relapsing-remitting MS) and 551 caregivers (mean age 45.4 years) participated in the study. MS patients had significantly lower HRQoL than the general population (physical SF-36: 39.9; 95% confidence interval [CI]: 39.1–40.6; mental SF-36: 44.4; 95% CI: 43.5–45.3). Caregivers also presented lower HRQoL than the general population, especially in its mental domain (mental SF-12: 46.4; 95% CI: 45.5–47.3). Moreover, according to the GHQ-12, 27% of caregivers presented probable psychological distress. Disability and co-morbidity in patients, and co-morbidity and employment status in caregivers, were the most important explanatory factors of their HRQoL. Conclusions: Not only the HRQoL of patients with MS, but also that of their caregivers, is notably affected. Caregivers' HRQoL is close to that of populations with chronic illness, even though the patient sample had mild clinical severity and the caregiving role is a usual task in the study context. PMID:19936174
Midazolam microdose to determine systemic and pre-systemic metabolic CYP3A activity in humans
Hohmann, Nicolas; Kocheise, Franziska; Carls, Alexandra; Burhenne, Jürgen; Haefeli, Walter E; Mikus, Gerd
2015-01-01
Aim We aimed to establish a method to assess systemic and pre-systemic cytochrome P450 (CYP) 3A activity using ineffective microgram doses of midazolam. Methods In an open, one sequence, crossover study, 16 healthy participants received intravenous and oral midazolam at microgram (0.001 mg intravenous and 0.003 mg oral) and regular milligram (1 mg intravenous and 3 mg oral) doses to assess the linearity of plasma and urine pharmacokinetics. Results Dose-normalized AUC and Cmax were 37.1 ng ml−1 h [95% CI 35.5, 40.6] and 39.1 ng ml−1 [95% CI 30.4, 50.2] for the microdose and 39.0 ng ml−1 h [95% CI 36.1, 42.1] and 37.1 ng ml−1 [95% CI 26.9, 51.3] for the milligram dose. CLmet was 253 ml min−1 [95% CI 201, 318] vs. 278 ml min−1 [95% CI 248, 311] for intravenous doses and 1880 ml min−1 [95% CI 1590, 2230] vs. 2050 ml min−1 [95% CI 1720, 2450] for oral doses. Oral bioavailability of a midazolam microdose was 23.4% [95% CI 20.0, 27.3] vs. 20.9% [95% CI 17.1, 25.5] after the regular dose. Hepatic and gut extraction ratios for microgram doses were 0.44 [95% CI 0.39, 0.49] and 0.53 [95% CI 0.45, 0.63] and compared well with those for milligram doses (0.43 [95% CI 0.37, 0.49] and 0.61 [95% CI 0.53, 0.70]). Conclusion The pharmacokinetics of an intravenous midazolam microdose is linear to the applied regular doses and can be used to assess safely systemic CYP3A activity and, in combination with oral microdoses, pre-systemic CYP3A activity. PMID:25588320
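A hedged numeric sketch of the quantities reported here: oral bioavailability from dose-normalized AUCs, and hepatic and gut extraction ratios derived from intravenous clearance relative to hepatic blood flow. The 1500 ml/min blood-flow figure, the assumption of complete absorption, and the example AUC/CL values are illustrative assumptions, not the study's individual data.

# Oral bioavailability from dose-normalized AUCs: F = (AUC_po / D_po) / (AUC_iv / D_iv)
auc_iv_norm = 39.0   # ng·ml^-1·h per mg after IV dosing (illustrative value)
auc_po_norm = 9.1    # ng·ml^-1·h per mg after oral dosing (hypothetical value)
F = auc_po_norm / auc_iv_norm
print(round(F, 3))   # about 0.23, i.e. roughly the ~23% bioavailability reported

# Hepatic extraction ratio from systemic clearance and an assumed hepatic blood flow.
cl_iv = 660.0        # ml/min, hypothetical total clearance after IV dosing
q_h = 1500.0         # ml/min, assumed hepatic blood flow
e_h = cl_iv / q_h                 # hepatic extraction ratio
f_gut = F / (1.0 - e_h)           # gut availability, assuming complete absorption (F_abs ~ 1)
print(round(e_h, 2), round(1 - f_gut, 2))   # hepatic and gut extraction ratios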
Jeannot, Emilien; Mahler, Per; Elia, Nadia; Cerruti, Bernard; Chastonnay, P.
2015-01-01
Background: Obesity among children and adolescents is a growing public health problem. The purpose of this study is to assess the prevalence and the socioeconomic and demographic determinants of overweight and obesity in schoolchildren from Geneva. Methods: A cross-sectional study was undertaken in the public schools of the Geneva canton in Switzerland. Data on a total of 8544 public school children were collected and analyzed: 2577 were in second grade, 2641 in fifth grade and 3326 in eighth grade. To identify overweight and obesity we used the definition issued by the International Obesity Task Force. Child characteristics included gender, age, socioeconomic status (SES) of father and mother, and school grade. A multivariate logistic regression model was used to examine potential predictors of overweight/obesity. Results: The prevalence of overweight or obese children was 14.4% in second grade, 17.3% in fifth grade and 18.6% in eighth grade. Multivariate logistic regression analyses reveal that children who have a low socioeconomic status or certain citizenships are more likely to be overweight or obese. Children of Kosovar origin have a higher risk of overweight or obesity in second grade (adjusted odds ratio [OR] = 2.19; 95% confidence interval [CI]: 1.20–4.00), fifth grade (adjusted OR = 2.36; 95% CI: 1.27–4.39) and eighth grade (adjusted OR = 2.15; 95% CI: 1.27–4.39). The association between SES and overweight was strongest with regard to the father's SES in fifth grade (adjusted OR = 4.21; 95% CI: 2.83–6.25). Conclusions: Overweight and obesity are associated with socioeconomic and sociodemographic factors. The analyses reveal that children with a low socioeconomic status and/or from certain countries are more likely to be overweight or obese than Swiss children. There is an urgent need for action to prevent a further increase in overweight and obesity among children. PMID:26015862
El-Helaly, Mohamed; Balkhy, Hanan H; Vallenius, Laura
2017-11-25
Work-related carpal tunnel syndrome (CTS) has been reported in different occupations, including laboratory technicians, so this study was carried out to determine the prevalence and the associated personal and ergonomic factors for CTS among laboratory technicians. A cross-sectional study was conducted among 279 laboratory technicians at King Fahd Hospital, Saudi Arabia, who filled in a self-administered questionnaire, including questions regarding their demographic criteria, occupational history, job tasks, workplace tools, ergonomic factors at work, and symptoms suggestive of CTS. Physical examinations and electrodiagnostic studies were carried out for those who had symptoms suggestive of CTS to confirm the diagnosis. Univariate and multivariate analyses were performed for both personal and physical factors in association with confirmed CTS among laboratory technicians. The prevalence of CTS among the laboratory technicians was 9.7% (27/279). The following were the statistically significant risk factors for CTS among them: gender (all cases of CTS were female, P=0.00), arm/hand exertion (OR: 7.96; 95% CI: 1.84-34.33), pipetting (OR: 7.27; 95% CI: 3.15-16.78), repetitive tasks (OR: 4.60; 95% CI: 1.39-15.70), using unadjustable chairs or desks (OR: 3.35; 95% CI: 1.23-9.15), and working with a biosafety cabinet (OR: 2.49; 95% CI: 1.11-5.59). CTS cases had significantly longer work duration (17.9 ± 5.6 years) than CTS non-cases (11.5 ± 7.4 years), with a low OR (1.108). This study demonstrates some personal and ergonomic factors associated with CTS among the laboratory technicians, including female gender, arm/hand exertion, pipetting, repetitive tasks, working with a biosafety cabinet, and an unadjusted workstation.
Term elective induction of labour and perinatal outcomes in obese women: retrospective cohort study.
Lee, V R; Darney, B G; Snowden, J M; Main, E K; Gilbert, W; Chung, J; Caughey, A B
2016-01-01
To compare perinatal outcomes between elective induction of labour (eIOL) and expectant management in obese women. Retrospective cohort study. Deliveries in California in 2007. Term, singleton, vertex, nonanomalous deliveries among obese women (n = 74 725). Women who underwent eIOL at 37 weeks were compared with women who were expectantly managed at that gestational age. Similar comparisons were made at 38, 39, and 40 weeks. Results were stratified by parity. Chi-square tests and multivariable logistic regression were used for statistical comparison. Method of delivery, severe perineal lacerations, postpartum haemorrhage, chorioamnionitis, macrosomia, shoulder dystocia, brachial plexus injury, respiratory distress syndrome. The odds of caesarean delivery were lower among nulliparous women with eIOL at 37 weeks [odds ratio (OR) 0.55, 95% confidence interval (CI) 0.34-0.90] and 39 weeks (OR 0.77, 95% CI 0.63-0.95) compared to expectant management. Among multiparous women with a prior vaginal delivery, eIOL at 37 (OR 0.39, 95% CI 0.24-0.64), 38 (OR 0.65, 95% CI 0.51-0.82), and 39 weeks (OR 0.67, 95% CI 0.56-0.81) was associated with lower odds of caesarean. Additionally, eIOL at 38, 39, and 40 weeks was associated with lower odds of macrosomia. There were no differences in the odds of operative vaginal delivery, lacerations, brachial plexus injury or respiratory distress syndrome. In obese women, term eIOL may decrease the risk of caesarean delivery, particularly in multiparas, without increasing the risks of other adverse outcomes when compared with expectant management. © 2015 Royal College of Obstetricians and Gynaecologists.
Physical performance limitations among adult survivors of childhood brain tumors
Ness, Kirsten K.; Morris, E. Brannon; Nolan, Vikki G.; Howell, Carrie R.; Gilchrist, Laura S.; Stovall, Marilyn; Cox, Cheryl L.; Klosky, James L.; Gajjar, Amar; Neglia, Joseph P.
2013-01-01
Background Young adult survivors of childhood brain tumors (BT) may have late effects that compromise physical performance and everyday task participation. Objective To evaluate muscle strength, fitness, physical performance, and task participation among adult survivors of childhood BT. Design/Method In-home evaluations and interviews were conducted for 156 participants (54% male). Results on measures of muscle strength, fitness, physical performance, and participation were compared between survivors and population-group members with chi-squared statistics and two-sample t-tests. Associations between late effects and physical performance, and physical performance and participation, were evaluated in regression models. Results BT survivors were a median age of 22 (18–58) years and a median of 14.7 (6.5–45.9) years from diagnosis. Survivors had lower estimates of grip strength (Female: 24.7±9.2 vs. 31.5±5.8, Male: 39.0±12.2 vs. 53.0±10.1 kilograms), knee extension strength (Female: 246.6±95.5 vs. 331.5±5.8, Male: 304.7±116.4 vs. 466.6±92.1 Newtons) and peak oxygen uptake (Female: 25.1±8.8 vs. 31.3±5.1, Male: 24.6±9.5 vs. 33.2±3.4 milliliters/kilogram/minute) than population-group members. Physical performance was lower among survivors and associated with not living independently (OR=5.0, 95% CI=2.0–12.2) and not attending college (OR=2.3, 95% CI 1.2–4.4). Conclusion Muscle strength and fitness values among BT survivors are similar to those among persons aged 60+ years, and are associated with physical performance limitations. Physical performance limitations are associated with poor outcomes in home and school environments. These data indicate an opportunity for interventions targeted at improving long-term physical function in this survivor population. PMID:20564409
Lam, Freddy Mh; Huang, Mei-Zhen; Liao, Lin-Rong; Chung, Raymond Ck; Kwok, Timothy Cy; Pang, Marco Yc
2018-01-01
Does physical exercise training improve physical function and quality of life in people with cognitive impairment and dementia? Which training protocols improve physical function and quality of life? How do cognitive impairment and other patient characteristics influence the outcomes of exercise training? Systematic review with meta-analysis of randomised trials. People with mild cognitive impairment or dementia as the primary diagnosis. Physical exercise. Strength, flexibility, gait, balance, mobility, walking endurance, dual-task ability, activities of daily living, quality of life, and falls. Forty-three clinical trials (n=3988) were included. According to the Grades of Recommendation, Assessment, Development and Evaluation (GRADE) system, the meta-analyses revealed strong evidence in support of using supervised exercise training to improve the results of the 30-second sit-to-stand test (MD 2.1 repetitions, 95% CI 0.3 to 3.9), step length (MD 5 cm, 95% CI 2 to 8), Berg Balance Scale (MD 3.6 points, 95% CI 0.3 to 7.0), functional reach (3.9 cm, 95% CI 2.2 to 5.5), Timed Up and Go test (-1 second, 95% CI -2 to 0), walking speed (0.13 m/s, 95% CI 0.03 to 0.24), and 6-minute walk test (50 m, 95% CI 18 to 81) in individuals with mild cognitive impairment or dementia. Weak evidence supported the use of exercise in improving flexibility and Barthel Index performance. Weak evidence suggested that non-specific exercise did not improve dual-tasking ability or activity level. Strong evidence indicated that exercise did not improve quality of life in this population. The effect of exercise on falls remained inconclusive. Poorer physical function was a determinant of better response to exercise training, but cognitive performance did not have an impact. People with various levels of cognitive impairment can benefit from supervised multi-modal exercise for about 60 minutes a day, 2 to 3 days a week to improve physical function. [Lam FMH, Huang MZ, Liao LR, Chung RCK, Kwok TCY, Pang MYC (2018) Physical exercise improves strength, balance, mobility, and endurance in people with cognitive impairment and dementia: a systematic review. Journal of Physiotherapy 64: 4-15]. Copyright © 2017 Australian Physiotherapy Association. Published by Elsevier B.V. All rights reserved.
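A simplified illustration of how mean differences such as those above are pooled in a meta-analysis: fixed-effect inverse-variance weighting, with each trial's standard error recovered from its 95% CI. The trial values are hypothetical; the review itself graded evidence with GRADE and may have used random-effects models.

from math import sqrt

# Hypothetical trials: (mean difference, lower 95% CI, upper 95% CI) for walking speed (m/s)
trials = [(0.10, 0.02, 0.18), (0.15, 0.05, 0.25), (0.12, -0.02, 0.26)]

weights, weighted_md = [], []
for md, lo, hi in trials:
    se = (hi - lo) / (2 * 1.96)   # standard error recovered from the 95% CI
    w = 1 / se ** 2               # inverse-variance weight
    weights.append(w)
    weighted_md.append(w * md)

pooled = sum(weighted_md) / sum(weights)
pooled_se = sqrt(1 / sum(weights))
print(round(pooled, 3),
      round(pooled - 1.96 * pooled_se, 3),
      round(pooled + 1.96 * pooled_se, 3))   # pooled MD with 95% CI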
Instrumentation development for In Situ 40Ar/39Ar planetary geochronology
Morgan, Leah; Munk, Madicken; Davidheiser-Kroll, Brett; Warner, Nicholas H.; Gupta, Sanjeev; Slaybaugh, Rachel; Harkness, Patrick; Mark, Darren
2017-01-01
The chronology of the Solar System, particularly the timing of formation of extra-terrestrial bodies and their features, is an outstanding problem in planetary science. Although various chronological methods for in situ geochronology have been proposed (e.g., Rb-Sr, K-Ar), and even applied (K-Ar), the reliability, accuracy, and applicability of the 40Ar/39Ar method makes it by far the most desirable chronometer for dating extra-terrestrial bodies. The method however relies on the neutron irradiation of samples, and thus a neutron source. Herein, we discuss the challenges and feasibility of deploying a passive neutron source to planetary surfaces for the in situ application of the 40Ar/39Ar chronometer. Requirements in generating and shielding neutrons, as well as analysing samples are described, along with an exploration of limitations such as mass, power and cost. Two potential solutions for the in situ extra-terrestrial deployment of the 40Ar/39Ar method are presented. Although this represents a challenging task, developing the technology to apply the 40Ar/39Ar method on planetary surfaces would represent a major advance towards constraining the timescale of solar system formation and evolution.
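For context, the age equation at the heart of the 40Ar/39Ar method relates the measured radiogenic 40Ar*/39ArK ratio to age through the irradiation parameter J, which is determined from co-irradiated standards of known age. The sketch below uses the conventional total 40K decay constant (about 5.543e-10 per year) and illustrative values for J and R; it is not drawn from the paper.

from math import log

LAMBDA_TOTAL_40K = 5.543e-10   # total 40K decay constant, per year (conventional value)

def ar_ar_age(R, J):
    """40Ar/39Ar age in years: t = (1/lambda) * ln(1 + J * R),
    where R = 40Ar*/39Ar_K and J is the neutron-fluence (irradiation) parameter."""
    return log(1.0 + J * R) / LAMBDA_TOTAL_40K

# Illustrative values only: J from a co-irradiated standard, R measured on the unknown.
print(ar_ar_age(R=50.0, J=0.01) / 1e6)   # age in Ma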
Moore, Melanie; Kwitowski, Melissa; Javier, Sarah
2017-06-01
To examine mental health influences on dual contraceptive method use (i.e., the use of a hormonal contraceptive or intrauterine device with a condom barrier) among college women. Data from N=307 sexually active women who completed the 2014 National College Health Assessment at a large mid-Atlantic university were analyzed. Following chi-square tests of associations, multivariate logistic regressions examined the relation between mental health and sociodemographic factors and dual contraceptive method use. Among all women, 27% utilized a dual contraceptive method during last vaginal intercourse. A prior depressive disorder diagnosis was significantly associated with lower odds of dual method use compared to use of other contraceptive methods combined (aOR, 0.39; 95% CI: 0.19-0.79), use of no method (aOR, 0.12; 95% CI: 0.03-0.55), or use of hormonal contraceptives only (aOR, 0.39; 95% CI: 0.18-0.85). Mental health is an important contributor to contraceptive method use. Health care providers should consider the role of mental health when counseling women about contraceptive options during routine gynecological visits. Results suggest that mental health screenings may be helpful in identifying those most at risk for not using dual contraceptive methods. Copyright © 2017 Elsevier B.V. All rights reserved.
Boullé, Charlotte; Kouanfack, Charles; Laborde-Balen, Gabrièle; Carrieri, Maria Patrizia; Dontsop, Marlise; Boyer, Sylvie; Aghokeng, Avelin Fobang; Spire, Bruno; Koulla-Shiro, Sinata; Delaporte, Eric; Laurent, Christian
2013-04-15
Task shifting to nurses for antiretroviral therapy (ART) is promoted by the World Health Organization to compensate for the severe shortage of physicians in Africa. We assessed the effectiveness of task shifting from physicians to nurses in rural district hospitals in Cameroon. We performed a cohort study using data from the Stratall trial, designed to assess monitoring strategies in 2006-2010. ART-naive patients were followed up for 24 months after treatment initiation. Clinical visits were performed by nurses or physicians. We assessed the associations between the consultant ratio (ie, the ratio of the number of nurse-led visits to the number of physician-led visits) and HIV virological success, CD4 recovery, mortality, and disease progression to death or to the World Health Organization clinical stage 4 in multivariate analyses. Of the 4141 clinical visits performed in 459 patients (70.6% female, median age 37 years), a quarter was task shifted to nurses. The consultant ratio was not significantly associated with virological success [odds ratio 1.00, 95% confidence interval (CI): 0.59 to 1.72, P = 0.990], CD4 recovery (coefficient -3.6, 95% CI: -35.6; 28.5, P = 0.827), mortality (time ratio 1.39, 95% CI: 0.27 to 7.06, P = 0.693), or disease progression (time ratio 1.60, 95% CI: 0.35 to 7.37, P = 0.543). This study brings important evidence about the comparability of ART-related outcomes between HIV models of care based on physicians or nurses in resource-limited settings. Investing in nursing resources for the management of noncomplex patients should help reduce costs and patient waiting lists while freeing up physician time for the management of complex cases, for mentoring and supervision activities, and for other health interventions.
Youth doing dangerous tasks: Supervision matters.
Zierold, Kristina M
2017-09-01
Supervisors are partially responsible for ensuring that teens are safe at work. The purpose of this study was to explore whether supervision is related to teens' willingness to do a dangerous task at work. A mixed-methods study consisting of focus groups and a cross-sectional survey was conducted with teens from two public high schools. If asked by a supervisor, 21% of working teens would do a dangerous task. After controlling for gender and age, teens whose supervisor did not establish weekly goals (AOR = 3.54, 95%CI = 1.55-8.08), teens who perceived their supervisors as not approachable (AOR = 2.35, 95%CI = 1.34-4.13), and teens who were not comfortable talking about safety issues (AOR = 1.97, 95%CI = 1.08-3.61) were more likely to do a dangerous task if asked by their supervisors. This study indicates that how teens perceive their supervisor may be associated with whether teens do a dangerous task when asked by their supervisor. © 2017 Wiley Periodicals, Inc.
Stevens, Allen D.; Hernandez, Caleb; Jones, Seth; Moreira, Maria E.; Blumen, Jason R.; Hopkins, Emily; Sande, Margaret; Bakes, Katherine; Haukoos, Jason S.
2016-01-01
Background Medication dosing errors remain commonplace and may result in potentially life-threatening outcomes, particularly for pediatric patients where dosing often requires weight-based calculations. Novel medication delivery systems that may reduce dosing errors resonate with national healthcare priorities. Our goal was to evaluate novel, prefilled medication syringes labeled with color-coded volumes corresponding to the weight-based dosing of the Broselow Tape, compared to conventional medication administration, in simulated prehospital pediatric resuscitation scenarios. Methods We performed a prospective, block-randomized, cross-over study, where 10 full-time paramedics each managed two simulated pediatric arrests in situ using either prefilled, color-coded-syringes (intervention) or their own medication kits stocked with conventional ampoules (control). Each paramedic was paired with two emergency medical technicians to provide ventilations and compressions as directed. The ambulance patient compartment and the intravenous medication port were video recorded. Data were extracted from video review by blinded, independent reviewers. Results Median time to delivery of all doses for the intervention and control groups was 34 (95% CI: 28–39) seconds and 42 (95% CI: 36–51) seconds, respectively (difference = 9 [95% CI: 4–14] seconds). Using the conventional method, 62 doses were administered with 24 (39%) critical dosing errors; using the prefilled, color-coded syringe method, 59 doses were administered with 0 (0%) critical dosing errors (difference = 39%, 95% CI: 13–61%). Conclusions A novel color-coded, prefilled syringe decreased time to medication administration and significantly reduced critical dosing errors by paramedics during simulated prehospital pediatric resuscitations. PMID:26247145
Sproston, E L; Carrillo, C D; Boulter-Bitzer, J
2014-12-01
Harmonisation of methods between Canadian government agencies is essential to accurately assess and compare the prevalence and concentrations of Campylobacter present on retail poultry intended for human consumption. The standard qualitative procedure used by Health Canada differs from that used by the USDA for both quantitative and qualitative methods. A comparison of three methods was performed on raw poultry samples obtained from an abattoir to determine if one method is superior to the others in isolating Campylobacter from chicken carcass rinses. The average percentage of positive samples was 34.72% (95% CI, 29.2-40.2), 39.24% (95% CI, 33.6-44.9) and 39.93% (95% CI, 34.3-45.6) for the US direct plating method, the US enrichment method and the Health Canada enrichment method, respectively. Overall there were significant differences when comparing either of the enrichment methods to the direct plating method using McNemar's chi-squared test. On comparison of weekly data (Fisher's exact test), direct plating was only inferior to the enrichment methods on a single occasion. Direct plating is important for enumeration and establishing the concentration of Campylobacter present on raw poultry. However, enrichment methods are also vital to identify positive samples where concentrations are below the detection limit for direct plating. Crown Copyright © 2014. Published by Elsevier Ltd. All rights reserved.
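A hedged sketch of the paired comparison implied by McNemar's test here: each carcass rinse is tested by two methods, and the test uses only the discordant pairs. The counts in the 2x2 table below are hypothetical, not the study's data.

from statsmodels.stats.contingency_tables import mcnemar

# Rows: direct plating (+/-); columns: enrichment (+/-); hypothetical paired counts.
table = [[95, 5],     # both positive / plating-only positive
         [20, 168]]   # enrichment-only positive / both negative

result = mcnemar(table, exact=True)   # exact binomial test on the discordant pairs (5 vs 20)
print(result.statistic, result.pvalue)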
Krage, Ralf; Zwaan, Laura; Tjon Soei Len, Lian; Kolenbrander, Mark W; van Groeningen, Dick; Loer, Stephan A; Wagner, Cordula; Schober, Patrick
2017-01-01
Background Non-technical skills, such as task management, leadership, situational awareness, communication and decision-making refer to cognitive, behavioural and social skills that contribute to safe and efficient team performance. The importance of these skills during cardiopulmonary resuscitation (CPR) is increasingly emphasised. Nonetheless, the relationship between non-technical skills and technical performance is poorly understood. We hypothesise that non-technical skills become increasingly important under stressful conditions when individuals are distracted from their tasks, and investigated the relationship between non-technical and technical skills under control conditions and when external stressors are present. Methods In this simulator-based randomised cross-over study, 30 anaesthesiologists and anaesthesia residents from the VU University Medical Center, Amsterdam, the Netherlands, participated in two different CPR scenarios in random order. In one scenario, external stressors (radio noise and a distractive scripted family member) were added, while the other scenario without stressors served as control condition. Non-technical performance of the team leader and technical performance of the team were measured using the ‘Anaesthetists’ Non-technical Skill’ score and a recently developed technical skills score. Analysis of variance and Pearson correlation coefficients were used for statistical analyses. Results Non-technical performance declined when external stressors were present (adjusted mean difference 3.9 points, 95% CI 2.4 to 5.5 points). A significant correlation between non-technical and technical performance scores was observed when external stressors were present (r=0.67, 95% CI 0.40 to 0.83, p<0.001), while no evidence for such a relationship was observed under control conditions (r=0.15, 95% CI −0.22 to 0.49, p=0.42). This was equally true for all individual domains of the non-technical performance score (task management, team working, situation awareness, decision-making). Conclusions During CPR with external stressors, the team’s technical performance is related to the non-technical skills of the team leader. This may have important implications for training of CPR teams. PMID:28844039
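As a point of orientation (not necessarily the authors' computation), a 95% CI for a Pearson correlation can be obtained with the Fisher z-transformation; with n = 30 and r = 0.67 this reproduces an interval close to the reported 0.40 to 0.83.

from math import atanh, tanh, sqrt

def pearson_ci(r, n, z=1.96):
    """95% CI for a Pearson correlation via the Fisher z-transformation."""
    fz = atanh(r)
    se = 1 / sqrt(n - 3)
    return tanh(fz - z * se), tanh(fz + z * se)

print(pearson_ci(0.67, 30))   # approximately (0.41, 0.83), in line with the reported interval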
10 CFR 95.39 - External transmission of documents and material.
Code of Federal Regulations, 2013 CFR
2013-01-01
... Procedures Plan for the protection of classified information. (e) Security of classified information in... Section 95.39 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) FACILITY SECURITY CLEARANCE AND SAFEGUARDING OF NATIONAL SECURITY INFORMATION AND RESTRICTED DATA Control of Information § 95.39 External...
10 CFR 95.39 - External transmission of documents and material.
Code of Federal Regulations, 2011 CFR
2011-01-01
... Procedures Plan for the protection of classified information. (e) Security of classified information in... Section 95.39 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) FACILITY SECURITY CLEARANCE AND SAFEGUARDING OF NATIONAL SECURITY INFORMATION AND RESTRICTED DATA Control of Information § 95.39 External...
10 CFR 95.39 - External transmission of documents and material.
Code of Federal Regulations, 2014 CFR
2014-01-01
... Procedures Plan for the protection of classified information. (e) Security of classified information in... Section 95.39 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) FACILITY SECURITY CLEARANCE AND SAFEGUARDING OF NATIONAL SECURITY INFORMATION AND RESTRICTED DATA Control of Information § 95.39 External...
10 CFR 95.39 - External transmission of documents and material.
Code of Federal Regulations, 2012 CFR
2012-01-01
... Procedures Plan for the protection of classified information. (e) Security of classified information in... Section 95.39 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) FACILITY SECURITY CLEARANCE AND SAFEGUARDING OF NATIONAL SECURITY INFORMATION AND RESTRICTED DATA Control of Information § 95.39 External...
10 CFR 95.39 - External transmission of documents and material.
Code of Federal Regulations, 2010 CFR
2010-01-01
... Procedures Plan for the protection of classified information. (e) Security of classified information in... Section 95.39 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) FACILITY SECURITY CLEARANCE AND SAFEGUARDING OF NATIONAL SECURITY INFORMATION AND RESTRICTED DATA Control of Information § 95.39 External...
Exposure to metal-working fluids in the automobile industry and the risk of male germ cell tumours.
Behrens, Thomas; Pohlabeln, Hermann; Mester, Birte; Langner, Ingo; Schmeisser, Nils; Ahrens, Wolfgang
2012-03-01
In a previous analysis of a case-control study of testicular cancer nested in a cohort of automobile workers, we observed an increased risk for testicular cancer among workers who had ever been involved in occupational metal-cutting tasks. We investigated whether this risk increase was due to exposure to metal-working fluids (MWF). Occupational exposure to MWF was assessed in detail using a job-specific questionnaire for metal-cutting work. We calculated ORs and associated 95% CIs individually matched for age (±2 years) and adjusted for a history of cryptorchidism by conditional logistic regression. The prevalence of exposure to MWF was 39.8% among cases and 40.1% among controls. For total germ cell tumours and seminomas we did not observe risk increases for metal-cutting tasks or occupational exposure to MWF (OR 0.95; 95% CI 0.69 to 1.32 and OR 0.88; 95% CI 0.58 to 1.35, respectively). However, dermal exposure to oil-based MWF was associated with an increased risk for non-seminomatous testicular cancer. Dermal exposure to oil-based MWF for more than 5000 h showed particularly high risk estimates (OR 4.72; 95% CI 1.48 to 15.09). Long-term dermal exposure to oil-based MWF was a risk factor for the development of non-seminomatous testicular germ cell cancer. Possible measures to reduce exposure include the introduction of engineering control measures such as venting or enclosing of machines, and enforcing the use of personal protective equipment during metal cutting.
Willmott, C; Ponsford, J
2009-05-01
Most previous studies evaluating the use of methylphenidate following traumatic brain injury (TBI) have been conducted many years post-injury. This study evaluated the efficacy of methylphenidate in facilitating cognitive function in the inpatient rehabilitation phase. 40 participants with moderate-severe TBI (mean 68 days post-injury) were recruited into a randomised, crossover, double blind, placebo controlled trial. Methylphenidate was administered at a dose of 0.3 mg/kg twice daily and lactose in identical capsules served as placebo. Methylphenidate and placebo administration was randomised in a crossover design across six sessions over a 2 week period. Primary efficacy outcomes were neuropsychological tests of attention. No participants were withdrawn because of side effects or adverse events. Methylphenidate significantly increased speed of information processing on the Symbol Digit Modalities Test (95% CI 0.30 to 2.95, Cohen's d = 0.39, p = 0.02), Ruff 2 and 7 Test-Automatic Condition (95% CI 1.38 to 6.12, Cohen's d = 0.51, p = 0.003), Simple Selective Attention Task (95% CI -58.35 to -17.43, Cohen's d = 0.59, p = 0.001) and Dissimilar Compatible (95% CI -70.13 to -15.38, Cohen's d = 0.51, p = 0.003) and Similar Compatible (95% CI -74.82 to -19.06, Cohen's d = 0.55, p = 0.002) conditions of the Four Choice Reaction Time Task. Those with more severe injuries and slower baseline information processing speed demonstrated a greater drug response. Methylphenidate enhances information processing speed in the inpatient rehabilitation phase following TBI. This trial is registered with the Australian New Zealand Clinical Trials Registry (12607000503426).
Midazolam microdose to determine systemic and pre-systemic metabolic CYP3A activity in humans.
Hohmann, Nicolas; Kocheise, Franziska; Carls, Alexandra; Burhenne, Jürgen; Haefeli, Walter E; Mikus, Gerd
2015-02-01
We aimed to establish a method to assess systemic and pre-systemic cytochrome P450 (CYP) 3A activity using ineffective microgram doses of midazolam. In an open, one sequence, crossover study, 16 healthy participants received intravenous and oral midazolam at microgram (0.001 mg intravenous and 0.003 mg oral) and regular milligram (1 mg intravenous and 3 mg oral) doses to assess the linearity of plasma and urine pharmacokinetics. Dose-normalized AUC and Cmax were 37.1 ng ml−1 h [95% CI 35.5, 40.6] and 39.1 ng ml−1 [95% CI 30.4, 50.2] for the microdose and 39.0 ng ml−1 h [95% CI 36.1, 42.1] and 37.1 ng ml−1 [95% CI 26.9, 51.3] for the milligram dose. CLmet was 253 ml min−1 [95% CI 201, 318] vs. 278 ml min−1 [95% CI 248, 311] for intravenous doses and 1880 ml min−1 [95% CI 1590, 2230] vs. 2050 ml min−1 [95% CI 1720, 2450] for oral doses. Oral bioavailability of a midazolam microdose was 23.4% [95% CI 20.0, 27.3] vs. 20.9% [95% CI 17.1, 25.5] after the regular dose. Hepatic and gut extraction ratios for microgram doses were 0.44 [95% CI 0.39, 0.49] and 0.53 [95% CI 0.45, 0.63] and compared well with those for milligram doses (0.43 [95% CI 0.37, 0.49] and 0.61 [95% CI 0.53, 0.70]). The pharmacokinetics of an intravenous midazolam microdose is linear to the applied regular doses and can be used to assess safely systemic CYP3A activity and, in combination with oral microdoses, pre-systemic CYP3A activity. © 2014 The British Pharmacological Society.
Characteristics of Plantar Loads in Maximum Forward Lunge Tasks in Badminton
Hu, Xiaoyue; Li, Jing Xian; Hong, Youlian; Wang, Lin
2015-01-01
Background Badminton players often perform powerful, long-distance lunges during competitive matches. The objective of this study was to compare the plantar loads of three one-step maximum forward lunges in badminton. Methods Fifteen right-handed male badminton players participated in the study. Each participant performed five successful maximum lunges in each of three directions. For each direction, the participant wore three different shoe brands. Plantar loading, including peak pressure, maximum force, and contact area, was measured using an insole pressure measurement system. Two-way ANOVA with repeated measures was employed to determine the effects of the different lunge directions and different shoes, as well as the interaction of these two variables, on the measurements. Results The maximum force (MF) on the lateral midfoot was lower when performing left-forward lunges than when performing front-forward lunges (p = 0.006, 95% CI = −2.88 to −0.04%BW). The MF and peak pressures (PP) on the great toe region were lower for the front-forward lunge than for the right-forward lunge (MF, p = 0.047, 95% CI = −3.62 to −0.02%BW; PP, p = 0.048, 95% CI = −37.63 to −0.16 KPa) and left-forward lunge (MF, p = 0.015, 95% CI = −4.39 to −0.38%BW; PP, p = 0.008, 95% CI = −47.76 to −5.91 KPa). Conclusions These findings indicate that compared with the front-forward lunge, left and right maximum forward lunges induce greater plantar loads on the great toe region of the dominant leg of badminton players. The differences in plantar loads across lunge directions may represent potential risks for injuries to the lower extremities of badminton players. PMID:26367741
Wang, Yuan; Wang, Yuliang; Ma, Wenbin; Lu, Shujun; Chen, Jinbo; Cao, Lili
2018-01-01
Purpose The relationship between cognitive impairment during the acute phase of a first cerebral infarction and the development of long-term pseudobulbar affect (PBA) has not been elucidated. Therefore, in this study, we aimed to determine whether cognitive impairment during the acute phase of cerebral infarction increases the risk of long-term post-infarction PBA. Patients and methods This was a nested case–control study with a prospective design: a consecutive, multicenter, 1:1 matched comparison of 26 cases with cognitive impairment following acute cerebral infarction and 26 controls matched on sex, years of education, and age. Univariate and multivariate conditional logistic regression analyses were performed to study the clinical features and changes in cognitive domains as well as the risk factors for PBA. Results Long-term PBA was independently predicted by low Montreal cognitive assessment (MoCA) scores at baseline. Multivariable regression models showed that post-infarction low MoCA scores remained independent predictors of long-term PBA (odds ratio [OR]=0.72; 95% confidence interval [CI]=0.54–0.95; P=0.018). Among all cognitive measures, digit span test (DST) scores (OR=0.39; 95% CI=0.16–0.91, P=0.030), StroopC time (OR=1.15; 95% CI=1.01–1.31; P=0.037), and clock-drawing task (CDT) scores (OR=0.62; 95% CI=0.42–0.90; P=0.013) were independent risk factors for PBA. Conclusion Cognitive impairment during the acute phase of cerebral infarction increased the risk of long-term post-infarction PBA. Development of PBA was closely associated with deficits in executive function, attention, and visuospatial ability. PMID:29636612
What Is the Evidence for Physical Therapy Poststroke? A Systematic Review and Meta-Analysis
Veerbeek, Janne Marieke; van Wegen, Erwin; van Peppen, Roland; van der Wees, Philip Jan; Hendriks, Erik; Rietberg, Marc; Kwakkel, Gert
2014-01-01
Background Physical therapy (PT) is one of the key disciplines in interdisciplinary stroke rehabilitation. The aim of this systematic review was to provide an update of the evidence for stroke rehabilitation interventions in the domain of PT. Methods and Findings Randomized controlled trials (RCTs) regarding PT in stroke rehabilitation were retrieved through a systematic search. Outcomes were classified according to the ICF. RCTs with a low risk of bias were quantitatively analyzed. Differences between phases poststroke were explored in subgroup analyses. A best evidence synthesis was performed for neurological treatment approaches. The search yielded 467 RCTs (N = 25373; median PEDro score 6 [IQR 5–7]), identifying 53 interventions. No adverse events were reported. Strong evidence was found for significant positive effects of 13 interventions related to gait, 11 interventions related to arm-hand activities, 1 intervention for ADL, and 3 interventions for physical fitness. Summary Effect Sizes (SESs) ranged from 0.17 (95%CI 0.03–0.70; I2 = 0%) for therapeutic positioning of the paretic arm to 2.47 (95%CI 0.84–4.11; I2 = 77%) for training of sitting balance. There is strong evidence that a higher dose of practice is better, with SESs ranging from 0.21 (95%CI 0.02–0.39; I2 = 6%) for motor function of the paretic arm to 0.61 (95%CI 0.41–0.82; I2 = 41%) for muscle strength of the paretic leg. Subgroup analyses yielded significant differences with respect to timing poststroke for 10 interventions. Neurological treatment approaches to training of body functions and activities showed equal or unfavorable effects when compared to other training interventions. Main limitations of the present review are not using individual patient data for meta-analyses and absence of correction for multiple testing. Conclusions There is strong evidence for PT interventions favoring intensive high repetitive task-oriented and task-specific training in all phases poststroke. Effects are mostly restricted to the actually trained functions and activities. Suggestions for prioritizing PT stroke research are given. PMID:24505342
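The Summary Effect Sizes and I² statistics quoted above come from standard random-effects pooling. Below is a minimal DerSimonian-Laird sketch with invented study effects and standard errors (they are not data from the review), assuming the review used this or a closely related estimator.

    import numpy as np

    # Generic DerSimonian-Laird random-effects pooling of standardized effect
    # sizes, yielding a Summary Effect Size (SES) and I2. The effect sizes and
    # standard errors below are invented for illustration only.
    yi = np.array([0.30, 0.55, 0.10, 0.80, 0.45])   # study effect sizes (SMD)
    se = np.array([0.20, 0.25, 0.15, 0.30, 0.22])   # their standard errors

    wi = 1.0 / se**2                                # fixed-effect weights
    y_fixed = np.sum(wi * yi) / np.sum(wi)
    Q = np.sum(wi * (yi - y_fixed)**2)              # Cochran's Q
    df = len(yi) - 1
    C = np.sum(wi) - np.sum(wi**2) / np.sum(wi)
    tau2 = max(0.0, (Q - df) / C)                   # between-study variance
    I2 = max(0.0, (Q - df) / Q) * 100               # heterogeneity, %

    wi_re = 1.0 / (se**2 + tau2)                    # random-effects weights
    ses = np.sum(wi_re * yi) / np.sum(wi_re)
    se_ses = np.sqrt(1.0 / np.sum(wi_re))
    lo, hi = ses - 1.96 * se_ses, ses + 1.96 * se_ses
    print(f"SES = {ses:.2f} (95% CI {lo:.2f} to {hi:.2f}), I2 = {I2:.0f}%")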
A Method for Functional Task Alignment Analysis of an Arthrocentesis Simulator.
Adams, Reid A; Gilbert, Gregory E; Buckley, Lisa A; Nino Fong, Rodolfo; Fuentealba, I Carmen; Little, Erika L
2018-05-16
During simulation-based education, simulators are subjected to procedures composed of a variety of tasks and processes. Simulators should functionally represent a patient in response to the physical actions of these tasks. The aim of this work was to describe a method for determining whether a simulator has sufficient functional task alignment (FTA) to be used in a simulation. Potential performance checklist items were gathered from published arthrocentesis guidelines and aggregated into a performance checklist using Lawshe's method. An expert panel used this performance checklist and an FTA analysis questionnaire to evaluate a simulator's ability to respond to the physical actions required by the performance checklist. Thirteen items, from a pool of 39, were included on the performance checklist. Experts gave mixed reviews of the simulator's FTA and its suitability for use in simulation. Unexpectedly, some positive FTA ratings were given for several tasks where the simulator lacked the corresponding functionality. By developing a detailed list of the specific tasks required to complete a clinical procedure, and surveying experts on the simulator's response to those actions, educators can gain insight into a simulator's clinical accuracy and suitability. The unexpected positive FTA ratings for functions the simulator lacks suggest that further revision of the survey method is required.
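The checklist aggregation step relies on Lawshe's content validity ratio. The sketch below computes the CVR for a single candidate item; the panel size and essentiality ratings are hypothetical, and retention would be decided against Lawshe's critical-value table.

    # Lawshe's content validity ratio for one candidate checklist item:
    # CVR = (n_e - N/2) / (N/2), where n_e is the number of panelists rating
    # the item "essential" and N is the panel size. The values below are
    # hypothetical, not data from the study.
    def content_validity_ratio(n_essential: int, n_panelists: int) -> float:
        half = n_panelists / 2
        return (n_essential - half) / half

    # e.g. 9 of 11 hypothetical panelists rate an item essential
    cvr = content_validity_ratio(9, 11)
    print(f"CVR = {cvr:.2f}")   # 0.64; compare against the critical CVR for N = 11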
Stevens, Allen D; Hernandez, Caleb; Jones, Seth; Moreira, Maria E; Blumen, Jason R; Hopkins, Emily; Sande, Margaret; Bakes, Katherine; Haukoos, Jason S
2015-11-01
Medication dosing errors remain commonplace and may result in potentially life-threatening outcomes, particularly for pediatric patients where dosing often requires weight-based calculations. Novel medication delivery systems that may reduce dosing errors resonate with national healthcare priorities. Our goal was to evaluate novel, prefilled medication syringes labeled with color-coded volumes corresponding to the weight-based dosing of the Broselow Tape, compared to conventional medication administration, in simulated prehospital pediatric resuscitation scenarios. We performed a prospective, block-randomized, cross-over study, where 10 full-time paramedics each managed two simulated pediatric arrests in situ using either prefilled, color-coded syringes (intervention) or their own medication kits stocked with conventional ampoules (control). Each paramedic was paired with two emergency medical technicians to provide ventilations and compressions as directed. The ambulance patient compartment and the intravenous medication port were video recorded. Data were extracted from video review by blinded, independent reviewers. Median time to delivery of all doses for the intervention and control groups was 34 (95% CI: 28-39) seconds and 42 (95% CI: 36-51) seconds, respectively (difference=9 [95% CI: 4-14] seconds). Using the conventional method, 62 doses were administered with 24 (39%) critical dosing errors; using the prefilled, color-coded syringe method, 59 doses were administered with 0 (0%) critical dosing errors (difference=39%, 95% CI: 13-61%). A novel color-coded, prefilled syringe decreased time to medication administration and significantly reduced critical dosing errors by paramedics during simulated prehospital pediatric resuscitations. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
Household responsibilities, income, and ambulatory blood pressure among working men and women
Thurston, Rebecca C.; Sherwood, Andrew; Matthews, Karen A.; Blumenthal, James A.
2011-01-01
Objective To test the hypothesis that a greater perceived responsibility for household tasks and a greater number of hours spent doing these tasks would be associated with elevated ambulatory systolic (SBP) and diastolic blood pressure (DBP). The connection between job characteristics and cardiovascular outcomes has been widely studied. However, less is known about links between household work characteristics and cardiovascular health. Methods 113 employed, unmedicated hypertensive men and women underwent one day of ambulatory blood pressure (ABP) monitoring. Participants provided information on 1) the number of hours spent doing and 2) their degree of responsibility for seven household tasks (child care, pet care, caring for the ill/elderly, household chores, house/car repair, yardwork, finances). Associations between task hours and responsibility ratings in relation to SBP and DBP were estimated using generalized estimating equations, adjusting for age, race, gender, body mass index, location, and posture. Interactions with gender and socioeconomic position were assessed. Results A greater perceived responsibility for household tasks, but not the hours spent doing these tasks, was associated with higher ambulatory SBP (b (95% confidence interval (CI))=0.93 (0.29–1.56), p=0.004) and DBP (b (95%CI)=0.30 (0.10–0.51), p=0.003). Significant interactions with income indicated that associations between household responsibilities and ABP were most pronounced among low-income participants (SBP: b (95%CI)=1.40 (0.58–2.21), p<0.001; DBP: b (95%CI)=0.48 (0.18–0.78), p<0.01). The task most strongly associated with BP was household chores. No interactions with gender were observed. Conclusions Greater perceived responsibility for household tasks was associated with elevated ABP, particularly for lower-income participants. Household obligations may have important implications for cardiovascular health, meriting further empirical attention. PMID:21217097
Deep Learning for Classification of Colorectal Polyps on Whole-slide Images
Korbar, Bruno; Olofson, Andrea M.; Miraflor, Allen P.; Nicka, Catherine M.; Suriawinata, Matthew A.; Torresani, Lorenzo; Suriawinata, Arief A.; Hassanpour, Saeed
2017-01-01
Context: Histopathological characterization of colorectal polyps is critical for determining the risk of colorectal cancer and future rates of surveillance for patients. However, this characterization is a challenging task and suffers from significant inter- and intra-observer variability. Aims: We built an automatic image analysis method that can accurately classify different types of colorectal polyps on whole-slide images to help pathologists with this characterization and diagnosis. Setting and Design: Our method is based on deep-learning techniques, which rely on numerous levels of abstraction for data representation and have shown state-of-the-art results for various image analysis tasks. Subjects and Methods: Our method covers five common types of polyps (i.e., hyperplastic, sessile serrated, traditional serrated, tubular, and tubulovillous/villous) that are included in the US Multisociety Task Force guidelines for colorectal cancer risk assessment and surveillance. We developed multiple deep-learning approaches by leveraging a dataset of 2074 crop images, which were annotated by multiple domain expert pathologists as reference standards. Statistical Analysis: We evaluated our method on an independent test set of 239 whole-slide images and measured standard machine-learning evaluation metrics of accuracy, precision, recall, and F1 score and their 95% confidence intervals. Results: Our evaluation shows that our method with residual network architecture achieves the best performance for classification of colorectal polyps on whole-slide images (overall accuracy: 93.0%, 95% confidence interval: 89.0%–95.9%). Conclusions: Our method can reduce the cognitive burden on pathologists and improve their efficacy in histopathological characterization of colorectal polyps and in subsequent risk assessment and follow-up recommendations. PMID:28828201
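The abstract specifies a residual network architecture but not its depth or training configuration. The following is a minimal PyTorch/torchvision sketch of a five-class residual-network classifier over image crops; the choice of ResNet-18, the 224x224 input size, the optimizer settings, and the random stand-in batch are all assumptions, not the authors' pipeline.

    import torch
    import torch.nn as nn
    from torchvision import models

    # Minimal sketch of a residual-network classifier for the five polyp
    # classes named in the abstract. Architecture depth, input size, and
    # hyperparameters are placeholders, not the authors' configuration.
    NUM_CLASSES = 5  # hyperplastic, sessile serrated, traditional serrated,
                     # tubular, tubulovillous/villous

    model = models.resnet18()  # untrained backbone; pretrained weights optional
    model.fc = nn.Linear(model.fc.in_features, NUM_CLASSES)

    criterion = nn.CrossEntropyLoss()
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

    # One illustrative training step on a random batch standing in for crops
    images = torch.randn(8, 3, 224, 224)
    labels = torch.randint(0, NUM_CLASSES, (8,))

    optimizer.zero_grad()
    loss = criterion(model(images), labels)
    loss.backward()
    optimizer.step()
    print(f"illustrative batch loss: {loss.item():.3f}")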
Task 2 Report: Algorithm Development and Performance Analysis
1993-07-01
[Abstract not available. Only fragments of the report's list of figures survive extraction (e.g., example GC data for schedule 3 phosphites illustrating integration methods that follow the baseline more or less closely), along with a passage noting that correlated chromatography avoids much of the ambiguity that can arise in GC/MS analysis of trace environmental samples.]
Mache, Stefanie; Kelm, Ramona; Bauer, Hartwig; Nienhaus, Albert; Klapp, Burghard F; Groneberg, David A
2010-01-01
Surgeons have criticized the working conditions at German hospitals. They complain in particular about long working hours, inadequate salaries, insufficient training/supervision, and an increasing amount of time spent on administrative duties. Since these criticisms are subjective perceptions, they should be compared with data that can be quantified more objectively and accurately. In this study, we sought to report precise data on surgeons' workflow in several German hospitals. General surgeons were shadowed unobtrusively over 567 h during their shifts at four urban German hospitals. All job tasks the surgeons performed were recorded using a tablet PC. The average work day of the surgeons in this study was 9 h 26 min (95% CI 09:10:30 to 09:42:44 h). Within this time span, an average of 02:03:08 h was spent on documentation and administrative duties (95% CI 01:47:29 to 02:18:47 h), 01:47:40 h on operating procedures (95% CI 01:20:44 to 02:14:35 h), 01:43:46 h on internal communication (95% CI 01:32:55 to 01:54:36 h), and 0:48:25 h on ward rounds (95% CI 0:39:55 to 0:56:55 h). For the first time, surgeons' workflow in German hospitals was studied in real time. The study results substantiate physicians' statements about their own working conditions, especially with regard to the large amount of time spent on administrative tasks. The findings of this study form a basis upon which further analyses can be built and recommendations for improving physicians' workflows at German hospitals can be made.
Goetz, Katja; Kornitzky, Anna; Mahnkopf, Janis; Steinhäuser, Jost
2017-12-19
In the future, 'delegation' as task shifting from general practitioners (GPs) to non-physicians will be important in primary care. Therefore, the aim of this study was to evaluate attitudes towards the concept of task shifting and to identify predictors of a positive attitude towards task shifting from the perspective of GPs. This cross-sectional questionnaire study analysed attitudes towards the concept of task shifting and delegated tasks from the perspective of GPs who were recruited in the German federal state of Schleswig-Holstein. Descriptive statistics and binary regression analyses were computed to identify potential predictors of a positive attitude towards task shifting. Out of 1538 questionnaires distributed, 577 GP questionnaires were returned (response rate: 37.5%). A total of 53.2% of the respondents were male, and 37.3% were female. A positive attitude regarding task shifting was shown by 49% of the participating GPs. The highest level of agreement (95.2%) was found for time savings with task shifting, and lower agreement (39%) was found regarding the lack of clarity concerning responsibilities and legal aspects with regard to task shifting. The most frequently delegated tasks were recording electrocardiograms and measuring blood glucose levels. A positive attitude towards task shifting was positively associated with higher job satisfaction and a need for qualified staff. The GPs sampled in this study were very open-minded towards the concept of task shifting. Delegation is only beginning to be implemented in Germany, and its implementation depends on different aspects, such as legal requirements, adequate payment and qualified staff. Finally, there is a need for continuing professional development in primary care teams, especially for non-clinical practice staff.
Mahoney, Jeannette; Verghese, Joe
2014-01-01
Background. The relationship between executive functions (EF) and gait speed is well established. However, with the exception of dual tasking, the key components of EF that predict differences in gait performance have not been determined. Therefore, the current study was designed to determine whether processing speed, conflict resolution, and intraindividual variability in EF predicted variance in gait performance in single- and dual-task conditions. Methods. Participants were 234 nondemented older adults (mean age 76.48 years; 55% women) enrolled in a community-based cohort study. Gait speed was assessed using an instrumented walkway during single- and dual-task conditions. The flanker task was used to assess EF. Results. Results from the linear mixed effects model showed that (a) dual-task interference caused a significant dual-task cost in gait speed (estimate = 35.99; 95% CI = 33.19–38.80) and (b) of the cognitive predictors, only intraindividual variability was associated with gait speed (estimate = −.606; 95% CI = −1.11 to −.10). In unadjusted analyses, the three EF measures were related to gait speed in single- and dual-task conditions. However, in fully adjusted linear regression analysis, only intraindividual variability predicted performance differences in gait speed during dual tasking (B = −.901; 95% CI = −1.557 to −.245). Conclusion. Among the three EF measures assessed, intraindividual variability but not speed of processing or conflict resolution predicted performance differences in gait speed. PMID:24285744
Darney, Blair G; Sosa-Rubi, Sandra G; Servan-Mori, Edson; Rodriguez, Maria I; Walker, Dilys; Lozano, Rafael
2016-06-01
To test the association of age (adolescents vs. older women) and place of delivery with receipt of immediate postpartum contraception in Mexico. Retrospective cohort study in Mexico of a nationally representative sample of women 12-39 years old at last delivery. We used multivariable logistic regression to test the association of self-reported receipt of postpartum contraception prior to discharge with age and place of delivery (public, employment based, private, or out of facility). We included individual and household-level confounders and calculated relative and absolute multivariable estimates of association. Our analytic sample included 7022 women (population, N=9,881,470). Twenty percent of the population was 12-19 years old at last birth, 55% was aged 20-29, and 25% was 30-39 years old. Overall, 43% of women reported no postpartum contraceptive method. Age was not significantly associated with receipt of a method, controlling for covariates. Women delivering in public facilities had lower odds of receipt of a method (odds ratio=0.52; 95% confidence interval (CI)=0.40-0.68) compared with employment-based insurance facilities. We estimated that 76% (95% CI=74-78%) of adolescents (12-19 years) who deliver in employment-based insurance facilities leave with a method, compared with 59% (95% CI=56-62%) who deliver in public facilities. Both adolescents and women ages 20-39 receive postpartum contraception, but nearly half of all women receive no method. Place of delivery is correlated with receipt of postpartum contraception, with lower rates in the public sector. Lessons learned from Mexico are relevant to other countries seeking to improve adolescent health through reducing unintended pregnancy. Adolescents receive postpartum contraception as often as older women in Mexico, but half of all women receive no method. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
Broder, Joshua Seth; Bhat, Rahul; Boyd, Joshua P.; Ogloblin, Ivan A.; Limkakeng, Alexander; Hocker, Michael Brian; Drake, Weiying Gao; Miller, Taylor; Harringa, John Brian; Repplinger, Michael Dean
2016-01-01
Background Emergency department (ED) computed tomography (CT) use has increased substantially in recent years, resulting in increased radiation exposure for patients. Few studies have assessed which parties contribute to CT ordering in the ED. Objective To determine the proportion of CT scans ordered due to explicit requests by various stakeholders in ED patient care. Methods Prospective, observational study at three university hospital EDs. CT scans ordered during research assistant hours were eligible for inclusion. Attending emergency physicians (EPs) completed standardized data forms to indicate all parties who had explicitly requested that a specific CT be performed. Forms were completed before the CT results were known in order to minimize bias. Results Data were obtained from 77 EPs regarding 944 CTs. The parties most frequently requesting CTs were attending EPs (82.0%, 95% CI 79.4–84.3), resident physicians (28.6%, 95%CI 25.8–31.6), consulting physicians (24.4%, 95%CI 21.7–27.2), and admitting physicians (3.9%, 95%CI 2.9–5.4). In the 168 instances in which the attending EP did not explicitly request the CT, requests most commonly came from consulting physicians (51.2%, 95%CI 43.7–58.6), resident physicians in the ED (39.9%, 95%CI, 32.8–47.4), and admitting physicians (8.9%, 95%CI, 5.5–14.2). EPs were the sole party requesting CT in 46.2% of cases while multiple parties were involved in 39.0%. Patients, families, and radiologists were uncommon sources of such requests. Conclusions Emergency physicians requested the majority of CTs, though nearly 20% were actually not desired by them. Admitting, consulting, and resident physicians in the ED were important contributors to CT utilization. PMID:26873604
Khurrum, Huma; Bedaiwi, Khalid M.; AlBalahi, Naif Meshael
2017-01-01
Background The Koebner phenomenon (KP) is a common entity observed in dermatological disorders. The reported incidence of KP in vitiligo varies widely. Although KP is frequently observed in patients with vitiligo, the factors associated with it have not been established. Objective The aim was to estimate the prevalence of KP in vitiligo patients and to investigate which vitiligo characteristics are associated with KP. Methods A cross-sectional observational study was conducted using 381 vitiligo patients. Demographic and clinical information was obtained via the completion of Vitiligo European Task Force (VETF) questionnaires. Patients with a positive history of KP were extracted from this vitiligo database. Multivariate analysis was performed to assess associations with KP. Results The median age of cases was 24 years (range, 0.6~76). In total, 237 of the patients were male (62.2%). Vitiligo vulgaris was the most common type observed (152/381, 39.9%). Seventy-two percent (274/381) of patients did not exhibit KP, whereas 28.1% (107/381) did. Multivariable analysis showed the following to be independent factors associated with KP in patients with vitiligo: progressive disease (odds ratio [OR], 1.82; 95% confidence interval [95% CI], 1.17~2.92; p=0.041), disease duration longer than 5 years (OR, 1.92; 95% CI, 1.22~2.11; p=0.003), and body surface area involvement of more than 2% (OR, 2.20; 95% CI, 1.26~3.24; p<0.001). Conclusion Our results suggest that KP may be used to evaluate disease activity and to investigate associations between the clinical profile and course of vitiligo. Further studies are needed to determine the relationship between KP and responsiveness to therapy. PMID:28566906
The Impact of Gestational Age at Delivery on Urologic Outcomes for the Fetus with Hydronephrosis.
Benjamin, Tara; Amodeo, Rhiannon R; Patil, Avinash S; Robinson, Barrett K
2016-01-01
Compare short-term urologic outcomes with delivery timing in fetuses with severe hydronephrosis. An ultrasound database was queried for severe hydronephrosis. Cases were categorized into late preterm/early term (36 0/7 - 38 6/7 weeks) and full term (39 0/7 weeks or greater) groups. Baseline characteristics were compared using standard statistical methods. Spearman's correlation analysis was performed for grade and severity of hydronephrosis on first postnatal ultrasound with gestational age at delivery. Of 589 cases, 79 (33 late preterm/early term, 46 full term) met criteria. Baseline characteristics were similar between groups. Spearman's correlation coefficients (rs) indicated that increased postnatal Society for Fetal Urology grade, rs= -0.26 (95% CI [-.48, -.002]), and severity of hydronephrosis, rs= -0.39 (95% CI [-.59, -.14]), both correlated with earlier delivery. Late preterm/early term delivery resulted in worse short-term postnatal renal outcomes. Unless otherwise indicated, delivery for fetal hydronephrosis should be deferred until 39 weeks.
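The correlations above are Spearman rank correlations with confidence intervals. A small sketch using SciPy is given below, with invented gestational-age and severity-grade data and an approximate Fisher-z interval; the abstract does not state which CI method the authors used.

    import numpy as np
    from scipy import stats

    # Spearman correlation between gestational age at delivery and postnatal
    # hydronephrosis severity grade, with an approximate Fisher-z 95% CI.
    # The data below are invented for illustration.
    gest_age = np.array([36.1, 37.0, 37.5, 38.2, 38.9, 39.1, 39.6, 40.0, 40.3, 41.0])
    severity = np.array([4, 4, 3, 3, 3, 2, 2, 2, 1, 1])   # higher = worse

    rs, p = stats.spearmanr(gest_age, severity)

    n = len(gest_age)
    z = np.arctanh(rs)                  # Fisher transformation
    se = 1.0 / np.sqrt(n - 3)
    lo, hi = np.tanh([z - 1.96 * se, z + 1.96 * se])
    print(f"rs = {rs:.2f} (95% CI {lo:.2f} to {hi:.2f}), p = {p:.3f}")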
Orbital evolution of 95P/Chiron, 39P/Oterma, 29P/Schwassmann-Wachmann 1, and of 33 Centaurs
NASA Astrophysics Data System (ADS)
Kovalenko, N. S.; Churyumov, K. I.; Babenko, Yu. G.
2011-12-01
The paper is devoted to numerical modeling of the orbital evolution of 34 Centaurs and 2 distant Jupiter-family comets, 39P/Oterma and 29P/Schwassmann-Wachmann 1. As a result, evolutionary tracks of the orbital elements of 33 Centaurs and 3 comets (95P/Chiron (2060), 39P/Oterma and 29P/Schwassmann-Wachmann 1) are obtained. The integrations were performed for 1 Myr backward and forward in time, starting at the initial epoch and using the implicit single-sequence Everhart method. A statistical analysis of the numerical integration results was carried out, and trends in the changes of the Centaurs' orbital elements in the past and in the future are revealed. The fraction of Centaurs that are potential comets is estimated from the perihelion distributions of the modeled orbits. It is shown that Centaurs may transit into orbits typical of Jupiter-family comets, and vice versa. Centaurs represent one possible source for replenishment of the JFC population, but other sources are also necessary.
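The integrations in the paper rely on the implicit single-sequence Everhart (RADAU-type) integrator with a full planetary model. Purely to illustrate the kind of long-term propagation involved, the toy sketch below integrates a Centaur-like test particle under the Sun plus Jupiter on a fixed circular orbit with SciPy's general-purpose solve_ivp; the initial conditions are approximate, the Sun's reflex motion is neglected, and the time span is kept short so the example runs quickly.

    import numpy as np
    from scipy.integrate import solve_ivp

    # Toy planar propagation of a Centaur-like test particle under the Sun and
    # Jupiter (Jupiter forced on a circular orbit). Units: AU, years, solar
    # masses, with G*M_sun = 4*pi^2. A crude stand-in for the Everhart
    # integrator and full planetary model used in the paper.
    GM_SUN = 4 * np.pi**2
    M_JUP = 9.55e-4                      # Jupiter mass, solar masses
    A_JUP = 5.203                        # Jupiter semi-major axis, AU
    N_JUP = np.sqrt(GM_SUN / A_JUP**3)   # Jupiter mean motion, rad/yr

    def rhs(t, state):
        x, vx, y, vy = state
        xj, yj = A_JUP * np.cos(N_JUP * t), A_JUP * np.sin(N_JUP * t)
        r3 = (x**2 + y**2) ** 1.5
        dxj, dyj = x - xj, y - yj
        rj3 = (dxj**2 + dyj**2) ** 1.5
        ax = -GM_SUN * x / r3 - GM_SUN * M_JUP * dxj / rj3
        ay = -GM_SUN * y / r3 - GM_SUN * M_JUP * dyj / rj3
        return [vx, ax, vy, ay]

    # Chiron-like start: a ~ 13.7 AU, e ~ 0.38, placed at perihelion
    a, e = 13.7, 0.38
    r0 = a * (1 - e)
    v0 = np.sqrt(GM_SUN * (2 / r0 - 1 / a))      # vis-viva speed at perihelion
    sol = solve_ivp(rhs, (0.0, 1.0e3), [r0, 0.0, 0.0, v0],
                    method="DOP853", rtol=1e-9, atol=1e-12)

    r = np.hypot(sol.y[0], sol.y[2])
    print(f"heliocentric distance over 1 kyr: {r.min():.1f} - {r.max():.1f} AU")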
Trombetti, Andrea; Hars, Mélany; Herrmann, François R; Kressig, Reto W; Ferrari, Serge; Rizzoli, René
2011-03-28
Falls occur mainly while walking or performing concurrent tasks. We determined whether a music-based multitask exercise program improves gait and balance and reduces fall risk in elderly individuals. We conducted a 12-month randomized controlled trial involving 134 community-dwelling individuals older than 65 years, who are at increased risk of falling. They were randomly assigned to an intervention group (n = 66) or a delayed intervention control group scheduled to start the program 6 months later (n = 68). The intervention was a 6-month multitask exercise program performed to the rhythm of piano music. Change in gait variability under dual-task condition from baseline to 6 months was the primary end point. Secondary outcomes included changes in balance, functional performances, and fall risk. At 6 months, there was a reduction in stride length variability (adjusted mean difference, -1.4%; P < .002) under dual-task condition in the intervention group, compared with the delayed intervention control group. Balance and functional tests improved compared with the control group. There were fewer falls in the intervention group (incidence rate ratio, 0.46; 95% confidence interval, 0.27-0.79) and a lower risk of falling (relative risk, 0.61; 95% confidence interval, 0.39-0.96). Similar changes occurred in the delayed intervention control group during the second 6-month period with intervention. The benefit of the intervention on gait variability persisted 6 months later. In community-dwelling older people at increased risk of falling, a 6-month music-based multitask exercise program improved gait under dual-task condition, improved balance, and reduced both the rate of falls and the risk of falling. Trial Registration clinicaltrials.gov Identifier: NCT01107288.
Management Information Task Group
2002-12-18
Defense Business Practice Implementation Board, Management Information Task Group Report FY02-2.
Evaluation of a strapless heart rate monitor during simulated flight tasks.
Wang, Zhen; Fu, Shan
2016-01-01
Pilots are under high task demands during flight. Monitoring a pilot's physiological status is very important in evaluating pilot workload and flight safety. Recently, a physiological status monitor (PSM) has been embedded into a watch that can be used without a conventional chest strap. This makes it possible to unobtrusively monitor, log, and transmit a pilot's physiological measurements such as heart rate (HR) during flight tasks. The purpose of this study was to validate HR recorded by a strapless heart rate watch against criterion ECG-derived HR. Ten commercial pilots (mean ± SD: age 39.1 ± 7.8 years; total flight hours 7173.2 ± 5270.9 hr) performed three routinely trained flight tasks in a full flight simulator: wind shear go-around (WG), takeoff and climb (TC), and hydraulic failure (HF). For all tasks combined (overall) and for each task, differences between the heart rate watch measurements and the criterion data were small (mean difference [95% CI]: overall: -0.71 beats/min [-0.85, -0.57]; WG: -0.90 beats/min [-1.15, -0.65]; TC: -0.69 beats/min [-0.98, -0.40]; HF: -0.61 beats/min [-0.80, -0.42]). There were high correlations between the heart rate watch measurements and the ECG-derived HR for all tasks (r ≥ 0.97, SEE < 3). Bland-Altman plots also showed high agreement between the watch measurements and the criterion HR. These results suggest that the strapless heart rate watch provides valid measurements of HR during simulated flight tasks and could be a useful tool for pilot workload evaluation.
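The agreement statistics reported above (mean differences, correlations, Bland-Altman plots) can be computed for any paired sample along the following lines; the paired heart-rate values below are invented for illustration.

    import numpy as np

    # Bland-Altman style agreement between watch-derived and ECG-derived heart
    # rate. The paired samples below are invented, not study data.
    ecg_hr   = np.array([72, 75, 81, 88, 94, 99, 104, 110, 118, 125], float)
    watch_hr = np.array([71, 75, 80, 87, 93, 99, 103, 109, 117, 124], float)

    diff = watch_hr - ecg_hr
    bias = diff.mean()                          # mean difference (systematic bias)
    sd = diff.std(ddof=1)
    loa = (bias - 1.96 * sd, bias + 1.96 * sd)  # 95% limits of agreement

    r = np.corrcoef(ecg_hr, watch_hr)[0, 1]
    print(f"bias = {bias:.2f} bpm, 95% LoA = ({loa[0]:.2f}, {loa[1]:.2f}), r = {r:.3f}")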
Lee, Miryoung; Pascoe, John M; McNicholas, Caroline I
2017-01-01
Objectives The prevalence of extreme prematurity at birth has increased, but little research has examined its impact on developmental outcomes in large representative samples within the United States. This study examined the association of extreme prematurity with kindergarteners' reading skills, mathematics skills, and fine motor skills. Methods The Early Childhood Longitudinal Study-Birth Cohort, a representative sample of US children born in 2001, was analyzed for this study. Early reading and mathematics skills and fine motor skills were compared among 200 extremely premature children (EPC) (gestational age <28 wks or birthweight <1000 g), 500 premature children (PC), and 4300 term children (TC) (≥37 wks or ≥2500 g). Generalized linear regression analyses included sampling weights, children's age, race, sex, and general health status, and parental marital status and education among singleton children. Results At age 5 years, EPC were 2.6 (95% CI 1.7-3.8) times more likely to fail to build a gate and 3.1 (95% CI 1.6-5.8) times more likely to fail all four drawing tasks compared to TC (p values <0.001). Fine motor performance of PC (failed to build a gate, 1.3 [95% CI 1.0-1.7]; failed to draw all four shapes, 1.1 [95% CI 0.8-1.6]) was not significantly different from TC. The mean early reading scale score (36.8 [SE: 1.3]) of EPC was 4.0 points lower than that of TC (p value < 0.0001), while the mean reading score (39.9 [SE: 1.4]) of PC was not significantly different from TC (40.8 [SE: 1.1]). Mean mathematics scale scores were significantly lower for both EPC (35.5 [SE: 1.0], p value < 0.001) and PC (39.8 [SE: 0.8], p value = 0.023) compared to TC (41.0 [SE: 0.6]). Conclusions for Practice Extreme prematurity at birth was associated with cognitive and fine motor delays at age 5 years. This suggests that, based on a nationally representative sample of infants, the biological risk of extreme prematurity persists after adjusting for other factors related to development.
2011-01-01
Background Shifting the role of counseling to less skilled workers may improve efficiency and coverage of health services, but evidence is needed on the impact of substitution on quality of care. This research explored the influence of delegating maternal and newborn counseling responsibilities to clinic-based lay nurse aides on the quality of counseling provided as part of a task shifting initiative to expand their role. Methods Nurse-midwives and lay nurse aides in seven public maternities were trained to use job aids to improve counseling in maternal and newborn care. Quality of counseling and maternal knowledge were assessed using direct observation of antenatal consultations and patient exit interviews. Both provider types were interviewed to examine perceptions regarding the task shift. To compare provider performance levels, non-inferiority analyses were conducted where non-inferiority was demonstrated if the lower confidence limit of the performance difference did not exceed a margin of 10 percentage points. Results Mean percent of recommended messages provided by lay nurse aides was non-inferior to counseling by nurse-midwives in adjusted analyses for birth preparedness (β = -0.0, 95% CI: -9.0, 9.1), danger sign recognition (β = 4.7, 95% CI: -5.1, 14.6), and clean delivery (β = 1.4, 95% CI: -9.4, 12.3). Lay nurse aides demonstrated superior performance for communication on general prenatal care (β = 15.7, 95% CI: 7.0, 24.4), although non-inferiority was not achieved for newborn care counseling (β = -7.3, 95% CI: -23.1, 8.4). The proportion of women with correct knowledge was significantly higher among those counseled by lay nurse aides as compared to nurse-midwives in general prenatal care (β = 23.8, 95% CI: 15.7, 32.0), birth preparedness (β = 12.7, 95% CI: 5.2, 20.1), and danger sign recognition (β = 8.6, 95% CI: 3.3, 13.9). Both cadres had positive opinions regarding task shifting, although several preferred 'task sharing' over full delegation. Conclusions Lay nurse aides can provide effective antenatal counseling in maternal and newborn care in facility-based settings, provided they receive adequate training and support. Efforts are needed to improve management of human resources to ensure that effective mechanisms for regulating and financing task shifting are sustained. PMID:21211045
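The non-inferiority rule stated in the Methods (the lower 95% confidence limit of the lay aide minus nurse-midwife difference must not fall below a -10 percentage-point margin) can be applied directly to the adjusted estimates quoted above, as in this small sketch:

    # Non-inferiority decision rule from the abstract, applied to the adjusted
    # performance differences (lay nurse aides minus nurse-midwives, in
    # percentage points) quoted above.
    MARGIN = -10.0  # percentage points

    outcomes = {
        "birth preparedness":      (-0.0, -9.0,  9.1),
        "danger sign recognition": ( 4.7, -5.1, 14.6),
        "clean delivery":          ( 1.4, -9.4, 12.3),
        "general prenatal care":   (15.7,  7.0, 24.4),
        "newborn care":            (-7.3, -23.1, 8.4),
    }

    for name, (beta, lo, hi) in outcomes.items():
        verdict = "non-inferior" if lo >= MARGIN else "non-inferiority not shown"
        print(f"{name}: diff = {beta:+.1f} (95% CI {lo:.1f} to {hi:.1f}) -> {verdict}")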
Accuracy and reliability of pulp/tooth area ratio in upper canines by peri-apical X-rays.
Azevedo, A C; Michel-Crosato, E; Biazevic, M G H; Galić, I; Merelli, V; De Luca, S; Cameriere, R
2014-11-01
Due to the real need for careful staff training in age assessment, in order to improve capacity, consistency and competence, new research on the reliability and repeatability of methods frequently used in age assessment is required. The aim of this study was twofold: first, to test the accuracy of the pulp/tooth area ratio method for age estimation; second, to obtain data on the reliability of this technique. A sample of 81 peri-apical radiographs of upper canines (44 men and 37 women), aged between 19 and 74 years, was used; the teeth were taken from the osteological collection of Sassari (Sardinia, Italy). Three blinded observers used the technique to perform the age estimation. The mean real age of the 81 observations was 37.21 years (CI95% 34.37; 40.05), and mean estimated ages ranged from 36.65 to 38.99 years (CI95%-Ex1 35.42; 41.28; CI95%-Ex2 33.89; 39.41; CI95%-Ex3 35.92; 42.06). The mean absolute differences between observers were 3.43, 4.24 and 4.45 years, respectively, for Ex1×Ex2, Ex1×Ex3 and Ex2×Ex3. The mean absolute differences between real and estimated ages were 2.55 (CI95% 1.90; 3.20), 2.22 (CI95% 1.65; 2.78) and 4.39 (CI95% 3.80; 5.75) years, respectively, for Ex1, Ex2 and Ex3. No differences were observed among measurements. This technique can be reproduced and repeated after proper training, since high reliability and accuracy were found. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
Meta-analysis of executive functioning in ecstasy/polydrug users.
Roberts, C A; Jones, A; Montgomery, C
2016-06-01
Ecstasy/3,4-methylenedioxymethamphetamine (MDMA) use is proposed to cause damage to serotonergic (5-HT) axons in humans. Therefore, users should show deficits in cognitive processes that rely on serotonin-rich, prefrontal areas of the brain. However, there is inconsistency in findings to support this hypothesis. The aim of the current study was to examine deficits in executive functioning in ecstasy users compared with controls using meta-analysis. We identified k = 39 studies, contributing 89 effect sizes, investigating executive functioning in ecstasy users and polydrug-using controls. We compared function-specific task performance in 1221 current ecstasy users and 1242 drug-using controls, from tasks tapping the executive functions - updating, switching, inhibition and access to long-term memory. The significant main effect demonstrated overall executive dysfunction in ecstasy users [standardized mean difference (SMD) = -0.18, 95% confidence interval (CI) -0.26 to -0.11, Z = 5.05, p < 0.001, I² = 82%], with a significant subgroup effect (χ² = 22.06, degrees of freedom = 3, p < 0.001, I² = 86.4%) demonstrating differential effects across executive functions. Ecstasy users showed significant performance deficits in access (SMD = -0.33, 95% CI -0.46 to -0.19, Z = 4.72, p < 0.001, I² = 74%), switching (SMD = -0.19, 95% CI -0.36 to -0.02, Z = 2.16, p < 0.05, I² = 85%) and updating (SMD = -0.26, 95% CI -0.37 to -0.15, Z = 4.49, p < 0.001, I² = 82%). No differences were observed in inhibitory control. We conclude that this is the most comprehensive analysis of executive function in ecstasy users to date and provides a behavioural correlate of potential serotonergic neurotoxicity.
López-Vicente, Mónica; Garcia-Aymerich, Judith; Torrent-Pallicer, Jaume; Forns, Joan; Ibarluzea, Jesús; Lertxundi, Nerea; González, Llúcia; Valera-Gran, Desirée; Torrent, Maties; Dadvand, Payam; Vrijheid, Martine; Sunyer, Jordi
2017-09-01
To evaluate the role of extracurricular physical activity and sedentary behavior at preschool and primary school age on working memory at primary school age and adolescence, respectively. This prospective study was based on a birth cohort across 4 Spanish regions. In the 3 younger subcohorts (n = 1093), parents reported the child's lifestyle habits at 4 years of age on a questionnaire, and children performed a computerized working memory task at 7 years of age. In the older subcohort (n = 307), the questionnaire was completed at 6 years of age and working memory was tested at 14 years of age. Adjusted regression models were developed to investigate the associations between lifestyle habits and working memory. Low extracurricular physical activity levels at 4 years of age were associated with a nonsignificant 0.95% (95% CI -2.81 to 0.92) reduction of correct responses in the working memory task at 7 years of age. Low extracurricular physical activity levels at 6 years of age were associated with a 4.22% (95% CI -8.05 to -0.39) reduction of correct responses at 14 years of age. Television watching was not associated with working memory. Other sedentary behaviors at 6 years of age were associated with a 5.07% (95% CI -9.68 to -0.46) reduction of correct responses in boys at 14 years of age. Low extracurricular physical activity levels at preschool and primary school ages were associated with poorer working memory performance at primary school age and adolescence, respectively. High sedentary behavior levels at primary school age were negatively related to working memory in adolescent boys. Copyright © 2017 Elsevier Inc. All rights reserved.
Parkinson, Bonny; Goodall, Stephen; Norman, Richard
2013-04-01
Economic evaluations of mandatory health programmes generally do not consider the utility impact of a loss of consumer choice upon implementation, despite evidence suggesting that consumers do value having the ability to choose. The primary aim of this study was to explore whether the utility impact of a loss of consumer choice from implementing mandatory health programmes can be measured using discrete choice experiments (DCEs). Three case studies were used to test the methodology: fortification of bread-making flour with folate, mandatory influenza vaccination of children, and the banning of trans-fats. Attributes and levels were developed from a review of the literature. An orthogonal, fractional factorial design was used to select the profiles presented to respondents to allow estimation of main effects. Overall, each DCE consisted of 64 profiles which were allocated to four versions of 16 profiles. Each choice task compared two profiles, one being voluntary and the other being mandatory, plus a 'no policy' option; thus each respondent was presented with eight choice tasks. For each choice task, respondents were asked which health policy they most preferred and least preferred. Data were analysed using a mixed logit model with correlated coefficients (200 Halton draws). The compensating variation required for introducing a programme on a mandatory basis (versus achieving the same health impacts with a voluntary programme) that holds utility constant was estimated. Responses were provided by 535 participants (a response rate of 83%). For the influenza vaccination and folate fortification programmes, the results suggested that some level of compensation may be required for introducing the programme on a mandatory basis. Introducing a mandatory influenza vaccination programme required the highest compensation (Australian dollars [A$] 112.75, 95% CI -60.89 to 286.39) compared with folate fortification (A$18.05, 95% CI -3.71 to 39.80). No compensation was required for introducing the trans-fats programme (-A$0.22, 95% CI -6.24 to 5.80) [year 2010 values]. In addition to the type of mandatory health programme, the compensation required was also found to be dependent on a number of other factors. In particular, the study found an association between the compensation required and stronger libertarian preferences. DCEs can be used to measure the utility impact of a loss of consumer choice. Excluding the utility impact of a loss of consumer choice from an economic evaluation taking a societal perspective may result in a sub-optimal, or incorrect, funding decision.
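In a discrete choice experiment with a cost attribute, the compensation required for a mandatory programme is commonly approximated as the ratio of the "mandatory" coefficient to the cost coefficient. The sketch below uses invented placeholder coefficients (chosen so the result lands near the A$112.75 quoted for the vaccination programme); the study itself estimated a mixed logit with correlated random coefficients over 200 Halton draws, which this simple ratio does not reproduce.

    # Back-of-envelope compensating variation from choice-model coefficients:
    # required compensation = beta_mandatory / beta_cost (both negative here).
    # These coefficients are invented placeholders, not estimates from the study.
    beta_mandatory = -0.45   # disutility of a mandatory (vs voluntary) programme
    beta_cost = -0.004       # marginal utility of money, per A$

    wtp = -beta_mandatory / beta_cost    # willingness to pay for "mandatory"
    compensation = -wtp                  # negative WTP => compensation required
    print(f"compensation required: A${compensation:.2f}")   # ~A$112.50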
Infant Growth Before and After Term: Effects on Neurodevelopment in Preterm Infants
Rifas-Shiman, Sheryl L.; Sullivan, Thomas; Collins, Carmel T.; McPhee, Andrew J.; Ryan, Philip; Kleinman, Ken P.; Gillman, Matthew W.; Gibson, Robert A.; Makrides, Maria
2011-01-01
OBJECTIVE: To identify sensitive periods of postnatal growth for preterm infants relative to neurodevelopment at 18 months' corrected age. PATIENTS AND METHODS: We studied 613 infants born at <33 weeks' gestation who participated in the DHA for Improvement of Neurodevelopmental Outcome trial. We calculated linear slopes of growth in weight, length, BMI, and head circumference from 1 week of age to term (40 weeks' postmenstrual age), term to 4 months, and 4 to 12 months, and we estimated their associations with Bayley Scales of Infant Development, 2nd Edition, Mental (MDI) and Psychomotor (PDI) Development Indexes in linear regression. RESULTS: The median gestational age was 30 (range: 2–33) weeks. Mean ± SD MDI was 94 ± 16, and PDI was 93 ± 16. From 1 week to term, greater weight gain (2.4 MDI points per z score [95% confidence interval (CI): 0.8–3.9]; 2.7 PDI points [95% CI: 1.2–.2]), BMI gain (1.7 MDI points [95% CI: 0.4–3.1]; 2.5 PDI points [95% CI: 1.2–3.9]), and head growth (1.4 MDI points [95% CI: −0.0–2.8]; 2.5 PDI points [95% CI: 1.2–3.9]) were associated with higher scores. From term to 4 months, greater weight gain (1.7 points [95% CI: 0.2–3.1]) and linear growth (2.0 points [95% CI: 0.7–3.2]), but not BMI gain, were associated with higher PDI. From 4 to 12 months, none of the growth measures was associated with MDI or PDI score. CONCLUSIONS: In preterm infants, greater weight and BMI gain to term were associated with better neurodevelopmental outcomes. After term, greater weight gain was also associated with better outcomes, but increasing weight out of proportion to length did not confer additional benefit. PMID:21949135
Renal replacement therapy in Europe: a summary of the 2011 ERA–EDTA Registry Annual Report
Noordzij, Marlies; Kramer, Anneke; Abad Diez, José M.; Alonso de la Torre, Ramón; Arcos Fuster, Emma; Bikbov, Boris T.; Bonthuis, Marjolein; Bouzas Caamaño, Encarnación; Čala, Svetlana; Caskey, Fergus J.; Castro de la Nuez, Pablo; Cernevskis, Harijs; Collart, Frederic; Díaz Tejeiro, Rafael; Djukanovic, Ljubica; Ferrer-Alamar, Manuel; Finne, Patrik; García Bazaga, María de los Angelos; Garneata, Liliana; Golan, Eliezer; Gonzalez Fernández, Raquel; Heaf, James G.; Hoitsma, Andries; Ioannidis, George A.; Kolesnyk, Mykola; Kramar, Reinhard; Lasalle, Mathilde; Leivestad, Torbjørn; Lopot, Frantisek; van de Luijtgaarden, Moniek W.M.; Macário, Fernando; Magaz, Ángela; Martín Escobar, Eduardo; de Meester, Johan; Metcalfe, Wendy; Ots-Rosenberg, Mai; Palsson, Runolfur; Piñera, Celestino; Pippias, Maria; Prütz, Karl G.; Ratkovic, Marina; Resić, Halima; Rodríguez Hernández, Aurelio; Rutkowski, Boleslaw; Spustová, Viera; Stel, Vianda S.; Stojceva-Taneva, Olivera; Süleymanlar, Gültekin; Wanner, Christoph; Jager, Kitty J.
2014-01-01
Background This article provides a summary of the 2011 ERA–EDTA Registry Annual Report (available at www.era-edta-reg.org). Methods Data on renal replacement therapy (RRT) for end-stage renal disease (ESRD) from national and regional renal registries in 30 countries in Europe and bordering the Mediterranean Sea were used. From 27 registries, individual patient data were received, whereas 17 registries contributed data in aggregated form. We present the incidence and prevalence of RRT, and renal transplant rates in 2011. In addition, survival probabilities and expected remaining lifetimes were calculated for those registries providing individual patient data. Results The overall unadjusted incidence rate of RRT in 2011 among all registries reporting to the ERA–EDTA Registry was 117 per million population (pmp) (n = 71.631). Incidence rates varied from 24 pmp in Ukraine to 238 pmp in Turkey. The overall unadjusted prevalence of RRT for ESRD on 31 December 2011 was 692 pmp (n = 425 824). The highest prevalence was reported by Portugal (1662 pmp) and the lowest by Ukraine (131 pmp). Among all registries, a total of 22 814 renal transplantations were performed (37 pmp). The highest overall transplant rate was reported from Spain, Cantabria (81 pmp), whereas the highest rate of living donor transplants was reported from Turkey (39 pmp). For patients who started RRT between 2002 and 2006, the unadjusted 5-year patient survival on RRT was 46.8% [95% confidence interval (CI) 46.6–47.0], and on dialysis 39.3% (95% CI 39.2–39.4). The unadjusted 5-year patient survival after the first renal transplantation performed between 2002 and 2006 was 86.7% (95% CI 86.2–87.2) for kidneys from deceased donors and 94.3% (95% CI 93.6–95.0) for kidneys from living donors. PMID:25852881
Howell, David R; Meehan, William P; Barber Foss, Kim D; Reches, Amit; Weiss, Michal; Myer, Gregory D
2018-05-31
To investigate the association between dual-task gait performance and brain network activation (BNA) using an electroencephalography (EEG)-based Go/No-Go paradigm among children and adolescents with concussion. Participants with a concussion completed a visual Go/No-Go task with collection of electroencephalogram brain activity. Data were treated with BNA analysis, which involves an algorithmic approach to EEG-ERP activation quantification. Participants also completed a dual-task gait assessment. The relationship between dual-task gait speed and BNA was assessed using multiple linear regression models. Participants (n = 20, 13.9 ± 2.3 years of age, 50% female) were tested at a mean of 7.0 ± 2.5 days post-concussion and were symptomatic at the time of testing (post-concussion symptom scale = 40.4 ± 21.9). Slower dual-task average gait speed (mean = 82.2 ± 21.0 cm/s) was significantly associated with lower relative time BNA scores (mean = 39.6 ± 25.8) during the No-Go task (β = 0.599, 95% CI = 0.214, 0.985, p = 0.005, R² = 0.405), while controlling for the effect of age and gender. Among children and adolescents with a concussion, slower dual-task gait speed was independently associated with lower BNA relative time scores during a visual Go/No-Go task. The relationship between abnormal gait behaviour and brain activation deficits may be reflective of disruption to multiple functional abilities after concussion.
Rakotoniana, Jerry S.; Rakotomanga, Jean de Dieu M.; Barennes, Hubert
2014-01-01
Introduction Churches occupy an important social and cultural position in Madagascar. The sexual transmission of HIV raises controversies about the role that Churches can play in preventing HIV/AIDS. This cross-sectional survey investigated recommendations by religious leaders for condom use and other preventive strategies in the context of international guidelines. Methods A questionnaire was self-administered to a random sample of religious leaders. The questions related to preventive methods against HIV/AIDS such as: condom use, marital fidelity, sexual abstinence before marriage, and HIV-testing. Associations with recommendations for condom use were evaluated using univariate and multivariate logistic regression analyses. Results Of 231 religious leaders, 215 (93.1%) were willing to share their knowledge of HIV/AIDS with their congregations. The majority received their information from the media (N = 136, 58.9%), a minority from their church (N = 9, 3.9%), and 38 (16.4%) had received prior training on HIV. Nearly all (N = 212, 91.8%) knew that HIV could be sexually transmitted though only a few (N = 39, 16.9%) were aware of mother-to-child transmission or unsafe injections (N = 56, 24.2%). A total of 91 (39.4%) were willing to, or had recommended (N = 64, 27.7%), condom use, while 50 (21.6%) had undergone HIV testing. Only nine (3.9%) had ever cared for a person living with HIV/AIDS (PLHIV). Multivariable logistic regression shows that condom use recommendations by religious leaders were negatively associated with tertiary level education (OR: 0.3, 95% CI 0.1–0.7), and positively associated with knowing a person at risk (OR: 16.2, 95% CI 3.2–80.2), knowing of an ART center (OR: 2.6, 95% CI 1.4–4.8), and receiving information about HIV at school (OR: 2.6, 95% CI 1.2–5.6). Conclusions Malagasy church leaders could potentially become key players in HIV/AIDS prevention if they improved their knowledge of the illness, their commitment to international recommendations, and extended their interaction with people most at risk. PMID:24824620
Stroop, D M; Glueck, C J; Tracy, T M; Schumacher, H R
1994-12-01
Our specific aim was to compare three plasminogen activator-inhibitor type 1 (PAI-1) antigen ELISA kit assays (the Biopool AB, Ltd, TintElize PAI-1 Strip-Well Format; the American Diagnostica, Inc., Imubind 822/1; and the second generation Imubind 822/1S). Within-run coefficients of variation (n = 6) for the TintElize, Imubind 822/1 and Imubind 822/1S methods were 5.5%, 5.9% and 6.8%, respectively. Between-run coefficients of variation for six aliquots per run were 2.9% for TintElize, 3.8% for Imubind 822/1, and 3.5% for Imubind 822/1S. Comparison of the average of duplicate aliquots from hyperlipidaemic patients demonstrated intraclass correlations of 0.75, 0.79 and 0.95 for TintElize vs Imubind 822/1 (n = 39), TintElize vs Imubind 822/1S (n = 39), and Imubind 822/1 vs 822/1S (n = 84), respectively. Lower 95% confidence interval limits of the intraclass correlation were 0.55, 0.48 and 0.93, respectively. Mean PAI-1 antigen values (n = 39) were 12.1, 15.8, 15.8 and 16.0 ng/ml, respectively, for TintElize, TintElize without using the quenching well, Imubind 822/1, and Imubind 822/1S. All three methods were easily performed and exhibited high correlation and reproducibility. A significant systematic bias (P < 0.006) existed between TintElize and TintElize without using the quenching well, Imubind 822/1, and Imubind 822/1S. However, there was no significant bias when TintElize without using the quenching well is compared with Imubind 822/1 (P > 0.8) and to 822/1S (P > 0.8) nor is there significant systematic bias between Imubind 822/1 and 822/1S (P > 0.3). By convention, interchangeability between assay methods suggests that the lower limit of the 95% intraclass correlation confidence interval be greater than 0.75.(ABSTRACT TRUNCATED AT 250 WORDS)
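The intraclass correlations above measure agreement between duplicate aliquots. A one-way random-effects ICC(1,1) sketch with invented duplicate PAI-1 values is shown below; the abstract does not state which ICC variant the authors used, so treat this only as the generic calculation.

    import numpy as np

    # One-way random-effects intraclass correlation, ICC(1,1), for duplicate
    # measurements of the same samples:
    # ICC = (MSB - MSW) / (MSB + (k-1)*MSW).
    # The duplicate PAI-1 values below are invented, not study data.
    pairs = np.array([
        [10.2, 11.0],
        [15.5, 14.8],
        [22.1, 23.0],
        [ 8.9,  9.5],
        [18.4, 17.6],
        [30.2, 29.1],
    ])

    n, k = pairs.shape
    grand = pairs.mean()
    subject_means = pairs.mean(axis=1)

    ms_between = k * np.sum((subject_means - grand) ** 2) / (n - 1)
    ms_within = np.sum((pairs - subject_means[:, None]) ** 2) / (n * (k - 1))

    icc = (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)
    print(f"ICC(1,1) = {icc:.2f}")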
Costello, S P; Soo, W; Bryant, R V; Jairath, V; Hart, A L; Andrews, J M
2017-08-01
Faecal microbiota transplantation (FMT) is emerging as a novel therapy for ulcerative colitis (UC). Interpretation of efficacy of FMT for UC is complicated by differences among studies in blinding, FMT administration procedures, intensity of therapy and donor stool processing methods. To determine whether FMT is effective and safe for the induction of remission in active UC. Medline (Ovid), Embase and the Cochrane Library were searched from inception through February 2017. Original studies reporting remission rates following FMT for active UC were included. All study designs were included in the systematic review and a meta-analysis performed including only randomised controlled trials (RCTs). There were 14 cohort studies and four RCTs that used markedly different protocols. In the meta-analysis of RCTs, clinical remission was achieved in 39 of 140 (28%) patients in the donor FMT groups compared with 13 of 137 (9%) patients in the placebo groups; odds ratio 3.67 (95% CI: 1.82-7.39, P<.01). Clinical response was achieved in 69 of 140 (49%) donor FMT patients compared to 38 of 137 (28%) placebo patients; odds ratio 2.48 (95% CI: 1.18-5.21, P=.02). In cohort studies, 39 of 168 (24%; 95% CI: 11%-40%) achieved clinical remission. Despite variation in processes, FMT appears to be effective for induction of remission in UC, with no major short-term safety signals. Further studies are needed to better define dose frequency and preparation methods, and to explore its feasibility, efficacy and safety as a maintenance agent. © 2017 John Wiley & Sons Ltd.
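The pooled remission counts quoted above nearly reproduce the reported odds ratio on their own. The crude 2x2 calculation with a Woolf (log) interval is sketched below; the published estimate of 3.67 (1.82-7.39) comes from a formal meta-analysis of the four RCTs, so this is only an approximation of it.

    import math

    # Crude pooled odds ratio for clinical remission: 39/140 (donor FMT) vs
    # 13/137 (placebo), with a Woolf log-based 95% CI.
    a, b = 39, 140 - 39      # FMT: remission / no remission
    c, d = 13, 137 - 13      # placebo: remission / no remission

    odds_ratio = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(odds_ratio) - 1.96 * se_log)
    hi = math.exp(math.log(odds_ratio) + 1.96 * se_log)
    print(f"OR = {odds_ratio:.2f} (95% CI {lo:.2f} to {hi:.2f})")  # ~3.68 (1.86 to 7.27)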
Overinterpretation and misreporting of diagnostic accuracy studies: evidence of "spin".
Ochodo, Eleanor A; de Haan, Margriet C; Reitsma, Johannes B; Hooft, Lotty; Bossuyt, Patrick M; Leeflang, Mariska M G
2013-05-01
To estimate the frequency of distorted presentation and overinterpretation of results in diagnostic accuracy studies. MEDLINE was searched for diagnostic accuracy studies published between January and June 2010 in journals with an impact factor of 4 or higher. Articles included were primary studies of the accuracy of one or more tests in which the results were compared with a clinical reference standard. Two authors scored each article independently by using a pretested data-extraction form to identify actual overinterpretation and practices that facilitate overinterpretation, such as incomplete reporting of study methods or the use of inappropriate methods (potential overinterpretation). The frequency of overinterpretation was estimated in all studies and in a subgroup of imaging studies. Of the 126 articles, 39 (31%; 95% confidence interval [CI]: 23%, 39%) contained a form of actual overinterpretation, including 29 (23%; 95% CI: 16%, 30%) with an overly optimistic abstract, 10 (8%; 95% CI: 3%, 13%) with a discrepancy between the study aim and conclusion, and eight with conclusions based on selected subgroups. In our analysis of potential overinterpretation, authors of 89% (95% CI: 83%, 94%) of the studies did not include a sample size calculation, 88% (95% CI: 82%, 94%) did not state a test hypothesis, and 57% (95% CI: 48%, 66%) did not report CIs of accuracy measurements. In 43% (95% CI: 34%, 52%) of studies, authors were unclear about the intended role of the test, and in 3% (95% CI: 0%, 6%) they used inappropriate statistical tests. A subgroup analysis of imaging studies showed that 16 (30%; 95% CI: 17%, 43%) and 53 (100%; 95% CI: 92%, 100%) contained forms of actual and potential overinterpretation, respectively. Overinterpretation and misreporting of results in diagnostic accuracy studies are frequent in journals with high impact factors. http://radiology.rsna.org/lookup/suppl/doi:10.1148/radiol.12120527/-/DC1. © RSNA, 2013.
Maternal mobile device use during a structured parent-child interaction task
Radesky, Jenny; Miller, Alison L.; Rosenblum, Katherine L.; Appugliese, Danielle; Kaciroti, Niko; Lumeng, Julie C.
2014-01-01
Objective Examine associations of maternal mobile device use with the frequency of mother-child interactions during a structured laboratory task. Methods Participants included 225 low-income mother-child pairs. When children were ~6 years old, dyads were videotaped during a standardized protocol in order to characterize how mothers and children interacted when asked to try familiar and unfamiliar foods. From videotapes, we dichotomized mothers based on whether or not they spontaneously used a mobile device, and counted maternal verbal and nonverbal prompts toward the child. We used multivariate Poisson regression to study associations of device use with eating prompt frequency for different foods. Results Mothers were an average of 31.3 (SD 7.1) years old and 28.0% were of Hispanic/non-white race/ethnicity. During the protocol, 23.1% of mothers spontaneously used a mobile device. Device use was not associated with any maternal characteristics, including age, race/ethnicity, education, depressive symptoms, or parenting style. Mothers with device use initiated fewer verbal (RR 0.80 [95% CI: 0.63, 1.03]) and nonverbal (0.61 [0.39, 0.96]) interactions with their children than mothers who did not use a device, when averaged across all foods. This association was strongest during introduction of halva, the most unfamiliar food (0.67 [0.48, 0.93] for verbal and 0.42 [0.20, 0.89] for nonverbal interactions). Conclusions Mobile device use was common and associated with fewer interactions with children during a structured interaction task, particularly nonverbal interactions and during introduction of an unfamiliar food. More research is needed to understand how device use affects parent-child engagement in naturalistic contexts. PMID:25454369
Gartlehner, Gerald; Patel, Sheila V; Feltner, Cynthia; Weber, Rachel Palmieri; Long, Rachel; Mullican, Kelly; Boland, Erin; Lux, Linda; Viswanathan, Meera
2017-12-12
Postmenopausal status coincides with increased risks for chronic conditions such as heart disease, osteoporosis, cognitive impairment, or some types of cancers. Previously, hormone therapy was used for the primary prevention of these chronic conditions. To update evidence for the US Preventive Services Task Force on the benefits and harms of hormone therapy in reducing risks for chronic conditions. MEDLINE, Cochrane Library, EMBASE, and trial registries from June 1, 2011, through August 1, 2016. Surveillance for new evidence in targeted publications was conducted through July 1, 2017. English-language randomized clinical trials reporting health outcomes. Dual review of abstracts, full-text articles, and study quality; meta-analyses when at least 3 similar studies were available. Beneficial or harmful changes in risks for various chronic conditions. Eighteen trials (n = 40 058; range, 142-16 608; mean age, 53-79 years) were included. Women using estrogen-only therapy compared with placebo had significantly lower risks, per 10 000 person-years, for diabetes (-19 cases [95% CI, -34 to -3]) and fractures (-53 cases [95% CI, -69 to -39]). Risks were statistically significantly increased, per 10 000 person-years, for gallbladder disease (30 more cases [95% CI, 16 to 48]), stroke (11 more cases [95% CI, 2 to 23]), venous thromboembolism (11 more cases [95% CI, 3 to 22]), and urinary incontinence (1261 more cases [95% CI, 880 to 1689]). Women using estrogen plus progestin compared with placebo experienced significantly lower risks, per 10 000 person-years, for colorectal cancer (-6 cases [95% CI, -9 to -1]), diabetes (-14 cases [95% CI, -24 to -3]), and fractures (-44 cases [95% CI, -71 to -13]). Risks, per 10 000 person-years, were significantly increased for invasive breast cancer (9 more cases [95% CI, 1 to 19]), probable dementia (22 more cases [95% CI, 4 to 53]), gallbladder disease (21 more cases [95% CI, 10 to 34]), stroke (9 more cases [95% CI, 2 to 19]), urinary incontinence (876 more cases [95% CI, 606 to 1168]), and venous thromboembolism (21 more cases [95% CI, 12 to 33]). Hormone therapy for the primary prevention of chronic conditions in menopausal women is associated with some beneficial effects but also with a substantial increase of risks for harms. The available evidence regarding benefits and harms of early initiation of hormone therapy is inconclusive.
Retention of Mastoidectomy Skills After Virtual Reality Simulation Training.
Andersen, Steven Arild Wuyts; Konge, Lars; Cayé-Thomasen, Per; Sørensen, Mads Sølvsten
2016-07-01
The ultimate goal of surgical training is consolidated skills with a consistently high performance. However, surgical skills are heterogeneously retained and depend on a variety of factors, including the task, cognitive demands, and organization of practice. Virtual reality (VR) simulation is increasingly being used in surgical skills training, including temporal bone surgery, but there is a gap in knowledge on the retention of mastoidectomy skills after VR simulation training. To determine the retention of mastoidectomy skills after VR simulation training with distributed and massed practice and to investigate participants' cognitive load during retention procedures. A prospective 3-month follow-up study of a VR simulation trial was conducted from February 6 to September 19, 2014, at an academic teaching hospital among 36 medical students: 19 from a cohort trained with distributed practice and 17 from a cohort trained with massed practice. Participants performed 2 virtual mastoidectomies in a VR simulator a mean of 3.2 months (range, 2.4-5.0 months) after completing initial training with 12 repeated procedures. Practice blocks were spaced apart in time (distributed), or all procedures were performed in 1 day (massed). Performance of the virtual mastoidectomy as assessed by 2 masked senior otologists using a modified Welling scale, as well as cognitive load as estimated by reaction time to perform a secondary task. Among 36 participants, mastoidectomy final-product skills were largely retained at 3 months (mean change in score, 0.1 points; P = .89) regardless of practice schedule, but the group trained with massed practice took more time to complete the task. The performance of the massed practice group increased significantly from the first to the second retention procedure (mean change, 1.8 points; P = .001), reflecting that skills were less consolidated. For both groups, increases in reaction times in the secondary task (distributed practice group: mean pretraining relative reaction time, 1.42 [95% CI, 1.37-1.47]; mean end of training relative reaction time, 1.24 [95% CI, 1.16-1.32]; and mean retention relative reaction time, 1.36 [95% CI, 1.30-1.42]; massed practice group: mean pretraining relative reaction time, 1.34 [95% CI, 1.28-1.40]; mean end of training relative reaction time, 1.31 [95% CI, 1.21-1.42]; and mean retention relative reaction time, 1.39 [95% CI, 1.31-1.46]) indicated that cognitive load during the virtual procedures had returned to the pretraining level. Mastoidectomy skills acquired under time-distributed practice conditions were retained better than skills acquired under massed practice conditions. Complex psychomotor skills should be regularly reinforced to consolidate both motor and cognitive aspects. Virtual reality simulation training provides the opportunity for such repeated training and should be integrated into training curricula.
2011-01-01
Background It was still unclear whether the methodological reporting quality of randomized controlled trials (RCTs) in major hepato-gastroenterology journals improved after the Consolidated Standards of Reporting Trials (CONSORT) Statement was revised in 2001. Methods RCTs in five major hepato-gastroenterology journals published in 1998 or 2008 were retrieved from MEDLINE using a high sensitivity search method, and their reporting quality of methodological details was evaluated based on the CONSORT Statement and the Cochrane Handbook for Systematic Reviews of Interventions. Changes in the methodological reporting quality between 2008 and 1998 were calculated as risk ratios with 95% confidence intervals. Results A total of 107 RCTs published in 2008 and 99 RCTs published in 1998 were found. Compared to those in 1998, the proportion of RCTs that reported sequence generation (RR, 5.70; 95%CI 3.11-10.42), allocation concealment (RR, 4.08; 95%CI 2.25-7.39), sample size calculation (RR, 3.83; 95%CI 2.10-6.98), incomplete outcome data addressed (RR, 1.81; 95%CI 1.03-3.17), and intention-to-treat analyses (RR, 3.04; 95%CI 1.72-5.39) increased in 2008. Blinding and intention-to-treat analysis were reported better in multi-center trials than in single-center trials. The reporting of allocation concealment and blinding was better in industry-sponsored trials than in public-funded trials. Compared with historical studies, the methodological reporting quality improved with time. Conclusion Although the reporting of several important methodological aspects improved in 2008 compared with 1998, which may indicate that researchers had increased awareness of and compliance with the revised CONSORT Statement, some items were still reported poorly. There is much room for future improvement. PMID:21801429
Weaver, Marcia R; Pillay, Erushka; Jed, Suzanne L; de Kadt, Julia; Galagan, Sean; Gilvydis, Jennifer; Marumo, Eva; Mawandia, Shreshth; Naidoo, Evasen; Owens, Tamara; Prongay, Vickery; O'Malley, Gabrielle
2016-01-01
Introduction The South African National Department of Health sought to improve syndromic management of sexually transmitted infections (STIs). Continuing medical education on STIs was delivered at primary healthcare (PHC) clinics using one of three training methods: (1) lecture, (2) computer and (3) paper-based. Clinics with training were compared with control clinics. Methods Ten PHC clinics were randomly assigned to control and 10 to each training method arm. Clinicians participated in on-site training on six modules; two per week for three weeks. Each clinic was visited by three or four unannounced standardised patient (SP) actors pre-training and post-training. Male SPs reported symptoms of male urethritis syndrome and female SPs reported symptoms of vaginal discharge syndrome. Quality of healthcare was measured by whether or not clinicians completed five tasks: HIV test, genital exam, correct medications, condoms and partner notification. Results An average of 31% of clinicians from each PHC attended each module. Quality of STI care was low. Pre-training (n=128) clinicians completed an average of 1.63 tasks. Post-training (n=114) they completed 1.73. There was no change in the number of STI tasks completed in the control arm and an 11% increase overall in the training arms relative to the control (ratio of relative risk (RRR)=1.11, 95% CI 0.67 to 1.84). Across training arms, there was a 26% increase (RRR=1.26, 95% CI 0.77 to 2.06) associated with lecture, 17% increase (RRR=1.17, 95% CI 0.59 to 2.28) with paper-based and 13% decrease (RRR=0.87, 95% CI 0.40 to 1.90) with computer arm relative to the control. Conclusions Future interventions should address increasing training attendance and computer-based training effectiveness. Trial registration number AEARCTR-0000668. PMID:26430128
Sorenson, Wendy R; Sullivan, Darryl
2006-01-01
In conjunction with an AOAC Presidential Task Force on Dietary Supplements, a method was validated for measurement of 3 plant sterols (phytosterols) in saw palmetto raw materials, extracts, and dietary supplements. AOAC Official Method 994.10, "Cholesterol in Foods," was modified for purposes of this validation. Test samples were saponified at high temperature with ethanolic potassium hydroxide solution. The unsaponifiable fraction containing phytosterols (campesterol, stigmasterol, and beta-sitosterol) was extracted with toluene. Phytosterols were derivatized to trimethylsilyl ethers and then quantified by gas chromatography with a hydrogen flame ionization detector. The presence of the phytosterols was detected at concentrations greater than or equal to 1.00 mg/100 g based on 2-3 g of sample. The standard curve range for this assay was 0.00250 to 0.200 mg/mL. The calibration curves for all phytosterols had correlation coefficients greater than or equal to 0.995. Precision studies produced relative standard deviation values of 1.52 to 7.27% for campesterol, 1.62 to 6.48% for stigmasterol, and 1.39 to 10.5% for beta-sitosterol. Recoveries for samples fortified at 100% of the inherent values averaged 98.5 to 105% for campesterol, 95.0 to 108% for stigmasterol, and 85.0 to 103% for beta-sitosterol.
Understanding Older People’s Readiness for Receiving Telehealth: Mixed-Method Study
2018-01-01
Background The Dutch Ministry of Health has formulated ambitious goals concerning the use of telehealth, leading to subsequent changes compared with the current health care situation, in which 93% of care is delivered face-to-face. Since most care is delivered to older people, the prospect of telehealth raises the question of whether this population is ready for this new way of receiving care. To study this, we created a theoretical framework consisting of 6 factors associated with older people’s intention to use technology. Objective The objective of this study was to understand community-dwelling older people’s readiness for receiving telehealth by studying their intention to use videoconferencing and capacities for using digital technology in daily life as indicators. Methods A mixed-method triangulation design was used. First, a cross-sectional survey study was performed to investigate older people’s intention to use videoconferencing, by testing our theoretical framework with a multilevel path analysis (phase 1). Second, for deeper understanding of older people’s actual use of digital technology, qualitative observations of older people executing technological tasks (eg, on a computer, cell phone) were conducted at their homes (phase 2). Results In phase 1, a total of 256 people aged 65 years or older participated in the survey study (50.0% male; median age, 70 years; Q1-Q3: 67-76). Using a significance level of .05, we found seven significant associations regarding older people’s perception of videoconferencing. Older people’s (1) intention to use videoconferencing was predicted by their performance expectancy (odds ratio [OR] 1.26, 95% CI 1.13-1.39), effort expectancy (OR 1.23, 95% CI 1.07-1.39), and perceived privacy and security (OR 1.30, 95% CI 1.17-1.43); (2) their performance expectancy was predicted by their effort expectancy (OR 1.38, 95% CI 1.24-1.52); and (3) their effort expectancy was predicted by their self-efficacy (OR 1.55, 95% CI 1.42-1.68). In phase 2, a total of 6 men and 9 women aged between 65 and 87 years participated in the qualitative observation study. Of the primary themes, 5 themes were identified that could provide greater understanding of older people’s capacities and incapacities in using digital technology: (1) “self-efficacy and digital literacy,” (2) “obstacles to using technology,” (3) “prior experience and frequency of use,” (4) “sources of support and facilitating conditions,” and (5) “performance expectancy.” These 5 themes recurred in all 15 observations. Conclusions Performance expectancy, effort expectancy, and perceived privacy and security are direct predictors of older people’s intention to use videoconferencing. Self-efficacy appeared to play a role in both older people’s intention to use, as well as their actual use of technology. The path analysis revealed that self-efficacy was significantly associated with older people’s effort expectancy. Furthermore, self-efficacy and digital literacy appeared to play a major role in older people’s capacities to make use of digital technology. PMID:29625950
Dust exposure in workers from grain storage facilities in Costa Rica.
Rodríguez-Zamora, María G; Medina-Escobar, Lourdes; Mora, Glend; Zock, Jan-Paul; van Wendel de Joode, Berna; Mora, Ana M
2017-08-01
About 12 million workers are involved in the production of basic grains in Central America. However, few studies in the region have examined the occupational factors associated with inhalable dust exposure. (i) To assess the exposure to inhalable dust in workers from rice, maize, and wheat storage facilities in Costa Rica; (ii) to examine the occupational factors associated with this exposure; and (iii) to measure concentrations of respirable and thoracic particles in different areas of the storage facilities. We measured inhalable (<100 μm) dust concentrations in 176 personal samples collected from 136 workers of eight grain storage facilities in Costa Rica. We also measured respirable (<4 μm) and thoracic (<10 μm) dust particles in several areas of the storage facilities. Geometric mean (GM) and geometric standard deviation (GSD) inhalable dust concentrations were 2.0 mg/m³ and 7.8 (range = <0.2-275.4 mg/m³). Personal inhalable dust concentrations were associated with job category [GM for category/GM for administrative staff and other workers (95% CI) = 4.4 (2.6, 7.2) for packing; 20.4 (12.3, 34.7) for dehulling; 109.6 (50.1, 234.4) for unloading in flat bed sheds; 24.0 (14.5, 39.8) for unloading in pits; and 31.6 (18.6, 52.5) for drying], and cleaning task [15.8 (95% CI: 10.0, 26.3) in workers who cleaned in addition to their regular tasks]. Higher area concentrations of thoracic dust particles were found in wheat (GM and GSD = 4.3 mg/m³ and 4.5) and maize (3.0 mg/m³ and 3.9) storage facilities, and in grain drying (2.3 mg/m³ and 3.1) and unloading (1.5 mg/m³ and 4.8) areas. Operators of grain storage facilities showed elevated inhalable dust concentrations, mostly above international exposure limits. Better engineering and administrative controls are needed. Copyright © 2017 Elsevier GmbH. All rights reserved.
Kahwati, Leila C; Weber, Rachel Palmieri; Pan, Huiling; Gourlay, Margaret; LeBlanc, Erin; Coker-Schwimmer, Manny; Viswanathan, Meera
2018-04-17
Osteoporotic fractures result in significant morbidity and mortality. To update the evidence for benefits and harms of vitamin D, calcium, or combined supplementation for the primary prevention of fractures in community-dwelling adults to inform the US Preventive Services Task Force. PubMed, EMBASE, Cochrane Library, and trial registries through March 21, 2017; references; and experts. Surveillance continued through February 28, 2018. English-language randomized clinical trials (RCTs) or observational studies of supplementation with vitamin D, calcium, or both among adult populations; studies of populations that were institutionalized or had known vitamin D deficiency, osteoporosis, or prior fracture were excluded. Dual, independent review of titles/abstracts and full-text articles and study quality rating using predefined criteria. Random-effects meta-analysis used when at least 3 similar studies were available. Incident fracture, mortality, kidney stones, cardiovascular events, and cancer. Eleven RCTs (N = 51 419) in adults 50 years and older conducted over 2 to 7 years were included. Compared with placebo, supplementation with vitamin D decreased total fracture incidence (1 RCT [n = 2686]; absolute risk difference [ARD], -2.26% [95% CI, -4.53% to 0.00%]) but had no significant association with hip fracture (3 RCTs [n = 5496]; pooled ARD, -0.01% [95% CI, -0.80% to 0.78%]). Supplementation using vitamin D with calcium had no effect on total fracture incidence (1 RCT [n = 36 282]; ARD, -0.35% [95% CI, -1.02% to 0.31%]) or hip fracture incidence (2 RCTs [n = 36 727]; ARD from the larger trial, -0.14% [95% CI, -0.34% to 0.07%]). The evidence for calcium alone was limited, with only 2 studies (n = 339 total) and very imprecise results. Supplementation with vitamin D alone or with calcium had no significant effect on all-cause mortality or incident cardiovascular disease; ARDs ranged from -1.93% to 1.79%, with CIs consistent with no significant differences. Supplementation using vitamin D with calcium was associated with an increased incidence of kidney stones (3 RCTs [n = 39 213]; pooled ARD, 0.33% [95% CI, 0.06% to 0.60%]), but supplementation with calcium alone was not associated with an increased risk (3 RCTs [n = 1259]; pooled ARD, 0.00% [95% CI, -0.87% to 0.87%]). Supplementation with vitamin D and calcium was not associated with an increase in cancer incidence (3 RCTs [n = 39 213]; pooled ARD, -1.48% [95% CI, -3.32% to 0.35%]). Vitamin D supplementation alone or with calcium was not associated with reduced fracture incidence among community-dwelling adults without known vitamin D deficiency, osteoporosis, or prior fracture. Vitamin D with calcium was associated with an increase in the incidence of kidney stones.
Tillmann, Julian; Swettenham, John
2017-02-01
Previous studies examining selective attention in individuals with autism spectrum disorder (ASD) have yielded conflicting results, some suggesting superior focused attention (e.g., on visual search tasks), others demonstrating greater distractibility. This pattern could be accounted for by the proposal (derived by applying the Load theory of attention, e.g., Lavie, 2005) that ASD is characterized by an increased perceptual capacity (Remington, Swettenham, Campbell, & Coleman, 2009). Recent studies in the visual domain support this proposal. Here we hypothesize that ASD involves an enhanced perceptual capacity that also operates across sensory modalities, and test this prediction, for the first time using a signal detection paradigm. Seventeen neurotypical (NT) and 15 ASD adolescents performed a visual search task under varying levels of visual perceptual load while simultaneously detecting presence/absence of an auditory tone embedded in noise. Detection sensitivity (d') for the auditory stimulus was similarly high for both groups in the low visual perceptual load condition (e.g., 2 items: p = .391, d = 0.31, 95% confidence interval [CI] [-0.39, 1.00]). However, at a higher level of visual load, auditory d' reduced for the NT group but not the ASD group, leading to a group difference (p = .002, d = 1.2, 95% CI [0.44, 1.96]). As predicted, when visual perceptual load was highest, both groups then showed a similarly low auditory d' (p = .9, d = 0.05, 95% CI [-0.65, 0.74]). These findings demonstrate that increased perceptual capacity in ASD operates across modalities. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
Growing Up With a Chronic Illness: Social Success, Educational/Vocational Distress
Maslow, Gary R.; Haydon, Abigail; McRee, Annie-Laurie; Ford, Carol Ann; Halpern, Carolyn Tucker
2012-01-01
OBJECTIVES We compared adult educational, vocational, and social outcomes among young adults with and without childhood-onset chronic illness in a nationally representative US sample. METHODS We used data from Wave IV (2008) of the National Longitudinal Study of Adolescent Health. We compared respondents who reported childhood onset cancer, heart disease, diabetes, or epilepsy to young adults without these chronic illnesses in terms of marriage, having children, living with parents, romantic relationship quality, educational attainment, income and employment. Multivariate models controlled for socio-demographic factors and adult-onset chronic illness. RESULTS Compared to those without childhood chronic illness, respondents with childhood chronic illness had similar odds of marriage (OR=0.89, 95%CI: 0.65–1.24), having children (OR=0.99, 95%CI: 0.70–1.42), and living with parents (OR=1.49, 95%CI 0.94–2.33), and similar reports of romantic relationship quality. However, the chronic illness group had lower odds of graduating college (OR=0.49, 95%CI: 0.31–0.78) and being employed (OR=0.56, 95%CI: 0.39–0.80), and higher odds of receiving public assistance (OR=2.13, 95%CI: 1.39–3.25), and lower mean income. CONCLUSIONS Young adults growing up with chronic illness succeed socially, but are at increased risk of poorer educational and vocational outcomes. PMID:21783055
Mortality and Its Risk Factors in Patients with Rapid Eye Movement Sleep Behavior Disorder
Zhou, Junying; Zhang, Jihui; Lam, Siu Ping; Mok, Vincent; Chan, Anne; Li, Shirley Xin; Liu, Yaping; Tang, Xiangdong; Yung, Wing Ho; Wing, Yun Kwok
2016-01-01
Study Objectives: To determine the mortality and its risk factors in patients with rapid eye movement (REM) sleep behavior disorder (RBD). Methods: A total of 205 consecutive patients with video-polysomnography confirmed RBD (mean age = 66.4 ± 10.0 y, 78.5% males) were recruited. Medical records and death status were systematically reviewed in the computerized records of the health care system. Standardized mortality ratio (SMR) was used to calculate the risk ratio of mortality in RBD with reference to the general population. Results: Forty-three patients (21.0%) died over a mean follow-up period of 7.1 ± 4.5 y. The SMR was not increased in the overall sample, SMR (95% confidence interval [CI]) = 1.00 (0.73–1.33). However, SMR (95% CI) increased to 1.80 (1.21–2.58) and 1.75 (1.11–2.63) for RBD patients in whom neurodegenerative diseases and dementia, respectively, eventually developed. In the Cox regression model, mortality risk was significantly associated with age (hazard ratio [HR] = 1.05; 95% CI, 1.01–1.10), living alone (HR = 2.04; 95% CI, 1.39–2.99), chronic obstructive pulmonary disease (HR = 3.38; 95% CI, 1.21–9.46), cancer (HR = 10.09; 95% CI, 2.65–38.42), periodic limb movements during sleep (HR = 3.06; 95% CI, 1.50–6.24), and development of neurodegenerative diseases (HR = 2.84; 95% CI, 1.47–5.45) and dementia (HR = 2.66; 95% CI, 1.39–5.08). Conclusions: Patients with RBD have a higher mortality rate than the general population only if neurodegenerative diseases develop. Several risk factors on clinical and sleep aspects are associated with mortality in RBD patients. Our findings underscore the necessity of timely neuroprotective interventions in the early phase of RBD before the development of neurodegenerative diseases. Citation: Zhou J, Zhang J, Lam SP, Mok V, Chan A, Li SX, Liu Y, Tang X, Yung WH, Wing YK. Mortality and its risk factors in patients with rapid eye movement sleep behavior disorder. SLEEP 2016;39(8):1543–1550. PMID:27306273
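The headline comparison here is the standardized mortality ratio: observed deaths in the RBD cohort divided by the deaths expected if reference-population rates applied to the cohort's person-years. A minimal sketch, assuming an exact Poisson interval for the observed count and an invented expected count (the abstract reports only the observed 43 deaths, not the expected number):

```python
from scipy.stats import chi2

def smr_with_ci(observed, expected, alpha=0.05):
    """Standardized mortality ratio with an exact Poisson 95% CI.
    `expected` = deaths predicted from reference-population rates."""
    smr = observed / expected
    lower = chi2.ppf(alpha / 2, 2 * observed) / (2 * expected)
    upper = chi2.ppf(1 - alpha / 2, 2 * (observed + 1)) / (2 * expected)
    return smr, lower, upper

# Illustrative only: 43 observed deaths against an assumed ~43 expected deaths
# gives an SMR near 1.0, consistent with the overall estimate reported above.
print(smr_with_ci(observed=43, expected=43.0))
```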
Stock, Sarah J.; Ferguson, Evelyn; Duffy, Andrew; Ford, Ian; Chalmers, James; Norman, Jane E.
2013-01-01
Background There is evidence that induction of labour (IOL) around term reduces perinatal mortality and caesarean delivery rates when compared to expectant management of pregnancy (allowing the pregnancy to continue to await spontaneous labour or definitive indication for delivery). However, it is not clear whether IOL in women with a previous caesarean section confers the same benefits. The aim of this study was to describe outcomes of IOL at 39–41 weeks in women with one previous caesarean delivery and to compare outcomes of IOL or planned caesarean delivery to those of expectant management. Methods and Findings We performed a population-based retrospective cohort study of singleton births greater than 39 weeks gestation, in women with one previous caesarean delivery, in Scotland, UK 1981–2007 (n = 46,176). Outcomes included mode of delivery, perinatal mortality, neonatal unit admission, postpartum hemorrhage and uterine rupture. 40.1% (2,969/7,401) of women who underwent IOL 39–41 weeks were ultimately delivered by caesarean. When compared to expectant management IOL was associated with lower odds of caesarean delivery (adjusted odds ratio [AOR] after IOL at 39 weeks of 0.81 [95% CI 0.71–0.91]). There was no significant effect on the odds of perinatal mortality but greater odds of neonatal unit admission (AOR after IOL at 39 weeks of 1.29 [95% CI 1.08–1.55]). In contrast, when compared with expectant management, elective repeat caesarean delivery was associated with lower perinatal mortality (AOR after planned caesarean at 39 weeks of 0.23 [95% CI 0.07–0.75]) and, depending on gestation, the same or lower neonatal unit admission (AOR after planned caesarean at 39 weeks of 0.98 [0.90–1.07] at 40 weeks of 1.08 [0.94–1.23] and at 41 weeks of 0.77 [0.60–1.00]). Conclusions A more liberal policy of IOL in women with previous caesarean delivery may reduce repeat caesarean delivery, but increases the risks of neonatal complications. PMID:23565242
Cell Phones to Collect Pregnancy Data From Remote Areas in Liberia
Lori, Jody R.; Munro, Michelle L.; Boyd, Carol J.; Andreatta, Pamela
2012-01-01
Purpose To report findings on knowledge and skill acquisition following a 3-day training session in the use of short message service (SMS) texting with non- and low-literacy traditional midwives. Design A pre- and post-test study design was used to assess knowledge and skills acquisition with 99 traditional midwives on the use of SMS texting for real-time, remote data collection in rural Liberia, West Africa. Methods Paired sample t-tests were conducted to establish if overall mean scores varied significantly from pre-test to immediate post-test. Analysis of variance was used to compare means across groups. The nonparametric McNemar’s test was used to determine significant differences between the pre-test and post-test values of each individual step involved in SMS texting. Pearson’s chi-square test of independence was used to examine the association between ownership of cell phones within a family and achievement of the seven tasks. Findings The mean increase in cell phone knowledge scores was 3.67, with a 95% confidence interval ranging from 3.39 to 3.95. Participants with a cell phone in the family did significantly better on three of the seven tasks in the pre-test: “turns cell on without help” (χ2(1) = 9.15, p = .003); “identifies cell phone coverage” (χ2(1) = 5.37, p = .024); and “identifies cell phone is charged” (χ2(1) = 4.40, p = .042). Conclusions A 3-day cell phone training session with low- and nonliterate traditional midwives in rural Liberia improved their ability to use mobile technology for SMS texting. Clinical Relevance Mobile technology can improve data collection accessibility and be used for numerous healthcare and public health issues. Cell phone accessibility holds great promise for collecting health data in low-resource areas of the world. PMID:22672157
2014-01-01
Background Scattered radiation can be assessed by in vivo dosimetry. Thyroid tissue is sensitive to radiation, even at doses <10 cGy. This study compared the scattered dose to the thyroid measured by thermoluminescent dosimeters (TLDs) with the dose estimated by the treatment planning system (TPS). Methods During radiotherapy to sites other than the thyroid in 16 children and adolescents, seventy-two TLD measurements at the thyroid were compared with TPS estimates. Results The overall TPS/TLD bias was 1.02 (95% LA 0.05 to 21.09). When bias was stratified by treatment field, the TPS overestimated TLD values at doses <1 cGy and underestimated them at doses >10 cGy. The greatest bias was found in pelvis and abdomen: 15.01 (95% LA 9.16 to 24.61) and 5.12 (95% LA 3.04 to 8.63), respectively. There was good agreement in orbit, head, and spine: bias 1.52 (95% LA 0.48 to 4.79), 0.44 (95% LA 0.11 to 1.82) and 0.83 (95% LA 0.39 to 1.76), respectively. Agreement was weak, with broad limits, for lung and mediastinum: 1.13 (95% LA 0.03 to 40.90) and 0.39 (95% LA 0.02 to 7.14), respectively. Conclusions The scattered dose can be measured with TLDs, and TPS algorithms for structures outside the treatment field should be improved. PMID:24479890
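The bias and 95% LA values above read like Bland-Altman limits of agreement computed on the ratio (log) scale, a common choice when paired doses span several orders of magnitude. A hedged sketch under that assumption, with made-up paired TPS and TLD doses:

```python
import numpy as np

def ratio_limits_of_agreement(method_a, method_b):
    """Bland-Altman-style agreement for ratio data: geometric-mean bias of A/B
    and 95% limits of agreement computed on the log scale."""
    log_ratio = np.log(np.asarray(method_a, dtype=float) / np.asarray(method_b, dtype=float))
    bias = np.exp(log_ratio.mean())
    lo, hi = np.exp(log_ratio.mean() + np.array([-1.96, 1.96]) * log_ratio.std(ddof=1))
    return bias, lo, hi

# Hypothetical paired scattered doses (cGy): TPS estimates vs TLD measurements.
tps = [0.8, 1.5, 3.2, 12.0, 25.0, 0.4]
tld = [1.0, 1.2, 3.0, 9.5, 18.0, 0.9]
print(ratio_limits_of_agreement(tps, tld))
```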
Maier, Claudia B; Aiken, Linda H
2016-12-01
Primary care is in short supply in many countries. Task shifting from physicians to nurses is one strategy to improve access, but international research is scarce. We analysed the extent of task shifting in primary care and policy reforms in 39 countries. Cross-country comparative research, based on an international expert survey, plus a literature scoping review. A total of 93 country experts participated, covering Europe, the USA, Canada, Australia and New Zealand (response rate: 85.3%). Experts were selected according to pre-defined criteria. Survey responses were triangulated with the literature and analysed using policy, thematic and descriptive methods to assess developments in country-specific contexts. Task shifting, where nurses take up advanced roles from physicians, was implemented in two-thirds of countries (N = 27, 69%), yet its extent varied. Three clusters emerged: 11 countries with extensive task shifting (Australia, Canada, England, Northern Ireland, Scotland, Wales, Finland, Ireland, Netherlands, New Zealand and USA), 16 countries with limited task shifting, and 12 countries with none. The high number of policy, regulatory and educational reforms, such as on nurse prescribing, demonstrates an evolving trend internationally toward expanding nurses' scope of practice in primary care. Many countries have implemented task-shifting reforms to maximise workforce capacity. Reforms have focused on removing regulatory and, to a lesser extent, financial barriers, yet were often lengthy and controversial. Countries early on in the process are primarily reforming their education. From an international and particularly European Union perspective, developing standardised definitions and minimum educational and practice requirements would facilitate recognition procedures in increasingly connected labour markets. © The Author 2016. Published by Oxford University Press on behalf of the European Public Health Association. All rights reserved.
A Meta-Analysis of Social Capital and Health: A Case for Needed Research
Gilbert, Keon L.; Quinn, Sandra C.; Goodman, Robert M.; Butler, James; Wallace, John
2014-01-01
Background Social capital refers to various levels of social relationships formed through social networks. Differences in how it is measured have led to imprecise estimates. Methods We conducted a meta-analysis of eligible studies assessing the bivariate association between social capital and self-reported health and all-cause mortality. Results Thirty-nine studies met inclusion criteria, showing that social capital increased the odds of good health by 27% (95% confidence interval [CI] = 21%, 34%). Among individual social capital variables, reciprocity increased the odds of good health by 39% (95% CI = 21%, 60%) and trust by 32% (95% CI = 19%, 46%). Future research should operationalize measures more precisely and assess differences by race/ethnicity, gender, and socioeconomic status. PMID:23548810
Cognitive deficits associated with impaired awareness of hypoglycaemia in type 1 diabetes.
Hansen, Tor I; Olsen, Sandra E; Haferstrom, Elise C D; Sand, Trond; Frier, Brian M; Håberg, Asta K; Bjørgaas, Marit R
2017-06-01
The aim of this study was to compare cognitive function in adults with type 1 diabetes who have impaired awareness of hypoglycaemia with those who have normal awareness of hypoglycaemia. A putative association was sought between cognitive test scores and a history of severe hypoglycaemia. A total of 68 adults with type 1 diabetes were included: 33 had impaired and 35 had normal awareness of hypoglycaemia, as confirmed by formal testing. The groups were matched for age, sex and diabetes duration. Cognitive tests of verbal memory, object-location memory, pattern separation, executive function, working memory and processing speed were administered. Participants with impaired awareness of hypoglycaemia scored significantly lower on the verbal and object-location memory tests and on the pattern separation test (Cohen's d -0.86 to -0.55 [95% CI -1.39, -0.05]). Participants with impaired awareness of hypoglycaemia had reduced planning ability task scores, although the difference was not statistically significant (Cohen's d 0.57 [95% CI 0, 1.14]). Frequency of exposure to severe hypoglycaemia correlated with the number of cognitive tests that had not been performed according to instructions. Impaired awareness of hypoglycaemia was associated with diminished learning, memory and pattern separation. These cognitive tasks all depend on the hippocampus, which is vulnerable to neuroglycopenia. The findings suggest that hypoglycaemia contributes to the observed correlation between impaired awareness of hypoglycaemia and impaired cognition.
Neuropsychological Markers of Suicidal Risk in the Context of Medical Rehabilitation.
Pustilnik, Alexandra; Elkana, Odelia; Vatine, Jean-Jacques; Franko, Motty; Hamdan, Sami
2017-01-01
While great strides have been made to advance the understanding of the neurobiology of suicidal behavior (SB), the neural and neuropsychological mechanisms associated with SB are not well understood. The purpose of the current study was to identify neurocognitive markers of SB in the context of medical rehabilitation. The performance of 39 patients at a medical rehabilitation center, aged 21-78, was examined on a series of neurocognitive executive tasks: decision-making (Iowa Gambling Task, IGT), mental flexibility (WCST), response inhibition (SST) and working memory (digit span). Self-report questionnaires were administered for suicidal behaviors, depression, anxiety, and PTSD, as well as perceived social support. Suicidal participants performed more poorly on the IGT. A mediation analysis showed a significant direct effect of decision-making on suicidal risk (p < 0.14) as well as a significant indirect effect of decision-making on suicidal risk that was mediated by depressive symptoms (95% BCa CI [-0.15, -0.018]), with a medium effect size (κ² = 0.20, 95% BCa CI [0.067, 0.381]). Despite the complexity of the relationship between decision-making and suicidal risk, these results suggest that clinicians should routinely assess decision-making abilities in adults at risk for suicide, because impaired decision-making may increase suicidal risk above and beyond that conferred by depression.
F61. THE RELATIONSHIP OF AGE AND SYMPTOMS WITH COGNITIVE PLANNING IN SCHIZOPHRENIA
Kontis, Dimitrios; Giannakopoulou, Alexandra; Theochari, Eirini; Andreopoulou, Angeliki; Vassilouli, Spyridoula; Giannakopoulou, Dimitra; Siettou, Eleni; Tsaltas, Eleftheria
2018-01-01
Abstract Background The relationship of age and symptoms with the performance on non-verbal cognitive planning tasks in schizophrenia could be useful for the development of cognitive remediation programmes. Methods During a cross-sectional study, 97 medicated and stabilized patients with chronic schizophrenia (61 males and 36 females, mean age = 43.74 years, standard deviation [SD] = 11.59), who were consecutively referred to our Unit, were assessed using the Stockings of Cambridge (SOC) task of the Cambridge Neuropsychological Test Automated Battery (CANTAB) and the Positive and Negative Syndrome Scale (PANSS). Linear regression analyses were conducted in order to investigate the correlations of symptoms and age with SOC performance. Results Age and PANSS total scores negatively correlated with optimal SOC solutions (problems solved in minimum moves) (age: B = -0.05, 95% CI = -0.089, -0.012, df = 86, t = -2.599, p = 0.011; symptoms: B = -0.047, 95% CI = -0.071, -0.024, df = 86, t = -3.982, p < 0.001). The effects of total symptoms were driven by positive (B = -0.149, 95% CI = -0.229, -0.068, df = 86, t = -3.672, p < 0.001), negative (B = -0.087, 95% CI = -0.150, -0.023, df = 86, t = -2.717, p = 0.008) and general psychopathology symptoms (B = -0.065, 95% CI = -0.108, -0.023, df = 86, t = -3.045, p = 0.03). PANSS total scores positively correlated with mean excess moves in 2- (B = 0.007, 95% CI = 0.002, 0.012, df = 86, t = 2.656, p = 0.009), 3- (B = 0.014, 95% CI = 0.005, 0.023, df = 86, t = 2.951, p = 0.004) and 5-move (B = 0.026, 95% CI = 0.008, 0.044, df = 86, t = 2.923, p = 0.004) problems, and age only in 4- (B = 0.026, 95% CI = 0.006, 0.046, df = 86, t = 2.571, p = 0.012) and 5-move (B = 0.032, 95% CI = 0.002, 0.061, df = 86, t = 2.152, p = 0.034) problems. We could not find any association between PANSS scores and age with initial or subsequent thinking times during the SOC task. Discussion Cognitive planning deficits in schizophrenia are associated with patients’ symptoms and age. Whereas the effect of symptoms appears to be independent of task difficulty, the age effect emerges when the planning tasks become more complex. The role of drugs remains to be examined in future analyses.
Wong, Jenise C; Dolan, Lawrence M; Yang, Tony T; Hood, Korey K
2015-12-01
Few studies have explored durability of insulin pump use, and none have explored the link between depression and pump discontinuation. To examine the relationship between depressive symptoms [measured by the Children's Depression Inventory (CDI)], method of insulin delivery, and hemoglobin A1c (A1c), mixed models were used with data from 150 adolescents with type 1 diabetes (T1D) and visits every 6 months for 2 years. Compared with those using multiple daily injections (MDI) at baseline, the 63% who used a pump included higher proportions who were non-minorities, had caregivers with a college degree, had private insurance, and had two caregivers in the home (p ≤ 0.01). After adjusting for time, sex, age, T1D duration, frequency of blood glucose monitoring, ethnicity, insurance, and caregiver number and education, baseline pump use was associated with a 0.79% lower mean A1c [95% confidence interval (CI): -1.48, -0.096; p = 0.03]. For those using a pump at baseline, but switching to MDI during the study (n = 9), mean A1c was 1.38% higher (95% CI: 0.68, 2.08; p < 0.001) than that for those who did not switch method of delivery. A 10-point increase in CDI was associated with a 0.39% increase in A1c (95% CI: 0.16, 0.61; p = 0.001), independent of pump use. Regarding the temporal relationship between CDI score and changing method of insulin delivery, a prior higher CDI score was associated with switching from pump to MDI (odds ratio = 1.21; 95% CI: 1.05, 1.39; p = 0.007). Clinicians should be aware of the associations between depressive symptoms, change in insulin delivery method, and the effect on glycemic control. © 2014 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
Ong, Jason; Chen, Marcus; Temple-Smith, Meredith; Walker, Sandra; Hoy, Jennifer; Woolley, Ian; Grulich, Andrew; Fairley, Christopher
2013-12-01
Anal cancer is relatively common amongst HIV positive men who have sex with men (MSM), but little is known about the anal cancer screening practices of HIV physicians, and whether digital ano-rectal examination (DARE) is utilized for this. To determine the practice of anal cancer screening among HIV physicians, and to identify any barriers for implementing DARE as a method for anal cancer screening. 36 physicians from a sexual health centre, 2 tertiary hospital infectious diseases outpatient clinics, and 2 general practices completed a questionnaire on their practice of anal cancer screening amongst HIV positive MSM. Physicians were asked about their confidence in using DARE for anal cancer screening, and whether they perceived barriers to implementing this in their clinic. Most physicians (86%, 95% CI: 71-95) thought that anal cancer screening was important, but only 22% (95% CI: 10-39) were currently screening. Reasons for not screening were the absence of guidelines (87%, 95% CI: 60-98), lack of time (47%, 95% CI: 30-65), and concern about patient acceptability of DARE (32%, 95% CI: 17-51). Whilst 67% (95% CI: 49-81) of physicians felt confident in performing a DARE, only 22% (95% CI: 10-39) were confident in recognizing anal cancer using DARE. Although HIV physicians were aware of the need for anal cancer screening among the HIV + MSM population, few were routinely screening. If DARE were to be incorporated into routine HIV care, guidelines recommending screening and up-skilling of HIV physicians to recognize anal cancer are needed.
Factors Associated With Revision Surgery After Internal Fixation of Hip Fractures.
Sprague, Sheila; Schemitsch, Emil H; Swiontkowski, Marc; Della Rocca, Gregory J; Jeray, Kyle J; Liew, Susan; Slobogean, Gerard P; Bzovsky, Sofia; Heels-Ansdell, Diane; Zhou, Qi; Bhandari, Mohit
2018-05-01
Femoral neck fractures are associated with high rates of revision surgery after management with internal fixation. Using data from the Fixation using Alternative Implants for the Treatment of Hip fractures (FAITH) trial evaluating methods of internal fixation in patients with femoral neck fractures, we investigated associations between baseline and surgical factors and the need for revision surgery to promote healing, relieve pain, treat infection or improve function over 24 months postsurgery. Additionally, we investigated factors associated with (1) hardware removal and (2) implant exchange from cancellous screws (CS) or sliding hip screw (SHS) to total hip arthroplasty, hemiarthroplasty, or another internal fixation device. We identified 15 potential factors a priori that may be associated with revision surgery, 7 with hardware removal, and 14 with implant exchange. We used multivariable Cox proportional hazards analyses in our investigation. Factors associated with increased risk of revision surgery included: female sex [hazard ratio (HR) 1.79, 95% confidence interval (CI) 1.25-2.50; P = 0.001], higher body mass index (for every 5-point increase) (HR 1.19, 95% CI 1.02-1.39; P = 0.027), displaced fracture (HR 2.16, 95% CI 1.44-3.23; P < 0.001), unacceptable quality of implant placement (HR 2.70, 95% CI 1.59-4.55; P < 0.001), and smokers treated with cancellous screws versus smokers treated with a sliding hip screw (HR 2.94, 95% CI 1.35-6.25; P = 0.006). Additionally, for every 10-year decrease in age, participants experienced an average 39% increase in the risk of hardware removal (HR 1.39, 95% CI 1.05-1.85; P = 0.020). Results of this study may inform future research by identifying high-risk patients who may be better treated with arthroplasty and may benefit from adjuncts to care. Prognostic Level II. See Instructions for Authors for a complete description of levels of evidence.
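The hazard ratios above come from multivariable Cox proportional hazards models. A generic sketch of fitting such a model (not the FAITH analysis; the tiny synthetic dataset and column names are invented, and the lifelines package is assumed):

```python
import pandas as pd
from lifelines import CoxPHFitter

# Toy follow-up data: months to revision surgery (or censoring at 24 months)
# with a few illustrative baseline covariates.
df = pd.DataFrame({
    "months":    [24, 6, 18, 24, 3, 12, 24, 9, 20, 24],
    "revised":   [0, 1, 1, 0, 1, 1, 0, 1, 0, 0],   # 1 = revision surgery occurred
    "female":    [1, 0, 0, 1, 1, 0, 0, 1, 0, 1],
    "bmi":       [24, 31, 28, 22, 35, 30, 26, 29, 23, 27],
    "displaced": [0, 1, 0, 1, 1, 0, 1, 1, 0, 0],
})

cph = CoxPHFitter()
cph.fit(df, duration_col="months", event_col="revised")
cph.print_summary()   # the exp(coef) column gives hazard ratios with 95% CIs
```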
Yokoyama, Hisayo; Okazaki, Kazunobu; Imai, Daiki; Yamashina, Yoshihiro; Takeda, Ryosuke; Naghavi, Nooshin; Ota, Akemi; Hirasawa, Yoshikazu; Miyagawa, Toshiaki
2015-05-28
Physical activity reduces the incidence and progression of cognitive impairment. Cognitive-motor dual-task training, which requires dividing attention between cognitive tasks and exercise, may improve various cognitive domains; therefore, we examined the effect of dual-task training on executive functions and on the plasma amyloid β peptide (Aβ) 42/40 ratio, a potent biomarker of Alzheimer's disease, in healthy elderly people. Twenty-seven sedentary elderly people participated in a 12-week randomized, controlled trial. The subjects assigned to the dual-task training (DT) group underwent a specific cognitive-motor dual-task training, and then the clinical outcomes, including cognitive functions assessed by the Modified Mini-Mental State (3MS) examination and the Trail-Making Test (TMT), and the plasma Aβ42/40 ratio following the intervention were compared with those of the control single-task training (ST) group by unpaired t-test. Among 27 participants, 25 completed the study. The total scores on the 3MS examination as well as the muscular strength of the quadriceps were equally improved in both groups after the training. The specific cognitive domains "registration & recall", "attention", "verbal fluency & understanding", and "visuospatial skills" were significantly improved only in the DT group. Higher scores in "attention", "verbal fluency & understanding", and "similarities" were found in the DT group than in the ST group at post-intervention. The absolute changes in the total 3MS score (8.5 ± 1.6 vs 2.4 ± 0.9, p = 0.004, 95% confidence interval (CI) 0.75-3.39) and in the scores of "attention" (1.9 ± 0.5 vs -0.2 ± 0.4, p = 0.004, 95% CI 2.25-9.98) were greater in the DT group than in the ST group. We found no changes in the TMT results in either group. The plasma Aβ42/40 ratio decreased in both groups following the training (ST group: 0.63 ± 0.13 to 0.16 ± 0.03, p = 0.001; DT group: 0.60 ± 0.12 to 0.25 ± 0.06, p = 0.044), although the pre- and post-intervention values were not different between the groups for either measure. Cognitive-motor dual-task training was more beneficial than single-task training alone in improving broader domains of cognitive functions of elderly persons, and the improvement was not directly due to modulating Aβ metabolism.
Social Inequalities in Secondhand Smoke Among Japanese Non-smokers: A Cross-Sectional Study
Aida, Jun; Tsuboya, Toru; Koyama, Shihoko; Sato, Yukihiro; Hozawa, Atsushi; Osaka, Ken
2018-01-01
Background Secondhand smoke (SHS) causes many deaths. Inequalities in SHS have been reported in several countries; however, the evidence in Asian countries is scarce. We aimed to investigate the association between socioeconomic status (SES) and SHS at home and the workplace/school among non-smoking Japanese adults. Methods Cross-sectional data from the Miyagi Prefectural Health Survey 2014 were analyzed. Self-reported questionnaires were randomly distributed to residents ≥20 years of age and 2,443 (92.8%) responded. The data of 1,738 and 1,003 respondents were included in the analyses of SHS in the past month at home and at the workplace/school, respectively. Ordered logistic regression models considering possible confounders, including knowledge of the adverse health effects of tobacco, were applied. Results The prevalence of SHS at home and the workplace/school was 19.0% and 39.0%, respectively. Compared with ≥13 years of education, odds ratios (ORs) and 95% confidence intervals (CIs) for SHS at home were 1.94 (95% CI, 1.42–2.64) for 10–12 years and 3.00 (95% CI, 1.95–4.60) for ≤9 years; those for SHS at the workplace/school were 1.80 (95% CI, 1.36–2.39) and 3.82 (95% CI, 2.29–6.36), respectively. Knowledge of the adverse health effects of tobacco was significantly associated with lower SHS at home (OR 0.95; 95% CI, 0.91–0.98) but it was not associated with SHS at the workplace/school (OR 1.02; 95% CI, 0.98–1.06). Conclusions Social inequalities in SHS existed among Japanese non-smoking adults. Knowledge about tobacco was negatively associated with SHS at home but not at the workplace/school. PMID:29093356
Mechanism of impaired consciousness in absence seizures: a cross-sectional study
Guo, Jennifer N.; Kim, Robert; Chen, Yu; Negishi, Michiro; Jhun, Stephen; Weiss, Sarah; Ryu, Jun Hwan; Bai, Xiaoxiao; Xiao, Wendy; Feeney, Erin; Rodriguez-Fernandez, Jorge; Mistry, Hetal; Crunelli, Vincenzo; Crowley, Michael J.; Mayes, Linda C.; Todd Constable, R.; Blumenfeld, Hal
2017-01-01
Background Absence seizures are brief episodes of impaired consciousness characterized by staring and behavioral arrest. The neural underpinnings of impaired consciousness and of the variable severity of behavioral impairment observed from one absence seizure to the next are not well understood. We therefore compared fMRI and EEG changes in absence seizures with impaired task performance to seizures in which performance was spared. Methods Patients were recruited from 59 pediatric neurology practices including hospitals and neurology outpatient offices throughout the United States. We performed simultaneous electroencephalography (EEG), fMRI, and behavioral testing in children and adolescents aged 6 to 19 years with typical absence epilepsy. fMRI and EEG were analyzed using data-driven approaches without prior assumptions about signal time courses or spatial distributions. The main outcomes were fMRI and EEG amplitudes in seizures with impaired versus spared behavioral responses analysed by t-test. We also examined the timing of fMRI and EEG changes in seizures with impaired behavioral responses compared to seizures with spared responses. Findings 93 patients were enrolled between September 1, 2005 and January 1, 2013, and we captured a total of 1032 seizures in 39 patients. fMRI changes during seizures occurred sequentially in three functional brain networks previously well-validated in studies of normal subjects. Seizures associated with more impaired behavior showed higher fMRI amplitude in all three networks compared to seizures with spared performance. In the default-mode network, fMRI amplitude was 0·57 ± 0·26% for seizures with impaired and 0·40 ± 0·16% for seizures with spared behavioral responses (mean difference 0·17%; 95% CI: 0·11 to 0·23%; p < 0·0001). In the task-positive network, fMRI amplitude was 0·53 ± 0·29% for impaired and 0·39 ± 0·15% for spared seizures (mean difference 0·14%; 95% CI: 0·08 to 0·21%; p < 0·0001). In the sensorimotor-thalamic network, fMRI amplitude was 0·41 ± 0·25% for impaired and 0·34 ± 0·14% for spared seizures (mean difference 0·07%; 95% CI: 0·01 to 0·13%; p = 0·02). Seizures with impaired behavior also showed greater EEG power in widespread brain regions compared to seizures with spared behavior. Mean fractional EEG power in the frontal leads was 50·4 ± 15·2 for seizures with impaired and 24·8 ± 6·5 for seizures with spared behavior (mean difference 25·6; 95% CI: 21·0 to 30·3); middle leads 35·4 ± 6·5 for impaired, 13·3 ± 3·4 for spared seizures (mean difference 22·1; 95% CI: 20·0 to 24·1); posterior leads 41·6 ± 5·3 for impaired, 24·6 ± 8·6 for spared seizures (mean difference 17·0; 95% CI: 14·4 to 19·7); p < 0·0001 for all comparisons. Average seizure duration was longer for seizures with impaired behavior at 7·9 ± 6·6 s, compared to 3·8 ± 3·0 s for seizures with spared behavior (mean difference 4·1 s; 95% CI: 3·0 to 5·3 s; p < 0·0001). However, larger amplitude fMRI and EEG signals occurred at the outset or even preceding seizures with impairment. Interpretation Impaired consciousness in absence seizures is related to the intensity of physiological changes in established networks affecting widespread regions of the brain. Increased EEG and fMRI amplitude occurs at the onset of seizures associated with behavioral impairment. These findings suggest that a vulnerable state may exist at the initiation of some seizures leading to greater physiological changes and altered consciousness. PMID:27839650
Injury Mortality in Individuals With Autism
Guan, Joseph
2017-01-01
Objectives. To examine epidemiological patterns of injury fatalities in individuals with a diagnosis of autism. Methods. We identified individuals with a diagnosis of autism who died between 1999 and 2014 by screening causes of death in the multiple cause-of-death data files in the National Vital Statistics System based on the International Classification of Diseases, 10th Revision, code F84.0. We used the general US population as the reference to calculate proportionate mortality ratios (PMRs) and 95% confidence intervals (CIs). Results. During the study period, 1367 deaths (1043 males and 324 females) in individuals with autism were recorded in the United States. The mean age at death for individuals with autism was 36.2 years (SD = 20.9 years), compared with 72.0 years (SD = 19.2 years) for the general population. Of the deaths in individuals with autism, 381 (27.9%) were attributed to injury (PMR = 2.93; 95% CI = 2.64, 3.24), with suffocation (n = 90; PMR = 31.93; 95% CI = 25.69, 39.24) being the leading cause of injury mortality, followed by asphyxiation (n = 78; PMR = 13.50; 95% CI = 10.68, 16.85) and drowning (n = 74; PMR = 39.89; 95% CI = 31.34, 50.06). Conclusions. Individuals with autism appear to be at substantially heightened risk for death from injury. PMID:28323463
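The injury estimates above are proportionate mortality ratios: the share of deaths from a given cause among decedents with autism divided by the corresponding share in the reference population. A minimal sketch of that calculation (the reference proportion of roughly 9.5% is an assumption chosen only so the toy numbers land near the reported overall injury PMR; the abstract does not state it):

```python
import math

def pmr_with_ci(cause_deaths, total_deaths, reference_proportion, z=1.96):
    """Proportionate mortality ratio with an approximate log-normal 95% CI."""
    expected = total_deaths * reference_proportion   # deaths expected from the reference share
    pmr = cause_deaths / expected
    se_log = 1 / math.sqrt(cause_deaths)             # Poisson approximation for the observed count
    return pmr, pmr * math.exp(-z * se_log), pmr * math.exp(z * se_log)

# Illustrative only: 381 injury deaths among 1367 total deaths, against an
# assumed reference proportion of 9.5% of deaths being due to injury.
print(pmr_with_ci(cause_deaths=381, total_deaths=1367, reference_proportion=0.095))
```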
Weaver, Marcia R; Pillay, Erushka; Jed, Suzanne L; de Kadt, Julia; Galagan, Sean; Gilvydis, Jennifer; Marumo, Eva; Mawandia, Shreshth; Naidoo, Evasen; Owens, Tamara; Prongay, Vickery; O'Malley, Gabrielle
2016-03-01
The South African National Department of Health sought to improve syndromic management of sexually transmitted infections (STIs). Continuing medical education on STIs was delivered at primary healthcare (PHC) clinics using one of three training methods: (1) lecture, (2) computer-based and (3) paper-based. Clinics with training were compared with control clinics. Ten PHC clinics were randomly assigned to control and 10 to each training method arm. Clinicians participated in on-site training on six modules; two per week for three weeks. Each clinic was visited by three or four unannounced standardised patient (SP) actors pre-training and post-training. Male SPs reported symptoms of male urethritis syndrome and female SPs reported symptoms of vaginal discharge syndrome. Quality of healthcare was measured by whether or not clinicians completed five tasks: HIV test, genital exam, correct medications, condoms and partner notification. An average of 31% of clinicians from each PHC attended each module. Quality of STI care was low. Pre-training (n=128) clinicians completed an average of 1.63 tasks. Post-training (n=114) they completed 1.73. There was no change in the number of STI tasks completed in the control arm and an 11% increase overall in the training arms relative to the control (ratio of relative risks (RRR)=1.11, 95% CI 0.67 to 1.84). Across training arms, there was a 26% increase (RRR=1.26, 95% CI 0.77 to 2.06) associated with the lecture arm, a 17% increase (RRR=1.17, 95% CI 0.59 to 2.28) with the paper-based arm and a 13% decrease (RRR=0.87, 95% CI 0.40 to 1.90) with the computer arm relative to the control. Future interventions should address increasing training attendance and computer-based training effectiveness. AEARCTR-0000668. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/
Dosimetric Inhomogeneity Predicts for Long-Term Breast Pain After Breast-Conserving Therapy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mak, Kimberley S.; Chen, Yu-Hui; Catalano, Paul J.
Purpose: The objective of this cross-sectional study was to characterize long-term breast pain in patients undergoing breast-conserving surgery and radiation (BCT) and to identify predictors of this pain. Methods and Materials: We identified 355 eligible patients with Tis-T2N0M0 breast cancer who underwent BCT in 2007 to 2011, without recurrent disease. A questionnaire derived from the Late Effects Normal Tissue Task Force (LENT) Subjective, Objective, Management, Analytic (SOMA) scale was mailed with 7 items detailing the severity, frequency, duration, and impact of ipsilateral breast pain over the previous 2 weeks. A logistic regression model identified predictors of long-term breast pain based on questionnaire responses and patient, disease, and treatment characteristics. Results: The questionnaire response rate was 80% (n=285). One hundred thirty-five patients (47%) reported pain in the treated breast, with 19 (14%) having pain constantly or at least daily; 15 (11%) had intense pain. The pain interfered with daily activities in 11 patients (8%). Six patients (4%) took analgesics for breast pain. Fourteen (10%) thought that the pain affected their quality of life. On univariable analysis, volume of breast tissue treated to ≥105% of the prescribed dose (odds ratio [OR] 1.001 per cc, 95% confidence interval [CI] 1.000-1.002; P=.045), volume treated to ≥110% (OR 1.009 per cc, 95% CI 1.002-1.016; P=.012), hormone therapy use (OR 1.95, 95% CI 1.12-3.39; P=.02), and other sites of pain (OR 1.79, 95% CI 1.05-3.07; P=.03) predicted for long-term breast pain. On multivariable analysis, volume ≥110% (OR 1.01 per cc, 95% CI 1.003-1.017; P=.007), shorter time since treatment (OR 0.98 per month, 95% CI 0.96-0.998; P=.03), and hormone therapy (OR 1.84, 95% CI 1.05-3.25; P=.03) were independent predictors of pain. Conclusion: Long-term breast pain was common after BCT. Although nearly half of patients had pain, most considered it tolerable. Dosimetric inhomogeneity independently predicted for pain and should be minimized to the greatest extent possible.
(DCT-FY08) Target Detection Using Multiple Modality Airborne and Ground Based Sensors
2013-03-01
Task shifting from doctors to non-doctors for initiation and maintenance of antiretroviral therapy.
Kredo, Tamara; Adeniyi, Folasade B; Bateganya, Moses; Pienaar, Elizabeth D
2014-07-01
The shortage of healthcare workers is recognised as a severe impediment to increasing patients' access to antiretroviral therapy. This is particularly of concern where the burden of disease is greatest and access to trained doctors is limited. This review aims to better inform HIV care programmes that are currently underway, and those planned, by assessing whether task shifting of care from doctors to non-doctors provides both high quality and safe care for all patients requiring antiretroviral treatment. To evaluate the quality of initiation and maintenance of HIV/AIDS care in models that task shift care from doctors to non-doctors. We conducted a comprehensive search to identify all relevant studies regardless of language or publication status (published, unpublished, in press, and in progress) from 1 January 1996 to 28 March 2014, with major HIV/AIDS conferences searched on 23 May 2014. We also contacted relevant organizations and researchers. Key words included MeSH terms and free-text terms relevant to 'task shifting', 'skill mix', 'integration of tasks', 'service delivery' and 'health services accessibility'. We included controlled trials (randomised or non-randomised), controlled before-and-after studies, and cohort studies (prospective or retrospective) comparing doctor-led antiretroviral therapy delivery to delivery that included another cadre of health worker other than a doctor, for initiating treatment, continuing treatment, or both, in HIV-infected patients. Two authors independently screened the titles, abstracts and descriptor terms retrieved by the electronic search and applied our eligibility criteria, using a standardized eligibility form, to the full texts of potentially eligible or uncertain abstracts. Two reviewers independently extracted data on standardized data extraction forms. Where possible, data were pooled using random effects meta-analysis. We assessed evidence quality with GRADE methodology. Ten studies met our inclusion criteria, all of which were conducted in Africa. Of these, four were randomised controlled trials while the remaining six were cohort studies. From the trial data, when nurses initiated and provided follow-up HIV therapy, there was high quality evidence of no difference in death at one year; the unadjusted risk ratio was 0.96 (95% CI 0.82 to 1.12), one trial, cluster adjusted n = 2770. There was moderate quality evidence of lower rates of losses to follow-up at one year, relative risk of 0.73 (95% CI 0.55 to 0.97). From the cohort data, there was low quality evidence that there may be an increased risk of death in the task shifting group, relative risk 1.23 (95% CI 1.14 to 1.33, two cohorts, n = 39 160), and very low quality data reporting no difference in patients lost to follow-up between groups, relative risk 0.30 (95% CI 0.05 to 1.94). From the trial data, when doctors initiated therapy and nurses provided follow-up, there was moderate quality evidence that there is probably no difference in death compared with doctor-led care at one year, relative risk of 0.89 (95% CI 0.59 to 1.32), two trials, cluster adjusted n = 4332. There was moderate quality evidence that there is probably no difference in the numbers of patients lost to follow-up at one year, relative risk 1.27 (95% CI 0.92 to 1.77), P = 0.15. 
From the cohort data, there is very low quality evidence that death at one year may be lower in the task shifting group, relative risk 0.19 (95% CI 0.05 to 0.78), one cohort, n = 2772, and very low quality evidence that loss to follow-up was reduced, relative risk 0.34 (95% CI 0.18 to 0.66). From the trial data, for maintenance therapy delivered in the community there was moderate quality evidence that there is probably no difference in mortality at one year whether doctors deliver care in the hospital or specially trained field workers provide home-based maintenance care and antiretroviral therapy, relative risk 1.0 (95% CI 0.62 to 1.62), 1 trial, cluster adjusted n = 559. There is moderate quality evidence from this trial that losses to follow-up are probably no different at one year, relative risk 0.52 (95% CI 0.12 to 2.3), P = 0.39. The cohort studies did not report on one year follow-up for these outcomes. Across the studies that reported on virological and immunological outcomes, there was no clear evidence of a difference whether a doctor, nurse or clinical officer delivered therapy. Three studies report on costs to patients, indicating a reduction in travel costs to treatment facilities where task shifting brought care closer to patients' homes. There is conflicting evidence regarding the relative cost to the health system, as implementation of the strategy may increase costs. The two studies reporting patient and staff perceptions of the quality of care report good acceptability of the service by patients and general acceptance by doctors of the shifting of roles. One trial reported on the time to initiation of antiretroviral therapy, finding no clear evidence of a difference between groups. The same trial reports on new diagnoses of tuberculosis, with results favouring nurse initiation of HIV care for increasing the number of tuberculosis diagnoses made. Our review found moderate quality evidence that shifting responsibility from doctors to adequately trained and supported nurses or community health workers for managing HIV patients probably does not decrease the quality of care and, in the case of nurse-initiated care, may decrease the numbers of patients lost to follow-up.
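The review pools study results "using random effects meta-analysis" where possible. The sketch below shows one standard way to do this, inverse-variance pooling of log risk ratios with the DerSimonian-Laird between-study variance estimate; it is not the review's own code, and the example inputs simply reuse two risk ratios quoted above, which come from different comparisons, so the pooled value is illustrative only.

import math

def pool_random_effects(rrs, cis, z=1.96):
    """DerSimonian-Laird random-effects pooling of risk ratios given (RR, (lo, hi)) pairs."""
    y = [math.log(rr) for rr in rrs]
    # Reconstruct the SE of each log-RR from its 95% CI
    se = [(math.log(hi) - math.log(lo)) / (2 * z) for lo, hi in cis]
    w = [1 / s**2 for s in se]                       # fixed-effect (inverse-variance) weights
    y_fe = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
    q = sum(wi * (yi - y_fe) ** 2 for wi, yi in zip(w, y))
    c = sum(w) - sum(wi**2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c)          # between-study variance
    w_re = [1 / (s**2 + tau2) for s in se]
    y_re = sum(wi * yi for wi, yi in zip(w_re, y)) / sum(w_re)
    se_re = math.sqrt(1 / sum(w_re))
    return math.exp(y_re), math.exp(y_re - z * se_re), math.exp(y_re + z * se_re)

# Illustrative inputs: RR 0.96 (0.82 to 1.12) and RR 0.89 (0.59 to 1.32)
print(pool_random_effects([0.96, 0.89], [(0.82, 1.12), (0.59, 1.32)]))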
Larcos, George; Prgomet, Mirela; Georgiou, Andrew; Westbrook, Johanna
2017-01-01
Background Errors by nuclear medicine technologists during the preparation of radiopharmaceuticals or at other times can cause patient harm and may reflect the impact of interruptions, busy work environments and deficient systems or processes. We aimed to: (a) characterise the rate and nature of interruptions technologists experience and (b) identify strategies that support safety. Methods We performed 100 hours of observation of 11 technologists at a major public hospital and measured the proportions of time spent in eight categories of work tasks, location of task, interruption rate and type and multitasking (tasks conducted in parallel). We catalogued specific safety-oriented strategies used by technologists. Results Technologists completed 5227 tasks and experienced 569 interruptions (mean, 4.5 times per hour; 95% CI 4.1 to 4.9). The highest interruption rate occurred when technologists were in transit between rooms (10.3 per hour (95% CI 8.3 to 12.5)). Interruptions during radiopharmaceutical preparation occurred a mean of 4.4 times per hour (95% CI 3.3 to 5.6). Most (n=426) tasks were interrupted once only and all tasks were resumed after interruption. Multitasking occurred 16.6% of the time. At least some interruptions were initiated by other technologists to convey important information and/or to render assistance. Technologists employed a variety of verbal and non-verbal strategies in all work areas (notably in the hot-lab) to minimise the impact of interruptions and optimise the safe conduct of procedures. Although most were due to individual choices, some strategies reflected overt or subliminal departmental policy. Conclusions Some interruptions appear beneficial. Technologists' self-initiated strategies to support safe work practices appear to be an important element in supporting a resilient work environment in nuclear medicine. PMID:27707869
Tan, Melanie; Mol, Gerben C; van Rooden, Cornelis J; Klok, Frederikus A; Westerbeek, Robin E; Iglesias Del Sol, Antonio; van de Ree, Marcel A; de Roos, Albert; Huisman, Menno V
2014-07-24
Accurate diagnostic assessment of suspected ipsilateral recurrent deep vein thrombosis (DVT) is a major clinical challenge because differentiating between acute recurrent thrombosis and residual thrombosis is difficult with compression ultrasonography (CUS). We evaluated noninvasive magnetic resonance direct thrombus imaging (MRDTI) in a prospective study of 39 patients with symptomatic recurrent ipsilateral DVT (incompressibility of a different proximal venous segment than at the prior DVT) and 42 asymptomatic patients with at least 6-month-old chronic residual thrombi and normal D-dimer levels. All patients were subjected to MRDTI. MRDTI images were judged by 2 independent radiologists blinded for the presence of acute DVT and a third in case of disagreement. The sensitivity, specificity, and interobserver reliability of MRDTI were determined. MRDTI demonstrated acute recurrent ipsilateral DVT in 37 of 39 patients and was normal in all 42 patients without symptomatic recurrent disease for a sensitivity of 95% (95% CI, 83% to 99%) and a specificity of 100% (95% CI, 92% to 100%). Interobserver agreement was excellent (κ = 0.98). MRDTI images were adequate for interpretation in 95% of the cases. MRDTI is a sensitive and reproducible method for distinguishing acute ipsilateral recurrent DVT from 6-month-old chronic residual thrombi in the leg veins. © 2014 by The American Society of Hematology.
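The sensitivity and specificity above follow directly from the reported counts (37 of 39 patients with recurrent DVT positive on MRDTI, 0 of 42 controls positive). The sketch below computes them with Wilson score intervals, which approximately reproduce the quoted 83-99% and 92-100% ranges; the abstract does not say which interval method the authors used.

import math

def wilson_ci(successes, n, z=1.96):
    """Wilson score 95% confidence interval for a binomial proportion."""
    p = successes / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return p, max(0.0, centre - half), min(1.0, centre + half)

print("sensitivity:", wilson_ci(37, 39))   # ~ (0.95, 0.83, 0.99)
print("specificity:", wilson_ci(42, 42))   # ~ (1.00, 0.92, 1.00)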
Weisscher, Nadine; de Haan, Rob J; Vermeulen, Marinus
2007-01-01
Background To investigate the interchangeability of measures of disability and health-related quality of life (HRQL) by comparing their association patterns with disease-related impairment measures in patients with a variety of conditions. Methods A systematic literature search of MEDLINE, EMBASE, Web of Science and a hand search of reference lists through January 2006. Studies were included if they reported association patterns between impairment and disability and between impairment and HRQL. Correlation coefficients were transformed to Fisher's z effect size (ES(z)). Weighted averages were reported as pooled ES(z) with 95% confidence intervals (CI). Results The relationship between impairment and disability was stronger (pooled ES(z) = 0.69; 95% CI, 0.66 – 0.72) than between impairment and HRQL (pooled ES(z) = 0.38; 95% CI, 0.36 – 0.41). The physical component score (pooled ES(z) = 0.43; 95% CI, 0.39 – 0.47) and disease-specific HRQL (pooled ES(z) = 0.46; 95% CI, 0.40 – 0.51) were more strongly associated with impairments than the mental component score (pooled ES(z) = 0.28; 95% CI, 0.20 – 0.36) and generic HRQL (pooled ES(z) = 0.36; 95% CI, 0.33 – 0.39). Conclusion This study shows that measures of disability and different HRQL domains were not equally related to impairment. Patients' impairments are better reflected in disability measures than in HRQL instruments. There are many outcomes of interest, and precisely defining and measuring them will improve assessment of the impact of new interventions. PMID:17578571
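Pooling on the Fisher's z scale, as described above, means transforming each correlation with arctanh, averaging with weights proportional to n - 3, and transforming back. A minimal sketch follows; the correlations and sample sizes in the example are hypothetical, not studies from the review.

import math

def pooled_fisher_z(correlations, sample_sizes, z_crit=1.96):
    """Pool correlations via Fisher's z (variance 1/(n-3)); returns pooled ES(z) with 95% CI."""
    zs = [math.atanh(r) for r in correlations]
    weights = [n - 3 for n in sample_sizes]
    pooled = sum(w * zi for w, zi in zip(weights, zs)) / sum(weights)
    se = math.sqrt(1 / sum(weights))
    return pooled, pooled - z_crit * se, pooled + z_crit * se

# Hypothetical studies: r = 0.55 (n = 120), r = 0.62 (n = 80), r = 0.48 (n = 200)
es, lo, hi = pooled_fisher_z([0.55, 0.62, 0.48], [120, 80, 200])
print(es, lo, hi)        # pooled effect size on the z scale with its 95% CI
print(math.tanh(es))     # back-transformed to a correlation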
38 CFR 21.260 - Subsistence allowance.
Code of Federal Regulations, 2012 CFR
2012-07-01
[Flattened excerpt of the § 21.260 monthly subsistence allowance rate table (institutional and on-job training rates, including the 'Improvement of rehabilitation potential' rows); a truncated footnote 2 states that for on-job training the subsistence allowance may not exceed the difference between the monthly training wage, not including overtime, and the entrance …]
38 CFR 21.260 - Subsistence allowance.
Code of Federal Regulations, 2013 CFR
2013-07-01
[Flattened excerpt of the § 21.260 monthly subsistence allowance rate table (institutional and on-job training rates, including the 'Improvement of rehabilitation potential' rows); a truncated footnote 2 states that for on-job training the subsistence allowance may not exceed the difference between the monthly training wage, not including overtime, and the entrance …]
38 CFR 21.260 - Subsistence allowance.
Code of Federal Regulations, 2014 CFR
2014-07-01
[Flattened excerpt of the § 21.260 monthly subsistence allowance rate table (institutional and on-job training rates, including the 'Improvement of rehabilitation potential' rows); a truncated footnote 2 states that for on-job training the subsistence allowance may not exceed the difference between the monthly training wage, not including overtime, and the entrance …]
Sleep Disturbances and Depression in the Multi-Ethnic Study of Atherosclerosis
Alcántara, Carmela; Biggs, Mary L.; Davidson, Karina W.; Delaney, Joseph A.; Jackson, Chandra L.; Zee, Phyllis C.; Shea, Steven J.C.; Redline, Susan
2016-01-01
Study Objectives: We examined the association of objectively and subjectively measured sleep disturbances with depression, and explored if race/ethnicity, socioeconomic status, and sex modified these associations. Methods: We used data from the cross-sectional Multi-Ethnic Study of Atherosclerosis Sleep Study. Participants included 1,784 adults (ages 54–93 y), 36.8% non-Hispanic Whites, 28.0% African Americans, 23.7% Hispanics, 11.5% Chinese, and 46.0% males. Sleep was assessed with actigraphy, polysomnography, and self-report. Depressive symptoms were assessed using the Center for Epidemiologic Studies Depression (CES-D) scale. We used relative risk regression to evaluate the association of sleep measures and depression (CES-D score ≥ 16) adjusting for site, sociodemographics, and behavioral and medical risk factors. Results: Overall, 14.5% had depression, 29.3% had insomnia symptoms, 14.1% had excessive daytime sleepiness (EDS), 15.1% had apnea-hypopnea index (AHI) ≥ 30, and 30.4% experienced short sleep (< 6 h). Depression was associated with short sleep duration (adjusted prevalence ratio [PR] = 1.47, 95% confidence interval [CI] = 1.11, 1.94), < 10% rapid eye movement [REM] sleep (PR = 1.57, 95% CI = 1.08, 2.27), ≥ 25% REM sleep (PR = 1.42, 95% CI = 1.03, 1.95), insomnia (PR = 1.83, 95% CI = 1.39, 2.40), excessive daytime sleepiness (EDS) (PR = 1.61, 95% CI = 1.19, 2.18), and AHI > 15 + EDS (PR = 1.55, 95% CI = 1.01, 2.39). Short sleep duration was associated with depression among those with high school education or beyond, but not among those with less education. Insomnia was more strongly associated with depression among men than women. Conclusions: Sleep disturbances are associated with depression among middle-aged and older adults; these associations may be modified by education and sex. Future research should further test these hypotheses, evaluate whether early detection or treatment of sleep disturbances ameliorate depression, and explore subpopulation differences. Citation: Alcántara C, Biggs ML, Davidson KW, Delaney JA, Jackson CL, Zee PC, Shea SJ, Redline S. Sleep disturbances and depression in the multi-ethnic study of atherosclerosis. SLEEP 2016;39(4):915–925. PMID:26715223
Wu, Chen-Yi; Hu, Hsiao-Yun; Chow, Lok-Hi; Chou, Yiing-Jenq; Huang, Nicole; Wang, Pei-Ning; Li, Chung-Pin
2015-01-01
Background Few studies have examined the contribution of treatment on the mortality of dementia based on a population-based study. Objective To investigate the effects of anti-dementia and nootropic treatments on the mortality of dementia using a population-based cohort study. Methods 12,193 incident dementia patients were found from 2000 to 2010. Their data were compared with 12,193 age- and sex-matched non-dementia controls that were randomly selected from the same database. Dementia was classified into vascular (VaD) and degenerative dementia. Mortality incidence and hazard ratios (HRs) were calculated. Results The median survival time was 3.39 years (95% confidence interval [CI]: 2.88–3.79) for VaD without medication, 6.62 years (95% CI: 6.24–7.21) for VaD with nootropics, 3.01 years (95% CI: 2.85–3.21) for degenerative dementia without medication, 8.11 years (95% CI: 6.30–8.55) for degenerative dementia with anti-dementia medication, 6.00 years (95% CI: 5.73–6.17) for degenerative dementia with nootropics, and 9.03 years (95% CI: 8.02–9.87) for degenerative dementia with both anti-dementia and nootropic medications. Compared to the non-dementia group, the HRs among individuals with degenerative dementia were 2.69 (95% CI: 2.55–2.83) without medication, 1.46 (95% CI: 1.39–1.54) with nootropics, 1.05 (95% CI: 0.82–1.34) with anti-dementia medication, and 0.92 (95% CI: 0.80–1.05) with both nootropic and anti-dementia medications. VaD with nootropics had a lower mortality (HR: 1.25, 95% CI: 1.15–1.37) than VaD without medication (HR: 2.46, 95% CI: 2.22–2.72). Conclusion Pharmacological treatments have beneficial effects for patients with dementia in prolonging their survival. PMID:26098910
Inês, Elizabete De Jesus; Pacheco, Flavia Thamiris Figueiredo; Pinto, Milena Carneiro; Mendes, Patrícia Silva de Almeida; Da Costa-Ribeiro, Hugo; Soares, Neci Matos; Teixeira, Márcia Cristina Aquino
2016-12-01
The diagnosis of intestinal parasitic infections depends on the parasite load, the specific gravity of the parasite eggs, oocysts or cysts, and the density and viscosity of the flotation or sedimentation medium in which faeces are processed. To evaluate the concordance between zinc sulphate flotation and centrifugal sedimentation in the recovery of parasites in faecal samples of children. Faecal samples of 330 children from day care centers were evaluated by zinc sulphate flotation and centrifugal sedimentation techniques. The frequencies of detection of parasites by each method were determined and the agreement between the diagnostic techniques was evaluated using the kappa index, with 95% confidence intervals. Faecal flotation in zinc sulphate diagnosed significantly more cases of Trichuris trichiura infection when compared to centrifugal sedimentation (39/330; 11.8% vs. 13/330; 3.9%, p<0.001), with low diagnostic concordance between methods (kappa=0.264; 95% CI: 0.102-0.427). Moreover, all positive samples for Enterobius vermicularis eggs (n=5) and Strongyloides stercoralis larvae (n=3) were diagnosed only by zinc sulphate. No statistical differences were observed between methods for protozoa identification. The results showed that centrifugal flotation in zinc sulphate solution was significantly more likely to detect light helminth eggs such as those of T. trichiura and E. vermicularis in faeces than the centrifugal sedimentation process.
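Agreement between the two techniques is summarized with the kappa index and a 95% CI. A minimal sketch of Cohen's kappa for a 2x2 positive/negative cross-tabulation follows; the cell counts are hypothetical (the abstract reports only the marginal totals of 39 and 13 positives out of 330), chosen so the marginals match, and the CI uses a common large-sample approximation rather than whatever exact method the authors used.

import math

def kappa_2x2(a, b, c, d, z=1.96):
    """Cohen's kappa for a 2x2 agreement table:
       a = both positive, b = method1+/method2-, c = method1-/method2+, d = both negative."""
    n = a + b + c + d
    po = (a + d) / n                                        # observed agreement
    pe = ((a + b) * (a + c) + (c + d) * (b + d)) / n**2     # chance agreement
    kappa = (po - pe) / (1 - pe)
    se = math.sqrt(po * (1 - po) / n) / (1 - pe)            # common large-sample approximation
    return kappa, kappa - z * se, kappa + z * se

# Hypothetical split consistent with 39 flotation-positive and 13 sedimentation-positive samples;
# it happens to give kappa ~0.26, in line with the reported 0.264.
print(kappa_2x2(a=8, b=31, c=5, d=286))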
Relationships of physical performance tests to military-relevant tasks in women.
Szivak, Tunde K; Kraemer, William J; Nindl, Bradley C; Gotshalk, Lincoln A; Volek, Jeff S; Gomez, Ana L; Dunn-Lewis, Courtenay; Looney, David P; Comstock, Brett A; Hooper, David R; Flanagan, Shawn D; Maresh, Carl M
2014-01-01
This investigation sought to determine the most predictive measures of performance on a repetitive box lifting task (RBLT) and load bearing task (LBT) among 123 women (aged ±4 years, height 165±7 cm, body mass 64±10 kg). To determine the relationship of various predictors to performance on the RBLT and LBT, multiple regression analysis was conducted on body mass, height, leg cross-sectional area, upper and lower body muscular strength, lower body explosive power, upper and lower body local muscular endurance, and aerobic capacity. The mean±SD (range) number of repetitions for the RBLT was 86±23 (20-159). The mean±SD (range) time to complete the LBT was 2,054±340 seconds (1,307-3,447). The following equations were generated: RBLT (number of repetitions)=57.4+0.2(peak jump power)+0.4(number of pushups in 2 minutes)+0.15(number of repetitions during the squat endurance test)+1.39(one repetition maximal strength boxlift (kg))-0.04(2-mile run time (2MR) in seconds), R=0.81; standard error of the estimate (SEE)=14; LBT (in seconds)=1,831-4.28(number of repetitions during the squat endurance test)+0.95(2MR in seconds)-13.4(body mass), R=0.73; SEE=232. We found that the 2MR and squat endurance test were significant predictive factors for performance on both load carriage tasks. These data also imply that women's performance in combat-related tasks can be improved with training that targets muscular strength, power, and local muscular endurance in addition to aerobic capacity.
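The two regression equations reported above translate directly into prediction functions. The sketch below does only that; the variable names are mine, and predictor units must match those used in the original study, which the abstract does not fully specify.

def predict_rblt_reps(peak_jump_power, pushups_2min, squat_reps, one_rm_boxlift_kg, two_mile_run_s):
    """Repetitive box lifting task: predicted repetitions (reported R = 0.81, SEE = 14)."""
    return (57.4
            + 0.2 * peak_jump_power
            + 0.4 * pushups_2min
            + 0.15 * squat_reps
            + 1.39 * one_rm_boxlift_kg
            - 0.04 * two_mile_run_s)

def predict_lbt_seconds(squat_reps, two_mile_run_s, body_mass_kg):
    """Load bearing task: predicted completion time in seconds (reported R = 0.73, SEE = 232)."""
    return 1831 - 4.28 * squat_reps + 0.95 * two_mile_run_s - 13.4 * body_mass_kg

# Usage: predict_rblt_reps(peak_jump_power, pushups, squats, boxlift_kg, run_seconds)
# and predict_lbt_seconds(squats, run_seconds, body_mass_kg), with predictors measured
# exactly as in the original study.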
Wang, Z.; Zhang, B.; Zhai, F.; Wang, H.; Zhang, J.; Du, W.; Su, C.; Zhang, J.; Jiang, H.; Popkin, B. M.
2014-01-01
Aim We examined the longitudinal association between red meat (RM) consumption and the risk of abdominal obesity in Chinese adults. Methods and results Our data are from 16,822 adults aged 18 to 75 in the China Health and Nutrition Survey from 1993 to 2011. We assessed RM intake with three 24-hour dietary recalls. We defined abdominal obesity as a waist circumference (WC) ≥ 85 centimeters (cm) for men and ≥ 80 cm for women. Multilevel mixed-effect regression models showed that men experienced WC increases of 0.74 cm (95% confidence interval [CI]: 0.39–1.09) from a higher total intake of fresh RM and 0.59 cm (95% CI: 0.24–0.95) from a higher intake of fatty fresh RM but 0.14 cm (95% CI: −0.39 to 0.66) from a higher intake of lean fresh RM in the top versus the bottom quartile when adjusted for potential confounders. In contrast, after additional adjustment for baseline WC, the odds ratios of abdominal obesity in men were attenuated for total fresh RM (1.25 [95% CI: 1.06–1.47]) and fatty fresh RM (1.22 [95% CI: 1.03–1.44]) but were still not affected by lean fresh RM (0.95 [95% CI: 0.75–1.22]). Women also showed a positive association of fatty fresh RM intake with abdominal obesity. Conclusion Greater intake of fatty fresh RM was significantly associated with higher WC (men only) and abdominal obesity risk in Chinese adults. The gender-specific differential association of fatty versus lean fresh RM warrants further study. PMID:24795160
Distracted Driving and Risk of Road Crashes among Novice and Experienced Drivers
Klauer, Sheila G.; Guo, Feng; Simons-Morton, Bruce G.; Ouimet, Marie Claude; Lee, Suzanne E.; Dingus, Thomas A.
2014-01-01
BACKGROUND Distracted driving attributable to the performance of secondary tasks is a major cause of motor vehicle crashes both among teenagers who are novice drivers and among adults who are experienced drivers. METHODS We conducted two studies on the relationship between the performance of secondary tasks, including cell-phone use, and the risk of crashes and near-crashes. To facilitate objective assessment, accelerometers, cameras, global positioning systems, and other sensors were installed in the vehicles of 42 newly licensed drivers (16.3 to 17.0 years of age) and 109 adults with more driving experience. RESULTS During the study periods, 167 crashes and near-crashes among novice drivers and 518 crashes and near-crashes among experienced drivers were identified. The risk of a crash or near-crash among novice drivers increased significantly if they were dialing a cell phone (odds ratio, 8.32; 95% confidence interval [CI], 2.83 to 24.42), reaching for a cell phone (odds ratio, 7.05; 95% CI, 2.64 to 18.83), sending or receiving text messages (odds ratio, 3.87; 95% CI, 1.62 to 9.25), reaching for an object other than a cell phone (odds ratio, 8.00; 95% CI, 3.67 to 17.50), looking at a roadside object (odds ratio, 3.90; 95% CI, 1.72 to 8.81), or eating (odds ratio, 2.99; 95% CI, 1.30 to 6.91). Among experienced drivers, dialing a cell phone was associated with a significantly increased risk of a crash or near-crash (odds ratio, 2.49; 95% CI, 1.38 to 4.54); the risk associated with texting or accessing the Internet was not assessed in this population. The prevalence of high-risk attention to secondary tasks increased over time among novice drivers but not among experienced drivers. CONCLUSIONS The risk of a crash or near-crash among novice drivers increased with the performance of many secondary tasks, including texting and dialing cell phones. (Funded by the Eunice Kennedy Shriver National Institute of Child Health and Human Development and the National Highway Traffic Safety Administration.) PMID:24382065
Lim, Sung-Shil; Kim, Byurira; Yoon, Jin-Ha; Song, Je Seon; Park, Eun-Cheol; Jang, Sung-In
2018-05-01
Objectives We investigated the association between parents' occupational characteristics and untreated dental caries in their children. Methods We analyzed the data of 4764 and 5862 children merged with data of their mothers and fathers, respectively, derived from the Korean National Health and Nutrition Examination Survey, 2008-2015. Dentists assessed untreated dental caries, and occupational characteristics were self-reported. The associations between untreated dental caries in children and their parents' occupational characteristics were assessed with logistic regression analysis. Results The prevalence of untreated dental caries was 18.58% and 16.39% in the mother- and father-matched data, respectively. Compared to children whose mothers worked regular hours, those whose mothers worked overtime had increased odds of untreated dental caries [odds ratio (OR) 1.19, 95% confidence interval (CI) 1.02-1.39]. Children of female self-employed workers/employers/unpaid family workers had higher odds of untreated dental caries than those of wage earners (OR 1.18, 95% CI 1.00-1.39). The OR of untreated dental caries was higher among children with shift-working parents than those whose parents worked daytime hours (mother: OR 1.29, 95% CI 1.11-1.51; father: OR 1.36, 95% CI 1.18-1.58). Conclusions The children of non-white-collar workers, non-wage earners, and workers working overtime or doing shift work had higher odds of untreated dental caries. The effects of parental occupational characteristics on untreated dental caries differed by sex (mother versus father). Public health programs targeting the prevention of dental caries among children should consider parental occupational characteristics.
The effects of mental fatigue on cricket-relevant performance among elite players.
Veness, Darren; Patterson, Stephen David; Jeffries, Owen; Waldron, Mark
2017-12-01
This study investigated the effects of a mentally fatiguing test on physical tasks among elite cricketers. In a cross-over design, 10 elite male cricket players from a professional club performed a cricket run-two test, a Batak Lite reaction time test and a Yo-Yo-Intermittent Recovery Level 1 (Yo-Yo-IR1) test, providing a rating of perceived exertion (RPE) after completing a 30-min Stroop test (mental fatigue condition) or 30-min control condition. Perceived fatigue was assessed before and after the two conditions and motivation was measured before testing. There were post-treatment differences in the perception of mental fatigue (P < 0.001; d = -7.82, 95% CIs = -9.05 to -6.66; most likely). Cricket run-two (P = 0.002; d = -0.51, 95% CIs = -0.72 to -0.30; very likely), Yo-Yo-IR1 distance (P = 0.023; d = 0.39, 95% CIs = 0.14 to 0.64; likely) and RPE (P = 0.001; d = -1.82, 95% CIs = -2.49 to -1.14; most likely) were negatively affected by mental fatigue. The Batak Lite test was not affected (P = 0.137), yet a moderate (d = 0.41, 95% CIs = -0.05 to 0.87) change was likely. Mental fatigue, induced by an app-based Stroop test, negatively affected cricket-relevant performance.
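Effect sizes here are Cohen's d with 95% CIs from a cross-over comparison. The exact d variant is not stated in the abstract, so the sketch below shows one common paired formulation (mean of the condition differences divided by their SD) with an approximate CI; the Yo-Yo-IR1 distances in the example are hypothetical.

import math
import numpy as np

def cohens_d_paired(control, fatigued, z=1.96):
    """Cohen's d for paired observations (difference mean / SD of differences), with an approximate 95% CI."""
    diff = np.asarray(fatigued, dtype=float) - np.asarray(control, dtype=float)
    n = len(diff)
    d = diff.mean() / diff.std(ddof=1)
    se_d = math.sqrt(1 / n + d**2 / (2 * n))   # common large-sample approximation
    return d, d - z * se_d, d + z * se_d

# Hypothetical Yo-Yo-IR1 distances (m) for 10 players under control and mental-fatigue conditions
control  = [1280, 1400, 1160, 1520, 1350, 1240, 1480, 1300, 1420, 1380]
fatigued = [1200, 1340, 1120, 1460, 1280, 1200, 1400, 1260, 1340, 1320]
print(cohens_d_paired(control, fatigued))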
Kurth, Tobias; Diener, Hans-Christoph; Buring, Julie E.
2011-01-01
Background Migraine with aura (MA) has been associated with increased risk of cardiovascular disease (CVD). The role of aspirin on this association remains unclear. Methods Post-hoc subgroup analyses of the Women’s Health Study, a randomized trial testing 100mg aspirin on alternate days in primary prevention of CVD among 39,876 women aged ≥45. Results During 10 years, 998 major CVD events were confirmed in 39,757 women with complete migraine information. Aspirin reduced risk of ischemic stroke (RR=0.76; 95%CI=0.63–0.93) but not other CVD. Migraine or MA did not modify the effect of aspirin on CVD except for myocardial infarction (MI) (p-interaction=0.01). Women with MA on aspirin had increased risk of MI (RR=3.72, 95%CI=1.39–9.95). Further exploratory analyses indicate this is only apparent among women with MA on aspirin who ever smoked or had history of hypertension (p-interaction<0.01). Conclusion In post-hoc subgroup analyses, aspirin had similar protective effects on ischemic stroke for women with or without migraine. By contrast, our data suggest that women with MA on aspirin had increased risk of MI. The small number of outcome events in subgroups, the exploratory nature of our analyses, and lack of plausible mechanisms raise the possibility of a chance finding, which must caution the interpretation. PMID:21673005
Hunter, Susan W; Frengopoulos, Courtney; Holmes, Jeff; Viana, Ricardo; Payne, Michael W
2018-04-01
To determine the relative and absolute reliability of a dual-task functional mobility assessment. Cross-sectional study. Academic rehabilitation hospital. Individuals (N=60) with lower extremity amputation attending an outpatient amputee clinic (mean age, 58.21±12.59y; 18, 80% male) who were stratified into 3 groups: (1) transtibial amputation of vascular etiology (n=20); (2) transtibial amputation of nonvascular etiology (n=20); and (3) transfemoral or bilateral amputation of any etiology (n=20). Not applicable. Time to complete the L Test measured functional mobility under single- and dual-task conditions. The addition of a cognitive task (serial subtractions by 3's) created dual-task conditions. Single-task performance on the cognitive task was also reported. Intraclass correlation coefficients (ICCs) measured relative reliability; SEM and minimal detectable change with a 95% confidence interval (MDC95) measured absolute reliability. Bland-Altman plots measured agreement between assessments. Relative reliability results were excellent for all 3 groups. Values for the dual-task L Test for those with transtibial amputation of vascular etiology (n=20; mean age, 60.36±7.84y; 19, 90% men) were ICC=.98 (95% confidence interval [CI], .94-.99), SEM=1.36 seconds, and MDC95=3.76 seconds; for those with transtibial amputation of nonvascular etiology (n=20; mean age, 55.85±14.08y; 17, 85% men), values were ICC=.93 (95% CI, .80-.98), SEM=1.34 seconds, and MDC95=3.71 seconds; and for those with transfemoral or bilateral amputation (n=20; mean age, 58.21±14.88y; 13, 65% men), values were ICC=.998 (95% CI, .996-.999), SEM=1.03 seconds, and MDC95=2.85 seconds. Bland-Altman plots indicated that assessments did not vary systematically for each group. This dual-task assessment protocol achieved excellent levels of relative reliability for the 3 groups tested. This protocol may be used clinically or in research settings to assess the interaction between cognition and functional mobility in the population with lower extremity amputation. Copyright © 2018 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.
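SEM and MDC95 follow from the ICC and the between-subject SD under the usual conventions: SEM = SD * sqrt(1 - ICC) and MDC95 = 1.96 * sqrt(2) * SEM. The sketch below assumes those formulas (the abstract does not spell out the ones used); plugging in the reported SEMs recovers MDC95 values that match the abstract to within rounding.

import math

def sem_from_icc(sd, icc):
    """Standard error of measurement from the between-subject SD and reliability (ICC)."""
    return sd * math.sqrt(1 - icc)

def mdc95(sem):
    """Minimal detectable change at the 95% confidence level."""
    return 1.96 * math.sqrt(2) * sem

# Reported SEMs for the three groups (seconds): 1.36, 1.34, 1.03
for sem in (1.36, 1.34, 1.03):
    print(round(mdc95(sem), 2))   # ~3.77, 3.71, 2.86 vs. the reported 3.76, 3.71, 2.85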
Sorenson, Wendy R.; Sullivan, Darryl
2008-01-01
In conjunction with an AOAC Presidential Task Force on Dietary Supplements, a method was validated for measurement of 3 plant sterols (phytosterols) in saw palmetto raw materials, extracts, and dietary supplements. AOAC Official Method 994.10, “Cholesterol in Foods,” was modified for purposes of this validation. Test samples were saponified at high temperature with ethanolic potassium hydroxide solution. The unsaponifiable fraction containing phytosterols (campesterol, stigmasterol, and beta-sitosterol) was extracted with toluene. Phytosterols were derivatized to trimethylsilyl ethers and then quantified by gas chromatography with a hydrogen flame ionization detector. The presence of the phytosterols was detected at concentrations greater than or equal to 1.00 mg/100 g based on 2–3 g of sample. The standard curve range for this assay was 0.00250 to 0.200 mg/mL. The calibration curves for all phytosterols had correlation coefficients greater than or equal to 0.995. Precision studies produced relative standard deviation values of 1.52 to 7.27% for campesterol, 1.62 to 6.48% for stigmasterol, and 1.39 to 10.5% for beta-sitosterol. Recoveries for samples fortified at 100% of the inherent values averaged 98.5 to 105% for campesterol, 95.0 to 108% for stigmasterol, and 85.0 to 103% for beta-sitosterol. PMID:16512224
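Quantitation as described relies on an external-standard calibration curve over 0.00250-0.200 mg/mL with a correlation coefficient of at least 0.995. The sketch below shows the generic calculation, a least-squares line and a back-calculated sample concentration; the peak-area responses are hypothetical, not data from the validation.

import numpy as np

# Hypothetical calibration standards for one phytosterol (concentration in mg/mL vs. detector response)
conc = np.array([0.00250, 0.0100, 0.0500, 0.100, 0.200])
area = np.array([12.0, 49.0, 251.0, 498.0, 1003.0])

slope, intercept = np.polyfit(conc, area, 1)    # least-squares line: area = slope*conc + intercept
r = np.corrcoef(conc, area)[0, 1]               # correlation coefficient (method requires >= 0.995)
print(f"r = {r:.4f}")

sample_area = 310.0                              # hypothetical sample injection
sample_conc = (sample_area - intercept) / slope  # back-calculated concentration, mg/mL
print(f"sample concentration ~ {sample_conc:.4f} mg/mL")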
Yen, Po-Yin; Kelley, Marjorie; Lopetegui, Marcelo; Rosado, Amber L.; Migliore, Elaina M.; Chipps, Esther M.; Buck, Jacalyn
2016-01-01
A fundamental understanding of multitasking within nursing workflow is important in today’s dynamic and complex healthcare environment. We conducted a time motion study to understand nursing workflow, specifically multitasking and task switching activities. We used TimeCaT, a comprehensive electronic time capture tool, to capture observational data. We established inter-observer reliability prior to data collection. We completed 56 hours of observation of 10 registered nurses. We found, on average, nurses had 124 communications and 208 hands-on tasks per 4-hour block of time. They multitasked (having communication and hands-on tasks simultaneously) 131 times, representing 39.48% of all times; the total multitasking duration ranges from 14.6 minutes to 109 minutes, 44.98 minutes (18.63%) on average. We also reviewed workflow visualization to uncover the multitasking events. Our study design and methods provide a practical and reliable approach to conducting and analyzing time motion studies from both quantitative and qualitative perspectives. PMID:28269924
Motor Imagery Practice for Enhancing Elevé Performance Among Professional Dancers: A Pilot Study.
Abraham, Amit; Dunsky, Ayelet; Dickstein, Ruth
2016-09-01
Elevé is a core dance movement requiring the greatest ankle plantarflexion (PF) range of motion (ROM). One possible way to enhance elevé performance is by using motor imagery practice (MIP). The aims of this pilot study were to investigate: 1) functional ankle PF maximal angles and ROM while performing elevé among professional dancers, 2) the effect of MIP on enhancing elevé performance, and 3) participants' views on the MIP intervention and its feasibility in a professional dance company setting. Five professional dancers, mean age 31 yrs (SD 1.87), participated in a 2-week MIP intervention. Data on ankle PF maximal angles and ROM were collected pre- and post-intervention using 3-dimensional motion capture while performing repeat (10 repetitions) and static (10 sec) elevé. At baseline, ankle PF maximal angles were 169.20° (SD 2.81°) and 168.36° (2.23°) and ankle PF ROM were 40.21° (3.35°) and 35.94° (3.95°) for the repeat and static tasks, respectively. After the MIP intervention, ankle PF maximal angles were 170.28° (4.26°) and 170.74° (3.77°) and ankle PF ROM were 41.53° (2.33°) and 39.30° (2.30°) for the repeat and static tasks, respectively. Feasibility of MIP was established with 100% compliance and positive views were expressed by participants. The results suggest MIP holds potential as an adjunct training method for enhancing elevé performance among professional dancers.
Smith-Coggins, Rebecca; Howard, Steven K; Mac, Dat T; Wang, Cynthia; Kwan, Sharon; Rosekind, Mark R; Sowb, Yasser; Balise, Raymond; Levis, Joel; Gaba, David M
2006-11-01
We examine whether a 40-minute nap opportunity at 3 AM can improve cognitive and psychomotor performance in physicians and nurses working 12-hour night shifts. This is a randomized controlled trial of 49 physicians and nurses working 3 consecutive night shifts in an academic emergency department. Subjects were randomized to a control group (no-nap condition=NONE) or nap intervention group (40-minute nap opportunity at 3 AM=NAP). The main outcome measures were Psychomotor Vigilance Task, Probe Recall Memory Task, CathSim intravenous insertion virtual reality simulation, and Profile of Mood States, which were administered before (6:30 PM), during (4 AM), and after (7:30 AM) night shifts. A 40-minute driving simulation was administered at 8 AM and videotaped for behavioral signs of sleepiness and driving accuracy. During the nap period, standard polysomnographic data were recorded. Polysomnographic data revealed that 90% of nap subjects were able to sleep for an average of 24.8 minutes (SD 11.1). At 7:30 AM, the nap group had fewer performance lapses (NAP 3.13, NONE 4.12; p<0.03; mean difference 0.99; 95% CI: -0.1 to 2.08), reported more vigor (NAP 4.44, NONE 2.39; p<0.03; mean difference 2.05; 95% CI: 0.63-3.47), less fatigue (NAP 7.4, NONE 10.43; p<0.05; mean difference 3.03; 95% CI: 1.11-4.95), and less sleepiness (NAP 5.36, NONE 6.48; p<0.03; mean difference 1.12; 95% CI: 0.41-1.83). They tended to complete the intravenous insertion more quickly (NAP 66.40 sec, NONE 86.48 sec; p=0.10; mean difference 20.08; 95% CI: 4.64-35.52), exhibit less dangerous driving, and display fewer behavioral signs of sleepiness during the driving simulation. Immediately after the nap (4 AM), the subjects scored more poorly on the Probe Recall Memory Task (NAP 2.76, NONE 3.7; p<0.05; mean difference 0.94; 95% CI: 0.20-1.68). A nap at 3 AM improved performance and subjective report in physicians and nurses at 7:30 AM compared to a no-nap condition. Immediately after the nap, memory temporarily worsened. The nap group did not perform any better than the no-nap group during a simulated drive home after the night shift.
Respiratory rates measured by a standardised clinical approach, ward staff, and a wireless device.
Granholm, A; Pedersen, N E; Lippert, A; Petersen, L F; Rasmussen, L S
2016-11-01
Respiratory rate is among the first vital signs to change in deteriorating patients. The aim was to investigate the agreement between respiratory rate measurements by three different methods. This prospective observational study included acutely admitted adult patients in a medical ward. Respiratory rate was measured by three methods: a standardised approach over 60 s while patients lay still and refrained from talking, by ward staff and by a wireless electronic patch (SensiumVitals). The Bland-Altman method was used to compare measurements and three breaths per minute (BPM) was considered a clinically relevant difference. We included 50 patients. The mean difference between the standardised approach and the electronic measurement was 0.3 (95% CI: -1.4 to 2.0) BPM; 95% limits of agreement were -11.5 (95% CI: -14.5 to -8.6) and 12.1 (95% CI: 9.2 to 15.1) BPM. Removal of three outliers with huge differences led to a mean difference of -0.1 (95% CI: -0.7 to 0.5) BPM and 95% limits of agreement of -4.2 (95% CI: -5.3 to -3.2) BPM and 4.0 (95% CI: 2.9 to 5.0) BPM. The mean difference between staff and electronic measurements was 1.7 (95% CI: -0.5 to 3.9) BPM; 95% limits of agreement were -13.3 (95% CI: -17.2 to -9.5) BPM and 16.8 (95% CI: 13.0 to 20.6) BPM. A concerning lack of agreement was found between a wireless monitoring system and a standardised clinical approach. Ward staff's measurements also seemed to be inaccurate. © 2016 The Acta Anaesthesiologica Scandinavica Foundation. Published by John Wiley & Sons Ltd.
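The Bland-Altman quantities above are the mean difference and the 95% limits of agreement (mean ± 1.96 SD of the paired differences), each with its own CI. The sketch below computes them for a set of paired readings; the data are hypothetical, and the standard error used for the limits is Bland and Altman's approximation, not necessarily the exact method used in the study.

import math
import numpy as np
from scipy import stats

def bland_altman(a, b, z=1.96):
    """Mean difference, 95% limits of agreement, and approximate 95% CIs for each."""
    d = np.asarray(a, dtype=float) - np.asarray(b, dtype=float)
    n = len(d)
    mean_d, sd_d = d.mean(), d.std(ddof=1)
    t = stats.t.ppf(0.975, n - 1)
    ci_mean = (mean_d - t * sd_d / math.sqrt(n), mean_d + t * sd_d / math.sqrt(n))
    loa = (mean_d - z * sd_d, mean_d + z * sd_d)
    se_loa = sd_d * math.sqrt(3 / n)              # Bland & Altman's approximation for the SE of a limit
    ci_loa = [(l - t * se_loa, l + t * se_loa) for l in loa]
    return mean_d, ci_mean, loa, ci_loa

# Hypothetical paired measurements (breaths per minute): standardised approach vs. electronic patch
standard = [16, 18, 20, 14, 22, 17, 19, 15]
patch    = [15, 19, 21, 13, 24, 16, 18, 17]
print(bland_altman(standard, patch))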
Carrera, Cristina; Marchetti, Michael A; Dusza, Stephen W; Argenziano, Giuseppe; Braun, Ralph P; Halpern, Allan C; Jaimes, Natalia; Kittler, Harald J; Malvehy, Josep; Menzies, Scott W; Pellacani, Giovanni; Puig, Susana; Rabinovitz, Harold S; Scope, Alon; Soyer, H Peter; Stolz, Wilhelm; Hofmann-Wellenhof, Rainer; Zalaudek, Iris; Marghoob, Ashfaq A
2016-07-01
The comparative diagnostic performance of dermoscopic algorithms and their individual criteria are not well studied. To analyze the discriminatory power and reliability of dermoscopic criteria used in melanoma detection and compare the diagnostic accuracy of existing algorithms. This was a retrospective, observational study of 477 lesions (119 melanomas [24.9%] and 358 nevi [75.1%]), which were divided into 12 image sets that consisted of 39 or 40 images per set. A link on the International Dermoscopy Society website from January 1, 2011, through December 31, 2011, directed participants to the study website. Data analysis was performed from June 1, 2013, through May 31, 2015. Participants included physicians, residents, and medical students, and there were no specialty-type or experience-level restrictions. Participants were randomly assigned to evaluate 1 of the 12 image sets. Associations with melanoma and intraclass correlation coefficients (ICCs) were evaluated for the presence of dermoscopic criteria. Diagnostic accuracy measures were estimated for the following algorithms: the ABCD rule, the Menzies method, the 7-point checklist, the 3-point checklist, chaos and clues, and CASH (color, architecture, symmetry, and homogeneity). A total of 240 participants registered, and 103 (42.9%) evaluated all images. The 110 participants (45.8%) who evaluated fewer than 20 lesions were excluded, resulting in data from 130 participants (54.2%), 121 (93.1%) of whom were regular dermoscopy users. Criteria associated with melanoma included marked architectural disorder (odds ratio [OR], 6.6; 95% CI, 5.6-7.8), pattern asymmetry (OR, 4.9; 95% CI, 4.1-5.8), nonorganized pattern (OR, 3.3; 95% CI, 2.9-3.7), border score of 6 (OR, 3.3; 95% CI, 2.5-4.3), and contour asymmetry (OR, 3.2; 95% CI, 2.7-3.7) (P < .001 for all). Most dermoscopic criteria had poor to fair interobserver agreement. Criteria that reached moderate levels of agreement included comma vessels (ICC, 0.44; 95% CI, 0.40-0.49), absence of vessels (ICC, 0.46; 95% CI, 0.42-0.51), dark brown color (ICC, 0.40; 95% CI, 0.35-0.44), and architectural disorder (ICC, 0.43; 95% CI, 0.39-0.48). The Menzies method had the highest sensitivity for melanoma diagnosis (95.1%) but the lowest specificity (24.8%) compared with any other method (P < .001). The ABCD rule had the highest specificity (59.4%). All methods had similar areas under the receiver operating characteristic curves. Important dermoscopic criteria for melanoma recognition were revalidated by participants with varied experience. Six algorithms tested had similar but modest levels of diagnostic accuracy, and the interobserver agreement of most individual criteria was poor.
Kottke, Melissa; Whiteman, Maura K; Kraft, Joan Marie; Goedken, Peggy; Wiener, Jeffrey; Kourtis, Athena P; DiClemente, Ralph
2015-12-01
To characterize factors associated with dual method contraceptive use in a sample of adolescent women. DESIGN, SETTING, PARTICIPANTS, INTERVENTIONS, AND MAIN OUTCOME MEASURES: We conducted a cross-sectional survey of sexually active African American women aged 14-19 years who attended an urban Title X clinic in Georgia in 2012 (N = 350). Participants completed a computerized survey to assess contraceptive and condom use during the past 2 sexual encounters with their most recent partner. Dual method use was defined as use of a hormonal contraceptive or intrauterine device and a condom. We applied multinomial logistic regression, using generalized estimating equations, to examine the adjusted association between dual method use (vs use of no methods or less effective methods alone; eg, withdrawal) and select characteristics. Dual methods were used by 20.6% of participants at last sexual intercourse and 23.6% at next to last sexual intercourse. Having a previous sexually transmitted disease (adjusted odds ratio [aOR], 2.30; 95% confidence interval [CI], 1.26-4.18), negative attitude toward pregnancy (aOR, 2.25; 95% CI, 1.19-4.28), and a mother who gave birth as a teen (aOR, 2.34; 95% CI, 1.21-4.52) were associated with higher odds of dual method use. Having no health insurance (aOR, 0.39; 95% CI, 0.18-0.82), 4 or more lifetime sexual partners (aOR, 0.42; 95% CI, 0.22-0.78), sex at least weekly (aOR, 0.54; 95% CI, 0.29-0.99), and agreeing to monogamy with the most recent partner (aOR, 0.40; 95% CI, 0.16-0.96) were associated with decreased odds of dual method use. Dual method use was uncommon in our sample. Efforts to increase use of dual methods should address individual and relationship factors. Copyright © 2015 North American Society for Pediatric and Adolescent Gynecology. All rights reserved.
Understanding Older People's Readiness for Receiving Telehealth: Mixed-Method Study.
van Houwelingen, Cornelis Tm; Ettema, Roelof Ga; Antonietti, Michelangelo Gef; Kort, Helianthe Sm
2018-04-06
The Dutch Ministry of Health has formulated ambitious goals concerning the use of telehealth, leading to subsequent changes compared with the current health care situation, in which 93% of care is delivered face-to-face. Since most care is delivered to older people, the prospect of telehealth raises the question of whether this population is ready for this new way of receiving care. To study this, we created a theoretical framework consisting of 6 factors associated with older people's intention to use technology. The objective of this study was to understand community-dwelling older people's readiness for receiving telehealth by studying their intention to use videoconferencing and capacities for using digital technology in daily life as indicators. A mixed-method triangulation design was used. First, a cross-sectional survey study was performed to investigate older people's intention to use videoconferencing by testing our theoretical framework with a multilevel path analysis (phase 1). Second, for deeper understanding of older people's actual use of digital technology, qualitative observations of older people executing technological tasks (eg, on a computer, cell phone) were conducted at their homes (phase 2). In phase 1, a total of 256 people aged 65 years or older participated in the survey study (50.0% male; median age, 70 years; Q1-Q3: 67-76). Using a significance level of .05, we found seven significant associations regarding older people's perception of videoconferencing. Older people's (1) intention to use videoconferencing was predicted by their performance expectancy (odds ratio [OR] 1.26, 95% CI 1.13-1.39), effort expectancy (OR 1.23, 95% CI 1.07-1.39), and perceived privacy and security (OR 1.30, 95% CI 1.17-1.43); (2) their performance expectancy was predicted by their effort expectancy (OR 1.38, 95% CI 1.24-1.52); and (3) their effort expectancy was predicted by their self-efficacy (OR 1.55, 95% CI 1.42-1.68). In phase 2, a total of 6 men and 9 women aged between 65 and 87 years participated in the qualitative observation study. Five primary themes were identified that could provide greater understanding of older people's capacities and incapacities in using digital technology: (1) "self-efficacy and digital literacy," (2) "obstacles to using technology," (3) "prior experience and frequency of use," (4) "sources of support and facilitating conditions," and (5) "performance expectancy." These 5 themes recurred in all 15 observations. Performance expectancy, effort expectancy, and perceived privacy and security are direct predictors of older people's intention to use videoconferencing. Self-efficacy appeared to play a role in both older people's intention to use technology and their actual use of it. The path analysis revealed that self-efficacy was significantly associated with older people's effort expectancy. Furthermore, self-efficacy and digital literacy appeared to play a major role in older people's capacities to make use of digital technology. ©Cornelis TM van Houwelingen, Roelof GA Ettema, Michelangelo GEF Antonietti, Helianthe SM Kort. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 06.04.2018.
Trent, Simon; Dean, Rachel; Veit, Bonnie; Cassano, Tommaso; Bedse, Gaurav; Ojarikre, Obah A.; Humby, Trevor; Davies, William
2013-01-01
Summary Chromosomal deletions at Xp22.3 appear to influence vulnerability to the neurodevelopmental disorders attention deficit hyperactivity disorder (ADHD) and autism. 39,XY*O mice, which lack the murine orthologue of the Xp22.3 ADHD candidate gene STS (encoding steroid sulfatase), exhibit behavioural phenotypes relevant to such disorders (e.g. hyperactivity), elevated hippocampal serotonin (5-HT) levels, and reduced serum levels of dehydroepiandrosterone (DHEA). Here we initially show that 39,XY*O mice are also deficient for the recently-characterised murine orthologue of the Xp22.3 autism candidate gene ASMT (encoding acetylserotonin-O-methyltransferase). Subsequently, to specify potential behavioural correlates of elevated hippocampal 5-HT arising due to the genetic lesion, we compared 39,XY*O MF1 mice to 40,XY MF1 mice on behavioural tasks taxing hippocampal and/or 5-HT function (a ‘foraging’ task, an object-location task, and the 1-choice serial reaction time task of impulsivity). Although Sts/Asmt deficiency did not influence foraging behaviour, reactivity to familiar objects in novel locations, or ‘ability to wait’, it did result in markedly increased response rates; these rates correlated with hippocampal 5-HT levels and are likely to index behavioural perseveration, a frequent feature of neurodevelopmental disorders. Additionally, we show that whilst there was no systematic relationship between serum DHEA levels and hippocampal 5-HT levels across 39,XY*O and 40,XY mice, there was a significant inverse linear correlation between serum DHEA levels and activity. Our data suggest that deficiency for genes within Xp22.3 could influence core behavioural features of neurodevelopmental disorders via dissociable effects on hippocampal neurochemistry and steroid hormone levels, and that the mediating neurobiological mechanisms may be investigated in the 39,XY*O model. PMID:23276394
38 CFR 21.260 - Subsistence allowance.
Code of Federal Regulations, 2010 CFR
2010-07-01
[Flattened excerpt of the § 21.260 monthly subsistence allowance rate table (full-time, 3/4-time, and on-job training rates under 'Improvement of rehabilitation potential'); a truncated footnote states that for on-job training the subsistence allowance may not exceed the difference between the monthly training wage, not including overtime, and the entrance …]
38 CFR 21.260 - Subsistence allowance.
Code of Federal Regulations, 2011 CFR
2011-07-01
[Flattened excerpt of the § 21.260 monthly subsistence allowance rate table (full-time, 3/4-time, and on-job training rates under 'Improvement of rehabilitation potential'); a truncated footnote states that for on-job training the subsistence allowance may not exceed the difference between the monthly training wage, not including overtime, and the entrance …]
2011-01-01
Background Problematic internet use (PIU) is associated with a plethora of psychosocial adversities. The study objectives were to assess the determinants and psychosocial implications associated with potential PIU and PIU among adolescents. Methods A cross-sectional study design was applied among a random sample (n = 866) of Greek adolescents (mean age: 14.7 years). Self-completed questionnaires, including internet use characteristics, Young Internet Addiction Test, and Strengths and Difficulties Questionnaire, were utilized to examine the study objectives. Results Among the study population, the prevalence rates of potential PIU and PIU were 19.4% and 1.5%, respectively. Multinomial logistic regression indicated that male gender (Odds Ratio, OR: 2.01; 95% Confidence Interval, 95% CI: 1.35-3.00), as well as utilizing the internet for retrieving sexual information (OR: 2.52; 95% CI: 1.53-4.12), interactive game playing (OR: 1.85; 95% CI: 1.21-2.82), and socialization, including chat-room use (OR: 1.97; 95% CI: 1.36-2.86) and email (OR: 1.53; 95% CI: 1.05-2.24), were independently associated with potential PIU and PIU. Adolescents with potential PIU had an increased likelihood of concomitantly presenting with hyperactivity (OR: 4.39; 95% CI: 2.03-9.52) and conduct (OR: 2.56; 95% CI: 1.46-4.50) problems. Moreover, adolescent PIU was significantly associated with hyperactivity (OR: 9.96; 95% CI: 1.76-56.20) and conduct (OR: 8.39; 95% CI: 2.04-34.56) problems, as well as comprehensive psychosocial maladjustment (OR: 8.08; 95% CI: 1.44-45.34). Conclusions The determinants of potential PIU and PIU include accessing the internet for the purposes of retrieving sexual information, game playing, and socialization. Furthermore, both potential PIU and PIU are adversely associated with notable behavioral and social maladjustment among adolescents. PMID:21794167
Assessment of Risk Factors of Helicobacter Pylori Infection and Peptic Ulcer Disease
Mhaskar, Rahul S; Ricardo, Izurieta; Azliyati, Azizan; Laxminarayan, Rajaram; Amol, Bapaye; Santosh, Walujkar; Boo, Kwa
2013-01-01
Background: Helicobacter pylori (H. pylori) infection is a risk factor for peptic ulcer. There have been no studies addressing environmental and dietary risk factors in western India. We conducted a case control study enrolling peptic ulcer patients in Pune, India. Materials and Methods: Risk factors for peptic ulcer and H. pylori infection were assessed in a participant interview. H. pylori status was assessed from stool by monoclonal antigen detection. Results: We enrolled 190 peptic ulcer, 35 stomach cancer patients, and 125 controls. Fifty-one percent (180/350) of the participants were infected with H. pylori. Lower socioeconomic status (SES) [odds ratio (OR): 1.10, 95% confidence interval (CI): 1.02–1.39], meat consumption (OR: 2.35, 95% CI: 1.30–4.23), smoking (OR: 2.23, 95% CI: 1.24–4.02), eating restaurant food (OR: 3.77, 95% CI: 1.39–10.23), and drinking nonfiltered or nonboiled water (OR: 1.05, 95% CI: 1.01–1.23) were risk factors for H. pylori infection. H. pylori infection (OR: 1.70, 95% CI: 1.03–2.89), meat (OR: 1.10, 95% CI: 1.02-1.75), fish (OR: 1.05, 95% CI: 1.02–1.89) consumption, and a family history of ulcer (OR: 1.20, 95% CI: 1.08–1.60) were risk factors for peptic ulcer. Consumption of chili peppers (OR: 0.20, 95% CI: 0.10–0.37) and parasite infestation (OR: 0.44, 95% CI: 0.24–0.80) were protective against H. pylori infection. Conclusion: H. pylori infection is associated with peptic ulcer. Lower SES, consumption of restaurant food, meat, nonfiltered water, and smoking are risk factors for H. pylori. Consumption of meat, fish, and a family history of peptic ulcer are risk factors for peptic ulcer. Consumption of chili peppers and concurrent parasite infestation appear to be protective against H. pylori. PMID:23853433
Risk Factors for Mortality in Lower Intestinal Bleeding
Strate, Lisa L.; Ayanian, John Z.; Kotler, Gregory; Syngal, Sapna
2009-01-01
Background and Aims Previous studies of Lower Intestinal Bleeding (LIB) have limited power to study mortality. We sought to identify characteristics associated with in-hospital mortality in a large cohort of patients with LIB. Methods We used the 2002 Healthcare Cost and Utilization Project (HCUP) Nationwide Inpatient Sample (NIS) to study a cross-sectional cohort of 227,022 hospitalized patients with discharge diagnoses indicating LIB. Predictors of mortality were identified using multiple logistic regression. Results In 2002, an estimated 8,737 patients with LIB (3.9%) died while hospitalized. Independent predictors of in-hospital mortality were age (age >70 vs. <50, odds ratio (OR) 4.91; 95% CI 2.45–9.87), intestinal ischemia (OR 3.47; 95% CI 2.57–4.68), comorbid illness (≥ 2 vs. 0 comorbidities, OR 3.00; 95% CI 2.25–3.98), bleeding while hospitalized for a separate process (OR 2.35; 95% CI 1.81–3.04), coagulation defects (OR 2.34; 95% CI 1.50–3.65), hypovolemia (OR 2.22; 95% CI 1.69–2.90), transfusion of packed red blood cells (OR 1.60; 95% CI 1.23–2.08), and male gender (OR 1.52; 95% CI 1.21–1.92). Colorectal polyps (OR 0.26, 95% CI 0.15–0.45), and hemorrhoids (OR 0.42; 95% CI 0.28–0.64) were associated with a lower risk of mortality, as was diagnostic testing for LIB when added to the multivariate model (OR 0.37, 95% CI 0.28–0.48; p<0.001). Hospital characteristics were not significantly related to mortality. Predictors of mortality were similar in an analysis restricted to patients with diverticular bleeding. Conclusions The all-cause in-hospital mortality rate in LIB is low (3.9%). Advanced age, intestinal ischemia and comorbid illness were the strongest predictors of mortality. PMID:18558513
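Many abstracts in this collection, including the mortality analysis above, report adjusted odds ratios from multiple logistic regression. As a minimal, hedged sketch (not the authors' survey-weighted HCUP-NIS analysis), the Python snippet below shows how such ORs and 95% CIs fall out of a fitted logistic model; the data and variable names are illustrative placeholders.

import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 5000
# Illustrative binary covariates: age > 70 and intestinal ischemia.
age_gt70 = rng.integers(0, 2, n)
ischemia = rng.integers(0, 2, n)
# Simulate in-hospital death with log-odds increasing for both factors.
logit = -4.0 + 1.6 * age_gt70 + 1.2 * ischemia
death = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X = sm.add_constant(pd.DataFrame({"age_gt70": age_gt70, "ischemia": ischemia}))
fit = sm.Logit(death, X).fit(disp=0)

# Adjusted odds ratios and 95% CIs are the exponentiated coefficients and CI bounds.
odds_ratios = np.exp(fit.params)
conf_int = np.exp(fit.conf_int())
print(pd.concat([odds_ratios.rename("OR"),
                 conf_int.rename(columns={0: "2.5%", 1: "97.5%"})], axis=1))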
Facial emotion recognition in alcohol and substance use disorders: A meta-analysis.
Castellano, Filippo; Bartoli, Francesco; Crocamo, Cristina; Gamba, Giulia; Tremolada, Martina; Santambrogio, Jacopo; Clerici, Massimo; Carrà, Giuseppe
2015-12-01
People with alcohol and substance use disorders (AUDs/SUDs) show worse facial emotion recognition (FER) than controls, though magnitude and potential moderators remain unknown. The aim of this meta-analysis was to estimate the association between AUDs, SUDs and FER impairment. Electronic databases were searched through April 2015. Pooled analyses were based on standardized mean differences between index and control groups with 95% confidence intervals, weighting each study with random effects inverse variance models. Risk of publication bias and role of potential moderators, including task type, were explored. Nineteen of 70 studies assessed for eligibility met the inclusion criteria, comprising 1352 individuals, of whom 714 (53%) had AUDs or SUDs. The association between substance related disorders and FER performance showed an effect size of -0.67 (-0.95, -0.39), and -0.65 (-0.93, -0.37) for AUDs and SUDs, respectively. There was no publication bias and subgroup and sensitivity analyses based on potential moderators confirmed core results. Future longitudinal research should confirm these findings, clarifying the role of specific clinical issues of AUDs and SUDs. Copyright © 2015 Elsevier Ltd. All rights reserved.
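The pooled estimates above are inverse-variance weighted standardized mean differences under a random-effects model. The following sketch implements DerSimonian-Laird pooling on made-up study-level effects; it is one standard way to realize "random effects inverse variance" weighting and is not claimed to be the exact software or data the authors used.

import numpy as np

# Hypothetical study-level standardized mean differences (Hedges' g) and variances.
g = np.array([-0.9, -0.5, -0.7, -0.3, -0.8])
v = np.array([0.05, 0.08, 0.06, 0.10, 0.07])

# Fixed-effect weights, fixed-effect mean, and Cochran's Q for heterogeneity.
w_fe = 1 / v
mu_fe = np.sum(w_fe * g) / np.sum(w_fe)
q = np.sum(w_fe * (g - mu_fe) ** 2)
df = len(g) - 1

# DerSimonian-Laird estimate of the between-study variance tau^2.
c = np.sum(w_fe) - np.sum(w_fe ** 2) / np.sum(w_fe)
tau2 = max(0.0, (q - df) / c)

# Random-effects weights, pooled SMD and 95% CI.
w_re = 1 / (v + tau2)
pooled = np.sum(w_re * g) / np.sum(w_re)
se = np.sqrt(1 / np.sum(w_re))
print(f"pooled SMD = {pooled:.2f}, 95% CI ({pooled - 1.96 * se:.2f}, {pooled + 1.96 * se:.2f})")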
Urbonas, Vincas; Eidukaitė, Audronė; Tamulienė, Indrė
2012-03-01
Early diagnosis of bacteremia and sepsis in pediatric oncology patients with febrile neutropenia remains an unresolved task because of the lack of sensitive and specific laboratory markers, particularly at the beginning of the infectious process. The objective of our study was to assess the ability of interleukin-10 (IL-10) to predict or exclude bacteremia or sepsis at the onset of a febrile episode in childhood oncology patients. A total of 36 febrile neutropenic episodes in 24 children were studied. Serum samples were collected after confirmation of febrile neutropenia and analyzed using an automated random access analyzer. The sensitivity of IL-10 was 73% and the specificity 92% (cut-off = 18 pg/ml, area under the curve 0.87, 95% CI for sensitivity 39-94%, 95% CI for specificity 74-99%), with a negative predictive value (NPV) of 83%. Because of its high NPV and specificity, IL-10 evaluation might be used as an additional diagnostic tool for clinicians in excluding bacteremia or clinical sepsis in oncology patients with febrile neutropenia. Copyright © 2011 Elsevier Ltd. All rights reserved.
The effects of living environment on disaster workers: a one-year longitudinal study.
Nagamine, Masanori; Harada, Nahoko; Shigemura, Jun; Dobashi, Kosuke; Yoshiga, Makiko; Esaki, Naoki; Tanaka, Miyuki; Tanichi, Masaaki; Yoshino, Aihide; Shimizu, Kunio
2016-10-21
Defense Force workers engaged in disaster relief activities may suffer strong psychological stress from the tasks in which they are involved. We evaluated how living environments, work environments, and individual factors psychologically affect those engaged in disaster relief activities. Data from 1506 personnel engaged in the Great East Japan Earthquake relief activity were analyzed. Those who scored ≥25 points on the Impact of Event Scale-Revised and the Kessler Psychological Distress Scale (K10) were allocated to the high post-traumatic stress response (high-PTSR) group and the high general psychological distress (high-GPD) group, respectively. Multiple logistic regression analysis identified living environment (camping within the shelter sites) as a significant risk factor for both the high-PTSR (OR = 3.39, 95% CI 2.04-5.64, p < 0.001) and high-GPD (OR = 3.35, 95% CI 1.77-6.34, p < 0.001) groups. It is desirable for disaster workers to have a living environment in which they can keep an appropriate distance from the victims.
[Red meat intake among employees of floating population aged 18-59 years old in China, 2012].
Yin, Xiangjun; Wang, Limin; Li, Yichong; Zhang, Mei; Wang, Zhihui; Deng, Qian; Wang, Linhong
2014-11-01
To evaluate the level of daily red meat intake and the prevalence of excessive red meat intake among employed members of the floating population in China. A total of 48 511 employed floating-population members aged 18 to 59 years, from 170 counties in 31 provinces (autonomous regions and municipalities) and the Xinjiang Production and Construction Corps, were selected by a stratified cluster sampling method. Information on red meat intake was collected with a semi-quantitative food frequency questionnaire. An average intake of 100 g/day, recommended by the World Cancer Research Fund, was used as the cut-off point to estimate the prevalence of excessive red meat intake. After complex weighted analysis, the level of daily red meat intake and the prevalence of excessive intake were calculated by demographic characteristics including age, education, industry and body mass index. 1) The mean daily red meat intake was 125.9 g (95% CI: 116.5 g-132.5 g), higher in men (141.6 g, 95% CI: 131.3 g-148.9 g) than in women (104.7 g, 95% CI: 95.8 g-111.2 g) (P < 0.01). Trend tests did not show statistically significant changes in red meat intake related to age, education level or body mass index (P values for trend all greater than 0.05). The standardized mean daily intake of red meat, adjusted to the 2010 census data of China, was 121.0 g (95% CI: 113.4 g-128.7 g). 2) The prevalence of excessive red meat intake was 36.2% (95% CI: 33.0%-39.3%), significantly higher in males (42.4%, 95% CI: 38.9%-45.8%) than in females (27.8%, 95% CI: 27.1%-31.0%) (P < 0.01). The prevalence was highest among those aged 30-39 years, at 43.5% (95% CI: 39.7%-47.4%) in males and 30.1% (95% CI: 26.5%-33.9%) in females. The standardized prevalence, adjusted to the 2010 census data of China, was 34.6% (95% CI: 31.9%-38.0%). The level of daily red meat intake in the floating population of China exceeded 100 g/day, the standard recommended by the World Cancer Research Fund. Both the mean daily red meat intake and the prevalence of excessive red meat intake were higher in the floating population than in local residents in China.
Impact of USPSTF recommendations for aspirin for prevention of recurrent preeclampsia.
Tolcher, Mary Catherine; Chu, Derrick M; Hollier, Lisa M; Mastrobattista, Joan M; Racusin, Diana A; Ramin, Susan M; Sangi-Haghpeykar, Haleh; Aagaard, Kjersti M
2017-09-01
The US Preventive Services Task Force recommends low-dose aspirin for the prevention of preeclampsia among women at high risk for primary occurrence or recurrence of disease. Recommendations for the use of aspirin for preeclampsia prevention were issued by the US Preventive Services Task Force in September 2014. The objective of the study was to evaluate the incidence of recurrent preeclampsia in our cohort before and after the US Preventive Services Task Force recommendation for aspirin for preeclampsia prevention. This was a retrospective cohort study designed to evaluate the rates of recurrent preeclampsia among women with a history of preeclampsia. We utilized a 2-hospital, single academic institution database from August 2011 through June 2016. We excluded multiple gestations and included only the first delivery for women with multiple deliveries during the study period. The cohort of women with a history of preeclampsia were divided into 2 groups, before and after the release of the US Preventive Services Task Force 2014 recommendations. Potential confounders were accounted for in multivariate analyses, and relative risk and adjusted relative risk were calculated. A total of 17,256 deliveries occurred during the study period. A total of 417 women had a documented history of prior preeclampsia: 284 women before and 133 women after the US Preventive Services Task Force recommendation. Comparing the before and after groups, the proportion of Hispanic women in the after group was lower and the method of payment differed between the groups (P <.0001). The prevalence of type 1 diabetes was increased in the after period, but overall rates of pregestational diabetes were similar (6.3% before vs 5.3% after [P > .05]). Risk factors for recurrent preeclampsia included maternal age >35 years (relative risk, 1.83; 95% confidence interval, 1.34-2.48), Medicaid insurance (relative risk, 2.08; 95% confidence interval, 1.15-3.78), type 2 diabetes (relative risk, 2.13; 95% confidence interval, 1.37-3.33), and chronic hypertension (relative risk, 1.96; 95% confidence interval, 1.44-2.66). The risk of recurrent preeclampsia was decreased by 30% in the after group (adjusted relative risk, 0.70; 95% confidence interval, 0.52-0.95). Rates of recurrent preeclampsia among women with a history of preeclampsia decreased by 30% after release of the US Preventive Services Task Force recommendation for aspirin for preeclampsia prevention. Future prospective studies should include direct measures of aspirin compliance, gestational age at initiation, and explore the influence of race and ethnicity on the efficacy of this primary prevention. Copyright © 2017 Elsevier Inc. All rights reserved.
Frontal Hyperconnectivity Related to Discounting and Reversal Learning in Cocaine Subjects
Camchong, Jazmin; MacDonald, Angus W; Nelson, Brent; Bell, Christopher; Mueller, Bryon A; Specker, Sheila; Lim, Kelvin O
2011-01-01
BACKGROUND Functional neuroimaging studies suggest that chronic cocaine use is associated with frontal lobe abnormalities. Functional connectivity (FC) alterations of cocaine dependent individuals (CD), however, are not yet clear. This is the first study to our knowledge that examines resting FC of anterior cingulate cortex (ACC) in CD. Because ACC is known to integrate inputs from different brain regions to regulate behavior, we hypothesize that CD will have connectivity abnormalities in ACC networks. In addition, we hypothesized that abnormalities would be associated with poor performance in delayed discounting and reversal learning tasks. METHODS Resting functional magnetic resonance imaging data were collected to look for FC differences between twenty-seven cocaine dependent individuals (CD) (5 females, age: M=39.73, SD=6.14) and twenty-four controls (5 females, age: M=39.76, SD = 7.09). Participants were assessed with delayed discounting and reversal learning tasks. Using seed-based FC measures, we examined FC in CD and controls within five ACC connectivity networks with seeds in subgenual, caudal, dorsal, rostral, and perigenual ACC. RESULTS CD showed increased FC within the perigenual ACC network in left middle frontal gyrus, ACC and middle temporal gyrus when compared to controls. FC abnormalities were significantly positively correlated with task performance in delayed discounting and reversal learning tasks in CD. CONCLUSIONS The present study shows that participants with chronic cocaine-dependency have hyperconnectivity within an ACC network known to be involved in social processing and mentalizing. In addition, FC abnormalities found in CD were associated with difficulties with delay rewards and slower adaptive learning. PMID:21371689
Krage, Ralf; Zwaan, Laura; Tjon Soei Len, Lian; Kolenbrander, Mark W; van Groeningen, Dick; Loer, Stephan A; Wagner, Cordula; Schober, Patrick
2017-11-01
Non-technical skills, such as task management, leadership, situational awareness, communication and decision-making refer to cognitive, behavioural and social skills that contribute to safe and efficient team performance. The importance of these skills during cardiopulmonary resuscitation (CPR) is increasingly emphasised. Nonetheless, the relationship between non-technical skills and technical performance is poorly understood. We hypothesise that non-technical skills become increasingly important under stressful conditions when individuals are distracted from their tasks, and investigated the relationship between non-technical and technical skills under control conditions and when external stressors are present. In this simulator-based randomised cross-over study, 30 anaesthesiologists and anaesthesia residents from the VU University Medical Center, Amsterdam, the Netherlands, participated in two different CPR scenarios in random order. In one scenario, external stressors (radio noise and a distractive scripted family member) were added, while the other scenario without stressors served as control condition. Non-technical performance of the team leader and technical performance of the team were measured using the 'Anaesthetists' Non-technical Skill' score and a recently developed technical skills score. Analysis of variance and Pearson correlation coefficients were used for statistical analyses. Non-technical performance declined when external stressors were present (adjusted mean difference 3.9 points, 95% CI 2.4 to 5.5 points). A significant correlation between non-technical and technical performance scores was observed when external stressors were present (r=0.67, 95% CI 0.40 to 0.83, p<0.001), while no evidence for such a relationship was observed under control conditions (r=0.15, 95% CI -0.22 to 0.49, p=0.42). This was equally true for all individual domains of the non-technical performance score (task management, team working, situation awareness, decision-making). During CPR with external stressors, the team's technical performance is related to the non-technical skills of the team leader. This may have important implications for training of CPR teams. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
Strathdee, Steffanie A.; Shoveller, Jean; Rusch, Melanie; Kerr, Thomas; Tyndall, Mark W.
2009-01-01
Objectives. We investigated the relationship between environmental–structural factors and condom-use negotiation with clients among female sex workers. Methods. We used baseline data from a 2006 Vancouver, British Columbia, community-based cohort of female sex workers, to map the clustering of “hot spots” for being pressured into unprotected sexual intercourse by a client and assess sexual HIV risk. We used multivariate logistic modeling to estimate the relationship between environmental–structural factors and being pressured by a client into unprotected sexual intercourse. Results. In multivariate analyses, being pressured into having unprotected sexual intercourse was independently associated with having an individual zoning restriction (odds ratio [OR] = 3.39; 95% confidence interval [CI] = 1.00, 9.36), working away from main streets because of policing (OR = 3.01; 95% CI = 1.39, 7.44), borrowing a used crack pipe (OR = 2.51; 95% CI = 1.06, 2.49), client-perpetrated violence (OR = 2.08; 95% CI = 1.06, 4.49), and servicing clients in cars or in public spaces (OR = 2.00; 95% CI = 1.65, 5.73). Conclusions. Given growing global concern surrounding the failings of prohibitive sex-work legislation on sex workers' health and safety, there is urgent need for environmental–structural HIV-prevention efforts that facilitate sex workers' ability to negotiate condom use in safer sex-work environments and criminalize abuse by clients and third parties. PMID:19197086
Cumulative Impact of Comorbidity on Quality of Life in MS
Marrie, Ruth Ann; Horwitz, Ralph; Cutter, Gary; Tyry, Tuula
2011-01-01
Background Little is known about the impact of comorbidity on health-related quality of life (HRQOL) in multiple sclerosis (MS). We investigated the association between comorbidity and HRQOL among participants in the North American Research Committee on Multiple Sclerosis (NARCOMS). Materials & Methods In 2006, we queried NARCOMS participants regarding physical and mental comorbidities and HRQOL, using the Short Form-12. We summarized physical HRQOL using the aggregate Physical Component Scale (PCS-12) score, and mental HRQOL using the aggregate Mental Component Scale (MCS-12) score. We assessed multivariable associations between comorbidity and HRQOL using a general linear model, adjusting for potential confounders. Results Among 8983 respondents, the mean (SD) PCS-12 was 36.9 (11.8) and the mean MCS-12 was 45.6 (11.6). After adjustment for sociodemographic and clinical factors, participants with any physical comorbidity had a lower PCS-12 (37.2; 95% CI: 36.4-38.1) than those without any physical comorbidity (40.1; 95% CI: 39.0-41.1). As the number of physical comorbidities increased, PCS-12 scores decreased (r = -0.25; 95% CI: -0.27 to -0.23), indicating lower reported HRQOL. Participants with any mental comorbidity had a lower MCS-12 (40.7; 95% CI: 39.8-41.6) than those without any mental comorbidity (48.5; 95% CI: 47.7-49.4). Conclusions Comorbidity is associated with reduced HRQOL in MS. Further research should evaluate whether more aggressive treatment of comorbidities improves the HRQOL of MS patients. PMID:21615355
[Relationship between the expression of DDX39 protein and prognosis of colorectal cancer].
Ma, Jun; Chang, Wenjun; Zhang, Wei
2018-03-25
To investigate the relationship between the expression of DDX39 protein and prognosis in colorectal cancer. Clinical data and paraffin-embedded specimens of postoperative tumor tissue from 824 patients with primary colorectal cancer who underwent initial surgical treatment at the Department of Colorectal Surgery of Changhai Hospital of Navy Military Medical University between January 2010 and December 2011 were collected. Paraffin samples of paracancerous tissue from 38 of these patients served as controls. In addition, samples of normal rectal mucosa from 37 patients after the procedure for prolapse and hemorrhoids, and samples of colorectal adenoma from 33 patients after endoscopic treatment, were included. All specimens were assembled into a tissue microarray, and the expression of DDX39 protein was detected by immunohistochemistry. DDX39 expression in the epithelium and stroma was evaluated using the average staining intensity (H-score) and the number of positive cells, respectively. High expression in the epithelium was defined as an H-score of 200 or greater; high expression in the stroma was defined as 50 or more positive cells per 200× field of vision. The relationship of DDX39 expression levels with clinicopathological parameters and prognosis of colorectal cancer was analyzed. DDX39 expression in colorectal cancer tissue was lower than in normal, paracancerous and adenomatous tissues, in both the epithelium and the stroma [epithelium: normal tissues 253.2±64.1, paracancerous tissues 238.8±79.2, adenomatous tissues 259.4±51.6, colorectal cancer tissues 194.2±76.5 (P=0.000, P=0.005, P=0.000, respectively); stroma: normal tissues 110.1±64.8, paracancerous tissues 106.0±49.2, adenomatous tissues 108.5±79.1, colorectal cancer tissues 54.1±34.7 (all P=0.000)]. Among the colorectal cancer cases, 541 showed high and 283 low DDX39 expression in the epithelium, and 424 showed high and 400 low DDX39 expression in the stroma. Epithelial and stromal DDX39 expression levels were respectively associated with tumor location (P=0.006, P=0.016), degree of tumor differentiation (P=0.002, P=0.064), TNM stage (P=0.021, P=0.000), serum CEA level (P=0.003, P=0.005), serum CA199 level (P=0.040, P=0.005), and tumor recurrence and metastasis (P=0.000, P=0.000). All colorectal cancer cases were followed up for (41.6±15.7) months after operation. The 5-year overall survival (OS) and disease-free survival (DFS) rates of cases with low epithelial DDX39 expression were 84.1% and 61.5%, both significantly lower than in cases with high epithelial expression (95.4% and 88.2%; P=0.000, P=0.000). The 5-year OS and DFS rates of cases with low stromal DDX39 expression were 86.8% and 66.8%, both significantly lower than in cases with high stromal expression (96.1% and 90.6%; P=0.000, P=0.000).
Cox multivariate analysis showed that tumor differentiation (OS: HR=0.252, 95%CI: 0.128 to 0.497, P=0.000; DFS: HR=0.266, 95%CI: 0.134 to 0.530, P=0.000), DDX39 expression level in the epithelium (OS: HR=0.229, 95%CI: 0.138 to 0.382, P=0.000; DFS: HR=0.266, 95%CI: 0.158 to 0.446, P=0.000), and DDX39 expression level in the stroma (OS: HR=0.331, 95%CI: 0.188 to 0.582, P=0.000; DFS: HR=0.326, 95%CI: 0.184 to 0.578, P=0.000) were independent factors influencing overall and disease-free survival in patients with colorectal cancer. Low expression of DDX39 protein indicates poor prognosis, and DDX39 is expected to be a new prognostic marker for colorectal cancer.
Bayesian analysis of multimethod ego-depletion studies favours the null hypothesis.
Etherton, Joseph L; Osborne, Randall; Stephenson, Katelyn; Grace, Morgan; Jones, Chas; De Nadai, Alessandro S
2018-04-01
Ego-depletion refers to the purported decrease in performance on a task requiring self-control after engaging in a previous task involving self-control, with self-control proposed to be a limited resource. Despite many published studies consistent with this hypothesis, recurrent null findings within our laboratory and indications of publication bias have called into question the validity of the depletion effect. This project used three depletion protocols, each involving a different depleting initial task followed by a different self-control task as the dependent measure (total n = 840). For each method, effect sizes were not significantly different from zero. When data were aggregated across the three methods and examined meta-analytically, the pooled effect size was not significantly different from zero (for all priors evaluated, Hedges' g = 0.10 with a 95% credibility interval of [-0.05, 0.24]), and Bayes factors reflected strong support for the null hypothesis (Bayes factor > 25 for all priors evaluated). © 2018 The British Psychological Society.
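The conclusion above rests on Bayes factors weighing the null against a depletion effect. The abstract does not spell out the priors, so as a rough, hedged illustration the snippet below computes the BIC-approximation Bayes factor for a two-group comparison on simulated null data; this only approximates the kind of evidence ratio reported and is not the authors' analysis or priors.

import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Hypothetical depletion vs control scores with no true difference.
depleted = rng.normal(0.0, 1.0, 140)
control = rng.normal(0.0, 1.0, 140)

t = stats.ttest_ind(depleted, control).statistic
n = len(depleted) + len(control)
df = n - 2

# BIC approximation for a two-mean model vs a single-mean model:
# BF01 = sqrt(n) * (1 + t^2/df)^(-n/2)  (evidence for the null).
bf01 = np.sqrt(n) * (1 + t ** 2 / df) ** (-n / 2)
print(f"t = {t:.2f}, approximate BF01 = {bf01:.1f}")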
Real-Time Non-Intrusive Assessment of Viewing Distance during Computer Use.
Argilés, Marc; Cardona, Genís; Pérez-Cabré, Elisabet; Pérez-Magrané, Ramon; Morcego, Bernardo; Gispets, Joan
2016-12-01
To develop and test the sensitivity of an ultrasound-based sensor to assess the viewing distance of visual display terminals operators in real-time conditions. A modified ultrasound sensor was attached to a computer display to assess viewing distance in real time. Sensor functionality was tested on a sample of 20 healthy participants while they conducted four 10-minute randomly presented typical computer tasks (a match-three puzzle game, a video documentary, a task requiring participants to complete a series of sentences, and a predefined internet search). The ultrasound sensor offered good measurement repeatability. Game, text completion, and web search tasks were conducted at shorter viewing distances (54.4 cm [95% CI 51.3-57.5 cm], 54.5 cm [95% CI 51.1-58.0 cm], and 54.5 cm [95% CI 51.4-57.7 cm], respectively) than the video task (62.3 cm [95% CI 58.9-65.7 cm]). Statistically significant differences were found between the video task and the other three tasks (all p < 0.05). Range of viewing distances (from 22 to 27 cm) was similar for all tasks (F = 0.996; p = 0.413). Real-time assessment of the viewing distance of computer users with a non-intrusive ultrasonic device disclosed a task-dependent pattern.
Dose-response study of spinal hyperbaric ropivacaine for cesarean section
Chen, Xin-zhong; Chen, Hong; Lou, Ai-fei; Lü, Chang-cheng
2006-01-01
Background: Spinal hyperbaric ropivacaine may produce more predictable and reliable anesthesia than plain ropivacaine for cesarean section. The dose-response relation for spinal hyperbaric ropivacaine is undetermined. This double-blind, randomized, dose-response study determined the ED50 (50% effective dose) and ED95 (95% effective dose) of spinal hyperbaric ropivacaine for cesarean section anesthesia. Methods: Sixty parturients undergoing elective cesarean section delivery with use of combined spinal-epidural anesthesia were enrolled in this study. An epidural catheter was placed at the L1~L2 vertebral interspace, then lumbar puncture was performed at the L3~L4 vertebral interspace, and parturients were randomized to receive spinal hyperbaric ropivacaine in doses of 10.5 mg, 12 mg, 13.5 mg, or 15 mg in equal volumes of 3 ml. Sensory levels (pinprick) were assessed every 2.5 min until a T7 level was achieved and motor changes were assessed by modified Bromage Score. A dose was considered effective if an upper sensory level to pin prick of T7 or above was achieved and no intraoperative epidural supplement was required. ED50 and ED95 were determined with use of a logistic regression model. Results: ED50 (95% confidence interval) of spinal hyperbaric ropivacaine was determined to be 10.37 (5.23~11.59) mg and ED95 (95% confidence interval) to be 15.39 (13.81~23.59) mg. The maximum sensory block levels and the duration of motor block and the rate of hypotension, but not onset of anesthesia, were significantly related to the ropivacaine dose. Conclusion: The ED50 and ED95 of spinal hyperbaric ropivacaine for cesarean delivery under the conditions of this study were 10.37 mg and 15.39 mg, respectively. Ropivacaine is suitable for spinal anesthesia in cesarean delivery. PMID:17111469
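ED50 and ED95 in this design come from a logistic regression of block success on dose: on the logit scale the success probability is b0 + b1*dose, so ED_p = (logit(p) - b0)/b1. The sketch below fits that model to illustrative per-dose outcomes (the counts are invented, not the trial data) and backs out ED50 and ED95.

import numpy as np
import statsmodels.api as sm

# Hypothetical per-parturient outcomes at the four study doses (1 = effective block).
doses = np.repeat([10.5, 12.0, 13.5, 15.0], 15)
success = np.concatenate([
    np.repeat([1, 0], [6, 9]),    # 6/15 effective blocks at 10.5 mg
    np.repeat([1, 0], [9, 6]),    # 9/15 at 12 mg
    np.repeat([1, 0], [12, 3]),   # 12/15 at 13.5 mg
    np.repeat([1, 0], [14, 1]),   # 14/15 at 15 mg
])

X = sm.add_constant(doses)
fit = sm.Logit(success, X).fit(disp=0)
b0, b1 = fit.params

def ed(p):
    # Dose at which the modeled probability of an effective block equals p.
    return (np.log(p / (1 - p)) - b0) / b1

print(f"ED50 = {ed(0.50):.2f} mg, ED95 = {ed(0.95):.2f} mg")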
Loke, Yoon K; Pradhan, Shiva; Yeong, Jessica Ka-yan; Kwok, Chun Shing
2014-01-01
Aims There are concerns regarding increased risk of acute coronary syndrome with dabigatran. We aimed to assess whether alternative treatment options such as rivaroxaban or apixaban carry a similar risk as compared with dabigatran. Methods We searched MEDLINE and EMBASE for randomized controlled trials of apixaban, dabigatran or rivaroxaban against control (placebo, heparin or vitamin K antagonist). We pooled odds ratios (OR) for adverse coronary events (acute coronary syndrome or myocardial infarction) using fixed effect meta-analysis and assessed heterogeneity with I2. We conducted adjusted indirect comparisons to compare risk of adverse coronary events with apixaban or rivaroxaban vs. dabigatran. Results Twenty-seven randomized controlled trials met the inclusion criteria. Dabigatran was associated with a significantly increased risk of adverse coronary events in pooled analysis of nine trials (OR 1.45, 95% CI 1.14, 1.86). There was no signal for coronary risk with apixaban from nine trials (pooled OR 0.89, 95% CI 0.78, 1.03) or rivaroxaban from nine trials (pooled OR 0.81, 95% CI 0.72, 0.93). Overall, adjusted indirect comparison suggested that both apixaban (OR 0.61, 95% CI 0.44, 0.85) and rivaroxaban (OR 0.54; 95% CI 0.39, 0.76) were associated with lower coronary risk than dabigatran. Restricting the indirect comparison to a vitamin K antagonist as a common control, yielded similar findings, OR 0.57 (95% CI 0.39, 0.85) for apixaban vs. dabigatran and 0.53 (95% CI 0.37, 0.77) for rivaroxaban vs. dabigatran. Conclusions There are significant differences in the comparative safety of apixaban, rivaroxaban and dabigatran with regards to acute coronary adverse events. PMID:24617578
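The adjusted indirect comparison described here follows the Bucher approach: on the log scale, log OR(apixaban vs dabigatran) = log OR(apixaban vs control) - log OR(dabigatran vs control), with the variances summed. Applying that formula to the pooled estimates quoted in the abstract approximately reproduces the reported OR of 0.61; the confidence limits differ slightly because the published inputs are rounded.

import numpy as np

def log_or_and_se(or_, lo, hi):
    # Recover the log OR and its standard error from a point estimate and 95% CI.
    return np.log(or_), (np.log(hi) - np.log(lo)) / (2 * 1.96)

# Pooled estimates vs common control, as quoted in the abstract.
apix, se_apix = log_or_and_se(0.89, 0.78, 1.03)
dabi, se_dabi = log_or_and_se(1.45, 1.14, 1.86)

# Bucher adjusted indirect comparison: apixaban vs dabigatran via the common control.
diff = apix - dabi
se = np.sqrt(se_apix ** 2 + se_dabi ** 2)
or_ind = np.exp(diff)
ci = np.exp([diff - 1.96 * se, diff + 1.96 * se])
print(f"indirect OR = {or_ind:.2f}, 95% CI ({ci[0]:.2f}, {ci[1]:.2f})")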
Fenton, Joshua J.; Cai, Yong; Weiss, Noel S.; Elmore, Joann G.; Pardee, Roy E.; Reid, Robert J.; Baldwin, Laura-Mae
2012-01-01
Background Patients and physicians strongly endorse the importance of preventive or periodic health examinations (PHEs). However, the extent to which PHEs contribute to the delivery of cancer screening is uncertain. Methods In a retrospective cohort study, we determined the association between receipt of a PHE and cancer testing in a population-based sample of enrollees in a Washington State health plan who were aged 52 to 78 years and eligible for colorectal, breast, or prostate cancer screening in 2002–2003 (N = 64 288). Outcomes included completion of any colorectal cancer testing (fecal occult blood testing, sigmoidoscopy, colonoscopy, or barium enema), screening mammography, and prostate-specific antigen testing. Results More than half (52.4%) of the enrollees received a PHE during the study period. After adjusting for demographics, comorbidity, number of outpatient visits, and historical preventive service use before January 1, 2002, receipt of a PHE was significantly associated with completion of colorectal cancer testing (incidence difference, 40.4% [95% confidence interval (CI), 39.4%–41.3%]; relative incidence, 3.47 [95% CI, 3.34–3.59]), screening mammography [incidence difference, 14.2% [95% CI, 12.7%–15.7%]; relative incidence, 1.23 [95% CI, 1.20–1.25]), and prostate-specific antigen testing (incidence difference, 39.4% [95% CI, 38.3%–40.5%]; relative incidence, 3.06 [95% CI, 2.95–3.18]). Conclusions Among managed care enrollees eligible for cancer screening, PHE receipt is associated with completion of colorectal, breast, and prostate cancer testing. In similar populations, the PHE may serve as a clinically important forum for the promotion of evidence-based colorectal cancer and breast cancer screening and of screening with relatively less empirical support, such as prostate cancer screening. PMID:17389289
Genoni, Angela; Lyons-Wall, Philippa; Lo, Johnny; Devine, Amanda
2016-01-01
(1) Background: The Paleolithic diet is popular in Australia, however, limited literature surrounds the dietary pattern. Our primary aim was to compare the Paleolithic diet with the Australian Guide to Healthy Eating (AGHE) in terms of anthropometric, metabolic and cardiovascular risk factors, with a secondary aim to examine the macro and micronutrient composition of both dietary patterns; (2) Methods: 39 healthy women (mean ± SD age 47 ± 13 years, BMI 27 ± 4 kg/m2) were randomised to either the Paleolithic (n = 22) or AGHE diet (n = 17) for four weeks. Three-day weighed food records, body composition and biochemistry data were collected pre and post intervention; (3) Results: Significantly greater weight loss occurred in the Paleolithic group (−1.99 kg, 95% CI −2.9, −1.0), p < 0.001). There were no differences in cardiovascular and metabolic markers between groups. The Paleolithic group had lower intakes of carbohydrate (−14.63% of energy (E), 95% CI −19.5, −9.7), sodium (−1055 mg/day, 95% CI −1593, −518), calcium (−292 mg/day 95% CI −486.0, −99.0) and iodine (−47.9 μg/day, 95% CI −79.2, −16.5) and higher intakes of fat (9.39% of E, 95% CI 3.7, 15.1) and β-carotene (6777 μg/day 95% CI 2144, 11410) (all p < 0.01); (4) Conclusions: The Paleolithic diet induced greater changes in body composition over the short-term intervention, however, larger studies are recommended to assess the impact of the Paleolithic vs. AGHE diets on metabolic and cardiovascular risk factors in healthy populations. PMID:27223304
Lifetime Risk of Symptomatic Hand Osteoarthritis: The Johnston County Osteoarthritis Project
Qin, Jin; Barbour, Kamil E.; Murphy, Louise B.; Nelson, Amanda E.; Schwartz, Todd A.; Helmick, Charles G.; Allen, Kelli D.; Renner, Jordan B.; Baker, Nancy A; Jordan, Joanne M.
2017-01-01
Objective Symptomatic hand osteoarthritis (SHOA) is a common condition that affects hand strength and function, and causes disability in activities of daily living. Prior studies have estimated lifetime risk for symptomatic knee and hip osteoarthritis to be 45% and 25%, respectively. The objective of this study is to estimate overall lifetime risk for SHOA and lifetime risk stratified by potential risk factors. Methods We analyzed data for 2,218 adults ≥ 45 years in the Johnston County Osteoarthritis Project, a population-based prospective cohort study in residents of Johnston County, North Carolina. Data were collected in two cycles (1999–2004 and 2005–2010). SHOA was defined as having both self-reported symptoms and radiographic OA in the same hand. Lifetime risk, defined as the proportion of the population who will develop SHOA in at least one hand by age 85, was estimated from models using generalized estimating equations methodology. Results Overall, the lifetime risk of SHOA is 39.8% (95% confidence interval (CI): 34.4, 45.3). Nearly one in two women (47.2%; 95% CI: 40.6, 53.9) will develop SHOA by age 85 compared with one in four men (24.6%; 95% CI: 19.5, 30.5). Race-specific estimates are 41.4% (95% CI: 35.5, 47.6) among whites and 29.2% (95% CI: 20.5, 39.7) among blacks. Lifetime risk among individuals with obesity (47.1%, 95% CI: 37.8, 56.7) is 11 percentage points higher than among those without obesity (36.1%, 95% CI: 29.7, 42.9). Conclusion These findings demonstrate the substantial burden of SHOA overall and in subgroups. Increased use of public health and clinical interventions is needed to address its impact. PMID:28470947
ERIC Educational Resources Information Center
Bozdogan, Aykut Emre; Günaydin, Esra; Okur, Alperen
2014-01-01
The aim of the study is to explore the academic achievement and performance tasks of students studying in a regional primary boarding school in the science course with regard to different variables. The study was carried out via the survey method with a total of 96 students, 57 of them boarding students and 39 of them non-boarding students studying in the 5th,…
Melamed, Nir; Ray, Joel G; Geary, Michael; Bedard, Daniel; Yang, Cathy; Sprague, Ann; Murray-Davis, Beth; Barrett, Jon; Berger, Howard
2016-03-01
In women with gestational diabetes mellitus, it is not clear whether routine induction of labor at <40 weeks of gestation is beneficial to mother and newborn infant. The purpose of this study was to compare outcomes among women with gestational diabetes mellitus who had induction of labor at either 38 or 39 weeks with those whose pregnancy was managed expectantly. We included all women in Ontario, Canada, with diagnosed gestational diabetes mellitus who had a singleton hospital birth at ≥38 + 0 weeks of gestation between April 2012 and March 2014. Data were obtained from the Better Outcomes Registry & Network Ontario, which is a province-wide registry of all births in Ontario, Canada. Women who underwent induction of labor at 38 + 0 to 38 + 6 weeks of gestation (38-IOL; n = 1188) were compared with those who remained undelivered until 39 + 0 weeks of gestation (38-Expectant; n = 5229). Separately, those women who underwent induction of labor at 39 + 0 to 39 + 6 weeks of gestation (39-IOL; n = 1036) were compared with women who remained undelivered until 40 + 0 weeks of gestation (39-Expectant; n = 2162). Odds ratios and 95% confidence intervals were adjusted for maternal age, parity, insulin treatment, and prepregnancy body mass index. Of 281,480 women who gave birth during the study period, 14,600 women (5.2%) had gestational diabetes mellitus; of these, 8392 women (57.5%) met all inclusion criteria. Compared with the 38-Expectant group, those women in the 38-IOL group had lower odds for cesarean delivery (adjusted odds ratio, 0.73; 95% confidence interval, 0.52-0.90), higher odds for neonatal intensive care unit admission (adjusted odds ratio, 1.36; 95% confidence interval, 1.09-1.69), and no difference in other maternal-newborn infant outcomes. Compared with the 39-Expectant group, women in the 39-IOL group likewise had lower odds for cesarean delivery (adjusted odds ratio, 0.73; 95% confidence interval, 0.58-0.93) but no difference in neonatal intensive care unit admission (adjusted odds ratio, 0.83; 95% confidence interval, 0.61-1.11). In women with gestational diabetes mellitus, the routine induction of labor at 38 or 39 weeks is associated with a lower risk of cesarean delivery compared with expectant management but may increase the risk of neonatal intensive care unit admission when done at <39 weeks of gestation. Copyright © 2016 Elsevier Inc. All rights reserved.
A putative placebo analysis of the effects of LCZ696 on clinical outcomes in heart failure
McMurray, John; Packer, Milton; Desai, Akshay; Gong, Jianjian; Greenlaw, Nicola; Lefkowitz, Martin; Rizkala, Adel; Shi, Victor; Rouleau, Jean; Solomon, Scott; Swedberg, Karl; Zile, Michael R.; Andersen, Karl; Arango, Juan Luis; Arnold, Malcolm; Be˘lohlávek, Jan; Böhm, Michael; Boytsov, Sergey; Burgess, Lesley; Cabrera, Walter; Chen, Chen-Huan; Erglis, Andrejs; Fu, Michael; Gomez, Efrain; Gonzalez, Angel; Hagege, Albert-Alain; Katova, Tzvetana; Kiatchoosakun, Songsak; Kim, Kee-Sik; Bayram, Edmundo; Martinez, Felipe; Merkely, Bela; Mendoza, Iván; Mosterd, Arend; Negrusz-Kawecka, Marta; Peuhkurinen, Keijo; Ramires, Felix; Refsgaard, Jens; Senni, Michele; Sibulo, Antonio S.; Silva-Cardoso, José; Squire, Iain; Starling, Randall C.; Vinereanu, Dragos; Teerlink, John R.; Wong, Raymond
2015-01-01
Aims Although active-controlled trials with renin–angiotensin inhibitors are ethically mandated in heart failure with reduced ejection fraction, clinicians and regulators often want to know how the experimental therapy would perform compared with placebo. The angiotensin receptor-neprilysin inhibitor LCZ696 was compared with enalapril in PARADIGM-HF. We made indirect comparisons of the effects of LCZ696 with putative placebos. Methods and results We used the treatment-arm of the Studies Of Left Ventricular Dysfunction (SOLVD-T) as the reference trial for comparison of an ACE inhibitor to placebo and the Candesartan in Heart failure: Assessment of Reduction in Mortality and morbidity-Alternative trial (CHARM-Alternative) as the reference trial for comparison of an ARB to placebo. The hazard ratio of LCZ696 vs. a putative placebo was estimated through the product of the hazard ratio of LCZ696 vs. enalapril (active-control) and that of the historical active-control (enalapril or candesartan) vs. placebo. For the primary composite outcome of cardiovascular death or heart failure hospitalization in PARADIGM-HF, the relative risk reduction with LCZ696 vs. a putative placebo from SOLVD-T was 43% (95%CI 34–50%; P < 0.0001) with similarly large effects on cardiovascular death (34%, 21–44%; P < 0.0001) and heart failure hospitalization (49%, 39–58%; P < 0.0001). For all-cause mortality, the reduction compared with a putative placebo was 28% (95%CI 15–39%; P < 0.0001). Putative placebo analyses based on CHARM-Alternative gave relative risk reductions of 39% (95%CI 27–48%; P < 0.0001) for the composite outcome of cardiovascular death or heart failure hospitalization, 32% (95%CI 16–45%; P < 0.0001) for cardiovascular death, 46% (33–56%; P < 0.0001) for heart failure hospitalization, and 26% (95%CI 11–39%; P < 0.0001) for all-cause mortality. Conclusion These indirect comparisons of LCZ696 with a putative placebo show that the strategy of combined angiotensin receptor blockade and neprilysin inhibition led to striking reductions in cardiovascular and all-cause mortality, as well as heart failure hospitalization. These benefits were obtained even though LCZ696 was added to comprehensive background beta-blocker and mineralocorticoid receptor antagonist therapy. PMID:25416329
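The putative placebo estimate is the product of two hazard ratios on the log scale: HR(LCZ696 vs placebo) ≈ HR(LCZ696 vs enalapril) x HR(enalapril vs placebo in the reference trial), with the log-scale variances added to form the confidence interval. The sketch below shows that arithmetic with illustrative placeholder inputs, not the actual PARADIGM-HF or SOLVD-T estimates.

import numpy as np

def log_hr_and_se(hr, lo, hi):
    # Log hazard ratio and standard error from a point estimate and 95% CI.
    return np.log(hr), (np.log(hi) - np.log(lo)) / (2 * 1.96)

# Illustrative placeholders, NOT the published trial estimates.
new_vs_active, se1 = log_hr_and_se(0.80, 0.73, 0.87)      # experimental drug vs active control
active_vs_placebo, se2 = log_hr_and_se(0.72, 0.63, 0.83)  # active control vs placebo (reference trial)

# Putative placebo comparison: product of HRs = sum of log HRs.
log_hr = new_vs_active + active_vs_placebo
se = np.sqrt(se1 ** 2 + se2 ** 2)
hr = np.exp(log_hr)
ci = np.exp([log_hr - 1.96 * se, log_hr + 1.96 * se])
rrr = (1 - hr) * 100
print(f"HR vs putative placebo = {hr:.2f} ({ci[0]:.2f}-{ci[1]:.2f}); relative risk reduction ~{rrr:.0f}%")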
Chapman, Julia Louise; Cayanan, Elizabeth Anne; Hoyos, Camilla Miranda; Serinel, Yasmina; Comas, Maria; Yee, Brendon John; Wong, Keith Keat Huat; Grunstein, Ronald Robert; Marshall, Nathaniel Stuart
2018-05-18
RATIONALE Obstructive sleep apnea (OSA) patients unable to tolerate standard treatments have few alternatives. They may benefit from weight loss, but the major symptom of daytime performance impairment may persist during weight loss programs. OBJECTIVES We hypothesized that the wakefulness promoter armodafinil would improve driving task performance over placebo in patients undergoing weight loss. METHODS Placebo-controlled, double-blind, randomized trial of daily armodafinil versus placebo for six months in overweight adult OSA patients who had rejected standard treatment and suffered daytime sleepiness; participants were also randomized to one of two diets for six months, with follow-up at one year. MEASUREMENTS Primary outcome: change in steering deviation in the final 30 minutes of a 90-minute afternoon driving task (AusED) at six months. Secondary measures included the Epworth Sleepiness Scale, the Functional Outcomes of Sleep Questionnaire, and fat mass measured by dual-energy X-ray absorptiometry (DXA). MAIN RESULTS Armodafinil improved driving task performance over placebo at three months (12.9 cm, 95% CI 4.1 to 21.7, p=0.004), but not at the primary timepoint of six months (5.5 cm, 95% CI -3.3 to 14.3, p=0.223). Patients on armodafinil lost 2.4 kg more fat than those on placebo at six months (95% CI 0.9 to 4.0, p=0.002). Other secondary outcomes were not significantly improved. CONCLUSIONS Armodafinil did not improve driving task performance at the primary endpoint of six months. Armodafinil might be a useful adjunct to weight loss in OSA patients rejecting conventional treatments, but this needs to be directly tested in a specifically designed, properly powered clinical trial. Clinical trial registration available at www.anzctr.org.au, ID ACTRN12611000847910.
Tweya, Hannock; Feldacker, Caryl; Heller, Tom; Gugsa, Salem; Ng’ambi, Wingston; Nthala, Omisher; Kalulu, Mike; Chiwoko, Jane; Banda, Rabecca; Makwinja, Agness; Phiri, Sam
2017-01-01
Objective To estimate patients enrolling on antiretroviral therapy (ART) over time; describe trends in baseline characteristics; and compare immunological response, loss to follow-up (LTFU), and mortality by three age groups (25–39, 40–49 and ≥50 years). Design A retrospective observation cohort study. Methods This study used routine ART data from two public clinics in Lilongwe, Malawi. All HIV-infected individuals, except pregnant or breastfeeding women, aged ≥ 25 years at ART initiation between 2006 and 2015 were included. Poisson regression models estimated risk of mortality, stratified by age groups. Results Of 37,378 ART patients, 3,406 were ≥ 50 years old. Patients aged ≥ 50 years initiated ART with more advanced WHO clinical stage and lower CD4 cell count than their younger counterparts. Older patients had a significantly slower immunological response to ART in the first 18 months on ART compared to patients aged 25–39 years (p = 0.04). Overall mortality rates were 2.3 (95% confidence Interval (CI) 2.2–2.4), 2.9 (95% CI 2.7–3.2) and 4.6 (95% CI 4.2–5.1) per 100 person-years in patients aged 25–39 years, 40–49 years and 50 years and older, respectively. Overall LTFU rates were 6.3 (95% CI 6.1–6.5), 4.5 (95% CI 4.2–4.7), and 5.6 (95% CI 5.1–6.1) per 100 person years among increasing age cohorts. The proportion of patients aged ≥ 50 years and newly enrolling into ART care remained stable at 9% while the proportion of active ART patients aged ≥50 years increased from 10% in 2006 to 15% in 2015. Conclusion Older people had slower immunological response and higher mortality. Malawi appears to be undergoing a demographic shift in people living with HIV. Increased consideration of long-term ART-related problems, drug-drug interactions and age-related non-communicable diseases is warranted. PMID:28686636
A Comparison of Second and Third Generations Combined Oral Contraceptive Pills’ Effect on Mood
Shahnazi, Mahnaz; Farshbaf Khalili, Azizeh; Ranjbar Kochaksaraei, Fatemeh; Asghari Jafarabadi, Mohammad; Gaza Banoi, Kamal; Nahaee, Jila; Bayati Payan, Somayeh
2014-01-01
Background: Most women taking combined oral contraceptives (COCs) are satisfied with their contraceptive method. However, one of the most common reasons reported for discontinuation of COCs is mood deterioration. Objectives: This study aimed to compare the effects of second and third generation oral contraceptive pills on the mood of women of reproductive age. Materials and Methods: This randomized, double-blind, controlled clinical trial was conducted in women of reproductive age at health centers in Tehran, Iran. Participants were randomized to second or third generation oral contraceptive groups. Positive and negative mood were recorded using the Positive Affect Negative Affect Scale (PANAS) at the end of the second and fourth months of the study. Data analysis was carried out using ANOVA, and P values < 0.05 were considered significant. Results: A statistically significant difference was seen in positive and negative mood changes in women receiving contraceptive pills. The second generation pills resulted in a decrease in positive mood (95% CI: 43.39 to 38.32 in the second month and 43.39 to 26.05 in the fourth month) and an increase in negative mood (95% CI: 14.23 to 22.04 in the second month and 14.23 to 32.26 in the fourth month; P < 0.001), whereas the third generation pills led to an increase in positive mood (95% CI: 22.42 to 25.60 in the second month and 22.42 to 33.87 in the fourth month) and a decrease in negative mood (95% CI: 36.78 to 31.97 in the second month and 36.78 to 22.65 in the fourth month; P < 0.001). Conclusions: Third generation combined oral contraceptive pills have a better effect on mood in women of reproductive age than second generation pills and can be recommended as a suitable combined oral contraceptive in Iran. PMID:25389478
Mulhall, Brian P; Wright, Stephen T; De La Mata, Nicole; Allen, Debbie; Brown, Katherine; Dickson, Bridget; Grotowski, Miriam; Jackson, Eva; Petoumenos, Kathy; Foster, Rosalind; Read, Timothy; Russell, Darren; Smith, David J; Templeton, David J; Fairley, Christopher K; Law, Matthew G
2016-01-01
Background We established a sub-cohort of HIV-positive individuals from ten sexual health clinics within the Australian HIV Observational Database (AHOD). The aim of this paper was to assess demographic and other factors that might be associated with an incident sexually transmitted infection (STI). Methods The cohort follow-up was from March 2010 to March 2013, and included patients screened at least once for an STI. We used survival methods to determine time to first new and confirmed incident STI (chlamydia, gonorrhoea, syphilis or genital warts). Factors evaluated included sex, age, mode of HIV exposure, year of AHOD enrolment, hepatitis B or C coinfection, time-updated CD4 cell count, time-updated HIV RNA viral load, and prior STI diagnosis. Results There were 110 first incident STI diagnoses observed over 1015 person-years of follow-up, a crude rate of 10.8 (95% confidence interval [CI]: 9.0-13.0) per 100 person-years. Factors independently associated with increased risk of incident STI included younger age (aHR for ≥50 vs 30-39 years old = 0.4; 95% CI: 0.2-0.8, p<0.0001), prior STI infection (adjusted hazard ratio [aHR]=2.5; 95% CI: 1.6-3.8, p<0.001), and MSM rather than heterosexual contact as likely exposure (aHR for heterosexual vs men who have sex with men [MSM] = 0.2; 95% CI: 0.1-0.6; p<0.001). Conclusions In this cohort of individuals being treated with antiretroviral drugs (ARV), those who are MSM, 30-39 years old, and with a prior history of STI are at highest risk of a further STI diagnosis. PMID:27019207
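The crude incidence rate quoted above is simply events divided by person-years, with confidence limits obtainable from exact Poisson (Garwood) bounds on the event count. Using the abstract's totals (110 first incident STIs over 1015 person-years), the sketch below approximately reproduces the reported 10.8 (9.0-13.0) per 100 person-years; small differences reflect rounding and whichever interval method the authors actually used.

import numpy as np
from scipy.stats import chi2

events, person_years = 110, 1015.0

rate = events / person_years * 100  # per 100 person-years

# Exact Poisson (Garwood) 95% limits on the count, scaled to person-years.
lower = chi2.ppf(0.025, 2 * events) / 2 / person_years * 100
upper = chi2.ppf(0.975, 2 * (events + 1)) / 2 / person_years * 100
print(f"crude rate = {rate:.1f} (95% CI {lower:.1f}-{upper:.1f}) per 100 person-years")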
Balekouzou, Augustin; Yin, Ping; Afewerky, Henok Kessete; Bekolo, Cavin; Pamatika, Christian Maucler; Nambei, Sylvain Wilfrid; Djeintote, Marceline; Doui Doumgba, Antoine; Mossoro-Kpinde, Christian Diamont; Shu, Chang; Yin, Minghui; Fu, Zhen; Qing, Tingting; Yan, Mingming; Zhang, Jianyuan; Chen, Shaojun; Li, Hongyu; Xu, Zhongyu; Koffi, Boniface
2017-01-01
Breast cancer is recognized as a major public health problem in developing countries; however, there is very little evidence on behavioral factors associated with breast cancer risk. This study was conducted to identify lifestyle risk factors for breast cancer among Central African women. A case-control study was conducted with 174 cases confirmed histologically by the pathology unit of the National Laboratory and 348 age-matched controls. Data collection tools included an interviewer-administered questionnaire and patients' medical records. Data were analyzed using SPSS software version 20. Odds ratios (OR) and 95% confidence intervals (95% CI) were obtained by unconditional logistic regression. In total, 522 women were studied, with a mean age of 45.8 (SD = 13.4) years. In the unconditional logistic regression model, women with breast cancer were more likely to be illiterate or to have only elementary education [OR 11.23 (95% CI, 4.65-27.14) and 2.40 (95% CI, 1.15-4.99)], to be married [2.09 (95% CI, 1.18-3.71)], and to report a positive family history [2.31 (95% CI, 1.36-3.91)], radiation exposure [8.21 (95% CI, 5.04-13.38)], charcuterie consumption [10.82 (95% CI, 2.39-48.90)], fresh fish consumption [4.26 (95% CI, 1.56-11.65)], groundnut consumption [6.46 (95% CI, 2.57-16.27)], soybean consumption [16.74 (95% CI, 8.03-39.84)], alcohol use [2.53 (95% CI, 1.39-4.60)], the habit of keeping money in the bra [3.57 (95% CI, 2.24-5.69)], overweight [5.36 (95% CI, 4.46-24.57)] and obesity [3.11 (95% CI, 2.39-20.42)]. Decreased risk of breast cancer was associated with being employed [0.32 (95% CI, 0.19-0.56)], urban residence [0.16 (95% CI, 0.07-0.37)], groundnut oil consumption [0.05 (95% CI, 0.02-0.14)], wine consumption [0.16 (95% CI, 0.09-0.26)], not keeping a cell phone in the bra [0.56 (95% CI, 0.35-0.89)] and physical activity [0.71 (95% CI, 0.14-0.84)]. The study showed that little or no education, marriage, positive family history of cancer, radiation exposure, charcuterie, fresh fish, groundnut, soybean, alcohol, the habit of keeping money in the bra, overweight and obesity were associated with breast cancer risk among Central African women living in Bangui. Women living in Bangui should be made more aware of the behavioral risks associated with breast cancer.
Streicker, Daniel G.; Fischer, Justin W.; VerCauteren, Kurt C.; Gilbert, Amy T.
2017-01-01
Background Prevention and control of wildlife disease invasions relies on the ability to predict spatio-temporal dynamics and understand the role of factors driving spread rates, such as seasonality and transmission distance. Passive disease surveillance (i.e., case reports by public) is a common method of monitoring emergence of wildlife diseases, but can be challenging to interpret due to spatial biases and limitations in data quantity and quality. Methodology/Principal findings We obtained passive rabies surveillance data from dead striped skunks (Mephitis mephitis) in an epizootic in northern Colorado, USA. We developed a dynamic patch-occupancy model which predicts spatio-temporal spreading while accounting for heterogeneous sampling. We estimated the distance travelled per transmission event, direction of invasion, rate of spatial spread, and effects of infection density and season. We also estimated mean transmission distance and rates of spatial spread using a phylogeographic approach on a subsample of viral sequences from the same epizootic. Both the occupancy and phylogeographic approaches predicted similar rates of spatio-temporal spread. Estimated mean transmission distances were 2.3 km (95% Highest Posterior Density (HPD95): 0.02, 11.9; phylogeographic) and 3.9 km (95% credible intervals (CI95): 1.4, 11.3; occupancy). Estimated rates of spatial spread in km/year were: 29.8 (HPD95: 20.8, 39.8; phylogeographic, branch velocity, homogenous model), 22.6 (HPD95: 15.3, 29.7; phylogeographic, diffusion rate, homogenous model) and 21.1 (CI95: 16.7, 25.5; occupancy). Initial colonization probability was twice as high in spring relative to fall. Conclusions/Significance Skunk-to-skunk transmission was primarily local (< 4 km) suggesting that if interventions were needed, they could be applied at the wave front. Slower viral invasions of skunk rabies in western USA compared to a similar epizootic in raccoons in the eastern USA implies host species or landscape factors underlie the dynamics of rabies invasions. Our framework provides a straightforward method for estimating rates of spatial spread of wildlife diseases. PMID:28759576
What Box: A task for assessing language lateralization in young children.
Badcock, Nicholas A; Spooner, Rachael; Hofmann, Jessica; Flitton, Atlanta; Elliott, Scott; Kurylowicz, Lisa; Lavrencic, Louise M; Payne, Heather M; Holt, Georgina K; Holden, Anneka; Churches, Owen F; Kohler, Mark J; Keage, Hannah A D
2018-07-01
The assessment of active language lateralization in infants and toddlers is challenging. It requires an imaging tool that is unintimidating, quick to set up, and robust to movement, in addition to an engaging and cognitively simple language processing task. Functional Transcranial Doppler Ultrasound (fTCD) offers such a technique, and here we report on a method to elicit active language production in young children. The 34-second "What Box" trial presents an animated face "searching" for an object. The face "finds" a box that opens to reveal a to-be-labelled object. In a sample of 95 children (1 to 5 years of age), 81% completed the task, 32% with ≥10 trials. The task was validated (ρ = 0.4) against the gold standard Word Generation task in a group of older adults (n = 65, 60-85 years of age), though it was less likely to categorize lateralization as left or right, indicative of greater measurement variability. Existing methods for active language production have been used with 2-year-old children, while passive listening has been conducted with sleeping 6-month-olds. This is the first active method to be successfully employed with infants through to pre-schoolers, forming a useful tool for populations in which complex instructions are problematic.
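A minimal sketch of how a laterality index might be derived from fTCD left/right velocity epochs is shown below; it assumes the common convention of averaging the left-minus-right difference over a period of interest and is not the authors' exact pipeline.

```python
# Illustrative laterality-index (LI) calculation for fTCD epochs. The
# convention assumed here (mean left-minus-right velocity difference over a
# period of interest) is a common one, not necessarily the authors' pipeline.
import numpy as np

def laterality_index(left, right, period):
    """left/right: (n_trials, n_samples) % velocity change from baseline;
    period: slice covering the period of interest within each epoch."""
    diff = left[:, period] - right[:, period]   # L-R difference per sample
    per_trial = diff.mean(axis=1)               # mean difference per trial
    li = per_trial.mean()
    se = per_trial.std(ddof=1) / np.sqrt(len(per_trial))
    return li, se

rng = np.random.default_rng(1)
left = rng.normal(2.0, 1.0, size=(12, 100))     # synthetic accepted trials
right = rng.normal(1.2, 1.0, size=(12, 100))
li, se = laterality_index(left, right, slice(40, 80))
# Categorize as left/right/bilateral depending on whether the CI excludes zero.
category = "left" if li - 1.96 * se > 0 else ("right" if li + 1.96 * se < 0 else "bilateral")
print(f"LI = {li:.2f} ± {1.96 * se:.2f} -> {category}")
```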
Zhu, Jingying; Zhang, Xuhui; Zhang, Xi; Dong, Mei; Wu, Jiamei; Dong, Yunqiu; Chen, Rong; Ding, Xinliang; Huang, Chunhua; Zhang, Qi; Zhou, Weijie
2017-05-01
Ambient air pollution ranks high among the risk factors that increase the global burden of disease. Previous studies focused on assessing mortality risk and were sparsely performed in populous developing countries with deteriorating environments. We conducted a time-series study to evaluate the air pollution-associated years of life lost (YLL) and mortality risk and to identify potential modifiers relating to season and demographic characteristics. Using linear (for YLL) and Poisson (for mortality) regression models and controlling for time-varying factors, we found that an interquartile range (IQR) increase in the three-day cumulative average (lag 0-2 day) concentrations of PM2.5, PM10, NO2 and SO2 corresponded to increases in YLL of 12.09 (95% confidence interval [CI]: 2.98-21.20), 13.69 (95% CI: 3.32-24.07), 26.95 (95% CI: 13.99-39.91) and 24.39 (95% CI: 8.62-40.15) years, respectively, and to percent increases in mortality of 1.34% (95% CI: 0.67-2.01%), 1.56% (95% CI: 0.80-2.33%), 3.36% (95% CI: 2.39-4.33%) and 2.39% (95% CI: 1.24-3.55%), respectively. Among the specific causes of death, cardiovascular and respiratory diseases were positively associated with the gaseous pollutants (NO2 and SO2), and diabetes was positively correlated with NO2 (in terms of mortality risk). The effects of air pollutants were more pronounced in the cool season than in the warm season. The elderly (>65 years) and females were more vulnerable to air pollution. Studying effect estimates and their modifications by using YLL to detect premature death should support implementing health risk assessments, identifying susceptible groups and guiding policy-making and resource allocation according to specific local conditions. Copyright © 2017 Elsevier Ltd. All rights reserved.
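The percent increase per IQR quoted above follows from a log-linear (Poisson) coefficient; a minimal sketch with synthetic daily counts and illustrative variable names is given below.

```python
# Sketch of a Poisson time-series model for daily deaths: the percent increase
# per interquartile-range (IQR) rise in a pollutant follows from the fitted
# log-linear coefficient. Data and covariates here are synthetic.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(2)
days = 1000
df = pd.DataFrame({
    "pm25": rng.gamma(shape=4.0, scale=15.0, size=days),  # lag 0-2 average, ug/m3
    "temp": rng.normal(15, 8, days),                      # a time-varying confounder
})
df["deaths"] = rng.poisson(np.exp(2.0 + 0.0015 * df["pm25"] - 0.002 * df["temp"]))

X = sm.add_constant(df[["pm25", "temp"]])
fit = sm.GLM(df["deaths"], X, family=sm.families.Poisson()).fit()

iqr = df["pm25"].quantile(0.75) - df["pm25"].quantile(0.25)
beta = fit.params["pm25"]
lo, hi = fit.conf_int().loc["pm25"]
pct = lambda b: 100 * (np.exp(b * iqr) - 1)   # excess risk per IQR increase
print(f"% increase in mortality per IQR of PM2.5: "
      f"{pct(beta):.2f} (95% CI {pct(lo):.2f}-{pct(hi):.2f})")
```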
Haye Salinas, M J; Caeiro, F; Saurit, V; Alvarellos, A; Wojdyla, D; Scherbarth, H R; de O E Silva, A C; Tavares Brenol, J C; Lavras Costallat, L T; Neira, O J; Iglesias Gamarra, A; Vásquez, G; Reyes Llerena, G A; Barile-Fabris, L A; Silveira, L H; Sauza Del Pozo, M J; Acevedo Vásquez, E M; Alfaro Lozano, J L; Esteva Spinetti, M H; Alarcón, G S; Pons-Estel, B A
2017-11-01
Objectives The objectives of this study were to examine the demographic and clinical features associated with the occurrence of pleuropulmonary manifestations, the predictive factors of their occurrence and their impact on mortality in systemic lupus erythematosus (SLE) patients. Materials and methods The association of pleuropulmonary manifestations with demographic and clinical features, the predictive factors of their occurrence and their impact on mortality were examined in GLADEL patients by appropriate univariable and multivariable analyses. Results At least one pleuropulmonary manifestation occurred in 421 of the 1480 SLE patients (28.4%), pleurisy being the most frequent (24.0%). Age at SLE onset ≥30 years (OR 1.42; 95% CI 1.10-1.83), the presence of lower respiratory tract infection (OR 3.19; 95% CI 2.05-4.96), non-ischemic heart disease (OR 3.17; 95% CI 2.41-4.18), ischemic heart disease (OR 3.39; 95% CI 2.08-5.54), systemic (OR 2.00; 95% CI 1.37-2.91), ocular (OR 1.58; 95% CI 1.16-2.14) and renal manifestations (OR 1.44; 95% CI 1.09-1.83) were associated with pleuropulmonary manifestations, whereas cutaneous manifestations were negatively associated (OR 0.47; 95% CI 0.29-0.76). Non-ischemic heart disease (HR 2.24; 95% CI 1.63-3.09), SDI scores ≥1 (OR 1.54; 95% CI 1.10-2.17) and anti-La antibody positivity (OR 2.51; 95% CI 1.39-4.57) independently predicted their subsequent occurrence. Cutaneous manifestations were protective against the subsequent occurrence of pleuropulmonary manifestations (HR 0.62; 95% CI 0.43-0.90). Pleuropulmonary manifestations independently contributed to decreased survival (HR 2.79; 95% CI 1.80-4.31). Conclusion Pleuropulmonary manifestations are frequent in SLE, particularly pleuritis. Older age, respiratory tract infection, cardiac, systemic and renal involvement were associated with them, whereas cutaneous manifestations were negatively associated. Cardiac compromise, SDI scores ≥1 and anti-La positivity at disease onset were predictive of their subsequent occurrence, whereas cutaneous manifestations were protective. They independently contributed to a decreased survival in these patients.
Associations between food insecurity and healthy behaviors among Korean adults
Chun, In-Ae; Park, Jong; Ro, Hee-Kyung; Han, Mi-Ah
2015-01-01
BACKGROUND/OBJECTIVES Food insecurity has been suggested as being negatively associated with healthy behaviors and health status. This study was performed to identify the associations between food insecurity and healthy behaviors among Korean adults. SUBJECTS/METHODS The data used were the 2011 Community Health Survey, cross-sectional representative samples of 253 communities in Korea. Food insecurity was defined as when participants reported that their family sometimes or often did not get enough food to eat in the past year. Healthy behaviors were considered as non-smoking, non-high risk drinking, participation in physical activities, eating a regular breakfast, and maintaining a normal weight. Multiple logistic regression and multinomial logistic regression analyses were used to identify the association between food insecurity and healthy behaviors. RESULTS The prevalence of food insecurity was 4.4% (men 3.9%, women 4.9%). Men with food insecurity had lower odds ratios (ORs) for non-smoking, 0.75 (95% CI: 0.68-0.82), participation in physical activities, 0.82 (95% CI: 0.76-0.90), and eating a regular breakfast, 0.66 (95% CI: 0.59-0.74), whereas they had a higher OR for maintaining a normal weight, 1.19 (95% CI: 1.09-1.30), than men with food security. Women with food insecurity had lower ORs for non-smoking, 0.77 (95% CI: 0.66-0.89), and eating a regular breakfast, 0.79 (95% CI: 0.72-0.88). For men, ORs for obesity were 0.78 (95% CI: 0.70-0.87) for overweight and 0.56 (95% CI: 0.39-0.82) for mild obesity. For women, the OR for moderate obesity was 2.04 (95% CI: 1.14-3.63) as compared with normal weight. CONCLUSIONS Food insecurity has a different impact on healthy behaviors. Provision of coping strategies for food insecurity might be critical to improve healthy behaviors among the population. PMID:26244083
Kim, Young Wan; Kim, Ik Yong
2016-01-01
Purpose To identify the factors affecting 30-day postoperative complications and 1-year mortality after surgery for colorectal cancer in octogenarians and nonagenarians. Methods Between 2005 and 2014, a total of 204 consecutive patients aged ≥80 years who underwent major colorectal surgery were included. Results One hundred patients were male (49%) and 52 patients had American Society of Anesthesiologists (ASA) score ≥3 (25%). Combined surgery was performed in 32 patients (16%). Postoperative complications within 30 days after surgery occurred in 54 patients (26%) and 30-day mortality occurred in five patients (2%). Independent risk factors affecting 30-day postoperative complications were older age (≥90 years, hazard ratio [HR] with 95% confidence interval [CI] =4.95 [1.69−14.47], P=0.004), an ASA score ≥3 (HR with 95% CI =4.19 [1.8−9.74], P=0.001), performance of combined surgery (HR with 95% CI =3.1 [1.13−8.46], P=0.028), lower hemoglobin level (<10 g/dL, HR with 95% CI =7.56 [3.07−18.63], P<0.001), and lower albumin level (<3.4 g/dL, HR with 95% CI =3.72 [1.43−9.69], P=0.007). An ASA score ≥3 (HR with 95% CI =2.72 [1.15−6.46], P=0.023), tumor-node-metastasis (TNM) stage IV (HR with 95% CI =3.47 [1.44−8.39], P=0.006), and occurrence of postoperative complications (HR with 95% CI =4.42 [1.39−14.09], P=0.012) were significant prognostic factors for 1-year mortality. Conclusion Patient-related factors (older age, higher ASA score, presence of anemia, and lower serum albumin) and procedure-related factors (performance of combined surgical procedure) increased postoperative complications. Avoidance of 30-day postoperative complications may decrease 1-year mortality. PMID:27279741
Brunette, Amanda M; Holm, Kristen E; Wamboldt, Frederick S; Kozora, Elizabeth; Moser, David J; Make, Barry J; Crapo, James D; Meschede, Kimberly; Weinberger, Howard D; Moreau, Kerrie L; Bowler, Russell P; Hoth, Karin F
2018-05-01
This study examined the association of perceived cognitive difficulties with objective cognitive performance in former smokers. We hypothesized that greater perceived cognitive difficulties would be associated with poorer performance on objective executive and memory tasks. Participants were 95 former smokers recruited from the COPDGene study. They completed questionnaires (including the Cognitive Difficulties Scale [CDS] and the Hospital Anxiety and Depression Scale [HADS]), neuropsychological assessment, and pulmonary function testing. Pearson correlations and t-tests were conducted to examine the bivariate association of the CDS (total score and subscales for attention/concentration, praxis, delayed recall, orientation for persons, temporal orientation, and prospective memory) with each domain of objective cognitive functioning (memory recall, executive functioning/processing speed, visuospatial processing, and language). Simultaneous multiple linear regression was used to further examine all statistically significant bivariate associations. The following covariates were included in all regression models: age, sex, pack-years, premorbid functioning (WRAT-IV Reading), HADS total score, and chronic obstructive pulmonary disease (COPD) status (yes/no based on GOLD criteria). In regression models, greater perceived cognitive difficulties overall (using CDS total score) were associated with poorer performance on executive functioning/processing speed tasks (b = -0.07, SE = 0.03, p = .037). Greater perceived cognitive difficulties on the CDS praxis subscale were associated with poorer performance on executive functioning/processing speed tasks (b = -3.65, SE = 1.25, p = .005), memory recall tasks (b = -4.60, SE = 1.75, p = .010), and language tasks (b = -3.89, SE = 1.39, p = .006). Clinicians should be aware that cognitive complaints may be indicative of problems with the executive functioning/processing speed and memory of former smokers with and without COPD.
Fosco, Whitney D; White, Corey N; Hawk, Larry W
2017-07-01
The current studies utilized drift diffusion modeling (DDM) to examine how reinforcement and stimulant medication affect cognitive task performance in children with ADHD. In Study 1, children with (n = 25; 88 % male) and without ADHD (n = 33; 82 % male) completed a 2-choice discrimination task at baseline (100 trials) and again a week later under alternating reinforcement and no-reinforcement contingencies (400 trials total). In Study 2, participants with ADHD (n = 29; 72 % male) completed a double-blind, placebo-controlled trial of 0.3 and 0.6 mg/kg methylphenidate and completed the same task utilized in Study 1 at baseline (100 trials). Children with ADHD accumulated information at a much slower rate than controls, as evidenced by a lower drift rate. Groups were similar in nondecision time and boundary separation. Both reinforcement and stimulant medication markedly improved drift rate in children with ADHD (ds = 0.70 and 0.95 for reinforcement and methylphenidate, respectively); both treatments also reduced boundary separation (ds = 0.70 and 0.39). Reinforcement, which emphasized speeded accuracy, reduced nondecision time (d = 0.37), whereas stimulant medication increased nondecision time (d = 0.38). These studies provide initial evidence that frontline treatments for ADHD primarily impact cognitive performance in youth with ADHD by improving the speed/efficiency of information accumulation. Treatment effects on other DDM parameters may vary between treatments or interact with task parameters (number of trials, task difficulty). DDM, in conjunction with other approaches, may be helpful in clarifying the specific cognitive processes that are disrupted in ADHD, as well as the basic mechanisms that underlie the efficacy of ADHD treatments.
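A minimal random-walk sketch of the drift-diffusion model follows, illustrating how drift rate, boundary separation, and nondecision time shape choices and response times; parameter values are illustrative and this is not the fitting procedure used in the study.

```python
# Minimal drift-diffusion simulation illustrating the three parameters the
# study interprets: drift rate (v), boundary separation (a), and nondecision
# time (t0). A didactic sketch, not the fitting routine used in the paper.
import numpy as np

def simulate_ddm(v, a, t0, n_trials=500, dt=0.001, noise=1.0, seed=3):
    rng = np.random.default_rng(seed)
    rts, choices = [], []
    for _ in range(n_trials):
        x, t = a / 2.0, 0.0                  # start midway between boundaries
        while 0.0 < x < a:
            x += v * dt + noise * np.sqrt(dt) * rng.standard_normal()
            t += dt
        rts.append(t + t0)
        choices.append(1 if x >= a else 0)   # 1 = correct / upper boundary
    return np.array(rts), np.array(choices)

# A lower drift rate (as reported for children with ADHD) yields slower, less
# accurate responses; raising v (reinforcement, methylphenidate) reverses this.
for label, v in [("low drift", 0.8), ("high drift", 1.6)]:
    rts, choices = simulate_ddm(v=v, a=1.2, t0=0.3)
    print(f"{label}: accuracy = {choices.mean():.2f}, mean RT = {rts.mean():.2f}s")
```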
3. East portal of Tunnel 39, view to west with ...
3. East portal of Tunnel 39, view to west with east portal of Tunnel 38 (HAER CA-211) visible in distance, 135mm lens with electronic flash fill. - Central Pacific Transcontinental Railroad, Tunnel No. 39, Milepost 180.95, Cisco, Placer County, CA
Intubation methods by novice intubators in a manikin model.
O'Carroll, Darragh C; Barnes, Robert L; Aratani, Ashley K; Lee, Dane C; Lau, Christopher A; Morton, Paul N; Yamamoto, Loren G; Berg, Benjamin W
2013-10-01
Tracheal intubation is an important yet difficult skill to learn, with many possible methods and techniques. Direct laryngoscopy is the standard method of tracheal intubation, but several instruments have been shown to be less difficult to use and to have better performance characteristics than the traditional direct method. We compared 4 different intubation methods performed by novice intubators on manikins: conventional direct laryngoscopy, video laryngoscopy, Airtraq® laryngoscopy, and fiberoptic laryngoscopy. In addition, we attempted to find a correlation between playing videogames and intubation times in novice intubators. Video laryngoscopy had the best results for both our normal and difficult airway (cervical spine immobilization) manikin scenarios. When video was compared to direct laryngoscopy in the normal airway scenario, it had a significantly higher success rate (100% vs 83%, P=.02) and shorter intubation times (29.1 ± 27.4 sec vs 45.9 ± 39.5 sec, P=.03). In the difficult airway scenario, video laryngoscopy maintained a significantly higher success rate (91% vs 71%, P=.04) and likelihood of success (3.2 ± 1.0, 95% CI [2.9-3.5] vs 2.4 ± 0.9, 95% CI [2.1-2.7]) when compared to direct laryngoscopy. Participants also reported significantly higher rates of self-confidence (3.5 ± 0.6, 95% CI [3.3-3.7]) and ease of use (1.5 ± 0.7, 95% CI [1.3-1.8]) with video laryngoscopy compared to all other methods. We found no correlation between videogame playing and intubation methods.
Audrain-McGovern, Janet; Strasser, Andrew A; Ashare, Rebecca; Wileyto, E Paul
2015-12-01
This study sought to evaluate whether individual differences in the reinforcing value of smoking relative to physical activity (RRVS) moderated the effects of physical activity on smoking abstinence symptoms in young adult smokers. The repeated-measures within-subjects design included daily smokers (N = 79) 18-26 years old. RRVS was measured with a validated behavioral choice task. On 2 subsequent visits, participants completed self-report measures of craving, withdrawal, mood, and affective valence before and after they engaged in passive sitting or a bout of physical activity. RRVS did not moderate any effects of physical activity (ps > .05). Physical activity compared with passive sitting predicted decreased withdrawal symptoms, β = -5.23, 95% confidence interval (CI) [-6.93, -3.52] (p < .001), negative mood, β = -2.92, 95% CI [-4.13, -1.72] (p < .001), and urge to smoke, β = -7.13, 95% CI [-9.39, -4.86] (p < .001). Also, physical activity compared with passive sitting predicted increased positive affect, β = 3.08, 95% CI [1.87, 4.28] (p < .001) and pleasurable feelings, β = 1.07, 95% CI [0.58, 1.55] (p < .001), and greater time to first cigarette during the ad libitum smoking period, β = 211.76, 95% CI [32.54, 390.98] (p = .02). RRVS predicted higher levels of pleasurable feelings, β = 0.22, 95% CI [0.01, 0.43] (p = .045), increased odds of smoking versus remaining abstinent during the ad libitum smoking period, β = 0.04, 95% CI [0.01, 0.08] (p = .02), and reduced time to first cigarette, β = -163.00, 95% CI [-323.50, -2.49] (p = .047). Regardless of the RRVS, physical activity produced effects that may aid smoking cessation in young adult smokers. However, young adult smokers who have a higher RRVS will be less likely to choose to engage in physical activity, especially when smoking is an alternative. (PsycINFO Database Record (c) 2015 APA, all rights reserved).
Grashow, Rachel; Spiro, Avron; Taylor, Kathryn M.; Newton, Kimberly; Shrairman, Ruth; Landau, Alexander; Sparrow, David; Hu, Howard; Weisskopf, Marc
2013-01-01
Background and Aims Lead exposure in children and occupationally-exposed adults has been associated with reduced visuomotor and fine motor function. However, associations in environmentally-exposed adults remain relatively unexplored. To address this, we examined the association between cumulative lead exposure—as measured by lead in bone—and performance on the Grooved Pegboard (GP) manual dexterity task, as well as on handwriting tasks using a novel assessment approach, among men in the VA Normative Aging Study (NAS). Methods GP testing was done with 362 NAS participants, and handwriting assessment with 328, who also had tibia and patella lead measurements made with K-X-Ray Fluorescence (KXRF). GP scores were time (sec) to complete the task with the dominant hand. The handwriting assessment approach assessed the production of signature and cursive lowercase l and m letter samples. Signature and lm task scores reflect consistency in repeated trials. We used linear regression to estimate associations and 95% confidence intervals (CI) with adjustment for age, smoking, education, income and computer experience. A backward elimination algorithm was used in the subset with both GP and handwriting assessment to identify variables predictive of each outcome. Results The mean (SD) participant age was 69.1 (7.2) years; mean patella and tibia concentrations were 25.0 (20.7) μg/g and 19.2 (14.6) μg/g, respectively. In multivariable-adjusted analyses, GP performance was associated with tibia (β per 15 μg/g bone = 4.66, 95% CI: 1.73, 7.58, p=0.002) and patella (β per 20 μg/g = 3.93, 95% CI: 1.11, 6.76, p = 0.006). In multivariable adjusted models of handwriting production, only the lm-pattern task showed a significant association with tibia (β per 15 μg/g bone = 1.27, 95% CI: 0.24, 2.29, p = 0.015), such that lm pattern production was more stable with increasing lead exposure. GP and handwriting scores were differentially sensitive to education, smoking, computer experience, financial stability, income and alcohol consumption. Conclusions Long-term cumulative environmental lead exposure was associated with deficits in GP performance, but not handwriting production. Higher lead appeared to be associated with greater consistency on the lm task. Lead sensitivity differences could suggest that lead affects neural processing speed rather than motor function per se, or could result from distinct brain areas involved in the execution of different motor tasks. PMID:23370289
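A generic sketch of p-value-based backward elimination over an OLS model follows, in the spirit of the variable-selection step described above; the retention threshold and predictor names are assumptions, not the study's specification.

```python
# Generic p-value-based backward elimination over an OLS model, in the spirit
# of the variable-selection step described above. The 0.10 retention threshold
# and the predictor names are assumptions, not taken from the study.
import numpy as np
import pandas as pd
import statsmodels.api as sm

def backward_eliminate(y, X, threshold=0.10):
    cols = list(X.columns)
    while True:
        fit = sm.OLS(y, sm.add_constant(X[cols])).fit()
        pvals = fit.pvalues.drop("const")
        worst = pvals.idxmax()
        if pvals[worst] <= threshold or len(cols) == 1:
            return fit, cols
        cols.remove(worst)                 # drop the least significant predictor

rng = np.random.default_rng(4)
n = 328                                    # handwriting-assessment sample size
X = pd.DataFrame(rng.normal(size=(n, 4)),
                 columns=["tibia_pb", "age", "smoking", "noise_var"])
y = 30 + 0.3 * X["tibia_pb"] + 0.5 * X["age"] + rng.normal(scale=2, size=n)
fit, kept = backward_eliminate(y, X)
print("retained predictors:", kept)
```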
Validity and Reliability of Dermoscopic Criteria Used to Differentiate Nevi From Melanoma
Carrera, Cristina; Marchetti, Michael A.; Dusza, StephenW.; Argenziano, Giuseppe; Braun, Ralph P.; Halpern, Allan C.; Jaimes, Natalia; Kittler, Harald J.; Malvehy, Josep; Menzies, Scott W.; Pellacani, Giovanni; Puig, Susana; Rabinovitz, Harold S.; Scope, Alon; Soyer, H. Peter; Stolz, Wilhelm; Hofmann-Wellenhof, Rainer; Zalaudek, Iris; Marghoob, Ashfaq A.
2017-01-01
IMPORTANCE The comparative diagnostic performance of dermoscopic algorithms and their individual criteria are not well studied. OBJECTIVES To analyze the discriminatory power and reliability of dermoscopic criteria used in melanoma detection and compare the diagnostic accuracy of existing algorithms. DESIGN, SETTING, AND PARTICIPANTS This was a retrospective, observational study of 477 lesions (119 melanomas [24.9%] and 358 nevi [75.1%]), which were divided into 12 image sets that consisted of 39 or 40 images per set. A link on the International Dermoscopy Society website from January 1, 2011, through December 31, 2011, directed participants to the study website. Data analysis was performed from June 1, 2013, through May 31, 2015. Participants included physicians, residents, and medical students, and there were no specialty-type or experience-level restrictions. Participants were randomly assigned to evaluate 1 of the 12 image sets. MAIN OUTCOMES AND MEASURES Associations with melanoma and intraclass correlation coefficients (ICCs) were evaluated for the presence of dermoscopic criteria. Diagnostic accuracy measures were estimated for the following algorithms: the ABCD rule, the Menzies method, the 7-point checklist, the 3-point checklist, chaos and clues, and CASH (color, architecture, symmetry, and homogeneity). RESULTS A total of 240 participants registered, and 103 (42.9%) evaluated all images. The 110 participants (45.8%) who evaluated fewer than 20 lesions were excluded, resulting in data from 130 participants (54.2%), 121 (93.1%) of whom were regular dermoscopy users. Criteria associated with melanoma included marked architectural disorder (odds ratio [OR], 6.6; 95% CI, 5.6–7.8), pattern asymmetry (OR, 4.9; 95% CI, 4.1–5.8), nonorganized pattern (OR, 3.3; 95% CI, 2.9–3.7), border score of 6 (OR, 3.3; 95% CI, 2.5–4.3), and contour asymmetry (OR, 3.2; 95% CI, 2.7–3.7) (P < .001 for all). Most dermoscopic criteria had poor to fair interobserver agreement. Criteria that reached moderate levels of agreement included comma vessels (ICC, 0.44; 95% CI, 0.40–0.49), absence of vessels (ICC, 0.46; 95% CI, 0.42–0.51), dark brown color (ICC, 0.40; 95% CI, 0.35–0.44), and architectural disorder (ICC, 0.43; 95% CI, 0.39–0.48). The Menzies method had the highest sensitivity for melanoma diagnosis (95.1%) but the lowest specificity (24.8%) compared with any other method (P < .001). The ABCD rule had the highest specificity (59.4%). All methods had similar areas under the receiver operating characteristic curves. CONCLUSIONS AND RELEVANCE Important dermoscopic criteria for melanoma recognition were revalidated by participants with varied experience. Six algorithms tested had similar but modest levels of diagnostic accuracy, and the interobserver agreement of most individual criteria was poor. PMID:27074267
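A brief sketch of how the reported accuracy measures (sensitivity, specificity, area under the ROC curve) could be computed from binary algorithm calls against the histopathologic reference is given below; the data are synthetic, not the study's ratings.

```python
# Sketch of the diagnostic-accuracy measures reported above (sensitivity,
# specificity, AUC), computed from binary algorithm calls against the
# histopathologic reference. Data are synthetic, not the study's ratings.
import numpy as np
from sklearn.metrics import confusion_matrix, roc_auc_score

rng = np.random.default_rng(5)
n = 477                                            # lesions in the image sets
truth = (rng.random(n) < 0.25).astype(int)         # ~25% melanomas
score = np.where(truth == 1,
                 rng.normal(0.7, 0.2, n),          # melanomas score higher
                 rng.normal(0.4, 0.2, n))
positive = (score > 0.55).astype(int)              # e.g. "algorithm positive"

tn, fp, fn, tp = confusion_matrix(truth, positive).ravel()
print(f"sensitivity = {tp / (tp + fn):.3f}, "
      f"specificity = {tn / (tn + fp):.3f}, "
      f"AUC = {roc_auc_score(truth, score):.3f}")
```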
DOE Office of Scientific and Technical Information (OSTI.GOV)
Allen, Aaron M.; Den, Robert; Department of Radiation Oncology, Brigham and Women's Hospital, Boston, MA
2007-08-01
Purpose: Extrapleural pneumonectomy (EPP) is an effective treatment of malignant pleural mesothelioma. We compared the outcomes after moderate-dose hemithoracic radiotherapy (MDRT) and high-dose hemithoracic RT (HDRT) after EPP for malignant pleural mesothelioma. Methods and Materials: Between July 1994 and April 2004, 39 patients underwent EPP and adjuvant RT at Dana-Farber Cancer Institute/Brigham and Women's Hospital. Between 1994 and 2002, MDRT, including 30 Gy to the hemithorax, 40 Gy to the mediastinum, and boosts to positive margins or nodes to 54 Gy, was given, generally with concurrent chemotherapy. In 2003, HDRT to 54 Gy with a matched photon/electron technique was given, with sequential chemotherapy. Results: A total of 39 patients underwent RT after EPP. The median age was 59 years (range, 44-77). The histologic type was epithelial in 25 patients (64%) and mixed or sarcomatoid in 14 patients (36%). Of the 39 patients, 24 underwent MDRT and 15 (39%) HDRT. The median follow-up was 23 months (range, 6-71). The median overall survival was 19 months (95% confidence interval, 14-24). The median time to distant failure (DF) and local failure (LF) was 20 months (95% confidence interval, 14-26) and 26 months (95% confidence interval, 16-36), respectively. On univariate and multivariate analyses, only a mixed histologic type was predictive of inferior DF (p <0.006) and overall survival (p <0.004). The RT technique was not predictive of LF, DF, or overall survival. The LF rate was 50% (12 of 24) after MDRT and 27% (4 of 15) after HDRT (p = NS). Four patients who had undergone HDRT were alive and without evidence of disease at the last follow-up. Conclusions: High-dose hemithoracic RT appears to limit in-field LF compared with MDRT. However, DF remains a significant challenge, with one-half of our patients experiencing DF.
USDA-ARS?s Scientific Manuscript database
Breeding and selection for the traits with polygenic inheritance is a challenging task that can be done by phenotypic selection, by marker-assisted selection or by genome wide selection. We tested predictive ability of four selection models in a biparental population genotyped with 95 SNP markers an...
The Effect of Colour on Children's Cognitive Performance
ERIC Educational Resources Information Center
Brooker, Alice; Franklin, Anna
2016-01-01
Background: The presence of red appears to hamper adults' cognitive performance relative to other colours (see Elliot & Maier, 2014, "Ann. Rev. Psychol." 65, 95). Aims and sample: Here, we investigate whether colour affects cognitive performance in 8- and 9-year-olds. Method: Children completed a battery of tasks once in the presence…
Taggar, Jaspal S; Coleman, Tim; Lewis, Sarah; Heneghan, Carl; Jones, Matthew
2016-08-01
Pulse palpation has been recommended as the first step of screening to detect atrial fibrillation. We aimed to determine and compare the accuracy of different methods for detecting pulse irregularities caused by atrial fibrillation. We systematically searched MEDLINE, EMBASE, CINAHL and LILACS until 16 March 2015. Two reviewers identified eligible studies, extracted data and appraised quality using the QUADAS-2 instrument. Meta-analysis, using the bivariate hierarchical random effects method, determined average operating points for sensitivities, specificities, positive and negative likelihood ratios (PLR, NLR); we constructed summary receiver operating characteristic plots. Twenty-one studies investigated 39 interventions (n = 15,129 pulse assessments) for detecting atrial fibrillation. Compared to 12-lead electrocardiography (ECG) diagnosed atrial fibrillation, blood pressure monitors (BPMs; seven interventions) and non-12-lead ECGs (20 interventions) had the greatest accuracy for detecting pulse irregularities attributable to atrial fibrillation (BPM: sensitivity 0.98 (95% confidence interval (CI) 0.92-1.00), specificity 0.92 (95% CI 0.88-0.95), PLR 12.1 (95% CI 8.2-17.8) and NLR 0.02 (95% CI 0.00-0.09); non-12-lead ECG: sensitivity 0.91 (95% CI 0.86-0.94), specificity 0.95 (95% CI 0.92-0.97), PLR 20.1 (95% CI 12-33.7), NLR 0.09 (95% CI 0.06-0.14)). There were similar findings for smartphone applications (six interventions) although these studies were small in size. The sensitivity and specificity of pulse palpation (six interventions) were 0.92 (95% CI 0.85-0.96) and 0.82 (95% CI 0.76-0.88), respectively (PLR 5.2 (95% CI 3.8-7.2), NLR 0.1 (95% CI 0.05-0.18)). BPMs and non-12-lead ECG were most accurate for detecting pulse irregularities caused by atrial fibrillation; other technologies may therefore be pragmatic alternatives to pulse palpation for the first step of atrial fibrillation screening. © The European Society of Cardiology 2015.
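The likelihood ratios above follow directly from pooled sensitivity and specificity; a short worked example using the point estimates reported in the abstract follows (small discrepancies with the published LRs reflect the bivariate hierarchical pooling).

```python
# The likelihood ratios quoted above follow directly from pooled sensitivity
# and specificity: PLR = sens / (1 - spec), NLR = (1 - sens) / spec. Inputs are
# the point estimates from the abstract; small discrepancies with the published
# LRs reflect the bivariate hierarchical pooling.
def likelihood_ratios(sens, spec):
    return sens / (1 - spec), (1 - sens) / spec

for method, sens, spec in [("BPM", 0.98, 0.92),
                           ("non-12-lead ECG", 0.91, 0.95),
                           ("pulse palpation", 0.92, 0.82)]:
    plr, nlr = likelihood_ratios(sens, spec)
    print(f"{method}: PLR = {plr:.1f}, NLR = {nlr:.2f}")
```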
Gaoua, Nadia; Herrera, Christopher P; Périard, Julien D; El Massioui, Farid; Racinais, Sebastien
2017-01-01
The aim of this study was to verify the hypothesis that hyperthermia represents a cognitive load limiting available resources for executing concurrent cognitive tasks. Electroencephalographic activity (EEG: alpha and theta power) was obtained in 10 hyperthermic participants in HOT (50°C, 50% RH) conditions and in a normothermic state in CON (25°C, 50% RH) conditions in counterbalanced order. In each trial, EEG was measured over the frontal lobe prior to task engagement (PRE) in each condition and during simple (One Touch Stockings of Cambridge, OTS-4) and complex (OTS-6) cognitive tasks. Core (39.5 ± 0.5 vs. 36.9 ± 0.2°C) and mean skin (39.06 ± 0.3 vs. 31.6 ± 0.6°C) temperatures were significantly higher in HOT than CON (p < 0.005). Theta power significantly increased with task demand (p = 0.017, η² = 0.36) and was significantly higher in HOT than CON (p = 0.041, η² = 0.39). The difference between HOT and CON was large (η² = 0.40) and significant (p = 0.036) PRE, large (η² = 0.20) but not significant (p = 0.17) during OTS-4, and disappeared during OTS-6 (p = 0.87, η² = 0.00). Those changes in theta power suggest that hyperthermia may act as an additional cognitive load. However, this load disappeared during OTS-6 together with an impaired performance, suggesting a potential saturation of the available resources.
Spahillari, Aferdita; Talegawkar, Sameera; Correa, Adolfo; Carr, J. Jeffrey; Terry, James G.; Lima, Joao; Freedman, Jane E.; Das, Saumya; Kociol, Robb; de Ferranti, Sarah; Mohebali, Donya; Mwasongwe, Stanford; Tucker, Katherine L.; Murthy, Venkatesh L.; Shah, Ravi V.
2017-01-01
Background The lifetime risk of heart failure is higher in the African-American population than in other racial groups in the United States. Methods and Results We measured the Life’s Simple 7 ideal cardiovascular health metrics in 4195 African-Americans in the Jackson Heart Study (2000–2004). We evaluated the association of Simple 7 metrics with incident HF and left ventricular (LV) structure and function by cardiac magnetic resonance (CMR; n=1188). Mean age at baseline was 54.4 years (65% women). Relative to 0–2 Simple 7 factors, African-Americans with 3 factors had 47% lower incident HF risk (HR 0.53, 95% CI 0.39–0.73, P<0.0001); those with ≥ 4 factors had 61% lower HF risk (HR 0.39, 95% CI 0.24–0.64, P=0.0002). Higher blood pressure (HR 2.32, 95% CI 1.28–4.20, P=0.005), physical inactivity (HR 1.65, 95% CI 1.07–2.55, P=0.02), smoking (HR 2.04, 95% CI 1.43–2.91, P<0.0001) and impaired glucose control (HR 1.76, 95% CI 1.34–2.29, P<0.0001) were associated with incident HF. The age-/sex-adjusted population attributable risk for these Simple 7 metrics combined was 37.1%. Achievement of ideal blood pressure, ideal body mass index, ideal glucose control, and non-smoking was associated with less likelihood of adverse cardiac remodeling by CMR. Conclusions Cardiovascular risk factors in mid-life (specifically elevated blood pressure, physical inactivity, smoking and poor glucose control) are associated with incident HF in African Americans, and represent targets for intensified HF prevention. PMID:28209767
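A hedged illustration of a population attributable fraction calculation using Levin's formula is sketched below; the exposure prevalences are assumptions, and the study's age-/sex-adjusted method differs, so the output will not reproduce the published 37.1%.

```python
# Illustration of a population attributable fraction (PAF) using Levin's
# formula PAF = p*(HR-1) / (1 + p*(HR-1)) and the common multiplicative
# combination 1 - prod(1 - PAF_i). Exposure prevalences are assumptions and
# the study's age-/sex-adjusted method differs, so this will not reproduce
# the published 37.1%.
import numpy as np

factors = {                                # (HR from the abstract, assumed prevalence)
    "elevated blood pressure":   (2.32, 0.35),
    "physical inactivity":       (1.65, 0.30),
    "smoking":                   (2.04, 0.13),
    "impaired glucose control":  (1.76, 0.15),
}

pafs = {k: p * (hr - 1) / (1 + p * (hr - 1)) for k, (hr, p) in factors.items()}
combined = 1 - np.prod([1 - v for v in pafs.values()])
for k, v in pafs.items():
    print(f"{k}: PAF = {100 * v:.1f}%")
print(f"combined PAF (illustrative) = {100 * combined:.1f}%")
```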
Hasan, Syed Shahzad; Thiruchelvam, Kaeshaelya; Ahmed, Syed Imran; Clavarino, Alexandra M; Mamun, Abdullah A; Kairuz, Therese
2013-01-01
The aim of this study was to investigate the association between pregnancy complications, mental health-related problems, and type 2 diabetes mellitus (T2DM) in Malaysian women. A case-control study of women with T2DM (n=160) matched by age range to controls without T2DM (n=160). Data were collected in the Negeri Sembilan and PutraJaya regions in Malaysia, from two hospital outpatient clinics, PutraJaya Hospital and Tuanku Jaa'far Hospital Seremban, and one health clinic at Seremban. Validated, interviewer-administered questionnaires were used to obtain the data. The unadjusted and adjusted estimates were calculated using the Mantel-Haenszel method. Neither depression (RR 0.74, 95% CI: 0.39-1.41) nor anxiety (RR 1.00, 95% CI: 0.53-1.88) symptoms increased the risk of T2DM significantly. However, gestational diabetes (RR 1.35, 95% CI: 1.02-1.79), and ≥3 pregnancies (RR 1.39, 95% CI: 1.08-1.79) were significant risk factors for the development of T2DM. T2DM was not a significant risk factor for either depression (RR 1.26, 95% CI: 0.91-1.74) or anxiety symptoms (RR 1.13, 95% CI: 0.59-2.19). In this study, T2DM is not a significant risk factor for depression and anxiety; similarly, neither are depression and anxiety significant risk factors for T2DM. Although prevalence of depression and anxiety is not alarming, the findings reported here should alert clinicians to screen and treat anxiety and depression in people with diabetes and also note the importance of monitoring women with complications in pregnancy for risk of later T2DM. Copyright © 2013 Diabetes India. Published by Elsevier Ltd. All rights reserved.
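A hand-rolled Mantel-Haenszel pooled risk ratio across strata is sketched below, analogous to the stratified estimates described above; the 2x2 tables are synthetic illustrations.

```python
# Hand-rolled Mantel-Haenszel pooled risk ratio across strata (e.g. age
# groups), analogous to the stratified estimates described above. Tables are
# synthetic and laid out as [[exposed cases, exposed non-cases],
#                            [unexposed cases, unexposed non-cases]].
import numpy as np

strata = [
    np.array([[20, 60], [15, 70]]),
    np.array([[18, 50], [12, 65]]),
    np.array([[25, 45], [14, 60]]),
]

num = den = 0.0
for t in strata:
    (a, b), (c, d) = t
    n = a + b + c + d
    num += a * (c + d) / n
    den += c * (a + b) / n

print(f"Mantel-Haenszel pooled RR = {num / den:.2f}")
```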
HCV avidity as a tool for detection of recent HCV infection: Sensitivity depends on HCV genotype.
Shepherd, Samantha J; McDonald, Scott A; Palmateer, Norah E; Gunson, Rory N; Aitken, Celia; Dore, Gregory J; Goldberg, David J; Applegate, Tanya L; Lloyd, Andrew R; Hajarizadeh, Behzad; Grebely, Jason; Hutchinson, Sharon J
2018-01-01
Accurate detection of incident hepatitis C virus (HCV) infection is required to target and evaluate public health interventions, but acute infection is largely asymptomatic and difficult to detect using traditional methods. Our aim was to evaluate a previously developed HCV avidity assay to distinguish acute from chronic HCV infection. Plasma samples collected from recent seroconversion subjects in two large Australian cohorts were tested using the avidity assay, and the avidity index (AI) was calculated. Demographic and clinical characteristics of patients with low/high AI were compared via logistic regression. Sensitivity and specificity of the assay for recent infection and the mean duration of recent infection (MDRI) were estimated stratified by HCV genotype. Avidity was assessed in 567 samples (from 215 participants), including 304 with viraemia (defined as ≥250 IU/mL). An inverse relationship between AI and infection duration was found in viraemic samples only. The adjusted odds of a low AI (<30%) decreased with infection duration (odds ratio [OR] per week of 0.93; 95% CI:0.89-0.97), and were lower for G1 compared with G3 samples (OR = 0.14; 95% CI:0.05-0.39). Defining recent infection as <26 weeks, sensitivity (at AI cut-off of 20%) was estimated at 48% (95% CI:39-56%), 36% (95% CI:20-52%), and 65% (95% CI:54-75%) and MDRI was 116, 83, and 152 days for all genotypes, G1, and G3, respectively. Specificity (≥52 weeks infection duration, all genotypes) was 96% (95% CI:90-98%). HCV avidity testing has utility for detecting recent HCV infection in patients, and for assessing progress in reaching incidence targets for eliminating transmission, but variation in assay performance across genotype should be recognized. © 2017 Wiley Periodicals, Inc.
Comparative Evaluation of Preliminary Screening Methods for Colorectal Cancer in a Mass Program.
Ye, Ding; Huang, Qiuchi; Li, Qilong; Jiang, Xiyi; Mamat, Mayila; Tang, Mengling; Wang, Jianbing; Chen, Kun
2017-09-01
The fecal immunochemical test (FIT) has been widely used in preliminary screening for colorectal cancer (CRC). The high-risk factor questionnaire (HRFQ) and quantitative risk-assessment method (QRAM) are recommended for estimating the risk of CRC qualitatively and quantitatively in China. We aimed to prospectively compare the diagnostic values of CRC preliminary screening methods to identify which method is preferable as a screening strategy. Individuals aged 40-74 years old were enrolled in a mass CRC screening program from January 1, 2007 to December 31, 2014, in Jiashan County, Zhejiang Province, China. FIT of two stool specimens at 1-week intervals was performed by laboratory personnel and face-to-face interviews were conducted by trained investigators. Screening data in the program were linked to a CRC surveillance and registry system, and CRC cases reported in the system were regarded as true patients. A total of 96,043 subjects were included. The sensitivity and specificity of FIT for detecting CRC cases were 75.49% (95% CI 69.84-80.39) and 90.36% (95% CI 90.17-90.54), respectively. QRAM was more sensitive (p < 0.001) and less specific (p < 0.001) than HRFQ. The sensitivity and specificity of FIT along with HRFQ were 86.56% (95% CI 81.81-90.22) and 81.37% (95% CI 81.12-81.62), and those of FIT along with QRAM were 88.93% (95% CI 84.47-92.23) and 73.95% (95% CI 73.67-74.23). Our findings suggest that CRC preliminary screening with FIT and QRAM in parallel has high sensitivity and satisfactory specificity, and is a useful strategy in mass screening programs.
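Under a conditional-independence assumption (which the empirical study does not require), parallel screening combines sensitivity and specificity as shown in the sketch below; the QRAM values used are hypothetical.

```python
# Parallel ("either test positive") screening combines tests as
# sens = 1 - (1 - s1)(1 - s2) and spec = sp1 * sp2 *if* the tests are
# conditionally independent. The QRAM values below are hypothetical; the
# abstract reports only the observed combined performance, which need not
# satisfy this independence assumption.
def parallel(sens1, spec1, sens2, spec2):
    return 1 - (1 - sens1) * (1 - sens2), spec1 * spec2

fit_sens, fit_spec = 0.7549, 0.9036      # FIT, from the abstract
qram_sens, qram_spec = 0.55, 0.82        # hypothetical QRAM performance
sens, spec = parallel(fit_sens, fit_spec, qram_sens, qram_spec)
print(f"parallel screen: sensitivity = {sens:.3f}, specificity = {spec:.3f}")
```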
Beiraghi, Asadollah; Shokri, Masood
2018-02-01
In the present study, a new centrifuge-less dispersive liquid-liquid microextraction technique is developed for the first time, based on the application of a new task-specific magnetic polymeric ionic liquid (TSMPIL) as a chelating and extraction solvent for the selective preconcentration of trace amounts of potassium from oil samples. After extraction, the fine droplets of TSMPIL were transferred into an Eppendorf tube and diluted to 500 µL using distilled water. Then, the enriched analyte was determined by flame atomic emission spectroscopy (FAES). Several important factors affecting both the complexation and extraction efficiency, including extraction time, vortex agitation rate, amount of carbonyl iron powder, pH of the sample solution and volume of ionic liquid, as well as the effects of interfering species, were investigated and optimized. Under the optimal conditions, the limits of detection (LOD) and quantification (LOQ) were 0.5 and 1.6 µg/L, respectively, with a preconcentration factor of 128. The precision (RSD %) for seven replicate determinations at 10 µg/L of potassium was better than 3.9%. The relative recoveries for the spiked samples were in the acceptable range of 95-104%. The results demonstrated that no remarkable interferences were caused by various other ions in the determination of potassium, so that the tolerance limits (W_Ion/W_K) of major cations and anions were in the range of 2500-10,000. The proposed method was successfully applied for the analysis of potassium in some oil samples. Copyright © 2017 Elsevier B.V. All rights reserved.
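A conventional 3σ/10σ limit-of-detection and limit-of-quantification calculation, together with the preconcentration factor, is sketched below; the blank signals, calibration slope, and sample volume are assumptions, with the volumes chosen to be consistent with the reported factor of 128.

```python
# Conventional 3-sigma / 10-sigma limit-of-detection and -quantification and
# preconcentration-factor calculations for an FAES calibration. Blank signals,
# the calibration slope, and the sample volume are assumptions; the sample
# volume is chosen so the factor matches the reported value of 128.
import numpy as np

blank_signals = np.array([0.012, 0.015, 0.011, 0.014, 0.013, 0.012, 0.016])
slope = 0.0085                       # assumed calibration slope, signal per µg/L

sigma = blank_signals.std(ddof=1)
lod = 3 * sigma / slope              # µg/L
loq = 10 * sigma / slope             # µg/L

sample_volume_mL = 64.0              # assumed initial sample volume
extract_volume_mL = 0.5              # diluted extract volume (500 µL, per the abstract)
preconcentration_factor = sample_volume_mL / extract_volume_mL

print(f"LOD = {lod:.2f} µg/L, LOQ = {loq:.2f} µg/L, "
      f"preconcentration factor = {preconcentration_factor:.0f}")
```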
Tomorrow. The Report of the Task Force for the Study of Chemistry Education in the United States.
ERIC Educational Resources Information Center
American Chemical Society, Washington, DC.
An American Chemical Society (ACS) task force was charged to examine the state of chemistry education in the United States and to make recommendations in light of its findings. This document presents the task force's report and 39 major (and also secondary) recommendations. These recommendations, with accompanying discussions, focus on: (1)…
Coleman, Tim; Lewis, Sarah; Heneghan, Carl; Jones, Matthew
2015-01-01
Background Pulse palpation has been recommended as the first step of screening to detect atrial fibrillation. We aimed to determine and compare the accuracy of different methods for detecting pulse irregularities caused by atrial fibrillation. Methods We systematically searched MEDLINE, EMBASE, CINAHL and LILACS until 16 March 2015. Two reviewers identified eligible studies, extracted data and appraised quality using the QUADAS-2 instrument. Meta-analysis, using the bivariate hierarchical random effects method, determined average operating points for sensitivities, specificities, positive and negative likelihood ratios (PLR, NLR); we constructed summary receiver operating characteristic plots. Results Twenty-one studies investigated 39 interventions (n = 15,129 pulse assessments) for detecting atrial fibrillation. Compared to 12-lead electrocardiography (ECG) diagnosed atrial fibrillation, blood pressure monitors (BPMs; seven interventions) and non-12-lead ECGs (20 interventions) had the greatest accuracy for detecting pulse irregularities attributable to atrial fibrillation (BPM: sensitivity 0.98 (95% confidence interval (CI) 0.92–1.00), specificity 0.92 (95% CI 0.88–0.95), PLR 12.1 (95% CI 8.2–17.8) and NLR 0.02 (95% CI 0.00–0.09); non-12-lead ECG: sensitivity 0.91 (95% CI 0.86–0.94), specificity 0.95 (95% CI 0.92–0.97), PLR 20.1 (95% CI 12–33.7), NLR 0.09 (95% CI 0.06–0.14)). There were similar findings for smartphone applications (six interventions) although these studies were small in size. The sensitivity and specificity of pulse palpation (six interventions) were 0.92 (95% CI 0.85–0.96) and 0.82 (95% CI 0.76–0.88), respectively (PLR 5.2 (95% CI 3.8–7.2), NLR 0.1 (95% CI 0.05–0.18)). Conclusions BPMs and non-12-lead ECG were most accurate for detecting pulse irregularities caused by atrial fibrillation; other technologies may therefore be pragmatic alternatives to pulse palpation for the first step of atrial fibrillation screening. PMID:26464292
Trent, Simon; Dean, Rachel; Veit, Bonnie; Cassano, Tommaso; Bedse, Gaurav; Ojarikre, Obah A; Humby, Trevor; Davies, William
2013-08-01
Chromosomal deletions at Xp22.3 appear to influence vulnerability to the neurodevelopmental disorders attention deficit hyperactivity disorder (ADHD) and autism. 39,X(Y*)O mice, which lack the murine orthologue of the Xp22.3 ADHD candidate gene STS (encoding steroid sulfatase), exhibit behavioural phenotypes relevant to such disorders (e.g. hyperactivity), elevated hippocampal serotonin (5-HT) levels, and reduced serum levels of dehydroepiandrosterone (DHEA). Here we initially show that 39,X(Y*)O mice are also deficient for the recently-characterised murine orthologue of the Xp22.3 autism candidate gene ASMT (encoding acetylserotonin-O-methyltransferase). Subsequently, to specify potential behavioural correlates of elevated hippocampal 5-HT arising due to the genetic lesion, we compared 39,X(Y*)O MF1 mice to 40,XY MF1 mice on behavioural tasks taxing hippocampal and/or 5-HT function (a 'foraging' task, an object-location task, and the 1-choice serial reaction time task of impulsivity). Although Sts/Asmt deficiency did not influence foraging behaviour, reactivity to familiar objects in novel locations, or 'ability to wait', it did result in markedly increased response rates; these rates correlated with hippocampal 5-HT levels and are likely to index behavioural perseveration, a frequent feature of neurodevelopmental disorders. Additionally, we show that whilst there was no systematic relationship between serum DHEA levels and hippocampal 5-HT levels across 39,X(Y*)O and 40,XY mice, there was a significant inverse linear correlation between serum DHEA levels and activity. Our data suggest that deficiency for genes within Xp22.3 could influence core behavioural features of neurodevelopmental disorders via dissociable effects on hippocampal neurochemistry and steroid hormone levels, and that the mediating neurobiological mechanisms may be investigated in the 39,X(Y*)O model. Copyright © 2012 Elsevier Ltd. All rights reserved.
Cancer in the Minnesota Hmong population.
Ross, Julie A; Xie, Yang; Kiffmeyer, William R; Bushhouse, Sally; Robison, Leslie L
2003-06-15
The Hmong are an isolated, agrarian people who settled in the mountainous regions of what today are Vietnam, Cambodia, and Laos. After the Vietnam War, many Hmong were relocated to the U.S. Minnesota has the second largest population (after California) of Hmong individuals. The objective of this study was to examine cancer incidence in this population, because it may indicate areas for targeted surveillance and intervention. The Minnesota Cancer Surveillance System database was screened for Hmong surnames, and proportional incidence ratios (PIRs) were calculated for the period 1988-1999. Compared with all Minnesotans, the Hmong population had increased PIRs for nasopharyngeal cancer (PIR, 39.39; 95% confidence interval [95% CI], 21.01-66.86), gastric cancer (PIR, 8.70; 95% CI, 5.39-13.25), hepatic cancer (PIR, 8.08; 95% CI, 3.88-14.71), and cervical cancer (PIR, 3.72; 95% CI, 2.04-6.20) and had decreased PIRs for prostate cancer, breast cancer, Hodgkin disease, and melanoma. The current observations have implications for cancer control interventions. In particular, an increased incidence of cervical cancer might be addressed in part by targeting culturally sensitive screening programs in the Hmong population. Copyright 2003 American Cancer Society.
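A sketch of a proportional incidence ratio (PIR) with an approximate Poisson confidence interval follows; the counts are illustrative, not the registry's.

```python
# Sketch of a proportional incidence ratio (PIR): the proportion a given cancer
# contributes among Hmong cases divided by its proportion among all Minnesota
# cases, with an approximate Poisson confidence interval. Counts are
# illustrative, not the registry's.
import numpy as np

def pir(observed, group_total, state_site_cases, state_total):
    expected = group_total * state_site_cases / state_total
    ratio = observed / expected
    se_log = 1 / np.sqrt(observed)           # log-normal approximation
    return ratio, ratio * np.exp(-1.96 * se_log), ratio * np.exp(1.96 * se_log)

ratio, lo, hi = pir(observed=14, group_total=200,
                    state_site_cases=700, state_total=400_000)
print(f"PIR = {ratio:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```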
Green, Traci C.; Pouget, Enrique R.; Harrington, Magdalena; Taxman, Faye S.; Rhodes, Anne G.; O’Connell, Daniel; Martin, Steven S.; Prendergast, Michael; Friedmann, Peter D.
2012-01-01
Background To investigate how incarceration may affect risk of acquiring HIV and other sexually transmitted infections, we tested associations of ex-offenders’ sexual risk behavior with the male-female sex ratio and the male incarceration rate. Methods Longitudinal data from 1287 drug-involved persons on probation and parole as part of the Criminal Justice Drug Abuse Treatment Studies were matched by county of residence with population factors, and stratified by race/ethnicity and gender. Generalized estimating equations assessed associations of having unprotected sex with a partner who had HIV risk factors, and having more than 1 sex partner in the past month. Results Among non-Hispanic Black men and women, low sex ratios were associated with greater risk of having unprotected sex with a risky partner (Adjusted relative risk (ARR) = 1.76, 95% confidence interval (CI) = 1.29, 2.42; ARR = 2.48, 95% CI = 1.31, 4.73, respectively). Among non-Hispanic Black and non-Hispanic White women, low sex ratios were associated with having more than 1 sex partner (ARR = 2.00, 95% CI = 1.02, 3.94; ARR = 1.71, 95% CI = 1.06, 2.75, respectively). High incarceration rates were associated with greater risk of having a risky partner for all men (non-Hispanic Black: ARR= 2.14, 95% CI=1.39, 3.30; non-Hispanic White: ARR= 1.39, 95% CI: 1.05, 1.85; Hispanic: ARR = 3.99, 95% CI =1.55, 10.26) and having more than one partner among non-Hispanic White men (ARR= 1.92, 95% CI = 1.40, 2.64). Conclusions Low sex ratios and high incarceration rates may influence the number and risk characteristics of sex partners of ex-offenders. HIV-prevention policies and programs for ex-offenders could be improved by addressing structural barriers to safer sexual behavior. PMID:22592827
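A sketch of a GEE with a log link ("modified Poisson"), yielding adjusted relative risks for a binary outcome measured repeatedly within individuals, is given below; variable names and data are synthetic stand-ins for the study's measures.

```python
# Sketch of a GEE with a log link ("modified Poisson"), yielding adjusted
# relative risks for a binary outcome measured repeatedly within individuals.
# Variable names and data are synthetic stand-ins for the study's measures.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(6)
n_people, n_waves = 400, 3
df = pd.DataFrame({
    "pid": np.repeat(np.arange(n_people), n_waves),
    "low_sex_ratio": np.repeat(rng.integers(0, 2, n_people), n_waves),
    "high_incarceration": np.repeat(rng.integers(0, 2, n_people), n_waves),
})
logit = -1.2 + 0.5 * df["low_sex_ratio"] + 0.4 * df["high_incarceration"]
df["risky_partner"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

model = sm.GEE.from_formula(
    "risky_partner ~ low_sex_ratio + high_incarceration",
    groups="pid", data=df,
    family=sm.families.Poisson(),            # log link -> coefficients are log-RRs
    cov_struct=sm.cov_struct.Exchangeable(),
)
fit = model.fit()
print(np.exp(fit.params))                    # adjusted relative risks
print(np.exp(fit.conf_int()))                # 95% CIs on the RR scale
```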
Rising burden of gout in the UK but continuing suboptimal management: a nationwide population study
Kuo, Chang-Fu; Grainge, Matthew J; Mallen, Christian; Zhang, Weiya; Doherty, Michael
2015-01-01
Objectives To describe trends in the epidemiology of gout and patterns of urate-lowering treatment (ULT) in the UK general population from 1997 to 2012. Methods We used the Clinical Practice Research Datalink to estimate the prevalence and incidence of gout for each calendar year from 1997 to 2012. We also investigated the pattern of gout management for both prevalent and incident gout patients. Results In 2012, the prevalence of gout was 2.49% (95% CI 2.48% to 2.51%) and the incidence was 1.77 (95% CI 1.73 to 1.81) per 1000 person-years. Prevalence and incidence both were significantly higher in 2012 than in 1997, with a 63.9% increase in prevalence and 29.6% increase in incidence over this period. Regions with highest prevalence and incidence were the North East and Wales. Among prevalent gout patients in 2012, only 48.48% (95% CI 48.08% to 48.89%) were being consulted specifically for gout or treated with ULT and of these 37.63% (95% CI 37.28% to 38.99%) received ULT. In addition, only 18.6% (95% CI 17.6% to 19.6%) of incident gout patients received ULT within 6 months and 27.3% (95% CI 26.1% to 28.5%) within 12 months of diagnosis. The management of prevalent and incident gout patients remained essentially the same during the study period, although the percentage of adherent patients improved from 28.28% (95% CI 27.33% to 29.26%) in 1997 to 39.66% (95% CI 39.11% to 40.22%) in 2012. Conclusions In recent years, both the prevalence and incidence of gout have increased significantly in the UK. Suboptimal use of ULT has not changed between 1997 and 2012. Patient adherence has improved during the study period, but it remains poor. PMID:24431399
Karanika, Styliani; Paudel, Suresh; Zervou, Fainareti N.; Grigoras, Christos; Zacharioudakis, Ioannis M.; Mylonakis, Eleftherios
2016-01-01
Background. Intensive care unit (ICU) patients are at higher risk for Clostridium difficile infection (CDI). Methods. We performed a systematic review and meta-analysis of published studies from 1983 to 2015 using the PubMed, EMBASE, and Google Scholar databases to study the prevalence and outcomes of CDI in this patient population. Among the 9146 articles retrieved from the studies, 22 articles, which included a total of 80 835 ICU patients, were included in our final analysis. Results. The prevalence of CDI among ICU patients was 2% (95% confidence interval [CI], 1%–2%), and among diarrheic ICU patients the prevalence was 11% (95% CI, 6%–17%). Among CDI patients, 25% (95% CI, 5%–51%) were diagnosed with pseudomembranous colitis, and the estimated length of ICU stay before CDI acquisition was 10.74 days (95% CI, 5%–51%). The overall hospital mortality among ICU patients with CDI was 32% (95% CI, 26%–39%), compared with 24% (95% CI, 14%–36%) among those without CDI presenting a statistically significant difference in mortality risk (P = .030). It is worth noting that the length of ICU and hospital stay among CDI patients was significantly longer, compared with non-CDI patients (standardized mean of difference [SMD] = 0.49, 95% CI, .39%–.6%, P = .00 and SMD = 1.15, 95% CI, .44%–1.91%, P = .003, respectively). It is noteworthy that the morbidity score at ICU admission (Acute Physiology and Chronic Health Evaluation II [APACHE II]) was not statistically different between the 2 groups (P = .911), implying that the differences in outcomes can be attributed to CDI. Conclusions. The ICU setting is associated with higher prevalence of CDI. In this setting, CDI is associated with increased hospital mortality and prolonged ICU and overall hospital stay. These findings highlight the need for additional prevention and treatment studies in this setting. PMID:26788544
Emergency department characteristics and capabilities in Bogotá, Colombia.
Bustos, Yury; Castro, Jenny; Wen, Leana S; Sullivan, Ashley F; Chen, Dinah K; Camargo, Carlos A
2015-12-01
Emergency departments (EDs) are a critical, yet heterogeneous, part of international emergency care. The National ED Inventories (NEDI) survey has been used in multiple countries as a standardized method to benchmark ED characteristics. We sought to describe the characteristics, resources, capabilities, and capacity of EDs in the densely populated capital city of Bogotá, Colombia. Bogotá EDs accessible to the general public 24/7 were surveyed using the 23-item NEDI survey used in several other countries ( www.emnet-nedi.org ). ED staff were asked about ED characteristics with reference to calendar year 2011. Seventy EDs participated (82 % response). Most EDs (87 %) were located in hospitals, and 83 % were independent hospital departments. The median annual ED visit volume was approximately 50,000 visits. Approximately 90 % (95 % confidence interval (CI) 80-96 %) had a contiguous layout, with medical and surgical care provided in one area. Almost all EDs saw both adults and children (91 %), while 6 % saw only adults and 3 % saw only children. Availability of technological and consultant resources in EDs was variable. Nearly every ED had cardiac monitoring (99 %, 95 % CI 92-100 %), but less than half had a dedicated CT scanner (39 %, 95 % CI 28-52 %). While most EDs were able to treat trauma 24/7 (81 %, 95 % CI 69-89 %), few could manage oncological (22 %, 95 % CI 13-34 %) or dental (3 %, 95 % CI 0-11 %) emergencies 24/7. The typical ED length-of-stay was between 1 and 6 h in 59 % of EDs (95 % CI, 46-70 %), while most others reported that patients remained for >6 h (39 %). Almost half of respondents (46 %, 95 % CI 34-59 %) reported their ED was over capacity. Bogotá EDs have high annual visit volumes and long length-of-stay, and half are over capacity. To meet the emergency care needs of people in Bogotá and other large cities, Colombia should consider improving urban ED capacity and training more emergency medicine specialists capable of efficiently staffing its large and crowded EDs.
[Factors associated with early weaning in a Spanish region].
Rius, J M; Ortuño, J; Rivas, C; Maravall, M; Calzado, M A; López, A; Aguar, M; Vento, M
2014-01-01
Breastfeeding undoubtedly has great benefits. Previous studies have found an early dropout. Only a few studies have investigated related factors. Our aim was to determine ongoing breastfeeding rates over the first 12 months after birth and analyse factors associated with early weaning. This is a prospective study including consecutive pairs (mother and newborn) until completion of the required sample. Variables were collected through a structured program of surveys administered to the mothers. Bivariate and multivariate analysis of the data was performed. A total of 452 pairs were recruited. It was found that 81% of them started breastfeeding, with a prevalence of breastfeeding of 39% and 21% at 3 and 6 months after birth, respectively. Factors associated with early discontinuation of breastfeeding were: pregnancy induced by assisted reproduction methods (OR=5.58; 95% CI: 2.62-11.91), maternal smoking (OR=1.56; 95% CI: 1.10-2.22), poor maternal expectations about the duration of breastfeeding (OR=2.19; 95% CI: 1.49-3.23), use of nipple shields for breastfeeding (OR=2.57; 95% CI: 1.69-3.90), pacifier use on a regular basis during the first month after delivery (OR=1.39; 95% CI: 1.02-1.91), maternal university educational level (OR=0.59; 95% CI: 0.40-0.88), attending birth preparation programs during pregnancy (OR=0.68; 95% CI: 0.49-0.94), and believing that milk output was sufficient at the time of discharge (OR=0.66; 95% CI: 0.47-0.92). International recommendations about duration of breastfeeding are not achieved in our country because of high rates of early weaning. We describe the known factors involved and other novel factors. The implementation of interventions to increase breastfeeding rates and to prevent early weaning is strongly recommended. Copyright © 2012 Asociación Española de Pediatría. Published by Elsevier Espana. All rights reserved.
Tasks completed by nursing members of a teaching hospital Medical Emergency Team.
Topple, Michelle; Ryan, Brooke; Baldwin, Ian; McKay, Richard; Blythe, Damien; Rogan, John; Radford, Sam; Jones, Daryl
2016-02-01
To assess tasks completed by intensive care medical emergency team nurses. Prospective observational study. Australian teaching hospital. Nursing-related technical and non-technical tasks and level of self-reported confidence and competence. Amongst 400 calls, triggers and nursing tasks were captured in 93.5% and 77.3% of cases, respectively. The median patient age was 73 years. The four most common triggers were hypotension (22.0%), tachycardia (21.1%), low SpO2 (17.4%), and altered conscious state (10.1%). Non-technical skills included investigation review (33.7%), history acquisition (18.4%), contribution to the management plan (40.5%) and explanation to bedside nurses (78.3%), doctors (13.6%), allied health (3.9%) or patient/relative (39.5%). Technical tasks included examining the circulation (32%), conscious state (29.4%), and chest (26.5%). Additional tasks included adjusting oxygen (23.9%), humidification (8.4%), non-invasive ventilation (6.5%), performing an ECG (22%), and administrating fluid as a bolus (17.5%) or maintenance (16, 5.2%), or medication as a statim dose (16.8%) or infusion (5.2%). Self-reported competence and confidence appeared to be high overall amongst our MET nurses. Our findings provide important information on the tasks completed by Medical Emergency Team nurses and will guide future training. Copyright © 2015 Elsevier Ltd. All rights reserved.
Has, Recep; Akel, Esra Gilbaz; Kalelioglu, Ibrahim H; Dural, Ozlem; Yasa, Cenk; Esmer, Aytül Corbacioglu; Yuksel, Atıl; Yildirim, Alkan; Ibrahimoglu, Lemi; Ermis, Hayri
2016-02-01
The aim of this prospective observational study was to identify the best method for use in diagnosing fetal nasal bone (NB) hypoplasia in the second trimester as a means of predicting trisomy 21 (Down syndrome). The NB length (NBL), NBL percentiles, and NBL multiple-of-median (MoM) values and the biparietal diameter-to-NBL ratios were calculated and compared in an attempt to identify the best predictive method and most appropriate cutoff value. Predictive values for several cutoff points were calculated. Receiver operating characteristic curves at a fixed 5% false-positive rate were used to compare the four methods. NBL measurements were obtained from 2,211 (95.6%) of a total of 2,314 fetuses. Data from 1,689 of those 2,211 fetuses were used to obtain reference ranges, derive a linear regression equation, and calculate NBL percentiles and MoM values. Using a fixed 5% false-positive rate, we found 25.5% sensitivity for NBL (95% confidence interval [CI], 15-39.1) and 23.5% sensitivity for NBL percentiles (95% CI, 13.4-37), NBL MoM values (95% CI, 13.4-37), and biparietal diameter-to-NBL ratios (95% CI, 13.4-37). Our study demonstrated that all four methods can be used in the second trimester for diagnosing fetal NB hypoplasia as a means of predicting trisomy 21 because their predictive values are similar at a fixed 5% false-positive rate. For simplicity of use, we recommend using 3 mm as the NBL cutoff value. © 2015 Wiley Periodicals, Inc.
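A multiple-of-the-median (MoM) value expresses an observed nasal bone length as a ratio to the median NBL expected for the same gestational age, with the expected median typically taken from a regression of NBL on gestational age in the reference group. The brief Python sketch below illustrates that conversion; the regression intercept and slope are placeholder values for illustration only, not the coefficients derived in this study.

def nbl_mom(observed_nbl_mm, gestational_age_weeks, intercept=-1.0, slope=0.35):
    """Convert an observed nasal bone length to a multiple of the median (MoM).

    The expected median NBL comes from a linear regression on gestational age;
    the intercept and slope here are illustrative placeholders, not the
    study's fitted coefficients.
    """
    expected_median = intercept + slope * gestational_age_weeks
    return observed_nbl_mm / expected_median

# Example: a 4.2 mm nasal bone at 20 weeks against an expected median of 6.0 mm.
print(f"NBL MoM = {nbl_mom(4.2, 20):.2f}")   # 0.70 MoM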
Effects of noise and task loading on a communication task
NASA Astrophysics Data System (ADS)
Orrell, Dean H., II
Previous research had shown the effect of noise on a single communication task. This research has been criticized as not being representative of a real world situation since subjects allocated all of their attention to only one task. In the present study, the effect of adding a loading task to a standard noise-communication paradigm was investigated. Subjects performed both a communication task (Modified Rhyme Test; House et al. 1965) and a short term memory task (Sternberg, 1969) in simulated levels of aircraft noise (95, 105 and 115 dB overall sound pressure level (OASPL)). Task loading was varied with Sternberg's task by requiring subjects to memorize one, four, or six alphanumeric characters. Simulated aircraft noise was varied between levels of 95, 105 and 115 dB OASPL using a pink noise source. Results show that the addition of Sternberg's task had little effect on the intelligibility of the communication task, while response time for the communication task increased.
Advancing the Certified in Public Health Examination: A Job Task Analysis.
Kurz, Richard S; Yager, Christopher; Yager, James D; Foster, Allison; Breidenbach, Daniel H; Irwin, Zachary
In 2014, the National Board of Public Health Examiners performed a job task analysis (JTA) to revise the Certified in Public Health (CPH) examination. The objectives of this study were to describe the development, administration, and results of the JTA survey; to present an analysis of the survey results; and to review the implications of this first-ever public health JTA. An advisory committee of public health professionals developed a list of 200 public health job tasks categorized into 10 work domains. The list of tasks was incorporated into a web-based survey, and a snowball sample of public health professionals provided 4850 usable responses. Respondents rated job tasks as essential (4), very important (3), important (2), not very important (1), and never performed (0). The mean task importance ratings ranged from 2.61 to 3.01 (important to very important). The highest mean ratings were for tasks in the ethics domain (mean rating, 3.01). Respondents ranked 10 of the 200 tasks as the most important, with mean task rankings ranging from 2.98 to 3.39. We found subtle differences between male and female respondents and between master of public health and doctor of public health respondents in their rankings. The JTA established a set of job tasks in 10 public health work domains, and the results provided a foundation for refining the CPH examination. Additional steps are needed to further modify the content outline of the examination. An empirical assessment of public health job tasks, using methods such as principal components analysis, may provide additional insight.
Rodriguez-Lopez, Jesica S.; Ramos, Marcel; Islam, Nadia; Trinh-Shevrin, Chau; Yi, Stella S.; Chernov, Claudia; Perlman, Sharon E.; Thorpe, Lorna E.
2017-01-01
Introduction Racial/ethnic minority adults have higher rates of hypertension than non-Hispanic white adults. We examined the prevalence of hypertension among Hispanic and Asian subgroups in New York City. Methods Data from the 2013–2014 New York City Health and Nutrition Examination Survey were used to assess hypertension prevalence among adults (aged ≥20) in New York City (n = 1,476). Hypertension was measured (systolic blood pressure ≥140 mm Hg or diastolic blood pressure ≥90 mm Hg or self-reported hypertension and use of blood pressure medication). Participants self-reported race/ethnicity and country of origin. Multivariable logistic regression models assessed differences in prevalence by race/ethnicity and sociodemographic and health-related characteristics. Results Overall hypertension prevalence among adults in New York City was 33.9% (43.5% for non-Hispanic blacks, 38.0% for Asians, 33.0% for Hispanics, and 27.5% for non-Hispanic whites). Among Hispanic adults, prevalence was 39.4% for Dominican, 34.2% for Puerto Rican, and 27.5% for Central/South American adults. Among Asian adults, prevalence was 43.0% for South Asian and 39.9% for East/Southeast Asian adults. Adjusting for age, sex, education, and body mass index, 2 major racial/ethnic minority groups had higher odds of hypertension than non-Hispanic whites: non-Hispanic black (AOR [adjusted odds ratio], 2.6; 95% confidence interval [CI], 1.7–3.9) and Asian (AOR, 2.0; 95% CI, 1.2–3.4) adults. Two subgroups had greater odds of hypertension than the non-Hispanic white group: East/Southeast Asian adults (AOR, 2.8; 95% CI, 1.6–4.9) and Dominican adults (AOR, 1.9; 95% CI, 1.1–3.5). Conclusion Racial/ethnic minority subgroups vary in hypertension prevalence, suggesting the need for targeted interventions. PMID:28427484
Cognitive Task Analysis of the HALIFAX-Class Operations Room Officer: Data Sheets. Annexes
1999-03-10
Unclassified report (document number 510920), Human Systems Incorporated, Guelph, Ontario, 1999. Annexes to: Cognitive Task Analysis of the HALIFAX-Class Operations Room Officer.
Pronk, A; Yu, F; Vlaanderen, J; Tielemans, E; Preller, L; Bobeldijk, I; Deddens, J A; Latza, U; Baur, X; Heederik, D
2006-01-01
Objectives To study inhalation and dermal exposure to hexamethylene diisocyanate (HDI) and its oligomers as well as personal protection equipment (PPE) use during task performance in conjunction with urinary hexamethylene diamine (HDA) in car body repair shop workers and industrial spray painters. Methods Personal task based inhalation samples (n = 95) were collected from six car body repair shops and five industrial painting companies using impingers with di‐n‐butylamine (DBA) in toluene. In parallel, dermal exposure was assessed using nitril rubber gloves. Gloves were submerged into DBA in toluene after sampling. Analysis for HDI and its oligomers was performed by LC‐MS/MS. Urine samples were collected from 55 workers (n = 291) and analysed for HDA by GC‐MS. Results Inhalation exposure was strongly associated with tasks during which aerosolisation occurs. Dermal exposure occurred during tasks that involve direct handling of paint. In car body repair shops associations were found between detectable dermal exposure and glove use (odds ratio (OR) 0.22, 95% confidence interval (CI) 0.09 to 0.57) and inhalation exposure level (OR 1.34, 95% CI 0.97 to 1.84 for a 10‐fold increase). HDA in urine could be demonstrated in 36% and 10% of car body repair shop workers and industrial painting company workers respectively. In car body repair shops, the frequency of detectable HDA was significantly elevated at the end of the working day (OR 2.13, 95% CI 1.07 to 4.22 for 3–6 pm v 0–8 am). In both branches HDA was detected in urine of ∼25% of the spray painters. In addition HDA was detected in urine of a large proportion of non‐spray painters in car body repair shops. Conclusion Although (spray) painting with lacquers containing isocyanate hardeners results in the highest external exposures to HDI and oligomers, workers that do not perform paint related tasks may also receive a considerable internal dose. PMID:16728504
Navy-wide Personnel Survey (NPS) 2003: Tabulated Results
2006-01-01
Physical performance limitations among adult survivors of childhood brain tumors.
Ness, Kirsten K; Morris, E Brannon; Nolan, Vikki G; Howell, Carrie R; Gilchrist, Laura S; Stovall, Marilyn; Cox, Cheryl L; Klosky, James L; Gajjar, Amar; Neglia, Joseph P
2010-06-15
Young adult survivors of childhood brain tumors (BTs) may have late effects that compromise physical performance and everyday task participation. The objective of this study was to evaluate muscle strength, fitness, physical performance, and task participation among adult survivors of childhood BTs. In-home evaluations and interviews were conducted for 156 participants (54% men). Results on measures of muscle strength, fitness, physical performance, and participation were compared between BT survivors and members of a population-based comparison group by using chi-square statistics and 2-sample t tests. Associations between late effects and physical performance and between physical performance and participation were evaluated in regression models. The median age of BT survivors was 22 years (range, 18-58 years) at the time of the current evaluation, and they had survived for a median of 14.7 years (range, 6.5-45.9 years) postdiagnosis. Survivors had lower estimates of grip strength (women, 24.7 ± 9.2 kg vs 31.5 ± 5.8 kg; men, 39.0 ± 12.2 kg vs 53.0 ± 10.1 kg), knee extension strength (women, 246.6 ± 95.5 Newtons [N] vs 331.5 ± 5.8 N; men, 304.7 ± 116.4 N vs 466.6 ± 92.1 N), and peak oxygen uptake (women, 25.1 ± 8.8 mL/kg per minute vs 31.3 ± 5.1 mL/kg per minute; men, 24.6 ± 9.5 mL/kg per minute vs 33.2 ± 3.4 mL/kg per minute) than members of the population-based comparison group. Physical performance was lower among survivors and was associated with not living independently (odds ratio [OR], 5.0; 95% confidence interval [CI], 2.0-12.2) and not attending college (OR, 2.3; 95% CI 1.2-4.4). Muscle strength and fitness values among BT survivors were similar to those among individuals aged ≥60 years and were associated with physical performance limitations. Physical performance limitations were associated with poor outcomes in home and school environments. The current data indicated an opportunity for interventions targeted at improving long-term physical function in this survivor population.
Taylor, Lee; Fitch, Natalie; Castle, Paul; Watkins, Samuel; Aldous, Jeffrey; Sculthorpe, Nicholas; Midgely, Adrian; Brewer, John; Mauger, Alexis
2014-01-01
Soccer referees enforce the laws of the game, and the decisions they make can directly affect match results. Fixtures within European competitions take place in climatic conditions that are often challenging (e.g., Moscow ~ -5°C, Madrid ~30°C). The effects of these temperatures on player performance are well documented; however, little is known about how this environmental stress may impair the cognitive performance of soccer referees and, if so, whether exercise exacerbates it. The present study aims to investigate the effect of cold [COLD; -5°C, 40% relative humidity (RH)], hot (HOT; 30°C, 40% RH) and temperate (CONT; 18°C, 40% RH) conditions on decision making during soccer-specific exercise. On separate occasions within each condition, 13 physically active males (either semi-professional referees or semi-professional soccer players) completed three 90 min intermittent treadmill protocols that simulated match play, interspersed with 4 computer-delivered cognitive tests to measure vigilance and dual-task capacity. Core and skin temperature, heart rate, rating of perceived exertion (RPE) and thermal sensation (TS) were recorded throughout the protocol. There was no significant difference between conditions for decision making in either the dual task (interaction effects: FALSE p = 0.46; MISSED p = 0.72; TRACKING p = 0.22) or vigilance assessments (interaction effects: FALSE p = 0.31; HIT p = 0.15; MISSED p = 0.17) despite significant differences in measured physiological variables (skin temperature: HOT vs. CONT 95% CI = 2.6 to 3.9, p < 0.001; HOT vs. COLD 95% CI = 6.6 to 9.0, p < 0.001; CONT vs. COLD 95% CI = 3.4 to 5.7, p < 0.01). It is hypothesized that the lack of difference observed in decision-making ability between conditions was due to the exercise protocol used, as it may not have elicited an appropriate and valid soccer-specific internal load to alter cognitive functioning.
Vetterlein, Malte W; Dalela, Deepansh; Sammon, Jesse D; Karabon, Patrick; Sood, Akshay; Jindal, Tarun; Meyer, Christian P; Löppenberg, Björn; Sun, Maxine; Trinh, Quoc-Dien; Menon, Mani; Abdollah, Firas
2018-02-01
To evaluate state-by-state trends in prostate-specific antigen (PSA) screening prevalence after the 2011 United States Preventive Services Task Force (USPSTF) recommendation against this practice. We included 222,475 men who responded to the Behavioral Risk Factor Surveillance System 2012 and 2014 surveys, corresponding to early and late post-USPSTF populations. Logistic regression was used to identify predictors of PSA screening and to calculate the adjusted and weighted state-by-state PSA screening prevalence and respective relative percent changes between 2012 and 2014. To account for unmeasured factors, the correlation between changes in PSA screening over time and changes in screening for colorectal and breast cancer was assessed. All analyses were conducted in 2016. Overall, 38.9% (95% confidence interval [CI] = 38.6%-39.2%) reported receiving PSA screening in 2012 vs 35.8% (95% CI = 35.1%-36.2%) in 2014. State of residence, age, race, education, income, insurance, access to care, marital status, and smoking status were independent predictors of PSA screening in both years (all P <.001). In adjusted analyses, the nationwide PSA screening prevalence decreased by a relative 8.5% (95% CI = 6.4%-10.5%; P <.001) between 2012 and 2014. There was vast state-by-state heterogeneity, ranging from a relative 26.6% decrease in Vermont to a relative 10.2% increase in Hawaii. Overall, 81.5% and 84.0% of the observed changes were not accompanied by matching changes in respective colorectal and breast cancer screening utilization, for which there were no updates in USPSTF recommendations. There is significant state-by-state variation in PSA screening trends following the 2011 USPSTF recommendation. Further research is needed to elucidate the reasons for this heterogeneity in screening behavior among the states. Copyright © 2017 Elsevier Inc. All rights reserved.
Wang, Ya; Liu, Lu-lu; Gan, Ming-yuan; Tan, Shu-ping; Shum, David; Chan, Raymond
2017-01-01
Abstract Background: Prospective memory (PM) refers to remembering to execute a planned intention in the future, and it can be divided into event-based PM (focal, nonfocal) and time-based PM according to the nature of the cue. Focal event-based PM, where the ongoing task requires processing of the characteristics of PM cues, has been found to benefit from implementation intention (II, ie, an encoding strategy in the format of “if I see X, then I will do Y”). However, to date, it is unclear whether implementation intention can produce a positive effect on nonfocal event-based PM (where the ongoing task is irrelevant to the PM cues) and time-based PM. Moreover, patients with schizophrenia (SCZ) have been found to have impairments in these types of PM, and few studies have been conducted to examine the effect of II on them. This study investigated whether (and how) implementation intention can improve nonfocal event-based PM and time-based PM performance in patients with SCZ. Methods: Forty-two patients with SCZ and 42 healthy control participants were administered both a computerized nonfocal event-based PM task and a time-based PM task. Patients and healthy controls were further randomly allocated to an implementation intention condition (N = 21) or a typical instruction condition (N = 21). Results: Patients with SCZ in the implementation intention group showed higher PM accuracy than the typical instruction group in both the nonfocal event-based PM task (0.51 ± 0.32 vs 0.19 ± 0.29, t(40) = 3.39, P = .002) and the time-based PM task (0.72 ± 0.31 vs 0.39 ± 0.40, t(40) = 2.98, P = .005). Similarly, healthy controls in the II group also showed better PM performance than the typical instruction group in both tasks (all P's < 0.05). Time-check frequency in the time-based PM task was significantly higher in the II group than in the typical instruction group across all participants. Conclusion: Implementation intention is an effective strategy for improving different types of PM performance in patients with schizophrenia and can be applied in clinical settings.
Sundstrup, Emil; Jakobsen, Markus D; Brandt, Mikkel; Jay, Kenneth; Ajslev, Jeppe Z N; Andersen, Lars L
2016-11-01
We aimed to determine the association of work, health, and lifestyle with regular use of pain medication due to musculoskeletal disorders in the general working population. Currently employed wage earners (N = 10,024) replied to questions about health, work, and lifestyle. The odds for regularly using medication for musculoskeletal disorders were modeled using logistic regression controlled for various confounders. Pain intensity increased the odds for using pain medication in a dose-response fashion. With seated work as reference, the odds for using pain medication were 1.26 (95%CI: 1.09-1.47) for workers engaged in standing or walking work that is not strenuous and 1.59 (95%CI: 1.39-1.82) for workers engaged in standing or walking work with lifting tasks or heavy and fast strenuous work. Workers with higher levels of physical activity at work are more likely to use pain medication on a regular basis for musculoskeletal disorders, even when adjusting for pain intensity, lifestyle, and influence at work. Am. J. Ind. Med. 59:934-941, 2016. © 2016 Wiley Periodicals, Inc.
Lundin, Jessica I.; Checkoway, Harvey; Criswell, Susan R.; Hobson, Angela; Harris, Rachel C.; Swisher, Laura M.; Evanoff, Bradley A.; Racette, Brad A.
2013-01-01
Background Manganese (Mn) is a common component of welding fume. Exposure to Mn fume has been associated with parkinsonism. A simple and reliable screening tool to evaluate Mn-exposed workers for neurotoxic injury would have broad occupational health application. Methods This study investigated 490 occupational welders recruited from a trade union list. Subjects were examined by a movement disorders specialist using the Unified Parkinson Disease Rating Scale motor subsection 3 (UPDRS3). Parkinsonism, intermediate, and normal groups were defined as UPDRS3 score ≥15, 6–15, and <6, respectively. Workers completed a health status questionnaire (PDQ39) and a Parkinson’s disease (PD) Symptoms Questionnaire. Areas under the receiver operating characteristic curve (AUC) were analyzed based on these scores, adjusted for age, smoking, race, gender, and neurologist, using normal as the reference. Results The AUC was 0.79 (95% Confidence Interval [CI] = 0.73–0.84) for PDQ39 and 0.78 (95% CI = 0.72–0.85) for PD Symptoms Questionnaire score. At 70% sensitivity, the specificity for PDQ39 score and PD Symptoms Questionnaire score for the prediction of parkinsonism was 73.1% and 80.1%, respectively. Conclusions These results suggest the questionnaires have reasonably good sensitivity and specificity to predict parkinsonism in Mn-exposed workers. These questionnaires could be a valuable first step in a tiered screening approach for Mn-exposed workers. PMID:24035927
Comorbidities in patients with gout prior to and following diagnosis: case-control study
Kuo, Chang-Fu; Grainge, Matthew J; Mallen, Christian; Zhang, Weiya; Doherty, Michael
2016-01-01
Objectives To determine the burden of comorbidities in patients with gout at diagnosis and the risk of developing new comorbidities post diagnosis. Methods There were 39 111 patients with incident gout and 39 111 matched controls identified from the UK Clinical Practice Research Datalink. The risks of comorbidity before (ORs) and after (HRs) the diagnosis of gout were estimated, adjusted for age, sex, diagnosis year, body mass index, smoking and alcohol consumption. Results Gout was associated with adjusted ORs (95% CIs) of 1.39 (1.34 to 1.45), 1.89 (1.76 to 2.03) and 2.51 (2.19 to 2.86) for the Charlson index of 1–2, 3–4 and ≥5, respectively. Cardiovascular and genitourinary diseases, in addition to hyperlipidaemia, hypothyroidism, anaemia, psoriasis, chronic pulmonary diseases, osteoarthritis and depression, were associated with a higher risk for gout. Gout was also associated with an adjusted HR (95% CI) of 1.41 (1.34 to 1.48) for having a Charlson index ≥1. Median time to first comorbidity was 43 months in cases and 111 months in controls. Risks for incident comorbidity were higher in cardiovascular, genitourinary, metabolic/endocrine and musculoskeletal diseases, in addition to liver diseases, hemiplegia, depression, anaemia and psoriasis in patients with gout. After additionally adjusting for all comorbidities at diagnosis, gout was associated with a HR (95% CI) for all-cause mortality of 1.13 (1.08 to 1.18; p<0.001). Conclusions The majority of patients with gout have worse pre-existing health status at diagnosis and the risk of incident comorbidity continues to rise following diagnosis. The range of associated comorbidities is broader than previously recognised and merits further evaluation. PMID:25398375
Johnson, Leigh F.; Stinson, Kathryn; Newell, Marie-Louise; Bland, Ruth M.; Moultrie, Harry; Davies, Mary-Ann; Rehle, Thomas M.; Dorrington, Rob E.; Sherman, Gayle G.
2012-01-01
Background The prevention of mother-to-child transmission (PMTCT) of HIV has been focused mainly on women who are HIV-positive at their first antenatal visit, but there is uncertainty regarding the contribution to overall transmission from mothers who seroconvert after their first antenatal visit and before weaning. Method A mathematical model was developed to simulate changes in mother-to-child transmission of HIV over time, in South Africa. The model allows for changes in infant feeding practices as infants age, temporal changes in the provision of antiretroviral prophylaxis and counselling on infant feeding, as well as temporal changes in maternal HIV prevalence and incidence. Results The proportion of MTCT from mothers who seroconverted after their first antenatal visit was 26% (95% CI: 22-30%) in 2008, or 15 000 out of 57 000 infections. It is estimated that by 2014, total MTCT will reduce to 39 000 per annum, and transmission from mothers seroconverting after their first antenatal visit will reduce to 13 000 per annum, accounting for 34% (95% CI: 29-39%) of MTCT. If maternal HIV incidence during late pregnancy and breastfeeding were reduced by 50% after 2010, and HIV screening were repeated in late pregnancy and at 6-week immunization visits after 2010, the average annual number of MTCT cases over the 2010-15 period would reduce by 28% (95% CI: 25-31%), from 39 000 to 28 000 per annum. Conclusion Maternal seroconversion during late pregnancy and breastfeeding contributes significantly to the paediatric HIV burden, and needs greater attention in the planning of PMTCT programmes. PMID:22193774
Single-task and dual-task tandem gait test performance after concussion.
Howell, David R; Osternig, Louis R; Chou, Li-Shan
2017-07-01
To compare single-task and dual-task tandem gait test performance between athletes after concussion and controls on observer-timed, spatio-temporal, and center-of-mass (COM) balance control measurements. Ten participants (19.0±5.5 years) were prospectively identified and completed a tandem gait test protocol within 72 h of concussion and again 1 week, 2 weeks, 1 month, and 2 months post-injury. Seven uninjured controls (20.0±4.5 years) completed the same protocol in similar time increments. Tandem gait test trials were performed with (dual-task) and without (single-task) concurrently performing a cognitive test as whole-body motion analysis was performed. Outcome variables included test completion time, average tandem gait velocity, cadence, and whole-body COM frontal plane displacement. Concussion participants took significantly longer to complete the dual-task tandem gait test than controls throughout the first 2 weeks post-injury (mean time=16.4 [95% CI: 13.4-19.4] vs. 10.1 [95% CI: 6.4-13.7] seconds; p=0.03). Single-task tandem gait times were significantly lower 72 h post-injury (p=0.04). Dual-task cadence was significantly lower for concussion participants than controls (89.5 [95% CI: 68.6-110.4] vs. 127.0 [95% CI: 97.4-156.6] steps/minute; p=0.04). Moderately-high to high correlations between tandem gait test time and whole-body COM medial-lateral displacement were detected at each time point during dual-task gait (rs = 0.70–0.93; p = 0.03–0.001). Adding a cognitive task during the tandem gait test resulted in longer detectable deficits post-concussion compared to the traditional single-task tandem gait test. As a clinical tool to assess dynamic motor function, tandem gait may assist with return to sport decisions after concussion. Copyright © 2017 Sports Medicine Australia. Published by Elsevier Ltd. All rights reserved.
The effects of food advertising and cognitive load on food choices
2014-01-01
Background Advertising has been implicated in the declining quality of the American diet, but much of the research has been conducted with children rather than adults. This study tested the effects of televised food advertising on adult food choice. Methods Participants (N = 351) were randomized into one of 4 experimental conditions: exposure to food advertising vs. exposure to non-food advertising, and within each of these groups, exposure to a task that was either cognitively demanding or not cognitively demanding. The number of unhealthy snacks chosen was subsequently measured, along with total calories of the snacks chosen. Results Those exposed to food advertising chose 28% more unhealthy snacks than those exposed to non-food-advertising (95% CI: 7% - 53%), with a total caloric value that was 65 kcal higher (95% CI: 10-121). The effect of advertising was not significant among those assigned to the low-cognitive-load group, but was large and significant among those assigned to the high-cognitive-load group: 43% more unhealthy snacks (95% CI: 11% - 85%) and 94 more total calories (95% CI: 19-169). Conclusions Televised food advertising has strong effects on individual food choice, and these effects are magnified when individuals are cognitively occupied by other tasks. PMID:24721289
Intubation Methods by Novice Intubators in a Manikin Model
O'Carroll, Darragh C; Aratani, Ashley K; Lee, Dane C; Lau, Christopher A; Morton, Paul N; Yamamoto, Loren G; Berg, Benjamin W
2013-01-01
Tracheal intubation is an important yet difficult skill to learn, with many possible methods and techniques. Direct laryngoscopy is the standard method of tracheal intubation, but several instruments have been shown to be less difficult and to have better performance characteristics than the traditional direct method. We compared 4 different intubation methods performed by novice intubators on manikins: conventional direct laryngoscopy, video laryngoscopy, Airtraq® laryngoscopy, and fiberoptic laryngoscopy. In addition, we attempted to find a correlation between playing videogames and intubation times in novice intubators. Video laryngoscopy had the best results for both our normal and difficult airway (cervical spine immobilization) manikin scenarios. When video was compared to direct in the normal airway scenario, it had a significantly higher success rate (100% vs 83%, P=.02) and shorter intubation times (29.1±27.4 sec vs 45.9±39.5 sec, P=.03). In the difficult airway scenario video laryngoscopy maintained a significantly higher success rate (91% vs 71%, P=0.04) and likelihood of success (3.2±1.0 95%CI [2.9–3.5] vs 2.4±0.9 95%CI [2.1–2.7]) when compared to direct laryngoscopy. Participants also reported significantly higher rates of self-confidence (3.5±0.6 95%CI [3.3–3.7]) and ease of use (1.5±0.7 95%CI [1.3–1.8]) with video laryngoscopy compared to all other methods. We found no correlation between videogame playing and intubation methods. PMID:24167768
Evaluation of Antimicrobial Stewardship-Related Alerts Using a Clinical Decision Support System.
Ghamrawi, Riane J; Kantorovich, Alexander; Bauer, Seth R; Pallotta, Andrea M; Sekeres, Jennifer K; Gordon, Steven M; Neuner, Elizabeth A
2017-11-01
Background: Information technology, including clinical decision support systems (CDSS), has an increasingly important and growing role in identifying opportunities for antimicrobial stewardship-related interventions. Objective: The aim of this study was to describe and compare types and outcomes of CDSS-built antimicrobial stewardship alerts. Methods: Fifteen alerts were evaluated in the initial antimicrobial stewardship program (ASP) review. Preimplementation, alerts were reviewed retrospectively. Postimplementation, alerts were reviewed in real time. Data collection included total number of actionable alerts, recommendation acceptance rates, and time spent on each alert. Time to de-escalation to narrower-spectrum agents was collected. Results: In total, 749 alerts were evaluated. Overall, 306 (41%) alerts were actionable (173 preimplementation, 133 postimplementation). Rates of actionable alerts were similar for custom-built and prebuilt alert types (39% [53 of 135] vs 41% [253 of 614], P = .68). In the postimplementation group, an intervention was attempted in 97% of actionable alerts and 70% of interventions were accepted. The median time spent per alert was 7 minutes (interquartile range [IQR], 5-13 minutes; 15 [12-17] minutes for actionable alerts vs 6 [5-7] minutes for nonactionable alerts, P < .001). In cases where the antimicrobial was eventually de-escalated, the median time to de-escalation was 28.8 hours (95% confidence interval [CI], 10.0-69.1 hours) preimplementation vs 4.7 hours (95% CI, 2.4-22.1 hours) postimplementation, P < .001. Conclusions: CDSS have played an important role in ASPs to help identify opportunities to optimize antimicrobial use through prebuilt and custom-built alerts. As ASP roles continue to expand, focusing time on customizing institution-specific alerts will be of vital importance to help redistribute the time needed to manage other ASP tasks and opportunities.
Amengual, O; Forastiero, R; Sugiura-Ogasawara, M; Otomo, K; Oku, K; Favas, C; Delgado Alves, J; Žigon, P; Ambrožič, A; Tomšič, M; Ruiz-Arruza, I; Ruiz-Irastorza, G; Bertolaccini, M L; Norman, G L; Shums, Z; Arai, J; Murashima, A; Tebo, A E; Gerosa, M; Meroni, P L; Rodriguez-Pintó, I; Cervera, R; Swadzba, J; Musial, J; Atsumi, T
2017-03-01
Objective A task force of scientists at the International Congress on Antiphospholipid Antibodies recognized that phosphatidylserine-dependent antiprothrombin antibodies (aPS/PT) might contribute to a better identification of antiphospholipid syndrome (APS). Accordingly, initial and replication retrospective, cross-sectional multicentre studies were conducted to ascertain the value of aPS/PT for APS diagnosis. Methods In the initial study (eight centres, seven countries), clinical/laboratory data were retrospectively collected. Serum/plasma samples were tested for IgG aPS/PT at Inova Diagnostics (Inova) using two ELISA kits. A replication study (five centres, five countries) was carried out afterwards. Results In the initial study (n = 247), a moderate agreement between the IgG aPS/PT Inova and MBL ELISA kits was observed (k = 0.598). IgG aPS/PT were more prevalent in APS patients (51%) than in those without (9%), OR 10.8, 95% CI (4.0-29.3), p < 0.0001. Sensitivity, specificity, positive (LR+) and negative (LR-) likelihood ratios of IgG aPS/PT for APS diagnosis were 51%, 91%, 5.9 and 0.5, respectively. In the replication study (n = 214), a moderate/substantial agreement between the IgG aPS/PT results obtained with both ELISA kits was observed (k = 0.630). IgG aPS/PT were more prevalent in APS patients (47%) than in those without (12%), OR 6.4, 95% CI (2.6-16), p < 0.0001. Sensitivity, specificity, LR+ and LR- for APS diagnosis were 47%, 88%, 3.9 and 0.6, respectively. Conclusions IgG aPS/PT detection is an easily performed laboratory parameter that might contribute to a better and more complete identification of patients with APS.
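The likelihood ratios reported above follow directly from sensitivity and specificity: LR+ = sensitivity / (1 − specificity) and LR− = (1 − sensitivity) / specificity. The minimal Python sketch below reproduces that kind of 2×2-table summary; the cell counts are illustrative only (chosen to give roughly the initial-study figures), not the study's raw data.

def diagnostic_summary(tp, fn, fp, tn):
    """Sensitivity, specificity and likelihood ratios from a 2x2 table."""
    sensitivity = tp / (tp + fn)          # true positives among patients with APS
    specificity = tn / (tn + fp)          # true negatives among patients without APS
    lr_pos = sensitivity / (1 - specificity)
    lr_neg = (1 - sensitivity) / specificity
    return sensitivity, specificity, lr_pos, lr_neg

# Illustrative counts only: about 51% sensitivity and 91% specificity,
# which gives LR+ near 5.9 and LR- near 0.5.
sens, spec, lrp, lrn = diagnostic_summary(tp=51, fn=49, fp=9, tn=91)
print(f"sensitivity={sens:.2f} specificity={spec:.2f} LR+={lrp:.1f} LR-={lrn:.1f}")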
Difficulty in detecting discrepancies in a clinical trial report: 260-reader evaluation
Cole, Graham D; Shun-Shin, Matthew J; Nowbar, Alexandra N; Buell, Kevin G; Al-Mayahi, Faisal; Zargaran, David; Mahmood, Saliha; Singh, Bharpoor; Mielewczik, Michael; Francis, Darrel P
2015-01-01
Background: Scientific literature can contain errors. Discrepancies, defined as two or more statements or results that cannot both be true, may be a signal of problems with a trial report. In this study, we report how many discrepancies are detected by a large panel of readers examining a trial report containing a large number of discrepancies. Methods: We approached a convenience sample of 343 journal readers in seven countries, and invited them in person to participate in a study. They were asked to examine the tables and figures of one published article for discrepancies. 260 participants agreed, ranging from medical students to professors. The discrepancies they identified were tabulated and counted. There were 39 different discrepancies identified. We evaluated the probability of discrepancy identification, and whether more time spent or greater participant experience as academic authors improved the ability to detect discrepancies. Results: Overall, 95.3% of discrepancies were missed. Most participants (62%) were unable to find any discrepancies. Only 11.5% noticed more than 10% of the discrepancies. More discrepancies were noted by participants who spent more time on the task (Spearman’s ρ = 0.22, P < 0.01), and those with more experience of publishing papers (Spearman’s ρ = 0.13 with number of publications, P = 0.04). Conclusions: Noticing discrepancies is difficult. Most readers miss most discrepancies even when asked specifically to look for them. The probability of a discrepancy evading an individual sensitized reader is 95%, making it important that, when problems are identified after publication, readers are able to communicate with each other. When made aware of discrepancies, the majority of readers support editorial action to correct the scientific record. PMID:26174517
Eye-tracking-based assessment of cognitive function in low-resource settings.
Forssman, Linda; Ashorn, Per; Ashorn, Ulla; Maleta, Kenneth; Matchado, Andrew; Kortekangas, Emma; Leppänen, Jukka M
2017-04-01
Early development of neurocognitive functions in infants can be compromised by poverty, malnutrition and lack of adequate stimulation. Optimal management of neurodevelopmental problems in infants requires assessment tools that can be used early in life, and are objective and applicable across economic, cultural and educational settings. The present study examined the feasibility of infrared eye tracking as a novel and highly automated technique for assessing visual-orienting and sequence-learning abilities as well as attention to facial expressions in young (9-month-old) infants. Techniques piloted in a high-resource laboratory setting in Finland (N=39) were subsequently field-tested in a community health centre in rural Malawi (N=40). Parents' perception of the acceptability of the method (Finland 95%, Malawi 92%) and percentages of infants completing the whole eye-tracking test (Finland 95%, Malawi 90%) were high, and percentages of valid test trials (Finland 69-85%, Malawi 68-73%) were satisfactory at both sites. Test completion rates were slightly higher for eye tracking (90%) than traditional observational tests (87%) in Malawi. The predicted response pattern indicative of specific cognitive function was replicated in Malawi, but Malawian infants exhibited lower response rates and slower processing speed across tasks. High test completion rates and the replication of the predicted test patterns in a novel environment in Malawi support the feasibility of eye tracking as a technique for assessing infant development in low-resource settings. Further research is needed on the test-retest stability and predictive validity of the eye-tracking scores in low-income settings. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.
Incidence and Nature of Medical Attendance Injuries in English Community Rugby Union
Roberts, Simon P.; Trewartha, Grant; England, Mike; Stokes, Keith A.
2014-01-01
Background: Previous research has identified injury patterns during community-level rugby union match play, but none have investigated the frequency and reasons for on-field injury management. Purpose: To establish the frequency, reasons, and patterns of on-field injury management in English community rugby, including differences between different levels of play. Study Design: Descriptive epidemiology study. Methods: Over 3 seasons, injury information was collected from 46 (2009-2010), 67 (2010-2011), and 76 (2011-2012) English community clubs (Rugby Football Union [RFU] levels 3-9). Club injury management staff reported information for all medical attendances during match play, including details on the injury site and type, playing position (seasons 2010-2011 and 2011-2012 only), and whether the player was removed from play. Clubs were subdivided into groups A (RFU levels 3 and 4 [mainly semiprofessional]; n = 39), B (RFU levels 5 and 6 [mainly amateur]; n = 71), and C (RFU levels 7-9 [social and recreational]; n = 79) to differentiate playing levels. Results: The overall medical attendance incidence was 229 per 1000 player-match hours (95% CI, 226-232), with 45 players removed per 1000 player-match hours (95% CI, 44-46). Attendance incidence for group A (294 per 1000 player-match hours; 95% CI, 287-301) was higher compared with group B (213; 95% CI, 208-218; P < .001) and C (204; 95% CI, 200-209; P < .001). There was a higher incidence of attendances to forwards (254; 95% CI, 249-259) compared with backs (191; 95% CI, 187-196; P < .001). The head was the most common specific site of injury (55 per 1000 player-match hours; 95% CI, 53-57) but the lower limb region overall accounted for most attendances (87; 95% CI, 85-89) and the greatest chance of removal from the pitch (22; 95% CI, 21-23). Conclusion: With the likelihood of 1 injury for each team per match severe enough for the player to leave the pitch and with at least 1 attendance for a head injury per match, there is clear evidence that pitch side staff should be trained to recognize potentially serious injuries. PMID:26535294
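Incidence figures of this type are rates per 1000 player-match hours: the number of medical attendances divided by the total hours of match exposure, with confidence intervals conventionally based on treating the count as Poisson. The short Python sketch below shows one common large-sample approximation; the event and exposure counts are invented for illustration and are not the study's data, although they give a rate and CI width similar to the overall figure quoted above.

import math

def rate_per_1000h(events, exposure_hours, z=1.96):
    """Incidence rate per 1000 hours with a large-sample Poisson (log-scale Wald) CI."""
    rate = 1000.0 * events / exposure_hours
    se_log = 1.0 / math.sqrt(events)      # SE of log(count) under a Poisson model
    lower = rate * math.exp(-z * se_log)
    upper = rate * math.exp(z * se_log)
    return rate, (lower, upper)

# Hypothetical: 22,900 attendances over 100,000 player-match hours.
rate, ci = rate_per_1000h(events=22900, exposure_hours=100000)
print(f"{rate:.0f} per 1000 player-match hours (95% CI, {ci[0]:.0f}-{ci[1]:.0f})")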
The Log Handwriting Program improved children's writing legibility: a pretest-posttest study.
Mackay, Nadine; McCluskey, Annie; Mayes, Rachel
2010-01-01
We determined the feasibility and outcomes of the Log Handwriting Program (Raynal, 1990), an 8-week training program based on task-specific practice of handwriting. We used a pretest-posttest design involving 16 first- and second-grade Australian students. Handwriting training sessions occurred in schools for 45 min per week over 8 weeks, in groups of 2 or 3. Weekly homework was provided. The primary outcome measure was the Minnesota Handwriting Assessment (range = 0 to 34; Reisman, 1999). Legibility, form, alignment, size, spacing, and speed were measured. All six assessment subscales showed statistically significant differences. Legibility improved by a mean of 4.1 points (95% confidence interval = 2.5 to 5.7); form, 5.3 points; alignment, 7.8 points; size, 7.9 points; and space, 5.3 points. Speed decreased by 3.9 points. Preliminary evidence indicates that an 8-week Log Handwriting Program is feasible and improved handwriting in primary school children.
Estimating seat belt effectiveness using matched-pair cohort methods.
Cummings, Peter; Wells, James D; Rivara, Frederick P
2003-01-01
Using US data for 1986-1998 fatal crashes, we employed matched-pair analysis methods to estimate that the relative risk of death among belted compared with unbelted occupants was 0.39 (95% confidence interval (CI) 0.37-0.41). This differs from relative risk estimates of about 0.55 in studies that used crash data collected prior to 1986. Using 1975-1998 data, we examined and rejected three theories that might explain the difference between our estimate and older estimates: (1) differences in the analysis methods; (2) changes related to car model year; (3) changes in crash characteristics over time. A fourth theory, that the introduction of seat belt laws would induce some survivors to claim belt use when they were not restrained, could explain part of the difference in our estimate and older estimates; but even in states without seat belt laws, from 1986 through 1998, the relative risk estimate was 0.45 (95% CI 0.39-0.52). All of the difference between our estimate and older estimates could be explained by some misclassification of seat belt use. Relative risk estimates would move away from 1, toward their true value, if misclassification of both the belted and unbelted decreased over time, or if the degree of misclassification remained constant, as the prevalence of belt use increased. We conclude that estimates of seat belt effects based upon data prior to 1986 may be biased toward 1 by misclassification.
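One standard estimator for pair-matched cohort data of this kind (one belted and one unbelted occupant per matched pair) is the Mantel-Haenszel risk ratio, which for 1:1 pairs reduces to the number of pairs in which the belted occupant died divided by the number of pairs in which the unbelted occupant died; the Greenland-Robins variance of its logarithm reduces to D/(X·Y), where D is the number of outcome-discordant pairs. The Python sketch below illustrates that arithmetic on made-up pair counts, not the paper's data, and is not necessarily the exact procedure the authors used.

import math

def matched_pair_rr(belted_died, unbelted_died, discordant_pairs, z=1.96):
    """Mantel-Haenszel risk ratio and Wald CI for 1:1 matched cohort pairs.

    belted_died      -- pairs in which the belted occupant died
    unbelted_died    -- pairs in which the unbelted occupant died
    discordant_pairs -- pairs in which exactly one occupant died
    """
    rr = belted_died / unbelted_died
    se_log_rr = math.sqrt(discordant_pairs / (belted_died * unbelted_died))
    lower = math.exp(math.log(rr) - z * se_log_rr)
    upper = math.exp(math.log(rr) + z * se_log_rr)
    return rr, (lower, upper)

# Hypothetical counts chosen only to illustrate the calculation.
rr, ci = matched_pair_rr(belted_died=400, unbelted_died=1000, discordant_pairs=1200)
print(f"RR = {rr:.2f}, 95% CI {ci[0]:.2f}-{ci[1]:.2f}")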
Bavdekar, Ashish; Oswal, Jitendra; Ramanan, Padmasani Venkat; Aundhkar, Chandrashekhar; Venugopal, P; Kapse, Dhananjay; Miller, Tara; McGray, Sarah; Zehrung, Darin; Kulkarni, Prasad S
2018-02-21
We conducted a randomized, non-inferiority, clinical study of MMR vaccine by a disposable-syringe jet injector (DSJI) in toddlers in India in comparison with the conventional administration. MMR vaccine was administered subcutaneously by DSJI or needle-syringe (N-S) to toddlers (15-18 months) who had received a measles vaccine at 9 months. Seropositivity to measles, mumps, and rubella serum IgG antibodies was assessed 35 days after vaccination. Non-inferiority was concluded if the upper limit of the 95% CI for the difference in the percentage seropositive between groups was less than 10%. Solicited reactions were collected for 14 days after vaccination by using structured diaries. In each study group, 170 subjects received MMR vaccine. On day 35, seropositivity for measles was 97.5% [95% CI (93.8%, 99.3%)] in the DSJI group and 98.7% [95% CI (95.5%, 99.8%)] in the N-S group; for mumps, 98.8% [95% CI (95.6%, 99.8%)] and 98.7% [95% CI (95.5%, 99.8%)]; and for rubella, 98.8% [95% CI (95.6%, 99.8%)] and 100% [95% CI (97.7%, 100.0%)]; none of the differences were significant. The day 35 post-vaccination GMTs in the DSJI and N-S groups were, for measles, 5.48 IU/ml [95% CI (3.71, 8.11)] and 5.94 IU/ml [95% CI (3.92, 9.01)]; for mumps, 3.83 ISR [95% CI (3.53, 4.14)] and 3.66 ISR [95% CI (3.39, 3.95)]; and for rubella, 95.27 IU/ml [95% CI (70.39, 128.95)] and 107.06 IU/ml [95% CI (79.02, 145.06)]; none of the differences were significant. The DSJI group reported 173 solicited local reactions and the N-S group reported 112; most were mild grade. Of the total of 156 solicited systemic adverse events, most were mild, and incidence between the two groups was similar. MMR vaccination via DSJI is as immunogenic as vaccination by N-S. The safety profile of the DSJI method is similar to that of N-S, except for injection-site reactions, which are more frequent with DSJI and are well tolerated. Registration US National Institutes of Health clinical trials identifier - NCT02253407. Clinical trial registry of India identifier - CTRI/2013/05/003702. Copyright © 2018 The Author(s). Published by Elsevier Ltd. All rights reserved.
Three-Dimensional Digital Capture of Head Size in Neonates – A Method Evaluation
Ifflaender, Sascha; Rüdiger, Mario; Koch, Arite; Burkhardt, Wolfram
2013-01-01
Introduction The quality of neonatal care is mainly determined by long-term neurodevelopmental outcome. The neurodevelopment of preterm infants is related to postnatal head growth and depends on medical interventions such as nutritional support. Head circumference (HC) is currently used as a two-dimensional measure of head growth. Since head deformities are frequently found in preterm infants, HC may not always adequately reflect head growth. Laser-aided head shape digitizers offer semiautomatic acquisition of HC and cranial volume (CrV) and could thus be useful in describing head size more precisely. Aims 1) To evaluate reproducibility of a 3D digital capture system in newborns. 2) To compare manual and digital HC measurements in a neonatal cohort. 3) To determine correlation of HC and CrV and predictive value of HC. Methods Within a twelve-month period, data of head scans with a laser shape digitizer were analysed. Repeated measures were used for method evaluation. Manually and digitally acquired HC was compared. Regression analysis of HC and CrV was performed. Results Interobserver reliability was excellent for HC (bias −0.005%; 95% limits of agreement (LoA) −0.39 to 0.39%) and CrV (bias 1.5%; 95% LoA −0.8 to 3.6%). Method comparison data were acquired from 282 infants. They revealed interchangeability of the methods (bias −0.45%; 95% LoA −4.55 to 3.65%) and no significant systematic or proportional differences. HC and CrV correlated (r2 = 0.859, p<0.001), but the performance of HC in predicting CrV was poor (RSD ±24 ml). Correlation was worse in infants with lower postmenstrual age (r2 = 0.745) compared to older infants (r2 = 0.843). Discussion The current practice of measuring HC to describe head growth in preterm infants could be misleading since it does not represent a 3D approach. CrV can vary substantially in infants of equal HC. The 3D laser scanner represents a new and promising method to provide reproducible data of CrV and HC. Since it does not provide data on cerebral structures, additional imaging is required. PMID:23580107
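The bias and 95% limits of agreement quoted above correspond to the standard Bland-Altman quantities: the mean of the (percentage) differences between the two methods, and that mean plus or minus 1.96 times their standard deviation. The short Python sketch below shows the computation on made-up paired measurements, not the study's data.

import statistics

def bland_altman(method_a, method_b):
    """Bias and 95% limits of agreement for paired measurements, as % of the pair mean."""
    pct_diffs = [
        100.0 * (a - b) / ((a + b) / 2.0)   # percentage difference for each pair
        for a, b in zip(method_a, method_b)
    ]
    bias = statistics.mean(pct_diffs)
    sd = statistics.stdev(pct_diffs)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical head-circumference readings (cm) from two observers/methods.
manual  = [33.1, 34.0, 35.2, 36.4, 32.8, 34.7]
digital = [33.0, 34.2, 35.0, 36.6, 32.9, 34.5]
bias, loa = bland_altman(manual, digital)
print(f"bias = {bias:.2f}%, 95% LoA = {loa[0]:.2f}% to {loa[1]:.2f}%")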
Mills, Kelly A; Markun, Leslie C; Luciano, Marta San; Rizk, Rami; Allen, I Elaine; Racine, Caroline A; Starr, Philip A; Alberts, Jay L; Ostrem, Jill L
2015-01-01
Objective Subthalamic nucleus (STN) deep brain stimulation (DBS) can improve motor complications of Parkinson's disease (PD) but may worsen specific cognitive functions. The effect of STN DBS on cognitive function in dystonia patients is less clear. Previous reports indicate that bilateral STN stimulation in patients with PD amplifies the decrement in cognitive-motor dual-task performance seen when moving from a single-task to a dual-task paradigm. We aimed to determine whether the effect of bilateral STN DBS on dual-task performance in patients with isolated dystonia, who have less cognitive impairment and no dementia, is similar to that seen in PD. Methods Eight patients with isolated, predominantly cervical dystonia treated with bilateral STN DBS, with an average dystonia duration of 10.5 years and a Montreal Cognitive Assessment score of 26.5, completed working memory (n-back) and motor (forced-maintenance) tests under single-task and dual-task conditions while on and off DBS. Results A multivariate, repeated-measures analysis of variance showed no effect of stimulation status (On vs Off) on working memory (F=0.75, p=0.39) or motor function (F=0.22, p=0.69) when performed under single-task conditions, though as working memory task difficulty increased, stimulation disrupted the accuracy of force-tracking. There was a very small worsening in working memory performance (F=9.14, p=0.019) when moving from single-task to dual-task conditions using the ‘dual-task loss’ analysis. Conclusions This study suggests the effect of STN DBS on working memory and attention may be much less consequential in patients with dystonia than has been reported in PD. PMID:25012202
Knutson, Jayme S.; Gunzler, Douglas D.; Wilson, Richard D.; Chae, John
2016-01-01
Background and Purpose It is unknown whether one method of neuromuscular stimulation for post-stroke upper limb rehabilitation is more effective than another. Our aim was to compare the effects of contralaterally controlled functional electrical stimulation (CCFES) to cyclic neuromuscular electrical stimulation (cNMES). Methods Stroke patients with chronic (> 6 months) moderate to severe upper extremity hemiparesis (n=80) were randomized to receive 10 sessions/week of CCFES- or cNMES-assisted hand opening exercise at home plus 20 sessions of functional task practice in the lab over 12 weeks. The task practice for the CCFES group was stimulation-assisted. The primary outcome was change in Box and Blocks Test (BBT) score at 6-months post-treatment. Upper extremity Fugl-Meyer (UEFM) and Arm Motor Abilities Test (AMAT) were also measured. Results At 6-months post-treatment, the CCFES group had greater improvement on the BBT, 4.6 (95% CI: 2.2, 7.0), than the cNMES group, 1.8 (95% CI: 0.6, 3.0); between-group difference, 2.8 (95% CI: 0.1, 5.5), p=0.045. No significant between-group difference was found for the UEFM (p=.888) or AMAT (p=.096). Participants who had the largest improvements on BBT were less than two years post-stroke with moderate (i.e., not severe) hand impairment at baseline. Among these, the 6-month post-treatment BBT gains of the CCFES group, 9.6 (95% CI: 5.6, 13.6), were greater than those of the cNMES group, 4.1 (95% CI: 1.7, 6.5); between-group difference, 5.5 (95% CI: 0.8, 10.2), p=0.023. Conclusions CCFES improved hand dexterity more than cNMES in chronic stroke survivors. Clinical Trial Registration URL: http://www.clinicaltrials.gov. Unique identifier: NCT00891319. PMID:27608819
Lendvay, Thomas S; Brand, Timothy C; White, Lee; Kowalewski, Timothy; Jonnadula, Saikiran; Mercer, Laina D; Khorsand, Derek; Andros, Justin; Hannaford, Blake; Satava, Richard M
2013-06-01
Preoperative simulation warm-up has been shown to improve performance and reduce errors in novice and experienced surgeons, yet existing studies have only investigated conventional laparoscopy. We hypothesized that a brief virtual reality (VR) robotic warm-up would enhance robotic task performance and reduce errors. In a 2-center randomized trial, 51 residents and experienced minimally invasive surgery faculty in General Surgery, Urology, and Gynecology underwent a validated robotic surgery proficiency curriculum on a VR robotic simulator and on the da Vinci surgical robot (Intuitive Surgical Inc). Once they successfully achieved performance benchmarks, surgeons were randomized to either receive a 3- to 5-minute VR simulator warm-up or read a leisure book for 10 minutes before performing similar and dissimilar (intracorporeal suturing) robotic surgery tasks. The primary outcomes compared were task time, tool path length, economy of motion, technical, and cognitive errors. Task time (-29.29 seconds, p = 0.001; 95% CI, -47.03 to -11.56), path length (-79.87 mm; p = 0.014; 95% CI, -144.48 to -15.25), and cognitive errors were reduced in the warm-up group compared with the control group for similar tasks. Global technical errors in intracorporeal suturing (0.32; p = 0.020; 95% CI, 0.06-0.59) were reduced after the dissimilar VR task. When surgeons were stratified by earlier robotic and laparoscopic clinical experience, the more experienced surgeons (n = 17) demonstrated significant improvements from warm-up in task time (-53.5 seconds; p = 0.001; 95% CI, -83.9 to -23.0) and economy of motion (0.63 mm/s; p = 0.007; 95% CI, 0.18-1.09), and improvement in these metrics was not statistically significantly appreciated in the less-experienced cohort (n = 34). We observed significant performance improvement and error reduction rates among surgeons of varying experience after VR warm-up for basic robotic surgery tasks. In addition, the VR warm-up reduced errors on a more complex task (robotic suturing), suggesting the generalizability of the warm-up. Copyright © 2013 American College of Surgeons. All rights reserved.
Lendvay, Thomas S.; Brand, Timothy C.; White, Lee; Kowalewski, Timothy; Jonnadula, Saikiran; Mercer, Laina; Khorsand, Derek; Andros, Justin; Hannaford, Blake; Satava, Richard M.
2014-01-01
Background Pre-operative simulation “warm-up” has been shown to improve performance and reduce errors in novice and experienced surgeons, yet existing studies have only investigated conventional laparoscopy. We hypothesized a brief virtual reality (VR) robotic warm-up would enhance robotic task performance and reduce errors. Study Design In a two-center randomized trial, fifty-one residents and experienced minimally invasive surgery faculty in General Surgery, Urology, and Gynecology underwent a validated robotic surgery proficiency curriculum on a VR robotic simulator and on the da Vinci surgical robot. Once successfully achieving performance benchmarks, surgeons were randomized to either receive a 3-5 minute VR simulator warm-up or read a leisure book for 10 minutes prior to performing similar and dissimilar (intracorporeal suturing) robotic surgery tasks. The primary outcomes compared were task time, tool path length, economy of motion, technical and cognitive errors. Results Task time (-29.29sec, p=0.001, 95%CI-47.03,-11.56), path length (-79.87mm, p=0.014, 95%CI -144.48,-15.25), and cognitive errors were reduced in the warm-up group compared to the control group for similar tasks. Global technical errors in intracorporeal suturing (0.32, p=0.020, 95%CI 0.06,0.59) were reduced after the dissimilar VR task. When surgeons were stratified by prior robotic and laparoscopic clinical experience, the more experienced surgeons(n=17) demonstrated significant improvements from warm-up in task time (-53.5sec, p=0.001, 95%CI -83.9,-23.0) and economy of motion (0.63mm/sec, p=0.007, 95%CI 0.18,1.09), whereas improvement in these metrics was not statistically significantly appreciated in the less experienced cohort(n=34). Conclusions We observed a significant performance improvement and error reduction rate among surgeons of varying experience after VR warm-up for basic robotic surgery tasks. In addition, the VR warm-up reduced errors on a more complex task (robotic suturing) suggesting the generalizability of the warm-up. PMID:23583618
Reliability and Validity of Dual-Task Mobility Assessments in People with Chronic Stroke
Yang, Lei; He, Chengqi; Pang, Marco Yiu Chung
2016-01-01
Background The ability to perform a cognitive task while walking simultaneously (dual-tasking) is important in real life. However, the psychometric properties of dual-task walking tests have not been well established in stroke. Objective To assess the test-retest reliability, concurrent and known-groups validity of various dual-task walking tests in people with chronic stroke. Design Observational measurement study with a test-retest design. Methods Eighty-eight individuals with chronic stroke participated. The testing protocol involved four walking tasks (walking forward at self-selected and maximal speed, walking backward at self-selected speed, and crossing over obstacles) performed simultaneously with each of the three attention-demanding tasks (verbal fluency, serial 3 subtractions or carrying a cup of water). For each dual-task condition, the time taken to complete the walking task, the correct response rate (CRR) of the cognitive task, and the dual-task effect (DTE) for the walking time and CRR were calculated. Forty-six of the participants were tested twice within 3–4 days to establish test-retest reliability. Results The walking time in various dual-task assessments demonstrated good to excellent reliability [intraclass correlation coefficient (ICC(2,1)) = 0.70–0.93; relative minimal detectable change at the 95% confidence level (MDC95%) = 29%–45%]. The reliability of the CRR (ICC(2,1) = 0.58–0.81) and the DTE in walking time (ICC(2,1) = 0.11–0.80) was more varied. The reliability of the DTE in CRR (ICC(2,1) = −0.31 to 0.40) was poor to fair. The walking time and CRR obtained in various dual-task walking tests were moderately to strongly correlated with those of the dual-task Timed-up-and-Go test, thus demonstrating good concurrent validity. None of the tests could discriminate fallers (those who had sustained at least one fall in the past year) from non-fallers. Limitation The results are generalizable to community-dwelling individuals with chronic stroke only. Conclusions The walking time derived from the various dual-task assessments generally demonstrated good to excellent reliability, making them potentially useful in clinical practice and future research endeavors. However, the usefulness of these measurements in predicting falls needs to be further explored. Relatively low reliability was shown in the cognitive outcomes and DTE, which may not be preferred measurements for assessing dual-task performance. PMID:26808662
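The dual-task effect (DTE) used in studies of this kind is commonly expressed as the percentage change in a performance measure from the single-task to the dual-task condition. The minimal Python sketch below shows that calculation; the exact formula and sign convention used in this particular study are not given in the abstract, and the example values are hypothetical.

def dual_task_effect(single_task, dual_task):
    """Percentage change from single-task to dual-task performance.

    Positive values mean the measure increased under dual-task conditions
    (e.g., a longer walking time, i.e. a dual-task cost for a time-based measure).
    """
    return 100.0 * (dual_task - single_task) / single_task

# Hypothetical walking times (s) and correct response rates for one participant.
print(dual_task_effect(single_task=12.0, dual_task=15.6))   # walking time: +30.0%
print(dual_task_effect(single_task=0.90, dual_task=0.72))   # CRR: -20.0%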
Twenty-year trends in the prevalence of disability in China
Chen, Gong; Song, Xinming; Liu, Jufen; Yan, Lijing; Du, Wei; Pang, Lihua; Zhang, Lei; Wu, Jilei; Zhang, Bingzi; Zhang, Jun
2011-01-01
Abstract Objective To evaluate changes in the age-adjusted prevalence of disability in transitional China from 1987 to 2006. Methods Data from nationally representative surveys conducted in 1987 and 2006 were used to calculate age-adjusted disability prevalence rates by applying appropriate sample weights and directly adjusting to the age distribution of the 1990 Chinese population. Trends were assessed in terms of average annual percentage change. Findings The estimated number of disabled people in China in 1987 and 2006 was 52.7 and 84.6 million, respectively, corresponding to a weighted prevalence of 4.9% and 6.5%. The age-adjusted prevalence of disability decreased by an average of 0.5% per year (average annual percentage change, AAPC: −0.5%; 95% confidence interval, CI: −0.7 to −0.4) during 1987–2006. However, it increased by an average of 0.3% (AAPC: 0.3%; 95% CI: 0.1 to 0.5) per year in males and by an average of 1.0% (AAPC: 1.0%; 95% CI: 0.8 to 1.2) per year among rural residents, whereas among females it showed an average annual decrease of 1.5% (AAPC: −1.5%; 95% CI: −1.7 to −1.3) and among urban residents, an average annual decrease of 3.9% (AAPC: −3.9%; 95% CI: −4.3 to −3.5). Despite significant declining trends for hearing and speech, intellectual and visual disabilities, the annual age-adjusted prevalence of physical and mental disabilities increased by an average of 11.2% (AAPC: 11.2%; 95% CI: 10.5 to 11.9) and 13.3% (AAPC: 13.3%; 95% CI: 10.7 to 16.2), respectively. Conclusion In China, the age-adjusted prevalence of disability has declined since 1987, with inconsistencies dependent on the type of disability. These findings call for continuing and specific efforts to prevent disabilities in China. PMID:22084524
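The trend measure used in this abstract, the average annual percentage change (AAPC), summarizes a constant exponential change in the age-adjusted prevalence over the study interval. A minimal Python sketch of the two-point version is shown below; the actual study derives the AAPC and its confidence interval from survey-weighted, age-standardized rates, which this illustration does not attempt to reproduce.

```python
def aapc(rate_start, rate_end, years):
    """Average annual percentage change, assuming a constant exponential trend:
    rate_end = rate_start * (1 + AAPC/100) ** years."""
    return 100 * ((rate_end / rate_start) ** (1 / years) - 1)

# Illustrative use with the crude (weighted, not age-adjusted) prevalences quoted above;
# the age-adjusted series reported in the abstract moves the other way (about -0.5% per year).
print(f"crude AAPC 1987-2006: {aapc(4.9, 6.5, 2006 - 1987):+.1f}% per year")
```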
Gao, Junling; Weaver, Scott R.; Dai, Junming; Jia, Yingnan; Liu, Xingdi; Jin, Kezhi; Fu, Hua
2014-01-01
Background Whereas the majority of previous research on social capital and health has been on residential neighborhoods and communities, the evidence remains sparse on workplace social capital. To address this gap in the literature, we examined the association between workplace social capital and health status among Chinese employees in a large, multi-level, cross-sectional study. Methods By employing a two-stage stratified random sampling procedure, 2,796 employees were identified from 35 workplaces in Shanghai during March to November 2012. Workplace social capital was assessed using a validated and psychometrically tested eight-item measure, and the Chinese language version of the WHO-Five Well-Being Index (WHO-5) was used to assess mental health. Control variables included sex, age, marital status, education level, occupation status, smoking status, physical activity, and job stress. Multilevel logistic regression analysis was conducted to explore whether individual- and workplace-level social capital was associated with mental health status. Results In total, 34.9% of workers reported poor mental health (WHO-5<13). After controlling for individual-level socio-demographic and lifestyle variables, compared to workers with the highest quartile of personal social capital, workers with the third, second, and lowest quartiles exhibited 1.39 to 3.54 times greater odds of poor mental health, 1.39 (95% CI: 1.10–1.75), 1.85 (95% CI: 1.38–2.46) and 3.54 (95% CI: 2.73–4.59), respectively. Corresponding odds ratios for workplace-level social capital were 0.95 (95% CI: 0.61–1.49), 1.14 (95% CI: 0.72–1.81) and 1.63 (95% CI: 1.05–2.53) for the third, second, and lowest quartiles, respectively. Conclusions Higher workplace social capital is associated with lower odds of poor mental health among Chinese employees. Promoting social capital at the workplace may contribute to enhancing employees’ mental health in China. PMID:24404199
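The quartile-based odds ratios in this abstract come from multilevel logistic models of poor mental health (WHO-5 < 13) on social-capital quartiles plus covariates. Below is a minimal single-level sketch with statsmodels on simulated data; the variable names, covariates, and simulated effect are placeholders, and the study's workplace-level random effect and full covariate set are omitted.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical analysis frame: one row per employee (all values simulated).
rng = np.random.default_rng(1)
n = 2796
df = pd.DataFrame({
    "sc": rng.normal(size=n),               # personal social capital score
    "age": rng.integers(18, 60, n),
    "female": rng.integers(0, 2, n),
})
# Simulate poor mental health with lower probability at higher social capital
p = 1 / (1 + np.exp(0.5 + 0.6 * df["sc"].to_numpy()))
df["poor_mh"] = rng.binomial(1, p)

# Quartiles of social capital, with the highest quartile as the reference category
df["sc_q"] = pd.qcut(df["sc"], 4, labels=["Q1_low", "Q2", "Q3", "Q4_high"])
df["sc_q"] = df["sc_q"].cat.reorder_categories(["Q4_high", "Q3", "Q2", "Q1_low"])

model = smf.logit("poor_mh ~ C(sc_q) + age + female", data=df).fit(disp=False)
print(np.exp(model.params))      # odds ratios vs. the highest quartile
print(np.exp(model.conf_int()))  # 95% confidence intervals
```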
Airborne occupational exposures and risk of oesophageal and cardia adenocarcinoma
Jansson, C; Plato, N; Johansson, A L V; Nyrén, O; Lagergren, J
2006-01-01
Background The reasons for the increasing incidence of and strong male predominance in patients with oesophageal and cardia adenocarcinoma remain unclear. The authors hypothesised that airborne occupational exposures in male dominated industries might contribute. Methods In a nationwide Swedish population based case control study, 189 and 262 cases of oesophageal and cardia adenocarcinoma respectively, 167 cases of oesophageal squamous cell carcinoma, and 820 frequency matched controls underwent personal interviews. Based on each study participant's lifetime occupational history the authors assessed cumulative airborne occupational exposure for 10 agents, analysed individually and combined, by a deterministic additive model including probability, frequency, and intensity. Furthermore, occupations and industries of longest duration were analysed. Relative risks were estimated by odds ratios (OR), with 95% confidence intervals (CI), using conditional logistic regression, adjusted for potential confounders. Results Tendencies of positive associations were found between high exposure to pesticides and risk of oesophageal (OR 2.3 (95% CI 0.9 to 5.7)) and cardia adenocarcinoma (OR 2.1 (95% CI 1.0 to 4.6)). Among workers highly exposed to particular agents, a tendency of an increased risk of oesophageal squamous cell carcinoma was found. There was a twofold increased risk of oesophageal squamous cell carcinoma among concrete and construction workers (OR 2.2 (95% CI 1.1 to 4.2)) and a nearly fourfold increased risk of cardia adenocarcinoma among workers within the motor vehicle industry (OR 3.9 (95% CI 1.5 to 10.4)). An increased risk of oesophageal squamous cell carcinoma (OR 3.9 (95% CI 1.2 to 12.5)), and a tendency of an increased risk of cardia adenocarcinoma (OR 2.8 (95% CI 0.9 to 8.5)), were identified among hotel and restaurant workers. Conclusions Specific airborne occupational exposures do not seem to be of major importance in the aetiology of oesophageal or cardia adenocarcinoma and are unlikely to contribute to the increasing incidence or the male predominance. PMID:16421388
Lewis, Cara L; Loverro, Kari L; Khuu, Anne
2018-04-01
Study Design Controlled laboratory study, case-control design. Background Despite recognition that femoroacetabular impingement syndrome (FAIS) is a movement-related disorder, few studies have examined dynamic unilateral tasks in individuals with FAIS. Objectives To determine whether movements of the pelvis and lower extremities in individuals with FAIS differ from those in individuals without hip pain during a single-leg step-down, and to analyze kinematic differences between male and female participants within groups. Methods Individuals with FAIS and individuals without hip pain performed a single-leg step-down while kinematic data were collected. Kinematics were evaluated at 60° of knee flexion. A linear regression analysis assessed the main effects of group, sex, and side, and the interaction of sex by group. Results Twenty individuals with FAIS and 40 individuals without hip pain participated. Individuals with FAIS performed the step-down with greater hip flexion (4.9°; 95% confidence interval [CI]: 0.5°, 9.2°) and anterior pelvic tilt (4.1°; 95% CI: 0.9°, 7.3°) than individuals without hip pain. Across groups, female participants performed the task with more hip flexion (6.1°; 95% CI: 1.7°, 10.4°), hip adduction (4.8°; 95% CI: 2.2°, 7.4°), anterior pelvic tilt (5.8°; 95% CI: 2.6°, 9.0°), pelvic drop (1.4°; 95% CI: 0.3°, 2.5°), and thigh adduction (2.7°; 95% CI: 1.3°, 4.2°) than male participants. Conclusion The results of this study suggest that individuals with FAIS have alterations in pelvic motion during a dynamic unilateral task. The noted altered movement patterns in the FAIS group may contribute to the development of hip pain and may be due to impairments that are modifiable through rehabilitation. J Orthop Sports Phys Ther 2018;48(4):270-279. Epub 6 Mar 2018. doi:10.2519/jospt.2018.7794.
Characteristics of Plantar Loads in Maximum Forward Lunge Tasks in Badminton.
Hu, Xiaoyue; Li, Jing Xian; Hong, Youlian; Wang, Lin
2015-01-01
Badminton players often perform powerful, long-distance lunges during competitive matches. The objective of this study was to compare the plantar loads of three one-step maximum forward lunges in badminton. Fifteen right-handed male badminton players participated in the study. Each participant performed five successful maximum lunges in each of three directions. For each direction, the participant wore three different shoe brands. Plantar loading, including peak pressure, maximum force, and contact area, was measured using an insole pressure measurement system. Two-way ANOVA with repeated measures was employed to determine the effects of the different lunge directions and different shoes, as well as the interaction of these two variables, on the measurements. The maximum force (MF) on the lateral midfoot was lower when performing left-forward lunges than when performing front-forward lunges (p = 0.006, 95% CI = -2.88 to -0.04%BW). The MF and peak pressures (PP) on the great toe region were lower for the front-forward lunge than for the right-forward lunge (MF, p = 0.047, 95% CI = -3.62 to -0.02%BW; PP, p = 0.048, 95% CI = -37.63 to -0.16 kPa) and the left-forward lunge (MF, p = 0.015, 95% CI = -4.39 to -0.38%BW; PP, p = 0.008, 95% CI = -47.76 to -5.91 kPa). These findings indicate that, compared with the front-forward lunge, left and right maximum forward lunges induce greater plantar loads on the great toe region of the dominant leg of badminton players. The differences in plantar loads across lunge directions may pose potential injury risks to the lower extremities of badminton players.
Harcombe, Helen; McBride, David; Derrett, Sarah; Gray, Andrew
2010-04-01
To investigate the association of physical and psychosocial risk factors with musculoskeletal disorders (MSDs) in New Zealand nurses, postal workers and office workers. A cross-sectional postal survey asking about demographic, physical and psychosocial factors and MSDs. A total of 911 participants were randomly selected: nurses from the Nursing Council of New Zealand database (n=280), postal workers from their employer's database (n=280) and office workers from the 2005 electoral roll (n=351). Self-reported pain in the low back, neck, shoulder, elbow, wrist/hand or knee lasting more than 1 day in the month before the survey. The response rate was 58%, 443 from 770 potential participants. 70% (n=310) reported at least one MSD. Physical work tasks were associated with low back (odds ratio (OR) 1.35, 95% CI 1.14 to 1.6), shoulder (OR 1.41, 95% CI 1.17 to 1.69), elbow (OR 1.14, 95% CI 1.13 to 1.83) and wrist/hand pain (OR 1.39, 95% CI 1.15 to 1.69). Job strain had the strongest association with neck pain (OR 3.46, 95% CI 1.30 to 9.21) and wrist/hand pain. Somatisation was weakly associated with MSDs at most sites. Better general and mental health status were weakly associated with lower odds of MSDs. In injury prevention and rehabilitation, the physical nature of the work needs to be addressed for most MSDs, with modest decreases in risk seemingly possible. Addressing job strain could provide significant benefit for those with neck and wrist/hand pain, while the effects of somatisation and the promotion of good mental health may provide smaller but global benefits.
Wang, Yuan; Wang, Yuliang; Ma, Wenbin; Lu, Shujun; Chen, Jinbo; Cao, Lili
2018-01-01
The relationship between cognitive impairment during the acute phase of a first cerebral infarction and the development of long-term pseudobulbar affect (PBA) has not been elucidated. Therefore, in this study, we aimed to determine whether cognitive impairment during the acute phase of cerebral infarction increases the risk of long-term post-infarction PBA. This was a nested case-control study with a prospective approach: a consecutive, multicenter, 1:1 matched case-control study comparing cases of cognitive impairment following acute cerebral infarction (N=26) with 26 sex-, education-, and age-matched controls. Univariate and multivariate conditional logistic regression analyses were performed to study the clinical features and changes in cognitive domains as well as the risk factors for PBA. Long-term PBA was independently predicted by low Montreal Cognitive Assessment (MoCA) scores at baseline. Multivariable regression models showed that low post-infarction MoCA scores remained independent predictors of long-term PBA (odds ratio [OR]=0.72; 95% confidence interval [CI]=0.54-0.95; P=0.018). Among all cognitive measures, digit span test (DST) scores (OR=0.39; 95% CI=0.16-0.91; P=0.030), Stroop C time (OR=1.15; 95% CI=1.01-1.31; P=0.037), and clock-drawing task (CDT) scores (OR=0.62; 95% CI=0.42-0.90; P=0.013) were found to be independent risk factors for PBA. Cognitive impairment during the acute phase of cerebral infarction increased the risk of cerebral infarction-induced long-term PBA. Development of PBA was closely associated with disorders of executive function, attention, and visuospatial ability.
Papillon-Smith, Jessica; Imam, Basel; Patenaude, Valerie; Abenhaim, Haim Arie
2014-01-01
To evaluate whether socioeconomic variables influence the management and outcomes of ectopic pregnancies. Retrospective cohort study (Canadian Task Force classification II-2). Hospitals in the United States participating in the Health Care Cost and Utilization Project. Women (n = 35 535) with a primary discharge diagnosis of ectopic pregnancy. Effect of socioeconomic factors and race/ethnicity on management and adverse outcomes of ectopic pregnancy. During the 9-year study, 35 535 ectopic pregnancies were identified. The development of hemoperitoneum in 8706 patients (24.50%) was the most common complication. Asian race was the sociodemographic variable most predictive of hemoperitoneum (odds ratio [OR], 1.41; 95% confidence interval [CI], 1.24-1.61; p < .01) and transfusion (OR, 1.62; 95% CI, 1.39-1.89; p < .01), and Medicare status was most influential on prolonged hospitalization (OR, 1.83; 95% CI, 1.36-2.47; p < .01). Major complications were not affected by socioeconomic factors. Laparotomy in 25 075 patients (70.6%) was the most common treatment option. Patients of Asian or Pacific Islander descent were least likely to be treated non-surgically (OR, 0.62; 95% CI, 0.51-0.76; p < .01), whereas Medicare recipients were most likely to be treated non-surgically (OR, 1.70; 95% CI, 1.32-2.18; p < .01). All non-white groups were less likely to undergo a laparoscopic approach. Major complications from ectopic pregnancy are not influenced by socioeconomic variables; however, less serious complications and management approaches are persistently affected.
Bertens, Dirk; Kessels, Roy P C; Fiorenzato, Eleonora; Boelen, Danielle H E; Fasotti, Luciano
2015-09-01
Both errorless learning (EL) and Goal Management Training (GMT) have been shown to be effective cognitive rehabilitation methods aimed at optimizing the performance of everyday skills after brain injury. We examined whether a combination of EL and GMT is superior to traditional GMT for training complex daily tasks in brain-injured patients with executive dysfunction. This was an assessor-blinded randomized controlled trial conducted in 67 patients with executive impairments due to brain injury of non-progressive nature (minimal post-onset time: 3 months), referred for outpatient rehabilitation. Individually selected everyday tasks were trained using 8 sessions of an experimental combination of EL and GMT or via conventional GMT, which follows a trial-and-error approach. The primary outcome measure was everyday task performance assessed after treatment compared to baseline. Goal attainment scaling, rated by both trainers and patients, was used as the secondary outcome measure. EL-GMT improved everyday task performance significantly more than conventional GMT (adjusted difference 15.43, 95% confidence interval [CI] [4.52, 26.35]; Cohen's d=0.74). Goal attainment, as scored by the trainers, was significantly higher after EL-GMT compared to conventional GMT (mean difference 7.34, 95% CI [2.99, 11.68]; Cohen's d=0.87). The patients' goal attainment scores did not differ between the two treatment arms (mean difference 3.51, 95% CI [-1.41, 8.44]). Our study is the first to show that preventing the occurrence of errors during executive strategy training enhances the acquisition of everyday activities. A combined EL-GMT intervention is a valuable contribution to cognitive rehabilitation in clinical practice.
Musicians have better memory than nonmusicians: A meta-analysis
Altoè, Gianmarco; Carretti, Barbara; Grassi, Massimo
2017-01-01
Background Several studies have found that musicians perform better than nonmusicians in memory tasks, but this is not always the case, and the strength of this apparent advantage is unknown. Here, we conducted a meta-analysis with the aim of clarifying whether musicians perform better than nonmusicians in memory tasks. Methods Education Source; PEP (WEB)—Psychoanalytic Electronic Publishing; Psychology and Behavioral Science (EBSCO); PsycINFO (Ovid); PubMed; ScienceDirect—AllBooks Content (Elsevier API); SCOPUS (Elsevier API); SocINDEX with Full Text (EBSCO) and Google Scholar were searched for eligible studies. The selected studies involved two groups of participants: young adult musicians and nonmusicians. All the studies included memory tasks (loading long-term, short-term or working memory) that contained tonal, verbal or visuospatial stimuli. Three meta-analyses were run separately for long-term memory, short-term memory and working memory. Results We collected 29 studies, including 53 memory tasks. The results showed that musicians performed better than nonmusicians in terms of long-term memory, g = .29, 95% CI (.08–.51), short-term memory, g = .57, 95% CI (.41–.73), and working memory, g = .56, 95% CI (.33–.80). To further explore the data, we included a moderator (the type of stimulus presented, i.e., tonal, verbal or visuospatial), which was found to influence the effect size for short-term and working memory, but not for long-term memory. In terms of short-term and working memory, the musicians’ advantage was large with tonal stimuli, moderate with verbal stimuli, and small or null with visuospatial stimuli. Conclusions The three meta-analyses revealed a small effect size for long-term memory, and a medium effect size for short-term and working memory, suggesting that musicians perform better than nonmusicians in memory tasks. Moreover, the effect of the moderator suggested that the type of stimuli influences this advantage. PMID:29049416
Meta-Analysis of Early Nonmotor Features and Risk Factors for Parkinson Disease
Noyce, Alastair J; Bestwick, Jonathan P; Silveira-Moriyama, Laura; Hawkes, Christopher H; Giovannoni, Gavin; Lees, Andrew J; Schrag, Anette
2012-01-01
Objective To evaluate the association between diagnosis of Parkinson disease (PD) and risk factors or early symptoms amenable to population-based screening. Methods A systematic review and meta-analysis of risk factors for PD. Results The strongest associations with later diagnosis of PD were found for having a first-degree or any relative with PD (odds ratio [OR], 3.23; 95% confidence interval [CI], 2.65–3.93 and OR, 4.45; 95% CI, 3.39–5.83) or any relative with tremor (OR, 2.74; 95% CI, 2.10–3.57), constipation (relative risk [RR], 2.34; 95% CI, 1.55–3.53), or lack of smoking history (current vs never: RR, 0.44; 95% CI, 0.39–0.50), each at least doubling the risk of PD. Further positive significant associations were found for history of anxiety or depression, pesticide exposure, head injury, rural living, beta-blockers, farming occupation, and well-water drinking, and negative significant associations were found for coffee drinking, hypertension, nonsteroidal anti-inflammatory drugs, calcium channel blockers, and alcohol, but not for diabetes mellitus, cancer, oral contraceptive pill use, surgical menopause, hormone replacement therapy, statins, acetaminophen/paracetamol, aspirin, tea drinking, history of general anesthesia, or gastric ulcers. In the systematic review, additional associations included a negative association with raised serum urate; other factors were examined only in single studies or in studies with conflicting results. Interpretation The strongest risk factors associated with later PD diagnosis are having a family history of PD or tremor, a history of constipation, and lack of smoking history. Further factors also contribute to the risk of PD diagnosis, but less strongly, or, as with some premotor symptoms, require further standardized studies to demonstrate the magnitude of the associated risk. ANN NEUROL 2012 PMID:23071076
Giles, Jon T.; Bathon, Joan M.
2015-01-01
Objectives To investigate the association between oral calcium supplementation and coronary arterial calcification among rheumatoid arthritis (RA) patients without known cardiovascular disease (CVD). Methods This study was nested in a prospective cohort study of RA patients without known CVD. Daily supplemental calcium dose was ascertained from prescription and over-the-counter medications at baseline and visit 2 (median 20 months post-baseline). Coronary artery calcium (CAC), a measure of coronary atherosclerosis, was assessed by cardiac multi-detector row computed tomography at baseline and visit 3 (median 39 months post-baseline). The association of calcium supplementation with CAC was explored. Results Among the 145 RA patients studied, 42 (28%) took ≥1000mg/day of supplemental calcium at baseline. Forty-four (30%) and 50 (34%) had a CAC score >100 units at baseline and follow-up, respectively. Baseline CAC scores >100 units were significantly less frequent in the higher (≥1000mg/day) supplemental calcium group than in the lower dosed group (<1000mg/day) [OR 0.28 (95% CI 0.11-0.74)]; this remained significant after adjusting for relevant confounders [OR 0.30 (95% CI 0.09-0.93)]. Similarly, at the third study visit, CAC scores >100 units were less frequent in the higher vs. the lower supplemental calcium group [OR 0.41 (95% CI 0.18-0.95)]. When adjusted for relevant confounders, statistical significance was lost [OR 0.39 (95% CI 0.14-1.12)]. No gender interaction and no change in CAC score over time were appreciated. Conclusion Higher levels of oral calcium supplementation were not associated with an increased risk of coronary atherosclerosis as measured by CAC score in this RA cohort. PMID:25808397
Wither, Joan; Bernatsky, Sasha; Claudio, Jaime O.; Clarke, Ann; Rioux, John D.; Fortin, Paul R.
2010-01-01
Objectives. We examined occupational and non-occupational exposures in relation to risk of SLE in a case–control study conducted through the Canadian Network for Improved Outcomes in SLE (CaNIOS). Methods. SLE cases (n = 258) were recruited from 11 rheumatology centres across Canada. Controls (without SLE, n = 263) were randomly selected from phone number listings and matched to cases by age, sex and area of residence. Data were collected using a structured telephone interview. Results. An association was seen with outdoor work in the 12 months preceding diagnosis [odds ratio (OR) 2.0; 95% CI 1.1, 3.8]; effect modification by sun reaction was suggested, with the strongest effect among people who reported reacting to midday sun with a blistering sunburn or a rash (OR 7.9; 95% CI 0.97, 64.7). Relatively strong but imprecise associations were seen with work as an artist working with paints, dyes or developing film (OR 3.9; 95% CI 1.3, 12.3) and work that included applying nail polish or nail applications (OR 10.2; 95% CI 1.3, 81.5). Patients were more likely than controls to report participation in pottery or ceramics work as a leisure activity, with an increased risk among individuals with a total frequency of at least 26 days (OR 2.1; 95% CI 1.1, 3.9). Analyses of potential respirable silica exposures suggested an exposure–response gradient (OR 1.0, 1.4 and 2.1 for zero, one and two or more sources of exposure, respectively; trend test P < 0.01). Conclusions. This study supports the role of specific occupational and non-occupational exposures in the development of SLE. PMID:20675707
2014-01-01
Background Lower breast cancer survival has been reported for Australian Aboriginal women compared to non-Aboriginal women; however, the reasons for this disparity have not been fully explored. We compared the surgical treatment and survival of Aboriginal and non-Aboriginal women diagnosed with breast cancer in New South Wales (NSW), Australia. Methods We analysed NSW cancer registry records of breast cancers diagnosed in 2001–2007, linked to hospital inpatient episodes and deaths. We used unconditional logistic regression to compare the odds of Aboriginal and non-Aboriginal women receiving surgical treatment. Breast cancer-specific survival was examined using cumulative mortality curves and Cox proportional hazards regression models. Results Of the 27 850 eligible women, 288 (1.03%) identified as Aboriginal. The Aboriginal women were younger and more likely to have advanced spread of disease when diagnosed than non-Aboriginal women. Aboriginal women were less likely than non-Aboriginal women to receive surgical treatment (odds ratio 0.59, 95% confidence interval (CI) 0.42-0.86). The five-year crude breast cancer-specific mortality was 6.1% higher for Aboriginal women (17.7%, 95% CI 12.9-23.2) compared with non-Aboriginal women (11.6%, 95% CI 11.2-12.0). After accounting for differences in age at diagnosis, year of diagnosis, spread of disease and surgical treatment received, the risk of death from breast cancer was 39% higher in Aboriginal women (HR 1.39, 95% CI 1.01-1.86). Finally, after also accounting for differences in comorbidities, socioeconomic disadvantage and place of residence, the hazard ratio was reduced to 1.30 (95% CI 0.94-1.75). Conclusion Preventing comorbidities and increasing rates of surgical treatment may increase breast cancer survival for NSW Aboriginal women. PMID:24606675
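The survival comparison described here (Cox proportional hazards models of breast-cancer-specific death with progressive adjustment) can be sketched with the lifelines package. Everything below, including the covariates, effect sizes, and the simulated cohort, is an illustrative assumption, not the NSW registry data.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

# Simulate a cohort with an indicator of interest and two adjustment covariates
rng = np.random.default_rng(2)
n = 500
aboriginal = rng.binomial(1, 0.05, n)
age_dx = rng.normal(60, 12, n)
surgery = rng.binomial(1, 0.9 - 0.2 * aboriginal)
hazard = 0.02 * np.exp(0.3 * aboriginal + 0.02 * (age_dx - 60) - 0.5 * surgery)
event_time = rng.exponential(1 / hazard)          # months to breast-cancer death
df = pd.DataFrame({
    "time": np.minimum(event_time, 60),           # administrative censoring at 5 years
    "died": (event_time <= 60).astype(int),
    "aboriginal": aboriginal,
    "age_dx": age_dx,
    "surgery": surgery,
})

cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="died")  # remaining columns are covariates
cph.print_summary()                                 # hazard ratios (exp(coef)) with 95% CIs
```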
Community mental health status six months after the Sewol ferry disaster in Ansan, Korea
2015-01-01
OBJECTIVES: The disaster of the Sewol ferry that sank at sea off Korea’s southern coast of the Yellow Sea on April 16, 2014, was a tragedy that brought grief and despair to the whole country. The aim of this study was to evaluate the mental health effects of this disaster on the community of Ansan, where most victims and survivors resided. METHODS: The self-administered questionnaire survey was conducted 4 to 6 months after the accident using the Korean Community Health Survey system, an annual nationwide cross-sectional survey. Subjects were 7,076 adults (≥19 years) living in two victimized communities in Ansan and four control communities from Gyeonggi-do, Jindo, and Haenam near the accident site. Depression, stress, somatic symptoms, anxiety, and suicidal ideation were measured using the Center for Epidemiologic Studies-Depression Scale, Brief Encounter Psychosocial Instrument, Patient Health Questionnaire-15, and Generalized Anxiety Disorder 7-Item Scale, respectively. RESULTS: The depression rate among the respondents from Ansan was 11.8%, and 18.4% reported suicidal ideation. Prevalence of other psychiatric disturbances was also higher compared with the other areas. A multiple logistic regression analysis revealed significantly higher odds ratios (ORs) in depression (1.66; 95% confidence interval [CI], 1.36 to 2.04), stress (1.37; 95% CI, 1.10 to 1.71), somatic symptoms (1.31; 95% CI, 1.08 to 1.58), anxiety (1.82; 95% CI, 1.39 to 2.39), and suicidal ideation (1.33; 95% CI, 1.13 to 1.56) compared with Gyeonggi-do. In contrast, the accident areas of Jindo and Haenam showed the lowest prevalence and ORs. CONCLUSIONS: Residents in the victimized area of Ansan had a significantly higher prevalence of psychiatric disturbances than those in the control communities. PMID:27923237
Li, W; Wang, D Z; Zhang, H; Xu, Z L; Xue, X D; Jiang, G H
2017-11-10
Objective: To analyze the influence of smoking on deaths in residents aged 35-79 years and the effects of smoking cessation in Tianjin. Methods: The data of 39 499 death cases aged 35-79 years in 2016 in Tianjin were collected; the risks for deaths caused by smoking-related diseases, the excess deaths, and the effects of smoking cessation were analyzed after adjusting for 5-year age group, education level and marital status. Results: Among the 39 499 death cases, 1 589 (13.56%) were caused by smoking, and the percentage of excess mortality caused by smoking was highest for lung cancer (47.60%); the risk of death from lung cancer in smokers was 2.75 times that in non-smokers (95%CI: 2.47-3.06). Among the female deaths, 183 (7.29%) were caused by smoking, and the percentage of excess mortality was again highest for lung cancer (28.90%); the risk of death from lung cancer in smokers was 4.04 times that in non-smokers (95%CI: 3.49-4.68). The OR for disease in ex-smokers was 0.80 compared with 1.00 in current smokers (95%CI: 0.72-0.90). The OR in males who had quit smoking for ≥10 years was lower (0.74, 95%CI: 0.63-0.86) than that in those who had quit smoking for 1-9 years (0.85, 95%CI: 0.74-0.98), but the difference was not significant. Conclusion: Smoking is one of the most important risk factors for deaths among residents of Tianjin. Smoking cessation can benefit people's health.
Wu, Wen-Chih; Waring, Molly E.; Lessard, Darleen; Yarzebski, Jorge; Gore, Joel; Goldberg, Robert J.
2011-01-01
Background It is unknown how anemia influences the invasive management of patients with non-ST-segment-elevation myocardial infarction (NSTEMI) and associated mortality. We investigated whether receipt of cardiac catheterization relates to 6-month death rates among patients with different severity of anemia. Methods We used data from the population-based Worcester Heart Attack Study, which included 2,634 patients hospitalized with confirmed NSTEMI, from 3 PCI-capable medical centers in the Worcester (MA) metropolitan area, during 5 biennial periods between 1997 and 2005. Severity of anemia was categorized using admission hematocrit levels: ≤30.0% (moderate-to-severe anemia), 30.1–39.0% (mild anemia), and >39.0% (no anemia). Propensity matching and conditional logistic regression adjusting for hospital use of aspirin, heparin, and plavix compared 6-month post-admission all-cause mortality rates in relation to cardiac catheterization during NSTEMI hospitalization. Results Compared to patients without anemia, patients with anemia were less likely to undergo cardiac catheterization (adjusted odds ratio [AOR] 0.79 [95% confidence interval [CI]: 0.67–0.95] for mild anemia and 0.45 [95%CI: 0.42–0.49] for moderate-to-severe anemia). After propensity matching, cardiac catheterization was associated with lower 6-month death rates only in patients without anemia (AOR 0.26 [95%CI: 0.09–0.79]) but not in patients with mild anemia (AOR 0.55 [95%CI: 0.25–1.23]). The small number of patients rendered data inconclusive for patients with moderate-to-severe anemia. Conclusions Anemia at the time of hospitalization for NSTEMI was associated with lower utilization of cardiac catheterization. However, cardiac catheterization use was associated with a decreased risk of dying at 6 months post hospital admission only in patients without anemia. PMID:21738102
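The propensity-matching step described in this abstract, modeling the probability of receiving catheterization from baseline characteristics and then matching catheterized to non-catheterized patients, can be sketched as follows. The covariates, simulated data, and simple 1:1 nearest-neighbour matching (with replacement) are illustrative assumptions; the study's conditional logistic model on the matched pairs is only indicated in a comment.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

# Hypothetical patient-level frame (all values simulated)
rng = np.random.default_rng(3)
n = 2000
age = rng.normal(70, 10, n)
hct = rng.normal(38, 5, n)                       # admission hematocrit
cath = rng.binomial(1, 1 / (1 + np.exp(-(-0.5 + 0.04 * (hct - 38) - 0.03 * (age - 70)))))
death6m = rng.binomial(1, 1 / (1 + np.exp(-(-2.0 - 0.6 * cath + 0.05 * (age - 70)))))
df = pd.DataFrame({"age": age, "hct": hct, "cath": cath, "death6m": death6m})

# 1. Propensity of receiving catheterization, from baseline covariates only
ps_model = LogisticRegression(max_iter=1000).fit(df[["age", "hct"]], df["cath"])
df["ps"] = ps_model.predict_proba(df[["age", "hct"]])[:, 1]

# 2. 1:1 nearest-neighbour matching of catheterized to non-catheterized on the score
treated, control = df[df.cath == 1], df[df.cath == 0]
nn = NearestNeighbors(n_neighbors=1).fit(control[["ps"]])
_, idx = nn.kneighbors(treated[["ps"]])
matched_control = control.iloc[idx.ravel()]

# 3. Crude 6-month mortality in the matched sample; the study then fits a conditional
#    logistic model on the matched pairs, further adjusting for in-hospital medications
print("catheterized mortality:   ", treated.death6m.mean())
print("matched non-catheterized: ", matched_control.death6m.mean())
```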
Age-Specific Trends in Incidence of Noncardia Gastric Cancer in US Adults
Anderson, William F.; Camargo, M. Constanza; Fraumeni, Joseph F.; Correa, Pelayo; Rosenberg, Philip S.; Rabkin, Charles S.
2011-01-01
Context For the last 50 years, overall age-standardized incidence rates for noncardia gastric cancer have steadily declined in most populations. However, overall rates are summary measures that may obscure important age-specific trends. Objective To examine effects of age at diagnosis on noncardia gastric cancer incidence trends in the United States. Design, Setting, and Participants Descriptive study with age-period-cohort analysis of cancer registration data from the National Cancer Institute's Surveillance, Epidemiology, and End Results Program, which covers approximately 26% of the US population. From 1977 through 2006, there were 83 225 adults with incident primary gastric cancer, including 39 003 noncardia cases. Main Outcome Measures Overall and age-specific incidence rates, adjusted for period and cohort effects using age-period-cohort models. Results were stratified by race, sex, and socioeconomic status. Results Overall age-standardized annual incidence per 100 000 population declined during the study period from 5.9 (95% confidence interval [CI], 5.7-6.1) to 4.0 (95% CI, 3.9-4.1) in whites, from 13.7 (95% CI, 12.5-14.9) to 9.5 (95% CI, 9.1-10.0) in blacks, and from 17.8 (95% CI, 16.1-19.4) to 11.7 (95% CI, 11.2-12.1) in other races. Age-specific trends among whites varied significantly between older and younger age groups (P < .001 for interaction by age): incidence per 100 000 declined significantly from 19.8 (95% CI, 19.0-20.6) to 12.8 (95% CI, 12.5-13.1) for ages 60 to 84 years and from 2.6 (95% CI, 2.4-2.8) to 2.0 (95% CI, 1.9-2.1) for ages 40 to 59 years but increased significantly from 0.27 (95% CI, 0.19-0.35) to 0.45 (95% CI, 0.39-0.50) for ages 25 to 39 years. Conversely, rates for all age groups declined or were stable among blacks and other races. Age-period-cohort analysis confirmed a significant increase in whites among younger cohorts born since 1952 (P < .001). Conclusions From 1977 through 2006, the incidence rate for noncardia gastric cancer declined among all race and age groups except for whites aged 25 to 39 years, for whom it increased. Additional surveillance and analytical studies are warranted to identify risk factors that may explain this unfavorable trend. PMID:20442388
Emergency Department Visits for Nontraumatic Dental Problems: A Mixed-Methods Study
Chi, Donald L.; Schwarz, Eli; Milgrom, Peter; Yagapen, Annick; Malveau, Susan; Chen, Zunqui; Chan, Ben; Danner, Sankirtana; Owen, Erin; Morton, Vickie; Lowe, Robert A.
2015-01-01
Objectives. We documented emergency department (ED) visits for nontraumatic dental problems and identified strategies to reduce ED dental visits. Methods. We used mixed methods to analyze claims in 2010 from a purposive sample of 25 Oregon hospitals and Oregon’s All Payer All Claims data set and interviewed 51 ED dental visitors and stakeholders from 6 communities. Results. Dental visits accounted for 2.5% of ED visits and represented the second-most-common discharge diagnosis in adults aged 20 to 39 years, were associated with being uninsured (odds ratio [OR] = 5.2 [reference: commercial insurance]; 95% confidence interval [CI] = 4.8, 5.5) or having Medicaid insurance (OR = 4.0; 95% CI = 3.7, 4.2), resulted in opioid (56%) and antibiotic (56%) prescriptions, and generated $402 (95% CI = $396, $408) in hospital costs per visit. Interviews revealed health system, community, provider, and patient contributors to ED dental visits. Potential solutions provided by interviewees included Medicaid benefit expansion, care coordination, water fluoridation, and patient education. Conclusions. Emergency department dental visits are a significant and costly public health problem for vulnerable individuals. Future efforts should focus on implementing multilevel interventions to reduce ED dental visits. PMID:25790415
European Scientific Notes, Volume 35, Number 11
1981-11-30
Kaindjee-Tjituka, Francina; Sawadogo, Souleymane; Mutandi, Graham; Maher, Andrew D; Salomo, Natanael; Mbapaha, Claudia; Neo, Marytha; Beukes, Anita; Gweshe, Justice; Muadinohamba, Alexinah; Lowrance, David W
2017-01-01
Access to CD4+ testing remains a common barrier to early initiation of antiretroviral therapy among persons living with HIV/AIDS in low- and middle-income countries. The feasibility of task-shifting of point-of-care (POC) CD4+ testing to lay health workers in Namibia has not been evaluated. From July to August 2011, Pima CD4+ analysers were used to improve access to CD4+ testing at 10 selected public health facilities in Namibia. POC Pima CD4+ testing was performed by nurses or lay health workers. Venous blood samples were collected from 10% of patients and sent to centralised laboratories for CD4+ testing with standard methods. Outcomes for POC Pima CD4+ testing and patient receipt of results were compared between nurses and lay health workers and between the POC method and standard laboratory CD4+ testing methods. Overall, 1429 patients received a Pima CD4+ test; 500 (35.0%) tests were performed by nurses and 929 (65.0%) were performed by lay health workers. When Pima CD4+ testing was performed by a nurse or a lay health worker, 93.2% and 95.2% of results were valid (p = 0.1); 95.6% and 98.1% of results were received by the patient (p = 0.007); 96.2% and 94.0% of results were received by the patient on the same day (p = 0.08). Overall, 97.2% of Pima CD4+ results were received by patients, compared to 55.4% of standard laboratory CD4+ results (p < 0.001). POC CD4+ testing was feasible and effective when task-shifted to lay health workers. Rollout of POC CD4+ testing via task-shifting can improve access to CD4+ testing and retention in care between HIV diagnosis and antiretroviral therapy initiation in low- and middle-income countries.
Sex Differences in Patient-Reported Poststroke Disability.
White, Brandi M; Magwood, Gayenell S; Burns, Suzanne Perea; Ellis, Charles
2018-04-01
Recent studies have shown that stroke has a differential impact in women compared to men. Women are more likely to survive strokes than men, yet they experience more severe strokes resulting in greater poststroke disability. However, few studies have characterized sex differences in functional ability after stroke. This study examined sex differences in long-term disability among stroke survivors. This was a retrospective analysis of the 2015 National Health Interview Survey. Respondents were asked to rate their ability to perform 11 functional tasks. Univariate comparisons were completed to evaluate sex differences in performance, and multinomial logistic regression was used to determine the odds of reporting functional limitations. Five hundred fourteen men and 641 women stroke survivors completed the survey (mean age: 66.9 years). Approximately 75% of the sample reported having hypertension, 61% high cholesterol, 33% diabetes, 24% heart disease, 21% heart attack, and 16% chronic obstructive pulmonary disease. In the predictive models, men were less likely to report "very difficult/can't do at all" in walking ¼ mile (odds ratios [OR] = 0.68, 95% CI 0.51-0.90), climbing 10 steps (OR = 0.65, 95% CI 0.49-0.85), standing 2 hours (OR = 0.66, 95% CI 0.50-0.87), stooping (OR = 0.51, 95% CI 0.39-0.68), reaching overhead (OR = 0.69, 95% CI 0.49-0.97), carrying 10 pounds (OR = 0.45, 95% CI 0.34-0.59), and pushing large objects (OR = 0.37, 95% CI 0.28-0.5) compared to women. The functional outcomes of men stroke survivors were significantly better than those of women. The specific factors that contribute to sex differences in stroke-related outcomes are not entirely clear. Future research is needed to better understand these differences to ensure that equity of care is received.
Romany, Laura; Garrido, Nicolas; Cobo, Ana; Aparicio-Ruiz, Belen; Serra, Vicente; Meseguer, Marcos
2017-02-01
The purpose of this study was to assess the effect of magnetic-activated cell sorting (MACS) technology on obstetric and perinatal outcomes compared with those achieved after swim-up, in a randomized controlled trial. This was a two-arm, unicentric, prospective, randomized, triple-blinded trial that included a total of 237 infertile couples between October 2010 and January 2013. A total of 65 and 66 newborns from the MACS and control groups, respectively, were described. MACS had no clinically relevant adverse effects on obstetric and perinatal outcomes. No differences were found for obstetric problems including premature rupture of membranes 6.1% (CI95% 0-12.8) vs. 5.9% (CI95% 0-12.4), 1st trimester bleeding 28.6% (CI95% 15.9-41.2) vs. 23.5% (CI95% 11.9-35.1), invasive procedures such as amniocentesis 2.0% (CI95% 0-5.9) vs. 3.9% (CI95% 0-9.2), diabetes 14.3% (CI95% 4.5-24.1) vs. 9.8% (CI95% 1.6-17.9), anemia 6.1% (CI95% 0-12.8) vs. 5.9% (CI95% 0-12.4), 2nd and 3rd trimesters 10.2% (CI95% 1.7-18.7) vs. 5.9% (CI95% 0-12.4), urinary tract infection 8.2% (CI95% 0.5-15.9) vs. 3.9% (CI95% 0-9.2), pregnancy-induced hypertension 6.1% (CI95% 0-12.8) vs. 15.7% (CI95% 5.7-25.7), birth weight (g) 2684.10 (CI95% 2499.48-2868.72) vs. 2676.12 (CI95% 2499.02-2852.21), neonatal height (cm) 48.3 (CI95% 47.1-49.4) vs. 46.5 (CI95% 44.6-48.4), and gestational cholestasis 0% (CI95% 0-0) vs. 3.9% (CI95% 0-9.2), respectively, in the MACS group compared with the control group. Our data suggest that MACS technology does not increase or decrease adverse obstetric and perinatal outcomes in children conceived when this technology was used; this is the largest randomized controlled trial reporting live-birth results with MACS.
Does Strategy Training Reduce Age-Related Deficits in Working Memory?
Bailey, Heather R.; Dunlosky, John; Hertzog, Christopher
2014-01-01
Background Older adults typically perform worse on measures of working memory (WM) than do young adults; however, age-related differences in WM performance might be reduced if older adults use effective encoding strategies (Bailey, Dunlosky, & Hertzog, 2009). Objective The purpose of the current experiment was to evaluate WM performance after training individuals to use effective encoding strategies. Methods Participants in the training group (older adults: n = 39; young adults: n = 41) were taught about various verbal encoding strategies and their differential effectiveness and were trained to use interactive imagery and sentence generation on a list-learning task. Participants in the control group (older: n=37; young: n=38) completed an equally engaging filler task. All participants completed a pre-training and post-training reading span task, which included self-reported strategy use, as well as two transfer tasks that differed in the affordance to use the trained strategies – a paired-associate recall task and the self-ordered pointing task. Results Both young and older adults were able to use the target strategies on the WM task and showed gains in WM performance after training. The age-related WM deficit was not greatly affected, however, and the training gains did not transfer to the other cognitive tasks. In fact, participants attempted to adapt the trained strategies for a paired-associate recall task, but the increased strategy use did not benefit their performance. Conclusions Strategy training can boost WM performance, and its benefits appear to arise from strategy-specific effects and not from domain-general gains in cognitive ability. PMID:24577079
Association between Body Composition and Motor Performance in Preschool Children
Kakebeeke, Tanja H.; Lanzi, Stefano; Zysset, Annina E.; Arhab, Amar; Messerli-Bürgy, Nadine; Stuelb, Kerstin; Leeger-Aschmann, Claudia S.; Schmutz, Einat A.; Meyer, Andrea H.; Kriemler, Susi; Munsch, Simone; Jenni, Oskar G.; Puder, Jardena J.
2017-01-01
Objective Being overweight makes physical movement more difficult. Our aim was to investigate the association between body composition and motor performance in preschool children. Methods A total of 476 predominantly normal-weight preschool children (age 3.9 ± 0.7 years; m/f: 251/225; BMI 16.0 ± 1.4 kg/m2) participated in the Swiss Preschoolers' Health Study (SPLASHY). Body composition assessments included skinfold thickness, waist circumference (WC), and BMI. The Zurich Neuromotor Assessment (ZNA) was used to assess gross and fine motor tasks. Results After adjustment for age, sex, socioeconomic status, sociocultural characteristics, and physical activity (assessed with accelerometers), skinfold thickness and WC were both inversely correlated with jumping sideward (gross motor task β-coefficient −1.92, p = 0.027; and −3.34, p = 0.014, respectively), while BMI was positively correlated with running performance (gross motor task β-coefficient 9.12, p = 0.001). No significant associations were found between body composition measures and fine motor tasks. Conclusion The inverse associations between skinfold thickness or WC and jumping sideward indicates that children with high fat mass may be less proficient in certain gross motor tasks. The positive association between BMI and running suggests that BMI might be an indicator of fat-free (i.e., muscle) mass in predominately normal-weight preschool children. PMID:28934745
Deng, Jianping; Deng, Tianxing; Gong, Zhihua; Hao, Ping
2013-01-01
Background The association of the three Glutathione S-transferases (GSTs) polymorphisms (GSTM1, GSTT1 and GSTP1) genotypes with their individual susceptibilities to renal cell carcinoma (RCC) has not been well established. We performed a quantitative meta-analysis to assess the possible associations between the GSTM1, GSTT1 and GSTP1 genotypes and their individual susceptibilities to renal cell carcinoma. Methods We systematically searched the PubMed, CNKI and Embase databases to identify the relevant studies. Finally, 11 eligible studies were selected. The pooled odds ratios (ORs) with their 95% confidence intervals (CIs) were used to assess the association between the GSTs polymorphisms and the risk of RCC. Multiple subgroup analyses and quality assessment of the included studies were performed based on the available information. Results None of the GSTs polymorphisms had a significant association with the RCC risk. Similar results were found in the subgroup analyses, except for the GSTs polymorphisms in the situations described below. The GSTM1 and GSTT1 active genotypes in subjects exposed to pesticides (GSTM1: OR = 3.44; 95% CI, 2.04–5.80; GSTT1: OR = 2.84; 95% CI, 1.75–4.60), most of the GSTs genotypes in Asian populations (GSTT1: OR = 2.39, 95% CI = 1.63–3.51; GSTP1: Dominant model: OR = 1.50, 95% CI = 1.14–1.99; Additive model: OR = 1.39, 95% CI = 1.12–1.73; AG vs. AA: OR = 1.47, 95% CI = 1.10–1.97; GG vs. AA: OR = 1.82, 95% CI = 1.07–3.09) and the dual null genotype of GSTT1-GSTP1 (OR = 2.84, 95% CI = 1.75–4.60) showed positive associations with the RCC risk. Conclusion Our present study provides evidence that the GSTM1, GSTT1 and GSTP1 polymorphisms are not associated with the development of RCC. However, more case-control studies are needed for further confirmation. PMID:23717494
Huang, Ruixue; Ning, Huacheng; Shen, Minxue; Li, Jie; Zhang, Jianglin; Chen, Xiang
2017-01-01
Objective: Atopic dermatitis (AD) is a prevalent, burdensome, and psychologically important pediatric concern. Probiotics have been suggested as a treatment for AD. Some reports have explored this topic; however, the utility of probiotics for AD remains to be firmly established. Methods: To assess the effects of probiotics on AD in children, the PubMed/Medline, Cochrane Library, Scopus, and OVID databases were searched for reports published in the English language. Results: Thirteen studies were identified. Significantly lower SCORAD values, favoring probiotics over controls, were observed (mean difference [MD], -3.07; 95% confidence interval [CI], -6.12 to -0.03; P < 0.001). The reported efficacy of probiotics in children < 1 year old was -1.03 (95%CI, -7.05 to 4.99) and that in children 1-18 years old was -4.50 (95%CI, -7.45 to -1.54; P < 0.001). Subgroup analyses showed that in Europe, SCORAD revealed no effect of probiotics, whereas significantly lower SCORAD values were reported in Asia (MD, -5.39; 95%CI, -8.91 to -1.87). Lactobacillus rhamnosus GG (MD, 3.29; 95%CI, -0.30 to 6.88; P = 0.07) and Lactobacillus plantarum (MD, -0.70; 95%CI, -2.30 to 0.90; P = 0.39) showed no significant effect on SCORAD values in children with AD. However, Lactobacillus fermentum (MD, -11.42; 95%CI, -13.81 to -9.04), Lactobacillus salivarius (MD, -7.21; 95%CI, -9.63 to -4.78), and a mixture of different strains (MD, -3.52; 95%CI, -5.61 to -1.44) showed significant effects on SCORAD values in children with AD. Conclusions: Our meta-analysis indicated that the research to date has not robustly shown that probiotics are beneficial for children with AD. However, caution is needed when generalizing our results, as the populations evaluated were heterogeneous. Randomized controlled trials with larger samples and greater power are necessary to identify the species, dose, and treatment duration of probiotics that are most efficacious for treating AD in children.
2011-01-01
Background Obstetric fistula, although virtually eliminated in high-income countries, still remains a prevalent and debilitating condition in many parts of the developing world. It occurs in areas where access to care at childbirth is limited or of poor quality and where few hospitals offer the necessary corrective surgery. Methods This was a prospective observational study in which all women who attended Mbarara Regional Referral Hospital in western Uganda with obstetric fistula during the study period were assessed pre-operatively for social demographics, fistula characteristics, classification and outcomes after surgery. Assessment for fistula closure and stress incontinence after surgery was done using a dye test before discharge. Results Of the 77 women who were recruited in this study, 60 (77.9%) had successful closure of their fistulae. Unsuccessful fistula closure was significantly associated with large fistula size (odds ratio 6, 95% confidence interval 1.46-24.63), circumferential fistulae (odds ratio 9.33, 95% confidence interval 2.23-39.12) and moderate to severe vaginal scarring (odds ratio 12.24, 95% confidence interval 1.52-98.30). Vaginal scarring was the only factor independently associated with unsuccessful fistula repair (odds ratio 10, 95% confidence interval 1.12-100.57). Residual stress incontinence after successful fistula closure was associated with type IIb fistulae (odds ratio 5.56, 95% confidence interval 1.34-23.02), circumferential fistulae (odds ratio 10.5, 95% confidence interval 1.39-79.13) and previous unsuccessful fistula repair (odds ratio 4.8, 95% confidence interval 1.27-18.11). Independent predictors of residual stress incontinence after successful fistula closure were urethral involvement (odds ratio 4.024, 95% confidence interval 2.77-5.83) and previous unsuccessful fistula repair (odds ratio 38.69, 95% confidence interval 2.13-703.88). Conclusions This study demonstrated that large fistula size, circumferential fistulae and marked vaginal scarring are predictors of unsuccessful fistula repair, while predictors of residual stress incontinence after successful fistula closure were urethral involvement, circumferential fistulae and previous unsuccessful fistula repair. PMID:22151960
Intelligent single switch wheelchair navigation.
Ka, Hyun W; Simpson, Richard; Chung, Younghyun
2012-11-01
We have developed an intelligent single switch scanning interface and wheelchair navigation assistance system, called intelligent single switch wheelchair navigation (ISSWN), to improve driving safety, comfort and efficiency for individuals who rely on single switch scanning as a control method. ISSWN combines a standard powered wheelchair with a laser rangefinder, a single switch scanning interface and a computer. It provides the user with context-sensitive and task-specific scanning options that reduce driving effort, based on an interpretation of sensor data together with user input. Trials performed by 9 able-bodied participants showed that the system significantly improved driving safety and efficiency in a navigation task, reducing the number of switch presses to 43.5% of that required by traditional single switch wheelchair navigation (p < 0.001). All participants made a significant improvement (39.1%; p < 0.001) in completion time after only two trials.
Exudate detection in color retinal images for mass screening of diabetic retinopathy.
Zhang, Xiwei; Thibault, Guillaume; Decencière, Etienne; Marcotegui, Beatriz; Laÿ, Bruno; Danno, Ronan; Cazuguel, Guy; Quellec, Gwénolé; Lamard, Mathieu; Massin, Pascale; Chabouis, Agnès; Victor, Zeynep; Erginay, Ali
2014-10-01
The automatic detection of exudates in color eye fundus images is an important task in applications such as diabetic retinopathy screening. The presented work has been undertaken in the framework of the TeleOphta project, whose main objective is to automatically detect normal exams in a tele-ophthalmology network, thus reducing the burden on the readers. A new clinical database, e-ophtha EX, containing precisely manually contoured exudates, is introduced. As opposed to previously available databases, e-ophtha EX is very heterogeneous. It contains images gathered within the OPHDIAT telemedicine network for diabetic retinopathy screening. Image definition and quality, as well as the patients' condition or the retinograph used for the acquisition, for example, are subject to important changes between different examinations. The proposed exudate detection method has been designed for this complex situation. We propose new preprocessing methods, which perform not only normalization and denoising tasks, but also detect reflections and artifacts in the image. A new candidate segmentation method, based on mathematical morphology, is proposed. These candidates are characterized using classical features, but also novel contextual features. Finally, a random forest algorithm is used to detect the exudates among the candidates. The method has been validated on the e-ophtha EX database, obtaining an AUC of 0.95. It has also been validated on other databases, obtaining an AUC between 0.93 and 0.95, outperforming state-of-the-art methods.
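The final stage of the pipeline described above, a random forest deciding for each morphologically segmented candidate whether it is a true exudate, can be sketched as follows. The feature matrix, labels, and train/test split are simulated stand-ins, not the e-ophtha EX data, and the paper's actual evaluation protocol (per-lesion and per-image validation across several databases) is not reproduced here.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

# One row per exudate candidate, with classical (size, intensity, contrast) and
# contextual features; y marks candidates that overlap a manually contoured exudate.
rng = np.random.default_rng(4)
n = 5000
X = rng.normal(size=(n, 12))                 # 12 illustrative candidate features
y = rng.binomial(1, 1 / (1 + np.exp(-(X[:, 0] + 0.5 * X[:, 1] - 1.5))))

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=500, random_state=0).fit(X_tr, y_tr)
probs = clf.predict_proba(X_te)[:, 1]        # per-candidate exudate probability
print(f"candidate-level AUC: {roc_auc_score(y_te, probs):.3f}")
```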
The Healthy Mind, Healthy Mobility Trial: A Novel Exercise Program for Older Adults.
Gill, Dawn P; Gregory, Michael A; Zou, Guangyong; Liu-Ambrose, Teresa; Shigematsu, Ryosuke; Hachinski, Vladimir; Fitzgerald, Clara; Petrella, Robert J
2016-02-01
More evidence is needed to conclude that a specific program of exercise and/or cognitive training warrants prescription for the prevention of cognitive decline. We examined the effect of a group-based standard exercise program for older adults, with and without dual-task training, on cognitive function in older adults without dementia. We conducted a proof-of-concept, single-blinded, 26-wk randomized controlled trial whereby participants recruited from preexisting exercise classes at the Canadian Centre for Activity and Aging in London, Ontario, were randomized to the intervention group (exercise + dual-task [EDT]) or the control group (exercise only [EO]). Each week (2 or 3 d · wk(-1)), both groups accumulated a minimum of 50 min of aerobic exercise (target 75 min) from standard group classes and completed 45 min of beginner-level square-stepping exercise. The EDT group was also required to answer cognitively challenging questions while doing beginner-level square-stepping exercise (i.e., dual-task training). The effect of interventions on standardized global cognitive function (GCF) scores at 26 wk was compared between the groups using the linear mixed effects model approach. Participants (n = 44; 68% female; mean [SD] age: 73.5 [7.2] yr) had on average, objective evidence of cognitive impairment (Montreal Cognitive Assessment scores, mean [SD]: 24.9 [1.9]) but not dementia (Mini-Mental State Examination scores, mean [SD]: 28.8 [1.2]). After 26 wk, the EDT group showed greater improvement in GCF scores compared with the EO group (difference between groups in mean change [95% CI]: 0.20 SD [0.01-0.39], P = 0.04). A 26-wk group-based exercise program combined with dual-task training improved GCF in community-dwelling older adults without dementia.
Burr, Nick E; Hull, Mark A; Subramanian, Venkataraman
2016-01-01
AIM: To determine whether aspirin or non-aspirin non-steroidal anti-inflammatory drugs (NA-NSAIDs) prevent colorectal cancer (CRC) in patients with inflammatory bowel disease (IBD). METHODS: We performed a systematic review and meta-analysis. We searched for articles reporting the risk of CRC in patients with IBD related to aspirin or NA-NSAID use. Pooled odds ratios (OR) and 95%CIs were determined using a random-effects model. Publication bias was assessed using funnel plots and Egger’s test. Heterogeneity was assessed using Cochran’s Q and the I² statistic. RESULTS: Eight studies involving 14917 patients and 3 studies involving 1282 patients provided data on the risk of CRC in patients with IBD taking NA-NSAIDs and aspirin, respectively. The pooled OR of developing CRC after exposure to NA-NSAIDs in patients with IBD was 0.80 (95%CI: 0.39-1.21) and after exposure to aspirin it was 0.66 (95%CI: 0.06-1.39). There was significant heterogeneity (I² > 50%) between the studies. There was no change in the effect estimates on subgroup analyses of the population studied or whether adjustment or matching was performed. CONCLUSION: There is a lack of high quality evidence on this important clinical topic. From the available evidence, NA-NSAID or aspirin use does not appear to be chemopreventative for CRC in patients with IBD. PMID:27053860
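The pooled odds ratios above come from a random-effects meta-analysis; a common way to reproduce that kind of pooling from per-study ORs and 95% CIs is the DerSimonian-Laird estimator sketched below (illustrative only; the review's own data and software are not reproduced here).

```python
# Illustrative DerSimonian-Laird random-effects pooling of study odds ratios, working
# from each study's OR and 95% CI. Assumes at least two studies; not the review's data.
import numpy as np

def pool_random_effects(or_values, ci_low, ci_high):
    y = np.log(or_values)                               # per-study log odds ratios
    se = (np.log(ci_high) - np.log(ci_low)) / (2 * 1.96)
    w = 1.0 / se**2                                     # inverse-variance (fixed) weights
    q = np.sum(w * (y - np.sum(w * y) / np.sum(w))**2)  # Cochran's Q
    df = len(y) - 1
    c = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2 = max(0.0, (q - df) / c)                       # between-study variance
    w_re = 1.0 / (se**2 + tau2)                         # random-effects weights
    pooled = np.sum(w_re * y) / np.sum(w_re)
    se_pooled = np.sqrt(1.0 / np.sum(w_re))
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0  # I-squared, in percent
    return (np.exp(pooled),
            np.exp(pooled - 1.96 * se_pooled),
            np.exp(pooled + 1.96 * se_pooled),
            i2)
```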
ERICA: leisure-time physical inactivity in Brazilian adolescents.
Cureau, Felipe Vogt; da Silva, Thiago Luiz Nogueira; Bloch, Katia Vergetti; Fujimori, Elizabeth; Belfort, Dilson Rodrigues; de Carvalho, Kênia Mara Baiocchi; de Leon, Elisa Brosina; de Vasconcellos, Mauricio Teixeira Leite; Ekelund, Ulf; Schaan, Beatriz D
2016-02-01
OBJECTIVE To evaluate the prevalence of leisure-time physical inactivity in Brazilian adolescents and its association with geographical and sociodemographic variables. METHODS The sample comprised 74,589 adolescents participating in the Study of Cardiovascular Risks in Adolescents (ERICA). This national, school-based cross-sectional study involved adolescents aged 12 to 17 years in Brazilian cities with more than 100 thousand inhabitants. The prevalence of leisure-time physical inactivity was categorized according to the volume of weekly practice (< 300 min; 0 min). Prevalences were estimated for the total sample and by sex. Poisson regression models were used to assess associated factors. RESULTS The prevalence of leisure-time physical inactivity was 54.3% (95%CI 53.4-55.2), and it was higher in females (70.7%, 95%CI 69.5-71.9) than in males (38.0%, 95%CI 36.7-39.4). More than a quarter of adolescents (26.5%, 95%CI 25.8-27.3) reported no physical activity in their leisure time, a condition more prevalent among girls (39.8%, 95%CI 38.8-40.9) than boys (13.4%, 95%CI 12.4-14.4). For girls, the variables associated with physical inactivity were: residing in the Northeast (PR = 1.13, 95%CI 1.08-1.19), Southeast (PR = 1.16, 95%CI 1.11-1.22) or South (PR = 1.12, 95%CI 1.06-1.18); being 16-17 years old (PR = 1.06, 95%CI 1.12-1.15); and belonging to the lower economic class (PR = 1.33, 95%CI 1.20-1.48). The same factors, except residing in the Southeast or South, were also associated with no leisure-time physical activity in this group. In males, besides region, being older (p < 0.001) and self-declaring as indigenous (PR = 0.37, 95%CI 0.19-0.73) were also associated with no leisure-time physical activity. CONCLUSIONS The prevalence of leisure-time physical inactivity in Brazilian adolescents is high. It shows regional variation and is associated with age and low socioeconomic status. Special attention should be given to girls and to those who do not engage in any physical activity during leisure time, so that they can adopt a more active lifestyle.
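Prevalence ratios such as the PRs reported above are often estimated with Poisson regression and robust standard errors; the sketch below shows that general approach with hypothetical column names and without the ERICA survey design or weights, so it should be read as an outline rather than the study's code.

```python
# Sketch of prevalence-ratio estimation for a binary outcome via Poisson regression with
# robust (HC) standard errors; column names are hypothetical and the complex survey
# design is deliberately omitted.
import numpy as np
import statsmodels.api as sm
import statsmodels.formula.api as smf

def prevalence_ratios(df):
    model = smf.glm("inactive ~ C(region) + C(age_group) + C(economic_class)",
                    data=df, family=sm.families.Poisson())
    fit = model.fit(cov_type="HC0")          # robust variance for the binary outcome
    pr = np.exp(fit.params)                  # prevalence ratios
    ci = np.exp(fit.conf_int())              # 95% confidence intervals
    return pr, ci
```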
LeMonda, Brittany C.; Mahoney, Jeannette R.; Verghese, Joe; Holtzer, Roee
2016-01-01
The Walking While Talking (WWT) dual-task paradigm is a mobility stress test that predicts major outcomes, including falls, frailty, disability, and mortality in aging. Certain personality traits, such as neuroticism, extraversion, and their combination, have been linked to both cognitive and motor outcomes. We examined whether individual differences in the personality dimensions of neuroticism and extraversion predicted dual-task performance decrements (both motor and cognitive) on a WWT task in non-demented older adults. We hypothesized that the combined effect of high neuroticism-low extraversion would be related to greater dual-task costs in gait velocity and cognitive performance in non-demented older adults. Participants (N = 295; age range = 65–95 years; 164 female) completed the Big Five Inventory and a WWT task involving concurrent gait and a serial 7s subtraction task. Gait velocity was obtained using an instrumented walkway. The high neuroticism-low extraversion group incurred greater dual-task costs (i.e., worse performance) in both gait velocity {95% confidence interval (CI) [−17.68 to −3.07]} and cognitive performance (95% CI [−19.34 to −2.44]) compared to the low neuroticism-high extraversion group, suggesting that high neuroticism-low extraversion interferes with the allocation of attentional resources to competing task demands during the WWT task. Older individuals with high neuroticism-low extraversion may be at higher risk for falls, mobility decline and other adverse outcomes in aging. PMID:26527241
The Effect of Work Characteristics on Dermatologic Symptoms in Hairdressers
2014-01-01
Objectives Hairdressers in Korea perform various tasks and are exposed to health risk factors such as chemical substances or prolonged duration of wet work. The objective of this study was to provide descriptive statistics on the demographics and work characteristics of hairdressers in Korea and to identify work-related risk factors for dermatologic symptoms in hairdressers. Methods 1,054 hairdressers were selected and analyzed for this study. Independent variables were exposure to chemical substances, the training status of the hairdressers, and the main tasks required of them, and the dependent variable was the incidence of dermatologic symptoms. The relationships between work characteristics and dermatologic symptoms were evaluated by estimating odds ratios using multiple logistic regression analysis. Results Among the 1,054 study subjects, 212 hairdressers (20.1%) complained of dermatologic symptoms, and the symptoms were more prevalent in younger, unmarried or highly educated hairdressers. The main tasks that comprise the majority of the wet work were strictly determined by training status, since 96.5% of staff hairdressers identified washing as their main task, while only 1.5% and 2.0% of master and designer hairdressers, respectively, identified this as their main task. Multiple logistic regression was performed to estimate odds ratios. While exposure to hairdressing chemicals showed no significant effect on the odds ratio for the incidence of dermatologic symptoms, higher odds ratios of dermatologic symptoms were found in staff hairdressers (2.70, 95% CI: 1.32 - 5.51) and in hairdressers who perform washing as their main task (2.03, 95% CI: 1.22 - 3.37), after adjusting for general and work characteristics. Conclusions This study showed that the training status and main tasks of hairdressers are closely related to each other and that both are related to the incidence of dermatologic symptoms. This suggests that regulations on working conditions and health management guidelines for hairdressers should be established in the future. PMID:25028609
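The adjusted odds ratios reported above come from multiple logistic regression; a generic, hedged version of that kind of analysis might look like the following, with illustrative column names rather than the study's actual variables.

```python
# Hedged sketch of an adjusted-odds-ratio analysis: logistic regression of dermatologic
# symptoms on work characteristics, adjusting for general covariates. Column names
# (symptoms, status, main_task, age, married, education) are illustrative only.
import numpy as np
import statsmodels.formula.api as smf

def adjusted_odds_ratios(df):
    fit = smf.logit("symptoms ~ C(status) + C(main_task) + age + C(married) + C(education)",
                    data=df).fit(disp=0)
    return np.exp(fit.params), np.exp(fit.conf_int())   # ORs and their 95% CIs
```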
Bergin, Junping Ma; Rubenstein, Jeffrey E; Mancl, Lloyd; Brudvik, James S; Raigrodski, Ariel J
2013-10-01
Conventional impression techniques for recording the location and orientation of implant-supported, complete-arch prostheses are time consuming and prone to error. The direct optical recording of the location and orientation of implants, without the need for intermediate transfer steps, could reduce or eliminate those disadvantages. The objective of this study was to assess the feasibility of using a photogrammetric technique to record the location and orientation of multiple implants and to compare the results with those of a conventional complete-arch impression technique. A stone cast of an edentulous mandibular arch containing 5 implant analogs was fabricated to create a master model. The 3-dimensional (3D) spatial orientations of implant analogs on the master model were measured with a coordinate measuring machine (CMM) (control). Five definitive casts were made from the master model with a splinted impression technique. The positions of the implant analogs on the 5 casts were measured with a NobelProcera scanner (conventional method). Prototype optical targets were attached to the master model implant analogs, and 5 sets of images were recorded with a digital camera and a standardized image capture protocol. Dimensional data were imported into commercially available photogrammetry software (photogrammetric method). The precision and accuracy of the 2 methods were compared with a 2-sample t test (α=.05) and a 95% confidence interval. The location precision (standard error of measurement) was 3.9 µm (95% CI 2.7 to 7.1) for the CMM, 5.6 µm (95% CI 3.4 to 16.1) for photogrammetry, and 17.2 µm (95% CI 10.3 to 49.4) for the conventional method. The average measurement error was 26.2 µm (95% CI 15.9 to 36.6) for the conventional method and 28.8 µm (95% CI 24.8 to 32.9) for the photogrammetric method. The overall measurement accuracy was not significantly different when comparing the conventional to the photogrammetric method (mean difference = -2.6 µm, 95% CI -12.8 to 7.6). The precision of the photogrammetric method was similar to that of the CMM, whereas the conventional method was less precise than both. However, the overall measurement accuracy of the photogrammetric and conventional methods was similar. Copyright © 2013 The Editorial Council of the Journal of Prosthetic Dentistry. Published by Mosby, Inc. All rights reserved.
Nienow, Tasha; MacDonald, Angus
2017-01-01
Abstract Background: Cognitive deficits contribute to the functional disability associated with schizophrenia. Cognitive training has shown promise as a method of intervention; however, there is considerable variability in the implementation of this approach. The aim of this study was to test the efficacy of a high dose of cognitive training that targeted working memory-related functions. Methods: A randomized, double blind, active placebo-controlled, clinical trial was conducted with 80 outpatients with schizophrenia (mean age 46.44 years, 25% female). Patients were randomized to either working memory-based cognitive training or a computer skills training course that taught computer applications. In both conditions, participants received an average of 3 hours of training weekly for 16 weeks. Cognitive and functional outcomes were assessed with the MATRICS Consensus Cognitive Battery, N-Back performance, 2 measures of functional capacity (UPSA and SSPA) and a measure of community functioning, the Social Functioning Scale. Results: An intent-to-treat analysis found that patients who received cognitive training demonstrated significantly greater change on a trained task (Word N-Back), F(78) = 21.69, P < .0001, and a novel version of a trained task (Picture N-Back) as compared to those in the comparison condition, F(78) = 13.59, P = .002. However, only very modest support was found for generalization of training gains. A trend for an interaction was found on the MCCB Attention Domain score, F(78) = 2.56, P = .12. Participants who received cognitive training demonstrated significantly improved performance, t(39) = 3.79, P = .001, while those in computer skills did not, t(39) = 1.07, P = .37. Conclusion: A well-powered, high-dose, working memory focused, computer-based, cognitive training protocol produced only a small effect in patients with schizophrenia. Results indicate the importance of measuring generalization from training tasks in cognitive remediation studies. Computer-based training was not an effective method of producing change in cognition in patients with schizophrenia.
Cell phones to collect pregnancy data from remote areas in Liberia.
Lori, Jody R; Munro, Michelle L; Boyd, Carol J; Andreatta, Pamela
2012-09-01
To report findings on knowledge and skill acquisition following a 3-day training session in the use of short message service (SMS) texting with non- and low-literacy traditional midwives. A pre- and post-test study design was used to assess knowledge and skill acquisition with 99 traditional midwives on the use of SMS texting for real-time, remote data collection in rural Liberia, West Africa. Paired sample t-tests were conducted to establish if overall mean scores varied significantly from pre-test to immediate post-test. Analysis of variance was used to compare means across groups. The nonparametric McNemar's test was used to determine significant differences between the pre-test and post-test values of each individual step involved in SMS texting. Pearson's chi-square test of independence was used to examine the association between ownership of cell phones within a family and achievement of the seven tasks. The mean increase in cell phone knowledge scores was 3.67, with a 95% confidence interval ranging from 3.39 to 3.95. Participants with a cell phone in the family did significantly better on three of the seven tasks in the pre-test: "turns cell on without help" (χ²(1) = 9.15, p = .003); "identifies cell phone coverage" (χ²(1) = 5.37, p = .024); and "identifies cell phone is charged" (χ²(1) = 4.40, p = .042). A 3-day cell phone training session with low- and nonliterate traditional midwives in rural Liberia improved their ability to use mobile technology for SMS texting. Mobile technology can improve data collection accessibility and be used for numerous health care and public health issues. Cell phone accessibility holds great promise for collecting health data in low-resource areas of the world. © 2012 Sigma Theta Tau International.
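McNemar's test, used above for the pre/post comparison of each binary skill item, can be run on a 2x2 table of paired outcomes as in this small example; the counts are made up for illustration and are not the study's data.

```python
# Illustrative McNemar's test for a pre/post change in a binary skill item (e.g., "turns
# cell on without help"); the 2x2 counts below are invented, not study data.
import numpy as np
from statsmodels.stats.contingency_tables import mcnemar

# Rows: pre-test (fail, pass); columns: post-test (fail, pass).
table = np.array([[10, 55],
                  [ 3, 31]])
result = mcnemar(table, exact=False, correction=True)   # chi-square version of the test
print(f"statistic={result.statistic:.2f}, p={result.pvalue:.3f}")
```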
Yan, Cunling; Hu, Jian; Yang, Jia; Chen, Zhaoyun; Li, Huijun; Wei, Lianhua; Zhang, Wei; Xing, Hao; Sang, Guoyao; Wang, Xiaoqin; Han, Ruilin; Liu, Ping; Li, Zhihui; Li, Zhiyan; Huang, Ying; Jiang, Li; Li, Shunjun; Dai, Shuyang; Wang, Nianyue; Yang, Yongfeng; Ma, Li; Soh, Andrew; Beshiri, Agim; Shen, Feng; Yang, Tian; Fan, Zhuping; Zheng, Yijie; Chen, Wei
2018-04-01
Protein induced by vitamin K absence or antagonist-II (PIVKA-II) has been widely used as a biomarker for liver cancer diagnosis in Japan for decades. However, the reference intervals for serum ARCHITECT PIVKA-II have not been established in the Chinese population. Thus, this study aimed to measure serum PIVKA-II levels in healthy Chinese subjects. This is a sub-analysis from the prospective, cross-sectional and multicenter study (ClinicalTrials.gov Identifier: NCT03047603). A total of 892 healthy participants (777 Han and 115 Uygur) with complete health checkup results were recruited from 7 regional centers in China. Serum PIVKA-II level was measured by ARCHITECT immunoassay. All 95% reference ranges were estimated by nonparametric method. The distribution of PIVKA-II values showed significant difference with ethnicity and sex, but not age. The 95% reference range of PIVKA-II was 13.62-40.38 mAU/ml in Han Chinese subjects and 15.16-53.74 mAU/ml in Uygur subjects. PIVKA-II level was significantly higher in males than in females (P < 0.001). The 95% reference range of PIVKA-II was 15.39-42.01 mAU/ml in Han males while 11.96-39.13 mAU/ml in Han females. The reference interval of serum PIVKA-II on the Architect platform was established in healthy Chinese adults. This will be valuable for future clinical and laboratory studies performed using the Architect analyzer. Different ethnic backgrounds and analytical methods underline the need for redefining the reference interval of analytes such as PIVKA-II, in central laboratories in different countries. Copyright © 2018 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
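The 95% reference ranges above were estimated by a nonparametric method, which in its simplest form is just the 2.5th and 97.5th percentiles of the healthy-subject measurements; the snippet below illustrates that calculation on simulated values (not study data).

```python
# Sketch of a nonparametric 95% reference interval (2.5th and 97.5th percentiles),
# a standard way such ranges are derived; the simulated values are placeholders only.
import numpy as np

def reference_interval(values, low=2.5, high=97.5):
    values = np.asarray(values, dtype=float)
    return np.percentile(values, [low, high])

# Example with simulated, PIVKA-II-like measurements (mAU/ml), not study data:
rng = np.random.default_rng(0)
simulated = rng.lognormal(mean=3.2, sigma=0.25, size=800)
print(reference_interval(simulated))
```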
Dalla Nora, Magali; Hörner, Rosmari; De Carli, Diego Michelon; Rocha, Marta Pires da; Araujo, Amanda Faria de; Fagundes, Renato Borges
2016-01-01
The diagnosis of H. pylori infection can be performed by non-invasive and invasive methods. Identification through a fecal antigen test is a non-invasive, simple, and relatively inexpensive method. To determine the diagnostic performance of the fecal antigen test in the identification of H. pylori infection. H. pylori antigens were identified in the stools of dyspeptic patients undergoing upper gastrointestinal endoscopy. For the identification of H. pylori antigen, we used ImmunoCard STAT! HpSA with the immunochromatography technique. Histopathology plus the urease test were the gold standard. We studied 163 patients, 51% male, mean age 56.7 ± 8.5 years. H. pylori infection was present in 49%. The fecal test presented: sensitivity 67.5% (CI95% 60.6-72.9); specificity 85.5% (CI95% 78.9-90.7); positive predictive value 81.8% (CI95% 73.4-88.4) and negative predictive value 73.2% (CI95% 67.5-77.6); the positive likelihood ratio was 4.7 (CI95% 2.9-7.9) and the negative likelihood ratio 0.4 (CI95% 0.3-0.5). The prevalence odds ratio for a positive test was 12.3 (CI95% 5.7-26.3). The kappa index between the FAT and the histology/urease test was 0.53 (CI95% 0.39-0.64). The immunochromatographic FAT is less expensive than the other methods and readily accepted by patients, but its diagnostic performance does not support its use for primary diagnosis when the patient may have an active infection.
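The diagnostic indices reported above all follow from a 2x2 table against the gold standard; the worked example below uses counts chosen to be roughly consistent with the reported 163 patients, 49% prevalence, sensitivity, and specificity, but they are a reconstruction for illustration, not the published table.

```python
# Worked example of the reported diagnostic indices from a 2x2 table (true/false
# positives and negatives); counts are an approximate reconstruction for illustration.
def diagnostic_performance(tp, fp, fn, tn):
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    ppv = tp / (tp + fp)
    npv = tn / (tn + fn)
    lr_pos = sens / (1 - spec)          # positive likelihood ratio
    lr_neg = (1 - sens) / spec          # negative likelihood ratio
    return dict(sensitivity=sens, specificity=spec, ppv=ppv, npv=npv,
                lr_positive=lr_pos, lr_negative=lr_neg)

# 80 infected (54 TP, 26 FN) and 83 uninfected (71 TN, 12 FP) out of 163 patients.
print(diagnostic_performance(tp=54, fp=12, fn=26, tn=71))
```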
B-2 Systems Engineering Case Study
2007-01-01
Excerpt: following the formal configuration freeze, an immediate refocus of the Task Teams was required; within several days, the air vehicle task teams were conducting ... (Section 3.3.3, Configuration Freeze). Program milestones cited include: PDR 1, Oct 1982; reconfiguration, Feb 1983-Aug 1983 (LP3, LP4); configuration freeze, July 1983; PDR 2, Mar-April 1984; CDR, Dec 1986.
Child Proportional Scaling: Is 1/3 = 2/6 = 3/9 = 4/12?
ERIC Educational Resources Information Center
Boyer, Ty W.; Levine, Susan C.
2012-01-01
The current experiments examined the role of scale factor in children's proportional reasoning. Experiment 1 used a choice task and Experiment 2 used a production task to examine the abilities of kindergartners through fourth-graders to match equivalent, visually depicted proportional relations. The findings of both experiments show that accuracy…
Renal replacement therapy in Europe: a summary of the 2011 ERA-EDTA Registry Annual Report.
Noordzij, Marlies; Kramer, Anneke; Abad Diez, José M; Alonso de la Torre, Ramón; Arcos Fuster, Emma; Bikbov, Boris T; Bonthuis, Marjolein; Bouzas Caamaño, Encarnación; Čala, Svetlana; Caskey, Fergus J; Castro de la Nuez, Pablo; Cernevskis, Harijs; Collart, Frederic; Díaz Tejeiro, Rafael; Djukanovic, Ljubica; Ferrer-Alamar, Manuel; Finne, Patrik; García Bazaga, María de Los Angelos; Garneata, Liliana; Golan, Eliezer; Gonzalez Fernández, Raquel; Heaf, James G; Hoitsma, Andries; Ioannidis, George A; Kolesnyk, Mykola; Kramar, Reinhard; Lasalle, Mathilde; Leivestad, Torbjørn; Lopot, Frantisek; van de Luijtgaarden, Moniek W M; Macário, Fernando; Magaz, Ángela; Martín Escobar, Eduardo; de Meester, Johan; Metcalfe, Wendy; Ots-Rosenberg, Mai; Palsson, Runolfur; Piñera, Celestino; Pippias, Maria; Prütz, Karl G; Ratkovic, Marina; Resić, Halima; Rodríguez Hernández, Aurelio; Rutkowski, Boleslaw; Spustová, Viera; Stel, Vianda S; Stojceva-Taneva, Olivera; Süleymanlar, Gültekin; Wanner, Christoph; Jager, Kitty J
2014-04-01
This article provides a summary of the 2011 ERA-EDTA Registry Annual Report (available at www.era-edta-reg.org). Data on renal replacement therapy (RRT) for end-stage renal disease (ESRD) from national and regional renal registries in 30 countries in Europe and bordering the Mediterranean Sea were used. From 27 registries, individual patient data were received, whereas 17 registries contributed data in aggregated form. We present the incidence and prevalence of RRT, and renal transplant rates in 2011. In addition, survival probabilities and expected remaining lifetimes were calculated for those registries providing individual patient data. The overall unadjusted incidence rate of RRT in 2011 among all registries reporting to the ERA-EDTA Registry was 117 per million population (pmp) (n = 71 631). Incidence rates varied from 24 pmp in Ukraine to 238 pmp in Turkey. The overall unadjusted prevalence of RRT for ESRD on 31 December 2011 was 692 pmp (n = 425 824). The highest prevalence was reported by Portugal (1662 pmp) and the lowest by Ukraine (131 pmp). Among all registries, a total of 22 814 renal transplantations were performed (37 pmp). The highest overall transplant rate was reported from Spain, Cantabria (81 pmp), whereas the highest rate of living donor transplants was reported from Turkey (39 pmp). For patients who started RRT between 2002 and 2006, the unadjusted 5-year patient survival on RRT was 46.8% [95% confidence interval (CI) 46.6-47.0], and on dialysis 39.3% (95% CI 39.2-39.4). The unadjusted 5-year patient survival after the first renal transplantation performed between 2002 and 2006 was 86.7% (95% CI 86.2-87.2) for kidneys from deceased donors and 94.3% (95% CI 93.6-95.0) for kidneys from living donors.
Lima, Isac da S F; Duarte, Elisabeth C
2017-08-21
To identify factors associated with timely treatment of malaria in the Brazilian Amazon. Malaria, despite being treatable, has proven difficult to control and continues to be an important public health problem globally. Brazil accounted for almost half of the 427 000 new malaria cases notified in the Americas in 2013. This was a cross-sectional study using secondary data on all notified malaria cases for the period from 2004 - 2013. Timely treatment was considered to be all treatment started within 24 hours of symptom onset. Multivariate logistic regression was used to identify independent factors associated with timely treatment. The proportion of cases starting treatment on a timely basis was 41.1%, tending to increase in more recent years (OR = 1.40; 95%CI: 1.37 - 1.42 in 2013). Furthermore, people starting treatment within 24 hours were more likely to: reside in the states of Rondônia (OR = 1.50; 95%CI: 1.49 - 1.51) or Acre (OR = 1.53; 95%CI: 1.55 - 1.57); be 0 - 5 years of age (OR = 1.39; 95%CI: 1.34 - 1.44) or 6 - 14 years of age (OR = 1.34; 95%CI: 1.32 - 1.36); be indigenous (OR = 1.41; 95%CI: 1.37 - 1.45); have a low level of schooling (OR = 1.20; 95%CI: 1.19 - 1.22); and be diagnosed by active detection (OR = 1.39; 95%CI: 1.38 - 1.39). In the Brazilian Amazon area, individuals were more likely to receive timely treatment of malaria if they were young, resided in Acre or Rondônia states, had little schooling, and were identified through active detection. Identifying groups vulnerable to late treatment is important for preventing severe cases and malaria deaths.
Role of RPL39 in Metaplastic Breast Cancer.
Dave, Bhuvanesh; Gonzalez, Daniel D; Liu, Zhi-Bin; Li, Xiaoxian; Wong, Helen; Granados, Sergio; Ezzedine, Nadeer E; Sieglaff, Douglas H; Ensor, Joe E; Miller, Kathy D; Radovich, Milan; KarinaEtrovic, Agda; Gross, Steven S; Elemento, Olivier; Mills, Gordon B; Gilcrease, Michael Z; Chang, Jenny C
2017-06-01
Metaplastic breast cancer is one of the most therapeutically challenging forms of breast cancer because of its highly heterogeneous and chemoresistant nature. We have previously demonstrated that ribosomal protein L39 (RPL39) and its gain-of-function mutation A14V have oncogenic activity in triple-negative breast cancer and that this activity may be mediated through inducible nitric oxide synthase (iNOS). The function of RPL39 and A14V in other breast cancer subtypes is currently unknown. The objective of this study was to determine the role and mechanism of action of RPL39 in metaplastic breast cancer. Both competitive allele-specific and droplet digital polymerase chain reaction were used to determine the RPL39 A14V mutation rate in metaplastic breast cancer patient samples. The impact of RPL39 and iNOS expression on patient overall survival was estimated using the Kaplan-Meier method. Co-immunoprecipitation and immunoblot analyses were used for mechanistic evaluation of RPL39. The RPL39 A14V mutation rate was 97.5% (39/40 tumor samples). High RPL39 (hazard ratio = 0.71, 95% confidence interval = 0.55 to 0.91, P = .006) and iNOS expression (P = .003) were associated with reduced patient overall survival. iNOS inhibition with the pan-NOS inhibitor N(G)-methyl-L-arginine acetate decreased in vitro proliferation and migration, in vivo tumor growth in both BCM-4664 and BCM-3807 patient-derived xenograft models (P = .04 and P = .02, respectively), and in vitro and in vivo chemoresistance. Mechanistically, RPL39 mediated its cancer-promoting actions through iNOS signaling, which was driven by the RNA editing enzyme adenosine deaminase acting on RNA 1. NOS inhibitors and RNA editing modulators may offer novel treatment options for metaplastic breast cancer. © The Author 2016. Published by Oxford University Press.
Cooling Effectiveness of a Modified Cold-Water Immersion Method After Exercise-Induced Hyperthermia.
Luhring, Katherine E; Butts, Cory L; Smith, Cody R; Bonacci, Jeffrey A; Ylanan, Ramon C; Ganio, Matthew S; McDermott, Brendon P
2016-11-01
Recommended treatment for exertional heat stroke includes whole-body cold-water immersion (CWI). However, remote locations or monetary or spatial restrictions can challenge the feasibility of CWI. Thus, the development of a modified, portable CWI method would allow for optimal treatment of exertional heat stroke in the presence of these challenges. To determine the cooling rate of modified CWI (tarp-assisted cooling with oscillation [TACO]) after exertional hyperthermia. Randomized, crossover controlled trial. Environmental chamber (temperature = 33.4°C ± 0.8°C, relative humidity = 55.7% ± 1.9%). Sixteen volunteers (9 men, 7 women; age = 26 ± 4.7 years, height = 1.76 ± 0.09 m, mass = 72.5 ± 9.0 kg, body fat = 20.7% ± 7.1%) with no history of compromised thermoregulation. Participants completed volitional exercise (cycling or treadmill) until they demonstrated a rectal temperature (Tre) ≥39.0°C. After exercise, participants transitioned to a semirecumbent position on a tarp until either Tre reached 38.1°C or 15 minutes had elapsed during the control (no immersion [CON]) or TACO (immersion in 151 L of 2.1°C ± 0.8°C water) treatment. The Tre, heart rate, and blood pressure (reported as mean arterial pressure) were assessed precooling and postcooling. Statistical analyses included repeated-measures analysis of variance with appropriate post hoc t tests and Bonferroni correction. Before cooling, the Tre was not different between conditions (CON: 39.27°C ± 0.26°C, TACO: 39.30°C ± 0.39°C; P = .62; effect size = -0.09; 95% confidence interval [CI] = -0.2, 0.1). At postcooling, the Tre was decreased in the TACO (38.10°C ± 0.16°C) compared with the CON condition (38.74°C ± 0.38°C; P < .001; effect size = 2.27; 95% CI = 0.4, 0.9). The rate of cooling was greater during the TACO (0.14 ± 0.06°C/min) than the CON treatment (0.04°C/min ± 0.02°C/min; t15 = -8.84; P < .001; effect size = 2.21; 95% CI = -0.13, -0.08). These differences occurred despite an insignificant increase in fluid consumption during exercise preceding CON (0.26 ± 0.29 L) versus TACO (0.19 ± 0.26 L; t12 = 1.73; P = .11; effect size = 0.48; 95% CI = -0.02, 0.14) treatment. Decreases in heart rate did not differ between the TACO and CON conditions (t15 = -1.81; P = .09; effect size = 0.45; 95% CI = -22, 2). Mean arterial pressure was greater at postcooling with TACO (84.2 ± 6.6 mm Hg) than with CON (67.0 ± 9.0 mm Hg; P < .001; effect size = 2.25; 95% CI = 13, 21). The TACO treatment provided faster cooling than did the CON treatment. When location, monetary, or spatial restrictions are present, TACO represents an effective alternative to traditional CWI in the emergency treatment of patients with exertional hyperthermia.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jabbour, Salma K., E-mail: jabbousk@cinj.rutgers.edu; Kim, Sinae; Department of Biostatistics, School of Public Health, Rutgers University, New Brunswick, New Jersey
2015-07-01
Purpose: We sought to evaluate whether tumor response using cone beam computed tomography (CBCT) performed as part of the routine care during chemoradiation therapy (CRT) could forecast the outcome of unresectable, locally advanced, non-small cell lung cancer (NSCLC). Methods and Materials: We manually delineated primary tumor volumes (TV) of patients with NSCLC who were treated with radical CRT on days 1, 8, 15, 22, 29, 36, and 43 on CBCTs obtained as part of the standard radiation treatment course. Percentage reductions in TV were calculated and then correlated to survival and pattern of recurrence using Cox proportional hazard models. Clinical information including histologic subtype was also considered in the study of such associations. Results: We evaluated 38 patients with a median follow-up time of 23.4 months. The median TV reduction was 39.3% (range, 7.3%-69.3%) from day 1 (D1) to day 43 (D43) CBCTs. Overall survival was associated with TV reduction from D1 to D43 (hazard ratio [HR] 0.557, 95% CI 0.39-0.79, P=.0009). For every 10% decrease in TV from D1 to D43, the risk of death decreased by 44.3%. For patients whose TV decreased ≥39.3% or <39.3%, the log-rank test demonstrated a separation in survival (P=.02), with median survivals of 31 months versus 10 months, respectively. Neither local recurrence (HR 0.791, 95% CI 0.51-1.23, P=.29) nor distant recurrence (HR 0.78, 95% CI 0.57-1.08, P=.137) correlated with TV decrease from D1 to D43. Histologic subtype showed no impact on our findings. Conclusions: TV reduction as determined by CBCT during CRT as part of routine care predicts post-CRT survival. Such knowledge may justify intensification of RT or application of additional therapies. Assessment of genomic characteristics of these tumors may permit a better understanding of behavior or prediction of therapeutic outcomes.
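For orientation, a Cox proportional-hazards analysis of the kind described above (overall survival versus percent tumor-volume reduction) could be sketched as follows; the lifelines library and the column names are assumptions chosen for illustration, not the authors' code.

```python
# Hedged sketch of a Cox proportional-hazards model relating percent tumor-volume
# reduction to overall survival; hypothetical column names, not the study dataset.
import pandas as pd
from lifelines import CoxPHFitter

def fit_cox(df: pd.DataFrame) -> CoxPHFitter:
    # Expected columns: survival_months, death (1 = event), tv_reduction_pct, histology.
    cph = CoxPHFitter()
    cph.fit(df, duration_col="survival_months", event_col="death")
    return cph

# cph.print_summary() would report hazard ratios (exp(coef)) with 95% CIs, the same kind
# of quantity quoted in the abstract above.
```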
Ar-40/Ar-39 ages and cosmic ray exposure ages of Apollo 14 samples.
NASA Technical Reports Server (NTRS)
Turner, G.; Huneke, J. C.; Podosek, F. A.; Wasserburg, G. J.
1971-01-01
We have used the Ar-40/Ar-39 dating technique on eight samples of Apollo 14 rocks (14053, 14310), breccia fragments (14321), and soil fragments (14001, 14167). The large basalt fragments give reasonable Ar-40/Ar-39 release patterns and yield well defined crystallization ages of 3.89-3.95 aeons. Correlation of the Ar-40/Ar-39 release patterns with Ar-39/Ar-37 patterns showed that the low temperature fractions with high radiogenic argon loss came from K-rich phases. A highly shocked sample and fragments included in the breccia yield complex release patterns with a low temperature peak. The total argon age of these fragments is 3.95 aeons. Cosmic ray exposure ages on these samples are obtained from the ratio of spallogenic Ar-38 to reactor induced Ar-37 and show a distinct grouping of low exposure ages of 26 m.y. correlated with Cone crater. Other samples have exposure ages of more than 260 m.y. and identify material with a more complex integrated cosmic age exposure history.
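For background, the crystallization ages quoted above rest on the standard Ar-40/Ar-39 age relation, reproduced here in its usual textbook form (not transcribed from the paper); λ is the total decay constant of ⁴⁰K, J is the irradiation parameter calibrated against a flux monitor of known age, and ⁴⁰Ar* is the radiogenic argon.

```latex
% Standard 40Ar/39Ar age equation (general background, not quoted from the paper).
\[
  t \;=\; \frac{1}{\lambda}\,
  \ln\!\left(1 + J\,\frac{^{40}\mathrm{Ar}^{*}}{^{39}\mathrm{Ar}_{K}}\right)
\]
```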
Man, road and vehicle: risk factors associated with the severity of traffic accidents.
Almeida, Rosa Lívia Freitas de; Bezerra Filho, José Gomes; Braga, José Ueleres; Magalhães, Francismeire Brasileiro; Macedo, Marinila Calderaro Munguba; Silva, Kellyanne Abreu
2013-08-01
To describe the main characteristics of victims, roads and vehicles involved in traffic accidents and the risk factors involved in accidents resulting in death. METHODS A non-concurrent cohort study of traffic accidents in Fortaleza, CE, Northeastern Brazil, in the period from January 2004 to December 2008. Data came from the Fortaleza Traffic Accidents Information System, the Mortality Information System, the Hospital Information System and the State Traffic Department Driving Licenses and Vehicle database. Deterministic and probabilistic linkage techniques were used to integrate the databases. First, descriptive analysis of data relating to people, roads, vehicles and weather was carried out. In the investigation of risk factors for death by traffic accident, generalized linear models were used. The fit of the model was verified by likelihood ratio and ROC analysis. RESULTS There were 118,830 accidents recorded in the period. The most common types of accidents were crashes/collisions (78.1%), running over pedestrians (11.9%), colliding with a fixed obstacle (3.9%), and accidents involving motorcycles (18.1%). Deaths occurred in 1.4% of accidents. The factors that were independently associated with death by traffic accident in the final model were bicycles (OR = 21.2, 95%CI 16.1;27.8), running over pedestrians (OR = 5.9, 95%CI 3.7;9.2), collision with a fixed obstacle (OR = 5.7, 95%CI 3.1;10.5) and accidents involving motorcyclists (OR = 3.5, 95%CI 2.6;4.6). The main contributing factors were a single person being involved (OR = 6.6, 95%CI 4.1;10.73), presence of unskilled drivers (OR = 4.1, 95%CI 2.9;5.5), a single vehicle (OR = 3.9, 95%CI 2.3;6.4), male sex (OR = 2.5, 95%CI 1.9;3.3), traffic on roads under federal jurisdiction (OR = 2.4, 95%CI 1.8;3.7), early morning hours (OR = 2.4, 95%CI 1.8;3.0), and Sundays (OR = 1.7, 95%CI 1.3;2.2), adjusted according to the log-binomial model. CONCLUSIONS Activities promoting the prevention of traffic accidents should primarily focus on accidents involving two-wheeled vehicles, which most often involve a single person who is unskilled and male, at nighttime, on weekends and on roads where vehicles travel at higher speeds.
Santos, Glenn-Milo; Coffin, Phillip O.; Das, Moupali; Matheson, Tim; DeMicco, Erin; Raiford, Jerris L.; Vittinghoff, Eric; Dilley, James W.; Colfax, Grant; Herbst, Jeffrey H.
2015-01-01
We evaluated the relationship between frequency and number of substances used and HIV risk [ie, serodiscordant unprotected anal intercourse (SDUAI)] among 3173 HIV-negative substance-using MSM. Compared with nonusers, the adjusted odds ratio (AOR) for SDUAI among episodic and at least weekly users, respectively, was 3.31 [95% confidence interval (CI), 2.55 to 4.28] and 5.46 (95% CI, 3.80 to 7.84) for methamphetamine, 1.86 (95% CI, 1.51 to 2.29) and 3.13 (95% CI, 2.12 to 4.63) for cocaine, and 2.08 (95% CI, 1.68 to 2.56) and 2.54 (95% CI, 1.85 to 3.48) for poppers. Heavy alcohol drinkers reported more SDUAI than moderate drinkers [AOR, 1.90 (95% CI, 1.43 to 2.51)]. Compared with nonusers, AORs for using 1, 2, and ≥3 substances were 16.81 (95% CI, 12.25 to 23.08), 27.31 (95% CI, 18.93 to 39.39), and 46.38 (95% CI, 30.65 to 70.19), respectively. High-risk sexual behaviors were strongly associated with frequency and number of substances used. PMID:23572012
Westbrook, Johanna I; Ampt, Amanda; Kearney, Leanne; Rob, Marilyn I
2008-05-05
To quantify time doctors in hospital wards spend on specific work tasks, and with health professionals and patients. Observational time and motion study. 400-bed teaching hospital in Sydney. 19 doctors (seven registrars, five residents, seven interns) in four wards were observed between 08:30 and 19:00 for a total of 151 hours between July and December 2006. Proportions of time in categories of work; proportions of tasks performed with health professionals and patients; proportions of tasks using specific information tools; rates of multitasking and interruptions. The greatest proportions of doctors' time were in professional communication (33%; 95% CI, 29%-38%); social activities, such as non-work communication and meal breaks (17%; 95% CI, 13%-21%), and indirect care, such as planning care (17%; 95% CI, 15%-19%). Multitasking involved 20% of time, and on average, doctors were interrupted every 21 minutes. Most tasks were completed with another doctor (56%; 95% CI, 55%-57%), while 24% (95% CI, 23%-25%) were undertaken alone and 15% (95% CI, 15%-16%) with a patient. Interns spent more time completing documentation and administrative tasks, and less time in direct care than residents and registrars. The time interns spent documenting (22%) was almost double the time they were engaged in direct patient care. Two-thirds of doctors' time was consumed by three work categories: professional communication, social activities and indirect care. Doctors on wards are interrupted at considerably lower rates than those in emergency and intensive care units. The results confirm interns' previously reported dissatisfaction with their level of administrative work and documentation.
Nagendran, Myura; Toon, Clare D; Davidson, Brian R; Gurusamy, Kurinchi Selvan
2014-01-17
Surgical training has traditionally been one of apprenticeship, where the surgical trainee learns to perform surgery under the supervision of a trained surgeon. This is time consuming, costly, and of variable effectiveness. Training using a box model physical simulator - either a video box or a mirrored box - is an option to supplement standard training. However, the impact of this modality on trainees with no prior laparoscopic experience is unknown. To compare the benefits and harms of box model training versus no training, another box model, animal model, or cadaveric model training for surgical trainees with no prior laparoscopic experience. We searched the Cochrane Central Register of Controlled Trials (CENTRAL), MEDLINE, EMBASE, and Science Citation Index Expanded to May 2013. We included all randomised clinical trials comparing box model trainers versus no training in surgical trainees with no prior laparoscopic experience. We also included trials comparing different methods of box model training. Two authors independently identified trials and collected data. We analysed the data with both the fixed-effect and the random-effects models using Review Manager for analysis. For each outcome, we calculated the standardised mean difference (SMD) with 95% confidence intervals (CI) based on intention-to-treat analysis whenever possible. Twenty-five trials contributed data to the quantitative synthesis in this review. All but one trial were at high risk of bias. Overall, 16 trials (464 participants) provided data for meta-analysis of box training (248 participants) versus no supplementary training (216 participants). All the 16 trials in this comparison used video trainers. Overall, 14 trials (382 participants) provided data for quantitative comparison of different methods of box training. There were no trials comparing box model training versus animal model or cadaveric model training. Box model training versus no training: The meta-analysis showed that the time taken for task completion was significantly shorter in the box trainer group than the control group (8 trials; 249 participants; SMD -0.48 seconds; 95% CI -0.74 to -0.22). Compared with the control group, the box trainer group also had lower error score (3 trials; 69 participants; SMD -0.69; 95% CI -1.21 to -0.17), better accuracy score (3 trials; 73 participants; SMD 0.67; 95% CI 0.18 to 1.17), and better composite performance scores (SMD 0.65; 95% CI 0.42 to 0.88). Three trials reported movement distance but could not be meta-analysed as they were not in a format for meta-analysis. There was significantly lower movement distance in the box model training compared with no training in one trial, and there were no significant differences in the movement distance between the two groups in the other two trials. None of the remaining secondary outcomes such as mortality and morbidity were reported in the trials when animal models were used for assessment of training, error in movements, and trainee satisfaction. Different methods of box training: One trial (36 participants) found significantly shorter time taken to complete the task when box training was performed using a simple cardboard box trainer compared with the standard pelvic trainer (SMD -3.79 seconds; 95% CI -4.92 to -2.65). 
There was no significant difference in the time taken to complete the task in the remaining three comparisons (reverse alignment versus forward alignment box training; box trainer suturing versus box trainer drills; and single incision versus multiport box model training). There were no significant differences in the error score between the two groups in any of the comparisons (box trainer suturing versus box trainer drills; single incision versus multiport box model training; Z-maze box training versus U-maze box training). The only trial that reported accuracy score found significantly higher accuracy score with Z-maze box training than U-maze box training (1 trial; 16 participants; SMD 1.55; 95% CI 0.39 to 2.71). One trial (36 participants) found significantly higher composite score with simple cardboard box trainer compared with conventional pelvic trainer (SMD 0.87; 95% CI 0.19 to 1.56). Another trial (22 participants) found significantly higher composite score with reverse alignment compared with forward alignment box training (SMD 1.82; 95% CI 0.79 to 2.84). There were no significant differences in the composite score between the intervention and control groups in any of the remaining comparisons. None of the secondary outcomes were adequately reported in the trials. The results of this review are threatened by both risks of systematic errors (bias) and risks of random errors (play of chance). Laparoscopic box model training appears to improve technical skills compared with no training in trainees with no previous laparoscopic experience. The impacts of this decreased time on patients and healthcare funders in terms of improved outcomes or decreased costs are unknown. There appears to be no significant differences in the improvement of technical skills between different methods of box model training. Further well-designed trials of low risk of bias and random errors are necessary. Such trials should assess the impacts of box model training on surgical skills in both the short and long term, as well as clinical outcomes when the trainee becomes competent to operate on patients.
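The effect measure pooled throughout this review is the standardised mean difference; as a reminder of how an SMD (Hedges' g) and its 95% CI are obtained from two groups' summary statistics, the hedged helper below can be used (any numbers passed to it would have to come from the individual trials, which are not reproduced here).

```python
# Illustrative computation of a standardised mean difference (Hedges' g) with a 95% CI
# from two groups' means, SDs, and sample sizes; inputs are placeholders, not trial data.
import math

def hedges_g(m1, sd1, n1, m2, sd2, n2):
    # Pooled standard deviation and Cohen's d
    sp = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    d = (m1 - m2) / sp
    # Small-sample correction (Hedges' g) and approximate standard error
    j = 1 - 3 / (4 * (n1 + n2) - 9)
    g = j * d
    se = math.sqrt((n1 + n2) / (n1 * n2) + g**2 / (2 * (n1 + n2)))
    return g, (g - 1.96 * se, g + 1.96 * se)
```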
Mustafa, Mudasir; Zakar, Rubeena; Zakar, Muhammad Zakria; Chaudhry, Ashraf; Nasrullah, Muazzam
2017-05-01
Objective To assess the combined effect of consanguineous and child marriages (CCM) on child health, which has not previously been explored, either globally or locally. Methods We analyzed secondary data from a series of cross-sectional, nationally representative Pakistan Demographic and Health Surveys 1990-91, 2006-07, and 2012-13. A total of 5,406 mothers with 10,164 children were included in the analysis. Child health was assessed by variables such as history of diarrhea, acute respiratory infection (ARI), ARI with fever, under-5 child mortality (U5CM) and small-size birth (SSB). Associations among variables were assessed by calculating unadjusted odds ratios (OR) and adjusted OR (AOR). Results A majority (n = 6,247, 61%) of the births were to mothers in CCM, compared with non-CCM (n = 3,917, 39%). There was a significant association between CCM and U5CM during 1990-91 (AOR 1.24, 95% CI 1.03-1.49) and 2006-07 (AOR 1.25, 95% CI 1.05-1.51), and infant mortality in 1990-91 (AOR 1.39, 95% CI 1.05-1.85) and 2006-07 (AOR 1.61, 95% CI 1.17-2.21). A significant association was also found between CCM and SSB infants in the periods 2006-07 (AOR 1.19, 95% CI 1.01-1.42) and 2012-13 (AOR 1.22, 95% CI 1.02-1.46). We noted no effect of CCM on diarrhea, ARI, or ARI with fever. Conclusion CCM increases the likelihood of U5CM, infant mortality and SSB infants. Further quantitative and qualitative research should be conducted to assess the effects of environmental, congenital and genetic factors on the health of children born to mothers in CCM.
Sinner, Moritz F.; Tucker, Nathan R.; Lunetta, Kathryn L.; Ozaki, Kouichi; Smith, J. Gustav; Trompet, Stella; Bis, Joshua C.; Lin, Honghuang; Chung, Mina K.; Nielsen, Jonas B.; Lubitz, Steven A.; Krijthe, Bouwe P.; Magnani, Jared W.; Ye, Jiangchuan; Gollob, Michael H.; Tsunoda, Tatsuhiko; Müller-Nurasyid, Martina; Lichtner, Peter; Peters, Annette; Dolmatova, Elena; Kubo, Michiaki; Smith, Jonathan D.; Psaty, Bruce M.; Smith, Nicholas L.; Jukema, J. Wouter; Chasman, Daniel I.; Albert, Christine M.; Ebana, Yusuke; Furukawa, Tetsushi; MacFarlane, Peter; Harris, Tamara B.; Darbar, Dawood; Dörr, Marcus; Holst, Anders G.; Svendsen, Jesper H.; Hofman, Albert; Uitterlinden, Andre G.; Gudnason, Vilmundur; Isobe, Mitsuaki; Malik, Rainer; Dichgans, Martin; Rosand, Jonathan; Van Wagoner, David R.; Benjamin, Emelia J.; Milan, David J.; Melander, Olle; Heckbert, Susan R.; Ford, Ian; Liu, Yongmei; Barnard, John; Olesen, Morten S.; Stricker, Bruno H.C.; Tanaka, Toshihiro; Kääb, Stefan; Ellinor, Patrick T.
2014-01-01
Background Atrial fibrillation (AF) affects over 30 million individuals worldwide and is associated with an increased risk of stroke, heart failure, and death. AF is highly heritable, yet the genetic basis for the arrhythmia remains incompletely understood. Methods & Results To identify new AF-related genes, we utilized a multifaceted approach, combining large-scale genotyping in two ethnically distinct populations, cis-eQTL mapping, and functional validation. Four novel loci were identified in individuals of European descent near the genes NEURL (rs12415501, RR=1.18, 95%CI 1.13 – 1.23, p=6.5×10−16), GJA1 (rs13216675, RR=1.10, 95%CI 1.06 – 1.14, p=2.2×10−8), TBX5 (rs10507248, RR=1.12, 95%CI 1.08 – 1.16, p=5.7×10−11), and CAND2 (rs4642101, RR=1.10, 95%CI 1.06 – 1.14, p=9.8×10−9). In Japanese, novel loci were identified near NEURL (rs6584555, RR=1.32, 95%CI 1.26–1.39, p=2.0×10−25) and CUX2 (rs6490029, RR=1.12, 95%CI 1.08–1.16, p=3.9×10−9). The top SNPs or their proxies were identified as cis-eQTLs for the genes CAND2 (p=2.6×10−19), GJA1 (p=2.66×10−6), and TBX5 (p=1.36×10−05). Knockdown of the zebrafish orthologs of NEURL and CAND2 resulted in prolongation of the atrial action potential duration (17% and 45%, respectively). Conclusions We have identified five novel loci for AF. Our results further expand the diversity of genetic pathways implicated in AF and provide novel molecular targets for future biological and pharmacological investigation. PMID:25124494
Fuchs, Zeynep; Scaal, Martin; Haverkamp, Heinz; Koerber, Friederike; Persigehl, Thorsten; Eifinger, Frank
2018-06-01
Intraosseous (IO)-access plays an alternative route during resuscitation. Our study was performed to investigate the successful rate of IO-access in preterm and term stillborns using different devices and techniques. The cadavers used were legal donations. 16 stillborns, median: 29.2 weeks (IQR 27.2-38.4) were investigated. Two different needles (a: Butterfly needle, 21G, Venofix ® Fa.Braun; b: Arrow ® EZ-IO ® 15G, Teleflex, Dublin, Ireland) were used. Needles were inserted i: manually, using a Butterfly needle; ii: manually, using EZ-IO ® needle or iii: using a battery-powered semi-automatic drill (Arrow ® EZ-IO ® ). Spectral-CT's were performed. The diameter of the corticalis was determined from the CT-images. Successful hit rates with 95% confidence intervals (CI) and odds ratios between the three methods were estimated using a generalised linear mixed model (GLMM). Estimated success rate was 61.1% (95%CI:39.7%-78.9%) for the Butterfly needle, 43.0% (95%CI:23.4%-65.0%) for hand-twisted EZ-IO ® screwing and 39.7% (95%CI:24.1-57.7%) for the semi-automatic drill (Arrow ® EZ-IO ® ), all referring to an average diameter of the corticalis of 1.2 mm. The odds of a correct position were 2.4 times higher (95%CI:0.8-7.6) when using the Butterfly needle than with the drill. In contrast, the odds of correct positioning when inserting the needle by hand were not significantly different from using the drill (odds ratio 1.1, 95%CI: 0.4-3.3). Neither of these effects nor the diameter of the corticalis with an odds ratio near one were significant in the model. Median diameter of the bone marrow cavity was 4.0 mm [IQR 3.3-4.7]. Intraosseous access for premature and neonatal infants could be best achieved by using a manually twisted Butterfly needle. Copyright © 2018 Elsevier B.V. All rights reserved.
Umeki, Yoko; Adachi, Hisashi; Enomoto, Mika; Fukami, Ako; Nakamura, Sachiko; Nohara, Yume; Nakao, Erika; Sakaue, Akiko; Tsuru, Tomoko; Morikawa, Nagisa; Fukumoto, Yoshihiro
Objective There is little long-term data on the association between the serum albumin levels and mortality in community-based populations. We aimed to determine whether the serum albumin level is an independent risk factor for all-cause and cause-specific death in a community-based cohort study in Japan. Methods In 1999, we performed a periodic epidemiological survey over a 15-year period in a population of 1,905 healthy subjects (783 males, 1,122 females) who were older than 40 years of age and who resided in Tanushimaru, a rural community, in Japan. Over the course of the study, we periodically examined the blood chemistry of the study subjects, including their serum albumin levels. Their baseline serum albumin levels were categorized into quartiles. Results The baseline albumin levels were significantly associated with age (inversely), body mass index (BMI), diastolic blood pressure, lipid profiles [high density lipoprotein-cholesterol (HDL-c), low-density lipoprotein-cholesterol (LDL-c) and triglycerides] and estimated glomerular filtration rate (eGFR). After adjusting for confounders, a Cox proportional hazards regression analysis demonstrated that a low serum albumin level was an independent predictor of all-cause death [hazard ratio (HR): 0.39, 95% confidence interval (CI): 0.24-0.65], cancer death (HR: 0.43, 95% CI: 0.18-0.99), death from infection (HR: 0.21, 95% CI: 0.06-0.73) and cerebro-cardiovascular death (HR: 0.19, 95% CI: 0.06-0.63). The HRs for all-cause and cerebro-cardiovascular death in the highest quartile vs. the lowest quartile of albumin after adjusting for confounders were 0.59 (95%CI:0.39-0.88) and 0.15 (95%CI: 0.03-0.66), respectively. Conclusion The serum albumin level was thus found to be a predictor of all-cause and cerebro-cardiovascular death in a general population.
Boyle, Gregory J; Neumann, David L; Furedy, John J; Westbury, H Rae
2010-04-01
This paper reports sex differences in cognitive task performance that emerged when 39 Australian university undergraduates (19 men, 20 women) were asked to solve verbal (lexical) and visual-spatial cognitive matching tasks which varied in difficulty and visual field of presentation. Sex significantly interacted with task type, task difficulty, laterality, and changes in performance across trials. The results revealed that the significant individual-differences variable of sex does not always emerge as a significant main effect, but instead appears in significant interactions with other variables manipulated experimentally. Our results show that sex differences must be taken into account when conducting experiments on human cognitive-task performance.
Illicit drug use among school-going adolescents in Malaysia.
Yusoff, Fadhli; Sahril, Norhafizah; Rasidi, Naim M; Zaki, Nor Azian M; Muhamad, Norazlina; Ahmad, NoorAni
2014-09-01
Illicit drug use among adolescents has become a public health issue in Malaysia. This study used data from the Global School-Based Student Health Survey (GSHS) and aimed to determine the prevalence of and factors associated with illicit drug use among school-going adolescents in Malaysia. A 2-stage stratified cluster sampling method was used and data were collected via a self-administered questionnaire. A total of 25 507 students participated in the study. The prevalence of adolescents who had ever used illicit drugs was 1.7%. Ever use of illicit drugs was associated with current smoking (adjusted odds ratio [aOR] = 6.99; 95% CI = 5.19, 9.40), current alcohol use (aOR = 4.63; 95% CI = 3.43, 6.26), ever having sex (aOR = 4.76; 95% CI = 3.54, 6.41), truancy (aOR = 1.43; 95% CI = 1.07, 1.90), lack of peer support (aOR = 1.47; 95% CI = 1.07, 2.03), and lack of parental monitoring (aOR = 1.71; 95% CI = 1.22, 2.39). Public health interventions should be implemented to prevent illicit drug use among adolescents. © 2014 APJPH.
Predictive Factors of Cytomegalovirus Seropositivity among Pregnant Women in Paris, France
N’Diaye, Dieynaba S.; Yazdanpanah, Yazdan; Krivine, Anne; Andrieu, Thibaut; Rozenberg, Flore; Picone, Olivier; Tsatsaris, Vassilis; Goffinet, François; Launay, Odile
2014-01-01
Background Cytomegalovirus (CMV) is the most frequent cause of congenital infection. The objective of this study was to evaluate predictive factors for CMV seronegativity in a cohort of pregnant women in Paris, France. Methods Pregnant women enrolled in a prospective cohort during the 2009 A/H1N1 pandemic were tested for CMV IgG antibodies. Variables collected were age, geographic origin, lifestyle, work characteristics, socioeconomic status, gravidity, parity and number of children at home. A multivariate logistic regression model was used to identify independent predictive factors for CMV seropositivity. Results Among the 826 women enrolled, 389 (47.1%) were primiparous, and 552 (67.1%) had Metropolitan France as a geographic origin. Out of these, 355 (i.e. 57.0%, 95% confidence interval (CI): [53.6%–60.4%]) were CMV seropositive: 43.7% (95% CI:[39.5%–47.9%]) in those whose geographic origin was Metropolitan France and 84.1% in those with other origins (95% CI:[79.2%–88.3%]). Determinants associated with CMV seropositivity in a multivariate logistic regression model were: (i) geographic origin (p<0.001; compared with Metropolitan France, geographic origins of Africa adjusted odds ratio (aOR) 21.2, 95% CI:[9.7–46.5], French overseas departments and territories and other origin, aOR 7.5, 95% CI:[3.9–14.6], and Europe or Asia, aOR 2.2, 95% CI: [1.3–3.7]); and (ii) gravidity (p = 0.019; compared with gravidity = 1, if gravidity≥3, aOR = 1.5, 95% CI: [1.1–2.2]; if gravidity = 2, aOR = 1.0, 95% CI: [0.7–1.4]). Work characteristics and socioeconomic status were not independently associated with CMV seropositivity. Conclusions In this cohort of pregnant women, a geographic origin of Metropolitan France and a low gravidity were predictive factors for low CMV seropositivity. Such women are therefore the likely target population for prevention of CMV infection during pregnancy in France. PMID:24587077
Kedir, Haji; Berhane, Yemane; Worku, Alemayehu
2013-01-01
Background Anemia affects a high proportion of pregnant women in developing countries. The factors associated with it vary by context. This study aimed to determine the prevalence and predictors of anemia among pregnant women in rural eastern Ethiopia. Methods A community-based cross-sectional study was done on 1678 pregnant women who were selected by a cluster random sampling technique. A pregnant woman was identified as anemic if her hemoglobin concentration was <11 g/dl. Data were collected in a community-based setting. Multilevel mixed effect logistic regression was used to determine the adjusted odds ratios (AOR) with 95% confidence intervals (CI) for the predictors of anemia. Results Anemia was observed among 737 (43.9%) of the 1678 pregnant women studied (95% CI 41.5%–46.3%). After controlling for confounders, the risk of anemia was 29% higher in the women who chewed khat daily than in those who sometimes or never did so (AOR, 1.29; 95% CI, 1.02–1.62). The study subjects with restrictive dietary behavior (reduced either meal size or frequency) had a 39% higher risk of anemia compared to those without restrictive dietary behavior (AOR, 1.39; 95% CI, 1.02–1.88). The risk of anemia was increased by 68% (AOR, 1.68; 95% CI, 1.15–2.47) and 60% (AOR, 1.60; 95% CI, 1.08–2.37) at parity levels of 2 births and 3 births, respectively. Compared to the first trimester, the risk of anemia was higher by two-fold (AOR, 2.09; 95% CI, 1.46–3.00) in the second trimester and by four-fold (AOR, 4.23; 95% CI, 2.97–6.02) in the third trimester. Conclusion In this study, two out of five women were anemic. Khat chewing and restrictive dietary habits, which were associated with anemia in this setting, should be addressed through public education programs. Interventions should also focus on women at higher parity levels and those in advanced stages of pregnancy. PMID:24223828
Qiu, Xinyun; Ma, Jingjing; Wang, Kai; Zhang, Hongjie
2017-01-01
Background and Aims The chemopreventive effect of 5-aminosalicylic acid (5-ASA) in patients with inflammatory bowel disease (IBD) has been widely studied; however, the results remain conflicting. The aim of this study was to systematically review the literature and update the evidence concerning the effects of 5-ASA on the risk of colorectal cancer (CRC) and dysplasia (Dys) in patients with ulcerative colitis (UC) or Crohn's disease (CD). Results 5-ASA showed a chemopreventive effect against CRC/Dys in IBD patients (OR = 0.58, 95% CI: 0.45−0.75). However, this effect was significant only in clinical-based studies (OR = 0.51; 95% CI: 0.39−0.65), not in population-based studies (OR = 0.71; 95% CI: 0.46−1.09). Moreover, the effect was noticeable in patients with UC (OR = 0.46, 95% CI: 0.34−0.61) but not in CD (OR = 0.66, 95% CI: 0.42−1.03), and for the outcome of CRC (OR = 0.54, 95% CI: 0.39−0.74) but not Dys (OR = 0.47; 95% CI: 0.20−1.10). In IBD patients, mesalazine dosages ≥ 1.2 g/day showed greater protective effects against CRC/Dys than dosages < 1.2 g/day. In contrast, sulphasalazine therapy did not show any noticeable protective effect regardless of the dosage administered. Materials and Methods We performed a systematic review with a meta-analysis of 26 observational studies involving 15,460 subjects to evaluate the risks of developing CRC and Dys in IBD patients receiving 5-ASA treatment. Pooled odds ratios (ORs) and 95% confidence intervals (CIs) were calculated for each evaluation index. Conclusions 5-ASA has a chemopreventive effect on CRC (but not Dys) in IBD patients. Moreover, UC patients benefit more from 5-ASA than CD patients. A mesalazine maintenance dosage ≥ 1.2 g/day is effective for reducing CRC risk in IBD patients. PMID:27906680
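Pooled odds ratios of this kind are typically obtained by inverse-variance weighting of study-level log odds ratios. As a rough illustration only (not the authors' code; the three study ORs below are hypothetical placeholders), a fixed-effect pool can be computed from each study's OR and 95% CI as follows:

```python
import math

# Hypothetical per-study odds ratios and 95% CIs (placeholders, not the meta-analysis data)
studies = [(0.55, 0.40, 0.76), (0.62, 0.41, 0.94), (0.70, 0.48, 1.02)]

log_ors, weights = [], []
for or_, lo, hi in studies:
    se = (math.log(hi) - math.log(lo)) / (2 * 1.96)   # SE of log OR recovered from the CI width
    log_ors.append(math.log(or_))
    weights.append(1 / se**2)                          # inverse-variance weight

pooled_log = sum(w * x for w, x in zip(weights, log_ors)) / sum(weights)
pooled_se = math.sqrt(1 / sum(weights))
print(f"Pooled OR = {math.exp(pooled_log):.2f} "
      f"(95% CI {math.exp(pooled_log - 1.96*pooled_se):.2f}-"
      f"{math.exp(pooled_log + 1.96*pooled_se):.2f})")
```

A random-effects pool would additionally add a between-study variance term to each weight; the abstract does not say which model produced each reported estimate, so the fixed-effect form is shown only as the simpler sketch.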
Deng, Xiaoming; Fan, Ting; Fu, Runqiao; Geng, Wanming; Guo, Ruihong; He, Nong; Li, Chenghui; Li, Lei; Li, Min; Li, Tianzuo; Tian, Ming; Wang, Geng; Wang, Lei; Wang, Tianlong; Wu, Anshi; Wu, Di; Xue, Xiaodong; Xu, Mingjun; Yang, Xiaoming; Yang, Zhanmin; Yuan, Jianhu; Zhao, Qiuhua; Zhou, Guoqing; Zuo, Mingzhang; Pan, Shuang; Zhan, Lujing; Yao, Min; Huang, Yuguang
2015-01-01
Background/Objective Inadvertent intraoperative hypothermia (core temperature <36.0 °C) is a recognized risk in surgery and has adverse consequences. However, no data about this complication in China are available. Our study aimed to determine the incidence of inadvertent intraoperative hypothermia and its associated risk factors in a sample of Chinese patients. Methods We conducted a regional cross-sectional survey in Beijing from August through December 2013. Eight hundred thirty patients who underwent various operations under general anesthesia were randomly selected from 24 hospitals through multistage probability sampling. Multivariate logistic regression analyses were applied to explore the risk factors for developing hypothermia. Results The overall incidence of intraoperative hypothermia was high, 39.9%. All patients were warmed passively with surgical sheets or cotton blankets, whereas only 10.7% of patients received active warming with space heaters or electric blankets. Pre-warmed intravenous fluids were administered to 16.9% of patients, and 34.6% of patients had irrigation of wounds with pre-warmed fluid. Active warming (OR = 0.46, 95% CI 0.26–0.81), overweight or obesity (OR = 0.39, 95% CI 0.28–0.56), high baseline core temperature before anesthesia (OR = 0.08, 95% CI 0.04–0.13), and high ambient temperature (OR = 0.89, 95% CI 0.79–0.98) were significant protective factors against hypothermia. In contrast, major-plus operations (OR = 2.00, 95% CI 1.32–3.04), duration of anesthesia of 1–2 h (OR = 3.23, 95% CI 2.19–4.78) or >2 h (OR = 3.44, 95% CI 1.90–6.22), and un-warmed intravenous fluids (OR = 2.45, 95% CI 1.45–4.12) significantly increased the risk of hypothermia. Conclusions The incidence of inadvertent intraoperative hypothermia in Beijing is high, and the rate of active warming of patients during operations is low. Concern for the development of intraoperative hypothermia should be especially high in patients undergoing major operations, requiring long periods of anesthesia, and receiving un-warmed intravenous fluids. PMID:26360773
Boko, Pelagie M.; Ibikounle, Moudachirou; Onzo-Aboki, Ablawa; Tougoue, Jean-Jacques; Sissinto, Yollande; Batcho, Wilfrid; Kinde-Gazard, Dorothe; Kabore, Achille
2016-01-01
In 2013, Benin developed strategies to control neglected tropical diseases, and one of the first steps was disease mapping of the entire country in order to identify districts endemic for schistosomiasis and soil transmitted helminths (STH). This study was carried out in 30 of the 77 districts of Benin. Of these 30 districts, 22 had previously been treated for Lymphatic Filariasis (LF) using the ivermectin and albendazole combination. In each district, five schools were selected and 50 children aged 8 to 14 years were sampled in each school, making a total of 250 children sampled per district. The schools were selected mainly according to their proximity to lakes or any bodies of water that were likely to have been used by the children. Samples of faeces and urine were collected from each pupil. Urinary schistosomiasis was identified using the urine filtration technique, while STH and intestinal schistosomiasis were identified through the Kato Katz method. Overall, a total of 7500 pupils were surveyed across 150 schools, with a gender ratio of 1:1. Hookworm was identified in all 30 districts with a prevalence ranging from 1.2% (95%CI: 0.0–2.5) to 60% (95%CI: 53.9–66.1). Ascaris lumbricoides was detected in 19 districts with a prevalence between 1% (95%CI: 0.0–2.2) and 39% (95%CI: 32.9–45.0). In addition to these common STH, Trichuris trichiura, Enterobius vermicularis and Strongyloides stercoralis were found at low prevalence. Only 16 districts were endemic for Schistosoma mansoni, while 29 districts were endemic for S. haematobium. The S. haematobium prevalence ranged from 0.8% (95% CI: 0.0–1.9) to 56% (95% CI: 50.2–62.5), while the prevalence of S. mansoni varied from 0.4% (95%CI: 0.0–1.2) to 46% (95% CI: 39.8–52.2). The 22 districts where LF was successfully eliminated still require mass drug administration (MDA) of albendazole, indicating that school-based MDA would be needed even after LF elimination in districts co-endemic for LF and STH in Benin. PMID:27643795
Zenebe, Chernet Baye; Adefris, Mulat; Yenit, Melaku Kindie; Gelaw, Yalemzewod Assefa
2017-09-06
Despite the fact that long acting family planning methods reduce population growth and improve maternal health, their utilization remains poor. Therefore, this study assessed the prevalence of long acting and permanent family planning method utilization and associated factors among women of reproductive age who have decided not to have more children in Gondar city, northwest Ethiopia. An institution-based cross-sectional study was conducted from August to October 2015. Three hundred seventeen women who had decided not to have more children were selected consecutively into the study. A structured and pretested questionnaire was used to collect data. Both bivariate and multi-variable logistic regression analyses were used to identify factors associated with utilization of long acting and permanent family planning methods. The adjusted odds ratio (AOR) with the corresponding 95% confidence interval (CI) was used to show the strength of associations, and variables with a P-value of <0.05 were considered statistically significant. In this study, the overall prevalence of long acting and permanent contraceptive method (LAPCM) utilization was 34.7% (95% CI: 29.5-39.9). In the multi-variable logistic regression analysis, utilization of long acting and permanent contraceptive methods was significantly associated with secondary school education (AOR: 2.279, 95% CI: 1.17, 4.44), college and above education (AOR: 2.91, 95% CI: 1.36, 6.24), history of previous utilization (AOR: 3.02, 95% CI: 1.69, 5.38), and information about LAPCM (AOR: 8.85, 95% CI: 2.04, 38.41). In this study the prevalence of long acting and permanent family planning method utilization among women who have decided not to have more children was high compared with previous studies conducted elsewhere. Advanced educational status, previous utilization of LAPCM, and information on LAPCM were significantly associated with the utilization of LAPCM. As a result, strengthening behavioral change communication channels to make information accessible is highly recommended.
Ramírez-Vélez, Robinson; Tordecilla-Sanders, Alejandra; Correa-Bautista, Jorge Enrique; González-Ruíz, Katherine; González-Jiménez, Emilio; Triana-Reina, Hector Reynaldo; García-Hermoso, Antonio; Schmidt-RioValle, Jacqueline
2018-01-01
To verify the validity of multi-frequency bioelectrical impedance analysis (mBCA) for predicting body fat percentage (BF%) in overweight/obese adults using dual-energy X-ray absorptiometry (DXA) as the reference method. Forty-eight adults participated (54% women, mean age = 41.0 ± 7.3 years). Pearson's correlation coefficient was used to evaluate the correlation between BIA and BF% assessed by DXA. The concordance between BF% measured by both methods was obtained with Lin's concordance correlation coefficient and Bland-Altman difference plots. Measures of BF% were estimated as 39.0 (SD = 6.1) and 38.3 (SD = 6.5) using DXA and mBCA, respectively. Pearson's correlation coefficient reflected a strong correlation (r =.921, P = .001). The paired t-test showed a significant mean difference between these methods in BF% for obese men of -0.6 [(SD 1.95; 95% CI = -4.0 to 3.0), P =.037]. Overall, the bias of the mBCA was -0.6 [(SD 2.2; 95% CI = -5.0 to 3.7), P =.041], which indicated that the mBCA method significantly underestimated BF% in comparison to the reference method. Finally, in both sexes, Lin's concordance correlation coefficient showed strong agreement. More specifically, the DXA value was ρc = 0.943 (95% CI = 0.775 to 0.950) and the mBCA value was ρc = 0.948 (95% CI = 0.778 to 0.978). Our analysis showed strong agreement between the two methods across the observed range of BF%. These results show that mBCA and DXA are comparable methods for measuring body composition at higher body fat percentages. However, due to the broad limits of agreement, we can only recommend mBCA for group-level assessment. © 2017 Wiley Periodicals, Inc.
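The reported bias and limits of agreement follow the standard Bland-Altman construction (mean of the paired differences ± 1.96 SD). A minimal sketch, assuming placeholder paired BF% measurements rather than the study data:

```python
import numpy as np

# Placeholder paired body-fat percentages (not the study data)
dxa  = np.array([35.2, 41.0, 38.5, 44.1, 39.7, 36.8])
mbca = np.array([34.8, 40.1, 38.9, 42.9, 39.0, 36.1])

diff = mbca - dxa
bias = diff.mean()                          # systematic over/underestimation by mBCA
sd = diff.std(ddof=1)                       # SD of the paired differences
loa = (bias - 1.96 * sd, bias + 1.96 * sd)  # 95% limits of agreement
print(f"bias = {bias:.2f} BF%, limits of agreement = {loa[0]:.2f} to {loa[1]:.2f} BF%")
```

Wide limits of agreement relative to clinically meaningful differences are what motivates the authors' caution about individual-level use.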
Task shifting for the delivery of pediatric antiretroviral treatment: a systematic review.
Penazzato, Martina; Davies, Mary-Ann; Apollo, Tsitsi; Negussie, Eyerusalem; Ford, Nathan
2014-04-01
Pediatric antiretroviral treatment coverage in resource-limited settings continues to lag behind that of adults. Task shifting is an effective approach broadly used for adults, which some countries have also adopted for children, but implementation is limited by lack of confidence and skills among nonspecialist staff. A systematic review was conducted by combining key terms for task shifting, antiretroviral therapy (ART), and children. Five databases and two conferences were searched from inception until August 1, 2013. Eight observational studies provided outcome data for 11,828 children who received ART from nonphysician providers across 10 countries in sub-Saharan Africa. The cumulative pooled proportion of deaths was 3.2% [95% confidence interval (CI): 2.0 to 4.5] at 6 months, 4.6% (95% CI: 2.1 to 7.1) at 12 months, 6.2% (95% CI: 3.7 to 8.8) at 24 months, and 5.9% (95% CI: 3.5 to 8.3) at 36 months. Mortality and loss to follow-up in task-shifting programs were comparable to those reported by programs providing doctor- or specialist-led care. Our review suggests that task shifting of ART care can result in outcomes comparable to routine physician care, and this approach should be considered as part of a strategy to scale up pediatric treatment. Specialist care will remain important for management of sick patients and complicated cases. Further qualitative research is needed to inform optimal implementation of task shifting for pediatric patients.
A High-Throughput Screening Method to Identify Potential Pesticides for Mosquito Control
2009-01-01
[Extraction residue from a table of insecticide mode-of-action classes and larval LC50 values. Recoverable fragments: spinosad and imidacloprid appear among nicotinic acetylcholine receptor agonists/antagonists and diazinon among acetylcholinesterase inhibitors (organophosphates); the pesticides pyridaben, hydramethylnon, imidacloprid, diazinon, and indoxacarb were moderately active against first-instar larvae, with LC50 values reported.]
Howell, David R; Osternig, Louis R; Chou, Li-Shan
2018-02-16
To examine the acute (within 72 hours of injury) and long-term (2 months postinjury) independent associations between objective dual-task gait balance and neurocognitive measurements among adolescents and young adults with a concussion and matched controls. Longitudinal case-control. Motion analysis laboratory. A total of 95 participants completed the study: 51 who sustained a concussion (mean age, 17.5±3.3y; 71% men) and 44 controls (mean age, 17.7±2.9y; 72% men). Participants who sustained a concussion underwent a dual-task gait analysis and computerized neurocognitive testing within 72 hours of injury and again 2 months later. Uninjured controls completed the same test protocol at similar time intervals. Not applicable. We compared dual-task gait balance control and computerized neurocognitive test performance between groups using independent samples t tests. Multivariable binary logistic regression models were then constructed for each testing time to determine the association between group membership (concussion vs control), dual-task gait balance control, and neurocognitive function. Medial-lateral center-of-mass displacement during dual-task gait was independently associated with group membership at the initial test (adjusted odds ratio [aOR], 2.432; 95% confidence interval [CI], 1.269-4.661) and the 2-month follow-up test (aOR, 1.817; 95% CI, 1.014-3.256). Visual memory composite scores were significantly associated with group membership at the initial postinjury time point (aOR, .953; 95% CI, .833-.998). However, the combination of computerized neurocognitive test variables did not predict dual-task gait balance control for participants with concussion, and no single neurocognitive variable was associated with dual-task gait balance control at either testing time. Dual-task assessments concurrently evaluating gait and cognitive performance may allow for the detection of persistent deficits beyond those detected by computerized neurocognitive testing alone. Copyright © 2018 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.
Jennings, Larissa; Yebadokpo, André Sourou; Affo, Jean; Agbogbe, Marthe; Tankoano, Aguima
2011-01-06
Shifting the role of counseling to less skilled workers may improve efficiency and coverage of health services, but evidence is needed on the impact of substitution on quality of care. This research explored the influence of delegating maternal and newborn counseling responsibilities to clinic-based lay nurse aides on the quality of counseling provided as part of a task shifting initiative to expand their role. Nurse-midwives and lay nurse aides in seven public maternities were trained to use job aids to improve counseling in maternal and newborn care. Quality of counseling and maternal knowledge were assessed using direct observation of antenatal consultations and patient exit interviews. Both provider types were interviewed to examine perceptions regarding the task shift. To compare provider performance levels, non-inferiority analyses were conducted where non-inferiority was demonstrated if the lower confidence limit of the performance difference did not exceed a margin of 10 percentage points. Mean percent of recommended messages provided by lay nurse aides was non-inferior to counseling by nurse-midwives in adjusted analyses for birth preparedness (β = -0.0, 95% CI: -9.0, 9.1), danger sign recognition (β = 4.7, 95% CI: -5.1, 14.6), and clean delivery (β = 1.4, 95% CI: -9.4, 12.3). Lay nurse aides demonstrated superior performance for communication on general prenatal care (β = 15.7, 95% CI: 7.0, 24.4), although non-inferiority was not achieved for newborn care counseling (β = -7.3, 95% CI: -23.1, 8.4). The proportion of women with correct knowledge was significantly higher among those counseled by lay nurse aides as compared to nurse-midwives in general prenatal care (β = 23.8, 95% CI: 15.7, 32.0), birth preparedness (β = 12.7, 95% CI: 5.2, 20.1), and danger sign recognition (β = 8.6, 95% CI: 3.3, 13.9). Both cadres had positive opinions regarding task shifting, although several preferred 'task sharing' over full delegation. Lay nurse aides can provide effective antenatal counseling in maternal and newborn care in facility-based settings, provided they receive adequate training and support. Efforts are needed to improve management of human resources to ensure that effective mechanisms for regulating and financing task shifting are sustained.
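The non-inferiority rule used above can be checked mechanically from the reported estimates: the lay-nurse-aide-minus-midwife difference is treated as non-inferior when its lower confidence limit stays above −10 percentage points (this reading of the margin's direction is inferred from the abstract, not quoted from the paper). A small sketch using the point estimates and CIs quoted above:

```python
MARGIN = -10.0  # non-inferiority margin, in percentage points (assumed direction)

# (outcome, beta, lower 95% CL, upper 95% CL) as reported in the abstract
results = [
    ("birth preparedness",       -0.0,  -9.0,  9.1),
    ("danger sign recognition",   4.7,  -5.1, 14.6),
    ("clean delivery",            1.4,  -9.4, 12.3),
    ("general prenatal care",    15.7,   7.0, 24.4),
    ("newborn care counseling",  -7.3, -23.1,  8.4),
]

for outcome, beta, lo, hi in results:
    verdict = "non-inferior" if lo > MARGIN else "non-inferiority not shown"
    print(f"{outcome}: beta={beta:+.1f} [{lo}, {hi}] -> {verdict}")
```

Run on these numbers, only newborn care counseling fails the margin, which matches the conclusion drawn in the abstract.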
Caesarean delivery before 39 weeks associated with selecting an auspicious time for birth in Taiwan.
Chu, Kuei-Hui; Lee, Yu-Hsiang; Tai, Chen-Jei; Lin, Yu-Hung; Huang, Chiu-Mieh; Chien, Li-Yin
2015-09-01
Caesarean delivery before 39 weeks of gestation increases the risk of morbidity among infants. Taiwan has one of the highest caesarean rates in the world, but little attention has been paid to this issue. This study aimed to describe the rate of caesarean delivery before 39 weeks gestation among women who did not have labour signs and had a non-emergency caesarean delivery in Taiwan and to examine whether the phenomenon was associated with the Chinese cultural practice of selecting an auspicious time for birth. We recruited women at 15-28 weeks of pregnancy at 5 hospitals in northern Taiwan and followed them at 4 or 5 weeks after delivery using structured questionnaires. This analysis included 150 primiparous mothers with a singleton pregnancy who had a non-emergency caesarean delivery without the presence of labour signs. Ninety-three of these women (62.0%) had caesarean deliveries before 39 weeks of gestation. Logistic regression analysis showed that women who had selected an auspicious time for delivery (OR=2.82, 95% CI: 1.15-6.95) and delivered in medical centres (OR=5.26, 95% CI: 2.25-12.26) were more likely to deliver before 39 weeks of gestation. Non-emergency caesarean delivery before 39 weeks of gestation was common among the study women, and was related to the Chinese cultural practice of selecting an auspicious time for birth. Further studies are needed to examine the risks and benefits associated with timing of caesarean delivery in Taiwan in order to generate a consensus among obstetricians and give pregnant women appropriate information. Copyright © 2015 Australian College of Midwives. Published by Elsevier Ltd. All rights reserved.
Perception of Physical Attractiveness When Consuming and Not Consuming Alcohol: A Meta-Analysis.
Bowdring, Molly A; Sayette, Michael A
2018-04-16
Elucidating why people drink and why drinking can lead to negative psychosocial consequences remains a crucial task for alcohol researchers. Because drinking typically occurs in social settings, broader investigation of the associations between alcohol and social experience is needed to advance understanding of both the rewarding and hazardous effects of alcohol use. This review aimed to (a) estimate alcohol's relation to the perception of others' physical attractiveness and (b) suggest theoretical and methodological considerations that may advance the study of this topic. Systematic review of Scopus and PsycInfo databases was conducted to identify experimental and quasi-experimental studies, with either between- or within-subjects designs, that assessed attractiveness ratings provided by individuals who had and had not consumed alcohol (k=16 studies, n=1,811). A meta-analysis was conducted to evaluate alcohol's aggregate association with physical attractiveness perceptions. Separate a priori secondary analyses examined alcohol's associations with perception of opposite-sex (k=12 studies) and same-sex (k=7 studies) attractiveness. The primary analysis indicated that alcohol was significantly related to enhanced attractiveness perceptions (d=0.19, 95% CI=0.05 to 0.32, p=.01; I²=5.28, 95% CI=0.00 to 39.32). Analysis of alcohol's association with perception of opposite-sex attractiveness similarly yielded a small, significant positive association (d=0.30, 95% CI=0.16 to 0.44, p<.01; I²=17.49, 95% CI=0.00 to 57.75). Alcohol's relation to perception of same-sex attractiveness was not significant (d=0.04, 95% CI=-0.18 to 0.26, p=.71; I²=54.08, 95% CI=0.00 to 81.66). Experimental and quasi-experimental studies suggest that consuming alcohol may have a small effect of increasing perceived attractiveness of people of the opposite sex. This article is protected by copyright. All rights reserved.
Muramatsu, Yukio; Yamamichi, Junta; Gomi, Shiho; Oubel, Estanislao; Moriyama, Noriyuki
2018-01-01
Background This study sought to evaluate the 95% limits of agreement of the volumes of 5-year clinically stable solid nodules for the development of a follow-up system for indeterminate solid nodules. Methods The volumes of 226 solid nodules that had been clinically stable for 5 years were measured in 186 patients (53 female never-smokers, 36 male never-smokers, 51 males with <30 pack-years, and 46 males with ≥30 pack-years) using a three-dimensional semiautomated method. Volume changes were evaluated using three methods: percent change, proportional change and growth rate. The 95% limits of agreement were evaluated using the Bland-Altman method. Results The 95% limits of agreement were as follows: range of percent change, from ±34.5% to ±37.8%; range of proportional change, from ±34.1% to ±36.8%; and range of growth rate, from ±39.2% to ±47.4%. Percent change-based, proportional change-based, and growth rate-based diagnoses of an increase or decrease in ten solid nodules were made at a mean of 302±402, 367±455, and 329±496 days, respectively, compared with a clinical diagnosis made at 809±616 days (P<0.05). Conclusions The 95% limits of agreement for volume change in 5-year stable solid nodules may enable the detection of an increase or decrease in the solid nodule at an earlier stage than that enabled by a clinical diagnosis, possibly contributing to the development of a follow-up system for reducing the number of additional Computed tomography (CT) scans performed during the follow-up period. PMID:29600047
Mission Specialist Pedro Duque smiles at camera while at Launch Pad 39B
NASA Technical Reports Server (NTRS)
1998-01-01
STS-95 Mission Specialist Pedro Duque of Spain, with the European Space Agency (ESA), smiles for the camera from Launch Pad 39B. The STS-95 crew were making final preparations for launch, targeted for liftoff at 2 p.m. on Oct. 29. Other crew members not shown are Mission Commander Curtis L. Brown Jr., Pilot Steven W. Lindsey, Mission Specialists Scott E. Parazynski and Stephen K. Robinson, and Payload Specialists John H. Glenn Jr., senator from Ohio, and Chiaki Mukai, with the National Space Development Agency of Japan (NASDA). The STS-95 mission is expected to last 8 days, 21 hours and 49 minutes, returning to KSC at 11:49 a.m. EST on Nov. 7.
WAVE3 is a Biomarker for Breast Cancer Progression and Metastasis
2012-04-01
[Extraction residue from a research progress report task list and reference text. Recoverable fragments: a task to repeat earlier steps for specimens with questionable results (completed); a breast cancer tissue microarray (TMA) task; and mention of miRs 570, 542, 103, 107 and 302, all reported as deregulated during cancer progression and metastasis [26,33,39–44].]
Gimbel, Sarah I.; Brewer, James B.
2014-01-01
Functional imaging studies of episodic memory retrieval consistently report task-evoked and memory-related activity in the medial temporal lobe, default network and parietal lobe subregions. Associated components of memory retrieval, such as attention-shifts, search, retrieval success, and post-retrieval processing also influence regional activity, but these influences remain ill-defined. To better understand how top-down control affects the neural bases of memory retrieval, we examined how regional activity responses were modulated by task goals during recall success or failure. Specifically, activity was examined during memory suppression, recall, and elaborative recall of paired-associates. Parietal lobe was subdivided into dorsal (BA 7), posterior ventral (BA 39), and anterior ventral (BA 40) regions, which were investigated separately to examine hypothesized distinctions in sub-regional functional responses related to differential attention-to-memory and memory strength. Top-down suppression of recall abolished memory strength effects in BA 39, which showed a task-negative response, and BA 40, which showed a task-positive response. The task-negative response in default network showed greater negatively-deflected signal for forgotten pairs when task goals required recall. Hippocampal activity was task-positive and was influenced by memory strength only when task goals required recall. As in previous studies, we show a memory strength effect in parietal lobe and hippocampus, but we show that this effect is top-down controlled and sensitive to whether the subject is trying to suppress or retrieve a memory. These regions are all implicated in memory recall, but their individual activity patterns show distinct memory-strength-related responses when task goals are varied. In parietal lobe, default network, and hippocampus, top-down control can override the commonly identified effects of memory strength. PMID:24586492
Long-term trends and survival analysis of esophageal and gastric cancer in Yangzhong, 1991-2013.
Hua, Zhaolai; Zheng, Xianzhi; Xue, Hengchuan; Wang, Jianming; Yao, Jun
2017-01-01
To describe the long-term trends in the incidence, mortality and survival of upper digestive tract cancers in a high-risk area of China. We extracted esophageal and gastric cancer cases diagnosed from 1991 to 2013 through the Yangzhong Cancer Registry and calculated the crude and age-standardized incidence and mortality rates. Cancer trends were calculated using the Joinpoint Regression Program and were reported as the annual percentage change (APC). Cancer-specific survival rates were evaluated and compared between groups using the Kaplan-Meier method and log-rank test. The age-standardized incidence rate of esophageal cancer declined from 107.06 per 100,000 person-years (male: 118.05 per 100,000 person-years; female: 97.42 per 100,000 person-years) in 1991 to 37.04 per 100,000 person-years (male: 46.43 per 100,000 person-years; female: 27.26 per 100,000 person-years) in 2013, with an APC of -2.5% (95% confidence interval (CI): -3.4%, -1.5%) for males and -4.9% (95% CI: -5.8%, -3.9%) for females. The age-standardized incidence rate of gastric cancer was 165.11 per 100,000 person-years (male: 225.39 per 100,000 person-years; female: 113.34 per 100,000 person-years) in 1991 and 53.46 per 100,000 person-years (male: 76.51 per 100,000 person-years; female: 32.43 per 100,000 person-years) in 2013, with an APC of -3.6% (95% CI: -4.5%, -2.7%) for males and -4.8% (95% CI: -5.7%, -3.9%) for females. The median survival time was 3.0 years for patients with esophageal or gastric cancer. Cancer cases detected after 2004 had a better prognosis. The age-standardized incidence rates of both esophageal and gastric cancer decreased continuously from 1991 through 2013, whereas the mortality rate remained stable before 2004 and declined significantly following the massive endoscopic screening program initiated in 2004. The survival probability of patients with esophageal and gastric cancer has improved markedly in recent decades.
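Annual percentage change of this kind is conventionally estimated by regressing log rates on calendar year and back-transforming the slope; Joinpoint additionally searches for change points, which the sketch below does not attempt. The rates here are synthetic placeholders, not the registry data:

```python
import numpy as np

# Synthetic age-standardized incidence rates per 100,000 person-years (placeholders)
years = np.arange(1991, 2014)
rates = 107.06 * (1 - 0.045) ** (years - 1991)   # invented ~-4.5%/year decline

# Log-linear trend: log(rate) = intercept + slope * year
slope, intercept = np.polyfit(years, np.log(rates), 1)
apc = (np.exp(slope) - 1) * 100                  # annual percentage change
print(f"APC = {apc:.1f}% per year")
```

With real registry data the fit would be applied separately within each Joinpoint-identified segment, which is how a trend can flip sign around a year such as 2004.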
Clinician search behaviors may be influenced by search engine design.
Lau, Annie Y S; Coiera, Enrico; Zrimec, Tatjana; Compton, Paul
2010-06-30
Searching the Web for documents using information retrieval systems plays an important part in clinicians' practice of evidence-based medicine. While much research focuses on the design of methods to retrieve documents, there has been little examination of the way different search engine capabilities influence clinician search behaviors. Previous studies have shown that use of task-based search engines allows for faster searches with no loss of decision accuracy compared with resource-based engines. We hypothesized that changes in search behaviors may explain these differences. In all, 75 clinicians (44 doctors and 31 clinical nurse consultants) were randomized to use either a resource-based or a task-based version of a clinical information retrieval system to answer questions about 8 clinical scenarios in a controlled setting in a university computer laboratory. Clinicians using the resource-based system could select 1 of 6 resources, such as PubMed; clinicians using the task-based system could select 1 of 6 clinical tasks, such as diagnosis. Clinicians in both systems could reformulate search queries. System logs unobtrusively capturing clinicians' interactions with the systems were coded and analyzed for clinicians' search actions and query reformulation strategies. The most frequent search action of clinicians using the resource-based system was to explore a new resource with the same query, that is, these clinicians exhibited a "breadth-first" search behaviour. Of 1398 search actions, clinicians using the resource-based system conducted 401 (28.7%, 95% confidence interval [CI] 26.37-31.11) in this way. In contrast, the majority of clinicians using the task-based system exhibited a "depth-first" search behavior in which they reformulated query keywords while keeping to the same task profiles. Of 585 search actions conducted by clinicians using the task-based system, 379 (64.8%, 95% CI 60.83-68.55) were conducted in this way. This study provides evidence that different search engine designs are associated with different user search behaviors.
Upadhyay, Ravi Prakash; Chowdhury, Ranadip; Mazumder, Sarmila; Taneja, Sunita; Sinha, Bireshwar; Martines, Jose; Bahl, Rajiv; Bhandari, Nita; Bhan, Maharaj Kishan
2017-01-01
Background Low birth weight (LBW) infants constitute a vulnerable subset of infants with impaired immunity in early life. In India, there is a scarcity of studies that focus on immunization practices in such infants. This analysis aimed to examine immunization practices in LBW infants with the intention of identifying areas requiring intervention. Methods Data on the immunization status of LBW infants enrolled in an individually randomized, double-masked, placebo-controlled trial of neonatal vitamin A supplementation were analysed. Study outcomes were full immunization by one year of age and delayed vaccination with DPT1 and DPT3. Multivariable logistic regression was performed to identify factors associated with the outcome(s). Findings Out of 10 644 LBW infants enrolled in the trial, immunization data were available for 10 517 (98.8%). Less than one-third (29.7%) were fully immunized by one year of age. Lowest wealth quintile (adjusted odds ratio (AOR) 0.39, 95% confidence interval (CI) 0.32–0.47), Muslim religion (AOR 0.41, 95% CI 0.35–0.48) and maternal age <20 years (AOR 0.62, 95% CI 0.52–0.73) were associated with decreased odds of full immunization. The proportions of infants with delayed vaccination for DPT1 and DPT3 were 52% and 81%, respectively. Lowest wealth quintile (AOR 1.51, 95% CI 1.25–1.82), Muslim religion (AOR 1.41, 95% CI 1.21–1.65), maternal age <20 years (AOR 1.31, 95% CI 1.11–1.53) and birth weight <2000 g (AOR 1.20, 95% CI 1.03–1.40) were associated with higher odds of delayed vaccination for DPT1. Maternal education (≥12 years of schooling) was associated with higher odds of full immunization (AOR 2.39, 95% CI 1.97–2.91) and lower odds of delayed vaccination for both DPT1 (AOR 0.59, 95% CI 0.49–0.73) and DPT3 (AOR 0.57, 95% CI 0.43–0.76). Conclusion In this population, LBW infants are at risk of delayed and incomplete immunization and therefore need attention. The risks are even higher in the identified subgroups, which should specifically be targeted. PMID:29423177
Dynamic Self-adaptive Remote Health Monitoring System for Diabetics
Suh, Myung-kyung; Moin, Tannaz; Woodbridge, Jonathan; Lan, Mars; Ghasemzadeh, Hassan; Bui, Alex; Ahmadi, Sheila; Sarrafzadeh, Majid
2016-01-01
Diabetes is the seventh leading cause of death in the United States. In 2010, about 1.9 million new cases of diabetes were diagnosed in people aged 20 years or older. Remote health monitoring systems can help diabetics and their healthcare professionals monitor health-related measurements by providing real-time feedback. However, data-driven methods to dynamically prioritize and generate tasks are not well investigated in remote health monitoring. This paper presents a task optimization technique used in WANDA (Weight and Activity with Blood Pressure and Other Vital Signs), a wireless health project that leverages sensor technology and wireless communication to monitor the health status of patients with diabetes. WANDA applies data analytics in real time to improve the quality of care. The developed algorithm minimizes the number of daily tasks required of diabetic patients using association rules that satisfy a minimum support threshold. Each of these tasks maximizes information gain, thereby improving the overall level of care. Experimental results show that the developed algorithm can reduce the number of tasks by up to 28.6% with minimum support 0.95, minimum confidence 0.97 and high efficiency. PMID:23366365
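The abstract describes pruning redundant daily tasks with association rules that meet minimum support and confidence thresholds. The sketch below is not the WANDA algorithm itself, only a generic illustration of support/confidence-based pruning with made-up task names, data and thresholds:

```python
from itertools import combinations

# Hypothetical daily "transactions": which monitoring tasks flagged something on each day
days = [
    {"weight", "blood_pressure"},
    {"weight", "blood_pressure", "glucose"},
    {"weight", "blood_pressure"},
    {"blood_pressure", "glucose"},
    {"weight", "blood_pressure"},
]

MIN_SUPPORT, MIN_CONFIDENCE = 0.6, 0.9   # illustrative thresholds, not WANDA's

def support(itemset):
    """Fraction of days on which every task in `itemset` appears."""
    return sum(itemset <= day for day in days) / len(days)

redundant = set()
tasks = set().union(*days)
for a, b in combinations(sorted(tasks), 2):
    for ante, cons in ((a, b), (b, a)):
        supp = support({ante, cons})
        conf = supp / support({ante}) if support({ante}) else 0.0
        # If `cons` is almost always implied by `ante`, prompting for it adds little information
        if supp >= MIN_SUPPORT and conf >= MIN_CONFIDENCE:
            redundant.add(cons)

print("tasks that could be skipped on a given day:", redundant)
```

How WANDA actually scores information gain per task is not spelled out in the abstract, so the pruning criterion above should be read only as an analogue of the general idea.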
2010-09-30
[Report-form extraction residue. Recoverable information: principal investigator contact details (Arianna Azzellino, Piazza Leonardo da Vinci 32, 20133 Milano, Italy; phone (+39) 02-239-964-31; fax (+39) 02-239-964-99) and a survey-effort table listing positive effort (km) and beaked whale sightings/clusters of acoustic detections for the vessels CRV Leonardo and RV Urania.]
Kremer, Ingrid E. H.; van der Weijden, Trudy; van de Kolk, Ilona
2016-01-01
Objectives Understanding the preferences of patients with multiple sclerosis (MS) for disease-modifying drugs and involving these patients in clinical decision making can improve the concordance between medical decisions and patient values and may, subsequently, improve adherence to disease-modifying drugs. This study aims first to identify which characteristics, or attributes, of disease-modifying drugs influence patients' decisions about these treatments and second to quantify the attributes' relative importance among patients. Methods First, three focus groups of relapsing-remitting MS patients were formed to compile a preliminary list of attributes using a nominal group technique. Based on this qualitative research, a survey with several choice tasks (best-worst scaling) was developed to prioritize attributes, asking a larger patient group to choose the most and least important attributes. The attributes' mean relative importance scores (RIS) were calculated. Results Nineteen patients reported 34 attributes during the focus groups and 185 patients evaluated the importance of the attributes in the survey. The effect on disease progression received the highest RIS (RIS = 9.64, 95% confidence interval: [9.48–9.81]), followed by quality of life (RIS = 9.21 [9.00–9.42]), relapse rate (RIS = 7.76 [7.39–8.13]), severity of side effects (RIS = 7.63 [7.33–7.94]) and relapse severity (RIS = 7.39 [7.06–7.73]). Subgroup analyses showed heterogeneity in patients' preferences. For example, side effect-related attributes were significantly more important to patients who had no experience of using disease-modifying drugs than to experienced patients (p < .001). Conclusions This study shows that, on average, patients valued effectiveness and unwanted effects as most important. Clinicians should be aware of the average preferences, but also that attributes of disease-modifying drugs are valued differently by different patients. Person-centred clinical decision making is therefore needed and requires eliciting individual preferences. PMID:27812117
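Best-worst scaling data are often summarized, at the simplest level, by best-minus-worst counts rescaled to a common range; the study's RIS may rest on a more elaborate choice model, so treat the following as an assumption-laden sketch with invented choice counts:

```python
from collections import Counter

# Hypothetical counts of how often each attribute was chosen best or worst (not the survey data)
best  = Counter({"disease progression": 120, "quality of life": 102, "relapse rate": 60})
worst = Counter({"disease progression": 5,   "quality of life": 12,  "relapse rate": 35})
n_appearances = 150   # assumed equal number of choice tasks showing each attribute

attributes = set(best) | set(worst)
bw = {a: (best[a] - worst[a]) / n_appearances for a in attributes}   # standardized score in [-1, 1]
ris = {a: round(5 * (score + 1), 2) for a, score in bw.items()}      # rescaled to a 0-10 range
for attribute, score in sorted(ris.items(), key=lambda kv: -kv[1]):
    print(attribute, score)
```

The rescaling to 0-10 is purely illustrative; the paper's RIS scale and its confidence intervals come from the authors' own estimation procedure.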
Gait initiation time is associated with the risk of multiple falls-A population-based study.
Callisaya, Michele L; Blizzard, Leigh; Martin, Kara; Srikanth, Velandai K
2016-09-01
In a population-based study of older people, we examined whether 1) overall gait initiation (GI) time or its components are associated with falls and 2) GI under dual-task conditions is a stronger predictor of falls risk than under single-task conditions. Participants aged 60-85 years were randomly selected from the electoral roll. GI was obtained with a force platform under both single- and dual-task conditions. Falls were ascertained prospectively over a 12-month period. Log multinomial regression was used to examine the association between GI time (total and its components) and risk of single and multiple falls. Age, sex and physiological and cognitive falls risk factors were considered as confounders. The mean age of the sample (n=124) was 71.0 (SD 6.8) years and 58.9% (n=73) were male. Over 12 months, 21.8% (n=27) of participants reported a single fall and 16.1% (n=20) reported multiple falls. Slower overall GI time under both single-task and dual-task conditions was associated with increased risk of multiple, but not single, falls (RRs per 100 ms: 1.28, 95% CI 1.03, 1.58 and 1.14, 95% CI 1.02, 1.27, respectively; p<0.05). Multiple falls were also associated with slower time to first lateral movement under the single-task condition (RR 1.90, 95% CI 0.59, 1.51) and swing time under the dual-task condition (RR 1.44, 95% CI 1.08, 1.94). Slower GI time is associated with the risk of multiple falls independent of other risk factors, suggesting it could be used as part of a comprehensive falls assessment. Time to first lateral movement under single-task conditions may be the best measure of this risk. Copyright © 2016 Elsevier B.V. All rights reserved.
Seroprevalence and molecular detection of Mycoplasma ovipneumoniae in goats in tropical China.
Rong, Guang; Zhao, Jun-Ming; Hou, Guan-Yu; Zhou, Han-Lin
2014-12-01
In the present study, the seroprevalence and genetic identification of Mycoplasma ovipneumoniae infection in goats were investigated in Hainan Province, tropical China, between October 2012 and October 2013. A total of 1,210 serum samples collected from 16 herds in various administrative regions in tropical China were evaluated using an indirect hemagglutination assay (IHA). Antibodies to M. ovipneumoniae were detected in 383 of the 1,210 serum samples (31.7%, 95% confidence interval (CI) 29-34.3; IHA titer ≥1:16). The M. ovipneumoniae seroprevalence ranged from 26.8% (95% CI 20.8-32.9) to 39% (95% CI 30.8-47.2) among different regions in tropical China, and the difference was statistically significant (P < 0.01). The seroprevalence of M. ovipneumoniae infection in goats was higher in winter (46.1%, 95% CI 39.6-52.5) and spring (33.8%, 95% CI 28.3-39.3) than in autumn (27.5%, 95% CI 22.6-32.3) and summer (24.7%, 95% CI 20.3-29.1), and the difference was statistically significant (P < 0.01). In addition, DNA was extracted from nasal swab and lung samples, and the 16S rRNA gene sequences were amplified by polymerase chain reaction (PCR) and then sequenced. Twenty-four of 329 (7.3%) nasal swab samples and 73 of 280 (26.1%) pneumonic lung tissues were found to contain M. ovipneumoniae. The results of the present survey indicate that M. ovipneumoniae infection is highly prevalent in goats in tropical China. This is the first report of a comprehensive survey of M. ovipneumoniae prevalence in goats in China.
Effect of hypoglycemic agents on survival outcomes of lung cancer patients with diabetes mellitus
Xin, Wen-Xiu; Fang, Luo; Fang, Qi-Lu; Zheng, Xiao-Wei; Ding, Hai-Ying; Huang, Ping
2018-01-01
Background: To assess the association between hypoglycemic agents and the prognosis of lung cancer patients with diabetes. Methods: A comprehensive literature search was performed in PubMed, Web of Science, Embase, and the Cochrane Library until May 2017. The search yielded 2593 unique citations, of which 18 articles met the inclusion criteria. The hazard ratios (HRs) and 95% confidence intervals (95% CIs) were calculated by a fixed-effects or random-effects model. Results: The pooled HRs favoring metformin users were 0.77 for overall survival (OS) (n = 15, 95% CI: 0.68–0.86) and 0.50 for disease-free survival (n = 5, 95% CI: 0.39–0.64). One study assessed the relationship between metformin and cancer-specific survival (CSS), reporting no significant results. No significant association between insulin and OS (n = 2, HR: 0.95, 95% CI: 0.79–1.13) or CSS (n = 2, HR: 1.03, 95% CI: 0.76–1.41) was noted. One study evaluated the association of sulfonylureas with lung cancer survival and reported no clinical benefit (HR: 1.10, 95% CI: 0.87–1.40). One study reported no association of thiazolidinediones with lung cancer survival (HR: 1.04, 95% CI: 0.65–1.66). Conclusions: This meta-analysis demonstrated that metformin exposure might improve survival outcomes in lung cancer patients with diabetes. PMID:29489653
Li, Zhanzhan; Zhou, Qin; Li, Yanyan; Yan, Shipeng; Fu, Jun; Huang, Xinqiong; Shen, Liangfang
2017-02-28
We conducted a meta-analysis to evaluate the diagnostic value of mean cerebral blood volume for differentiating recurrence from radiation injury in glioma patients. We performed systematic electronic searches for eligible studies up to August 8, 2016. Bivariate mixed effects models were used to estimate the combined sensitivity, specificity, positive likelihood ratio, negative likelihood ratio, diagnostic odds ratio and their 95% confidence intervals (CIs). Fifteen studies with a total of 576 participants were enrolled. The pooled sensitivity and specificity were 0.88 (95%CI: 0.82-0.92) and 0.85 (95%CI: 0.68-0.93), respectively. The pooled positive likelihood ratio was 5.73 (95%CI: 2.56-12.81), the negative likelihood ratio was 0.15 (95%CI: 0.10-0.22), and the diagnostic odds ratio was 39.34 (95%CI: 13.96-110.84). The area under the summary receiver operating characteristic curve was 0.91 (95%CI: 0.88-0.93). However, the Deeks' plot suggested that publication bias may exist (t=2.30, P=0.039). Mean cerebral blood volume measurement appears to be very sensitive and highly specific for differentiating recurrence from radiation injury in glioma patients. The results should be interpreted with caution because of the potential bias.
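The likelihood ratios and diagnostic odds ratio are simple functions of sensitivity and specificity, so the pooled point estimates can be sanity-checked by hand; small discrepancies with the reported pooled values are expected because the bivariate model estimates each summary jointly rather than by back-calculation:

```python
sens, spec = 0.88, 0.85          # pooled sensitivity and specificity from the abstract

lr_pos = sens / (1 - spec)       # positive likelihood ratio
lr_neg = (1 - sens) / spec       # negative likelihood ratio
dor = lr_pos / lr_neg            # diagnostic odds ratio

print(f"LR+ = {lr_pos:.2f}, LR- = {lr_neg:.2f}, DOR = {dor:.1f}")
# -> roughly LR+ 5.9, LR- 0.14, DOR 42, in the same ballpark as the reported 5.73, 0.15 and 39.34
```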
Recent diabetes-related mortality trends in Romania.
Ioacara, Sorin; Sava, Elisabeta; Georgescu, Olivia; Sirbu, Anca; Fica, Simona
2018-05-17
As there are no published articles on country-level diabetes-related mortality in Romania, we aimed to investigate this aspect for the 1998-2015 period. Anonymized demographic and diabetes-related mortality data (underlying or first secondary cause of death) were retrospectively obtained from the National Institute of Statistics/Eurostat microdata. Age-standardized mortality rates (ASMR) and their annual percentage change (APC) were analysed. During 1998-2015, 4,567,899 persons died in Romania, among whom diabetes was responsible for 168,854 cases. The ASMR for diabetes was 39.34 per 100,000 person-years (p-y) (95% CI 39.32-39.35). There was an increase in ASMR from 27.10 per 100,000 p-y (95% CI 27.01-27.19) in women and 30.88 per 100,000 p-y (95% CI 30.77-30.99) in men in 1998 to 35.42 per 100,000 p-y (95% CI 35.34-35.51) in women and 48.41 per 100,000 p-y (95% CI 48.29-48.52) in men in 2015. The mean APC in women was 3.8% per year (95% CI 3.5-4.0, p < 0.001) during 1998-2010 and -1.9% per year (95% CI -2.7 to -1.1, p < 0.001) during 2010-2015. The mean APC in men was 5.3% per year (95% CI 5.0-5.5, p < 0.001) during 1998-2010 and -1.5% per year (95% CI -2.2 to -0.8, p < 0.001) during 2010-2015. Diabetes-related mortality rates increased with age, with men experiencing higher mortality rates than women for most age groups and calendar years. Diabetes-related mortality rates increased significantly in Romania during 1998-2010, followed by a steady decline during 2010-2015.
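Age-standardized mortality rates of this kind are produced by weighting age-specific rates by a standard population (direct standardization). A minimal sketch with invented age bands, counts and weights, not the Romanian data:

```python
# (age band, deaths, person-years observed, standard-population weight) - placeholders
bands = [
    ("0-39",    50, 5_000_000, 0.50),
    ("40-64",  900, 3_000_000, 0.32),
    ("65+",   2500, 1_500_000, 0.18),
]

# Weighted sum of age-specific rates, expressed per 100,000 person-years
asmr = sum(weight * deaths / py for _, deaths, py, weight in bands) * 100_000
print(f"ASMR = {asmr:.2f} per 100,000 person-years")
```

The APC values quoted in the abstract would then come from a log-linear trend fitted to yearly ASMRs, estimated separately within each period (1998-2010 and 2010-2015).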
Risk factors for treatment default among adult tuberculosis patients in Indonesia.
Rutherford, M E; Hill, P C; Maharani, W; Sampurno, H; Ruslami, R
2013-10-01
Defaulting from anti-tuberculosis treatment hinders tuberculosis (TB) control. Our aim was to identify potential defaulters. We conducted a cohort study in newly diagnosed Indonesian TB patients. We administered a questionnaire, prospectively identified defaulters (treatment discontinued for ≥2 weeks) and assessed risk factors using Cox regression. Of 249 patients, 39 (16%) defaulted, 61% of them in the first 2 months. Default was associated with liver disease (HR 3.40, 95%CI 1.02-11.78), chest pain (HR 2.25, 95%CI 1.06-4.77), night sweats (HR 1.98, 95%CI 1.03-3.79), characteristics of the head of the household (self-employed, HR 2.47, 95%CI 1.15-5.34; patient's mother, HR 7.72, 95%CI 1.66-35.88), household wealth (HR 4.24, 95%CI 1.12-16.09), walking to the clinic (HR 4.53, 95%CI 1.39-14.71), being unaccompanied at diagnosis (HR 30.49, 95%CI 7.55-123.07) or when collecting medication (HR 3.34, 95%CI 1.24-8.98), and a low level of satisfaction with the clinic (HR 3.85, 95%CI 1.17-12.62) or doctors (HR 2.45, 95%CI 1.18-5.10). Health insurance (HR 0.24, 95%CI 0.07-0.74) and paying for diagnosis (HR 0.14, 95%CI 0.04-0.48) were protective. Defaulting is common and occurs early. Interventions that improve clinic services, strengthen patient support and increase insurance coverage may reduce default in Indonesia.
Laboratory-Induced Stress and Craving among Individuals with Prescription Opioid Dependence
Back, Sudie E.; Gros, Daniel F.; Price, Matthew; LaRowe, Steve; Flanagan, Julianne; Brady, Kathleen T.; Davis, Charles; Jaconis, Maryanne; McCauley, Jenna L.
2015-01-01
Background Stress and conditioned drug cues have been implicated in the initiation and maintenance of, and relapse to, substances of abuse. Although stress and drug cues are often encountered together, little research exists on whether stress potentiates the response to drug cues. Method Participants (N = 75) were 39 community-recruited individuals with current prescription opioid (PO) dependence and 36 healthy controls. Participants stayed in the hospital for one night and completed laboratory testing the following morning. During laboratory testing, participants were randomly assigned to a stress task (Trier Social Stress Task; TSST) or a no-stress condition. Following the stress manipulation, all participants completed a PO cue paradigm. Immediately before and after the stress and cue tasks, the following were assessed: subjective (stress, craving, anger, sadness, happiness), physiological (heart rate, blood pressure, galvanic skin response), and neuroendocrine responses (cortisol and dehydroepiandrosterone). Results Internal validity of the stress task was demonstrated, as evidenced by significantly higher subjective stress, as well as cortisol, heart rate and blood pressure, in the TSST group compared to the no-stress group. Individuals with PO dependence evidenced significantly greater reactivity to the stress task than controls. Craving increased significantly in response to the drug cue task among PO participants. No stress × cue interaction was observed. Conclusions In this study, heightened stress reactivity was observed among individuals with PO dependence. Exposure to acute stress, however, did not potentiate craving in response to conditioned drug cues. PMID:26342626
Relationships of physical job tasks and living conditions with occupational injuries in coal miners.
Bhattacherjee, Ashis; Bertrand, Jean-Pierre; Meyer, Jean-Pierre; Benamghar, Lahoucine; Otero Sierra, Carmen; Michaely, Jean-Pierre; Ghosh, Apurna Kumar; d'Houtaud, Alphonse; Mur, Jean-Marie; Chau, Nearkasen
2007-04-01
This study assessed the relationships of job tasks and living conditions with occupational injuries among coal miners. The sample included 516 randomly selected underground workers, who completed a standardized self-administered questionnaire. The data were analyzed using logistic regression. The rate of injuries in the past two years was 29.8%. The job tasks with significant crude relative risks were: power hammer use, vibrating hand tools, pneumatic tools, bent trunk, awkward work posture, heat, standing about and walking, job tasks involving the trunk and upper/lower limbs, pain caused by work, and muscular tiredness. The logistic model showed a strong relationship between the number of job tasks (JT) and injuries (adjusted ORs vs. JT 0-1: 2.21, 95%CI 1.27-3.86 for JT 2-6 and 3.82, 95%CI 2.14-6.82 for JT ≥7), and significant ORs ≥1.71 for face work, not-good health status, and psychotropic drug use. Musculoskeletal disorders and certain personality traits were also significant in univariate analysis. Job tasks and living conditions therefore strongly increase injury risk, and occupational physicians could help workers find remedial measures.
Chen, X; Zhao, Y; Xu, Y; Zhang, H W; Sun, S H; Gao, Z D; He, X X
2016-09-13
Objective: To investigate the prevalence of depression and anxiety and the related factors among newly registered tuberculosis (TB) outpatients. Methods: A questionnaire survey was conducted in 1 105 newly registered TB patients from sixteen districts of Beijing city from January to June 2015. A structured self-administered questionnaire covering gender, age, education, occupation, and history of smoking and drinking was designed by epidemiological and psychiatric experts from the Beijing Research Institute for Tuberculosis Control. Depression, anxiety and social support among the TB patients were assessed using the Self-rating Depression Scale (SDS), Self-Rating Anxiety Scale (SAS) and Social Support Rating Scale (SSRS). The survey data were then analyzed. A total of 1 132 questionnaires were issued and 1 119 were recovered, including 1 105 valid questionnaires; the effective rate was 98.7%. Results: There were 742 males and 363 females among the 1 105 newly registered TB patients. Age ranged from 16 to 65 years, with a mean age of (35.7±13.8) years. The total standard scores of SDS and SAS were (45.00±12.40) and (39.46±10.03) points, significantly higher than the national norms (all P<0.05). The detection rates of depression and anxiety were 29.8% (329/1 105) and 13.5% (149/1 105). Multivariate logistic regression analysis showed that TB patients who were female (OR=1.75, 95% CI: 1.32-2.30), over 35 years old (OR=1.82, 95% CI: 1.39-2.39), had an annual household income <50 000 ¥ (OR=1.57, 95% CI: 1.19-2.06), or rarely talked to someone about their worries (OR=1.41, 95% CI: 1.05-1.90) had a higher risk of depression (all P<0.05). Annual household income <50 000 ¥ (OR=1.69, 95% CI: 1.17-2.43) and rarely talking to someone about their worries (OR=1.80, 95% CI: 1.19-2.74) were also associated with a higher risk of anxiety (all P<0.05). The median scores for total social support, subjective support, objective support and support utilization were 38 (32, 43), 22 (18, 26), 8 (6, 10) and 7 (6, 9) points, respectively, and these scores were negatively related to depression and anxiety. Conclusion: The prevalence of depression and anxiety in TB patients is markedly higher than in the general population, and many factors can cause or contribute to depression and anxiety.
Quinot, C; Dumas, O; Henneberger, PK; Varraso, R; Wiley, AS; Speizer, FE; Goldberg, M; Zock, JP; Camargo, CA; Le Moual, N
2016-01-01
Objectives Occupational exposure to disinfectants is associated with work-related asthma, especially in healthcare workers. However, little is known about the specific products involved. To evaluate disinfectant exposures, we designed job-exposure (JEM) and job-task-exposure (JTEM) matrices, which are thought to be less prone to differential misclassification bias than self-reported exposure. We then compared the three assessment methods: self-reported exposure, JEM, and JTEM. Methods Disinfectant use was assessed by an occupational questionnaire in 9,073 U.S. female registered nurses without asthma, aged 49–68 years, drawn from the Nurses’ Health Study II. A JEM was created based on self-reported frequency of use (1–3, 4–7 days/week) of 7 disinfectants and sprays in 8 nursing jobs. We then created a JTEM combining jobs and disinfection tasks to further reduce misclassification. Exposure was evaluated in 3 classes (low, medium, high) using product-specific cut-offs (e.g., <30%, 30–49.9%, ≥50%, respectively, for alcohol); the cut-offs were defined from the distribution of self-reported exposure per job/task. Results The most frequently reported disinfectants were alcohol (weekly use: 39%), bleach (22%) and sprays (20%). More nurses were classified as highly exposed by JTEM (alcohol 41%, sprays 41%, bleach 34%) than by JEM (21%, 30%, 26%, respectively). Agreement between JEM and JTEM was fair-to-moderate (kappa: 0.3–0.5) for most disinfectants. JEM and JTEM exposure estimates were heterogeneous in most nursing jobs, except in emergency room and education/administration. Conclusion The JTEM may provide more accurate estimates than the JEM, especially for nursing jobs with heterogeneous tasks. Use of the JTEM is likely to reduce exposure misclassification. PMID:27566782
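The fair-to-moderate agreement between the JEM and JTEM assignments is quantified with Cohen's kappa; the sketch below shows the computation on an invented 3x3 cross-classification of low/medium/high exposure classes (the counts are hypothetical, not the study's).

import numpy as np

# Hypothetical confusion matrix: rows = JEM class, columns = JTEM class (low, medium, high).
table = np.array([[40, 10,  5],
                  [12, 30,  8],
                  [ 6, 14, 25]], dtype=float)

n = table.sum()
p_observed = np.trace(table) / n                                  # proportion of exact agreement
p_expected = (table.sum(axis=1) @ table.sum(axis=0)) / n ** 2     # agreement expected by chance
kappa = (p_observed - p_expected) / (1 - p_expected)
print(f"Cohen's kappa = {kappa:.2f}")   # values around 0.3-0.5 read as fair-to-moderate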
Ehrman, Jonathan K; Brawner, Clinton A; Al-Mallah, Mouaz H; Qureshi, Waqas T; Blaha, Michael J; Keteyian, Steven J
2017-10-01
Little is known about the relationship of change in cardiorespiratory fitness and mortality risk in Black patients. This study assessed change in cardiorespiratory fitness and its association with all-cause mortality risk in Black and White patients. This is a retrospective, longitudinal, observational cohort study of 13,345 patients (age = 55 ± 11 years; 39% women; 26% black) who completed 2 exercise tests, at least 12 months apart at Henry Ford Hospital, Detroit, Mich. All-cause mortality was identified through April 2013. Data were analyzed in 2015-2016 using Cox regression to calculate hazard ratios (HR) for risk of mortality associated with change in sex-specific cardiorespiratory fitness. Mean time between the tests was 3.4 years (interquartile range 1.9-5.6 years). During 9.1 years (interquartile range 6.3-11.6 years) of follow-up, there were 1931 (14%) deaths (16.5% black, 13.7% white). For both races, change in fitness from Low to the Intermediate/High category resulted in a significant reduction of death risk (HR 0.65 [95% confidence interval (CI), 0.49-0.87] for Black; HR 0.41 [95% CI, 0.34-0.51] for White). Each 1-metabolic-equivalent-of-task increase was associated with a reduced mortality risk in black (HR 0.84 [95% CI, 0.81-0.89]) and white (HR 0.87 [95% CI, 0.82-0.86]) patients. There was no interaction by race. Among black and white patients, change in cardiorespiratory fitness from Low to Intermediate/High fitness was associated with a 35% and 59% lower risk of all-cause mortality, respectively. Copyright © 2017 Elsevier Inc. All rights reserved.
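Hazard ratios per 1-MET change, like those above, come from a proportional hazards model in which exp(coefficient) is the HR. A hedged sketch, assuming the lifelines package is available and using simulated follow-up data with made-up column names (not the study's analysis), is given below.

import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 1000
df = pd.DataFrame({
    "followup_years": rng.exponential(9.0, n),     # time from second exercise test to death/censoring
    "died": rng.integers(0, 2, n),                 # 1 = all-cause death observed
    "delta_met": rng.normal(0.5, 1.5, n),          # change in METs between the two tests
    "age": rng.normal(55, 11, n),
})

cph = CoxPHFitter()
cph.fit(df, duration_col="followup_years", event_col="died")   # remaining columns are covariates
hr = np.exp(cph.params_["delta_met"])                          # hazard ratio per 1-MET increase
print(f"HR per 1-MET increase: {hr:.2f}")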
Individual and occupational factors related to fatal occupational injuries: a case-control study.
Villanueva, Vicent; Garcia, Ana M
2011-01-01
This study was designed to identify factors that increase the risk of a fatal outcome when occupational accidents occur, with the aim of providing further evidence for the design and implementation of preventive measures in occupational settings. The Spanish Ministry of Labour registry of occupational injuries causing absence from work includes information on individual and occupational characteristics of injured workers and events. Registered fatal occupational injuries in 2001 (n=539) were compared to a sample of non-fatal injuries in the same year (n=3493). Risks for a fatal outcome of occupational injuries, adjusted for significantly associated individual and occupational factors, were obtained through logistic regression models. Compared to non-fatal injuries, fatal occupational injuries were mostly produced by trapping or by natural causes, were mostly related to elevation and transport devices and power generators, and the parts of the body most frequently affected were the head, multiple parts or internal organs. Adjusted analyses showed an increased risk of fatality after an occupational injury for males (adjusted odds ratio aOR=10.92; 95%CI 4.80-24.84) and temporary workers (aOR=5.18; 95%CI 2.63-10.18), and the risk increased with age and with advancing hour of the work shift (p for trends <0.01). Injuries taking place outside the usual occupational setting (aOR=2.85, 95%CI 2.27-3.59) or while carrying out atypical tasks (aOR=2.08; 95%CI 1.27-3.39) also showed increased risks of a fatal outcome, as did occupational accidents in agricultural or construction companies. These data can help to select and define priorities for programmes aimed at preventing fatal consequences of occupational injuries. Copyright © 2010 Elsevier Ltd. All rights reserved.
Catsburg, Chelsea; Kirsh, Victoria A; Soskolne, Colin L; Kreiger, Nancy; Bruce, Erin; Ho, Thi; Leatherdale, Scott T; Rohan, Thomas E
2014-06-01
Obesity, physical inactivity, and sedentary behavior, concomitants of the modern environment, are potentially modifiable breast cancer risk factors. This study investigated the association of anthropometric measurements, physical activity and sedentary behavior, with the risk of incident, invasive breast cancer using a prospective cohort of women enrolled in the Canadian Study of Diet, Lifestyle and Health. Using a case-cohort design, an age-stratified subcohort of 3,320 women was created from 39,532 female participants who returned completed self-administered lifestyle and dietary questionnaires at baseline. A total of 1,097 incident breast cancer cases were identified from the entire cohort via linkage to the Canadian Cancer Registry. Cox regression models, modified to account for the case-cohort design, were used to estimate hazard ratios (HR) and 95 % confidence intervals (CI) for the association between anthropometric characteristics, physical activity, and the risk of breast cancer. Weight gain as an adult was positively associated with risk of post-menopausal breast cancer, with a 6 % increase in risk for every 5 kg gained since age 20 (HR 1.06; 95 % CI 1.01-1.11). Women who exercised more than 30.9 metabolic equivalent task (MET) hours per week had a 21 % decreased risk of breast cancer compared to women who exercised less than 3 MET hours per week (HR 0.79; 95 % CI 0.62-1.00), most evident in pre-menopausal women (HR 0.62; 95 % CI 0.43-0.90). As obesity reaches epidemic proportions and sedentary lifestyles have become more prevalent in modern populations, programs targeting adult weight gain and promoting physical activity may be beneficial with respect to reducing breast cancer morbidity.
Prado-Leon, Lilia R; Celis, Alfredo; Avila-Chaurand, Rosalio
2005-01-01
The objective of this study was to quantify and assess whether lifting tasks in the workplace are a risk factor in lumbar spondyloarthrosis etiology. A case-control study was performed with 231 workers, 18-55 years old, insured by the Mexican Social Security Institute (IMSS, according to its designation in Spanish). A multivariate analysis using conditional logistic regression showed that lifting tasks, combined with driving tasks, are associated with this illness (OR = 7.3; 95% CI 1.7-31.4). The daily frequency of lifting as it interacts with work as a driver resulted in a greater risk (OR = 10.4; 95% CI 2.0-52.5). The load weight, daily task-hours and cumulative time showed a dose-response relationship. The attributable risk for lifting tasks was 0.83, suggesting that 83% of lumbar spondyloarthrosis development could be prevented if risk factors were eliminated by ergonomic redesign of the task.
Guo, Jennifer N; Kim, Robert; Chen, Yu; Negishi, Michiro; Jhun, Stephen; Weiss, Sarah; Ryu, Jun Hwan; Bai, Xiaoxiao; Xiao, Wendy; Feeney, Erin; Rodriguez-Fernandez, Jorge; Mistry, Hetal; Crunelli, Vincenzo; Crowley, Michael J; Mayes, Linda C; Constable, R Todd; Blumenfeld, Hal
2016-12-01
The neural underpinnings of impaired consciousness and of the variable severity of behavioural deficits from one absence seizure to the next are not well understood. We aimed to measure functional MRI (fMRI) and electroencephalography (EEG) changes in absence seizures with impaired task performance compared with seizures in which performance was spared. In this cross-sectional study done at the Yale School of Medicine, CT, USA, we recruited patients from 59 paediatric neurology practices in the USA. We did simultaneous EEG, fMRI, and behavioural testing in patients aged 6-19 years with childhood or juvenile absence epilepsy, and with an EEG with typical 3-4 Hz bilateral spike-wave discharges and normal background. The main outcomes were fMRI and EEG amplitudes in seizures with impaired versus spared behavioural responses analysed by t test. We also examined the timing of fMRI and EEG changes in seizures with impaired behavioural responses compared with seizures with spared responses. 93 patients were enrolled between Jan 1, 2005, and Sept 1, 2013; we recorded 1032 seizures in 39 patients. fMRI changes during seizures occurred sequentially in three functional brain networks. In the default mode network, fMRI amplitude was 0·57% (SD 0·26) for seizures with impaired and 0·40% (0·16) for seizures with spared behavioural responses (mean difference 0·17%, 95% CI 0·11-0·23; p<0·0001). In the task-positive network, fMRI amplitude was 0·53% (SD 0·29) for seizures with impaired and 0·39% (0·15) for seizures with spared behavioural responses (mean difference 0·14%, 95% CI 0·08-0·21; p<0·0001). In the sensorimotor-thalamic network, fMRI amplitude was 0·41% (0·25) for seizures with impaired and 0·34% (0·14) for seizures with spared behavioural responses (mean difference 0·07%, 95% CI 0·01-0·13; p=0·02). Mean fractional EEG power in the frontal leads was 50·4 (SD 15·2) for seizures with impaired and 24·8 (6·5) for seizures with spared behavioural responses (mean difference 25·6, 95% CI 21·0-30·3); middle leads 35·4 (6·5) for seizures with impaired, 13·3 (3·4) for seizures with spared behavioural responses (mean difference 22·1, 95% CI 20·0-24·1); posterior leads 41·6 (5·3) for seizures with impaired, 24·6 (8·6) for seizures with spared behavioural responses (mean difference 17·0, 95% CI 14·4-19·7); p<0·0001 for all comparisons. Mean seizure duration was longer for seizures with impaired behaviour at 7·9 s (SD 6·6), compared with 3·8 s (3·0) for seizures with spared behaviour (mean difference 4·1 s, 95% CI 3·0-5·3; p<0·0001). However, larger amplitude fMRI and EEG signals occurred at the outset of or even preceding seizures with behavioural impairment. Impaired consciousness in absence seizures is related to the intensity of physiological changes in established networks affecting widespread regions of the brain. Increased EEG and fMRI amplitude occurs at the onset of seizures associated with behavioural impairment. These findings suggest that a vulnerable state might exist at the initiation of some absence seizures leading them to have more severe physiological changes and altered consciousness than other absence seizures. National Institutes of Health, National Institute of Neurological Disorders and Stroke, National Center for Advancing Translational Science, the Loughridge Williams Foundation, and the Betsy and Jonathan Blattmachr Family. Copyright © 2016 Elsevier Ltd. All rights reserved.
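The group contrasts above are mean differences with 95% CIs from t tests; the short sketch below reproduces that arithmetic on synthetic fMRI-amplitude values (the numbers are simulated around the reported means and SDs, purely for illustration).

import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
impaired = rng.normal(0.57, 0.26, 300)     # % signal change, seizures with impaired responses
spared = rng.normal(0.40, 0.16, 300)       # % signal change, seizures with spared responses

diff = impaired.mean() - spared.mean()
se = np.sqrt(impaired.var(ddof=1) / impaired.size + spared.var(ddof=1) / spared.size)
dof = impaired.size + spared.size - 2      # simple df choice, used only for this illustrative CI
tcrit = stats.t.ppf(0.975, dof)
print(f"mean difference {diff:.2f}, 95% CI {diff - tcrit * se:.2f} to {diff + tcrit * se:.2f}")
print(stats.ttest_ind(impaired, spared, equal_var=False))   # Welch's t test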
Mitchell, Rebecca J; Seah, Rebecca; Ting, Hsuen P; Curtis, Kate; Foster, Kim
2018-06-01
To examine the magnitude, 10-year temporal trends and treatment cost of intentional injury hospitalisations of children aged ≤16 years in Australia. A retrospective examination of linked hospitalisation and mortality data for children aged ≤16 years during 1 July 2001 to 30 June 2012 with self-harm or assault injuries. Negative binomial regression examined temporal trends. There were 18,223 self-harm and 13,877 assault hospitalisations, with treatment costs of $64 million and $60.6 million, respectively. The self-harm hospitalisation rate was 59.8 per 100,000 population (95%CI 58.96-60.71) with no annual decrease. The assault hospitalisation rate was 29.9 per 100,000 population (95%CI 29.39-30.39) with a 4.2% annual decrease (95%CI -6.14 to -2.31, p<0.0001). Poisoning was the most common method of self-harm. Other maltreatment syndromes were common for children ≤5 years of age. Assault by bodily force was common for children aged 6-16 years. Health professionals can play a key role in identifying and preventing the recurrence of intentional injury. Psychosocial care and access to support services are essential for children who self-harm. Parental education interventions to reduce assaults of children and training in conflict de-escalation to reduce child peer-assaults are recommended. Implications for public health: Australia needs a whole-of-government and community approach to prevent intentional injury. © 2018 The Authors.
A comparison of second and third generations combined oral contraceptive pills' effect on mood.
Shahnazi, Mahnaz; Farshbaf Khalili, Azizeh; Ranjbar Kochaksaraei, Fatemeh; Asghari Jafarabadi, Mohammad; Gaza Banoi, Kamal; Nahaee, Jila; Bayati Payan, Somayeh
2014-08-01
Most women taking combined oral contraceptives (COCs) are satisfied with their contraceptive method. However, one of the most common reasons reported for discontinuation of COCs is mood deterioration. This study aimed to compare the effects of second- and third-generation oral contraceptive pills on the mood of women of reproductive age. This randomized, double-blind, controlled clinical trial was conducted among women of reproductive age at health centers in Tehran, Iran. Participants were randomized into second-generation and third-generation oral contraceptive groups. Positive and negative moods were recorded using the Positive Affect, Negative Affect Scale (PANAS) at the end of the second and fourth months of the study. Data analysis was carried out using ANOVA, and P values < 0.05 were considered significant. Statistically significant differences were seen in positive and negative mood changes in women receiving contraceptive pills. The second-generation oral contraceptive pills resulted in a decrease in positive mood (95% CI: 43.39 to 38.32 in the second month and 43.39 to 26.05 in the fourth month) and an increase in negative mood (95% CI: 14.23 to 22.04 in the second month and 14.23 to 32.26 in the fourth month; P < 0.001), whereas the third generation led to an increase in positive mood (95% CI: 22.42 to 25.60 in the second month and 22.42 to 33.87 in the fourth month) and a decrease in negative mood (95% CI: 36.78 to 31.97 in the second month and 36.78 to 22.65 in the fourth month; P < 0.001). Third-generation combined oral contraceptive pills have a better effect on mood in women of reproductive age than second-generation pills, and can be recommended as an appropriate combined oral contraceptive in Iran.
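The between-group comparison of mood change described above is an ANOVA; with two groups this reduces to a t test, as in the toy example below (synthetic PANAS-like change scores centred loosely on the reported four-month changes, with invented group sizes).

import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(5)
second_gen_change = rng.normal(-17.3, 8.0, 60)   # change in positive affect, second-generation group
third_gen_change = rng.normal(11.5, 8.0, 60)     # change in positive affect, third-generation group

stat, p = f_oneway(second_gen_change, third_gen_change)
print(f"F = {stat:.2f}, p = {p:.3g}")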
Mulhall, B P; Wright, S T; De La Mata, N; Allen, D; Brown, K; Dickson, B; Grotowski, M; Jackson, E; Petoumenos, K; Foster, R; Read, T; Russell, D; Smith, D J; Templeton, D J; Fairley, C K; Law, M G
2016-09-01
We established a subcohort of HIV-positive individuals from 10 sexual health clinics within the Australian HIV Observational Database (AHOD). The aim of this study was to assess demographic and other factors that might be associated with an incident sexually transmitted infection (STI). The cohort follow-up was from March 2010 to March 2013, and included patients screened at least once for an STI. We used survival methods to determine time to first new and confirmed incident STI infection (chlamydia, gonorrhoea, syphilis or genital warts). Factors evaluated included sex, age, mode of HIV exposure, year of AHOD enrolment, hepatitis B or C coinfection, time-updated CD4 cell count, time-updated HIV RNA viral load, and prior STI diagnosis. There were 110 first incident STI diagnoses observed over 1015 person-years of follow-up, a crude rate of 10.8 [95% confidence interval (CI) 9.0-13.0] per 100 person-years. Factors independently associated with increased risk of incident STI included younger age [≥ 50 vs. 30-39 years old, adjusted hazard ratio (aHR) 0.4; 95% CI 0.2-0.8; P < 0.0001]; prior STI infection (aHR 2.5; 95% CI 1.6-3.8; P < 0.001), and heterosexual vs. men who have sex with men (MSM) as the likely route of exposure (aHR 0.2; 95% CI 0.1-0.6; P < 0.001). In this cohort of individuals being treated with antiretroviral drugs, those who were MSM, who were 30-39 years old, and who had a prior history of STI were at highest risk of a further STI diagnosis. © 2016 British HIV Association.
When to ask male adolescents to provide semen sample for fertility preservation?
Dabaja, Ali A.; Wosnitzer, Matthew S.; Bolyakov, Alexander; Schlegel, Peter N.
2014-01-01
Background Fertility preservation in adolescents undergoing sterilizing radiation and/or chemotherapy is the standard of care in oncology. The opportunity for patients to provide a semen sample by ejaculation is a critical issue in adolescent fertility preservation. Methods Fifty males with no medical or sexual developmental abnormalities were evaluated. The subjects were screened for evidence of orgasmic, erectile, and ejaculatory dysfunction. A detailed sexual development history was obtained under an Institutional Review Board (IRB)-approved protocol. Results Fifty males, aged 18-65 years (mean 39±16.03 years), volunteered to be part of this study. The mean reported age was 12.39 years (95% CI, 11.99-12.80 years) for the onset of puberty, 13.59 years (95% CI, 13.05-14.12 years) for the first ejaculation, 12.56 years (95% CI, 11.80-13.32 years) for the start of masturbation, and 17.26 years (95% CI, 16.18-18.33 years) for first intercourse. Seventy-five percent of the cohort had reached puberty by the age of 13.33 years, experienced masturbation by 14.5 years, first ejaculated by 14.83 years, and had intercourse by 19.15 years. The first ejaculation occurred within 1.5 years after the onset of puberty in 80% of the cohort, and 84% started masturbation within 1.5 years after the onset of puberty. The difference in mean responses between younger and older subjects was not statistically significant. Conclusions It is appropriate to consider a request for semen specimens by masturbation from teenagers at one year and six months after the onset of puberty; the age at onset of puberty plus 1.5 years is an important predictor of ejaculation and sample collection for cryopreservation. PMID:26813354
The predictive value of aptitude assessment in laparoscopic surgery: a meta-analysis.
Kramp, Kelvin H; van Det, Marc J; Hoff, Christiaan; Veeger, Nic J G M; ten Cate Hoedemaker, Henk O; Pierie, Jean-Pierre E N
2016-04-01
Current methods of assessing candidates for medical specialties that involve laparoscopic skills suffer from a lack of instruments to assess the ability to work in a minimally invasive surgery environment. A meta-analysis was conducted to investigate whether aptitude assessment can be used to predict variability in the acquisition and performance of laparoscopic skills. PubMed, PsycINFO and Google Scholar were searched to November 2014 for published and unpublished studies reporting the measurement of a form of aptitude for laparoscopic skills. The quality of studies was assessed with QUADAS-2. Summary correlations were calculated using a random-effects model. Thirty-four studies were found to be eligible for inclusion; six of these studies used an operating room performance measurement. Laparoscopic skills correlated significantly with visual-spatial ability (r = 0.32, 95% confidence interval [CI] 0.25-0.39; p < 0.001), perceptual ability (r = 0.31, 95% CI 0.22-0.39; p < 0.001), psychomotor ability (r = 0.26, 95% CI 0.10-0.40; p = 0.003) and simulator-based assessment of aptitude (r = 0.64, 95% CI 0.52-0.73; p < 0.001). Three-dimensional dynamic visual-spatial ability showed a significantly higher correlation than intrinsic static visual-spatial ability (p = 0.024). In general, aptitude assessments are associated with laparoscopic skill level. Simulator-based assessment of aptitude appears to have the potential to represent a job sample and to enable the assessment of all forms of aptitude for laparoscopic surgery at once. A laparoscopy aptitude test can be a valuable additional tool in the assessment of candidates for medical specialties that require laparoscopic skills. © 2016 John Wiley & Sons Ltd.
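The summary correlations above are random-effects pooled estimates; a minimal sketch of the usual approach (Fisher z transformation plus a DerSimonian-Laird estimate of between-study variance) is shown below with invented (r, n) pairs, not the studies included in the meta-analysis.

import numpy as np

studies = [(0.30, 60), (0.25, 45), (0.40, 80), (0.35, 52), (0.28, 38)]   # hypothetical (r, n)
z = np.array([np.arctanh(r) for r, n in studies])       # Fisher z transform of each correlation
v = np.array([1.0 / (n - 3) for r, n in studies])       # approximate within-study variance of z
w = 1.0 / v

# DerSimonian-Laird estimate of the between-study variance tau^2
z_fixed = np.sum(w * z) / np.sum(w)
Q = np.sum(w * (z - z_fixed) ** 2)
c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
tau2 = max(0.0, (Q - (len(studies) - 1)) / c)

# Random-effects pooled estimate, back-transformed to the correlation scale
w_re = 1.0 / (v + tau2)
z_re = np.sum(w_re * z) / np.sum(w_re)
se = np.sqrt(1.0 / np.sum(w_re))
lo, hi = z_re - 1.96 * se, z_re + 1.96 * se
print(f"pooled r = {np.tanh(z_re):.2f} (95% CI {np.tanh(lo):.2f} to {np.tanh(hi):.2f})")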
Auhl, Maria; Tan, Jade M.; Levinger, Pazit; Roddy, Edward; Munteanu, Shannon E.
2016-01-01
Objective To compare the effectiveness of prefabricated foot orthoses to rocker‐sole footwear in reducing foot pain in people with first metatarsophalangeal (MTP) joint osteoarthritis (OA). Methods Participants (n = 102) with first MTP joint OA were randomly allocated to receive individualized, prefabricated foot orthoses or rocker‐sole footwear. The primary outcome measure was the pain subscale on the Foot Health Status Questionnaire (FHSQ) at 12 weeks. Secondary outcome measures included the function, footwear, and general foot health subscales of the FHSQ; the Foot Function Index; severity of pain and stiffness at the first MTP joint; perception of global improvement; general health status; use of rescue medication and co‐interventions to relieve pain; physical activity; and the frequency of self‐reported adverse events. Results The FHSQ pain subscale scores improved in both groups, but no statistically significant difference between the groups was observed (adjusted mean difference 2.05 points, 95% confidence interval [95% CI] −3.61, 7.71; P = 0.477). However, the footwear group exhibited lower adherence (mean ± SD total hours worn 287 ± 193 versus 448 ± 234; P < 0.001), were less likely to report global improvement in symptoms (39% versus 62%; relative risk [RR] 0.63, 95% CI 0.41, 0.99; P = 0.043), and were more likely to experience adverse events (39% versus 16%; RR 2.47, 95% CI 1.12, 5.44; P = 0.024) compared to the orthoses group. Conclusion Prefabricated foot orthoses and rocker‐sole footwear are similarly effective at reducing foot pain in people with first MTP joint OA. However, prefabricated foot orthoses may be the intervention of choice due to greater adherence and fewer associated adverse events. PMID:26638878
Tomasson, Gunnar; Boers, Maarten; Walsh, Michael; LaValley, Michael; Cuthbertson, David; Carette, Simon; Davis, John C.; Hoffman, Gary S.; Khalidi, Nader A.; Langford, Carol A.; McAlear, Carol A.; McCune, W. Joseph; Monach, Paul A.; Seo, Philip; Specks, Ulrich; Spiera, Robert; St. Clair, E. William; Stone, John H.; Ytterberg, Steven R.; Merkel, Peter A.
2011-01-01
Objective To assess a generic measure of health-related quality of life (HRQOL) as an outcome measure in granulomatosis with polyangiitis (Wegener's, GPA). Methods Subjects were participants in the Wegener's Granulomatosis Etanercept Trial (WGET) or the Vasculitis Clinical Research Consortium Longitudinal Study (VCRC-LS). HRQOL was assessed with the Short Form 36 Health Survey (SF-36), which includes physical and mental component summary scores (PCS and MCS). Disease activity was assessed with the Birmingham Vasculitis Activity Score for Wegener's Granulomatosis (BVAS/WG). Results Data from 180 subjects in the WGET (median follow-up = 2.3 years, mean number of visits = 10) and 237 subjects in the VCRC-LS (median follow-up = 2.0 years, mean number of visits = 8) were analyzed. A one-unit increase in BVAS/WG corresponded to a 1.15-unit (95%CI: 1.02; 1.29) decrease in PCS and a 0.93-unit (95%CI: 0.78; 1.07) decrease in MCS in the WGET, and to decreases of 1.16 units for PCS (95%CI: 0.94; 1.39) and 0.79 units for MCS (95%CI: 0.51; 1.39) in the VCRC-LS. In both arms of the WGET study, SF-36 measures improved rapidly during the first 6 weeks of treatment, followed by gradual improvement among patients achieving sustained remission (0.5-unit improvement in PCS per three months), but worsened slightly (0.03-unit decrease in PCS per three months) among patients not achieving sustained remission (p = 0.005). Conclusion HRQOL, as measured by the SF-36, is reduced among patients with GPA. SF-36 measures are modestly associated with other disease outcomes and discriminate between disease states of importance in GPA. PMID:21954229
Escolar, Esteban; Lamas, Gervasio A.; Mark, Daniel B.; Boineau, Robin; Goertz, Christine; Rosenberg, Yves; Nahin, Richard L.; Ouyang, Pamela; Rozema, Theodore; Magaziner, Allan; Nahas, Richard; Lewis, Eldrin F.; Lindblad, Lauren; Lee, Kerry L.
2014-01-01
Background The Trial to Assess Chelation Therapy (TACT) showed clinical benefit of an ethylene diamine tetraacetic acid (EDTA)-based infusion regimen in patients 50 years or older with prior myocardial infarction (MI). Diabetes prior to enrollment was a pre-specified subgroup. Methods and Results Patients received 40 infusions of EDTA chelation or placebo. 633 (37%) had diabetes (322 EDTA, 311 placebo). EDTA reduced the primary endpoint (death, reinfarction, stroke, coronary revascularization, or hospitalization for angina) [25% vs 38%, hazard ratio (HR) 0.59, 95% confidence interval (CI) 0.44-0.79, p<0.001] over 5 years. The result remained significant after Bonferroni adjustment for multiple subgroups (99.4% CI 0.39-0.88, adjusted p=0.002). All-cause mortality was reduced by EDTA chelation [10% vs 16%, HR 0.57, 95% CI 0.36-0.88, p=0.011], as was the secondary endpoint (cardiovascular death, reinfarction, or stroke) [11% vs 17%, HR 0.60, 95% CI 0.39-0.91, p=0.017]. After adjusting for multiple subgroups, however, those results were no longer significant. The number needed to treat to prevent one primary endpoint event was 6.5 over 5 years (95% CI 4.4-12.7). There was no reduction in events in non-diabetics (n=1075, p=0.877), resulting in a treatment-by-diabetes interaction (p=0.004). Conclusions Post-MI diabetic patients aged 50 or older demonstrated a marked reduction in cardiovascular events with EDTA chelation. These findings support efforts to replicate them and to define the mechanisms of benefit. They do not, however, constitute sufficient evidence to indicate the routine use of chelation therapy for all post-MI diabetic patients. PMID:24254885
Buchbinder, David; Mertens, Ann C.; Zeltzer, Lonnie K.; Leisenring, Wendy; Goodman, Pam; Lown, E. Anne; Alderfer, Melissa A.; Recklitis, Christopher; Oeffinger, Kevin; Armstrong, Gregory T.; Hudson, Melissa; Robison, Leslie L.; Casillas, Jacqueline
2012-01-01
Objective To compare the skin and breast/cervical cancer prevention/screening practices of adult siblings of childhood cancer survivors with controls and to identify modifying factors for these practices. Methods Cross-sectional, self-report data from 2,588 adult siblings of 5+ year survivors of childhood cancer were analyzed to assess cancer prevention/screening practices. Two age-, sex- and race/ethnicity-matched samples (n=5,915 and n=37,789) of Behavioral Risk Factor Surveillance System participants served as the comparison populations. Sociodemographic and cancer-related data were explored as modifying factors for sibling cancer prevention/screening practices through multivariable logistic regression. Results Compared to controls, siblings were more likely to practice skin cancer prevention behaviors: use of protective clothing (OR 2.85, 95% CI 2.39-3.39), use of shade (OR 2.11, 95% CI 1.88-2.36), use of sunscreen (OR 1.27, 95% CI 1.14-1.40), and wearing a hat (OR 1.77, 95% CI 1.58-1.98). No differences were noted for breast/cervical cancer screening including mammography and Pap testing. Having less than a high school education and lack of health insurance were associated with diminished cancer prevention/screening behaviors. Survivor diagnosis, treatment intensity, adverse health, chronic health conditions, and second cancers were not associated with sibling cancer prevention/screening behaviors. Conclusions Siblings of cancer survivors report greater skin cancer prevention practices when compared with controls; however, no differences were noted for breast/cervical cancer screening practices. Access to care and lack of education may be associated with decreased cancer prevention/screening behaviors. Interventions are needed to address these barriers. Impact Research should be directed at understanding the impact of the cancer experience on sibling health behaviors. PMID:22576363
The prevalence of symptomatic pelvic floor disorders in women in Bangladesh.
Islam, R M; Bell, R J; Billah, B; Hossain, M B; Davis, S R
2016-12-01
To investigate the prevalence of, and risk factors for, pelvic floor disorders (PFDs) in women in Bangladesh. A nationally representative sample of 1590 Bangladeshi women, aged 30-59 years, was recruited using a multistage cluster sampling technique, between September 2013 and March 2014. Urinary incontinence (UI), fecal incontinence (FI) and pelvic organ prolapse (POP) were assessed using validated questionnaires. The weighted prevalence and the factors associated with each PFD were investigated using multivariable weighted logistic regression. The weighted prevalence of UI was 23.7% (95% confidence interval (CI) 21.3-26.0%), FI 5.3% (95% CI 4.0-6.6%), POP 16.2% (95% CI 14.2-18.2%), and having at least one PFD 35.3% (95% CI 32.6-37.9%). Women were more likely to have at least one PFD if aged 40-49 years (adjusted odds ratio (AOR) 1.46, 95% CI 1.02-2.08; p = 0.040) or 50-59 years (AOR 2.39, 95% CI 1.59-3.58; p < 0.0001), compared with women aged 30-39 years. Having at least one PFD was positively associated with having three or more versus fewer children (AOR 1.61, 95% CI 1.14-2.27; p = 0.007), being in the middle (AOR 3.05, 95% CI 1.72-5.41; p < 0.0001), second lowest (AOR 2.49, 95% CI 1.39-4.47; p = 0.002) or lowest (AOR 3.13, 95% CI 1.68-5.86; p < 0.0001) wealth quintile compared with the highest, and self-reporting diabetes (AOR 2.55, 95% CI 1.54-4.23; p < 0.0001). One-third of Bangladeshi women aged 30-59 years had at least one symptomatic PFD. Risk factors included greater age, higher parity, lower wealth status and self-reported diabetes. The diagnosis, treatment, and prevention of PFDs in Bangladesh need greater attention, as the prevalence of these disabling conditions is likely to increase with the aging of the population.
Likavec, Tasha; Pires, Alda F.A.; Funk, Julie A.
2016-01-01
The objective of this study was to describe the association between thermal measures in the barn environment (pen temperature and humidity) and fecal shedding of Salmonella in dairy cattle. A repeated cross-sectional study was conducted within a commercial dairy herd located in the midwestern United States. Five pooled fecal samples were collected monthly from each pen for 9 mo and submitted for microbiological culture. Negative binomial regression methods were used to test the association [incidence rate ratio (IRR)] between Salmonella pen status (the count of Salmonella-positive pools) and thermal environmental parameters [average temperature and temperature humidity index (THI)] for 3 time periods (48 h, 72 h, and 1 wk) before fecal sampling. Salmonella was cultured from 10.8% [39/360; 95% confidence interval (CI): 7.8% to 14.5%] of pooled samples. The highest proportion of positive pools occurred in August. The IRR ranged from 1.26 (95% CI: 1.15 to 1.39, THI 1 wk) to 4.5 (95% CI: 2.13 to 9.51, heat exposure 1 wk) across all thermal parameters and lag time periods measured. For example, the incidence rate of Salmonella-positive pools increased by 54% for every 5°C increment in average temperature (IRR = 1.54; 95% CI: 1.29 to 1.85) and by 29% for every 5-unit increase in THI (IRR = 1.29; 95% CI: 1.16 to 1.42) during the 72 h before sampling. The incidence rate for pens exposed to higher temperatures (>25°C) was 4.5 times (95% CI: 2.13 to 9.51) that for pens exposed to temperatures <25°C in the 72 h before sampling. Likewise, the incidence rate for pens exposed to THI >70 was 4.23 times (95% CI: 2.1 to 8.28) that for pens with THI <70 in the 72 h before sampling. An association was found between the thermal environment and Salmonella shedding in dairy cattle. Further research is warranted to fully understand the component risks associated with the summer season and increased Salmonella shedding. PMID:27408330
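The IRRs above are exponentiated coefficients from the negative binomial model; the sketch below shows the general recipe in Python with statsmodels, on fabricated pen-month data and a made-up 'thi_5unit' covariate (THI divided by 5 so the IRR is per 5-unit increase). It is a sketch of the technique, not the study's analysis.

import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 360
df = pd.DataFrame({
    "positive_pools": rng.poisson(0.5, n),     # count of Salmonella-positive pools per pen-month
    "thi_5unit": rng.normal(13.0, 1.5, n),     # temperature humidity index divided by 5
})

fit = smf.glm("positive_pools ~ thi_5unit", data=df,
              family=sm.families.NegativeBinomial()).fit()
irr = np.exp(fit.params["thi_5unit"])
ci = np.exp(fit.conf_int().loc["thi_5unit"])
print(f"IRR per 5-unit THI increase: {irr:.2f} (95% CI {ci[0]:.2f} to {ci[1]:.2f})")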
Hange, Dominique; Mehlig, Kirsten; Lissner, Lauren; Guo, Xinxin; Bengtsson, Calle; Skoog, Ingmar; Björkelund, Cecilia
2013-01-01
Purpose To investigate possible associations of mental stress with psychosomatic symptoms, socioeconomic status, lifestyle, and mortality in a middle-aged female population followed over 37 years. Methods A prospective observational study initiated in 1968–1969, including 1462 women aged 60, 54, 50, 46, and 38 years, with follow-ups in 1974–1975, 1980–1981, and 2000–2001, was performed. Measures included self-reported mental stress as well as psychosomatic symptoms and smoking, physical activity, total cholesterol, S-triglycerides, body mass index, waist–hip ratio, blood pressure, socioeconomic status and mortality. Results Smoking, not being single, and not working outside the home were strongly associated with reported mental stress at baseline. Women who reported high mental stress in 1968–1969 were more likely to report the presence of abdominal symptoms (odds ratio [OR] = 1.85, 95% confidence interval [CI]: 1.39–2.46), headache/migraine (OR = 2.04, 95% CI: 1.53–2.72), frequent infections (OR = 1.75, 95% CI: 1.14–2.70), and musculoskeletal symptoms (OR = 1.70, 95% CI: 1.30–2.23) than women who did not report mental stress. Women without these symptoms at baseline in 1968–1969, but with perceived mental stress, were more likely to subsequently report incident abdominal symptoms (OR = 2.15, 95% CI: 1.39–3.34), headache/migraine (OR = 2.27, 95% CI: 1.48–3.48) and frequent infections (OR = 2.21, 95% CI: 1.12–4.36) in 1974–1975 than women without mental stress in 1968–1969. There was no association between perceived mental stress at baseline and mortality over 37 years of follow-up. Conclusion Women reporting mental stress had a higher frequency of psychosomatic symptoms than women who did not report these symptoms. Not working outside the home and smoking, rather than low socioeconomic status per se, were associated with higher stress levels. Perception of high mental stress was not associated with increased mortality. PMID:23650451
Chanu, A; Caron, A; Ficheur, G; Berkhout, C; Duhamel, A; Rochoy, M
2018-05-01
A general practitioner's office is an economic unit where task delegation is an essential component in improving the quality and performance of work. The objective was to classify the preferences of general practitioners regarding the delegation of medical-administrative tasks to assistant medical-social secretaries. Conjoint analysis was applied to a random sample of 175 general practitioners working in metropolitan France. Ten scenarios were constructed based on seven attributes: training for medical secretaries, logistical support during the consultation, and delegation of management planning, medical records, accounting, maintenance, and taking initiative on the telephone. A factorial design was used to reduce the number of scenarios. Physicians' socio-demographic variables were collected. One hundred and three physicians responded, and the analysis included the 90 respondents whose answers respected the transitivity-of-preferences hypothesis. Perceived difficulty was scored 2.8 out of 5. The high response rate (59%; 95% CI [51.7-66.3]) and transitivity rate (87.5%; 95% CI [81.1-93.9]) showed physicians' interest in this topic. Delegation of tasks concerning management planning (OR=2.91; 95% CI [2.40-13.52]) and medical records (OR=1.88; 95% CI [1.56-2.27]) were the two most important attributes for physicians. The only attribute not taken into account in the choice of a secretary was logistical support. This is the first study to examine the choices of general practitioners concerning the delegation of tasks to assistants. These findings help to clarify the determinants of practitioners' choices in delegating certain tasks or not. They reveal doctors' desire to limit their ancillary tasks in order to free up time for "medical" tasks. They also show interest in training medical secretaries and in widening their field of competence, suggesting the emergence of a new professional occupation that could be called "medical assistant". Copyright © 2018 Elsevier Masson SAS. All rights reserved.
Fernandes, Ângela; Rocha, Nuno; Santos, Rubim; Tavares, João Manuel R S
2015-01-01
The aim of this study was to analyze the efficacy of cognitive-motor dual-task training compared with single-task training on balance and executive functions in individuals with Parkinson's disease. Fifteen subjects, aged between 39 and 75 years, were randomly assigned to a dual-task training group (n = 8) or a single-task training group (n = 7). The training was run twice a week for 6 weeks. The single-task group received balance training and the dual-task group performed cognitive tasks simultaneously with the balance training. There were no significant differences between the two groups at baseline. After the intervention, mediolateral sway with eyes closed was significantly better in the dual-task group and anteroposterior sway with eyes closed was significantly better in the single-task group. The results suggest superior outcomes for dual-task training compared to single-task training for static postural control, except in anteroposterior sway with eyes closed.
Fatty liver disease in severe obese patients: Diagnostic value of abdominal ultrasound
de Moura Almeida, Alessandro; Cotrim, Helma Pinchemel; Barbosa, Daniel Batista Valente; de Athayde, Luciana Gordilho Matteoni; Santos, Adimeia Souza; Bitencourt, Almir Galvão Vieira; de Freitas, Luiz Antonio Rodrigues; Rios, Adriano; Alves, Erivaldo
2008-01-01
AIM: To evaluate the sensitivity and specificity of abdominal ultrasound (US) for the diagnosis of hepatic steatosis in severely obese subjects and its relation to the histological grade of steatosis. METHODS: A consecutive series of obese patients who underwent bariatric surgery from October 2004 to May 2005 was selected. Ultrasonography was performed in all patients as part of the routine preoperative work-up, and an intraoperative wedge biopsy was obtained at the beginning of the bariatric surgery. The US and histological findings of steatosis were compared, considering histology as the gold standard. RESULTS: The study included 105 patients. The mean age was 37.2 ± 10.6 years and 75.2% were female. The histological prevalence of steatosis was 89.5%. The sensitivity and specificity of US in the diagnosis of hepatic steatosis were, respectively, 64.9% (95% CI: 54.9-74.3) and 90.9% (95% CI: 57.1-99.5). The positive predictive value and negative predictive value were, respectively, 98.4% (95% CI: 90.2-99.9) and 23.3% (95% CI: 12.3-39.0). The presence of steatosis on US was associated with advanced grades of steatosis on histology (P = 0.016). CONCLUSION: Preoperative abdominal US in our series was not shown to be an accurate method for the diagnosis of hepatic steatosis in severely obese patients. Until another non-invasive method demonstrates better sensitivity and specificity, histological evaluation may be recommended for these patients undergoing bariatric surgery. PMID:18322958
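The reported sensitivity, specificity, PPV and NPV all come from the same 2x2 table of ultrasound result against histology. The sketch below uses cell counts reconstructed only approximately from the reported prevalence, sensitivity and specificity; treat them as illustrative, not the study's exact table.

def diagnostic_metrics(tp, fp, fn, tn):
    # tp/fp/fn/tn = true positives, false positives, false negatives, true negatives
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
    }

# Approximate counts for 105 patients with 89.5% histological steatosis prevalence.
print(diagnostic_metrics(tp=61, fp=1, fn=33, tn=10))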
Domazet, Sidsel Louise; Froberg, Karsten; Hillman, Charles H.; Andersen, Lars Bo; Bugge, Anna
2016-01-01
Background Physical activity is associated not only with health-related parameters, but also with cognitive and academic performance. However, no large scale school-based physical activity interventions have investigated effects on cognitive performance in adolescents. The aim of this study was to describe the effectiveness of a school-based physical activity intervention in enhancing cognitive performance in 12- to 14-year-old adolescents. Methods A 20 week cluster randomized controlled trial was conducted including seven intervention and seven control schools. A total of 632 students (mean (SD) age: 12.9 (0.6) years) completed the trial with baseline and follow-up data on primary or secondary outcomes (74% of randomized subjects). The intervention targeted physical activity during academic subjects, recess, school transportation and leisure-time. Cognitive performance was assessed using an executive functions test of inhibition (flanker task) with the primary outcomes being accuracy and reaction time on congruent and incongruent trials. Secondary outcomes included mathematics performance, physical activity levels, body-mass index, waist-circumference and cardiorespiratory fitness. Results No significant difference in change, comparing the intervention group to the control group, was observed on the primary outcomes (p’s>0.05) or mathematics skills (p>0.05). An intervention effect was found for cardiorespiratory fitness in girls (21 meters; 95% CI: 4.4–38.6) and body-mass index in boys (-0.22 kg/m2; 95% CI: -0.39 to -0.05). Contrary to our predictions, a significantly larger change in interference control for reaction time was found in favor of the control group (5.0 milliseconds; 95% CI: 0–9). Baseline to mid-intervention changes in physical activity levels did not differ significantly between groups (all p’s>0.05). Conclusions No evidence was found for effectiveness of a 20-week multi-faceted school-based physical activity intervention for enhancing executive functioning or mathematics skills compared to a control group, but low implementation fidelity precludes interpretation of the causal relationship. Trial Registration www.ClinicalTrials.gov NCT02012881 PMID:27341346
Cardoso, Christopher; Kingdon, Danielle; Ellenbogen, Mark A
2014-11-01
A large body of research has examined the acute effects of intranasal oxytocin administration on social cognition and stress-regulation. While progress has been made with respect to understanding the effect of oxytocin administration on social cognition in clinical populations (e.g. autism, schizophrenia), less is known about its impact on the functioning of the hypothalamic-pituitary-adrenal (HPA) axis among individuals with a mental disorder. We conducted a meta-analysis on the acute effect of intranasal oxytocin administration on the cortisol response to laboratory tasks. The search yielded eighteen studies employing a randomized, placebo-controlled design (k=18, N=675). Random-effects models and moderator analyses were performed using the metafor package for the statistical program R. The overall effect size estimate was modest and not statistically significant (Hedges g=-0.151, p=0.11) with moderate heterogeneity in this effect across studies (I²=31%). Controlling for baseline differences in cortisol concentrations, moderation analyses revealed that this effect was larger in response to challenging laboratory tasks that produced a robust stimulation of the HPA-axis (Hedges g=-0.433, 95% CI [-0.841, -0.025]), and in clinical populations relative to healthy controls (Hedges g=-0.742, 95% CI [-1.405, -0.078]). Overall, oxytocin administration showed greater attenuation of the cortisol response to laboratory tasks that strongly activated the HPA-axis, relative to tasks that did not. The effect was more robust among clinical populations, suggesting possible increased sensitivity to oxytocin among those with a clinical diagnosis and concomitant social difficulties. These data support the view that oxytocin may play an important role in HPA dysfunction associated with psychopathology. Copyright © 2014 Elsevier Ltd. All rights reserved.
Jewkes, Rachel K; Dunkle, Kristin; Nduna, Mzikazi; Jama, P Nwabisa; Puren, Adrian
2010-01-01
Objectives To describe the prevalence of childhood experiences of adversity in rural South African youth and their associations with health outcomes. Methods We analysed questionnaires and blood specimens collected during a baseline survey for a cluster randomized controlled trial of a behavioural intervention, and also tested blood for HIV and herpes simplex virus type 2 (HSV2) at 12- and 24-month follow-up; 1,367 male and 1,415 female volunteers were recruited from 70 rural villages. Results Before age 18, both women and men had experienced physical punishment (89.3% & 94.4%), physical hardship (65.8% & 46.8%), emotional abuse (54.7% & 56.4%), emotional neglect (41.6% & 39.6%), and sexual abuse (39.1% & 16.7%). Incident HIV infections were more common in women who experienced emotional abuse (IRR 1.96, 95% CI 1.25, 3.06, p=0.003), sexual abuse (IRR 1.66, 95% CI 1.04, 2.63, p=0.03), and physical punishment (IRR 2.13, 95% CI 1.04, 4.37, p=0.04). Emotional neglect in women was associated with depression (aOR 1.82, 95% CI 1.15, 2.88, p=0.01), suicidality (aOR 5.07, 95% CI 2.07, 12.45, p<0.0001), alcohol abuse (aOR 2.17, 95% CI 0.99, 4.72, p=0.05), and incident HSV2 infections (IRR 1.62, 95% CI 1.01, 2.59, p=0.04). In men, emotional neglect was associated with depression (aOR 3.41, 95% CI 1.87, 6.20, p<0.0001) and drug use (aOR 1.98, 95% CI 1.37, 2.88, p<0.0001). Sexual abuse was associated with alcohol abuse in men (aOR 3.68, 95% CI 2.00, 6.77, p<0.0001), and with depression (aOR 2.16, 95% CI 1.34, 3.48, p=0.002) and alcohol abuse (aOR 3.94, 95% CI 1.90, 8.17, p<0.0001) in women. Practice implications Childhood exposure to adversity is very common and influences the health of women and men. All forms of adversity, emotional, physical and sexual, increase the risk of adverse health outcomes in men and women. Prevention of child abuse needs to be included as part of the HIV prevention agenda in Sub-Saharan Africa. Interventions are needed to prevent emotional, sexual, and physical abuse, and responses from health and social systems in Africa to psychologically support exposed children must be strengthened. PMID:20943270
Denniston, Maxine M.; Klevens, R. Monina; Jiles, Ruth B.; Murphy, Trudy V.
2015-01-01
Objectives To estimate the predictive value of self-reported hepatitis A vaccine (HepA) receipt for the presence of hepatitis A virus (HAV) antibody (anti-HAV) from either past infection or vaccination, as an indicator of HAV protection. Methods Using 2007–2012 National Health and Nutrition Examination Survey data, we assigned participants to 4 groups based on self-reported HepA receipt and anti-HAV results. We compared characteristics across groups and calculated three measures of agreement between self-report and serologic status (anti-HAV): percentage concordance, and positive (PPV) and negative (NPV) predictive values. Using logistic regression we investigated factors associated with agreement between self-reported vaccination status and serological results. Results Demographic and other characteristics varied significantly across the 4 groups. Overall agreement between self-reported HepA receipt and serological results was 63.6% (95% confidence interval [CI] 61.9–65.2); PPV and NPV of self-reported vaccination status for serological result were 47.0% (95% CI 44.2–49.8) and 69.4% (95% CI 67.0–71.8), respectively. Mexican American and foreign-born adults had the highest PPVs (71.5% [95% CI 65.9–76.5] and 75.8% [95% CI 71.4–79.7]) and the lowest NPVs (21.8% [95% CI 18.5–25.4] and 20.0% [95% CI 17.2–23.1]), respectively. Young (ages 20–29 years), US-born, and non-Hispanic White adults had the lowest PPVs (37.9% [95% CI 34.5–41.5], 39.1% [95% CI 36.0–42.3], and 39.8% [95% CI 36.1–43.7]) and the highest NPVs (76.9% [95% CI 72.2–81.0], 78.5% [95% CI 76.5–80.4], and 80.6% [95% CI 78.2–82.8]), respectively. Multivariate logistic analyses found age, race/ethnicity, education, place of birth and income to be significantly associated with agreement between self-reported vaccination status and serological results. Conclusions When assessing hepatitis A protection, self-report of not having received HepA was most likely to identify persons at risk for hepatitis A infection (no anti-HAV) among young, US-born and non-Hispanic White adults, and self-report of HepA receipt was least likely to be reliable among adults with the same characteristics. PMID:26116252
STS-95 crew members Glenn, Lindsey and Robinson at Launch Pad 39B
NASA Technical Reports Server (NTRS)
1998-01-01
STS-95 Payload Specialist John H. Glenn Jr., senator from Ohio, smiles at his fellow crew members (middle) Pilot Steven W. Lindsey and (right) Mission Specialist Stephen K. Robinson while visiting Launch Pad 39B. The crew were making final preparations for launch, targeted for liftoff at 2 p.m. on Oct. 29. The other crew members (not shown) are Mission Specialist Scott E. Parazynski, Payload Specialist Chiaki Mukai, with the National Space Development Agency of Japan (NASDA), Mission Commander Curtis L. Brown Jr., and Mission Specialist Pedro Duque of Spain, with the European Space Agency (ESA). The STS-95 mission is expected to last 8 days, 21 hours and 49 minutes, returning to KSC at 11:49 a.m. EST on Nov. 7.
The prevalence of acne in Mainland China: a systematic review and meta-analysis
Li, Danhui; Chen, Qiang; Liu, Yi; Liu, Tingting; Tang, Wenhui; Li, Shengjie
2017-01-01
Introduction Acne, a very common skin disease, can result in psychological distress and sustained impairment in quality of life. Data on the prevalence of acne and the differences by gender, region and age are limited. The aim of this review is to estimate the prevalence of acne in Mainland China comprehensively and to quantify its association with gender, region and age. Methods We searched electronic databases with predetermined search terms to identify relevant studies published between 1 January 1996 and 30 September 2016. We identified duplicate records using NoteExpress software and evaluated the studies for inclusion. Two independent reviewers extracted the data, followed by statistical analyses using Comprehensive Meta-Analysis software version 2.0. A random effects model was adopted to calculate the overall pooled prevalence and to merge categories, including gender (males and females), region (Northern China and Southern China) and age (primary and secondary students: 7–17 years old; undergraduates: 18–23 years old; overall: no age limits) for subgroup analyses. Logistic meta-regression analysis was used to clarify the associations between acne and the predictors age, gender and region using ORs and their associated 95% CIs. Results 25 relevant studies were included in this meta-analysis. The overall pooled prevalence of acne was 39.2% (95% CI 0.310 to 0.479). The prevalence rates in different age groups were 10.2% overall (95% CI 0.059 to 0.171), 50.2% for primary and secondary students (95% CI 0.451 to 0.554), and 44.5% for undergraduates (95% CI 0.358 to 0.534); by gender, the prevalence rates were 35.7% for females (95% CI 0.275 to 0.448) and 39.7% for males (95% CI 0.317 to 0.482); and by region, the prevalence rates were 34.2% for Northern China (95% CI 0.242 to 0.458) and 46.3% for Southern China (95% CI 0.374 to 0.555). The associations between acne and the predictors age, gender and region were statistically significant. Conclusions In Mainland China, primary and secondary students exhibited higher prevalence rates than undergraduate students; males had higher prevalence rates of acne than females; and the prevalence of acne in Southern China was higher than in Northern China. PMID:28432064
Huang, Ruixue; Ning, Huacheng; Shen, Minxue; Li, Jie; Zhang, Jianglin; Chen, Xiang
2017-01-01
Objective: Atopic dermatitis (AD) is a prevalent, burdensome, and psychologically important pediatric concern. Probiotics have been suggested as a treatment for AD. Some reports have explored this topic; however, the utility of probiotics for AD remains to be firmly established. Methods: To assess the effects of probiotics on AD in children, the PubMed/Medline, Cochrane Library, Scopus, and OVID databases were searched for reports published in the English language. Results: Thirteen studies were identified. Significantly lower SCORAD values, favoring probiotics over controls, were observed (mean difference [MD], −3.07; 95% confidence interval [CI], −6.12 to −0.03; P < 0.001). The reported efficacy of probiotics in children < 1 year old was −1.03 (95%CI, −7.05 to 4.99) and that in children 1–18 years old was −4.50 (95%CI, −7.45 to −1.54; P < 0.001). Subgroup analyses showed that in Europe, SCORAD revealed no effect of probiotics, whereas significantly lower SCORAD values were reported in Asia (MD, −5.39; 95%CI, −8.91 to −1.87). Lactobacillus rhamnosus GG (MD, 3.29; 95%CI, −0.30 to 6.88; P = 0.07) and Lactobacillus plantarum (MD, −0.70; 95%CI, −2.30 to 0.90; P = 0.39) showed no significant effect on SCORAD values in children with AD. However, Lactobacillus fermentum (MD, −11.42; 95%CI, −13.81 to −9.04), Lactobacillus salivarius (MD, −7.21; 95%CI, −9.63 to −4.78), and a mixture of different strains (MD, −3.52; 95%CI, −5.61 to −1.44) showed significant effects on SCORAD values in children with AD. Conclusions: Our meta-analysis indicated that the research to date has not robustly shown that probiotics are beneficial for children with AD. However, caution is needed when generalizing our results, as the populations evaluated were heterogeneous. Randomized controlled trials with larger samples and greater power are necessary to identify the species, dose, and treatment duration of probiotics that are most efficacious for treating AD in children. PMID:28932705
Alvarez-Uria, Gerardo; Gandra, Sumanth; Mandal, Siddhartha; Laxminarayan, Ramanan
2018-03-01
To project future antimicrobial resistance (AMR) in Escherichia coli and Klebsiella pneumoniae. Mixed linear models were constructed from a sample of countries with AMR data in the ResistanceMap database. Inverse probability weighting methods were used to account for countries without AMR data. The estimated prevalence of AMR in 2015 was 64.5% (95% confidence interval (CI) 42-87%) for third-generation cephalosporin-resistant (3GCR) Escherichia coli, 5.8% (95% CI 1.8-9.7%) for carbapenem-resistant (CR) E. coli, 66.9% (95% CI 47.1-86.8%) for 3GCR Klebsiella pneumoniae, and 23.4% (95% CI 7.4-39.4%) for CR K. pneumoniae. The projected AMR prevalence in 2030 was 77% (95% CI 55-99.1%) for 3GCR E. coli, 11.8% (95% CI 3.7-19.9%) for CR E. coli, 58.2% (95% CI 50.2-66.1%) for 3GCR K. pneumoniae, and 52.8% (95% CI 16.3-89.3%) for CR K. pneumoniae. The models suggest that third-generation cephalosporins and carbapenems could be ineffective against a sizeable proportion of infections by E. coli and K. pneumoniae in most parts of the world by 2030, supporting the need both to enhance stewardship efforts and to prioritize research and development of new antibiotics for resistant Enterobacteriaceae. Copyright © 2018 The Author(s). Published by Elsevier Ltd. All rights reserved.
A 6‐month prospective study of injury in Gaelic football
Wilson, F; Caffrey, S; King, E; Casey, K; Gissane, C
2007-01-01
Objective To describe the injury incidence in Gaelic football. Methods A total of 83 players from three counties were interviewed monthly about their injury experience during the 6 months of the playing season. Results The injury rate was 13.5 per 1000 h of exposure to Gaelic football (95% CI, 10.9 to 16.6). There were nearly twice as many injuries during matches (64.4%, 95% CI, 54.1 to 73.6) as in training (35.6%, 95% CI, 26.4 to 49.5). The ankle was found to be the most commonly injured site (13.3%, 95% CI, 7.8 to 21.9). The musculotendinous unit accounted for nearly one third of all injuries (31.1%). The tackle accounted for 27.8% of the injuries sustained (tackler 10%, 95% CI, 5.4 to 17.9; player being tackled 17.9%, 95% CI, 11.2 to 26.9). Of total match injuries, 56.9% (95% CI, 46.1 to 67.1) were experienced in the second half as opposed to 39.7% (95% CI, 29.8 to 50.5) in the first half. Conclusions Gaelic footballers are at considerable risk of injury. Greater efforts must be made to reduce this risk so that players miss less time from sport due to injury. Risk factors for injury in Gaelic football must now be investigated so that specific interventions may be established to reduce them. PMID:17138631
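The headline figure of 13.5 injuries per 1000 exposure-hours is an incidence rate with a Poisson-type CI; the sketch below shows one standard way to compute such a rate and an exact 95% CI. The event and exposure totals are invented, chosen only to be of plausible magnitude, not taken from the study.

from scipy.stats import chi2

events = 100          # hypothetical number of injuries
hours = 7400.0        # hypothetical total exposure hours

rate = events / hours * 1000
lo = chi2.ppf(0.025, 2 * events) / 2 / hours * 1000          # exact Poisson lower bound
hi = chi2.ppf(0.975, 2 * (events + 1)) / 2 / hours * 1000    # exact Poisson upper bound
print(f"{rate:.1f} injuries per 1000 h (95% CI {lo:.1f} to {hi:.1f})")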
Maternal thyroid function and the outcome of external cephalic version: a prospective cohort study
2011-01-01
Background To investigate the relation between maternal thyroid function and the outcome of external cephalic version (ECV) in breech presentation. Methods Prospective cohort study in 141 women (≥ 35 weeks gestation) with a singleton fetus in breech. Blood samples for assessing thyroid function were taken prior to ECV. Main outcome measure was the relation between maternal thyroid function and ECV outcome indicated by post ECV ultrasound. Results ECV success rate was 77/141 (55%), 41/48 (85%) in multipara and 36/93 (39%) in primipara. Women with a failed ECV attempt had significantly higher TSH concentrations than women with a successful ECV (p < 0.001). Multiple logistic regression showed that TSH (OR: 0.52, 95% CI: 0.30-0.90), nulliparity (OR: 0.11, 95% CI: 0.03-0.36), frank breech (OR: 0.30, 95% CI: 0.10-0.93) and placenta anterior (OR: 0.31, 95% CI: 0.11-0.85) were independently related to ECV success. Conclusions Higher TSH levels increase the risk of ECV failure. Trial registration number ClinicalTrials.gov: NCT00516555 PMID:21269431
Feeding practices and blood lead levels in infants in Nagpur, India.
Patel, Archana Behram; Belsare, Hrishikesh; Banerjee, Anita
2011-01-01
In a hospital-based cross-sectional study of 200 infants age 4-9 months in an Indian city (Nagpur), the authors determined the prevalence of elevated blood lead level (EBLL) and mean blood lead levels with respect to feeding patterns, i.e., breastfed or fed with formula or dairy milk. The blood lead levels in this study population ranged from 0.048 microg/dl to 42.944 microg/dl; the mean blood lead level was 10.148 microg/dl (+/- 9.128); EBLL prevalence was 38.2%. EBLL risk factors included removal of house paint in the past year, odds ratio (OR), 5.6 (95% confidence interval [CI], 1.6-19.65); use of surma (eye cosmetic), OR 4.27 (95% CI, 1.39-13.08); maternal use of sindur (vermillion), OR 2.118 (95% CI, 1.07-4.44). Feeding method (breastfed or not) did not appear to have an effect on blood lead level. In non-breastfed infants, boiling of water was significantly associated with EBLL, OR 1.97 (95% CI, 1.01-3.84).
Risks of Primary Extracolonic Cancers Following Colorectal Cancer in Lynch Syndrome
2012-01-01
Background Lynch syndrome is a highly penetrant cancer predisposition syndrome caused by germline mutations in DNA mismatch repair (MMR) genes. We estimated the risks of primary cancers other than colorectal cancer following a diagnosis of colorectal cancer in mutation carriers. Methods We obtained data from the Colon Cancer Family Registry for 764 carriers of an MMR gene mutation (316 MLH1, 357 MSH2, 49 MSH6, and 42 PMS2), who had a previous diagnosis of colorectal cancer. The Kaplan–Meier method was used to estimate their cumulative risk of cancers 10 and 20 years after colorectal cancer. We estimated the age-, sex-, country- and calendar period–specific standardized incidence ratios (SIRs) of cancers following colorectal cancer, compared with the general population. Results Following colorectal cancer, carriers of MMR gene mutations had the following 10-year risk of cancers in other organs: kidney, renal pelvis, ureter, and bladder (2%, 95% confidence interval [CI] = 1% to 3%); small intestine, stomach, and hepatobiliary tract (1%, 95% CI = 0.2% to 2%); prostate (3%, 95% CI = 1% to 5%); endometrium (12%, 95% CI = 8% to 17%); breast (2%, 95% CI = 1% to 4%); and ovary (1%, 95% CI = 0% to 2%). They were at elevated risk compared with the general population: cancers of the kidney, renal pelvis, and ureter (SIR = 12.54, 95% CI = 7.97 to 17.94), urinary bladder (SIR = 7.22, 95% CI = 4.08 to 10.99), small intestine (SIR = 72.68, 95% CI = 39.95 to 111.29), stomach (SIR = 5.65, 95% CI = 2.32 to 9.69), and hepatobiliary tract (SIR = 5.94, 95% CI = 1.81 to 10.94) for both sexes; cancer of the prostate (SIR = 2.05, 95% CI = 1.23 to 3.01), endometrium (SIR = 40.23, 95% CI = 27.91 to 56.06), breast (SIR = 1.76, 95% CI = 1.07 to 2.59), and ovary (SIR = 4.19, 95% CI = 1.28 to 7.97). Conclusion Carriers of MMR gene mutations who have already had a colorectal cancer are at increased risk of a greater range of cancers than the recognized spectrum of Lynch syndrome cancers, including breast and prostate cancers. PMID:22933731
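The standardized incidence ratios reported above are ratios of observed to expected cancer counts, with confidence limits that can be obtained from exact Poisson limits for the observed count. Below is a minimal Python sketch of that calculation; the observed and expected counts are illustrative placeholders, not values from the Colon Cancer Family Registry.

```python
from scipy.stats import chi2

def sir_with_ci(observed, expected, alpha=0.05):
    """Standardized incidence ratio (observed / expected) with an
    exact Poisson confidence interval for the observed count."""
    lower_o = 0.5 * chi2.ppf(alpha / 2, 2 * observed) if observed > 0 else 0.0
    upper_o = 0.5 * chi2.ppf(1 - alpha / 2, 2 * (observed + 1))
    return observed / expected, lower_o / expected, upper_o / expected

# Illustrative counts only -- not taken from the registry data.
sir, lo, hi = sir_with_ci(observed=25, expected=2.0)
print(f"SIR = {sir:.2f} (95% CI {lo:.2f} to {hi:.2f})")
```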
Celebrini, Richard G.; Eng, Janice J.; Miller, William C.; Ekegren, Christina L.; Johnston, James D.; Depew, Thomas A.; MacIntyre, Donna L.
2015-01-01
Objective To determine the effect of a novel movement strategy incorporated within a soccer warm-up on biomechanical risk factors for ACL injury during three sport-specific movement tasks. Design Single-blind, randomized controlled clinical trial. Setting Laboratory setting. Participants 20 top-tier female teenage soccer players. Interventions Subjects were randomized to the Core Position and Control movement strategy (Core-PAC) warm-up or a standard warm-up, which took place prior to their regular soccer practice over a 6-week period. The Core-PAC focuses on getting the centre of mass closer to the plant foot or base of support. Main Outcome Measures Peak knee flexion angle and abduction moments during a side-hop (SH), side-cut (SC) and unanticipated side-cut (USC) task after the 6-week program, with (intervention group only) and without a reminder to use the Core-PAC strategy. Results The Core-PAC group increased peak flexion angles during the SH task (Mean difference = 6.2°, 95% CI: 1.9–10.5°, effect size = 1.01, P = 0.034) after the 6-week warm-up program without a reminder. In addition, the Core-PAC group demonstrated increased knee flexion angles for the side-cut (Mean difference = 8.5°, 95% CI: 4.8–12.2°, ES = 2.02, P = 0.001) and side-hop (Mean difference = 10.0°, 95% CI: 5.7–14.3°, ES = 1.66, P = 0.001) task after a reminder. No changes in abduction moments were found. Conclusions The results of this study suggest that the Core-PAC may be one method of modifying high-risk soccer-specific movements and can be implemented within a practical, team-based soccer warm-up. The results should be interpreted with caution due to the small sample size. PMID:24184850
Onigbogi, Modupe O; Odeyemi, Kofoworola A; Onigbogi, Olanrewaju O
2015-03-01
Violence against women is a major public health problem globally. A cross-sectional descriptive study was conducted in Ikosi Isheri LCDA of Lagos State among 400 married women. A multistage sampling method was used to select the respondents. The lifetime prevalences of physical violence, sexual violence and psychological violence were 50.5%, 33.8% and 85.0%, respectively. Predictive factors for physical IPV included lower educational status of the women (AOR 3.22, 95% CI: 1.54-6.77) and partner's daily alcohol intake (AOR 1.84, 95% CI: 1.05-3.23). The predictors of sexual violence included unemployment status of the partners (OR 5.89, 95% CI: 1.39-24.84) and daily/weekly alcohol use (AOR 1.87, 95% CI: 1.05-3.33). Predictors of psychological violence included respondents' having witnessed parental violence (AOR 2.80, 95% CI: 1.04-7.5) and daily alcohol use by partners (AOR 2.71, 95% CI: 1.19-6.18). Preventive interventions such as increasing the educational status of women and reducing the intake of alcohol by men may help break the cycle of abuse.
Health Care Access Among Individuals Involved in Same-Sex Relationships
Heck, Julia E.; Sell, Randall L.; Gorin, Sherri Sheinfeld
2006-01-01
Objectives. We used data from the National Health Interview Survey to compare health care access among individuals involved in same-sex versus opposite-sex relationships. Methods. We conducted descriptive and logistic regression analyses from pooled data on 614 individuals in same-sex relationships and 93418 individuals in opposite-sex relationships. Results. Women in same-sex relationships (adjusted odds ratio [OR]=0.60; 95% confidence interval [CI]=0.39, 0.92) were significantly less likely than women in opposite-sex relationships to have health insurance coverage, to have seen a medical provider in the previous 12 months (OR=0.66; 95% CI=0.46, 0.95), and to have a usual source of health care (OR=0.50; 95% CI=0.35, 0.71); they were more likely to have unmet medical needs as a result of cost issues (OR=1.85; 95% CI=1.16, 2.96). In contrast, health care access among men in same-sex relationships was equivalent to or greater than that among men in opposite-sex relationships. Conclusions. In this study involving a nationwide probability sample, we found some important differences in access to health care between individuals in same-sex and opposite-sex relationships, particularly women. PMID:16670230
Ryll, Ulrike C; Bastiaenen, Carolien H G; Eliasson, Ann-Christin
2017-05-01
To explore the differences, relationship, and extent of agreement between the Assisting Hand Assessment (AHA), measuring observed ability to perform bimanual tasks, and the Children's Hand-Use Experience Questionnaire (CHEQ), assessing experienced bimanual performance. This study investigates a convenience sample of 34 children (16 girls) with unilateral cerebral palsy aged 6-18 years (mean 12.1, SD 3.9) in a cross-sectional design. The AHA and CHEQ subscales share 8-25% of their variance (R²). Bland-Altman plots for AHA and all three CHEQ subscales indicate good average agreement, with a mean difference approaching zero but large 95% confidence intervals. Limits of agreement were extremely wide, indicating considerable disagreement between AHA and CHEQ subscales. AHA and CHEQ seem to measure different though somewhat related constructs of bimanual performance. Results of this investigation reinforce the recommendation to use both instruments to obtain complementary information about bimanual performance including observed and perceived performance of children with unilateral cerebral palsy.
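The limits of agreement above follow the standard Bland-Altman construction: the mean paired difference (bias) plus or minus 1.96 times the standard deviation of the differences. The minimal Python sketch below illustrates that calculation; the paired AHA/CHEQ scores are hypothetical, and the assumption that both scales are expressed in comparable units is mine, not the study's.

```python
import numpy as np

def bland_altman(a, b):
    """Mean difference (bias) and 95% limits of agreement for paired scores."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    diff = a - b
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Hypothetical paired scores (e.g., AHA vs. one CHEQ subscale, 0-100 units).
aha  = [62, 71, 55, 80, 68, 74, 59, 66]
cheq = [58, 75, 49, 83, 60, 77, 52, 70]
bias, loa_low, loa_high = bland_altman(aha, cheq)
print(f"bias = {bias:.1f}, limits of agreement = ({loa_low:.1f}, {loa_high:.1f})")
```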
Tarp-Assisted Cooling as a Method of Whole-Body Cooling in Hyperthermic Individuals.
Hosokawa, Yuri; Adams, William M; Belval, Luke N; Vandermark, Lesley W; Casa, Douglas J
2017-03-01
We investigated the efficacy of tarp-assisted cooling as a body cooling modality. Participants exercised on a motorized treadmill in hot conditions (ambient temperature 39.5°C [103.1°F], SD 3.1°C [5.58°F]; relative humidity 38.1% [SD 6.7%]) until they reached exercise-induced hyperthermia. After exercise, participants were cooled with either partial immersion using a tarp-assisted cooling method (water temperature 9.20°C [48.56°F], SD 2.81°C [5.06°F]) or passive cooling in a climatic chamber. There were no differences in exercise duration (mean difference=0.10 minutes; 95% CI -5.98 to 6.17 minutes) or end exercise rectal temperature (mean difference=0.10°C [0.18°F]; 95% CI -0.05°C to 0.25°C [-0.09°F to 0.45°F]) between tarp-assisted cooling (48.47 minutes [SD 8.27 minutes]; rectal temperature 39.73°C [103.51°F], SD 0.27°C [0.49°F]) and passive cooling (48.37 minutes [SD 7.10 minutes]; 39.63°C [103.33°F], SD 0.40°C [0.72°F]). Cooling time to rectal temperature 38.25°C (100.85°F) was significantly faster in tarp-assisted cooling (10.30 minutes [SD 1.33 minutes]) than passive cooling (42.78 minutes [SD 5.87 minutes]). Cooling rates for tarp-assisted cooling and passive cooling were 0.17°C/min (0.31°F/min), SD 0.07°C/min (0.13°F/min) and 0.04°C/min (0.07°F/min), SD 0.01°C/min (0.02°F/min), respectively (mean difference=0.13°C [0.23°F]; 95% CI 0.09°C to 0.17°C [0.16°F to 0.31°F]). No sex differences were observed in tarp-assisted cooling rates (men 0.17°C/min [0.31°F/min], SD 0.07°C/min [0.13°F/min]; women 0.16°C/min [0.29°F/min], SD 0.07°C/min [0.13°F/min]; mean difference=0.02°C/min [0.04°F/min]; 95% CI -0.06°C/min to 0.10°C/min [-0.11°F/min to 0.18°F/min]). Women (0.04°C/min [0.07°F/min], SD 0.01°C/min [0.02°F/min]) had greater cooling rates than men (0.03°C/min [0.05°F/min], SD 0.01°C/min [0.02°F/min]) in passive cooling, with negligible clinical effect (mean difference=0.01°C/min [0.02°F/min]; 95% CI 0.001°C/min to 0.024°C/min [0.002°F/min to 0.04°F/min]). Body mass was moderately negatively correlated with the cooling rate in passive cooling (r=-0.580) but not in tarp-assisted cooling (r=-0.206). In the absence of a stationary cooling method such as cold-water immersion, tarp-assisted cooling can serve as an alternative, field-expedient method to provide on-site cooling with a satisfactory cooling rate. Copyright © 2016 American College of Emergency Physicians. Published by Elsevier Inc. All rights reserved.
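The cooling rates above are simply the temperature drop divided by the elapsed cooling time. A minimal sketch of that arithmetic follows; the temperature and time values are illustrative figures close to the reported group means, not the per-participant data from which the published rates were averaged.

```python
def cooling_rate(start_temp_c, end_temp_c, minutes):
    """Cooling rate in degrees Celsius per minute."""
    return (start_temp_c - end_temp_c) / minutes

# Roughly the tarp-assisted group means: ~39.73 degC at the end of exercise,
# cooled to 38.25 degC in ~10.3 minutes. Group-mean arithmetic gives ~0.14 degC/min;
# the published 0.17 degC/min was averaged over individual participants.
print(f"{cooling_rate(39.73, 38.25, 10.30):.2f} degC/min")
```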
English, Coralie; Hillier, Susan; Kaur, Gurpreet; Hundertmark, Laura
2014-03-01
Do people with stroke spend more time in active task practice during circuit class therapy sessions versus individual physiotherapy sessions? Do people with stroke practise different tasks during circuit class therapy sessions versus individual physiotherapy sessions? Prospective, observational study. Twenty-nine people with stroke in inpatient rehabilitation settings. Individual therapy sessions and circuit class therapy sessions provided within a larger randomised controlled trial. Seventy-nine therapy sessions were video-recorded and the footage was analysed for time spent engaged in various categories of activity. In a subsample of 28 videos, the number of steps taken by people with stroke per therapy session was counted. Circuit class therapy sessions were of a longer duration (mean difference 38.0 minutes, 95% CI 29.9 to 46.1), and participants spent more time engaged in active task practice (mean difference 23.8 minutes, 95% CI 16.1 to 31.4) compared with individual sessions. A greater percentage of time in circuit class therapy sessions was spent practising tasks in sitting (mean difference 5.3%, 95% CI 2.4 to 8.2) and in sit-to-stand practice (mean difference 2.7%, 95% CI 1.4 to 4.1), and a lower percentage of time in walking practice (mean difference 19.1%, 95% CI 10.0 to 28.1) compared with individual sessions. Participants took an average of 371 steps (SD 418) during therapy sessions and this did not differ significantly between group and individual sessions. People with stroke spent more time in active task practice, but a similar amount of time in walking practice when physiotherapy was offered in circuit class therapy sessions versus individual therapy sessions. There is a need for effective strategies to increase the amount of walking practice during physiotherapy sessions for people after stroke. Copyright © 2014 Australian Physiotherapy Association. Published by Elsevier B.V. All rights reserved.
Dunn, Sandra; Sprague, Ann E; Fell, Deshayne B; Dy, Jessica; Harrold, JoAnn; Lamontagne, Bernard; Walker, Mark
2013-04-01
Elective repeat Caesarean section (ERCS) for low-risk women at < 39 weeks' gestation has consistently been associated with increased risks to the neonate, including respiratory morbidity, NICU admission, and lengthier hospital stays than ERCS at 39 to 40 weeks' gestation. The objective of this quality improvement project was to reduce high rates of ERCS < 39 weeks across the Eastern Ontario region. All hospitals within the region providing care during labour and birth (n = 10) were asked to participate. Representatives from each hospital received information about their site-specific rates and knowledge-translation resources to assist them with the project. A benchmark rate for ERCS < 39 weeks was set at 30%. The rates of ERCS < 39 weeks were calculated for two different times (the 2009-2010 and 2010-2011 fiscal years) and the relative difference and 95% confidence intervals were calculated to quantify the magnitude and statistical significance of any change. Qualitative interviews were completed with key informants from each hospital. The proportion of ERCS at < 39 weeks' gestation across the region in the fiscal year 2010-2011 (n = 197/497; 39.6%) was significantly decreased (relative difference: -21%; 95% CI -31% to -8%, P = 0.002) from the previous fiscal year 2009-2010 (n = 229/459; 49.9%). A number of barriers to, and facilitators of, practice change were identified. A reduction in the rate of ERCS < 39 weeks among low-risk women was achieved across the region. Awareness of the issue, possession of site-specific data, and agreement about the evidence and the need for change are critical first steps to improving practice.
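The relative difference reported above is the ratio of the two fiscal-year proportions minus one, and a 95% CI can be attached via the log of that ratio. The Python sketch below uses the counts given in the abstract and yields figures consistent with the reported −21% (95% CI −31% to −8%); the log-ratio method is my assumption about how such an interval can be obtained, not a statement of the authors' exact procedure.

```python
import math

def relative_difference(x1, n1, x2, n2, z=1.96):
    """Relative change in a proportion from period 1 to period 2,
    with a CI based on the log of the ratio of proportions."""
    p1, p2 = x1 / n1, x2 / n2
    rr = p2 / p1
    se = math.sqrt(1/x1 - 1/n1 + 1/x2 - 1/n2)   # SE of log(rr)
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr - 1, lo - 1, hi - 1

# Counts reported in the abstract: 229/459 in 2009-2010, 197/497 in 2010-2011.
rd, lo, hi = relative_difference(229, 459, 197, 497)
print(f"relative difference = {rd:.0%} (95% CI {lo:.0%} to {hi:.0%})")
```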
Effects of the FITKids randomized controlled trial on executive control and brain function.
Hillman, Charles H; Pontifex, Matthew B; Castelli, Darla M; Khan, Naiman A; Raine, Lauren B; Scudder, Mark R; Drollette, Eric S; Moore, Robert D; Wu, Chien-Ting; Kamijo, Keita
2014-10-01
To assess the effect of a physical activity (PA) intervention on brain and behavioral indices of executive control in preadolescent children. Two hundred twenty-one children (7-9 years) were randomly assigned to a 9-month afterschool PA program or a wait-list control. In addition to changes in fitness (maximal oxygen consumption), electrical activity in the brain (P3-ERP) and behavioral measures (accuracy, reaction time) of executive control were collected by using tasks that modulated attentional inhibition and cognitive flexibility. Fitness improved more among intervention participants from pretest to posttest compared with the wait-list control (1.3 mL/kg per minute, 95% confidence interval [CI]: 0.3 to 2.4; d = 0.34 for group difference in pre-to-post change score). Intervention participants exhibited greater improvements from pretest to posttest in inhibition (3.2%, 95% CI: 0.0 to 6.5; d = 0.27) and cognitive flexibility (4.8%, 95% CI: 1.1 to 8.4; d = 0.35 for group difference in pre-to-post change score) compared with control. Only the intervention group increased attentional resources from pretest to posttest during tasks requiring increased inhibition (1.4 µV, 95% CI: 0.3 to 2.6; d = 0.34) and cognitive flexibility (1.5 µV, 95% CI: 0.6 to 2.5; d = 0.43). Finally, improvements in brain function on the inhibition task (r = 0.22) and performance on the flexibility task correlated with intervention attendance (r = 0.24). The intervention enhanced cognitive performance and brain function during tasks requiring greater executive control. These findings demonstrate a causal effect of a PA program on executive control, and provide support for PA for improving childhood cognition and brain health. Copyright © 2014 by the American Academy of Pediatrics.
Wood dust exposure and lung cancer risk: a meta-analysis.
Hancock, David G; Langley, Mary E; Chia, Kwan Leung; Woodman, Richard J; Shanahan, E Michael
2015-12-01
Occupational lung cancers represent a major health burden due to their increasing prevalence and poor long-term outcomes. While wood dust is a confirmed human carcinogen, its association with lung cancer remains unclear due to inconsistent findings in the literature. We aimed to clarify this association using meta-analysis. We performed a search of 10 databases to identify studies published until June 2014. We assessed the lung cancer risk associated with wood dust exposure as the primary outcome and with wood dust-related occupations as a secondary outcome. Random-effects models were used to pool summary risk estimates. 85 publications were included in the meta-analysis. A significantly increased risk for developing lung cancer was observed among studies that directly assessed wood dust exposure (RR 1.21, 95% CI 1.05 to 1.39, n=33) and that assessed wood dust-related occupations (RR 1.15, 95% CI 1.07 to 1.23, n=59). In contrast, a reduced risk for lung cancer was observed among wood dust (RR 0.63, 95% CI 0.39 to 0.99, n=5) and occupation (RR 0.96, 95% CI 0.95 to 0.98, n=1) studies originating in Nordic countries, where softwood dust is the primary exposure. These results were independent of the presence of adjustment for smoking and exposure classification methods. Only minor differences in risk between the histological subtypes were identified. This meta-analysis provides strong evidence for an association between wood dust and lung cancer, which is critically influenced by the geographic region of the study. The reasons for these region-specific effect estimates remain to be clarified, but may suggest a differential effect for hardwood and softwood dusts. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/
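Random-effects pooling of relative risks is typically done on the log scale, most commonly with the DerSimonian-Laird estimator. The sketch below is a minimal illustration of that estimator applied to hypothetical per-study log-RRs and standard errors, not to the 85 publications analysed here, and it does not claim to reproduce the authors' exact model.

```python
import numpy as np

def dersimonian_laird(log_rr, se):
    """DerSimonian-Laird random-effects pooling on the log relative-risk scale."""
    log_rr, se = np.asarray(log_rr, float), np.asarray(se, float)
    w = 1.0 / se**2                              # fixed-effect weights
    fixed = np.sum(w * log_rr) / np.sum(w)
    q = np.sum(w * (log_rr - fixed) ** 2)        # Cochran's Q
    df = len(log_rr) - 1
    c = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2 = max(0.0, (q - df) / c)                # between-study variance
    w_star = 1.0 / (se**2 + tau2)
    pooled = np.sum(w_star * log_rr) / np.sum(w_star)
    se_pooled = np.sqrt(1.0 / np.sum(w_star))
    return (np.exp(pooled),
            np.exp(pooled - 1.96 * se_pooled),
            np.exp(pooled + 1.96 * se_pooled))

# Hypothetical per-study log relative risks and standard errors.
rr, lo, hi = dersimonian_laird(log_rr=[0.10, 0.25, 0.05, 0.30],
                               se=[0.12, 0.15, 0.10, 0.20])
print(f"pooled RR = {rr:.2f} (95% CI {lo:.2f} to {hi:.2f})")
```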
Shah, Manish A.; Jhawer, Minaxi; Ilson, David H.; Lefkowitz, Robert A.; Robinson, Edric; Capanu, Marinela; Kelsen, David P.
2011-01-01
Purpose To evaluate the safety and efficacy of a modified administration schedule of docetaxel, cisplatin, and fluorouracil (mDCF) with bevacizumab in patients with advanced gastroesophageal malignancies. Patients and Methods Previously untreated patients with metastatic gastroesophageal adenocarcinoma received bevacizumab 10 mg/kg, docetaxel 40 mg/m2, fluorouracil 400 mg/m2, leucovorin 400 mg/m2 on day 1, fluorouracil 1,000 mg/m2/d × 2 days intravenous continuous infusion beginning on day 1, and cisplatin 40 mg/m2 on day 3. The primary objective was to improve 6-month progression-free survival (PFS) from 43% (historical DCF control) to 63% with the addition of bevacizumab. The target accrual was 44 patients to have 10% type I and II error rates. Results In total, 44 eligible patients with cancer were enrolled from October 2006 to October 2008: 22 gastric, 20 gastroesophageal junction (GEJ), and two esophagus. In 39 patients with measurable disease, the confirmed response rate was 67% (95% CI, 50% to 81%). Six-month PFS was 79% (95% CI, 63% to 88%), and median PFS was 12 months (95% CI, 8.8 to 18.2 months). With 26-month follow-up, median overall survival (OS) was 16.8 months (95% CI, 12.1 to 26.1 months), and 2-year survival was 37%. Treatment-related grade 3 to 4 toxicity was as follows: neutropenia without fever (50%), fatigue (25%), venous thromboembolism (39%), and nausea, vomiting, mucositis, neuropathy, and febrile neutropenia less than 10% each. In subset analysis, diffuse gastric cancer had significantly worse PFS and OS, and the response rate in proximal/GEJ tumors was 85% (95% CI, 62% to 97%). Conclusion mDCF with bevacizumab appears tolerable and has notable patient outcomes in patients with advanced gastroesophageal adenocarcinoma. Six-month PFS was 79%, surpassing our predefined efficacy end point, and median and 2-year OS were 16.8 months and 37%, respectively. PMID:21189380
[A Meta analysis on the associations between air pollution and respiratory mortality in China].
Liu, Changjing; Huang, Fei; Yang, Zhizhou; Sun, Zhaorui; Huang, Changbao; Liu, Hongmei; Shao, Danbing; Zhang, Wei; Ren, Yi; Tang, Wenjie; Han, Xiaoqin; Nie, Shinan
2015-08-01
To analyze the associations between air pollution and adverse respiratory health outcomes and to estimate the short-term effects of air pollutants [particulate matter with particle size below 10 microns (PM(10)), particulate matter with particle size below 2.5 microns (PM(2.5)), nitrogen dioxide (NO₂), sulphur dioxide (SO₂) and ozone (O₃)] on respiratory mortality in China. Data from epidemiological studies on the associations between air pollution and adverse respiratory health outcomes, published from 1989 through 2014 in China, were collected by systematically searching the PubMed, SpringerLink, Embase, Medline, CNKI, CBM and VIP databases across different provinces of China. Short-term effects of PM(10), PM(2.5), NO₂, SO₂ and O₃ on respiratory mortality were analyzed by meta-analysis, and estimates were pooled by random or fixed effect models using Stata 12.0 software. A total of 157 papers related to the associations between air pollution and adverse respiratory health outcomes in China were published, covering 79.4% of all the provinces in China. Results from the meta-analysis showed that a 10 µg/m³ increase in PM(10), PM(2.5), NO₂, SO₂ and O₃ was associated with increases in respiratory mortality of 0.50% (95% CI: 0-0.90%), 0.50% (95% CI: 0.30%-0.70%), 1.39% (95% CI: 0.90%-1.78%), 1.00% (95% CI: 0.40%-1.59%) and 0.10% (95% CI: -1.21%-1.39%), respectively. No publication bias was found among these studies. There appeared to be positive associations between PM(10)/PM(2.5)/NO₂/SO₂ and respiratory mortality in China, and the relationship calls for further attention to air pollution and adverse respiratory health outcomes.
Corley, Douglas A; Jensen, Christopher D; Marks, Amy R; Zhao, Wei K; de Boer, Jolanda; Levin, Theodore R; Doubeni, Chyke; Fireman, Bruce H; Quesenberry, Charles P
2013-02-01
Reliable community-based colorectal adenoma prevalence estimates are needed to inform colonoscopy quality standards and to estimate patient colorectal cancer risks; however, minimal data exist from populations with large numbers of diverse patients and examiners. We evaluated the prevalence of adenomas detected by sex, age, race/ethnicity, and colon location among 20,792 Kaiser Permanente Northern California members ≥50 years of age who received a screening colonoscopy examination (102 gastroenterologists, 2006-2008). Prevalence of detected adenomas increased more rapidly with age in the proximal colon (adjusted odds ratio [OR], 2.39; 95% confidence interval [CI], 2.05-2.80; 70-74 vs 50-54 years) than in the distal colon (OR, 1.89; 95% CI, 1.63-2.19). Prevalence was higher among men vs women at all ages (OR, 1.77; 95% CI, 1.66-1.89), increasing in men from 25% to 39% at ≥70 years and in women from 15% at 50-54 years to 26% (P < .001). Proximal adenoma prevalence was higher among blacks than whites (OR, 1.26; 95% CI, 1.04-1.54), although total prevalence was similar, including persons <60 years old (OR, 1.17; 95% CI, 0.91-1.50). Prevalence of detected adenomas increases substantially with age and is much higher in men; proximal adenomas are more common among blacks than whites, although the total prevalence and the prevalence for ages <60 years were similar by race. These demographic differences are such that current adenoma detection guidelines may not be valid, without adjustment, for comparing providers serving different populations. The variation in prevalence and location may also have implications for the effectiveness of screening methods in different demographic groups. Copyright © 2013 AGA Institute. Published by Elsevier Inc. All rights reserved.
Cheng, Yvonne W.; Wikström, Anna-Karin; Stephansson, Olof
2017-01-01
Background There is no apparent consensus on obstetric management, i.e., induction of labor or expectant management of women with suspected large-for-gestational-age (LGA)-fetuses. Methods and findings To further examine the subject, a nationwide population-based cohort study from the Swedish Medical Birth Register in nulliparous non-diabetic women with singleton, vertex LGA (>90th centile) births, 1992–2013, was performed. Delivery of a live-born LGA infant induced at 38 completed weeks of gestation in non-preeclamptic pregnancies, was compared to those of expectant management, with delivery at 39, 40, 41, or 42 completed weeks of gestation and beyond, either by labor induction or via spontaneous labor. Primary outcome was mode of delivery. Secondary outcomes included obstetric anal sphincter injury, 5-minute Apgar<7 and birth injury. Multivariable logistic regression analysis was performed to control for potential confounding. We found that among the 722 women induced at week 38, there was a significantly increased risk of cesarean delivery (aOR = 1.44 95% CI:1.20–1.72), compared to those with expectant management (n = 44 081). There was no significant difference between the groups in regards to risk of instrumental vaginal delivery (aOR = 1.05, 95% CI:0.85–1.30), obstetric anal sphincter injury (aOR = 0.81, 95% CI:0.55–1.19), nor 5-minute Apgar<7 (aOR = 1.06, 95% CI:0.58–1.94) or birth injury (aOR = 0.82, 95% CI:0.49–1.38). Similar comparisons for induction of labor at 39, 40 or 41 weeks compared to expectant management with delivery at a later gestational age, showed increased rates of cesarean delivery for induced women. Conclusions In women with LGA infants, induction of labor at 38 weeks gestation is associated with increased risk of cesarean delivery compared to expectant management, with no difference in neonatal morbidity. PMID:28727729
Zhao, Z X; Zhou, R R; Li, L Q; Yu, W; Li, Q F; Hu, P
2018-01-06
Objective: To evaluate the population immunity to measles and explore the factors associated with measles susceptibility in Yunnan residents aged ≥20 years. Methods: 2 689 residents aged ≥20 years were selected by multistage stratified systematic randomized sampling in 252 villages of 42 counties in Yunnan Province between June and September in 2015. Each subject was surveyed with the same questionnaire, including general information, measles-containing vaccine history and measles history, and a 5 ml blood sample was collected from each subject. Serum IgG antibodies against measles virus were measured by ELISA. Positive was defined as an antibody concentration ≥250 mU/ml, and negative as <250 mU/ml. A non-conditional logistic regression model was used to analyze the factors associated with measles susceptibility in adults. Results: Among 2 689 subjects, 1 214 were males (45.15%), and the overall positive rate of measles IgG antibody was 89.77%. Compared with subjects from the region where economic development was low, subjects from the region where economic development was moderate were more likely to be susceptible to measles virus (OR=1.81, 95%CI: 1.33-2.47). Four age groups had a higher risk of being susceptible to measles virus (compared with ≥40 years: 20-24 years old, OR=2.04, 95%CI: 1.26-3.31; 25-29 years old, OR=3.72, 95%CI: 2.37-5.86; 30-34 years old, OR=1.94, 95%CI: 1.22-3.09; 35-39 years old, OR=1.81, 95%CI: 1.07-3.05). Conclusion: Our results suggest that serological susceptibility in adults aged 20-39 years, especially adults from regions where economic development was moderate, warrants attention. An additional vaccination strategy targeting young adults is important for reducing the risk of measles infection.
Assessing Participation in Community-Based Physical Activity Programs in Brazil
REIS, RODRIGO S.; YAN, YAN; PARRA, DIANA C.; BROWNSON, ROSS C.
2015-01-01
Purpose This study aimed to develop and validate a risk prediction model to examine the characteristics that are associated with participation in community-based physical activity programs in Brazil. Methods We used pooled data from three surveys conducted from 2007 to 2009 in state capitals of Brazil with 6166 adults. A risk prediction model was built considering program participation as an outcome. The predictive accuracy of the model was quantified through discrimination (C statistic) and calibration (Brier score) properties. Bootstrapping methods were used to validate the predictive accuracy of the final model. Results The final model showed sex (women: odds ratio [OR] = 3.18, 95% confidence interval [CI] = 2.14–4.71), having less than high school degree (OR = 1.71, 95% CI = 1.16–2.53), reporting a good health (OR = 1.58, 95% CI = 1.02–2.24) or very good/excellent health (OR = 1.62, 95% CI = 1.05–2.51), having any comorbidity (OR = 1.74, 95% CI = 1.26–2.39), and perceiving the environment as safe to walk at night (OR = 1.59, 95% CI = 1.18–2.15) as predictors of participation in physical activity programs. Accuracy indices were adequate (C index = 0.778, Brier score = 0.031) and similar to those obtained from bootstrapping (C index = 0.792, Brier score = 0.030). Conclusions Sociodemographic and health characteristics as well as perceptions of the environment are strong predictors of participation in community-based programs in selected cities of Brazil. PMID:23846162
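The accuracy measures named above (the C statistic for discrimination, the Brier score for calibration, and bootstrap validation) can be computed as in the minimal Python sketch below. The data are simulated and the modelling choices (logistic regression, 200 resamples, a simplified bootstrap) are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score, brier_score_loss

rng = np.random.default_rng(0)
X = rng.normal(size=(600, 5))                                   # simulated predictors
y = rng.binomial(1, 1 / (1 + np.exp(-(X[:, 0] + 0.5 * X[:, 1]))))  # simulated outcome

model = LogisticRegression().fit(X, y)
p = model.predict_proba(X)[:, 1]
print("apparent C index :", round(roc_auc_score(y, p), 3))
print("apparent Brier   :", round(brier_score_loss(y, p), 3))

# Simple bootstrap of the C index (a full optimism correction would refit the
# model on each resample and evaluate it on the original data; omitted here).
boot = []
for _ in range(200):
    idx = rng.integers(0, len(y), len(y))
    if len(np.unique(y[idx])) == 2:
        boot.append(roc_auc_score(y[idx], model.predict_proba(X[idx])[:, 1]))
print("bootstrap C index:", round(float(np.mean(boot)), 3))
```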
Word Recall: Cognitive Performance Within Internet Surveys
Craig, Benjamin M; Jim, Heather S
2015-01-01
Background The use of online surveys for data collection has increased exponentially, yet it is often unclear whether interview-based cognitive assessments (such as face-to-face or telephonic word recall tasks) can be adapted for use in application-based research settings. Objective The objective of the current study was to compare and characterize the results of online word recall tasks to those of the Health and Retirement Study (HRS) and determine the feasibility and reliability of incorporating word recall tasks into application-based cognitive assessments. Methods The results of the online immediate and delayed word recall assessment, included within the Women’s Health and Valuation (WHV) study, were compared to the results of the immediate and delayed recall tasks of Waves 5-11 (2000-2012) of the HRS. Results Performance on the WHV immediate and delayed tasks demonstrated strong concordance with performance on the HRS tasks (ρc=.79, 95% CI 0.67-0.91), despite significant differences between study populations (P<.001) and study design. Sociodemographic characteristics and self-reported memory demonstrated similar relationships with performance on both the HRS and WHV tasks. Conclusions The key finding of this study is that the HRS word recall tasks performed similarly when used as an online cognitive assessment in the WHV. Online administration of cognitive tests, which has the potential to significantly reduce participant and administrative burden, should be considered in future research studies and health assessments. PMID:26543924
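The ρc reported above is Lin's concordance correlation coefficient, which penalizes both poor correlation and systematic shifts between two sets of measurements. A minimal Python sketch follows; the paired recall scores are hypothetical and are not drawn from the WHV or HRS data.

```python
import numpy as np

def lins_ccc(x, y):
    """Lin's concordance correlation coefficient for paired measurements."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    sxy = np.cov(x, y, ddof=1)[0, 1]
    return 2 * sxy / (x.var(ddof=1) + y.var(ddof=1) + (x.mean() - y.mean()) ** 2)

# Hypothetical paired recall scores (words recalled out of 10).
online = [5, 6, 4, 7, 5, 8, 6, 3, 7, 5]
hrs    = [5, 7, 4, 6, 5, 7, 6, 4, 8, 5]
print(round(lins_ccc(online, hrs), 2))
```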
2018-01-01
Background: Patients’ satisfaction is a key parameter for measuring the quality of healthcare services. Value-added services (VAS) were introduced to improve the quality of medication deliveries and to reduce the waiting time at the outpatient pharmacy. Objective: This study aimed to compare the satisfaction levels of patients receiving VAS and traditional counter service (TCS) for prescription refills in Port Dickson Hospital. Methods: A single-center, cross-sectional study was conducted in the outpatient pharmacy department of Port Dickson Hospital from 1 March to 30 June 2017. A systematic sampling method was utilized to recruit subjects into the study, except for mail pharmacy, in which a universal sampling method was used. Data collection was done via telephone interviews for both groups. Results: There were 104 respondents in the TCS group and 105 in the VAS group. The response rate was 99.5%. Overall, a significantly higher total mean satisfaction score was observed in the VAS group compared to the TCS group (43.39 versus 40.49, p=0.002). The same finding was observed after confounding factors were controlled (VAS=44.66, 95% CI 43.07:46.24 versus TCS=39.88, 95% CI 38.29:41.46; p<0.001). VAS respondents reported more satisfaction than TCS respondents for both general and technical aspects. Among the VAS offered, mail pharmacy service respondents showed the highest total mean satisfaction score, but no significant difference was seen between groups (p=0.064). Conclusion: VAS respondents were generally more satisfied than TCS respondents for prescription refills. A longitudinal study is necessary to examine the impact of other dimensions and other types of VAS on patients’ satisfaction levels. PMID:29619135
Othman, Nasih; Kasem, Attallah O.; Salih, Faisal A.
2017-01-01
Background: Waterpipe smoking is increasingly becoming the most common method of tobacco use among adolescents in the Eastern Mediterranean Region. This study was undertaken in Iraqi Kurdistan to estimate its prevalence among students and investigate attitudes and factors associated with it. Materials and Methods: In a cross-sectional survey at Sulaimani Polytechnic University, 1160 students were approached in a two-stage design using a self-administered questionnaire. Data was entered into Epidata and analysis was done in Stata. Results: Prevalence of cigarette smoking was 10% and waterpipe smoking was 28% (male 49%, female 10%). Waterpipe smoking was initiated prior to joining the university in 74% of the cases and 22% of waterpipe smokers smoked every day. The most common place for smoking was coffee shops (52%) and 71% of smokers shared the pipe. The significant risk factors were smoking cigarettes (OR 10.3, 95% CI 7.0–15.0), male gender (OR 5.7, 95% CI 3.9–8.2), non-Kurdish ethnicity (OR 3.0, 95% CI 1.6–15.9), city residence (OR 1.5, 95% CI 1.0–2.1), and use of alcohol and other substances (OR 2.8 95% CI 1.4–5.6). Conclusion: Waterpipe smoking is highly prevalent among students in Iraqi Kurdistan, especially among males, and is becoming a public health problem. Tobacco control interventions should be designed specifically to address this problem among adolescents and the youth. PMID:29849676
Zhu, Yanbin; Chang, Hengrui; Yu, Yiyang; Chen, Wei; Liu, Song; Zhang, Yingze
2017-05-01
To evaluate the comparative effectiveness and accuracy of the electromagnetic technique (EM) versus the free-hand method (FH) for distal locking in intramedullary nailing procedures. Relevant original studies were searched in Medline, Pubmed, Embase, China National Knowledge Infrastructure, and Cochrane Central Database (all through October 2015). Comparative studies providing sufficient data of interest were included in this meta-analysis. Stata 11.0 was used to analyze all data. Eight studies involving 611 participants were included, with 305 in the EM group and 306 in the FH group. EM outperformed FH with a reduced distal locking time of 4.1 minutes [standardized mean difference (SMD), 1.61; 95% confidence interval (95%CI), 0.81 to 2.41] and a reduced fluoroscopy time of 25.3 seconds (SMD, 2.64; 95%CI, 2.12 to 3.16). Regarding the accuracy of distal screw placement, no significant difference was observed between the two techniques (OR, 2.39; 95%CI, 0.38 to 15.0). There was a trend of longer operative time in FH versus EM by 10 minutes (79.0 and 69.0 minutes), although the difference was not statistically significant (SMD, 0.341; 95%CI, -0.02 to 0.703). The existing evidence suggests that the EM technique is a better alternative for distal locking in intramedullary nailing procedures, and this might aid in the management of diaphyseal fractures in lower extremities.
Robledo, J A; Mejía, G I; Morcillo, N; Chacón, L; Camacho, M; Luna, J; Zurita, J; Bodon, A; Velasco, M; Palomino, J C; Martin, A; Portaels, F
2006-06-01
Tuberculosis (TB) diagnostic laboratories in Latin America. Evaluation of thin-layer agar (TLA) compared to Löwenstein-Jensen (LJ) culture for the diagnosis of TB. Phase II prospective study in six laboratories. Samples included sputum and extra-pulmonary specimens from patients with a clinical diagnosis of TB. Respiratory samples were decontaminated using NaOH/NALC; all samples were centrifuged, stained with Ziehl-Neelsen for acid-fast bacilli (AFB), cultured on LJ and TLA and identified according to recommended procedures. Sensitivity and likelihood ratios (LR), growth detection time and contamination rate were calculated for both media. A total of 1118 clinical specimens were studied. Cultures detected Mycobacterium tuberculosis in all AFB-positive samples, whereas for AFB-negative specimens LJ detected 3.2% and TLA 4.4%. Sensitivity was 92.6% (95%CI 87.9-95.9) and 84.7% (95%CI 78.8-89.0) for TLA and LJ, respectively. Positive and negative LRs were similar. Contamination was 5.1% for TLA and 3.0% for LJ. Median time to detection of a positive culture was 11.5 days (95%CI 9.3-15.0) for TLA and 30.5 days (95%CI 26.9-39.0) for LJ (P < 0.0001). Differences in the characteristics of the participating laboratories, the disease prevalence and the number and type of specimens processed did not affect the overall performance of TLA as compared to LJ, supporting the robustness of the method and its feasibility in different laboratory settings.
Trends of obesity prevalence among Spanish adults with diabetes, 1987-2012.
Basterra-Gortari, Francisco Javier; Bes-Rastrollo, Maira; Ruiz-Canela, Miguel; Gea, Alfredo; Sayón-Orea, Carmen; Martínez-González, Miguel Ángel
2018-04-24
Our aim was to examine the secular trends in obesity prevalence among Spanish adults with diabetes. Data were collected from 8 waves (from 1987 to 2012) of the National Health Surveys (NHS). NHS are cross-sectional studies conducted in representative samples of the Spanish adult population. Data of 7378 adults (≥16 years) who reported having been diagnosed with diabetes were analyzed. Previously validated self-reported weight and height were used to estimate body mass index (BMI). Obesity was defined as a BMI of 30 kg/m² or greater. Age-adjusted obesity prevalence for each wave was calculated by the direct standardization method. From 1987 to 2012 age-adjusted prevalence of obesity among persons with diabetes increased from 18.2% (95% confidence interval [CI]: 14.2-22.2%) to 39.8% (95% CI: 36.8-42.8%). Age-adjusted prevalence of obesity in males with diabetes increased from 13.2% (95% CI: 7.3-19.1%) to 38.0% (95% CI: 33.8-42.1%) and in females from 23.0% (95% CI: 17.6-28.4%) to 42.3% (95% CI: 38.0-46.6%). Between 1987 and 2012 the prevalence of obesity markedly increased in Spain among adults with diabetes. Copyright © 2018 Elsevier España, S.L.U. All rights reserved.
Trends in Participation Rates for the National Cancer Screening Program in Korea, 2002-2012
Suh, Mina; Song, Seolhee; Cho, Ha Na; Park, Boyoung; Jun, Jae Kwan; Choi, Eunji; Kim, Yeol; Choi, Kui Son
2017-01-01
Purpose The National Cancer Screening Program (NCSP) in Korea supports cancer screening for stomach, liver, colorectal, breast, and cervical cancer. This study was conducted to assess trends in participation rates among Korean men and women invited to undergo screening via the NCSP as part of an effort to guide future implementation of the program in Korea. Materials and Methods Data from the NCSP for 2002 to 2012 were used to calculate annual participation rates with 95% confidence intervals (CI) by sex, insurance status, and age group for stomach, liver, colorectal, breast, and cervical cancer screening. Results In 2012, participation rates for stomach, liver, colorectal, breast, and cervical cancer screening were 47.3%, 25.0%, 39.5%, 51.9%, and 40.9%, respectively. The participation rates increased annually by 4.3% (95% CI, 4.0 to 4.6) for stomach cancer, 3.3% (95% CI, 2.5 to 4.1) for liver cancer, 4.1% (95% CI, 3.2 to 5.0) for colorectal cancer, 4.6% (95% CI, 4.1 to 5.0) for breast cancer, and 0.9% (95% CI, –0.7 to 2.5) for cervical cancer from 2002 to 2012. Conclusion Participation rates for the NCSP for the five above-mentioned cancers increased annually from 2002 to 2012. PMID:27857022
van Aar, F; de Moraes, M; Morré, S A; van Bergen, J E A M; van der Klis, F R M; Land, J A; van der Sande, M A B; van den Broek, I V F
2014-08-01
Chlamydia trachomatis (CT) reporting rates from sexually transmitted infection clinics and general practitioners have shown a rising trend in the Netherlands. It is unknown to what extent this reflects increased CT transmission or improved case finding. To achieve more insight into the CT epidemic, we explored the CT IgG seroprevalence (a marker of past CT infection) in the general population of the Netherlands in 1996 and in 2007. From two population-based studies in 1996 and 2007, serum samples, demographic and sexual behaviour outcomes were examined, including 1246 men and 1930 women aged 15-39 years. Serum CT IgG antibodies were analysed using the Medac CT IgG ELISA test. Multivariate logistic regression analyses explored the seroprevalence and determinants over time. The CT IgG seroprevalence was higher in women than in men (10% vs 6%). Among women aged 25-39 years the seroprevalence was lower in 2007 (9%) than in 1996 (14%; adjusted OR (aOR) 0.6, 95% CI 0.4 to 0.8). There was no statistical evidence of a difference in seroprevalence within birth cohorts. Factors associated with seropositivity were male gender (aOR 0.4, 95% CI 0.3 to 0.7), a self-reported history of CT infection (aOR 5.1, 95% CI 2.6 to 10.0), age 25-39 years (aOR 1.7, 95% CI 1.1 to 2.7), non-Western ethnicity (aOR 2.2, 95% CI 1.4 to 3.3) and ≥ 2 recent sexual partners (aOR 2.2, 95% CI 1.3 to 3.5). Between 1996 and 2007 the proportion of individuals in the general population with CT IgG antibodies was lower among women aged 25-39 years, but remained similar among younger women and men. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
Holtzer, Roee; Mahoney, Jeannette; Verghese, Joe
2014-08-01
The relationship between executive functions (EF) and gait speed is well established. However, with the exception of dual tasking, the key components of EF that predict differences in gait performance have not been determined. Therefore, the current study was designed to determine whether processing speed, conflict resolution, and intraindividual variability in EF predicted variance in gait performance in single- and dual-task conditions. Participants were 234 nondemented older adults (mean age 76.48 years; 55% women) enrolled in a community-based cohort study. Gait speed was assessed using an instrumented walkway during single- and dual-task conditions. The flanker task was used to assess EF. Results from the linear mixed effects model showed that (a) dual-task interference caused a significant dual-task cost in gait speed (estimate = 35.99; 95% CI = 33.19-38.80) and (b) of the cognitive predictors, only intraindividual variability was associated with gait speed (estimate = -.606; 95% CI = -1.11 to -.10). In unadjusted analyses, the three EF measures were related to gait speed in single- and dual-task conditions. However, in fully adjusted linear regression analysis, only intraindividual variability predicted performance differences in gait speed during dual tasking (B = -.901; 95% CI = -1.557 to -.245). Among the three EF measures assessed, intraindividual variability but not speed of processing or conflict resolution predicted performance differences in gait speed. © The Author 2013. Published by Oxford University Press on behalf of The Gerontological Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
Alvarado-Esquivel, Cosme; Sánchez-Anguiano, Luis Francisco; Arnaud-Gil, Carlos Alberto; Hernández-Tinoco, Jesús; Molina-Espinoza, Luis Fernando; Rábago-Sánchez, Elizabeth
2014-03-01
Little is known about the epidemiology of suicide attempts among psychiatric outpatients in Mexico. This study aimed to determine the socio-demographic, clinical and behavioral characteristics associated with suicide attempts in psychiatric outpatients in two public hospitals in Durango, Mexico. Two hundred seventy-six psychiatric outpatients (154 suicide attempters and 122 patients without suicide attempt history) attending the two public hospitals in Durango City, Mexico, were included in this study. Socio-demographic, clinical and behavioral characteristics were obtained retrospectively from all outpatients and compared in relation to the presence or absence of suicide attempt history. Increased prevalence of suicide attempts was associated with mental and behavioral disorders due to psychoactive substance use (F10-19) (P=0.01), schizophrenia, schizotypal and delusional disorders (F20-29) (P=0.02), mood (affective) disorders (F30-39) (P<0.001), and disorders of adult personality and behavior (F60-69) (P<0.001). Multivariate analysis showed that suicide attempts were associated with young age (OR=1.21, 95% CI: 1.06-1.39; P=0.003), female gender (OR=2.98, 95% CI: 1.55-5.73; P=0.001), urban residence (OR=2.31, 95% CI: 1.17-4.57; P=0.01), memory impairment (OR=1.91, 95% CI: 1.07-3.40; P=0.02), alcohol consumption (OR=2.39, 95% CI: 1.21-4.70; P=0.01), and sexual promiscuity (OR=3.90, 95% CI: 1.74-8.77; P<0.001). We report the association of suicide attempts with socio-demographic, clinical and behavioral characteristics in psychiatric outpatients in Mexico. Results may be useful for optimal planning of preventive measures against suicide attempts in psychiatric outpatients.
Zhou, Jing; Zeng, Lingxia; Dang, Shaonong; Pei, Leilei; Gao, Wenlong; Li, Chao; Yan, Hong
2016-11-01
To identify postnatal predictors of malnutrition among 7- to 10-year-old children and to assess the long-term effects of antenatal micronutrient supplementation on malnutrition. A follow-up study was conducted to assess the nutritional status of 7- to 10-year-olds (1747 children) whose mothers participated in a cluster-randomized double-blind controlled trial from 2002 to 2006. The rate of malnourished 7- to 10-year-olds was 11.1%. A mixed-effects logistic regression model adjusted for the cluster-sampling design indicated that mothers with a low pre-pregnancy mid-upper arm circumference had boys with an increased risk of thinness (aOR 2.05, 95% CI 1.11, 3.79) and girls who were more likely to be underweight (aOR 2.01, 95% CI 1.05, 3.85). Antenatal micronutrient supplementation was not significantly associated with malnutrition. Low birth weight was significantly associated with increased odds of malnutrition among boys (aOR 4.34, 95% CI 1.82, 10.39) and girls (aOR 7.50, 95% CI 3.48, 16.13). Being small for gestational age significantly increased the odds of malnutrition among boys (aOR 1.75, 95% CI 1.01, 3.04) and girls (aOR 4.20, 95% CI 2.39, 7.39). In addition, household wealth, parental height, being a picky eater, and illness frequency also predicted malnutrition. Both maternal prenatal nutrition and adverse birth outcomes are strong predictors of malnutrition among early school-aged children. Currently, available evidence is insufficient to support long-term effects of antenatal micronutrient supplementation on children's nutrition. www.isrctn.com: ISRCTN08850194. Copyright © 2016 Elsevier Inc. All rights reserved.
Chaaya, Monique; Awwad, Johnny; Campbell, Oona M.R.; Sibai, Abla; Kaddour, Afamia
2006-01-01
Objectives To assess the prevalence and determinants of smoking prior to and during pregnancy in Lebanon. Methods A cross-sectional study using two structured instruments. One instrument included information on demographic characteristics, smoking patterns in the index pregnancy and previous pregnancies, use of prenatal health services, stressful life events, and social support during pregnancy. The second was the Arabic General Health Questionnaire (GHQ-12). Women who delivered in 11 randomly selected hospitals in Beirut and its suburbs within 24 hours were asked to consent to participate in the study. The total sample interviewed was 576 women. Results The prevalence of pre-pregnancy smoking was 32% and 20% for smoking in pregnancy. Considering argileh smoking, the prevalence of tobacco use in pregnancy increased to 27% in Beirut and 25% in the suburbs. Pre-pregnancy smoking was associated with older maternal age [OR = 1.08, 95% CI (1.03, 1.14)], low and medium education [OR = 2.22, 95% CI (1.22, 4.04)], increased psychiatric distress [OR = 3.11, 95% CI (1.77, 5.46)], and a husband who smoked [OR = 5.00, 95% CI (2.98, 8.39)]. Continued smoking during pregnancy was associated with low and medium education [OR = 3.77, 95% CI (1.31, 10.8)], younger age [OR = 1.11, 95% CI (1.02–1.20)], and a heavy pre-pregnancy smoking pattern [OR = 13.9, 95% CI (1.40, 137.4)]. Conclusion Policies and programs to eliminate or reduce smoking during pregnancy should be targeted toward young and less educated females and involving the spouse. Obstetricians should promote smoking cessation during pregnancy using evidence-based methods. PMID:14509413
Nakamura, Sachiyo; Horiuchi, Shigeko
2013-01-01
Background: In Japan, the proportion of women aged 35 and older giving birth has greatly increased in recent years, and maternal age is continuing to increase. Advanced maternal age is a risk factor for abnormal delivery, as is hiesho (sensitivity to cold). Research Question: This study aimed to assess whether advanced maternal age and hiesho precipitate premature delivery, premature rupture of membranes, weak labor pains, prolonged labor and atonic bleeding. Method: The study design was a descriptive comparative study with a retrospective cohort group design. Subjects in this study were 2,810 Japanese women in hospital after childbirth. The research methods employed were a paper questionnaire and extraction of data from medical records. Results: Comparing the rate of occurrence of abnormal delivery among women aged 35 to 39 according to whether or not they had hiesho, results were premature delivery OR: 3.51 (95% CI: 1.66-7.43), premature rupture of membranes OR: 1.25 (95% CI: 0.90-1.74), weak labor pains OR: 2.94 (95% CI: 1.65-5.24), prolonged labor OR: 2.56 (95% CI: 1.23-5.26), and atonic bleeding, OR: 1.65 (95% CI: 0.14-2.40) when hiesho was present. Among women aged 40 and over, results were premature delivery OR: 5.09 (95% CI: 1.16-22.20), premature rupture of membranes OR: 1.60 (95% CI: 0.73-3.46), weak labor pains OR: 7.02 (95% CI: 1.56-31.55), prolonged labor OR:7.19 (95% CI: 1.49-34.60) and atonic bleeding OR: 2.00 (95% CI: 0.64-6.23). Conclusions: Regardless of maternal age, the presence of hiesho is a risk factor that can precipitate premature delivery, premature rupture of membranes, weak labor pains, prolonged labor and atonic bleeding. Furthermore, hiesho coupled with advanced maternal age increases the incidence of premature delivery, weak labor pains and prolonged labor. PMID:24062862
The Effects of Age on Divergent Thinking and Creative Objects Production: A Cross-Sectional Study
ERIC Educational Resources Information Center
Massimiliano, Palmiero
2015-01-01
Age-related changes in divergent thinking and creative objects production were investigated in 150 native Italian speakers, divided into six age groups, each comprising 25 participants: young (20-29), young adult (30-39), middle aged (40-49), adult-old (50-59), old (60-69), and old-old (70-80). Two tasks were used: the alternative uses task,…
ERIC Educational Resources Information Center
Robert, Nicole D.; LeFevre, Jo-Anne
2013-01-01
Does solving subtraction problems with negative answers (e.g., 5-14) require different cognitive processes than solving problems with positive answers (e.g., 14-5)? In a dual-task experiment, young adults (N=39) combined subtraction with two working memory tasks, verbal memory and visual-spatial memory. All of the subtraction problems required…
Measuring the degree of integration for an integrated service network
Ye, Chenglin; Browne, Gina; Grdisa, Valerie S; Beyene, Joseph; Thabane, Lehana
2012-01-01
Background Integration involves the coordination of services provided by autonomous agencies and improves the organization and delivery of multiple services for target patients. Current measures generally do not distinguish between agencies’ perception and expectation. We propose a method for quantifying the agencies’ service integration. Using the data from the Children’s Treatment Network (CTN), we aimed to measure the degree of integration for the CTN agencies in York and Simcoe. Theory and methods We quantified the integration by the agreement between perceived and expected levels of involvement and calculated four scores from different perspectives for each agency. We used the average score to measure the global network integration and examined the sensitivity of the global score. Results Most agencies’ integration scores were <65%. As measured by the agreement between every other agency’s perception and expectation, the overall integration of CTN in Simcoe and York was 44% (95% CI: 39%–49%) and 52% (95% CI: 48%–56%), respectively. The sensitivity analysis showed that the global scores were robust. Conclusion Our method extends existing measures of integration and possesses a good extent of validity. We can also apply the method in monitoring improvement and linking integration with other outcomes. PMID:23593050
Longitudinal changes in bone lead levels: the VA Normative Aging Study
Wilker, Elissa; Korrick, Susan; Nie, Linda H; Sparrow, David; Vokonas, Pantel; Coull, Brent; Wright, Robert O.; Schwartz, Joel; Hu, Howard
2011-01-01
Objective Bone lead is a cumulative measure of lead exposure that can also be remobilized. We examined repeated measures of bone lead over 11 years to characterize long-term changes and identify predictors of tibia and patella lead stores in an elderly male population. Methods Lead was measured every 3–5 years by k-x-ray fluorescence and mixed-effect models with random effects were used to evaluate change over time. Results 554 participants provided up to 4 bone lead measurements. Final models predicted a −1.4% annual decline (95%CI: −2.2, −0.7) for tibia lead; a piecewise linear model for patella lead estimated an initial decline of 5.1% per year (95%CI: −6.2, −3.9) during the first 4.6 years and no significant change thereafter (−0.4%; 95% CI: −2.4, 1.7). Conclusions These results suggest that bone lead half-life may be longer than previously reported. PMID:21788910
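A percent-per-year decline like the −1.4% above typically comes from a linear mixed model fit to log-transformed bone lead, with the slope converted via exp(slope) − 1. The sketch below simulates such data and fits a random-intercept model with statsmodels; the log transformation, the simulated decline, and all variable names are my assumptions for illustration, not the study's actual specification.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated repeated measures: 100 subjects, 4 visits each over ~11 years.
rng = np.random.default_rng(1)
rows = []
for sid in range(100):
    base = rng.normal(3.0, 0.4)                      # subject-specific log(tibia lead)
    for years in rng.choice(np.arange(0, 12, 0.5), size=4, replace=False):
        rows.append({"id": sid, "years": years,
                     "log_lead": base - 0.014 * years + rng.normal(0, 0.1)})
df = pd.DataFrame(rows)

# Random-intercept mixed model on the log scale.
fit = smf.mixedlm("log_lead ~ years", df, groups=df["id"]).fit()
slope = fit.params["years"]
print(f"estimated annual change: {(np.exp(slope) - 1) * 100:.1f}%")  # close to the simulated -1.4%/yr
```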
Alexandre, Gisele Caldas; Nadanovsky, Paulo; Lopes, Claudia S; Faerstein, Eduardo
2006-05-01
The aims of this study were to estimate the prevalence of dental pain preventing the performance of routine tasks and to assess its association with socioeconomic factors, minor psychiatric disorders, number of missing teeth, and dental consultation patterns. A cross-sectional study was conducted using a self-completed questionnaire answered by 4,030 administrative employees at a university in Rio de Janeiro, Brazil (the Pró-Saúde Study). Data were analyzed using multiple logistic regression. Prevalence of toothache preventing the performance of routine tasks in the two weeks prior to the interview was 2.9% (95%CI: 2.5-3.6). Men (OR = 1.6; 95%CI: 1.1-2.4), individuals with minor psychiatric disorders (OR = 1.7; 95%CI: 1.2-2.6), individuals with extensive tooth loss (OR = 3.4; 95%CI: 1.5-7.8), and those failing to appear for regular dental checkups (OR = 2.5; 95%CI: 1.8-17.3) showed increased odds of experiencing dental pain. Dental pain was an important problem in this population. Unfavorable living conditions and lack of regular dental checkups increased the odds of dental pain.
Reducing Operating Room Turnover Time for Robotic Surgery Using a Motor Racing Pit Stop Model.
Souders, Colby P; Catchpole, Ken R; Wood, Lauren N; Solnik, Jonathon M; Avenido, Raymund M; Strauss, Paul L; Eilber, Karyn S; Anger, Jennifer T
2017-08-01
Operating room (OR) turnover time, the time taken between one patient leaving the OR and the next entering, is an important determinant of OR utilization, a key value metric for hospital administrators. Surgical robots have increased the complexity and number of tasks required during an OR turnover, resulting in highly variable OR turnover times. We sought to streamline the turnover process to decrease robotic OR turnover times and increase efficiency. Direct observation of 45 pre-intervention robotic OR turnovers was performed. Following a previously successful model for handoffs, we employed concepts from motor racing pit stops, including briefings, leadership, role definition, task allocation and task sequencing. Turnover task cards for staff were developed, and card assignments were distributed for each turnover. Forty-one cases were observed post-intervention. Average total OR turnover time was 99.2 min (95% CI 88.0-110.3) pre-intervention and 53.2 min (95% CI 48.0-58.5) at 3 months post-intervention. Average room ready time, from when the patient exited the OR until the surgical technician was ready to receive the next patient, was 42.2 min (95% CI 36.7-47.7) before the intervention and fell to 27.2 min (95% CI 24.7-29.7) at 3 months post-intervention (p < 0.0001). Role definition, task allocation and sequencing, combined with a visual cue for ease of use, create efficient and sustainable approaches to decreasing robotic OR turnover times. Broader system changes are needed to capitalize on that result. Pit stop and other high-risk industry models may inform approaches to the management of tasks and teams.
A Measure of Search Efficiency in a Real World Search Task (PREPRINT)
2009-02-16
Beck, Melissa R. (LSU); Lohrenz, Maura C. (NRL Code 7440.1); Trafton, J. Gregory (NRL Code 5515)
Image patch-based method for automated classification and detection of focal liver lesions on CT
NASA Astrophysics Data System (ADS)
Safdari, Mustafa; Pasari, Raghav; Rubin, Daniel; Greenspan, Hayit
2013-03-01
We developed a method for automated classification and detection of liver lesions in CT images based on image patch representation and bag-of-visual-words (BoVW). BoVW analysis has been extensively used in the computer vision domain to analyze scenery images. In the current work we discuss how it can be used for liver lesion classification and detection. The methodology includes building a dictionary for a training set using local descriptors and representing a region in the image using a visual word histogram. Two tasks are described: a classification task, for lesion characterization, and a detection task in which a scan window moves across the image and is determined to be normal liver tissue or a lesion. Data: In the classification task 73 CT images of liver lesions were used, 25 images having cysts, 24 having metastasis and 24 having hemangiomas. A radiologist circumscribed the lesions, creating a region of interest (ROI), in each of the images. He then provided the diagnosis, which was established either by biopsy or clinical follow-up. Thus our data set comprises 73 images and 73 ROIs. In the detection task, a radiologist drew ROIs around each liver lesion and two regions of normal liver, for a total of 159 liver lesion ROIs and 146 normal liver ROIs. The radiologist also demarcated the liver boundary. Results: Classification results of more than 95% were obtained. In the detection task, an F1 score of 0.76 was obtained, with a recall of 84% and a precision of 73%. Results show the ability to detect lesions, regardless of shape.
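The BoVW pipeline the abstract outlines (cluster local descriptors into a dictionary, represent each ROI as a visual-word histogram, classify the histogram) can be sketched with scikit-learn. The descriptors below are random noise standing in for real patch features, and the SVM is an assumed classifier choice, so this is a structural sketch rather than the authors' implementation.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)

# Each "ROI" is a bag of local patch descriptors; random noise stands in for
# real intensity-patch features here.
def random_bag(n_patches=200, dim=32):
    return rng.normal(size=(n_patches, dim))

train_bags = [random_bag() for _ in range(60)]
train_labels = np.repeat([0, 1, 2], 20)        # cyst / metastasis / hemangioma, placeholder labels
test_bags = [random_bag() for _ in range(15)]

# 1) Dictionary: cluster all training descriptors into k visual words.
k = 50
kmeans = KMeans(n_clusters=k, n_init=10, random_state=0).fit(np.vstack(train_bags))

# 2) Representation: an ROI becomes a normalized histogram of visual-word counts.
def bovw_histogram(bag):
    words = kmeans.predict(bag)
    hist = np.bincount(words, minlength=k).astype(float)
    return hist / hist.sum()

X_train = np.array([bovw_histogram(b) for b in train_bags])
X_test = np.array([bovw_histogram(b) for b in test_bags])

# 3) Classification of the histogram (an SVM here; the authors' classifier may differ).
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf")).fit(X_train, train_labels)
print(clf.predict(X_test))

# Detection-style scoring: F1 is the harmonic mean of precision and recall.
precision, recall = 0.73, 0.84
print("F1 =", 2 * precision * recall / (precision + recall))
```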
Kakinuma, Ryutaro; Muramatsu, Yukio; Yamamichi, Junta; Gomi, Shiho; Oubel, Estanislao; Moriyama, Noriyuki
2018-01-01
This study sought to evaluate the 95% limits of agreement of the volumes of 5-year clinically stable solid nodules for the development of a follow-up system for indeterminate solid nodules. The volumes of 226 solid nodules that had been clinically stable for 5 years were measured in 186 patients (53 female never-smokers, 36 male never-smokers, 51 males with <30 pack-years, and 46 males with ≥30 pack-years) using a three-dimensional semiautomated method. Volume changes were evaluated using three methods: percent change, proportional change and growth rate. The 95% limits of agreement were evaluated using the Bland-Altman method. The 95% limits of agreement were as follows: range of percent change, from ±34.5% to ±37.8%; range of proportional change, from ±34.1% to ±36.8%; and range of growth rate, from ±39.2% to ±47.4%. Percent change-based, proportional change-based, and growth rate-based diagnoses of an increase or decrease in ten solid nodules were made at a mean of 302±402, 367±455, and 329±496 days, respectively, compared with a clinical diagnosis made at 809±616 days (P<0.05). The 95% limits of agreement for volume change in 5-year stable solid nodules may enable the detection of an increase or decrease in a solid nodule at an earlier stage than clinical diagnosis, possibly contributing to the development of a follow-up system for reducing the number of additional computed tomography (CT) scans performed during the follow-up period.
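The Bland-Altman 95% limits of agreement used here reduce to the mean ± 1.96 SD of the chosen change metric across stable nodules. A minimal sketch with synthetic volumes, using percent change as the metric (the real measurements are not public):

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic volumes (mm^3) for 226 stable nodules measured at two time points.
v1 = rng.uniform(50, 500, size=226)
v2 = v1 * (1 + rng.normal(0, 0.12, size=226))

pct_change = 100 * (v2 - v1) / v1               # percent change, the abstract's first metric

bias = pct_change.mean()
sd = pct_change.std(ddof=1)
loa = (bias - 1.96 * sd, bias + 1.96 * sd)      # Bland-Altman 95% limits of agreement
print(f"bias = {bias:.1f}%, limits of agreement = {loa[0]:.1f}% to {loa[1]:.1f}%")
# A later follow-up change falling outside these limits would be flagged as real
# growth or shrinkage rather than measurement variability.
```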
Observed Self-Regulation is Associated with Weight in Low-Income Toddlers
Miller, Alison L.; Rosenblum, Katherine L.; Retzloff, Lauren B.; Lumeng, Julie C.
2016-01-01
Obesity emerges in early childhood and tracks across development. Self-regulation develops rapidly during the toddler years, yet few studies have examined toddlers’ self-regulation in relation to concurrent child weight. Further, few studies compare child responses in food and non-food-related tasks. Our goal was to examine toddlers’ observed behavioral and emotional self-regulation in food and non-food tasks in relation to their body mass index z-score (BMIz) and weight status (overweight/obese vs. not). Observational measures were used to assess self-regulation (SR) in four standardized tasks in 133 low-income children (M age=33.1 months; SD=0.6). Behavioral SR was measured by assessing how well the child could delay gratification for a snack (food-related task) and a gift (non-food-related task). Emotional SR was measured by assessing child intensity of negative affect in two tasks designed to elicit frustration: being shown, then denied a cookie (food-related) or a toy (non-food-related). Task order was counterbalanced. BMIz was measured. Bivariate correlations and regression analyses adjusting for child sex, child race/ethnicity, and maternal education were conducted to examine associations of SR with weight. Results were that better behavioral SR in the snack delay task associated with lower BMIz (β=−0.27, p<.05) and lower odds of overweight/obesity (OR=0.66, 95% CI 0.45, 0.96), but behavioral SR in the gift task did not associate with BMIz or weight status. Better emotional SR in the non-food task associated with lower BMIz (β= −0.27, p<.05), and better emotional SR in food and non-food tasks associated with lower odds of overweight/obesity (OR=0.65, 95% CI 0.45, 0.96 and OR=0.56, 95% CI 0.37, 0.87, respectively). Results are discussed regarding how behavioral SR for food and overall emotional SR relate to weight during toddlerhood, and regarding early childhood obesity prevention implications. PMID:27397726
Rossen, Lauren M.; Pollack, Keshia M.; Curriero, Frank C.; Shields, Timothy M.; Smart, Mieka J.; Furr-Holden, C. Debra M.; Cooley-Strickland, Michele
2011-01-01
Background Walking to school is an important source of physical activity among children. There is a paucity of research exploring environmental determinants of walking to school among children in urban areas. Methods A cross-sectional secondary analysis of baseline data (2007) from 365 children in the “Multiple Opportunities to Reach Excellence” (MORE) Study (8 to 13 years; Mean 9.60 years, SD 1.04). Children and caregivers were asked about walking to school and perceived safety. Objective measures of the environment were obtained using a validated environmental neighborhood assessment. Results Over half (55.83%) of children reported walking to school most of the time. High levels of neighborhood incivilities were associated with lower levels of perceived safety (OR: 0.39, 95% CI: 0.21 to 0.72). Living on a block above the median in incivilities was associated with a 353% increase in odds of walking to school (OR: 3.53; 95% CI: 1.68 to 7.39). Conclusions Children residing in neighborhoods high in incivilities are more likely to walk to school, in spite of lower levels of perceived safety. As a high proportion of children residing in disadvantaged neighborhoods walk to school, efforts should be directed at minimizing exposure to neighborhood hazards by ensuring safe routes to and from school. PMID:21415453
Shebl, Fatma M; Bhatia, Kishor; Engels, Eric A
2010-05-15
Individuals with acquired immunodeficiency syndrome (AIDS) manifest an increased risk of cancer, particularly cancers caused by oncogenic viruses. Because some salivary gland and nasopharyngeal cancers are associated with Epstein-Barr virus, the impact of AIDS on these cancers needs further evaluation. We used linked U.S. AIDS and cancer registry data (N = 519,934 people with AIDS) to derive standardized incidence ratios (SIRs) comparing the risk of salivary gland and nasopharyngeal cancers with that in the general population. For salivary gland cancers (N = 43 cases), individuals with AIDS had strongly elevated risks for lymphoepithelial carcinoma (SIR 39, 95% CI 16-81) and squamous cell carcinoma (SIR 4.9, 95% CI 2.5-8.6). Among nasopharyngeal cancers (N = 39 cases), risks were elevated for both keratinizing and nonkeratinizing carcinomas (SIR 2.4, 95% CI 1.5-3.7 and SIR 2.4, 95% CI 1.2-4.4, respectively). The elevated risks of salivary gland and nasopharyngeal cancers among people with AIDS suggest that immunosuppression and oncogenic viral infections are etiologically important.
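A standardized incidence ratio is simply observed/expected cases, with an exact Poisson (Garwood) interval commonly used for the CI. The sketch below uses invented counts, since the underlying observed and expected case numbers are not given in the abstract.

```python
from scipy.stats import chi2

def sir_with_ci(observed, expected, alpha=0.05):
    """Standardized incidence ratio with an exact (Garwood) Poisson CI.

    `expected` is the count implied by applying general-population rates to the
    cohort's person-time; both numbers below are invented for illustration.
    """
    lo = chi2.ppf(alpha / 2, 2 * observed) / 2 if observed > 0 else 0.0
    hi = chi2.ppf(1 - alpha / 2, 2 * (observed + 1)) / 2
    return observed / expected, lo / expected, hi / expected

print(sir_with_ci(observed=12, expected=0.31))   # SIR around 39, the same order as the quoted result
```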
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rubinsztein, D.C.; Leggo, J.; Whittaker, J.L.
1996-07-01
Abnormal CAG expansions in the IT-15 gene are associated with Huntington disease (HD). In the diagnostic setting it is necessary to define the limits of the CAG size ranges on normal and HD-associated chromosomes. Most large analyses that defined the limits of the normal and pathological size ranges employed PCR assays, which included the CAG repeats and a CCG repeat tract that was thought to be invariant. Many of these experiments found an overlap between the normal and disease size ranges. Subsequent findings that the CCG repeats vary by 9 trinucleotide lengths suggested that the limits of the normal and disease size ranges should be reevaluated with assays that exclude the CCG polymorphism. Since patients with between 30 and 40 repeats are rare, a consortium was assembled to collect such individuals. All 178 samples were reanalyzed in Cambridge by using assays specific for the CAG repeats. We have optimized methods for reliable sizing of CAG repeats and show cases that demonstrate the dangers of using PCR assays that include both the CAG and CCG polymorphisms. Seven HD patients had 36 repeats, which confirms that this allele is associated with disease. Individuals without apparent symptoms or signs of HD were found at 36 repeats (aged 74, 78, 79, and 87 years), 37 repeats (aged 69 years), 38 repeats (aged 69 and 90 years), and 39 repeats (aged 67, 90, and 95 years). The detailed case history of an exceptional case from this series is presented: a 95-year-old man with 39 repeats who did not have classical features of HD. The apparently healthy survival into old age of some individuals with 36-39 repeats suggests that the HD mutation may not always be fully penetrant. 26 refs., 3 figs., 1 tab.
Henning, Daniel J; Puskarich, Michael A; Self, Wesley H; Howell, Michael D; Donnino, Michael W; Yealy, Donald M; Jones, Alan E; Shapiro, Nathan I
2017-10-01
The Third International Consensus Definitions Task Force (SEP-3) proposed revised criteria defining sepsis and septic shock. We seek to evaluate the performance of the SEP-3 definitions for prediction of inhospital mortality in an emergency department (ED) population and compare the performance of the SEP-3 definitions to that of the previous definitions. This was a secondary analysis of 3 prospectively collected, observational cohorts of infected ED subjects aged 18 years or older. The primary outcome was all-cause inhospital mortality. In accordance with the SEP-3 definitions, we calculated test characteristics of sepsis (quick Sequential Organ Failure Assessment [qSOFA] score ≥2) and septic shock (vasopressor dependence plus lactate level >2.0 mmol/L) for mortality and compared them to the original 1992 consensus definitions. We identified 7,754 ED patients with suspected infection overall; 117 had no documented mental status evaluation, leaving 7,637 patients included in the analysis. The mortality rate for the overall population was 4.4% (95% confidence interval [CI] 3.9% to 4.9%). The mortality rate for patients with qSOFA score greater than or equal to 2 was 14.2% (95% CI 12.2% to 16.2%), with a sensitivity of 52% (95% CI 46% to 57%) and specificity of 86% (95% CI 85% to 87%) to predict mortality. The original systemic inflammatory response syndrome-based 1992 consensus sepsis definition had a 6.8% (95% CI 6.0% to 7.7%) mortality rate, sensitivity of 83% (95% CI 79% to 87%), and specificity of 50% (95% CI 49% to 51%). The SEP-3 septic shock mortality was 23% (95% CI 16% to 30%), with a sensitivity of 12% (95% CI 11% to 13%) and specificity of 98.4% (95% CI 98.1% to 98.7%). The original 1992 septic shock definition had a 22% (95% CI 17% to 27%) mortality rate, sensitivity of 23% (95% CI 18% to 28%), and specificity of 96.6% (95% CI 96.2% to 97.0%). Both the new SEP-3 and original sepsis definitions stratify ED patients at risk for mortality, albeit with differing performances. In terms of mortality prediction, the SEP-3 definitions had improved specificity, but at the cost of sensitivity. Use of either approach requires a clearly intended target: more sensitivity versus specificity. Copyright © 2017 American College of Emergency Physicians. Published by Elsevier Inc. All rights reserved.
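The sensitivity, specificity, and mortality-rate figures quoted here come from cross-tabulating the screening criterion against in-hospital death. The sketch below reconstructs plausible 2x2 counts that are only roughly consistent with the reported percentages (they are assumptions, not the study's actual table) and applies normal-approximation CIs.

```python
import numpy as np

# 2x2 of qSOFA >= 2 against in-hospital death; counts are an approximate
# reconstruction from the reported percentages, not the study's data.
tp, fn, fp, tn = 175, 161, 1022, 6279

sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
mortality_if_positive = tp / (tp + fp)

def wald_ci(p, n, z=1.96):
    half = z * np.sqrt(p * (1 - p) / n)     # normal-approximation CI, as commonly reported
    return p - half, p + half

print("sensitivity:", sensitivity, wald_ci(sensitivity, tp + fn))
print("specificity:", specificity, wald_ci(specificity, tn + fp))
print("mortality with qSOFA >= 2:", mortality_if_positive, wald_ci(mortality_if_positive, tp + fp))
```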
Raevuori, Anu; Dick, Danielle M.; Keski-Rahkonen, Anna; Pulkkinen, Lea; Rose, Richard J.; Rissanen, Aila; Kaprio, Jaakko; Viken, Richard J.; Silventoinen, Karri
2007-01-01
Background We analysed genetic and environmental influences on self-esteem and its stability across adolescence. Methods Finnish twins born in 1983–1987 were assessed by questionnaire at ages 14y (N= 4132 twin individuals) and 17y (N=3841 twin individuals). Self esteem was measured using the Rosenberg global self-esteem scale and analyzed using quantitative genetic methods for twin data in the Mx statistical package. Results The heritability of self-esteem was 0.62 (95% CI 0.56–0.68) in 14-y-old boys and 0.40 (95% CI 0.26–0.54) in 14-y-old girls, while the corresponding estimates at age 17y were 0.48 (95% CI 0.39–0.56) and 0.29 (95% CI 0.11–0.45). Rosenberg self-esteem scores at age 14 y and 17 y were modestly correlated (r=0.44 in boys, r=0.46 in girls). In boys, the correlation was mainly (82%) due to genetic factors, with residual co-variation due to unique environment. In girls, genetic (31%) and common environmental (61%) factors largely explained the correlation. Conclusions In adolescence, self-esteem seems to be differently regulated in boys versus girls. A key challenge for future research is to identify environmental influences contributing to self-esteem during adolescence and how these factors interact with genetic influences. PMID:17537282
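As a rough intuition for how twin data yield heritability estimates, Falconer's formula contrasts MZ and DZ twin correlations. The study itself fits full ACE models in the Mx package, so the sketch below, with invented correlations, is only an approximation of the same decomposition.

```python
# Falconer-style variance decomposition from twin correlations; the correlations
# below are invented, not the study's values.
r_mz, r_dz = 0.60, 0.35

a2 = 2 * (r_mz - r_dz)      # additive genetic variance (heritability)
c2 = 2 * r_dz - r_mz        # common (shared) environment
e2 = 1 - r_mz               # unique environment plus measurement error

print(f"A = {a2:.2f}, C = {c2:.2f}, E = {e2:.2f}")
```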
[Determination of carvacrol and thymol in Mosla chinensis by HPLC].
Ji, Li; Wang, Fang; Liu, Yuan-yan; Tong, Yan; Li, Xian-duan; Feng, Xue-feng; Huang, Lu-qi; Zhou, Guo-ping
2004-11-01
To establish a quantitative HPLC method for the determination of carvacrol and thymol in Mosla chinensis. The sample was extracted with 95% ethanol, and an ODS column was used with methanol-water-acetic acid (60:40:2) as the mobile phase. The detection wavelength was set at 274 nm. Carvacrol and thymol showed linear responses in the ranges of 0.23-2.15 microg (r = 0.9999) and 0.39-2.36 microg (r = 0.9999), respectively; the average recoveries were 99.9% (RSD 1.4%) and 98.6% (RSD 1.3%); the RSDs for repeatability were 1.1% and 1.6%. The method is reliable and can be used for quality control of M. chinensis.
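The quantification arithmetic behind an HPLC assay of this kind is a linear calibration curve, back-calculation of unknowns, spike recovery, and repeatability RSD. The peak areas in the sketch are invented but span roughly the reported carvacrol range.

```python
import numpy as np
from scipy.stats import linregress

# Invented calibration data over roughly the reported carvacrol range.
amount = np.array([0.23, 0.60, 1.00, 1.50, 2.15])      # µg injected
area = np.array([12.1, 31.5, 52.4, 78.8, 112.9])       # peak area, arbitrary units

fit = linregress(amount, area)
print("r =", round(fit.rvalue, 4))                     # linearity check

def quantify(peak_area):
    return (peak_area - fit.intercept) / fit.slope     # back-calculate µg from peak area

# Recovery: a sample spiked with 1.00 µg gives a (made-up) peak area of 52.4.
added, found = 1.00, quantify(52.4)
print("recovery %:", 100 * found / added)

# Repeatability: RSD of replicate injections of one sample.
replicates = np.array([51.8, 52.6, 52.1, 52.9, 51.9, 52.4])
print("RSD %:", 100 * replicates.std(ddof=1) / replicates.mean())
```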
Akilimali, P Z; Mutombo, P B; Kayembe, P K; Kaba, D K; Mapatano, M A
2014-06-01
The study aimed to identify factors associated with the survival of patients receiving antiretroviral therapy. A historic cohort of HIV patients from two major hospitals in Goma (Democratic Republic of Congo) was followed from 2004 to 2012. The Kaplan-Meier method was used to describe the probability of survival as a function of time since inclusion into the cohort. The log-rank test was used to compare survival curves based on determinants. The Cox regression model identified the determinants of survival since treatment induction. The median follow-up time was 3.56 years (IQR=2.22-5.39). The mortality rate was 40 deaths per 1000 person-years. Male gender (RR: 2.56; 95 %CI 1.66-4.83), advanced clinical stage (RR: 2.12; 95 %CI 1.15-3.90), low CD4 count (CD4 < 50) (RR: 2.05; 95 %CI : 1.22-3.45), anemia (RR: 3.95; 95 %CI 2.60-6.01), chemoprophylaxis with cotrimoxazole (RR: 4.29, 95 % CI 2.69-6.86) and period of treatment initiation (2010-2011) (RR: 3.34; 95 %CI 1.24-8.98) were statistically associated with short survival. Initiation of treatment at an early stage of the disease with use of less toxic molecules and an increased surveillance especially of male patients are recommended to reduce mortality. Copyright © 2014 Elsevier Masson SAS. All rights reserved.
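The survival workflow described (Kaplan-Meier estimates, then a Cox model for determinants of mortality) maps directly onto the lifelines API. In the sketch below the bundled Rossi recidivism dataset stands in for the HIV cohort, which is not public, so the time, event, and covariate columns are placeholders rather than the study's variables.

```python
from lifelines import KaplanMeierFitter, CoxPHFitter
from lifelines.datasets import load_rossi

# Stand-in data: 'week' plays the role of follow-up time and 'arrest' the event.
df = load_rossi()

kmf = KaplanMeierFitter()
kmf.fit(durations=df["week"], event_observed=df["arrest"])
print(kmf.median_survival_time_)        # Kaplan-Meier summary of time to event

cph = CoxPHFitter()
cph.fit(df, duration_col="week", event_col="arrest")
cph.print_summary()                     # hazard ratios with 95% CIs, analogous to the reported RRs
```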
DOE Office of Scientific and Technical Information (OSTI.GOV)
Videtic, Gregory M.M., E-mail: videtig@ccf.org; Hu, Chen; Johns Hopkins University School of Medicine, Baltimore, Maryland
Purpose: To compare 2 stereotactic body radiation therapy (SBRT) schedules for medically inoperable early-stage lung cancer to determine which produces the lowest rate of grade ≥3 protocol-specified adverse events (psAEs) at 1 year. Methods and Materials: Patients with biopsy-proven peripheral (≥2 cm from the central bronchial tree) T1 or T2, N0 (clinically node negative by positron emission tomography), M0 tumors were eligible. Patients were randomized to receive either 34 Gy in 1 fraction (arm 1) or 48 Gy in 4 consecutive daily fractions (arm 2). Rigorous central accreditation and quality assurance confirmed treatment per protocol guidelines. This study was designed to detect a psAEs rate >17% at a 10% significance level (1-sided) and 90% power. Secondary endpoints included rates of primary tumor control (PC), overall survival (OS), and disease-free survival (DFS) at 1 year. Designating the better of the 2 regimens was based on prespecified rules of psAEs and PC for each arm. Results: Ninety-four patients were accrued between September 2009 and March 2011. The median follow-up time was 30.2 months. Of 84 analyzable patients, 39 were in arm 1 and 45 in arm 2. Patient and tumor characteristics were balanced between arms. Four (10.3%) patients on arm 1 (95% confidence interval [CI] 2.9%-24.2%) and 6 (13.3%) patients on arm 2 (95% CI 5.1%-26.8%) experienced psAEs. The 2-year OS rate was 61.3% (95% CI 44.2%-74.6%) for arm 1 patients and 77.7% (95% CI 62.5%-87.3%) for arm 2. The 2-year DFS was 56.4% (95% CI 39.6%-70.2%) for arm 1 and 71.1% (95% CI 55.5%-82.1%) for arm 2. The 1-year PC rate was 97.0% (95% CI 84.2%-99.9%) for arm 1 and 92.7% (95% CI 80.1%-98.5%) for arm 2. Conclusions: 34 Gy in 1 fraction met the prespecified criteria and, of the 2 schedules, warrants further clinical research.
Hutcheon, J A; Strumpf, E C; Harper, S; Giesbrecht, E
2015-08-01
To evaluate the extent to which implementing a hospital policy to limit planned caesarean deliveries before 39 weeks of gestation improved neonatal health, maternal health, and healthcare costs. Retrospective cohort study. British Columbia Women's Hospital, Vancouver, Canada, in the period 2005-2012. Women with a low-risk planned repeat caesarean delivery. An interrupted time series design was used to evaluate the policy to limit planned caesarean deliveries before 39 weeks of gestation, introduced on 1 April 2008. Composite adverse neonatal health outcome (respiratory morbidity, 5-minute Apgar score of <7, neonatal intensive care unit admission, mortality), postpartum haemorrhage, obstetrical wound infection, out-of-hour deliveries, length of stay, and healthcare costs. Between 2005 and 2008, 60% (1204/2021) of low-risk planned caesarean deliveries were performed before 39 weeks of gestation. After the introduction of the policy, the proportion of planned caesareans dropped by 20 percentage points (adjusted risk difference of 20 fewer cases per 100 deliveries; 95% CI -25.8, -14.3) to 41% (1033/2518). The policy had no detectable impact on adverse neonatal outcomes (2.2 excess cases per 100; 95% CI -0.4, 4.8), maternal complications, or healthcare costs, but increased the risk of out-of-hours delivery from 16.2 to 21.1% (adjusted risk difference 6.3 per 100; 95% CI 1.6, 10.9). We found little evidence that a hospital policy to limit planned caesareans before 39 weeks of gestation reduced adverse neonatal outcomes. Hospital administrators intending to introduce such policies should anticipate, and plan for, modest increases in out-of-hours and emergency-timing. © 2015 Royal College of Obstetricians and Gynaecologists.
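The interrupted time series design used here is typically analysed as a segmented regression with a level-change and a slope-change term at the policy date. A sketch on simulated monthly proportions follows; the month index for April 2008, the column names, and the noise model are assumptions, not the study's data.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)

# Simulated monthly proportion of planned repeat caesareans done before 39 weeks;
# month 39 corresponds to April 2008 if the series starts in January 2005 (assumed).
months = np.arange(96)
policy = (months >= 39).astype(int)                    # step (level-change) term
time_since = np.where(policy == 1, months - 39, 0)     # slope-change term

p_early = (0.60 - 0.0005 * months - 0.20 * policy - 0.001 * time_since
           + rng.normal(0, 0.03, months.size))
df = pd.DataFrame({"p_early": p_early, "month": months,
                   "policy": policy, "time_since": time_since})

# Segmented regression; HAC standard errors allow for autocorrelation in the series.
model = smf.ols("p_early ~ month + policy + time_since", data=df).fit(
    cov_type="HAC", cov_kwds={"maxlags": 3})
print(model.params["policy"])            # estimated immediate drop after the policy
print(model.conf_int().loc["policy"])    # its 95% CI
```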
Berryman, Carolyn; Wise, Vikki; Stanton, Tasha R; McFarlane, Alexander; Moseley, G Lorimer
2017-02-01
Somatic hypervigilance describes a clinical presentation in which people report more, and more intense, bodily sensations than is usual. Most explanations of somatic hypervigilance implicate altered information processing, but strong empirical data are lacking. Attention and working memory are critical for information processing, and we aimed to evaluate brain activity during attention/working memory tasks in people with and without somatic hypervigilance. Data from 173 people with somatic hypervigilance and 173 controls matched for age, gender, handedness, and years of education were analyzed. Event-related potential (ERP) data, extracted from the continuous electroencephalograph recordings obtained during performance of the Auditory Oddball task and the Two In A Row (TIAR) task, for N1, P2, N2, and P3, were used in the analysis. Between-group differences for P3 amplitude and N2 amplitude and latency were assessed with two-tailed independent t tests. Between-group differences for N1 and P2 amplitude and latency were assessed using mixed, repeated measures analyses of variance (ANOVAs) with group and Group × Site factors. Linear regression analysis investigated the relationship between anxiety and depression and any outcomes of significance. People with somatic hypervigilance showed smaller P3 amplitudes than case-matched controls (Auditory Oddball task: t(285) = 2.32, 95% confidence interval (CI) [3.48, 4.47], p = .026, d = 0.27; TIAR task: t(334) = 2.23, 95% CI [2.20, 3.95], p = .021, d = 0.24). N2 amplitude was also smaller in people with somatic hypervigilance than in case-matched controls (TIAR task: t(318) = 2.58, 95% CI [0.33, 2.47], p = .010, d = 0.29). Neither depression nor anxiety was significantly associated with any outcome. People with somatic hypervigilance demonstrated an event-related potential response to attention/working memory tasks that is consistent with altered information processing.
Enhancer of Zeste Homolog 2 as an Independent Prognostic Marker for Cancer: A Meta-Analysis
Sun, Kaiyu; Wu, Dexi; Li, Minrui; Li, Manying; Zhong, Bihui; Chen, Minhu; Zhang, Shenghong
2015-01-01
Background Novel biomarkers are of particular interest for predicting cancer prognosis. This study aimed to explore the associations between enhancer of zeste homolog 2 (EZH2) and patient survival in various cancers. Methods Relevant literature was retrieved from PubMed and Web of Science databases. Pooled hazard ratios (HRs), odds ratios (ORs), and 95% confidence intervals (CIs) were calculated. Results Forty-nine studies (8,050 patients) were included. High EZH2 expression was significantly associated with shorter overall (hazard ratio [HR] 1.74, 95% CI: 1.46–2.07), disease-free (HR 1.59, 95% CI: 1.27–1.99), metastasis-free (HR 2.19, 95% CI: 1.38–3.47), progression-free (HR 2.53, 95% CI: 1.52–4.21), cancer-specific (HR 3.13, 95% CI: 1.70–5.74), and disease-specific (HR 2.29, 95% CI: 1.56–3.35) survival, but not recurrence-free survival (HR 1.38, 95% CI: 0.93–2.06). Moreover, EZH2 expression significantly correlated with distant metastasis (OR 3.25, 95% CI: 1.07–9.87) in esophageal carcinoma; differentiation (OR 3.00, 95% CI: 1.37–6.55) in non-small cell lung cancer; TNM stage (OR 3.18, 95% CI: 2.49–4.08) in renal cell carcinoma; and histological grade (OR 4.50, 95% CI: 3.33–6.09), estrogen receptor status (OR 0.15, 95% CI: 0.11–0.20) and progesterone receptor status (OR 0.30, 95% CI: 0.23–0.39) in breast cancer. Conclusions Our results suggested that EZH2 might be an independent prognostic factor for multiple survival measures in different cancers. PMID:25974088
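Pooled hazard ratios of the kind reported here are usually obtained by inverse-variance weighting of log(HR) with a DerSimonian-Laird estimate of between-study variance. The sketch below shows that arithmetic on three invented studies, not on the meta-analysis data.

```python
import numpy as np

def dersimonian_laird(log_hr, se):
    """Random-effects pooling of log hazard ratios (DerSimonian-Laird)."""
    w = 1 / se**2                                    # fixed-effect (inverse-variance) weights
    fixed = np.sum(w * log_hr) / np.sum(w)
    q = np.sum(w * (log_hr - fixed) ** 2)            # Cochran's Q
    dfree = len(log_hr) - 1
    c = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2 = max(0.0, (q - dfree) / c)                 # between-study variance
    w_re = 1 / (se**2 + tau2)                        # random-effects weights
    pooled = np.sum(w_re * log_hr) / np.sum(w_re)
    se_pooled = np.sqrt(1 / np.sum(w_re))
    hr = np.exp(pooled)
    ci = np.exp(pooled - 1.96 * se_pooled), np.exp(pooled + 1.96 * se_pooled)
    return hr, ci

# Three invented studies: per-study log(HR) and standard errors.
log_hr = np.log(np.array([1.9, 1.5, 1.8]))
se = np.array([0.20, 0.15, 0.25])
print(dersimonian_laird(log_hr, se))
```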
Quality of childcare influences children's attentiveness and emotional regulation at school entry.
Gialamas, Angela; Sawyer, Alyssa C P; Mittinty, Murthy N; Zubrick, Stephen R; Sawyer, Michael G; Lynch, John
2014-10-01
To examine the association between domain-specific qualities of formal childcare at age 2-3 years and children's task attentiveness and emotional regulation at age 4-5 and 6-7 years. We used data from the Longitudinal Study of Australian Children (n = 1038). Three domain-specific aspects of childcare quality were assessed: provider and program characteristics of care, activities in childcare, and the carer-child relationship. Two self-regulatory abilities were considered: task attentiveness and emotional regulation. Associations between domain-specific qualities of childcare and self-regulation were investigated in linear regression analyses adjusted for confounding, with imputation for missing data. There was no association between any provider or program characteristics of care and children's task attentiveness and emotional regulation. The quality of activities in childcare was associated only with higher levels of emotional regulation at age 4-5 years (β = 0.24; 95% CI, 0.03-0.44) and 6-7 years (β = 0.26; 95% CI, 0.04-0.48). Higher-quality carer-child relationships were associated with higher levels of task attentiveness (β = 0.20; 95% CI, 0.05-0.36) and emotional regulation at age 4-5 years (β = 0.19; 95% CI, 0.04-0.34) that persisted to age 6-7 years (β = 0.26; 95% CI, 0.10-0.42; β = 0.31; 95% CI, 0.16-0.47). Among children using formal childcare, those who experienced higher-quality relationships were better able to regulate their attention and emotions as they started school. Higher emotional regulation was also observed for children engaged in more activities in childcare. Beneficial effects were stable over time. Copyright © 2014 Elsevier Inc. All rights reserved.
Suzan-Monti, M; Blanche, J; Boyer, S; Kouanfack, C; Delaporte, E; Bonono, R-C; Carrieri, P M; Protopopescu, C; Laurent, C; Spire, B
2015-05-01
The World Health Organization (WHO) recommends task-shifting HIV care to nurses in low-resource settings with limited numbers of physicians. However, the effect of such task-shifting on the health-related quality of life (HRQL) of people living with HIV (PLHIV) has seldom been evaluated. We aimed to investigate the effect of task-shifting HIV care to nurses on HRQL outcomes in PLHIV initiating antiretroviral therapy (ART) in rural district hospitals in Cameroon. Outcomes in PLHIV were longitudinally collected in the 2006-2010 Stratall trial. PLHIV were followed up for 24 months by nurses and/or physicians. Six HRQL dimensions were assessed during face-to-face interviews using the WHO Quality of Life (WHOQOL)-HIV BREF scale: physical health; psychological health; independence level; social relationships; environment; and spirituality/religion/personal beliefs. The degree of task-shifting was estimated using a consultant ratio (i.e. the ratio of nurse-led to physician-led visits). The effect of task-shifting and other potential correlates on HRQL dimensions was explored using a Heckman two-stage approach based on linear mixed models to adjust for the potential bias caused by missing data in the outcomes. Of 1424 visits in 440 PLHIV (70.5% female; median age 36 years; median CD4 count 188 cells/μL at enrolment), 423 (29.7%) were task-shifted to nurses. After multiple adjustment, task-shifting was associated with higher HRQL level for four dimensions: physical health [coefficient 0.7; 95% confidence interval (CI) 0.1-1.2; P = 0.01], psychological health (coefficient 0.5; 95% CI 0.0-1.0; P = 0.05), independence level (coefficient 0.6; 95% CI 0.1-1.1; P = 0.01) and environment (coefficient 0.6; 95% CI 0.1-1.0; P = 0.02). Task-shifting HIV care to nurses benefits the HRQL of PLHIV. Together with the previously demonstrated comparable clinical effectiveness of physician-based and nurse-based models of HIV care, our results support the WHO recommendation for task-shifting. © 2015 British HIV Association.
Ratliff, Kristin R; Newcombe, Nora S
2008-03-01
Being able to reorient to the spatial environment after disorientation is a basic adaptive challenge. There is clear evidence that reorientation uses geometric information about the shape of the surrounding space. However, there has been controversy concerning whether use of geometry is a modular function, and whether use of features is dependent on human language. A key argument for the role of language comes from shadowing findings where adults engaged in a linguistic task during reorientation ignored a colored wall feature and only used geometric information to reorient [Hermer-Vazquez, L., Spelke, E., & Katsnelson, A. (1999). Sources of flexibility in human cognition: Dual task studies of space and language. Cognitive Psychology, 39, 3-36]. We report three studies showing: (a) that the results of Hermer-Vazquez et al. [Hermer-Vazquez, L., Spelke, E., & Katsnelson, A. (1999). Sources of flexibility in human cognition: Dual task studies of space and language. Cognitive Psychology, 39, 3-36] are obtained in incidental learning but not with explicit instructions, (b) that a spatial task impedes use of features at least as much as a verbal shadowing task, and (c) that neither secondary task impedes use of features in a room larger than that used by Hermer-Vazquez et al. These results suggest that language is not necessary for successful use of features in reorientation. In fact, whether or not there is an encapsulated geometric module is currently unsettled. The current findings support an alternative to modularity; the adaptive combination view hypothesizes that geometric and featural information are utilized in varying degrees, dependent upon the certainty and variance with which the two kinds of information are encoded, along with their salience and perceived usefulness.
Glossary of Terms--Nuclear Weapon Phenomena and Effects
1985-02-15
combat ineffective. An individual whose injuries are of such nature that he is no longer capable of carrying out his assigned task...significantly harmful result [39]. That dose of ionizing radiation that is not expected to cause appreciable bodily injury to a person at any time during his...art and science of protecting human beings from injury by radiation, and promoting better health through beneficial applications of radiation [39
Jonas, Daniel E; Amick, Halle R; Feltner, Cynthia; Weber, Rachel Palmieri; Arvanitis, Marina; Stine, Alexander; Lux, Linda; Harris, Russell P
2017-01-24
Many adverse health outcomes are associated with obstructive sleep apnea (OSA). To review primary care-relevant evidence on screening adults for OSA, test accuracy, and treatment of OSA, to inform the US Preventive Services Task Force. MEDLINE, Cochrane Library, EMBASE, and trial registries through October 2015, references, and experts, with surveillance of the literature through October 5, 2016. English-language randomized clinical trials (RCTs); studies evaluating accuracy of screening questionnaires or prediction tools, diagnostic accuracy of portable monitors, or association between apnea-hypopnea index (AHI) and health outcomes among community-based participants. Two investigators independently reviewed abstracts and full-text articles. When multiple similar studies were available, random-effects meta-analyses were conducted. Sensitivity, specificity, area under the curve (AUC), AHI, Epworth Sleepiness Scale (ESS) scores, blood pressure, mortality, cardiovascular events, motor vehicle crashes, quality of life, and harms. A total of 110 studies were included (N = 46 188). No RCTs compared screening with no screening. In 2 studies (n = 702), the screening accuracy of the multivariable apnea prediction score followed by home portable monitor testing for detecting severe OSA syndrome (AHI ≥30 and ESS score >10) was AUC 0.80 (95% CI, 0.78 to 0.82) and 0.83 (95% CI, 0.77 to 0.90), respectively, but the studies oversampled high-risk participants and those with OSA and OSA syndrome. No studies prospectively evaluated screening tools to report calibration or clinical utility for improving health outcomes. Meta-analysis found that continuous positive airway pressure (CPAP) compared with sham was significantly associated with reduction of AHI (weighted mean difference [WMD], -33.8 [95% CI, -42.0 to -25.6]; 13 trials, 543 participants), excessive sleepiness assessed by ESS score (WMD, -2.0 [95% CI, -2.6 to -1.4]; 22 trials, 2721 participants), diurnal systolic blood pressure (WMD, -2.4 points [95% CI, -3.9 to -0.9]; 15 trials, 1190 participants), and diurnal diastolic blood pressure (WMD, -1.3 points [95% CI, -2.2 to -0.4]; 15 trials, 1190 participants). CPAP was associated with modest improvement in sleep-related quality of life (Cohen d, 0.28 [95% CI, 0.14 to 0.42]; 13 trials, 2325 participants). Mandibular advancement devices (MADs) and weight loss programs were also associated with reduced AHI and excessive sleepiness. Common adverse effects of CPAP and MADs included oral or nasal dryness, irritation, and pain, among others. In cohort studies, there was a consistent association between AHI and all-cause mortality. There is uncertainty about the accuracy or clinical utility of all potential screening tools. Multiple treatments for OSA reduce AHI, ESS scores, and blood pressure. Trials of CPAP and other treatments have not established whether treatment reduces mortality or improves most other health outcomes, except for modest improvement in sleep-related quality of life.
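The screening-accuracy results in this review are summarized as AUCs with confidence intervals; one simple way to produce that kind of summary is an AUC with a bootstrap CI. The sketch below uses simulated screening scores and outcomes, not the reviewed studies' data, and the prevalence and effect size are assumptions.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(4)

# Simulated screening scores for severe OSA syndrome; higher scores in cases.
n = 700
severe_osa = rng.binomial(1, 0.15, n)
score = rng.normal(0, 1, n) + 1.3 * severe_osa

auc = roc_auc_score(severe_osa, score)

# Percentile bootstrap for the CI.
boot = []
for _ in range(2000):
    idx = rng.integers(0, n, n)
    if severe_osa[idx].min() == severe_osa[idx].max():   # need both classes in a resample
        continue
    boot.append(roc_auc_score(severe_osa[idx], score[idx]))
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"AUC = {auc:.2f} (95% CI {lo:.2f} to {hi:.2f})")
```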
Role delineation study for the American Society for Pain Management Nursing.
Willens, Joyce S; DePascale, Christine; Penny, James
2010-06-01
A role delineation study, or job analysis, is a necessary step in the development of a quality credentialing program. The process requires a logical approach and systematic methods to have an examination that is legally defensible. There are three main phases: initial development and evaluation, validation study, and development of test specifications. In the first phase, the content expert panel discussed performance domains that exist in pain management nursing. The six domains developed were: 1) assessment, monitoring, and evaluation of pain; 2) pharmacologic pain management; 3) nonpharmacologic pain management; 4) therapeutic communication and counseling; 5) patient and family teaching; and 6) collaborative and organizational activities. The panel then produced a list of 70 task statements to develop an online survey, which was sent to independent reviewers with expertise in pain management nursing. After the panel reviewed the results of the pilot test, it was decided to clarify a few items that did not perform as expected. After the questionnaire was finalized it was distributed to 1,500 pain management nurses. The final yield was 585 usable returns, for a response rate of 39%. Thirty-three percent of the respondents reported a bachelor's degree in nursing as the highest degree awarded. Over 80% indicated that they were certified in pain management. Over 35% reported working in a staff position, 14% as a nurse practitioner, and 13% as a clinical nurse specialist. Part of the questionnaire asked the participants to rate performance expectation, consequence (the likelihood that the newly certified pain management nurse could cause harm), and the frequency with which that nurse performs in each of the performance domains. The performance expectation was rated from 0 (the newly certified pain management nurse was not at all expected to perform the domain task) to 2 (after 6 months the newly certified pain management nurse would be expected to perform the domain task). The consequence rating, defined as the degree of harm that could result if the newly certified pain management nurse were unable to perform duties or tasks in each domain, was scored from 0 (no harm) to 4 (extreme harm). The first domain received the highest average frequency rating. The pharmacologic domain received the highest mean rating on consequence. The reliability of all scales was 0.95 or higher, which indicated that the questionnaire consistently measured what it was intended to measure. The quality of the questionnaire is an indicator that certification is one measure of nursing excellence. (c) 2010 American Society for Pain Management Nursing. Published by Elsevier Inc. All rights reserved.
Relationship between the exocrine and endocrine pancreas after acute pancreatitis
Das, Stephanie L M; Kennedy, James I C; Murphy, Rinki; Phillips, Anthony R J; Windsor, John A; Petrov, Maxim S
2014-01-01
AIM: To determine the prevalence and time course of pancreatic exocrine insufficiency in individuals with newly diagnosed prediabetes or diabetes mellitus after acute pancreatitis. METHODS: Relevant literature cited in three major biomedical journal databases (EMBASE, MEDLINE, and Scopus) was reviewed independently by two authors. There were no language constraints but the search was limited to human studies. Studies included were cohort studies of adult patients who were discharged after an attack of acute pancreatitis. Patients were excluded if they were under 18 years of age or had a previous diagnosis of prediabetes or diabetes mellitus, pancreatic exocrine insufficiency, or chronic pancreatitis. The main outcome measure was the prevalence of concomitant pancreatic exocrine insufficiency in patients who were diagnosed with prediabetes and diabetes mellitus after an attack of acute pancreatitis. Subgroup analysis was conducted for patients who were diagnosed with prediabetes only and those who were diagnosed with diabetes mellitus only. Subgroup analysis looking at the time course of concomitant pancreatic exocrine and endocrine insufficiency was also conducted. Pooled prevalence and corresponding 95% confidence intervals were calculated for all outcome measures and P-values < 0.05 were deemed statistically significant. RESULTS: Eight clinical studies comprising of 234 patients met all eligibility criteria. The pooled prevalence of newly diagnosed prediabetes or diabetes in individuals after acute pancreatitis was 43% (95%CI: 30%-56%). The pooled prevalence of pancreatic exocrine insufficiency in individuals after acute pancreatitis was 29% (95%CI: 19%-39%). The prevalence of concomitant pancreatic exocrine insufficiency in individuals with newly diagnosed prediabetes or diabetes was 40% (95%CI: 25%-55%). The prevalence of concomitant pancreatic exocrine insufficiency among individuals with prediabetes alone and diabetes mellitus alone was 41% (95%CI: 12%-75%) and 39% (95%CI: 28%-51%), respectively. Further analysis showed that the prevalence of concomitant pancreatic exocrine insufficiency in individuals with prediabetes or diabetes decreases over time after an attack of acute pancreatitis. CONCLUSION: Pancreatic exocrine insufficiency occurs in 40% of individuals with newly diagnosed prediabetes or diabetes mellitus after acute pancreatitis. Further studies are needed to investigate the pathogenesis of diabetes in this setting. PMID:25493036
Complications of Laparoscopic Cholecystectomy: Our Experience from a Retrospective Analysis
Radunovic, Miodrag; Lazovic, Ranko; Popovic, Natasa; Magdelinic, Milorad; Bulajic, Milutin; Radunovic, Lenka; Vukovic, Marko; Radunovic, Miroslav
2016-01-01
AIM: The aim of this study was to evaluate the intraoperative and postoperative complications of laparoscopic cholecystectomy, as well as the frequency of conversions. MATERIAL AND METHODS: Medical records of 740 patients who had laparoscopic cholecystectomy were analysed retrospectively. We evaluated patients for the presence of potential risk factors that could predict the development of complications, such as age, gender, body mass index, white blood cell count and C-reactive protein (CRP), gallbladder ultrasonographic findings, and pathohistological analysis of removed gallbladders. The correlation between these risk factors was also analysed. RESULTS: There were 97 (13.1%) intraoperative complications (IOC). Iatrogenic perforation of the gallbladder was the most common complication (39 patients; 5.27%). Among the postoperative complications (POC), the most common were bleeding from the abdominal cavity (27 patients; 3.64%), biliary duct leaks (14; 1.89%), and infection of the surgical wound (7; 0.94%). There were 29 conversions (3.91%). The presence of more than one complication was more common in males (OR = 2.95, 95% CI 1.42-4.23, p < 0.001). An especially high incidence of complications was noted in patients with elevated white blood cell count (OR = 3.98, 95% CI 1.68-16.92, p < 0.01) and CRP (OR = 2.42, 95% CI 1.23-12.54, p < 0.01). An increased incidence of complications was also noted in patients with an ultrasonographic finding of gallbladder empyema and a gallbladder wall thickness > 3 mm (OR = 4.63, 95% CI 1.56-17.33, p < 0.001), as well as in patients with acute cholecystitis confirmed by pathohistological analysis (OR = 1.75, 95% CI 2.39-16.46, p < 0.001). CONCLUSION: Adopting laparoscopic cholecystectomy as a new technique for the treatment of cholelithiasis introduced a new spectrum of complications. Major biliary and vascular complications are life-threatening, while minor complications cause patient discomfort and prolongation of the hospital stay. It is important to recognise IOC during surgery so that they can be addressed in a timely manner during the surgical intervention. Conversion should not be considered a complication. PMID:28028405
Loy, See Ling; Lek, Ngee; Yap, Fabian; Soh, Shu E.; Padmapriya, Natarajan; Tan, Kok Hian; Biswas, Arijit; Yeo, George Seow Heong; Kwek, Kenneth; Gluckman, Peter D.; Godfrey, Keith M.; Saw, Seang Mei; Müller-Riemenschneider, Falk; Chong, Yap-Seng; Chong, Mary Foong-Fong; Chan, Jerry Kok Yen
2015-01-01
Objective Epidemiological studies relating maternal 25-hydroxyvitamin D (25OHD) with gestational diabetes mellitus (GDM) and mode of delivery have shown controversial results. We examined if maternal 25OHD status was associated with plasma glucose concentrations, risks of GDM and caesarean section in the Growing Up in Singapore Towards healthy Outcomes (GUSTO) study. Methods Plasma 25OHD concentrations, fasting glucose (FG) and 2-hour postprandial glucose (2HPPG) concentrations were measured in 940 women from a Singapore mother-offspring cohort study at 26–28 weeks’ gestation. 25OHD inadequacy and adequacy were defined based on concentrations of 25OHD ≤75nmol/l and >75nmol/l respectively. Mode of delivery was obtained from hospital records. Multiple linear regression was performed to examine the association between 25OHD status and glucose concentrations, while multiple logistic regression was performed to examine the association of 25OHD status with risks of GDM and caesarean section. Results In total, 388 (41.3%) women had 25OHD inadequacy. Of these, 131 (33.8%), 155 (39.9%) and 102 (26.3%) were Chinese, Malay and Indian respectively. After adjustment for confounders, maternal 25OHD inadequacy was associated with higher FG concentrations (β = 0.08mmol/l, 95% Confidence Interval (CI) = 0.01, 0.14), but not 2HPPG concentrations and risk of GDM. A trend between 25OHD inadequacy and higher likelihood of emergency caesarean section (Odds Ratio (OR) = 1.39, 95% CI = 0.95, 2.05) was observed. On stratification by ethnicity, the association with higher FG concentrations was significant in Malay women (β = 0.19mmol/l, 95% CI = 0.04, 0.33), while risk of emergency caesarean section was greater in Chinese (OR = 1.90, 95% CI = 1.06, 3.43) and Indian women (OR = 2.41, 95% CI = 1.01, 5.73). Conclusions 25OHD inadequacy is prevalent in pregnant Singaporean women, particularly among the Malay and Indian women. This is associated with higher FG concentrations in Malay women, and increased risk of emergency caesarean section in Chinese and Indian women. PMID:26571128
Pang, Joselyn; Wei, Clayton Koh Thuan; Yee, Ilias Adam; Wang, Bangyuan; Cassolato, Matteo
2017-01-01
Objective We examined willingness to use pre-exposure prophylaxis (PrEP) for HIV prevention among men who have sex with men (MSM) in Malaysia. Methods An online survey of 990 MSM was conducted between March and April 2016. Eligibility criteria included being biological male, Malaysian citizen, 18 years of age or above, identifying as MSM, and being HIV negative or unknown status. Participants’ demographics, sexual and drug use behaviors, attitudes towards PrEP, and preferences regarding future access to PrEP were collected. Bivariate analysis and logistic regression were performed to determine factors associated with willingness to use PrEP. Results Fewer than half of participants (44%) knew about PrEP before completing the survey. Overall, 39% of the sample were willing to take PrEP. Multivariate logistic regression indicated that Malay men (AOR: 1.73, 95% CI:1.12, 2.70), having 2 or more male anal sex partners in the past 6 months (AOR: 1.98, 95% CI: 1.29, 3.05), previous knowledge of PrEP (AOR: 1.40, 95%CI: 1.06, 1.86), lack of confidence in practising safer sex (AOR: 1.36, 95% CI: 1.02, 1.81), and having ever paid for sex with a male partner (AOR: 1.39, 95% CI: 1.01, 1.91) were independently associated with greater willingness to use PrEP, while men who identified as heterosexual were less willing to use PrEP (AOR, 0.36, 95% CI: 0.13, 0.97). Majority of participants preferred to access PrEP at affordable cost below 100 Malaysian Ringgit (USD25) per month from community based organisations followed by private or government hospitals. Conclusions Overall, MSM in Malaysia reported a relatively low level of willingness to use PrEP, although willingness was higher among those previously aware of PrEP. There is a need to provide PrEP at affordable cost, increase demand and awareness of PrEP, and to provide access to this preventative medication via diverse, integrated and tailored sexual health services. PMID:28902857
Michalopoulos, Lynn Murphy; Jiwatram-Negrón, Tina; Choo, Martin K K; Kamarulzaman, Adeeba; El-Bassel, Nabila
2016-06-02
Malaysian fishermen have been identified as a key affected population for HIV, with HIV rates 10 times higher than national rates. A number of studies have identified that psychosocial and structural-level stressors increase HIV injection drug risk behaviors. The purpose of this paper is to examine psychosocial and structural-level stressors of injection drug use and HIV injection drug risk behaviors among Malaysian fishermen. The study employs a cross-sectional design using respondent-driven sampling methods. The sample includes 406 fishermen from Pahang state, Malaysia. Using multivariate logistic regressions, we examined the relationship between individual (depression), social (adverse interactions with the police), and structural (poverty-related) stressors and injection drug use and risky injection drug use (e.g., receptive and non-receptive needle sharing, front-loading and back-loading, or sharing drugs from a common container). Participants below the poverty line had significantly lower odds of injection drug use (OR 0.52, 95% CI: 0.27-0.99, p = 0.047) and risky injection drug use behavior (OR 0.48, 95% CI: 0.25-0.93, p = 0.030). In addition, participants with an arrest history had higher odds of injection drug use (OR 19.58, 95% CI: 9.81-39.10, p < 0.001) and risky injection drug use (OR 16.25, 95% CI: 4.73-55.85, p < 0.001). Participants with depression had significantly higher odds of engaging in risky injection drug use behavior (OR 3.26, 95% CI: 1.39-7.67, p = 0.007). Focusing on participants with a history of injection drug use, we found that participants with depression were significantly more likely to engage in risky drug use compared to participants below the depression cutoff (OR 3.45, 95% CI: 1.23-9.66, p < 0.02). Findings underscore the need to address psychosocial and structural-level stressors among Malaysian fishermen to reduce HIV injection drug risk behaviors.
Lian, Qiguo; Su, Qiru; Li, Ruili; Elgar, Frank J.; Liu, Zhihao
2018-01-01
Background Childhood obesity and school bullying are pervasive public health issues and known to co-occur in adolescents. However, the association between underweight or thinness and chronic bullying victimization is unclear. The current study examined whether chronic bullying victimization is associated with weight status and body self-image. Methods A school-based, cross-sectional study in 39 North American and European countries and regions was conducted. A total of 213,595 adolescents aged 11, 13, and 15 years were surveyed in 2009/10. Chronic bullying victimization was identified using the Revised Olweus Bully/Victim Questionnaire. Weight status was determined using self-reported height and weight and the body mass index (BMI), and body self-image was based on perceived weight. We tested associations between underweight and bullying victimization using three-level logistic regression models. Results Of the 213,595 adolescents investigated, 11.28% of adolescents reported chronic bullying victimization, 14.80% were classified as overweight/obese according to age- and sex-specific BMI criteria, 12.97% were underweight, 28.36% considered themselves a little bit fat or too fat, and 14.57% considered themselves too thin. Bullying victimization was less common in older adolescent boys and girls. Weight status was associated with chronic bullying victimization (adjusted OR for underweight = 1.10, 95% CI = 1.05–1.16, p = 0.002; adjusted OR for overweight = 1.40, 95% CI = 1.32–1.49, p < 0.0001; adjusted OR for obese = 1.91, 95% CI = 1.71–2.14, p < 0.0001). Body self-image was also related to chronic bullying victimization (adjusted OR for "too thin" = 1.42, 95% CI = 1.36–1.49, p < 0.0001; adjusted OR for "a little bit fat" = 1.54, 95% CI = 1.48–1.61, p < 0.0001; adjusted OR for "too fat" = 3.30, 95% CI = 2.96–3.68, p < 0.0001). Conclusion Both perceived weight and self-rated overweight are associated with chronic bullying victimization. Both overweight and underweight children are at risk of being chronically bullied. PMID:29404221
Vodstrcil, Lenka A; Walker, Sandra M; Hocking, Jane S; Law, Matthew; Forcey, Dana S; Fehler, Glenda; Bilardi, Jade E; Chen, Marcus Y; Fethers, Katherine A; Fairley, Christopher K; Bradshaw, Catriona S
2015-04-01
Female same-sex partnerships provide a unique opportunity to study the pathogenesis and transmissibility of bacterial vaginosis (BV) because it can be diagnosed in both members of the partnership. We conducted a nationwide community-enrolled cohort study of women who have sex with women, including women coenrolled with their regular female sexual partner (FSP), to investigate the BV incidence rate and factors associated with incident BV. Women who have sex with women, without prevalent BV in a cross-sectional study, were enrolled in a 24-month cohort study involving 3-monthly questionnaires and self-collected vaginal swabs that were scored by the Nugent method. We assessed the BV incidence rate per 100 woman-years (WY) and used univariate and multivariable Cox regression analysis to establish factors associated with BV acquisition. Two hundred ninety-eight participants were enrolled in the cohort; 122 were coenrolled with their regular FSP. There were 51 incident cases of BV (rate, 9.75/100 WY; 95% confidence interval [CI], 7.41-12.83). Incident BV was associated with exposure to a new sexual partner (adjusted hazard ratio [AHR], 2.51; 95% CI, 1.30-4.82), a partner with BV symptoms (AHR, 3.99; 95% CI, 1.39-11.45), receptive oral sex (AHR, 3.52; 95% CI, 1.41-8.79), and onset of BV symptoms (AHR, 2.80; 95% CI, 1.39-5.61). Women coenrolled with their BV-negative partner had a greatly reduced risk of incident BV (AHR, 0.26; 95% CI, .11-.61), and high concordance of Nugent category (74%), which was predominantly normal vaginal flora throughout follow-up. These data highlight the strong influence of sexual relationships and behaviors on BV acquisition and the vaginal microbiota. They provide epidemiological evidence to support exchange of vaginal bacterial species between women and the concept that BV is sexually transmitted. © The Author 2014. Published by Oxford University Press on behalf of the Infectious Diseases Society of America. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
Diagnostic change and the increased prevalence of autism
King, Marissa; Bearman, Peter
2009-01-01
Background Increased autism prevalence rates have generated considerable concern. However, the contribution of changes in diagnostic practices to increased prevalence rates has not been thoroughly examined. Debates over the role of diagnostic substitution also continue. California has been an important test case in these controversies. The objective of this study was to determine the extent to which the increased prevalence of autism in California has been driven by changes in diagnostic practices, diagnostic substitution and diagnostic accretion. Methods Retrospective case record examination of 7003 patients born before 1987 with autism who were enrolled with the California Department of Developmental Services between 1992 and 2005 was carried out. Of principal interest were 631 patients with a sole diagnosis of mental retardation (MR) who subsequently acquired a diagnosis of autism. The outcome of interest was the probability of acquiring a diagnosis of autism as a result of changes in diagnostic practices. The probability of diagnostic change was then used to model the proportion of the autism caseload arising from changing diagnostic practices. Results The odds of a patient acquiring an autism diagnosis were elevated in periods in which the practices for diagnosing autism changed. The odds of change in years in which diagnostic practices changed were 1.68 [95% confidence interval (CI) 1.11–2.54], 1.55 (95% CI 1.03–2.34), 1.58 (95% CI 1.05–2.39), 1.82 (95% CI 1.23–2.7) and 1.61 (95% CI 1.09–2.39). Using the probability of change between 1992 and 2005 to generalize to the population with autism, it is estimated that 26.4% (95% CI 16.25–36.48) of the increased autism caseload in California is uniquely associated with diagnostic change through a single pathway: individuals previously diagnosed with MR. Conclusion Changes in practices for diagnosing autism have had a substantial effect on autism caseloads, accounting for one-quarter of the observed increase in prevalence in California between 1992 and 2005. PMID:19737791
Balleyguier, Corinne; Arfi-Rouche, Julia; Levy, Laurent; Toubiana, Patrick R; Cohen-Scali, Franck; Toledano, Alicia Y; Boyer, Bruno
2017-12-01
Evaluate concurrent Computer-Aided Detection (CAD) with Digital Breast Tomosynthesis (DBT) to determine impact on radiologist performance and reading time. The CAD system detects and extracts suspicious masses, architectural distortions and asymmetries from DBT planes that are blended into corresponding synthetic images to form CAD-enhanced synthetic images. Review of CAD-enhanced images and navigation to corresponding planes to confirm or dismiss potential lesions allows radiologists to more quickly review DBT planes. A retrospective, crossover study with and without CAD was conducted with six radiologists who read an enriched sample of 80 DBT cases including 23 malignant lesions in 21 women. Area Under the Receiver Operating Characteristic (ROC) Curve (AUC) compared the readings with and without CAD to determine the effect of CAD on overall interpretation performance. Sensitivity, specificity, recall rate and reading time were also assessed. Multi-reader, multi-case (MRMC) methods accounting for correlation and requiring correct lesion localization were used to analyze all endpoints. AUCs were based on a 0-100% probability of malignancy (POM) score. Sensitivity and specificity were based on BI-RADS scores, where 3 or higher was positive. Average AUC across readers without CAD was 0.854 (range: 0.785-0.891, 95% confidence interval (CI): 0.769,0.939) and 0.850 (range: 0.746-0.905, 95% CI: 0.751,0.949) with CAD (95% CI for difference: -0.046,0.039), demonstrating non-inferiority of AUC. Average reduction in reading time with CAD was 23.5% (95% CI: 7.0-37.0% improvement), from an average 48.2 (95% CI: 39.1,59.6) seconds without CAD to 39.1 (95% CI: 26.2,54.5) seconds with CAD. Per-patient sensitivity was the same with and without CAD (0.865; 95% CI for difference: -0.070,0.070), and there was a small 0.022 improvement (95% CI for difference: -0.046,0.089) in per-lesion sensitivity from 0.790 without CAD to 0.812 with CAD. A slight reduction in specificity with a -0.014 difference (95% CI for difference: -0.079,0.050) and a small 0.025 increase (95% CI for difference: -0.036,0.087) in recall rate in non-cancer cases were observed with CAD. Concurrent CAD resulted in faster reading time with non-inferiority of radiologist interpretation performance. Radiologist sensitivity, specificity and recall rate were similar with and without CAD. Copyright © 2017 Elsevier B.V. All rights reserved.
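The primary endpoint is a paired comparison of reader AUC with and without CAD on the same cases. The sketch below shows a simplified, single-reader version of that comparison using a case-resampling bootstrap on probability-of-malignancy scores; it is not the multi-reader, multi-case (MRMC) analysis used in the study, and the toy scores are simulated.

    import numpy as np
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(0)

    def paired_auc_diff_ci(y, scores_without_cad, scores_with_cad, n_boot=2000, alpha=0.05):
        """Bootstrap CI (resampling cases) for the AUC difference of one reader
        scoring the same cases with and without CAD. A full MRMC analysis would
        also model reader variability."""
        y = np.asarray(y)
        s0 = np.asarray(scores_without_cad)
        s1 = np.asarray(scores_with_cad)
        idx = np.arange(len(y))
        diffs = []
        for _ in range(n_boot):
            b = rng.choice(idx, size=len(idx), replace=True)
            if len(np.unique(y[b])) < 2:   # need both classes to compute an AUC
                continue
            diffs.append(roc_auc_score(y[b], s1[b]) - roc_auc_score(y[b], s0[b]))
        lo, hi = np.percentile(diffs, [100 * alpha / 2, 100 * (1 - alpha / 2)])
        return roc_auc_score(y, s1) - roc_auc_score(y, s0), lo, hi

    # Toy data: 80 cases, probability-of-malignancy scores on a 0-100 scale
    y = rng.integers(0, 2, 80)
    s0 = rng.uniform(0, 100, 80) + 20 * y
    s1 = s0 + rng.normal(0, 5, 80)
    print(paired_auc_diff_ci(y, s0, s1))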
2013-01-01
Introduction Contemporary information on mechanical ventilation (MV) use in emerging countries is limited. Moreover, most epidemiological studies on ventilatory support were carried out before significant developments, such as lung protective ventilation or broader application of non-invasive ventilation (NIV). We aimed to evaluate the clinical characteristics, outcomes and risk factors for hospital mortality and failure of NIV in patients requiring ventilatory support in Brazilian intensive care units (ICU). Methods In a multicenter, prospective, cohort study, a total of 773 adult patients admitted to 45 ICUs over a two-month period requiring invasive ventilation or NIV for more than 24 hours were evaluated. Causes of ventilatory support, prior chronic health status and physiological data were assessed. Multivariate analysis was used to identify variables associated with hospital mortality and NIV failure. Results Invasive MV and NIV were used as initial ventilatory support in 622 (80%) and 151 (20%) patients. Failure with subsequent intubation occurred in 54% of NIV patients. The main reasons for ventilatory support were pneumonia (27%), neurologic disorders (19%) and non-pulmonary sepsis (12%). ICU and hospital mortality rates were 34% and 42%. Using the Berlin definition, acute respiratory distress syndrome (ARDS) was diagnosed in 31% of the patients with a hospital mortality of 52%. In the multivariate analysis, age (odds ratio (OR), 1.03; 95% confidence interval (CI), 1.01 to 1.03), comorbidities (OR, 2.30; 95% CI, 1.28 to 3.17), associated organ failures (OR, 1.12; 95% CI, 1.05 to 1.20), moderate (OR, 1.92; 95% CI, 1.10 to 3.35) to severe ARDS (OR, 2.12; 95% CI, 1.01 to 4.41), cumulative fluid balance over the first 72 h in the ICU (OR, 2.44; 95% CI, 1.39 to 4.28), higher lactate (OR, 1.78; 95% CI, 1.27 to 2.50), invasive MV (OR, 2.67; 95% CI, 1.32 to 5.39) and NIV failure (OR, 3.95; 95% CI, 1.74 to 8.99) were independently associated with hospital mortality. The predictors of NIV failure were the severity of associated organ dysfunctions (OR, 1.20; 95% CI, 1.05 to 1.34), ARDS (OR, 2.31; 95% CI, 1.10 to 4.82) and positive fluid balance (OR, 2.09; 95% CI, 1.02 to 4.30). Conclusions Current mortality of ventilated patients in Brazil is elevated. Implementation of judicious fluid therapy and a watchful use and monitoring of NIV patients are potential targets to improve outcomes in this setting. Trial registration ClinicalTrials.gov NCT01268410. PMID:23557378
Nordander, Catarina; Ohlsson, Kerstina; Balogh, Istvan; Hansson, Gert-Ake; Axmon, Anna; Persson, Roger; Skerfving, Staffan
2008-08-01
For unknown reasons, females run a higher risk than males of work-related musculoskeletal disorders. The aim of this study was to evaluate whether male and female workers, with identical repetitive work tasks, differ concerning risk of disorders, physical or psychosocial exposures. Employees in two industries were studied; one rubber manufacturing and one mechanical assembly plant. These industries were selected since in both, large groups of males and females worked side by side performing identical repetitive work tasks. Physical exposure was measured by technical equipment. Postures and movements were registered by inclinometry for the head and upper arms, and by electrogoniometry for the wrists. Muscular activity (muscular rest and %max) was registered by surface electromyography for m. trapezius and the forearm extensors (18 males and 19 females). Psychosocial work environment was evaluated by the demand-control-support model (85 males and 138 females). Musculoskeletal disorders were assessed (105 males and 172 females), by interview (last 7-days complaints), and by physical examination (diagnoses). Concerning physical exposure, females showed higher muscular activity related to maximal voluntary contractions [(%MVE); m. trapezius: females 18 (SD 9.2), males 12 (SD 4.3); forearm extensors: females 39 (SD 11), males 27 (SD 10), right side, 90th percentile]. Working postures and movements were similar between genders. Also, concerning psychosocial work environment, no significant gender differences were found. Females had higher prevalences of disorders [complaints: age-adjusted prevalence odds ratio (POR) 2.3 (95% CI 1.3-3.8) for neck/shoulders, 2.4 (1.4-4.0) for elbows/hands; diagnoses: neck/shoulder 1.9 (1.1-3.6), elbows/hands 4.1 (1.2-9.3)]. In 225 workers, PORs were adjusted for household work, personal recovery and exercise, which only slightly affected the risk estimates. In identical work tasks, females showed substantially higher muscular activity in relation to capacity, and higher prevalence of musculoskeletal disorders of the neck and upper extremity, than did males.
Effect of Active Videogames on Underserved Children's Classroom Behaviors, Effort, and Fitness.
Gao, Zan; Lee, Jung Eun; Pope, Zachary; Zhang, Dachao
2016-09-30
The purpose of this study was to examine the effect of active videogames (AVGs) on underserved minority children's on-task classroom behavior, academic effort, and fitness. A one group pre- and posttest repeated measures design was used. In Fall 2013, 95 fourth grade children (57 boys, 38 girls; 96% of minority) from three classes at an underserved urban elementary school participated in teacher-supervised AVG activities (e.g., Wii Sports, Xbox Just Dance). Specifically, students participated in a 50-minute weekly AVG program at school for 6 weeks. Children's academic effort was evaluated by classroom teachers using a validated scale that assessed activity, attention, conduct, and social/emotional behavior. Moreover, children's classroom behavior was observed immediately before and after each AVG session by trained researchers. Finally, cardiovascular fitness was also measured. A paired t-test was used to assess teacher-rated student effort, while one-way (gender) analysis of variance (ANOVA) with repeated measures was performed to analyze children's on-task classroom behavior. There was a significant effect on children's effort between the first (mean = 3.24, SD = 0.75) and last week (mean = 3.41, SD = 0.73) assessments, t = 2.42, P = 0.02. In addition, there was a significant effect on classroom behavior, F = 33.103, P < 0.01. In detail, children scored significantly higher on on-task behavior during the post-AVG observation (mean = 81.4, SD = 12.3) than seen during the pre-AVG observation (mean = 69.8, SD = 14.9). However, no main effect was indicated for gender, F = 0.39, P = 0.54. No significant improvement in cardiovascular fitness was observed, although slight improvements were seen. Offering an AVG program at school could improve underserved minority children's classroom on-task behavior and academic effort. Future studies may include a control group to further confirm the effectiveness of AVG activities. Practical implications for educators and other stakeholders are provided.
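The analysis pairs each child's pre- and post-intervention scores. A minimal sketch of the paired comparisons is given below using simulated scores whose means and SDs loosely mirror those reported; a paired t-test stands in for both the effort comparison and, as a simplification, the repeated-measures ANOVA used for on-task behavior.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)

    # Toy teacher-rated effort scores for 95 children at week 1 and week 6 (illustrative only)
    effort_week1 = rng.normal(3.24, 0.75, 95)
    effort_week6 = effort_week1 + rng.normal(0.17, 0.4, 95)
    t, p = stats.ttest_rel(effort_week6, effort_week1)
    print(f"effort: paired t = {t:.2f}, p = {p:.3f}")

    # Toy pre/post on-task classroom behaviour scores around each AVG session
    ontask_pre = rng.normal(69.8, 14.9, 95)
    ontask_post = ontask_pre + rng.normal(11.6, 10.0, 95)
    t2, p2 = stats.ttest_rel(ontask_post, ontask_pre)
    print(f"on-task behaviour: paired t = {t2:.2f}, p = {p2:.3f}")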
Mearin, Fermín; Barreiro-de Acosta, Manuel; González-Galilea, Ángel; Gisbert, Javier P; Cucala, Mercedes; Ponce, Julio
2013-10-01
To determine the prevalence and characteristics of anemia and iron deficiency in patients hospitalized for gastrointestinal diseases. An epidemiological, multicenter, mixed design study (retrospective review of randomized clinical records and prospective visits) conducted between February 2010 and March 2011 in 22 Spanish gastroenterology departments. Severe anemia was defined as Hb < 10g/dL, mild/moderate as Hb ≥ 10g/dL, and iron deficiency as ferritin < 30ng/ml or transferrin saturation < 16%. We included 379 patients. The mean±SD age was 57±19 years and 47% were men. The prevalence of anemia at admission was 60% (95% CI 55 to 65), and anemia was severe (Hb <10g/dl) in half the patients. The prevalence of iron deficiency was 54% of evaluable patients (95% CI 47 to 61). Gastrointestinal bleeding at admission was found in 39% of the patients, of whom 83% (121/146) were anemic. At discharge, the proportion of anemic patients was unchanged (from 60% at admission to 58% at discharge) (95% CI 53 to 63) and iron deficiency was found in 41% (95% CI 32 to 50): anemia was severe in 17% and mild/moderate in 41%. During follow-up, at 3-6 months after admission, 44% (95% CI 39 to 50) of evaluable patients continued to have iron deficiency and 28% (95% CI 23 to 32) were still anemic: 5% severe and 23% mild/moderate. The prevalence of iron deficiency was 44% (95% CI: 39-50). During admission, 50% of patients with anemia did not receive treatment. At discharge, 55% were untreated. The prevalence of anemia in patients hospitalized for gastroenterological diseases was very high. Anemia persisted in over a quarter of patients at the follow-up visit. Only half of hospitalized patients received treatment for anemia, even when the anemia was severe. Copyright © 2013 Elsevier España, S.L. y AEEH y AEG. All rights reserved.
Kobayashi, Hisami; Nakasato, Takuya; Sakamoto, Mitsuo; Ohtani, Yoshihisa; Terada, Fuminori; Sakai, Ken; Ohkuma, Moriya; Tohno, Masanori
2017-12-01
A Gram-stain-variable, strictly anaerobic, rod-shaped, catalase-negative and endospore-forming bacterial strain, designated MJC39(T), was isolated from grass silage preserved in Hokkaido, Japan. Growth occurred at 20-42 °C, pH 5.0-7.0 and NaCl concentrations up to 2 % (w/v). The isolated strain MJC39(T) produced butyric acid in peptone yeast extract medium with glucose. The DNA G+C content of strain MJC39(T) was 34.4±0.2 mol%. The major cellular fatty acids (>10 %) were C14:0, C16:0 and summed feature 3 (including C16:1ω7c/C16:1ω6c). No respiratory quinones were detected. The polar lipids of strain MJC39(T) were phosphatidylglycerol, phosphatidylethanolamine, diphosphatidylglycerol, one unidentified lipid, one unidentified aminolipid, two unidentified glycolipids, one unidentified phospholipid, one unidentified aminoglycolipid and one unidentified phosphoaminoglycolipid. Phylogenetic analysis based on 16S rRNA gene sequences showed that strain MJC39(T) was a member of the genus Clostridium and is closely related to Clostridium tyrobutyricum JCM 11008(T) (95.8 % similarity) and Clostridium algifaecis MB9-7(T) (95.5 % similarity). Based on the genotypic, phenotypic and chemotaxonomic characteristics, strain MJC39(T) represents a novel species of the genus Clostridium, for which the name Clostridium pabulibutyricum sp. nov. is proposed. The type strain is MJC39(T) (=JCM 31506(T)=DSM 103944(T)).
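Species-level comparisons of this kind rest on pairwise 16S rRNA gene sequence similarity (for example, the 95.8 % value reported above). The sketch below computes percent identity from two already-aligned sequences; the fragments are toy examples, and real analyses align near-full-length genes with dedicated alignment tools rather than this illustration.

    def percent_identity(aligned_a, aligned_b):
        """Pairwise % identity of two pre-aligned 16S rRNA gene sequences
        (columns containing a gap in either sequence are skipped; the
        alignment itself is not performed here)."""
        if len(aligned_a) != len(aligned_b):
            raise ValueError("sequences must come from the same alignment")
        pairs = [(x, y) for x, y in zip(aligned_a, aligned_b) if x != '-' and y != '-']
        matches = sum(1 for x, y in pairs if x == y)
        return 100.0 * matches / len(pairs)

    # Toy aligned fragments (real comparisons use ~1.4 kb alignments)
    print(percent_identity("ACGT-ACGTTGCA", "ACGTTACGATGCA"))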
MSH6 G39E Polymorphism and CpG Island Methylator Phenotype in Colon Cancer
Curtin, Karen; Samowitz, Wade S.; Wolff, Roger K.; Caan, Bette J.; Ulrich, Cornelia M.; Potter, John D.; Slattery, Martha L.
2010-01-01
The MSH6 G39E germline polymorphism is not associated with an increased risk of either microsatellite stable or unstable sporadic colorectal cancer. Other than microsatellite instability, however, most genetic and epigenetic changes of tumors associated with this common variant have not been studied. The objective of our investigation was to evaluate associations between the MSH6 G39E (116G>A) polymorphism and CpG island methylator phenotype (CIMP) and BRAF V600E mutations in tumors from a sample of 1048 individuals with colon cancer and 1964 controls from Utah, Northern California, and Minnesota. The G39E polymorphism (rs1042821) was determined by the five prime nuclease assay. CIMP was determined by methylation-specific polymerase chain reaction (PCR) of CpG islands in MLH1, methylated in tumors (MINT)1, MINT2, MINT31, and CDKN2A. The BRAF V600E mutation was determined by sequencing exon 15. In microsatellite stable tumors, homozygous carriers of the G39E polymorphism had an increased risk of CIMP+ colon cancer (odds ratio (OR) 2.2, 95% confidence interval (CI) 1.1, 4.2) and BRAF V600E mutation (OR 3.1, 95% CI 1.01, 9.7) in a case–control comparison. This finding was not observed in unstable tumors; however, power may have been low to detect an association. Age at diagnosis, family history, and alcohol use did not interact with MSH6 G39E and CIMP. The MSH6 G39E germline polymorphism may be associated with CIMP+ colon cancer. PMID:19582761
Kordiimonas sediminis sp. nov., isolated from a sea cucumber culture pond.
Zhang, Heng-Xi; Zhao, Jin-Xin; Chen, Guan-Jun; Du, Zong-Jun
2016-05-01
A marine bacterium, designated strain N39(T), was isolated from a sediment sample collected at a sea cucumber culture pond in Weihai, China. Cells of strain N39(T) were observed to be Gram-stain negative, facultatively anaerobic, motile rods showing catalase and oxidase negative reactions. Strain N39(T) was found to grow optimally at pH 8.0-8.5, 35-37 °C and in the presence of approximately 3.0 % (w/v) NaCl. Phylogenetic analysis based on 16S rRNA gene sequences indicated that strain N39(T) belongs to the genus Kordiimonas in the family Kordiimonadaceae, appearing closely related to Kordiimonas lacus JCM 16261(T) (95.9 %), Kordiimonas aquimaris MEBiC06554(T) (95.1 %), Kordiimonas gwangyangensis JCM 12864(T) (94.2 %) and Kordiimonas aestuarii 101-1(T) (93.8 %). Ubiquinone 10 (Q-10) was found to be the major respiratory quinone. The dominant cellular fatty acids were identified as iso-C17:1 ω9c, iso-C17:0, iso-C15:0 and C17:1 ω6c. The predominant polar lipids were found to be phosphatidylglycerol, phosphatidylethanolamine and diphosphatidylglycerol. The DNA G + C content of strain N39(T) is 50.8 %. On the basis of genotypic, chemotaxonomic and phenotypic data, strain N39(T) is concluded to represent a novel species within the genus Kordiimonas, for which the name Kordiimonas sediminis sp. nov. is proposed. The type strain is N39(T) (=KCTC 42590(T) = MCCC 1H00112(T)).
Bernardi, Mauro; Caraceni, Paolo; Navickis, Roberta J; Wilkes, Mahlon M
2012-04-01
Albumin infusion reduces the incidence of postparacentesis circulatory dysfunction among patients with cirrhosis and tense ascites, as compared with no treatment. Treatment alternatives to albumin, such as artificial colloids and vasoconstrictors, have been widely investigated. The aim of this meta-analysis was to determine whether morbidity and mortality differ between patients receiving albumin versus alternative treatments. The meta-analysis included randomized trials evaluating albumin infusion in patients with tense ascites. Primary endpoints were postparacentesis circulatory dysfunction, hyponatremia, and mortality. Eligible trials were sought by multiple methods, including computer searches of bibliographic and abstract databases and the Cochrane Library. Results were quantitatively combined under a fixed-effects model. Seventeen trials with 1,225 total patients were included. There was no evidence of heterogeneity or publication bias. Compared with alternative treatments, albumin reduced the incidence of postparacentesis circulatory dysfunction (odds ratio [OR], 0.39; 95% confidence interval [CI], 0.27-0.55). Significant reductions in that complication by albumin were also shown in subgroup analyses versus each of the other volume expanders tested (e.g., dextran, gelatin, hydroxyethyl starch, and hypertonic saline). The occurrence of hyponatremia was also decreased by albumin, compared with alternative treatments (OR, 0.58; 95% CI, 0.39-0.87). In addition, mortality was lower in patients receiving albumin than alternative treatments (OR, 0.64; 95% CI, 0.41-0.98). This meta-analysis provides evidence that albumin reduces morbidity and mortality among patients with tense ascites undergoing large-volume paracentesis, as compared with alternative treatments investigated thus far. Copyright © 2011 American Association for the Study of Liver Diseases.
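The pooled odds ratios above come from a fixed-effects combination of trial-level estimates. A minimal inverse-variance pooling sketch is shown below; the per-trial ORs and CIs in the example are hypothetical, and the published analysis pooled counts from the 17 trials rather than the made-up values used here.

    import numpy as np

    def fixed_effect_pooled_or(or_values, ci_lows, ci_highs, z=1.96):
        """Inverse-variance fixed-effects pooling of odds ratios reported with
        95% CIs (the SE of each log OR is recovered from the CI width)."""
        log_or = np.log(or_values)
        se = (np.log(ci_highs) - np.log(ci_lows)) / (2 * z)
        w = 1.0 / se ** 2
        pooled = np.sum(w * log_or) / np.sum(w)
        pooled_se = np.sqrt(1.0 / np.sum(w))
        return np.exp(pooled), np.exp(pooled - z * pooled_se), np.exp(pooled + z * pooled_se)

    # Hypothetical per-trial ORs for postparacentesis circulatory dysfunction
    print(fixed_effect_pooled_or(
        np.array([0.35, 0.48, 0.40]),
        np.array([0.18, 0.25, 0.20]),
        np.array([0.68, 0.92, 0.80])))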
Jabs, Douglas A; Van Natta, Mark L; Sezgin, Efe; Pak, Jeong Won; Danis, Ronald
2015-06-01
To evaluate the prevalence of intermediate-stage age-related macular degeneration (AMD) in patients with acquired immunodeficiency syndrome (AIDS). Cross-sectional study of patients with AIDS enrolled in the Longitudinal Study of the Ocular Complications of AIDS. Intermediate-stage AMD was determined from enrollment retinal photographs by graders at a centralized Reading Center, using the Age-Related Eye Disease Study grading system. Graders were masked as to clinical data. Of 1825 participants with AIDS and no ocular opportunistic infections, 9.9% had intermediate-stage AMD. Risk factors included age, with an odds ratio (OR) of 1.9 (95% confidence interval [CI] 1.6, 2.3, P < .001) for every decade of age; the prevalence of AMD ranged from 4.0% for participants 30-39 years old to 24.3% for participants ≥60 years old. Other risk factors included the human immunodeficiency virus (HIV) risk groups of injection drug use (OR = 2.4, 95% CI 1.5, 3.9, P < .001) or heterosexual contact (OR = 1.9, 95% CI 1.3, 2.8, P = .001). Compared with the HIV-uninfected population in the Beaver Dam Offspring Study, there was an approximate 4-fold increased age-adjusted prevalence of intermediate-stage AMD. Patients with AIDS have an increased age-adjusted prevalence of intermediate-stage AMD compared with that found in a non-HIV-infected cohort evaluated with similar methods. This increased prevalence is consistent with the increased prevalence of other age-related diseases in antiretroviral-treated, immune-restored, HIV-infected persons when compared to non-HIV-infected persons. Published by Elsevier Inc.
Individual Differences in Dynamic Functional Brain Connectivity across the Human Lifespan.
Davison, Elizabeth N; Turner, Benjamin O; Schlesinger, Kimberly J; Miller, Michael B; Grafton, Scott T; Bassett, Danielle S; Carlson, Jean M
2016-11-01
Individual differences in brain functional networks may be related to complex personal identifiers, including health, age, and ability. Dynamic network theory has been used to identify properties of dynamic brain function from fMRI data, but the majority of analyses and findings remain at the level of the group. Here, we apply hypergraph analysis, a method from dynamic network theory, to quantify individual differences in brain functional dynamics. Using a summary metric derived from the hypergraph formalism, hypergraph cardinality, we investigate individual variations in two separate, complementary data sets. The first data set ("multi-task") consists of 77 individuals engaging in four consecutive cognitive tasks. We observe that hypergraph cardinality exhibits variation across individuals while remaining consistent within individuals between tasks; moreover, the analysis of one of the memory tasks revealed a marginally significant correspondence between hypergraph cardinality and age. This finding motivated a similar analysis of the second data set ("age-memory"), in which 95 individuals, aged 18-75, performed a memory task with a similar structure to the multi-task memory task. With the increased age range in the age-memory data set, the correspondence between hypergraph cardinality and age becomes significant. We discuss these results in the context of the well-known finding linking age with network structure, and suggest that hypergraph analysis should serve as a useful tool in furthering our understanding of the dynamic network structure of the brain.
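Hypergraph cardinality summarizes how many groups of functional connections co-evolve over time. The sketch below is one simplified reading of that construction: sliding-window correlations define an edge weight time series, strongly co-varying edges are grouped, and the number of multi-edge groups is counted. The window length, step and threshold are illustrative assumptions, not the parameters used in the study.

    import numpy as np

    def hypergraph_cardinality(ts, win=30, step=5, edge_corr_thresh=0.8):
        """Simplified sketch: (1) sliding-window correlations give each edge a
        weight time series, (2) edges whose weight series are strongly correlated
        are grouped, (3) the number of multi-edge groups (hyperedges) is returned."""
        n_t, n_roi = ts.shape
        iu = np.triu_indices(n_roi, k=1)
        # 1. edge weight time series from sliding-window correlation matrices
        windows = range(0, n_t - win + 1, step)
        edge_ts = np.array([np.corrcoef(ts[s:s + win].T)[iu] for s in windows])
        # 2. correlation between edge weight time series
        ec = np.corrcoef(edge_ts.T)
        adj = np.abs(ec) >= edge_corr_thresh
        np.fill_diagonal(adj, False)
        # 3. connected components of the edge-edge graph; hyperedges have >= 2 edges
        n_edges = adj.shape[0]
        seen, card = np.zeros(n_edges, bool), 0
        for e in range(n_edges):
            if seen[e]:
                continue
            stack, comp = [e], 0
            seen[e] = True
            while stack:
                u = stack.pop()
                comp += 1
                for v in np.flatnonzero(adj[u]):
                    if not seen[v]:
                        seen[v] = True
                        stack.append(v)
            card += comp >= 2
        return card

    # Toy data: 200 time points, 10 regions
    rng = np.random.default_rng(2)
    print(hypergraph_cardinality(rng.standard_normal((200, 10))))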
Ruiz-España, Silvia; Arana, Estanislao; Moratal, David
2015-07-01
Computer-aided diagnosis (CAD) methods for detecting and classifying lumbar spine disease in magnetic resonance imaging (MRI) can assist radiologists to perform their decision-making tasks. In this paper, a CAD software tool was developed that is able to classify and quantify spine disease (disc degeneration, herniation and spinal stenosis) in two-dimensional MRI. A set of 52 lumbar discs from 14 patients was used for training and 243 lumbar discs from 53 patients for testing in conventional two-dimensional MRI of the lumbar spine. To classify disc degeneration according to the gold standard, the Pfirrmann classification, a method based on the measurement of disc signal intensity and structure was developed. A gradient vector flow algorithm was used to extract disc shape features and for detecting contour abnormalities. Also, a signal intensity method was used for segmenting and detecting spinal stenosis. Novel algorithms have also been developed to quantify the severity of these pathologies. Variability was evaluated by kappa (k) and intra-class correlation (ICC) statistics. Segmentation inaccuracy was below 1%. Almost perfect agreement, as measured by the k and ICC statistics, was obtained for all the analyzed pathologies: disc degeneration (k=0.81 with 95% CI=[0.75..0.88]) with a sensitivity of 95.8% and a specificity of 92.6%, disc herniation (k=0.94 with 95% CI=[0.87..1]) with a sensitivity of 60% and a specificity of 87.1%, categorical stenosis (k=0.94 with 95% CI=[0.90..0.98]) and quantitative stenosis (ICC=0.98 with 95% CI=[0.97..0.98]) with a sensitivity of 70% and a specificity of 81.7%. The proposed methods are reproducible and should be considered as a possible alternative when compared to reference standards. Copyright © 2015 Elsevier Ltd. All rights reserved.
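Agreement in this study is reported with kappa for categorical grades and the intraclass correlation for continuous measurements. The sketch below computes both on toy data: Cohen's kappa via scikit-learn and a hand-rolled ICC(2,1) (two-way random effects, absolute agreement, single rater); the grades and measurements are invented, and the exact ICC form used in the paper is not stated here.

    import numpy as np
    from sklearn.metrics import cohen_kappa_score

    def icc_2_1(x):
        """Two-way random effects, absolute agreement, single-rater ICC(2,1)
        for an n_subjects x n_raters matrix (Shrout & Fleiss)."""
        x = np.asarray(x, float)
        n, k = x.shape
        m = x.mean()
        row_m, col_m = x.mean(axis=1), x.mean(axis=0)
        msr = k * np.sum((row_m - m) ** 2) / (n - 1)
        msc = n * np.sum((col_m - m) ** 2) / (k - 1)
        mse = np.sum((x - row_m[:, None] - col_m[None, :] + m) ** 2) / ((n - 1) * (k - 1))
        return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

    # Toy categorical grades (e.g. Pfirrmann-style 1-5) from the CAD system vs. a radiologist
    cad = [1, 2, 2, 3, 4, 5, 3, 2, 4, 1]
    rater = [1, 2, 3, 3, 4, 5, 3, 2, 4, 1]
    print("kappa:", cohen_kappa_score(cad, rater))

    # Toy continuous stenosis measurements (mm) from two readings of the same discs
    print("ICC(2,1):", icc_2_1(np.column_stack([np.r_[10.2, 8.4, 12.1, 9.0, 7.7],
                                                np.r_[10.0, 8.9, 12.3, 8.8, 7.5]])))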
Walendy, Victor; Stang, Andreas
2017-01-17
Our aim was to provide nationwide age-standardised rates (ASR) on the usage of endovascular coiling and neurosurgical clipping for unruptured intracranial aneurysm (UIA) treatment in Germany. Nationwide observational study using the Diagnosis-Related-Groups (DRG) statistics for the years 2005-2009 (overall 83 million hospitalisations). From 2005 to 2009, overall 39 155 hospitalisations with a diagnosis of UIA occurred in Germany. Age-specific and age-standardised hospitalisation rates for UIA with the midyear population of Germany in 2007 as the standard. Of the 10 221 hospitalisations with UIA during the observation period, 6098 (59.7%) and 4123 (40.3%) included coiling and clipping, respectively. Overall hospitalisation rates for UIA increased by 39.5% (95% CI 24.7% to 56.0%) and 50.4% (95% CI 39.6% to 62.1%) among men and women, respectively. In 2005, the ASR per 100 000 person years for coiling was 0.7 (95% CI 0.62 to 0.78) for men and 1.7 (95% CI 1.58 to 1.82) for women. In 2009, the ASR was 1.0 (95% CI 0.90 to 1.10) and 2.4 (95% CI 2.24 to 2.56), respectively. Similarly, the ASR for clipping in 2005 amounted to 0.6 (95% CI 0.52 to 0.68) for men and 1.1 (95% CI 1.00 to 1.20) for women. These rates increased in 2009 to 0.8 (95% CI 0.72 to 0.88) and 1.7 (95% CI 1.58 to 1.82), respectively. We observed a marked geographical variation of ASR for coiling and less pronounced for clipping. For the federal state of Saarland, the ASR for coiling was 5.64 (95% CI 4.76 to 6.52) compared with 0.68 (95% CI 0.48 to 0.88; per 100 000 person years) in Saxony-Anhalt, whereas, ASR for clipping were highest in Rhineland-Palatinate (2.48, 95% CI 2.17 to 4.75) and lowest in Saxony-Anhalt (0.52, 95% CI 0.34 to 0.70). To the best of our knowledge, we presented the first representative, nationwide analysis of the clinical management of UIA in Germany. The ASR increased markedly and showed substantial geographical variation among federal states for all treatment modalities during the observation period. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.
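The ASRs above are directly age-standardised to the 2007 midyear population of Germany. A minimal sketch of direct standardisation is given below; the age bands, counts and standard-population weights are hypothetical, chosen only to show the calculation per 100 000 person-years.

    import numpy as np

    def directly_standardised_rate(cases, person_years, std_pop):
        """Direct age standardisation: age-specific rates weighted by a
        standard population, reported per 100 000 person-years."""
        cases = np.asarray(cases, float)
        py = np.asarray(person_years, float)
        w = np.asarray(std_pop, float) / np.sum(std_pop)
        age_specific = cases / py           # rate per person-year in each age band
        return 1e5 * np.sum(w * age_specific)

    # Hypothetical coiling counts and person-years in four age bands,
    # standardised to a hypothetical standard population
    cases = [40, 180, 260, 120]
    person_years = [12e6, 18e6, 16e6, 9e6]
    std_pop = [13e6, 19e6, 17e6, 10e6]
    print(directly_standardised_rate(cases, person_years, std_pop))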
Maternal mobile device use during a structured parent-child interaction task.
Radesky, Jenny; Miller, Alison L; Rosenblum, Katherine L; Appugliese, Danielle; Kaciroti, Niko; Lumeng, Julie C
2015-01-01
To examine associations of maternal mobile device use with the frequency of mother-child interactions during a structured laboratory task. Participants included 225 low-income mother-child pairs. When children were ∼6 years old, dyads were videotaped during a standardized protocol in order to characterize how mothers and children interacted when asked to try familiar and unfamiliar foods. From videotapes, we dichotomized mothers on the basis of whether or not they spontaneously used a mobile device, and we counted maternal verbal and nonverbal prompts toward the child. We used multivariate Poisson regression to study associations of device use with eating prompt frequency for different foods. Mothers were an average of 31.3 (SD 7.1) years old, and 28.0% were of Hispanic/nonwhite race/ethnicity. During the protocol, 23.1% of mothers spontaneously used a mobile device. Device use was not associated with any maternal characteristics, including age, race/ethnicity, education, depressive symptoms, or parenting style. Mothers with device use initiated fewer verbal (relative rate 0.80; 95% confidence interval 0.63, 1.03) and nonverbal (0.61; 0.39, 0.96) interactions with their children than mothers who did not use a device, when averaged across all foods. This association was strongest during introduction of halva, the most unfamiliar food (0.67; 0.48, 0.93 for verbal and 0.42; 0.20, 0.89 for nonverbal interactions). Mobile device use was common and associated with fewer interactions with children during a structured interaction task, particularly nonverbal interactions and during introduction of an unfamiliar food. More research is needed to understand how device use affects parent-child engagement in naturalistic contexts. Copyright © 2015 Academic Pediatric Association. Published by Elsevier Inc. All rights reserved.
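The relative rates above come from Poisson regression of prompt counts on device use. A minimal sketch with statsmodels on simulated data is shown below; the simulated effect size (about 0.80) and sample are illustrative, and the published models additionally adjusted for covariates and the different food exposures.

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(3)

    # Toy data: count of eating prompts per mother, with an indicator for device use
    n = 225
    device_use = rng.binomial(1, 0.23, n)
    prompts = rng.poisson(np.exp(2.0 - 0.22 * device_use))   # rate ratio ~0.80 by construction

    X = sm.add_constant(device_use)
    fit = sm.GLM(prompts, X, family=sm.families.Poisson()).fit()
    rr = np.exp(fit.params[1])                  # relative rate for device use vs. none
    rr_lo, rr_hi = np.exp(fit.conf_int()[1])    # 95% CI on the rate-ratio scale
    print(f"relative rate {rr:.2f} (95% CI {rr_lo:.2f}, {rr_hi:.2f})")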
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bain, B.
Effectiveness of a portable, ice-pack cooling vest (Steelevest) in prolonging work tolerance time in chemical defense clothing in the heat (33 °C dry bulb, 33% relative humidity, or 25 °C WBGT) was evaluated while subjects exercised at a metabolic rate of approximately 700 watts. Subjects were six male volunteers. The protocol consisted of a 20-minute treadmill walk at 1.33 m/s and a 7.5% grade, followed by 15 minutes of a lifting task, 5 minutes of rest, then another 20 minutes of the lifting task for a total of one hour. The lifting task consisted of lifting a 20 kg box, carrying it 3 meters and setting it down. This was followed by a 6 m walk (3 m back to the start point and 3 m back to the box); 15 seconds later the lifting cycle began again. The work was classified as heavy as previously defined. This protocol was repeated until the subjects were unable to continue or they reached a physiological endpoint. Time to voluntary cessation or physiological endpoint was called the work tolerance time. Physiological endpoints were a rectal temperature of 39 °C, heart rate exceeding 95% of maximum for two consecutive minutes, visible loss of motor control, or nausea. The cooling vest had no effect on work tolerance time, rate of rise of rectal temperature or sweat loss. It was concluded that the Steelevest ice-pack vest is ineffective in prolonging work tolerance time and preventing increases in rectal temperature while wearing chemical protective clothing.
Recent Sexual Trauma and Adverse Health and Occupational Outcomes Among U.S. Service Women.
Millegan, Jeffrey; Milburn, Emma K; LeardMann, Cynthia A; Street, Amy E; Williams, Diane; Trone, Daniel W; Crum-Cianflone, Nancy F
2015-08-01
Sexual trauma is prevalent among military women, but data on potential effects are needed. The association of sexual trauma with health and occupational outcomes was investigated using longitudinal data from the Millennium Cohort Study. Of 13,001 U.S. service women, 1,364 (10.5%) reported recent sexual harassment and 374 (2.9%) recent sexual assault. Women reporting recent sexual harassment or assault were more likely to report poorer mental health: OR = 1.96, 95% CI [1.71, 2.25], and OR = 3.45, 95% CI [2.67, 4.44], respectively. They reported poorer physical health: OR = 1.39, 95% CI [1.20, 1.62], and OR = 1.39, 95% CI [1.04, 1.85], respectively. They reported difficulties in work/activities due to emotional health: OR = 1.80, 95% CI [1.59, 2.04], and OR = 2.70, 95% CI [2.12, 3.44], respectively. They also reported difficulties with physical health: OR = 1.55, 95% CI [1.37, 1.75], and OR = 1.52 95% CI [1.20, 1.91], respectively, after adjustment for demographic, military, health, and prior sexual trauma characteristics. Recent sexual harassment was associated with demotion, OR = 1.47, 95% CI [1.12, 1.93]. Findings demonstrated that sexual trauma represents a potential threat to military operational readiness and draws attention to the importance of prevention strategies and services to reduce the burden of sexual trauma on military victims. Copyright © 2015 Wiley Periodicals, Inc., A Wiley Company.
Efficient Credit Assignment through Evaluation Function Decomposition
NASA Technical Reports Server (NTRS)
Agogino, Adrian; Tumer, Kagan; Miikkulainen, Risto
2005-01-01
Evolutionary methods are powerful tools in discovering solutions for difficult continuous tasks. When such a solution is encoded over multiple genes, a genetic algorithm faces the difficult credit assignment problem of evaluating how a single gene in a chromosome contributes to the full solution. Typically a single evaluation function is used for the entire chromosome, implicitly giving each gene in the chromosome the same evaluation. This method is inefficient because a gene will get credit for the contribution of all the other genes as well. Accurately measuring the fitness of individual genes in such a large search space requires many trials. This paper instead proposes turning this single complex search problem into a multi-agent search problem, where each agent has the simpler task of discovering a suitable gene. Gene-specific evaluation functions can then be created that have better theoretical properties than a single evaluation function over all genes. This method is tested in the difficult double-pole balancing problem, showing that agents using gene-specific evaluation functions can create a successful control policy in 20 percent fewer trials than the best existing genetic algorithms. The method is extended to more distributed problems, achieving 95 percent performance gains over traditional methods in the multi-rover domain.
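One common way to build gene-specific evaluation functions in this line of work is a difference evaluation: the global score minus the score obtained when the gene of interest is replaced by a fixed default, which isolates that gene's contribution. The toy sketch below illustrates the idea on a deliberately simple global objective; it is not the double-pole-balancing or multi-rover setup from the paper, and the default value and target are assumptions.

    import random

    # Global evaluation of a full chromosome (here: how well the genes' sum hits a target)
    def global_eval(genes, target=10.0):
        return -abs(sum(genes) - target)

    # Gene-specific (difference) evaluation: global score minus the score obtained
    # when gene i is replaced by a fixed default, so only gene i's contribution remains
    def gene_eval(genes, i, default=0.0, target=10.0):
        counterfactual = list(genes)
        counterfactual[i] = default
        return global_eval(genes, target) - global_eval(counterfactual, target)

    random.seed(4)
    genes = [random.uniform(0, 5) for _ in range(4)]
    print("G:", round(global_eval(genes), 3))
    print("per-gene credit:", [round(gene_eval(genes, i), 3) for i in range(len(genes))])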
21 CFR 74.1339 - D&C Red No. 39.
Code of Federal Regulations, 2010 CFR
2010-04-01
§ 74.1339 D&C Red No. 39. (a) Identity. (1) The color additive D&C... (as As), not more than 3 parts per million. Total color, not less than 95.0 percent. (c) Uses and...
Validity of Guatemalan Mother's Self-Reported Breast-Feeding Practices of 3-Month-Old Infants.
Mazariegos, Monica; Slater, Christine; Ramirez-Zea, Manuel
2016-12-01
Breast-feeding practices (BFPs) can be assessed by interviewing the mother about current feeding practices and with a 24-hour recall. It is crucial to establish the accuracy of these methods, which are commonly used by public health decision makers to design health policies aimed at increasing exclusive breast-feeding rates. We aimed to validate 2 self-report BFP instruments using the dose-to-mother deuterium oxide turnover technique (DMDOT) as the reference method. Breast-feeding practices were assessed by interviewing the mother about current feeding practices and with a 24-hour recall in 36 Guatemalan mother-infant pairs. The validity of these instruments was assessed using DMDOT as the reference method. Both self-report instruments overestimated exclusively breast-fed (EBF) infants. Infants classified as EBF were 50% by the reported current feeding practice, 61% by the 24-hour recall, and only 36% using DMDOT. Sensitivity to detect EBF infants from the mother's self-report was 92% (95% CI: 62%-99%) while from the 24-hour recall was 100% (95% CI: 72%-100%, P < .01). However, specificity for both instruments was low, at 74% (95% CI: 51%-89%) for reported current feeding practice and at 61% (95% CI: 39%-79%) for the 24-hour recall (P < .01). Both reported current feeding practice and the 24-hour recall instruments overestimated exclusive breast-feeding. Nevertheless, the use of reported current feeding practice provided more accurate data to assess BFPs in a public health setting. Furthermore, population-based surveys should consider the overestimation of exclusive breast-feeding caused when using these BFP instruments. © The Author(s) 2016.
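Sensitivity and specificity here are proportions from a 2x2 cross-classification of each self-report instrument against the DMDOT reference. The sketch below computes them with Wilson score confidence intervals; the counts are reconstructed to be consistent with the reported 36 pairs and percentages and are illustrative only.

    import math

    def wilson_ci(successes, n, z=1.96):
        """Wilson score 95% CI for a proportion."""
        p = successes / n
        denom = 1 + z ** 2 / n
        centre = (p + z ** 2 / (2 * n)) / denom
        half = z * math.sqrt(p * (1 - p) / n + z ** 2 / (4 * n ** 2)) / denom
        return centre - half, centre + half

    def sens_spec(tp, fn, tn, fp):
        sens, spec = tp / (tp + fn), tn / (tn + fp)
        return (sens, wilson_ci(tp, tp + fn)), (spec, wilson_ci(tn, tn + fp))

    # Illustrative 2x2 counts: mother's self-report vs. the DMDOT reference (36 pairs,
    # 13 EBF and 23 non-EBF by DMDOT, chosen to match the reported 92% / 74%)
    tp, fn, tn, fp = 12, 1, 17, 6
    print(sens_spec(tp, fn, tn, fp))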
Sawyer, A C P; Chittleborough, C R; Mittinty, M N; Miller-Lewis, L R; Sawyer, M G; Sullivan, T; Lynch, J W
2015-09-01
The aim of this study was to estimate the association between two key aspects of self-regulation, 'task attentiveness' and 'emotional regulation' assessed from ages 2-3 to 6-7 years, and academic achievement when children were aged 6-7 years. Participants (n = 3410) were children in the Longitudinal Study of Australian Children. Parents rated children's task attentiveness and emotional regulation abilities when children were aged 2-3, 4-5 and 6-7. Academic achievement was assessed using the Academic Rating Scale completed by teachers. Linear regression models were used to estimate the association between developmental trajectories (i.e. rate of change per year) of task attentiveness and emotional regulation, and academic achievement at 6-7 years. Improvements in task attentiveness between 2-3 and 6-7 years, adjusted for baseline levels of task attentiveness, child and family confounders, and children's receptive vocabulary and non-verbal reasoning skills at age 6-7 were associated with greater teacher-rated literacy [B = 0.05, 95% confidence interval (CI) = 0.04-0.06] and maths achievement (B = 0.04, 95% CI = 0.03-0.06) at 6-7 years. Improvements in emotional regulation, adjusting for baseline levels and covariates, were also associated with better teacher-rated literacy (B = 0.02, 95% CI = 0.01-0.04) but not with maths achievement (B = 0.01, 95% CI = -0.01-0.02) at 6-7 years. For literacy, improvements in task attentiveness had a stronger association with achievement at 6-7 years than improvements in emotional regulation. Our study shows that improved trajectories of task attentiveness from ages 2-3 to 6-7 years are associated with improved literacy and maths achievement during the early school years. Trajectories of improving emotional regulation showed smaller effects on academic outcomes. Results suggest that interventions that improve task attentiveness when children are aged 2-3 to 6-7 years have the potential to improve literacy and maths achievement during the early school years. © 2014 John Wiley & Sons Ltd.
Curreri, Chiara; Trevisan, Caterina; Carrer, Pamela; Facchini, Silvia; Giantin, Valter; Maggi, Stefania; Noale, Marianna; De Rui, Marina; Perissinotto, Egle; Zambon, Sabina; Crepaldi, Gaetano; Manzato, Enzo; Sergi, Giuseppe
2018-02-01
To investigate dysfunction in fine motor skills in a cohort of older Italian adults, identifying their prevalence and usefulness as indicators and predictors of cognitive impairment. Population-based longitudinal study with mean follow-up of 4.4 years. Community. Older men and women enrolled in the Progetto Veneto Anziani (Pro.V.A.) (N = 2,361); 1,243 subjects who were cognitively intact at baseline were selected for longitudinal analyses. Fine motor skills were assessed by measuring the time needed to successfully complete two functional tasks: putting on a shirt and a manual dexterity task. Cognitive impairment was defined as a Mini-Mental State Examination (MMSE) score less than 24. On simple correlation, baseline MMSE score was significantly associated with the manual dexterity task (correlation coefficient (r) = -0.25, P < .001) and time needed to put on a shirt (r = -0.29, P < .001). Over the study period, changes in time needed to perform the fine motor tasks were significantly associated with changes in MMSE (putting on a shirt: β = 0.083, P = .003; manual dexterity task: β = 0.098, P < .001). Logistic regression analyses confirmed that worse results on tasks were associated with cognitive impairment at baseline (odds ratio (OR) = 2.47, 95% confidence interval (CI) = 1.74-3.50, for the fourth quartile of time needed to put on a shirt; OR = 1.98, 95% CI = 1.42-2.76, for the fourth manual dexterity task quartile) and greater risk of cognitive impairment developing during follow-up (OR = 4.38, 95% CI = 2.46-7.80, for the fourth quartile of time needed to put on a shirt; OR = 2.20, 95% CI = 1.30-3.72, for the fourth manual dexterity task quartile). Difficulties with fine motor skills are common in older adults, and assessing them may help to identify early signs of dementia, subjects at high risk to develop cognitive decline, and individuals who can be referred to specialists. © 2017, Copyright the Authors Journal compilation © 2017, The American Geriatrics Society.
The Effects of a Secondary Task on Forward and Backward Walking in Parkinson Disease
Hackney, Madeleine E.; Earhart, Gammon M.
2009-01-01
Background People with Parkinson disease (PD) often fall while multi-tasking or walking backward, unavoidable activities in daily living. Dual tasks involving cognitive demand during gait and unfamiliar motor skills like backward walking could identify those with fall risk, but dual tasking while walking backward has not been examined in those with PD, those who experience Freezing of Gait (FOG), or healthy older controls. Methods Seventy-eight people with PD (mean age = 65.1±9.5 years, Female: 28%) and 74 age- and sex-matched controls (mean age = 65.0±10.0 years, Female: 23%) participated. A computerized walkway measured gait velocity, stride length, swing and stance percent, cadence, heel to heel base of support, functional ambulation profile, and gait asymmetry during forward and backward walking with and without a secondary cognitive task. Results Direction and task effects on walking performance were similar between healthy controls and those with PD. However, those with PD were more affected than controls, and freezers were more affected than non-freezers, by backward walking and dual tasking. Walking backward seemed to impact gait more than dual tasking in those with PD, although the subset of freezers appeared particularly impacted by both challenges. Conclusion People with PD are impaired while performing complex motor and mental tasks simultaneously, which may put them at risk for falling. Those with FOG are more adversely affected by both motor and mental challenges than those without. Evaluation of backward walking while performing a secondary task might be an effective clinical tool to identify locomotor difficulties. PMID:19675121
Suicidal Ideation, Attempt, and Determining Factors among HIV/AIDS Patients, Ethiopia.
Bitew, Huluagresh; Andargie, Gashaw; Tadesse, Agitu; Belete, Amsalu; Fekadu, Wubalem; Mekonen, Tesfa
2016-01-01
Background. Suicide is a serious cause of mortality worldwide and is considered a psychiatric emergency. Suicide is more frequent in people living with HIV/AIDS than in the general population. Objective. To assess the proportion and determining factors of suicidal ideation and attempt among people living with HIV/AIDS in Ethiopia. Methods. An institution-based cross-sectional study was conducted from May to June 2015 by selecting 393 participants using a systematic random sampling technique. The suicide module of the Composite International Diagnostic Interview (CIDI) was used to collect data. Logistic regression was carried out and odds ratios with 95% confidence intervals were computed. Results. The proportions of suicidal ideation and attempt were 33.6% and 20.1%, respectively. Female sex (AOR = 2.6, 95%CI: 1.27-5.22), marital status (AOR = 13.5, 95%CI: 4.69-39.13), depression (AOR = 17.0, 95%CI: 8.76-33.26), CD4 level (AOR = 2.57, 95%CI: 1.34-4.90), and presence of opportunistic infection (AOR = 5.23, 95%CI: 2.51-10.88) were associated with suicidal ideation, whereas marital status (AOR = 8.44, 95%CI: 3.117-22.84), perceived HIV stigma (AOR = 2.9, 95%CI: 1.45-5.99), opportunistic infection (AOR = 2.37, 95%CI: 1.18-4.76), and poor social support (AOR = 2.9, 95%CI: 1.58-5.41) were significantly associated with suicidal attempt. Conclusion. Suicidal ideation and attempt were high among HIV positive patients. Therefore early screening, treatment, and referral of suicidal patients are necessary in HIV clinics.
Women empowerment and use of contraception
Patrikar, S.R.; Basannar, D.R.; Seema Sharma, Maj
2014-01-01
Background Use of contraception is influenced by many processes, most notably women's empowerment. Women's decision-making power and their autonomy within the household are the most important factors affecting contraceptive use. This paper aims to analyze the relationship between these two indicators of women's empowerment and the use of contraception. Methods This cross-sectional study was conducted by personally interviewing 385 currently married women, selected by systematic sampling, using a pretested and validated questionnaire. Two indices, a women's decision-making power index and a women's autonomy index, were constructed and their association with contraceptive use was analyzed. Results & Conclusion The study provides evidence that decision-making power was low among the respondents: 48.2% (95% CI 43.34, 53.31) had a low level of power, 27.6% (95% CI 23.24, 32.16) a medium level, and 3.6% (95% CI 2.08, 5.88) a high level. 22.4% (95% CI 18.39, 26.70) of women had no autonomy, 43.9% (95% CI 38.99, 48.89) had a low level, 25% (95% CI 20.80, 29.44) a medium level, and 8.7% (95% CI 6.29, 11.98) scored above 7 (a high level of autonomy). In the study population, 273 respondents (70.7%, 95% CI 66.2, 75.28) were using contraceptives. Women's autonomy, years of marriage and number of children were significant variables. PMID:25378779
Task-specific fall prevention training is effective for warfighters with transtibial amputations.
Kaufman, Kenton R; Wyatt, Marilynn P; Sessoms, Pinata H; Grabiner, Mark D
2014-10-01
Key factors limiting patients with lower extremity amputations from achieving maximal functional capabilities are falls and fear of falling. A task-specific fall prevention training program has successfully reduced prospectively recorded trip-related falls that occur in the community among the elderly. However, this program has not been tested in amputees. In a cohort of unilateral transtibial amputees, we aimed to assess effectiveness of a falls prevention training program by (1) quantifying improvements in trunk control; (2) measuring responses to a standardized perturbation; and (3) demonstrating retention at 3 and 6 months after training. Second, we collected patient-reported outcomes for balance confidence and falls control. Fourteen male military service members (26 ± 3 years) with unilateral transtibial amputations and who had been walking without an assistive device for a median of 10 months (range, 2-106 months) were recruited to participate in this prospective cohort study. The training program used a microprocessor-controlled treadmill designed to deliver task-specific postural perturbations that simulated a trip. The training consisted of six 30-minute sessions delivered over a 2-week period, during which task difficulty, including perturbation magnitude, increased as the patient's ability progressed. Training effectiveness was assessed using a perturbation test in an immersive virtual environment. The key outcome variables were peak trunk flexion and velocity, because trunk kinematics at the recovery step have been shown to be a determinant of fall likelihood. The patient-reported outcomes were also collected using questionnaires. The effectiveness of the rehabilitation program was also assessed by collecting data before perturbation training and comparing the key outcome parameters with those measured immediately after perturbation training (0 months) as well as both 3 and 6 months posttraining. Mean trunk flexion angle and velocity significantly improved after participating in the training program. The prosthetic limb trunk flexion angle improved from pretraining (42°; 95% confidence interval [CI], 38°-47°) to after training (31°; 95% CI, 25°-37°; p < 0.001). Likewise, the trunk flexion velocity improved from pretraining (187°/sec; 95% CI, 166°-209°) to after training (143°/sec; 95% CI, 119°-167°; p < 0.004). The results display a significant side-to-side difference for peak trunk flexion angle (p = 0.01) with perturbations of the prosthetic limb resulting in higher peak angles. Prosthetic limb trips also exhibited significantly greater peak trunk flexion velocity compared with trips of the nonprosthetic limb (p = 0.005). These changes were maintained up to 6 months after the training. The peak trunk flexion angle of the subjects when the prosthetic limb was perturbed had a mean of 31° (95% CI, 25°-37°) at 0 months, 32° (95% CI, 28°-37°) at 3 months, and 30° (95% CI, 25°-34°) at 6 months. Likewise, the peak trunk flexion velocity for the prosthetic limb was a mean of 143°/sec (95% CI, 118°-167°) at 0 months, 143°/sec (95% CI, 126°-159°) at 3 months, and 132°/sec (95% CI, 115°-149°) at 6 months. The peak trunk flexion angle when the nonprosthetic limb was perturbed had a mean of 22° (95% CI, 18°-24°) at 0 months, a mean of 26° (95% CI, 20°-32°) at 3 months, and a mean of 23° (95% CI, 19°-28°) at 6 months.
The peak trunk flexion velocity for the nonprosthetic limb had a mean of 85°/sec (95% CI, 71°-98°) at 0 months, a mean of 96°/sec (95% CI, 68°-124°) at 3 months, and 87°/sec (95% CI, 68°-105°) at 6 months. There were no significant changes in the peak trunk flexion angle (p = 0.16) or peak trunk flexion velocity (p = 0.35) over time after the training ended. The skill retention was present when either the prosthetic or nonprosthetic limb was perturbed. There were side-to-side differences in the trunk flexion angle (p = 0.038) and trunk flexion velocity (p = 0.004). Perturbations of the prosthetic side resulted in larger trunk flexion and higher trunk flexion velocities. Subjects prospectively reported decreased stumbles, semicontrolled falls, and uncontrolled falls. These results indicate that task-specific fall prevention training is an effective rehabilitation method to reduce falls in persons with lower extremity transtibial amputations.
United States National Interests in China: A Post-Normalization Analysis
1981-06-01
Contents excerpt: The Great Leap Forward (1958-1960); 4. Recovery (1960-1965); 5. The Cultural Revolution (1965-1969); 6. Recovery (1969... ...sparked early interest, but the cultural contrasts and changing American views of the value of Sino-American relations have often caused China to take a
Carpentry and Masonry Career Ladders, AFSCs 552X0/552X1/55273.
1985-01-01
he follows numerous safety practices in operating machines and using tools and equipment. Some of the protective devices the wood craftsman uses may...performance. Overlap was found only in the tool repair independent job type. AFR 39-1 Specialty Description. The AFR 39-1 Specialty Descriptions for...between Carpentry and Masonry Personnel, with the exception of Shop Personnel, and here overlap was found as a function of those tasks common to tool
Adenoma Detection Rate: I Will Show You Mine if You Show Me Yours
Oliveira Ferreira, Alexandre; Fidalgo, Catarina; Palmela, Carolina; Costa Santos, Maria Pia; Torres, Joana; Nunes, Joana; Loureiro, Rui; Ferreira, Rosa; Barjas, Elídio; Glória, Luísa; Santos, António Alberto; Cravo, Marília
2017-01-01
Background Colorectal cancer (CRC) is the leading cause of cancer-related mortality in Portugal. CRC screening reduces disease-specific mortality. Colonoscopy is currently the preferred method for screening as it may contribute to the reduction of CRC incidence. This beneficial effect is strongly associated with the adenoma detection rate (ADR). Aim Our aim was to evaluate the quality of colonoscopy at our unit by measuring the currently accepted quality parameters and publishing them as benchmarking indicators. Methods From 5,860 colonoscopies, 654 screening procedures (with and without previous fecal occult blood testing) were analyzed. Results The mean age of the patients was 66.4 ± 7.8 years, and the gender distribution was 1:1. The overall ADR was 36% (95% confidence interval [CI] 32–39), the mean number of adenomas per colonoscopy was 0.66 (95% CI 0.56–0.77), and the sessile serrated lesion detection rate was 1% (95% CI 0–2). The bowel preparation was rated as adequate in 496 (76%) patients. The adjusted cecal intubation rate (CIR) was 93.7% (95% CI 91.7–95.8). Most colonoscopies were performed under monitored anesthesia care (53%), and 35% were unsedated. The use of sedation (propofol or midazolam based) was associated with a higher CIR with an odds ratio of 3.60 (95% CI 2.02–6.40, p < 0.001). Conclusion Our data show an above-standard ADR. The frequency of poor bowel preparation and the low sessile serrated lesion detection rate were acknowledged, and actions were implemented to improve both indicators. Quality auditing in colonoscopy should be compulsory, and while many units may do so internally, this is the first national report from a high-throughput endoscopy unit. PMID:28848785
[Prevalence of hypertension and diabetes in residents from Lima and Callao, Peru].
Revilla, Luis; López, Tania; Sánchez, Sixto; Yasuda, Myriam; Sanjinés, Giovanna
2014-01-01
To determine the prevalence of hypertension and diabetes in residents of districts in metropolitan Lima and Callao, Peru. This was a cross-sectional study conducted during September 2006 in people aged 15 years and older, residing in metropolitan Lima and Callao. Participants were selected using a three-stage cluster sample of neighborhoods. Standardized procedures were used to measure weight, height, waist circumference, blood pressure and blood glucose levels. Univariate, bivariate and logistic regression analyses were performed to estimate odds ratios (OR) and their respective 95% confidence intervals. We enrolled 1,771 subjects; the mean age was 39.5 ± 16.5 years. 62% were women. 19.5% (95% CI 17.6-21.4) were obese, 15.8% (95% CI 14.1-17.6) had hypertension and 3.9% (95% CI: 3.0-4.8) had diabetes. Obesity was associated with a greater likelihood of having hypertension (OR 2.15, 95% CI 1.57-2.94) and diabetes (OR 1.97, 95% CI 1.02-3.80). The results of this study in a representative sample of residents in Lima and Callao showed high prevalences of hypertension and obesity and a moderate prevalence of diabetes. These results can be used as a reference for public health interventions and to monitor their impact.
Sun, Jing; Yao, Liang; Fang, Yuan; Yang, Ruifei; Chen, Yaolong
2017-01-01
Background Evidence on the association between subclinical thyroid dysfunction and the risk of cardiovascular outcomes is conflicting. Methods and Results PubMed, EMbase, Web of Science, Cochrane Library, and China Biology Medicine (CBM) databases were searched from inception to July 10, 2016. A total of 16 studies were included for meta-analysis. We found that subclinical hypothyroidism was not associated with coronary heart disease (CHD) (RR = 1.17; 95% CI, 0.91–1.52), total mortality (RR = 1.02; 95% CI, 0.93–1.13), cardiovascular mortality (RR = 1.06; 95% CI, 0.77–1.45), heart failure (RR = 1.17; 95% CI, 0.87–1.57), or atrial fibrillation (RR = 1.05; 95% CI, 0.91–1.21), with the exception of CHD mortality (RR = 1.37; 95% CI, 1.03–1.84). Subgroup analysis indicated higher risk estimates for CHD (RR = 1.54; 95% CI, 1.00–2.39), cardiovascular mortality (RR = 2.14; 95% CI, 1.43–3.22), and CHD mortality (RR = 1.54; 95% CI, 1.11–2.15) among participants < 65 years. Furthermore, subclinical hyperthyroidism was found to be associated with CHD (RR = 1.20; 95% CI, 1.02–1.42), total mortality (RR = 1.27; 95% CI, 1.07–1.51), and CHD mortality (RR = 1.45; 95% CI, 1.12–1.86). Conclusions Subclinical hypothyroidism is likely associated with an increased risk of CHD mortality, and subclinical hyperthyroidism is likely associated with increased risk of CHD, CHD mortality, and total mortality. PMID:29081800
Analysis of the prevalence of and factors associated with overactive bladder in adult Korean women
2017-01-01
Background Overactive bladder (OAB) is one of the most prevalent lower urinary tract conditions and has been suggested to be related to various factors. We assessed the prevalence of and factors associated with OAB in women based on a large cross-sectional, population-based study of adult Korean women. Methods The Korean community health survey (KCHS) of 2012 was reviewed, and 107,950 female participants aged 19 to 107 years were identified for inclusion in this study. The overactive bladder symptom score (OABSS) was used to define and classify OAB as mild, moderate, or severe. Numerous variables, including marital status; physical activity; education and income levels; type of occupation; body mass index (BMI); smoking; alcohol; sleep time; and medical history of hypertension, diabetes mellitus, hyperlipidemia, or cerebral stroke, were evaluated. The correlation of these variables with the prevalence of OAB was analyzed using simple and multiple logistic regression analyses with complex sampling. Results The results showed that 5.2% of adult women experienced OAB. Multiple regression analyses showed a significant correlation between the following variables and OAB: older age (adjusted odds ratio [AOR] = 1.44, 95% confidence interval [CI] = 1.39–1.50, P < 0.001 as 10 years older); married status (AOR = 0.83, 95%CI = 0.70–0.96, P = 0.016); lower level of income (AOR = 1.50, 95%CI = 1.34–1.68, P < 0.001); high BMI (AOR = 1.33, 95%CI = 1.23–1.44, P < 0.001); smoking (AOR = 1.24, 95%CI = 1.04–1.47, P < 0.001); long sleep time (AOR = 1.95, 95%CI = 1.69–2.26); and medical history of hypertension (AOR = 1.11, 95%CI = 1.03–1.21, P = 0.011), diabetes mellitus (AOR = 1.38, 95%CI = 1.25–1.53, P < 0.001), hyperlipidemia (AOR = 1.27, 95%CI = 1.16–1.39, P < 0.001), and cerebral stroke (AOR = 2.04, 95%CI = 1.73–2.41, P < 0.001). The level of stress showed a dose-dependent association with OAB (AOR [95%CI] = 3.28 [2.81–3.83] > 2.11 [1.91–2.33] >1.28 [1.16–1.41] for severe > moderate > some stress, respectively, P < 0.001). Conclusion The prevalence of OAB was approximately 5.2% among adult Korean women. Older age; high BMI; stress level; sleep duration; levels of income and education; marital status; smoking; and medical history of hypertension, diabetes mellitus, hyperlipidemia, and cerebral stroke were significantly related to OAB in women. PMID:28957446
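OAB status and severity in this survey are derived from the overactive bladder symptom score (OABSS). The sketch below applies commonly cited OABSS cut-offs (urgency item of at least 2 and total score of at least 3 for OAB; mild up to 5, moderate 6-11, severe 12 or more); these thresholds are an assumption about the scoring convention rather than the survey's exact definition, and the item scores in the example are hypothetical.

    def classify_oabss(daytime, nighttime, urgency, urge_incontinence):
        """Classify OAB from the four OABSS item scores using commonly cited
        cut-offs: OAB = urgency item >= 2 and total score >= 3; severity is
        mild (<= 5), moderate (6-11) or severe (>= 12)."""
        total = daytime + nighttime + urgency + urge_incontinence
        if urgency < 2 or total < 3:
            return "no OAB", total
        if total <= 5:
            return "mild OAB", total
        if total <= 11:
            return "moderate OAB", total
        return "severe OAB", total

    print(classify_oabss(daytime=1, nighttime=2, urgency=3, urge_incontinence=2))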
[Effect of occupational stress and effort-reward imbalance on the sleep quality of police officers].
Wu, Hui; Gu, Guizhen; Yu, Shanfa
2014-04-01
To explore the effect of occupational stress and effort-reward imbalance on the sleep quality of police officers. A cluster sampling survey of sleep quality and occupational stress-related factors was conducted by questionnaire in May 2011 among 287 police officers from a city public security bureau; the relationship between sleep quality and occupational stress-related factors was analyzed by one-way ANOVA and multivariate non-conditional logistic regression using the effort-reward imbalance (ERI) model and the demand-control-support (DCS) model. The subjects were divided into a high-tension group and a low-tension group using a value of 1.0 for the ERI and DCS coefficients as the cut-off. The sleep quality score of shift-work police was higher than that of day-work police (11.95 ± 6.54 vs 9.52 ± 6.43, t = 2.77, P < 0.05). In the ERI model, the sleep quality score in the high-tension group was higher than in the low-tension group (14.50 ± 6.41 vs 8.60 ± 5.53, t = -5.32, P < 0.01), and in the DCS model the sleep quality score in the high-tension group was also higher than in the low-tension group (13.71 ± 6.62 vs 9.46 ± 6.04, t = -3.71, P < 0.01). In the regression analysis with the ERI model as the independent variable, sex (OR = 3.0, 95%CI:1.16-7.73), age 30-39 years (OR = 3.48, 95%CI:1.32-9.16), intrinsic effort (OR = 2.30, 95%CI:1.10-4.81) and daily hassles (OR = 2.15, 95%CI:1.06-4.33) were risk factors for poor sleep quality, and reward (OR = 0.26, 95%CI:0.12-0.52) was a protective factor. In the regression analysis with the DCS model as the independent variable, age 30-39 years (OR = 2.55, 95%CI:1.02-6.37), depressive symptoms (OR = 2.10, 95%CI:1.14-3.89) and daily hassles (OR = 3.25, 95%CI:1.70-6.19) were risk factors for poor sleep quality. When the ERI and DCS models were analyzed simultaneously, sex (OR = 3.03, 95%CI:1.15-7.98), age 30-39 years (OR = 3.71, 95%CI:1.38-9.98) and daily hassles (OR = 2.09, 95%CI:1.01-4.30) were risk factors for poor sleep quality, and reward (OR = 0.22, 95%CI:0.10-0.48) was a protective factor. Occupational stress and effort-reward imbalance affected the sleep quality of police officers.
Correlates of gambling on high-school grounds
Foster, Dawn W.; Hoff, Rani A.; Pilver, Corey E.; Yau, Yvonne H. C.; Steinberg, Marvin A.; Wampler, Jeremy; Krishnan-Sarin, Suchitra; Potenza, Marc N.
2015-01-01
Objective This study examined adolescent gambling on school grounds (GS+) and how such behavior was associated with gambling-related attitudes. Further, we examined whether GS+ moderated associations between at-risk problem-gambling (ARPG) and gambling behaviors related to gambling partners. Method Participants were 1988 high-school students who completed survey materials. Demographic, perception, attitude, and gambling variables were stratified by problem-gambling severity (ARPG versus recreational gambling) and GS+ status. Chi-square and adjusted logistic regression models were used to examine relationships among study variables. Results Nearly 40% (39.58%) of students reported past-year GS+, with 12.91% of GS+ students, relative to 2.63% of those who did not report gambling on school grounds (GS-), meeting DSM-IV criteria for pathological gambling (p<0.0001). In comparison to GS- students, GS+ students were more likely to report poorer academic achievement and more permissive attitudes towards gambling behaviors. Weaker links were observed in GS+ students, in comparison with GS- students, between problem-gambling severity and gambling with family members (interaction odds ratio (IOR)=0.60; 95%CI=0.39–0.92) and gambling with friends (IOR=0.21; 95%CI=0.11–0.39). Conclusions GS+ is common and associated with pathological gambling and more permissive attitudes towards gambling. The finding that GS+ (relative to GS-) youth show differences in how problem-gambling is related to gambling partners (friends and family) warrants further investigation regarding whether and how peer and familial interactions might be improved to diminish youth problem-gambling severity. The high frequency of GS+ and its relationship with ARPG highlight a need for school administrators and personnel to consider interventions that target school-based gambling. PMID:26232102
Kelly, Tanika N.; Bazzano, Lydia A.; Ajami, Nadim J.; He, Hua; Zhao, Jinying; Petrosino, Joseph F.; Correa, Adolfo; He, Jiang
2016-01-01
Rationale Few studies have systematically assessed the influence of gut microbiota on cardiovascular disease (CVD) risk. Objective To examine the association between gut microbiota and lifetime CVD risk profile among 55 Bogalusa Heart Study (BHS) participants with the highest and 57 with the lowest lifetime burdens of CVD risk factors. Methods and Results 16S rRNA sequencing was conducted on microbial DNA extracted from stool samples of the BHS participants. Alpha diversity, including measures of richness and evenness, and individual genera were tested for associations with lifetime CVD risk profile. Multivariable regression techniques were employed to adjust for age, gender, and race (Model 1), along with body mass index (BMI) (Model 2) and both BMI and diet (Model 3). In Model 1, odds ratios (95% confidence intervals) for each standard deviation increase in richness, measured by the number of observed operational taxonomic units, Chao 1 index, and abundance-based coverage estimator, were 0.62 (0.39, 0.99), 0.61 (0.38, 0.98), and 0.63 (0.39, 0.99), respectively. Associations were consistent in Models 2 and 3. Four genera were enriched among those with high versus low CVD risk profile in all models. Model 1 p-values were: 2.12×10−3, 7.95×10−5, 4.39×10−4, and 1.51×10−4 for Prevotella 2, Prevotella 7, Tyzzerella and Tyzzerella 4, respectively. Two genera were depleted among those with high versus low CVD risk profile in all models. Model 1 P-values were: 2.96×10−6 and 1.82×10−4 for Alloprevotella and Catenibacterium, respectively. Conclusions The current study identified associations of overall microbial richness and six microbial genera with lifetime CVD risk. PMID:27507222
Casteel, Carri; Peek-Asa, Corinne; Nocera, Maryalice; Smith, Jamie B; Blando, James; Goldmacher, Suzi; O'Hagan, Emily; Valiante, David; Harrison, Robert
2009-02-01
This study examines changes in rates of violent events against hospital employees before and after enactment of the California Hospital Safety and Security Act in 1995. We compared pre- and post-initiative employee assault rates in California (n = 116) emergency departments and psychiatric units with those in New Jersey (n = 50), where statewide workplace violence initiatives do not exist. Poisson regression with generalized estimating equations was used to compare assault rates between a 3-year pre-enactment period (1993-1995) and a 6-year post-enactment period (1996-2001) using New Jersey hospitals as a temporal control. Assault rates among emergency department employees decreased 48% in California post-enactment, compared with emergency department employee assault rates in New Jersey (rate ratio [RR] = 0.52, 95% confidence interval [CI]: 0.31, 0.90). Emergency department employee assault rates decreased in smaller facilities (RR = 0.46, 95% CI: 0.21, 0.96) and for-profit-controlled hospitals (RR = 0.39, 95% CI: 0.19, 0.79) post-enactment. Among psychiatric units, for-profit-controlled hospitals (RR = 0.41, 95% CI: 0.19, 0.85) and hospitals located in smaller communities (RR = 0.44, 95% CI: 0.21, 0.92) experienced decreased assault rates post-enactment. Policy may be an effective method to improve safety for health care workers.
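The comparison above is a difference-in-differences on event rates, fitted as Poisson regression with generalized estimating equations (GEE) to handle repeated measures within hospitals. The sketch below shows one way such a model can be specified with statsmodels; the column names and counts are hypothetical stand-ins, not the study data.

    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    # Hypothetical hospital-period records: assault counts, person-time at risk,
    # pre/post-enactment period, state, and a hospital id for clustering.
    df = pd.DataFrame({
        "assaults":     [9, 5, 7, 4, 3, 3, 4, 5],
        "person_years": [120.0, 125.0, 140.0, 150.0, 90.0, 95.0, 110.0, 105.0],
        "period":       ["pre", "post"] * 4,
        "state":        ["CA", "CA", "CA", "CA", "NJ", "NJ", "NJ", "NJ"],
        "hospital_id":  [1, 1, 2, 2, 3, 3, 4, 4],
    })

    # Poisson GEE with an exchangeable working correlation within hospitals and
    # a log person-time offset; exponentiated coefficients are rate ratios, and
    # the period-by-state interaction captures the post-enactment change in
    # California relative to New Jersey.
    model = sm.GEE.from_formula(
        "assaults ~ period * state",
        groups="hospital_id",
        data=df,
        family=sm.families.Poisson(),
        cov_struct=sm.cov_struct.Exchangeable(),
        offset=np.log(df["person_years"]),
    )
    result = model.fit()
    print(np.exp(result.params))  # rate ratios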
Bonato, Cassiane Cardoso; Dias, Henrique Bregolin; Alves, Michele da Silva; Duarte, Lucas Ost; Dias, Telpo Martins; Dalenogare, Maiara Oliveira; Viegas, Claudio Castelo Branco; Elnecave, Regina Helena
2014-01-30
Scattered radiation can be assessed by in vivo dosimetry. Thyroid tissue is sensitive to radiation, even at doses <10 cGy. This study compared the scattered dose to the thyroid measured by thermoluminescent dosimeters (TLDs) with that estimated by the treatment planning system (TPS). During radiotherapy to sites other than the thyroid in 16 children and adolescents, seventy-two TLD measurements at the thyroid were compared with TPS estimates. The overall TPS/TLD bias was 1.02 (95% LA 0.05 to 21.09). When bias was stratified by treatment field, the TPS overestimated TLD values at doses <1 cGy and underestimated them at doses >10 cGy. The greatest bias was found in the pelvis and abdomen: 15.01 (95% LA 9.16 to 24.61) and 5.12 (95% LA 3.04 to 8.63), respectively. There was good agreement in the orbit, head, and spine: bias 1.52 (95% LA 0.48 to 4.79), 0.44 (95% LA 0.11 to 1.82) and 0.83 (95% LA 0.39 to 1.76), respectively. Agreement was poor, with broad limits, for the lung and mediastinum: 1.13 (95% LA 0.03 to 40.90) and 0.39 (95% LA 0.02 to 7.14), respectively. The scattered dose can be measured with TLDs, and TPS algorithms for structures outside the treatment field should be improved.
Thanawattano, Chusak; Pongthornseri, Ronachai; Anan, Chanawat; Dumnin, Songphon; Bhidayasiri, Roongroj
2015-11-04
Parkinson's disease (PD) and essential tremor (ET) are the two most common movement disorders, but the rate of misdiagnosis in these disorders is high due to the similar characteristics of their tremor. The purpose of the study is to present: (a) a method to identify PD and ET patients using a novel measurement of tremor signal variation during a resting task, and (b) an improvement in the differentiation of PD from ET patients obtained by using the ratio of this novel measurement across two specific tasks. 35 PD and 22 ET patients participated in the study. They wore a 6-axis inertial sensor on the index finger of the tremor-dominant hand and performed three tasks: kinetic, postural and resting. Each task required 10 s to complete. The angular rate signal measured during these tasks was band-pass filtered and transformed into a two-dimensional representation. The ratio of the ellipse area covering 95 % of this two-dimensional representation between different tasks was investigated, and the two best tasks were selected for the purpose of differentiation. The ellipse area of the two-dimensional representation of the resting task is statistically significantly different between PD and ET subjects (p < 0.05). Furthermore, the fluctuation ratio, defined as the ratio of the ellipse area of the two-dimensional representation of resting to kinetic tremor, was statistically significantly higher in PD subjects than in ET subjects in all axes (p = 0.0014, 0.0011 and 0.0001 for the x, y and z-axis, respectively). The validation shows that the proposed method provides 100 % sensitivity, specificity and accuracy of discrimination in the 5 subjects in the validation group. While the method would have to be validated with a larger number of subjects, these preliminary results show the feasibility of the approach. This study provides a novel measurement of tremor variation in the time domain, termed 'temporal fluctuation'. The temporal fluctuation of the resting task can be used to discriminate PD from ET subjects. The ratio of the temporal fluctuation of the resting task to the kinetic task improves the reliability of the discrimination. While the method is powerful, it is also simple, so it could be applied on low-resource platforms such as smartphones and watches, which are commonly equipped with inertial sensors.
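The ellipse area covering 95 % of the two-dimensional representation, and the resting-to-kinetic fluctuation ratio built from it, can be computed from the covariance of the filtered signal. The sketch below assumes an approximately Gaussian two-dimensional point cloud and uses synthetic stand-in signals; the study's band-pass settings and axis projections are not reproduced.

    import numpy as np
    from scipy.stats import chi2

    def ellipse_area_95(xy):
        """Area of the ellipse enclosing ~95% of a 2-D point cloud,
        assuming an approximately Gaussian distribution."""
        cov = np.cov(xy, rowvar=False)      # 2x2 covariance matrix
        k = chi2.ppf(0.95, df=2)            # ~5.99 scaling for 95% coverage
        return np.pi * k * np.sqrt(np.linalg.det(cov))

    rng = np.random.default_rng(0)
    # Stand-ins for band-pass-filtered angular-rate signals projected onto 2-D
    # (e.g., two gyroscope axes); real data would come from the inertial sensor.
    resting = rng.normal(scale=[1.0, 0.6], size=(1000, 2))
    kinetic = rng.normal(scale=[0.4, 0.3], size=(1000, 2))

    fluctuation_ratio = ellipse_area_95(resting) / ellipse_area_95(kinetic)
    print(f"resting ellipse area = {ellipse_area_95(resting):.2f}")
    print(f"kinetic ellipse area = {ellipse_area_95(kinetic):.2f}")
    print(f"fluctuation ratio (resting / kinetic) = {fluctuation_ratio:.2f}")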
Classification of EEG Signals Based on Pattern Recognition Approach
Amin, Hafeez Ullah; Mumtaz, Wajid; Subhani, Ahmad Rauf; Saad, Mohamad Naufal Mohamad; Malik, Aamir Saeed
2017-01-01
Feature extraction is an important step in the process of electroencephalogram (EEG) signal classification. The authors propose a “pattern recognition” approach that discriminates EEG signals recorded during different cognitive conditions. Wavelet-based features, such as multi-resolution decompositions into detailed and approximate coefficients as well as relative wavelet energy, were computed. Extracted relative wavelet energy features were normalized to zero mean and unit variance and then optimized using Fisher's discriminant ratio (FDR) and principal component analysis (PCA). A high-density (128-channel) EEG dataset was used to validate the proposed method on two classes of signals: (1) EEG signals recorded during complex cognitive tasks using the Raven's Advanced Progressive Matrices (RAPM) test; (2) EEG signals recorded during a baseline task (eyes open). Classifiers such as K-nearest neighbors (KNN), Support Vector Machine (SVM), Multi-layer Perceptron (MLP), and Naïve Bayes (NB) were then employed. The SVM classifier yielded 99.11% accuracy for the approximation coefficients (A5) of low frequencies ranging from 0 to 3.90 Hz. Accuracy rates for the detail coefficients (D5), derived from the 3.90–7.81 Hz sub-band, were 98.57% and 98.39% for SVM and KNN, respectively. Accuracy rates for the MLP and NB classifiers were comparable at 97.11–89.63% and 91.60–81.07% for the A5 and D5 coefficients, respectively. In addition, the proposed approach was applied to a public dataset for classification of two cognitive tasks and achieved comparable classification results, i.e., 93.33% accuracy with KNN. The proposed scheme yielded significantly higher classification performance with machine learning classifiers compared to extant quantitative feature extraction methods. These results suggest that the proposed feature extraction method reliably classifies EEG signals recorded during cognitive tasks with a high degree of accuracy. PMID:29209190
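A minimal sketch of the wavelet-energy pipeline described above (multi-level decomposition, relative wavelet energy per sub-band, then an SVM) is given below using PyWavelets and scikit-learn. The mother wavelet ('db4'), the synthetic single-channel signals, and the omission of the FDR/PCA step are assumptions for illustration, not the study's exact settings.

    import numpy as np
    import pywt
    from sklearn.svm import SVC
    from sklearn.model_selection import cross_val_score

    def relative_wavelet_energy(signal, wavelet="db4", level=5):
        """Relative energy of the approximation (A5) and detail (D5..D1) sub-bands."""
        coeffs = pywt.wavedec(signal, wavelet, level=level)  # [A5, D5, D4, D3, D2, D1]
        energies = np.array([np.sum(c**2) for c in coeffs])
        return energies / energies.sum()

    rng = np.random.default_rng(1)
    n_trials, n_samples = 60, 512
    # Synthetic stand-ins for single-channel EEG epochs from two conditions
    # (cognitive task vs. eyes-open baseline); real recordings would replace these.
    task = rng.normal(size=(n_trials, n_samples)) + np.sin(np.linspace(0, 20, n_samples))
    baseline = rng.normal(size=(n_trials, n_samples))

    X = np.vstack([np.array([relative_wavelet_energy(s) for s in task]),
                   np.array([relative_wavelet_energy(s) for s in baseline])])
    y = np.array([1] * n_trials + [0] * n_trials)

    # SVM on the relative-energy features, evaluated by 5-fold cross-validation.
    scores = cross_val_score(SVC(kernel="rbf"), X, y, cv=5)
    print(f"mean CV accuracy: {scores.mean():.3f}")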
Knutson, Jayme S; Gunzler, Douglas D; Wilson, Richard D; Chae, John
2016-10-01
It is unknown whether one method of neuromuscular electrical stimulation for poststroke upper limb rehabilitation is more effective than another. Our aim was to compare the effects of contralaterally controlled functional electrical stimulation (CCFES) with cyclic neuromuscular electrical stimulation (cNMES). Stroke patients with chronic (>6 months) moderate to severe upper extremity hemiparesis (n=80) were randomized to receive 10 sessions/wk of CCFES- or cNMES-assisted hand opening exercise at home plus 20 sessions of functional task practice in the laboratory for 12 weeks. The task practice for the CCFES group was stimulation assisted. The primary outcome was change in Box and Block Test (BBT) score at 6 months post treatment. Upper extremity Fugl-Meyer and Arm Motor Abilities Test were also measured. At 6 months post treatment, the CCFES group had greater improvement on the BBT, 4.6 (95% confidence interval [CI], 2.2-7.0), than the cNMES group, 1.8 (95% CI, 0.6-3.0), between-group difference of 2.8 (95% CI, 0.1-5.5), P=0.045. No significant between-group difference was found for the upper extremity Fugl-Meyer (P=0.888) or Arm Motor Abilities Test (P=0.096). Participants who had the largest improvements on BBT were <2 years post stroke with moderate (ie, not severe) hand impairment at baseline. Among these, the 6-month post-treatment BBT gains of the CCFES group, 9.6 (95% CI, 5.6-13.6), were greater than those of the cNMES group, 4.1 (95% CI, 1.7-6.5), between-group difference of 5.5 (95% CI, 0.8-10.2), P=0.023. CCFES improved hand dexterity more than cNMES in chronic stroke survivors. URL: http://www.clinicaltrials.gov. Unique identifier: NCT00891319. © 2016 American Heart Association, Inc.
Concussion is associated with altered preparatory postural adjustments during gait initiation.
Doherty, Cailbhe; Zhao, Liang; Ryan, John; Komaba, Yusuke; Inomata, Akihiro; Caulfield, Brian
2017-04-01
Gait initiation is a useful surrogate measure of supraspinal motor control mechanisms but has never been evaluated in a cohort following concussion. The aim of this study was to quantify the preparatory postural adjustments (PPAs) of gait initiation (GI) in fifteen concussion patients (4 females, 11 males) in comparison to a group of fifteen age- and sex-matched controls. All participants completed variants of the GI task in which their dominant and non-dominant limbs served as the 'stepping' and 'support' limbs. Task performance was quantified using the centre of pressure (COP) trajectory of each foot (computed from a force plate) and a surrogate of the centre of mass (COM) trajectory (estimated from an inertial measurement unit placed on the sacrum). Concussed patients exhibited decreased COP excursion on their dominant foot, both when it was the stepping limb (sagittal plane: 9.71mm [95% CI: 8.14-11.27mm] vs 14.9mm [95% CI: 12.31-17.49mm]; frontal plane: 36.95mm [95% CI: 30.87-43.03mm] vs 54.24mm [95% CI: 46.99-61.50mm]) and when it was the support limb (sagittal plane: 10.43mm [95% CI: 8.73-12.13mm] vs 18.13mm [95% CI: 14.92-21.35mm]; frontal plane: 66.51mm [95% CI: 60.45-72.57mm] vs 88.43mm [95% CI: 78.53-98.32mm]). This was reflected in the trajectory of the COM, wherein concussion patients exhibited lower posterior displacement (19.67mm [95% CI: 19.65mm-19.7mm]) compared with controls (23.62mm [95% CI: 23.6-23.64]). On this basis, we conclude that individuals with concussion display deficits during a GI task which are potentially indicative of supraspinal impairments in motor control. Copyright © 2017 Elsevier B.V. All rights reserved.
Toxoplasmosis in Blood Donors: A Systematic Review and Meta-Analysis.
Foroutan-Rad, Masoud; Majidiani, Hamidreza; Dalvand, Sahar; Daryani, Ahmad; Kooti, Wesam; Saki, Jasem; Hedayati-Rad, Faezeh; Ahmadpour, Ehsan
2016-07-01
Transfusion-transmissible infections include pathogens that may cause severe and debilitating diseases. Toxoplasmosis is a cosmopolitan neglected parasitic infection that can lead to severe complications including death in immune-compromised patients or following infection in utero. Multiple studies have demonstrated the transmission of Toxoplasma gondii by blood transfusion. The objective of this review was to comprehensively assess the seroprevalence rate of Toxoplasma in blood donors from a worldwide perspective. Seven electronic databases (PubMed, Science Direct, Web of Science, Scopus, Cochrane, Ovid, and Google Scholar) were searched using medical subject headings terms. A total of 43 records met the inclusion criteria in which 20,964 donors were tested during the period from January 1980 to June 2015. The overall weighted prevalence of exposure to toxoplasmosis in blood donors was 33% (95% confidence interval [CI], 28%-39%). The seroprevalences of immunoglobulin (Ig)M and both IgG and IgM antibodies were 1.8% (95% CI, 1.1%-2.4%) and 1.1% (95% CI, 0.3%-1.8%), respectively. The highest and the lowest seroprevalences of toxoplasmosis were observed in Africa (46%; 95% CI, 14%-78%) and in Asia (29%; 95% CI, 23%-35%), respectively. Brazil (75%) and Ethiopia (73%) were identified as countries with high seroprevalence. Because positive serology does not imply infectiousness and because seroprevalence is high in some nations, a positive serology test result alone cannot be used as an effective method for donor screening. Future research for methods to prevent transfusion-transmitted toxoplasmosis may derive benefit from studies conducted in areas of high endemicity. Copyright © 2016 Elsevier Inc. All rights reserved.
Poethko-Müller, C; Atzpodien, K; Schmitz, R; Schlaud, M
2011-03-01
Each method to monitor vaccine safety has strengths and limitations. Therefore, vaccine safety monitoring should rely on different types of data sources. Methods commonly rely on patient-reported adverse reactions. Little is, however, known about factors that may affect the probability with which patients report adverse reactions to vaccines. From 2003 to 2006, the representative National Health Interview and Examination Survey for Children and Adolescents ("Kinder- und Jugendgesundheitssurvey", KiGGS) retrospectively collected information about vaccines, vaccination dates, and suspected vaccine-related adverse reactions from a total of 17,641 participants (<17 years). Poorly tolerated vaccinations were more likely to be reported by parents living in former West Germany than in former East Germany (OR 1.61; 95% CI 1.08-2.39), by parents of children with special health care needs (OR 1.49; 95% CI 1.08-2.04), and by parents reporting reservations about vaccination (OR 3.29; 95% CI 2.28-4.75). Parental reporting of adverse vaccine reactions appears to be associated with parental perception and assessment of possible adverse vaccine reactions, as well as with the parents' attitude towards immunization in general.
Risk factors associated with asthma phenotypes in dental healthcare workers
Singh, Tanusha; Bello, Braimoh; Jeebhay, Mohamed F
2012-01-01
Background Exposure in the dental environment can increase the risk of respiratory disease in dental healthcare workers (HCWs). This study investigated the prevalence of asthma phenotypes in dental HCWs and associated risk factors. Methods A cross-sectional study of 454 dental HCWs in five dental institutions in South Africa was conducted. A self-administered questionnaire elicited the health and employment history of subjects. Sera were analysed for atopic status and latex sensitization. Pre- and post-bronchodilator spirometry was performed. Results The prevalence of atopic asthma was 6.9%, non-atopic asthma 5.9% and work-exacerbated asthma (WEA) 4.0%. Atopy and work-related ocular-nasal symptoms were strong predictors of WEA (OR: 3.4; 95% CI: 1.07–10.8 and OR: 6.7; 95% CI: 2.4–19.1, respectively). Regular use of personal protective equipment (PPE) was associated with a protective effect (OR: 0.23, 95% CI: 0.1–0.7) among non-atopic asthmatics, while glove use and respiratory protection were protective among atopic asthmatics (OR: 0.39, 95% CI: 0.17–0.89). Conclusion Identification of risk factors associated with specific asthma phenotypes in dental HCWs can be used to focus preventive strategies for asthmatics. PMID:22473580
Talibov, Madar; Pukkala, Eero; Martinsen, Jan Ivar; Tryggvadottir, Laufey; Weiderpass, Elisabete; Hansen, Johnni
2018-05-01
Objective The aim of this case-control study was to assess the effect of night-shift work on the risk of hematological cancers. Methods The study included 39 371 leukemia, 56 713 non-Hodgkin lymphoma, 9322 Hodgkin lymphoma, and 26 188 multiple myeloma cases diagnosed between 1961 and 2005 in Finland, Sweden, and Iceland. Five controls for each case were selected from the Nordic Occupational Cancer Study (NOCCA) cohort, matched by year of birth, sex and country. Night-shift exposure was assessed by using the NOCCA job-exposure matrix (JEM). Odds ratios (OR) and 95% confidence intervals (95% CI) were calculated from conditional logistic regression models. Results Overall, night work was not associated with the risk of hematological cancers. We observed a small but non-significant increase in risk for leukemia (OR 1.07, 95% CI 0.99-1.16), especially for acute myeloid leukemia (OR 1.15, 95% CI 0.97-1.36), among workers with a high level of cumulative night work exposure. Night work exposure was not associated with lymphatic cancers or multiple myeloma. Conclusion This study did not support an association between night-shift work and hematological cancers.
Cancer incidence and mortality among Swedish leather tanners.
Mikoczy, Z; Schütz, A; Hagmar, L
1994-01-01
OBJECTIVES--The aim was to study the incidence of cancer among Swedish leather tanners. METHODS--A cohort of 2026 subjects who had been employed for at least one year between 1900 and 1989 in three Swedish leather tanneries was established. The cancer incidence and mortality patterns were assessed for the periods 1958-89 and 1952-89 respectively, and cause-specific standardised incidence and mortality ratios (SIRs and SMRs) were calculated. RESULTS--A significantly increased incidence of soft tissue sarcomas (SIR 4.27, 95% confidence interval (95% CI) 1.39-9.97) was found, based on five cases. Non-significant excesses were also found for multiple myeloma (SIR 2.54, 95% CI 0.93-5.53) and sinonasal cancer (SIR 3.77, 95% CI 0.46-13.6). CONCLUSIONS--The increased incidence of soft tissue sarcomas adds support to previous findings of an excess mortality in this diagnosis among leather tanners. A plausible cause is exposure to chlorophenols, which had occurred in all three plants. The excess of multiple myeloma may also be associated with exposure to chlorophenols. The association between incidence of cancer and specific chemical exposure will be elucidated in a cohort-based case-referent study. PMID:7951777
Agrawal, Rupesh; Lee, Cecilia; Gonzalez-Lopez, Julio J.; Khan, Sharmina; Rodrigues, Valeria; Pavesio, Carlos
2016-01-01
Objective To analyse the safety and efficacy of a non-selective cyclo-oxygenase (COX) inhibitor in the management of non-infectious, non-necrotising anterior scleritis. Methods Retrospective chart review of 126 patients with non-necrotising anterior scleritis treated with oral flurbiprofen (Froben®, Abbott Healthcare) with (group B, n=61) or without (group A, n=65) topical steroids was performed, and time to remission was plotted. Results The observed incidence rate was 1.07 (95% CI: 0.57–1.99) per 1000 person-years, with a failure rate of 0.68 (95% CI: 0.22–2.12) per 1000 person-years in group A and 1.41 (95% CI: 0.67–2.96) per 1000 person-years in group B. The failure rate was 3.97 (95% CI: 1.89–9.34) per 1000 person-years, with a hazard ratio of 10.01 (95% CI: 2.52–39.65; p<0.001), for patients with associated systemic disease. Conclusion To the best of our knowledge, this is the first and largest case series on the safety and efficacy of a non-selective COX inhibitor in the management of anterior scleritis. PMID:26308394
Predicting Endurance Time in a Repetitive Lift and Carry Task Using Linear Mixed Models
Ham, Daniel J.; Best, Stuart A.; Carstairs, Greg L.; Savage, Robert J.; Straney, Lahn; Caldwell, Joanne N.
2016-01-01
Objectives Repetitive manual handling tasks account for a substantial portion of work-related injuries. However, few studies report endurance time in repetitive manual handling tasks. Consequently, there is little guidance to inform expected work time for repetitive manual handling tasks. We aimed to investigate endurance time and oxygen consumption of a repetitive lift and carry task using linear mixed models. Methods Fourteen male soldiers (age 22.4 ± 4.5 yrs, height 1.78 ± 0.04 m, body mass 76.3 ± 10.1 kg) conducted four assessment sessions that consisted of one maximal box lifting session and three lift and carry sessions. The relationships between carry mass (range 17.5–37.5 kg) and the duration of carry, and carry mass and oxygen consumption, were assessed using linear mixed models with random effects to account for between-subject variation. Results Results demonstrated that endurance time was inversely associated with carry mass (R2 = 0.24), with significant individual-level variation (R2 = 0.85). Normalising carry mass to performance in a maximal box lifting test improved the prediction of endurance time (R2 = 0.40). Oxygen consumption presented relative to total mass (body mass, external load and carried mass) was not significantly related to lift and carry mass (β1 = 0.16, SE = 0.10, 95%CI: -0.04, 0.36, p = 0.12), indicating that there was no change in oxygen consumption relative to total mass with increasing lift and carry mass. Conclusion Practically, these data can be used to guide work-rest schedules and provide insight into methods assessing the physical capacity of workers conducting repetitive manual handling tasks. PMID:27379902
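The linear mixed model relating endurance time to carry mass, with random effects for between-subject variation, can be expressed as sketched below with statsmodels. The column names (endurance_min, carry_mass_kg, subject), the simulated values, and the random-intercept specification are assumptions for illustration, not the study's exact model or data.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(2)
    # Hypothetical long-format data: each of 14 soldiers carries several masses,
    # and the time to task failure (endurance) is recorded.
    subjects = np.repeat(np.arange(14), 3)
    carry_mass = np.tile([17.5, 27.5, 37.5], 14)
    subject_effect = rng.normal(0, 8, size=14)[subjects]   # between-subject variation
    endurance = 60 - 1.2 * carry_mass + subject_effect + rng.normal(0, 5, size=carry_mass.size)

    df = pd.DataFrame({"endurance_min": endurance,
                       "carry_mass_kg": carry_mass,
                       "subject": subjects})

    # Random-intercept linear mixed model: fixed effect of carry mass,
    # random effect (intercept) for each subject.
    model = smf.mixedlm("endurance_min ~ carry_mass_kg", df, groups=df["subject"])
    fit = model.fit()
    print(fit.summary())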
Huffam, Sarah; Chow, Eric P F; Fairley, Christopher K; Hocking, Jane; Peel, Joanne; Chen, Marcus
2015-09-01
We aimed to ascertain the proportion testing positive for chlamydia, and the factors predictive of infection, among females, heterosexual males and men who have sex with men (MSM) presenting to a sexual health service and reporting contact with a chlamydia-infected sexual partner. A cross-sectional analysis of patients attending the Melbourne Sexual Health Centre from October 2010 to September 2013 was performed. Behavioural data obtained using computer-assisted self-interview were analysed to determine factors predictive of chlamydia. Of the 491 female, 808 heterosexual male, and 268 MSM chlamydia contacts, the proportion diagnosed with chlamydia was 39.9% (95% CI 35.7% to 44.3%), 36.1% (95% CI 32.9% to 39.9%) and 23.5% (95% CI 18.8% to 29.0%), respectively. Female chlamydia contacts were more likely to have chlamydia if aged <25 (adjusted OR (AOR) 1.86, 95% CI 1.12 to 3.10) or if they reported inconsistent condom use during vaginal sex with a regular male partner (AOR 2.5, 95% CI 1.12 to 6.14). Heterosexual male contacts were more likely to have chlamydia if aged <25 (AOR 1.69, 95% CI 1.25 to 2.28) or if they had a regular female sexual partner (AOR 1.38, 95% CI 1.03 to 1.85). In MSM, urethral chlamydia was diagnosed in 8.8%, rectal chlamydia in 20.2%, and chlamydia at both sites in 3.9%. MSM were more likely to have chlamydia if they had a regular male sexual partner (OR 2.12, 95% CI 1.18 to 3.81). This study of female, heterosexual male, and MSM presentations with self-reported chlamydia contact provides insight into the likelihood and predictive factors of infection. The data may inform policy and individual clinical decision making regarding presumptive treatment of chlamydia contacts. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
Rock magnetic and paleomagnetic study of the Keurusselkä impact structure, central Finland
NASA Astrophysics Data System (ADS)
Raiskila, Selen; Salminen, Johanna; Elbra, Tiiu; Pesonen, Lauri J.
2011-11-01
There are 31 proven impact structures in Fennoscandia—one of the most densely crater-populated areas of the Earth. The recently discovered Keurusselkä impact structure (62°08' N, 24°37' E) is located within the Central Finland Granitoid Complex, which formed 1890-1860 Ma ago during the Svecofennian orogeny. It is a deeply eroded complex crater that yields in situ shatter cones with evidence of shock metamorphism, e.g., planar deformation features in quartz. New petrophysical and rock magnetic results of shocked and unshocked target rocks of various lithologies combined with paleomagnetic studies are presented. The suggested central uplift with shatter cones is characterized by increased magnetization and susceptibility. The presence of magnetite and pyrrhotite was observed as carriers for the remanent magnetization. Four different remanent magnetization directions were isolated: (1) a characteristic Svecofennian target rock component A with a mean direction of D = 334.8°, I = 45.6°, α95 = 14.9° yielding a pole (Plat = 51.1°, Plon = 241.9°, A95 = 15.1°), (2) component B, D = 42.4°, I = 64.1°, α95 = 8.4° yielding a pole (Plat = 61.0°, Plon = 129.1°, A95 = 10.6°), (3) component C (D = 159.5°, I = 65.4°, α95 = 10.7°) yielding a pole (Plat = 21.0°, Plon = 39.3°, A95 = 15.6°), and (4) component E (D = 275.5°, I = 62.0°, α95 = 14.4°) yielding a pole (Plat = 39.7°, Plon = 314.3°, A95 = 19.7°). Components C and E are considered much younger, possibly Neoproterozoic overprints, compared with the components A and B. The pole of component B corresponds with the 1120 Ma pole of Salla diabase dyke and is in agreement with the 40Ar/39Ar age of 1140 Ma from a pseudotachylitic breccia vein in a central part of the structure. Therefore, component B could be related to the impact, and thus represent the impact age.
Chan, Ying Ying; Teh, Chien Huey; Lim, Kuang Kuay; Lim, Kuang Hock; Yeo, Pei Sien; Kee, Chee Cheong; Omar, Mohd Azahadi; Ahmad, Noor Ani
2015-08-06
Self-rated health (SRH) has been demonstrated as a valid and appropriate predictor of incident mortality and chronic morbidity. Associations between lifestyle, chronic diseases, and SRH have been reported by various population studies but few have included data from developing countries. The aim of this study was to determine the prevalence of poor SRH in Malaysia and its association with lifestyle factors and chronic diseases among Malaysian adults. This study was based on 18,184 adults aged 18 and above who participated in the 2011 National Health and Morbidity Survey (NHMS). The NHMS was a cross-sectional survey (two-stage stratified sample) designed to collect health information on a nationally representative sample of the Malaysian adult population. Data were obtained via face-to-face interviews using validated questionnaires. Two categories were used to measure SRH: "good" (very good and good) and "poor" (moderate, not good and very bad). The association of lifestyle factors and chronic diseases with poor SRH was examined using univariate and multivariate logistic regression. Approximately one-fifth of the Malaysian adult population (20.1 %) rated their health as poor (men: 18.4 % and women: 21.7 %). Prevalence increases with age from 16.2 % (aged 18-29) to 32.0 % (aged ≥60). In the multivariate logistic regression analysis, lifestyle factors associated with poor SRH included: underweight (OR = 1.29; 95 % CI: 1.05-1.57), physical inactivity (OR = 1.25; 95 % CI: 1.11-1.39), former smoker (OR = 1.38; 95 % CI: 1.12-1.70), former drinker (OR = 1.27; 95 % CI: 1.01-1.62), and current drinker (OR = 1.35; 95 % CI: 1.08-1.68). Chronic diseases associated with poor SRH included: asthma (OR = 1.66; 95 % CI: 1.36-2.03), arthritis (OR = 1.87; 95 % CI: 1.52-2.29), hypertension (OR = 1.39; 95 % CI: 1.18-1.64), hypercholesterolemia (OR = 1.43; 95 % CI: 1.18-1.74), and heart disease (OR = 1.85; 95 % CI: 1.43-2.39). This study indicates that several unhealthy lifestyle behaviours and chronic diseases are significantly associated with poor SRH among Malaysian adults. Effective public health strategies are needed to promote healthy lifestyles, and disease prevention interventions should be enhanced at the community level to improve overall health.
Giorgio, Margaret; Townsend, Loraine; Zembe, Yanga; Guttmacher, Sally; Kapadia, Farzana; Cheyip, Mireille; Mathews, Catherine
2018-01-01
Objectives To examine the relationship between sexual violence and transactional sex and assess the impact of social support on this relationship among female transnational migrants in Cape Town, South Africa. Methods In 2012 we administered a behavioral risk factor survey using respondent-driven sampling to transnational migrant women aged between 16 and 39 years, born outside South Africa, living in Cape Town, and speaking English, Shona, Swahili, Lingala, Kirundi, Kinyarwanda, French, or Somali. Results Controlling for study covariates, travel-phase sexual violence was positively associated with engagement in transactional sex (adjusted prevalence ratio [APR] = 1.38; 95% confidence interval [CI] = 1.07, 1.77), and social support was shown to be a protective factor (APR = 0.84; 95% CI = 0.75, 0.95). The interaction of experienced sexual violence during migration and social support score was APR = 0.85 (95% CI = 0.66, 1.10). In the stratified analysis, we found an increased risk of transactional sex among the low social support group (APR = 1.56; 95% CI = 1.22, 2.00). This relationship was not statistically significant among the moderate or high social support group (APR = 1.04; 95% CI = 0.58, 1.87). Conclusions Programs designed to strengthen social support may reduce transactional sex among migrant women after they have settled in their receiving communities. PMID:29417089
Hiscock, R; Kumar, D; Simmons, S W
2015-05-01
We assessed agreement in haemoglobin measurement between Masimo pulse co-oximeters (Rad-7™ and Pronto-7™) and HemoCue® photometers (201+ or B-Hemoglobin) with laboratory-based determination and identified 39 relevant studies (2915 patients in Masimo group and 3084 patients in HemoCue group). In the Masimo group, the overall mean difference was -0.03 g/dl (95% prediction interval -0.30 to 0.23) and 95% limits of agreement -3.0 to 2.9 g/dl compared to 0.08 g/dl (95% prediction interval -0.04 to 0.20) and 95% limits of agreement -1.3 to 1.4 g/dl in the HemoCue group. Only B-Hemoglobin exhibited bias (0.53, 95% prediction interval 0.27 to 0.78). The overall standard deviation of difference was larger (1.42 g/dl versus 0.64 g/dl) for Masimo pulse co-oximeters compared to HemoCue photometers. Masimo devices and HemoCue 201+ both provide an unbiased, pooled estimate of laboratory haemoglobin. However, Masimo devices have lower precision and wider 95% limits of agreement than HemoCue devices. Clinicians should carefully consider these limits of agreement before basing transfusion or other clinical decisions on these point-of-care measurements alone.
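The bias and 95% limits of agreement summarized above follow the standard Bland-Altman calculation: the mean of the paired device-laboratory differences, plus or minus 1.96 times their standard deviation. A minimal sketch with made-up paired haemoglobin values follows; pooling across studies, as in the meta-analysis above, is not reproduced here.

    import numpy as np

    # Made-up paired measurements (g/dl): point-of-care device vs. laboratory.
    device = np.array([11.2, 13.5, 9.8, 14.1, 12.0, 10.5, 13.0, 8.9])
    lab    = np.array([11.0, 13.9, 10.2, 13.8, 12.4, 10.1, 12.7, 9.5])

    diff = device - lab
    bias = diff.mean()
    sd = diff.std(ddof=1)

    # 95% limits of agreement: bias +/- 1.96 * SD of the paired differences.
    loa_lower, loa_upper = bias - 1.96 * sd, bias + 1.96 * sd
    print(f"bias = {bias:.2f} g/dl")
    print(f"95% limits of agreement: {loa_lower:.2f} to {loa_upper:.2f} g/dl")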
Parenting-Related Stressors and Self-Reported Mental Health of Mothers With Young Children
Mistry, Ritesh; Stevens, Gregory D.; Sareen, Harvinder; De Vogli, Roberto; Halfon, Neal
2007-01-01
Objectives. We assessed whether there were associations between maternal mental health and individual and co-occurring parenting stressors related to social and financial factors and child health care access. Methods. We used cross-sectional data from the 2000 National Survey of Early Childhood Health. The 5-item Mental Health Inventory was used to measure self-reported mental health. Results. After we controlled for demographic covariates, we found that the following stressors increased the risk of poor maternal mental health: lack of emotional (odds ratio [OR] = 3.4; 95% confidence interval [CI] = 2.0, 5.9) or functional (OR=2.2; 95% CI=1.3, 3.7) social support for parenting, too much time spent with child (OR=3.5; 95% CI=2.0, 6.1), and difficulty paying for child care (OR=2.3; 95% CI=1.4, 3.9). In comparison with mothers without any parenting stressors, mothers reporting 1 stressor had 3 times the odds of poor mental health (OR = 3.1; 95% CI = 2.1, 4.8), and mothers reporting 2 or more stressors had nearly 12 times the odds (OR = 11.7; 95% CI = 7.1, 19.3). Conclusions. If parenting stressors such as those examined here are to be addressed, changes may be required in community support systems, and improvements in relevant social policies may be needed. PMID:17538058
Gitonga, Caroline W.; Gillig, Jonathan; Owaga, Chrispin; Marube, Elizabeth; Odongo, Wycliffe; Okoth, Albert; China, Pauline; Oriango, Robin; Brooker, Simon J.; Bousema, Teun; Drakeley, Chris; Cox, Jonathan
2013-01-01
Background School surveys provide an operational approach to assess malaria transmission through parasite prevalence. There is limited evidence on the comparability of prevalence estimates obtained from school and community surveys carried out at the same locality. Methods Concurrent school and community cross-sectional surveys were conducted in 46 school/community clusters in the western Kenyan highlands, and the households of school children were geolocated. Malaria was assessed by rapid diagnostic test (RDT) and combined seroprevalence of antibodies to blood-stage Plasmodium falciparum antigens. Results RDT prevalence in school and community populations was 25.7% (95% CI: 24.4-26.8) and 15.5% (95% CI: 14.4-16.7), respectively. Seroprevalence in the school and community populations was 51.9% (95% CI: 50.5-53.3) and 51.5% (95% CI: 49.5-52.9), respectively. RDT prevalence in schools could differentiate between low (<7%, 95% CI: 0-19%) and high (>39%, 95% CI: 25-49%) transmission areas in the community and, after a simple adjustment, was concordant with the community estimates. Conclusions Estimates of malaria prevalence from school surveys were consistently higher than those from community surveys and were strongly correlated. School-based estimates can be used as a reliable indicator of malaria transmission intensity in the wider community and may provide a basis for identifying priority areas for malaria control. PMID:24143250
STS-95 crew members greet families at Launch Pad 39B
NASA Technical Reports Server (NTRS)
1998-01-01
STS-95 crew members greet their families from Launch Pad 39B. From left, they are Mission Specialist Scott E. Parazynski, Payload Specialist Chiaki Mukai, with the National Space Development Agency of Japan (NASDA), Payload Specialist John H. Glenn Jr., senator from Ohio, Mission Specialist Stephen K. Robinson, Pilot Steven W. Lindsey, Mission Commander Curtis L. Brown Jr., and Mission Specialist Pedro Duque of Spain, with the European Space Agency (ESA). The crew were making final preparations for launch, targeted for liftoff at 2 p.m. on Oct. 29. The mission is expected to last 8 days, 21 hours and 49 minutes, returning to KSC at 11:49 a.m. EST on Nov. 7.
2013-01-01
Background In Ethiopia, there is a growing concern about the increasing rates of loss to follow-up (LTFU) in HIV programs among people waiting to start HIV treatment. Unlike other African countries, there is little information about the factors associated with LTFU among pre-antiretroviral treatment (pre-ART) patients in Ethiopia. We conducted a case–control study to investigate factors associated with pre-ART LTFU in Ethiopia. Methods Charts of HIV patients newly enrolled in HIV care at Gondar University Hospital (GUH) between September 11, 2008 and May 8, 2011 were reviewed. Patients who were “loss to follow-up” during the pre-ART period were considered to be cases and patients who were “in care” during the pre-ART period were controls. Logistic regression analysis was used to explore factors associated with pre-ART LTFU. Results In multivariable analyses, the following factors were found to be independently associated with pre-ART LTFU: male gender [Adjusted Odds Ratio (AOR) = 2.00 (95% CI: 1.15, 3.46)], higher baseline CD4 cell count (251–300 cells/μl [AOR = 2.64 (95% CI: 1.05, 6.65)], 301–350 cells/μl [AOR = 5.21 (95% CI: 1.94, 13.99)], and >350 cells/μl [AOR = 12.10 (95% CI: 6.33, 23.12)] compared to CD4 cell count of ≤200 cells/μl) and less advanced disease stage (WHO stage I [AOR = 2.81 (95% CI: 1.15, 6.91)] compared to WHO stage IV). Married patients [AOR = 0.39 (95% CI: 0.19, 0.79)] had reduced odds of being LTFU. In addition, patients whose next visit date was not documented on their medical chart [AOR = 241.39 (95% CI: 119.90, 485.97)] were more likely to be LTFU. Conclusion Our study identified various factors associated with pre-ART LTFU. The findings highlight the importance of giving considerable attention to pre-ART patients’ care from the time that they learn of their positive HIV serostatus. The completeness of the medical records, the standard of record keeping and obstacles to retrieving charts also indicate a serious problem that needs due attention from clinicians and data personnel. PMID:24053770
Lau, Colleen L; Sheridan, Sarah; Ryan, Stephanie; Roineau, Maureen; Andreosso, Athena; Fuimaono, Saipale; Tufa, Joseph; Graves, Patricia M
2017-09-01
The Global Programme to Eliminate Lymphatic Filariasis (LF) aims to eliminate the disease as a public health problem by 2020 by conducting mass drug administration (MDA) and controlling morbidity. Once elimination targets have been reached, surveillance is critical for ensuring that programmatic gains are sustained, and challenges include timely identification of residual areas of transmission. WHO guidelines encourage cost-efficient surveillance, such as integration with other population-based surveys. In American Samoa, where LF is caused by Wuchereria bancrofti, and Aedes polynesiensis is the main vector, the LF elimination program has made significant progress. Seven rounds of MDA (albendazole and diethylcarbamazine) were completed from 2000 to 2006, and Transmission Assessment Surveys were passed in 2010/2011 and 2015. However, a seroprevalence study using an adult serum bank collected in 2010 detected two potential residual foci of transmission, with Og4C3 antigen (Ag) prevalence of 30.8% and 15.6%. We conducted a follow-up study in 2014 to verify whether transmission was truly occurring by comparing seroprevalence between residents of suspected hotspots and residents of other villages. In adults from non-hotspot villages (N = 602), seroprevalence of Ag (ICT or Og4C3), Bm14 antibody (Ab) and Wb123 Ab were 1.2% (95% CI 0.6-2.6%), 9.6% (95% CI 7.5%-12.3%), and 10.5% (95% CI 7.6-14.3%), respectively. Comparatively, adult residents of Fagali'i (N = 38) had significantly higher seroprevalence of Ag (26.9%, 95% CI 17.3-39.4%), Bm14 Ab (43.4%, 95% CI 32.4-55.0%), and Wb123 Ab (55.2%, 95% CI 39.6-69.8%). Adult residents of Ili'ili/Vaitogi/Futiga (N = 113) also had higher prevalence of Ag and Ab, but the differences were not statistically significant. The presence of transmission was demonstrated by 1.1% Ag prevalence (95% CI 0.2% to 3.1%) in 283 children aged 7-13 years who lived in one of the suspected hotspots, and by microfilaraemia in four individuals, all of whom lived in the suspected hotspots, including a 9-year-old child. Our results provide field evidence that integrating LF surveillance with other surveys is effective and feasible for identifying potential hotspots, and conducting surveillance at worksites provides an efficient method of sampling large populations of adults.
Krishnaiah, Sannapaneni; Srinivas, Marmamula; Khanna, Rohit C; Rao, Gullapalli N
2009-01-01
Aim: To report the prevalence, risk factors and associated population attributable risk percentage (PAR) for refractive errors in the South Indian adult population. Methods: A population-based cross-sectional epidemiologic study was conducted in the Indian state of Andhra Pradesh. A multistage cluster, systematic, stratified random sampling method was used to obtain participants (n = 10293) for this study. Results: The age-gender-area-adjusted prevalence rates in those ≥40 years of age were determined for myopia (spherical equivalent [SE] < −0.5 D) 34.6% (95% confidence interval [CI]: 33.1–36.1), high-myopia (SE < −5.0 D) 4.5% (95% CI: 3.8–5.2), hyperopia (SE > +0.5 D) 18.4% (95% CI: 17.1–19.7), astigmatism (cylinder < −0.5 D) 37.6% (95% CI: 36–39.2), and anisometropia (SE difference between right and left eyes >0.5 D) 13.0% (95% CI: 11.9–14.1). The prevalence of myopia, astigmatism, high-myopia, and anisometropia significantly increased with increasing age (all p < 0.0001). There was no gender difference in prevalence rates in any type of refractive error, though women had a significantly higher rate of hyperopia than men (p < 0.0001). Hyperopia was significantly higher among those with a higher educational level (odds ratio [OR] 2.49; 95% CI: 1.51–3.95) and significantly higher among the hypertensive group (OR 1.24; 95% CI: 1.03–1.49). The severity of lens nuclear opacity was positively associated with myopia and negatively associated with hyperopia. Conclusions: The prevalence of myopia in this adult Indian population is much higher than in similarly aged white populations. These results confirm the previously reported association between myopia, hyperopia, and nuclear opacity. PMID:19668540
Icotinib Is an Active Treatment of Non-Small-Cell Lung Cancer: A Retrospective Study
Liu, Yiqian; Liu, Ping; Yin, Yongmei; Guo, Renhua; Lu, Kaihua; Gu, Yanhong; Liu, Lianke; Wang, Jinghua; Wang, Zhaoxia; Røe, Oluf Dimitri; Shu, Yongqian; Zhu, Lingjun
2014-01-01
Background Icotinib hydrochloride is a novel epidermal growth factor receptor (EGFR) tyrosine kinase inhibitor (TKI) with preclinical and clinical activity in non-small cell lung cancer (NSCLC). This retrospective analysis was performed to assess the efficacy of icotinib in patients with NSCLC. Methods 82 consecutive patients treated with icotinib as first (n = 24) or second/third line (n = 58) treatment at three hospitals in Nanjing were enrolled in our retrospective study. The Response Evaluation Criteria in Solid Tumors (RECIST) were used to evaluate tumor responses, and progression-free survival (PFS) and overall survival (OS) were estimated by the Kaplan-Meier method. Results Median PFS was 4.0 months (95% CI 2.311–5.689). Median OS was 11.0 months (95% CI 8.537–13.463) in this cohort. Median PFS for first and second/third line were 7.0 months (95% CI 2.151–11.8) and 3.0 months (95% CI 1.042–4.958), respectively. Median OS for first and second/third line were 13.0 months (95% CI 10.305–15.695) and 10.0 months (95% CI 7.295–12.70), respectively. In patients with EGFR mutation (n = 19), icotinib significantly reduced the risk of progression (HR 0.36, 95% CI 0.18–0.70, p = 0.003) and death (HR 0.10, 95% CI 0.02–0.42, p = 0.002) compared with patients whose EGFR status was unknown (n = 63). The most common adverse events were acne-like rash (39.0%) and diarrhea (20.7%). Conclusions Icotinib is active in the treatment of patients with NSCLC in both the first-line and second/third-line settings, especially in patients harbouring EGFR mutations, with an acceptable adverse event profile. PMID:24836053
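The median PFS and OS values above are Kaplan-Meier estimates. The sketch below shows that estimation with the lifelines package, using made-up follow-up times and event indicators rather than the study's data.

    import pandas as pd
    from lifelines import KaplanMeierFitter

    # Made-up progression-free survival data: months of follow-up and whether
    # progression (event=1) or censoring (event=0) occurred.
    df = pd.DataFrame({
        "months": [2.0, 4.5, 3.0, 7.0, 11.0, 5.5, 13.0, 4.0, 9.0, 6.0],
        "event":  [1,   1,   1,   1,   0,    1,   0,    1,   1,   1],
    })

    kmf = KaplanMeierFitter()
    kmf.fit(durations=df["months"], event_observed=df["event"], label="PFS")
    print(kmf.median_survival_time_)        # Kaplan-Meier median PFS
    print(kmf.survival_function_.head())    # step-function survival estimates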
Managing sepsis: Electronic recognition, rapid response teams, and standardized care save lives.
Guirgis, Faheem W; Jones, Lisa; Esma, Rhemar; Weiss, Alice; McCurdy, Kaitlin; Ferreira, Jason; Cannon, Christina; McLauchlin, Laura; Smotherman, Carmen; Kraemer, Dale F; Gerdik, Cynthia; Webb, Kendall; Ra, Jin; Moore, Frederick A; Gray-Eurom, Kelly
2017-08-01
Sepsis can lead to poor outcomes when treatment is delayed or inadequate. The purpose of this study was to evaluate outcomes after initiation of a hospital-wide sepsis alert program. Retrospective review of patients ≥18 years treated for sepsis. There were 3917 sepsis admissions: 1929 admissions before, and 1988 in the after phase. Mean age (57.3 vs. 57.1, p=0.94) and Charlson Comorbidity Scores (2.52 vs. 2.47, p=0.35) were similar between groups. Multivariable analyses identified significant reductions in the after phase for odds of death (OR 0.62, 95% CI 0.39-0.99, p=0.046), mean intensive care unit LOS (2.12 days before, 95% CI 1.97, 2.34; 1.95 days after, 95% CI 1.75, 2.06; p<0.001), mean overall hospital LOS (11.7 days before, 95% CI 10.9, 12.7 days; 9.9 days after, 95% CI 9.3, 10.6 days, p<0.001), odds of mechanical ventilation use (OR 0.62, 95% CI 0.39, 0.99, p=0.007), and total charges, with a savings of $7159 per sepsis admission (p=0.036). There was no reduction in vasopressor use (OR 0.89, 95% CI 0.75, 1.06, p=0.18). A hospital-wide program utilizing electronic recognition and RRT intervention resulted in improved outcomes in patients with sepsis. Copyright © 2017 Elsevier Inc. All rights reserved.
Trends in Childbirth before 39 Weeks’ Gestation without Medical Indication
Kozhimannil, Katy B.; Macheras, Michelle; Lorch, Scott A.
2014-01-01
Background There is increasing attention to labor induction and cesarean delivery occurring at 37 0/7 – 38 6/7 weeks’ gestation (early-term) without medical indication. Objective To measure prevalence, change over time, patient characteristics, and infant outcomes associated with early-term nonindicated births. Research Design and Subjects Retrospective analysis using linked hospital discharge and birth certificate data for the 7,296,363 uncomplicated term (>37 0/7 weeks’ gestation) births between 1995-2009 in 3 states. Measures Early-term nonindicated birth is calculated using diagnosis codes and birth certificate records. Secondary outcomes included infant prolonged length of stay and respiratory distress. Results Across uncomplicated term births, the early-term nonindicated birth rate was 3.18%. After adjustment, the risk of nonindicated birth before 39 weeks was 86% higher in 2009 than 1995 (hazard ratio [HR]=1.86 [95% confidence interval (CI), 1.81–1.90]), peaking in 2006 (HR=2.03; P < .001). Factors independently associated with higher odds included maternal age, higher education levels, private health insurance, and delivering at smaller-volume or nonteaching hospitals. Black women had higher risk of nonindicated cesarean birth (HR=1.29 [95% CI, 1.27–1.32]), which was associated with greater odds of prolonged length of stay (adjusted odds ratio [AOR]=1.60 [95% CI, 1.57–1.64]) and infant respiratory distress (AOR= 2.44 [95% CI, 2.37–2.50]) compared with births after 38 6/7 weeks. Early-term nonindicated induction was also associated with comparatively greater odds of prolonged length of stay (AOR=1.20 [95% CI, 1.17–1.23]). Conclusion Nearly 4% of all uncomplicated births to term infants occurred before 39 0/7 weeks' gestation without medical indication. These births were associated with adverse infant outcomes. PMID:24926713
Regassa, Alemayehu; Golicha, Gelma; Tesfaye, Dawit; Abunna, Fufa; Megersa, Bekele
2013-10-01
A cross-sectional study was carried out from November 2010 up to April 2011 to estimate mastitis prevalence and associated risk factors and to assess its bacterial causes in traditionally managed camels in Borana Zone, Southern Ethiopia. Thus, 348 lactating camels were examined clinically, and subclinical cases were checked with California mastitis test (CMT). The overall prevalence of mastitis was 44.8 % (156/348), comprising clinical (19, 5.4 %) and subclinical (137, 39.4 %) cases. The quarter level prevalence of mastitis was 24.0 % (334/1,392). Of the total 1,392 examined teats, 30 were blind, and hence, from the 1,362 non-blind CMT-examined teats, 22.3 % (304/1,362) were CMT positive. Of the 304 CMT-positive samples, 264 were culture positive (197 Gram-positive, 41 Gram-negative, and 26 mixed isolates), and 40 were culture negative. The prevalence of Staphylococcus aureus was found to be the highest at both the animal (12.8 %, 39/304) and quarter level (2.9 %, 39/1,362). Regression analysis revealed higher likelihood of mastitis occurrence among camels from Dharito (OR = 3.4, 95 % confidence interval (CI) = 1.8, 6.4), Gagna (OR = 3.4, 95 % CI = 1.8, 6.5), and Haro Bake (OR = 2.6, 95 % CI = 1.3, 5.1) than camels from Surupha. Likewise, there was higher chance of mastitis occurrence among camels at the early lactation stage (OR = 2.3, 95 % CI = 1.1, 4.6) and camels with udder/teat lesions (OR = 13.7, 95 % CI = 1.7, 109.4) than among camels at late lactation stage and camels with healthy udder/teats, respectively. In conclusion, this study reveals the current status of camel mastitis in Southern Ethiopia.
The perinatal effects of delayed childbearing.
Joseph, K S; Allen, Alexander C; Dodds, Linda; Turner, Linda Ann; Scott, Heather; Liston, Robert
2005-06-01
To determine if the rates of pregnancy complications, preterm birth, small for gestational age, perinatal mortality, and serious neonatal morbidity are higher among mothers aged 35-39 years or 40 years or older, compared with mothers 20-24 years. We performed a population-based study of all women in Nova Scotia, Canada, who delivered a singleton fetus between 1988 and 2002 (N = 157,445). Family income of women who delivered between 1988 and 1995 was obtained through a confidential linkage with tax records (n = 76,300). The primary outcome was perinatal death (excluding congenital anomalies) or serious neonatal morbidity. Analysis was based on logistic models. Older women were more likely to be married, affluent, weigh 70 kg or more, attend prenatal classes, and have a bad obstetric history but less likely to be nulliparous and to smoke. They were more likely to have hypertension, diabetes mellitus, placental abruption, or placenta previa. Preterm birth and small-for-gestational age rates were also higher; compared with women aged 20-24 years, adjusted rate ratios for preterm birth among women aged 35-39 years and 40 years or older were 1.61 (95% confidence interval [CI] 1.42-1.82; P < .001) and 1.80 (95% CI 1.37-2.36; P < .001), respectively. Adjusted rate ratios for perinatal mortality/morbidity were 1.46 (95% CI 1.11-1.92; P = .007) among women 35-39 years and 1.95 (95% CI 1.13-3.35; P = .02) among women 40 years or older. Perinatal mortality rates were low at all ages, especially in recent years. Older maternal age is associated with relatively higher risks of perinatal mortality/morbidity, although the absolute rate of such outcomes is low.
Castle, Philip E; Pierz, Amanda; Stoler, Mark H
2018-02-01
There remains uncertainty about the role of human papillomavirus (HPV) infection in causing small-cell neuroendocrine carcinoma (SCNC) and large-cell neuroendocrine carcinoma (LCNC) of the cervix. To clarify the role of HPV in the development of SCNC and LCNC, we conducted a systematic review and meta-analyses. PubMed and Embase were searched to initially identify 143 articles published on or before June 1, 2017. Studies were limited to methods that tested for HPV in the cancer tissue directly to minimize misattribution. Thirty-two studies with 403 SCNC and 9 studies with 45 LCNC were included in the analysis. For SCNC, 85% (95% confidence interval [95%CI]=71%-94%) were HPV positive, 78% (95%CI=64%-90%) were HPV16 and/or HPV18 positive, 51% (95%CI=39%-64%) were singly HPV18 positive, and 10% (95%CI=4%-19%) were singly HPV16 positive. In a subset of 5 SCNC studies (75 cases), 93% were positive for p16INK4a by immunohistochemistry and 100% were HPV positive. For LCNC, 88% (95%CI=72%-99%) were HPV positive, 86% (95%CI=70%-98%) were positive for HPV16 or HPV18, 30% were singly HPV18 positive (95%CI=4%-60%), and 29% (95%CI=2%-64%) were singly HPV16 positive. In conclusion, most SCNC and LCNC are caused by HPV, primarily HPV18 and HPV16. Therefore, most if not all SCNC and LCNC will be prevented by currently available prophylactic HPV vaccines. Copyright © 2017 Elsevier Inc. All rights reserved.
Montero-Odasso, Manuel M; Sarquis-Adamson, Yanina; Speechley, Mark; Borrie, Michael J; Hachinski, Vladimir C; Wells, Jennie; Riccio, Patricia M; Schapira, Marcelo; Sejdic, Ervin; Camicioli, Richard M; Bartha, Robert; McIlroy, William E; Muir-Hunter, Susan
2017-07-01
Gait performance is affected by neurodegeneration in aging and has the potential to be used as a clinical marker for progression from mild cognitive impairment (MCI) to dementia. A dual-task gait test evaluating the cognitive-motor interface may predict dementia progression in older adults with MCI. To determine whether a dual-task gait test is associated with incident dementia in MCI. The Gait and Brain Study is an ongoing prospective cohort study of community-dwelling older adults that enrolled 112 older adults with MCI. Participants were followed up for 6 years, with biannual visits including neurologic, cognitive, and gait assessments. Data were collected from July 2007 to March 2016. Incident all-cause dementia was the main outcome measure, and single- and dual-task gait velocity and dual-task gait costs were the independent variables. A neuropsychological test battery was used to assess cognition. Gait velocity was recorded under single-task and 3 separate dual-task conditions using an electronic walkway. Dual-task gait cost was defined as the percentage change between single- and dual-task gait velocities: ([single-task gait velocity - dual-task gait velocity] / single-task gait velocity) × 100. Cox proportional hazard models were used to estimate the association between risk of progression to dementia and the independent variables, adjusted for age, sex, education, comorbidities, and cognition. Among 112 study participants with MCI, mean (SD) age was 76.6 (6.9) years, 55 were women (49.1%), and 27 progressed to dementia (24.1%), with an incidence rate of 121 per 1000 person-years. Slow single-task gait velocity (<0.8 m/second) was not associated with progression to dementia (hazard ratio [HR], 3.41; 95% CI, 0.99-11.71; P = .05), whereas high dual-task gait cost while counting backward (HR, 3.79; 95% CI, 1.57-9.15; P = .003) and naming animals (HR, 2.41; 95% CI, 1.04-5.59; P = .04) were associated with dementia progression (incidence rate, 155 per 1000 person-years). The models remained robust after adjusting for baseline cognition, except for dual-task gait cost when dichotomized. Dual-task gait is associated with progression to dementia in patients with MCI. Dual-task gait testing is easy to administer and may be used by clinicians to decide on further biomarker testing, preventive strategies, and follow-up planning in patients with MCI. clinicaltrials.gov: NCT03020381.
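For readers implementing the dual-task cost definition above, a minimal Python sketch of the calculation follows; the velocities are made-up illustrative values, not data from the Gait and Brain Study:

    def dual_task_cost(single_task_velocity, dual_task_velocity):
        """Percentage change between single- and dual-task gait velocity."""
        return (single_task_velocity - dual_task_velocity) / single_task_velocity * 100

    # Hypothetical example: 1.10 m/s walking alone, 0.85 m/s while counting backward
    print(dual_task_cost(1.10, 0.85))  # about 22.7% dual-task gait cost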
2. West portal of Tunnel 39, view to east, 135mm ...
2. West portal of Tunnel 39, view to east, 135mm lens with electronic flash fill. Note the notched wingwalls that support steel posts of entrance snowshed; these would have originally held timber posts of the original timber snowsheds, miles of which once enclosed and protected the railroad from the ravages of Sierra winters. Note also that these tunnels, built in the 1920s, have dispensed with any use of stone masonry, and instead have all-concrete portals. - Central Pacific Transcontinental Railroad, Tunnel No. 39, Milepost 180.95, Cisco, Placer County, CA
1972-11-03
FREQUENCY OF WIND DIRECTION AND SPEED (FROM HOURLY OBSERVATIONS), Station 23211, Hamilton AFB, Calif./San Rafael, 39-70, November, all weather. (The cumulative percentage-frequency table itself is garbled in the source extraction and is not reproduced here.)
Demographic predictors of peanut, tree nut, fish, shellfish, and sesame allergy in Canada.
Ben-Shoshan, M; Harrington, D W; Soller, L; Fragapane, J; Joseph, L; Pierre, Y St; Godefroy, S B; Elliott, S J; Clarke, A E
2012-01-01
Background. Studies suggest that the rising prevalence of food allergy during recent decades may have stabilized. Although genetics undoubtedly contribute to the emergence of food allergy, it is likely that other factors play a crucial role in mediating such short-term changes. Objective. To identify potential demographic predictors of food allergies. Methods. We performed a cross-Canada, random telephone survey. Criteria for food allergy were self-report of convincing symptoms and/or physician diagnosis of allergy. Multivariate logistic regressions were used to assess potential determinants. Results. Of 10,596 households surveyed in 2008/2009, 3666 responded, representing 9667 individuals. Peanut, tree nut, and sesame allergy were more common in children (odds ratio (OR) 2.24 (95% CI, 1.40, 3.59), 1.73 (95% CI, 1.11, 2.68), and 5.63 (95% CI, 1.39, 22.87), resp.) while fish and shellfish allergy were less common in children (OR 0.17 (95% CI, 0.04, 0.72) and 0.29 (95% CI, 0.14, 0.61)). Tree nut and shellfish allergy were less common in males (OR 0.55 (95% CI, 0.36, 0.83) and 0.63 (95% CI, 0.43, 0.91)). Shellfish allergy was more common in urban settings (OR 1.55 (95% CI, 1.04, 2.31)). There was a trend for most food allergies to be more prevalent in the more educated (tree nut OR 1.90 (95% CI, 1.18, 3.04)) and less prevalent in immigrants (shellfish OR 0.49 (95% CI, 0.26, 0.95)), but wide CIs preclude definitive conclusions for most foods. Conclusions. Our results reveal that in addition to age and sex, place of residence, socioeconomic status, and birth place may influence the development of food allergy.
Relationship Between Gastroesophageal Reflux Symptoms and Dietary Factors in Korea
Song, Ji Hyun; Chung, Su Jin; Lee, Jun Haeng; Kim, Young-Ho; Chang, Dong Kyung; Son, Hee Jung; Kim, Jae J; Rhee, Jong Chul
2011-01-01
Background/Aims The incidence of gastroesophageal reflux disease (GERD) is increasing in Korea. The aim of this study was to evaluate the relationship between GERD symptoms and dietary factors in Korea. Methods From January 2007 to April 2008, 162 subjects were enrolled (81 in the GERD group and 81 in the control group). They were asked to complete questionnaires about GERD symptoms and dietary habits. The symptom severity score was recorded on a visual analogue scale. Results Subjects with overweight or obesity had an increased risk for GERD (OR, 2.52; 95% CI, 1.18-5.39). Irregular dietary intake was one of the risk factors for GERD (OR, 2.33; 95% CI, 1.11-4.89). Acid regurgitation was the most bothersome (2.85 ± 2.95 on the visual analogue scale) and most frequent (57.5%) reflux-related symptom in GERD. Noodles (OR, 1.22; 95% CI, 1.12-1.34), spicy foods (OR, 1.09; 95% CI, 1.02-1.16), fatty meals (OR, 1.20; 95% CI, 1.09-1.33), sweets (OR, 1.42; 95% CI, 1.00-2.02), alcohol (OR, 1.16; 95% CI, 1.03-1.31), breads (OR, 1.17; 95% CI, 1.01-1.34), carbonated drinks (OR, 1.69; 95% CI, 1.04-2.74) and caffeinated drinks (OR, 1.41; 95% CI, 1.15-1.73) were associated with symptom aggravation in GERD. Among the investigated noodles, ramen (instant noodles) caused reflux-related symptoms most frequently (52.4%). Conclusions We found that noodles, spicy foods, fatty meals, sweets, alcohol, breads, carbonated drinks and caffeinated drinks were associated with reflux-related symptoms. PMID:21369492
The dynamics of the emergency medical readmission - The underlying fundamentals.
Byrne, Declan; O'Riordan, Deirdre; Conway, Richard; Cournane, Sean; Silke, Bernard
2017-11-01
Hospital readmissions are a perennial problem. We reviewed readmissions to one institution (2002-2015) and investigated their dynamics. 96,474 emergency admissions (in 50,701 patients) to an Irish hospital over a 15-year period were studied, and patterns surrounding early (<28 days) and late (any other) readmissions were determined. Univariate and logistic or truncated Poisson regression methods were employed. The early readmission rate averaged 9.6% (95% CI: 9.4, 9.8) with a low/high of 8.4% (95% CI: 7.8, 9.1) and 10.3% (95% CI: 9.6, 11.0) respectively, with no overall time trend. Early readmissions represented 20.1% (95% CI: 19.8, 20.5) of emergency medical readmissions. Median time to first readmission was 55 weeks (95% CI: 13, 159); time to second was 35 weeks (95% CI: 9, 98); by the 7th/8th readmissions, intervals were 13 weeks (95% CI: 4, 36) and 11 weeks (95% CI: 4, 30). Readmitted patients were older, 67.1 years (95% CI: 48.3, 79.2) vs. 53.9 years (34.3, 72.4) for single admissions, and stayed longer, 5.8 days (2.7, 10.6) vs. 3.9 days (1.5, 8.0). Readmissions had more Acute Illness Severity, Charlson Co-Morbidity and Chronic Disabling Disease. Between 2002 and 2015, logistic-model-adjusted 30-day in-hospital mortality fell from 6.1% (95% CI: 5.7, 6.5) to 4.4% (95% CI: 4.1, 4.7) (RRR 30.4%). The early hospital readmission rate did not change over 15 years despite improvements in hospital mortality outcomes. Readmissions have a consistent pattern related to patient illness and social characteristics; the fundamentals are driven by disease progression over time. Copyright © 2017 European Federation of Internal Medicine. Published by Elsevier B.V. All rights reserved.
Ayas, Mouhab; Saber, Wael; Davies, Stella M.; Harris, Richard E.; Hale, Gregory A.; Socie, Gerard; LeRademacher, Jennifer; Thakar, Monica; Deeg, H. Joachim J.; Al-Seraihy, Amal; Battiwalla, Minoo; Camitta, Bruce M.; Olsson, Richard; Bajwa, Rajinder S.; Bonfim, Carmem M.; Pasquini, Ricardo; MacMillan, Margaret L.; George, Biju; Copelan, Edward A.; Wirk, Baldeep; Al Jefri, Abdullah; Fasth, Anders L.; Guinan, Eva C.; Horn, Biljana N.; Lewis, Victor A.; Slavin, Shimon; Stepensky, Polina; Bierings, Marc; Gale, Robert Peter
2013-01-01
Purpose Allogeneic hematopoietic cell transplantation (HCT) can cure bone marrow failure in patients with Fanconi anemia (FA). Data on outcomes in patients with pretransplantation cytogenetic abnormalities, myelodysplastic syndrome (MDS), or acute leukemia have not been separately analyzed. Patients and Methods We analyzed data on 113 patients with FA with cytogenetic abnormalities (n = 54), MDS (n = 45), or acute leukemia (n = 14) who were reported to the Center for International Blood and Marrow Transplant Research from 1985 to 2007. Results Neutrophil recovery occurred in 78% and 85% of patients at days 28 and 100, respectively. Day 100 cumulative incidences of acute graft-versus-host disease grades B to D and C to D were 26% (95% CI, 19% to 35%) and 12% (95% CI, 7% to 19%), respectively. Survival probabilities at 1, 3, and 5 years were 64% (95% CI, 55% to 73%), 58% (95% CI, 48% to 67%), and 55% (95% CI, 45% to 64%), respectively. In univariate analysis, younger age was associated with superior 5-year survival (≤ v > 14 years: 69% [95% CI, 57% to 80%] v 39% [95% CI, 26% to 53%], respectively; P = .001). In transplantations from HLA-matched related donors (n = 82), younger patients (≤ v > 14 years: 78% [95% CI, 64% to 90%] v 34% [95% CI, 20% to 50%], respectively; P < .001) and patients with cytogenetic abnormalities only versus MDS/acute leukemia (67% [95% CI, 52% to 81%] v 43% [95% CI, 27% to 59%], respectively; P = .03) had superior 5-year survival. Conclusion Our analysis indicates that long-term survival for patients with FA with cytogenetic abnormalities, MDS, or acute leukemia is achievable. Younger patients and recipients of HLA-matched related donor transplantations who have cytogenetic abnormalities only have the best survival. PMID:23547077
Observed self-regulation is associated with weight in low-income toddlers.
Miller, Alison L; Rosenblum, Katherine L; Retzloff, Lauren B; Lumeng, Julie C
2016-10-01
Obesity emerges in early childhood and tracks across development. Self-regulation develops rapidly during the toddler years, yet few studies have examined toddlers' self-regulation in relation to concurrent child weight. Further, few studies compare child responses in food and non-food-related tasks. Our goal was to examine toddlers' observed behavioral and emotional self-regulation in food and non-food tasks in relation to their body mass index z-score (BMIz) and weight status (overweight/obese vs. not). Observational measures were used to assess self-regulation (SR) in four standardized tasks in 133 low-income children (M age = 33.1 months; SD = 0.6). Behavioral SR was measured by assessing how well the child could delay gratification for a snack (food-related task) and a gift (non-food-related task). Emotional SR was measured by assessing child intensity of negative affect in two tasks designed to elicit frustration: being shown, then denied a cookie (food-related) or a toy (non-food-related). Task order was counterbalanced. BMIz was measured. Bivariate correlations and regression analyses adjusting for child sex, child race/ethnicity, and maternal education were conducted to examine associations of SR with weight. Better behavioral SR in the snack delay task was associated with lower BMIz (β = -0.27, p < 0.05) and lower odds of overweight/obesity (OR = 0.66, 95% CI 0.45, 0.96), but behavioral SR in the gift task was not associated with BMIz or weight status. Better emotional SR in the non-food task was associated with lower BMIz (β = -0.27, p < 0.05), and better emotional SR in food and non-food tasks was associated with lower odds of overweight/obesity (OR = 0.65, 95% CI 0.45, 0.96 and OR = 0.56, 95% CI 0.37, 0.87, respectively). Results are discussed with regard to how behavioral SR for food and overall emotional SR relate to weight during toddlerhood, and to implications for early childhood obesity prevention. Copyright © 2016 Elsevier Ltd. All rights reserved.
Code of Federal Regulations, 2010 CFR
2010-01-01
...-0001; (b) By hand delivery to the NRC's offices at 11555 Rockville Pike, Rockville, Maryland; or (c....gov/site-help/e-submittals.html; by e-mail to [email protected]; or by writing the Office of... 73 of this chapter or delivered by hand in accordance with § 95.39 of this chapter to the NRC...
McCabe, Jessica; Monkiewicz, Michelle; Holcomb, John; Pundik, Svetlana; Daly, Janis J
2015-06-01
To compare response to upper-limb treatment using robotics plus motor learning (ML) versus functional electrical stimulation (FES) plus ML versus ML alone, according to a measure of complex functional everyday tasks for chronic, severely impaired stroke survivors. Single-blind, randomized trial. Medical center. Enrolled subjects (N=39) were >1 year after a single stroke (attrition rate=10%; 35 completed the study). All groups received treatment 5d/wk for 5h/d (60 sessions), with unique treatment as follows: ML alone (n=11) (5h/d partial- and whole-task practice of complex functional tasks), robotics plus ML (n=12) (3.5h/d of ML and 1.5h/d of shoulder/elbow robotics), and FES plus ML (n=12) (3.5h/d of ML and 1.5h/d of FES wrist/hand coordination training). Primary measure: Arm Motor Ability Test (AMAT), with 13 complex functional tasks; secondary measure: upper-limb Fugl-Meyer coordination scale (FM). No significant difference in treatment response was found across groups (AMAT: P≥.584; FM coordination: P≥.590). All 3 treatment groups demonstrated clinically and statistically significant improvement in response to treatment (AMAT and FM coordination: P≤.009). A group treatment paradigm with a 1:3 therapist-to-patient ratio proved feasible for provision of the intensive treatment. No adverse effects occurred. Severely impaired stroke survivors with persistent (>1y) upper-extremity dysfunction can make clinically and statistically significant gains in coordination and functional task performance in response to robotics plus ML, FES plus ML, and ML alone in an intensive and long-duration intervention; no group differences were found. Additional studies are warranted to determine the effectiveness of these methods in the clinical setting. Copyright © 2015 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.
Oligoclonal bands predict multiple sclerosis in children with optic neuritis.
Heussinger, Nicole; Kontopantelis, Evangelos; Gburek-Augustat, Janina; Jenke, Andreas; Vollrath, Gesa; Korinthenberg, Rudolf; Hofstetter, Peter; Meyer, Sascha; Brecht, Isabel; Kornek, Barbara; Herkenrath, Peter; Schimmel, Mareike; Wenner, Kirsten; Häusler, Martin; Lutz, Soeren; Karenfort, Michael; Blaschek, Astrid; Smitka, Martin; Karch, Stephanie; Piepkorn, Martin; Rostasy, Kevin; Lücke, Thomas; Weber, Peter; Trollmann, Regina; Klepper, Jörg; Häussler, Martin; Hofmann, Regina; Weissert, Robert; Merkenschlager, Andreas; Buttmann, Mathias
2015-06-01
We retrospectively evaluated predictors of conversion to multiple sclerosis (MS) in 357 children with isolated optic neuritis (ON) as a first demyelinating event who had a median follow-up of 4.0 years. Multiple Cox proportional-hazards regressions revealed abnormal cranial magnetic resonance imaging (cMRI; hazard ratio [HR] = 5.94, 95% confidence interval [CI] = 3.39-10.39, p < 0.001), presence of cerebrospinal fluid immunoglobulin G oligoclonal bands (OCB; HR = 3.69, 95% CI = 2.32-5.86, p < 0.001), and age (HR = 1.08 per year of age, 95% CI = 1.02-1.13, p = 0.003) as independent predictors of conversion, whereas sex and laterality (unilateral vs bilateral) had no influence. Combined cMRI and OCB positivity indicated a 26.84-fold higher HR for developing MS compared to double negativity (95% CI = 12.26-58.74, p < 0.001). Accordingly, cerebrospinal fluid analysis may supplement cMRI to determine the risk of MS in children with isolated ON. © 2015 American Neurological Association.
Shebl, Fatma M.; Bhatia, Kishor; Engels, Eric A.
2009-01-01
Individuals with acquired immunodeficiency syndrome (AIDS) manifest an increased risk of cancer, particularly cancers caused by oncogenic viruses. Because some salivary gland and nasopharyngeal cancers are associated with Epstein Barr virus, the impact of AIDS on these cancers needs further evaluation. We used linked U.S. AIDS and cancer registry data (N=519,934 people with AIDS) to derive standardized incidence ratios (SIRs) comparing risk of salivary gland and nasopharyngeal cancers to the general population. For salivary gland cancers (N=43 cases), individuals with AIDS had strongly elevated risks for lymphoepithelial carcinoma (SIR 39, 95% CI 16-81) and squamous cell carcinoma (SIR 4.9, 95% CI 2.5-8.6). Among nasopharyngeal cancers (N=39 cases), risks were elevated for both keratinizing and non-keratinizing carcinomas (SIR 2.4, 95% CI 1.5-3.7, and SIR 2.4, 95% CI 1.2-4.4, respectively). The elevated risks of salivary gland and nasopharyngeal cancers among people with AIDS suggest that immunosuppression and oncogenic viral infections are etiologically important. PMID:19810095
Field, Matt; Gage, Suzanne; Hammerton, Gemma; Heron, Jon; Hickman, Matt; Munafò, Marcus R
2018-01-01
Abstract Aims The study aimed to examine the association between adolescent alcohol use and working memory (WM) using a large population sample. Methods Data from the Avon Longitudinal Study of Parents and Children were used to investigate the association between alcohol use at age 15 years and WM 3 years later, assessed using the N-back task (N ~ 3300). A three-category ordinal variable captured mutually exclusive alcohol groupings ranging in order of severity (i.e. low alcohol users, frequent drinkers and frequent/binge drinkers). Differential dropout was accounted for using multiple imputation and inverse probability weighting. Adjustment was made for potential confounders. Results There was evidence of an association between frequent/binge drinking (compared to the low alcohol group) and poorer performance on the 3-back task after adjusting for sociodemographic confounding variables, WM at age 11 years, and experience of a head injury/unconsciousness before age 11 years (β = −0.23, 95% CI = −0.37 to −0.09, P = 0.001). However, this association was attenuated (β = −0.12, 95% CI = −0.27 to 0.03, P = 0.11) when further adjusted for baseline measures of weekly cigarette tobacco and cannabis use. Weaker associations were found for the less demanding 2-back task. We found no evidence to suggest frequent drinking was associated with performance on either task. Conclusions We found weak evidence of an association between sustained heavy alcohol use in mid-adolescence and impaired WM 3 years later. Although we cannot fully rule out the possibility of reverse causation, several potential confounding variables were included to address the directionality of the relationship between WM and alcohol use problems. PMID:29329371
Ananth, Cande V; Goldenberg, Robert L; Friedman, Alexander M; Vintzileos, Anthony M
2018-05-14
Whether the changing gestational age distribution in the United States since 2005 has affected perinatal mortality remains unknown. To examine changes in gestational age distribution and gestational age-specific perinatal mortality. This retrospective cohort study examined trends in US perinatal mortality by linking live birth and infant death data among more than 35 million singleton births from January 1, 2007, through December 31, 2015. Year of birth and changes in gestational age distribution. Changes in the proportion of births at gestational ages 20 to 27, 28 to 31, 32 to 33, 34 to 36, 37 to 38, 39 to 40, 41, and 42 to 44 weeks; changes in perinatal mortality (stillbirth at ≥20 weeks, and neonatal deaths at <28 days) rates; and contribution of gestational age changes to perinatal mortality. Trends were estimated from log-linear regression models adjusted for confounders. Among the 34 236 577 singleton live births during the study period, the proportion of births at all gestational ages declined, except at 39 to 40 weeks, which increased (54.5% in 2007 to 60.2% in 2015). Overall perinatal mortality declined from 9.0 to 8.6 per 1000 births (P < .001). Stillbirths declined from 5.7 to 5.6 per 1000 births (P < .001), and neonatal mortality declined from 3.3 to 3.0 per 1000 births (P < .001). Although the proportion of births at gestational ages 34 to 36, 37 to 38, and 42 to 44 weeks declined, perinatal mortality rates at these gestational ages showed annual adjusted relative increases of 1.0% (95% CI, 0.6%-1.4%), 2.3% (95% CI, 1.9%-2.8%), and 4.2% (95% CI, 1.5%-7.0%), respectively. Neonatal mortality rates at gestational ages 34 to 36 and 37 to 38 weeks showed a relative adjusted annual increase of 0.9% (95% CI, 0.2%-1.6%) and 3.1% (95% CI, 2.1%-4.1%), respectively. Although the proportion of births at gestational age 39 to 40 weeks increased, perinatal mortality showed an annual relative adjusted decline of -1.3% (95% CI, -1.8% to -0.9%). The decline in the neonatal mortality rate was attributable more to changes in the gestational age distribution than to changes in gestational age-specific mortality. Although the proportion of births at gestational age 39 to 40 weeks increased, perinatal mortality at this gestational age declined. This finding may reflect pregnancies delivered at 39 to 40 weeks that previously would have been delivered earlier unnecessarily, leaving fetuses at higher risk for mortality at other gestational ages.
Mniouil, Meryem; Fellah, Hajiba; Amarir, Fatima; Sadak, Abderrahim; Et-Touys, Abdeslam; Bakri, Youssef; Moustachi, Aziza; Tassou, Fatima Zahraa; Hida, Mostapha; Lyagoubi, Mohamed; Adlaoui, El Bachir; Rhajaoui, Mohamed; Sebti, Faiza
2018-06-01
A rapid, sensitive and specific tool for detection of Leishmania infantum infection in humans would be highly desirable, because it would allow control interventions in endemic areas of visceral leishmaniasis. This study was carried out at the Reference National Laboratory of Leishmaniasis (RNLL) at the National Institute of Hygiene (NIH), Morocco, to evaluate the diagnostic potential of the immunochromatographic dipstick test (ICT) rk39 in Moroccan patients with suspected VL. A total of 49 admitted patients with strong clinical suspicion of VL and 40 healthy controls were investigated for the performance of the ICT rk39. Bone marrow smears obtained from the admitted patients were examined for microscopic detection of Leishmania amastigotes. Only PCR- and smear-positive cases were considered the gold standard and counted as confirmed cases of VL. Of the 49 suspected patients, twenty-four (48.9%) were found PCR- and smear-positive and twenty-three (46.9%) were positive for ICT rk39. Voluntary healthy controls, which included twenty persons from the endemic zone and twenty from a non-endemic zone of VL, were all found negative on the strip test. The sensitivity in sera was 75% by ELISA and 87.5% by IFAT, compared with 95.8% for ICT rk39. Specificity was 95.8% for both ELISA and IFAT, and 100% for ICT rk39. The present findings reinforce that the ICT rk39 is a simple, reliable and easy-to-perform non-invasive diagnostic tool for visceral leishmaniasis in the endemic area of Morocco. Copyright © 2018 Elsevier B.V. All rights reserved.
Ruiz Hidalgo, Irene; Rodriguez, Pablo; Rozema, Jos J; Ní Dhubhghaill, Sorcha; Zakaria, Nadia; Tassignon, Marie-José; Koppen, Carina
2016-06-01
To evaluate the performance of a support vector machine algorithm that automatically and objectively identifies corneal patterns based on a combination of 22 parameters obtained from Pentacam measurements and to compare this method with other known keratoconus (KC) classification methods. Pentacam data from 860 eyes were included in the study and divided into 5 groups: 454 KC, 67 forme fruste (FF), 28 astigmatic, 117 after refractive surgery (PR), and 194 normal eyes (N). Twenty-two parameters were used for classification using a support vector machine algorithm developed in Weka, a machine-learning software package. The cross-validation accuracy for 3 different classification tasks (KC vs. N, FF vs. N, and all 5 groups) was calculated and compared with other known classification methods. The accuracy achieved in the KC versus N discrimination task was 98.9%, with 99.1% sensitivity and 98.5% specificity for KC detection. The accuracy in the FF versus N task was 93.1%, with 79.1% sensitivity and 97.9% specificity for the FF discrimination. Finally, for the 5-group classification, the accuracy was 88.8%, with a weighted average sensitivity of 89.0% and specificity of 95.2%. Despite using the strictest definition for FF KC, the present study obtained comparable or better results than the single-parameter methods and indices reported in the literature. In some cases, direct comparisons with the literature were not possible because of differences in the compositions and definitions of the study groups, especially for FF KC.
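As a rough illustration of the classification approach described above (a support vector machine evaluated by cross-validated accuracy), here is a minimal Python/scikit-learn sketch; the random feature matrix, binary labels, RBF kernel, and 10-fold split are assumptions for illustration and do not reproduce the Weka model or the 22 Pentacam parameters:

    import numpy as np
    from sklearn.svm import SVC
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.model_selection import cross_val_score

    # One row per eye, one column per tomographic parameter (hypothetical data)
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 22))
    y = rng.integers(0, 2, size=200)            # e.g., 0 = normal, 1 = keratoconus

    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
    scores = cross_val_score(clf, X, y, cv=10)  # 10-fold cross-validation accuracy
    print(scores.mean())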
Case-crossover design and its implementation in R
2016-01-01
The case-crossover design is a variation of the case-control design in which a person's own history periods serve as controls. It can be viewed as a hybrid of the case-control study and the crossover design. Confounding by characteristics that are constant within a person is well controlled with this method. The relative risk and odds ratio, as well as their 95% confidence intervals (CIs), can be estimated using the Cochran-Mantel-Haenszel method. R code for the calculation is provided in the main text, and readers may adapt it to their own tasks. The conditional logistic regression model is another way to estimate the odds ratio of the exposure; furthermore, it allows the incorporation of other time-varying covariates that are not constant within subjects. The model fitting itself is not technically difficult because well-developed statistical packages exist. However, it is challenging to convert the original dataset obtained from case report forms into a format suitable to be passed to the clogit() function. R code for this task is provided and explained in the text. PMID:27761445
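The Cochran-Mantel-Haenszel estimate mentioned above can also be computed outside R; the following minimal Python sketch pools the odds ratio across 2x2 strata (the counts are invented for illustration, and the paper itself supplies the R code):

    import numpy as np

    # Each stratum is a 2x2 table of [[exposed cases, unexposed cases],
    #                                 [exposed controls, unexposed controls]] (hypothetical counts)
    strata = [np.array([[12, 8], [5, 15]]),
              np.array([[7, 13], [4, 16]])]

    num = sum(t[0, 0] * t[1, 1] / t.sum() for t in strata)  # sum of a*d/n over strata
    den = sum(t[0, 1] * t[1, 0] / t.sum() for t in strata)  # sum of b*c/n over strata
    print("Mantel-Haenszel OR:", num / den)

For the conditional logistic regression route, statsmodels' ConditionalLogit (which takes a groups argument identifying each matched set) plays a role broadly comparable to R's clogit(); as the abstract notes, most of the effort lies in reshaping case report data into that long, stratified format.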
NASA Technical Reports Server (NTRS)
Orcutt, John M.; Brenton, James C.
2016-01-01
The methodology and the results of the quality control (QC) process for the meteorological data from the Lightning Protection System (LPS) towers located at Kennedy Space Center (KSC) launch complex 39B (LC-39B) are documented in this paper. Meteorological data are used to design a launch vehicle, determine operational constraints, and apply defined constraints on day-of-launch (DOL). To accomplish these tasks properly, a representative climatological database of meteorological records is needed, one that reflects the climate the vehicle will encounter. Numerous meteorological measurement towers exist at KSC; however, the engineering tasks require measurements at specific heights, some of which only a few towers can provide. Other than the LPS towers, Tower 313 is the only tower that provides observations up to 150 m, and it is located approximately 3.5 km from LC-39B. In addition, the data must be quality controlled to remove erroneous reports that could pollute the results of an engineering analysis, mislead the development of operational constraints, or give a misleading picture of the atmosphere at the tower's location.
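The abstract does not list the specific checks, but a generic sense of what automated screening of tower records can look like is given by the minimal Python sketch below; the field names and limits are hypothetical and are not the actual LC-39B QC criteria:

    import pandas as pd

    # Hypothetical tower records: measurement height (m), wind speed (m/s), temperature (C)
    obs = pd.DataFrame({
        "height_m": [10, 50, 150, 150],
        "wind_ms":  [4.2, 7.8, 65.0, 9.1],      # 65 m/s is an implausible spike
        "temp_c":   [22.1, 21.8, 21.5, -40.0],  # -40 C is outside a plausible range
    })

    # Simple gross-error (range) checks; the limits are illustrative only
    valid = obs["wind_ms"].between(0, 50) & obs["temp_c"].between(-10, 45)
    clean = obs[valid]        # records retained for the climatological database
    flagged = obs[~valid]     # erroneous reports removed before engineering analysis
    print(len(flagged), "records flagged")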
Mallon, Sharon; Rosato, Michael; Galway, Karen; Hughes, Lynette; Rondon-Sulbaran, Janeet; McConkey, Sam; Leavey, Gerard
2015-06-01
All suicides and related prior attempts occurring in Northern Ireland over two years were analyzed, focusing on number and timing of attempts, method, and mental health diagnoses. Cases were derived from coroner's records, with 90% subsequently linked to associated general practice records. Of those included, 45% recorded at least one prior attempt (with 59% switching from less to more lethal methods between attempt and suicide). Compared with those recording one attempt, those with 2+ attempts were more likely to have used less lethal methods at the suicide (OR = 2.77: 95% CI = 1.06, 7.23); and those using less lethal methods at the attempts were more likely to persist with these into the suicide (OR = 3.21: 0.79, 13.07). Finally, those with preexisting mental problems were more likely to use less lethal methods in the suicide: severe mental illness (OR = 7.88: 1.58, 39.43); common mental problems (OR = 3.68: 0.83, 16.30); and alcohol/drugs related (OR = 2.02: 0.41, 9.95). This analysis uses readily available data to highlight the persisting use of less lethal methods by visible and vulnerable attempters who eventually complete their suicide. Further analysis of such conditions could allow more effective prevention strategies to be developed. © 2014 The American Association of Suicidology.
Adapting GNU random forest program for Unix and Windows
NASA Astrophysics Data System (ADS)
Jirina, Marcel; Krayem, M. Said; Jirina, Marcel, Jr.
2013-10-01
Random Forest is a well-known method, and also the name of a program, for data clustering and classification. Unfortunately, the original Random Forest program is rather difficult to use. Here we describe a new version of this program, which was originally written in Fortran 77. The modified program, in Fortran 95, needs to be compiled only once; information for different tasks is passed via command-line arguments. The program was tested with 24 data sets from the UCI Machine Learning Repository, and the results are available online.
Functional interaction between nonreceptor tyrosine kinase c-Abl and SR-Rich protein RBM39
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mai, Sanyue; Qu, Xiuhua; Li, Ping
RBM39, also known as splicing factor HCC1.4, acts as a transcriptional coactivator for the steroid nuclear receptors JUN/AP-1, ESR1/ER-α and ESR2/ER-β. RBM39 is involved in the regulation of the transcriptional responses of these steroid nuclear receptors and promotes transcriptional initiation. In this paper, we report that RBM39 interacts with the nonreceptor tyrosine kinase c-Abl. Both the Src homology (SH) 2 and SH3 domains of c-Abl interact with RBM39. The major tyrosine phosphorylation sites on RBM39 that are phosphorylated by c-Abl are Y95 and Y99, as demonstrated by liquid chromatography coupled with tandem mass spectrometry (LC/MS/MS) and mutational analysis. c-Abl was shown to boost the transcriptional coactivation activity of RBM39 for ERα and PRβ in a tyrosine kinase-dependent manner. The results suggest that mammalian c-Abl plays an important role in steroid hormone receptor-mediated transcription by regulating RBM39. - Highlights: • c-Abl interacts with RBM39. • RBM39 is phosphorylated by c-Abl. • c-Abl regulates the transcriptional coactivation activity of RBM39 on ERα and PRβ.
Xiong, Lili; Li, Liping
2015-01-01
Introduction Little is known about differences in injury patterns and mortality between patients involved in single-vehicle (SV) versus multi-vehicle (MV) motorcycle accidents in Shantou, China. This study aims to address this gap and provide a basis and reference for motorcycle injury prevention. Method Medical record data were collected between October 2002 and June 2012 on all motorcycle injury patients admitted to a hospital in the city of Shantou in eastern Guangdong province, China. Comparative analysis was conducted between patients in SV accidents and patients in MV accidents regarding demographic and clinical characteristics, mortality, and injury patterns. Results Approximately 48% (n = 1977) of patients were involved in SV accidents and 52% (n = 2119) were involved in MV accidents. The average age was 34 years. Collision of a motorcycle with a heavy vehicle/bus (4%) was associated with a 34 times greater risk of death (RR: 34.32; 95% CI: 17.43–67.57). Compared to patients involved in MV accidents, those involved in SV accidents were more likely to sustain a skull fracture (RR: 1.47; 95% CI: 1.22–1.77), an open head wound (RR: 1.46; 95% CI: 1.23–1.74), an intracranial injury (RR: 1.39; 95% CI: 1.26–1.53), a superficial head injury (RR: 1.37; 95% CI: 1.01–1.86), an injury to an organ (RR: 2.01; 95% CI: 1.24–3.26), and a crushing injury (RR: 1.98; 95% CI: 1.06–3.70) to the thorax or abdomen. However, they were less likely to sustain a spinal fracture (RR: 0.58; 95% CI: 0.39–0.85), a pelvic fracture (RR: 0.22; 95% CI: 0.11–0.46), an upper extremity fracture (RR: 0.75; 95% CI: 0.59–0.96), or injuries to their lower extremities, except for a dislocation, sprain, or injury to a joint or ligament (RR: 0.82; 95% CI: 0.49–1.36). Conclusion The relative risk of death is higher for patients involved in multi-vehicle accidents than for patients in single-vehicle accidents, especially when the collision involves a heavy vehicle or bus. Injury to the head dominated motorcycle injuries. Single-vehicle accidents were more strongly associated with head injury and internal injuries to the thorax or abdomen. Multi-vehicle accidents were more strongly associated with extremity injuries, especially to the lower extremities, and with external trauma to the thorax or abdomen. PMID:29546097
Locke, Sarah J.; Colt, Joanne S.; Stewart, Patricia A.; Armenti, Karla R.; Baris, Dalsu; Blair, Aaron; Cerhan, James R.; Chow, Wong-Ho; Cozen, Wendy; Davis, Faith; De Roos, Anneclaire J.; Hartge, Patricia; Karagas, Margaret R.; Johnson, Alison; Purdue, Mark P.; Rothman, Nathaniel; Schwartz, Kendra; Schwenn, Molly; Severson, Richard; Silverman, Debra T.; Friesen, Melissa C.
2014-01-01
Objectives Growing evidence suggests that gender-blind assessment of exposure may introduce exposure misclassification, but few studies have characterized gender differences across occupations and industries. We pooled control responses to job-, industry-, and exposure-specific questionnaires (modules) that asked detailed questions about work activities from three US population-based case-control studies to examine gender differences in work tasks and their frequencies. Methods We calculated the ratio of female to male controls that completed each module. For four job modules (assembly worker, machinist, health professional, janitor/cleaner) and for subgroups of jobs that completed those modules, we evaluated gender differences in task prevalence and frequency using Chi-square and Mann-Whitney U-tests, respectively. Results The 1,360 female and 2,245 male controls reported 6,033 and 12,083 jobs, respectively. Gender differences in female:male module completion ratios were observed for 39 of 45 modules completed by ≥20 controls. Gender differences in task prevalence varied in direction and magnitude. For example, female janitors were significantly more likely to polish furniture (79% vs. 44%), while male janitors were more likely to strip floors (73% vs. 50%). Women usually reported more time spent on tasks than men. For example, the median hours per week spent degreasing for production workers in product manufacturing industries was 6.3 for women and 3.0 for men. Conclusions Observed gender differences may reflect actual differences in tasks performed or differences in recall, reporting, or perception, all of which contribute to exposure misclassification and impact relative risk estimates. Our findings reinforce the need to capture subject-specific information on work tasks. PMID:24683012
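To make the two tests named in the Methods above concrete, here is a minimal Python sketch using scipy; the counts and task-hour values are invented and do not come from the pooled study data:

    import numpy as np
    from scipy.stats import chi2_contingency, mannwhitneyu

    # Task prevalence by gender: did the worker report the task? (hypothetical counts)
    #                  yes  no
    table = np.array([[79, 21],    # women
                      [44, 56]])   # men
    chi2, p_prevalence, dof, expected = chi2_contingency(table)

    # Task frequency: hours per week spent on the task (hypothetical values)
    hours_women = [6.3, 5.0, 8.1, 7.2, 4.5]
    hours_men = [3.0, 2.5, 4.0, 3.8, 2.2]
    stat, p_frequency = mannwhitneyu(hours_women, hours_men)

    print(p_prevalence, p_frequency)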