Science.gov

Sample records for account potential confounders

  1. Assessing Sensitivity to Unmeasured Confounding Using a Simulated Potential Confounder

    ERIC Educational Resources Information Center

    Carnegie, Nicole Bohme; Harada, Masataka; Hill, Jennifer L.

    2016-01-01

    A major obstacle to developing evidence-based policy is the difficulty of implementing randomized experiments to answer all causal questions of interest. When using a nonexperimental study, it is critical to assess how much the results could be affected by unmeasured confounding. We present a set of graphical and numeric tools to explore the…
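
    As a rough illustration of the question this record raises (the truncated abstract does not describe the authors' actual tool, so the sketch below is not their method), the following Python snippet sweeps assumed strengths of a hypothetical unmeasured confounder and reports how a naive linear-model estimate would shift. The naive estimate and both grids are invented numbers.

        # Hedged sketch, not the authors' method: a back-of-the-envelope sensitivity
        # grid for a linear outcome model. If an omitted confounder U has outcome
        # coefficient gamma_y and between-group imbalance delta_u, the naive
        # treatment estimate is shifted by roughly gamma_y * delta_u.
        import numpy as np

        naive_estimate = 0.80                      # illustrative treatment-effect estimate
        gammas = np.array([0.0, 0.2, 0.4, 0.6])    # assumed effect of U on the outcome
        deltas = np.array([0.0, 0.5, 1.0])         # assumed imbalance of U across groups

        for g in gammas:
            for d in deltas:
                corrected = naive_estimate - g * d   # simple omitted-variable correction
                print(f"gamma_y={g:.1f}, delta_u={d:.1f} -> corrected estimate {corrected:.2f}")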

  2. Accounting for uncertainty in confounder and effect modifier selection when estimating average causal effects in generalized linear models.

    PubMed

    Wang, Chi; Dominici, Francesca; Parmigiani, Giovanni; Zigler, Corwin Matthew

    2015-09-01

    Confounder selection and adjustment are essential elements of assessing the causal effect of an exposure or treatment in observational studies. Building upon work by Wang et al. (2012, Biometrics 68, 661-671) and Lefebvre et al. (2014, Statistics in Medicine 33, 2797-2813), we propose and evaluate a Bayesian method to estimate average causal effects in studies with a large number of potential confounders, relatively few observations, likely interactions between confounders and the exposure of interest, and uncertainty on which confounders and interaction terms should be included. Our method is applicable across all exposures and outcomes that can be handled through generalized linear models. In this general setting, estimation of the average causal effect is different from estimation of the exposure coefficient in the outcome model due to noncollapsibility. We implement a Bayesian bootstrap procedure to integrate over the distribution of potential confounders and to estimate the causal effect. Our method permits estimation of both the overall population causal effect and effects in specified subpopulations, providing clear characterization of heterogeneous exposure effects that may vary considerably across different covariate profiles. Simulation studies demonstrate that the proposed method performs well in small sample size situations with 100-150 observations and 50 covariates. The method is applied to data on 15,060 US Medicare beneficiaries diagnosed with a malignant brain tumor between 2000 and 2009 to evaluate whether surgery reduces hospital readmissions within 30 days of diagnosis.
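
    The full Bayesian machinery described above is more involved than a short example can show, but the Python sketch below illustrates two of its ingredients on simulated placeholder data: standardizing a logistic (GLM) outcome model over the confounder distribution to obtain a marginal average causal effect, and using Bayesian-bootstrap (Dirichlet) weights to integrate over that distribution. It relies on statsmodels and is a hedged sketch, not the paper's implementation.

        # Hedged sketch of two ingredients of the approach above (not the paper's
        # implementation): g-computation of an average causal effect from a logistic
        # outcome model, with Bayesian-bootstrap (Dirichlet) weights over the
        # empirical confounder distribution. All data are simulated placeholders.
        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(1)
        n = 1000
        x = rng.normal(size=(n, 2))                      # measured confounders
        a = rng.binomial(1, 1 / (1 + np.exp(-x[:, 0])))  # exposure depends on x1
        logit_y = -1 + 1.2 * a + 0.8 * x[:, 0] - 0.5 * x[:, 1]
        y = rng.binomial(1, 1 / (1 + np.exp(-logit_y)))  # binary outcome

        outcome_model = sm.GLM(y, sm.add_constant(np.column_stack([a, x])),
                               family=sm.families.Binomial()).fit()

        def marginal_risk(a_value, weights):
            # predicted risk with exposure set to a_value, averaged over confounders
            d = sm.add_constant(np.column_stack([np.full(n, a_value), x]),
                                has_constant="add")
            return np.sum(weights * outcome_model.predict(d))

        draws = []
        for _ in range(200):
            w = rng.dirichlet(np.ones(n))                # Bayesian bootstrap weights
            draws.append(marginal_risk(1, w) - marginal_risk(0, w))

        draws = np.array(draws)
        print("average causal risk difference ~", round(float(draws.mean()), 3))
        print("95% interval ~", np.percentile(draws, [2.5, 97.5]).round(3))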

  3. Confounding in observational studies based on large health care databases: problems and potential solutions - a primer for the clinician.

    PubMed

    Nørgaard, Mette; Ehrenstein, Vera; Vandenbroucke, Jan P

    2017-01-01

    Population-based health care databases are a valuable tool for observational studies as they reflect daily medical practice for large and representative populations. A constant challenge in observational designs is, however, to rule out confounding, and the value of these databases for a given study question accordingly depends on completeness and validity of the information on confounding factors. In this article, we describe the types of potential confounding factors typically lacking in large health care databases and suggest strategies for confounding control when data on important confounders are unavailable. Using Danish health care databases as examples, we present the use of proxy measures for important confounders and the use of external adjustment. We also briefly discuss the potential value of active comparators, high-dimensional propensity scores, self-controlled designs, pseudorandomization, and the use of positive or negative controls.

  4. Confounding in observational studies based on large health care databases: problems and potential solutions – a primer for the clinician

    PubMed Central

    Nørgaard, Mette; Ehrenstein, Vera; Vandenbroucke, Jan P

    2017-01-01

    Population-based health care databases are a valuable tool for observational studies as they reflect daily medical practice for large and representative populations. A constant challenge in observational designs is, however, to rule out confounding, and the value of these databases for a given study question accordingly depends on completeness and validity of the information on confounding factors. In this article, we describe the types of potential confounding factors typically lacking in large health care databases and suggest strategies for confounding control when data on important confounders are unavailable. Using Danish health care databases as examples, we present the use of proxy measures for important confounders and the use of external adjustment. We also briefly discuss the potential value of active comparators, high-dimensional propensity scores, self-controlled designs, pseudorandomization, and the use of positive or negative controls.

  5. Interpersonal discrimination and depressive symptomatology: examination of several personality-related characteristics as potential confounders in a racial/ethnic heterogeneous adult sample

    PubMed Central

    2013-01-01

    Background Research suggests that reports of interpersonal discrimination result in poor mental health. Because personality characteristics may either confound or mediate the link between these reports and mental health, there is a need to disentangle their role in order to better understand the nature of the discrimination-mental health association. We examined whether hostility, anger repression and expression, pessimism, optimism, and self-esteem served as confounders in the association between perceived interpersonal discrimination and CESD-based depressive symptoms in a race/ethnic heterogeneous probability-based sample of community-dwelling adults. Methods We employed a series of ordinary least squares regression analyses to examine the potential confounding effect of hostility, anger repression and expression, pessimism, optimism, and self-esteem on the association between interpersonal discrimination and depressive symptoms. Results Hostility, anger repression, pessimism and self-esteem were significant as possible confounders of the relationship between interpersonal discrimination and depressive symptoms, together accounting for approximately 38% of the total association (beta: 0.1892, p < 0.001). However, interpersonal discrimination remained a positive predictor of depressive symptoms (beta: 0.1176, p < 0.001). Conclusion As one of the first empirical attempts to examine the potential confounding role of personality characteristics in the association between reports of interpersonal discrimination and mental health, our results suggest that personality-related characteristics may serve as potential confounders. Nevertheless, our results also suggest that, net of these characteristics, reports of interpersonal discrimination are associated with poor mental health. PMID:24256578

  6. Removal of Potentially Confounding Phenotypes from a Siamese-Derived Feline Glaucoma Breeding Colony

    PubMed Central

    Rutz-Mendicino, Michelle M; Snella, Elizabeth M; Jens, Jackie K; Gandolfi, Barbara; Carlson, Steven A; Kuehn, Markus H; McLellan, Gillian J; Ellinwood, N Matthew

    2011-01-01

    Feline breeding colonies face genetic constraints involving founder effects. A Siamese-founded colony used to study primary congenital glaucoma displayed coat colors additional to the Siamese coat. Genes affecting pigment can exhibit pleiotropy on ocular development and function. To remove potentially confounding phenotypes from our colony, we documented the source and frequency of the Siamese allele at the gene for tyrosinase (TYR), the dilution allele at melanophilin (MLPH), and the brown allele at tyrosinase-related protein 1 (TYRP1). We used PCR–RFLP diagnostics to genotype cats in our colony for the published alleles. A commercially acquired phenotypically normal tom was the source of the dilute allele. A founding Siamese queen was the source of the brown allele. Founders also were blood-typed and screened for disease-associated alleles segregating in Siamese cats at 3 loci (ASB, GLB1, and CEP290). Siamese founders were normal at all loci except ASB, at which both animals carried the hypomorphic allele. Current stock is being managed to limit production of glaucomatous cats with brown, dilute, or Siamese phenotypes or homozygosity for the ASB hypomorphic allele. Genotyping will aid in the elimination of these alleles. The clinical effect of these phenotypes and alleles on the glaucoma phenotype is uncertain, but their elimination will remove potentially confounding effects. In conclusion, when founding a colony, stock should be selected or screened to limit potentially confounding phenotypes. When studying the immune, nervous, and visual systems, screening stock for alleles known to be associated with coat color may be warranted. PMID:21819695

  7. Cellular GFP Toxicity and Immunogenicity: Potential Confounders in in Vivo Cell Tracking Experiments.

    PubMed

    Ansari, Amir Mehdi; Ahmed, A Karim; Matsangos, Aerielle E; Lay, Frank; Born, Louis J; Marti, Guy; Harmon, John W; Sun, Zhaoli

    2016-10-01

    Green fluorescent protein (GFP), used as a cellular tag, provides researchers with a valuable method of measuring gene expression and cell tracking. However, there is evidence to suggest that the immunogenicity and cytotoxicity of GFP potentially confounds the interpretation of in vivo experimental data. Studies have shown that GFP expression can deteriorate over time as GFP-tagged cells are prone to death. Therefore, the cells that were originally marked with GFP do not survive and cannot be accurately traced over time. This review will present current evidence for the immunogenicity and cytotoxicity of GFP in in vivo studies by characterizing these responses.

  8. The Effect of Clozapine on Premature Mortality: An Assessment of Clinical Monitoring and Other Potential Confounders

    PubMed Central

    Hayes, Richard D.; Downs, Johnny; Chang, Chin-Kuo; Jackson, Richard G.; Shetty, Hitesh; Broadbent, Matthew; Hotopf, Matthew; Stewart, Robert

    2015-01-01

    Clozapine can cause severe adverse effects yet it is associated with reduced mortality risk. We test the hypothesis that this association is due to increased clinical monitoring and investigate risk of premature mortality from natural causes. We identified 14 754 individuals (879 deaths) with serious mental illness (SMI) including schizophrenia, schizoaffective and bipolar disorders aged ≥ 15 years in a large specialist mental healthcare case register linked to national mortality tracing. In this cohort study we modeled the effect of clozapine on mortality over a 5-year period (2007–2011) using Cox regression. Individuals prescribed clozapine had more severe psychopathology and poorer functional status. Many of the exposures associated with clozapine use were themselves risk factors for increased mortality. However, we identified a strong association between being prescribed clozapine and lower mortality which persisted after controlling for a broad range of potential confounders including clinical monitoring and markers of disease severity (adjusted hazard ratio 0.4; 95% CI 0.2–0.7; p = .001). This association remained after restricting the sample to those with a diagnosis of schizophrenia or those taking antipsychotics and after using propensity scores to reduce the impact of confounding by indication. Among individuals with SMI, those prescribed clozapine had a reduced risk of mortality due to both natural and unnatural causes. We found no evidence to indicate that the lower mortality associated with clozapine in SMI was due to increased clinical monitoring or confounding factors. This is the first study to report an association between clozapine and reduced risk of mortality from natural causes. PMID:25154620

  9. The assessment of cortisol in human hair: associations with sociodemographic variables and potential confounders.

    PubMed

    Dettenborn, L; Tietze, A; Kirschbaum, C; Stalder, T

    2012-11-01

    To inform the future use of hair cortisol measurement, we have investigated influences of potential confounding variables (natural hair colour, frequency of hair washes, age, sex, oral contraceptive (OC) use and smoking status) on hair cortisol levels. The main study sample comprised 360 participants (172 women) covering a wide range of ages (1-91 years; mean = 25.95). In addition, to more closely examine influences of natural hair colour and young age on hair cortisol levels, two additional samples comprising 69 participants with natural blond or dark brown hair (hair colour sample) as well as 28 young children and 34 adults (young age sample) were recruited. Results revealed a lack of an effect for natural hair colour, OC use, and smoking status on hair cortisol levels (all p's >0.10). No influence of frequency of hair washes was seen for proximal hair segments (p = 0.335), but an influence was seen for the third hair segment, indicating lower cortisol content (p = 0.008). We found elevated hair cortisol levels in young children and older adults (p < 0.001). Finally, men showed higher hair cortisol levels than women (p = 0.002). The present data indicate that hair cortisol measurement provides a useful tool in stress-related psychobiological research when applied with consideration of possible confounders, including age and sex.

  10. Influence of potentially confounding factors on sea urchin porewater toxicity tests

    USGS Publications Warehouse

    Carr, R.S.; Biedenbach, J.M.; Nipper, M.

    2006-01-01

    The influence of potentially confounding factors has been identified as a concern for interpreting sea urchin porewater toxicity test data. The results from >40 sediment-quality assessment surveys using early-life stages of the sea urchin Arbacia punctulata were compiled and examined to determine acceptable ranges of natural variables such as pH, ammonia, and dissolved organic carbon on the fertilization and embryological development endpoints. In addition, laboratory experiments were also conducted with A. punctulata and compared with information from the literature. Pore water with pH as low as 6.9 is an unlikely contributor to toxicity for the fertilization and embryological development tests with A. punctulata. Other species of sea urchin have narrower pH tolerance ranges. Ammonia is rarely a contributing factor in pore water toxicity tests using the fertilization endpoint, but the embryological development endpoint may be influenced by ammonia concentrations commonly found in porewater samples. Therefore, ammonia needs to be considered when interpreting results for the embryological development test. Humic acid does not affect sea urchin fertilization at saturation concentrations, but it could have an effect on the embryological development endpoint at near-saturation concentrations. There was no correlation between sediment total organic carbon concentrations and porewater dissolved organic carbon concentrations. Because of the potential for many varying substances to activate parthenogenesis in sea urchin eggs, it is recommended that a no-sperm control be included with every fertilization test treatment. © 2006 Springer Science+Business Media, Inc.

  11. Non-Chemical Distant Cellular Interactions as a potential confounder of cell biology experiments

    PubMed Central

    Farhadi, Ashkan

    2014-01-01

    Distant cells can communicate with each other through a variety of methods. Two such methods involve electrical and/or chemical mechanisms. Non-chemical, distant cellular interactions may be another method of communication that cells can use to modify the behavior of other cells that are mechanically separated. Moreover, non-chemical, distant cellular interactions may explain some cases of confounding effects in Cell Biology experiments. In this article, we review non-chemical, distant cellular interactions studies to try to shed light on the mechanisms in this highly unconventional field of cell biology. Despite the existence of several theories that try to explain the mechanism of non-chemical, distant cellular interactions, this phenomenon is still speculative. Among candidate mechanisms, electromagnetic waves appear to have the most experimental support. In this brief article, we try to answer a few key questions that may further clarify this mechanism. PMID:25368582

  12. Task-independent effects are potential confounders in longitudinal imaging studies of learning in schizophrenia.

    PubMed

    Korostil, Michele; Fatima, Zainab; Kovacevic, Natasha; Menon, Mahesh; McIntosh, Anthony Randal

    2016-01-01

    Learning impairment is a core deficit in schizophrenia that impacts on real-world functioning and yet, elucidating its underlying neural basis remains a challenge. A key issue when interpreting learning-task experiments is that task-independent changes may confound interpretation of task-related signal changes in neuroimaging studies. The nature of these task-independent changes in schizophrenia is unknown. Therefore, we examined task-independent "time effects" in a group of participants with schizophrenia contrasted with healthy participants in a longitudinal fMRI learning-experiment designed to allow for examination of non-specific effects of time. Flanking the learning portions of the experiment with a task-of-no-interest allowed us to extract task-independent BOLD changes. Task-independent effects occurred in both groups, but were more robust in the schizophrenia group. There was a significant interaction effect between group and time in a distributed activity pattern that included inferior and superior temporal regions, frontal areas (left anterior insula and superior medial gyri), and parietal areas (posterior cingulate cortices and precuneus). This pattern showed task-independent linear decrease in BOLD amplitude over the two scanning sessions for the schizophrenia group, but showed either opposite effect or no activity changes for the control group. There was a trend towards a correlation between task-independent effects and the presence of more negative symptoms in the schizophrenia group. The strong interaction between group and time suggests that both the scanning experience as a whole and the transition between task-types evokes a different response in persons with schizophrenia and may confound interpretation of learning-related longitudinal imaging experiments if not explicitly considered.

  13. [COMPUTER TECHNOLOGY FOR ACCOUNTING OF CONFOUNDERS IN THE RISK ASSESSMENT IN COMPARATIVE STUDIES ON THE BASE OF THE METHOD OF STANDARDIZATION].

    PubMed

    Shalaumova, Yu V; Varaksin, A N; Panov, V G

    2016-01-01

    We analyzed approaches to accounting for concomitant variables (confounders), which introduce systematic error into the assessment of the impact of risk factors on the response variable. The analysis showed that standardization is an effective method for reducing this bias in risk assessment. We propose an algorithm implementing standardization based on stratification, which minimizes the difference between the distributions of confounders across risk-factor groups. To automate the standardization procedure, software was developed and made available on the website of the Institute of Industrial Ecology, UB RAS. Using this software and numerical modeling, we determined the conditions of applicability of stratification-based standardization for the case of a normally distributed response and confounder with a linear relationship between them. A comparison of standardization results with those of statistical methods (logistic regression and analysis of covariance) on a problem in human ecology showed that the two approaches give similar results when the conditions for the applicability of the statistical methods are met exactly. Standardization is less sensitive to violations of these applicability conditions.
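
    The software itself is not reproduced in the record, but the core calculation of stratification-based standardization fits in a few lines of Python. The stratum counts below are invented, and the combined study group is used as the standard population.

        # Minimal sketch of confounder control by stratification plus standardization
        # (not the institute's software): stratum-specific risks are combined using a
        # common standard distribution of the confounder. Counts are invented.
        import numpy as np

        # rows = confounder strata; columns = (cases, total)
        exposed   = np.array([[30, 100], [10, 200]], dtype=float)
        unexposed = np.array([[20, 200], [ 5, 300]], dtype=float)

        risk_exposed = exposed[:, 0] / exposed[:, 1]
        risk_unexposed = unexposed[:, 0] / unexposed[:, 1]

        # standard population: confounder distribution of the combined study group
        standard = exposed[:, 1] + unexposed[:, 1]
        standard = standard / standard.sum()

        std_risk_exposed = np.sum(standard * risk_exposed)
        std_risk_unexposed = np.sum(standard * risk_unexposed)

        crude_rr = ((exposed[:, 0].sum() / exposed[:, 1].sum())
                    / (unexposed[:, 0].sum() / unexposed[:, 1].sum()))
        print("crude risk ratio        :", round(crude_rr, 2))
        print("standardized risk ratio :", round(std_risk_exposed / std_risk_unexposed, 2))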

  14. Avoiding potential problems when selling accounts receivable.

    PubMed

    Ayers, D H; Kincaid, T J

    1996-05-01

    Accounts receivable financing is a potential tool for managing a provider organization's working capital needs. But before entering into a financing agreement, organizations need to consider and take steps to avoid serious problems that can arise from participation in an accounts receivable financing program. For example, the purchaser may cease purchasing the receivables, leaving the organization without funding needed for operations. Or, the financing program may be inordinately complex and unnecessarily costly to the organization. Sometimes the organization itself may fail to comply with the terms of the agreement under which the accounts receivable were sold, thus necessitating that restitution be made to the purchaser or provoking charges of fraud. These potential problems should be addressed as early as possible--before an organization enters into an accounts receivable financing program--in order to minimize time, effort, and expense and maximize the benefits of the financing agreement.

  15. Genetic variation of loci potentially under selection confounds species-genetic diversity correlations in a fragmented habitat.

    PubMed

    Bertin, Angeline; Gouin, Nicolas; Baumel, Alex; Gianoli, Ernesto; Serratosa, Juan; Osorio, Rodomiro; Manel, Stephanie

    2017-01-01

    Positive species-genetic diversity correlations (SGDCs) are often thought to result from the parallel influence of neutral processes on genetic and species diversity. Yet, confounding effects of non-neutral mechanisms have not been explored. Here, we investigate the impact of non-neutral genetic diversity on SGDCs in high Andean wetlands. We compare correlations between plant species diversity and genetic diversity (GD) calculated with and without loci potentially under selection (outlier loci). The study system includes 2188 specimens from five species (three common aquatic macroinvertebrate and two dominant plant species) that were genotyped for 396 amplified fragment length polymorphism loci. We also appraise the importance of neutral processes on SGDCs by investigating the influence of habitat fragmentation features. Significant positive SGDCs were detected for all five species (mean SGDC = 0.52 ± 0.05). While only a few outlier loci were detected in each species, they resulted in significant decreases in GD and in SGDCs. This supports the hypothesis that neutral processes drive species-genetic diversity relationships in high Andean wetlands. Unexpectedly, in two species the effects of the habitat fragmentation characteristics on GD increased when outlier loci were included. Overall, our results reveal pitfalls in using habitat features to infer processes driving SGDCs and show that a few loci potentially under selection are enough to cause a significant downward bias in SGDC. Investigating confounding effects of outlier loci thus represents a useful approach for demonstrating the contribution of neutral processes to species-genetic diversity relationships.

  16. Evaluation of potential confounding factors in sediment toxicity tests with three freshwater benthic invertebrates

    SciTech Connect

    Ankley, G.T.; Benoit, D.A. ); Balogh, J.C. ); Reynoldson, T.B.; Day, K.E. ); Hoke, R.A. )

    1994-04-01

    The authors examined the effects of natural sediment physicochemical properties on the results of lab tests with the amphipod Hyalella azteca, the midge Chironomus tentans, and the oligochaete Lumbriculus variegatus. Ten-day exposures with the three species were conducted with 50 uncontaminated sediment samples from Lakes Erie, Huron, Superior, and Ontario, which differed markedly with regard to characteristics such as grain-size distribution, organic carbon content, and mineralogical composition. Tests were conducted both with and without the addition of exogenous food. Survival of Hyalella azteca, survival and growth of Chironomus tentans, and survival/reproduction and growth of Lumbriculus variegatus were significantly greater in tests in which the animals were fed vs. those in which they were not. Approximately 10% of the tests in which Hyalella azteca was not fed and 80% of tests in which the amphipods were fed resulted in >80% survival, a common criterion for defining the acceptability of tests with Hyalella azteca in clean control sediments. Similarly, a relatively high percentage of the tests in which Chironomus tentans was not fed would have failed a control survival criterion of 70% for the midge. Hence, there is significant potential for false positive results if Hyalella azteca or Chironomus tentans is not fed during sediment tests. Predictive modeling of the assay results in relation to sediment physicochemical characteristics failed to reveal any additional factors that influenced survival of Hyalella azteca and Chironomus tentans, or reproduction and growth of Lumbriculus variegatus in tests in which the organisms were fed. However, linear modeling did suggest that growth of fed as well as unfed Chironomus tentans may have been influenced by grain-size distribution of the test sediments.

  17. Potential sources of 2-aminoacetophenone to confound the Pseudomonas aeruginosa breath test, including analysis of a food challenge study.

    PubMed

    Scott-Thomas, Amy; Pearson, John; Chambers, Stephen

    2011-12-01

    2-Aminoacetophenone can be detected in the breath of Pseudomonas aeruginosa-colonized cystic fibrosis patients; however, low levels were also detected in a small proportion of healthy subjects. It was hypothesized that food, beverages, cosmetics or medications could be a source of contamination of 2-aminoacetophenone in breath. To determine the potential confounding effect of these products on 2-aminoacetophenone breath analysis, screening for this volatile was performed in the laboratory by gas chromatography/mass spectrometry and a food challenge study was carried out. 2-Aminoacetophenone was detected in four of the 78 samples tested in vitro: corn chips and canned tuna (high pmol mol⁻¹) and egg white and one of the three beers (low pmol mol⁻¹). No 2-aminoacetophenone was detected in the CF medication or cosmetics tested. Twenty-eight out of 30 environmental air samples were negative for 2-aminoacetophenone (below 50 pmol mol⁻¹). A challenge study with ten healthy subjects was performed to determine if 2-aminoacetophenone from corn chips was detectable on the breath after consumption. Analysis of mixed breath samples showed that the levels of 2-aminoacetophenone were immediately elevated after corn chip consumption, but after 2 h the level of 2-aminoacetophenone had returned to the 'baseline' for each subject.

  18. Grandparental help in Indonesia is directed preferentially towards needier descendants: a potential confounder when exploring grandparental influences on child health.

    PubMed

    Snopkowski, Kristin; Sear, Rebecca

    2015-03-01

    A considerable body of evidence has now demonstrated positive correlations between grandparental presence and child health outcomes. It is typically assumed that such correlations exist because grandparental investment in their grandchildren improves child health and wellbeing. However, less is known about how grandparents allocate help to adult children and grandchildren, particularly in lower income contexts. Here we use detailed quantitative data from the longitudinal Indonesia Family Life Survey (data collected in 1993, 1997, 2000, 2007; n = 16,250) to examine grandparental help in a society transitioning both demographically and economically. We test the hypothesis that grandparents direct help preferentially towards those adult children and grandchildren most in need of help. This hypothesis was supported for help provided by married grandparents and single grandmothers, who tended to: provide more help to their adult children when this generation had young children themselves, provide financial help if their adult children were poorer, and provide more household help if their adult daughters worked outside the home. One unexpected result was that help from maternal and paternal grandparents is positively correlated; if one set of grandparents is helping, the other set is more likely to help, counter to our predictions. These results provide support for the hypothesis that grandparents preferentially invest in some descendants over others, where married grandparents and single grandmothers tend to invest in those adult children and grandchildren with the most need. Investigations of the effect of grandparents on child health outcomes may therefore be confounded by grandparents' preferential investment in needier descendants.

  19. Global climate change in large European rivers: long-term effects on macroinvertebrate communities and potential local confounding factors.

    PubMed

    Floury, Mathieu; Usseglio-Polatera, Philippe; Ferreol, Martial; Delattre, Cecile; Souchon, Yves

    2013-04-01

    Aquatic species living in running waters are widely acknowledged to be vulnerable to climate-induced thermal and hydrological fluctuations. Climate changes can interact with other environmental changes to determine structural and functional attributes of communities. Although such complex interactions are most likely to occur in a multiple-stressor context as frequently encountered in large rivers, they have received little attention in such ecosystems. In this study, we aimed at specifically addressing the issue of relative long-term effects of global and local changes on benthic macroinvertebrate communities in multistressed large rivers. We assessed effects of hydroclimatic vs. water quality factors on invertebrate community structure and composition over 30 years (1979-2008) in the Middle Loire River, France. As observed in other large European rivers, water warming over the three decades (+0.9 °C between 1979-1988 and 1999-2008) and to a lesser extent discharge reduction (-80 m³ s⁻¹) were significantly involved in the disappearance or decrease in taxa typical of fast-running, cold waters (e.g. Chloroperlidae and Potamanthidae). They also explained a major part of the appearance and increase of taxa typical of slow-flowing or standing waters and warmer temperatures, including invasive species (e.g. Corbicula sp. and Atyaephyra desmarestii). However, this shift towards a generalist and pollution-tolerant assemblage was partially confounded by local improvement in water quality (i.e. phosphate input reduction by about two thirds and eutrophication limitation by almost one half), explaining a significant part of the settlement of new pollution-sensitive taxa (e.g. the caddisfly Brachycentridae and Philopotamidae families) during the last years of the study period. The return of such taxa allowed a certain level of specialization to be maintained in the invertebrate community despite climate change effects.

  20. Control of confounding in the analysis phase – an overview for clinicians

    PubMed Central

    Kahlert, Johnny; Gribsholt, Sigrid Bjerge; Gammelager, Henrik; Dekkers, Olaf M; Luta, George

    2017-01-01

    In observational studies, control of confounding can be done in the design and analysis phases. Using examples from large health care database studies, this article provides clinicians with an overview of standard methods in the analysis phase, such as stratification, standardization, multivariable regression analysis and propensity score (PS) methods, together with the more advanced high-dimensional propensity score (HD-PS) method. We describe the progression from simple stratification confined to the inclusion of a few potential confounders to complex modeling procedures such as the HD-PS approach by which hundreds of potential confounders are extracted from large health care databases. Stratification and standardization assist in the understanding of the data at a detailed level, while accounting for potential confounders. Incorporating several potential confounders in the analysis typically implies the choice between multivariable analysis and PS methods. Although PS methods have gained remarkable popularity in recent years, there is an ongoing discussion on the advantages and disadvantages of PS methods as compared to those of multivariable analysis. Furthermore, the HD-PS method, despite its generous inclusion of potential confounders, is also associated with potential pitfalls. All methods are dependent on the assumption of no unknown, unmeasured and residual confounding and suffer from the difficulty of identifying true confounders. Even in large health care databases, insufficient or poor data may contribute to these challenges. The trend in data collection is to compile more fine-grained data on lifestyle and severity of diseases, based on self-reporting and modern technologies. This will surely improve our ability to incorporate relevant confounders or their proxies. However, despite a remarkable development of methods that account for confounding and new data opportunities, confounding will remain a serious issue. Considering the advantages and…
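
    As a concrete illustration of one analysis-phase option mentioned above, the Python sketch below fits a logistic propensity score model and applies stabilized inverse-probability weights. The data are simulated and the variable names are placeholders; it is a minimal sketch, not a template for any particular database study.

        # Hedged sketch of propensity-score weighting on simulated data (not tied to
        # any real health care database): 1) model the exposure, 2) build stabilized
        # inverse-probability weights, 3) compare weighted outcome means.
        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(2)
        n = 5000
        c = rng.normal(size=(n, 3))                      # measured confounders
        a = rng.binomial(1, 1 / (1 + np.exp(-(c[:, 0] + 0.5 * c[:, 1]))))
        y = 2.0 * a + c @ np.array([1.0, 0.5, -0.3]) + rng.normal(size=n)

        ps = sm.Logit(a, sm.add_constant(c)).fit(disp=0).predict()      # propensity scores
        w = np.where(a == 1, a.mean() / ps, (1 - a.mean()) / (1 - ps))  # stabilized weights

        ipw_estimate = (np.average(y[a == 1], weights=w[a == 1])
                        - np.average(y[a == 0], weights=w[a == 0]))
        print("naive difference in means:", round(y[a == 1].mean() - y[a == 0].mean(), 2))
        print("IPW-adjusted estimate    :", round(ipw_estimate, 2))
        # the simulated true exposure effect is 2.0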

  1. The ACCE method: an approach for obtaining quantitative or qualitative estimates of residual confounding that includes unmeasured confounding

    PubMed Central

    Smith, Eric G.

    2015-01-01

    Background: Nonrandomized studies typically cannot account for confounding from unmeasured factors. Method: A method is presented that exploits the recently-identified phenomenon of “confounding amplification” to produce, in principle, a quantitative estimate of total residual confounding resulting from both measured and unmeasured factors. Two nested propensity score models are constructed that differ only in the deliberate introduction of an additional variable(s) that substantially predicts treatment exposure. Residual confounding is then estimated by dividing the change in treatment effect estimate between models by the degree of confounding amplification estimated to occur, adjusting for any association between the additional variable(s) and outcome. Results: Several hypothetical examples are provided to illustrate how the method produces a quantitative estimate of residual confounding if the method’s requirements and assumptions are met. Previously published data is used to illustrate that, whether or not the method routinely provides precise quantitative estimates of residual confounding, the method appears to produce a valuable qualitative estimate of the likely direction and general size of residual confounding. Limitations: Uncertainties exist, including identifying the best approaches for: 1) predicting the amount of confounding amplification, 2) minimizing changes between the nested models unrelated to confounding amplification, 3) adjusting for the association of the introduced variable(s) with outcome, and 4) deriving confidence intervals for the method’s estimates (although bootstrapping is one plausible approach). Conclusions: To this author’s knowledge, it has not been previously suggested that the phenomenon of confounding amplification, if such amplification is as predictable as suggested by a recent simulation, provides a logical basis for estimating total residual confounding. The method's basic approach is…
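
    Read literally, the sentence on dividing the change in estimate by the degree of amplification suggests arithmetic along the following lines. All numbers are invented, and the paper itself defines how the amplification degree and the outcome-association adjustment are actually estimated, so this is only a hedged reading of the abstract.

        # Hypothetical numbers only; the exact estimator is defined in the paper.
        estimate_baseline = 1.30      # treatment-effect estimate, baseline PS model
        estimate_augmented = 1.45     # estimate after adding a strong predictor of exposure
        amplification_degree = 0.50   # assumed degree of confounding amplification
                                      # (e.g. existing bias amplified by 50%)
        outcome_adjustment = 0.02     # assumed correction for the added variable's
                                      # own association with the outcome

        change = estimate_augmented - estimate_baseline
        residual_confounding = (change - outcome_adjustment) / amplification_degree
        print("estimated residual confounding :", round(residual_confounding, 3))
        print("bias-adjusted baseline estimate:", round(estimate_baseline - residual_confounding, 3))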

  2. A population-based study of edentulism in the US: does depression and rural residency matter after controlling for potential confounders?

    PubMed Central

    2014-01-01

    Background Oral health is an integral component of general health and well-being. While edentulism has been examined in relation to socioeconomic status, rural residency, chronic disease and mental health, no study that we know of has examined edentulism and these factors together. The objective of this study was to determine whether depression and rural residency were significantly associated with partial and full edentulism in US adults after controlling for potential confounders. Methods 2006 Behavioral Risk Factor Surveillance System (BRFSS) data were analyzed to identify factors associated with increased odds of partial or full edentulism. This year of BRFSS data was chosen for analysis because in this year the standardized and validated Personal Health Questionnaire-8 (PHQ-8) was used to measure current depression. This measure was part of the optional questions BRFSS asks, and in 2006, 33 states and/or territories included them in their annual surveillance data collection. Bivariate and logistic regression analyses were performed on weighted BRFSS data. Results Logistic regression analysis using either full or partial edentulism as the dependent variable showed that rural residency or living in a rural locale, low and/or middle socioeconomic status (SES), depression as measured by the PHQ-8, and African American race/ethnicity were all independent risk factors when controlling for these and a number of additional covariates. Conclusions This study adds to the epidemiological literature by assessing partial and full edentulism in the US utilizing data from the CDC’s Behavioral Risk Factor Surveillance System (BRFSS). Examining data collected through a large national surveillance system such as BRFSS allows for an analysis that incorporates an array of covariates not available from clinically-based data alone. This study demonstrated that current depression and rural residency are important factors related to partial and full edentulism after controlling for…

  3. Cord Blood Methylmercury and Fetal Growth Outcomes in Baltimore Newborns: Potential Confounding and Effect Modification by Omega-3 Fatty Acids, Selenium, and Sex

    PubMed Central

    Wells, Ellen M.; Herbstman, Julie B.; Lin, Yu Hong; Jarrett, Jeffery; Verdon, Carl P.; Ward, Cynthia; Caldwell, Kathleen L.; Hibbeln, Joseph R.; Witter, Frank R.; Halden, Rolf U.; Goldman, Lynn R.

    2015-01-01

    Background Methylmercury (MeHg) may affect fetal growth; however, prior research often lacked assessment of mercury speciation, confounders, and interactions. Objective Our objective was to assess the relationship between MeHg and fetal growth as well as the potential for confounding or interaction of this relationship from speciated mercury, fatty acids, selenium, and sex. Methods This cross-sectional study includes 271 singletons born in Baltimore, Maryland, 2004–2005. Umbilical cord blood was analyzed for speciated mercury, serum omega-3 highly unsaturated fatty acids (n-3 HUFAs), and selenium. Multivariable linear regression models controlled for gestational age, birth weight, maternal age, parity, prepregnancy body mass index, smoking, hypertension, diabetes, selenium, n-3 HUFAs, and inorganic mercury (IHg). Results Geometric mean cord blood MeHg was 0.94 μg/L (95% CI: 0.84, 1.07). In adjusted models for ponderal index, βln(MeHg) = –0.045 (g/cm³) × 100 (95% CI: –0.084, –0.005). There was no evidence of a MeHg × sex interaction with ponderal index. In contrast, there was evidence of a MeHg × n-3 HUFAs interaction with birth length [among low n-3 HUFAs, βln(MeHg) = 0.40 cm, 95% CI: –0.02, 0.81; among high n-3 HUFAs, βln(MeHg) = –0.15, 95% CI: –0.54, 0.25; p-interaction = 0.048] and head circumference [among low n-3 HUFAs, βln(MeHg) = 0.01 cm, 95% CI: –0.27, 0.29; among high n-3 HUFAs, βln(MeHg) = –0.37, 95% CI: –0.63, –0.10; p-interaction = 0.042]. The association of MeHg with birth weight and ponderal index was affected by n-3 HUFAs, selenium, and IHg. For birth weight, βln(MeHg) without these variables was –16.8 g (95% CI: –75.0, 41.3) versus –29.7 (95% CI: –93.9, 34.6) with all covariates. Corresponding values for ponderal index were –0.030 (g/cm³) × 100 (95% CI: –0.065, 0.005) and –0.045 (95% CI: –0.084, –0.005). Conclusion We observed an association of increased MeHg with decreased ponderal index. There is…

  4. Assessing the impact of unmeasured confounding for binary outcomes using confounding functions.

    PubMed

    Kasza, Jessica; Wolfe, Rory; Schuster, Tibor

    2017-03-03

    A critical assumption of causal inference is that of no unmeasured confounding: for estimated exposure effects to have valid causal interpretations, a sufficient set of predictors of exposure and outcome must be adequately measured and correctly included in the respective inference model(s). In an observational study setting, this assumption will often be unsatisfied, and the potential impact of unmeasured confounding on effect estimates should be investigated. The confounding function approach allows the impact of unmeasured confounding on estimates to be assessed, where unmeasured confounding may be due to unmeasured confounders and/or biases such as collider bias or information bias. Although this approach is easy to implement and pertains to the sum of all bias, its use has not been widespread, and discussion has typically been limited to continuous outcomes. In this paper, we consider confounding functions for use with binary outcomes and illustrate the approach with an example. We note that confounding function choice encodes assumptions about effect modification: some choices encode the belief that the true causal effect differs across exposure groups, whereas others imply that any difference between the true causal parameter and the estimate is entirely due to imbalanced risks between exposure groups. The confounding function approach is a useful method for assessing the impact of unmeasured confounding, in particular when alternative approaches, e.g. external adjustment or instrumental variable approaches, cannot be applied. We provide Stata and R code for the implementation of this approach when the causal estimand of interest is an odds or risk ratio.
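
    The paper provides Stata and R code; as a language-neutral illustration, the short Python sketch below applies a confounding function on the risk-difference scale (the paper also treats odds and risk ratios). The observed risks, exposure prevalence, and the grid of confounding-function values are invented.

        # Hedged sketch of a confounding-function sensitivity analysis for a binary
        # outcome on the risk-difference scale (invented summary numbers, not the
        # paper's Stata/R code). c(a) = E[Y(a)|A=1] - E[Y(a)|A=0] encodes the assumed
        # difference in counterfactual risk between the exposure groups.
        p_y_exposed = 0.30      # observed risk among the exposed
        p_y_unexposed = 0.20    # observed risk among the unexposed
        p_exposed = 0.40        # prevalence of exposure
        p_unexposed = 1 - p_exposed

        observed_rd = p_y_exposed - p_y_unexposed

        for c1 in (0.00, 0.05, 0.10):        # assumed c(1)
            for c0 in (0.00, 0.05, 0.10):    # assumed c(0)
                bias = c1 * p_unexposed + c0 * p_exposed
                causal_rd = observed_rd - bias
                print(f"c(1)={c1:.2f}, c(0)={c0:.2f} -> causal risk difference {causal_rd:.3f}")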

  5. Distance to High-Voltage Power Lines and Risk of Childhood Leukemia – an Analysis of Confounding by and Interaction with Other Potential Risk Factors

    PubMed Central

    Pedersen, Camilla; Bräuner, Elvira V.; Rod, Naja H.; Albieri, Vanna; Andersen, Claus E.; Ulbak, Kaare; Hertel, Ole; Johansen, Christoffer; Schüz, Joachim; Raaschou-Nielsen, Ole

    2014-01-01

    We investigated whether there is an interaction between distance from residence at birth to nearest power line and domestic radon and traffic-related air pollution, respectively, in relation to childhood leukemia risk. Further, we investigated whether adjusting for potential confounders alters the association between distance to nearest power line and childhood leukemia. We included 1024 cases aged <15, diagnosed with leukemia during 1968–1991, from the Danish Cancer Registry and 2048 controls randomly selected from the Danish childhood population and individually matched by gender and year of birth. We used geographical information systems to determine the distance between residence at birth and the nearest 132–400 kV overhead power line. Concentrations of domestic radon and traffic-related air pollution (NOx at the front door) were estimated using validated models. We found a statistically significant interaction between distance to nearest power line and domestic radon regarding risk of childhood leukemia (p = 0.01) when using the median radon level as cut-off point but not when using the 75th percentile (p = 0.90). We found no evidence of an interaction between distance to nearest power line and traffic-related air pollution (p = 0.73). We found almost no change in the estimated association between distance to power line and risk of childhood leukemia when adjusting for socioeconomic status of the municipality, urbanization, maternal age, birth order, domestic radon and traffic-related air pollution. The statistically significant interaction between distance to nearest power line and domestic radon was based on few exposed cases and controls and sensitive to the choice of exposure categorization and might, therefore, be due to chance. PMID:25259740

  6. Distance to high-voltage power lines and risk of childhood leukemia--an analysis of confounding by and interaction with other potential risk factors.

    PubMed

    Pedersen, Camilla; Bräuner, Elvira V; Rod, Naja H; Albieri, Vanna; Andersen, Claus E; Ulbak, Kaare; Hertel, Ole; Johansen, Christoffer; Schüz, Joachim; Raaschou-Nielsen, Ole

    2014-01-01

    We investigated whether there is an interaction between distance from residence at birth to nearest power line and domestic radon and traffic-related air pollution, respectively, in relation to childhood leukemia risk. Further, we investigated whether adjusting for potential confounders alters the association between distance to nearest power line and childhood leukemia. We included 1024 cases aged <15, diagnosed with leukemia during 1968-1991, from the Danish Cancer Registry and 2048 controls randomly selected from the Danish childhood population and individually matched by gender and year of birth. We used geographical information systems to determine the distance between residence at birth and the nearest 132-400 kV overhead power line. Concentrations of domestic radon and traffic-related air pollution (NOx at the front door) were estimated using validated models. We found a statistically significant interaction between distance to nearest power line and domestic radon regarding risk of childhood leukemia (p = 0.01) when using the median radon level as cut-off point but not when using the 75th percentile (p = 0.90). We found no evidence of an interaction between distance to nearest power line and traffic-related air pollution (p = 0.73). We found almost no change in the estimated association between distance to power line and risk of childhood leukemia when adjusting for socioeconomic status of the municipality, urbanization, maternal age, birth order, domestic radon and traffic-related air pollution. The statistically significant interaction between distance to nearest power line and domestic radon was based on few exposed cases and controls and sensitive to the choice of exposure categorization and might, therefore, be due to chance.

  7. Introduction to causal diagrams for confounder selection.

    PubMed

    Williamson, Elizabeth J; Aitken, Zoe; Lawrie, Jock; Dharmage, Shyamali C; Burgess, John A; Forbes, Andrew B

    2014-04-01

    In respiratory health research, interest often lies in estimating the effect of an exposure on a health outcome. If randomization of the exposure of interest is not possible, estimating its effect is typically complicated by confounding bias. This can often be dealt with by controlling for the variables causing the confounding, if measured, in the statistical analysis. Common statistical methods used to achieve this include multivariable regression models adjusting for selected confounding variables or stratification on those variables. Therefore, a key question is which measured variables need to be controlled for in order to remove confounding. An approach to confounder-selection based on the use of causal diagrams (often called directed acyclic graphs) is discussed. A causal diagram is a visual representation of the causal relationships believed to exist between the variables of interest, including the exposure, outcome and potential confounding variables. After creating a causal diagram for the research question, an intuitive and easy-to-use set of rules can be applied, based on a foundation of rigorous mathematics, to decide which measured variables must be controlled for in the statistical analysis in order to remove confounding, to the extent that is possible using the available data. This approach is illustrated by constructing a causal diagram for the research question: 'Does personal smoking affect the risk of subsequent asthma?'. Using data taken from the Tasmanian Longitudinal Health Study, the statistical analysis suggested by the causal diagram approach was performed.

  8. [Bias in observational research: 'confounding'].

    PubMed

    Groenwold, Rolf H H

    2012-01-01

    Confounding is an important and common issue in observational (non-randomized) research on the effects of pharmaceuticals or exposure to etiologic factors (determinants). Confounding is present when a third factor, related to both the determinant and the outcome, distorts the causal relation between these two. There are different methods to control for confounding. The most commonly used are restriction, stratification, multivariable regression models, and propensity score methods. With these methods it is only possible to control for variables for which data is known: measured confounders. Research in the area of confounding is currently directed at the incorporation of external knowledge on unmeasured confounders, the evaluation of instrumental variables, and the impact of time-dependent confounding.

  9. Identifiability, exchangeability and confounding revisited

    PubMed Central

    Greenland, Sander; Robins, James M

    2009-01-01

    In 1986 the International Journal of Epidemiology published "Identifiability, Exchangeability and Epidemiological Confounding". We review the article from the perspective of a quarter century after it was first drafted and relate it to subsequent developments on confounding, ignorability, and collapsibility. PMID:19732410

  10. Confounding in longitudinal studies in addiction treatment research

    PubMed Central

    Pierce, Matthias; Dunn, Graham; Millar, Tim

    2017-01-01

    Background: The effectiveness of treatment for people with substance use disorders is usually examined using longitudinal cohorts. In these studies, treatment is often considered as a time-varying exposure. The aim of this commentary is to examine confounding in this context, when the confounding variable is time-invariant and when it is time-varying. Method: Types of confounding are described with examples and illustrated using path diagrams. Simulations are used to demonstrate the direction of confounding bias and the extent that it is accounted for using standard regression adjustment techniques. Results: When the confounding variable is time invariant or time varying and not influenced by prior treatment, then standard adjustment techniques are adequate to control for confounding bias, provided that in the latter scenario the time-varying form of the variable is used. When the confounder is time varying and affected by prior treatment status (i.e. it is a mediator of treatment), then standard methods of adjustment result in inconsistency. Conclusions: In longitudinal cohorts where treatment exposure is time varying, confounding is an issue which should be considered, even if treatment exposure is initially randomized. In these studies, standard methods of adjustment may be inadequate, even when all confounders have been identified. This occurs when the confounder is also a mediator of treatment. This is a likely scenario in many studies in addiction.

  11. Confounding in longitudinal studies in addiction treatment research.

    PubMed

    Pierce, Matthias; Dunn, Graham; Millar, Tim

    2017-05-04

    Background: The effectiveness of treatment for people with substance use disorders is usually examined using longitudinal cohorts. In these studies, treatment is often considered as a time-varying exposure. The aim of this commentary is to examine confounding in this context, when the confounding variable is time-invariant and when it is time-varying. Method: Types of confounding are described with examples and illustrated using path diagrams. Simulations are used to demonstrate the direction of confounding bias and the extent that it is accounted for using standard regression adjustment techniques. Results: When the confounding variable is time invariant or time varying and not influenced by prior treatment, then standard adjustment techniques are adequate to control for confounding bias, provided that in the latter scenario the time-varying form of the variable is used. When the confounder is time varying and affected by prior treatment status (i.e. it is a mediator of treatment), then standard methods of adjustment result in inconsistency. Conclusions: In longitudinal cohorts where treatment exposure is time varying, confounding is an issue which should be considered, even if treatment exposure is initially randomized. In these studies, standard methods of adjustment may be inadequate, even when all confounders have been identified. This occurs when the confounder is also a mediator of treatment. This is a likely scenario in many studies in addiction.
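
    A small simulation makes the commentary's central point concrete: when a time-varying confounder is also affected by earlier treatment, adjusting for it removes part of the earlier treatment's effect, while omitting it leaves the later treatment confounded. The Python sketch below uses invented coefficients and is not taken from the paper.

        # Hedged simulation sketch (invented coefficients): l1 confounds the later
        # treatment a1 but is also a mediator of the earlier treatment a0, so neither
        # standard regression recovers a0's total effect.
        import numpy as np

        rng = np.random.default_rng(3)
        n = 100_000
        a0 = rng.binomial(1, 0.5, n)                  # earlier treatment (randomized)
        l1 = 1.0 * a0 + rng.normal(size=n)            # confounder affected by a0
        a1 = rng.binomial(1, 1 / (1 + np.exp(-l1)))   # later treatment depends on l1
        y = 1.0 * a0 + 1.0 * a1 + 1.0 * l1 + rng.normal(size=n)

        def ols(outcome, cols):
            X = np.column_stack([np.ones(n)] + cols)
            return np.linalg.lstsq(X, outcome, rcond=None)[0]

        def simulate_y(a0_fixed):
            # counterfactual simulation to obtain the true total effect of a0
            l1_cf = 1.0 * a0_fixed + rng.normal(size=n)
            a1_cf = rng.binomial(1, 1 / (1 + np.exp(-l1_cf)))
            return 1.0 * a0_fixed + 1.0 * a1_cf + 1.0 * l1_cf + rng.normal(size=n)

        true_total = simulate_y(np.ones(n)).mean() - simulate_y(np.zeros(n)).mean()
        with_l1 = ols(y, [a0, a1, l1])   # a1 unconfounded, but a0 reduced to its direct effect
        without_l1 = ols(y, [a0, a1])    # a0 still not the total effect; a1 confounded by l1

        print("true total effect of a0        :", round(true_total, 2))
        print("a0, a1 coefficients with l1    :", round(with_l1[1], 2), round(with_l1[2], 2))
        print("a0, a1 coefficients without l1 :", round(without_l1[1], 2), round(without_l1[2], 2))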

  12. Ace Your Accounting Classes: 12 Hints to Maximize Your Potential

    ERIC Educational Resources Information Center

    Albrecht, W. David

    2008-01-01

    Many students experience difficulties when they try to get good grades in their accounting classes, and they are searching for answers. There is no single answer. Getting a good grade in an accounting class results from a process. If you know and understand the process--and can apply it--then your chances are much improved for getting a good…

  13. Accountability.

    ERIC Educational Resources Information Center

    Lashway, Larry

    1999-01-01

    This issue reviews publications that provide a starting point for principals looking for a way through the accountability maze. Each publication views accountability differently, but collectively these readings argue that even in an era of state-mandated assessment, principals can pursue proactive strategies that serve students' needs. James A.…

  14. History of the modern epidemiological concept of confounding.

    PubMed

    Morabia, Alfredo

    2011-04-01

    The epidemiological concept of confounding has had a convoluted history. It was first expressed as an issue of group non-comparability, later as an uncontrolled fallacy, then as a controllable fallacy named confounding, and, more recently, as an issue of group non-comparability in the distribution of potential outcome types. This latest development synthesised the apparent disconnect between phases of the history of confounding. Group non-comparability is the essence of confounding, and the statistical fallacy its consequence. This essay discusses how confounding was perceived in the 18th and 19th centuries, reviews how the concept evolved across the 20th century and finally describes the modern definition of confounding.

  15. Interpretational Confounding or Confounded Interpretations of Causal Indicators?

    PubMed Central

    Bainter, Sierra A.; Bollen, Kenneth A.

    2014-01-01

    In measurement theory causal indicators are controversial and little-understood. Methodological disagreement concerning causal indicators has centered on the question of whether causal indicators are inherently sensitive to interpretational confounding, which occurs when the empirical meaning of a latent construct departs from the meaning intended by a researcher. This article questions the validity of evidence used to claim that causal indicators are inherently susceptible to interpretational confounding. Further, a simulation study demonstrates that causal indicator coefficients are stable across correctly-specified models. Determining the suitability of causal indicators has implications for the way we conceptualize measurement and build and evaluate measurement models. PMID:25530730

  16. Interpretational Confounding or Confounded Interpretations of Causal Indicators?

    ERIC Educational Resources Information Center

    Bainter, Sierra A.; Bollen, Kenneth A.

    2014-01-01

    In measurement theory, causal indicators are controversial and little understood. Methodological disagreement concerning causal indicators has centered on the question of whether causal indicators are inherently sensitive to interpretational confounding, which occurs when the empirical meaning of a latent construct departs from the meaning…

  17. Modeling confounding by half-sibling regression

    PubMed Central

    Schölkopf, Bernhard; Hogg, David W.; Wang, Dun; Foreman-Mackey, Daniel; Janzing, Dominik; Simon-Gabriel, Carl-Johann; Peters, Jonas

    2016-01-01

    We describe a method for removing the effect of confounders to reconstruct a latent quantity of interest. The method, referred to as “half-sibling regression,” is inspired by recent work in causal inference using additive noise models. We provide a theoretical justification, discussing both independent and identically distributed as well as time series data, respectively, and illustrate the potential of the method in a challenging astronomy application. PMID:27382154

  18. Modeling confounding by half-sibling regression.

    PubMed

    Schölkopf, Bernhard; Hogg, David W; Wang, Dun; Foreman-Mackey, Daniel; Janzing, Dominik; Simon-Gabriel, Carl-Johann; Peters, Jonas

    2016-07-05

    We describe a method for removing the effect of confounders to reconstruct a latent quantity of interest. The method, referred to as "half-sibling regression," is inspired by recent work in causal inference using additive noise models. We provide a theoretical justification, discussing both independent and identically distributed as well as time series data, respectively, and illustrate the potential of the method in a challenging astronomy application.
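
    As a rough illustration of the idea (not the authors' implementation), the sketch below simulates a target time series contaminated by shared systematics, regresses it on "half-sibling" series that share the systematics but not the signal, and keeps the residual as the estimate of the latent quantity; all variable names and parameter values are invented.

```python
import numpy as np

rng = np.random.default_rng(0)
n_time, n_siblings = 500, 20

# Shared systematics (the "confounder"), e.g. instrument drift affecting all series
systematics = rng.normal(size=(n_time, 3))
signal = 0.5 * np.sin(np.linspace(0, 20, n_time))          # latent quantity of interest
target = signal + systematics @ rng.normal(size=3) + 0.1 * rng.normal(size=n_time)
siblings = (systematics @ rng.normal(size=(3, n_siblings))
            + 0.1 * rng.normal(size=(n_time, n_siblings)))

# Half-sibling regression: regress the target on series that share the systematics
# but not the signal, and keep the residual as the estimate of the latent signal.
coef, *_ = np.linalg.lstsq(siblings, target, rcond=None)
signal_hat = target - siblings @ coef

print("corr(signal, raw target)   =", round(float(np.corrcoef(signal, target)[0, 1]), 2))
print("corr(signal, HSR estimate) =", round(float(np.corrcoef(signal, signal_hat)[0, 1]), 2))
```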

  19. Distinguishing Selection Bias and Confounding Bias in Comparative Effectiveness Research.

    PubMed

    Haneuse, Sebastien

    2016-04-01

    Comparative effectiveness research (CER) aims to provide patients and physicians with evidence-based guidance on treatment decisions. As researchers conduct CER they face myriad challenges. Although inadequate control of confounding is the most-often cited source of potential bias, selection bias that arises when patients are differentially excluded from analyses is a distinct phenomenon with distinct consequences: confounding bias compromises internal validity, whereas selection bias compromises external validity. Despite this distinction, however, the label "treatment-selection bias" is being used in the CER literature to denote the phenomenon of confounding bias. Motivated by an ongoing study of treatment choice for depression on weight change over time, this paper formally distinguishes selection and confounding bias in CER. By formally distinguishing selection and confounding bias, this paper clarifies important scientific, design, and analysis issues relevant to ensuring validity. First is that the 2 types of biases may arise simultaneously in any given study; even if confounding bias is completely controlled, a study may nevertheless suffer from selection bias so that the results are not generalizable to the patient population of interest. Second is that the statistical methods used to mitigate the 2 biases are themselves distinct; methods developed to control one type of bias should not be expected to address the other. Finally, the control of selection and confounding bias will often require distinct covariate information. Consequently, as researchers plan future studies of comparative effectiveness, care must be taken to ensure that all data elements relevant to both confounding and selection bias are collected.

  20. Potential Impact of the Elimination of the M Account on the Department of the Navy

    DTIC Science & Technology

    1991-12-01

    Naval Postgraduate School thesis (Monterey, California; AD-A246 599) by Fegurgur, Ben A. and Marinello, Anthony F., examining the potential impact of the elimination of the M account on the Department of the Navy, including whether the reprogramming process would be overburdened due to the elimination of the M account.

  1. Test-Based Accountability: Potential Benefits and Pitfalls of Science Assessment with Student Diversity

    ERIC Educational Resources Information Center

    Penfield, Randall D.; Lee, Okhee

    2010-01-01

    Recent test-based accountability policy in the U.S. has involved annually assessing all students in core subjects and holding schools accountable for adequate progress of all students by implementing sanctions when adequate progress is not met. Despite its potential benefits, basing educational policy on assessments developed for a student…

  2. Confounding and causation in the epidemiology of lead.

    PubMed

    Wilson, Ian Harold; Wilson, Simon Barton

    2016-01-01

    The National Health and Medical Research Council recently reported that there were not enough high-quality studies to conclude that associations between health effects and blood lead levels <10 μg/dL were caused by lead. It identified uncontrolled confounding, measurement error and other potential causal factors as common weaknesses. This paper supports those findings with evidence of uncontrolled confounding by parental education, intelligence or household management from several papers. It suggests that inappropriate statistical tests and aggregation of data representing different exposure routes partly explain why confounding has been overlooked. Inadequate correction of confounding has contributed to incorrect conclusions regarding causality at low levels of lead. Linear or log-linear regression models have tended to mask any threshold. While the effects of higher levels of lead exposure are not disputed, overestimation of health effects at low lead exposures has significant implications for policy-makers endeavouring to protect public health through cost-effective regulations.

  3. Opportunities for minimization of confounding in observational research.

    PubMed

    Quartey, George; Feudjo-Tepie, Maurille; Wang, Jixian; Kim, Joseph

    2011-01-01

    Observational epidemiological studies are increasingly used in pharmaceutical research to evaluate the safety and effectiveness of medicines. Such studies can complement findings from randomized clinical trials by involving larger and more generalizable patient populations, by accruing greater durations of follow-up, and by representing what happens more typically in the clinical setting. However, the interpretation of exposure effects in observational studies is almost always complicated by non-random exposure allocation, which can result in confounding and potentially lead to misleading conclusions. Confounding occurs when an extraneous factor, related to both the exposure and the outcome of interest, partly or entirely explains the relationship observed between the study exposure and the outcome. Although randomization can eliminate confounding by distributing all such extraneous factors equally across the levels of a given exposure, methods for dealing with confounding in observational studies include a careful choice of study design and the possible use of advanced analytical methods. The aim of this paper is to introduce the reader working in the pharmaceutical industry to some of the approaches that can be used to help minimize the impact of confounding in observational research.

  4. Confounding and bias in the attributable fraction.

    PubMed

    Darrow, Lyndsey A; Steenland, N Kyle

    2011-01-01

    Inappropriate methods are frequently used to calculate the population attributable fraction (AF) for a given exposure of interest. This commonly occurs when authors use adjusted relative risks (RRs) reported in the literature (the "source" data), without access to the original data. In this analysis, we examine the relationship between the direction and magnitude of confounding in the source data and resulting bias in the attributable fraction when incorrect methods are used. We assess confounding by the confounding risk ratio, which is the ratio of the crude RR to the adjusted RR. We assess bias in the AF by the ratio of the incorrectly calculated AF to the correctly calculated AF. Using generated data, we examine the relationship between confounding and AF bias under various scenarios of population prevalence of exposure and strength of the exposure-disease association. For confounding risk ratios greater than 1.0 (ie, crude RR > adjusted RR), the AF is underestimated; for confounding risk ratios less than 1.0 (ie, crude RR < adjusted RR), the AF is overestimated. The magnitude of the bias grows as confounding increases, and is dependent on the prevalence of exposure in the total population, with bias greatest at the lowest prevalence of exposure. Bias in the AF is also higher when the exposure-disease association is weaker. Results of these analyses can assist interpretation of incorrectly calculated attributable fraction estimates commonly reported in the epidemiologic literature.
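
    The toy calculation below (invented parameters, Python) illustrates the mechanics described above: it computes the confounding risk ratio for a scenario with positive confounding, then contrasts an incorrectly calculated AF (Levin's formula applied to the adjusted RR) with a correctly calculated AF (Miettinen's case-based formula).

```python
# Illustrative numbers only: a binary confounder Z that is more common among
# the exposed and independently triples risk; the true (adjusted) RR is 2.0.
p_e = 0.5                              # prevalence of exposure
p_z_given_e, p_z_given_u = 0.8, 0.2    # prevalence of Z among exposed / unexposed
r0, rr_e, rr_z = 0.01, 2.0, 3.0

def risk(e, z):
    return r0 * (rr_e ** e) * (rr_z ** z)

risk_exposed   = p_z_given_e * risk(1, 1) + (1 - p_z_given_e) * risk(1, 0)
risk_unexposed = p_z_given_u * risk(0, 1) + (1 - p_z_given_u) * risk(0, 0)

rr_crude = risk_exposed / risk_unexposed
rr_adj = rr_e                          # no effect modification by construction
confounding_rr = rr_crude / rr_adj     # ratio of crude to adjusted RR (>1 here)

# Incorrect: Levin's formula (valid only with the crude RR) fed the adjusted RR
af_wrong = p_e * (rr_adj - 1) / (p_e * (rr_adj - 1) + 1)

# Correct: Miettinen's formula, using the exposure prevalence among cases
p_case_exposed = (p_e * risk_exposed) / (p_e * risk_exposed + (1 - p_e) * risk_unexposed)
af_right = p_case_exposed * (rr_adj - 1) / rr_adj

print(f"confounding RR = {confounding_rr:.2f}")
print(f"AF (incorrect) = {af_wrong:.3f}, AF (correct) = {af_right:.3f}, "
      f"bias ratio = {af_wrong / af_right:.2f}")
```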

  5. Air Pollution and Autism Spectrum Disorders: Causal or Confounded?

    PubMed

    Weisskopf, Marc G; Kioumourtzoglou, Marianthi-Anna; Roberts, Andrea L

    2015-12-01

    In the last decade, several studies have examined the association between perinatal exposure to ambient air pollution and risk of autism spectrum disorder (ASD). These studies have largely been consistent, with associations seen with different aspects of air pollution, including hazardous air toxics, ozone, particulate, and traffic-related pollution. Confounding by socioeconomic status (SES) and place of residence are of particular concern, as these can be related to ASD case ascertainment and other potential causal risk factors for ASD. While all studies take steps to address this concern, residual confounding is difficult to rule out. Two recent studies of air pollution and ASD, however, present findings that strongly argue against residual confounding, especially for factors that do not vary over relatively short time intervals. These two studies, conducted in communities around the USA, found a specific association with air pollution exposure during the 3rd, but not the 1st, trimester, when both trimesters were modeled simultaneously. In this review, we discuss confounding possibilities and then explain, with the aid of directed acyclic graphs (DAGs), why an association that is specific to a particular time window, when multiple exposure windows are simultaneously assessed, argues against residual confounding by (even unmeasured) non-time-varying factors. In addition, we discuss why examining ambient air pollution concentration as a proxy for personal exposure helps avoid confounding by personal behavior differences, and the implications of measurement error in using ambient concentrations as a proxy for personal exposures. Given the general consistency of findings across studies and the exposure-window-specific associations recently reported, the overall evidence for a causal association between air pollution and ASD is increasingly compelling.

  6. Packet Randomized Experiments for Eliminating Classes of Confounders

    PubMed Central

    Pavela, Greg; Wiener, Howard; Fontaine, Kevin R.; Fields, David A.; Voss, Jameson D.; Allison, David B.

    2014-01-01

    Background Although randomization is considered essential for causal inference, it is often not possible to randomize in nutrition and obesity research. To address this, we develop a framework for an experimental design—packet randomized experiments (PREs), which improves causal inferences when randomization on a single treatment variable is not possible. This situation arises when subjects are randomly assigned to a condition (such as a new roommate) which varies in one characteristic of interest (such as weight), but also varies across many others. There has been no general discussion of this experimental design, including its strengths, limitations, and statistical properties. As such, researchers are left to develop and apply PREs on an ad hoc basis, limiting its potential to improve causal inferences among nutrition and obesity researchers. Methods We introduce PREs as an intermediary design between randomized controlled trials and observational studies. We review previous research that used the PRE design and describe its application in obesity-related research, including random roommate assignments, heterochronic parabiosis, and the quasi-random assignment of subjects to geographic areas. We then provide a statistical framework to control for potential packet-level confounders not accounted for by randomization. Results PREs have successfully been used to improve causal estimates of the effect of roommates, altitude, and breastfeeding on weight outcomes. When certain assumptions are met, PREs can asymptotically control for packet-level characteristics. This has the potential to statistically estimate the effect of a single treatment even when randomization to a single treatment did not occur. Conclusions Applying PREs to obesity-related research will improve decisions about clinical, public health, and policy actions insofar as it offers researchers new insight into cause and effect relationships among variables. PMID:25444088

  7. Confounding and exposure measurement error in air pollution epidemiology.

    PubMed

    Sheppard, Lianne; Burnett, Richard T; Szpiro, Adam A; Kim, Sun-Young; Jerrett, Michael; Pope, C Arden; Brunekreef, Bert

    2012-06-01

    Studies in air pollution epidemiology may suffer from some specific forms of confounding and exposure measurement error. This contribution discusses these, mostly in the framework of cohort studies. Evaluation of potential confounding is critical in studies of the health effects of air pollution. The association between long-term exposure to ambient air pollution and mortality has been investigated using cohort studies in which subjects are followed over time with respect to their vital status. In such studies, control for individual-level confounders such as smoking is important, as is control for area-level confounders such as neighborhood socio-economic status. In addition, there may be spatial dependencies in the survival data that need to be addressed. These issues are illustrated using the American Cancer Society Cancer Prevention II cohort. Exposure measurement error is a challenge in epidemiology because inference about health effects can be incorrect when the measured or predicted exposure used in the analysis is different from the underlying true exposure. Air pollution epidemiology rarely if ever uses personal measurements of exposure for reasons of cost and feasibility. Exposure measurement error in air pollution epidemiology comes in various dominant forms, which are different for time-series and cohort studies. The challenges are reviewed and a number of suggested solutions are discussed for both study domains.

  8. Role of Environmental Confounding in the Association between FKBP5 and First-Episode Psychosis

    PubMed Central

    Ajnakina, Olesya; Borges, Susana; Di Forti, Marta; Patel, Yogen; Xu, Xiaohui; Green, Priscilla; Stilo, Simona A.; Kolliakou, Anna; Sood, Poonam; Marques, Tiago Reis; David, Anthony S.; Prata, Diana; Dazzan, Paola; Powell, John; Pariante, Carmine; Mondelli, Valeria; Morgan, Craig; Murray, Robin M.; Fisher, Helen L.; Iyegbe, Conrad

    2014-01-01

    Background: Failure to account for the etiological diversity that typically occurs in psychiatric cohorts may increase the potential for confounding as a proportion of genetic variance will be specific to exposures that have varying distributions in cases. This study investigated whether minimizing the potential for such confounding strengthened the evidence for a genetic candidate currently unsupported at the genome-wide level. Methods: Two hundred and ninety-one first-episode psychosis cases from South London, UK and 218 unaffected controls were evaluated for a functional polymorphism at the rs1360780 locus in FKBP5. The relationship between FKBP5 and psychosis was modeled using logistic regression. Cannabis use (Cannabis Experiences Questionnaire) and parental separation (Childhood Experience of Care and Abuse Questionnaire) were included as confounders in the analysis. Results: Association at rs1360780 was not detected until the effects of the two environmental factors had been adjusted for in the model (OR = 2.81, 95% CI 1.23–6.43, p = 0.02). A statistical interaction between rs1360780 and parental separation was confirmed by stratified tests (OR = 2.8, p = 0.02 vs. OR = 0.89, p = 0.80). The genetic main effect was directionally consistent with findings in other (stress-related) clinical phenotypes. Moreover, the variation in effect magnitude was explained by the level of power associated with different cannabis constructs used in the model (r = 0.95). Conclusion: Our results suggest that the extent to which genetic variants in FKBP5 can influence susceptibility to psychosis may depend on other etiological factors. This finding requires further validation in large independent cohorts. Potentially this work could have translational implications; the ability to discriminate between genetic etiologies based on a case-by-case understanding of previous environmental exposures would confer an important clinical advantage that would
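
    A schematic sketch of this kind of analysis, using simulated data and hypothetical variable names with the statsmodels package (not the study's actual data or code): a logistic model for case status adjusted for the two environmental factors, followed by stratified models within levels of parental separation.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 500
df = pd.DataFrame({
    "risk_genotype":       rng.binomial(1, 0.3, n),   # hypothetical risk-allele carrier flag
    "cannabis_use":        rng.binomial(1, 0.4, n),
    "parental_separation": rng.binomial(1, 0.3, n),
})
# Simulated outcome: the genotype matters mainly when parental separation is present
logit = (-1.5 + 0.2 * df["risk_genotype"] + 0.8 * df["cannabis_use"]
         + 0.6 * df["parental_separation"]
         + 0.9 * df["risk_genotype"] * df["parental_separation"])
df["case"] = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit.to_numpy())))

# Genotype effect adjusted for the two environmental factors
adjusted = smf.logit("case ~ risk_genotype + cannabis_use + parental_separation",
                     data=df).fit(disp=0)
print("adjusted OR:", round(float(np.exp(adjusted.params["risk_genotype"])), 2))

# Stratified tests: genotype effect within each parental-separation stratum
for level, sub in df.groupby("parental_separation"):
    m = smf.logit("case ~ risk_genotype + cannabis_use", data=sub).fit(disp=0)
    print("parental_separation =", level,
          "OR:", round(float(np.exp(m.params["risk_genotype"])), 2))
```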

  9. Bayesian modeling of cost-effectiveness studies with unmeasured confounding: a simulation study.

    PubMed

    Stamey, James D; Beavers, Daniel P; Faries, Douglas; Price, Karen L; Seaman, John W

    2014-01-01

    Unmeasured confounding is a common problem in observational studies. Failing to account for unmeasured confounding can result in biased point estimators and poor performance of hypothesis tests and interval estimators. We provide examples of the impacts of unmeasured confounding on cost-effectiveness analyses using observational data along with a Bayesian approach to correct estimation. Assuming validation data are available, we propose a Bayesian approach to correct cost-effectiveness studies for unmeasured confounding. We consider the cases where both cost and effectiveness are assumed to have a normal distribution and when costs are gamma distributed and effectiveness is normally distributed. Simulation studies were conducted to determine the impact of ignoring the unmeasured confounder and to determine the size of the validation data required to obtain valid inferences.

  10. Confounding by dietary pattern of the inverse association between alcohol consumption and type 2 diabetes risk

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Epidemiologic studies of dietary components and disease risk have limited interpretability due to potential residual confounding by correlated dietary components. Dietary pattern analyses by factor analysis or partial least squares may overcome this limitation. To examine confounding by dietary pattern as well as ...

  11. Confounding by dietary patterns of the inverse association between alcohol consumption and type 2 diabetes risk

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Epidemiologic studies of dietary components and disease risk have limited interpretability due to potential residual confounding by correlated dietary components. Dietary pattern analyses by factor analysis or partial least squares may overcome this limitation. To examine confounding by dietary pattern as well as ...

  12. Confounding and control of confounding in nonexperimental studies of medications in patients with CKD.

    PubMed

    Bradbury, Brian D; Gilbertson, David T; Brookhart, M Alan; Kilpatrick, Ryan D

    2012-01-01

    Confounding is an important source of bias in nonexperimental studies, arising when the effect of an exposure on the occurrence of an outcome is distorted by the effect of some other factor. In nonexperimental studies of patients with CKD or who are on chronic dialysis, confounding is a significant concern owing to the high burden of comorbid disease, extent of required clinical management, and high frequency of adverse clinical events in this patient population. Confounding can be addressed in both the design stage (restriction, accurate measurement of confounders) and analysis stage (stratification, multivariable adjustment, propensity scores, marginal structural models, instrumental variable) of a study. Time-dependent confounding and confounding by indication are 2 special cases of confounding that can arise in studies of treatment effects and may require more sophisticated analytic techniques to adequately address. The availability and expanded use of large health care databases have ensured greater precision and have now placed the focus on validity. Addressing the major threats to validity, such as confounding, should be a first-order concern.

  13. An attentional-adaptation account of spatial negative priming: evidence from event-related potentials.

    PubMed

    Liu, Xiaonan L; Walsh, Matthew M; Reder, Lynne M

    2014-03-01

    Negative priming (NP) refers to a slower response to a target stimulus if it has been previously ignored. To examine theoretical accounts of spatial NP, we recorded behavioral measures and event-related potentials (ERPs) in a target localization task. A target and distractor briefly appeared, and the participant pressed a key corresponding to the target's location. The probability of the distractor appearing in each of four locations varied, whereas the target appeared with equal probabilities in all locations. We found that response times (RTs) were fastest when the prime distractor appeared in its most probable (frequent) location and when the prime target appeared in the location that never contained a distractor. Moreover, NP effects varied as a function of location: They were smallest when targets followed distractors in the frequent distractor location-a finding not predicted by episodic-retrieval or suppression accounts of NP. The ERP results showed that the P2, an ERP component associated with attentional orientation, was smaller in prime displays when the distractor appeared in its frequent location. Moreover, no differences were apparent between negative-prime and control trials in the N2, which is associated with suppression processes, nor in the P3, which is associated with episodic retrieval processes. These results indicate that the spatial NP effect is caused by both short- and long-term adaptation in preferences based on the history of inspecting unsuccessful locations. This article is dedicated to the memory of Edward E. Smith, and we indicate how this study was inspired by his research career.

  14. Does exposure prediction bias health-effect estimation?: The relationship between confounding adjustment and exposure prediction.

    PubMed

    Cefalu, Matthew; Dominici, Francesca

    2014-07-01

    In environmental epidemiology, we are often faced with 2 challenges. First, an exposure prediction model is needed to estimate the exposure to an agent of interest, ideally at the individual level. Second, when estimating the health effect associated with the exposure, confounding adjustment is needed in the health-effects regression model. The current literature addresses these 2 challenges separately. That is, methods that account for measurement error in the predicted exposure often fail to acknowledge the possibility of confounding, whereas methods designed to control confounding often fail to acknowledge that the exposure has been predicted. In this article, we consider exposure prediction and confounding adjustment in a health-effects regression model simultaneously. Using theoretical arguments and simulation studies, we show that the bias of a health-effect estimate is influenced by the exposure prediction model, the type of confounding adjustment used in the health-effects regression model, and the relationship between these 2. Moreover, we argue that even with a health-effects regression model that properly adjusts for confounding, the use of a predicted exposure can bias the health-effect estimate unless all confounders included in the health-effects regression model are also included in the exposure prediction model. While the results of this article were motivated by studies of environmental contaminants, they apply more broadly to any context where an exposure needs to be predicted.

  15. Handling stress may confound murine gut microbiota studies

    PubMed Central

    Allen-Blevins, Cary R.; You, Xiaomeng; Hinde, Katie

    2017-01-01

    Background Accumulating evidence indicates interactions between human milk composition, particularly sugars (human milk oligosaccharides or HMO), the gut microbiota of human infants, and behavioral effects. Some HMO secreted in human milk are unable to be endogenously digested by the human infant but are able to be metabolized by certain species of gut microbiota, including Bifidobacterium longum subsp. infantis (B. infantis), a species sensitive to host stress (Bailey & Coe, 2004). Exposure to gut bacteria like B. infantis during critical neurodevelopment windows in early life appears to have behavioral consequences; however, environmental, physical, and social stress during this period can also have behavioral and microbial consequences. While rodent models are a useful method for determining causal relationships between HMO, gut microbiota, and behavior, murine studies of gut microbiota usually employ oral gavage, a technique stressful to the mouse. Our aim was to develop a less-invasive technique for HMO administration to remove the potential confound of gavage stress. Under the hypothesis that stress affects gut microbiota, particularly B. infantis, we predicted the pups receiving a prebiotic solution in a less-invasive manner would have the highest amount of Bifidobacteria in their gut. Methods This study was designed to test two methods, active and passive, of solution administration to mice and the effects on their gut microbiome. Neonatal C57BL/6J mice housed in a specific-pathogen free facility received increasing doses of fructooligosaccharide (FOS) solution or deionized, distilled water. Gastrointestinal (GI) tracts were collected from five dams, six sires, and 41 pups over four time points. Seven fecal pellets from unhandled pups and two pellets from unhandled dams were also collected. Quantitative real-time polymerase chain reaction (qRT-PCR) was used to quantify and compare the amount of Bifidobacterium, Bacteroides, Bacteroidetes, and Firmicutes

  16. A typology of four notions of confounding in epidemiology.

    PubMed

    Suzuki, Etsuji; Mitsuhashi, Toshiharu; Tsuda, Toshihide; Yamamoto, Eiji

    2017-02-01

    Confounding is a major concern in epidemiology. Despite its significance, the different notions of confounding have not been fully appreciated in the literature, leading to confusion of causal concepts in epidemiology. In this article, we aim to highlight the importance of differentiating between the subtly different notions of confounding from the perspective of counterfactual reasoning. By using a simple example, we illustrate the significance of considering the distribution of response types to distinguish causation from association, highlighting that confounding depends not only on the population chosen as the target of inference, but also on the notions of confounding in distribution and confounding in measure. This point has been relatively underappreciated, partly because some literature on the concept of confounding has only used the exposed and unexposed groups as the target populations, while it would be helpful to use the total population as the target population. Moreover, to clarify a further distinction between confounding "in expectation" and "realized" confounding, we illustrate the usefulness of examining the distribution of exposure status in the target population. To grasp the explicit distinction between confounding in expectation and realized confounding, we need to understand the mechanism that generates exposure events, not the product of that mechanism. Finally, we graphically illustrate this point, highlighting the usefulness of directed acyclic graphs in examining the presence of confounding in distribution, in the notion of confounding in expectation.

  17. A typology of four notions of confounding in epidemiology

    PubMed Central

    Suzuki, Etsuji; Mitsuhashi, Toshiharu; Tsuda, Toshihide; Yamamoto, Eiji

    2016-01-01

    Confounding is a major concern in epidemiology. Despite its significance, the different notions of confounding have not been fully appreciated in the literature, leading to confusion of causal concepts in epidemiology. In this article, we aim to highlight the importance of differentiating between the subtly different notions of confounding from the perspective of counterfactual reasoning. By using a simple example, we illustrate the significance of considering the distribution of response types to distinguish causation from association, highlighting that confounding depends not only on the population chosen as the target of inference, but also on the notions of confounding in distribution and confounding in measure. This point has been relatively underappreciated, partly because some literature on the concept of confounding has only used the exposed and unexposed groups as the target populations, while it would be helpful to use the total population as the target population. Moreover, to clarify a further distinction between confounding “in expectation” and “realized” confounding, we illustrate the usefulness of examining the distribution of exposure status in the target population. To grasp the explicit distinction between confounding in expectation and realized confounding, we need to understand the mechanism that generates exposure events, not the product of that mechanism. Finally, we graphically illustrate this point, highlighting the usefulness of directed acyclic graphs in examining the presence of confounding in distribution, in the notion of confounding in expectation. PMID:28142011

  18. Bias Analysis for Uncontrolled Confounding in the Health Sciences.

    PubMed

    Arah, Onyebuchi A

    2017-03-20

    Uncontrolled confounding due to unmeasured confounders biases causal inference in health science studies using observational and imperfect experimental designs. The adoption of methods for analysis of bias due to uncontrolled confounding has been slow, despite the increasing availability of such methods. Bias analysis for such uncontrolled confounding is most useful in big data studies and systematic reviews to gauge the extent to which extraneous preexposure variables that affect the exposure and the outcome can explain some or all of the reported exposure-outcome associations. We review methods that can be applied during or after data analysis to adjust for uncontrolled confounding for different outcomes, confounders, and study settings. We discuss relevant bias formulas and how to obtain the required information for applying them. Finally, we develop a new intuitive generalized bias analysis framework for simulating and adjusting for the amount of uncontrolled confounding due to not measuring and adjusting for one or more confounders.
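
    One simple bias formula of the kind reviewed here, for a single unmeasured binary confounder, is sketched below; the numerical inputs are assumed sensitivity-analysis values for illustration, not estimates from any particular study.

```python
def rr_adjusted_for_u(rr_observed, rr_ud, p_u_exposed, p_u_unexposed):
    """Simple external adjustment of a risk ratio for one unmeasured binary confounder U.

    rr_ud         : assumed risk ratio relating U to the outcome
    p_u_exposed   : assumed prevalence of U among the exposed
    p_u_unexposed : assumed prevalence of U among the unexposed
    """
    bias_factor = (((rr_ud - 1) * p_u_exposed + 1)
                   / ((rr_ud - 1) * p_u_unexposed + 1))
    return rr_observed / bias_factor, bias_factor

# Assumed inputs: observed RR of 1.8, a confounder that doubles risk and is
# twice as common among the exposed as among the unexposed.
rr_corrected, bias = rr_adjusted_for_u(1.8, rr_ud=2.0, p_u_exposed=0.6, p_u_unexposed=0.3)
print(f"bias factor = {bias:.2f}, confounder-adjusted RR = {rr_corrected:.2f}")
```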

  19. Gaseous pollutants in particulate matter epidemiology: confounders or surrogates?

    PubMed Central

    Sarnat, J A; Schwartz, J; Catalano, P J; Suh, H H

    2001-01-01

    Air pollution epidemiologic studies use ambient pollutant concentrations as surrogates of personal exposure. Strong correlations among numerous ambient pollutant concentrations, however, have made it difficult to determine the relative contribution of each pollutant to a given health outcome and have led to criticism that health effect estimates for particulate matter may be biased due to confounding. In the current study we used data collected from a multipollutant exposure study conducted in Baltimore, Maryland, during both the summer and winter to address the potential for confounding further. Twenty-four-hour personal exposures and corresponding ambient concentrations to fine particulate matter (PM(2.5)), ozone, nitrogen dioxide, sulfur dioxide, and carbon monoxide were measured for 56 subjects. Results from correlation and regression analyses showed that personal PM(2.5) and gaseous air pollutant exposures were generally not correlated, as only 9 of the 178 individual-specific pairwise correlations were significant. Similarly, ambient concentrations were not associated with their corresponding personal exposures for any of the pollutants, except for PM(2.5), which had significant associations during both seasons (p < 0.0001). Ambient gaseous concentrations were, however, strongly associated with personal PM(2.5) exposures. The strongest associations were shown between ambient O(3) and personal PM(2.5) (p < 0.0001 during both seasons). These results indicate that ambient PM(2.5) concentrations are suitable surrogates for personal PM(2.5) exposures and that ambient gaseous concentrations are surrogates, as opposed to confounders, of PM(2.5). These findings suggest that the use of multiple pollutant models in epidemiologic studies of PM(2.5) may not be suitable and that health effects attributed to the ambient gases may actually be a result of exposures to PM(2.5). PMID:11675271

  20. Carotta: Revealing Hidden Confounder Markers in Metabolic Breath Profiles

    PubMed Central

    Hauschild, Anne-Christin; Frisch, Tobias; Baumbach, Jörg Ingo; Baumbach, Jan

    2015-01-01

    Computational breath analysis is a growing research area aiming at identifying volatile organic compounds (VOCs) in human breath to assist medical diagnostics of the next generation. While inexpensive and non-invasive bioanalytical technologies for metabolite detection in exhaled air and bacterial/fungal vapor exist and the first studies on the power of supervised machine learning methods for profiling of the resulting data were conducted, we lack methods to extract hidden data features emerging from confounding factors. Here, we present Carotta, a new cluster analysis framework dedicated to uncovering such hidden substructures by sophisticated unsupervised statistical learning methods. We study the power of transitivity clustering and hierarchical clustering to identify groups of VOCs with similar expression behavior over most patient breath samples and/or groups of patients with a similar VOC intensity pattern. This enables the discovery of dependencies between metabolites. On the one hand, this allows us to eliminate the effect of potential confounding factors hindering disease classification, such as smoking. On the other hand, we may also identify VOCs associated with disease subtypes or concomitant diseases. Carotta is an open source software with an intuitive graphical user interface promoting data handling, analysis and visualization. The back-end is designed to be modular, allowing for easy extensions with plugins in the future, such as new clustering methods and statistics. It does not require much prior knowledge or technical skills to operate. We demonstrate its power and applicability by means of one artificial dataset. We also apply Carotta exemplarily to a real-world example dataset on chronic obstructive pulmonary disease (COPD). While the artificial data are utilized as a proof of concept, we will demonstrate how Carotta finds candidate markers in our real dataset associated with confounders rather than the primary disease (COPD) and bronchial

  1. Infection with parasitic nematodes confounds vaccination efficacy.

    PubMed

    Urban, Joseph F; Steenhard, Nina R; Solano-Aguilar, Gloria I; Dawson, Harry D; Iweala, Onyinye I; Nagler, Cathryn R; Noland, Gregory S; Kumar, Nirbhay; Anthony, Robert M; Shea-Donohue, Terez; Weinstock, Joel; Gause, William C

    2007-08-19

    T helper (Th) cells produce signature cytokine patterns, induced largely by intracellular versus extracellular pathogens, that provide the cellular and molecular basis for counter-regulatory expression of protective immunity during concurrent infections. The production of IL-12 and IFN-gamma, for example, resulting from exposure to many bacterial, viral, and protozoan pathogens is responsible for Th1-derived protective responses that also can inhibit development of Th2-cells expressing IL-4-dependent immunity to extracellular helminth parasites and vice versa. In a similar manner, concurrent helminth infection alters optimal vaccine-induced responses in humans and livestock; however, the consequences of this condition have not been adequately studied, especially in the context of a challenge infection following vaccination. Demands for new and effective vaccines to control chronic and emerging diseases, and the need for rapid deployment of vaccines for biosecurity concerns, require a systematic evaluation of confounding factors that limit vaccine efficacy. One common albeit overlooked confounder is the presence of gastrointestinal nematode parasites in populations of humans and livestock targeted for vaccination. This is particularly important in areas of the world where helminth infections are prevalent, but the interplay between parasites and emerging diseases that can be transmitted worldwide makes this a global issue. In addition, it is not clear if the epidemic in allergic disease in industrialized countries substitutes for geohelminth infection to interfere with effective vaccination regimens. This presentation will focus on recent vaccination studies in mice experimentally infected with Heligmosomoides polygyrus to model the condition of gastrointestinal parasite infestation in mammalian populations targeted for vaccination. In addition, a large animal vaccination and challenge model against Mycoplasma hyopneumoniae in swine exposed to Ascaris suum will provide

  2. Identity and Epistemic Emotions during Knowledge Revision: A Potential Account for the Backfire Effect

    ERIC Educational Resources Information Center

    Trevors, Gregory J.; Muis, Krista R.; Pekrun, Reinhard; Sinatra, Gale M.; Winne, Philip H.

    2016-01-01

    Recent research has shown that for some topics, messages to refute and revise misconceptions may backfire. The current research offers one possible account for this backfire effect (i.e., the ironic strengthening of belief in erroneous information after an attempted refutation) from an educational psychology perspective and examines whether…

  3. Post-study therapy as a source of confounding in survival analysis of first-line studies in patients with advanced non-small-cell lung cancer.

    PubMed

    Zietemann, Vera D; Schuster, Tibor; Duell, Thomas Hg

    2011-06-01

    Clinical trials exploring the long-term effects of first-line therapy in patients with advanced non-small-cell lung cancer generally disregard subsequent treatment although most patients receive second and third-line therapies. The choice of further therapy depends on critical intermediate events such as disease progression and it is usually left at the physician's discretion. Time-dependent confounding may then arise with standard survival analyses producing biased effect estimates, even in randomized trials. Herein we describe the concept of time-dependent confounding in detail and discuss whether the response to first-line treatment may be a potential time-dependent confounding factor for survival in the context of subsequent therapy. A prospective observational study of 406 patients with advanced non-small-cell lung cancer served as an example base. There is evidence that time-dependent confounding may occur in multivariate survival analysis after first-line therapy when disregarding subsequent treatment. In the light of this important but underestimated aspect some of the large and meaningful recent clinical first-line lung cancer studies are discussed, focussing on subsequent treatment and its potential impact on the survival of the study patients. No recently performed lung cancer trial applied adequate statistical analyses despite the frequent use of subsequent therapies. In conclusion, effect estimates from standard survival analysis may be biased even in randomized controlled trials because of time-dependent confounding. To adequately assess treatment effects on long-term outcomes appropriate statistical analyses need to take subsequent treatment into account.

  4. Regularized Regression Versus the High-Dimensional Propensity Score for Confounding Adjustment in Secondary Database Analyses.

    PubMed

    Franklin, Jessica M; Eddings, Wesley; Glynn, Robert J; Schneeweiss, Sebastian

    2015-10-01

    Selection and measurement of confounders is critical for successful adjustment in nonrandomized studies. Although the principles behind confounder selection are now well established, variable selection for confounder adjustment remains a difficult problem in practice, particularly in secondary analyses of databases. We present a simulation study that compares the high-dimensional propensity score algorithm for variable selection with approaches that utilize direct adjustment for all potential confounders via regularized regression, including ridge regression and lasso regression. Simulations were based on 2 previously published pharmacoepidemiologic cohorts and used the plasmode simulation framework to create realistic simulated data sets with thousands of potential confounders. Performance of methods was evaluated with respect to bias and mean squared error of the estimated effects of a binary treatment. Simulation scenarios varied the true underlying outcome model, treatment effect, prevalence of exposure and outcome, and presence of unmeasured confounding. Across scenarios, high-dimensional propensity score approaches generally performed better than regularized regression approaches. However, including the variables selected by lasso regression in a regular propensity score model also performed well and may provide a promising alternative variable selection method.
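
    A rough sketch of the hybrid strategy mentioned in the last sentence, on simulated data with scikit-learn: screen covariates with an L1-penalized (lasso) outcome regression, fit an ordinary propensity score model on the selected covariates, and estimate the treatment effect with inverse-probability weights. This illustrates the general idea only and is not the algorithm evaluated in the paper.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
n, p = 2000, 200                          # many candidate covariates, modest sample
X = rng.normal(size=(n, p))
true_conf = X[:, :5].sum(axis=1)          # only the first 5 covariates are real confounders
treat = rng.binomial(1, 1 / (1 + np.exp(-0.5 * true_conf)))
y = rng.binomial(1, 1 / (1 + np.exp(-(-1.0 + 0.4 * treat + 0.7 * true_conf))))

# Step 1: lasso (L1-penalized) outcome regression to screen covariates
lasso = LogisticRegression(penalty="l1", solver="liblinear", C=0.05)
lasso.fit(np.column_stack([treat, X]), y)
selected = np.flatnonzero(lasso.coef_[0, 1:])     # indices of retained covariates

# Step 2: ordinary propensity score model restricted to the selected covariates
ps = LogisticRegression(max_iter=1000).fit(X[:, selected], treat).predict_proba(
    X[:, selected])[:, 1]

# Step 3: inverse-probability-weighted estimate of the treatment effect (risk difference)
rd = (np.average(y, weights=treat / ps)
      - np.average(y, weights=(1 - treat) / (1 - ps)))
print(f"{selected.size} covariates selected; IPW risk difference = {rd:.3f}")
```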

  5. Methodological issues of confounding in analytical epidemiologic studies.

    PubMed

    Hajian Tilaki, Karimollah

    2012-01-01

    Confounding can be thought of as the mixing of the effect of the exposure on the risk of disease with the effect of a third factor, which distorts measures of association such as the risk ratio or odds ratio. This bias arises because of the complex functional relationship of the confounder with both the exposure and the disease (outcome). In this article, we provide a conceptual review of confounding issues in epidemiologic studies, in particular observational studies and nonrandomized experimental studies. Using 2 by 2 tables and analytical examples, we show how the index of association is distorted when confounding is present. The criteria for and sources of confounding, along with several related issues, are addressed, and the advantages and disadvantages of several strategies for the control of confounding are discussed.
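
    In the same spirit as the 2 by 2 illustrations described above, the following sketch (with invented counts) shows a crude odds ratio that is inflated by a binary confounder, while the stratum-specific and Mantel-Haenszel odds ratios recover the underlying association.

```python
import numpy as np

# One 2x2 table per stratum of the confounder: rows = exposed / unexposed,
# columns = diseased / not diseased (invented counts).
strata = {
    "confounder = 0": np.array([[10, 90], [20, 380]]),
    "confounder = 1": np.array([[60, 140], [15, 85]]),
}

def odds_ratio(t):
    a, b = t[0]
    c, d = t[1]
    return (a * d) / (b * c)

for name, t in strata.items():
    print(name, "OR =", round(odds_ratio(t), 2))

crude = odds_ratio(sum(strata.values()))            # collapse tables, ignoring the confounder

# Mantel-Haenszel summary OR across strata
num = sum(t[0, 0] * t[1, 1] / t.sum() for t in strata.values())
den = sum(t[0, 1] * t[1, 0] / t.sum() for t in strata.values())
print("crude OR =", round(crude, 2), " Mantel-Haenszel OR =", round(num / den, 2))
```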

  6. Confounding Effect in Clinical Research of Otolaryngology and Its Control.

    PubMed

    Yu, Yong-qiang; Huang, Dong-yan; Armijo Olivo, Susan; Yang, Huai-an; Bambanini, Yagesh; Sonnenberg, Lyn; Clark, Brenda; Constantinescu, Gabriela; Qian Yu, Jason; Zhang, Ming

    2015-06-01

    Confounding is a critical issue in clinical research in otolaryngology because it can distort a study's conclusions. In this review, we introduce the definition of the confounding effect and the methods for verifying and controlling it. Confounding can be prevented through study design and adjusted for in data analysis. Clinicians should be aware of and cautious about confounding in their research: they should be able to set up a study design in which appropriate methods have been applied to prevent this effect, and they should know how to adjust for confounding after data collection. It is important to remember that it is sometimes impossible to eliminate confounding completely, and statistical methods are not a master key. Solid research knowledge and critical thinking are the most important tools for controlling confounding.

  7. Should we adjust for a confounder if empirical and theoretical criteria yield contradictory results? A simulation study

    PubMed Central

    Lee, Paul H.

    2014-01-01

    Confounders can be identified by one of two main strategies: empirical or theoretical. Although confounder identification strategies that combine empirical and theoretical strategies have been proposed, the need for adjustment remains unclear if the empirical and theoretical criteria yield contradictory results due to random error. We simulated several scenarios to mimic either the presence or the absence of a confounding effect and tested the accuracy of the exposure-outcome association estimates with and without adjustment. Various criteria (significance criterion, change-in-estimate (CIE) criterion with a 10% cutoff and with a simulated cutoff) were imposed, and a range of sample sizes were trialed. In the presence of a true confounding effect, unbiased estimates were obtained only by using the CIE criterion with a simulated cutoff. In the absence of a confounding effect, all criteria performed well regardless of adjustment. When the confounding factor was affected by both exposure and outcome, all criteria yielded accurate estimates without adjustment, but the adjusted estimates were biased. To conclude, theoretical confounders should be adjusted for regardless of the empirical evidence found. The adjustment for factors that do not have a confounding effect minimally affects the estimates. Potential confounders affected by both exposure and outcome should not be adjusted for. PMID:25124526
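
    A minimal sketch of the empirical change-in-estimate check with a 10% cutoff, on simulated data with statsmodels; the cutoff, variable names, and data-generating model are illustrative, and, as the study above cautions, this empirical check should not override theoretical knowledge about the causal structure.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 1000
z = rng.normal(size=n)                                  # candidate confounder
x = rng.binomial(1, 1 / (1 + np.exp(-z)))               # exposure depends on z
y = 0.5 * x + 0.8 * z + rng.normal(size=n)              # outcome depends on both
df = pd.DataFrame({"y": y, "x": x, "z": z})

crude    = smf.ols("y ~ x", data=df).fit().params["x"]
adjusted = smf.ols("y ~ x + z", data=df).fit().params["x"]

change = abs(adjusted - crude) / abs(crude)             # change-in-estimate (CIE)
print(f"crude = {crude:.2f}, adjusted = {adjusted:.2f}, CIE = {change:.1%}")
print("adjust for z" if change > 0.10 else "10% CIE criterion not met")
```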

  8. Assessment of the Potential for Human Resource Accounting in Venezuelan Navy Management Decision Making.

    DTIC Science & Technology

    1981-12-01

    This thesis examines the potential for Human Resource Accounting (HRA) in Venezuelan Navy management decision making, drawing on the concepts of expected conditional value and expected realizable value. An individual's worth to the organization, the "Expected Realizable Value" [Ref. 34], is calculated as the product of two determinants, one of which is the expected conditional value. A related point is that many management theorists argue that groups rather than…

  9. Antipsychotics and Mortality: Adjusting for Mortality Risk Scores to Address Confounding by Terminal Illness

    PubMed Central

    Park, Yoonyoung; Franklin, Jessica M.; Schneeweiss, Sebastian; Levin, Raisa; Crystal, Stephen; Gerhard, Tobias; Huybrechts, Krista F.

    2014-01-01

    OBJECTIVES Earlier studies have documented a greater mortality risk associated with conventional compared with atypical antipsychotics. Concern remains that the association is not causal, but due to residual confounding by differences in underlying health. To address this concern, we evaluated whether adjustment for prognostic indices specifically developed for nursing home (NH) populations affected the magnitude of the previously observed associations. DESIGN Cohort study SETTING A merged dataset of Medicaid, Medicare, the Minimum Data Set (MDS), the Online Survey Certification and Reporting system (OSCAR), and the National Death Index in the US for 2001-2005 PARTICIPANTS Dual eligible subjects ≥ 65 years who initiated antipsychotic treatment in a NH (n=75,445). MEASUREMENTS Three mortality risk scores (MRIS, MMRI-R, and ADEPT) were derived for each patient using baseline MDS data, and their performance was assessed using c-statistics and goodness-of-fit tests. The impact of adjusting for these indices in addition to propensity scores (PS) on the antipsychotic-mortality association was evaluated using Cox models with and without adjustment for risk scores. RESULTS Each risk score showed moderate discrimination for 6-month mortality with c-statistics ranging from 0.61 to 0.63. There was no evidence of lack of fit. Imbalances in risk scores between conventional and atypical antipsychotic users in the full cohort, suggesting potential confounding, were greatly reduced within PS deciles. Accounting for each score in the Cox model did not change the relative risk estimates: 2.24 with PS only adjustment vs. 2.20, 2.20, 2.22 after further adjustment for the three risk scores. CONCLUSION Although causality cannot be proven based on non-randomized studies, this study adds to the body of evidence rejecting alternative explanations for the increased mortality risk associated with conventional antipsychotics. PMID:25752911
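
    The sketch below imitates the general shape of such an analysis on simulated data with hypothetical column names, using scikit-learn and the lifelines package: a propensity score for conventional (versus atypical) initiation, stratification on its deciles, and Cox models with and without an additional mortality risk score. It is an illustration of the approach, not the authors' analysis.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from lifelines import CoxPHFitter

rng = np.random.default_rng(4)
n = 5000
age = rng.normal(83, 7, n)
frailty = rng.normal(size=n)                              # underlying health status
risk_score = frailty + rng.normal(scale=0.5, size=n)      # e.g. an MDS-based mortality index
conventional = rng.binomial(1, 1 / (1 + np.exp(-(0.03 * (age - 83) + 0.5 * frailty))))
time = rng.exponential(scale=1 / np.exp(-5 + 0.6 * conventional + 0.8 * frailty
                                        + 0.02 * (age - 83)))
event = (time < 180).astype(int)                          # death within 180 days
df = pd.DataFrame({"conventional": conventional, "age": age, "risk_score": risk_score,
                   "time": np.minimum(time, 180.0), "event": event})

# Propensity score for conventional (vs atypical) initiation, then deciles
ps_model = LogisticRegression(max_iter=1000).fit(df[["age", "risk_score"]], df["conventional"])
df["ps_decile"] = pd.qcut(ps_model.predict_proba(df[["age", "risk_score"]])[:, 1],
                          10, labels=False)

# Cox models stratified on PS decile, with and without the mortality risk score
for cols in (["conventional"], ["conventional", "risk_score"]):
    cph = CoxPHFitter().fit(df[cols + ["time", "event", "ps_decile"]],
                            duration_col="time", event_col="event", strata=["ps_decile"])
    print(cols, "HR =", round(float(np.exp(cph.params_["conventional"])), 2))
```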

  10. Assessing residual hydropower potential of the La Plata Basin accounting for future user demands

    NASA Astrophysics Data System (ADS)

    Popescu, I.; Brandimarte, L.; Perera, M. S. U.; Peviani, M.

    2012-04-01

    La Plata Basin is shared by five countries (Argentina, Bolivia, Brazil, Paraguay and Uruguay), which have fast-growing economies in South America. These countries need energy for their sustainable development; hence, hydropower can play a very important role as a renewable clean source of energy. This paper presents an analysis of the current hydropower production and electricity demand in La Plata Basin (LPB) and analyses the maximum and residual hydropower potential of the basin for a horizon of 30 yr (i.e. year 2040). Current hydropower production is estimated based on historical available data, while future energy production is deduced from the maximum available water in the catchment, whereas electricity demand is assessed by correlating existing electricity demand with the estimated population growth and economic development. The maximum and residual hydropower potential of the basin were assessed for the mean annual flows of the present hydrological regime (1970-2000) and topographical characteristics of the area. Computations were performed using an integrated GIS environment called Vapidro-Aste released by the Research on Energy System (Italy). The residual hydropower potential of the basin is computed considering that first the water supply needs for population, industry and agriculture are served and then hydropower energy is produced. The calculated hydropower production is found to be approximately half of the estimated electricity demand, which shows that there is a need to look for other sources of energy in the future.

  11. Assessing residual hydropower potential of the La Plata Basin accounting for future user demands

    NASA Astrophysics Data System (ADS)

    Popescu, I.; Brandimarte, L.; Perera, M. S. U.; Peviani, M.

    2012-08-01

    La Plata Basin is shared by five countries (Argentina, Bolivia, Brazil, Paraguay and Uruguay), which have fast growing economies in South America. These countries need energy for their sustainable development; hence, hydropower can play a very important role as a renewable clean source of energy. This paper presents an analysis of the current hydropower production and electricity demand in La Plata Basin (LPB), and it analyses the maximum and residual hydropower potential of the basin for a horizon of 30 yr (i.e. year 2040). Current hydropower production is estimated based on historical available data, while future energy production is deduced from the available water in the catchment (estimated based on measured hydrographs of the past years), whereas electricity demand is assessed by correlating existing electricity demand with the estimated population growth and economic development. The maximum and residual hydropower potential of the basin were assessed for the mean annual flows of the present hydrological regime (1970-2000) and topographical characteristics of the area. Computations were performed using an integrated GIS environment called VAPIDRO-ASTE released by the Research on Energy System (Italy). The residual hydropower potential of the basin is computed considering first that the water supply needs for population, industry and agriculture are served, and then hydropower energy is produced. The calculated hydropower production is found to be approximately half of the estimated electricity demand, which shows that there is a need to look for other sources of energy in the future.

  12. Propensity Score-Based Approaches to Confounding by Indication in Individual Patient Data Meta-Analysis: Non-Standardized Treatment for Multidrug Resistant Tuberculosis

    PubMed Central

    Fox, Gregory J.; Benedetti, Andrea; Mitnick, Carole D.; Pai, Madhukar; Menzies, Dick

    2016-01-01

    Background In the absence of randomized clinical trials, meta-analysis of individual patient data (IPD) from observational studies may provide the most accurate effect estimates for an intervention. However, confounding by indication remains an important concern that can be addressed by incorporating individual patient covariates in different ways. We compared different analytic approaches to account for confounding in IPD from patients treated for multi-drug resistant tuberculosis (MDR-TB). Methods Two antibiotic classes were evaluated, fluoroquinolones—considered the cornerstone of effective MDR-TB treatment—and macrolides, which are known to be safe, yet are ineffective in vitro. The primary outcome was treatment success against treatment failure, relapse or death. Effect estimates were obtained using multivariable and propensity-score based approaches. Results Fluoroquinolone antibiotics were used in 28 included studies, within which 6,612 patients received a fluoroquinolone and 723 patients did not. Macrolides were used in 15 included studies, within which 459 patients received this class of antibiotics and 3,670 did not. Both standard multivariable regression and propensity score-based methods resulted in similar effect estimates for early and late generation fluoroquinolones, while macrolide antibiotic use was associated with reduced treatment success. Conclusions In this individual patient data meta-analysis, standard multivariable and propensity-score based methods of adjusting for individual patient covariates for observational studies yielded similar effect estimates. Even when adjustment is made for potential confounding, interpretation of adjusted estimates must still consider the potential for residual bias. PMID:27022741

  13. The Impact of Individual Learning Accounts: A Study of the Early and Potential Impact of Individual Learning Accounts on Learning Providers and Learning. Research Report.

    ERIC Educational Resources Information Center

    Gray, Michael; Peters, Jane; Fletcher, Mick; Kirk, Gordon

    The impact of individual learning accounts (ILAs) on the success of learners in the post-16 education sector in the United Kingdom was explored through an examination of available research on ILAs. The following were among the study's 12 messages for providers, the Department for Education and Skills, and the Individual Learning Account Centre: (1)…

  14. ADDRESSING CONFOUNDING WHEN ESTIMATING THE EFFECTS OF LATENT CLASSES ON A DISTAL OUTCOME.

    PubMed

    Schuler, Megan S; Leoutsakos, Jeannie-Marie S; Stuart, Elizabeth A

    2014-12-01

    Confounding is widely recognized in settings where all variables are fully observed, yet recognition of and statistical methods to address confounding in the context of latent class regression are slowly emerging. In this study we focus on confounding when regressing a distal outcome on latent class; extending standard confounding methods is not straightforward when the treatment of interest is a latent variable. We describe a recent 1-step method, as well as two 3-step methods (modal and pseudoclass assignment) that incorporate propensity score weighting. Using simulated data, we compare the performance of these three adjusted methods to an unadjusted 1-step and unadjusted 3-step method. We also present an applied example regarding adolescent substance use treatment that examines the effect of treatment service class on subsequent substance use problems. Our simulations indicated that the adjusted 1-step method and both adjusted 3-step methods significantly reduced bias arising from confounding relative to the unadjusted 1-step and 3-step approaches. However, the adjusted 1-step method performed better than the adjusted 3-step methods with regard to bias and 95% CI coverage, particularly when class separation was poor. Our applied example also highlighted the importance of addressing confounding: both unadjusted methods indicated significant differences across treatment classes with respect to the outcome, yet these class differences were not significant when using any of the three adjusted methods. Potential confounding should be carefully considered when conducting latent class regression with a distal outcome; failure to do so may result in significantly biased effect estimates or incorrect inferences.

  15. Graphical presentation of confounding in directed acyclic graphs.

    PubMed

    Suttorp, Marit M; Siegerink, Bob; Jager, Kitty J; Zoccali, Carmine; Dekker, Friedo W

    2015-09-01

    Since confounding obscures the real effect of the exposure, it is important to adequately address confounding for making valid causal inferences from observational data. Directed acyclic graphs (DAGs) are visual representations of causal assumptions that are increasingly used in modern epidemiology. They can help to identify the presence of confounding for the causal question at hand. This structured approach serves as a visual aid in the scientific discussion by making underlying relations explicit. This article explains the basic concepts of DAGs and provides examples in the field of nephrology with and without presence of confounding. Ultimately, these examples will show that DAGs can be preferable to the traditional methods to identify sources of confounding, especially in complex research questions.
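
    As a small illustration of how a DAG can be interrogated programmatically, the sketch below encodes an assumed set of causal relations for a hypothetical nephrology question with the networkx package and flags common causes of the exposure and outcome; this is a simplified screen for confounding, not the full back-door criterion, and the variable names are illustrative.

```python
import networkx as nx

# Assumed causal structure for a hypothetical question:
# does erythropoietin (EPO) dose affect mortality?
dag = nx.DiGraph([
    ("inflammation", "EPO_dose"),     # sicker patients tend to receive higher doses
    ("inflammation", "mortality"),
    ("anemia", "EPO_dose"),
    ("EPO_dose", "hemoglobin"),
    ("hemoglobin", "mortality"),
    ("EPO_dose", "mortality"),
])
assert nx.is_directed_acyclic_graph(dag)

exposure, outcome = "EPO_dose", "mortality"

# Simple screen: ancestors of the exposure that still have a directed path to the
# outcome once the exposure is removed, i.e. common causes of exposure and outcome.
dag_without_exposure = dag.copy()
dag_without_exposure.remove_node(exposure)
confounders = {z for z in nx.ancestors(dag, exposure)
               if z in nx.ancestors(dag_without_exposure, outcome)}
print("candidate confounders:", confounders)   # {'inflammation'}; anemia is not one
```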

  16. The case for taking account of metabolism when testing for potential endocrine disruptors in vitro.

    PubMed

    Combes, Robert D

    2004-06-01

    Legislation in the USA, Europe and Japan will require that chemicals are tested for their ability to disrupt the hormonal systems of mammals. Such chemicals are known as endocrine disruptors (EDs), and will require extensive testing as part of the new European Union Registration, Evaluation and Authorisation of Chemicals (REACH) system for the risk assessment of chemicals. Both in vivo and in vitro tests are proposed for this purpose, and there has been much discussion and action concerning the development and validation of such tests. However, to date, little interest has been shown in incorporating metabolism into in vitro tests for EDs, in sharp contrast to other areas of toxicity testing, such as genotoxicity, and, ironically, such in vitro tests are criticised for not modelling in vivo metabolism. This is despite the existence of much information showing that endogenous and exogenous steroids are extensively metabolised by Phase I and Phase II enzymes both in the liver and in hormonally active tissues. Such metabolism can lead to the activation or detoxification of steroids and EDs. The absence of metabolism from these tests could give rise to false-positive data (due to lack of detoxification) or false-negative data (lack of activation). This paper aims to explain why in vitro assays for EDs should incorporate mammalian metabolising systems. The background to ED testing, the test methods available, and the role of mammalian metabolism in the activation and detoxification of both endogenous and exogenous steroids, are described. The available types of metabolising systems are compared, and the potential problems in incorporating metabolising systems into in vitro tests for EDs, and how these might be overcome, are discussed. It is recommended that there should be: a) an assessment of the intrinsic metabolising capacity of cell systems used in tests for EDs; b) an investigation into the relevance of using the prostaglandin H synthase system for metabolising EDs

  17. A case study on the identification of confounding factors for gene disease association analysis.

    PubMed

    Han, Bin; Xie, Ruifei; Wu, Shixiu; Li, Lihua; Zhu, Lei

    2015-01-01

    Variation in the expression of genes arises from a variety of sources. It is important to remove sources of variation between arrays of non-biological origin. Non-biological variation, caused by lurking confounding factors, usually attracts little attention, although it may substantially influence the expression profile of genes. In this study, we proposed a method which is able to identify the potential confounding factors and highlight the non-biological variations. We also developed methods and statistical tests to study the confounding factors and their influence on the homogeneity of microarray data, gene selection, and disease classification. We explored an ovarian cancer gene expression profile and showed that data batches and arraying conditions are two confounding factors. Their influence on the homogeneity of data, gene selection, and disease classification are statistically analyzed. Experiments showed that after normalization, their influences were removed. Comparative studies further showed that the data became more homogeneous and the classification quality was improved. This research demonstrated that identifying and reducing the impact of confounding factors is paramount in making sense of gene-disease association analysis.
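
    A minimal sketch of one such check, assuming an expression matrix and batch labels are available: a per-gene one-way ANOVA of expression against batch, which can be repeated before and after normalization; the data below are random placeholders, not the ovarian cancer set.

      # Hedged sketch: flag genes whose expression is associated with a candidate
      # non-biological factor (data batch).
      import numpy as np
      from scipy.stats import f_oneway

      rng = np.random.default_rng(0)
      expr = rng.normal(size=(200, 60))          # 200 genes x 60 arrays (placeholder data)
      batch = np.repeat([0, 1, 2], 20)           # three hypothetical processing batches

      pvals = np.array([
          f_oneway(*[gene[batch == b] for b in np.unique(batch)]).pvalue
          for gene in expr
      ])
      print("genes associated with batch at p < 0.01:", int((pvals < 0.01).sum()))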

  18. Restricted spatial regression in practice: Geostatistical models, confounding, and robustness under model misspecification

    USGS Publications Warehouse

    Hanks, Ephraim M.; Schliep, Erin M.; Hooten, Mevin B.; Hoeting, Jennifer A.

    2015-01-01

    In spatial generalized linear mixed models (SGLMMs), covariates that are spatially smooth are often collinear with spatially smooth random effects. This phenomenon is known as spatial confounding and has been studied primarily in the case where the spatial support of the process being studied is discrete (e.g., areal spatial data). In this case, the most common approach suggested is restricted spatial regression (RSR) in which the spatial random effects are constrained to be orthogonal to the fixed effects. We consider spatial confounding and RSR in the geostatistical (continuous spatial support) setting. We show that RSR provides computational benefits relative to the confounded SGLMM, but that Bayesian credible intervals under RSR can be inappropriately narrow under model misspecification. We propose a posterior predictive approach to alleviating this potential problem and discuss the appropriateness of RSR in a variety of situations. We illustrate RSR and SGLMM approaches through simulation studies and an analysis of malaria frequencies in The Gambia, Africa.
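
    A minimal sketch of the core restriction step in RSR, assuming a fixed-effects design matrix and a draw of an unconstrained spatial random effect: the random effect is projected onto the orthogonal complement of the column space of the design matrix. Dimensions and data are placeholders, not the Gambia malaria analysis.

      # Hedged sketch: restrict a spatial random effect to be orthogonal to the fixed effects.
      import numpy as np

      rng = np.random.default_rng(1)
      n = 100
      X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])   # fixed-effects design matrix
      eta = rng.normal(size=n)                                     # unconstrained spatial effect draw

      # P_perp = I - X (X'X)^{-1} X'
      P_perp = np.eye(n) - X @ np.linalg.solve(X.T @ X, X.T)
      eta_restricted = P_perp @ eta                                # restricted (RSR) random effect

      print(np.allclose(X.T @ eta_restricted, 0.0))                # orthogonal to the covariates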

  19. EVALUATING COSTS WITH UNMEASURED CONFOUNDING: A SENSITIVITY ANALYSIS FOR THE TREATMENT EFFECT

    PubMed Central

    Handorf, Elizabeth A.; Bekelman, Justin E.; Heitjan, Daniel F.; Mitra, Nandita

    2014-01-01

    Estimates of the effects of treatment on cost from observational studies are subject to bias if there are unmeasured confounders. It is therefore advisable in practice to assess the potential magnitude of such biases. We derive a general adjustment formula for loglinear models of mean cost and explore special cases under plausible assumptions about the distribution of the unmeasured confounder. We assess the performance of the adjustment by simulation, in particular, examining robustness to a key assumption of conditional independence between the unmeasured and measured covariates given the treatment indicator. We apply our method to SEER-Medicare cost data for a stage II/III muscle-invasive bladder cancer cohort. We evaluate the costs for radical cystectomy vs. combined radiation/chemotherapy, and find that the significance of the treatment effect is sensitive to plausible unmeasured Bernoulli, Poisson and Gamma confounders. PMID:24587844
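
    For orientation, a standard external-adjustment bias factor for a single binary unmeasured confounder on the ratio scale, in the spirit of Lin, Psaty, and Kronmal (1998), is shown below as a simple special case; it is not the authors' general adjustment formula for loglinear cost models.

      \[
        \mathrm{RR}_{\text{adjusted}} \;=\; \mathrm{RR}_{\text{observed}} \times
        \frac{p_{0}(\Gamma - 1) + 1}{p_{1}(\Gamma - 1) + 1}
      \]

    Here p_1 and p_0 are the prevalences of the confounder among the treated and untreated, and \Gamma is its multiplicative effect on mean cost; analogous calculations can be set up for the Poisson and Gamma confounder scenarios considered in the paper.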

  20. The missing cause approach to unmeasured confounding in pharmacoepidemiology.

    PubMed

    Abrahamowicz, Michal; Bjerre, Lise M; Beauchamp, Marie-Eve; LeLorier, Jacques; Burne, Rebecca

    2016-03-30

    Unmeasured confounding is a major threat to the validity of pharmacoepidemiological studies of medication safety and effectiveness. We propose a new method for detecting and reducing the impact of unobserved confounding in large observational database studies. The method uses assumptions similar to the prescribing preference-based instrumental variable (IV) approach. Our method relies on the new 'missing cause' principle, according to which the impact of unmeasured confounding by (contra-)indication may be detected by assessing discrepancies between the following: (i) treatment actually received by individual patients and (ii) treatment that they would be expected to receive based on the observed data. Specifically, we use the treatment-by-discrepancy interaction to test for the presence of unmeasured confounding and correct the treatment effect estimate for the resulting bias. Under standard IV assumptions, we first proved that unmeasured confounding induces a spurious treatment-by-discrepancy interaction in risk difference models for binary outcomes and then simulated large pharmacoepidemiological studies with unmeasured confounding. In simulations, our estimates had four to six times smaller bias than conventional treatment effect estimates, adjusted only for measured confounders, and much smaller variance inflation than unbiased but very unstable IV estimates, resulting in uniformly lowest root mean square errors. The much lower variance of our estimates, relative to IV estimates, was also observed in an application comparing gastrointestinal safety of two classes of anti-inflammatory drugs. In conclusion, our missing cause-based method may complement other methods and enhance accuracy of analyses of large pharmacoepidemiological studies.
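
    A schematic sketch of the 'missing cause' idea, assuming a cohort with a binary treatment A, binary outcome Y, and measured covariates including a prescriber-preference proxy; it illustrates the principle only and is not the authors' exact estimation procedure. Column and file names are hypothetical.

      # Hedged sketch: (i) predict treatment from observed data, (ii) form the discrepancy
      # between treatment received and treatment expected, and (iii) test the
      # treatment-by-discrepancy interaction in a linear-probability (risk difference) model.
      import pandas as pd
      import statsmodels.formula.api as smf

      df = pd.read_csv("pharmaco_cohort.csv")   # hypothetical cohort data

      expected = smf.logit("A ~ age + sex + comorbidity + prescriber_pref", data=df).fit()
      df["discrepancy"] = df["A"] - expected.predict(df)

      # A clearly nonzero interaction is taken as a signal of unmeasured confounding
      # by (contra-)indication; the interaction is also used to correct the estimate.
      rd_model = smf.ols("Y ~ A * discrepancy + age + sex + comorbidity", data=df).fit(cov_type="HC1")
      print(rd_model.summary().tables[1])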

  1. Sampling depth confounds soil acidification outcomes

    Technology Transfer Automated Retrieval System (TEKTRAN)

    In the northern Great Plains (NGP) of North America, surface sampling depths of 0-15 or 0-20 cm are suggested for testing soil characteristics such as pH. However, acidification is often most pronounced near the soil surface. Thus, sampling deeper can potentially dilute (increase) pH measurements an...

  2. Methodological problems with population cancer studies: The forgotten confounding factors.

    PubMed

    Blaylock, Russell L

    2015-01-01

    Among clinical physicians it is the population study that is considered to be the "gold standard" of medical evidence concerning acceptable treatments. As new information comes to light concerning the many variables and confounding factors that can affect such studies, many older studies lose much of their original impact. While newer population studies take into consideration a far greater number of confounding factors, many are still omitted, and a number of these omitted factors can have profound effects on the interpretation and validity of the study. In this editorial, I will discuss some of the omitted confounding factors and demonstrate how they can alter the interpretation of these papers and their clinical application.

  3. [Application of directed acyclic graphs in control of confounding].

    PubMed

    Xiang, R; Dai, W J; Xiong, Y; Wu, X; Yang, Y F; Wang, L; Dai, Z H; Li, J; Liu, A Z

    2016-07-01

    Observational studies are the design most commonly used in etiologic epidemiology, but confounders can distort the true causal relation between exposure and outcome when causal inferences are drawn. To eliminate such confounding, determining which variables need to be adjusted for becomes a key issue. A directed acyclic graph (DAG) can visualize complex causal structures, provides a simple and intuitive way to identify confounding, and converts the problem into finding a minimal sufficient adjustment set for confounding control. On the one hand, a DAG can select fewer variables, which increases the statistical efficiency of the analysis; on the other hand, it can help analysts avoid variables that are unmeasured or have missing values. In short, directed acyclic graphs can effectively help reveal true causal relations.

  4. Using an instrumental variable to test for unmeasured confounding

    PubMed Central

    Guo, Zijian; Cheng, Jing; Lorch, Scott A.; Small, Dylan S.

    2014-01-01

    An important concern in an observational study is whether or not there is unmeasured confounding, that is, unmeasured ways in which the treatment and control groups differ before treatment which affect the outcome. We develop a test of whether there is unmeasured confounding when an instrumental variable (IV) is available. An IV is a variable that is independent of the unmeasured confounding and encourages a subject to take one treatment level versus another, while having no effect on the outcome beyond its encouragement of a certain treatment level. We show what types of unmeasured confounding can be tested for with an IV and develop a test for this type of unmeasured confounding that has correct type I error rate. We show that the widely used Durbin–Wu–Hausman test can have inflated type I error rates when there is treatment effect heterogeneity. Additionally, we show that our test provides more insight into the nature of the unmeasured confounding than the Durbin–Wu–Hausman test. We apply our test to an observational study of the effect of a premature infant being delivered in a high-level neonatal intensive care unit (one with mechanical assisted ventilation and high volume) versus a lower level unit, using the excess travel time a mother lives from the nearest high-level unit to the nearest lower-level unit as an IV. PMID:24930696
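
    For context, a sketch of the classical control-function form of the Durbin-Wu-Hausman check that the authors critique, under hypothetical variable names; the improved test proposed in the paper is not reproduced here.

      # Hedged sketch: regress treatment on the instrument and covariates, then test the
      # first-stage residual in the outcome model (a Durbin-Wu-Hausman-type check).
      import pandas as pd
      import statsmodels.formula.api as smf

      df = pd.read_csv("nicu_cohort.csv")   # hypothetical data: outcome y, treatment a, instrument z

      first_stage = smf.ols("a ~ z + gest_age + maternal_age", data=df).fit()
      df["a_resid"] = first_stage.resid

      # A nonzero coefficient on a_resid indicates disagreement between the OLS and IV
      # estimands, classically attributed to unmeasured confounding.
      second_stage = smf.ols("y ~ a + a_resid + gest_age + maternal_age", data=df).fit(cov_type="HC1")
      print(second_stage.t_test("a_resid = 0"))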

  5. Methods to control for unmeasured confounding in pharmacoepidemiology: an overview.

    PubMed

    Uddin, Md Jamal; Groenwold, Rolf H H; Ali, Mohammed Sanni; de Boer, Anthonius; Roes, Kit C B; Chowdhury, Muhammad A B; Klungel, Olaf H

    2016-06-01

    Background: Unmeasured confounding is one of the principal problems in pharmacoepidemiologic studies. Several methods have been proposed to detect or control for unmeasured confounding either at the study design phase or the data analysis phase. Aim of the Review: To provide an overview of commonly used methods to detect or control for unmeasured confounding and to provide recommendations for proper application in pharmacoepidemiology. Methods/Results: Methods to control for unmeasured confounding in the design phase of a study are case-only designs (e.g., case-crossover, case-time-control, self-controlled case series) and the prior event rate ratio adjustment method. Methods that can be applied in the data analysis phase include the negative control method, the perturbation variable method, instrumental variable methods, sensitivity analysis, and ecological analysis. A separate group of methods are those in which additional information on confounders is collected from a substudy; this group includes external adjustment, propensity score calibration, two-stage sampling, and multiple imputation. Conclusion: As the performance and application of the methods to handle unmeasured confounding may differ across studies and across databases, we stress the importance of using both statistical evidence and substantial clinical knowledge for interpretation of the study results.

  6. Using an instrumental variable to test for unmeasured confounding.

    PubMed

    Guo, Zijian; Cheng, Jing; Lorch, Scott A; Small, Dylan S

    2014-09-10

    An important concern in an observational study is whether or not there is unmeasured confounding, that is, unmeasured ways in which the treatment and control groups differ before treatment, which affect the outcome. We develop a test of whether there is unmeasured confounding when an instrumental variable (IV) is available. An IV is a variable that is independent of the unmeasured confounding and encourages a subject to take one treatment level versus another, while having no effect on the outcome beyond its encouragement of a certain treatment level. We show what types of unmeasured confounding can be tested for with an IV and develop a test for this type of unmeasured confounding that has correct type I error rate. We show that the widely used Durbin-Wu-Hausman test can have inflated type I error rates when there is treatment effect heterogeneity. Additionally, we show that our test provides more insight into the nature of the unmeasured confounding than the Durbin-Wu-Hausman test. We apply our test to an observational study of the effect of a premature infant being delivered in a high-level neonatal intensive care unit (one with mechanical assisted ventilation and high volume) versus a lower level unit, using the excess travel time a mother lives from the nearest high-level unit to the nearest lower-level unit as an IV.

  7. Environmental confounding in gene-environment interaction studies.

    PubMed

    Vanderweele, Tyler J; Ko, Yi-An; Mukherjee, Bhramar

    2013-07-01

    We show that, in the presence of uncontrolled environmental confounding, joint tests for the presence of a main genetic effect and gene-environment interaction will be biased if the genetic and environmental factors are correlated, even if there is no effect of either the genetic factor or the environmental factor on the disease. When environmental confounding is ignored, such tests will in fact reject the joint null of no genetic effect with a probability that tends to 1 as the sample size increases. This problem with the joint test vanishes under gene-environment independence, but it still persists if estimating the gene-environment interaction parameter itself is of interest. Uncontrolled environmental confounding will bias estimates of gene-environment interaction parameters even under gene-environment independence, but it will not do so if the unmeasured confounding variable itself does not interact with the genetic factor. Under gene-environment independence, if the interaction parameter without controlling for the environmental confounder is nonzero, then there is gene-environment interaction either between the genetic factor and the environmental factor of interest or between the genetic factor and the unmeasured environmental confounder. We evaluate several recently proposed joint tests in a simulation study and discuss the implications of these results for the conduct of gene-environment interaction studies.

  8. Heterogeneity Confounds Establishment of "a" Model Microbial Strain.

    PubMed

    Keller, Nancy P

    2017-02-21

    Aspergillus fumigatus is a ubiquitous environmental mold and the leading cause of diverse human diseases ranging from allergenic bronchopulmonary aspergillosis (ABPA) to invasive pulmonary aspergillosis (IPA). Experimental investigations of the biology and virulence of this opportunistic pathogen have historically used a few type strains; however, it is increasingly observed with this fungus that heterogeneity among isolates potentially confounds the use of these reference isolates. Illustrating this point, Kowalski et al. (mBio 7:e01515-16, 2016, https://doi.org/10.1128/mBio.01515-16) demonstrated that variation in 16 environmental and clinical isolates of A. fumigatus correlated virulence with fitness in low oxygen, whereas Fuller et al. (mBio 7:e01517-16, 2016, https://doi.org/10.1128/mBio.01517-16) showed wide variation in light responses at a physiological and protein functionality level in 15 A. fumigatus isolates. In both studies, two commonly used type strains, Af293 and CEA10, displayed significant differences in physiological responses to abiotic stimuli and virulence in a murine model of IPA.

  9. Distribution-free mediation analysis for nonlinear models with confounding.

    PubMed

    Albert, Jeffrey M

    2012-11-01

    Recently, researchers have used a potential-outcome framework to estimate causally interpretable direct and indirect effects of an intervention or exposure on an outcome. One approach to causal-mediation analysis uses the so-called mediation formula to estimate the natural direct and indirect effects. This approach generalizes the classical mediation estimators and allows for arbitrary distributions for the outcome variable and mediator. A limitation of the standard (parametric) mediation formula approach is that it requires a specified mediator regression model and distribution; such a model may be difficult to construct and may not be of primary interest. To address this limitation, we propose a new method for causal-mediation analysis that uses the empirical distribution function, thereby avoiding parametric distribution assumptions for the mediator. To adjust for confounders of the exposure-mediator and exposure-outcome relationships, inverse-probability weighting is incorporated based on a supplementary model of the probability of exposure. This method, which yields the estimates of the natural direct and indirect effects for a specified reference group, is applied to data from a cohort study of dental caries in very-low-birth-weight adolescents to investigate the oral-hygiene index as a possible mediator. Simulation studies show low bias in the estimation of direct and indirect effects in a variety of distribution scenarios, whereas the standard mediation formula approach can be considerably biased when the distribution of the mediator is incorrectly specified.

  10. Heterogeneity Confounds Establishment of “a” Model Microbial Strain

    PubMed Central

    2017-01-01

    Aspergillus fumigatus is a ubiquitous environmental mold and the leading cause of diverse human diseases ranging from allergenic bronchopulmonary aspergillosis (ABPA) to invasive pulmonary aspergillosis (IPA). Experimental investigations of the biology and virulence of this opportunistic pathogen have historically used a few type strains; however, it is increasingly observed with this fungus that heterogeneity among isolates potentially confounds the use of these reference isolates. Illustrating this point, Kowalski et al. (mBio 7:e01515-16, 2016, https://doi.org/10.1128/mBio.01515-16) demonstrated that variation in 16 environmental and clinical isolates of A. fumigatus correlated virulence with fitness in low oxygen, whereas Fuller et al. (mBio 7:e01517-16, 2016, https://doi.org/10.1128/mBio.01517-16) showed wide variation in light responses at a physiological and protein functionality level in 15 A. fumigatus isolates. In both studies, two commonly used type strains, Af293 and CEA10, displayed significant differences in physiological responses to abiotic stimuli and virulence in a murine model of IPA. PMID:28223452

  11. Evaluating the impact of unmeasured confounding with internal validation data: an example cost evaluation in type 2 diabetes.

    PubMed

    Faries, Douglas; Peng, Xiaomei; Pawaskar, Manjiri; Price, Karen; Stamey, James D; Seaman, John W

    2013-01-01

    The quantitative assessment of the potential influence of unmeasured confounders in the analysis of observational data is rare, despite reliance on the "no unmeasured confounders" assumption. In a recent comparison of costs of care between two treatments for type 2 diabetes using a health care claims database, propensity score matching was implemented to adjust for selection bias, though it was noted that information on baseline glycemic control was not available for the propensity model. Using data from a linked laboratory file, data on this potential "unmeasured confounder" were obtained for a small subset of the original sample. By using this information, we demonstrate how Bayesian modeling, propensity score calibration, and multiple imputation can utilize this additional information to perform sensitivity analyses to quantitatively assess the potential impact of unmeasured confounding. Bayesian regression models were developed to utilize the internal validation data as informative prior distributions for all parameters, retaining information on the correlation between the confounder and other covariates. While assumptions supporting the use of propensity score calibration were not met in this sample, the use of Bayesian modeling and multiple imputation provided consistent results, suggesting that the lack of data on the unmeasured confounder did not have a strong impact on the original analysis, due to the lack of strong correlation between the confounder and the cost outcome variable. Bayesian modeling with informative priors and multiple imputation may be useful tools for unmeasured confounding sensitivity analysis in these situations. Further research is needed, however, to understand the operating characteristics of these methods in a variety of situations.
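
    A minimal multiple-imputation sketch for this internal-validation setting, assuming the confounder u is observed only in a validation subset, a simple normal imputation model, and pooling with Rubin's rules; file and variable names are hypothetical and the models are illustrative, not those of the original analysis.

      # Hedged sketch: impute the partially observed confounder u from the validation
      # subsample, refit the cost model on each completed data set, and pool with
      # Rubin's rules. (A fully proper MI would also draw the imputation-model
      # parameters; that refinement is omitted here for brevity.)
      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      df = pd.read_csv("claims_with_validation.csv")   # hypothetical; u observed in a subset only
      val = df.dropna(subset=["u"])

      imp_fit = smf.ols("u ~ treat + age + comorbidity + cost", data=val).fit()
      sigma = np.sqrt(imp_fit.scale)

      M, ests, variances = 20, [], []
      rng = np.random.default_rng(0)
      for _ in range(M):
          d = df.copy()
          miss = d["u"].isna()
          d.loc[miss, "u"] = imp_fit.predict(d[miss]) + rng.normal(0.0, sigma, miss.sum())
          fit = smf.ols("cost ~ treat + age + comorbidity + u", data=d).fit()
          ests.append(fit.params["treat"])
          variances.append(fit.bse["treat"] ** 2)

      # Rubin's rules: total variance = within-imputation + (1 + 1/M) * between-imputation.
      qbar, ubar, b = np.mean(ests), np.mean(variances), np.var(ests, ddof=1)
      print(f"pooled treatment effect: {qbar:.2f} (SE {np.sqrt(ubar + (1 + 1 / M) * b):.2f})")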

  12. Limitations of individual causal models, causal graphs, and ignorability assumptions, as illustrated by random confounding and design unfaithfulness.

    PubMed

    Greenland, Sander; Mansournia, Mohammad Ali

    2015-10-01

    We describe how ordinary interpretations of causal models and causal graphs fail to capture important distinctions among ignorable allocation mechanisms for subject selection or allocation. We illustrate these limitations in the case of random confounding and designs that prevent such confounding. In many experimental designs individual treatment allocations are dependent, and explicit population models are needed to show this dependency. In particular, certain designs impose unfaithful covariate-treatment distributions to prevent random confounding, yet ordinary causal graphs cannot discriminate between these unconfounded designs and confounded studies. Causal models for populations are better suited for displaying these phenomena than are individual-level models, because they allow representation of allocation dependencies as well as outcome dependencies across individuals. Nonetheless, even with this extension, ordinary graphical models still fail to capture distinctions between hypothetical superpopulations (sampling distributions) and observed populations (actual distributions), although potential-outcome models can be adapted to show these distinctions and their consequences.

  13. An Introduction to Sensitivity Analysis for Unobserved Confounding in Non-Experimental Prevention Research

    PubMed Central

    Kuramoto, S. Janet; Stuart, Elizabeth A.

    2013-01-01

    Although randomization is the gold standard for estimating causal relationships, many questions in prevention science must be answered through non-experimental studies, often because randomization is either infeasible or unethical. While methods such as propensity score matching can adjust for observed confounding, unobserved confounding is the Achilles heel of most non-experimental studies. This paper describes and illustrates seven sensitivity analysis techniques that assess the sensitivity of study results to an unobserved confounder. These methods were categorized into two groups to reflect differences in their conceptualization of sensitivity analysis, as well as their targets of interest. As a motivating example we examine the sensitivity of the association between maternal suicide and offspring’s risk for suicide attempt hospitalization. While inferences differed slightly depending on the type of sensitivity analysis conducted, overall the association between maternal suicide and offspring’s hospitalization for suicide attempt was found to be relatively robust to an unobserved confounder. The ease of implementation and the insight these analyses provide underscore sensitivity analysis techniques as an important tool for non-experimental studies. The implementation of sensitivity analysis can help increase confidence in results from non-experimental studies and better inform prevention researchers and policymakers regarding potential intervention targets. PMID:23408282
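
    One widely used numeric sensitivity summary in this spirit is the E-value of VanderWeele and Ding, given here for orientation; it is not necessarily among the seven techniques reviewed. For an observed risk ratio RR > 1,

      \[
        \text{E-value} \;=\; \mathrm{RR} + \sqrt{\mathrm{RR}\,(\mathrm{RR} - 1)},
      \]

    the minimum strength of association, on the risk-ratio scale, that an unmeasured confounder would need with both exposure and outcome to fully explain away the observed association. An observed RR of 2, for example, gives an E-value of about 3.41.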

  14. Problems and the potential direction of reforms for the current individual medical savings accounts in the Chinese health care system.

    PubMed

    Kong, Xiangjin; Yang, Yang; Gong, Fuqing; Zhao, Mingjie

    2012-12-01

    Individual health savings accounts are an important part of the current basic medical insurance system for urban workers in China. Since 1998 when the system of personal medical insurance accounts was first implemented, there has been considerable controversy over its function and significance within different social communities. This paper analyzes the main problems in the practical implementation of individual medical insurance accounts and discusses the social and cultural foundations for the establishment of family health savings accounts from the perspective of Chinese Confucian familism. Accordingly, it addresses the direction of the reform and the development of the current system of individual health insurance accounts in China.

  15. Literature-Based Discovery of Confounding in Observational Clinical Data

    PubMed Central

    Malec, Scott A.; Wei, Peng; Xu, Hua; Bernstam, Elmer V.; Myneni, Sahiti; Cohen, Trevor

    2016-01-01

    Observational data recorded in the Electronic Health Record (EHR) can help us better understand the effects of therapeutic agents in routine clinical practice. As such data were not collected for research purposes, their reuse for research must compensate for additional information that may bias analyses and lead to faulty conclusions. Confounding is present when factors aside from the given predictor(s) affect the response of interest. However, these additional factors may not be known at the outset. In this paper, we present a scalable literature-based confounding variable discovery method for biomedical research applications, with pharmacovigilance as our use case. We hypothesized that statistical models adjusted with literature-derived confounders would more accurately identify causative drug-adverse drug event (ADE) relationships. We evaluated our method with a curated reference standard, and found a pattern of improved performance of approximately 5% in two out of three models for gastrointestinal bleeding (pre-adjusted Area Under Curve ≥ 0.6). PMID:28269951

  16. Methodological problems with population cancer studies: The forgotten confounding factors

    PubMed Central

    Blaylock, Russell L.

    2015-01-01

    Among clinical physicians it is the population study that is considered to be the “gold standard” of medical evidence concerning acceptable treatments. As new information comes to light concerning the many variables and confounding factors that can affect such studies, many older studies lose much of their original impact. While newer population studies take into consideration a far greater number of confounding factors, many are still omitted, and a number of these omitted factors can have profound effects on the interpretation and validity of the study. In this editorial, I will discuss some of the omitted confounding factors and demonstrate how they can alter the interpretation of these papers and their clinical application. PMID:26097772

  17. Moving toward implementation: the potential for accountable care organizations and private-public partnerships to advance active neighborhood design.

    PubMed

    Zusman, Edie E; Carr, Sara Jensen; Robinson, Judy; Kasirye, Olivia; Zell, Bonnie; Miller, William Jahmal; Duarte, Teri; Engel, Adrian B; Hernandez, Monica; Horton, Mark B; Williams, Frank

    2014-12-01

    The 2010 Affordable Care Act's (ACA) aims of lowering costs and improving quality of care will renew focus on preventive health strategies. This coincides with a trend in medicine to reconsider population health approaches as part of the standard curriculum. This intersection of new policy and educational climates presents a unique opportunity to reconsider traditional healthcare structures. This paper introduces and advances an alignment that few have considered. We propose that accountable care organizations (ACOs), which are expected to proliferate under the ACA, present the best opportunity to establish partnerships between healthcare, public health, and community-based organizations to achieve the legislation's goals. One example is encouraging daily physical activity via built environment interventions and programs, which is recommended by numerous groups. We highlight how nonprofit organizations in Sacramento, California have been able to leverage influence, capital, and policy to encourage design for active living, and how their work is coordinating with public health and healthcare initiatives. In conclusion, we critically examine potential barriers to the success of partnerships between ACOs and community organizations and encourage further exploration and evaluation.

  18. Clustering and Residual Confounding in the Application of Marginal Structural Models: Dialysis Modality, Vascular Access, and Mortality.

    PubMed

    Kasza, Jessica; Polkinghorne, Kevan R; Marshall, Mark R; McDonald, Stephen P; Wolfe, Rory

    2015-09-15

    In the application of marginal structural models to compare time-varying treatments, it is rare that the hierarchical structure of a data set is accounted for or that the impact of unmeasured confounding on estimates is assessed. These issues often arise when analyzing data sets drawn from clinical registries, where patients may be clustered within health-care providers, and the amount of data collected from each patient may be limited by design (e.g., to reduce costs or encourage provider participation). We compared the survival of patients undergoing treatment with various dialysis types, where some patients switched dialysis modality during the course of their treatment, by estimating a marginal structural model using data from the Australia and New Zealand Dialysis and Transplant Registry, 2003-2011. The number of variables recorded by the registry is limited, and patients are clustered within the dialysis centers responsible for their treatment, so we assessed the impact of accounting for unmeasured confounding or clustering on estimated treatment effects. Accounting for clustering had limited impact, and only unreasonable levels of unmeasured confounding would have changed conclusions about treatment comparisons. Our analysis serves as a case study in assessing the impact of unmeasured confounding and clustering in the application of marginal structural models.
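
    For reference, marginal structural models of this kind are typically fitted with stabilized inverse-probability-of-treatment weights of the generic form below; the registry analysis may add censoring weights and other specifics not shown here.

      \[
        sw_{i} \;=\; \prod_{t=0}^{T}
        \frac{P\!\left(A_{t}=a_{it} \mid \bar{A}_{t-1}=\bar{a}_{i,t-1}\right)}
             {P\!\left(A_{t}=a_{it} \mid \bar{A}_{t-1}=\bar{a}_{i,t-1},\, \bar{L}_{t}=\bar{l}_{it}\right)}
      \]

    where \bar{A}_{t-1} denotes treatment (e.g., dialysis modality) history and \bar{L}_{t} denotes measured covariate history up to time t.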

  19. Prevalence of non-confounded HIV-associated neurocognitive impairment in the context of plasma HIV RNA suppression.

    PubMed

    Cysique, Lucette A; Brew, Bruce J

    2011-04-01

    HIV-associated neurocognitive disorder is known to occur in the context of successful combination antiretroviral therapy (cART; plasma HIV RNA <50 copies/ml). Here, we newly provide an analysis of its prevalence and nature in the absence of medical or psychiatric confounds that may otherwise inflate the prevalence rate. We enrolled a cohort of 116 advanced HIV+ individuals on cART (51% virally suppressed (VS)). They were screened for active hepatitis C and current substance use disorder and were assessed with standard neuropsychological (NP) testing. Our results showed that, out of the entire sample, NP impairment occurred in 18.1% (21/116) of VS individuals, which was not statistically different from the 24.1% (28/116) who were found to be NP-impaired and not VS. In comparison with NP-normal VS persons, NP impairment in VS individuals was associated with shorter duration of current cART and lower pre-morbid ability. Higher cART CNS penetration effectiveness tended to be associated with lesser cognitive severity in NP-impaired VS individuals. Current CD4 cell count, depression symptoms and past CNS HIV-related diseases did not specifically account for persistent NP impairment in VS individuals. In conclusion, despite suppression of systemic viral load, non-confounded HIV-related NP-impairment prevalence reached 18.1%. Of the potential explanations for this persistent deficit, a "burnt-out" form of the disease and immune reconstitution inflammatory syndrome were less likely, while a shorter current cART duration and lower pre-morbid intellectual capacity were significant. Nonetheless, predictive modelling with these last two factors misclassified 27% and had low sensitivity (43%), emphasising that other yet-to-be-defined factors were operative.

  20. The confounded effects of age and exposure history in response to influenza vaccination.

    PubMed

    Mosterín Höpping, Ana; McElhaney, Janet; Fonville, Judith M; Powers, Douglas C; Beyer, Walter E P; Smith, Derek J

    2016-01-20

    Numerous studies have explored whether the antibody response to influenza vaccination in elderly adults is as strong as it is in young adults. Results vary, but tend to indicate lower post-vaccination titers (antibody levels) in the elderly, supporting the concept of immunosenescence - the weakening of the immunological response related to age. Because the elderly in such studies typically have been vaccinated against influenza before enrollment, the effects of age and of previous exposures are confounded, and previous exposure is a potential extrinsic explanation for apparent immunosenescence. We conducted a four-year study of serial annual immunizations with inactivated trivalent influenza vaccines in 136 young adults (16 to 39 years) and 122 elderly adults (62 to 92 years). Compared to data sets of previously published studies, which were designed to investigate the effect of age, this detailed longitudinal study with multiple vaccinations allowed us to also study the effect of prior vaccination history on the response to a vaccine. In response to the first vaccination, young adults produced higher post-vaccination titers than elderly adults, after accounting for pre-vaccination titers. However, upon subsequent vaccinations the difference in response to vaccination between the young and elderly age groups declined rapidly. Although age is an important factor when modeling the outcome of the first vaccination, this term lost its relevance with successive vaccinations. In fact, when we examined the data with the assumption that the elderly group had received (on average) as few as two vaccinations prior to our study, the difference due to age disappeared. Our analyses therefore show that the initial difference between the two age groups in their response to vaccination may not be uniquely explained by immunosenescence due to ageing of the immune system, but could equally be the result of the different pre-study vaccination and infection histories in the elderly.

  1. Wind turbines and idiopathic symptoms: The confounding effect of concurrent environmental exposures.

    PubMed

    Blanes-Vidal, Victoria; Schwartz, Joel

    2016-01-01

    Whether or not wind turbines pose a risk to human health is a matter of heated debate. Personal reactions to other environmental exposures occurring in the same settings as wind turbines may be responsible for the reported symptoms. However, these have not been accounted for in previous studies. We investigated whether there is an association between residential proximity to wind turbines and idiopathic symptoms, after controlling for personal reactions to other environmental co-exposures. We assessed wind turbine exposures in 454 residences as the distance to the closest wind turbine (Dw) and the number of wind turbines <1000 m away (Nw1000). Information on symptoms, demographics and personal reactions to exposures was obtained by a blind questionnaire. We identified confounders using confounder selection criteria and used adjusted logistic regression models to estimate associations. When controlling only for socio-demographic characteristics, log10Dw was associated with "unnatural fatigue" (ORadj=0.38, 95%CI=0.15-1.00) and "difficulty concentrating" (ORadj=0.26, 95%CI=0.08-0.83), and Nw1000 was associated with "unnatural fatigue" (ORadj=1.35, 95%CI=1.07-1.70) and "headache" (ORadj=1.26, 95%CI=1.00-1.58). After controlling for personal reactions to noise from sources other than wind turbines and to agricultural odor exposure, we did not observe a significant relationship between residential proximity to wind turbines and symptoms, and the parameter estimates were attenuated toward zero. Wind turbine-health associations can be confounded by personal reactions to other environmental co-exposures. Isolated associations reported in the literature may be due to confounding bias.

  2. The Threshold of Embedded M Collider Bias and Confounding Bias

    ERIC Educational Resources Information Center

    Kelcey, Benjamin; Carlisle, Joanne

    2011-01-01

    Of particular import to this study is collider bias originating from stratification on pretreatment variables forming an embedded M or bowtie structural design. That is, rather than assume an M structural design which suggests that "X" is a collider but not a confounder, the authors adopt what they consider to be a more reasonable…

  3. The relation of collapsibility and confounding to faithfulness and stability.

    PubMed

    Mansournia, Mohammad Ali; Greenland, Sander

    2015-07-01

    A probability distribution may have some properties that are stable under a structure (e.g., a causal graph) and other properties that are unstable. Stable properties are implied by the structure and thus will be shared by populations following the structure. In contrast, unstable properties correspond to special circumstances that are unlikely to be replicated across those populations. A probability distribution is faithful to the structure if all independencies in the distribution are logical consequences of the structure. We explore the distinction between confounding and noncollapsibility in relation to the concepts of faithfulness and stability. Simple collapsibility of an odds ratio over a risk factor is unstable and thus unlikely if the exposure affects the outcome, whether or not the risk factor is associated with exposure. For a binary exposure with no effect, collapsibility over a confounder also requires unfaithfulness. Nonetheless, if present, simple collapsibility of the odds ratio limits the degree of confounding by the covariate. Collapsibility of effect measures is stable if the covariate is independent of the outcome given exposure, but it is unstable if the covariate is an instrumental variable. Understanding stable and unstable properties of distributions under causal structures, and the distinction between stability and faithfulness, yields important insights into the correspondence between noncollapsibility and confounding.
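
    A small worked example with invented numbers, illustrating the point that simple noncollapsibility of the odds ratio is not confounding: the covariate is independent of exposure, both stratum-specific odds ratios equal 2, yet the marginal odds ratio is smaller.

      # Hedged numeric illustration (invented probabilities, not data from the paper).
      def odds(p):
          return p / (1 - p)

      p_z = 0.5                                    # covariate z independent of exposure x
      risk = {                                     # P(y = 1 | x, z)
          (1, 1): 0.8, (0, 1): 2 / 3,
          (1, 0): 0.5, (0, 0): 1 / 3,
      }

      or_z1 = odds(risk[1, 1]) / odds(risk[0, 1])  # = 2.0
      or_z0 = odds(risk[1, 0]) / odds(risk[0, 0])  # = 2.0

      marginal = {x: p_z * risk[x, 1] + (1 - p_z) * risk[x, 0] for x in (0, 1)}
      or_marginal = odds(marginal[1]) / odds(marginal[0])   # ~1.86 < 2, despite no confounding

      print(or_z1, or_z0, round(or_marginal, 2))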

  4. Causal diagrams and multivariate analysis III: confound it!

    PubMed

    Jupiter, Daniel C

    2015-01-01

    This commentary concludes my series concerning inclusion of variables in multivariate analyses. We take up the issues of confounding and effect modification and summarize the work we have thus far done. Finally, we provide a rough algorithm to help guide us through the maze of possibilities that we have outlined.

  5. Subliminal psychodynamic activation: an experiment controlling for major possible confounding influences outlined by Fudin.

    PubMed

    Gustafson, R; Källmén, H

    1991-08-01

    Forty and 48 subjects participated in two separate experiments aimed at reproducing the subliminal psychodynamic activation (SPA) phenomenon while taking into account the major methodological critique by Fudin (1986, 1990). Subjects were first exposed either to the full symbiotic message or to one of all possible partial symbiotic messages, and then to their anagram equivalents. Confounding and irrelevant influences were controlled; the results indicate that only the full symbiotic message improved motor performance. This strongly suggests that subjects encode the meaning of the full message and supports an interpretation in terms of an alleviation of an internal symbiotic conflict leading to a state of calmness conducive to improved motor performance.

  6. Threats to internal validity in exercise science: a review of overlooked confounding variables.

    PubMed

    Halperin, Israel; Pyne, David B; Martin, David T

    2015-10-01

    Internal validity refers to the degree of control exerted over potential confounding variables to reduce alternative explanations for the effects of various treatments. In exercise and sports-science research and routine testing, internal validity is commonly achieved by controlling variables such as exercise and warm-up protocols, prior training, nutritional intake before testing, ambient temperature, time of testing, hours of sleep, age, and gender. However, a number of other potential confounding variables often do not receive adequate attention in sports physiology and performance research. These confounding variables include instructions on how to perform the test, volume and frequency of verbal encouragement, knowledge of exercise endpoint, number and gender of observers in the room, influence of music played before and during testing, and the effects of mental fatigue on performance. In this review the authors discuss these variables in relation to common testing environments in exercise and sports science and present some recommendations with the goal of reducing possible threats to internal validity.

  7. Multiple imputation for handling systematically missing confounders in meta-analysis of individual participant data.

    PubMed

    Resche-Rigon, Matthieu; White, Ian R; Bartlett, Jonathan W; Peters, Sanne A E; Thompson, Simon G

    2013-12-10

    A variable is 'systematically missing' if it is missing for all individuals within particular studies in an individual participant data meta-analysis. When a systematically missing variable is a potential confounder in observational epidemiology, standard methods either fail to adjust the exposure-disease association for the potential confounder or exclude studies where it is missing. We propose a new approach to adjust for systematically missing confounders based on multiple imputation by chained equations. Systematically missing data are imputed via multilevel regression models that allow for heterogeneity between studies. A simulation study compares various choices of imputation model. An illustration is given using data from eight studies estimating the association between carotid intima media thickness and subsequent risk of cardiovascular events. Results are compared with standard methods and also with an extension of a published method that exploits the relationship between fully adjusted and partially adjusted estimated effects through a multivariate random effects meta-analysis model. We conclude that multiple imputation provides a practicable approach that can handle arbitrary patterns of systematic missingness. Bias is reduced by including sufficient between-study random effects in the imputation model.

  8. Rainfall and temperatures changes have confounding impacts on Phytophthora cinnamomi occurrence risk in the southwestern USA under climate change scenarios.

    PubMed

    Thompson, Sally E; Levin, Simon; Rodriguez-Iturbe, Ignacio

    2014-04-01

    Global change will simultaneously impact many aspects of climate, with the potential to exacerbate the risks posed by plant pathogens to agriculture and the natural environment; yet, most studies that explore climate impacts on plant pathogen ranges consider individual climatic factors separately. In this study, we adopt a stochastic modeling approach to address multiple pathways by which climate can constrain the range of the generalist plant pathogen Phytophthora cinnamomi (Pc): through changing winter soil temperatures affecting pathogen survival; spring soil temperatures and thus pathogen metabolic rates; and changing spring soil moisture conditions and thus pathogen growth rates through host root systems. We apply this model to the southwestern USA for contemporary and plausible future climate scenarios and evaluate the changes in the potential range of Pc. The results indicate that the plausible range of this pathogen in the southwestern USA extends over approximately 200,000 km² under contemporary conditions. While warming temperatures as projected by the IPCC A2 and B1 emissions scenarios greatly expand the range over which the pathogen can survive winter, projected reductions in spring rainfall reduce its feasible habitat, leading to spatially complex patterns of changing risk. The study demonstrates that temperature and rainfall changes associated with possible climate futures in the southwestern USA have confounding impacts on the range of Pc, suggesting that projections of future pathogen dynamics and ranges should account for multiple pathways of climate-pathogen interaction.

  9. Evaluating Public Health Interventions: 3. The Two-Stage Design for Confounding Bias Reduction-Having Your Cake and Eating It Two.

    PubMed

    Spiegelman, Donna; Rivera-Rodriguez, Claudia L; Haneuse, Sebastien

    2016-07-01

    In public health evaluations, confounding bias in the estimate of the intervention effect will typically threaten the validity of the findings. It is a common misperception that the only way to avoid this bias is to measure detailed, high-quality data on potential confounders for every intervention participant, but this strategy for adjusting for confounding bias is often infeasible. Rather than ignoring confounding altogether, the two-phase design and analysis, in which detailed high-quality confounding data are obtained among a small subsample, can be considered. We describe the two-stage design and analysis approach, and illustrate its use in the evaluation of an intervention conducted in Dar es Salaam, Tanzania, of an enhanced community health worker program to improve antenatal care uptake.

  10. Evaluating Public Health Interventions: 3. The Two-Stage Design for Confounding Bias Reduction—Having Your Cake and Eating It Two

    PubMed Central

    Spiegelman, Donna; Rivera-Rodriguez, Claudia L.; Haneuse, Sebastien

    2016-01-01

    In public health evaluations, confounding bias in the estimate of the intervention effect will typically threaten the validity of the findings. It is a common misperception that the only way to avoid this bias is to measure detailed, high-quality data on potential confounders for every intervention participant, but this strategy for adjusting for confounding bias is often infeasible. Rather than ignoring confounding altogether, the two-phase design and analysis—in which detailed high-quality confounding data are obtained among a small subsample—can be considered. We describe the two-stage design and analysis approach, and illustrate its use in the evaluation of an intervention conducted in Dar es Salaam, Tanzania, of an enhanced community health worker program to improve antenatal care uptake. PMID:27285260

  11. Simultaneous dimension reduction and adjustment for confounding variation.

    PubMed

    Lin, Zhixiang; Yang, Can; Zhu, Ying; Duchi, John; Fu, Yao; Wang, Yong; Jiang, Bai; Zamanighomi, Mahdi; Xu, Xuming; Li, Mingfeng; Sestan, Nenad; Zhao, Hongyu; Wong, Wing Hung

    2016-12-20

    Dimension reduction methods are commonly applied to high-throughput biological datasets. However, the results can be hindered by confounding factors, either biological or technical in origin. In this study, we extend principal component analysis (PCA) to propose AC-PCA for simultaneous dimension reduction and adjustment for confounding (AC) variation. We show that AC-PCA can adjust for (i) variations across individual donors present in a human brain exon array dataset and (ii) variations of different species in a model organism ENCODE RNA sequencing dataset. Our approach is able to recover the anatomical structure of neocortical regions and to capture the shared variation among species during embryonic development. For gene selection purposes, we extend AC-PCA with sparsity constraints and propose and implement an efficient algorithm. The methods developed in this paper can also be applied to more general settings. The R package and MATLAB source code are available at https://github.com/linzx06/AC-PCA.

  12. An assessment of the possible extent of confounding in epidemiological studies of lung cancer risk among roofers

    SciTech Connect

    Mundt, D.J.; van Wijngaarden, E.; Mundt, K.A.

    2007-07-01

    We evaluated the likelihood and extent to which the observed increased risk of lung cancer may be due to confounding (a mixing of effects of multiple exposures) by co-exposure to other potential carcinogens present in roofing or to lifestyle variables. We conducted a review of the epidemiological and industrial hygiene literature of asphalt-exposed workers. Peer-reviewed epidemiological studies of asphalt fumes, related occupational exposures, and confounding factors were identified from MEDLINE (1966 to early 2004). Industrial hygiene studies of asphalt workers were identified through MEDLINE, publicly available government documents, and asphalt industry documents. Using well established statistical methods, we quantified the extent to which lung cancer relative risk estimates among roofers reflect confounding from other exposures, using different prevalence and risk scenarios. The relative risk of lung cancer varied from 1.2 to 5.0 in 13 epidemiological studies of roofers; most studies reported a relative risk between 1.2 and 1.4. Smoking, asbestos and coal tar were the most likely confounders, but the prevalence of these factors varied over time. The results of the study indicate that much of the observed risk reported in epidemiological studies of cancer among roofers is well within the range of what may have resulted from confounding by reasonable and expected levels of smoking, asbestos or coal tar. This may be particularly true for those studies that did not adjust for these confounders and where the exposure was defined as employment in the roofing industry. In addition to poorly defined asphalt exposure, uncontrolled confounding cannot reliably be ruled out in studies of lung cancer among asphalt-exposed roofers. Therefore, it is not possible to conclude whether roofers are at increased risk of lung cancer due to asphalt exposure.

  13. Mediation Analysis With Intermediate Confounding: Structural Equation Modeling Viewed Through the Causal Inference Lens

    PubMed Central

    De Stavola, Bianca L.; Daniel, Rhian M.; Ploubidis, George B.; Micali, Nadia

    2015-01-01

    The study of mediation has a long tradition in the social sciences and a relatively more recent one in epidemiology. The first school is linked to path analysis and structural equation models (SEMs), while the second is related mostly to methods developed within the potential outcomes approach to causal inference. By giving model-free definitions of direct and indirect effects and clear assumptions for their identification, the latter school has formalized notions intuitively developed in the former and has greatly increased the flexibility of the models involved. However, through its predominant focus on nonparametric identification, the causal inference approach to effect decomposition via natural effects is limited to settings that exclude intermediate confounders. Such confounders are naturally dealt with (albeit with the caveats of informality and modeling inflexibility) in the SEM framework. Therefore, it seems pertinent to revisit SEMs with intermediate confounders, armed with the formal definitions and (parametric) identification assumptions from causal inference. Here we investigate: 1) how identification assumptions affect the specification of SEMs, 2) whether the more restrictive SEM assumptions can be relaxed, and 3) whether existing sensitivity analyses can be extended to this setting. Data from the Avon Longitudinal Study of Parents and Children (1990–2005) are used for illustration. PMID:25504026
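
    For reference, the natural direct and indirect effects discussed here have the standard potential-outcomes definitions below (on the mean-difference scale); their identification in the presence of intermediate confounders, the focus of this paper, requires additional assumptions not shown.

      \[
        \mathrm{NDE} = E\!\left[Y(1, M(0))\right] - E\!\left[Y(0, M(0))\right], \qquad
        \mathrm{NIE} = E\!\left[Y(1, M(1))\right] - E\!\left[Y(1, M(0))\right],
      \]

    so that the total effect E[Y(1, M(1))] - E[Y(0, M(0))] decomposes exactly as NDE + NIE.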

  14. Mediation analysis with intermediate confounding: structural equation modeling viewed through the causal inference lens.

    PubMed

    De Stavola, Bianca L; Daniel, Rhian M; Ploubidis, George B; Micali, Nadia

    2015-01-01

    The study of mediation has a long tradition in the social sciences and a relatively more recent one in epidemiology. The first school is linked to path analysis and structural equation models (SEMs), while the second is related mostly to methods developed within the potential outcomes approach to causal inference. By giving model-free definitions of direct and indirect effects and clear assumptions for their identification, the latter school has formalized notions intuitively developed in the former and has greatly increased the flexibility of the models involved. However, through its predominant focus on nonparametric identification, the causal inference approach to effect decomposition via natural effects is limited to settings that exclude intermediate confounders. Such confounders are naturally dealt with (albeit with the caveats of informality and modeling inflexibility) in the SEM framework. Therefore, it seems pertinent to revisit SEMs with intermediate confounders, armed with the formal definitions and (parametric) identification assumptions from causal inference. Here we investigate: 1) how identification assumptions affect the specification of SEMs, 2) whether the more restrictive SEM assumptions can be relaxed, and 3) whether existing sensitivity analyses can be extended to this setting. Data from the Avon Longitudinal Study of Parents and Children (1990-2005) are used for illustration.

  15. Diagnostics for Confounding of Time-varying and Other Joint Exposures.

    PubMed

    Jackson, John W

    2016-11-01

    The effects of joint exposures (or exposure regimes) include those of adhering to assigned treatment versus placebo in a randomized controlled trial, duration of exposure in a cohort study, interactions between exposures, and direct effects of exposure, among others. Unlike the setting of a single point exposure (e.g., propensity score matching), there are few tools to describe confounding for joint exposures or how well a method resolves it. Investigators need tools that describe confounding in ways that are conceptually grounded and intuitive for those who read, review, and use applied research to guide policy. We revisit the implications of exchangeability conditions that hold in sequentially randomized trials, and the bias structure that motivates the use of g-methods, such as marginal structural models. From these, we develop covariate balance diagnostics for joint exposures that can (1) describe time-varying confounding, (2) assess whether covariates are predicted by prior exposures given their past, the indication for g-methods, and (3) describe residual confounding after inverse probability weighting. For each diagnostic, we present time-specific metrics that encompass a wide class of joint exposures, including regimes of multivariate time-varying exposures in censored data, with multivariate point exposures as a special case. We outline how to estimate these directly or with regression and how to average them over person-time. Using a simulated example, we show how these metrics can be presented graphically. This conceptually grounded framework can potentially aid the transparent design, analysis, and reporting of studies that examine joint exposures. We provide easy-to-use tools to implement it.
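
    A minimal sketch of one diagnostic in this family, assuming a binary exposure indicator, a covariate, and inverse-probability weights at a given time point are already available: a standardized mean difference that can be compared before and after weighting. This is not the paper's full set of time-specific metrics.

      # Hedged sketch: weighted standardized mean difference of a covariate across a
      # binary exposure at one time point; compare the unweighted and weighted values.
      import numpy as np

      def standardized_mean_difference(x, a, w=None):
          """SMD of covariate x between exposure groups a == 1 and a == 0, optionally weighted."""
          w = np.ones_like(x, dtype=float) if w is None else np.asarray(w, dtype=float)
          m1 = np.average(x[a == 1], weights=w[a == 1])
          m0 = np.average(x[a == 0], weights=w[a == 0])
          v1 = np.average((x[a == 1] - m1) ** 2, weights=w[a == 1])
          v0 = np.average((x[a == 0] - m0) ** 2, weights=w[a == 0])
          return (m1 - m0) / np.sqrt((v1 + v0) / 2)

      # usage (with hypothetical arrays): covariate L_t, exposure A_t, weights ipw_t
      # print(standardized_mean_difference(L_t, A_t), standardized_mean_difference(L_t, A_t, ipw_t))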

  16. Keeping Accountability Systems Accountable

    ERIC Educational Resources Information Center

    Foote, Martha

    2007-01-01

    The standards and accountability movement in education has undeniably transformed schooling throughout the United States. Even before President Bush signed the No Child Left Behind (NCLB) Act into law in January 2002, mandating annual public school testing in English and math for grades 3-8 and once in high school, most states had already…

  17. Calculations of the ionization potentials of the halogens by the relativistic Hartree-Fock-Dirac method taking account of superposition of configurations

    SciTech Connect

    Tupitsyn, I.I.

    1988-03-01

    The ionization potentials of the halogen group have been calculated. The calculations were carried out using the relativistic Hartree-Fock method taking into account correlation effects. Comparison of theoretical results with experimental data for the elements F, Cl, Br, and I allows an estimation of the accuracy and reliability of the method. The theoretical values of the ionization potential of astatine obtained here may be of definite interest for the chemistry of astatine.

  18. Treatment Confounded Missingness: A Comparison of Methods for Addressing Censored or Truncated Data in School Reform Evaluations. CRESST Report 832

    ERIC Educational Resources Information Center

    Rickles, Jordan H.; Hansen, Mark; Wang, Jia

    2013-01-01

    In this paper we examine ways to conceptualize and address potential bias that can arise when the mechanism for missing outcome data is at least partially associated with treatment assignment, an issue we refer to as treatment confounded missingness (TCM). In discussing TCM, we bring together concepts from the methodological literature on missing…

  19. Doubly robust estimators of causal exposure effects with missing data in the outcome, exposure or a confounder.

    PubMed

    Williamson, E J; Forbes, A; Wolfe, R

    2012-12-30

    We consider the estimation of the causal effect of a binary exposure on a continuous outcome. Confounding and missing data are both likely to occur in practice when observational data are used to estimate this causal effect. In dealing with each of these problems, model misspecification is likely to introduce bias. We present augmented inverse probability weighted (AIPW) estimators that account for both confounding and missing data, with the latter occurring in a single variable only. These estimators have an element of robustness to misspecification of the models used. Our estimators require two models to be specified to deal with confounding and two to deal with missing data. Only one of each of these models needs to be correctly specified. When either the outcome or the exposure of interest is missing, we derive explicit expressions for the AIPW estimator. When a confounder is missing, explicit derivation is complex, so we use a simple algorithm, which can be applied using standard statistical software, to obtain an approximation to the AIPW estimator.
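
    For readers unfamiliar with AIPW, the sketch below shows the generic doubly robust estimator of an average causal effect with complete data; the authors' extensions to missing outcomes, exposures, or confounders are not reproduced. The data, model choices, and function names are assumptions for illustration.

        # Generic AIPW estimator of the average causal effect of binary A on Y,
        # combining an outcome regression with a propensity score model.
        import numpy as np
        from sklearn.linear_model import LinearRegression, LogisticRegression

        def aipw_ate(Y, A, X):
            ps = LogisticRegression(max_iter=1000).fit(X, A).predict_proba(X)[:, 1]
            m1 = LinearRegression().fit(X[A == 1], Y[A == 1]).predict(X)
            m0 = LinearRegression().fit(X[A == 0], Y[A == 0]).predict(X)
            mu1 = np.mean(A * (Y - m1) / ps + m1)
            mu0 = np.mean((1 - A) * (Y - m0) / (1 - ps) + m0)
            return mu1 - mu0

        rng = np.random.default_rng(0)
        n = 2000
        X = rng.normal(size=(n, 3))                               # confounders
        A = rng.binomial(1, 1 / (1 + np.exp(-X[:, 0])))           # exposure
        Y = 2.0 * A + X @ np.array([1.0, 0.5, -0.5]) + rng.normal(size=n)
        print("AIPW estimate of the causal effect:", round(aipw_ate(Y, A, X), 2))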

  20. Are We Missing Something Pertinent? A Bias Analysis of Unmeasured Confounding in the Firearm-Suicide Literature.

    PubMed

    Miller, M; Swanson, S A; Azrael, D

    2016-01-01

    Despite the magnitude and consistency of risk estimates in the peer-reviewed literature linking firearm availability and suicide, inferring causality has been questioned on the theoretical basis that existing studies may have failed to account for the possibility that members of households with firearms differ from members of households without firearms in important ways related to suicide risk. The current bias analysis directly addresses this concern by describing the salient characteristics that such an unmeasured confounder would need to possess in order to yield the associations between firearm availability and suicide observed in the literature when, in fact, the causal effect is null. Four US studies, published between 1992 and 2003, met our eligibility criteria. We find that any such unmeasured confounder would need to possess an untenable combination of characteristics, such as being not only 1) as potent a suicide risk factor as the psychiatric disorders most tightly linked to suicide (e.g., major depressive and substance use disorders) but also 2) an order of magnitude more imbalanced across households with versus without firearms than is any known risk factor. No such confounder has been found or even suggested. The current study strongly suggests that unmeasured confounding alone is unlikely to explain the association between firearms and suicide.
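
    One way to reason about such a bias analysis is the classical external-adjustment formula, sketched below, which gives the apparent risk ratio that a single binary unmeasured confounder could generate when the true causal effect is null. The formula is standard, but the numerical inputs are illustrative and are not taken from the four studies analyzed.

        # Apparent exposure-outcome risk ratio produced solely by a binary
        # confounder U under a null causal effect.
        def observed_rr_under_null(rr_ud, p1, p0):
            """rr_ud: risk ratio of U for the outcome; p1, p0: prevalence of U
            among the exposed and unexposed, respectively."""
            return (1 + p1 * (rr_ud - 1)) / (1 + p0 * (rr_ud - 1))

        # e.g. a confounder as potent as a major psychiatric risk factor (RR ~ 20)
        # would still need a very large prevalence imbalance across households
        # with versus without firearms to yield an apparent RR of ~3.4:
        print(observed_rr_under_null(rr_ud=20, p1=0.30, p0=0.05))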

  1. Differential dietary nutrient intake according to hormone replacement therapy use: an underestimated confounding factor in epidemiologic studies?

    PubMed

    Vercambre, Marie-Noël; Fournier, Agnès; Boutron-Ruault, Marie-Christine; Clavel-Chapelon, Françoise; Ringa, Virginie; Berr, Claudine

    2007-12-15

    Observational studies and randomized controlled trials have produced divergent results concerning the effect of hormone replacement therapy (HRT) on cardiovascular disease and, to a lesser extent, dementia. Residual confounding (confounding that remains even after adjustment for various socioeconomic and lifestyle factors) is one explanation that has been offered for these divergent results. The authors used data collected between 1990 and 1995 from 6,697 French women aged 61-72 years participating in a prospective cohort study to explore the hypothesis that nutritional intake varies according to HRT use and thus may be a source of residual confounding. After the authors adjusted for health and lifestyle factors, HRT users, compared with never users, had significantly higher intakes of alcohol; omega3 fatty acids; vitamins B6, B12, and D; and phosphorus and a lower intake of starch. These differential nutrient intakes were related to differences in eating habits. In particular, HRT users in the studied sample, compared with nonusers, ate significantly more fish. Most of the dietary differences were seen in both early users and delayers of HRT. To limit residual confounding in observational studies, dietary factors may be important parameters to be taken into account in analyses of HRT use and health outcomes.

  2. Quantification of confounding factors in MRI-based dose calculations as applied to prostate IMRT

    NASA Astrophysics Data System (ADS)

    Maspero, Matteo; Seevinck, Peter R.; Schubert, Gerald; Hoesl, Michaela A. U.; van Asselen, Bram; Viergever, Max A.; Lagendijk, Jan J. W.; Meijer, Gert J.; van den Berg, Cornelis A. T.

    2017-02-01

    Magnetic resonance (MR)-only radiotherapy treatment planning requires pseudo-CT (pCT) images to enable MR-based dose calculations. To verify the accuracy of MR-based dose calculations, institutions interested in introducing MR-only planning will have to compare pCT-based and computer tomography (CT)-based dose calculations. However, interpreting such comparison studies may be challenging, since potential differences arise from a range of confounding factors which are not necessarily specific to MR-only planning. Therefore, the aim of this study is to identify and quantify the contribution of factors confounding dosimetric accuracy estimation in comparison studies between CT and pCT. The following factors were distinguished: set-up and positioning differences between imaging sessions, MR-related geometric inaccuracy, pCT generation, use of specific calibration curves to convert pCT into electron density information, and registration errors. The study comprised fourteen prostate cancer patients who underwent CT/MRI-based treatment planning. To enable pCT generation, a commercial solution (MRCAT, Philips Healthcare, Vantaa, Finland) was adopted. IMRT plans were calculated on CT (gold standard) and pCTs. Dose difference maps in a high dose region (CTV) and in the body volume were evaluated, and the contribution to dose errors of possible confounding factors was individually quantified. We found that the largest confounding factor leading to dose difference was the use of different calibration curves to convert pCT and CT into electron density (0.7%). The second largest factor was the pCT generation which resulted in pCT stratified into a fixed number of tissue classes (0.16%). Inter-scan differences due to patient repositioning, MR-related geometric inaccuracy, and registration errors did not significantly contribute to dose differences (0.01%). The proposed approach successfully identified and quantified the factors confounding accurate MRI-based dose calculation in

  3. Quantification of confounding factors in MRI-based dose calculations as applied to prostate IMRT.

    PubMed

    Maspero, Matteo; Seevinck, Peter R; Schubert, Gerald; Hoesl, Michaela A U; van Asselen, Bram; Viergever, Max A; Lagendijk, Jan J W; Meijer, Gert J; van den Berg, Cornelis A T

    2017-02-07

    Magnetic resonance (MR)-only radiotherapy treatment planning requires pseudo-CT (pCT) images to enable MR-based dose calculations. To verify the accuracy of MR-based dose calculations, institutions interested in introducing MR-only planning will have to compare pCT-based and computer tomography (CT)-based dose calculations. However, interpreting such comparison studies may be challenging, since potential differences arise from a range of confounding factors which are not necessarily specific to MR-only planning. Therefore, the aim of this study is to identify and quantify the contribution of factors confounding dosimetric accuracy estimation in comparison studies between CT and pCT. The following factors were distinguished: set-up and positioning differences between imaging sessions, MR-related geometric inaccuracy, pCT generation, use of specific calibration curves to convert pCT into electron density information, and registration errors. The study comprised fourteen prostate cancer patients who underwent CT/MRI-based treatment planning. To enable pCT generation, a commercial solution (MRCAT, Philips Healthcare, Vantaa, Finland) was adopted. IMRT plans were calculated on CT (gold standard) and pCTs. Dose difference maps in a high dose region (CTV) and in the body volume were evaluated, and the contribution to dose errors of possible confounding factors was individually quantified. We found that the largest confounding factor leading to dose difference was the use of different calibration curves to convert pCT and CT into electron density (0.7%). The second largest factor was the pCT generation which resulted in pCT stratified into a fixed number of tissue classes (0.16%). Inter-scan differences due to patient repositioning, MR-related geometric inaccuracy, and registration errors did not significantly contribute to dose differences (0.01%). The proposed approach successfully identified and quantified the factors confounding accurate MRI-based dose calculation in

  4. A Harmonious Accounting Duo?

    ERIC Educational Resources Information Center

    Schapperle, Robert F.; Hardiman, Patrick F.

    1992-01-01

    Accountants have urged "harmonization" of standards between the Governmental Accounting Standards Board and the Financial Accounting Standards Board, recommending similar reporting of like transactions. However, varying display of similar accounting events does not necessarily indicate disharmony. The potential for problems because of…

  5. Child welfare clients have higher risks for teenage childbirths: which are the major confounders?

    PubMed Central

    Vinnerljung, Bo; Hjern, Anders

    2016-01-01

    Background: Aiming to support effective social intervention strategies targeting high-risk groups for teenage motherhood, this study examined to what extent the elevated crude risks of teenage childbirth among child welfare groups were attributable to the uneven distribution of adverse individual and family background factors. Methods: Comprehensive longitudinal register data for more than 700 000 Swedish females born 1973–1989 (including around 29 000 child welfare clients) were analysed by means of binary logistic regression. The Karlson/Holm/Breen-method was used to decompose each confounding factor’s relative contribution to the difference between crude and adjusted odds ratios (ORs). Results: Elevated crude risks for teenage childbirth are to a large extent attributable to selection on observables. Girls’ school failure was the most potent confounder, accounting for 28–35% of the difference between crude and adjusted ORs. Conclusion: As in majority populations, girls’ school failure was a strong risk factor for teenage childbirth among former child welfare children. At least among pre-adolescents, promoting school performance among children in the child welfare system seems to be a viable intervention path. PMID:27085195

  6. Evaluation in medical education: A topical review of target parameters, data collection tools and confounding factors

    PubMed Central

    Schiekirka, Sarah; Feufel, Markus A.; Herrmann-Lingen, Christoph; Raupach, Tobias

    2015-01-01

    Background and objective: Evaluation is an integral part of education in German medical schools. According to the quality standards set by the German Society for Evaluation, evaluation tools must provide an accurate and fair appraisal of teaching quality. Thus, data collection tools must be highly reliable and valid. This review summarises the current literature on evaluation of medical education with regard to the possible dimensions of teaching quality, the psychometric properties of survey instruments and potential confounding factors. Methods: We searched Pubmed, PsycINFO and PSYNDEX for literature on evaluation in medical education and included studies published up until June 30, 2011 as well as articles identified in the “grey literature”. Results are presented as a narrative review. Results: We identified four dimensions of teaching quality: structure, process, teacher characteristics, and outcome. Student ratings are predominantly used to address the first three dimensions, and a number of reliable tools are available for this purpose. However, potential confounders of student ratings pose a threat to the validity of these instruments. Outcome is usually operationalised in terms of student performance on examinations, but methodological problems may limit the usability of these data for evaluation purposes. In addition, not all examinations at German medical schools meet current quality standards. Conclusion: The choice of tools for evaluating medical education should be guided by the dimension that is targeted by the evaluation. Likewise, evaluation results can only be interpreted within the context of the construct addressed by the data collection tool that was used as well as its specific confounding factors. PMID:26421003

  7. Historical cohort study of US man-made vitreous fiber production workers: VI. Respiratory system cancer standardized mortality ratios adjusted for the confounding effect of cigarette smoking.

    PubMed

    Marsh, G M; Buchanich, J M; Youk, A O

    2001-09-01

    To date, the US cohort study of man-made vitreous fiber workers has provided no consistent evidence of a relationship between man-made vitreous fiber exposure and mortality from malignant or non-malignant respiratory disease. Nevertheless, there have been small, overall excesses in respiratory system cancer (RSC) among workers from the fiberglass and rock/slag wool production plants included in the study that were unexplained by estimated worker exposures to respirable fiber or other agents present in the plants. The present investigation was designed to provide a quantitative estimate of the extent to which the overall excess in RSC mortality observed at the total cohort level among male fiberglass and rock/slag wool workers is a result of the positive confounding effects of cigarette smoking. Because cigarette-smoking data were neither available nor obtainable at the individual level for all members of the fiberglass and rock/slag wool cohorts, we used the "indirect" method to adjust RSC standardized mortality ratios (SMRs) at the group (cohort and plant) level. Our adjustment suggested that cigarette smoking accounts for all of the 7% and 24% excesses in RSC observed, respectively, for the male fiberglass and rock/slag wool cohorts in the latest mortality updates. The same conclusion was reached regardless of which of several alternative formulations were used to adjust local rate-based RSC SMRs. We found that our smoking adjustments were robust with respect to several alternative characterizations and (with the exception of one fiberglass plant) produced adjusted RSC SMRs that were lower than their unadjusted counterparts. Further, all statistically significantly elevated unadjusted SMRs were reduced to not statistically significant levels. These results reaffirm that RSC SMRs based on US and local rates must take into account the potential confounding effects of cigarette smoking. They also suggest that the use of local county mortality rate-based SMRs may not
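
    The "indirect" adjustment idea can be sketched with an Axelson-style calculation: estimate the confounding risk ratio implied by differing smoking prevalences in the cohort and the reference population, then divide the observed SMR by it. The prevalences and smoking relative risk below are placeholders, not the study's inputs.

        # Axelson-style indirect adjustment of an SMR for smoking at the group level.
        def confounding_rr(prev_cohort, prev_ref, rr_smoking):
            """Ratio of smoking-attributable baseline risk, cohort vs reference."""
            risk_cohort = 1 + prev_cohort * (rr_smoking - 1)
            risk_ref = 1 + prev_ref * (rr_smoking - 1)
            return risk_cohort / risk_ref

        observed_smr = 1.24                      # e.g. a 24% excess in RSC mortality
        crr = confounding_rr(prev_cohort=0.60, prev_ref=0.45, rr_smoking=10.0)
        print("smoking-adjusted SMR:", round(observed_smr / crr, 2))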

  8. Accounting Curriculum.

    ERIC Educational Resources Information Center

    Prickett, Charlotte

    This curriculum guide describes the accounting curriculum in the following three areas: accounting clerk, bookkeeper, and nondegreed accountant. The competencies and tasks complement the Arizona validated listing in these areas. The guide lists 24 competencies for nondegreed accountants, 10 competencies for accounting clerks, and 11 competencies…

  9. Saccade-confounded image statistics explain visual crowding.

    PubMed

    Nandy, Anirvan S; Tjan, Bosco S

    2012-01-08

    Processing of shape information in human peripheral visual fields is impeded beyond what can be expected by poor spatial resolution. Visual crowding, the inability to identify objects in clutter, has been shown to be the primary factor limiting shape perception in peripheral vision. Despite the well-documented effects of crowding, its underlying causes remain poorly understood. Given that spatial attention both facilitates learning of image statistics and directs saccadic eye movements, we propose that the acquisition of image statistics in peripheral visual fields is confounded by eye-movement artifacts. Specifically, the image statistics acquired under a peripherally deployed spotlight of attention are systematically biased by saccade-induced image displacements. These erroneously represented image statistics lead to inappropriate contextual interactions in the periphery and cause crowding.

  10. Is the Inverse Association Between Selenium and Bladder Cancer Due to Confounding by Smoking?

    PubMed Central

    Beane Freeman, Laura E.; Karagas, Margaret R.; Baris, Dalsu; Schwenn, Molly; Johnson, Alison T.; Colt, Joanne S.; Jackson, Brian; Hosain, G. M. Monawar; Cantor, Kenneth P.; Silverman, Debra T.

    2015-01-01

    Selenium has been linked to a reduced risk of bladder cancer in some studies. Smoking, a well-established risk factor for bladder cancer, has been associated with lower selenium levels in the body. We investigated the selenium-bladder cancer association in subjects from Maine, New Hampshire, and Vermont in the New England Bladder Cancer Case-Control Study. At interview (2001–2005), participants provided information on a variety of factors, including a comprehensive smoking history, and submitted toenail samples, from which we measured selenium levels. We estimated odds ratios and 95% confidence intervals among 1,058 cases and 1,271 controls using logistic regression. After controlling for smoking, we saw no evidence of an association between selenium levels and bladder cancer (for fourth quartile vs. first quartile, odds ratio (OR) = 0.98, 95% confidence interval (CI): 0.77, 1.25). When results were restricted to regular smokers, there appeared to be an inverse association (OR = 0.76, 95% CI: 0.58, 0.99); however, when pack-years of smoking were considered, this association was attenuated (OR = 0.91, 95% CI: 0.68, 1.20), indicating potential confounding by smoking. Despite some reports of an inverse association between selenium and bladder cancer overall, our results, combined with an in-depth evaluation of other studies, suggested that confounding from smoking intensity or duration could explain this association. Our study highlights the need to carefully evaluate the confounding association of smoking in the selenium-bladder cancer association. PMID:25776013

  11. Is the inverse association between selenium and bladder cancer due to confounding by smoking?

    PubMed

    Beane Freeman, Laura E; Karagas, Margaret R; Baris, Dalsu; Schwenn, Molly; Johnson, Alison T; Colt, Joanne S; Jackson, Brian; Hosain, G M Monawar; Cantor, Kenneth P; Silverman, Debra T

    2015-04-01

    Selenium has been linked to a reduced risk of bladder cancer in some studies. Smoking, a well-established risk factor for bladder cancer, has been associated with lower selenium levels in the body. We investigated the selenium-bladder cancer association in subjects from Maine, New Hampshire, and Vermont in the New England Bladder Cancer Case-Control Study. At interview (2001-2005), participants provided information on a variety of factors, including a comprehensive smoking history, and submitted toenail samples, from which we measured selenium levels. We estimated odds ratios and 95% confidence intervals among 1,058 cases and 1,271 controls using logistic regression. After controlling for smoking, we saw no evidence of an association between selenium levels and bladder cancer (for fourth quartile vs. first quartile, odds ratio (OR) = 0.98, 95% confidence interval (CI): 0.77, 1.25). When results were restricted to regular smokers, there appeared to be an inverse association (OR = 0.76, 95% CI: 0.58, 0.99); however, when pack-years of smoking were considered, this association was attenuated (OR = 0.91, 95% CI: 0.68, 1.20), indicating potential confounding by smoking. Despite some reports of an inverse association between selenium and bladder cancer overall, our results, combined with an in-depth evaluation of other studies, suggested that confounding from smoking intensity or duration could explain this association. Our study highlights the need to carefully evaluate the confounding association of smoking in the selenium-bladder cancer association.
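
    The sketch below illustrates the general modelling strategy described in these two records: logistic regression of case status on selenium quartiles, with and without adjustment for pack-years, using simulated data in which smoking alone drives both selenium levels and case status. The variable names and coefficients are invented, not the New England study data.

        # Crude vs smoking-adjusted odds ratios for selenium quartiles.
        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(0)
        n = 2000
        packyears = rng.gamma(shape=2.0, scale=10.0, size=n)
        selenium = rng.normal(loc=0.9 - 0.003 * packyears, scale=0.1, size=n)
        logit = -1.0 + 0.03 * packyears          # smoking drives case status
        case = rng.binomial(1, 1 / (1 + np.exp(-logit)))
        df = pd.DataFrame({"case": case, "packyears": packyears,
                           "se_q": pd.qcut(selenium, 4, labels=False)})

        crude = smf.logit("case ~ C(se_q)", data=df).fit(disp=0)
        adjusted = smf.logit("case ~ C(se_q) + packyears", data=df).fit(disp=0)
        print("crude OR, Q4 vs Q1:", round(np.exp(crude.params["C(se_q)[T.3]"]), 2))
        print("adjusted OR, Q4 vs Q1:", round(np.exp(adjusted.params["C(se_q)[T.3]"]), 2))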

  12. Negative Confounding by Essential Fatty Acids in Methylmercury Neurotoxicity Associations

    PubMed Central

    Choi, Anna L; Mogensen, Ulla B.; Bjerve, Kristian S.; Debes, Frodi; Weihe, Pal; Grandjean, Philippe; Budtz-Jørgensen, Esben

    2014-01-01

    Background Methylmercury, a worldwide contaminant of fish and seafood, can cause adverse effects on the developing nervous system. However, long-chain n-3 polyunsaturated fatty acids in seafood provide beneficial effects on brain development. Negative confounding will likely result in underestimation of both mercury toxicity and nutrient benefits unless mutual adjustment is included in the analysis. Methods We examined these associations in 176 Faroese children, in whom prenatal methylmercury exposure was assessed from mercury concentrations in cord blood and maternal hair. The relative concentrations of fatty acids were determined in cord serum phospholipids. Neuropsychological performance in verbal, motor, attention, spatial, and memory functions was assessed at 7 years of age. Multiple regression and structural equation models (SEMs) were carried out to determine the confounder-adjusted associations with methylmercury exposure. Results A short delay recall (in percent change) in the California Verbal Learning Test (CVLT) was associated with a doubling of cord blood methylmercury (−18.9, 95% confidence interval [CI] = −36.3, −1.51). The association became stronger after the inclusion of fatty acid concentrations in the analysis (−22.0, 95% confidence interval [CI] = −39.4, −4.62). In structural equation models, poorer memory function (corresponding to a lower score in the learning trials and short delay recall in CVLT) was associated with a doubling of prenatal exposure to methylmercury after the inclusion of fatty acid concentrations in the analysis (−1.94, 95% CI = −3.39, −0.49). Conclusions Associations between prenatal exposure to methylmercury and neurobehavioral deficits in memory function at school age were strengthened after fatty acid adjustment, thus suggesting that n-3 fatty acids need to be included in analysis of similar studies to avoid underestimation of the associations with methylmercury exposure. PMID:24561639
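
    A small simulated illustration of the negative confounding described here: when the exposure and a beneficial nutrient are positively correlated but push the outcome in opposite directions, the unadjusted exposure coefficient is biased towards the null, and mutual adjustment recovers it. The coefficients are invented for the illustration, not the Faroese data.

        import numpy as np

        rng = np.random.default_rng(0)
        n = 10_000
        fish_intake = rng.normal(size=n)
        log_hg = 0.8 * fish_intake + rng.normal(scale=0.6, size=n)   # exposure
        n3_fa = 0.8 * fish_intake + rng.normal(scale=0.6, size=n)    # beneficial nutrient
        score = -1.0 * log_hg + 1.0 * n3_fa + rng.normal(size=n)     # neurodevelopmental score

        def fit(y, X):
            X1 = np.column_stack([np.ones(len(y))] + list(X))
            return np.linalg.lstsq(X1, y, rcond=None)[0]

        crude_hg = fit(score, [log_hg])[1]
        adjusted_hg = fit(score, [log_hg, n3_fa])[1]
        print(f"crude Hg coefficient ~ {crude_hg:.2f}, adjusted ~ {adjusted_hg:.2f}")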

  13. Using high-dimensional propensity scores to automate confounding control in a distributed medical product safety surveillance system.

    PubMed

    Rassen, Jeremy A; Schneeweiss, Sebastian

    2012-01-01

    Distributed medical product safety monitoring systems such as the Sentinel System, to be developed as a part of Food and Drug Administration's Sentinel Initiative, will require automation of large parts of the safety evaluation process to achieve the necessary speed and scale at reasonable cost without sacrificing validity. Although certain functions will require investigator intervention, confounding control is one area that can largely be automated. The high-dimensional propensity score (hd-PS) algorithm is one option for automated confounding control in longitudinal healthcare databases. In this article, we discuss the use of hd-PS for automating confounding control in sequential database cohort studies, as applied to safety monitoring systems. In particular, we discuss the robustness of the covariate selection process, the potential for over- or under-selection of variables including the possibilities of M-bias and Z-bias, the computation requirements, the practical considerations in a federated database network, and the cases where automated confounding adjustment may not function optimally. We also outline recent improvements to the algorithm and show how the algorithm has performed in several published studies. We conclude that despite certain limitations, hd-PS offers substantial advantages over non-automated alternatives in active product safety monitoring systems.
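
    A heavily simplified sketch of the prioritization idea behind hd-PS follows: rank empirically derived binary codes by a Bross-type bias multiplier and feed the top-ranked codes into a propensity score model. This is not the published algorithm or its software; the data, thresholds, and ranking statistic are simplifications for illustration.

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        def bross_bias(code, exposure, outcome):
            p1 = code[exposure == 1].mean()          # prevalence among exposed
            p0 = code[exposure == 0].mean()          # prevalence among unexposed
            rr = (outcome[code == 1].mean() + 1e-6) / (outcome[code == 0].mean() + 1e-6)
            rr = max(rr, 1 / rr)                     # direction-agnostic strength
            return (p1 * (rr - 1) + 1) / (p0 * (rr - 1) + 1)

        rng = np.random.default_rng(0)
        n, k = 5000, 200
        codes = rng.binomial(1, 0.1, size=(n, k))    # empirical binary covariates
        exposure = rng.binomial(1, 1 / (1 + np.exp(-codes[:, :5].sum(axis=1) + 1)))
        outcome = rng.binomial(1, 1 / (1 + np.exp(-codes[:, :5].sum(axis=1) - exposure + 2)))

        scores = [abs(np.log(bross_bias(codes[:, j], exposure, outcome))) for j in range(k)]
        top = np.argsort(scores)[::-1][:50]          # keep the 50 highest-ranked codes
        ps = LogisticRegression(max_iter=1000).fit(codes[:, top], exposure)
        print("propensity scores for the first 5 subjects:",
              np.round(ps.predict_proba(codes[:, top])[:5, 1], 2))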

  14. Evaluating the potential health risk of toxic trace elements in vegetables: Accounting for variations in soil factors.

    PubMed

    Yang, Yang; Chen, Weiping; Wang, Meie; Li, Yanling; Peng, Chi

    2017-02-06

    Vegetable crop consumption is one of the main sources of dietary exposure to toxic trace elements (TEs). A paired survey of soil and vegetable samples was conducted in 589 agricultural sites in the Youxian prefecture, southern China, to investigate the effect of soil factors on the accumulation of arsenic, cadmium, mercury, and lead in different vegetables. A site-specific model was developed to estimate the health risk from vegetable consumption. The TE concentration varied in different plant species, and rape can be cultivated in contaminated areas for its potential use in restricting the transfer of TE from soil to edible plant parts. The accumulation of TEs in vegetables was governed by multiple factors, mainly element interaction, metal availability (extractable CaCl2 fraction), and soil pH. Soil Zn may promote Cd accumulation in vegetables when soil Cd/Zn ratio>0.02. Cadmium is a major hazardous component. About 80.8% of the adult populations consuming locally produced vegetables had a daily Cd intake risk above the safe standard. Among investigated vegetables, radish is potentially hazardous for populations because of its high consumption rate and high Cd content but low Zn accumulation. The consumption of radish cultivated in highly acidic soil (4
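
    The risk estimate described here rests on a standard daily-intake/hazard-quotient calculation, sketched generically below; all parameter values are placeholders rather than the survey's measurements.

        # Estimated daily intake of a trace element from vegetable consumption,
        # compared with a tolerable daily intake (hazard quotient > 1 = concern).
        def hazard_quotient(conc_mg_per_kg, intake_kg_per_day, body_weight_kg, tdi_mg_per_kg_bw):
            """EDI (mg/kg body weight/day) divided by the tolerable daily intake."""
            edi = conc_mg_per_kg * intake_kg_per_day / body_weight_kg
            return edi / tdi_mg_per_kg_bw

        # e.g. cadmium in radish (fresh weight), adult consumer (illustrative values)
        hq = hazard_quotient(conc_mg_per_kg=0.12, intake_kg_per_day=0.25,
                             body_weight_kg=60.0, tdi_mg_per_kg_bw=1e-3)
        print("hazard quotient for Cd:", round(hq, 2))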

  15. Hamilton study: distribution of factors confounding the relationship between air quality and respiratory health

    SciTech Connect

    Pengelly, L.D.; Kerigan, A.T.; Goldsmith, C.H.; Inman, E.M.

    1984-10-01

    Hamilton, Ontario is an industrial city with a population of 300,000 which is situated at the western end of Lake Ontario. Canada's two largest iron and steel mills are located here; the city historically has had relatively poor air quality, which has improved markedly in the last 25 years. Concern about the health effects of current air quality recently led us to carry out an epidemiological study of the effects of air pollution on the respiratory health of over 3500 school children. Respiratory health was measured by pulmonary function testing of each child, and by an assessment of each child's respiratory symptoms via a questionnaire administered to the parents. Previous studies had shown that other environmental factors (e.g. parental smoking, parental cough, socioeconomic level, housing, and gas cooking) might also affect respiratory health, and thus confound any potential relationships between health and air pollution. The questionnaire also collected information on many of these confounding factors. For the purposes of initial analysis, the city was divided into five areas in which differences in air quality were expected. In general, factors which have been associated with poor respiratory health were observed to be more prevalent in areas of poorer air quality.

  16. Prostate-Specific Antigen Velocity Before and After Elimination of Factors That Can Confound the Prostate-Specific Antigen Level

    SciTech Connect

    Park, Jessica J.; Chen, Ming-Hui; Loffredo, Marian; D'Amico, Anthony V.

    2012-03-01

    Purpose: Prostate-specific antigen (PSA) velocity, like PSA level, can be confounded. In this study, we estimated the impact that confounding factors could have on correctly identifying a patient with a PSA velocity >2 ng/ml/y. Methods and Materials: Between 2006 and 2010, a total of 50 men with newly diagnosed PC comprised the study cohort. We calculated and compared the false-positive and false-negative PSA velocity >2 ng/ml/y rates for all men and those with low-risk disease using two approaches to calculate PSA velocity. First, we used PSA values obtained within 18 months of diagnosis; second, we used values within 18 months of diagnosis, substituting the prebiopsy PSA for a repeat, nonconfounded PSA that was obtained using the same assay and without confounders. Results: Using PSA levels pre-biopsy, 46% of all men had a PSA velocity >2 ng/ml/y; whereas this value declined to 32% when substituting the last prebiopsy PSA for a repeat, nonconfounded PSA using the same assay and without confounders. The false-positive rate for PSA velocity >2 ng/ml/y was 43% as compared with a false-negative rate of PSA velocity >2 ng/ml/y of 11% (p = 0.0008) in the overall cohort. These respective values in the low-risk subgroup were 60% and 16.7% (p = 0.09). Conclusion: This study provides evidence to explain the discordance in cancer-specific outcomes among groups investigating the prognostic significance of PSA velocity >2 ng/ml/y, and highlights the importance of patient education on potential confounders of the PSA test before obtaining PSA levels.
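
    PSA velocity itself is simply the slope of serial PSA values over time, as in the minimal sketch below; the timing and PSA values are invented, and the clinical point of the abstract is that the inputs to this slope can be confounded.

        import numpy as np

        def psa_velocity(times_years, psa_values):
            """Least-squares slope of PSA (ng/ml) versus time (years)."""
            slope, _intercept = np.polyfit(times_years, psa_values, deg=1)
            return slope

        times = np.array([-1.5, -0.9, -0.3, 0.0])      # years relative to diagnosis
        psa = np.array([4.1, 5.0, 6.8, 7.4])           # ng/ml, possibly confounded
        v = psa_velocity(times, psa)
        print(f"PSA velocity = {v:.1f} ng/ml/y -> {'>' if v > 2 else '<='} 2 ng/ml/y")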

  17. LIFE CLIMATREE project: A novel approach for accounting and monitoring carbon sequestration of tree crops and their potential as carbon sink areas

    NASA Astrophysics Data System (ADS)

    Stergiou, John; Tagaris, Efthimios; -Eleni Sotiropoulou, Rafaella

    2016-04-01

    Climate Change Mitigation is one of the most important objectives of the Kyoto Convention and is mostly oriented towards reducing GHG emissions. However, carbon sink is accounted for only in the calculation of forest capacity, since agricultural land and farmers' practices for securing carbon stored in soils have not been recognized in GHG accounting, possibly resulting in incorrect estimations of the carbon dioxide balance in the atmosphere. The agricultural sector, which is a key sector in the EU, has had a consistent strategic framework since 1954 in the form of the Common Agricultural Policy (CAP). In its latest reform of 2013 (reg. (EU) 1305/13), CAP recognized the significance of Agriculture as a key player in Climate Change policy. In order to fill this gap, the "LIFE ClimaTree" project has recently been funded by the European Commission, aiming to provide a novel method for including tree crop cultivations in the LULUCF accounting rules for GHG emissions and removal. In the framework of the "LIFE ClimaTree" project, the carbon sink within the EU, including the calculated tree crop capacity, will be assessed for both current and future climatic conditions by the 2050s using the GISS-WRF modeling system at a very fine scale (i.e., 9 km x 9 km) under the RCP8.5 and RCP4.5 climate scenarios. Acknowledgement: LIFE CLIMATREE project "A novel approach for accounting and monitoring carbon sequestration of tree crops and their potential as carbon sink areas" (LIFE14 CCM/GR/000635).

  18. Smoking and hormesis as confounding factors in radiation pulmonary carcinogenesis.

    PubMed

    Sanders, Charles L; Scott, Bobby R

    2006-12-06

    Confounding factors in radiation pulmonary carcinogenesis are passive and active cigarette smoke exposures and radiation hormesis. Significantly increased lung cancer risk from ionizing radiation at lung doses < 1 Gy is not observed in never smokers exposed to ionizing radiations. Residential radon is not a cause of lung cancer in never smokers and may protect against lung cancer in smokers. The risk of lung cancer found in many epidemiological studies was less than the expected risk (hormetic effect) for nuclear weapons and power plant workers, shipyard workers, fluoroscopy patients, and inhabitants of high-dose background radiation areas. The protective effect was noted for low- and mixed high- and low-linear energy transfer (LET) radiations in both genders. Many studies showed a protection factor (PROFAC) > 0.40 (40% avoided) against the occurrence of lung cancer. The ubiquitous nature of the radiation hormesis response in cellular, animal, and epidemiological studies negates the healthy worker effect as an explanation for radiation hormesis. Low-dose radiation may stimulate DNA repair/apoptosis and immunity to suppress and eliminate cigarette-smoke-induced transformed cells in the lung, reducing lung cancer occurrence in smokers.

  19. Diagnostic and therapeutic strategy for confounding radiation myelitis.

    PubMed

    Higashida, Tetsuhiro; Colen, Chaim B; Guthikonda, Murali

    2010-05-01

    We report a case of confounding radiation myelitis to demonstrate the usefulness of surgical biopsy in ensuring the correct diagnosis and to avoid unnecessary treatment. The patient was a 40-year-old man with a history of epiglottis carcinoma and sarcoidosis. Six months after radiation therapy and chemotherapy for epiglottis carcinoma, he noticed paresthesia and dysesthesia in the left arm and leg. Two months after that, he complained of severe neck pain and rapidly progressing weakness in all extremities. MRI showed an enhanced intramedullary lesion with extensive edema in the cervical spinal cord. Radiation myelitis, intramedullary spinal tumor, and neurosarcoidosis were considered as differential diagnoses. Spinal cord biopsy with laminectomy was performed and radiation myelitis was diagnosed. After the surgery, the lesion was significantly decreased in size even though corticosteroid therapy was rapidly tapered. We emphasize that a spinal cord biopsy is indicated to obtain a pathological diagnosis and to make a clear treatment strategy for patients with associated diseases causing lesions of the spinal cord.

  20. Polymorphisms in DNA-Repair Genes in a Cohort of Prostate Cancer Patients from Different Areas in Spain: Heterogeneity between Populations as a Confounding Factor in Association Studies

    PubMed Central

    Henríquez-Hernández, Luis Alberto; Valenciano, Almudena; Foro-Arnalot, Palmira; Álvarez-Cubero, María Jesús; Cozar, José Manuel; Suárez-Novo, José Francisco; Castells-Esteve, Manel; Ayala-Gil, Adriana; Fernández-Gonzalo, Pablo; Ferrer, Montse; Guedea, Ferrán; Sancho-Pardo, Gemma; Craven-Bartle, Jordi; Ortiz-Gordillo, María José; Cabrera-Roldán, Patricia; Herrera-Ramos, Estefanía; Lara, Pedro C.

    2013-01-01

    Background Differences in the distribution of genotypes between individuals of the same ethnicity are an important confounding factor commonly undervalued in typical association studies conducted in radiogenomics. Objective To evaluate the genotypic distribution of SNPs in a wide set of Spanish prostate cancer patients to determine the homogeneity of the population and to disclose potential bias. Design, Setting, and Participants A total of 601 prostate cancer patients from Andalusia, Basque Country, Canary and Catalonia were genotyped for 10 SNPs located in 6 different genes associated with DNA repair: XRCC1 (rs25487, rs25489, rs1799782), ERCC2 (rs13181), ERCC1 (rs11615), LIG4 (rs1805388, rs1805386), ATM (rs17503908, rs1800057) and P53 (rs1042522). The SNP genotyping was performed on a Biotrove OpenArray® NT Cycler. Outcome Measurements and Statistical Analysis Comparisons of genotypic and allelic frequencies among populations, as well as haplotype analyses, were performed using the web-based environment SNPator. Principal component analysis was performed using the SnpMatrix and XSnpMatrix classes and methods implemented as an R package. Non-supervised hierarchical clustering of SNPs was performed using MultiExperiment Viewer. Results and Limitations We observed that the genotype distribution of 4 out of 10 SNPs was statistically different among the studied populations, showing the greatest differences between Andalusia and Catalonia. These observations were confirmed in cluster analysis, principal component analysis and in the differential distribution of haplotypes among the populations. Because tumor characteristics have not been taken into account, it is possible that some polymorphisms may influence tumor characteristics in the same way that they may pose a risk factor for other disease characteristics. Conclusion Differences in the distribution of genotypes within different populations of the same ethnicity could be an important confounding factor responsible for the lack of validation of

  1. Randomly Accountable

    ERIC Educational Resources Information Center

    Kane, Thomas J.; Staiger, Douglas O.; Geppert, Jeffrey

    2002-01-01

    The accountability debate tends to devolve into a battle between the pro-testing and anti-testing crowds. When it comes to the design of a school accountability system, the devil is truly in the details. A well-designed accountability plan may go a long way toward giving school personnel the kinds of signals they need to improve performance.…

  2. Parasitism can be a confounding factor in assessing the response of zebra mussels to water contamination.

    PubMed

    Minguez, Laëtitia; Buronfosse, Thierry; Beisel, Jean-Nicolas; Giambérini, Laure

    2012-03-01

    Biological responses measured in aquatic organisms to monitor environmental pollution could be also affected by different biotic and abiotic factors. Among these environmental factors, parasitism has often been neglected even if infection by parasites is very frequent. In the present field investigation, the parasite infra-communities and zebra mussel biological responses were studied up- and downstream a waste water treatment plant in northeast France. In both sites, mussels were infected by ciliates and/or intracellular bacteria, but prevalence rates and infection intensities were different according to the habitat. Concerning the biological responses differences were observed related to the site quality and the infection status. Parasitism affects both systems but seemed to depend mainly on environmental conditions. The influence of parasites is not constant, but remains important to consider it as a potential confounding factor in ecotoxicological studies. This study also emphasizes the interesting use of integrative indexes to synthesize data set.

  3. Homophily and Contagion Are Generically Confounded in Observational Social Network Studies

    PubMed Central

    Shalizi, Cosma Rohilla; Thomas, Andrew C.

    2012-01-01

    The authors consider processes on social networks that can potentially involve three factors: homophily, or the formation of social ties due to matching individual traits; social contagion, also known as social influence; and the causal effect of an individual’s covariates on his or her behavior or other measurable responses. The authors show that generically, all of these are confounded with each other. Distinguishing them from one another requires strong assumptions on the parametrization of the social process or on the adequacy of the covariates used (or both). In particular the authors demonstrate, with simple examples, that asymmetries in regression coefficients cannot identify causal effects and that very simple models of imitation (a form of social contagion) can produce substantial correlations between an individual’s enduring traits and his or her choices, even when there is no intrinsic affinity between them. The authors also suggest some possible constructive responses to these results. PMID:22523436

  4. HIV-1: the confounding variables of virus neutralization.

    PubMed

    Nara, Peter L; Lin, George

    2005-06-01

    The development of an effective vaccine against HIV-1 would be greatly facilitated by the ability to elicit potent, high affinity antibodies that are capable of broad neutralization, viral inactivation and protection against infection and/or disease. New insights into the structure and function of the HIV-1 envelope glycoprotein (Env) that mediates viral fusion and entry may ultimately lead to strategies successful in eliciting these protective antibody responses. Insights have been gained regarding HIV-1 Env attachment and receptor engagement, the fusion process and kinetics, and the structural/functional attributes of Env that allow humoral immune evasion. In addition, studies of a limited number of broadly neutralizing human monoclonal antibodies have shed some light as to how antibodies may penetrate the immune evading armor that HIV-1 has evolved. As the elusive goal of generating these types of antibodies emerge and are developed in the context of generating new candidate HIV-1 vaccines, a relevant in vitro measurement of neutralization by these types of antibodies becomes a complex task. This is in part due to a list of confounding variables which include: the physical and genomic nature (amino acid variation) of the infecting virion, the type of target cells, the concentration and clonality of the reactants, assay format and design, the affinity and kinetics of the reaction, receptors/coreceptors and attachment factors, and soluble host factors. This review will focus on the past, current, and future knowledge required to advance the field of HIV-1 humoral immunity as it impacts future HIV-1 vaccine development.

  5. Dietary Soy May Not Confound Acute Experimental Stroke Infarct Volume Outcomes In Ovariectomized Female Rats

    PubMed Central

    Prongay, Kamm D.; Lewis, Anne D.; Hurn, Patricia D.; Murphy, Stephanie J.

    2009-01-01

    Estrogen administration can alter experimental stroke outcomes. Soy as a source of phytoestrogens may therefore modulate responses in “estrogen-sensitive” stroke models, thus potentially confounding results. We evaluated the effects of dietary soy on acute infarct volumes in a pilot study using a rat focal stroke model. We hypothesized that ovariectomized (OVX) rats fed a soy-rich diet would have smaller acute infarct volumes than rats fed a soy-free diet. OVX rats were randomly assigned to a soy-free (n=6) or a soy-rich (n=6) diet for 4 weeks and weighed weekly. Following the dietary trial, rats underwent 2 hours of middle cerebral artery occlusion (MCAO). Mean arterial blood pressure, rectal and temporalis muscle temperatures, arterial blood gases, and blood glucose were recorded peri-ischemia. Rats were euthanized 22 hours following 2 hours of MCAO. Brains were stained with 2,3,5-triphenyl tetrazolium chloride for acute infarct volume analysis. Uterine weight and histology were also evaluated as additional internal estrogen-sensitive controls. Rats on the soy-free diet had greater gains in body weight (259±6% baseline body weight) than rats on the soy-rich diet (238±4% baseline body weight). No differences were seen in uterine weight and histology, peri-ischemic physiological parameters, and infarct volumes between the treatment groups. Results of this pilot study suggest that the dietary soy level tested may not alter acute infarct volumes in ischemic female rat brain. More studies addressing the potential confounding effects of dietary soy in “estrogen-sensitive” stroke models are needed if investigators are to make informed choices regarding diets used in experimental stroke research. PMID:20147341

  6. Do pollution time-series studies contain uncontrolled or residual confounding by risk factors for acute health events?

    PubMed

    Bukowski, John

    2008-07-01

    Acute health effects from air pollution are based largely on weak associations identified in time-series studies comparing daily air pollution levels to daily mortality. Much of this mortality is due to cardiovascular disease. Time-series studies have many potential limitations, but are not thought to be confounded by traditional cardiovascular risk factors (e.g., smoking status or hypertension) because these chronic risk factors are not obviously associated with daily pollution levels. However, acute psychobehavioral variants of these risk factors (e.g., smoking patterns and episodes of stress on any given day) are plausible confounders for the associations observed in time-series studies, given that time-series studies attempt to predict acute rather than chronic health outcomes. There is a fairly compelling literature on the strong link between cardiovascular events and daily "triggers" such as stress. Stress-related triggers are plausibly associated with daily pollution levels through surrogate stressors such as ambient temperature, daily workload, local traffic congestion, or other correlates of air pollution. For example, variables such as traffic congestion and industrial activity increase both stress-related health events and air pollution, suggesting the potential for classical confounding. Support for this argument is illustrated through examples of the well-demonstrated relationship between emotional stress and heart attack/stroke.

  7. Determining confounding sensitivities in eddy current thin film measurements

    NASA Astrophysics Data System (ADS)

    Gros, Ethan; Udpa, Lalita; Smith, James A.; Wachs, Katelyn

    2017-02-01

    Eddy current (EC) techniques are widely used in industry to measure the thickness of non-conductive films on a metal substrate. This is done by using a system whereby a coil carrying a high-frequency alternating current is used to create an alternating magnetic field at the surface of the instrument's probe. When the probe is brought near a conductive surface, the alternating magnetic field will induce ECs in the conductor. The substrate characteristics and the distance of the probe from the substrate (the coating thickness) affect the magnitude of the ECs. The induced currents load the probe coil affecting the terminal impedance of the coil. The measured probe impedance is related to the lift off between coil and conductor as well as conductivity of the test sample. For a known conductivity sample, the probe impedance can be converted into an equivalent film thickness value. The EC measurement can be confounded by a number of measurement parameters. It was the goal of this research to determine which physical properties of the measurement set-up and sample can adversely affect the thickness measurement. The eddy-current testing was performed using a commercially available, hand-held eddy-current probe (ETA3.3H spring-loaded eddy probe running at 8 MHz) that comes with a stand to hold the probe. The stand holds the probe and adjusts the probe on the z-axis to help position the probe in the correct area as well as make precise measurements. The signal from the probe was sent to a hand-held readout, where the results are recorded directly in terms of liftoff or film thickness. Understanding the effect of certain factors on the measurements of film thickness, will help to evaluate how accurate the ETA3.3H spring-loaded eddy probe was at measuring film thickness under varying experimental conditions. This research studied the effects of a number of factors such as i) conductivity, ii) edge effect, iii) surface finish of base material and iv) cable condition.

  8. Sediment organic matter content as a confounding factor in toxicity tests with Chironomus tentans

    SciTech Connect

    Lacey, R.; Watzin, M.C.; McIntosh, A.W.

    1999-02-01

    Physicochemical characteristics of sediment unrelated to contaminant levels and bioavailability may influence the outcome of toxicity tests. In particular, sediment organic matter content has the potential to be a confounding factor in toxicity tests using the midge larva Chironomus tentans because the larvae are infaunal and feed on organic matter in the sediments. To examine the possibility, the authors conducted a series of tests using formulated sediments with varying organic matter contents following the standard US Environmental Protection Agency (US EPA) 10-day C. tentans growth and survival protocol. Formulated sediments made with peat moss, α-cellulose, and maple leaves were tested. An organic-rich natural sediment diluted with formulated sediment to achieve a range of organic matter contents was also examined. In a final experiment, sediments containing each of the four organic matter sources at the same concentration were tested against one another. Survival was not greatly affected by concentration of organic matter, except at the lowest concentrations in natural sediment, where survival dipped below 70%. In experiments using peat moss, α-cellulose, and maple leaves, significant differences in C. tentans growth were found at different organic matter concentrations. In contrast, concentration of organic matter in the natural sediment dilution series had little effect on growth, perhaps because much of this material was highly refractory. In the comparison experiment, growth differed significantly among the four sediments, with best growth achieved with α-cellulose and leaves. These results suggest that both organic matter quantity and quality can be confounding factors in toxicity tests using C. tentans.

  9. Bed Sharing, SIDS Research, and the Concept of Confounding: A Review for Public Health Nurses.

    PubMed

    Keys, Elizabeth M; Rankin, James A

    2015-01-01

    Confounding is an important concept for public health nurses (PHNs) to understand when considering the results of epidemiological research. The term confounding is derived from Latin, confundere, which means to "mix-up" or "mix together". Epidemiologists attempt to derive a cause and effect relationship between two variables traditionally known as the exposure and disease (e.g., smoking and lung cancer). Confounding occurs when a third factor, known as a confounder, leads to an over- or underestimate of the magnitude of the association between the exposure and disease. An understanding of confounding will facilitate critical appraisal of epidemiological research findings. This knowledge will enable PHNs to strengthen their evidence-based practice and better prepare them for policy development and implementation. In recent years, researchers and clinicians have examined the relationship between bed sharing and sudden infant death syndrome (SIDS). The discussion regarding the risk of bed sharing and SIDS provides ample opportunity to discuss the various aspects of confounding. The purpose of this article is to use the bed sharing and SIDS literature to assist PHNs to understand confounding and to apply this knowledge when appraising epidemiological research. In addition, strategies that are used to control confounding are discussed.
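
    The over- or underestimation described here can be shown with a few lines of simulation: a confounder associated with both exposure and outcome inflates the crude risk ratio even when the exposure has no effect at all. All probabilities below are invented for the illustration.

        import numpy as np

        rng = np.random.default_rng(0)
        n = 200_000
        smoker = rng.binomial(1, 0.25, size=n)                       # confounder
        bedshare = rng.binomial(1, np.where(smoker == 1, 0.5, 0.2))  # exposure
        p_outcome = 0.001 * np.where(smoker == 1, 4.0, 1.0)          # smoking raises risk
        outcome = rng.binomial(1, p_outcome)                         # no effect of exposure

        def risk_ratio(exposed, event):
            return event[exposed == 1].mean() / event[exposed == 0].mean()

        crude = risk_ratio(bedshare, outcome)
        adj = np.mean([risk_ratio(bedshare[smoker == s], outcome[smoker == s]) for s in (0, 1)])
        print(f"crude RR ~ {crude:.2f}, stratum-averaged RR ~ {adj:.2f}")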

  10. Structural confounding of area-level deprivation and segregation: an empirical example

    EPA Science Inventory

    The neighborhood effects literature has grown, but its utility is limited by the lack of attention paid to non-random selection into neighborhoods. Confounding occurs when an exposure and an outcome share an underlying common cause. Confounding resulting from differential allocat...

  11. Residential proximity to electromagnetic field sources and birth weight: Minimizing residual confounding using multiple imputation and propensity score matching.

    PubMed

    de Vocht, Frank; Lee, Brian

    2014-08-01

    Studies have suggested that residential exposure to extremely low frequency (50 Hz) electromagnetic fields (ELF-EMF) from high voltage cables, overhead power lines, electricity substations or towers are associated with reduced birth weight and may be associated with adverse birth outcomes or even miscarriages. We previously conducted a study of 140,356 singleton live births between 2004 and 2008 in Northwest England, which suggested that close residential proximity (≤ 50 m) to ELF-EMF sources was associated with reduced average birth weight of 212 g (95%CI: -395 to -29 g) but not with statistically significant increased risks for other adverse perinatal outcomes. However, the cohort was limited by missing data for most potentially confounding variables including maternal smoking during pregnancy, which was only available for a small subgroup, while also residual confounding could not be excluded. This study, using the same cohort, was conducted to minimize the effects of these problems using multiple imputation to address missing data and propensity score matching to minimize residual confounding. Missing data were imputed using multiple imputation using chained equations to generate five datasets. For each dataset 115 exposed women (residing ≤ 50 m from a residential ELF-EMF source) were propensity score matched to 1150 unexposed women. After doubly robust confounder adjustment, close proximity to a residential ELF-EMF source remained associated with a reduction in birth weight of -116 g (95% confidence interval: -224:-7 g). No effect was found for proximity ≤ 100 m compared to women living further away. These results indicate that although the effect size was about half of the effect previously reported, close maternal residential proximity to sources of ELF-EMF remained associated with suboptimal fetal growth.
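
    The matching step described here can be sketched as nearest-neighbour 1:10 matching on an estimated propensity score (the multiple-imputation step is omitted). The data, covariates, and matching-with-replacement choice are assumptions for illustration, not the study's procedure.

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.neighbors import NearestNeighbors

        rng = np.random.default_rng(0)
        n = 20_000
        X = rng.normal(size=(n, 4))                                  # baseline covariates
        exposed = rng.binomial(1, 1 / (1 + np.exp(-(X[:, 0] - 3))))  # rare exposure

        ps = LogisticRegression(max_iter=1000).fit(X, exposed).predict_proba(X)[:, 1]
        treated_idx = np.flatnonzero(exposed == 1)
        control_idx = np.flatnonzero(exposed == 0)

        # 10 nearest controls (by propensity score) for each exposed mother
        nn = NearestNeighbors(n_neighbors=10).fit(ps[control_idx].reshape(-1, 1))
        _, matches = nn.kneighbors(ps[treated_idx].reshape(-1, 1))
        matched_controls = control_idx[matches.ravel()]
        print(f"{len(treated_idx)} exposed matched to {len(matched_controls)} controls")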

  12. Is it patience or motivation? On motivational confounds in intertemporal choice tasks.

    PubMed

    Paglieri, Fabio; Addessi, Elsa; Sbaffi, Agnese; Tasselli, Maria Isabella; Delfino, Alexia

    2015-01-01

    Intertemporal choices create a tension between amount maximization, which would favor the larger and later option (LL), and delay minimization, which would promote the smaller and sooner reward (SS). Two common interpretations of intertemporal choice behavior are discussed: looking at LL responses as indicative of self-control, and using intertemporal choices to assess delay aversion. We argue that both interpretations need to take into account motivational confounds, in order to be warranted by data. In intertemporal choices with prepotent, salient stimuli (e.g., food amounts, typically used with nonhuman primates), LL responses could also be indicative of failed inhibition of a "go for more" impulsive response-the opposite of self-control. Similarly, intertemporal choices can be used to measure delay aversion only with respect to the subject's baseline motivation to maximize the reinforcer in question, and this baseline is not always assessed in current experimental protocols. This concern is especially crucial in comparing intertemporal choices across different groups or manipulation. We focus in particular on the effects of reward types on intertemporal choices, presenting two experimental studies where the difference in behavior with monetary versus food rewards is the product of different baseline motivation, rather than variations in delay aversion. We conclude discussing the implications of these and other similar recent findings, which are far-reaching.

  13. A causal examination of the effects of confounding factors on multimetric indices

    USGS Publications Warehouse

    Schoolmaster, Donald R.; Grace, James B.; Schweiger, E. William; Mitchell, Brian R.; Guntenspergen, Glenn R.

    2013-01-01

    The development of multimetric indices (MMIs) as a means of providing integrative measures of ecosystem condition is becoming widespread. An increasingly recognized problem for the interpretability of MMIs is controlling for the potentially confounding influences of environmental covariates. Most common approaches to handling covariates are based on simple notions of statistical control, leaving the causal implications of covariates and their adjustment unstated. In this paper, we use graphical models to examine some of the potential impacts of environmental covariates on the observed signals between human disturbance and potential response metrics. Using simulations based on various causal networks, we show how environmental covariates can both obscure and exaggerate the effects of human disturbance on individual metrics. We then examine from a causal interpretation standpoint the common practice of adjusting ecological metrics for environmental influences using only the set of sites deemed to be in reference condition. We present and examine the performance of an alternative approach to metric adjustment that uses the whole set of sites and models both environmental and human disturbance effects simultaneously. The findings from our analyses indicate that failing to model and adjust metrics can result in a systematic bias towards those metrics in which environmental covariates function to artificially strengthen the metric–disturbance relationship resulting in MMIs that do not accurately measure impacts of human disturbance. We also find that a “whole-set modeling approach” requires fewer assumptions and is more efficient with the given information than the more commonly applied “reference-set” approach.
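
    The contrast drawn here between reference-set and whole-set adjustment can be illustrated with a small simulation: fit the metric on the environmental covariate and the disturbance gradient jointly across all sites, then remove only the estimated environmental component. The names and coefficients are invented, not the authors' data.

        import numpy as np

        rng = np.random.default_rng(0)
        n_sites = 400
        elevation = rng.normal(size=n_sites)                 # environmental covariate
        disturbance = rng.uniform(0, 1, size=n_sites)        # human disturbance gradient
        metric = 2.0 * elevation - 3.0 * disturbance + rng.normal(size=n_sites)

        # joint fit on the whole set of sites (not just reference sites)
        X = np.column_stack([np.ones(n_sites), elevation, disturbance])
        beta = np.linalg.lstsq(X, metric, rcond=None)[0]
        adjusted_metric = metric - beta[1] * elevation       # strip environmental signal only

        r = np.corrcoef(adjusted_metric, disturbance)[0, 1]
        print("correlation of adjusted metric with disturbance:", round(r, 2))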

  14. Using aquatic macroinvertebrate species traits to build test batteries for sediment toxicity assessment: accounting for the diversity of potential biological responses to toxicants.

    PubMed

    Ducrot, Virginie; Usseglio-Polatera, Philippe; Péry, Alexandre R R; Mouthon, Jacques; Lafont, Michel; Roger, Marie-Claude; Garric, Jeanne; Férard, Jean-François

    2005-09-01

    An original species-selection method for the building of test batteries is presented. This method is based on the statistical analysis of the biological and ecological trait patterns of species. It has been applied to build a macroinvertebrate test battery for the assessment of sediment toxicity, which efficiently describes the diversity of benthic macroinvertebrate biological responses to toxicants in a large European lowland river. First, 109 potential representatives of benthic communities of European lowland rivers were selected from a list of 479 taxa, considering 11 biological traits accounting for the main routes of exposure to a sediment-bound toxicant and eight ecological traits providing an adequate description of habitat characteristics used by the taxa. Second, their biological and ecological trait patterns were compared using coinertia analysis. This comparison allowed the clustering of taxa into groups of organisms that exhibited similar life-history characteristics, physiological and behavioral features, and similar habitat use. Groups exhibited various sizes (7-35 taxa), taxonomic compositions, and biological and ecological features. Main differences among group characteristics concerned morphology, substrate preferendum and habitat utilization, nutritional features, maximal size, and life-history strategy. Third, the best representatives of the mean biological and ecological characteristics of each group were included in the test battery. The final selection was composed of Chironomus riparius (Insecta: Diptera), Branchiura sowerbyi (Oligochaeta: Tubificidae), Lumbriculus variegatus (Oligochaeta: Lumbriculidae), Valvata piscinalis (Gastropoda: Valvatidae), and Sericostoma personatum (Trichoptera: Sericostomatidae). This approach permitted the biological and ecological variety of the battery to be maximized. Because biological and ecological traits of taxa determine species sensitivity, such maximization should permit the battery to better account
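
    The grouping-and-representative step can be sketched generically. The snippet below substitutes PCA plus k-means for the coinertia analysis used in the paper and runs on random trait data, so it only illustrates the "cluster taxa by trait profile, then pick the most central member of each group" logic; the matrix dimensions loosely follow the abstract and everything else is invented.

```python
# Rough analogue of the trait-based grouping step on synthetic data.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
n_taxa, n_traits, n_groups = 109, 19, 5        # sizes loosely follow the abstract
traits = rng.normal(size=(n_taxa, n_traits))   # toy taxa-by-traits matrix

scores = PCA(n_components=4).fit_transform(traits)
km = KMeans(n_clusters=n_groups, n_init=10, random_state=0).fit(scores)

# Candidate representative of each group: the taxon closest to its centroid.
for g in range(n_groups):
    members = np.where(km.labels_ == g)[0]
    dist = np.linalg.norm(scores[members] - km.cluster_centers_[g], axis=1)
    print(f"group {g}: representative taxon index {members[np.argmin(dist)]}")
```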

  15. The Coming Accounting Crisis

    ERIC Educational Resources Information Center

    Eaton, Tim V.

    2007-01-01

    The accounting profession is facing a potential crisis not only from the overall shortage of accounting faculty driven by smaller numbers of new faculty entering the profession as many existing faculty retire but also from changes that have been less well documented. These include: (1) changes in attitude towards the roles of teaching, service and…

  16. The control outcome calibration approach for causal inference with unobserved confounding.

    PubMed

    Tchetgen Tchetgen, Eric

    2014-03-01

    Unobserved confounding can seldom be ruled out with certainty in nonexperimental studies. Negative controls are sometimes used in epidemiologic practice to detect the presence of unobserved confounding. An outcome is said to be a valid negative control variable to the extent that it is influenced by unobserved confounders of the exposure effects on the outcome in view, although not directly influenced by the exposure. Thus, a negative control outcome found to be empirically associated with the exposure after adjustment for observed confounders indicates that unobserved confounding may be present. In this paper, we go beyond the use of control outcomes to detect possible unobserved confounding and propose to use control outcomes in a simple but formal counterfactual-based approach to correct causal effect estimates for bias due to unobserved confounding. The proposed control outcome calibration approach is developed in the context of a continuous or binary outcome, and the control outcome and the exposure can be discrete or continuous. A sensitivity analysis technique is also developed, which can be used to assess the degree to which a violation of the main identifying assumption of the control outcome calibration approach might impact inference about the effect of the exposure on the outcome in view.
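
    The flavor of the correction can be conveyed with a deliberately simplified linear example, which is not the paper's calibration estimator: if the unmeasured confounder shifts the exposure coefficient for the outcome and for the negative control outcome by the same amount, the control-outcome coefficient can be subtracted off as a bias estimate. All variables and effect sizes below are invented, and the equal-confounding assumption is baked in by construction.

```python
# Simplified negative-control correction under an equal-confounding assumption.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(7)
n = 2000
u = rng.normal(size=n)                                   # unmeasured confounder
exposure = 0.8 * u + rng.normal(size=n)
outcome = 1.0 * exposure + 1.5 * u + rng.normal(size=n)  # true effect = 1.0
neg_control = 1.5 * u + rng.normal(size=n)               # unaffected by exposure

X = sm.add_constant(exposure)
beta_outcome = sm.OLS(outcome, X).fit().params[1]
beta_control = sm.OLS(neg_control, X).fit().params[1]    # pure confounding signal

print("naive estimate:", round(beta_outcome, 2))
print("control-calibrated estimate:", round(beta_outcome - beta_control, 2))
```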

  17. Measuring oxidative stress: the confounding effect of lipid concentration in measures of lipid peroxidation.

    PubMed

    Pérez-Rodríguez, Lorenzo; Romero-Haro, Ana A; Sternalski, Audrey; Muriel, Jaime; Mougeot, Francois; Gil, Diego; Alonso-Alvarez, Carlos

    2015-01-01

    Lipid peroxidation products are widely used as markers of oxidative damage in the organism. To properly interpret the information provided by these markers, it is necessary to know potential sources of bias and control confounding factors. Here, we investigated the relationship between two indicators of lipid mobilization (circulating levels of triglycerides and cholesterol) and two common markers of oxidative damage (plasma levels of malondialdehyde and hydroperoxides; the latter estimated from the d-ROMs assay kit). The following five avian species were studied: red-legged partridge (Alectoris rufa), zebra finch (Taeniopygia guttata), spotless starling (Sturnus unicolor), marsh harrier (Circus aeruginosus), and Montagu's harrier (Circus pygargus). In all cases, plasma triglyceride levels positively and significantly correlated with lipid peroxidation markers, explaining between 8% and 34% of their variability. Plasma cholesterol, in contrast, showed a significant positive relationship only among spotless starling nestlings and a marginally significant association in zebra finches. These results indicate that lipid peroxidation marker levels covary with circulating lipid levels. We discuss the potential causes and implications of this covariation and recommend that future studies that measure oxidative damage using lipid peroxidation markers report both raw and relative levels (i.e., corrected for circulating triglycerides). Whether the observed pattern also holds for other tissues and in other taxa deserves further research.

  18. Octopus visual system: a functional MRI model for detecting neuronal electric currents without a BOLD confound

    PubMed Central

    Jiang, Xia; Lu, Hanbing; Shigeno, Shuichi; Tan, Li-Hai; Yang, Yihong; Ragsdale, Clifton W.; Gao, Jia-Hong

    2014-01-01

    Purpose Despite the efforts that have been devoted to detecting the transient magnetic fields generated by neuronal firing, the conclusion that a functionally relevant signal can be measured with magnetic resonance imaging (MRI) is still controversial. For human studies of neuronal current MRI (nc-MRI), the blood-oxygen-level-dependent (BOLD) effect remains an irresolvable confound. For tissue studies where hemoglobin is removed, natural sensory stimulation is not possible. This study investigates the feasibility of detecting a physiologically induced nc-MRI signal in vivo in a BOLD-free environment. Methods The cephalopod mollusc Octopus bimaculoides has vertebrate-like eyes, large optic lobes (OLs) and blood that does not contain hemoglobin. Visually evoked potentials were measured in the octopus retina and OL by electroretinogram and local field potential. nc-MRI scans were conducted at 9.4 Tesla to capture these activities. Results Electrophysiological recording detected strong responses in the retina and OL in vivo; however, nc-MRI failed to demonstrate any statistically significant signal change with a detection threshold of 0.2° for phase and 0.2% for magnitude. Experiments in a dissected eye-OL preparation yielded similar results. Conclusion These findings in a large hemoglobin-free nervous system suggest that sensory evoked neuronal magnetic fields are too weak for direct detection with current MRI technology. PMID:24301336

  19. Choroidal abnormalities and masquerade syndromes confounding the diagnosis of laser-induced eye injuries

    NASA Astrophysics Data System (ADS)

    Hacker, Henry D.; Zwick, Harry; Brown, Jeremiah, Jr.; Dicks, Ronald; Cheramie, Rachel; Stuck, Bruce E.

    2005-04-01

    The diagnosis of a laser-induced eye injury occurring in occupational or military environments is often complicated by confounding symptoms, the possibility of pre-existing pathology, and/or a lack of visual deficits that can be clearly associated with a specific incident. Two recent cases are described that illustrate the importance of a thorough differential diagnosis when coexisting retinal pathologies are present with potentially different (e.g. laser or disease) etiologies. Indocyanine green angiography (ICG) and ocular coherence tomography (OCT) used in combination with standard ophthalmic imaging can provide helpful insights as to the etiology of these lesions. Vascular choroidal abnormalities such as hemangiomas or occult histoplasmosis infection can produce findings that can mimic the leakage that may be evident from neovascular membranes associated with laser injury. Further evaluation with OCT and conventional fluorescein angiography (FA) is helpful to look for the classic signature of retinal disruption and retinal pigment layer changes that are often present in association with laser injury. Furthermore, a careful situational assessment of a potential laser exposure is important to confirm the diagnosis of laser-induced eye injury.

  20. Evening Activities as a Potential Confound in Research on the Adrenocortical System in Children

    ERIC Educational Resources Information Center

    Kertes, Darlene A.; Gunnar, Megan R.

    2004-01-01

    The relation among children's evening activities, behavioral characteristics, and activity of the hypothalamic-pituitary-adrenocortical axis was assessed in normally developing children ages 7 to 10 years. Salivary cortisol at bedtime was compared on evenings when children had structured activities outside of the home with unstructured evenings at…

  1. Accounting Specialist.

    ERIC Educational Resources Information Center

    Ohio State Univ., Columbus. Center on Education and Training for Employment.

    This publication identifies 20 subjects appropriate for use in a competency list for the occupation of accounting specialist, 1 of 12 occupations within the business/computer technologies cluster. Each unit consists of a number of competencies; a list of competency builders is provided for each competency. Titles of the 20 units are as follows:…

  2. Painless Accountability.

    ERIC Educational Resources Information Center

    Brown, R. W.; And Others

    The computerized Painless Accountability System is a performance objective system from which instructional programs are developed. Three main simplified behavioral response levels characterize this system: (1) cognitive, (2) psychomotor, and (3) affective domains. Each of these objectives is classified by one of 16 descriptors. The second major…

  3. Accountability Overboard

    ERIC Educational Resources Information Center

    Chieppo, Charles D.; Gass, James T.

    2009-01-01

    This article reports that special interest groups opposed to charter schools and high-stakes testing have hijacked Massachusetts's once-independent board of education and stand poised to water down the Massachusetts Comprehensive Assessment System (MCAS) tests and the accountability system they support. President Barack Obama and Massachusetts…

  4. Vitamin D in Fibromyalgia: A Causative or Confounding Biological Interplay?

    PubMed Central

    Karras, Spyridon; Rapti, Eleni; Matsoukas, Stauros; Kotsa, Kalliopi

    2016-01-01

    Fibromyalgia (FM) is a chronic syndrome with an increasing prevalence, characterized by widespread musculoskeletal pain in combination with a variety of cognitive symptoms and fatigue. The scientific evidence that has accumulated during the last decades has significantly improved the understanding of the pathophysiology of the disease. However, current management of patients with FM remains a multidimensional approach including patient education, behavioral therapy, exercise, pain management, and relief of chronic symptoms, rather than the use of drug therapies based on the mechanisms of disease development. Vitamin D, a fat-soluble vitamin derived mainly from skin synthesis through ultraviolet radiation, has been recognized to manifest a plethora of extraskeletal actions, apart from its fundamental role in skeletal and calcium homeostasis, including modulation of cell growth, neuromuscular actions, and potential anti-inflammatory properties. Recent findings indicate that hypovitaminosis D is highly prevalent in patients with FM. Supplementation studies are limited so far, indicating potential beneficial effects on pain and disease severity; however, specific recommendations are lacking. This review aims to summarize and critically appraise data regarding the pathophysiological interplay between vitamin D and FM and the available results from observational and supplementation studies, with a clinical discourse on current knowledge gaps and the future research agenda. PMID:27271665

  5. Breast milk and cognitive development—the role of confounders: a systematic review

    PubMed Central

    Walfisch, Asnat; Sermer, Corey; Cressman, Alex; Koren, Gideon

    2013-01-01

    Objectives The association between breastfeeding and child cognitive development is contested, with studies reporting both positive and null effects. This relationship may be confounded by factors associated with breastfeeding, specifically maternal socioeconomic class and IQ. Design Systematic review of the literature. Setting and participants Any prospective or retrospective study, in any language, evaluating the association between breastfeeding and cognitive development using a validated method in healthy term infants, children or adults, was included. Primary and secondary outcome measures Extracted data included the study design, target population and sample size, breastfeeding exposure, cognitive development assessment tool used and participants’ age, summary of the results prior to, and following, adjustment for confounders, and all confounders adjusted for. Study quality was assessed as well. Results 84 studies met our inclusion criteria (34 rated as high quality, 26 moderate and 24 low quality). Critical assessment of accepted studies revealed the following associations: 21 null, 28 positive, 18 null after adjusting for confounders and 17 positive but diminished after adjusting for confounders. Directionality of effect did not correlate with study quality; however, studies showing a decreased effect after multivariate analysis were of superior quality compared with other study groupings (14/17 high quality, 82%). Further, studies that showed a null or diminished effect after multivariate analysis corrected for significantly more confounders (7.7±3.4) than those that found no change following adjustment (5.6±4.5, p=0.04). The majority of included studies were carried out during childhood (75%) and set in high-income countries (85.5%). Conclusions Much of the reported effect of breastfeeding on child neurodevelopment is due to confounding. It is unlikely that additional work will change the current synthesis. Future studies should attempt to rigorously

  6. From bad to worse: collider stratification amplifies confounding bias in the "obesity paradox".

    PubMed

    Banack, Hailey R; Kaufman, Jay S

    2015-10-01

    Smoking is often identified as a confounder of the obesity-mortality relationship. Selection bias can amplify the magnitude of an existing confounding bias. The objective of the present report is to demonstrate how confounding bias due to cigarette smoking is increased in the presence of collider stratification bias using an empirical example and directed acyclic graphs. The empirical example uses data from the Atherosclerosis Risk in Communities (ARIC) study, a prospective cohort study of 15,792 men and women in the United States. Poisson regression models were used to examine the confounding effect of smoking. In the total ARIC study population, smoking produced a confounding bias of <3 percentage points. This result was obtained by comparing the incidence rate ratio (IRR) for obesity from a model adjusted for smoking (1.07; 95% CI 1.00, 1.15) with that from a model that did not adjust for smoking (1.10; 95% CI 1.03, 1.18). However, among smokers with CVD, the obesity IRR was 0.89 (95% CI 0.81, 0.99), while among non-smokers with CVD the obesity IRR was 1.20 (95% CI 1.03, 1.41). The empirical and graphical explanations presented suggest that the magnitude of the confounding bias induced by smoking is greater in the presence of collider stratification bias.
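
    A small simulation makes the mechanism concrete. The effect sizes below are invented and only reproduce the direction of the bias, not the ARIC estimates: smoking lowers obesity prevalence and raises mortality, CVD is a collider affected by both smoking and obesity, and restricting the unadjusted analysis to CVD cases pushes the obesity rate ratio well below its true value.

```python
# Toy demonstration of collider stratification amplifying confounding by smoking.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(42)
n = 200_000
smoking = rng.binomial(1, 0.3, n)
obesity = rng.binomial(1, 0.25 - 0.10 * smoking)               # smokers weigh less
cvd = rng.binomial(1, 0.05 + 0.15 * smoking + 0.15 * obesity)  # collider
rate = np.exp(-3.0 + 0.10 * obesity + 0.70 * smoking)          # true obesity log-RR = 0.10
deaths = rng.poisson(rate)

def obesity_irr(mask):
    X = sm.add_constant(obesity[mask])
    fit = sm.GLM(deaths[mask], X, family=sm.families.Poisson()).fit()
    return np.exp(fit.params[1])

print("all subjects, unadjusted IRR:", round(obesity_irr(np.ones(n, bool)), 2))
print("CVD cases only, unadjusted IRR:", round(obesity_irr(cvd == 1), 2))
```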

  7. Structural equation modeling versus marginal structural modeling for assessing mediation in the presence of posttreatment confounding.

    PubMed

    Moerkerke, Beatrijs; Loeys, Tom; Vansteelandt, Stijn

    2015-06-01

    Inverse probability weighting for marginal structural models has been suggested as a strategy to estimate the direct effect of a treatment or exposure on an outcome in studies where the effect of mediator on outcome is subject to posttreatment confounding. This type of confounding, whereby confounders of the effect of mediator on outcome are themselves affected by the exposure, complicates mediation analyses and necessitates apt analysis strategies. In this article, we contrast the inverse probability weighting approach with the traditional path analysis approach to mediation analysis. We show that in a particular class of linear models, adjustment for posttreatment confounding can be realized via a fairly standard modification of the traditional path analysis approach. The resulting approach is simpler; by avoiding inverse probability weighting, it moreover results in direct effect estimators with smaller finite sample bias and greater precision. We further show that a particular variant of the G-estimation approach from the causal inference literature is equivalent with the path analysis approach in simple linear settings but is more generally applicable in settings with interactions and/or noncontinuous mediators and confounders. We conclude that the use of inverse probability weighting for marginal structural models to adjust for posttreatment confounding in mediation analysis is primarily indicated in nonlinear models for the outcome.
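
    For reference, a minimal sketch of the weighting route that the paper contrasts with path analysis: stabilized inverse probability weights for the mediator remove confounding by a posttreatment variable L, and a weighted regression of the outcome on treatment and mediator then targets the controlled direct effect. The data-generating values are invented; with the coefficients below the true controlled direct effect is 1.56 (a direct path of 1.0 plus 0.56 through L).

```python
# Inverse-probability-weighted marginal structural model for a controlled
# direct effect with a posttreatment confounder L (simulated data).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 5000
t = rng.binomial(1, 0.5, n)                          # randomized treatment
l = 0.8 * t + rng.normal(size=n)                     # posttreatment confounder
m = rng.binomial(1, 1 / (1 + np.exp(-(0.5 * t + 0.8 * l))))    # mediator
y = 1.0 * t + 0.5 * m + 0.7 * l + rng.normal(size=n)           # CDE = 1.56

# Stabilized weights for the mediator: P(M | T) / P(M | T, L).
num = sm.GLM(m, sm.add_constant(t), family=sm.families.Binomial()).fit()
den = sm.GLM(m, sm.add_constant(np.column_stack([t, l])),
             family=sm.families.Binomial()).fit()
p_num = np.where(m == 1, num.fittedvalues, 1 - num.fittedvalues)
p_den = np.where(m == 1, den.fittedvalues, 1 - den.fittedvalues)
w = p_num / p_den

# Weighted outcome model of treatment and mediator only (L is left out).
X = sm.add_constant(np.column_stack([t, m]))
msm = sm.WLS(y, X, weights=w).fit()
print("estimated controlled direct effect (approx. 1.56):", round(msm.params[1], 2))
```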

  8. Do digestive contents confound body mass as a measure of relative condition in nestling songbirds?

    USGS Publications Warehouse

    Streby, Henry M.; Peterson, Sean M.; Lehman, Justin A.; Kramer, Gunnar R.; Vernasco, Ben J.; Andersen, David E.

    2014-01-01

    Relative nestling condition, typically measured as nestling mass or as an index including nestling mass, is commonly purported to correlate with fledgling songbird survival. However, most studies directly investigating fledgling survival have found no such relationship. We weighed feces and stomach contents of nestling golden-winged warblers (Vermivora chrysoptera) to investigate the potential contribution of variation in digestive contents to differences in nestling mass. We estimated that the mass of a seventh-day (near fledging) nestling golden-winged warbler varies by 0.65 g (approx. 9% of mean nestling mass) depending on the contents of the nestling's digestive system at the time of weighing, and that digestive contents are dissimilar among nestlings at any moment the brood is removed from the nest for weighing. Our conservative estimate of within-individual variation in digestive contents equals 72% and 24% of the mean within-brood and population-wide range in nestling mass, respectively. Based on our results, a substantive but typically unknown amount of the variation in body mass among nestlings is confounded by differences in digestive contents. We conclude that short-term variation in digestive contents likely precludes the use of body mass, and therefore any mass-dependent index, as a measure of relative nestling condition or as a predictor of survival in golden-winged warblers and likely in many other songbirds of similar size.

  9. Detection rates of geckos in visual surveys: Turning confounding variables into useful knowledge

    USGS Publications Warehouse

    Lardner, Bjorn; Rodda, Gordon H.; Yackel Adams, Amy A.; Savidge, Julie A.; Reed, Robert N.

    2016-01-01

    Transect surveys without some means of estimating detection probabilities generate population size indices prone to bias because survey conditions differ in time and space. Knowing what causes such bias can help guide the collection of relevant survey covariates, correct the survey data, anticipate situations where bias might be unacceptably large, and elucidate the ecology of target species. We used negative binomial regression to evaluate confounding variables for gecko (primarily Hemidactylus frenatus and Lepidodactylus lugubris) counts on 220-m-long transects surveyed at night, primarily for snakes, on 9,475 occasions. Searchers differed in gecko detection rates by up to a factor of six. The worst and best headlamps differed by a factor of at least two. Strong winds had a negative effect potentially as large as those of searchers or headlamps. More geckos were seen during wet weather conditions, but the effect size was small. Compared with a detection nadir during waxing gibbous (nearly full) moons above the horizon, we saw 28% more geckos during waning crescent moons below the horizon. A sine function suggested that we saw 24% more geckos at the end of the wet season than at the end of the dry season. Fluctuations on a longer timescale were also verified. Disturbingly, corrected data exhibited strong short-term fluctuations that covariates apparently failed to capture. Although some biases can be addressed with measured covariates, others will be difficult to eliminate as a significant source of error in long-term monitoring programs.
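
    A hedged sketch of this kind of count model follows: a negative binomial regression of transect counts on survey covariates, with a sine/cosine pair carrying the seasonal cycle. The covariates, effect sizes, and dispersion below are invented and do not come from the study's data.

```python
# Negative binomial regression of simulated transect counts on survey covariates.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(5)
n = 9475
searcher = rng.integers(0, 6, n)                   # six searchers
wind = rng.uniform(0, 1, n)
day = rng.uniform(0, 365, n)
season_sin = np.sin(2 * np.pi * day / 365)
season_cos = np.cos(2 * np.pi * day / 365)

searcher_effect = np.linspace(0, np.log(6), 6)[searcher]   # up to six-fold spread
mu = np.exp(2.0 + searcher_effect - 0.6 * wind + 0.2 * season_sin)
counts = rng.negative_binomial(n=5, p=5 / (5 + mu))        # overdispersed counts

X = sm.add_constant(np.column_stack(
    [np.eye(6)[searcher][:, 1:], wind, season_sin, season_cos]))
nb_fit = sm.GLM(counts, X, family=sm.families.NegativeBinomial(alpha=0.2)).fit()
print(nb_fit.params.round(2))                      # detection covariate effects
```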

  10. Statistically Controlling for Confounding Constructs Is Harder than You Think

    PubMed Central

    Westfall, Jacob; Yarkoni, Tal

    2016-01-01

    Social scientists often seek to demonstrate that a construct has incremental validity over and above other related constructs. However, these claims are typically supported by measurement-level models that fail to consider the effects of measurement (un)reliability. We use intuitive examples, Monte Carlo simulations, and a novel analytical framework to demonstrate that common strategies for establishing incremental construct validity using multiple regression analysis exhibit extremely high Type I error rates under parameter regimes common in many psychological domains. Counterintuitively, we find that error rates are highest—in some cases approaching 100%—when sample sizes are large and reliability is moderate. Our findings suggest that a potentially large proportion of incremental validity claims made in the literature are spurious. We present a web application (http://jakewestfall.org/ivy/) that readers can use to explore the statistical properties of these and other incremental validity arguments. We conclude by reviewing SEM-based statistical approaches that appropriately control the Type I error rate when attempting to establish incremental validity. PMID:27031707
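
    The core claim is easy to reproduce in a few lines. In the simulation below (assumed parameter values, not the authors'), construct A truly predicts the outcome, correlated construct B does not, and both are observed with reliability 0.7; regressing the outcome on both noisy measures declares "incremental validity" for B far more often than the nominal 5%.

```python
# Monte Carlo illustration of spurious incremental validity under unreliability.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(11)
n, n_sims, rel = 2000, 500, 0.7
false_positives = 0
for _ in range(n_sims):
    a = rng.normal(size=n)                                    # true predictor
    b = 0.8 * a + np.sqrt(1 - 0.8 ** 2) * rng.normal(size=n)  # correlated construct
    y = a + rng.normal(size=n)                                # only A affects y
    a_obs = np.sqrt(rel) * a + np.sqrt(1 - rel) * rng.normal(size=n)
    b_obs = np.sqrt(rel) * b + np.sqrt(1 - rel) * rng.normal(size=n)
    fit = sm.OLS(y, sm.add_constant(np.column_stack([a_obs, b_obs]))).fit()
    false_positives += fit.pvalues[2] < 0.05                  # "effect" of B
print("Type I error rate for B:", false_positives / n_sims)
```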

  11. Confounders in interpreting pathology for safety and risk assessment

    SciTech Connect

    Wolf, Douglas C. (E-mail: wolf.doug@epa.gov); Mann, Peter C.

    2005-02-01

    The contribution of pathology to toxicity assessment is invaluable but often not clearly understood. Pathology endpoints are the central response around which human health risk assessment is frequently determined; therefore, it is important that the general toxicology community understand current concepts and nomenclature of toxicologic pathology. Toxicologic pathology encompasses the study of changes in tissue morphology that help define the risk of exposure to xenobiotics. Toxicologic pathology is a discipline that has changed and adapted over time including methods of analysis and nomenclature of lesions. As risk assessments are updated for chemicals in commerce, frequently the older literature must be reviewed and reevaluated. When interpreting pathology data from animal studies, it is important to consider the biological significance of a lesion as well as its relationship to the ultimate adverse health effect. Assessing the potential for a chemical to cause harm to humans must include the examination of the entire pathology database in context of the study design, the mode of action of the chemical of concern, and using the most current interpretation of a lesion to determine the significance for human health effects of a particular tissue response.

  12. Confounders in interpreting pathology for safety and risk assessment.

    PubMed

    Wolf, Douglas C; Mann, Peter C

    2005-02-01

    The contribution of pathology to toxicity assessment is invaluable but often not clearly understood. Pathology endpoints are the central response around which human health risk assessment is frequently determined; therefore, it is important that the general toxicology community understand current concepts and nomenclature of toxicologic pathology. Toxicologic pathology encompasses the study of changes in tissue morphology that help define the risk of exposure to xenobiotics. Toxicologic pathology is a discipline that has changed and adapted over time including methods of analysis and nomenclature of lesions. As risk assessments are updated for chemicals in commerce, frequently the older literature must be reviewed and reevaluated. When interpreting pathology data from animal studies, it is important to consider the biological significance of a lesion as well as its relationship to the ultimate adverse health effect. Assessing the potential for a chemical to cause harm to humans must include the examination of the entire pathology database in context of the study design, the mode of action of the chemical of concern, and using the most current interpretation of a lesion to determine the significance for human health effects of a particular tissue response.

  13. Using instrumental variables to disentangle treatment and placebo effects in blinded and unblinded randomized clinical trials influenced by unmeasured confounders

    PubMed Central

    Chaibub Neto, Elias

    2016-01-01

    Clinical trials traditionally employ blinding as a design mechanism to reduce the influence of placebo effects. In practice, however, it can be difficult or impossible to blind study participants and unblinded trials are common in medical research. Here we show how instrumental variables can be used to quantify and disentangle treatment and placebo effects in randomized clinical trials comparing control and active treatments in the presence of confounders. The key idea is to use randomization to separately manipulate treatment assignment and psychological encouragement conversations/interactions that increase the participants’ desire for improved symptoms. The proposed approach is able to improve the estimation of treatment effects in blinded studies and, most importantly, opens the doors to account for placebo effects in unblinded trials. PMID:27869205
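
    The logic can be sketched as an ordinary two-stage least squares fit on simulated data. Here treatment assignment and an encouragement intervention are both randomized and serve as instruments for received treatment and for expectation (the placebo pathway); the structural coefficients and full compliance are assumptions of the toy example, not the authors' model.

```python
# Manual two-stage least squares separating treatment and expectation effects.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(21)
n = 10_000
z_treat = rng.binomial(1, 0.5, n)                  # randomized assignment
z_enc = rng.binomial(1, 0.5, n)                    # randomized encouragement
u = rng.normal(size=n)                             # unmeasured confounder
treat = z_treat                                    # full compliance, for simplicity
expect = 0.5 * z_enc + 0.3 * treat + 0.4 * u + rng.normal(size=n)
y = 1.0 * treat + 0.8 * expect + 0.6 * u + rng.normal(size=n)

# Stage 1: project the endogenous regressors on the instruments.
Z = sm.add_constant(np.column_stack([z_treat, z_enc]))
treat_hat = sm.OLS(treat, Z).fit().fittedvalues
expect_hat = sm.OLS(expect, Z).fit().fittedvalues

# Stage 2: regress the outcome on the fitted values.
stage2 = sm.OLS(y, sm.add_constant(np.column_stack([treat_hat, expect_hat]))).fit()
print("treatment effect (true 1.0):", round(stage2.params[1], 2))
print("placebo/expectation effect (true 0.8):", round(stage2.params[2], 2))
```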

  14. Using instrumental variables to disentangle treatment and placebo effects in blinded and unblinded randomized clinical trials influenced by unmeasured confounders

    NASA Astrophysics Data System (ADS)

    Chaibub Neto, Elias

    2016-11-01

    Clinical trials traditionally employ blinding as a design mechanism to reduce the influence of placebo effects. In practice, however, it can be difficult or impossible to blind study participants and unblinded trials are common in medical research. Here we show how instrumental variables can be used to quantify and disentangle treatment and placebo effects in randomized clinical trials comparing control and active treatments in the presence of confounders. The key idea is to use randomization to separately manipulate treatment assignment and psychological encouragement conversations/interactions that increase the participants’ desire for improved symptoms. The proposed approach is able to improve the estimation of treatment effects in blinded studies and, most importantly, opens the doors to account for placebo effects in unblinded trials.

  15. Examination of the Relationship between Oral Health and Arterial Sclerosis without Genetic Confounding through the Study of Older Japanese Twins

    PubMed Central

    Kurushima, Yuko; Ikebe, Kazunori; Matsuda, Ken-ichi; Enoki, Kaori; Ogata, Soshiro; Yamashita, Motozo; Murakami, Shinya; Maeda, Yoshinobu

    2015-01-01

    Objective Although researchers have recently demonstrated a relationship between oral health and arterial sclerosis, the genetic contribution to this relationship has been ignored even though genetic factors are expected to have some effect on various diseases. The aim of this study was to evaluate oral health as a significant risk factor related to arterial sclerosis after eliminating genetic confounding through study of older Japanese twins. Subjects and Methods Medical and dental surveys were conducted individually for 106 Japanese twin pairs over the age of 50 years. Maximal carotid intima-media thickness (IMT-Cmax) was measured as a surrogate marker of arterial sclerosis. IMT-Cmax > 1.0 mm was diagnosed as arterial sclerosis. All of the twins were examined for the number of remaining teeth, masticatory performance, and periodontal status. We evaluated each measurement related to IMT-Cmax and arterial sclerosis using generalized estimating equations analysis adjusted for potential risk factors. For non-smoking monozygotic twins, a regression analysis using a “between-within” model was conducted to evaluate the relationship between IMT-Cmax and the number of teeth as the environmental factor controlling genetic and familial confounding. Results We examined 91 monozygotic and 15 dizygotic twin pairs (males: 42, females: 64) with a mean (± standard deviation) age of 67.4 ± 10.0 years. Out of all of the oral health-related measurements collected, only the number of teeth was significantly related to arterial sclerosis (odds ratio: 0.72, 95% confidence interval: 0.52-0.99 per five teeth). Regression analysis showed a significant association between the IMT-Cmax and the number of teeth as an environmental factor (p = 0.037). Conclusions Analysis of monozygotic twins older than 50 years of age showed that having fewer teeth could be a significant environmental factor related to arterial sclerosis, even after controlling for genetic and familial confounding

  16. Adjustments for Unmeasured Confounders in Pharmacoepidemiologic Database Studies Using External Information

    PubMed Central

    Stürmer, Til; Glynn, Robert J; Rothman, Kenneth J; Avorn, Jerry; Schneeweiss, Sebastian

    2008-01-01

    Background Non-experimental studies of drug effects in large automated databases can provide timely assessment of real-life drug use, but are prone to confounding by variables that are not contained in these databases and thus cannot be controlled. Objectives To describe how information on additional confounders from validation studies can help address the problem of unmeasured confounding in the main study. Research Design Review types of validation studies that allow adjustment for unmeasured confounding and illustrate these with an example. Subjects: Main study New Jersey residents 65 years or older hospitalized between 1995 and 1997, who filled prescriptions within Medicaid or a pharmaceutical assistance program. Validation study: representative sample of Medicare beneficiaries. Measures Association between nonsteroidal anti-inflammatory drugs (NSAIDs) and mortality. Results Validation studies are categorized as internal (ie, additional information is collected on participants of the main study) or external. Availability of information on disease outcome will affect choice of analytic strategies. Using an external validation study without data on disease outcome to adjust for unmeasured confounding, propensity score calibration (PSC) leads to a plausible estimate of the association between NSAIDs and mortality in the elderly, if the biases caused by measured and unmeasured confounders go in the same direction. Conclusions Estimates of drug effects can be adjusted for confounders that are not available in the main but can be measured in a validation study. PSC uses validation data without information on disease outcome under a strong assumption. The collection and integration of validation data in pharmacoepidemiology should be encouraged. PMID:17909375

  17. On a preference-based instrumental variable approach in reducing unmeasured confounding-by-indication.

    PubMed

    Li, Yun; Lee, Yoonseok; Wolfe, Robert A; Morgenstern, Hal; Zhang, Jinyao; Port, Friedrich K; Robinson, Bruce M

    2015-03-30

    Treatment preferences of groups (e.g., clinical centers) have often been proposed as instruments to control for unmeasured confounding-by-indication in instrumental variable (IV) analyses. However, formal evaluations of these group-preference-based instruments are lacking. Unique challenges include the following: (i) correlations between outcomes within groups; (ii) the multi-value nature of the instruments; (iii) unmeasured confounding occurring between and within groups. We introduce the framework of between-group and within-group confounding to assess assumptions required for the group-preference-based IV analyses. Our work illustrates that, when unmeasured confounding effects exist only within groups but not between groups, preference-based IVs can satisfy assumptions required for valid instruments. We then derive a closed-form expression of asymptotic bias of the two-stage generalized ordinary least squares estimator when the IVs are valid. Simulations demonstrate that the asymptotic bias formula approximates bias in finite samples quite well, particularly when the number of groups is moderate to large. The bias formula shows that when the cluster size is finite, the IV estimator is asymptotically biased; only when both the number of groups and cluster size go to infinity, the bias disappears. However, the IV estimator remains advantageous in reducing bias from confounding-by-indication. The bias assessment provides practical guidance for preference-based IV analyses. To increase their performance, one should adjust for as many measured confounders as possible, consider groups that have the most random variation in treatment assignment and increase cluster size. To minimize the likelihood for these IVs to be invalid, one should minimize unmeasured between-group confounding.
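
    A common way to build such an instrument is a leave-one-out group treatment rate, sketched below on simulated data in which the only unmeasured confounding is within groups (the setting the authors identify as favorable). Variable names and effect sizes are invented; the point is only that the preference-based IV estimate moves back toward the true effect relative to naive regression.

```python
# Leave-one-out center treatment rate as a preference-based instrument.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(13)
n_centers, per_center = 100, 50
center = np.repeat(np.arange(n_centers), per_center)
pref = rng.normal(0, 1, n_centers)[center]          # center prescribing preference
u = rng.normal(size=center.size)                    # within-group confounder
treat = rng.binomial(1, 1 / (1 + np.exp(-(pref + 0.8 * u))))
y = 1.0 * treat + 1.0 * u + rng.normal(size=center.size)    # true effect = 1.0

sums = np.bincount(center, weights=treat)
iv = (sums[center] - treat) / (per_center - 1)      # leave-one-out preference proxy

stage1 = sm.OLS(treat, sm.add_constant(iv)).fit()
stage2 = sm.OLS(y, sm.add_constant(stage1.fittedvalues)).fit()
naive = sm.OLS(y, sm.add_constant(treat)).fit()
print("naive OLS estimate:", round(naive.params[1], 2))
print("preference-IV estimate:", round(stage2.params[1], 2))
```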

  18. Effects of categorization method, regression type, and variable distribution on the inflation of Type-I error rate when categorizing a confounding variable.

    PubMed

    Barnwell-Ménard, Jean-Louis; Li, Qing; Cohen, Alan A

    2015-03-15

    The loss of signal associated with categorizing a continuous variable is well known, and previous studies have demonstrated that this can lead to an inflation of Type-I error when the categorized variable is a confounder in a regression analysis estimating the effect of an exposure on an outcome. However, it is not known how the Type-I error may vary under different circumstances, including logistic versus linear regression, different distributions of the confounder, and different categorization methods. Here, we analytically quantified the effect of categorization and then performed a series of 9600 Monte Carlo simulations to estimate the Type-I error inflation associated with categorization of a confounder under different regression scenarios. We show that Type-I error is unacceptably high (>10% in most scenarios and often 100%). The only exception was when the variable categorized was a continuous mixture proxy for a genuinely dichotomous latent variable, where both the continuous proxy and the categorized variable are error-ridden proxies for the dichotomous latent variable. As expected, error inflation was also higher with larger sample size, fewer categories, and stronger associations between the confounder and the exposure or outcome. We provide online tools that can help researchers estimate the potential error inflation and understand how serious a problem this is.
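
    The basic phenomenon is easy to reproduce: in the simulation below (assumed effect sizes, linear regression only), the exposure has no effect on the outcome, but adjusting for a quartile-categorized version of the continuous confounder leaves enough residual confounding that the exposure tests "significant" far more than 5% of the time.

```python
# Type-I error inflation from categorizing a continuous confounder.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(8)
n, n_sims, alpha = 1000, 500, 0.05
rejections = 0
for _ in range(n_sims):
    c = rng.normal(size=n)                          # continuous confounder
    x = 0.7 * c + rng.normal(size=n)                # exposure, no true effect on y
    y = 1.0 * c + rng.normal(size=n)
    quartile = np.digitize(c, np.quantile(c, [0.25, 0.5, 0.75]))
    dummies = np.eye(4)[quartile][:, 1:]            # quartile indicator variables
    fit = sm.OLS(y, sm.add_constant(np.column_stack([x, dummies]))).fit()
    rejections += fit.pvalues[1] < alpha            # test on the exposure
print("Type I error with categorized confounder:", rejections / n_sims)
```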

  19. Comparison of statistical approaches dealing with time-dependent confounding in drug effectiveness studies.

    PubMed

    Karim, Mohammad Ehsanul; Petkau, John; Gustafson, Paul; Platt, Robert W; Tremlett, Helen

    2016-09-21

    In longitudinal studies, if the time-dependent covariates are affected by the past treatment, time-dependent confounding may be present. For a time-to-event response, marginal structural Cox models are frequently used to deal with such confounding. To avoid some of the problems of fitting a marginal structural Cox model, the sequential Cox approach has been suggested as an alternative. Although the estimation mechanisms are different, both approaches claim to estimate the causal effect of treatment by appropriately adjusting for time-dependent confounding. We carry out simulation studies to assess the suitability of the sequential Cox approach for analyzing time-to-event data in the presence of a time-dependent covariate that may or may not be a time-dependent confounder. Results from these simulations revealed that the sequential Cox approach is not as effective as the marginal structural Cox model in addressing time-dependent confounding. The sequential Cox approach was also found to be inadequate in the presence of a time-dependent covariate. We propose a modified version of the sequential Cox approach that correctly estimates the treatment effect in both of the above scenarios. All approaches are applied to investigate the impact of beta-interferon treatment in delaying disability progression in the British Columbia Multiple Sclerosis cohort (1995-2008).

  20. Assessment and indirect adjustment for confounding by smoking in cohort studies using relative hazards models.

    PubMed

    Richardson, David B; Laurier, Dominique; Schubauer-Berigan, Mary K; Tchetgen Tchetgen, Eric; Cole, Stephen R

    2014-11-01

    Workers' smoking histories are not measured in many occupational cohort studies. Here we discuss the use of negative control outcomes to detect and adjust for confounding in analyses that lack information on smoking. We clarify the assumptions necessary to detect confounding by smoking and the additional assumptions necessary to indirectly adjust for such bias. We illustrate these methods using data from 2 studies of radiation and lung cancer: the Colorado Plateau cohort study (1950-2005) of underground uranium miners (in which smoking was measured) and a French cohort study (1950-2004) of nuclear industry workers (in which smoking was unmeasured). A cause-specific relative hazards model is proposed for estimation of indirectly adjusted associations. Among the miners, the proposed method suggests no confounding by smoking of the association between radon and lung cancer--a conclusion supported by adjustment for measured smoking. Among the nuclear workers, the proposed method suggests substantial confounding by smoking of the association between radiation and lung cancer. Indirect adjustment for confounding by smoking resulted in an 18% decrease in the adjusted estimated hazard ratio, yet this cannot be verified because smoking was unmeasured. Assumptions underlying this method are described, and a cause-specific proportional hazards model that allows easy implementation using standard software is presented.

  1. The "Dry-Run" Analysis: A Method for Evaluating Risk Scores for Confounding Control.

    PubMed

    Wyss, Richard; Hansen, Ben B; Ellis, Alan R; Gagne, Joshua J; Desai, Rishi J; Glynn, Robert J; Stürmer, Til

    2017-03-06

    A propensity score (PS) model's ability to control confounding can be assessed by evaluating covariate balance across exposure groups after PS adjustment. The optimal strategy for evaluating a disease risk score (DRS) model's ability to control confounding is less clear. DRS models cannot be evaluated through balance checks within the full population, and they are usually assessed through prediction diagnostics and goodness-of-fit tests. A proposed alternative is the "dry-run" analysis, which divides the unexposed population into "pseudo-exposed" and "pseudo-unexposed" groups so that differences on observed covariates resemble differences between the actual exposed and unexposed populations. With no exposure effect separating the pseudo-exposed and pseudo-unexposed groups, a DRS model is evaluated by its ability to retrieve an unconfounded null estimate after adjustment in this pseudo-population. We used simulations and an empirical example to compare traditional DRS performance metrics with the dry-run validation. In simulations, the dry run often improved assessment of confounding control, compared with the C statistic and goodness-of-fit tests. In the empirical example, PS and DRS matching gave similar results and showed good performance in terms of covariate balance (PS matching) and controlling confounding in the dry-run analysis (DRS matching). The dry-run analysis may prove useful in evaluating confounding control through DRS models.
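
    One rough way to set up such a check is sketched below on simulated data: the DRS is fitted among the unexposed, a pseudo-exposure is drawn within the unexposed from an estimated exposure model so that pseudo-groups differ on covariates roughly as the real groups do, and a DRS-adjusted comparison of the pseudo-groups should return an approximately null estimate. The splitting rule and all variable names are assumptions for illustration, not the authors' exact procedure.

```python
# Sketch of a dry-run check for a disease risk score (DRS) model.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 20_000
c1, c2 = rng.normal(size=n), rng.normal(size=n)
exposure = rng.binomial(1, 1 / (1 + np.exp(-(0.8 * c1 + 0.5 * c2))))
y = rng.binomial(1, 1 / (1 + np.exp(-(-2 + 0.7 * c1 + 0.4 * c2 + 0.3 * exposure))))

C = sm.add_constant(np.column_stack([c1, c2]))
unexp = exposure == 0

# DRS fitted among the unexposed only (kept on the linear-predictor scale).
drs_fit = sm.GLM(y[unexp], C[unexp], family=sm.families.Binomial()).fit()
drs = C @ drs_fit.params

# Pseudo-exposure among the unexposed, drawn from an estimated exposure model.
ps = sm.GLM(exposure, C, family=sm.families.Binomial()).fit().predict(C)
pseudo_exposure = rng.binomial(1, ps[unexp])

# Dry-run estimate: DRS-adjusted "effect" of pseudo-exposure should be near 0.
X = sm.add_constant(np.column_stack([pseudo_exposure, drs[unexp]]))
dry_run = sm.GLM(y[unexp], X, family=sm.families.Binomial()).fit()
print("dry-run log odds ratio (should be near 0):", round(dry_run.params[1], 2))
```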

  2. "Toward a clearer definition of confounding" revisited with directed acyclic graphs.

    PubMed

    Howards, Penelope P; Schisterman, Enrique F; Poole, Charles; Kaufman, Jay S; Weinberg, Clarice R

    2012-09-15

    In a 1993 paper (Am J Epidemiol. 1993;137(1):1-8), Weinberg considered whether a variable that is associated with the outcome and is affected by exposure but is not an intermediate variable between exposure and outcome should be considered a confounder in etiologic studies. As an example, she examined the common practice of adjusting for history of spontaneous abortion when estimating the effect of an exposure on the risk of spontaneous abortion. She showed algebraically that such an adjustment could substantially bias the results even though history of spontaneous abortion would meet some definitions of a confounder. Directed acyclic graphs (DAGs) were introduced into epidemiology several years later as a tool with which to identify confounders. The authors now revisit Weinberg's paper using DAGs to represent scenarios that arise from her original assumptions. DAG theory is consistent with Weinberg's finding that adjusting for history of spontaneous abortion introduces bias in her original scenario. In the authors' examples, treating history of spontaneous abortion as a confounder introduces bias if it is a descendant of the exposure and is associated with the outcome conditional on exposure or is a child of a collider on a relevant undirected path. Thoughtful DAG analyses require clear research questions but are easily modified for examining different causal assumptions that may affect confounder assessment.

  3. Teacher Quality at the High-School Level: The Importance of Accounting for Tracks. Working Paper 17722. Revised

    ERIC Educational Resources Information Center

    Jackson, C. Kirabo

    2013-01-01

    Unlike in elementary school, high-school teacher effects may be confounded with both selection to tracks and unobserved track-level treatments. I document sizable confounding track effects, and show that traditional tests for the existence of teacher effects are likely biased. After accounting for these biases, high-school algebra and English…

  4. Preformed biomarkers including dialkylphosphates (DAPs) in produce may confound biomonitoring in pesticide exposure and risk assessment.

    PubMed

    Chen, Li; Zhao, Taifeng; Pan, Canping; Ross, John H; Krieger, Robert I

    2012-09-12

    Low levels of pesticides and their metabolites/degradates occur in produce when pesticides are used in conventional or organic crop protection. Human dietary and nonoccupational urine biomonitoring studies may be confounded by preformed pesticide biomarkers in the diet. The extent of formation of putative urine biomarkers, including malathion specific (MMA, MDA; malathion mono- and diacids), organophosphorus generic (DMP, DMTP, DMDTP; dimethyl-, dimethylthio-, and dimethydithiophosphate), pyrethroid generic (3-PBA; 3-phenoxybenzoic acid), and captan-specific metabolites (THPI; tetrahydrophthalimide), was measured in produce samples containing the parent pesticide. Every produce sample of 19 types of fruits and vegetables contained biomarkers of potential human exposure. A total of 134 of 157 (85%) samples contained more molar equivalent biomarkers than parent pesticide. Malathion and fenpropathrin were sprayed (1 lb/A), and the time-dependent formation of pesticide biomarkers in strawberries was investigated under field conditions typical of commercial production in California. Malathion and fenpropathrin residues were always below established residue tolerances. Malathion, MMA, and MDA dissipated, while DMP, DMTP, and DMDTP increased, during a 20 day study period following the preharvest interval. The mole ratios of biomarkers/(malathion + malaoxon) were always greater than 1 and increased from day 4 to day 23 postapplication. Fenpropathrin and 3-PBA also dissipated in strawberries during each monitoring period. The mole ratios of 3-PBA/fenpropathrin were always less than 1 and decreased from day 4 to day 14. The absorption of pesticide biomarkers in produce and excretion in urine would falsely indicate consumer pesticide exposure if used to reconstruct dose for risk characterization.

  5. Nonradiographic axial spondyloarthritis background and confounding factors of this new terminology: an appraisal.

    PubMed

    Erbil, Jen; Espinoza, Luis R

    2015-03-01

    The advent of biologic therapy in the treatment of rheumatic diseases has intensified the need to further define and characterize spondyloarthritis (SpA). There has been a long debate over nomenclature of the SpA subtypes. There are those who are considered "lumpers," favoring the notion that different entities of the SpA groups are manifestations of the same disease, and "splitters," those who believe the different SpA groups represent separate diseases with shared clinical features. The influential work by Moll et al. has led to separation of entities and recognition of etiological processes of SpA subtypes. Among these subtypes has emerged nonradiographic axial spondyloarthritis (nr-axSpA), which is believed to be either an early form of ankylosing spondylitis (AS) or perhaps a different disease entity altogether. Recently, attention has shifted to the characterization of early SpA, with special emphasis on nonradiographic axial SpA. The Assessment of Spondyloarthritis International Society (ASAS) has developed new criteria for the classification of this disease entity. Along with the advent of these criteria have come several unanswered questions. Although data suggest that nr-axSpA will evolve into AS over time, the natural evolution of disease is still undetermined since a proportion of cases do not progress. A number of questions also remain regarding features of patients with AS compared to those with nonradiographic disease. This appraisal highlights the differences in disease characteristics between men and women with regard to measures of disease activity, inflammatory markers, and radiologic findings. Recent studies also suggest fibromyalgia as a potential confounding factor in assessing disease activity and establishing a diagnosis of axSpA in the female population. Nonradiographic axial SpA is a relevant disease subgroup of axial SpA, and several questions have been left unanswered with more research needed regarding diagnosis (particularly in women

  6. Increased Lactate Levels and Reduced pH in Postmortem Brains of Schizophrenics: Medication Confounds

    PubMed Central

    Halim, Nader D.; Lipska, Barbara K.; Hyde, Thomas M.; Deep-Soboslay, Amy; Saylor, E. Michael; Herman, Mary; Thakar, Jay; Verma, Ajay; Kleinman, Joel E.

    2008-01-01

    A number of postmortem studies have found decreased pH in brains of patients with schizophrenia. Insofar as lower pH has been associated with decreased mRNA expression in postmortem human brain, decreased pH in schizophrenia may represent an important potential confound in comparisons between patients and controls. We hypothesized that decreased pH may be related to increased concentration of lactic acid. However, in contrast to the previous notion that an increase in lactic acid represents evidence for primary metabolic abnormalities in schizophrenia, we hypothesized that this increase is secondary to prior antipsychotic treatment. We have tested this by first demonstrating that lactate levels in the cerebellum of patients with schizophrenia (n=35) are increased relative to control subjects (n=42) by 28%, p=0.001. Second, we have shown that there is an excellent correlation between lactate levels in the cerebellum and pH, and that this correlation is particularly strong in patients (r=− 0.78, p=3e-6). Third, we have shown in rats that chronic haloperidol (0.8 mg/kg/day) and clozapine (5 mg/kg/day) increase lactic acid concentration in the frontal cortex relative to vehicle (by 31% and 22% respectively, p<0.01). These data suggest that lactate increases in postmortem human brain of patients with schizophrenia are associated with decreased pH and that these changes are possibly related to antipsychotic treatment rather than a primary metabolic abnormality in the prefrontal cortex of patients with schizophrenia. PMID:18177946

  7. Distinguishing prostate cancer from benign confounders via a cascaded classifier on multi-parametric MRI

    NASA Astrophysics Data System (ADS)

    Litjens, G. J. S.; Elliott, R.; Shih, N.; Feldman, M.; Barentsz, J. O.; Hulsbergen-van de Kaa, C. A.; Kovacs, I.; Huisman, H. J.; Madabhushi, A.

    2014-03-01

    Learning how to separate benign confounders from prostate cancer is important because the imaging characteristics of these confounders are poorly understood. Furthermore, the typical representations of the MRI parameters might not be enough to allow discrimination. The diagnostic uncertainty this causes leads to a lower diagnostic accuracy. In this paper a new cascaded classifier is introduced to separate prostate cancer and benign confounders on MRI in conjunction with specific computer-extracted features to distinguish each of the benign classes (benign prostatic hyperplasia (BPH), inflammation, atrophy or prostatic intra-epithelial neoplasia (PIN)). In this study we tried to (1) calculate different mathematical representations of the MRI parameters which more clearly express subtle differences between different classes, (2) learn which of the MRI image features allow specific benign confounders to be distinguished from prostate cancer, and (3) find the combination of computer-extracted MRI features to best discriminate cancer from the confounding classes using a cascaded classifier. One of the most important requirements for identifying MRI signatures for adenocarcinoma, BPH, atrophy, inflammation, and PIN is accurate mapping of the location and spatial extent of the confounder and cancer categories from ex vivo histopathology to MRI. Towards this end we employed an annotated prostatectomy data set of 31 patients, all of whom underwent a multi-parametric 3 Tesla MRI prior to radical prostatectomy. The prostatectomy slides were carefully co-registered to the corresponding MRI slices using an elastic registration technique. We extracted texture features from the T2-weighted imaging, pharmacokinetic features from the dynamic contrast enhanced imaging and diffusion features from the diffusion-weighted imaging for each of the confounder classes and prostate cancer. These features were selected because they form the mainstay of clinical diagnosis. Relevant features for
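
    A generic two-stage cascade can be sketched with off-the-shelf classifiers; the class names follow the abstract, but the features, data, and architecture below are invented and stand in for the paper's pipeline only schematically (stage one separates cancer from any benign tissue, stage two assigns the specific benign class).

```python
# Schematic two-stage cascaded classifier on synthetic feature vectors.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(4)
classes = ["cancer", "BPH", "inflammation", "atrophy", "PIN"]
n_per, n_feat = 200, 12
X = np.vstack([rng.normal(loc=0.5 * k, size=(n_per, n_feat)) for k in range(5)])
y = np.repeat(np.arange(5), n_per)

# Stage 1: cancer versus any benign confounder.
stage1 = RandomForestClassifier(n_estimators=200, random_state=0)
stage1.fit(X, (y == 0).astype(int))

# Stage 2: among non-cancer regions, separate the benign confounder classes.
benign = y != 0
stage2 = RandomForestClassifier(n_estimators=200, random_state=0)
stage2.fit(X[benign], y[benign])

def cascade_predict(features):
    is_cancer = stage1.predict(features)
    labels = np.where(is_cancer == 1, 0, stage2.predict(features))
    return np.array(classes)[labels]

print(cascade_predict(X[:3]))
```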

  8. On negative outcome control of unobserved confounding as a generalization of difference-in-differences

    PubMed Central

    Sofer, Tamar; Richardson, David B.; Colicino, Elena; Schwartz, Joel; Tchetgen Tchetgen, Eric J.

    2016-01-01

    The difference-in-differences (DID) approach is a well-known strategy for estimating the effect of an exposure in the presence of unobserved confounding. The approach is most commonly used when pre- and post-exposure outcome measurements are available, and one can assume that the association of the unobserved confounder with the outcome is equal in the two exposure groups, and constant over time. Then, one recovers the treatment effect by regressing the change in outcome over time on the exposure. In this paper, we interpret the difference-in-differences as a negative outcome control (NOC) approach. We show that the pre-exposure outcome is a negative control outcome, as it cannot be influenced by the subsequent exposure, and it is affected by both observed and unobserved confounders of the exposure-outcome association of interest. The relation between DID and NOC provides simple conditions under which negative control outcomes can be used to detect and correct for confounding bias. However, for general negative control outcomes, the DID-like assumption may be overly restrictive and rarely credible, because it requires that both the outcome of interest and the control outcome are measured on the same scale. Thus, we present a scale-invariant generalization of the DID that may be used in broader NOC contexts. The proposed approach is demonstrated in simulations and on a Normative Aging Study data set, in which Body Mass Index is used for NOC of the relationship between air pollution and inflammatory outcomes. PMID:28239233
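
    The regression form of the estimator described above fits in a few lines. In the simulation below (invented coefficients), a time-constant unmeasured confounder affects the pre-exposure outcome and the post-exposure outcome equally, so regressing the change in outcome on exposure removes the bias that distorts the post-only comparison.

```python
# Difference-in-differences as regression of the outcome change on exposure.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(6)
n = 5000
u = rng.normal(size=n)                             # unobserved confounder
exposure = rng.binomial(1, 1 / (1 + np.exp(-u)))
y_pre = 1.0 * u + rng.normal(size=n)               # negative control outcome
y_post = 1.0 * u + 2.0 * exposure + rng.normal(size=n)    # true effect = 2.0

naive = sm.OLS(y_post, sm.add_constant(exposure)).fit()
did = sm.OLS(y_post - y_pre, sm.add_constant(exposure)).fit()
print("naive post-only estimate:", round(naive.params[1], 2))
print("difference-in-differences estimate (true 2.0):", round(did.params[1], 2))
```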

  9. Combating Unmeasured Confounding in Cross-Sectional Studies: Evaluating Instrumental-Variable and Heckman Selection Models

    PubMed Central

    DeMaris, Alfred

    2014-01-01

    Unmeasured confounding is the principal threat to unbiased estimation of treatment “effects” (i.e., regression parameters for binary regressors) in nonexperimental research. It refers to unmeasured characteristics of individuals that lead them both to be in a particular “treatment” category and to register higher or lower values than others on a response variable. In this article, I introduce readers to 2 econometric techniques designed to control the problem, with a particular emphasis on the Heckman selection model (HSM). Both techniques can be used with only cross-sectional data. Using a Monte Carlo experiment, I compare the performance of instrumental-variable regression (IVR) and HSM to that of ordinary least squares (OLS) under conditions with treatment and unmeasured confounding both present and absent. I find HSM generally to outperform IVR with respect to mean-square-error of treatment estimates, as well as power for detecting either a treatment effect or unobserved confounding. However, both HSM and IVR require a large sample to be fully effective. The use of HSM and IVR in tandem with OLS to untangle unobserved confounding bias in cross-sectional data is further demonstrated with an empirical application. Using data from the 2006–2010 General Social Survey (National Opinion Research Center, 2014), I examine the association between being married and subjective well-being. PMID:25110904

  10. Guided Bayesian imputation to adjust for confounding when combining heterogeneous data sources in comparative effectiveness research.

    PubMed

    Antonelli, Joseph; Zigler, Corwin; Dominici, Francesca

    2017-03-03

    In comparative effectiveness research, we are often interested in the estimation of an average causal effect from large observational data (the main study). Often these data do not measure all the necessary confounders. In many cases, an extensive set of additional covariates is measured for a smaller and non-representative population (the validation study). In this setting, standard approaches for missing data imputation might not be adequate due to the large number of missing covariates in the main data relative to the smaller sample size of the validation data. We propose a Bayesian approach to estimate the average causal effect in the main study that borrows information from the validation study to improve confounding adjustment. Our approach combines ideas of Bayesian model averaging, confounder selection, and missing data imputation into a single framework. It allows for different treatment effects in the main study and in the validation study, and propagates the uncertainty due to the missing data imputation and confounder selection when estimating the average causal effect (ACE) in the main study. We compare our method to several existing approaches via simulation. We apply our method to a study examining the effect of surgical resection on survival among 10,396 Medicare beneficiaries with a brain tumor when additional covariate information is available on 2,220 patients in SEER-Medicare. We find that the estimated ACE decreases by 30% when incorporating additional information from SEER-Medicare.

  11. Controlling Time-Dependent Confounding by Health Status and Frailty: Restriction Versus Statistical Adjustment.

    PubMed

    McGrath, Leah J; Ellis, Alan R; Brookhart, M Alan

    2015-07-01

    Nonexperimental studies of preventive interventions are often biased because of the healthy-user effect and, in frail populations, because of confounding by functional status. Bias is evident when estimating influenza vaccine effectiveness, even after adjustment for claims-based indicators of illness. We explored bias reduction methods while estimating vaccine effectiveness in a cohort of adult hemodialysis patients. Using the United States Renal Data System and linked data from a commercial dialysis provider, we estimated vaccine effectiveness using a Cox proportional hazards marginal structural model of all-cause mortality before and during 3 influenza seasons in 2005/2006 through 2007/2008. To improve confounding control, we added frailty indicators to the model, measured time-varying confounders at different time intervals, and restricted the sample in multiple ways. Crude and baseline-adjusted marginal structural models remained strongly biased. Restricting to a healthier population removed some unmeasured confounding; however, this reduced the sample size, resulting in wide confidence intervals. We estimated an influenza vaccine effectiveness of 9% (hazard ratio = 0.91, 95% confidence interval: 0.72, 1.15) when bias was minimized through cohort restriction. In this study, the healthy-user bias could not be controlled through statistical adjustment; however, sample restriction reduced much of the bias.
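
    The marginal structural model above is estimated with inverse probability of treatment weights. The sketch below shows stabilized weight construction for a single point exposure and one measured baseline confounder, on invented data; the study's time-varying weights and Cox outcome model are considerably more involved and are not attempted here.

    ```python
    # Hypothetical point-treatment simplification of stabilized IPW weight construction.
    # L is a measured baseline confounder; A is the exposure. Invented data only.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(2)
    n = 2000
    L = rng.normal(size=n)                                     # measured baseline confounder
    A = rng.binomial(1, 1 / (1 + np.exp(-L)))                  # exposure more likely when L is high

    # Denominator: P(A = a | L) from a logistic model; numerator: marginal P(A = a).
    X = sm.add_constant(L)
    ps = sm.Logit(A, X).fit(disp=0).predict(X)
    p_marg = A.mean()
    sw = np.where(A == 1, p_marg / ps, (1 - p_marg) / (1 - ps))  # stabilized weights

    print(sw.mean(), sw.min(), sw.max())    # stabilized weights should average close to 1
    ```

    In the actual analysis the weights would be time-updated and supplied to a weighted Cox model rather than inspected on their own.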

  12. Combating unmeasured confounding in cross-sectional studies: evaluating instrumental-variable and Heckman selection models.

    PubMed

    DeMaris, Alfred

    2014-09-01

    Unmeasured confounding is the principal threat to unbiased estimation of treatment "effects" (i.e., regression parameters for binary regressors) in nonexperimental research. It refers to unmeasured characteristics of individuals that lead them both to be in a particular "treatment" category and to register higher or lower values than others on a response variable. In this article, I introduce readers to 2 econometric techniques designed to control the problem, with a particular emphasis on the Heckman selection model (HSM). Both techniques can be used with only cross-sectional data. Using a Monte Carlo experiment, I compare the performance of instrumental-variable regression (IVR) and HSM to that of ordinary least squares (OLS) under conditions with treatment and unmeasured confounding both present and absent. I find HSM generally to outperform IVR with respect to mean-square-error of treatment estimates, as well as power for detecting either a treatment effect or unobserved confounding. However, both HSM and IVR require a large sample to be fully effective. The use of HSM and IVR in tandem with OLS to untangle unobserved confounding bias in cross-sectional data is further demonstrated with an empirical application. Using data from the 2006-2010 General Social Survey (National Opinion Research Center, 2014), I examine the association between being married and subjective well-being.

  13. Assessing Mediation Using Marginal Structural Models in the Presence of Confounding and Moderation

    ERIC Educational Resources Information Center

    Coffman, Donna L.; Zhong, Wei

    2012-01-01

    This article presents marginal structural models with inverse propensity weighting (IPW) for assessing mediation. Generally, individuals are not randomly assigned to levels of the mediator. Therefore, confounders of the mediator and outcome may exist that limit causal inferences, a goal of mediation analysis. Either regression adjustment or IPW…

  14. Adjusting for unmeasured confounding due to either of two crossed factors with a logistic regression model.

    PubMed

    Li, Li; Brumback, Babette A; Weppelmann, Thomas A; Morris, J Glenn; Ali, Afsar

    2016-08-15

    Motivated by an investigation of the effect of surface water temperature on the presence of Vibrio cholerae in water samples collected from different fixed surface water monitoring sites in Haiti in different months, we investigated methods to adjust for unmeasured confounding due to either of the two crossed factors site and month. In the process, we extended previous methods that adjust for unmeasured confounding due to one nesting factor (such as site, which nests the water samples from different months) to the case of two crossed factors. First, we developed a conditional pseudolikelihood estimator that eliminates fixed effects for the levels of each of the crossed factors from the estimating equation. Using the theory of U-Statistics for independent but non-identically distributed vectors, we show that our estimator is consistent and asymptotically normal, but that its variance depends on the nuisance parameters and thus cannot be easily estimated. Consequently, we apply our estimator in conjunction with a permutation test, and we investigate use of the pigeonhole bootstrap and the jackknife for constructing confidence intervals. We also incorporate our estimator into a diagnostic test for a logistic mixed model with crossed random effects and no unmeasured confounding. For comparison, we investigate between-within models extended to two crossed factors. These generalized linear mixed models include covariate means for each level of each factor in order to adjust for the unmeasured confounding. We conduct simulation studies, and we apply the methods to the Haitian data. Copyright © 2016 John Wiley & Sons, Ltd.

  15. Multi-locus Test and Correction for Confounding Effects in Genome-Wide Association Studies.

    PubMed

    Chen, Donglai; Liu, Chuanhai; Xie, Jun

    2016-11-01

    Genome-wide association studies (GWAS) examine a large number of genetic variants, e.g., single nucleotide polymorphisms (SNP), and associate them with a disease of interest. Traditional statistical methods for GWASs can produce spurious associations, due to limited information from individual SNPs and confounding effects. This paper develops two statistical methods to enhance data analysis of GWASs. The first is a multiple-SNP association test, which is a weighted chi-square test derived for big contingency tables. The test assesses combinatorial effects of multiple SNPs and improves conventional methods of single SNP analysis. The second is a method that corrects for confounding effects, which may come from population stratification as well as other ambiguous (unknown) factors. The proposed method identifies a latent confounding factor, using a profile of whole genome SNPs, and eliminates confounding effects through matching or stratified statistical analysis. Simulations and a GWAS of rheumatoid arthritis demonstrate that the proposed methods dramatically reduce the number of significant tests, or false positives, and outperform other available methods.
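
    For orientation, a single-SNP association test can be read off a genotype-by-phenotype contingency table with a standard chi-square test, and a crude guard against population stratification is to test within strata rather than on the pooled table; the authors' weighted multi-SNP statistic and latent-factor correction are not reproduced here. All counts below are invented.

    ```python
    # Invented counts. Rows: genotypes AA / Aa / aa; columns: cases / controls.
    import numpy as np
    from scipy.stats import chi2_contingency

    pooled = np.array([[120,  80],
                       [240, 260],
                       [140, 200]])
    chi2, p, dof, _ = chi2_contingency(pooled)
    print(f"pooled: chi2={chi2:.1f}, df={dof}, p={p:.3g}")

    # Crude correction for population stratification: test within each subpopulation
    # instead of on the pooled table (the two strata below sum to the pooled table).
    strata = {
        "pop1": np.array([[90, 30], [150, 110], [60, 60]]),
        "pop2": np.array([[30, 50], [90, 150], [80, 140]]),
    }
    for name, tab in strata.items():
        chi2, p, dof, _ = chi2_contingency(tab)
        print(f"{name}: chi2={chi2:.1f}, df={dof}, p={p:.3g}")
    ```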

  16. Counselor Confounds in Evaluations of Vocational Rehabilitation Methods in Substance Dependency Treatment

    ERIC Educational Resources Information Center

    Staines, Graham L.; Cleland, Charles M.; Blankertz, Laura

    2006-01-01

    Evaluation research on vocational counseling in substance dependency treatment should distinguish between the effects of counselors and counseling methods on clients' employment outcomes. Three experimental designs permit investigation of possible confounds between these types of effects: (a) nested designs (each counselor delivers one counseling…

  17. Multi-locus Test and Correction for Confounding Effects in Genome-Wide Association Studies

    PubMed Central

    Chen, Donglai; Liu, Chuanhai; Xie, Jun

    2016-01-01

    Genome-wide association studies (GWAS) examine a large number of genetic variants, e.g., single nucleotide polymorphisms (SNP), and associate them with a disease of interest. Traditional statistical methods for GWASs can produce spurious associations, due to limited information from individual SNPs and confounding effects. This paper develops two statistical methods to enhance data analysis of GWASs. The first is a multiple-SNP association test, which is a weighted chi-square test derived for big contingency tables. The test assesses combinatorial effects of multiple SNPs and improves conventional methods of single SNP analysis. The second is a method that corrects for confounding effects, which may come from population stratification as well as other ambiguous (unknown) factors. The proposed method identifies a latent confounding factor, using a profile of whole genome SNPs, and eliminates confounding effects through matching or stratified statistical analysis. Simulations and a GWAS of rheumatoid arthritis demonstrate that the proposed methods dramatically reduce the number of significant tests, or false positives, and outperform other available methods. PMID:27232635

  18. Race and Socioeconomic Status as Confounding Variables in the Accurate Diagnosis of Alcoholism.

    ERIC Educational Resources Information Center

    Luepnitz, Roy R.; And Others

    1982-01-01

    Studied the incidence of bias related to race and socioeconomic status which could confound the diagnosis of alcoholism. Graduate psychology students made a diagnosis based on videotapes. Results indicated lower socioeconomic class individuals were more often diagnosed correctly for alcoholism, and Blacks were diagnosed alcoholic more often than…

  19. Adjusting for Confounding Factors in Quasi-Experiments: Another Reanalysis of the Westinghouse Head Start Evaluation.

    ERIC Educational Resources Information Center

    Magidson, Jay; Sorbom, Dag

    Evaluations of social programs based upon quasi-experimental designs are typically plagued by problems of nonequivalence between the experimental and comparison group prior to the experiment. In such settings it is extremely difficult, if not impossible, to isolate the effects of the program from the confounding effects associated with the…

  20. An education gradient in health, a health gradient in education, or a confounded gradient in both?

    PubMed

    Lynch, Jamie L; von Hippel, Paul T

    2016-04-01

    There is a positive gradient associating educational attainment with health, yet the explanation for this gradient is not clear. Does higher education improve health (causation)? Do the healthy become highly educated (selection)? Or do good health and high educational attainment both result from advantages established early in the life course (confounding)? This study evaluates these competing explanations by tracking changes in educational attainment and Self-rated Health (SRH) from age 15 to age 31 in the National Longitudinal Study of Youth, 1997 cohort. Ordinal logistic regression confirms that high-SRH adolescents are more likely to become highly educated. This is partly because adolescent SRH is associated with early advantages including adolescents' academic performance, college plans, and family background (confounding); however, net of these confounders adolescent SRH still predicts adult educational attainment (selection). Fixed-effects longitudinal regression shows that educational attainment has little causal effect on SRH at age 31. Completion of a high school diploma or associate's degree has no effect on SRH, while completion of a bachelor's or graduate degree has effects that, though significant, are quite small (less than 0.1 points on a 5-point scale). While it is possible that educational attainment would have greater effect on health at older ages, at age 31 what we see is a health gradient in education, shaped primarily by selection and confounding rather than by a causal effect of education on health.

  1. Unknown age in health disorders: A method to account for its cumulative effect and an application to feline viruses interactions.

    PubMed

    Hellard, Eléonore; Pontier, Dominique; Siberchicot, Aurélie; Sauvage, Frank; Fouchet, David

    2015-06-01

    Parasite interactions have been widely evidenced experimentally but field studies remain rare. Such studies are essential to detect interactions of interest and assess (co)infection probabilities but face methodological obstacles. Confounding factors can create statistical associations, i.e. false parasite interactions. Among them, host age is a crucial covariate. It influences host exposure and susceptibility to many infections, and has a mechanical effect, older individuals being more at risk because of a longer exposure time. However, age is difficult to estimate in natural populations. Hence, one should be able to deal at least with its cumulative effect. Using an SI-type dynamic model, we showed that the cumulative effect of age can generate false interactions theoretically (deterministic modeling) and with a real dataset of feline viruses (stochastic modeling). The risk of wrongly concluding that an association exists was maximal when parasites induced long-lasting antibodies and had similar forces of infection. We then proposed a method to correct for this effect (and for other potentially confounding shared risk factors) and made it available in a new R package, Interatrix. We also applied the correction to the feline viruses. It offers a way to account for an often neglected confounding factor and should help identify parasite interactions in the field, a necessary step towards a better understanding of their mechanisms and consequences.

  2. Timing of human preimplantation embryonic development is confounded by embryo origin

    PubMed Central

    Kirkegaard, K.; Sundvall, L.; Erlandsen, M.; Hindkjær, J.J.; Knudsen, U.B.; Ingerslev, H.J.

    2016-01-01

    STUDY QUESTION To what extent do patient- and treatment-related factors explain the variation in morphokinetic parameters proposed as embryo viability markers? SUMMARY ANSWER Up to 31% of the observed variation in timing of embryo development can be explained by embryo origin, but no single factor elicits a systematic influence. WHAT IS KNOWN ALREADY Several studies report that culture conditions, patient characteristics and treatment influence timing of embryo development, which have promoted the perception that each clinic must develop individual models. Most of the studies have, however, treated embryos from one patient as independent observations, and only very few studies that evaluate the influence from patient- and treatment-related factors on timing of development or time-lapse parameters as predictors of viability have controlled for confounding, which implies a high risk of overestimating the statistical significance of potential correlations. STUDY DESIGN, SIZE, DURATION Infertile patients were prospectively recruited to a cohort study at a hospital fertility clinic from February 2011 to May 2013. Patients aged <38 years without endometriosis were eligible if ≥8 oocytes were retrieved. Patients were included only once. All embryos were monitored for 6 days in a time-lapse incubator. PARTICIPANTS/MATERIALS, SETTING, METHODS A total of 1507 embryos from 243 patients were included. The influence of fertilization method, BMI, maternal age, FSH dose and number of previous cycles on timing of t2-t5, duration of the 2- and 3-cell stage, and development of a blastocoel (tEB) and full blastocoel (tFB) was tested in multivariate, multilevel linear regression analysis. Predictive parameters for live birth were tested in a logistic regression analysis for 223 single transferred blastocysts, where time-lapse parameters were investigated along with patient and embryo characteristics. MAIN RESULTS AND THE ROLE OF CHANCE Moderate intra-class correlation coefficients

  3. Did the No Child Left Behind Act Miss the Mark? Assessing the Potential Benefits from an Accountability System for Early Childhood Education

    ERIC Educational Resources Information Center

    Miller, Lawrence J.; Smith, Stephanie C.

    2011-01-01

    With growing evidence that human capital investment is more efficiently spent on younger children coupled with wide variation in preschool access across states, this article uses a neoliberal approach to examine the potential social costs and benefits that could accrue should the United States decide to implement a centralized preschool…

  4. Why do thin people have elevated all-cause mortality? Evidence on confounding and reverse causality in the association of adiposity and COPD from the British Women's Heart and Health Study.

    PubMed

    Dale, Caroline; Nüesch, Eveline; Prieto-Merino, David; Choi, Minkyoung; Amuzu, Antoinette; Ebrahim, Shah; Casas, Juan P; Davey-Smith, George

    2015-01-01

    Low adiposity has been linked to elevated mortality from several causes including respiratory disease. However, this could arise from confounding or reverse causality. We explore the association of two measures of adiposity (BMI and WHR) with COPD in the British Women's Heart and Health Study including a detailed assessment of the potential for confounding and reverse causality for each adiposity measure. Low BMI was found to be associated with increased COPD risk while low WHR was not (OR = 2.2; 95% CI 1.3-3.1 versus OR = 1.2; 95% CI 0.7-1.6). Potential confounding variables (e.g. smoking) and markers of ill-health (e.g. unintentional weight loss) were found to be more common among women with low BMI but not among those with low WHR. Women with low BMI have a detrimental profile across a broad range of health markers compared to women with low WHR, and women with low WHR do not appear to have an elevated COPD risk, lending support to the hypothesis that WHR is a less confounded measure of adiposity than BMI. Low adiposity does not in itself appear to increase the risk of respiratory disease, and the apparent adverse consequences of low BMI may be due to reverse causation and confounding.

  5. Thinking about Accountability

    PubMed Central

    Deber, Raisa B.

    2014-01-01

    Accountability is a key component of healthcare reforms, in Canada and internationally, but there is increasing recognition that one size does not fit all. A more nuanced understanding begins with clarifying what is meant by accountability, including specifying for what, by whom, to whom and how. These papers arise from a Partnership for Health System Improvement (PHSI), funded by the Canadian Institutes of Health Research (CIHR), on approaches to accountability that examined accountability across multiple healthcare subsectors in Ontario. The partnership features collaboration among an interdisciplinary team, working with senior policy makers, to clarify what is known about best practices to achieve accountability under various circumstances. This paper presents our conceptual framework. It examines potential approaches (policy instruments) and postulates that their outcomes may vary by subsector depending upon (a) the policy goals being pursued, (b) governance/ownership structures and relationships and (c) the types of goods and services being delivered, and their production characteristics (e.g., contestability, measurability and complexity). PMID:25305385

  6. Cryptic confounding compounds: A brief consideration of the influences of anthropogenic contaminants on courtship and mating behavior

    PubMed Central

    Blocker, Tomica D.; Ophir, Alexander G.

    2012-01-01

    Contaminants, like pesticides, polychlorinated biphenyls (PCBs), dioxins and metals, are persistent and ubiquitous and are known to threaten the environment. Traditionally, scientists have considered the direct physiological risks that these contaminants pose. However, scientists have just begun to integrate ethology and toxicology to investigate the effects that contaminants have on behavior. This review considers the potential for contaminant effects on mating behavior. Here we assess the growing body of research concerning disruptions in sexual differentiation, courtship, sexual receptivity, arousal, and mating. We discuss the implications of these disruptions on conservation efforts and highlight the importance of recognizing the potential for environmental stressors to affect behavioral experimentation. More specifically, we consider the negative implications of anthropogenic contaminants affecting the immediate behavior of animals, and their potential to have cascading and/or long-term effects on the behavioral ecology and evolution of populations. Overall, we aim to raise awareness of the confounding influence that contaminants can have, and promote caution when interpreting results where cryptic effects are possible. PMID:24244068

  7. A comparative account of quantum dynamics of the H⁺ + H₂ reaction at low temperature on two different potential energy surfaces

    SciTech Connect

    Rao, T. Rajagopala; Mahapatra, S.; Honvault, P.

    2014-08-14

    Rotationally resolved reaction probabilities, integral cross sections, and rate constants for the H⁺ + H₂ (v = 0, j = 0 or 1) → H₂ (v′ = 0, j′) + H⁺ reaction are calculated using a time-independent quantum mechanical method and the potential energy surface of Kamisaka et al. [J. Chem. Phys. 116, 654 (2002)] (hereafter the KBNN PES). All partial wave contributions of the total angular momentum, J, are included to obtain converged cross sections at low collision energies and rate constants at low temperatures. In order to test the accuracy of the KBNN PES, the results obtained here are compared with those obtained in our earlier work [P. Honvault et al., Phys. Rev. Lett. 107, 023201 (2011)] using the accurate potential energy surface of Velilla et al. [J. Chem. Phys. 129, 084307 (2008)]. Integral cross sections and rate constants obtained on the two potential energy surfaces considered here show remarkable differences in terms of magnitude and dependence on collision energy (or temperature) which can be attributed to the differences observed in the topography of the surfaces near the entrance channel. This clearly shows the inadequacy of the KBNN PES for calculations at low collision energies.

  8. Clinical and methodological confounders in assessing the cerebellar cognitive affective syndrome in adult patients with posterior fossa tumours.

    PubMed

    Omar, Dashne; Ryan, Tracy; Carson, Alan; Bak, Thomas H; Torrens, Lorna; Whittle, Ian

    2014-12-01

    The cerebellar cognitive affective syndrome (CCAS) was first described by Schmahmann and Sherman as a constellation of symptoms including dysexecutive syndrome, spatial cognitive deficit, linguistic deficits and behavioural abnormalities in patients with a cerebellar lesion and an otherwise normal brain. Neurosurgical patients with cerebellar tumours constitute one of the cohorts in which the CCAS has been described. In this paper, we present a critical review of the literature of this syndrome in neurosurgical patients. Thereafter, we present a prospective clinical study of 10 patients who underwent posterior fossa tumour resection and had a detailed post-operative neuropsychological, neuropsychiatric and neuroradiological assessment. Because our findings revealed a large number of perioperative neuroradiological confounding variables, we reviewed the neuroimaging of a further 20 patients to determine their prevalence. Our literature review revealed that study design, methodological quality and sometimes both diagnostic criteria and findings were inconsistent. The neuroimaging study (pre-operative, n = 10; post-operative, n = 10) showed very frequent neuroradiological confounding complications (e.g. hydrocephalus; brainstem compression; supratentorial lesions and post-operative subdural hygroma); the impact of such features had largely been ignored in the literature. Findings from our clinical study showed various degrees of deficit in neuropsychological testing (n = 1, memory; n = 3, verbal fluency; n = 3, attention; n = 2, spatial cognition deficits; and n = 1, behavioural changes), but no patient had full-blown features of CCAS. Our study, although limited, finds no robust evidence of the CCAS following surgery. This and our literature review highlight a need for guidelines regarding study design and methodology when attempting to evaluate neurosurgical cases with regard to the potential CCAS.

  9. Comparing High-dimensional Confounder Control Methods for Rapid Cohort Studies From Electronic Health Records

    PubMed Central

    Low, Yen Sia; Gallego, Blanca; Shah, Nigam Haresh

    2016-01-01

    Aims: Electronic health records (EHR), containing rich clinical histories of large patient populations, can provide evidence for clinical decisions when evidence from trials and literature is absent. To enable such observational studies from EHR in real time, particularly in emergencies, rapid confounder control methods that can handle numerous variables and adjust for biases are imperative. This study compares the performance of 19 automatic confounder control methods. Methods: Methods include propensity scores, direct adjustment by machine learning, similarity matching and resampling in two simulated and one real-world EHR datasets. Results and conclusions: Direct adjustment by lasso regression and ensemble models involving multiple resamples have performance comparable to expert-based propensity scores and thus may help provide real-time EHR-based evidence for timely clinical decisions. PMID:26634383
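
    One of the better-performing strategies named above, direct adjustment by lasso regression, can be approximated by the two-step sketch below (lasso screening of covariates, then an unpenalized outcome model including the treatment term) on invented high-dimensional data; it is a simplification, not a reproduction of the paper's automated pipelines.

    ```python
    # Hypothetical sketch of lasso-based direct adjustment in a high-dimensional,
    # EHR-like setting. All data and coefficients are invented.
    import numpy as np
    import statsmodels.api as sm
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(3)
    n, p = 1000, 200
    X = rng.normal(size=(n, p))                                # many candidate confounders
    treat = rng.binomial(1, 1 / (1 + np.exp(-X[:, 0])))        # treatment depends on X[:, 0]
    logit_y = -1 + 1.0 * treat + 1.5 * X[:, 0] + 0.5 * X[:, 1]
    y = rng.binomial(1, 1 / (1 + np.exp(-logit_y)))

    # Step 1: lasso screen of covariates against the outcome (treatment left out).
    screen = LogisticRegression(penalty="l1", solver="liblinear", C=0.1).fit(X, y)
    keep = np.flatnonzero(screen.coef_[0] != 0)

    # Step 2: unpenalized logistic model with treatment plus the selected covariates.
    design = sm.add_constant(np.column_stack([treat, X[:, keep]]))
    fit = sm.Logit(y, design).fit(disp=0)
    print(keep[:10], fit.params[1])   # selected columns; adjusted treatment log-odds ratio
    ```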

  10. Syphilis may be a confounding factor, not a causative agent, in syphilitic ALS.

    PubMed

    Tuk, Bert

    2016-01-01

    Based upon a review of published clinical observations regarding syphilitic amyotrophic lateral sclerosis (ALS), I hypothesize that syphilis is actually a confounding factor, not a causative factor, in syphilitic ALS. Moreover, I propose that the successful treatment of ALS symptoms in patients with syphilitic ALS using penicillin G and hydrocortisone is an indirect consequence of the treatment regimen and is not due to the treatment of syphilis. Specifically, I propose that the observed effect is due to the various pharmacological activities of penicillin G (e.g., a GABA receptor antagonist) and/or the multifaceted pharmacological activity of hydrocortisone. The notion that syphilis may be a confounding factor in syphilitic ALS is highly relevant, as it suggests that treating ALS patients with penicillin G and hydrocortisone, regardless of whether they present with syphilitic or non-syphilitic ALS, may be effective at treating this rapidly progressive, highly devastating disease.

  11. In vivo measurement of 241Am in the lungs confounded by activity deposited in other organs.

    PubMed

    Lobaugh, Megan L; Spitz, Henry B; Glover, Samuel E

    2015-01-01

    Radioactive material deposited in multiple organs of the body is likely to confound a result of an in vivo measurement performed over the lungs, the most frequently monitored organ for occupational exposure. The significance of this interference was evaluated by measuring anthropometric torso phantoms containing lungs, liver, skeleton, and axillary lymph nodes, each with a precisely known quantity of 241Am uniformly distributed in the organs. Arrays of multiple high-resolution germanium detectors were positioned over organs within the torso phantom containing 241Am or over proximal organs without activity to determine the degree of measurement confounding due to photons emitted from other source organs. A set of four mathematical response functions describes the measured count rate with detectors positioned over each of the relevant organs and 241Am contained in the measured organ or one of the other organs selected as a confounder. Simultaneous solution of these equations by matrix algebra, where the diagonal terms of the matrix are calibration factors for a direct measurement of activity in an organ and the off-diagonal terms reflect the contribution (i.e., interference or cross-talk) produced by 241Am in a confounding organ, yields the activity deposited in each of the relevant organs. The matrix solution described in this paper represents a method for adjusting a result of 241Am measured directly in one organ for interferences that may arise from 241Am deposited elsewhere and represents a technically valid procedure to aid in evaluating internal dose based upon in vivo measurements for those radioactive materials known to deposit in multiple organs.
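
    A small numerical sketch of that matrix solution is given below; the calibration factors, cross-talk terms and count rates are invented, and only the linear algebra is intended to be illustrative.

    ```python
    # Hypothetical numbers: measured count rates are modeled as R = K @ q, where the
    # diagonal of K holds direct calibration factors (counts/s per Bq in the organ under
    # the detector array) and off-diagonal terms hold cross-talk from 241Am deposited in
    # confounding organs. Solving the system yields the activity in each organ.
    import numpy as np

    organs = ["lungs", "liver", "skeleton", "lymph nodes"]
    K = np.array([[1.0e-2, 2.0e-3, 1.5e-3, 5.0e-4],
                  [1.8e-3, 9.0e-3, 1.2e-3, 3.0e-4],
                  [1.0e-3, 1.1e-3, 7.0e-3, 2.0e-4],
                  [4.0e-4, 2.5e-4, 3.0e-4, 6.0e-3]])   # counts/s per Bq (invented)

    measured = np.array([0.85, 0.40, 0.55, 0.08])      # net count rates (invented)

    activity = np.linalg.solve(K, measured)            # Bq deposited in each organ
    for organ, q in zip(organs, activity):
        print(f"{organ}: {q:.1f} Bq")
    ```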

  12. Confounding factors to predict the awakening effect-site concentration of propofol in target-controlled infusion based on propofol and fentanyl anesthesia.

    PubMed

    Chan, Shun-Ming; Lee, Meei-Shyuan; Lu, Chueng-He; Cherng, Chen-Hwan; Huang, Yuan-Shiou; Yeh, Chun-Chang; Kuo, Chan-Yang; Wu, Zhi-Fu

    2015-01-01

    We conducted a large retrospective study to investigate the confounding factors that predict the effect-site concentration at return of consciousness (Ce ROC) under propofol-based total intravenous anesthesia (TIVA) with target-controlled infusion (TCI). We recorded sex, age, height, weight, the effect-site concentration at loss of consciousness (Ce LOC), Ce ROC, total propofol and fentanyl consumption doses, and anesthetic time. Simple linear regression models were used to identify potential predictors of Ce ROC, and multiple linear regression models were used to identify the confounding predictors of Ce ROC. We found that Ce ROC correlated with age, sex, Ce LOC, and both total fentanyl and propofol consumption dose. The prediction formula was: Ce ROC = 0.87 - 0.06 × age + 0.18 × Ce LOC + 0.04 (if fentanyl consumption > 150 μg; if not, ignore this value) + 0.07 × (1 or 2, according to the total propofol consumption dose, 1 for a propofol amount 1000-2000 mg and 2 for a propofol amount > 2000 mg). We simplified the formula further as Ce ROC = 0.87 - 0.06 × age + 0.18 × Ce LOC. In conclusion, Ce ROC can be predicted under TCI with propofol- and fentanyl-based TIVA. The confounding factors that predicted propofol Ce ROC were age, sex, Ce LOC, and total consumption dose of propofol and fentanyl.
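
    The quoted formula can be transcribed into a small helper function, as sketched below. The abstract does not spell out coefficient scaling (for instance, the units in which age enters), so the numerical output should be treated as illustrative rather than clinically meaningful.

    ```python
    # Direct transcription of the prediction formulas quoted in the abstract; inputs
    # and output are illustrative only and not clinically validated.
    def predict_ce_roc(age, ce_loc, fentanyl_ug=0.0, propofol_mg=0.0, simplified=False):
        """Predicted Ce ROC from the quoted regression formula."""
        ce_roc = 0.87 - 0.06 * age + 0.18 * ce_loc
        if simplified:
            return ce_roc              # simplified formula: 0.87 - 0.06*age + 0.18*Ce LOC
        if fentanyl_ug > 150:          # add 0.04 only if total fentanyl consumption > 150 ug
            ce_roc += 0.04
        if 1000 <= propofol_mg <= 2000:   # propofol term: 0.07 * 1 for 1000-2000 mg
            ce_roc += 0.07
        elif propofol_mg > 2000:          # 0.07 * 2 for > 2000 mg
            ce_roc += 0.14
        return ce_roc

    print(predict_ce_roc(age=40, ce_loc=4.0, fentanyl_ug=200, propofol_mg=1500))
    ```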

  13. Efforts to adjust for confounding by neighborhood using complex survey data.

    PubMed

    Brumback, Babette A; Dailey, Amy B; He, Zhulin; Brumback, Lyndia C; Livingston, Melvin D

    2010-08-15

    In social epidemiology, one often considers neighborhood or contextual effects on health outcomes, in addition to effects of individual exposures. This paper is concerned with the estimation of an individual exposure effect in the presence of confounding by neighborhood effects, motivated by an analysis of National Health Interview Survey (NHIS) data. In the analysis, we operationalize neighborhood as the secondary sampling unit of the survey, which consists of small groups of neighboring census blocks. Thus the neighborhoods are sampled with unequal probabilities, as are individuals within neighborhoods. We develop and compare several approaches for the analysis of the effect of dichotomized individual-level education on the receipt of adequate mammography screening. In the analysis, neighborhood effects are likely to confound the individual effects, due to such factors as differential availability of health services and differential neighborhood culture. The approaches can be grouped into three broad classes: ordinary logistic regression for survey data, with either no effect or a fixed effect for each cluster; conditional logistic regression extended for survey data; and generalized linear mixed model (GLMM) regression for survey data. Standard use of GLMMs with small clusters fails to adjust for confounding by cluster (e.g. neighborhood); this motivated us to develop an adaptation. We use theory, simulation, and analyses of the NHIS data to compare and contrast all of these methods. One conclusion is that all of the methods perform poorly when the sampling bias is strong; more research and new methods are clearly needed.

  14. Assessing moderated mediation in linear models requires fewer confounding assumptions than assessing mediation.

    PubMed

    Loeys, Tom; Talloen, Wouter; Goubert, Liesbet; Moerkerke, Beatrijs; Vansteelandt, Stijn

    2016-11-01

    It is well known from the mediation analysis literature that the identification of direct and indirect effects relies on strong assumptions of no unmeasured confounding. Even in randomized studies the mediator may still be correlated with unobserved prognostic variables that affect the outcome, in which case the mediator's role in the causal process may not be inferred without bias. In the behavioural and social science literature very little attention has been given so far to the causal assumptions required for moderated mediation analysis. In this paper we focus on the index for moderated mediation, which measures by how much the mediated effect is larger or smaller for varying levels of the moderator. We show that in linear models this index can be estimated without bias in the presence of unmeasured common causes of the moderator, mediator and outcome under certain conditions. Importantly, one can thus use the test for moderated mediation to support evidence for mediation under less stringent confounding conditions. We illustrate our findings with data from a randomized experiment assessing the impact of being primed with social deception upon observer responses to others' pain, and from an observational study of individuals who ended a romantic relationship assessing the effect of attachment anxiety during the relationship on mental distress 2 years after the break-up.

  15. Sensitivity analysis for direct and indirect effects in the presence of exposure-induced mediator-outcome confounders

    PubMed Central

    Chiba, Yasutaka

    2014-01-01

    Questions of mediation are often of interest in reasoning about mechanisms, and methods have been developed to address these questions. However, these methods make strong assumptions about the absence of confounding. Even if exposure is randomized, there may be mediator-outcome confounding variables. Inference about direct and indirect effects is particularly challenging if these mediator-outcome confounders are affected by the exposure because in this case these effects are not identified irrespective of whether data is available on these exposure-induced mediator-outcome confounders. In this paper, we provide a sensitivity analysis technique for natural direct and indirect effects that is applicable even if there are mediator-outcome confounders affected by the exposure. We give techniques for both the difference and risk ratio scales and compare the technique to other possible approaches. PMID:25580387

  16. Is exposure temperature a confounding factor for the assessment of reproductive parameters of New Zealand mudsnails Potamopyrgus antipodarum (Gray)?

    PubMed

    Gust, M; Buronfosse, T; André, C; Mons, R; Gagné, F; Garric, J

    2011-01-25

    Potamopyrgus antipodarum is a promising test organism often used in ecotoxicology testing, both in laboratory and in field exposure experiments. It has been recommended for use in the development of an OECD reproduction test. However, exposure temperature is important to take into account when assessing reproduction and related biomarkers, because it can act as a confounding factor inducing variability in physiological values. The effect of three environmentally realistic exposure temperatures (8, 16 and 24°C) was examined with respect to the number of neonates born, the number of embryos in the brood pouch and the duration of embryonic development. We also measured additional markers likely to be related to the modulation of reproductive performance, such as vertebrate-like sex steroids, energy status and vitellin-like proteins. Exposure temperature had a significant effect on reproduction in P. antipodarum, on both the duration of embryonic development and the quantity of embryos and neonates. The consequences of these observations must not be neglected when using this species in laboratory and field experiments. This study determined suitable temperatures for field experiments and a mean duration for embryonic development independent of temperature. Steroid levels, energy status and Vn-like protein levels were only slightly modified by exposure temperature between 8 and 24°C. Thus, they can be easily implemented and their variations related to anthropogenic factors during field exposure of mudsnails.

  17. Influence of explanatory and confounding variables on HRQoL after controlling for measurement bias and response shift in measurement.

    PubMed

    Gandhi, Pranav K; Ried, L Douglas; Kimberlin, Carole L; Kauf, Teresa L; Huang, I-Chan

    2013-12-01

    The purpose of this study was to examine the influence of explanatory and confounding variables on health-related quality of life after accounting for measurement bias and response shift in measurement using structural equation modeling. Hypertensive patients with coronary artery disease who were randomized to anti-hypertensive treatment completed the Short Form-36 questionnaire at both baseline and 1 year (n = 788). Three measurement biases were found and all three were considered response shift in measurement. Older patients reported worse scores for both physical functioning (PF) and role-physical at baseline and 1 year later compared to younger patients; and males reported better PF than females after conditioning on the latent trait of general physical health. Before controlling for response shift, patients' PF scores were not statistically different over time; however, PF scores significantly improved (p < 0.01) after controlling for recalibration response shift. Assessment of how patients perceive their change in health-related quality of life over time is warranted.

  18. High-Resolution Coarse-Grained Model of Hydrated Anion-Exchange Membranes that Accounts for Hydrophobic and Ionic Interactions through Short-Ranged Potentials.

    PubMed

    Lu, Jibao; Jacobson, Liam C; Perez Sirkin, Yamila A; Molinero, Valeria

    2017-01-10

    Molecular simulations provide a versatile tool to study the structure, anion conductivity, and stability of anion-exchange membrane (AEM) materials and can provide a fundamental understanding of the relation between membrane structure and properties that is key for their use in fuel cells and other applications. The quest for large spatial and temporal scales required to model the multiscale structure and transport processes in the polymer electrolyte membranes, however, cannot be met with fully atomistic models, and the available coarse-grained (CG) models suffer from several challenges associated with their low resolution. Here, we develop a high-resolution CG force field for hydrated polyphenylene oxide/trimethylamine chloride (PPO/TMACl) membranes compatible with the mW water model using a hierarchical parametrization approach based on Uncertainty Quantification and reference atomistic simulations modeled with the Generalized Amber Force Field (GAFF) and TIP4P/2005 water. The parametrization weighs multiple properties, including coordination numbers, radial distribution functions (RDFs), self-diffusion coefficients of water and ions, relative vapor pressure of water in the solution, hydration enthalpy of the tetramethylammonium chloride (TMACl) salt, and cohesive energy of its aqueous solutions. We analyze the interdependence between properties and address how to compromise between the accuracies of the properties to achieve an overall best representability. Our optimized CG model FFcomp quantitatively reproduces the diffusivities and RDFs of the reference atomistic model and qualitatively reproduces the experimental relative vapor pressure of water in solutions of tetramethylammonium chloride. These properties are of utmost relevance for the design and operation of fuel cell membranes. To our knowledge, this is the first CG model that explicitly includes each water molecule and ion and accounts for hydrophobic, ionic, and intramolecular interactions explicitly

  19. Confounding factors in using upward feedback to assess the quality of medical training: a systematic review

    PubMed Central

    2014-01-01

    Purpose: Upward feedback is becoming more widely used in medical training as a means of quality control. Multiple biases exist; thus, the accuracy of upward feedback is debatable. This study aims to identify factors that could influence upward feedback, especially in medical training. Methods: A systematic review using a structured search strategy was performed. Thirty-five databases were searched. Results were reviewed and relevant abstracts were shortlisted. All studies in English, both medical and non-medical literature, were included. A simple pro-forma was used initially to identify the pertinent areas of upward feedback, so that a focused pro-forma could be designed for data extraction. Results: A total of 204 articles were reviewed. Most studies on upward feedback bias were evaluative studies and only covered Kirkpatrick level 1 (reaction). Most studies evaluated trainers or training, were used for formative purposes and presented quantitative data. Accountability and confidentiality were the most common overt biases, whereas method of feedback was the most commonly implied bias within articles. Conclusion: Although different types of bias do exist, upward feedback does have a role in evaluating medical training. Accountability and confidentiality were the most common biases. Further research is required to evaluate which types of bias are associated with specific survey characteristics and which are potentially modifiable. PMID:25112445

  20. Use of Self-Matching to Control for Stable Patient Characteristics While Addressing Time-Varying Confounding on Treatment Effect: A Case Study of Older Intensive Care Patients.

    PubMed

    Han, Ling; Pisani, M A; Araujo, K L B; Allore, Heather G

    Exposure-crossover design offers a non-experimental option to control for stable baseline confounding through self-matching while examining the causal effect of an exposure on an acute outcome. This study extends this approach to longitudinal data with repeated measures of exposure and outcome using data from a cohort of 340 older medical patients in an intensive care unit (ICU). The analytic sample included 92 patients who received ≥1 dose of haloperidol, an antipsychotic medication often used for patients with delirium. Exposure-crossover design was implemented by sampling the 3-day time segments prior (Induction) and posterior (Subsequent) to each treatment episode of receiving haloperidol. In the full cohort, there was a trend of increasing delirium severity scores (Mean±SD: 4.4±1.7) over the course of the ICU stay. After exposure-crossover sampling, the delirium severity score decreased from the Induction (4.9) to the Subsequent (4.1) intervals, with the treatment episode falling in-between (4.5). Based on a GEE Poisson model accounting for self-matching and within-subject correlation, the unadjusted mean delirium severity score was -0.55 (95% CI: -1.10, -0.01) points lower for the Subsequent than the Induction intervals. The association diminished by 32% (-0.38, 95%CI: -0.99, 0.24) after adjusting only for ICU confounding, while being slightly increased by 7% (-0.60, 95%CI: -1.15, -0.04) when adjusting only for baseline characteristics. These results suggest that longitudinal exposure-crossover design is feasible and capable of partially removing stable baseline confounding through self-matching. Loss of power due to eliminating treatment-irrelevant person-time and uncertainty around allocating person-time to comparison intervals remain methodological challenges.
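
    A skeletal version of the modeling step is sketched below on simulated data (not the study's cohort or covariate set): a GEE Poisson model with patient as the clustering unit and an indicator for the Subsequent versus Induction interval. A log link is used for simplicity, whereas the paper reports contrasts on the additive score scale.

    ```python
    # Hypothetical, simulated data: two intervals (Induction, Subsequent) per patient,
    # a stable patient-level effect, and a modest post-treatment reduction in severity.
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    rng = np.random.default_rng(4)
    n_pat = 92
    pid = np.repeat(np.arange(n_pat), 2)
    subsequent = np.tile([0, 1], n_pat)                     # 0 = Induction, 1 = Subsequent
    frailty = np.repeat(rng.normal(0, 0.2, n_pat), 2)       # stable patient effect
    mu = np.exp(np.log(4.9) - 0.15 * subsequent + frailty)
    severity = rng.poisson(mu)

    df = pd.DataFrame({"severity": severity, "subsequent": subsequent, "pid": pid})
    model = sm.GEE(df["severity"], sm.add_constant(df["subsequent"]), groups=df["pid"],
                   family=sm.families.Poisson(), cov_struct=sm.cov_struct.Exchangeable())
    res = model.fit()
    print(res.params)   # exp(coef on 'subsequent') ~ rate ratio, Subsequent vs Induction
    ```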

  1. Accounting Fundamentals for Non-Accountants

    EPA Pesticide Factsheets

    The purpose of this module is to provide an introduction and overview of accounting fundamentals for non-accountants. The module also covers important topics such as communication, internal controls, documentation and recordkeeping.

  2. A comparison of confounding adjustment methods with an application to early life determinants of childhood obesity.

    PubMed

    Li, L; Kleinman, K; Gillman, M W

    2014-12-01

    We implemented six confounding adjustment methods: (1) covariate-adjusted regression, (2) propensity score (PS) regression, (3) PS stratification, (4) PS matching with two calipers, (5) inverse probability weighting and (6) doubly robust estimation to examine the associations between the body mass index (BMI) z-score at 3 years and two separate dichotomous exposure measures: exclusive breastfeeding v. formula only (n=437) and cesarean section v. vaginal delivery (n=1236). Data were drawn from a prospective pre-birth cohort study, Project Viva. The goal is to demonstrate the necessity and usefulness of, and approaches to, using multiple confounding adjustment methods to analyze observational data. Unadjusted (univariate) and covariate-adjusted linear regression associations of breastfeeding with BMI z-score were -0.33 (95% CI -0.53, -0.13) and -0.24 (-0.46, -0.02), respectively. The other approaches resulted in smaller n (204-276) because of poor overlap of covariates, but CIs were of similar width except for inverse probability weighting (75% wider) and PS matching with a wider caliper (76% wider). Point estimates ranged widely, however, from -0.01 to -0.38. For cesarean section, because of better covariate overlap, the covariate-adjusted regression estimate (0.20) was remarkably robust to all adjustment methods, and the widths of the 95% CIs differed less than in the breastfeeding example. Choice of covariate adjustment method can matter. Lack of overlap in covariate structure between exposed and unexposed participants in observational studies can lead to erroneous covariate-adjusted estimates and confidence intervals. We recommend inspecting covariate overlap and using multiple confounding adjustment methods. Similar results bring reassurance. Contradictory results suggest issues with either the data or the analytic method.
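
    Two of the six approaches listed above, inverse probability weighting and doubly robust (augmented IPW) estimation, are sketched below on invented data with a continuous outcome; no attempt is made to mirror Project Viva or its covariates.

    ```python
    # Invented data: continuous outcome y, binary exposure a, three measured confounders.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(5)
    n = 3000
    x = rng.normal(size=(n, 3))                                   # measured confounders
    p = 1 / (1 + np.exp(-(x[:, 0] - 0.5 * x[:, 1])))
    a = rng.binomial(1, p)                                        # exposure
    y = 0.5 * a + x[:, 0] + 0.8 * x[:, 2] + rng.normal(size=n)    # true effect = 0.5

    X = sm.add_constant(x)
    e = sm.Logit(a, X).fit(disp=0).predict(X)                     # propensity score

    # Inverse probability weighting (Horvitz-Thompson form).
    ipw = np.mean(a * y / e) - np.mean((1 - a) * y / (1 - e))

    # Outcome regressions within each exposure group, then the AIPW combination.
    m1 = sm.OLS(y[a == 1], X[a == 1]).fit().predict(X)
    m0 = sm.OLS(y[a == 0], X[a == 0]).fit().predict(X)
    aipw = (np.mean(a * (y - m1) / e + m1)
            - np.mean((1 - a) * (y - m0) / (1 - e) + m0))
    print(ipw, aipw)   # both should be near the true effect of 0.5
    ```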

  3. A comparison of Bayesian and Monte Carlo sensitivity analysis for unmeasured confounding.

    PubMed

    McCandless, Lawrence C; Gustafson, Paul

    2017-04-06

    Bias from unmeasured confounding is a persistent concern in observational studies, and sensitivity analysis has been proposed as a solution. In recent years, probabilistic sensitivity analysis using either Monte Carlo sensitivity analysis (MCSA) or Bayesian sensitivity analysis (BSA) has emerged as a practical analytic strategy when there are multiple bias parameter inputs. BSA uses Bayes' theorem to formally combine evidence from the prior distribution and the data. In contrast, MCSA samples bias parameters directly from the prior distribution. Intuitively, one would think that BSA and MCSA ought to give similar results. Both methods use similar models and the same (prior) probability distributions for the bias parameters. In this paper, we illustrate the surprising finding that BSA and MCSA can give very different results. Specifically, we demonstrate that MCSA can give inaccurate uncertainty assessments (e.g. 95% intervals) that do not reflect the data's influence on uncertainty about unmeasured confounding. Using a data example from epidemiology and simulation studies, we show that certain combinations of data and prior distributions can result in dramatic prior-to-posterior changes in uncertainty about the bias parameters. This occurs because the application of Bayes' theorem in a non-identifiable model can sometimes rule out certain patterns of unmeasured confounding that are not compatible with the data. Consequently, the MCSA approach may give 95% intervals that are either too wide or too narrow and that do not have 95% frequentist coverage probability. Based on our findings, we recommend that analysts use BSA for probabilistic sensitivity analysis. Copyright © 2017 John Wiley & Sons, Ltd.
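
    The MCSA idea of drawing bias parameters directly from their priors can be sketched for a single unmeasured binary confounder using the standard external-adjustment bias factor, as below. The observed risk ratio and the priors are invented, and, as the paper emphasizes, nothing here conditions on the data in the way BSA does.

    ```python
    # Toy Monte Carlo sensitivity analysis: sample the confounder-outcome risk ratio and
    # the confounder prevalences from priors, apply the external-adjustment bias factor,
    # and summarize the distribution of bias-corrected risk ratios (random error ignored).
    import numpy as np

    rng = np.random.default_rng(6)
    rr_obs = 1.8                        # observed (confounded) risk ratio, invented
    n_draws = 100_000

    rr_ud = rng.lognormal(mean=np.log(2.0), sigma=0.3, size=n_draws)  # confounder-outcome RR
    p1 = rng.beta(4, 6, size=n_draws)   # prevalence of the confounder among the exposed
    p0 = rng.beta(2, 8, size=n_draws)   # prevalence among the unexposed

    bias = (rr_ud * p1 + (1 - p1)) / (rr_ud * p0 + (1 - p0))
    rr_adj = rr_obs / bias

    print(np.percentile(rr_adj, [2.5, 50, 97.5]))   # MCSA interval for the corrected RR
    ```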

  4. A comparison of confounding adjustment methods with an application to early life determinants of childhood obesity

    PubMed Central

    Kleinman, Ken; Gillman, Matthew W.

    2014-01-01

    We implemented 6 confounding adjustment methods: 1) covariate-adjusted regression, 2) propensity score (PS) regression, 3) PS stratification, 4) PS matching with two calipers, 5) inverse-probability-weighting, and 6) doubly-robust estimation to examine the associations between the BMI z-score at 3 years and two separate dichotomous exposure measures: exclusive breastfeeding versus formula only (N = 437) and cesarean section versus vaginal delivery (N = 1236). Data were drawn from a prospective pre-birth cohort study, Project Viva. The goal is to demonstrate the necessity and usefulness of, and approaches to, using multiple confounding adjustment methods to analyze observational data. Unadjusted (univariate) and covariate-adjusted linear regression associations of breastfeeding with BMI z-score were −0.33 (95% CI −0.53, −0.13) and −0.24 (−0.46, −0.02), respectively. The other approaches resulted in smaller N (204 to 276) because of poor overlap of covariates, but CIs were of similar width except for inverse-probability-weighting (75% wider) and PS matching with a wider caliper (76% wider). Point estimates ranged widely, however, from −0.01 to −0.38. For cesarean section, because of better covariate overlap, the covariate-adjusted regression estimate (0.20) was remarkably robust to all adjustment methods, and the widths of the 95% CIs differed less than in the breastfeeding example. Choice of covariate adjustment method can matter. Lack of overlap in covariate structure between exposed and unexposed participants in observational studies can lead to erroneous covariate-adjusted estimates and confidence intervals. We recommend inspecting covariate overlap and using multiple confounding adjustment methods. Similar results bring reassurance. Contradictory results suggest issues with either the data or the analytic method. PMID:25171142

  5. Accounting: Accountants Need Verbal Skill Training

    ERIC Educational Resources Information Center

    Whitaker, Bruce L.

    1978-01-01

    Verbal skills training is one aspect of accounting education not usually included in secondary and postsecondary accounting courses. The author discusses the need for verbal competency and methods of incorporating it into accounting courses, particularly a variation of the Keller plan of individualized instruction. (MF)

  6. Lung cancer gene associated with COPD: triple whammy or possible confounding effect?

    PubMed

    Young, R P; Hopkins, R J; Hay, B A; Epton, M J; Black, P N; Gamble, G D

    2008-11-01

    Recently, several large genome-wide association studies have identified a putative "lung cancer" locus in the nicotinic acetylcholine receptor subunit genes (nAChR) on 15q25. However, these findings may be confounded by the presence of chronic obstructive pulmonary disease (COPD), which is also strongly associated with smoking exposure and lung cancer. This is likely as the prevalence of COPD in lung cancer cohorts is as much as two-fold greater than that reported in smoking control populations (50 versus 20%). The present authors compared the genotype frequencies of the most strongly associated single nucleotide polymorphism (rs16969968) in the alpha5 subunit of the nAChR gene cluster between three matched smoking cohorts. The AA genotype was found to be more frequent and was seen in 437 (16%) lung cancer cases and 445 (14%) COPD cases compared with 475 (9%) healthy smoking controls. More importantly, when 429 lung cancer cases were divided according to spirometry results (performed within 3 months of diagnosis, prior to surgery and in the absence of effusions or collapse), the AA genotype was present in 19 and 11% of cases with and without COPD, respectively. These findings suggest that the association between the alpha5 subunit nicotinic acetylcholine receptor single nucleotide polymorphism and lung cancer may, in part, be confounded by chronic obstructive pulmonary disease.

  7. Overcoming confounding plate effects in differential expression analyses of single-cell RNA-seq data.

    PubMed

    Lun, Aaron T L; Marioni, John C

    2017-02-06

    An increasing number of studies are using single-cell RNA-sequencing (scRNA-seq) to characterize the gene expression profiles of individual cells. One common analysis applied to scRNA-seq data involves detecting differentially expressed (DE) genes between cells in different biological groups. However, many experiments are designed such that the cells to be compared are processed in separate plates or chips, meaning that the groupings are confounded with systematic plate effects. This confounding aspect is frequently ignored in DE analyses of scRNA-seq data. In this article, we demonstrate that failing to consider plate effects in the statistical model results in loss of type I error control. A solution is proposed whereby counts are summed from all cells in each plate and the count sums for all plates are used in the DE analysis. This restores type I error control in the presence of plate effects without compromising detection power in simulated data. Summation is also robust to varying numbers and library sizes of cells on each plate. Similar results are observed in DE analyses of real data where the use of count sums instead of single-cell counts improves specificity and the ranking of relevant genes. This suggests that summation can assist in maintaining statistical rigour in DE analyses of scRNA-seq data with plate effects.
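
    Operationally, the proposed fix is simple: sum counts over all cells on each plate and carry out the DE analysis on the plate-level sums. A minimal pandas sketch with invented gene, cell and plate labels is shown below.

    ```python
    # Collapse single-cell counts to one column of count sums per plate, so that
    # downstream DE testing is done on plates rather than on individual cells.
    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(7)
    genes = [f"gene{i}" for i in range(5)]
    cells = [f"cell{i}" for i in range(12)]
    counts = pd.DataFrame(rng.poisson(5, size=(5, 12)), index=genes, columns=cells)

    # Plate assignment for each cell; plates are confounded with the biological groups.
    plate = pd.Series(["plateA"] * 4 + ["plateB"] * 4 + ["plateC"] * 4, index=cells)

    plate_sums = counts.T.groupby(plate).sum().T   # genes x plates matrix of count sums
    print(plate_sums)
    ```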

  8. A two-stage strategy to accommodate general patterns of confounding in the design of observational studies.

    PubMed

    Haneuse, Sebastien; Schildcrout, Jonathan; Gillen, Daniel

    2012-04-01

    Accommodating general patterns of confounding in sample size/power calculations for observational studies is extremely challenging, both technically and scientifically. While previously implemented sample size/power tools are appealing, they typically ignore important aspects of the design/data structure. In this paper, we show that sample size/power calculations that ignore confounding can be much more unreliable than is conventionally thought; using real data from the US state of North Carolina, naive calculations yield sample size estimates that are half those obtained when confounding is appropriately acknowledged. Unfortunately, eliciting realistic design parameters for confounding mechanisms is difficult. To overcome this, we propose a novel two-stage strategy for observational study design that can accommodate arbitrary patterns of confounding. At the first stage, researchers establish bounds for power that facilitate the decision of whether or not to initiate the study. At the second stage, internal pilot data are used to estimate key scientific inputs that can be used to obtain realistic sample size/power. Our results indicate that the strategy is effective at replicating gold standard calculations based on knowing the true confounding mechanism. Finally, we show that consideration of the nature of confounding is a crucial aspect of the elicitation process; depending on whether the confounder is positively or negatively associated with the exposure of interest and outcome, naive power calculations can either under- or overestimate the required sample size. Throughout, simulation is advocated as the only general means to obtain realistic estimates of statistical power; we describe, and provide in an R package, a simple algorithm for estimating power for a case-control study.
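
    In the spirit of the simulation-based power calculations advocated above (the authors provide an R package; the following is an independent toy version, not their algorithm), power for an unmatched case-control study under a simple confounding mechanism can be estimated as follows.

    ```python
    # Toy simulation-based power estimate for an unmatched case-control study with one
    # binary confounder. The data-generating model, effect sizes and sampling scheme
    # are invented and purely illustrative.
    import numpy as np
    import statsmodels.api as sm

    def casecontrol_power(n_cases, n_controls, log_or_exposure, n_sim=200, seed=0):
        """Fraction of simulated studies whose adjusted exposure effect has p < 0.05."""
        rng = np.random.default_rng(seed)
        hits = 0
        for _ in range(n_sim):
            n_pop = 50_000
            c = rng.binomial(1, 0.3, n_pop)                            # confounder
            e = rng.binomial(1, 1 / (1 + np.exp(-(-1.5 + 1.0 * c))))   # exposure depends on C
            p_d = 1 / (1 + np.exp(-(-3.5 + log_or_exposure * e + 0.7 * c)))
            d = rng.binomial(1, p_d)                                   # disease indicator
            cases = rng.choice(np.flatnonzero(d == 1), n_cases, replace=False)
            ctrls = rng.choice(np.flatnonzero(d == 0), n_controls, replace=False)
            idx = np.concatenate([cases, ctrls])
            X = sm.add_constant(np.column_stack([e[idx], c[idx]]))
            fit = sm.Logit(d[idx], X).fit(disp=0)
            if fit.pvalues[1] < 0.05:
                hits += 1
        return hits / n_sim

    print(casecontrol_power(n_cases=300, n_controls=300, log_or_exposure=np.log(1.5)))
    ```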

  9. Does environmental confounding mask pleiotropic effects of a multiple sclerosis susceptibility variant on vitamin D in psychosis?

    PubMed Central

    Iyegbe, Conrad O; Acharya, Anita; Lally, John; Gardner-Sood, Poonam; Smith, Louise S; Smith, Shubulade; Murray, Robin; Howes, Oliver; Gaughran, Fiona

    2015-01-01

    Background: This work addresses the existing and emerging evidence of overlap within the environmental and genetic profiles of multiple sclerosis (MS) and schizophrenia. Aims: To investigate whether a genetic risk factor for MS (rs703842), whose variation is indicative of vitamin D status in the disorder, could also be a determinant of vitamin D status in chronic psychosis patients. Methods: A cohort of 224 chronic psychosis cases was phenotyped and biologically profiled. The relationship between rs703842 and physiological vitamin D status in the blood plasma was assessed by logistic regression. Deficiency was defined as a blood plasma concentration below 10 ng/ml. Potential environmental confounders of the vitamin D status were considered as part of the analysis. Results: We report suggestive evidence of an association with vitamin D status in established psychosis (standardized β = 0.51, P = 0.04). The logistic model fit significantly benefited from controlling for body mass index, depression and ethnicity (χ² = 91.7; df = 2; P = 1.2 × 10⁻²⁰). Conclusions: The results suggest that, in addition to lifestyle changes that accompany the onset of illness, vitamin D dysregulation in psychosis has a genetic component that links into MS. Further comprehensive studies are needed to evaluate this prospect. PMID:27336042

  10. International Accounting and the Accounting Educator.

    ERIC Educational Resources Information Center

    Laribee, Stephen F.

    The American Assembly of Collegiate Schools of Business (AACSB) has been instrumental in internationalizing the accounting curriculum by means of accreditation requirements and standards. Colleges and universities have met the AACSB requirements either by providing separate international accounting courses or by integrating international topics…

  11. Microsatellite instability confounds engraftment analysis of hematopoietic stem-cell transplantation.

    PubMed

    Tseng, Li-Hui; Tang, Jih-Luh; Haley, Lisa; Beierl, Katie; Gocke, Christopher D; Eshleman, James R; Lin, Ming-Tseh

    2014-07-01

    Polymorphic short tandem-repeat, or microsatellite, loci have been widely used to analyze chimerism status after allogeneic hematopoietic stem-cell transplantation. In molecular diagnostic laboratories, it is recommended to calculate mixed chimerism for at least 2 informative loci and to avoid microsatellite loci on chromosomes with copy number changes. In this report, we show that microsatellite instability observed in 2 patients with acute leukemia may confound chimerism analysis. Interpretation errors may occur even if 2 to 3 loci are analyzed because of length variation in multiple microsatellite loci. Although microsatellite loci with length variation should not be selected for chimerism analysis, the presence of microsatellite instability, like copy number alteration because of aberrant chromosomes, provides evidence of recurrent or residual cancer cells after hematopoietic stem-cell transplantation.
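
    For readers unfamiliar with chimerism calculations, the toy sketch below shows the usual donor/(donor + recipient) peak-area ratio over informative STR loci, with a crude flag for a locus that disagrees sharply with the others, as microsatellite instability or a copy-number change might cause. The loci, peak areas and the 15% discordance threshold are invented.

```python
"""Donor chimerism from informative STR loci, with a crude discordance flag.
Peak areas, loci and the flag threshold are hypothetical illustrations."""

def donor_fraction(donor_area, recipient_area):
    return donor_area / (donor_area + recipient_area)

# hypothetical (donor peak area, recipient peak area) per informative locus
loci = {
    "D3S1358": (9200.0, 800.0),
    "vWA":     (8800.0, 1100.0),
    "FGA":     (2100.0, 7900.0),   # outlier: possible instability or copy-number change
}

fractions = {name: donor_fraction(d, r) for name, (d, r) in loci.items()}
median = sorted(fractions.values())[len(fractions) // 2]

for name, frac in fractions.items():
    flag = "  <-- discordant locus, review before reporting" if abs(frac - median) > 0.15 else ""
    print(f"{name}: donor {100 * frac:.1f}%{flag}")
```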

  12. Can jurors recognize missing control groups, confounds, and experimenter bias in psychological science?

    PubMed

    McAuliff, Bradley D; Kovera, Margaret Bull; Nunez, Gabriel

    2009-06-01

    This study examined the ability of jury-eligible community members (N = 248) to detect internal validity threats in psychological science presented during a trial. Participants read a case summary in which an expert testified about a study that varied in internal validity (valid, missing control group, confound, and experimenter bias) and ecological validity (high, low). Ratings of expert evidence quality and expert credibility were higher for the valid versus missing control group versions only. Internal validity did not influence verdict or ratings of plaintiff credibility and no differences emerged as a function of ecological validity. Expert evidence quality, expert credibility, and plaintiff credibility were positively correlated with verdict. Implications for the scientific reasoning literature and for trials containing psychological science are discussed.

  13. Recommendations to standardize preanalytical confounding factors in Alzheimer's and Parkinson's disease cerebrospinal fluid biomarkers: an update.

    PubMed

    del Campo, Marta; Mollenhauer, Brit; Bertolotto, Antonio; Engelborghs, Sebastiaan; Hampel, Harald; Simonsen, Anja Hviid; Kapaki, Elisabeth; Kruse, Niels; Le Bastard, Nathalie; Lehmann, Sylvain; Molinuevo, Jose L; Parnetti, Lucilla; Perret-Liaudet, Armand; Sáez-Valero, Javier; Saka, Esen; Urbani, Andrea; Vanmechelen, Eugeen; Verbeek, Marcel; Visser, Pieter Jelle; Teunissen, Charlotte

    2012-08-01

    Early diagnosis of neurodegenerative disorders such as Alzheimer's (AD) or Parkinson's disease (PD) is needed to slow down or halt the disease at the earliest stage. Cerebrospinal fluid (CSF) biomarkers can be a good tool for early diagnosis. However, their use in clinical practice is challenging due to the high variability found between centers in the concentrations of both AD CSF biomarkers (Aβ42, total tau and phosphorylated tau) and PD CSF biomarker (α-synuclein). Such a variability has been partially attributed to different preanalytical procedures between laboratories, thus highlighting the need to establish standardized operating procedures. Here, we merge two previous consensus guidelines for preanalytical confounding factors in order to achieve one exhaustive guideline updated with new evidence for Aβ42, total tau and phosphorylated tau, and α-synuclein. The proposed standardized operating procedures are applicable not only to novel CSF biomarkers in AD and PD, but also to biomarkers for other neurodegenerative disorders.

  14. Survey material choices in haematology EQA: a confounding factor in automated counting performance assessment

    PubMed Central

    De la Salle, Barbara

    2017-01-01

    The complete blood count (CBC) is one of the most frequently requested tests in laboratory medicine, performed in a range of healthcare situations. The provision of an ideal assay material for external quality assessment is confounded by the fragility of the cellular components of blood, the lack of commutability of stabilised whole blood material and the lack of certified reference materials and methods to which CBC results can be traced. The choice of assay material between fresh blood, extended life assay material and fully stabilised, commercially prepared, whole blood material depends upon the scope and objectives of the EQA scheme. The introduction of new technologies in blood counting and the wider clinical application of parameters from the extended CBC will bring additional challenges for the EQA provider.

  15. Formulating tightest bounds on causal effects in studies with unmeasured confounders.

    PubMed

    Kuroki, Manabu; Cai, Zhihong

    2008-12-30

    This paper considers the problem of evaluating the causal effect of an exposure on an outcome in observational studies with both measured and unmeasured confounders between the exposure and the outcome. Under such a situation, MacLehose et al. (Epidemiology 2005; 16:548-555) applied linear programming optimization software to find the minimum and maximum possible values of the causal effect for specific numerical data. In this paper, we apply the symbolic Balke-Pearl linear programming method (Probabilistic counterfactuals: semantics, computation, and applications. Ph.D. Thesis, UCLA Cognitive Systems Laboratory, 1995; J. Amer. Statist. Assoc. 1997; 92:1172-1176) to derive the simple closed-form expressions for the lower and upper bounds on causal effects under various assumptions of monotonicity. These universal bounds enable epidemiologists and medical researchers to assess causal effects from observed data with minimum computational effort, and they further shed light on the accuracy of the assessment.
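
    The paper's contribution is closed-form bounds under monotonicity assumptions; as a simpler point of reference, the sketch below computes the classical no-assumption (Manski-type) bounds on the average causal effect from a 2×2 exposure-outcome table. These bounds always have width one and are wider than the expressions derived in the paper; the counts are invented.

```python
"""No-assumption (Manski-type) bounds on the average causal effect of a binary
exposure on a binary outcome.  An illustrative baseline, not the sharper
monotonicity-based bounds derived in the paper; counts are invented."""

def ace_bounds(n_y1_x1, n_y0_x1, n_y1_x0, n_y0_x0):
    n = n_y1_x1 + n_y0_x1 + n_y1_x0 + n_y0_x0
    p_y1_x1 = n_y1_x1 / n                  # P(Y=1, X=1)
    p_y1_x0 = n_y1_x0 / n                  # P(Y=1, X=0)
    p_x1 = (n_y1_x1 + n_y0_x1) / n
    p_x0 = 1.0 - p_x1
    lower = p_y1_x1 - p_y1_x0 - p_x1       # all unobserved potential outcomes unfavourable
    upper = p_y1_x1 + p_x0 - p_y1_x0       # all unobserved potential outcomes favourable
    return lower, upper

# 100 exposed (30 with the outcome), 100 unexposed (10 with the outcome)
print(ace_bounds(30, 70, 10, 90))          # approximately (-0.40, 0.60); width exactly 1
```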

  16. Assessing quality of nursing care as a confounding variable in an outcome study on neurodevelopmental treatment.

    PubMed

    Hafsteinsdóttir, Thóra B; Kruitwagen, Cas; Strijker, Karin; van der Weide, Lies; Grypdonck, Maria H F

    2007-01-01

    When planning a study measuring the effects of a neurodevelopmental treatment (NDT), we were confronted with the methodological problem that while measuring the effects of NDT, a rival hypothesis is that the decision to implement the NDT might be related to the quality of nursing care. Therefore, we measured the quality of nursing care as a possible confounding variable in relation to this outcome study. The quality of nursing care was measured on 12 wards participating in the experimental and control groups of the outcome study. Data were collected from 125 patients and 71 nurses and patients' records. The findings showed no significant differences in the quality of nursing care between the 2 groups of wards (P = .49). This method may be useful to other researchers conducting outcome research and who are confronted with a similar methodological problem.

  17. Spectral cytopathology: new aspects of data collection, manipulation and confounding effects.

    PubMed

    Miljković, Miloš; Bird, Benjamin; Lenau, Kathleen; Mazur, Antonella I; Diem, Max

    2013-07-21

    This paper presents a short review on the improvements in data processing for spectral cytopathology, the diagnostic method developed for large scale diagnostic analysis of spectral data of individual dried and fixed cells. This review is followed by the analysis of the confounding effects introduced by utilizing reflecting "low-emissivity" (low-e) slides as sample substrates in infrared micro-spectroscopy of biological samples such as individual dried cells or tissue sections. The artifact introduced by these substrates, referred to as the "standing electromagnetic wave" artifact, indeed, distorts the spectra noticeably, as postulated recently by several research groups. An analysis of the standing wave effect reveals that careful data pre-processing can reduce the spurious effects to a level where they are not creating a major problem for spectral cytopathology and spectral histopathology.

  18. Missing variables: how exclusion of human resources policy information confounds research connecting health and business outcomes.

    PubMed

    Lynch, Wendy D; Sherman, Bruce W

    2014-01-01

    When corporate health researchers examine the effects of health on business outcomes or the effect of health interventions on health and business outcomes, results will necessarily be confounded by the corporate environment(s) in which they are studied. In this research setting, most studies control for factors traditionally identified in public health, such as demographics and health status. Nevertheless, often overlooked is the extent to which company policies can also independently impact health care cost, work attendance, and productivity outcomes. With changes in employment and benefits practices resulting from health care reform, including incentives and plan design options, consideration of these largely neglected variables in research design has become increasingly important. This commentary summarizes existing knowledge regarding the implications of policy variations in research outcomes and provides a framework for incorporating them into future employer-based research.

  19. Correlation between heart rate variability and pulmonary function adjusted by confounding factors in healthy adults.

    PubMed

    Bianchim, M S; Sperandio, E F; Martinhão, G S; Matheus, A C; Lauria, V T; da Silva, R P; Spadari, R C; Gagliardi, A R T; Arantes, R L; Romiti, M; Dourado, V Z

    2016-03-01

    The autonomic nervous system maintains homeostasis, which is the state of balance in the body. That balance can be determined simply and noninvasively by evaluating heart rate variability (HRV). However, independently of autonomic control of the heart, HRV can be influenced by other factors, such as respiratory parameters. Little is known about the relationship between HRV and spirometric indices. In this study, our objective was to determine whether HRV correlates with spirometric indices in adults without cardiopulmonary disease, considering the main confounders (e.g., smoking and physical inactivity). In a sample of 119 asymptomatic adults (age 20-80 years), we evaluated forced vital capacity (FVC) and forced expiratory volume in 1 s (FEV1). We evaluated resting HRV indices within a 5-min window in the middle of a 10-min recording period, thereafter analyzing time and frequency domains. To evaluate daily physical activity, we instructed participants to use a triaxial accelerometer for 7 days. Physical inactivity was defined as <150 min/week of moderate to intense physical activity. We found that FVC and FEV1, respectively, correlated significantly with the following aspects of the RR interval: standard deviation of the RR intervals (r = 0.31 and 0.35), low-frequency component (r = 0.38 and 0.40), and Poincaré plot SD2 (r = 0.34 and 0.36). Multivariate regression analysis, adjusted for age, sex, smoking, physical inactivity, and cardiovascular risk, identified the SD2 and dyslipidemia as independent predictors of FVC and FEV1 (R² = 0.125 and 0.180, respectively, for both). We conclude that pulmonary function is influenced by autonomic control of cardiovascular function, independently of the main confounders.

  20. Measurement confounding affects the extent to which verbal IQ explains social gradients in mortality

    PubMed Central

    Chapman, Benjamin; Fiscella, Kevin; Duberstein, Paul; Kawachi, Ichiro; Muennig, Peter

    2016-01-01

    Background IQ is thought to explain social gradients in mortality. IQ scores are based roughly equally on Verbal IQ (VIQ) and Performance IQ tests. VIQ tests, however, are suspected to confound true verbal ability with socioeconomic status (SES), raising the possibility that associations between SES and IQ scores might be overestimated. We examined, first, whether two of the most common types of VIQ tests exhibited differential item functioning (DIF) favouring persons of higher SES and/or majority race/ethnicity. Second, we assessed what impact, if any, this had on estimates of the extent to which VIQ explains social gradients in mortality. Methods Data from the General Social Survey-National Death Index cohort, a US population representative dataset, was used. Item response theory models queried social-factor DIF on the Thorndike Verbal Intelligence Scale and Wechsler Adult Intelligence Scales, Revised Similarities test. Cox models examined mortality associations among SES and VIQ scores corrected and uncorrected for DIF. Results When uncorrected for DIF, VIQ was correlated with income, education, occupational prestige and race, with correlation coefficients ranging between |0.12| and |0.43|. After correcting for DIF, correlations ranged from |0.06| to |0.16|. Uncorrected VIQ scores explained 11–40% of the Relative Index of Inequalities in mortality for social factors, while DIF-corrected scores explained 2–29%. Conclusions Two of the common forms of VIQ tests appear to confound verbal intelligence with SES. Since these tests appear in most IQ batteries, circumspection may be warranted in estimating the amount of social inequalities in mortality attributable to IQ. PMID:24729404
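
    The study used item response theory models for DIF; a widely used simpler screen for uniform DIF is a likelihood-ratio test for a group indicator in a logistic regression of the item on total score. The sketch below applies that screen to simulated data; it is not the paper's method, and all variables are invented.

```python
"""Logistic-regression screen for uniform DIF on one dichotomous item:
does a group indicator (e.g., SES stratum) add information once total score
is controlled?  Simulated data; the study itself used IRT models."""
import numpy as np
import statsmodels.api as sm
from scipy.stats import chi2

rng = np.random.default_rng(7)
n = 1500
ability = rng.normal(size=n)
group = rng.binomial(1, 0.5, n).astype(float)      # e.g., high- vs low-SES
# the item is easier for group 1 at equal ability -> built-in uniform DIF
item = rng.binomial(1, 1 / (1 + np.exp(-(ability + 0.6 * group))))
total = ability + rng.normal(scale=0.5, size=n)    # noisy proxy for total score

base = sm.Logit(item, sm.add_constant(total)).fit(disp=0)
with_group = sm.Logit(item, sm.add_constant(np.column_stack([total, group]))).fit(disp=0)
lr = 2 * (with_group.llf - base.llf)
print(f"uniform-DIF screen: chi2 = {lr:.1f} on 1 df, p = {chi2.sf(lr, 1):.3g}")
```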

  1. PERMANOVA-S: association test for microbial community composition that accommodates confounders and multiple distances

    PubMed Central

    Tang, Zheng-Zheng; Chen, Guanhua; Alekseyenko, Alexander V.

    2016-01-01

    Motivation: Recent advances in sequencing technology have made it possible to obtain high-throughput data on the composition of microbial communities and to study the effects of dysbiosis on the human host. Analysis of pairwise intersample distances quantifies the association between the microbiome diversity and covariates of interest (e.g. environmental factors, clinical outcomes, treatment groups). In the design of these analyses, multiple choices for distance metrics are available. Most distance-based methods, however, use a single distance and are underpowered if the distance is poorly chosen. In addition, distance-based tests cannot flexibly handle confounding variables, which can result in excessive false-positive findings. Results: We derive presence-weighted UniFrac to complement the existing UniFrac distances for more powerful detection of the variation in species richness. We develop PERMANOVA-S, a new distance-based method that tests the association of microbiome composition with any covariates of interest. PERMANOVA-S improves the commonly-used Permutation Multivariate Analysis of Variance (PERMANOVA) test by allowing flexible confounder adjustments and ensembling multiple distances. We conducted extensive simulation studies to evaluate the performance of different distances under various patterns of association. Our simulation studies demonstrate that the power of the test relies on how well the selected distance captures the nature of the association. The PERMANOVA-S unified test combines multiple distances and achieves good power regardless of the patterns of the underlying association. We demonstrate the usefulness of our approach by reanalyzing several real microbiome datasets. Availability and Implementation: miProfile software is freely available at https://medschool.vanderbilt.edu/tang-lab/software/miProfile. Contact: z.tang@vanderbilt.edu or g.chen@vanderbilt.edu Supplementary information: Supplementary data are available at Bioinformatics
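
    To make the quantity being permuted concrete, the sketch below computes a single-distance PERMANOVA pseudo-F and its permutation p-value on random data. It has none of PERMANOVA-S's confounder adjustment or multi-distance ensembling and is not the miProfile implementation.

```python
"""Single-distance PERMANOVA: pseudo-F with a permutation p-value.
A bare-bones illustration only; no confounder adjustment, no distance ensembling."""
import numpy as np

def pseudo_f(dist, groups):
    """dist: (N, N) distance matrix; groups: length-N label array."""
    n = len(groups)
    labels = np.unique(groups)
    ss_total = np.sum(np.triu(dist, 1) ** 2) / n
    ss_within = 0.0
    for g in labels:
        idx = np.where(groups == g)[0]
        sub = dist[np.ix_(idx, idx)]
        ss_within += np.sum(np.triu(sub, 1) ** 2) / len(idx)
    a = len(labels)
    return ((ss_total - ss_within) / (a - 1)) / (ss_within / (n - a))

def permanova(dist, groups, n_perm=999, seed=0):
    rng = np.random.default_rng(seed)
    f_obs = pseudo_f(dist, groups)
    exceed = sum(pseudo_f(dist, rng.permutation(groups)) >= f_obs
                 for _ in range(n_perm))
    return f_obs, (exceed + 1) / (n_perm + 1)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    x = rng.normal(size=(30, 5))
    x[:15] += 0.8                                   # shift half the samples
    dist = np.linalg.norm(x[:, None, :] - x[None, :, :], axis=-1)
    groups = np.array([0] * 15 + [1] * 15)
    print(permanova(dist, groups))                  # pseudo-F and permutation p-value
```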

  2. Custom accounts receivable modeling.

    PubMed

    Veazie, J

    1994-04-01

    In hospital and clinic management, accounts are valued as units and handled equally--a $20 account receives the same minimum number of statements as a $20,000 account. Quite often, the sheer number of accounts a hospital or clinic has to handle forces executives to manage accounts by default and failure--accounts mature on an aging track and, if left unpaid by patients, eventually are sent to collections personnel. Of the bad-debt accounts placed with collections agencies, many are misclassified as charity or hardship cases, while others could be collected by hospital or clinic staff with a limited amount of additional effort.

  3. Pain: A Statistical Account

    PubMed Central

    Thacker, Michael A.; Moseley, G. Lorimer

    2017-01-01

    Perception is seen as a process that utilises partial and noisy information to construct a coherent understanding of the world. Here we argue that the experience of pain is no different; it is based on incomplete, multimodal information, which is used to estimate potential bodily threat. We outline a Bayesian inference model, incorporating the key components of cue combination, causal inference, and temporal integration, which highlights the statistical problems in everyday perception. It is from this platform that we are able to review the pain literature, providing evidence from experimental, acute, and persistent phenomena to demonstrate the advantages of adopting a statistical account in pain. Our probabilistic conceptualisation suggests a principles-based view of pain, explaining a broad range of experimental and clinical findings and making testable predictions. PMID:28081134
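
    As a worked illustration of the cue-combination component of such a model, the snippet below combines two noisy threat cues by precision weighting under Gaussian assumptions; the numbers are arbitrary and this is not the authors' implementation.

```python
"""Precision-weighted combination of two noisy cues under Gaussian assumptions,
the cue-combination building block of a Bayesian account of perception.
Values are arbitrary; this is not the authors' model code."""

def combine(mu1, sigma1, mu2, sigma2):
    w1, w2 = 1 / sigma1**2, 1 / sigma2**2           # precisions
    mu_post = (w1 * mu1 + w2 * mu2) / (w1 + w2)     # precision-weighted mean
    sigma_post = (1 / (w1 + w2)) ** 0.5             # combined estimate is less uncertain
    return mu_post, sigma_post

# a precise nociceptive cue pulls the estimate more than a vague contextual cue
print(combine(mu1=7.0, sigma1=1.0, mu2=3.0, sigma2=3.0))   # approximately (6.6, 0.95)
```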

  4. An Empirical Method of Detecting Time-Dependent Confounding: An Observational Study of Next Day Delirium in a Medical ICU.

    PubMed

    Murphy, T E; Van Ness, P H; Araujo, K L B; Pisani, M A

    Longitudinal research on older persons in the medical intensive care unit (MICU) is often complicated by the time-dependent confounding of concurrently administered interventions such as medications and intubation. Such temporal confounding can bias the respective longitudinal associations between concurrently administered treatments and a longitudinal outcome such as delirium. Although marginal structural models address time-dependent confounding, their application is non-trivial and preferably justified by empirical evidence. Using data from a longitudinal study of older persons in the MICU, we constructed a plausibility score from 0 to 10, where higher values indicate higher plausibility of time-dependent confounding of the association between a time-varying explanatory variable and an outcome. Based on longitudinal plots, measures of correlation, and longitudinal regression, the plausibility scores were compared to the differences in estimates obtained with non-weighted and marginal structural models of next day delirium. The plausibility scores of the three possible pairings of daily doses of fentanyl, haloperidol, and intubation indicated the following: low plausibility for haloperidol and intubation, moderate plausibility for fentanyl and haloperidol, and high plausibility for fentanyl and intubation. Comparing multivariable models of next day delirium with and without adjustment for time-dependent confounding, only intubation's association changed substantively. In our observational study of older persons in the MICU, the plausibility scores were generally reflective of the observed differences between coefficients estimated from non-weighted and marginal structural models.
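
    The comparison described above (the same outcome model fitted with and without inverse-probability-of-treatment weights) can be sketched as follows for a single time-varying treatment and a single binary time-varying confounder. Everything here, including the data generation, variable names and weight models, is a simplified hypothetical illustration rather than the study's analysis; real applications would also need robust standard errors and weight diagnostics.

```python
"""Unweighted versus stabilized-IPW (marginal structural) estimates of a
time-varying treatment's association with next-day delirium.
Fully simulated data; a simplified sketch of the comparison described above."""
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(3)
rows = []
for pid in range(300):
    conf = 0
    for day in range(7):
        conf = rng.binomial(1, 0.3 + 0.4 * conf)     # e.g., agitation today
        treat = rng.binomial(1, 0.2 + 0.5 * conf)    # confounder drives treatment
        delirium = rng.binomial(1, 1 / (1 + np.exp(-(-2 + 1.2 * conf + 0.1 * treat))))
        rows.append((pid, day, conf, treat, delirium))
df = pd.DataFrame(rows, columns=["id", "day", "conf", "treat", "delirium"])

# stabilized weights: marginal treatment model (numerator) over a model
# conditional on today's confounder (denominator), accumulated within subject
num = sm.Logit(df["treat"], np.ones(len(df))).fit(disp=0).predict()
den = sm.Logit(df["treat"], sm.add_constant(df["conf"].astype(float))).fit(disp=0).predict()
p_num = np.where(df["treat"] == 1, num, 1 - num)
p_den = np.where(df["treat"] == 1, den, 1 - den)
df["ratio"] = p_num / p_den
df["sw"] = df.groupby("id")["ratio"].cumprod()

X = sm.add_constant(df["treat"].astype(float))
unweighted = sm.GLM(df["delirium"], X, family=sm.families.Binomial()).fit()
weighted = sm.GLM(df["delirium"], X, family=sm.families.Binomial(),
                  freq_weights=df["sw"].to_numpy()).fit()
print("treatment log-odds, unweighted:", round(unweighted.params["treat"], 2))
print("treatment log-odds, weighted:  ", round(weighted.params["treat"], 2))
```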

  5. Innovations in an Accounting Information Systems Course.

    ERIC Educational Resources Information Center

    Shaoul, Jean

    A new approach to teaching an introductory accounting information systems course is outlined and the potential of this approach for integrating computers into the accounting curriculum at Manchester University (England) is demonstrated. Specifically, the use of a small inventory recording system and database in an accounting information course is…

  6. LMAL Accounting Office 1936

    NASA Technical Reports Server (NTRS)

    1936-01-01

    Accounting Office: The Langley Memorial Aeronautical Laboratory's accounting office, 1936, with photographs of the Wright brothers on the wall. Although the Lab was named after Samuel P. Langley, most of the NACA staff held the Wrights as their heroes.

  7. Platelets confound the measurement of extracellular miRNA in archived plasma

    PubMed Central

    Mitchell, Adam J.; Gray, Warren D.; Hayek, Salim S.; Ko, Yi-An; Thomas, Sheena; Rooney, Kim; Awad, Mosaab; Roback, John D.; Quyyumi, Arshed; Searles, Charles D.

    2016-01-01

    Extracellular miRNAs are detectable in biofluids and represent a novel class of disease biomarker. Although many studies have utilized archived plasma for miRNA biomarker discovery, the effects of processing and storage have not been rigorously studied. Previous reports have suggested plasma samples are commonly contaminated by platelets, significantly confounding the measurement of extracellular miRNA, which was thought to be easily addressed by additional post-thaw plasma processing. In a case-control study of archived plasma, we noted a significant correlation between miRNA levels and platelet counts despite post-thaw processing. We thus examined the effects of a single freeze/thaw cycle on microparticles (MPs) and miRNA levels, and show that a single freeze/thaw cycle of plasma dramatically increases the number of platelet-derived MPs, contaminates the extracellular miRNA pool, and profoundly affects the levels of miRNAs detected. The measurement of extracellular miRNAs in archived samples is critically dependent on the removal of residual platelets prior to freezing plasma samples. Many previous clinical studies of extracellular miRNA in archived plasma should be interpreted with caution and future studies should avoid the effects of platelet contamination. PMID:27623086

  8. A chronic bioassay with the estuarine amphipod Corophium volutator: test method description and confounding factors.

    PubMed

    van den Heuvel-Greve, Martine; Postma, Jaap; Jol, Johan; Kooman, Hanneke; Dubbeldam, Marco; Schipper, Cor; Kater, Belinda

    2007-01-01

    Methods of conducting a chronic sediment toxicity test with the estuarine amphipod Corophium volutator are described. They consist of a 49-day exposure, after which mortality, growth and reproduction are determined. Pilot experiments were used to optimize test design parameters such as temperature, duration, feeding and refreshing regimes, and effects of indigenous organisms. By way of further validation, the present study focused on the effects of four different parameters: oxygen saturation, salinity, ammonium and nitrite. These confounding factors might play an important role, especially if the test is used for risk assessment of field-contaminated sediments. It is concluded that the present experimental design is well suited for chronic sediment exposures with C. volutator. The test can be performed at a broad range of salinity values, provided that controls are performed at the same salinity. Results further demonstrate that with the endpoints growth and reproduction this chronic test procedure is a factor of 7-18 more sensitive to ammonium and nitrate than the standardized acute bioassay (endpoint mortality).

  9. A comparison of confounding adjustment methods for assessment of asthma controller medication effectiveness.

    PubMed

    Li, Lingling; Vollmer, William M; Butler, Melissa G; Wu, Pingsheng; Kharbanda, Elyse O; Wu, Ann Chen

    2014-03-01

    We compared the impact of 3 confounding adjustment procedures (covariate-adjusted regression, propensity score regression, and high-dimensional propensity score regression) to assess the effects of selected asthma controller medication use (leukotriene antagonists and inhaled corticosteroids) on the following 4 asthma-related adverse outcomes: emergency department visits, hospitalizations, oral corticosteroid use, and the composite outcome of these. We examined a cohort of 24,680 new users who were 4-17 years of age at the incident dispensing from the Population-Based Effectiveness in Asthma and Lung Diseases (PEAL) Network of 5 commercial health plans and TennCare, the Tennessee Medicaid program, during the period January 1, 2004, to December 31, 2010. The 3 methods yielded similar results, indicating that pediatric patients treated with leukotriene antagonists were no more likely than those treated with inhaled corticosteroids to experience adverse outcomes. Children in the TennCare population who had a diagnosis of allergic rhinitis and who then initiated the use of leukotriene antagonists were less likely to experience an asthma-related emergency department visit. A plausible explanation is that our data set is large enough that the 2 advanced propensity score-based analyses do not have advantages over the traditional covariate-adjusted regression approach. We provide important observations on how to correctly apply the methods in observational data analysis and suggest statistical research areas that need more work to guide implementation.

  10. Platelets confound the measurement of extracellular miRNA in archived plasma.

    PubMed

    Mitchell, Adam J; Gray, Warren D; Hayek, Salim S; Ko, Yi-An; Thomas, Sheena; Rooney, Kim; Awad, Mosaab; Roback, John D; Quyyumi, Arshed; Searles, Charles D

    2016-09-13

    Extracellular miRNAs are detectable in biofluids and represent a novel class of disease biomarker. Although many studies have utilized archived plasma for miRNA biomarker discovery, the effects of processing and storage have not been rigorously studied. Previous reports have suggested plasma samples are commonly contaminated by platelets, significantly confounding the measurement of extracellular miRNA, which was thought to be easily addressed by additional post-thaw plasma processing. In a case-control study of archived plasma, we noted a significant correlation between miRNA levels and platelet counts despite post-thaw processing. We thus examined the effects of a single freeze/thaw cycle on microparticles (MPs) and miRNA levels, and show that a single freeze/thaw cycle of plasma dramatically increases the number of platelet-derived MPs, contaminates the extracellular miRNA pool, and profoundly affects the levels of miRNAs detected. The measurement of extracellular miRNAs in archived samples is critically dependent on the removal of residual platelets prior to freezing plasma samples. Many previous clinical studies of extracellular miRNA in archived plasma should be interpreted with caution and future studies should avoid the effects of platelet contamination.

  11. Alcohol confounds relationship between cannabis misuse and psychosis conversion in a high-risk sample

    PubMed Central

    Auther, A. M.; Cadenhead, K. S.; Carrión, R. E.; Addington, J.; Bearden, C. E.; Cannon, T. D.; McGlashan, T. H.; Perkins, D. O.; Seidman, L.; Tsuang, M.; Walker, E. F.; Woods, S. W.; Cornblatt, B. A.

    2015-01-01

    Objective Cannabis use has been examined as a predictor of psychosis in clinical high-risk (CHR) samples, but little is known about the impact of other substances on this relationship. Method Substance use was assessed in a large sample of CHR participants (N = 370, mean age = 18.3) enrolled in the multisite North American Prodrome Longitudinal Study Phase 1 project. Three hundred and forty-one participants with cannabis use data were divided into groups: No Use (NU, N = 211); Cannabis Use without impairment (CU, N = 63); Cannabis Abuse/Dependence (CA/CD, N = 67). Participants (N = 283) were followed for ≥2 years to determine psychosis conversion. Results Alcohol (45.3%) and cannabis (38.1%) were the most common substances. Cannabis use groups did not differ on baseline attenuated positive symptoms. Seventy-nine of 283 participants with cannabis and follow-up data converted to psychosis. Survival analysis revealed significant differences between conversion rates in the CA/CD group compared with the No Use (P = 0.031) and CU group (P = 0.027). CA/CD also significantly predicted psychosis in a regression analysis, but adjusting for alcohol use weakened this relationship. Conclusion The cannabis misuse and psychosis association was confounded by alcohol use. Non-impairing cannabis use was not related to psychosis. Results highlight the need to control for other substance use, so as to not overstate the cannabis/psychosis connection. PMID:25572323

  12. Metabolic equivalents of task are confounded by adiposity, which disturbs objective measurement of physical activity.

    PubMed

    Tompuri, Tuomo T

    2015-01-01

    Physical activity refers to any bodily movement produced by skeletal muscles that expends energy. Hence the amount and the intensity of physical activity can be assessed by energy expenditure. Metabolic equivalents of task (MET) are multiples of the resting metabolic rate, reflecting metabolic rate during exercise. The standard MET is defined as 3.5 ml/min/kg. However, expressing energy expenditure per kilogram of body weight to normalize size differences between subjects causes analytical hazards: scaling by body weight has no physiological, mathematical, or physical rationale. This review demonstrates by example that such flawed methodology may produce paradoxical observations if physical activity is assessed by body-weight-scaled values such as standard METs. While standard METs are confounded by adiposity, lean-mass-proportional measures of energy expenditure offer a more truthful way to assess physical activity. While physical activity as a behavior and cardiorespiratory fitness or adiposity as a state represent major determinants of public health, the specific measurements of these health determinants must be understood to enable a truthful evaluation of their interactions and their independent roles as health predictors.
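
    A toy calculation makes the review's central point concrete: two subjects performing the same task with the same absolute oxygen uptake but different fat mass receive different body-weight-scaled MET values, while a lean-mass-scaled measure treats them alike. The numbers below are invented.

```python
"""Why body-weight scaling confounds METs with adiposity: same absolute oxygen
uptake, different fat mass.  All numbers are invented for illustration."""

subjects = {
    # name: (exercise VO2 in ml/min, body mass in kg, lean mass in kg)
    "lean":  (1400.0, 70.0, 56.0),
    "obese": (1400.0, 100.0, 56.0),   # extra mass is fat; lean mass identical
}

for name, (vo2, mass, lean) in subjects.items():
    met_weight = (vo2 / mass) / 3.5   # standard MET: ml/min/kg body weight over 3.5
    per_lean = vo2 / lean             # ml/min per kg of lean mass
    print(f"{name}: {met_weight:.1f} METs (weight-scaled), "
          f"{per_lean:.0f} ml/min per kg lean mass")
```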

  13. The Scalp Confounds Near-Infrared Signal from Rat Brain Following Innocuous and Noxious Stimulation

    PubMed Central

    He, Ji-Wei; Liu, Hanli; Peng, Yuan Bo

    2015-01-01

    Functional near-infrared imaging (fNIRI) is a non-invasive, low-cost and highly portable technique for assessing brain activity and functions. Both clinical and experimental evidence suggest that fNIRI is able to assess brain activity at associated regions during pain processing, indicating a strong possibility of using fNIRI-derived brain activity pattern as a biomarker for pain. However, it remains unclear how, especially in small animals, the scalp influences fNIRI signal in pain processing. Previously, we have shown that the use of a multi-channel system improves the spatial resolution of fNIRI in rats (without the scalp) during pain processing. Our current work is to investigate a scalp effect by comparing with new data from rats with the scalp during innocuous or noxious stimulation (n = 6). Results showed remarkable stimulus-dependent differences between the no-scalp and intact-scalp groups. In conclusion, the scalp confounded the fNIRI signal in pain processing likely via an autonomic mechanism; the scalp effect should be a critical factor in image reconstruction and data interpretation. PMID:26426058

  14. Stream solute tracer timescales changing with discharge and reach length confound process interpretation

    NASA Astrophysics Data System (ADS)

    Schmadel, Noah M.; Ward, Adam S.; Kurz, Marie J.; Fleckenstein, Jan H.; Zarnetske, Jay P.; Hannah, David M.; Blume, Theresa; Vieweg, Michael; Blaen, Phillip J.; Schmidt, Christian; Knapp, Julia L. A.; Klaar, Megan J.; Romeijn, Paul; Datry, Thibault; Keller, Toralf; Folegot, Silvia; Arricibita, Amaia I. Marruedo; Krause, Stefan

    2016-04-01

    Improved understanding of stream solute transport requires meaningful comparison of processes across a wide range of discharge conditions and spatial scales. At reach scales where solute tracer tests are commonly used to assess transport behavior, such comparison is still confounded due to the challenge of separating dispersive and transient storage processes from the influence of the advective timescale that varies with discharge and reach length. To better resolve interpretation of these processes from field-based tracer observations, we conducted recurrent conservative solute tracer tests along a 1 km study reach during a storm discharge period and further discretized the study reach into six segments of similar length but different channel morphologies. The resulting suite of data, spanning an order of magnitude in advective timescales, enabled us to (1) characterize relationships between tracer response and discharge in individual segments and (2) determine how combining the segments into longer reaches influences interpretation of dispersion and transient storage from tracer tests. We found that the advective timescale was the primary control on the shape of the observed tracer response. Most segments responded similarly to discharge, implying that the influence of morphologic heterogeneity was muted relative to advection. Comparison of tracer data across combined segments demonstrated that increased advective timescales could be misinterpreted as a change in dispersion or transient storage. Taken together, our results stress the importance of characterizing the influence of changing advective timescales on solute tracer responses before such reach-scale observations can be used to infer solute transport at larger network scales.

  15. Disease Risk Score (DRS) as a Confounder Summary Method: Systematic Review and Recommendations

    PubMed Central

    Tadrous, Mina; Gagne, Joshua J.; Stürmer, Til; Cadarette, Suzanne M.

    2013-01-01

    Purpose To systematically examine trends and applications of the disease risk score (DRS) as a confounder summary method. Methods We completed a systematic search of MEDLINE and Web of Science® to identify all English language articles that applied DRS methods. We tabulated the number of publications by year and type (empirical application, methodological contribution, or review paper) and summarized methods used in empirical applications overall and by publication year (<2000, ≥2000). Results Of 714 unique articles identified, 97 examined DRS methods and 86 were empirical applications. We observed a bimodal distribution in the number of publications over time, with a peak in 1979-1980 and a resurgence since 2000. The majority of applications with methodological detail derived DRS using logistic regression (47%), used DRS as a categorical variable in regression (93%), and applied DRS in a non-experimental cohort (47%) or case-control (42%) study. Few studies examined effect modification by outcome risk (23%). Conclusion Use of DRS methods has increased yet remains low. Comparative effectiveness research may benefit from more DRS applications, particularly to examine effect modification by outcome risk. Standardized terminology may facilitate identification, application, and comprehension of DRS methods. More research is needed to support the application of DRS methods, particularly in case-control studies. PMID:23172692
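
    The most common recipe identified in the review (fit the outcome model among the unexposed, score everyone, then adjust the exposure-outcome model for DRS categories) is sketched below on simulated data; the covariates, effect sizes and quintile categorization are illustrative assumptions.

```python
"""Disease risk score (DRS) as a confounder summary: fit the outcome model in
the unexposed, predict a score for everyone, adjust for DRS quintile.
Simulated data; an illustration of the general recipe, not any specific study."""
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(11)
n = 5000
covs = rng.normal(size=(n, 4))                       # measured confounders
expit = lambda z: 1 / (1 + np.exp(-z))
exposure = rng.binomial(1, expit(covs @ np.array([0.4, 0.3, 0.2, 0.1])))
outcome = rng.binomial(1, expit(-2 + 0.3 * exposure + covs @ np.array([0.5, 0.4, 0.3, 0.2])))
df = pd.DataFrame(covs, columns=["c1", "c2", "c3", "c4"])
df["exposure"], df["outcome"] = exposure, outcome

# 1) outcome model among the unexposed only
unexp = df[df["exposure"] == 0]
drs_model = sm.Logit(unexp["outcome"],
                     sm.add_constant(unexp[["c1", "c2", "c3", "c4"]])).fit(disp=0)

# 2) predicted baseline risk (the DRS) for everyone, cut into quintiles
df["drs"] = drs_model.predict(sm.add_constant(df[["c1", "c2", "c3", "c4"]]))
df["drs_q"] = pd.qcut(df["drs"], 5, labels=False)

# 3) exposure-outcome model adjusted for DRS category
X = pd.get_dummies(df["drs_q"], prefix="q", drop_first=True).astype(float)
X.insert(0, "exposure", df["exposure"].astype(float))
fit = sm.Logit(df["outcome"], sm.add_constant(X)).fit(disp=0)
print("exposure log-odds ratio adjusted for DRS:", round(fit.params["exposure"], 2))
```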

  16. Metabolic equivalents of task are confounded by adiposity, which disturbs objective measurement of physical activity

    PubMed Central

    Tompuri, Tuomo T.

    2015-01-01

    Physical activity refers to any bodily movement produced by skeletal muscles that expends energy. Hence the amount and the intensity of physical activity can be assessed by energy expenditure. Metabolic equivalents of task (MET) are multiples of the resting metabolic rate, reflecting metabolic rate during exercise. The standard MET is defined as 3.5 ml/min/kg. However, expressing energy expenditure per kilogram of body weight to normalize size differences between subjects causes analytical hazards: scaling by body weight has no physiological, mathematical, or physical rationale. This review demonstrates by example that such flawed methodology may produce paradoxical observations if physical activity is assessed by body-weight-scaled values such as standard METs. While standard METs are confounded by adiposity, lean-mass-proportional measures of energy expenditure offer a more truthful way to assess physical activity. While physical activity as a behavior and cardiorespiratory fitness or adiposity as a state represent major determinants of public health, the specific measurements of these health determinants must be understood to enable a truthful evaluation of their interactions and their independent roles as health predictors. PMID:26321958

  17. The confounding problem of polydrug use in recreational ecstasy/MDMA users: a brief overview.

    PubMed

    Gouzoulis-Mayfrank, Euphrosyne; Daumann, Jörg

    2006-03-01

    The popular dance drug ecstasy (3,4-methylenedioxymethamphetamine -- MDMA) is neurotoxic upon central serotonergic neurons in laboratory animals and possibly also in humans. In recent years, several studies reported alterations of serotonergic transmission and neuropsychiatric abnormalities in ecstasy users which might be related to MDMA-induced neurotoxic brain damage. To date, the most consistent findings associate subtle cognitive, particularly memory, deficits with heavy ecstasy use. However, most studies have important inherent methodological problems. One of the most serious confounds is the widespread pattern of polydrug use, which makes it difficult to relate the findings in user populations to one specific drug. The present paper provides a brief overview of this issue. The most commonly co-used substances are alcohol, cannabis and stimulants (amphetamines and cocaine). Stimulants are also neurotoxic upon both serotonergic and dopaminergic neurons. Hence, they may act synergistically with MDMA and enhance its long-term adverse effects. The interactions between MDMA and cannabis use may be more complex: cannabis use is a well-recognized risk factor for neuropsychiatric disorders and it was shown to contribute to psychological problems and cognitive failures in ecstasy users. However, at the cellular level, cannabinoids have neuroprotective actions and they were shown to (partially) block MDMA-induced neurotoxicity in laboratory animals. In the future, longitudinal and prospective research designs should hopefully lead to a better understanding of the relation between drug use and subclinical psychological symptoms or neurocognitive failures and, also, of questions around interactions between the various substances of abuse.

  18. Peripheral neuropathy of the upper extremity: medical comorbidity that confounds common orthopedic pathology.

    PubMed

    Bales, Joshua G; Meals, Roy

    2009-10-01

    In the orthopedic patient, the diagnosis of a compression neuropathy may be straightforward. However, various medical comorbidities can obscure this diagnosis. It is paramount for the practicing orthopedic surgeon to have an appreciation for the medical pathology of common axonal neuropathies to properly diagnose, treat, and refer a patient with altered sensation in the upper extremity. The prevalence of diabetes in the United States is 10%, and roughly 20% of diabetic patients have peripheral neuropathy. In addition to diabetes, 32% of heavy alcohol users present with polyneuropathy. With advancements in the treatment of human immunodeficiency virus/acquired immunodeficiency syndrome clinicians may see the long-term effects of the virus manifested as axonal neuropathies and extreme allodynia. In some regions of the world, Hansen's disease usurps diabetes as the most common cause of polyneuropathy. Based on patient demographics and social habits, Lyme disease, multiple sclerosis, and syphilis can all manifest as polyneuropathies. Understanding the common medical causes of neuropathy will aid the orthopedic surgeon in differentiating simple compression neuropathies from diseases mimicking or confounding them.

  19. Intelligent Accountability in Education

    ERIC Educational Resources Information Center

    O'Neill, Onora

    2013-01-01

    Systems of accountability are "second order" ways of using evidence of the standard to which "first order" tasks are carried out for a great variety of purposes. However, more accountability is not always better, and processes of holding to account can impose high costs without securing substantial benefits. At their worst,…

  20. Accounting Education in Crisis

    ERIC Educational Resources Information Center

    Turner, Karen F.; Reed, Ronald O.; Greiman, Janel

    2011-01-01

    Almost on a daily basis new accounting rules and laws are put into use, creating information that must be known and learned by the accounting faculty and then introduced to and understood by the accounting student. Even with the 150 hours of education now required for CPA licensure, it is impossible to teach and learn all there is to learn. Over…

  1. Automated Accounting. Instructor Guide.

    ERIC Educational Resources Information Center

    Moses, Duane R.

    This curriculum guide was developed to assist business instructors using Dac Easy Accounting College Edition Version 2.0 software in their accounting programs. The module consists of four units containing assignment sheets and job sheets designed to enable students to master competencies identified in the area of automated accounting. The first…

  2. Accounting & Computing Curriculum Guide.

    ERIC Educational Resources Information Center

    Avani, Nathan T.; And Others

    This curriculum guide consists of materials for use in teaching a competency-based accounting and computing course that is designed to prepare students for employability in the following occupational areas: inventory control clerk, invoice clerk, payroll clerk, traffic clerk, general ledger bookkeeper, accounting clerk, account information clerk,…

  3. The Accounting Capstone Problem

    ERIC Educational Resources Information Center

    Elrod, Henry; Norris, J. T.

    2012-01-01

    Capstone courses in accounting programs bring students experiences integrating across the curriculum (University of Washington, 2005) and offer unique (Sanyal, 2003) and transformative experiences (Sill, Harward, & Cooper, 2009). Students take many accounting courses without preparing complete sets of financial statements. Accountants not only…

  4. Evaluation of potential factors affecting deriving conductivity benchmark by utilizing weighting methods in Hun-Tai River Basin, Northeastern China.

    PubMed

    Jia, Xiaobo; Zhao, Qian; Guo, Fen; Ma, Shuqin; Zhang, Yuan; Zang, Xiaomiao

    2017-03-01

    Specific conductivity is an increasingly important stressor for freshwater ecosystems. Interacting with other environmental factors, it may lead to habitat degradation and biodiversity loss. However, it is still poorly understood how the effect of specific conductivity on freshwater organisms is confounded by other environmental factors. In this study, a weight-of-evidence method was applied to evaluate the potential environmental factors that may confound the effect of specific conductivity on macroinvertebrate community structure and to identify the confounders affecting derivation of the conductivity benchmark in Hun-Tai River Basin, China. A total of seven potential environmental factors were assessed by six types of evidence (i.e., correlation of cause and confounder, correlation of effect and confounder, the contingency of high level cause and confounder, the removal of confounder, levels of confounder known to cause effects, and multivariate statistics for confounding). Results showed that effects of dissolved oxygen (DO), fecal coliform, habitat score, total phosphorus (TP), pH, and temperature on the relationship between sensitive genera loss and specific conductivity were minimal and manageable. NH3-N was identified as a confounder affecting derivation of the conductivity benchmark for macroinvertebrates. The potential confounding by high NH3-N was minimized by removing sites with NH3-N > 2.0 mg/L from the data set. Our study tailored the weighting method previously developed by USEPA to use field data to develop causal relationships for basin-scale applications and may provide useful information for pollution remediation and natural resource management.

  5. Accounting: "Balancing Out" the Accounting Program.

    ERIC Educational Resources Information Center

    Babcock, Coleen

    1979-01-01

    The vocational accounting laboratory is a viable, meaningful educational experience for high school seniors, due to the uniqueness of its educational approach and the direct involvement of the professional and business community. A balance of experiences is provided to match individual needs and goals of students. (CT)

  6. Who really gets strep sore throat? Confounding and effect modification of a time-varying exposure on recurrent events.

    PubMed

    Follmann, Dean; Huang, Chiung-Yu; Gabriel, Erin

    2016-10-30

    Unmeasured confounding is the fundamental obstacle to drawing causal conclusions about the impact of an intervention from observational data. Typically, covariates are measured to eliminate or ameliorate confounding, but they may be insufficient or unavailable. In the special setting where a transient intervention or exposure varies over time within each individual and confounding is time constant, a different tack is possible. The key idea is to condition on either the overall outcome or the proportion of time in the intervention. These measures can eliminate the unmeasured confounding either by conditioning or by use of a proxy covariate. We evaluate existing methods and develop new models from which causal conclusions can be drawn from such observational data even if no baseline covariates are measured. Our motivation for this work was to determine the causal effect of Streptococcus bacteria in the throat on pharyngitis (sore throat) in Indian schoolchildren. Using our models, we show that existing methods can be badly biased and that sick children who are rarely colonized have a high probability that the Streptococcus bacteria are causing their disease. Published 2016. This article is a U.S. Government work and is in the public domain in the USA.

  7. Interpretational Confounding Is Due to Misspecification, Not to Type of Indicator: Comment on Howell, Breivik, and Wilcox (2007)

    ERIC Educational Resources Information Center

    Bollen, Kenneth A.

    2007-01-01

    R. D. Howell, E. Breivik, and J. B. Wilcox (2007) have argued that causal (formative) indicators are inherently subject to interpretational confounding. That is, they have argued that using causal (formative) indicators leads the empirical meaning of a latent variable to be other than that assigned to it by a researcher. Their critique of causal…

  8. Applications of the propensity score weighting method in psychogeriatric research: correcting selection bias and adjusting for confounders.

    PubMed

    Chang, Chung-Chou H

    2017-05-01

    The propensity score (PS) weighting method is an analytic technique that has been applied in multiple fields for a number of purposes. Here, we discuss two common applications, which are (1) to correct for selection bias and (2) to adjust for confounding variables when estimating the effect of an exposure variable on the outcome of interest.
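
    A minimal sketch of the second use mentioned above (adjusting for measured confounders when estimating an exposure effect): model treatment given the confounders, then weight each subject by the inverse probability of the treatment actually received. The cohort, covariates and effect sizes below are simulated and purely illustrative.

```python
"""Propensity-score (inverse-probability-of-treatment) weighting in two steps:
model treatment given measured confounders, then weight by the inverse
probability of the treatment actually received.  Simulated, illustrative data."""
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(5)
n = 4000
age = rng.normal(75, 8, n)                        # e.g., a psychogeriatric cohort
frailty = rng.normal(size=n)
p_treat = 1 / (1 + np.exp(-(-0.5 + 0.03 * (age - 75) + 0.8 * frailty)))
treat = rng.binomial(1, p_treat)
y = 2.0 + 1.0 * treat + 0.5 * frailty + 0.05 * (age - 75) + rng.normal(size=n)

X = sm.add_constant(np.column_stack([age, frailty]))
ps = sm.Logit(treat, X).fit(disp=0).predict()     # estimated propensity score
w = np.where(treat == 1, 1 / ps, 1 / (1 - ps))    # IPTW weights for the ATE

naive = y[treat == 1].mean() - y[treat == 0].mean()
ate = (np.average(y[treat == 1], weights=w[treat == 1])
       - np.average(y[treat == 0], weights=w[treat == 0]))
print(f"unadjusted difference: {naive:.2f}   IPTW estimate: {ate:.2f}   (true effect: 1.0)")
```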

  9. Diabetes and gastric cancer: the potential links.

    PubMed

    Tseng, Chin-Hsiao; Tseng, Farn-Hsuan

    2014-02-21

    This article reviews the epidemiological evidence linking diabetes and gastric cancer and discusses some of the potential mechanisms, confounders and biases in the evaluation of such an association. Findings from four meta-analyses published from 2011 to 2013 suggest a positive link, which may be more remarkable in females and in the Asian populations. Putative mechanisms may involve shared risk factors, hyperglycemia, Helicobacter pylori (H. pylori) infection, high salt intake, medications and comorbidities. Diabetes may increase the risk of gastric cancer through shared risk factors including obesity, insulin resistance, hyperinsulinemia and smoking. Hyperglycemia, even before the clinical diagnosis of diabetes, may predict gastric cancer in some epidemiological studies, which is supported by in vitro, and in vivo studies. Patients with diabetes may also have a higher risk of gastric cancer through the higher infection rate, lower eradication rate and higher reinfection rate of H. pylori. High salt intake can act synergistically with H. pylori infection in the induction of gastric cancer. Whether a higher risk of gastric cancer in patients with diabetes may be ascribed to a higher intake of salt due to the loss of taste sensation awaits further investigation. The use of medications such as insulin, metformin, sulfonylureas, aspirin, statins and antibiotics may also influence the risk of gastric cancer, but most of them have not been extensively studied. Comorbidities may affect the development of gastric cancer through the use of medications and changes in lifestyle, dietary intake, and the metabolism of drugs. Finally, a potential detection bias related to gastrointestinal symptoms more commonly seen in patients with diabetes and with multiple comorbidities should be pointed out. Taking into account the inconsistent findings and the potential confounders and detection bias in previous epidemiological studies, it is expected that there are still more to be

  10. Marked overlap of four genetic syndromes with dyskeratosis congenita confounds clinical diagnosis

    PubMed Central

    Walne, Amanda J.; Collopy, Laura; Cardoso, Shirleny; Ellison, Alicia; Plagnol, Vincent; Albayrak, Canan; Albayrak, Davut; Kilic, Sara Sebnem; Patıroglu, Turkan; Akar, Haluk; Godfrey, Keith; Carter, Tina; Marafie, Makia; Vora, Ajay; Sundin, Mikael; Vulliamy, Thomas; Tummala, Hemanth; Dokal, Inderjeet

    2016-01-01

    Dyskeratosis congenita is a highly pleiotropic genetic disorder. This heterogeneity can lead to difficulties in making an accurate diagnosis and delays in appropriate management. The aim of this study was to determine the underlying genetic basis in patients presenting with features of dyskeratosis congenita and who were negative for mutations in the classical dyskeratosis congenita genes. By whole exome and targeted sequencing, we identified biallelic variants in genes that are not associated with dyskeratosis congenita in 17 individuals from 12 families. Specifically, these were homozygous variants in USB1 (8 families), homozygous missense variants in GRHL2 (2 families) and identical compound heterozygous variants in LIG4 (2 families). All patients had multiple somatic features of dyskeratosis congenita but not the characteristic short telomeres. Our case series shows that biallelic variants in USB1, LIG4 and GRHL2, the genes mutated in poikiloderma with neutropenia, LIG4/Dubowitz syndrome and the recently recognized ectodermal dysplasia/short stature syndrome, respectively, cause features that overlap with dyskeratosis congenita. Strikingly, these genes also overlap in their biological function with the known dyskeratosis congenita genes that are implicated in telomere maintenance and DNA repair pathways. Collectively, these observations demonstrate the marked overlap of dyskeratosis congenita with four other genetic syndromes, confounding accurate diagnosis and subsequent management. This has important implications for establishing a genetic diagnosis when a new patient presents in the clinic. Patients with clinical features of dyskeratosis congenita need to have genetic analysis of USB1, LIG4 and GRHL2 in addition to the classical dyskeratosis congenita genes and telomere length measurements. PMID:27612988

  11. Marked EEG worsening following Levetiracetam overdose: How a pharmacological issue can confound coma prognosis.

    PubMed

    Bouchier, Baptiste; Demarquay, Geneviève; Guérin, Claude; André-Obadia, Nathalie; Gobert, Florent

    2017-01-01

    Levetiracetam is an anti-epileptic drug commonly used in intensive care when seizure is suspected as a possible cause of coma. We question the confounding effect of Levetiracetam on the prognostication process in a case of anoxic coma. We report the case of a young woman in a comatose state following a hypoxic cardiac arrest. After a first EEG showing an intermediate pattern, a suspicion of seizure led to the prescription of Levetiracetam. The EEG then showed the appearance of burst suppression, which was compatible with a very severe pattern of post-anoxic coma. This aggravation was in fact related to an overdose of Levetiracetam (the only medication introduced recently) and was reversible after Levetiracetam cessation. The elevated plasma levels of Levetiracetam confirming this overdose may have been favoured by a moderate reduction in renal clearance, previously underestimated because of a low body weight. This EEG dynamic was unexpected under Levetiracetam and may signal functional instability after anoxia. Burst suppression is classically observed with high doses of anaesthetics, but is not expected with a minor anti-epileptic drug. This report proposes that Levetiracetam tolerance might not be straightforward after brain lesions and underscores the need to rule out confounding factors during prognostication of awakening, which is based mainly on the severity of the EEG. Hence, prognosis should not be decided on an isolated parameter, especially if the dynamic is atypical after a new prescription, even for well-known drugs. When in doubt, the drug's dose and possible replacement should be addressed before any premature withdrawal of care.

  12. Bordetella pseudohinzii as a Confounding Organism in Murine Models of Pulmonary Disease

    PubMed Central

    Clark, Sarah E; Purcell, Jeanette E; Sammani, Saad; Steffen, Earl K; Crim, Marcus J; Livingston, Robert S; Besch-Williford, Cynthia; Fortman, Jeffrey D

    2016-01-01

    A group studying acute lung injury observed an increased percentage of neutrophils in the bronchoalveolar lavage (BAL) fluid of mice. BAL was performed, and lung samples were collected sterilely from 5 C57BL/6 mice that had been bred inhouse. Pure colonies of bacteria, initially identified as Bordetella hinzii were cultured from 2 of the 5 mice which had the highest percentages of neutrophils (21% and 26%) in the BAL fluid. Subsequent sequencing of a portion of the ompA gene from this isolate demonstrated 100% homology with the published B. pseudohinzii sequence. We then selected 10 mice from the investigator's colony to determine the best test to screen for B. pseudohinzii in the facility. BAL was performed, the left lung lobe was collected for culture and PCR analysis, the right lung lobe and nasal passages were collected for histopathology, an oral swab was collected for culture, and an oral swab and fecal pellets were collected for PCR analysis. B. pseudohinzii was cultured from the oral cavity, lung, or both in 8 of the 10 mice analyzed. All 8 of these mice were fecal PCR positive for B. pseudohinzii; 7 had increased neutrophils (5% to 20%) in the BAL fluid, whereas the 8th mouse had a normal neutrophil percentage (2%). Active bronchopneumonia was not observed, but some infected mice had mild to moderate rhinitis. B. pseudohinzii appears to be a microbial agent of importance in mouse colonies that can confound pulmonary research. Commercial vendors and institutions should consider colony screening, routine reporting, and exclusion of B. pseudohinzii. PMID:27780002

  13. Adaption of the LUCI framework to account for detailed farm management: a case study exploring potential for achieving locally and nationally significant greenhouse gas, flooding and nutrient mitigation without compromising livelihoods on New Zealand farm

    NASA Astrophysics Data System (ADS)

    Jackson, Bethanna; Trodahl, Martha; Maxwell, Deborah; Easton, Stuart

    2016-04-01

    This talk discusses recent progress in adapting the Land Utilisation and Capability Indicator (LUCI) framework to take account of the impact of detailed farm management on greenhouse gas emissions and on water, sediment and nutrient delivery to waterways. LUCI is a land management decision support framework which examines the impact of current and potential interventions on a variety of outcomes, including flood mitigation, water supply, greenhouse gas emissions, biodiversity, erosion, sediment and nutrient delivery to waterways, and agricultural production. The potential of the landscape to provide benefits is a function of both the biophysical properties of individual landscape elements and their configuration. Both are respected in LUCI where possible. For example, the hydrology, sediment and chemical routing algorithms are based on physical principles of hillslope flow, taking information on the storage and permeability capacity of elements within the landscape from soil and land use data and honoring physical thresholds, mass and energy balance constraints. LUCI discretizes hydrological response units within the landscape according to similarity of their hydraulic properties and preserves spatially explicit topographical routing. Implications of keeping the "status quo" or potential scenarios of land management change can then be evaluated under different meteorological or climatic events (e.g. flood return periods, rainfall events, droughts), cascading water through the hydrological response units using a "fill and spill" approach. These and other component algorithms are designed to be fast-running while maintaining physical consistency and fine spatial detail. This allows it to operate from subfield level scale to catchment, or even national scale, simultaneously. It analyses and communicates the spatial pattern of individual provision and tradeoffs/synergies between desired outcomes at detailed resolutions and provides suggestions on where management
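
    The "fill and spill" routing mentioned above can be pictured with a toy cascade: a rainfall depth is passed down a chain of storage units, each of which fills to capacity before spilling the remainder downslope. This is purely conceptual and is not LUCI code; the capacities and rainfall depth are invented.

```python
"""A toy 'fill and spill' cascade: each downslope unit stores water up to its
capacity and spills the excess to the next.  Conceptual only; not LUCI code."""

def fill_and_spill(rainfall_mm, capacities_mm):
    """Route a rainfall depth through a downslope chain of storage units.
    Returns per-unit storage and the depth leaving the bottom of the chain."""
    storages = []
    incoming = rainfall_mm
    for capacity in capacities_mm:
        stored = min(incoming, capacity)   # the unit fills first...
        incoming -= stored                 # ...then spills what it cannot hold
        storages.append(stored)
    return storages, incoming

storage, runoff = fill_and_spill(rainfall_mm=60.0, capacities_mm=[25.0, 15.0, 10.0])
print("stored per unit (mm):", storage, "  reaching the stream (mm):", runoff)
```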

  14. Emerging accounting trends accounting for leases.

    PubMed

    Valletta, Robert; Huggins, Brian

    2010-12-01

    A new model for lease accounting can have a significant impact on hospitals and healthcare organizations. The new approach proposes a "right-of-use" model that involves complex estimates and significant administrative burden. Hospitals and health systems that draw heavily on lease arrangements should start preparing for the new approach now even though guidance and a final rule are not expected until mid-2011. This article highlights a number of considerations from the lessee point of view.

  15. PLATO IV Accountancy Index.

    ERIC Educational Resources Information Center

    Pondy, Dorothy, Comp.

    The catalog was compiled to assist instructors in planning community college and university curricula using the 48 computer-assisted accountancy lessons available on PLATO IV (Programmed Logic for Automatic Teaching Operation) for first semester accounting courses. It contains information on lesson access, lists of acceptable abbreviations for…

  16. Leadership for Accountability.

    ERIC Educational Resources Information Center

    Lashway, Larry

    2001-01-01

    This document explores issues of leadership for accountability and reviews five resources on the subject. These include: (1) "Accountability by Carrots and Sticks: Will Incentives and Sanctions Motivate Students, Teachers, and Administrators for Peak Performance?" (Larry Lashway); (2) "Organizing Schools for Teacher Learning"…

  17. The Choreography of Accountability

    ERIC Educational Resources Information Center

    Webb, P. Taylor

    2006-01-01

    The prevailing performance discourse in education claims school improvements can be achieved through transparent accountability procedures. The article identifies how teachers generate performances of their work in order to satisfy accountability demands. By identifying sources of teachers' knowledge that produce choreographed performances, I…

  18. Cluster Guide. Accounting Occupations.

    ERIC Educational Resources Information Center

    Beaverton School District 48, OR.

    Based on a recent task inventory of key occupations in the accounting cluster taken in the Portland, Oregon, area, this curriculum guide is intended to assist administrators and teachers in the design and implementation of high school accounting cluster programs. The guide is divided into four major sections: program organization and…

  19. The Accountability Illusion: Ohio

    ERIC Educational Resources Information Center

    Thomas B. Fordham Institute, 2009

    2009-01-01

    The intent of the No Child Left Behind (NCLB) Act of 2001 is to hold schools accountable for ensuring that all their students achieve mastery in reading and math, with a particular focus on groups that have traditionally been left behind. Under NCLB, states submit accountability plans to the U.S. Department of Education detailing the rules and…

  20. The Accountability Illusion: Florida

    ERIC Educational Resources Information Center

    Thomas B. Fordham Institute, 2009

    2009-01-01

    The intent of the No Child Left Behind (NCLB) Act of 2001 is to hold schools accountable for ensuring that all their students achieve mastery in reading and math, with a particular focus on groups that have traditionally been left behind. Under NCLB, states submit accountability plans to the U.S. Department of Education detailing the rules and…

  1. The Accountability Illusion: Minnesota

    ERIC Educational Resources Information Center

    Thomas B. Fordham Institute, 2009

    2009-01-01

    The intent of the No Child Left Behind (NCLB) Act of 2001 is to hold schools accountable for ensuring that all their students achieve mastery in reading and math, with a particular focus on groups that have traditionally been left behind. Under NCLB, states submit accountability plans to the U.S. Department of Education detailing the rules and…

  2. The Accountability Illusion: Wisconsin

    ERIC Educational Resources Information Center

    Thomas B. Fordham Institute, 2009

    2009-01-01

    The intent of the No Child Left Behind (NCLB) Act of 2001 is to hold schools accountable for ensuring that all their students achieve mastery in reading and math, with a particular focus on groups that have traditionally been left behind. Under NCLB, states submit accountability plans to the U.S. Department of Education detailing the rules and…

  3. The Accountability Illusion: Vermont

    ERIC Educational Resources Information Center

    Thomas B. Fordham Institute, 2009

    2009-01-01

    The intent of the No Child Left Behind (NCLB) Act of 2001 is to hold schools accountable for ensuring that all their students achieve mastery in reading and math, with a particular focus on groups that have traditionally been left behind. Under NCLB, states submit accountability plans to the U.S. Department of Education detailing the rules and…

  4. The Evolution of Accountability

    ERIC Educational Resources Information Center

    Webb, P. Taylor

    2011-01-01

    Campus 2020: Thinking ahead is a policy in British Columbia (BC), Canada, that attempted to hold universities accountable to performance. Within, I demonstrate how this Canadian articulation of educational accountability intended to develop "governmentality constellations" to control the university and regulate its knowledge output. This…

  5. Accountability in Education.

    ERIC Educational Resources Information Center

    Chippendale, P. R., Ed.; Wilkes, Paula V., Ed.

    This collection of papers delivered at a conference on accountability held at Darling Downs Institute of Advanced Education in Australia examines the meaning of accountability in education for teachers, lecturers, government, parents, administrators, education authorities, and the society at large. In Part 1, W. G. Walker attempts to answer the…

  6. The Accountability Illusion: Nevada

    ERIC Educational Resources Information Center

    Thomas B. Fordham Institute, 2009

    2009-01-01

    The intent of the No Child Left Behind (NCLB) Act of 2001 is to hold schools accountable for ensuring that all their students achieve mastery in reading and math, with a particular focus on groups that have traditionally been left behind. Under NCLB, states submit accountability plans to the U.S. Department of Education detailing the rules and…

  7. Cytokine and satellite cell responses to muscle damage: interpretation and possible confounding factors in human studies.

    PubMed

    van de Vyver, M; Myburgh, K H

    2012-08-01

    It is plausible that multiple muscle biopsies following a muscle-damaging intervention can exacerbate the inflammatory and subsequent satellite cell responses. To elucidate confounding effects of the muscle biopsy procedure on satellite cell number, indirect markers of damage and the inflammatory response following acute downhill running (DHR) were investigated. Ten healthy male participants were divided into a non-exercising control (n = 4) and DHR (12 × 5-min bouts, 10% decline at 85% VO₂max) (n = 6) group. Blood samples were taken pre, post and every 24 h for 9 days. Serum was analysed for creatine kinase (CK), myoglobin (Mb), lactate dehydrogenase (LDH), TNF-α, IL-6 and IL-10. Muscle biopsies taken on days 1 and 2 post intervention from opposing legs were analysed for Pax7⁺ satellite cells. In the DHR group, Mb (536 ± 277 ng mL⁻¹), IL-6 (12.6 ± 4.7 pg mL⁻¹) and IL-10 (27.3 ± 11.5 pg mL⁻¹) peaked immediately post DHR, while CK (2651 ± 1911 U L⁻¹), LDH (202 ± 47 U L⁻¹) and TNF-α (25.1 ± 8.7 pg mL⁻¹) peaked on day 1. A 30% increase in Pax7⁺ satellite cells on day 1 in the DHR group was no longer apparent on day 2. H&E staining showed evidence of phagocytosis in the DHR group. No significant changes over time were observed in the control group for any of the variables measured. Events observed in the DHR group were a result of the intervention protocol and subsequent muscle damage. The relationship between SC proliferation and pro-inflammatory cytokine release appears to be complex since the IL-6/IL-10 response time differs significantly from the TNF-α response.

  8. Land surface controls on afternoon precipitation diagnosed from observational data: Uncertainties, confounding factors and the possible role of interception storage

    NASA Astrophysics Data System (ADS)

    Guillod, B. P.; Orlowsky, B.; Seneviratne, S. I.

    2013-12-01

    The feedback between soil moisture and precipitation has long been a topic of interest due to its potential for improving seasonal forecasts. The generally proposed feedbacks assume a control of soil moisture on the flux partitioning (i.e. the Evaporative Fraction, EF) at the land surface, which then influences precipitation. Our study (Guillod et al., in prep) addresses the poorly understood link between EF and precipitation by investigating the impact of before-noon EF on the frequency of afternoon precipitation over the contiguous US. We analyze remote sensing data products (EF from GLEAM, Miralles et al. 2011; radar precipitation from NEXRAD), FLUXNET station data, and the North American Regional Reanalysis (NARR). While most datasets agree on the existence of a region of positive relationship between EF and precipitation in the Eastern US (e.g. Findell et al. 2011), observation-based estimates indicate a stronger relationship in the Western US, which is not found in NARR. Investigating these differences, we find that much of these relationships can be explained by precipitation persistence alone, with ambiguous results on the additional role of EF. Regional analyses reveal contrasting mechanisms over different regions which fit well with the known distribution of vegetation cover and soil moisture-climate regimes. Over the Eastern US, our analyses suggest that the EF-precipitation feedback, if present, takes place on a short day-to-day time scale, where interception evaporation drives the relationship rather than soil moisture, due to the high forest cover and the wet regime. Over the Western US, the impact of EF on convection triggering is additionally linked to soil moisture variations, owing to the soil moisture-limited climate regime. References: Findell, K. L., et al., 2011: Probability of afternoon precipitation in eastern United States and Mexico enhanced by high evaporation. Nature Geosci., 4 (7), 434-439, doi:10.1038/ngeo1174, URL http

  9. 77 FR 43542 - Cost Accounting Standards: Cost Accounting Standards 412 and 413-Cost Accounting Standards...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-25

    ... BUDGET Office of Federal Procurement Policy 48 CFR Part 9904 Cost Accounting Standards: Cost Accounting Standards 412 and 413--Cost Accounting Standards Pension Harmonization Rule AGENCY: Cost Accounting... correcting amendments. SUMMARY: The Office of Federal Procurement Policy (OFPP), Cost Accounting...

  10. Relations between verbal and nonverbal memory performance: evidence of confounding effects particularly in patients with right temporal lobe epilepsy.

    PubMed

    Helmstaedter, C; Pohl, C; Elger, C E

    1995-06-01

    Confounding left hemisphere verbalization strategies are arguably a major problem in the evaluation of the assumed involvement of right temporo-limbic structures in "nonverbal" visual/figural memory processing. We addressed this issue by evaluating the easily verbalized Benton Visual Retention Test in 60 patients with either left (LTLE) or right temporal lobe epilepsy (RTLE) and 30 healthy controls. We formally estimated the informational (verbal) content of each item that would hypothetically be needed to retain the item from verbal memory alone. The results indicated confounding of verbal learning and figural memory only in the presence of right temporal lobe dysfunctions. Selective visual/figural learning deficits in RTLE patients became obvious when the verbal load of the figural material exceeded their verbal learning capacity. Instead of excluding verbalization by the use of abstract figural items, its inclusion provides a way to control for compensatory strategies that overshadow the presence of visual/figural memory deficits.

  11. Human Resource Accounting.

    DTIC Science & Technology

    1984-12-01

    Human Resource Accounting. Master's thesis by Joaquim C. Martins, Naval Postgraduate School, Monterey, California, December 1984; thesis advisor: R. A. McGonigal.

  12. Automated Attendance Accounting System; Patent Application.

    ERIC Educational Resources Information Center

    Chapman, Carl P.; And Others

    An automated accounting system, useful for applying data to a computer from a multiplicity of terminals, has the potential of replacing the manual attendance accounting system now employed in schools. The inventors claim that such a sophisticated system with terminals in the classrooms would enable school administrators not only to monitor simple…

  13. RISK REDUCTION FOR MATERIAL ACCOUNTABILITY UPGRADES.

    SciTech Connect

    FISHBONE, L.G.; SISKIND, B.

    2005-05-16

    We present in this paper a method for evaluating explicitly the contribution of nuclear material accountability upgrades to risk reduction at nuclear facilities. The method yields the same types of values for conditional risk reduction that physical protection and material control upgrades yield. Thereby, potential material accountability upgrades can be evaluated for implementation in the same way that protection and control upgrades are evaluated.

  14. Confounding by indication in non-experimental evaluation of vaccine effectiveness: the example of prevention of influenza complications

    PubMed Central

    Hak, E; Verheij, T.; Grobbee, D; Nichol, K; Hoes, A

    2002-01-01

    Randomised allocation of vaccine or placebo is the preferred method to assess the effects of the vaccine on clinical outcomes relevant to the individual patient. In the absence of phase 3 trials using clinical end points, notably post-influenza complications, alternative non-experimental designs to evaluate vaccine effects or safety are often used. The application of these designs may, however, lead to invalid estimates of vaccine effectiveness or safety. As patients with poor prognosis are more likely to be immunised, selection for vaccination is confounded by patient factors that are also related to clinical end points. This paper describes several design and analytical methods aimed at limiting or preventing this confounding by indication in non-experimental studies. In short, comparison of study groups with similar prognosis, restriction of the study population, and statistical adjustment for dissimilarities in prognosis are important tools and should be considered. Only if the investigator is able to show that confounding by indication has been sufficiently controlled can the results of a non-experimental study be used to direct an evidence-based vaccine policy. PMID:12461118

  15. Ideas for the Accounting Classroom.

    ERIC Educational Resources Information Center

    Kerby, Debra; Romine, Jeff

    2003-01-01

    Innovative ideas for accounting education include having students study accounting across historical periods, using businesses for student research, exploring nontraditional accounting careers, and collaborating with professional associations. (SK)

  16. Readability of Accounting Books.

    ERIC Educational Resources Information Center

    Razek, Joseph R.; And Others

    1982-01-01

    This article describes the results of a survey of the readability of most of the intermediate and advanced accounting textbooks currently in use at colleges and universities throughout the United States. (CT)

  17. Accounting Equals Applied Algebra.

    ERIC Educational Resources Information Center

    Roberts, Sondra

    1997-01-01

    Argues that students should be given mathematics credits for completing accounting classes. Demonstrates that, although the terminology is different, the mathematical concepts are the same as those used in an introductory algebra class. (JOW)

  18. Accounting and the Use of the Computer

    ERIC Educational Resources Information Center

    Irvin, Donald D.

    1969-01-01

    The nature, scope, and potential of electronic data processing are discussed as significant factors in the changing function and role of accountants. Decision making, not bookkeeping, is emerging as a realm of professional activity. (CH)

  19. 18 CFR 367.1840 - Account 184, Clearing accounts.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... ACT Balance Sheet Chart of Accounts Deferred Debits § 367.1840 Account 184, Clearing accounts. This account must include undistributed balances in clearing accounts at the date of the balance sheet... accounts. 367.1840 Section 367.1840 Conservation of Power and Water Resources FEDERAL ENERGY...

  20. Accounting for the environment.

    PubMed

    Lutz, E; Munasinghe, M

    1991-03-01

    Environmental awareness in the 1980s has led to efforts to improve the current UN System of National Accounts (SNA) for better measurement of the value of environmental resources when estimating income. National governments, the UN, the International Monetary Fund, and the World Bank are interested in solving this issue. The World Bank relies heavily on national aggregates in income accounts compiled by means of the SNA that was published in 1968 and stressed gross domestic product (GDP). GDP measures mainly market activity, but it does not consider the consumption of natural capital and thus indirectly inhibits sustained development. The deficiencies of the current method of accounting are inconsistent treatment of manmade and natural capital, the omission of natural resources and their depletion from balance sheets, and the handling of pollution cleanup costs in national income. In the calculation of GDP, pollution is overlooked, and beneficial environmental inputs are valued at zero. The calculation of environmentally adjusted net domestic product (EDP) and environmentally adjusted net income (ENI) would lower income and growth rate, as the World Resources Institute found with respect to Indonesia for 1971-84. When depreciation for oil, timber, and topsoil was included, the growth rate of net domestic product (NDP) was only 4%, compared with 7.1% for GDP. The World Bank has advocated environmental accounting since 1983 in SNA revisions. The 1989 revised Blue Book of the SNA takes environmental concerns into account. Relevant research is under way in Mexico and Papua New Guinea using the UN Statistical Office framework as a system for environmentally adjusted economic accounts that computes EDP and ENI and integrates environmental data with national accounts while preserving SNA concepts.

  1. Student Academic Performance in Undergraduate Managerial-Accounting Courses

    ERIC Educational Resources Information Center

    Al-Twaijry, Abdulrahman Ali

    2010-01-01

    The author's purpose was to identify potential factors possibly affecting student performance in three sequential management-accounting courses: Managerial Accounting (MA), Cost Accounting (CA), and Advanced Managerial Accounting (AMA) within the Saudi Arabian context. The sample, which was used to test the developed hypotheses, included 312…

  2. Reactive Aggression and Peer Victimization from Pre-Kindergarten to First Grade: Accounting for Hyperactivity and Teacher-Child Conflict

    ERIC Educational Resources Information Center

    Runions, Kevin C.

    2014-01-01

    Background: The role of reactive aggression in the development of peer victimization remains unclear due in part to a failure to account for confounding problems of behavioural undercontrol (e.g., hyperactivity). As well, the school social context has rarely been examined to see whether these risks are mediated by relationships with teachers.…

  3. Evaluating the effectiveness of air quality regulations: A review of accountability studies and frameworks.

    PubMed

    Henneman, Lucas R F; Liu, Cong; Mulholland, James A; Russell, Armistead G

    2017-02-01

    Assessments of past environmental policies, termed accountability studies, contribute important information to the decision-making process used to review the efficacy of past policies, and subsequently aid in the development of effective new policies. These studies have used a variety of methods that have achieved varying levels of success at linking improvements in air quality and/or health to regulations. The Health Effects Institute defines the air pollution accountability framework as a chain of events that includes the regulation of interest, air quality, exposure/dose, and health outcomes, and suggests that accountability research should address impacts for each of these linkages. Early accountability studies investigated short-term, local regulatory actions (for example, coal use banned city-wide on a specific date or traffic pattern changes made for Olympic Games). Recent studies assessed regulations implemented over longer time and larger spatial scales. Studies on broader scales require accountability research methods that account for effects of confounding factors that increase over time and space. Improved estimates of appropriate baseline levels (sometimes termed the "counterfactual": the expected state in a scenario without an intervention) that account for confounders and uncertainties at each link in the accountability chain will help estimate causality with greater certainty. In the direct accountability framework, researchers link outcomes with regulations using statistical methods that bypass the link-by-link approach of classical accountability. Direct accountability results and methods complement the classical approach. New studies should take advantage of advanced planning for accountability studies, new data sources (such as satellite measurements), and new statistical methods. Evaluation of new methods and data sources is necessary to improve investigations of long-term regulations, and associated uncertainty should be accounted for at each link to

  4. Adjustment for missing confounders in studies based on observational databases: 2-stage calibration combining propensity scores from primary and validation data.

    PubMed

    Lin, Hui-Wen; Chen, Yi-Hau

    2014-08-01

    Bias caused by missing or incomplete information on confounding factors constitutes an important challenge in observational studies. The incorporation of external data on more detailed confounding information into the main study data may help remove confounding bias. This work was motivated by a study of the association between chronic obstructive pulmonary disease and herpes zoster. Analyses were based on administrative databases in which information on important confounders (cigarette smoking and alcohol consumption) was lacking. We consider adjusting for the confounding bias arising from missing confounders by incorporating a validation sample with data on smoking and alcohol consumption obtained from a small-scale National Health Interview Survey study. We propose a 2-stage calibration (TSC) method, which summarizes the confounding information through propensity scores and combines the analysis results from the main and the validation study samples, where the propensity score adjustment from the main sample is crude and that from the validation sample is more precise. Unlike the existing methods, the validity of the TSC approach does not rely on any specific measurement error model. When applying the TSC method to the motivating study above, the odds ratio of herpes zoster associated with chronic obstructive pulmonary disease is 1.91 (95% confidence interval: 1.62, 2.26) after adjustment for cumulative smoking and alcohol consumption.
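
    A simplified Python sketch of the combination step behind a two-stage calibration idea: the crude (missing-confounder) estimate from the main study is shifted by the difference between the crude and fully adjusted estimates observed in the validation sample, and the variances are added. This is only a generic illustration under strong assumptions (independent samples, transportable correction), not the authors' estimator; all inputs are hypothetical log odds ratios.

        import math

        def two_stage_calibration(beta_main_crude, var_main_crude,
                                  beta_val_crude, beta_val_full, var_val_diff):
            """Combine main-study and validation-study log odds ratios."""
            # How much does full confounder adjustment shift the estimate
            # in the validation sample?
            correction = beta_val_full - beta_val_crude
            beta_cal = beta_main_crude + correction
            var_cal = var_main_crude + var_val_diff  # assumes independent samples
            se = math.sqrt(var_cal)
            return beta_cal, (beta_cal - 1.96 * se, beta_cal + 1.96 * se)

        beta, ci = two_stage_calibration(beta_main_crude=0.70, var_main_crude=0.004,
                                         beta_val_crude=0.78, beta_val_full=0.65,
                                         var_val_diff=0.002)
        # Report on the odds ratio scale with a 95% confidence interval.
        print(math.exp(beta), tuple(math.exp(x) for x in ci))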

  5. Diminished KCC2 confounds synapse-specificity of LTP during senescence

    PubMed Central

    Ferando, Isabella; Faas, Guido; Mody, Istvan

    2016-01-01

    Synapse-specificity of LTP ensures that no interference arises from inputs irrelevant to the memory to be encoded. In hippocampi of aged (21- to 28-month-old) mice, LTP spread to unstimulated synapses, compromising its synapse-specificity. Diminished levels of the K+/Cl– cotransporter KCC2 and a depolarizing GABAA receptor-mediated synaptic component following LTP were the most likely causes for spreading the potentiation, unveiling novel mechanisms hindering information storage in the aged brain, and identifying KCC2 as a potential target for intervention. PMID:27500406

  6. Land surface controls on afternoon precipitation diagnosed from observational data: Uncertainties and confounding factors

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The feedback between soil moisture and precipitation has long been a topic of interest due to its potential for improving weather and seasonal forecasts. The generally proposed mechanism assumes a control of soil moisture on precipitation via the partitioning of the surface fluxes (the Evaporative F...

  7. Meta-analysis of lung cancer in asphalt roofing and paving workers with external adjustment for confounding by coal tar

    SciTech Connect

    Fayerweather, W.E.

    2007-07-01

    The study's objectives were to update Partanen's and Boffetta's 1994 meta-analysis of lung cancer among roofing and paving asphalt workers and explore the role of coal tar in explaining the statistical heterogeneity among these studies. Information retrieval strategies and eligibility criteria were defined for identifying the epidemiologic studies to be included in the analysis. The relative risk ratio (RR) for lung cancer was selected as the effect measure of interest. Coal tar bias factors were developed and used to externally adjust each eligible study's published RR for confounding by coal tar. The meta-Relative Risk (meta-RR) and its variance were estimated by general variance-based methods. Heterogeneity of the RRs was assessed by heterogeneity chi-square and I² tests. The results from this update were similar to those in Partanen's and Boffetta's original meta-analysis. Although the meta-RRs for the roofers and the pavers were no longer statistically significantly different from one another, significant heterogeneity remained within each of the coal tar-adjusted sectors. Meta-analysis of non-experimental epidemiologic studies is subject to significant uncertainties as is externally correcting studies for confounding. Given these uncertainties, the specific quantitative estimates in this (or any similar) analysis must be viewed with caution. Nevertheless, this analysis provides support for the hypothesis proposed by several major reviewers that confounding by coal tar-related PAH exposures may explain most or all of the lung cancer risks found in the epidemiologic literature on asphalt roofing and paving workers.
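
    A minimal Python sketch of the general workflow described above, assuming each study's published RR is first divided by its coal-tar bias factor and the adjusted estimates are then pooled with fixed-effect inverse-variance weights. The study RRs, confidence intervals and bias factors below are invented for illustration and are not the studies analysed in the paper.

        import math

        studies = [  # (published RR, 95% CI lower, 95% CI upper, assumed coal-tar bias factor)
            (1.40, 1.10, 1.78, 1.15),
            (1.10, 0.85, 1.42, 1.00),
            (1.75, 1.20, 2.55, 1.30),
        ]

        weights, adj_log_rrs = [], []
        for rr, lo, hi, bias in studies:
            log_rr_adj = math.log(rr / bias)                  # external bias adjustment
            se = (math.log(hi) - math.log(lo)) / (2 * 1.96)   # SE recovered from the CI width
            weights.append(1 / se ** 2)                       # inverse-variance weight
            adj_log_rrs.append(log_rr_adj)

        pooled = sum(w * b for w, b in zip(weights, adj_log_rrs)) / sum(weights)
        pooled_se = math.sqrt(1 / sum(weights))
        print("meta-RR:", round(math.exp(pooled), 2),
              "95% CI:", (round(math.exp(pooled - 1.96 * pooled_se), 2),
                          round(math.exp(pooled + 1.96 * pooled_se), 2)))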

  8. Viewpoints on Accountability.

    ERIC Educational Resources Information Center

    Educational Innovators Press, Tucson, AZ.

    This booklet contains five papers which examine the activities, successes, and pitfalls encountered by educators who are introducing accountability techniques into instructional programs where they did not exist in the past. The papers are based on actual programs and offer possible solutions in the areas considered, which are 1) performance…

  9. Making Accountability Really Count

    ERIC Educational Resources Information Center

    Resnick, Lauren B.

    2006-01-01

    Standards-based education has now reached a stage where it is possible to evaluate its overall effectiveness. Several earlier papers in the special issue of "Educational Measurement: Issues and Practice" on "Test Scores and State Accountability" (Volume 24, Number 4) examined specific state policies and their effects on schools…

  10. Accountability Update, March 2000.

    ERIC Educational Resources Information Center

    Washington State Higher Education Coordinating Board, Olympia.

    This report provides the Washington State legislature, the Governor, and other interested parties with an update on the accountability performance of each of the state's public baccalaureate institutions (Central Washington University, Eastern Washington University, Evergreen State College, Washington State University, Western Washington…

  11. Educational Accounting Procedures.

    ERIC Educational Resources Information Center

    Tidwell, Sam B.

    This chapter of "Principles of School Business Management" reviews the functions, procedures, and reports with which school business officials must be familiar in order to interpret and make decisions regarding the school district's financial position. Among the accounting functions discussed are financial management, internal auditing,…

  12. Professional Capital as Accountability

    ERIC Educational Resources Information Center

    Fullan, Michael; Rincón-Gallardo, Santiago; Hargreaves, Andy

    2015-01-01

    This paper seeks to clarify and spells out the responsibilities of policy makers to create the conditions for an effective accountability system that produces substantial improvements in student learning, strengthens the teaching profession, and provides transparency of results to the public. The authors point out that U.S. policy makers will need…

  13. Accountability for Productivity

    ERIC Educational Resources Information Center

    Wellman, Jane

    2010-01-01

    Productivity gains in higher education won't be made just by improving cost effectiveness or even performance. They need to be documented, communicated, and integrated into a strategic agenda to increase attainment. This requires special attention to "accountability" for productivity, meaning public presentation and communication of evidence about…

  14. Legal responsibility and accountability.

    PubMed

    Cox, Chris

    2010-06-01

    Shifting boundaries in healthcare roles have led to anxiety among some nurses about their legal responsibilities and accountabilities. This is partly because of a lack of education about legal principles that underpin healthcare delivery. This article explains the law in terms of standards of care, duty of care, vicarious liability and indemnity insurance.

  15. Democracy, Accountability, and Education

    ERIC Educational Resources Information Center

    Levinson, Meira

    2011-01-01

    Educational standards, assessments, and accountability systems are of immense political moment around the world. But there is no developed theory exploring the role that these systems should play within a democratic polity in particular. On the one hand, well-designed standards are public goods, supported by assessment and accountability…

  16. Community Accountability Conferencing.

    ERIC Educational Resources Information Center

    Thorsborne, Margaret

    Community Accountability Conferencing (CAC) was first introduced in Queensland, Australia schools in early 1994 after a serious assault in the school community. Some family members, students, and staff were dissatisfied with the solution of suspending the offenders. Seeking an alternative, comprehensive intervention strategy, the school community…

  17. Planning for Accountability.

    ERIC Educational Resources Information Center

    Cuneo, Tim; Bell, Shareen; Welsh-Gray, Carol

    1999-01-01

    Through its Challenge 2000 program, Joint Venture: Silicon Valley Network's 21st Century Education Initiative has been working with K-12 schools to improve student performance in literature, math, and science. Clearly stated standards, appropriate assessments, formal monitoring, critical friends, and systemwide accountability are keys to success.…

  18. Institutional Accountability Report, 2000.

    ERIC Educational Resources Information Center

    Santa Fe Community Coll., Gainesville, FL. Office of Institutional Research and Planning.

    This document discusses Santa Fe Community College's (SFCC) (Florida) five accountability measures. The type of data available provided on these measures is as follows: (1) District High School Enrollment Report and Retention and Success Rate Report; (2) Associate of Arts Degree Transfer Performance in the State University System; (3) Licensure…

  19. Fiscal Accounting Manual.

    ERIC Educational Resources Information Center

    California State Dept. of Housing and Community Development, Sacramento. Indian Assistance Program.

    Written in simple, easy to understand form, the manual provides a vehicle for the untrained person in bookkeeping to control funds received from grants for Indian Tribal Councils and Indian organizations. The method used to control grants (federal, state, or private) is fund accounting, designed to organize rendering services on a non-profit…

  20. Curtail Accountability, Cultivate Attainability

    ERIC Educational Resources Information Center

    Wraga, William G.

    2011-01-01

    The current test-driven accountability movement, codified in the No Child Left Behind Act of 2001 ([NCLB] 2002), was a misguided idea that will have the effect not of improving the education of children and youth, but of indicting the public school system of the United States. To improve education in the United States, politicians, policy makers,…

  1. Higher Education Accountability Plans

    ERIC Educational Resources Information Center

    Washington Higher Education Coordinating Board, 2003

    2003-01-01

    Washington state's public four-year universities and college have submitted their 2003-05 accountability plans to the Higher Education Coordinating Board (HECB). The state operating budget directs the Board to review these plans and set biennial performance targets for each institution. For 2003-05, the four-year institutions are reporting on a…

  2. Accounting 202, 302.

    ERIC Educational Resources Information Center

    Manitoba Dept. of Education, Winnipeg.

    This teaching guide consists of guidelines for conducting two secondary-level introductory accounting courses. Intended for vocational business education students, the courses are designed to introduce financial principles and practices important to personal and business life, to promote development of clerical and bookkeeping skills sufficient…

  3. Student Attendance Accounting Manual.

    ERIC Educational Resources Information Center

    Freitas, Joseph M.

    In response to state legislation authorizing procedures for changes in academic calendars and measurement of student workload in California community colleges, this manual from the Chancellor's Office provides guidelines for student attendance accounting. Chapter 1 explains general items such as the academic calendar, admissions policies, student…

  4. Full Accounting for Curriculum.

    ERIC Educational Resources Information Center

    Paddock, Marie-Louise

    1988-01-01

    Given the curriculum's importance in the educational process, curriculum evaluation should be considered as essential as a district financial audit. When Fenwick English conducted a 1979 curriculum audit of Columbus, Ohio, schools, the accounting firm encountered numerous problems concerning development, review, and management practices. Planning…

  5. Excel in the Accounting Curriculum: Perceptions from Accounting Professors

    ERIC Educational Resources Information Center

    Ramachandran Rackliffe, Usha; Ragland, Linda

    2016-01-01

    Public accounting firms emphasize the importance of accounting graduates being proficient in Excel. Since many accounting graduates often aspire to work in public accounting, a question arises as to whether there should be an emphasis on Excel in accounting education. The purpose of this paper is to specifically look at this issue by examining…

  6. A Pariah Profession? Some Student Perceptions of Accounting and Accountancy.

    ERIC Educational Resources Information Center

    Fisher, Roy; Murphy, Vivienne

    1995-01-01

    Existing literature and a survey of 106 undergraduate accounting students in the United Kingdom were analyzed for perceptions of the accounting profession and the academic discipline of accounting. Results suggest that among accounting and nonaccounting students alike, there exist coexisting perceptions of accounting as having high status and low…

  7. 18 CFR 367.1420 - Account 142, Customer accounts receivable.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 18 Conservation of Power and Water Resources 1 2010-04-01 2010-04-01 false Account 142, Customer... GAS ACT Balance Sheet Chart of Accounts Current and Accrued Assets § 367.1420 Account 142, Customer accounts receivable. (a) This account must include amounts due from customers for service, and...

  8. MATERIAL CONTROL ACCOUNTING INMM

    SciTech Connect

    Hasty, T.

    2009-06-14

    Since 1996, the Mining and Chemical Combine (MCC - formerly known as K-26), and the United States Department of Energy (DOE) have been cooperating under the cooperative Nuclear Material Protection, Control and Accounting (MPC&A) Program between the Russian Federation and the U.S. Governments. Since MCC continues to operate a reactor for steam and electricity production for the site and city of Zheleznogorsk which results in production of the weapons grade plutonium, one of the goals of the MPC&A program is to support implementation of an expanded comprehensive nuclear material control and accounting (MC&A) program. To date MCC has completed upgrades identified in the initial gap analysis and documented in the site MC&A Plan and is implementing additional upgrades identified during an update to the gap analysis. The scope of these upgrades includes implementation of MCC organization structure relating to MC&A, establishing material balance area structure for special nuclear materials (SNM) storage and bulk processing areas, and material control functions including SNM portal monitors at target locations. Material accounting function upgrades include enhancements in the conduct of physical inventories, limit of error inventory difference procedure enhancements, implementation of basic computerized accounting system for four SNM storage areas, implementation of measurement equipment for improved accountability reporting, and both new and revised site-level MC&A procedures. This paper will discuss the implementation of MC&A upgrades at MCC based on the requirements established in the comprehensive MC&A plan developed by the Mining and Chemical Combine as part of the MPC&A Program.

  9. Microbiological diagnosis of sepsis: the confounding effects of a "gold standard".

    PubMed

    Mancini, Nicasio; Burioni, Roberto; Clementi, Massimo

    2015-01-01

    The need for rapid and sensitive diagnostic techniques for sepsis grows more compelling every day. Its morbidity and mortality burden is dramatically high, with one quarter of patients eventually dying. Several diagnostic advances have been made in recent years using both molecular and nonmolecular approaches, and they need to be shared broadly in the scientific community, including from a technical point of view. The initial chapters of this book give a thorough overview of the current state of the art in the diagnosis of sepsis. The remaining chapters provide a broad range of protocols describing both established and emerging tools, covering both microbiological and nonmicrobiological aspects. The potential role of each protocol is highlighted by a brief introduction to the specific topic of each chapter. A final chapter, describing algorithms potentially useful for stratifying the risk of sepsis in individual patients and suggesting future perspectives in the diagnosis of sepsis, closes the book.

  10. Some effects of dust on photometry of high-z galaxies: Confounding the effects of evolution

    NASA Technical Reports Server (NTRS)

    Thronson, H. A., Jr.; Witt, A. N.; Capuano, J.

    1993-01-01

    Photometric observations of very distant galaxies--e.g., color vs. z or magnitude vs. z--have been used over the past decade or so in investigations into the evolution of the stellar component. Numerous studies have predicted significant color variations as a result of evolution, in addition to the shifting of different rest wavelengths into the band of observation. Although there is significant scatter, the data can be fit with relatively straightforward, plausible models for galaxian evolution. In very few cases are the effects of dust extinction included in the models. This is due in large part to the uncertainty about the distribution and optical properties of the grains, and even whether or not they are present in significant numbers in some types of galaxies such as ellipticals. It is likely that the effects of dust on broadband observations are the greatest uncertainty in studies of very distant galaxies. We use a detailed Monte Carlo radiative transfer model within a spherical geometry for different star/dust distributions to examine the effects of dust on the broadband colors of galaxies as a function of redshift. The model fully accounts for absorption and angular redistribution in scattering. In this summary, we consider only the effects on color vs. redshift for three simple geometries, each with the same total dust optical depth. Elsewhere at this conference, Capuano, Thronson, & Witt consider other effects of altering the relative dust/star distribution.

  11. Accounting Issues: An Essay Series Part VII--Liabilities

    ERIC Educational Resources Information Center

    Laux, Judy

    2008-01-01

    This article, the seventh in the series, presents accounting for liabilities along with some related conceptual and measurement issues. Additional coverage is devoted to potential ethical dilemmas and both theoretical and empirical literature related to this set of accounting elements.

  12. Radiology applications of financial accounting.

    PubMed

    Leibenhaut, Mark H

    2005-03-01

    A basic knowledge of financial accounting can help radiologists analyze business opportunities and examine the potential impacts of new technology or predict the adverse consequences of new competitors entering their service area. The income statement, balance sheet, and cash flow statement are the three basic financial statements that document the current financial position of the radiology practice and allow managers to monitor the ongoing financial operations of the enterprise. Pro forma, or hypothetical, financial statements can be generated to predict the financial impact of specific business decisions or investments on the profitability of the practice. Sensitivity analysis, or what-if scenarios, can be performed to determine the potential impact of changing key revenue, investment, operating cost or financial assumptions. By viewing radiology as both a profession and a business, radiologists can optimize their use of scarce economic resources and maximize the return on their financial investments.
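
    As a rough illustration of the kind of what-if scenario mentioned above, the following Python sketch varies one key assumption (annual study volume) and recomputes a pro forma operating income for an imaging service. The function, parameter names and all dollar figures are hypothetical and are not taken from the article.

        def operating_income(volume, revenue_per_study, fixed_cost, variable_cost):
            """Pro forma operating income under simple linear cost assumptions."""
            return volume * (revenue_per_study - variable_cost) - fixed_cost

        base = dict(revenue_per_study=450.0, fixed_cost=900_000.0, variable_cost=120.0)
        for volume in (2500, 3000, 3500):   # what-if scenarios on study volume
            print(volume, operating_income(volume, **base))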

  13. First-Person Accounts.

    ERIC Educational Resources Information Center

    Gribs, H.; And Others

    1995-01-01

    Personal accounts describe the lives of 2 individuals with deaf-blindness, one an 87-year-old woman who was deaf from birth and became totally blind over a 50-year period and the other of a woman who became deaf-blind as a result of a fever at the age of 7. Managing activities of daily life and experiencing sensory hallucinations are among topics…

  14. Managing global accounts.

    PubMed

    Yip, George S; Bink, Audrey J M

    2007-09-01

    Global account management--which treats a multinational customer's operations as one integrated account, with coherent terms for pricing, product specifications, and service--has proliferated over the past decade. Yet according to the authors' research, only about a third of the suppliers that have offered GAM are pleased with the results. The unhappy majority may be suffering from confusion about when, how, and to whom to provide it. Yip, the director of research and innovation at Capgemini, and Bink, the head of marketing communications at Uxbridge College, have found that GAM can improve customer satisfaction by 20% or more and can raise both profits and revenues by at least 15% within just a few years of its introduction. They provide guidelines to help companies achieve similar results. The first steps are determining whether your products or services are appropriate for GAM, whether your customers want such a program, whether those customers are crucial to your strategy, and how GAM might affect your competitive advantage. If moving forward makes sense, the authors' exhibit, "A Scorecard for Selecting Global Accounts," can help you target the right customers. The final step is deciding which of three basic forms to offer: coordination GAM (in which national operations remain relatively strong), control GAM (in which the global operation and the national operations are fairly balanced), and separate GAM (in which a new business unit has total responsibility for global accounts). Given the difficulty and expense of providing multiple varieties, the vast majority of companies should initially customize just one---and they should be careful not to start with a choice that is too ambitious for either themselves or their customers to handle.

  15. Accounting for Every Kilowatt

    DTIC Science & Technology

    2014-10-01

    Equation 1. One estimate of the energy density of diesel fuel (ρdiesel) coupled with the efficiency (η) of a 60-kilowatt generator operating at...are going. Reducing demand without reducing our capability requires appliance-level feedback, which current smart-meter technology does not...event. Accountability: Soldiers need appliance-level feedback to reduce electrical consumption. Specifically, they need to know what loads are currently

  16. Integrated Cost Accounting System.

    DTIC Science & Technology

    1992-07-27

    few other companies. Harvard Business Review contained articles explaining the ideas behind the new costing methods and examples of applications...technical report. Peter Drucker, in an article in Harvard Business Review, carefully explains that accounting must change in response to the changes in...Kaplan, in a Harvard Business Review article, develops the idea of four levels of activities: facility-sustaining activities; product-sustaining activities

  17. Variation in faecal water content may confound estimates of gastro-intestinal parasite intensity in wild African herbivores.

    PubMed

    Turner, W C; Cizauskas, C A; Getz, W M

    2010-03-01

    Estimates of parasite intensity within host populations are essential for many studies of host-parasite relationships. Here we evaluated the seasonal, age- and sex-related variability in faecal water content for two wild ungulate species, springbok (Antidorcas marsupialis) and plains zebra (Equus quagga). We then assessed whether or not faecal water content biased conclusions regarding differences in strongyle infection rates by season, age or sex. There was evidence of significant variation in faecal water content by season and age for both species, and by sex in springbok. Analyses of faecal egg counts demonstrated that sex was a near-significant factor in explaining variation in strongyle parasite infection rates in zebra (P = 0.055) and springbok (P = 0.052) using wet-weight faecal samples. However, once these intensity estimates were re-scaled by the percent of dry matter in the faeces, sex was no longer a significant factor (zebra, P = 0.268; springbok, P = 0.234). These results demonstrate that variation in faecal water content may confound analyses and could produce spurious conclusions, as was the case with host sex as a factor in the analysis. We thus recommend that researchers assess whether water variation could be a confounding factor when designing and performing research using faecal indices of parasite intensity.
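
    A small Python sketch of the re-scaling step described above: eggs per gram of wet faeces are converted to eggs per gram of dry matter so that differences in faecal water content do not masquerade as differences in parasite intensity. The counts and dry-matter percentages below are hypothetical.

        def epg_dry(epg_wet, percent_dry_matter):
            """Convert eggs per gram of wet faeces to eggs per gram of dry matter."""
            return epg_wet / (percent_dry_matter / 100.0)

        # Two samples with the same wet-weight count but different water content:
        print(epg_dry(500, percent_dry_matter=50))  # 1000 eggs per gram dry matter
        print(epg_dry(500, percent_dry_matter=25))  # 2000 eggs per gram dry matter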

  18. Prior event rate ratio adjustment for hidden confounding in observational studies of treatment effectiveness: a pairwise Cox likelihood approach.

    PubMed

    Lin, Nan Xuan; Henley, William Edward

    2016-12-10

    Observational studies provide a rich source of information for assessing effectiveness of treatment interventions in many situations where it is not ethical or practical to perform randomized controlled trials. However, such studies are prone to bias from hidden (unmeasured) confounding. A promising approach to identifying and reducing the impact of unmeasured confounding is prior event rate ratio (PERR) adjustment, a quasi-experimental analytic method proposed in the context of electronic medical record database studies. In this paper, we present a statistical framework for using a pairwise approach to PERR adjustment that removes bias inherent in the original PERR method. A flexible pairwise Cox likelihood function is derived and used to demonstrate the consistency of the simple and convenient alternative PERR (PERR-ALT) estimator. We show how to estimate standard errors and confidence intervals for treatment effect estimates based on the observed information and provide R code to illustrate how to implement the method. Assumptions required for the pairwise approach (as well as PERR) are clarified, and the consequences of model misspecification are explored. Our results confirm the need for researchers to consider carefully the suitability of the method in the context of each problem. Extensions of the pairwise likelihood to more complex designs involving time-varying covariates or more than two periods are considered. We illustrate the application of the method using data from a longitudinal cohort study of enzyme replacement therapy for lysosomal storage disorders. © 2016 The Authors. Statistics in Medicine Published by John Wiley & Sons Ltd.
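
    The following Python fragment illustrates only the basic PERR ratio idea that motivates the paper, not the pairwise Cox likelihood or the PERR-ALT estimator it develops: the post-treatment hazard ratio is divided by the prior-period hazard ratio, so that time-fixed hidden confounding that acts equally in both periods cancels out. The hazard ratios used here are hypothetical.

        import math

        def perr_adjusted_hr(log_hr_post, log_hr_prior):
            """PERR-style adjustment on the log-hazard scale."""
            return math.exp(log_hr_post - log_hr_prior)

        # An apparent benefit after treatment (HR 0.70) partly reflects confounding
        # that was already visible before treatment start (HR 0.85):
        print(perr_adjusted_hr(math.log(0.70), math.log(0.85)))  # ~0.82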

  19. A principal stratification approach for evaluating natural direct and indirect effects in the presence of treatment-induced intermediate confounding.

    PubMed

    Taguri, Masataka; Chiba, Yasutaka

    2015-01-15

    Recently, several authors have shown that natural direct and indirect effects (NDEs and NIEs) can be identified under the sequential ignorability assumptions, as long as there is no mediator-outcome confounder that is affected by the treatment. However, if such a confounder exists, NDEs and NIEs will generally not be identified without making additional identifying assumptions. In this article, we propose novel identification assumptions and estimators for evaluating NDEs and NIEs under the usual sequential ignorability assumptions, using the principal stratification framework. It is assumed that the treatment and the mediator are dichotomous. We must impose strong assumptions for identification. However, even if these assumptions were violated, the bias of our estimator would be small under typical conditions, which can be easily evaluated from the observed data. This conjecture is confirmed for binary outcomes by deriving the bounds of the bias terms. In addition, the advantage of our estimator is illustrated through a simulation study. We also propose a method of sensitivity analysis that examines what happens when our assumptions are violated. We apply the proposed method to data from the National Center for Health Statistics.

  20. A demonstration that task difficulty can confound the interpretation of lateral differences in brain activation between typical and dyslexic readers.

    PubMed

    Fisher, Janet McGraw; Liederman, Jacqueline; Johnsen, Jami; Lincoln, Alexis; Frye, Richard

    2012-01-01

    Dyslexic readers (DRs) manifest atypical patterns of brain activity, which may be attributed to aberrant neural connectivity and/or an attempt to activate compensatory pathways. This paper evaluates whether differences in brain activation patterns between DRs and typical readers (TRs) are confounded by task difficulty. Eight DRs and eight TRs matched for age, sex, and nonverbal IQ performed pseudoword rhyming tasks at two levels of difficulty during magnetoencephalography. Task difficulty varied with the number of successive target pseudowords presented before the test pseudoword. Regions of interest were: the temporoparietal area (TPA), the ventral occipital temporal area (VOT), and the inferior frontal gyrus (IFG). Activity was analysed for the 660-ms period after test pseudoword onset. During the discrepant performance condition left hemispheric TPA activation increased across time for TRs, but not DRs, and IFG bihemispheric activation was greater in TRs by the end of the trial. During the equivalent performance condition no group differences in TPA or IFG activation were found. We argue that these results indicate that direct comparison of DR versus TR brain activity is confounded when DRs are more challenged than TRs. This highlights the importance of equating reading group performance during neuroimaging of reading-related tasks.

  1. An apparent-motion confound causes the negative exogenous cuing effect at SOAs with larger numbers of target locations.

    PubMed

    Chen, Peii; Mordkoff, J Toby

    2012-01-01

    Salient but irrelevant stimuli seem to cause an automatic orienting of covert attention, facilitating the detection of targets at the cued location for a brief period of time. However, this finding is highly dependent on the number of possible target locations, at least when the simple detection of targets is all that the task requires. Whereas small numbers of possible target locations (e.g., 2 or 3) produce the well-known advantage in response time for valid cue trials (i.e., a positive cuing effect), larger numbers of possible target locations (e.g., 6 or 8) produce a negative cuing effect. If not explained in terms of a nonattentional mechanism, this latter finding raises serious questions about the standard interpretation of positive cuing effects. The present experiment tested a particular nonattentional mechanism: that a confound between target presence and apparent motion, which occurs only on invalid cue trials, is responsible for the negative cuing effect. We reduced or eliminated this confound by the use of a new type of catch trial and eliminated the negative cuing effect with large numbers of target locations.

  2. Hospitals' Internal Accountability

    PubMed Central

    Kraetschmer, Nancy; Jass, Janak; Woodman, Cheryl; Koo, Irene; Kromm, Seija K.; Deber, Raisa B.

    2014-01-01

    This study aimed to enhance understanding of the dimensions of accountability captured and not captured in acute care hospitals in Ontario, Canada. Based on an Ontario-wide survey and follow-up interviews with three acute care hospitals in the Greater Toronto Area, we found that the two dominant dimensions of hospital accountability being reported are financial and quality performance. These two dimensions drove both internal and external reporting. Hospitals' internal reports typically included performance measures that were required or mandated in external reports. Although respondents saw reporting as a valuable mechanism for hospitals and the health system to monitor and track progress against desired outcomes, multiple challenges with current reporting requirements were communicated, including the following: 58% of survey respondents indicated that performance-reporting resources were insufficient; manual data capture and performance reporting were prevalent, with the majority of hospitals lacking sophisticated tools or technology to effectively capture, analyze and report performance data; hospitals tended to focus on those processes and outcomes with high measurability; and 53% of respondents indicated that valuable cross-system accountability, performance measures or both were not captured by current reporting requirements. PMID:25305387

  3. Inflation Accounting Methods and their Effectiveness.

    DTIC Science & Technology

    accounting and current cost accounting are explained as the major inflation accounting methods. Inflation accounting standards announced in the United...inflation accounting, constant purchasing power accounting, constant dollar accounting, current cost accounting, current value.

  4. Modeling Lung Carcinogenesis in Radon-Exposed Miner Cohorts: Accounting for Missing Information on Smoking.

    PubMed

    van Dillen, Teun; Dekkers, Fieke; Bijwaard, Harmen; Brüske, Irene; Wichmann, H-Erich; Kreuzer, Michaela; Grosche, Bernd

    2016-05-01

    Epidemiological miner cohort data used to estimate lung cancer risks related to occupational radon exposure often lack cohort-wide information on exposure to tobacco smoke, a potential confounder and important effect modifier. We have developed a method to project data on smoking habits from a case-control study onto an entire cohort by means of a Monte Carlo resampling technique. As a proof of principle, this method is tested on a subcohort of 35,084 former uranium miners employed at the WISMUT company (Germany), with 461 lung cancer deaths in the follow-up period 1955-1998. After applying the proposed imputation technique, a biologically-based carcinogenesis model is employed to analyze the cohort's lung cancer mortality data. A sensitivity analysis based on a set of 200 independent projections with subsequent model analyses yields narrow distributions of the free model parameters, indicating that parameter values are relatively stable and independent of individual projections. This technique thus offers a possibility to account for unknown smoking habits, enabling us to unravel risks related to radon, to smoking, and to the combination of both.
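
    As a rough illustration of the resampling idea described above, the following minimal Python sketch draws smoking categories for a cohort from prevalences estimated in a case-control sample and repeats the projection many times; the category labels, prevalences and seed are illustrative assumptions, not values from the WISMUT study.

    ```python
    # Hedged sketch: projecting smoking habits from a case-control sample onto a
    # cohort by Monte Carlo resampling, loosely following the idea described above.
    import numpy as np

    rng = np.random.default_rng(seed=1)

    # Smoking prevalence estimated from the case-control sample (assumed numbers).
    categories = np.array(["never", "ex", "current"])
    prevalence = np.array([0.25, 0.35, 0.40])   # must sum to 1

    n_miners = 35_084        # cohort size taken from the abstract
    n_projections = 200      # number of independent projections, as in the abstract

    projections = []
    for _ in range(n_projections):
        # Draw a smoking category for every cohort member with unknown habits.
        drawn = rng.choice(categories, size=n_miners, p=prevalence)
        projections.append(drawn)

    # Each projection would then feed the carcinogenesis model; here we just check
    # how stable the imputed prevalence is across projections.
    current_share = [np.mean(p == "current") for p in projections]
    print(f"imputed share of current smokers: "
          f"{np.mean(current_share):.3f} ± {np.std(current_share):.3f}")
    ```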

  5. Harnessing Facebook for Student Engagement in Accounting Education: Guiding Principles for Accounting Students and Educators

    ERIC Educational Resources Information Center

    Stone, Gerard; Fiedler, Brenton Andrew; Kandunias, Chris

    2014-01-01

    This paper proposes principles to guide accounting students' and accounting educators' use of Facebook as an educational resource to engage students with their learning. A body of cross-disciplinary research has investigated potential applications of Facebook to invigorate student engagement. Generic guidelines for educators who are contemplating…

  6. Diffuse large B-cell lymphoma in colon confounded by prior history of colorectal cancer: A case report and literature review.

    PubMed

    Ren, Yanling; Chen, Zhilu; Su, Chuanyong; Tong, Hongyan; Qian, Wenbin

    2016-02-01

    A 66-year-old male underwent left hemicolectomy for rectal adenocarcinoma in 2008. Five years later he was admitted to hospital with abdominal pain. A computed tomography scan revealed notable thickening of the wall of the mid ascending colon, and colonoscopy revealed a 3×3 cm ulcerofungating mass in the cecum extending to the ascending colon. Because recurrence of the colorectal cancer was suspected, laparoscopic right hemicolectomy was performed directly. Surgical specimens revealed sheets of large pleomorphic lymphoid cells with nuclei of varying sizes and with nucleoli and mitotic figures visible in most cells. The cells were diffusely positive for CD45, CD20 and CD79a, but negative for CD3, CD5, Bcl-2, Bcl-6 and ALK. The Ki-67 proliferation index was 40%. Epstein-Barr virus in situ hybridization did not reveal any positive signals in the tumor cells. Based on these findings, the recurrent tumor was diagnosed as diffuse large B-cell lymphoma. The patient might have avoided surgery and received chemotherapy alone; however, the diagnosis was confounded by the prior history of colorectal cancer, as colon lymphoma following rectal cancer in the same patient is rare. It is therefore essential to carefully investigate and differentiate potential lesions during routine postoperative colonoscopy following colorectal cancer surgery, as patients may present with rare colon lymphoma that can be mistaken for a recurrence of colorectal cancer.

  7. The relationship between motor performance and parent-rated executive functioning in 3- to 5-year-old children: What is the role of confounding variables?

    PubMed

    Houwen, Suzanne; van der Veer, Gerda; Visser, Jan; Cantell, Marja

    2017-01-30

    It is generally agreed that motor performance and executive functioning (EF) are intertwined. As the literature on this issue concerning preschool children is scarce, we examined the relationship between motor performance and parent-rated EF in a sample of 3- to 5-year-old children with different levels of motor skill proficiency, while controlling for age, gender, socio-economic status (SES), and attention-deficit-hyperactivity disorder (ADHD) symptomatology. EF was reported by parents of 153 children (mean age 4 years 1 month, SD 8 months; 75 male) by means of the Behaviour Rating Inventory of Executive Function-Preschool version (BRIEF-P). Parent-reported ADHD symptoms were assessed using the Hyperactivity-Inattention subscale of the Strengths and Difficulties Questionnaire 3-4. In addition, the children performed the Movement Assessment Battery for Children-2 (MABC-2). Several weak to moderate relationships were found between the MABC-2 Total Score and the EF subscales. Once other variables such as age, gender, SES, and ADHD symptomatology were taken into account, the only BRIEF-P subscale that was associated with the MABC-2 Total Score was the Working Memory subscale. Compared to their typically developing peers, children at risk for motor coordination difficulties (at or below the 16th percentile on the MABC-2) performed poorly on the Working Memory subscale, which confirms the results of the regression analyses. However, the at-risk group also performed significantly worse on the Planning/Organize subscale. This is one of the first studies investigating the relationship between motor performance and parent-rated EF in such a young age group. It shows that the relationship between motor performance and EF in young children is complex and may be influenced by the presence of confounding variables such as ADHD symptomatology.

  8. Commentary: Overview of Developmental Perspectives on Creativity and the Realization of Potential.

    PubMed

    Runco, Mark A

    2016-01-01

    The articles in this issue of New Directions for Child and Adolescent Development nicely summarize recent findings about creativity and development. This commentary underscores some of the key ideas and puts them into a larger context (i.e., the corpus of creativity research). It pinpoints areas of agreement (e.g., the need to take both generative and convergent processes into account when examining developmental changes in creative behavior) but balances this with a discussion of concerns. These include (a) problems with the concept of Big C creativity, as it may confound the realization of creative potential, (b) lack of attention given to cultural relativity, and (c) inappropriate testing of divergent thinking. Still, the progress in the research is clear and the fulfillment of creative potentials increasingly likely.

  9. The Integration of Behavioral Accounting in Undergraduate Accounting Curricula.

    ERIC Educational Resources Information Center

    Buchanan, Phillip G.; Cao, Le Thi

    1986-01-01

    The study reported here is part of a continuing project with the goal of determining the place of behavioral accounting in the accounting curricula. While the first two studies focused on the graduate accounting curricula and the practitioners' opinions on the subject, this study concentrates on the behavioral accounting content of undergraduate…

  10. New Frontiers: Training Forensic Accountants within the Accounting Program

    ERIC Educational Resources Information Center

    Ramaswamy, Vinita

    2007-01-01

    Accountants have recently been subject to very unpleasant publicity following the collapse of Enron and other major companies. There has been a plethora of accounting failures and accounting restatements of falsified earnings, with litigations and prosecutions taking place every day. As the FASB struggles to tighten the loopholes in accounting,…

  11. Land-surface controls on afternoon precipitation diagnosed from observational data: uncertainties and confounding factors

    NASA Astrophysics Data System (ADS)

    Guillod, B. P.; Orlowsky, B.; Miralles, D.; Teuling, A. J.; Blanken, P. D.; Buchmann, N.; Ciais, P.; Ek, M.; Findell, K. L.; Gentine, P.; Lintner, B. R.; Scott, R. L.; Van den Hurk, B.; Seneviratne, S. I.

    2014-08-01

    The feedback between soil moisture and precipitation has long been a topic of interest due to its potential for improving weather and seasonal forecasts. The generally proposed mechanism assumes a control of soil moisture on precipitation via the partitioning of the surface turbulent heat fluxes, as assessed via the evaporative fraction (EF), i.e., the ratio of latent heat to the sum of latent and sensible heat, in particular under convective conditions. Our study investigates the poorly understood link between EF and precipitation by relating the before-noon EF to the frequency of afternoon precipitation over the contiguous US, through statistical analyses of multiple EF and precipitation data sets. We analyze remote-sensing data products (Global Land Evaporation: the Amsterdam Methodology (GLEAM) for EF, and radar precipitation from the NEXt generation weather RADar system (NEXRAD)), FLUXNET station data, and the North American Regional Reanalysis (NARR). Data sets agree on a region of positive relationship between EF and precipitation occurrence in the southwestern US. However, a region of strong positive relationship over the eastern US in NARR cannot be confirmed with observation-derived estimates (GLEAM, NEXRAD and FLUXNET). The GLEAM-NEXRAD data set combination indicates a region of positive EF-precipitation relationship in the central US. These disagreements emphasize large uncertainties in the EF data. Further analyses highlight that much of these EF-precipitation relationships could be explained by precipitation persistence alone, and it is unclear whether EF has an additional role in triggering afternoon precipitation. This also highlights the difficulties in isolating a land impact on precipitation. Regional analyses point to contrasting mechanisms over different regions. Over the eastern US, our analyses suggest that the EF-precipitation relationship in NARR is either atmospherically controlled (from precipitation persistence and potential evaporation

  12. Demonstrating marketing accountability.

    PubMed

    Gombeski, William R; Britt, Jason; Taylor, Jan; Riggs, Karen; Wray, Tanya; Adkins, Wanda; Springate, Suzanne

    2008-01-01

    Pressure on health care marketers to demonstrate the effectiveness of their strategies and show their contribution to organizational goals is growing. A seven-tiered model based on the concepts of structure (having the right people and systems), process (doing the right things in the right way), and outcomes (results) is discussed. Examples of measures for each tier are provided, along with the benefits of using the model as a tool for measuring, organizing, tracking, and communicating appropriate information. The model also provides a framework for helping management understand marketing's value and can serve as a vehicle for demonstrating marketing accountability.

  13. Performance and Accountability Report

    NASA Technical Reports Server (NTRS)

    2003-01-01

    The NASA Fiscal Year 2002 Performance and Accountability Report is presented. Over the past year, significant changes have been implemented to greatly improve NASA's management while continuing to break new ground in science and technology. Excellent progress has been made in implementing the President's Management Agenda. NASA is leading the government in its implementation of the five government-wide initiatives. NASA received an unqualified audit opinion on FY 2002 financial statements. The vast majority of performance goals have been achieved, furthering each area of NASA's mission. The contents include: 1) NASA Vision and Mission; 2) Management's Discussion and Analysis; 3) Performance; and 4) Financial.

  14. Prawns, barnacles, and nonsteroidal anti-inflammatory drugs: effect modifiers or diagnostic confounders [corrected].

    PubMed

    Vidal, C; Bartolomé, B; González-Quintela, A; Rodríguez, V; Armisén, M

    2007-01-01

    A 42-year-old woman with no history of atopy reported several episodes of generalized urticaria and shortness of breath after eating shellfish (prawns and barnacles) but with good tolerance of the same foods between episodes. Skin prick tests (SPTs), serum enzyme allergosorbent tests (EAST) for specific immunoglobulin (Ig) E, Western blot and inhibition assays, and oral challenge tests with prawns, barnacles, nonsteroidal anti-inflammatory drugs (NSAIDs), and alcohol as potential effect modifiers were performed. Specific IgE to both barnacle and prawn was detected by SPTs and EAST. Western blot of raw prawn revealed an IgE-binding band of 37 kDa, and IgE-binding bands of 143, 83, 38, 32, and 20 kDa appeared in the raw barnacle assay. Oral challenge tests were positive with prawns and prawn extract only if preceded by NSAIDs. Oral challenges with NSAIDs alone, prawns alone, barnacles (with or without NSAIDs), and alcohol led to no reaction. A synergistic effect of NSAIDs in inducing anaphylaxis after prawn intake was confirmed. No similar effect was achieved with barnacles despite the presence of specific IgE. Additional factors needed to elicit a clinical reaction in food allergy may not be obvious, and several oral challenge protocols are mandatory in such cases.

  15. Mutant DNA quantification by digital PCR can be confounded by heating during DNA fragmentation.

    PubMed

    Kang, Qing; Parkin, Brian; Giraldez, Maria D; Tewari, Muneesh

    2016-04-01

    Digital PCR (dPCR) is gaining popularity as a DNA mutation quantification method for clinical specimens. Fragmentation prior to dPCR is required for non-fragmented genomic DNA samples; however, the effect of fragmentation on DNA analysis has not been well-studied. Here we evaluated three fragmentation methods for their effects on dPCR point mutation assay performance. Wild-type (WT) human genomic DNA was fragmented by heating, restriction digestion, or acoustic shearing using a Covaris focused-ultrasonicator. dPCR was then used to determine the limit of blank (LoB) by quantifying observed WT and mutant allele counts of the proto-oncogenes KRAS and BRAF in the WT DNA sample. DNA fragmentation by heating to 95°C, while the simplest and least expensive method, produced a high background mutation frequency for certain KRAS mutations relative to the other methods. This was due to heat-induced mutations, specifically affecting dPCR assays designed to interrogate guanine to adenine (G>A) mutations. Moreover, heat-induced fragmentation overestimated gene copy number, potentially due to denaturation and partition of single-stranded DNA into different droplets. Covaris acoustic shearing and restriction enzyme digestion showed similar LoBs and gene copy number estimates to one another. It should be noted that moderate heating, commonly used in genomic DNA extraction protocols, did not significantly increase observed KRAS mutation counts.
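
    The limit-of-blank idea above can be illustrated with a small Python sketch. It applies the common parametric convention LoB = mean(blank) + 1.645 × SD(blank) to made-up replicate counts from wild-type-only DNA; both the formula choice and the numbers are assumptions, not taken from the paper.

    ```python
    # Hedged sketch: estimating a limit of blank (LoB) for a dPCR mutation assay
    # from replicate measurements of wild-type-only DNA. The counts below are
    # illustrative; the parametric LoB formula follows a common CLSI EP17-style
    # convention and is an assumption here, not the method used in the paper.
    import statistics

    # Observed "mutant" droplet counts in wild-type-only replicates (illustrative).
    blank_counts = [0, 1, 0, 2, 1, 0, 3, 1, 0, 1]

    mean_blank = statistics.mean(blank_counts)
    sd_blank = statistics.stdev(blank_counts)

    lob = mean_blank + 1.645 * sd_blank
    print(f"LoB ≈ {lob:.2f} mutant counts per reaction")
    ```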

  16. Extracellular vesicles in the circulation: are erythrocyte microvesicles a confounder in the plasma haemoglobin assay?

    PubMed

    de Vooght, Karen M K; Lau, Cedric; de Laat, Pim P M; van Wijk, Richard; van Solinge, Wouter W; Schiffelers, Raymond M

    2013-02-01

    Blood contains a mixture of extracellular vesicles from different cell types, primarily platelets, endothelial cells, leucocytes and erythrocytes. Erythrocytes are the most abundant cell type in blood and could, especially in certain pathologies, represent an important source of vesicles. Since erythrocytes contain the haemoglobin components iron and haem, which are potentially toxic, it is important to investigate the contribution of vesicle-associated haemoglobin to total cell-free haemoglobin levels. To our knowledge, this is the first time that cell-free plasma haemoglobin has been differentiated into vesicle-associated and molecular species. We investigated the contribution of vesicle-associated haemoglobin in residual patient material that was routinely analysed for total cell-free plasma haemoglobin. All patient samples included in the study were haemolytic, with total cell-free haemoglobin concentrations ranging from 80 to 2500 mg/l. In the majority of the samples, the total cell-free haemoglobin concentration was between 100 and 200 mg/l. No haemoglobin could be detected in the vesicle fraction, indicating that the contribution of vesicle-associated haemoglobin to total cell-free haemoglobin levels in plasma is negligible. It remains to be investigated whether erythrocyte vesicles are simply not formed in blood, whether their production is not increased during pathologies associated with haemolysis, or whether their clearance rate surpasses their formation rate.

  17. Disentangling the confounding effects of PAR and air temperature on net ecosystem exchange in time and scale

    NASA Astrophysics Data System (ADS)

    Yang, Z.; Chen, J.; Becker, R.; Chu, H.; Xie, J.; Shao, C.

    2013-12-01

    Net ecosystem exchange of CO2 (NEE) in temperate forests is modulated by microclimatic factors. The effects of those factors differ across time scales and time periods, and some factors are correlated across a number of time scales, so their effects on NEE are confounded with each other. PAR and air temperature (Ta) are two of the most important drivers of NEE in temperate forests and two of the most strongly correlated microclimatic factors. PAR and Ta have similar daily, seasonal, and annual cycles, so their influences on NEE are confounded and entangled, especially at those scales. In this study, we attempted to disentangle their confounding effects on NEE at different time scales and during different time periods. To accomplish this objective, we applied spectral analysis techniques, including the Continuous Wavelet Transform (CWT), Cross Wavelet Transform (XWT), Wavelet Coherence (WTC), and Partial Wavelet Coherence (PWC), to a seven-year time series (2004-2010) of PAR, Ta and NEE from the Ohio Oak Openings site (N 41.5545°, W 83.8438°), USA. We found that PAR is the major driver at short time scales (e.g., semidiurnal and daily) and Ta is the major driver at long time scales (e.g., seasonal and annual). At the daily scale during growing seasons, PAR is anti-phase with NEE with no time delay, while Ta lags PAR by about 2-3 hours; this can be explained by the strong dependence of photosynthesis on PAR and the 2-3 hour lag of the daily course of Ta behind PAR. At the daily scale during the non-growing season, NEE has little variation, and thus neither Ta nor PAR has high common wavelet power or significant coherence with NEE. At the annual scale, Ta is anti-phase with NEE and PAR leads NEE by about 34 days, which can be explained by the strong dependence of LAI dynamics on Ta and the lag between LAI/biomass development and the progress of sunlight. We also found that NEE distributes most of its variation
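
    A minimal Python sketch of a cross-wavelet analysis between PAR and NEE is given below, in the spirit of the study but not its actual code. It uses synthetic hourly series and PyWavelets; the Morlet wavelet, scale range and noise level are assumptions.

    ```python
    # Hedged sketch of a cross-wavelet analysis between PAR and NEE.
    import numpy as np
    import pywt

    dt = 1.0                       # hourly sampling (hours)
    t = np.arange(0, 24 * 30, dt)  # one synthetic month

    par = np.maximum(0, np.sin(2 * np.pi * t / 24))          # daily PAR cycle
    nee = -0.8 * par + 0.1 * np.random.default_rng(0).normal(size=t.size)

    scales = np.arange(2, 64)      # covers roughly sub-daily to multi-day periods
    coef_par, freqs = pywt.cwt(par, scales, "morl", sampling_period=dt)
    coef_nee, _ = pywt.cwt(nee, scales, "morl", sampling_period=dt)

    # Cross-wavelet power highlights time scales where PAR and NEE co-vary;
    # its phase (angle) indicates lead/lag relationships such as the anti-phase
    # behaviour at the daily scale described in the abstract.
    xwt = coef_par * np.conj(coef_nee)
    daily_band = np.argmin(np.abs(1.0 / freqs - 24.0))   # scale closest to 24 h
    print("mean phase at ~daily scale (rad):",
          np.angle(xwt[daily_band]).mean())
    ```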

  18. Automated Accounting. Payroll. Instructor Module.

    ERIC Educational Resources Information Center

    Moses, Duane R.

    This teacher's guide was developed to assist business instructors using Dac Easy Accounting Payroll Version 3.0 edition software in their accounting programs. The module contains assignment sheets and job sheets designed to enable students to master competencies identified in the area of automated accounting--payroll. Basic accounting skills are…

  19. Where Are the Accounting Professors?

    ERIC Educational Resources Information Center

    Chang, Jui-Chin; Sun, Huey-Lian

    2008-01-01

    Accounting education is facing a crisis of shortage of accounting faculty. This study discusses the reasons behind the shortage and offers suggestions to increase the supply of accounting faculty. Our suggestions are as followings. First, educators should begin promoting accounting academia as one of the career choices to undergraduate and…

  20. Revamping High School Accounting Courses.

    ERIC Educational Resources Information Center

    Bittner, Joseph

    2002-01-01

    Provides ideas for updating accounting courses: convert to semester length; focus on financial reporting/analysis, financial statements, the accounting cycle; turn textbook exercises into practice sets for the accounting cycle; teach about corporate accounting; and address individual line items on financial statements. (SK)

  1. Soil acidification as a confounding factor on metal phytotoxicity in soils spiked with copper-rich mine wastes.

    PubMed

    Ginocchio, Rosanna; De la Fuente, Luz María; Sánchez, Pablo; Bustamante, Elena; Silva, Yasna; Urrestarazu, Paola; Rodríguez, Patricio H

    2009-10-01

    Pollution of soil with mine wastes results in both Cu enrichment and soil acidification. This confounding effect may be very important in terms of phytotoxicity, because pH is a key parameter influencing Cu solubility in soil solution. Laboratory toxicity tests were used to assess the effect of acidification by acidic mine wastes on Cu solubility and on root elongation of barley (Hordeum vulgare L.). Three contrasting substrates (two soils and a commercial sand) and two acidic, Cu-rich mine wastes (oxidized tailings [OxT] and smelter dust [SmD]) were selected as experimental materials. Substrates were spiked with a fixed amount of either SmD or OxT, and the pH of experimental mixtures was then modified in the range of 4.0 to 6.0 and 7.0 using PIPES (piperazine-1,4-bis(2-ethanesulfonic acid)), MES (2-(N-morpholino)ethanesulfonic acid), and MOPS (3-(N-Morpholino)-propanesulfonic acid) buffers. Chemical (pore-water Cu and pH) and toxicological (root length of barley plants) parameters were determined for experimental mixtures. Addition of SmD and OxT to substrates resulted in acidification (0.11-1.16 pH units) and high levels of soluble Cu and Zn. Neutralization of experimental mixtures with MES (pH 6.0) and MOPS (pH 7.0) buffers resulted in a marked decrease in soluble Cu and Zn, but the intensity of the effect was substrate-dependent. Adjustment of soil pH above the range normally considered to be toxic to plants (pH in water extract, > 5.5) significantly reduced metal toxicity in barley, but phytotoxicity was not completely eliminated. The present results stress the importance of considering confounding effects on derivation of toxicity thresholds to plants when using laboratory phytotoxicity tests.

  2. Automated attendance accounting system

    NASA Technical Reports Server (NTRS)

    Chapman, C. P. (Inventor)

    1973-01-01

    An automated accounting system useful for applying data to a computer from any or all of a multiplicity of data terminals is disclosed. The system essentially includes a preselected number of data terminals which are each adapted to convert data words of decimal form to another form, i.e., binary, usable with the computer. Each data terminal may take the form of a keyboard unit having a number of depressable buttons or switches corresponding to selected data digits and/or function digits. A bank of data buffers, one of which is associated with each data terminal, is provided as a temporary storage. Data from the terminals is applied to the data buffers on a digit by digit basis for transfer via a multiplexer to the computer.
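
    The digit-by-digit buffering and multiplexing scheme described above can be sketched in a few lines of Python; the class and function names are illustrative, not taken from the patent.

    ```python
    # Hedged sketch: terminals encode decimal key presses in a binary (BCD-style)
    # form, store them digit by digit in buffers, and a multiplexer drains one
    # buffer at a time towards the "computer".
    from collections import deque

    class DataTerminal:
        """A keyboard terminal that encodes each decimal digit as 4 binary bits."""
        def __init__(self):
            self.buffer = deque()          # temporary storage (the data buffer)

        def press(self, digit: int) -> None:
            self.buffer.append(format(digit, "04b"))   # e.g. 7 -> '0111'

    def multiplex(terminals):
        """Transfer buffered digits to the computer, one terminal at a time."""
        for tid, term in enumerate(terminals):
            while term.buffer:
                yield tid, term.buffer.popleft()

    terminals = [DataTerminal(), DataTerminal()]
    for d in (4, 2):            # data digits entered on terminal 0
        terminals[0].press(d)
    terminals[1].press(9)       # a function digit entered on terminal 1

    print(list(multiplex(terminals)))
    # [(0, '0100'), (0, '0010'), (1, '1001')]
    ```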

  3. Financial accounting for radiology executives.

    PubMed

    Seidmann, Abraham; Mehta, Tushar

    2005-03-01

    The authors review the role of financial accounting information from the perspective of a radiology executive. They begin by introducing the role of pro forma statements. They discuss the fundamental concepts of accounting, including the matching principle and accrual accounting. The authors then explore the use of financial accounting information in making investment decisions in diagnostic medical imaging. The paper focuses on critically evaluating the benefits and limitations of financial accounting for decision making in a radiology practice.

  4. What Is the Value of Public School Accountability?

    ERIC Educational Resources Information Center

    Gunzenhauser, Michael G.; Hyde, Andrea M.

    2007-01-01

    In this review essay, Michael Gunzenhauser and Andrea Hyde consider three recent edited collections that address the potential value of public school accountability policy: Kenneth Sirotnik's Holding Accountability Accountable: What Ought to Matter in Public Education; Martin Carnoy, Richard Elmore, and Leslie Santee Siskin's The New…

  5. Defining ecosystem assets for natural capital accounting

    USGS Publications Warehouse

    Hein, Lars; Bagstad, Kenneth J.; Edens, Bram; Obst, Carl; de Jong, Rixt; Lesschen, Jan Peter

    2016-01-01

    In natural capital accounting, ecosystems are assets that provide ecosystem services to people. Assets can be measured using both physical and monetary units. In the international System of Environmental-Economic Accounting, ecosystem assets are generally valued on the basis of the net present value of the expected flow of ecosystem services. In this paper we argue that several additional conceptualisations of ecosystem assets are needed to understand ecosystems as assets, in support of ecosystem assessments, ecosystem accounting and ecosystem management. In particular, we define ecosystems’ capacity and capability to supply ecosystem services, as well as the potential supply of ecosystem services. Capacity relates to sustainable use levels of multiple ecosystem services, capability involves prioritising the use of one ecosystem service over a basket of services, and potential supply considers the ability of ecosystems to generate services regardless of demand for these services. We ground our definitions in the ecosystem services and accounting literature, and illustrate and compare the concepts of flow, capacity, capability, and potential supply with a range of conceptual and real-world examples drawn from case studies in Europe and North America. Our paper contributes to the development of measurement frameworks for natural capital to support environmental accounting and other assessment frameworks.
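
    The net-present-value convention mentioned above can be made concrete with a short Python sketch; the annual service value, discount rate and horizon are illustrative assumptions.

    ```python
    # Hedged sketch: valuing an ecosystem asset as the net present value (NPV)
    # of an expected flow of ecosystem services. All figures are illustrative.
    annual_service_value = 1_000.0   # monetary value of services per year (assumed)
    discount_rate = 0.03             # assumed discount rate
    horizon_years = 50               # assumed accounting horizon

    npv = sum(annual_service_value / (1 + discount_rate) ** year
              for year in range(1, horizon_years + 1))
    print(f"asset value (NPV over {horizon_years} years): {npv:,.0f}")
    ```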

  6. Defining Ecosystem Assets for Natural Capital Accounting

    PubMed Central

    Hein, Lars; Bagstad, Ken; Edens, Bram; Obst, Carl; de Jong, Rixt; Lesschen, Jan Peter

    2016-01-01

    In natural capital accounting, ecosystems are assets that provide ecosystem services to people. Assets can be measured using both physical and monetary units. In the international System of Environmental-Economic Accounting, ecosystem assets are generally valued on the basis of the net present value of the expected flow of ecosystem services. In this paper we argue that several additional conceptualisations of ecosystem assets are needed to understand ecosystems as assets, in support of ecosystem assessments, ecosystem accounting and ecosystem management. In particular, we define ecosystems’ capacity and capability to supply ecosystem services, as well as the potential supply of ecosystem services. Capacity relates to sustainable use levels of multiple ecosystem services, capability involves prioritising the use of one ecosystem service over a basket of services, and potential supply considers the ability of ecosystems to generate services regardless of demand for these services. We ground our definitions in the ecosystem services and accounting literature, and illustrate and compare the concepts of flow, capacity, capability, and potential supply with a range of conceptual and real-world examples drawn from case studies in Europe and North America. Our paper contributes to the development of measurement frameworks for natural capital to support environmental accounting and other assessment frameworks. PMID:27828969

  7. Defining Ecosystem Assets for Natural Capital Accounting.

    PubMed

    Hein, Lars; Bagstad, Ken; Edens, Bram; Obst, Carl; de Jong, Rixt; Lesschen, Jan Peter

    2016-01-01

    In natural capital accounting, ecosystems are assets that provide ecosystem services to people. Assets can be measured using both physical and monetary units. In the international System of Environmental-Economic Accounting, ecosystem assets are generally valued on the basis of the net present value of the expected flow of ecosystem services. In this paper we argue that several additional conceptualisations of ecosystem assets are needed to understand ecosystems as assets, in support of ecosystem assessments, ecosystem accounting and ecosystem management. In particular, we define ecosystems' capacity and capability to supply ecosystem services, as well as the potential supply of ecosystem services. Capacity relates to sustainable use levels of multiple ecosystem services, capability involves prioritising the use of one ecosystem service over a basket of services, and potential supply considers the ability of ecosystems to generate services regardless of demand for these services. We ground our definitions in the ecosystem services and accounting literature, and illustrate and compare the concepts of flow, capacity, capability, and potential supply with a range of conceptual and real-world examples drawn from case studies in Europe and North America. Our paper contributes to the development of measurement frameworks for natural capital to support environmental accounting and other assessment frameworks.

  8. Greenhouse gas accounting and waste management.

    PubMed

    Gentil, Emmanuel; Christensen, Thomas H; Aoustin, Emmanuelle

    2009-11-01

    Accounting of emissions of greenhouse gas (GHG) is a major focus within waste management. This paper analyses and compares the four main types of GHG accounting in waste management, including their special features and approaches: national accounting, with reference to the Intergovernmental Panel on Climate Change (IPCC); corporate-level accounting, as part of annual reporting on environmental issues and social responsibility; life-cycle assessment (LCA), as an environmental basis for assessing waste management systems and technologies; and, finally, the carbon trading methodology, more specifically the clean development mechanism (CDM) methodology, introduced to support cost-effective reduction in GHG emissions. These types of GHG accounting, in principle, have a common starting point in technical data on GHG emissions from specific waste technologies and plants, but the limited availability of data and, moreover, the different scopes of the accounting lead to many ways of quantifying emissions and producing the accounts. The importance of transparency in GHG accounting is emphasised regarding waste type, waste composition, time period considered, GHGs included, global warming potential (GWP) assigned to the GHGs, counting of biogenic carbon dioxide, choice of system boundaries, interactions with the energy system, and generic emissions factors. In order to enhance transparency and consistency, a format called the upstream-operating-downstream framework (UOD) is proposed for reporting basic technology-related data regarding GHG issues, including a clear distinction between direct emissions from waste management technologies, indirect upstream activities (use of energy and materials) and indirect downstream activities (production of energy, delivery of secondary materials).
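
    One of the accounting choices listed above, the global warming potential (GWP) assigned to each gas, can be illustrated with a short Python sketch that converts a hypothetical plant's emissions to CO2-equivalents; the GWP factors (AR5-style 100-year values) and the emission figures are assumptions.

    ```python
    # Hedged sketch: converting greenhouse-gas emissions to CO2-equivalents with
    # global warming potentials (GWPs). GWPs differ between IPCC assessment
    # reports; the values and emissions below are illustrative assumptions.
    gwp_100 = {"CO2": 1, "CH4": 28, "N2O": 265}          # assumed AR5-style values

    # Direct emissions from a hypothetical waste-treatment plant (tonnes per year).
    emissions = {"CO2": 12_000, "CH4": 350, "N2O": 4}

    co2e = sum(mass * gwp_100[gas] for gas, mass in emissions.items())
    print(f"total emissions: {co2e:,.0f} t CO2e per year")
    ```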

  9. Confounding of the association between radiation exposure from CT scans and risk of leukemia and brain tumors by cancer susceptibility syndromes.

    PubMed

    Meulepas, Johanna M; Ronckers, Cécile M; Merks, Johannes; Weijerman, Michel E; Lubin, Jay H; Hauptmann, Michael

    2016-12-01

    Recent studies linking radiation exposure from pediatric computed tomography (CT) to increased risks of leukemia and brain tumors lacked data to control for cancer susceptibility syndromes (CSS). These syndromes might be confounders because they are associated with an increased cancer risk and may increase the likelihood of pediatric CT scans. We identify CSS predisposing to leukemia and brain tumors through a systematic literature search and summarize prevalence and risk. Since empirical evidence is lacking in published literature on patterns of CT use for most types of CSS, we estimate confounding bias of relative risks (RR) for categories of radiation exposure based on expert opinion about patterns of CT scans among CSS patients. We estimate that radiation-related RRs for leukemia are not meaningfully confounded by Down syndrome, Noonan syndrome and other CSS. Moreover, tuberous sclerosis complex, von Hippel-Lindau disease, neurofibromatosis type 1 and other CSS do not meaningfully confound RRs for brain tumors. Empirical data on the use of CT scans among CSS patients is urgently needed. Our assessment indicates that associations with radiation exposure from pediatric CT scans and leukemia or brain tumors reported in previous studies are unlikely to be substantially confounded by unmeasured CSS.
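
    A hedged sketch of the kind of bias assessment described above is given below: it applies the classic external-adjustment (Bross-type) formula for an unmeasured binary confounder, which is not necessarily the exact calculation the authors used, and all numbers are illustrative.

    ```python
    # Hedged sketch: external adjustment for an unmeasured binary confounder.
    def bias_factor(p_exposed: float, p_unexposed: float, rr_confounder: float) -> float:
        """Multiplicative bias in the exposure-disease RR caused by a binary
        confounder with prevalence p_exposed / p_unexposed in the exposure groups
        and confounder-disease relative risk rr_confounder."""
        return ((p_exposed * (rr_confounder - 1) + 1) /
                (p_unexposed * (rr_confounder - 1) + 1))

    # Example: a cancer susceptibility syndrome that is rare overall but somewhat
    # more common among children receiving CT scans (illustrative numbers).
    bias = bias_factor(p_exposed=0.004, p_unexposed=0.001, rr_confounder=20.0)
    observed_rr = 1.5
    adjusted_rr = observed_rr / bias
    print(f"bias factor ≈ {bias:.3f}; confounder-adjusted RR ≈ {adjusted_rr:.2f}")
    ```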

  10. NASA Accountability Report

    NASA Technical Reports Server (NTRS)

    1997-01-01

    NASA is piloting fiscal year (FY) 1997 Accountability Reports, which streamline and upgrade reporting to Congress and the public. The document presents statements by the NASA administrator, and the Chief Financial Officer, followed by an overview of NASA's organizational structure and the planning and budgeting process. The performance of NASA in four strategic enterprises is reviewed: (1) Space Science, (2) Mission to Planet Earth, (3) Human Exploration and Development of Space, and (4) Aeronautics and Space Transportation Technology. Those areas which support the strategic enterprises are also reviewed in a section called Crosscutting Processes. For each of the four enterprises, there is discussion about the long term goals, the short term objectives and the accomplishments during FY 1997. The Crosscutting Processes section reviews issues and accomplishments relating to human resources, procurement, information technology, physical resources, financial management, small and disadvantaged businesses, and policy and plans. Following the discussion about the individual areas is Management's Discussion and Analysis, about NASA's financial statements. This is followed by a report by an independent commercial auditor and the financial statements.

  11. Spills, drills, and accountability

    SciTech Connect

    1993-12-31

    NRDC seeks preventive approaches to oil pollution on U.S. coasts. The recent oil spills in Spain and Scotland have highlighted a fact too easy to forget in a society that uses petroleum every minute of every day: oil is profoundly toxic. One tiny drop on a bald eagle's egg has been known to kill the embryo inside. Every activity involving oil (drilling for it, piping it, shipping it) poses risks that must be taken with utmost caution. Moreover, oil production is highly polluting. It emits substantial air pollution, such as nitrogen oxides that can form smog and acid rain. The wells bring up great quantities of toxic waste: solids, liquids and sludges often contaminated by oil, toxic metals, or even radioactivity. This article examines the following topics, focusing on oil pollution control and prevention in coastal regions of the USA: alternative energy sources and polluter accountability; a ban on offshore drilling as exemplified by the Energy Policy Act; tanker-free zones; and accurate damage evaluations. The policy of the Natural Resources Defense Council is articulated.

  12. Accounting: The Integration of Computers into the Accounting Class.

    ERIC Educational Resources Information Center

    Brown, Dorothy Lee

    1980-01-01

    Since computers are universally accepted in business today, the accounting classroom is the appropriate place to teach their use. A California high school accounting committee's recommendation led to the school's development of a computer processing program within the accounting department. The program's curriculum is described. (CT)

  13. 18 CFR 367.2320 - Account 232, Accounts payable.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... payable. This account must include all amounts payable by the service company within one year that are not... 18 Conservation of Power and Water Resources 1 2010-04-01 2010-04-01 false Account 232, Accounts payable. 367.2320 Section 367.2320 Conservation of Power and Water Resources FEDERAL ENERGY...

  14. 18 CFR 367.1430 - Account 143, Other accounts receivable.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... GAS ACT Balance Sheet Chart of Accounts Current and Accrued Assets § 367.1430 Account 143, Other... accounts receivable. 367.1430 Section 367.1430 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT OF ENERGY REGULATIONS UNDER THE PUBLIC UTILITY HOLDING COMPANY ACT OF...

  15. 76 FR 53378 - Cost Accounting Standards: Accounting for Insurance Costs

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-26

    ... the measurement and allocation of the cost of infrequent and difficult to predict events. The FAR at... BUDGET Office of Federal Procurement Policy 48 CFR Part 9904 Cost Accounting Standards: Accounting for Insurance Costs AGENCY: Cost Accounting Standards Board (Board), Office of Federal Procurement Policy...

  16. Accounting Issues: An Essay Series. Part II--Accounts Receivable

    ERIC Educational Resources Information Center

    Laux, Judith A.

    2007-01-01

    This is the second in a series of articles designed to help academics refocus the introductory accounting course on the theoretical underpinnings of accounting. Intended as a supplement for the principles course, this article connects the asset Accounts Receivable to the essential theoretical constructs, discusses the inherent tradeoffs and…

  17. Solving Accounting Problems: Differences between Accounting Experts and Novices.

    ERIC Educational Resources Information Center

    Marshall, P. Douglas

    2002-01-01

    Performance of 90 accounting experts (faculty and practitioners) and 60 novices (senior accounting majors) was compared. Experts applied more accounting principles to solving problems. There were no differences in types of principles applied and no correlation between (1) principles applied and number of breadth comments or (2) importance placed…

  18. Both direct and indirect effects account for the pro-inflammatory activity of enteropathogenic mycotoxins on the human intestinal epithelium: Stimulation of interleukin-8 secretion, potentiation of interleukin-1β effect and increase in the transepithelial passage of commensal bacteria

    SciTech Connect

    Maresca, Marc; Yahi, Nouara; Younes-Sakr, Lama; Boyron, Marilyn; Caporiccio, Bertrand; Fantini, Jacques

    2008-04-01

    Mycotoxins are fungal secondary metabolites responsible for food-mediated intoxication in animals and humans. Deoxynivalenol, ochratoxin A and patulin are the best-known enteropathogenic mycotoxins able to alter intestinal functions, resulting in malnutrition, diarrhea, vomiting and intestinal inflammation in vivo. Although their effects on intestinal barrier and transport activities have been extensively characterized, the mechanisms responsible for their pro-inflammatory effect are still poorly understood. Here we investigated whether mycotoxin-induced intestinal inflammation results from a direct and/or indirect pro-inflammatory activity of these mycotoxins on human intestinal epithelial cells, using differentiated Caco-2 cells as a model and interleukin 8 (IL-8) as an indicator of intestinal inflammation. Deoxynivalenol was the only mycotoxin able to directly increase IL-8 secretion (10- to 15-fold increase). We also investigated whether these mycotoxins could indirectly stimulate IL-8 secretion through (i) a modulation of the action of pro-inflammatory molecules such as interleukin-1β (IL-1β), and/or (ii) an increase in the transepithelial passage of non-invasive commensal Escherichia coli. We found that deoxynivalenol, ochratoxin A and patulin all potentiated the effect of IL-1β on IL-8 secretion (ranging from a 35% to 138% increase) and increased the transepithelial passage of commensal bacteria (ranging from a 12- to 1544-fold increase). In addition to potentially exacerbating established intestinal inflammation, these mycotoxins may thus participate in the induction of sepsis and intestinal inflammation in vivo. Taken together, our results suggest that the pro-inflammatory activity of enteropathogenic mycotoxins is mediated by both direct and indirect effects.

  19. Accountability report - fiscal year 1997

    SciTech Connect

    1998-04-01

    This document contains the US NRC's accountability report for fiscal year 1997. Topics include uses of funds, financial condition, program performance, management accountability, and the audited financial statement.

  20. 46 CFR Sec. 5 - Accounting.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 46 Shipping 8 2011-10-01 2011-10-01 false Accounting. Sec. 5 Section 5 Shipping MARITIME... Sec. 5 Accounting. The General Agent shall record the amounts of compensation paid from the NSA... Accounting Office, at which time the Maritime Administration will take custody of the records....

  1. 46 CFR Sec. 5 - Accounting.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 46 Shipping 8 2012-10-01 2012-10-01 false Accounting. Sec. 5 Section 5 Shipping MARITIME... Sec. 5 Accounting. The General Agent shall record the amounts of compensation paid from the NSA... Accounting Office, at which time the Maritime Administration will take custody of the records....

  2. 46 CFR Sec. 5 - Accounting.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 46 Shipping 8 2013-10-01 2013-10-01 false Accounting. Sec. 5 Section 5 Shipping MARITIME... Sec. 5 Accounting. The General Agent shall record the amounts of compensation paid from the NSA... Accounting Office, at which time the Maritime Administration will take custody of the records....

  3. 46 CFR Sec. 5 - Accounting.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 46 Shipping 8 2014-10-01 2014-10-01 false Accounting. Sec. 5 Section 5 Shipping MARITIME... Sec. 5 Accounting. The General Agent shall record the amounts of compensation paid from the NSA... Accounting Office, at which time the Maritime Administration will take custody of the records....

  4. Accounting Instruction Builds Economic Literacy.

    ERIC Educational Resources Information Center

    Albaugh, Thomas A.; Porreca, Anthony G.

    1985-01-01

    A study of 236 11th-grade accounting students was conducted to measure the economic literacy of students enrolled in high school business education. Students who had received accounting instruction had higher mean scores than those who had not. (CT)

  5. Vocational Accounting and Computing Programs.

    ERIC Educational Resources Information Center

    Avani, Nathan T.

    1986-01-01

    Describes an "Accounting and Computing" program in Michigan that emphasizes computerized accounting procedures. This article describes the program curriculum and duty areas (such as handling accounts receivable), presents a list of sample tasks in each duty area, and specifies components of each task. Computer equipment necessary for this program…

  6. 46 CFR Sec. 5 - Accounting.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 46 Shipping 8 2010-10-01 2010-10-01 false Accounting. Sec. 5 Section 5 Shipping MARITIME... Sec. 5 Accounting. The General Agent shall record the amounts of compensation paid from the NSA... Accounting Office, at which time the Maritime Administration will take custody of the records....

  7. A DRDC Management Accountability Framework

    DTIC Science & Technology

    2009-09-01


  8. Model Accounting Program. Adopters Guide.

    ERIC Educational Resources Information Center

    Beaverton School District 48, OR.

    The accounting cluster demonstration project conducted at Aloha High School in the Beaverton, Oregon, school district developed a model curriculum for high school accounting. The curriculum is based on interviews with professionals in the accounting field and emphasizes the use of computers. It is suitable for use with special needs students as…

  9. Mastering the Vocabulary of Accounting.

    ERIC Educational Resources Information Center

    Tischler, Helene

    Developed for use by students in an introductory accounting course, these learning modules deal with mastering the vocabulary of accounting. Focus of the modules is on vocabulary appearing in the first six chapters of the text, "Accounting Principles" by Niswonger and Fess. Covered in the individual modules are the following topics:…

  10. Standardized Testing and School Accountability

    ERIC Educational Resources Information Center

    Wiliam, Dylan

    2010-01-01

    This article explores the use of standardized tests to hold schools accountable. The history of testing for accountability is reviewed, and it is shown that currently between-school differences account for less than 10% of the variance in student scores, in part because the progress of individuals is small compared to the spread of achievement…

  11. Patient accounting systems: needs and capabilities.

    PubMed

    Kennedy, O G; Collignon, S

    1987-09-01

    In the first article of this series, it was stated that most finance executives are not very satisfied with the performance of their current patient accounting systems. What steps can a patient accounting system planner take to help ensure the system selected will garner high ratings from managers and users? Two primary steps need to be taken. First, the planner needs to perform a thorough evaluation of both near- and long-term patient accounting requirements, determining which features and functions are most critical and ensuring they are incorporated as selection criteria. The planner should also incorporate institutional planning into that process, such as planned expansion of facilities or services, to ensure that the system selected has the growth potential, interfacing capabilities, and flexibility to respond to the changing environment. Then, once system needs are fully charted, the planner should become familiar with the range of patient accounting system solutions available. The data show that most financial managers lack knowledge about most of the major patient accounting system vendors in the marketplace. Once vendors offering systems that could seemingly meet those needs are identified, the wise system planner will also want to obtain information from users about those vendors, to determine whether the systems perform as described and whether the vendor has been responsive to the needs of its customers. This step is a particularly important part of the planning process, because the data also show that users of some systems are significantly more satisfied than users of other patient accounting systems.

  12. Application of Mycobacterium leprae-specific cellular and serological tests for the differential diagnosis of leprosy from confounding dermatoses.

    PubMed

    Freitas, Aline Araújo; Hungria, Emerith Mayra; Costa, Maurício Barcelos; Sousa, Ana Lúcia Osório Maroccolo; Castilho, Mirian Lane Oliveira; Gonçalves, Heitor Sá; Pontes, Maria Araci Andrade; Duthie, Malcolm S; Stefani, Mariane Martins Araújo

    2016-10-01

    Mycobacterium leprae-specific serological and cell-mediated immunity (CMI) tests were evaluated for the differential diagnosis of multibacillary (MB) and paucibacillary (PB) leprosy from other dermatoses. A whole-blood assay (WBA) measuring IFNγ after stimulation with the LID-1 antigen and ELISA tests for IgG to LID-1 and IgM to PGL-I were performed. WBA/LID-1 IFNγ production was observed in 72% of PB leprosy, 11% of MB leprosy, 38% of other dermatoses, and 40% of healthy endemic controls (EC). The receiver operating characteristic (ROC) curve for WBA/LID-1 in PB versus other dermatoses showed 72.5% sensitivity, 61.5% specificity and an area under the curve (AUC) of 0.75, with a 74% positive predictive value (PPV) and a 59% negative predictive value (NPV). Anti-PGL-I serology was positive in 67% of MB leprosy, 8% of PB leprosy and 6% of other dermatoses; its sensitivity for MB was 66%, specificity 93%, AUC 0.89, PPV 91%, NPV 72%. Anti-LID-1 serology was positive in 87% of MB leprosy and 7% of PB leprosy, and all other participants were seronegative; sensitivity for MB was 87.5%, specificity 100%, AUC 0.97, PPV 100%, NPV 88%. In highly endemic areas, anti-LID-1/PGL-I serology and WBA/LID-1 represent useful tools for the differential diagnosis of leprosy from other confounding dermatoses.
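
    The diagnostic metrics reported above (sensitivity, specificity, PPV, NPV, AUC) can be reproduced on synthetic data with a short Python sketch; the simulated test scores and the 0.5 cut-off are assumptions.

    ```python
    # Hedged sketch: computing sensitivity, specificity, predictive values and
    # AUC for a serological test against a reference diagnosis (synthetic data).
    import numpy as np
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(0)
    y_true = np.r_[np.ones(50), np.zeros(200)]                   # 50 cases, 200 others
    scores = np.r_[rng.normal(1.5, 0.7, 50), rng.normal(0.0, 0.7, 200)]  # test values

    cutoff = 0.5                       # assumed positivity threshold
    y_pred = (scores >= cutoff).astype(int)

    tp = np.sum((y_pred == 1) & (y_true == 1))
    tn = np.sum((y_pred == 0) & (y_true == 0))
    fp = np.sum((y_pred == 1) & (y_true == 0))
    fn = np.sum((y_pred == 0) & (y_true == 1))

    print(f"sensitivity = {tp / (tp + fn):.2f}, specificity = {tn / (tn + fp):.2f}")
    print(f"PPV = {tp / (tp + fp):.2f}, NPV = {tn / (tn + fn):.2f}")
    print(f"AUC = {roc_auc_score(y_true, scores):.2f}")
    ```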

  13. Confounding effects of microbiome on the susceptibility of TNFSF15 to Crohn's disease in the Ryukyu Islands.

    PubMed

    Nakagome, Shigeki; Chinen, Hiroshi; Iraha, Atsushi; Hokama, Akira; Takeyama, Yasuaki; Sakisaka, Shotaro; Matsui, Toshiyuki; Kidd, Judith R; Kidd, Kenneth K; Said, Heba S; Suda, Wataru; Morita, Hidetoshi; Hattori, Masahira; Hanihara, Tsunehiko; Kimura, Ryosuke; Ishida, Hajime; Fujita, Jiro; Kinjo, Fukunori; Mano, Shuhei; Oota, Hiroki

    2017-04-01

    Crohn's disease (CD) involves chronic inflammation in the gastrointestinal tract due to dysregulation of the host immune response to the gut microbiome. Even though host-microbiome interactions are likely contributors to the development of CD, only a few studies have detected genetic variants that change bacterial compositions and increase CD risk. We focus on one of the well-replicated susceptibility genes, tumor necrosis factor superfamily member 15 (TNFSF15), and apply statistical analyses to personal profiles of genotypes and salivary microbiota collected from CD cases and controls in the Ryukyu Islands, the southernmost islands of the Japanese archipelago. Our association test confirmed the susceptibility of TNFSF15 in the Ryukyu Islands. We found that the recessive model fit the observed genotype frequencies of risk alleles slightly better than the additive model, implying that the genetic effect on CD appears when an individual carries the risk allele on both chromosomes. The combined analysis of haplotypes and salivary microbiome from a small set of samples showed a significant association of the genetic effect with an increase in Prevotella, which in turn led to a significant increase in CD risk. However, the genetic effect on CD disappeared when the abundance of Prevotella was low, suggesting that the genetic contribution to CD is conditionally independent given a fixed amount of Prevotella. Although our statistical power is limited by the small sample size, these results support the idea that the genetic susceptibility of TNFSF15 to CD may be confounded, in part, by the increase of Prevotella.
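
    The additive-versus-recessive comparison mentioned above can be sketched with a simulated case-control logistic regression in Python; the genotype frequencies, effect sizes and the use of log-likelihood/AIC for comparison are assumptions, not details of the TNFSF15 analysis.

    ```python
    # Hedged sketch: comparing additive vs. recessive coding of a risk allele in
    # a case-control logistic regression on simulated data.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(42)
    n = 2_000
    risk_allele_freq = 0.3
    genotype = rng.binomial(2, risk_allele_freq, size=n)   # 0, 1 or 2 risk alleles

    # Simulate disease status under a recessive effect (risk only for genotype 2).
    logit_p = -2.0 + 1.0 * (genotype == 2)
    y = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

    additive = sm.add_constant(genotype.astype(float))
    recessive = sm.add_constant((genotype == 2).astype(float))

    fit_add = sm.Logit(y, additive).fit(disp=False)
    fit_rec = sm.Logit(y, recessive).fit(disp=False)

    # Higher log-likelihood (or lower AIC) indicates the better-fitting coding.
    print(f"additive  log-likelihood: {fit_add.llf:.1f}, AIC: {fit_add.aic:.1f}")
    print(f"recessive log-likelihood: {fit_rec.llf:.1f}, AIC: {fit_rec.aic:.1f}")
    ```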

  14. Zooplankton community changes confound the biodilution theory of methylmercury accumulation in a recovering mercury-contaminated lake.

    PubMed

    Todorova, Svetoslava; Driscoll, Charles T; Matthews, David A; Effler, Steven W

    2015-04-07

    In this study, the biodilution hypothesis of methylmercury (MeHg) accumulation was examined in a Hg-contaminated ecosystem that has undergone concurrent changes in nutrient loading and zooplankton community composition. Using a long-term record of 17 years (between 1980 and 2009), we demonstrate that zooplankton MeHg concentrations in Onondaga Lake, NY, are strongly driven by changes in the zooplankton community and body size. MeHg concentrations in zooplankton increased with an increase in body size and biomass. The highest concentrations of MeHg were observed under eutrophic and hypereutrophic conditions when large-bodied Daphnia species, Daphnia pulicaria and Daphnia galeata mendotae, were present. Bioconcentration rather than biodilution was governing the accumulation of MeHg in zooplankton without apparent growth dilution or zooplankton biomass dilution. Algal-bloom dilution controlled the variability in the MeHg concentration only under hypereutrophic conditions when Ceriodaphnia predominated the cladoceran population. Our study demonstrates that changes in zooplankton community composition confound the biodilution theory in Onondaga Lake and that the presence of large-bodied zooplankton species drives elevated MeHg concentrations.

  15. Environment- and eye-centered inhibitory cueing effects are both observed after a methodological confound is eliminated

    PubMed Central

    He, Tao; Ding, Yun; Wang, Zhiguo

    2015-01-01

    Inhibition of return (IOR), typically explored in cueing paradigms, is a performance cost associated with previously attended locations and has been suggested as a crucial attentional mechanism that biases orienting towards novelty. In their seminal IOR paper, Posner and Cohen (1984) showed that IOR is coded in spatiotopic, or environment-centered, coordinates. Recent studies, however, have consistently reported IOR effects in both spatiotopic and retinotopic (eye-centered) coordinates. One overlooked methodological confound in all previous studies is that the spatial gradient of IOR is not considered when selecting the baseline for estimating IOR effects. This methodological issue makes it difficult to tell whether the IOR effects reported in previous studies were coded in retinotopic or spatiotopic coordinates, or in both. The present study addresses this issue by incorporating no-cue trials into a modified cueing paradigm in which the cue and target are always separated by an intervening gaze shift. The results revealed that (a) IOR is indeed coded in both spatiotopic and retinotopic coordinates, and (b) the methodology of previous work may have underestimated spatiotopic and retinotopic IOR effects. PMID:26565380

  16. The siting record: An account of the programs of federal agencies and events that have led to the selection of a potential site for a geologic repository for high-level radioactive waste

    SciTech Connect

    Lomenick, T.F.

    1996-03-01

    This record of siting a geologic repository for high-level radioactive wastes (HLW) and spent fuel describes the many investigations that culminated on December 22, 1987 in the designation of Yucca Mountain (YM) as the site to undergo detailed geologic characterization. It recounts the important issues and events that have shaped the course of siting over the last three and one half decades. In this long task, which was initiated in 1954, more than 60 regions, areas, or sites involving nine different rock types have been investigated. The effort became sharply focused in 1983 with the identification of nine potentially suitable sites for the first repository. From these nine sites, five were subsequently nominated by the U.S. Department of Energy (DOE) as suitable for characterization and then, in 1986, as required by the Nuclear Waste Policy Act of 1982 (NWPA), three of these five were recommended to the President as candidates for site characterization. President Reagan approved the recommendation on May 28, 1986. DOE was preparing site characterization plans for the three candidate sites, namely Deaf Smith County, Texas; the Hanford Site, Washington; and YM, but as a consequence of the 1987 Amendment to the NWPA, only the latter was authorized to undergo detailed characterization. A final Site Characterization Plan for Yucca Mountain was published in 1988. Prior to 1954, there was no program for the siting of HLW disposal facilities. In the 1940s and 1950s, the volume of waste, which was small and resulted entirely from military weapons and research programs, was stored as a liquid in large steel tanks buried at geographically remote government installations, principally in Washington and Tennessee.

  17. 40 CFR 97.520 - Establishment of compliance accounts, assurance accounts, and general accounts.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 21 2014-07-01 2014-07-01 false Establishment of compliance accounts, assurance accounts, and general accounts. 97.520 Section 97.520 Protection of Environment ENVIRONMENTAL... persons who have an ownership interest with respect to TR NOX Ozone Season allowances held in the...

  18. 40 CFR 97.720 - Establishment of compliance accounts, assurance accounts, and general accounts.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 21 2014-07-01 2014-07-01 false Establishment of compliance accounts, assurance accounts, and general accounts. 97.720 Section 97.720 Protection of Environment ENVIRONMENTAL... persons who have an ownership interest with respect to TR SO2 Group 2 allowances held in the...

  19. 40 CFR 97.420 - Establishment of compliance accounts, assurance accounts, and general accounts.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 22 2013-07-01 2013-07-01 false Establishment of compliance accounts, assurance accounts, and general accounts. 97.420 Section 97.420 Protection of Environment ENVIRONMENTAL... persons who have an ownership interest with respect to TR NOX Annual allowances held in the...

  20. 40 CFR 97.420 - Establishment of compliance accounts, assurance accounts, and general accounts.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 21 2014-07-01 2014-07-01 false Establishment of compliance accounts, assurance accounts, and general accounts. 97.420 Section 97.420 Protection of Environment ENVIRONMENTAL... persons who have an ownership interest with respect to TR NOX Annual allowances held in the...

  1. 40 CFR 97.620 - Establishment of compliance accounts, assurance accounts, and general accounts.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 21 2014-07-01 2014-07-01 false Establishment of compliance accounts, assurance accounts, and general accounts. 97.620 Section 97.620 Protection of Environment ENVIRONMENTAL... persons who have an ownership interest with respect to TR SO2 Group 1 allowances held in the...

  2. 40 CFR 97.620 - Establishment of compliance accounts, assurance accounts, and general accounts.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 22 2013-07-01 2013-07-01 false Establishment of compliance accounts, assurance accounts, and general accounts. 97.620 Section 97.620 Protection of Environment ENVIRONMENTAL... persons who have an ownership interest with respect to TR SO2 Group 1 allowances held in the...

  3. 40 CFR 97.720 - Establishment of compliance accounts, assurance accounts, and general accounts.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 22 2013-07-01 2013-07-01 false Establishment of compliance accounts, assurance accounts, and general accounts. 97.720 Section 97.720 Protection of Environment ENVIRONMENTAL... persons who have an ownership interest with respect to TR SO2 Group 2 allowances held in the...

  4. 40 CFR 97.520 - Establishment of compliance accounts, assurance accounts, and general accounts.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 22 2013-07-01 2013-07-01 false Establishment of compliance accounts, assurance accounts, and general accounts. 97.520 Section 97.520 Protection of Environment ENVIRONMENTAL... persons who have an ownership interest with respect to TR NOX Ozone Season allowances held in the...

  5. Do time-invariant confounders explain away the association between job stress and workers' mental health? Evidence from Japanese occupational panel data.

    PubMed

    Oshio, Takashi; Tsutsumi, Akizumi; Inoue, Akiomi

    2015-02-01

    It is well known that job stress is negatively related to workers' mental health, but most recent studies have not controlled for unobserved time-invariant confounders. In the current study, we attempted to validate previous observations on the association between job stress and workers' mental health by removing the effects of unobserved time-invariant confounders. We used data from three to four waves of a Japanese occupational cohort survey, focusing on 31,382 observations of 9741 individuals who participated in at least two consecutive waves. We estimated mean-centered fixed effects models to explain psychological distress in terms of the Kessler 6 (K6) scores (range: 0-24) by eight job stress indicators related to the job demands-control, effort-reward imbalance, and organizational injustice models. Mean-centered fixed effects models reduced the magnitude of the association between job stress and K6 scores to 44.8-54.2% of those observed with pooled ordinary least squares. However, the association remained highly significant even after controlling for unobserved time-invariant confounders for all job stress indicators. In addition, alternatively specified models showed the robustness of the results. In all, we concluded that the validity of major job stress models, which link job stress and workers' mental health, was robust, although unobserved time-invariant confounders led to an overestimation of the association.
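
    The mean-centered (within-individual) fixed effects estimator described above can be illustrated with a short, hypothetical sketch. This is not the authors' analysis; the variable names (worker, stress, k6) and the simulated data are assumptions chosen only to show how demeaning within individuals removes time-invariant confounding relative to pooled OLS.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical long-format panel: one row per worker-wave, a single job-stress
# indicator, and a K6 distress score (the study uses eight stress indicators).
rng = np.random.default_rng(0)
n_workers, waves = 500, 4
trait = np.repeat(rng.normal(size=n_workers), waves)       # unobserved, time-invariant
stress = 0.5 * trait + rng.normal(size=n_workers * waves)  # stress correlates with trait
k6 = 1.0 * stress + 2.0 * trait + rng.normal(size=n_workers * waves)
df = pd.DataFrame({"worker": np.repeat(np.arange(n_workers), waves),
                   "stress": stress, "k6": k6})

# Pooled OLS ignores the time-invariant confounder and overstates the association.
pooled = sm.OLS(df["k6"], sm.add_constant(df[["stress"]])).fit()

# Mean-centered fixed effects: demean each variable within worker, which sweeps
# out every time-invariant confounder such as `trait`.
within = df.groupby("worker")[["k6", "stress"]].transform(lambda s: s - s.mean())
fixed = sm.OLS(within["k6"], within[["stress"]]).fit()

print(f"pooled OLS slope:    {pooled.params['stress']:.2f}")
print(f"fixed-effects slope: {fixed.params['stress']:.2f}")
```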

  6. Lumbar disc degeneration was not related to spine and hip bone mineral densities in Chinese: facet joint osteoarthritis may confound the association.

    PubMed

    Pan, Jianjiang; Lu, Xuan; Yang, Ge; Han, Yongmei; Tong, Xiang; Wang, Yue

    2017-12-01

    A sample of 512 Chinese subjects was studied, and we observed that greater disc degeneration on MRI was associated with greater spine DXA BMD. Yet this association may be confounded by facet joint osteoarthritis. BMD may not be a risk factor for lumbar disc degeneration in the Chinese population.

  7. Issues Relating to Confounding and Meta-analysis When Including Non-Randomized Studies in Systematic Reviews on the Effects of Interventions

    ERIC Educational Resources Information Center

    Valentine, Jeffrey C.; Thompson, Simon G.

    2013-01-01

    Background: Confounding caused by selection bias is often a key difference between non-randomized studies (NRS) and randomized controlled trials (RCTs) of interventions. Key methodological issues: In this third paper of the series, we consider issues relating to the inclusion of NRS in systematic reviews on the effects of interventions. We discuss…

  8. Using Rich Data on Comorbidities in Case-Control Study Design with Electronic Health Record Data Improves Control of Confounding in the Detection of Adverse Drug Reactions

    PubMed Central

    Chase, Herbert

    2016-01-01

    Recent research has suggested that the case-control study design, unlike the self-controlled study design, performs poorly in controlling confounding in the detection of adverse drug reactions (ADRs) from administrative claims and electronic health record (EHR) data, resulting in biased estimates of the causal effects of drugs on health outcomes of interest (HOI) and inaccurate confidence intervals. Here we show that using rich data on comorbidities and automatic variable selection strategies for selecting confounders can better control confounding within a case-control study design and provide a more solid basis for inference regarding the causal effects of drugs on HOIs. Four HOIs are examined: acute kidney injury, acute liver injury, acute myocardial infarction and gastrointestinal ulcer hospitalization. For each of these HOIs we use a previously published reference set of positive and negative control drugs to evaluate the performance of our methods. Our methods have AUCs that are often substantially higher than the AUCs of a baseline method that only uses demographic characteristics for confounding control. Our methods also give confidence intervals for causal effect parameters that cover the expected no effect value substantially more often than this baseline method. The case-control study design, unlike the self-controlled study design, can be used in the fairly typical setting of EHR databases without longitudinal information on patients. With our variable selection method, these databases can be more effectively used for the detection of ADRs. PMID:27716785
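
    As a rough, hypothetical sketch of the idea of automatic confounder selection in a case-control analysis (not the authors' exact pipeline, which is evaluated against a reference set of positive and negative control drugs), an L1-penalized logistic regression over many comorbidity indicators might look as follows; all variable names and simulated quantities are assumptions.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical case-control data: a drug exposure plus many comorbidity
# indicators extracted from the EHR, one of which also drives prescribing.
rng = np.random.default_rng(1)
n, p = 2000, 50
comorbidities = rng.binomial(1, 0.2, size=(n, p))
confounder = comorbidities[:, 0]
exposure = rng.binomial(1, 0.2 + 0.4 * confounder)
logit = -2.0 + 0.5 * exposure + 1.5 * confounder
outcome = rng.binomial(1, 1 / (1 + np.exp(-logit)))        # case/control status

X = np.column_stack([exposure, comorbidities])

# Baseline: exposure only (analogous to adjusting for demographics alone).
crude = LogisticRegression().fit(X[:, [0]], outcome)

# L1 penalty performs automatic variable selection among the comorbidities,
# shrinking irrelevant covariates toward zero while keeping real confounders
# (in practice one might leave the exposure term itself unpenalized).
adjusted = LogisticRegression(penalty="l1", solver="liblinear", C=0.5).fit(X, outcome)

print(f"crude log-odds ratio for the drug:    {crude.coef_[0, 0]:.2f}")
print(f"adjusted log-odds ratio for the drug: {adjusted.coef_[0, 0]:.2f}")
```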

  9. A procedural, pragmatist account of ethical objectivity.

    PubMed

    Roth, Amanda

    2013-06-01

    This article offers a procedural, pragmatist account of objectivity in the domain of the good that is inspired by pragmatic and feminist critiques of objectivity in philosophy of science and epistemology. I begin by asking first what we want to capture--or ought to want to capture--with a notion of ethical objectivity and in answer to this question I identify four "points" to ethical objectivity: undergirding the possibility of mistakenness, making genuine disagreement possible, making sense of our appreciation of the ethical perspectives of others, and making possible a sense of ethical improvement or learning. I then lay out a process-based account of objectivity in ethics that makes good on the four points I have identified. Finally, I consider worries related to convergence, bias, and ontology and defend the procedural, pragmatist account in light of those potential objections.

  10. Are the effects of psychosocial exposures attributable to confounding? Evidence from a prospective observational study on psychological stress and mortality

    PubMed Central

    Macleod, J; Davey, S; Heslop, P; Metcalfe, C; Carroll, D; Hart, C

    2001-01-01

    .69 (95% CI 0.44, 1.09), p for trend 0.12, fully adjusted 0.76 (95% CI 0.48, 1.21), p for trend 0.25; increased compared with decreased stress, age adjusted 0.65 (95% CI 0.40, 1.06), p for trend 0.09, fully adjusted 0.65 (95% CI 0.40, 1.06), p for trend 0.08.
CONCLUSIONS—This implausible protective relation between higher levels of stress, which were associated with increased smoking, and mortality from smoking related cancers, was probably a product of confounding. Plausible reported associations between psychosocial exposures and disease, in populations where such exposures are associated with material disadvantage, may be similarly produced by confounding, and of no causal significance.


Keywords: socioeconomic differentials; psychosocial factors; mortality PMID:11707481

  11. Local Field Potentials: Myths and Misunderstandings

    PubMed Central

    Herreras, Oscar

    2016-01-01

    The intracerebral local field potential (LFP) is a measure of brain activity that reflects the highly dynamic flow of information across neural networks. This is a composite signal that receives contributions from multiple neural sources, yet interpreting its nature and significance may be hindered by several confounding factors and technical limitations. By and large, the main factor defining the amplitude of LFPs is the geometry of the current sources, over and above the degree of synchronization or the properties of the media. As such, similar levels of activity may result in potentials that differ by several orders of magnitude in different populations. The geometry of these sources had been experimentally inaccessible until intracerebral high-density recordings enabled the co-activating sources to be revealed. Without this information, it has proven difficult to interpret a century's worth of recordings that used temporal cues alone, such as event- or spike-related potentials and frequency bands. Meanwhile, a collection of biophysically ill-founded concepts has been considered legitimate, which can now be corrected in the light of recent advances. The relationship of LFPs to their sources is often counterintuitive. For instance, most LFP activity is not local but remote, it may be larger further from rather than close to the source, the polarity does not define its excitatory or inhibitory nature, and the amplitude may increase when the source's activity is reduced. As technological developments foster the use of LFPs, the time is now ripe to raise awareness of the need to take into account spatial aspects of these signals and of the errors derived from neglecting to do so. PMID:28018180

  12. A method for improving predictive modeling by taking into account lag time: Example of selenium bioaccumulation in a flowing system.

    PubMed

    Beckon, William N

    2016-07-01

    For bioaccumulative substances, efforts to predict concentrations in organisms at upper trophic levels, based on measurements of environmental exposure, have been confounded by the appreciable but hitherto unknown amount of time it may take for bioaccumulation to occur through various pathways and across several trophic transfers. The study summarized here demonstrates an objective method of estimating this lag time by testing a large array of potential lag times for selenium bioaccumulation, selecting the lag that provides the best regression between environmental exposure (concentration in ambient water) and concentration in the tissue of the target organism. Bioaccumulation lag is generally greater for organisms at higher trophic levels, reaching times of more than a year in piscivorous fish. Predictive modeling of bioaccumulation is improved appreciably by taking into account this lag. More generally, the method demonstrated here may improve the accuracy of predictive modeling in a wide variety of other cause-effect relationships in which lag time is substantial but inadequately known, in disciplines as diverse as climatology (e.g., the effect of greenhouse gases on sea levels) and economics (e.g., the effects of fiscal stimulus on employment).
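
    A minimal sketch of the lag-selection idea (hypothetical data and variable names, not the study's selenium dataset): regress tissue concentration on water concentration at each candidate lag and keep the lag that fits best.

```python
import numpy as np

# Hypothetical monthly series: waterborne selenium and a fish-tissue
# concentration that, in truth, responds with a 12-month delay.
rng = np.random.default_rng(2)
months, true_lag = 120, 12
water = np.sin(np.arange(months) / 6.0) + rng.normal(0, 0.2, months)
tissue = np.empty(months)
tissue[true_lag:] = 2.0 * water[:-true_lag] + rng.normal(0, 0.2, months - true_lag)
tissue[:true_lag] = rng.normal(0, 0.2, true_lag)

def r_squared(x, y):
    """Coefficient of determination for a simple linear regression of y on x."""
    r = np.corrcoef(x, y)[0, 1]
    return r ** 2

# Test a large array of candidate lags and keep the one giving the best
# regression of tissue concentration on lagged water concentration.
fits = {lag: r_squared(water[:months - lag], tissue[lag:]) for lag in range(37)}
best_lag = max(fits, key=fits.get)
print(f"best-fitting lag: {best_lag} months (R^2 = {fits[best_lag]:.2f})")
```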

  13. A new automated method for rat sleep deprivation with minimal confounding effects on corticosterone and locomotor activity.

    PubMed

    Leenaars, Cathalijn H C; Dematteis, Maurice; Joosten, Ruud N J M A; Eggels, Leslie; Sandberg, Hans; Schirris, Mischa; Feenstra, Matthijs G P; Van Someren, Eus J W

    2011-03-15

    The function of sleep in physiology, behaviour and cognition has become a primary focus of neuroscience. Its study inevitably includes experimental sleep deprivation designs. However, concerns exist regarding confounds like stress, increased locomotor activity levels, and decreased motivation to perform operant tasks induced by the methods employed. We here propose a novel procedure for sleep deprivation in rats and evaluate how it affects sleep, corticosterone concentration profiles, locomotor activity levels, and motivation to perform an operant task. Before, during and after 12h of total sleep deprivation by means of gradually increasing the rotation variability and the speed of a novel automated, two-compartment sleep deprivation device, sleep-wake states were assessed by electroencephalography (n=21), brain extracellular corticosterone concentrations using microdialysis (n=11), locomotor activity by infrared measurements (n=8), and operant performance using a fixed-interval-fixed-ratio task (n=16). Sleep was effectively prevented during the procedure; rats on average slept less than 1% of the time (0.8±0.2%, mean±standard error). Brain corticosterone concentrations were mildly increased during the procedure, but did not exceed normal peak concentrations. Locomotor activity was not only increased during the procedure, but also did not exceed the peak levels found during undisturbed wakefulness. Food restriction to 12 g/rat/day prevented sleep deprivation from reducing the motivation to perform an operant task. This novel procedure can be applied to sleep deprive rats in a highly effective way, while keeping corticosterone and locomotor activity within the normal range.

  14. Parametric Mediational g-Formula Approach to Mediation Analysis with Time-varying Exposures, Mediators, and Confounders.

    PubMed

    Lin, Sheng-Hsuan; Young, Jessica; Logan, Roger; Tchetgen Tchetgen, Eric J; VanderWeele, Tyler J

    2017-03-01

    The assessment of direct and indirect effects with time-varying mediators and confounders is a common but challenging problem, and standard mediation analysis approaches are generally not applicable in this context. The mediational g-formula was recently proposed to address this problem, paired with a semiparametric estimation approach to evaluate longitudinal mediation effects empirically. In this article, we develop a parametric estimation approach to the mediational g-formula, including a feasible algorithm implemented in a freely available SAS macro. In the Framingham Heart Study data, we apply this method to estimate the interventional analogues of natural direct and indirect effects of smoking behaviors sustained over a 10-year period on blood pressure when considering weight change as a time-varying mediator. Compared with not smoking, smoking 20 cigarettes per day for 10 years was estimated to increase blood pressure by 1.2 mm Hg (95% CI: -0.7, 2.7). The direct effect was estimated to increase blood pressure by 1.5 mm Hg (95% CI: -0.3, 2.9), and the indirect effect was -0.3 mm Hg (95% CI: -0.5, -0.1), which is negative because smoking, which is associated with lower weight, is in turn associated with lower blood pressure. These results provide evidence that weight change in fact partially conceals the detrimental effects of cigarette smoking on blood pressure. Our study represents, to our knowledge, the first application of the parametric mediational g-formula in an epidemiologic cohort study (see video abstract at http://links.lww.com/EDE/B159).
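
    The full parametric mediational g-formula handles time-varying exposures, mediators, and confounders, and the authors provide a SAS macro for it. Purely as a hedged illustration of the underlying mechanics, the sketch below applies a single-time-point Monte Carlo version to simulated data (the variables A, M, C, Y and all coefficients are assumptions), computing interventional analogues of direct and indirect effects by drawing the mediator from its fitted distribution under one exposure level while setting the exposure to another.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 5000

# Hypothetical data: confounder C, binary exposure A (smoking), mediator M
# (weight change), and continuous outcome Y (blood pressure).
C = rng.normal(size=n)
A = rng.binomial(1, 1 / (1 + np.exp(-C)))
M = -1.0 * A + 0.5 * C + rng.normal(size=n)           # smoking lowers weight
Y = 2.0 * A + 1.0 * M + 0.8 * C + rng.normal(size=n)

# Step 1: parametric models for the mediator and the outcome.
med_fit = sm.OLS(M, np.column_stack([np.ones(n), A, C])).fit()
out_fit = sm.OLS(Y, np.column_stack([np.ones(n), A, M, C])).fit()

def mean_outcome(a_out, a_med, draws=20):
    """Monte Carlo estimate of E[Y(a_out, G(a_med))]: draw M from its fitted
    distribution under exposure a_med, then predict Y under exposure a_out."""
    sigma = np.sqrt(med_fit.scale)
    means = []
    for _ in range(draws):
        m_sim = med_fit.predict(np.column_stack([np.ones(n), np.full(n, a_med), C]))
        m_sim = m_sim + rng.normal(0, sigma, n)
        y_hat = out_fit.predict(np.column_stack([np.ones(n), np.full(n, a_out), m_sim, C]))
        means.append(y_hat.mean())
    return np.mean(means)

total = mean_outcome(1, 1) - mean_outcome(0, 0)
direct = mean_outcome(1, 0) - mean_outcome(0, 0)
indirect = mean_outcome(1, 1) - mean_outcome(1, 0)
print(f"total {total:.2f}, direct {direct:.2f}, indirect {indirect:.2f}")
```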

  15. Type 2 diabetes as a risk factor for Alzheimer's disease: the confounders, interactions, and neuropathology associated with this relationship.

    PubMed

    Vagelatos, Nicholas T; Eslick, Guy D

    2013-01-01

    We performed a systematic review and meta-analysis to explore whether type 2 diabetes mellitus (T2DM) increases the risk of Alzheimer's disease (AD). We also reviewed interactions with smoking, hypertension, and apolipoprotein E ɛ4. Using a series of databases (MEDLINE, EMBASE, PubMed, Current Contents Connect, and Google Scholar), we identified a total of 15 epidemiologic studies. Fourteen studies reported positive associations, of which 9 were statistically significant. Risk estimates ranged from 0.83 to 2.45. The pooled adjusted risk ratio was 1.57 (95% confidence interval: 1.41, 1.75), with a population-attributable risk of 8%. Smoking and hypertension, when comorbid with T2DM, had odds of 14 and 3, respectively. Of the 5 studies that investigated the interaction between T2DM and apolipoprotein E ɛ4, 4 showed positive associations, of which 3 were significant, with odds ranging from 2.4 to 4.99. The pooled adjusted risk ratio was 2.91 (95% confidence interval: 1.51, 5.61). Risk estimates were presented in the context of a key confounder, cerebral infarcts, which are more common in those with T2DM and might contribute to the manifestation of clinical AD. We provide evidence from clinico-neuropathologic studies that demonstrates the following: First, cerebral infarcts are more common than AD-type pathology in those with T2DM and dementia. Second, those with dementia at postmortem are more likely to have both AD-type and cerebrovascular pathologies. Finally, cerebral infarcts reduce the number of AD lesions required for the manifestation of clinical dementia, but they do not appear to interact synergistically with AD-type pathology. Therefore, the increased risk of clinically diagnosed AD seems to be mediated through cerebrovascular pathology.
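
    The pooled risk ratios quoted above come from a meta-analysis; as a hedged illustration of how such a pooled estimate can be formed, the sketch below applies simple fixed-effect inverse-variance weighting on the log scale to made-up study results (the actual review may well have used a random-effects model).

```python
import numpy as np

# Hypothetical per-study adjusted risk ratios and 95% confidence intervals;
# the pooled value is obtained by inverse-variance weighting on the log scale.
rr = np.array([1.3, 1.8, 1.5, 2.1, 1.2])
ci_low = np.array([1.0, 1.2, 1.1, 1.4, 0.8])
ci_high = np.array([1.7, 2.7, 2.0, 3.1, 1.8])

log_rr = np.log(rr)
se = (np.log(ci_high) - np.log(ci_low)) / (2 * 1.96)   # SE recovered from CI width
w = 1 / se**2                                          # fixed-effect weights

pooled = np.exp(np.sum(w * log_rr) / np.sum(w))
pooled_se = np.sqrt(1 / np.sum(w))
ci = np.exp(np.log(pooled) + np.array([-1.96, 1.96]) * pooled_se)
print(f"pooled RR {pooled:.2f} (95% CI {ci[0]:.2f}, {ci[1]:.2f})")
```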

  16. Toward Reflective Accountability: Using NSSE for Accountability and Transparency

    ERIC Educational Resources Information Center

    McCormick, Alexander C.

    2009-01-01

    Accountability pressures in higher education are not new; they are part of an enduring public policy discourse about the costs and benefits, both individual and social, of higher education. What is relatively new, however, is the prominent place that issues of accountability now occupy on the nation's higher education agenda. There is an important…

  17. Community Accountability: A Theory of Information, Accountability, and School Improvement.

    ERIC Educational Resources Information Center

    Henry, Gary T.

    1996-01-01

    Punitive accountability systems rely on externally induced motives (avoiding sanctions) to improve low-rung educational performance. Community accountability uses information to bring the public and its schools together and improve total school performance. Emphasis is on positive, internal mechanisms to motivate performance and use of market…

  18. Do migrants worsen the current account?

    PubMed

    Taslim, M A

    1998-01-01

    Australia's foreign liabilities have increased rapidly since the 1980s. While total net foreign liabilities were only 13.5% of GDP in 1979, they reached 58.4% in 1995-96. Together with these liabilities, service payments also rose and now consume about 20% of export earnings. Fear exists among Australia's general public and policymakers that high indebtedness may leave the national economy exposed to risk from international capital markets, impose a heavy repayment burden upon future generations, and reduce future growth potential. Considerable attention is therefore given in public debate to reducing the level of foreign liabilities. There is some concern in Australia that immigration contributes to a widening of its current account deficit, and several cross-sectional studies have found that migrant households have a lower savings rate than do local-born households. Aggregate time series data are used to investigate the relationship between the current account and immigration. Analysis of the data indicates that although an increase in net migration tends to raise the current account deficit, the longer term effect of immigration upon the current account is negligible. One should, however, understand that immigration impacts upon the economy in a complex way through various supply and demand side channels, with direct and chain effects upon variables such as the current account spread over the short and long terms. The effects are neither unidirectional nor always easy to isolate. The final outcome, which is the sum of all the effects, is uncertain.

  19. A study of the potential confounding effects of diet, caffeine, nicotine and lorazepam on the stability of plasma and urinary homovanillic acid levels in patients with schizophrenia.

    PubMed

    Donnelly, C L; McEvoy, J P; Wilson, W H; Narasimhachari, N

    1996-12-15

    Ten male inpatients who met DSM-III-R criteria for schizophrenia participated. On five occasions at least one week apart, each subject had an intravenous line placed at 0730 after an overnight fast. On each occasion blood samples were drawn at 0800 and hourly thereafter through 1200 noon for measurement of plasma homovanillic acid (HVA). Total four-hour urine collections were obtained for measurement of urinary HVA. Subjects received five experimental conditions, in randomized sequence: no intervention, smoking one cigarette per hour, drinking one caffeinated cola per hour, lorazepam 2 mg IV push, or a high-monoamine meal. Baseline (0800) plasma HVA measures showed only minor intrinsic variability. The average standard deviation in baseline plasma HVA over five occasions of measurement was low relative to the changes in HVA produced during treatment with antipsychotic medications. The high-monoamine meal significantly elevated plasma HVA, with a similar trend for urinary HVA. Neither caffeine, nicotine, nor lorazepam significantly affected plasma or urinary HVA.

  20. Genomic data reveals potential for hybridization, introgression, and incomplete lineage sorting to confound phylogenetic relationships in an adaptive radiation of narrow-mouth frogs.

    PubMed

    Alexander, Alana M; Su, Yong-Chao; Oliveros, Carl H; Olson, Karen V; Travers, Scott L; Brown, Rafe M

    2017-02-01

    The microhylid frog genus Kaloula is an adaptive radiation spanning the edge of the Asian mainland and multiple adjacent island archipelagos, with much of the clade's diversity associated with an endemic Philippine radiation. Relationships among clades from the Philippines, however, remain unresolved. With ultraconserved element (UCE) and mitogenomic data, we identified highly supported differences in topology and areas of poor resolution, for each marker set. Using the UCE data, we then identified possible instances of contemporary hybridization, past introgression, and incomplete lineage sorting (ILS) within the Philippine Kaloula. Using a simulation approach, and an estimate of the Philippine Kaloula clade origin (12.7-21.0 mya), we demonstrate that an evolutionary history including inferred instances of hybridization, introgression, and ILS leads to phylogenetic reconstructions that show concordance with results from the observed mitogenome and UCE data. In the process of validating a complex evolutionary scenario in the Philippine Kaloula, we provide the first demonstration of the efficacy of UCE data for phylogenomic studies of anuran amphibians.

  1. The Accountability Illusion: New Jersey

    ERIC Educational Resources Information Center

    Thomas B. Fordham Institute, 2009

    2009-01-01

    The intent of the No Child Left Behind (NCLB) Act of 2001 is to hold schools accountable for ensuring that all their students achieve mastery in reading and math, with a particular focus on groups that have traditionally been left behind. Under NCLB, states submit accountability plans to the U.S. Department of Education detailing the rules and…

  2. Efficacy and Accountability in Organizations.

    ERIC Educational Resources Information Center

    Reitzug, Ulrich C.

    This study examined the relationship among accountability, efficacy, and organizational effectiveness by integrating findings from 17 research and development reports on Management by Objectives (MBO), an intervention that incorporates elements and processes of both accountability (goal-setting, measuring and monitoring, feedback) and efficacy…

  3. Training within the Accounting Firm.

    ERIC Educational Resources Information Center

    Finch, Beth; And Others

    1991-01-01

    A survey received 509 responses from 2,000 randomly selected accounting employees about which training topics are receiving the most attention and who is receiving the training. Results indicate that training has become an integral part of a certified public accountant's job; topics most often covered were tax related--individual and corporate income…

  4. Teacher Accountability: Trends and Policies.

    ERIC Educational Resources Information Center

    Ornstein, Allan C.

    1986-01-01

    Describes the growing public demand for holding teachers accountable for student performance. Asserts that this position assumes that effectiveness can be measured, whereas the influence teachers have on student achievement is complex and variable, and may be less than that of family and peers. Describes various State-effort accountability plans.…

  5. Integrating Systems into Accounting Instruction.

    ERIC Educational Resources Information Center

    Heatherington, Ralph

    1980-01-01

    Incorporating a discussion of systems into the beginning accounting class gives students a more accurate picture of business and the role accounting plays in it. Students should understand the purpose of forms, have a basic knowledge of flowcharting principles and symbols, and know how source documents are created. (CT)

  6. Reengineering Elementary Accounting. Final Report.

    ERIC Educational Resources Information Center

    California State Univ., Chico.

    This final report describes activities and accomplishments of a 3-year project at California State University Chico (CSUC) to reengineer the 2-semester elementary accounting course. The new model emphasized, first, shifting from the traditional view of the preparer of accounting information to that of the user; second, forcing the student to adopt…

  7. Careers for Women in Accounting

    ERIC Educational Resources Information Center

    Rayburn, Letricia Gayle

    1976-01-01

    This survey showed that most accounting firms are either actively trying to solve the problem of discrimination or are at least interested in seeking solutions. Some companies indicated that they would hire more women college graduates if they were qualified accountants. (Author)

  8. Incentives for Accountability. ERIC Digest.

    ERIC Educational Resources Information Center

    Lashway, Larry

    Policymakers and educators are taking a new look at incentives as they work to improve accountability systems. This ERIC Digest examines the role of rewards and sanctions in school reform and identifies key issues in implementing incentive systems. The new accountability is based on five components: carefully designed standards, assessments…

  9. An Accounting International Experience Course

    ERIC Educational Resources Information Center

    Johnson, Leigh Redd; Rudolph, Holly R.; Seay, Robert A.

    2010-01-01

    Accounting students need practical opportunities to personally experience other cultures and international business practices if they are to effectively compete in today's global marketplace. In order to address this need, the Department of Accounting at Murray State University offers an international experience course which includes a short-term…

  10. The Accountability Illusion: New Hampshire

    ERIC Educational Resources Information Center

    Thomas B. Fordham Institute, 2009

    2009-01-01

    The intent of the No Child Left Behind (NCLB) Act of 2001 is to hold schools accountable for ensuring that all their students achieve mastery in reading and math, with a particular focus on groups that have traditionally been left behind. Under NCLB, states submit accountability plans to the U.S. Department of Education detailing the rules and…

  11. The Accountability Illusion: Rhode Island

    ERIC Educational Resources Information Center

    Thomas B. Fordham Institute, 2009

    2009-01-01

    The intent of the No Child Left Behind (NCLB) Act of 2001 is to hold schools accountable for ensuring that all their students achieve mastery in reading and math, with a particular focus on groups that have traditionally been left behind. Under NCLB, states submit accountability plans to the U.S. Department of Education detailing the rules and…

  12. Machine Accounting. An Instructor's Guide.

    ERIC Educational Resources Information Center

    Gould, E. Noah, Ed.

    Designed to prepare students to operate the types of accounting machines used in many medium-sized businesses, this instructor's guide presents a full-year high school course in machine accounting covering 120 hours of instruction. An introduction for the instructor suggests how to adapt the guide to present a 60-hour module which would be…

  13. The Accountability Illusion: New Mexico

    ERIC Educational Resources Information Center

    Thomas B. Fordham Institute, 2009

    2009-01-01

    The intent of the No Child Left Behind (NCLB) Act of 2001 is to hold schools accountable for ensuring that all their students achieve mastery in reading and math, with a particular focus on groups that have traditionally been left behind. Under NCLB, states submit accountability plans to the U.S. Department of Education detailing the rules and…

  14. GASB's Basis of Accounting Project.

    ERIC Educational Resources Information Center

    Kovlak, Daniel L.

    1986-01-01

    In July 1984, the Governmental Accounting Standards Board began its "Measurement Focus/Basis of Accounting" project, which addresses measurement issues and revenue and expenditure recognition problems involving governmental funds. This article explains the project's background, alternatives discussed by the board, and tentative…

  15. School Centered Evidence Based Accountability

    ERIC Educational Resources Information Center

    Milligan, Charles

    2015-01-01

    Achievement scores drive much of the effort in today's accountability system; however, there is much more that occurs in every school, every day. School Centered Evidence Based Accountability can be used from the micro to the macro level, giving school boards and administrations a process for monitoring the results of the entire school operation effectively and…

  16. Revised Accounting for Business Combinations

    ERIC Educational Resources Information Center

    Wilson, Arlette C.; Key, Kimberly

    2008-01-01

    The Financial Accounting Standards Board (FASB) has recently issued Statement of Financial Accounting Standards No. 141 (Revised 2007) Business Combinations. The object of this Statement is to improve the relevance, representational faithfulness, and comparability of reported information about a business combination and its effects. This Statement…

  17. Canadian Accountants: Examining Workplace Learning

    ERIC Educational Resources Information Center

    Hicks, Elizabeth; Bagg, Robert; Doyle, Wendy; Young, Jeffrey D.

    2007-01-01

    Purpose: This paper seeks to examine workplace learning strategies, learning facilitators and learning barriers of public accountants in Canada across three professional levels--trainees, managers, and partners. Design/methodology/approach: Volunteer participants from public accounting firms in Nova Scotia and New Brunswick completed a demographic…

  18. Accounting Experiences in Collaborative Learning

    ERIC Educational Resources Information Center

    Edmond, Tracie; Tiggeman, Theresa

    2009-01-01

    This paper discusses incorporating collaborative learning into accounting classes as a response to the Accounting Education Change Commission's call to install a more active student learner in the classroom. Collaborative learning requires the students to interact with each other and with the material within the classroom setting. It is a…

  19. PLATO Instruction for Elementary Accounting.

    ERIC Educational Resources Information Center

    McKeown, James C.

    A progress report of a study using computer assisted instruction (CAI) materials for an elementary course in accounting principles is presented. The study was based on the following objectives: (1) improvement of instruction in the elementary accounting sequence, and (2) help for transfer students from two-year institutions. The materials under…

  20. What accounts for the association between late preterm births and risk of asthma?

    PubMed Central

    Voge, Gretchen A.; Carey, William A.; Ryu, Euijung; King, Katherine S.; Wi, Chung-Il

    2017-01-01

    Background: Although results of many studies have indicated an increased risk of asthma in former late preterm (LPT) infants, most of these studies did not fully address covariate imbalance. Objective: To compare the cumulative frequency of asthma in a population-based cohort of former LPT infants to that of matched term infants in their early childhood, when accounting for covariate imbalance. Methods: From a population-based birth cohort of children born 2002–2006 in Olmsted County, Minnesota, we assessed a random sample of LPT (34 to 36 6/7 weeks) and frequency-matched term (37 to 40 6/7 weeks) infants. The subjects were followed up through 2010 or censored based on the last date of contact, with the asthma status based on predetermined criteria. The Kaplan-Meier method was used to estimate the cumulative incidence of asthma during the study period. Cox models were used to estimate the hazard ratio and 95% confidence interval for the risk of asthma, when adjusting for potential confounders. Results: LPT infants (n = 282) had a higher cumulative frequency of asthma than did term infants (n = 297), 29.9 versus 19.5%, respectively; p = 0.01. After adjusting for covariates associated with the risk of asthma, an LPT birth was not associated with a risk of asthma, whereas maternal smoking during pregnancy was associated with a risk of asthma. Conclusion: LPT birth was not independently associated with a risk of asthma and other atopic conditions. Clinicians should make an effort to reduce exposure to smoking during pregnancy as a modifiable risk factor for asthma. PMID:28234052
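
    As a hedged sketch of the kind of adjusted survival model described in the Methods (not the authors' data or covariate set), the example below fits a Cox proportional hazards model with the lifelines package to simulated children, where the confounder rather than late-preterm birth carries the asthma risk; all names and numbers are assumptions.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(7)
n = 600

# Hypothetical cohort: late-preterm indicator, maternal smoking as a
# confounder, follow-up time in years, and an asthma-onset indicator.
late_preterm = rng.binomial(1, 0.5, n)
smoking = rng.binomial(1, 0.2 + 0.1 * late_preterm, n)
hazard = 0.05 * np.exp(0.0 * late_preterm + 0.8 * smoking)
time_to_asthma = rng.exponential(1 / hazard)
follow_up = np.minimum(time_to_asthma, 8.0)          # administrative censoring at 8 years
asthma = (time_to_asthma <= 8.0).astype(int)

df = pd.DataFrame({"late_preterm": late_preterm, "smoking": smoking,
                   "time": follow_up, "asthma": asthma})

# Cox model adjusting for the confounder; the late-preterm hazard ratio should
# sit near 1 while smoking carries the risk, mirroring the study's conclusion.
cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="asthma")
cph.print_summary()
```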

  1. Why revisit your cost-accounting strategy.

    PubMed

    Arredondo, Ricky

    2014-07-01

    Healthcare entities seeking to develop effective cost-accounting systems should take six steps to avoid potential pitfalls: Secure broad executive-level support for the effort. Ensure systems are in place to analyze the disparate data. Define measurable objectives to ensure that implementation achieves desired results. Give due consideration to implementation planning. Train support staff sufficiently to avoid underutilization. Develop a sufficiently broad base of staff support for the system.

  2. The Persistent Problems and Confounding Challenges of Educator Incentives: The Case of TIF in Prince George's County, Maryland

    ERIC Educational Resources Information Center

    Rice, Jennifer King; Malen, Betty; Baumann, Paul; Chen, Elke; Dougherty, Amy; Hyde, Laura; Jackson, Cara; Jacobson, Reuben; McKithen, Clarissa

    2012-01-01

    While education accountability systems emphasize teacher quality as a prerequisite for student learning, education administrators have struggled to staff low-performing schools with effective teachers. Fueled in part by the federal Teacher Incentive Fund, compensation reforms have gained center stage status among strategies aimed at improving…

  3. Accounting for population stratification in DNA methylation studies.

    PubMed

    Barfield, Richard T; Almli, Lynn M; Kilaru, Varun; Smith, Alicia K; Mercer, Kristina B; Duncan, Richard; Klengel, Torsten; Mehta, Divya; Binder, Elisabeth B; Epstein, Michael P; Ressler, Kerry J; Conneely, Karen N

    2014-04-01

    DNA methylation is an important epigenetic mechanism that has been linked to complex diseases and is of great interest to researchers as a potential link between genome, environment, and disease. As the scale of DNA methylation association studies approaches that of genome-wide association studies, issues such as population stratification will need to be addressed. It is well-documented that failure to adjust for population stratification can lead to false positives in genetic association studies, but population stratification is often unaccounted for in DNA methylation studies. Here, we propose several approaches to correct for population stratification using principal components (PCs) from different subsets of genome-wide methylation data. We first illustrate the potential for confounding due to population stratification by demonstrating widespread associations between DNA methylation and race in 388 individuals (365 African American and 23 Caucasian). We subsequently evaluate the performance of our PC-based approaches and other methods in adjusting for confounding due to population stratification. Our simulations show that (1) all of the methods considered are effective at removing inflation due to population stratification, and (2) maximum power can be obtained with single-nucleotide polymorphism (SNP)-based PCs, followed by methylation-based PCs, which outperform both surrogate variable analysis and genomic control. Among our different approaches to computing methylation-based PCs, we find that PCs based on CpG sites chosen for their potential to proxy nearby SNPs can provide a powerful and computationally efficient approach to adjust for population stratification in DNA methylation studies when genome-wide SNP data are unavailable.

  4. Accounting for Population Stratification in DNA Methylation Studies

    PubMed Central

    Barfield, Richard T.; Almli, Lynn M.; Kilaru, Varun; Smith, Alicia K.; Mercer, Kristina B.; Duncan, Richard; Klengel, Torsten; Mehta, Divya; Binder, Elisabeth B.; Epstein, Michael P.; Ressler, Kerry J.; Conneely, Karen N.

    2014-01-01

    DNA methylation is an important epigenetic mechanism that has been linked to complex disease and is of great interest to researchers as a potential link between genome, environment, and disease. As the scale of DNA methylation association studies approaches that of genome-wide association studies (GWAS), issues such as population stratification will need to be addressed. It is well-documented that failure to adjust for population stratification can lead to false positives in genetic association studies, but population stratification is often unaccounted for in DNA methylation studies. Here, we propose several approaches to correct for population stratification using principal components from different subsets of genome-wide methylation data. We first illustrate the potential for confounding due to population stratification by demonstrating widespread associations between DNA methylation and race in 388 individuals (365 African American and 23 Caucasian). We subsequently evaluate the performance of our principal-components approaches and other methods in adjusting for confounding due to population stratification. Our simulations show that 1) all of the methods considered are effective at removing inflation due to population stratification, and 2) maximum power can be obtained with SNP-based principal components, followed by methylation-based principal components, which out-perform both surrogate variable analysis and genomic control. Among our different approaches to computing methylation-based principal components, we find that principal components based on CpG sites chosen for their potential to proxy nearby SNPs can provide a powerful and computationally efficient approach to adjustment for population stratification in DNA methylation studies when genome-wide SNP data are unavailable. PMID:24478250
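
    A hedged, self-contained sketch of the principal-components adjustment described above (simulated data, not the study's cohort; the variable names and effect sizes are assumptions): methylation-based PCs pick up the ancestry structure, and including them as covariates attenuates a spurious CpG-phenotype association.

```python
import numpy as np
import statsmodels.api as sm
from sklearn.decomposition import PCA

# Hypothetical methylation matrix: rows are individuals, columns are CpG sites.
rng = np.random.default_rng(4)
n, n_cpg = 300, 1000
ancestry = rng.binomial(1, 0.5, n)                       # unmeasured stratification
meth = rng.normal(0.5, 0.05, (n, n_cpg)) + 0.1 * ancestry[:, None]
phenotype = 1.0 * ancestry + rng.normal(size=n)          # depends on ancestry only

# Principal components of the methylation matrix capture the stratification.
pcs = PCA(n_components=5).fit_transform(meth - meth.mean(axis=0))

site = meth[:, 0]                                        # one CpG site to test
unadjusted = sm.OLS(phenotype, sm.add_constant(site)).fit()
adjusted = sm.OLS(phenotype, sm.add_constant(np.column_stack([site, pcs]))).fit()

print(f"unadjusted p-value for the CpG site:  {unadjusted.pvalues[1]:.3g}")
print(f"PC-adjusted p-value for the CpG site: {adjusted.pvalues[1]:.3g}")
```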

  5. Differential Recall Bias, Intermediate Confounding, and Mediation Analysis in Life Course Epidemiology: An Analytic Framework with Empirical Example.

    PubMed

    Sheikh, Mashhood A; Abelsen, Birgit; Olsen, Jan Abel

    2016-01-01

    The mechanisms by which childhood socioeconomic status (CSES) affects adult mental health, general health, and well-being are not clear. Moreover, the analytical assumptions employed when assessing mediation in social and psychiatric epidemiology are rarely explained. The aim of this paper was to explain the intermediate confounding assumption, and to quantify differential recall bias in the association between CSES, childhood abuse, and mental health (SCL-10), general health (EQ-5D), and subjective well-being (SWLS). Furthermore, we assessed the mediating role of psychological and physical abuse in the association between CSES and mental health, general health, and well-being; and the influence of differential recall bias in the estimation of total effects, direct effects, and proportion of mediated effects. The assumptions employed when assessing mediation are explained with reference to a causal diagram. Poisson regression models (relative risk, RR and 99% CIs) were used to assess the association between CSES and psychological and physical abuse in childhood. Mediation analysis (difference method) was used to assess the indirect effect of CSES (through psychological and physical abuse in childhood) on mental health, general health, and well-being. Exposure (CSES) was measured at two time points. Mediation was assessed with both cross-sectional and longitudinal data. Psychological abuse and physical abuse mediated the association between CSES and adult mental health, general health, and well-being (6-16% among men and 7-14% among women, p < 0.001). The results suggest that up to 27% of the association between CSES and childhood abuse, 23% of the association between childhood abuse and adult mental health, general health, and well-being, and 44% of the association between CSES and adult mental health, general health, and well-being are driven by differential recall bias. Assessing mediation with cross-sectional data (exposure, mediator, and outcome measured at the

  6. Differential Recall Bias, Intermediate Confounding, and Mediation Analysis in Life Course Epidemiology: An Analytic Framework with Empirical Example

    PubMed Central

    Sheikh, Mashhood A.; Abelsen, Birgit; Olsen, Jan Abel

    2016-01-01

    The mechanisms by which childhood socioeconomic status (CSES) affects adult mental health, general health, and well-being are not clear. Moreover, the analytical assumptions employed when assessing mediation in social and psychiatric epidemiology are rarely explained. The aim of this paper was to explain the intermediate confounding assumption, and to quantify differential recall bias in the association between CSES, childhood abuse, and mental health (SCL-10), general health (EQ-5D), and subjective well-being (SWLS). Furthermore, we assessed the mediating role of psychological and physical abuse in the association between CSES and mental health, general health, and well-being; and the influence of differential recall bias in the estimation of total effects, direct effects, and proportion of mediated effects. The assumptions employed when assessing mediation are explained with reference to a causal diagram. Poisson regression models (relative risk, RR and 99% CIs) were used to assess the association between CSES and psychological and physical abuse in childhood. Mediation analysis (difference method) was used to assess the indirect effect of CSES (through psychological and physical abuse in childhood) on mental health, general health, and well-being. Exposure (CSES) was measured at two time points. Mediation was assessed with both cross-sectional and longitudinal data. Psychological abuse and physical abuse mediated the association between CSES and adult mental health, general health, and well-being (6–16% among men and 7–14% among women, p < 0.001). The results suggest that up to 27% of the association between CSES and childhood abuse, 23% of the association between childhood abuse and adult mental health, general health, and well-being, and 44% of the association between CSES and adult mental health, general health, and well-being are driven by differential recall bias. Assessing mediation with cross-sectional data (exposure, mediator, and outcome measured at
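
    The difference method used above compares the exposure coefficient with and without the mediator in the model. A hedged sketch with simulated data is shown below; all variable names, prevalences, and effect sizes are assumptions, and the outcome is treated as a simple count rather than the study's scales.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(5)
n = 4000

# Hypothetical data: low childhood SES (binary exposure), childhood abuse
# (binary mediator), and a symptom score treated as a count outcome.
low_cses = rng.binomial(1, 0.4, n)
abuse = rng.binomial(1, 1 / (1 + np.exp(-(-1.5 + 0.8 * low_cses))))
symptoms = rng.poisson(np.exp(0.5 + 0.3 * low_cses + 0.6 * abuse))

# Difference method: compare the exposure coefficient with and without the
# mediator in the model (both on the log-relative-risk scale).
X_total = sm.add_constant(low_cses)
X_direct = sm.add_constant(np.column_stack([low_cses, abuse]))
total = sm.GLM(symptoms, X_total, family=sm.families.Poisson()).fit()
direct = sm.GLM(symptoms, X_direct, family=sm.families.Poisson()).fit()

b_total, b_direct = total.params[1], direct.params[1]
prop_mediated = (b_total - b_direct) / b_total
print(f"total log-RR {b_total:.2f}, direct log-RR {b_direct:.2f}, "
      f"proportion mediated {prop_mediated:.0%}")
```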

  7. Confounders in the assessment of the renal effects associated with low-level urinary cadmium: an analysis in industrial workers

    PubMed Central

    2011-01-01

    Background: Associations of proteinuria with low-level urinary cadmium (Cd) are currently interpreted as the sign of renal dysfunction induced by Cd. Few studies have considered the possibility that these associations might be non-causal and arise from confounding by factors influencing the renal excretion of Cd and proteins. Methods: We examined 184 healthy male workers (mean age, 39.5 years) from a zinc smelter (n = 132) or a blanket factory (n = 52). We measured the concentrations of Cd in blood (B-Cd) and the urinary excretion of Cd (U-Cd), retinol-binding protein (RBP), protein HC and albumin. Associations between biomarkers of metal exposure and urinary proteins were assessed by simple and multiple regression analyses. Results: The medians (interquartile range) of B-Cd (μg/l) and U-Cd (μg/g creatinine) were 0.80 (0.45-1.16) and 0.70 (0.40-1.3) in smelter workers and 0.66 (0.47-0.87) and 0.55 (0.40-0.90) in blanket factory workers, respectively. Occupation had no influence on these values, which varied mainly with smoking habits. In univariate analysis, concentrations of RBP and protein HC in urine were significantly correlated with both U-Cd and B-Cd, but these associations were substantially weakened by the adjustment for current smoking and the residual influence of diuresis after correction for urinary creatinine. Albumin in urine did not correlate with B-Cd but was consistently associated with U-Cd through a relationship that was unaffected by smoking or diuresis. Further analyses showed that RBP and albumin in urine mutually distort their associations with U-Cd and that the relationship between RBP and Cd in urine was almost a replicate of that linking RBP to albumin. Conclusions: Associations between proteinuria and low-level urinary Cd should be interpreted with caution as they appear to be largely driven by diuresis, current smoking and probably also the co-excretion of Cd with plasma proteins. PMID:21569589

  8. Accountability,

    DTIC Science & Technology

    1987-05-01

    The trouble in Vietnam was that our forces were used to fight a war that we could never win. I was assigned to Udorn Royal Thai Air Ba...Reichle, NCOIC of the ... Force Chaplain School, told me that "I am accountable for my actions on and off duty. There's a price to pay." ...

  9. OPERATIONS RESEARCH AND THE ACCOUNTANT,

    DTIC Science & Technology

    There has been a certain amount of scepticism and even apprehension on the part of accountants with respect to the sudden appearance of scientists in...sharpened form of common sense. As an illustration of the use of mathematics, a cost accounting problem is discussed and it is shown that a statistical...definition of ’overhead’ can lead to simplified pricing methods and management controls. Also, some of the confusing aspects of ’overhead’ accounting can be avoided by using mathematical techniques. (Author)

  10. A User-Oriented Focus to Evaluating Accountants' Writing Skills.

    ERIC Educational Resources Information Center

    Catanach, Anthony H., Jr.; Golen, Steven

    1996-01-01

    Briefly reviews research on accountants' writing skills, discusses the subject bias prevalent in that research, and relates this weakness to a potential bias toward grammar and syntax in curriculum development. Argues that the user of accounting information is an important but neglected party in the communication process, whose input should be…

  11. Implications of NCLB Accountability for Comprehensive School Reform

    ERIC Educational Resources Information Center

    Le Floch, Kerstin Carlson; Taylor, James E.; Thomsen, Kerri

    2006-01-01

    No Child Left Behind (NCLB) accountability mechanisms have the potential to derail comprehensive school reform (CSR) implementation. For those pursuing CSR, the question is how to reconcile the implementation of NCLB accountability mandates with ongoing CSR efforts. Drawing from longitudinal data from a national study of CSR, this article explores…

  12. Estimating the monetary value of willingness to pay for E-book reader's attributes using partially confounded factorial conjoint choice experiment

    NASA Astrophysics Data System (ADS)

    Yong, Chin-Khian

    2013-09-01

    A partially confounded factorial conjoint choice experiment design was used to examine the monetary value of the willingness to pay for E-book reader attributes. Conjoint analysis is an efficient, cost-effective, and widely used quantitative method in marketing research for understanding consumer preferences and value trade-offs. Value can be interpreted by the customer or consumer as the multiple benefits received for the price paid. The monetary values of willingness to pay for battery life, internal memory, external memory, screen size, text-to-speech, touch screen, and handwriting-to-digital-text conversion of an E-book reader were estimated in this study. Due to the significant interaction effects of the attributes with price, the monetary values for the seven attributes were found to differ at different values of the odds of purchasing versus not purchasing. These significant interaction effects were one of the main contributions of the partially confounded factorial conjoint choice experiment.
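
    In choice-based conjoint models of this kind, a willingness-to-pay figure is commonly derived as the ratio of an attribute's part-worth to the (negative) price coefficient. The sketch below uses made-up coefficients purely to show that arithmetic; it does not reproduce the paper's partially confounded design or its finding that WTP varies with the odds of purchasing.

```python
# Hypothetical part-worth utilities from a conjoint choice (logit) model.
beta_price = -0.012                      # utility per unit of price (assumed)
beta_attr = {
    "battery_life": 0.45,
    "touch_screen": 0.30,
    "text_to_speech": 0.18,
}

# Willingness to pay for an attribute = marginal utility of the attribute
# divided by the marginal utility of money (the negative price coefficient).
for attr, b in beta_attr.items():
    wtp = -b / beta_price
    print(f"{attr}: willingness to pay ~ {wtp:.0f} currency units")
```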

  13. 77 FR 202 - Federal Acquisition Regulation; Updated Financial Accounting Standards Board Accounting References

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-01-03

    ... 9000-AM00 Federal Acquisition Regulation; Updated Financial Accounting Standards Board Accounting... accounting standards owing to the Financial Accounting Standards Board's Accounting Standards Codification of Generally Accepted Accounting Principles. DATES: Effective Date: February 2, 2012. FOR FURTHER...

  14. Counting, accounting, and accountability: Helen Verran's relational empiricism.

    PubMed

    Kenney, Martha

    2015-10-01

    Helen Verran uses the term 'relational empiricism' to describe situated empirical inquiry that is attentive to the relations that constitute its objects of study, including the investigator's own practices. Relational empiricism draws on and reconfigures Science and Technology Studies' traditional concerns with reflexivity and relationality, casting empirical inquiry as an important and non-innocent world-making practice. Through a reading of Verran's postcolonial projects in Nigeria and Australia, this article develops a concept of empirical and political 'accountability' to complement her relational empiricism. In Science and an African Logic, Verran provides accounts of the relations that materialize her empirical objects. These accounts work to decompose her original objects, generating new objects that are more promising for the specific postcolonial contexts of her work. The process of decomposition is part of remaining accountable for her research methods and accountable to the worlds she is working in and writing about. This is a practice of narrating relations and learning to tell better technoscientific stories. What counts as better, however, is not given, but is always contextual and at stake. In this way, Verran acts not as participant-observer, but as participant-storyteller, telling stories to facilitate epistemic flourishing within and as part of a historically located community of practice. The understanding of accountability that emerges from this discussion is designed as a contribution, both practical and evocative, to the theoretical toolkit of Science and Technology Studies scholars who are interested in thinking concretely about how we can be more accountable to the worlds we study.

  15. Accounting Internships: A Practical Framework.

    ERIC Educational Resources Information Center

    Henry, Linvol G.; And Others

    1988-01-01

    The authors describe a framework for administration and implementation of postsecondary internships in accounting. Topics covered include (1) qualifications, (2) duration, (3) timing, (4) granting credit and providing grades, and (5) evaluation criteria. Implementation guidelines are included. (CH)

  16. Speaking the Language of Accounting.

    ERIC Educational Resources Information Center

    Fletcher, Leslie B.

    1997-01-01

    Round Robin is a game in which students must express accounting information in their own words. It is a means of familiarizing students with the language of vocabulary and of developing their verbal expressiveness. (SK)

  17. Is the relation between ozone and mortality confounded by chemical components of particulate matter? Analysis of 7 components in 57 US communities.

    PubMed

    Anderson, G Brooke; Krall, Jenna R; Peng, Roger D; Bell, Michelle L

    2012-10-15

    Epidemiologic studies have linked tropospheric ozone pollution and human mortality. Although research has shown that this relation is not confounded by particulate matter when measured by mass, little scientific evidence exists on whether confounding exists by chemical components of the particle mixture. Using mortality and particulate matter with aerodynamic diameter ≤2.5 µm (PM(2.5)) component data from 57 US communities (2000-2005), the authors investigate whether the ozone-mortality relation is confounded by 7 components of PM(2.5): sulfate, nitrate, silicon, elemental carbon, organic carbon matter, sodium ion, and ammonium. Together, these components constitute most PM(2.5) mass in the United States. Estimates of the effect of ozone on mortality were almost identical before and after controlling for the 7 components of PM(2.5) considered (mortality increase/10-ppb ozone increase, before and after controlling: ammonium, 0.34% vs. 0.35%; elemental carbon, 0.36% vs. 0.37%; nitrate, 0.27% vs. 0.26%; organic carbon matter, 0.34% vs. 0.31%; silicon, 0.36% vs. 0.37%; sodium ion, 0.21% vs. 0.18%; and sulfate, 0.35% vs. 0.38%). Additionally, correlations were weak between ozone and each particulate component across all communities. Previous research found that the ozone-mortality relation is not confounded by particulate matter measured by mass; this national study indicates that the relation is also robust to control for specific components of PM(2.5).
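
    A hedged, stripped-down sketch of the co-pollutant comparison described above (simulated single-community data; real time-series analyses also control for season, weather, and day of week): fit the ozone-mortality Poisson model with and without one PM2.5 component and compare the ozone coefficient.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(6)
days = 1000

# Hypothetical daily series: ozone (ppb), one PM2.5 component (e.g., sulfate),
# and daily death counts for a single community.
ozone = 40 + 10 * np.sin(np.arange(days) * 2 * np.pi / 365) + rng.normal(0, 5, days)
sulfate = 3 + 0.02 * (ozone - 40) + rng.normal(0, 1, days)   # weakly correlated
deaths = rng.poisson(np.exp(3.0 + 0.0004 * ozone + 0.01 * sulfate))

X_single = sm.add_constant(ozone)
X_copollutant = sm.add_constant(np.column_stack([ozone, sulfate]))
m_single = sm.GLM(deaths, X_single, family=sm.families.Poisson()).fit()
m_co = sm.GLM(deaths, X_copollutant, family=sm.families.Poisson()).fit()

def pct_per_10ppb(beta):
    """Percent increase in mortality per 10-ppb increase in ozone."""
    return 100 * (np.exp(10 * beta) - 1)

# Near-identical estimates suggest little confounding by this component.
print(f"without sulfate: {pct_per_10ppb(m_single.params[1]):.2f}%")
print(f"with sulfate:    {pct_per_10ppb(m_co.params[1]):.2f}%")
```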

  18. Shared spatial effects on quantitative genetic parameters: accounting for spatial autocorrelation and home range overlap reduces estimates of heritability in wild red deer.

    PubMed

    Stopher, Katie V; Walling, Craig A; Morris, Alison; Guinness, Fiona E; Clutton-Brock, Tim H; Pemberton, Josephine M; Nussey, Daniel H

    2012-08-01

    Social structure, limited dispersal, and spatial heterogeneity in resources are ubiquitous in wild vertebrate populations. As a result, relatives share environments as well as genes, and environmental and genetic sources of similarity between individuals are potentially confounded. Quantitative genetic studies in the wild therefore typically account for easily captured shared environmental effects (e.g., parent, nest, or region). Fine-scale spatial effects are likely to be just as important in wild vertebrates, but have been largely ignored. We used data from wild red deer to build "animal models" to estimate additive genetic variance and heritability in four female traits (spring and rut home range size, offspring birth weight, and lifetime breeding success). We then, separately, incorporated spatial autocorrelation and a matrix of home range overlap into these models to estimate the effect of location or shared habitat on phenotypic variation. These terms explained a substantial amount of variation in all traits, and their inclusion reduced heritability estimates by up to an order of magnitude for home range size. Our results highlight the potential of multiple covariance matrices to dissect environmental, social, and genetic contributions to phenotypic variation, and the importance of considering fine-scale spatial processes in quantitative genetic studies.

  19. Annual survival estimation of migratory songbirds confounded by incomplete breeding site-fidelity: Study designs that may help

    USGS Publications Warehouse

    Marshall, M.R.; Diefenbach, D.R.; Wood, L.A.; Cooper, R.J.

    2004-01-01

    Many species of bird exhibit varying degrees of site-fidelity to the previous year's territory or breeding area, a phenomenon we refer to as incomplete breeding site-fidelity. If the territory a bird occupies is located beyond the bounds of the study area or search area (i.e., it has emigrated from the study area), the bird will go undetected and is therefore indistinguishable from dead individuals in capture-mark-recapture studies. Differential emigration rates confound inferences regarding differences in survival between sexes and among species if apparent survival rates are used as estimates of true survival. Moreover, the bias introduced by using apparent survival rates for true survival rates can have profound effects on the predictions of population persistence through time, source/sink dynamics, and other aspects of life-history theory. We investigated four study design and analysis approaches that result in apparent survival estimates that are closer to true survival estimates. Our motivation for this research stemmed from a multi-year capture-recapture study of Prothonotary Warblers (Protonotaria citrea) on multiple study plots within a larger landscape of suitable breeding habitat where substantial inter-annual movements of marked individuals among neighboring study plots were documented. We wished to quantify the effects of this type of movement on annual survival estimation. The first two study designs we investigated involved marking birds in a core area and resighting them in the core as well as an area surrounding the core. For the first of these two designs, we demonstrated that as the resighting area surrounding the core gets progressively larger, and more "emigrants" are resighted, apparent survival estimates begin to approximate true survival rates (bias < 0.01). However, given observed inter-annual movements of birds, it is likely to be logistically impractical to resight birds on sufficiently large surrounding areas to minimize bias. Therefore
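
    The sketch below illustrates the basic bias mechanism with toy numbers: apparent survival is roughly true survival multiplied by the probability that a surviving bird settles within the resighting area, so enlarging the resighted buffer shrinks the bias. The dispersal distribution and all values are invented for illustration and are not the warbler estimates from this study.

```python
# Hypothetical sketch: how apparent survival from mark-recapture underestimates
# true survival when surviving birds shift territories beyond the resighting area.
import numpy as np

true_survival = 0.60
mean_shift_m = 300.0                              # assumed mean inter-annual territory shift
resight_radii_m = np.array([100, 250, 500, 1000, 2000])

# Probability a surviving bird settles within the resighted buffer (exponential dispersal).
p_stay = 1 - np.exp(-resight_radii_m / mean_shift_m)
apparent_survival = true_survival * p_stay
bias = true_survival - apparent_survival

for r, phi, b in zip(resight_radii_m, apparent_survival, bias):
    print(f"resight buffer {r:5.0f} m: apparent survival {phi:.2f}, bias {b:.2f}")
```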

  20. Accounting and accountability: observations on the AHERF settlements.

    PubMed

    Maco, P S; Weinstein, S J

    2000-10-01

    Recent enforcement proceedings involving health care and accounting--relating primarily to the Allegheny Health, Education and Research Foundation (AHERF)--have sparked renewed interest in the activities of the U.S. Securities and Exchange Commission in the municipal securities market. Officials and accountants who are working for public-sector issuers in the healthcare industry have responsibilities under the Federal securities laws. Other issues of relevance include disclosure in the secondary market as well as upon initial issuance, and the significance of antifraud actions in other areas.

  1. Accounting and Accountability for Distributed and Grid Systems

    NASA Technical Reports Server (NTRS)

    Thigpen, William; McGinnis, Laura F.; Hacker, Thomas J.

    2001-01-01

    While the advent of distributed and grid computing systems will open new opportunities for scientific exploration, the reality of such implementations could prove to be a system administrator's nightmare. A lot of effort is being spent on identifying and resolving the obvious problems of security, scheduling, authentication and authorization. Lurking in the background, though, are the largely unaddressed issues of accountability and usage accounting: (1) mapping resource usage to resource users; (2) defining usage economies or methods for resource exchange; (3) describing implementation standards that minimize and compartmentalize the tasks required for a site to participate in a grid.

  2. MASS: An automated accountability system

    SciTech Connect

    Erkkila, B.H.; Kelso, F.

    1994-08-01

    All Department of Energy contractors who manage accountable quantities of nuclear materials are required to implement an accountability system that tracks and records the activities associated with those materials. At Los Alamos, the automated accountability system allows data entry on computer terminals and database updating as soon as the entry is made. It is also able to generate all required reports in a timely fashion. Over the last several years, the hardware and software have been upgraded to provide the users with all the capability needed to manage a large variety of operations with a wide variety of nuclear materials. Enhancements to the system are implemented as the needs of the users are identified. The system has grown with the expanded needs of the users and has survived several years of changing operations and activity. The user community served by this system includes processing, materials control and accountability, and nuclear material management personnel. In addition to serving the local users, the accountability system supports the national database (NMMSS). This paper contains a discussion of several details of the system design and operation. After several years of successful operation, this system provides an operating example of how computer systems can be used to manage a very dynamic data management problem.

  3. Trade-based carbon sequestration accounting.

    PubMed

    King, Dennis M

    2004-04-01

    This article describes and illustrates an accounting method to assess and compare "early" carbon sequestration investments and trades on the basis of the number of standardized CO2 emission offset credits they will provide. The "gold standard" for such credits is assumed to be a relatively riskless credit based on a CO2 emission reduction that provides offsets against CO2 emissions on a one-for-one basis. The number of credits associated with carbon sequestration needs to account for time, risk, durability, permanence, additionality, and other factors that future trade regulators will most certainly use to assign "official" credits to sequestration projects. The method that is presented here uses established principles of natural resource accounting and conventional rules of asset valuation to "score" projects. A review of 20 "early" voluntary United States based CO2 offset trades that involve carbon sequestration reveals that the assumptions that buyers, sellers, brokers, and traders are using to characterize the economic potential of their investments and trades vary enormously. The article develops a "universal carbon sequestration credit scoring equation" and uses two of these trades to illustrate the sensitivity of trade outcomes to various assumptions about how future trade auditors are likely to "score" carbon sequestration projects in terms of their "equivalency" with CO2 emission reductions. The article emphasizes the importance of using a standard credit scoring method that accounts for time and risk to assess and compare even unofficial prototype carbon sequestration trades. The scoring method illustrated in this article is a tool that can protect the integrity of carbon sequestration credit trading and can assist buyers and sellers in evaluating the real economic potential of prospective trades.
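
    A hedged sketch of the general idea of discounting sequestration credits for time and risk; the function name, discount rate, annual failure risk, and leakage fraction below are invented for illustration and do not reproduce the article's scoring equation.

```python
# Hypothetical illustration: scoring a carbon sequestration trade against a
# "gold standard" emission-reduction credit by discounting for time and risk.

def sequestration_credits(tons_per_year, years, discount_rate,
                          failure_risk_per_year, leakage_fraction):
    """Present-value CO2-offset credits, risk-adjusted and net of leakage."""
    credits = 0.0
    survival = 1.0
    for t in range(1, years + 1):
        survival *= (1 - failure_risk_per_year)              # project still intact in year t
        effective_tons = tons_per_year * (1 - leakage_fraction) * survival
        credits += effective_tons / (1 + discount_rate) ** t  # discount to present value
    return credits

# A 20-year forestry project sequestering 1,000 t CO2/yr (all parameters assumed):
print(sequestration_credits(1000, 20, discount_rate=0.05,
                            failure_risk_per_year=0.02, leakage_fraction=0.10))
```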

  4. Good Accounting Skills: What More Does a Successful Accountant Need?

    ERIC Educational Resources Information Center

    Park, Leslie Jane

    1994-01-01

    Most of 189 accounting students surveyed (77% of whom were nonnative speakers of English) were not in favor of adding communication skills courses, although they recognized their importance in hiring. The area most needing improvement for nonnative speakers was vocabulary; for native speakers, it was speaking and spelling. All preferred maintaining their…

  5. Media Accounts of School Performance: Reinforcing Dominant Practices of Accountability

    ERIC Educational Resources Information Center

    Baroutsis, Aspa

    2016-01-01

    Media reportage often acts as an interpretation of accountability policies, thereby making the news media a part of the policy enactment process. Within such a process, their role is that of policy reinforcement rather than policy construction or contestation. This paper draws on the experiences of school leaders in regional Queensland, Australia, and…

  6. Accountability in Arab Bedouin Schools in Israel: Accountable to Whom?

    ERIC Educational Resources Information Center

    Mizel, Omar

    2009-01-01

    "School-based management" (SBM) rose to become a prominent trend in educational reform in Western countries during the last few decades of the 20th century and has likewise been introduced into a number of Asian and African nations. A key component of SBM is the increase of internal accountability within the school with the aim of…

  7. Incorporating Calculators into the Accounting Curriculum. Accounting II.

    ERIC Educational Resources Information Center

    Clayton, John

    This document is a guide to aid teachers in incorporating the use of calculators in the high school Accounting II curriculum. The guide contains 16 learning modules. Each module consists of an introductory explanation, student performance objectives, content of the module, and teaching suggestions for using calculators in each application of…

  8. Interpreting the temperature of water at cold springs and the importance of gravitational potential energy

    NASA Astrophysics Data System (ADS)

    Manga, Michael; Kirchner, James W.

    2004-05-01

    Circulating groundwater transports heat. If groundwater flow velocities are sufficiently high, most of the subsurface heat transport can occur by advection. This is the case, for example, in the Cascades volcanic arc where much of the background geothermal heat is transported advectively and then discharged when the groundwater emerges at springs. The temperature of spring water can thus be used to infer the geothermal heat flux. If spring water temperature is many degrees warmer than the ambient temperature, as it is at hot springs, determining the heat discharged at springs is straightforward. At large-volume cold springs, however, the geothermal warming of water is small because the added heat is diluted in a large volume of water. We show that in order to interpret the temperature of cold springs we must account for three processes: (1) conversion of gravitational potential energy to heat through viscous dissipation, (2) conduction of heat to or from the Earth's surface, and (3) geothermal warming. Using spring temperature data from the central Oregon Cascades and Mount Shasta, California, we show that the warming due to surface heat exchange and dissipation of gravitational potential energy can be comparable to that due to geothermal heating. Unless these confounding sources of heating are taken into account, estimates of geothermal heat flux derived from temperatures of cold springs can be incorrect by large factors.
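
    A minimal worked example of item (1): the temperature rise from viscous dissipation of gravitational potential energy is dT = g*dh/c_p, so water descending about a kilometre warms by roughly 2 degrees. The elevation drops below are illustrative and are not the Oregon Cascades or Mount Shasta values.

```python
# Worked example (illustrative values): warming of spring water from viscous
# dissipation of gravitational potential energy, dT = g * dh / c_p.
g = 9.81          # m/s^2, gravitational acceleration
c_p = 4186.0      # J/(kg K), specific heat of water

for drop_m in (500, 1000, 2000):
    dT = g * drop_m / c_p
    print(f"{drop_m:5d} m elevation drop -> {dT:.2f} K of warming")
```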

  9. A Liberal Account of Addiction

    PubMed Central

    Foddy, Bennett; Savulescu, Julian

    2014-01-01

    Philosophers and psychologists have been attracted to two differing accounts of addictive motivation. In this paper, we investigate these two accounts and challenge their mutual claim that addictions compromise a person's self-control. First, we identify some incompatibilities between this claim of reduced self-control and the available evidence from various disciplines. A critical assessment of the evidence weakens the empirical argument for reduced autonomy. Second, we identify sources of unwarranted normative bias in the popular theories of addiction that introduce systematic errors in interpreting the evidence. By eliminating these errors, we are able to generate a minimal but correct account of addiction that presumes addicts to be autonomous in their addictive behavior, absent further evidence to the contrary. Finally, we explore some of the implications of this minimal, correct view. PMID:24659901

  10. Managerial accounting applications in radiology.

    PubMed

    Lexa, Frank James; Mehta, Tushar; Seidmann, Abraham

    2005-03-01

    We review the core issues in managerial accounting for radiologists. We introduce the topic and then explore its application to diagnostic imaging. We define key terms such as fixed cost, variable cost, marginal cost, and marginal revenue and discuss their role in understanding the operational and financial implications for a radiology facility by using a cost-volume-profit model. Our work places particular emphasis on the role of managerial accounting in understanding service costs, as well as how it assists executive decision making.
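
    A small sketch of the cost-volume-profit relationships the review defines, with invented numbers for a hypothetical imaging facility; the break-even volume is the fixed cost divided by the contribution margin (price minus variable cost per study).

```python
# Illustrative cost-volume-profit calculation (all figures assumed).
fixed_cost = 500_000.0       # annual fixed costs ($): equipment lease, salaries
variable_cost = 40.0         # variable (marginal) cost per study ($)
price = 140.0                # marginal revenue per study ($)

break_even_volume = fixed_cost / (price - variable_cost)   # studies per year
print(f"break-even volume: {break_even_volume:.0f} studies/year")

def profit(volume):
    """Operating profit at a given annual study volume."""
    return volume * (price - variable_cost) - fixed_cost

for v in (4000, 5000, 6000):
    print(f"volume {v}: profit ${profit(v):,.0f}")
```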

  11. Academic Accountability and State Intervention.

    ERIC Educational Resources Information Center

    Duncan, John W.

    This speech discusses the emerging national trend toward state intervention in local educational processes as part of the academic accountability movement. It provides examples of reforms and improvements in which state intervention fostered local school improvement. The address focuses on state intervention in Jersey City, New Jersey; predicts…

  12. VOE Accounting: Scope and Sequence.

    ERIC Educational Resources Information Center

    Nashville - Davidson County Metropolitan Public Schools, TN.

    This guide, which was written as an initial step in the development of a systemwide articulated curriculum sequence for all vocational programs within the Metropolitan Nashville Public School System, outlines the suggested scope and sequence of a 2-year program in accounting. The guide consists of a course description; general course objectives;…

  13. Career Expectations of Accounting Students

    ERIC Educational Resources Information Center

    Elam, Dennis; Mendez, Francis

    2010-01-01

    The demographic make-up of accounting students is dramatically changing. This study sets out to measure how well the profession is ready to accommodate what may be very different needs and expectations of this new generation of students. Non-traditional students are becoming more and more of a tradition in the current college classroom.…

  14. Kant's Account of Moral Education

    ERIC Educational Resources Information Center

    Giesinger, Johannes

    2012-01-01

    While Kant's pedagogical lectures present an account of moral education, his theory of freedom and morality seems to leave no room for the possibility of an education for freedom and morality. In this paper, it is first shown that Kant's moral philosophy and his educational philosophy are developed within different theoretical paradigms: whereas…

  15. Accountability: Stepping Stones to Success

    ERIC Educational Resources Information Center

    Loy, Darcy

    2010-01-01

    Lack of accountability is a leading topic in today's workforce. It costs corporate America billions of dollars each year and has a financial impact on educational institutions as well. From employee theft to poor production and inefficiency, it is a serious problem. Facilities leaders need to take ownership and strive to implement…

  16. Aqueous Processing Material Accountability Instrumentation

    SciTech Connect

    Robert Bean

    2007-09-01

    Increased use of nuclear power will require new facilities. The U.S. has not built a new spent nuclear fuel reprocessing facility for decades. Reprocessing facilities must maintain accountability of their nuclear fuel. This report surveys the techniques used in current aqueous reprocessing facilities and provides references to source materials to assist facility design efforts.

  17. 77 FR 40253 - Reserve Account

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-09

    ... Rural Housing Service 7 CFR Part 3560 RIN 0575-AC66 Reserve Account AGENCY: Rural Housing Service, USDA. ACTION: Final rule. SUMMARY: Through this action, the Rural Housing Service (RHS) is amending its... Preservation and Direct Loan Division, Rural Housing Service, U.S. Department of Agriculture, STOP 0781,...

  18. Charter Schools and Democratic Accountability

    ERIC Educational Resources Information Center

    Henig, Jeffrey R.

    2017-01-01

    In this article, Jeffrey R. Henig states that there is no strong accountability at charter schools without the strong oversight of public officials. When charter schooling first erupted on the scene, policymakers and citizens had little choice but to base their reactions on theory, ideology, or hunch. However, twenty-five years in, there is still…

  19. Converting accounts receivable into cash.

    PubMed

    Folk, M D; Roest, P R

    1995-09-01

    In recent years, increasing numbers of healthcare providers have converted their accounts receivable into cash through a process called securitization. This practice has gained popularity because it provides a means of raising capital needed by healthcare organizations. Although securitization transactions can be complex, they may provide increased financial flexibility to providers as they prepare for continuing change in the healthcare industry.

  20. JSC interactive basic accounting system

    NASA Technical Reports Server (NTRS)

    Spitzer, J. F.

    1978-01-01

    Design concepts for an interactive basic accounting system (IBAS) are considered in terms of selecting the design option which provides the best response at the lowest cost. Modeling the IBAS workload and applying this workload to a U1108 EXEC 8 based system using both a simulation model and the real system is discussed.

  1. Accountable Professional Practice in ELT

    ERIC Educational Resources Information Center

    Farmer, Frank

    2006-01-01

    Professionalism is widely thought to be desirable in ELT, and at the same time institutions are taking seriously the need to evaluate their teachers. This article presents a general approach to professionalism focused on the accountability of the professional to the client based on TESOL's (2000) classification of adult ELT within eight general…

  2. Accountability Is a Calculated Effort

    ERIC Educational Resources Information Center

    Vekich, Michael; Coborn, Daniel

    2004-01-01

    As part-time volunteer members who have few direct operational duties, board members constantly are bombarded with information on matters ranging from strategic plans to operating budgets to tuition rates to parking permits. In the end, it is they who are accountable for all activities that occur in their institutions and systems. The need for…

  3. Accountability in Adult Farmer Education

    ERIC Educational Resources Information Center

    Callanan, Paul J.; Jackson, Dennis L.

    1978-01-01

    Two instructors write about some ideas they have implemented in their farm management program to help measure accountability. They describe the Minnesota Farm Business Analysis, use of the analysis summary book, income tax management, and budgeting and cash flow planning. (MF)

  4. Process Accountability in Curriculum Development.

    ERIC Educational Resources Information Center

    Gooler, Dennis D.; Grotelueschen, Arden

    This paper urges the curriculum developer to assume the accountability for his decisions necessitated by the actual ways our society functions. The curriculum developer is encouraged to recognize that he is a salesman with a commodity (the curriculum). He is urged to realize that if he cannot market the package to the customers (the various…

  5. Accountability Issues in School Violence.

    ERIC Educational Resources Information Center

    Al-Bataineh, Adel T.

    This paper examines various reasons that would account for school violence and considers ways educators can help eliminate violence from schools. The negative impact of violence in the media and easy access to guns are mentioned as probable causes of violence in youth. Students who do not feel part of the school community often resort to violence…

  6. Accountability--A Historical Perspective.

    ERIC Educational Resources Information Center

    Keefover, Karen Shade

    1983-01-01

    Asserts that existing accountability policies assume that a single behaviorist theory is the one best system for effective education. Examines the pitfalls of the one-system approach through the examples of John Stuart Mill's utilitarian upbringing and "Gradgrindism" in Charles Dickens' novel "Hard Times." (SK)

  7. Accounting. Occupational Competency Analysis Profile.

    ERIC Educational Resources Information Center

    Ohio State Univ., Columbus. Vocational Instructional Materials Lab.

    This Accounting Occupational Competency Analysis Profile (OCAP) is one of a series of competency lists, verified by expert workers, that have evolved from a modified DACUM (Developing a Curriculum) job analysis process involving business, industry, labor, and community agency representatives from throughout Ohio. This OCAP identifies the…

  8. Cost Accounting for Decision Makers.

    ERIC Educational Resources Information Center

    Kaneklides, Ann L.

    1985-01-01

    Underscores the importance of informed decision making through accurate anticipation of cost incurrence in light of changing economic and environmental conditions. Explains the concepts of cost accounting, full allocation of costs, the selection of an allocation base, the allocation of indirect costs, depreciation, and implications for community…

  9. Accountability: A New Disneyland Fantasy

    ERIC Educational Resources Information Center

    Bundy, Robert F.

    1974-01-01

    Parents, professional educators, boards of education, legislators, and the general public are justifiably questioning the monies spent on education, school efficiency, what schools are actually accomplishing, and who controls the results of schooling. However, accountability, as envisioned by its major supporters, will address none of these…

  10. Library Labor Cost Accounting System.

    ERIC Educational Resources Information Center

    Du Bois, Dan

    The Library Labor Cost Accounting System will provide visibility on current costs of manually processing library materials, at each campus as well as system-wide. The scope of the study includes the following: (1) 100 individual activities, grouped into 14 functional areas, e.g., Ordering, Receiving; and into 3 major operations: Acquisitions,…

  11. Accounting Principles and Financial Statements.

    ERIC Educational Resources Information Center

    Robinson, Daniel D.

    1973-01-01

    This document presents the background and analysis of the American Institute of Certified Public Accountants (AICPA) guide to auditing colleges and universities. Highlights include the approval of the market value option, the treatment of endowment gains, debt services as transfers, the decisions on pledges, the use of financial statements, the…

  12. Fraud Education for Accounting Students.

    ERIC Educational Resources Information Center

    Peterson, Bonita K.

    2003-01-01

    Reports that limited fraud education takes place in accounting due to a crowded curriculum and misunderstanding of the extent of fraud. Suggests ways to develop content on the topic and provides a list of teaching materials (textbooks, workbooks, trade books, case materials, videos, and reference materials). (Contains 16 references.) (SK)

  13. Educational Accountability and Policy Feedback

    ERIC Educational Resources Information Center

    McDonnell, Lorraine M.

    2013-01-01

    Over the past 30 years, accountability policies have become more prominent in public K-12 education and have changed how teaching and learning are organized. It is less clear to what extent these policies have altered the politics of education. This article begins to address that question through the lens of policy feedback. It identifies…

  14. Improving School Accountability in California

    ERIC Educational Resources Information Center

    Larsen, S. Eric; Lipscomb, Stephen; Jaquet, Karina

    2011-01-01

    Federal education policy will soon undergo a major revision, with significant consequences for the state's own policy and practices. This report seeks to help federal and state policymakers consider this restructuring and one of its core questions: How should schools and school districts be held accountable for the academic progress of their…

  15. 76 FR 8989 - Federal Acquisition Regulation; Updated Financial Accounting Standards Board Accounting References

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-16

    ... 9000-AM00 Federal Acquisition Regulation; Updated Financial Accounting Standards Board Accounting... to amend the Federal Acquisition Regulation (FAR) to update references to authoritative accounting standards owing to the Financial Accounting Standards Board's (FASB's) Accounting Standards...

  16. The Aristotelian account of "heart and veins".

    PubMed

    Shoja, Mohammadali M; Tubbs, R Shane; Loukas, Marios; Ardalan, Mohammad R

    2008-04-25

    The exploration of the cardiovascular (CV) system has a history of at least five millennia. The model of the heart and veins represented by Aristotle (384-322 B.C.) is one of the earliest accurate descriptions of the CV system. With his own specific metaphysical approach, Aristotle discussed why there might be a vascular tree composed of two vessels and also why these vessels must extend throughout the entire body. Herein, the authors present a history of the original account of the CV system based on the studies and teachings of Aristotle, who made detailed observations and experimented upon animals and human corpses to explore the anatomy of the heart and vessels and thus provided the basis for modern CV medicine. The Aristotelian CV model consisted of two related but slightly dissimilar passages based on experimentation and tradition, which could be perceived as the morphological and metaphysical accounts of physiology, respectively. Restricted by his own methodology of dissecting dead animals, Aristotle was the first to describe the anatomy of the heart and blood vessels. A thorough reading of his Historia Animalium showed that he was able to morphologically delineate the right atrium in addition to three distinct heart cavities corresponding to the left atrium and right and left ventricles. The authors conclude that when interpreting Aristotelian doctrine, the methodology and terminology should be taken into account in order to prevent potential misconceptions. It is the early work of such scientists as Aristotle on which we base our current understanding of the CV system.

  17. Development of histopathological indices in the digestive gland and gonad of mussels: integration with contamination levels and effects of confounding factors.

    PubMed

    Cuevas, Nagore; Zorita, Izaskun; Costa, Pedro M; Franco, Javier; Larreta, Joana

    2015-05-01

    Bivalve histopathology has become an important tool in aquatic toxicology, having been implemented in many biomonitoring programmes worldwide. However, there are various gaps in the knowledge of many sentinel organisms and the interference of confounding factors. This work aimed (i) to develop a detailed semi-quantitative histopathological index of the digestive gland and gonad of the Mytilus galloprovincialis mussel collected from five sites contaminated with distinct patterns of organic and inorganic toxicants along the Basque coast (SE Bay of Biscay) and (ii) to investigate whether seasonal variability and parasitosis act as confounding factors. A total of twenty-three histopathological alterations were analysed in the digestive gland and gonad following a weighted condition index approach. The alterations were integrated into a single value for a better understanding of the mussels' health status. The digestive gland was consistently more damaged than the gonad. Mussels from the most impacted sites endured the most significant deleterious effects, showing inflammation-related alterations together with digestive tubule atrophy and necrosis. Neoplastic diseases were scarce, with only a few cases of fibromas (benign neoplasia). In contrast, in moderately or little impacted sites, contamination levels did not cause significant tissue damage. However, parasites contributed to overestimating the values of histopathological indices (i.e. more severe tissue damage) in mussels from little impacted sites, whilst the opposite occurred in mussels from highly polluted sites. Accordingly, inter-site differences were more pronounced in autumn, when natural physiological responses of advanced maturation stages did not interfere in the histological response. In conclusion, although seasonal variability and parasitosis mask the response of histopathological indices, this biomonitoring approach may provide good sensitivity for assessing the health status of mussels if fluctuations
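
    A hedged sketch of a weighted, semi-quantitative index of the general kind described: each alteration's severity score is multiplied by an importance weight and the products are summed into a single organ-level value. The alteration names, weights, and scores below are invented and do not reproduce the index developed in this study.

```python
# Hypothetical weighted histopathological condition index for one organ.
alterations = {
    # name: (importance weight 1-3, observed severity score 0-6) -- assumed values
    "digestive tubule atrophy": (1, 4),
    "haemocytic infiltration":  (2, 2),
    "necrosis":                 (3, 2),
    "fibroma":                  (2, 0),
}

# Sum of weight * score over all alterations gives the organ-level index.
index = sum(weight * score for weight, score in alterations.values())
print(f"digestive gland index: {index}")
```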

  18. Resolving Confounding Enrichment Kinetics Due to Overlapping Resonance Signals From 13C-Enriched Long Chain Fatty Acid Oxidation and Uptake Within Intact Hearts

    PubMed Central

    O'Donnell, J. Michael; Fasano, Matthew J.; Lewandowski, E. Douglas

    2014-01-01

    Purpose: Long chain fatty acid (LCFA) oxidation measurements in the intact heart from 13C-NMR rely on detection of 13C-enriched glutamate. However, progressive increases in overlapping resonance signal from LCFA can confound detection of the glutamate 4-carbon (GLU-C4) signal. We evaluated alternative 13C labeling for exogenous LCFA and developed a simple scheme to distinguish kinetics of LCFA uptake and storage from oxidation. Methods: Sequential 13C-NMR spectra were acquired from isolated rat hearts perfused with 13C LCFA and glucose. Spectra were evaluated from hearts supplied: U 13C LCFA, [2,4,6,8,10,12,14,16-13C8] palmitate, [2,4,6,8,10,12,14,16,18-13C9] oleate, [4,6,8,10,12,14,16-13C7] palmitate, or [4,6,8,10,12,14,16,18-13C8] oleate. Results: 13C signal reflected the progressive enrichment at 34.6 ppm from GLU-C4, confounded by additional signal with distinct kinetics attributed to 13C-enriched LCFA 2-carbon (34.0 ppm). Excluding 13C at the 2-carbon of both palmitate and oleate eliminated signal overlap and enabled detection of the exponential enrichment of GLU-C4 for assessing LCFA oxidation. Conclusion: Eliminating enrichment at the 2-carbon of 13C LCFA resolved confounding kinetics between GLU-C4 and LCFA 2-carbon signals. With this enrichment scheme, oxidation of LCFA, the primary fuel for cardiac ATP synthesis, can now be more consistently examined in whole organs with dynamic mode, proton-decoupled 13C-NMR. PMID:25199499

  19. 'Bigger data' on scale-dependent effects of invasive species on biodiversity cannot overcome confounded analyses: a comment on Stohlgren & Rejmánek (2014).

    PubMed

    Chase, Jonathan M; Powell, Kristin I; Knight, Tiffany M

    2015-08-01

    A recent study by Stohlgren & Rejmánek (SR: Stohlgren TJ, Rejmánek M. 2014 Biol. Lett. 10. (doi:10.1098/rsbl.2013.0939)) purported to test the generality of a recent finding of scale-dependent effects of invasive plants on native diversity; dominant invasive plants decreased the intercept and increased the slope of the species-area relationship. SR (2014) find little correlation between invasive species cover and the slopes and intercepts of SARs across a diversity of sites. We show that the analyses of SR (2014) are inappropriate because of confounding causality.
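
    For context, the slope and intercept at issue come from fitting the species-area relationship on log-log axes, S = c * A^z; the sketch below shows such a fit with invented plot data and is not SR's or the commenters' analysis.

```python
# Illustrative sketch: estimating the slope (z) and intercept (c) of a
# species-area relationship by fitting log S = c + z * log A.
import numpy as np

area = np.array([1, 10, 100, 1000, 10000])        # plot areas (m^2), assumed
richness = np.array([5, 12, 30, 70, 160])          # native species counts, assumed

z, c = np.polyfit(np.log10(area), np.log10(richness), 1)   # slope first, then intercept
print(f"SAR slope z = {z:.2f}, intercept c = {c:.2f}")
```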

  20. Accountable Care Organizations: The National Landscape.

    PubMed

    Shortell, Stephen M; Colla, Carrie H; Lewis, Valerie A; Fisher, Elliott; Kessell, Eric; Ramsay, Patricia

    2015-08-01

    There are now more than seven hundred accountable care organizations (ACOs) in the United States. This article describes some of their most salient characteristics including the number and types of contracts involved, organizational structures, the scope of services offered, care management capabilities, and the development of a three-category taxonomy that can be used to target technical assistance efforts and to examine performance. The current evidence on the performance of ACOs is reviewed. Since California has the largest number of ACOs (N=67) and a history of providing care under risk-bearing contracts, some additional assessments of quality and patient experience are made between California ACOs and non-ACO provider organizations. Six key issues likely to affect future ACO growth and development are discussed, and some potential "diagnostic" indicators for assessing the likelihood of potential antitrust violations are presented.