Sample records for instrumental variable estimation

  1. A Polychoric Instrumental Variable (PIV) Estimator for Structural Equation Models with Categorical Variables

    ERIC Educational Resources Information Center

    Bollen, Kenneth A.; Maydeu-Olivares, Albert

    2007-01-01

    This paper presents a new polychoric instrumental variable (PIV) estimator to use in structural equation models (SEMs) with categorical observed variables. The PIV estimator is a generalization of Bollen's (Psychometrika 61:109-121, 1996) 2SLS/IV estimator for continuous variables to categorical endogenous variables. We derive the PIV estimator…

  2. Compliance-Effect Correlation Bias in Instrumental Variables Estimators

    ERIC Educational Resources Information Center

    Reardon, Sean F.

    2010-01-01

    Instrumental variable estimators hold the promise of enabling researchers to estimate the effects of educational treatments that are not (or cannot be) randomly assigned but that may be affected by randomly assigned interventions. Examples of the use of instrumental variables in such cases are increasingly common in educational and social science…

  3. Censored Quantile Instrumental Variable Estimates of the Price Elasticity of Expenditure on Medical Care.

    PubMed

    Kowalski, Amanda

    2016-01-02

    Efforts to control medical care costs depend critically on how individuals respond to prices. I estimate the price elasticity of expenditure on medical care using a censored quantile instrumental variable (CQIV) estimator. CQIV allows estimates to vary across the conditional expenditure distribution, relaxes traditional censored model assumptions, and addresses endogeneity with an instrumental variable. My instrumental variable strategy uses a family member's injury to induce variation in an individual's own price. Across the conditional deciles of the expenditure distribution, I find elasticities that vary from -0.76 to -1.49, which are an order of magnitude larger than previous estimates.

  4. Censored Quantile Instrumental Variable Estimates of the Price Elasticity of Expenditure on Medical Care

    PubMed Central

    Kowalski, Amanda

    2015-01-01

    Efforts to control medical care costs depend critically on how individuals respond to prices. I estimate the price elasticity of expenditure on medical care using a censored quantile instrumental variable (CQIV) estimator. CQIV allows estimates to vary across the conditional expenditure distribution, relaxes traditional censored model assumptions, and addresses endogeneity with an instrumental variable. My instrumental variable strategy uses a family member’s injury to induce variation in an individual’s own price. Across the conditional deciles of the expenditure distribution, I find elasticities that vary from −0.76 to −1.49, which are an order of magnitude larger than previous estimates. PMID:26977117

  5. Tutorial in Biostatistics: Instrumental Variable Methods for Causal Inference*

    PubMed Central

    Baiocchi, Michael; Cheng, Jing; Small, Dylan S.

    2014-01-01

    A goal of many health studies is to determine the causal effect of a treatment or intervention on health outcomes. Often, it is not ethically or practically possible to conduct a perfectly randomized experiment and instead an observational study must be used. A major challenge to the validity of observational studies is the possibility of unmeasured confounding (i.e., unmeasured ways in which the treatment and control groups differ before treatment administration which also affect the outcome). Instrumental variables analysis is a method for controlling for unmeasured confounding. This type of analysis requires the measurement of a valid instrumental variable, which is a variable that (i) is independent of the unmeasured confounding; (ii) affects the treatment; and (iii) affects the outcome only indirectly through its effect on the treatment. This tutorial discusses the types of causal effects that can be estimated by instrumental variables analysis; the assumptions needed for instrumental variables analysis to provide valid estimates of causal effects and sensitivity analysis for those assumptions; methods of estimation of causal effects using instrumental variables; and sources of instrumental variables in health studies. PMID:24599889
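
    The three instrument conditions above translate directly into a small simulation. The sketch below is ours, not the tutorial's; every variable name is hypothetical. An unmeasured confounder biases ordinary least squares, while a two-stage least squares/Wald-type estimate recovers the true effect.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

u = rng.normal(size=n)                        # unmeasured confounder
z = rng.normal(size=n)                        # instrument: generated independently of u
x = 0.8 * z + u + rng.normal(size=n)          # treatment, affected by z and by u
y = 1.5 * x + 2.0 * u + rng.normal(size=n)    # outcome; true causal effect of x is 1.5

# Naive OLS slope of y on x is biased upward by the shared confounder u.
ols = np.cov(x, y)[0, 1] / np.cov(x, y)[0, 0]

# IV (two-stage least squares with a single instrument) reduces to cov(z, y) / cov(z, x).
iv = np.cov(z, y)[0, 1] / np.cov(z, x)[0, 1]

print(f"OLS estimate: {ols:.2f}   IV estimate: {iv:.2f}   truth: 1.50")
```

    Condition (i) holds because z is generated independently of u, condition (ii) because z enters the equation for x, and condition (iii) because z affects y only through x.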

  6. Testing concordance of instrumental variable effects in generalized linear models with application to Mendelian randomization

    PubMed Central

    Dai, James Y.; Chan, Kwun Chuen Gary; Hsu, Li

    2014-01-01

    Instrumental variable regression is one way to overcome unmeasured confounding and estimate causal effects in observational studies. Building on structural mean models, considerable work has recently been developed for consistent estimation of the causal relative risk and causal odds ratio. Such models can sometimes suffer from identification issues for weak instruments, which has hampered the applicability of Mendelian randomization analysis in genetic epidemiology. When there are multiple genetic variants available as instrumental variables, and the causal effect is defined in a generalized linear model in the presence of unmeasured confounders, we propose to test concordance between instrumental variable effects on the intermediate exposure and instrumental variable effects on the disease outcome, as a means to test the causal effect. We show that a class of generalized least squares estimators provides valid and consistent tests of causality. For the causal effect of a continuous exposure on a dichotomous outcome in logistic models, the proposed estimators are shown to be asymptotically conservative. When the disease outcome is rare, such estimators are consistent due to the log-linear approximation of the logistic function. Optimality of such estimators relative to the well-known two-stage least squares estimator and the double-logistic structural mean model is further discussed. PMID:24863158

  7. Instrumental Variable Analysis with a Nonlinear Exposure–Outcome Relationship

    PubMed Central

    Davies, Neil M.; Thompson, Simon G.

    2014-01-01

    Background: Instrumental variable methods can estimate the causal effect of an exposure on an outcome using observational data. Many instrumental variable methods assume that the exposure–outcome relation is linear, but in practice this assumption is often in doubt, or perhaps the shape of the relation is a target for investigation. We investigate this issue in the context of Mendelian randomization, the use of genetic variants as instrumental variables. Methods: Using simulations, we demonstrate the performance of a simple linear instrumental variable method when the true shape of the exposure–outcome relation is not linear. We also present a novel method for estimating the effect of the exposure on the outcome within strata of the exposure distribution. This enables the estimation of localized average causal effects within quantile groups of the exposure or as a continuous function of the exposure using a sliding window approach. Results: Our simulations suggest that linear instrumental variable estimates approximate a population-averaged causal effect. This is the average difference in the outcome if the exposure for every individual in the population is increased by a fixed amount. Estimates of localized average causal effects reveal the shape of the exposure–outcome relation for a variety of models. These methods are used to investigate the relations between body mass index and a range of cardiovascular risk factors. Conclusions: Nonlinear exposure–outcome relations should not be a barrier to instrumental variable analyses. When the exposure–outcome relation is not linear, either a population-averaged causal effect or the shape of the exposure–outcome relation can be estimated. PMID:25166881

  8. Statistical methods for biodosimetry in the presence of both Berkson and classical measurement error

    NASA Astrophysics Data System (ADS)

    Miller, Austin

    In radiation epidemiology, the true dose received by those exposed cannot be assessed directly. Physical dosimetry uses a deterministic function of the source term, distance and shielding to estimate dose. For the atomic bomb survivors, the physical dosimetry system is well established. The classical measurement errors plaguing the location and shielding inputs to the physical dosimetry system are well known. Adjusting for the associated biases requires an estimate for the classical measurement error variance, for which no data-driven estimate exists. In this case, an instrumental variable solution is the most viable option to overcome the classical measurement error indeterminacy. Biological indicators of dose may serve as instrumental variables. Specification of the biodosimeter dose-response model requires identification of the radiosensitivity variables, for which we develop statistical definitions and variables. More recently, researchers have recognized Berkson error in the dose estimates, introduced by averaging assumptions for many components in the physical dosimetry system. We show that Berkson error induces a bias in the instrumental variable estimate of the dose-response coefficient, and then address the estimation problem. This model is specified by developing an instrumental variable mixed measurement error likelihood function, which is then maximized using a Monte Carlo EM Algorithm. These methods produce dose estimates that incorporate information from both physical and biological indicators of dose, as well as the first instrumental variable based data-driven estimate for the classical measurement error variance.
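
    As a toy illustration of why an instrument helps with classical measurement error (a simplified sketch, not the author's likelihood-based mixed-error model; `physical_dose` and `bio_dose` are hypothetical stand-ins for the physical and biological dose indicators), the code below shows the attenuation of a naive regression slope, its correction by an instrumental variable estimate, and a data-driven estimate of the classical error variance.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 50_000

true_dose = rng.gamma(shape=2.0, scale=1.0, size=n)            # unobserved true dose
physical_dose = true_dose + rng.normal(scale=0.7, size=n)      # classical error (variance 0.49)
bio_dose = true_dose + rng.normal(scale=1.0, size=n)           # independent error; used as instrument
outcome = 0.5 * true_dose + rng.normal(scale=0.5, size=n)      # true dose-response slope = 0.5

# Naive slope is attenuated toward zero by the classical error in physical_dose.
naive = np.cov(physical_dose, outcome)[0, 1] / np.var(physical_dose, ddof=1)

# IV slope: cov(instrument, outcome) / cov(instrument, error-prone dose).
iv = np.cov(bio_dose, outcome)[0, 1] / np.cov(bio_dose, physical_dose)[0, 1]

# Data-driven classical-error variance: Var(W) - Cov(W, Z) when the two errors are independent.
err_var = np.var(physical_dose, ddof=1) - np.cov(bio_dose, physical_dose)[0, 1]

print(f"naive slope {naive:.2f}, IV slope {iv:.2f}, error-variance estimate {err_var:.2f} (truth 0.49)")
```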

  9. Data Combination and Instrumental Variables in Linear Models

    ERIC Educational Resources Information Center

    Khawand, Christopher

    2012-01-01

    Instrumental variables (IV) methods allow for consistent estimation of causal effects, but suffer from poor finite-sample properties and data availability constraints. IV estimates also tend to have relatively large standard errors, often inhibiting the interpretability of differences between IV and non-IV point estimates. Lastly, instrumental…

  10. Instrumental variables I: instrumental variables exploit natural variation in nonexperimental data to estimate causal relationships.

    PubMed

    Rassen, Jeremy A; Brookhart, M Alan; Glynn, Robert J; Mittleman, Murray A; Schneeweiss, Sebastian

    2009-12-01

    The gold standard of study design for treatment evaluation is widely acknowledged to be the randomized controlled trial (RCT). Trials allow for the estimation of causal effect by randomly assigning participants either to an intervention or comparison group; through the assumption of "exchangeability" between groups, comparing the outcomes will yield an estimate of causal effect. In the many cases where RCTs are impractical or unethical, instrumental variable (IV) analysis offers a nonexperimental alternative based on many of the same principles. IV analysis relies on finding a naturally varying phenomenon, related to treatment but not to outcome except through the effect of treatment itself, and then using this phenomenon as a proxy for the confounded treatment variable. This article demonstrates how IV analysis arises from an analogous but potentially impossible RCT design, and outlines the assumptions necessary for valid estimation. It gives examples of instruments used in clinical epidemiology and concludes with an outline on estimation of effects.
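
    The RCT analogy can be made concrete with the classic Wald estimator: the instrument-outcome ("intention-to-treat") contrast divided by the instrument-treatment contrast. The sketch below is illustrative only; a constant treatment effect is assumed and all names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 200_000

u = rng.normal(size=n)                          # unmeasured confounder
encouraged = rng.integers(0, 2, size=n)         # binary instrument (e.g., randomized encouragement)
p_treat = np.clip(0.2 + 0.5 * encouraged + 0.1 * u, 0, 1)
treated = (rng.random(n) < p_treat).astype(float)
outcome = 2.0 * treated + u + rng.normal(size=n)     # constant treatment effect of 2.0

itt = outcome[encouraged == 1].mean() - outcome[encouraged == 0].mean()     # instrument-outcome contrast
uptake = treated[encouraged == 1].mean() - treated[encouraged == 0].mean()  # instrument-treatment contrast
wald = itt / uptake

print(f"ITT {itt:.2f}, uptake difference {uptake:.2f}, Wald IV estimate {wald:.2f} (truth 2.00)")
```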

  11. Instrumental variables I: instrumental variables exploit natural variation in nonexperimental data to estimate causal relationships

    PubMed Central

    Rassen, Jeremy A.; Brookhart, M. Alan; Glynn, Robert J.; Mittleman, Murray A.; Schneeweiss, Sebastian

    2010-01-01

    The gold standard of study design for treatment evaluation is widely acknowledged to be the randomized controlled trial (RCT). Trials allow for the estimation of causal effect by randomly assigning participants either to an intervention or comparison group; through the assumption of “exchangeability” between groups, comparing the outcomes will yield an estimate of causal effect. In the many cases where RCTs are impractical or unethical, instrumental variable (IV) analysis offers a nonexperimental alternative based on many of the same principles. IV analysis relies on finding a naturally varying phenomenon, related to treatment but not to outcome except through the effect of treatment itself, and then using this phenomenon as a proxy for the confounded treatment variable. This article demonstrates how IV analysis arises from an analogous but potentially impossible RCT design, and outlines the assumptions necessary for valid estimation. It gives examples of instruments used in clinical epidemiology and concludes with an outline on estimation of effects. PMID:19356901

  12. Eliminating Survivor Bias in Two-stage Instrumental Variable Estimators.

    PubMed

    Vansteelandt, Stijn; Walter, Stefan; Tchetgen Tchetgen, Eric

    2018-07-01

    Mendelian randomization studies commonly focus on elderly populations. This makes the instrumental variables analysis of such studies sensitive to survivor bias, a type of selection bias. A particular concern is that the instrumental variable conditions, even when valid for the source population, may be violated for the selective population of individuals who survive the onset of the study. This is potentially very damaging because Mendelian randomization studies are known to be sensitive to bias due to even minor violations of the instrumental variable conditions. Interestingly, the instrumental variable conditions continue to hold within certain risk sets of individuals who are still alive at a given age when the instrument and unmeasured confounders exert additive effects on the exposure, and moreover, the exposure and unmeasured confounders exert additive effects on the hazard of death. In this article, we will exploit this property to derive a two-stage instrumental variable estimator for the effect of exposure on mortality, which is insulated against the above described selection bias under these additivity assumptions.

  13. Network Mendelian randomization: using genetic variants as instrumental variables to investigate mediation in causal pathways

    PubMed Central

    Burgess, Stephen; Daniel, Rhian M; Butterworth, Adam S; Thompson, Simon G

    2015-01-01

    Background: Mendelian randomization uses genetic variants, assumed to be instrumental variables for a particular exposure, to estimate the causal effect of that exposure on an outcome. If the instrumental variable criteria are satisfied, the resulting estimator is consistent even in the presence of unmeasured confounding and reverse causation. Methods: We extend the Mendelian randomization paradigm to investigate more complex networks of relationships between variables, in particular where some of the effect of an exposure on the outcome may operate through an intermediate variable (a mediator). If instrumental variables for the exposure and mediator are available, direct and indirect effects of the exposure on the outcome can be estimated, for example using either a regression-based method or structural equation models. The direction of effect between the exposure and a possible mediator can also be assessed. Methods are illustrated in an applied example considering causal relationships between body mass index, C-reactive protein and uric acid. Results: These estimators are consistent in the presence of unmeasured confounding if, in addition to the instrumental variable assumptions, the effects of both the exposure on the mediator and the mediator on the outcome are homogeneous across individuals and linear without interactions. Nevertheless, a simulation study demonstrates that even considerable heterogeneity in these effects does not lead to bias in the estimates. Conclusions: These methods can be used to estimate direct and indirect causal effects in a mediation setting, and have potential for the investigation of more complex networks between multiple interrelated exposures and disease outcomes. PMID:25150977
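
    A stylized version of the decomposition described above can be written with single-variant ratio (Wald) estimates: the indirect effect is the product of the exposure-to-mediator and mediator-to-outcome IV estimates, and the direct effect is the total minus the indirect effect. This is our simplified sketch under linearity and homogeneity, not the authors' full regression-based or structural-equation implementation.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 200_000

u = rng.normal(size=n)                             # unmeasured confounder
g_x = rng.binomial(2, 0.3, size=n).astype(float)   # variant instrumenting the exposure
g_m = rng.binomial(2, 0.3, size=n).astype(float)   # variant instrumenting the mediator
x = 0.5 * g_x + u + rng.normal(size=n)             # exposure
m = 0.4 * x + 0.5 * g_m + u + rng.normal(size=n)   # mediator; true x -> m effect = 0.4
y = 0.3 * x + 0.6 * m + u + rng.normal(size=n)     # true direct effect 0.3, indirect 0.4 * 0.6 = 0.24

def wald(z, treat, out):
    """Ratio (Wald) IV estimate of the effect of `treat` on `out` using instrument `z`."""
    return np.cov(z, out)[0, 1] / np.cov(z, treat)[0, 1]

total = wald(g_x, x, y)        # total effect of exposure on outcome (~0.54)
x_to_m = wald(g_x, x, m)       # exposure -> mediator (~0.40)
m_to_y = wald(g_m, m, y)       # mediator -> outcome (~0.60)
indirect = x_to_m * m_to_y

print(f"total {total:.2f}, indirect {indirect:.2f}, direct {total - indirect:.2f}")
```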

  14. Robust best linear estimator for Cox regression with instrumental variables in whole cohort and surrogates with additive measurement error in calibration sample

    PubMed Central

    Wang, Ching-Yun; Song, Xiao

    2017-01-01

    Biomedical researchers are often interested in estimating the effect of an environmental exposure in relation to a chronic disease endpoint. However, the exposure variable of interest may be measured with errors. In a subset of the whole cohort, a surrogate variable is available for the true unobserved exposure variable. The surrogate variable satisfies an additive measurement error model, but it may not have repeated measurements. The subset in which the surrogate variables are available is called a calibration sample. In addition to the surrogate variables that are available among the subjects in the calibration sample, we consider the situation when there is an instrumental variable available for all study subjects. An instrumental variable is correlated with the unobserved true exposure variable, and hence can be useful in the estimation of the regression coefficients. In this paper, we propose a nonparametric method for Cox regression using the observed data from the whole cohort. The nonparametric estimator is the best linear combination of a nonparametric correction estimator from the calibration sample and the difference of the naive estimators from the calibration sample and the whole cohort. The asymptotic distribution is derived, and the finite sample performance of the proposed estimator is examined via intensive simulation studies. The methods are applied to the Nutritional Biomarkers Study of the Women’s Health Initiative. PMID:27546625

  15. Dynamic modal estimation using instrumental variables

    NASA Technical Reports Server (NTRS)

    Salzwedel, H.

    1980-01-01

    A method to determine the modes of dynamical systems is described. The inputs and outputs of a system are Fourier transformed and averaged to reduce the error level. An instrumental variable method that estimates modal parameters from multiple correlations between responses of single input, multiple output systems is applied to estimate aircraft, spacecraft, and off-shore platform modal parameters.

  16. That Instrument Is Lousy! In Search of Agreement When Using Instrumental Variables Estimation in Substance Use Research

    PubMed Central

    Popovici, Ioana

    2009-01-01

    The primary statistical challenge that must be addressed when using cross-sectional data to estimate the consequences of consuming addictive substances is the likely endogeneity of substance use. While economists are in agreement on the need to consider potential endogeneity bias and the value of instrumental variables estimation, the selection of credible instruments is a topic of heated debate in the field. Rather than attempt to resolve this debate, our paper highlights the diversity of judgments about what constitutes appropriate instruments for substance use based on a comprehensive review of the economics literature since 1990. We then offer recommendations related to the selection of reliable instruments in future studies. PMID:20029936

  17. Instrumental variables estimation of exposure effects on a time-to-event endpoint using structural cumulative survival models.

    PubMed

    Martinussen, Torben; Vansteelandt, Stijn; Tchetgen Tchetgen, Eric J; Zucker, David M

    2017-12-01

    The use of instrumental variables for estimating the effect of an exposure on an outcome is popular in econometrics, and increasingly so in epidemiology. This increasing popularity may be attributed to the natural occurrence of instrumental variables in observational studies that incorporate elements of randomization, either by design or by nature (e.g., random inheritance of genes). Instrumental variables estimation of exposure effects is well established for continuous outcomes and to some extent for binary outcomes. It is, however, largely lacking for time-to-event outcomes because of complications due to censoring and survivorship bias. In this article, we make a novel proposal under a class of structural cumulative survival models which parameterize time-varying effects of a point exposure directly on the scale of the survival function; these models are essentially equivalent to a semi-parametric variant of the instrumental variables additive hazards model. We propose a class of recursive instrumental variable estimators for these exposure effects, and derive their large sample properties along with inferential tools. We examine the performance of the proposed method in simulation studies and illustrate it in a Mendelian randomization study to evaluate the effect of diabetes on mortality using data from the Health and Retirement Study. We further use the proposed method to investigate potential benefit from breast cancer screening on subsequent breast cancer mortality based on the HIP-study.

  18. Implementation of Instrumental Variable Bounds for Data Missing Not at Random.

    PubMed

    Marden, Jessica R; Wang, Linbo; Tchetgen, Eric J Tchetgen; Walter, Stefan; Glymour, M Maria; Wirth, Kathleen E

    2018-05-01

    Instrumental variables are routinely used to recover a consistent estimator of an exposure causal effect in the presence of unmeasured confounding. Instrumental variable approaches to account for nonignorable missing data also exist but are less familiar to epidemiologists. Like instrumental variables for exposure causal effects, instrumental variables for missing data rely on exclusion restriction and instrumental variable relevance assumptions. Yet these two conditions alone are insufficient for point identification. For estimation, researchers have invoked a third assumption, typically involving fairly restrictive parametric constraints. Inferences can be sensitive to these parametric assumptions, which are typically not empirically testable. The purpose of our article is to discuss another approach for leveraging a valid instrumental variable. Although the approach is insufficient for nonparametric identification, it can nonetheless provide informative inferences about the presence, direction, and magnitude of selection bias, without invoking a third untestable parametric assumption. An important contribution of this article is an Excel spreadsheet tool that can be used to obtain empirical evidence of selection bias and calculate bounds and corresponding Bayesian 95% credible intervals for a nonidentifiable population proportion. For illustrative purposes, we used the spreadsheet tool to analyze HIV prevalence data collected by the 2007 Zambia Demographic and Health Survey (DHS).

  19. Robust best linear estimator for Cox regression with instrumental variables in whole cohort and surrogates with additive measurement error in calibration sample.

    PubMed

    Wang, Ching-Yun; Song, Xiao

    2016-11-01

    Biomedical researchers are often interested in estimating the effect of an environmental exposure in relation to a chronic disease endpoint. However, the exposure variable of interest may be measured with errors. In a subset of the whole cohort, a surrogate variable is available for the true unobserved exposure variable. The surrogate variable satisfies an additive measurement error model, but it may not have repeated measurements. The subset in which the surrogate variables are available is called a calibration sample. In addition to the surrogate variables that are available among the subjects in the calibration sample, we consider the situation when there is an instrumental variable available for all study subjects. An instrumental variable is correlated with the unobserved true exposure variable, and hence can be useful in the estimation of the regression coefficients. In this paper, we propose a nonparametric method for Cox regression using the observed data from the whole cohort. The nonparametric estimator is the best linear combination of a nonparametric correction estimator from the calibration sample and the difference of the naive estimators from the calibration sample and the whole cohort. The asymptotic distribution is derived, and the finite sample performance of the proposed estimator is examined via intensive simulation studies. The methods are applied to the Nutritional Biomarkers Study of the Women's Health Initiative.

  20. Bias and Bias Correction in Multi-Site Instrumental Variables Analysis of Heterogeneous Mediator Effects

    ERIC Educational Resources Information Center

    Reardon, Sean F.; Unlu, Fatih; Zhu, Pei; Bloom, Howard

    2013-01-01

    We explore the use of instrumental variables (IV) analysis with a multi-site randomized trial to estimate the effect of a mediating variable on an outcome in cases where it can be assumed that the observed mediator is the only mechanism linking treatment assignment to outcomes, an assumption known in the instrumental variables literature as the…

  21. Use of allele scores as instrumental variables for Mendelian randomization

    PubMed Central

    Burgess, Stephen; Thompson, Simon G

    2013-01-01

    Background An allele score is a single variable summarizing multiple genetic variants associated with a risk factor. It is calculated as the total number of risk factor-increasing alleles for an individual (unweighted score), or the sum of weights for each allele corresponding to estimated genetic effect sizes (weighted score). An allele score can be used in a Mendelian randomization analysis to estimate the causal effect of the risk factor on an outcome. Methods Data were simulated to investigate the use of allele scores in Mendelian randomization where conventional instrumental variable techniques using multiple genetic variants demonstrate ‘weak instrument’ bias. The robustness of estimates using the allele score to misspecification (for example non-linearity, effect modification) and to violations of the instrumental variable assumptions was assessed. Results Causal estimates using a correctly specified allele score were unbiased with appropriate coverage levels. The estimates were generally robust to misspecification of the allele score, but not to instrumental variable violations, even if the majority of variants in the allele score were valid instruments. Using a weighted rather than an unweighted allele score increased power, but the increase was small when genetic variants had similar effect sizes. Naive use of the data under analysis to choose which variants to include in an allele score, or for deriving weights, resulted in substantial biases. Conclusions Allele scores enable valid causal estimates with large numbers of genetic variants. The stringency of criteria for genetic variants in Mendelian randomization should be maintained for all variants in an allele score. PMID:24062299
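
    A minimal sketch of the allele-score idea (ours, with simulated genotypes): sum the risk alleles, optionally weighted by externally estimated effect sizes, and use the score as a single instrument in a ratio estimate. As the abstract cautions, weights should come from data other than the sample under analysis; reusing the simulation's true effects below merely stands in for external weights.

```python
import numpy as np

rng = np.random.default_rng(4)
n, k = 100_000, 20

maf = rng.uniform(0.1, 0.5, size=k)            # allele frequencies
G = rng.binomial(2, maf, size=(n, k))          # genotypes: 0/1/2 risk alleles per variant
alpha = rng.uniform(0.02, 0.10, size=k)        # per-allele effects on the risk factor

u = rng.normal(size=n)                         # unmeasured confounder
x = G @ alpha + u + rng.normal(size=n)         # risk factor
y = 0.25 * x + u + rng.normal(size=n)          # outcome; true causal effect = 0.25

unweighted = G.sum(axis=1)                     # count of risk-increasing alleles
weighted = G @ alpha                           # weights would normally come from external data

for name, score in [("unweighted", unweighted), ("weighted", weighted)]:
    est = np.cov(score, y)[0, 1] / np.cov(score, x)[0, 1]   # score used as a single instrument
    print(f"{name} allele score IV estimate: {est:.3f} (truth 0.25)")
```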

  22. Regularization Methods for High-Dimensional Instrumental Variables Regression With an Application to Genetical Genomics

    PubMed Central

    Lin, Wei; Feng, Rui; Li, Hongzhe

    2014-01-01

    In genetical genomics studies, it is important to jointly analyze gene expression data and genetic variants in exploring their associations with complex traits, where the dimensionality of gene expressions and genetic variants can both be much larger than the sample size. Motivated by such modern applications, we consider the problem of variable selection and estimation in high-dimensional sparse instrumental variables models. To overcome the difficulty of high dimensionality and unknown optimal instruments, we propose a two-stage regularization framework for identifying and estimating important covariate effects while selecting and estimating optimal instruments. The methodology extends the classical two-stage least squares estimator to high dimensions by exploiting sparsity using sparsity-inducing penalty functions in both stages. The resulting procedure is efficiently implemented by coordinate descent optimization. For the representative L1 regularization and a class of concave regularization methods, we establish estimation, prediction, and model selection properties of the two-stage regularized estimators in the high-dimensional setting where the dimensionality of co-variates and instruments are both allowed to grow exponentially with the sample size. The practical performance of the proposed method is evaluated by simulation studies and its usefulness is illustrated by an analysis of mouse obesity data. Supplementary materials for this article are available online. PMID:26392642
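
    The two-stage idea can be sketched with off-the-shelf lasso fits (a simplified illustration of the general strategy only, not the authors' penalties, tuning, or theory): a sparse first stage predicts each endogenous covariate from the instruments, and a sparse second stage regresses the outcome on those fitted values.

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(5)
n, p_x, p_z = 400, 50, 100             # observations, endogenous covariates, candidate instruments

Z = rng.normal(size=(n, p_z))          # instruments (e.g., genetic variants)
u = rng.normal(size=n)                 # unmeasured confounder

Gamma = np.zeros((p_z, p_x))
for j in range(p_x):
    Gamma[2 * j: 2 * j + 2, j] = 1.0   # each covariate is driven by its own two instruments
X = Z @ Gamma + 0.5 * u[:, None] + rng.normal(size=(n, p_x))

beta = np.zeros(p_x)
beta[:5] = [1.0, -0.8, 0.6, 0.4, 0.5]  # only five covariates have true effects
y = X @ beta + u + rng.normal(size=n)

# Stage 1: sparse first-stage fit of each endogenous covariate on the instruments.
X_hat = np.column_stack([Lasso(alpha=0.1).fit(Z, X[:, j]).predict(Z) for j in range(p_x)])

# Stage 2: sparse regression of the outcome on the fitted (instrument-predicted) covariates.
stage2 = Lasso(alpha=0.1).fit(X_hat, y)
print("estimated effects of the first six covariates:", np.round(stage2.coef_[:6], 2))
```

    The penalty shrinks the estimates toward zero; the point is only that the two-stage structure carries over to penalized fits.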

  23. Physicians' prescribing preferences were a potential instrument for patients' actual prescriptions of antidepressants

    PubMed Central

    Davies, Neil M.; Gunnell, David; Thomas, Kyla H.; Metcalfe, Chris; Windmeijer, Frank; Martin, Richard M.

    2013-01-01

    Objectives To investigate whether physicians' prescribing preferences were valid instrumental variables for the antidepressant prescriptions they issued to their patients. Study Design and Setting We investigated whether physicians' previous prescriptions of (1) tricyclic antidepressants (TCAs) vs. selective serotonin reuptake inhibitors (SSRIs) and (2) paroxetine vs. other SSRIs were valid instruments. We investigated whether the instrumental variable assumptions are likely to hold and whether TCAs (vs. SSRIs) were associated with hospital admission for self-harm or death by suicide using both conventional and instrumental variable regressions. The setting for the study was general practices in the United Kingdom. Results Prior prescriptions were strongly associated with actual prescriptions: physicians who previously prescribed TCAs were 14.9 percentage points (95% confidence interval [CI], 14.4, 15.4) more likely to prescribe TCAs, and those who previously prescribed paroxetine were 27.7 percentage points (95% CI, 26.7, 28.8) more likely to prescribe paroxetine, to their next patient. Physicians' previous prescriptions were less strongly associated with patients' baseline characteristics than actual prescriptions. We found no evidence that the estimated association of TCAs with self-harm/suicide using instrumental variable regression differed from conventional regression estimates (P-value = 0.45). Conclusion The main instrumental variable assumptions held, suggesting that physicians' prescribing preferences are valid instruments for evaluating the short-term effects of antidepressants. PMID:24075596

  24. Invited Commentary: Using Financial Credits as Instrumental Variables for Estimating the Causal Relationship Between Income and Health.

    PubMed

    Pega, Frank

    2016-05-01

    Social epidemiologists are interested in determining the causal relationship between income and health. Natural experiments in which individuals or groups receive income randomly or quasi-randomly from financial credits (e.g., tax credits or cash transfers) are increasingly being analyzed using instrumental variable analysis. For example, in this issue of the Journal, Hamad and Rehkopf (Am J Epidemiol. 2016;183(9):775-784) used an in-work tax credit called the Earned Income Tax Credit as an instrument to estimate the association between income and child development. However, under certain conditions, the use of financial credits as instruments could violate 2 key instrumental variable analytic assumptions. First, some financial credits may directly influence health, for example, through increasing a psychological sense of welfare security. Second, financial credits and health may have several unmeasured common causes, such as politics, other social policies, and the motivation to maximize the credit. If epidemiologists pursue such instrumental variable analyses, using the amount of an unconditional, universal credit that an individual or group has received as the instrument may produce the most conceptually convincing and generalizable evidence. However, other natural income experiments (e.g., lottery winnings) and other methods that allow better adjustment for confounding might be more promising approaches for estimating the causal relationship between income and health.

  25. An improved estimator for the hydration of fat-free mass from in vivo measurements subject to additive technical errors.

    PubMed

    Kinnamon, Daniel D; Lipsitz, Stuart R; Ludwig, David A; Lipshultz, Steven E; Miller, Tracie L

    2010-04-01

    The hydration of fat-free mass, or hydration fraction (HF), is often defined as a constant body composition parameter in a two-compartment model and then estimated from in vivo measurements. We showed that the widely used estimator for the HF parameter in this model, the mean of the ratios of measured total body water (TBW) to fat-free mass (FFM) in individual subjects, can be inaccurate in the presence of additive technical errors. We then proposed a new instrumental variables estimator that accurately estimates the HF parameter in the presence of such errors. In Monte Carlo simulations, the mean of the ratios of TBW to FFM was an inaccurate estimator of the HF parameter, and inferences based on it had actual type I error rates more than 13 times the nominal 0.05 level under certain conditions. The instrumental variables estimator was accurate and maintained an actual type I error rate close to the nominal level in all simulations. When estimating and performing inference on the HF parameter, the proposed instrumental variables estimator should yield accurate estimates and correct inferences in the presence of additive technical errors, but the mean of the ratios of TBW to FFM in individual subjects may not.

  26. Alcohol consumption, beverage prices and measurement error.

    PubMed

    Young, Douglas J; Bielinska-Kwapisz, Agnieszka

    2003-03-01

    Alcohol price data collected by the American Chamber of Commerce Researchers Association (ACCRA) have been widely used in studies of alcohol consumption and related behaviors. A number of problems with these data suggest that they contain substantial measurement error, which biases conventional statistical estimators toward a finding of little or no effect of prices on behavior. We test for measurement error, assess the magnitude of the bias and provide an alternative estimator that is likely to be superior. The study utilizes data on per capita alcohol consumption across U.S. states and the years 1982-1997. State and federal alcohol taxes are used as instrumental variables for prices. Formal tests strongly confirm the hypothesis of measurement error. Instrumental variable estimates of the price elasticity of demand range from -0.53 to -1.24. These estimates are substantially larger in absolute value than ordinary least squares estimates, which sometimes are not significantly different from zero or even positive. The ACCRA price data are substantially contaminated with measurement error, but using state and federal taxes as instrumental variables mitigates the problem.

  27. Health insurance for the poor: impact on catastrophic and out-of-pocket health expenditures in Mexico

    PubMed Central

    Galárraga, Omar; Salinas-Rodríguez, Aarón; Sesma-Vázquez, Sergio

    2009-01-01

    The goal of Seguro Popular (SP) in Mexico was to improve the financial protection of the uninsured population against excessive health expenditures. This paper estimates the impact of SP on catastrophic health expenditures (CHE), as well as out-of-pocket (OOP) health expenditures, from two different sources. First, we use the SP Impact Evaluation Survey (2005–2006), and compare the instrumental variables (IV) results with the experimental benchmark. Then, we use the same IV methods with the National Health and Nutrition Survey (ENSANUT 2006). We estimate naïve models, assuming exogeneity, and contrast them with IV models that take advantage of the specific SP implementation mechanisms for identification. The IV models estimated included two-stage least squares (2SLS), bivariate probit, and two-stage residual inclusion (2SRI) models. Instrumental variables estimates resulted in comparable estimates against the “gold standard.” Instrumental variables estimates indicate a reduction of 54% in catastrophic expenditures at the national level. SP beneficiaries also had lower expenditures on outpatient and medicine expenditures. The selection-corrected protective effect is found not only in the limited experimental dataset, but also at the national level. PMID:19756796
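
    For readers unfamiliar with two-stage residual inclusion (2SRI), the sketch below shows the generic recipe for a binary treatment and binary outcome: fit a first-stage model of treatment on the instrument, then include the first-stage residual alongside the treatment in the outcome model. This is a generic illustration with hypothetical names (`eligible` stands in for a programme-rollout instrument), not the paper's actual specification.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(6)
n = 50_000

u = rng.normal(size=n)                            # unmeasured confounder
eligible = rng.integers(0, 2, size=n)             # instrument: programme eligibility / rollout
p_enroll = 1 / (1 + np.exp(-(-1.0 + 2.0 * eligible + 0.8 * u)))
enrolled = rng.binomial(1, p_enroll)

p_cat = 1 / (1 + np.exp(-(-1.5 - 1.0 * enrolled + 0.8 * u)))   # enrolment is protective
catastrophic = rng.binomial(1, p_cat)

# Stage 1: model enrolment on the instrument and keep the raw residual.
stage1 = sm.Logit(enrolled, sm.add_constant(eligible)).fit(disp=0)
residual = enrolled - stage1.predict(sm.add_constant(eligible))

# Stage 2: outcome model that includes the treatment AND the stage-1 residual (2SRI).
X2 = sm.add_constant(np.column_stack([enrolled, residual]))
stage2 = sm.Logit(catastrophic, X2).fit(disp=0)
print(stage2.params)   # the coefficient on `enrolled` is the residual-adjusted treatment effect
```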

  28. Health insurance for the poor: impact on catastrophic and out-of-pocket health expenditures in Mexico.

    PubMed

    Galárraga, Omar; Sosa-Rubí, Sandra G; Salinas-Rodríguez, Aarón; Sesma-Vázquez, Sergio

    2010-10-01

    The goal of Seguro Popular (SP) in Mexico was to improve the financial protection of the uninsured population against excessive health expenditures. This paper estimates the impact of SP on catastrophic health expenditures (CHE), as well as out-of-pocket (OOP) health expenditures, from two different sources. First, we use the SP Impact Evaluation Survey (2005-2006), and compare the instrumental variables (IV) results with the experimental benchmark. Then, we use the same IV methods with the National Health and Nutrition Survey (ENSANUT 2006). We estimate naïve models, assuming exogeneity, and contrast them with IV models that take advantage of the specific SP implementation mechanisms for identification. The IV models estimated included two-stage least squares (2SLS), bivariate probit, and two-stage residual inclusion (2SRI) models. Instrumental variables estimates resulted in comparable estimates against the "gold standard." Instrumental variables estimates indicate a reduction of 54% in catastrophic expenditures at the national level. SP beneficiaries also had lower expenditures on outpatient and medicine expenditures. The selection-corrected protective effect is found not only in the limited experimental dataset, but also at the national level.

  29. Bias and Bias Correction in Multisite Instrumental Variables Analysis of Heterogeneous Mediator Effects

    ERIC Educational Resources Information Center

    Reardon, Sean F.; Unlu, Fatih; Zhu, Pei; Bloom, Howard S.

    2014-01-01

    We explore the use of instrumental variables (IV) analysis with a multisite randomized trial to estimate the effect of a mediating variable on an outcome in cases where it can be assumed that the observed mediator is the only mechanism linking treatment assignment to outcomes, an assumption known in the IV literature as the exclusion restriction.…

  30. Comparison of variance estimators for meta-analysis of instrumental variable estimates

    PubMed Central

    Schmidt, AF; Hingorani, AD; Jefferis, BJ; White, J; Groenwold, RHH; Dudbridge, F

    2016-01-01

    Background: Mendelian randomization studies perform instrumental variable (IV) analysis using genetic IVs. Results of individual Mendelian randomization studies can be pooled through meta-analysis. We explored how different variance estimators influence the meta-analysed IV estimate. Methods: Two versions of the delta method (IV before or after pooling), four bootstrap estimators, a jack-knife estimator and a heteroscedasticity-consistent (HC) variance estimator were compared using simulation. Two types of meta-analyses were compared, a two-stage meta-analysis pooling results, and a one-stage meta-analysis pooling datasets. Results: Using a two-stage meta-analysis, coverage of the point estimate using bootstrapped estimators deviated from nominal levels at weak instrument settings and/or outcome probabilities ≤ 0.10. The jack-knife estimator was the least biased resampling method, the HC estimator often failed at outcome probabilities ≤ 0.50 and overall the delta method estimators were the least biased. In the presence of between-study heterogeneity, the delta method before meta-analysis performed best. Using a one-stage meta-analysis all methods performed equally well and better than two-stage meta-analysis of greater or equal size. Conclusions: In the presence of between-study heterogeneity, two-stage meta-analyses should preferentially use the delta method before meta-analysis. Weak instrument bias can be reduced by performing a one-stage meta-analysis. PMID:27591262
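
    The "delta method before pooling" strategy amounts to attaching a first-order delta-method standard error to each study's ratio (Wald) IV estimate and then combining the studies by inverse-variance weighting. A simplified sketch with made-up summary statistics, ignoring the covariance between the numerator and denominator estimates:

```python
import numpy as np

def wald_with_delta_se(beta_zx, se_zx, beta_zy, se_zy):
    """Ratio (Wald) IV estimate with a first-order delta-method standard error.

    Var(beta_zy / beta_zx) ~= se_zy**2 / beta_zx**2 + beta_zy**2 * se_zx**2 / beta_zx**4,
    treating the numerator and denominator estimates as independent.
    """
    est = beta_zy / beta_zx
    var = se_zy**2 / beta_zx**2 + beta_zy**2 * se_zx**2 / beta_zx**4
    return est, np.sqrt(var)

# Hypothetical per-study summaries: (instrument-exposure beta, its SE, instrument-outcome beta, its SE)
studies = [
    (0.30, 0.03, 0.060, 0.015),
    (0.25, 0.04, 0.045, 0.020),
    (0.35, 0.05, 0.080, 0.025),
]

ests, ses = map(np.array, zip(*(wald_with_delta_se(*s) for s in studies)))
weights = 1 / ses**2                                   # inverse-variance weights
pooled = np.sum(weights * ests) / np.sum(weights)      # fixed-effect two-stage meta-analysis
pooled_se = 1 / np.sqrt(np.sum(weights))
print(f"pooled IV estimate {pooled:.3f} (SE {pooled_se:.3f})")
```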

  31. The contextual effects of social capital on health: a cross-national instrumental variable analysis.

    PubMed

    Kim, Daniel; Baum, Christopher F; Ganz, Michael L; Subramanian, S V; Kawachi, Ichiro

    2011-12-01

    Past research on the associations between area-level/contextual social capital and health has produced conflicting evidence. However, interpreting this rapidly growing literature is difficult because estimates using conventional regression are prone to major sources of bias including residual confounding and reverse causation. Instrumental variable (IV) analysis can reduce such bias. Using data on up to 167,344 adults in 64 nations in the European and World Values Surveys and applying IV and ordinary least squares (OLS) regression, we estimated the contextual effects of country-level social trust on individual self-rated health. We further explored whether these associations varied by gender and individual levels of trust. Using OLS regression, we found higher average country-level trust to be associated with better self-rated health in both women and men. Instrumental variable analysis yielded qualitatively similar results, although the estimates were more than double in size in both sexes when country population density and corruption were used as instruments. The estimated health effects of raising the percentage of a country's population that trusts others by 10 percentage points were at least as large as the estimated health effects of an individual developing trust in others. These findings were robust to alternative model specifications and instruments. Conventional regression and to a lesser extent IV analysis suggested that these associations are more salient in women and in women reporting social trust. In a large cross-national study, our findings, including those using instrumental variables, support the presence of beneficial effects of higher country-level trust on self-rated health. Previous findings for contextual social capital using traditional regression may have underestimated the true associations. Given the close linkages between self-rated health and all-cause mortality, the public health gains from raising social capital within and across countries may be large.

  32. The contextual effects of social capital on health: a cross-national instrumental variable analysis

    PubMed Central

    Kim, Daniel; Baum, Christopher F; Ganz, Michael; Subramanian, S V; Kawachi, Ichiro

    2011-01-01

    Past observational studies of the associations of area-level/contextual social capital with health have revealed conflicting findings. However, interpreting this rapidly growing literature is difficult because estimates using conventional regression are prone to major sources of bias including residual confounding and reverse causation. Instrumental variable (IV) analysis can reduce such bias. Using data on up to 167 344 adults in 64 nations in the European and World Values Surveys and applying IV and ordinary least squares (OLS) regression, we estimated the contextual effects of country-level social trust on individual self-rated health. We further explored whether these associations varied by gender and individual levels of trust. Using OLS regression, we found higher average country-level trust to be associated with better self-rated health in both women and men. Instrumental variable analysis yielded qualitatively similar results, although the estimates were more than double in size in women and men using country population density and corruption as instruments. The estimated health effects of raising the percentage of a country's population that trusts others by 10 percentage points were at least as large as the estimated health effects of an individual developing trust in others. These findings were robust to alternative model specifications and instruments. Conventional regression and to a lesser extent IV analysis suggested that these associations are more salient in women and in women reporting social trust. In a large cross-national study, our findings, including those using instrumental variables, support the presence of beneficial effects of higher country-level trust on self-rated health. Past findings for contextual social capital using traditional regression may have underestimated the true associations. Given the close linkages between self-rated health and all-cause mortality, the public health gains from raising social capital within countries may be large. PMID:22078106

  33. Small Family, Smart Family? Family Size and the IQ Scores of Young Men

    ERIC Educational Resources Information Center

    Black, Sandra E.; Devereux, Paul J.; Salvanes, Kjell G.

    2010-01-01

    This paper uses Norwegian data to estimate the effect of family size on IQ scores of men. Instrumental variables (IV) estimates using sex composition as an instrument show no significant negative effect of family size; however, IV estimates using twins imply that family size has a negative effect on IQ scores. Our results suggest that the effect…

  34. Health insurance and the demand for medical care: Instrumental variable estimates using health insurer claims data.

    PubMed

    Dunn, Abe

    2016-07-01

    This paper takes a different approach to estimating demand for medical care that uses the negotiated prices between insurers and providers as an instrument. The instrument is viewed as a textbook "cost shifting" instrument that impacts plan offerings, but is unobserved by consumers. The paper finds a price elasticity of demand of around -0.20, matching the elasticity found in the RAND Health Insurance Experiment. The paper also studies within-market variation in demand for prescription drugs and other medical care services and obtains comparable price elasticity estimates.

  35. Econometrics in outcomes research: the use of instrumental variables.

    PubMed

    Newhouse, J P; McClellan, M

    1998-01-01

    We describe an econometric technique, instrumental variables, that can be useful in estimating the effectiveness of clinical treatments in situations when a controlled trial has not or cannot be done. This technique relies upon the existence of one or more variables that induce substantial variation in the treatment variable but have no direct effect on the outcome variable of interest. We illustrate the use of the technique with an application to aggressive treatment of acute myocardial infarction in the elderly.

  36. Genetic instrumental variable regression: Explaining socioeconomic and health outcomes in nonexperimental data

    PubMed Central

    DiPrete, Thomas A.; Burik, Casper A. P.; Koellinger, Philipp D.

    2018-01-01

    Identifying causal effects in nonexperimental data is an enduring challenge. One proposed solution that recently gained popularity is the idea to use genes as instrumental variables [i.e., Mendelian randomization (MR)]. However, this approach is problematic because many variables of interest are genetically correlated, which implies the possibility that many genes could affect both the exposure and the outcome directly or via unobserved confounding factors. Thus, pleiotropic effects of genes are themselves a source of bias in nonexperimental data that would also undermine the ability of MR to correct for endogeneity bias from nongenetic sources. Here, we propose an alternative approach, genetic instrumental variable (GIV) regression, that provides estimates for the effect of an exposure on an outcome in the presence of pleiotropy. As a valuable byproduct, GIV regression also provides accurate estimates of the chip heritability of the outcome variable. GIV regression uses polygenic scores (PGSs) for the outcome of interest which can be constructed from genome-wide association study (GWAS) results. By splitting the GWAS sample for the outcome into nonoverlapping subsamples, we obtain multiple indicators of the outcome PGSs that can be used as instruments for each other and, in combination with other methods such as sibling fixed effects, can address endogeneity bias from both pleiotropy and the environment. In two empirical applications, we demonstrate that our approach produces reasonable estimates of the chip heritability of educational attainment (EA) and show that standard regression and MR provide upwardly biased estimates of the effect of body height on EA. PMID:29686100
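
    A stylized sketch of the split-sample idea follows: two polygenic scores built from non-overlapping GWAS subsamples measure the same genetic propensity with independent noise, so one can serve as an instrument for the other in a just-identified 2SLS regression, removing the attenuation that otherwise leaks into the exposure coefficient. This toy example is ours and omits most of the GIV machinery (conditioning on the exposure PGS, covariates, sibling fixed effects).

```python
import numpy as np

rng = np.random.default_rng(7)
n = 200_000

g = rng.normal(size=n)                         # true (unobserved) genetic propensity for the outcome
exposure = 0.5 * g + rng.normal(size=n)        # exposure genetically correlated with the outcome (pleiotropy)
y = 0.2 * exposure + 1.0 * g + rng.normal(size=n)   # true exposure effect = 0.2

pgs_a = g + rng.normal(size=n)                 # PGS built from one half of the outcome GWAS
pgs_b = g + rng.normal(size=n)                 # PGS built from the other, non-overlapping half

X = np.column_stack([np.ones(n), exposure, pgs_a])   # regressors; pgs_a is an error-prone control
Z = np.column_stack([np.ones(n), exposure, pgs_b])   # instruments; pgs_b instruments pgs_a

ols = np.linalg.solve(X.T @ X, X.T @ y)        # controlling for a single noisy PGS: residual bias
giv = np.linalg.solve(Z.T @ X, Z.T @ y)        # just-identified 2SLS using the second PGS
print(f"exposure effect -- OLS with one PGS: {ols[1]:.2f}, PGS-instrumented: {giv[1]:.2f}, truth: 0.20")
```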

  37. Genetic instrumental variable regression: Explaining socioeconomic and health outcomes in nonexperimental data.

    PubMed

    DiPrete, Thomas A; Burik, Casper A P; Koellinger, Philipp D

    2018-05-29

    Identifying causal effects in nonexperimental data is an enduring challenge. One proposed solution that recently gained popularity is the idea to use genes as instrumental variables [i.e., Mendelian randomization (MR)]. However, this approach is problematic because many variables of interest are genetically correlated, which implies the possibility that many genes could affect both the exposure and the outcome directly or via unobserved confounding factors. Thus, pleiotropic effects of genes are themselves a source of bias in nonexperimental data that would also undermine the ability of MR to correct for endogeneity bias from nongenetic sources. Here, we propose an alternative approach, genetic instrumental variable (GIV) regression, that provides estimates for the effect of an exposure on an outcome in the presence of pleiotropy. As a valuable byproduct, GIV regression also provides accurate estimates of the chip heritability of the outcome variable. GIV regression uses polygenic scores (PGSs) for the outcome of interest which can be constructed from genome-wide association study (GWAS) results. By splitting the GWAS sample for the outcome into nonoverlapping subsamples, we obtain multiple indicators of the outcome PGSs that can be used as instruments for each other and, in combination with other methods such as sibling fixed effects, can address endogeneity bias from both pleiotropy and the environment. In two empirical applications, we demonstrate that our approach produces reasonable estimates of the chip heritability of educational attainment (EA) and show that standard regression and MR provide upwardly biased estimates of the effect of body height on EA.

  38. The Paired Availability Design and Related Instrumental Variable Meta-analyses

    Cancer.gov

    Baker, Stuart G.

    2017

    This software computes meta-analysis and extrapolation estimates for an instrumental variable meta-analysis of randomized trials or before-and-after studies (the latter also known as the paired availability design). The software also checks on the assumptions if sufficient data are available.

  39. Posthospitalization home health care use and changes in functional status in a Medicare population.

    PubMed

    Hadley, J; Rabin, D; Epstein, A; Stein, S; Rimes, C

    2000-05-01

    The objective of this work was to estimate the effect of Medicare beneficiaries' use of home health care (HHC) for 6 months after hospital discharge on the change in functional status over a 1-year period beginning before hospitalization. Data came from the Medicare Current Beneficiary Survey, which is a nationally representative sample of Medicare beneficiaries, in-person interview data, and Medicare claims for 1991 through 1994 for 2,127 nondisabled, community-dwelling, elderly Medicare beneficiaries who were hospitalized within 6 months of their annual in-person interviews. Econometric estimation with the instrumental variable method was used to correct for observational data bias, ie, the nonrandom allocation of discharged beneficiaries to the use of posthospitalization HHC. The analysis estimates a first-stage model of HHC use from which an instrumental variable estimate is constructed to estimate the effect on change in functional status. The instrumental variable estimates suggest that HHC users experienced greater improvements in functional status than nonusers as measured by the change in a continuous scale based on the number and mix of activities of daily living and instrumental activities of daily living before and after hospitalization. The estimated improvement in functional status could be as large as 13% for a 10% increase in HHC use. In contrast, estimation with the observational data on HHC use implies that HHC users had poorer health outcomes. Adjusting for potential observational data bias is critical to obtaining estimates of the relationship between the use of posthospitalization HHC and the change in health before and after hospitalization. After adjustment, the results suggest that efforts to constrain Medicare's spending for HHC, as required by the Balanced Budget Act of 1997, may lead to poorer health outcomes for some beneficiaries.

  40. Using instrumental variables to estimate a Cox's proportional hazards regression subject to additive confounding

    PubMed Central

    Tosteson, Tor D.; Morden, Nancy E.; Stukel, Therese A.; O'Malley, A. James

    2014-01-01

    The estimation of treatment effects is one of the primary goals of statistics in medicine. Estimation based on observational studies is subject to confounding. Statistical methods for controlling bias due to confounding include regression adjustment, propensity scores and inverse probability weighted estimators. These methods require that all confounders are recorded in the data. The method of instrumental variables (IVs) can eliminate bias in observational studies even in the absence of information on confounders. We propose a method for integrating IVs within the framework of Cox's proportional hazards model and demonstrate the conditions under which it recovers the causal effect of treatment. The methodology is based on the approximate orthogonality of an instrument with unobserved confounders among those at risk. We derive an estimator as the solution to an estimating equation that resembles the score equation of the partial likelihood in much the same way as the traditional IV estimator resembles the normal equations. To justify this IV estimator for a Cox model we perform simulations to evaluate its operating characteristics. Finally, we apply the estimator to an observational study of the effect of coronary catheterization on survival. PMID:25506259

  41. Using instrumental variables to estimate a Cox's proportional hazards regression subject to additive confounding.

    PubMed

    MacKenzie, Todd A; Tosteson, Tor D; Morden, Nancy E; Stukel, Therese A; O'Malley, A James

    2014-06-01

    The estimation of treatment effects is one of the primary goals of statistics in medicine. Estimation based on observational studies is subject to confounding. Statistical methods for controlling bias due to confounding include regression adjustment, propensity scores and inverse probability weighted estimators. These methods require that all confounders are recorded in the data. The method of instrumental variables (IVs) can eliminate bias in observational studies even in the absence of information on confounders. We propose a method for integrating IVs within the framework of Cox's proportional hazards model and demonstrate the conditions under which it recovers the causal effect of treatment. The methodology is based on the approximate orthogonality of an instrument with unobserved confounders among those at risk. We derive an estimator as the solution to an estimating equation that resembles the score equation of the partial likelihood in much the same way as the traditional IV estimator resembles the normal equations. To justify this IV estimator for a Cox model we perform simulations to evaluate its operating characteristics. Finally, we apply the estimator to an observational study of the effect of coronary catheterization on survival.

  2. Evaluating disease management programme effectiveness: an introduction to instrumental variables.

    PubMed

    Linden, Ariel; Adams, John L

    2006-04-01

    This paper introduces the concept of instrumental variables (IVs) as a means of providing an unbiased estimate of treatment effects in evaluating disease management (DM) programme effectiveness. Model development is described using zip codes as the IV. Three diabetes DM outcomes were evaluated: annual diabetes costs, emergency department (ED) visits and hospital days. Both ordinary least squares (OLS) and IV estimates showed a significant treatment effect for diabetes costs (P = 0.011) but neither model produced a significant treatment effect for ED visits. However, the IV estimate showed a significant treatment effect for hospital days (P = 0.006) whereas the OLS model did not. These results illustrate the utility of IV estimation when the OLS model is sensitive to the confounding effect of hidden bias.
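
    A sketch of how a categorical instrument such as zip code enters a two-stage analysis: with zip-code indicators as instruments, the first-stage fitted value for each patient is simply the area-level enrolment rate, so the IV estimate is driven by between-area variation. All data and names (zip_id, enrolled, cost) are synthetic assumptions, not the programme's data:

        import numpy as np

        rng = np.random.default_rng(1)
        n_areas, n_per = 200, 500
        n = n_areas * n_per
        zip_id = np.repeat(np.arange(n_areas), n_per)
        area_rate = rng.uniform(0.2, 0.8, n_areas)              # DM enrolment propensity varies by zip code
        severity = rng.normal(size=n)                           # unmeasured severity
        p_enrol = np.clip(area_rate[zip_id] + 0.10 * severity, 0.01, 0.99)
        enrolled = rng.binomial(1, p_enrol).astype(float)
        cost = 1000.0 - 150.0 * enrolled + 300.0 * severity + rng.normal(0.0, 100.0, n)   # true effect = -150

        # Naive OLS: confounded because sicker patients are more likely to enrol
        X = np.column_stack([np.ones(n), enrolled])
        print("OLS:", np.linalg.lstsq(X, cost, rcond=None)[0][1])         # attenuated toward zero

        # 2SLS with zip-code dummies as instruments: stage-1 fitted values are the zip-level means
        enrol_hat = (np.bincount(zip_id, weights=enrolled) / n_per)[zip_id]
        X2 = np.column_stack([np.ones(n), enrol_hat])
        print("IV :", np.linalg.lstsq(X2, cost, rcond=None)[0][1])        # close to -150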

  3. Sources of Biased Inference in Alcohol and Drug Services Research: An Instrumental Variable Approach

    PubMed Central

    Schmidt, Laura A.; Tam, Tammy W.; Larson, Mary Jo

    2012-01-01

    Objective: This study examined the potential for biased inference due to endogeneity when using standard approaches for modeling the utilization of alcohol and drug treatment. Method: Results from standard regression analysis were compared with those that controlled for endogeneity using instrumental variables estimation. Comparable models predicted the likelihood of receiving alcohol treatment based on the widely used Aday and Andersen medical care–seeking model. Data were from the National Epidemiologic Survey on Alcohol and Related Conditions and included a representative sample of adults in households and group quarters throughout the contiguous United States. Results: Findings suggested that standard approaches for modeling treatment utilization are prone to bias because of uncontrolled reverse causation and omitted variables. Compared with instrumental variables estimation, standard regression analyses produced downwardly biased estimates of the impact of alcohol problem severity on the likelihood of receiving care. Conclusions: Standard approaches for modeling service utilization are prone to underestimating the true effects of problem severity on service use. Biased inference could lead to inaccurate policy recommendations, for example, by suggesting that people with milder forms of substance use disorder are more likely to receive care than is actually the case. PMID:22152672

  4. Evaluation of Validity and Reliability for Hierarchical Scales Using Latent Variable Modeling

    ERIC Educational Resources Information Center

    Raykov, Tenko; Marcoulides, George A.

    2012-01-01

    A latent variable modeling method is outlined, which accomplishes estimation of criterion validity and reliability for a multicomponent measuring instrument with hierarchical structure. The approach provides point and interval estimates for the scale criterion validity and reliability coefficients, and can also be used for testing composite or…

  5. Specifying and Refining a Complex Measurement Model.

    ERIC Educational Resources Information Center

    Levy, Roy; Mislevy, Robert J.

    This paper aims to describe a Bayesian approach to modeling and estimating cognitive models both in terms of statistical machinery and actual instrument development. Such a method taps the knowledge of experts to provide initial estimates for the probabilistic relationships among the variables in a multivariate latent variable model and refines…

  6. Job demands and job strain as risk factors for employee wellbeing in elderly care: an instrumental-variables analysis.

    PubMed

    Elovainio, Marko; Heponiemi, Tarja; Kuusio, Hannamaria; Jokela, Markus; Aalto, Anna-Mari; Pekkarinen, Laura; Noro, Anja; Finne-Soveri, Harriet; Kivimäki, Mika; Sinervo, Timo

    2015-02-01

    The association between psychosocial work environment and employee wellbeing has repeatedly been shown. However, as environmental evaluations have typically been self-reported, the observed associations may be attributable to reporting bias. Applying instrumental-variable regression, we used staffing level (the ratio of staff to residents) as an unconfounded instrument for self-reported job demands and job strain to predict various indicators of wellbeing (perceived stress, psychological distress and sleeping problems) among 1525 registered nurses, practical nurses and nursing assistants working in elderly care wards. In ordinary regression, higher self-reported job demands and job strain were associated with increased risk of perceived stress, psychological distress and sleeping problems. The effect estimates for the associations of these psychosocial factors with perceived stress and psychological distress were greater, but less precisely estimated, in an instrumental-variables analysis which took into account only the variation in self-reported job demands and job strain that was explained by staffing level. No association between psychosocial factors and sleeping problems was observed with the instrumental-variable analysis. These results support a causal interpretation of high self-reported job demands and job strain being risk factors for employee wellbeing. © The Author 2014. Published by Oxford University Press on behalf of the European Public Health Association. All rights reserved.
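
    With a single instrument such as staffing level, the instrumental-variables logic can be seen compactly as a Wald-type ratio: the reduced-form association of the instrument with wellbeing divided by its first-stage association with self-reported demands. The sketch below uses simulated data and invented coefficients, not the study's:

        import numpy as np

        rng = np.random.default_rng(2)
        n = 1525
        staffing = rng.normal(size=n)                    # instrument: staff-to-resident ratio
        report_bias = rng.normal(size=n)                 # reporting tendency (confounds self-reports and outcomes)
        demands = -0.6 * staffing + 0.8 * report_bias + rng.normal(size=n)
        distress = 0.3 * demands + 0.8 * report_bias + rng.normal(size=n)   # true effect = 0.3

        first_stage = np.cov(staffing, demands)[0, 1] / np.var(staffing)
        reduced_form = np.cov(staffing, distress)[0, 1] / np.var(staffing)
        print("OLS :", np.cov(demands, distress)[0, 1] / np.var(demands))   # inflated by reporting bias
        print("Wald:", reduced_form / first_stage)                          # approximately 0.3, less precise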

  7. Percentage Energy from Fat Screener: Overview

    Cancer.gov

    A short assessment instrument to estimate an individual's usual intake of percentage energy from fat. The foods asked about on the instrument were selected because they were the most important predictors of variability in percentage energy from fat.

  8. Bias correction by use of errors-in-variables regression models in studies with K-X-ray fluorescence bone lead measurements.

    PubMed

    Lamadrid-Figueroa, Héctor; Téllez-Rojo, Martha M; Angeles, Gustavo; Hernández-Ávila, Mauricio; Hu, Howard

    2011-01-01

    In-vivo measurement of bone lead by means of K-X-ray fluorescence (KXRF) is the preferred biological marker of chronic exposure to lead. Unfortunately, considerable measurement error associated with KXRF estimations can introduce bias in estimates of the effect of bone lead when this variable is included as the exposure in a regression model. Estimates of uncertainty reported by the KXRF instrument reflect the variance of the measurement error and, although they can be used to correct the measurement error bias, they are seldom used in epidemiological statistical analyses. Errors-in-variables regression (EIV) allows for correction of bias caused by measurement error in predictor variables, based on the knowledge of the reliability of such variables. The authors propose a way to obtain reliability coefficients for bone lead measurements from uncertainty data reported by the KXRF instrument and compare, by the use of Monte Carlo simulations, results obtained using EIV regression models vs. those obtained by the standard procedures. Results of the simulations show that Ordinary Least Squares (OLS) regression models provide severely biased estimates of effect, and that EIV provides nearly unbiased estimates. Although EIV effect estimates are more imprecise, their mean squared error is much smaller than that of OLS estimates. In conclusion, EIV is a better alternative than OLS to estimate the effect of bone lead when measured by KXRF. Copyright © 2010 Elsevier Inc. All rights reserved.
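
    A minimal illustration of the reliability-based correction described above: the naive OLS slope is attenuated by the reliability ratio, and dividing by an estimate of reliability (obtained here from an assumed known measurement-error variance, in the spirit of the KXRF-reported uncertainty) undoes the attenuation. All numbers are synthetic, not from the study:

        import numpy as np

        rng = np.random.default_rng(3)
        n = 5000
        true_pb = rng.normal(10.0, 5.0, n)                  # latent bone lead
        meas_sd = 4.0                                        # assumed instrument-reported measurement uncertainty
        obs_pb = true_pb + rng.normal(0.0, meas_sd, n)       # noisy KXRF-style measurement
        outcome = 2.0 + 0.30 * true_pb + rng.normal(0.0, 3.0, n)   # true slope = 0.30

        X = np.column_stack([np.ones(n), obs_pb])
        b_naive = np.linalg.lstsq(X, outcome, rcond=None)[0][1]     # attenuated toward zero

        reliability = 1.0 - meas_sd**2 / obs_pb.var()                # estimated var(true) / var(observed)
        print("naive OLS slope:", b_naive)                           # about 0.18
        print("corrected slope:", b_naive / reliability)             # about 0.30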

  9. Is it feasible to estimate radiosonde biases from interlaced measurements?

    NASA Astrophysics Data System (ADS)

    Kremser, Stefanie; Tradowsky, Jordis S.; Rust, Henning W.; Bodeker, Greg E.

    2018-05-01

    Upper-air measurements of essential climate variables (ECVs), such as temperature, are crucial for climate monitoring and climate change detection. Because of the internal variability of the climate system, many decades of measurements are typically required to robustly detect any trend in the climate data record. It is imperative for the records to be temporally homogeneous over many decades to confidently estimate any trend. Historically, records of upper-air measurements were primarily made for short-term weather forecasts and as such are seldom suitable for studying long-term climate change as they lack the required continuity and homogeneity. Recognizing this, the Global Climate Observing System (GCOS) Reference Upper-Air Network (GRUAN) has been established to provide reference-quality measurements of climate variables, such as temperature, pressure, and humidity, together with well-characterized and traceable estimates of the measurement uncertainty. To ensure that GRUAN data products are suitable to detect climate change, a scientifically robust instrument replacement strategy must always be adopted whenever there is a change in instrumentation. By fully characterizing any systematic differences between the old and new measurement system a temporally homogeneous data series can be created. One strategy is to operate both the old and new instruments in tandem for some overlap period to characterize any inter-instrument biases. However, this strategy can be prohibitively expensive at measurement sites operated by national weather services or research institutes. An alternative strategy that has been proposed is to alternate between the old and new instruments, so-called interlacing, and then statistically derive the systematic biases between the two instruments. Here we investigate the feasibility of such an approach specifically for radiosondes, i.e. flying the old and new instruments on alternating days. Synthetic data sets are used to explore the applicability of this statistical approach to radiosonde change management.

  10. Treatment Effect Estimation Using Nonlinear Two-Stage Instrumental Variable Estimators: Another Cautionary Note.

    PubMed

    Chapman, Cole G; Brooks, John M

    2016-12-01

    To examine the settings of simulation evidence supporting use of nonlinear two-stage residual inclusion (2SRI) instrumental variable (IV) methods for estimating average treatment effects (ATE) using observational data and investigate potential bias of 2SRI across alternative scenarios of essential heterogeneity and uniqueness of marginal patients. Potential bias of linear and nonlinear IV methods for ATE and local average treatment effects (LATE) is assessed using simulation models with a binary outcome and binary endogenous treatment across settings varying by the relationship between treatment effectiveness and treatment choice. Results show that nonlinear 2SRI models produce estimates of ATE and LATE that are substantially biased when the relationships between treatment and outcome for marginal patients are unique from relationships for the full population. Bias of linear IV estimates for LATE was low across all scenarios. Researchers are increasingly opting for nonlinear 2SRI to estimate treatment effects in models with binary and otherwise inherently nonlinear dependent variables, believing that it produces generally unbiased and consistent estimates. This research shows that positive properties of nonlinear 2SRI rely on assumptions about the relationships between treatment effect heterogeneity and choice. © Health Research and Educational Trust.
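
    For readers unfamiliar with the estimator under discussion, a bare-bones two-stage residual inclusion (2SRI) sketch with a binary outcome is shown below, on simulated data and assuming statsmodels is available. As the abstract cautions, the second-stage coefficient need not equal the causal effect when treatment effects are heterogeneous:

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(4)
        n = 20000
        z = rng.normal(size=n)                        # instrument
        u = rng.normal(size=n)                        # unmeasured confounder
        treat = (0.7 * z + 0.7 * u + rng.normal(size=n) > 0).astype(float)
        p = 1.0 / (1.0 + np.exp(-(-0.5 + 0.8 * treat + 0.8 * u)))
        y = rng.binomial(1, p)

        # Stage 1: regress treatment on the instrument; keep the residual
        stage1 = sm.OLS(treat, sm.add_constant(z)).fit()
        resid = treat - stage1.fittedvalues

        # Stage 2: nonlinear (logit) outcome model that includes the first-stage residual
        X2 = sm.add_constant(np.column_stack([treat, resid]))
        stage2 = sm.Logit(y, X2).fit(disp=0)
        print(stage2.params)   # second entry is the 2SRI treatment coefficient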

  11. A Direct Latent Variable Modeling Based Method for Point and Interval Estimation of Coefficient Alpha

    ERIC Educational Resources Information Center

    Raykov, Tenko; Marcoulides, George A.

    2015-01-01

    A direct approach to point and interval estimation of Cronbach's coefficient alpha for multiple component measuring instruments is outlined. The procedure is based on a latent variable modeling application with widely circulated software. As a by-product, using sample data the method permits ascertaining whether the population discrepancy…

  12. Association of Body Mass Index with Depression, Anxiety and Suicide—An Instrumental Variable Analysis of the HUNT Study

    PubMed Central

    Bjørngaard, Johan Håkon; Carslake, David; Lund Nilsen, Tom Ivar; Linthorst, Astrid C. E.; Davey Smith, George; Gunnell, David; Romundstad, Pål Richard

    2015-01-01

    Objective: While high body mass index is associated with an increased risk of depression and anxiety, cumulative evidence indicates that it is a protective factor for suicide. The associations from conventional observational studies of body mass index with mental health outcomes are likely to be influenced by reverse causality or confounding by ill-health. In the present study, we investigated the associations between offspring body mass index and parental anxiety, depression and suicide in order to avoid problems with reverse causality and confounding by ill-health. Methods: We used data from 32,457 mother-offspring and 27,753 father-offspring pairs from the Norwegian HUNT-study. Anxiety and depression were assessed using the Hospital Anxiety and Depression Scale and suicide death from national registers. Associations between offspring and own body mass index and symptoms of anxiety and depression and suicide mortality were estimated using logistic and Cox regression. Causal effects were estimated with a two-sample instrumental variable approach using offspring body mass index as an instrument for parental body mass index. Results: Both own and offspring body mass index were positively associated with depression, while the results did not indicate any substantial association between body mass index and anxiety. Although precision was low, suicide mortality was inversely associated with own body mass index and the results from the analysis using offspring body mass index supported these results. Adjusted odds ratios per standard deviation body mass index from the instrumental variable analysis were 1.22 (95% CI: 1.05, 1.43) for depression, 1.10 (95% CI: 0.95, 1.27) for anxiety, and the instrumental variable estimate of the hazard ratio for suicide was 0.69 (95% CI: 0.30, 1.63). Conclusion: The present study’s results indicate that suicide mortality is inversely associated with body mass index. We also found support for a positive association between body mass index and depression, but not for anxiety. PMID:26167892

  13. Association of Body Mass Index with Depression, Anxiety and Suicide-An Instrumental Variable Analysis of the HUNT Study.

    PubMed

    Bjørngaard, Johan Håkon; Carslake, David; Lund Nilsen, Tom Ivar; Linthorst, Astrid C E; Davey Smith, George; Gunnell, David; Romundstad, Pål Richard

    2015-01-01

    While high body mass index is associated with an increased risk of depression and anxiety, cumulative evidence indicates that it is a protective factor for suicide. The associations from conventional observational studies of body mass index with mental health outcomes are likely to be influenced by reverse causality or confounding by ill-health. In the present study, we investigated the associations between offspring body mass index and parental anxiety, depression and suicide in order to avoid problems with reverse causality and confounding by ill-health. We used data from 32,457 mother-offspring and 27,753 father-offspring pairs from the Norwegian HUNT-study. Anxiety and depression were assessed using the Hospital Anxiety and Depression Scale and suicide death from national registers. Associations between offspring and own body mass index and symptoms of anxiety and depression and suicide mortality were estimated using logistic and Cox regression. Causal effects were estimated with a two-sample instrumental variable approach using offspring body mass index as an instrument for parental body mass index. Both own and offspring body mass index were positively associated with depression, while the results did not indicate any substantial association between body mass index and anxiety. Although precision was low, suicide mortality was inversely associated with own body mass index and the results from the analysis using offspring body mass index supported these results. Adjusted odds ratios per standard deviation body mass index from the instrumental variable analysis were 1.22 (95% CI: 1.05, 1.43) for depression, 1.10 (95% CI: 0.95, 1.27) for anxiety, and the instrumental variable estimate of the hazard ratio for suicide was 0.69 (95% CI: 0.30, 1.63). The present study's results indicate that suicide mortality is inversely associated with body mass index. We also found support for a positive association between body mass index and depression, but not for anxiety.

  14. Regression calibration for models with two predictor variables measured with error and their interaction, using instrumental variables and longitudinal data.

    PubMed

    Strand, Matthew; Sillau, Stefan; Grunwald, Gary K; Rabinovitch, Nathan

    2014-02-10

    Regression calibration provides a way to obtain unbiased estimators of fixed effects in regression models when one or more predictors are measured with error. Recent development of measurement error methods has focused on models that include interaction terms between measured-with-error predictors, and separately, methods for estimation in models that account for correlated data. In this work, we derive explicit and novel forms of regression calibration estimators and associated asymptotic variances for longitudinal models that include interaction terms, when data from instrumental and unbiased surrogate variables are available but not the actual predictors of interest. The longitudinal data are fit using linear mixed models that contain random intercepts and account for serial correlation and unequally spaced observations. The motivating application involves a longitudinal study of exposure to two pollutants (predictors) - outdoor fine particulate matter and cigarette smoke - and their association in interactive form with levels of a biomarker of inflammation, leukotriene E4 (LTE4, outcome) in asthmatic children. Because the exposure concentrations could not be directly observed, we used measurements from a fixed outdoor monitor and urinary cotinine concentrations as instrumental variables, and we used concentrations of fine ambient particulate matter and cigarette smoke measured with error by personal monitors as unbiased surrogate variables. We applied the derived regression calibration methods to estimate coefficients of the unobserved predictors and their interaction, allowing for direct comparison of toxicity of the different pollutants. We used simulations to verify accuracy of inferential methods based on asymptotic theory. Copyright © 2013 John Wiley & Sons, Ltd.

  15. What is the effect of area size when using local area practice style as an instrument?

    PubMed

    Brooks, John M; Tang, Yuexin; Chapman, Cole G; Cook, Elizabeth A; Chrischilles, Elizabeth A

    2013-08-01

    Discuss the tradeoffs inherent in choosing a local area size when using a measure of local area practice style as an instrument in instrumental variable estimation when assessing treatment effectiveness. Assess the effectiveness of angiotensin converting-enzyme inhibitors and angiotensin receptor blockers on survival after acute myocardial infarction for Medicare beneficiaries using practice style instruments based on different-sized local areas around patients. We contrasted treatment effect estimates using different local area sizes in terms of the strength of the relationship between local area practice styles and individual patient treatment choices; and indirect assessments of the assumption violations. Using smaller local areas to measure practice styles exploits more treatment variation and results in smaller standard errors. However, if treatment effects are heterogeneous, the use of smaller local areas may increase the risk that local practice style measures are dominated by differences in average treatment effectiveness across areas and bias results toward greater effectiveness. Local area practice style measures can be useful instruments in instrumental variable analysis, but the use of smaller local area sizes to generate greater treatment variation may result in treatment effect estimates that are biased toward higher effectiveness. Assessment of whether ecological bias can be mitigated by changing local area size requires the use of outside data sources. Copyright © 2013 Elsevier Inc. All rights reserved.

  16. Too much ado about instrumental variable approach: is the cure worse than the disease?

    PubMed

    Baser, Onur

    2009-01-01

    To review the efficacy of instrumental variable (IV) models in addressing a variety of assumption violations to ensure standard ordinary least squares (OLS) estimates are consistent. IV models gained popularity in outcomes research because of their ability to consistently estimate the average causal effects even in the presence of unmeasured confounding. However, in order for this consistent estimation to be achieved, several conditions must hold. In this article, we provide an overview of the IV approach, examine possible tests to check the prerequisite conditions, and illustrate how weak instruments may produce inconsistent and inefficient results. We use two IVs and apply Shea's partial R-square method, the Anderson canonical correlation, and Cragg-Donald tests to check for weak instruments. Hall-Peixe tests are applied to see if any of these instruments are redundant in the analysis. A total of 14,952 asthma patients from the MarketScan Commercial Claims and Encounters Database were examined in this study. Patient health care was provided under a variety of fee-for-service, fully capitated, and partially capitated health plans, including preferred provider organizations, point of service plans, indemnity plans, and health maintenance organizations. We used controller-reliever copay ratio and physician practice/prescribing patterns as instruments. We demonstrated that the former was a weak and redundant instrument producing inconsistent and inefficient estimates of the effect of treatment. The results were worse than the results from standard regression analysis. Despite the obvious benefit of IV models, the method should not be used blindly. Several strong conditions are required for these models to work, and each of them should be tested. Otherwise, bias and precision of the results will be statistically worse than the results achieved by simply using standard OLS.
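
    A small sketch of the simplest first-stage diagnostic underlying the tests named above, reduced to the single-instrument, single-endogenous-regressor case: the partial R-squared and F statistic for the candidate instrument after the other exogenous variables are controlled. The Shea, Anderson, Cragg-Donald and Hall-Peixe statistics are more general, formal versions of this idea; the data below are synthetic and the instrument is deliberately weak:

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(5)
        n = 5000
        covar = rng.normal(size=n)                     # exogenous control variable
        z = rng.normal(size=n)                         # candidate instrument (deliberately weak)
        treat = 0.5 * covar + 0.02 * z + rng.normal(size=n)

        restricted = sm.OLS(treat, sm.add_constant(covar)).fit()
        full = sm.OLS(treat, sm.add_constant(np.column_stack([covar, z]))).fit()

        partial_r2 = (restricted.ssr - full.ssr) / restricted.ssr
        f_stat = (restricted.ssr - full.ssr) / (full.ssr / full.df_resid)
        print("first-stage partial R2:", partial_r2)
        print("first-stage F         :", f_stat)   # far below the usual rule-of-thumb of about 10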

  17. Estimating Causal Effects of Local Air Pollution on Daily Deaths: Effect of Low Levels.

    PubMed

    Schwartz, Joel; Bind, Marie-Abele; Koutrakis, Petros

    2017-01-01

    Although many time-series studies have established associations of daily pollution variations with daily deaths, there are fewer at low concentrations, or focused on locally generated pollution, which is becoming more important as regulations reduce regional transport. Causal modeling approaches are also lacking. We used causal modeling to estimate the impact of local air pollution on mortality at low concentrations. Using an instrumental variable approach, we developed an instrument for variations in local pollution concentrations that is unlikely to be correlated with other causes of death, and examined its association with daily deaths in the Boston, Massachusetts, area. We combined height of the planetary boundary layer and wind speed, which affect concentrations of local emissions, to develop the instrument for particulate matter ≤ 2.5 μm (PM2.5), black carbon (BC), or nitrogen dioxide (NO2) variations that were independent of year, month, and temperature. We also used Granger causality to assess whether omitted variable confounding existed. We estimated that an interquartile range increase in the instrument for local PM2.5 was associated with a 0.90% increase in daily deaths (95% CI: 0.25, 1.56). A similar result was found for BC, and a weaker association with NO2. The Granger test found no evidence of omitted variable confounding for the instrument. A separate test confirmed the instrument was not associated with mortality independent of pollution. Furthermore, the association remained when all days with PM2.5 concentrations > 30 μg/m3 were excluded from the analysis (0.84% increase in daily deaths; 95% CI: 0.19, 1.50). We conclude that there is a causal association of local air pollution with daily deaths at concentrations below U.S. EPA standards. The estimated attributable risk in Boston exceeded 1,800 deaths during the study period, indicating that important public health benefits can follow from further control efforts. Citation: Schwartz J, Bind MA, Koutrakis P. 2017. Estimating causal effects of local air pollution on daily deaths: effect of low levels. Environ Health Perspect 125:23-29; http://dx.doi.org/10.1289/EHP232.

  18. Bias estimation for the Landsat 8 operational land imager

    USGS Publications Warehouse

    Morfitt, Ron; Vanderwerff, Kelly

    2011-01-01

    The Operational Land Imager (OLI) is a pushbroom sensor that will be a part of the Landsat Data Continuity Mission (LDCM). This instrument is the latest in the line of Landsat imagers, and will continue to expand the archive of calibrated earth imagery. An important step in producing a calibrated image from instrument data is accurately accounting for the bias of the imaging detectors. Bias variability is one factor that contributes to error in bias estimation for OLI. Typically, the bias is simply estimated by averaging dark data on a per-detector basis. However, data acquired during OLI pre-launch testing exhibited bias variation that correlated well with the variation in concurrently collected data from a special set of detectors on the focal plane. These detectors are sensitive to certain electronic effects but not directly to incoming electromagnetic radiation. A method of using data from these special detectors to estimate the bias of the imaging detectors was developed, but found not to be beneficial at typical radiance levels as the detectors respond slightly when the focal plane is illuminated. In addition to bias variability, a systematic bias error is introduced by the truncation performed by the spacecraft of the 14-bit instrument data to 12-bit integers. This systematic error can be estimated and removed on average, but the per pixel quantization error remains. This paper describes the variability of the bias, the effectiveness of a new approach to estimate and compensate for it, as well as the errors due to truncation and how they are reduced.

  19. Instrumental Variable Estimates of the Labor Market Spillover Effects of Welfare Reform. Upjohn Institute Staff Working Paper.

    ERIC Educational Resources Information Center

    Bartik, Timothy J.

    The labor market spillover effects of welfare reform were estimated by using models that pool time-series and cross-section data from the Current Population Survey on the state-year cell means of wages, employment, and other labor market outcomes for various demographic groups. The labor market outcomes in question are dependent variables that are…

  20. Cesarean delivery rates among family physicians versus obstetricians: a population-based cohort study using instrumental variable methods

    PubMed Central

    Dawe, Russell Eric; Bishop, Jessica; Pendergast, Amanda; Avery, Susan; Monaghan, Kelly; Duggan, Norah; Aubrey-Bassler, Kris

    2017-01-01

    Background: Previous research suggests that family physicians have rates of cesarean delivery that are lower than or equivalent to those for obstetricians, but adjustments for risk differences in these analyses may have been inadequate. We used an econometric method to adjust for observed and unobserved factors affecting the risk of cesarean delivery among women attended by family physicians versus obstetricians. Methods: This retrospective population-based cohort study included all Canadian (except Quebec) hospital deliveries by family physicians and obstetricians between Apr. 1, 2006, and Mar. 31, 2009. We excluded women with multiple gestations, and newborns with a birth weight less than 500 g or gestational age less than 20 weeks. We estimated the relative risk of cesarean delivery using both instrumental-variable-adjusted regression and logistic regression. Results: The final cohort included 776 299 women who gave birth in 390 hospitals. The risk of cesarean delivery was 27.3%, and the mean proportion of deliveries by family physicians was 26.9% (standard deviation 23.8%). The relative risk of cesarean delivery for family physicians versus obstetricians was 0.48 (95% confidence interval [CI] 0.41-0.56) with logistic regression and 1.27 (95% CI 1.02-1.57) with instrumental-variable-adjusted regression. Interpretation: Our conventional analyses suggest that family physicians have a lower rate of cesarean delivery than obstetricians, but instrumental variable analyses suggest the opposite. Because instrumental variable methods adjust for unmeasured factors and traditional methods do not, the large discrepancy between these estimates of risk suggests that clinical and/or sociocultural factors affecting the decision to perform cesarean delivery may not be accounted for in our database. PMID:29233843

  1. Instrumental variable methods in comparative safety and effectiveness research.

    PubMed

    Brookhart, M Alan; Rassen, Jeremy A; Schneeweiss, Sebastian

    2010-06-01

    Instrumental variable (IV) methods have been proposed as a potential approach to the common problem of uncontrolled confounding in comparative studies of medical interventions, but IV methods are unfamiliar to many researchers. The goal of this article is to provide a non-technical, practical introduction to IV methods for comparative safety and effectiveness research. We outline the principles and basic assumptions necessary for valid IV estimation, discuss how to interpret the results of an IV study, provide a review of instruments that have been used in comparative effectiveness research, and suggest some minimal reporting standards for an IV analysis. Finally, we offer our perspective of the role of IV estimation vis-à-vis more traditional approaches based on statistical modeling of the exposure or outcome. We anticipate that IV methods will be often underpowered for drug safety studies of very rare outcomes, but may be potentially useful in studies of intended effects where uncontrolled confounding may be substantial.

  2. Does job insecurity deteriorate health?

    PubMed

    Caroli, Eve; Godard, Mathilde

    2016-02-01

    This paper estimates the causal effect of perceived job insecurity - that is, the fear of involuntary job loss - on health in a sample of men from 22 European countries. We rely on an original instrumental variable approach based on the idea that workers perceive greater job security in countries where employment is strongly protected by the law and more so if employed in industries where employment protection legislation is more binding; that is, in industries with a higher natural rate of dismissals. Using cross-country data from the 2010 European Working Conditions Survey, we show that, when the potential endogeneity of job insecurity is not accounted for, the latter appears to deteriorate almost all health outcomes. When tackling the endogeneity issue by estimating an instrumental variable model and dealing with potential weak-instrument issues, the health-damaging effect of job insecurity is confirmed for a limited subgroup of health outcomes; namely, suffering from headaches or eyestrain and skin problems. As for other health variables, the impact of job insecurity appears to be insignificant at conventional levels. Copyright © 2014 John Wiley & Sons, Ltd.

  3. Nonparametric instrumental regression with non-convex constraints

    NASA Astrophysics Data System (ADS)

    Grasmair, M.; Scherzer, O.; Vanhems, A.

    2013-03-01

    This paper considers the nonparametric regression model with an additive error that is dependent on the explanatory variables. As is common in empirical studies in epidemiology and economics, it also supposes that valid instrumental variables are observed. A classical example in microeconomics considers the consumer demand function as a function of the price of goods and the income, both variables often considered as endogenous. In this framework, the economic theory also imposes shape restrictions on the demand function, such as integrability conditions. Motivated by this illustration in microeconomics, we study an estimator of a nonparametric constrained regression function using instrumental variables by means of Tikhonov regularization. We derive rates of convergence for the regularized model both in a deterministic and stochastic setting under the assumption that the true regression function satisfies a projected source condition including, because of the non-convexity of the imposed constraints, an additional smallness condition.

  4. Estimating the effect of treatment rate changes when treatment benefits are heterogeneous: antibiotics and otitis media.

    PubMed

    Park, Tae-Ryong; Brooks, John M; Chrischilles, Elizabeth A; Bergus, George

    2008-01-01

    Contrast methods to assess the health effects of a treatment rate change when treatment benefits are heterogeneous across patients. Antibiotic prescribing for children with otitis media (OM) in Iowa Medicaid is the empirical example. Instrumental variable (IV) and linear probability model (LPM) are used to estimate the effect of antibiotic treatments on cure probabilities for children with OM in Iowa Medicaid. Local area physician supply per capita is the instrument in the IV models. Estimates are contrasted in terms of their ability to make inferences for patients whose treatment choices may be affected by a change in population treatment rates. The instrument was positively related to the probability of being prescribed an antibiotic. LPM estimates showed a positive effect of antibiotics on OM patient cure probability while IV estimates showed no relationship between antibiotics and patient cure probability. Linear probability model estimation yields the average effects of the treatment on patients that were treated. IV estimation yields the average effects for patients whose treatment choices were affected by the instrument. As antibiotic treatment effects are heterogeneous across OM patients, our estimates from these approaches are aligned with clinical evidence and theory. The average estimate for treated patients (higher severity) from the LPM model is greater than estimates for patients whose treatment choices are affected by the instrument (lower severity) from the IV models. Based on our IV estimates it appears that lowering antibiotic use in OM patients in Iowa Medicaid did not result in lost cures.
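
    The distinction drawn above (LPM-type estimates averaging over treated, higher-severity patients versus IV estimates averaging over the lower-severity patients moved by the instrument) can be reproduced in a toy simulation. To isolate the heterogeneity point, the sketch below has no baseline confounding: severity only changes the size of the antibiotic benefit. All numbers are invented, not the Iowa Medicaid data:

        import numpy as np

        rng = np.random.default_rng(6)
        n = 200_000
        severity = rng.uniform(size=n)
        supply = rng.binomial(1, 0.5, n)             # instrument: high local physician supply
        # severe cases are treated regardless; the instrument only moves mild cases
        treated = ((severity > 0.6) | ((supply == 1) & (severity < 0.3))).astype(float)
        benefit = 0.3 * severity                     # cure benefit grows with severity
        cure = rng.binomial(1, 0.4 + benefit * treated)

        att_like = cure[treated == 1].mean() - cure[treated == 0].mean()
        late = ((cure[supply == 1].mean() - cure[supply == 0].mean())
                / (treated[supply == 1].mean() - treated[supply == 0].mean()))
        print("treated-vs-untreated gap:", att_like)   # about 0.19, dominated by severe (always-treated) cases
        print("IV / Wald (LATE)        :", late)       # about 0.05, the mild marginal cases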

  5. College quality and hourly wages: evidence from the self-revelation model, sibling models and instrumental variables.

    PubMed

    Borgen, Nicolai T

    2014-11-01

    This paper addresses the recent discussion on confounding in the returns to college quality literature using the Norwegian case. The main advantage of studying Norway is the quality of the data. Norwegian administrative data provide information on college applications, family relations and a rich set of control variables for all Norwegian citizens applying to college between 1997 and 2004 (N = 141,319) and their succeeding wages between 2003 and 2010 (676,079 person-year observations). With these data, this paper uses a subset of the models that have rendered mixed findings in the literature in order to investigate to what extent confounding biases the returns to college quality. I compare estimates obtained using standard regression models to estimates obtained using the self-revelation model of Dale and Krueger (2002), a sibling fixed effects model and the instrumental variable model used by Long (2008). Using these methods, I consistently find increasing returns to college quality over the course of students' work careers, with positive returns only later in students' work careers. I conclude that the standard regression estimate provides a reasonable estimate of the returns to college quality. Copyright © 2014 Elsevier Inc. All rights reserved.

  6. Estimating Marginal Returns to Education. NBER Working Paper No. 16474

    ERIC Educational Resources Information Center

    Carneiro, Pedro; Heckman, James J.; Vytlacil, Edward J.

    2010-01-01

    This paper estimates the marginal returns to college for individuals induced to enroll in college by different marginal policy changes. The recent instrumental variables literature seeks to estimate this parameter, but in general it does so only under strong assumptions that are tested and found wanting. We show how to utilize economic theory and…

  7. Quantifying measurement uncertainty and spatial variability in the context of model evaluation

    NASA Astrophysics Data System (ADS)

    Choukulkar, A.; Brewer, A.; Pichugina, Y. L.; Bonin, T.; Banta, R. M.; Sandberg, S.; Weickmann, A. M.; Djalalova, I.; McCaffrey, K.; Bianco, L.; Wilczak, J. M.; Newman, J. F.; Draxl, C.; Lundquist, J. K.; Wharton, S.; Olson, J.; Kenyon, J.; Marquis, M.

    2017-12-01

    In an effort to improve wind forecasts for the wind energy sector, the Department of Energy and NOAA funded the second Wind Forecast Improvement Project (WFIP2). As part of the WFIP2 field campaign, a large suite of in-situ and remote sensing instrumentation was deployed to the Columbia River Gorge in Oregon and Washington from October 2015 to March 2017. The array of instrumentation deployed included 915-MHz wind profiling radars, sodars, wind-profiling lidars, and scanning lidars. The role of these instruments was to provide wind measurements at high spatial and temporal resolution for model evaluation and improvement of model physics. To properly determine model errors, the uncertainties in instrument-model comparisons need to be quantified accurately. These uncertainties arise from several factors such as measurement uncertainty, spatial variability, and interpolation of model output to instrument locations, to name a few. In this presentation, we will introduce a formalism to quantify measurement uncertainty and spatial variability. The accuracy of this formalism will be tested using existing datasets such as the eXperimental Planetary boundary layer Instrumentation Assessment (XPIA) campaign. Finally, the uncertainties in wind measurement and the spatial variability estimates from the WFIP2 field campaign will be discussed to understand the challenges involved in model evaluation.

  8. How the 2SLS/IV estimator can handle equality constraints in structural equation models: a system-of-equations approach.

    PubMed

    Nestler, Steffen

    2014-05-01

    Parameters in structural equation models are typically estimated using the maximum likelihood (ML) approach. Bollen (1996) proposed an alternative non-iterative, equation-by-equation estimator that uses instrumental variables. Although this two-stage least squares/instrumental variables (2SLS/IV) estimator has good statistical properties, one problem with its application is that parameter equality constraints cannot be imposed. This paper presents a mathematical solution to this problem that is based on an extension of the 2SLS/IV approach to a system of equations. We present an example in which our approach was used to examine strong longitudinal measurement invariance. We also investigated the new approach in a simulation study that compared it with ML in the examination of the equality of two latent regression coefficients and strong measurement invariance. Overall, the results show that the suggested approach is a useful extension of the original 2SLS/IV estimator and allows for the effective handling of equality constraints in structural equation models. © 2013 The British Psychological Society.

  9. Robust inference in summary data Mendelian randomization via the zero modal pleiotropy assumption.

    PubMed

    Hartwig, Fernando Pires; Davey Smith, George; Bowden, Jack

    2017-12-01

    Mendelian randomization (MR) is being increasingly used to strengthen causal inference in observational studies. Availability of summary data of genetic associations for a variety of phenotypes from large genome-wide association studies (GWAS) allows straightforward application of MR using summary data methods, typically in a two-sample design. In addition to the conventional inverse variance weighting (IVW) method, recently developed summary data MR methods, such as the MR-Egger and weighted median approaches, allow a relaxation of the instrumental variable assumptions. Here, a new method - the mode-based estimate (MBE) - is proposed to obtain a single causal effect estimate from multiple genetic instruments. The MBE is consistent when the largest number of similar (identical in infinite samples) individual-instrument causal effect estimates comes from valid instruments, even if the majority of instruments are invalid. We evaluate the performance of the method in simulations designed to mimic the two-sample summary data setting, and demonstrate its use by investigating the causal effect of plasma lipid fractions and urate levels on coronary heart disease risk. The MBE presented less bias and lower type-I error rates than other methods under the null in many situations. Its power to detect a causal effect was smaller compared with the IVW and weighted median methods, but was larger than that of MR-Egger regression, with sample size requirements typically smaller than those available from GWAS consortia. The MBE relaxes the instrumental variable assumptions, and should be used in combination with other approaches in sensitivity analyses. © The Author 2017. Published by Oxford University Press on behalf of the International Epidemiological Association
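
    A simplified, unweighted sketch of the mode-based idea: compute one ratio (Wald) estimate per genetic instrument and take the mode of their smoothed distribution, so that widely scattered estimates from invalid (pleiotropic) instruments do not pull the answer the way a mean does. This is only an illustration under invented summary data, not the authors' weighted implementation or bandwidth rule:

        import numpy as np
        from scipy.stats import gaussian_kde

        rng = np.random.default_rng(7)
        true_effect = 0.25
        n_snps = 50
        gamma = rng.uniform(0.15, 0.5, n_snps)                     # SNP-exposure associations
        invalid = rng.uniform(size=n_snps) < 0.4                   # roughly 40% pleiotropic instruments
        pleio = np.where(invalid, rng.normal(0.4, 0.1, n_snps), 0.0)
        Gamma = true_effect * gamma + pleio + rng.normal(0.0, 0.01, n_snps)   # SNP-outcome associations

        ratios = Gamma / gamma                                     # one Wald estimate per instrument
        kde = gaussian_kde(ratios)
        grid = np.linspace(ratios.min(), ratios.max(), 2000)
        mode_estimate = grid[np.argmax(kde(grid))]

        print("weighted mean (IVW-like) :", np.average(ratios, weights=gamma**2))  # pulled up by pleiotropy
        print("mode of ratios (MBE-like):", mode_estimate)                         # much nearer the true 0.25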

  10. Effect of Highly Active Antiretroviral Therapy on Incident AIDS Using Calendar Period as an Instrumental Variable

    PubMed Central

    Cole, Stephen R.; Greenland, Sander; Brown, Todd T.; Chmiel, Joan S.; Kingsley, Lawrence; Detels, Roger

    2009-01-01

    Human immunodeficiency virus (HIV) researchers often use calendar periods as an imperfect proxy for highly active antiretroviral therapy (HAART) when estimating the effect of HAART on HIV disease progression. The authors report on 614 HIV-positive homosexual men followed from 1984 to 2007 in 4 US cities. During 5,321 person-years, 268 of 614 men incurred acquired immunodeficiency syndrome, 49 died, and 90 were lost to follow-up. Comparing the pre-HAART calendar period (<1996) with the HAART calendar period (≥1996) resulted in a naive rate ratio of 3.62 (95% confidence limits: 2.67, 4.92). However, this estimate is likely biased because of misclassification of HAART use by calendar period. Simple calendar period approaches may circumvent confounding by indication at the cost of inducing exposure misclassification. To correct this misclassification, the authors propose an instrumental-variable estimator analogous to ones previously used for noncompliance corrections in randomized clinical trials. When the pre-HAART calendar period was compared with the HAART calendar period, the instrumental-variable rate ratio was 5.02 (95% confidence limits: 3.45, 7.31), 39% higher than the naive result. Weighting by the inverse probability of calendar period given age at seroconversion, race/ethnicity, and time since seroconversion did not appreciably alter the results. These methods may help resolve discrepancies between observational and randomized evidence. PMID:19318615
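
    The correction at work can be seen with toy numbers in the spirit of the standard noncompliance adjustment, here on the rate-difference scale for simplicity: the naive calendar-period contrast is scaled by the difference in actual HAART exposure between periods. These figures are invented for illustration and are not the study's data:

        # hypothetical AIDS incidence per 100 person-years in each calendar period
        rate_pre, rate_haart = 9.0, 2.5
        # hypothetical proportion of person-time actually on HAART in each period
        p_haart_pre, p_haart_post = 0.0, 0.70

        itt_diff = rate_pre - rate_haart                        # naive period contrast (per 100 PY)
        iv_diff = itt_diff / (p_haart_post - p_haart_pre)       # effect per unit of actual HAART use
        print(iv_diff)   # larger than the naive contrast, in the same spirit as the abstract's IV correction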

  11. Inter-hospital transfer is associated with increased mortality and costs in severe sepsis and septic shock: An instrumental variables approach.

    PubMed

    Mohr, Nicholas M; Harland, Karisa K; Shane, Dan M; Ahmed, Azeemuddin; Fuller, Brian M; Torner, James C

    2016-12-01

    The objective of this study was to evaluate the impact of regionalization on sepsis survival, to describe the role of inter-hospital transfer in rural sepsis care, and to measure the cost of inter-hospital transfer in a predominantly rural state. Observational case-control study using statewide administrative claims data from 2005 to 2014 in a predominantly rural Midwestern state. Mortality and marginal costs were estimated with multivariable generalized estimating equations models and with instrumental variables models. A total of 18 246 patients were included, of which 59% were transferred between hospitals. Transferred patients had higher mortality and longer hospital length-of-stay than non-transferred patients. Using a multivariable generalized estimating equations (GEE) model to adjust for potentially confounding factors, inter-hospital transfer was associated with increased mortality (aOR 1.7, 95% CI 1.5-1.9). Using an instrumental variables model, transfer was associated with a 9.2% increased risk of death. Transfer was associated with additional costs of $6897 (95% CI $5769-8024). Even when limiting to only those patients who received care in the largest hospitals, transfer was still associated with $5167 (95% CI $3696-6638) in additional cost. The majority of rural sepsis patients are transferred, and these transferred patients have higher mortality and significantly increased cost of care. Copyright © 2016 Elsevier Inc. All rights reserved.

  12. Instrumental Variable Methods for Continuous Outcomes That Accommodate Nonignorable Missing Baseline Values.

    PubMed

    Ertefaie, Ashkan; Flory, James H; Hennessy, Sean; Small, Dylan S

    2017-06-15

    Instrumental variable (IV) methods provide unbiased treatment effect estimation in the presence of unmeasured confounders under certain assumptions. To provide valid estimates of treatment effect, treatment effect confounders that are associated with the IV (IV-confounders) must be included in the analysis, and not including observations with missing values may lead to bias. Missing covariate data are particularly problematic when the probability that a value is missing is related to the value itself, which is known as nonignorable missingness. In such cases, imputation-based methods are biased. Using health-care provider preference as an IV method, we propose a 2-step procedure with which to estimate a valid treatment effect in the presence of baseline variables with nonignorable missing values. First, the provider preference IV value is estimated by performing a complete-case analysis using a random-effects model that includes IV-confounders. Second, the treatment effect is estimated using a 2-stage least squares IV approach that excludes IV-confounders with missing values. Simulation results are presented, and the method is applied to an analysis comparing the effects of sulfonylureas versus metformin on body mass index, where the variables baseline body mass index and glycosylated hemoglobin have missing values. Our result supports the association of sulfonylureas with weight gain. © The Author 2017. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  13. Don't Hold Back? the Effect of Grade Retention on Student Achievement

    ERIC Educational Resources Information Center

    Diris, Ron

    2017-01-01

    This study analyzes the effect of age-based retention on school achievement at different stages of education. I estimate an instrumental variable model, using the predicted probability of retention given month of birth as an instrument, while simultaneously accounting for the effect of month of birth on maturity at the time of testing. The…

  14. Instrumental variable methods in comparative safety and effectiveness research†

    PubMed Central

    Brookhart, M. Alan; Rassen, Jeremy A.; Schneeweiss, Sebastian

    2010-01-01

    Summary Instrumental variable (IV) methods have been proposed as a potential approach to the common problem of uncontrolled confounding in comparative studies of medical interventions, but IV methods are unfamiliar to many researchers. The goal of this article is to provide a non-technical, practical introduction to IV methods for comparative safety and effectiveness research. We outline the principles and basic assumptions necessary for valid IV estimation, discuss how to interpret the results of an IV study, provide a review of instruments that have been used in comparative effectiveness research, and suggest some minimal reporting standards for an IV analysis. Finally, we offer our perspective of the role of IV estimation vis-à-vis more traditional approaches based on statistical modeling of the exposure or outcome. We anticipate that IV methods will be often underpowered for drug safety studies of very rare outcomes, but may be potentially useful in studies of intended effects where uncontrolled confounding may be substantial. PMID:20354968

  15. Instrumental variables as bias amplifiers with general outcome and confounding.

    PubMed

    Ding, P; VanderWeele, T J; Robins, J M

    2017-06-01

    Drawing causal inference with observational studies is the central pillar of many disciplines. One sufficient condition for identifying the causal effect is that the treatment-outcome relationship is unconfounded conditional on the observed covariates. It is often believed that the more covariates we condition on, the more plausible this unconfoundedness assumption is. This belief has had a huge impact on practical causal inference, suggesting that we should adjust for all pretreatment covariates. However, when there is unmeasured confounding between the treatment and outcome, estimators adjusting for some pretreatment covariate might have greater bias than estimators without adjusting for this covariate. This kind of covariate is called a bias amplifier, and includes instrumental variables that are independent of the confounder, and affect the outcome only through the treatment. Previously, theoretical results for this phenomenon have been established only for linear models. We fill in this gap in the literature by providing a general theory, showing that this phenomenon happens under a wide class of models satisfying certain monotonicity assumptions. We further show that when the treatment follows an additive or multiplicative model conditional on the instrumental variable and the confounder, these monotonicity assumptions can be interpreted as the signs of the arrows of the causal diagrams.
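
    A small simulation conveys the phenomenon in the linear case (the paper's general-model results are not reproduced here): with unmeasured confounding, conditioning on a covariate that behaves like an instrument makes the treatment-effect bias larger. Data and coefficients below are arbitrary:

        import numpy as np

        rng = np.random.default_rng(8)
        n = 200_000
        u = rng.normal(size=n)                       # unmeasured confounder
        z = rng.normal(size=n)                       # instrument-like covariate: affects treatment only
        x = 1.0 * z + 1.0 * u + rng.normal(size=n)   # treatment
        y = 0.5 * x + 1.0 * u + rng.normal(size=n)   # outcome; true effect = 0.5

        def slope(design, outcome):
            return np.linalg.lstsq(design, outcome, rcond=None)[0][1]

        ones = np.ones(n)
        print("unadjusted     :", slope(np.column_stack([ones, x]), y))      # about 0.83
        print("adjusting for z:", slope(np.column_stack([ones, x, z]), y))   # about 1.00, bias amplified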

  16. Monthly paleostreamflow reconstruction from annual tree-ring chronologies

    Treesearch

    J. H. Stagge; D. E. Rosenberg; R. J. DeRose; T. M. Rittenour

    2018-01-01

    Paleoclimate reconstructions are increasingly used to characterize annual climate variability prior to the instrumental record, to improve estimates of climate extremes, and to provide a baseline for climate change projections. To date, paleoclimate records have seen limited engineering use to estimate hydrologic risks because water systems models and managers usually...

  17. College Quality and Early Adult Outcomes

    ERIC Educational Resources Information Center

    Long, Mark C.

    2008-01-01

    This paper estimates the effects of various college qualities on several early adult outcomes, using panel data from the National Education Longitudinal Study. I compare the results using ordinary least squares with three alternative methods of estimation, including instrumental variables, and the methods used by Dale and Krueger [(2002).…

  18. Evaluation of Criterion Validity for Scales with Congeneric Measures

    ERIC Educational Resources Information Center

    Raykov, Tenko

    2007-01-01

    A method for estimating criterion validity of scales with homogeneous components is outlined. It accomplishes point and interval estimation of interrelationship indices between composite scores and criterion variables and is useful for testing hypotheses about criterion validity of measurement instruments. The method can also be used with missing…

  19. Secondary task for full flight simulation incorporating tasks that commonly cause pilot error: Time estimation

    NASA Technical Reports Server (NTRS)

    Rosch, E.

    1975-01-01

    The task of time estimation, an activity occasionally performed by pilots during actual flight, was investigated with the objective of providing human factors investigators with an unobtrusive and minimally loading additional task that is sensitive to differences in flying conditions and flight instrumentation associated with the main task of piloting an aircraft simulator. Previous research indicated that the duration and consistency of time estimates is associated with the cognitive, perceptual, and motor loads imposed by concurrent simple tasks. The relationships between the length and variability of time estimates and concurrent task variables under a more complex situation involving simulated flight were clarified. The wrap-around effect with respect to baseline duration, a consequence of mode switching at intermediate levels of concurrent task distraction, should contribute substantially to estimate variability and have a complex effect on the shape of the resulting distribution of estimates.

  20. Critical evaluation of connectivity-based point of care testing systems of glucose in a hospital environment.

    PubMed

    Floré, Katelijne M J; Fiers, Tom; Delanghe, Joris R

    2008-01-01

    In recent years a number of point of care testing (POCT) glucometers were introduced on the market. We investigated the analytical variability (lot-to-lot variation, calibration error, inter-instrument and inter-operator variability) of glucose POCT systems in a university hospital environment and compared these results with the analytical needs required for tight glucose monitoring. The reference hexokinase method was compared to different POCT systems based on glucose oxidase (blood gas instruments) or glucose dehydrogenase (handheld glucometers). Based upon daily internal quality control data, total errors were calculated for the various glucose methods and the analytical variability of the glucometers was estimated. The total error of the glucometers exceeded by far the desirable analytical specifications (based on a biological variability model). Lot-to-lot variation, inter-instrument variation and inter-operator variability contributed approximately equally to total variance. Because the distribution of hematocrit values in a hospital environment is broad, converting blood glucose into plasma values using a fixed factor further increases variance. The percentage of outliers exceeded the ISO 15197 criteria in a broad glucose concentration range. Total analytical variation of handheld glucometers is larger than expected. Clinicians should be aware that the variability of glucose measurements obtained by blood gas instruments is lower than results obtained with handheld glucometers on capillary blood.

  1. Comparison and covalidation of ozone anomalies and variability observed in SBUV(/2) and Umkehr northern midlatitude ozone profile estimates

    NASA Astrophysics Data System (ADS)

    Petropavlovskikh, I.; Ahn, Changwoo; Bhartia, P. K.; Flynn, L. E.

    2005-03-01

    This analysis presents comparisons of upper-stratosphere ozone information observed by two independent systems: the Solar Backscatter UltraViolet (SBUV and SBUV/2) satellite instruments, and ground-based Dobson spectrophotometers. Both the new SBUV Version 8 and the new UMK04 profile retrieval algorithms are optimized for studying long-term variability and trends in ozone. Trend analyses of the ozone time series from the SBUV(/2) data set are complex because of the multiple instruments involved, changes in the instruments' geo-location, and short periods of overlap for inter-calibrations among different instruments. Three northern midlatitude Dobson ground stations (Arosa, Boulder, and Tateno) are used in this analysis to validate the trend quality of the combined 25-year SBUV/2 time series, 1979 to 2003. Generally, differences between the satellite and ground-based data do not suggest any significant time-dependent shifts or trends. The shared features confirm the value of these data sets for studies of ozone variability.

  2. Estimating Uncertainty in Long Term Total Ozone Records from Multiple Sources

    NASA Technical Reports Server (NTRS)

    Frith, Stacey M.; Stolarski, Richard S.; Kramarova, Natalya; McPeters, Richard D.

    2014-01-01

    Total ozone measurements derived from the TOMS and SBUV backscattered solar UV instrument series cover the period from late 1978 to the present. As the SBUV series of instruments comes to an end, we look to the 10 years of data from the AURA Ozone Monitoring Instrument (OMI) and two years of data from the Ozone Mapping Profiler Suite (OMPS) on board the Suomi National Polar-orbiting Partnership satellite to continue the record. When combining these records to construct a single long-term data set for analysis we must estimate the uncertainty in the record resulting from potential biases and drifts in the individual measurement records. In this study we present a Monte Carlo analysis used to estimate uncertainties in the Merged Ozone Dataset (MOD), constructed from the Version 8.6 SBUV2 series of instruments. We extend this analysis to incorporate OMI and OMPS total ozone data into the record and investigate the impact of multiple overlapping measurements on the estimated error. We also present an updated column ozone trend analysis and compare the size of statistical error (error from variability not explained by our linear regression model) to that from instrument uncertainty.
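
    The Monte Carlo idea described above can be illustrated with a toy merged record: perturb each instrument's sub-record with random offsets and drifts, re-fit the long-term trend, and take the spread of fitted trends as the instrument-related uncertainty. The sketch below uses entirely hypothetical segment boundaries, noise levels, and drift magnitudes, not the MOD values.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical merged record: three instruments covering consecutive periods (months).
months = np.arange(360)
segments = [(0, 150), (150, 270), (270, 360)]       # instrument coverage (hypothetical)
true_trend = -0.05                                  # DU/month, hypothetical
truth = 300 + true_trend * months

n_sims, trends = 2000, []
for _ in range(n_sims):
    record = truth + rng.normal(0, 2.0, months.size)           # measurement noise
    for start, stop in segments:
        offset = rng.normal(0, 1.5)                            # per-instrument bias (DU)
        drift = rng.normal(0, 0.01)                            # per-instrument drift (DU/month)
        record[start:stop] += offset + drift * (months[start:stop] - months[start])
    trends.append(np.polyfit(months, record, 1)[0])            # re-fit the linear trend

trends = np.asarray(trends)
print(f"trend spread (2-sigma) due to instrument uncertainty: {2 * trends.std():.4f} DU/month")
```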

  3. Evaluation of Scale Reliability with Binary Measures Using Latent Variable Modeling

    ERIC Educational Resources Information Center

    Raykov, Tenko; Dimitrov, Dimiter M.; Asparouhov, Tihomir

    2010-01-01

    A method for interval estimation of scale reliability with discrete data is outlined. The approach is applicable with multi-item instruments consisting of binary measures, and is developed within the latent variable modeling methodology. The procedure is useful for evaluation of consistency of single measures and of sum scores from item sets…

  4. Estimators for longitudinal latent exposure models: examining measurement model assumptions.

    PubMed

    Sánchez, Brisa N; Kim, Sehee; Sammel, Mary D

    2017-06-15

    Latent variable (LV) models are increasingly being used in environmental epidemiology as a way to summarize multiple environmental exposures and thus minimize statistical concerns that arise in multiple regression. LV models may be especially useful when multivariate exposures are collected repeatedly over time. LV models can accommodate a variety of assumptions but, at the same time, present the user with many choices for model specification particularly in the case of exposure data collected repeatedly over time. For instance, the user could assume conditional independence of observed exposure biomarkers given the latent exposure and, in the case of longitudinal latent exposure variables, time invariance of the measurement model. Choosing which assumptions to relax is not always straightforward. We were motivated by a study of prenatal lead exposure and mental development, where assumptions of the measurement model for the time-changing longitudinal exposure have appreciable impact on (maximum-likelihood) inferences about the health effects of lead exposure. Although we were not particularly interested in characterizing the change of the LV itself, imposing a longitudinal LV structure on the repeated multivariate exposure measures could result in high efficiency gains for the exposure-disease association. We examine the biases of maximum likelihood estimators when assumptions about the measurement model for the longitudinal latent exposure variable are violated. We adapt existing instrumental variable estimators to the case of longitudinal exposures and propose them as an alternative to estimate the health effects of a time-changing latent predictor. We show that instrumental variable estimators remain unbiased for a wide range of data generating models and have advantages in terms of mean squared error. Copyright © 2017 John Wiley & Sons, Ltd.

  5. The Effect of Birth Weight on Academic Performance: Instrumental Variable Analysis.

    PubMed

    Lin, Shi Lin; Leung, Gabriel Matthew; Schooling, C Mary

    2017-05-01

    Observationally, lower birth weight is usually associated with poorer academic performance; whether this association is causal or the result of confounding is unknown. To investigate this question, we obtained an effect estimate, which can have a causal interpretation under specific assumptions, of birth weight on educational attainment using instrumental variable analysis based on single nucleotide polymorphisms determining birth weight combined with results from the Social Science Genetic Association Consortium study of 126,559 Caucasians. We similarly obtained an estimate of the effect of birth weight on academic performance in 4,067 adolescents from Hong Kong's (Chinese) Children of 1997 birth cohort (1997-2016), using twin status as an instrumental variable. Birth weight was not associated with years of schooling (per 100-g increase in birth weight, -0.006 years, 95% confidence interval (CI): -0.02, 0.01) or college completion (odds ratio = 1.00, 95% CI: 0.96, 1.03). Birth weight was also unrelated to academic performance in adolescents (per 100-g increase in birth weight, -0.004 grade, 95% CI: -0.04, 0.04) using instrumental variable analysis, although conventional regression gave a small positive association (0.02 higher grade, 95% CI: 0.01, 0.03). Observed associations of birth weight with academic performance may not be causal, suggesting that interventions should focus on the contextual factors generating this correlation. © The Author 2017. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  6. Interannual, seasonal and diurnal Mars surface environmental cycles observed from Viking to Curiosity

    NASA Astrophysics Data System (ADS)

    Martinez, German; Vicente-Retortillo, Álvaro; Kemppinen, Osku; Fischer, Erik; Fairen, Alberto G.; Guzewich, Scott David; Haberle, Robert; Lemmon, Mark T.; Newman, Claire E.; Renno, Nilton O.; Richardson, Mark I.; Smith, Michael D.; De la Torre, Manuel; Vasavada, Ashwin R.

    2016-10-01

    We analyze in-situ environmental data from the Viking landers to the Curiosity rover to estimate atmospheric pressure, near-surface air and ground temperature, relative humidity, wind speed and dust opacity with the highest confidence possible. We study the interannual, seasonal and diurnal variability of these quantities at the various landing sites over a span of more than twenty Martian years to characterize the climate on Mars and its variability. Additionally, we characterize the radiative environment at the various landing sites by estimating the daily UV irradiation (also called insolation and defined as the total amount of solar UV energy received on a flat surface during one sol) and by analyzing its interannual and seasonal variability. In this study we use measurements conducted by the Viking Meteorology Instrument System (VMIS) and Viking lander camera onboard the Viking landers (VL); the Atmospheric Structure Instrument/Meteorology (ASIMET) package and the Imager for Mars Pathfinder (IMP) onboard the Mars Pathfinder (MPF) lander; the Miniature Thermal Emission Spectrometer (Mini-TES) and Pancam instruments onboard the Mars Exploration Rovers (MER); the Meteorological Station (MET), Thermal Electrical Conductivity Probe (TECP) and Phoenix Surface Stereo Imager (SSI) onboard the Phoenix (PHX) lander; and the Rover Environmental Monitoring Station (REMS) and Mastcam instrument onboard the Mars Science Laboratory (MSL) rover. A thorough analysis of in-situ environmental data from past and present missions is important to aid in the selection of the Mars 2020 landing site. We plan to extend our analysis of Mars surface environmental cycles by using upcoming data from the Temperature and Wind sensors (TWINS) instrument onboard the InSight mission and the Mars Environmental Dynamics Analyzer (MEDA) instrument onboard the Mars 2020 mission.

  7. A selective review of the first 20 years of instrumental variables models in health-services research and medicine.

    PubMed

    Cawley, John

    2015-01-01

    The method of instrumental variables (IV) is useful for estimating causal effects. Intuitively, it exploits exogenous variation in the treatment, sometimes called natural experiments or instruments. This study reviews the literature in health-services research and medical research that applies the method of instrumental variables, documents trends in its use, and offers examples of various types of instruments. A literature search of the PubMed and EconLit research databases for English-language journal articles published after 1990 yielded a total of 522 original research articles. Citation counts for each article were derived from the Web of Science. A selective review was conducted, with articles prioritized based on number of citations, validity and power of the instrument, and type of instrument. The average annual number of papers in health services research and medical research that apply the method of instrumental variables rose from 1.2 in 1991-1995 to 41.8 in 2006-2010. Commonly used instruments (natural experiments) in health and medicine are relative distance to a medical care provider offering the treatment and the medical care provider's historic tendency to administer the treatment. Less common but still noteworthy instruments include randomization of treatment for reasons other than research, randomized encouragement to undertake the treatment, day of week of admission as an instrument for waiting time for surgery, and genes as an instrument for whether the respondent has a heritable condition. The use of the method of IV has increased dramatically in the past 20 years, and a wide range of instruments have been used. Applications of the method of IV have in several cases upended conventional wisdom that was based on correlations and led to important insights about health and healthcare. Future research should pursue new applications of existing instruments and search for new instruments that are powerful and valid.

  8. Instrumental variables and Mendelian randomization with invalid instruments

    NASA Astrophysics Data System (ADS)

    Kang, Hyunseung

    Instrumental variables (IV) methods have been widely used to determine the causal effect of a treatment, exposure, policy, or an intervention on an outcome of interest. The IV method relies on having a valid instrument, a variable that is (A1) associated with the exposure, (A2) has no direct effect on the outcome, and (A3) is unrelated to the unmeasured confounders associated with the exposure and the outcome. However, in practice, finding a valid instrument, especially those that satisfy (A2) and (A3), can be challenging. For example, in Mendelian randomization studies where genetic markers are used as instruments, complete knowledge about instruments' validity is equivalent to complete knowledge about the involved genes' functions. The dissertation explores the theory, methods, and application of IV methods when invalid instruments are present. First, when we have multiple candidate instruments, we establish a theoretical bound whereby causal effects are only identified as long as less than 50% of instruments are invalid, without knowing which of the instruments are invalid. We also propose a fast penalized method, called sisVIVE, to estimate the causal effect. We find that sisVIVE outperforms traditional IV methods when invalid instruments are present both in simulation studies as well as in real data analysis. Second, we propose a robust confidence interval under the multiple invalid IV setting. This work is an extension of our work on sisVIVE. However, unlike sisVIVE which is robust to violations of (A2) and (A3), our confidence interval procedure provides honest coverage even if all three assumptions, (A1)-(A3), are violated. Third, we study the single IV setting where the one IV we have may actually be invalid. We propose a nonparametric IV estimation method based on full matching, a technique popular in causal inference for observational data, that leverages observed covariates to make the instrument more valid. We propose an estimator along with inferential results that are robust to mis-specifications of the covariate-outcome model. We also provide a sensitivity analysis should the instrument turn out to be invalid, specifically violate (A3). Fourth, in application work, we study the causal effect of malaria on stunting among children in Ghana. Previous studies on the effect of malaria and stunting were observational and contained various unobserved confounders, most notably nutritional deficiencies. To infer causality, we use the sickle cell genotype, a trait that confers some protection against malaria and was randomly assigned at birth, as an IV and apply our nonparametric IV method. We find that the risk of stunting increases by 0.22 (95% CI: 0.044,1) for every malaria episode and is sensitive to unmeasured confounders.

  9. Use of Angiotensin-Converting Enzyme Inhibitors and Angiotensin Receptor Blockers for Geriatric Ischemic Stroke Patients: Are the Rates Right?

    PubMed

    Brooks, John M; Chapman, Cole G; Suneja, Manish; Schroeder, Mary C; Fravel, Michelle A; Schneider, Kathleen M; Wilwert, June; Li, Yi-Jhen; Chrischilles, Elizabeth A; Brenton, Douglas W; Brenton, Marian; Robinson, Jennifer

    2018-05-30

    Our objective was to estimate the effects associated with higher rates of use of renin-angiotensin system antagonists (angiotensin-converting enzyme inhibitors and angiotensin receptor blockers, ACEI/ARBs) in secondary prevention for geriatric (aged >65 years) patients with new ischemic strokes, by chronic kidney disease (CKD) status. The effects of ACEI/ARBs on survival and renal risk were estimated by CKD status using an instrumental variable (IV) estimator. Instruments were based on local area variation in ACEI/ARB use. Data abstracted from charts were used to assess the assumptions underlying the instrumental estimator. ACEI/ARBs were used after stroke by 45.9% and 45.2% of CKD and non-CKD patients, respectively. ACEI/ARB rate differences across local areas grouped by practice styles were nearly identical for CKD and non-CKD patients. Higher ACEI/ARB use rates for non-CKD patients were associated with higher 2-year survival rates, whereas higher ACEI/ARB use rates for patients with CKD were associated with lower 2-year survival rates. While the negative survival estimates for patients with CKD were not statistically different from zero, they were statistically lower than the estimates for non-CKD patients. Confounders abstracted from charts were not associated with the instrumental variable used. Higher ACEI/ARB use rates had different survival implications for older ischemic stroke patients with and without CKD. ACEI/ARBs appear underused in ischemic stroke patients without CKD, as higher use rates were associated with higher 2-year survival rates. This conclusion is not generalizable to ischemic stroke patients with CKD, as higher ACEI/ARB use rates were associated with lower 2-year survival rates that were statistically lower than the estimates for non-CKD patients. © 2018 The Authors. Published on behalf of the American Heart Association, Inc., by Wiley.

  10. A novel technique for fetal heart rate estimation from Doppler ultrasound signal

    PubMed Central

    2011-01-01

    Background The currently used fetal monitoring instrumentation that is based on the Doppler ultrasound technique provides the fetal heart rate (FHR) signal with limited accuracy. It is particularly noticeable as a significant decrease of a clinically important feature: the variability of the FHR signal. The aim of our work was to develop a novel efficient technique for processing the ultrasound signal, which could estimate the cardiac cycle duration with accuracy comparable to direct electrocardiography. Methods We have proposed a new technique which provides the true beat-to-beat values of the FHR signal through multiple measurement of a given cardiac cycle in the ultrasound signal. The method consists of three steps: the dynamic adjustment of the autocorrelation window, the adaptive autocorrelation peak detection and the determination of beat-to-beat intervals. The estimated fetal heart rate values and calculated indices describing the variability of FHR were compared to the reference data obtained from the direct fetal electrocardiogram, as well as to another method for FHR estimation. Results The results revealed that our method increases the accuracy in comparison to currently used fetal monitoring instrumentation, and thus enables reliable calculation of parameters describing the variability of FHR. Compared with the other method for FHR estimation, our approach rejected a much lower number of measured cardiac cycles as invalid. Conclusions The proposed method for fetal heart rate determination on a beat-to-beat basis offers a high accuracy of the heart interval measurement, enabling reliable quantitative assessment of the FHR variability while reducing the number of invalid cardiac cycle measurements. PMID:21999764
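
    A rough sense of the autocorrelation approach can be conveyed with a toy example: within a sliding window, the lag of the autocorrelation peak in a physiologically plausible range is taken as the current cardiac interval, and repeated windows re-measure each cycle several times. The sketch below simulates a crude Doppler envelope and uses a fixed window; the paper's dynamic window adjustment and adaptive peak detection are not reproduced.

```python
import numpy as np

fs = 1000.0                       # sampling rate in Hz (hypothetical)
t = np.arange(0, 10, 1 / fs)

# Simulate a crude Doppler envelope: periodic bursts at ~140 bpm plus noise.
true_interval = 60.0 / 140.0
envelope = (np.maximum(0, np.sin(2 * np.pi * t / true_interval)) ** 8
            + 0.2 * np.random.default_rng(1).normal(size=t.size))

def interval_from_autocorr(x, fs, min_bpm=50, max_bpm=240):
    """Estimate one cardiac interval (s) as the lag of the autocorrelation peak."""
    x = x - x.mean()
    acf = np.correlate(x, x, mode="full")[x.size - 1:]       # non-negative lags
    lo, hi = int(fs * 60 / max_bpm), int(fs * 60 / min_bpm)  # physiological lag range
    peak_lag = lo + int(np.argmax(acf[lo:hi]))
    return peak_lag / fs

# Slide a window of roughly three intervals over the signal, so each cardiac cycle
# is re-measured several times (loosely mimicking the "multiple measurement" idea).
window = int(3 * true_interval * fs)
step = int(0.25 * fs)
intervals = [interval_from_autocorr(envelope[i:i + window], fs)
             for i in range(0, envelope.size - window, step)]

bpm = 60.0 / np.asarray(intervals)
print(f"estimated FHR: mean {bpm.mean():.1f} bpm, SD {bpm.std():.1f} bpm")
```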

  11. Estimates of solar variability using the solar backscatter ultraviolet (SBUV) 2 Mg II index from the NOAA 9 satellite

    NASA Technical Reports Server (NTRS)

    Cebula, Richard P.; Deland, Matthew T.; Schlesinger, Barry M.

    1992-01-01

    The Mg II core to wing index was first developed for the Nimbus 7 solar backscatter ultraviolet (SBUV) instrument as an indicator of solar variability on both solar 27-day rotational and solar cycle time scales. This work extends the Mg II index to the NOAA 9 SBUV 2 instrument and shows that the variations in absolute value between Mg II index data sets caused by interinstrument differences do not affect the ability to track temporal variations. The NOAA 9 Mg II index accurately represents solar rotational modulation but contains more day-to-day noise than the Nimbus 7 Mg II index. Solar variability at other UV wavelengths is estimated by deriving scale factors between the Mg II index rotational variations and at those selected wavelengths. Based on the 27-day average of the NOAA 9 Mg II index and the NOAA 9 scale factors, the solar irradiance change from solar minimum in September 1986 to the beginning of the maximum of solar cycle 22 in 1989 is estimated to be 8.6 percent at 205 nm, 3.5 percent at 250 nm, and less than 1 percent beyond 300 nm.
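
    The scale-factor step can be illustrated with simulated series: regress the detrended rotational variations of irradiance at a target wavelength on the Mg II index variations, then multiply the solar-cycle change in the index by the fitted slope. The numbers below (amplitudes, noise levels, a 1% cycle change) are hypothetical and chosen only to mimic the 205 nm case qualitatively.

```python
import numpy as np

rng = np.random.default_rng(2)
days = np.arange(400)

# Hypothetical detrended rotational variations (percent) of the Mg II index.
mg2_var = 0.5 * np.sin(2 * np.pi * days / 27.0) + 0.05 * rng.normal(size=days.size)

# Hypothetical irradiance variations at 205 nm: larger amplitude, same phase, plus noise.
irr205_var = 8.0 * mg2_var + 0.3 * rng.normal(size=days.size)

# Scale factor = least-squares slope of irradiance variation on Mg II variation
# (no intercept, since both series are expressed as variations about their means).
scale_205 = (mg2_var @ irr205_var) / (mg2_var @ mg2_var)
print(f"estimated 205 nm scale factor: {scale_205:.2f}")

# Long-term change estimate: multiply the solar-cycle change in the Mg II index
# (a hypothetical 1% here) by the rotational scale factor.
print(f"implied 205 nm solar-cycle change: {scale_205 * 1.0:.1f}%")
```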

  12. Instrumental variable specifications and assumptions for longitudinal analysis of mental health cost offsets.

    PubMed

    O'Malley, A James

    2012-12-01

    Instrumental variables (IVs) enable causal estimates in observational studies to be obtained in the presence of unmeasured confounders. In practice, a diverse range of models and IV specifications can be brought to bear on a problem, particularly with longitudinal data where treatment effects can be estimated for various functions of current and past treatment. However, in practice the empirical consequences of different assumptions are seldom examined, despite the fact that IV analyses make strong assumptions that cannot be conclusively tested by the data. In this paper, we consider several longitudinal models and specifications of IVs. Methods are applied to data from a 7-year study of mental health costs of atypical and conventional antipsychotics whose purpose was to evaluate whether the newer and more expensive atypical antipsychotic medications lead to a reduction in overall mental health costs.

  13. ESTIMATING PERSON-CENTERED TREATMENT (PeT) EFFECTS USING INSTRUMENTAL VARIABLES: AN APPLICATION TO EVALUATING PROSTATE CANCER TREATMENTS

    PubMed Central

    BASU, ANIRBAN

    2014-01-01

    This paper builds on the methods of local instrumental variables developed by Heckman and Vytlacil (1999, 2001, 2005) to estimate person-centered treatment (PeT) effects that are conditioned on the person’s observed characteristics and averaged over the potential conditional distribution of unobserved characteristics that lead them to their observed treatment choices. PeT effects are more individualized than conditional treatment effects from a randomized setting with the same observed characteristics. PeT effects can be easily aggregated to construct any of the mean treatment effect parameters and, more importantly, are well suited to comprehend individual-level treatment effect heterogeneity. The paper presents the theory behind PeT effects, and applies it to study the variation in individual-level comparative effects of prostate cancer treatments on overall survival and costs. PMID:25620844

  14. Density dependence and climate effects in Rocky Mountain elk: an application of regression with instrumental variables for population time series with sampling error.

    PubMed

    Creel, Scott; Creel, Michael

    2009-11-01

    1. Sampling error in annual estimates of population size creates two widely recognized problems for the analysis of population growth. First, if sampling error is mistakenly treated as process error, one obtains inflated estimates of the variation in true population trajectories (Staples, Taper & Dennis 2004). Second, treating sampling error as process error is thought to overestimate the importance of density dependence in population growth (Viljugrein et al. 2005; Dennis et al. 2006). 2. In ecology, state-space models are used to account for sampling error when estimating the effects of density and other variables on population growth (Staples et al. 2004; Dennis et al. 2006). In econometrics, regression with instrumental variables is a well-established method that addresses the problem of correlation between regressors and the error term, but requires fewer assumptions than state-space models (Davidson & MacKinnon 1993; Cameron & Trivedi 2005). 3. We used instrumental variables to account for sampling error and fit a generalized linear model to 472 annual observations of population size for 35 Elk Management Units in Montana, from 1928 to 2004. We compared this model with state-space models fit with the likelihood function of Dennis et al. (2006). We discuss the general advantages and disadvantages of each method. Briefly, regression with instrumental variables is valid with fewer distributional assumptions, but state-space models are more efficient when their distributional assumptions are met. 4. Both methods found that population growth was negatively related to population density and winter snow accumulation. Summer rainfall and wolf (Canis lupus) presence had much weaker effects on elk (Cervus elaphus) dynamics [though limitation by wolves is strong in some elk populations with well-established wolf populations (Creel et al. 2007; Creel & Christianson 2008)]. 5. Coupled with predictions for Montana from global and regional climate models, our results predict a substantial reduction in the limiting effect of snow accumulation on Montana elk populations in the coming decades. If other limiting factors do not operate with greater force, population growth rates would increase substantially.
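
    The instrumental-variables step can be sketched generically: in a Gompertz-type growth regression where the lagged log count is contaminated by sampling error, a twice-lagged count can serve as an instrument, since its observation error is independent of the errors entering the growth increment. The code below is a simplified, hand-rolled two-stage least squares on simulated data; the parameter values are hypothetical and the full generalized linear model with climate covariates is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(3)

# Simulate a density-dependent (Gompertz-type) population on the log scale, then add
# sampling error to the observed counts (hypothetical parameters throughout).
T, a, b, sd_proc, sd_obs = 300, 1.0, -0.15, 0.10, 0.20
x = np.empty(T)                      # true log abundance
x[0] = -a / b                        # start at the deterministic equilibrium
for t in range(1, T):
    x[t] = a + (1 + b) * x[t - 1] + rng.normal(0, sd_proc)
y = x + rng.normal(0, sd_obs, T)     # observed log counts with sampling error

# Growth regression: r_t = y_t - y_{t-1} on y_{t-1}; y_{t-1} is error-laden,
# so instrument it with the twice-lagged count y_{t-2}.
r = y[2:] - y[1:-1]
X = np.column_stack([np.ones(T - 2), y[1:-1]])      # regressors (with intercept)
Z = np.column_stack([np.ones(T - 2), y[:-2]])       # instruments

# Just-identified two-stage least squares, written out explicitly: beta = (Z'X)^{-1} Z'r
beta_iv = np.linalg.solve(Z.T @ X, Z.T @ r)
beta_ols = np.linalg.solve(X.T @ X, X.T @ r)

print(f"true density dependence b = {b}")
print(f"OLS estimate (biased by sampling error):                  {beta_ols[1]:.3f}")
print(f"IV estimate using the twice-lagged count as instrument:   {beta_iv[1]:.3f}")
```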

  15. Interpreting findings from Mendelian randomization using the MR-Egger method.

    PubMed

    Burgess, Stephen; Thompson, Simon G

    2017-05-01

    Mendelian randomization-Egger (MR-Egger) is an analysis method for Mendelian randomization using summarized genetic data. MR-Egger consists of three parts: (1) a test for directional pleiotropy, (2) a test for a causal effect, and (3) an estimate of the causal effect. While conventional analysis methods for Mendelian randomization assume that all genetic variants satisfy the instrumental variable assumptions, the MR-Egger method is able to assess whether genetic variants have pleiotropic effects on the outcome that differ on average from zero (directional pleiotropy), as well as to provide a consistent estimate of the causal effect, under a weaker assumption-the InSIDE (INstrument Strength Independent of Direct Effect) assumption. In this paper, we provide a critical assessment of the MR-Egger method with regard to its implementation and interpretation. While the MR-Egger method is a worthwhile sensitivity analysis for detecting violations of the instrumental variable assumptions, there are several reasons why causal estimates from the MR-Egger method may be biased and have inflated Type 1 error rates in practice, including violations of the InSIDE assumption and the influence of outlying variants. The issues raised in this paper have potentially serious consequences for causal inferences from the MR-Egger approach. We give examples of scenarios in which the estimates from conventional Mendelian randomization methods and MR-Egger differ, and discuss how to interpret findings in such cases.
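
    At its core, MR-Egger is a weighted linear regression of the variant-outcome associations on the variant-exposure associations with an unconstrained intercept: the intercept estimates average directional pleiotropy and the slope estimates the causal effect under the InSIDE assumption. The sketch below runs that regression on simulated summary statistics with hypothetical values; it is a bare-bones illustration, not a full MR-Egger analysis (no random-effects adjustment).

```python
import numpy as np

rng = np.random.default_rng(4)

# Simulated summary statistics for 30 variants: associations with the exposure (bx)
# and with the outcome (by), plus outcome standard errors (all values hypothetical).
p, theta = 30, 0.4                       # number of variants, true causal effect
bx = rng.uniform(0.05, 0.25, p)          # oriented so exposure associations are positive
pleiotropy = rng.normal(0.02, 0.01, p)   # directional pleiotropic effects
se_by = np.full(p, 0.01)
by = theta * bx + pleiotropy + rng.normal(0, se_by)

# MR-Egger: weighted least squares of by on bx with an intercept, weights 1/se_by^2.
W = np.diag(1.0 / se_by**2)
X = np.column_stack([np.ones(p), bx])
coef = np.linalg.solve(X.T @ W @ X, X.T @ W @ by)
cov = np.linalg.inv(X.T @ W @ X)         # approximate covariance (fixed-effect scale)

intercept, slope = coef
print(f"intercept (directional pleiotropy test): {intercept:.4f} +/- {np.sqrt(cov[0, 0]):.4f}")
print(f"slope (causal estimate under InSIDE):    {slope:.3f} +/- {np.sqrt(cov[1, 1]):.3f}")
```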

  16. The costs of turnover in nursing homes.

    PubMed

    Mukamel, Dana B; Spector, William D; Limcangco, Rhona; Wang, Ying; Feng, Zhanlian; Mor, Vincent

    2009-10-01

    Turnover rates in nursing homes have been persistently high for decades, ranging upwards of 100%. Our objective was to estimate the net costs associated with turnover of direct care staff in nursing homes. DATA AND SAMPLE: Nine hundred two nursing homes in California in 2005. Data included Medicaid cost reports, the Minimum Data Set, Medicare enrollment files, Census, and Area Resource File. We estimated total cost functions, which included, in addition to exogenous outputs and wages, the facility turnover rate. Instrumental variable limited information maximum likelihood techniques were used for estimation to deal with the endogeneity of turnover and costs. The cost functions exhibited the expected behavior, with initially increasing and then decreasing returns to scale. The ordinary least squares estimate did not show a significant association between costs and turnover. The instrumental variable estimate of turnover costs was negative and significant (P = 0.039). The marginal cost savings associated with a 10-percentage-point increase in turnover for an average facility was $167,063, or 2.9% of annual total costs. The net savings associated with turnover offer an explanation for the persistence of this phenomenon over the last decades, despite the many policy initiatives to reduce it. Future policy efforts need to recognize the complex relationship between turnover and costs.

  17. Using Instrumental Variable (IV) Tests to Evaluate Model Specification in Latent Variable Structural Equation Models*

    PubMed Central

    Kirby, James B.; Bollen, Kenneth A.

    2009-01-01

    Structural Equation Modeling with latent variables (SEM) is a powerful tool for social and behavioral scientists, combining many of the strengths of psychometrics and econometrics into a single framework. The most common estimator for SEM is the full-information maximum likelihood estimator (ML), but there is continuing interest in limited information estimators because of their distributional robustness and their greater resistance to structural specification errors. However, the literature discussing model fit for limited information estimators for latent variable models is sparse compared to that for full information estimators. We address this shortcoming by providing several specification tests based on the 2SLS estimator for latent variable structural equation models developed by Bollen (1996). We explain how these tests can be used to not only identify a misspecified model, but to help diagnose the source of misspecification within a model. We present and discuss results from a Monte Carlo experiment designed to evaluate the finite sample properties of these tests. Our findings suggest that the 2SLS tests successfully identify most misspecified models, even those with modest misspecification, and that they provide researchers with information that can help diagnose the source of misspecification. PMID:20419054

  18. Treating pre-instrumental data as "missing" data: using a tree-ring-based paleoclimate record and imputations to reconstruct streamflow in the Missouri River Basin

    NASA Astrophysics Data System (ADS)

    Ho, M. W.; Lall, U.; Cook, E. R.

    2015-12-01

    Advances in paleoclimatology in the past few decades have provided opportunities to expand the temporal perspective of the hydrological and climatological variability across the world. The North American region is particularly fortunate in this respect, where a relatively dense network of high-resolution paleoclimate proxy records has been assembled. One such network is the annually-resolved Living Blended Drought Atlas (LBDA): a paleoclimate reconstruction of the Palmer Drought Severity Index (PDSI) that covers North America on a 0.5° × 0.5° grid based on tree-ring chronologies. However, the use of the LBDA to assess North American streamflow variability requires a model by which streamflow may be reconstructed. Paleoclimate reconstructions have typically used models that first seek to quantify the relationship between the paleoclimate variable and the environmental variable of interest before extrapolating the relationship back in time. In contrast, the pre-instrumental streamflow is here considered as "missing" data. A method of imputing the "missing" streamflow data prior to the instrumental record is applied through multiple imputation using chained equations for streamflow in the Missouri River Basin. In this method, the distribution of the instrumental streamflow and the LBDA is used to estimate sets of plausible values for the "missing" streamflow data, resulting in a ~600-year-long streamflow reconstruction. Past research into external climate forcings and oceanic-atmospheric variability and its teleconnections, together with assessments of rare multi-centennial instrumental records, demonstrates that large temporal oscillations in hydrological conditions are unlikely to be captured in most instrumental records. The reconstruction of multi-centennial records of streamflow will enable comprehensive assessments of current and future water resource infrastructure and operations under the existing scope of natural climate variability.
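
    The treat-pre-instrumental-flow-as-missing idea can be sketched with a chained-equations-style imputer: mask streamflow before the instrumental period, let the proxy predict it, and draw several completed records to carry imputation uncertainty forward. The example below uses scikit-learn's IterativeImputer with posterior sampling on simulated data; the series lengths, proxy relationship, and number of imputations are hypothetical, and the study's actual multiple-imputation setup is not reproduced.

```python
import numpy as np
from sklearn.experimental import enable_iterative_imputer  # noqa: F401 (activates the imputer)
from sklearn.impute import IterativeImputer

rng = np.random.default_rng(5)

# Hypothetical 600-year record: a tree-ring PDSI proxy exists for all years,
# but streamflow is only observed for the last 100 "instrumental" years.
n_years, n_instrumental = 600, 100
pdsi = rng.normal(0, 2, n_years)
flow = 50 + 8 * pdsi + rng.normal(0, 5, n_years)     # true (unknown) streamflow

data = np.column_stack([pdsi, flow])
data[:-n_instrumental, 1] = np.nan                   # mask pre-instrumental streamflow

# Multiple imputation: draw several plausible completed records by sampling from
# the posterior predictive of the chained-equations model.
imputations = np.asarray([
    IterativeImputer(sample_posterior=True, random_state=i).fit_transform(data)[:, 1]
    for i in range(20)
])

pre = imputations[:, :-n_instrumental]
print(f"pre-instrumental mean flow: {pre.mean():.1f} "
      f"(across-imputation SD of the mean: {pre.mean(axis=1).std():.2f})")
```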

  19. Latent class instrumental variables: A clinical and biostatistical perspective

    PubMed Central

    Baker, Stuart G.; Kramer, Barnett S.; Lindeman, Karen S.

    2015-01-01

    In some two-arm randomized trials, some participants receive the treatment assigned to the other arm as a result of technical problems, refusal of a treatment invitation, or a choice of treatment in an encouragement design. In some before-and-after studies, the availability of a new treatment changes from one time period to the next. Under assumptions that are often reasonable, the latent class instrumental variable (IV) method estimates the effect of treatment received in the aforementioned scenarios involving all-or-none compliance and all-or-none availability. Key aspects are four initial latent classes (sometimes called principal strata) based on the treatment that would be received under each randomization group or time period, the exclusion restriction assumption (in which randomization group or time period is an instrumental variable), the monotonicity assumption (which drops an implausible latent class from the analysis), and the estimated effect of receiving treatment in one latent class (sometimes called efficacy, the local average treatment effect, or the complier average causal effect). Since its independent formulations in the biostatistics and econometrics literatures, the latent class IV method (which has no well-established name) has gained increasing popularity. We review the latent class IV method from a clinical and biostatistical perspective, focusing on underlying assumptions, methodological extensions, and applications in our fields of obstetrics and cancer research. PMID:26239275
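
    Under the exclusion restriction and monotonicity assumptions, the headline estimate reduces to a ratio: the intention-to-treat effect on the outcome divided by the intention-to-treat effect on treatment received (the estimated complier proportion). The sketch below computes this complier average causal effect on simulated trial data with hypothetical compliance classes and effect sizes.

```python
import numpy as np

rng = np.random.default_rng(6)

# Simulated two-arm trial with all-or-none compliance (parameters hypothetical).
n = 10_000
z = rng.integers(0, 2, n)                          # randomization (the instrument)
latent = rng.choice(["complier", "never", "always"], n, p=[0.6, 0.3, 0.1])
d = np.where(latent == "always", 1,
    np.where(latent == "never", 0, z))             # treatment actually received
effect = 2.0                                       # true effect among compliers
y = 1.0 * (latent == "always") + effect * d + rng.normal(0, 1, n)

# Latent class IV (Wald) estimator: ITT effect on Y divided by ITT effect on D.
itt_y = y[z == 1].mean() - y[z == 0].mean()
itt_d = d[z == 1].mean() - d[z == 0].mean()        # = estimated complier proportion
cace = itt_y / itt_d

print(f"estimated complier proportion: {itt_d:.2f}")
print(f"estimated complier average causal effect: {cace:.2f} (true {effect})")
```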

  20. Incorporating variability in point estimates in risk assessment: bridging the gap between LC50 and population endpoints

    USDA-ARS?s Scientific Manuscript database

    Historically, the use of point estimates such as the LC50 has been instrumental in assessing the risk associated with toxicants to rare or economically important species. In recent years, growing awareness of the shortcomings of this approach has led to an increased focus on analyses using populatio...

  1. Sampling Analysis of Aerosol Retrievals by Single-track Spaceborne Instrument for Climate Research

    NASA Astrophysics Data System (ADS)

    Geogdzhayev, I. V.; Cairns, B.; Alexandrov, M. D.; Mishchenko, M. I.

    2012-12-01

    We examine to what extent the reduced sampling of along-track instruments such as Cloud-Aerosol LIdar with Orthogonal Polarisation (CALIOP) and Aerosol Polarimetry Sensor (APS) affects the statistical accuracy of a satellite climatology of retrieved aerosol optical thickness (AOT) by sub-sampling the retrievals from a wide-swath imaging instrument (MODerate resolution Imaging Spectroradiometer (MODIS)). Owing to its global coverage, longevity, and extensive characterization versus ground-based data, the MODIS level-2 aerosol product is an instructive testbed for assessing sampling effects on climatic means derived from along-track instrument data. The advantage of using daily pixel-level aerosol retrievals from MODIS is that limitations caused by the presence of clouds are implicit in the sample, so that their seasonal and regional variations are captured coherently. However, imager data can exhibit cross-track variability of monthly global mean AOTs caused by a scattering-angle dependence. We found that single along-track values can deviate from the imager mean by 15% over land and by more than 20% over ocean. This makes it difficult to separate natural variability from viewing-geometry artifacts, complicating direct comparisons of an along-track sub-sample with the full imager data. To work around this problem, we introduce "flipped-track" sampling, which, by design, is statistically equivalent to along-track sampling while closely approximating the imager in terms of angular artifacts. We show that the flipped-track variability of global monthly mean AOT is much smaller than the cross-track one for the 7-year period considered. Over the ocean, the flipped-track standard error is 85% less than the cross-track one (absolute values 0.0012 versus 0.0079), and over land it is about one third of the cross-track value (0.0054 versus 0.0188) on average. This allows us to attribute the difference between the two errors to the viewing-geometry artifacts and obtain an upper limit on AOT errors caused by along-track sampling. Our results show that using along-track subsets of MODIS aerosol data directly to analyze the sampling adequacy of single-track instruments can lead to false conclusions owing to the apparent enhancement of natural aerosol variability by the track-to-track artifacts. The analysis based on the statistics of the flipped-track means yields better estimates because it allows for better separation of the viewing-geometry artifacts and true natural variability. Published assessments estimate that a global AOT change of 0.01 would yield a climatically important flux change of 0.25 W/m2. Since the standard error estimates that we have obtained are comfortably below 0.01, we conclude that along-track instruments flown on a sun-synchronous orbiting platform have sufficient spatial sampling for estimating aerosol effects on climate. Since AOT is believed to be the most variable characteristic of tropospheric aerosols, our results imply that pixel-wide along-track coverage also provides adequate statistical representation of the global distribution of aerosol microphysical parameters.

  2. The Intergenerational Effects of Paternal Migration on Schooling and Work: What Can We Learn from Children's Time Allocations?

    PubMed

    Antman, Francisca M

    2011-11-01

    This paper explores the short-run effects of a father's U.S. migration on his children's schooling and work outcomes in Mexico. To get around the endogeneity of paternal migration, I use individual fixed effects and instrumental variables estimation (FEIV) where the instrumental variables are based on U.S. city-level employment statistics in two industries popular with Mexican immigrants. Overall, the estimates suggest that in the short-run, children reduce study hours and increase work hours in response to a father's U.S. migration. Decomposing the sample into sex- and age-specific groups suggests that this is mainly driven by the effects of paternal migration on 12-15 year-old boys. These results are consistent with a story in which the immediate aftermath of a father's migration is one of financial hardship that is borne in part by relatively young children.

  3. The Intergenerational Effects of Paternal Migration on Schooling and Work: What Can We Learn from Children's Time Allocations?*

    PubMed Central

    Antman, Francisca M.

    2012-01-01

    This paper explores the short-run effects of a father's U.S. migration on his children's schooling and work outcomes in Mexico. To get around the endogeneity of paternal migration, I use individual fixed effects and instrumental variables estimation (FEIV) where the instrumental variables are based on U.S. city-level employment statistics in two industries popular with Mexican immigrants. Overall, the estimates suggest that in the short-run, children reduce study hours and increase work hours in response to a father's U.S. migration. Decomposing the sample into sex- and age-specific groups suggests that this is mainly driven by the effects of paternal migration on 12–15 year-old boys. These results are consistent with a story in which the immediate aftermath of a father's migration is one of financial hardship that is borne in part by relatively young children. PMID:22505791

  4. Insurance and the utilization of medical services.

    PubMed

    Meer, Jonathan; Rosen, Harvey S

    2004-05-01

    Most data sets indicate a positive correlation between having health insurance and utilizing health care services. Yet the direction of causality is not at all clear. If we observe a positive correlation between the utilization of health care services and insurance status, we do not know if this is because people who anticipate poor health buy more insurance (or take jobs with generous medical coverage), or because insurance lowers the cost of health care, increasing the quantity demanded. While a few attempts have been made to implement an instrumental variables (IV) strategy to deal with endogeneity, the instruments chosen have not been entirely convincing. In this paper we revisit the IV estimation of the reduced-form relationships between insurance and health care utilization, taking advantage of what we argue is a good instrument: the individual's self-employment status. Our main finding is that a positive and statistically significant effect of insurance continues to obtain even after instrumenting. Indeed, instrumental variables estimates of the impact of insurance on utilization of a variety of health care services are larger than their non-instrumented counterparts. The validity of this exercise depends on the extent to which self-employment status is a suitable instrument. To argue this case, we analyze panel data on transitions from wage-earning into self-employment and show that individuals who select into self-employment do not differ systematically from those who remain wage-earners with respect to either the utilization of health care or health status. While this finding does not prove that self-employment status is an appropriate instrument, it is encouraging that there appear to be no underlying differences that might lead to self-employment per se affecting health services utilization.

  5. Instrumental variable approaches to identifying the causal effect of educational attainment on dementia risk

    PubMed Central

    Nguyen, Thu T.; Tchetgen Tchetgen, Eric J.; Kawachi, Ichiro; Gilman, Stephen E.; Walter, Stefan; Liu, Sze Y.; Manly, Jennifer; Glymour, M. Maria

    2015-01-01

    Purpose: Education is an established correlate of cognitive status in older adulthood, but whether expanding educational opportunities would improve cognitive functioning remains unclear given limitations of prior studies for causal inference. Therefore, we conducted instrumental variable (IV) analyses of the association between education and dementia risk, using, for the first time in this area, genetic variants as instruments as well as state-level school policies. Methods: IV analyses in the Health and Retirement Study cohort (1998–2010) used two sets of instruments: 1) a genetic risk score constructed from three single nucleotide polymorphisms (SNPs) (n=8,054); and 2) compulsory schooling laws (CSLs) and state school characteristics (term length, student teacher ratios, and expenditures) (n=13,167). Results: Employing the genetic risk score as an IV, there was a 1.1% reduction in dementia risk per year of schooling (95% CI: −2.4, 0.02). Leveraging compulsory schooling laws and state school characteristics as IVs, there was a substantially larger protective effect (−9.5%; 95% CI: −14.8, −4.2). Analyses evaluating the plausibility of the IV assumptions indicated that estimates derived from analyses relying on CSLs provide the best estimates of the causal effect of education. Conclusion: IV analyses suggest education is protective against risk of dementia in older adulthood. PMID:26633592

  6. Semiparametric methods for estimation of a nonlinear exposure-outcome relationship using instrumental variables with application to Mendelian randomization.

    PubMed

    Staley, James R; Burgess, Stephen

    2017-05-01

    Mendelian randomization, the use of genetic variants as instrumental variables (IV), can test for and estimate the causal effect of an exposure on an outcome. Most IV methods assume that the function relating the exposure to the expected value of the outcome (the exposure-outcome relationship) is linear. However, in practice, this assumption may not hold. Indeed, often the primary question of interest is to assess the shape of this relationship. We present two novel IV methods for investigating the shape of the exposure-outcome relationship: a fractional polynomial method and a piecewise linear method. We divide the population into strata using the exposure distribution, and estimate a causal effect, referred to as a localized average causal effect (LACE), in each stratum of the population. The fractional polynomial method performs metaregression on these LACE estimates. The piecewise linear method estimates a continuous piecewise linear function, the gradient of which is the LACE estimate in each stratum. Both methods were demonstrated in a simulation study to estimate the true exposure-outcome relationship well, particularly when the relationship was a fractional polynomial (for the fractional polynomial method) or was piecewise linear (for the piecewise linear method). The methods were used to investigate the shape of the relationship of body mass index with systolic blood pressure and diastolic blood pressure. © 2017 The Authors Genetic Epidemiology Published by Wiley Periodicals, Inc.

  7. Semiparametric methods for estimation of a nonlinear exposure‐outcome relationship using instrumental variables with application to Mendelian randomization

    PubMed Central

    Staley, James R.

    2017-01-01

    Mendelian randomization, the use of genetic variants as instrumental variables (IV), can test for and estimate the causal effect of an exposure on an outcome. Most IV methods assume that the function relating the exposure to the expected value of the outcome (the exposure‐outcome relationship) is linear. However, in practice, this assumption may not hold. Indeed, often the primary question of interest is to assess the shape of this relationship. We present two novel IV methods for investigating the shape of the exposure‐outcome relationship: a fractional polynomial method and a piecewise linear method. We divide the population into strata using the exposure distribution, and estimate a causal effect, referred to as a localized average causal effect (LACE), in each stratum of the population. The fractional polynomial method performs metaregression on these LACE estimates. The piecewise linear method estimates a continuous piecewise linear function, the gradient of which is the LACE estimate in each stratum. Both methods were demonstrated in a simulation study to estimate the true exposure‐outcome relationship well, particularly when the relationship was a fractional polynomial (for the fractional polynomial method) or was piecewise linear (for the piecewise linear method). The methods were used to investigate the shape of the relationship of body mass index with systolic blood pressure and diastolic blood pressure. PMID:28317167
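
    The stratification step common to both methods (described in the two records above) can be sketched as follows: remove the genetic contribution from the exposure to form an "IV-free" exposure, stratify on it, and compute a ratio (LACE) estimate within each stratum. The code below is a simplified illustration on simulated data with a quadratic true relationship; the fractional polynomial meta-regression and the formal piecewise linear fit are only indicated in comments, and all parameter values are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(7)

# Simulated data with a nonlinear (quadratic) exposure-outcome relationship.
n, q = 100_000, 10                        # sample size, number of strata
g = rng.binomial(2, 0.3, n)               # genetic instrument (allele count)
u = rng.normal(0, 1, n)                   # unmeasured confounder
x = 0.25 * g + u + rng.normal(0, 1, n)    # exposure
y = 0.1 * x**2 + u + rng.normal(0, 1, n)  # outcome (true local effect = 0.2 * x)

def slope(a, b):
    """Least-squares slope of b on a (with intercept)."""
    a = a - a.mean()
    return (a @ (b - b.mean())) / (a @ a)

# 1) "IV-free" exposure: residual after removing the genetic contribution.
x0 = x - slope(g, x) * g
# 2) Stratify on the IV-free exposure and estimate a localized average causal
#    effect (LACE) in each stratum as the ratio of G-Y to G-X associations.
edges = np.quantile(x0, np.linspace(0, 1, q + 1))
strata = np.clip(np.digitize(x0, edges[1:-1]), 0, q - 1)
for s in range(q):
    m = strata == s
    lace = slope(g[m], y[m]) / slope(g[m], x[m])
    print(f"stratum {s}: mean exposure {x[m].mean():5.2f}, LACE {lace:5.2f}")
# The piecewise linear estimate of the exposure-outcome curve uses each LACE as the
# gradient over its stratum; a meta-regression of the LACEs on mean exposure gives
# the fractional-polynomial variant.
```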

  8. Mendelian randomization with Egger pleiotropy correction and weakly informative Bayesian priors.

    PubMed

    Schmidt, A F; Dudbridge, F

    2017-12-15

    The MR-Egger (MRE) estimator has been proposed to correct for directional pleiotropic effects of genetic instruments in an instrumental variable (IV) analysis. The power of this method is considerably lower than that of conventional estimators, limiting its applicability. Here we propose a novel Bayesian implementation of the MR-Egger estimator (BMRE) and explore the utility of applying weakly informative priors on the intercept term (the pleiotropy estimate) to increase power of the IV (slope) estimate. This was a simulation study to compare the performance of different IV estimators. Scenarios differed in the presence of a causal effect, the presence of pleiotropy, the proportion of pleiotropic instruments and degree of 'Instrument Strength Independent of Direct Effect' (InSIDE) assumption violation. Based on empirical plasma urate data, we present an approach to elucidate a prior distribution for the amount of pleiotropy. A weakly informative prior on the intercept term increased power of the slope estimate while maintaining type 1 error rates close to the nominal value of 0.05. Under the InSIDE assumption, performance was unaffected by the presence or absence of pleiotropy. Violation of the InSIDE assumption biased all estimators, affecting the BMRE more than the MRE method. Depending on the prior distribution, the BMRE estimator has more power at the cost of an increased susceptibility to InSIDE assumption violations. As such the BMRE method is a compromise between the MRE and conventional IV estimators, and may be an especially useful approach to account for observed pleiotropy. © The Author 2017. Published by Oxford University Press on behalf of the International Epidemiological Association.

  9. An introduction to instrumental variables analysis: part 1.

    PubMed

    Bennett, Derrick A

    2010-01-01

    There are several examples in the medical literature where treatment effects predicted by observational studies have been refuted by evidence from subsequent large-scale randomised trials. This is because non-experimental studies are subject to confounding, which cannot be entirely eliminated even if all known confounders have been measured, since there may be unknown confounders. The aim of this 2-part methodological primer is to introduce an emerging methodology, known as the method of instrumental variables, for estimating treatment effects using observational data in the absence of good randomised evidence. Copyright © 2010 S. Karger AG, Basel.

  10. Mendelian randomization with fine-mapped genetic data: Choosing from large numbers of correlated instrumental variables.

    PubMed

    Burgess, Stephen; Zuber, Verena; Valdes-Marquez, Elsa; Sun, Benjamin B; Hopewell, Jemma C

    2017-12-01

    Mendelian randomization uses genetic variants to make causal inferences about the effect of a risk factor on an outcome. With fine-mapped genetic data, there may be hundreds of genetic variants in a single gene region any of which could be used to assess this causal relationship. However, using too many genetic variants in the analysis can lead to spurious estimates and inflated Type 1 error rates. But if only a few genetic variants are used, then the majority of the data is ignored and estimates are highly sensitive to the particular choice of variants. We propose an approach based on summarized data only (genetic association and correlation estimates) that uses principal components analysis to form instruments. This approach has desirable theoretical properties: it takes the totality of data into account and does not suffer from numerical instabilities. It also has good properties in simulation studies: it is not particularly sensitive to varying the genetic variants included in the analysis or the genetic correlation matrix, and it does not have greatly inflated Type 1 error rates. Overall, the method gives estimates that are less precise than those from variable selection approaches (such as using a conditional analysis or pruning approach to select variants), but are more robust to seemingly arbitrary choices in the variable selection step. Methods are illustrated by an example using genetic associations with testosterone for 320 genetic variants to assess the effect of sex hormone related pathways on coronary artery disease risk, in which variable selection approaches give inconsistent inferences. © 2017 The Authors Genetic Epidemiology Published by Wiley Periodicals, Inc.
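
    A generic version of the principal-components idea can be sketched with summarized data: project the variant-exposure and variant-outcome associations onto the leading eigenvectors of the correlation matrix to form composite instruments, then apply a generalized least squares (IVW-type) estimate to the transformed statistics. The sketch below uses a hypothetical AR(1) correlation structure and simulated associations, and its unweighted eigendecomposition is not necessarily the exact weighting used by the authors.

```python
import numpy as np

rng = np.random.default_rng(8)

# Simulated summary statistics for p correlated variants in one gene region.
p, theta = 50, 0.3                                           # variants, true causal effect
rho = 0.9 ** np.abs(np.subtract.outer(np.arange(p), np.arange(p)))  # hypothetical LD matrix
se = np.full(p, 0.02)
Sigma = np.outer(se, se) * rho                               # covariance of outcome associations
bx = rng.multivariate_normal(np.full(p, 0.1), 0.001 * rho)
by = rng.multivariate_normal(theta * bx, Sigma)

# Form composite instruments from the leading principal components of the correlation
# matrix, then estimate the causal effect by GLS (IVW) on the transformed statistics.
evals, evecs = np.linalg.eigh(rho)
k = 10                                                       # number of components retained
W = evecs[:, -k:]                                            # top-k eigenvectors
bx_t, by_t, Sigma_t = W.T @ bx, W.T @ by, W.T @ Sigma @ W

S_inv = np.linalg.inv(Sigma_t)
theta_hat = (bx_t @ S_inv @ by_t) / (bx_t @ S_inv @ bx_t)
se_hat = np.sqrt(1.0 / (bx_t @ S_inv @ bx_t))
print(f"causal estimate from {k} components: {theta_hat:.3f} +/- {se_hat:.3f} (true {theta})")
```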

  11. Income and the Use of Prescription Drugs by the Elderly: Evidence from the Notch Cohorts

    ERIC Educational Resources Information Center

    Moran, John R.; Simon, Kosali Ilayperuma

    2006-01-01

    We use exogenous variation in Social Security payments created by the Social Security benefits notch to estimate how retirees' use of prescription medications responds to changes in their incomes. Using data from the 1993 Wave of the AHEAD, we obtain instrumental variables estimates of the income elasticity of prescription drug use that are…

  12. Geographic variability in lidar predictions of forest stand structure in the Pacific Northwest

    Treesearch

    Michael A. Lefsky; Andrew T. Hudak; Warren B. Cohen; S. A. Acker

    2005-01-01

    Estimation of the amount of carbon stored in forests is a key challenge for understanding the global carbon cycle, one which remote sensing is expected to help address. However, carbon storage in moderate to high biomass forests is difficult to estimate with conventional optical or radar sensors. Lidar (light detection and ranging) instruments measure the vertical...

  13. Estimating the association between metabolic risk factors and marijuana use in U.S. adults using data from the continuous National Health and Nutrition Examination Survey.

    PubMed

    Thompson, Christin Ann; Hay, Joel W

    2015-07-01

    More research is needed on the health effects of marijuana use. Results of previous studies indicate that marijuana could alleviate certain factors of metabolic syndrome, such as obesity. Data on 6281 persons from National Health and Nutrition Examination Survey from 2005 to 2012 were used to estimate the effect of marijuana use on cardiometabolic risk factors. The reliability of ordinary least squares (OLS) regression models was tested by replacing marijuana use as the risk factor of interest with alcohol and carbohydrate consumption. Instrumental variable methods were used to account for the potential endogeneity of marijuana use. OLS models show lower fasting insulin, insulin resistance, body mass index, and waist circumference in users compared with nonusers. However, when alcohol and carbohydrate intake substitute for marijuana use in OLS models, similar metabolic benefits are estimated. The Durbin-Wu-Hausman tests provide evidence of endogeneity of marijuana use in OLS models, but instrumental variables models do not yield significant estimates for marijuana use. These findings challenge the robustness of OLS estimates of a positive relationship between marijuana use and fasting insulin, insulin resistance, body mass index, and waist circumference. Copyright © 2015 Elsevier Inc. All rights reserved.
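
    The Durbin-Wu-Hausman logic mentioned above can be sketched as a control-function test: regress the suspect exposure on the instrument(s), add the first-stage residual to the outcome regression, and test its coefficient; a significant coefficient indicates endogeneity and favours the IV estimates. The code below runs this on simulated data with a hypothetical instrument; it is a generic illustration, not a re-analysis of the NHANES models.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(9)

# Simulated data with an endogenous regressor x (e.g., a health behaviour) and a
# hypothetical instrument z; all parameter values are illustrative only.
n = 5_000
z = rng.normal(size=n)                 # instrument
u = rng.normal(size=n)                 # unmeasured confounder
x = 0.6 * z + u + rng.normal(size=n)   # endogenous exposure
y = 0.5 * x + u + rng.normal(size=n)   # outcome

def ols(X, y):
    """Return OLS coefficients, their standard errors, and residuals."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    sigma2 = resid @ resid / (X.shape[0] - X.shape[1])
    se = np.sqrt(np.diag(sigma2 * np.linalg.inv(X.T @ X)))
    return beta, se, resid

# Stage 1: regress the suspect regressor on the instrument(s); keep the residual.
_, _, v_hat = ols(np.column_stack([np.ones(n), z]), x)

# Stage 2 (control function): include the first-stage residual in the outcome model.
# Note: the coefficient on x in this regression is also a consistent IV-type estimate.
Xcf = np.column_stack([np.ones(n), x, v_hat])
beta, se, _ = ols(Xcf, y)

t_stat = beta[2] / se[2]
p_val = 2 * stats.t.sf(abs(t_stat), df=n - Xcf.shape[1])
print(f"coefficient on first-stage residual: {beta[2]:.3f} (t = {t_stat:.1f}, p = {p_val:.3g})")
print("evidence of endogeneity -> prefer IV" if p_val < 0.05 else "no evidence of endogeneity")
```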

  14. Estimating the effects of wages on obesity.

    PubMed

    Kim, DaeHwan; Leigh, John Paul

    2010-05-01

    Our objective was to estimate the effects of wages on obesity and body mass. Data on household heads, aged 20 to 65 years, with full-time jobs, were drawn from the Panel Study of Income Dynamics for 2003 to 2007. The Panel Study of Income Dynamics is a nationally representative sample. Instrumental variables (IV) for wages were created using knowledge of computer software and state legal minimum wages. Least squares (linear regression) with corrected standard errors was used to estimate the equations. Statistical tests revealed that both instruments were strong, and tests for over-identifying restrictions were favorable. Wages were found to be predictive (P < 0.05) of obesity and body mass in regressions both before and after applying IVs. Coefficient estimates suggested stronger effects in the IV models. Results are consistent with the hypothesis that low wages increase obesity prevalence and body mass.

  15. Comparing surgical trays with redundant instruments with trays with reduced instruments: a cost analysis

    PubMed Central

    John-Baptiste, A.; Sowerby, L.J.; Chin, C.J.; Martin, J.; Rotenberg, B.W.

    2016-01-01

    Background: When prearranged standard surgical trays contain instruments that are repeatedly unused, the redundancy can result in unnecessary health care costs. Our objective was to estimate potential savings by performing an economic evaluation comparing the cost of surgical trays with redundant instruments with surgical trays with reduced instruments ("reduced trays"). Methods: We performed a cost-analysis from the hospital perspective over a 1-year period. Using a mathematical model, we compared the direct costs of trays containing redundant instruments to reduced trays for 5 otolaryngology procedures. We incorporated data from several sources including local hospital data on surgical volume, the number of instruments on redundant and reduced trays, wages of personnel and time required to pack instruments. From the literature, we incorporated instrument depreciation costs and the time required to decontaminate an instrument. We performed 1-way sensitivity analyses on all variables, including surgical volume. Costs were estimated in 2013 Canadian dollars. Results: The cost of redundant trays was $21 806 and the cost of reduced trays was $8803, for a 1-year cost saving of $13 003. In sensitivity analyses, cost savings ranged from $3262 to $21 395, based on the surgical volume at the institution. Variation in surgical volume resulted in a wider range of estimates, with a minimum of $3253 for low-volume to a maximum of $52 012 for high-volume institutions. Interpretation: Our study suggests moderate savings may be achieved by reducing surgical tray redundancy and, if applied to other surgical specialties, may result in savings to Canadian health care systems. PMID:27975045

  16. Comparing surgical trays with redundant instruments with trays with reduced instruments: a cost analysis.

    PubMed

    John-Baptiste, A; Sowerby, L J; Chin, C J; Martin, J; Rotenberg, B W

    2016-01-01

    When prearranged standard surgical trays contain instruments that are repeatedly unused, the redundancy can result in unnecessary health care costs. Our objective was to estimate potential savings by performing an economic evaluation comparing the cost of surgical trays with redundant instruments with surgical trays with reduced instruments ("reduced trays"). We performed a cost-analysis from the hospital perspective over a 1-year period. Using a mathematical model, we compared the direct costs of trays containing redundant instruments to reduced trays for 5 otolaryngology procedures. We incorporated data from several sources including local hospital data on surgical volume, the number of instruments on redundant and reduced trays, wages of personnel and time required to pack instruments. From the literature, we incorporated instrument depreciation costs and the time required to decontaminate an instrument. We performed 1-way sensitivity analyses on all variables, including surgical volume. Costs were estimated in 2013 Canadian dollars. The cost of redundant trays was $21 806 and the cost of reduced trays was $8803, for a 1-year cost saving of $13 003. In sensitivity analyses, cost savings ranged from $3262 to $21 395, based on the surgical volume at the institution. Variation in surgical volume resulted in a wider range of estimates, with a minimum of $3253 for low-volume to a maximum of $52 012 for high-volume institutions. Our study suggests moderate savings may be achieved by reducing surgical tray redundancy and, if applied to other surgical specialties, may result in savings to Canadian health care systems.
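
    The cost model in the two records above is, at heart, simple arithmetic: per-tray cost is the instrument count times the per-instrument packing and decontamination labour plus depreciation, scaled by annual surgical volume. The sketch below works through that arithmetic with purely hypothetical wages, times, volumes, and instrument counts; it does not reproduce the study's Canadian inputs or results.

```python
# A minimal sketch of the per-tray cost arithmetic, with purely hypothetical inputs
# (the study's actual wages, times, and volumes are not reproduced here).
def annual_tray_cost(n_instruments, surgical_volume, wage_per_min=0.75,
                     pack_min_per_instrument=0.2, decon_min_per_instrument=0.5,
                     depreciation_per_instrument_use=0.10):
    """Annual direct cost of a tray configuration (labour + depreciation)."""
    per_tray = n_instruments * (
        (pack_min_per_instrument + decon_min_per_instrument) * wage_per_min
        + depreciation_per_instrument_use
    )
    return per_tray * surgical_volume

volume = 500                                   # procedures per year (hypothetical)
redundant = annual_tray_cost(n_instruments=40, surgical_volume=volume)
reduced = annual_tray_cost(n_instruments=15, surgical_volume=volume)
print(f"redundant tray: ${redundant:,.0f}/yr, reduced tray: ${reduced:,.0f}/yr, "
      f"saving: ${redundant - reduced:,.0f}/yr")
```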

  17. The long view: Causes of climate change over the instrumental period

    NASA Astrophysics Data System (ADS)

    Hegerl, G. C.; Schurer, A. P.; Polson, D.; Iles, C. E.; Bronnimann, S.

    2016-12-01

    The period of instrumentally recorded data has seen remarkable changes in climate, with periods of rapid warming, and periods of stagnation or cooling. A recent analysis of the observed temperature change from the instrumental record confirms that most of the warming recorded since the middle of the 20th century has been caused by human influences, but shows large uncertainty in separating greenhouse gas from aerosol response if accounting for model uncertainty. The contribution by natural forcing and internal variability to the recent warming is estimated to be small, but becomes more important when analysing climate change over earlier or shorter time periods. For example, the enigmatic early 20th century warming was a period of strong climate anomalies, including the US dustbowl drought and exceptional heat waves, and pronounced Arctic warming. Attribution results suggest that about half of the global warming over 1901-1950 was forced by greenhouse gas increases, with an anomalously strong contribution by climate variability, and contributions by natural forcing. Long-term variations in circulation are important for some regional climate anomalies. Precipitation is important for the impacts of climate change, and precipitation changes are uncertain in models. Analysis of the instrumental record suggests a human influence on mean and heavy precipitation, and supports climate model estimates of the spatial pattern of precipitation sensitivity to warming. Broadly, and particularly over ocean, wet regions are getting wetter and dry regions are getting drier. In conclusion, the historical record provides evidence for a strong response to external forcings, supports climate models, and raises questions about multi-decadal variability.

  18. Latent class instrumental variables: a clinical and biostatistical perspective.

    PubMed

    Baker, Stuart G; Kramer, Barnett S; Lindeman, Karen S

    2016-01-15

    In some two-arm randomized trials, some participants receive the treatment assigned to the other arm as a result of technical problems, refusal of a treatment invitation, or a choice of treatment in an encouragement design. In some before-and-after studies, the availability of a new treatment changes from one time period to the next. Under assumptions that are often reasonable, the latent class instrumental variable (IV) method estimates the effect of treatment received in the aforementioned scenarios involving all-or-none compliance and all-or-none availability. Key aspects are four initial latent classes (sometimes called principal strata) based on treatment received if in each randomization group or time period, the exclusion restriction assumption (in which randomization group or time period is an instrumental variable), the monotonicity assumption (which drops an implausible latent class from the analysis), and the estimated effect of receiving treatment in one latent class (sometimes called efficacy, the local average treatment effect, or the complier average causal effect). Since its independent formulations in the biostatistics and econometrics literatures, the latent class IV method (which has no well-established name) has gained increasing popularity. We review the latent class IV method from a clinical and biostatistical perspective, focusing on underlying assumptions, methodological extensions, and applications in our fields of obstetrics and cancer research. Copyright © 2015 John Wiley & Sons, Ltd.
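
    Illustrative sketch (not from the review): under the assumptions listed above, the complier average causal effect reduces to a Wald-type ratio of the intention-to-treat contrast to the difference in treatment receipt between randomization groups. The simulated data, compliance rate, and effect size below are invented.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000
z = rng.integers(0, 2, n)              # randomization group (the instrument)
complier = rng.random(n) < 0.6         # latent class: compliers (no always-takers)
d = np.where(complier, z, 0)           # treatment actually received
y = 2.0 * d + rng.normal(size=n)       # outcome; true effect among compliers = 2

# Wald / IV estimate of the complier average causal effect (CACE / LATE).
itt = y[z == 1].mean() - y[z == 0].mean()
compliance_diff = d[z == 1].mean() - d[z == 0].mean()
print(round(itt / compliance_diff, 2))   # close to 2.0
```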

  19. Inter-Hospital Transfer is Associated with Increased Mortality and Costs in Severe Sepsis and Septic Shock: An Instrumental Variables Approach

    PubMed Central

    Mohr, Nicholas M.; Harland, Karisa K.; Shane, Dan M.; Ahmed, Azeemuddin; Fuller, Brian M.; Torner, James C.

    2016-01-01

    Purpose The objective of this study was to evaluate the impact of regionalization on sepsis survival, to describe the role of inter-hospital transfer in rural sepsis care, and to measure the cost of inter-hospital transfer in a predominantly rural state. Materials and Methods Observational case-control study using statewide administrative claims data from 2005-2014 in a predominantly rural Midwestern state. Mortality and marginal costs were estimated with multivariable generalized estimating equations (GEE) models and with instrumental variables models. Results A total of 18,246 patients were included, of which 59% were transferred between hospitals. Transferred patients had higher mortality and longer hospital length-of-stay than non-transferred patients. Using a multivariable GEE model to adjust for potentially confounding factors, inter-hospital transfer was associated with increased mortality (aOR 1.7, 95%CI 1.5 – 1.9). Using an instrumental variables model, transfer was associated with a 9.2% increased risk of death. Transfer was associated with additional costs of $6,897 (95%CI $5,769-8,024). Even when limiting to only those patients who received care in the largest hospitals, transfer was still associated with $5,167 (95%CI $3,696-6,638) in additional cost. Conclusions The majority of rural sepsis patients are transferred, and these transferred patients have higher mortality and significantly increased cost of care. PMID:27546770

  20. Tree ring reconstructed rainfall over the southern Amazon Basin

    NASA Astrophysics Data System (ADS)

    Lopez, Lidio; Stahle, David; Villalba, Ricardo; Torbenson, Max; Feng, Song; Cook, Edward

    2017-07-01

    Moisture sensitive tree ring chronologies of Centrolobium microchaete have been developed from seasonally dry forests in the southern Amazon Basin and used to reconstruct wet season rainfall totals from 1799 to 2012, adding over 150 years of rainfall estimates to the short instrumental record for the region. The reconstruction is correlated with the same atmospheric variables that influence the instrumental measurements of wet season rainfall. Anticyclonic circulation over midlatitude South America promotes equatorward surges of cold and relatively dry extratropical air that converge with warm moist air to form deep convection and heavy rainfall over this sector of the southern Amazon Basin. Interesting droughts and pluvials are reconstructed during the preinstrumental nineteenth and early twentieth centuries, but the tree ring reconstruction suggests that the strong multidecadal variability in instrumental and reconstructed wet season rainfall after 1950 may have been unmatched since 1799.
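
    A generic sketch of the calibration step behind reconstructions of this kind: fit a linear transfer function between a ring-width index and instrumental wet-season rainfall over the overlap period, then apply it to the full chronology. All series below are simulated; this is not the authors' model or data.

```python
import numpy as np

rng = np.random.default_rng(1)
years = np.arange(1799, 2013)
ring_index = rng.normal(1.0, 0.2, years.size)         # simulated ring-width index
rainfall = 300 + 150 * (ring_index - 1.0) + rng.normal(0, 30, years.size)
overlap = years >= 1950                                # period with instrumental data

# Least-squares calibration on the overlap period, then reconstruction of all years.
slope, intercept = np.polyfit(ring_index[overlap], rainfall[overlap], 1)
reconstruction = intercept + slope * ring_index
r = np.corrcoef(rainfall[overlap], reconstruction[overlap])[0, 1]
print(f"calibration r^2 = {r**2:.2f}; reconstructed span: {years[0]}-{years[-1]}")
```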

  1. Cosmic gamma-ray bursts detected in the RELEC experiment onboard the Vernov satellite

    NASA Astrophysics Data System (ADS)

    Bogomolov, A. V.; Bogomolov, V. V.; Iyudin, A. F.; Kuznetsova, E. A.; Minaev, P. Yu.; Panasyuk, M. I.; Pozanenko, A. S.; Prokhorov, A. V.; Svertilov, S. I.; Chernenko, A. M.

    2017-08-01

    The RELEC scientific instrumentation onboard the Vernov spacecraft launched on July 8, 2014, included the DRGE gamma-ray and electron spectrometer. This instrument incorporates a set of scintillation phoswich detectors, including four identical X-ray and gamma-ray detectors in the energy range from 10 keV to 3 MeV with a total area of 500 cm² directed toward the nadir, and an electron spectrometer containing three mutually orthogonal detector units with a geometry factor of 2 cm² sr, which is also sensitive to X-rays and gamma-rays. The goal of the space experiment with the DRGE instrument was to investigate phenomena with fast temporal variability, in particular, terrestrial gamma-ray flashes (TGFs) and magnetospheric electron precipitations. However, the detectors of the DRGE instrument could record cosmic gamma-ray bursts (GRBs) and allowed one not only to perform a detailed analysis of the gamma-ray variability but also to compare the time profiles with the measurements made by other instruments of the RELEC scientific instrumentation (the detectors of optical and ultraviolet flashes, the radio-frequency and low-frequency analyzers of electromagnetic field parameters). We present the results of our observations of cosmic GRB 141011A and GRB 141104A, compare the parameters obtained in the GBM/Fermi and KONUS-Wind experiments, and estimate the redshifts and E_iso for the sources of these GRBs. The detectability of GRBs and good agreement between the independent estimates of their parameters obtained in various experiments are important factors of the successful operation of similar detectors onboard the Lomonosov spacecraft.

  2. Price responsiveness of demand for cigarettes: does rationality matter?

    PubMed

    Laporte, Audrey

    2006-01-01

    Meta-analysis is applied to aggregate-level studies that model the demand for cigarettes using static, myopic, or rational addiction frameworks in an attempt to synthesize key findings in the literature and to identify determinants of the variation in reported price elasticity estimates across studies. The results suggest that the rational addiction framework produces statistically similar estimates to the static framework but that studies that use the myopic framework tend to report more elastic price effects. Studies that applied panel data techniques or controlled for cross-border smuggling reported more elastic price elasticity estimates, whereas the use of instrumental variable techniques and time trends or time dummy variables produced less elastic estimates. The finding that myopic models produce different estimates than either of the other two model frameworks underscores that careful attention must be given to time series properties of the data.
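
    A sketch of the meta-regression idea underlying such a synthesis: regress reported price elasticities on study-level moderators (modelling framework, panel methods, IV use), weighting each study by its precision. The data, moderator names, and coefficients below are invented.

```python
import numpy as np

rng = np.random.default_rng(2)
k = 80                                        # number of studies
myopic = rng.integers(0, 2, k)                # 1 if a myopic addiction framework
panel = rng.integers(0, 2, k)                 # 1 if panel data techniques used
uses_iv = rng.integers(0, 2, k)               # 1 if instrumental variables used
se = rng.uniform(0.05, 0.30, k)               # reported standard errors
elasticity = (-0.4 - 0.3 * myopic - 0.2 * panel + 0.15 * uses_iv
              + rng.normal(0, se))

# Precision-weighted least squares (a simple fixed-effect meta-regression).
X = np.column_stack([np.ones(k), myopic, panel, uses_iv])
W = np.diag(1.0 / se**2)
beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ elasticity)
print(dict(zip(["intercept", "myopic", "panel", "uses_iv"], beta.round(2))))
```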

  3. Two-Stage Bayesian Model Averaging in Endogenous Variable Models*

    PubMed Central

    Lenkoski, Alex; Eicher, Theo S.; Raftery, Adrian E.

    2013-01-01

    Economic modeling in the presence of endogeneity is subject to model uncertainty at both the instrument and covariate level. We propose a Two-Stage Bayesian Model Averaging (2SBMA) methodology that extends the Two-Stage Least Squares (2SLS) estimator. By constructing a Two-Stage Unit Information Prior in the endogenous variable model, we are able to efficiently combine established methods for addressing model uncertainty in regression models with the classic technique of 2SLS. To assess the validity of instruments in the 2SBMA context, we develop Bayesian tests of the identification restriction that are based on model averaged posterior predictive p-values. A simulation study showed that 2SBMA has the ability to recover structure in both the instrument and covariate set, and substantially improves the sharpness of resulting coefficient estimates in comparison to 2SLS using the full specification in an automatic fashion. Due to the increased parsimony of the 2SBMA estimate, the Bayesian Sargan test had a power of 50 percent in detecting a violation of the exogeneity assumption, while the method based on 2SLS using the full specification had negligible power. We apply our approach to the problem of development accounting, and find support not only for institutions, but also for geography and integration as development determinants, once both model uncertainty and endogeneity have been jointly addressed. PMID:24223471
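
    The 2SBMA machinery is too involved for a short sketch, but the 2SLS baseline it extends is compact. In the simulated data below, an unobserved confounder biases OLS, while 2SLS with two instruments recovers the structural coefficient; all names and values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 5_000
z1, z2 = rng.normal(size=(2, n))                    # excluded instruments
u = rng.normal(size=n)                              # unobserved confounder
x = 0.8 * z1 + 0.5 * z2 + u + rng.normal(size=n)    # endogenous regressor
y = 1.5 * x + 2.0 * u + rng.normal(size=n)          # true structural effect = 1.5

X = np.column_stack([np.ones(n), x])
Z = np.column_stack([np.ones(n), z1, z2])

# Stage 1: project X onto the instrument set; Stage 2: regress y on the fitted X.
X_hat = Z @ np.linalg.lstsq(Z, X, rcond=None)[0]
beta_2sls = np.linalg.lstsq(X_hat, y, rcond=None)[0]
beta_ols = np.linalg.lstsq(X, y, rcond=None)[0]
print(f"OLS (biased): {beta_ols[1]:.2f}   2SLS: {beta_2sls[1]:.2f}")
```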

  4. The Impact of Family Income on Child Achievement: Evidence from the Earned Income Tax Credit. NBER Working Paper No. 14599

    ERIC Educational Resources Information Center

    Dahl, Gordon; Lochner, Lance

    2008-01-01

    Past estimates of the effect of family income on child development have often been plagued by endogeneity and measurement error. In this paper, we use two simulated instrumental variables strategies to estimate the causal effect of income on children's math and reading achievement. Our identification derives from the large, non-linear changes in…

  5. The Impact of Family Income on Child Achievement: Evidence from the Earned Income Tax Credit. Discussion Paper No. 1361-09

    ERIC Educational Resources Information Center

    Dahl, Gordon; Lochner, Lance

    2009-01-01

    Past estimates of the effect of family income on child development have often been plagued by endogeneity and measurement error. In this paper, we use two simulated instrumental variables strategies to estimate the causal effect of income on children's math and reading achievement. Our identification derives from the large, non-linear changes…

  6. Estimation of Chinese surface NO2 concentrations combining satellite data and Land Use Regression

    NASA Astrophysics Data System (ADS)

    Anand, J.; Monks, P.

    2016-12-01

    Monitoring surface-level air quality is often limited by in-situ instrument placement and issues arising from harmonisation over long timescales. Satellite instruments can offer a synoptic view of regional pollution sources, but in many cases only a total or tropospheric column can be measured. In this work a new technique of estimating surface NO2 combining both satellite and in-situ data is presented, in which a Land Use Regression (LUR) model is used to create high resolution pollution maps based on known predictor variables such as population density, road networks, and land cover. By employing a mixed effects approach, it is possible to take advantage of the spatiotemporal variability in the satellite-derived column densities to account for daily and regional variations in surface NO2 caused by factors such as temperature, elevation, and wind advection. In this work, surface NO2 maps are modelled over the North China Plain and Pearl River Delta during high-pollution episodes by combining in-situ measurements and tropospheric columns from the Ozone Monitoring Instrument (OMI). The modelled concentrations show good agreement with in-situ data and surface NO2 concentrations derived from the MACC-II global reanalysis.
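
    A rough sketch of one reading of the "mixed effects approach" described above: a land use regression with fixed land-use terms and a day-specific random slope on the satellite column, fitted with statsmodels MixedLM. The variables, data-generating process, and model specification are assumptions for illustration, not the authors' setup.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
n_sites, n_days = 40, 30
df = pd.DataFrame({
    "site": np.repeat(np.arange(n_sites), n_days),
    "day": np.tile(np.arange(n_days), n_sites),
})
df["pop_density"] = np.repeat(rng.uniform(0, 1, n_sites), n_days)
df["road_length"] = np.repeat(rng.uniform(0, 1, n_sites), n_days)
df["sat_column"] = rng.uniform(0.5, 2.0, len(df))   # tropospheric NO2 column
day_scale = rng.normal(1.0, 0.3, n_days)            # day-to-day column-to-surface scaling
df["no2"] = (5 + 8 * df["pop_density"] + 6 * df["road_length"]
             + 4 * day_scale[df["day"].to_numpy()] * df["sat_column"]
             + rng.normal(0, 1, len(df)))

# Mixed-effects LUR: fixed land-use predictors, plus a random intercept and a
# random slope on the satellite column for each day.
model = smf.mixedlm("no2 ~ pop_density + road_length + sat_column",
                    df, groups=df["day"], re_formula="~sat_column")
print(model.fit().params)
```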

  7. Joint nonparametric correction estimator for excess relative risk regression in survival analysis with exposure measurement error

    PubMed Central

    Wang, Ching-Yun; Cullings, Harry; Song, Xiao; Kopecky, Kenneth J.

    2017-01-01

    SUMMARY Observational epidemiological studies often confront the problem of estimating exposure-disease relationships when the exposure is not measured exactly. In the paper, we investigate exposure measurement error in excess relative risk regression, which is a widely used model in radiation exposure effect research. In the study cohort, a surrogate variable is available for the true unobserved exposure variable. The surrogate variable satisfies a generalized version of the classical additive measurement error model, but it may or may not have repeated measurements. In addition, an instrumental variable is available for individuals in a subset of the whole cohort. We develop a nonparametric correction (NPC) estimator using data from the subcohort, and further propose a joint nonparametric correction (JNPC) estimator using all observed data to adjust for exposure measurement error. An optimal linear combination estimator of JNPC and NPC is further developed. The proposed estimators are nonparametric, which are consistent without imposing a covariate or error distribution, and are robust to heteroscedastic errors. Finite sample performance is examined via a simulation study. We apply the developed methods to data from the Radiation Effects Research Foundation, in which chromosome aberration is used to adjust for the effects of radiation dose measurement error on the estimation of radiation dose responses. PMID:29354018

  8. Regression Discontinuity for Causal Effect Estimation in Epidemiology.

    PubMed

    Oldenburg, Catherine E; Moscoe, Ellen; Bärnighausen, Till

    Regression discontinuity analyses can generate estimates of the causal effects of an exposure when a continuously measured variable is used to assign the exposure to individuals based on a threshold rule. Individuals just above the threshold are expected to be similar in their distribution of measured and unmeasured baseline covariates to individuals just below the threshold, resulting in exchangeability. At the threshold exchangeability is guaranteed if there is random variation in the continuous assignment variable, e.g., due to random measurement error. Under exchangeability, causal effects can be identified at the threshold. The regression discontinuity intention-to-treat (RD-ITT) effect on an outcome can be estimated as the difference in the outcome between individuals just above (or below) versus just below (or above) the threshold. This effect is analogous to the ITT effect in a randomized controlled trial. Instrumental variable methods can be used to estimate the effect of exposure itself utilizing the threshold as the instrument. We review the recent epidemiologic literature reporting regression discontinuity studies and find that while regression discontinuity designs are beginning to be utilized in a variety of applications in epidemiology, they are still relatively rare, and analytic and reporting practices vary. Regression discontinuity has the potential to greatly contribute to the evidence base in epidemiology, in particular on the real-life and long-term effects and side-effects of medical treatments that are provided based on threshold rules - such as treatments for low birth weight, hypertension or diabetes.
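
    A minimal simulated example of the two quantities discussed above: the RD-ITT contrast at the threshold, and the fuzzy-RD effect obtained by treating the threshold as an instrument and dividing by the first-stage jump in treatment take-up. Local means within a crude bandwidth stand in for the local-linear fits usually used in practice.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 20_000
score = rng.normal(0, 1, n)                     # continuously measured assignment variable
above = (score >= 0).astype(float)              # threshold rule
treated = rng.random(n) < (0.2 + 0.6 * above)   # imperfect take-up (fuzzy design)
y = 1.0 + 0.5 * score + 2.0 * treated + rng.normal(0, 1, n)   # true treatment effect = 2

h = 0.25                                        # crude bandwidth around the cutoff
win = np.abs(score) < h
rd_itt = y[win & (above == 1)].mean() - y[win & (above == 0)].mean()
first_stage = treated[win & (above == 1)].mean() - treated[win & (above == 0)].mean()
print(f"RD-ITT: {rd_itt:.2f}   fuzzy-RD (IV) effect: {rd_itt / first_stage:.2f}")
```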

  9. Analytical variability of estimated platelet counts on canine blood smears.

    PubMed

    Paltrinieri, Saverio; Paciletti, Veronica; Zambarbieri, Jari

    2018-06-04

    The analytical variability of estimated platelet counts in dogs has not been reported. The purpose of this study was to assess the magnitude of analytical imprecision of platelet estimates and the possible impact of this imprecision on clinical decisions. Three independent observers counted the number of platelets in 3 different areas (LE = lateral edge; CM = central monolayer; FE = feathered edge) of 30 canine blood smears with different instrumental platelet counts. The coefficient of variation (CV) for each observer was calculated in different areas of each smear (intra-observer variability), among different regions of each smear (inter-area variability), and among different observers in each area (inter-observer variability). The influence of these variabilities on the classification of platelet estimates as adequate, increased, or decreased was also assessed. The CVs recorded in the different areas by each observer ranged from 8% to 88% and were negatively correlated (P < .001, r = -.65) with the mean number of platelets per field. The mean platelet number was significantly lower in the FE and significantly higher in the CM compared with the LE, but the magnitude of this difference varied with the operators. The concordance among operators regarding platelet estimates was fair (k = 0.36) to substantial (k = 0.71) depending on the area. The overall inter-area concordance was moderate (k = 0.59). Platelet estimates suffer from high variability that could lead to patient misclassification. Therefore, guidelines to standardize the platelet estimate are needed. © 2018 American Society for Veterinary Clinical Pathology.
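
    For illustration, the variability metrics reported above can be computed directly from per-field counts. The counts below are made up, and the roughly 15,000/uL-per-platelet-per-field conversion is a commonly quoted rule of thumb rather than a value from this study.

```python
import numpy as np

# Rows: observers; columns: platelets counted in ten fields of one smear area (made-up data).
counts = np.array([
    [12, 15, 11, 14, 13, 16, 12, 15, 14, 13],
    [10, 14, 12, 13, 11, 15, 13, 12, 14, 12],
    [15, 18, 14, 17, 16, 19, 15, 18, 16, 17],
])

def cv(x):
    """Coefficient of variation, in percent."""
    return 100 * x.std(ddof=1) / x.mean()

intra_observer = [round(cv(row), 1) for row in counts]   # variability within each observer
inter_observer = round(cv(counts.mean(axis=1)), 1)       # variability across observers
estimate = counts.mean() * 15_000                        # rough platelets per microliter
print(intra_observer, inter_observer, f"{estimate:,.0f}/uL")
```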

  10. Feedback control of acoustic musical instruments: collocated control using physical analogs.

    PubMed

    Berdahl, Edgar; Smith, Julius O; Niemeyer, Günter

    2012-01-01

    Traditionally, the average professional musician has owned numerous acoustic musical instruments, many of them having distinctive acoustic qualities. However, a modern musician could prefer to have a single musical instrument whose acoustics are programmable by feedback control, where acoustic variables are estimated from sensor measurements in real time and then fed back in order to influence the controlled variables. In this paper, theory is presented that describes stable feedback control of an acoustic musical instrument. The presentation should be accessible to members of the musical acoustics community who may have limited or no experience with feedback control. First, the only control strategy guaranteed to be stable subject to any musical instrument mobility is described: the sensors and actuators must be collocated, and the controller must emulate a physical analog system. Next, the most fundamental feedback controllers and the corresponding physical analog systems are presented. The effects that these controllers have on acoustic musical instruments are described. Finally, practical design challenges are discussed. A proof explains why changing the resonance frequency of a musical resonance requires much more control power than changing the decay time of the resonance. © 2012 Acoustical Society of America.

  11. A Blind Survey for AGN in the Kepler Field through Optical Variability

    NASA Astrophysics Data System (ADS)

    Olling, Robert; Shaya, E. J.; Mushotzky, R.

    2013-01-01

    We present an initial analysis of three quarters of Kepler LLC time series of 400 small galaxies. The Kepler LLC data are sampled about twice per hour, and allow us to investigate variability on time scales between about one day and one month. The calibrated Kepler LLC light curves still contain many instrumental effects that cannot be taken out in a robust manner. Instead, our analysis relies on the similarity of variability measures in the three independent quarters to decide whether or not a galaxy shows variability. We estimate that roughly 15% of our small galaxies show variability at levels exceeding several parts per thousand (mmag) on timescales of days to weeks. However, this estimate is probably uncertain by a factor of two. Our data are more sensitive by several factors of ten compared with extant data sets.

  12. Price elasticity reconsidered: Panel estimation of an agricultural water demand function

    NASA Astrophysics Data System (ADS)

    Schoengold, Karina; Sunding, David L.; Moreno, Georgina

    2006-09-01

    Using panel data from a period of water rate reform, this paper estimates the price elasticity of irrigation water demand. Price elasticity is decomposed into the direct effect of water management and the indirect effect of water price on choice of output and irrigation technology. The model is estimated using an instrumental variables strategy to account for the endogeneity of technology and output choices in the water demand equation. Estimation results indicate that the price elasticity of agricultural water demand is -0.79, which is greater than that found in previous studies.

  13. Instrumental variables estimates of peer effects in social networks.

    PubMed

    An, Weihua

    2015-03-01

    Estimating peer effects with observational data is very difficult because of contextual confounding, peer selection, simultaneity bias, and measurement error, among other issues. In this paper, I show that instrumental variables (IVs) can help to address these problems in order to provide causal estimates of peer effects. Based on data collected from over 4000 students in six middle schools in China, I use the IV methods to estimate peer effects on smoking. My design-based IV approach differs from previous ones in that it helps to construct potentially strong IVs and to directly test possible violation of exogeneity of the IVs. I show that measurement error in smoking can lead to both underestimation and imprecise estimation of peer effects. Based on a refined measure of smoking, I find consistent evidence for peer effects on smoking. If a student's best friend smoked within the past 30 days, the student was about one fifth (as indicated by the OLS estimate) or 40 percentage points (as indicated by the IV estimate) more likely to smoke in the same time period. The findings are robust to a variety of robustness checks. I also show that sharing cigarettes may be a mechanism for peer effects on smoking. A 10% increase in the number of cigarettes smoked by a student's best friend is associated with about a 4% increase in the number of cigarettes smoked by the student in the same time period. Copyright © 2014 Elsevier Inc. All rights reserved.

  14. GENES AS INSTRUMENTS FOR STUDYING RISK BEHAVIOR EFFECTS: AN APPLICATION TO MATERNAL SMOKING AND OROFACIAL CLEFTS

    PubMed Central

    Jugessur, Astanand; Murray, Jeffrey C.; Moreno, Lina; Wilcox, Allen; Lie, Rolv T.

    2011-01-01

    This study uses instrumental variable (IV) models with genetic instruments to assess the effects of maternal smoking on the child’s risk of orofacial clefts (OFC), a common birth defect. The study uses genotypic variants in neurotransmitter and detoxification genes related to smoking as instruments for cigarette smoking before and during pregnancy. Conditional maximum likelihood and two-stage IV probit models are used to estimate the IV model. The data are from a population-level sample of affected and unaffected children in Norway. The selected genetic instruments generally fit the IV assumptions but may be considered “weak” in predicting cigarette smoking. We find that smoking before and during pregnancy increases OFC risk substantially under the IV model (by about 4–5 times at the sample average smoking rate). This effect is greater than that found with classical analytic models. This may be because the usual models are not able to consider self-selection into smoking based on unobserved confounders, or it may to some degree reflect limitations of the instruments. Inference based on weak-instrument robust confidence bounds is consistent with standard inference. Genetic instruments may provide a valuable approach to estimate the “causal” effects of risk behaviors with genetic predisposing factors (such as smoking) on health and socioeconomic outcomes. PMID:22102793

  15. Is Some Provider Advice on Smoking Cessation Better Than No Advice? An Instrumental Variable Analysis of the 2001 National Health Interview Survey

    PubMed Central

    Bao, Yuhua; Duan, Naihua; Fox, Sarah A

    2006-01-01

    Research Objective To estimate the effect of provider advice in routine clinical contacts on patient smoking cessation outcome. Data Source The Sample Adult File from the 2001 National Health Interview Survey. We focus on adult patients who were either current smokers or quit during the last 12 months and had some contact with the health care providers or facilities they most often went to for acute or preventive care. Study Design We estimate a joint model of self-reported smoking cessation and ever receiving advice to quit during medical visits in the past 12 months. Because providers are more likely to advise heavier smokers and/or patients already diagnosed with smoking-related conditions, we use provider advice for diet/nutrition and for physical activity reported by the same patient as instrumental variables for smoking cessation advice to mitigate the selection bias. We conduct additional analyses to examine the robustness of our estimate against the various scenarios by which the exclusion restriction of the instrumental variables may fail. Principal Findings Provider advice doubles the chances of success in (self-reported) smoking cessation by their patients. The probability of quitting by the end of the 12-month reference period increased from 6.9 to 14.7 percent, an effect that is of both statistical (p<.001) and clinical significance. Conclusions Provider advice delivered in routine practice settings has a substantial effect on the success rate of smoking cessation among smoking patients. Providing advice consistently to all smoking patients, compared with routine care, is more effective than doubling the federal excise tax and, in the longer run, likely to outperform some of the other tobacco control policies such as banning smoking in private workplaces. PMID:17116112

  16. Using Indirect Turbulence Measurements for Real-Time Parameter Estimation in Turbulent Air

    NASA Technical Reports Server (NTRS)

    Martos, Borja; Morelli, Eugene A.

    2012-01-01

    The use of indirect turbulence measurements for real-time estimation of parameters in a linear longitudinal dynamics model in atmospheric turbulence was studied. It is shown that measuring the atmospheric turbulence makes it possible to treat the turbulence as a measured explanatory variable in the parameter estimation problem. Commercial off-the-shelf sensors were researched and evaluated, then compared to air data booms. Sources of colored noise in the explanatory variables resulting from typical turbulence measurement techniques were identified and studied. A major source of colored noise in the explanatory variables was identified as frequency dependent upwash and time delay. The resulting upwash and time delay corrections were analyzed and compared to previous time shift dynamic modeling research. Simulation data as well as flight test data in atmospheric turbulence were used to verify the time delay behavior. Recommendations are given for follow on flight research and instrumentation.

  17. Increasing precision of turbidity-based suspended sediment concentration and load estimates.

    PubMed

    Jastram, John D; Zipper, Carl E; Zelazny, Lucian W; Hyer, Kenneth E

    2010-01-01

    Turbidity is an effective tool for estimating and monitoring suspended sediments in aquatic systems. Turbidity can be measured in situ remotely and at fine temporal scales as a surrogate for suspended sediment concentration (SSC), providing opportunity for a more complete record of SSC than is possible with physical sampling approaches. However, there is variability in turbidity-based SSC estimates and in sediment loadings calculated from those estimates. This study investigated the potential to improve turbidity-based SSC estimates, and by extension the resulting sediment loading estimates, by incorporating hydrologic variables that can be monitored remotely and continuously (typically 15-min intervals) into the SSC estimation procedure. On the Roanoke River in southwestern Virginia, hydrologic stage, turbidity, and other water-quality parameters were monitored with in situ instrumentation; suspended sediments were sampled manually during elevated turbidity events; samples were analyzed for SSC and physical properties including particle-size distribution and organic C content; and rainfall was quantified by geologic source area. The study identified physical properties of the suspended-sediment samples that contribute to SSC estimation variance and hydrologic variables that explained variability of those physical properties. Results indicated that the inclusion of any of the measured physical properties in turbidity-based SSC estimation models reduces unexplained variance. Further, the use of hydrologic variables to represent these physical properties, along with turbidity, resulted in a model, relying solely on data collected remotely and continuously, that estimated SSC with less variance than a conventional turbidity-based univariate model, allowing a more precise estimate of sediment loading. Modeling results are consistent with known mechanisms governing sediment transport in hydrologic systems.
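
    A sketch of the comparison described above: a univariate log-turbidity model for SSC versus a multivariate model that adds a continuously monitored hydrologic variable (stage here), judged by residual variance. The data and coefficients are simulated, not the Roanoke River record.

```python
import numpy as np

rng = np.random.default_rng(6)
n = 400
turbidity = rng.lognormal(3.0, 0.8, n)
stage = rng.lognormal(0.5, 0.4, n)              # hydrologic stage, continuously monitored
log_ssc = (0.2 + 0.9 * np.log(turbidity) + 0.6 * np.log(stage)
           + rng.normal(0, 0.25, n))

def residual_variance(predictors, y):
    """OLS fit; return the variance left unexplained."""
    X = np.column_stack([np.ones(len(y))] + list(predictors))
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return np.var(y - X @ beta)

uni = residual_variance([np.log(turbidity)], log_ssc)
multi = residual_variance([np.log(turbidity), np.log(stage)], log_ssc)
print(f"residual variance  univariate: {uni:.3f}   multivariate: {multi:.3f}")
```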

  18. The Use of Linear Instrumental Variables Methods in Health Services Research and Health Economics: A Cautionary Note

    PubMed Central

    Terza, Joseph V; Bradford, W David; Dismuke, Clara E

    2008-01-01

    Objective To investigate potential bias in the use of the conventional linear instrumental variables (IV) method for the estimation of causal effects in inherently nonlinear regression settings. Data Sources Smoking Supplement to the 1979 National Health Interview Survey, National Longitudinal Alcohol Epidemiologic Survey, and simulated data. Study Design Potential bias from the use of the linear IV method in nonlinear models is assessed via simulation studies and real-world data analyses in two commonly encountered regression settings: (1) models with a nonnegative outcome (e.g., a count) and a continuous endogenous regressor; and (2) models with a binary outcome and a binary endogenous regressor. Principal Findings The simulation analyses show that substantial bias in the estimation of causal effects can result from applying the conventional IV method in inherently nonlinear regression settings. Moreover, the bias is not attenuated as the sample size increases. This point is further illustrated in the survey data analyses in which IV-based estimates of the relevant causal effects diverge substantially from those obtained with appropriate nonlinear estimation methods. Conclusions We offer this research as a cautionary note to those who would opt for the use of linear specifications in inherently nonlinear settings involving endogeneity. PMID:18546544
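
    A small simulation in the spirit of the paper's second setting (binary outcome, binary endogenous regressor): the data come from a nonlinear threshold model, and the linear IV (Wald) estimate is compared with the average treatment effect implied by the simulated potential outcomes. This is an illustration of the general point, not the authors' simulation design.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 200_000
z = rng.integers(0, 2, n)                        # binary instrument
u = rng.normal(size=n)                           # unobserved confounder
d = (1.0 * z + u + rng.normal(size=n) > 0).astype(float)   # binary endogenous regressor

e = rng.normal(size=n)
y1 = (1.0 + u + e > 0).astype(float)             # potential outcome under treatment
y0 = (0.0 + u + e > 0).astype(float)             # potential outcome under control
y = d * y1 + (1 - d) * y0

ate = (y1 - y0).mean()                           # average treatment effect in the nonlinear model
wald = ((y[z == 1].mean() - y[z == 0].mean())
        / (d[z == 1].mean() - d[z == 0].mean())) # linear IV with a binary instrument
print(f"true ATE: {ate:.3f}   linear IV estimate: {wald:.3f}")
```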

  19. Estimating the efficacy of Alcoholics Anonymous without self-selection bias: An instrumental variables re-analysis of randomized clinical trials

    PubMed Central

    Humphreys, Keith; Blodgett, Janet C.; Wagner, Todd H.

    2014-01-01

    Background Observational studies of Alcoholics Anonymous’ (AA) effectiveness are vulnerable to self-selection bias because individuals choose whether or not to attend AA. The present study therefore employed an innovative statistical technique to derive a selection bias-free estimate of AA’s impact. Methods Six datasets from 5 National Institutes of Health-funded randomized trials (one with two independent parallel arms) of AA facilitation interventions were analyzed using instrumental variables models. Alcohol dependent individuals in one of the datasets (n = 774) were analyzed separately from the rest of sample (n = 1582 individuals pooled from 5 datasets) because of heterogeneity in sample parameters. Randomization itself was used as the instrumental variable. Results Randomization was a good instrument in both samples, effectively predicting increased AA attendance that could not be attributed to self-selection. In five of the six data sets, which were pooled for analysis, increased AA attendance that was attributable to randomization (i.e., free of self-selection bias) was effective at increasing days of abstinence at 3-month (B = .38, p = .001) and 15-month (B = 0.42, p = .04) follow-up. However, in the remaining dataset, in which pre-existing AA attendance was much higher, further increases in AA involvement caused by the randomly assigned facilitation intervention did not affect drinking outcome. Conclusions For most individuals seeking help for alcohol problems, increasing AA attendance leads to short and long term decreases in alcohol consumption that cannot be attributed to self-selection. However, for populations with high pre-existing AA involvement, further increases in AA attendance may have little impact. PMID:25421504

  20. Estimating the efficacy of Alcoholics Anonymous without self-selection bias: an instrumental variables re-analysis of randomized clinical trials.

    PubMed

    Humphreys, Keith; Blodgett, Janet C; Wagner, Todd H

    2014-11-01

    Observational studies of Alcoholics Anonymous' (AA) effectiveness are vulnerable to self-selection bias because individuals choose whether or not to attend AA. The present study, therefore, employed an innovative statistical technique to derive a selection bias-free estimate of AA's impact. Six data sets from 5 National Institutes of Health-funded randomized trials (1 with 2 independent parallel arms) of AA facilitation interventions were analyzed using instrumental variables models. Alcohol-dependent individuals in one of the data sets (n = 774) were analyzed separately from the rest of sample (n = 1,582 individuals pooled from 5 data sets) because of heterogeneity in sample parameters. Randomization itself was used as the instrumental variable. Randomization was a good instrument in both samples, effectively predicting increased AA attendance that could not be attributed to self-selection. In 5 of the 6 data sets, which were pooled for analysis, increased AA attendance that was attributable to randomization (i.e., free of self-selection bias) was effective at increasing days of abstinence at 3-month (B = 0.38, p = 0.001) and 15-month (B = 0.42, p = 0.04) follow-up. However, in the remaining data set, in which preexisting AA attendance was much higher, further increases in AA involvement caused by the randomly assigned facilitation intervention did not affect drinking outcome. For most individuals seeking help for alcohol problems, increasing AA attendance leads to short- and long-term decreases in alcohol consumption that cannot be attributed to self-selection. However, for populations with high preexisting AA involvement, further increases in AA attendance may have little impact. Copyright © 2014 by the Research Society on Alcoholism.

  1. Reconstruction of precipitation variability in Estonia since the eighteenth century, inferred from oak and spruce tree rings

    NASA Astrophysics Data System (ADS)

    Helama, Samuli; Sohar, Kristina; Läänelaid, Alar; Bijak, Szymon; Jaagus, Jaak

    2018-06-01

    There is plenty of evidence for intensification of the global hydrological cycle. In Europe, the northern areas are predicted to receive more precipitation in the future and observational evidence suggests a parallel trend over the past decades. As a consequence, it would be essential to place the recent trend in precipitation in the context of proxy-based estimates of reconstructed precipitation variability over the past centuries. Tree rings are frequently used as proxy data for palaeoclimate reconstructions. Here we use deciduous (Quercus robur) and coniferous (Picea abies) tree-ring width chronologies from western Estonia to deduce past early-summer (June) precipitation variability since 1771. A statistical model transforming our tree-ring data into estimates of precipitation sums explains 42% of the variance in instrumental variability. Comparisons with products of gridded reconstructions of soil moisture and summer precipitation illustrate robust correlations with soil moisture (Palmer Drought Severity Index), but lowered correlation with summer precipitation estimates prior to the mid-nineteenth century, these instabilities possibly reflecting the general uncertainties inherent to early meteorological and proxy data. Reconstructed precipitation variability was negatively correlated with the teleconnection indices of the North Atlantic Oscillation and the Scandinavia pattern, on annual to decadal and longer scales. These relationships indicate that the positive precipitation anomalies result from an increase in zonal inflow and cyclonic activity, the negative anomalies being linked with the high pressure conditions enhanced during atmospheric blocking episodes. Recently, the instrumental data have demonstrated a remarkable increase in summer (June) precipitation in the study region. Our tree-ring based reconstruction reproduces this trend in the context of precipitation history since the eighteenth century and quantifies the unprecedented abundance of June precipitation over the recent years.

  2. Analysis of Observational Studies in the Presence of Treatment Selection Bias: Effects of Invasive Cardiac Management on AMI Survival Using Propensity Score and Instrumental Variable Methods

    PubMed Central

    Stukel, Thérèse A.; Fisher, Elliott S; Wennberg, David E.; Alter, David A.; Gottlieb, Daniel J.; Vermeulen, Marian J.

    2007-01-01

    Context Comparisons of outcomes between patients treated and untreated in observational studies may be biased due to differences in patient prognosis between groups, often because of unobserved treatment selection biases. Objective To compare 4 analytic methods for removing the effects of selection bias in observational studies: multivariable model risk adjustment, propensity score risk adjustment, propensity-based matching, and instrumental variable analysis. Design, Setting, and Patients A national cohort of 122 124 patients who were elderly (aged 65–84 years), receiving Medicare, and hospitalized with acute myocardial infarction (AMI) in 1994–1995, and who were eligible for cardiac catheterization. Baseline chart reviews were taken from the Cooperative Cardiovascular Project and linked to Medicare health administrative data to provide a rich set of prognostic variables. Patients were followed up for 7 years through December 31, 2001, to assess the association between long-term survival and cardiac catheterization within 30 days of hospital admission. Main Outcome Measure Risk-adjusted relative mortality rate using each of the analytic methods. Results Patients who received cardiac catheterization (n=73 238) were younger and had lower AMI severity than those who did not. After adjustment for prognostic factors by using standard statistical risk-adjustment methods, cardiac catheterization was associated with a 50% relative decrease in mortality (for multivariable model risk adjustment: adjusted relative risk [RR], 0.51; 95% confidence interval [CI], 0.50–0.52; for propensity score risk adjustment: adjusted RR, 0.54; 95% CI, 0.53–0.55; and for propensity-based matching: adjusted RR, 0.54; 95% CI, 0.52–0.56). Using regional catheterization rate as an instrument, instrumental variable analysis showed a 16% relative decrease in mortality (adjusted RR, 0.84; 95% CI, 0.79–0.90). The survival benefits of routine invasive care from randomized clinical trials are between 8% and 21 %. Conclusions Estimates of the observational association of cardiac catheterization with long-term AMI mortality are highly sensitive to analytic method. All standard risk-adjustment methods have the same limitations regarding removal of unmeasured treatment selection biases. Compared with standard modeling, instrumental variable analysis may produce less biased estimates of treatment effects, but is more suited to answering policy questions than specific clinical questions. PMID:17227979
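
    A toy contrast between covariate risk adjustment (standing in here for the multivariable and propensity score approaches above, which can only adjust for measured prognostic factors) and a 2SLS analysis using a regional-rate-style instrument. The unmeasured-severity confounder, the instrument, and the effect size are all invented.

```python
import numpy as np

rng = np.random.default_rng(8)
n = 100_000
x = rng.normal(size=n)                        # measured prognostic factor
u = rng.normal(size=n)                        # unmeasured severity (treatment selection)
z = rng.normal(size=n)                        # instrument, e.g. a regional-rate proxy
d = (0.8 * z + x - u + rng.normal(size=n) > 0).astype(float)   # treatment received
y = 1.0 * d + 0.5 * x + 1.0 * u + rng.normal(size=n)           # true effect of d = 1.0

def ols(predictors, y):
    X = np.column_stack([np.ones(len(y)), *predictors])
    return np.linalg.lstsq(X, y, rcond=None)[0]

risk_adjusted = ols([d, x], y)[1]             # adjusts only for the measured factor
d_hat = ols([z, x], d) @ np.column_stack([np.ones(n), z, x]).T   # first stage
iv_estimate = ols([d_hat, x], y)[1]           # second stage of 2SLS
print(f"risk-adjusted: {risk_adjusted:.2f}   IV: {iv_estimate:.2f}   truth: 1.00")
```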

  3. An Evaluation of Soil Moisture Retrievals Using Aircraft and Satellite Passive Microwave Observations during SMEX02

    NASA Technical Reports Server (NTRS)

    Bolten, John D.; Lakshmi, Venkat

    2009-01-01

    The Soil Moisture Experiments conducted in Iowa in the summer of 2002 (SMEX02) had many remote sensing instruments that were used to study the spatial and temporal variability of soil moisture. The sensors used in this paper (a subset of the suite of sensors) are the AQUA satellite-based AMSR-E (Advanced Microwave Scanning Radiometer- Earth Observing System) and the aircraft-based PSR (Polarimetric Scanning Radiometer). The SMEX02 design focused on the collection of near simultaneous brightness temperature observations from each of these instruments and in situ soil moisture measurements at field- and domain- scale. This methodology provided a basis for a quantitative analysis of the soil moisture remote sensing potential of each instrument using in situ comparisons and retrieved soil moisture estimates through the application of a radiative transfer model. To this end, the two sensors are compared with respect to their estimation of soil moisture.

  4. Utility-Based Instruments for People with Dementia: A Systematic Review and Meta-Regression Analysis.

    PubMed

    Li, Li; Nguyen, Kim-Huong; Comans, Tracy; Scuffham, Paul

    2018-04-01

    Several utility-based instruments have been applied in cost-utility analysis to assess health state values for people with dementia. Nevertheless, concerns and uncertainty regarding their performance for people with dementia have been raised. To assess the performance of available utility-based instruments for people with dementia by comparing their psychometric properties and to explore factors that cause variations in the reported health state values generated from those instruments by conducting meta-regression analyses. A literature search was conducted and psychometric properties were synthesized to demonstrate the overall performance of each instrument. When available, health state values and variables such as the type of instrument and cognitive impairment levels were extracted from each article. A meta-regression analysis was undertaken and available covariates were included in the models. A total of 64 studies providing preference-based values were identified and included. The EuroQol five-dimension questionnaire demonstrated the best combination of feasibility, reliability, and validity. Meta-regression analyses suggested that significant differences exist between instruments, types of respondents, and modes of administration, and that the variations in estimated utility values influence incremental quality-adjusted life-year calculations. This review finds that the EuroQol five-dimension questionnaire is the most valid utility-based instrument for people with dementia, but should be replaced by others under certain circumstances. Although no utility estimates are reported in this article, the meta-regression analyses show that variations in utility estimates produced by different instruments have an impact on cost-utility analysis, potentially altering the decision-making process in some circumstances. Copyright © 2018 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.

  5. Score Reliability of Adolescent Alcohol Screening Measures: A Meta-Analytic Inquiry

    ERIC Educational Resources Information Center

    Shields, Alan L.; Campfield, Delia C.; Miller, Christopher S.; Howell, Ryan T.; Wallace, Kimberly; Weiss, Roger D.

    2008-01-01

    This study describes the reliability reporting practices in empirical studies using eight adolescent alcohol screening tools and characterizes and explores variability in internal consistency estimates across samples. Of 119 observed administrations of these instruments, 40 (34%) reported usable reliability information. The Personal Experience…

  6. The Rational Adolescent: Discipline Policies, Lawsuits, and Skill Acquisition

    ERIC Educational Resources Information Center

    Babcock, Philip

    2009-01-01

    The paper estimates the response of student truancy and long-run labor market outcomes to discipline policies in middle and secondary school. Simultaneous determination of student behaviors and school policies motivates an instrumental variables strategy. Because judicial climate influences administrators' fear of discipline-related lawsuits,…

  7. Scale Reliability Evaluation with Heterogeneous Populations

    ERIC Educational Resources Information Center

    Raykov, Tenko; Marcoulides, George A.

    2015-01-01

    A latent variable modeling approach for scale reliability evaluation in heterogeneous populations is discussed. The method can be used for point and interval estimation of reliability of multicomponent measuring instruments in populations representing mixtures of an unknown number of latent classes or subpopulations. The procedure is helpful also…

  8. Student Effort and Performance over the Semester

    ERIC Educational Resources Information Center

    Krohn, Gregory A.; O'Connor, Catherine M.

    2005-01-01

    The authors extend the standard education production function and student time allocation analysis to focus on the interactions between student effort and performance over the semester. The purged instrumental variable technique is used to obtain consistent estimators of the structural parameters of the model using data from intermediate…

  9. An Instrumental Variable Probit (IVP) analysis on depressed mood in Korea: the impact of gender differences and other socio-economic factors.

    PubMed

    Gitto, Lara; Noh, Yong-Hwan; Andrés, Antonio Rodríguez

    2015-04-16

    Depression is a mental health state whose frequency has been increasing in modern societies. It imposes a great burden, because of the strong impact on people's quality of life and happiness. Depression can be reliably diagnosed and treated in primary care: if more people could get effective treatments earlier, the costs related to depression would be reversed. The aim of this study was to examine the influence of socio-economic factors and gender on depressed mood, focusing on Korea. In fact, in spite of the large number of empirical studies carried out for other countries, few epidemiological studies have examined the socio-economic determinants of depression in Korea and they were either limited to samples of employed women or did not control for individual health status. Moreover, as the likely data endogeneity (i.e. the possibility of correlation between the dependent variable and the error term as a result of autocorrelation or simultaneity, such as, in this case, the depressed mood due to health factors that, in turn, might be caused by depression), might bias the results, the present study proposes an empirical approach, based on instrumental variables, to deal with this problem. Data for the year 2008 from the Korea National Health and Nutrition Examination Survey (KNHANES) were employed. About seven thousand people (N = 6,751, of whom 43% were male and 57% female), aged 19 to 75 years, were included in the sample considered in the analysis. In order to take into account the possible endogeneity of some explanatory variables, two Instrumental Variables Probit (IVP) regressions were estimated; the variables for which instrumental equations were estimated were related to the participation of women in the workforce and to good health, as reported by people in the sample. Explanatory variables were related to age, gender, family factors (such as the number of family members and marital status) and socio-economic factors (such as education, residence in metropolitan areas, and so on). As the results of the Wald test carried out after the estimations did not allow us to reject the null hypothesis of endogeneity, a probit model was run too. Overall, women tend to develop depression more frequently than men. There is an inverse effect of education on depressed mood (high school education is associated with a 24.6 percentage-point lower probability of reporting a depressed mood, according to the probit model marginal effects), while marital status and the number of family members may act as protective factors (a 1.0 percentage-point lower probability of reporting a depressed mood for each additional family member). Depression is significantly associated with socio-economic conditions, such as work and income. Living in metropolitan areas is inversely correlated with depression (a 4.1 percentage-point lower probability of reporting a depressed mood, as estimated with the probit model): this could be explained considering that, in rural areas, people rarely have immediate access to high-quality health services. This study outlines the factors that are more likely to have an impact on depression, and applies an IVP model to take into account the potential endogeneity of some of the predictors of depressive mood, such as female participation in the workforce and health status. A probit model was also estimated. Depression is associated with a wide range of socio-economic factors, although the strength and direction of the association can differ by gender. Prevention approaches aimed at countering depressive symptoms might take into consideration the evidence offered by the present study. © 2015 by Kerman University of Medical Sciences.
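
    A sketch of a two-stage control-function approximation to the IV probit idea (in the style of Rivers and Vuong), not the exact IVP estimator used in the study: the first stage regresses the endogenous covariate on the instrument and exogenous covariates, and the second-stage probit includes the first-stage residual, whose significance signals endogeneity. The data, instrument, and coefficients are simulated; only the sample size is borrowed from the abstract.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(9)
n = 6_751                                    # sample size borrowed from the abstract
z = rng.normal(size=n)                       # hypothetical instrument
u = rng.normal(size=n)                       # unobserved factor driving endogeneity
health = 0.7 * z + 0.5 * u + rng.normal(size=n)     # endogenous "good health" measure
female = rng.integers(0, 2, n)
latent = -0.5 - 0.8 * health + 0.4 * female + 0.9 * u + rng.normal(size=n)
depressed = (latent > 0).astype(int)

# Stage 1: regress the endogenous variable on the instrument and exogenous covariates.
X1 = sm.add_constant(np.column_stack([z, female]))
resid = sm.OLS(health, X1).fit().resid

# Stage 2: probit with the first-stage residual included (control function);
# a significant residual coefficient indicates endogeneity of the health variable.
X2 = sm.add_constant(np.column_stack([health, female, resid]))
fit = sm.Probit(depressed, X2).fit(disp=0)
print(fit.params.round(2), fit.pvalues.round(3))
```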

  10. An Instrumental Variable Probit (IVP) analysis on depressed mood in Korea: the impact of gender differences and other socio-economic factors

    PubMed Central

    Gitto, Lara; Noh, Yong-Hwan; Andrés, Antonio Rodríguez

    2015-01-01

    Background: Depression is a mental health state whose frequency has been increasing in modern societies. It imposes a great burden, because of the strong impact on people’s quality of life and happiness. Depression can be reliably diagnosed and treated in primary care: if more people could get effective treatments earlier, the costs related to depression would be reversed. The aim of this study was to examine the influence of socio-economic factors and gender on depressed mood, focusing on Korea. In fact, in spite of the large number of empirical studies carried out for other countries, few epidemiological studies have examined the socio-economic determinants of depression in Korea and they were either limited to samples of employed women or did not control for individual health status. Moreover, as the likely data endogeneity (i.e. the possibility of correlation between the dependent variable and the error term as a result of autocorrelation or simultaneity, such as, in this case, the depressed mood due to health factors that, in turn, might be caused by depression), might bias the results, the present study proposes an empirical approach, based on instrumental variables, to deal with this problem. Methods: Data for the year 2008 from the Korea National Health and Nutrition Examination Survey (KNHANES) were employed. About seven thousand people (N = 6,751, of whom 43% were male and 57% female), aged 19 to 75 years, were included in the sample considered in the analysis. In order to take into account the possible endogeneity of some explanatory variables, two Instrumental Variables Probit (IVP) regressions were estimated; the variables for which instrumental equations were estimated were related to the participation of women in the workforce and to good health, as reported by people in the sample. Explanatory variables were related to age, gender, family factors (such as the number of family members and marital status) and socio-economic factors (such as education, residence in metropolitan areas, and so on). As the results of the Wald test carried out after the estimations did not allow us to reject the null hypothesis of endogeneity, a probit model was run too. Results: Overall, women tend to develop depression more frequently than men. There is an inverse effect of education on depressed mood (high school education is associated with a 24.6 percentage-point lower probability of reporting a depressed mood, according to the probit model marginal effects), while marital status and the number of family members may act as protective factors (a 1.0 percentage-point lower probability of reporting a depressed mood for each additional family member). Depression is significantly associated with socio-economic conditions, such as work and income. Living in metropolitan areas is inversely correlated with depression (a 4.1 percentage-point lower probability of reporting a depressed mood, as estimated with the probit model): this could be explained considering that, in rural areas, people rarely have immediate access to high-quality health services. Conclusion: This study outlines the factors that are more likely to have an impact on depression, and applies an IVP model to take into account the potential endogeneity of some of the predictors of depressive mood, such as female participation in the workforce and health status. A probit model was also estimated. Depression is associated with a wide range of socio-economic factors, although the strength and direction of the association can differ by gender. Prevention approaches aimed at countering depressive symptoms might take into consideration the evidence offered by the present study. PMID:26340392

  11. An extended Kalman-Bucy filter for atmospheric temperature profile retrieval with a passive microwave sounder

    NASA Technical Reports Server (NTRS)

    Ledsham, W. H.; Staelin, D. H.

    1978-01-01

    An extended Kalman-Bucy filter has been implemented for atmospheric temperature profile retrievals from observations made using the Scanned Microwave Spectrometer (SCAMS) instrument carried on the Nimbus 6 satellite. This filter has the advantage that it requires neither stationary statistics in the underlying processes nor linear production of the observed variables from the variables to be estimated. This extended Kalman-Bucy filter has yielded significant performance improvement relative to multiple regression retrieval methods. A multi-spot extended Kalman-Bucy filter has also been developed in which the temperature profiles at a number of scan angles in a scanning instrument are retrieved simultaneously. These multi-spot retrievals are shown to outperform the single-spot Kalman retrievals.
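
    A toy discrete-time extended Kalman filter with a scalar "temperature" state and a nonlinear measurement function, meant only to show the predict/linearize/update mechanics. The actual retrieval uses a multivariate state (the temperature profile), a radiative-transfer forward model, and the continuous-time Kalman-Bucy formulation; everything below is an invented stand-in.

```python
import numpy as np

rng = np.random.default_rng(10)

def h(T):        # toy nonlinear forward model: "brightness temperature" from state T
    return 150.0 + 0.4 * T + 0.001 * T**2

def h_jac(T):    # Jacobian of the measurement model, used for linearization
    return 0.4 + 0.002 * T

# Simulate a slowly varying true temperature and noisy observations of h(T).
steps, q, r = 200, 0.5, 4.0                 # process and measurement noise variances
T_true = 250.0 + np.cumsum(rng.normal(0, np.sqrt(q), steps))
obs = h(T_true) + rng.normal(0, np.sqrt(r), steps)

# Extended Kalman filter with random-walk dynamics (identity state transition).
T_est, P = 240.0, 100.0
for y in obs:
    P += q                                  # predict step
    H = h_jac(T_est)                        # linearize the measurement at the estimate
    K = P * H / (H * P * H + r)             # Kalman gain
    T_est += K * (y - h(T_est))             # update with the innovation
    P *= (1.0 - K * H)
print(f"final error: {abs(T_est - T_true[-1]):.2f} K")
```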

  12. CHAMP (Camera, Handlens, and Microscope Probe)

    NASA Technical Reports Server (NTRS)

    Mungas, Greg S.; Boynton, John E.; Balzer, Mark A.; Beegle, Luther; Sobel, Harold R.; Fisher, Ted; Klein, Dan; Deans, Matthew; Lee, Pascal; Sepulveda, Cesar A.

    2005-01-01

    CHAMP (Camera, Handlens And Microscope Probe) is a novel field microscope capable of color imaging with continuously variable spatial resolution from infinity imaging down to diffraction-limited microscopy (3 micron/pixel). As a robotic arm-mounted imager, CHAMP supports stereo imaging with variable baselines, can continuously image targets at an increasing magnification during an arm approach, can provide precision rangefinding estimates to targets, and can accommodate microscopic imaging of rough surfaces through an image filtering process called z-stacking. CHAMP was originally developed through the Mars Instrument Development Program (MIDP) in support of robotic field investigations, but may also find application in new areas such as robotic in-orbit servicing and maintenance operations associated with spacecraft and human operations. We provide an overview of CHAMP's instrument performance and basic design considerations below.

  13. Breastfeeding and the risk of childhood asthma: A two-stage instrumental variable analysis to address endogeneity.

    PubMed

    Sharma, Nivita D

    2017-09-01

    Several explanations for the inconsistent results on the effects of breastfeeding on childhood asthma have been suggested. The purpose of this study was to investigate one unexplored explanation: the presence of a potential endogenous relationship between breastfeeding and childhood asthma. Endogeneity exists when an explanatory variable is correlated with the error term for reasons such as selection bias, reverse causality, and unmeasured confounders. Unadjusted endogeneity will bias the estimated effect of breastfeeding on childhood asthma. To investigate potential endogeneity, a cross-sectional study of breastfeeding practices and incidence of childhood asthma in 87 pediatric patients in Georgia, USA, was conducted using generalized linear modeling and a two-stage instrumental variable analysis. First, the relationship between breastfeeding and childhood asthma was analyzed without considering endogeneity. Second, tests for the presence of endogeneity were performed and, having detected endogeneity between breastfeeding and childhood asthma, a two-stage instrumental variable analysis was carried out. The first stage of this analysis estimated the duration of breastfeeding and the second stage estimated the risk of childhood asthma. When endogeneity was not taken into account, duration of breastfeeding was found to significantly increase the risk of childhood asthma (relative risk ratio [RR]=2.020, 95% confidence interval [CI]: [1.143-3.570]). After adjusting for endogeneity, duration of breastfeeding significantly reduced the risk of childhood asthma (RR=0.003, 95% CI: [0.000-0.240]). The findings suggest that researchers should consider evaluating how the presence of endogeneity could affect the relationship between duration of breastfeeding and the risk of childhood asthma. © 2017 EAACI and John Wiley and Sons A/S. Published by John Wiley and Sons Ltd.
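    A generic two-stage residual-inclusion (2SRI) analysis for a binary outcome with a continuous endogenous exposure can be sketched as follows. The data are simulated and the instrument name (maternity_leave_weeks) is hypothetical; the published study's instrument and exact estimator may differ.

```python
# Sketch of a two-stage residual-inclusion (2SRI) analysis for a binary
# outcome and a continuous endogenous exposure, with simulated data.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 2000
confounder = rng.normal(size=n)                 # unmeasured
maternity_leave_weeks = rng.poisson(8, n)       # hypothetical instrument

# Data generation: breastfeeding duration depends on the instrument and the
# unmeasured confounder; asthma depends on duration and the confounder.
bf_months = 2 + 0.4 * maternity_leave_weeks + 1.5 * confounder + rng.normal(size=n)
logit = -1.0 - 0.15 * bf_months + 1.2 * confounder
asthma = (rng.uniform(size=n) < 1 / (1 + np.exp(-logit))).astype(int)

# Stage 1: predict the endogenous exposure from the instrument.
Z = sm.add_constant(maternity_leave_weeks)
stage1 = sm.OLS(bf_months, Z).fit()
resid = stage1.resid

# Stage 2: logistic model for asthma including the first-stage residual,
# which serves as the endogeneity adjustment.
X2 = sm.add_constant(np.column_stack([bf_months, resid]))
stage2 = sm.GLM(asthma, X2, family=sm.families.Binomial()).fit()
print(stage2.summary())

# For comparison: the naive model that ignores endogeneity.
naive = sm.GLM(asthma, sm.add_constant(bf_months),
               family=sm.families.Binomial()).fit()
print(naive.params)
```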

  14. An Agitation Experiment with Multiple Aspects

    ERIC Educational Resources Information Center

    Spencer, Jordan L.

    2006-01-01

    This paper describes a multifaceted agitation and mixing experiment. The relatively inexpensive apparatus includes a variable-speed stirrer motor, two polycarbonate tanks, and an instrumented torque table. Students measure torque as a function of stirrer speed, and use conductive tracer data to estimate two parameters of a flow model. The effect…

  15. Three Essays in Applied Microeconomics

    ERIC Educational Resources Information Center

    Akers, Elizabeth J.

    2012-01-01

    In the first chapter, I measure the impact of student loan debt on young, college-educated workers' decisions regarding labor supply and enrollment in graduate school. I exploit variation in student loan debt driven by the formulas that determine Federal Student Aid in order to identify these effects. Instrumental variable estimates indicate…

  16. The "Trouble Shooting" Checklist for School-Based Settings (Manual).

    ERIC Educational Resources Information Center

    Manning, Brad A.

    The "Trouble Shooting Checklist" (TSC) is a diagnostic and predictive instrument designed to aid educational change agents, faculty, and administrators in estimating the effects of particular variables on an institution's potential for successfully adopting innovations. The TSC consists of 100 descriptive statements that are broken down into seven…

  17. Migraine headache and labor market outcomes.

    PubMed

    Rees, Daniel I; Sabia, Joseph J

    2015-06-01

    While migraine headache can be physically debilitating, no study has attempted to estimate its effects on labor market outcomes. Using data drawn from the National Longitudinal Study of Adolescent Health, we estimate the effect of being diagnosed with migraine headache on labor force participation, hours worked, and wages. Ordinary least squares (OLS) estimates suggest that migraines are associated with reduced labor force participation and lower wages among females. A negative association between migraine headache and the wages of female respondents is also obtained using an instrumental variables (IV) approach, although the IV estimates are imprecise relative to the OLS estimates. Copyright © 2014 John Wiley & Sons, Ltd.

  18. The Effect of Private Insurance on the Health of Older, Working Age Adults: Evidence from the Health and Retirement Study

    PubMed Central

    Dor, Avi; Sudano, Joseph; Baker, David W

    2006-01-01

    Objective Primarily, to determine if the presence of private insurance leads to improved health status, as measured by a survey-based health score. Secondarily, to explore sensitivity of estimates to adjustments for endogeneity. The study focuses on adults in late middle age who are nearing entry into Medicare. Data Sources The analysis file is drawn from the Health and Retirement Study, a national survey of relatively older adults in the labor force. The dependent variable, an index of 5 health outcome items, was obtained from the 1996 survey. Independent variables were obtained from the 1992 survey. State-level instrumental variables were obtained from the Area Resources File and the TAXSIM file. The final sample consists of 9,034 individuals of which 1,540 were uninsured. Study Design Estimation addresses endogeneity of the insurance participation decision in health score regressions. In addition to ordinary least squares (OLS), two models are tested: an instrumental variables (IV) model, and a model with endogenous treatment effects due to Heckman (1978). Insurance participation and health behaviors enter with a lag to allow their effects to dissipate over time. Separate regressions were run for groupings of chronic conditions. Principal Findings The OLS model results in statistically significant albeit small effects of insurance on the computed health score, but the results may be downward biased. Adjusting for endogeneity using state-level instrumental variables yields up to a six-fold increase in the insurance effect. Results are consistent across IV and treatment effects models, and for major groupings of medical conditions. The insurance effect appears to be in the range of about 2–11 percent. There appear to be no significant differences in the insurance effect for subgroups with and without major chronic conditions. Conclusions Extending insurance coverage to working age adults may result in improved health. By conjecture, policies aimed at expanding coverage to this population may lead to improved health at retirement and entry to Medicare, potentially leading to savings. However, further research is needed to determine whether similar results are found when alternative measures of overall health or health scores are used. Future research should also explore the use of alternative instrumental variables. Preliminary results provide no justification for targeting certain subgroups with susceptibility to certain chronic conditions rather than broad policy interventions. PMID:16704511

  19. Final Report: Wireless Instrument for Automated Measurement of Clean Cookstove Usage and Black Carbon Emissions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lukac, Martin; Ramanathan, Nithya; Graham, Eric

    2013-09-10

    Black carbon (BC) emissions from traditional cooking fires and other sources are significant anthropogenic drivers of radiative forcing. Clean cookstoves present a more energy-efficient and cleaner-burning vehicle for cooking than traditional wood-burning stoves, yet many existing cookstoves reduce emissions by only modest amounts. Further research into cookstove use, fuel types, and verification of emissions is needed as adoption rates for such stoves remain low. Accelerated innovation requires techniques for measuring and verifying such cookstove performance. The overarching goal of the proposed program was to develop a low-cost, wireless instrument to provide a high-resolution profile of cookstove BC emissions and usage in the field. We proposed transferring the complexity of analysis away from the sampling hardware at the measurement site and to software at a centrally located server to easily analyze data from thousands of sampling instruments. We were able to build a low-cost field-based instrument that produces repeatable, low-cost estimates of cookstove usage, fuel estimates, and emission values with low variability. Emission values from our instrument were consistent with published ranges of emissions for similar stove and fuel types.

  20. System and method for correcting attitude estimation

    NASA Technical Reports Server (NTRS)

    Josselson, Robert H. (Inventor)

    2010-01-01

    A system includes an angular rate sensor disposed in a vehicle for providing angular rates of the vehicle, and an instrument disposed in the vehicle for providing line-of-sight control with respect to a line-of-sight reference. The instrument includes an integrator which is configured to integrate the angular rates of the vehicle to form non-compensated attitudes. Also included is a compensator coupled across the integrator, in a feed-forward loop, for receiving the angular rates of the vehicle and outputting compensated angular rates of the vehicle. A summer combines the non-compensated attitudes and the compensated angular rates of the vehicle to form estimated vehicle attitudes for controlling the instrument with respect to the line-of-sight reference. The compensator is configured to provide error compensation to the instrument free of any feedback loop that uses an error signal. The compensator may include a transfer function providing a fixed gain to the received angular rates of the vehicle. The compensator may, alternatively, include a transfer function providing a variable gain as a function of frequency to operate on the received angular rates of the vehicle.

  1. Using instrumental variables to disentangle treatment and placebo effects in blinded and unblinded randomized clinical trials influenced by unmeasured confounders

    NASA Astrophysics Data System (ADS)

    Chaibub Neto, Elias

    2016-11-01

    Clinical trials traditionally employ blinding as a design mechanism to reduce the influence of placebo effects. In practice, however, it can be difficult or impossible to blind study participants, and unblinded trials are common in medical research. Here we show how instrumental variables can be used to quantify and disentangle treatment and placebo effects in randomized clinical trials comparing control and active treatments in the presence of confounders. The key idea is to use randomization to separately manipulate treatment assignment and psychological encouragement conversations/interactions that increase the participants' desire for improved symptoms. The proposed approach is able to improve the estimation of treatment effects in blinded studies and, most importantly, opens the door to accounting for placebo effects in unblinded trials.

  2. Clouds and the Earth's Radiant Energy System (CERES) Data Products for Climate Research

    NASA Technical Reports Server (NTRS)

    Kato, Seiji; Loeb, Norman G.; Rutan, David A.; Rose, Fred G.

    2015-01-01

    NASA's Clouds and the Earth's Radiant Energy System (CERES) project integrates CERES, Moderate Resolution Imaging Spectroradiometer (MODIS), and geostationary satellite observations to provide top-of-atmosphere (TOA) irradiances derived from broadband radiance observations by CERES instruments. It also uses snow cover and sea ice extent retrieved from microwave instruments as well as thermodynamic variables from reanalysis. In addition, these variables are used for surface and atmospheric irradiance computations. The CERES project provides TOA, surface, and atmospheric irradiances in various spatial and temporal resolutions. These data sets are for climate research and evaluation of climate models. Long-term observations are required to understand how the Earth system responds to radiative forcing. A simple model is used to estimate the time to detect trends in TOA reflected shortwave and emitted longwave irradiances.

  3. The continuum of hydroclimate variability in western North America during the last millennium

    USGS Publications Warehouse

    Ault, Toby R.; Cole, Julia E.; Overpeck, Jonathan T.; Pederson, Gregory T.; St. George, Scott; Otto-Bliesner, Bette; Woodhouse, Connie A.; Deser, Clara

    2013-01-01

    The distribution of climatic variance across the frequency spectrum has substantial importance for anticipating how climate will evolve in the future. Here we estimate power spectra and power laws (β) from instrumental, proxy, and climate model data to characterize the hydroclimate continuum in western North America (WNA). We test the significance of our estimates of spectral densities and β against the null hypothesis that they reflect solely the effects of local (non-climate) sources of autocorrelation at the monthly timescale. Although tree-ring based hydroclimate reconstructions are generally consistent with this null hypothesis, values of β calculated from long moisture-sensitive chronologies (as opposed to reconstructions), and from other types of hydroclimate proxies, exceed null expectations. We therefore argue that there is more low-frequency variability in hydroclimate than monthly autocorrelation alone can generate. Coupled model results archived as part of the Coupled Model Intercomparison Project phase 5 (CMIP5) are consistent with the null hypothesis and appear unable to generate variance in hydroclimate commensurate with paleoclimate records. Consequently, at decadal to multidecadal timescales there is more variability in instrumental and proxy data than in the models, suggesting that the risk of prolonged droughts under climate change may be underestimated by CMIP5 simulations of the future.
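    As a rough sketch of the quantity being compared, the following code estimates a spectral power-law exponent β for a toy monthly time series by fitting log power against log frequency from the periodogram. It illustrates the computation only; the significance testing against an autocorrelation-based null described above is not reproduced.

```python
# Estimate a spectral power-law exponent beta from a time series by a
# least-squares fit of log(power) on log(frequency).
import numpy as np
from scipy.signal import periodogram

rng = np.random.default_rng(2)
# Toy "hydroclimate" series: an AR(1) process sampled monthly for 500 years.
n = 500 * 12
x = np.zeros(n)
for t in range(1, n):
    x[t] = 0.3 * x[t - 1] + rng.normal()

freqs, power = periodogram(x, fs=12.0)       # fs in cycles per year
mask = freqs > 0
logf, logp = np.log10(freqs[mask]), np.log10(power[mask])

# Power law S(f) ~ f**(-beta): beta is minus the slope of the log-log fit.
slope, intercept = np.polyfit(logf, logp, 1)
beta = -slope
print(f"estimated beta = {beta:.2f}")
```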

  4. Estimation and Identification of the Complier Average Causal Effect Parameter in Education RCTs

    ERIC Educational Resources Information Center

    Schochet, Peter Z.; Chiang, Hanley S.

    2011-01-01

    In randomized control trials (RCTs) in the education field, the complier average causal effect (CACE) parameter is often of policy interest, because it pertains to intervention effects for students who receive a meaningful dose of treatment services. This article uses a causal inference and instrumental variables framework to examine the…

  5. The Perceived Impact of Agricultural Advice in Ethiopia

    ERIC Educational Resources Information Center

    Hamilton, Alexander; Hudson, John

    2017-01-01

    Purpose: We examine the impact of advice given by extension agents to Ethiopian farmers, as perceived by the farmers themselves. Design/methodology/approach: Using survey data from 2014, we analyze the perceived impact of advice on farmers' incomes and crop yields. We use a bootstrapped instrumental variable (IV) estimator and the conditional…

  6. Revising Our Thinking about the Relationship between Maternal Labor Supply and Preschool

    ERIC Educational Resources Information Center

    Fitzpatrick, Maria Donovan

    2012-01-01

    Many argue that childcare costs limit the labor supply of mothers, though existing evidence has been mixed. Using a child's eligibility for public kindergarten in a regression discontinuity instrumental variables framework, I estimate how use of a particular subsidy, public school, affects maternal labor supply. I find public school enrollment…

  7. Carbohydrate and protein but not fat or fiber affects glycemic index and glycemic load value determinations

    USDA-ARS?s Scientific Manuscript database

    Introduction: Dietary glycemic index (GI) and glycemic load (GL) values have been calculated using data derived from instruments designed to estimate daily food intake. Since the absolute amount of carbohydrate (CHO) and combination of CHO with other macronutrients and fiber is highly variable among...

  8. Within- and between-laboratory precision in the measurement of body volume using air displacement plethysmography and its effect on body composition assessment.

    PubMed

    Collins, A L; Saunders, S; McCarthy, H D; Williams, J E; Fuller, N J

    2004-01-01

    To determine and compare the extent of within- and between-laboratory precision in body volume (BV) measurements using air displacement plethysmography (ADP), the BOD POD body composition system, and to interpret any such variability in terms of body composition estimates. Repeated test procedures of BV assessment using the BOD POD ADP were reproduced at two laboratories for the estimation of precision, both within and between laboratories. In total, 30 healthy adult volunteers, 14 men (age, 19-48 y; body mass index (BMI), 19.7-30.3 kg/m2) and 16 women (age, 19-40 y; BMI, 16.3-35.7 kg/m2), were each subjected to two test procedures at both laboratories. Two additional volunteers were independently subjected to 10 repeated test procedures at both laboratories. Repeated measurements of BV, uncorrected for the effects of isothermal air in the lungs and the surface area artifact, were obtained using the BOD POD ADP, with the identical protocol being faithfully applied at both laboratories. Uncorrected BV measurements were adjusted to give estimates of actual BV that were used to calculate body density (body weight (BWt)/actual BV) from which estimates of body composition were derived. The differences between repeated BV measurements or body composition estimates were used to assess within-laboratory precision (repeatability), as standard deviation (SD) and coefficient of variation; the differences between measurements reproduced at each laboratory were used to determine between-laboratory precision (reproducibility), as bias and 95% limits of agreement (from SD of the differences between laboratories). The extent of within-laboratory methodological precision for BV (uncorrected and actual) was variable according to subject, sample group and laboratory conditions (range of SD, 0.04-0.13 l), and was mostly due to within-individual biological variability (typically 78-99%) rather than to technical imprecision. There was a significant (P<0.05) bias between laboratories for the 10 repeats on the two independent subjects (up to 0.29 l). Although no significant bias (P=0.077) was evident for the sample group of 30 volunteers (-0.05 l), the 95% limits of agreement were considerable (-0.68 to 0.58 l). The effects of this variability in BV on body composition were relatively greater: for example, within-laboratory precision (SD) for body fat as % BWt was between 0.56 and 1.34% depending on the subject and laboratory; the bias (-0.59%) was not significant between laboratories, but there were large 95% limits of agreement (-3.67 to 2.50%). Within-laboratory precision for each BOD POD instrument was reasonably good, but was variable according to the prevailing conditions. Although the bias between the two instruments was not significant for the BV measurements, implying that they can be used interchangeably for groups of similar subjects, the relatively large 95% limits of agreement indicate that greater consideration may be needed for assessing individuals with different ADP instruments. Therefore, use of a single ADP instrument is apparently preferable when assessing individuals on a longitudinal basis.
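    The precision statistics described above can be computed as in the short sketch below: within-laboratory SD and CV from duplicate measurements, and between-laboratory bias with 95% limits of agreement. All numbers are fabricated for illustration and do not correspond to the BOD POD data.

```python
# Within-laboratory SD/CV from paired repeats, and between-laboratory bias
# with 95% limits of agreement (Bland-Altman style), on made-up data.
import numpy as np

rng = np.random.default_rng(3)
n = 30
true_bv = rng.uniform(45, 85, n)                       # body volume, litres

# Two repeats in each of two laboratories (small offset in lab B).
lab_a = true_bv[:, None] + rng.normal(0, 0.08, (n, 2))
lab_b = true_bv[:, None] + 0.03 + rng.normal(0, 0.08, (n, 2))

# Within-laboratory precision from duplicates:
# SD_within = sqrt(mean(d^2) / 2) for paired differences d.
d_a = lab_a[:, 0] - lab_a[:, 1]
sd_within = np.sqrt(np.mean(d_a ** 2) / 2)
cv_within = 100 * sd_within / lab_a.mean()
print(f"within-lab SD = {sd_within:.3f} l, CV = {cv_within:.2f} %")

# Between-laboratory agreement on the subject means.
diff = lab_a.mean(axis=1) - lab_b.mean(axis=1)
bias = diff.mean()
loa = (bias - 1.96 * diff.std(ddof=1), bias + 1.96 * diff.std(ddof=1))
print(f"bias = {bias:.3f} l, 95% limits of agreement = "
      f"({loa[0]:.3f}, {loa[1]:.3f}) l")
```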

  9. The variation of polar firn subject to percolation - characterizing processes and glacier mass budget uncertainty using high-resolution instruments

    NASA Astrophysics Data System (ADS)

    Demuth, M. N.; Marshall, H.; Morris, E. M.; Burgess, D. O.; Gray, L.

    2009-12-01

    As the Earth's glaciers and ice sheets are subjected to the effects of recent and predicted warming, the distribution of their glaciological facies zones will alter. Percolation and wet snow facies zones will, in general, move upwards; encroaching upon, for some glacier configurations, regions of dry snow facies. Meltwater percolation and internal accumulation processes that characterize these highly variable facies may confound reliable estimates of surface mass budgets based on traditional point measurements alone. If the extents of these zones are indeed increasing, as has been documented through recent analysis of QuickScat data for the ice caps of the Canadian Arctic, then the certainty of glacier mass budget estimates using traditional techniques may be degraded to an as yet un-quantified degree. Indeed, the application of remote sensing, in particular that utilizing repeat altimetry to retrieve surface mass budget estimates, is also subject to the complexity of glacier facies from the standpoint of their near-surface stratigraphy, density variations and rates of compaction. We first review the problem of measuring glacier mass budgets in the context of nested scales of variability, where auto-correlation structure varies with the scale of observation. We then consider specifically firn subject to percolation and describe the application of high-resolution instruments to characterize variability at the field-scale. The data collected include measurements of micro-topography, snow hardness, and snow density and texture; retrieved using airborne scanning lidar, a snow micro-penetrometer, neutron probe and ground-penetrating radars. The analysis suggests corresponding scales of correlation as it concerns the influence of antecedent conditions (surface roughness and hardness, and stratigraphic variability) and post-depositional processes (percolation and refreezing of surface melt water).

  10. Use of instrumental variables in the analysis of generalized linear models in the presence of unmeasured confounding with applications to epidemiological research.

    PubMed

    Johnston, K M; Gustafson, P; Levy, A R; Grootendorst, P

    2008-04-30

    A major, often unstated, concern of researchers carrying out epidemiological studies of medical therapy is the potential impact on validity if estimates of treatment are biased due to unmeasured confounders. One technique for obtaining consistent estimates of treatment effects in the presence of unmeasured confounders is instrumental variables analysis (IVA). This technique has been well developed in the econometrics literature and is being increasingly used in epidemiological studies. However, the approach to IVA that is most commonly used in such studies is based on linear models, while many epidemiological applications make use of non-linear models, specifically generalized linear models (GLMs) such as logistic or Poisson regression. Here we present a simple method for applying IVA within the class of GLMs using the generalized method of moments approach. We explore some of the theoretical properties of the method and illustrate its use within both a simulation example and an epidemiological study where unmeasured confounding is suspected to be present. We estimate the effects of beta-blocker therapy on one-year all-cause mortality after an incident hospitalization for heart failure, in the absence of data describing disease severity, which is believed to be a confounder. 2008 John Wiley & Sons, Ltd
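    A minimal sketch of the generalized-method-of-moments idea for a log-link model with an endogenous regressor is given below. It uses a multiplicative-error moment condition, which is one common formulation for such models; the moment set-up, data and variable names are illustrative and not taken from the paper.

```python
# GMM-style IV estimation for a log-link (Poisson-type) model with an
# endogenous regressor, using instrument moment conditions.
import numpy as np
from scipy.optimize import root

rng = np.random.default_rng(4)
n = 5000
u = rng.normal(size=n)                       # unmeasured confounder
z = rng.normal(size=n)                       # instrument
treat = 0.8 * z + 0.7 * u + rng.normal(size=n)
y = rng.poisson(np.exp(0.2 - 0.4 * treat + 0.8 * u))

X = np.column_stack([np.ones(n), treat])     # regressors: constant, treatment
Z = np.column_stack([np.ones(n), z])         # instruments: constant, z

def iv_moments(beta):
    # Multiplicative-error moments E[Z'(y * exp(-X beta) - 1)] = 0, one
    # common way of forming IV-type moments for a log-link model.
    r = y * np.exp(-(X @ beta)) - 1.0
    return Z.T @ r / n

gmm = root(iv_moments, x0=np.zeros(2))
print("IV-GMM slope:", round(gmm.x[1], 3))   # consistent for -0.4 here; the
                                             # intercept absorbs E[exp(0.8 u)]

# "Naive" moments that use X as its own instrument are biased by u.
naive = root(lambda b: X.T @ (y * np.exp(-(X @ b)) - 1.0) / n, x0=np.zeros(2))
print("naive slope: ", round(naive.x[1], 3))
```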

  11. Changes in photochemically significant solar UV spectral irradiance as estimated by the composite Mg II index and scale factors

    NASA Technical Reports Server (NTRS)

    Deland, Matthew T.; Cebula, Richard P.

    1994-01-01

    Quantitative assessment of the impact of solar ultraviolet irradiance variations on stratospheric ozone abundances currently requires the use of proxy indicators. The Mg II core-to-wing index has been developed as an indicator of solar UV activity between 175-400 nm that is independent of most instrument artifacts, and measures solar variability on both rotational and solar cycle time scales. Linear regression fits have been used to merge the individual Mg II index data sets from the Nimbus-7, NOAA-9, and NOAA-11 instruments onto a single reference scale. The change in 27-dayrunning average of the composite Mg II index from solar maximum to solar minimum is approximately 8 percent for solar cycle 21, and approximately 9 percent for solar cycle 22 through January 1992. Scaling factors based on the short-term variations in the Mg II index and solar irradiance data sets have been developed to estimate solar variability at mid-UV and near-UV wavelengths. Near 205 nm, where solar irradiance variations are important for stratospheric photo-chemistry and dynamics, the estimated change in irradiance during solar cycle 22 is approximately 10 percent using the composite Mg II index and scale factors.

  12. Influence of therapist competence and quantity of cognitive behavioural therapy on suicidal behaviour and inpatient hospitalisation in a randomised controlled trial in borderline personality disorder: further analyses of treatment effects in the BOSCOT study.

    PubMed

    Norrie, John; Davidson, Kate; Tata, Philip; Gumley, Andrew

    2013-09-01

    We investigated the treatment effects reported from a high-quality randomized controlled trial of cognitive behavioural therapy (CBT) for 106 people with borderline personality disorder attending community-based clinics in the UK National Health Service - the BOSCOT trial. Specifically, we examined whether the amount of therapy and therapist competence had an impact on our primary outcome, the number of suicidal acts, using instrumental variables regression modelling. Randomized controlled trial. Participants from across three sites (London, Glasgow, and Ayrshire/Arran) were randomized equally to CBT for personality disorders (CBTpd) plus Treatment as Usual or to Treatment as Usual. Treatment as Usual varied between sites and individuals, but was consistent with routine treatment in the UK National Health Service at the time. CBTpd comprised an average 16 sessions (range 0-35) over 12 months. We used instrumental variable regression modelling to estimate the impact of quantity and quality of therapy received (recording activities and behaviours that took place after randomization) on number of suicidal acts and inpatient psychiatric hospitalization. A total of 101 participants provided full outcome data at 2 years post randomization. The previously reported intention-to-treat (ITT) results showed on average a reduction of 0.91 (95% confidence interval 0.15-1.67) suicidal acts over 2 years for those randomized to CBT. By incorporating the influence of quantity of therapy and therapist competence, we show that this estimate of the effect of CBTpd could be approximately two to three times greater for those receiving the right amount of therapy from a competent therapist. Trials should routinely control for and collect data on both quantity of therapy and therapist competence, which can be used, via instrumental variable regression modelling, to estimate treatment effects for optimal delivery of therapy. Such estimates complement rather than replace the ITT results, which are properly the principal analysis results from such trials. © 2013 The British Psychological Society.

  13. Decoupling Solar Variability and Instrument Trends Using the Multiple Same-Irradiance-Level (MuSIL) Analysis Technique

    NASA Astrophysics Data System (ADS)

    Woods, Thomas N.; Eparvier, Francis G.; Harder, Jerald; Snow, Martin

    2018-05-01

    The solar spectral irradiance (SSI) dataset is a key record for studying and understanding the energetics and radiation balance in Earth's environment. Understanding the long-term variations of the SSI over timescales of the 11-year solar activity cycle and longer is critical for many Sun-Earth research topics. Satellite measurements of the SSI have been made since the 1970s, most of them in the ultraviolet, but recently also in the visible and near-infrared. A limiting factor for the accuracy of previous solar variability results is the uncertainties for the instrument degradation corrections, which need fairly large corrections relative to the amount of solar cycle variability at some wavelengths. The primary objective of this investigation has been to separate out solar cycle variability and any residual uncorrected instrumental trends in the SSI measurements from the Solar Radiation and Climate Experiment (SORCE) mission and the Thermosphere, Mesosphere, Ionosphere, Energetic, and Dynamics (TIMED) mission. A new technique called the Multiple Same-Irradiance-Level (MuSIL) analysis has been developed, which examines an SSI time series at different levels of solar activity to provide long-term trends in an SSI record, and the most common result is a downward trend that most likely stems from uncorrected instrument degradation. This technique has been applied to each wavelength in the SSI records from SORCE (2003 - present) and TIMED (2002 - present) to provide new solar cycle variability results between 27 nm and 1600 nm with a resolution of about 1 nm at most wavelengths. This technique, which was validated with the highly accurate total solar irradiance (TSI) record, has an estimated relative uncertainty of about 5% of the measured solar cycle variability. The MuSIL results are further validated with the comparison of the new solar cycle variability results from different solar cycles.

  14. Combining fixed effects and instrumental variable approaches for estimating the effect of psychosocial job quality on mental health: evidence from 13 waves of a nationally representative cohort study.

    PubMed

    Milner, Allison; Aitken, Zoe; Kavanagh, Anne; LaMontagne, Anthony D; Pega, Frank; Petrie, Dennis

    2017-06-23

    Previous studies suggest that poor psychosocial job quality is a risk factor for mental health problems, but they use conventional regression analytic methods that cannot rule out reverse causation, unmeasured time-invariant confounding and reporting bias. This study combines two quasi-experimental approaches to improve causal inference by better accounting for these biases: (i) linear fixed effects regression analysis and (ii) linear instrumental variable analysis. We extract 13 annual waves of national cohort data including 13,260 working-age (18-64 years) employees. The exposure variable is self-reported level of psychosocial job quality. The instruments used are two common workplace entitlements. The outcome variable is the Mental Health Inventory (MHI-5). We adjust for measured time-varying confounders. In the fixed effects regression analysis adjusted for time-varying confounders, a 1-point increase in psychosocial job quality is associated with a 1.28-point improvement in mental health on the MHI-5 scale (95% CI: 1.17, 1.40; P < 0.001). When the fixed effects analysis was combined with the instrumental variable analysis, a 1-point increase in psychosocial job quality is related to a 1.62-point improvement on the MHI-5 scale (95% CI: -0.24, 3.48; P = 0.088). Our quasi-experimental results provide evidence to confirm job stressors as risk factors for mental ill health using methods that improve causal inference. © The Author 2017. Published by Oxford University Press on behalf of Faculty of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com
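    The combination of the two approaches can be sketched as follows: demean every variable within person to remove time-invariant confounding, then apply an instrumental-variable estimator to the demeaned data. The data, coefficients and variable names (job_quality, entitlement, mhi5) are simulated placeholders, not the cohort data analysed above.

```python
# Fixed effects combined with IV: within-person demeaning followed by a
# simple (exactly identified) IV estimator on the demeaned data.
import numpy as np
import pandas as pd

rng = np.random.default_rng(5)
n_people, n_waves = 1000, 13
pid = np.repeat(np.arange(n_people), n_waves)
alpha = np.repeat(rng.normal(size=n_people), n_waves)   # person fixed effect
entitlement = rng.normal(size=pid.size)                 # time-varying instrument
v = rng.normal(size=pid.size)                           # time-varying confounder
job_quality = 0.6 * entitlement + alpha + 0.5 * v + rng.normal(size=pid.size)
mhi5 = 1.3 * job_quality + 2.0 * alpha + 1.0 * v + rng.normal(size=pid.size)

df = pd.DataFrame({"pid": pid, "mhi5": mhi5,
                   "job_quality": job_quality, "entitlement": entitlement})

# Within transformation: subtract each person's mean from every variable.
vars_ = df[["mhi5", "job_quality", "entitlement"]]
demeaned = vars_ - vars_.groupby(df["pid"]).transform("mean")

x = demeaned["job_quality"].to_numpy()
y = demeaned["mhi5"].to_numpy()
z = demeaned["entitlement"].to_numpy()
beta_fe_iv = (z @ y) / (z @ x)          # IV estimator z'y / z'x
beta_fe = (x @ y) / (x @ x)             # plain fixed-effects (OLS) estimator
print(f"FE-IV estimate:   {beta_fe_iv:.3f}")  # targets 1.3 despite v
print(f"FE-only estimate: {beta_fe:.3f}")     # still biased by the time-varying v
```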

  15. How Financial Literacy Affects Household Wealth Accumulation.

    PubMed

    Behrman, Jere R; Mitchell, Olivia S; Soo, Cindy K; Bravo, David

    2012-05-01

    This study isolates the causal effects of financial literacy and schooling on wealth accumulation using a new household dataset and an instrumental variables (IV) approach. Financial literacy and schooling attainment are both strongly positively associated with wealth outcomes in linear regression models, whereas the IV estimates reveal even more potent effects of financial literacy. They also indicate that the schooling effect only becomes positive when interacted with financial literacy. Estimated impacts are substantial enough to imply that investments in financial literacy could have large wealth payoffs.

  16. How Financial Literacy Affects Household Wealth Accumulation

    PubMed Central

    Behrman, Jere R.; Mitchell, Olivia S.; Soo, Cindy K.; Bravo, David

    2012-01-01

    This study isolates the causal effects of financial literacy and schooling on wealth accumulation using a new household dataset and an instrumental variables (IV) approach. Financial literacy and schooling attainment are both strongly positively associated with wealth outcomes in linear regression models, whereas the IV estimates reveal even more potent effects of financial literacy. They also indicate that the schooling effect only becomes positive when interacted with financial literacy. Estimated impacts are substantial enough to imply that investments in financial literacy could have large wealth payoffs. PMID:23355747

  17. Reducing random measurement error in assessing postural load on the back in epidemiologic surveys.

    PubMed

    Burdorf, A

    1995-02-01

    The goal of this study was to design strategies to assess postural load on the back in occupational epidemiology by taking into account the reliability of measurement methods and the variability of exposure among the workers under study. Intermethod reliability studies were evaluated to estimate the systematic bias (accuracy) and random measurement error (precision) of various methods to assess postural load on the back. Intramethod reliability studies were reviewed to estimate random variability of back load over time. Intermethod surveys have shown that questionnaires have a moderate reliability for gross activities such as sitting, whereas duration of trunk flexion and rotation should be assessed by observation methods or inclinometers. Intramethod surveys indicate that exposure variability can markedly affect the reliability of estimates of back load if the estimates are based upon a single measurement over a certain time period. Equations have been presented to evaluate various study designs according to the reliability of the measurement method, the optimum allocation of the number of repeated measurements per subject, and the number of subjects in the study. Prior to a large epidemiologic study, an exposure-oriented survey should be conducted to evaluate the performance of measurement instruments and to estimate sources of variability for back load. The strategy for assessing back load can be optimized by balancing the number of workers under study and the number of repeated measurements per worker.
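    The design trade-off described above, between the number of subjects and the number of repeated measurements per subject, is often summarised by the reliability of a k-measurement mean given between- and within-worker variance components. The sketch below uses the standard formula with arbitrary variance values for illustration.

```python
# Reliability of a worker's mean exposure estimate as a function of the
# number of repeated measurements k, for given variance components.
# The variance values below are arbitrary placeholders.
sigma2_between = 1.0      # variance of true exposure between workers
sigma2_within = 3.0       # day-to-day (within-worker) variance

def reliability(k):
    """Reliability (ICC) of the mean of k repeated measurements per worker."""
    return sigma2_between / (sigma2_between + sigma2_within / k)

for k in (1, 2, 4, 8, 16):
    print(f"k = {k:2d}  reliability = {reliability(k):.2f}")

# The same quantity is the attenuation factor for a regression slope when a
# k-measurement mean is used as the exposure variable in an epidemiologic model.
```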

  18. Suwannee River flow variability 1550-2005 CE reconstructed from a multispecies tree-ring network

    NASA Astrophysics Data System (ADS)

    Harley, Grant L.; Maxwell, Justin T.; Larson, Evan; Grissino-Mayer, Henri D.; Henderson, Joseph; Huffman, Jean

    2017-01-01

    Understanding the long-term natural flow regime of rivers enables resource managers to more accurately model water level variability. Models for managing water resources are important in Florida where population increase is escalating demand on water resources and infrastructure. The Suwannee River is the second largest river system in Florida and the least impacted by anthropogenic disturbance. We used new and existing tree-ring chronologies from multiple species to reconstruct mean March-October discharge for the Suwannee River during the period 1550-2005 CE and place the short period of instrumental flows (since 1927 CE) into historical context. We used a nested principal components regression method to maximize the use of chronologies with varying time coverage in the network. Modeled streamflow estimates indicated that instrumental period flow conditions do not adequately capture the full range of Suwannee River flow variability beyond the observational period. Although extreme dry and wet events occurred in the gage record, pluvials and droughts that eclipse the intensity and duration of instrumental events occurred during the 16-19th centuries. The most prolonged and severe dry conditions during the past 450 years occurred during the 1560s CE. In this prolonged drought period mean flow was estimated at 17% of the mean instrumental period flow. Significant peaks in spectral density at 2-7, 10, 45, and 85-year periodicities indicated the important influence of coupled oceanic-atmospheric processes on Suwannee River streamflow over the past four centuries, though the strength of these periodicities varied over time. Future water planning based on current flow expectations could prove devastating to natural and human systems if a prolonged and severe drought mirroring the 16th and 18th century events occurred. Future work in the region will focus on updating existing tree-ring chronologies and developing new collections from moisture-sensitive sites to improve understandings of past hydroclimate in the region.

  19. Estimation of Sensory Pork Loin Tenderness Using Warner-Bratzler Shear Force and Texture Profile Analysis Measurements

    PubMed Central

    Choe, Jee-Hwan; Choi, Mi-Hee; Rhee, Min-Suk; Kim, Byoung-Chul

    2016-01-01

    This study investigated the degree to which instrumental measurements explain the variation in pork loin tenderness as assessed by the sensory evaluation of trained panelists. Warner-Bratzler shear force (WBS) had a significant relationship with the sensory tenderness variables, such as softness, initial tenderness, chewiness, and rate of breakdown. In a regression analysis, WBS could account for variation in these sensory variables, though only for a limited proportion of it. On the other hand, three parameters from texture profile analysis (TPA), namely hardness, gumminess, and chewiness, were significantly correlated with all sensory evaluation variables. In particular, from the result of stepwise regression analysis, TPA hardness alone explained over 15% of the variation in all sensory evaluation variables, with the exception of perceptible residue. Based on these results, TPA analysis was found to be better than WBS measurement, with the TPA parameter hardness likely to prove particularly useful, in terms of predicting pork loin tenderness as rated by trained panelists. However, sensory evaluation should be conducted to investigate the practical pork tenderness perceived by consumers, because both instrumental measurements could explain only a small portion (less than 20%) of the variability in sensory evaluation. PMID:26954174

  20. Estimation of Sensory Pork Loin Tenderness Using Warner-Bratzler Shear Force and Texture Profile Analysis Measurements.

    PubMed

    Choe, Jee-Hwan; Choi, Mi-Hee; Rhee, Min-Suk; Kim, Byoung-Chul

    2016-07-01

    This study investigated the degree to which instrumental measurements explain the variation in pork loin tenderness as assessed by the sensory evaluation of trained panelists. Warner-Bratzler shear force (WBS) had a significant relationship with the sensory tenderness variables, such as softness, initial tenderness, chewiness, and rate of breakdown. In a regression analysis, WBS could account for variation in these sensory variables, though only for a limited proportion of it. On the other hand, three parameters from texture profile analysis (TPA), namely hardness, gumminess, and chewiness, were significantly correlated with all sensory evaluation variables. In particular, from the result of stepwise regression analysis, TPA hardness alone explained over 15% of the variation in all sensory evaluation variables, with the exception of perceptible residue. Based on these results, TPA analysis was found to be better than WBS measurement, with the TPA parameter hardness likely to prove particularly useful, in terms of predicting pork loin tenderness as rated by trained panelists. However, sensory evaluation should be conducted to investigate the practical pork tenderness perceived by consumers, because both instrumental measurements could explain only a small portion (less than 20%) of the variability in sensory evaluation.

  1. The Causal Effect of Class Size on Academic Achievement: Multivariate Instrumental Variable Estimators with Data Missing at Random

    ERIC Educational Resources Information Center

    Shin, Yongyun; Raudenbush, Stephen W.

    2011-01-01

    This article addresses three questions: Does reduced class size cause higher academic achievement in reading, mathematics, listening, and word recognition skills? If it does, how large are these effects? Does the magnitude of such effects vary significantly across schools? The authors analyze data from Tennessee's Student/Teacher Achievement Ratio…

  2. Modeling Outcomes with Floor or Ceiling Effects: An Introduction to the Tobit Model

    ERIC Educational Resources Information Center

    McBee, Matthew

    2010-01-01

    In gifted education research, it is common for outcome variables to exhibit strong floor or ceiling effects due to insufficient range of measurement of many instruments when used with gifted populations. Common statistical methods (e.g., analysis of variance, linear regression) produce biased estimates when such effects are present. In practice,…
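    A compact sketch of the Tobit likelihood for a left-censored (floor-effect) outcome is shown below, fitted by maximum likelihood on simulated data and compared with ordinary least squares. The data and coefficients are invented and are not tied to any particular gifted-education instrument.

```python
# Minimal Tobit sketch: maximum likelihood for an outcome censored at a floor
# of zero, the situation a floor effect creates.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(6)
n = 2000
x = rng.normal(size=n)
y_star = 0.5 + 1.2 * x + rng.normal(scale=1.0, size=n)   # latent outcome
y = np.maximum(y_star, 0.0)                              # observed, floored at 0

X = np.column_stack([np.ones(n), x])
censored = y <= 0.0

def neg_loglik(params):
    b, log_sigma = params[:2], params[2]
    sigma = np.exp(log_sigma)
    xb = X @ b
    # Uncensored observations contribute a normal density; censored ones
    # contribute the probability mass below the floor.
    ll_unc = norm.logpdf(y[~censored], loc=xb[~censored], scale=sigma)
    ll_cen = norm.logcdf(-xb[censored] / sigma)
    return -(ll_unc.sum() + ll_cen.sum())

fit = minimize(neg_loglik, x0=np.zeros(3), method="BFGS")
b0, b1, sigma = fit.x[0], fit.x[1], np.exp(fit.x[2])
print(f"Tobit: intercept={b0:.2f}, slope={b1:.2f}, sigma={sigma:.2f}")

# OLS on the censored outcome understates the slope (biased toward zero).
ols_slope = np.polyfit(x, y, 1)[0]
print(f"OLS slope on censored y: {ols_slope:.2f}")
```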

  3. Generalized Path Analysis and Generalized Simultaneous Equations Model for Recursive Systems with Responses of Mixed Types

    ERIC Educational Resources Information Center

    Tsai, Tien-Lung; Shau, Wen-Yi; Hu, Fu-Chang

    2006-01-01

    This article generalizes linear path analysis (PA) and simultaneous equations models (SiEM) to deal with mixed responses of different types in a recursive or triangular system. An efficient instrumental variable (IV) method for estimating the structural coefficients of a 2-equation partially recursive generalized path analysis (GPA) model and…

  4. Mechanisms for the Association between Maternal Employment and Child Cognitive Development. NBER Working Paper No. 13609

    ERIC Educational Resources Information Center

    Cawley, John; Liu, Feng

    2007-01-01

    Recent research has found that maternal employment is associated with worse child performance on tests of cognitive ability. This paper explores mechanisms for that correlation. We estimate models of instrumental variables using a unique dataset, the American Time Use Survey, that measure the effect of maternal employment on the mother's…

  5. The Faces Symbol Test, a newly developed screening instrument to assess cognitive decline related to multiple sclerosis: first results of the Berlin Multi-Centre FST Validation Study.

    PubMed

    Scherer, P; Penner, I K; Rohr, A; Boldt, H; Ringel, I; Wilke-Burger, H; Burger-Deinerth, E; Isakowitsch, K; Zimmermann, M; Zahrnt, S; Hauser, R; Hilbert, K; Tiel-Wilck, K; Anvari, K; Behringer, A; Peglau, I; Friedrich, H; Plenio, A; Benesch, G; Ehret, R; Nippert, I; Finke, G; Schulz, I; Bergtholdt, B; Breitkopf, S; Kaskel, P; Reischies, F; Kugler, J

    2007-04-01

    Reliable, language-independent, short screening instruments to test for cognitive function in patients with multiple sclerosis (MS) remain rare, despite the high number of patients affected by cognitive decline. We developed a new, short screening instrument, the Faces Symbol Test (FST), and compared its diagnostic test characteristics with a composite of the Digit Symbol Substitution Test (DSST) and the Paced Auditory Serial Addition Test (PASAT), in 108 MS patients and 33 healthy controls. An Informant-Report Questionnaire, a Self-Report Questionnaire, and a neurologist's estimation of the Every Day Life Cognitive Status were also applied to the MS patients. The statistical analyses comprised a receiver operating characteristic analysis for test accuracy and an analysis of confounding variables. The PASAT and DSST composite score estimated that 36.5% of the MS patients had cognitive impairment. The FST estimated that 40.7% of the MS patients were cognitively impaired (sensitivity 84%; specificity 85%). The FST, DSST and PASAT results were significantly correlated with the patients' physical impairment, as measured by the Expanded Disability Status Scale (EDSS). The results suggest that the FST might be a culture-free, sensitive, and practical short screening instrument for the detection of cognitive decline in patients with MS, including those in the early stages.

  6. The productivity of mental health care: an instrumental variable approach.

    PubMed

    Lu, Mingshan

    1999-06-01

    BACKGROUND: As with many other medical technologies and treatments, there is a lack of reliable evidence on the treatment effectiveness of mental health care. Increasingly, data from non-experimental settings are being used to study the effect of treatment. However, as in a number of studies using non-experimental data, a simple regression of outcome on treatment shows a puzzling negative and significant impact of mental health care on the improvement of mental health status, even after including a large number of potential control variables. The central problem in interpreting evidence from real-world or non-experimental settings is, therefore, the potential "selection bias" problem in observational data sets. In other words, the choice/quantity of mental health care may be correlated with other variables, particularly unobserved variables, that influence outcome and this may lead to a bias in the estimate of the effect of care in conventional models. AIMS OF THE STUDY: This paper addresses the issue of estimating treatment effects using an observational data set. The information in a mental health data set obtained from two waves of data in Puerto Rico is explored. The results using conventional models - in which the potential selection bias is not controlled - and those from instrumental variable (IV) models - which is what was proposed in this study to correct for the contaminated estimation from conventional models - are compared. METHODS: Treatment effectiveness is estimated in a production function framework. Effectiveness is measured as the improvement in mental health status. To control for the potential selection bias problem, IV approaches are employed. The essence of the IV method is to use one or more instruments, which are observable factors that influence treatment but do not directly affect patient outcomes, to isolate the effect of treatment variation that is independent of unobserved patient characteristics. The data used in this study are the first (1992-1993) and second (1993-1994) wave of the ongoing longitudinal study Mental Health Care Utilization Among Puerto Ricans, which includes information for an island-wide probability sample of over 3000 adults living in poor areas of Puerto Rico. The instrumental variables employed in this study are travel distance and health insurance sources. RESULTS: It is very noticeable that in this study, treatment effects were found to be negative in all conventional models (in some cases, highly significant). However, after the IV method was applied, the estimated marginal effects of treatment became positive. Sensitivity analysis partly supports this conclusion. According to the IV estimation results, treatment is productive for the group in most need of mental health care. However, estimations do not find strong enough evidence to demonstrate treatment effects on other groups with less or no need. The results in this paper also suggest an important impact of the following factors on the probability of improvement in mental health status: baseline mental health status, previous treatment, sex, marital status and education. DISCUSSION: The IV approach provides a practical way to reduce the selection bias due to the confounding of treatment with unmeasured variables. The limitation of this study is that the instruments explored did not perform well enough in some IV equations; therefore, the predictive power remains questionable.
The most challenging part of applying the IV approach is finding "good" instruments, which influence the choice/quantity of treatment yet do not introduce further bias by being directly correlated with treatment outcome. CONCLUSIONS: The results in this paper are supportive of the concerns about the credibility of evaluation results based on observational data sets when the endogeneity of the treatment variable is not controlled. Unobserved factors contribute to the downward bias in the conventional models. The IV approach is shown to be an appropriate method to reduce the selection bias for the group in most need of mental health care, which is also the group of most policy and treatment concern. IMPLICATIONS FOR HEALTH CARE PROVISION AND USE: The results of this work have implications for resource allocation in mental health care. Evidence is found that mental health care provided in Puerto Rico is productive, and is most helpful for persons in most need of mental health care. According to what is estimated from the IV models, on the margin, receiving formal mental health care significantly increases the probability of obtaining a better mental health outcome by 19.2%, and a one-unit increase in formal treatment increases the probability of becoming healthier by 6.2% to 8.4%. Consistent with other mental health literature, an individual's baseline mental health status is found to be significantly related to the probability of improvement in mental health status: individuals with previous treatment history are less likely to improve. Among demographic factors included in the production function, being female, being married, and having higher education were found to contribute to a higher probability of improvement. IMPLICATIONS FOR FURTHER RESEARCH: In order to provide accurate evidence of treatment effectiveness of medical technologies to support decision making, it is important that the selection bias be controlled as rigorously as possible when using information from a non-experimental setting. More data and a longer panel are also needed to provide more valid evidence.

  7. Estimations of natural variability between satellite measurements of trace species concentrations

    NASA Astrophysics Data System (ADS)

    Sheese, P.; Walker, K. A.; Boone, C. D.; Degenstein, D. A.; Kolonjari, F.; Plummer, D. A.; von Clarmann, T.

    2017-12-01

    In order to validate satellite measurements of atmospheric states, it is necessary to understand the range of random and systematic errors inherent in the measurements. On occasions where the measurements do not agree within those errors, a common "go-to" explanation is that the unexplained difference can be chalked up to "natural variability". However, the expected natural variability is often left ambiguous and rarely quantified. This study will look to quantify the expected natural variability of both O3 and NO2 between two satellite instruments: ACE-FTS (Atmospheric Chemistry Experiment - Fourier Transform Spectrometer) and OSIRIS (Optical Spectrograph and Infrared Imaging System). By sampling the CMAM30 (30-year specified dynamics simulation of the Canadian Middle Atmosphere Model) climate chemistry model throughout the upper troposphere and stratosphere at times and geolocations of coincident ACE-FTS and OSIRIS measurements at varying coincidence criteria, height-dependent expected values of O3 and NO2 variability will be estimated and reported on. The results could also be used to better optimize the coincidence criteria used in satellite measurement validation studies.

  8. Bank Size and Small- and Medium-sized Enterprise (SME) Lending: Evidence from China.

    PubMed

    Shen, Yan; Shen, Minggao; Xu, Zhong; Bai, Ying

    2009-04-01

    Using panel data collected in 2005, we evaluate how bank size, discretion over credit, incentive schemes, competition, and the institutional environment affect lending to small- and medium-sized enterprises in China. We deal with the endogeneity problem using instrumental variables, and a reduced-form approach is also applied to allow for weak instruments in estimation. We find that total bank asset is an insignificant factor for banks' decision on small- and medium-enterprise (SME) lending, but more local lending authority, more competition, carefully designed incentive schemes, and stronger law enforcement encourage commercial banks to lend to SMEs.

  9. Statistical analysis of the calibration procedure for personnel radiation measurement instruments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bush, W.J.; Bengston, S.J.; Kalbeitzer, F.L.

    1980-11-01

    Thermoluminescent analyzer (TLA) calibration procedures were used to estimate personnel radiation exposure levels at the Idaho National Engineering Laboratory (INEL). A statistical analysis is presented herein based on data collected over a six month period in 1979 on four TLA's located in the Department of Energy (DOE) Radiological and Environmental Sciences Laboratory at the INEL. The data were collected according to the day-to-day procedure in effect at that time. Both gamma and beta radiation models are developed. Observed TLA readings of thermoluminescent dosimeters are correlated with known radiation levels. This correlation is then used to predict unknown radiation doses from future analyzer readings of personnel thermoluminescent dosimeters. The statistical techniques applied in this analysis include weighted linear regression, estimation of systematic and random error variances, prediction interval estimation using Scheffe's theory of calibration, the estimation of the ratio of the means of two normal bivariate distributed random variables and their corresponding confidence limits according to Kendall and Stuart, tests of normality, experimental design, a comparison between instruments, and quality control.
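    The core calibration step described above, a weighted linear regression of analyzer readings on known doses that is then inverted to estimate an unknown dose from a future reading, can be sketched as follows. The doses, readings and weights are fabricated, and the Scheffe-type prediction intervals mentioned in the abstract are not computed.

```python
# Weighted linear regression calibration curve and inverse prediction.
import numpy as np
import statsmodels.api as sm

known_dose = np.array([10, 20, 50, 100, 200, 500], dtype=float)   # delivered dose
readings = np.array([10.8, 19.5, 52.1, 98.7, 204.0, 509.3])       # analyzer output
weights = 1.0 / known_dose        # assume reading variance grows with dose

X = sm.add_constant(known_dose)
cal = sm.WLS(readings, X, weights=weights).fit()
intercept, slope = cal.params
print(cal.summary())

# Inverse prediction: estimated dose for a new analyzer reading.
new_reading = 75.0
estimated_dose = (new_reading - intercept) / slope
print(f"estimated dose for reading {new_reading}: {estimated_dose:.1f}")
```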

  10. State estimation applications in aircraft flight-data analysis: A user's manual for SMACK

    NASA Technical Reports Server (NTRS)

    Bach, Ralph E., Jr.

    1991-01-01

    The evolution in the use of state estimation is traced for the analysis of aircraft flight data. A unifying mathematical framework for state estimation is reviewed, and several examples are presented that illustrate a general approach for checking instrument accuracy and data consistency, and for estimating variables that are difficult to measure. Recent applications associated with research aircraft flight tests and airline turbulence upsets are described. A computer program for aircraft state estimation is discussed in some detail. This document is intended to serve as a user's manual for the program called SMACK (SMoothing for AirCraft Kinematics). The diversity of the applications described emphasizes the potential advantages in using SMACK for flight-data analysis.

  11. Health care demand elasticities by type of service.

    PubMed

    Ellis, Randall P; Martins, Bruno; Zhu, Wenjia

    2017-09-01

    We estimate within-year price elasticities of demand for detailed health care services using an instrumental variable strategy, in which individual monthly cost shares are instrumented by employer-year-plan-month average cost shares. A specification using backward myopic prices gives more plausible and stable results than using forward myopic prices. Using 171 million person-months spanning 73 employers from 2008 to 2014, we estimate that the overall demand elasticity by backward myopic consumers is -0.44, with higher elasticities of demand for pharmaceuticals (-0.44), specialists visits (-0.32), MRIs (-0.29) and mental health/substance abuse (-0.26), and lower elasticities for prevention visits (-0.02) and emergency rooms (-0.04). Demand response is lower for children, in larger firms, among hourly waged employees, and for sicker people. Overall the method appears promising for estimating elasticities for highly disaggregated services although the approach does not work well on services that are very expensive or persistent. Copyright © 2017 Elsevier B.V. All rights reserved.

  12. Combating Unmeasured Confounding in Cross-Sectional Studies: Evaluating Instrumental-Variable and Heckman Selection Models

    PubMed Central

    DeMaris, Alfred

    2014-01-01

    Unmeasured confounding is the principal threat to unbiased estimation of treatment “effects” (i.e., regression parameters for binary regressors) in nonexperimental research. It refers to unmeasured characteristics of individuals that lead them both to be in a particular “treatment” category and to register higher or lower values than others on a response variable. In this article, I introduce readers to 2 econometric techniques designed to control the problem, with a particular emphasis on the Heckman selection model (HSM). Both techniques can be used with only cross-sectional data. Using a Monte Carlo experiment, I compare the performance of instrumental-variable regression (IVR) and HSM to that of ordinary least squares (OLS) under conditions with treatment and unmeasured confounding both present and absent. I find HSM generally to outperform IVR with respect to mean-square-error of treatment estimates, as well as power for detecting either a treatment effect or unobserved confounding. However, both HSM and IVR require a large sample to be fully effective. The use of HSM and IVR in tandem with OLS to untangle unobserved confounding bias in cross-sectional data is further demonstrated with an empirical application. Using data from the 2006–2010 General Social Survey (National Opinion Research Center, 2014), I examine the association between being married and subjective well-being. PMID:25110904
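    A two-step estimator in the spirit of the Heckman-type models discussed above can be sketched as follows for a binary treatment with unmeasured confounding: a probit for treatment status, followed by an outcome regression that adds the generalized residual (inverse-Mills-type term) as a control. The data are simulated and an exclusion restriction (the variable z) is assumed.

```python
# Two-step control-function estimator for a binary treatment with
# correlated (confounded) errors, compared with naive OLS.
import numpy as np
import statsmodels.api as sm
from scipy.stats import norm

rng = np.random.default_rng(7)
n = 5000
x = rng.normal(size=n)
z = rng.normal(size=n)                           # appears only in the treatment eq.
u = rng.normal(size=n)
eps = 0.8 * u + rng.normal(scale=0.6, size=n)    # correlated errors -> confounding

treated = (0.3 + 0.9 * z + 0.4 * x + u > 0).astype(int)
y = 1.0 + 0.5 * x + 2.0 * treated + eps          # true treatment effect = 2.0

# Step 1: probit for treatment on x and the excluded instrument z.
W = sm.add_constant(np.column_stack([x, z]))
probit = sm.Probit(treated, W).fit(disp=False)
idx = W @ probit.params
lam = np.where(treated == 1,
               norm.pdf(idx) / norm.cdf(idx),
               -norm.pdf(idx) / (1 - norm.cdf(idx)))

# Step 2: outcome regression including the generalized residual.
X2 = sm.add_constant(np.column_stack([x, treated, lam]))
step2 = sm.OLS(y, X2).fit()
print("two-step treatment effect:", round(step2.params[2], 3))

# Naive OLS ignoring the confounding overstates the effect here.
naive = sm.OLS(y, sm.add_constant(np.column_stack([x, treated]))).fit()
print("naive OLS treatment effect:", round(naive.params[2], 3))
```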

  13. Microarray image analysis: background estimation using quantile and morphological filters.

    PubMed

    Bengtsson, Anders; Bengtsson, Henrik

    2006-02-28

    In a microarray experiment the difference in expression between genes on the same slide is up to 10³-fold or more. At low expression, even a small error in the estimate will have great influence on the final test and reference ratios. In addition to the true spot intensity, the scanned signal consists of different kinds of noise, referred to as background. In order to assess the true spot intensity, background must be subtracted. The standard approach to estimate background intensities is to assume they are equal to the intensity levels between spots. In the literature, morphological opening is suggested to be one of the best methods for estimating background this way. This paper examines fundamental properties of rank and quantile filters, which include morphological filters at the extremes, with focus on their ability to estimate between-spot intensity levels. The bias and variance of these filter estimates are driven by the number of background pixels used and their distributions. A new rank-filter algorithm is implemented and compared to methods available in Spot by CSIRO and GenePix Pro by Axon Instruments. Spot's morphological opening has a mean bias between -47 and -248 compared to a bias between 2 and -2 for the rank filter, and the variability of the morphological opening estimate is 3 times higher than for the rank filter. The mean bias of Spot's second method, morph.close.open, is between -5 and -16 and the variability is approximately the same as for morphological opening. The variability of GenePix Pro's region-based estimate is more than ten times higher than the variability of the rank-filter estimate, with slightly more bias. The large variability is because the size of the background window changes with spot size. To overcome this, a non-adaptive region-based method is implemented. Its bias and variability are comparable to that of the rank filter. The performance of more advanced rank filters is equal to the best region-based methods. However, in order to get unbiased estimates, these filters have to be implemented with great care. The performance of morphological opening is in general poor, with a substantial spatially dependent bias.

  14. Early Teen Marriage and Future Poverty

    PubMed Central

    DAHL, GORDON B.

    2010-01-01

    Both early teen marriage and dropping out of high school have historically been associated with a variety of negative outcomes, including higher poverty rates throughout life. Are these negative outcomes due to preexisting differences, or do they represent the causal effect of marriage and schooling choices? To better understand the true personal and societal consequences, in this article, I use an instrumental variables (IV) approach that takes advantage of variation in state laws regulating the age at which individuals are allowed to marry, drop out of school, and begin work. The baseline IV estimate indicates that a woman who marries young is 31 percentage points more likely to live in poverty when she is older. Similarly, a woman who drops out of school is 11 percentage points more likely to be poor. The results are robust to a variety of alternative specifications and estimation methods, including limited information maximum likelihood (LIML) estimation and a control function approach. While grouped ordinary least squares (OLS) estimates for the early teen marriage variable are also large, OLS estimates based on individual-level data are small, consistent with a large amount of measurement error. PMID:20879684

  15. Early teen marriage and future poverty.

    PubMed

    Dahl, Gordon B

    2010-08-01

    Both early teen marriage and dropping out of high school have historically been associated with a variety of negative outcomes, including higher poverty rates throughout life. Are these negative outcomes due to preexisting differences, or do they represent the causal effect of marriage and schooling choices? To better understand the true personal and societal consequences, in this article, I use an instrumental variables (IV) approach that takes advantage of variation in state laws regulating the age at which individuals are allowed to marry, drop out of school, and begin work. The baseline IV estimate indicates that a woman who marries young is 31 percentage points more likely to live in poverty when she is older. Similarly, a woman who drops out of school is 11 percentage points more likely to be poor. The results are robust to a variety of alternative specifications and estimation methods, including limited information maximum likelihood (LIML) estimation and a control function approach. While grouped ordinary least squares (OLS) estimates for the early teen marriage variable are also large, OLS estimates based on individual-level data are small, consistent with a large amount of measurement error.

  16. Getting a Job Is Only Half the Battle: Maternal Job Loss and Child Classroom Behavior in Low-Income Families

    ERIC Educational Resources Information Center

    Hill, Heather D.; Morris, Pamela A.; Castells, Nina; Walker, Jessica Thornton

    2011-01-01

    This study uses data from an experimental employment program and instrumental variables (IV) estimation to examine the effects of maternal job loss on child classroom behavior. Random assignment to the treatment at one of three program sites is an exogenous predictor of employment patterns. Cross-site variation in treatment-control differences is…

  17. The Difference That One Year of Schooling Makes for Russian Schoolchildren. Based on PISA 2009: Reading

    ERIC Educational Resources Information Center

    Tiumeneva, Yu. A.; Kuzmina, Ju. V.

    2015-01-01

    Using the PISA 2009 reading data, we investigated the effectiveness of one year of schooling in seven countries: Russia, the Czech Republic, Hungary, Slovakia, Germany, Canada, and Brazil. We used an instrumental variable, which allowed us to estimate the effect of one year of schooling through a fuzzy regression discontinuity design. The analysis was…

  18. The Fixed-Effects Model in Returns to Schooling and Its Application to Community Colleges: A Methodological Note

    ERIC Educational Resources Information Center

    Dynarski, Susan; Jacob, Brian; Kreisman, Daniel

    2016-01-01

    The purpose of this note is to develop insight into the performance of the individual fixed-effects model when used to estimate wage returns to postsecondary schooling. We focus our attention on the returns to attending and completing community college. While other methods (instrumental variables, regression discontinuity) have been used to…

  19. The Long-Term Economic Impact of in Utero and Postnatal Exposure to Malaria

    ERIC Educational Resources Information Center

    Barreca, Alan I.

    2010-01-01

    I use an instrumental-variables identification strategy and historical data from the United States to estimate the long-term economic impact of in utero and postnatal exposure to malaria. My research design matches adults in the 1960 Decennial Census to the malaria death rate in their respective state and year of birth. To address potential…

  20. Evaluation of software sensors for on-line estimation of culture conditions in an Escherichia coli cultivation expressing a recombinant protein.

    PubMed

    Warth, Benedikt; Rajkai, György; Mandenius, Carl-Fredrik

    2010-05-03

    Software sensors for monitoring and on-line estimation of critical bioprocess variables have mainly been used with standard bioreactor sensors, such as electrodes and gas analyzers, where algorithms in the software model have generated the desired state variables. In this article we propose that other on-line instruments, such as NIR probes and on-line HPLC, should be used to make more reliable and flexible software sensors. Five software sensor architectures were compared and evaluated: (1) biomass concentration from an on-line NIR probe, (2) biomass concentration from titrant addition, (3) specific growth rate from titrant addition, (4) specific growth rate from the NIR probe, and (5) specific substrate uptake rate and by-product rate from on-line HPLC and NIR probe signals. The software sensors were demonstrated on an Escherichia coli cultivation expressing a recombinant protein, green fluorescent protein (GFP), but the results could be extrapolated to other production organisms and product proteins. We conclude that well-maintained on-line instrumentation (hardware sensors) can increase the potential of software sensors. This would also strongly support the intentions of process analytical technology and quality-by-design concepts. Copyright © 2010 Elsevier B.V. All rights reserved.

  1. The Effect of Community Uninsurance Rates on Access to Health Care

    PubMed Central

    Sabik, Lindsay M

    2012-01-01

    Objective To investigate the effect of local uninsurance rates on access to health care for the uninsured and insured and improve on recent studies by controlling for time-invariant differences across markets. Data Sources Individual-level data from the 1996 and 2003 Community Tracking Study, and market-level data from other sources, including the Area Resource File and the Bureau of Primary Healthcare. Study Design Market-level fixed effects models estimate the effect of changes in uninsurance rates within markets on access to care, measured by whether individuals report forgoing necessary care. Instrumental variables models are also estimated. Principal Findings Increases in the rate of uninsurance are associated with poorer access to necessary care among the uninsured. In contrast with recent evidence, increases in uninsurance had no effect on access to care among the insured. Instrumental variables results are similar, although not statistically significant. Conclusions Changes in rates of insurance coverage are likely to affect access to care for both previously and continuously uninsured. In contrast with earlier studies, there is no evidence of spillover effects on the insured, suggesting that such policy changes may have little effect on access for those who are already insured. PMID:22172046

  2. Improving the estimation of flavonoid intake for study of health outcomes

    PubMed Central

    Dwyer, Johanna T.; Jacques, Paul F.; McCullough, Marjorie L.

    2015-01-01

    Imprecision in estimating intakes of non-nutrient bioactive compounds such as flavonoids is a challenge in epidemiologic studies of health outcomes. The sources of this imprecision, using flavonoids as an example, include the variability of bioactive compounds in foods due to differences in growing conditions and processing, the challenges in laboratory quantification of flavonoids in foods, the incompleteness of flavonoid food composition tables, and the lack of adequate dietary assessment instruments. Steps to improve databases of bioactive compounds and to increase the accuracy and precision of the estimation of bioactive compound intakes in studies of health benefits and outcomes are suggested. PMID:26084477

  3. Genetic Obesity and the Risk of Atrial Fibrillation – Causal Estimates from Mendelian Randomization

    PubMed Central

    Chatterjee, Neal A.; Arking, Dan E.; Ellinor, Patrick T.; Heeringa, Jan; Lin, Honghuang; Lubitz, Steven A.; Soliman, Elsayed Z.; Verweij, Niek; Alonso, Alvaro; Benjamin, Emelia J.; Gudnason, Vilmundur; Stricker, Bruno H. C.; Van Der Harst, Pim; Chasman, Daniel I.; Albert, Christine M.

    2017-01-01

    Background Observational studies have identified an association between body mass index (BMI) and incident atrial fibrillation (AF). Inferring causality from observational studies, however, is subject to residual confounding, reverse causation, and bias. The primary objective of this study was to evaluate the causal association between BMI and AF using genetic predictors of BMI. Methods We identified 51 646 individuals of European ancestry without AF at baseline from seven prospective population-based cohorts initiated between 1987 and 2002 in the United States, Iceland, and the Netherlands with incident AF ascertained between 1987 and 2012. Cohort-specific mean follow-up ranged from 7.4 to 19.2 years, over which period there were a total of 4178 cases of incident AF. We performed a Mendelian randomization with instrumental variable analysis to estimate a cohort-specific causal hazard ratio for the association between BMI and AF. Two genetic instruments for BMI were utilized: FTO genotype (rs1558902) and a BMI gene score comprising 39 single-nucleotide polymorphisms identified by genome-wide association studies to be associated with BMI. Cohort-specific estimates were combined by random-effects, inverse variance weighted meta-analysis. Results In age- and sex-adjusted meta-analysis, both genetic instruments were significantly associated with BMI (FTO: 0.43 [95% CI: 0.32–0.54] kg/m² per A-allele, p<0.001; BMI gene score: 1.05 [95% CI: 0.90–1.20] kg/m² per 1-unit increase, p<0.001) and incident AF (FTO – HR: 1.07 [1.02-1.11] per A-allele, p=0.004; BMI gene score – HR: 1.11 [1.05-1.18] per 1-unit increase, p<0.001). Age- and sex-adjusted instrumental variable estimates for the causal association between BMI and incident AF were HR 1.15 [1.04-1.26] per kg/m², p=0.005 (FTO) and 1.11 [1.05-1.17] per kg/m², p<0.001 (BMI gene score). Both of these estimates were consistent with the meta-analyzed estimate between observed BMI and AF (age- and sex-adjusted HR 1.05 [1.04-1.06] per kg/m², p<0.001). Multivariable adjustment did not significantly change findings. Conclusions Our data are consistent with a causal relationship between BMI and incident AF. These data support the possibility that public health initiatives targeting primordial prevention of obesity may reduce the incidence of AF. PMID:27974350
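    A back-of-envelope sketch of the instrumental-variable arithmetic in this kind of Mendelian randomization: per-instrument Wald ratios combined by inverse-variance weighting. The inputs below are rough point estimates and standard errors reconstructed from the confidence intervals quoted above, and the fixed-effect combination over the two instruments is a simplification of the study's cohort-level, random-effects analysis.

```python
# Hedged sketch of Mendelian-randomization arithmetic: Wald ratios
# (instrument-outcome effect / instrument-exposure effect) combined by
# inverse-variance weighting. Inputs are rough values implied by the abstract.
import numpy as np

beta_exposure = np.array([0.43, 1.05])    # instrument -> BMI (kg/m^2)
se_exposure   = np.array([0.056, 0.077])
beta_outcome  = np.array([np.log(1.07), np.log(1.11)])  # instrument -> log HR of AF
se_outcome    = np.array([0.021, 0.030])

# Wald ratio per instrument, with a first-order standard error that ignores
# uncertainty in the instrument-exposure association (a common simplification)
wald = beta_outcome / beta_exposure
wald_se = np.abs(se_outcome / beta_exposure)

# Fixed-effect inverse-variance-weighted (IVW) combination
w = 1.0 / wald_se**2
ivw = np.sum(w * wald) / np.sum(w)
ivw_se = np.sqrt(1.0 / np.sum(w))
print(f"IVW causal log-HR per kg/m^2: {ivw:.3f} (SE {ivw_se:.3f}), HR {np.exp(ivw):.2f}")
```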

  4. Toward a clearer portrayal of confounding bias in instrumental variable applications.

    PubMed

    Jackson, John W; Swanson, Sonja A

    2015-07-01

    Recommendations for reporting instrumental variable analyses often include presenting the balance of covariates across levels of the proposed instrument and levels of the treatment. However, such presentation can be misleading as relatively small imbalances among covariates across levels of the instrument can result in greater bias because of bias amplification. We introduce bias plots and bias component plots as alternative tools for understanding biases in instrumental variable analyses. Using previously published data on proposed preference-based, geography-based, and distance-based instruments, we demonstrate why presenting covariate balance alone can be problematic, and how bias component plots can provide more accurate context for bias from omitting a covariate from an instrumental variable versus non-instrumental variable analysis. These plots can also provide relevant comparisons of different proposed instruments considered in the same data. Adaptable code is provided for creating the plots.

  5. Poverty and Child Development: A Longitudinal Study of the Impact of the Earned Income Tax Credit

    PubMed Central

    Hamad, Rita; Rehkopf, David H.

    2016-01-01

    Although adverse socioeconomic conditions are correlated with worse child health and development, the effects of poverty-alleviation policies are less understood. We examined the associations of the Earned Income Tax Credit (EITC) with child development and used an instrumental variable approach to estimate the potential impacts of income. We used data from the US National Longitudinal Survey of Youth (n = 8,186) during 1986–2000 to examine effects on the Behavioral Problems Index (BPI) and Home Observation for Measurement of the Environment (HOME) inventory scores. We conducted 2 analyses. In the first, we used multivariate linear regressions with child-level fixed effects to examine the association of EITC payment size with BPI and HOME scores; in the second, we used EITC payment size as an instrument to estimate the associations of income with BPI and HOME scores. In linear regression models, higher EITC payments were associated with improved short-term BPI scores (per $1,000, β = −0.57; P = 0.04). In instrumental variable analyses, higher income was associated with improved short-term BPI scores (per $1,000, β = −0.47; P = 0.01) and medium-term HOME scores (per $1,000, β = 0.64; P = 0.02). Our results suggest that both EITC benefits and higher income are associated with modest but meaningful improvements in child development. These findings provide valuable information for health researchers and policymakers for improving child health and development. PMID:27056961

  6. The Trojan Lifetime Champions Health Survey: development, validity, and reliability.

    PubMed

    Sorenson, Shawn C; Romano, Russell; Scholefield, Robin M; Schroeder, E Todd; Azen, Stanley P; Salem, George J

    2015-04-01

    Self-report questionnaires are an important method of evaluating lifespan health, exercise, and health-related quality of life (HRQL) outcomes among elite, competitive athletes. Few instruments, however, have undergone formal characterization of their psychometric properties within this population. To evaluate the validity and reliability of a novel health and exercise questionnaire, the Trojan Lifetime Champions (TLC) Health Survey. Descriptive laboratory study. A large National Collegiate Athletic Association Division I university. A total of 63 university alumni (age range, 24 to 84 years), including former varsity collegiate athletes and a control group of nonathletes. Participants completed the TLC Health Survey twice at a mean interval of 23 days with randomization to the paper or electronic version of the instrument. Content validity, feasibility of administration, test-retest reliability, parallel-form reliability between paper and electronic forms, and estimates of systematic and typical error versus differences of clinical interest were assessed across a broad range of health, exercise, and HRQL measures. Correlation coefficients, including intraclass correlation coefficients (ICCs) for continuous variables and κ agreement statistics for ordinal variables, for test-retest reliability averaged 0.86, 0.90, 0.80, and 0.74 for HRQL, lifetime health, recent health, and exercise variables, respectively. Correlation coefficients, again ICCs and κ, for parallel-form reliability (ie, equivalence) between paper and electronic versions averaged 0.90, 0.85, 0.85, and 0.81 for HRQL, lifetime health, recent health, and exercise variables, respectively. Typical measurement error was less than the a priori thresholds of clinical interest, and we found minimal evidence of systematic test-retest error. We found strong evidence of content validity, convergent construct validity with the Short-Form 12 Version 2 HRQL instrument, and feasibility of administration in an elite, competitive athletic population. These data suggest that the TLC Health Survey is a valid and reliable instrument for assessing lifetime and recent health, exercise, and HRQL, among elite competitive athletes. Generalizability of the instrument may be enhanced by additional, larger-scale studies in diverse populations.

  7. The precision of wet atmospheric deposition data from national atmospheric deposition program/national trends network sites determined with collocated samplers

    USGS Publications Warehouse

    Nilles, M.A.; Gordon, J.D.; Schroder, L.J.

    1994-01-01

    A collocated, wet-deposition sampler program has been operated since October 1988 by the U.S. Geological Survey to estimate the overall sampling precision of wet atmospheric deposition data collected at selected sites in the National Atmospheric Deposition Program and National Trends Network (NADP/NTN). A duplicate set of wet-deposition sampling instruments was installed adjacent to existing sampling instruments at four different NADP/NTN sites for each year of the study. Wet-deposition samples from collocated sites were collected and analysed using standard NADP/NTN procedures. Laboratory analyses included determinations of pH, specific conductance, and concentrations of major cations and anions. The estimates of precision included all variability in the data-collection system, from the point of sample collection through storage in the NADP/NTN database. Sampling precision was determined from the absolute value of differences in the analytical results for the paired samples in terms of median relative and absolute difference. The median relative difference for Mg2+, Na+, K+ and NH4+ concentration and deposition was quite variable between sites and exceeded 10% at most sites. Relative error for analytes whose concentrations typically approached laboratory method detection limits were greater than for analytes that did not typically approach detection limits. The median relative difference for SO42- and NO3- concentration, specific conductance, and sample volume at all sites was less than 7%. Precision for H+ concentration and deposition ranged from less than 10% at sites with typically high levels of H+ concentration to greater than 30% at sites with low H+ concentration. Median difference for analyte concentration and deposition was typically 1.5-2-times greater for samples collected during the winter than during other seasons at two northern sites. Likewise, the median relative difference in sample volume for winter samples was more than double the annual median relative difference at the two northern sites. Bias accounted for less than 25% of the collocated variability in analyte concentration and deposition from weekly collocated precipitation samples at most sites.

  8. New DMSP database of precipitating auroral electrons and ions

    NASA Astrophysics Data System (ADS)

    Redmon, Robert J.; Denig, William F.; Kilcommons, Liam M.; Knipp, Delores J.

    2017-08-01

    Since the mid-1970s, the Defense Meteorological Satellite Program (DMSP) spacecraft have operated instruments for monitoring the space environment from low Earth orbit. As the program evolved, so have the measurement capabilities such that modern DMSP spacecraft include a comprehensive suite of instruments providing estimates of precipitating electron and ion fluxes, cold/bulk plasma composition and moments, the geomagnetic field, and optical emissions in the far and extreme ultraviolet. We describe the creation of a new public database of precipitating electrons and ions from the Special Sensor J (SSJ) instrument, complete with original counts, calibrated differential fluxes adjusted for penetrating radiation, estimates of the total kinetic energy flux and characteristic energy, uncertainty estimates, and accurate ephemerides. These are provided in a common and self-describing format that covers 30+ years of DMSP spacecraft from F06 (launched in 1982) to F18 (launched in 2009). This new database is accessible at the National Centers for Environmental Information and the Coordinated Data Analysis Web. We describe how the new database is being applied to high-latitude studies of the colocation of kinetic and electromagnetic energy inputs, ionospheric conductivity variability, field-aligned currents, and auroral boundary identification. We anticipate that this new database will support a broad range of space science endeavors from single observatory studies to coordinated system science investigations.

  9. Does Mother Know Best? Treatment Adherence as a Function of Anticipated Treatment Benefit

    PubMed Central

    Glymour, M. Maria; Nguyen, Quynh; Matsouaka, Roland; Tchetgen Tchetgen, Eric J.; Schmidt, Nicole M.; Osypuk, Theresa L.

    2016-01-01

    Background We describe bias resulting from individualized treatment selection, which occurs when treatment has heterogeneous effects and individuals selectively choose treatments of greatest benefit to themselves. This pernicious bias may confound estimates from observational studies and lead to important misinterpretation of intent-to-treat analyses of randomized trials. Despite the potentially serious threat to inferences, individualized treatment selection has rarely been formally described or assessed. Methods The Moving to Opportunity (MTO) trial randomly assigned subsidized rental vouchers to low-income families in high-poverty public housing. We assessed the Kessler-6 psychological distress and Behavior Problems Index outcomes for 2,829 adolescents 4–7 years after randomization. Among families randomly assigned to receive vouchers, we estimated probability of moving (treatment), predicted by pre-randomization characteristics (c-statistic=0.63). We categorized families into tertiles of this estimated probability of moving, and compared instrumental variable effect estimates for moving on Behavior Problems Index and Kessler-6 across tertiles. Results Instrumental variable estimated effects of moving on the Behavior Problems Index were most adverse for boys least likely to move (b=0.93; 95% CI: 0.33, 1.53) compared to boys most likely to move (b=0.14; 95% CI: −0.15, 0.44; p=.02 for treatment*tertile interaction). Effects on Kessler-6 were more beneficial for girls least likely to move compared to girls most likely to move (−0.62 vs. 0.02; interaction p=.03). Conclusions Evidence of individualized treatment selection differed by child gender and outcome and should be evaluated in randomized trial reports, especially when heterogeneous treatment effects are likely and non-adherence is common. PMID:26628424

  10. High-Frequency X-ray Variability Detection in A Black Hole Transient with USA.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shabad, Gayane

    2000-10-16

    Studies of high-frequency variability (above ~100 Hz) in X-ray binaries provide a unique opportunity to explore the fundamental physics of spacetime and matter, since the orbital timescale on the order of several milliseconds is a timescale of the motion of matter through the region located in close proximity to a compact stellar object. The detection of weak high-frequency signals in X-ray binaries depends on how well we understand the level of Poisson noise due to the photon counting statistics, i.e. how well we can understand and model the detector deadtime and other instrumental systematic effects. We describe the preflight timing calibration work performed on the Unconventional Stellar Aspect (USA) X-ray detector to study deadtime and timing issues. We developed a Monte Carlo deadtime model and deadtime correction methods for the USA experiment. The instrumental noise power spectrum can be estimated within ~0.1% accuracy in the case when no energy-dependent instrumental effect is present. We also developed correction techniques to account for an energy-dependent instrumental effect. The developed methods were successfully tested on USA Cas A and Cygnus X-1 data. This work allowed us to make a detection of a weak signal in a black hole candidate (BHC) transient.

  11. Influence of therapist competence and quantity of cognitive behavioural therapy on suicidal behaviour and inpatient hospitalisation in a randomised controlled trial in borderline personality disorder: Further analyses of treatment effects in the BOSCOT study

    PubMed Central

    Norrie, John; Davidson, Kate; Tata, Philip; Gumley, Andrew

    2013-01-01

    Objectives We investigated the treatment effects reported from a high-quality randomized controlled trial of cognitive behavioural therapy (CBT) for 106 people with borderline personality disorder attending community-based clinics in the UK National Health Service – the BOSCOT trial. Specifically, we examined whether the amount of therapy and therapist competence had an impact on our primary outcome, the number of suicidal acts, using instrumental variables regression modelling. Design Randomized controlled trial. Participants from across three sites (London, Glasgow, and Ayrshire/Arran) were randomized equally to CBT for personality disorders (CBTpd) plus Treatment as Usual or to Treatment as Usual. Treatment as Usual varied between sites and individuals, but was consistent with routine treatment in the UK National Health Service at the time. CBTpd comprised an average of 16 sessions (range 0–35) over 12 months. Method We used instrumental variable regression modelling to estimate the impact of quantity and quality of therapy received (recording activities and behaviours that took place after randomization) on number of suicidal acts and inpatient psychiatric hospitalization. Results A total of 101 participants provided full outcome data at 2 years post randomization. The previously reported intention-to-treat (ITT) results showed on average a reduction of 0.91 (95% confidence interval 0.15–1.67) suicidal acts over 2 years for those randomized to CBT. By incorporating the influence of quantity of therapy and therapist competence, we show that this estimate of the effect of CBTpd could be approximately two to three times greater for those receiving the right amount of therapy from a competent therapist. Conclusions Trials should routinely control for and collect data on both quantity of therapy and therapist competence, which can be used, via instrumental variable regression modelling, to estimate treatment effects for optimal delivery of therapy. Such estimates complement rather than replace the ITT results, which are properly the principal analysis results from such trials. Practitioner points Assessing the impact of the quantity and quality of therapy (competence of therapists) is complex. More competent therapists, trained in CBTpd, may significantly reduce the number of suicidal acts in patients with borderline personality disorder. PMID:23420622

  12. Falsification Testing of Instrumental Variables Methods for Comparative Effectiveness Research.

    PubMed

    Pizer, Steven D

    2016-04-01

    To demonstrate how falsification tests can be used to evaluate instrumental variables methods applicable to a wide variety of comparative effectiveness research questions. Brief conceptual review of instrumental variables and falsification testing principles and techniques accompanied by an empirical application. Sample STATA code related to the empirical application is provided in the Appendix. Comparative long-term risks of sulfonylureas and thiazolidinediones for management of type 2 diabetes. Outcomes include mortality and hospitalization for an ambulatory care-sensitive condition. Prescribing pattern variations are used as instrumental variables. Falsification testing is an easily computed and powerful way to evaluate the validity of the key assumption underlying instrumental variables analysis. If falsification tests are used, instrumental variables techniques can help answer a multitude of important clinical questions. © Health Research and Educational Trust.
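    A schematic illustration of the falsification-test idea, assuming hypothetical variable names rather than the paper's STATA code: regress an outcome that the proposed instrument should not be able to affect (for example, one measured before the instrument could act through treatment) on the instrument, and treat a clearly non-zero association as evidence against the exclusion restriction.

```python
# Hedged sketch of a falsification test for a proposed instrument. Column names
# are hypothetical placeholders, not the paper's variables.
import pandas as pd
import statsmodels.formula.api as smf

def falsification_test(df: pd.DataFrame, falsification_outcome: str,
                       instrument: str, controls: list):
    """OLS of a 'should-be-unaffected' outcome on the proposed instrument."""
    rhs = " + ".join([instrument] + controls)
    fit = smf.ols(f"{falsification_outcome} ~ {rhs}", data=df).fit(cov_type="HC1")
    return fit.params[instrument], fit.pvalues[instrument]

# Usage (df assumed to hold patient-level data with these hypothetical columns):
# coef, p = falsification_test(df, "pre_period_hospitalization",
#                              "local_prescribing_share",
#                              ["age", "comorbidity_index"])
# A small, insignificant coefficient is consistent with instrument validity;
# a clearly non-zero one flags a likely violation.
```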

  13. Reduction of Working Time: Does It Lead to a Healthy Lifestyle?

    PubMed

    Ahn, Taehyun

    2016-08-01

    I examine whether working hours have a causal effect on the health behaviors of workers. In assessing the causal relationship, I estimate fixed-effects instrumental variable models by using exogenous variation in adopting a reduced workweek in South Korea as an instrument for work hours. The estimation results reveal that shortening work hours induces individuals to exercise regularly and decreases the likelihood of smoking, with more pronounced effects for heavy smokers. While a work-hour reduction substantially increases the probability of drinking participation, it does not significantly affect the likelihood of frequent or daily drinking habits. In addition, the effect of a work-hour reduction on regular exercise is salient among women and older groups, and the effect on smoking behaviors is more pronounced among men and middle-aged groups. Copyright © 2015 John Wiley & Sons, Ltd.

  14. Impact of exposure measurement error in air pollution epidemiology: effect of error type in time-series studies.

    PubMed

    Goldman, Gretchen T; Mulholland, James A; Russell, Armistead G; Strickland, Matthew J; Klein, Mitchel; Waller, Lance A; Tolbert, Paige E

    2011-06-22

    Two distinctly different types of measurement error are Berkson and classical. Impacts of measurement error in epidemiologic studies of ambient air pollution are expected to depend on error type. We characterize measurement error due to instrument imprecision and spatial variability as multiplicative (i.e. additive on the log scale) and model it over a range of error types to assess impacts on risk ratio estimates both on a per measurement unit basis and on a per interquartile range (IQR) basis in a time-series study in Atlanta. Daily measures of twelve ambient air pollutants were analyzed: NO2, NOx, O3, SO2, CO, PM10 mass, PM2.5 mass, and PM2.5 components sulfate, nitrate, ammonium, elemental carbon and organic carbon. Semivariogram analysis was applied to assess spatial variability. Error due to this spatial variability was added to a reference pollutant time-series on the log scale using Monte Carlo simulations. Each of these time-series was exponentiated and introduced into a Poisson generalized linear model of cardiovascular disease emergency department visits. Measurement error resulted in reduced statistical significance for the risk ratio estimates for all amounts (corresponding to different pollutants) and types of error. When modelled as classical-type error, risk ratios were attenuated, particularly for primary air pollutants, with average attenuation in risk ratios on a per unit of measurement basis ranging from 18% to 92% and on an IQR basis ranging from 18% to 86%. When modelled as Berkson-type error, risk ratios per unit of measurement were biased away from the null hypothesis by 2% to 31%, whereas risk ratios per IQR were attenuated (i.e. biased toward the null) by 5% to 34%. For the amount of error modelled for CO, a range of error types was simulated, and the effects on risk ratio bias and significance were observed. For multiplicative error, both the amount and type of measurement error impact health effect estimates in air pollution epidemiology. By modelling instrument imprecision and spatial variability as different error types, we estimate direction and magnitude of the effects of error over a range of error types.
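    A simplified simulation of the two error types discussed above, keeping the exposure on the log scale and fitting a Poisson regression of daily counts. It reproduces the classical-error attenuation; the paper's per-measurement-unit Berkson results arise in its exponentiated, natural-scale specification and are not reproduced by this log-scale toy example. All numbers are synthetic.

```python
# Hedged sketch: classical vs. Berkson error in a log-linear Poisson setting.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n, beta = 2000, 0.3           # days, true log-rate-ratio per unit of log-exposure
sigma_z, sigma_u = 0.4, 0.25  # spread of the exposure series, added error

def poisson_rr(x_obs, y):
    """Rate ratio per unit exposure from a Poisson GLM."""
    fit = sm.GLM(y, sm.add_constant(x_obs), family=sm.families.Poisson()).fit()
    return np.exp(fit.params[1])

# Classical-type error: observed = truth + noise
x_true = rng.normal(0, sigma_z, n)
y = rng.poisson(np.exp(1.0 + beta * x_true))
x_obs_classical = x_true + rng.normal(0, sigma_u, n)

# Berkson-type error: truth = observed + noise
x_obs_berkson = rng.normal(0, sigma_z, n)
x_true_b = x_obs_berkson + rng.normal(0, sigma_u, n)
y_b = rng.poisson(np.exp(1.0 + beta * x_true_b))

print("true RR per log-unit     :", np.exp(beta))
print("classical-error estimate :", poisson_rr(x_obs_classical, y))  # attenuated toward 1
print("Berkson-error estimate   :", poisson_rr(x_obs_berkson, y_b))  # ~unbiased in this log-scale toy
```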

  15. Generic Algorithms for Estimating Foliar Pigment Content

    NASA Astrophysics Data System (ADS)

    Gitelson, Anatoly; Solovchenko, Alexei

    2017-09-01

    Foliar pigment contents and composition are main factors governing absorbed photosynthetically active radiation, photosynthetic activity, and physiological status of vegetation. In this study, the performance of nondestructive techniques based on leaf reflectance was tested for estimating chlorophyll (Chl) and anthocyanin (AnC) contents in species with widely variable leaf structure, pigment content, and composition. Only three spectral bands (green, red edge, and near-infrared) are required for nondestructive Chl and AnC estimation with normalized root-mean-square error (NRMSE) below 4.5% and 6.1%, respectively. The algorithms developed are generic, not requiring reparameterization for each species, allowing accurate nondestructive Chl and AnC estimation using simple handheld field/lab instrumentation. They also have potential for the interpretation of airborne and satellite data.

  16. [Instruments for quantitative methods of nursing research].

    PubMed

    Vellone, E

    2000-01-01

    Instruments for quantitative nursing research are a means to objectify and measure a variable or a phenomenon in scientific research. There are direct instruments to measure concrete variables and indirect instruments to measure abstract concepts (Burns, Grove, 1997). Indirect instruments measure the attributes of which a concept is composed. Furthermore, there are instruments for physiologic variables (e.g., weight), observational instruments (check-lists and rating scales), interviews, questionnaires, diaries, and scales (check-lists, rating scales, Likert scales, semantic differential scales, and visual analogue scales). The choice of one instrument over another depends on the research question and design. Research instruments are very useful both to describe variables and to detect statistically significant relationships. Their use in clinical practice for diagnostic assessment should be approached very carefully.

  17. Bank Size and Small- and Medium-sized Enterprise (SME) Lending: Evidence from China

    PubMed Central

    SHEN, YAN; SHEN, MINGGAO; XU, ZHONG; BAI, YING

    2014-01-01

    Summary Using panel data collected in 2005, we evaluate how bank size, discretion over credit, incentive schemes, competition, and the institutional environment affect lending to small- and medium-sized enterprises in China. We deal with the endogeneity problem using instrumental variables, and a reduced-form approach is also applied to allow for weak instruments in estimation. We find that total bank assets are an insignificant factor in banks’ decisions on small- and medium-enterprise (SME) lending, but more local lending authority, more competition, carefully designed incentive schemes, and stronger law enforcement encourage commercial banks to lend to SMEs. PMID:26052179

  18. The Benefits of College Athletic Success: An Application of the Propensity Score Design with Instrumental Variables. NBER Working Paper No. 18196

    ERIC Educational Resources Information Center

    Anderson, Michael L.

    2012-01-01

    Spending on big-time college athletics is often justified on the grounds that athletic success attracts students and raises donations. Testing this claim has proven difficult because success is not randomly assigned. We exploit data on bookmaker spreads to estimate the probability of winning each game for college football teams. We then condition…

  19. Does Money Really Matter? Estimating Impacts of Family Income on Young Children's Achievement with Data from Random-Assignment Experiments

    ERIC Educational Resources Information Center

    Duncan, Greg J.; Morris, Pamela A.; Rodrigues, Chris

    2011-01-01

    Social scientists do not agree on the size and nature of the causal impacts of parental income on children's achievement. We revisit this issue using a set of welfare and antipoverty experiments conducted in the 1990s. We utilize an instrumental variables strategy to leverage the variation in income and achievement that arises from random…

  20. Aperture Fever and the Quality of AAVSO Visual Estimates: mu Cephei as an Example

    NASA Astrophysics Data System (ADS)

    Turner, D. G.

    2014-06-01

    (Abstract only) At the limits of human vision the eye can reach precisions of 10% or better in brightness estimates for stars. So why did the quality of AAVSO visual estimates suddenly drop to 50% or worse for many stars following World War II? Possibly it is a consequence of viewing variable stars through ever-larger aperture instruments than was the case previously, a time when many variables were observed without optical aid. An example is provided by the bright red supergiant variable mu Cephei, a star that has the potential to be a calibrating object for the extragalactic distance scale if its low-amplitude brightness variations are better defined. It appears to be a member of the open cluster Trumpler 37, so its distance and luminosity can be established provided one can pinpoint the amount of interstellar extinction between us and it. mu Cep appears to be a double-mode pulsator, as suggested previously in the literature, but with periods of roughly 700 and 1,000 days it is unexciting to observe and its red color presents a variety of calibration problems. Improving quality control for such variable stars is an issue important not only to the AAVSO, but also to science in general.

  1. Geostatistical Analysis of Mesoscale Spatial Variability and Error in SeaWiFS and MODIS/Aqua Global Ocean Color Data

    NASA Astrophysics Data System (ADS)

    Glover, David M.; Doney, Scott C.; Oestreich, William K.; Tullo, Alisdair W.

    2018-01-01

    Mesoscale (10-300 km, weeks to months) physical variability strongly modulates the structure and dynamics of planktonic marine ecosystems via both turbulent advection and environmental impacts upon biological rates. Using structure function analysis (geostatistics), we quantify the mesoscale biological signals within global 13 year SeaWiFS (1998-2010) and 8 year MODIS/Aqua (2003-2010) chlorophyll a ocean color data (Level-3, 9 km resolution). We present geographical distributions, seasonality, and interannual variability of key geostatistical parameters: unresolved variability or noise, resolved variability, and spatial range. Resolved variability is nearly identical for both instruments, indicating that geostatistical techniques isolate a robust measure of biophysical mesoscale variability largely independent of measurement platform. In contrast, unresolved variability in MODIS/Aqua is substantially lower than in SeaWiFS, especially in oligotrophic waters where previous analysis identified a problem for the SeaWiFS instrument likely due to sensor noise characteristics. Both records exhibit a statistically significant relationship between resolved mesoscale variability and the low-pass filtered chlorophyll field horizontal gradient magnitude, consistent with physical stirring acting on large-scale gradient as an important factor supporting observed mesoscale variability. Comparable horizontal length scales for variability are found from tracer-based scaling arguments and geostatistical decorrelation. Regional variations between these length scales may reflect scale dependence of biological mechanisms that also create variability directly at the mesoscale, for example, enhanced net phytoplankton growth in coastal and frontal upwelling and convective mixing regions. Global estimates of mesoscale biophysical variability provide an improved basis for evaluating higher resolution, coupled ecosystem-ocean general circulation models, and data assimilation.
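    A minimal one-dimensional sketch of the structure-function (empirical semivariogram) calculation that underlies this kind of analysis, gamma(h) = 0.5 E[(z(x+h) - z(x))^2]; the nugget, sill, and decorrelation length of the fitted curve correspond to the unresolved variability, resolved variability, and spatial range discussed above. The transect below is synthetic, not ocean-color data.

```python
# Hedged sketch of an empirical semivariogram (structure function) on a 1-D transect.
import numpy as np

def empirical_variogram(z, max_lag):
    """gamma(h) = 0.5 * mean[(z(x+h) - z(x))^2] for lags 1..max_lag."""
    lags = np.arange(1, max_lag + 1)
    gamma = np.array([0.5 * np.mean((z[h:] - z[:-h]) ** 2) for h in lags])
    return lags, gamma

rng = np.random.default_rng(4)
# Synthetic transect: smooth "mesoscale" signal plus sensor-like noise
signal = np.convolve(rng.normal(0, 1, 600), np.ones(60) / 60, mode="valid")[:500]
z = signal + rng.normal(0, 0.05, 500)

lags, gamma = empirical_variogram(z, 100)
print("nugget (unresolved variability / noise) ~", gamma[0])
print("sill (resolved variability)            ~", gamma[-10:].mean())
```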

  2. Precision estimate for Odin-OSIRIS limb scatter retrievals

    NASA Astrophysics Data System (ADS)

    Bourassa, A. E.; McLinden, C. A.; Bathgate, A. F.; Elash, B. J.; Degenstein, D. A.

    2012-02-01

    The limb scatter measurements made by the Optical Spectrograph and Infrared Imaging System (OSIRIS) instrument on the Odin spacecraft are used to routinely produce vertically resolved trace gas and aerosol extinction profiles. Version 5 of the ozone and stratospheric aerosol extinction retrievals, which are available for download, are performed using a multiplicative algebraic reconstruction technique (MART). The MART inversion is a type of relaxation method, and as such the covariance of the retrieved state is estimated numerically, which, if done directly, is a computationally heavy task. Here we provide a methodology for the derivation of a numerical estimate of the covariance matrix for the retrieved state using the MART inversion that is sufficiently efficient to perform for each OSIRIS measurement. The resulting precision is compared with the variability in a large set of pairs of OSIRIS measurements that are close in time and space in the tropical stratosphere where the natural atmospheric variability is weak. These results are found to be highly consistent and thus provide confidence in the numerical estimate of the precision in the retrieved profiles.

  3. Improved Satellite Estimation of Near-Surface Humidity Using Vertical Water Vapor Profile Information

    NASA Astrophysics Data System (ADS)

    Tomita, H.; Hihara, T.; Kubota, M.

    2018-01-01

    Near-surface air specific humidity is a key variable in the estimation of air-sea latent heat flux and evaporation from the ocean surface. An accurate estimation over the global ocean is required for studies on global climate, air-sea interactions, and water cycles. Current remote sensing techniques are problematic and a major source of error in flux and evaporation estimates. Here we propose a new method to estimate surface humidity using satellite microwave radiometer instruments, based on a new finding about the relationship between multichannel brightness temperatures measured by satellite sensors, surface humidity, and vertical moisture structure. Satellite estimations using the new method were compared with in situ observations to evaluate this method, confirming that it can significantly improve satellite estimates, with high impact on satellite estimation of latent heat flux. We recommend the adoption of this method for any satellite microwave radiometer observations.

  4. Is there evidence for dual causation between malaria and socioeconomic status? Findings from rural Tanzania.

    PubMed

    Somi, Masha F; Butler, James R G; Vahid, Farshid; Njau, Joseph; Kachur, S Patrick; Abdulla, Salim

    2007-12-01

    Malaria's relationship with socioeconomic status at the macroeconomic level has been established. This is the first study to explore this relationship at the microeconomic (household) level and estimate the direction of association. Malaria prevalence was measured by parasitemia, and household socioeconomic status was measured using an asset based index. Results from an instrumental variable probit model suggest that socioeconomic status is negatively associated with malaria parasitemia. Other variables that are significantly associated with parasitemia include age of the individual, use of a mosquito net on the night before interview, the number of people living in the household, whether the household was residing at their farm home at the time of interview, household wall construction, and the region of residence. Matching estimators indicate that malaria parasitemia is associated with reduced household socioeconomic status.

  5. www.common-metrics.org: a web application to estimate scores from different patient-reported outcome measures on a common scale.

    PubMed

    Fischer, H Felix; Rose, Matthias

    2016-10-19

    Recently, a growing number of Item Response Theory (IRT) models have been published that allow estimation of a common latent variable from data derived from different Patient Reported Outcomes (PROs). When using data from different PROs, direct estimation of the latent variable has some advantages over the use of sum score conversion tables. It requires substantial proficiency in the field of psychometrics to fit such models using contemporary IRT software. We developed a web application (http://www.common-metrics.org), which allows easier estimation of latent variable scores using IRT models that calibrate different measures on instrument-independent scales. Currently, the application allows estimation using six different IRT models for Depression, Anxiety, and Physical Function. Based on published item parameters, users of the application can obtain latent trait estimates directly, using expected a posteriori (EAP) estimation for sum scores as well as for specific response patterns, Bayes modal (MAP), weighted likelihood estimation (WLE), and maximum likelihood (ML) methods, under three different prior distributions. The obtained estimates can be downloaded and analyzed using standard statistical software. This application enhances the usability of IRT modeling for researchers by allowing comparison of latent trait estimates over different PROs, such as the Patient Health Questionnaire Depression (PHQ-9) and Anxiety (GAD-7) scales, the Center for Epidemiologic Studies Depression Scale (CES-D), the Beck Depression Inventory (BDI), PROMIS Anxiety and Depression Short Forms, and others. Advantages of this approach include comparability of data derived with different measures and tolerance against missing values. The validity of the underlying models needs to be investigated in the future.
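    A minimal numerical sketch of expected a posteriori (EAP) scoring for a dichotomous 2PL model under a standard-normal prior, the kind of latent-trait estimate the application returns for a response pattern. The item parameters are made up for illustration and are not the published calibrations used by www.common-metrics.org.

```python
# Hedged sketch: EAP scoring for a 2PL model via numerical quadrature.
import numpy as np

def eap_2pl(responses, a, b, n_quad=61):
    """EAP estimate and posterior SD for a 0/1 response pattern under N(0,1) prior."""
    theta = np.linspace(-4, 4, n_quad)                 # quadrature grid
    prior = np.exp(-0.5 * theta**2)                    # N(0,1), up to a constant
    # Item response probabilities at each node: shape (items, nodes)
    p = 1.0 / (1.0 + np.exp(-a[:, None] * (theta[None, :] - b[:, None])))
    resp = np.array(responses)[:, None]
    likelihood = np.prod(np.where(resp == 1, p, 1 - p), axis=0)
    posterior = prior * likelihood
    posterior /= posterior.sum()
    eap = np.sum(theta * posterior)
    sd = np.sqrt(np.sum((theta - eap) ** 2 * posterior))
    return eap, sd

a = np.array([1.2, 0.8, 1.5, 1.0])   # discriminations (assumed)
b = np.array([-0.5, 0.0, 0.4, 1.1])  # difficulties (assumed)
print(eap_2pl([1, 1, 0, 0], a, b))   # latent trait estimate and posterior SD
```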

  6. Erythropoietin-Stimulating Agents and Survival in End-Stage Renal Disease: Comparison of Payment Policy Analysis, Instrumental Variables, and Multiple Imputation of Potential Outcomes

    PubMed Central

    Dore, David D.; Swaminathan, Shailender; Gutman, Roee; Trivedi, Amal N.; Mor, Vincent

    2013-01-01

    Objective To compare the assumptions and estimands across three approaches to estimating the effect of erythropoietin-stimulating agents (ESAs) on mortality. Study Design and Setting Using data from the Renal Management Information System, we conducted two analyses utilizing a change to bundled payment that we hypothesized mimicked random assignment to ESA (pre-post, difference-in-difference, and instrumental variable analyses). A third analysis was based on multiply imputing potential outcomes using propensity scores. Results There were 311,087 recipients of ESAs and 13,095 non-recipients. In the pre-post comparison, we identified no clear relationship between bundled payment (measured by calendar time) and the incidence of death within six months (risk difference -1.5%; 95% CI - 7.0% to 4.0%). In the instrumental variable analysis, the risk of mortality was similar among ESA recipients (risk difference -0.9%; 95% CI -2.1 to 0.3). In the multiple imputation analysis, we observed a 4.2% (95% CI 3.4% to 4.9%) absolute reduction in mortality risk with use of ESAs, but closer to the null for patients with baseline hematocrit >36%. Conclusion Methods emanating from different disciplines often rely on different assumptions, but can be informative about a similar causal contrast. The implications of these distinct approaches are discussed. PMID:23849152

  7. Estimation of Staphylococcus aureus growth parameters from turbidity data: characterization of strain variation and comparison of methods.

    PubMed

    Lindqvist, R

    2006-07-01

    Turbidity methods offer possibilities for generating data required for addressing microorganism variability in risk modeling given that the results of these methods correspond to those of viable count methods. The objectives of this study were to identify the best approach for determining growth parameters based on turbidity data and use of a Bioscreen instrument and to characterize variability in growth parameters of 34 Staphylococcus aureus strains of different biotypes isolated from broiler carcasses. Growth parameters were estimated by fitting primary growth models to turbidity growth curves or to detection times of serially diluted cultures either directly or by using an analysis of variance (ANOVA) approach. The maximum specific growth rates in chicken broth at 17 degrees C estimated by time to detection methods were in good agreement with viable count estimates, whereas growth models (exponential and Richards) underestimated growth rates. Time to detection methods were selected for strain characterization. The variation of growth parameters among strains was best described by either the logistic or lognormal distribution, but definitive conclusions require a larger data set. The distribution of the physiological state parameter ranged from 0.01 to 0.92 and was not significantly different from a normal distribution. Strain variability was important, and the coefficient of variation of growth parameters was up to six times larger among strains than within strains. It is suggested to apply a time to detection (ANOVA) approach using turbidity measurements for convenient and accurate estimation of growth parameters. The results emphasize the need to consider implications of strain variability for predictive modeling and risk assessment.
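    A sketch of the detection-time arithmetic behind such methods: under exponential growth, the time to reach a fixed detection density increases linearly with the natural log of the dilution factor, so the reciprocal of the fitted slope estimates the maximum specific growth rate. The numbers are synthetic and the sketch is not the study's analysis code.

```python
# Hedged sketch: maximum specific growth rate from detection times of serial dilutions.
import numpy as np

mu_true = 0.35                                   # per hour, used to simulate data
dilutions = np.array([1, 10, 100, 1000, 10000], dtype=float)
# TTD = const + (1/mu) * ln(dilution) + small measurement noise
ttd = 6.0 + np.log(dilutions) / mu_true + np.random.default_rng(2).normal(0, 0.1, 5)

# Fit a line to TTD vs ln(dilution) and invert the slope
slope, intercept = np.polyfit(np.log(dilutions), ttd, 1)
print(f"estimated mu_max = {1.0 / slope:.3f} per hour (true {mu_true})")
```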

  8. Spatial and temporal variability of rainfall and their effects on hydrological response in urban areas - a review

    NASA Astrophysics Data System (ADS)

    Cristiano, Elena; ten Veldhuis, Marie-claire; van de Giesen, Nick

    2017-07-01

    In urban areas, hydrological processes are characterized by high variability in space and time, making them sensitive to small-scale temporal and spatial rainfall variability. In the last decades new instruments, techniques, and methods have been developed to capture rainfall and hydrological processes at high resolution. Weather radars have been introduced to estimate high spatial and temporal rainfall variability. At the same time, new models have been proposed to reproduce hydrological response, based on small-scale representation of urban catchment spatial variability. Despite these efforts, interactions between rainfall variability, catchment heterogeneity, and hydrological response remain poorly understood. This paper presents a review of our current understanding of hydrological processes in urban environments as reported in the literature, focusing on their spatial and temporal variability aspects. We review recent findings on the effects of rainfall variability on hydrological response and identify gaps where knowledge needs to be further developed to improve our understanding of and capability to predict urban hydrological response.

  9. Tyre-road grip coefficient assessment - Part II: online estimation using instrumented vehicle, extended Kalman filter, and neural network

    NASA Astrophysics Data System (ADS)

    Luque, Pablo; Mántaras, Daniel A.; Fidalgo, Eloy; Álvarez, Javier; Riva, Paolo; Girón, Pablo; Compadre, Diego; Ferran, Jordi

    2013-12-01

    The main objective of this work is to determine the limit of safe driving conditions by identifying the maximal friction coefficient in a real vehicle. The study focuses on finding a method to determine this limit before skidding occurs, which is valuable information in the context of traffic safety. Since it is not possible to measure the friction coefficient directly, it is estimated using the appropriate tools in order to get the most accurate information. A real vehicle is instrumented to collect information on general kinematics and steering tie-rod forces. A real-time algorithm is developed to estimate forces and aligning torque in the tyres using an extended Kalman filter and neural network techniques. The methodology is based on determining the aligning torque; this variable allows evaluation of the behaviour of the tyre. It carries useful information about the tyre-road contact and can be used to predict the maximal tyre grip and safety margin. The maximal grip coefficient is estimated according to a knowledge base, extracted from computer simulation of a highly detailed three-dimensional model, using Adams® software. The proposed methodology is validated and applied to real driving conditions, in which maximal grip and safety margin are properly estimated.

  10. Instrumental variables vs. grouping approach for reducing bias due to measurement error.

    PubMed

    Batistatou, Evridiki; McNamee, Roseanne

    2008-01-01

    Attenuation of the exposure-response relationship due to exposure measurement error is often encountered in epidemiology. Given that error cannot be totally eliminated, bias correction methods of analysis are needed. Many methods require more than one exposure measurement per person to be made, but the 'group mean OLS method,' in which subjects are grouped into several a priori defined groups followed by ordinary least squares (OLS) regression on the group means, can be applied with one measurement. An alternative approach is to use an instrumental variable (IV) method in which both the single error-prone measure and an IV are used in IV analysis. In this paper we show that the 'group mean OLS' estimator is equal to an IV estimator with the group mean used as IV, but that the variance estimators for the two methods are different. We derive a simple expression for the bias in the common estimator which is a simple function of group size, reliability and contrast of exposure between groups, and show that the bias can be very small when group size is large. We compare this method with a new proposal (group mean ranking method), also applicable with a single exposure measurement, in which the IV is the rank of the group means. When there are two independent exposure measurements per subject, we propose a new IV method (EVROS IV) and compare it with Carroll and Stefanski's (CS IV) proposal in which the second measure is used as an IV; the new IV estimator combines aspects of the 'group mean' and 'CS' strategies. All methods are evaluated in terms of bias, precision and root mean square error via simulations and a dataset from occupational epidemiology. The 'group mean ranking method' does not offer much improvement over the 'group mean method.' Compared with the 'CS' method, the 'EVROS' method is less affected by low reliability of exposure. We conclude that the group IV methods we propose may provide a useful way to handle mismeasured exposures in epidemiology with or without replicate measurements. Our finding may also have implications for the use of aggregate variables in epidemiology to control for unmeasured confounding.
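    A small simulation of the 'group mean' strategy described above, assuming a single error-prone measurement per subject and an a priori grouping: the group mean of the measurement serves as the instrument, and the resulting IV estimate undoes most of the attenuation seen in naive OLS. This is an illustrative sketch, not the authors' code.

```python
# Hedged sketch: group-mean instrument for a single error-prone exposure measurement.
import numpy as np
import pandas as pd

rng = np.random.default_rng(3)
n_groups, per_group, beta = 20, 50, 0.8
group = np.repeat(np.arange(n_groups), per_group)
# True exposure with contrast across a priori groups, plus within-group spread
true_x = rng.normal(np.linspace(-1, 1, n_groups)[group], 0.5)
w = true_x + rng.normal(0, 1.0, true_x.size)        # single noisy measurement
y = beta * true_x + rng.normal(0, 0.5, true_x.size)

df = pd.DataFrame({"g": group, "w": w, "y": y})
z = df.groupby("g")["w"].transform("mean")          # group-mean instrument

# Simple (Wald-type) IV estimator with the group mean as instrument
beta_iv = np.cov(z, y)[0, 1] / np.cov(z, w)[0, 1]
beta_ols = np.cov(w, y)[0, 1] / np.var(w, ddof=1)   # attenuated naive OLS
print(f"naive OLS {beta_ols:.2f}  group-mean IV {beta_iv:.2f}  truth {beta}")
```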

  11. The Impact of Education On Fertility and Child Mortality: Do Fathers Really Matter Less than Mothers? OECD Development Centre Working Paper, No. 217 (Formerly Webdoc No. 5)

    ERIC Educational Resources Information Center

    Breierova, Lucia; Duflo, Esther

    2003-01-01

    This paper takes advantage of a massive school construction program that took place in Indonesia between 1973 and 1978 to estimate the effect of education on fertility and child mortality. Time and region varying exposure to the school construction program generates instrumental variables for the average education in the household, and the…

  12. Automated gait and balance parameters diagnose and correlate with severity in Parkinson disease.

    PubMed

    Dewey, D Campbell; Miocinovic, Svjetlana; Bernstein, Ira; Khemani, Pravin; Dewey, Richard B; Querry, Ross; Chitnis, Shilpa; Dewey, Richard B

    2014-10-15

    To assess the suitability of instrumented gait and balance measures for diagnosis and estimation of disease severity in PD. Each subject performed iTUG (instrumented Timed-Up-and-Go) and iSway (instrumented Sway) using the APDM® Mobility Lab. MDS-UPDRS parts II and III, a postural instability and gait disorder (PIGD) score, the mobility subscale of the PDQ-39, and Hoehn & Yahr stage were measured in the PD cohort. Two sets of gait and balance variables were defined by high correlation with diagnosis or disease severity and were evaluated using multiple linear and logistic regressions, ROC analyses, and t-tests. 135 PD subjects and 66 age-matched controls were evaluated in this prospective cohort study. We found that both iTUG and iSway variables differentiated PD subjects from controls (area under the ROC curve was 0.82 and 0.75, respectively) and correlated with all PD severity measures (R2 ranging from 0.18 to 0.61). Objective exam-based scores correlated more strongly with iTUG than iSway. The chosen set of iTUG variables was abnormal in very mild disease. Age and gender influenced gait and balance parameters and were therefore controlled in all analyses. Our study identified sets of iTUG and iSway variables which correlate with PD severity measures and differentiate PD subjects from controls. These gait and balance measures could potentially serve as markers of PD progression and are under evaluation for this purpose in the ongoing NIH Parkinson Disease Biomarker Program. Copyright © 2014 Elsevier B.V. All rights reserved.

  13. Automated Gait and Balance Parameters Diagnose and Correlate with Severity in Parkinson Disease

    PubMed Central

    Dewey, Daniel C.; Miocinovic, Svjetlana; Bernstein, Ira; Khemani, Pravin; Dewey, Richard B.; Querry, Ross; Chitnis, Shilpa; Dewey, Richard B.

    2014-01-01

    Objective To assess the suitability of instrumented gait and balance measures for diagnosis and estimation of disease severity in PD. Methods Each subject performed iTUG (instrumented Timed-Up-and-Go) and iSway (instrumented Sway) using the APDM® Mobility Lab. MDS-UPDRS parts II and III, a postural instability and gait disorder (PIGD) score, the mobility subscale of the PDQ-39, and Hoehn & Yahr stage were measured in the PD cohort. Two sets of gait and balance variables were defined by high correlation with diagnosis or disease severity and were evaluated using multiple linear and logistic regressions, ROC analyses, and t-tests. Results 135 PD subjects and 66 age-matched controls were evaluated in this prospective cohort study. We found that both iTUG and iSway variables differentiated PD subjects from controls (area under the ROC curve was 0.82 and 0.75 respectively) and correlated with all PD severity measures (R2 ranging from 0.18 to 0.61). Objective exam-based scores correlated more strongly with iTUG than iSway. The chosen set of iTUG variables was abnormal in very mild disease. Age and gender influenced gait and balance parameters and were therefore controlled in all analyses. Interpretation Our study identified sets of iTUG and iSway variables which correlate with PD severity measures and differentiate PD subjects from controls. These gait and balance measures could potentially serve as markers of PD progression and are under evaluation for this purpose in the ongoing NIH Parkinson Disease Biomarker Program. PMID:25082782

  14. Psychosocial Assessment of Self-Harm Patients and Risk of Repeat Presentation: An Instrumental Variable Analysis Using Time of Hospital Presentation.

    PubMed

    Carroll, Robert; Metcalfe, Chris; Steeg, Sarah; Davies, Neil M; Cooper, Jayne; Kapur, Nav; Gunnell, David

    2016-01-01

    Clinical guidelines have recommended psychosocial assessment of self-harm patients for years, yet estimates of its impact on the risk of repeat self-harm vary. Assessing the association of psychosocial assessment with risk of repeat self-harm is challenging due to the effects of confounding by indication. We analysed data from a cohort study of 15,113 patients presenting to the emergency departments of three UK hospitals to investigate the association of psychosocial assessment with risk of repeat hospital presentation for self-harm. Time of day of hospital presentation was used as an instrument for psychosocial assessment, attempting to control for confounding by indication. Conventional regression analysis suggested psychosocial assessment was not associated with risk of repeat self-harm within 12 months (risk difference (RD) 0.00, 95% confidence interval (95% CI) -0.01 to 0.02). In contrast, IV analysis suggested that risk of repeat self-harm was reduced by 18 percentage points (RD -0.18, 95% CI -0.32 to -0.03) in patients receiving a psychosocial assessment. However, the instrument of time of day did not remove all potential effects of confounding by indication, suggesting the IV effect estimate may be biased. We found that psychosocial assessments reduce risk of repeat self-harm. This is in line with other non-randomised studies based on populations in which allocation to assessment was less subject to confounding by indication. However, as our instrument did not fully balance important confounders across time of day, the IV effect estimate should be interpreted with caution.
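
    With a binary instrument (time-of-day window of presentation), a binary treatment (psychosocial assessment) and a binary outcome (repeat presentation), the unadjusted IV estimate on the risk-difference scale reduces to a Wald-type ratio. The snippet below is a generic sketch on simulated data with invented probabilities; it is not the authors' covariate-adjusted analysis.

      import numpy as np

      # Wald-type IV estimate on the risk-difference scale with a binary instrument
      # ('office hours' stands in for the time-of-presentation instrument).
      def wald_iv(z, d, y):
          """z: instrument (0/1), d: treatment received (0/1), y: outcome (0/1)."""
          return (y[z == 1].mean() - y[z == 0].mean()) / (d[z == 1].mean() - d[z == 0].mean())

      rng = np.random.default_rng(1)
      n = 15_000
      z = rng.integers(0, 2, n)                      # presented during office hours?
      d = rng.binomial(1, 0.45 + 0.25 * z)           # assessed: instrument shifts uptake
      y = rng.binomial(1, 0.20 - 0.05 * d)           # repeat presentation within 12 months
      print(f"IV risk difference: {wald_iv(z, d, y):+.3f}")   # close to the simulated -0.05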

  15. Observations of Ocean Primary Productivity Using MODIS

    NASA Technical Reports Server (NTRS)

    Esaias, Wayne E.; Abbott, Mark R.; Koblinsky, Chester J. (Technical Monitor)

    2001-01-01

    Measuring the magnitude and variability of oceanic net primary productivity (NPP) represents a key advancement toward our understanding of the dynamics of marine ecosystems and the role of the ocean in the global carbon cycle. MODIS observations make two new contributions in addition to continuing the bio-optical time series begun with Orbview-2's SeaWiFS sensor. First, MODIS provides estimates of global ocean net primary productivity on weekly and annual time scales, together with annual empirical estimates of carbon export production. Second, MODIS provides additional insight into the spatial and temporal variations in photosynthetic efficiency through direct measurements of solar-stimulated chlorophyll fluorescence. The two weekly productivity indexes (first developed by Behrenfeld & Falkowski and by Yoder, Ryan and Howard) are used to derive daily productivity as a function of chlorophyll biomass, incident daily surface irradiance, temperature, euphotic depth, and mixed layer depth. Comparisons between these two estimates using both SeaWiFS and MODIS data show significant model differences in spatial distribution after allowance for the different integration depths. Both estimates are strongly dependent on the accuracy of the chlorophyll determination. In addition, an empirical approach is taken on annual scales to estimate global NPP and export production. Estimates of solar-stimulated fluorescence efficiency from chlorophyll have been shown by Abbott and co-workers to be inversely related to photosynthetic efficiency. MODIS provides the first global estimates of oceanic chlorophyll fluorescence, an important proof of concept. MODIS observations are revealing spatial patterns of fluorescence efficiency that show the expected variations with phytoplankton photo-physiological parameters measured during in-situ surveys. This has opened the way for research into utilizing this information to improve our understanding of oceanic NPP variability. Deriving the ocean bio-optical properties places severe demands on instrument performance (especially band-to-band precision) and atmospheric correction. Improvements in MODIS instrument characterization and calibration over the first 16 mission months have greatly improved the accuracy of the chlorophyll input fields and FLH, and therefore the estimates of NPP and fluorescence efficiency. Annual estimates now show that oceanic NPP accounts for 40-50% of the global total NPP, with significant interannual variations related to large-scale ocean processes. Spatial variations in ocean NPP and exported production have significant effects on the exchange of CO2 between the ocean and atmosphere. Further work is underway to improve the primary productivity model functions and to refine our understanding of the relationships between fluorescence efficiency and NPP estimates. We expect that the MODIS instruments will prove extremely useful in assessing the time dependencies of oceanic carbon uptake and the effects of iron enrichment within the global carbon cycle.

  16. New DMSP Database of Precipitating Auroral Electrons and Ions.

    PubMed

    Redmon, Robert J; Denig, William F; Kilcommons, Liam M; Knipp, Delores J

    2017-08-01

    Since the mid-1970s, the Defense Meteorological Satellite Program (DMSP) spacecraft have operated instruments for monitoring the space environment from low Earth orbit. As the program evolved, so too have the measurement capabilities, such that modern DMSP spacecraft include a comprehensive suite of instruments providing estimates of precipitating electron and ion fluxes, cold/bulk plasma composition and moments, the geomagnetic field, and optical emissions in the far and extreme ultraviolet. We describe the creation of a new public database of precipitating electrons and ions from the Special Sensor J (SSJ) instrument, complete with original counts, calibrated differential fluxes adjusted for penetrating radiation, estimates of the total kinetic energy flux and characteristic energy, uncertainty estimates, and accurate ephemerides. These are provided in a common and self-describing format that covers 30+ years of DMSP spacecraft from F06 (launched in 1982) through F18 (launched in 2009). This new database is accessible at the National Centers for Environmental Information (NCEI) and the Coordinated Data Analysis Web (CDAWeb). We describe how the new database is being applied to high-latitude studies of the co-location of kinetic and electromagnetic energy inputs, ionospheric conductivity variability, field-aligned currents, and auroral boundary identification. We anticipate that this new database will support a broad range of space science endeavors, from single-observatory studies to coordinated system science investigations.

  17. A Genetic Instrumental Variables Analysis of the Effects of Prenatal Smoking on Birth Weight: Evidence from Two Samples

    PubMed Central

    Lehrer, Steven F.; Moreno, Lina M.; Murray, Jeffrey C.; Wilcox, Allen; Lie, Rolv T.

    2011-01-01

    There is a large literature showing the detrimental effects of prenatal smoking on birth and childhood health outcomes. It is somewhat unclear, though, whether these effects are causal or instead reflect other characteristics and choices of mothers who smoke that may also affect child health outcomes, or biased reporting of smoking. In this paper, we use genetic markers that predict smoking behaviors as instruments in order to address the endogeneity of smoking choices in the production of birth and childhood health outcomes. Our results indicate that prenatal smoking produces more dramatic declines in birth weight than estimates that ignore the endogeneity of prenatal smoking, which is consistent with previous studies using non-genetic instruments. We use data from two distinct samples from Norway and the US with different measured instruments and find nearly identical results. The study provides a novel application that can be extended to study several behavioral impacts on health, social and economic outcomes. PMID:21845925

  18. Climate change. Six centuries of variability and extremes in a coupled marine-terrestrial ecosystem.

    PubMed

    Black, Bryan A; Sydeman, William J; Frank, David C; Griffin, Daniel; Stahle, David W; García-Reyes, Marisol; Rykaczewski, Ryan R; Bograd, Steven J; Peterson, William T

    2014-09-19

    Reported trends in the mean and variability of coastal upwelling in eastern boundary currents have raised concerns about the future of these highly productive and biodiverse marine ecosystems. However, the instrumental records on which these estimates are based are insufficiently long to determine whether such trends exceed preindustrial limits. In the California Current, a 576-year reconstruction of climate variables associated with winter upwelling indicates that variability increased over the latter 20th century to levels equaled only twice during the past 600 years. This modern trend in variance may be unique, because it appears to be driven by an unprecedented succession of extreme, downwelling-favorable, winter climate conditions that profoundly reduce productivity for marine predators of commercial and conservation interest. Copyright © 2014, American Association for the Advancement of Science.

  19. The OCO-3 Mission: Science Objectives and Instrument Performance

    NASA Astrophysics Data System (ADS)

    Eldering, A.; Basilio, R. R.; Bennett, M. W.

    2017-12-01

    The Orbiting Carbon Observatory 3 (OCO-3) will continue global CO2 and solar-induced chlorophyll fluorescence (SIF) measurements using the flight spare instrument from OCO-2. The instrument is currently being tested and will be packaged for installation on the International Space Station (ISS), with launch readiness in early 2018. This talk will focus on the science objectives, updated simulations of the science data products, and the outcome of recent instrument performance tests. The low-inclination ISS orbit lets OCO-3 sample the tropics and sub-tropics across the full range of daylight hours, with dense observations at northern and southern mid-latitudes (±52°). The combination of these dense CO2 and SIF measurements provides continuity of data for global flux estimates as well as a unique opportunity to address key deficiencies in our understanding of the global carbon cycle. The instrument utilizes an agile, 2-axis pointing mechanism (PMA), providing the capability to look toward the bright reflection from the ocean and validation targets. The PMA also allows for a snapshot mapping mode to collect dense datasets over 100 km by 100 km areas. Measurements over urban centers could aid in making estimates of fossil fuel CO2 emissions. Similarly, the snapshot mapping mode can be used to sample regions of interest for the terrestrial carbon cycle. In addition, there is potential to utilize data from the ISS instruments ECOSTRESS (ECOsystem Spaceborne Thermal Radiometer Experiment on Space Station) and GEDI (Global Ecosystem Dynamics Investigation), which measure other key variables controlling carbon uptake by plants, to complement OCO-3 data in science analysis. In 2017, the OCO-2 flight spare instrument was transformed into the ISS-ready OCO-3 payload. The transformed instrument was thoroughly tested and characterized. Key characteristics, such as instrument ILS, spectral resolution, and radiometric performance, will be described. Analysis of direct sun measurements taken during testing will also be discussed.

  20. The Trojan Lifetime Champions Health Survey: Development, Validity, and Reliability

    PubMed Central

    Sorenson, Shawn C.; Romano, Russell; Scholefield, Robin M.; Schroeder, E. Todd; Azen, Stanley P.; Salem, George J.

    2015-01-01

    Context Self-report questionnaires are an important method of evaluating lifespan health, exercise, and health-related quality of life (HRQL) outcomes among elite, competitive athletes. Few instruments, however, have undergone formal characterization of their psychometric properties within this population. Objective To evaluate the validity and reliability of a novel health and exercise questionnaire, the Trojan Lifetime Champions (TLC) Health Survey. Design Descriptive laboratory study. Setting A large National Collegiate Athletic Association Division I university. Patients or Other Participants A total of 63 university alumni (age range, 24 to 84 years), including former varsity collegiate athletes and a control group of nonathletes. Intervention(s) Participants completed the TLC Health Survey twice at a mean interval of 23 days with randomization to the paper or electronic version of the instrument. Main Outcome Measure(s) Content validity, feasibility of administration, test-retest reliability, parallel-form reliability between paper and electronic forms, and estimates of systematic and typical error versus differences of clinical interest were assessed across a broad range of health, exercise, and HRQL measures. Results Correlation coefficients, including intraclass correlation coefficients (ICCs) for continuous variables and κ agreement statistics for ordinal variables, for test-retest reliability averaged 0.86, 0.90, 0.80, and 0.74 for HRQL, lifetime health, recent health, and exercise variables, respectively. Correlation coefficients, again ICCs and κ, for parallel-form reliability (ie, equivalence) between paper and electronic versions averaged 0.90, 0.85, 0.85, and 0.81 for HRQL, lifetime health, recent health, and exercise variables, respectively. Typical measurement error was less than the a priori thresholds of clinical interest, and we found minimal evidence of systematic test-retest error. We found strong evidence of content validity, convergent construct validity with the Short-Form 12 Version 2 HRQL instrument, and feasibility of administration in an elite, competitive athletic population. Conclusions These data suggest that the TLC Health Survey is a valid and reliable instrument for assessing lifetime and recent health, exercise, and HRQL, among elite competitive athletes. Generalizability of the instrument may be enhanced by additional, larger-scale studies in diverse populations. PMID:25611315

  1. Practical aspects of a maximum likelihood estimation method to extract stability and control derivatives from flight data

    NASA Technical Reports Server (NTRS)

    Iliff, K. W.; Maine, R. E.

    1976-01-01

    A maximum likelihood estimation method was applied to flight data, and procedures to facilitate the routine analysis of large amounts of flight data are described. Techniques that can be used to obtain stability and control derivatives from aircraft maneuvers that are less than ideal for this purpose are also presented. The techniques involve detecting and correcting the effects of dependent or nearly dependent variables, structural vibration, data drift, inadequate instrumentation, and difficulties with the data acquisition system and the mathematical model. The use of uncertainty levels and multiple-maneuver analysis also proved useful in improving the quality of the estimated coefficients. The procedures used for editing the data and for the overall analysis are also discussed.

  2. Does money really matter? Estimating impacts of family income on young children's achievement with data from random-assignment experiments.

    PubMed

    Duncan, Greg J; Morris, Pamela A; Rodrigues, Chris

    2011-09-01

    Social scientists do not agree on the size and nature of the causal impacts of parental income on children's achievement. We revisit this issue using a set of welfare and antipoverty experiments conducted in the 1990s. We utilize an instrumental variables strategy to leverage the variation in income and achievement that arises from random assignment to the treatment group to estimate the causal effect of income on child achievement. Our estimates suggest that a $1,000 increase in annual income increases young children's achievement by 5%-6% of a standard deviation. As such, our results suggest that family income has a policy-relevant, positive impact on the eventual school achievement of preschool children.

  3. Incorporating variability in point estimates in risk assessment: Bridging the gap between LC50 and population endpoints.

    PubMed

    Stark, John D; Vargas, Roger I; Banks, John E

    2015-07-01

    Historically, point estimates such as the median lethal concentration (LC50) have been instrumental in assessing risks associated with toxicants to rare or economically important species. In recent years, growing awareness of the shortcomings of this approach has led to an increased focus on analyses using population endpoints. However, risk assessment of pesticides still relies heavily on large amounts of LC50 data amassed over decades in the laboratory. Despite the fact that these data are generally well replicated, little or no attention has been given to the sometimes high levels of variability associated with the generation of point estimates. This is especially important in agroecosystems, where arthropod predator-prey interactions are often disrupted by the use of pesticides. Using laboratory-derived data for 4 economically important species (2 fruit fly pest species and 2 braconid parasitoid species) and matrix-based population models, the authors demonstrate in the present study a method for bridging traditional point estimate risk assessments with population outcomes. The results illustrate that even closely related species can show strikingly divergent responses to the same exposures to pesticides. Furthermore, the authors show that using different values within the 95% confidence intervals of LC50 values can result in very different population outcomes, ranging from quick recovery to extinction for both pest and parasitoid species. The authors discuss the implications of these results and emphasize the need to incorporate variability and uncertainty in point estimates for use in risk assessment. © 2015 SETAC.
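
    The bridging idea, propagating the uncertainty around an LC50 point estimate through to a population-level projection, can be sketched with a small stage-structured matrix model. The dose-response form, vital rates, and LC50 interval below are invented for illustration; they are not the fruit fly or parasitoid parameters used in the study.

      import numpy as np

      def survival_at_dose(dose, lc50, slope=2.0):
          """Hypothetical log-logistic dose-response: survival fraction at a given dose."""
          return 1.0 / (1.0 + (dose / lc50) ** slope)

      def project_population(lc50, dose, steps=10):
          """Project a 3-stage Leslie-style model with juvenile survival scaled by exposure."""
          s = survival_at_dose(dose, lc50)
          A = np.array([[0.0, 2.0, 5.0],          # stage fecundities (illustrative)
                        [0.5 * s, 0.0, 0.0],      # juvenile survival reduced by the toxicant
                        [0.0, 0.6, 0.7]])         # later-stage survival
          n = np.array([100.0, 50.0, 20.0])
          for _ in range(steps):
              n = A @ n
          return n.sum()

      # Sample LC50 values across a (hypothetical) 95% confidence interval and compare outcomes
      lc50_lo, lc50_hi, dose = 1.2, 3.8, 2.0
      for lc50 in np.linspace(lc50_lo, lc50_hi, 5):
          print(f"LC50={lc50:.2f} -> population after 10 steps: {project_population(lc50, dose):,.0f}")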

  4. Ascertainment-adjusted parameter estimation approach to improve robustness against misspecification of health monitoring methods

    NASA Astrophysics Data System (ADS)

    Juesas, P.; Ramasso, E.

    2016-12-01

    Condition monitoring aims at ensuring system safety, which is a fundamental requirement for industrial applications and has become an inescapable social demand. This objective is attained by instrumenting the system and developing data analytics methods, such as statistical models, able to turn data into relevant knowledge. One difficulty is correctly estimating the parameters of those methods based on time-series data. This paper suggests the use of the Weighted Distribution Theory together with the Expectation-Maximization algorithm to improve parameter estimation in statistical models with latent variables, with an application to health monitoring under uncertainty. The improvement of estimates is made possible by incorporating uncertain and possibly noisy prior knowledge on latent variables in a sound manner. The latent variables are exploited to build a degradation model of a dynamical system represented as a sequence of discrete states. Examples on Gaussian Mixture Models and Hidden Markov Models (HMM) with discrete and continuous outputs are presented on both simulated data and benchmarks using the turbofan engine datasets. A focus on the application of a discrete HMM to health monitoring under uncertainty emphasizes the interest of the proposed approach in the presence of different operating conditions and fault modes. It is shown that the proposed model exhibits high robustness in the presence of noisy and uncertain priors.

  5. Power calculator for instrumental variable analysis in pharmacoepidemiology

    PubMed Central

    Walker, Venexia M; Davies, Neil M; Windmeijer, Frank; Burgess, Stephen; Martin, Richard M

    2017-01-01

    Background Instrumental variable analysis, for example with physicians' prescribing preferences as an instrument for medications issued in primary care, is an increasingly popular method in the field of pharmacoepidemiology. Existing power calculators for studies using instrumental variable analysis, such as Mendelian randomization power calculators, do not allow for the structure of research questions in this field. This is because the analysis in pharmacoepidemiology will typically have stronger instruments and detect larger causal effects than in other fields. Consequently, there is a need for dedicated power calculators for pharmacoepidemiological research. Methods and Results The formula for calculating the power of a study using instrumental variable analysis in the context of pharmacoepidemiology is derived and then validated by a simulation study. The formula is applicable to studies using a single binary instrument to analyse the causal effect of a binary exposure on a continuous outcome. An online calculator, as well as packages in both R and Stata, are provided for the implementation of the formula by others. Conclusions The statistical power of instrumental variable analysis in pharmacoepidemiological studies to detect a clinically meaningful treatment effect is an important consideration. Research questions in this field have distinct structures that must be accounted for when calculating power. The formula presented differs from existing instrumental variable power formulae due to its parametrization, which is designed specifically for ease of use by pharmacoepidemiologists. PMID:28575313
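
    The abstract does not reproduce the derived formula, so the sketch below uses a generic asymptotic approximation for IV power with a single instrument, SE(beta_IV) ≈ sd_Y / (sd_X * rho_XZ * sqrt(N)) and power = Phi(|beta|/SE - z_crit). Treat the formula, function name, and parameter values as assumptions rather than the authors' published parametrization (their own calculator is available online and as R and Stata packages).

      from math import sqrt
      from scipy.stats import norm

      def iv_power(n, beta, rho_xz, sd_x, sd_y, alpha=0.05):
          """Approximate power of an IV analysis (single instrument, continuous outcome).

          Generic approximation: SE(beta_IV) ~ sd_y / (sd_x * rho_xz * sqrt(n)),
          where rho_xz is the instrument-exposure correlation. This is a stand-in,
          not the paper's pharmacoepidemiology-specific parametrization.
          """
          se = sd_y / (sd_x * abs(rho_xz) * sqrt(n))
          z_crit = norm.ppf(1 - alpha / 2)
          return norm.cdf(abs(beta) / se - z_crit)

      # Example: 5,000 patients, true effect 0.2 outcome SDs, instrument-exposure correlation 0.3
      print(f"power ~ {iv_power(n=5000, beta=0.2, rho_xz=0.3, sd_x=1.0, sd_y=1.0):.2f}")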

  6. Dobson spectrophotometer ozone measurements during international ozone rocketsonde intercomparison

    NASA Technical Reports Server (NTRS)

    Parsons, C. L.

    1980-01-01

    Measurements of the total ozone content of the atmosphere, made with seven ground-based instruments at a site near Wallops Island, Virginia, are discussed in terms of serving as control values with which the rocket-borne sensor data products can be compared. These products are profiles of O3 concentration with altitude. By integrating over the range of altitudes from the surface to the rocket apogee, and by appropriately estimating the residual ozone amount from apogee to the top of the atmosphere, a total ozone amount can be computed from the profiles that can be directly compared with the ground-based instrumentation results. Dobson spectrophotometers were used for two of the ground-based instruments. Preliminary data collected during the IORI from Dobson spectrophotometers 72 and 38 are presented. The agreement between the two instruments and the variability of the total ozone overburden through the experiment period are discussed.

  7. Template-Directed Instrumentation Reduces Cost and Improves Efficiency for Total Knee Arthroplasty: An Economic Decision Analysis and Pilot Study.

    PubMed

    McLawhorn, Alexander S; Carroll, Kaitlin M; Blevins, Jason L; DeNegre, Scott T; Mayman, David J; Jerabek, Seth A

    2015-10-01

    Template-directed instrumentation (TDI) for total knee arthroplasty (TKA) may streamline operating room (OR) workflow and reduce costs by preselecting implants and minimizing instrument tray burden. A decision model simulated the economics of TDI. Sensitivity analyses determined thresholds for model variables to ensure TDI success. A clinical pilot was reviewed. The accuracy of preoperative templates was validated, and 20 consecutive primary TKAs were performed using TDI. The model determined that preoperative component size estimation should be accurate to ±1 implant size for 50% of TKAs to implement TDI. The pilot showed that preoperative template accuracy exceeded 97%. There were statistically significant improvements in OR turnover time and in-room time for TDI compared to an historical cohort of TKAs. TDI reduces costs and improves OR efficiency. Copyright © 2015 Elsevier Inc. All rights reserved.

  8. A Technique for Measuring Rotorcraft Dynamic Stability in the 40- by 80-Foot Wind Tunnel

    NASA Technical Reports Server (NTRS)

    Gupta, N. K.; Bohn, J. G.

    1977-01-01

    An on-line technique is described for the measurement of tilt rotor aircraft dynamic stability in the Ames 40- by 80-Foot Wind Tunnel. The technique is based on advanced system identification methodology and uses the instrumental variables approach. It is particularly applicable to real-time estimation problems with limited amounts of noise-contaminated data. Several simulations are used to evaluate the algorithm. Estimated natural frequencies and damping ratios are compared with simulation values. The algorithm is also applied to wind tunnel data in an off-line mode. The results are used to develop preliminary guidelines for effective use of the algorithm.
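
    A generic illustration of the instrumental-variables idea in system identification: fit an AR(2) model to a noise-contaminated modal response using delayed outputs as instruments (uncorrelated with the measurement noise), then convert the identified discrete-time poles to a frequency and damping ratio. The simulated mode, noise levels, and choice of lags below are assumptions for illustration, not the wind-tunnel algorithm described in the report.

      import numpy as np

      fs, f_n, zeta = 50.0, 2.0, 0.05                 # sample rate (Hz), modal frequency, damping ratio
      wd = 2 * np.pi * f_n * np.sqrt(1 - zeta**2)
      r = np.exp(-zeta * 2 * np.pi * f_n / fs)
      a1_true, a2_true = 2 * r * np.cos(wd / fs), -r**2

      # Simulate a randomly excited, lightly damped mode, then add measurement noise
      rng = np.random.default_rng(7)
      n = 4000
      x, e = np.zeros(n), rng.normal(0, 1, n)
      for k in range(2, n):
          x[k] = a1_true * x[k - 1] + a2_true * x[k - 2] + e[k]
      y = x + rng.normal(0, 5.0, n)                   # noise-contaminated measurement

      # IV fit of y[k] = a1*y[k-1] + a2*y[k-2], instruments = output lags 3 and 4
      Y = y[4:]
      X = np.column_stack([y[3:-1], y[2:-2]])
      Z = np.column_stack([y[1:-3], y[0:-4]])
      a1, a2 = np.linalg.solve(Z.T @ X, Z.T @ Y)

      # Recover frequency and damping ratio from the identified discrete-time poles
      pole = np.roots([1.0, -a1, -a2])[0]
      s = np.log(pole) * fs
      print(f"frequency ~ {abs(s.imag) / (2 * np.pi):.2f} Hz, damping ratio ~ {-s.real / abs(s):.3f}")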

  9. Taking up or turning down: new estimates of household demand for employer-sponsored health insurance.

    PubMed

    Abraham, Jean Marie; Feldman, Roger

    2010-01-01

    This study provides new estimates of demand for employer-sponsored health insurance, using the 1997-2001 linked Household Component-Insurance Component of the Medical Expenditure Panel Survey (MEPS). Our focus is on households' decisions to take up coverage through a worker's employer. We found a significant inverse relationship between the out-of-pocket premium and the probability of taking up coverage, with the price effect considerably larger when we used instrumental variables methods to account for endogenous out-of-pocket premiums. Additionally, workers in families with more children eligible for Medicaid were less likely to take up coverage.

  10. Children and Careers: How Family Size Affects Parents' Labor Market Outcomes in the Long Run.

    PubMed

    Cools, Sara; Markussen, Simen; Strøm, Marte

    2017-10-01

    We estimate the effect of family size on various measures of labor market outcomes over the whole career until retirement, using instrumental variables estimation in data from Norwegian administrative registers. Parents' number of children is instrumented with the sex mix of their first two children. We find that having additional children causes sizable reductions in labor supply for women, which fade as children mature and even turn positive for women without a college degree. Among women with a college degree, there is evidence of persistent and even increasing career penalties of family size. Having additional children reduces these women's probability of being employed by higher-paying firms, their earnings rank within the employing firm, and their probability of being the top earner at the workplace. Some of the career effects persist long after labor supply is restored. We find no effect of family size on any of men's labor market outcomes in either the short or long run.

  11. Effect of Nursing Home Ownership on the Quality of Post-Acute Care: An Instrumental Variables Approach

    PubMed Central

    Grabowski, David C.; Feng, Zhanlian; Hirth, Richard; Rahman, Momotazur; Mor, Vincent

    2012-01-01

    Given the preferential tax treatment afforded nonprofit firms, policymakers and researchers have been interested in whether the nonprofit sector provides higher nursing home quality relative to its for-profit counterpart. However, differential selection into for-profits and nonprofits can lead to biased estimates of the effect of ownership form. By using “differential distance” to the nearest nonprofit nursing home relative to the nearest for-profit nursing home, we mimic randomization of residents into more or less “exposure” to nonprofit homes when estimating the effects of ownership on quality of care. Using national Minimum Data Set assessments linked with Medicare claims, we use a national cohort of post-acute patients who were newly admitted to nursing homes within an 18-month period spanning January 1, 2004 and June 30, 2005. After instrumenting for ownership status, we found that post-acute patients in nonprofit facilities had fewer 30-day hospitalizations and greater improvement in mobility, pain, and functioning. PMID:23202253
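
    The 'differential distance' construction can be made concrete: for each patient the instrument is the distance to the nearest nonprofit home minus the distance to the nearest for-profit home, which then instruments ownership of the admitting facility in a two-stage least squares fit. The coordinates, outcome model, and helper functions below are hypothetical illustrations, not the authors' code or covariate set.

      import numpy as np

      def differential_distance(patient_xy, nonprofit_xy, forprofit_xy):
          d_np = np.min(np.linalg.norm(nonprofit_xy - patient_xy, axis=1))
          d_fp = np.min(np.linalg.norm(forprofit_xy - patient_xy, axis=1))
          return d_np - d_fp                               # instrument: relative proximity

      def two_stage_ls(z, d, y):
          Z = np.column_stack([np.ones_like(z), z])
          d_hat = Z @ np.linalg.lstsq(Z, d, rcond=None)[0]  # first stage: ownership on instrument
          X = np.column_stack([np.ones_like(d_hat), d_hat])
          return np.linalg.lstsq(X, y, rcond=None)[0][1]    # second-stage slope on predicted ownership

      rng = np.random.default_rng(3)
      patients = rng.uniform(0, 10, (500, 2))               # hypothetical patient coordinates
      nonprofits = rng.uniform(0, 10, (8, 2))
      forprofits = rng.uniform(0, 10, (8, 2))
      z = np.array([differential_distance(p, nonprofits, forprofits) for p in patients])
      d = (z + rng.normal(0, 2, 500) < 0).astype(float)     # admitted to a nonprofit facility?
      y = -0.1 * d + rng.normal(0, 1, 500)                  # toy post-acute outcome score
      print("2SLS estimate of the nonprofit effect:", round(two_stage_ls(z, d, y), 3))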

  12. Poverty, Pregnancy, and Birth Outcomes: A Study of the Earned Income Tax Credit.

    PubMed

    Hamad, Rita; Rehkopf, David H

    2015-09-01

    Economic interventions are increasingly recognised as a mechanism to address perinatal health outcomes among disadvantaged groups. In the US, the earned income tax credit (EITC) is the largest poverty alleviation programme. Little is known about its effects on perinatal health among recipients and their children. We exploit quasi-random variation in the size of EITC payments to examine the effects of income on perinatal health. The study sample includes women surveyed in the 1979 National Longitudinal Survey of Youth (n = 2985) and their children born during 1986-2000 (n = 4683). Outcome variables include utilisation of prenatal and postnatal care, use of alcohol and tobacco during pregnancy, term birth, birthweight, and breast-feeding status. We first examine the health effects of both household income and EITC payment size using multivariable linear regressions. We then employ instrumental variables analysis to estimate the causal effect of income on perinatal health, using EITC payment size as an instrument for household income. We find that EITC payment size is associated with better levels of several indicators of perinatal health. Instrumental variables analysis, however, does not reveal a causal association between household income and these health measures. Our findings suggest that associations between income and perinatal health may be confounded by unobserved characteristics, but that EITC income improves perinatal health. Future studies should continue to explore the impacts of economic interventions on perinatal health outcomes, and investigate how different forms of income transfers may have different impacts. © 2015 John Wiley & Sons Ltd.

  13. Addressing continuous data measured with different instruments for participants excluded from trial analysis: a guide for systematic reviewers.

    PubMed

    Ebrahim, Shanil; Johnston, Bradley C; Akl, Elie A; Mustafa, Reem A; Sun, Xin; Walter, Stephen D; Heels-Ansdell, Diane; Alonso-Coello, Pablo; Guyatt, Gordon H

    2014-05-01

    We previously developed an approach to address the impact of missing participant data in meta-analyses of continuous variables in trials that used the same measurement instrument. We extend this approach to meta-analyses including trials that use different instruments to measure the same construct. We reviewed the available literature, conducted an iterative consultative process, and developed an approach involving a complete-case analysis complemented by sensitivity analyses that apply a series of increasingly stringent assumptions about results in patients with missing continuous outcome data. Our approach involves choosing the reference measurement instrument; converting scores from different instruments to the units of the reference instrument; developing four successively more stringent imputation strategies for addressing missing participant data; calculating a pooled mean difference for the complete-case analysis and imputation strategies; calculating the proportion of patients who experienced an important treatment effect; and judging the impact of the imputation strategies on the confidence in the estimate of effect. We applied our approach to an example systematic review of respiratory rehabilitation for chronic obstructive pulmonary disease. Our extended approach provides quantitative guidance for addressing missing participant data in systematic reviews of trials using different instruments to measure the same construct. Copyright © 2014 Elsevier Inc. All rights reserved.
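
    The core mechanics, converting each trial's result into the units of a chosen reference instrument and then re-pooling, can be sketched briefly. The SD-ratio conversion and trial numbers below are simplified assumptions, not the paper's full guidance; its sensitivity analyses would then repeat the pooling after imputing increasingly unfavourable scores for participants with missing data.

      import numpy as np

      # Convert trial mean differences measured on different instruments to the units
      # of a reference instrument, then pool with inverse-variance weights.
      trials = [  # (mean_diff, se, sd_of_instrument_in_trial) -- illustrative values
          (-6.0, 2.0, 15.0),   # trial already using the reference instrument (SD = 15)
          (-0.8, 0.3, 2.1),    # trial using a different instrument for the same construct
      ]
      ref_sd = 15.0

      md, se = [], []
      for mean_diff, s, instr_sd in trials:
          scale = ref_sd / instr_sd                 # rescale to reference-instrument units
          md.append(mean_diff * scale)
          se.append(s * scale)

      w = 1.0 / np.square(se)                       # fixed-effect inverse-variance weights
      pooled = np.sum(w * np.array(md)) / np.sum(w)
      print(f"pooled mean difference (reference units): {pooled:.2f}")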

  14. Validation of a computerized 24-hour physical activity recall (24PAR) instrument with pattern-recognition activity monitors.

    PubMed

    Calabro, Miguel A; Welk, Gregory J; Carriquiry, Alicia L; Nusser, Sarah M; Beyler, Nicholas K; Mathews, Charles E

    2009-03-01

    The purpose of this study was to examine the validity of a computerized 24-hour physical activity recall instrument (24PAR). Participants (n=20) wore 2 pattern-recognition activity monitors (an IDEEA and a SenseWear Pro Armband) for a 24-hour period and then completed the 24PAR the following morning. Participants completed 2 trials, 1 while maintaining a prospective diary of their activities and 1 without a diary. The trials were counterbalanced and completed within a week from each other. Estimates of energy expenditure (EE) and minutes of moderate-to-vigorous physical activity (MVPA) were compared with the criterion measures using 3-way (method by gender by trial) mixed-model ANOVA analyses. For EE, pairwise correlations were high (r>.88), and there were no differences in estimates across methods. Estimates of MVPA were more variable, but correlations were still in the moderate to high range (r>.57). Average activity levels were significantly higher on the logging trial, but there was no significant difference in the accuracy of self-report on days with and without logging. The results of this study support the overall utility of the 24PAR for group-level estimates of daily EE and MVPA.

  15. Artificial Intelligence Estimation of Carotid-Femoral Pulse Wave Velocity using Carotid Waveform.

    PubMed

    Tavallali, Peyman; Razavi, Marianne; Pahlevan, Niema M

    2018-01-17

    In this article, we offer an artificial intelligence method to estimate the carotid-femoral Pulse Wave Velocity (PWV) non-invasively from one uncalibrated carotid waveform measured by tonometry and few routine clinical variables. Since the signal processing inputs to this machine learning algorithm are sensor agnostic, the presented method can accompany any medical instrument that provides a calibrated or uncalibrated carotid pressure waveform. Our results show that, for an unseen hold back test set population in the age range of 20 to 69, our model can estimate PWV with a Root-Mean-Square Error (RMSE) of 1.12 m/sec compared to the reference method. The results convey the fact that this model is a reliable surrogate of PWV. Our study also showed that estimated PWV was significantly associated with an increased risk of CVDs.

  16. Horizontal Variability of Water and Its Relationship to Cloud Fraction near the Tropical Tropopause: Using Aircraft Observations of Water Vapor to Improve the Representation of Grid-scale Cloud Formation in GEOS-5

    NASA Technical Reports Server (NTRS)

    Selkirk, Henry B.; Molod, Andrea M.

    2014-01-01

    Large-scale models such as GEOS-5 typically calculate grid-scale fractional cloudiness through a PDF parameterization of the sub-gridscale distribution of specific humidity. The GEOS-5 moisture routine uses a simple rectangular PDF varying in height that follows a tanh profile. While below 10 km this profile is informed by moisture information from the AIRS instrument, there is relatively little empirical basis for the profile above that level. ATTREX provides an opportunity to refine the profile using estimates of the horizontal variability of measurements of water vapor, total water and ice particles from the Global Hawk aircraft at or near the tropopause. These measurements will be compared with estimates of large-scale cloud fraction from CALIPSO and lidar retrievals from the CPL on the aircraft. We will use the variability measurements to perform studies of the sensitivity of the GEOS-5 cloud-fraction to various modifications to the PDF shape and to its vertical profile.
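
    With a rectangular (uniform) sub-grid PDF of total water whose half-width follows a tanh profile in height, the grid-scale cloud fraction is simply the portion of the interval that exceeds saturation. The sketch below illustrates that calculation; the profile constants and humidity values are invented and are not the GEOS-5 tuning informed by AIRS or ATTREX.

      import numpy as np

      def half_width(z_km, delta_low=0.20, delta_high=0.05, z0=11.0, scale=2.0):
          """Relative half-width of the rectangular total-water PDF vs. height (tanh profile).
          Constants are illustrative, not the GEOS-5 coefficients."""
          return delta_low + 0.5 * (delta_high - delta_low) * (1.0 + np.tanh((z_km - z0) / scale))

      def cloud_fraction(q_mean, q_sat, z_km):
          """Fraction of a uniform PDF on [q_mean - d, q_mean + d] exceeding saturation."""
          d = half_width(z_km) * q_mean
          lo, hi = q_mean - d, q_mean + d
          return np.clip((hi - q_sat) / (hi - lo), 0.0, 1.0)

      # Same grid-mean humidity (95% of saturation), narrower PDF aloft -> smaller cloud fraction
      for z in (8.0, 12.0, 16.0):
          print(f"{z:4.0f} km -> cloud fraction {cloud_fraction(0.95, 1.0, z):.3f}")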

  17. Field-scale and Regional Variability in Evapotranspiration over Crops in California using Eddy Covariance and Surface Renewal

    NASA Astrophysics Data System (ADS)

    Kent, E. R.; Clay, J. M.; Leinfelder-Miles, M.; Lambert, J. J.; Little, C.; Monteiro, R. O. C.; Monteiro, P. F. C.; Shapiro, K.; Rice, S.; Snyder, R. L.; Daniele, Z.; Paw U, K. T.

    2016-12-01

    Evapotranspiration (ET) estimated using a single crop coefficient and a grass reference largely ignores variability due to heterogeneity in microclimate, soils, and crop management. We employ a relatively low cost energy balance residual method using surface renewal and eddy covariance measurements to continuously estimate half-hourly and daily ET across more than 15 fields and orchards spanning four crops and two regions of California. In the Sacramento-San Joaquin River Delta, measurements were taken in corn, pasture, and alfalfa fields, with 4-5 stations in each crop type spread across the region. In the Southern San Joaquin Valley, measurements were taken in three different pistachio orchards, with one orchard having six stations instrumented to examine salinity-induced heterogeneity. We analyze field-scale and regional variability in ET and measured surface energy balance components. Cross comparisons between the eddy covariance and the surface renewal measurements confirm the robustness of the surface renewal method. A hybrid approach in which remotely sensed net radiation is combined with in situ measurements of sensible heat flux is also investigated. This work will provide ground-truth data for satellite and aerial-based ET estimates and will inform water management at the field and regional scales.

  18. VO2 and VCO2 variabilities through indirect calorimetry instrumentation.

    PubMed

    Cadena-Méndez, Miguel; Escalante-Ramírez, Boris; Azpiroz-Leehan, Joaquín; Infante-Vázquez, Oscar

    2013-01-01

    The aim of this paper is to understand how to measure the VO2 and VCO2 variabilities in indirect calorimetry (IC), since we believe they can explain the high variation in resting energy expenditure (REE) estimation. We propose that variabilities should be measured separately from the VO2 and VCO2 averages to understand technological differences among metabolic monitors when they estimate the REE. To test this hypothesis, the mixing chamber (MC) and the breath-by-breath (BbB) techniques were used to measure the VO2 and VCO2 averages and their variabilities. Variances and power spectrum energies in the 0-0.5 Hz band were measured to establish differences between the techniques in steady and non-steady state. A hybrid calorimeter with both IC techniques was used to study a population of 15 volunteers who underwent the clino-orthostatic maneuver in order to produce the two physiological stages. The results showed that inter-individual VO2 and VCO2 variabilities measured as variances were negligible using the MC, while variabilities measured as spectral energies using the BbB increased by 71 and 56%, respectively (p < 0.05). Additionally, the energy analysis showed an unexpected cyclic rhythm at 0.025 Hz only during the orthostatic stage, which is new physiological information not reported previously. The VO2 and VCO2 inter-individual averages increased by 63 and 39% using the MC (p < 0.05) and by 32 and 40% using the BbB (p < 0.1), respectively, without noticeable statistical differences between techniques. The conclusions are: (a) metabolic monitors should simultaneously include the MC and the BbB techniques to correctly interpret the effect of steady- or non-steady-state variabilities on REE estimation, (b) the MC is the appropriate technique for computing averages since it behaves as a low-pass filter that minimizes variances, (c) the BbB is the ideal technique for measuring the variabilities since it can work as a high-pass filter to generate discrete time series suitable for spectral analysis, and (d) the new physiological information in the VO2 and VCO2 variabilities can help explain why metabolic monitors with dissimilar IC techniques give different results in REE estimation.
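
    The contrast drawn here, MC behaving as a low-pass filter for averages versus BbB providing a discrete time series suitable for spectral analysis, maps naturally onto a Welch power-spectrum estimate over the 0-0.5 Hz band. The snippet below is an illustrative sketch on synthetic data containing a weak 0.025 Hz oscillation like the one reported during orthostasis; the uniform 1 Hz sampling and signal amplitudes are assumptions, and this is not the authors' processing chain.

      import numpy as np
      from scipy.signal import welch

      rng = np.random.default_rng(4)
      fs = 1.0                                     # assumed uniform resampling of BbB data at 1 Hz
      t = np.arange(0, 600, 1 / fs)                # 10 minutes of synthetic VO2 (mL/min)
      vo2 = 300 + 8 * np.sin(2 * np.pi * 0.025 * t) + rng.normal(0, 5, t.size)

      f, pxx = welch(vo2 - vo2.mean(), fs=fs, nperseg=256)
      band = (f > 0) & (f <= 0.5)                  # analysis band considered in the paper
      print("spectral energy in 0-0.5 Hz band:", np.trapz(pxx[band], f[band]))
      print("dominant frequency (Hz):", f[band][np.argmax(pxx[band])])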

  19. Poverty and Child Development: A Longitudinal Study of the Impact of the Earned Income Tax Credit.

    PubMed

    Hamad, Rita; Rehkopf, David H

    2016-05-01

    Although adverse socioeconomic conditions are correlated with worse child health and development, the effects of poverty-alleviation policies are less understood. We examined the associations of the Earned Income Tax Credit (EITC) on child development and used an instrumental variable approach to estimate the potential impacts of income. We used data from the US National Longitudinal Survey of Youth (n = 8,186) during 1986-2000 to examine effects on the Behavioral Problems Index (BPI) and Home Observation Measurement of the Environment inventory (HOME) scores. We conducted 2 analyses. In the first, we used multivariate linear regressions with child-level fixed effects to examine the association of EITC payment size with BPI and HOME scores; in the second, we used EITC payment size as an instrument to estimate the associations of income with BPI and HOME scores. In linear regression models, higher EITC payments were associated with improved short-term BPI scores (per $1,000, β = -0.57; P = 0.04). In instrumental variable analyses, higher income was associated with improved short-term BPI scores (per $1,000, β = -0.47; P = 0.01) and medium-term HOME scores (per $1,000, β = 0.64; P = 0.02). Our results suggest that both EITC benefits and higher income are associated with modest but meaningful improvements in child development. These findings provide valuable information for health researchers and policymakers for improving child health and development. © The Author 2016. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  20. An instrumental variable approach finds no associated harm or benefit from early dialysis initiation in the United States

    PubMed Central

    Scialla, Julia J.; Liu, Jiannong; Crews, Deidra C.; Guo, Haifeng; Bandeen-Roche, Karen; Ephraim, Patti L.; Tangri, Navdeep; Sozio, Stephen M.; Shafi, Tariq; Miskulin, Dana C.; Michels, Wieneke M.; Jaar, Bernard G.; Wu, Albert W.; Powe, Neil R.; Boulware, L. Ebony

    2014-01-01

    The estimated glomerular filtration rate (eGFR) at dialysis initiation has been rising. Observational studies suggest harm, but may be confounded by unmeasured factors. As instrumental variable methods may be less biased, we performed a retrospective cohort study of 310,932 patients starting dialysis between 2006 and 2008 and registered in the United States Renal Data System in order to describe geographic variation in eGFR at dialysis initiation and determine its association with mortality. Patients were grouped into 804 health service areas by zip code. Individual eGFR at dialysis initiation averaged 10.8 ml/min/1.73m2 but varied geographically. Only 11% of the variation in mean health service area-level eGFR at dialysis initiation was accounted for by patient characteristics. We calculated demographic-adjusted mean eGFR at dialysis initiation in the health service areas using the 2006 and 2007 incident cohorts as our instrument and estimated the association between individual eGFR at dialysis initiation and mortality in the 2008 incident cohort using the two-stage residual inclusion method. Among 89,547 patients starting dialysis in 2008 with eGFR 5 to 20 ml/min/1.73m2, eGFR at initiation was not associated with mortality over a median of 15.5 months [hazard ratio 1.025 per 1 ml/min/1.73m2 for eGFR 5 to 14 ml/min/1.73m2, and 0.973 per 1 ml/min/1.73m2 for eGFR 14 to 20 ml/min/1.73m2]. Thus, there was no associated harm or benefit from early dialysis initiation in the United States. PMID:24786707
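
    The two-stage residual inclusion (2SRI) idea can be sketched generically: regress the endogenous exposure (individual eGFR at initiation) on the instrument (area-level mean eGFR), then include the first-stage residual alongside the exposure in the outcome model. The toy example below uses a linear outcome for brevity, whereas the study fit survival models; all variable names and parameter values are invented.

      import numpy as np

      rng = np.random.default_rng(5)
      n = 10_000
      area_mean_egfr = rng.normal(10.8, 1.0, n)             # instrument: area-level practice style
      confounder = rng.normal(0, 1, n)                      # unmeasured severity
      egfr_at_init = 0.8 * area_mean_egfr + 1.5 * confounder + rng.normal(0, 1, n)
      outcome = 0.0 * egfr_at_init + 2.0 * confounder + rng.normal(0, 1, n)  # true effect = 0

      def ols(X, y):
          X = np.column_stack([np.ones(len(y)), X])
          return np.linalg.lstsq(X, y, rcond=None)[0]

      # Stage 1: exposure on instrument; keep the residual
      b1 = ols(area_mean_egfr, egfr_at_init)
      resid = egfr_at_init - (b1[0] + b1[1] * area_mean_egfr)

      # Stage 2: outcome on exposure plus the first-stage residual (2SRI)
      b2 = ols(np.column_stack([egfr_at_init, resid]), outcome)
      print("2SRI estimate of the eGFR effect:", round(b2[1], 3))   # ~0, unlike biased naive OLS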

  1. PACIC Instrument: disentangling dimensions using published validation models.

    PubMed

    Iglesias, K; Burnand, B; Peytremann-Bridevaux, I

    2014-06-01

    To better understand the structure of the Patient Assessment of Chronic Illness Care (PACIC) instrument. More specifically to test all published validation models, using one single data set and appropriate statistical tools. Validation study using data from cross-sectional survey. A population-based sample of non-institutionalized adults with diabetes residing in Switzerland (canton of Vaud). French version of the 20-items PACIC instrument (5-point response scale). We conducted validation analyses using confirmatory factor analysis (CFA). The original five-dimension model and other published models were tested with three types of CFA: based on (i) a Pearson estimator of variance-covariance matrix, (ii) a polychoric correlation matrix and (iii) a likelihood estimation with a multinomial distribution for the manifest variables. All models were assessed using loadings and goodness-of-fit measures. The analytical sample included 406 patients. Mean age was 64.4 years and 59% were men. Median of item responses varied between 1 and 4 (range 1-5), and range of missing values was between 5.7 and 12.3%. Strong floor and ceiling effects were present. Even though loadings of the tested models were relatively high, the only model showing acceptable fit was the 11-item single-dimension model. PACIC was associated with the expected variables of the field. Our results showed that the model considering 11 items in a single dimension exhibited the best fit for our data. A single score, in complement to the consideration of single-item results, might be used instead of the five dimensions usually described. © The Author 2014. Published by Oxford University Press in association with the International Society for Quality in Health Care; all rights reserved.

  2. A Unified Approach for Reporting ARM Measurement Uncertainties Technical Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Campos, E; Sisterson, Douglas

    The U.S. Department of Energy (DOE) Atmospheric Radiation Measurement (ARM) Climate Research Facility is observationally based, and quantifying the uncertainty of its measurements is critically important. With over 300 widely differing instruments providing over 2,500 datastreams, concise expression of measurement uncertainty is quite challenging. The ARM Facility currently provides data and supporting metadata (information about the data or data quality) to its users through a number of sources. Because the continued success of the ARM Facility depends on the known quality of its measurements, the Facility relies on instrument mentors and the ARM Data Quality Office (DQO) to ensure, assess, and report measurement quality. Therefore, an easily accessible, well-articulated estimate of ARM measurement uncertainty is needed. Note that some of the instrument observations require mathematical algorithms (retrievals) to convert a measured engineering variable into a useful geophysical measurement. While those types of retrieval measurements are identified, this study does not address particular methods for retrieval uncertainty. As well, the ARM Facility also provides engineered data products, or value-added products (VAPs), based on multiple instrument measurements. This study does not include uncertainty estimates for those data products. We propose here that a total measurement uncertainty should be calculated as a function of the instrument uncertainty (calibration factors), the field uncertainty (environmental factors), and the retrieval uncertainty (algorithm factors). The study will not expand on methods for computing these uncertainties. Instead, it will focus on the practical identification, characterization, and inventory of the measurement uncertainties already available in the ARM community through the ARM instrument mentors and their ARM instrument handbooks. As a result, this study will address the first steps towards reporting ARM measurement uncertainty: 1) identifying how the uncertainty of individual ARM measurements is currently expressed, 2) identifying a consistent approach to measurement uncertainty, and then 3) reclassifying ARM instrument measurement uncertainties in a common framework.
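
    A natural reading of "total measurement uncertainty as a function of instrument, field, and retrieval uncertainty" is a root-sum-of-squares combination of independent components, in the spirit of standard uncertainty propagation. The helper below is a generic sketch of that bookkeeping under that assumption, not an official ARM formula or datastream convention.

      from math import sqrt

      def total_uncertainty(instrument, field, retrieval=0.0):
          """Combine independent uncertainty components in quadrature (root sum of squares).
          Component values are illustrative; ARM handbooks report them per instrument."""
          return sqrt(instrument**2 + field**2 + retrieval**2)

      # Example: a hypothetical measurement with calibration, environmental, and
      # retrieval-algorithm contributions expressed in the same physical units
      u = total_uncertainty(instrument=1.5, field=0.8, retrieval=2.0)
      print(f"total measurement uncertainty: {u:.2f} (same units as the components)")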

  3. Overcoming bias in estimating the volume-outcome relationship.

    PubMed

    Tsai, Alexander C; Votruba, Mark; Bridges, John F P; Cebul, Randall D

    2006-02-01

    To examine the effect of hospital volume on 30-day mortality for patients with congestive heart failure (CHF) using administrative and clinical data in conventional regression and instrumental variables (IV) estimation models. The primary data consisted of longitudinal information on comorbid conditions, vital signs, clinical status, and laboratory test results for 21,555 Medicare-insured patients aged 65 years and older hospitalized for CHF in northeast Ohio in 1991-1997. The patient was the primary unit of analysis. We fit a linear probability model to the data to assess the effects of hospital volume on patient mortality within 30 days of admission. Both administrative and clinical data elements were included for risk adjustment. Linear distances between patients and hospitals were used to construct the instrument, which was then used to assess the endogeneity of hospital volume. When only administrative data elements were included in the risk adjustment model, the estimated volume-outcome effect was statistically significant (p=.029) but small in magnitude. The estimate was markedly attenuated in magnitude and statistical significance when clinical data were added to the model as risk adjusters (p=.39). IV estimation shifted the estimate in a direction consistent with selective referral, but we were unable to reject the consistency of the linear probability estimates. Use of only administrative data for volume-outcomes research may generate spurious findings. The IV analysis further suggests that conventional estimates of the volume-outcome relationship may be contaminated by selective referral effects. Taken together, our results suggest that efforts to concentrate hospital-based CHF care in high-volume hospitals may not reduce mortality among elderly patients.

  4. Cocaine Dependence Treatment Data: Methods for Measurement Error Problems With Predictors Derived From Stationary Stochastic Processes

    PubMed Central

    Guan, Yongtao; Li, Yehua; Sinha, Rajita

    2011-01-01

    In a cocaine dependence treatment study, we use linear and nonlinear regression models to model posttreatment cocaine craving scores and first cocaine relapse time. A subset of the covariates are summary statistics derived from baseline daily cocaine use trajectories, such as baseline cocaine use frequency and average daily use amount. These summary statistics are subject to estimation error and can therefore cause biased estimators for the regression coefficients. Unlike classical measurement error problems, the error we encounter here is heteroscedastic with an unknown distribution, and there are no replicates for the error-prone variables or instrumental variables. We propose two robust methods to correct for the bias: a computationally efficient method-of-moments-based method for linear regression models and a subsampling extrapolation method that is generally applicable to both linear and nonlinear regression models. Simulations and an application to the cocaine dependence treatment data are used to illustrate the efficacy of the proposed methods. Asymptotic theory and variance estimation for the proposed subsampling extrapolation method and some additional simulation results are described in the online supplementary material. PMID:21984854
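
    The simplest instance of a moment-based correction for an error-prone derived covariate is the classical attenuation adjustment: rescale the naive OLS slope by the reliability ratio implied by the estimation-error variance. The sketch below shows that special case on simulated data with invented parameters; the paper's estimators are more general, handling heteroscedastic errors of unknown distribution without replicates or instrumental variables.

      import numpy as np

      rng = np.random.default_rng(6)
      n = 5_000
      x_true = rng.normal(0, 1, n)                # e.g. true baseline use frequency
      tau2 = 0.5                                  # assumed variance of the estimation error
      x_obs = x_true + rng.normal(0, np.sqrt(tau2), n)
      y = 1.0 * x_true + rng.normal(0, 1, n)      # outcome, true slope = 1

      beta_naive = np.cov(x_obs, y)[0, 1] / np.var(x_obs, ddof=1)
      reliability = (np.var(x_obs, ddof=1) - tau2) / np.var(x_obs, ddof=1)
      beta_corrected = beta_naive / reliability   # method-of-moments attenuation correction

      print(f"naive: {beta_naive:.2f}  corrected: {beta_corrected:.2f}  (true = 1.00)")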

  5. Spatial Variability of Trace Gases During DISCOVER-AQ: Planning for Geostationary Observations of Atmospheric Composition

    NASA Technical Reports Server (NTRS)

    Follette-Cook, Melanie B.; Pickering, K.; Crawford, J.; Appel, W.; Diskin, G.; Fried, A.; Loughner, C.; Pfister, G.; Weinheimer, A.

    2015-01-01

    Results from an in-depth analysis of trace gas variability in MD indicated that the variability in this region was large enough to be observable by a TEMPO-like instrument. The variability observed in MD is relatively similar to that of the other three campaigns, with a few exceptions: CO variability in CA was much higher than in the other regions; HCHO variability in CA and CO was much lower; and MD showed the lowest variability in NO2. All model simulations do a reasonable job simulating O3 variability. For CO, the CA and CO simulations largely under- or overestimate the variability in the observations. The variability in HCHO is underestimated for every campaign. NO2 variability is slightly overestimated in MD, and more so in CO. The TX simulation underestimates the variability in each trace gas, most likely due to missing emissions sources (C. Loughner, manuscript in preparation). Future work: Where reasonable, we will use these model outputs to further explore the resolvability from space of these key trace gases using analyses of tropospheric column amounts relative to satellite precision requirements, similar to Follette-Cook et al. (2015).

  6. A Sample of What We Have Learned from A-Train Cloud Measurements

    NASA Technical Reports Server (NTRS)

    Joiner, Joanna; Vasilkov, Alexander; Ziemke, Jerry; Chandra, Sushil; Spurr, Robert; Bhartia, P. K.; Krotkov, Nick; Sneep, Maarten; Menzel, Paul; Platnick, Steve; hide

    2008-01-01

    The A-train active sensors CloudSat and CALIPSO provide detailed information about cloud vertical structure. Coarse vertical information can also be obtained from a combination of passive sensors (e.g. cloud liquid water content from AMSR-E, cloud ice properties from MLS and HIRDLS, cloud-top pressure from MODIS and AIRS, and UV/VIS/Near IR absorption and scattering from OMI, MODIS, and POLDER). In addition, the wide swaths of instruments such as MODIS, AIRS, OMI, POLDER, and AMSR-E can be exploited to create estimates of the three-dimensional cloud extent. We will show how data fusion from A-train sensors can be used, e.g., to detect and map the presence of multiple layer/phase clouds. Ultimately, combined cloud information from A-train instruments will allow for estimates of heating and radiative flux at the surface as well as UV/VIS/Near IR trace-gas absorption at the overpass time on a near-global daily basis. CloudSat has also dramatically improved our interpretation of visible and UV passive measurements in complex cloudy situations such as deep convection and multiple cloud layers. This has led to new approaches for unique and accurate constituent retrievals from A-train instruments. For example, ozone mixing ratios inside tropical deep convective clouds have recently been estimated using the Aura Ozone Monitoring Instrument (OMI). Field campaign data from TC4 provide additional information about the spatial variability and origin of trace-gases inside convective clouds. We will highlight some of the new applications of remote sensing in cloudy conditions that have been enabled by the synergy between the A-train active and passive sensors.

  7. Tests of Hypotheses Arising In the Correlated Random Coefficient Model*

    PubMed Central

    Heckman, James J.; Schmierer, Daniel

    2010-01-01

    This paper examines the correlated random coefficient model. It extends the analysis of Swamy (1971), who pioneered the uncorrelated random coefficient model in economics. We develop the properties of the correlated random coefficient model and derive a new representation of the variance of the instrumental variable estimator for that model. We develop tests of the validity of the correlated random coefficient model against the null hypothesis of the uncorrelated random coefficient model. PMID:21170148

  8. Physical Therapy as the First Point of Care to Treat Low Back Pain: An Instrumental Variables Approach to Estimate Impact on Opioid Prescription, Health Care Utilization, and Costs.

    PubMed

    Frogner, Bianca K; Harwood, Kenneth; Andrilla, C Holly A; Schwartz, Malaika; Pines, Jesse M

    2018-05-23

    To compare differences in opioid prescription, health care utilization, and costs among patients with low back pain (LBP) who saw a physical therapist (PT) at the first point of care, at any time during the episode, or not at all. Commercial health insurance claims data, 2009-2013. Retrospective analyses using two-stage residual inclusion instrumental variable models to estimate rates for opioid prescriptions, imaging services, emergency department visits, hospitalization, and health care costs. Patients aged 18-64 years with a new primary diagnosis of LBP, living in the northwest United States, were observed over a 1-year period. Compared to patients who saw a PT later or never, patients who saw a PT first had a lower probability of having an opioid prescription (89.4 percent), any advanced imaging services (27.9 percent), and an emergency department visit (14.7 percent), yet a 19.3 percent higher probability of hospitalization (all p < .001). These patients also had significantly lower out-of-pocket costs, and costs appeared to shift away from outpatient and pharmacy toward provider settings. When LBP patients saw a PT first, there was lower utilization of high-cost medical services as well as lower opioid use, and cost shifts reflecting the change in utilization. © Health Research and Educational Trust.
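
    The two-stage residual inclusion (2SRI) approach named above can be sketched as follows: the endogenous exposure (seeing a PT first) is regressed on an instrument and controls, and the first-stage residual is carried into the nonlinear outcome model as a control function. Column names, including the instrument z, are hypothetical placeholders, and the snippet is only an outline of the method, not the study's specification.

        import pandas as pd
        import statsmodels.api as sm

        df = pd.read_csv("lbp_claims.csv")                  # hypothetical claims extract
        controls = ["age", "female", "comorbidity_score"]

        # Stage 1: model the endogenous exposure and keep the residual.
        X1 = sm.add_constant(df[["z"] + controls])
        stage1 = sm.OLS(df["pt_first"], X1).fit()
        df["resid1"] = df["pt_first"] - stage1.fittedvalues

        # Stage 2: binary outcome model that includes the first-stage residual.
        X2 = sm.add_constant(df[["pt_first", "resid1"] + controls])
        stage2 = sm.Logit(df["opioid_rx"], X2).fit()
        print(stage2.summary())

        # Standard errors should account for the estimated residual, e.g. by
        # bootstrapping the two stages together.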

  9. Examining the influence of antenatal care visits and skilled delivery on neonatal deaths in Ghana.

    PubMed

    Lambon-Quayefio, Monica P; Owoo, Nkechi S

    2014-10-01

    Many Sub-Saharan African countries may not achieve the Millennium Development goal of reducing child mortality by 2015 partly due to the stalled reduction in neonatal deaths, which constitute about 60% of infant deaths. Although many studies have emphasized the importance of accessible maternal healthcare as a means of reducing maternal and child mortality, very few of these studies have explored the affordability and accessibility concerns of maternal healthcare on neonatal mortality. This study bridges this research gap as it aims to investigate whether the number of antenatal visits and skilled delivery are associated with the risk of neonatal deaths in Ghana. Using individual level data of women in their reproductive years from the 2008 Demographic and Health Survey, the study employs an instrumental variable strategy to deal with the potential endogeneity of antenatal care visits. Estimates from the instrumental variable estimation show that antenatal care visits reduce the risk of neonatal death by about 2%, while older women have an approximately 0.2% higher risk of losing their neonates than do younger women. Findings suggest that women who attend antenatal visits have a significantly lower probability of losing their babies in the first month of life. Further, results show that women's age significantly affects the risk of losing their babies in the neonatal stage. However, the study finds no significant effect of skilled delivery and education on neonatal mortality.

  10. Remote-sensing reflectance determinations in the coastal ocean environment: impact of instrumental characteristics and environmental variability.

    PubMed

    Toole, D A; Siegel, D A; Menzies, D W; Neumann, M J; Smith, R C

    2000-01-20

    Three independent ocean color sampling methodologies are compared to assess the potential impact of instrumental characteristics and environmental variability on shipboard remote-sensing reflectance observations from the Santa Barbara Channel, California. Results indicate that under typical field conditions, simultaneous determinations of incident irradiance can vary by 9-18%, upwelling radiance just above the sea surface by 8-18%, and remote-sensing reflectance by 12-24%. Variations in radiometric determinations can be attributed to a variety of environmental factors such as Sun angle, cloud cover, wind speed, and viewing geometry; however, wind speed is isolated as the major source of uncertainty. The above-water approach to estimating water-leaving radiance and remote-sensing reflectance is highly influenced by environmental factors. A model of the role of wind on the reflected sky radiance measured by an above-water sensor illustrates that, for clear-sky conditions and wind speeds greater than 5 m/s, determinations of water-leaving radiance at 490 nm are undercorrected by as much as 60%. A data merging procedure is presented to provide sky radiance correction parameters for above-water remote-sensing reflectance estimates. The merging results are consistent with statistical and model findings and highlight the importance of multiple field measurements in developing quality coastal oceanographic data sets for satellite ocean color algorithm development and validation.

  11. Constraining the temperature history of the past millennium using early instrumental observations

    NASA Astrophysics Data System (ADS)

    Brohan, P.

    2012-12-01

    The current assessment that twentieth-century global temperature change is unusual in the context of the last thousand years relies on estimates of temperature changes from natural proxies (tree-rings, ice-cores etc.) and climate model simulations. Confidence in such estimates is limited by difficulties in calibrating the proxies and by systematic differences between proxy reconstructions and model simulations, notably large differences in multi-decadal variability among proxy reconstructions and big uncertainties in the effect of volcanic eruptions. Because the difference between the estimates extends into the relatively recent period of the early nineteenth century, it is possible to compare them with a reliable instrumental estimate of the temperature change over that period, provided that enough early thermometer observations, covering a wide enough expanse of the world, can be collected. By constraining key aspects of the reconstructions and simulations, instrumental observations, inevitably from a limited period, can reduce reconstruction uncertainty throughout the millennium. A considerable quantity of early instrumental observations is preserved in the world's archives. One organisation which systematically made observations and collected the results was the English East-India Company (EEIC), and 900 log-books of EEIC ships containing daily instrumental measurements of temperature and pressure have been preserved in the British Library. Similar records from voyages of exploration and scientific investigation are preserved in the published literature and in national archives. Some of these records have been extracted and digitised, providing hundreds of thousands of new weather records offering an unprecedentedly detailed view of the weather and climate of the late eighteenth and early nineteenth centuries. The new thermometer observations demonstrate that the large-scale temperature response to the Tambora eruption and the 1809 eruption was modest (perhaps 0.5 °C). This provides a powerful out-of-sample validation for the proxy reconstructions, supporting their use for longer-term climate reconstructions. However, some of the climate model simulations in the CMIP5 ensemble show much larger volcanic effects than this; such simulations are unlikely to be accurate in this respect.

  12. Cross-section and panel estimates of peer effects in early adolescent cannabis use: With a little help from my 'friends once removed'.

    PubMed

    Moriarty, John; McVicar, Duncan; Higgins, Kathryn

    2016-08-01

    Peer effects in adolescent cannabis use are difficult to estimate, due in part to the lack of appropriate data on behaviour and social ties. This paper exploits survey data that have many desirable properties and have not previously been used for this purpose. The data set, collected from teenagers in three annual waves from 2002 to 2004, contains longitudinal information about friendship networks within schools (N = 5020). We exploit these data on network structure to estimate peer effects on adolescents from their nominated friends within school using two alternative approaches to identification. First, we present a cross-sectional instrumental variable (IV) estimate of peer effects that exploits network structure at the second degree, i.e. using information on friends of friends who are not themselves ego's friends to instrument for the cannabis use of friends. Second, we present an individual fixed effects estimate of peer effects using the full longitudinal structure of the data. Both innovations allow a greater degree of control for correlated effects than is commonly the case in the substance-use peer effects literature, improving our chances of obtaining estimates of peer effects that can be plausibly interpreted as causal. Both estimates suggest positive peer effects of non-trivial magnitude, although the IV estimate is imprecise. Furthermore, when we specify identical models with the behaviour and characteristics of randomly selected school peers in place of friends', we find effectively zero effect from these 'placebo' peers, lending credence to our main estimates. We conclude that cross-sectional data can be used to estimate plausible positive peer effects on cannabis use where network structure information is available and appropriately exploited. Copyright © 2016 Elsevier Ltd. All rights reserved.
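
    A minimal sketch of the friends-of-friends instrument described above: the mean cannabis use of ego's nominated friends (the endogenous peer variable) is instrumented by the mean use of friends-of-friends who are not themselves ego's friends. The toy six-pupil network and use scores below exist only to show the mechanics; they are not the study's data.

        import numpy as np
        import statsmodels.api as sm

        use = {1: 2, 2: 5, 3: 1, 4: 6, 5: 0, 6: 7}                    # cannabis-use scores by pupil id
        friends = {1: {2, 3}, 2: {1, 4}, 3: {1, 5},
                   4: {2, 6}, 5: {3}, 6: {4}}                         # within-school friendship ties

        rows = []
        for ego, f in friends.items():
            fof = set().union(*(friends[j] for j in f)) - f - {ego}   # friends of friends, excluding friends and ego
            if not f or not fof:
                continue
            rows.append((use[ego],
                         np.mean([use[j] for j in f]),                # endogenous: friends' mean use
                         np.mean([use[k] for k in fof])))             # instrument: friends-of-friends' mean use
        y, x_endog, z = map(np.array, zip(*rows))

        # Manual 2SLS: first stage, then regress the outcome on the fitted values.
        x_hat = sm.OLS(x_endog, sm.add_constant(z)).fit().fittedvalues
        iv_fit = sm.OLS(y, sm.add_constant(x_hat)).fit()
        print(iv_fit.params)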

  13. An Instrument for Estimating the 6-Month Survival Probability After Whole-brain Irradiation Alone for Cerebral Metastases from Gynecological Cancer.

    PubMed

    Janssen, Stefan; Hansen, Heinke C; Schild, Steven E; Rades, Dirk

    2018-06-01

    Patients with cerebral metastases from gynecological cancer who receive whole-brain irradiation (WBI) alone require personalized therapy. This study contributes to personalized care by creating an instrument to predict 6-month survival probability. In 49 patients, six pre-treatment variables, namely age, Eastern Cooperative Oncology Group performance score (ECOG-PS), primary tumor type, number of cerebral metastases, metastasis outside the brain, and interval between diagnosis of gynecological cancer and WBI, were analyzed for survival. Of the six pre-treatment variables, ECOG-PS was significantly associated with survival (p=0.014) and metastasis outside the brain showed a trend for association (p=0.096). Six-month survival rates divided by 10 resulted in scores of 0, 2 or 7 points for ECOG-PS and of 2 or 7 points for metastasis outside the brain. Scores for individual patients were 2, 4, 7, 9 or 14 points. Three groups were created, those with 2-7, 9 and 14 points, with 6-month survival rates of 10%, 53% and 100%, respectively (p=0.004). An instrument was designed to predict the 6-month survival of patients receiving WBI for cerebral metastases from gynecological cancer and facilitate personalized care. Copyright© 2018, International Institute of Anticancer Research (Dr. George J. Delinasios), All rights reserved.
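
    The scoring rule reported above can be expressed as a tiny function: per-factor points are the 6-month survival rates divided by 10, and a patient's score is the sum over the two retained factors. The mapping of individual ECOG-PS levels and of extracranial metastasis status to the published point values (0/2/7 and 2/7) is not spelled out in the abstract, so the assignments below are assumptions to be checked against the full paper.

        def survival_score(ecog_ps: int, mets_outside_brain: bool) -> int:
            """Sum of per-factor points; factor-to-point mappings are assumed, not taken from the paper."""
            ecog_points = {0: 7, 1: 7, 2: 2, 3: 0, 4: 0}[ecog_ps]   # assumed mapping to 0/2/7
            mets_points = 2 if mets_outside_brain else 7            # assumed mapping to 2/7
            return ecog_points + mets_points

        def risk_group(score: int) -> str:
            if score <= 7:
                return "2-7 points: ~10% 6-month survival"
            if score == 9:
                return "9 points: ~53% 6-month survival"
            return "14 points: ~100% 6-month survival"

        s = survival_score(ecog_ps=1, mets_outside_brain=True)
        print(s, risk_group(s))                                     # 9 points -> intermediate group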

  14. Latest NASA Instrument Cost Model (NICM): Version VI

    NASA Technical Reports Server (NTRS)

    Mrozinski, Joe; Habib-Agahi, Hamid; Fox, George; Ball, Gary

    2014-01-01

    The NASA Instrument Cost Model, NICM, is a suite of tools which allow for probabilistic cost estimation of NASA's space-flight instruments at both the system and subsystem level. NICM also includes the ability to perform cost by analogy as well as joint confidence level (JCL) analysis. The latest version of NICM, Version VI, was released in Spring 2014. This paper will focus on the new features released with NICM VI, which include: 1) the NICM-E cost estimating relationship, which is applicable to instruments flying on Explorer-like class missions; 2) a new cluster analysis ability which, alongside the results of the parametric cost estimation for the user's instrument, also provides a visualization of the user's instrument's similarity to previously flown instruments; and 3) new cost estimating relationships for in-situ instruments.

  15. Increased tree-ring network density reveals more precise estimations of sub-regional hydroclimate variability and climate dynamics in the Midwest, USA

    NASA Astrophysics Data System (ADS)

    Maxwell, Justin T.; Harley, Grant L.

    2017-08-01

    Understanding the historic variability in the hydroclimate provides important information on possible extreme dry or wet periods that in turn inform water management plans. Tree rings have long provided historical context for hydroclimate variability of the U.S. However, the tree-ring network used to create these countrywide gridded reconstructions is sparse in certain locations, such as the Midwest. Here, we increase (n = 20) the spatial resolution of the tree-ring network in southern Indiana and compare a summer (June-August) Palmer Drought Severity Index (PDSI) reconstruction to existing gridded reconstructions of PDSI for this region. We find previously unknown droughts and pluvials that rival the most intense PDSI values during the instrumental period. Additionally, historical droughts occurred in Indiana that eclipsed instrumental conditions with regard to severity and duration. During the period 1962-2004 CE, we find that teleconnections of drought conditions through the Atlantic Meridional Overturning Circulation have a strong influence (r = -0.60, p < 0.01) on secondary tree growth in this region for the late spring-early summer season. These findings highlight the importance of continuing to increase the spatial resolution of the tree-ring network used to infer past climate dynamics to capture the sub-regional spatial variability. Increasing the spatial resolution of the tree-ring network for a given region can better identify sub-regional variability, improve the accuracy of regional tree-ring PDSI reconstructions, and provide better information for climatic teleconnections.

  16. Variability of Springtime Transpacific Pollution Transport During 2000-2006: The INTEX-B Mission in the Context of Previous Years

    NASA Technical Reports Server (NTRS)

    Pfister, G. G.; Emmons, L. K.; Edwards, D. P.; Arellano, A.; Sachse, G.; Campos, T.

    2010-01-01

    We analyze the transport of pollution across the Pacific during the NASA INTEX-B (Intercontinental Chemical Transport Experiment Part B) campaign in spring 2006 and examine how this year compares to the time period from 2000 through 2006. In addition to aircraft measurements of carbon monoxide (CO) collected during INTEX-B, we include in this study multi-year satellite retrievals of CO from the Measurements of Pollution in the Troposphere (MOPITT) instrument and simulations from the chemistry transport model MOZART-4. Model tracers are used to examine the contributions of different source regions and source types to pollution levels over the Pacific. Additional modeling studies are performed to separate the impacts of inter-annual variability in meteorology and dynamics from changes in source strength. Interannual variability in the tropospheric CO burden over the Pacific and the US as estimated from the MOPITT data ranges up to 7%, and a somewhat smaller estimate (5%) is derived from the model. When keeping the emissions in the model constant between years, the year-to-year changes are reduced (2%), but show that, in addition to changes in emissions, variable meteorological conditions also impact transpacific pollution transport. We estimate that about 1/3 of the variability in the tropospheric CO loading over the contiguous US is explained by changes in emissions and about 2/3 by changes in meteorology and transport. Biomass burning sources are found to be a larger driver for inter-annual variability in the CO loading compared to fossil and biofuel sources or photochemical CO production even though their absolute contributions are smaller. Source contribution analysis shows that the aircraft sampling during INTEX-B was fairly representative of the larger scale region, but with a slight bias towards higher influence from Asian contributions.

  17. Detecting potential anomalies in projections of rainfall trends and patterns using human observations

    NASA Astrophysics Data System (ADS)

    Kohfeld, K. E.; Savo, V.; Sillmann, J.; Morton, C.; Lepofsky, D.

    2016-12-01

    Shifting precipitation patterns are a well-documented consequence of climate change, but their spatial variability is particularly difficult to assess. While the accuracy of global models has increased, specific regional changes in precipitation regimes are not well captured by these models. Typically, researchers who wish to detect trends and patterns in climatic variables, such as precipitation, use instrumental observations. In our study, we combined observations of rainfall by subsistence-oriented communities with several metrics of rainfall estimated from global instrumental records for comparable time periods (1955-2005). This comparison was aimed at identifying: 1) which rainfall metrics best match human observations of changes in precipitation; 2) areas where local communities observe changes not detected by global models. The collated observations (~3800) made by subsistence-oriented communities covered 129 countries (~1830 localities). For comparable time periods, we saw a substantial correspondence between instrumental records and human observations (66-77%) at the same locations, regardless of whether we considered trends in general rainfall, drought, or extreme rainfall. We observed a clustering of mismatches in two specific regions, possibly indicating some climatic phenomena not completely captured by the currently available global models. Many human observations also indicated an increased unpredictability in the start, end, duration, and continuity of the rainy seasons, all of which may hamper the performance of subsistence activities. We suggest that future instrumental metrics should capture this unpredictability of rainfall. This information would be important for thousands of subsistence-oriented communities in planning, coping, and adapting to climate change.

  18. Reconciling differences in stratospheric ozone composites

    NASA Astrophysics Data System (ADS)

    Ball, William T.; Alsing, Justin; Mortlock, Daniel J.; Rozanov, Eugene V.; Tummon, Fiona; Haigh, Joanna D.

    2017-10-01

    Observations of stratospheric ozone from multiple instruments now span three decades; combining these into composite datasets allows long-term ozone trends to be estimated. Recently, several ozone composites have been published, but trends disagree by latitude and altitude, even between composites built upon the same instrument data. We confirm that the main causes of differences in decadal trend estimates lie in (i) steps in the composite time series when the instrument source data changes and (ii) artificial sub-decadal trends in the underlying instrument data. These artefacts introduce features that can alias with regressors in multiple linear regression (MLR) analysis; both can lead to inaccurate trend estimates. Here, we aim to remove these artefacts using Bayesian methods to infer the underlying ozone time series from a set of composites by building a joint-likelihood function using a Gaussian-mixture density to model outliers introduced by data artefacts, together with a data-driven prior on ozone variability that incorporates knowledge of problems during instrument operation. We apply this Bayesian self-calibration approach to stratospheric ozone in 10° bands from 60° S to 60° N and from 46 to 1 hPa (˜ 21-48 km) for 1985-2012. There are two main outcomes: (i) we independently identify and confirm many of the data problems previously identified, but which remain unaccounted for in existing composites; (ii) we construct an ozone composite, with uncertainties, that is free from most of these problems - we call this the BAyeSian Integrated and Consolidated (BASIC) composite. To analyse the new BASIC composite, we use dynamical linear modelling (DLM), which provides a more robust estimate of long-term changes through Bayesian inference than MLR. BASIC and DLM, together, provide a step forward in improving estimates of decadal trends. Our results indicate a significant recovery of ozone since 1998 in the upper stratosphere, of both northern and southern midlatitudes, in all four composites analysed, and particularly in the BASIC composite. The BASIC results also show no hemispheric difference in the recovery at midlatitudes, in contrast to an apparent feature that is present, but not consistent, in the four composites. Our overall conclusion is that it is possible to effectively combine different ozone composites and account for artefacts and drifts, and that this leads to a clear and significant result that upper stratospheric ozone levels have increased since 1998, following an earlier decline.
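
    The outlier handling described above can be illustrated with a much-reduced version of the idea: each residual is modelled as coming either from a narrow "good data" Gaussian or, with small probability, from a much wider "artefact" Gaussian, and the mixture likelihood is maximised. The toy below estimates a single offset from contaminated data; the paper's hierarchical Bayesian model is far richer, so treat this only as a sketch of the mixture-likelihood ingredient.

        import numpy as np
        from scipy.optimize import minimize
        from scipy.stats import norm

        rng = np.random.default_rng(1)
        data = 2.0 + rng.normal(0, 0.3, 200)                 # "true" signal plus noise
        data[:10] += rng.normal(0, 5.0, 10)                  # injected artefact-like outliers

        def neg_log_like(params, x, eps=0.05, kappa=10.0):
            """Negative log-likelihood of a two-component Gaussian mixture centred on mu."""
            mu, log_sigma = params
            sigma = np.exp(log_sigma)
            dens = (1 - eps) * norm.pdf(x, mu, sigma) + eps * norm.pdf(x, mu, kappa * sigma)
            return -np.sum(np.log(dens))

        fit = minimize(neg_log_like, x0=[0.0, 0.0], args=(data,))
        print(fit.x[0])                                      # robust location estimate, near 2.0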

  19. Estimating Elasticity for Residential Electricity Demand in China

    PubMed Central

    Shi, G.; Zheng, X.; Song, F.

    2012-01-01

    Residential demand for electricity is estimated for China using a unique household level dataset. Household electricity demand is specified as a function of local electricity price, household income, and a number of social-economic variables at household level. We find that the residential demand for electricity responds rather sensitively to its own price in China, which implies that there is significant potential to use the price instrument to conserve electricity consumption. Electricity elasticities across different heterogeneous household groups (e.g., rich versus poor and rural versus urban) are also estimated. The results show that the high income group is more price elastic than the low income group, while rural families are more price elastic than urban families. These results have important policy implications for designing an increasing block tariff. PMID:22997492
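
    In a log-log demand specification like the one described above, the coefficient on log price is read directly as the price elasticity. The sketch below shows that reading on a hypothetical household extract; the column names, controls, and any treatment of price endogeneity in the actual paper may differ.

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        df = pd.read_csv("household_electricity.csv")        # hypothetical household survey extract
        fit = smf.ols("np.log(kwh) ~ np.log(price) + np.log(income) + urban + hh_size",
                      data=df).fit(cov_type="HC1")
        print(fit.params["np.log(price)"])                    # estimated own-price elasticity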

  20. Does Money Really Matter? Estimating Impacts of Family Income on Young Children's Achievement With Data From Random-Assignment Experiments

    PubMed Central

    Duncan, Greg J.; Morris, Pamela A.; Rodrigues, Chris

    2011-01-01

    Social scientists do not agree on the size and nature of the causal impacts of parental income on children's achievement. We revisit this issue using a set of welfare and antipoverty experiments conducted in the 1990s. We utilize an instrumental variables strategy to leverage the variation in income and achievement that arises from random assignment to the treatment group to estimate the causal effect of income on child achievement. Our estimates suggest that a $1,000 increase in annual income increases young children's achievement by 5%–6% of a standard deviation. As such, our results suggest that family income has a policy-relevant, positive impact on the eventual school achievement of preschool children. PMID:21688900
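
    With random assignment serving as the instrument for family income, the simplest version of the IV logic above is a Wald estimator: the reduced-form effect of assignment on achievement divided by the first-stage effect of assignment on income. The snippet below illustrates that ratio with hypothetical column names; the paper's estimates pool multiple experiments and include covariates, so this is only an outline of the idea.

        import pandas as pd

        df = pd.read_csv("welfare_experiments.csv")          # hypothetical pooled experimental data

        treated = df[df["assigned_treatment"] == 1]
        control = df[df["assigned_treatment"] == 0]

        first_stage = treated["income"].mean() - control["income"].mean()            # effect on income ($)
        reduced_form = treated["achievement"].mean() - control["achievement"].mean()

        wald_iv = reduced_form / first_stage                 # achievement gain per extra dollar
        print(f"Effect of $1,000 of annual income: {1000 * wald_iv:.3f} achievement units")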

  1. Estimating elasticity for residential electricity demand in China.

    PubMed

    Shi, G; Zheng, X; Song, F

    2012-01-01

    Residential demand for electricity is estimated for China using a unique household level dataset. Household electricity demand is specified as a function of local electricity price, household income, and a number of social-economic variables at household level. We find that the residential demand for electricity responds rather sensitively to its own price in China, which implies that there is significant potential to use the price instrument to conserve electricity consumption. Electricity elasticities across different heterogeneous household groups (e.g., rich versus poor and rural versus urban) are also estimated. The results show that the high income group is more price elastic than the low income group, while rural families are more price elastic than urban families. These results have important policy implications for designing an increasing block tariff.

  2. Satellite-based Analysis of CO Variability over the Amazon Basin

    NASA Astrophysics Data System (ADS)

    Deeter, M. N.; Emmons, L. K.; Martinez-Alonso, S.; Tilmes, S.; Wiedinmyer, C.

    2017-12-01

    Pyrogenic emissions from the Amazon Basin exert significant influence on both climate and air quality but are highly variable from year to year. The ability of models to simulate the impact of biomass burning emissions on downstream atmospheric concentrations depends on (1) the quality of surface flux estimates (i.e., emissions inventories), (2) model dynamics (e.g., horizontal winds, large-scale convection and mixing) and (3) the representation of atmospheric chemical processes. With an atmospheric lifetime of a few months, carbon monoxide (CO) is a commonly used diagnostic for biomass burning. CO products are available from several satellite instruments and allow analyses of CO variability over extended regions such as the Amazon Basin with useful spatial and temporal sampling characteristics. The MOPITT ('Measurements of Pollution in the Troposphere') instrument was launched on the NASA Terra platform near the end of 1999 and is still operational. MOPITT is uniquely capable of measuring tropospheric CO concentrations using both thermal-infrared and near-infrared observations, resulting in the ability to independently retrieve lower- and upper-troposphere CO concentrations. We exploit the 18-year MOPITT record and related datasets to analyze the variability of CO over the Amazon Basin and evaluate simulations performed with the CAM-chem chemical transport model. We demonstrate that observed differences between MOPITT observations and model simulations provide important clues regarding emissions inventories, convective mixing and long-range transport.

  3. Definition study for variable cycle engine testbed engine and associated test program

    NASA Technical Reports Server (NTRS)

    Vdoviak, J. W.

    1978-01-01

    The product/study double bypass variable cycle engine (VCE) was updated to incorporate recent improvements. The effect of these improvements on mission range and noise levels was determined. This engine design was then compared with current existing high-technology core engines in order to define a subscale testbed configuration that simulated many of the critical technology features of the product/study VCE. Detailed preliminary program plans were then developed for the design, fabrication, and static test of the selected testbed engine configuration. These plans included estimated costs and schedules for the detail design, fabrication and test of the testbed engine and the definition of a test program, test plan, schedule, instrumentation, and test stand requirements.

  4. Solar array model corrections from Mars Pathfinder lander data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ewell, R.C.; Burger, D.R.

    1997-12-31

    The MESUR solar array power model initially assumed values for input variables. After landing, early surface variables such as array tilt and azimuth or early environmental variables such as array temperature can be corrected. Correction of later environmental variables such as tau versus time, spectral shift, dust deposition, and UV darkening is dependent upon time, on-board science instruments, and the ability to separate the effects of variables. Engineering estimates had to be made for additional shadow losses and Voc sensor temperature corrections. Some variations had not been expected, such as tau versus time of day and spectral shift versus time of day. Additions needed to the model are the thermal mass of the lander petal and a correction between the Voc sensor and the temperature sensor. Conclusions are: the model works well; good battery predictions are difficult; inclusion of Isc and Voc sensors was valuable; and the IMP and MAE science experiments greatly assisted the data analysis and model correction.

  5. Properties of perimetric threshold estimates from Full Threshold, SITA Standard, and SITA Fast strategies.

    PubMed

    Artes, Paul H; Iwase, Aiko; Ohno, Yuko; Kitazawa, Yoshiaki; Chauhan, Balwantray C

    2002-08-01

    To investigate the distributions of threshold estimates with the Swedish Interactive Threshold Algorithms (SITA) Standard, SITA Fast, and the Full Threshold algorithm (Humphrey Field Analyzer; Zeiss-Humphrey Instruments, Dublin, CA) and to compare the pointwise test-retest variability of these strategies. One eye of 49 patients (mean age, 61.6 years; range, 22-81) with glaucoma (Mean Deviation mean, -7.13 dB; range, +1.8 to -23.9 dB) was examined four times with each of the three strategies. The mean and median SITA Standard and SITA Fast threshold estimates were compared with a "best available" estimate of sensitivity (mean results of three Full Threshold tests). Pointwise 90% retest limits (5th and 95th percentiles of retest thresholds) were derived to assess the reproducibility of individual threshold estimates. The differences between the threshold estimates of the SITA and Full Threshold strategies were largest (approximately 3 dB) for midrange sensitivities (approximately 15 dB). The threshold distributions of SITA were considerably different from those of the Full Threshold strategy. The differences remained of similar magnitude when the analysis was repeated on a subset of 20 locations that are examined early during the course of a Full Threshold examination. With sensitivities above 25 dB, both SITA strategies exhibited lower test-retest variability than the Full Threshold strategy. Below 25 dB, the retest intervals of SITA Standard were slightly smaller than those of the Full Threshold strategy, whereas those of SITA Fast were larger. SITA Standard may be superior to the Full Threshold strategy for monitoring patients with visual field loss. The greater test-retest variability of SITA Fast in areas of low sensitivity is likely to offset the benefit of even shorter test durations with this strategy. The sensitivity differences between the SITA and Full Threshold strategies may relate to factors other than reduced fatigue. They are, however, small in comparison to the test-retest variability.

  6. Welfare analysis of a zero-smoking policy - A case study in Japan.

    PubMed

    Nakamura, Yuuki; Takahashi, Kenzo; Nomura, Marika; Kamei, Miwako

    2018-03-19

    Smoking cessation efforts in Japan reduce smoking rates. A future zero-smoking policy would completely prohibit smoking (0% rate). We therefore analyzed the social welfare of smokers and non-smokers under a hypothetical zero-smoking policy. The demand curve for smoking from 1990 to 2014 was estimated by the two-stage least squares method using the tax on tobacco as the instrumental variable, with quantity defined as the number of cigarettes smoked and price as total tobacco sales divided by total cigarettes smoked. In the estimation equation (calculated using the ordinary least squares method), the price of tobacco was the dependent variable and tobacco quantity the explanatory variable. The estimated constant was 31.90, the estimated coefficient of quantity was -0.0061 (both p < 0.0004), and the coefficient of determination was 0.9187. Thus, the 2015 consumer surplus was 1.08 trillion yen (US$ 9.82 billion) (95% confidence interval (CI), 889 billion yen (US$ 8.08 billion) - 1.27 trillion yen (US$ 11.6 billion)). Because tax revenue from tobacco in 2011 was 2.38 trillion yen (US$ 21.6 billion), the estimated deadweight loss if smoking were prohibited in 2014 was 3.31 trillion yen (US$ 30.2 billion) (95% CI, 3.13 trillion yen (US$ 28.5 billion) - 3.50 trillion yen (US$ 31.8 billion)), representing a deadweight loss about 0.6 trillion yen (US$ 5.45 billion) below the 2014 disease burden (4.10-4.12 trillion yen (US$ 37.3-37.5 billion)). We conclude that a zero-smoking policy would improve social welfare in Japan.
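
    The welfare arithmetic follows from the estimated linear inverse demand curve p = 31.90 - 0.0061 q: consumer surplus at consumption level q is the triangle 0.5 * |slope| * q^2, and the abstract indicates that the deadweight loss of a ban combines this surplus with forgone tobacco tax revenue. The quantity input below is a placeholder, not the paper's data, and the paper's exact accounting may differ.

        intercept, slope = 31.90, -0.0061           # estimated inverse demand parameters

        def consumer_surplus(q: float) -> float:
            """Triangle under a linear demand curve above the market price."""
            return 0.5 * abs(slope) * q ** 2

        q_current = 1.0e7                           # hypothetical quantity, in the units of the estimation data
        tax_revenue = 2.38e12                       # yen, tobacco tax revenue reported for 2011

        cs = consumer_surplus(q_current)
        welfare_loss_of_ban = cs + tax_revenue      # rough tally; the paper's accounting may differ
        print(cs, welfare_loss_of_ban)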

  7. Development and comparisons of wind retrieval algorithms for small unmanned aerial systems

    NASA Astrophysics Data System (ADS)

    Bonin, T. A.; Chilson, P. B.; Zielke, B. S.; Klein, P. M.; Leeman, J. R.

    2012-12-01

    Recently, there has been an increase in use of Unmanned Aerial Systems (UASs) as platforms for conducting fundamental and applied research in the lower atmosphere due to their relatively low cost and ability to collect samples with high spatial and temporal resolution. Concurrent with this development comes the need for accurate instrumentation and measurement methods suitable for small meteorological UASs. Moreover, the instrumentation to be integrated into such platforms must be small and lightweight. Whereas thermodynamic variables can be easily measured using well aspirated sensors onboard, it is much more challenging to accurately measure the wind with a UAS. Several algorithms have been developed that incorporate GPS observations as a means of estimating the horizontal wind vector, with each algorithm exhibiting its own particular strengths and weaknesses. In the present study, the performance of three such GPS-based wind-retrieval algorithms has been investigated and compared with wind estimates from rawinsonde and sodar observations. Each of the algorithms considered agreed well with the wind measurements from sounding and sodar data. Through the integration of UAS-retrieved profiles of thermodynamic and kinematic parameters, one can investigate the static and dynamic stability of the atmosphere and relate them to the state of the boundary layer across a variety of times and locations, which might be difficult to access using conventional instrumentation.

  8. Comparison and application of wind retrieval algorithms for small unmanned aerial systems

    NASA Astrophysics Data System (ADS)

    Bonin, T. A.; Chilson, P. B.; Zielke, B. S.; Klein, P. M.; Leeman, J. R.

    2013-07-01

    Recently, there has been an increase in use of Unmanned Aerial Systems (UASs) as platforms for conducting fundamental and applied research in the lower atmosphere due to their relatively low cost and ability to collect samples with high spatial and temporal resolution. Concurrent with this development comes the need for accurate instrumentation and measurement methods suitable for small meteorological UASs. Moreover, the instrumentation to be integrated into such platforms must be small and lightweight. Whereas thermodynamic variables can be easily measured using well-aspirated sensors onboard, it is much more challenging to accurately measure the wind with a UAS. Several algorithms have been developed that incorporate GPS observations as a means of estimating the horizontal wind vector, with each algorithm exhibiting its own particular strengths and weaknesses. In the present study, the performance of three such GPS-based wind-retrieval algorithms has been investigated and compared with wind estimates from rawinsonde and sodar observations. Each of the algorithms considered agreed well with the wind measurements from sounding and sodar data. Through the integration of UAS-retrieved profiles of thermodynamic and kinematic parameters, one can investigate the static and dynamic stability of the atmosphere and relate them to the state of the boundary layer across a variety of times and locations, which might be difficult to access using conventional instrumentation.
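
    One widely used GPS-only wind retrieval idea (not necessarily one of the three algorithms compared in the two records above) exploits the fact that, at roughly constant airspeed, the GPS ground-velocity vectors collected as the aircraft turns through different headings lie on a circle whose centre is the horizontal wind vector. A linear least-squares circle fit recovers that centre, as sketched below on synthetic data.

        import numpy as np

        def wind_from_groundspeed(vx, vy):
            """Fit x^2 + y^2 = 2*a*x + 2*b*y + c to ground-velocity samples; (a, b) is the wind vector."""
            vx, vy = np.asarray(vx), np.asarray(vy)
            A = np.column_stack([2 * vx, 2 * vy, np.ones_like(vx)])
            rhs = vx ** 2 + vy ** 2
            (a, b, c), *_ = np.linalg.lstsq(A, rhs, rcond=None)
            airspeed = np.sqrt(c + a ** 2 + b ** 2)          # circle radius = true airspeed
            return np.array([a, b]), airspeed

        # Synthetic check: 12 m/s airspeed through a full turn, 4 m/s westerly wind, GPS noise.
        rng = np.random.default_rng(2)
        heading = np.linspace(0, 2 * np.pi, 40)
        vx = 12 * np.cos(heading) + 4.0 + rng.normal(0, 0.2, heading.size)
        vy = 12 * np.sin(heading) + 0.0 + rng.normal(0, 0.2, heading.size)
        print(wind_from_groundspeed(vx, vy))                 # wind near (4, 0), airspeed near 12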

  9. A thermal control system for long-term survival of scientific instruments on lunar surface.

    PubMed

    Ogawa, K; Iijima, Y; Sakatani, N; Otake, H; Tanaka, S

    2014-03-01

    A thermal control system is being developed for scientific instruments placed on the lunar surface. This thermal control system, the Lunar Mission Survival Module (MSM), was designed for scientific instruments that are planned to be operated for over a year in the future Japanese lunar landing mission SELENE-2. For long-term operations, the lunar surface is a severe environment because the soil (regolith) temperature varies widely, from approximately -200 °C at night to 100 °C during the day, conditions in which space electronics can hardly survive. The MSM has a tent of multi-layered insulators and uses a "regolith mound"; the temperature of internal devices is less variable, much as in the lunar underground layers. The insulators retain heat in the regolith soil during daylight, which keeps the devices warm at night. We conducted the concept design of the lunar survival module and estimated its potential with a thermal mathematical model, on the assumption of using a lunar seismometer designed for SELENE-2. Thermal vacuum tests were also conducted using a thermal evaluation model in order to check the validity of some thermal parameters assumed in the thermal mathematical model. The numerical and experimental results indicated a sufficient survivability potential of the concept of our thermal control system.

  10. Estimation of neonatal outcome artery pH value according to CTG interpretation of the last 60 min before delivery: a retrospective study. Can the outcome pH value be predicted?

    PubMed

    Kundu, S; Kuehnle, E; Schippert, C; von Ehr, J; Hillemanns, P; Staboulidou, Ismini

    2017-11-01

    The aim of this study was to analyze whether the umbilical artery pH value can be estimated through CTG assessment 60 min prior to delivery and whether the estimated umbilical artery pH value correlates with the actual one. This includes analysis of the correlation between CTG trace classification and the actual umbilical artery pH value. Intra- and interobserver agreement and the impact of professional experience on visual analysis of fetal heart rate tracings were evaluated. This was a retrospective study. 300 CTG records of the last 60 min before delivery were picked randomly from the computer database with the following inclusion criteria: singleton pregnancy >37 weeks, no fetal anomalies, vaginal delivery either spontaneous or instrumental-assisted. Five obstetricians and two midwives of different professional experience classified 300 CTG traces according to the FIGO criteria and estimated the postnatal umbilical artery pH. The results showed a significant difference (p < 0.05) between estimated and actual pH values, independent of professional experience. Analysis and correlation of CTG assessment and actual umbilical artery pH value showed significantly (p < 0.05) diverging results. Intra- and interobserver variability was high. Intraobserver variability was significantly higher for the resident (p = 0.001). No significant differences were detected regarding interobserver variability. An estimation of the pH value, and consequently of neonatal outcome, on the basis of a present CTG seems to be difficult. Therefore, not only CTG training but also clinical experience and collaboration and consultation within the whole team are important.

  11. Coastal Land Air Sea Interaction: "the" beach towers

    NASA Astrophysics Data System (ADS)

    MacMahan, J. H.; Koscinski, J. S.; Ortiz-Suslow, D. G.; Haus, B. K.; Thornton, E. B.

    2016-12-01

    As part of the Coastal Land Air Sea Interaction (CLASI) experiment, an alongshore array of 6-m high towers instrumented with ultrasonic 3D anemometers and temperature-relative humidity sensors was deployed at five sandy beaches near the high-tide line in Monterey Bay, CA, in May-June 2016. A cross-shore array of towers was also deployed from within the active surfzone to the toe of the dune at one beach. In addition, waves and ocean temperature were obtained along the 10 m isobath for each beach. The dissipative surfzone was O(80 m) wide. The wave energy varies among the beaches owing to sheltering and refraction by the Monterey Canyon and headlands. The tides are semi-diurnal mixed, meso-tidal, with a maximum tidal range of 2 m. This results in a variable beach width from the tower to the tidal line. Footprint analysis for estimating the source region of the turbulent momentum fluxes suggests that the observations represent three scenarios described as primarily ocean, mixed beach and ocean, and primarily beach. The direct estimates of atmospheric stability from the sonic anemometers suggest that all of the beaches are mostly unstable except for a few occurrences in the evening during low wind conditions. The onshore neutral drag coefficient (Cd) estimated at 10 m height is 3-5 times larger than open-ocean estimates. Minimal variability was found in Cd based on the footprint analysis. Beach-specific spatial variability in Cd was found to be related to atmospheric stability and wave energy.

  12. [The effect of tobacco prices on consumption: a time series data analysis for Mexico].

    PubMed

    Olivera-Chávez, Rosa Itandehui; Cermeño-Bazán, Rodolfo; de Miera-Juárez, Belén Sáenz; Jiménez-Ruiz, Jorge Alberto; Reynales-Shigematsu, Luz Myriam

    2010-01-01

    To estimate the price elasticity of the demand for cigarettes in Mexico based on data sources and a methodology different from the ones used in previous studies on the topic. Quarterly time series of consumption, income and price for the period 1994 to 2005 were used. A long-run demand model was estimated using Ordinary Least Squares (OLS) and the existence of a cointegration relationship was investigated. Also, a model using Dynamic Ordinary Least Squares (DOLS) was estimated to correct for potential endogeneity of the independent variables and autocorrelation of the residuals. DOLS estimates showed that a 10% increase in cigarette prices could reduce consumption by 2.5% (p<0.05) and increase government revenue by 16.11%. The results confirmed the effectiveness of taxes as an instrument for tobacco control in Mexico. An increase in taxes can be used to increase cigarette prices and therefore to reduce consumption and increase government revenue.
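
    The DOLS step mentioned above augments the long-run levels regression with leads and lags of the differenced regressors, which corrects for endogeneity and serial correlation in the cointegrating relation. The sketch below shows that construction on a hypothetical quarterly file; the variable names, the lead/lag order, and the deterministic terms in the actual paper may differ.

        import numpy as np
        import pandas as pd
        import statsmodels.api as sm

        raw = pd.read_csv("mexico_tobacco_quarterly.csv")     # hypothetical quarterly series
        logs = np.log(raw[["consumption", "price", "income"]])

        k = 2                                                 # leads and lags of the differences
        X = logs[["price", "income"]].copy()
        for var in ["price", "income"]:
            d = logs[var].diff()
            for j in range(-k, k + 1):
                X[f"d_{var}_{j}"] = d.shift(-j)               # j < 0 are lags, j > 0 are leads

        data = pd.concat([logs["consumption"], X], axis=1).dropna()
        fit = sm.OLS(data["consumption"],
                     sm.add_constant(data.drop(columns="consumption"))).fit(
                         cov_type="HAC", cov_kwds={"maxlags": 4})
        print(fit.params[["price", "income"]])                # long-run elasticities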

  13. Predicting the Effects of Sugar-Sweetened Beverage Taxes on Food and Beverage Demand in a Large Demand System

    PubMed Central

    Zhen, Chen; Finkelstein, Eric A.; Nonnemaker, James; Karns, Shawn; Todd, Jessica E.

    2013-01-01

    A censored Exact Affine Stone Index incomplete demand system is estimated for 23 packaged foods and beverages and a numéraire good. Instrumental variables are used to control for endogenous prices. A half-cent per ounce increase in sugar-sweetened beverage prices is predicted to reduce total calories from the 23 foods and beverages but increase sodium and fat intakes as a result of product substitution. The predicted decline in calories is larger for low-income households than for high-income households, although welfare loss is also higher for low-income households. Neglecting price endogeneity or estimating a conditional demand model significantly overestimates the calorie reduction. PMID:24839299

  14. Predicting the Effects of Sugar-Sweetened Beverage Taxes on Food and Beverage Demand in a Large Demand System.

    PubMed

    Zhen, Chen; Finkelstein, Eric A; Nonnemaker, James; Karns, Shawn; Todd, Jessica E

    2014-01-01

    A censored Exact Affine Stone Index incomplete demand system is estimated for 23 packaged foods and beverages and a numéraire good. Instrumental variables are used to control for endogenous prices. A half-cent per ounce increase in sugar-sweetened beverage prices is predicted to reduce total calories from the 23 foods and beverages but increase sodium and fat intakes as a result of product substitution. The predicted decline in calories is larger for low-income households than for high-income households, although welfare loss is also higher for low-income households. Neglecting price endogeneity or estimating a conditional demand model significantly overestimates the calorie reduction.

  15. Limits in the application of harmonic analysis to pulsating stars

    NASA Astrophysics Data System (ADS)

    Pascual-Granado, J.; Garrido, R.; Suárez, J. C.

    2015-09-01

    Using ultra-precise data from space instrumentation, we found that the underlying functions of stellar light curves from some AF pulsating stars are non-analytic, and consequently their Fourier expansion is not guaranteed. This result demonstrates that periodograms do not provide a mathematically consistent estimator of the frequency content for this type of variable stars. More importantly, this constitutes the first counterexample against the current paradigm, which considers that any physical process is described by a continuous (band-limited) function that is infinitely differentiable.

  16. The Dependence of Cloud Property Trend Detection on Absolute Calibration Accuracy of Passive Satellite Sensors

    NASA Astrophysics Data System (ADS)

    Shea, Y.; Wielicki, B. A.; Sun-Mack, S.; Minnis, P.; Zelinka, M. D.

    2016-12-01

    Detecting trends in climate variables on global, decadal scales requires highly accurate, stable measurements and retrieval algorithms. Trend uncertainty depends on the trend magnitude, natural variability, and instrument and retrieval algorithm accuracy and stability. We applied a climate accuracy framework to quantify the impact of absolute calibration on cloud property trend uncertainty. The cloud properties studied were cloud fraction, effective temperature, optical thickness, and effective radius retrieved using the Clouds and the Earth's Radiant Energy System (CERES) Cloud Property Retrieval System, which uses Moderate Resolution Imaging Spectroradiometer (MODIS) measurements. Modeling experiments from the fifth phase of the Coupled Model Intercomparison Project (CMIP5) agree that net cloud feedback is likely positive but disagree regarding its magnitude, mainly due to uncertainty in shortwave cloud feedback. With the climate accuracy framework we determined the time to detect trends for instruments with various calibration accuracies. We estimated a relationship between cloud property trend uncertainty, cloud feedback, and Equilibrium Climate Sensitivity, and also between effective radius trend uncertainty and aerosol indirect effect trends. The direct relationship between instrument accuracy requirements and climate model output provides the level of instrument absolute accuracy needed to reduce climate model projection uncertainty. Different cloud types have varied radiative impacts on the climate system depending on several attributes, such as their thermodynamic phase, altitude, and optical thickness. Therefore, we also conducted these studies by cloud type for a clearer understanding of the instrument accuracy requirements needed to detect changes in their cloud properties. Combining this information with the radiative impact of different cloud types helps to prioritize among requirements for future satellite sensors and to understand the climate detection capabilities of existing sensors.
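
    A common way to formalise the time-to-detect calculation referred to above is the autocorrelated-noise trend formula of Weatherhead et al. (1998), with instrument calibration uncertainty treated as an additional variance term; the framework applied in the paper follows this general logic, though its exact formulation may differ. The function below is a hedged sketch with illustrative numbers, not the paper's computation.

        import numpy as np

        def years_to_detect(trend_per_year, sigma_natural, phi, sigma_instrument=0.0):
            """Approximate years needed to detect a trend at 95% confidence (Weatherhead-style)."""
            sigma = np.sqrt(sigma_natural ** 2 + sigma_instrument ** 2)   # assumed variance inflation
            return (3.3 * (sigma / abs(trend_per_year))
                    * np.sqrt((1 + phi) / (1 - phi))) ** (2.0 / 3.0)

        # Illustrative numbers only: a small per-year trend, interannual noise, lag-1 autocorrelation.
        print(years_to_detect(trend_per_year=0.02, sigma_natural=0.5, phi=0.3))
        print(years_to_detect(trend_per_year=0.02, sigma_natural=0.5, phi=0.3, sigma_instrument=0.3))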

  17. Poverty, Pregnancy, and Birth Outcomes: A Study of the Earned Income Tax Credit

    PubMed Central

    Rehkopf, David H.

    2015-01-01

    Background Economic interventions are increasingly recognized as a mechanism to address perinatal health outcomes among disadvantaged groups. In the United States, the earned income tax credit (EITC) is the largest poverty alleviation program. Little is known about its effects on perinatal health among recipients and their children. We exploit quasi-random variation in the size of EITC payments over time to examine the effects of income on perinatal health. Methods The study sample includes women surveyed in the 1979 National Longitudinal Survey of Youth (N=2,985) and their children born during 1986–2000 (N=4,683). Outcome variables include utilization of prenatal and postnatal care, use of alcohol and tobacco during pregnancy, term birth, birthweight, and breast-feeding status. We examine the health effects of both household income and EITC payment size using multivariable linear regressions. We employ instrumental variables analysis to estimate the causal effect of income on perinatal health, using EITC payment size as an instrument for household income. Results We find that household income and EITC payment size are associated with improvements in several indicators of perinatal health. Instrumental variables analysis, however, does not reveal a causal association between household income and these health measures. Conclusions Our findings suggest that associations between income and perinatal health may be confounded by unobserved characteristics, but that EITC income improves perinatal health. Future studies should continue to explore the impacts of economic interventions on perinatal health outcomes, and investigate how different forms of income transfers may have different impacts. PMID:26212041

  18. Social capital and self-reported general and mental health in nine Former Soviet Union countries.

    PubMed

    Goryakin, Yevgeniy; Suhrcke, Marc; Rocco, Lorenzo; Roberts, Bayard; McKee, Martin

    2014-01-01

    Social capital has been proposed as a potentially important contributor to health, yet most of the existing research tends to ignore the challenge of assessing causality in this relationship. We deal with this issue by employing various instrumental variable estimation techniques. We apply the analysis to a set of nine former Soviet countries, using a unique multi-country household survey specifically designed for this region. Our results confirm that there appears to be a causal association running from several dimensions of individual social capital to general and mental health. Individual trust appears to be more strongly related to general health, while social isolation is more strongly related to mental health. In addition, social support and trust seem to be more important determinants of health than the social capital dimensions that facilitate solidarity and collective action. Our findings are remarkably robust to a range of different specifications, including the use of instrumental variables. Certain interaction effects are also found: for instance, untrusting people who live in communities with a higher aggregate level of trust are even less likely to experience good health than untrusting people living in the reference communities.

  19. Sport participation and subjective well-being: instrumental variable results from German survey data.

    PubMed

    Ruseski, Jane E; Humphreys, Brad R; Hallman, Kirstin; Wicker, Pamela; Breuer, Christoph

    2014-02-01

    A major policy goal of many ministries of sport and health is increased participation in sport to promote health. A growing literature is emerging about the benefits of sport participation on happiness. A challenge in establishing a link between sport participation and happiness is controlling for the endogeneity of sport participation in the happiness equation. This study seeks to establish causal evidence of a relationship between sport participation and self-reported happiness using instrumental variables (IV). IV estimates based on data from a 2009 survey of the population of Rheinberg, Germany, indicate that individuals who participate in sport have higher life happiness. The results suggest a U-shaped relationship between age and self-reported happiness. Higher income is associated with greater self-reported happiness, males are less happy than females, and single individuals are less happy than non-singles. Since the results are IV based, this finding is interpreted as a causal relationship between sport participation and subjective well-being (SWB). This broader impact of sport participation on general happiness lends support to the policy priority of many governments to increase sport participation at all levels of the general population.

  20. Caloric Beverage Intake Among Adult Supplemental Nutrition Assistance Program Participants

    PubMed Central

    2014-01-01

    Objectives. We compared sugar-sweetened beverage (SSB), alcohol, and other caloric beverage (juice and milk) consumption of Supplemental Nutrition Assistance Program (SNAP) participants with that of low-income nonparticipants. Methods. We used 1 day of dietary intake data from the 2005–2008 National Health and Nutrition Examination Survey for 4594 adults aged 20 years and older with household income at or below 250% of the federal poverty line. We used bivariate and multivariate methods to compare the probability of consuming and the amount of calories consumed for each beverage type across 3 groups: current SNAP participants, former participants, and nonparticipants. We used instrumental variable methods to control for unobservable differences in participant groups. Results. After controlling for observable characteristics, SNAP participants were no more likely to consume SSBs than were nonparticipants. Instrumental variable estimates showed that current participants consumed fewer calories from SSBs than did similar nonparticipants. We found no differences in alcoholic beverage consumption, which cannot be purchased with SNAP benefits. Conclusions. SNAP participants are not unique in their consumption of SSBs or alcoholic beverages. Purchase restrictions may have little effect on SSB consumption. PMID:25033141

  1. Observing System Simulations for Small Satellite Formations Estimating Bidirectional Reflectance

    NASA Technical Reports Server (NTRS)

    Nag, Sreeja; Gatebe, Charles K.; de Weck, Olivier

    2015-01-01

    The bidirectional reflectance distribution function (BRDF) gives the reflectance of a target as a function of illumination geometry and viewing geometry, hence carries information about the anisotropy of the surface. BRDF is needed in remote sensing for the correction of view and illumination angle effects (for example in image standardization and mosaicing), for deriving albedo, for land cover classification, for cloud detection, for atmospheric correction, and other applications. However, current spaceborne instruments provide sparse angular sampling of BRDF and airborne instruments are limited in spatial and temporal coverage. To fill the gaps in angular coverage within spatial, spectral and temporal requirements, we propose a new measurement technique: use of small satellites in formation flight, each with a VNIR (visible and near infrared) imaging spectrometer, to make multi-spectral, near-simultaneous measurements of every ground spot in the swath at multiple angles. This paper describes an observing system simulation experiment (OSSE) to evaluate the proposed concept and select the optimal formation architecture that minimizes BRDF uncertainties. The variables of the OSSE are identified: number of satellites, measurement spread in view zenith and relative azimuth with respect to the solar plane, solar zenith angle, BRDF models, and wavelength of reflection. Analyzing the sensitivity of BRDF estimation errors to these variables allows simplification of the OSSE, to enable its use to rapidly evaluate formation architectures. A 6-satellite formation is shown to produce lower BRDF estimation errors, purely in terms of angular sampling as evaluated by the OSSE, than a single spacecraft with 9 forward-aft sensors. We demonstrate the ability to use OSSEs to design small satellite formations as complements to flagship mission data. The formations can fill angular sampling gaps and enable better BRDF products than currently possible.

  2. Design, innovation, and rural creative places: Are the arts the cherry on top, or the secret sauce?

    PubMed

    Wojan, Timothy R; Nichols, Bonnie

    2018-01-01

    Creative class theory explains the positive relationship between the arts and commercial innovation as the mutual attraction of artists and other creative workers by an unobserved creative milieu. This study explores alternative theories for rural settings by analyzing establishment-level survey data combined with data on the local arts scene. The study identifies the local contextual factors associated with a strong design orientation and estimates the impact that a strong design orientation has on the local economy. Data on innovation and design come from a nationally representative sample of establishments in tradable industries. Latent class analysis is used to identify unobserved subpopulations of establishments with different design and innovation orientations. Logistic regression is used to estimate the association between an establishment's design orientation and local contextual factors, and a quantile instrumental variable regression is used to assess the robustness of the logistic regression results with respect to endogeneity. An estimate of design orientation at the local level derived from the survey is used to examine variation in economic performance during the period of recovery from the Great Recession (2010-2014). Three distinct innovation orientations (substantive, nominal, and non-innovators) and three design orientations (design-integrated, "design last finish," and no systematic approach to design) are identified. Innovation- and design-intensive establishments were identified in both rural and urban areas. Rural design-integrated establishments tended to locate in counties with more highly educated workforces and containing at least one performing arts organization. The quantile instrumental variable regression confirmed that the logistic regression result is robust to endogeneity concerns. Finally, rural areas characterized by design-integrated establishments experienced faster growth in wages relative to rural areas characterized by establishments using no systematic approach to design.

  3. Design, innovation, and rural creative places: Are the arts the cherry on top, or the secret sauce?

    PubMed Central

    Nichols, Bonnie

    2018-01-01

    Objective Creative class theory explains the positive relationship between the arts and commercial innovation as the mutual attraction of artists and other creative workers by an unobserved creative milieu. This study explores alternative theories for rural settings by analyzing establishment-level survey data combined with data on the local arts scene. The study identifies the local contextual factors associated with a strong design orientation and estimates the impact that a strong design orientation has on the local economy. Method Data on innovation and design come from a nationally representative sample of establishments in tradable industries. Latent class analysis is used to identify unobserved subpopulations of establishments with different design and innovation orientations. Logistic regression is used to estimate the association between an establishment’s design orientation and local contextual factors, and a quantile instrumental variable regression is used to assess the robustness of the logistic regression results with respect to endogeneity. An estimate of design orientation at the local level derived from the survey is used to examine variation in economic performance during the period of recovery from the Great Recession (2010–2014). Results Three distinct innovation orientations (substantive, nominal, and non-innovators) and three design orientations (design-integrated, “design last finish,” and no systematic approach to design) are identified. Innovation- and design-intensive establishments were identified in both rural and urban areas. Rural design-integrated establishments tended to locate in counties with more highly educated workforces and containing at least one performing arts organization. The quantile instrumental variable regression confirmed that the logistic regression result is robust to endogeneity concerns. Finally, rural areas characterized by design-integrated establishments experienced faster growth in wages relative to rural areas characterized by establishments using no systematic approach to design. PMID:29489884

  4. Is more better than less? An analysis of children's mental health services.

    PubMed Central

    Foster, E M

    2000-01-01

    OBJECTIVE: To assess the dose-response relationship for outpatient therapy received by children and adolescents-that is, to determine the impact of added outpatient visits on key mental health outcomes (functioning and symptomatology). DATA SOURCES/STUDY SETTING: The results presented involve analyses of data from the Fort Bragg Demonstration and are based on a sample of 301 individuals using outpatient services. STUDY DESIGN: This article provides estimates of the impact of outpatient therapy based on comparisons of individuals receiving differing treatment doses. Those comparisons involve standard multiple regression analyses as well as instrumental variables estimation. The latter provides a means of adjusting comparisons for unobserved or unmeasured differences among individuals receiving differing doses, differences that would otherwise be confounded with the impact of treatment dose. DATA COLLECTION/EXTRACTION METHODS: Using structured diagnostic interviews and behavior checklists completed by the child and his or her caretaker, detailed data on psychopathology, symptomatology, and psychosocial functioning were collected on individuals included in these analyses. Information on the use of mental health services was taken from insurance claims and a management information system. Services data were used to describe the use of outpatient therapy within the year following entry into the study. PRINCIPAL FINDINGS/CONCLUSIONS: Instrumental variables estimation indicates that added outpatient therapy improves functioning among children and adolescents. The effect is statistically significant and of moderate practical magnitude. These results imply that conventional analyses of the dose-response relationship may understate the impact of additional treatment on functioning. This finding is robust to choice of functional form, length of time over which outcomes are measured, and model specification. Dose does not appear to influence symptomatology. PMID:11130814

  5. Observing system simulations for small satellite formations estimating bidirectional reflectance

    NASA Astrophysics Data System (ADS)

    Nag, Sreeja; Gatebe, Charles K.; Weck, Olivier de

    2015-12-01

    The bidirectional reflectance distribution function (BRDF) gives the reflectance of a target as a function of illumination geometry and viewing geometry, and hence carries information about the anisotropy of the surface. BRDF is needed in remote sensing for the correction of view and illumination angle effects (for example in image standardization and mosaicing), for deriving albedo, for land cover classification, for cloud detection, for atmospheric correction, and for other applications. However, current spaceborne instruments provide sparse angular sampling of BRDF, and airborne instruments are limited in spatial and temporal coverage. To fill the gaps in angular coverage within spatial, spectral and temporal requirements, we propose a new measurement technique: use of small satellites in formation flight, each satellite with a VNIR (visible and near infrared) imaging spectrometer, to make multi-spectral, near-simultaneous measurements of every ground spot in the swath at multiple angles. This paper describes an observing system simulation experiment (OSSE) to evaluate the proposed concept and select the optimal formation architecture that minimizes BRDF uncertainties. The variables of the OSSE are identified: number of satellites, measurement spread in view zenith and relative azimuth with respect to the solar plane, solar zenith angle, BRDF models, and wavelength of reflection. Analyzing the sensitivity of BRDF estimation errors to these variables allows simplification of the OSSE and enables its use to rapidly evaluate formation architectures. A 6-satellite formation is shown to produce lower BRDF estimation errors, purely in terms of angular sampling as evaluated by the OSSE, than a single spacecraft with 9 forward-aft sensors. We demonstrate the ability to use OSSEs to design small satellite formations as complements to flagship mission data. The formations can fill angular sampling gaps and enable better BRDF products than currently possible.

  6. Mapping CHU9D Utility Scores from the PedsQL™ 4.0 SF-15.

    PubMed

    Mpundu-Kaambwa, Christine; Chen, Gang; Russo, Remo; Stevens, Katherine; Petersen, Karin Dam; Ratcliffe, Julie

    2017-04-01

    The Pediatric Quality of Life Inventory™ 4.0 Short Form 15 Generic Core Scales (hereafter the PedsQL) and the Child Health Utility-9 Dimensions (CHU9D) are two generic instruments designed to measure health-related quality of life in children and adolescents in the general population and in paediatric patient groups living with specific health conditions. Although the PedsQL is widely used among paediatric patient populations, presently it is not possible to directly use the scores from the instrument to calculate quality-adjusted life-years (QALYs) for application in economic evaluation because it produces summary scores which are not preference-based. This paper examines different econometric mapping techniques for estimating CHU9D utility scores from the PedsQL for the purpose of calculating QALYs for cost-utility analysis. The PedsQL and the CHU9D were completed by a community sample of 755 Australian adolescents aged 15-17 years. Seven regression models were estimated: ordinary least squares estimator, generalised linear model, robust MM estimator, multivariate factorial polynomial estimator, beta-binomial estimator, finite mixture model and multinomial logistic model. The mean absolute error (MAE) and the mean squared error (MSE) were used to assess the predictive ability of the models. The MM estimator with stepwise-selected PedsQL dimension scores as explanatory variables had the best predictive accuracy using MAE, and the equivalent beta-binomial model had the best predictive accuracy using MSE. Our mapping algorithm facilitates the estimation of health-state utilities for use within economic evaluations where only PedsQL data are available and is suitable for use in community-based adolescents aged 15-17 years. The applicability of the algorithm in younger populations should be assessed in further research.
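
    A minimal version of the mapping exercise is sketched below: regress utility scores on instrument dimension scores and compare candidate specifications by MAE and MSE on held-out data. The linear and quadratic models are stand-ins for the seven estimators compared in the paper, and the data are simulated rather than the Australian adolescent sample.

```python
# Sketch of a mapping algorithm: predict utilities from subscale scores and
# score candidate models by MAE and MSE on a held-out split (simulated data).
import numpy as np

rng = np.random.default_rng(2)
n = 755
dims = rng.uniform(0, 100, size=(n, 4))                  # four PedsQL-like dimension scores
utility = np.clip(0.3 + 0.006 * dims.mean(axis=1)
                  + rng.normal(scale=0.05, size=n), 0, 1)  # utilities live on [0, 1]

train, test = np.arange(0, 600), np.arange(600, n)

def fit_predict(design_fn):
    """Fit OLS on the training split and predict on the test split."""
    X_tr, X_te = design_fn(dims[train]), design_fn(dims[test])
    beta, *_ = np.linalg.lstsq(X_tr, utility[train], rcond=None)
    return X_te @ beta

linear = lambda d: np.column_stack([np.ones(len(d)), d])
quadratic = lambda d: np.column_stack([np.ones(len(d)), d, d ** 2])

for name, model in [("linear", linear), ("quadratic", quadratic)]:
    pred = np.clip(fit_predict(model), 0, 1)
    mae = np.mean(np.abs(pred - utility[test]))
    mse = np.mean((pred - utility[test]) ** 2)
    print(f"{name:9s}  MAE={mae:.4f}  MSE={mse:.5f}")
```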

  7. A GIS approach to conducting biogeochemical research in wetlands

    NASA Technical Reports Server (NTRS)

    Brannon, David P.; Irish, Gary J.

    1985-01-01

    A project was initiated to develop an environmental database to address spatial aspects of both biogeochemical cycling and resource management in wetlands. Specific goals are to make regional methane flux estimates and site-specific water level predictions based on man-controlled water releases within a wetland study area. The project will contribute to the understanding of the Earth's biosphere through its examination of the spatial variability of methane emissions. Although wetlands are thought to be one of the primary sources for release of methane to the atmosphere, little is known about the spatial variability of methane flux. Only through a spatial analysis of methane flux rates and the environmental factors which influence such rates can reliable regional and global methane emission estimates be calculated. Data will be correlated and studied from Landsat 4 instruments; from a ground survey of water level recorders, precipitation recorders, evaporation pans, and supplemental gauges; and from floodgate water releases; and regional methane flux estimates will be made.

  8. Visual photometry: accuracy and precision

    NASA Astrophysics Data System (ADS)

    Whiting, Alan

    2018-01-01

    Visual photometry, estimation by eye of the brightness of stars, remains an important source of data even in the age of widespread precision instruments. However, the eye-brain system differs from electronic detectors and its results may be expected to differ in several respects. I examine a selection of well-observed variables from the AAVSO database to determine several internal characteristics of this data set. Visual estimates scatter around the fitted curves with a standard deviation of 0.14 to 0.34 magnitudes, most clustered in the 0.21-0.25 range. The variation of the scatter does not seem to correlate with color, type of variable, or depth or speed of variation of the star’s brightness. The scatter of an individual observer’s observations changes from star to star, in step with the overall scatter. The shape of the deviations from the fitted curve is non-Gaussian, with positive excess kurtosis (more outlying observations). These results have implications for use of visual data, as well as other citizen science efforts.
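
    The scatter and excess-kurtosis statistics quoted above can be reproduced on any residual series as sketched below; the sinusoidal light curve and the two-component noise mixture are simulated stand-ins for AAVSO visual estimates.

```python
# Residual diagnostics for visual estimates around a fitted light curve:
# standard deviation (scatter) and excess kurtosis (heavy tails).
import numpy as np
from scipy.stats import kurtosis

rng = np.random.default_rng(3)
t = np.sort(rng.uniform(0, 1000, 2000))                  # observation times (days)
light_curve = 8.0 + 1.5 * np.sin(2 * np.pi * t / 300.0)  # "true" magnitude variation
# A mix of typical observers and occasional outliers yields positive excess kurtosis
noise = np.where(rng.uniform(size=t.size) < 0.9,
                 rng.normal(scale=0.20, size=t.size),
                 rng.normal(scale=0.60, size=t.size))
visual = light_curve + noise

residuals = visual - light_curve
print(f"scatter (std, mag): {residuals.std(ddof=1):.3f}")
print(f"excess kurtosis:    {kurtosis(residuals, fisher=True, bias=False):.2f}")
```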

  9. Do gender gaps in education and health affect economic growth? A cross-country study from 1975 to 2010.

    PubMed

    Mandal, Bidisha; Batina, Raymond G; Chen, Wen

    2018-05-01

    We use the system-generalized method-of-moments to estimate the effect of gender-specific human capital on economic growth in a cross-country panel of 127 countries between 1975 and 2010. There are several benefits to using this methodology. First, a dynamic econometric model with a lagged dependent variable is suitable to address persistence in per capita output. Second, the generalized method-of-moments estimator uses dynamic properties of the data to generate appropriate instrumental variables to address joint endogeneity of the explanatory variables. Third, we allow the measurement error to include an unobserved country-specific effect and random noise. We include two gender-disaggregated measures of human capital: education and health. We find that the gender gap in health plays a critical role in explaining economic growth in developing countries. Our results provide aggregate evidence that returns to investments in health systematically differ across gender and between low-income and high-income countries. Copyright © 2018 John Wiley & Sons, Ltd.

  10. (In)Consistent estimates of changes in relative precipitation in a European domain over the last 350 years

    NASA Astrophysics Data System (ADS)

    Bothe, Oliver; Wagner, Sebastian; Zorita, Eduardo

    2015-04-01

    How did regional precipitation change in past centuries? We have potentially three sources of information to answer this question: there are, especially for Europe, a number of long records of local station precipitation; documentary records and natural archives of past environmental variability serve as proxy records for empirical reconstructions; and simulations with coupled climate models or Earth System Models provide estimates of the spatial structure of precipitation variability. However, instrumental records rarely extend back to the 18th century, reconstructions include large uncertainties, and simulation skill is often still unsatisfactory for precipitation. Thus, we can only seek to answer to what extent the three sources provide a consistent picture of past regional precipitation changes. This presentation describes the (lack of) consistency in changes of the distributional properties of seasonal precipitation between the different data sources. We concentrate on England and Wales, since two recent reconstructions and a long observation-based record are available for this domain. The season of interest is an extended spring (March, April, May, June, July; MAMJJ) over the past 350 years. The main simulated data stem from a regional simulation for the European domain with CCLM, driven at its lateral boundaries with conditions provided by an MPI-ESM COSMOS simulation of the last millennium using a high-amplitude solar forcing. A number of simulations of the past 1000 years from the Paleoclimate Modelling Intercomparison Project Phase III (PMIP3) provide additional information. We fit a Weibull distribution to the available data sets, following the approach for calculating standardized precipitation indices, over 51-year moving windows to assess the consistency of changes in the distributional properties. Changes in the percentiles for severe (and extreme) dry or wet conditions and in the Weibull standard deviations of precipitation estimates are generally not consistent among the different data sets; only a few common signals are evident. Even the relatively strong exogenous forcing history of the late 18th and early 19th century appears to have only small effects on the precipitation distributions. The reconstructions differ systematically from the long instrumental record in displaying much stronger variability over their common period, and the distributional properties of the two data sets show, to some extent, opposite evolutions. The reconstructions do not reliably represent the distributions in specific periods but rather reflect low-frequency changes in the mean plus a certain amount of noise. Moreover, the multi-model simulations also do not agree on the changes over this period. The lack of consistent simulated relations under purely naturally forced and internal variability on multi-decadal time-scales therefore questions our ability to draw dynamical inferences about regional climate variability from the PMIP3 ensemble and, in turn, from climate simulations in general. The potentially opposite evolution of reconstructions and instrumental data for the chosen domain further hampers reconciling the available information about past regional precipitation variability in England and Wales. However, we find some possibly surprising but encouraging agreement between the observed data and the regional simulation.
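
    The distribution-fitting step described above can be sketched as follows: fit a Weibull distribution to seasonal totals in 51-year moving windows and track a dry-percentile threshold through time. The synthetic precipitation series and the printed windows are illustrative; they are not the reconstructions, observations or simulations compared in the study.

```python
# Moving-window Weibull fits to a synthetic seasonal precipitation series,
# reporting the 10th-percentile ("severely dry") threshold per window.
import numpy as np
from scipy.stats import weibull_min

rng = np.random.default_rng(4)
years = np.arange(1659, 2010)
precip = rng.gamma(shape=4.0, scale=50.0, size=years.size)   # MAMJJ totals (mm), synthetic

window = 51
for start in range(0, years.size - window + 1, 50):          # print every 50th window
    chunk = precip[start:start + window]
    shape, loc, scale = weibull_min.fit(chunk, floc=0)        # fix the location at zero
    p10 = weibull_min.ppf(0.10, shape, loc=loc, scale=scale)
    print(f"{years[start]}-{years[start + window - 1]}: "
          f"shape={shape:.2f} scale={scale:.0f} 10th pct={p10:.0f} mm")
```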

  11. Assessing the role of insulin-like growth factors and binding proteins in prostate cancer using Mendelian randomization: Genetic variants as instruments for circulating levels.

    PubMed

    Bonilla, Carolina; Lewis, Sarah J; Rowlands, Mari-Anne; Gaunt, Tom R; Davey Smith, George; Gunnell, David; Palmer, Tom; Donovan, Jenny L; Hamdy, Freddie C; Neal, David E; Eeles, Rosalind; Easton, Doug; Kote-Jarai, Zsofia; Al Olama, Ali Amin; Benlloch, Sara; Muir, Kenneth; Giles, Graham G; Wiklund, Fredrik; Grönberg, Henrik; Haiman, Christopher A; Schleutker, Johanna; Nordestgaard, Børge G; Travis, Ruth C; Pashayan, Nora; Khaw, Kay-Tee; Stanford, Janet L; Blot, William J; Thibodeau, Stephen; Maier, Christiane; Kibel, Adam S; Cybulski, Cezary; Cannon-Albright, Lisa; Brenner, Hermann; Park, Jong; Kaneva, Radka; Batra, Jyotsna; Teixeira, Manuel R; Pandha, Hardev; Lathrop, Mark; Martin, Richard M; Holly, Jeff M P

    2016-10-01

    Circulating insulin-like growth factors (IGFs) and their binding proteins (IGFBPs) are associated with prostate cancer. Using genetic variants as instruments for IGF peptides, we investigated whether these associations are likely to be causal. We identified from the literature 56 single nucleotide polymorphisms (SNPs) in the IGF axis previously associated with biomarker levels (8 from a genome-wide association study [GWAS] and 48 in reported candidate genes). In ∼700 men without prostate cancer and two replication cohorts (N ∼ 900 and ∼9,000), we examined the properties of these SNPS as instrumental variables (IVs) for IGF-I, IGF-II, IGFBP-2 and IGFBP-3. Those confirmed as strong IVs were tested for association with prostate cancer risk, low (< 7) vs. high (≥ 7) Gleason grade, localised vs. advanced stage, and mortality, in 22,936 controls and 22,992 cases. IV analysis was used in an attempt to estimate the causal effect of circulating IGF peptides on prostate cancer. Published SNPs in the IGFBP1/IGFBP3 gene region, particularly rs11977526, were strong instruments for IGF-II and IGFBP-3, less so for IGF-I. Rs11977526 was associated with high (vs. low) Gleason grade (OR per IGF-II/IGFBP-3 level-raising allele 1.05; 95% CI: 1.00, 1.10). Using rs11977526 as an IV we estimated the causal effect of a one SD increase in IGF-II (∼265 ng/mL) on risk of high vs. low grade disease as 1.14 (95% CI: 1.00, 1.31). Because of the potential for pleiotropy of the genetic instruments, these findings can only causally implicate the IGF pathway in general, not any one specific biomarker. © 2016 UICC.
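
    The core instrumental variable calculation in Mendelian randomization can be illustrated with the Wald ratio, which divides the SNP-outcome association by the SNP-exposure association. The per-allele effect sizes below are invented for illustration and are not the estimates reported for rs11977526.

```python
# Wald ratio sketch for a single genetic instrument (made-up effect sizes).
import numpy as np

# Hypothetical per-allele associations (beta, standard error)
beta_snp_exposure, se_exposure = 0.25, 0.02   # SNP -> circulating biomarker (SD units)
beta_snp_outcome, se_outcome = 0.035, 0.015   # SNP -> log-odds of high-grade disease

wald_ratio = beta_snp_outcome / beta_snp_exposure
# First-order standard error, ignoring noise in the exposure association
se_wald = se_outcome / abs(beta_snp_exposure)

or_per_sd = np.exp(wald_ratio)
ci = np.exp([wald_ratio - 1.96 * se_wald, wald_ratio + 1.96 * se_wald])
print(f"OR per SD increase in exposure: {or_per_sd:.2f} (95% CI {ci[0]:.2f}-{ci[1]:.2f})")
```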

  12. Methodologies to determine forces on bones and muscles of body segments during exercise, employing compact sensors suitable for use in crowded space vehicles

    NASA Technical Reports Server (NTRS)

    Figueroa, Fernando

    1995-01-01

    Work under this grant was carried out by the author and by a graduate research assistant. An instrumented bicycle ergometer was implemented, focusing on the stated objective: to estimate the forces exerted by each muscle of the feet, calf, and thigh of an individual while bicycling. The sensors used were light and compact: probes to measure muscle EMG activity, miniature accelerometers, miniature load sensors, and small encoders to measure angular positions of the pedal. A methodology was developed and implemented to completely describe the kinematics of the limbs using data from the sensors. This work has been published as a Master's thesis by the graduate student supported by the grant. The instrumented ergometer, along with the sensors and instrumentation, was tested during a KC-135 zero-gravity flight in July 1994. A complete description of the system and the tests performed has been published as a report submitted to NASA Johnson Space Center. The data collected during the KC-135 flight are currently being processed so that a kinematic description of the bicycling experiment can soon be determined. A methodology to estimate the muscle forces has been formulated based on previous work. The methodology uses optimization concepts so that the individual muscle forces appearing as variables in the dynamic equations of motion may be estimated: a criterion (goal) function, such as minimization of energy, is optimized subject to constraint equations defined by the rigid-body equations of motion. Use of optimization principles is necessary because the equations of motion alone constitute an indeterminate system with respect to the large number of muscle forces that are the unknowns in these equations. The number of unknowns is reduced somewhat by using the forces measured by the load cells installed on the pedal, which record pressure and shear forces on the foot. The author and his collaborators at NASA and at the University of Alabama, Tuscaloosa, are continuing the work of reducing the experimental data from the KC-135 flight and implementing the optimization methods to estimate muscle forces. As soon as results from these efforts are available, they will be published in reputable journals. Results of this work will impact studies addressing bone density loss and the development of countermeasures to minimize bone loss in zero-gravity conditions. By analyzing muscle forces on Earth and in space during exercise, scientists could eventually formulate new exercises and machines to help maintain bone density. On Earth, this work will impact studies concerning arthritis and will provide the means to study possible exercise countermeasures to minimize arthritis problems.
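
    The indeterminacy described above (more muscle forces than equations of motion) and its optimization-based resolution can be sketched as a small constrained minimization: choose non-negative muscle forces that satisfy the joint torque balance while minimizing a physiological cost. The moment arms, required torque, and cost function below are invented, not values from the ergometer experiments.

```python
# Muscle-force distribution sketch: minimize a cost subject to torque balance.
import numpy as np
from scipy.optimize import minimize

moment_arms = np.array([0.05, 0.03, 0.04])     # m; three muscles crossing one joint
required_torque = 40.0                         # N*m, from the rigid-body equations of motion

objective = lambda f: np.sum(f ** 2)           # surrogate physiological cost
constraints = [{"type": "eq", "fun": lambda f: moment_arms @ f - required_torque}]
bounds = [(0.0, None)] * moment_arms.size      # muscles can only pull (non-negative tension)

result = minimize(objective, x0=np.full(moment_arms.size, 100.0), method="SLSQP",
                  bounds=bounds, constraints=constraints)
print("estimated muscle forces (N):", np.round(result.x, 1))
print("torque balance check (N*m):", round(float(moment_arms @ result.x), 2))
```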

  13. An 800-Year Record of Sediment-Derived, Instrumentally-Calibrated Foraminiferal Mg/Ca SST Estimates From the Tropical North Atlantic

    NASA Astrophysics Data System (ADS)

    Black, D. E.; Abahazi, M. A.; Thunell, R. C.; Tappa, E. J.

    2005-12-01

    Most geochemical paleoclimate proxies are calibrated to different climate variables using laboratory culture, surface sediment, or sediment trap experiments. The varved, high-deposition-rate sediments of the Cariaco Basin (Venezuela) provide the nearly unique opportunity to compare and calibrate paleoceanographic proxy data directly against true oceanic historical instrumental climate records. Here we present one of the first sediment-derived foraminiferal Mg/Ca-to-SST calibrations, spanning A. D. 1870-1990. The record of Mg/Ca-estimated tropical North Atlantic SSTs is then extended back to approximately A. D. 1200. Box core PL07-73 BC, recovered from the northeastern slope of Cariaco Basin, was sampled at consecutive 1 mm increments and processed for foraminiferal population, stable isotope, and Mg/Ca (by ICP-AES) analyses. The age model for this core was established by correlating faunal population records from PL07-73 to a nearby very well-dated Cariaco Basin box core, PL07-71 BC. The resulting age model yields consecutive sample intervals of one to two years. Mg/Ca ratios measured on Globigerina bulloides in samples deposited between A. D. 1870 and 1990 were calibrated to monthly SSTs from the Met Office Hadley Centre's SST data set for the Cariaco Basin grid square. Annual correlations between G. bulloides Mg/Ca and instrumental SST were highest (r=0.6, p<.0001, n=120) for the months of March, April, and May, the time when sediment trap studies indicate G. bulloides is most abundant in the basin. The full-length Mg/Ca-estimated SST record is characterized by decadal- and centennial-scale variability. The tropical western North Atlantic does not appear to have experienced a pronounced Medieval Warm Period relative to the complete record. However, strong Little Ice Age cooling of as much as 3°C occurred between A. D. 1525 and 1625. Spring SSTs gradually rose between A. D. 1650 and 1900, followed by a 2.5°C warming over the 20th century.
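
    The calibration-and-inversion logic is sketched below under a standard exponential Mg/Ca-temperature form: regress ln(Mg/Ca) on instrumental SST over the overlap period, then invert the fitted relation downcore. The numbers and fitted constants are synthetic; the study's calibration is specific to G. bulloides in the Cariaco Basin.

```python
# Instrumental calibration of a geochemical proxy and downcore inversion.
import numpy as np

rng = np.random.default_rng(5)

# Overlap period: instrumental spring SST (deg C) and measured Mg/Ca (mmol/mol)
sst_instrumental = rng.uniform(24.0, 28.0, 120)
mg_ca = 0.5 * np.exp(0.09 * sst_instrumental) * np.exp(rng.normal(scale=0.05, size=120))

# Fit ln(Mg/Ca) = ln(B) + A * SST by ordinary least squares
A, lnB = np.polyfit(sst_instrumental, np.log(mg_ca), 1)
print(f"calibration: Mg/Ca = {np.exp(lnB):.2f} * exp({A:.3f} * SST)")

# Invert the calibration for downcore (pre-instrumental) samples
mg_ca_downcore = np.array([4.2, 4.6, 5.1])
sst_estimates = (np.log(mg_ca_downcore) - lnB) / A
print("downcore SST estimates (deg C):", np.round(sst_estimates, 1))
```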

  14. Effect of nursing home ownership on hospitalization of long-stay residents: an instrumental variables approach

    PubMed Central

    Grabowski, David C.; Feng, Zhanlian; Rahman, Momotazur; Mor, Vincent

    2014-01-01

    Hospitalizations among nursing home residents are frequent, expensive, and often associated with further deterioration of resident condition. The literature indicates that a substantial fraction of admissions is potentially preventable and that nonprofit nursing homes are less likely to hospitalize their residents. However, the correlation between ownership and hospitalization might reflect unobserved resident differences rather than a causal relationship. Using national minimum data set assessments linked with Medicare claims, we use a national cohort of long-stay residents who were newly admitted to nursing homes within an 18-month period spanning January 1, 2004 and June 30, 2005. After instrumenting for ownership status, we found that IV estimates of the effect of nonprofit ownership on hospitalization are at least as large as the non-instrumented effects, indicating that selection bias does not explain the observed relationship. We also found evidence suggesting the lower rate of hospitalizations among nonprofits was due to a different threshold for transfer. PMID:24234287

  15. A Fast and Sensitive New Satellite SO2 Retrieval Algorithm based on Principal Component Analysis: Application to the Ozone Monitoring Instrument

    NASA Technical Reports Server (NTRS)

    Li, Can; Joiner, Joanna; Krotkov, A.; Bhartia, Pawan K.

    2013-01-01

    We describe a new algorithm to retrieve SO2 from satellite-measured hyperspectral radiances. We employ the principal component analysis technique in regions with no significant SO2 to capture radiance variability caused by both physical processes (e.g., Rayleigh and Raman scattering and ozone absorption) and measurement artifacts. We use the resulting principal components and SO2 Jacobians calculated with a radiative transfer model to directly estimate SO2 vertical column density in one step. Application to the Ozone Monitoring Instrument (OMI) radiance spectra in 310.5-340 nm demonstrates that this approach can greatly reduce biases in the operational OMI product and decrease the noise by a factor of 2, providing greater sensitivity to anthropogenic emissions. The new algorithm is fast, eliminates the need for instrument-specific radiance correction schemes, and can be easily adapted to other sensors. These attributes make it a promising technique for producing longterm, consistent SO2 records for air quality and climate research.
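
    The one-step fit at the heart of the algorithm can be sketched as follows: principal components derived from SO2-free spectra represent background variability, and a measured spectrum is fit as a linear combination of those components plus an SO2 Jacobian, whose coefficient is the column estimate. The spectra, Jacobian shape, and noise below are synthetic, not OMI radiances or the operational Jacobians.

```python
# PCA-plus-Jacobian retrieval sketch on synthetic hyperspectral radiances.
import numpy as np

rng = np.random.default_rng(6)
n_wave, n_ref = 200, 500
wavelengths = np.linspace(310.5, 340.0, n_wave)

# Background (SO2-free) spectra: two smooth structures plus measurement noise
background = (np.outer(rng.normal(size=n_ref), np.sin(wavelengths / 3.0))
              + np.outer(rng.normal(size=n_ref), wavelengths / 340.0)
              + rng.normal(scale=0.01, size=(n_ref, n_wave)))

# Leading principal components of the background ensemble
mean_spec = background.mean(axis=0)
_, _, vt = np.linalg.svd(background - mean_spec, full_matrices=False)
pcs = vt[:5]                                              # first 5 PCs

# Synthetic SO2 Jacobian (localized differential absorption) and a measurement
jacobian = np.exp(-((wavelengths - 313.0) / 4.0) ** 2) * np.cos(wavelengths)
true_column = 2.5                                          # arbitrary column units
measured = (mean_spec + 0.8 * pcs[0] - 0.3 * pcs[1]
            + true_column * jacobian + rng.normal(scale=0.01, size=n_wave))

# One-step fit: PCs and Jacobian together as basis functions
basis = np.vstack([pcs, jacobian]).T
coeffs, *_ = np.linalg.lstsq(basis, measured - mean_spec, rcond=None)
print(f"retrieved SO2 column: {coeffs[-1]:.2f} (true value used: {true_column})")
```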

  16. PPI/HASI Pressure Measurements in the Atmosphere of Titan

    NASA Astrophysics Data System (ADS)

    Mäkinen, J. T. T.; Harri, A.-M.; Siili, T.; Lehto, A.; Kahanpää, H.; Genzer, M.; Leppelmeier, G. W.; Leinonen, J.

    2005-08-01

    The Huygens probe descended through the atmosphere of Titan on January 14, 2005, providing an excellent set of observations. As part of the Huygens Atmospheric Structure Instrument (HASI), which measured several variables including acceleration, pressure, temperature and atmospheric electricity, the Pressure Profile Instrument (PPI) provided by FMI commenced operations after the deployment of the main parachute and the jettisoning of the heat shield at an altitude of about 160 km. Based on aerodynamic considerations, PPI measured the total pressure with a Kiel probe at the end of a boom, connected to the sensor electronics inside the probe through an inlet tube. The instrument performed flawlessly during the 2.5-hour descent and the 0.5-hour surface phase before the termination of the radio link between Huygens and the Cassini orbiter. We present an analysis of the pressure data, including a recreation of the pressure, temperature, altitude, velocity and acceleration profiles, as well as an estimate of the level of atmospheric activity on the surface of Titan.

  17. Effect of nursing home ownership on hospitalization of long-stay residents: an instrumental variables approach.

    PubMed

    Hirth, Richard A; Grabowski, David C; Feng, Zhanlian; Rahman, Momotazur; Mor, Vincent

    2014-03-01

    Hospitalizations among nursing home residents are frequent, expensive, and often associated with further deterioration of resident condition. The literature indicates that a substantial fraction of admissions is potentially preventable and that nonprofit nursing homes are less likely to hospitalize their residents. However, the correlation between ownership and hospitalization might reflect unobserved resident differences rather than a causal relationship. Using national minimum data set assessments linked with Medicare claims, we use a national cohort of long-stay residents who were newly admitted to nursing homes within an 18-month period spanning January 1, 2004 and June 30, 2005. After instrumenting for ownership status, we found that IV estimates of the effect of nonprofit ownership on hospitalization are at least as large as the non-instrumented effects, indicating that selection bias does not explain the observed relationship. We also found evidence suggesting the lower rate of hospitalizations among nonprofits was due to a different threshold for transfer.

  18. Evaluation of a simple, point-scale hydrologic model in simulating soil moisture using the Delaware environmental observing system

    NASA Astrophysics Data System (ADS)

    Legates, David R.; Junghenn, Katherine T.

    2018-04-01

    Many local weather station networks that measure a number of meteorological variables (i.e., mesonetworks) have recently been established, with soil moisture occasionally being part of the suite of measured variables. These mesonetworks provide data from which detailed estimates of various hydrological parameters, such as precipitation and reference evapotranspiration, can be made and which, when coupled with simple surface characteristics available from soil surveys, can be used to obtain estimates of soil moisture. The question is: can meteorological data be used with a simple hydrologic model to accurately estimate daily soil moisture at a mesonetwork site? Using a state-of-the-art mesonetwork across the US State of Delaware that also includes soil moisture measurements, we assess the efficacy of a simple, modified Thornthwaite/Mather-based daily water balance model, driven by these mesonetwork observations, for estimating site-specific soil moisture. Results suggest that the model works reasonably well for most well-drained sites and provides good qualitative estimates of measured soil moisture, often near the accuracy of the soil moisture instrumentation. The model has particular trouble in that it cannot properly simulate the slow drainage that occurs in poorly drained soils after heavy rains, and interception loss resulting from grass not being kept short-cropped as expected also adversely affects the simulation. However, the model could be tuned to accommodate some non-standard siting characteristics.
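
    A minimal Thornthwaite/Mather-style bucket model of the kind evaluated above is sketched below: soil moisture is recharged by precipitation, drawn down by evapotranspiration with an exponential decay as the soil dries, and capped at a water-holding capacity. The capacity and forcing series are illustrative, not Delaware mesonetwork values.

```python
# Daily soil-moisture bucket sketch driven by synthetic precipitation and PET.
import numpy as np

rng = np.random.default_rng(7)
days = 120
precip = np.where(rng.uniform(size=days) < 0.3, rng.gamma(2.0, 5.0, days), 0.0)  # mm/day
pet = 3.0 + 1.5 * np.sin(np.arange(days) * 2 * np.pi / 365.0)                    # mm/day

capacity = 100.0            # mm, plant-available water-holding capacity
soil = capacity             # start at field capacity
storage = []
for p, e in zip(precip, pet):
    if p >= e:
        soil = min(soil + (p - e), capacity)      # recharge; any surplus drains away
    else:
        soil = soil * np.exp(-(e - p) / capacity)  # drawdown slows as the soil dries
    storage.append(soil)

print(f"final soil moisture (mm): {storage[-1]:.1f}")
print(f"driest day value (mm):    {min(storage):.1f}")
```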

  19. The causal effect of education on HIV stigma in Uganda: Evidence from a natural experiment.

    PubMed

    Tsai, Alexander C; Venkataramani, Atheendar S

    2015-10-01

    HIV is highly stigmatized in sub-Saharan Africa. This is an important public health problem because HIV stigma has many adverse effects that threaten to undermine efforts to control the HIV epidemic. The implementation of a universal primary education policy in Uganda in 1997 provided us with a natural experiment to test the hypothesis that education is causally related to HIV stigma. For this analysis, we pooled publicly available, population-based data from the 2011 Uganda Demographic and Health Survey and the 2011 Uganda AIDS Indicator Survey. The primary outcomes of interest were negative attitudes toward persons with HIV, elicited using four questions about anticipated stigma and social distance. Standard least squares estimates suggested a statistically significant, negative association between years of schooling and HIV stigma (each P < 0.001, with t-statistics ranging from 4.9 to 14.7). We then used a natural experiment design, exploiting differences in birth cohort exposure to universal primary education as an instrumental variable. Participants who were <13 years old at the time of the policy change had 1.36 additional years of schooling compared to those who were ≥13 years old. Adjusting for linear age trends before and after the discontinuity, two-stage least squares estimates suggested no statistically significant causal effect of education on HIV stigma (P-values ranged from 0.21 to 0.69). Three of the four estimated regression coefficients were positive, and in all cases the lower confidence limits convincingly excluded the possibility of large negative effect sizes. These instrumental variables estimates have a causal interpretation and were not overturned by several robustness checks. We conclude that, for young adults in Uganda, additional years of education in the formal schooling system driven by a universal primary school intervention have not had a causal effect on reducing negative attitudes toward persons with HIV. Copyright © 2015 Elsevier Ltd. All rights reserved.

  20. The causal effect of education on HIV stigma in Uganda: evidence from a natural experiment

    PubMed Central

    Tsai, Alexander C.; Venkataramani, Atheendar S.

    2015-01-01

    Rationale HIV is highly stigmatized in sub-Saharan Africa. This is an important public health problem because HIV stigma has many adverse effects that threaten to undermine efforts to control the HIV epidemic. Objective The implementation of a universal primary education policy in Uganda in 1997 provided us with a natural experiment to test the hypothesis that education is causally related to HIV stigma. Methods For this analysis, we pooled publicly available, population-based data from the 2011 Uganda Demographic and Health Survey and the 2011 Uganda AIDS Indicator Survey. The primary outcomes of interest were negative attitudes toward persons with HIV, elicited using four questions about anticipated stigma and social distance. Results Standard least squares estimates suggested a statistically significant, negative association between years of schooling and HIV stigma (each P<0.001, with t-statistics ranging from 4.9 to 14.7). We then used a natural experiment design, exploiting differences in birth cohort exposure to universal primary education as an instrumental variable. Participants who were <13 years old at the time of the policy change had 1.36 additional years of schooling compared to those who were ≥13 years old. Adjusting for linear age trends before and after the discontinuity, two-stage least squares estimates suggested no statistically significant causal effect of education on HIV stigma (P-values ranged from 0.21 to 0.69). Three of the four estimated regression coefficients were positive, and in all cases the lower confidence limits convincingly excluded the possibility of large negative effect sizes. These instrumental variables estimates have a causal interpretation and were not overturned by several robustness checks. Conclusion We conclude that, for young adults in Uganda, additional years of education in the formal schooling system driven by a universal primary school intervention have not had a causal effect on reducing negative attitudes toward persons with HIV. PMID:26282707

  1. Calibration of the dietary data obtained from the Brazilian center of the Natural History of HPV Infection in Men study: the HIM Study.

    PubMed

    Teixeira, Juliana Araujo; Baggio, Maria Luiza; Fisberg, Regina Mara; Marchioni, Dirce Maria Lobo

    2010-12-01

    The objective of this study was to estimate calibration regressions for the dietary data measured using the quantitative food frequency questionnaire (QFFQ) in the Natural History of HPV Infection in Men (HIM) Study in Brazil. A sample of 98 individuals from the HIM study answered one QFFQ and three 24-hour recalls (24HR) at interviews. The calibration was performed using linear regression analysis in which the 24HR was the dependent variable and the QFFQ was the independent variable; age, body mass index, physical activity, income and schooling were used as adjustment variables in the models. The geometric means of the 24HR and the calibration-corrected QFFQ were statistically equal. The dispersion graphs between the instruments show increased correlation after the correction, although the points are more dispersed for models with lower explanatory power. Identification of the calibration regressions for the dietary data of the HIM study will make it possible to estimate the effect of diet on HPV infection, corrected for the measurement error of the QFFQ.
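
    The calibration step itself is a simple adjusted regression, sketched below: regress the reference (24HR) intake on the QFFQ intake plus covariates and use the fitted equation to correct QFFQ-reported values. The simulated magnitudes and the single covariate are illustrative only.

```python
# Regression-calibration sketch: correct an error-prone questionnaire against a
# reference instrument (simulated intakes, one adjustment covariate).
import numpy as np

rng = np.random.default_rng(8)
n = 98
true_intake = rng.normal(2000, 400, n)                      # kcal/day
ffq = true_intake + rng.normal(0, 500, n) + 300             # FFQ: biased and noisier
recall24h = true_intake + rng.normal(0, 250, n)             # reference instrument
age = rng.uniform(18, 70, n)

X = np.column_stack([np.ones(n), ffq, age])                 # adjusted calibration model
beta, *_ = np.linalg.lstsq(X, recall24h, rcond=None)
ffq_corrected = X @ beta                                    # calibration-corrected intakes

print(f"mean FFQ (raw):       {ffq.mean():.0f} kcal/day")
print(f"mean FFQ (corrected): {ffq_corrected.mean():.0f} kcal/day")
print(f"mean 24HR reference:  {recall24h.mean():.0f} kcal/day")
```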

  2. Long-memory and the sea level-temperature relationship: a fractional cointegration approach.

    PubMed

    Ventosa-Santaulària, Daniel; Heres, David R; Martínez-Hernández, L Catalina

    2014-01-01

    Through thermal expansion of oceans and melting of land-based ice, global warming is very likely contributing to the sea level rise observed during the 20th century. The amount by which further increases in global average temperature could affect sea level is known only with large uncertainties, owing to the limited capacity of physics-based models to predict sea levels from global surface temperatures. Semi-empirical approaches have been implemented to estimate the statistical relationship between these two variables, providing an alternative measure on which to base estimates of potentially disruptive impacts on coastal communities and ecosystems. However, only a few of these semi-empirical applications have addressed the spurious inference that is likely to be drawn when one nonstationary process is regressed on another. Furthermore, it has been shown that spurious effects are not eliminated by stationary processes when these possess strong long memory. Our results indicate that both global temperature and sea level indeed present the characteristics of long-memory processes. Nevertheless, we find that these variables are fractionally cointegrated when sea-ice extent is incorporated as an instrumental variable for temperature, which in our estimations has a statistically significant positive impact on global sea level.

  3. Implementing the undergraduate mini-CEX: a tailored approach at Southampton University.

    PubMed

    Hill, Faith; Kendall, Kathleen; Galbraith, Kevin; Crossley, Jim

    2009-04-01

    The mini-clinical evaluation exercise (mini-CEX) is widely used in the UK to assess clinical competence, but there is little evidence regarding its implementation in the undergraduate setting. This study aimed to estimate the validity and reliability of the undergraduate mini-CEX and discuss the challenges involved in its implementation. A total of 3499 mini-CEX forms were completed. Validity was assessed by estimating associations between mini-CEX score and a number of external variables, examining the internal structure of the instrument, checking competency domain response rates and profiles against expectations, and by qualitative evaluation of stakeholder interviews. Reliability was evaluated by overall reliability coefficient (R), estimation of the standard error of measurement (SEM), and from stakeholders' perceptions. Variance component analysis examined the contribution of relevant factors to students' scores. Validity was threatened by various confounding variables, including: examiner status; case complexity; attachment specialty; patient gender, and case focus. Factor analysis suggested that competency domains reflect a single latent variable. Maximum reliability can be achieved by aggregating scores over 15 encounters (R = 0.73; 95% confidence interval [CI] +/- 0.28 based on a 6-point assessment scale). Examiner stringency contributed 29% of score variation and student attachment aptitude 13%. Stakeholder interviews revealed staff development needs but the majority perceived the mini-CEX as more reliable and valid than the previous long case. The mini-CEX has good overall utility for assessing aspects of the clinical encounter in an undergraduate setting. Strengths include fidelity, wide sampling, perceived validity, and formative observation and feedback. Reliability is limited by variable examiner stringency, and validity by confounding variables, but these should be viewed within the context of overall assessment strategies.

  4. Multi-Epoch Mid-Infrared Interferometric Observations of the Oxygen-rich Mira Variable Star RR Aql with the VLTI/MIDI Instrument

    DTIC Science & Technology

    2011-01-01

    Karovicova, I.; Wittkowski, M.; Boboltz, D. A.; Fossat, E.; Ohnaka, K.; Scholz, M. (European Southern Observatory and collaborating institutions). We observed the oxygen-rich Mira variable RR Aql at 13 epochs covering 4 pulsation cycles with the MIDI instrument at the VLTI and modeled the observed data.

  5. Optimizing a Sensor Network with Data from Hazard Mapping Demonstrated in a Heavy-Vehicle Manufacturing Facility.

    PubMed

    Berman, Jesse D; Peters, Thomas M; Koehler, Kirsten A

    2018-05-28

    Our objective was to design a method that uses preliminary hazard mapping data to optimize the number and location of sensors within a network for a long-term assessment of occupational concentrations, while preserving the temporal variability, accuracy, and precision of predicted hazards. Particle number concentrations (PNCs) and respirable mass concentrations (RMCs) were measured with direct-reading instruments in a large heavy-vehicle manufacturing facility at 80-82 locations during 7 mapping events, stratified by day and season. Using kriged hazard mapping, a statistical approach identified optimal orders for removing locations to capture temporal variability and high prediction precision of PNC and RMC concentrations. We compared optimal-removal, random-removal, and least-optimal-removal orders to bound prediction performance. The temporal variability of PNC was found to be higher than that of RMC, with low correlation between the two particulate metrics (ρ = 0.30). Optimal-removal orders resulted in more accurate PNC kriged estimates (root mean square error [RMSE] = 49.2) at sample locations compared with the random-removal order (RMSE = 55.7). For estimates at locations having concentrations in the upper 10th percentile, the optimal-removal order preserved average estimated concentrations better than the random- or least-optimal-removal orders (P < 0.01). However, estimated average concentrations using an optimal-removal order were not statistically different from those using random removal when averaged over the entire facility. No statistical difference was observed between optimal- and random-removal methods for RMCs, which were less variable in time and space than PNCs. Optimized removal performed better than random removal in preserving the high temporal variability and accuracy of the hazard map for PNC, but not for the more spatially homogeneous RMC. These results can be used to reduce the number of locations used in a network of static sensors for long-term monitoring of hazards in the workplace, without sacrificing prediction performance.

  6. Estimation and Bias Correction of Aerosol Abundance using Data-driven Machine Learning and Remote Sensing

    NASA Technical Reports Server (NTRS)

    Malakar, Nabin K.; Lary, D. L.; Moore, A.; Gencaga, D.; Roscoe, B.; Albayrak, Arif; Petrenko, Maksym; Wei, Jennifer

    2012-01-01

    Air quality information is increasingly becoming a public health concern, since some aerosol particles are harmful to people's health. One widely available metric of aerosol abundance is the aerosol optical depth (AOD). The AOD is the integrated light extinction coefficient over a vertical atmospheric column of unit cross section, which represents the extent to which the aerosols in that vertical profile prevent the transmission of light by absorption or scattering. A comparison between the AOD measured from the ground-based Aerosol Robotic Network (AERONET) system and the satellite MODIS instruments at 550 nm shows that there is a bias between the two data products. We performed a comprehensive analysis exploring possible factors which may be contributing to the inter-instrumental bias between MODIS and AERONET. The analysis used several measured variables, including the MODIS AOD, as input to train a neural network in regression mode to predict the AERONET AOD values. This not only allowed us to obtain an estimate but also allowed us to infer the optimal sets of variables that played an important role in the prediction. In addition, we applied machine learning to infer the global abundance of ground-level PM2.5 from the AOD data and other ancillary satellite and meteorology products. This research is part of our goal to provide air quality information, which can also be useful for global epidemiology studies.
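
    The bias-correction idea can be sketched as a supervised regression: train a model to predict the ground-truth AOD from the satellite AOD plus ancillary variables, so that systematic differences between the two products are absorbed by the model. The predictors, the simulated bias, and the choice of a small neural network below are assumptions for illustration, not the variables or architecture used in the study.

```python
# Regression-based inter-instrument bias correction on simulated AOD data.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(9)
n = 5000
modis_aod = rng.gamma(2.0, 0.15, n)
humidity = rng.uniform(10, 90, n)
surface_albedo = rng.uniform(0.02, 0.4, n)
# Synthetic "AERONET" truth: the satellite product carries an albedo-dependent bias
aeronet_aod = (modis_aod * (1 - 0.5 * surface_albedo) + 0.0002 * humidity
               + rng.normal(scale=0.02, size=n))

X = np.column_stack([modis_aod, humidity, surface_albedo])
X_tr, X_te, y_tr, y_te = train_test_split(X, aeronet_aod, test_size=0.25, random_state=0)

model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=0))
model.fit(X_tr, y_tr)

raw_rmse = np.sqrt(np.mean((X_te[:, 0] - y_te) ** 2))        # satellite AOD vs truth directly
corrected_rmse = np.sqrt(np.mean((model.predict(X_te) - y_te) ** 2))
print(f"RMSE before correction: {raw_rmse:.4f}")
print(f"RMSE after correction:  {corrected_rmse:.4f}")
```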

  7. Derivation of Design Loads and Random Vibration specifications for Spacecraft Instruments and Sub-Units

    NASA Astrophysics Data System (ADS)

    Fransen, S.; Yamawaki, T.; Akagi, H.; Eggens, M.; van Baren, C.

    2014-06-01

    After a first estimation based on statistics, the design loads for instruments are generally estimated by coupled spacecraft/instrument sine analysis once an FE model of the spacecraft is available. When the design loads for the instrument have been derived, the next step in the process is to estimate the random vibration environment at the instrument base and to compute the RMS load at the centre of gravity of the instrument by means of vibro-acoustic analysis. Finally, the design loads of the lightweight sub-units of the instrument can be estimated through random vibration analysis at instrument level, taking into account the notches required to protect the instrument interfaces in the hard-mounted random vibration test. This paper presents the aforementioned steps of instrument and sub-unit load derivation in the preliminary design phase of the spacecraft and identifies the problems that may be encountered in terms of design load consistency between low-frequency and high-frequency environments. The SpicA FAR-infrared Instrument (SAFARI), which is currently being developed for the Space Infrared Telescope for Cosmology and Astrophysics (SPICA), is used as a guiding example.

  8. Instrumental variable applications using nursing home prescribing preferences in comparative effectiveness research.

    PubMed

    Huybrechts, Krista F; Gerhard, Tobias; Franklin, Jessica M; Levin, Raisa; Crystal, Stephen; Schneeweiss, Sebastian

    2014-08-01

    Nursing home residents are of particular interest for comparative effectiveness research given their susceptibility to adverse treatment effects and systematic exclusion from trials. However, the risk of residual confounding because of unmeasured markers of declining health using conventional analytic methods is high. We evaluated the validity of instrumental variable (IV) methods based on nursing home prescribing preference to mitigate such confounding, using psychotropic medications to manage behavioral problems in dementia as a case study. A cohort using linked data from Medicaid, Medicare, Minimum Data Set, and Online Survey, Certification and Reporting for 2001-2004 was established. Dual-eligible patients ≥65 years who initiated psychotropic medication use after admission were selected. Nursing home prescribing preference was characterized using mixed-effects logistic regression models. The plausibility of IV assumptions was explored, and the association between psychotropic medication class and 180-day mortality was estimated. High-prescribing and low-prescribing nursing homes differed by a factor of 2. Each preference-based IV measure described a substantial proportion of variation in psychotropic medication choice (β(IV → treatment): 0.22-0.36). Measured patient characteristics were well balanced across patient groups based on instrument status (52% average reduction in Mahalanobis distance). There was no evidence that instrument status was associated with markers of nursing home quality of care. Findings indicate that IV analyses using nursing home prescribing preference may be a useful approach in comparative effectiveness studies, and should extend naturally to analyses including untreated comparison groups, which are of great scientific interest but subject to even stronger confounding. Copyright © 2014 John Wiley & Sons, Ltd.

  9. Secular temperature trends for the southern Rocky Mountains over the last five centuries

    NASA Astrophysics Data System (ADS)

    Berkelhammer, M.; Stott, L. D.

    2012-09-01

    Pre-instrumental surface temperature variability in the Southwestern United States has traditionally been reconstructed using variations in the annual ring widths of high altitude trees that live near a growth-limiting isotherm. A number of studies have suggested that the response of some trees to temperature variations is non-stationary, warranting the development of alternative approaches towards reconstructing past regional temperature variability. Here we present a five-century temperature reconstruction for a high-altitude site in the Rocky Mountains derived from the oxygen isotopic composition of cellulose (δ18Oc) from Bristlecone Pine trees. The record is independent of the co-located growth-based reconstruction while providing the same temporal resolution and absolute age constraints. The empirical correlation between δ18Oc and instrumental temperatures is used to produce a temperature transfer function. A forward-model for cellulose isotope variations, driven by meteorological data and output from an isotope-enabled General Circulation Model, is used to evaluate the processes that propagate the temperature signal to the proxy. The cellulose record documents persistent multidecadal variations in δ18Oc that are attributable to temperature shifts on the order of 1°C but no sustained monotonic rise in temperature or a step-like increase since the late 19th century. The isotope-based temperature history is consistent with both regional wood density-based temperature estimates and some sparse early instrumental records.

  10. Propensity-score matching in economic analyses: comparison with regression models, instrumental variables, residual inclusion, differences-in-differences, and decomposition methods.

    PubMed

    Crown, William H

    2014-02-01

    This paper examines the use of propensity score matching in economic analyses of observational data. Several excellent papers have previously reviewed practical aspects of propensity score estimation and other aspects of the propensity score literature. The purpose of this paper is to compare the conceptual foundation of propensity score models with alternative estimators of treatment effects. References are provided to empirical comparisons among methods that have appeared in the literature. These comparisons are available for a subset of the methods considered in this paper. However, in some cases, no pairwise comparisons of particular methods are yet available, and there are no examples of comparisons across all of the methods surveyed here. Irrespective of the availability of empirical comparisons, the goal of this paper is to provide some intuition about the relative merits of alternative estimators in health economic evaluations where nonlinearity, sample size, availability of pre/post data, heterogeneity, and missing variables can have important implications for choice of methodology. Also considered is the potential combination of propensity score matching with alternative methods such as differences-in-differences and decomposition methods that have not yet appeared in the empirical literature.

  11. Incorporating Measurement Error from Modeled Air Pollution Exposures into Epidemiological Analyses.

    PubMed

    Samoli, Evangelia; Butland, Barbara K

    2017-12-01

    Outdoor air pollution exposures used in epidemiological studies are commonly predicted from spatiotemporal models incorporating limited measurements, temporal factors, geographic information system variables, and/or satellite data. Measurement error in these exposure estimates leads to imprecise estimation of health effects and their standard errors. We reviewed methods for measurement error correction that have been applied in epidemiological studies that use model-derived air pollution data. We identified seven cohort studies and one panel study that have employed measurement error correction methods. These methods included regression calibration, risk set regression calibration, regression calibration with instrumental variables, the simulation extrapolation approach (SIMEX), and methods under the non-parametric or parameter bootstrap. Corrections resulted in small increases in the absolute magnitude of the health effect estimate and its standard error under most scenarios. Limited application of measurement error correction methods in air pollution studies may be attributed to the absence of exposure validation data and the methodological complexity of the proposed methods. Future epidemiological studies should consider in their design phase the requirements for the measurement error correction method to be later applied, while methodological advances are needed under the multi-pollutants setting.

  12. Does extending health insurance coverage to the uninsured improve population health outcomes?

    PubMed

    Thornton, James A; Rice, Jennifer L

    2008-01-01

    An ongoing debate exists about whether the US should adopt a universal health insurance programme. Much of the debate has focused on programme implementation and cost, with relatively little attention to benefits for social welfare. To estimate the effect on US population health outcomes, measured by mortality, of extending private health insurance to the uninsured, and to obtain a rough estimate of the aggregate economic benefits of extending insurance coverage to the uninsured. We use state-level panel data for all 50 states for the period 1990-2000 to estimate a health insurance augmented, aggregate health production function for the US. An instrumental variables fixed-effects estimator is used to account for confounding variables and reverse causation from health status to insurance coverage. Several observed factors, such as income, education, unemployment, cigarette and alcohol consumption and population demographic characteristics are included to control for potential confounding variables that vary across both states and time. The results indicate a negative relationship between private insurance and mortality, thus suggesting that extending insurance to the uninsured population would result in an improvement in population health outcomes. The estimate of the marginal effect of insurance coverage indicates that a 10% increase in the population-insured rate of a state reduces mortality by 1.69-1.92%. Using data for the year 2003, we calculate that extending private insurance coverage to the entire uninsured population in the US would save over 75 000 lives annually and may yield annual net benefits to the nation in excess of $US400 billion. This analysis suggests that extending health insurance coverage through the private market to the 46 million Americans without health insurance may well produce large social economic benefits for the nation as a whole.

  13. An instrumental variable random-coefficients model for binary outcomes

    PubMed Central

    Chesher, Andrew; Rosen, Adam M

    2014-01-01

    In this paper, we study a random-coefficients model for a binary outcome. We allow for the possibility that some or even all of the explanatory variables are arbitrarily correlated with the random coefficients, thus permitting endogeneity. We assume the existence of observed instrumental variables Z that are jointly independent of the random coefficients, although we place no structure on the joint determination of the endogenous variable X and instruments Z, as would be required for a control function approach. The model fits within the spectrum of generalized instrumental variable models, and we thus apply identification results from our previous studies of such models to the present context, demonstrating their use. Specifically, we characterize the identified set for the distribution of random coefficients in the binary response model with endogeneity via a collection of conditional moment inequalities, and we investigate the structure of these sets by way of numerical illustration. PMID:25798048

  14. Statistical framework for evaluation of climate model simulations by use of climate proxy data from the last millennium - Part 1: Theory

    NASA Astrophysics Data System (ADS)

    Sundberg, R.; Moberg, A.; Hind, A.

    2012-08-01

    A statistical framework for comparing the output of ensemble simulations from global climate models with networks of climate proxy and instrumental records has been developed, focusing on near-surface temperatures for the last millennium. This framework includes the formulation of a joint statistical model for proxy data, instrumental data and simulation data, which is used to optimize a quadratic distance measure for ranking climate model simulations. An essential underlying assumption is that the simulations and the proxy/instrumental series have a shared component of variability that is due to temporal changes in external forcing, such as volcanic aerosol load, solar irradiance or greenhouse gas concentrations. Two statistical tests have been formulated. Firstly, a preliminary test establishes whether a significant temporal correlation exists between instrumental/proxy and simulation data. Secondly, the distance measure is expressed in the form of a test statistic of whether a forced simulation is closer to the instrumental/proxy series than unforced simulations. The proposed framework allows any number of proxy locations to be used jointly, with different seasons, record lengths and statistical precision. The goal is to objectively rank several competing climate model simulations (e.g. with alternative model parameterizations or alternative forcing histories) by means of their goodness of fit to the unobservable true past climate variations, as estimated from noisy proxy data and instrumental observations.
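
    The two tests described above can be caricatured for a single proxy site: a correlation pre-test between a forced simulation and the proxy series, followed by a quadratic distance comparison between forced and unforced runs. The sketch uses synthetic series and ignores the multi-site, multi-season error structure of the full framework.

        import numpy as np
        from scipy.stats import pearsonr

        rng = np.random.default_rng(2)
        t = 1000                                       # years of the last millennium
        forcing = np.cumsum(rng.normal(0, 0.05, t))    # stand-in for the external forcing signal

        proxy = forcing + rng.normal(0, 0.5, t)        # noisy proxy record sharing the forced signal
        forced_sim = forcing + rng.normal(0, 0.3, t)   # forced climate model simulation
        unforced_sim = rng.normal(0, 0.3, t)           # control run without external forcing

        # Test 1: is there a significant temporal correlation between simulation and proxy?
        r, p = pearsonr(forced_sim, proxy)
        print(f"correlation pre-test: r = {r:.2f}, p = {p:.1e}")

        # Test 2: quadratic distance of each simulation to the proxy series;
        # a forced run should be closer to the proxy than an unforced one.
        d_forced = np.mean((forced_sim - proxy) ** 2)
        d_unforced = np.mean((unforced_sim - proxy) ** 2)
        print(f"distance forced = {d_forced:.2f}, unforced = {d_unforced:.2f}")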

  15. Laser diffraction particle sizing in STRESS

    NASA Astrophysics Data System (ADS)

    Agrawal, Y. C.; Pottsmith, H. C.

    1994-08-01

    An autonomous instrument system for measuring particle size spectra in the sea is described. The instrument records the small-angle scattering characteristics of the particulate ensemble present in water. The small-angle scattering distribution is inverted into size spectra. The discussion of the instrument is accompanied by a review of the information content of the data. It is noted that the inverse problem is sensitive to the forward model for light scattering employed in constructing the inversion matrix. The instrument system is validated using monodisperse polystyrene particles and NIST standard distributions of glass spheres. Data from a long-term deployment on the California shelf during the field experiment Sediment Transport Events on Shelves and Slopes (STRESS) are included. The size distribution in STRESS, measured at a fixed height of 1.2 m above the bed, showed significant variability over time. In particular, the volume distribution sometimes changed from mono-modal to bi-modal during the experiment. The data on particle-size distribution are combined with friction velocity measurements in the current boundary layer to produce a size-dependent estimate of the suspended mass at 10 cm above bottom. It is argued that these concentrations represent the reference concentration at the bed for the smaller size classes. The suspended mass at all sizes shows a strong correlation with wave variance. Using the size distribution, corrections to the optical transmissometry calibration factor are estimated for the duration of the experiment. The change in calibration at 1.2 m above bed (mab) is shown to have a standard error of 30% over the duration of the experiment, with values ranging from 1.8 to 0.8.

  16. "Fibromyalgia and quality of life: mapping the revised fibromyalgia impact questionnaire to the preference-based instruments".

    PubMed

    Collado-Mateo, Daniel; Chen, Gang; Garcia-Gordillo, Miguel A; Iezzi, Angelo; Adsuar, José C; Olivares, Pedro R; Gusi, Narcis

    2017-05-30

    The revised version of the Fibromyalgia Impact Questionnaire (FIQR) is one of the most widely used specific questionnaires in FM studies. However, this questionnaire does not allow calculation of QALYs as it is not a preference-based measure. The aim of this study was to develop mapping algorithms that enable FIQR scores to be transformed into utility scores that can be used in cost-utility analyses. A cross-sectional survey was conducted. One hundred and ninety-two Spanish women with fibromyalgia were asked to complete four generic quality-of-life questionnaires, i.e. the EQ-5D-5L, 15D, AQoL-8D and SF-12, and one disease-specific instrument, the FIQR. A direct mapping approach was adopted to derive mapping algorithms between the FIQR and each of the four multi-attribute utility (MAU) instruments. Health state utility was treated as the dependent variable in the regression analysis, whilst the FIQR score and age were predictors. The mean utility scores ranged from 0.47 (AQoL-8D) to 0.69 (15D). All correlations between the FIQR total score and the MAU instruments' utility scores were highly significant (p < 0.0001) with magnitudes larger than 0.5. Although very slight differences in the mean absolute error were found between the ordinary least squares (OLS) estimator and the generalized linear model (GLM), models based on GLM were better for the EQ-5D-5L, AQoL-8D and 15D. The mapping algorithms developed in this study enable the estimation of utility values from scores on a fibromyalgia-specific questionnaire.
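
    A direct mapping algorithm of this kind is, at its core, a regression of utility on the disease-specific score. The sketch below shows the OLS variant with an in-sample mean-absolute-error check; the data, column names and target instrument are placeholders rather than the study's actual models.

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(3)
        n = 192
        df = pd.DataFrame({"fiqr": rng.uniform(0, 100, n), "age": rng.uniform(30, 70, n)})
        # Hypothetical utilities: a worse (higher) FIQR score maps to a lower utility
        df["utility"] = np.clip(0.95 - 0.005 * df["fiqr"] + rng.normal(0, 0.08, n), -0.2, 1.0)

        # Direct mapping: regress utility on the FIQR score and age
        ols = smf.ols("utility ~ fiqr + age", data=df).fit()

        # Mean absolute error, the criterion used to compare OLS and GLM specifications
        mae = np.mean(np.abs(df["utility"] - ols.fittedvalues))
        print(ols.params, f"MAE = {mae:.3f}")

        # The fitted equation can then convert any FIQR score into a utility for QALY calculations
        print(ols.predict(pd.DataFrame({"fiqr": [40.0], "age": [50.0]})))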

  17. A thermal control system for long-term survival of scientific instruments on lunar surface

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ogawa, K., E-mail: ogawa@astrobio.k.u-tokyo.ac.jp; Iijima, Y.; Tanaka, S.

    2014-03-15

    A thermal control system is being developed for scientific instruments placed on the lunar surface. This thermal control system, the Lunar Mission Survival Module (MSM), was designed for scientific instruments that are planned to be operated for over a year in the future Japanese lunar landing mission SELENE-2. For long-term operations, the lunar surface is a severe environment because the soil (regolith) temperature varies widely, from approximately −200 °C at night to 100 °C in the daytime, conditions in which space electronics can hardly survive. The MSM has a tent of multi-layered insulators and effectively forms a “regolith mound”. The temperature of internal devices is less variable, just as in the lunar underground layers. The insulators retain heat in the regolith soil during daylight, which keeps the devices warm at night. We conducted the concept design of the lunar survival module and estimated its potential with a thermal mathematical model, assuming the use of a lunar seismometer designed for SELENE-2. Thermal vacuum tests were also conducted using a thermal evaluation model in order to assess the validity of some of the thermal parameters assumed in the computational thermal model. The numerical and experimental results indicated a sufficient survivability potential for the concept of our thermal control system.

  18. Spatial variability of extreme rainfall at radar subpixel scale

    NASA Astrophysics Data System (ADS)

    Peleg, Nadav; Marra, Francesco; Fatichi, Simone; Paschalis, Athanasios; Molnar, Peter; Burlando, Paolo

    2018-01-01

    Extreme rainfall is quantified in engineering practice using Intensity-Duration-Frequency (IDF) curves that are traditionally derived from rain gauges and, more recently, also from remote sensing instruments such as weather radars. These instruments measure rainfall at different spatial scales: a rain gauge samples rainfall at the point scale, while a weather radar averages precipitation over a relatively large area, generally around 1 km2. As such, a radar-derived IDF curve is representative of the mean areal rainfall over a given radar pixel and neglects the within-pixel rainfall variability. In this study, we quantify subpixel variability of extreme rainfall by using a novel space-time rainfall generator (the STREAP model) that downscales in space the rainfall within a given radar pixel. The study was conducted using a unique radar data record (23 years) and a very dense rain-gauge network in the Eastern Mediterranean area (northern Israel). Radar-IDF curves, together with an ensemble of point-based IDF curves representing the radar subpixel extreme rainfall variability, were developed by fitting Generalized Extreme Value (GEV) distributions to annual rainfall maxima. It was found that the mean areal extreme rainfall derived from the radar underestimates most of the extreme values computed for point locations within the radar pixel (on average, ∼70%). The subpixel variability of extreme rainfall was found to increase with longer return periods and shorter durations (e.g. from a maximum variability of 10% for a 2-year return period and a 4-h duration to 30% for a 50-year return period and a 20-min duration). For the longer return periods, a considerable enhancement of extreme rainfall variability was found when stochastic (natural) climate variability was taken into account. Bounding the range of the subpixel extreme rainfall derived from radar-IDF curves can be of major importance for different applications that require very local estimates of rainfall extremes.
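
    The building block of both the radar- and the point-scale IDF curves, fitting a GEV distribution to annual maxima and reading off return levels, can be sketched with scipy as below; the annual maxima are synthetic and the duration is arbitrary.

        import numpy as np
        from scipy.stats import genextreme

        # Synthetic annual maxima of 20-min rainfall intensity (mm/h) for a 23-year record
        annual_maxima = genextreme.rvs(c=-0.1, loc=30, scale=8, size=23, random_state=4)

        # Fit the GEV distribution to the annual maxima
        c, loc, scale = genextreme.fit(annual_maxima)

        # Return level for return period T: the (1 - 1/T) quantile of the fitted GEV
        for T in (2, 10, 50):
            level = genextreme.ppf(1 - 1 / T, c, loc=loc, scale=scale)
            print(f"{T:>2}-year return level: {level:.1f} mm/h")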

  19. A global SOLIS vector spectromagnetograph (VSM) network

    NASA Astrophysics Data System (ADS)

    Streander, K. V.; Giampapa, M. S.; Harvey, J. W.; Henney, C. J.; Norton, A. A.

    2008-07-01

    Understanding of the Sun's magnetic-field-related activity is far from complete, as reflected in the limited ability to make accurate predictions of solar variability. To advance our understanding of solar magnetism, the National Solar Observatory (NSO) constructed the Synoptic Optical Long-term Investigations of the Sun (SOLIS) suite of instruments to conduct high-precision optical measurements of processes on the Sun whose study requires sustained observations over long time periods. The Vector Spectromagnetograph (VSM), the principal SOLIS instrument, has been in operation since 2003 and obtains photospheric vector data, as well as photospheric and chromospheric longitudinal magnetic field measurements. Instrument performance is being enhanced by employing new, high-speed cameras that virtually freeze seeing, thus improving the sensitivity with which the solar magnetic field configuration can be measured. A major operational goal is to provide real-time and near-real-time data for forecasting space weather and to increase the scientific yield from shorter-duration solar space missions and ground-based research projects. The National Solar Observatory proposes to build two near-duplicates of the VSM instrument and place them at international sites to form a three-site global VSM network. Current electronic industry practice of short lifetime cycles leads to improved performance and reduced acquisition costs, but also to redesign costs and engineering impacts that must be minimized. The current VSM instrument status and the experience gained from working on the original instrument are presented herein and used to demonstrate that one can dramatically reduce the estimated cost and fabrication time required to duplicate and commission two additional instruments.

  20. Intraindividual Variability in Executive Functions but Not Speed of Processing or Conflict Resolution Predicts Performance Differences in Gait Speed in Older Adults

    PubMed Central

    Mahoney, Jeannette; Verghese, Joe

    2014-01-01

    Background. The relationship between executive functions (EF) and gait speed is well established. However, with the exception of dual tasking, the key components of EF that predict differences in gait performance have not been determined. Therefore, the current study was designed to determine whether processing speed, conflict resolution, and intraindividual variability in EF predicted variance in gait performance in single- and dual-task conditions. Methods. Participants were 234 nondemented older adults (mean age 76.48 years; 55% women) enrolled in a community-based cohort study. Gait speed was assessed using an instrumented walkway during single- and dual-task conditions. The flanker task was used to assess EF. Results. Results from the linear mixed effects model showed that (a) dual-task interference caused a significant dual-task cost in gait speed (estimate = 35.99; 95% CI = 33.19–38.80) and (b) of the cognitive predictors, only intraindividual variability was associated with gait speed (estimate = −.606; 95% CI = −1.11 to −.10). In unadjusted analyses, the three EF measures were related to gait speed in single- and dual-task conditions. However, in fully adjusted linear regression analysis, only intraindividual variability predicted performance differences in gait speed during dual tasking (B = −.901; 95% CI = −1.557 to −.245). Conclusion. Among the three EF measures assessed, intraindividual variability but not speed of processing or conflict resolution predicted performance differences in gait speed. PMID:24285744
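
    A linear mixed-effects model of this general shape, repeated gait measurements nested within participants and cognitive measures entered as fixed effects, can be written with statsmodels as in the sketch below. The simulated data and variable names are placeholders, not the study's dataset or full covariate set.

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(5)
        n_subj = 234
        subj = np.repeat(np.arange(n_subj), 2)                 # single- and dual-task walk per person
        dual = np.tile([0, 1], n_subj)
        iiv = np.repeat(rng.normal(0, 1, n_subj), 2)           # intraindividual variability in EF
        subj_effect = np.repeat(rng.normal(0, 8, n_subj), 2)   # random intercept per participant

        gait = 100 - 36 * dual - 0.6 * iiv + subj_effect + rng.normal(0, 5, len(subj))
        df = pd.DataFrame({"subject": subj, "dual_task": dual, "iiv": iiv, "gait_speed": gait})

        # Random intercept for subject; dual-task condition and EF variability as fixed effects
        result = smf.mixedlm("gait_speed ~ dual_task + iiv", data=df, groups=df["subject"]).fit()
        print(result.summary())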

  1. Static and dynamic efficiency of irreversible health care investments under alternative payment rules.

    PubMed

    Levaggi, R; Moretto, M; Pertile, P

    2012-01-01

    The paper studies the incentive for providers to invest in new health care technologies under alternative payment systems, when the patients' benefits are uncertain. If the reimbursement by the purchaser includes both a variable (per patient) and a lump-sum component, efficiency can be ensured both in the timing of adoption (dynamic) and the intensity of use of the technology (static). If the second instrument is unavailable, a trade-off may emerge between static and dynamic efficiency. In this context, we also discuss how the regulator could use control of the level of uncertainty faced by the provider as an instrument to mitigate the trade-off between static and dynamic efficiency. Finally, we calibrate the model to study a specific technology and estimate the cost of a regulatory failure. Copyright © 2011 Elsevier B.V. All rights reserved.

  2. Searching for the Beginning of the Ozone Turnaround Using a 22-Year Merged Satellite Data Set

    NASA Technical Reports Server (NTRS)

    Stolarski, Richard S.; Meeson, Blanche W. (Technical Monitor)

    2001-01-01

    We have used the data from six satellite instruments that measure the total column amount of ozone to construct a consistent merged data set extending from late 1978 into 2000. The keys to constructing a merged data set are to minimize potential drift of individual instruments and to accurately establish instrument-to-instrument offsets. We have used the short-wavelength D-pair measurements (306nm-313nm) of the SBUV and SBUV/2 instruments near the equator to establish a relatively drift-free record for these instruments. We have then used their overlap with the Nimbus 7 and EP TOMS instruments to establish the relative calibration of the various instruments. We have evaluated the drift uncertainty in our merged ozone data (MOD) set by examining both the individual instrument drift uncertainty and the uncertainty in establishing the instrument-to-instrument differences. We conclude that the instrumental drift uncertainty over the 22-year data record is 0.9 %/decade (2-sigma). We have compared our MOD record with 37 ground stations that have a continuous record over that time period. We have a mean drift with respect to the stations of +0.3 %/decade, which is within 1-sigma of our uncertainty estimate. Using the satellite record as a transfer standard, we can estimate the capability of the ground instruments to establish satellite calibration. Adding the statistical variability of the station drifts with respect to the satellite to an estimate of the overall drift uncertainty of the world standard instrument, we conclude that the stations should be able to be used to establish the drift of the satellite data record to within an uncertainty of 0.6 %/decade (2-sigma). Adding to this an uncertainty due to the incomplete global coverage of the stations, we conclude that the station data should be able to establish the global trend with an uncertainty of about 0.7 %/decade, slightly better than for the satellite record. We conclude that merging the two records together gives only a slight improvement in the uncertainty. Keeping them separate gives the greater confidence of two independent measures of the ozone trend and potential recovery. We fit the trend in our MOD record through May of 1991 and then extrapolated forward to see if the data at the end of the record were above the statistical model, as a measure of ozone recovery, as was done in the last WMO/UNEP assessment report. Because our data set drifts with respect to the ground stations through May of 1991, we calculated a smaller global trend (-1.1 %/decade) than in the WMO/UNEP report. Our data in 1998 and 1999 were, on average, 2 DU above the extrapolated statistical model with a 2-sigma uncertainty of 6 DU. For the combined mid-latitudes of the northern and southern hemispheres, the data were 5 DU above the extrapolated statistical model with a 2-sigma uncertainty of 10 DU. These may be signs of recovery, but they are still statistically insignificant.

  3. The development and validation of the Youth Actuarial Care Needs Assessment Tool for Non-Offenders (Y-ACNAT-NO).

    PubMed

    Assink, Mark; van der Put, Claudia E; Oort, Frans J; Stams, Geert Jan J M

    2015-03-04

    In the Netherlands, police officers come into contact not only with juvenile offenders, but also with a large number of juveniles who were involved in a criminal offense, though not in the role of a suspect (i.e., juvenile non-offenders). Until now, no valid and reliable instrument has been available that can be used by Dutch police officers for estimating the risk for future care needs of juvenile non-offenders. In the present study, the Youth Actuarial Care Needs Assessment Tool for Non-Offenders (Y-ACNAT-NO) was developed for predicting the risk for future care needs, defined as (1) a future supervision order as imposed by a juvenile court judge and (2) future worrisome incidents involving child abuse, domestic violence/strife, and/or sexually offensive behavior at the juvenile's living address (i.e., problems in the child-rearing environment). Police records of 3,200 juveniles were retrieved from the Dutch police registration system, after which the sample was randomly split into a construction sample (n = 1,549) and a validation sample (n = 1,651). The Y-ACNAT-NO was developed by performing an Exhaustive CHAID analysis using the construction sample. The predictive validity of the instrument was examined in the validation sample by calculating several performance indicators that assess discrimination and calibration. The CHAID output yielded an instrument that consisted of six variables and eleven different risk groups. The risk for future care needs ranged from 0.06 in the lowest risk group to 0.83 in the highest risk group. The AUC value in the validation sample was .764 (95% CI [.743, .784]) and Sander's calibration score indicated an average assessment error of 3.74% in risk estimates per risk category. The Y-ACNAT-NO is the first instrument that can be used by Dutch police officers for estimating the risk for future care needs of juvenile non-offenders. The predictive validity of the Y-ACNAT-NO in terms of discrimination and calibration was sufficient to justify its use as an initial screening instrument when a decision is needed about referring a juvenile for further assessment of care needs.
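
    The two reported aspects of predictive validity, discrimination (AUC) and calibration (average error between predicted and observed risk per risk group), can be computed as sketched below; the risk groups and outcomes are simulated, not the Y-ACNAT-NO itself.

        import numpy as np
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(6)
        n = 1651
        predicted_risk = rng.choice([0.06, 0.15, 0.30, 0.55, 0.83], size=n)   # risk-group estimates
        outcome = rng.binomial(1, predicted_risk)                             # observed care needs

        # Discrimination: area under the ROC curve
        print("AUC:", round(roc_auc_score(outcome, predicted_risk), 3))

        # Calibration: mean absolute difference between predicted and observed risk per group
        errors = [abs(outcome[predicted_risk == r].mean() - r) for r in np.unique(predicted_risk)]
        print("average assessment error per risk group:", round(100 * float(np.mean(errors)), 2), "%")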

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Riihimaki, L.; McFarlane, S.; Sivaraman, C.

    The ndrop_mfrsr value-added product (VAP) provides an estimate of the cloud droplet number concentration of overcast water clouds, retrieved from the cloud optical depth measured by the multi-filter rotating shadowband radiometer (MFRSR) instrument and the liquid water path (LWP) retrieved from the microwave radiometer (MWR). When cloud layer information is available from vertically pointing lidar and radars in the Active Remote Sensing of Clouds (ARSCL) product, the VAP also provides estimates of the adiabatic LWP and an adiabatic parameter (beta) that indicates how divergent the LWP is from the adiabatic case. Quality control (QC) flags (qc_drop_number_conc), an uncertainty estimate (drop_number_conc_toterr), and a cloud layer type flag (cloud_base_type) are useful indicators of the quality and accuracy of any given value of the retrieval. Examples of these major input and output variables are given in sample plots in section 6.0.

  5. A quantile regression model for failure-time data with time-dependent covariates

    PubMed Central

    Gorfine, Malka; Goldberg, Yair; Ritov, Ya’acov

    2017-01-01

    Summary Since survival data occur over time, often important covariates that we wish to consider also change over time. Such covariates are referred to as time-dependent covariates. Quantile regression offers flexible modeling of survival data by allowing the covariate effects to vary with quantiles. This article provides a novel quantile regression model accommodating time-dependent covariates for analyzing survival data subject to right censoring. Our simple estimation technique assumes the existence of instrumental variables. In addition, we present a doubly robust estimator in the sense of Robins and Rotnitzky (1992, Recovery of information and adjustment for dependent censoring using surrogate markers. In: Jewell, N. P., Dietz, K. and Farewell, V. T. (editors), AIDS Epidemiology. Boston: Birkhäuser, pp. 297–331.). The asymptotic properties of the estimators are rigorously studied. Finite-sample properties are demonstrated by a simulation study. The utility of the proposed methodology is demonstrated using the Stanford heart transplant dataset. PMID:27485534
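
    For readers unfamiliar with the basic tool, the sketch below fits plain conditional quantiles with statsmodels; it does not implement the article's censored, time-dependent-covariate, instrumental-variable estimator, only the simplest building block on simulated data.

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(7)
        n = 1000
        x = rng.uniform(0, 10, n)
        # Heteroscedastic outcome, so the covariate effect differs across quantiles
        y = 1.0 + 0.5 * x + (0.2 + 0.1 * x) * rng.normal(size=n)
        df = pd.DataFrame({"x": x, "y": y})

        model = smf.quantreg("y ~ x", data=df)
        for q in (0.25, 0.5, 0.75):
            fit = model.fit(q=q)
            print(f"quantile {q}: slope = {fit.params['x']:.3f}")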

  6. HIV and sexual behavior change: why not Africa?

    PubMed

    Oster, Emily

    2012-01-01

    Despite high rates of HIV in Sub-Saharan Africa, and the corresponding high mortality risk associated with risky sexual behavior, behavioral response has been limited. This paper explores three explanations for this: bias in OLS estimates, limited non-HIV life expectancy and limited knowledge. I find support for the first two. First, using a new instrumental variable strategy I find that OLS estimates of the relationship between risky sex and HIV are biased upwards, and IV estimates indicate reductions in risky behavior in response to the epidemic. Second, I find these reductions are larger for individuals who live in areas with higher life expectancy, suggesting high rates of non-HIV mortality suppress behavioral response; this is consistent with optimizing behavior. Using somewhat limited knowledge proxies, I find no evidence that areas with higher knowledge of the epidemic have greater behavior change. Copyright © 2012 Elsevier B.V. All rights reserved.

  7. Determining the Attitudes of Undergraduate Students Having Vocational Music Education towards Individual Instrument Course According to Different Variables

    ERIC Educational Resources Information Center

    Uluçay, Taner

    2017-01-01

    This study was carried out in order to determine the attitudes of undergraduate students who study music vocationally towards the individual instrument course, according to the variables of grade, gender, individual instrument, and type of high school from which they graduated. The research data were obtained from 102 undergraduate students studying in Erzincan…

  8. Correcting the Standard Errors of 2-Stage Residual Inclusion Estimators for Mendelian Randomization Studies

    PubMed Central

    Palmer, Tom M; Holmes, Michael V; Keating, Brendan J; Sheehan, Nuala A

    2017-01-01

    Abstract Mendelian randomization studies use genotypes as instrumental variables to test for and estimate the causal effects of modifiable risk factors on outcomes. Two-stage residual inclusion (TSRI) estimators have been used when researchers are willing to make parametric assumptions. However, researchers are currently reporting uncorrected or heteroscedasticity-robust standard errors for these estimates. We compared several different forms of the standard error for linear and logistic TSRI estimates in simulations and in real-data examples. Among others, we consider standard errors modified from the approach of Newey (1987), Terza (2016), and bootstrapping. In our simulations Newey, Terza, bootstrap, and corrected 2-stage least squares (in the linear case) standard errors gave the best results in terms of coverage and type I error. In the real-data examples, the Newey standard errors were 0.5% and 2% larger than the unadjusted standard errors for the linear and logistic TSRI estimators, respectively. We show that TSRI estimators with modified standard errors have correct type I error under the null. Researchers should report TSRI estimates with modified standard errors instead of reporting unadjusted or heteroscedasticity-robust standard errors. PMID:29106476
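
    The estimator under discussion, two-stage residual inclusion with a bootstrapped standard error, can be sketched as follows: regress the exposure on the genetic instrument, include the first-stage residual in a logistic outcome model, and repeat both stages on bootstrap resamples. The data are simulated and the analytic Newey and Terza corrections compared in the paper are not shown.

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(8)
        n = 5000
        g = rng.binomial(2, 0.3, n)                    # genotype used as the instrument
        u = rng.normal(size=n)                         # unmeasured confounder
        exposure = 0.4 * g + u + rng.normal(size=n)
        outcome = rng.binomial(1, 1 / (1 + np.exp(-(-1 + 0.5 * exposure + u))))

        def tsri(idx):
            # Stage 1: exposure on instrument; keep the residual
            s1 = sm.OLS(exposure[idx], sm.add_constant(g[idx])).fit()
            # Stage 2: logistic outcome model including exposure and the first-stage residual
            X = sm.add_constant(np.column_stack([exposure[idx], s1.resid]))
            return sm.Logit(outcome[idx], X).fit(disp=0).params[1]

        estimate = tsri(np.arange(n))
        boot = [tsri(rng.choice(n, n, replace=True)) for _ in range(200)]
        print(f"TSRI log odds ratio = {estimate:.3f}, bootstrap SE = {np.std(boot, ddof=1):.3f}")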

  9. Impact of Financial Incentives for Prenatal Care on Birth Outcomes and Spending

    PubMed Central

    Rosenthal, Meredith B; Li, Zhonghe; Robertson, Audra D; Milstein, Arnold

    2009-01-01

    Objective To evaluate the impact of offering US$100 each to patients and their obstetricians or midwives for timely and comprehensive prenatal care on low birth weight, neonatal intensive care admissions, and total pediatric health care spending in the first year of life. Data Sources/Study Setting Claims and enrollment profiles of the predominantly low-income and Hispanic participants of a union-sponsored health insurance plan from 1998 to 2001. Study Design Panel data analysis of outcomes and spending for participants and nonparticipants using instrumental variables to account for selection bias. Data Collection/Abstraction Methods Data provided were analyzed using t-tests and chi-squared tests to compare maternal characteristics and birth outcomes for incentive program participants and nonparticipants, with and without instrumental variables to address selection bias. Adjusted variables were analyzed using logistic regression models. Principal Findings Participation in the incentive program was significantly associated with lower odds of neonatal intensive care unit admission (0.45; 95 percent CI, 0.23–0.88) and spending in the first year of life (estimated elasticity of −0.07; 95 percent CI, −0.12 to −0.01), but not low birth weight (0.53; 95 percent CI, 0.23–1.18). Conclusion The use of patient and physician incentives may be an effective mechanism for improving use of recommended prenatal care and associated outcomes, particularly among low-income women. PMID:19619248

  10. The effects of competition on premiums: using United Healthcare's 2015 entry into Affordable Care Act's marketplaces as an instrumental variable.

    PubMed

    Agirdas, Cagdas; Krebs, Robert J; Yano, Masato

    2018-01-08

    One goal of the Affordable Care Act is to increase insurance coverage by improving competition and lowering premiums. To facilitate this goal, the federal government enacted online marketplaces in the 395 rating areas spanning 34 states that chose not to establish their own state-run marketplaces. Multivariate regression studies analyzing the effects of competition on premiums are few, and they suffer from endogeneity due to simultaneity and omitted-variable biases. However, United Healthcare's decision to enter these marketplaces in 2015 provides the researcher with an opportunity to address this endogeneity problem. Exploiting the variation caused by United Healthcare's entry decision as an instrument for competition, we study the impact of competition on premiums during the first 2 years of these marketplaces. Combining panel data from five different sources and controlling for 12 variables, we find that one more insurer in a rating area leads to a 6.97% reduction in the second-lowest-priced silver plan premium, which is larger than the estimated effects in the existing literature. Furthermore, we run a threshold analysis and find that competition's effects on premiums become statistically insignificant if there are four or more insurers in a rating area. These findings are robust to alternative measures of premiums, inclusion of a non-linear term in the regression models, and a county-level analysis.

  11. Inflammatory Biomarkers and Risk of Schizophrenia: A 2-Sample Mendelian Randomization Study.

    PubMed

    Hartwig, Fernando Pires; Borges, Maria Carolina; Horta, Bernardo Lessa; Bowden, Jack; Davey Smith, George

    2017-12-01

    Positive associations between inflammatory biomarkers and risk of psychiatric disorders, including schizophrenia, have been reported in observational studies. However, conventional observational studies are prone to bias, such as reverse causation and residual confounding, thus limiting our understanding of the effect (if any) of inflammatory biomarkers on schizophrenia risk. To evaluate whether inflammatory biomarkers have an effect on the risk of developing schizophrenia. Two-sample Mendelian randomization study using genetic variants associated with inflammatory biomarkers as instrumental variables to improve inference. Summary association results from large consortia of candidate gene or genome-wide association studies, including several epidemiologic studies with different designs, were used. Gene-inflammatory biomarker associations were estimated in pooled samples ranging from 1645 to more than 80 000 individuals, while gene-schizophrenia associations were estimated in more than 30 000 cases and more than 45 000 ancestry-matched controls. In most studies included in the consortia, participants were of European ancestry, and the proportion of men was approximately 50%. All studies were conducted in adults, with a wide age range (18 to 80 years). Genetically elevated circulating levels of C-reactive protein (CRP), interleukin-1 receptor antagonist (IL-1Ra), and soluble interleukin-6 receptor (sIL-6R). Risk of developing schizophrenia. Individuals with schizophrenia or schizoaffective disorders were included as cases. Given that many studies contributed to the analyses, different diagnostic procedures were used. The pooled odds ratio estimate using 18 CRP genetic instruments was 0.90 (random-effects 95% CI, 0.84-0.97; P = .005) per 2-fold increment in CRP levels; consistent results were obtained using different Mendelian randomization methods and a more conservative set of instruments. The odds ratio for sIL-6R was 1.06 (95% CI, 1.01-1.12; P = .02) per 2-fold increment. Estimates for IL-1Ra were inconsistent among instruments, and pooled estimates were imprecise and centered on the null. Under Mendelian randomization assumptions, our findings suggest a protective effect of CRP and a risk-increasing effect of sIL-6R (potentially mediated at least in part by CRP) on schizophrenia risk. It is possible that such effects are a result of increased susceptibility to early life infection.
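
    The pooling of multiple genetic instruments in a two-sample Mendelian randomization analysis is often done with the inverse-variance-weighted (IVW) estimator sketched below; the per-variant summary statistics are made up, and this is only one of the several MR methods the authors compare.

        import numpy as np

        # Hypothetical per-variant summary statistics from two non-overlapping samples
        beta_exposure = np.array([0.12, 0.08, 0.15, 0.10])         # SNP effects on log biomarker level
        beta_outcome = np.array([-0.015, -0.010, -0.022, -0.012])  # SNP effects on log odds of schizophrenia
        se_outcome = np.array([0.006, 0.007, 0.009, 0.005])

        # Fixed-effect IVW estimate of the causal effect of the exposure on the outcome
        w = beta_exposure**2 / se_outcome**2
        beta_ivw = np.sum(w * beta_outcome / beta_exposure) / np.sum(w)
        se_ivw = np.sqrt(1 / np.sum(w))

        # Report as an odds ratio per 2-fold (log 2) increment in the exposure, as in the abstract
        lo, hi = (np.exp((beta_ivw + s * 1.96 * se_ivw) * np.log(2)) for s in (-1, 1))
        print(f"OR per doubling: {np.exp(beta_ivw * np.log(2)):.3f} (95% CI {lo:.3f}-{hi:.3f})")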

  12. The ceilometer inter-comparison campaign CeiLinEx2015

    NASA Astrophysics Data System (ADS)

    Mattis, Ina; Begbie, Robert; Boyouk, Neda; Bravo-Aranda, Juan Antonio; Brettle, Mike; Cermak, Jan; Drouin, Marc-Antoine; Geiß, Alexander; Görsdorf, Ulrich; Haefele, Alexander; Haeffelin, Martial; Hervo, Maxime; Komínková, Kateřina; Leinweber, Ronny; Müller, Gerhard; Münkel, Christoph; Pattantyús-Ábrahám, Margit; Pönitz, Kornelia; Wagner, Frank; Wiegner, Matthias

    2016-04-01

    Ceilometers are well-established instruments for the detection of cloud base heights. Since about 2000, modern ceilometer types (mainly CHM15k, CL51, and CL31) have also been used for investigations of the top height of the planetary boundary layer and for quantitative retrievals of vertical profiles of backscatter coefficients. In the framework of the European projects E-PROFILE and TOPROF, tools for the exchange of ceilometer data among European meteorological services, as well as calibration and visualization procedures, are being developed and established. Unfortunately, the national networks are equipped with instruments of different generations and from different manufacturers. In order to quantify and reduce those inhomogeneities, the ceilometer inter-comparison experiment CeiLinEx2015 was performed between June and September 2015 at the Meteorological Observatory Lindenberg, Germany. We tested six different instrument types: LD40, CL31, CL51, CHM15k, CHM15kx, and CS135. Each instrument type was represented by two instruments to estimate the instrument-to-instrument variability and the influence of different firmware versions. Two Raman lidars (RAMSES and RALPH), operated by the German Meteorological Service (DWD), are used as reference instruments. Further ancillary data are available, e.g., hourly eye observations, four radio soundings per day, and AERONET sun photometer observations. Besides the typical vertically pointing measurement scheme, we performed special measurements with a horizontal pointing direction or a blocked receiver telescope. During the experiment, we could collect measurements under very different meteorological situations: several clear nights allow for Rayleigh calibration (see Hervo and the CeiLinEx2015 team [EGU2016-4785]), and a strong Saharan dust intrusion event can be used to study the behaviour of the instruments in the presence of large, non-spherical particles (like volcanic ash). Further investigations focus, e.g., on the detection of very low cloud base heights, on the retrieval of boundary layer heights, on the performance of the individual instruments in the overlap region, on the characterization of signal artefacts in the clean free troposphere, on daytime performance, and on the estimation of measurement range. All those investigations need calibrated ceilometer profiles as input data. Therefore, quantitative calibration of all instruments is a very important first step of the data analysis. Hervo and the CeiLinEx2015 team [EGU2016-4785] present the procedure and results of the calibration of all individual instruments in a companion contribution. In this contribution, we provide a general overview of the CeiLinEx2015 experiment together with first results. We present preliminary results concerning signal artefacts in the free troposphere and correlation studies in more detail.

  13. Monitoring of Observation Errors in the Assimilation of Satellite Ozone Data

    NASA Technical Reports Server (NTRS)

    Stajner, Ivanka; Winslow, Nathan; Rood, Richard B.; Pawson, Steven

    2003-01-01

    The stratospheric ozone layer protects life on Earth from the harmful effects of solar ultraviolet radiation. The ozone layer is currently in a fragile state because of depletion caused by man-made chemicals, especially chlorofluorocarbons. The state of the ozone layer is being monitored and evaluated by scientific experts around the world, in order to help policy makers assess the impacts of international protocols that control the production and release of ozone-depleting chemicals. Scientists use a variety of ozone measurements and models in order to form a comprehensive picture of the current state of the ozone layer, and to predict its future behavior (expected to be a recovery, as the abundance of the depleting chemicals decreases). Among the data sets used, those from satellite-borne instruments have the advantage of providing a wealth of information about the ozone distribution over most of the globe. Several instruments onboard American and international satellites make measurements of the properties of the atmosphere, from which atmospheric ozone amounts are estimated; long-term measurement programs enable monitoring of trends in ozone. However, the characteristics of satellite instruments change in time. For example, the instrument lenses through which measurements are made may deteriorate over time, or the satellite orbit may drift so that measurements over each location are made later and later in the day. These changes may increase the errors in the retrieved ozone amounts and degrade the quality of estimated ozone amounts and of their variability. Our work focuses on combining the satellite ozone data with global models that capture atmospheric motion and ozone chemistry, using advanced statistical techniques: this is known as data assimilation. Our method provides a three-dimensional global ozone distribution that is consistent with both the satellite measurements and with our understanding of the processes (described in the models) that control the ozone distribution. Through the monitoring of statistical properties of the agreement between the data and the model, this approach also enables us to detect changes in the quality of ozone data retrieved from satellite-borne instrument measurements. This paper demonstrates that calculations of the changes in satellite data quality, and of the impact of these changes on the estimates of the global ozone distribution, can assist in maintaining the uniform quality of the satellite ozone data throughout the lifetime of these instruments, thus contributing to our understanding of long-term ozone change.

  14. OMI Satellite and Ground-Based Pandora Observations and Their Application to Surface NO2 Estimations at Terrestrial and Marine Sites

    NASA Astrophysics Data System (ADS)

    Kollonige, Debra E.; Thompson, Anne M.; Josipovic, Miroslav; Tzortziou, Maria; Beukes, Johan P.; Burger, Roelof; Martins, Douglas K.; van Zyl, Pieter G.; Vakkari, Ville; Laakso, Lauri

    2018-01-01

    The Pandora spectrometer that uses direct-Sun measurements to derive total column amounts of gases provides an approach for (1) validation of satellite instruments and (2) monitoring of total column (TC) ozone (O3) and nitrogen dioxide (NO2). We use for the first time Pandora and Ozone Monitoring Instrument (OMI) observations to estimate surface NO2 over marine and terrestrial sites downwind of urban pollution and compared with in situ measurements during campaigns in contrasting regions: (1) the South African Highveld (at Welgegund, 26°34'10″S, 26°56'21″E, 1,480 m asl, 120 km southwest of the Johannesburg-Pretoria megacity) and (2) shipboard U.S. mid-Atlantic coast during the 2014 Deposition of Atmospheric Nitrogen to Coastal Ecosystems (DANCE) cruise. In both cases, there were no local NOx sources but intermittent regional pollution influences. For TC NO2, OMI and Pandora difference is 20%, with Pandora higher most times. Surface NO2 values estimated from OMI and Pandora columns are compared to in situ NO2 for both locations. For Welgegund, the planetary boundary layer (PBL) height, used in converting column to surface NO2 value, has been estimated by three methods: co-located Atmospheric Infrared Sounder (AIRS) observations; a model simulation; and radiosonde data from Irene, 150 km northeast of the site. AIRS PBL heights agree within 10% of radiosonde-derived values. Absolute differences between Pandora- and OMI-estimated surface NO2 and the in situ data are better at the terrestrial site ( 0.5 ppbv and 1 ppbv or greater, respectively) than under clean marine air conditions, with differences usually >3 ppbv. Cloud cover and PBL variability influence these estimations.

  15. Five instruments for measuring tree height: an evaluation

    Treesearch

    Michael S. Williams; William A. Bechtold; V.J. LaBau

    1994-01-01

    Five instruments were tested for reliability in measuring tree heights under realistic conditions. Four linear models were used to determine if tree height can be measured unbiasedly over all tree sizes and if any of the instruments were more efficient in estimating tree height. The laser height finder was the only instrument to produce unbiased estimates of the true...

  16. New approaches to merging multi-sensor satellite measurements of volcanic SO2 emissions

    NASA Astrophysics Data System (ADS)

    Carn, S. A.; Telling, J. W.; Krotkov, N. A.

    2015-12-01

    As part of the NASA MEaSUREs program, we are developing a unique long-term database of volcanic sulfur dioxide (SO2) emissions for use by the scientific community, using observations from multiple satellite instruments collected since 1978. Challenges to creating such a database include assessing data continuity between multiple satellite missions and SO2 retrieval algorithms and estimating measurement uncertainties. Here, we describe the approaches that we are using to merge multi-decadal SO2 measurements from the ultraviolet (UV) Total Ozone Mapping Spectrometer (TOMS), Ozone Monitoring Instrument (OMI) and Ozone Monitoring and Profiler Suite (OMPS) sensors. A particular challenge has involved accounting for the OMI row anomaly (ORA), a data gap in OMI measurements since 2008 that partially or wholly obscures some volcanic eruption clouds, whilst still profiting from the high OMI spatial resolution and data quality, and prior OMI SO2 validation. We present a new method to substitute missing SO2 information in the ORA with near-coincident SO2 data from OMPS, providing improved estimates of eruptive volcanic SO2 emissions. The technique can also be used to assess consistency between different satellite instruments and SO2 retrieval algorithms, investigate the impact of variable sensor spatial resolution, and estimate measurement uncertainties. It is particularly effective for larger eruptions producing extensive SO2 clouds where the ORA obscures the volcanic plume in multiple contiguous orbits. Application of the technique is demonstrated using recent volcanic eruptions including the 2015 eruption of Calbuco, Chile. We also provide an update on the status of the multi-satellite long-term volcanic SO2 database (MSVOLSO2L4).

  17. Rapid Communication: Effect of machine, anatomical location, and replication on instrumental color of boneless pork loins.

    PubMed

    Barkley, K E; Fields, B; Dilger, A C; Boler, D D

    2018-06-07

    The objective was to determine the effect of machine, anatomical location and replication (multiple readings) on instrumental color and to characterize the amount of variation each factor contributed to overall color. Instrumental color was measured 3 times on the anterior and 3 times on the posterior end of 250 pork loins with 2 different Minolta CR-400 Chroma meter devices. Each Minolta was programmed to use a D65 illuminant and a 2° observer with an 8 mm aperture, and was calibrated with white tiles specific to each machine. Therefore, a total of 12 instrumental color measurements were collected on each loin. The VARCOMP procedure in SAS was used to estimate the proportion of variation contributed by each factor to CIE L*, a*, b*, chroma and hue. Based on previous research, the average untrained consumer is able to distinguish differences of 3 L* units, 0.4 a* units, and 0.9 hue angle units. Loins evaluated with machine 1 were 0.71 L* units darker (P < 0.01), 1.09 b* units more yellow (P < 0.01), 0.47 chroma units more saturated (P < 0.01), and had a hue angle 5.12 units greater (P < 0.01) than when evaluated with machine 2, but did not differ (P = 0.24) in redness. The anterior portion of the loin was lighter, less red, more yellow, more saturated and had a greater hue angle than the posterior end (P < 0.01). All color trait values decreased (P < 0.01) as replication number increased. Inherent color differences among loins contributed the greatest proportion of variability for lightness (58%), redness (57%), yellowness (70%), saturation (70%) and hue angle (49%). Machine contributed 1% of the variability in lightness, 3% in saturation, 23% in yellowness, and 31% in hue angle, but did not contribute to the variability in redness. Anatomical location contributed 41% to lightness, 43% to redness, 7% to yellowness, 27% to saturation and 31% to hue angle. Replication did not contribute to total variation for any color traits, even though values did differ among measurements. Overall, there were differences in instrumental color values between the two machines tested, but those differences were likely less than the threshold for detection by a consumer. Even so, inherent color differences between loins were a greater contributor to total variability than the differences between the 2 machines. Therefore, when performing color evaluations with a Minolta CR-400, it is more important to define the location of measurements than the replication or the machine, assuming the settings are the same.
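
    The variance-partitioning idea behind the VARCOMP analysis can be illustrated with a simplified one-factor (loin-to-loin) version using the classical ANOVA method of moments; the full analysis also includes machine, location and replication as factors, and the numbers below are simulated.

        import numpy as np

        rng = np.random.default_rng(9)
        n_loins, n_reps = 250, 12
        loin_effect = rng.normal(0, 2.0, n_loins)               # inherent loin-to-loin differences in L*
        lstar = 48 + loin_effect[:, None] + rng.normal(0, 1.0, (n_loins, n_reps))

        # Balanced one-way random-effects ANOVA estimates of the variance components
        grand_mean = lstar.mean()
        ms_between = n_reps * np.sum((lstar.mean(axis=1) - grand_mean) ** 2) / (n_loins - 1)
        ms_within = np.sum((lstar - lstar.mean(axis=1, keepdims=True)) ** 2) / (n_loins * (n_reps - 1))

        var_between = (ms_between - ms_within) / n_reps          # loin component
        var_within = ms_within                                   # everything else (machine, location, replication, error)
        print(f"loin-to-loin share of total L* variance: {var_between / (var_between + var_within):.0%}")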

  18. Bribery or just desserts? Evidence on the influence of Congressional reproductive policy voting patterns on PAC contributions from exogenous variation in the sex mix of legislator offspring.

    PubMed

    Conley, Dalton; McCabe, Brian J

    2012-01-01

    Evidence on the relationship between political contributions and legislators' voting behavior is marred by concerns about endogeneity in the estimation process. Using a legislator's offspring sex mix as a truly exogenous variable, we employ an instrumental variable estimation procedure to predict the effect of voting behavior on political contributions. Following previous research, we find that a legislator's proportion of daughters has a significant effect on voting behavior on women's issues, as measured by the score in the "Congressional Record on Choice" issued by NARAL Pro-Choice America. In the second stage, we make a unique contribution by demonstrating a significant impact of exogenous voting behavior on PAC contributions, lending further credibility to the hypothesis that Political Action Committees respond to legislators' voting patterns by "rewarding" political candidates that vote in line with the positions of the PAC, rather than affecting those same votes, at least in this high-profile policy domain. Copyright © 2011 Elsevier Inc. All rights reserved.

  19. Comparing the Potential of the Sentinel-2 MSI and the Future EnMAP HSI for the Retrieval of Winter Wheat Crop Parameters in Southern Germany

    NASA Astrophysics Data System (ADS)

    Danner, Martin; Hank, Tobias; Mauser, Wolfram

    2016-08-01

    This study tests the effect of improved spectral resolution on different approaches for the estimation of crop biophysical variables of winter wheat in Southern Germany by comparing the existing Sentinel-2 MSI with the future EnMAP HSI. The experiment is based on simulated sensor data of both Sentinel-2 and EnMAP, with their individual spectral configurations and radiometric properties taken into account. An advanced multispectral setup, such as that provided by Sentinel-2, proved to enable reasonable estimation of biophysical variables when machine learning algorithms are applied. The augmented information content inherent in hyperspectral signatures, however, marks an advantage for the creation of novel narrow-band indices (RMSE improvement of 10.0%) and for the inversion of canopy reflectance models like PROSAIL independently of in-situ data (RMSE improvement of 18.7%). Given the notable advantages of Sentinel-2, i.e. higher revisit rates and better spectral resolution, new synergies are expected to arise once both instruments are operating in parallel.

  20. Quantifying VOC emissions from East Asia using 10 years of satellite observations

    NASA Astrophysics Data System (ADS)

    Stavrakou, T.; Muller, J. F.; Bauwens, M.; De Smedt, I.; Van Roozendael, M.; Boersma, F.; van der A, R. J.; Pierre-Francois, C.; Clerbaux, C.

    2016-12-01

    China's emissions are in the spotlight of efforts to mitigate climate change and improve regional and city-scale air quality. Despite growing efforts to better quantify China's emissions, the current estimates are often poor or inadequate. Complementary to bottom-up inventories, inverse modeling of fluxes has the potential to improve those estimates through the use of atmospheric observations of trace gas compounds. As formaldehyde (HCHO) is a high-yield product in the oxidation of most volatile organic compounds (VOCs) emitted by anthropogenic and natural sources, satellite observations of HCHO hold the potential to inform us about the spatial and temporal variability of the underlying VOC sources. The 10-year record of space-based HCHO column observations from the OMI instrument is used to constrain VOC emission fluxes in East Asia in a source inversion framework built on the IMAGES chemistry-transport model and its adjoint. The interannual and seasonal variability, spatial distribution, and potential trends of the top-down VOC fluxes (anthropogenic, pyrogenic and biogenic) are presented and compared against existing emission inventories, satellite observations of other species (e.g. glyoxal and nitrogen oxides), and past studies.

  1. Does social trust increase willingness to pay taxes to improve public healthcare? Cross-sectional cross-country instrumental variable analysis.

    PubMed

    Habibov, Nazim; Cheung, Alex; Auchynnikava, Alena

    2017-09-01

    The purpose of this paper is to investigate the effect of social trust on the willingness to pay more taxes to improve public healthcare in post-communist countries. The well-documented association between higher levels of social trust and better health has traditionally been assumed to reflect the notion that social trust is positively associated with support for the public healthcare system through its encouragement of cooperative behaviour, social cohesion, social solidarity, and collective action. Hence, in this paper, we have explicitly tested the notion that social trust contributes to an increase in willingness to financially support public healthcare. We use micro data from the 2010 Life-in-Transition survey (N = 29,526). Classic binomial probit and instrumental-variable (ivprobit) regressions are estimated to model the relationship between social trust and paying more taxes to improve public healthcare. We found that an increase in social trust is associated with a greater willingness to pay more taxes to improve public healthcare. From the perspective of policy-making, healthcare administrators, policy-makers, and international donors should be aware that social trust is an important factor in determining the willingness of the population to provide much-needed financial resources to support public healthcare. From a theoretical perspective, we found that estimating the effect of trust on support for healthcare without taking confounding and measurement error problems into consideration will likely lead to an underestimation of the true effect of trust. Copyright © 2017 Elsevier Ltd. All rights reserved.

  2. Measuring Food Brand Awareness in Australian Children: Development and Validation of a New Instrument.

    PubMed

    Turner, Laura; Kelly, Bridget; Boyland, Emma; Bauman, Adrian E

    2015-01-01

    Children's exposure to food marketing is one environmental determinant of childhood obesity. Measuring the extent to which children are aware of food brands may be one way to estimate relative prior exposure to food marketing. This study aimed to develop and validate an Australian Brand Awareness Instrument (ABAI) to estimate children's food brand awareness. The ABAI incorporated 30 flashcards depicting food/drink logos and their corresponding products. An abbreviated version was also created using 12 flashcards (ABAI-a). The ABAI was presented to 60 primary-school-aged children (7-11 yrs) attending two Australian after-school centres. A week later, the full version was repeated on approximately half the sample (n=27) and the abbreviated version was presented to the remaining half (n=30). The test-retest reliability of the ABAI was analysed using intra-class correlation coefficients. The concordance of the ABAI-a and the full version was assessed using Bland-Altman plots. The 'nomological' validity of the full tool was investigated by comparing children's brand awareness with food-marketing-related variables (e.g. television habits, intake of heavily promoted foods). Brand awareness increased with age (p<0.01) but was not significantly correlated with the other variables. Bland-Altman analyses showed good agreement between the ABAI and ABAI-a. Reliability analyses revealed excellent agreement between the two administrations of the full ABAI. The ABAI was able to differentiate children's varying levels of brand awareness. It was shown to be a valid and reliable tool and may allow quantification of brand awareness as a proxy measure for children's prior food marketing exposure.
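
    The Bland-Altman agreement check used to compare the full and abbreviated instruments reduces to a mean difference and 95% limits of agreement; a minimal version with made-up scores, compared on a common proportion-correct scale, is sketched below.

        import numpy as np

        rng = np.random.default_rng(10)
        full_score = rng.integers(5, 30, size=30).astype(float)        # ABAI scores out of 30 brands
        abbreviated = full_score * 12 / 30 + rng.normal(0, 1.0, 30)    # ABAI-a scores out of 12 brands

        # Compare on a common scale (proportion of brands recognised)
        diff = full_score / 30 - abbreviated / 12
        bias = diff.mean()
        loa = 1.96 * diff.std(ddof=1)
        print(f"mean difference = {bias:.3f}, 95% limits of agreement = [{bias - loa:.3f}, {bias + loa:.3f}]")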

  3. Cross-Cultural adaptation of the General Functioning Scale of the Family

    PubMed Central

    Pires, Thiago; de Assis, Simone Gonçalves; Avanci, Joviana Quintes; Pesce, Renata Pires

    2016-01-01

    OBJECTIVE To describe the process of cross-cultural adaptation of the General Functioning Scale of the Family, a subscale of the McMaster Family Assessment Device, for the Brazilian population. METHODS The General Functioning Scale of the Family was translated into Portuguese and administered to 500 guardians of children in the second grade of elementary school in public schools of São Gonçalo, Rio de Janeiro, Southeastern Brazil. The types of equivalence investigated were conceptual, item, semantic, operational, and measurement equivalence. The study involved discussions with experts, translations and back-translations of the instrument, and psychometric assessment. Reliability and validity studies were carried out using internal consistency testing (Cronbach's alpha), the Guttman split-half model, Pearson correlation coefficients, and confirmatory factor analysis. Associations between the general functioning of the family and variables theoretically associated with the theme (father's or mother's drunkenness and violence between parents) were estimated by odds ratios. RESULTS Semantic equivalence was between 90.0% and 100%. Cronbach's alpha ranged from 0.79 to 0.81, indicating good internal consistency of the instrument. Pearson correlation coefficients ranged between 0.303 and 0.549. A statistical association was found between the general functioning of the family score and the theoretically related variables, and the confirmatory factor analysis model showed good fit. CONCLUSIONS The results indicate the feasibility of administering the instrument to the Brazilian population, as it is easy to understand and provides a good measurement of the construct of interest. PMID:27355464
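
    Cronbach's alpha, the main internal-consistency statistic reported here, can be computed directly from a respondent-by-item score matrix, as in the sketch below (simulated items, not the General Functioning Scale data).

        import numpy as np

        def cronbach_alpha(items: np.ndarray) -> float:
            """items: respondents x items matrix of scores."""
            k = items.shape[1]
            item_vars = items.var(axis=0, ddof=1)
            total_var = items.sum(axis=1).var(ddof=1)
            return k / (k - 1) * (1 - item_vars.sum() / total_var)

        rng = np.random.default_rng(11)
        latent = rng.normal(size=500)                    # shared "family functioning" factor
        items = np.column_stack([latent + rng.normal(0, 1, 500) for _ in range(12)])
        print(f"Cronbach's alpha: {cronbach_alpha(items):.2f}")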

  4. Let it snow! Let it snow! Let it snow! Estimating cocaine production using a novel dataset based on reported seizures of laboratories in Colombia.

    PubMed

    Leoncini, Riccardo; Rentocchini, Francesco

    2012-11-01

    Data on the cocaine market appear inconsistent, as they tend to show declining prices vis-à-vis steady or increasing demand and a declining supply. This paper proposes an explanation for this trend by providing evidence of an under-estimation of the supply of cocaine. We propose a conservative estimate of cocaine production in Colombia for 2008, using data based on all reported seizures of 328 laboratories made by the counteracting organisations operating within Colombian territory. Our conservative estimate of 935 tons from the seized laboratories is at least twice the 295-450 tons declared in official statistics. We are careful to keep all variables at their minimum boundary values. Our methodology could prove to be a useful tool, especially if used in parallel with the standard tools. Moreover, its characteristics (affordability, ease of use and potential for worldwide adoption) make it a powerful instrument to counteract cocaine production. Copyright © 2012 Elsevier B.V. All rights reserved.

  5. Life satisfaction, QALYs, and the monetary value of health.

    PubMed

    Huang, Li; Frijters, Paul; Dalziel, Kim; Clarke, Philip

    2018-06-18

    The monetary value of a quality-adjusted life-year (QALY) is frequently used to assess the benefits of health interventions and inform funding decisions. However, there is little consensus on methods for the estimation of this monetary value. In this study, we use life satisfaction as an indicator of 'experienced utility', and estimate the dollar-equivalent value of a QALY using a fixed-effect model with instrumental variable estimators. Using a nationally representative longitudinal survey including 28,347 individuals followed during 2002-2015 in Australia, we estimate that individuals' willingness to pay for one QALY is approximately A$42,000-A$67,000, and the willingness to pay for not having a long-term condition is approximately A$2,000 per year. As the estimates are derived using population-level data and a wellbeing measure (life satisfaction), the approach has the advantage of being socially inclusive and recognizes the significant meaning of people's subjective valuations of health. The method could be particularly useful for nations where QALY thresholds are not yet validated or established. Copyright © 2018 The Authors. Published by Elsevier Ltd. All rights reserved.
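
    The abstract does not spell out the estimation details, but the general mechanics of an instrumental variable (two-stage least squares) regression of a life-satisfaction outcome, and of converting its coefficients into a monetary value via the ratio of the health and income coefficients, can be sketched as follows. All data, variable names, and coefficient values are simulated assumptions; the paper's fixed-effects panel specification and its actual instrument are not reproduced here.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n = 5000

    # Simulated data (purely illustrative): z is an instrument for income,
    # health is a 0/1 indicator of not having a long-term condition.
    z = rng.normal(size=n)
    u = rng.normal(size=n)                      # unobserved confounder
    income = 1.0 * z + 0.8 * u + rng.normal(size=n)
    health = (rng.normal(size=n) > 0).astype(float)
    life_sat = 0.5 * income + 1.5 * health - 0.8 * u + rng.normal(size=n)

    # Stage 1: project the endogenous regressor (income) on instrument + exogenous vars.
    X_exog = np.column_stack([np.ones(n), health])
    Z = np.column_stack([X_exog, z])
    b1, *_ = np.linalg.lstsq(Z, income, rcond=None)
    income_hat = Z @ b1

    # Stage 2: regress life satisfaction on fitted income + exogenous vars.
    X2 = np.column_stack([X_exog, income_hat])
    b2, *_ = np.linalg.lstsq(X2, life_sat, rcond=None)
    beta_health, beta_income = b2[1], b2[2]

    # Implied monetary value of the health state: marginal rate of substitution
    # between health and income in the life-satisfaction equation.
    print(f"beta_income = {beta_income:.2f}, beta_health = {beta_health:.2f}")
    print(f"implied value of health = {beta_health / beta_income:.2f} (income units)")
    ```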

  6. Applying Propensity Score Methods in Medical Research: Pitfalls and Prospects

    PubMed Central

    Luo, Zhehui; Gardiner, Joseph C.; Bradley, Cathy J.

    2012-01-01

    The authors review experimental and nonexperimental causal inference methods, focusing on assumptions for the validity of instrumental variables and propensity score (PS) methods. They provide guidance in four areas for the analysis and reporting of PS methods in medical research and selectively evaluate mainstream medical journal articles from 2000 to 2005 in the four areas, namely, examination of balance, overlapping support description, use of estimated PS for evaluation of treatment effect, and sensitivity analyses. In spite of the many pitfalls, when appropriately evaluated and applied, PS methods can be powerful tools in assessing average treatment effects in observational studies. Appropriate PS applications can create experimental conditions using observational data when randomized controlled trials are not feasible and, thus, lead researchers to an efficient estimator of the average treatment effect. PMID:20442340
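
    One of the four reporting areas listed above, examination of balance, can be illustrated with a short sketch: estimate a propensity score, form inverse-probability-of-treatment weights, and compare standardized mean differences of covariates before and after weighting. The covariates, sample size, and treatment model below are simulated assumptions, not data from the reviewed articles.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(2)
    n = 2000
    x1 = rng.normal(size=n)                    # e.g. age (standardised)
    x2 = rng.binomial(1, 0.4, size=n)          # e.g. comorbidity indicator
    p_treat = 1 / (1 + np.exp(-(0.8 * x1 + 1.0 * x2 - 0.5)))
    treat = rng.binomial(1, p_treat)

    # Estimate the propensity score and form inverse-probability-of-treatment weights.
    X = np.column_stack([x1, x2])
    ps = LogisticRegression().fit(X, treat).predict_proba(X)[:, 1]
    w = np.where(treat == 1, 1 / ps, 1 / (1 - ps))

    def smd(x, t, w=None):
        """Standardised mean difference of covariate x between treated and controls."""
        w = np.ones_like(x) if w is None else w
        m1 = np.average(x[t == 1], weights=w[t == 1])
        m0 = np.average(x[t == 0], weights=w[t == 0])
        s = np.sqrt((x[t == 1].var(ddof=1) + x[t == 0].var(ddof=1)) / 2)
        return (m1 - m0) / s

    for name, x in [("x1", x1), ("x2", x2.astype(float))]:
        print(f"{name}: SMD before = {smd(x, treat):+.3f}, after weighting = {smd(x, treat, w):+.3f}")
    ```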

  7. Information security of power enterprises of North-Arctic region

    NASA Astrophysics Data System (ADS)

    Sushko, O. P.

    2018-05-01

    The role of information technologies in providing technological security for energy enterprises is a component of the economic security of the northern Arctic region as a whole. Applying instruments and methods of information protection modelling to the business processes of energy enterprises in the northern Arctic region (such as Arkhenergo and Komienergo), the authors analysed and identified the most frequent information security risks. Using the analytic hierarchy process with weighting-factor estimation, the information risks of the enterprises' technological processes were ranked. The economic assessment of information security within an energy enterprise considers variables (risks) adjusted by these weighting factors. Investments in the information security systems of energy enterprises in the northern Arctic region relate to the installation of the necessary security elements; the current operating expenses of business-process protection systems represent materialized economic damage.

  8. Property Taxes and Elderly Mobility

    PubMed Central

    Shan, Hui

    2009-01-01

    The 2000–05 housing market boom in the U.S. has caused sharp increases in residential property taxes. Housing-rich but income-poor elderly homeowners often complain about rising tax burdens, and anecdotal evidence suggests that some move to reduce their tax burden. There has been little systematic analysis, however, of the link between property tax levels and the mobility rate of elderly homeowners. This paper investigates this link using household-level panel data from the Health and Retirement Study (HRS) and a newly collected data set on state-provided property tax relief programs. These relief programs generate variation in effective property tax burdens that is not due solely to arguably endogenous local community choices about taxes and expenditure programs. The findings provide evidence suggesting that higher property taxes raise mobility among elderly homeowners. The point estimates from instrumental variable estimation using relief programs to generate instruments suggest that a $100 increase in annual property taxes is associated with a 0.73 percentage point increase in the two-year mobility rate for homeowners over the age of 50. This is an eight percent increase from the baseline two-year mobility rate of nine percent. These results are robust to alternative specifications. PMID:20161617

  9. Employment insecurity and employees' health in Denmark.

    PubMed

    Cottini, Elena; Ghinetti, Paolo

    2018-02-01

    We use register data for Denmark (IDA) merged with the Danish Work Environment Cohort Survey (1995, 2000, and 2005) to estimate the effect of perceived employment insecurity on perceived health for a sample of Danish employees. We consider two health measures from the SF-36 Health Survey Instrument: a vitality scale for general well-being and a mental health scale. We first analyse a summary measure of employment insecurity. Instrumental variables-fixed effects estimates that use firm workforce changes as a source of exogenous variation show that one additional dimension of insecurity causes a shift from the median to the 25th percentile in the mental health scale and to the 30th in that of energy/vitality. It also increases the probability of developing severe mental health problems by about 6 percentage points. Looking at single insecurity dimensions by naïve fixed effects, uncertainty associated with the current job is important for mental health. Employability has a sizeable relationship with health and is the only insecurity dimension that matters for the energy and vitality scale. Danish employees who fear involuntary firm-internal mobility experience worse mental health. Copyright © 2017 John Wiley & Sons, Ltd.

  10. Psychometric instrumentation: reliability and validity of instruments used for clinical practice, evidence-based practice projects and research studies.

    PubMed

    Mayo, Ann M

    2015-01-01

    It is important for CNSs and other APNs to consider the reliability and validity of instruments chosen for clinical practice, evidence-based practice projects, or research studies. Psychometric testing uses specific research methods to evaluate the amount of error associated with any particular instrument. Reliability estimates explain more about how well the instrument is designed, whereas validity estimates explain more about scores that are produced by the instrument. An instrument may be architecturally sound overall (reliable), but the same instrument may not be valid. For example, if a specific group does not understand certain well-constructed items, then the instrument does not produce valid scores when used with that group. Many instrument developers may conduct reliability testing only once, yet continue validity testing in different populations over many years. All CNSs should be advocating for the use of reliable instruments that produce valid results. Clinical nurse specialists may find themselves in situations where reliability and validity estimates for some instruments that are being utilized are unknown. In such cases, CNSs should engage key stakeholders to sponsor nursing researchers to pursue this most important work.

  11. The Changing Nature of Drought Risk in South-east Australia Over the Past Two Millennia

    NASA Astrophysics Data System (ADS)

    Kiem, A.; Ho, M. W.; Verdon-Kidd, D.

    2015-12-01

    The Murray-Darling Basin (MDB) is one of the most important food and fibre regions in Australia, producing one-third of the national food supply and exporting produce to many other countries. In total, the Basin contains about 40% of Australia's farms and 70% of Australia's irrigated land area. However, the MDB is also one of the most spatially and temporally variable river systems in the world, with severe droughts a regular occurrence over the ~100 years of instrumental record and decadal-scale droughts (e.g. "Federation" (~1895-1902), "World War II" (~1937-1945) and "Millennium" or "Big Dry" (~1997-2010) droughts) matched by flood-dominated epochs (e.g. 1950s, 1970s). The accurate estimation of drought risk in the MDB is hampered by relatively short instrumental records and also by the complexity of the region's climate teleconnections with several large-scale ocean-atmospheric processes in the Pacific (El Niño Southern Oscillation, Interdecadal Pacific Oscillation), the Indian (Indian Ocean Dipole) and Southern Oceans (Southern Annular Mode). Climate-sensitive paleoclimate records provide an opportunity to resolve hydroclimatic variability over long time periods prior to the availability of instrumental records and therefore offer the potential for improved quantification of risks associated with hydroclimatic extremes. However, the MDB, as with many regions in Australia, currently lacks the suitable in situ proxies necessary to do this. Therefore, remote paleoclimate rainfall proxies in the Australasian region are used to develop new reconstructions of MDB rainfall over the Common Era (CE) (i.e. approximately the past 2000 years). The nature of MDB dry epochs from 749 BCE to 1981 CE is then compared with the frequency and duration of droughts recorded in instrumental records (i.e. approximately the past 100 years). Importantly, the results show that the probability of decadal-scale droughts is three times greater than instrumental records suggest.

  12. Above ground biomass and tree species richness estimation with airborne lidar in tropical Ghana forests

    NASA Astrophysics Data System (ADS)

    Vaglio Laurin, Gaia; Puletti, Nicola; Chen, Qi; Corona, Piermaria; Papale, Dario; Valentini, Riccardo

    2016-10-01

    Estimates of forest aboveground biomass are fundamental for carbon monitoring and accounting; delivering information at very high spatial resolution is especially valuable for local management, conservation and selective logging purposes. In tropical areas, hosting large biomass and biodiversity resources which are often threatened by unsustainable anthropogenic pressures, frequent forest resources monitoring is needed. Lidar is a powerful tool to estimate aboveground biomass at fine resolution; however, its application in tropical forests has been limited, with high variability in the accuracy of results. Lidar pulses scan the forest vertical profile and can provide structure information which is also linked to biodiversity. In the last decade the remote sensing of biodiversity has received great attention, but few studies have focused on the use of lidar for assessing tree species richness in tropical forests. This research aims at estimating aboveground biomass and tree species richness using discrete return airborne lidar in Ghana forests. We tested an advanced statistical technique, Multivariate Adaptive Regression Splines (MARS), which does not require assumptions on data distribution or on the relationships between variables, making it suitable for studying ecological variables. We compared the MARS regression results with those obtained by multilinear regression and found that both algorithms were effective, but MARS provided higher accuracy both for biomass (R2 = 0.72) and species richness (R2 = 0.64). We also noted a strong correlation between biodiversity and biomass field values. Even if the forest areas under analysis are limited in extent and represent peculiar ecosystems, the preliminary indications produced by our study suggest that instruments such as lidar, specifically useful for characterizing forest structure, can also be exploited as a support for tree species richness assessment.

  13. The Effect of State Regulatory Stringency on Nursing Home Quality

    PubMed Central

    Mukamel, Dana B; Weimer, David L; Harrington, Charlene; Spector, William D; Ladd, Heather; Li, Yue

    2012-01-01

    Objective To test the hypothesis that more stringent quality regulations contribute to better quality nursing home care and to assess their cost-effectiveness. Data Sources/Setting Primary and secondary data from all states and U.S. nursing homes between 2005 and 2006. Study Design We estimated seven models, regressing quality measures on the Harrington Regulation Stringency Index and control variables. To account for endogeneity between regulation and quality, we used instrumental variables techniques. Quality was measured by staffing hours by type per case-mix adjusted day, hotel expenditures, and risk-adjusted decline in activities of daily living, high-risk pressure sores, and urinary incontinence. Data Collection All states' licensing and certification offices were surveyed to obtain data about deficiencies. Secondary data included the Minimum Data Set, Medicare Cost Reports, and the Economic Freedom Index. Principal Findings Regulatory stringency was significantly associated with better quality for four of the seven measures studied. The cost-effectiveness for the activities-of-daily-living measure was estimated at about $72,000 (in 2011 dollars) per quality-adjusted life-year. Conclusions Quality regulations lead to better quality in nursing homes along some dimensions, but not all. Our estimates of cost-effectiveness suggest that increased regulatory stringency is in the ballpark of other acceptable cost-effective practices. PMID:22946859

  14. Influence of Strategy of Learning and Achievement Motivation of Learning Achievement Class VIII Students of State Junior High School in District Blitar

    ERIC Educational Resources Information Center

    Ayundawati, Dyah; Setyosari, Punaji; Susilo, Herawati; Sihkabuden

    2016-01-01

    This study aims to determine the influence of problem-based learning strategies and achievement motivation on learning achievement. The method used in this research is quantitative. The instruments used in this study are twofold: instruments to measure the moderator variable (achievement motivation) and instruments to measure the dependent variable (the…

  15. Neural correlates of the divergence of instrumental probability distributions.

    PubMed

    Liljeholm, Mimi; Wang, Shuo; Zhang, June; O'Doherty, John P

    2013-07-24

    Flexible action selection requires knowledge about how alternative actions impact the environment: a "cognitive map" of instrumental contingencies. Reinforcement learning theories formalize this map as a set of stochastic relationships between actions and states, such that for any given action considered in a current state, a probability distribution is specified over possible outcome states. Here, we show that activity in the human inferior parietal lobule correlates with the divergence of such outcome distributions-a measure that reflects whether discrimination between alternative actions increases the controllability of the future-and, further, that this effect is dissociable from those of other information theoretic and motivational variables, such as outcome entropy, action values, and outcome utilities. Our results suggest that, although ultimately combined with reward estimates to generate action values, outcome probability distributions associated with alternative actions may be contrasted independently of valence computations, to narrow the scope of the action selection problem.
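
    The abstract does not state which divergence measure was used, so the sketch below uses the Jensen-Shannon divergence between the outcome distributions of two hypothetical actions purely to illustrate the quantity being discussed: when alternative actions lead to very different outcome distributions the divergence is large (choosing matters), and when they lead to nearly identical distributions it is close to zero. The outcome probabilities are made-up values.

    ```python
    import numpy as np

    def kl(p, q):
        """Kullback-Leibler divergence D(p || q) for discrete distributions."""
        p, q = np.asarray(p, float), np.asarray(q, float)
        mask = p > 0
        return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

    def js_divergence(p, q):
        """Jensen-Shannon divergence: a symmetric measure of how much two
        action-conditional outcome distributions differ."""
        m = 0.5 * (np.asarray(p, float) + np.asarray(q, float))
        return 0.5 * kl(p, m) + 0.5 * kl(q, m)

    # Hypothetical outcome distributions over 4 states for alternative actions.
    p_action_a = [0.70, 0.10, 0.10, 0.10]
    p_action_b = [0.10, 0.10, 0.10, 0.70]
    p_action_c = [0.68, 0.12, 0.10, 0.10]   # nearly the same consequences as action A

    print(f"JS(A, B) = {js_divergence(p_action_a, p_action_b):.3f}  # choosing matters")
    print(f"JS(A, C) = {js_divergence(p_action_a, p_action_c):.3f}  # choice barely changes outcomes")
    ```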

  16. Initial Scientific Assessment of the EOS Data and Information System (EOSDIS)

    NASA Technical Reports Server (NTRS)

    1989-01-01

    Crucial to the success of the Earth Observing System (Eos) is the Eos Data and Information System (EosDIS). The goals of Eos depend not only on its instruments and science investigations, but also on how well EosDIS helps scientists integrate reliable, large-scale data sets of geophysical and biological measurements made from Eos data, and on how successfully Eos scientists interact with other investigations in Earth System Science. Current progress in the use of remote sensing for science is hampered by requirements that the scientist understand in detail the instrument, the electromagnetic properties of the surface, and a suite of arcane tape formats, and by the immaturity of some of the techniques for estimating geophysical and biological variables from remote sensing data. These shortcomings must be transcended if remote sensing data are to be used by a much wider population of scientists who study environmental change at regional and global scales.

  17. The Glacier and Land Ice Surface Topography Interferometer (GLISTIN): A Novel Ka-band Digitally Beamformed Interferometer

    NASA Technical Reports Server (NTRS)

    Moller, Delwyn K.; Heavey, Brandon; Hodges, Richard; Rengarajan, Sembiam; Rignot, Eric; Rogez, Francois; Sadowy, Gregory; Simard, Marc; Zawadzki, Mark

    2006-01-01

    The estimation of the mass balance of ice sheets and glaciers on Earth is a problem of considerable scientific and societal importance. A key measurement to understanding, monitoring and forecasting these changes is ice-surface topography, both for ice-sheet and glacial regions. As such, NASA identified 'ice topographic mapping instruments capable of providing precise elevation and detailed imagery data for measurements on glacial scales for detailed monitoring of ice sheet, and glacier changes' as a science priority for the most recent Instrument Incubator Program (IIP) opportunities. Funded under this opportunity is the technological development for a Ka-Band (35GHz) single-pass digitally beamformed interferometric synthetic aperture radar (InSAR). Unique to this concept is the ability to map a significant swath impervious to cloud cover with measurement accuracies comparable to laser altimeters but with variable resolution as appropriate to the differing scales of interest over ice sheets and glaciers.

  18. Development of an electronic nose for environmental odour monitoring.

    PubMed

    Dentoni, Licinia; Capelli, Laura; Sironi, Selena; Del Rosso, Renato; Zanetti, Sonia; Della Torre, Matteo

    2012-10-25

    Exhaustive odour impact assessment should involve the evaluation of the impact of odours directly on citizens. For this purpose it might be useful to have an instrument capable of continuously monitoring ambient air quality, detecting the presence of odours and also recognizing their provenance. This paper discusses the laboratory and field tests conducted in order to evaluate the performance of a new electronic nose, specifically developed for monitoring environmental odours. The laboratory tests proved that the instrument was able to discriminate between the different pure substances being tested, and to estimate the odour concentrations with correlation indices (R2) of 0.99 and errors below 15%. Finally, the experimental monitoring tests conducted in the field allowed us to verify the effectiveness of this electronic nose for the continuous detection of odours in ambient air, proving its stability under variable atmospheric conditions and its capability to detect odour peaks.

  19. Neural Correlates of the Divergence of Instrumental Probability Distributions

    PubMed Central

    Wang, Shuo; Zhang, June; O'Doherty, John P.

    2013-01-01

    Flexible action selection requires knowledge about how alternative actions impact the environment: a “cognitive map” of instrumental contingencies. Reinforcement learning theories formalize this map as a set of stochastic relationships between actions and states, such that for any given action considered in a current state, a probability distribution is specified over possible outcome states. Here, we show that activity in the human inferior parietal lobule correlates with the divergence of such outcome distributions–a measure that reflects whether discrimination between alternative actions increases the controllability of the future–and, further, that this effect is dissociable from those of other information theoretic and motivational variables, such as outcome entropy, action values, and outcome utilities. Our results suggest that, although ultimately combined with reward estimates to generate action values, outcome probability distributions associated with alternative actions may be contrasted independently of valence computations, to narrow the scope of the action selection problem. PMID:23884955

  20. Modelling a solar flare from X-ray, UV, and radio observations

    NASA Astrophysics Data System (ADS)

    Chiuderi Drago, F.; Monsignori Fossi, B. C.

    1991-03-01

    A slowly evolving, flaring loop was observed by the UVSP, XRP, and HXIS instruments onboard SMM on June 10, 1980. Simultaneous radio observations from Toyokawa (Japan) are also available. The SMM instruments have an angular resolution ranging from 3 to 30 arcsec by which the loop structure may be determined. It appears that these observations cannot be accounted for by a single loop model even assuming a variable temperature and pressure. The additional presence of a hot and tenuous isothermal plasma is necessary to explain the harder emission (HXIS). X-ray and UV data are used to fit the differential emission measure as a function of temperature and a model of the flare is deduced, which is then checked against radio data. An estimate of the heating function along the loop and of the total energy content of the loop is also given.

  1. Productivity loss due to presenteeism among patients with arthritis: estimates from 4 instruments.

    PubMed

    Zhang, Wei; Gignac, Monique A M; Beaton, Dorcas; Tang, Kenneth; Anis, Aslam H

    2010-09-01

    To estimate and compare lost work hours attributable to presenteeism, defined as reduced productivity while working, in individuals with osteoarthritis (OA) or rheumatoid arthritis (RA), according to 4 instruments. In our prospective study, 250 workers with OA (n = 130) or RA (n = 120) were recruited from community and clinical sites. Lost hours due to presenteeism at baseline were estimated using the Health and Labor Questionnaire (HLQ), the Work Limitations Questionnaire (WLQ), the World Health Organization's Health and Work Performance Questionnaire (HPQ), and the Work Productivity and Activity Impairment Questionnaire (WPAI). Only those respondents working over the past 2 weeks were included. Repeated-measures ANOVA was used to compare the lost-time estimates, according to each instrument. Of the 212 respondents included in the analyses, the frequency of missing and "0" values among the instruments was different (17% and 61% for HLQ, 8% and 5% for WLQ, 1% and 16% for HPQ, 0% and 27% for WPAI, respectively). The average numbers of lost hours (SD) per 2 weeks due to presenteeism using HLQ, WLQ, HPQ, and WPAI were 1.6 (3.9), 4.0 (3.9), 13.5 (12.5), and 14.2 (16.7). The corresponding costs for the 2-week period were CAN$30.03, $83.05, $284.07, and $285.10. The differences in the lost-hour estimates according to instruments were significant (p < 0.001). Among individuals with arthritis, estimates of productivity losses while working vary widely according to the instruments chosen. Further research on instrument design and implications for a standardized approach to estimate lost time due to presenteeism is needed.

  2. SeaWiFS Postlaunch Technical Report Series. Volume 3; The SeaBOARR-98 Field Campaign

    NASA Technical Reports Server (NTRS)

    Zibordi, Giuseppe; Lazin, Gordana; McLean, Scott; Firestone, Elaine R. (Editor); Hooker, Stanford B. (Editor)

    1999-01-01

    This report documents the scientific activities during the first Sea-viewing Wide Field-of-view Sensor (SeaWiFS) Bio-Optical Algorithm Round-Robin (SeaBOARR-98) experiment, which took place from 5-17 July 1998, at the Acqua Alta Oceanographic Tower (AAOT) in the northern Adriatic Sea off the coast of Italy. The ultimate objective of the SeaBOARR activity is to evaluate the effect of different measurement protocols on bio-optical algorithms using data from a variety of field campaigns. The SeaBOARR-98 field campaign was concerned with collecting a high quality data set of simultaneous in-water and above-water radiometric measurements. The deployment goals documented in this report were to: a) use four different surface glint correction methods to compute water-leaving radiances, L_W(λ), from above-water data; b) use two different in-water profiling systems and three different methods to compute L_W(λ) from in-water data (one making measurements at a fixed distance from the tower, 7.5 m, and the other at variable distances up to 29 m away); c) use instruments with a common calibration history to minimize intercalibration uncertainties; d) monitor the calibration drift of the instruments in the field with a second generation SeaWiFS Quality Monitor (SQM-II), to separate differences in methods from changes in instrument performance; and e) compare the L_W(λ) values estimated from the above-water and in-water measurements. In addition to describing the instruments deployed and the data collected, a preliminary analysis of the data is presented, and the kind of follow-on work that is needed to completely assess the estimation of L_W(λ) from above-water and in-water measurements is discussed.

  3. Serum Iron Levels and the Risk of Parkinson Disease: A Mendelian Randomization Study

    PubMed Central

    Gögele, Martin; Lill, Christina M.; Bertram, Lars; Do, Chuong B.; Eriksson, Nicholas; Foroud, Tatiana; Myers, Richard H.; Nalls, Michael; Keller, Margaux F.; Benyamin, Beben; Whitfield, John B.; Pramstaller, Peter P.; Hicks, Andrew A.; Thompson, John R.; Minelli, Cosetta

    2013-01-01

    Background Although levels of iron are known to be increased in the brains of patients with Parkinson disease (PD), epidemiological evidence on a possible effect of iron blood levels on PD risk is inconclusive, with effects reported in opposite directions. Epidemiological studies suffer from problems of confounding and reverse causation, and mendelian randomization (MR) represents an alternative approach to provide unconfounded estimates of the effects of biomarkers on disease. We performed a MR study where genes known to modify iron levels were used as instruments to estimate the effect of iron on PD risk, based on estimates of the genetic effects on both iron and PD obtained from the largest sample meta-analyzed to date. Methods and Findings We used as instrumental variables three genetic variants influencing iron levels, HFE rs1800562, HFE rs1799945, and TMPRSS6 rs855791. Estimates of their effect on serum iron were based on a recent genome-wide meta-analysis of 21,567 individuals, while estimates of their effect on PD risk were obtained through meta-analysis of genome-wide and candidate gene studies with 20,809 PD cases and 88,892 controls. Separate MR estimates of the effect of iron on PD were obtained for each variant and pooled by meta-analysis. We investigated heterogeneity across the three estimates as an indication of possible pleiotropy and found no evidence of it. The combined MR estimate showed a statistically significant protective effect of iron, with a relative risk reduction for PD of 3% (95% CI 1%–6%; p = 0.001) per 10 µg/dl increase in serum iron. Conclusions Our study suggests that increased iron levels are causally associated with a decreased risk of developing PD. Further studies are needed to understand the pathophysiological mechanism of action of serum iron on PD risk before recommendations can be made. Please see later in the article for the Editors' Summary PMID:23750121
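
    The per-variant and pooled estimation described above can be sketched with Wald ratios and fixed-effect, inverse-variance-weighted meta-analysis. The summary statistics below are illustrative placeholders, not the values from the meta-analyses used in the study.

    ```python
    import numpy as np

    # Hypothetical per-variant summary statistics: effect of each variant on serum
    # iron (beta_x, se_x) and on PD risk as a log odds ratio (beta_y, se_y).
    beta_x = np.array([0.30, 0.19, 0.17])        # change in iron per allele
    se_x   = np.array([0.02, 0.02, 0.01])
    beta_y = np.array([-0.012, -0.007, -0.006])  # log(OR) of PD per allele
    se_y   = np.array([0.004, 0.005, 0.003])

    # Wald ratio estimate of the causal effect of iron on PD for each variant,
    # with a first-order (delta method) standard error.
    ratio = beta_y / beta_x
    ratio_se = se_y / np.abs(beta_x)

    # Fixed-effect, inverse-variance-weighted pooling across the three instruments.
    w = 1.0 / ratio_se**2
    pooled = np.sum(w * ratio) / np.sum(w)
    pooled_se = np.sqrt(1.0 / np.sum(w))

    print(f"pooled log(OR) per unit iron = {pooled:.4f} (SE {pooled_se:.4f})")
    print(f"OR per unit iron = {np.exp(pooled):.3f}")
    ```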

  4. A Spanish-language patient safety questionnaire to measure medical and nursing students' attitudes and knowledge.

    PubMed

    Mira, José J; Navarro, Isabel M; Guilabert, Mercedes; Poblete, Rodrigo; Franco, Astolfo L; Jiménez, Pilar; Aquino, Margarita; Fernández-Trujillo, Francisco J; Lorenzo, Susana; Vitaller, Julián; de Valle, Yohana Díaz; Aibar, Carlos; Aranaz, Jesús M; De Pedro, José A

    2015-08-01

    To design and validate a questionnaire for assessing attitudes and knowledge about patient safety using a sample of medical and nursing students undergoing clinical training in Spain and four countries in Latin America. In this cross-sectional study, a literature review was carried out and a total of 786 medical and nursing students were surveyed at eight universities from five countries (Chile, Colombia, El Salvador, Guatemala, and Spain) to develop and refine a Spanish-language questionnaire on knowledge and attitudes about patient safety. The scope of the questionnaire was based on five dimensions (factors) presented in studies related to patient safety culture found in PubMed and Scopus. Based on the five factors, 25 items were developed. Composite reliability indexes and Cronbach's alpha statistics were estimated for each factor, and confirmatory factor analysis was conducted to assess validity. After a pilot test, the questionnaire was refined using confirmatory models, maximum-likelihood estimation, and the variance-covariance matrix (as input). Multiple linear regression models were used to confirm external validity, considering variables related to patient safety culture as dependent variables and the five factors as independent variables. The final instrument was a structured five-point Likert self-administered survey (the "Latino Student Patient Safety Questionnaire") consisting of 21 items grouped into five factors. Compound reliability indexes (Cronbach's alpha statistic) calculated for the five factors were about 0.7 or higher. The results of the multiple linear regression analyses indicated good model fit (goodness-of-fit index: 0.9). Item-total correlations were higher than 0.3 in all cases. The convergent-discriminant validity was adequate. The questionnaire designed and validated in this study assesses nursing and medical students' attitudes and knowledge about patient safety. This instrument could be used to indirectly evaluate whether or not students in health disciplines are acquiring and thus likely to put into practice the professional skills currently considered most appropriate for patient safety.

  5. Intraindividual variability in executive functions but not speed of processing or conflict resolution predicts performance differences in gait speed in older adults.

    PubMed

    Holtzer, Roee; Mahoney, Jeannette; Verghese, Joe

    2014-08-01

    The relationship between executive functions (EF) and gait speed is well established. However, with the exception of dual tasking, the key components of EF that predict differences in gait performance have not been determined. Therefore, the current study was designed to determine whether processing speed, conflict resolution, and intraindividual variability in EF predicted variance in gait performance in single- and dual-task conditions. Participants were 234 nondemented older adults (mean age 76.48 years; 55% women) enrolled in a community-based cohort study. Gait speed was assessed using an instrumented walkway during single- and dual-task conditions. The flanker task was used to assess EF. Results from the linear mixed effects model showed that (a) dual-task interference caused a significant dual-task cost in gait speed (estimate = 35.99; 95% CI = 33.19-38.80) and (b) of the cognitive predictors, only intraindividual variability was associated with gait speed (estimate = -.606; 95% CI = -1.11 to -.10). In unadjusted analyses, the three EF measures were related to gait speed in single- and dual-task conditions. However, in fully adjusted linear regression analysis, only intraindividual variability predicted performance differences in gait speed during dual tasking (B = -.901; 95% CI = -1.557 to -.245). Among the three EF measures assessed, intraindividual variability but not speed of processing or conflict resolution predicted performance differences in gait speed. © The Author 2013. Published by Oxford University Press on behalf of The Gerontological Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  6. Precipitation, temperature, and teleconnection signals across the combined North American, Monsoon Asia, and Old World Drought Atlases

    NASA Astrophysics Data System (ADS)

    Smerdon, J. E.; Baek, S. H.; Coats, S.; Williams, P.; Cook, B.; Cook, E. R.; Seager, R.

    2017-12-01

    The tree-ring-based North American Drought Atlas (NADA), Monsoon Asia Drought Atlas (MADA), and Old World Drought Atlas (OWDA) collectively yield a near-hemispheric gridded reconstruction of hydroclimate variability over the last millennium. To test the robustness of the large-scale representation of hydroclimate variability across the drought atlases, the joint expression of seasonal climate variability and teleconnections in the NADA, MADA, and OWDA are compared against two global, observation-based PDSI products. Predominantly positive (negative) correlations are determined between seasonal precipitation (surface air temperature) and collocated tree-ring-based PDSI, with average Pearson's correlation coefficients increasing in magnitude from boreal winter to summer. For precipitation, these correlations tend to be stronger in the boreal winter and summer when calculated for the observed PDSI record, while remaining similar for temperature. Notwithstanding these differences, the drought atlases robustly express teleconnection patterns associated with the El Niño-Southern Oscillation (ENSO), North Atlantic Oscillation (NAO), Pacific Decadal Oscillation (PDO), and Atlantic Multidecadal Oscillation (AMO). These expressions exist in the drought atlas estimates of boreal summer PDSI despite the fact that these modes of climate variability are dominant in boreal winter, with the exception of the Atlantic Multidecadal Oscillation. ENSO and NAO teleconnection patterns in the drought atlases are particularly consistent with their well-known dominant expressions in boreal winter and over the OWDA domain, respectively. Collectively, our findings confirm that the joint Northern Hemisphere drought atlases robustly reflect large-scale patterns of hydroclimate variability on seasonal to multidecadal timescales over the 20th century and are likely to provide similarly robust estimates of hydroclimate variability prior to the existence of widespread instrumental data.

  7. On Aethalometer measurement uncertainties and an instrument correction factor for the Arctic

    NASA Astrophysics Data System (ADS)

    Backman, John; Schmeisser, Lauren; Virkkula, Aki; Ogren, John A.; Asmi, Eija; Starkweather, Sandra; Sharma, Sangeeta; Eleftheriadis, Konstantinos; Uttal, Taneil; Jefferson, Anne; Bergin, Michael; Makshtas, Alexander; Tunved, Peter; Fiebig, Markus

    2017-12-01

    Several types of filter-based instruments are used to estimate aerosol light absorption coefficients. Two significant results are presented based on Aethalometer measurements at six Arctic stations from 2012 to 2014. First, an alternative method of post-processing the Aethalometer data is presented, which reduces measurement noise and lowers the detection limit of the instrument more effectively than boxcar averaging. The biggest benefit of this approach can be achieved if instrument drift is minimised. Moreover, by using an attenuation threshold criterion for data post-processing, the relative uncertainty from the electronic noise of the instrument is kept constant. This approach results in a time series with a variable collection time (Δt) but with a constant relative uncertainty with regard to electronic noise in the instrument. An additional advantage of this method is that the detection limit of the instrument will be lowered at small aerosol concentrations at the expense of temporal resolution, whereas there is little to no loss in temporal resolution at high aerosol concentrations ( > 2.1-6.7 Mm-1 as measured by the Aethalometers). At high aerosol concentrations, minimising the detection limit of the instrument is less critical. Additionally, utilising co-located filter-based absorption photometers, a correction factor is presented for the Arctic that can be used in Aethalometer corrections available in literature. The correction factor of 3.45 was calculated for low-elevation Arctic stations. This correction factor harmonises Aethalometer attenuation coefficients with light absorption coefficients as measured by the co-located light absorption photometers. Using one correction factor for Arctic Aethalometers has the advantage that measurements between stations become more inter-comparable.
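
    A minimal sketch of the attenuation-threshold idea is given below: raw attenuation readings are accumulated until a fixed attenuation increment is reached, which yields a variable collection time and a roughly constant relative noise contribution, and the resulting attenuation coefficient is divided by the Arctic correction factor of 3.45 reported above. The definition of ATN as ln(I0/I), the spot area, flow rate, threshold, and synthetic data are all assumptions for illustration, not the paper's actual processing settings.

    ```python
    import numpy as np

    def absorption_from_attenuation(atn, t, spot_area_m2, flow_m3_s,
                                    atn_threshold, c_factor=3.45):
        """Average over a variable collection time: emit a new value only once the
        attenuation has grown by `atn_threshold`, so the relative contribution of
        electronic noise stays roughly constant, then convert the attenuation
        coefficient to an absorption coefficient with the correction factor C."""
        out_t, out_babs = [], []
        i0 = 0
        for i in range(1, len(atn)):
            d_atn = atn[i] - atn[i0]
            if d_atn >= atn_threshold:
                dt = t[i] - t[i0]                                 # seconds
                b_atn = spot_area_m2 * d_atn / (flow_m3_s * dt)   # m^-1 (ATN = ln(I0/I) assumed)
                out_t.append(t[i])
                out_babs.append(b_atn / c_factor)                 # harmonised absorption, m^-1
                i0 = i
        return np.array(out_t), np.array(out_babs) * 1e6          # report in Mm^-1

    # Synthetic 24 h of 1-minute attenuation readings: slow accumulation plus noise.
    rng = np.random.default_rng(3)
    t = np.arange(0, 24 * 3600, 60.0)
    atn = 6e-6 * t + rng.normal(scale=0.003, size=t.size)

    times, b_abs = absorption_from_attenuation(
        atn, t, spot_area_m2=5e-5, flow_m3_s=8.3e-5, atn_threshold=0.05)
    print(f"{b_abs.size} averaged points, median b_abs = {np.median(b_abs):.2f} Mm^-1")
    ```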

  8. Medicare Part D: Are Insurers Gaming the Low Income Subsidy Design?

    PubMed

    Decarolis, Francesco

    2015-04-01

    This paper shows how in Medicare Part D insurers' gaming of the subsidy paid to low-income enrollees distorts premiums and raises the program cost. Using plan-level data from the first five years of the program, I find multiple instances of pricing strategy distortions for the largest insurers. Instrumental variable estimates indicate that the changes in a concentration index measuring the manipulability of the subsidy can explain a large share of the premium growth observed between 2006 and 2011. Removing this distortion could reduce the cost of the program without worsening consumer welfare.

  9. Patents and the Global Diffusion of New Drugs.

    PubMed

    Cockburn, Iain M; Lanjouw, Jean O; Schankerman, Mark

    2016-01-01

    Analysis of the timing of launches of 642 new drugs in 76 countries during 1983–2002 shows that patent and price regulation regimes strongly affect how quickly new drugs become commercially available in different countries. Price regulation delays launch, while longer and more extensive patent rights accelerate it. Health policy institutions and economic and demographic factors that make markets more profitable also speed up diffusion. The estimated effects are generally robust to controlling for endogeneity of policy regimes with country fixed effects and instrumental variables. The results highlight the important role of policy choices in driving the diffusion of new innovations.

  10. Direct measurement of skin friction with a new instrument

    NASA Technical Reports Server (NTRS)

    Vakili, A. D.; Wu, J. M.

    1986-01-01

    The design and performance of a small belt-type skin-friction gage to measure wall shear-stress coefficients in wind-tunnel testing are described, summarizing the report of Vakili and Wu (1982). The sensor employs a flexible belt of variable surface characteristics; this belt, wrapped tightly around two cylinders mounted on frictionless flexures, is equipped with strain gages to estimate the deflection of the belt by the flow. An alternative approach uses IR illumination, optical fibers, and a photosensitive transistor, permitting direct measurement of the belt deflection. Drawings, diagrams, and graphs of sample data are provided.

  11. Is child labor harmful? The impact of working earlier in life on adult earnings.

    PubMed

    Emerson, Patrick M; Souza, André Portela

    2011-01-01

    This paper explores the question: is working as a child harmful to an individual in terms of adult outcomes in earnings? Although this is an extremely important question, little is known about the effect of child labor on adult outcomes. Estimations of an instrumental variables earnings model on data from Brazil show that child labor has a large negative impact on adult earnings for male children even when controlling for schooling and that the negative impact of starting to work as a child reverses at around ages 12–14.

  12. Optical variability of extragalactic objects used to tie the HIPPARCOS reference frame to an extragalactic system using Hubble space telescope observations

    NASA Technical Reports Server (NTRS)

    Bozyan, Elizabeth P.; Hemenway, Paul D.; Argue, A. Noel

    1990-01-01

    Observations of a set of 89 extragalactic objects (EGOs) will be made with the Hubble Space Telescope Fine Guidance Sensors and Planetary Camera in order to link the HIPPARCOS Instrumental System to an extragalactic coordinate system. Most of the sources chosen for observation contain compact radio sources and stellarlike nuclei; 65 percent are optical variables beyond a 0.2 mag limit. To ensure proper exposure times, accurate mean magnitudes are necessary. In many cases, the average magnitudes listed in the literature were not adequate. The literature was searched for all relevant photometric information for the EGOs, and photometric parameters were derived, including mean magnitude, maximum range, and timescale of variability. This paper presents the results of that search and the parameters derived. The results will allow exposure times to be estimated such that an observed magnitude different from the tabular magnitude by 0.5 mag in either direction will not degrade the astrometric centering ability on a Planetary Camera CCD frame.

  13. Precision Monitoring of Water Level in a Salt Marsh with Low Cost Tilt Loggers

    NASA Astrophysics Data System (ADS)

    Sheremet, Vitalii A.; Mora, Jordan W.

    2016-04-01

    Several salt pannes and pools in the Sage Lot tidal marsh of the Waquoit Bay system, MA were instrumented with newly developed Arm-and-Float water level gauges (utilizing accelerometer tilt loggers), permitting water level fluctuations to be recorded with an accuracy of 1 mm and submillimeter resolution. The methodology of instrument calibration, deployment, and elevation control is described, and the instrument performance was evaluated. Several month-long deployments allowed us to analyze the marsh flooding and draining processes and study differences among the salt pannes. The open-channel-flow flooding-draining mechanism and slower seepage were distinguished. From the drain curve the seepage rate can be quantified. The seepage rate remains approximately constant for all flooding-draining episodes, but varies from panne to panne depending on bottom type and location. Seasonal differences due to the growth of vegetation are also recorded. The analysis of rain events allows us to estimate the catch area of subbasins in the marsh. The implications for marsh ecology and marsh accretion are discussed. Gradual sea level rise coupled with monthly tidal datum variability and storm surges results in the migration and development of the salt marsh. The newly developed low-cost instrumentation allows us to record and analyze these changes and may provide guidance for ecological management.

  14. Instruments to measure the quality of life in patients with oral mucositis undergoing oncological treatment: a systematic review of the literature.

    PubMed

    Gutiérrez-Vargas, Rosaura; Díaz-García, María Luisa; Villasís-Keever, Miguel Ángel; Portilla-Robertson, Javier; Zapata-Tárres, Marta

    Oral mucositis (OM) is an inflammatory reaction of the oropharyngeal mucosa to cumulative chemotherapy (CT) and radiation therapy (RT), affecting one or more parts of the digestive tract along with the quality of life (QoL) of the patient. The goal of this study was to identify valid and reliable tools to evaluate QoL related to OM. A systematic review of the literature was conducted up to May 2016. Articles were selected by peers using the PubMed database through a search following the inclusion and exclusion criteria and STAndards for the Reporting of Diagnostic accuracy studies (STARD) checklist with a cut-off point ≥ 70%. We identified four relevant articles that described instruments to assess the QoL related to OM in patients undergoing cancer treatment. The evaluation of the QoL in patients with OM is a difficult scenario because of its multiple variables. The knowledge of this relationship is limited because general instruments of oral health or cancer therapy are commonly used for evaluation. However, valid instruments are already available for estimating the impact of OM on the QoL from the patient's perspective. Copyright © 2016 Hospital Infantil de México Federico Gómez. Publicado por Masson Doyma México S.A. All rights reserved.

  15. Tree rings and rainfall in the equatorial Amazon

    NASA Astrophysics Data System (ADS)

    Granato-Souza, Daniela; Stahle, David W.; Barbosa, Ana Carolina; Feng, Song; Torbenson, Max C. A.; de Assis Pereira, Gabriel; Schöngart, Jochen; Barbosa, Joao Paulo; Griffin, Daniel

    2018-05-01

    The Amazon basin is a global center of hydroclimatic variability and biodiversity, but there are only eight instrumental rainfall stations with continuous records longer than 80 years in the entire basin, an area nearly the size of the coterminous US. The first long moisture-sensitive tree-ring chronology has been developed in the eastern equatorial Amazon of Brazil based on dendrochronological analysis of Cedrela cross sections cut during sustainable logging operations near the Rio Paru. The Rio Paru chronology dates from 1786 to 2016 and is significantly correlated with instrumental precipitation observations from 1939 to 2016. The strength and spatial scale of the precipitation signal vary during the instrumental period, but the Rio Paru chronology has been used to develop a preliminary reconstruction of February to November rainfall totals from 1786 to 2016. The reconstruction is related to SSTs in the Atlantic and especially the tropical Pacific, similar to the stronger pattern of association computed for the instrumental rainfall data from the eastern Amazon. The tree-ring data estimate extended drought and wet episodes in the mid- to late-nineteenth century, providing a valuable, long-term perspective on the moisture changes expected to emerge over the Amazon in the coming century due to deforestation and anthropogenic climate change.

  16. Estimating Water Fluxes Across the Sediment-Water Interface in the Lower Merced River, California

    USGS Publications Warehouse

    Zamora, Celia

    2008-01-01

    The lower Merced River Basin was chosen by the U.S. Geological Survey's (USGS) National Water Quality Assessment Program (NAWQA) to be included in a national study on how hydrological processes and agricultural practices interact to affect the transport and fate of agricultural chemicals. As part of this effort, surface-water–ground-water (sw–gw) interactions were studied in an instrumented 100-m reach on the lower Merced River. This study focused on estimating vertical rates of exchange across the sediment–water interface by direct measurement using seepage meters and by using temperature as a tracer coupled with numerical modeling. Temperature loggers and pressure transducers were placed in monitoring wells within the streambed and in the river to continuously monitor temperature and hydraulic head every 15 minutes from March 2004 to October 2005. One-dimensional modeling of heat and water flow was used to interpret the temperature and head observations and deduce the sw–gw fluxes using the USGS numerical model, VS2DH, which simulates variably saturated water flow and solves the energy transport equation. Results of the modeling effort indicate that the Merced River at the study reach is generally a slightly gaining stream with small head differences (cm) between the surface water and ground water, with flow reversals occurring during high streamflow events. The average vertical flux across the sediment–water interface was 0.4–2.2 cm/day, and the range of hydraulic conductivities was 1–10 m/day. Seepage meters generally failed to provide accurate data in this high-energy system because of slow seepage rates and a moving streambed resulting in scour or burial of the seepage meters. Estimates of streambed hydraulic conductivity were also made using grain-size analysis and slug tests. Estimated hydraulic conductivity for the upstream transect determined using slug tests ranged from 40 to 250 m/day, whereas the downstream transect ranged from 10 to 100 m/day. The range in variability was a result of position along each transect. A relative percent difference was used to describe the variability in estimates of hydraulic conductivity by grain-size analysis and slug test. Variability in applied methods at the upstream transect ranged from 0 to 9 percent, whereas the downstream transect showed greater variability, with a range of 80 to 133 percent.

  17. Does childhood schooling affect old age memory or mental status? Using state schooling laws as natural experiments.

    PubMed

    Glymour, M M; Kawachi, I; Jencks, C S; Berkman, L F

    2008-06-01

    The association between schooling and old age cognitive outcomes such as memory disorders is well documented but, because of the threat of reverse causation, controversy persists over whether education affects old age cognition. Changes in state compulsory schooling laws (CSL) are treated as natural experiments (instruments) for estimating the effect of education on memory and mental status among the elderly. Changes in CSL predict changes in average years of schooling completed by children who are affected by the new laws. These educational differences are presumably independent of innate individual characteristics such as IQ. CSL-induced changes in education were used to obtain instrumental variable (IV) estimates of education's effect on memory (n = 10,694) and mental status (n = 9751) for white, non-Hispanic US-born Health and Retirement Survey participants born between 1900 and 1947 who did not attend college. After adjustment for sex, birth year, state of birth and state characteristics, IV estimates of education's effect on memory were large and statistically significant. IV estimates for mental status had very wide confidence intervals, so it was not possible to draw meaningful conclusions about the effect of education on this outcome. Increases in mandatory schooling lead to improvements in performance on memory tests many decades after school completion. These analyses condition on individual states, so differences in memory outcomes associated with CSL changes cannot be attributed to differences between states. Although unmeasured state characteristics that changed contemporaneously with CSL might account for these results, unobserved genetic variation is unlikely to do so.

  18. Does childhood schooling affect old age memory or mental status? Using state schooling laws as natural experiments

    PubMed Central

    Glymour, M M; Kawachi, I; Jencks, C S; Berkman, L F

    2009-01-01

    Background The association between schooling and old age cognitive outcomes such as memory disorders is well documented but, because of the threat of reverse causation, controversy persists over whether education affects old age cognition. Changes in state compulsory schooling laws (CSL) are treated as natural experiments (instruments) for estimating the effect of education on memory and mental status among the elderly. Changes in CSL predict changes in average years of schooling completed by children who are affected by the new laws. These educational differences are presumably independent of innate individual characteristics such as IQ. Methods CSL-induced changes in education were used to obtain instrumental variable (IV) estimates of education’s effect on memory (n = 10 694) and mental status (n = 9751) for white, non-Hispanic US-born Health and Retirement Survey participants born between 1900 and 1947 who did not attend college. Results After adjustment for sex, birth year, state of birth and state characteristics, IV estimates of education’s effect on memory were large and statistically significant. IV estimates for mental status had very wide confidence intervals, so it was not possible to draw meaningful conclusions about the effect of education on this outcome. Conclusions Increases in mandatory schooling lead to improvements in performance on memory tests many decades after school completion. These analyses condition on individual states, so differences in memory outcomes associated with CSL changes cannot be attributed to differences between states. Although unmeasured state characteristics that changed contemporaneously with CSL might account for these results, unobserved genetic variation is unlikely to do so. PMID:18477752

  19. Using cost-effectiveness estimates from survey data to guide commissioning: an application to home care.

    PubMed

    Forder, Julien; Malley, Juliette; Towers, Ann-Marie; Netten, Ann

    2014-08-01

    The aim is to describe and trial a pragmatic method to produce estimates of the incremental cost-effectiveness of care services from survey data. The main challenge is in estimating the counterfactual; that is, what the patient's quality of life would be if they did not receive that level of service. A production function method is presented, which seeks to distinguish the variation in care-related quality of life in the data that is due to service use as opposed to other factors. A problem is that relevant need factors also affect the amount of service used and therefore any missing factors could create endogeneity bias. Instrumental variable estimation can mitigate this problem. This method was applied to a survey of older people using home care as a proof of concept. In the analysis, we were able to estimate a quality-of-life production function using survey data with the expected form and robust estimation diagnostics. The practical advantages with this method are clear, but there are limitations. It is computationally complex, and there is a risk of misspecification and biased results, particularly with IV estimation. One strategy would be to use this method to produce preliminary estimates, with a full trial conducted thereafter, if indicated. Copyright © 2013 John Wiley & Sons, Ltd.

  20. Correcting the Standard Errors of 2-Stage Residual Inclusion Estimators for Mendelian Randomization Studies.

    PubMed

    Palmer, Tom M; Holmes, Michael V; Keating, Brendan J; Sheehan, Nuala A

    2017-11-01

    Mendelian randomization studies use genotypes as instrumental variables to test for and estimate the causal effects of modifiable risk factors on outcomes. Two-stage residual inclusion (TSRI) estimators have been used when researchers are willing to make parametric assumptions. However, researchers are currently reporting uncorrected or heteroscedasticity-robust standard errors for these estimates. We compared several different forms of the standard error for linear and logistic TSRI estimates in simulations and in real-data examples. Among others, we consider standard errors modified from the approach of Newey (1987), Terza (2016), and bootstrapping. In our simulations Newey, Terza, bootstrap, and corrected 2-stage least squares (in the linear case) standard errors gave the best results in terms of coverage and type I error. In the real-data examples, the Newey standard errors were 0.5% and 2% larger than the unadjusted standard errors for the linear and logistic TSRI estimators, respectively. We show that TSRI estimators with modified standard errors have correct type I error under the null. Researchers should report TSRI estimates with modified standard errors instead of reporting unadjusted or heteroscedasticity-robust standard errors. © The Author(s) 2017. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health.
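
    For the linear case, a two-stage residual inclusion estimator with a bootstrap standard error, one of the corrected approaches compared in the paper, can be sketched as follows. The simulated genotype instrument, exposure, and outcome are illustrative assumptions; the Newey- and Terza-type analytic corrections are not implemented here.

    ```python
    import numpy as np

    def tsri_linear(y, x, z):
        """Linear two-stage residual inclusion: first stage regresses the exposure x
        on the instrument z; the second stage includes the first-stage residual as an
        extra covariate alongside x."""
        n = len(y)
        Z1 = np.column_stack([np.ones(n), z])
        g, *_ = np.linalg.lstsq(Z1, x, rcond=None)
        resid = x - Z1 @ g
        X2 = np.column_stack([np.ones(n), x, resid])
        b, *_ = np.linalg.lstsq(X2, y, rcond=None)
        return b[1]                         # estimated causal effect of x on y

    rng = np.random.default_rng(4)
    n = 5000
    z = rng.binomial(2, 0.3, size=n).astype(float)     # genotype-like instrument (0/1/2)
    u = rng.normal(size=n)                             # unmeasured confounder
    x = 0.4 * z + u + rng.normal(size=n)               # exposure (risk factor)
    y = 0.25 * x + u + rng.normal(size=n)              # outcome; true effect = 0.25

    est = tsri_linear(y, x, z)

    # Nonparametric bootstrap over individuals for the standard error.
    boot = np.array([tsri_linear(y[idx], x[idx], z[idx])
                     for idx in (rng.integers(0, n, n) for _ in range(500))])
    print(f"TSRI estimate = {est:.3f}, bootstrap SE = {boot.std(ddof=1):.3f}")
    ```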

  1. Adjusted Analyses in Studies Addressing Therapy and Harm: Users' Guides to the Medical Literature.

    PubMed

    Agoritsas, Thomas; Merglen, Arnaud; Shah, Nilay D; O'Donnell, Martin; Guyatt, Gordon H

    2017-02-21

    Observational studies almost always have bias because prognostic factors are unequally distributed between patients exposed or not exposed to an intervention. The standard approach to dealing with this problem is adjusted or stratified analysis. Its principle is to use measurement of risk factors to create prognostically homogeneous groups and to combine effect estimates across groups.The purpose of this Users' Guide is to introduce readers to fundamental concepts underlying adjustment as a way of dealing with prognostic imbalance and to the basic principles and relative trustworthiness of various adjustment strategies.One alternative to the standard approach is propensity analysis, in which groups are matched according to the likelihood of membership in exposed or unexposed groups. Propensity methods can deal with multiple prognostic factors, even if there are relatively few patients having outcome events. However, propensity methods do not address other limitations of traditional adjustment: investigators may not have measured all relevant prognostic factors (or not accurately), and unknown factors may bias the results.A second approach, instrumental variable analysis, relies on identifying a variable associated with the likelihood of receiving the intervention but not associated with any prognostic factor or with the outcome (other than through the intervention); this could mimic randomization. However, as with assumptions of other adjustment approaches, it is never certain if an instrumental variable analysis eliminates bias.Although all these approaches can reduce the risk of bias in observational studies, none replace the balance of both known and unknown prognostic factors offered by randomization.

  2. Effects of evapotranspiration heterogeneity on catchment water balance in the Southern Sierra Nevada of California

    NASA Astrophysics Data System (ADS)

    Kerkez, B.; Kelly, A. E.; Lucas, R. G.; Son, K.; Glaser, S. D.; Bales, R. C.

    2011-12-01

    Heterogeneity of evapotranspiration (ET) is the result of poorly understood interactions between climate, topography, vegetation and soil. Accurate predictions of ET, and thus improved water balance estimates, hinge directly upon an improved understanding of the processes that drive ET across a wide spatio-temporal range. Recent warming trends in the Western US are shifting precipitation toward more rain-dominated patterns, significantly increasing vegetation water stress in historically snow-dominated regimes due to reduced soil moisture and increased vapor deficit during warm summer months. We investigate the dominant controls that govern ET variability in a highly instrumented 1-km2 mountain catchment at the Southern Sierra Critical Zone Observatory, co-located in the Kings River Experimental Watershed. Various ET estimates are derived from a number of measurement approaches: an eddy covariance flux tower, ET chambers, stream flumes, groundwater monitoring wells, matric potential sensors, as well as data from a distributed wireless sensor network with over 300 sensors. Combined with precipitation data and high-density distributed soil moisture and snow-depth readings, the ET estimates are utilized to reconstruct the overall catchment water balance. We also apply the Regional Hydro-Ecologic Simulation System (RHESSys), a physically based, spatially distributed hydrologic model, to estimate water balance components. The model predictions are compared with the water budget calculated from field data, and used to identify the key variables controlling spatial and temporal patterns of ET at multiple scales. Initial results show that ET estimates are scale- and vegetation-dependent, with significant ET variability across vegetation types and physiographic parameters such as elevation, slope, and aspect. In mixed-conifer forest terrain, ET is more dependent on soil moisture, while in the meadows, where the soil is generally saturated for the duration of the growing season, ET is driven by micro-meteorological parameters and meadow vegetation phenology.

  3. Long-term Self-noise Estimates of Seismic Sensors From a High-noise Vault

    NASA Astrophysics Data System (ADS)

    Hicks, S. P.; Goessen, S.; Hill, P.; Rietbrock, A.

    2017-12-01

    To understand the detection capabilities of seismic stations and to reduce biases in ambient noise imaging, it is vital to assess the contribution of instrument self-noise to overall site noise. Self-noise estimates typically come from vault installations in continental interiors with very low ambient noise levels. However, this approach limits the ability of individual end-users to independently assess variations in their own instrument pools relative to the nominal specifications given by manufacturers and to estimates given in comparative test papers. The calculation method must therefore be adapted to variable installation conditions. One problem is that microseism noise can contaminate self-noise results because of instrument misalignment errors or manufacturing limits; this effect becomes stronger where ambient noise is higher. Moreover, because sensor noise is expected to be stochastic and time-varying, estimates based on hand-picking small numbers of data segments may not accurately reflect true self-noise. We report on results from a self-noise test experiment of Güralp seismic instruments (3T and 3ESPC broadband seismometers, Fortis strong motion accelerometer) that were installed in the sub-surface vault of the Eskdalemuir Seismic Observatory in Scotland, UK over the period October 2016-August 2017. Due to the vault's proximity to the ocean, secondary microseism noise is strong, so we efficiently compute the angle of misalignment that maximises waveform coherence with a reference sensor. Self-noise was calculated using the 3-sensor correlation technique, and we compute probability density functions of self-noise to assess its spread over time. We find that not correcting for misalignments as low as 0.1° can cause self-noise to be artificially higher by up to 15 dB at frequencies of 0.1-1 Hz. Our method thus efficiently removes the effect of microseism contamination on self-noise; for example, it restores the minimum noise floor for a 360 s-50 Hz 3T to -195 dB at 0.2 Hz. Furthermore, based on the analysis of our calculated probability density functions, we find that at long periods (> 30 s) the average self-noise can be up to 5 dB higher than the minimum noise floor. We discuss the validity of these results in terms of making direct comparisons with self-noise results from much quieter installations.
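
    A simplified sketch of the misalignment correction described in this record is given below: one sensor's horizontal components are rotated through trial angles and the angle that maximises agreement with a reference sensor is kept. Synthetic data and plain correlation are used for brevity; the study itself works with waveform coherence in the microseism band.

```python
# Minimal sketch of the misalignment correction: rotate one sensor's
# horizontal components through trial angles and keep the angle that best
# matches a reference sensor. Synthetic data; a real implementation would
# maximise spectral coherence in the secondary-microseism band.
import numpy as np

rng = np.random.default_rng(2)
t = np.linspace(0, 3600, 72000)
# Reference north/east microseism signal plus a little instrument noise.
ref_n = np.sin(2 * np.pi * 0.15 * t) + 0.05 * rng.normal(size=t.size)
ref_e = np.cos(2 * np.pi * 0.15 * t) + 0.05 * rng.normal(size=t.size)

true_mis = np.deg2rad(0.4)             # unknown misalignment of test sensor
test_n = np.cos(true_mis) * ref_n - np.sin(true_mis) * ref_e
test_e = np.sin(true_mis) * ref_n + np.cos(true_mis) * ref_e

def rotate(n, e, theta):
    """Rotate horizontal components by angle theta (radians)."""
    return (np.cos(theta) * n + np.sin(theta) * e,
            -np.sin(theta) * n + np.cos(theta) * e)

angles = np.deg2rad(np.arange(-1.0, 1.0, 0.01))
score = [np.corrcoef(rotate(test_n, test_e, a)[0], ref_n)[0, 1] for a in angles]
best = angles[int(np.argmax(score))]
print(f"estimated misalignment: {np.rad2deg(best):.2f} deg (true 0.40 deg)")
```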

  4. Skylab water balance error analysis

    NASA Technical Reports Server (NTRS)

    Leonard, J. I.

    1977-01-01

    Estimates of the precision of the net water balance were obtained for the entire Skylab preflight and inflight phases as well as for the first two weeks of flight. Quantitative estimates of both total sampling errors and instrumentation errors were obtained. It was shown that measurement error is minimal in comparison to biological variability and little can be gained from improvement in analytical accuracy. In addition, a propagation of error analysis demonstrated that total water balance error could be accounted for almost entirely by the errors associated with body mass changes. Errors due to interaction between terms in the water balance equation (covariances) represented less than 10% of the total error. Overall, the analysis provides evidence that daily measurements of body water changes obtained from the indirect balance technique are reasonable, precise, and reliable. The method is not biased toward net retention or loss.
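
    The propagation-of-error statement above follows the usual variance formula for a net balance computed as a signed sum of measured terms; in generic notation (not the report's own symbols):

```latex
% Variance of a net balance B = \sum_i c_i X_i (intake, output, body-mass
% change, ...), with signed coefficients c_i = +1 or -1; the covariance
% terms are the "interaction" contribution noted in the abstract.
\sigma_B^2 = \sum_i c_i^2\,\sigma_{X_i}^2
           + 2\sum_{i<j} c_i c_j\,\mathrm{Cov}(X_i, X_j)
```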

  5. Getting a Job is Only Half the Battle: Maternal Job Loss and Child Classroom Behavior in Low-Income Families

    PubMed Central

    Hill, Heather D.; Morris, Pamela A.; Castells, Nina; Walker, Jessica Thornton

    2011-01-01

    This study uses data from an experimental employment program and instrumental variables (IV) estimation to examine the effects of maternal job loss on child classroom behavior. Random assignment to the treatment at one of three program sites is an exogenous predictor of employment patterns. Cross-site variation in treatment-control differences is used to identify the effects of employment levels and transitions. Under certain assumptions, this method controls for unobserved correlates of job loss and child well-being, as well as measurement error and simultaneity. IV estimates suggest that maternal job loss sharply increases problem behavior but has neutral effects on positive social behavior. Current employment programs concentrate primarily on job entry, but these findings point to the importance of promoting job stability for workers and their children. PMID:22162901

  6. Kinship and nonrelative foster care: the effect of placement type on child well-being.

    PubMed

    Font, Sarah A

    2014-01-01

    This study uses a national sample of 1,215 children, ages 6-17, who spent some time in formal kinship or nonrelative foster care to identify the effect of placement type on academic achievement, behavior, and health. Several identification strategies are used to reduce selection bias, including ordinary least squares, change score models, propensity score weighting, and instrumental variables regression. The results consistently estimate a negative effect of kin placements on reading scores, but kin placements appear to have no effect on child health, and findings on children's math and cognitive skills test scores and behavioral problems are mixed. Estimated declines in both academic achievement and behavioral problems are concentrated among children who are lower functioning at baseline. © 2014 The Author. Child Development © 2014 Society for Research in Child Development, Inc.

  7. Fertilizing growth: Agricultural inputs and their effects in economic development.

    PubMed

    McArthur, John W; McCord, Gordon C

    2017-07-01

    This paper estimates the role of agronomic inputs in cereal yield improvements and the consequences for countries' processes of structural change. The results suggest a clear role for fertilizer, modern seeds and water in boosting yields. We then test for respective empirical links between agricultural yields and economic growth, labor share in agriculture and non-agricultural value added per worker. The identification strategy includes a novel instrumental variable that exploits the unique economic geography of fertilizer production and transport costs to countries' agricultural heartlands. We estimate that a half ton increase in staple yields generates a 14 to 19 percent higher GDP per capita and a 4.6 to 5.6 percentage point lower labor share in agriculture five years later. The results suggest a strong role for agricultural productivity as a driver of structural change.

  8. Does More Schooling Improve Health Outcomes and Health Related Behaviors? Evidence from U.K. Twins

    PubMed Central

    Amin, Vikesh; Behrman, Jere R.; Spector, Tim D.

    2013-01-01

    Several recent studies using instrumental variables based on changes in compulsory school-leaving age laws have estimated the causal effect of schooling on health outcomes and health-related behaviors in the U.K. Despite using the same identification strategy and similar datasets, no consensus has been reached. We contribute to the literature by providing results for the U.K. using a different research design and a different dataset. Specifically, we estimate the effect of schooling on health outcomes (obesity and physical health) and health-related behaviors (smoking, alcohol consumption and exercise) for women through within-MZ twins estimates using the TwinsUK database. For physical health, alcohol consumption and exercise, the within-MZ twins estimates are uninformative about whether there is a causal effect. However, we find (1) that the significant association between schooling and smoking status is due to unobserved endowments that are correlated with schooling and smoking, and (2) that there is some indication that more schooling reduces the body mass index for women, even once these unobserved endowments have been controlled for.
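
    The within-MZ twins design can be illustrated with a small simulation: differencing schooling and the outcome within identical twin pairs removes endowments shared by the pair, which is what separates the naive association from the within-twin estimate. Data, names, and effect sizes below are invented for illustration.

```python
# Minimal sketch of a within-MZ-twins estimate: differencing outcome and
# schooling within each identical twin pair sweeps out endowments shared by
# the pair (genes, family background). Simulated data for illustration only.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n_pairs = 2000
endow = rng.normal(size=n_pairs)               # shared endowment per pair
# Twin-specific schooling depends on the shared endowment plus noise.
school = 12 + 2 * endow[:, None] + rng.normal(size=(n_pairs, 2))
# BMI depends on schooling (true effect -0.2) and on the shared endowment.
bmi = 28 - 0.2 * school + 1.5 * endow[:, None] + rng.normal(size=(n_pairs, 2))

# Naive cross-sectional OLS is confounded by the endowment.
naive = sm.OLS(bmi.ravel(), sm.add_constant(school.ravel())).fit()

# Within-pair first differences remove the shared endowment.
d_school = school[:, 0] - school[:, 1]
d_bmi = bmi[:, 0] - bmi[:, 1]
within = sm.OLS(d_bmi, d_school).fit()          # no constant needed

print(f"naive OLS:   {naive.params[1]:+.3f}")
print(f"within-twin: {within.params[0]:+.3f}  (true effect -0.2)")
```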

  9. [Income inequality, corruption, and life expectancy at birth in Mexico].

    PubMed

    Idrovo, Alvaro Javier

    2005-01-01

    To ascertain whether the effect of income inequality on life expectancy at birth in Mexico is mediated by corruption, used as a proxy for social capital. An ecological study was carried out with the 32 Mexican federative entities. Overall and sex-specific correlations of life expectancy at birth by federative entity with the Gini coefficient, the Corruption and Good Government Index, the percentage of Catholics, and the percentage of the population speaking an indigenous language were estimated. Robust linear regressions, with and without instrumental variables, were used to explore whether corruption acts as an intermediate variable in the studied relationship. Negative correlations were observed between life expectancy at birth and, respectively, the Gini coefficient (Spearman's rho near -0.60, p < 0.05) and the percentage of the population speaking an indigenous language (rho greater than -0.66, p < 0.05). Moreover, the Corruption and Good Government Index correlated with men's life expectancy at birth (Spearman's rho = -0.3592, p < 0.05). Regressions with instruments were more consistent than conventional ones and showed a strong negative effect (p < 0.05) of income inequality on life expectancy at birth. This effect was greater among men. The findings suggest a negative effect of income inequality on life expectancy at birth in Mexico, mediated by corruption levels and other related cultural factors.

  10. Quality of life, human insecurity, and distress among Palestinians in the Gaza Strip before and after the Winter 2008-2009 Israeli war.

    PubMed

    Hammoudeh, Weeam; Hogan, Dennis; Giacaman, Rita

    2013-11-01

    This study investigates changes in the quality of life (QoL) of Gaza Palestinians before and after the Israeli winter 2008-2009 war using the World Health Organization's WHOQOL-Bref; the extent to which this instrument adequately measures changing situations; and its responsiveness to locally developed human insecurity and distress measures appropriate for context. Ordinary least squares regression analysis was performed to detect how demographic and socioeconomic variables usually associated with QoL were associated with human insecurity and distress. We estimated the usual baseline model for the three QoL domains, and a second set of models including these standard variables and human insecurity and distress to assess how personal exposure to political violence affects QoL. No difference between the quality of life scores in 2005 and 2009 was found, with results suggesting lack of sensitivity of the WHOQOL-Bref in capturing changes resulting from intensification of preexisting political violence. Results show that human insecurity and individual distress significantly increased in 2009 compared to 2005. Results indicate that a political domain may provide further understanding of and possibly increase the sensitivity of the instrument to detect changes in the QoL of Palestinians and possibly other populations experiencing intensified political violence.

  11. Characteristics predicting laparoscopic skill in medical students: nine years' experience in a single center.

    PubMed

    Nomura, Tsutomu; Matsutani, Takeshi; Hagiwara, Nobutoshi; Fujita, Itsuo; Nakamura, Yoshiharu; Kanazawa, Yoshikazu; Makino, Hiroshi; Mamada, Yasuhiro; Fujikura, Terumichi; Miyashita, Masao; Uchida, Eiji

    2018-01-01

    We introduced laparoscopic simulator training for medical students in 2007. This study was designed to identify factors that predict the laparoscopic skill of medical students, to identify intergenerational differences in abilities, and to estimate the variability of results in each training group. Our ultimate goal was to determine the optimal educational program for teaching laparoscopic surgery to medical students. Between 2007 and 2015, a total of 270 fifth-year medical students were enrolled in this observational study. Before training, the participants were asked questions about their interest in laparoscopic surgery, experience with playing video games, confidence about driving, and manual dexterity. After the training, aspects of their competence (execution time, instrument path length, and economy of instrument movement) were assessed. Multiple regression analysis identified significant effects of manual dexterity, gender, and confidence about driving on the results of the training. The training results have significantly improved over recent years. The variability among the results in each training group was relatively small. We identified the characteristics of medical students with excellent laparoscopic skills. We observed educational benefits from interactions between medical students within each training group. Our study suggests that selection and grouping are important to the success of modern programs designed to train medical students in laparoscopic surgery.

  12. A Cash-back Rebate Program for Healthy Food Purchases in South Africa: Selection and Program Effects in Self-reported Diet Patterns.

    PubMed

    An, Ruopeng; Sturm, Roland

    2017-03-01

    A South African insurer launched a rebate program for healthy food purchases for its members, available only in program-designated supermarkets. To eliminate selection bias in program enrollment, we estimated the impact of the subsidies in nudging the population towards a healthier diet using an instrumental variable approach. Data came from a health behavior questionnaire administered among members in the health promotion program. Individual and supermarket addresses were geocoded, and differential distances from home to program-designated supermarkets versus competing supermarkets were calculated. Bivariate probit and linear instrumental variable models were estimated to control for likely unobserved selection biases, employing differential distances as a predictor of program enrollment. For regular fast-food, processed meat, and salty food consumption, approximately two-thirds of the difference between participants and nonparticipants was attributable to the intervention and one-third to selection effects. For fruit/vegetable and fried food consumption, merely one-eighth of the difference was selection. The rebate reduced regular consumption of fast food by 15% and foods high in salt/sugar and fried foods by 22%-26%, and increased fruit/vegetable consumption by 21% (0.66 serving/day). Large population interventions are an essential complement to laboratory experiments, but selection biases require explicit attention in evaluation studies conducted in naturalistic settings.
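
    A stripped-down version of the linear instrumental variable model with a differential-distance instrument is sketched below on simulated data. Only the point estimate is shown; the paper's bivariate probit specification and proper IV standard errors are not reproduced here, and all numbers are invented.

```python
# Minimal sketch of a linear IV model using a differential-distance instrument
# (distance to the nearest program-designated supermarket minus distance to
# the nearest competitor). Simulated data for illustration only.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(4)
n = 20000
health_minded = rng.normal(size=n)               # unobserved selection factor
diff_dist = rng.normal(0, 5, n)                  # km; smaller = closer to program store
p_enroll = 1 / (1 + np.exp(0.3 * diff_dist - 0.8 * health_minded))
enrolled = rng.binomial(1, p_enroll)
# Servings/day of fruit & vegetables: true program effect = +0.66 servings.
servings = 2.0 + 0.66 * enrolled + 0.5 * health_minded + rng.normal(0, 1, n)

naive = sm.OLS(servings, sm.add_constant(enrolled)).fit().params[1]

# Two-stage least squares by hand: predict enrollment from the instrument,
# then regress the outcome on the prediction.
stage1 = sm.OLS(enrolled, sm.add_constant(diff_dist)).fit()
stage2 = sm.OLS(servings, sm.add_constant(stage1.fittedvalues)).fit()
print(f"naive difference: {naive:.2f}, IV estimate: {stage2.params[1]:.2f} (true 0.66)")
```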

  13. Spatial and temporal variability of the overall error of National Atmospheric Deposition Program measurements determined by the USGS collocated-sampler program, water years 1989-2001

    USGS Publications Warehouse

    Wetherbee, G.A.; Latysh, N.E.; Gordon, J.D.

    2005-01-01

    Data from the U.S. Geological Survey (USGS) collocated-sampler program for the National Atmospheric Deposition Program/National Trends Network (NADP/NTN) are used to estimate the overall error of NADP/NTN measurements. Absolute errors are estimated by comparison of paired measurements from collocated instruments. Spatial and temporal differences in absolute error were identified and are consistent with longitudinal distributions of NADP/NTN measurements and spatial differences in precipitation characteristics. The magnitude of error for calcium, magnesium, ammonium, nitrate, and sulfate concentrations, specific conductance, and sample volume is of minor environmental significance to data users. Data collected after a 1994 sample-handling protocol change are prone to less absolute error than data collected prior to 1994. Absolute errors are smaller during non-winter months than during winter months for selected constituents at sites where frozen precipitation is common. Minimum resolvable differences are estimated for different regions of the USA to aid spatial and temporal watershed analyses.

  14. Preventing Hospitalization with Veterans Affairs Home-Based Primary Care: Which Individuals Benefit Most?

    PubMed

    Edwards, Samuel T; Saha, Somnath; Prentice, Julia C; Pizer, Steven D

    2017-08-01

    To examine how medical complexity modifies the relationship between enrollment in Department of Veterans Affairs (VA) home-based primary care (HBPC) and hospitalization for ambulatory care-sensitive conditions (ACSC) for veterans with diabetes mellitus and whether the effect of HBPC on hospitalizations varies according to clinical condition. Retrospective cohort study. VA and non-VA hospitals. VA beneficiaries aged 67 and older with diabetes mellitus and enrolled in Medicare (N = 364,972). Instrumental variables regression models were used to estimate the effect of HBPC enrollment on hospitalization for ACSCs (defined according to the Agency for Healthcare Research and Quality Prevention Quality Indicators) overall and in subgroups stratified according to medical complexity. Models were also estimated for each ACSC to determine which conditions were most sensitive to HBPC. Distance from the veteran's residence to the nearest HBPC site was used as the instrumental variable. HBPC was associated with fewer ACSC hospitalizations (odds ratio (OR) = 0.35 per person-month, 95% confidence interval (CI) = 0.30-0.42). For veterans in the highest quartile of medical complexity, HBPC enrollment was associated with fewer ACSC hospitalizations (OR = 0.43, 95% CI = 0.19-0.93), whereas for those in the lowest quartile, HBPC was associated with more ACSC hospitalizations (OR = 33.2, 95% CI = 4.6-240.1). HBPC enrollment was associated with fewer hospitalizations for a range of ACSCs in veterans with diabetes mellitus but only in the most medically complex individuals. This demonstrates the importance of appropriate targeting and suggests that the effect of HBPC is attributable to its comprehensive approach rather than condition-specific interventions. Published 2017. This article is a U.S. Government work and is in the public domain in the USA.

  15. Measuring Food Brand Awareness in Australian Children: Development and Validation of a New Instrument

    PubMed Central

    Boyland, Emma; Bauman, Adrian E.

    2015-01-01

    Background Children’s exposure to food marketing is one environmental determinant of childhood obesity. Measuring the extent to which children are aware of food brands may be one way to estimate relative prior exposures to food marketing. This study aimed to develop and validate an Australian Brand Awareness Instrument (ABAI) to estimate children’s food brand awareness. Methods The ABAI incorporated 30 flashcards depicting food/drink logos and their corresponding products. An abbreviated version was also created using 12 flashcards (ABAI-a). The ABAI was presented to 60 primary school aged children (7-11yrs) attending two Australian after-school centres. A week later, the full-version was repeated on approximately half the sample (n=27) and the abbreviated-version was presented to the remaining half (n=30). The test-retest reliability of the ABAI was analysed using Intra-class correlation coefficients. The concordance of the ABAI-a and full-version was assessed using Bland-Altman plots. The ‘nomological’ validity of the full tool was investigated by comparing children’s brand awareness with food marketing-related variables (e.g. television habits, intake of heavily promoted foods). Results Brand awareness increased with age (p<0.01) but was not significantly correlated with other variables. Bland-Altman analyses showed good agreement between the ABAI and ABAI-a. Reliability analyses revealed excellent agreement between the two administrations of the full-ABAI. Conclusions The ABAI was able to differentiate children’s varying levels of brand awareness. It was shown to be a valid and reliable tool and may allow quantification of brand awareness as a proxy measure for children’s prior food marketing exposure. PMID:26222624
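
    The Bland-Altman comparison used above can be reproduced in a few lines: compute the paired differences between the two instrument versions, their mean (the bias), and the 95% limits of agreement. The scores below are simulated for illustration only.

```python
# Minimal sketch of a Bland-Altman agreement check between a full and an
# abbreviated instrument. Scores are simulated; in practice each pair would
# be one child's score on the two versions.
import numpy as np

rng = np.random.default_rng(5)
full = rng.normal(20, 4, 60)                      # full-version scores
abbrev = full + rng.normal(0, 1.5, 60)            # abbreviated-version scores

diff = abbrev - full
mean_pair = (abbrev + full) / 2                   # x-axis of a Bland-Altman plot
bias = diff.mean()
loa = 1.96 * diff.std(ddof=1)                     # half-width of 95% limits of agreement
print(f"bias = {bias:.2f}; limits of agreement = "
      f"[{bias - loa:.2f}, {bias + loa:.2f}]")
```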

  16. Ocean Colour Products from Remote Sensing Related to In-Situ Data for Supporting Management of Offshore Aquaculture

    NASA Astrophysics Data System (ADS)

    Fragoso, Bruno Dias Duarte; Icely, John; Moore, Gerald; Laanen, Marnix; Ghbrehiwot, Semhar

    2016-08-01

    The EU-funded "AQUAculture USEr driven operational Remote Sensing information services project" (AQUA-USERS, grant number 607325) is a user-driven project for the aquaculture industry that aims at providing this industry with relevant and timely information based on the most recent satellite data and innovative optical in-situ measurements. The Water Insight Spectrometer (WISP-3) is a hand-held instrument which can provide measurements of the optical parameters Chlorophyll-a (Chl-a), Total Suspended Matter (TSM), Coloured Dissolved Organic Matter (CDOM), and the Spectral Diffuse Attenuation Coefficient (Kd). Sampling campaigns were carried out between March 2014 and September 2015 to collect water samples at the same time as taking optical readings from the WISP-3 at an offshore aquaculture site off Sagres in SW Portugal, operated by Finisterra Lda, one of the "users" in the project. The estimates from the WISP-3 for Chl-a and TSM have been compared with in-situ measurements from the water samples for these two variables, with the objective of calibrating the algorithms used by the WISP-3 for estimation of Chl-a and TSM. At a later stage in the project, it is expected that WISP-3 readings can be related to remote sensing products developed from the Ocean and Land Colour Instrument (OLCI) on the Sentinel-3 satellite. The key purpose of AQUA-USERS is to develop, in collaboration with "users" from the aquaculture industry, a mobile phone application (app) that collates satellite information on optical water quality and temperature together with in-situ data of these variables to develop a decision support system for daily management of aquaculture.

  17. Survey of equine castration techniques, preferences and outcomes among Australian veterinarians.

    PubMed

    Owens, C D; Hughes, K J; Hilbert, B J; Heller, J; Nielsen, S; Trope, G D

    2018-01-01

    (1) To collect the perceptions of veterinarians performing equine castrations in Australia on techniques, preferences and outcomes, (2) to investigate veterinarian use and experience with the Henderson castrating instrument and (3) to investigate potential associations between demographics, castration methods and techniques, and complications. Online survey of members of the Australian Veterinary Association's Special Interest Group, Equine Veterinarians Australia (EVA). A link to the survey was included in the EVA e-newsletter and practices on the EVA website were contacted by telephone and follow-up email. Fisher's exact test was used to determine associations between ligation and complications. A generalised linear model with a negative binomial family was used to determine associations between count response variables and categorical independent variables. Responses were obtained from 138 veterinarians (response rate, 13.1%) who performed 5330 castrations over 12 months. Castrations were most commonly performed in the field, on anaesthetised horses, using emasculators, via an open approach and without ligation of the spermatic cord. Estimated complications after use of emasculators were swelling (25%), haemorrhage (5%) and infection (5%). The Henderson instrument was used by approximately 10% of respondents and its use for castration was associated with fewer reports of postoperative swelling compared with emasculators (P = 0.002). Rates of evisceration with the Henderson and emasculator methods were comparable (0.43% and 0.9%, respectively). Castration preferences varied widely among survey participants. Reported complication types and rates were comparable to those reported previously in other countries. Perceptions that the Henderson instrument was associated with less swelling should be investigated further via a prospective controlled investigation. © 2017 Australian Veterinary Association.

  18. Extreme Ultraviolet Variability Experiment (EVE) Multiple EUV Grating Spectrographs (MEGS): Radiometric Calibrations and Results

    NASA Technical Reports Server (NTRS)

    Hock, R. A.; Woods, T. N.; Crotser, D.; Eparvier, F. G.; Woodraska, D. L.; Chamberlin, P. C.; Woods, E. C.

    2010-01-01

    The NASA Solar Dynamics Observatory (SDO), scheduled for launch in early 2010, incorporates a suite of instruments including the Extreme Ultraviolet Variability Experiment (EVE). EVE has multiple instruments including the Multiple Extreme ultraviolet Grating Spectrographs (MEGS) A, B, and P instruments, the Solar Aspect Monitor (SAM), and the Extreme ultraviolet SpectroPhotometer (ESP). The radiometric calibration of EVE, necessary to convert the instrument counts to physical units, was performed at the National Institute of Standards and Technology (NIST) Synchrotron Ultraviolet Radiation Facility (SURF III) located in Gaithersburg, Maryland. This paper presents the results and derived accuracy of this radiometric calibration for the MEGS A, B, P, and SAM instruments, while the calibration of the ESP instrument is addressed by Didkovsky et al. In addition, solar measurements that were taken on 14 April 2008, during the NASA 36.240 sounding-rocket flight, are shown for the prototype EVE instruments.

  19. Using a simple apparatus to measure direct and diffuse photosynthetically active radiation at remote locations.

    PubMed

    Cruse, Michael J; Kucharik, Christopher J; Norman, John M

    2015-01-01

    Plant canopy interception of photosynthetically active radiation (PAR) drives carbon dioxide (CO2), water and energy cycling in the soil-plant-atmosphere system. Quantifying intercepted PAR requires accurate measurements of total incident PAR above canopies and direct beam and diffuse PAR components. While some regional data sets include these data, e.g. from Atmospheric Radiation Measurement (ARM) Program sites, they are not often applicable to local research sites because of the variable nature (spatial and temporal) of environmental variables that influence incoming PAR. Currently available instrumentation that measures diffuse and direct beam radiation separately can be cost prohibitive and require frequent adjustments. Alternatively, generalized empirical relationships that relate atmospheric variables and radiation components can be used but require assumptions that increase the potential for error. Our goal here was to construct and test a cheaper, highly portable instrument alternative that could be used at remote field sites to measure total, diffuse and direct beam PAR for extended time periods without supervision. The apparatus tested here uses a fabricated, solar powered rotating shadowband and other commercially available parts to collect continuous hourly PAR data. Measurements of total incident PAR had nearly a one-to-one relationship with total incident radiation measurements taken at the same research site by an unobstructed point quantum sensor. Additionally, measurements of diffuse PAR compared favorably with modeled estimates from previously published data, but displayed significant differences that were attributed to the important influence of rapidly changing local environmental conditions. The cost of the system is about 50% less than comparable commercially available systems that require periodic, but not continual adjustments. Overall, the data produced using this apparatus indicates that this instrumentation has the potential to support ecological research via a relatively inexpensive method to collect continuous measurements of total, direct beam and diffuse PAR in remote locations.

  20. Using a Simple Apparatus to Measure Direct and Diffuse Photosynthetically Active Radiation at Remote Locations

    PubMed Central

    Cruse, Michael J.; Kucharik, Christopher J.; Norman, John M.

    2015-01-01

    Plant canopy interception of photosynthetically active radiation (PAR) drives carbon dioxide (CO2), water and energy cycling in the soil-plant-atmosphere system. Quantifying intercepted PAR requires accurate measurements of total incident PAR above canopies and direct beam and diffuse PAR components. While some regional data sets include these data, e.g. from Atmospheric Radiation Measurement (ARM) Program sites, they are not often applicable to local research sites because of the variable nature (spatial and temporal) of environmental variables that influence incoming PAR. Currently available instrumentation that measures diffuse and direct beam radiation separately can be cost prohibitive and require frequent adjustments. Alternatively, generalized empirical relationships that relate atmospheric variables and radiation components can be used but require assumptions that increase the potential for error. Our goal here was to construct and test a cheaper, highly portable instrument alternative that could be used at remote field sites to measure total, diffuse and direct beam PAR for extended time periods without supervision. The apparatus tested here uses a fabricated, solar powered rotating shadowband and other commercially available parts to collect continuous hourly PAR data. Measurements of total incident PAR had nearly a one-to-one relationship with total incident radiation measurements taken at the same research site by an unobstructed point quantum sensor. Additionally, measurements of diffuse PAR compared favorably with modeled estimates from previously published data, but displayed significant differences that were attributed to the important influence of rapidly changing local environmental conditions. The cost of the system is about 50% less than comparable commercially available systems that require periodic, but not continual adjustments. Overall, the data produced using this apparatus indicates that this instrumentation has the potential to support ecological research via a relatively inexpensive method to collect continuous measurements of total, direct beam and diffuse PAR in remote locations. PMID:25668208

  1. Potential Effects of a Scenario Earthquake on the Economy of Southern California: Labor Market Exposure and Sensitivity Analysis to a Magnitude 7.8 Earthquake

    USGS Publications Warehouse

    Sherrouse, Benson C.; Hester, David J.; Wein, Anne M.

    2008-01-01

    The Multi-Hazards Demonstration Project (MHDP) is a collaboration between the U.S. Geological Survey (USGS) and various partners from the public and private sectors and academia, meant to improve Southern California's resiliency to natural hazards (Jones and others, 2007). In support of the MHDP objectives, the ShakeOut Scenario was developed. It describes a magnitude 7.8 (M7.8) earthquake along the southernmost 300 kilometers (200 miles) of the San Andreas Fault, identified by geoscientists as a plausible event that will cause moderate to strong shaking over much of the eight-county (Imperial, Kern, Los Angeles, Orange, Riverside, San Bernardino, San Diego, and Ventura) Southern California region. This report contains an exposure and sensitivity analysis of economic Super Sectors in terms of labor and employment statistics. Exposure is measured as the absolute counts of labor market variables anticipated to experience each level of Instrumental Intensity (a proxy measure of damage). Sensitivity is the percentage of the exposure of each Super Sector to each Instrumental Intensity level. The analysis concerns the direct effect of the scenario earthquake on economic sectors and provides a baseline for the indirect and interactive analysis of an input-output model of the regional economy. The analysis is inspired by the Bureau of Labor Statistics (BLS) report that analyzed the labor market losses (exposure) of a M6.9 earthquake on the Hayward fault by overlaying geocoded labor market data on Instrumental Intensity values. The method used here is influenced by the ZIP-code-level data provided by the California Employment Development Department (CA EDD), which requires the assignment of Instrumental Intensities to ZIP codes. The ZIP-code-level labor market data includes the number of business establishments, employees, and quarterly payroll categorized by the North American Industry Classification System. According to the analysis results, nearly 225,000 business establishments, or 44 percent of all establishments, would experience Instrumental Intensities between VII (7) and X (10). This represents more than 4 million employees earning over $45 billion in quarterly payroll. Over 57,000 of these establishments, employing over 1 million employees earning over $10 billion in quarterly payroll, would experience Instrumental Intensities of IX (9) or X (10). Based upon absolute counts and percentages, the Trade, Transportation, and Utilities Super Sector and the Manufacturing Super Sector are estimated to have the greatest exposure and sensitivity respectively. The Information and the Natural Resources and Mining Super Sectors are estimated to be the least impacted. Areas estimated to experience an Instrumental Intensity of X (10) account for approximately 3 percent of the region's labor market.

  2. UV Reconstruction Algorithm And Diurnal Cycle Variability

    NASA Astrophysics Data System (ADS)

    Curylo, Aleksander; Litynska, Zenobia; Krzyscin, Janusz; Bogdanska, Barbara

    2009-03-01

    UV reconstruction is a method for estimating surface UV from available actinometric and aerological measurements. UV reconstruction is necessary for the study of long-term UV change. A typical series of UV measurements is not longer than 15 years, which is too short for trend estimation. The essential problem in the reconstruction algorithm is a good parameterization of clouds. In our previous algorithm we used an empirical relation between the Cloud Modification Factor (CMF) in global radiation and the CMF in UV. The CMF is defined as the ratio between measured and modelled irradiances. Clear-sky irradiance was calculated with a solar radiative transfer model. In the proposed algorithm, the time variability of global radiation during the diurnal cycle is used as an additional source of information. To develop the improved reconstruction algorithm, relevant data from Legionowo (52.4°N, 21.0°E, 96 m a.s.l.), Poland, were collected with the following instruments: a NILU-UV multi-channel radiometer, a Kipp&Zonen pyranometer, and radiosonde profiles of ozone, humidity and temperature. The proposed algorithm has been used for reconstruction of UV at four Polish sites: Mikolajki, Kolobrzeg, Warszawa-Bielany and Zakopane since the early 1960s. Krzyscin's reconstruction of total ozone has been used in the calculations.
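
    In the notation implied by the abstract (the empirical function f linking the two CMFs is what the algorithm estimates), the reconstruction step can be summarised as:

```latex
% Cloud Modification Factor and the reconstruction step; E denotes irradiance
% and f is the empirical relation between the global-radiation and UV CMFs.
\mathrm{CMF} = \frac{E_{\text{measured}}}{E_{\text{clear-sky model}}}, \qquad
\mathrm{CMF}_{\mathrm{UV}} \approx f\!\left(\mathrm{CMF}_{\mathrm{global}}\right), \qquad
\mathrm{UV}_{\text{reconstructed}} = \mathrm{CMF}_{\mathrm{UV}} \cdot \mathrm{UV}_{\text{clear sky}}
```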

  3. Estimating moisture transport over oceans using space-based observations

    NASA Technical Reports Server (NTRS)

    Liu, W. Timothy; Tang, Wenqing

    2005-01-01

    The moisture transport integrated over the depth of the atmosphere (Θ) is estimated over oceans using satellite data. The transport is the product of the precipitable water and an equivalent velocity (u_e), which, by definition, is the depth-averaged wind velocity weighted by humidity. An artificial neural network is employed to construct a relation between the surface wind velocity measured by the spaceborne scatterometer and coincident u_e derived using humidity and wind profiles measured by rawinsondes and produced by reanalysis of operational numerical weather prediction (NWP). On the basis of this relation, Θ fields are produced over global tropical and subtropical oceans (40°N-40°S) at 0.25° latitude-longitude and twice-daily resolution from August 1999 to December 2003 using surface wind vectors from QuikSCAT and precipitable water from the Tropical Rainfall Measuring Mission. The derived u_e fields were found to capture the major temporal variability when compared with radiosonde measurements. The average error over global oceans, when compared with NWP data, was comparable with the instrument accuracy specification of space-based scatterometers. The global distribution exhibits the known characteristics of, and reveals more detailed variability than in, previous data.
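
    The quantities described above can be written compactly; the symbols below (precipitable water W, specific humidity q, wind u, surface and top pressures p_s and p_t) are illustrative rather than the authors' exact notation:

```latex
% Integrated moisture transport as described in the abstract: precipitable
% water W times the humidity-weighted, depth-averaged wind (equivalent
% velocity u_e), up to unit conventions.
W = \frac{1}{g}\int_{p_t}^{p_s} q\,dp, \qquad
\mathbf{u}_e = \frac{\int_{p_t}^{p_s} q\,\mathbf{u}\,dp}{\int_{p_t}^{p_s} q\,dp}, \qquad
\boldsymbol{\Theta} = W\,\mathbf{u}_e
```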

  4. Do dichromats see colours in this way? Assessing simulation tools without colorimetric measurements.

    PubMed

    Lillo Jover, Julio A; Álvaro Llorente, Leticia; Moreira Villegas, Humberto; Melnikova, Anna

    2016-11-01

    Simulcheck evaluates Colour Simulation Tools (CSTs; they transform colours to mimic those seen by colour vision deficients). Two CSTs (Variantor and Coblis) were used to determine whether the standard Simulcheck version (direct measurement based, DMB) can be substituted by another (RGB values based) that does not require sophisticated measurement instruments. Ten normal trichromats performed the two psychophysical tasks included in the Simulcheck method. The Pseudoachromatic Stimuli Identification task provided the h_uv (hue angle) values of the pseudoachromatic stimuli: colours seen as red or green by normal trichromats but as grey by colour deficient people. The Minimum Achromatic Contrast task was used to compute the L_R (relative luminance) values of the pseudoachromatic stimuli. The Simulcheck DMB version showed that Variantor was accurate in simulating protanopia, but neither Variantor nor Coblis was accurate in simulating deuteranopia. The Simulcheck RGB version provided accurate h_uv values, so this variable can be adequately estimated when a colorimeter (an expensive and unusual apparatus) is not available. In contrast, the inaccuracy of the L_R estimations provided by the Simulcheck RGB version makes it advisable to compute this variable from measurements performed with a photometer, a cheap and easy-to-find apparatus.

  5. cit: hypothesis testing software for mediation analysis in genomic applications.

    PubMed

    Millstein, Joshua; Chen, Gary K; Breton, Carrie V

    2016-08-01

    The challenges of successfully applying causal inference methods include: (i) satisfying underlying assumptions, (ii) limitations in the data/models accommodated by the software, and (iii) the low power of common multiple testing approaches. The causal inference test (CIT) is based on hypothesis testing rather than estimation, allowing the testable assumptions to be evaluated in the determination of statistical significance. A user-friendly software package provides P-values and optionally permutation-based FDR estimates (q-values) for potential mediators. It can handle single and multiple binary and continuous instrumental variables, binary or continuous outcome variables and adjustment covariates. Also, the permutation-based FDR option provides a non-parametric implementation. Simulation studies demonstrate the validity of the cit package and show a substantial advantage of permutation-based FDR over other common multiple testing strategies. The cit open-source R package is freely available from the CRAN website (https://cran.r-project.org/web/packages/cit/index.html) with embedded C++ code that utilizes the GNU Scientific Library, also freely available (http://www.gnu.org/software/gsl/). joshua.millstein@usc.edu Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
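
    The permutation-based FDR idea mentioned above can be illustrated generically: observed test statistics are compared with statistics recomputed on permuted (null) data, and the q-value at a threshold is the expected null count divided by the observed count. The sketch below is a simplified stand-alone illustration, not the cit package's API, and it omits refinements such as monotonicity enforcement.

```python
# Generic illustration of permutation-based FDR (q-value) estimation.
# Observed statistics are contrasted with statistics from permuted data;
# here the "permutation" statistics are drawn from the null for brevity.
import numpy as np

rng = np.random.default_rng(6)
n_tests, n_perm = 1000, 200
# Simulated z-statistics: 10% true signals, 90% null.
z_obs = np.concatenate([rng.normal(3, 1, 100), rng.normal(0, 1, 900)])
z_null = rng.normal(0, 1, (n_perm, n_tests))     # stand-in for permuted statistics

def q_values(obs, null):
    """q(t) = mean null count at or beyond |t| / observed count at or beyond |t|."""
    q = np.empty_like(obs)
    for i, t in enumerate(np.abs(obs)):
        false_pos = np.mean(np.sum(np.abs(null) >= t, axis=1))
        true_pos = np.sum(np.abs(obs) >= t)
        q[i] = min(1.0, false_pos / max(true_pos, 1))
    return q

q = q_values(z_obs, z_null)
print(f"tests with q < 0.05: {np.sum(q < 0.05)}")
```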

  6. Comparison of sea surface flux measured by instrumented aircraft and ship during SOFIA and SEMAPHORE experiments

    NASA Astrophysics Data System (ADS)

    Durand, Pierre; Dupuis, Hélène; Lambert, Dominique; Bénech, Bruno; Druilhet, Aimé; Katsaros, Kristina; Taylor, Peter K.; Weill, Alain

    1998-10-01

    Two major campaigns (Surface of the Oceans, Fluxes and Interactions with the Atmosphere (SOFIA) and Structure des Echanges Mer-Atmosphère, Propriétés des Hétérogénéités Océaniques: Recherche Expérimentale (SEMAPHORE)) devoted to the study of ocean-atmosphere interaction were conducted in 1992 and 1993, respectively, in the Azores region. Among the various platforms deployed, instrumented aircraft and ship allowed the measurement of the turbulent flux of sensible heat, latent heat, and momentum. From coordinated missions we can evaluate the sea surface fluxes from (1) bulk relations and mean measurements performed aboard the ship in the atmospheric surface layer and (2) turbulence measurements aboard aircraft, which allowed the flux profiles to be estimated through the whole atmospheric boundary layer and therefore to be extrapolated toward the sea surface level. Continuous ship fluxes were calculated with bulk coefficients deduced from inertial-dissipation measurements in the same experiments, whereas aircraft fluxes were calculated with eddy-correlation technique. We present a comparison between these two estimations. Although momentum flux agrees quite well, aircraft estimations of sensible and latent heat flux are lower than those of the ship. This result is surprising, since aircraft momentum flux estimates are often considered as much less accurate than scalar flux estimates. The various sources of errors on the aircraft and ship flux estimates are discussed. For sensible and latent heat flux, random errors on aircraft estimates, as well as variability of ship flux estimates, are lower than the discrepancy between the two platforms, whereas the momentum flux estimates cannot be considered as significantly different. Furthermore, the consequence of the high-pass filtering of the aircraft signals on the flux values is analyzed; it is weak at the lowest altitudes flown and cannot therefore explain the discrepancies between the two platforms but becomes considerable at upper levels in the boundary layer. From arguments linked to the imbalance of the surface energy budget, established during previous campaigns performed over land surfaces with aircraft, we conclude that aircraft heat fluxes are probably also underestimated over the sea.
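
    The eddy-correlation technique used for the aircraft fluxes reduces, at its core, to the covariance of vertical-wind fluctuations with fluctuations of the transported quantity over a flight leg. The sketch below uses synthetic 20 Hz data and nominal constants; real processing also involves the detrending and high-pass filtering whose influence on the fluxes the abstract discusses.

```python
# Minimal sketch of an eddy-correlation (eddy-covariance) flux computation:
# the kinematic flux is the covariance of vertical-wind fluctuations with
# fluctuations of the transported quantity over a leg. Synthetic 20 Hz data.
import numpy as np

rng = np.random.default_rng(7)
fs, minutes = 20, 10
n = fs * 60 * minutes
w = rng.normal(0, 0.6, n)                      # vertical wind (m/s)
q = 10 + 0.4 * w + rng.normal(0, 0.3, n)       # specific humidity (g/kg), correlated with w

w_prime = w - w.mean()                         # fluctuations about the leg mean
q_prime = q - q.mean()
kinematic_flux = np.mean(w_prime * q_prime)    # <w'q'> in (m/s)(g/kg)
rho_air = 1.2                                  # kg/m^3, nominal air density
latent_heat = 2.5e6                            # J/kg, latent heat of vaporization
LE = rho_air * latent_heat * kinematic_flux * 1e-3   # W/m^2 (g/kg -> kg/kg)
print(f"<w'q'> = {kinematic_flux:.3f} (m/s)(g/kg), LE ~ {LE:.0f} W/m^2")
```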

  7. Two essays on environmental and food security

    NASA Astrophysics Data System (ADS)

    Jeanty, Pierre Wilner

    The first essay of this dissertation, "Estimating non-market economic benefits of using biodiesel fuel: a stochastic double bounded approach", is an attempt to incorporate uncertainty into double bounded dichotomous choice contingent valuation. The double bounded approach, which entails asking respondents a follow-up question after they have answered a first question, has emerged as a means to increase efficiency in willingness to pay (WTP) estimates. However, several studies have found inconsistency between WTP estimates generated by the first and second questions. In this study, it is posited that this inconsistency is due to uncertainty facing the respondents when the second question is introduced. The author seeks to understand whether using a follow-up question in a stochastic format, which allows respondents to express uncertainty, would alleviate the inconsistency problem. In a contingent valuation survey to estimate non-market economic benefits of using more biodiesel vs. petroleum diesel fuel in an airshed encompassing Southeastern and Central Ohio, it is found that the gap between WTP estimates produced by the first and the second questions narrows when respondents are allowed to express uncertainty. The proposed stochastic follow-up approach yields more efficient WTP estimates than the conventional follow-up approach while maintaining an efficiency gain over the single bounded model. From a methodological standpoint, this study is distinguished from previous research by being the first to implement a double bounded contingent valuation survey with a stochastic follow-up question. In the second essay, "Analyzing the effects of civil wars and violent conflicts on food security in developing countries: an instrumental variable panel data approach", instrumental variable panel data techniques are applied to estimate the effects of civil wars and violent conflicts on food security in a sample of 73 developing countries from 1970 to 2002. The number of hungry people in developing countries has risen markedly in the past several years. Civil wars and violent conflicts have been associated with food insecurity. The study aims to provide empirical evidence as to whether the manifest increase in the number of hungry people can be ascribed to civil unrest. From a statistical standpoint, the results convincingly pinpoint the danger of using conventional panel data estimators when endogeneity is of the conventional simultaneous equation type, i.e. with respect to the idiosyncratic error term. From a policy viewpoint, it is found that, in general, civil wars and conflicts are detrimental to food security. However, the most vulnerable are countries unable to provide their citizens with the minimum dietary energy requirements, below which a country qualifies for food aid. Policies aiming at curbing food insecurity in developing countries need to take this difference into account.

  8. Youth Actuarial Risk Assessment Tool (Y-ARAT): The development of an actuarial risk assessment instrument for predicting general offense recidivism on the basis of police records.

    PubMed

    van der Put, Claudia E

    2014-06-01

    Estimating the risk for recidivism is important for many areas of the criminal justice system. In the present study, the Youth Actuarial Risk Assessment Tool (Y-ARAT) was developed for juvenile offenders based solely on police records, with the aim to estimate the risk of general recidivism among large groups of juvenile offenders by police officers without clinical expertise. On the basis of the Y-ARAT, juvenile offenders are classified into five risk groups based on (combinations of) 10 variables including different types of incidents in which the juvenile was a suspect, total number of incidents in which the juvenile was a suspect, total number of other incidents, total number of incidents in which co-occupants at the youth's address were suspects, gender, and age at first incident. The Y-ARAT was developed on a sample of 2,501 juvenile offenders and validated on another sample of 2,499 juvenile offenders, showing moderate predictive accuracy (area under the receiver-operating-characteristic curve = .73), with little variation between the construction and validation sample. The predictive accuracy of the Y-ARAT was considered sufficient to justify its use as a screening instrument for the police. © The Author(s) 2013.
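
    The reported discrimination (area under the ROC curve = .73) can be computed from risk scores and observed recidivism via the Mann-Whitney relationship, as sketched below on simulated scores chosen to give roughly similar discrimination; this is not the Y-ARAT data.

```python
# Small sketch of the AUC (c-statistic): the probability that a randomly
# chosen recidivist scores higher than a randomly chosen non-recidivist.
# Simulated scores and outcomes for illustration only.
import numpy as np

rng = np.random.default_rng(8)
n = 2500
recid = rng.binomial(1, 0.3, n)                       # observed recidivism
score = rng.normal(0, 1, n) + 0.85 * recid            # tool's risk score

pos, neg = score[recid == 1], score[recid == 0]
# Pairwise-comparison form of the Mann-Whitney U statistic (ties count half).
auc = (np.mean(pos[:, None] > neg[None, :])
       + 0.5 * np.mean(pos[:, None] == neg[None, :]))
print(f"AUC = {auc:.2f}")
```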

  9. NASA Instrument Cost/Schedule Model

    NASA Technical Reports Server (NTRS)

    Habib-Agahi, Hamid; Mrozinski, Joe; Fox, George

    2011-01-01

    NASA's Office of Independent Program and Cost Evaluation (IPCE) has established a number of initiatives to improve its cost and schedule estimating capabilities. One of these initiatives has resulted in the JPL-developed NASA Instrument Cost Model (NICM). NICM is a cost and schedule estimator that contains: a system-level cost estimation tool; a subsystem-level cost estimation tool; a database of cost and technical parameters of over 140 previously flown remote sensing and in-situ instruments; a schedule estimator; a set of rules to estimate cost and schedule by life cycle phases (B/C/D); and a novel tool for developing joint probability distributions for cost and schedule risk (Joint Confidence Level (JCL)). This paper describes the development and use of NICM, including the data normalization processes, data mining methods (cluster analysis, principal components analysis, regression analysis and bootstrap cross validation), the estimating equations themselves and a demonstration of the NICM tool suite.

  10. Use of the densiometer to estimate density of forest canopy on permanent sample plots.

    Treesearch

    Gerald S. Strickler

    1959-01-01

    An instrument known as the spherical densiometer has been found adaptable to permanent-plot estimates of relative canopy closure or density in forest and range ecological studies. The device is more compact and simpler to use than previous ocular-type instruments. Because the instrument has a curved reflecting surface which results in observations from lateral as well...

  11. Analyzing the impact of public transit usage on obesity.

    PubMed

    She, Zhaowei; King, Douglas M; Jacobson, Sheldon H

    2017-06-01

    The objective of this paper is to estimate the impact of county-level public transit usage on obesity prevalence in the United States and assess the potential for public transit usage as an intervention for obesity. This study adopts an instrumental variable regression approach to implicitly control for potential selection bias due to possible differences in commuting preferences among obese and non-obese populations. United States health data from the 2009 Behavioral Risk Factor Surveillance System and transportation data from the 2009 National Household Travel Survey are aggregated and matched at the county level. County-level public transit accessibility and vehicle ownership rates are chosen as instrumental variables to implicitly control for unobservable commuting preferences. The results of this instrumental variable regression analysis suggest that a one percent increase in county population usage of public transit is associated with a 0.221 percent decrease in county population obesity prevalence at the α=0.01 statistical significance level, when commuting preferences, amount of non-travel physical activity, education level, health resources, and distribution of income are fixed. Hence, this study provides empirical support for the effectiveness of encouraging public transit usage as an intervention strategy for obesity. Copyright © 2017 Elsevier Inc. All rights reserved.

  12. An inexpensive frequency-modulated (FM) audio monitor of time-dependent analog parameters.

    PubMed

    Langdon, R B; Jacobs, R S

    1980-02-01

    The standard method for quantification and presentation of an experimental variable in real time is the use of visual display on the ordinate of an oscilloscope screen or chart recorder. This paper describes a relatively simple electronic circuit, using commercially available and inexpensive integrated circuits (ICs), which generates an audible tone, the pitch of which varies in proportion to a running variable of interest. This device, which we call an "Audioscope," can accept as input the monitor output from any instrument that expresses an experimental parameter as a dc voltage. The Audioscope is particularly useful in implanting microelectrodes intracellularly. It may also function to mediate the first step in data recording on magnetic tape, and/or data analysis and reduction by electronic circuitry. We estimate that this device can be built, with two-channel capability, for less than $50, and in less than 10 hr by an experienced electronics technician.
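
    A software analogue of the voltage-to-pitch mapping performed by the circuit is sketched below: a tone is synthesised whose instantaneous frequency tracks an input voltage. The sample rate, base frequency, and gain are arbitrary illustrative choices; the published device is an analog circuit, not software.

```python
# Software analogue of the Audioscope's voltage-to-pitch mapping: synthesize
# a tone whose instantaneous frequency is proportional to a slowly varying
# input voltage (e.g., an amplifier's monitor output). Illustrative only.
import numpy as np

fs = 8000                                   # audio sample rate (Hz)
t = np.arange(0, 5, 1 / fs)                 # 5 s of signal
voltage = -0.07 + 0.05 * (t > 2.5)          # toy "membrane potential" step (V)

base_freq, gain = 400.0, 4000.0             # Hz, and Hz per volt (arbitrary)
inst_freq = base_freq + gain * (voltage - voltage.min())
phase = 2 * np.pi * np.cumsum(inst_freq) / fs   # integrate frequency to phase
tone = np.sin(phase)                        # pitch jumps audibly at the step
print(f"frequency range: {inst_freq.min():.0f}-{inst_freq.max():.0f} Hz")
```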

  13. The causal effects of home care use on institutional long-term care utilization and expenditures.

    PubMed

    Guo, Jing; Konetzka, R Tamara; Manning, Willard G

    2015-03-01

    Limited evidence exists on whether expanding home care saves money overall or how much institutional long-term care can be reduced. This paper estimates the causal effect of Medicaid-financed home care services on the costs and utilization of institutional long-term care using Medicaid claims data. A unique instrumental variable was applied to address the potential bias caused by omitted variables or reverse effect of institutional care use. We find that the use of Medicaid-financed home care services significantly reduced but only partially offset utilization and Medicaid expenditures on nursing facility services. A $1000 increase in Medicaid home care expenditures avoided 2.75 days in nursing facilities and reduced annual Medicaid nursing facility costs by $351 among people over age 65 when selection bias is addressed. Failure to address selection biases would misestimate the substitution and offset effects. Copyright © 2015 John Wiley & Sons, Ltd.

  14. Spatial variability of concentrations of chlorophyll a, dissolved organic matter and suspended particles in the surface layer of the Kara Sea in September 2011 from lidar data

    NASA Astrophysics Data System (ADS)

    Pelevin, V. V.; Zavjalov, P. O.; Belyaev, N. A.; Konovalov, B. V.; Kravchishina, M. D.; Mosharov, S. A.

    2017-01-01

    The article presents results of underway remote laser sensing of the surface water layer in continuous automatic mode using the UFL-9 fluorescent lidar onboard the R/V Akademik Mstislav Keldysh during cruise 59 in the Kara Sea in 2011. The description of the lidar, the approach to interpreting seawater fluorescence data, and certain methodical aspects of instrument calibration and measurement are presented. Calibration of the lidar is based on laboratory analysis of water samples taken from the sea surface during the cruise. Spatial distribution of chlorophyll a, total organic carbon and suspended matter concentrations in the upper quasi-homogeneous layer are mapped and the characteristic scales of the variability are estimated. Some dependencies between the patchiness of the upper water layer and the atmospheric forcing and freshwater runoff are shown.

  15. Variability in Global Top-of-Atmosphere Shortwave Radiation Between 2000 and 2005

    NASA Technical Reports Server (NTRS)

    Loeb, Norman G.; Wielicki, Bruce A.; Rose, Fred G.; Doelling, David R.

    2007-01-01

    Measurements from various instruments and analysis techniques are used to directly compare changes in Earth-atmosphere shortwave (SW) top-of-atmosphere (TOA) radiation between 2000 and 2005. Included in the comparison are estimates of TOA reflectance variability from published ground-based Earthshine observations and from new satellite-based CERES, MODIS and ISCCP results. The ground-based Earthshine data show an order of magnitude more variability in annual mean SW TOA flux than either CERES or ISCCP, while ISCCP and CERES SW TOA flux variability is consistent to within 40%. Most of the variability in CERES TOA flux is shown to be dominated by variations in global cloud fraction, as observed using coincident CERES and MODIS data. Idealized Earthshine simulations of TOA SW radiation variability for a lunar-based observer show far less variability than the ground-based Earthshine observations, but are still a factor of 4-5 times more variable than global CERES SW TOA flux results. Furthermore, while CERES global albedos exhibit a well-defined seasonal cycle each year, the seasonal cycle in the lunar Earthshine reflectance simulations is highly variable and out-of-phase from one year to the next. Radiative transfer model (RTM) approaches that use imager cloud and aerosol retrievals reproduce most of the change in SW TOA radiation observed in broadband CERES data. However, assumptions used to represent the spectral properties of the atmosphere, clouds, aerosols and surface in the RTM calculations can introduce significant uncertainties in annual mean changes in regional and global SW TOA flux.

  16. Variability in global top-of-atmosphere shortwave radiation between 2000 and 2005

    NASA Astrophysics Data System (ADS)

    Loeb, Norman G.; Wielicki, Bruce A.; Rose, Fred G.; Doelling, David R.

    2007-02-01

    Measurements from various instruments and analysis techniques are used to directly compare changes in Earth-atmosphere shortwave (SW) top-of-atmosphere (TOA) radiation between 2000 and 2005. Included in the comparison are estimates of TOA reflectance variability from published ground-based Earthshine observations and from new satellite-based CERES, MODIS and ISCCP results. The ground-based Earthshine data show an order of magnitude more variability in annual mean SW TOA flux than either CERES or ISCCP, while ISCCP and CERES SW TOA flux variability is consistent to 40%. Most of the variability in CERES TOA flux is shown to be dominated by variations in global cloud fraction, as observed using coincident CERES and MODIS data. Idealized Earthshine simulations of TOA SW radiation variability for a lunar-based observer show far less variability than the ground-based Earthshine observations, but are still a factor of 4-5 times more variable than global CERES SW TOA flux results. Furthermore, while CERES global albedos exhibit a well-defined seasonal cycle each year, the seasonal cycle in the lunar Earthshine reflectance simulations is highly variable and out-of-phase from one year to the next. Radiative transfer model (RTM) approaches that use imager cloud and aerosol retrievals reproduce most of the change in SW TOA radiation observed in broadband CERES data. However, assumptions used to represent the spectral properties of the atmosphere, clouds, aerosols and surface in the RTM calculations can introduce significant uncertainties in annual mean changes in regional and global SW TOA flux.

  17. Genetic markers as instrumental variables.

    PubMed

    von Hinke, Stephanie; Davey Smith, George; Lawlor, Debbie A; Propper, Carol; Windmeijer, Frank

    2016-01-01

    The use of genetic markers as instrumental variables (IV) is receiving increasing attention from economists, statisticians, epidemiologists and social scientists. Although IV is commonly used in economics, the appropriate conditions for the use of genetic variants as instruments have not been well defined. The increasing availability of biomedical data, however, makes understanding of these conditions crucial to the successful use of genotypes as instruments. We combine the econometric IV literature with that from genetic epidemiology, and discuss the biological conditions and IV assumptions within the statistical potential outcomes framework. We review this in the context of two illustrative applications. Copyright © 2015 The Authors. Published by Elsevier B.V. All rights reserved.

  18. Statistical Analysis for Multisite Trials Using Instrumental Variables with Random Coefficients

    ERIC Educational Resources Information Center

    Raudenbush, Stephen W.; Reardon, Sean F.; Nomi, Takako

    2012-01-01

    Multisite trials can clarify the average impact of a new program and the heterogeneity of impacts across sites. Unfortunately, in many applications, compliance with treatment assignment is imperfect. For these applications, we propose an instrumental variable (IV) model with person-specific and site-specific random coefficients. Site-specific IV…

  19. Lessons Learned from AIRS: Improved Determination of Surface and Atmospheric Temperatures Using Only Shortwave AIRS Channels

    NASA Technical Reports Server (NTRS)

    Susskind, Joel

    2011-01-01

    This slide presentation reviews the use of shortwave channels available to the Atmospheric Infrared Sounder (AIRS) to improve the determination of surface and atmospheric temperatures. The AIRS instrument is compared with the Infrared Atmospheric Sounding Interferometer (IASI) on board the MetOp-A satellite. The objectives of the AIRS/AMSU were to (1) provide real-time observations to improve numerical weather prediction via data assimilation, (2) provide observations to measure and explain interannual variability and trends, and (3) use AIRS product error estimates to allow quality control (QC) optimized for each application. Successive versions of the AIRS retrieval methodology have shown significant improvement.

  20. How does searching for health information on the Internet affect individuals' demand for health care services?

    PubMed

    Suziedelyte, Agne

    2012-11-01

    The emergence of the Internet made health information, which previously was almost exclusively available to health professionals, accessible to the general public. Access to health information on the Internet is likely to affect individuals' health care related decisions. The aim of this analysis is to determine how health information that people obtain from the Internet affects their demand for health care. I use a novel data set, the U.S. Health Information National Trends Survey (2003-07), to answer this question. The causal variable of interest is a binary variable that indicates whether or not an individual has recently searched for health information on the Internet. Health care utilization is measured by an individual's number of visits to a health professional in the past 12 months. An individual's decision to use the Internet to search for health information is likely to be correlated to other variables that can also affect his/her demand for health care. To separate the effect of Internet health information from other confounding variables, I control for a number of individual characteristics and use the instrumental variable estimation method. As an instrument for Internet health information, I use U.S. state telecommunication regulations that are shown to affect the supply of Internet services. I find that searching for health information on the Internet has a positive, relatively large, and statistically significant effect on an individual's demand for health care. This effect is larger for the individuals who search for health information online more frequently and people who have health care coverage. Among cancer patients, the effect of Internet health information seeking on health professional visits varies by how long ago they were diagnosed with cancer. Thus, the Internet is found to be a complement to formal health care rather than a substitute for health professional services. Copyright © 2012 Elsevier Ltd. All rights reserved.

  1. Simulation Study of a Follow-on Gravity Mission to GRACE

    NASA Technical Reports Server (NTRS)

    Loomis, Bryant D.; Nerem, R. S.; Luthcke, Scott B.

    2012-01-01

    The Gravity Recovery and Climate Experiment (GRACE) has been providing monthly estimates of the Earth's time-variable gravity field since its launch in March 2002. The GRACE gravity estimates are used to study temporal mass variations on global and regional scales, which are largely caused by a redistribution of water mass in the Earth system. The accuracy of the GRACE gravity fields is primarily limited by the satellite-to-satellite range-rate measurement noise, accelerometer errors, attitude errors, orbit errors, and temporal aliasing caused by unmodeled high-frequency variations in the gravity signal. Recent work by Ball Aerospace and Technologies Corp., Boulder, CO, has resulted in the successful development of an interferometric laser ranging system to specifically address the limitations of the K-band microwave ranging system that provides the satellite-to-satellite measurements for the GRACE mission. Full numerical simulations are performed for several possible configurations of a GRACE Follow-On (GFO) mission to determine whether a future satellite gravity recovery mission equipped with a laser ranging system will provide better estimates of time-variable gravity, thus benefiting many areas of Earth systems research. The laser ranging system improves the range-rate measurement precision to approximately 0.6 nm/s, as compared to approximately 0.2 μm/s for the GRACE K-band microwave ranging instrument. Four different mission scenarios are simulated to investigate the effect of the better instrument at two different altitudes. The first pair of simulated missions is flown at GRACE altitude (approx. 480 km) assuming on-board accelerometers with the same noise characteristics as those currently used for GRACE. The second pair of missions is flown at an altitude of approx. 250 km, which requires a drag-free system to prevent satellite re-entry. In addition to allowing a lower satellite altitude, the drag-free system also reduces the errors associated with the accelerometer. All simulated mission scenarios assume a two-satellite co-orbiting pair similar to GRACE in a near-polar, near-circular orbit. A method for local time-variable gravity recovery through mass concentration blocks (mascons) is used to form simulated gravity estimates for Greenland and the Amazon region for three GFO configurations and GRACE. Simulation results show that the increased precision of the laser does not improve gravity estimation when flown with on-board accelerometers at the same altitude and spacecraft separation as GRACE, even when time-varying background models are not included. This study also shows that only modest improvement is realized for the best-case scenario (laser, low-altitude, drag-free) as compared to GRACE due to temporal aliasing errors. These errors are caused by high-frequency variations in the hydrology signal and imperfections in the atmospheric, oceanographic, and tidal models which are used to remove unwanted signal. This work concludes that applying the updated technologies alone will not immediately advance the accuracy of the gravity estimates. If the scientific objectives of a GFO mission require more accurate gravity estimates, then future work should focus on improvements in the geophysical models, and ways in which the mission design or data processing could reduce the effects of temporal aliasing.

  2. Instrument Drift Uncertainties and the Long-Term TOMS/SBUV Total Ozone Record

    NASA Technical Reports Server (NTRS)

    Stolarski, Richard S.; Frith, Stacey

    2005-01-01

    Long-term climate records from satellites are often constructed from the measurements of a sequence of instruments launched at different times. Each of these instruments is calibrated prior to launch. After launch they are subjected to potential offsets and slow drifts in calibration. We illustrate these issues in the construction of a merged total ozone record from two TOMS and three SBUV instruments. This record extends from late 1978 through the present. The question is "How good are these records?". We have examined the uncertainty in determining the relative calibration of two instruments during an overlap period in their measurements. When comparing a TOMS instrument, such as that on Nimbus 7, with an SBUV instrument, also on Nimbus 7, we find systematic differences and random differences. We have combined these findings with estimates of individual instrument drift into a Monte Carlo uncertainty propagation model. We estimate an instrument drift uncertainty of a little larger than 1 percent per decade over the 25-year history of the TOMS/SBUV measurements. We make an independent estimate of the drift uncertainty in the ground-based network of total ozone measurements and find it to be of similar, but slightly smaller magnitude. The implications of these uncertainties for trend and recovery determination will be discussed.
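
    As a rough illustration of the approach described above, the sketch below propagates hypothetical per-instrument drift and splice-offset uncertainties through a merged multi-instrument record by Monte Carlo; the segment lengths and sigma values are illustrative assumptions, not the paper's numbers.

```python
import numpy as np

# Hedged Monte Carlo sketch of drift-uncertainty propagation for a merged record
# built from several overlapping instruments. All numbers are illustrative only.
rng = np.random.default_rng(1)
n_sims = 20000
record_years = 25.0
segments = [6.0, 5.0, 7.0, 7.0]    # hypothetical instrument lifetimes (years)
drift_sigma = 0.5                  # per-instrument drift uncertainty, %/decade (1-sigma)
offset_sigma = 0.3                 # uncertainty of each overlap adjustment, % (1-sigma)

end_errors = np.zeros(n_sims)
for i in range(n_sims):
    err = 0.0
    for k, length in enumerate(segments):
        err += rng.normal(0.0, drift_sigma) * (length / 10.0)  # drift accumulated over segment
        if k > 0:
            err += rng.normal(0.0, offset_sigma)                # offset error at each splice
    end_errors[i] = err

# Crude proxy for trend uncertainty: spread of the end-to-end error per decade.
trend_uncertainty = np.std(end_errors) / (record_years / 10.0)
print(f"simulated merged-record drift uncertainty: {trend_uncertainty:.2f} %/decade")
```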

  3. Does Foot Anthropometry Predict Metabolic Cost During Running?

    PubMed

    van Werkhoven, Herman; Piazza, Stephen J

    2017-10-01

    Several recent investigations have linked running economy to heel length, with shorter heels being associated with less metabolic energy consumption. It has been hypothesized that shorter heels require larger plantar flexor muscle forces, thus increasing tendon energy storage and reducing metabolic cost. The goal of this study was to investigate this possible mechanism for metabolic cost reduction. Fifteen male subjects ran at 16 km·h⁻¹ on a treadmill and subsequently on a force-plate instrumented runway. Measurements of oxygen consumption, kinematics, and ground reaction forces were collected. Correlational analyses were performed between oxygen consumption and anthropometric and kinetic variables associated with the ankle and foot. Correlations were also computed between kinetic variables (peak joint moment and peak tendon force) and heel length. Estimated peak Achilles tendon force normalized to body weight was found to be strongly correlated with heel length normalized to body height (r = -.751, p = .003). Neither heel length nor any other measured or calculated variable was correlated with oxygen consumption, however. Subjects with shorter heels experienced larger Achilles tendon forces, but these forces were not associated with reduced metabolic cost. No other anthropometric or kinetic variables considered explained the variance in metabolic cost across individuals.

  4. The Plasma Instrument for Magnetic Sounding (PIMS) on The Europa Clipper Mission

    NASA Astrophysics Data System (ADS)

    Westlake, Joseph H.; McNutt, Ralph L.; Kasper, Justin C.; Case, Anthony W.; Grey, Matthew P.; Kim, Cindy K.; Battista, Corina C.; Rymer, Abigail; Paty, Carol S.; Jia, Xianzhe; Stevens, Michael L.; Khurana, Krishan; Kivelson, Margaret G.; Slavin, James A.; Korth, Haje H.; Smith, Howard T.; Krupp, Norbert; Roussos, Elias; Saur, Joachim

    2016-10-01

    The Europa Clipper mission is equipped with a sophisticated suite of 9 instruments to study Europa's interior and ocean, geology, chemistry, and habitability from a Jupiter orbiting spacecraft. The Plasma Instrument for Magnetic Sounding (PIMS) on Europa Clipper is a Faraday Cup based plasma instrument whose heritage dates back to the Voyager spacecraft. PIMS will measure the plasma that populates Jupiter's magnetosphere and Europa's ionosphere. The science goals of PIMS are to: 1) estimate the ocean salinity and thickness by determining Europa's magnetic induction response, corrected for plasma contributions; 2) assess mechanisms responsible for weathering and releasing material from Europa's surface into the atmosphere and ionosphere; and 3) understand how Europa influences its local space environment and Jupiter's magnetosphere and vice versa.Europa is embedded in a complex Jovian magnetospheric plasma, which rotates with the tilted planetary field and interacts dynamically with Europa's ionosphere affecting the magnetic induction signal. Plasma from Io's temporally varying torus diffuses outward and mixes with the charged particles in Europa's own torus producing highly variable plasma conditions at Europa. PIMS works in conjunction with the Interior Characterization of Europa using Magnetometry (ICEMAG) investigation to probe Europa's subsurface ocean. This investigation exploits currents induced in Europa's interior by the moon's exposure to variable magnetic fields in the Jovian system to infer properties of Europa's subsurface ocean such as its depth, thickness, and conductivity. This technique was successfully applied to Galileo observations and demonstrated that Europa indeed has a subsurface ocean. While these Galileo observations contributed to the renewed interest in Europa, due to limitations in the observations the results raised major questions that remain unanswered. PIMS will greatly refine our understanding of Europa's global liquid ocean by accounting for contributions to the magnetic field from plasma currents.In this presentation we describe the principles of PIMS operations, detail the PIMS science goals, and discuss how to assess Europa's induction response.

  5. The Plasma Instrument for Magnetic Sounding (PIMS) onboard the Europa Clipper Mission

    NASA Astrophysics Data System (ADS)

    Westlake, Joseph H.; McNutt, Ralph L.; Kasper, Justin C.; Rymer, Abigail; Case, Anthony; Battista, Corina; Cochrane, Corey; Coren, David; Crew, Alexander; Grey, Matthew; Jia, Xianzhe; Khurana, Krishan; Kim, Cindy; Kivelson, Margaret G.; Korth, Haje; Krupp, Norbert; Paty, Carol; Roussos, Elias; Stevens, Michael; Slavin, James A.; Smith, Howard T.; Saur, Joachim

    2017-10-01

    Europa is embedded in a complex Jovian magnetospheric plasma, which rotates with the tilted planetary field and interacts dynamically with Europa’s ionosphere affecting the magnetic induction signal. Plasma from Io’s temporally varying torus diffuses outward and mixes with the charged particles in Europa’s own torus producing highly variable plasma conditions. Onboard the Europa Clipper spacecraft the Plasma Instrument for Magnetic Sounding (PIMS) works in conjunction with the Interior Characterization of Europa using Magnetometry (ICEMAG) investigation to probe Europa’s subsurface ocean. This investigation exploits currents induced in Europa’s interior by the moon’s exposure to variable magnetic fields in the Jovian system to infer properties of Europa’s subsurface ocean such as its depth, thickness, and conductivity. This technique was successfully applied to Galileo observations and demonstrated that Europa indeed has a subsurface ocean. While these Galileo observations contributed to the renewed interest in Europa, due to limitations in the observations the results raised major questions that remain unanswered. PIMS will greatly refine our understanding of Europa’s global liquid ocean by accounting for contributions to the magnetic field from plasma currents.The Europa Clipper mission is equipped with a sophisticated suite of 9 instruments to study Europa's interior and ocean, geology, chemistry, and habitability from a Jupiter orbiting spacecraft. PIMS on Europa Clipper is a Faraday Cup based plasma instrument whose heritage dates back to the Voyager spacecraft. PIMS will measure the plasma that populates Jupiter’s magnetosphere and Europa’s ionosphere. The science goals of PIMS are to: 1) estimate the ocean salinity and thickness by determining Europa’s magnetic induction response, corrected for plasma contributions; 2) assess mechanisms responsible for weathering and releasing material from Europa’s surface into the atmosphere and ionosphere; and 3) understand how Europa influences its local space environment and Jupiter’s magnetosphere and vice versa.In this presentation we describe the principles of PIMS operations, detail the PIMS science goals, and discuss how to assess Europa's induction response.

  6. The Plasma Instrument for Magnetic Sounding (PIMS) on The Europa Clipper Mission

    NASA Astrophysics Data System (ADS)

    Westlake, J. H.; McNutt, R. L., Jr.; Kasper, J. C.; Battista, C.; Case, A. W.; Cochrane, C.; Grey, M.; Jia, X.; Kivelson, M.; Kim, C.; Korth, H.; Khurana, K. K.; Krupp, N.; Paty, C. S.; Roussos, E.; Rymer, A. M.; Stevens, M. L.; Slavin, J. A.; Smith, H. T.; Saur, J.; Coren, D.

    2017-12-01

    The Europa Clipper mission is equipped with a sophisticated suite of 9 instruments to study Europa's interior and ocean, geology, chemistry, and habitability from a Jupiter orbiting spacecraft. The Plasma Instrument for Magnetic Sounding (PIMS) on Europa Clipper is a Faraday Cup based plasma instrument whose heritage dates back to the Voyager spacecraft. PIMS will measure the plasma that populates Jupiter's magnetosphere and Europa's ionosphere. The science goals of PIMS are to: 1) estimate the ocean salinity and thickness by determining Europa's magnetic induction response, corrected for plasma contributions; 2) assess mechanisms responsible for weathering and releasing material from Europa's surface into the atmosphere and ionosphere; and 3) understand how Europa influences its local space environment and Jupiter's magnetosphere and vice versa. Europa is embedded in a complex Jovian magnetospheric plasma, which rotates with the tilted planetary field and interacts dynamically with Europa's ionosphere affecting the magnetic induction signal. Plasma from Io's temporally varying torus diffuses outward and mixes with the charged particles in Europa's own torus producing highly variable plasma conditions at Europa. PIMS works in conjunction with the Interior Characterization of Europa using Magnetometry (ICEMAG) investigation to probe Europa's subsurface ocean. This investigation exploits currents induced in Europa's interior by the moon's exposure to variable magnetic fields in the Jovian system to infer properties of Europa's subsurface ocean such as its depth, thickness, and conductivity. This technique was successfully applied to Galileo observations and demonstrated that Europa indeed has a subsurface ocean. While these Galileo observations contributed to the renewed interest in Europa, due to limitations in the observations the results raised major questions that remain unanswered. PIMS will greatly refine our understanding of Europa's global liquid ocean by accounting for contributions to the magnetic field from plasma currents. In this presentation we describe the principles of PIMS operations, detail the PIMS science goals, and discuss how to assess Europa's induction response.

  7. Promoting motivation through mode of instruction: The relationship between use of affective teaching techniques and motivation to learn science

    NASA Astrophysics Data System (ADS)

    Sanchez Rivera, Yamil

    The purpose of this study is to add to what we know about the affective domain and to create a valid instrument for future studies. The Motivation to Learn Science (MLS) Inventory is based on Krathwohl's Taxonomy of Affective Behaviors (Krathwohl et al., 1964). The results of the Exploratory Factor Analysis (EFA) and Confirmatory Factor Analysis (CFA) demonstrated that the MLS Inventory is a valid and reliable instrument. The MLS Inventory is a uni-dimensional instrument composed of 9 items with convergent validity (no divergence). The instrument had a high Cronbach's alpha value of .898 in the EFA analysis and .919 in the CFA analysis. Factor loadings on the 9 items ranged from .617 to .800. Standardized regression weights ranged from .639 to .835 in the CFA analysis. Various indices (RMSEA = .033; NFI = .987; GFI = .985; CFI = 1.000) demonstrated a good fit of the proposed model. Hierarchical linear modeling was used to statistically analyze data in which students' motivation to learn science scores (level 1) were nested within teachers (level 2). The analysis was geared toward identifying whether teachers' use of affective behavior (a level-2 classroom variable) was significantly related to students' MLS scores (the level-1 criterion variable). Model testing proceeded in three phases: an intercept-only model, a means-as-outcome model, and a random-regression coefficient model. The intercept-only model revealed an intra-class correlation coefficient of .224 with an estimated reliability of .726; the data thus suggested that only 22.4% of the variance in MLS scores is between classes and the remaining 77.6% is at the student level. Due to the significant variance in MLS scores, χ²(62.756, p < .0001), teachers' TAB scores were added as a level-2 predictor. The regression coefficient was non-significant (p > .05). Therefore, the teachers' self-reported use of affective behaviors was not a significant predictor of students' motivation to learn science.
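
    The intercept-only (null) multilevel model and intra-class correlation reported above can be sketched as follows on synthetic data; the column names, group sizes, and variance components are hypothetical, and the real analysis used dedicated HLM software with additional model phases not reproduced here.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hedged sketch of an intercept-only (null) multilevel model and the resulting
# intra-class correlation; synthetic data with hypothetical column names.
rng = np.random.default_rng(2)
n_teachers, n_students = 30, 25
teacher_effect = rng.normal(0.0, 0.5, n_teachers)   # between-class variance component
rows = []
for t in range(n_teachers):
    scores = 3.0 + teacher_effect[t] + rng.normal(0.0, 0.9, n_students)
    rows += [{"teacher": t, "mls": s} for s in scores]
df = pd.DataFrame(rows)

null_model = smf.mixedlm("mls ~ 1", df, groups=df["teacher"]).fit()
tau00 = null_model.cov_re.iloc[0, 0]   # between-class variance
sigma2 = null_model.scale              # within-class (student-level) variance
icc = tau00 / (tau00 + sigma2)
print(f"ICC = {icc:.3f}  (share of variance between classes)")
```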

  8. Maternal employment and the health of low-income young children.

    PubMed

    Gennetian, Lisa A; Hill, Heather D; London, Andrew S; Lopoo, Leonard M

    2010-05-01

    This study examines whether maternal employment affects the health status of low-income, elementary-school-aged children using instrumental variables estimation and experimental data from a welfare-to-work program implemented in the early 1990s. Maternal report of child health status is predicted as a function of exogenous variation in maternal employment associated with random assignment to the experimental group. IV estimates show a modest adverse effect of maternal employment on children's health. Making use of data from another welfare-to-work program we propose that any adverse effect on child health may be tempered by increased family income and access to public health insurance coverage, findings with direct relevance to a number of current policy discussions. In a secondary analysis using fixed effects techniques on longitudinal survey data collected in 1998 and 2001, we find a comparable adverse effect of maternal employment on child health that supports the external validity of our primary result.

  9. Hospital ownership and performance: evidence from stroke and cardiac treatment in Taiwan.

    PubMed

    Lien, Hsien-Ming; Chou, Shin-Yi; Liu, Jin-Tan

    2008-09-01

    This paper compares program expenditure and treatment quality of stroke and cardiac patients between 1997 and 2000 across hospitals of various ownership types in Taiwan. Because Taiwan implemented national health insurance in 1995, the analysis is immune from problems arising from the complex setting of the U.S. health care market, such as segmentation of insurance status or multiple payers. Because patients may select the hospitals to which they are admitted based on their observed and unobserved characteristics, we employ instrumental variable (IV) estimation to account for the endogeneity of ownership status. The IV estimates show that patients admitted to non-profit hospitals receive better-quality care, whether measured by 1- or 12-month mortality rates. In terms of treatment expenditure, our results indicate no difference between non-profits' and for-profits' index admission expenditures, and at most 10% higher long-term expenditure for patients admitted to non-profits than to for-profits.

  10. Effects of urban sprawl on obesity.

    PubMed

    Zhao, Zhenxiang; Kaestner, Robert

    2010-12-01

    In this paper, we examine the effect of changes in population density-urban sprawl-between 1970 and 2000 on BMI and obesity of residents in metropolitan areas in the U.S. We address the possible endogeneity of population density by using a two-step instrumental variables approach. We exploit the plausibly exogenous variation in population density caused by the expansion of the U.S. Interstate Highway System, which largely followed the original 1947 plan for the Interstate Highway System. We find a negative association between population density and obesity, and estimates are robust across a wide range of specifications. Estimates indicate that if the average metropolitan area had not experienced the decline in the proportion of population living in dense areas over the last 30 years, the rate of obesity would have been reduced by approximately 13%. Copyright © 2010 Elsevier B.V. All rights reserved.

  11. Non-invasive heart rate monitoring system using giant magneto resistance sensor.

    PubMed

    Kalyan, Kubera; Chugh, Vinit Kumar; Anoop, C S

    2016-08-01

    A simple heart rate (HR) monitoring system designed and developed using the Giant Magneto-Resistance (GMR) sensor is presented in this paper. The GMR sensor is placed on the wrist and provides a magneto-plethysmographic signal. This signal is processed by simple analog and digital instrumentation stages to render the heart rate indication. A prototype of the system has been built and test results on 26 volunteers are reported. The error in HR estimation of the system is merely 1 beat per minute. The performance of the system when a layer of cloth is present between the sensor and the human body is investigated. The capability of the system as an HR variability estimator has also been established through experimentation. The proposed technique can be used as an efficient alternative to conventional HR monitors and is well suited for remote and continuous monitoring of HR.
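
    A generic way to turn a periodic pulse signal into a heart-rate estimate is simple peak detection; the sketch below uses a synthetic sinusoidal pulse and is not the analog/digital processing chain of the GMR system described above (the sampling rate, minimum peak spacing, and signal model are assumptions).

```python
import numpy as np
from scipy.signal import find_peaks

# Hedged sketch: estimating heart rate from a periodic pulse-like signal by peak
# detection. Synthetic signal; not the actual GMR processing chain of the paper.
fs = 250.0                                  # sampling rate, Hz (assumed)
t = np.arange(0, 30, 1.0 / fs)              # 30 s record
true_hr = 72.0                              # beats per minute
signal = (np.sin(2 * np.pi * (true_hr / 60.0) * t)
          + 0.2 * np.random.default_rng(3).normal(size=t.size))

# Require peaks at least ~0.4 s apart (i.e. below 150 bpm) to reject noise.
peaks, _ = find_peaks(signal, distance=int(0.4 * fs))
rr_intervals = np.diff(t[peaks])            # inter-beat intervals, s
hr_estimate = 60.0 / rr_intervals.mean()
hr_variability = rr_intervals.std() * 1000.0  # crude HR-variability proxy, ms
print(f"estimated HR: {hr_estimate:.1f} bpm  (RR std: {hr_variability:.1f} ms)")
```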

  12. The reliability and validity of flight task workload ratings

    NASA Technical Reports Server (NTRS)

    Childress, M. E.; Hart, S. G.; Bortolussi, M. R.

    1982-01-01

    Twelve instrument-rated general aviation pilots each flew two scenarios in a motion-base simulator. During each flight, the pilots verbally estimated their workload every three minutes. Following each flight, they again estimated workload for each flight segment and also rated their overall workload, perceived performance, and 13 specific factors on a bipolar scale. The results indicate that time (a priori, inflight, or postflight) of eliciting ratings, period to be covered by the ratings (a specific moment in time or a longer period), type of rating scale, and rating method (verbal, written, or other) may be important variables. Overall workload ratings appear to be predicted by different specific scales depending upon the situation, with activity level the best predictor. Perceived performance seems to bear little relationship to observer-rated performance when pilots rate their overall performance and an observer rates specific behaviors. Perceived workload and performance also seem unrelated.

  13. The effect of waiting times on demand and supply for elective surgery: Evidence from Italy.

    PubMed

    Riganti, Andrea; Siciliani, Luigi; Fiorio, Carlo V

    2017-09-01

    Waiting times are a major policy concern in publicly funded health systems across OECD countries. Economists have argued that, in the presence of excess demand, waiting times act as nonmonetary prices that bring demand for and supply of health care into equilibrium. Using administrative data disaggregated by region and surgical procedure over 2010-2014 in Italy, we estimate demand and supply elasticities with respect to waiting times. We employ linear regression models with first differences and instrumental variables to deal with the endogeneity of waiting times. We find that demand is inelastic with respect to waiting times while supply is more elastic. Estimates of the demand elasticity range from -0.15 to -0.24. Our results have implications for the effectiveness of policies aimed at increasing supply and their ability to reduce waiting times. Copyright © 2017 John Wiley & Sons, Ltd.

  14. Empirical Evidence on Occupation and Industry Specific Human Capital

    PubMed Central

    Sullivan, Paul

    2009-01-01

    This paper presents instrumental variables estimates of the effects of firm tenure, occupation specific work experience, industry specific work experience, and general work experience on wages using data from the 1979 Cohort of the National Longitudinal Survey of Youth. The estimates indicate that both occupation and industry specific human capital are key determinants of wages, and the importance of various types of human capital varies widely across one-digit occupations. Human capital is primarily occupation specific in occupations such as craftsmen, where workers realize a 14% increase in wages after five years of occupation specific experience but do not realize wage gains from industry specific experience. In contrast, human capital is primarily industry specific in other occupations such as managerial employment where workers realize a 23% wage increase after five years of industry specific work experience. In other occupations, such as professional employment, both occupation and industry specific human capital are key determinants of wages. PMID:20526448

  15. Physical activity measurement in older adults: relationships with mental health.

    PubMed

    Parker, Sarah J; Strath, Scott J; Swartz, Ann M

    2008-10-01

    This study examined the relationship between physical activity (PA) and mental health among older adults as measured by objective and subjective PA-assessment instruments. Pedometers (PED), accelerometers (ACC), and the Physical Activity Scale for the Elderly (PASE) were administered to measure 1 week of PA among 84 adults age 55-87 (mean = 71) years. General mental health was measured using the Positive and Negative Affect Scale (PANAS) and the Satisfaction With Life Scale (SWL). Linear regressions revealed that PA estimated by PED significantly predicted 18.1%, 8.3%, and 12.3% of variance in SWL and positive and negative affect, respectively, whereas PA estimated by the PASE did not predict any mental health variables. Results from ACC data were mixed. Hotelling-William tests between correlation coefficients revealed that the relationship between PED and SWL was significantly stronger than the relationship between PASE and SWL. Relationships between PA and mental health might depend on the PA measure used.

  16. Prediction of future falls in a community dwelling older adult population using instrumented balance and gait analysis.

    PubMed

    Bauer, C M; Gröger, I; Rupprecht, R; Marcar, V L; Gaßmann, K G

    2016-04-01

    The role of instrumented balance and gait assessment in screening for prospective fallers is currently a topic of controversial discussion. This study analyzed the association of variables derived from static posturography, instrumented gait analysis, and clinical assessments with the occurrence of prospective falls in a sample of community-dwelling older people. In this study 84 older people were analyzed. Based on the prospective occurrence of falls, participants were categorized into fallers and non-fallers. Variables derived from clinical assessments, static posturography, and instrumented gait analysis were evaluated with respect to their association with the occurrence of prospective falls using a forward stepwise, binary logistic regression procedure. Fallers displayed a significantly shorter single support time during walking while counting backwards, an increased mediolateral to anteroposterior sway amplitude ratio, increased fast mediolateral oscillations, and a larger coefficient (Coeff) of sway direction during various static posturography tests. Previous falls were not significantly associated with the occurrence of prospective falls. Variables derived from posturography and instrumented gait analysis showed significant associations with the occurrence of prospective falls in a sample of community-dwelling older adults.
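
    A minimal sketch of the kind of binary logistic model used above is shown below on synthetic data; the predictor names, effect sizes, and single-step fit (rather than the paper's forward stepwise procedure) are all assumptions for illustration.

```python
import numpy as np
import statsmodels.api as sm

# Hedged sketch of a binary logistic regression linking instrumented balance/gait
# variables to prospective fall status. Synthetic data, hypothetical predictors.
rng = np.random.default_rng(4)
n = 84
single_support_time = rng.normal(0.38, 0.04, n)   # s, during dual-task walking
ml_sway_ratio = rng.normal(1.0, 0.25, n)          # mediolateral/anteroposterior sway

# Simulate fall status from an assumed logistic model.
logit_p = -2.0 - 20.0 * (single_support_time - 0.38) + 1.5 * (ml_sway_ratio - 1.0)
fell = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit_p)))

X = sm.add_constant(np.column_stack([single_support_time, ml_sway_ratio]))
fit = sm.Logit(fell, X).fit(disp=False)
print(fit.params)          # log-odds coefficients
print(np.exp(fit.params))  # odds ratios per unit change
```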

  17. The relative responsiveness of test instruments can be estimated using a meta-analytic approach: an illustration with treatments for depression.

    PubMed

    Kounali, Daphne Z; Button, Katherine S; Lewis, Glyn; Ades, Anthony E

    2016-09-01

    We present a meta-analytic method that combines information on treatment effects from different instruments from a network of randomized trials to estimate instrument relative responsiveness. Five depression-test instruments [Beck Depression Inventory (BDI I/II), Patient Health Questionnaire (PHQ9), Hamilton Rating for Depression 17 and 24 items, Montgomery-Asberg Depression Rating] and three generic quality of life measures [EuroQoL (EQ-5D), SF36 mental component summary (SF36 MCS), and physical component summary (SF36 PCS)] were compared. Randomized trials of treatments for depression reporting outcomes on any two or more of these instruments were identified. Information on the within-trial ratios of standardized treatment effects was pooled across the studies to estimate relative responsiveness. The between-instrument ratios of standardized treatment effects vary across trials, with a coefficient of variation of 13% (95% credible interval: 6%, 25%). There were important differences between the depression measures, with PHQ9 being the most responsive instrument and BDI the least. Responsiveness of the EQ-5D and SF36 PCS was poor. SF36 MCS performed similarly to depression instruments. Information on relative responsiveness of several test instruments can be pooled across networks of trials reporting at least two outcomes, allowing comparison and ranking of test instruments that may never have been compared directly. Copyright © 2016 Elsevier Inc. All rights reserved.

  18. Recent Developments on Airborne Forward Looking Interferometer for the Detection of Wake Vortices

    NASA Technical Reports Server (NTRS)

    Daniels, Taumi S.; Smith, William L.; Kirev, Stanislav

    2012-01-01

    A goal of these studies was the development of the measurement methods and algorithms necessary to detect wake vortex hazards in real time from either an aircraft or a ground-based hyperspectral Fourier Transform Spectrometer (FTS). This paper provides an update on research to model FTS detection of wake vortices. The Terminal Area Simulation System (TASS) was used to generate wake vortex fields of 3-D winds, temperature, and absolute humidity. These fields were input to the Line by Line Radiative Transfer Model (LBLRTM), a hyperspectral radiance model in the infrared, employed for the FTS numerical modeling. An initial set of cases has been analyzed to identify a wake vortex IR signature and signature sensitivities to various state variables. Results from the numerical modeling case studies will be presented. Preliminary results indicated that an imaging IR instrument sensitive to six narrow bands within the 670 to 3150 cm⁻¹ spectral region would be sufficient for wake vortex detection. Noise floor estimates for a recommended instrument are a current research topic.

  19. Education Gains Attributable to Fertility Decline: Patterns by Gender, Period, and Country in Latin America and Asia.

    PubMed

    Li, Jing; Dow, William H; Rosero-Bixby, Luis

    2017-08-01

    We investigate the heterogeneity across countries and time in the relationship between mother's fertility and children's educational attainment-the quantity-quality (Q-Q) trade-off-by using census data from 17 countries in Asia and Latin America, with data from each country spanning multiple census years. For each country-year, we estimate micro-level instrumental variables models predicting secondary school attainment using number of siblings of the child, instrumented by the sex composition of the first two births in the family. We then analyze correlates of Q-Q trade-off patterns across countries. On average, one additional sibling in the family reduces the probability of secondary education by 6 percentage points for girls and 4 percentage points for boys. This Q-Q trade-off is significantly associated with the level of son preference, slightly decreasing over time and with fertility, but it does not significantly differ by educational level of the country.

  20. Development of a laser remote sensing instrument to measure sub-aerial volcanic CO2 fluxes

    NASA Astrophysics Data System (ADS)

    Queisser, Manuel; Burton, Mike

    2016-04-01

    A thorough quantification of volcanic CO2 fluxes would lead to an enhanced understanding of the role of volcanoes in the geological carbon cycle. This would enable a more subtle understanding of human impact on that cycle. Furthermore, variations in volcanic CO2 emissions are key to understanding volcanic processes such as eruption phenomenology. However, measuring fluxes of volcanic CO2 is challenging as volcanic CO2 concentrations are modest compared with the ambient CO2 concentration (~400 ppm). Volcanic CO2 quickly dilutes with the background air. For Mt. Etna (Italy), for instance, 1000 m downwind from the crater, dispersion modelling yields a signal of only ~4 ppm. It is for this reason that many magmatic CO2 concentration measurements focus on in situ techniques, such as direct sampling with Giggenbach bottles, chemical sensors, IR absorption spectrometers, or mass spectrometers. However, emission rates are highly variable in time and space. Point measurements fail to account for this variability. Inferring 1-D or 2-D gas concentration profiles, necessary to estimate gas fluxes, from point measurements may thus lead to erroneous flux estimations. Moreover, in situ probing is time-consuming and, since many volcanoes emit toxic gases and are dangerous as mountains, may raise safety concerns. In addition, degassing is often diffuse and spatially extended, which makes a measurement approach with spatial coverage desirable. There are techniques that allow CO2 fluxes to be retrieved indirectly from correlated SO2 concentrations and fluxes. However, they still rely on point measurements of CO2, are prone to errors in the SO2 fluxes due to light dilution, and depend on blue-sky conditions. Here, we present a new remote sensing instrument, developed with the ERC project CO2Volc, which measures 1-D column amounts of CO2 in the atmosphere with sufficient sensitivity to reveal the contribution of magmatic CO2. Based on differential absorption LIDAR (DIAL), the instrument measures the absorption, and therefore the path amount, of CO2 in the atmosphere. The instrument has been optimized to be rugged and man-portable and to use little power (~70 W). By flying the instrument over a volcanic plume, we will be able to swiftly determine CO2 fluxes. This opens the possibility of rapid, comprehensive surveys of both point-source, open-vent CO2 emissions and emissions from more diffuse sources such as lakes and fumarole fields. We present initial test results from the new instrument. We believe that the CO2 LIDAR could make a major contribution to volcano monitoring. Potential follow-on applications include environmental monitoring, such as fugitive CO2 detection in storage sites or urban monitoring of car and ship emissions.
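
    For reference, the textbook two-wavelength DIAL relation that underlies this type of instrument relates the path-averaged number density between ranges R1 and R2 to the ratio of on- and off-line returns; this is the generic form, not necessarily the exact retrieval implemented for the CO2Volc instrument.

```latex
% Generic two-wavelength DIAL relation (textbook form); \Delta\sigma is the
% differential absorption cross section between the on- and off-line wavelengths.
\[
  \bar{N} \;=\; \frac{1}{2\,\Delta\sigma\,(R_2 - R_1)}
  \,\ln\!\left(
      \frac{P_{\mathrm{off}}(R_2)\,P_{\mathrm{on}}(R_1)}
           {P_{\mathrm{on}}(R_2)\,P_{\mathrm{off}}(R_1)}
  \right),
  \qquad \Delta\sigma = \sigma_{\mathrm{on}} - \sigma_{\mathrm{off}} .
\]
```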

  1. The Significance of an Excess in a Counting Experiment: Assessing the Impact of Systematic Uncertainties and the Case with a Gaussian Background

    NASA Astrophysics Data System (ADS)

    Vianello, Giacomo

    2018-05-01

    Several experiments in high-energy physics and astrophysics can be treated as on/off measurements, where an observation potentially containing a new source or effect (“on” measurement) is contrasted with a background-only observation free of the effect (“off” measurement). In counting experiments, the significance of the new source or effect can be estimated with a widely used formula from Li & Ma, which assumes that both measurements are Poisson random variables. In this paper we study three other cases: (i) the ideal case where the background measurement has no uncertainty, which can be used to study the maximum sensitivity that an instrument can achieve, (ii) the case where the background estimate b in the off measurement has an additional systematic uncertainty, and (iii) the case where b is a Gaussian random variable instead of a Poisson random variable. The latter case applies when b comes from a model fitted on archival or ancillary data, or from the interpolation of a function fitted on data surrounding the candidate new source/effect. Practitioners typically use a formula that is only valid when b is large and when its uncertainty is very small, while we derive a general formula that can be applied in all regimes. We also develop simple methods that can be used to assess how much an estimate of significance is sensitive to systematic uncertainties on the efficiency or on the background. Examples of applications include the detection of short gamma-ray bursts and of new X-ray or γ-ray sources. All the techniques presented in this paper are made available in a Python code that is ready to use.
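
    The Li & Ma formula referred to above (their Eq. 17) is reproduced below as a small helper; the example counts and exposure ratio are made up for illustration.

```python
import numpy as np

def li_ma_significance(n_on, n_off, alpha):
    """Li & Ma (1983), Eq. 17: significance of an on/off counting measurement.

    n_on, n_off -- counts in the on- and off-source measurements
    alpha       -- ratio of on-source to off-source exposure
    """
    term_on = n_on * np.log((1.0 + alpha) / alpha * n_on / (n_on + n_off))
    term_off = n_off * np.log((1.0 + alpha) * n_off / (n_on + n_off))
    return np.sqrt(2.0 * (term_on + term_off))

# Illustrative example: 120 counts on-source, 300 counts off-source with 3x the exposure.
print(li_ma_significance(120, 300, alpha=1.0 / 3.0))
```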

  2. A wide field-of-view imaging DOAS instrument for continuous trace gas mapping from aircraft

    NASA Astrophysics Data System (ADS)

    Schönhardt, A.; Altube, P.; Gerilowski, K.; Krautwurst, S.; Hartmann, J.; Meier, A. C.; Richter, A.; Burrows, J. P.

    2014-04-01

    For the purpose of trace gas measurements and pollution mapping, the Airborne imaging DOAS instrument for Measurements of Atmospheric Pollution (AirMAP) has been developed, characterised and successfully operated from aircraft. From the observations with the AirMAP instrument, nitrogen dioxide (NO2) columns were retrieved. A major benefit of the pushbroom imaging instrument is the spatially continuous, gap-free measurement sequence independent of flight altitude, a valuable characteristic for mapping purposes. This is made possible by the use of a frame-transfer detector. With a wide-angle entrance objective, a broad field-of-view across track of around 48° is achieved, leading to a swath width of about the same size as the flight altitude. The use of fibre-coupled light intake optics with sorted light fibres allows flexible positioning within the aircraft and retains the very good imaging capabilities. The measurements yield ground spatial resolutions below 100 m. From a maximum of 35 individual viewing directions (lines of sight, LOS) represented by 35 single fibres, the number of viewing directions is adapted to each situation by averaging according to signal-to-noise or spatial resolution requirements. Exploitation of all the viewing directions yields observations at 30 m spatial resolution, making the instrument a suitable tool for mapping trace gas point sources and small-scale variability. For accurate spatial mapping, the position and aircraft attitude are taken into account using the Attitude and Heading Reference System of the aircraft. A first demonstration mission using AirMAP was undertaken: in June 2011, AirMAP was operated on the AWI Polar-5 aircraft in the framework of the AIRMETH-2011 campaign. During a flight above a medium-sized coal-fired power plant in north-west Germany, AirMAP clearly detected the emission plume downwind of the exhaust stack, with NO2 vertical columns around 2 × 10¹⁶ molecules cm⁻² in the plume center. The emission estimates are consistent with reports in the pollutant transfer register. Strong spatial gradients and variability in NO2 amounts across and along flight direction are observed, and small-scale enhancements of NO2 above a motorway are detected. The present study reports on the experimental setup and characteristics of AirMAP, and the first measurements at high spatial resolution and wide spatial coverage are presented, which meet the requirements for NO2 mapping to observe and account for the intrinsic variability of tropospheric NO2.

  3. Teachers' Pedagogical Management and Instrumental Performance in Students of an Artistic Higher Education School

    ERIC Educational Resources Information Center

    De La Cruz Bautista, Edwin

    2017-01-01

    This research aims to know the relationship between the variables teachers' pedagogical management and instrumental performance in students from an Artistic Higher Education School. It is a descriptive and correlational research that seeks to find the relationship between both variables. The sample of the study consisted of 30 students of the…

  4. Polychronometry: The Study of Time Variables in Behavior.

    ERIC Educational Resources Information Center

    Mackey, William Francis

    There is a growing need for instrumentation which can enable us to observe and compute phenomena that take place in time. Although problems of observation, computation, interpretation and categorization vary from field to field and from problem to problem, it is possible to design an instrument for use in any situation where time-variables have to…

  5. Predicting Preservice Music Teachers' Performance Success in Instrumental Courses Using Self-Regulated Study Strategies and Predictor Variables

    ERIC Educational Resources Information Center

    Ersozlu, Zehra N.; Nietfeld, John L.; Huseynova, Lale

    2017-01-01

    The purpose of this study was to examine the extent to which self-regulated study strategies and predictor variables predict performance success in instrumental performance college courses. Preservice music teachers (N = 123) from a music education department in two state universities in Turkey completed the Music Self-Regulated Studying…

  6. Can Two Psychotherapy Process Measures Be Dependably Rated Simultaneously? A Generalizability Study

    ERIC Educational Resources Information Center

    Ulvenes, Pal G.; Berggraf, Lene; Hoffart, Asle; Levy, Raymon A.; Ablon, J. Stuart; McCullough, Leigh; Wampold, Bruce E.

    2012-01-01

    Observer ratings in psychotherapy are a common way of collecting information in psychotherapy research. However, human observers are imperfect instruments, and their ratings may be subject to variability from several sources. One source of variability can be raters' assessing more than 1 instrument at a time. The purpose of this research is to…

  7. Using satellite image data to estimate soil moisture

    NASA Astrophysics Data System (ADS)

    Chuang, Chi-Hung; Yu, Hwa-Lung

    2017-04-01

    Soil moisture is considered an important parameter in various fields of study, such as hydrology, phenology, and agriculture. In hydrology, soil moisture is a significant parameter in determining how much rainfall will infiltrate into the permeable layer and become a groundwater resource. Although soil moisture plays a critical role in many environmental studies, it has so far been measured mainly with ground instruments such as electromagnetic soil moisture sensors. Ground instrumentation provides the information directly, but the instruments need maintenance and manpower to operate, and they are not well suited to covering wide regions. To measure soil moisture over wide regions, other methods are needed. Satellite remote sensing provides imagery of the Earth's surface and can overcome the spatial limitations of ground-based measurement. In this study, we used MODIS data to estimate daily soil moisture patterns via the crop water stress index (CWSI) over the year 2015. The estimates are compared with observations from the soil moisture stations of the Taiwan Bureau of Soil and Water Conservation. Results show that satellite remote sensing data can be helpful for soil moisture estimation. Further analysis is required to obtain the optimal parameters for soil moisture estimation in Taiwan.
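
    One common definition of the crop water stress index mentioned above is the Idso-type empirical form shown below; the exact MODIS-based formulation used in this study may differ.

```latex
% One standard (Idso-type) definition of the crop water stress index; the
% study's exact MODIS-based formulation may differ.
\[
  \mathrm{CWSI} \;=\; \frac{T_{c} - T_{\mathrm{wet}}}{T_{\mathrm{dry}} - T_{\mathrm{wet}}}
\]
% T_c: observed canopy/surface temperature; T_wet: well-watered (lower) baseline;
% T_dry: non-transpiring (upper) baseline. Values near 0 indicate ample moisture,
% values near 1 indicate strong water stress.
```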

  8. Development of an audit instrument for nursing care plans in the patient record

    PubMed Central

    Bjorvell, C; Thorell-Ekstrand, I; Wredling, R

    2000-01-01

    Objectives—To develop, validate, and test the reliability of an audit instrument that measures the extent to which patient records describe important aspects of nursing care. Material—Twenty records from each of three hospital wards were collected and audited. The auditors were registered nurses with a knowledge of nursing documentation in accordance with the VIPS model—a model designed to structure nursing documentation. (VIPS is an acronym formed from the Swedish words for wellbeing, integrity, prevention, and security.) Methods—An audit instrument was developed by determining specific criteria to be met. The audit questions were aimed at revealing the content of the patient record for nursing assessment, nursing diagnosis, planned interventions, and outcome. Each of the 60 records was reviewed by the three auditors independently, and the reliability of the instrument was tested by calculating the inter-rater reliability coefficient. Content validity was tested by using an expert panel and calculating the content validity ratio. The criterion-related validity was estimated by the correlation between the score of the Cat-ch-Ing instrument and the score of an earlier developed and used audit instrument. The results were then tested by using Pearson's correlation coefficient. Results—The new audit instrument, named Cat-ch-Ing, consists of 17 questions designed to judge the nursing documentation. Both quantity and quality variables are judged on a rating scale from zero to three, with a maximum score of 80. The inter-rater reliability coefficients were 0.98, 0.98, and 0.92, respectively, for the three groups of 20 records; the content validity ratio ranged between 0.20 and 1.0; and the criterion-related validity showed a significant correlation of r = 0.68 (p < 0.0001, 95% CI 0.57 to 0.76) between the two audit instruments. Conclusion—The Cat-ch-Ing instrument has proved to be a valid and reliable audit instrument for nursing records when the VIPS model is used as the basis of the documentation. (Quality in Health Care 2000;9:6–13) Key Words: audit instrument; nursing care plans; quality assurance PMID:10848373
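
    The content validity ratio reported above is commonly computed with Lawshe's formula, assuming that is the variant used here:

```latex
% Lawshe's content validity ratio for a single item (assumed variant).
% n_e: number of experts rating the item "essential"; N: total number of experts.
% CVR ranges from -1 to +1; values near +1 indicate strong agreement.
\[
  \mathrm{CVR} \;=\; \frac{n_{e} - N/2}{N/2}
\]
```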

  9. A wide field-of-view imaging DOAS instrument for two-dimensional trace gas mapping from aircraft

    NASA Astrophysics Data System (ADS)

    Schönhardt, A.; Altube, P.; Gerilowski, K.; Krautwurst, S.; Hartmann, J.; Meier, A. C.; Richter, A.; Burrows, J. P.

    2015-12-01

    The Airborne imaging differential optical absorption spectroscopy (DOAS) instrument for Measurements of Atmospheric Pollution (AirMAP) has been developed for the purpose of trace gas measurements and pollution mapping. The instrument has been characterized and successfully operated from aircraft. Nitrogen dioxide (NO2) columns were retrieved from the AirMAP observations. A major benefit of the push-broom imaging instrument is the spatially continuous, gap-free measurement sequence independent of flight altitude, a valuable characteristic for mapping purposes. This is made possible by the use of a charge coupled device (CCD) frame-transfer detector. A broad field of view across track of around 48° is achieved with wide-angle entrance optics. This leads to a swath width of about the same size as the flight altitude. The use of fibre coupled light intake optics with sorted light fibres allows flexible instrument positioning within the aircraft and retains the very good imaging capabilities. The measurements yield ground spatial resolutions below 100 m depending on flight altitude. The number of viewing directions is chosen from a maximum of 35 individual viewing directions (lines of sight, LOS) represented by 35 individual fibres. The selection is adapted to each situation by averaging according to signal-to-noise or spatial resolution requirements. Observations at 30 m spatial resolution are obtained when flying at 1000 m altitude and making use of all 35 viewing directions. This makes the instrument a suitable tool for mapping trace gas point sources and small-scale variability. The position and aircraft attitude are taken into account for accurate spatial mapping using the Attitude and Heading Reference System of the aircraft. A first demonstration mission using AirMAP was undertaken in June 2011. AirMAP was operated on the AWI Polar-5 aircraft in the framework of the AIRMETH-2011 campaign. During a flight above a medium-sized coal-fired power plant in north-west Germany, AirMAP clearly detected the emission plume downwind from the exhaust stack, with NO2 vertical columns around 2 × 10¹⁶ molecules cm⁻² in the plume centre. NOx emissions estimated from the AirMAP observations are consistent with reports in the European Pollutant Release and Transfer Register. Strong spatial gradients and variability in NO2 amounts across and along flight direction are observed, and small-scale enhancements of NO2 above a motorway are detected.

  10. The climate continuum revisited

    NASA Astrophysics Data System (ADS)

    Emile-Geay, J.; Wang, J.; Partin, J. W.

    2015-12-01

    A grand challenge of climate science is to quantify the extent of natural variability on adaptation-relevant timescales (10-100y). Since the instrumental record is too short to adequately estimate the spectra of climate measures, this information must be derived from paleoclimate proxies, which may harbor a many-to-one, non-linear (e.g. thresholded) and non-stationary relationship to climate. In this talk, I will touch upon the estimation of climate scaling behavior from climate proxies. Two case studies will be presented: (1) an investigation of scaling behavior in a reconstruction of global surface temperature using state-of-the-art data [PAGES2K Consortium, in prep] and methods [Guillot et al., 2015]. Estimating the scaling exponent β in spectra derived from this reconstruction, we find that 0 < β < 1 in most regions, suggesting long-term memory. Overall, the reconstruction-based spectra are steeper than the ones based on an instrumental dataset [HadCRUT4.2, Morice et al., 2012], and those estimated from PMIP3/CMIP5 models, suggesting the climate system is more energetic at multidecadal to centennial timescales than can be inferred from the short instrumental record or from the models developed to reproduce it [Laepple and Huybers, 2014]. (2) An investigation of scaling behavior in speleothem records of tropical hydroclimate. We will make use of recent advances in proxy system modeling [Dee et al., 2015] and investigate how various aspects of the speleothem system (karst dynamics, age uncertainties) may conspire to bias the estimate of scaling behavior from speleothem time series. The results suggest that ignoring such complications leads to erroneous inferences about hydroclimate scaling. References: Dee, S. G., J. Emile-Geay, M. N. Evans, Allam, A., D. M. Thompson, and E. J. Steig (2015), J. Adv. Mod. Earth Sys., 07, doi:10.1002/2015MS000447. Guillot, D., B. Rajaratnam, and J. Emile-Geay (2015), Ann. Applied. Statist., pp. 324-352, doi:10.1214/14-AOAS794. Laepple, T., and P. Huybers (2014), PNAS, doi: 10.1073/pnas.1412077111. Morice, C. P., J. J. Kennedy, N. A. Rayner, and P. D. Jones (2012), JGR: Atmospheres, 117(D8), doi:10.1029/2011JD017187. PAGES2K Consortium (in prep), A global multiproxy database for temperature reconstructions of the Common Era, Scientific Data.
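
    A minimal sketch of estimating a spectral scaling exponent β (with S(f) ∝ f^(−β)) from a log-log periodogram fit is given below on a synthetic series; real reconstructions of this kind use far more careful spectral estimators and uncertainty treatment than a raw periodogram.

```python
import numpy as np

# Minimal sketch of estimating a scaling exponent beta, where the power spectrum
# behaves as S(f) ~ f**(-beta). Synthetic series mixing a random walk with white
# noise; illustrative only.
rng = np.random.default_rng(5)
n = 2048
x = np.cumsum(rng.normal(size=n)) * 0.1 + rng.normal(size=n)

freqs = np.fft.rfftfreq(n, d=1.0)                 # cycles per time step
power = np.abs(np.fft.rfft(x - x.mean()))**2 / n  # raw periodogram
mask = freqs > 0                                  # drop the zero-frequency bin
slope, intercept = np.polyfit(np.log(freqs[mask]), np.log(power[mask]), 1)
beta = -slope
print(f"estimated scaling exponent beta ≈ {beta:.2f}")
```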

  11. Estimation of treatment efficacy with complier average causal effects (CACE) in a randomized stepped wedge trial.

    PubMed

    Gruber, Joshua S; Arnold, Benjamin F; Reygadas, Fermin; Hubbard, Alan E; Colford, John M

    2014-05-01

    Complier average causal effects (CACE) estimate the impact of an intervention among treatment compliers in randomized trials. Methods used to estimate CACE have been outlined for parallel-arm trials (e.g., using an instrumental variables (IV) estimator) but not for other randomized study designs. Here, we propose a method for estimating CACE in randomized stepped wedge trials, where experimental units cross over from control conditions to intervention conditions in a randomized sequence. We illustrate the approach with a cluster-randomized drinking water trial conducted in rural Mexico from 2009 to 2011. Additionally, we evaluated the plausibility of assumptions required to estimate CACE using the IV approach, which are testable in stepped wedge trials but not in parallel-arm trials. We observed small increases in the magnitude of CACE risk differences compared with intention-to-treat estimates for drinking water contamination (risk difference (RD) = -22% (95% confidence interval (CI): -33, -11) vs. RD = -19% (95% CI: -26, -12)) and diarrhea (RD = -0.8% (95% CI: -2.1, 0.4) vs. RD = -0.1% (95% CI: -1.1, 0.9)). Assumptions required for IV analysis were probably violated. Stepped wedge trials allow investigators to estimate CACE with an approach that avoids the stronger assumptions required for CACE estimation in parallel-arm trials. Inclusion of CACE estimates in stepped wedge trials with imperfect compliance could enhance reporting and interpretation of the results of such trials.
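
    The IV logic behind CACE can be illustrated with the standard Wald-type estimator for a parallel-arm trial, CACE = (intention-to-treat effect) / (difference in treatment uptake between arms). The sketch below uses made-up data and is not the stepped wedge estimator proposed in the paper:

        import numpy as np

        def cace_wald(assigned, treated, outcome):
            """Wald/IV estimator: intention-to-treat effect divided by the
            difference in treatment uptake between randomized arms."""
            assigned, treated, outcome = map(np.asarray, (assigned, treated, outcome))
            itt = outcome[assigned == 1].mean() - outcome[assigned == 0].mean()
            uptake = treated[assigned == 1].mean() - treated[assigned == 0].mean()
            return itt / uptake

        # Hypothetical data: randomization, treatment actually received, binary outcome.
        rng = np.random.default_rng(1)
        z = rng.integers(0, 2, 5000)             # randomized assignment
        compliers = rng.random(5000) < 0.7       # 70% would comply if assigned
        d = z * compliers                        # treatment received
        y = (rng.random(5000) < (0.30 - 0.15 * d)).astype(int)  # treatment lowers risk
        print(cace_wald(z, d, y))                # close to the true complier effect, -0.15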

  12. The OCO-3 Mission : Updated Overview of Science Objectives and Status

    NASA Astrophysics Data System (ADS)

    Eldering, A.; Bennett, M. W.; Basilio, R. R.

    2016-12-01

    The Orbiting Carbon Observatory 3 (OCO-3) will continue global CO2 and solar-induced chlorophyll fluorescence (SIF) measurements using the flight spare instrument from OCO-2. The instrument is currently being tested, and will be packaged for installation on the International Space Station (ISS) (launch readiness in early 2018). This talk will focus on the science objectives as well as updated simulations to predict the quality of OCO-3 science data products. The low-inclination ISS orbit lets OCO-3 sample the tropics and sub-tropics across the full range of daylight hours with dense observations at northern and southern mid-latitudes (+/- 52°). The combination of these dense CO2 and SIF measurements provides continuity of data for global flux estimates as well as a unique opportunity to address key deficiencies in our understanding of the global carbon cycle. The instrument utilizes an agile, 2-axis pointing mechanism (PMA), providing the capability to look towards the bright reflection from the ocean and validation targets. The PMA also allows for a snapshot mapping mode to collect dense datasets over 100 km by 100 km areas. Measurements over urban centers could aid in making estimates of fossil fuel CO2 emissions. This is critical because the largest urban areas (25 megacities) account for 75% of the global total fossil fuel CO2 emissions, and rapid growth (> 10% per year) is expected in developing regions over the coming 10 years. Similarly, the snapshot mapping mode can be used to sample regions of interest for the terrestrial carbon cycle. For example, snapshot maps of 100 km by 100 km could be gathered in the Amazon or key agricultural regions. In addition, there is potential to utilize data from the ISS instruments ECOSTRESS (ECOsystem Spaceborne Thermal Radiometer Experiment on Space Station) and GEDI (Global Ecosystem Dynamics Investigation), which measure other key variables controlling carbon uptake by plants, to complement OCO-3 data in science analysis.

  13. Development and Performance of an Atomic Interferometer Gravity Gradiometer for Earth Science

    NASA Astrophysics Data System (ADS)

    Luthcke, S. B.; Saif, B.; Sugarbaker, A.; Rowlands, D. D.; Loomis, B.

    2016-12-01

    The wealth of multi-disciplinary science achieved from the GRACE mission, the commitment to GRACE Follow On (GRACE-FO), and Resolution 2 from the International Union of Geodesy and Geophysics (IUGG, 2015), highlight the importance of implementing a long-term satellite gravity observational constellation. Such a constellation would measure time variable gravity (TVG) with accuracies 50 times better than the first generation missions, at spatial and temporal resolutions to support regional and sub-basin scale multi-disciplinary science. Improved TVG measurements would achieve significant societal benefits including: forecasting of floods and droughts, improved estimates of climate impacts on the water cycle and ice sheets, coastal vulnerability, land management, risk assessment of natural hazards, and water management. To meet the accuracy and resolution challenge of the next generation gravity observational system, NASA GSFC and AOSense are currently developing an Atomic Interferometer Gravity Gradiometer (AIGG). This technology is capable of achieving the desired accuracy and resolution with a single instrument, exploiting the advantages of the microgravity environment. The AIGG development is funded under NASA's Earth Science Technology Office (ESTO) Instrument Incubator Program (IIP), and includes the design, build, and testing of a high-performance, single-tensor-component gravity gradiometer for TVG recovery from a satellite in low Earth orbit. The sensitivity per shot is 10^-5 Eötvös (E) with a flat spectral bandwidth from 0.3 mHz to 0.03 Hz. Numerical simulations show that a single space-based AIGG in a 326 km altitude polar orbit is capable of exceeding the IUGG target requirement for monthly TVG accuracy of 1 cm equivalent water height at 200 km resolution. We discuss the current status of the AIGG IIP development and estimated instrument performance, and we present results of simulated Earth TVG recovery for the space-based AIGG. We explore the accuracy and the spatial and temporal resolution of surface mass change observations from several space-based implementations of the AIGG instrument, including various orbit configurations and multi-satellite/multi-orbit configurations.

  14. Post-instrumentation pain after the use of either Mtwo or the SAF system: a randomized controlled clinical trial.

    PubMed

    Saumya-Rajesh, P; Krithikadatta, J; Velmurugan, N; Sooriaprakas, C

    2017-08-01

    This randomized controlled trial compared the incidence of post-instrumentation pain associated with Mtwo rotary NiTi files and the self-adjusting file system following canal shaping and cleaning. Following sample size estimation, a total of 130 patients were randomized into two groups based on selection criteria [group Mtwo and group SAF (self-adjusting file)]. Root canal treatment was carried out in two appointments. The teeth were endodontically treated with the allotted systems following similar clinical parameters. Patients were asked to rate the intensity of pre-instrumentation and post-instrumentation pain (at 2, 4, 6, 8, 24, 48 h) using the VAS score. The Kruskal-Wallis test was carried out for the overall comparisons of the two systems. The Friedman test was used to compare time points within each system. Subgroup analyses for independent variables (gender, pulp status and diagnosis) used the Mann-Whitney test and Wilcoxon signed-ranks test (P < 0.05). No significant difference was found between the two groups with respect to post-instrumentation pain. Teeth with pulpal necrosis had significantly more pain at 8 h than teeth with vital pulps (P = 0.04). Teeth with vital pulps in the SAF group had significantly less post-instrumentation pain compared with those in the Mtwo group at 6 h (P = 0.042). Patients who had teeth with nonvital pulps in the SAF group experienced more post-instrumentation pain at 8 h (P = 0.017) and 24 h (P = 0.005). The incidence of post-instrumentation pain at different time intervals in patients undergoing root canal treatment was similar for both the self-adjusting file and Mtwo file systems. © 2016 International Endodontic Journal. Published by John Wiley & Sons Ltd.

  15. An Assessment of the Need for Standard Variable Names for Airborne Field Campaigns

    NASA Astrophysics Data System (ADS)

    Beach, A. L., III; Chen, G.; Northup, E. A.; Kusterer, J.; Quam, B. M.

    2017-12-01

    The NASA Earth Venture Program has led to a dramatic increase in airborne observations, requiring updated data management practices with clearly defined data standards and protocols for metadata. An airborne field campaign can involve multiple aircraft and a variety of instruments. It is quite common to have different instruments/techniques measure the same parameter on one or more aircraft platforms. This creates a need to allow instrument Principal Investigators (PIs) to name their variables in a way that would distinguish them across various data sets. A lack of standardization of variable names presents a challenge for data search tools in enabling discovery of similar data across airborne studies, aircraft platforms, and instruments. This was also identified by data users as one of the top issues in data use. One effective approach for mitigating this problem is to enforce variable name standardization, which can effectively map the unique PI variable names to fixed standard names. In order to ensure consistency amongst the standard names, it will be necessary to choose them from a controlled list. However, no such list currently exists despite a number of previous efforts to establish a sufficient list of atmospheric variable names. The Atmospheric Composition Variable Standard Name Working Group was established under the auspices of NASA's Earth Science Data Systems Working Group (ESDSWG) to solicit research community feedback to create a list of standard names that are acceptable to data providers and data users. This presentation will discuss the challenges and recommendations of standard variable names in an effort to demonstrate how airborne metadata curation/management can be improved to streamline data ingest, improve interoperability, and increase discoverability for a broader user community.
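
    The standardization described above amounts to a controlled-vocabulary lookup from PI-chosen names to standard names. A toy illustration (all variable names below are hypothetical and not drawn from the ESDSWG list):

        # Hypothetical PI variable names mapped to hypothetical standard names.
        STANDARD_NAMES = {
            "NO2_mix_ratio_ppbv": "NO2_mole_fraction",
            "no2_ppb_LIF":        "NO2_mole_fraction",
            "H2O_vmr":            "H2O_mole_fraction",
        }

        def to_standard(pi_name):
            """Return the standard name, or flag names missing from the vocabulary."""
            try:
                return STANDARD_NAMES[pi_name]
            except KeyError:
                raise KeyError(f"'{pi_name}' is not in the controlled vocabulary")

        print(to_standard("no2_ppb_LIF"))   # -> NO2_mole_fraction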

  16. Thermal control unit for long-time survival of scientific instruments on lunar surface

    NASA Astrophysics Data System (ADS)

    Ogawa, Kazunori; Iijima, Yuichi; Tanaka, Satoshi

    A thermal control unit (lunar survival module) is being developed for scientific instruments placed on the lunar surface. This unit is designed to be used on the future Japanese lunar landing mission SELENE-2. The lunar surface is a severe environment for scientific instruments. The absence of convective cooling by an atmosphere makes the ground surface temperature variable over the wide range of -200 to 100 °C, an environment in which space electronics can hardly survive. The surface elements must have a thermal control structure to maintain the inner temperature within the operable ranges of the instruments for long-duration measurements of 1 month or longer, including through the lunar nights. The objective of this study is to develop a thermal control unit for the SELENE-2 mission. So far, we have completed the concept design of the lunar survival module and estimated its potential with a thermal mathematical model, assuming the use of a lunar seismometer designed for SELENE-2. The basic structure of the thermal module is rather simple: a heat-insulating shell covers the scientific instruments. The concept is that the conical insulator retains heat in the regolith soil during daylight, which keeps the device warm through the night. Results of the model calculations indicated a high potential for long-term survival. A breadboard model (BBM) was manufactured, and thermal-vacuum tests were conducted to validate some of the thermal parameters assumed in the thermal model. The thermal condition of the lunar surface was simulated with glass beads paved in a vacuum chamber and a temperature-controlled container. Temperature variations of the BBM in thermal cycling tests were compared with the thermal mathematical model, and the thermal parameters were assessed. Feeding the test results back into the thermal model for the lunar surface, some thermal parameters were updated, but there was no critical effect on survivability. The experimental results indicated a sufficient survivability potential for the concept of our thermal control system.
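
    The "insulating shell over a regolith heat reservoir" concept can be caricatured with a single-node lumped-capacitance model, dT/dt = (T_env - T) / (R C). The parameter values below are invented for illustration only and are not the SELENE-2 thermal mathematical model:

        import math

        # Invented, order-of-magnitude parameters for illustration only.
        R = 200.0       # effective thermal resistance through the insulation, K/W
        C = 5.0e4       # heat capacity of instrument plus enclosed regolith, J/K
        LUNATION = 29.5 * 86400.0   # seconds

        def t_env(t):
            """Crude surface temperature cycle: +100 degC in daylight, -200 degC at night."""
            return -50.0 + 150.0 * math.copysign(1.0, math.sin(2 * math.pi * t / LUNATION))

        T, dt = 20.0, 600.0          # initial inner temperature (degC), time step (s)
        coldest = T
        for step in range(int(2 * LUNATION / dt)):   # integrate over two lunations
            T += dt * (t_env(step * dt) - T) / (R * C)
            coldest = min(coldest, T)
        print(f"coldest inner temperature over two lunations: {coldest:.1f} degC")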

  17. Diode-laser-based water vapor differential absorption lidar (DIAL) profiler evaluation

    NASA Astrophysics Data System (ADS)

    Spuler, S.; Weckwerth, T.; Repasky, K. S.; Nehrir, A. R.; Carbone, R.

    2012-12-01

    We are in the process of evaluating the performance of an eye-safe, low-cost, diode-laser-based, water vapor differential absorption lidar (DIAL) profiler. This class of instrument may be capable of providing continuous water vapor and aerosol backscatter profiles at high vertical resolution in the atmospheric boundary layer (ABL) for periods of months to years. The technology potentially fills a national long term observing facility gap and could greatly benefit micro- and meso-meteorology, water cycle, carbon cycle and, more generally, biosphere-hydrosphere-atmosphere interaction research at both weather and climate variability time scales. For the evaluation, the Montana State University 3rd generation water vapor DIAL was modified to enable unattended operation for a period of several weeks. The performance of this V3.5 version DIAL was tested at MSU and NCAR in June and July of 2012. Further tests are currently in progress with Howard University at Beltsville, Maryland; and with the National Weather Service and Oklahoma University at Dallas/Fort Worth, Texas. The presentation will include a comparison of DIAL profiles against meteorological "truth" at the aforementioned locations including: radiosondes, Raman lidars, microwave and IR radiometers, AERONET and SUOMINET systems. Instrument reliability, uncertainty, systematic biases, detection height statistics, and environmental complications will be evaluated. Performance will be judged in the context of diverse scientific applications that range from operational weather prediction and seasonal climate variability, to more demanding climate system process studies at the land-canopy-ABL interface. Estimating the extent to which such research and operational applications can be satisfied with a low cost autonomous network of similar instruments is our principal objective.
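
    The retrieval underlying any water vapor DIAL is the standard two-wavelength relation, in which the number density over a range cell follows from the ratio of on-line and off-line backscatter returns. The sketch below implements that textbook equation, not the MSU V3.5 processing chain; the cross-section and range-cell values must be supplied by the user:

        import numpy as np

        def dial_number_density(p_on, p_off, delta_sigma, delta_r):
            """n(r) = 1/(2*delta_sigma*delta_r) * ln[(P_off(r+dr)*P_on(r)) /
                                                     (P_on(r+dr)*P_off(r))]
            p_on, p_off : backscatter power profiles at the on-line / off-line wavelengths
            delta_sigma : differential absorption cross-section (m^2)
            delta_r     : range-cell thickness (m)
            """
            p_on, p_off = np.asarray(p_on, float), np.asarray(p_off, float)
            ratio = (p_off[1:] * p_on[:-1]) / (p_on[1:] * p_off[:-1])
            return np.log(ratio) / (2.0 * delta_sigma * delta_r)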

  18. Assessing Causality in the Association between Child Adiposity and Physical Activity Levels: A Mendelian Randomization Analysis

    PubMed Central

    Richmond, Rebecca C.; Davey Smith, George; Ness, Andy R.; den Hoed, Marcel; McMahon, George; Timpson, Nicholas J.

    2014-01-01

    Background Cross-sectional studies have shown that objectively measured physical activity is associated with childhood adiposity, and a strong inverse dose–response association with body mass index (BMI) has been found. However, few studies have explored the extent to which this association reflects reverse causation. We aimed to determine whether childhood adiposity causally influences levels of physical activity using genetic variants reliably associated with adiposity to estimate causal effects. Methods and Findings The Avon Longitudinal Study of Parents and Children collected data on objectively assessed activity levels of 4,296 children at age 11 y with recorded BMI and genotypic data. We used 32 established genetic correlates of BMI combined in a weighted allelic score as an instrumental variable for adiposity to estimate the causal effect of adiposity on activity. In observational analysis, a 3.3 kg/m^2 (one standard deviation) higher BMI was associated with 22.3 (95% CI, 17.0, 27.6) movement counts/min less total physical activity (p = 1.6×10^-16), 2.6 (2.1, 3.1) min/d less moderate-to-vigorous-intensity activity (p = 3.7×10^-29), and 3.5 (1.5, 5.5) min/d more sedentary time (p = 5.0×10^-4). In Mendelian randomization analyses, the same difference in BMI was associated with 32.4 (0.9, 63.9) movement counts/min less total physical activity (p = 0.04) (∼5.3% of the mean counts/minute), 2.8 (0.1, 5.5) min/d less moderate-to-vigorous-intensity activity (p = 0.04), and 13.2 (1.3, 25.2) min/d more sedentary time (p = 0.03). There was no strong evidence for a difference between the instrumental variable estimates and the observational estimates. Similar results were obtained using fat mass index. Low power and poor instrumentation of activity limited causal analysis of the influence of physical activity on BMI. Conclusions Our results suggest that increased adiposity causes a reduction in physical activity in children and support research into the targeting of BMI in efforts to increase childhood activity levels. Importantly, this does not exclude lower physical activity also leading to increased adiposity, i.e., bidirectional causation. PMID:24642734
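
    The Mendelian randomization estimate described here is, in effect, an instrumental variables (two-stage least squares) regression with the weighted allelic score as the instrument. A minimal sketch on simulated data (not the ALSPAC analysis, which also handles covariates and measurement detail):

        import numpy as np

        def two_stage_least_squares(z, x, y):
            """Ratio/2SLS estimate with a single instrument z, exposure x, outcome y."""
            z, x, y = (np.asarray(v, float) for v in (z, x, y))
            Z = np.column_stack([np.ones_like(z), z])
            x_hat = Z @ np.linalg.lstsq(Z, x, rcond=None)[0]      # first stage
            X_hat = np.column_stack([np.ones_like(x_hat), x_hat])
            return np.linalg.lstsq(X_hat, y, rcond=None)[0][1]    # second-stage slope

        rng = np.random.default_rng(2)
        n = 20000
        score = rng.normal(size=n)                  # weighted allelic score (instrument)
        confounder = rng.normal(size=n)
        bmi = 0.3 * score + confounder + rng.normal(size=n)            # exposure
        activity = -2.0 * bmi + 3.0 * confounder + rng.normal(size=n)  # outcome
        print(two_stage_least_squares(score, bmi, activity))  # ~ -2.0, the causal effect,
        # recovered despite confounding that would bias a naive regression of activity on bmi.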

  19. Site Response for Micro-Zonation from Small Earthquakes

    NASA Astrophysics Data System (ADS)

    Gospe, T. B.; Hutchings, L.; Liou, I. Y. W.; Jarpe, S.

    2017-12-01

    We have developed a method to obtain absolute geologic site response from small earthquakes using inexpensive instrumentation that enables us to perform micro-zonation inexpensively and in a short amount of time. We record small earthquakes (M<3) at several sites simultaneously and perform inversion to obtain actual absolute site response. The key to the inversion is that recordings at several stations from an earthquake have the same moment, source corner frequency and whole-path Q effect on their spectra, but have individual Kappa and spectral amplification as a function of frequency. When these source and path effects are removed and corrections for different propagation distances are performed, we are left with actual site response. We develop site response functions from 0.5 to 25.0 Hz. Cities situated near active and dangerous faults experience small earthquakes on a regular basis. We typically record at least ten small earthquakes over time to stabilize the uncertainty. Of course, dynamic soil modeling is necessary to scale our linear site response to the non-linear regime for large earthquakes. Our instrumentation is very inexpensive and virtually disposable, and can be placed throughout a city at a high density. Operation only requires turning on a switch, and data processing is automated to minimize human labor. We have installed a test network and implemented our full methodology in upper Napa Valley, California, where there is variable geology and nearby rock outcrop sites, and a supply of small earthquakes from the nearby Geysers development area. We test several methods of obtaining site response. We found that rock sites have a site response of their own and distort the site response estimate based upon spectral ratios with soil sites. Also, rock sites may not even be available near all sites throughout a city. Further, H/V site response estimates from earthquakes are marginally better, but vertical motion also has a site response of its own. H/V spectral ratios of noise do not provide accurate site response estimates either. Vs30 only provides one amplification number and does not account for the variable three-dimensional structure beneath sites. We conclude that absolute site response obtained directly from earthquakes is the best, and possibly the only, way to get accurate site response estimates.
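
    The separation the authors describe (shared source and path terms, station-specific site terms) can be posed, at a single frequency, as a log-linear least-squares problem. The sketch below illustrates that generalized-inversion idea on synthetic data; it is not the authors' code and omits the Kappa, Q, and distance corrections:

        import numpy as np

        def invert_site_terms(log_amp):
            """log_amp[i, j]: log spectral amplitude of event i at station j (one frequency),
            modeled as event term s_i plus site term g_j. The mean site term is fixed to 0
            to remove the trade-off between source and site terms."""
            n_ev, n_st = log_amp.shape
            rows, cols = np.indices(log_amp.shape)
            A = np.zeros((log_amp.size + 1, n_ev + n_st))
            A[np.arange(log_amp.size), rows.ravel()] = 1.0           # event columns
            A[np.arange(log_amp.size), n_ev + cols.ravel()] = 1.0    # station columns
            A[-1, n_ev:] = 1.0                                       # constraint row
            b = np.append(log_amp.ravel(), 0.0)
            coeffs = np.linalg.lstsq(A, b, rcond=None)[0]
            return coeffs[:n_ev], coeffs[n_ev:]                      # events, sites

        # Synthetic test: 10 events, 4 stations, known site terms that sum to zero.
        rng = np.random.default_rng(3)
        true_sites = np.array([0.4, -0.1, -0.5, 0.2])
        true_events = rng.normal(size=10)
        data = true_events[:, None] + true_sites[None, :] + 0.05 * rng.normal(size=(10, 4))
        print(invert_site_terms(data)[1])   # close to true_sites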

  20. Measurement and control of bias in patient reported outcomes using multidimensional item response theory.

    PubMed

    Dowling, N Maritza; Bolt, Daniel M; Deng, Sien; Li, Chenxi

    2016-05-26

    Patient-reported outcome (PRO) measures play a key role in the advancement of patient-centered care research. The accuracy of inferences, relevance of predictions, and the true nature of the associations made with PRO data depend on the validity of these measures. Errors inherent to self-report measures can seriously bias the estimation of constructs assessed by the scale. A well-documented disadvantage of self-report measures is their sensitivity to response style (RS) effects such as the respondent's tendency to select the extremes of a rating scale. Although the biasing effect of extreme responding on constructs measured by self-reported tools has been widely acknowledged and studied across disciplines, little attention has been given to the development and systematic application of methodologies to assess and control for this effect in PRO measures. We review the methodological approaches that have been proposed to study extreme RS effects (ERS). We applied a multidimensional item response theory model to simultaneously estimate and correct for the impact of ERS on trait estimation in a PRO instrument. Model estimates were used to study the biasing effects of ERS on sum scores for individuals with the same amount of the targeted trait but different levels of ERS. We evaluated the effect of joint estimation of multiple scales and ERS on trait estimates and demonstrated the biasing effects of ERS on these trait estimates when used as explanatory variables. A four-dimensional model accounting for ERS bias provided a better fit to the response data. Increasing levels of ERS showed bias in total scores as a function of trait estimates. The effect of ERS was greater when the pattern of extreme responding was the same across multiple scales modeled jointly. The estimated item category intercepts provided evidence of content independent category selection. Uncorrected trait estimates used as explanatory variables in prediction models showed downward bias. A comprehensive evaluation of the psychometric quality and soundness of PRO assessment measures should incorporate the study of ERS as a potential nuisance dimension affecting the accuracy and validity of scores and the impact of PRO data in clinical research and decision making.

  1. Reliability of Instruments Measuring At-Risk and Problem Gambling Among Young Individuals: A Systematic Review Covering Years 2009-2015.

    PubMed

    Edgren, Robert; Castrén, Sari; Mäkelä, Marjukka; Pörtfors, Pia; Alho, Hannu; Salonen, Anne H

    2016-06-01

    This review aims to clarify which instruments measuring at-risk and problem gambling (ARPG) among youth are reliable and valid in light of reported estimates of internal consistency, classification accuracy, and psychometric properties. A systematic search was conducted in PubMed, Medline, and PsycInfo covering the years 2009-2015. In total, 50 original research articles fulfilled the inclusion criteria: target age under 29 years, using an instrument designed for youth, and reporting a reliability estimate. Articles were evaluated with the revised Quality Assessment of Diagnostic Accuracy Studies tool. Reliability estimates were reported for five ARPG instruments. Most studies (66%) evaluated the South Oaks Gambling Screen Revised for Adolescents. The Gambling Addictive Behavior Scale for Adolescents was the only novel instrument. In general, the evaluation of instrument reliability was superficial. Despite its rare use, the Canadian Adolescent Gambling Inventory (CAGI) had a strong theoretical and methodological base. The Gambling Addictive Behavior Scale for Adolescents and the CAGI were the only instruments originally developed for youth. All studies, except the CAGI study, were population based. ARPG instruments for youth have not been rigorously evaluated yet. Further research is needed especially concerning instruments designed for clinical use. Copyright © 2016 The Society for Adolescent Health and Medicine. Published by Elsevier Inc. All rights reserved.

  2. Comparing proxy and model estimates of hydroclimate variability and change over the Common Era

    NASA Astrophysics Data System (ADS)

    Hydro2k Consortium, Pages

    2017-12-01

    Water availability is fundamental to societies and ecosystems, but our understanding of variations in hydroclimate (including extreme events, flooding, and decadal periods of drought) is limited because of a paucity of modern instrumental observations that are distributed unevenly across the globe and only span parts of the 20th and 21st centuries. Such data coverage is insufficient for characterizing hydroclimate and its associated dynamics because of its multidecadal to centennial variability and highly regionalized spatial signature. High-resolution (seasonal to decadal) hydroclimatic proxies that span all or parts of the Common Era (CE) and paleoclimate simulations from climate models are therefore important tools for augmenting our understanding of hydroclimate variability. In particular, the comparison of the two sources of information is critical for addressing the uncertainties and limitations of both while enriching each of their interpretations. We review the principal proxy data available for hydroclimatic reconstructions over the CE and highlight the contemporary understanding of how these proxies are interpreted as hydroclimate indicators. We also review the available last-millennium simulations from fully coupled climate models and discuss several outstanding challenges associated with simulating hydroclimate variability and change over the CE. A specific review of simulated hydroclimatic changes forced by volcanic events is provided, as is a discussion of expected improvements in estimated radiative forcings, models, and their implementation in the future. Our review of hydroclimatic proxies and last-millennium model simulations is used as the basis for articulating a variety of considerations and best practices for how to perform proxy-model comparisons of CE hydroclimate. This discussion provides a framework for how best to evaluate hydroclimate variability and its associated dynamics using these comparisons and how they can better inform interpretations of both proxy data and model simulations. We subsequently explore means of using proxy-model comparisons to better constrain and characterize future hydroclimate risks. This is explored specifically in the context of several examples that demonstrate how proxy-model comparisons can be used to quantitatively constrain future hydroclimatic risks as estimated from climate model projections.

  3. Comparing Proxy and Model Estimates of Hydroclimate Variability and Change over the Common Era

    NASA Technical Reports Server (NTRS)

    Smerdon, Jason E.; Luterbacher, Jurg; Phipps, Steven J.; Anchukaitis, Kevin J.; Ault, Toby; Coats, Sloan; Cobb, Kim M.; Cook, Benjamin I.; Colose, Chris; Felis, Thomas

    2017-01-01

    Water availability is fundamental to societies and ecosystems, but our understanding of variations in hydroclimate (including extreme events, flooding, and decadal periods of drought) is limited because of a paucity of modern instrumental observations that are distributed unevenly across the globe and only span parts of the 20th and 21st centuries. Such data coverage is insufficient for characterizing hydroclimate and its associated dynamics because of its multidecadal to centennial variability and highly regionalized spatial signature. High-resolution (seasonal to decadal) hydroclimatic proxies that span all or parts of the Common Era (CE) and paleoclimate simulations from climate models are therefore important tools for augmenting our understanding of hydroclimate variability. In particular, the comparison of the two sources of information is critical for addressing the uncertainties and limitations of both while enriching each of their interpretations. We review the principal proxy data available for hydroclimatic reconstructions over the CE and highlight the contemporary understanding of how these proxies are interpreted as hydroclimate indicators. We also review the available last-millennium simulations from fully coupled climate models and discuss several outstanding challenges associated with simulating hydroclimate variability and change over the CE. A specific review of simulated hydroclimatic changes forced by volcanic events is provided, as is a discussion of expected improvements in estimated radiative forcings, models, and their implementation in the future. Our review of hydroclimatic proxies and last-millennium model simulations is used as the basis for articulating a variety of considerations and best practices for how to perform proxy-model comparisons of CE hydroclimate. This discussion provides a framework for how best to evaluate hydroclimate variability and its associated dynamics using these comparisons and how they can better inform interpretations of both proxy data and model simulations. We subsequently explore means of using proxy-model comparisons to better constrain and characterize future hydroclimate risks. This is explored specifically in the context of several examples that demonstrate how proxy-model comparisons can be used to quantitatively constrain future hydroclimatic risks as estimated from climate model projections.

  4. Temporal Variability of Canopy Light Use Efficiency and its Environmental Controls in a Subtropical Mangrove Wetland

    NASA Astrophysics Data System (ADS)

    Zhu, X.

    2016-12-01

    Mangrove wetlands play an important role in the global carbon cycle due to their strong carbon sequestration resulting from high plant carbon assimilation and low soil respiration. However, temporal variability of carbon sequestration in mangrove wetlands is less well understood since carbon processes of mangrove wetlands are influenced by many complicated and concurrent environmental controls including tidal activities, site climate and soil conditions. Canopy light use efficiency (LUE) is the most important plant physiological parameter that can be used to describe the temporal dynamics of canopy photosynthesis, and therefore a better characterization of temporal variability of canopy LUE will improve our understanding of mangrove photosynthesis and carbon balance. One of our aims is to study the temporal variability of canopy LUE and its environmental controls in a subtropical mangrove wetland. Half-hourly canopy LUE is derived from eddy covariance (EC) carbon flux and photosynthetically active radiation observations, and the half-hourly environmental controls we measure include temperature, humidity, precipitation, radiation, tidal height, salinity, etc. Another aim is to explore the links between canopy LUE and spectral indices derived from near-surface tower-based remote sensing (normalized difference vegetation index, enhanced vegetation index, photochemical reflectance index, solar-induced chlorophyll fluorescence, etc.), and then identify potential quantitative relationships for developing remote sensing-based estimation methods of canopy LUE. At present, some instruments in our in-situ observation system have not yet been installed (planned for the coming months) and therefore we do not yet have enough measurements to support our analysis. However, a preliminary analysis of our historical EC and climate observations over the past several years indicates that canopy LUE shows strong temporal variability and is greatly affected by environmental factors such as tidal activity. Detailed and systematic analyses of the temporal variability of canopy LUE and its environmental controls and potential remote sensing estimation methods will be conducted when our in-situ observation system is ready in the near future.
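
    Half-hourly canopy LUE is commonly computed as the ratio of gross primary productivity (derived from the EC carbon flux) to photosynthetically active radiation. A minimal sketch under that common definition (variable names and the daytime threshold are illustrative assumptions, not the authors' processing):

        import numpy as np

        def canopy_lue(gpp, par, par_threshold=10.0):
            """LUE = GPP / PAR for half-hourly records; daytime records only.
            gpp : gross primary productivity, umol CO2 m-2 s-1
            par : photosynthetically active radiation, umol photons m-2 s-1
            """
            gpp, par = np.asarray(gpp, float), np.asarray(par, float)
            lue = np.full(gpp.shape, np.nan)
            day = par > par_threshold          # skip night-time and near-zero PAR
            lue[day] = gpp[day] / par[day]
            return lue

        print(canopy_lue([5.0, 12.0, 0.2], [300.0, 900.0, 2.0]))   # [0.0167, 0.0133, nan]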

  5. Do Two or More Multicomponent Instruments Measure the Same Construct? Testing Construct Congruence Using Latent Variable Modeling

    ERIC Educational Resources Information Center

    Raykov, Tenko; Marcoulides, George A.; Tong, Bing

    2016-01-01

    A latent variable modeling procedure is discussed that can be used to test if two or more homogeneous multicomponent instruments with distinct components are measuring the same underlying construct. The method is widely applicable in scale construction and development research and can also be of special interest in construct validation studies.…

  6. Temporal Stability of Soil Moisture and Radar Backscatter Observed by the Advanced Synthetic Aperture Radar (ASAR)

    PubMed Central

    Wagner, Wolfgang; Pathe, Carsten; Doubkova, Marcela; Sabel, Daniel; Bartsch, Annett; Hasenauer, Stefan; Blöschl, Günter; Scipal, Klaus; Martínez-Fernández, José; Löw, Alexander

    2008-01-01

    The high spatio-temporal variability of soil moisture is the result of atmospheric forcing and redistribution processes related to terrain, soil, and vegetation characteristics. Despite this high variability, many field studies have shown that in the temporal domain soil moisture measured at specific locations is correlated to the mean soil moisture content over an area. Since the measurements taken by Synthetic Aperture Radar (SAR) instruments are very sensitive to soil moisture, it is hypothesized that the temporally stable soil moisture patterns are reflected in the radar backscatter measurements. To verify this hypothesis, 73 Wide Swath (WS) images have been acquired by the ENVISAT Advanced Synthetic Aperture Radar (ASAR) over the REMEDHUS soil moisture network located in the Duero basin, Spain. It is found that a time-invariant linear relationship is well suited for relating local scale (pixel) and regional scale (50 km) backscatter. The observed linear model coefficients can be estimated by considering the scattering properties of the terrain and vegetation and the soil moisture scaling properties. For both linear model coefficients, the relative error between observed and modelled values is less than 5% and the coefficient of determination (R^2) is 86%. The results are of relevance for interpreting and downscaling coarse resolution soil moisture data retrieved from active (METOP ASCAT) and passive (SMOS, AMSR-E) instruments. PMID:27879759
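
    The time-invariant linear relationship between local (pixel) and regional (50 km) backscatter can be estimated with an ordinary least-squares fit over the acquisition time series. A schematic sketch on synthetic data (not the authors' processing, which also handles incidence-angle normalization and the physical interpretation of the coefficients):

        import numpy as np

        def local_regional_fit(sigma0_pixel_db, sigma0_region_db):
            """Fit sigma0_pixel = a + b * sigma0_region over time for one pixel."""
            x = np.asarray(sigma0_region_db, float)
            y = np.asarray(sigma0_pixel_db, float)
            b, a = np.polyfit(x, y, 1)
            residuals = y - (a + b * x)
            r2 = 1.0 - residuals.var() / y.var()
            return a, b, r2

        # Synthetic example: a pixel tracking the regional mean backscatter with an offset.
        rng = np.random.default_rng(4)
        regional = -12.0 + 3.0 * rng.random(73)          # 73 acquisitions, in dB
        pixel = -2.0 + 1.1 * regional + 0.3 * rng.normal(size=73)
        print(local_regional_fit(pixel, regional))       # offset ~ -2, slope ~ 1.1, high R^2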

  7. Mapping between 6 Multiattribute Utility Instruments.

    PubMed

    Chen, Gang; Khan, Munir A; Iezzi, Angelo; Ratcliffe, Julie; Richardson, Jeff

    2016-02-01

    Cost-utility analyses commonly employ a multiattribute utility (MAU) instrument to estimate the health state utilities, which are needed to calculate quality-adjusted life years. Different MAU instruments predict significantly different utilities, which makes comparison of results from different evaluation studies problematical. This article presents mapping functions ("crosswalks") from 6 MAU instruments (EQ-5D-5L, SF-6D, Health Utilities Index 3 [HUI 3], 15D, Quality of Well-Being [QWB], and Assessment of Quality of Life 8D [AQoL-8D]) to each of the other 5 instruments in the study: a total of 30 mapping functions. Data were obtained from a multi-instrument comparison survey of the public and patients in 7 disease areas conducted in 6 countries (Australia, Canada, Germany, Norway, United Kingdom, and United States). The 8022 respondents were administered each of the 6 study instruments. Mapping equations between each instrument pair were estimated using 4 econometric techniques: ordinary least squares, generalized linear model, censored least absolute deviations, and, for the first time, a robust MM-estimator. Goodness-of-fit indicators for each of the results are within the range of published studies. Transformations reduced discrepancies between predicted utilities. Incremental utilities, which determine the value of quality-related health benefits, are almost perfectly aligned at the sample means. Transformations presented here align the measurement scales of MAU instruments. Their use will increase confidence in the comparability of evaluation studies, which have employed different MAU instruments. © The Author(s) 2015.

  8. Spatio-temporal hierarchical modeling of rates and variability of Holocene sea-level changes in the western North Atlantic and the Caribbean

    NASA Astrophysics Data System (ADS)

    Ashe, E.; Kopp, R. E.; Khan, N.; Horton, B.; Engelhart, S. E.

    2016-12-01

    Sea level varies over both space and time. Prior to the instrumental period, the sea-level record depends upon geological reconstructions that contain vertical and temporal uncertainty. Spatio-temporal statistical models enable the interpretation of relative sea level (RSL) and rates of change as well as the reconstruction of the entire sea-level field from such noisy data. Hierarchical models explicitly distinguish between a process level, which characterizes the spatio-temporal field, and a data level, by which sparse, noisy proxy data are recorded. A hyperparameter level depicts prior expectations about the structure of variability in the spatio-temporal field. Spatio-temporal hierarchical models are amenable to several analysis approaches, with tradeoffs regarding computational efficiency and comprehensiveness of uncertainty characterization. A fully-Bayesian hierarchical model (BHM), which places prior probability distributions upon the hyperparameters, is more computationally intensive than an empirical hierarchical model (EHM), which uses point estimates of hyperparameters derived from the data [1]. Here, we assess the sensitivity of posterior estimates of RSL and rates to different statistical approaches by varying prior assumptions about the spatial and temporal structure of sea-level variability and applying multiple analytical approaches to Holocene sea-level proxies along the Atlantic coast of North America and the Caribbean [2]. References: 1. Cressie N, Wikle CK (2011), Statistics for Spatio-Temporal Data (John Wiley & Sons). 2. Khan N et al. (2016), Quaternary Science Reviews (in revision).

  9. Wetland inventory and variability over the last two decades at a global scale

    NASA Astrophysics Data System (ADS)

    Prigent, C.; Papa, F.; Aires, F.; Rossow, W. B.; Matthews, E.

    2011-12-01

    Remote sensing techniques employing visible, infrared, and microwave observations offer varying success in estimating wetlands and inundation extent and in monitoring their natural and anthropogenic variations. Low spatial resolution (e.g., 30 km) limits detection to large wetlands but has the advantage of frequent coverage. High spatial resolution (e.g., 100 m), while providing more environmental information, suffers from poor temporal resolution, with observations for just high/low water or warm/cold seasons. Most existing wetland data sets are limited to a few regions, for specific times in the year. The only global inventory of wetland dynamics over a long period of time is derived from a remote-sensing technique employing a suite of complementary satellite observations: it uses passive-microwave land-surface emissivities, scatterometer responses, and visible and near-infrared reflectances. Combining observations from different instruments makes it possible to capitalize on their complementary strengths, and to extract maximum information about inundation characteristics. The technique is globally applicable without any tuning for particular environments. The satellite data are used to calculate monthly-mean inundated fractions of equal-area grid cells (0.25° × 0.25° at the equator), taking into account the contribution of vegetation to the passive microwave signal (Prigent et al., 2001, 2007). Several adjustments to the initial technique have been applied to account for changes in satellite instruments (Papa et al., 2010). The resulting data set now covers 1993-2008 and has been carefully evaluated. We will present the inter-annual variability of the water surface extents under different environments, and relate these variations to other hydrological variables such as river height, precipitation, water runoff, or GRACE data. Natural wetlands are the world's largest methane source and dominate the inter-annual variability of atmospheric methane concentrations, with up to 90% of the global methane flux anomalies related to variations in wetland extent according to some estimates. Our data set quantifying inundation dynamics throughout the world's natural wetlands provides a unique opportunity to reduce uncertainties in the role of natural wetlands in the inter-annual variability of the growth rate of atmospheric methane. Papa, F., C. Prigent, C. Jimenez, F. Aires, and W. B. Rossow, Interannual variability of surface water extent at global scale, 1993-2004, JGR, 115, D12111, doi:10.1029/2009JD012674, 2010. Prigent, C., F. Papa, F. Aires, W. B. Rossow, and E. Matthews, Global inundation dynamics inferred from multiple satellite observations, 1993-2000, JGR, 112, D12107, doi:10.1029/2006JD007847, 2007. Prigent, C., E. Matthews, F. Aires, and W. B. Rossow, Remote sensing of global wetland dynamics with multiple satellite data sets, GRL, 28, 4631-4634, 2001.

  10. Threshold and variability properties of matrix frequency-doubling technology and standard automated perimetry in glaucoma.

    PubMed

    Artes, Paul H; Hutchison, Donna M; Nicolela, Marcelo T; LeBlanc, Raymond P; Chauhan, Balwantray C

    2005-07-01

    To compare test results from second-generation Frequency-Doubling Technology perimetry (FDT2, Humphrey Matrix; Carl-Zeiss Meditec, Dublin, CA) and standard automated perimetry (SAP) in patients with glaucoma. Specifically, to examine the relationship between visual field sensitivity and test-retest variability and to compare total and pattern deviation probability maps between both techniques. Fifteen patients with glaucoma who had early to moderately advanced visual field loss with SAP (mean MD, -4.0 dB; range, +0.2 to -16.1) were enrolled in the study. Patients attended three sessions. During each session, one eye was examined twice with FDT2 (24-2 threshold test) and twice with SAP (Swedish Interactive Threshold Algorithm [SITA] Standard 24-2 test), in random order. We compared threshold values between FDT2 and SAP at test locations with similar visual field coordinates. Test-retest variability, established in terms of test-retest intervals and standard deviations (SDs), was investigated as a function of visual field sensitivity (estimated by baseline threshold and mean threshold, respectively). The magnitude of visual field defects apparent in total and pattern deviation probability maps were compared between both techniques by ordinal scoring. The global visual field indices mean deviation (MD) and pattern standard deviation (PSD) of FDT2 and SAP correlated highly (r > 0.8; P < 0.001). At test locations with high sensitivity (>25 dB with SAP), threshold estimates from FDT2 and SAP exhibited a close, linear relationship, with a slope of approximately 2.0. However, at test locations with lower sensitivity, the relationship was much weaker and ceased to be linear. In comparison with FDT2, SAP showed a slightly larger proportion of test locations with absolute defects (3.0% vs. 2.2% with SAP and FDT2, respectively, P < 0.001). Whereas SAP showed a significant increase in test-retest variability at test locations with lower sensitivity (P < 0.001), there was no relationship between variability and sensitivity with FDT2 (P = 0.46). In comparison with SAP, FDT2 exhibited narrower test-retest intervals at test locations with lower sensitivity (SAP thresholds <25 dB). A comparison of the total and pattern deviation maps between both techniques showed that the total deviation analyses of FDT2 may slightly underestimate the visual field loss apparent with SAP. However, the pattern-deviation maps of both instruments agreed well with each other. The test-retest variability of FDT2 is uniform over the measurement range of the instrument. These properties may provide advantages for the monitoring of patients with glaucoma that should be investigated in longitudinal studies.

  11. A novel approach for direct estimation of fresh groundwater discharge to an estuary

    USGS Publications Warehouse

    Ganju, Neil K.

    2011-01-01

    Coastal groundwater discharge is an important source of freshwater and nutrients to coastal and estuarine systems. Directly quantifying the spatially integrated discharge of fresh groundwater over a coastline is difficult due to spatial variability and limited observational methods. In this study, I applied a novel approach to estimate net freshwater discharge from a groundwater-fed tidal creek over a spring-neap cycle, with high temporal resolution. Acoustic velocity instruments measured tidal water fluxes while other sensors measured vertical and lateral salinity to estimate cross-sectionally averaged salinity. These measurements were used in a time-dependent version of Knudsen's salt balance calculation to estimate the fresh groundwater contribution to the tidal creek. The time-series of fresh groundwater discharge shows the dependence of fresh groundwater discharge on tidal pumping, and the large difference between monthly mean discharge and instantaneous discharge over shorter timescales. The approach developed here can be implemented over timescales from days to years, in any size estuary with dominant groundwater inputs and well-defined cross-sections. The approach also directly links delivery of groundwater from the watershed with fluxes to the coastal environment. Copyright. Published in 2011 by the American Geophysical Union.
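
    For orientation, the steady-state Knudsen salt balance from which the time-dependent formulation derives: with salt conserved, Q_in S_in = Q_out S_out, and volume conserved, Q_out = Q_in + R, so the freshwater discharge is R = Q_out (1 - S_out / S_in). A sketch of that steady-state relation (the paper's estimator is the time-dependent extension, which this does not reproduce):

        def knudsen_freshwater_discharge(q_out, s_out, s_in):
            """Steady-state Knudsen relations for a two-layer exchange.
            q_out : volume outflow from the estuary or tidal creek (m^3/s)
            s_out : salinity of the outflowing water
            s_in  : salinity of the inflowing (ocean) water
            Returns (R, q_in): net freshwater discharge and inflow, satisfying
            q_out*s_out == q_in*s_in (salt balance) and q_out == q_in + R (volume balance).
            """
            q_in = q_out * s_out / s_in
            return q_out - q_in, q_in

        r, q_in = knudsen_freshwater_discharge(q_out=12.0, s_out=28.0, s_in=32.0)
        print(r, q_in)   # R = 1.5 m^3/s of fresh groundwater, q_in = 10.5 m^3/s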

  12. Repeatability of testing a small broadband sensor in the Albuquerque Seismological Laboratory Underground Vault

    USGS Publications Warehouse

    Ringler, Adam; Holland, Austin; Wilson, David

    2017-01-01

    Variability in seismic instrumentation performance plays a fundamental role in our ability to carry out experiments in observational seismology. Many such experiments rely on the assumed performance of various seismic sensors as well as on methods to isolate the sensors from nonseismic noise sources. We look at the repeatability of estimating the self‐noise, midband sensitivity, and the relative orientation by comparing three collocated Nanometrics Trillium Compact sensors. To estimate the repeatability, we conduct a total of 15 trials in which one sensor is repeatedly reinstalled, alongside two undisturbed sensors. We find that we are able to estimate the midband sensitivity with an error of no greater than 0.04% with a 99th percentile confidence, assuming a standard normal distribution. We also find that we are able to estimate mean sensor self‐noise to within ±5.6  dB with a 99th percentile confidence in the 30–100‐s‐period band. Finally, we find our relative orientation errors have a mean difference in orientation of 0.0171° from the reference, but our trials have a standard deviation of 0.78°.

  13. Risk assessment for juvenile justice: a meta-analysis.

    PubMed

    Schwalbe, Craig S

    2007-10-01

    Risk assessment instruments are increasingly employed by juvenile justice settings to estimate the likelihood of recidivism among delinquent juveniles. In concert with their increased use, validation studies documenting their predictive validity have increased in number. The purpose of this study was to assess the average predictive validity of juvenile justice risk assessment instruments and to identify risk assessment characteristics that are associated with higher predictive validity. A search of the published and grey literature yielded 28 studies that estimated the predictive validity of 28 risk assessment instruments. Findings of the meta-analysis were consistent with effect sizes obtained in larger meta-analyses of criminal justice risk assessment instruments and showed that brief risk assessment instruments had smaller effect sizes than other types of instruments. However, this finding is tentative owing to limitations of the literature.

  14. Gyro and accelerometer failure detection and identification in redundant sensor systems

    NASA Technical Reports Server (NTRS)

    Potter, J. E.; Deckert, J. C.

    1972-01-01

    Algorithms for failure detection and identification for redundant noncolinear arrays of single degree of freedom gyros and accelerometers are described. These algorithms are optimum in the sense that detection occurs as soon as it is no longer possible to account for the instrument outputs as the outputs of good instruments operating within their noise tolerances, and identification occurs as soon as it is true that only a particular instrument failure could account for the actual instrument outputs within the noise tolerance of good instruments. An estimation algorithm is described which minimizes the maximum possible estimation error magnitude for the given set of instrument outputs. Monte Carlo simulation results are presented for the application of the algorithms to an inertial reference unit consisting of six gyros and six accelerometers in two alternate configurations.
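
    A generic parity-space check for a redundant, noncolinear sensor array illustrates the detection idea (a textbook-style sketch, not the specific optimal algorithms of the report): with measurements m = H x + noise, any direction in the left null space of the geometry matrix H is blind to the true state x, so its projection of m should stay within the noise tolerance when all instruments are healthy.

        import numpy as np

        def parity_check(H, m, noise_sigma, k=3.0):
            """Return (parity_vector, alarm). Rows of H are unit sensor input axes;
            m are the sensor outputs. V spans the left null space of H, so V @ m depends
            only on instrument errors and failures, not on the sensed quantity."""
            u, s, vt = np.linalg.svd(H)
            V = u[:, H.shape[1]:].T              # (n - 3) parity directions for n sensors
            p = V @ m
            alarm = np.linalg.norm(p) > k * noise_sigma * np.sqrt(p.size)
            return p, alarm

        # Six single-degree-of-freedom gyros with distinct input axes (hypothetical geometry).
        axes = np.array([[1, 0, 0], [0, 1, 0], [0, 0, 1],
                         [1, 1, 0], [0, 1, 1], [1, 0, 1]], float)
        H = axes / np.linalg.norm(axes, axis=1, keepdims=True)
        omega = np.array([0.1, -0.2, 0.05])            # true angular rate
        m = H @ omega + 1e-4 * np.random.default_rng(5).standard_normal(6)
        print(parity_check(H, m, noise_sigma=1e-4)[1])    # False: all instruments healthy
        m[3] += 0.05                                      # inject a failure on one gyro
        print(parity_check(H, m, noise_sigma=1e-4)[1])    # True: failure detected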

  15. Retrieval of Precipitation Profiles from Multiresolution, Multifrequency, Active and Passive Microwave Observations

    NASA Technical Reports Server (NTRS)

    Grecu, Mircea; Anagnostou, Emmanouil N.; Olson, William S.; Starr, David OC. (Technical Monitor)

    2002-01-01

    In this study, a technique for estimating vertical profiles of precipitation from multifrequency, multiresolution active and passive microwave observations is investigated using both simulated and airborne data. The technique is applicable to the Tropical Rainfall Measuring Mission (TRMM) satellite multi-frequency active and passive observations. These observations are characterized by various spatial and sampling resolutions. This makes the retrieval problem mathematically more difficult and ill-determined because the quality of information decreases with decreasing resolution. A model that, given reflectivity profiles and a small set of parameters (including the cloud water content, the drop size distribution intercept, and a variable describing the frozen hydrometeor properties), simulates high-resolution brightness temperatures is used. The high-resolution simulated brightness temperatures are convolved at the real sensor resolution. An optimal estimation procedure is used to minimize the differences between simulated and observed brightness temperatures. The retrieval technique is investigated using cloud model synthetic and airborne data from the Fourth Convection And Moisture Experiment. Simulated high-resolution brightness temperatures, reflectivities, and airborne observations are convolved at the resolution of the TRMM instruments, and retrievals are performed and analyzed relative to the reference data used in the observation synthesis. An illustration of the possible use of the technique in satellite rainfall estimation is presented through an application to TRMM data. The study suggests improvements in combined active and passive retrievals even when the instrument resolutions are significantly different. Future work needs to better quantify the retrieval performance, especially in connection with satellite applications, and the uncertainty of the models used in retrieval.
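
    The optimal estimation step minimizes the usual measurement-plus-prior cost function, J(x) = (y - F(x))^T S_y^-1 (y - F(x)) + (x - x_a)^T S_a^-1 (x - x_a). A single Gauss-Newton update for a toy linear forward model illustrates the machinery (a generic sketch, not the study's combined radar-radiometer retrieval):

        import numpy as np

        def optimal_estimation_update(x_a, S_a, y, S_y, K, forward):
            """One Gauss-Newton step for the optimal-estimation cost function,
            with Jacobian K evaluated at the prior state x_a."""
            S_a_inv = np.linalg.inv(S_a)
            S_y_inv = np.linalg.inv(S_y)
            hessian = K.T @ S_y_inv @ K + S_a_inv
            gradient = K.T @ S_y_inv @ (y - forward(x_a))
            return x_a + np.linalg.solve(hessian, gradient)

        # Toy linear forward model: two "brightness temperatures" from a 3-element state.
        K = np.array([[1.0, 0.5, 0.0],
                      [0.2, 1.0, 0.8]])
        forward = lambda x: K @ x
        x_true = np.array([1.0, 2.0, 0.5])
        y = forward(x_true) + np.array([0.05, -0.03])     # noisy observations
        x_a = np.zeros(3)                                  # prior state
        print(optimal_estimation_update(x_a, np.eye(3), y, 0.01 * np.eye(2), K, forward))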

  16. Evaluation of the effect of non-surgical periodontal treatment on oral health-related quality of life: estimation of minimal important differences 1 year after treatment.

    PubMed

    Jönsson, Birgitta; Öhrn, Kerstin

    2014-03-01

    To evaluate an individually tailored oral health educational programme on patient-reported outcome compared with a standard oral health educational programme, assess change over time and determine minimal important differences (MID) in change scores for two different oral health-related quality of life (OHRQoL) instruments after non-surgical periodontal treatment (NSPT). In a randomized controlled trial evaluating two educational programmes, patients (n = 87) with chronic periodontitis completed a questionnaire at baseline and after 12 months. OHRQoL was assessed with the General Oral Health Assessment Index (GOHAI) and the UK oral health-related quality-of-life measure (OHQoL-UK). In addition, patients' global rating of oral health and socio-demographic variables were recorded. The MID was estimated with anchor-based and distribution-based methods. There were no differences between the two educational groups. The OHRQoL was significantly improved after treatment. The MID was approximately five for OHQoL-UK with a moderate effect size (ES), and three for GOHAI with a small ES, and 46-50% of the patients showed improvements beyond the MID. Both oral health educational groups reported higher scores in OHRQoL after NSPT, resulting in more positive well-being (OHQoL-UK) and less frequent oral problems (GOHAI). OHQoL-UK gave a greater effect size and mean change scores, but both instruments were associated with the participants' self-rated change in oral health. The changes were meaningful for the patients, as supported by the estimated MID. © 2013 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  17. Accounting for standard errors of vision-specific latent trait in regression models.

    PubMed

    Wong, Wan Ling; Li, Xiang; Li, Jialiang; Wong, Tien Yin; Cheng, Ching-Yu; Lamoureux, Ecosse L

    2014-07-11

    To demonstrate the effectiveness of Hierarchical Bayesian (HB) approach in a modeling framework for association effects that accounts for SEs of vision-specific latent traits assessed using Rasch analysis. A systematic literature review was conducted in four major ophthalmic journals to evaluate Rasch analysis performed on vision-specific instruments. The HB approach was used to synthesize the Rasch model and multiple linear regression model for the assessment of the association effects related to vision-specific latent traits. The effectiveness of this novel HB one-stage "joint-analysis" approach allows all model parameters to be estimated simultaneously and was compared with the frequently used two-stage "separate-analysis" approach in our simulation study (Rasch analysis followed by traditional statistical analyses without adjustment for SE of latent trait). Sixty-six reviewed articles performed evaluation and validation of vision-specific instruments using Rasch analysis, and 86.4% (n = 57) performed further statistical analyses on the Rasch-scaled data using traditional statistical methods; none took into consideration SEs of the estimated Rasch-scaled scores. The two models on real data differed for effect size estimations and the identification of "independent risk factors." Simulation results showed that our proposed HB one-stage "joint-analysis" approach produces greater accuracy (average of 5-fold decrease in bias) with comparable power and precision in estimation of associations when compared with the frequently used two-stage "separate-analysis" procedure despite accounting for greater uncertainty due to the latent trait. Patient-reported data, using Rasch analysis techniques, do not take into account the SE of latent trait in association analyses. The HB one-stage "joint-analysis" is a better approach, producing accurate effect size estimations and information about the independent association of exposure variables with vision-specific latent traits. Copyright 2014 The Association for Research in Vision and Ophthalmology, Inc.

  18. Arctic sea ice albedo from AVHRR

    NASA Technical Reports Server (NTRS)

    Lindsay, R. W.; Rothrock, D. A.

    1994-01-01

    The seasonal cycle of surface albedo of sea ice in the Arctic is estimated from measurements made with the Advanced Very High Resolution Radiometer (AVHRR) on the polar-orbiting satellites NOAA-10 and NOAA-11. The albedos of 145 200-km-square cells are analyzed. The cells are from March through September 1989 and include only those for which the sun is more than 10 deg above the horizon. Cloud masking is performed manually. Corrections are applied for instrument calibration, nonisotropic reflection, atmospheric interference, narrowband to broadband conversion, and normalization to a common solar zenith angle. The estimated albedos are relative, with the instrument gain set to give an albedo of 0.80 for ice floes in March and April. The mean values for the cloud-free portions of individual cells range from 0.18 to 0.91. Monthly averages of cells in the central Arctic range from 0.76 in April to 0.47 in August. The monthly averages of the within-cell standard deviations in the central Arctic are 0.04 in April and 0.06 in September. The surface albedo and surface temperature are correlated most strongly in March (R = -0.77) with little correlation in the summer. The monthly average lead fraction is determined from the mean potential open water, a scaled representation of the temperature or albedo between 0.0 (for ice) and 1.0 (for water); in the central Arctic it rises from an average 0.025 in the spring to 0.06 in September. Sparse data on aerosols, ozone, and water vapor in the atmospheric column contribute uncertainties to instantaneous, area-average albedos of 0.13, 0.04, and 0.08. Uncertainties in monthly average albedos are not this large. Contemporaneous estimation of these variables could reduce the uncertainty in the estimated albedo considerably. The poor calibration of AVHRR channels 1 and 2 is another large impediment to making accurate albedo estimates.

  19. Investigation of Music Student Efficacy as Influenced by Age, Experience, Gender, Ethnicity, and Type of Instrument Played in South Carolina

    ERIC Educational Resources Information Center

    White, Norman

    2010-01-01

    The purpose of this research study was to quantitatively examine South Carolina high school instrumental music students' self-efficacy as measured by the Generalized Self-Efficacy (GSE) instrument (Schwarzer & Jerusalem, 1993). The independent variables (age, experience, gender, ethnicity, and type of instrument played) were correlated with…

  20. Acoustic Source Bearing Estimation (ASBE) computer program development

    NASA Technical Reports Server (NTRS)

    Wiese, Michael R.

    1987-01-01

    A new bearing estimation algorithm (Acoustic Source Analysis Technique - ASAT) and an acoustic analysis computer program (Acoustic Source Bearing Estimation - ASBE) are described, which were developed by Computer Sciences Corporation for NASA Langley Research Center. The ASBE program is used by the Acoustics Division/Applied Acoustics Branch and the Instrument Research Division/Electro-Mechanical Instrumentation Branch to analyze acoustic data and estimate the azimuths from which the source signals radiated. Included are the input and output from a benchmark test case.
